
AI for IT Directors: a minimum-viable guide

  • Writer: Chris Watterson

Everyone is talking about AI: your board, your suppliers, your team. There is a perceived urgency: “Why are you not ‘doing AI’ yet?”


Reality: colleagues already use AI assistants for personal productivity, but sector-wide operational adoption is still early. You don’t need to be first - you do need to be safe, ready, and focused on measurable value for customers and homes.


It's important to remember that AI is not one thing. This article considers three categories: 1) general-purpose assistants (e.g. ChatGPT and Copilot), 2) AI embedded in your core platforms (e.g. within your HMS/CRM or other applications), and 3) bespoke solutions (built in-house or by partners). Each category presents different risks and opportunities.

 

As an IT Director your priorities right now are to:

  1. Understand and manage the emerging risks.

  2. Keep developing the foundations which allow you to make better use of your data - this means culture and capability as well as technology.

  3. Understand how your technology vendors are bringing AI into your core products and think about how you get value from it.

  4. Develop a community of practice to start experimenting and learning, on a small scale and with low budgets. 

  5. Engage with colleagues in the sector to share knowledge.

 

What not to do:

  1. Write a policy and call it done.

  2. Ban AI outright. People will route around you and your “Shadow IT” will get worse.

  3. Spend big on R&D or a large data science team without being clear on how AI is going to deliver improvements for customers and colleagues, or without the right governance and support in place.



Practical steps you can take for each priority:

Priority 1: Understand and manage the emerging risks

  • Most organisations do not need a standalone AI policy. But you will need to review existing policies in light of AI. Data protection is crucial, but not sufficient to act ethically. Map AI risk into your enterprise risk register.

  • Cyber risk has gone up. Follow NCSC guidance. At the very least, get Cyber Essentials right now, in preparation for the Cyber Security and Resilience Bill. The latter will extend responsibility to your supply chain, and you may as well get your own house in order first.

  • Consider creating a plain English Customer Charter to engage with residents about how you will be using AI, with fairness and ethical principles embedded. Consider extending this to co-design of AI-enabled projects.

  • Work out how you will govern AI, how you will integrate it into your organisational strategy, and how you will keep up to date with new risks and opportunities. Ensure data sensitivity is controlled and that existing assurance structures (e.g. Audit and Risk Committee) are equipped to audit effectively.

Priority 2: Keep developing the foundations

  • Use AI as a vehicle to get data integrity up the strategic agenda. It is in the Better Social Housing Review. It is in the Sector Risk Profile. It is part of the Grenfell Inquiry Report. Get the investment you need to start building a culture of data ownership and fixing data at source. Bias training and data literacy are part of this. Engage with data leaders in your organisation from the outset. 

  • Continue to modernise your infrastructure, favouring architectures that support the easy aggregation of data across core systems and technologies which aid the capture of high-quality data.

  • Use any change activities to move towards HACT UK Housing Data Standards for key domains to align with others in the sector. Common data standards will allow AI models to be more easily shared between HAs. 

  • Supporting the Board and Exec to be data-driven will accelerate the cultural transformation. Encourage the Board to triangulate data. For example, TSMs, ombudsman rulings, internally handled complaints and social media data are all independent datasets which can help you, at scale, understand how your customers feel about your services and how to improve them; the sketch below shows one simple way to bring them together.
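
To make triangulation concrete, here is a purely illustrative sketch: the file names, column names and the idea of lining the datasets up by month are assumptions for the example, not a sector model or a prescribed approach.

```python
# Illustrative only: file and column names are assumptions, not a standard.
import pandas as pd

tsm = pd.read_csv("tsm_results.csv")              # e.g. month, satisfaction_score
complaints = pd.read_csv("complaints.csv")        # e.g. month, complaint_id
ombudsman = pd.read_csv("ombudsman_rulings.csv")  # e.g. month, ruling_id, upheld

# Line the independent datasets up by month so they can be read side by side
monthly = (
    tsm.groupby("month")["satisfaction_score"].mean().to_frame("avg_tsm_score")
    .join(complaints.groupby("month").size().rename("complaint_count"))
    .join(ombudsman.groupby("month")["upheld"].sum().rename("upheld_rulings"))
    .reset_index()
)

# Months where satisfaction and complaints move in opposite directions are
# the ones worth a closer look before acting on any single dataset.
print(monthly)
```

Even a rough side-by-side view like this gives the Board a fuller picture than any single dataset on its own.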

Priority 3: Understand how your technology vendors are bringing AI into your core products

  • Understand your tech vendor roadmaps: consider what functionality they will be releasing and how your organisation will get value from it. For most HAs, much of your AI is likely to be embedded in your core systems rather than self-developed.

  • Consider acceptance criteria for enabling any embedded AI (accuracy tests, bias checks, rollback); the sketch below illustrates what such checks might look like.
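
As a minimal, hypothetical sketch only: the thresholds, the labelled sample and the vendor_ai_predict function below are placeholders to be agreed with your own business and your supplier, not a real vendor API.

```python
# Purely illustrative acceptance checks for an embedded AI feature.
# Thresholds, the labelled sample and `vendor_ai_predict` are placeholders.

ACCURACY_THRESHOLD = 0.90   # example value; agree your own with the business
MAX_GROUP_GAP = 0.05        # example tolerance for accuracy gaps between groups

def evaluate(labelled_sample, predict):
    """Overall and per-group accuracy on a labelled evaluation sample."""
    overall, by_group = [], {}
    for case in labelled_sample:
        correct = predict(case["input"]) == case["expected"]
        overall.append(correct)
        by_group.setdefault(case["group"], []).append(correct)
    accuracy = sum(overall) / len(overall)
    group_accuracy = {g: sum(v) / len(v) for g, v in by_group.items()}
    return accuracy, group_accuracy

def acceptance_check(labelled_sample, predict):
    """Pass only if overall accuracy and the gap between groups are acceptable."""
    accuracy, group_accuracy = evaluate(labelled_sample, predict)
    gap = max(group_accuracy.values()) - min(group_accuracy.values())
    passed = accuracy >= ACCURACY_THRESHOLD and gap <= MAX_GROUP_GAP
    return passed, {"accuracy": accuracy, "group_gap": gap}

# Example: only switch the feature on if the check passes, and keep the old
# process available so you can roll back if it does not.
# ok, metrics = acceptance_check(sample_cases, vendor_ai_predict)
```

The point is not the code but the discipline: agree measurable criteria up front, test against them before switching anything on, and keep a rollback route.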

Priority 4: Develop a community of practice to start experimenting and learning

  • Set up an internal AI Community of Practice to understand who is already doing what in your organisation. Your colleagues will be using AI in their personal lives and will want to bring that benefit to their work. These people are your future AI evangelists and change agents. Train them and others appropriately.

  • Blocking AI tools on your network is not a robust solution (users can easily take photographs or email data to themselves), although it might reduce your risk in the short term. Instead, engage interested users, provide them with tools and publish “safe patterns” (e.g. summarising non-personal docs).

  • Use your new Community of Practice to uncover small use cases. Keep the focus on value, not on implementing AI for AI’s sake. Remember you don’t need perfect data before you can do anything - find the use cases which don’t require data you don’t have. Start small. Use these small cases to support a culture shift and reduce fears that “the robots are taking over”.

  • Closely manage and measure the results of experimentation. Set people up to evaluate the impact of what they are trying. Support colleagues with the training they need to use these tools. 

  • Ignore most vendor AI-washing and focus on customer outcomes. That said, do consider new products which use AI to deliver value, particularly if these have been successfully implemented by others in the sector. 

  • As you experiment, create and update a re-usable AI playbook: experiment template, DPIA (Data Protection Impact Assessment) template, red lines, rollout checklist, benefits tracking.

Priority 5: Engage with colleagues in the sector

  • Other housing providers are all somewhere on the journey too. Share knowledge with your peers.

  • Attend and participate in sector forums and working groups such as HAILIE, and those run by HACT and NHF.

  • Share and reuse policies, roadmaps, use cases, models and successes. Open licensing models exist, such as Creative Commons.


Recommended resources



Author credits: Guy Marshall (Fuza), Chris Watterson (Rannoch Associates), Andy Johnson (Beacon Cymru), Mark Shephard (Yorkshire Housing), Ian Cresswell (Magenta Living)

Version: 1.0 published 15 October 2025


License: AI for IT Directors in Housing © 2025 by Guy Marshall, Chris Watterson, Andy Johnson, Mark Shephard, Ian Cresswell is licensed under CC BY-SA 4.0

