Do you need an AI centre of excellence? A decision guide for UK businesses

Why "do we need a CoE?" is the wrong first question
Not every UK business needs a formal AI Centre of Excellence, and building one too early wastes resource. The question is not "should we have a CoE?" but "have we reached the point where uncoordinated AI adoption is creating more risk and inefficiency than a structured function would cost?" For a 15-person UK SME running ChatGPT Team and one automation in Zapier, the answer is almost always no. For a 300-person mid-market firm running five AI tools across marketing, operations, HR, and finance, with different teams buying different licences and no shared standards, the answer is almost always yes.
This guide gives a clear decision framework, describes what a CoE does at different stages of maturity, and sets realistic resource expectations for UK mid-market firms. It is written for senior leaders and for operations or transformation directors who are being asked whether a CoE is warranted.
What an AI CoE does (and does not do)
An AI Centre of Excellence is an internal function that sets AI standards, vets tools, coordinates training, and governs AI adoption across an organisation. For most UK mid-market firms, a CoE is not a team of AI engineers building custom models. It is 2 to 4 people (often part-time, drawn from existing roles) with a defined mandate.
Core functions:
- Maintain the approved tool list and vendor relationships, including DPAs and data residency confirmations.
- Coordinate AI training across departments against a common literacy framework.
- Set prompt standards and maintain a shared library of templates.
- Monitor AI-related compliance requirements (UK GDPR, EU AI Act, FCA guidance, ICO updates, sector-specific regulation).
- Track ROI on AI investments and report to leadership on a quarterly or half-yearly cadence.
- Act as the escalation point when a team hits a question it cannot answer locally (a new tool, a new compliance question, a client concern).
A CoE does not build bespoke AI systems. That is a separate implementation function, typically staffed by developers and solution architects. A CoE does not run individual AI tools for departments, either. Each team continues to own and operate its tools; the CoE provides the common governance layer.
The distinction matters because senior leaders often imagine a CoE as a large, technical team. The reality for UK mid-market firms is a small, cross-functional group whose value is in coordination rather than in building.
The decision framework: do you need one?
The simplest way to answer the question is a scored checklist. Score 1 point for each condition that applies to your organisation.
- More than 3 different AI tools are in use across your business.
- Different departments are using AI tools without coordinating on standards, prompts, or data handling.
- You have received at least one question from a client, partner, or auditor about your AI usage.
- You are in a regulated sector (financial services, healthcare, legal, accountancy, education).
- You have had at least one AI-related incident (an error in a client-facing output, a data input concern, a compliance question raised internally).
- AI spend exceeds £20,000 per year across tool licences and consultancy.
- More than 50 employees regularly use AI tools in their work.
- You are considering building custom AI rather than using off-the-shelf tools.
Interpretation:
- 0 to 2 points. A CoE is not yet needed. Informal oversight, a named AI lead within IT or operations, and an annual review of tool usage are sufficient. Revisit in 12 months or when a significant change occurs.
- 3 to 4 points. A lightweight CoE is warranted. Appoint a named AI lead, establish a shared policy document, and run a quarterly review meeting with representatives from each department. This is an efficient use of existing time rather than new headcount.
- 5 or more points. A formal CoE with dedicated resource is justified. The risk of uncoordinated adoption now exceeds the cost of structured governance. The question becomes how to staff and structure the CoE, not whether to have one.
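For teams that want to run the checklist consistently across business units, the scoring logic above can be sketched in a few lines. This is a minimal illustration of the framework in this guide, not a tool; the function and field names are our own.

```python
# A sketch of the scored CoE checklist: one point per condition that
# applies, mapped to the three recommendation bands described above.

CONDITIONS = [
    "More than 3 different AI tools in use",
    "Departments using AI without coordinated standards",
    "At least one client, partner, or auditor question about AI usage",
    "Operating in a regulated sector",
    "At least one AI-related incident",
    "AI spend exceeds £20,000 per year",
    "More than 50 employees regularly using AI tools",
    "Considering custom AI rather than off-the-shelf tools",
]

def coe_recommendation(answers: list[bool]) -> tuple[int, str]:
    """Score the checklist (1 point per True) and map to a band."""
    score = sum(answers)
    if score <= 2:
        advice = "No CoE yet: named AI lead, annual review"
    elif score <= 4:
        advice = "Lightweight CoE: AI lead, shared policy, quarterly review"
    else:
        advice = "Formal CoE with dedicated resource"
    return score, advice

# Example: a firm meeting four of the eight conditions
score, advice = coe_recommendation(
    [True, True, False, True, False, True, False, False]
)
print(score, advice)
```

Remember the caveat that follows: the score is a filter, not a verdict, and sector-specific exposure can override the bands in either direction.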
The checklist is a starting point, not the final answer. A firm in a highly regulated sector may need a CoE at 2 points, because the regulatory exposure alone justifies the governance investment. A small, fast-growing consultancy may delay to 5 points if its AI use is concentrated in a single team led by someone senior who is already providing de facto coordination.
Three maturity stages for UK businesses
A CoE looks different at different organisational sizes. The three stages below describe the function at typical UK mid-market maturity levels, with resource expectations.
Stage 1: 0 to 25 AI-active employees. A named AI lead (existing role, approximately 20% of their time), a shared policy document (the AI acceptable use policy covered in our separate AUP guide), and a quarterly review meeting. No dedicated budget line needed; the function is folded into IT or operations. At this stage, the CoE is one person with a light process overhead, not a team.
Stage 2: 25 to 150 AI-active employees. AI lead with 0.5 to 1 FTE commitment, a formal tool vetting process, a training programme aligned to the organisation's literacy framework, and a compliance review process. Estimated range based on UK market rates: £40,000 to £80,000 per year including a share of the AI lead's time and training provision for staff. The CoE now has enough structure to produce and maintain documentation on a regular cycle.
Stage 3: 150+ AI-active employees, or a regulated sector regardless of size. Dedicated AI lead with 2 to 3 supporting roles (often a training coordinator, a compliance analyst, and a technical lead), a formal governance board, and quarterly reporting to the main board. Estimated range based on UK market rates: £100,000 to £250,000 per year. At this stage, the CoE may justify a Knowledge Transfer Partnership (KTP) with a university or a formal arrangement with an external AI advisor.
These budget ranges are estimates based on UK market rates as of early 2026. They assume in-house staff at typical mid-market salary bands and external training delivered by a specialist provider. They do not include the cost of AI tool licences themselves, which sit within departmental budgets rather than within the CoE.
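The stage thresholds and budget ranges above can be expressed as a simple lookup. This is a rough sketch for planning conversations only: the regulated-sector override follows the text, but the exact boundaries are judgement calls, not hard rules, and the figures are the same estimates given above.

```python
# Stage selection per the three maturity stages: headcount bands,
# with regulated-sector firms treated as Stage 3 regardless of size.

def coe_stage(ai_active_employees: int, regulated_sector: bool) -> int:
    """Return the CoE maturity stage (1-3) for an organisation."""
    if regulated_sector:
        return 3
    if ai_active_employees < 25:
        return 1
    if ai_active_employees < 150:
        return 2
    return 3

# Estimated annual budget ranges in GBP (UK market rates, early 2026),
# excluding AI tool licences, which sit in departmental budgets.
BUDGET_GBP = {
    1: (0, 0),              # folded into an existing role (~20% of time)
    2: (40_000, 80_000),
    3: (100_000, 250_000),
}

stage = coe_stage(60, regulated_sector=False)
print(stage, BUDGET_GBP[stage])
```

A 60-person unregulated firm lands in Stage 2; the same firm in financial services would be planned as Stage 3 from the outset.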
The UK government has stated a target of upskilling 10 million workers by 2030 through free AI training initiatives. Where available, CoE programmes should make use of these government-backed resources to reduce the training line in the CoE budget. At the time of writing, availability varies by sector and region, so the CoE should check the latest DSIT-published programmes before committing to paid providers.
The AI champions model: an alternative to a formal CoE
Many UK SMEs cannot justify a formal CoE but need more than no governance. The AI champions model is the practical middle ground and is often the right answer for organisations with 20 to 150 employees.
The champions model identifies one person per department who takes responsibility for AI standards within that team. Champions are not the most senior person, and they are not necessarily the most technical. They are selected because they are curious about AI, respected by their peers, and willing to take on a coordinating role alongside their main job.
Champion responsibilities:
- Promote the approved tool list within their team and flag non-approved tools if they appear.
- Maintain a prompt library for their department's most common tasks.
- Share new tools and techniques for central vetting before wider adoption.
- Escalate compliance concerns to the central AI lead.
- Participate in a monthly champions meeting (one hour) to share learnings across departments.
The champions model works because peer-to-peer knowledge transfer is significantly more effective than top-down training for AI adoption. A champion in each department means every team has a first port of call who speaks their language and understands their work.
Champions should be at Level 3 minimum on the 5-level AI literacy framework: that is, capable of integrating AI into workflows and identifying automation opportunities. An organisation that does not yet have any Level 3 staff should prioritise developing them before appointing champions.
When to involve external support
Building a CoE from scratch is faster with external input on tool vetting, governance frameworks, and initial training design. External support is particularly justified in three scenarios.
First, when the business is entering a new regulated sector. Moving from, say, a general B2B service into healthcare or financial services introduces regulatory frameworks the CoE's internal staff have not worked with before, and the learning curve for those frameworks is long.
Second, when the AI technology stack is changing significantly. A move from off-the-shelf tools to a RAG deployment or an agentic AI system requires skills most internal teams do not yet have. External support accelerates the build and reduces the risk of an expensive false start.
Third, when there is a specific compliance trigger. An audit question, a client request for AI documentation, an FCA inquiry, or a near-miss incident all create time pressure that an internal CoE in early maturity cannot absorb.
In all three cases, the role of external support is to accelerate internal capability, not to replace it. The CoE remains the owner of AI governance; the external advisor supplies expertise, templates, and review capacity during the build phase.
Further reading and services
A CoE only works if the organisation has the underlying policy, literacy, and tooling in place. For the AUP that underpins most CoE work, see our AI acceptable use policy guide. For the training framework that defines target literacy levels for each role, see our 5 levels of AI literacy. For strategic advice on governance design, CoE staffing, and vendor selection, see our AI strategy consulting. For more on building AI-ready teams, see the AI training and capability building section of the Knowledge Hub.
Frequently asked questions
- What is the minimum size of business that needs an AI Centre of Excellence?
- There is no strict size threshold. The decision is driven by the number of AI tools in use, the degree of coordination between departments, the regulatory sector, and whether AI-related incidents have occurred. A 30-person firm in a regulated sector may need a lightweight CoE; a 200-person firm using a single AI tool in one department may not. The scored decision checklist in this guide is a better filter than a raw headcount figure.
- How much does it cost to set up an AI CoE in a UK mid-market company?
- Estimated ranges based on UK market rates in 2026: Stage 1 (0–25 AI-active employees) typically costs nothing beyond 20% of an existing role's time. Stage 2 (25–150) costs approximately £40,000 to £80,000 per year. Stage 3 (150+ or regulated sector) costs approximately £100,000 to £250,000 per year. These figures exclude the cost of the AI tool licences themselves, which sit within departmental budgets.
- Can a single person run an AI CoE in a 50-person business?
- Yes, for a Stage 1 to early Stage 2 organisation. A single AI lead on 20% to 50% of their time can maintain the approved tool list, coordinate training, and escalate compliance questions. The risk is bus factor: if that person leaves, the CoE disappears with them. Appointing one or two department-level champions to shadow the lead mitigates this risk and improves resilience.
- What is the difference between an AI Centre of Excellence and an AI team?
- An AI team builds AI systems: custom models, RAG deployments, agentic workflows. An AI Centre of Excellence governs AI use across the organisation: policy, training, tool vetting, compliance. The two functions can coexist or overlap in larger firms. In most UK mid-market businesses, the CoE comes first and is light-touch; an AI team, if needed at all, is a separate and later investment.
- How does an AI CoE relate to ISO 42001 certification?
- ISO 42001 is the AI Management System standard published in December 2023. It requires a documented management system covering AI policy, risk assessment, data governance, and continual improvement. A CoE is the practical vehicle for running such a management system. Organisations pursuing ISO 42001 certification typically staff the management system through a CoE, and the two structures align closely in scope and activities.