The missing backbone behind your stalled AI strategy

Without a formal AI CoE, most organizations stall at experimentation; this is the blueprint for building the structure that drives scale.

Most organizations are somewhere on the path to becoming AI-driven, but few have figured out how to scale their wins. It’s one thing to have pilots popping up in silos; it’s another to orchestrate those wins across the enterprise in a way that builds momentum. In our work, one of the clearest indicators of AI maturity is whether an AI Center of Excellence (CoE) is in place.

Can you get wins without one? Absolutely. But can you scale those wins in a repeatable way across people, process and technology without a CoE? We haven’t seen it yet.

The AI maturity journey

In our work, we typically define AI maturity across five distinct levels. The earliest is what we call the student stage. Here, organizations are recognizing the opportunity, educating teams and making sense of the technology landscape. Many companies lived in this zone throughout 2023 and part of 2024.

Then comes the explorer stage. This is when AI use cases start taking root, but in pockets. In some orgs, AI gets folded into broader digital transformations. In others, it’s managed off the side of someone’s desk, without dedicated resources or consistent tooling. It’s progress, but often chaotic, and hard to scale. 

Still, these early phases serve a critical purpose: they help organizations prove the business value of AI. Even when pilots are executed in silos, they can provide the kinds of wins that help justify investment in a formal CoE. One client we worked with, for example, built a successful AI solution within a single business unit. They did it without a dedicated team, budget or enterprise tech stack. But the results spoke for themselves, and when other business units began adopting the solution, it created the momentum to accelerate adoption of AI across the enterprise. In other words: don’t underestimate the importance of getting some runs on the board. Most organizations won’t be able to fund a CoE without them.

Then comes the turning point: the builder stage. This is where organizations begin laying the foundation to industrialize AI. The biggest leap from explorer to builder is the emergence of a dedicated AI Center of Excellence. This is the structure that transforms scattered experimentation into coordinated momentum. It facilitates the creation of an enterprise AI strategy, defines and enforces governance and serves as product manager for a central Data and AI Marketplace of reusable components. It also drives enablement across the organization: equipping executives to steer with strategy, enabling technical teams to build effectively and empowering power users to adopt AI tools with confidence. All of this is done with the intent of establishing a federated operating model. While the CoE may not lead every initiative, it provides critical expertise to the initiatives that matter most. In short, it is the backbone for scaling AI responsibly.

From there, organizations can advance to the scale stage and ultimately to commander territory, embedding AI into how they design, deliver and operate across every part of the enterprise.

Getting the CoE right

So, if the CoE is such a pivotal unlock, how do you design it for success? We’ve seen three models in the field, each with its own strengths and shortcomings.

The consultative model

This is the lightweight option: a team of thinkers, often chartered to set policy and review use cases. It can feel a lot like a governance PMO or an enterprise architecture group that draws beautiful pictures but doesn’t actually build anything. In most cases, this model lacks execution power and can come across as red tape. We don’t recommend it.

The shared service model

A step up. Here, the CoE assembles generalist AI teams and “loans” them out to business units to work on prioritized use cases. As they embed with teams, they help drive adoption of enterprise standards and tooling. But the catch is context, especially industry and functional context. Without deep business knowledge, these teams often end up spending too much time asking basic questions. Helpful? Yes. Scalable? Not quite.

The teach-to-fish model (our recommendation)

This model strikes the balance. The CoE acts as a central hub for strategy, enablement, standards and education. But delivery happens in the spokes, within the business units and functions themselves. The CoE exists to empower, not to approve. It provides infrastructure, reusable assets, training and guardrails. The BUs retain ownership for use case delivery, funding and outcomes.

Every organization will scale differently, but there are four roles we consistently see as critical to making this model work effectively.

First is the AI strategy leader. This person serves as the connective tissue across the enterprise, defining the AI roadmap, evolving the operating model and orchestrating how the rest of the CoE supports the broader organization. They think through priorities, risks and investment sequencing, and often develop reusable assets such as intake forms, validation templates and lifecycle checklists that domain teams can adopt and adapt for their own use cases. They also play a critical role in promoting awareness and adoption of responsible AI frameworks, facilitating reviews for sensitive or cross-functional use cases, often via an ethics or risk committee.

The second essential role is the architect. This individual owns the technology architecture that underpins enterprise-scale AI. They’re responsible for designing and maintaining the shared infrastructure: things like secure, GPU-enabled sandboxes, model registries and MLOps pipelines. These inputs allow domain teams to build and deploy responsibly and efficiently. They also define and enforce enterprise-wide data governance standards, recognizing that, like any technology, AI depends entirely on the quality and context of the data it consumes.

Next is the teacher, a role we think every CoE should prioritize early. This person leads the education motion across the organization, building awareness around the benefits and risks of AI and enabling teams to upskill continuously as the technology evolves. They’re responsible for designing role-based learning programs and for training the spokes on key delivery processes and enterprise guidelines.

Finally, we have the engineers. These are generalist AI engineers — data scientists and data engineers — who partner with the business delivery teams in the spokes. They help accelerate use case delivery by supporting data preparation, model development and deployment, especially for aspects that don’t require deep domain knowledge. They also contribute to the ongoing development of the data and AI marketplace, ensuring teams across the organization have access to curated, high-quality data products and vetted, reusable models.
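To make “vetted, reusable” a little more concrete, here is a minimal sketch of the kind of metadata a marketplace catalog entry might carry. The MarketplaceAsset class, its fields and the governance check are illustrative assumptions, not a reference to any particular platform or client implementation; in practice this kind of metadata would typically sit on top of an organization’s existing model registry and data catalog tooling.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List


@dataclass
class MarketplaceAsset:
    """Hypothetical catalog entry for a reusable asset in a central Data and AI Marketplace."""
    name: str                  # e.g., "invoice-classifier-v2"
    owner_unit: str            # business unit accountable for the asset
    asset_type: str            # "model", "dataset" or "pipeline"
    data_classification: str   # governance label, e.g., "internal" or "restricted"
    approved_use_cases: List[str] = field(default_factory=list)
    last_reviewed: date = field(default_factory=date.today)

    def is_approved_for(self, use_case: str) -> bool:
        """A simple governance gate a CoE might ask spokes to apply before reusing an asset."""
        return use_case in self.approved_use_cases


if __name__ == "__main__":
    asset = MarketplaceAsset(
        name="invoice-classifier-v2",
        owner_unit="Finance",
        asset_type="model",
        data_classification="internal",
        approved_use_cases=["invoice-triage", "spend-analytics"],
    )
    print(asset.is_approved_for("invoice-triage"))  # True
    print(asset.is_approved_for("hr-screening"))    # False
```

Even a lightweight schema like this gives the CoE a consistent way to answer the two questions that matter most at scale: who owns this asset, and what is it approved for.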

Together, these roles don’t just support delivery; they enable it. They form the core of an AI CoE that’s designed not to control from the center, but to empower the edges.

From red tape to rocket fuel

Done right, an AI CoE is not a bottleneck. It’s a force multiplier. It’s what turns isolated wins into a flywheel. And if you want to drive AI maturity forward, if you want to scale, start with the center.

This article is published as part of the Foundry Expert Contributor Network.

Michael Bertha
Contributor

Michael Bertha is a partner at a strategy and management consulting firm, where he leads the firm’s central office. He has 15+ years of experience advising digital and technology executives across industries, helping Fortune 500 and high-growth companies use technology as a strategic advantage. His focus spans strategy, operating model design and transformation. Michael began his career in business application development and data migration before moving into strategy consulting. He holds an MBA from Cornell and a Master’s in the Management of IT from the University of Virginia.

Chris Davis
Contributor

Chris Davis is a partner at a strategy and management consulting firm specializing in the intersection of business strategy and technology. Chris is the head of the firm’s West Coast office, where he advises Fortune 500 CIOs and digital executives on the role that technology plays in differentiating the customer experience, developing new products and services, unlocking new business models and improving organizational operations.
