


Hakkoda Labs adds homegrown AI agent to its data team

Feature
Jun 27, 2025 | 7 mins

The data consultancy transformed a skunkworks experiment into a ‘support engineer’ AI assistant that helps automate time-consuming data migration tasks — with higher-level agentic AI ambitions on the horizon.

Patrick Buell
Credit: Patrick Buell / Hakkoda Labs


Hakkoda Labs took the road less traveled and built AI agents that disrupted their own business processes, effectively handing over their proprietary data migration IP — the company’s secret sauce — to any developer.

The project itself was launched as a skunkworks challenge to build an AI business analyst. The data consultancy formalized the effort in mid-2023, hiring AI engineers to help pilot the project. Dubbed a “support engineer” model, the project used OpenAI’s large language models (LLMs) to perform data migration tasks, such as source-to-target mapping; extract, transform, load (ETL); and extract, load, transform (ELT) — the essential work of data scientists and data engineers.

Many enterprises are using AI for automation, but Hakkoda’s “sophisticated and specialized” AI agents for developer-oriented use cases such as ETL and schema matching are far less common than other generative AI applications such as document summarization and content creation, according to one of the company’s founders.

“The sole charter was to disrupt our own business and build AI agents in-house and make them extensible,” says Patrick Buell, co-founder and chief innovation officer of Hakkoda.

“Our agents are constantly looking at the jobs our people are doing and take them and do something more powerful — generate the transformation scripts needed to build an ETL job,” says Buell, who began his career as an ETL engineer in the oil and gas industry.

Data cleaning, preparation, and consolidation are arguably the most tedious and arduous tasks required to build a quality, error-free AI agent. Hakkoda is also a major partner of AWS and Snowflake.

ETL involves data integration processes used to pull data from various sources, such as thousands of Excel spreadsheets or Power BI reports, and transform it for loading into a target storage destination such as a Snowflake data lake. ELT is similar, but data is extracted and loaded into the target system first, with transformations applied afterward inside the destination.
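The difference is just the order of the steps. A minimal sketch, with in-memory stand-ins for the source and warehouse and invented record fields (this is illustrative only, not Hakkoda’s actual pipeline code):

```python
# ETL vs. ELT sketch. All names and data here are hypothetical.

def extract(source_rows):
    """Pull raw records from a source system (here, a list of dicts)."""
    return list(source_rows)

def transform(rows):
    """Normalize records: trim names, cast amounts to float."""
    return [{"name": r["name"].strip().title(), "amount": float(r["amount"])}
            for r in rows]

def load(warehouse, rows):
    """Append records to the target store (here, a plain list)."""
    warehouse.extend(rows)
    return warehouse

source = [{"name": "  acme corp ", "amount": "1200.50"}]

# ETL: transform in flight, then load clean data into the target.
etl_warehouse = load([], transform(extract(source)))

# ELT: load the raw data first, then transform it inside the target.
elt_warehouse = load([], extract(source))
elt_warehouse = transform(elt_warehouse)

assert etl_warehouse == elt_warehouse  # same result, different order of steps
```

In practice the ELT transform runs inside the warehouse itself (for example, as SQL in Snowflake) rather than in application code, which is why ELT has become common with cloud data platforms.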

Hakkoda’s AI agents, also dubbed “copilots,” automate the tasks required to perform data migrations but require human input with expertise to reduce or eliminate hallucinations.

The company points out these specialized agents can quickly process and make sense of troves of information in an enterprise, resulting in massive time savings.

“Hakkoda’s main strength has been its focus. The company has targeted its efforts on a specific problem: improving the data modernization process using AI tools to get clients off legacy systems,” says Reid Sherard, research manager of enterprise intelligence services at IDC.  

“CIOs may find value in Hakkoda’s deep industry expertise as well as defining data modernization as an ongoing process that has a common direction but may have different goal posts depending on where a client is starting  and where they are trying to end up,” he says.

From skunkworks to potential agentic AI

The skunkworks project was originally undertaken by a few engineers who wanted to see what was possible with generative AI. Before long it was formalized with an R&D team tasked to pilot the AI data migration agents.

“As a modern data consultancy, we wanted to build AI agents in-house and make them extensible and user friendly across the organization for many purposes,” says Buell, an avid mountaineer and skier like several other company founders who named the firm after the famed Japanese mountains.

These AI agents make intelligent decisions and “generate the mapping in Excel or even take it a step further and actually build out the data pipeline that would populate a table,” Buell says, noting that human data engineers then validate the output to confirm whether the agents got it right. “They are reinforcing the learning of the agent so it can do something the right way next time.”
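The mapping an agent proposes, and the human validation pass that follows, can be pictured with a simple structure. This is a hypothetical sketch with invented column and transform names, not Hakkoda’s internal format:

```python
# Hypothetical source-to-target mapping of the kind an agent might generate,
# followed by a validation pass standing in for human engineer review.

proposed_mapping = [
    {"source": "cust_nm", "target": "customer_name", "transform": "TRIM"},
    {"source": "ord_amt", "target": "order_amount",  "transform": "CAST_FLOAT"},
    {"source": "ord_dt",  "target": "order_date",    "transform": "PARSE_DATE"},
]

KNOWN_TRANSFORMS = {"TRIM", "CAST_FLOAT", "PARSE_DATE"}

def validate_mapping(mapping, target_columns):
    """Split proposed rows into (accepted, rejected) against the real schema."""
    accepted, rejected = [], []
    for row in mapping:
        ok = row["target"] in target_columns and row["transform"] in KNOWN_TRANSFORMS
        (accepted if ok else rejected).append(row)
    return accepted, rejected

accepted, rejected = validate_mapping(
    proposed_mapping,
    target_columns={"customer_name", "order_amount"},  # no "order_date" in target
)
# Rejected rows would be fed back to the agent as the correction signal
# described above, so it "can do something the right way next time."
```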

Hakkoda is now modeling an even more advanced version in which agentic architecture comes into the picture. “Our team figured if we could create an agent to perform specific tasks, why can’t we create an agent capable of building other agents?” Buell asks.

Technical disruptions and resistance to change

Hakkoda faced two key challenges trying to implement such ground-breaking technology: the unprecedented pace of innovation in the overall AI market and developers’ resistance to change.

“The pace of change [in AI innovation] is the biggest challenge. It’s moving so quickly that staying ahead of it while building these things is a challenge,” Buell says. “It’s a constant refactoring and possible rearchitecting as major AI model vendors constantly ‘drop’ new capabilities.”

Google, for instance, announced its Agent2Agent interoperability protocol at its Cloud Next conference in April, a move that suddenly affected how Hakkoda was developing an AI agent. Changes are constant, but developers try to architect AI agents to avoid the need to rebuild mid-process, Buell says.

A much bigger challenge, however, was persuading developers — internally and outside of the labs — to experiment with the AI agents.

“Engineers are horrible at adopting this,” Buell says. “There is a bit of resistance especially among technologists who pride themselves on their technical ability and feel like it’s almost an attack on their identity to suggest an agent is better at it or asking a developer to automate yourself.”

Hakkoda adopted a project-by-project approach to change management to ensure its engineers would embrace a CI/CD feedback loop with its R&D team to refine and fully deploy this “radically” new technology effectively, according to company representatives.

The project earned Hakkoda a 2025 CIO 100 Award for IT innovation and leadership.

A new chapter

Shortly after that award was announced in late March, IBM acquired Hakkoda to expand IBM Consulting’s data transformation services portfolio. Big Blue cited Hakkoda’s generative AI “powered assets” that would speed up data modernization projects as another reason for buying the consultancy.

IBM’s acquisition will enable Hakkoda to stay on its original mission — embracing open source — while giving its developers access to a vast repository of research data, Buell says.

“I really like how the strategy for IBM and IBM technology is open source because that allows us to then wall these things off, make them Apache-compliant and guarantee security and governance of the usage of it, versus having a more OpenAI approach, where you have less control over that model,” Buell notes.

Analysts say Hakkoda offers premium value to developers working in the AI era.

“IBM’s acquisition of Hakkoda positions developers to work smarter, faster, and with higher-quality data, ultimately accelerating their journey towards leveraging AI for tangible business impact,” says David McCarthy, a research director at IDC.

“Developers working on AI/ML projects will have access to more efficient and automated tools for data preparation, which is often the most time-consuming part of an AI initiative,” McCarthy adds. “This means less manual data wrangling and more time for model building and deployment.”


Senior Writer

Paula Rooney is a senior writer at CIO.com, where she focuses on how CIOs deploy AI, cloud, and digital technologies to transform their organizations. A veteran IT journalist, Paula has reported for PC Week, CRN, Linux.com, The Register, TechTarget, ZDNet, and UBM, among other outlets. She holds a master’s degree in journalism from Columbia University and was most recently recognized with ASBPE Regional Silver and Regional Bronze awards for her enterprise news story “AI to go nuclear? Data center deals say it’s inevitable” and her case study “LA Public Defender CIO digitizes to divert people to programs, not prison.”
