In our interview with IDC's Katie Norton, we learn about AI inside vs. AI outside, how to build an AI-ready organization, and why success demands rethinking strategy rather than simply selecting the best tools.
Hi. We're here at IDC's 60th annual Directions conference in San Jose, California. And I'm delighted to be joined today by Katie Norton, who is the research manager, DevSecOps and software supply chain security, for IDC.
In essence, she's an expert, and we're here to talk to the experts today.
So Katie, we're talking all things AI. I'd love to set the level. I'd love to hear, in a sentence or two, what is the state of play with AI among the IT buyers you're talking to every day? What's top of mind for them? Yeah.
So with application development and security right now, you know, we're at a point where it's been really focused on productivity, right?
We talk a lot about coding assistants and chatbots and how AI is really helping developers gain productivity and efficiency across the software development lifecycle.
But as we're shifting our view towards agentic AI and talking about what that means in terms of application development, we're really seeing this realization and an opportunity to eliminate toil from the development lifecycle.
And this is something we've been talking about since, like the dawn of DevOps.
And it's finally coming into view, but it's also really creating these challenges and questions for organizations of like, what does the developer even look like in today's day and age, and how does this impact, you know, the larger ecosystems around software development?
You have to think about things like integration and governance and just the overall processes around agents.
And so there are a lot of these questions for organizations right now of, what do we need to do to actually achieve the gains we're looking for?
So I like to call it the Jurassic Park question, which is: just because we can do something, should we? Exactly. And what you're describing there is amazing.
Of course, there's lots of technology that can help and support actions, but at the end of the day, what are we trying to achieve? And how can this technology support it? Yeah.
And something I hear often from the CIOs I speak to as part of my role with 娇色导航magazine is this sense of frustration that AI projects are all kind of lumped together in one amorphous thing, right?
Let's just put some AI on it, and that's going to solve a problem, that kind of thing. Now, I've heard you speak really eloquently about the concept of AI inside versus AI outside. Yeah. Can you speak to that? Yeah. Yeah, it's a really important distinction to make.
I don't know if it's the best analogy, but it's really important to understand.
I think, you know, AI inside you can think of as: how am I incorporating AI into the processes, platforms, and tools that I'm already using or investing in?
So the examples here being AI layered into our observability platform or, you know, anomaly detection in the cybersecurity tools we're leveraging internally. But then AI outside is more about the organization being an AI builder.
How do I incorporate AI into the products and services that I'm delivering to my customers?
And so it's a really important point to distinguish for CIOs, so that they can have this conversation around, where am I buying AI and where am I building AI? And that helps the conversation be a little more clear. Yeah.
I also think, in my experience, that it's important then to articulate that back to their business colleagues, right? Because there's just this presumption that, hey, AI is going to change everything. And there's a big difference between buying an external service to make things more efficient. Yeah.
And doing something radically transformative and different. So thinking about that concept of AI inside, driving efficiencies within existing processes.
I think one big challenge I hear about a lot is demonstrating return on investment. Yes. So are organizations seeing ROI today? How is it being measured? Is it being measured? Yeah. ROI, when it comes to AI and application development, is the elephant in the room right now, because we have seen that organizations have invested significantly in things like coding assistants.
You know, as many as almost 90% of developers in large enterprises are using coding assistants, but yet we don't really have great measurements yet for what these productivity returns are supposed to look like. Especially with coding assistants and developers, we've developed some metrics, like shortened lead time to change, or you're introducing pull requests more frequently.
But there are these aspects of AI and coding assistants that are more squishy, for lack of a better way to put it, like developer efficiency and developer happiness and elimination of friction. And those things don't factor into ROI spreadsheets all that well. Yeah.
And so organizations are really at this turning point of, okay, I've deployed this across my organization. How can I know that I'm getting those gains?
It hasn't really matured yet. We're still kind of in those early stages of DevOps, before we had DORA metrics and things to measure software efficiency. It's the same kind of state, right, with AI.
So it's the kind of thing that you can present and say, hey, look, we've made things X amount faster, quicker, easier, whatever. Yeah.
But is there a return on that from a business perspective? Hard to argue against and hard to say. Yes, absolutely. So thinking a little bit more broadly, could you speak about the impact of AI on app development?
What are some of the potential implications of baking AI into customer-facing applications? Yeah, so I'm going to dive into my security realm here.
This is where there are huge implications. You know, the upside of AI outside, of incorporating AI into your actual products, right, is that you can really develop these intricate touchpoints with your customers and make your products that much more intelligent and engaging.
But it creates this entirely new attack surface that most organizations aren't really prepared for or haven't put a lot of thought into.
And so, for example, when you incorporate AI into your user experience and into your application, AI operates as a probabilistic system, right? It can respond and it can engage in ways that you can't predict.
And therefore it creates this whole new class of vulnerabilities that the tools we're currently using to secure applications can't even start to address.
And then you have issues of data, right? Organizations have not only their own data but their customers' data. How is this tool leveraging it? Who has access to it? Is it being used to train a model somewhere? Can it be leaked?
And then you also just have these issues around, you know, what is the AI output saying, which we didn't have to think about when securing traditional applications. You know, if your app makes a legal claim that's completely incorrect,
it's not just a quality issue anymore. It's an issue of reputation and compliance for these organizations.
So really, AI is not just this cool new feature you're building. It really is this new, you know, attack surface that you have to lock down and really think about early on in your AI development. Yeah.
So, something that can do an exponential number of things that you can't predict.
Yes. It requires you to think about security and compliance, and it's just totally different. It's almost like an extra dimension. Yes. To the whole scenario. Okay. And then another issue, which is also extra-dimensional: you know, I hear two things all the time, right?
I hear stories of more IT professionals than ever out of work and seeking work,
as AI is able to take on some of those functions previously done by humans. And yet I also hear from CIOs all the time that there's this critical skills gap, that they can't fill empty positions, that they can't figure out what the right positions are to fill.
So how should IT buyers build a winning team and an AI-infused organization?
Yeah, in application development this is really huge, right. And I like to think about it as: we're shifting, especially in this age, from automation to autonomy. I think that's the best way to think about it.
Organizations now have to think less about doing tasks and more about how they're guiding, orchestrating, and controlling these agents.
And so dev and ops teams now need these AI choreography skills more so than anything. They're no longer, you know, doing the tasks themselves.
They're figuring out, how do I provide the guardrails around these agents to get them to do the tasks?
And so I think IT buyers really have to start thinking about this in the sense of having these heterogeneous teams, where you're going even beyond the whole DevOps concept, right?
You're bringing together developers, data scientists, operations, and security, and in many cases subject matter experts, all working together to build these, you know, flexible and reliable systems that can help them deliver the outcomes they're looking to deliver with AI.
And so I think those IT buyers that can, you know, jump into this idea of really pulling these people together, those are the ones that are going to get the real value out of AI.
And it's not just going to be a novelty. Yeah. So it requires a certain amount of ambition, and it requires capable people to be accountable for things, because they might be doing things that weren't previously done. Right.
So you can't have somebody who's already skilled and experienced in that role. Yeah, absolutely. And you know, even what a developer is and does is going to just totally change. You know, at IDC, on the application development team, we have a lot of predictions around that.
Like, developers are going to be developing agents and agent workflows, and doing less writing of code in the way that we think about it today.
Yeah, but in the world of journalism, we tend to think, if an AI can do it, it will be done by the AI, right? So what's the insight? What's the unique thing that you're bringing to the flow?
I think it sounds similar with developers. Okay. Incredible. We could speak for hours about this, and we don't have hours. So I guess I'd love to leave with one piece of advice for IT buyers today.
Like, what should they be thinking about with regard to AI in 2025? Yeah, I think this year, the way I like to think about it is that organizations need to build the AI muscle to deploy AI reliably, repeatably, and at scale.
And to be able to do that, organizations need to think less about, you know, this particular AI tool or this particular model and choosing the right one. It's about thinking from an ecosystem perspective.
Organizations need to be systems designers, not like IT shoppers when they're thinking about AI. Okay, amazing.
Thank you so much, and a great conversation. So thank you so much, Katie. Thank you.