In June 2014, the first commit of Kubernetes was pushed to developer platform GitHub. As the open-source container orchestration tool nears its 11th birthday, Kubernetes now finds itself in a different world, one that is driven by AI and an even greater need for interoperability and standardization.
Can Kubernetes continue to deliver in this new AI and machine learning landscape after 11 years? This was a central question on the minds of enterprise information technology executives and open-source leaders at the KubeCon + CloudNativeCon Europe gathering in London this week.
“The kind of apps running on Kubernetes are a lot different now,” Jago Macleod, director of engineering-Kubernetes for Google Cloud, said during a roundtable discussion for media and analysts at KubeCon Thursday. “This new round of AI/ML workloads is really different than what we built Kubernetes for.”
Rising adoption for new protocol
Questions around the future role for Kubernetes are being fueled by the rapid rise of Model Context Protocol or MCP. This open-source technology was created by Anthropic PBC and launched last year. In just a few months, it has gone from a little-known concept to a growing ecosystem with more than 1,000 community-built servers or connectors, as noted by AI repository Hugging Face.
MCP enables AI models to connect with external data sources and services, without requiring developers to build unique integrations. “MCP is just white-hot right now,” Keith Babo, chief product officer at Solo.io, said in a conference panel discussion. “The question is how much does it grow?”
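MCP's wire format is based on JSON-RPC 2.0, which is what lets one connector serve many models without bespoke integration code. A minimal sketch of what a tool-invocation request looks like; the tool name `search_tickets` and its arguments are invented here for illustration:

```python
import json

def make_tool_call(request_id, tool_name, arguments):
    """Build a JSON-RPC 2.0 request in the shape MCP uses to invoke a tool
    exposed by a server. Any MCP-compatible client can emit this; any
    MCP-compatible server can answer it -- that is the interoperability win."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }

# Hypothetical tool and arguments, for illustration only.
request = make_tool_call(1, "search_tickets", {"query": "billing outage"})
print(json.dumps(request, indent=2))
```

Because the envelope is the same for every tool, adding a new data source means standing up one more server rather than writing one more model-specific integration.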
Babo’s company is betting that MCP will continue to be an attractive tool. On Thursday, Solo.io announced MCP Gateway, which streamlines the integration and governance of AI agents with MCP-compatible toolchains to support cloud-native applications.
The popularity of MCP provides a glimpse into the evolving role of AI in the cloud-native ecosystem, according to Christian Posta, field chief technology officer at Solo.io. “We’re seeing a very interesting convergence,” Posta told News in an interview. “AI models are getting so good we can credibly leverage them in our applications. That’s where the cloud native world comes into the picture.”
Kubernetes orchestrates AI workloads
Although MCP’s interoperability has attracted the support of key industry players such as OpenAI, Kubernetes is still being used to power a new era of intelligent workloads. Hyperscalers such as Google Cloud and Amazon Web Services Inc. are working within the cloud-native community to evolve the container orchestration tool from supporting microservices and stateless applications to high-performance training and inference AI workloads.
Over 12,000 people attended KubeCon EU in London this week.
“Customers are using Kubernetes to orchestrate those workloads,” David Nalley, director of open-source strategy and marketing at AWS, said in an exclusive interview with News. “That’s an early signal that Kubernetes is part of the foundational layer for training AI.”
Being part of the foundational layer will require the ability to automate the deployment of large language models in a cluster across available central processing unit and graphics processing unit resources. The Kubernetes AI Toolchain Operator, or KAITO, is a managed add-on that simplifies the experience of deploying open-source and private AI models in cloud environments such as Azure Kubernetes Service.
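KAITO follows the standard Kubernetes operator pattern: the user declares the model and the compute it needs in a custom resource, and the operator reconciles pods onto matching GPU nodes. A simplified sketch of such a resource, built as a Python dict; the group, kind, and field names below are illustrative placeholders, not KAITO's exact schema:

```python
# Hypothetical custom resource illustrating the operator pattern:
# declare the desired model and hardware, let the controller do the rest.
# apiVersion, kind, and spec fields are invented for illustration.
workspace = {
    "apiVersion": "example.dev/v1alpha1",
    "kind": "ModelWorkspace",
    "metadata": {"name": "demo-inference"},
    "spec": {
        "resource": {"instanceType": "gpu-node", "count": 1},
        "inference": {"preset": "open-model-7b-instruct"},
    },
}

def needs_gpu(resource):
    """A controller-side check: does this workspace ask for GPU capacity?"""
    return "gpu" in resource["spec"]["resource"]["instanceType"]
```

The point of the pattern is that scheduling across CPU and GPU resources becomes declarative: the user states intent once, and the operator handles node provisioning and placement.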
KAITO employs retrieval-augmented generation or RAG capabilities that can bring context into AI models, according to Lachlan Evenson, principal program manager of the cloud native ecosystem for Microsoft Azure. “You’re going to see tools like KAITO and platforms like Kubernetes at the forefront of that,” said Evenson.
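The RAG pattern Evenson describes is simple at its core: retrieve the documents most relevant to a query, then prepend them to the model's prompt as context. A minimal sketch; the naive word-overlap scoring here stands in for the vector similarity search a real system would use, and the sample documents are invented:

```python
def retrieve(query, documents, k=2):
    """Rank documents by naive word overlap with the query; return the top k.
    A production system would use embeddings and a vector index instead."""
    q_words = set(query.lower().split())
    scored = sorted(
        documents,
        key=lambda d: len(q_words & set(d.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(query, documents):
    """Prepend retrieved context to the question -- the core RAG step."""
    context = "\n".join(retrieve(query, documents))
    return f"Context:\n{context}\n\nQuestion: {query}"

# Invented sample corpus for illustration.
docs = [
    "KAITO provisions GPU nodes for model inference.",
    "Volcano adds batch scheduling to Kubernetes.",
    "Kubeflow runs ML pipelines on Kubernetes.",
]
prompt = build_prompt("How does KAITO handle inference?", docs)
```

The model never needs to be retrained on the documents; fresh context is injected at query time, which is what makes RAG attractive for fast-moving enterprise data.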
There are also a number of AI-related Kubernetes projects inside the Cloud Native Computing Foundation universe that are attracting enterprise interest. These include the CNCF incubating project Volcano, which extends scheduling capabilities to multicluster environments to simplify management and support scalable AI workloads, and Kubeflow, a Kubernetes-native framework for developing, managing and running machine learning workloads.
“Kubeflow is something we’re paying a lot of attention to,” said Amazon’s Nalley.
Agents drive quest for tools
One of the reasons why tools such as Volcano and Kubeflow are attracting enterprise attention is the continued growth of interest in AI agents, intelligent pieces of software that can perform specific autonomous tasks. AI agents were a prime topic of discussion at the KubeCon event in Paris last year, and they continued to be a focus this week in London.
Solo.io announced what it labeled the “first open-source agentic AI framework for Kubernetes” with the release of kagent in March and revealed this week that it will contribute the tool to CNCF. Kubiya Inc. launched an enterprise AI stack for agents at KubeCon, designed to provide orchestration and observability for agents at scale.
“What I’m seeing is AI being wrapped around tools,” Omer Hamerman, principal engineer at Zesty.co, a Kubernetes optimization platform for reducing cloud costs, told News. “Since an LLM can be trained on specific models, we’ll be able to deploy engines to do specific things. We’ll see more AI agents play specific roles inside the cluster.”
As the cloud-native community continues to sort through the impact of AI and chart paths for future technologies, it was energized in January by news that Chinese startup DeepSeek had released an open-source AI model rivaling those built by U.S. companies at a small fraction of the cost.
DeepSeek’s decision to open source its technology followed a pattern seen in China for quite a while, according to Jim Zemlin, executive director of the Linux Foundation. The industry executive noted that Chinese companies have adopted huge amounts of open-source technology and have been major contributors over many years.
“It was just a huge moment,” said Zemlin. “People seemed surprised that DeepSeek would open-source this technology. I think that is, net, a good thing. We are going to see more DeepSeeks.”
The prospect of additional AI releases that will roil markets and inject further chaos highlights the uncertainty facing the enterprise world in general and the cloud-native community in particular. In her keynote remarks at KubeCon this week, Christine Yen, chief executive of Honeycomb.io, expressed a belief that writing software still felt magical. Despite the unpredictability that AI models have brought to the technology world, she remained sanguine about the future and what it may bring.
“As we enter this age of AI, I’m weirdly optimistic,” Yen said. “We’ve got this.”
Featured photo: CNCF/X; Mark Albertson/News