Artificial intelligence is driving innovation in cloud-native technologies, streamlining processes and enhancing decision-making across industries. Kubernetes AI continues to reshape enterprise operations, tackling challenges from scaling artificial intelligence workloads to addressing security, compliance and modernization needs.
As organizations navigate these complexities, automation and innovation remain central to unlocking Kubernetes’ full potential, according to Mike Barrett (pictured, left), vice president and general manager of hybrid cloud platforms at Red Hat Inc.
“Security really exploded this year in terms of compliance standards, regulatory assignments and different vertical markets,” he told theCUBE during last week’s KubeCon event. “Typically, we would isolate in Kubernetes at the namespace or at the cluster boundary. So, we had to make that very inexpensive and automated for people to make that choice between a cluster boundary or a namespace boundary.”
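For readers who want to picture the namespace-boundary option Barrett describes, here is a minimal sketch using the official Kubernetes Python client: it creates a tenant namespace and applies a default-deny ingress policy so that traffic must be opened deliberately. The namespace name, label and policy are illustrative assumptions, not Red Hat's implementation.

```python
from kubernetes import client, config

# Sketch only: assumes a reachable cluster and a local kubeconfig.
config.load_kube_config()

core = client.CoreV1Api()
net = client.NetworkingV1Api()

# Create a tenant namespace (name and label are illustrative).
core.create_namespace(
    client.V1Namespace(
        metadata=client.V1ObjectMeta(name="team-a", labels={"tenant": "team-a"})
    )
)

# Default-deny ingress for every pod in the namespace; traffic is then
# opened selectively with additional, narrower policies.
policy = client.V1NetworkPolicy(
    metadata=client.V1ObjectMeta(name="default-deny-ingress", namespace="team-a"),
    spec=client.V1NetworkPolicySpec(
        pod_selector=client.V1LabelSelector(),  # empty selector = all pods in the namespace
        policy_types=["Ingress"],               # no ingress rules listed, so all ingress is denied
    ),
)
net.create_namespaced_network_policy(namespace="team-a", body=policy)
```

A cluster boundary, by contrast, means standing up a separate cluster per tenant, which is why automating the cheaper namespace option matters.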
Barrett, along with an outstanding lineup of cloud-native thought leaders, spoke with theCUBE Research’s Savannah Peterson and Rob Strechay at KubeCon + CloudNativeCon NA, during an exclusive broadcast on theCUBE, News Media’s livestreaming studio. Discussions centered on advancements in Kubernetes scalability, AI infrastructure, open-source collaboration and modern observability tools driving enterprise innovation. (* Disclosure below.)
Here are three key insights you may have missed from theCUBE’s coverage:
1. Kubernetes AI advancements unlock scalable, flexible deployment.
Kubernetes AI advancements are playing a pivotal role in enhancing scalability and flexibility for AI workloads, enabling organizations to address the growing demand for powerful, adaptable infrastructure. Recent upgrades to Google Kubernetes Engine have expanded its capacity to support clusters of up to 65,000 nodes. These enhancements, powered by technologies such as Spanner, meet the scalability needs of modern AI applications, according to Gari Singh, product manager at Google Cloud LLC, and Bobby Allen, cloud therapist at Google.
“If you had an environment where we took the barriers off, we took the limits off, what could you do? That’s what we’re trying to do,” Allen said in an interview during KubeCon. “Before you get there, we want to go ahead and break through that barrier so you can just be unleashed.”
Nutanix Inc. is addressing the growing demand for scalable and secure AI infrastructure with its new Enterprise AI solution, according to Luke Congdon, senior director of product management at Nutanix. Designed to streamline the deployment of AI and machine learning workloads, the platform operates seamlessly across private, public and hybrid cloud environments. By leveraging partnerships with Hugging Face Inc. and Nvidia Corp., Nutanix enhances its ability to deliver scalable, high-performance solutions tailored to enterprise needs.
“We’d like to make sure that you’re going to get production-level Kubernetes with security, with load balancing, with ingress, with everything else that you need, because Kubernetes is really great, but it’s one key important orchestrator piece,” Congdon told theCUBE during the event. “You need so much more. What’s really unique about them coming to Nutanix is they got what has traditionally really been hard for Kubernetes … stateful storage across objects, files, volumes, anything that you need.”
MinIO Inc.’s AIStor further underscores the importance of centralized data management in optimizing Kubernetes AI infrastructure. By integrating AI functionalities directly within the storage platform, AIStor allows enterprises to streamline operations and maximize their AI investments, according to Anand Babu Periasamy, co-founder and chief executive officer of MinIO.
“Unless you put data at the heart of your business, bring all of your data from different teams and centralize and build an AI data repository, you’re not going to have an AI practice,” Periasamy told theCUBE during the event. “We actually brought in cool AI capabilities inside the product itself … you can directly prompt the data and talk to the data.”
CoreWeave Inc.’s innovative use of Kubernetes AI to build its cloud infrastructure highlights how the technology supports not only scalability but also operational resilience. By adopting Kubernetes, CoreWeave has eliminated reliance on proprietary systems, empowering organizations to innovate and deploy AI solutions more efficiently, according to Peter Salanki, chief technology officer of CoreWeave, and Chen Goldberg, senior vice president of engineering at CoreWeave.
“As a company, we would not be where we are today without Kubernetes,” Salanki told theCUBE during the event. “We started out building our cloud later than the traditional legacy hyperscalers. We built our entire stack around Kubernetes.”
Here’s the complete video interview with Gari Singh and Bobby Allen:
2. Open-source collaboration accelerates AI innovation.
Open-source ecosystems continue to be the foundation of innovation in Kubernetes AI, enabling organizations to scale solutions while fostering collaboration across industries. Intel Corp.’s Open Platform for Enterprise AI exemplifies this ethos, providing a vendor-neutral framework that integrates more than 30 cloud-native microservices. This initiative supports enterprises in building customized AI applications with single-click deployment, according to Arun Gupta, vice president and general manager of developer programs at Intel.
“The first combination of OPEA is all of those about 30-plus microservices,” Gupta said during KubeCon. “These are all cloud-native, so they’re published as containers, and you can run them anywhere. Now, the microservices by itself are not very helpful. What’s really impactful for developers and the end customers is blueprints that sit by building those microservices together. We provide that diverse and wide set of integrations for you.”
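As a rough illustration of what "published as containers, so you can run them anywhere" looks like on Kubernetes, the sketch below uses the Kubernetes Python client to deploy one generic containerized microservice. The image name, labels and port are placeholders, not actual OPEA components or blueprints.

```python
from kubernetes import client, config

config.load_kube_config()

APP = "example-microservice"  # placeholder name, not an actual OPEA component

# Deploy one containerized microservice; the image is a stand-in for whatever
# container a blueprint would reference.
deployment = client.V1Deployment(
    metadata=client.V1ObjectMeta(name=APP),
    spec=client.V1DeploymentSpec(
        replicas=2,
        selector=client.V1LabelSelector(match_labels={"app": APP}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": APP}),
            spec=client.V1PodSpec(
                containers=[
                    client.V1Container(
                        name="service",
                        image="registry.example.com/example-microservice:latest",
                        ports=[client.V1ContainerPort(container_port=8080)],
                    )
                ]
            ),
        ),
    ),
)
client.AppsV1Api().create_namespaced_deployment(namespace="default", body=deployment)
```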
Oracle Corp.’s contributions to the Cloud Native Computing Foundation further emphasize the critical role of Kubernetes AI in advancing generative models and simplifying service launches. These investments, supported by tools such as Oracle’s AI Studio, prioritize cross-sector collaboration and innovation, according to Sudha Raghavan, senior vice president of the Oracle Cloud Infrastructure developer platform at Oracle and CNCF board member.
“What’s also been fascinating to see is that graphics processing units, as everybody’s talking, are super expensive,” she told theCUBE during the event. “Of course, nobody wants to keep their [graphics processing units] reserved without running any workload. How do you get this kickstarted, and how do you quick-start? For that, we have built AI Studio for our internal developers to quickly kickstart their gen AI services.”
Red Hat is another key player driving Kubernetes AI innovation through collaborative, community-driven work. Its focused working groups are tackling the challenges of deploying and scaling AI workloads while creating accessible, cost-effective tools for domain-specific applications, according to Sally O’Malley, principal software engineer at Red Hat, and Jeremy Eder, distinguished engineer and chief AI/ML platform strategist at Red Hat.
“How do we deploy [containers]?” O’Malley told theCUBE during KubeCon. “How do we scale them? How do we make them accessible to everybody? That’s what we’ve been working on at Red Hat.”
Here’s the complete interview with Sally O’Malley and Jeremy Eder:
3. Modern tools transform legacy systems and boost efficiency.
Enterprises are embracing advanced tools to transform legacy systems and drive operational efficiency. Red Hat’s recently enhanced OpenShift Virtualization platform is a prime example, providing hybrid cloud solutions that simplify virtual machine migration while meeting regulatory requirements. By focusing on customer needs, Red Hat is driving Kubernetes AI innovation to modernize infrastructure, according to Red Hat’s Barrett and Ju Lim (pictured, right), senior manager of OpenShift product management and distinguished engineer at Red Hat.
“March of 2024, the customer base just wanted to migrate,” Barrett told theCUBE during the event. “They didn’t want to talk about modernization. They wanted to talk about getting off their legacy virtualization platform as quickly as possible. It changed everything we were doing at the beginning of the year. It really made us focus on putting features into the product.”
MultiKueue and OpenTelemetry are also reshaping how organizations manage workloads and optimize observability, according to Morgan McLean, OpenTelemetry co-founder and senior director of product management at Splunk. MultiKueue lets AI jobs be queued and dispatched efficiently across clusters in multicloud environments, while OpenTelemetry’s new profiling capabilities allow real-time analysis of application performance. Together, these innovations help identify inefficiencies, leading to significant cost savings for enterprises.
“You can profile stuff at development time, but for a large production system, you can never emulate its load properly on a single box or even run it,” McLean told theCUBE during KubeCon. “[Profiling] gives you insights that you have no other way of capturing for real workloads, and you can save tons of money with it.”
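To make the queueing idea concrete, here is a hedged sketch of submitting a GPU training Job that Kueue (and, across clusters, MultiKueue) can admit when capacity becomes available. It assumes Kueue is installed and a LocalQueue named "ai-jobs" already exists in the target namespace; the queue name and training image are placeholders.

```python
from kubernetes import client, config

config.load_kube_config()

# Assumes Kueue (and optionally MultiKueue) is installed and a LocalQueue named
# "ai-jobs" exists in the namespace; queue name and image are placeholders.
job = client.V1Job(
    metadata=client.V1ObjectMeta(
        name="train-demo",
        labels={"kueue.x-k8s.io/queue-name": "ai-jobs"},  # hands admission to Kueue
    ),
    spec=client.V1JobSpec(
        suspend=True,  # Kueue unsuspends the Job once a cluster has capacity for it
        template=client.V1PodTemplateSpec(
            spec=client.V1PodSpec(
                restart_policy="Never",
                containers=[
                    client.V1Container(
                        name="trainer",
                        image="registry.example.com/trainer:latest",  # placeholder training image
                        resources=client.V1ResourceRequirements(
                            limits={"nvidia.com/gpu": "1"}  # ask for one GPU
                        ),
                    )
                ],
            )
        ),
    ),
)
client.BatchV1Api().create_namespaced_job(namespace="default", body=job)
```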
Here’s the complete interview with Mike Barrett and Ju Lim:
To watch more of theCUBE’s coverage of KubeCon + CloudNativeCon NA, here’s our complete event video playlist:
(* Disclosure: TheCUBE is a paid media partner for the KubeCon + CloudNativeCon NA event. Neither Red Hat Inc., the headline sponsor of theCUBE’s event coverage, nor other sponsors have editorial control over content on theCUBE or News.)
Photo: News