Artificial intelligence is no longer just a tool for experiments — it is becoming the backbone of enterprise operations. Private AI, running where the data lives, inside systems companies directly control, is reshaping how organizations build and manage their technology stacks.
Instead of treating AI as a service delivered only through hyperscalers or public research models, enterprises are increasingly designing ecosystems that balance open-source innovation with enterprise-grade stability. This shift signals more than an infrastructure decision — it represents the start of a new wave for AI, one where consistency, governance and local control define how businesses harness intelligence at scale, according to Josh West (pictured), global AI ecosystem leader and distinguished architect at Red Hat Inc.
Red Hat’s Josh West joins theCUBE to talk about the future of private AI and open source.
“AI has started with research in centers where things are consumed as a service or models are coming out of research environments such as OpenAI or [Google’s] Gemini,” he said. “Those are possibly consumed through services you’re getting from Salesforce or Workday. Then people are consuming services from public clouds through a token-based interface. But this third wave is really starting now, where people are running AI inference in their local data centers so they have full control of the stack, from the hardware up to the developer experiences.”
West spoke with theCUBE’s Dave Vellante at theCUBE + NYSE Wired: AI Factories – Data Centers of the Future event, during an exclusive broadcast on theCUBE, News Media’s livestreaming studio. They discussed why enterprises are turning to private AI and what the next wave of AI factories will look like.
Open source helps keep private AI consistent
For Red Hat, which stands firm on its focus on open source, AI workloads must span a wide range of hardware platforms, including Arm and RISC-V chips from multiple vendors, yet still deliver reliable results. That consistency depends on shared open-source projects at the center of the stack, according to West.
“In the old days, consistency … used to just be open standards, where everyone would comply with a specific regulation or a specification,” he said. “Open source brought the idea that we can all be based on the same projects, engineering and testing our technologies to work with specific chips. The Linux kernel is a great example; it works on IBM Z, it works on x86, it works on ARM, it works on RISC-V and all these other new emerging standards.”
Open source also gives enterprises that valuable, consistent through-line in a market where hyperscalers and chipmakers push vertically integrated stacks. Instead of committing to one vendor, companies can rely on shared projects such as Kubernetes and OpenShift to keep workloads running the same across environments. That consistency is what provides long-term stability, according to West.
“Whether we are working commercially together or not, open source is the thing that provides consistency and stability for those customers,” he said. “We don’t have to force people to do things, but we believe that people understand Red Hat is leading the charge to shape and steer that toward an enterprise-operable, secure, safe and sound standard environment.”
Employee and customer needs at forefront
The tension between cloud and on-premises remains, but enterprises are finding a balance. Some workloads benefit from hyperscalers’ scalability, while others require the control of sovereign deployments, according to West.
“It’s not a binary decision,” he said. “It’s [about] which use cases you want in a full-control environment. Those tend to be the highest-volume workloads or the ones with the strictest compliance and security requirements.”
To this end, it seems unlikely one model will define private AI. A network of specialized agents managed by an orchestration layer such as Kubernetes will determine how workloads run across environments, according to West. Either way, he suggests these decisions come down to everyday work and customer reach.
“What really matters is how this changes the day in the life of employees,” West said. “How can we reach customers in entirely new ways? That’s going to happen on a platform where you have all the capacity to run wherever you need.”
Here’s the complete video interview, part of News’s and theCUBE’s coverage of theCUBE + NYSE Wired: AI Factories – Data Centers of the Future event:
Photo: News