The race to modernize enterprise infrastructure is reshaping what data centers are and what they’re becoming. No longer just facilities for compute, they’re evolving into intelligence engines where governance, performance and power converge to define the future of data centers and the enterprises they serve.
ZK Research’s Zeus Kerravala talks with theCUBE about validated designs in AI.
That convergence is forcing leaders to rethink architecture itself. Every layer of the stack — from silicon to sovereignty — must deliver efficiency and reliability at scale. It’s an evolution that demands faster systems, according to Zeus Kerravala, founder and principal analyst of ZK Research Inc. As artificial intelligence becomes more central to enterprise strategy, organizations are leaning into integrated systems designed to simplify deployment and reduce complexity.
“With AI, you will see companies … working together … and build another turnkey system,” Kerravala told theCUBE. “These turnkey systems, to me, are the right way to go for an enterprise because when you start to deploy these things, there’s a lot of dials you can turn and levers you can pull in order to tweak the systems. That’s why the validated designs are important.”
During theCUBE + NYSE Wired: AI Factories – Data Centers of the Future event, executives, architects and industry leaders joined theCUBE’s Dave Vellante and John Furrier to discuss the evolving relationship between innovation, governance and resilience in enterprise design. The conversations underscored how data centers have become not just operational hubs, but strategic levers for the AI economy.
Here’s the complete video interview with Zeus Kerravala:
Here are three insights you may have missed from theCUBE’s coverage of the event:
Insight #1: Hardware strategy shapes the future of data centers.
Intel Corp.’s attempt to reclaim foundry relevance marks a critical inflection point in the future of data centers. The global demand for a second source is not just about market diversification — it’s now a matter of geopolitical urgency, according to Diane Bryant, former head of Intel’s Data Center Group. Intel’s sixth foundry push, supported by U.S. government investment and a strategic partnership with Nvidia Corp., is framed as both a national security imperative and a foundational pillar for the AI economy.

Nvidia’s Dion Harris talks with theCUBE about how data centers are evolving to support business growth and global challenges.
“TSMC is running 60% of all silicon in the world,” she told theCUBE. “Someone needs to grab the other 40%. Intel has the opportunity to grab the other 40%. Everybody would like a second source in every area. Now Intel needs to go off and build a foundry. Again, if Intel fails, that opportunity is gone.”
Meanwhile, Nvidia reframes the purpose of AI infrastructure, positioning data centers as more than just enablers. These “AI factories” produce tokens as their primary output, fueling innovation in drug discovery, new materials science and intelligent application development, according to Dion Harris, senior director of HPC, cloud and AI infrastructure solutions at Nvidia. The future of data centers now hinges not only on compute power, but also on the ability to scale, secure and operationalize AI across millions of users.
“These AI factories are revenue centers now,” Harris said during the event. “They’re not just cost centers that are driving efficiency gains and productivity gains. They’re actually driving revenue.”
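Treating tokens as the factory’s output makes that revenue framing easy to sketch. The short Python example below uses entirely hypothetical throughput and pricing figures, not numbers cited at the event, to show how token output maps to a revenue line:

```python
# Illustrative back-of-the-envelope math for the "AI factory as revenue center" idea.
# All figures below are hypothetical placeholders, not vendor or event data.

tokens_per_second = 500_000          # assumed sustained output of one inference cluster
price_per_million_tokens = 2.00      # assumed blended price in dollars

seconds_per_day = 24 * 60 * 60
tokens_per_day = tokens_per_second * seconds_per_day
revenue_per_day = tokens_per_day / 1_000_000 * price_per_million_tokens

print(f"Tokens per day:  {tokens_per_day:,.0f}")
print(f"Revenue per day: ${revenue_per_day:,.2f}")
```

Swap in real utilization and pricing data and the same arithmetic becomes a capacity-planning and revenue-forecasting input rather than a cost estimate.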
While Intel and Nvidia build the geopolitical and architectural scaffolding of AI infrastructure, Dell Technologies Inc. focuses on the enterprise floor, where integration, governance and business use cases collide. Its AI Factory simplifies adoption by delivering pre-integrated systems tailored to organizational maturity, according to Mary Kiernan (pictured), director of gen AI and global consulting at Dell. From proof-of-concept to production-scale deployments, Dell aims to help enterprises match data strategies to outcomes.
“An AI factory for a [Proof of Concept] is a very self-contained thing,” Kiernan told theCUBE. “When you start getting into what does this need to look like in order to provide a cloud-like experience on my data center floor, then it becomes observability tools, it becomes orchestration tools, it becomes security processes, some of which is tooling and some of which is governance. When you think of all of these things together, it can be a phenomenal amount of choice.”
Here’s the complete video interview with Mary Kiernan:
Insight #2: Control replaces chaos as enterprises move from open-source freedom to standardized frameworks of trust and sovereignty.
Private AI is reshaping enterprise infrastructure as companies bring intelligence back inside systems they directly control. For Red Hat Inc., the key to that control lies in open-source collaboration that keeps workloads consistent across architectures and vendors, ensuring enterprises can innovate freely without sacrificing governance, according to Josh West, global AI ecosystem leader and distinguished architect at Red Hat. As organizations shift AI inference closer to home, this third wave of localized intelligence is redefining architectures and shaping the future of data centers.

Red Hat’s Josh West talks with theCUBE about why enterprises are turning to private AI.
“Whether we are working commercially together or not, open source is the thing that provides consistency and stability for those customers,” he said during the event. “We don’t have to force people to do things, but we believe that people understand Red Hat is leading the charge to shape and steer that toward an enterprise-operable, secure, safe and sound standard environment.”
Enterprises are redefining flexibility by pairing cloud convenience with the control of on-premises data centers. Cloudera Inc. positions its hybrid platform as a bridge between these worlds, giving customers consistency, sovereignty and freedom of deployment without the trade-offs that once defined cloud adoption, according to Charles Sansbury, chief executive officer of Cloudera. That hybrid balance lets organizations use private data to train AI models securely while maintaining the agility they expect from the cloud — a key pillar in the future of data centers.
“What we’ve been trying to do is give people the tools, capabilities and frameworks to be able to implement those technologies in a way that they view as being safe and within their purview,” Sansbury told theCUBE. “That’s not to minimize the importance of cloud. As we move forward, we’re going to basically have the same form function for our cloud-based private cloud and on-prem data services.”

Cloudera’s CEO, Charles Sansbury, talks about the debate between cloud and on-prem.
As AI moves from experimentation to implementation, Groq Inc. is redefining sovereignty at the inference layer. The company’s system of chips and GroqCloud software gives enterprises and national telecoms a way to deploy AI securely within regional or organizational boundaries, blending performance with compliance, according to Chris Stephens, vice president and field chief technology officer of Groq. By building inference networks that operate closer to users, Groq reinforces the broader shift toward controlled, energy-efficient architectures designed for scale.
“Most organizations, governments, large entities, startups, innovators, all in between are not likely to be doing training,” Stephens said during the event. “They’re going to be doing inference. Every single time a customer’s interacting with your application powered by gen AI, that’s inference. We all know that, and we’ve seen a solidification of that as a market over the past year and a half.”
That drive for sovereignty and control ultimately depends on the integrity of the systems themselves — the physical layer where reliability is defined in real time. Achieving control in AI doesn’t stop at governance, according to Kerravala. It extends into the foundations that sustain reliability — the architecture driving the future of data centers. As enterprises standardize around secure, high-performance designs, aligning storage, networking and compute becomes essential for dependable AI operations.
“You need three things to make sure your AI works: Fast storage, fast network and fast processor,” ZK Research’s Kerravala told theCUBE. “Everybody understands the role of GPUs. If your network speeds don’t match what you’re trying to do to the GPU layer, and your data access speeds on the storage side don’t match that, one of those as a weak link will create the whole system to not work properly. The three of those things have to be in lockstep with one another.”
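Kerravala’s lockstep point boils down to a minimum over the three tiers: the slowest one sets the pace for the whole system. Here is a minimal Python sketch of that reasoning, using hypothetical bandwidth figures rather than any benchmark:

```python
# A minimal sketch of the "weak link" argument: end-to-end AI throughput is
# capped by the slowest of storage, network and compute. The GB/s figures
# below are hypothetical examples, not measurements.

def effective_throughput(storage_gbps: float, network_gbps: float, gpu_ingest_gbps: float) -> tuple[float, str]:
    """Return the pipeline's effective data rate and the stage that limits it."""
    stages = {"storage": storage_gbps, "network": network_gbps, "gpu": gpu_ingest_gbps}
    bottleneck = min(stages, key=stages.get)
    return stages[bottleneck], bottleneck

rate, limiter = effective_throughput(storage_gbps=40.0, network_gbps=50.0, gpu_ingest_gbps=200.0)
print(f"Effective rate: {rate} GB/s (limited by {limiter})")
# In this hypothetical case the GPUs could ingest 200 GB/s,
# but the pipeline tops out at the 40 GB/s the storage tier delivers.
```

The point of the sketch is simply that over-provisioning one tier buys nothing until the other two are brought up to match it.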
Here’s the complete video interview with Josh West:
Insight #3: Innovation meets infrastructure in shaping the future of data centers.
Startups are becoming the fuel line for enterprise innovation as Dell transforms its global ecosystem into a proving ground for AI at scale. The company’s network of more than 500 startups spans AI, quantum computing and robotics, accelerating Dell’s own development while helping customers deploy emerging technologies faster, according to Satish Iyer, vice president and chief technology officer of technology innovation and research at Dell. By acting as both partner and customer, Dell aims to close the loop between invention and application, turning internal solutions into enterprise-ready offerings.

Walmart’s Sravana Karnati talks with theCUBE about how WIBEY is driving Walmart’s transformation.
“We kind of drink our own Kool-Aid,” Iyer said during the event. “We are bringing these startups, some of them solving a problem within Dell, and then we are turning it around to say, ‘OK, let’s actually take the same approach and same tech to go solve the same problem for our customers.’ And our customers love it.”
As the AI landscape matures, Dell’s startup ecosystem highlights a clear shift toward on-premises infrastructure and localized data control, a model that’s increasingly central to the future of data centers. By supporting these startups with scale, credibility and open-ecosystem access, Dell provides a platform where innovation and infrastructure evolve together, strengthening trust and performance in enterprise AI, according to Iyer.
“One of the main reasons why startups like to work with us is [that] we not only have our AI Factory with Nvidia, but we also embrace open ecosystems,” Iyer said. “If our customers want to actually pick the best-in-class, we are able to demonstrate what those partners could be.”
While Dell builds an external innovation network, Walmart Inc. turns that same AI momentum inward — scaling from experimentation to execution. The company’s Triplet Model, built on multiple public and private clouds and regional edge nodes, enables intelligence to operate close to users — a framework that blends velocity with proximity, according to Sravana Karnati, executive vice president of global technology platforms at Walmart.
“Triplet is a core part of our strategy when it comes to how we run our data centers in the cloud,” Karnati told theCUBE. “We needed to make sure that our infrastructure is as close to our customers as possible. We have multiple regions spread out across the U.S., and in each of those regions, we have three data centers … a couple of public clouds and then our own data centers as well.”
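The proximity principle behind the Triplet Model can be illustrated with a toy routing function. The region names, coordinates and three-site makeup below are hypothetical placeholders for the sake of the example, not a description of Walmart’s actual topology or systems:

```python
# An illustrative sketch of proximity-based routing in a multi-region "triplet" layout.
# Region names, coordinates and the three-site makeup are hypothetical placeholders.
import math

REGIONS = {
    # region -> (latitude, longitude, sites making up the triplet)
    "us-central": (39.0, -94.6, ["public-cloud-a", "public-cloud-b", "own-dc"]),
    "us-east":    (33.7, -84.4, ["public-cloud-a", "public-cloud-b", "own-dc"]),
    "us-west":    (37.4, -122.1, ["public-cloud-a", "public-cloud-b", "own-dc"]),
}

def nearest_region(user_lat: float, user_lon: float) -> str:
    """Pick the region whose coordinates are closest to the user (rough Euclidean distance)."""
    return min(
        REGIONS,
        key=lambda r: math.dist((user_lat, user_lon), REGIONS[r][:2]),
    )

region = nearest_region(40.7, -74.0)  # e.g., a request originating near New York
print(region, "->", REGIONS[region][2])
```

The sketch captures the intent Karnati describes: requests land in the closest region, and each region offers a mix of public-cloud and company-owned capacity to serve them.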
Here is the complete video interview with Satish Iyer:
For more insights from theCUBE + NYSE Wired: AI Factories – Data Centers of the Future event, don’t miss these information-packed segments:
- Mindy Cancila, vice president of corporate strategy at Dell, talks about how enterprise AI deployment is redefining infrastructure and long-term value.
- Kenneth Patchett, vice president of data center infrastructure at Lambda Inc., discusses how graphics processing unit innovation is reshaping data center design and accelerating the push for infrastructure that can handle the demands of tomorrow.
- Beth Williams, global portfolio lead for AI, apps and data at Dell, explores the challenges of scaling AI from pilots to production and the importance of data readiness and governance for successful enterprise adoption.
- Pawel Czech, co-founder and chief executive officer of New Native Inc. and lablab.ai, discusses the companies’ AI-native community and the competitive market for AI companies.
- Pete Shadbolt, co-founder and chief scientific officer of PsiQuantum Corp., talks about how the company is positioning itself at the frontier of the next era of computing.
- Preet Virk, co-founder and chief operating officer of Celestial AI Inc., explores how photonic interconnects, disaggregated memory and new scaling models are redefining the data center into a high-efficiency factory producing tokens — the new currency of AI.
- Andrew Feldman, founder and chief executive officer of Cerebras Systems Inc., discusses the rapid growth in inference demand and the global expansion of data centers to meet it.
- Arvind Jain, founder and chief executive officer of Glean Technologies Inc., talks about the importance of enterprises designing systems that empower users to be their most productive selves.
To watch more of theCUBE’s coverage of theCUBE + NYSE Wired: AI Factories – Data Centers of the Future event, here’s our complete video playlist: