A tidal wave of AI-driven innovation is reshaping global technology. At its forefront is a massive buildout of next-gen infrastructure — comprising data centers, custom silicon, sovereign AI models and a rising culture of AI entrepreneurship.
HUMAIN’s Shehram Jamal discusses sovereign AI’s geopolitical imperative with theCUBE.
Given the proliferation of bespoke next-gen enterprise AI products, what is the ideal roadmap for building them out in a performant, scalable manner?
“There are three stages that we need to go through, and we are currently at stage zero,” said Kevin Cochrane (pictured, middle), chief marketing officer of Vultr (a registered trademark of The Constant Company LLC). “Stage zero is all of the initial build-out of infrastructure for generative AI to train all your large-scale foundational models, which mostly occurs in core geographies. We’re just moving to stage one, where enterprises will start deploying for production inference, open source LLMs, to start adding efficiencies to all of their enterprise workflows. Stage two, we need to go global.”
Cochrane; alongside Andy Hock (second from right), senior vice president of product and strategy at Cerebras Systems Inc.; Shehram Jamal (right), chief product officer for data and models at HUMAIN; and Joseph S. Spence (second from left), chairman and chief investment officer at NativelyAI, spoke with theCUBE’s John Furrier at theCUBE + NYSE Wired: Robotics & AI Infrastructure Leaders 2025 event, during an exclusive broadcast on theCUBE, News Media’s livestreaming studio. They discussed the current state of AI infrastructure development and what lies ahead.
The rise of sovereign AI
Regional AI infrastructure is no longer optional — it’s essential, according to Hock. Notably, national governments are actively investing in data programs, talent development and infrastructure to fuel sovereign AI strategies.
“We’re also seeing broad enterprise adoption, not just of simple chat-based models, but of reasoning models of agentic workflows that can fundamentally change how we process data, how we make decisions and how we operate,” Hock said. “The tipping point … we’re seeing is from an interesting technology into a globally valuable technology. Geographically, there’s a broad global adoption both internationally and by national governments who recognize the economic transformational power of AI.”
Besides innovation, the push for AI sovereignty is also being driven by geopolitical necessity. From Southeast Asia to the Middle East, countries are fast-tracking infrastructure to become self-sufficient in compute and data sovereignty, Jamal added.
“Sovereign AI is the next big thing, which means that you are responsible for your data, responsible for your privacy, and you want to be in full control,” he said. “I think that, now, all these big cloud providers are working with regional countries to make sure that you can have these multicloud in the country, specifically, and as per the local laws as well.”
The importance of culture, language and human-centric innovation
As AI infrastructure globalizes, so too does the cultural DNA of AI itself. Central to that shift are language preservation and local relevance: AI must “speak the language” of every society to ensure equitable access and utility, according to Spence.
“In many cases you have lots of resources, you have lots of electricity, but they don’t necessarily have the billions of dollars to pour in,” he said. “I think different types of models are going to work there. You’ll have some typical standard type centralized models, but you also have distributed models there because they’re less capital intensive for these countries and they can still come to the table and play.”
From regional hackathons to government-partnered initiatives, the message was clear: Distributed infrastructure unlocks distributed innovation, allowing local entrepreneurs to solve local problems using global-grade technology.
On the logistical front, traditional cloud models are rapidly evolving into sovereign, distributed ecosystems. This shift requires more than just moving data centers closer to users — it means developing local talent, governance models and open-source ecosystems that are compliant with national regulations and tailored to local needs, according to Cochrane.
“Distributed computing, first and foremost, means that you have distributed talent, which means that you need to broaden the ecosystem,” he said. “You need to be able to uplift new digital leaders. You need to empower students in universities all around the globe to start building up a new talent pool of people that can contribute to new open-source projects and models, and can innovate at a global scale.”
Technically speaking, the software stack is more critical than ever. New orchestration layers, vector databases and frameworks are rapidly evolving to support the complexity of AI workflows. But while headlines often warn of job displacement, the panel painted a different picture: AI will remain a productivity engine and entrepreneurship catalyst.
Here’s the complete video interview, part of News’s and theCUBE’s coverage of theCUBE + NYSE Wired: Robotics & AI Infrastructure Leaders 2025 event:
Photo: News