Unstructured data is now the constraint shaping how far artificial intelligence platforms can realistically scale.
Enterprises are struggling to scale AI because unstructured data pipelines can’t deliver the latency, throughput and consistency inference workloads demand. Hewlett Packard Enterprise Co. is seeing this pressure firsthand as customers push to keep GPUs busy and simplify how unstructured data moves across increasingly hybrid environments, according to Ed Beauvais (pictured), director of product management for AI and cloud data infrastructure at HPE.
“One of the things that we hear from our enterprise customers is that it’s critical to keep the GPUs busy,” Beauvais said. “One of the areas that we’re investing in is RDMA support, and we believe that customers want to look [at] and infuse RDMA throughout the entire data pipeline.” RDMA, or remote direct memory access, lets systems move data directly between each other’s memory without involving the remote CPU, cutting latency at every hop in the pipeline.
Beauvais spoke with theCUBE’s Rob Strechay at the Future of Data Platforms Summit, during an exclusive broadcast on theCUBE, News Media’s livestreaming studio. They discussed how unstructured data is driving architectural change in AI data platforms, from performance and openness to hybrid management and ROI. (* Disclosure below.)
How unstructured data is redefining AI data platforms
As AI inference expands, infrastructure bottlenecks are surfacing earlier in the data lifecycle. Latency at ingest, movement across tiers and handoffs between systems can idle expensive compute resources. That reality is forcing organizations to treat unstructured data pipelines as performance-critical infrastructure rather than passive storage layers, Beauvais explained.
“To be able to really provide and process inference at scale, you really need to rethink how we’re doing data pipelines end-to-end,” he said. “That’s not just the processing at the end, but how data is ingested and how it works with the broader ecosystem of the data pipeline.”
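That end-to-end framing is easiest to see in miniature. The sketch below is illustrative rather than anything Beauvais described: it overlaps ingest with compute behind a bounded queue so the accelerator never stalls waiting on the pipeline. The names `fetch_shard` and `process`, and the shard URIs, are hypothetical stand-ins.

```python
# Minimal sketch: overlap data ingest with compute so the accelerator
# is never waiting on the pipeline. fetch_shard(), process() and the
# shard list are illustrative stand-ins, not HPE's implementation.
import queue
import threading

SHARDS = [f"s3://bucket/shard-{i}" for i in range(8)]  # hypothetical URIs

def fetch_shard(uri: str) -> bytes:
    # Stand-in for a real read from object storage or an RDMA-capable file system.
    return uri.encode()

def process(batch: bytes) -> None:
    # Placeholder for the GPU inference or training step.
    pass

def producer(q: queue.Queue) -> None:
    # Ingest runs ahead of compute, bounded by the queue depth.
    for uri in SHARDS:
        q.put(fetch_shard(uri))
    q.put(None)  # sentinel: no more data

def consumer(q: queue.Queue) -> None:
    # Compute drains the queue; if the pipeline keeps pace, it never blocks.
    while (batch := q.get()) is not None:
        process(batch)

if __name__ == "__main__":
    q = queue.Queue(maxsize=2)  # a small buffer is enough to hide ingest latency
    threading.Thread(target=producer, args=(q,), daemon=True).start()
    consumer(q)
```

The same overlap principle is what RDMA-infused pipelines pursue at the hardware level: data lands where compute needs it without idling while it waits.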
Data silos compound these challenges by fragmenting access to information that AI systems increasingly depend on. When unstructured data is isolated by format, protocol or location, organizations risk training and inference decisions based on incomplete context. That gap becomes more damaging as AI systems move closer to operational decision-making, Beauvais noted.
“When you think about the intelligence of data, if it’s in a silo and it’s a critical piece of business information, or if it’s a critical piece of information that you’re not aware of, that’s risk to the business,” he said. “You might be making decisions without all that data.”
Openness becomes essential as platforms span environments
Openness is emerging as a practical response to this fragmentation. Enterprises are placing growing emphasis on metadata accessibility and standard formats that allow data to be discovered, queried and reused without custom integration. Making metadata usable is becoming just as important as storing the data itself.
“We want to make sure that customers can get access to their data and, more importantly, metadata,” Beauvais said. “What you’ll see in 2026 is that we see more organizations using technology like MCP, Model Context Protocol, for data discovery.”
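Beauvais didn’t detail an implementation, but the shape of MCP-based discovery is straightforward. Below is a minimal sketch using the open-source MCP Python SDK (`pip install mcp`), assuming a hypothetical `metadata_server.py` that fronts an unstructured-data catalog over stdio.

```python
# Minimal sketch of MCP-based metadata discovery using the open-source
# MCP Python SDK. The server command and its resources are hypothetical;
# Beauvais did not describe a specific stack.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Hypothetical MCP server fronting an unstructured-data catalog.
SERVER = StdioServerParameters(command="python", args=["metadata_server.py"])

async def discover_metadata() -> None:
    async with stdio_client(SERVER) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Ask the server what metadata resources it exposes ...
            listing = await session.list_resources()
            for resource in listing.resources:
                print(resource.uri, "-", resource.name)
            # ... then read one for an agent or pipeline to consume.
            if listing.resources:
                contents = await session.read_resource(listing.resources[0].uri)
                print(contents)

if __name__ == "__main__":
    asyncio.run(discover_metadata())
```

The point of the protocol is exactly what the example shows: any compliant client can list and read the catalog’s metadata without custom integration against each silo.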
Hybrid environments add another layer of complexity to unstructured data management. With data split across cloud and on-premises systems for governance, sovereignty and compliance reasons, organizations need consistent control planes that prevent further fragmentation. Flexibility without unified management only increases operational overhead.
“Customers want that flexibility to leverage the cloud and then to also bring that data back on-premise,” Beauvais said. “Whether that’s a sovereignty issue, a governance issue, a compliance issue, all those things are critical.”
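A “consistent control plane” in this sense is a single query and policy surface over placement-specific backends. The following sketch is a hypothetical illustration of that idea, not an HPE interface: two stores, one inventory call.

```python
# Minimal sketch of one control plane spanning hybrid placements.
# The Protocol and both backends are hypothetical illustrations,
# not an HPE product interface.
from typing import Iterator, Protocol

class DataStore(Protocol):
    def list_objects(self, prefix: str) -> Iterator[str]: ...

class CloudStore:
    def list_objects(self, prefix: str) -> Iterator[str]:
        yield from (f"s3://bucket/{prefix}/obj-{i}" for i in range(3))

class OnPremStore:
    def list_objects(self, prefix: str) -> Iterator[str]:
        yield from (f"file:///mnt/data/{prefix}/obj-{i}" for i in range(3))

class ControlPlane:
    """One inventory and policy surface across every placement."""
    def __init__(self, stores: list[DataStore]) -> None:
        self.stores = stores

    def inventory(self, prefix: str) -> list[str]:
        # The same query is answered across cloud and on-prem alike,
        # so governance and compliance checks see one namespace.
        return [obj for s in self.stores for obj in s.list_objects(prefix)]

print(ControlPlane([CloudStore(), OnPremStore()]).inventory("invoices"))
```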
As enterprises chase ROI, they are also rethinking platform value. Infrastructure that supports AI alone is no longer sufficient. Organizations want unstructured data platforms that can simultaneously support analytics, recovery and resilience without forcing separate architectures.
“The key aspect of getting a massive ROI is having a platform that’s not just limited to a single use case,” Beauvais said. “That’s how we’ve thought about it, and that’s how we want to provide value to our customers.”
Here’s the complete video interview, part of News’s and theCUBE’s coverage of the Future of Data Platforms Summit:
(* Disclosure: HPE sponsored this segment of theCUBE. Neither HPE nor other sponsors have editorial control over content on theCUBE or News.)
Photo: News
