Dell Technologies Inc. is betting big on enterprise AI — not as an add-on, but as the backbone of a new era in infrastructure, from the datacenter to the edge.
With more than four decades as an industry leader, Dell has shown a remarkable ability to adapt to the shifting winds of change. Through the personal computer revolution, the birth of the internet, the rise of cloud computing, the transformative acquisition of EMC and today’s enterprise AI-fueled era, Dell has managed to retool its offerings each time to meet evolving customer demands.
A case can be made that the needs of enterprise customers have never been more complex. Organizations want solutions that simplify scaling AI workloads while maximizing compute efficiency. They demand flexible deployment models that can integrate easily with hybrid cloud environments. They also want services and solutions that minimize complexity and risk for an expanding list of artificial intelligence initiatives.
These requirements arise during a sea change in the enterprise AI computing paradigm, one that is rapidly transforming to accommodate the wholesale adoption of AI. Dell must now remake itself yet again, evolving from a server and storage vendor into a full-stack AI-solutions leader.
“We are witnessing the rise of a completely new computing era,” said Dave Vellante, chief analyst at theCUBE Research. “Within the next decade, a trillion-dollar-plus data center business is poised for transformation, powered by what we refer to as extreme parallel computing, or as some prefer to call it, accelerated computing. While artificial intelligence is the primary accelerant, the effects ripple across the entire technology stack.”
This feature is part of News Media’s exploration of Dell’s efforts in enterprise AI. Be sure to tune in for theCUBE’s analyst-led coverage of Dell Technologies World, May 19-21. (* Disclosure below.)
Disaggregation for enterprise AI
With the growth of AI and edge applications, Dell’s enterprise AI infrastructure strategy reflects customers’ need to balance traditional workloads, such as virtual machines and databases, with emerging AI demands. As enterprise AI gains a foothold in business operations, the data center has become more distributed, driven by a demand for data regardless of where it may reside. This has created what Dell terms a “disaggregated infrastructure,” in which the three tiers — compute, storage and networking — must be pooled as shared, adaptable resources.
“What we’ve learned over the course of the last 10 years is that while hyperconverged was really great if people were focused on a singular ecosystem … you need to move to a disaggregated architecture,” said Travis Vigil, senior vice president, product management at Dell, during a recent interview with theCUBE. “If you want to be ready for generative AI, if you want to set yourself up to have multiple hypervisors in your environment, if you just want to get better [total cost of ownership] in your data center, moving to an architecture that looks a lot like what three-tier was, but is different in a couple [of] key ways, is the right investment to make.”
That investment means customers will look for servers that can support the needs of a scalable data center. Dell’s unveiling in March of its PowerEdge XE8712 liquid-cooled server for large AI and high-performance computing workloads provides a central element of this strategy.
Dell has also partnered with Nvidia Corp. to drive the AI Factory, positioned by the company as an “end-to-end AI enterprise solution” for training and running enterprise AI models. By integrating Dell’s compute, storage, networking, workstations and laptops with Nvidia’s advanced AI infrastructure, customers can take advantage of deployment options across the IT landscape.
“[Customers] can start small and literally just stack that up, not only just within a rack, but create rack scale deployments,” said Adam Glick, senior director of AI portfolio marketing at Dell, in an interview with theCUBE about AI Factory solutions. “We make it super simple to be able to take the hardware. We’ve worked a lot with our friends at Nvidia.”
Moreover, Dell’s portfolio has always stressed openness and customer choice. This is especially important in AI as organizations bring intelligence to data on-premises and run inference at scale. To accommodate a variety of use cases, Dell must offer myriad solutions with optionality around silicon, storage and high-speed networks that support a variety of tools and applications.
Enhancements for edge devices
The pursuit of an “end-to-end AI enterprise solution” involves deployment at the edge. To facilitate this goal, Dell has brought a set of enhancements to its PC lineup that enable on-device AI processing.
The company unveiled updates for its AI PC lineup at the Consumer Electronics Show in January. These enhancements included integration of the latest processor technology from Intel Corp., Advanced Micro Devices Inc. and Qualcomm Inc., along with the addition of the Dell Pro AI Studio, which allows developers to deploy AI applications on Dell AI PCs more easily and rapidly.
“We’ve gone all in on AI; we’re putting it into our devices,” said Sam Burd, president of the Client Solutions Group at Dell, in an interview with theCUBE during the Dell Bets Big on AI PCs at CES 2025 event. “It’s basically accelerators that are sitting next to the [central processing unit], so you can think about that as enabling the apps that we use every day to be more intelligent. Then, for our commercial customers, the most exciting piece to me is they’re going to be able to take their data generated at the edge on their devices, their PCs, and run their AI models on those devices.”
Dell NativeEdge also plays a key role in the firm’s market strategy for AI. The automated software platform is designed to help businesses manage and securely scale edge applications across multiple locations. One example of how NativeEdge can be deployed involves a cold chain application for a large national grocery store business. A cold chain is a supply system that involves transporting and storing temperature-sensitive food items, and Dell NativeEdge provides centralized monitoring and lifecycle management across 1,100 stores, 275 trucks and 55 warehouses for this customer.
NativeEdge emerged from Project Frontier, which was originally launched by Dell in 2022 to address complexities in edge management and deployment. Edge technology is progressing fast, according to company founder Michael Dell, who sees device communication as a key area of growth.
“In the future, most of the edge devices will be machines talking to machines,” Dell said during an exclusive conversation with theCUBE. “And we’re actually already seeing it. It’s not really the future; it’s right now.”
Building cloud-native solutions
AI’s rapid adoption has led enterprises to move from traditional virtualization to cloud-native containerized applications. The shift toward containerization is fueled by a need for model portability and governance and a desire to streamline model integration and management within the enterprise.
This dynamic has led Dell and Red Hat Inc. to pave the way for hardware to serve as a platform for AI development. In September, the two companies announced that Red Hat Enterprise Linux AI, or RHEL AI, would be integrated with Dell’s PowerEdge servers. The collaboration serves the stated goal of simplifying containerized application management and providing a more effective infrastructure foundation for AI.
“A lot of what is being built from an AI and machine learning perspective is actually in containers,” said Rob Strechay, principal analyst at theCUBE Research, during an analysis on theCUBE. “What they’re doing for ease of use and being able to bring the portability of these models [is] containerizing those so that they can actually do governance on some of those models. There’s a lot of containers technology and Kubernetes technology being put into this AI/ML.”
Ease of use and portability have become key requirements for AI deployment. Dell has also leveraged its Data Lakehouse, which was introduced last year, to provide Kubernetes-orchestrated system software as part of making the path easier for customers to launch new AI infrastructure. The Data Lakehouse, built on top of Starburst Data Inc.’s advanced querying platform, enables access to data so enterprises can process it regardless of location. It’s the kind of flexibility that organizations want when it comes to driving AI business applications at scale.
Leveraging multicloud for efficiency
The work Dell is doing in edge and cloud-operating models highlights the importance of providing solutions that can deploy and be managed in diverse computing environments. IT infrastructure is getting more complex by the day, and Dell’s approach to helping customers in this area is centered around providing flexible deployment methodologies and form factors based on software-based storage, manageability and integrations with partners such as Nutanix and Red Hat’s OpenShift, according to Strechay.
“Dell’s vision of ‘ground-to-cloud’ is focused on ensuring hybrid and multicloud workloads tie back to the on-premises data centers, where much of the intellectual property of organizations exists,” he said.
This strategy helps customers transition from “ground-to-cloud” and back again through Storage Services for Public Cloud and storage software for block, file and data protection across cloud providers. It also includes a set of dedicated IT subscription-based services that foster a universal storage layer.
Dell has also focused on the integration of cloud ecosystems on-prem, providing consistency and control. This includes Dell’s implementation of its Cloud Platform with Microsoft Azure that extends AI workload operation and control to on-prem environments.
“I think the story that Dell started to tell is, ‘Hey, wait a minute. Maybe we can do some of this sort of stuff on-prem,’” said Bob O’Donnell, president and chief analyst at TECHnalysis Research LLC, during a discussion on theCUBE. “The basic message … is ‘Why move your data to the AI? Why not move the AI to your data?’ The vast majority of most organizations’ data is still behind their firewall, so it just makes logical sense to do that.”
By following the data, Dell is pursuing a market strategy to build full-stack solutions for multicloud and hybrid ecosystems. Its partnerships with Nvidia, Microsoft Corp. and Red Hat signify its interest in embracing solutions that will support enterprise AI initiatives while answering the call for operational simplicity. As Dell kicks off its annual conference on May 19, more announcements are expected that support the company’s evolving market AI strategy.
To date, this strategy has also had a positive impact on the firm’s financial results. Dell has sold $10 billion in AI-optimized servers during the current fiscal year and expects that figure to grow to $15 billion before the books are closed on the fiscal cycle. The 41-year-old company has shifted with the winds of technology change yet again.
(* Disclosure: TheCUBE is a paid media partner for Dell Technologies World. Neither Dell Technologies Inc., the primary sponsor of theCUBE’s event coverage, nor other sponsors have editorial control over content on theCUBE or News.)