The MSI XpertStation WS300 is a next-generation desktop AI supercomputer designed to meet the growing demands of large language models (LLMs), generative AI, and advanced data science workflows.
The machine is based on the NVIDIA DGX Station architecture launched at GTC 2026. Powered by the NVIDIA GB300 Grace Blackwell Ultra Desktop chip, which supports up to 748GB of high-capacity coherent memory and dual 400GbE network connectivity, the platform brings advanced AI infrastructure capabilities to a compact desktop form factor and is available for order starting today.
“MSI has a strategic vision to drive AI-based computing,” explained Danny Hsu, General Manager of Enterprise Platform Solutions at MSI. “Together with NVIDIA, we are defining the next era of AI infrastructure, uniting centralized performance with distributed innovation and enabling organizations to move from experimentation to production with greater speed, scale, and confidence.”
MSI XpertStation WS300: From Data Centers to the Desktop
The XpertStation WS300 integrates up to 748GB of high-capacity coherent memory, combining high-bandwidth HBM3 GPU memory and LPDDR5X CPU memory into a unified domain that enables efficient data exchange between CPU and GPU for training and fine-tuning large-scale models.
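To illustrate what a unified CPU+GPU memory domain means in practice, the following sketch uses standard CUDA managed memory, in which a single allocation is addressable from both processors without explicit host-to-device copies. This is a generic CUDA example, not code specific to the WS300 or the GB300 chip; the kernel name and sizes are illustrative.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Doubles each element in place; runs on the GPU.
__global__ void scale(float *data, int n, float factor) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= factor;
}

int main() {
    const int n = 1 << 20;
    float *data = nullptr;

    // One managed allocation, visible to both CPU and GPU --
    // no separate cudaMemcpy staging is required.
    cudaMallocManaged(&data, n * sizeof(float));

    for (int i = 0; i < n; ++i) data[i] = 1.0f;     // CPU writes

    scale<<<(n + 255) / 256, 256>>>(data, n, 2.0f); // GPU computes
    cudaDeviceSynchronize();                        // wait for the GPU

    printf("data[0] = %f\n", data[0]);              // CPU reads the result
    cudaFree(data);
    return 0;
}
```

On hardware with a coherent memory fabric, this programming model avoids duplicated buffers and lets large models and datasets be shared across CPU and GPU address spaces, which is the benefit the unified domain described above is aimed at.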
Connectivity is provided by a dual 400GbE interface via the NVIDIA ConnectX-8 SuperNIC. The platform offers up to 800 Gb/s of aggregate network bandwidth to support distributed AI workloads and multi-node scalability.
High-speed NVMe PCIe Gen5 and Gen6 storage accelerates dataset ingestion and AI data streams, ensuring consistent compute utilization during intensive training and inference operations. Combined with support for the NVIDIA AI software stack, the platform provides an integrated hardware and software foundation for seamless AI development and deployment, from the desktop to the data center.
Expanding AI Workflows
The MSI XpertStation WS300 supports the full AI lifecycle, from large-scale model training and intensive data analysis to real-time inference and emerging physical AI and robotics workloads. The platform enables organizations to accelerate deep learning models, process massive datasets efficiently, and run complex AI workloads locally with high performance.
The system can also function as a centralized AI computing node for collaborative fine-tuning and on-demand deployment, providing teams with greater operational flexibility while maintaining control over proprietary data and intellectual property.
