AWS and OpenAI have reached a multi-year strategic agreement that will make AWS's world-class infrastructure available to run and scale OpenAI's core AI workloads from now on. The deal, worth $38 billion, is set to grow over the next seven years and gives OpenAI access to AWS compute.
Amazon's cloud division has extensive experience running large-scale AI infrastructure securely and reliably. Combined with OpenAI's advances in generative AI, this should help improve the availability of ChatGPT, among other things.
OpenAI will immediately use AWS computing as part of the alliance, with the goal of deploying its full capacity before the end of 2026, as well as the possibility of expanding it in 2027 and beyond.
The infrastructure that AWS is developing for OpenAI has an architectural design optimized for maximum efficiency and performance in AI processing. Clustering NVIDIA GPUs, both GB200 and GB300, via Amazon EC2 UltraServers on the same network enables low-latency performance across all interconnected systems.
This gives OpenAI the opportunity to run workloads efficiently with optimal performance. The clusters are designed to support diverse workloads, from serving inference for ChatGPT to training next-generation models, with the flexibility to adapt to the changing needs of OpenAI.
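To give a rough sense of what this kind of low-latency clustering looks like at the API level, here is a minimal, hypothetical boto3 sketch of the general EC2 primitive involved: a cluster placement group that packs instances onto the same low-latency network segment. The instance type, AMI ID, and group name are illustrative placeholders, not details of the actual AWS–OpenAI deployment.

```python
# Hypothetical sketch: launching GPU instances into an EC2 cluster
# placement group for low-latency interconnect. The AMI, instance type,
# and group name are placeholders, not details of the AWS-OpenAI setup.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# A "cluster" placement group keeps instances close together on the
# same low-latency network segment.
ec2.create_placement_group(
    GroupName="gpu-training-cluster",   # illustrative name
    Strategy="cluster",
)

# Launch GPU instances into that placement group.
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",    # placeholder AMI
    InstanceType="p5.48xlarge",         # placeholder GPU instance type
    MinCount=2,
    MaxCount=2,
    Placement={"GroupName": "gpu-training-cluster"},
)

for instance in response["Instances"]:
    print(instance["InstanceId"], instance["Placement"]["GroupName"])
```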
The alliance the two companies have signed builds on their joint work in recent months: earlier in 2025, OpenAI's open-weight foundation models became available through Amazon Bedrock, making OpenAI one of the publicly available model providers on Amazon Bedrock.
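As an illustration of what that Bedrock availability means in practice, below is a minimal sketch of calling such a model through the Bedrock runtime Converse API with boto3. The modelId is an assumed placeholder; the actual identifier for the OpenAI open-weight models should be taken from the Bedrock console or documentation for your region.

```python
# Minimal sketch: invoking an OpenAI open-weight model hosted on Amazon
# Bedrock via the Converse API. The modelId below is a placeholder;
# check the Bedrock console/docs for the real identifier in your region.
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-west-2")

response = bedrock.converse(
    modelId="openai.gpt-oss-120b-1:0",  # assumed/placeholder model ID
    messages=[
        {
            "role": "user",
            "content": [{"text": "Summarize the AWS-OpenAI deal in one sentence."}],
        }
    ],
    inferenceConfig={"maxTokens": 256, "temperature": 0.3},
)

print(response["output"]["message"]["content"][0]["text"])
```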
Sam Altman, co-founder and CEO of OpenAI, points out that «Scaling cutting-edge AI requires massive, reliable computing. Our alliance with AWS strengthens the broad computing ecosystem that will drive this new era and bring advanced AI to the world.»
Matt Garman, CEO of AWS, has highlighted for his part that «As OpenAI continues to push the boundaries of what is possible, AWS's leading infrastructure will support its AI ambitions. The breadth and immediate availability of optimized compute demonstrates why AWS is uniquely positioned to support OpenAI's massive AI workloads.»
