Amazon.com Inc. today detailed plans to double its investment in Anthropic PBC to $8 billion.
The announcement of the latest $4 billion cash infusion comes about a year after the cloud and retail giant disclosed its first $4 billion commitment to Anthropic. At the time, the OpenAI rival named Amazon Web Services as its primary cloud provider. The deal announced today will see AWS take on the additional role of Anthropic’s primary AI training provider.
Anthropic introduced its most advanced large language model, Claude 3.5 Sonnet, last month. It’s an improved version of an identically named LLM that debuted a few months earlier. The new Claude 3.5 Sonnet is better than its namesake at several tasks, including code generation, and outperformed OpenAI’s GPT-4o across multiple benchmark tests.
Anthropic offers its LLMs through Amazon Bedrock, an AWS service that provides access to managed AI models. The companies' expanded partnership will give Bedrock users early access to a feature that makes it possible to fine-tune Claude models using customer-provided datasets. Fine-tuning an LLM on domain-specific data often improves the quality of its output on related tasks.
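For developers, a fine-tuning job like this would go through Bedrock's model customization API. The sketch below shows roughly what assembling such a request looks like; all job names, ARNs, S3 paths, the Claude model identifier, and the hyperparameters are hypothetical placeholders, since the exact options Bedrock will accept for Claude fine-tuning under this early-access feature haven't been specified.

```python
# Sketch of a request for Amazon Bedrock's create_model_customization_job API.
# Every concrete value here (names, ARN, S3 URIs, model ID, hyperparameters)
# is a placeholder, not a confirmed configuration for Claude fine-tuning.

def build_customization_request(base_model_id, training_s3_uri, output_s3_uri):
    """Assemble parameters for bedrock.create_model_customization_job."""
    return {
        "jobName": "claude-finetune-demo",        # hypothetical
        "customModelName": "claude-custom-demo",  # hypothetical
        "roleArn": "arn:aws:iam::123456789012:role/BedrockFineTuneRole",  # placeholder
        "baseModelIdentifier": base_model_id,
        "customizationType": "FINE_TUNING",
        "trainingDataConfig": {"s3Uri": training_s3_uri},
        "outputDataConfig": {"s3Uri": output_s3_uri},
        "hyperParameters": {"epochCount": "2"},   # assumed knob
    }

# With boto3 installed and AWS credentials configured, the job would be
# submitted roughly like this:
#
#   import boto3
#   bedrock = boto3.client("bedrock")
#   bedrock.create_model_customization_job(**build_customization_request(
#       "anthropic.claude-3-5-sonnet",   # hypothetical identifier
#       "s3://my-bucket/train.jsonl",
#       "s3://my-bucket/output/",
#   ))
```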
Alongside the go-to-market collaboration, AWS and Anthropic plan to support one another’s product development efforts. Anthropic will use the cloud giant’s AWS Trainium and Inferentia chips, which are optimized for artificial intelligence training and inference, respectively, to power its internal workloads. The OpenAI rival detailed that it will leverage the former processor line to build its “largest foundation models.”
The latest Trainium chip, Trainium2, debuted last November. The processor boasts double the performance of its predecessor and twice the power efficiency. Customers can provision as many as 16 Trainium2 chips per instance, as well as combine instances into AI clusters with up to 100,000 chips and 65 exaflops of computing power.
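For a rough sense of scale, the per-chip figure implied by those cluster numbers can be checked in a couple of lines. This treats the 65 exaflops as a peak figure divided evenly across 100,000 chips; the article doesn't specify the numeric precision the figure assumes.

```python
# Back-of-the-envelope check of the per-chip throughput implied
# by the cluster figures quoted above (peak, evenly divided).
total_flops = 65e18   # 65 exaflops
num_chips = 100_000
per_chip_teraflops = total_flops / num_chips / 1e12
print(per_chip_teraflops)  # 650.0, i.e. about 650 teraflops per chip
```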
In parallel, Anthropic engineers will support AWS' efforts to develop new Trainium processors. The LLM developer will contribute to Neuron, the software stack that powers the chip lineup. Neuron includes a compiler that optimizes customers' AI models to run on Trainium instances, along with several other tools.
Anthropic is working on "low-level kernels that allow us to directly interface with the Trainium silicon," the company detailed in a blog post today. On AI processors, a kernel is a small program that distributes a computation across the host chip's cores to boost performance. Kernels are among the building blocks of advanced AI models.
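The idea of a kernel fanning work out across cores can be illustrated with a toy CPU analogy. This is not Trainium code (real Trainium kernels are written against the Neuron stack); it is just a minimal sketch of the split-compute-reassemble pattern, using Python threads to stand in for cores.

```python
# Toy illustration of how a kernel distributes a computation across cores.
# This is a CPU analogy, not Trainium/Neuron code.
from concurrent.futures import ThreadPoolExecutor

def elementwise_square(chunk):
    # The per-core workload: a simple elementwise operation.
    return [x * x for x in chunk]

def run_kernel(data, cores=4):
    # Split the input into strided chunks, one per "core".
    chunks = [data[i::cores] for i in range(cores)]
    with ThreadPoolExecutor(max_workers=cores) as ex:
        results = list(ex.map(elementwise_square, chunks))
    # Reassemble the per-core results in the original order.
    out = [0] * len(data)
    for c, res in enumerate(results):
        for j, value in enumerate(res):
            out[c + j * cores] = value
    return out
```

On a real accelerator, the interesting work is in mapping the computation onto the chip's memory hierarchy and compute units, which is what low-level kernels like Anthropic's give direct control over.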
“By continuing to deploy Anthropic models in Amazon Bedrock and collaborating with Anthropic on the development of our custom Trainium chips, we’ll keep pushing the boundaries of what customers can achieve with generative AI technologies,” said AWS Chief Executive Officer Matt Garman.
Anthropic’s latest funding comes two months after OpenAI raised $6.6 billion in the largest startup investment on record. OpenAI also secured a $4 billion line of credit from a group of banks. The company, which is now worth $157 billion, will invest the funds in AI research and compute infrastructure.
Photo: AWS