Google LLC will adopt multiple future iterations of Intel Corp.’s Xeon processor series as part of a collaboration announced today.
Shares of the chipmaker closed 4.7% higher on the news.
Google will deploy the chips in its cloud platform, where they will power artificial intelligence models and general-purpose workloads.
Google Cloud already uses Xeon 6, Intel’s newest series of central processing units, to run some of its general-purpose C4 instances. The virtual machines can achieve a maximum clock speed of 3.9 gigahertz when all the cores of the underlying CPU are active. That frequency can rise to 4.2 gigahertz when only the fastest cores are online.
Google’s C4 instances use a specific variant of Xeon 6 called Granite Rapids. It’s based on a core design called P-core that includes multiple AI-focused optimizations. One of those optimizations is AMX, or Advanced Matrix Extensions, a set of additions to the instruction set in which Intel chips express computations. AMX speeds up a calculation called multiply-accumulate that AI models run frequently during inference.
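In software terms, multiply-accumulate is simple: a product is added to a running sum, repeated millions of times during the matrix multiplications at the heart of inference. A minimal Python sketch of the pattern (illustrative only; AMX performs the equivalent work on whole tiles of data in hardware rather than one element at a time):

```python
import numpy as np

def matmul_mac(A, B):
    """Matrix multiplication written as explicit multiply-accumulate steps."""
    m, k = A.shape
    k2, n = B.shape
    assert k == k2, "inner dimensions must match"
    C = np.zeros((m, n))
    for i in range(m):
        for j in range(n):
            for p in range(k):
                C[i, j] += A[i, p] * B[p, j]  # the multiply-accumulate step
    return C

A = np.random.rand(4, 3)
B = np.random.rand(3, 2)
# The scalar loop matches NumPy's optimized matrix product.
assert np.allclose(matmul_mac(A, B), A @ B)
```

Because the same two-operation pattern dominates the workload, hardware that fuses the multiply and the add into one tiled instruction can raise inference throughput substantially.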
Intel also offers a second collection of Xeon 6 chips called Sierra Forest. Those CPUs are based on a core design called E-core that trades off some of P-core’s performance for increased efficiency.
Intel debuted its most advanced E-core server processor in March. It includes 288 cores, or 160 more than the largest Granite Rapids processor. The chip is based on the company’s latest Intel 18A manufacturing process, which offers up to 15% better performance per watt than the Intel 3 node that underpins earlier Xeon 6 chips.
“Scaling AI requires more than accelerators – it requires balanced systems,” said Intel chief executive officer Lip-Bu Tan. “CPUs and IPUs are central to delivering the performance, efficiency and flexibility modern AI workloads demand.”
The companies’ new partnership also extends to Intel’s IPU, or infrastructure processing unit, product family. The chips in the lineup are optimized to perform infrastructure management tasks such as encrypting data traffic and coordinating storage hardware. IPUs offload those tasks from a server’s CPU, which leaves more computing capacity for user workloads.
Intel and Google plan to expand “their co-development of custom ASIC-based IPUs.” An ASIC, or application-specific integrated circuit, is a processor designed from the ground up for a specific set of use cases. That suggests Google will commission IPUs optimized for its cloud data centers.
The contract is a needed win for Intel, which faces growing competition in the server CPU market. Last month, rival Arm Holdings plc debuted its first ready-made processor for data centers. The 136-core AGI CPU was developed in partnership with Meta Platforms Inc., which will use the chip to power its internal AI infrastructure.
