OpenAI said it has no active plans to use Google's in-house chip to power its products, two days after Reuters and other news outlets reported on the AI lab's move to turn to its rival's artificial intelligence chips to meet growing demand.
A spokesperson for OpenAI said on Sunday that while the AI lab is in early testing with some of Google's tensor processing units (TPUs), it has no plans to deploy them at scale right now.
Google declined to comment.
While it is common for AI labs to test out different chips, using new hardware at scale takes much longer and would require different architecture and software support. OpenAI is actively using Nvidia's graphics processing units (GPUs) and AMD's AI chips to power its growing demand. OpenAI is also developing its own chip, an effort that is on track to meet the "tape-out" milestone this year, when the chip's design is finalized and sent for manufacturing.
OpenAI has signed up for Google Cloud service to meet its growing needs for computing capacity, Reuters had exclusively reported earlier this month, marking a surprising collaboration between two prominent competitors in the AI sector. Most of the computing power used by OpenAI would come from GPU servers powered by the so-called neocloud company CoreWeave.
Google has been expanding the external availability of its in-house AI chips, or TPUs, which were historically reserved for internal use. That has helped Google win customers, including big tech player Apple, as well as startups like Anthropic and Safe Superintelligence, two ChatGPT-maker competitors launched by former OpenAI leaders.
© Thomson Reuters 2025