This partnership would aim to diversify Google’s semiconductor supply chain and compete with Nvidia in the inference space.
Google is reportedly in talks with Marvell Technology to develop two new AI chips: a memory processing unit and a TPU optimized for inference. The American company would thus be seeking to diversify its custom-semiconductor supply chain. It would rely on Broadcom, MediaTek and now Marvell for design, while entrusting manufacturing to TSMC.
A partnership between Google and Marvell for the design of two chips
For the moment, no contract has been signed, but according to the American outlet The Information, Google is in close discussions with Marvell Technology to develop two new chips for running AI models. The first is a memory processing unit optimized to support Google’s existing TPUs. The second is a new generation of TPU dedicated to inference: the operational phase in which the model, once trained, is used to generate predictions or responses in real conditions.
This second chip therefore aims to make Gemini faster and less expensive. Through this partnership, Google is prioritizing inference, now its main computing expense, and thus hopes to compete with Nvidia in this area. Note that Marvell would work with Google as a design services provider.
A third supplier for Google
By partnering with Marvell Technology, Google is not replacing Broadcom but adding a design partner to a supply chain that already includes Broadcom (high-performance chips), MediaTek (notably the “e” range versions, optimized for a better price-performance ratio) and TSMC (manufacturing). This is therefore a diversification strategy for Google.
Ironwood, Google’s seventh-generation TPU, designed by Broadcom, launched this month. Google presents it as “the first Google TPU designed for the inference era”. The two chips developed with Marvell Technology would therefore complement Broadcom’s, not replace them. The objective is simple: to allow Google to adjust its resources according to the type of task or budget. This responds to the increasingly significant share of computing devoted to inference, that is to say the deployment of models, rather than to their training phase.
Google has also signed an agreement with artificial intelligence company Anthropic. The latter will have access to a gigawatt of AI computing capacity, thanks to Google’s AI chips called “Tensor Processing Units”, or TPUs.
Source: The Information
