Google has been working with Broadcom to design its AI accelerator chips, known as Tensor Processing Units (TPUs). Keep in mind that these are not the same as the Tensor Gx application processors that power Pixel devices. According to a fresh report, Google might be replacing Broadcom as its design partner, with Taiwanese chip designer MediaTek stepping in to work on the new TPUs, which will be Google’s seventh-generation AI chips.
Google designed its TPU AI accelerators to lower its reliance on Nvidia’s GPUs, the chips most widely used to train AI models. The TPUs are customized for AI work and are employed for Google’s internal workloads as well as by Google Cloud customers. As a result, Google is not as reliant on Nvidia as other major AI players are. Rivals such as OpenAI and Meta Platforms remain heavily dependent on Nvidia, which can backfire when a shortage exists.
GPUs, which render graphics and images on devices like smartphones, are known for their ability to process large amounts of data in parallel. That maps well onto the matrix-style computation used in AI. CPUs, by contrast, process data in a more sequential fashion, which makes them less efficient for AI workloads.
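For readers curious what that matrix-style workload looks like in practice, here is a minimal sketch using JAX, one of the frameworks Google uses to target TPUs. It runs a single matrix multiplication on whatever accelerator is available and falls back to the CPU otherwise; the matrix sizes and variable names are illustrative only, not drawn from Google’s actual TPU workloads.

    # Minimal sketch: a matrix multiply, the core operation behind most AI
    # workloads, run with JAX. JAX dispatches to a TPU or GPU if one is
    # available and falls back to the CPU otherwise. Sizes are illustrative.
    import jax
    import jax.numpy as jnp

    # Report which backend JAX will use (e.g. "tpu", "gpu", or "cpu").
    print("Default backend:", jax.default_backend())

    key = jax.random.PRNGKey(0)
    k1, k2 = jax.random.split(key)

    # Two illustrative 2048x2048 matrices of random numbers.
    a = jax.random.normal(k1, (2048, 2048))
    b = jax.random.normal(k2, (2048, 2048))

    # jit compiles the computation for the target backend; on a TPU or GPU
    # the many multiply-accumulate operations run largely in parallel.
    matmul = jax.jit(jnp.matmul)
    c = matmul(a, b)
    c.block_until_ready()  # wait for the asynchronous computation to finish

    print("Result shape:", c.shape)

On a CPU the same code works, but the hardware steps through the arithmetic with far less parallelism, which is the gap accelerators like TPUs are built to close.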