Nvidia is making a strategic return to the Chinese market with its China-specific H20 AI chip, a move that comes after US export restrictions curbed the company’s sales in China. Nvidia has begun taking pre-orders from distributors for the H20, priced almost on par with Huawei’s Ascend 910B, according to Reuters.
Why it matters: To address the US bans on the sale of certain GPUs (Graphics Processing Units) to China, Nvidia has launched cut-down variants that comply with American export rules. Amid concerns about restricted access to Nvidia’s products, Huawei’s chip is widely regarded as the leading domestic alternative for AI workloads in China.
Details: Before the US curbs, Nvidia held over 90% of China’s AI chip market, according to the Reuters report. It is now facing growing competition from local rivals such as Huawei. The H20 is priced at $12,000 to $15,000, positioning it as a direct competitor to Huawei’s Ascend 910B.
- In October 2023, the US government imposed new restrictions on the export of advanced AI chips, prompting Nvidia to immediately halt shipments of high-performance AI chips including the A100, A800, H100, H800, and L40S. Nvidia subsequently began developing new AI chips specifically for the Chinese market: the H20, L20, and L2. The H20 is a pared-down version of Nvidia’s Hopper-based H100, while the L20 and L2 are derived from its Ada Lovelace data center GPUs.
- While the H20 is expected to deliver less computing power than Nvidia’s flagship H100, its specifications suggest it also trails Huawei’s Ascend 910B in certain key respects. Notably, the H20 lags behind the 910B in FP32 performance, a key measure of processing speed, rated at less than half of its competitor’s capability, a source told Reuters.
- However, the H20 is likely to hold an edge over the 910B in interconnect speed, making it competitive in applications that require large numbers of chips to be linked together and work as a single system, the report explained.
- In terms of performance, the H20’s AI computing power is slightly less than 15% of the H100’s, according to US media outlet Wccftech. The H20 features 96GB of memory with bandwidth of up to 4.0 TB/s, 296 TFLOPS of computing power, and a performance density of 2.9 TFLOPS/die, compared with the H100’s 19.4 TFLOPS/die (see the quick back-of-the-envelope check after this list).
- Distributors have reportedly informed clients that they can start deliveries of H20 products in small batches during the first quarter of 2024, with larger quantities available from the second quarter. Last month, it was reported that Nvidia intends to mass produce the H20 in the second quarter of this year.
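The roughly 15% figure is consistent with the performance-density numbers quoted above. The minimal sketch below simply divides the two reported figures; the values come from the Wccftech report cited earlier and are not independently verified benchmarks.

```python
# Rough check of the reported H20 vs. H100 gap using the figures quoted above
# (2.9 TFLOPS/die for the H20 vs. 19.4 TFLOPS/die for the H100, per Wccftech).
# These are reported numbers, not measured benchmarks.

h20_density_tflops_per_die = 2.9
h100_density_tflops_per_die = 19.4

ratio = h20_density_tflops_per_die / h100_density_tflops_per_die
print(f"H20 performance density is about {ratio:.1%} of the H100's")
# Prints roughly 14.9%, in line with the "slightly less than 15%" figure cited in the report.
```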
Context: On Jan. 20, Nvidia CEO Jensen Huang visited the company’s offices in Shenzhen, Shanghai, and Beijing for the annual parties held in celebration of China’s Lunar New Year holiday, with the company clarifying that the visit did not involve business operations.