The early years of building out artificial intelligence (AI) infrastructure have been dominated by training, as companies race to create the best AI models. However, according to reports, the AI inference market could increase from approximately $106 billion to nearly $255 billion by 2030.
Let’s take a look at three stocks that will benefit from this uptrend.
Nvidia
Nvidia (NVDA 4.43%) is best known for its dominance in large language model (LLM) training, but the company is also a leader in AI inference. Through Nvidia NIM (Nvidia Inference Microservices), it offers pre-built, optimized inference microservices. Meanwhile, its Blackwell GB300 Ultra graphics processing units (GPUs) are optimized for inference and agentic AI, and the upcoming Vera Rubin platform is expected to further improve inference performance.
Today's change: -4.43% (-$8.20)
Current price: $176.69
Key data points
Market capitalization: $4.3 trillion
Day range: $176.56–$182.58
52-week range: $86.62–$212.19
Volume: 11M
Average volume: 174M
Gross margin: 71.07%
Dividend yield: 0.02%
However, it is the company's acquisition of Groq's employees and licensing of its technology that could truly make Nvidia an AI inference winner. Groq has developed a new type of chip, the language processing unit (LPU), designed specifically for AI inference. Nvidia plans to integrate this technology into its CUDA software platform and networking infrastructure to enhance its inference offering. As such, I wouldn't overlook Nvidia in the inference market, where it should remain a winner.
Advanced Micro Devices
Since Nvidia's CUDA moat is not as wide in inference as it is in training, this opens the door for Advanced Micro Devices (AMD 1.71%) to take part. The company has already done a good job carving out a niche in the inference market, so it should benefit from the market's overall growth, especially given its much smaller revenue base than Nvidia's.
Today's change: -1.71% (-$3.49)
Current price: $200.19
Key data points
Market capitalization: $326 billion
Day range: $197.75–$201.87
52-week range: $76.48–$267.08
Volume: 1.2M
Average volume: 35M
Gross margin: 45.99%
Meanwhile, AMD will benefit from an investment from OpenAI and a commitment from the startup to deploy 6 gigawatts of GPUs. With 1 gigawatt of chips worth about $35 billion based on the price of Nvidia GPUs, that’s a big upcoming growth driver for the company. OpenAI will use them specifically for inference, so this could also open the door to inference deals with other companies.
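As a rough sanity check of those figures, here is a back-of-envelope calculation. It assumes the article's ~$35 billion of chips per gigawatt, which is a Nvidia-based estimate; AMD's actual pricing may well differ.

```python
# Back-of-envelope sizing of the OpenAI-AMD commitment described above.
# Assumption: ~$35 billion of chips per gigawatt (the Nvidia-based
# estimate cited in the article); AMD's real pricing may differ.
cost_per_gw_usd_b = 35   # ~$35B of chips per gigawatt (assumed)
committed_gw = 6         # OpenAI's committed deployment, in gigawatts

implied_opportunity_usd_b = cost_per_gw_usd_b * committed_gw
print(f"Implied opportunity: ~${implied_opportunity_usd_b} billion")  # ~$210 billion
```

Even if AMD's chips sell at a discount to Nvidia's, the commitment implies a revenue opportunity in the hundreds of billions of dollars over the life of the deployment.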
Also not to be overlooked in the AMD story is the importance of central processing units (CPUs) in agentic AI. CPUs act as the brain of a computer, orchestrating tasks, and with AI agents they become an increasingly important part of the AI infrastructure story. Between rising AI inference demand and growing data center CPU demand, AMD looks well positioned for the future.
Broadcom
As companies look to reduce the computational costs of AI infrastructure, they are increasingly turning to AI ASICs (application-specific integrated circuits). ASICs are custom chips that are wired for specific tasks, and as such they tend to perform these tasks very well while being more power efficient. This becomes increasingly important with inference, as it is an ongoing cost that consumes energy every time a question is answered or a task is completed.
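To see why per-query energy becomes a material ongoing cost at inference scale, consider a purely illustrative calculation. Every number in it (energy per query, query volume, electricity price) is a hypothetical assumption chosen for the sketch, not a figure from this article:

```python
# Illustrative only: why per-query energy matters at inference scale.
# All inputs below are hypothetical assumptions, not reported figures.
joules_per_query = 1000          # assumed energy per inference request (1 kJ)
queries_per_day = 1_000_000_000  # assumed daily request volume
usd_per_kwh = 0.08               # assumed industrial electricity price

kwh_per_day = joules_per_query * queries_per_day / 3.6e6  # joules -> kWh
daily_cost_usd = kwh_per_day * usd_per_kwh
print(f"~{kwh_per_day:,.0f} kWh/day, ~${daily_cost_usd:,.0f}/day in electricity")
```

Under these assumptions the electricity bill alone runs into tens of thousands of dollars per day, so a custom chip that cuts energy per query even modestly compounds into large savings, which is the appeal of inference ASICs.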
As a leader in ASIC technology, Broadcom (AVGO 0.67%) is one of the best ways to capitalize on this trend. The company provides the building blocks to turn its customers' chip designs into physical chips. It also maintains key relationships with memory manufacturers and foundries to secure components and manufacturing capacity so these chips can be produced at scale.
Today's change: -0.67% (-$2.14)
Current price: $319.56
Key data points
Market capitalization: $1.5 trillion
Day range: $310.00–$319.99
52-week range: $138.10–$414.61
Volume: 767K
Average volume: 30M
Gross margin: 64.71%
Dividend yield: 0.76%
Broadcom helped Alphabet design its highly regarded tensor processing units (TPUs), and this alone is a big opportunity, especially now that Alphabet lets customers deploy TPUs via Google Cloud. Anthropic has already placed a $21 billion TPU order with Broadcom this year, and a nice chunk of Alphabet's estimated $180 billion in capital expenditures this year will likely go to TPUs as well. Meanwhile, the company is winning new ASIC customers, including OpenAI, which has committed to 10 gigawatts' worth of chips.
With the inference market set to soar, Broadcom appears poised to be one of the biggest winners in the chip space.
