AMD used Financial Analyst Day 2025 to make notable announcements about its upcoming products, most importantly its next generation of AI accelerators.
GPU-based AI accelerators have become the main engine driving new data centers for artificial intelligence. The sector has so far been dominated by NVIDIA with products such as Blackwell Ultra, and AMD wants to position itself as the leading alternative, capitalizing on Intel's current weakness in this segment and fending off Chinese competitors such as Huawei.
AMD AI Accelerators
AMD is relying on its ‘Instinct’ family to strengthen its position in the sector and has presented a roadmap previewing upcoming releases. Next year, AMD will launch its Instinct MI400 series. The new line will be based on the CDNA 5 architecture and will offer:
- Higher HBM4 memory capacity and bandwidth.
- Expanded AI formats with higher performance.
- Standards-based rack-scale networking (UALoE, UAL, UEC).
Official figures put the MI400 at 40 PFLOPS (FP4) and 20 PFLOPS (FP8), doubling the compute of the MI350 series, which is already in high demand for AI data centers.
In addition to compute, AMD will also adopt HBM4 memory for the Instinct MI400 series. The new chip will offer a 50% increase in memory capacity, going from 288 GB of HBM3e to 432 GB of HBM4. The HBM4 memory will deliver 19.6 TB/s of bandwidth, more than double the 8 TB/s of the MI350 series. Each GPU will also offer 300 GB/s of scale-out bandwidth, another notable increase.
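To put the generational jump in perspective, here is a minimal sketch that checks the ratios implied by the figures quoted above; the MI400 numbers come from AMD's announcement, while the MI350 compute values are inferred from the "doubles the compute" claim and should be treated as illustrative rather than official.

```python
# Sanity check of the MI350 -> MI400 generational jumps quoted above.
# MI400 figures are from AMD's announcement; the MI350 compute numbers
# are inferred from the "doubles the compute" claim (illustrative only).

mi350 = {"fp4_pflops": 20, "fp8_pflops": 10, "hbm_gb": 288, "hbm_tbps": 8.0}
mi400 = {"fp4_pflops": 40, "fp8_pflops": 20, "hbm_gb": 432, "hbm_tbps": 19.6}

for key in mi350:
    ratio = mi400[key] / mi350[key]
    print(f"{key}: {mi350[key]} -> {mi400[key]} (x{ratio:.2f})")

# Output:
# fp4_pflops: 20 -> 40 (x2.00)   -> compute doubles
# fp8_pflops: 10 -> 20 (x2.00)
# hbm_gb: 288 -> 432 (x1.50)     -> 50% more memory
# hbm_tbps: 8.0 -> 19.6 (x2.45)  -> more than double the bandwidth
```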
The MI400 series will come in two variants: the Instinct MI455X, designed for large-scale AI training and inference workloads, and the MI430X, aimed at HPC and sovereign AI workloads, with hardware-based FP64 capabilities, hybrid (CPU+GPU) computing, and the same HBM4 memory as the MI455X.
In 2027, AMD will introduce its next generation of AI accelerators, the Instinct MI500 series. With AMD adopting an annual release cadence, we will see data-center and AI updates arrive at a rapid pace, similar to what NVIDIA is currently doing with its standard and “Ultra” offerings. These accelerators will power next-generation AI racks and deliver a substantial improvement in overall performance. According to AMD, the Instinct MI500 series will offer next-generation compute, memory, and interconnect capabilities.
More CPU announcements
AMD also highlighted its strong financial momentum, forecasting revenue growth at a compound annual growth rate (CAGR) above 35% over the next 3 to 5 years. The company plans to raise operating margins above 35% and to deliver non-GAAP earnings per share of more than $20. These forecasts boosted the share price by 9%.
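For context on what that growth rate implies, the following is a minimal sketch of the compounding math, assuming exactly 35% annual growth (AMD only said "greater than 35%") and a starting revenue normalized to 1.0, since the forecast was not tied to a specific base figure.

```python
# Revenue multiplier implied by a 35% CAGR over the 3-to-5-year window
# AMD cited. Revenue is normalized to 1.0, so only the multipliers matter.

cagr = 0.35
for years in (3, 4, 5):
    multiplier = (1 + cagr) ** years
    print(f"{years} years at 35% CAGR -> revenue x{multiplier:.2f}")

# 3 years -> x2.46, 4 years -> x3.32, 5 years -> x4.48
```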
Other important announcements included confirmation that the Zen 6 architecture will launch in 2026, bringing a new generation of processors with which AMD hopes to challenge Intel's supremacy of more than 40 years. For the professional market, and beyond GPUs, AMD stated that AI will drive exponential growth in the server CPU market.
AMD’s internal projections indicate that the AI inflection point will generate an additional $30 billion in revenue from CPU demand alone by 2030. That figure would come on top of existing demand for conventional server CPUs, currently around $30 billion, pushing 2030 projections to a striking $60 billion in revenue for CPUs alone.
