Copyright © All Rights Reserved. World of Software.
News

Microsoft introduces AI accelerator for US Azure customers | Computer Weekly

News Room
Published 27 January 2026 (last updated 1:19 AM)

Microsoft has announced that Azure’s US central datacentre region is the first to receive a new artificial intelligence (AI) inference accelerator, Maia 200.

Microsoft describes Maia 200 as an inference powerhouse, built on TSMC's 3nm process with native FP8/FP4 (floating point) tensor cores and a redesigned memory system that uses 216GB of the latest high-speed memory architecture (HBM3e), capable of transferring data at 7TB per second. Maia 200 also provides 272MB of on-chip memory plus data movement engines, which Microsoft said keep massive models fed, fast and highly utilised.
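A back-of-envelope sketch of what those memory figures imply for inference: in a memory-bound decode step, every generated token must stream the model weights from HBM once, so bandwidth divided by model size bounds per-accelerator throughput. The 70B-parameter model size below is an illustrative assumption, not a Microsoft figure.

```python
# Memory-bound inference estimate from the quoted Maia 200 figures.
HBM_CAPACITY_GB = 216      # quoted HBM3e capacity
HBM_BW_TBPS = 7.0          # quoted memory bandwidth, TB/s

params_billion = 70        # assumed model size (70B parameters)
bytes_per_param = 0.5      # FP4 = 4 bits = 0.5 bytes

weights_gb = params_billion * bytes_per_param  # 35 GB of weights
assert weights_gb < HBM_CAPACITY_GB            # model fits in HBM

# Each decoded token streams all weights once, so
# tokens/sec <= bandwidth / model size.
tokens_per_sec = (HBM_BW_TBPS * 1000) / weights_gb
print(f"upper bound: {tokens_per_sec:.0f} tokens/s per accelerator")  # 200
```

Real throughput depends on batching, KV-cache traffic and compute limits; this is only the bandwidth ceiling.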

According to the company, these hardware features mean Maia 200 is capable of delivering three times the FP4 performance of the third-generation Amazon Trainium, and FP8 performance above Google's seventh-generation tensor processing unit. Microsoft said Maia 200 represents its most efficient inference system yet, offering 30% better cost performance than existing systems, but at the time of writing, it was unable to say when the product would be available outside the US.

Along with its US Central datacentre region, Microsoft announced that its US West 3 datacentre region near Phoenix, Arizona, will be the next to be updated with Maia 200.

In a blog post describing how Maia 200 is being deployed, Scott Guthrie, Microsoft executive vice-president for cloud and AI, said the setup comprises racks of trays configured with four Maia accelerators. Each tray is fully connected with direct, non-switched links to keep high-bandwidth communication local for optimal inference efficiency.

He said the same Maia AI transport protocol is used for both intra-rack and inter-rack networking, providing a way to scale clusters of Maia 200 accelerators with minimal network hops.

“This unified fabric simplifies programming, improves workload flexibility and reduces stranded capacity while maintaining consistent performance and cost efficiency at cloud scale,” added Guthrie.

Guthrie said Maia 200 introduces a new type of two-tier scale-up design built on standard Ethernet. “A custom transport layer and tightly integrated NIC [network interface card] unlocks performance, strong reliability and significant cost advantages without relying on proprietary fabrics,” he added.

In practice, this means each accelerator offers up to 1.4TB per second of dedicated scale-up bandwidth and, according to Guthrie, enables Microsoft to provide predictable, high-performance collective operations across clusters of up to 6,144 accelerators.
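Those figures allow a rough lower bound on collective-operation time. A minimal sketch using the standard ring all-reduce traffic bound, under which each rank moves roughly 2(n-1)/n times the payload over its link; the 8GB payload is an assumption for illustration.

```python
# Bandwidth-optimal (ring) all-reduce lower bound from the quoted figures.
SCALE_UP_BW_TBPS = 1.4         # quoted per-accelerator scale-up bandwidth
n = 6144                       # quoted maximum cluster size
payload_gb = 8                 # illustrative tensor size (assumption)

# Each rank sends/receives about 2*(n-1)/n times the payload.
traffic_gb = payload_gb * 2 * (n - 1) / n
time_ms = traffic_gb / (SCALE_UP_BW_TBPS * 1000) * 1000
print(f"~{time_ms:.1f} ms lower bound for an {payload_gb} GB all-reduce")
```

Actual collectives add latency per hop and protocol overhead, so this is a floor, not a prediction.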

What this all means, at least from Guthrie’s perspective, is that the Maia 200 architecture is capable of delivering scalable performance for dense inference clusters while reducing power usage and overall total cost of ownership across Azure’s global fleet of datacentres.

On the software side, he said a sophisticated simulation pipeline was used to guide the Maia 200 architecture from its earliest stages. The pipeline involved modelling the computation and communication patterns of large language models with high fidelity.

“This early co-development environment enabled us to optimise silicon, networking and system software as a unified whole – long before first silicon,” said Guthrie, adding that Microsoft also developed a significant emulation environment, which was used from low-level kernel validation all the way to full model execution and performance tuning.

As part of the roll-out, the company is offering AI developers a preview of the Maia 200 software developer’s kit.
