
Maia 200, an AI chip to reduce dependence on NVIDIA

News Room
Published 29 January 2026 · Last updated 29 January 2026, 6:35 AM

Microsoft has presented the Maia 200, its new in-house chip for artificial intelligence systems, with which it aims to reduce its dependence on NVIDIA and AMD and compete with Amazon's and Google's data-center silicon.

In the midst of the race to dominate generative AI services, all the major technology companies are developing their own chips. Microsoft launched the Azure Maia AI platform in 2023, developed the Cobalt CPU and announced the Maia 100 chip. Now comes the second generation, which puts performance per dollar and efficiency front and center.

Maia 200 is an AI accelerator designed for inference workloads. Where the Maia 100 was built on TSMC's 5 nm process, the second generation moves to TSMC's 3 nm process node and includes native FP8/FP4 tensor cores. It pairs 216 GB of HBM3e memory delivering 7 TB/s of bandwidth with 272 MB of on-chip SRAM.
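As a rough sanity check on what those memory specs imply (not a vendor figure), the back-of-the-envelope arithmetic below estimates the bandwidth-bound ceiling for inference, under the simplifying assumption that a decode step streams the full 216 GB of HBM once:

```python
# Memory-bound inference ceiling from the article's stated specs.
# Assumption (ours, not Microsoft's): a worst-case decode step reads
# the entire HBM contents once, the classic bandwidth-bound regime.

HBM_CAPACITY_GB = 216        # per-chip HBM3e capacity (from the article)
HBM_BANDWIDTH_TBPS = 7.0     # per-chip bandwidth (from the article)

bytes_per_step = HBM_CAPACITY_GB * 1e9
bandwidth_bps = HBM_BANDWIDTH_TBPS * 1e12

step_time_ms = bytes_per_step / bandwidth_bps * 1e3   # ~30.9 ms
tokens_per_s = 1e3 / step_time_ms                     # ~32 tokens/s ceiling

print(f"{step_time_ms:.1f} ms per step -> {tokens_per_s:.0f} tokens/s ceiling")
```

Real workloads read far less than the full capacity per token, so actual throughput sits well above this floor; the exercise only shows why the 7 TB/s figure matters for inference.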

Microsoft claims it is the highest-performing internal silicon designed not only by the Redmond firm but by any hyperscaler, including Amazon and Google. Notably, Microsoft published a comparison table pitting the Maia 200 against the equivalent chips from the other two giants. According to that table, the Maia 200 offers almost double the FP8 performance of Amazon's third-generation Trainium and about 10% more than Google's seventh-generation TPU.

The Maia 200 is also meant to reduce dependence on the sector's dominant supplier, NVIDIA, and on solutions such as the Blackwell B300 Ultra, although direct comparisons here are of limited value: NVIDIA's accelerator is sold to third-party customers, is optimized for much higher-power use cases than the Microsoft chip, and its software stack was released much earlier than that of any contemporary alternative.

Maia 200: a commitment to efficiency

Where the Microsoft chip does stand out is in energy efficiency and performance per dollar. Microsoft claims 30% higher performance per dollar than the latest-generation hardware currently deployed in Azure. Maia 200 is also designed for scale-up deployments, featuring an on-die network interface card (NIC) with 2.8 TB/s of bi-directional bandwidth for communication across a cluster of 6,144 accelerators.
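Multiplying out the article's cluster figures gives a sense of the scale-up fabric's aggregate capacity. This is naive arithmetic that ignores topology, oversubscription and protocol overhead:

```python
# Cluster-level aggregates implied by the article's numbers.
# Assumption (ours): simple per-chip multiplication, no topology effects.

ACCELERATORS = 6144      # cluster size (from the article)
NIC_BW_TBPS = 2.8        # bi-directional per-chip NIC bandwidth (from the article)
HBM_GB = 216             # per-chip HBM3e capacity (from the article)

aggregate_nic_tbps = ACCELERATORS * NIC_BW_TBPS    # ~17,203 TB/s
cluster_hbm_pb = ACCELERATORS * HBM_GB / 1e6       # GB -> PB, ~1.33 PB

print(f"{aggregate_nic_tbps:,.0f} TB/s aggregate NIC bandwidth")
print(f"{cluster_hbm_pb:.2f} PB of pooled HBM3e")
```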

The Maia 200 operates at almost half the TDP of NVIDIA's B300 (750 W vs. 1,400 W), and if it behaves like the original version, it will run below its theoretical maximum TDP. Microsoft's efficiency-first message follows its recent emphasis on the corporation's concern for the communities near its data centers, an effort to blunt negative reactions to the rise of AI. Microsoft CEO Satya Nadella recently spoke at the World Economic Forum in Davos about the need for AI to demonstrate its real usefulness so as not to lose what he called "social permission" and inflate the AI bubble that many are warning about.
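The quoted TDP figures translate directly into a power-budget ratio, a quick calculation (ours, using only the article's numbers) of how many Maia 200s fit in one B300's power envelope:

```python
# Power-budget comparison from the TDP figures quoted in the article.
MAIA_TDP_W = 750      # Maia 200 TDP (from the article)
B300_TDP_W = 1400     # NVIDIA B300 TDP (from the article)

ratio = B300_TDP_W / MAIA_TDP_W   # ~1.87 Maia chips per B300 power budget
print(f"{ratio:.2f} Maia 200s fit in one B300's power envelope")
```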

Unlike the Maia 100, which was announced long before its deployment, the Maia 200 is already running in Microsoft's main data centers in the United States. The chip can work with a variety of AI models, including OpenAI's GPT-5.2 models, allowing the company to offer AI capabilities in Microsoft 365 and other services. Microsoft's Superintelligence team will also use it for synthetic data generation and reinforcement learning to develop future in-house models.

To help developers and startups optimize their tools and models for Maia 200, Microsoft has released a preview version of the SDK. It includes integration with PyTorch, a Triton compiler, optimized kernel libraries, and access to Maia's low-level programming language.
