Microsoft unveils Maia 200 accelerator, claiming better performance per dollar than Amazon
January 26, 2026
Microsoft has announced the launch of Maia 200, a next-generation AI accelerator intended to bolster its in-house inference capabilities.
Built on TSMC's 3nm process, Maia 200 is the latest AI accelerator in the tech giant's Maia chip family, delivering 10 petaflops at 4-bit precision (FP4) and approximately 5 petaflops at 8-bit precision (FP8).
That makes it a strong piece of hardware for AI inference – Microsoft claims it can run the largest frontier models without issue – and positions it for future model releases.
Microsoft compared the performance of Maia 200 favorably with competing hardware, claiming it delivers 3x the FP4 performance of Amazon’s Trainium3 and better FP8 performance than Google’s TPU v7, which clocks in at 4.61 petaflops.
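The comparison figures are easy to sanity-check. A minimal sketch in Python, using only the numbers stated above; note that Trainium3's FP4 throughput is inferred here from Microsoft's 3x claim and is an assumption, not a published Amazon spec:

```python
# Stated specs in petaflops, per Microsoft's comparison.
MAIA_200_FP4 = 10.0   # 4-bit precision
MAIA_200_FP8 = 5.0    # 8-bit precision
TPU_V7_FP8 = 4.61     # Google's TPU v7 figure cited in the comparison

# Microsoft claims 3x Trainium3's FP4 throughput, which implies
# roughly 10 / 3 ≈ 3.3 petaflops for Trainium3 (inferred, not official).
implied_trainium3_fp4 = MAIA_200_FP4 / 3

print(f"Implied Trainium3 FP4: {implied_trainium3_fp4:.2f} petaflops")
print(f"FP8 advantage over TPU v7: {MAIA_200_FP8 / TPU_V7_FP8:.2f}x")
```

On these numbers, the claimed FP8 edge over TPU v7 works out to roughly 8%, a narrower margin than the headline 3x FP4 comparison with Trainium3.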
As of today, Maia 200 is live in Microsoft's US Central data center region, and the company revealed its US West 3 region is next on the list.
How powerful is Maia 200?
Raw computational performance is a useful benchmark for how well a chip can run AI models, but data bandwidth is just as critical.
It matters especially for enterprises seeking to run AI inference with as little latency as possible for critical workloads such as AI agents, as well as for delivering scalable AI services in the public cloud.
Maia 200 has 256GB of fifth-generation high-bandwidth memory (HBM3E), capable of 7TB/sec transfer speeds.
Achieving that bandwidth meant implementing a new direct memory access (DMA) engine, a tailor-made network-on-chip (NoC) fabric, and 272MB of on-die static random access memory (SRAM) to enable high-bandwidth data transfer and keep weights next to the processing units.
Each Maia 200 accelerator is capable of 1.4TB/sec of scale-up bandwidth.
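A back-of-the-envelope roofline estimate shows why bandwidth, not raw compute, often bounds inference speed. The sketch below is illustrative only: the 200B-parameter model is a hypothetical, not a Microsoft figure, and it assumes decode-phase inference is memory-bound, so per-token latency is roughly the weight bytes streamed divided by HBM bandwidth:

```python
# Maia 200's stated memory specs.
HBM_BANDWIDTH = 7e12   # 7 TB/sec
HBM_CAPACITY = 256e9   # 256 GB of HBM3E

# Hypothetical model for illustration: 200B parameters stored at FP4.
params = 200e9
bytes_per_param = 0.5  # 4-bit weights = half a byte each
weight_bytes = params * bytes_per_param  # 100 GB, fits within HBM

assert weight_bytes <= HBM_CAPACITY

# Memory-bound decode: generating each token streams all weights once,
# so bandwidth sets an upper limit on single-stream token rate.
seconds_per_token = weight_bytes / HBM_BANDWIDTH
tokens_per_second = 1 / seconds_per_token

print(f"Weights: {weight_bytes / 1e9:.0f} GB")
print(f"Bandwidth-bound ceiling: ~{tokens_per_second:.0f} tokens/sec")
```

Under these assumptions a single accelerator tops out around 70 tokens per second for such a model, regardless of how many petaflops it offers, which is why the on-die SRAM and high HBM bandwidth are central to the design.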
Hyperscalers expand custom silicon
Microsoft first announced the Maia 100 chip in November 2023, with the stated goal of powering services like Microsoft Copilot and Azure OpenAI Service in its data centers, as well as running model training.
Microsoft is far from alone in reducing its reliance on third-party chip designs from AMD and Nvidia: AWS leans heavily on its Trainium and Inferentia chips, and the bulk of Google's core AI workloads run on its tensor processing units (TPUs).
The Maia chip family is meant to supplement, rather than replace, Microsoft's use of chips from AMD and Nvidia. But the inherent advantages of Maia 200 make it likely that, in the near future, a greater share of Microsoft's core workloads will run on its own silicon.
For example, the hyperscaler stressed that Maia 200 was painstakingly optimized for the Azure control plane and Microsoft’s proprietary cooling systems.
All this means Maia 200 can go from delivery to deployment at its data centers in just days, cutting the overall timeline on Microsoft’s internal AI infrastructure program in half.
On top of that, Maia 200’s improved energy efficiency is intended to lower the energy cost of running AI workloads across Azure.
To begin with, Microsoft’s Superintelligence team will be using Maia 200 to generate synthetic data and improve in-house AI models.
Starting today, developers, academics, open source contributors, and frontier AI labs can sign up for the Maia 200 software development kit (SDK) to optimize future workloads for the hardware.