#maia-200

Artificial intelligence
from Computerworld
8 hours ago

Microsoft launches its second generation AI inference chip, Maia 200

Maia 200 is a high-performance, energy-efficient inference accelerator optimized for large reasoning models, delivering superior FP4/FP8 throughput and memory compared with rival cloud accelerators.
Artificial intelligence
from The Register
15 hours ago

Microsoft looks to drive down AI infra costs with Maia 200

Microsoft unveiled the Maia 200 inference accelerator: 144 billion transistors, 10 petaFLOPS FP4, 216GB HBM3e (7TB/s), and 750W power consumption.
Artificial intelligence
from Techzine Global
20 hours ago

Microsoft unveils new proprietary AI chip Maia 200

Maia 200 is a high-performance AI accelerator delivering superior throughput, efficiency, and large-model support with extensive memory, networking, and Azure SDK integration.
from The Verge
21 hours ago

Microsoft's latest AI chip goes head-to-head with Amazon and Google

Built on TSMC's 3nm process, Microsoft says its Maia 200 AI accelerator "delivers 3 times the FP4 performance of the third generation Amazon Trainium, and FP8 performance above Google's seventh generation TPU." Each Maia 200 chip has more than 100 billion transistors and is designed for large-scale AI workloads. "Maia 200 can effortlessly run today's largest models, with plenty of headroom for even bigger models in the future," says Scott Guthrie, executive vice president of Microsoft's Cloud and AI division.