Microsoft Silicon—2 New Chips to Power Azure Cloud and AI
By: Craig Kennedy
Earlier this month at Microsoft’s annual Ignite conference, Microsoft CEO Satya Nadella announced two new custom silicon chips designed and built by Microsoft to power its Azure cloud infrastructure. The Maia 100 and Cobalt 100 chips are being piloted today and are scheduled to be available in Microsoft Azure data centers beginning early in 2024.
Azure Maia AI Chip: Designed for AI Workloads
Microsoft unveiled the Azure Maia AI Accelerator, a new AI chip, in response to the growing demand for AI compute. The chip, codenamed “Athena,” allows Microsoft to diversify its supply chain and gives its customers a choice in infrastructure.
The Azure Maia AI chip is designed to address GPU shortages and reduce Microsoft’s reliance on NVIDIA-designed GPUs. The chip is built to run large language models (LLMs), and Microsoft worked in tandem with OpenAI to refine the chip for running OpenAI’s models efficiently.
Azure Cobalt CPU: Arm-Based Power for Cloud Computing
Microsoft also launched the Azure Cobalt 100 CPU, a new processor based on the Arm Neoverse Compute Subsystem (CSS). It is a 64-bit, 128-core chip that delivers a 40% performance boost over the generation of Arm chips currently running in Azure datacenters. The Azure Cobalt CPU is engineered to run scale-out, cloud-native workloads efficiently at significantly reduced power.
Fast Network Performance is Key
The introduction of the Azure Maia AI chip and the Azure Cobalt CPU represents a significant shift in the cloud computing and AI landscape. Both chips benefit from impressive new network technology, including hollow core fiber, delivering 12.5 GB/s of throughput and 650K IOPS for remote storage, along with 200 Gbps of network bandwidth. These technologies not only enhance the performance of cloud-native workloads but also promise a better overall total cost of ownership for Microsoft.
Keeping Your Cool in the Datacenter
The Maia 100 chip requires liquid cooling to dissipate the heat it generates when running intensive AI tasks. Standard server racks couldn’t house the Maia server boards, so Microsoft designed a new rack that accommodates the boards and provides ample room for the liquid cooling, as well as the cabling required for remote storage and networking between servers.
Bottom Line
Microsoft is joining the other hyperscalers in designing and running its own in-house silicon in its datacenters. Google has its TPU chips, IBM has its AIU chip, AWS has its Graviton, Trainium, and Inferentia chips, and now Microsoft has its Cobalt and Maia chips. By developing its own AI accelerator and Arm processor, Microsoft is easing some of its supply chain issues amid insatiable demand for leading AI chips while cutting the cost of running its AI workloads.
Get Ready for 2024 with Aragon’s 2024 Q1 Research Agenda!
Wednesday, January 17th, 2024 at 10 AM PT | 1 PM ET
Aragon Research’s 2024 Q1 Agenda
Aragon Research provides the strategic insights and advice you need to help your business navigate disruption and outperform your goals. Our research is designed to help you understand the technologies that will impact your business, using a number of trusted research methodologies proven to help organizations like yours get to business outcomes faster.
On Wednesday, January 17th, 2024, join Aragon Research CEO and Lead Analyst Jim Lundy for a complimentary webinar as he walks you through Aragon’s Q1 2024 research agenda.
This webinar will cover:
- Aragon’s coverage areas, analysts, and services
- Research offered by Aragon, including Visual Research
- The research agenda for Q1 2024
- Sneak peek at what’s coming in Q2 2024