At its Ignite 2023 conference, Microsoft revealed plans for two new in-house chips, an Arm-based CPU and an AI accelerator, intended to enhance Azure’s cloud and AI capabilities. Slated to deploy in Azure data centers in 2024, the custom silicon marks Microsoft’s latest push into designing specialized chips in-house.
Rather than relying solely on offerings from vendors like Intel and AMD, the tech giant aims to tailor processors explicitly for its cloud infrastructure and services.
Microsoft expects the new chips to provide performance and efficiency gains for tasks like machine learning training and inference. Though details remain limited, the announcement signals Microsoft’s continued investment in custom hardware to differentiate Azure as a cloud platform.
Cobalt 100 CPU: an Arm processor for general workloads
The first chip, dubbed Microsoft Azure Cobalt 100 CPU, targets general cloud workloads with a focus on performance per watt. As with other Arm processors, the Cobalt 100 aims to provide greater efficiency for cloud-based tasks.
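Performance per watt is simply useful throughput divided by power draw. A minimal sketch of the metric, using entirely hypothetical numbers (Microsoft has not published Cobalt 100 figures in this announcement):

```python
# Illustrative sketch: performance per watt as a throughput/power ratio.
# All figures below are hypothetical, not measured or published chip data.

def perf_per_watt(requests_per_second: float, watts: float) -> float:
    """Efficiency of a server chip: useful work delivered per watt drawn."""
    return requests_per_second / watts

# Hypothetical comparison: a chip drawing less power can win on efficiency
# even with slightly lower raw throughput.
chip_a = perf_per_watt(requests_per_second=50_000, watts=250)  # 200.0 req/s per watt
chip_b = perf_per_watt(requests_per_second=48_000, watts=160)  # 300.0 req/s per watt
print(f"Chip A: {chip_a:.0f} req/s per watt")
print(f"Chip B: {chip_b:.0f} req/s per watt")
```

The metric matters at data-center scale because power and cooling, not raw speed, often bound how many servers a facility can run.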
The second chip is optimized for AI applications like machine learning training and inference, and is detailed below.
Maia AI Accelerator: a chip dedicated to artificial intelligence
Microsoft has developed a new chip tailored specifically for artificial intelligence workloads, dubbed the Azure Maia AI Accelerator, optimized for tasks like machine learning training, inference, and generative AI. The Maia chip was designed in collaboration with OpenAI.
According to Microsoft, it will drive Azure’s most demanding internal AI applications and language models. The company states the Maia AI Accelerator will enable Azure to handle compute-intensive AI workloads more efficiently than general-purpose processors can.
Microsoft aims to boost performance for key workloads in Azure’s cloud infrastructure by creating a chip specialized for AI. The Maia marks its latest effort to build custom silicon that differentiates Azure through hardware optimization.
Vertical integration for total control of the cloud
Microsoft believes custom silicon is critical to optimize its cloud and AI services. The new Azure chips will integrate closely with software and hardware stacks designed in-house. “The hardware will work hand-in-hand with the software, unlocking new capabilities,” the company stated.
To augment its custom processors, Microsoft is expanding infrastructure partnerships. In 2023, Azure will add virtual machines built on NVIDIA’s latest H100 GPUs, available in preview for AI training workloads. AMD’s new Instinct MI300X GPU for accelerated AI workloads will also be introduced.
With its custom chips and strategic partnerships, Microsoft aims to compete at the silicon level with vendors like Intel and Qualcomm in cloud computing and AI. The custom silicon push may expand beyond data centers to Surface and other consumer devices. But for now, Azure appears to be the testing ground for Microsoft’s in-house chip efforts.