Microsoft Preps In-House Chips: One For AI, the Other an Arm Server CPU
2023-11-16 01:17

For the first time, Microsoft has developed its own chips to power artificial intelligence programs and cloud services.

Microsoft has been quietly developing these in-house processors for years, and it introduced them today at the company's Ignite event. The first chip is the Azure Maia 100 AI Accelerator, which will power some of Microsoft's largest generative AI workloads, such as Copilot.

The second is an Arm-based CPU called Azure Cobalt, designed to run general-purpose cloud workloads more efficiently and with greater performance than rival silicon. Both chips will sit inside custom-made boards and server racks that can be easily fitted into the company's existing data centers.

Cobalt chip (Credit: Microsoft)

Microsoft expects to install the chips in its data centers early next year. It plans to use the new silicon to power its own internal services as well as workloads from third-party customers through Microsoft Azure.

Redmond didn’t say which chipmaker it used to manufacture the processors (rumors point to TSMC). But the company developed the silicon to squeeze even more performance from both its servers and software. Another goal is to offer Azure cloud customers more flexibility on power, performance, and cost.

No benchmarks were posted, making the performance gains unclear. However, Microsoft’s partner and ChatGPT developer OpenAI says the Maia chip “paves the way for training more capable models and making those models cheaper for our customers.”

A custom-built rack for the Maia 100 AI Accelerator (Credit: Microsoft)

Still, Microsoft isn’t ditching processors from major chip providers, such as Nvidia or AMD. In addition to the custom chips, the company plans on offering access to Nvidia’s H100 and H200 hardware, along with AMD’s MI300X, to Azure cloud customers looking to train their AI models.

"Customer obsession means we provide whatever is best for our customers, and that means taking what is available in the ecosystem as well as what we have developed," said Microsoft Azure hardware CVP Rani Borker. "We will continue to work with all of our partners to deliver to the customer what they want."

Nevertheless, the company’s entry into custom silicon could put more competitive pressure on AMD, Intel, and Nvidia. It also makes us wonder if Microsoft is preparing more in-house chips for consumer PCs. Redmond has already partnered with Qualcomm to produce Arm-based SQ chips for the Surface line.

In the meantime, Microsoft says it has already started work on second-generation Maia and Cobalt chips.
