Microsoft has revealed it is launching its own-brand silicon hardware in order to help further the development and use of artificial intelligence in businesses everywhere.
Announced at the company's Microsoft Ignite 2023 event, the new Azure Maia and Azure Cobalt chips put the tech giant fully into the AI hardware arms race.
Microsoft says its new hardware has been built "with a holistic view of hardware and software systems to optimize performance and price" and forms the "last puzzle piece" in its aim to deliver infrastructure systems featuring unparalleled optimization for a customer's specific needs.
Microsoft Azure silicon
Specific details on the new hardware were somewhat slim, but AI tools and tasks are unsurprisingly the target for the new releases. The new Microsoft Azure Maia AI accelerator is set to power some of the company's largest internal AI workloads running on Azure, including OpenAI models, Bing, GitHub Copilot and ChatGPT, hopefully producing huge gains in performance and efficiency.
The hardware has been built with the help of OpenAI, whose CEO Sam Altman noted that the two companies had collaborated to co-design Azure’s AI infrastructure at every layer for its models and unprecedented training needs.
“We were excited when Microsoft first shared their designs for the Maia chip, and we’ve worked together to refine and test it with our models. Azure’s end-to-end AI architecture, now optimized down to the silicon with Maia, paves the way for training more capable models and making those models cheaper for our customers," Altman added.
Elsewhere, the Microsoft Azure Cobalt 100 CPU will be optimized to deliver greater efficiency and performance in cloud-native offerings, built on Arm architecture to provide maximum performance per watt across more general-purpose workloads.
Microsoft says that having the ability to build its own custom silicon allows it to target certain qualities and ensure a high level of performance even on the most demanding workloads.
“Microsoft is building the infrastructure to support AI innovation, and we are reimagining every aspect of our datacenters to meet the needs of our customers,” said Scott Guthrie, executive vice president of Microsoft’s Cloud + AI Group. “At the scale we operate, it's important for us to optimize and integrate every layer of the infrastructure stack to maximize performance, diversify our supply chain and give customers infrastructure choice.”
Mike Moore is Deputy Editor at TechRadar Pro. He has worked as a B2B and B2C tech journalist for nearly a decade, including at one of the UK's leading national newspapers and fellow Future title ITProPortal, and when he's not keeping track of all the latest enterprise and workplace trends, can most likely be found watching, following or taking part in some kind of sport.