Microsoft Unveils Computing Chips To Power AI And Control Costs

Microsoft (MSFT.O) has announced two custom-designed computing chips, Maia and Cobalt, as it seeks to harness the potential of artificial intelligence (AI) and rein in costs. The tech giant joins other industry leaders in bringing key technologies in-house to tackle the high cost of delivering AI services.

Microsoft said it does not intend to sell the chips on their own. Instead, they will be used to power its subscription software offerings and its Azure cloud computing service, reflecting a focus on optimizing internal operations while delivering new capabilities to customers.

At its Ignite developer conference in Seattle, Microsoft introduced the Maia chip, designed to accelerate AI computing tasks and underpin its $30-a-month “Copilot” service for business software users, as well as for developers seeking customized AI services. Tailored specifically to run large language models, Maia is central to Microsoft’s collaboration with OpenAI, the creator of ChatGPT, and to the Azure OpenAI service.

Facing the high costs of delivering AI services, Microsoft plans to streamline its efforts by routing most of its AI work through a common set of foundational AI models. The Maia chip is optimized for this purpose, offering faster, lower-cost, and higher-quality AI solutions, according to Scott Guthrie, executive vice president of Microsoft’s cloud and AI group.

Additionally, Microsoft shared plans to offer cloud services to Azure customers utilizing the latest flagship chips from Nvidia and Advanced Micro Devices (AMD) in the coming year. The company is currently testing GPT-4, OpenAI’s most advanced model, on AMD chips, emphasizing ongoing partnerships with leading semiconductor manufacturers.

The second chip unveiled, named Cobalt, serves a dual purpose as both an internal cost-saving measure and a response to Amazon Web Services (AWS), Microsoft’s chief cloud rival. Cobalt, a central processing unit (CPU) based on Arm Holdings technology, has undergone testing to power Teams, Microsoft’s business messaging tool.

Microsoft aims to compete directly with AWS’s “Graviton” series of in-house chips by offering customers direct access to Cobalt, which it says is competitive on both performance and price-to-performance.

While Microsoft disclosed few technical specifications, Rani Borkar, corporate vice president for Azure hardware systems and infrastructure, said both Maia and Cobalt are manufactured using 5-nanometer technology from Taiwan Semiconductor Manufacturing Co (TSMC).

Underscoring Microsoft’s commitment to standardization, Borkar said the Maia chip would use standard Ethernet network cabling, a departure from the more expensive custom Nvidia networking technology used in the previous supercomputers Microsoft built for OpenAI.

As Microsoft takes these strides in AI innovation and chip development, the industry watches closely to see how these advancements will impact the landscape of cloud computing and AI services.