Taking on its rivals: Microsoft (MSFT.US) releases its first AI chip, betting that integrating software and hardware will deliver more than the sum of its parts.

16/11/2023
GMT Eight
Microsoft (MSFT.US) announced on Wednesday the launch of two high-end custom-designed computing chips, one of which is the highly anticipated artificial intelligence (AI) chip, Maia 100. Microsoft did not provide detailed performance figures for either product, but said it intends to use the chips to support its subscription software products and as part of its Azure cloud computing service.

At the Microsoft Ignite global technology conference the same day, Microsoft unveiled a series of new features, including Microsoft 365 Copilot, Security Copilot demonstrations, and the latest Azure capabilities. The highlight, however, was Microsoft's first AI chip, Maia 100, which will power its Azure cloud data centers and lay the foundation for its various AI services.

Microsoft is joining other major tech companies in bringing critical technology in-house to address the high cost of providing AI services. With these two chips, Microsoft catches up with its competitors Google (GOOGL.US) and Amazon (AMZN.US), both of which have developed their own custom chips to run their rival cloud platforms. The company said the chips will be used for cloud-based AI model training and inference: training is the process of building AI models, while inference is the process of deploying those models for practical use.

1+1>2! Microsoft launches its first AI chip, Maia 100, designed to work hand in hand with its own software

At the Ignite developer conference in Seattle, Microsoft unveiled its new AI chip, Maia 100, which is designed to accelerate AI computing tasks and compete with Nvidia's popular AI graphics processing units. Microsoft is also laying the groundwork for its "Copilot" services, which cost $30 a month for commercial software users and developers who want customized AI services.
The Maia chip is designed to run large language models, the type of AI software that underpins Microsoft's Azure OpenAI service, a collaboration between Microsoft and ChatGPT creator OpenAI.

Microsoft, Alphabet, and other tech giants are grappling with the high cost of delivering AI services, which can be ten times that of traditional services such as search engines. Microsoft executives said they plan to tackle these costs by channeling almost all of the company's wide-ranging AI efforts through a common set of foundational AI models, and that the Maia chip has been optimized for this work. Scott Guthrie, Executive Vice President of Microsoft's Cloud and AI group, said, "We believe this gives us a way to offer customers better, faster, lower-cost, and higher-quality solutions."

Rani Borkar, Corporate Vice President of Azure Hardware Systems and Infrastructure at Microsoft, said in a statement, "Software is our core strength, but honestly, we are a systems company. At Microsoft, we are co-designing and co-optimizing hardware and software so that one plus one is greater than two. We can see the whole stack, and the chip is just one component of it."

Microsoft said that by delivering services powered by its own custom chips, it can achieve "huge benefits in performance and efficiency." The idea is simple: Microsoft software should run better on chips designed by Microsoft, because the company can tailor the software and the hardware to each other. This mirrors Nvidia offering its own AI software alongside its AI chips, or Apple developing its own chips for the iPhone and Mac: a company that controls both hardware and software can deliver better results for users.

Microsoft said it has been working with OpenAI, the developer of ChatGPT, to test the Maia 100 chip and will use the experience gained to build future chips.
Sam Altman, CEO of OpenAI, said in a statement, "We were excited when Microsoft first shared the design of its Maia chip, and we have been working together to refine and test it with our models. Azure's end-to-end AI architecture, now optimized down to the chip with Maia, paves the way for training more capable models and makes those models cheaper for our customers."

Microsoft said that in addition to designing its own chips, it also builds the server motherboards and server racks the chips are housed in. To tackle the heat generated by AI servers, Microsoft has created a special cooling unit called a "sidekick," which circulates cooling liquid through a series of pipes and radiators to absorb heat more efficiently than fans alone. This should also help reduce power consumption.

Coming next year! Microsoft also unveils a new general-purpose computing chip with a 40% performance boost

Microsoft also announced that next year it will offer its Azure customers cloud services running on the latest flagship chips from Nvidia and AMD, and said it is testing OpenAI's state-of-the-art GPT-4 on AMD's chips.

The second chip Microsoft launched the same day is Cobalt 100, a 128-core cloud-native chip based on the Arm architecture that may compete with Intel processors. Microsoft said the Azure Cobalt chip offers a 40% performance improvement over the current generation of Arm-based Azure chips. Guthrie said in a statement, "At the scale we operate, it is important to optimize and integrate every layer of the infrastructure stack to maximize performance, diversify our supply chain, and offer more options to our customers."
Taken together, these announcements underscore Microsoft's push to develop custom chips that strengthen its AI capabilities and improve the performance and efficiency of its software products and cloud services. "Providing infrastructure choices is very important to us," the company said.

Microsoft's second chip announced the same day is aimed at cutting internal costs and countering its primary cloud computing rival, Amazon. On Wednesday, Microsoft revealed that it has been testing Cobalt to power its business communication tool Teams. However, Guthrie said Microsoft also intends to sell direct access to Cobalt to compete with the Graviton series of in-house chips from Amazon Web Services (AWS). "We are designing our Cobalt solution to ensure we are very competitive on both performance and price compared with Amazon's chips," Guthrie said.

AWS will hold its own developer conference later this month. A spokesperson said the company's Graviton chips currently have 50,000 customers, commenting after Microsoft's chip announcement, "AWS will continue to innovate to deliver future generations of AWS-designed chips that offer even better value for customers' workloads."

Microsoft has disclosed few technical details, making it difficult to gauge how competitive the chips are against those of traditional chipmakers. Borkar, the vice president of Azure Hardware Systems and Infrastructure, said both products are built on Taiwan Semiconductor Manufacturing Company's (TSMC.US) 5-nanometer process. Guthrie added that the Maia chip will be connected with standard Ethernet network cabling rather than the more expensive custom Nvidia networking technology Microsoft used in the supercomputers it built for OpenAI. "You'll see us go more of a standardized route," he said.

That said, Microsoft's introduction of its own chips does not mean it is abandoning Nvidia or AMD.
The company will continue to offer cloud computing capacity running on Nvidia's H100 chips and will expand access to Nvidia's newly released H200 chips. Microsoft has also said it will begin offering access to AMD's MI300 chip next year. Ben Bajarin, CEO of analyst firm Creative Strategies, said, "This is not about replacing Nvidia. The Maia chip will enable Microsoft to sell AI services in the cloud until personal computers and phones are powerful enough to handle them. Microsoft has a very different core opportunity here because they earn a lot of money from each user."

Contact: contact@gmteight.com