Microsoft Corporation (MSFT.US) unveils its next-generation self-developed AI chip "Maia 200"! As an AI inference frenzy sweeps the globe, the golden age of AI ASICs has arrived.

08:24 27/01/2026
GMT Eight
Microsoft has officially released its second-generation artificial intelligence chip, Maia 200. The company says the newly launched chip, built exclusively for its own cloud platform, outperforms comparable silicon from Google and Amazon on most AI inference tasks.
US tech giant Microsoft Corporation (MSFT.US) unveiled its second-generation artificial intelligence chip on Monday, US Eastern Time. It is an important move for the company as it aims to provide cloud AI training and inference computing resources with higher energy efficiency and better cost-effectiveness. Microsoft Corporation's self-developed AI computing cluster hardware serves as an important alternative to NVIDIA Corporation's expensive AI GPU lineup in AI computing infrastructure. The self-developed AI chip, named "Maia 200" and manufactured by "the king of global chip fabrication" Taiwan Semiconductor Manufacturing Co., Ltd., is similar to Alphabet Inc. Class C's TPU architecture and follows the AI ASIC technology route.

The chip is gradually being deployed in Microsoft Corporation's large AI data center in Iowa and will soon be deployed in a large data center in the Phoenix area. On Monday, Microsoft Corporation invited developers to begin working with the dedicated control software built around Maia, but it is not yet clear when the large user base of Microsoft Corporation's Azure cloud computing platform will be able to call on cloud AI servers equipped with this AI chip.

Undoubtedly, the global wave of generative AI has accelerated the development of AI ASIC technology at the cloud computing and chip design giants, as they strive to design cost-effective and highly efficient AI computing infrastructure clusters for advanced large-scale AI data centers. US chip design giant Marvell (MRVL.US), its largest competitor Broadcom Inc. (AVGO.US), and Taiwan-based MediaTek are increasingly focused on collaborating with cloud computing giants such as Amazon.com, Inc. and Alphabet Inc. Class C to create AI ASIC computing clusters tailored to the specific needs of their AI data centers. This ASIC business has become a very important business line for all three chip design giants; the TPU AI computing cluster that Broadcom Inc. built in collaboration with Alphabet Inc. Class C is a typical example of the AI ASIC technology route.

The most powerful AI ASIC among the three cloud computing giants! Outpunching Amazon.com, Inc.'s Trainium and outkicking Alphabet Inc. Class C's TPU

Following the recent releases of new generations of specialized AI ASIC chips for the cloud computing platforms of Alphabet Inc. Class C and Amazon.com, Inc., Microsoft Corporation, another global cloud computing giant, has finally launched its long-awaited Maia 200, a new high-performance AI chip. In its announcement, Microsoft Corporation stated that the chip, designed "for large-scale AI inference tasks," outperformed Amazon.com, Inc.'s latest third-generation Trainium and Alphabet Inc. Class C's recently released seventh-generation TPU in multiple tests. On that basis, Microsoft Corporation officially described the newly launched Maia 200 as "the most powerful self-developed specialized AI chip among all large-scale cloud computing service providers."

Scott Guthrie, who leads Microsoft Corporation's cloud computing and AI business, said in a blog post that the first batch of the chips, manufactured by Taiwan Semiconductor Manufacturing Co., Ltd., will be provided to Microsoft Corporation's superintelligence team for large-scale data generation and for running massive AI training workloads to improve Microsoft Corporation's next-generation large AI models.
This AI chip will also support the cloud AI computing infrastructure clusters behind the Enterprise Copilot line of AI assistant products and Microsoft Corporation's hosted AI inference services, including the latest GPT-series large AI models from OpenAI that Microsoft Corporation rents out to cloud computing customers. Although Microsoft Corporation's self-developed AI chip program got started later than those of the other two global cloud computing leaders - Amazon.com, Inc.'s AWS and Alphabet's...