Open Source Securities: The Leader's Capital Expenditure Exceeds Expectations, and Storage Capability Will Become One of the Keys in the Second Half of the AI Wave

GMT Eight | 24/02/2025
Open Source Securities released a research report noting that Alibaba's capital expenditure reached 31.7 billion yuan in FY2025Q3, up more than 80% quarter on quarter and a single-quarter record high; the figure is close to the sum of the previous two quarters, accelerating the AI computing power arms race. The firm believes that as domestic cloud providers, led by Alibaba, raise their capital expenditure, the rollout of AI applications is expected to accelerate and the AI business model is expected to close the loop. Storage capability, the link between upstream computing infrastructure and downstream terminal devices, is expected to become one of the keys in the second half of the AI wave.

The main points of Open Source Securities are as follows:

Storage innovation in the AI era: rising demand plus performance upgrades

Storage capability refers to the ability to store data: massive volumes of data require secure and reliable storage space, and storage capability is what provides it. As data grows at scale, storage devices account for a growing share of the bill of materials (BOM) purchased by data centers: storage chips currently make up about 40% of data center procurement, and this share is expected to rise to 50%.

At the same time, the explosive growth of computing power in the AI era places new requirements on storage. The firm believes the AI boom imposes three main requirements: high speed, high capacity, and high integration.
(1) High speed: high computing power needs storage with high data-exchange rates as support;
(2) High capacity: the massive growth of data in the AI era needs high-capacity storage as support;
(3) High integration: near-memory computing, AI at the edge, and similar workloads require tighter integration of storage.

Within the AI industry chain, storage is crucial for three reasons (a back-of-the-envelope bandwidth comparison follows this list):
(1) Storage is the foundation of computing infrastructure such as data centers. Storage modules are critical components of data centers and servers; for example, a single NVIDIA DGX H100 server is configured with about 30TB of NVMe SSD. As cloud providers raise capital expenditure and AI data center infrastructure expands, storage modules are expected to benefit across the board.
(2) Storage is the bridge for high-speed data transmission to computing chips. HBM uses through-silicon via (TSV) technology to stack DRAM dies and sharply increase the I/O count; combined with 2.5D advanced packaging, it achieves a much larger total channel width while keeping the memory frequency relatively low, and it is now widely used in computing chips.
(3) Storage provides the intelligent safeguard for AI edge devices. As cloud computing power is pushed out to the edge, responses at the edge become faster and more secure. For example, the CUBE solution from Taiwanese manufacturers is integrated with the main SoC via 2.5D or 3D packaging and reaches ultra-high bandwidth through up to 1024 I/O, making it suitable for a range of edge devices. AI at the edge will also drive demand for niche storage such as NOR Flash: the OlaFriend AI earphones carry two 128Mb NOR Flash chips per earphone, whereas ordinary TWS earphones usually carry 64Mb to 128Mb.
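To make the wide-and-slow versus narrow-and-fast trade-off behind points (2) and (3) concrete, here is a minimal back-of-the-envelope sketch. The interface widths and per-pin data rates are illustrative ballpark figures (a 1024-bit HBM-style stack at about 3.2 GT/s versus a 32-bit GDDR-class device at about 16 GT/s); they are assumptions for illustration, not values taken from the report.

# Back-of-the-envelope peak-bandwidth comparison illustrating why a wide,
# relatively slow interface (HBM-style, enabled by TSV stacking and 2.5D
# packaging) can outrun a narrow, fast one. Figures below are illustrative
# ballpark numbers, not values from the report.

def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gt_s: float) -> float:
    """Peak bandwidth in GB/s = (bus width in bytes) * (transfers per second in GT/s)."""
    return bus_width_bits / 8 * data_rate_gt_s

# HBM-style stack: very wide bus (1024 I/O per stack), modest per-pin speed.
hbm_like = peak_bandwidth_gb_s(bus_width_bits=1024, data_rate_gt_s=3.2)    # ~410 GB/s

# Conventional DRAM device (GDDR-class): narrow bus, high per-pin speed.
gddr_like = peak_bandwidth_gb_s(bus_width_bits=32, data_rate_gt_s=16.0)    # ~64 GB/s

print(f"wide-and-slow (HBM-like):    {hbm_like:.0f} GB/s per stack")
print(f"narrow-and-fast (GDDR-like): {gddr_like:.0f} GB/s per device")

Even though the wide interface runs at a far lower per-pin rate, its peak bandwidth comes out several times higher, which is the logic behind placing TSV-stacked HBM and CUBE-style 1024-I/O memory directly alongside compute chips.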
Outlook for the storage industry cycle

Storage modules: the last round of storage price increases began with production cuts by the original manufacturers in Q2 2023. Since 2025, major storage manufacturers have again announced production cut plans: Micron expects a roughly 10% cut, Samsung about 15%, SK Hynix about 10% in the first half of the year, and Kioxia has been cutting production since December 2024. The firm believes these cuts are expected to kick off a new round of price increases for storage modules.

Niche storage chips: niche storage as a whole bottomed out and began to recover in Q4 2023-Q1 2024. Inventories are currently at relatively healthy levels, and with sustained demand from AI at the edge, the niche storage industry is expected to enter a new, steep upward cycle.

Risk warning: the AI industry may develop more slowly than expected; downstream demand may fall short of expectations.

Contact: contact@gmteight.com