Cinda: Storage prices are expected to enter an uptrend; the opportunity for CUBE substitution is expected to accelerate.

10/03/2025 | GMT Eight
Cinda released a research report stating that domestic manufacturers are accelerating the application of large models. As NAND Flash manufacturers took more decisive production cuts in early 2025 and are gradually exiting DDR3/DDR4, attention should be paid to the window of opportunity for CUBE substitution: the launch of related products is expected to bring more pronounced cost advantages to edge AI devices and expand the range of DDR4 applications. Cinda recommends focusing on storage modules and storage chip-related stocks.

Cinda's main points are as follows:

Capex of major global manufacturers is expected to increase, driving demand higher

According to the data, the combined capital expenditure of Meta, Google, Amazon, and Microsoft is expected to reach $297.2 billion in 2025, a year-on-year increase of 36.8%. Although Microsoft has recently adjusted some of its data center leasing strategies, it has explicitly stated that it will maintain the intensity of its infrastructure investment over the next three years, having already invested more than $80 billion in the 2024 fiscal year. This investment direction is consistent with the long-term growth trend of AI computing demand. Alibaba announced that it will invest 380 billion yuan in cloud and AI infrastructure over the next three years, a record for this type of investment by a domestic private enterprise; its Tongyi Qianwen large model family has spawned more than 90,000 derivative models, making it one of the largest open-source model communities in the world. Tencent, Baidu, and others are accelerating industry applications of DeepSeek large models, covering areas such as government affairs, finance, and healthcare. In 2025, the capital expenditure of the three major domestic players (Tencent, Baidu, Alibaba) is expected to increase 19.1% year-on-year to $15.42 billion.

Storage manufacturers are actively cutting production, which may push storage prices into an upward range

Since 2023, major NAND Flash manufacturers have recognized the severe impact of oversupply on the industry and the need to actively adjust production strategies to keep the price downturn from deepening. In early 2025, NAND Flash manufacturers took more decisive production-cut measures, reducing annual output in an effort to effectively lower the growth rate of supply bits. This should help quickly ease the supply-demand imbalance and lay the foundation for a price rebound. Samsung decided to cut NAND Flash output at its Xi'an fab by more than 10% at the end of 2024, and has also reduced capacity on its Hwaseong Line 12 and Line 17 to further control overall output. According to TrendForce's forecast, with production cuts by major manufacturers such as Samsung, smartphone inventory digestion, and the demand growth driven by AI and the DeepSeek effect in the first quarter of this year, the NAND Flash supply-demand structure is expected to improve significantly: NAND Flash prices are expected to rise 10-15% quarter-on-quarter in the third quarter and a further 8-13% in the fourth quarter.
Manufacturers are gradually withdrawing from DDR3/DDR4, focus on the window of opportunity for CUBE substitution

According to DIGITIMES, with DRAM prices falling on weak demand, the three major DRAM manufacturers Samsung, SK Hynix, and Micron intend to concentrate production resources on DDR5 and high-bandwidth memory (HBM), and will discontinue DDR3 and DDR4 within 2025. Companies such as Winbond Electronics have introduced CUBE (Customized Ultra-Bandwidth Elements) as a new memory solution. By substantially optimizing the memory technology, CUBE delivers high-performance edge AI computing capability and helps customers bring edge AI computing into mainstream application scenarios. Its bandwidth is equivalent to 4 to 32 LPDDR4X 4266 Mbps x16 IO channels (see the back-of-envelope calculation at the end of this article), and it offers the opportunity to shrink SoC die size and reduce TSV area overhead, potentially bringing more pronounced cost advantages to edge AI devices and expanding the range of DDR4 applications.

Risk factors: downstream demand falling short of expectations; oversupply among upstream manufacturers.
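For context on the bandwidth comparison above, the following is a rough, illustrative conversion of "4 to 32 LPDDR4X 4266 Mbps x16 IO" into peak GB/s. The per-pin data rate and pin count come from the article; the conversion itself is our own back-of-envelope assumption, not a figure from the report.

```python
# Rough back-of-envelope check of the bandwidth range implied by
# "4 to 32 LPDDR4X 4266 Mbps x16 IO" (illustrative assumption, not from the report).

DATA_RATE_MBPS_PER_PIN = 4266   # LPDDR4X-4266: 4266 Mb/s per data pin
PINS_PER_CHANNEL = 16           # x16 IO interface

# Peak bandwidth of a single x16 LPDDR4X-4266 channel, in GB/s.
per_channel_gbs = DATA_RATE_MBPS_PER_PIN * PINS_PER_CHANNEL / 8 / 1000

for channels in (4, 32):
    total_gbs = channels * per_channel_gbs
    print(f"{channels:>2} channels ~ {total_gbs:.0f} GB/s peak")

# Output: 4 channels ~ 34 GB/s, 32 channels ~ 273 GB/s (theoretical peak)
```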

Contact: contact@gmteight.com