DRAM and NAND surge in tandem as storage chips become the hottest commodity of the AI era; Micron (MU.US) and SanDisk (SNDK.US) ride the "AI super dividend".

10:04 05/03/2026
GMT Eight
Storage chips have become the undisputed centerpiece of the AI super wave, no less contested than NVIDIA's AI chips: they are among the first components in this cycle to show a supply-demand imbalance, and among the first to reflect pricing power.
Driven by extraordinarily strong demand from artificial-intelligence data centers, prices across the DRAM/NAND storage complex are expected to keep climbing. The two major US storage giants, Micron Technology, Inc. (MU.US) and SanDisk (SNDK.US), were once again the focus of global investors on Wednesday as US tech stocks rallied: both closed up nearly 6%, leading the storage sector and helping the NASDAQ mount a strong rebound.

French bank BNP Paribas said in a recent research report that DRAM contract prices are expected to rise 90% quarter-on-quarter in the first quarter of 2026, while NAND, long known for its stable pricing curve, is expected to jump a substantial 55%. Prices are projected to keep rising in the second quarter, extending the upward trajectory in place since the second half of 2025.

BNP Paribas' assessment of the price trend is not an isolated view. TrendForce recently revised its first-quarter 2026 outlook, raising its forecast for conventional DRAM contract prices from +55%-60% to +90%-95% quarter-on-quarter, and lifting NAND Flash contract prices to +55%-60% quarter-on-quarter. Demand from North American cloud computing companies for enterprise SSDs is expected to drive a further price increase of 53%-58% in the first calendar quarter.

All of this highlights a key fact: storage chips have become a central player in the AI super wave, on par with NVIDIA Corporation's AI chips in importance, and are among the first components to experience supply-demand imbalance and pricing power in this cycle. With the price surge showing no signs of stopping, BNP Paribas is bullish on Micron and SanDisk extending their strong runs.
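Quarter-on-quarter percentages compound multiplicatively, which is why forecasts like these stack up quickly. A minimal sketch, using only the Q1 figures cited above (the +30% Q2 rate below is purely a hypothetical placeholder, since the reports say only that prices keep rising):

```python
# Cumulative price index from quarter-on-quarter growth rates.
# Q1 2026 rates are from the forecasts cited in the article;
# the Q2 rate used below is a hypothetical illustration only.

def cumulative_multiple(qoq_rates):
    """Compound a list of QoQ growth rates (e.g. 0.90 for +90%)."""
    index = 1.0
    for r in qoq_rates:
        index *= 1.0 + r
    return index

dram_q1 = cumulative_multiple([0.90])   # BNP Paribas: DRAM +90% QoQ in Q1
nand_q1 = cumulative_multiple([0.55])   # BNP Paribas: NAND +55% QoQ in Q1

print(f"DRAM price index after Q1: {dram_q1:.2f}x")   # 1.90x
print(f"NAND price index after Q1: {nand_q1:.2f}x")   # 1.55x

# Hypothetical: a further +30% in Q2 would put the half-year multiple at
print(f"DRAM after a hypothetical +30% Q2: "
      f"{cumulative_multiple([0.90, 0.30]):.2f}x")    # 2.47x
```

In other words, a single +90% quarter nearly doubles the price level on its own; any positive follow-through in Q2 compounds on top of that base.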
Senior BNP Paribas analyst Karl Ackerman wrote in a client note: "Our in-depth analysis of contract prices across more than 50 dynamic random-access memory (DRAM) SKUs and more than 75 NAND SKUs leads us to estimate a significant 90% quarter-on-quarter increase in DRAM average selling prices in the first quarter, followed by a further sequential increase in the second quarter. The primary driver is growing AI server demand, which is fueling a broader supply-demand imbalance and exerting continued upward pressure on prices."

"For NAND storage products, we expect a significant 55% quarter-on-quarter price increase in the first quarter, followed by a further sequential increase in the second quarter. This trend is driven mainly by the supply-demand imbalance, as NAND suppliers keep redirecting capacity toward enterprise-grade high-performance NAND while remaining extremely cautious about adding new capacity."

Ackerman set a price target of $500 for Micron and $650 for SanDisk. At Wednesday's close, Micron rose 5.55% to $400.77 and SanDisk rose 5.95% to $599. The BNP Paribas targets imply that the powerful bull run both companies have enjoyed since 2025 is far from over.

Separately, foreign media reported on Wednesday that Samsung, the largest player in the storage industry, has raised DRAM prices by roughly 100%, and in some cases more. According to Korean electronics-industry media, Samsung Electronics has completed final first-quarter DRAM price negotiations with its biggest clients, including Apple Inc. The average price of server, PC, and mobile DRAM roughly doubled from the previous quarter (the fourth quarter of last year), with some clients and products seeing hikes of more than 100%.
Industry insiders said the negotiations have been completed and some overseas clients have already made payments. The increase exceeds the roughly 70% level negotiated in January, widening by about 30 percentage points within a month. These rapid price increases are reshaping long-term contract norms across the global storage industry: GPU/TPU systems' growing reliance on high-bandwidth memory (HBM), DRAM, and enterprise-grade SSDs has created a persistent supply-demand imbalance, and supply negotiations have shifted from traditional annual contracts to quarterly contracts, and now even to monthly adjustments, reflecting how severe the imbalance in the storage chip market has become.

Both "Alphabet Inc. Class C's AI computing chain" and "NVIDIA Corporation's GPU chain" are heavily dependent on storage

Both the massive TPU computing clusters led by Alphabet Inc. Class C and the vast NVIDIA Corporation AI GPU clusters rely on HBM storage systems tightly integrated with the AI chips they carry. Beyond HBM, tech giants such as Alphabet Inc. Class C and OpenAI must also purchase server-grade DDR5 memory and enterprise-grade high-performance SSDs/HDDs to accelerate the construction and expansion of AI data centers. In contrast to Seagate and Western Digital Corporation, which dominate nearline high-capacity HDDs, SanDisk focuses on high-performance eSSDs. The three major storage chip manufacturers, Samsung Electronics, SK Hynix, and Micron, sit squarely across multiple core storage segments: HBM, server DRAM (including DDR5/LPDDR5X), and high-end enterprise-grade data center SSDs (eSSDs). They are the most direct beneficiaries of the "AI memory + storage stack" in this wave of AI infrastructure spending.
From a hardware-fundamentals perspective, AI computation is constrained not only by compute but also by data-movement capability. Whether on NVIDIA Corporation GPUs or TPU computing systems, what really determines training and inference efficiency for large models is not just the number of Tensor Cores or matrix units, but the ability to feed weights, KV cache, activations, and intermediate tensors into the compute cores at high bandwidth every second.

Viewed across semiconductors and AI data center infrastructure, storage chips are "perfectly positioned" in the AI trend: they benefit from both training expansion and inference expansion, and they act as a "universal toll booth" across platforms, architectures, and ecosystems. As the AI era shifts from training-dominated workloads toward inference, agents, long context, and retrieval augmentation, demand for capacity, bandwidth, power efficiency, and the data-persistence layer will only increase.

An official Alphabet Inc. Class C document states that Cloud TPU uses HBM (high-bandwidth memory) to support larger parameter counts and batch sizes, and its Ironwood TPU, aimed at the "inference era," further increases HBM capacity and bandwidth. NVIDIA Corporation's AI GPU systems are even more direct: a single Blackwell Ultra-based GPU can carry up to 288GB of HBM3e, and the GB300 NVL72 rack-scale system is designed around that large HBM capacity to improve long-context inference throughput. In other words, without HBM the peak compute of GPUs/TPUs cannot be effectively realized; storage bandwidth and capacity determine whether large models can run effectively and efficiently. Moreover, the storage systems AI data centers truly rely on go well beyond HBM.
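A back-of-envelope sketch shows why these rack-scale systems matter so much to memory makers. It uses only the figures cited above (288GB of HBM3e per Blackwell Ultra GPU) plus the 72-GPU rack count implied by the "NVL72" product name:

```python
# Rough rack-level HBM capacity, from the per-GPU figure cited in the
# article (288 GB HBM3e per Blackwell Ultra GPU) and the 72 GPUs
# implied by the GB300 NVL72 name.

hbm_per_gpu_gb = 288   # HBM3e per Blackwell Ultra GPU
gpus_per_rack = 72     # the "72" in NVL72

rack_hbm_gb = hbm_per_gpu_gb * gpus_per_rack
print(f"HBM3e per NVL72 rack: {rack_hbm_gb} GB "
      f"(~{rack_hbm_gb / 1000:.1f} TB)")   # 20736 GB (~20.7 TB)
```

Roughly 20TB of HBM per rack, before counting any server DRAM or SSDs, gives a sense of the memory bill attached to every AI rack deployed.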
The complete AI storage hierarchy includes HBM, closest to the accelerators, for rapid data supply; DDR5/RDIMM/LPDRAM for main-memory expansion and data preprocessing; and enterprise-grade SSDs for the persistent data paths: training datasets, checkpoints, vector stores, RAG retrieval, and inference caches. Micron, for example, frames its AI data center storage solution as a full assembly of devices covering training and inference, stating explicitly that its eSSD product line keeps the AI pipeline efficiently fed with data during both phases. TrendForce likewise points out that as the AI inference era arrives, the major North American cloud computing giants are rapidly increasing purchases of high-performance storage, with eSSD demand far exceeding expectations.

This means AI GPU clusters cannot function without storage, and Google TPU clusters are equally dependent on it; the only difference is the brand of the accelerator, while the underlying data layer must in either case be built on the full pyramid of HBM + server DRAM + NAND/SSD.

In their latest storage price outlook, Citigroup analysts take a more aggressive "super cycle" stance on storage prices than UBS Group AG, Nomura, and JPMorgan. Citigroup believes that, driven by the broad adoption of AI agents and surging AI CPU memory demand, storage chip prices will surge almost unchecked in 2026, and it raises its expected 2026 ASP growth for DRAM from 53% to 88% and for NAND from 44% to 74%. Server DRAM ASPs in 2026 are expected to jump 144% year-on-year (previously +91%) on the dual push of AI training and inference demand. Taking the mainstream 64GB DDR5 RDIMM as an example, Citigroup predicts its price will reach $620 in the first quarter of 2026, a 38% quarter-on-quarter increase that far exceeds its previous forecast of $518.
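Two implied numbers follow directly from the RDIMM figures cited above; a quick sketch (using only the $620 forecast, the +38% QoQ move, and the prior $518 forecast):

```python
# Implied figures from the cited Citigroup forecast for 64GB DDR5 RDIMM:
# $620 in Q1 2026, a +38% QoQ increase, versus a prior forecast of $518.

q1_forecast = 620.0   # new Q1 2026 forecast, USD
qoq_growth = 0.38     # +38% quarter-on-quarter
old_forecast = 518.0  # Citigroup's previous Q1 forecast, USD

# Backing out the Q4 2025 base price implied by the QoQ figure,
# and the size of the forecast revision itself.
implied_q4_price = q1_forecast / (1 + qoq_growth)
revision = q1_forecast / old_forecast - 1

print(f"Implied Q4 2025 base price: ~${implied_q4_price:.0f}")  # ~$449
print(f"Forecast revision vs prior: +{revision:.1%}")           # +19.7%
```

So the revision alone, from $518 to $620, amounts to nearly a 20% markup on what was already a bullish forecast.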
In the NAND sector, Citigroup is similarly aggressive, with enterprise-grade SSD ASPs in particular expected to climb 87%. In the analysts' view, the storage chip market is entering an intensely seller-driven phase, with pricing power resting firmly in the hands of storage giants such as Samsung, SK Hynix, Micron, and SanDisk.