Storage's "super cycle" logic strengthens: SSD leader SanDisk (SNDK.US) posts an 878% quarter-on-quarter surge in operating profit

07:47 07/11/2025
GMT Eight
SanDisk is moving ever more aggressively into the AI/data center eSSD (enterprise SSD) market, and that push has begun to convert into actual orders and revenue growth.
Global SSD storage leader SanDisk (SNDK.US) announced results for its fiscal first quarter of 2026, the quarter ended October 3, on Friday morning Beijing time. The report shows that SanDisk's core Q1 metrics and its forward guidance far exceeded the average expectations of Wall Street analysts.

The results significantly strengthen SanDisk's position in the storage market alongside the three major storage giants - Samsung, SK Hynix, and Micron Technology - as well as Western Digital and Seagate, in leading what is being called the "storage industry super cycle". They highlight how the global surge in demand for AI training/inference computing power, together with the consumer-electronics recovery cycle driven by edge AI, is lifting demand across the DRAM/NAND product range, with demand surging in particular for HBM and for enterprise SSDs on the NAND side.

In February 2025, storage giant Western Digital (WDC.US) completed the spin-off of its flash business, and SanDisk now operates as an independent company focused on NAND flash chips and SSD storage products. An SSD is a solid-state drive that uses NAND flash as its primary storage medium, packaged as a "disk" that can be installed directly in computers, servers, and data centers. The SSD market is an oligopoly, with SanDisk ranking just behind Samsung, SK Hynix, and Micron by market size.

Since the spin-off from Western Digital in 2025, SanDisk's stock price has skyrocketed by nearly 500%. Following the strong results, the stock surged more than 10% in after-hours trading.
Now that AI training/inference systems have pushed the "compute devices" (GPUs and HBM) to their limits, the real bottleneck lies in feeding massive global datasets to those GPUs and HBM correctly and fast enough - and enterprise SSDs happen to be the optimal "feeding system" for the job.

Fueled by the unprecedented AI boom, SanDisk's results for the quarter ended October 3 show total revenue up 23% year-on-year and 21% quarter-on-quarter to $2.31 billion, beating Wall Street analysts' average expectation of $2.1 billion. Adjusted (Non-GAAP) earnings per share came in at $1.22, far above the prior quarter's $0.29 and analysts' expectation of $0.89.

Across the broader metrics, Q1 operating profit reached $176 million, a whopping 878% quarter-on-quarter increase, while Non-GAAP operating profit was roughly $245 million, up a significant 145% quarter-on-quarter. Q1 net profit was around $112 million, versus a loss of $23 million in the prior quarter, and Non-GAAP net profit was about $181 million, a substantial 331% quarter-on-quarter increase. Q1 gross margin was approximately 29.8%, up from the prior quarter's 26.2%.

SanDisk's data center revenue grew 26% quarter-on-quarter in Q1. This is mainly because the company is already qualified to supply NAND chips to two hyperscale data center operators, and it plans to add a third hyperscaler and a large storage OEM in 2026. In addition, SanDisk has held in-depth discussions with five major hyperscale data center operators.
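As a sanity check on the quarter-on-quarter percentages above, the implied prior-quarter figures can be backed out of the reported Q1 numbers. A minimal sketch in Python - note that the prior-quarter values below are approximations derived from the article's growth rates, not figures taken from SanDisk's filings:

```python
# Back out the prior-quarter base implied by each reported QoQ growth rate.
# All dollar amounts are in millions of USD.

def implied_prior(current, qoq_growth_pct):
    """Prior-quarter figure implied by a quarter-on-quarter growth rate."""
    return current / (1 + qoq_growth_pct / 100)

# Q1 operating profit: $176M, reported as +878% QoQ
prior_op = implied_prior(176, 878)            # ≈ $18M implied base

# Q1 Non-GAAP operating profit: $245M, reported as +145% QoQ
prior_non_gaap_op = implied_prior(245, 145)   # ≈ $100M implied base

# Q1 Non-GAAP net profit: $181M, reported as +331% QoQ
prior_non_gaap_net = implied_prior(181, 331)  # ≈ $42M implied base

print(round(prior_op), round(prior_non_gaap_op), round(prior_non_gaap_net))
```

The implied bases (roughly $18 million GAAP operating profit, $100 million Non-GAAP operating profit, $42 million Non-GAAP net profit) show how small the prior quarter's starting point was, which is why the percentage jumps look so dramatic.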
Turning to the outlook that global investors are focused on, SanDisk's management expects fiscal second-quarter 2026 revenue of $2.55 billion to $2.65 billion, above the Wall Street average expectation of roughly $2.36 billion, and Non-GAAP adjusted earnings per share of $3.00 to $3.40, far above analysts' average expectation of $1.82. The better-than-expected outlook underscores that the AI boom is pushing real demand for high-capacity enterprise SSDs, and their prices, into a new growth cycle.

On Wall Street, bullish sentiment toward the storage giants keeps heating up. Analysts generally believe the current AI bubble is in its early formative stage, far from the dreaded "bubble burst moment", and that until any such burst, the DRAM and NAND storage giants will be among the biggest beneficiaries of the AI boom, their stock prices in a strong bull trend that is far from over.

Morgan Stanley recently reiterated an "overweight" rating on SanDisk while sharply raising its 12-month price target from $95 to $230. Bank of America likewise maintains an "overweight" rating, with its target soaring from $125 to $230. SanDisk's stock closed Thursday's US session at $207.69.

Enterprise SSDs - among the biggest beneficiaries of the AI trend

From a product and customer pipeline perspective, SanDisk is positioning itself ever more aggressively in the AI/data center eSSD (enterprise SSD) segment, and that positioning has already started to convert into actual orders and revenue growth.
In the data center segment, the company emphasized in its report that it is advancing qualification of its UltraQLC high-capacity enterprise SSDs with multiple hyperscale cloud and storage OEM customers, including qualification for NVIDIA's flagship AI GPU platform, the GB300, and eSSD qualification testing with multiple hyperscale data center operators. SanDisk unveiled the 256TB UltraQLC NVMe enterprise SSD at FMS in August 2025, positioning it explicitly for AI-intensive workloads such as data ingestion, data preparation, and AI data lakes, and designing it for large-scale cloud and high-capacity scenarios to significantly optimize total cost of ownership.

Enterprise SSDs benefit from the core logic of hyperscale AI training/inference chiefly because they occupy the optimal position in the multi-dimensional space of throughput, latency, capacity, power efficiency, and cost within AI GPU/AI ASIC computing clusters. With expensive but ultra-fast HBM/DRAM above them and cheap but slow HDDs/object storage below, the role of the "middle layer that must be large, fast, power-efficient, and cost-effective" naturally falls to enterprise NVMe SSDs. Striking the best balance among these core factors makes them the "adjacent storage layer" for AI GPU and AI ASIC clusters - the storage tier that AI training, inference, and RAG workloads rely on most and where incremental growth is greatest. For instance, enterprise SSDs can sharply reduce cold-start and model-switching overhead in inference and multi-model serving.
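The tiering argument above can be made concrete with a toy cost/latency model. A minimal sketch, in which every latency and cost figure is an illustrative order-of-magnitude assumption rather than a vendor specification:

```python
# Toy model of the storage hierarchy described above. All latency and $/GB
# figures are rough illustrative assumptions, not vendor specifications.

tiers = {
    # name:       (approx. read latency in s, approx. cost per GB in USD)
    "HBM/DRAM":   (100e-9, 5.00),   # ~100 ns: ultra-fast, very expensive
    "NVMe eSSD":  (100e-6, 0.10),   # ~100 us: the large, fast middle layer
    "HDD/object": (10e-3,  0.02),   # ~10 ms: cheap but slow
}

def cheapest_tier_within(latency_budget_s):
    """Pick the lowest-cost tier whose read latency fits the budget."""
    candidates = [(cost, name) for name, (lat, cost) in tiers.items()
                  if lat <= latency_budget_s]
    return min(candidates)[1]

# A RAG-style lookup that must answer within ~1 ms lands on the eSSD tier:
print(cheapest_tier_within(1e-3))
# A batch scan that tolerates seconds of latency can fall back to HDD/object:
print(cheapest_tier_within(1.0))
```

Under these assumed numbers, any workload that must respond within about a millisecond but cannot afford DRAM-class cost per gigabyte lands on the NVMe eSSD tier - the "middle layer" the article describes.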
Many businesses need to load different models, or segments of a large language model, on demand, and SSDs act as a "model warehouse": the low latency and high concurrent-read capability of enterprise SSDs significantly shorten cold-start times and keep overall QPS from being dragged down by frequent loading of weights. Moreover, technical whitepapers and engineering practice have shown that placing high-dimensional vector indexes on NVMe SSDs is the optimal trade-off between cost, performance, and capacity; combined with software-defined storage, RAG latency and throughput can be optimized further.

A McKinsey research report projects that demand for enterprise SSDs among high-density NAND products, driven by hyperscale generative AI and large-model training/inference, will give enterprise SSDs a CAGR of over 35% across 2024-2030, with growth in AI inference/RAG scenarios potentially exceeding a 100% CAGR and AI-training-related eSSDs potentially reaching a 62% CAGR.

Every AI infrastructure project needs "storage" - and the storage super cycle has already begun

Consider the massive $500 billion "Stargate" AI infrastructure project and the nearly $1 trillion in AI computing infrastructure agreements signed by OpenAI: such super-projects cannot do without NVIDIA AI GPU computing clusters and high-performance data center storage products (HBM storage systems, enterprise SSDs/HDDs, server-grade DDR5, and the like) to support their scale. In this unprecedented investment cycle driven by AI model upgrades and the expansion and construction of AI data centers, NVIDIA and the other core AI computing component makers are undoubtedly the biggest winners.
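For scale, the McKinsey CAGR figures cited above can be converted into cumulative growth multiples over the 2024-2030 window (six compounding years). A minimal sketch:

```python
# Cumulative growth multiple implied by a compound annual growth rate (CAGR)
# over the 2024-2030 window cited above, i.e. six compounding years.

def total_growth(cagr_pct, years=6):
    """Cumulative multiple implied by compounding a CAGR over `years` years."""
    return (1 + cagr_pct / 100) ** years

print(round(total_growth(35), 1))   # overall eSSD demand scenario: ~6x
print(round(total_growth(62), 1))   # AI-training eSSD scenario: ~18x
print(round(total_growth(100), 1))  # AI inference/RAG scenario: 64x
```

A 35% CAGR compounds to roughly a 6x market by 2030, while the 62% and 100% scenarios imply roughly 18x and 64x - that compounding is the arithmetic behind calling this a "super cycle".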
Following closely behind are the high-end memory suppliers represented by HBM (SK Hynix, Samsung, and Micron) and the enterprise high-performance storage makers serving AI data centers (nearline HDDs and data center SSDs). These two links are forming a dual-engine AI investment cycle driven by computing power and storage: HBM storage systems are the first-tier storage fleet tightly coupled to AI GPU/AI ASIC computing clusters, while the enterprise HDDs/SSDs behind them are the other major force in the frenzied build-out of AI infrastructure, absorbing the flood of AI data.

Amid this unprecedented "AI computing race" of rapidly expanding global AI training/inference infrastructure, Morgan Stanley and other Wall Street giants are proclaiming that the "storage super cycle" is here. Booming demand for enterprise storage is driving revenue growth at data storage giants such as Seagate, SanDisk, and Western Digital, whose stock prices have soared by triple digits this year, significantly outperforming not only the US stock market but global markets as well. In a recent research report, Morgan Stanley said that amid this unprecedented wave of AI infrastructure investment by large enterprises and government agencies worldwide, demand for the core storage chips tied to AI training/inference systems remains extremely hot, driving substantial revenue growth in data center storage businesses spanning HBM storage systems, server-grade DDR5, and enterprise SSDs.
One of the most exciting pieces of news for Wall Street analysts recently undoubtedly came from NVIDIA CEO Jensen Huang's presentation at the GTC conference in Washington at the end of October, which laid out the visibility of cumulative data center revenue for 2025-2026 - that is, cumulative data center revenue over the next five quarters from the Blackwell and next-generation Rubin AI GPU product lines. The continuous expansion of global AI computing demand, together with ever-larger US-government-led AI infrastructure projects and tech giants' massive ongoing investment in large-scale data centers, largely signals that the "AI faith" sweeping the globe is far from finished supercharging the stock prices of leading AI computing companies - NVIDIA, Taiwan Semiconductor Manufacturing Co. Sponsored ADR, Micron, SK Hynix, Seagate, and Western Digital, among others - and will keep driving these AI computing supply chain companies along further "bull market curves".