Surprised or not, Micron Technology (MU.US), whose stock price has surged 180% year-to-date, remains a rare "value oasis" in the storage super cycle.
Compared to other global AI computing infrastructure suppliers such as Nvidia, AMD, SanDisk, Western Digital, Seagate, and Intel, Micron's valuation appears much cheaper.
The US-headquartered storage chip giant Micron (MU.US) has become one of the most important participants in AI computing infrastructure. The core reason is the urgent, nearly unlimited need of large-scale cloud computing companies for high-bandwidth memory (HBM) capacity, a high-performance storage chip that only three companies worldwide, SK Hynix, Samsung Electronics, and Micron, can mass-produce. What makes this "storage chip super cycle" unique is that the supply shortage is structural rather than short-term, unlike any previous demand boom.
The "storage chip super cycle" that began in the second half of 2025 has created the conditions for Micron to transform from a typical cyclical commodity manufacturer into a strategic AI infrastructure supplier, with significantly improved profit visibility and business sustainability. More importantly, compared to other global AI computing infrastructure suppliers such as NVIDIA, AMD, SanDisk, Western Digital, Seagate, and Intel, Micron's valuation appears much cheaper, making it more attractive to institutional investors than other AI computing leaders in the North American market.
AI is changing the rules of storage chip demand. The global AI data center construction frenzy has created a nearly insatiable demand for storage chips. Since 2023, the AI GPU/AI ASIC, high-performance network infrastructure, and data center power chains have dominated the first two stages of AI computing infrastructure construction. Micron's management recently pointed out that in the future, HBM, high-performance DRAM, and SSD capacity for AI training/inference systems will become the most important and core supply bottlenecks.
Modern AI computing infrastructure has a tremendous demand for bandwidth, memory, and storage capacity, driven by reasoning models, inference algorithms, physical AI technologies such as robotics, and longer context windows. Micron's management repeatedly emphasized on the latest earnings call that demand for AI computing power currently far exceeds supply, and that larger-scale new storage chip capacity is not expected to come online at scale before 2028, creating the preconditions for storage chip prices and capacity utilization to remain high for longer than in previous cycles.
Compared to traditional PC or smartphone demand, AI infrastructure-related demand is far less cyclical. Large-scale cloud computing companies are no longer buying storage components merely to expand their cloud infrastructure; they are rapidly expanding cloud-side AI inference computing resources, because having the best AI infrastructure means a significant competitive advantage in cloud services, AI applications, software products, and even cloud-based AI national security services. This trend has significantly changed the procurement process. Micron's management has repeatedly emphasized in recent interviews that the supply of DRAM and NAND storage chips still falls far short of demand, and that meaningful new capacity is unlikely to be available before fiscal 2028, which is crucial.
The manufacturing process for HBM is uniquely difficult, involving the most complex advanced packaging, through-silicon vias (TSVs), and yield challenges the storage chip industry has faced to date. Cleanroom constraints and stricter energy-efficiency requirements hinder rapid capacity expansion in response to price spikes. Traditionally, when prices favor storage chip manufacturers, supply quickly increases. The industry now faces many structural constraints, however: increasingly complex HBM manufacturing and packaging processes, insufficient supply elasticity in general DRAM/NAND, and AI-driven demand growing faster than expected.
If there aren't enough storage chips, AI models must recompute from scratch: HBM, DDR5, and SSDs jointly kick off the storage chip super cycle
On the earnings call, Micron's management team specifically mentioned explosive demand for high-capacity data center SSDs, KV cache deployments, and PCIe Gen6 SSDs related to NVIDIA's AI computing infrastructure clusters. This indicates that AI-related storage chip demand is much broader than many Wall Street analysts expect. Modern AI infrastructure not only consumes more HBM memory but also requires high-bandwidth DRAM, more storage capacity, and high-speed SSD infrastructure to meet the growing demands of retrieval and agentic AI workloads. Emerging AI applications, including robotics, multi-agent systems, and multimodal reasoning models, continue to create new storage demand vectors, meaning that AI storage density may continue to grow exponentially even after initial AI deployments.
The Korean benchmark KOSPI, dominated by Samsung and SK Hynix, has hit historical highs, rising 85% year-to-date despite deteriorating geopolitical conditions. Taiwan Semiconductor Manufacturing Co., the heavyweight "king of chip manufacturing," has led the Taiwan stock market to record highs on the AI boom. Add a record 18 consecutive up sessions for the Philadelphia Semiconductor Index and a sixth straight weekly record high for the S&P 500, and investors are increasingly convinced that the "AI computing power investment theme" can overpower all noise in the stock market, especially geopolitical noise related to the Middle East.
As Jeremy Werner, Micron's Senior Vice President and General Manager of the Data Center Business Unit, revealed in a recent interview, from the perspective of underlying AI data center data flow engineering, the driver of this market cycle is not as simple as "AI needs more compute chips." Rather, the AI inference era led by Claude and other AI agents is pushing memory and storage from supporting components into the system bottleneck.
AI training relies more on large-scale parallel computing, while inference, especially with long contexts, multi-turn conversations, and agentic AI workflows, requires continuous retention of KV caches, context states, and intermediate results. When memory or storage space is insufficient, the model must recompute historical states, reducing GPU utilization and raising token generation costs. HBM, DDR5, LPDDR, enterprise-grade SSDs, and even HDDs and data lakes are therefore forming an "AI memory chain" from the GPU near-end to far-end storage, determining the throughput, latency, concurrency, and per-token economics of AI systems. This is why Micron, Samsung, SK Hynix, SanDisk, Western Digital, and other memory and data storage stocks have risen sharply: demand is not concentrated on HBM alone but spills over into the entire chain from AI server architecture to DRAM, NAND, SSDs, and HDDs.
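The KV-cache pressure described here can be made concrete with a back-of-the-envelope estimate. The sketch below is illustrative only: the model dimensions (80 layers, 8 grouped-query KV heads, head dimension 128, 16-bit precision) are assumptions for a generic 70B-class model, not figures from this article.

```python
# Rough estimate of KV-cache memory per token for a transformer model.
# All model dimensions are illustrative assumptions, not vendor figures.

def kv_cache_bytes_per_token(n_layers: int, n_kv_heads: int,
                             head_dim: int, bytes_per_value: int = 2) -> int:
    # Each layer stores one key and one value vector per KV head
    # (hence the factor of 2), typically in 16-bit precision.
    return 2 * n_layers * n_kv_heads * head_dim * bytes_per_value

# Hypothetical 70B-class model: 80 layers, 8 KV heads (GQA), head_dim 128.
per_token = kv_cache_bytes_per_token(n_layers=80, n_kv_heads=8, head_dim=128)
print(f"KV cache per token: {per_token / 1024:.0f} KiB")   # 320 KiB

# A single 128k-token context for one user session:
context_len = 128_000
total_gib = per_token * context_len / 1024**3
print(f"KV cache for {context_len:,} tokens: {total_gib:.1f} GiB")  # ~39 GiB
```

Tens of gibibytes per long-context session, multiplied across thousands of concurrent users, is why inference workloads turn DRAM and HBM capacity, not just FLOPs, into the binding constraint.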
Furthermore, AI CPUs are opening up a second demand curve. In the past, the market equated AI computing power almost exclusively with GPU+HBM, but as inference workloads become more complex, CPUs are evolving from being a "GPU accessory" to becoming an "AI coordinator" that schedules multiple agents, manages contexts, and coordinates workflows, significantly increasing the demand for DDR5/data center-level SSD configurations. At the same time, HBM capacity is being heavily locked up by AI GPUs, general DRAM available capacity is being squeezed, DDR5 and DDR4 price trends are diverging, and the storage shortage is spilling over from high-end HBM to a more general DRAM/NAND supply chain. TrendForce also cites the latest views of the Micron CEO, saying that demand for both traditional servers and AI servers is strong but constrained by tight DRAM and NAND supply; Samsung and SK Hynix have also warned recently that AI-driven storage shortages may persist until 2028 or even longer.
Micron's valuation is much cheaper than most leaders in the AI computing industry chain
Regarding rising DRAM/NAND prices, Goldman Sachs recently judged that storage chip price increases in 2026 will far exceed its previously optimistic expectations. It has significantly raised its forecast for DRAM price increases from about 150% to 250%-280%, and for NAND from about 100% to 200%-250%. In other words, Goldman Sachs believes this is not a normal inventory-correction cycle but a "super supply shortage cycle" driven by an unprecedented AI-fueled demand surge, complex HBM manufacturing and packaging processes, and insufficient elasticity in general DRAM/NAND supply.
GPUs generate intelligence, HBM/DRAM handles high-speed data movement, enterprise-grade NAND/eSSDs handle hot data and caches, while HDDs retain vast amounts of cold and warm data long-term. Goldman Sachs therefore believes the AI computing arms race led by cloud giants is transforming storage chips from cyclical commodities into scarce strategic assets, and that the DRAM/NAND price increases of 2026 are not the end but may be only the initial stage of a super cycle.
Micron's actual financial performance has changed significantly over the past few years. Traditionally, Micron has been seen by Wall Street institutional investors as a technology company that produces cyclical, commoditized memory products, with significant financial fluctuations. However, this situation has changed rapidly. In the latest quarter, the company's operating cash flow was close to 12 billion US dollars, adjusted free cash flow reached 7 billion US dollars, and it decided to increase its dividend by 30%, showing a high level of confidence in the current profitability. Operationally, the company has successfully launched 1-gamma DRAM, G9 NAND, and HBM3E products, and maintains confidence in ramping up HBM4 capacity.
In addition, as the only manufacturer of both DRAM and NAND headquartered in the United States, Micron's technological leadership and capacity position carry particular weight for North American investors. Considering US government support under the CHIPS Act and broader geopolitical risks, Micron's importance in the strategic technology game has further increased. In other words, its business model has changed, and amid the rising importance of semiconductor self-sufficiency worldwide, Micron has gained a competitive edge from its capacity advantage.
From a valuation perspective, Micron closed last Friday at around 746 US dollars per share, up a staggering 180% year-to-date. Yet the stock trades at a forward price-to-earnings ratio of just 12.8 times consensus EPS of 58.11 US dollars for fiscal 2026, and an even lower 7.3 times consensus EPS of 101.78 US dollars for fiscal 2027. Even if profits land near the low end of the fiscal-2027 forecast (about 70.77 US dollars), the P/E is still only about 10.5 times, quite reasonable for a core chip giant at the heart of AI computing infrastructure. Its forward EV/EBITDA is only about 8 times, significantly below many popular AI-workflow software companies, including Palantir, whose free cash flow is accelerating far more slowly than Micron's yet which still trade at higher multiples.
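The forward multiples above follow directly from price divided by expected earnings. A quick check using the figures cited in this article:

```python
# Forward P/E = share price / expected EPS, using the figures cited above.
price = 746.0  # approximate closing price last Friday (USD)

eps_estimates = {
    "FY2026 consensus": 58.11,
    "FY2027 consensus": 101.78,
    "FY2027 low end":   70.77,
}

for label, eps in eps_estimates.items():
    print(f"{label}: forward P/E = {price / eps:.1f}x")
# FY2026 consensus: forward P/E = 12.8x
# FY2027 consensus: forward P/E = 7.3x
# FY2027 low end:   forward P/E = 10.5x
```

The arithmetic confirms the multiples quoted in the article; the wide gap between the FY2026 and FY2027 ratios simply reflects the consensus expectation that EPS nearly doubles between the two fiscal years.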
The super cycle reshapes DRAM/NAND: Is Micron's stock price headed for 1,000 US dollars?
A recent report by the analyst team led by star analyst Ben Reitzes at Melius states that the AI boom will drive continuous strong growth in storage chip demand until the end of this decade (2030). According to market research firm Counterpoint Research, the storage market has entered a "super bull market" or "super cycle" phase, with the current supply and price situation far exceeding the historical highs of the cloud computing boom in 2018.
Micron's stock has recently performed extremely well, rising 6.5% on Monday to close at 795.33 US dollars, a record high, with its market value approaching 900 billion US dollars. Deutsche Bank has significantly raised its target price to 1,000 US dollars, about 26% above the closing price, and maintains a "buy" rating. Drivers include the AI-driven long-term structural storage chip shortage, potential labor unrest at Samsung that could erode Samsung's share of DRAM/NAND, and Micron's recent release of high-capacity enterprise SSDs (such as the 245TB Micron 6600 ION), which significantly improve rack-level storage density and data center efficiency.
On cash flow metrics, Micron has departed from the typical profile of a large cyclical memory chip manufacturer: operating cash flow in the latest quarter was nearly 12 billion US dollars, adjusted free cash flow was 7 billion US dollars, and the company plans to raise its dividend by 30%. In addition, as noted above, it has successfully launched 1-gamma DRAM and G9 NAND and remains confident in ramping HBM3E capacity.
The latest data from the Korean customs shows that DRAM and NAND prices continue to surge, with monthly increases of up to 63%, HBM memory rising by 165.5% compared to the same period last year, and flash memory prices increasing by over 350% year-on-year. The supply-demand imbalance and the ultra-high capacity demand driven by AI have created a structural uptrend in the storage market, significantly strengthening Micron's position as a core supplier. Meanwhile, global IT spending forecasts show that IT hardware, software, and services spending is expected to increase to 6.32 trillion US dollars in 2026, a year-on-year increase of 13.5%. Data center system spending as a percentage of overall IT spending has risen from 4.5% in 2012 to 12.5%, indicating that investments in storage and computing driven by AI are becoming a critical engine of economic growth.