The power struggle behind the global AI competition: Energy anxiety and explosive growth of Lenovo's liquid cooling technology (00992)
We have a massive number of idle Nvidia GPUs, but they can only sit in a warehouse because there isn't enough power to light them up.
Not long ago, on a podcast, Microsoft CEO Nadella revealed the harsh truth to the market: the bottleneck of the AI race has shifted from chip supply to electricity.
In the past three months, Nvidia's market value has surged from $4 trillion to $5 trillion. This trillion-dollar gain is built on exponentially growing demand for AI infrastructure; the market expects global investment in data center construction to reach an astonishing $3 trillion.
Throughout the years-long AI frenzy, people have eagerly discussed the nanometer race in chip processes and the trillion-parameter scale of large models, but they have rarely paid attention to the "invisible cornerstone" that supports everything: electricity.
Now, what seems like an AI competition dominated by algorithms and chips has quietly evolved into a global competition around electricity resources, cooling technologies, and energy strategies. From the giant data centers being built in Tennessee, USA, to the hydroelectric computing hub in the Yangtze River Basin in China, the supply capability and efficiency of electricity have become the core variables determining the competitiveness of countries and tech giants in AI.
Energy crisis under the computing power craze
Why is AI becoming more and more power-hungry?
In the past, AI's energy consumption was concentrated in the model training phase: training one GPT-3-level model consumed as much electricity as 300 households use in a year. But with the widespread adoption of large language models (LLMs), energy consumption has shifted from "centralized training" to "distributed inference": every question asked and every image generated is an "inference request." A single LLM query consumes only 0.3 to 1 watt-hour of electricity, but when daily requests reach the millions or even billions, the cumulative energy consumption grows exponentially. Data shows that the energy consumed by LLM inference workloads is now on par with, or even exceeds, model training; by 2026, global AI inference demand is estimated to reach the terawatt-hour level (1 terawatt-hour = 1 billion kilowatt-hours), enough to overload the power grids of many countries.
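The scaling described above is simple arithmetic. A minimal sketch: the 0.3-1 Wh per-query range comes from the text, while the daily query volumes are illustrative assumptions.

```python
# Rough scaling of LLM inference energy: per-query draw times daily volume.
# The 0.3-1.0 Wh per-query range is cited in the article; the query
# volumes below are illustrative assumptions.
def daily_inference_kwh(queries_per_day: float, wh_per_query: float) -> float:
    """Daily inference energy in kWh (Wh -> kWh)."""
    return queries_per_day * wh_per_query / 1000.0

for volume in (1e6, 1e9):  # a million vs a billion queries per day
    low = daily_inference_kwh(volume, 0.3)
    high = daily_inference_kwh(volume, 1.0)
    print(f"{volume:.0e} queries/day: {low:,.0f} - {high:,.0f} kWh/day")
```

At a billion queries a day, even the conservative per-query figure accumulates to hundreds of megawatt-hours daily, which is how "distributed inference" catches up with training.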
Former Google CEO Eric Schmidt once warned in public, "The power demand scale of AI data centers is something I have never seen in my career before." His team's calculations show that by 2030, global AI data centers will need an additional 96 gigawatts of power capacity, equivalent to the annual total electricity generation of Sweden or the total output of 100 standard US nuclear power plants.
Even more shocking is the electricity cost behind it. A data center with a power demand of 1 gigawatt (1,000 megawatts) running at full capacity all year round (about 8,760 hours) would consume roughly 8.76 billion kilowatt-hours annually, putting its yearly electricity bill at $700 million to $876 million. Future data centers with even larger power demands, potentially reaching 10 gigawatts, will see electricity costs scale up accordingly.
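The cost figures above can be reproduced with back-of-envelope arithmetic. A quick sketch, where the $0.08-0.10/kWh electricity prices are assumed values chosen to match the article's cost range:

```python
# Annual energy and electricity cost for a data center at full load.
# The $/kWh prices are illustrative assumptions, not from the article.
def annual_energy_kwh(power_gw: float, hours_per_year: float = 8760.0) -> float:
    """Annual energy use in kWh; 1 GW = 1,000,000 kW."""
    return power_gw * 1_000_000 * hours_per_year

energy = annual_energy_kwh(1.0)  # 8.76 billion kWh
for price in (0.08, 0.10):       # assumed $/kWh
    print(f"${price:.2f}/kWh -> ${energy * price / 1e6:,.0f}M per year")
```

At those assumed prices, the 1 GW site's bill lands in the $700M-$876M range cited above; a 10 GW site simply multiplies every figure by ten.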
For tech companies, energy is not just a "cost item" but a "survival item."
According to a Goldman Sachs research report, data center power demand is expected to increase by 165% by 2030, and the energy consumption of AI inference accounts for over 20% of operational expenses (OPEX). This means every 1% gain in energy efficiency can save enterprises hundreds of millions of dollars in long-term costs; conversely, a power supply interruption or cost spike can instantly wreck revenue forecasts. Today, tech giants have had to take on the role of "energy developers," treating investment in energy infrastructure (CAPEX) as a prerequisite for their AI deployments: without power, even the most advanced chips and algorithms are just scrap metal.
The rivalry behind the AI energy strategy of China and the US
As the country investing most aggressively in computing power, the US is facing an electricity shortage crisis. Requests to refurbish some older power facilities in the US now face waits of over seven years. And while a US data center can often be built out in just a few years, a new natural gas power plant (gas-fired plants supply most of the electricity for US data centers) without an existing equipment contract will take at least a decade to come online.
Faced with these electricity constraints, the world's top tech giants have had to adopt an aggressive strategy: building their own power plants to control their energy supply. The core of this model is the integrated campus combining "data centers + self-owned power plants."
Elon Musk's xAI company is an "aggressive practitioner" of this strategy. Its "Colossus 2" data center under construction in Memphis, Tennessee, plans to deploy between 550,000 and 1 million AI chips, with a potential investment scale of up to trillions of dollars. To support this "computing behemoth" that requires 1 gigawatt of power, xAI is building a natural gas power plant in neighboring Mississippi, deploying multiple gas turbines for on-site power generation and direct supply. This "on-site power generation" model not only avoids transmission losses in the power grid but also allows for flexible adjustments in power generation based on computing demands to handle "load fluctuations."
A more ambitious plan comes from the "Stargate" project jointly promoted by OpenAI, SoftBank, and Oracle. The three parties plan to invest $500 billion over the next four years to build multiple giant data centers worldwide. Larry Ellison, the executive chairman of Oracle, revealed that the project has already launched the construction of 10 data centers in Texas.
This kind of energy independence is becoming a key competitive barrier in the AI competition between tech giants and even countries.
In contrast to US companies achieving energy independence through the "self-built power plant" model, China is addressing the energy needs of AI computing power by relying on policy guidance and the advantages of clean energy.
In July this year, the dormant Yarlung Tsangpo River was awakened by blasting as the 1.2-trillion-yuan super power project officially broke ground. This century-scale project has a planned capacity of 60 million kilowatts (equivalent to three Three Gorges hydropower stations) and an annual output of 300 billion kilowatt-hours, enough to meet over 75% of the national growth in AI computing power demand, with electricity costs as low as 0.1-0.3 yuan/kWh, a 60% reduction compared to similar AI computing costs in the US.
However, reducing energy consumption is obviously a more fundamental strategy.
Liquid cooling becomes the "lifeline" of AI
Another aspect of the energy crisis is the cooling crisis.
Data shows that an AI data center's electricity consumption comes mainly from two parts: computation, which accounts for about 40% of the data center's power demand, and the cooling required to keep processing stable, which accounts for another 40% or so. The remaining 20% goes to other IT equipment.
Going back to the calculations mentioned earlier, a data center's electricity bill for cooling in a year can amount to $280-350 million, a staggering expense.
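That cooling bill follows directly from the ~40% share and the earlier 1 GW cost range; a minimal check:

```python
# Cooling's slice of the annual electricity bill, per the ~40% share above.
COOLING_SHARE = 0.40

def cooling_cost(total_electricity_cost: float) -> float:
    """Portion of the annual electricity bill attributable to cooling."""
    return COOLING_SHARE * total_electricity_cost

# $700M-$876M is the annual bill from the earlier 1 GW example.
for total in (700e6, 876e6):
    print(f"cooling bill: ${cooling_cost(total) / 1e6:,.0f}M per year")
```

This reproduces the roughly $280-350 million cooling figure, which is why cutting the cooling share is worth almost as much as cutting compute itself.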
Cooling capacity has therefore become a physical bottleneck constraining computing power: if the heat generated by chips cannot be dissipated promptly, the result is performance degradation, shortened lifespan, or outright damage. These chips typically have a heat flux density exceeding 50 W/cm², with local hotspots reaching 150 W/cm², equivalent to generating 50-150 joules of heat per second over an area the size of a fingernail, enough to boil a cup of water in minutes. Sustained high temperatures degrade performance, reduce reliability, and eventually cause failures. For AI data centers that must run 24 hours a day, this throttling and failure risk directly undermines service stability and computing output efficiency.
Under such extreme conditions, cooling AI chips with air is like cooling a boiler with a desk fan: utterly inadequate for the demands of AI chips. More advanced cooling has thus become a necessity for data centers looking to cut electricity costs, pushing the industry decisively toward liquid cooling technology.
The core advantage of liquid cooling is efficient heat transfer: the specific heat capacity of liquid is over 4 times that of air, allowing it to absorb heat far more quickly. Through direct contact with chips or full immersion of servers, liquid cooling can raise the manageable heat flux density beyond 300 W/cm², easily meeting the cooling needs of current and future AI chips. Liquid cooling is not an optional upgrade but a necessity for deploying the latest AI chips.
On a global scale, tech giants and specialized vendors are accelerating the build-out of liquid cooling ecosystems. Top companies such as Nvidia, Meta, Microsoft, and Google have fully integrated liquid cooling into their AI infrastructure.
Therefore, the explosion of AI computing power is driving the liquid cooling market into a period of rapid growth, creating a new hot spot for investment in the supply chain. The global data center cooling market is expected to grow from $18.78 billion in 2025 to $42.48 billion in 2032, with a compound annual growth rate (CAGR) of 12.4%.
The domestic liquid cooling market shows the same pattern of technological diversity and rapid growth, which is why the liquid-cooling server sector has been booming in the capital markets lately, with related concept stocks steadily climbing.
Inspur's cold plate liquid cooling solution has been deployed in multiple large-scale data center projects, particularly in finance, government, and other fields demanding high stability. Huawei's liquid cooling solutions mainly use cold plate technology, and its "air-liquid hybrid" cooling has been applied in multiple AI data center projects. Lenovo, the world's third-largest server manufacturer, was among the earliest tech companies to commit to liquid cooling, covering technical routes including cold plate, immersion, and spray cooling.
Lenovo: the underestimated player in liquid cooling
As a "veteran" of the liquid cooling field with nearly 20 years of experience, Lenovo has built comprehensive advantages in technology accumulation, large-scale deployment, and ecosystem construction. The real value of its liquid cooling solutions is gradually being recognized by the market, underpinned by a clear and solid long-term growth logic.
According to the latest Q2 report for the 25/26 fiscal year, revenue from Lenovo's Neptune liquid cooling technology grew 154% year-on-year, up sharply from 68% growth in Q1. In the quarter, Lenovo's ISG posted revenue of nearly 30 billion RMB, up 24% year-on-year, extending a multi-quarter growth streak, while operating profit margin improved 1.2 percentage points quarter-on-quarter, showing the business's profitability is stabilizing and improving. The data center AI infrastructure business maintained rapid growth, with strong order backlogs and high double-digit growth in AI server revenue. The jump from 68% to 154% growth reflects not only the fast-rising market value of liquid cooling technology but also Lenovo's strategic foresight and execution in AI infrastructure.
Data shows that with a 100% liquid-cooled cold plate design, Lenovo's "Neptune" liquid cooling solution cuts system power consumption by 40% versus traditional air cooling, bringing PUE below 1.1; the recently released "dual-circuit" phase-change immersion cooling system uses an innovative external single-phase heat exchanger to precisely control the temperature of the phase-change chamber, significantly improving the heat transfer efficiency of the boiling heat exchanger, doubling heat dissipation capacity over traditional solutions, and lowering system PUE to as low as 1.035.
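PUE (power usage effectiveness) is total facility power divided by IT power, so non-IT overhead shrinks sharply as PUE approaches 1. A sketch comparing the PUE figures above; the 100 MW IT load and the 1.5 air-cooled baseline are illustrative assumptions, not from the article:

```python
# Non-IT overhead implied by a given PUE: overhead = IT load * (PUE - 1).
def overhead_mw(it_load_mw: float, pue: float) -> float:
    """Power (MW) spent on cooling and other non-IT loads."""
    return it_load_mw * (pue - 1.0)

IT_LOAD = 100.0  # MW of IT load, illustrative assumption
# ~1.5 is a typical air-cooled baseline (assumption); 1.1 and 1.035 are
# the Neptune cold plate and phase-change immersion figures cited above.
for pue in (1.5, 1.1, 1.035):
    print(f"PUE {pue}: {overhead_mw(IT_LOAD, pue):.1f} MW of overhead")
```

Going from a 1.5 baseline to 1.035 cuts the overhead on this assumed load from 50 MW to 3.5 MW, which is the economic case for liquid cooling in one line.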
By the third quarter of 2025, more than 80,000 units of Lenovo's Neptune liquid cooling systems had been deployed globally, covering critical areas such as artificial intelligence, supercomputing, government, finance, and automotive. On technical maturity, Lenovo's liquid cooling solutions have accumulated extensive project experience in domestic and international markets, and the company has helped draft industry standards such as the "Cooling Liquid Technical Requirements and Testing Methods for Data Center Liquid Cooling Systems," placing it at the forefront of the industry in technological influence.
Its rich deployment record is the best evidence of Lenovo's strength in liquid cooling, with solutions proven in several major projects and adaptable to varied scenarios. For instance, to support the operation of Geely StarRui's intelligent computing center, Lenovo tailored a Neptune liquid cooling solution, optimizing the heat-dissipation path for the massive data processing workloads of automotive R&D, holding the year-round average PUE at 1.1 and cutting carbon emissions by about 3,179 tons per year. In overseas markets, it has supplied liquid cooling for projects such as the Barcelona Supercomputing Center, the Korea Meteorological Administration, and Canada's meteorological service.
Goldman Sachs' latest research report points out that the global server cooling market is entering a period of structural growth opportunities. With expectations of increased shipments of AI servers and rapid penetration rates of liquid cooling, the total global server cooling market is projected to achieve annual growth rates of 111%/77%/26% from 2025 to 2027, reaching $17.6 billion in 2027. In this market, Lenovo, with its triple advantages of "technical leadership, rich cases, and a complete ecosystem," is expected to continue increasing market share.
The global explosion of AI has transformed energy infrastructure into a new, high-growth asset class. At the same time, it forces companies to internalize risk management and makes liquid cooling's efficiency an essential solution. Operators can achieve an extremely low PUE through liquid cooling, not only reducing operational costs but also minimizing the impact on the power grid. Therefore, the core of this global AI competition will ultimately be a dual investment wave in power and thermal management technology. Liquid cooling technology providers like Lenovo will be important beneficiaries in the future.