"Under the wave of the AI boom, the 'computing power shortage' is becoming more and more intense! Despite just receiving a financing of $12.2 billion, OpenAI is still loudly proclaiming that computing power is not enough."

14:58 15/05/2026 | GMT Eight
OpenAI's CFO said that, with the computing power shortage worsening, the leading AI application company may raise still more funding.
OpenAI Chief Financial Officer Sarah Friar recently said in a media interview that even after completing what OpenAI calls the largest private financing round in history, the developer of ChatGPT may still raise even larger sums, because demand for AI computing power around its products keeps expanding explosively and the company must do everything possible to meet it.

OpenAI's complaints about a computing power shortage also underscore that the "money-burning wave" of AI computing infrastructure spending by AI application leaders and global tech giants is far from over. The underlying logic of the AI computing power bull-market narrative is shifting from "expectations of AI training demand" to "massive inference-side demand + accelerating realization of capital expenditure + reinforced supply bottlenecks."

According to the reports, Friar said the recently completed $122 billion financing round gives the AI unicorn "a lot of room for choice." Future fundraising, she added, will still depend on demand, revenue growth, cash flow, and the gap between the computing power OpenAI needs and the computing power spending it can afford. Over time, she said, the public markets may become an attractive financing channel: they are "significantly larger" than the private market and could give the company access to a wider range of financing options.

These comments highlight the core tension facing OpenAI and the broader AI boom: accelerating demand for computing power around its range of AI products, set against the scarcity and rising cost of infrastructure such as GPUs/TPUs, CPUs, and HBM/DRAM/NAND memory.
Friar said that ChatGPT now has more than 900 million weekly active users, well above market expectations, and that Codex, the company's software engineering subscription product, has surpassed 4 million users. In a market where "AI computing resources are not abundant in 2026," owning large-scale AI computing infrastructure remains a "huge competitive advantage."

The big wave of AI funding is far from receding! OpenAI and Anthropic hit a "vertical wall of demand" amid the global computing power shortage, further strengthening the bull-market logic of the computing power chain

"We are climbing a vertical wall of product demand," Friar said. OpenAI's enterprise sales team is "exhausted," she added, because customers keep asking how to use AI to transform their business operations and maximize efficiency, while some large Wall Street banks are prioritizing the company's cybersecurity model. Asked about the tense relationship with Apple Inc. and the possibility of legal action, Friar said OpenAI hopes to keep the relationship on good terms but declined to comment on litigation.

It is undeniable that AI application leaders such as OpenAI and Anthropic are running into a "vertical wall of demand" in a 2026 market that is "not abundant in computing power." Together these factors show that the core constraint on frontier model companies and tech giants is no longer model parameters, but whether they can continuously obtain, at scale, AI GPUs/TPUs, CPUs, HBM/DRAM/NAND memory chips, core power equipment, and high-performance networking to keep expanding their AI computing infrastructure capacity.
Major US tech giants such as Microsoft, Meta, Alphabet's Google, and Amazon are pushing annual capital expenditures toward $800 billion for AI computing infrastructure construction. Meanwhile, Taiwan Semiconductor Manufacturing Company (TSMC), the manufacturer of almost all advanced AI chips and high-performance data center computing chips, has its leading-edge nodes (including 3nm, 2nm, and the future N2 and A16) fully booked long term, or even overbooked; the supply-demand imbalance has turned its nearly irreplaceable manufacturing capacity into the "bottleneck node" of the entire industry.

On April 30, the three cloud computing super-giants Microsoft, Google, and Amazon all delivered outstanding results on the same night, underscoring how unexpectedly fast their cloud businesses are growing on the back of the AI wave and prompting Wall Street to re-evaluate the commercial returns on AI. A recent Morgan Stanley research report estimates that the combined 2026 capital expenditures of the five hyperscale tech giants (Amazon, Google, Meta, Microsoft, Oracle) will reach roughly $800 billion, and that the figure will exceed $1.1 trillion in 2027, up from a previous forecast of $950 billion.

The Morgan Stanley analysts emphasized the core logic behind these massive investments: first reinvest and build capacity, then earn large-scale commercial revenue and returns on those AI computing resources. The surge in backlogged cloud computing orders is the most direct evidence that this logic holds. Alibaba's own statements extend the logic to the Chinese cloud and AI ecosystem.
Alibaba's management said on an earnings conference call that AI investment over the next three years will exceed the previously planned 380 billion RMB, because AI investment is beginning to show returns and is driving the expansion of its cloud computing business. The call also noted that demand for computing infrastructure is expected to grow tenfold from 2022 levels, and market reports quoted management as saying that "no AI computing card is idle."

Is there no end to the bull market in AI computing power?

The latest on-chain pre-IPO trading data show that Anthropic's implied valuation has soared to $1.2 trillion. If Anthropic completes an IPO at that valuation, it would immediately become the 11th-largest listed company in the world, a new legend in commercial history. Anthropic announced last month that its annualized revenue (ARR) has exceeded $30 billion, up sharply from $9 billion at the end of 2025; third-party research firm SemiAnalysis said in an early-May report that the figure has since risen to around $44 billion. That growth rate far outpaces OpenAI and the other AI application leaders.

Anthropic's explosive growth stems mainly from the popularity of its Claude models. Last month the company said that demand for Claude had put "inevitable pressure on infrastructure," affecting the "stability and performance" of the user experience, especially during peak hours. The demand explosion and tight computing supply have also pushed Anthropic to engage actively with commercial space giant SpaceX, Google Cloud, Amazon, and others to secure more computing resources. Surging demand for Claude has become a "sweet burden" for Anthropic, OpenAI, and other AI application leaders, as well as for tech giants such as Google and Microsoft.
Anthropic CEO Dario Amodei recently said the company had originally planned for 10x growth, but in the first quarter revenue and usage grew 80x on an annualized basis, which explains why it struggles to meet demand. He said, not without pride, that the current growth is "crazy" and "difficult to manage," and that he hopes future expansion can be "more normal."

As the tech giants' frenzied AI infrastructure build-out follows the "capital expenditure first, application explosion later" path of the early railways, power grids, broadband, and cloud computing, the upward trajectory of the AI computing power industry chain, and of the global equity bull market it drives, looks far from over. These giants aim to convince more investors that their massive AI investments will deliver record returns, and their ever-stronger AI capital expenditures are a genuine tailwind that will keep supporting the leaders across the AI computing power chain: AI GPUs/ASICs, data center CPUs, HBM/NAND/HDD storage, 2.5D/3D advanced packaging, liquid cooling systems, optical interconnect supply chains, and the data center power chain.

In addition, the North American tech giants are moving beyond the US bond market alone and turning to the global credit market, shifting AI infrastructure construction from a tech stock valuation story into a real capital expenditure cycle that spans global credit markets and cross-currency financing systems. This dynamic is a heavy blow to the "AI bubble narrative."
If the AI investment frenzy were merely a bubble, these companies would typically rely on stock price narratives, venture capital, or short-term market sentiment for financing. In reality, however, computing-driven AI infrastructure construction keeps expanding, their bond issues are met with strong buying enthusiasm, and hyperscale cloud vendors such as Google's parent Alphabet, Amazon, and Meta are issuing bonds in euros, Canadian dollars, Japanese yen, Swiss francs, US dollars, and other currencies to finance data centers, AI servers, power, networking, storage, and other infrastructure for large-model training and inference. This indicates that they treat AI as long-term asset-liability engineering rather than a short-term marketing concept.