GPT-5 "disappoints": has AI "hit a wall"?

Date: 17/08/2025
GMT Eight
The release of GPT-5 clearly indicates that the nature of artificial intelligence competition has changed.
OpenAI's highly anticipated GPT-5 failed to deliver a revolutionary breakthrough. Although the road to Artificial General Intelligence (AGI) appears to have hit a bottleneck, market attention is shifting toward how existing technology can create broader commercial value at the product and service level.

Last week, when OpenAI released its new model GPT-5, it was supposed to be another highlight for the company. Sam Altman had previously announced that GPT-5 was "an important step on the road to AGI." However, the model quickly drew disappointment after its release. Users shared on social media basic errors made by the new model, such as an incorrectly labeled map of the United States, and experienced users were dissatisfied with its performance, its changed "personality," and what they saw as poor showings in benchmark tests.

While this may not have been OpenAI's intention, the launch of GPT-5 makes clear that the nature of the AI competition has changed. Even if the model does not mark extraordinary progress toward AGI or so-called superintelligence, it may spur more innovation in products and services built on AI models.

The controversy has raised a sharp question in Silicon Valley: after billions of dollars of investment, are advances in generative AI approaching the limits of the current stage? The question not only challenges OpenAI's valuation of up to $500 billion but also prompts the outside world to reevaluate the trajectory of AI development.

Despite the doubts in frontier-technology circles, enthusiasm in capital markets and industrial applications has not waned. Investors appear to value AI's tangible growth in commercial applications over distant promises of AGI. The shift suggests that in the second half of the AI competition, the focus will move from a sprint toward the limits of model capability to more practical, cost-effective productization.
Discrepancy between expectations and reality

Over the past three years, AI researchers, users, and investors have grown accustomed to a rapid pace of technological advances. The release of GPT-5 disrupted that momentum. Hampered by technical malfunctions, GPT-5 was described as "clumsy," drew widespread complaints from users, and was even judged inferior to its predecessors. CEO Sam Altman admitted to a "rocky release," explaining that the underlying "automatic switcher" had malfunctioned, causing the system to call on weaker models.

Thomas Wolf, co-founder and Chief Science Officer of the open-source AI startup Hugging Face, said: "People were expecting to see something completely new from GPT-5, but this time we didn't see that."

The discrepancy stings all the more because, before GPT-5's release, the industry was filled with optimistic predictions that AGI was imminent, with Altman even predicting it would arrive during President Trump's term. Gary Marcus, Professor Emeritus of Psychology and Neural Science at New York University and a prominent AI critic, said: "GPT-5 was a test of the whole route to AGI through scaling, and it did not succeed."

At the same time, the industry's competitive landscape has quietly changed. Rivals such as Google, Anthropic, DeepSeek, and Musk's xAI have narrowed the gap with OpenAI in frontier development; OpenAI's sole dominance no longer exists.

Bottleneck of the "scaling laws"

Behind GPT-5's underperformance lies the core logic that has underpinned the development of large language models: the "scaling laws" are approaching their limits. Over the past five years, companies such as OpenAI and Anthropic have followed a simple formula: more data and more computing power yield larger and better models. That path now faces two major constraints.
The first is data depletion: AI companies have nearly exhausted the freely available training data on the internet. Although they are now seeking new data sources through deals with publishers and rights holders, whether this will be enough to drive advances at the technological frontier remains unknown.

The second is the physical and economic limits of computing power. Training and running large AI models consume enormous amounts of energy; GPT-5's training is estimated to have used hundreds of thousands of next-generation NVIDIA processors. Altman also admitted to reporters this week that while the underlying AI models are "still progressing rapidly," chat-based models like ChatGPT "will not get better."

The ghost of an AI winter

Signs of slowing technological progress have led some seasoned researchers to recall the historical "AI winters." Stuart Russell, a computer science professor at the University of California, Berkeley, warned that the current situation resembles the burst bubble of the 1980s, when technological innovation failed to deliver on its promises or provide investment returns. "The bubble bursts, the systems don't earn money, we can't find enough high-value applications," he said. "It's like a game of musical chairs where everyone is scrambling not to be the one left holding the AI baby."

Russell cautioned that high expectations can easily give way to a collapse in investor confidence: if investors conclude the bubble has been overinflated, "they will head for the exit as quickly as possible, and the collapse could be very, very, very fast." Even so, capital is still flowing into AI startups and infrastructure projects. According to data from Bain & Company and Crunchbase, AI has accounted for 33% of global venture capital investment this year.

From AGI to productization

The nature of the competition is changing: rather than technology stagnating, the focus is shifting.
Sayash Kapoor, a researcher at Princeton University, pointed out that AI companies "are slowly accepting a fact: that they are building infrastructure for products." An evaluation by Kapoor's team found that GPT-5's performance across a range of tasks is not significantly inferior, while it excels in cost-effectiveness and speed. That could open the door to innovation in products and services built on AI models, even without extraordinary progress toward AGI.

Yann LeCun, Chief AI Scientist at Meta, likewise believes that while LLMs trained on pure text are entering a stage of diminishing returns, there is still huge potential in multimodal data such as video and in "world models" aimed at understanding the physical world.

The trend is also reflected in corporate strategy. Companies such as OpenAI have begun deploying "frontline engineers" to client companies to help integrate their models. Kapoor commented: "If a company believed it were about to automate all human work, it wouldn't do this."

Investors betting on application prospects

Despite ongoing expert debate over the technology's prospects, Silicon Valley investors seem unfazed. Valuations of AI-related stocks and startups continue to soar: NVIDIA's market value has reached $4.4 trillion, near historic highs, and SoftBank Group, an OpenAI investor, has seen its share price rise more than 50% in the past month. What drives this enthusiasm is no longer the grand narrative of AGI but the strong growth of products like ChatGPT, which reportedly generates $12 billion in annual recurring revenue for OpenAI. David Schneider, a partner at Coatue Management, another OpenAI investor, said the company's product has become "a verb," much as Google once did. Many investors believe this generation of models still holds vast untapped value.
Peter Deng, a General Partner at the venture capital firm Felicis, said: "In business and consumer applications, startups and enterprises are just beginning to scratch the surface of these models' potential." As Thomas Wolf of Hugging Face put it, even if AGI or superintelligence cannot be achieved in the short term, "there are still a lot of cool things that can be created." Perhaps that is the most important message for the market at this stage.

This article is sourced from "Wall Street News," written by Zhang Yaqi; edited for GMTEight by Wang Qiujia.