Sequoia Capital's 2025 AI Outlook: "AI in 2025"

Date: 15 December 2024
Author: GMT Eight
The evolution of the AI landscape in 2025 has clearly shifted from the initial excitement and rapid investment phase to a stage that focuses on implementation, differentiation, and demonstrating real value.
Sequoia Capital recently published "AI in 2025: Building Blocks Firmly in Place," making three predictions about how AI will develop in 2025, which to some extent reflect the firm's qualitative judgment on the direction of large models. The field is still developing rapidly in both underlying technology and industrial application, and commercialization is at a very early stage, so these predictions may well turn out to be wrong; it is wise to consider multiple viewpoints.

Before delving into Sequoia's AI trends for 2025, let's look at some of Sequoia's more interesting insights into large models from the past year or two. With the release of ChatGPT at the end of 2022, intelligent people, and intelligent money, around the world quickly analyzed and followed this direction. Sequoia's past articles show its judgment evolving from cautious observation and fear of missing out to rapid analysis and investment, starting with small bets before larger ones. This also reflects how VCs' understanding of the "technological revolution" represented by large models has changed. In my view, Sequoia's AI articles of the past two years contain three especially insightful points.

AI's $200B Question (2023) & AI's $600B Question (2024)

The rise of generative AI in 2023 marked a turning point, with early use cases demonstrating its potential across industries. Amid the excitement, people also began to worry about the huge capital expenditure required to build and train these models, and the lack of corresponding revenue.
This disconnect between investment and returns has been called the "AI $200 billion question" and, later, the "AI $600 billion question." The "$200 billion question" was first raised in 2023, and the follow-up "$600 billion question" was posed in 2024. These two articles reflect Sequoia's cautious attitude toward the investment frenzy. Whether it is $200 billion or $600 billion, the core of both questions is how large models can generate profit and close the commercial loop. The assumptions and reasoning run roughly as follows.

AI's $200B Question, summarized: if Nvidia's GPU sales reached a $50 billion annualized run-rate by Q4 2023, then that GPU spending must ultimately generate $200 billion in revenue to be a normal business. The reasoning:
1. Assume that every $1 spent on GPUs requires roughly another $1 in energy costs to run them in a data center.
2. If Nvidia reached $50 billion in annualized GPU revenue by Q4 2023 (a conservative analyst estimate at the time), this implies roughly $100 billion in total data-center expenses.
3. The large-model applications and service providers buying those GPUs, such as OpenAI, Microsoft, Google, xAI, and Salesforce, also need to earn a profit. Assuming they require a 50% gross margin (in line with SaaS experience), the GPUs purchased in 2023 would need to generate $200 billion in revenue over their lifespan to recoup the initial capital investment.
4. This does not yet include the profit of cloud service providers; if they are to earn a positive return, the total revenue requirement is even higher.
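The back-of-the-envelope math above can be reproduced in a few lines. All dollar figures are the assumptions stated in the reasoning, not measured data, and the ~$150B run-rate used to reach the 2024 figure is this sketch's own assumption about the updated inputs.

```python
def required_ai_revenue(gpu_capex_b: float,
                        energy_ratio: float = 1.0,
                        gross_margin: float = 0.5) -> float:
    """Revenue (in $B) GPU buyers must earn to recoup data-center spend.

    gpu_capex_b:  annualized GPU spend in $B (assumption from the article)
    energy_ratio: energy cost per $1 of GPU (the article assumes 1:1)
    gross_margin: margin the buyers need (the article assumes SaaS-like 50%)
    """
    data_center_cost = gpu_capex_b * (1 + energy_ratio)
    return data_center_cost / (1 - gross_margin)

# ~$50B GPU run-rate -> ~$100B data-center cost -> $200B revenue required
print(required_ai_revenue(50))   # 200.0
# A ~$150B run-rate (assumed 2024 input) yields the "$600B question" figure
print(required_ai_revenue(150))  # 600.0
```

The same formula scales linearly with the GPU run-rate, which is why the question grew from $200B to $600B as Nvidia's revenue estimates rose.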
In 2024, this figure grew into the AI $600 billion question.

From Software-as-a-Service to Service-as-a-Software

Sequoia's "Generative AI's Act o1," published in October 2024, contains several far-sighted insights, one of which is that Software-as-a-Service will transition to Service-as-a-Software. In the future, companies will not sell SaaS software; they will sell work directly, offering services with clearer, more distinct value and charging based on outcomes. Sequoia believes the TAM (Total Addressable Market) unlocked by this transformation will reach trillions of dollars, based on several key arguments:

Transition from SaaS to Service-as-a-Software. In the AI era, Service-as-a-Software turns human labor into software services, just as Software-as-a-Service turned software companies into cloud service providers in the cloud-computing era. The key difference is that the target market expands from the software market to the entire service market: the software market of Software plus the labor market of Services.

Fundamental transformation of the business model. Future AI applications will charge based on outcomes ($/outcome), whereas traditional SaaS charges per seat ($/seat). Sierra (a customer-service AI), for example, now charges based on the number of problems resolved rather than per seat. This fundamental change in pricing strategy is driven by the transformation of the business model.

Market expansion effect. AI applications reduce the marginal cost of service delivery, allowing customers who previously could not afford these services to afford them.
This is mainly reflected in the rapid decline in the cost of services and software that were previously expensive: barriers to entry and construction costs fall quickly, effectively expanding the market as a whole. For example, a human customer-service representative or quality-control inspector has limited capacity, and full manual inspection is often unaffordable because of labor costs. AI customer service and quality control, by contrast, may be able to serve customers 24/7 at very low cost and automatically perform full inspections. New technology can thus reach incremental markets that were previously untapped.

Comprehensive coverage of the service market. Every field represents a huge service market: legal services (Harvey), software engineering (Factory/Cursor/Devin), medical records (Abridge), customer support (Sierra), network security (XBOW), and so on. AI applications built on large models are penetrating each of these fields, turning services originally performed by humans into automated software services. Because the total service market is far larger than the software market, the overall TAM will reach trillions of dollars. This is not just market substitution but market expansion achieved through cost reduction.

Sequoia's market map also suggests that although the AI infrastructure and model layers are dominated by giants, a huge white space remains for entrepreneurs at the application layer; historically, this kind of space has been fertile ground for many successful companies.

From "Big Bang" (2023) to "Primordial Soup" (2024)

Sequoia's article "AI in 2024: From Big Bang to Primordial Soup," published in early 2024, describes AI development transitioning from the frenzy of 2023 back to fundamental research and exploration in 2024.
This metaphor captures Sequoia's more pragmatic view of the industry. Building on the momentum of 2023, the AI industry in 2024 was characterized by new ideas and potential applications, with the focus shifting from a race over pretraining parameters to catching up with leading models such as GPT-4, a race in which players at home and abroad competed not to fall behind. The search for "killer applications" intensified in 2024, with use cases being explored from AI-driven assistants to specialized tools for specific industries. Although innovation is happening rapidly, the long-term sustainability of high capital expenditure remains in question absent a clear path to profitability. This insight also sets the tone for Sequoia's AI forecast for 2025.

Overview of the AI Landscape in 2025

By 2025, the AI ecosystem has undergone significant transformation. After the frenzy of 2023 and 2024, the industry is moving toward a more structured landscape that emphasizes practical value and a reasonable return on investment.

Predictions for AI in 2025

Sequoia's main predictions for 2025 are differentiation among Large Language Model (LLM) providers, the rise of AI search as a killer application, and the stabilization of AI investment alongside ongoing ROI challenges.

Prediction 1: Differentiation Among LLM Providers

OpenAI: OpenAI has built a strong brand in AI, largely thanks to the continuing popularity of ChatGPT. This brand recognition has translated into a powerful revenue engine, giving OpenAI a significant advantage in attracting both consumers and enterprise clients.

Anthropic: Anthropic's advantage lies in its concentration of top AI talent; it has attracted key researchers from OpenAI and other leading institutions, providing a deep reservoir of expertise to drive innovation.
xAI: xAI has focused on building out data centers first, starting with clusters of tens of thousands of H100/GB200 cards. Its ability to rapidly build and deploy large-scale computing infrastructure is crucial for training and running the next generation of large AI models, and Grok is reportedly showing good results.

Meta: Meta has differentiated itself through open-source models. The Llama series has gained a large following, making Meta an advocate for accessibility and a driver of innovation through the open-source community.

Emergence of Distinct Superpowers: As the leading LLM providers (Microsoft/OpenAI, Amazon/Anthropic, Google, Meta, and xAI) mature, they have developed specialized advantages to differentiate themselves in the competitive landscape, shaping their strategic focus and how they approach the market. Some sell cloud services, some sell APIs, some sell subscriptions (tokens), and some sell applications.

Prediction 2: Rise of AI Search as a Killer App

Sequoia's strong emphasis on AI search is somewhat unexpected. The form and experience of AI search have improved significantly over traditional search, but monetization remains uncertain: Perplexity has begun experimenting with e-commerce referrals, and it remains to be seen whether this will be profitable.

Fragmentation of the Search Market: The rise of AI search may fragment the search market currently dominated by Google. Specialized AI search engines are emerging to meet the needs of different professional fields, providing more targeted and relevant results: Perplexity targets analysts and investors, Harvey targets lawyers, and OpenEvidence targets doctors. Search may no longer be a single market but a diverse ecosystem of specialized tools.
AI Search Revolutionizes Information Access: Since the emergence of ChatGPT, the industry has been searching for AI's "killer app," an application that fundamentally changes how we interact with technology. AI search has become a strong contender for that title, with the potential to transform how we access and use information. It goes beyond simple web indexing to semantic understanding and knowledge synthesis, letting users find information more efficiently and gain deeper insights.

Prediction 3: Stabilization of AI Investment and ROI Challenges

Lingering Questions about ROI: Despite stabilizing investment and benefits for startups, concerns remain about achieving sufficient ROI from AI investments. Massive capital injections must be backed by tangible business outcomes, which remains a key challenge for the industry. Focusing on applications that create tangible value is essential to meet this challenge and ensure the long-term sustainability of the AI ecosystem.

Benefits to Startups from Declining Compute Prices: Although the dominance of tech giants poses challenges, the overbuilding of AI infrastructure is driving compute prices down, creating favorable conditions for startups. As major consumers of AI computing resources, startups benefit from lower costs that let them experiment and innovate more freely.

Emergence of Oligopolistic Dynamics: The dominance of a few tech giants (Microsoft, Amazon, Google) in the AI infrastructure market has raised concerns about oligopolistic behavior. These companies control much of the resources and platforms needed for AI development, which could limit competition and innovation.

Stabilization of CapEx: With numerous data-center construction projects underway, capital expenditure on AI infrastructure is expected to stabilize by 2025.
This stabilization reflects a shift from securing resources and building capacity to optimizing utilization and maximizing returns on existing infrastructure.

Shift from Aggressive Spending to a Focus on Execution: In 2024, large tech companies raised capital expenditure for fear of falling behind in the AI arms race. Entering 2025, this aggressive spending is giving way to a more cautious approach focused on execution and realizing returns on investment. Total investment has already reached 15.5 billion US dollars, nearly 120 billion RMB; for the full year 2024 it should reach 20 billion US dollars, and in 2025 it is said to exceed 26 billion US dollars, close to 200 billion RMB. Both the intensity of capital investment and the density of talent are at least an order of magnitude higher than in the domestic (Chinese) market.

Summary

As the AI industry enters a new stage, the focus in 2025 is to turn the basic building blocks into applications that solve real-world problems and close the commercial loop of monetization.

From Building Blocks to Impactful Applications: The basic elements of the AI ecosystem are in place: large data centers are still under construction, more powerful models are being trained, and innovative research keeps pushing the boundaries of AI capability. The focus in 2025 must shift to using these building blocks to develop large-model applications and software that genuinely improve people's lives and address key challenges across fields.

Navigating the Hype Cycle: While the potential of AI is undeniable, blind optimism should be avoided. The industry is passing through a period of hype and inflated expectations, which may be followed by disillusionment if progress fails to meet them. A clear-eyed assessment of AI's opportunities and challenges is needed in order to focus on long-term value creation rather than short-term gains.
The Opportunity for Visionary Companies: The current landscape offers unique opportunities to companies that can use AI effectively to create value and solve real-world problems. This requires not only technical expertise but also a deep understanding of specific fields and the ability to turn AI capabilities into practical solutions that meet real user needs. From this perspective, beyond a few giants, companies like Salesforce, with real data, a large customer base, and business processes already on their platforms, are likely to be among the first to benefit from this wave.

The rapid development of large models over the past two years is unprecedented; no other technology has been developed and adopted so quickly. Development and adoption cycles have been compressed, requiring us to embrace new technologies and change, and to adapt quickly. As the industry matures, a focus on actual implementation, differentiation, and value orientation will be crucial in determining whether large-model technology can truly take off.

This article is reprinted from the WeChat public account "Fighter's World"; GMTEight editor: Yan Wencai.