From NVIDIA Rubin leading the "AI Factory Era" to AMD's MI500 "Thousandfold Roadmap": the "AI Bull Market Narrative" continues to dominate the stock market.

16:29 06/01/2026
GMT Eight
At CES, Jensen Huang and Lisa Su jointly sent a strong signal that "AI computing resources are still scarce, platform-level computing systems are iterating faster, and the AI capital expenditure cycle is far from over," using their product roadmaps to counter market concerns that "AI hype has peaked and a bubble is about to burst."
NVIDIA Corporation and AMD (AMD.US) have long been head-to-head competitors in AI data center and PC chips. Shortly after NVIDIA Corporation CEO Jensen Huang unveiled Vera Rubin, the company's next-generation AI GPU computing platform, at the Consumer Electronics Show (CES), AMD announced its own next-generation AI GPU computing products aimed at enterprise-level data centers. During the CES event, AMD CEO Lisa Su emphasized the energy-efficiency and cost-effectiveness advantages of the company's next-generation data center products, aiming to challenge the dominant position of NVIDIA Corporation, the "AI chip super hegemon," in the AI computing infrastructure market.
In his keynote, Jensen Huang said the Vera Rubin computing platform has fully launched: all six core chips of the new AI computing infrastructure platform have completed manufacturing and key testing and entered full production. He stressed that the global AI computing arms race is entering a "rack/platform-level AI factory era," with the six-chip integration of the Rubin platform emphasizing system-level synergy and cost/efficiency gains. Although transistor counts grew only about 1.6x against physical limits, NVIDIA Corporation says the platform achieves a 5x improvement in AI inference performance and a 3.5x improvement in training performance over the already powerful Blackwell architecture through "extreme collaborative design." Huang also said Microsoft Corporation's next-generation "AI super factory" is expected to deploy hundreds of thousands of chips based on the Vera Rubin platform. In addition, facing market skepticism over the high procurement cost of NVIDIA Corporation's AI computing infrastructure, the company says Vera Rubin significantly reduces the cost of generating AI inference tokens compared with Blackwell-architecture AI GPU clusters, making costly Agentic AI (AI agents that act autonomously) commercially viable. Within the Vera Rubin platform, NVIDIA Corporation uses an inference context-memory storage platform built on the BlueField-4 DPU, adding 16TB of high-speed shared memory to each AI GPU device and substantially easing the "memory wall" problem in long-context scenarios.
At the same CES event, Lisa Su highlighted AMD's latest flagship AI GPUs for large enterprise data centers, the MI440X and MI455X, showcased the high-end Helios rack-level system, and previewed the MI500 series of AI GPUs due in 2027, whose performance is expected to be 1,000 times that of AMD's 2023 flagship products. Appearing alongside OpenAI co-founder Greg Brockman, Su emphasized continued global demand for AI computing power and AMD's aim to further break NVIDIA Corporation's near-absolute monopoly on the AI computing infrastructure market, competing for AI computing orders worth hundreds of billions to even trillions of dollars. From single GPUs to whole rack-level AI system engineering, global AI computing supply continues to fall far short of demand.
With the MI440X and MI455X launches, will AMD's "king of the hill" status propel the stock price into a new uptrend?
AMD is adding a new model, the MI440X, to its existing AI GPU product line, targeting smaller enterprise data centers where the hardware can be deployed on-premises at scale so that data stays within the customer's own AI computing infrastructure. AMD CEO Lisa Su also introduced the company's latest flagship AI GPU, the MI455X, which delivers a significant improvement in energy efficiency per unit of usable training/inference capability and serves as the building block for rack-level AI computing clusters. Su also joined the chorus of American AI technology executives (including her counterparts at NVIDIA Corporation) arguing that the global AI wave continues: the productivity potential of AI and the demand for massive AI computing infrastructure are far from exhausted, dismissing the "AI bubble narrative" circulating in financial markets. "We simply do not have enough computing power infrastructure to support all the transformations we are planning," Su said. "The speed and progress of AI technology innovation over the past few years have been incredible. We are just getting started."
AMD is widely considered NVIDIA Corporation's closest competitor in the semiconductor sub-market for AI chips used to build and run AI application software and AI agents. Over the past few years, AMD has built a new, multi-billion-dollar AI computing infrastructure business around its own AI chips, driving significant revenue and profit growth in recent quarters. The Wall Street institutional investors who have pushed its stock price higher generally hope AMD will show more progress in taking share from NVIDIA Corporation, which holds as much as 80-90% of the AI computing infrastructure market. The AMD Helios AI rack system, based on the MI455X AI GPU and the new Venice data center-grade central processing unit design, is set to launch later this year.
OpenAI co-founder Greg Brockman appeared with Su on the CES stage in Las Vegas, outlining OpenAI's long-term partnership with AMD and discussing plans for large-scale deployment of AMD-based AI computing systems in the future. Both share the view that future global economic growth will be closely tied to the availability of AI computing resources. The new MI440X AI GPU is designed to fit into the compact systems of existing, smaller data centers. Su also previewed the MI500 series data center processors set to launch in 2027, saying the series will offer performance up to 1,000 times that of the MI300 series, which first launched in 2023.
Since the start of 2025, AMD's stock price has risen more than 80%, with a significant portion of the gains concentrated in October. The main catalysts for the strong uptrend were a procurement agreement with Humain, Saudi Arabia's "sovereign AI" venture, for a 1-gigawatt AI chip computing cluster, as well as a multi-billion-dollar AI computing infrastructure cooperation agreement with OpenAI. These large-scale partnerships not only affirm the strength of AMD's AI computing infrastructure technology but have also made Wall Street institutions such as Citi more optimistic about its financial prospects. Citi has dubbed AMD the "king of the hill," arguing that, given its higher expected EPS growth rate in calendar 2027, AMD is the strongest momentum buy in the market, and that feedback from recent analyst events has helped consolidate AMD's revenue and profit-margin targets along with a future EPS target of as much as $20.
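For scale, the 1,000x roadmap claim can be translated into an implied annual compounding factor. The short Python sketch below is purely illustrative and reuses only the figures reported above (a 2023-era MI300 baseline and a 2027 MI500 launch); it is not a figure from AMD or the article.
```python
# Illustrative back-of-the-envelope check only, using the article's numbers:
# a 1,000x jump over the 2023-era MI300 by 2027 spans roughly four years.
years = 2027 - 2023
per_year_factor = 1000 ** (1 / years)   # constant compounding factor per year
print(f"Implied compound performance factor per year: {per_year_factor:.1f}x")  # ~5.6x
```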
Citi analyst Christopher Danely's team reiterated its $260 price target in a report to clients. The average target price from Wall Street analysts compiled by TipRanks is $282.33 for AMD, implying potential upside of at least 28% over the next 12 months, with the highest target of $377 coming from the well-known investment firm Raymond James. Lisa Su, AMD's CEO, has laid out very optimistic expectations for the AI computing chip market, predicting an even stronger growth trajectory for AMD over the next five years. Su outlined AMD's financial goals for the next three to five years: the company aims to capture a "double-digit" share of the data center AI chip market, with annual revenue from AMD's data center chips projected to reach $100 billion within five years (versus roughly $16 billion today) and profits expected to more than double by 2030.
The "AI bull market narrative" is set to dominate the global stock market's upward trajectory in 2026
Morgan Stanley, a major Wall Street financial giant, said in a recent research report that the "long-term bull market logic" for chip stocks remains intact amid the unprecedented AI infrastructure boom, that chip stocks centered on AI chips and storage chips could be among the best-performing sectors of the U.S. stock market in 2026, and that the AI data center optical interconnect industry chain may grow into an even stronger next-generation technology force. Bank of America Corp said in its research report that the global AI arms race is still in an "early to mid-stage" phase and that, despite sharp pullbacks in popular chip stocks such as NVIDIA Corporation and Broadcom Inc., investors should continue to focus on industry leaders. Vanguard, one of the largest asset management giants, recently noted in a research report that the AI investment cycle may have completed only 30%-40% of its path to the eventual cycle peak. The launch of the Gemini 3 series brought an enormous AI token processing load, further validating the view that the AI boom is still in an early, accelerating build-out phase in which the supply of AI computing infrastructure remains insufficient.
According to the latest 2026 semiconductor industry outlooks from Bank of America Corp and Morgan Stanley, the "AI bull market narrative" is set to dominate the global stock market's bullish trend in 2026, with the chip sector focused on AI chips, together with the storage chips closely tied to the expansion of AI training/inference systems, as the investment theme favored by both firms. Strategists at UBS Group AG, a major international bank, also predict that the AI investment fervor and robust profit growth led by giants in the AI chip space such as NVIDIA Corporation will support a bullish U.S. stock market in 2026. "We note that forward P/E multiples have only slightly increased from the beginning of the year, further reinforcing the fact that the market's rise is driven by strong profit growth rather than the 'valuation bubble' the market worries about," the bank's analysts wrote in a recent research report. According to Morgan Stanley, Citi, Loop Capital, and Wedbush, the global AI infrastructure investment wave around AI chip computing hardware is far from over; it is just beginning.
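Two of the figures above lend themselves to simple back-of-the-envelope arithmetic. The Python sketch below is illustrative only and reuses the reported numbers ($16 billion today versus a $100 billion five-year goal, and a $282.33 average target with at least 28% upside); the helper function name is ours, not from any cited source.
```python
# Illustrative only: implied growth and price figures behind the numbers
# reported above (article figures; no new data).

def implied_cagr(start: float, end: float, years: float) -> float:
    """Compound annual growth rate required to move from start to end."""
    return (end / start) ** (1 / years) - 1

# AMD's stated goal: ~$100B in annual data center chip revenue within five
# years, versus ~$16B today.
print(f"Implied revenue CAGR: {implied_cagr(16, 100, 5):.1%}")   # ~44%

# A $282.33 average target with "at least 28% upside" implies a reference
# share price of roughly $282.33 / 1.28.
print(f"Implied reference share price: ${282.33 / 1.28:.0f}")    # ~$221
```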
Under an unprecedented storm of inference-side AI computing demand, this round of AI infrastructure investment is expected to reach a scale of $3 to $4 trillion between now and 2030. "We believe 2026 is the midpoint of an 8-to-10-year period of upgrading traditional IT infrastructure to accommodate accelerated and AI workloads," analysts at Bank of America Corp wrote in their research report. "Greater scrutiny of AI investment returns and of the cash flows of hyperscale cloud service providers may make stock price movements unpredictable, but this will be countered by newer, faster-moving LLM developers and by AI factories serving enterprise and sovereign clients. We predict that semiconductor sales in 2026 will move toward the first $1 trillion, growing about 30%, while wafer fab equipment sales will achieve double-digit year-over-year growth." Bank of America expects the chip sector to lead the U.S. stock market into a super bull run in 2026 and emphasizes that NVIDIA Corporation and Broadcom Inc. remain the most worthwhile long-term chip stocks for 2026.
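As a final illustrative check on Bank of America's forecast as reported above, approaching $1 trillion in 2026 semiconductor sales on roughly 30% growth implies a prior-year base near $770 billion; the one-liner below simply makes that arithmetic explicit.
```python
# Illustrative only: implied prior-year semiconductor sales base if 2026
# reaches ~$1 trillion on ~30% growth (figures as reported above).
implied_base_billion = 1_000 / 1.30
print(f"Implied prior-year sales base: ~${implied_base_billion:.0f}B")  # ~$769B
```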