Jensen Huang appeared in Taipei, revealing the key words of NVIDIA Corporation's next "AI blueprint": Rubin, silicon photonics, and the Chinese market.
Jensen Huang's trip comes as technological competition between China and the United States escalates. Huang said that the final decision on the next-generation AI chip designed specifically for the Chinese market - the successor to the H20 - rests with the US government.
NVIDIA Corporation (NVDA.US) CEO Jensen Huang arrived in Taipei on Friday to visit the chip giant's long-term manufacturing partner, Taiwan Semiconductor Manufacturing Co. (TSMC), the world's largest chipmaker. As the world's most valuable company, NVIDIA is facing growing friction with both Washington and Beijing over its industry-leading AI chips and over where these most critical pieces of AI infrastructure hardware may be bought and sold. Huang also revealed that the Rubin-architecture AI GPU and a silicon photonics processor have completed initial tape-out.
His visit comes just days before NVIDIA Corporation is set to report earnings next Wednesday (US Eastern Time). There have been reports that the chipmaker has asked some suppliers to halt manufacturing and packaging/testing work related to the H20 AI chip after Beijing expressed caution about the chip's security risks. Other reports say the company is preparing a next-generation AI chip for the Chinese market based on the newly launched Blackwell architecture.
In an interview in Taipei, Huang said that the final decision on the successor to the China-exclusive H20 AI chip lies with the US government.
"My main purpose of this trip is to visit Taiwan Semiconductor Manufacturing Co., Ltd. Sponsored ADR," he said in front of reporters in Taipei, adding that he will only stay for a few hours and will leave after having dinner with the senior management of Taiwan Semiconductor Manufacturing Co., Ltd. Sponsored ADR. All remarks in the interview were broadcast live by local media at the Taipei airport where he landed on his private plane.
Huang said TSMC's management had asked him to give a talk. TSMC said in a statement that Huang would deliver an internal speech on his "management philosophy," without providing further details.
NVIDIA Corporation's market value currently stands at about $4.3 trillion, still the largest in the world and well ahead of second-ranked US tech giant Microsoft Corporation (MSFT.US) at roughly $3.75 trillion. Wall Street analysts believe the unprecedented global wave of AI infrastructure investment will keep expanding, and that continued massive growth in worldwide demand for AI computing power will drive explosive growth in NVIDIA's results and sustain a "super bull market" in its shares.
Rubin and Silicon Photonics
Huang said his visit was to thank TSMC, and that the two companies have completed tape-out of six new AI chips, including a new AI GPU based on the Rubin architecture and a new silicon photonics processor. Tape-out generally refers to the stage at which a chip's design is finalized and handed to the foundry so that small-scale production can begin.
"This is the first time in our history that each chip is completely new and revolutionary," he said in the interview. "We have completed tape-out for almost all of the next generation chips."
Rubin is positioned as the direct successor to Blackwell, with production targeted for 2026. Its key upgrades are a shift to HBM4 (around 13 TB/s of memory bandwidth per GPU, a significant jump from Blackwell's roughly 8 TB/s), faster NVLink (approximately 260 TB/s of aggregate bandwidth), and a move toward higher-density racks (targeting roughly 600 kW per rack), giving it greater rack- and cluster-level scalability. In addition, the Vera CPU + Rubin GPU combination for AI server clusters will succeed the "Grace-Blackwell" pairing.
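Taking the figures cited above at face value, a quick back-of-envelope calculation shows the scale of the generational memory-bandwidth uplift (a minimal sketch using only the approximate numbers reported in this article; actual product specifications may differ):

```python
# Back-of-envelope comparison using only the approximate bandwidth figures
# cited in this article; actual product specs may differ.

blackwell_hbm_tbps = 8.0   # TB/s per-GPU memory bandwidth on Blackwell, as cited above
rubin_hbm_tbps = 13.0      # TB/s per-GPU memory bandwidth with HBM4, as cited above

hbm_uplift = rubin_hbm_tbps / blackwell_hbm_tbps
print(f"HBM bandwidth uplift, Blackwell -> Rubin: ~{hbm_uplift:.2f}x")  # prints ~1.63x
```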
According to independent semiconductor research firms such as SemiAnalysis, the so-called "silicon photonics processor" is most likely a silicon photonics switching/transceiver chip for AI data-center networking and ultra-high-speed interconnects, rather than part of the AI GPU lineup. Earlier reports indicated that NVIDIA Corporation has integrated silicon photonics engines into its Quantum-X (InfiniBand) and Spectrum-X (Ethernet) high-performance switch ASICs for rack- and cluster-level optical interconnects (the co-packaged optics, or CPO, approach), supporting the far larger scale of AI GPU interconnection expected in the Rubin era.
NVIDIA Corporation has also indicated that silicon photonics will initially be used in switch ASICs - that is, CPO will land on the switch side first, while the AI GPU side will continue to rely mainly on high-speed copper interconnects for reliability and cost reasons. Silicon photonics processors in the "Rubin era" are therefore likely to appear first in network switches and interconnect equipment, providing the optical bandwidth needed for larger-scale GPU pooling and flatter AI server cluster topologies.
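To get a rough sense of why optics becomes attractive at this scale, the sketch below divides the roughly 260 TB/s of aggregate NVLink bandwidth cited above by an assumed per-port optical rate to estimate how many optical lanes a rack-scale fabric would need; the 1.6 Tb/s per-port figure is purely an illustrative assumption, not a confirmed NVIDIA specification:

```python
# Illustrative only: the aggregate NVLink figure comes from this article;
# the per-port optical rate is an assumed value, not an NVIDIA spec.

agg_nvlink_tb_per_s = 260                          # TB/s (bytes), aggregate figure cited above
agg_nvlink_tbit_per_s = agg_nvlink_tb_per_s * 8    # convert terabytes/s to terabits/s

assumed_port_tbit_per_s = 1.6                      # Tb/s per optical port (assumption)

ports_needed = agg_nvlink_tbit_per_s / assumed_port_tbit_per_s
print(f"Aggregate fabric bandwidth: {agg_nvlink_tbit_per_s:.0f} Tb/s")           # 2080 Tb/s
print(f"Optical ports needed at {assumed_port_tbit_per_s} Tb/s each: {ports_needed:.0f}")  # 1300
```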
Demand for AI computing power on the inference side is effectively boundless and is expected to drive continued exponential growth of the AI computing infrastructure market; Huang believes "AI inference systems" will become NVIDIA Corporation's largest source of revenue in the future. That ever-expanding compute demand in turn creates enormous demand for optical interconnects, giving silicon photonics significant potential in high-bandwidth, low-power, low-heat data-communication and data-center interconnect scenarios. As cloud AI services and generative AI applications such as ChatGPT, which run on AI training and inference systems, continue to gain adoption, demand for AI computing power will surge and silicon photonics will play an increasingly important role.
With Moore's Law approaching its limits, performance gains in traditional electronic chips have slowed markedly. Chip packaging based on silicon photonics offers an optics-based path to further performance scaling where advanced process nodes hit their constraints. Silicon photonics integrates optical components such as lasers with silicon-based integrated circuits, using light rather than electrical signals to achieve higher data rates, longer transmission distances, and lower power consumption. Compared with conventional electrical-signal chips, silicon photonics chips can also deliver much lower latency.
Within silicon photonics, "co-packaged optics (CPO)" and "optical I/O" represent two complementary but distinctly oriented paths: the former first tackles the power-consumption and faceplate-density bottlenecks at the rack-level switch ASIC interface, while the latter packages optical transmit and receive functions as chiplets, targeting the next-generation off-package links between compute chips such as CPUs, GPUs, and NPUs.
Will the Chinese market soon see the Blackwell architecture AI chip?
Earlier this month, US President Donald Trump opened the door to selling NVIDIA Corporation chips more advanced than the H20 to China, reaching agreements with NVIDIA and fellow AI chip leader AMD (AMD.US) under which the US government will take a 15% share of revenue from sales of advanced AI chips in the Chinese market.
Media reports this week suggested that NVIDIA Corporation is developing a new custom chip for the Chinese market, tentatively named "B30A", based on its latest Blackwell architecture, with performance expected to exceed that of the H20, which is built on the previous-generation Hopper architecture.
Compared with NVIDIA Corporation's previous-generation AI GPU, the H100, the H20 is considerably weaker in overall AI training performance (especially multi-GPU parallel training). Its greatest strengths, however, lie in NVIDIA's unique CUDA ecosystem and the H20's single-card inference efficiency and throughput, which give it an edge in efficiently deploying large-scale AI inference workloads in the Chinese market and make it difficult to replace.
When asked about the B30A, Huang said that NVIDIA Corporation is in consultation with the Trump administration to provide China with a successor to the H20 AI chip, but this is not a decision that the company can make on its own. "Of course, it depends on the US government, and we are in talks with them, but it is too early to draw conclusions now," he said in the interview.
NVIDIA Corporation only received permission from the US government to resume H20 sales in July of this year. The chip was developed specifically for the Chinese market after the Biden administration imposed export restrictions on NVIDIA's AI chip lineup in 2023; the Trump administration abruptly ordered a halt to its sales in April before approving them again in July.
Shortly after receiving Washington's approval, NVIDIA Corporation reportedly placed an order with TSMC for up to 300,000 H20 chips to supplement existing inventory amid strong demand from Chinese tech companies. Just days later, however, the chips faced accusations that they could pose security risks, which NVIDIA denied, insisting they contain no backdoors.
Media reports on Friday cited two people familiar with the matter as saying that Foxconn had been asked by NVIDIA Corporation to halt manufacturing and packaging/testing activities related to the H20 chip. A third source said NVIDIA hopes to first work through its existing H20 inventory. Foxconn did not immediately respond to requests for comment.
Tech media outlet The Information reported on Thursday that NVIDIA Corporation had instructed Arizona-based chip packaging leader Amkor Technology to stop H20-related processes this week, and had also notified South Korean electronics giant Samsung to suspend related work.
Amkor provides advanced chip packaging for this AI chip, while Samsung supplies the stacked HBM memory used in the model. Neither company immediately responded to requests for comment.
When asked whether NVIDIA Corporation had requested suppliers to halt production, Huang told reporters in Taipei that NVIDIA Corporation had prepared a large quantity of H20 chips and was currently waiting for purchase orders from Chinese customers. "We are managing our supply chain continuously to respond to market conditions," said an NVIDIA Corporation spokesperson in a statement, adding, "As recognized by both governments, H20 is neither a military product nor used for government infrastructure."
Huang said that shipping the H20 to China is not a matter of national security, and that being able to ship H20 AI chips to China is something "we deeply appreciate."