Guotai Haitong: Suppliers keep rolling out high-end AI chips; memory upgrades help boost both volume and price of DRAM.
Guotai Haitong released a research report stating that NVIDIA's next-generation Rubin CPX splits the AI inference workload at the hardware level, with memory upgrades delivering faster data transfer. As computing speeds rise, the average DRAM and NAND Flash capacity across AI-enabled applications such as smartphones, servers, and laptops continues to grow, with the server segment showing the strongest gains: in 2024, average Server DRAM capacity is expected to grow 17.3% year-over-year. As demand for AI servers keeps climbing, high-end AI chips such as NVIDIA's next-generation Rubin and cloud service providers' self-developed ASICs are being released or entering mass production, which should boost both volume and price for high-speed-computing DRAM products. The report recommends focusing on memory module makers.