NVIDIA CEO Jensen Huang urged memory chip makers to expand production, asserting that NVIDIA will absorb the added capacity and framing resource constraints as beneficial. He highlighted NVIDIA's ability to offer high-performance AI solutions and end-to-end "AI factory" services. This confidence stems from demand for next-generation AI platforms, which require significantly more High Bandwidth Memory (HBM) at greater capacity and complexity. NVIDIA has secured key components, including HBM for its GB300 and Vera Rubin platforms, with Samsung and SK Hynix positioned as core HBM4 suppliers. SK Hynix is projected to retain the lead in global HBM bit capacity share, while Samsung is expected to significantly increase its share.

TradingKey - NVIDIA (NVDA) CEO Jensen Huang recently sent a powerful signal to global memory chip makers at the Morgan Stanley Technology Conference: go ahead and expand production; NVIDIA will take it all.
While attendees generally viewed resource constraints in memory, wafers, packaging, and power as bottlenecks for the AI industry, Huang surprised them by asserting that chip supply shortages are "excellent news" for NVIDIA.
At the conference, Huang explained his unique logic regarding resource constraints: in an environment where data center land, power, space, and other resources are limited, customers' procurement decisions become more cautious; instead of trial and error, they tend to choose the highest-performance solutions right away.
With its abundant capital and scale advantages, NVIDIA can lock in massive supply across the entire industry chain, naturally becoming the biggest beneficiary of this trend. He even boldly told DRAM manufacturers: "Go ahead and build memory fabs; however much capacity you add, NVIDIA will consume it."
To help the outside world understand this "counter-intuitive" logic, Huang emphasized that he "loves constraints."
He explained that when resources are tight, customers must precisely choose hardware that provides the highest "tokens per watt," and NVIDIA is currently the only company in the world capable of building an entire "AI factory" for customers from scratch—an end-to-end service capability that competitors find difficult to match.
NVIDIA has already secured key components such as memory, wafers, and CoWoS packaging required for large-scale "AI factory" deployments; even if DRAM prices rise, it will not scale back its procurement efforts, providing clear demand expectations for manufacturers like Samsung, SK Hynix, and Micron.
Behind Huang's statements is the massive demand for memory resources from NVIDIA's next-generation AI platforms. The High Bandwidth Memory (HBM) capacity supported by the existing GB300 chip has reached 288GB, a significant increase from the 192GB of the previous-generation GB200.
The upcoming Vera Rubin platform will maintain the 288GB capacity but upgrade from HBM3E to HBM4 specifications.
HBM4 uses a 16-layer stack design, which is more complex than the 12-layer stack process of HBM3E, leading to higher yield loss rates and significantly greater consumption of memory resources per unit of output.
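The yield arithmetic behind that claim can be sketched briefly. The calculation below uses hypothetical yield figures (not disclosed industry data) purely to illustrate how a taller, lower-yielding stack consumes more DRAM dies per shipped unit:

```python
# Illustrative sketch: DRAM die consumption per sellable HBM stack.
# The stack yields below are hypothetical assumptions for illustration,
# not actual HBM3E/HBM4 production figures.

def dies_per_good_stack(layers: int, stack_yield: float) -> float:
    """Average DRAM dies consumed per stack that passes final test."""
    return layers / stack_yield

# Assume the mature 12-layer HBM3E process yields higher than the
# more complex 16-layer HBM4 stack.
hbm3e = dies_per_good_stack(layers=12, stack_yield=0.80)  # 15.0 dies
hbm4 = dies_per_good_stack(layers=16, stack_yield=0.60)   # ~26.7 dies

print(f"HBM3E (12-high): {hbm3e:.1f} dies per good stack")
print(f"HBM4  (16-high): {hbm4:.1f} dies per good stack")
print(f"Relative DRAM consumption: {hbm4 / hbm3e:.2f}x")
```

Under these assumed yields, each shipped HBM4 stack consumes roughly 1.8x the DRAM dies of an HBM3E stack, which is the sense in which per-unit memory resource consumption rises even before any increase in unit volume.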
Meanwhile, the HBM4 supply landscape for NVIDIA's next-generation AI accelerator, Vera Rubin, has gradually become clear, with both Samsung Electronics and SK Hynix making the core supplier list.
According to reports, the two South Korean memory giants have been integrated into the Vera Rubin component supply chain; since the HBM4 production cycle from wafer to packaging exceeds six months, the two companies could start mass production as early as this month.
In terms of supply share allocation, SK Hynix is expected to capture more than half of NVIDIA's total HBM procurement (covering both HBM3E and HBM4 products) in 2026, while Samsung is poised to secure the majority of orders for the Vera Rubin-exclusive HBM4.
Industry forecast data shows that SK Hynix will remain the industry leader with a 50% share of global HBM bit capacity, though this is down from 59% in 2025; Samsung's global share is expected to rise from 20% to 28%, showing significant momentum in catching up.
This content was translated using AI and reviewed for clarity. It is for informational purposes only.