Jensen Huang Weighs In on a Prolonged Memory Shortage as the AI Boom Continues, With Bubble Fears Still Lingering

By Aoife Brennan

Aoife Brennan is a contributing writer for The Economy, with a focus on education, youth, and societal change. Based in Limerick, she holds a degree in political communication from Queen’s University Belfast. Aoife’s work draws connections between cultural narratives and public discourse in Europe and Asia.
Nvidia CEO Jensen Huang Directly Flags an AI Memory Supply Crunch
“Supply Cliff Will Last at Least Several Years,” With Forecasts of Prolonged Market Turbulence
AI Bubble Talk Persists Despite Unrelenting, Aggressive Investment in AI Infrastructure

Nvidia CEO Jensen Huang, whose company is widely seen as one of the biggest buyers in the memory market, has directly addressed the memory chip supply crunch. As explosive growth in the AI market disrupts the balance between memory supply and demand, tension across the broader market is visibly rising. Even as tech companies continue aggressive investment in AI infrastructure, the underlying driver of the shortage, investor doubts about how quickly those bets will translate into meaningful business results have yet to ease.

Jensen Huang: “The Memory Supply Chain Is Under Strain”

According to foreign media reports compiled on the 2nd, Huang commented on the importance of memory chips after meeting with executives from Taiwanese supply-chain companies on the 31st of last month (local time). “AI needs memory, and we’re going to need a significant amount of memory semiconductors this year,” he said, adding, “For performance, we need high-bandwidth memory (HBM), and for low-power memory we need LPDDR.” He continued, “Demand for memory is much higher this year, so the memory supply chain is in a difficult situation.”

Huang made similar remarks earlier last month as well. At a press conference in Las Vegas on the 6th, a U.S. outlet told Huang that “one side effect of AI adoption is the memory shortage,” and asked whether Nvidia planned to acquire a memory chip company. Huang responded, “For the time being, we will be the only customer in the world to use HBM4 (sixth-generation HBM) first and at scale,” adding, “We trust the production capacity of Samsung Electronics and SK hynix, so we’re not worried about memory supply shortages at all.” He added, however, that “as demand for AI factories surges, the world will need more memory fabs.”

Huang’s repeated emphasis on memory reflects the visible worsening of supply constraints amid the recent AI boom. Large language models (LLMs), for example, require enough memory capacity and bandwidth to match the fast processing power of GPUs in order to perform at full potential. As a result, Nvidia, AMD, and Google have begun “sweeping up” memory chips including HBM, and memory makers have prioritized advanced process nodes and new capacity for server and HBM production. That has in turn created supply disruptions—especially for DRAM—as production capacity has been squeezed by the shift toward HBM. According to analysis by KB Securities, the industry’s recent fulfillment rate for customer DRAM demand is around 60%, and the fulfillment rate for server DRAM in particular is below 50%.

Prolonged Memory Supply Cliff Likely

Markets broadly expect this trend to persist over the long term. Sassine Ghazi, CEO of semiconductor design software firm Synopsys, said in an interview with CNBC last month that “the memory shortage will continue through 2026 and 2027.” He noted that “most of the memory produced by major suppliers is going straight into AI infrastructure,” leaving “many other products that require memory struggling with shortages as they are crowded out by the AI market.”

Global investment bank Macquarie said that “limited semiconductor supply capacity is likely to delay or reschedule AI data center projects, which in turn will exacerbate supply shortages,” adding that “this environment of tight supply and rising prices is favorable for DRAM suppliers and could lead to margin expansion over several years.” Another global investment bank, Morgan Stanley, also raised its target prices and earnings forecasts for Samsung Electronics and SK hynix, projecting that the severe memory supply shortage will persist through 2027.

Micron, one of the major memory manufacturers, echoed this view last month through comments from a senior executive in its mobile and client business. The company said that “when factoring in fab construction, customer qualification, and yield stabilization, it will be difficult for the current memory supply shortage to be resolved before 2028.” It explained that the qualification process for the high-spec memory demanded by AI customers, along with the need to simultaneously meet multiple capacity configurations, has become a bottleneck in the supply chain. As more customers require multiple capacity options at once, the burden of process switching rises for memory makers, ultimately weighing on overall productivity.

Did Nvidia Hesitate Over Investing in OpenAI?

Even as AI-driven investment exuberance becomes increasingly visible, it is difficult to view the outlook for the AI industry as unconditionally optimistic. AI revenue models commensurate with the scale of infrastructure spending by major AI players have yet to be fully proven. These concerns have weighed on investor sentiment and are also creating friction on the ground across the industry. On the 30th of last month, The Wall Street Journal reported that “inside Nvidia, cautious views emerged on investing in OpenAI due to concerns over OpenAI’s operating model and the competitive landscape.”

Huang pushed back forcefully against the report. Speaking to reporters in Taipei on the 31st, he dismissed it as “nonsense.” “What OpenAI is doing is incredible, and they are one of the most influential companies of our time,” he said, adding, “I really enjoy working with Sam Altman.” He also made clear Nvidia’s intention to make a large-scale investment in OpenAI. “We’re going to invest massively in OpenAI,” Huang said. “It will probably be the largest investment we’ve ever made.”

He added, however, that Nvidia’s investment in OpenAI’s most recent funding round would not come close to $100 billion. When asked to disclose the exact amount, he declined, saying, “That’s something Sam will decide.” Nvidia had previously signed a letter of intent in September last year outlining a potential investment of up to $100 billion in OpenAI, a deal that would have made Nvidia a shareholder and funded the construction of data centers equivalent to 10 nuclear power plants. Huang’s comments effectively signal the possibility that the investment size could be scaled back from the original plan.
