AI race upends the market: soaring memory prices, DRAM supply crunch looms

By Tyler Hansbrough

As one of the youngest members of the team, Tyler Hansbrough is a rising star in financial journalism. His fresh perspective and analytical approach bring a modern edge to business reporting. Whether he’s covering stock market trends or dissecting corporate earnings, his sharp insights resonate with the new generation of investors.
AI boom sends DRAM and HBM prices soaring
Data-center investment surges across Asia, keeping memory demand on a steady rise
U.S. big tech leans on alliances to cope, while the EU scrambles to loosen rules

Prices of high-bandwidth memory (HBM) and DRAM are soaring. As the AI industry grows at breakneck speed and competition intensifies, demand for memory chips has surged, leaving supply struggling to keep up and overheating the market. With data-center investment expanding rapidly, many in the market expect this “memory boom” to continue for some time.

Memory chips in a full-blown boom

According to media reports on the 11th, memory chip prices have been climbing sharply in recent months. Data from market tracker DRAMeXchange show that the average price of mainstream PC DRAM came in at $8.10 in November, roughly six times the $1.35 recorded in January. The HBM4 supply price SK hynix agreed with Nvidia is about $500, roughly 1.6 times the $300 price of SK hynix’s 12-high HBM3E. Even at these elevated levels, SK hynix and Samsung Electronics have effectively sold out their DRAM and HBM volumes for next year, underscoring just how hot demand is.

Big Tech companies are throwing their weight behind securing memory supply. OpenAI, which signed a letter of intent in October to join the global “Stargate” AI infrastructure project, is expected to purchase up to 900,000 DRAM wafers a month through the initiative—more than half of the roughly 1.5 million DRAM wafers produced worldwide each month. Google, Amazon, and Microsoft have reportedly placed what are effectively blank-check orders with Samsung Electronics, SK hynix, and Micron, telling them to deliver every unit they can, regardless of price.

At the root of this deepening shortage is the AI boom sweeping the global market. Alongside GPUs, memory chips are a core factor in determining the cost of AI training and inference, meaning demand inevitably grows exponentially as AI technology advances and the market expands. The problem is that memory makers have focused limited DRAM production capacity on HBM just as demand for server DRAM for data centers has jumped, throwing supply and demand completely out of balance and sending product prices sharply higher.

AI data centers are springing up across Asia

Market watchers widely expect the current memory boom to last for some time, as AI companies keep ramping up their data-center investments and, in turn, their demand for memory products. Recently, AI firms have been accelerating efforts to secure data-center infrastructure with a focus on Asia. Countries that can offer large sites for new builds and relatively cheap electricity to run them are coming into the spotlight. Singapore has long been the dominant data-center hub in Asia, but as facilities multiplied and the strain on its national power supply grew, it temporarily halted approvals for new projects—pushing demand out to neighboring countries.

According to the “Asia-Pacific Data Center Investment Prospectus” published by real estate services firm Cushman & Wakefield, Malaysia is expected to record the fastest improvement in data-center capacity per capita in the Asia-Pacific region through 2030. Domestic demand centered on Kuala Lumpur and regional AI and cloud demand anchored in Johor are converging, turning the country into a key Southeast Asian hub. Google has begun building a data center and cloud facility in Selangor’s Elmina Business Park with an investment of $2 billion, while Oracle last year announced a $6.5 billion data-center build-out plan. Amazon Web Services (AWS) has already completed and is operating a data center in Malaysia.

Thailand is projected to post the second-fastest growth after Malaysia. Its data-center market is expected to improve from about 800,000 people per megawatt (MW) of capacity today to around 220,000 people per MW by 2030. Although Thailand’s current operating capacity stands at just 89 MW, among the lowest in Asia-Pacific, hyperscaler investment plans announced in 2024 have sharply heightened global investor interest. Google, AWS, Microsoft, and Equinix are all pushing ahead with data-center projects in Chonburi, near Bangkok. AWS has also said it will invest $5 billion to build AI infrastructure in Thailand.

Indonesia is seeing a rush of data-center projects as well. China’s Tencent has announced plans to invest $500 million in Indonesian data centers by 2030, while Alibaba Cloud already operates three data centers in the country. Nvidia has agreed to invest $200 million with local telecom operator Indosat to build an AI center. In Vietnam, UAE state-owned AI and tech company G42 selected the country in October as the site for a $2 billion data center.

AI race redraws global market winners and losers

Amid intensifying competition, U.S. big tech firms leading the AI market are leaning on cross-alliances to defend their positions. A standout example came in September, when Nvidia announced a massive investment plan for OpenAI. At the time, Nvidia said it had signed a letter of intent aiming to invest up to $100 billion in OpenAI to build a 10-gigawatt data center powered by Nvidia chips. The facility is estimated to require some 4–5 million Nvidia GPUs.

Last month, Microsoft signed strategic partnerships with Nvidia and Anthropic. Under the deal, Microsoft will invest $5 billion in Anthropic, while Nvidia will put in $10 billion. Anthropic, in turn, agreed to buy $30 billion worth of Azure computing resources from Microsoft and signed a contract for up to 1 GW of additional computing capacity. It also plans to secure up to 1 GW of GPU computing power using Nvidia’s next-generation Grace Blackwell and Vera Rubin systems.

The European Union, by contrast, has stumbled badly. Despite being first to roll out a comprehensive AI law, the EU AI Act, and putting in place an ambitious regulatory regime, it has found itself sidelined in the global tech race dominated by the U.S. and China. Many of the very companies expected to drive innovation have moved to less regulated jurisdictions, eroding the bloc’s competitiveness. In response, the European Commission last month reversed course, pushing back the full application of key AI rules from August 2026 to December 2027 and easing data-protection constraints so AI companies can train models on Europeans’ data.
