AI Servers Absorb Memory Supply, Consumer DRAM Output Contracts Sharply as HBM Dominates
DRAM Pushed to a Lower Priority Amid Intensifying AI Competition
Supply Chain Leverage Concentrated Around Nvidia
Potential Volatility Emerging in the Commodity DRAM Price Cycle

As competition in artificial intelligence semiconductors intensifies, the center of gravity in the memory industry has also begun shifting rapidly. Major manufacturers including Samsung Electronics, SK Hynix, and Micron have increasingly concentrated their capabilities on producing high-bandwidth memory (HBM), significantly reducing the supply of commodity DRAM used in smartphones and personal computers. The result has been a simultaneous surge in memory prices and growing supply instability.
Shortage of Commodity Memory Supply Persists
On the 15th, Bloomberg reported that spot prices for consumer DRAM have surged nearly 700 percent over the past year. The increase reflects the fact that the three major memory suppliers—Samsung Electronics, SK Hynix, and Micron—have shifted production capacity toward the far more profitable HBM segment, sharply reducing the supply of commodity DRAM used in smartphones and PCs. Bloomberg noted that “while Samsung Electronics and SK Hynix dominate the AI-focused HBM market and generate record profits, the supply of standard memory used in laptops and desktop PCs has effectively fallen off a cliff,” adding that this trend is increasingly allowing Chinese semiconductor companies to penetrate the market.
Global PC manufacturer ASUS also addressed the issue in a market report released on the 11th, stating that “memory demand from AI servers has absorbed available production capacity at an overwhelming scale, severely squeezing the supply of DRAM for standard PCs.” The company noted that consumer prices for some flagship products rose by more than 100 percent during the first quarter of this year. ASUS further projected that the shortage of commodity memory—including standard double data rate (DDR) memory such as the DDR4 and DDR5 modules used in laptops and PCs—could persist for more than two years. In response, the company plans to pursue long-term supply contracts with major memory manufacturers lasting three to five years.
The shift in supply dynamics has already begun affecting the finished-product market. Research from market analysis firm Gartner shows that the share of memory costs in notebook manufacturing rose seven percentage points, from 16 percent in 2024 to 23 percent by the end of last year. This cost pressure is expected to translate into higher PC prices and declining shipment volumes. Gartner projects that global PC shipments will decline by at least 10 percent this year compared with the previous year. The smartphone market faces similar pressure, with rising memory prices increasing manufacturing costs and making price increases of roughly 15 percent likely within the year.
Chinese companies have moved quickly to exploit this gap. ChangXin Memory Technologies (CXMT), in particular, has expanded its global DRAM market share to roughly 13 percent through aggressive capacity expansion, rising to become the world’s fourth-largest memory manufacturer after Samsung Electronics, SK Hynix, and Micron. At the same time, CXMT has accelerated efforts to narrow the technology gap by mass-producing DDR5 products capable of operating at 8,000 megatransfers per second (MT/s) using a 15-nanometer process. Industry observers widely note that while the leading manufacturers’ focus on HBM has boosted profitability in the short term, it has also opened the door to new competitive dynamics in the commodity memory market.

Competition Intensifies for Next-Generation HBM Supply
The intensifying competition in the HBM market is closely tied to the supply structure centered around Nvidia, the dominant force in the AI accelerator industry. As demand for graphics processing units (GPUs) used in AI servers has surged dramatically, the memory supply chain has increasingly been reshaped around Nvidia’s product roadmap and technical requirements. HBM, a mandatory component in high-performance GPUs, has evolved beyond a simple component into a critical determinant of AI performance. For memory manufacturers, securing cooperation with Nvidia has therefore become a decisive factor shaping their position in the next-generation AI semiconductor market.
Nvidia previously worked with major memory companies to determine the supply structure for HBM4 to be used in the flagship NVL72 model of its next-generation AI GPU series, Vera Rubin. The industry initially expected Samsung Electronics to secure an exclusive supply position. However, Nvidia later began considering a dual-supplier approach, dividing the supply structure between a “stability-focused standard model” and a “highest-performance AI infrastructure model.” Industry sources believe that while the overall shipment volume will be higher for the standard version, prices for the high-end product line could reach two to three times those of the previous generation. Securing the top-tier segment therefore carries greater significance in terms of profitability.
At the core of this technological competition lies the DRAM process technology that forms the foundation of HBM. Samsung Electronics attempted to differentiate itself during the development of HBM4 by introducing its sixth-generation 10-nanometer-class (1c) DRAM process. This represents a finer process compared with the fifth-generation 10-nanometer-class (1b) DRAM used by SK Hynix and Micron. As process scaling advances, chip density, power efficiency, and operating speed all improve simultaneously. Analysts note that these process differences contributed to Samsung Electronics’ HBM4 achieving per-pin speeds exceeding 13 gigabits per second, more than 40 percent faster than the international (JEDEC) standard.
Nvidia has also shown signs of imposing extremely demanding technical standards on its partners as it manages its supply chain. A recent example involved an Nvidia inspection team visiting Samsung Electronics’ Pyeongtaek campus to conduct an internal audit of its HBM production process. According to industry sources, Nvidia applied exceptionally strict benchmarks during the inspection and sharply pointed out technical and operational issues. The move is interpreted as an attempt to strengthen Nvidia’s negotiating leverage ahead of formal supply agreements. Sources note that Nvidia has previously employed similar strategies with other partners, including SK Hynix and TSMC.
Potential Transition Toward a Supply Surplus Phase
The expansion of HBM production also reflects an effort to adjust portfolios in preparation for potential shifts in the DRAM market cycle. Historically, the memory industry has exhibited a classic cyclical pattern in which surging demand leads to simultaneous increases in capital investment, eventually triggering oversupply and price declines. According to industry sources, executives within Samsung Electronics’ semiconductor division and related strategy units are currently reviewing business strategies under the assumption that the current memory boom could last for roughly two years before market conditions reverse. Investment plans and production strategies are reportedly being evaluated simultaneously under the scenario that demand could regain structural strength around 2028.
Investment and production strategies for next-generation memory are also moving in line with this assessment. Samsung Electronics is transitioning to next-generation DRAM processes at its Hwaseong complex while simultaneously expanding production lines centered around its Pyeongtaek facilities. SK Hynix is likewise expanding memory production capacity across its key manufacturing bases in Icheon, Cheongju, and Yongin, while construction of next-generation DRAM production lines is underway at its new M15X plant. Given that building a new production line typically takes about two years, the investments currently underway are likely to translate into actual manufacturing capacity around 2028.
These changes in production strategy are not limited to Korean companies. U.S.-based Micron is also expanding DRAM production lines in Taiwan, Singapore, and the United States while reportedly placing large-scale equipment orders to secure HBM supply capacity. Analysts interpret these moves not only as responses to the current surge in AI demand but also as efforts to establish defensive measures against potential fluctuations in the commodity memory market. Rather than simply capitalizing on a short-term boom, companies appear to be adjusting their profit structures and customer bases in advance of the next phase of the memory cycle.