
“U.S.–China Energy War Ignited by AI” — Is America Being Hobbled by a Power Crunch

By Siobhán Delaney
U.S. cries of an “electricity shortage”
China’s surplus capacity by 2030 projected at triple global data-center demand
“Power infrastructure will decide the outcome”—China could overtake the U.S.

While the United States has tightened its grip on the “brains” of cutting-edge semiconductors to pressure China, Beijing has launched a forceful counteroffensive by weaponizing “electricity,” the lifeblood that runs AI. This is a so-called “asymmetric power” strategy, aimed at neutralizing Washington’s technology blockade by leveraging overwhelming generation capacity and cheap electricity. As the AI supremacy race escalates from a technology contest into an all-out infrastructure showdown, abundant power is increasingly emerging as a “game changer” capable of offsetting even deficits in AI-chip performance.

Reserve Capacity at the Brink Across 8 of 13 U.S. Regional Grids

On Dec. 11 (all dates hereafter local time), The Wall Street Journal (WSJ) reported that electricity is emerging as a new decisive battleground in the U.S.–China contest for technological primacy. The rationale for both sides staking their future on power infrastructure is straightforward: AI is driving electricity consumption sharply upward. For example, a single Google search requires an average of 0.3 watt-hours (Wh), but generative AI models such as ChatGPT consume 2.9 Wh per query. In other words, the far heavier compute load translates into nearly 10 times the power draw of a conventional web search. The Electric Power Research Institute (EPRI) has also projected that if AI search functions are integrated into Google Search, the electricity required per query could rise by nearly 30-fold. As AI models become more sophisticated, their compute requirements expand exponentially. Without sufficient power infrastructure to supply data centers, AI businesses will inevitably remain “half-built.”
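For readers who want to check the arithmetic, the short Python sketch below compares the two per-query figures cited above and scales them to a daily query volume. The 9-billion-queries-per-day figure is an illustrative assumption, not a number from the article.

```python
# Back-of-envelope comparison of per-query energy use,
# using the figures cited in the article (0.3 Wh vs. 2.9 Wh).
CONVENTIONAL_SEARCH_WH = 0.3   # average Google search, per the article
GENAI_QUERY_WH = 2.9           # ChatGPT-style generative query, per the article

ratio = GENAI_QUERY_WH / CONVENTIONAL_SEARCH_WH
print(f"A generative query uses ~{ratio:.1f}x the energy of a conventional search")

# Hypothetical daily query volume (illustrative assumption, not from the article).
QUERIES_PER_DAY = 9e9

conventional_gwh = QUERIES_PER_DAY * CONVENTIONAL_SEARCH_WH / 1e9  # Wh -> GWh
genai_gwh = QUERIES_PER_DAY * GENAI_QUERY_WH / 1e9

print(f"Conventional search at that volume: ~{conventional_gwh:.1f} GWh/day")
print(f"Generative queries at that volume:  ~{genai_gwh:.1f} GWh/day")
```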

Yet even as the United States maintains an edge over China in AI technology, it is already gripped by fears of power scarcity. Data centers currently account for roughly 6% of total U.S. electricity demand, and that share is projected to reach about 11% by 2030. Despite this rapid growth in demand, delays in building new power plants mean that surplus generation capacity across eight of the country’s 13 regional grids is reported to have already fallen below critical thresholds. This power crunch is expected to constrain U.S. data-center expansion through 2030. Goldman Sachs analysts warned that “limited available reserve capacity could act as a hurdle to further data center development in the U.S.,” and Microsoft CEO Satya Nadella recently said he worries that “even if we secure a huge amount of chips, we may not have enough electricity to run them.”

The surge in electricity demand is also placing direct pressure on regional grid operations. Data-center construction is clustering particularly in Virginia, Pennsylvania, Ohio, Illinois, and New Jersey—areas overseen by PJM Interconnection (PJM), the regional transmission organization (RTO) responsible for managing the transmission network across 13 states in the U.S. East and Midwest. Meanwhile, the ERCOT grid that covers Texas is also slated to absorb massive additional loads. Analysts have noted that early-stage data-center projects have more than doubled from last year, raising the likelihood of sustained strain on grids over the coming years. Against this backdrop, Monitoring Analytics—PJM’s independent market monitor—has urged the Federal Energy Regulatory Commission (FERC) to allow new data-center connections only when the grid can demonstrably handle them. Concerns are mounting that large-scale data centers could undermine local power reliability and drive up costs.

China’s Massive Expansion of Generation Capacity and Grid Infrastructure

China, by contrast, moved early to expand its power infrastructure and is adding renewable energy, coal-fired plants, and nuclear facilities at the fastest pace in the world. Goldman Sachs analysts projected that by 2030, China will have about 400 gigawatts (GW) of surplus generation capacity, roughly three times the electricity demand that global data centers are expected to require.

China’s abundance of power is also a key lever for circumventing U.S. semiconductor restrictions. After imports of Nvidia’s advanced chips (the A100, H100, and others) were blocked, China pivoted to domestic alternatives such as Huawei’s Ascend series. The challenge is energy efficiency. According to analysis by market-research firm SemiAnalysis, Huawei’s “CloudMatrix 384” system, which bundles 384 chips, can deliver higher compute performance than Nvidia’s flagship rack (equipped with 72 Blackwell chips), but consumes four times as much electricity. Under normal market conditions it would be dismissed as an “electricity guzzler,” but China’s cheap power makes commercialization viable. In effect, China is pouring electricity into the system to compensate for its chip-performance disadvantage.

China is also ahead of the United States not only in total generation, but in the transmission infrastructure that delivers power to where it is needed. According to People’s Daily Online, as of last month China had built 46 ultra-high-voltage direct-current (HVDC) lines rated at 800 kV or higher, totaling more than 40,000 kilometers, enough to circle the Earth once. HVDC lines can carry electricity from power plants over ultra-long distances, and because they operate at higher voltage than conventional alternating-current (AC) lines, they suffer lower transmission losses. Chinese authorities say electricity generated at the Baihetan Dam in the country’s southwest can be delivered to Jiangsu Province, 2,080 kilometers away, in just 7 milliseconds (0.007 seconds).
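The 7-millisecond claim is consistent with basic physics: electrical signals on a transmission line travel at a large fraction of the speed of light. Below is a minimal sanity check assuming propagation at light speed; real lines are somewhat slower, so the cited figure should be read as an order-of-magnitude value.

```python
# Sanity check on the cited "2,080 km in 7 milliseconds" figure: an electrical
# signal on a transmission line propagates as an electromagnetic wave at a
# large fraction of the speed of light.
SPEED_OF_LIGHT_KM_S = 299_792   # km/s in vacuum
DISTANCE_KM = 2_080             # Baihetan-to-Jiangsu distance cited in the article

travel_time_ms = DISTANCE_KM / SPEED_OF_LIGHT_KM_S * 1000
print(f"Propagation time at light speed: ~{travel_time_ms:.1f} ms")  # ~6.9 ms
```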

Behind China’s nationwide grid buildout lies its “West-to-East Power Transmission” strategy, designed to draw electricity from resource-rich western regions—such as Inner Mongolia’s solar-abundant desert areas and hydropower-rich Chongqing—and channel it to industrial hubs in the east. More recently, China has shifted emphasis toward the “East Data, West Computing” initiative, which relocates data centers to western regions with abundant power resources (including Inner Mongolia) while serving the heavy data demand of the densely populated east.

In addition, Beijing views this year as the peak of coal consumption and has set a 2030 target for peak carbon emissions, accelerating the expansion of renewables. Through this push, China has reached a level where incremental annual consumption growth can be covered by low-carbon power sources. In the first half of last year alone, renewable generation reached 1.56 trillion kWh, accounting for 35.1% of total electricity output. China is also moving aggressively to build a green-hydrogen production base. It is already the world’s largest electrolyzer producer, with production capacity of 11.5 GW as of 2023, and plans to expand that to 40 GW by 2025.

“If This Continues, the U.S. Loses 100%”

Experts argue that the power gap between the United States and China will translate directly into an AI competitiveness gap. Data centers can typically be brought online within two to three years, enabling rapid expansion. Power infrastructure, however, requires long-term planning, permitting, and massive upfront investment, making quick responses structurally difficult. Moreover, data centers tend to cluster near major metropolitan areas and can add enormous electricity demand in a short period. If generation, transmission, and distribution upgrades do not keep pace, grid bottlenecks are highly likely. In fact, total U.S. electricity generation last year was 4,387 terawatt-hours (TWh), up only 11.4% from 25 years earlier in 1999 (3,936 TWh); over the same period, China’s total generation surged more than eightfold, from 1,239 TWh to 10,072 TWh. In an August report, Goldman Sachs assessed that U.S. power production has been at “zero growth” and that building additional generation takes 10 years.
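As a quick check on these generation figures, the sketch below computes total and annualized growth from the article’s own numbers; small differences from the rounded percentages in the text reflect rounding of the TWh figures.

```python
# Total and annualized growth computed from the generation figures in the article.
US_1999_TWH, US_LAST_TWH = 3_936, 4_387
CN_1999_TWH, CN_LAST_TWH = 1_239, 10_072
YEARS = 25

def growth(start, end, years):
    total = end / start - 1                  # overall growth over the period
    cagr = (end / start) ** (1 / years) - 1  # compound annual growth rate
    return total, cagr

us_total, us_cagr = growth(US_1999_TWH, US_LAST_TWH, YEARS)
cn_total, cn_cagr = growth(CN_1999_TWH, CN_LAST_TWH, YEARS)

print(f"U.S.:  +{us_total:.1%} overall, ~{us_cagr:.2%} per year")
print(f"China: {CN_LAST_TWH / CN_1999_TWH:.1f}x overall, ~{cn_cagr:.1%} per year")
```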

Compounding the issue, power shortages in the United States are now leaving completed data centers idle. Bloomberg reported that Digital Realty Trust’s “SJC37” and Stack Infrastructure’s “SVY02A” data centers in Santa Clara, California, were completed years ago but remain unused because they have been unable to secure the 100 megawatts (MW) of electricity required to operate. The scenario Nadella warned about earlier this year, that “if we can’t get power, AI chips could pile up as inventory,” has effectively materialized.

This abundance of power, in turn, is feeding into advances in China’s AI capabilities. Huawei’s advanced AI chip, the Ascend 910C, delivers 780 teraFLOPS (TFLOPS), only about 40% of Nvidia’s H100 (2,000 TFLOPS). To offset the deficit, Huawei bundled 384 Ascend 910C chips into a rack-scale system, aiming to match the performance of Nvidia’s NVL72 rack, which groups 72 of its newer Blackwell chips. In this configuration, Huawei’s rack consumes five times more electricity than Nvidia’s, but China plans to cover the difference with government-backed low-cost power.
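To make the trade-off concrete, here is a minimal sketch built only on the article’s stated ratios (roughly equal rack-level performance at five times the power draw). The absolute rack power and electricity price are hypothetical assumptions chosen for illustration, not figures from the article; the point is that the power-hungry rack breaks even on energy cost only when its electricity is about five times cheaper.

```python
# Sketch of the power-cost trade-off the article describes: a rack that matches
# its rival's performance but draws 5x the electricity breaks even on energy
# cost only if its electricity is at least 5x cheaper.
# The absolute figures below are illustrative assumptions, not from the article.
POWER_RATIO = 5                # Huawei rack power vs. Nvidia rack power (per the article)
NVIDIA_RACK_KW = 120           # assumed rack draw in kW (hypothetical)
US_PRICE_PER_KWH = 0.10        # assumed U.S. industrial rate, USD/kWh (hypothetical)

HOURS_PER_YEAR = 24 * 365
us_energy_cost = NVIDIA_RACK_KW * HOURS_PER_YEAR * US_PRICE_PER_KWH

# Electricity price at which the power-hungry rack costs the same to run.
breakeven_price = US_PRICE_PER_KWH / POWER_RATIO
cn_energy_cost = NVIDIA_RACK_KW * POWER_RATIO * HOURS_PER_YEAR * breakeven_price

print(f"Nvidia-class rack energy cost:         ~${us_energy_cost:,.0f}/year")
print(f"Break-even electricity price:           ${breakeven_price:.2f}/kWh")
print(f"Huawei-class rack at break-even price:  ~${cn_energy_cost:,.0f}/year")
```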

China’s ability to rapidly expand electricity generation stems from state-led construction of power infrastructure. The Chinese Communist Party has set generation targets every five years through 15 iterations of its Five-Year Plans and built power plants accordingly. In crises such as the 2008 global financial crisis and the 2020 COVID-19 pandemic, power infrastructure repeatedly served as a go-to stimulus lever. China has also pursued a pragmatic “black cat, white cat” approach—deploying renewables, nuclear, and thermal power as needed. Last year, China built or restarted 94.5 GW of new coal-fired power plants, marking the highest level in a decade.

By contrast, the United States has largely relied on companies to lead power-plant and grid construction, responding after demand materializes rather than forecasting and building ahead of it. An investment-industry source said, “In the U.S., software companies can see returns in three years, so there was little reason to invest in power infrastructure that takes 10 years.” As a result, even U.S. big tech firms are increasingly concerned that America could fall behind in the AI supremacy race against China. Nvidia CEO Jensen Huang said last month, “China will win the AI war,” adding, “It’s because electricity is free in China.”

Siobhán Delaney is a Dublin-based writer for The Economy, focusing on culture, education, and international affairs. With a background in media and communication from University College Dublin, she contributes to cross-regional coverage and translation-based commentary. Her work emphasizes clarity and balance, especially in contexts shaped by cultural difference and policy translation.