[AI Hegemony War] The Second AI War, Where the Decisive Battlefield Lies: Hardware and Power Grids
The expansion of inference and physical AI shifts competition from models to infrastructure
From chip rivalry to power grid rivalry: U.S. grid constraints versus China's supply expansion
Korea faces a dual bind of grid saturation and site conflicts, calling for a national AI strategy

The global race for artificial intelligence (AI) supremacy is expanding beyond algorithms into a contest over the physical foundations of hardware and power infrastructure. As the industry’s center of gravity shifts from training to inference, semiconductor supply chains and the ability to secure stable energy supplies are emerging as the core determinants of national competitiveness. Amid intensifying U.S.–China infrastructure competition, Korea faces mounting urgency to craft a strategic response as it confronts structural limits in its power grid even as demand surges toward 49 gigawatts (GW).
From Training to Inference, the Battlefield Shifts
According to the IT industry on the 19th, the balance of the global AI race is rapidly moving away from the training phase of building massive models toward the inference phase of operating them efficiently. This trajectory aligns with the “data wall” forecast by U.S. nonprofit AI research group Epoch AI, which warns that high-quality data could be exhausted by around 2028. As model scaling approaches physical limits, big tech companies are pivoting toward practical service deployment and monetization. OpenAI’s push to introduce advertising into free and low-cost tiers of ChatGPT is widely interpreted as an effort to reinforce revenue models capable of absorbing the cumulative costs of inference during service operations.
As inference takes center stage, the variables shaping cost and performance are shifting from sheer model size to computing resources and power efficiency. This transition is reshaping the hardware market itself. With demand rising for inference-optimized solutions, chip specialization by purpose is accelerating. Nvidia, long dominant in the AI accelerator market, has signed a non-exclusive technology licensing agreement with inference-focused chip startup Groq. This signals that the GPU-centric model is loosening in inference markets where low latency and power efficiency are paramount, giving way to a more segmented hardware architecture defined by distinct functional roles.
The evolution of hardware has expanded AI’s operating arena from virtual environments into the physical world. At CES 2026, the world’s largest IT and electronics exhibition held earlier this month, the central theme was the shift in AI’s focus from “agentic AI,” which functions as virtual assistants, toward “physical AI” that directly interacts with the real world through robots and other systems. Nvidia unveiled its next-generation Rubin GPU platform alongside a suite of software offerings, including the Omniverse virtual simulation platform and the Cosmos data generation tool. This reflects a strategy aimed beyond chip supply, seeking to dominate an end-to-end ecosystem that transfers AI trained in virtual environments directly into real-world machines.
Movements by device manufacturers are also taking concrete shape. Samsung Electronics has set a target of expanding AI-enabled mobile devices to 800 million units by the end of this year, accelerating the growth of the on-device market. Apple has entered a strategic partnership with rival Google to integrate Google’s Gemini model and cloud infrastructure into the next generation of Siri. These developments underscore that the essence of AI competition is evolving toward securing stable hardware supply chains and strengthening software integration.
Despite these alliances and technological advances, severe physical bottlenecks persist as supply struggles to keep pace with explosive infrastructure demand. BlackRock, the world’s largest asset manager, estimates the global new infrastructure market at $85 trillion over the next 15 years, while identifying a shortage of skilled labor as a key risk factor. According to the Associated Builders and Contractors (ABC), data center expansion alone requires an additional 349,000 workers this year. Ultimately, today’s AI supremacy race is morphing from a contest over superior algorithms into a massive volume-driven struggle over who can deploy physical chips, servers, manpower, and capital at scale and on time.
The Next Battlefield Is the Power Grid: Chip Dominance in the U.S., Power Supremacy in China
The scope of physical infrastructure competition is now extending beyond semiconductors to the acquisition of power grids. As Meta CEO Mark Zuckerberg recently put it, future AI growth will be determined by electricity; the industry’s foundational pillar is shifting from computing devices to energy. The International Energy Agency (IEA) projects that global data center electricity consumption will more than double from 460 terawatt-hours (TWh) in 2022 to 1,050 TWh by 2026. AI hyperscale data centers, in particular, require power capacities of around 100 megawatts (MW), four to ten times that of conventional data centers, making the expansion of transformers, power semiconductors, and grid operating systems an urgent priority.
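A back-of-the-envelope check puts these figures in perspective. The sketch below uses only the numbers cited above (IEA consumption estimates and the 100 MW hyperscale figure); the implied conventional data center range is derived, not stated in the article.

```python
# Sanity check on the scale claims above (all input values from the article).
dc_consumption_2022_twh = 460    # IEA estimate, global data centers, 2022
dc_consumption_2026_twh = 1050   # IEA projection for 2026

growth_factor = dc_consumption_2026_twh / dc_consumption_2022_twh
print(f"Projected growth: {growth_factor:.2f}x")  # ~2.28x, i.e. "more than double"

# If an AI hyperscale site needs ~100 MW and that is 4-10x a conventional
# data center, a conventional site implies roughly 10-25 MW.
hyperscale_mw = 100
low_mw, high_mw = hyperscale_mw / 10, hyperscale_mw / 4
print(f"Implied conventional data center: {low_mw:.0f}-{high_mw:.0f} MW")
```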
China holds a clear advantage in this domain, underpinned by its power generation capacity and reserves. As of 2023, China’s installed power generation capacity stood at 2,907 GW, more than double that of the United States at 1,334 GW, with some regions even experiencing negative electricity prices due to oversupply. State Grid Corporation of China plans to invest approximately $590 billion by 2030 to expand its power grid. This effort is linked to the “East Data, West Computing” initiative, which transmits abundant renewable energy from western regions to eastern demand centers and disperses data centers across the west, reinforcing the power foundation needed for AI industry growth.
By contrast, the United States maintains an edge in semiconductor design but faces constraints from an aging power grid and complex permitting processes. Meanwhile, China continues to widen the gap through state-led investment. China’s projected power generation for 2025 stands at 10.6 trillion kilowatt-hours (kWh), more than double the U.S. level of 4.24 trillion kWh, and Tesla CEO Elon Musk has suggested the disparity could triple by 2026. The IEA also forecasts that by 2030, China’s growth rate in data center power consumption, at 170 percent, will outpace that of the United States at 130 percent.
Facing uncertainty in power supply, U.S. big tech firms are moving to secure energy directly. Alphabet, Google’s parent company, acquired renewable energy developer Intersect Power in December last year, internalizing power procurement, while Amazon and Meta have invested in developers of small modular reactors (SMRs), seen as next-generation power sources. The Wall Street Journal observed that the risk of facilities becoming stranded assets is outweighed by the greater risk of falling behind in the AI race. In effect, the focal point of AI industry competition is converging on the question of how stably and efficiently electricity can be supplied.

A 49 GW Demand Surge Meets a Grid at Full Capacity, Korea Confronts Structural Limits
As global AI competition escalates into a struggle over hardware and power grids, Korea’s structurally isolated power system, unconnected to neighboring countries, is revealing its inherent limitations. The Korea Data Center Council projects that the domestic private data center market will expand rapidly from $4.7 billion in 2024 to $7.7 billion by 2028, yet infrastructure supply is failing to keep pace with this surge. New data center grid-connection requests submitted to Korea Electric Power Corporation (KEPCO) through 2029 total 49,397 MW, dwarfing the 1,986 MW of existing capacity at the end of 2023. Meeting this demand would require the construction of 53 additional 1 GW-scale power plants, a prospect widely viewed as unrealistic. Despite possessing world-class semiconductor capabilities, Korea faces fundamental imbalances in grid capacity and renewable energy penetration. The National Assembly Research Service has warned that urgent measures are needed to meet projected power demand of 129.3 GW by 2038.
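The mismatch between connection requests and existing capacity can be made concrete with the article’s own figures; the calculation below is purely illustrative arithmetic on those two numbers.

```python
# Scale of Korea's data center grid-connection backlog (figures from the article).
requested_mw = 49_397  # new connection requests to KEPCO through 2029
existing_mw = 1_986    # existing capacity at end of 2023

print(f"Requested vs existing: {requested_mw / existing_mw:.1f}x")  # ~24.9x
print(f"Requested capacity: {requested_mw / 1_000:.1f} GW")         # ~49.4 GW
```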
Physical constraints stemming from concentration in the Seoul metropolitan area have reached a critical threshold. As data center demand surges, grid saturation, local acceptance conflicts, and water shortages are becoming tangible realities, while transmission bottlenecks are intensifying to the point that electricity cannot be delivered even when generated. The prolonged delay of 12 years and 6 months in constructing the transmission line linking North Dangjin in South Chungcheong Province to Sintangjeong in Pyeongtaek starkly illustrates the dysfunction in Korea’s grid expansion process. According to Cushman & Wakefield, the time required to confirm power supply in the metropolitan area has lengthened dramatically from two to three months in the past to over 12 months recently. Compounding these challenges are vulnerabilities in securing essential AI semiconductor materials such as copper, rare metals, and ultra-high-purity inputs. Heavy reliance on China-centered refining and processing supply chains means that Korea’s AI and semiconductor expansion inevitably carries heightened geopolitical risk.
Overcoming these structural barriers requires a comprehensive shift toward an integrated resource and climate strategy. Experts argue that Korea must reform its rigid, centralized grid into a decentralized system based on local production and consumption, supported by more flexible power markets. The existing monopoly grid structure generates inefficiencies by preventing effective distribution even when renewable energy is abundant. Beyond this, a national AI strategy must be redesigned as a holistic roadmap encompassing power infrastructure expansion, accelerated renewable energy deployment, critical mineral recycling, and data center site management. On the technological front, analysts stress the need to reduce dependence on Nvidia by fostering domestic neural processing unit (NPU) and processing-in-memory (PIM) ecosystems that drastically cut power consumption, securing both energy security and high-efficiency technology as dual strategic assets.