The Gigawatt Bottleneck: Power Constraints Now Define AI Scaling
US data centers consume 4.4% of national electricity today; projections show 6.7-12% by 2028, creating a hard constraint on AI ambitions and fragmenting competitive advantage toward nations with spare grid capacity.
Power availability, not compute architecture, has become the binding constraint on artificial intelligence infrastructure scaling in 2026. US data centers consumed 176 TWh of electricity in 2023—4.4% of total national consumption—but projections from Lawrence Berkeley National Laboratory show consumption reaching 325-580 TWh by 2028, or 6.7-12% of national electricity use. A single hyperscale AI training cluster now requires 100-1,000 MW of dedicated power, equivalent to the electricity demand of 80,000-800,000 households.
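The cluster-to-household comparison above can be checked with a back-of-envelope calculation. The sketch below assumes an average US household load of roughly 1.25 kW (about 10,950 kWh per year); that figure is an assumption for illustration, not from the article.

```python
# Back-of-envelope check: how many average households match a cluster's draw?
# AVG_HOUSEHOLD_KW is an assumed figure (~1.25 kW average US household load).
AVG_HOUSEHOLD_KW = 1.25

def household_equivalent(cluster_mw: float) -> int:
    """Number of average households whose combined demand equals the cluster's."""
    return int(cluster_mw * 1000 / AVG_HOUSEHOLD_KW)

for mw in (100, 1000):
    print(f"{mw:>5} MW cluster ≈ {household_equivalent(mw):,} households")
# 100 MW → 80,000 households; 1,000 MW → 800,000 households
```

Under that assumption, the 100-1,000 MW range maps directly onto the 80,000-800,000 household figure cited above.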
This constraint is already manifesting in deployment delays and cancellations. Approximately 7 GW of the 12 GW planned US data center capacity for 2026 has been delayed or cancelled due to power constraints, according to analysis from Goldman Sachs. In Texas, electricity loads totaling tens of gigawatts have been requested but only approximately 1 GW received approval. The mismatch is structural: infrastructure planning timelines run 5-10 years while AI deployment targets run 12-24 months.
Regional Grid Strain and Market Fragmentation
The power bottleneck is reshaping competitive geography within the United States. Virginia’s data centers consumed 26% of the state’s total electricity supply in 2023, per Pew Research Center analysis. The PJM Interconnection—covering the mid-Atlantic region—saw capacity market clearing prices jump to $329/MW-day for 2026-27 delivery versus $28.92/MW-day two years prior, an 11-fold increase that translated directly into residential rate hikes of $18/month in Maryland and $16/month in Ohio.
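The "11-fold" characterization follows directly from the two clearing prices cited above:

```python
# Sanity check on the PJM capacity market price jump described above.
prior = 28.92  # $/MW-day, clearing price two years prior
new = 329.0    # $/MW-day, 2026-27 delivery year
multiple = new / prior
print(f"Capacity price increase: {multiple:.1f}x")  # ~11.4x, the "11-fold" jump
```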
> “Basically, we have run out of headroom, largely speaking, in the US.”
>
> — Ben Hertz-Shargel, Data Centers Expert, Wood Mackenzie
Texas is projected to become the largest US data center market by 2028 with over 40 GW capacity, a 142% increase from 2025 levels, according to Bloom Energy projections. This shift reflects available grid headroom rather than technical or talent advantages. California’s interconnection queue now averages over 9 years for new connections, up from under 2 years in 2008.
The Off-Grid Scramble
Faced with grid bottlenecks, hyperscalers are pursuing two parallel strategies: onsite generation and small modular reactor (SMR) procurement. Meta’s Louisiana campus required 2 GW of dedicated power; Entergy is spending $3.2 billion to build three natural gas plants totaling 2.3 GW specifically for that facility. Conditional offtake agreements for SMRs grew from 25 GW at the end of 2024 to 45 GW in May 2026, according to the International Energy Agency.
This approach concentrates advantage among cash-rich players. Tech hyperscalers collectively spent over $320 billion on data center capex in a single year, while the entire US utility industry spent approximately $160 billion on all energy infrastructure. Smaller competitors without the balance sheets to build dedicated power plants face escalating costs and uncertain timelines.
A single AI inference task now consumes up to 1,000 times more electricity than a traditional web search. Data centers accounted for approximately 50% of all US electricity demand growth in 2025, according to the International Energy Agency. This intensity will worsen as training runs scale and deployment expands.
The China Advantage
China added 429 GW of net electric generation capacity in 2024 versus approximately 28 GW for the United States, a 15-fold annual capacity advantage reported by the Federal Reserve. This asymmetry creates strategic vulnerability: US export controls on advanced semiconductors assume AI systems will train domestically, but if domestic power constraints force training offshore, the policy rationale collapses.
| Metric | United States | China |
|---|---|---|
| New Capacity Added (net, 2024) | ~28 GW | 429 GW |
| Build Multiple | 1x | 15.3x |
| Interconnection Queue | 5 years avg | ~18 months |
| Public Opposition Constraints | High | Minimal |
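The "Build Multiple" row in the table above can be reproduced from the capacity figures:

```python
# Rough reproduction of the table's "Build Multiple" row.
us_gw = 28      # approximate net US capacity added in 2024
china_gw = 429  # net Chinese capacity added in 2024
print(f"China build multiple: {china_gw / us_gw:.1f}x")  # ~15.3x
```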
Analysis from the Brookings Institution notes that while the United States maintains advantage in access to cutting-edge semiconductors, China’s energy infrastructure capacity creates an “electron gap” that could reshape the balance of compute between the two countries. US electricity demand remained flat for nearly two decades before the AI surge; Chinese planning assumes continuous expansion.
State Regulatory Response
Over 200 data center bills were introduced across state legislatures in 2025; over 300 bills appeared in the first six weeks of 2026 across 30 states, according to tracking by MultiState. Twenty-seven states are advancing legislation requiring data centers to cover energy infrastructure costs directly rather than socialising them across ratepayers. Maine became the first state to implement a construction moratorium, pausing new deployments until November 2027.
The regulatory acceleration reflects constituent pressure. Retail electricity prices rose 42% from 2019 to 2026 with data center demand identified as a significant contributing factor. In Arizona, average industrial electricity rates rose 17% from 2014 to November 2024 while residential rates rose 36%, per APM Research Lab analysis. Ireland implemented a de facto cap on new data center grid connections in the Dublin region.
- Hyperscalers’ capex dwarfs utility infrastructure spending, creating private-sector capacity to bypass grids but fragmenting national advantages
- Interconnection queues averaging 5+ years collide with AI deployment cycles of 12-24 months
- State cost-recovery mandates raise deployment costs just as China’s unconstrained build-out accelerates
- Gartner predicts power shortages will restrict 40% of AI data centers by 2027
What to Watch
The critical variable is whether federal action—potentially through proposed legislation like the DATA Act—can accelerate interconnection timelines and grid modernisation faster than state-level restrictions slow deployment. If not, the US faces fragmented outcomes: a two-tier market where early movers with dedicated power enjoy structural advantage while latecomers face compounding delays.
Track three indicators: first, the pace of SMR commercial deployment versus hyperscaler demand growth; second, whether Texas grid capacity materialises at projected rates or hits infrastructure bottlenecks; third, legislative outcomes in the 27 states advancing cost-recovery requirements. The global dimension matters equally—if China’s 15x annual capacity advantage persists, advanced model training may shift offshore by commercial necessity rather than policy choice, eroding the strategic foundation of US export controls on AI semiconductors.
The electron gap is no longer hypothetical. It is the constraint that determines which nations can build, train, and deploy frontier AI systems at scale.