The US Energy Secretary authorized grid operators to order data centers onto their backup generators during the recent winter storm, revealing just how much strain AI infrastructure is putting on the power grid. The move was meant to free up as much as 35 gigawatts of grid capacity - enough to power tens of millions of homes.
This is the AI infrastructure crisis nobody wants to talk about. We're building massive data centers for AI training while the power grid can barely handle winter weather. The energy math doesn't work, and now we're seeing the consequences in real time.
Energy Secretary Chris Wright authorized PJM, ERCOT, and Duke Energy to direct data centers to fire up their backup diesel generators during Winter Storm Fern in late January. The goal was to free up 35 gigawatts of power capacity for residential customers as temperatures plunged.
Let that sink in. Data centers were drawing so much electricity that simply asking them to run their own generators for a while freed up enough power for tens of millions of homes. That's not a rounding error. That's a fundamental infrastructure problem.
PJM manages the grid serving much of the mid-Atlantic. ERCOT runs Texas's famously independent power system. Duke Energy supplies much of the Southeast. All three faced the same crisis: surging heating demand combined with massive baseline data center consumption threatened grid stability.
The numbers are staggering. According to Lawrence Berkeley National Lab, data centers accounted for 4.4% of US electricity consumption in 2023. By 2028, that could jump to between 6.7% and 12%. PJM anticipates peak load growth of 32 gigawatts by 2030 - "nearly all going to new data centers."
That's capacity equivalent to 30 million homes, except it's not going to homes. It's going to train large language models and run AI inference.
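The "gigawatts to homes" conversions behind these figures are easy to sanity-check. A minimal sketch, assuming an average US household draw of about 1.2 kW (a common rule of thumb derived from roughly 10,500 kWh of annual use, not a figure from the article):

```python
# Back-of-the-envelope check on the gigawatts-to-homes conversions above.
# The ~1.2 kW average household load is an assumed rule of thumb
# (about 10,500 kWh/year), not a number taken from the article.
AVG_HOME_KW = 10_500 / 8_760  # kWh per year / hours per year ≈ 1.2 kW

def homes_equivalent(gigawatts: float, avg_home_kw: float = AVG_HOME_KW) -> int:
    """Number of average homes a given capacity could supply continuously."""
    return int(gigawatts * 1_000_000 / avg_home_kw)  # 1 GW = 1,000,000 kW

print(f"{homes_equivalent(35):,}")  # the ~35 GW freed during the storm
print(f"{homes_equivalent(32):,}")  # PJM's projected 32 GW of load growth
```

Both figures land in the high-20-millions of home-equivalents, consistent with the "30 million homes" comparison above; the exact count shifts with the assumed per-home load.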
