A planned 10-gigawatt data center in Ohio, potentially the world's largest, will require a $33 billion natural gas power plant with output equivalent to nine nuclear reactors, exposing the fossil fuel infrastructure underpinning the artificial intelligence boom.
The SoftBank facility, part of a broader wave of AI infrastructure investment, represents the hidden climate cost of machine learning's exponential growth. While tech giants pledge carbon neutrality and renewable energy commitments, their AI ambitions increasingly depend on massive fossil fuel expansion.
The contradiction could not be starker. Technology companies have spent years cultivating sustainability credentials: renewable energy purchases, net-zero pledges, climate advocacy. Yet AI's computational demands now drive construction of gas infrastructure that will emit greenhouse gases for decades. The Ohio plant alone would match the output of nine nuclear reactors; at roughly 1,100 megawatts for a typical reactor, that works out to approximately 10,000 megawatts of continuous fossil fuel generation.
"This is the AI industry's climate reckoning," environmental analysts warn. "You cannot train frontier models and maintain climate commitments using current infrastructure. Something has to give."
The scale staggers comprehension. A single large language model training run can consume more electricity than 100 US homes use in a year. Multiply that across hundreds of models, continuous retraining, inference serving billions of queries, and supporting infrastructure, and AI's energy appetite becomes civilization-scale. Ohio's Rust Belt location adds regional economic complexity: jobs and investment weighed against climate costs.
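For a rough sense of that comparison, consider a back-of-envelope sketch. The inputs below are illustrative assumptions rather than measured figures: roughly 1,300 megawatt-hours for one large training run, in line with published estimates for GPT-3-class models, and about 10.7 megawatt-hours of electricity per average US household per year.

```python
# Back-of-envelope comparison: one LLM training run vs. US household electricity use.
# Both inputs are illustrative assumptions, not measured values for any specific model.

TRAINING_RUN_MWH = 1300    # assumed energy for one large training run (GPT-3-scale estimates)
HOME_ANNUAL_MWH = 10.7     # assumed average annual US household use (~10,700 kWh)

homes_equivalent = TRAINING_RUN_MWH / HOME_ANNUAL_MWH
print(f"One training run ~= {homes_equivalent:.0f} US homes' annual electricity use")
# -> roughly 120 homes, consistent with the "more than 100 homes" figure above
```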
Data center energy consumption already accounts for roughly 1-2 percent of global electricity demand, a figure projected to triple by 2030 as AI deployment accelerates. That growth cannot be met by renewables alone at current deployment speeds, forcing reliance on gas plants like Ohio's or coal facilities kept operating beyond planned retirements.
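To put that projection in absolute terms, a similarly hedged sketch, assuming global electricity demand of roughly 27,000 terawatt-hours per year and, for simplicity, holding that baseline flat through 2030:

```python
# Rough projection of data center electricity demand; all inputs are assumptions.

GLOBAL_DEMAND_TWH = 27_000                  # assumed global electricity demand (TWh/year)
DC_SHARE_LOW, DC_SHARE_HIGH = 0.01, 0.02    # data centers at 1-2% of demand today
GROWTH_FACTOR = 3                           # projected tripling by 2030

current_twh = (GLOBAL_DEMAND_TWH * DC_SHARE_LOW, GLOBAL_DEMAND_TWH * DC_SHARE_HIGH)
projected_twh = tuple(x * GROWTH_FACTOR for x in current_twh)
print(f"Today: {current_twh[0]:.0f}-{current_twh[1]:.0f} TWh/year")
print(f"By 2030 (tripled): {projected_twh[0]:.0f}-{projected_twh[1]:.0f} TWh/year")
# -> roughly 270-540 TWh today, 810-1,620 TWh by 2030 under these assumptions
```

Under those assumptions, the added demand by 2030 would exceed the entire annual electricity consumption of many mid-sized countries, which is why gas plants like Ohio's enter the picture.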
