That's a lot o'leccy.
From Asterisk Magazine, Issue No. 9:
By 2030, leading AI labs will need data centers so massive they will require the power equivalent of some of America’s largest cities. Will they be able to find it?
We access AI through our screens as part of the ephemeral digital world that makes up the internet. But AI doesn’t actually live in the cloud.
AI lives in data centers. Tens of thousands of computers racked in rows hum away in buildings the size of several football fields. It's louder than you would expect. Industrial fans push in a constant breeze of chilled air, funneling away the waste heat. Thick bundles of fiber optic cables snake along ceiling tracks like mechanical veins, carrying unfathomable streams of data.
This is a 100-megawatt data center — a facility that consumes as much electricity as a small city, all to keep the digital world spinning. It’s impossible to say exactly how many exist today — companies prefer to keep their data centers private — but estimates put the number of hyperscale data centers worldwide at over 1,000. Yet, despite their footprint, they already fall short of the compute demands of training future generations of AI. We have reached the point where, if we don’t build bigger centers soon, tech giants will be forced to stretch training runs over multiple years. In short, AI has already outgrown its starter home.
GPT-4 was reportedly trained with 30 MW of power. Forecasts predict that in the next five years large training runs will require 5-gigawatt data centers — facilities at least 10 times the size of today’s largest data centers. That is roughly the average power draw of all of New York City.
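To get a feel for the jump these figures imply, here is a back-of-the-envelope check using the numbers above (30 MW for GPT-4, a 5-GW forecast, and the 10x claim). The 90-day run length is an illustrative assumption of ours, not a figure from the article.

```python
# Scale check on the article's figures.
GPT4_POWER_MW = 30           # reported GPT-4 training power
FORECAST_GW = 5              # forecast frontier facility size
TODAYS_LARGEST_FACTOR = 10   # forecast facilities are ~10x today's largest

forecast_mw = FORECAST_GW * 1000
growth = forecast_mw / GPT4_POWER_MW                      # ~167x GPT-4
todays_largest_mw = forecast_mw / TODAYS_LARGEST_FACTOR   # ~500 MW today

# Energy for a hypothetical 90-day training run at a full 5-GW draw:
hours = 90 * 24
energy_twh = FORECAST_GW * hours / 1000   # GW * h = GWh; /1000 -> TWh

print(f"{growth:.0f}x GPT-4's training power; "
      f"today's largest ~{todays_largest_mw:.0f} MW; "
      f"a 90-day run at 5 GW ~{energy_twh:.1f} TWh")
# -> 167x GPT-4's training power; today's largest ~500 MW;
#    a 90-day run at 5 GW ~10.8 TWh
```

In other words, a 5-GW facility is not an incremental step: it is two orders of magnitude beyond the power GPT-4 reportedly used, and a single multi-month run at that draw would consume terawatt-hours of electricity.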
Tech leaders seem confident that they’ll be able to build centers of a size that even a few years ago would have seemed unprecedented. Mark Zuckerberg said a 1-GW data center is “only a matter of time.” Meta has broken ground on its largest data center yet, where it hopes to bring one gigawatt online in 2025. Microsoft and OpenAI are reportedly planning a 1- to 5-GW “Stargate” facility launching as soon as 2028. Sam Altman even pitched the White House on the construction of multiple data centers that each require up to 5 GW. Tech, in short, is betting on a YIMBY future for AI training.
But how realistic are these plans? For all of the big talk, most GW proposals are still in the planning and permission stages. And actual sites in the United States that could support 1-GW — let alone 5-GW — projects are scarce for several reasons....
....MUCH MORE