From Barron's, March 14:
Energy companies increasingly cite AI as a major driver of electricity demand. But the grid could hold everything back.
Elon Musk recently made a bold prediction about artificial intelligence—and not the one about it being an existential threat to humanity.
Musk said rising demand for power-hungry AI chips could soon lead to an electricity shortage. “Next year, you will see that they just can’t find enough electricity to run all the chips,” the Tesla CEO said at the Bosch ConnectedWorld conference late last month.
While AI’s surging demand may not lead to mass electrical outages, the AI boom is already changing how data centers are built and where they’re located, and it’s already sparking a reshaping of U.S. energy infrastructure.
Energy companies increasingly cite AI power consumption as a leading contributor to new demand. AES, a Virginia-based utility, recently told investors that data centers could account for up to 7.5% of total U.S. electricity consumption by 2030, citing data from Boston Consulting Group. The company is largely betting its growth on its ability to deliver renewable power to data centers in the coming years.
Sempra Energy, which operates public utilities in California and Texas, has cited AI as a major factor in its growth, alongside the electrification of the oil and gas industry.
New data centers coming online in its regions “represent the potential for thousands of megawatts of new electric load—often hundreds of megawatts for just one project,” Sempra told investors on its earnings call last month.
According to Boston Consulting Group, U.S. data-center electricity consumption is expected to roughly triple, from 126 terawatt-hours in 2022 to 390 terawatt-hours by 2030. That’s the equivalent usage of 40 million U.S. homes, the firm says.
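As a rough sanity check on those figures (the per-household number below is an assumption, not from the article; roughly 10,500 kWh per year is a commonly cited U.S. average):

```python
# Sanity-check the Boston Consulting Group figures quoted above.
# Assumption (not in the article): an average U.S. household uses
# roughly 10,500 kWh of electricity per year.
TWH_2022 = 126
TWH_2030 = 390
KWH_PER_HOME_PER_YEAR = 10_500

growth_factor = TWH_2030 / TWH_2022  # ~3.1x, consistent with "triple"

# 1 TWh = 1 billion kWh
homes_equivalent = TWH_2030 * 1e9 / KWH_PER_HOME_PER_YEAR

print(f"Growth factor 2022->2030: {growth_factor:.1f}x")
print(f"Equivalent U.S. homes: {homes_equivalent / 1e6:.0f} million")
```

Under that assumption the 390 TWh works out to about 37 million homes; the firm's "40 million homes" implies a slightly lower per-household figure of roughly 9,750 kWh per year, so the numbers are broadly consistent.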
Much of the data-center growth is being driven by new applications of generative AI. As AI dominates the conversation, it’s likely to bring renewed focus on the nation’s energy grid.
Siemens Energy CEO Christian Bruch told shareholders at the company’s recent annual meeting that electricity needs will soar with the growing use of AI. “That means one thing: no power, no AI. Or to put it more clearly: no electricity, no progress.”
The technology sector has already shown how quickly AI can recast long-held assumptions. Chips, for instance, driven by Nvidia, have replaced software as tech’s hottest commodity. Nvidia has said that the trillion dollars invested in global data-center infrastructure will eventually shift from traditional servers with central processing units, or CPUs, to AI servers with graphics processing units, or GPUs. GPUs are better able to power the parallel computations needed for AI.
For AI workloads, Nvidia says that two GPU servers can do the work of a thousand CPU servers at a fraction of the cost and energy. Still, the better performance capabilities of GPUs are leading to more aggregate power usage as developers find innovative new ways to use AI....
....MUCH MORE
Also at Barron's