Thursday, October 3, 2024

"AI versus the climate as data center emissions soar"

It is probably time to require higher electricity prices for data centers. Not so much to price the CO₂ externalities (that approach to carbon may not be legal in the U.S. if the Supreme Court continues dismantling the administrative state and forcing Congress to accept and embrace its constitutional responsibilities; think EPA power plant rules), but to avoid burdening households with the other externalities: the cost of bringing on new generating capacity, the cost of stringing cables and reconductoring existing rights-of-way, the cost of back-up systems, batteries and other storage media.

The AI business has a lot of money: OpenAI just raised $6.6 billion at a $157 billion valuation. Or, my favorite recent example, the three centi-billionaires Jensen Huang (NVDA), Elon Musk (TSLA+++) and Larry Ellison (ORCL) having dinner at Nobu in Palo Alto, combined net worth at the table $550 billion, with Musk and Ellison begging Huang to get them more chips:

“Please take our money. By the way, I got dinner. No, no, take more of it. We need you to take more of our money, please.”

Ellison on the table-talk

After that longer-than-usual introduction, here's the story at Asia Times, October 3:

Once up to speed, an AI can use 33 times more energy to complete a function than traditional software   

Artificial intelligence (AI) is curating your social media feed and giving you directions to the train station. It’s also throwing the fossil fuel industry a lifeline.

Three of the biggest tech companies, Microsoft, Google and Meta, have reported ballooning greenhouse gas emissions since 2020. Data centers packed with servers running AI programs day and night are largely to blame.

AI models consume a lot of electricity, and the World Economic Forum estimated in April that the computing power dedicated to AI is doubling every 100 days. Powering this boom in the US, where many AI tech pioneers are based, have been revitalized gas power plants once slated for closure.
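For a sense of what that growth rate implies, here is a quick back-of-the-envelope sketch (a hypothetical illustration, assuming the 100-day doubling simply compounds with no slowdown or saturation):

    # Back-of-the-envelope: what "doubling every 100 days" implies,
    # assuming simple exponential growth with no slowdown or saturation.
    DOUBLING_PERIOD_DAYS = 100

    def growth_factor(days: float) -> float:
        """Multiple of the starting compute after `days`, given 100-day doubling."""
        return 2 ** (days / DOUBLING_PERIOD_DAYS)

    for days in (100, 365, 730):
        print(f"after {days} days: ~{growth_factor(days):.1f}x the starting compute")

    # after 100 days: ~2.0x the starting compute
    # after 365 days: ~12.6x the starting compute
    # after 730 days: ~157.6x the starting compute

At that pace the compute, and the electricity behind it, grows by roughly an order of magnitude per year, which is the arithmetic behind the grid-strain worries that follow.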

First, what actually is AI?

“At its core, the kind of AI we are seeing in consumer products today identifies patterns,” say Sandra Peter and Kai Riemer, computing experts at the University of Sydney. “Unlike traditional coding, where developers explicitly program how a system works, AI ‘learns’ these patterns from vast datasets, enabling it to perform tasks.”

While AI programs are being “trained” and fed huge amounts of data over several weeks and months, data processors run 24/7. Once up to speed, an AI can use 33 times more energy to complete a function than traditional software.

In fact, a single query to an AI-powered chatbot can consume ten times as much energy as a traditional Google search, according to Gordon Noble and Fiona Berry, sustainability researchers at the University of Technology Sydney.

“This enormous demand for energy translates into surges in carbon emissions and water use, and may place further stress on electricity grids already strained by climate change,” they say.

Data centers are thirsty as well as power-hungry: millions of liters of water have to be pumped to keep them cool. These enormous server warehouses are vying with people for an increasing share of power and water, a situation that could prove deadly during a heatwave or drought.

A dubious solution....

....MUCH MORE