Wednesday, September 10, 2025

"Can AI Be Energy Efficient?" (TSLA)

From the Wharton School of the University of Pennsylvania, Knowledge@Wharton, September 9:

AI’s rapid growth poses major energy and sustainability challenges. Researchers from Wharton and Penn are leading a multidisciplinary initiative to find solutions. 

The possibilities of artificial intelligence seem as infinite as its challenges. How can data centers, with their voracious power demands, be energy efficient? Who regulates these facilities and under what kinds of policies? And how can different software and hardware across the globe be coordinated to work for everyone?

As AI advances at breakneck speed, experts at Penn are urgently trying to answer these questions. Professors Arthur van Benthem and Benjamin C. Lee have assembled a multidisciplinary team of scientists, industry professionals, and policy leaders to explore solutions. The group met recently on campus for a two-day workshop, “AI Infrastructure: Foundations for Energy Efficiency and Scalability.”

“There are so many different perspectives on these questions that we wanted to bring everyone together to connect with each other. We want to have a broad impact,” said Lee, a professor in Penn’s Department of Electrical and Systems Engineering and Department of Computer and Information Science.

Lee and van Benthem are recipients of a National Science Foundation Expeditions in Computing grant, one of the largest awards for sustainability research. The workshop was part of the funding for a project called Carbon Connect, of which Lee is co-director. He and van Benthem said the issues emerging from AI are complex but not insurmountable.

“Computing in general has a massive environmental impact. Lots of data centers are going up, and they are using a lot of electricity,” Lee said. “A second concern is around the manufacturing costs of semiconductors. It’s understanding all of these things and coming up with solutions.”

Van Benthem, who is a professor in Wharton’s Department of Business Economics and Public Policy and co-director of the Wharton Climate Center, said two major policy issues arise with AI’s projected growth. The first is increasing the generation and transmission capacity of power grids, and the second is tapping as much green energy as possible.

“There are a lot of setbacks in terms of policy,” he said. “It’s not so clear that all the growth will be green, and the bottlenecks in modernizing and expanding the grid are pervasive.”

Workshop participants focused on several initiatives, including:

Energy Efficiency and Innovation

The data centers that power the computing processes of AI add significant load growth to regional energy grids, and running them will not be cheap — whether they rely on fossil fuel, renewables, or stored energy. Workshop participants said economic analyses need to reflect the true cost of providing electricity so that a full suite of potential grid responses is considered. A modern, decarbonized grid is possible with better coordination across system operators, electric and tech companies, and local, state, and federal jurisdictions.

Lee said there are several options to make AI more energy and carbon efficient, including building data centers in tandem with renewable energy generation and batteries, and developing smaller, specialized AI models for specific tasks that would require less energy.

“System architects could manage all of these assets intelligently, computing more when energy is abundant or computing less to ensure grid stability as needed,” he said....

....MUCH MORE 
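The carbon-aware computing Lee describes — running more workload when clean energy is abundant and throttling back when the grid is strained — can be illustrated with a toy scheduler. This is a sketch of the general idea only, not anything from the article; the threshold, intensity figures, and job names are all hypothetical.

```python
# Toy carbon-aware scheduler: run deferrable jobs only in hours when
# grid carbon intensity is below a (hypothetical) cutoff, defer otherwise.
from collections import deque

LOW_CARBON_THRESHOLD = 200  # gCO2/kWh, hypothetical cutoff


def schedule(jobs, hourly_intensity):
    """Place queued jobs into low-carbon hours; defer the rest."""
    queue = deque(jobs)
    plan = []  # (hour, job) pairs actually scheduled
    for hour, intensity in enumerate(hourly_intensity):
        if intensity <= LOW_CARBON_THRESHOLD and queue:
            plan.append((hour, queue.popleft()))
        # else: compute less this hour to ease pressure on the grid
    return plan, list(queue)


plan, deferred = schedule(
    ["train-batch-1", "train-batch-2"],
    [350, 180, 420, 150],  # hypothetical gCO2/kWh forecast per hour
)
# Both jobs land in the two low-carbon hours (hours 1 and 3).
```

Real systems would use marginal-emissions forecasts and grid-operator signals rather than a fixed threshold, but the shape of the decision — shift flexible compute toward clean, abundant power — is the same.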

Related in spirit if not in verbiage, from EVbase, September 8: 

Elon Musk is setting high expectations for Tesla AI5 and AI6 chips 
Tesla’s AI5 and AI6 Chips: A New Era in In-House Silicon

Elon Musk has set very high hopes for Tesla’s upcoming AI chips, AI5 and AI6. These chips are central to Tesla’s future in AI and robotics.

Tesla just completed the design review for AI5. It will be built first by TSMC in Taiwan and later scaled up in Arizona. Musk explained that Tesla has shifted from developing two separate chip architectures to focusing on a single, powerful design. This focus allows all its chip talent to work on one exceptional product.

Performance Expectations

Musk described AI5 as an "epic chip." He believes it will likely become the best inference chip for AI models under approximately 250 billion parameters. It promises to deliver unmatched cost efficiency and performance per watt. He added that AI6, the chip to follow, “has a shot at being the best AI chip by far.”

Chips for Vehicles, Robots, and AI Infrastructure

These chips are crucial for Tesla’s high-volume products like Optimus (the humanoid robot) and Cybercab. AI5 will also serve as the main inference chip onboard vehicles, while AI6 is expected to handle both inference and model training. Musk hinted that AI6 could effectively replace Tesla’s previous Dojo supercomputing platform.
Production Partners and Strategy Shift

Tesla has a $16.5 billion, nine-year deal with Samsung to manufacture AI6 chips at its Texas factory, running from 2025 to 2034. Musk has said he will personally oversee operations there to help accelerate progress. At the same time, AI5 is set to be fabricated by TSMC, starting in Taiwan and later in Arizona.

Why Tesla Shifted from Dojo...

....MORE