this post was submitted on 22 May 2025

Fuck AI


New analysis by MIT Technology Review reveals AI's rapidly growing energy demands, with data centers expected to roughly triple their share of US electricity consumption from 4.4% to 12% by 2028. According to Lawrence Berkeley National Laboratory projections, AI alone could soon consume as much electricity annually as 22% of all US households, driven primarily by inference, which accounts for 80-90% of AI's computing power.

1 comment
[–] hedgehog@ttrpg.network

From the Slashdot comments, by Rei:

Or, you can, you know, not fall for clickbait. This is one of those...

> Ultimately, we found that the common understanding of AI’s energy consumption is full of holes.

"Everyone Else Is Wrong And I Am Right" articles, which starts out with....

> The latest reports show that 4.4% of all the energy in the US now goes toward data centers.

without bothering to mention that AI is only a small percentage of data centre power consumption (Bitcoin alone is an order of magnitude higher), and....

> In 2017, AI began to change everything. Data centers started getting built with energy-intensive hardware designed for AI, which led them to double their electricity consumption by 2023.

What a retcon. AI was *nothing* until the early 2020s. Yet datacentre power consumption did start skyrocketing in 2017 - having nothing whatsoever to do with AI. Bitcoin was the big driver.

> At that point, AI alone could consume as much electricity annually as 22% of all US households.

Let's convert this from meaningless hype numbers to actual numbers. First off, notice the fast one they just pulled - comparing global AI usage against just the US, and just households. US households use about 1500 TWh of the world's 24400 TWh/yr of electricity, or about 6%. 22% of 6% is ~1.3% of global electricity (roughly 330 TWh/yr). Electricity is about 20% of global energy, so in this scenario AI would be ~0.3% of global energy. We're taking their extreme numbers at face value for now (they predict an order of magnitude of growth over today's AI consumption), and ignoring that even a single AI application alone could entirely offset the emissions of all AI combined.
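To make that conversion easy to check, here's the same back-of-envelope arithmetic as a quick sketch; the 1500 TWh and 24400 TWh figures are the rough estimates used above, not precise statistics:

```python
# Back-of-envelope check of the "22% of US households" framing,
# using the rough estimates from the paragraph above (not precise statistics).

us_household_electricity_twh = 1500   # approx. annual US residential electricity use
world_electricity_twh = 24400         # approx. annual global electricity use
electricity_share_of_energy = 0.20    # electricity as a rough share of total global energy

ai_electricity_twh = 0.22 * us_household_electricity_twh        # "22% of US households"
share_of_world_electricity = ai_electricity_twh / world_electricity_twh
share_of_world_energy = share_of_world_electricity * electricity_share_of_energy

print(f"AI electricity: ~{ai_electricity_twh:.0f} TWh/yr")                # ~330 TWh/yr
print(f"Share of global electricity: ~{share_of_world_electricity:.2%}")  # ~1.35% (the ~1.3% above)
print(f"Share of global energy: ~{share_of_world_energy:.2%}")            # ~0.27% (the ~0.3% above)
```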

Let's look at the premises behind their claimed 0.3% of global energy usage (oh, I'm sorry, let's revert to scary numbers: "22% OF US HOUSEHOLDS!"):

  • It's almost all inference, so that simplifies everything to usage growth.
  • But usage growth is offset by the fact that AI efficiency is simultaneously improving faster than Moore's Law on three separate axes - hardware, inference, and models - which are multiplicative with each other (see the sketch after this list). You can get what used to require the insanely expensive, server-and-power-hungry GPT-4 (1.5T parameters) from a model small enough to run on a cell phone - one that, run on efficient modern servers, finishes its output in a flash. So you have to assume not just one order of magnitude of inference growth (from more people using AI), but many orders of magnitude of inference growth.
  • You can try to Jevons at least part of that away by assuming that people will always want the latest, greatest, most powerful models for their tasks, rather than putting the efficiency gains toward lower costs. But will they? To some extent, sure: LRMs deal with a lot more tokens than non-LRMs, AI video is just starting to take off, and so on. But at the same time, today's LRMs work in token space, while in the future they'll probably work in latent space, which is vastly more efficient. To be clear, I'm sure Jevons will eat a lot of the gains - but all of them? I'm not so sure about that.
  • You need the hardware to actually consume this power. They're predicting that, three years from now, there will be an order of magnitude more AI hardware out there than all the AI servers deployed to this point combined. Is the production capacity for that huge an increase in AI silicon actually in the works? I don't see it.