this post was submitted on 22 May 2025
23 points (96.0% liked)

Hardware

1 comment
[–] anamethatisnt@sopuli.xyz 11 points 1 week ago* (last edited 1 week ago)

[...]the authors of the report determined that a single query to the open-source Llama 3.1 8B engine used around 57 joules of energy to generate a response.[...]

A larger model, like Llama 3.1 405B, needs around 6,706 joules per response – eight seconds of microwave usage.

In other words, the size of a particular model plays a huge role in how much energy it uses.
Although its true size is a mystery, OpenAI's GPT-4 is estimated to have well over a trillion parameters, meaning its per-query energy footprint is likely far higher than that of the Llama queries tested.[...]
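
The quoted numbers roughly check out if you assume a microwave in the 800-900 W range (the wattage and the per-parameter comparison below are my own back-of-envelope assumptions, not from the article):

```python
# Back-of-envelope check of the per-query figures quoted above.
llama_8b_joules = 57        # per query, Llama 3.1 8B (from the report)
llama_405b_joules = 6_706   # per query, Llama 3.1 405B (from the report)
microwave_watts = 840       # assumed appliance power, not from the article

# 6,706 J / 840 W ~ 8 s, matching the "eight seconds of microwave usage" comparison.
print(f"405B query ~ {llama_405b_joules / microwave_watts:.1f} s of microwave time")

# Energy grows faster than parameter count here: ~51x the parameters, ~118x the energy.
print(f"Parameter ratio: {405 / 8:.0f}x, energy ratio: {llama_405b_joules / llama_8b_joules:.0f}x")
```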

AI video generation, on the other hand, is an energy sinkhole.

In order to generate a five-second-long video at 16 frames per second, the CogVideoX AI video generation model consumes a whopping 3.4 million joules of energy – equivalent to running a microwave for an hour or riding 38 miles on an e-bike, Hugging Face researchers told the Tech Review.
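
The same kind of back-of-envelope math works for the video figure too, if you assume roughly a 950 W microwave and about 25 Wh per mile for an e-bike (both are my assumptions, not from the article):

```python
# Back-of-envelope check of the video-generation comparison quoted above.
video_joules = 3_400_000          # CogVideoX, 5 s of video at 16 fps (from the article)

microwave_watts = 950             # assumed microwave power draw
ebike_wh_per_mile = 25            # assumed e-bike efficiency

video_wh = video_joules / 3_600   # 1 Wh = 3,600 J -> ~944 Wh

# ~944 Wh / 950 W ~ 1 hour of microwave use
print(f"Microwave time: {video_wh / microwave_watts:.2f} hours")

# ~944 Wh / 25 Wh per mile ~ 38 miles by e-bike
print(f"E-bike range:   {video_wh / ebike_wh_per_mile:.0f} miles")
```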