this post was submitted on 12 Aug 2025
81 points (97.6% liked)

all 31 comments
[–] grue@lemmy.world 59 points 2 months ago (2 children)

When mainstream media starts asking if something is a bubble, not only has it already been one for quite a while, it's about to pop.

[–] three_trains_in_a_trenchcoat@piefed.social 21 points 2 months ago (2 children)

yeah. I'm wondering if GPT-5 being a wet fart is going to be the thing that pops the bubble.

[–] LiveLM@lemmy.zip 7 points 2 months ago* (last edited 2 months ago) (1 children)

Was there a lot of hype surrounding the new launch? I didn’t really keep up with it.

Regardless, I think it’ll take a bigger disappointment to burst it. Maybe something on the corporate side, like big players not seeing a return on their investment.

[–] brucethemoose@lemmy.world 6 points 2 months ago* (last edited 2 months ago)

Ohhh yes. Altman’s promotion for it was the Death Star coming up from behind a planet.

Maybe something on the corporate side, like big players not seeing a return on their investment.

Ohhh, it is. The big corporate hosters aren’t making much money and are burning cash, and it’s not getting any better as specialized open models eat them from the bottom up.

GPT-OSS was kinda a flop too, even with decensoring.

AI isn’t gone, but the corporate side is realizing this is as good a way to make money as selling air.

[–] Rentlar@lemmy.ca 4 points 2 months ago

Thanks, now my mental image of the AI bubble is shit-coloured.

[–] 9point6@lemmy.world 29 points 2 months ago

It's been a bubble since GPT-2, guys, get with the program.

There is zero chance even half of all these AI product companies still exist in half a decade.

Now if you don't mind I reckon I'm gonna Alta Vista search for CDNow and then webvan something from pets.com

[–] Tb0n3@sh.itjust.works 16 points 2 months ago

Always has been.

[–] Feyd@programming.dev 15 points 2 months ago

🌏👨‍🚀🔫👨‍🚀🌌

[–] misk@piefed.social 9 points 2 months ago
[–] etherphon@piefed.world 8 points 2 months ago* (last edited 2 months ago)

Is it really a boom when it's literally forced upon you unwillingly? Shoehorned into all the products you used to like?

[–] IrateAnteater@sh.itjust.works 7 points 2 months ago (3 children)

For some it will be. For the pure AI software companies, yes. For the hardware vendors and data centers, less so. Even if it's not for generative AI, there will always be a need for hyperscale compute.

[–] misk@piefed.social 3 points 2 months ago (1 children)

What’s the commercial use for the current capacity of hyperscale compute?

[–] brucethemoose@lemmy.world 3 points 2 months ago* (last edited 2 months ago) (1 children)

Not a lot? The quirk is that they’ve hyper-specialized these nodes for AI.

The GPU boxes are useful for some other things, but they will be massively oversupplied, and they mostly aren’t networked like supercomputer clusters.

Scientists will love the cheap CUDA compute though. I am looking forward to a hardware crash.

[–] misk@piefed.social 2 points 2 months ago* (last edited 2 months ago) (1 children)

That’s what I figured, but I was open to hearing how data centers won’t go bankrupt when the current VC / investor money stops propping up the AI arms race. I’m not even sure lots of existing hardware won’t go to waste, because there’s seemingly not enough power infrastructure to feed it and big tech corpos are building nuclear reactors (on top of restarting coal power plants…). Those reactors might be another silver lining, though, similar to cheap compute becoming available for scientific applications.

[–] brucethemoose@lemmy.world 2 points 2 months ago

because there’s seemingly not enough power infrastructure

This is overblown. I mean, if you estimate TSMC’s entire capacity and assume every data center GPU they make runs at full TDP 100% of the time (which is not true), the net consumption isn’t that high. The local power/cooling infrastructure issues are more about corpo cost cutting.
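A rough back-of-envelope of that argument (every number below is my own loose, illustrative assumption, not a figure from the comment or any source): even if you assume millions of data-center GPUs shipped per year, all running at full TDP around the clock, the fleet draws a few gigawatts, which is a fraction of a percent of global electricity generation.

```python
# Back-of-envelope: worst-case power draw of the data-center GPU fleet.
# Every figure here is an assumed, illustrative number -- not sourced data.

gpus_shipped_per_year = 5_000_000   # assumed annual data-center GPU shipments
tdp_watts = 700                     # assumed per-GPU TDP (H100-class)
pue = 1.3                           # assumed overhead for cooling/power delivery

# Pessimistic case: every GPU runs at full TDP 100% of the time.
fleet_gw = gpus_shipped_per_year * tdp_watts * pue / 1e9
print(f"Fleet draw at full TDP: ~{fleet_gw:.1f} GW")

# Global electricity generation is on the order of 30,000 TWh/year.
global_avg_gw = 30_000_000 / 8760   # GWh per year divided by hours per year
print(f"Share of global generation: ~{fleet_gw / global_avg_gw:.2%}")
```

Under these assumptions the new fleet works out to roughly a tenth of a percent of world generation, which is the shape of the argument above; local grid and cooling constraints are a separate, site-level problem.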

Altman’s preaching that power use will be exponential is a lie that’s already crumbling.

But there is absolutely precedent for underused hardware flooding the used markets, or getting cheap on cloud providers. Honestly this would be incredible for the local inference community, as it would give tinkerers (like me) affordable hardware to actually experiment with.

[–] brucethemoose@lemmy.world 3 points 2 months ago* (last edited 2 months ago)

I mean, GPU box hardware prices will plummet if there’s a crash, like they did after the crypto GPU mining crash.

That’s how I got my AMD 7950 for peanuts. And an Nvidia 980 Ti!

I am salivating over this. I am so in for a fire sale MI300 or A100.

[–] bacon_pdp@lemmy.world 3 points 2 months ago (1 children)

When entire state governments can fit in a single rack, why bother?

[–] IrateAnteater@sh.itjust.works 4 points 2 months ago (1 children)

An entire state government could fit in your cellphone. That's never been one of the use cases for data-center-level compute.

[–] heyWhatsay@slrpnk.net 5 points 2 months ago

It's like bubble wrap, because someone is gonna have fun popping it.

[–] DrFistington@lemmy.world 4 points 2 months ago

ROFL. iS iT a bUBblE?!?

Always has been

[–] homesweethomeMrL@lemmy.world 2 points 2 months ago
[–] fubarx@lemmy.world 2 points 2 months ago