Technology
Which posts fit here?
Anything that is at least tangentially connected to technology: social media platforms, information technology, and tech policy.
Post guidelines
[Opinion] prefix
Opinion (op-ed) articles must use the [Opinion] prefix before the title.
Rules
1. English only
Titles and associated content have to be in English.
2. Use original link
The post URL should be the original link to the article (even if paywalled), with archived copies left in the body. This helps avoid duplicate posts when cross-posting.
3. Respectful communication
All communication has to be respectful of differing opinions, viewpoints, and experiences.
4. Inclusivity
Everyone is welcome here regardless of age, body size, visible or invisible disability, ethnicity, sex characteristics, gender identity and expression, education, socio-economic status, nationality, personal appearance, race, caste, color, religion, or sexual identity and orientation.
5. Ad hominem attacks
Any kind of personal attack is expressly forbidden. If you can't argue your position without attacking a person's character, you have already lost the argument.
6. Off-topic tangents
Stay on topic. Keep it relevant.
7. Instance rules may apply
If something is not covered by the community rules but is against the lemmy.zip instance rules, those rules will be enforced.
Companion communities
!globalnews@lemmy.zip
!interestingshare@lemmy.zip
If someone is interested in moderating this community, message @brikox@lemmy.zip.
For some it will be. For the pure AI software companies, yes. For the hardware vendors and data centers, less so. Even if it's not for generative AI, there will always be a need for hyperscale compute.
When entire state governments can fit in a single rack, why bother?
An entire state government could fit on your cellphone. That's never been one of the use cases for data-center-scale compute.
An entire state government could run on your phone, but it requires an entire data center because it's written in JavaScript that emulates the original COBOL code that ran the government in the 1960s.
"State services" is database lookups and billing. Back in the 90's, I supported 10k users (1.5k active at any moment) on a Pentium 3 with 512MB of Ram.
US population has grown 25% from the year 2000. Other than Anti AI detection, everything worked on the hardware of 25 years ago. Single core performance has gone up more than 25% over the past 25 years.
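For concreteness, here's a rough back-of-envelope sketch of that scaling argument in Python. The user counts and the 25% population figure come from the comments above; the 10x single-core speedup is an assumed, deliberately conservative illustration, not a benchmark.

```python
# Rough back-of-envelope sketch of the scaling argument above.
# Numbers are taken from the comments or are labelled assumptions;
# this is not a real capacity plan.

users_1990s       = 10_000  # users supported on one Pentium III box (from the comment)
concurrent_1990s  = 1_500   # active at any moment (from the comment)
population_growth = 1.25    # US population up ~25% since 2000 (from the comment)

# Assumption: demand scales roughly linearly with population.
users_today      = users_1990s * population_growth       # ~12,500
concurrent_today = concurrent_1990s * population_growth  # ~1,875

# Assumption: a modern server core is conservatively 10x a Pentium III core
# (actual single-thread gains over 25 years are far larger than 25%).
single_core_speedup = 10

headroom = single_core_speedup / population_growth
print(f"Users to serve today: ~{users_today:,.0f} ({concurrent_today:,.0f} concurrent)")
print(f"Per-core headroom vs. the 1990s box: ~{headroom:.0f}x")
```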
What’s the commercial use for the current capacity of hyperscale compute?
Not a lot? The quirk is that they’ve hyper-specialized the nodes around AI.
The GPU boxes are useful for some other things, but they will be massively oversupplied, and they mostly aren’t networked like supercomputer clusters.
Scientists will love the cheap CUDA compute though. I am looking forward to a hardware crash.
That’s what I figured, but I was open to hearing how data centers won’t go bankrupt when the current VC/investor money stops propping up the AI arms race. I’m not even sure a lot of the existing hardware won’t go to waste, because there’s seemingly not enough power infrastructure to feed it, and big tech corpos are building nuclear reactors (on top of restarting coal power plants…). Those reactors might be another silver lining, though, similar to cheap compute becoming available for scientific applications.
This is overblown. I mean, even if you take TSMC’s entire capacity and assume every data center GPU they make runs at full TDP 100% of the time (which is not true), the net consumption isn’t that high. The local power/cooling infrastructure problems are more about corpo cost cutting.
Altman’s preaching that power use will be exponential is a lie that’s already crumbling.
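For scale, here's a deliberately pessimistic sketch of the kind of worst-case estimate being described. The annual GPU count is a hypothetical assumption, and the 700W TDP is roughly an H100-class accelerator, so treat the output as an order-of-magnitude illustration only.

```python
# Very rough worst-case power estimate, as described above.
# The GPU count is a hypothetical assumption, not a TSMC figure.

gpus_per_year = 5_000_000  # assumed annual data-center GPU output (hypothetical)
tdp_watts     = 700        # roughly an H100-class accelerator at full TDP

worst_case_gw = gpus_per_year * tdp_watts / 1e9  # gigawatts if all run flat out
print(f"Worst case for one year's GPUs at 100% TDP: ~{worst_case_gw:.1f} GW")

# For scale: US installed generating capacity is on the order of 1,000+ GW,
# so with these assumptions the worst case is a fraction of a percent of that.
share = worst_case_gw / 1_000
print(f"Share of ~1,000 GW of US capacity: ~{share:.1%}")
```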
But there is absolutely precedent for underused hardware flooding the used markets, or getting cheap on cloud providers. Honestly, this would be incredible for the local inference community, as it would give tinkerers (like me) genuinely affordable hardware to experiment with.
I mean, GPU box hardware prices will plummet if there’s a crash, like they did after crypto GPU mining.
That’s how I got my AMD 7950 for peanuts. And an Nvidia 980 Ti!
I am salivating over this. I am so in for a fire-sale MI300 or A100.