this post was submitted on 17 Feb 2026
33 points (100.0% liked)
Hardware
6243 readers
130 users here now
All things related to technology hardware, with a focus on computing hardware.
Some other hardware communities across Lemmy:
- Augmented Reality - !augmented_reality@lemmy.world
- Gaming Laptops - !gaminglaptops@piefed.social
- Laptops - !laptops@piefed.social
- Linux Hardware - !linuxhardware@programming.dev
- Mechanical Keyboards - !mechanical_keyboards@programming.dev
- Monitors - !monitors@piefed.social
- Raspberry Pi - !raspberry_pi@programming.dev
- Retro Computing - !retrocomputing@lemmy.sdf.org
- Virtual Reality - !virtualreality@lemmy.world
Rules:
- Follow the Lemmy.world Rules - https://mastodon.world/about
- Be kind. No bullying, harassment, racism, sexism, etc. against other users.
- No spam, illegal content, or NSFW content.
- Please stay on topic; adjacent topics (e.g. software) are fine if they are strongly relevant to technology hardware. Another example would be business news for hardware-focused companies.
- Please try to post original sources when possible (as opposed to summaries).
- If posting an archived version of the article, please include a URL link to the original article in the body of the post.
Icon by "icon lauk" under CC BY 3.0
founded 2 years ago
you are viewing a single comment's thread
So... what's the point of this? Even the OpenAI RAM purchase was bizarre to me, since they've been able to train their LLMs just fine without gobbling up all the RAM; then the SSDs, and now hard disks? It's not even the write caches that sold out, the hard disks themselves did.
I don't think there's any untapped source of data for LLM training, is there? Datasets can't get larger, so I fail to see any reason this would be necessary.
Does anyone know the reason they're buying up storage? No cynical doomerism, please.
Even aside from training datasets, as long as they keep hosting chats for people using these genAIs, their storage will keep expanding over time as people upload and generate more pictures and videos.
I really hope all this urgency to grab as much hardware as possible is because the AI companies see the funding drying up soon and are trying to maximize what they can get out of it, but it's probably more likely they're just focusing more on video now and want to have all of YouTube (etc.) in their training sets.
I'm not sure there are many options other than cynical doomerism for analysing this situation. My uneducated guess? They've probably already run out of real-world data, and are now forced to produce ridiculous amounts of LLM-generated data to try and keep the training process going.
Other alternatives I can think of:
We've known for a while that they're running out of training data, so it makes a lot of sense to either generate more data with models, create it at scale without AI, or collect even more data in even more invasive ways from everyone online. There's literally no other reason I can think of to buy WD's entire 2026 stock of drives two months into the year.