this post was submitted on 04 Feb 2025
97 points (99.0% liked)

PC Gaming


For PC gaming news and discussion. PCGamingWiki

all 24 comments
[–] cyberpunk007@lemmy.ca 46 points 8 months ago (1 children)

Nvidia has been a garbage company for a few generations now. They got to the top and sat up there enshittifying everything because they have a monopoly on the market. Don't buy into this shit.

[–] rtxn@lemmy.world 16 points 8 months ago (1 children)

I'm sure replacement units are in plentiful supply. Right?

[–] ramble81@lemm.ee 10 points 8 months ago (2 children)

Okay so the “S” models stood for “Super” which was a slight step up from the base. What are “D” models? “Duper”?

[–] Psythik@lemmy.world 6 points 8 months ago* (last edited 8 months ago)

China-exclusive models (like the RTX 4090D), labeled as such to get around US export restrictions.

[–] tehWrapper@lemmy.world 8 points 8 months ago (2 children)

It feels like things are so powerful and complex that the failure rates of all these devices are much higher now.

[–] Rekall_Incorporated@lemm.ee 11 points 8 months ago (2 children)

I don't have any stats to back this up, but I wouldn't be surprised if failure rates were higher back in the 90s and 2000s.

We have much more sophisticated validation technologies and the benefit of industry, process and operational maturity.

Would be interesting to actually analyze the real world dynamics around this.

[–] GrindingGears@lemmy.ca 1 points 8 months ago* (last edited 8 months ago) (3 children)

Not very many people had a dedicated GPU in the 90s and 2000s. And there's no way the failure rate was higher; not even LimeWire could melt down the family PC back then. It sure gave it the college try, but it was usually fixable. The biggest failures, bar none, were hard drives or media drives.

[–] Jimmycakes@lemmy.world 2 points 8 months ago

We all did; they used to cost like 60 bucks.

[–] TacoSocks 2 points 8 months ago

Dedicated GPUs were pretty common in the 2000s; they were required for most games, unlike the 90s, which were an unstandardized wild west. The failure rate had to be higher: I know I had 3 cards die with less than 2 years of use on each card in the 2000s. Cases back then had terrible airflow, and graphics demands jumped quickly.

[–] Rekall_Incorporated@lemm.ee 1 points 8 months ago

I was referring to PC components in general.

[–] tehWrapper@lemmy.world 1 points 8 months ago

I am going to guess the number made is also much higher than in the 90s and 2000s, since hardware tech is way more popular and used in way more places in the world. So maybe a lower failure percentage, but a higher total amount.

But I have no idea..
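The point above (a lower failure *rate* can still mean more total failures when far more units ship) can be sketched with some quick arithmetic. All the numbers here are made up purely for illustration:

```python
# Hypothetical shipment volumes and failure rates -- made-up numbers,
# only to show that rate and absolute count can move in opposite directions.
shipped_2005, rate_2005 = 10_000_000, 0.05    # 5% failure rate
shipped_2025, rate_2025 = 100_000_000, 0.01   # 1% failure rate

failures_2005 = int(shipped_2005 * rate_2005)  # 500,000 failed units
failures_2025 = int(shipped_2025 * rate_2025)  # 1,000,000 failed units

# The modern rate is 5x lower, yet total failures are 2x higher,
# so anecdotes about "more dead cards" don't settle which era was worse.
```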

[–] GrindingGears@lemmy.ca 2 points 8 months ago

You are just short of needing a personal-sized nuclear reactor to power these damn things, so the logic follows that the failure rate is going to climb.

[–] Viri4thus@feddit.org 0 points 8 months ago