this post was submitted on 27 Nov 2025
376 points (99.2% liked)

PC Gaming

top 41 comments
[–] SalamenceFury@lemmy.world 83 points 6 days ago* (last edited 6 days ago)

Honestly, this whole "AMD sells more!" / "Well, Nvidia sells more in THIS place!" fight is a fool's errand. Different parts of the world have different pricing, which shifts which brand/card is the better value for the money, and it also depends on local tax laws and a ton of other variables. Brazil, for example, has somewhat lower AMD prices overall because AMD has an assembly plant here, something neither Nvidia nor Intel has, and so their GPUs/CPUs are more expensive.

[–] Lumisal@lemmy.world 55 points 6 days ago (4 children)

They'd probably sell even more if their naming scheme were less confusing.

[–] fleem@piefed.zeromedia.vip 20 points 6 days ago

so that's not just me being unable to keep a good hierarchy of their shit in my thought cabinet?

[–] 87Six@lemmy.zip 9 points 6 days ago (1 children)

How is it confusing?

Well, minus the part where there are 8GB versions with no name difference...

[–] Anivia@feddit.org 4 points 5 days ago (1 children)

It's confusing because they constantly change their numbering scheme. Nvidia has stayed consistent for the last 17 years.

[–] 87Six@lemmy.zip 3 points 5 days ago

Well, yeah... but Nvidia didn't need to change their naming scheme, because they use it to mislead people.

A 5080 today is equivalent to what a 1060 used to be in terms of raw silicon, if I remember correctly.

[–] Evotech@lemmy.world 6 points 6 days ago (2 children)

It's just "confusing" because people are used to Nvidia's naming.

[–] boonhet@sopuli.xyz 4 points 6 days ago (3 children)

I thought it was confusing because this is the 3rd changeup to their naming in recent years.

RX 580 to RX Vega 56 and 64 to RX 5700 to 9070. Yes, it's still the intuitive "bigger number better" and "first number is generation", but I can see how people might be frustrated with it, whereas Nvidia has consistency.

[–] frongt@lemmy.zip 4 points 5 days ago (1 children)

Consistency? Nvidia does the exact same thing, so yeah, I guess that's consistent. So much for that reliable numbering, with the Nvidia GeForce Titan, Nvidia Titan, and Nvidia Titan RTX (yes, those are all very different cards).

And don't even get me started on the workstation Quadro card naming, with the Quadro RTX 4000, RTX A4000, RTX 4000 (Ada generation), and RTX PRO 4000 Blackwell.

[–] Evotech@lemmy.world 2 points 5 days ago

Then you have the server cards: L4, L40, A100, H100, then the RTX 6000 Blackwell (the workstation and server editions have the same name but are different cards).

It goes on

[–] cron@feddit.org 3 points 5 days ago

Don't forget the Radeon VII (announced 2019) that didn't fit in any scheme.

I think they had to make that change because there was a Radeon 9700 back in the ATI days or something. I just wish they hadn't done the RX thing, because those charts where people compare GPU performance end up with legends like:

RX-7600 RTX-5060 RX-9060XT TXT-6060RXT TVX-5040T RTX-4060

and you say, "There's no consistency with generation or manufacturer, and I'm pretty sure one of those is the part number of a cylinder head for a Toyota Tacoma."

[–] victorz@lemmy.world 3 points 6 days ago (1 children)

Well, the good thing is that the 9070 series is the one trying to adopt a more Nvidia-like naming scheme.

[–] Evotech@lemmy.world 1 points 5 days ago (1 children)

Yeah, it has two names. But it's slightly less confusing.

[–] victorz@lemmy.world 1 points 5 days ago (1 children)

The 9070 had two names? You're talking about the regular, the XT, and GRE?

[–] dream_weasel@sh.itjust.works 1 points 5 days ago (1 children)

I am not retaking the GRE to use that card.

[–] victorz@lemmy.world 1 points 5 days ago (1 children)

I'm wooshing. Is that some kind of test where you live?

[–] dream_weasel@sh.itjust.works 2 points 5 days ago

Yeah, it's the exam you take for grad school in the US for some colleges. I think it's the Graduate Record Examination, but it's been a few years and I can't spare 10 seconds to look lol.

[–] randombullet@programming.dev 4 points 5 days ago

I can't wait for an RTX 9070 to replace my RX 9070 XT!

Just like how the Z890 chipset from Intel is not the same generation as the X870 chipset from AMD.

[–] hanzo@lemmy.dbzer0.com 18 points 5 days ago

I use AMD mainly because its FLOSS values are closer to mine than Nvidia's, but I wish nearly everything in late-stage capitalism weren't a duopoly. Really a monopoly, when the oligarchs own them both via multinational asset managers. I agree with the comments about older cards and the future of gaming, though; I think the AMD vs. Nvidia conversation is going to move away from gaming as CUDA vs. ROCm continues to evolve.

[–] 30p87@feddit.org 23 points 6 days ago (1 children)

Maybe AMD is cheaper at Mindfactory than other resellers, and with NVidia it's the other way around?

[–] utjebe@reddthat.com 8 points 6 days ago (2 children)

At least in the EU, it could just have been the right pricing, with 9070 XT models in the €800-900 range. Still an outrageous price...

Then you have the 5070, the "low" model of Nvidia's 50 series, which only matches the 9070 XT plus "you can do AI and RTX", and the 5080, arguably better than the 9070 XT but in most cases over €1k.

[–] Noja@sopuli.xyz 3 points 6 days ago

The RX 9070 XT starts at ~590 €, and the sales data is just for the last week.

[–] majari42@lemmy.world 3 points 6 days ago

They sell for €639 in the EU now (no Black Friday sale). I've just got one on the way for €599 (Asus).

Given the rumors about VRAM scarcity, I thought this was the right time to get my hands on one. Coming from a 5700 XT, it's a welcome upgrade.

[–] kindenough@kbin.earth 13 points 6 days ago

Great card. I am surprised how well this card performs on Bazzite.

[–] TheObviousSolution@lemmy.ca 8 points 5 days ago (1 children)

I saw a Black Friday sale for a 5080 for around €1k, a decent OC model too. I have no interest in supporting NVIDIA or their fire hazards with the way they've been acting.

[–] Blackmist@feddit.uk 5 points 5 days ago

It's an obscene price to pay for a GPU no matter how fast it is.

[–] cron@feddit.org 14 points 6 days ago (3 children)

Is this just this week's exception or is this representative of the retail market?

[–] Truscape@lemmy.blahaj.zone 27 points 6 days ago (2 children)

Might just be a personal perspective, but the majority of Nvidia cards I've seen purchased in my community (not in Germany, though) are second-hand; most people aren't buying the 50 series due to its horrible pricing. People are buying AMD cards new, though, due to their good value proposition.

[–] GenosseFlosse@feddit.org 11 points 6 days ago (2 children)

The power consumption of the 50xx cards is just insane, and this makes it a bad choice in Germany where electricity prices are very high. This means I have to pay a premium for the card, extra for the excessive power demands and potentially extra in summer to keep my gaming room cool.
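The electricity point is easy to put rough numbers on. A minimal sketch, where the tariff, daily hours, and board-power figures are illustrative assumptions (check your own card's rating and your actual tariff):

```python
# Rough annual electricity cost of a GPU under gaming load.
# Assumptions (not measured values):
#   - German household tariff of ~0.40 EUR/kWh
#   - 4 hours of gaming per day, 365 days a year

def annual_cost_eur(watts: float, hours_per_day: float = 4.0,
                    price_per_kwh: float = 0.40) -> float:
    """Estimate yearly electricity cost in EUR for a given power draw."""
    kwh_per_year = watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

# Illustrative board-power classes, not exact specs:
print(round(annual_cost_eur(575), 2))  # a high-end 50-series-class card
print(round(annual_cost_eur(304), 2))  # a 9070 XT-class card
```

At those assumed numbers, the high-power card costs on the order of €150 more per year to run, before counting any extra cooling in summer.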

[–] Truscape@lemmy.blahaj.zone 1 points 6 days ago

That is an important aspect I didn't consider, good catch.

[–] SaveTheTuaHawk@lemmy.ca 0 points 5 days ago

Americans don't think about that. Same reason poor people buy V8 pickups and then complain about being poor.

[–] chiliedogg@lemmy.world 2 points 6 days ago (1 children)

Meanwhile, nobody is talking about the Intel B580, which is actually spectacular for the price.

[–] Truscape@lemmy.blahaj.zone 1 points 6 days ago

I think the limiting factor for that one is availability depending on region. In the US the B580 is an amazing budget card, but AMD has a better distribution network in say LATAM nations.

[–] captain_aggravated@sh.itjust.works 2 points 5 days ago* (last edited 5 days ago)

A couple generations back, Nvidia was the obvious choice and AMD just couldn't compete. Nvidia had real-time ray tracing, AMD didn't, Nvidia's hardware video encoding was great, AMD's sucked, Nvidia had CUDA, AMD pretty much didn't, Nvidia had their DLSS frame gen technology, AMD either didn't or it wasn't very good.

Well, in most of those places, AMD has caught up, and they offer more VRAM in their lower tier products, at better prices.

Oh, and I've been hearing through the grapevine that Nvidia is dropping the ball with drivers. That used to be AMD's bag, but AMD's drivers are more solid these days.

Oh, and Nvidia's weird new power socket keeps catching on fire. I've got a 7900GRE that attaches with two good old 8-pin PCIe connectors that offer a distinct lack of combustion.

[–] DScratch@sh.itjust.works 3 points 6 days ago (1 children)

Nvidia are selling many multiples of this volume to AI providers.

Gaming means essentially nothing in the big picture. (Outside of maybe prestige or similar)

[–] cron@feddit.org 1 points 6 days ago (1 children)

This statistic refers only to enthusiasts who build their own PCs. It does not represent the overall PC market, laptops, or data-center and other business purchases.

[–] Maestro@fedia.io 3 points 6 days ago

Yeah, I just got a new gaming laptop. Nobody sells gaming laptops with AMD GPUs in my country. I looked! So I went with an AMD CPU and a 5070 instead.

[–] BilboBargains@lemmy.world 1 points 4 days ago

The 9070 XT is a great card and much better value than the equivalent Nvidia offering, the last time I checked the benchmarks. Linux compatibility was the other major draw. My system is now 100% AMD, and I'm preparing to trial SteamOS and Bazzite.

[–] nomecks@lemmy.wtf 1 points 5 days ago* (last edited 5 days ago)

NVIDIA made like 4 billion on gaming GPUs and like 58 billion on AI GPUs last quarter. I'm sure 80% of those gaming GPUs were used for AI as well. To them it's likely irrelevant how they're doing in the gaming market.