this post was submitted on 23 May 2025
75 points (93.1% liked)

PC Gaming


For PC gaming news and discussion. PCGamingWiki

Rules:

  1. Be Respectful.
  2. No Spam or Porn.
  3. No Advertising.
  4. No Memes.
  5. No Tech Support.
  6. No questions about buying/building computers.
  7. No game suggestions, friend requests, surveys, or begging.
  8. No Let's Plays, streams, highlight reels/montages, random videos or shorts.
  9. No off-topic posts/comments, within reason.
  10. Use the original source, no clickbait titles, no duplicates. (Submissions should be from the original source if possible, unless from paywalled or non-English sources. If the title is clickbait or lacks context, you may lightly edit the title.)

founded 2 years ago
top 50 comments
[–] Fizz@lemmy.nz 1 points 4 hours ago* (last edited 4 hours ago)

I've got 16GB of VRAM and a 2K monitor, and this tracks pretty accurately. I almost never use over 8GB. The only games where I can break 10GB are ones with a setting (designed for old PCs) that loads all the textures into VRAM.

[–] edgemaster72@lemmy.world 6 points 8 hours ago (1 children)

Then put 8GB in a 9060 non-XT and sell it for $200. You're just wasting dies that could've been used to make more 16GB cards available (or at least a 12GB version instead of 8).

[–] DtA@lemmy.ca 1 points 2 hours ago* (last edited 2 hours ago)

That wouldn't work. AMD uses a lot of cheap, lower-speed memory chips in unison to achieve high bandwidth; that's why their cards have more VRAM than Nvidia's. It's not that the amount matters, but that more memory chips working together can reach higher speeds.

Nvidia uses really expensive, high-speed chips, so they can use fewer memory chips to get the same memory speed.

Then AMD lied to and manipulated gamers by advertising that you need 16GB of VRAM.

Memory speed > memory amount
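
A rough sketch of that arithmetic (illustrative chip counts and data rates, not actual AMD or Nvidia specs):

```python
# Back-of-envelope GDDR bandwidth math. Total bandwidth scales with
# the number of 32-bit memory chips just as much as with per-chip
# speed, so many slow chips can match a few fast ones.

def bandwidth_gb_s(num_chips: int, gbps_per_pin: float, pins_per_chip: int = 32) -> float:
    """Total bandwidth in GB/s: bus width (bits) * per-pin data rate / 8 bits per byte."""
    bus_width_bits = num_chips * pins_per_chip
    return bus_width_bits * gbps_per_pin / 8

wide_and_slow = bandwidth_gb_s(num_chips=8, gbps_per_pin=16)    # 256-bit bus, cheaper chips
narrow_and_fast = bandwidth_gb_s(num_chips=4, gbps_per_pin=32)  # 128-bit bus, pricier chips

print(f"8 x 16 Gbps chips: {wide_and_slow:.0f} GB/s")    # 512 GB/s
print(f"4 x 32 Gbps chips: {narrow_and_fast:.0f} GB/s")  # 512 GB/s
```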

[–] Kolanaki@pawb.social 34 points 16 hours ago (1 children)

Tell that to game developers. Specifically the ones that routinely don't optimize shit.

[–] DriftingLynx@lemmy.ca 6 points 13 hours ago

Or to gamers who insist on playing these unoptimized games at max settings: $80 for the game, then another $1000 on a GPU that can run it.

[–] ryper@lemmy.ca 93 points 19 hours ago (3 children)

The full tweet:

Majority of gamers are still playing at 1080p and have no use for more than 8GB of memory. Most played games WW are mostly esports games. We wouldn't build it if there wasn't a market for it. If 8GB isn't right for you then there's 16GB. Same GPU, no compromise, just memory options.

I don't think he's that far off; eSports games don't have the same requirements as AAA single-player games.
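
For a sense of scale: the resolution-dependent part of VRAM use is small, and it's textures that eat memory. A rough back-of-envelope sketch, assuming a made-up but plausible G-buffer layout (~5 full-screen color targets plus a depth buffer; real engines vary a lot):

```python
# Rough render-target VRAM cost by resolution, assuming ~5 full-screen
# color targets at 4 bytes/pixel plus a 32-bit depth buffer. The layout
# is an assumption for illustration, not any real engine's.

RESOLUTIONS = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}

def render_targets_mib(width: int, height: int, color_targets: int = 5) -> float:
    pixels = width * height
    color_bytes = pixels * 4 * color_targets  # RGBA8-class targets
    depth_bytes = pixels * 4                  # 32-bit depth/stencil
    return (color_bytes + depth_bytes) / 2**20

for name, (w, h) in RESOLUTIONS.items():
    print(f"{name}: ~{render_targets_mib(w, h):.0f} MiB of render targets")
# 1080p: ~47 MiB, 1440p: ~84 MiB, 4K: ~190 MiB. Tens to hundreds of MiB
# either way; the gigabytes go to texture pools, which is why
# texture-heavy AAA games blow past 8GB while esports titles don't.
```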

[–] Alphane_Moon@lemmy.world 56 points 18 hours ago (2 children)

This is a much more nuanced take than the headlines imply.

[–] snoons@lemmy.ca 29 points 17 hours ago

Are you saying journalists will publish articles with inflammatory headlines to maximize engagement with their ad-based website funding? Nah way, I don believe it.

[–] insomniac_lemon@lemmy.cafe 5 points 15 hours ago* (last edited 13 hours ago)

I still see it as an issue of pricing and the questionable value of a bottlenecked part (compared to older/used/already-owned hardware), particularly when it ends up with users who aren't esports players (for a multitude of reasons). In other words: stagnation.

It's more obvious with AMD still selling new 4GB cards in the budget category rather than ultra-budget; they aren't raising the floor. The jokes still work:

Sheen (from the show Jimmy Neutron) as AMD holding a GPU in the air: "This is the 6500XT!" Teacher: "AMD, this is the 5th year in a row that you've launched the RX 480"

EDIT: There were even Polaris GPUs with 8GB

[–] SharkAttak@kbin.melroy.org 10 points 16 hours ago

In this case, Intel's options of 10/12GB sound like a more reasonable middle ground.

[–] inclementimmigrant@lemmy.world 4 points 17 hours ago* (last edited 12 hours ago)

Counterpoint.

https://prosettings.net/blog/1440p-competitive-gaming/

https://www.pcguide.com/news/benq-says-1080p-is-still-a-sweet-spot-resolution-despite-more-pc-gamers-upgrading-to-1440p/

Increased resolution has been the trend for a bit now even in these competitive games.

ETA: let's also not pretend that those who play esports games only play esports games.

[–] fox2263@lemmy.world 12 points 13 hours ago

Do you just not want more money?

Nvidia have dropped the ball epically and you have a golden opportunity to regain some GPU share here.

[–] SomeRandomNoob@discuss.tchncs.de 12 points 14 hours ago (1 children)

IMHO the problem is only partly the 8GB of VRAM (for 1080p). At least an equal part of the problem is the shitty optimisation of some game engines, especially Unreal Engine 5.

[–] MangoPenguin@lemmy.blahaj.zone 2 points 14 hours ago* (last edited 14 hours ago) (1 children)

Yeah, seeing a cool game and then seeing it's made in UE5 really puts a damper on things. I wish the engine had more work put into performance optimization.

[–] ILikeBoobies@lemmy.ca 2 points 8 hours ago (1 children)

What would you do to optimize it more?

[–] Soleos@lemmy.world 1 points 8 hours ago (1 children)

I would ask ChatGPT to review the source code and optimize it 😈

Task failed Successfully! Now 8TB is the minimum. Well at least it's still 8. :)

[–] r00ty@kbin.life 8 points 16 hours ago

"8gb ought to be enough for anybody"

[–] mormund@feddit.org 15 points 19 hours ago (1 children)

Guess I'll stick with my GTX 1070 Ti until next century, when GPU manufacturers have passed the bong to someone else. Prices are insane for the performance they provide these days.

[–] snoons@lemmy.ca 4 points 17 hours ago

Greetings, fellow 1070 Ti user.

[–] CrowAirbrush@lemmy.world 6 points 15 hours ago

I just ditched my 8GB card because it wasn't doing the trick well enough at 1080p, and especially not at 1440p.

So if I get this straight, AMD agrees that they need to optimize games better.

I hate upscaling and frame gen with a passion; it never feels right and often looks messy too.

The First Descendant became a 480p mess whenever there were a bunch of enemies, even though I have a 24GB card and a pretty decent PC to accompany it.

I'm now back to heavily modded Skyrim, and damn do I love the lack of upscaling and frame gen. The Oblivion stutters were a nightmare and made me ditch the game within 10 hours.

[–] mintiefresh@lemmy.ca 14 points 19 hours ago (2 children)

Lmao. AMD out here fumbling a layup.

[–] BombOmOm@lemmy.world 13 points 16 hours ago* (last edited 16 hours ago) (1 children)

Seriously.

All AMD had to do here was create a 12GB and a 16GB version (instead of 8 and 16), then gesture at all the reviews calling the RTX 5060 8GB DOA because of its very limiting VRAM quantity.

8GB VRAM is not enough for most people. Even 1080p gaming is pushing the limits of an 8GB card. And this is all made worse when you consider people will have these cards for years to come.

[Benchmark chart, one of many, from Hardware Unboxed's testing]

[–] dormedas@lemmy.dormedas.com 3 points 14 hours ago

Even worse when you consider the cost difference between 8GB and 16GB can't be that high. If they ate the cost difference and marketed 16GB as the new "floor" for a quality card, they might have eaten NVIDIA's lunch where they can: the low end.
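
As a sanity check on that (the per-GB figure here is an assumption; actual GDDR6 contract pricing isn't public):

```python
# Rough BOM delta for an 8GB -> 16GB card. The per-GB price is an
# assumed ballpark figure, not a quoted contract price.
GDDR6_USD_PER_GB = 2.50  # assumption
extra_gb = 16 - 8
print(f"~${extra_gb * GDDR6_USD_PER_GB:.0f} extra in memory chips")  # ~$20
```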

[–] inclementimmigrant@lemmy.world 2 points 14 hours ago* (last edited 12 hours ago)

I mean honestly, yeah. With a simple 4GB chip they could have won the low end and not screwed over gamers.

They really seem to have forgotten their roots in the GPU market, which is a damn shame.

[–] ILikeBoobies@lemmy.ca 0 points 8 hours ago

I would agree, because 8GB is entry-level for desktop gaming, and most people start at entry level.

[–] original_reader@lemm.ee 6 points 17 hours ago

I wish.

Send one of these guys by my place. I'll show them what 8GB can't do.

[–] xploit@lemmy.world 6 points 17 hours ago (1 children)

Oh, so it's not that many players are FORCED to play at 1080p because AMD's and Novideo's "affordable" garbage can't cope with anything more while keeping a game smooth? Or, better yet, the game detected we're running on a calculator here, so it took pity on us and set the graphics bar low.

[–] insomniac_lemon@lemmy.cafe 3 points 14 hours ago

Hey, give a little credit to our ~~public schools~~ (poorly-optimized eye-candy) new games! (where 10-20GiB is now considered small)

[–] newthrowaway20@lemmy.world 8 points 18 hours ago (2 children)

My 4K TV disagrees. Even upscaling from 1440p, my 10GB is barely enough in new games.

[–] ryper@lemmy.ca 3 points 13 hours ago

Last month's Steam survey had 1080p as the most common primary display resolution at about 55%, while 4K was at 4.57%.

[–] Alphane_Moon@lemmy.world 6 points 18 hours ago

4K is a tiny part of the market. Even 1440p is a small segment (albeit rapidly growing).

[–] MyOpinion@lemm.ee 9 points 19 hours ago (1 children)
[–] capuccino@lemmy.world 1 points 18 hours ago (2 children)
[–] snoons@lemmy.ca 4 points 17 hours ago

that's just like, their opinion

[–] 30p87@feddit.org 9 points 19 hours ago

Oh fuck you AMD. NVidia fucked up with the 4060 already, and again with the 5060.

[–] barusu@lemmy.ca 7 points 19 hours ago (2 children)

Tell that to my triple 1440p screen flight simulator!

[–] hddsx@lemmy.ca 6 points 18 hours ago

Have you tried buying three graphics cards?

[–] Ulrich@feddit.org 4 points 18 hours ago (1 children)

Most gamers aren't doing that. You can get a very good idea of what they're doing by looking at Steam hardware surveys.

[–] inclementimmigrant@lemmy.world 3 points 14 hours ago (1 children)

Most gamers are stuck with lower-end hardware because they can't afford anything else anymore.

[–] Ulrich@feddit.org 2 points 14 hours ago
[–] SharkAttak@kbin.melroy.org 2 points 16 hours ago

So the ones who made VGAs do more and more stuff, like they were small separate PCs, and pushed for "1440p Ultra Gaeming!!!1!" are now telling us that nah, 8GB is enough?
