this post was submitted on 17 Mar 2026
217 points (97.4% liked)

Games

47214 readers
2130 users here now

Welcome to the largest gaming community on Lemmy! Discussion for all kinds of games. Video games, tabletop games, card games etc.

Rules

1. Submissions have to be related to games

Video games, tabletop, or otherwise. Posts not related to games will be deleted.

This community is focused on games, of all kinds. Any news item or discussion should be related to gaming in some way.

2. No bigotry or harassment, be civil

No bigotry; we take a hardline stance on this. Try not to get too heated when entering into a discussion or debate.

We are here to talk about and discuss one of our passions, not to fight or be exposed to hate. Posts or responses that are hateful will be deleted to keep the atmosphere good. If the rule is repeatedly violated, not only will the comment be deleted but a ban will be handed out as well. We judge each case individually.

3. No excessive self-promotion

Try to keep it to 10% self-promotion / 90% other stuff in your post history.

This is to prevent people from posting for the sole purpose of promoting their own website or social media account.

4. Stay on-topic; no memes, funny videos, giveaways, reposts, or low-effort posts

This community is mostly for discussion and news. Remember to search for the thing you're submitting before posting to see if it's already been posted.

We want to keep the quality of posts high. Therefore, memes, funny videos, low-effort posts and reposts are not allowed. We prohibit giveaways because we cannot be sure that the person holding the giveaway will actually do what they promise.

5. Mark Spoilers and NSFW

Make sure to mark your stuff or it may be removed.

No one wants to be spoiled. Therefore, always mark spoilers. Similarly, mark NSFW content in case anyone is browsing in a public space or at work.

6. No linking to piracy

Don't share it here; there are other places to find it. Discussion of piracy is fine.

We don't want us moderators or the admins of lemmy.world to get in trouble for linking to piracy. Therefore, any link to piracy will be removed. Discussion of it is of course allowed.

founded 2 years ago
top 49 comments
[–] raicon@lemmy.world 67 points 7 hours ago (1 children)

Finally, the slopware generation

[–] 1984@lemmy.today 18 points 6 hours ago

Being young today must be fun. You just missed out on all the good stuff in the '80s, '90s, and 2000s, only to experience the mess we have now.

[–] pennomi@lemmy.world 50 points 7 hours ago (1 children)

I don’t see how this will stay consistent enough for art directors to sign off on it. It’s effectively just a hallucination based on your current video game frame.

[–] Rhaedas@fedia.io 14 points 6 hours ago (3 children)

Unfortunately, the latest stuff I've seen is all about keeping character consistency, which is basically having a fixed frame of reference for every generation. What I don't get not knowing much about the details is how LLM generation is faster than actual 3D modeling with more details? Perhaps overall it is faster per frame to generate a 2D image vs. tracking all the polys.

Not saying which is right to do, there's lots of baggage with discussing AI stuff, just wondering about the actual tech itself.

[–] knightly@pawb.social 17 points 6 hours ago* (last edited 5 hours ago) (1 children)

What I don't get not knowing much about the details is how LLM generation is faster than actual 3D modeling with more details?

It's not; DLSS 5 takes a frame as rendered normally by your GPU and feeds it into a second $3k GPU to run the AI image transformer.

There is no performance benefit; in fact, it adds a bit of latency to the process.

[–] paraphrand@lemmy.world 6 points 3 hours ago

And Nvidia claims it will ship this year without the need for a second 50-series card.

Lots of bullshit being laid out by Nvidia here.

[–] Eggymatrix@sh.itjust.works 3 points 6 hours ago (1 children)

Polys aren't where the expensive computation is. The bottleneck is ray tracing, volumetric fog, and so on: all the things that make a game look more real and natural.

I think this DLSS stuff could potentially substitute for ray tracing and the other light/shadow/reflection/transparency effects that are very expensive to both program correctly and calculate every frame.

My two cents

[–] phailhaus@piefed.social 2 points 4 hours ago* (last edited 4 hours ago) (1 children)

Lighting is the area image gen struggles with most right now. Individual regions will show convincing shadows, atmosphere, etc., but motivation and consistency are lacking. The shots from Hogwarts Legacy show that really clearly. Slice out a random 10% x 10% chunk of the frame and the lighting looks more realistic, but the overall frame loses the directional lighting driven by real things in the scene.

[–] paraphrand@lemmy.world 1 points 3 hours ago* (last edited 3 hours ago)

I’m curious how well it handles lighting from unseen light sources that otherwise didn’t contribute as much to the scene as they should have. In other words, off screen lights that shine into the scene but are not fully rendered by traditional means. Same thing goes for reflections.

I expect a lot of nonsense being hallucinated in those areas.

[–] kromem@lemmy.world 1 points 5 hours ago (1 children)

It's not an 'LLM' (large language model). 🤦

[–] Rhaedas@fedia.io 2 points 5 hours ago (1 children)

I try to avoid the overhyped and wrongly used term AI, so what's the proper term? Related to diffusion models? Something different?

[–] kromem@lemmy.world 5 points 5 hours ago

Neural network would be the most technically accurate given what they've announced so far.

There's no information on whether it's a diffusion or transformer architecture. Though given that DLSS 4.5 introduced a transformer for lighting, my guess would be that it's the same thing being more widely applied. But the technical details haven't been released, from what I've seen, so for the time being it's being described as "neural rendering" using an unspecified neural network.

https://www.nvidia.com/en-us/geforce/news/dlss-4-5-dynamic-multi-frame-gen-6x-2nd-gen-transformer-super-res/

[–] warmaster@lemmy.world 18 points 6 hours ago (1 children)

Nvidia DLSS (Deep Learning Super Sloppificator)

[–] knightly@pawb.social 8 points 6 hours ago

Double Latency Slow Sloppification.

[–] wraithcoop@programming.dev 7 points 5 hours ago* (last edited 5 hours ago) (1 children)

I think Jensen said during a presentation a long time ago that a long-term goal was to have graphics be entirely generated. This seems like a big step toward that.

[–] yggstyle@lemmy.world 7 points 4 hours ago

Nvidia realized they hit a wall on rendering... so they went all in on something you can't properly benchmark, especially against competitors AND prior generations.

Nvidia went full snake oil... And really would like people not to notice.

[–] NONE_dc@lemmy.world 23 points 7 hours ago (5 children)

Personally, I don't know much about this technology. That said, I've heard that the original purpose of DLSS was to improve gaming performance, give you more FPS, and so on.

In that sense, many—myself included—are wondering: how is this slop generator going to improve game performance? How is giving Grace from RE9 a totally different face with makeup on going to improve my gaming experience?

[–] inclementimmigrant@lemmy.world 13 points 6 hours ago* (last edited 6 hours ago)

All of this upscaling, when it was presented over a decade ago, was meant to give older cards a longer lease on life. Now it's morphed into the mandatory way to get a stable framerate, since developers can just rely on DLSS (and to a lesser extent FSR) to get them to an acceptable framerate instead of optimizing.

As for how this will improve the gaming experience? I honestly don't see it at this point. Back when that was the original goal, sure; now, with this "ChatGPT moment for graphics," I see it as beneficial only to corporate parasites and "shareholder value" as we wave goodbye to artistic vision and everything ends up looking like AI OnlyFans.

[–] SalamenceFury@piefed.social 16 points 7 hours ago

It makes "I fixed this ugly FEMALE character" chuds happy and that's what matters to them.

[–] zewm@lemmy.world 10 points 7 hours ago (1 children)

Saves payroll on artists which means more profits. #winning

[–] snooggums@piefed.world 2 points 6 hours ago

The artists are still necessary for the slop generator to have something to slop all over.

[–] affenlehrer@feddit.org 6 points 6 hours ago

I think the idea is that you could use low-resolution/low-detail models that take up less RAM and are faster to process, and DLSS hallucinates a high-res image.

[–] Rhaedas@fedia.io 2 points 6 hours ago (1 children)

Not a fan of reimagined stuff of any sort; it usually doesn't hit well. But from a tech standpoint, I can think of ways one could use the tech to improve performance in new games. Making a game run faster or feel more realistic is usually all about fooling the player: not drawing what can't be seen, showing hints of things that aren't really there. Hell, that's been true for movies and even the stage, right?

So my thought on how this could work is to have the actual core models be lower polys, enough for details but not as high as the best we've seen done, and minimal texturing. Then the generator uses that as a base to form the image it puts over the top. Still don't see how that can be done that fast, but apparently we're there now.

[–] NONE_dc@lemmy.world 6 points 6 hours ago

The problem I see is consistency. Whatever the AI generates for a given source won’t be consistent throughout the game. Even in the original Digital Foundry video, you can see how Grace’s face looks like a totally different person depending on the distance from which it’s viewed.

The artistic style is supposed to solve that consistency issue, but this AI is ruining it.

(Also, in the same video, you can see they were using two 5090s to run the DLSS 5 games, so...)

[–] BillyClark@piefed.social 15 points 6 hours ago

I find it interesting that the AI parts of the video have very little video in them. They have the original game moving along, and then they show the AI version and mostly keep it as a still. I suspect that they did this so that you can't do a side-by-side comparison and see that the AI version doesn't actually play as well as the original version.

Also, I've got to wonder about how it must feel to be an artist who worked on one of these games, and watch the thing you carefully hand-tuned to match the artistic vision of the game design be replaced by the mindless addition of wrinkles.

Aren’t video games art? Why does Nvidia think it knows better than the artists? And maybe I'm the only one, but DLSS has never made any of my games look better. It's always looked worse, and I now permanently turn that shit off and regret paying a 20 percent higher price for something I don't use. I don't usually agree with IGN articles, but this one hits it on the head.

[–] Sanguine@lemmy.dbzer0.com 9 points 7 hours ago (1 children)

Easy to avoid nvidia products these days anyway. Overpriced, marginal performance over AMD, and tied to AI slop globally. Never been so easy to vote with your wallet.

[–] Shadowcrawler@discuss.tchncs.de 2 points 6 hours ago

That would only work if they still gave a shit about gamers or the consumer market in general. They've become a full AI datacenter company; that's where they make the money. I don't think they'll even be producing consumer graphics cards in 5 years.

[–] KneeTitts@lemmy.world 6 points 6 hours ago

Everything tech bros touch, dies

[–] webghost0101@sopuli.xyz 5 points 6 hours ago

The upside of enshittification in newer games and crap like this is that I'm less interested in playing the new stuff and enjoy replaying older games all the more.

Which also run at peak performance.

[–] tyrant@lemmy.world 4 points 6 hours ago (1 children)

It's crazy to watch these developers march everyone's jobs, including their own, off a cliff.

[–] inclementimmigrant@lemmy.world 4 points 6 hours ago (1 children)

I don't think developers and artists have a choice when they're being told by the publishers to do it.

[–] tyrant@lemmy.world 2 points 5 hours ago

I get it, but there's always a choice.

[–] SalamenceFury@piefed.social 4 points 7 hours ago (1 children)

And there's still people who think this is a good thing and an evolution in graphics. You can guess what their political alignments are pretty quickly.

[–] glitzer_gadze@feddit.org 1 points 6 hours ago

This article speaks to my heart

[–] Dariusmiles2123@sh.itjust.works 1 points 6 hours ago (3 children)

To be honest it looks great and at first I thought about how impressive it is.

Then you think about how it modifies what was created and how it could lack consistency with a character looking a certain way in a part of the game and differently later.

I don’t know what to think about these technologies like FSR and DLSS...

I don’t think I mind the fake frames, as they could give you more performance on something like a Steam Deck, but the rest is tricky...

[–] Player2@sopuli.xyz 2 points 5 hours ago

The problem is that these 'features' give you the fake appearance of performance but at the cost of actual performance. And they aren't even good in the first place. Upscaling is like smearing vaseline all over your screen and saying "see how good it looks" when it's just a blurry mess, especially with temporal elements. Frame generation gives artificial smoothness but doesn't help input latency which is the part of frame rate that actually matters.

The kicker is that it costs real frames to generate fake ones. The game ends up looking and feeling worse.
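Player2's latency point can be put into rough numbers (a simplified sketch with assumed figures, not measurements of DLSS itself): if a game renders real frames at 30 fps and frame generation inserts one generated frame between each pair, the displayed rate doubles, but input is only sampled on real frames, so the input-latency floor stays tied to the real-frame interval.

```python
# Illustrative model of frame generation vs. input latency.
# Assumption: input is sampled once per *real* rendered frame, and one
# generated frame is inserted between each pair of real frames.

base_fps = 30                      # real rendered frames per second (assumed)
real_frame_ms = 1000 / base_fps    # interval between input samples: ~33.3 ms

displayed_fps = base_fps * 2               # 1 generated frame per real frame
displayed_frame_ms = 1000 / displayed_fps  # interval between displayed frames: ~16.7 ms

# The screen looks twice as smooth, but the input-latency floor is unchanged.
print(f"input latency floor:      {real_frame_ms:.1f} ms")
print(f"displayed frame interval: {displayed_frame_ms:.1f} ms")
```

Under these assumptions the displayed frame interval halves while the input sampling interval does not, which is why generated frames add smoothness without improving responsiveness.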

[–] snooggums@piefed.world 2 points 5 hours ago

I dislike upscaling; it always looks off to me and frequently makes the edges of things weird, kind of like aliasing but different. After upgrading from an RTX 3060, where I only used it for a few games that were hectic enough not to notice it as much, to a 9070 XT, I turned off upscaling globally and run everything at native resolution. Games look so much better to me without DLSS or FSR: fewer weird artifacts causing distractions.

I'll take 20 fewer fps to not constantly feel like something is off, even if I can't point out exactly what it is.

[–] foodandart@lemmy.zip 2 points 6 hours ago

The magic here is that in most games' settings you always have the option to turn it off.

[–] krisevol@lemmus.org -3 points 6 hours ago

Nvidia filters have been a thing for years. This isn't any different.

[–] HerrVorragend@lemmy.world -1 points 6 hours ago* (last edited 6 hours ago) (1 children)

I am not completely against this technology if it can be used to 'remaster' older games.

In the current climate of rising hardware prices, it just seems unnecessary and tone-deaf to introduce such tech that requires not one, but TWO top of the line GPUs to run.

Yes, Nvidia is hugely into AI and machine learning, but they might be known as Slopvidia soon if they keep forcing this stuff on gamers and developers.

[–] knightly@pawb.social 5 points 6 hours ago* (last edited 5 hours ago)

It can't; remastering games is an entirely different process that requires artistic direction. Piping a game's video output through an AI filter just takes the original game and smears a bunch of slop over it.

[–] leave_it_blank@lemmy.world -1 points 7 hours ago (1 children)

If this could be applied to games like Return to Castle Wolfenstein, System Shock 2, Soldier of Fortune, Medal of Honor, Tron 2.0, Call of Duty, and all the other titles from that era, I would not mind in the slightest.

A man can dream...

[–] inclementimmigrant@lemmy.world 10 points 6 hours ago (1 children)

Well, the last time that was done, we ended up with the GTA Trilogy mess, with its AI upscaling/remastering.

[–] leave_it_blank@lemmy.world 1 points 5 hours ago

I did not mean that crap of course!

But if technology existed that could enhance the graphics of old games to look like stuff from 2016 or so, that would be pretty neat. I would definitely use it!

But that won't happen I guess...