Grofit

joined 2 years ago
[–] Grofit@lemmy.world 1 points 3 weeks ago

In some games it's super obvious that upscaling is being done and it looks awful, while some games look awful without upscaling (TAA). It's odd that in some cases tools like DLSS can look better than native TAA.

For the most part though it feels like the FLAC vs Opus debate: most people can't notice the difference and don't care enough about it, but some people can tell the difference and want the best. I don't think either side is wrong, it's just down to how much you notice/can tolerate before it's annoying.

[–] Grofit@lemmy.world 8 points 3 weeks ago (2 children)

For some games I'm happy to turn it on as it drops power/temps and provides virtually identical output to native (as far as I can tell anyway), but my fans don't need to go into overdrive mode.

I may even turn on frame gen too if I just want to bump a stable-ish 80-90fps to a stable 120fps, which again drops power and temps slightly. That sometimes does cause smearing, but for the most part I don't notice it enough to be annoyed. Without these features I would probably be running with more power draw and higher temps, and possibly still not hitting 120 even at lower resolutions. As you say though, some games can hit 120 no problem without any of it and the GPU won't be stressed.

[–] Grofit@lemmy.world 11 points 3 months ago

I often wonder if Hideo Kojima is actually a time traveller, as it feels like we are living through the MGS series. With AI and government propaganda and control measures, it feels like we are at the MGS2 phase.

[–] Grofit@lemmy.world 8 points 4 months ago

The consensus seems to be that AMD priced their cards higher expecting Nvidia to price higher than they did.

Then Nvidia priced lower than they expected (still too expensive imo) and AMD needed to react and price their card cheaper. The problem is that retailers had already paid for shipments, so AMD needed to settle some sort of reimbursement process for the soon-to-be out-of-pocket retailers.

This was a big issue for them, but they also realised they could generate more frames if they wanted to and match Nvidia, so they too would be able to claim crazy high FPS figures (it's all nonsense, we care about raster performance).

To be able to do this they needed a couple of months to dev and test it before reviewers get it.

So delaying the launch lets them solve both problems with the extra time and align the narrative with what gamers care about (pure raster performance), but in reality they are also missing a window to gain a market advantage.

[–] Grofit@lemmy.world 2 points 6 months ago

Recently, I would say Roadwarden; it was such a great game with such a unique feel to it.

[–] Grofit@lemmy.world 6 points 6 months ago

What a gem, my friend and I played through the first two Lunar games in university.

I also discovered Wild Arms, which was pretty good and doesn't get much attention compared to JRPGs like FF and Suikoden.

[–] Grofit@lemmy.world 3 points 6 months ago

If these were stories I was picking up to implement I would be asking the BA to elaborate some more 😂

[–] Grofit@lemmy.world 2 points 6 months ago

There have been some decent results historically with checkerboard and other separated reconstruction techniques. Nvidia was working on some new checkerboard approaches before they killed off SLI.
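The core idea behind checkerboard rendering is simple enough to sketch in a few lines. This is a toy illustration in plain Python (nothing like a real renderer, which would also reproject and use motion vectors): each frame shades only the pixels whose checkerboard parity matches the frame index, and fills the gaps from the previous frame's output.

```python
def checkerboard_reconstruct(prev_frame, new_samples, frame_index):
    """Toy checkerboard reconstruction: each frame shades only half
    the pixels (alternating checkerboard pattern per frame) and the
    gaps are filled from the previous frame's output."""
    height, width = len(prev_frame), len(prev_frame[0])
    out = [row[:] for row in prev_frame]  # start from the last frame
    parity = frame_index % 2
    for y in range(height):
        for x in range(width):
            if (x + y) % 2 == parity:
                out[y][x] = new_samples[y][x]  # freshly shaded pixel
    return out

# Two 2x2 frames: only half the pixels are newly shaded each frame.
frame0 = checkerboard_reconstruct([[0, 0], [0, 0]], [[1, 2], [3, 4]], 0)
frame1 = checkerboard_reconstruct(frame0, [[5, 6], [7, 8]], 1)
```

You can see why this maps naturally onto two GPUs: each card only ever shades its half of the checkerboard.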

A decade or two ago most people I knew had dual GPUs; it was quite common for gamers, and while you were not getting the full 2x utilisation, the gain was enough to be noticeable and the prices were not mega bucks back then.

On your point of buying one card vs many, I get that, but it seems like we are reaching some limitations with monolithic dies. Shrinking chips seems to be getting far harder and more costly, so to keep performance moving forward we are now jacking up the power draw etc.

Anyway, the point I'm trying to make is that it's going to become very costly to keep making these more powerful monolithic GPUs, and their power requirements will keep going up. So if the choice is two mid-range GPUs for $500 each or one high-end GPU for $1.5k with possibly higher power usage, I'm not sure it will be as much of a shoo-in as you say.

Also, if multi-chiplet designs are already having to solve the problem of multiple GPU cores communicating and acting like one big one, maybe some of that R&D could benefit high-level multi-GPU setups.

[–] Grofit@lemmy.world 1 points 6 months ago

It was some onboard GPU with my super amazing AMD K6-2; it couldn't even run Mega Man X without chugging. Then a friend gave me an S3 ViRGE with a glorious 4 MB of VRAM.

 

Given the information around how AMD are currently focusing on the mid tier, it got me thinking about their multi-chiplet approach for RDNA5+. They will have to do a lot of work to manage high-speed interconnects and some form of internal scheduler/balancer for the chiplets to split out the work etc.

So with this in mind, if they could leverage that work on interconnects and schedulers at a higher level, as a more cohesive form of Crossfire/SLI, they wouldn't even need to release any high-end cards: they could just sell you multiple mid-tier cards and you daisy-chain them together (within reason). It would allow them to sell multiple cards to individuals, increasing sales numbers, and also let them focus on fewer models, so simpler/cheaper production.

Historically I think the issue with Crossfire/SLI was that, to make best use of it, developers had to do a lot of legwork to spread out loads etc. But if this could somehow be handled at the lower levels, like they do with the chiplets, then maybe it could be abstracted away from developers somewhat, i.e. you designate master/slave GPUs so the OS just treats the main one as a bigger one or something.
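As a toy illustration of that abstraction (pure Python, entirely hypothetical, nothing like real driver code): the OS-visible "big" GPU could be a front that round-robins frames across the physical cards, the way alternate-frame-rendering SLI used to, with the application none the wiser.

```python
class VirtualGPU:
    """Toy alternate-frame scheduler: presents several physical GPUs
    as a single device and round-robins frames between them, hidden
    from the application. Illustrative only, not real driver behaviour."""
    def __init__(self, physical_gpus):
        self.gpus = physical_gpus
        self.frame = 0

    def render(self, scene):
        # Pick the next card in rotation; the caller never sees which.
        gpu = self.gpus[self.frame % len(self.gpus)]
        self.frame += 1
        return gpu.render(scene)

class FakeGPU:
    def __init__(self, name):
        self.name = name
    def render(self, scene):
        return f"{self.name} rendered {scene}"

big = VirtualGPU([FakeGPU("gpu0"), FakeGPU("gpu1")])
frames = [big.render(f"frame{i}") for i in range(4)]
```

The hard part in reality is everything this toy hides: shared state between frames, memory duplication and frame pacing, which is exactly the legwork that used to fall on developers.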

I doubt this is on the cards, but it felt like something that was plausible and worth discussing.

[–] Grofit@lemmy.world 1 points 8 months ago

I'm sure there is a simple answer and I'm an idiot, but given it's in a place that gets lots of sun, can they not just install solar panels with batteries at consumer/grid level?

Or is the problem not with generating the power but with transmitting it to properties? I don't know the cost of solar installation, but given the amount it's costing them when it all fails, I'm sure they could at least incentivise individuals to install solar or something.

[–] Grofit@lemmy.world 3 points 8 months ago

Really enjoying it so far.

I was initially saddened to hear it was going to follow in the footsteps of 15 and be an action-based RPG, as I thought 15 was a brain-dead "warp strike simulator" with horrible story pacing and poor characters (until the last 5% of the game).

This game though has simple but effective action combat with enough variety to be fun and the characters and pacing are a joy.

I still wish we could get some FF games like 7 or 9 where there is depth to equipment, magic and turn-based combat, but JRPGs have been iterating away from complex battle systems and still sell well, so I can't see them going back.

I still think FF7 was the pinnacle, as mixing and matching materia with equipment was really simple and super fun.

Anyway, rant over. FF16 is good, recommend it.

[–] Grofit@lemmy.world 22 points 8 months ago

One point that stands out to me is that when you ask it for code it will give you an isolated block of code to do what you want.

In most real world use cases though you are plugging code into larger code bases with design patterns and paradigms throughout that need to be followed.

An experienced dev can take an isolated code block that does X and refactor it into something that fits in with the current code base etc.; we already do this daily with Stack Overflow.

An inexperienced dev will just take the code block and try to ram it into the existing code in the easiest way possible, without thinking about whether the code could use existing dependencies, whether it's testable, etc.
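To make the difference concrete, here's a small sketch (Python, with hypothetical names; the "existing client" stands in for whatever the codebase already provides):

```python
import json
import urllib.request

# What an assistant typically hands you: a standalone block that
# works in isolation but ignores the surrounding codebase.
def get_user(user_id):
    resp = urllib.request.urlopen(f"https://api.example.com/users/{user_id}")
    return json.loads(resp.read())

# What an experienced dev folds it into: reuse the project's existing
# HTTP client (hypothetical here) so auth, retries and base URLs stay
# in one place, and inject it so the function is easy to test.
def get_user_refactored(client, user_id):
    return client.get(f"/users/{user_id}")
```

The second version is trivially testable with a stub client, and that fit-to-the-codebase step is exactly the work the tool doesn't do for you.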

So anyway, I don't see a problem with the tool; it's just like using Stack Overflow. But as we have seen, businesses and inexperienced devs seem to think it's more than this and can do their job for them.

 

Keyboards have been around for over 40 years, and since then not much has really changed in terms of standard keyboard functionality at the driver/OS level.

In the past decade we have seen quite a few keyboards come out with analogue keys, which is great, but they are really sketchy to actually use for anything, as it's not something an OS expects a keyboard to be doing. You need special third-party drivers/software, which often don't get used in a truly analogue way anyway.

For example, in a lot of games analogue directional sticks are the norm, so altering movement speed/sneaking based on the analogue amount is pretty normal. However, on PC you just get keydown/keyup events, so you can't process key presses in an analogue way.
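As a sketch of what an analogue key event could enable (hypothetical API, since no such OS standard exists today): instead of a boolean keydown, the driver would report a 0.0-1.0 key-travel value you could map straight onto movement speed, exactly like a gamepad stick axis.

```python
def movement_speed(key_travel, walk=1.5, run=6.0, deadzone=0.05):
    """Map a hypothetical analogue key value (0.0 = released,
    1.0 = fully pressed) onto a movement speed in m/s, the way a
    gamepad stick axis is handled today."""
    if key_travel < deadzone:
        return 0.0  # ignore noise near the top of the key's travel
    t = (key_travel - deadzone) / (1.0 - deadzone)  # renormalise 0..1
    return walk + t * (run - walk)

# With plain keydown/keyup events you only ever see 0.0 or 1.0:
digital = [movement_speed(1.0), movement_speed(0.0)]
# An analogue key can land anywhere in between (half-pressed = jog):
analogue = movement_speed(0.5)
```

The logic already exists in every engine for stick axes; what's missing is a standard way for the keyboard to deliver that 0.0-1.0 value in the first place.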

So given we are seeing more keyboards coming out with this functionality at a lower price point, is there any company/person/body trying to put together a standard that would allow for analogue key events at the OS level, or even in DirectX (DirectInput) / OpenGL?

I imagine the answer is no, but wanted to ask in case anyone in the know had more info.
