Thread: The GPU Thread
Damn nice



Hoooooly! This game was in my top 5 list of games I've saved to play for the first time once I get a new card. I'd only played up to the point where you first reach Hogwarts on my 3080, and decided I'd wait until I could pump up all the settings at better framerates on a new card. This may have just moved it above Cyberpunk on that list... Here's hoping I can snag a Founders 5080 in a decent timeframe!

Also, all reports I'm hearing are that the new FE cards are nice and quiet. There is still the coil-whine lottery as usual though.
 


The new transformer model for Nvidia DLSS 4 lives up to its name, with transformational improvements over the convolutional neural network approach used from DLSS 2.0 onwards. We'll be tackling super resolution improvements in the second video on this topic, but by far the biggest enhancements are seen with ray reconstruction - as you shall discover in this deep dive from Alex.
 


Respect to Tim/HU for going into detail and not sugar coating it.

It seems many others are being careful with the language they use to describe multi-frame gen. Wendell from Level1Techs, for example, was applying what I'd describe as some extreme damage control for Nvidia with regard to this feature, which I guess was to be expected when the thumbnail for his review said "I want 2 of them."
 

There are a lot of reviewers out there terrified of losing their sampling from Nvidia, like what almost happened to HUB before. It generally pays not to piss off the 85-90% market leader if your channel relies on them for day-one reviews, so most will push whatever feature narrative Nvidia wants.

Fair play to Tim and HUB in general for taking that risk by not sugar coating it.
 
It dawned on me that the only meaningful gaming upgrade you get from this thing is if you game at 8K, due to memory bandwidth. At 4K, sure, it's 20-30% better sometimes, and sometimes not even that.

But at 8K, bandwidth plays a huge role, plus VRAM sometimes. It's why the 3090 is way, way faster than, say, the 4070 Ti at 8K.

A locked 4K/60fps in The Witcher 2 with ubersampling (effectively 8K) wasn't possible until the 3090 and its 384-bit GDDR6X.

Otherwise, with DLSS, CPU limits, etc., it's just a stupid purchase. Much dumber than going from a 1080 Ti to a 2080 Ti, which was a similar performance increase but introduced DLSS.

It's a prosumer/8k card imo.
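To put rough numbers on the bandwidth point, here's a back-of-envelope sketch. The figures are illustrative assumptions (a single RGBA8 render target, ignoring compression and everything else a real renderer does), just to show how per-frame data scales with resolution:

```python
# Rough arithmetic sketch: why memory bandwidth matters more at 8K.
# Illustrative only - real frames touch many targets per pass,
# and GPUs use lossless compression that changes the real numbers.

def framebuffer_mb(width, height, bytes_per_pixel=4):
    """Size of one RGBA8 render target in megabytes."""
    return width * height * bytes_per_pixel / 1e6

for label, (w, h) in {"4K": (3840, 2160), "8K": (7680, 4320)}.items():
    print(f"{label}: {framebuffer_mb(w, h):.0f} MB per RGBA8 target")

# 8K has exactly 4x the pixels of 4K, so every bandwidth-bound pass
# (G-buffer writes, post-processing reads, etc.) moves ~4x the data.
```

Same logic applies to VRAM capacity: every resolution-dependent buffer quadruples going from 4K to 8K, which is where the wide bus and large VRAM pool start to dominate.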
 

Funny you say that - and I largely agree - because YouTube recommended a video of someone using it to play GTA V at 8K, and even maxing out the 32GB of VRAM at 16K towards the end of the video, around the 16-minute mark:

 
Honest question: why are people going so hard on Nvidia for this fake-frames thing when I vividly remember people going wild over Lossless Scaling for months? For instance...




And people in the comments are glazing Lossless Scaling hard with comments like, "Well I guess you really can download more frames...".

I'm not saying Nvidia shouldn't be raked over the coals for their ridiculous performance comparisons, and I get that the raw performance upgrade between gens looks lackluster, but it really seems like the vast majority are dogpiling on "fake frames" when the majority, at least it seemed that way, were doing the opposite before the reveal. That's before you even consider that the new cards have onboard hardware specifically for this task, which will do the job even better. You'd think you'd at least see the same people who were glazing Lossless Scaling getting excited for that aspect of the new cards?
 


That's unfortunately par for the course in the media game. Sensationalism, click bait, hot takes, etc.

Though it's easy to see how someone might be genuinely hyped by frame gen and then have their opinion sour with experience over time. Even my friend's youngest kid, still in elementary school, was quick to complain about the way games felt when I had her try out Lossless Scaling.
 

But I can find videos from even days before the Nvidia reveal that salivate all over Lossless Scaling. There are even videos on the Lossless Scaling app since the reveal, from the same creators who lambasted Nvidia over it, that are still positive about it, as is the chat. Also, there's this that I thought was funny...



Vex is the one who made the Lossless Scaling video where he was glazing it, and so was his chat.

The guy he has on in the video admits he doesn't like frame gen and would rather have pure rasterization. During the blind test, he says frame gen looks and feels great. But after it's revealed which one was frame gen, he switches his story and says that one looked and felt bad.

I wonder how much of this is placebo-driven bias and how much is someone actually being able to tell the difference. Granted, this is with hardware-assisted DLSS 4 frame gen, which has a relatively low impact on input lag compared to Lossless Scaling. Maybe it really is that much better, and people are expecting performance like Lossless Scaling's. But then that doesn't explain why so many were glazing Lossless Scaling before but not DLSS 4, which performs even better...
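One way to actually separate placebo from genuine perception is a repeated blind test. A single guess proves nothing; this sketch just shows the chance baseline with standard binomial math (not tied to any particular video or test protocol):

```python
# Sketch: how many correct calls in a blind A/B test would be real
# evidence of telling frame gen apart, vs. lucky guessing?
from math import comb

def p_at_least(k, n, p=0.5):
    """Probability of k or more correct answers out of n by pure chance."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# One correct call out of one trial is meaningless: 50% by chance.
print(p_at_least(1, 1))   # 0.5
# 9 out of 10 would be hard to explain by guessing alone (~1%).
print(p_at_least(9, 10))
```

Which is why one guy flip-flopping after a single reveal tells us basically nothing either way.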
 


You can find videos of people being for or against pretty much anything. I don't think the vast majority of PC gamers were going crazy for Lossless Scaling; in fact, I'd say most probably don't even know it exists.

The thing is, though, that it's a cheap, platform-agnostic third-party app that people can use if they enjoy frame gen, so most of the people commenting positively will be the minority who enjoy frame gen in general or like to tinker, or casuals who don't know anything about it or the downsides and just see "bigger number = better!" in some video.

The reason Nvidia is getting blowback is that they are selling a product using multi frame generation as the flagship feature of the 5000 series, and then using it to try to trick people into thinking the performance of their new cards is significantly better than that of the current generation, for example "5070 = 4090!".

From that perspective, Nvidia deserves to be raked over the coals for the misleading marketing and charts. At the same time, Blackwell looks a bit disappointing from a performance-uplift perspective, and definitely worse than most were expecting prior to the announcement, so the fact that Nvidia leaned so hard into MFG makes it seem like they are trying to mask the poor generational uplift over Ada; an extra thumb in the eye, so to speak.

I'd wager the people complaining about frame gen and multi-frame whatnot are, for the most part, not the same people praising Lossless Scaling.
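The "5070 = 4090" issue can be sketched numerically: frame generation multiplies displayed frames, but the game still samples input once per rendered frame. The numbers below are illustrative assumptions, not benchmark results:

```python
# Sketch of why MFG-based comparisons mislead: multi frame generation
# multiplies *displayed* frames, not rendered ones. Illustrative
# numbers only (real frame gen also adds a little overhead/latency).

def displayed_fps(rendered_fps, gen_factor):
    """Displayed frame rate with frame generation (2x, 3x, 4x...)."""
    return rendered_fps * gen_factor

def input_interval_ms(rendered_fps):
    """Game state/input still updates only once per rendered frame."""
    return 1000 / rendered_fps

base = 30  # hypothetical rendered FPS
for factor in (1, 2, 4):
    print(f"{factor}x: {displayed_fps(base, factor)} fps shown, "
          f"input sampled every {input_interval_ms(base):.1f} ms")
# 4x shows 120 fps on a chart, but responsiveness still resembles 30 fps.
```

So a bar chart where a 5070 with 4x MFG "matches" a 4090 is comparing displayed frames against rendered ones, which is exactly the sleight of hand people are objecting to.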
 
3080 to 5080 should be a decent upgrade, but games still run and look just fine on it. :/ Not sure if I want to build a new PC now or wait one more generation.
 

I have one as well and, tbh, for most games it's still absolutely great. Only the most advanced games, like Indiana Jones or Black Myth: Wukong, bring the GPU to its knees.
 
They may have been thinking 5070 was going to have a $99 MSRP.

The messaging and marketing from AMD is extremely weird at the moment. They took away a lot of excitement by not offering a halo product. Now, with their focus on the low and mid range, people expect them to come out swinging on price-to-performance.

But I don't think anyone is really confident that they'll deliver.
 
Still not sure if I should upgrade from my 4090. Most of the messaging seems to be that you get a decent jump, but at the expense of noise and heat. A bit conflicted at the moment.
 

For people with 3000 series or lower I think it is a solid (but expensive) option.

For an existing 4090 owner, personally I would wait until the 6000 series releases and get a 6090 as the 4090 will likely serve you well for the next 2 years.

The new DLSS transformer model is available on 4000 and 3000 series cards, so feature-wise the only thing you'd be missing is MFG, which has a lot of caveats but which some people might enjoy.

Overall it depends on what you use it for. For video editing, rendering, and productivity work it might be a good buy if a 4090 isn't cutting it; and if you just want the absolute best with no compromises and money is no object, then a 5090 also makes sense.
 

Depends on what you want. The 4090 is still an incredible card and will easily remain the second most powerful GPU for the next two years, and should provide a great experience even in the most demanding games.

If I had a 4090, I'd honestly wait for the 6090. Sure, the 5090 does offer a good boost over the 4090 but I don't think it's worth the additional price plus all the heat.
 

Wait for the next generation. It will be on a new node, so performance should see a significant boost over the current generation. The 4090 should still be able to play anything for years to come with few compromises, especially with DLSS 4.
 
5080 is too small of an upgrade for anyone with a 4080.

I wouldn't expect big raster gains ever again. It wouldn't surprise me in the slightest if the 6080 just matches the 4090 in raster, with all new silicon going to tensor cores and RT performance.
 

The 4090 is, what, 20% faster than the 5080? Yeah, I don't expect a larger jump from a future 6080. Seems like a safe bet, and tbh if they price it at $999, that would still be OK for the crazy performance a 4090 offers.
 
Had a traumatic experience.

Tried to inject the new DLSS version into Space Marine 2. Launched the game and it suddenly looked like total shit; I immediately saw it was blurry and unstable.

Turns out the DLSS swap didn't work and the game was running in FSR Quality mode. My God, being used to DLSS, seeing FSR was jarring.
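For anyone else trying DLL swaps: the usual manual approach is just replacing nvngx_dlss.dll in the game's install folder, keeping a backup so you can roll back if it doesn't take. A minimal sketch of that (all paths in the usage comment are hypothetical examples, not the game's actual layout):

```python
# Sketch of a manual DLSS DLL swap with a rollback backup.
import shutil
from pathlib import Path

def swap_dlss_dll(game_dir: Path, new_dll: Path) -> Path:
    """Back up the game's nvngx_dlss.dll, then copy the newer DLL over it.

    Returns the backup path so the swap can be rolled back if the game
    rejects the new DLL (or silently falls back to FSR, as above).
    """
    target = game_dir / "nvngx_dlss.dll"
    backup = target.with_name(target.name + ".bak")
    if target.exists() and not backup.exists():
        shutil.copy2(target, backup)  # keep the original for rollback
    shutil.copy2(new_dll, target)
    return backup

# Hypothetical usage (real install and download paths will differ):
# swap_dlss_dll(Path(r"C:\Games\SpaceMarine2"),
#               Path(r"C:\Downloads\nvngx_dlss.dll"))
```

Worth double-checking in the game's graphics menu that DLSS is actually selected afterwards; some games quietly revert to FSR when the DLL swap fails.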