Thread: The GPU Thread
Very positive impressions for the 5070 vs the 4090, IF frame gen is used and implemented well.

However, Nvidia also stressed that Marvel Rivals is an outlier here, and in most games, the two GPUs are level in terms of performance (as long as they support multi frame gen) – it's just that Marvel Rivals responds particularly well to frame generation.

In action, I honestly couldn't tell the difference between the two systems, and I played the game for several minutes on both of them. Whatever magic Nvidia has worked with multi frame gen (and we'll be able to talk more about that at a later date), it works surprisingly well, at least in this title. The game was smooth and responsive, and I couldn't see any notable glitches on the RTX 5070 system. This $549 GPU can genuinely offer a 4090-level experience in a game that's well optimized for it.

And therein lies the catch. If a game doesn't support multi frame gen, then the RTX 4090 will be significantly faster than the RTX 5070, as the underlying GPU has so much more raw horsepower. It also has twice as much VRAM available, with the RTX 5070's meager 12GB locking it out of some of the high settings you can enable on the RTX 4090 at 4K, even if you enable frame gen.



Complete fluff piece. While we should all wait for proper benchmarks and third-party reviews to gauge the performance of the 5070, trying to say with a straight face that the 5070 is as fast as a 4090 because you doubled the fake frames is just retarded.

Honestly, is this a paid fluff piece?
 
It clearly points out what's happening and what the use case is.

Of course the 5070 won't be close to the 4090 in any use case outside of games that can leverage the frame gen fuckery. The lines get blurred to some degree. In this game, the results are apparently great; it's valid to point that out.
 
Yeah, I might have been too harsh and just fired off a hot take. I think my well-known hatred for all frame gen got the better of me!

Reading more of the article, he does make the caveats clear, but I still hate frame gen, and especially trying to say card A with frame gen is as good as card B without it.
 
Yeah that Nvidia marketing fuckery is embarrassing. I don't like it either, very misleading.

BUT... when DLSS started, they did the same. Now that tech is so mature that I use it in all games. With the new model especially, there'll barely be a reason to play at native res. Even without the new model, DLSS was actually the best AA option in many games, looking better than native plus other AA tech while being 30-40% faster. Which is crazy stuff tbh.

I believe it won't be too long until Nvidia achieves that level of maturity with frame generation. Frame gen is fake frames, DLSS is fake pixels. But those fake pixels didn't look great at first either; now they often look better than non-fake pixels.

Give it some more updates, they'll further reduce input lag, and it'll become cleaner and cleaner, until it's so good that it becomes the default way to play. Feels inevitable, and given the stagnation in actual hardware potential, it feels like the best way forward.
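
To put rough numbers on the fake-pixels point, here's a minimal sketch of the upscaling arithmetic, assuming the commonly cited per-axis render-scale factors for the DLSS presets (illustrative values, not official numbers):

```python
# Shading cost scales roughly with pixel count, so rendering internally at a
# lower resolution and reconstructing to the output resolution saves most of
# that work. The per-axis scale factors below are the commonly cited values
# for the DLSS presets; treat them as assumptions.

PRESETS = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 0.333,
}

def internal_resolution(out_w: int, out_h: int, scale: float) -> tuple[int, int]:
    """Internal render resolution for a given output size and per-axis scale."""
    return round(out_w * scale), round(out_h * scale)

out_w, out_h = 3840, 2160  # 4K output
for name, scale in PRESETS.items():
    w, h = internal_resolution(out_w, out_h, scale)
    fraction = (w * h) / (out_w * out_h)
    print(f"{name:>17}: {w}x{h} internal ({fraction:.0%} of output pixels)")
```

At 4K Quality the GPU shades well under half the output pixels, which is roughly where that 30-40% speedup comes from, since shading is only part of the total frame cost.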
 
Yeah, I get where you're coming from, but in the case of frame gen, especially multi frame gen, it has some fundamental issues with input latency that I'm not sure are easy to fix.

The work they have done with Reflex is great, although that only goes so far, and you can always turn on Reflex without frame gen at all.

I think the only way frame gen would be accepted by the wider market over native rendering would be if they were somehow able to make the input latency match the frame rate of the generated frames.

That is a much, much harder problem to solve, although knowing Nvidia I wouldn't put it past them at some point in the future, but that is basically what would be required for me, at least, to start accepting it.
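
A back-of-the-envelope sketch of why the latency doesn't scale with the displayed frame rate, assuming interpolation-based frame gen has to hold back one real frame to blend between two of them, and that the render pipeline itself costs about two base frames (both placeholder numbers, not measured figures):

```python
def frame_gen_model(base_fps: float, mfg_factor: int,
                    pipeline_frames: float = 2.0) -> tuple[float, float]:
    """Return (displayed_fps, approx_input_latency_ms) under the model above."""
    base_frame_ms = 1000.0 / base_fps
    buffered = 1.0 if mfg_factor > 1 else 0.0  # extra real frame held for interpolation
    latency_ms = (pipeline_frames + buffered) * base_frame_ms
    return base_fps * mfg_factor, latency_ms

for factor in (1, 2, 4):
    fps, lat = frame_gen_model(base_fps=60, mfg_factor=factor)
    print(f"{factor}x: {fps:.0f} fps displayed, ~{lat:.0f} ms input latency")
```

Under these assumptions, 4x shows 240fps but still carries 60fps-pipeline latency (and then some); making the latency actually match 240fps rendering would require generated frames that react to input, which is the much harder problem.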
 
Agree, it's a difficult value proposition for hardcore gamers. For me, I like high-fps gaming for the low latency and super-fast inputs; it feels great. Image fluidity is a nice bonus.

Frame gen provides very fluid motion, and the new iteration apparently looks great. Buuuuut the input lag is there. A no-go for Souls-likes, for example. For stuff like Witcher 4, though, frame gen would make a lot of sense.

Most customers, even many who spend a ton of money on a GPU, aren't really knowledgeable. They'll see a very fluid, great-looking image with insane visuals. If the input lag isn't bad, they won't notice or care; they'll be happy with the visual feast.

Not too long now until I can check that out myself, I guess.
 
Very impressive


[image: performance comparison chart]
 
So an even smaller increase over the 4080 Super that costs the same? Disappointing.

No wonder Nvidia was so heavily promoting the new AI features with these cards.

Yeah, it's like last gen: if you want more performance, you have to pay more. Which is a frustrating reality.

Like last gen, the xx90 card is the only one that offers a real and very significant jump. But for an obscene price.

The times of 20-30% price-to-performance increases every few years seem to be over.
 
I'm starting to view Nvidia's fanboys the same way I do Apple's misbegotten paypigs.

Why yes, I do see that your wallet enjoys taking it in the ass for middling gains.

40xx card users have very few reasons to buy a 50xx card, that's for sure. I'm curious about the AMD offerings this time for the mid range. Let's hope they don't fuck it up.
 
tbh I saw the writing on the wall in the 2010s and got out of high-end PC gaming permanently. The extra juice wasn't worth the squeeze.
 
The 4090 was like 60-70% faster than the 3090. We'll need to wait for benchmarks, but it's looking like the 5090 will be more of a 20-30% jump, which is pretty disappointing given the price and the time since the 4090 launched.
 
The 4090 was the only huge one, but... at a more or less linear price-to-performance ratio.
 
Yup, but 2020 already did a great job exposing corporate bootlickers everywhere.

5000 series will be in stores soon and we still don't have the 'Racer RTX' demo/game they used to show off the 4000 series in 2022. Maybe now that they have DLSS4 and multi-frame gen they'll be confident enough to release it. 😄
 
I'm still waiting for Racer RTX! Tried to push them on X multiple times lol
 
Surprised this wasn't posted already, but Tech Jesus mentioned that in the MSI promo, the box for the 5080 says 24GB (around the 5-minute mark), which definitely lends some credence to the idea that the 5080 was originally going to launch as such, but that Nvidia opted to gimp it, probably to keep the price point below $1000.



Increasingly, it does seem like it's the card that got the short end of the stick compared to the rest.
 
5090 seems like the insane card everyone was expecting.

5080 is a meh upgrade over the 4080, gimped to stay under 1k. Under 1k is cool, but it's not a great card to upgrade to.

The 5070 (Ti) seem to be the most attractive cards from a price-to-performance perspective.

The lower cards seem to be... meh.
 
Agreed. The 5070 Ti might be the card I go for personally. The image-AI dabbler in me would want the 5090 and that sweet 32GB of VRAM, but £2K is too much, especially given I'd like to do a full system upgrade next year.
 
A positive spin on the whole multiple frame generation discussion:

It's not a substitute for the real thing. If you have a 165Hz monitor, real 165fps is awesome and feels the best. But if your game's base performance is 60fps, you can play like that and it's fine. Now, with frame gen, you can reach that max 165fps your monitor can display, making it look way smoother. The cost is a minimal increase in input lag compared to your native 60fps. Doesn't sound like the worst trade-off.

It's not competing with the native real 165fps, but with the 60fps you'd get without it. And then it becomes a somewhat appealing option to significantly increase the fluidity of the image.
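
As a worked version of that 60fps-to-165Hz scenario, here's a small sketch that picks the lowest frame-gen factor saturating a given refresh rate; the 2x/3x/4x factors match what Nvidia advertises for multi frame gen, but the helper itself is hypothetical:

```python
def pick_mfg_factor(base_fps: float, refresh_hz: float,
                    factors=(1, 2, 3, 4)) -> tuple[int, float]:
    """Return (factor, displayed_fps), capping displayed fps at the refresh rate."""
    for factor in factors:
        if base_fps * factor >= refresh_hz:
            return factor, refresh_hz  # saturated: cap at the panel's refresh
    top = factors[-1]
    return top, min(base_fps * top, refresh_hz)

factor, shown = pick_mfg_factor(base_fps=60, refresh_hz=165)
print(f"{factor}x frame gen -> {shown:.0f} fps on a 165 Hz panel")  # 3x -> 165 fps
```

From a 60fps base, 3x already saturates a 165Hz panel, so the 4x mode mainly matters for 240Hz+ displays.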
 
What I'm wondering is how the latency will compare to consoles at high quality settings. If it's a graphically heavy game and I can still play with all the bells and whistles and image quality that PC allows, with the same as or even better than console latency, while looking as fluid as people have been reporting, then I'll be happy. I don't play them anymore, but pretty much everyone I know who cares about latency in twitch shooters doesn't care about graphics while playing them and has nearly all those settings turned down anyway. Even on their 4090s. I was playing and smashing through games like Elden Ring and other Souls-likes at "lower" framerates (lower than my LG's 120Hz, anyway) on my 3080, probably at higher latency because I'm playing on a 4K screen and didn't want to turn down any graphical settings. Point is, I've been fine with latency so far, even with my struggling 3080.

I do wish the 5080 was a larger leap, but since I chose to buy my kids a Steam Deck this Christmas, I can't afford the 5090. I was planning on spending a chunk now on the best on offer so I wouldn't need to upgrade again for another 7-8 years. So in my mind, since it's actually a bit cheaper than I was expecting, I'm tempted to get either the 5080 or its Ti variant, wait for the 6000 series on a new process node (which should actually be a larger jump) to drop, then sell the 5080 while it still has value and upgrade one more time. Only problem is I'd like to wait till at least the second iteration of a new process node, because that seems to be the sweet spot for performance and efficiency... Gah.

At this point it all hinges on impressions and reviews anyway.
 
The 5080 is so far ahead of consoles, you'll get way higher base fps. Even adding frame gen, it'll be more responsive while offering a way smoother image. It's not even a contest. Same for the 5070 Ti, it'll offer a crazy good experience.
 
5080 reviews will now be available one day before the launch



Great improvement for frame gen on 40xx and 50xx cards, increasing performance while using less VRAM.

 
Gaming performance without all the frame gen fuckery revealed: the 5090 is a beast with 33% better performance than the 4090, the 5080 is +15% over the 4080 for $200 less, and the 5070 + Ti are 20% faster than their 40xx equivalents. Not that bad, I'd say.

[image: relative gaming performance chart, 50-series vs 40-series]
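
Putting numbers on "not that bad": a quick perf-per-dollar sketch using the percentages above. The MSRPs are my assumption based on launch list prices (4090 $1599, 4080 $1199, 4070 $599; 5090 $1999, 5080 $999, 5070 $549); swap in street prices to taste.

```python
CARDS = {
    # name: (perf vs predecessor, assumed MSRP, assumed predecessor MSRP)
    "5090": (1.33, 1999, 1599),
    "5080": (1.15, 999, 1199),
    "5070": (1.20, 549, 599),
}

for name, (perf, price, old_price) in CARDS.items():
    gain = perf * old_price / price - 1.0  # relative perf-per-dollar change
    print(f"{name}: {perf - 1:+.0%} perf, {gain:+.0%} perf per dollar vs predecessor")
```

By that rough math, the 5080 and 5070 improve perf per dollar by roughly 30-40%, while the 5090 barely moves it; you pay up for the top end.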


 