Thread: The GPU Thread
Alright, ray tracers!

Final chance for the price guesses. My absolutely naive wishful thinking being hopeful but then getting anally dry-fisted by Jensen guess:

5080 = $1,199; €1,399
5090 = $1,699; €1,899

No idea where AMD will land, performance or price wise. After announcing that they won't be chasing the high-end, it's an absolute wild card for me.
 
Jensen Huang, Who art in one of his 50 mansions,
Hallowed be Thy GPU.
Thy DLSS4 come.
Thy bloated price be done,
on 5080 as it is in 5090.

Give us this day our daily price rumor.
And forgive us our empty bank accounts,
as we forgive those who skimp on VRAM.
And lead us not into FSR,
but deliver us from AMD. Amen.
 

Radeon 9070 and 9070 XT will feature 16GB of VRAM. "Coming soon".

[Images: AMD next-gen TUF-series Radeon card renders]
 
Alright, ray tracers!

Final chance for the price guesses. My absolutely naive wishful thinking being hopeful but then getting anally dry-fisted by Jensen guess:

5080 = $1,199; €1,399
5090 = $1,699; €1,899

No idea where AMD will land, performance or price wise. After announcing that they won't be chasing the high-end, it's an absolute wild card for me.

I am going to be stupid and say $999 for the 5080.

The 4080 was what, $800? (Or am I thinking of the 3080?)
The 5090, I'll say $1,500.
 
Alright, ray tracers!

Final chance for the price guesses. My absolutely naive wishful thinking being hopeful but then getting anally dry-fisted by Jensen guess:

5080 = $1,199; €1,399
5090 = $1,699; €1,899

No idea where AMD will land, performance or price wise. After announcing that they won't be chasing the high-end, it's an absolute wild card for me.

Cutting it close as the keynote just started, but I'll guess $1,200-1,400 for the 5080, and we'll be very lucky if the 5090 starts below $2,000.

Nvidia has fueled my cynicism over the last few years, hopefully they don't do more of that tonight.
 
Looks like 5090 and 5080 are launching on the same day, Jan 30th. Unspecified February date for 5070 and 5070 Ti.


Clipped direct from Nvidia's site:

Availability

For desktop users, the GeForce RTX 5090 GPU with 3,352 AI TOPS and the GeForce RTX 5080 GPU with 1,801 AI TOPS will be available on Jan. 30 at $1,999 and $999, respectively.

The GeForce RTX 5070 Ti GPU with 1,406 AI TOPS and GeForce RTX 5070 GPU with 988 AI TOPS will be available starting in February at $749 and $549, respectively.

The NVIDIA Founders Editions of the GeForce RTX 5090, RTX 5080 and RTX 5070 GPUs will be available directly from nvidia.com and select retailers worldwide.

Stock-clocked and factory-overclocked models will be available from top add-in card providers such as ASUS, Colorful, Gainward, GALAX, GIGABYTE, INNO3D, KFA2, MSI, Palit, PNY and ZOTAC, and in desktops from system builders including Falcon Northwest, Infiniarc, MAINGEAR, Mifcom, ORIGIN PC, PC Specialist and Scan Computers.

Laptops with GeForce RTX 5090, RTX 5080 and RTX 5070 Ti Laptop GPUs will be available starting in March, and RTX 5070 Laptop GPUs will be available starting in April from the world's top manufacturers, including Acer, ASUS, Dell, GIGABYTE, HP, Lenovo, MECHREVO, MSI and Razer.
 
Well, I'll try to get a Founders 5090, but I'm assuming it'll sell out in seconds and end up on some dickhead's StockX page for double the price.
 
OK, I didn't expect the 5090 to cost twice as much as the 5080. Wow.

Euro prices will be fucked; the 5090 will be at least €2,299 and the 5080 €1,169.

Let's see the real-world non-DLSS performance numbers.

At this point I'm leaning heavily toward the 5080. That price gap is just too much if the 5080 still outperforms the 4090. Not to mention power usage...
 
At this point I'm leaning heavily toward the 5080. That price gap is just too much if the 5080 still outperforms the 4090. Not to mention power usage...

Yeah, the 5080 seems like the way to go for me as well. Just saw on their page that the 5090 is €2,329. It looks like an absolutely insane card, but I don't have that much gaming time anymore; paying over €2.3k doesn't make much sense.

I guess I'll be a 16GB peasant.
 
Will wait for in-depth benchmarking, but once again it looks like only the top-end card (5090) will be worth it, because it will be head and shoulders above anything from the competition or the previous generation.
 
I wasn't too impressed by the presentation with regard to gaming; I expected to see and hear more if the uplift were so significant. They seem to be banking hard on the new DLSS multi-frame generation, and I'm personally not a big fan of frame gen. I want the full low-latency benefit of those extra frames.

The Asus "Astral Series" model looks like it could be hitting the high end of rumored pricing.

 
I just find it crazy that the 5070 still has the same amount of VRAM as the 3060. The newer memory may be faster, but the card still needs the space to actually hold the assets.

I guess they are trying to give consoles enough time to catch up in VRAM specs. 😓
 
Frame gen creates too many artifacts, especially around static UI elements. I can see it being worth it in some cases, but it comes with some significant trade-offs.
 
Frame gen creates too many artifacts, especially around static UI elements. I can see it being worth it in some cases, but it comes with some significant trade-offs.

Curious what that'll look like with the new transformer model, which apparently significantly reduces artifacts and improves image quality.

I wouldn't be surprised to see Nvidia solve that issue.

The improvements to the core DLSS sound pretty cool. Say what you want about Nvidia, it's really good that even 20xx cards will get better performance and quality with the new iteration. Looking at my 3080, various driver improvements and DLSS have made that card an all-time great for me; it's still an incredible card for 1440p gaming.
 
Yeah, the 5080 seems like the way to go for me as well. Just saw on their page that the 5090 is €2,329. It looks like an absolutely insane card, but I don't have that much gaming time anymore; paying over €2.3k doesn't make much sense.

I guess I'll be a 16GB peasant.

Yeah, I'd be tempted by the 5080, but you damn well know they're likely to release a 5080 Ti or Super in six months with 24GB of VRAM, just to spite everyone.
 
The new feature is "Frame Generation (again)"? I'll stick with my 4070 Ti Super. It's only a year old and has been running things fine. I'm not going to buy into the marketing jargon that 5070 = 4090.
 
Very interesting stuff. I had insomnia, so I actually watched most of the conference live at 2:30 am last night.

DLSS 4 Multi Frame Gen seems to be the "big" highlighted feature. It now generates three frames for every rendered frame, and it's exclusive to Blackwell/5000-series GPUs. For people who like frame generation, I suppose this is a cool upgrade. Personally I dislike frame generation, and I think I'm probably in the majority there, but it's still nice to have for some use cases.
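Back-of-the-envelope, the appeal and the catch of multi-frame generation look like this (illustrative numbers only, not benchmarks; the overhead of the frame-gen pass itself is ignored):

```python
def framegen_stats(rendered_fps: float, generated_per_rendered: int):
    """Displayed fps and base frametime for a given frame-gen ratio."""
    # Each rendered frame is accompanied by N generated frames on screen.
    displayed_fps = rendered_fps * (1 + generated_per_rendered)
    # Input is only sampled on truly rendered frames, so responsiveness
    # still tracks the rendered frame time, not the displayed frame time.
    base_frametime_ms = 1000.0 / rendered_fps
    return displayed_fps, base_frametime_ms

# 60 fps rendered: DLSS 3 style (1 generated frame) vs DLSS 4 MFG (3 frames)
dlss3 = framegen_stats(60, 1)  # 120 fps displayed, ~16.7 ms base frametime
mfg = framegen_stats(60, 3)    # 240 fps displayed, same ~16.7 ms frametime
print(dlss3, mfg)
```

The displayed fps quadruples with MFG, but the base frametime that governs input response is unchanged, which is why frame-gen skeptics in this thread aren't sold.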

New "Transformer" model for DLSS. To me this is the much more interesting feature. It seems to use a much heavier model than DLSS did previously, but the results look significantly improved, especially for temporal artifacts. This affects upscaling as well as Nvidia's AI denoiser (Ray Reconstruction).

Nvidia's software also seems to have an override feature that will, where possible, upgrade a game's DLSS features (frame gen, upscaling, etc.) to the newer DLSS model, which is honestly really cool.

The pricing seems better than what most of us were expecting outside of the 5090, so fair play to them there; it makes the cards a more attractive proposition than if they had just gone mental on pricing. VRAM is a little starved below the 5090, as we expected, with the 5070 only getting 12GB.

Having said that, they are using GDDR7 across the whole stack, which is great for additional bandwidth. The 5090 even went all in with 32GB of VRAM, which is pretty impressive.

Reflex 2 is interesting: they use something called "frame warp" to help reduce latency when moving the camera. This won't affect the latency of pressing jump or firing a gun, but when moving the camera it will help "fill in the blanks" with AI, which is interesting.

They also teased some "neural rendering" tech: texture compression, material shading, and neurally rendered character faces, where it seems you give it a simpler model and it produces a more realistic face. I'm not exactly sure how that will work in the end or how much adoption these things will see, but they appear to be based on new DirectX API features that Microsoft also announced, so in theory Intel, AMD, or whoever else should also be able to make use of these new APIs.

Performance is the big question mark. We know what to expect from Nvidia presentations at this point, and this one was no exception, with crazy performance claims that are pretty much all frame generation gains.

Some things to keep in mind are that these GPUs are on the exact same process node as Ada cards were, so they are not gaining additional clock speed, die area or transistor area here. Due to this pretty much all performance gains need to come from changes to the architecture itself and increases in memory bandwidth from using GDDR7.

We don't have any proper benchmarks from Nvidia to go off, but based on what they showed in their charts, without frame generation most of the cards seem to land somewhere around a 20-30% performance increase, with only the 5090 bucking that trend at possibly 40+%.

We also see that all of the cards are more power hungry than before as they likely need to pump more power into the same die area to help increase performance.

If you take the node into account and look at the SM counts, most GPUs in the stack gain only a modest number of SMs over their 4000-series equivalents, with the exception of the 5090, which gains a significant amount over the 4090. So for most of the stack outside the 5090, somewhere in the 20-30% range seems reasonable/realistic, although on average it might be something like 32%.

It's impossible to say until we see proper benchmarks and reviews, so take the above as speculation with a grain of salt until we know more, but I think I'm in the rough ballpark here.
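To put rough numbers on the SM reasoning above, here's a small sketch using commonly cited SM counts (treat the counts as assumptions, not confirmed specs). Raw SM scaling alone hints at why the 5090 should pull ahead while the rest of the stack needs clocks, power, and GDDR7 bandwidth to reach 20-30%:

```python
def sm_uplift_pct(new_sms: int, old_sms: int) -> float:
    """Percent increase in SM count between generations."""
    return (new_sms / old_sms - 1) * 100

# Commonly cited SM counts (assumed, not from Nvidia's announcement):
pairs = {
    "5090 vs 4090": (170, 128),
    "5080 vs 4080": (84, 76),
    "5070 vs 4070": (48, 46),
}

for name, (new, old) in pairs.items():
    # SM scaling only; clock, power, and bandwidth gains come on top.
    print(f"{name}: +{sm_uplift_pct(new, old):.0f}% SMs")
```

On these numbers the 5090 gets roughly +33% SMs while the 5080 and 5070 get about +11% and +4%, so most of their real-world uplift would have to come from elsewhere in the architecture.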

There are also, I think, some improvements to the media engine and encoder/decoder, which should make streamers and content creators pretty happy.

All in all, feature-wise I'm most interested in the new transformer model they're using for upscaling and denoising. Exciting times ahead.
 
The comparisons on those slides are worthless and typical Nvidia marketing BS. Comparing the 5070 to the 4090 is ridiculous and stupid.

But... I don't even care much about fps at native res. I basically always use DLSS, and I want to see those benchmarks. Especially now with the improved model, there's no reason to use native with another, inferior AA method. When the headroom is there, DLAA for funzies.

Frame gen, meh. Let's see.
 
Kind of interesting that DF were given a 5080 to play with rather than a 5090. Deffo feels like Nvidia is pitching the 5080 as the main consumer gaming card, with the 5090 aimed more at the professional end of the market. The price tag on that bad boy isn't going to dissuade my AI bros, though, as they want that 32GB for AI video output.

It's how it always should have been. I still think it shouldn't be called an xx90; it should be its own classification (like bring back the Titan branding!)
 
It's how it always should have been. I still think it shouldn't be called an xx90; it should be its own classification (like bring back the Titan branding!)

Agreed. Still, even with Nvidia clearly outlining that you didn't need a Titan for gaming (it was more of a card for video editing, rendering, and the like), that didn't stop people from paying the markup tax and stuffing them into their gaming rigs, even though the FPS gains over the xx80 cards weren't overly absurd.

Still, the 16GB VRAM gimp on the 5080 is a pisser from an AI perspective. It feels like a deliberate move to keep the general price point down, and I can foresee a lot of chagrin if they ship a 5080 Ti/Super with 24GB down the road.
 
Agreed. Still, even with Nvidia clearly outlining that you didn't need a Titan for gaming (it was more of a card for video editing, rendering, and the like), that didn't stop people from paying the markup tax and stuffing them into their gaming rigs, even though the FPS gains over the xx80 cards weren't overly absurd.

Still, the 16GB VRAM gimp on the 5080 is a pisser from an AI perspective. It feels like a deliberate move to keep the general price point down, and I can foresee a lot of chagrin if they ship a 5080 Ti/Super with 24GB down the road.

If they do ship a 5080 Ti, they won't drop the current 5080 price; instead they'll put the Ti at something like $1,499 for 24GB of VRAM, right between the standard 5080 and the 5090. I wonder if the people who'd be mad would also have been willing to put up $1,499 for that card.
 
The jump for me from a 3080 to a 5080 will be huge, obviously. But to me, this reveal seems like a weird value proposition for Nvidia's marketing team to communicate.

1. New DLSS model with way better results
- available on all RTX cards, going back to the 20xx series. Great stuff, but not an argument for the 50xx cards.

2. Hugely improved ray reconstruction
- same as above.

3. Improved frame generation
- frame gen wasn't a killer feature for the 40xx series. Now it's apparently way better, but... it doesn't seem like a must-have.

4. Hi, I'm Nvidia, I ignore native raster performance.

The actual case for upgrading from a 40xx to a 50xx card is very weak. But that's okay; I still applaud Nvidia for pushing DLSS to the next level and making it available for old cards as well. They could've easily made the new model exclusive to the 50xx series. Instead, they even include an automatic upgrade option in the Nvidia app, which is awesome. They also significantly improved ray reconstruction, again available for old cards.

So yeah, Nvidia is doing not-so-great stuff with VRAM and keeping prices high, but fuck me, their advancements on the software and AI side are great. All current RTX card owners get a solid image quality improvement for free.