Thread: The GPU Thread
I built an ATI system back in the day (20 years ago or so) and it was the worst, most unstable system I ever had. Since then, I've stuck with Intel.

I know Ryzen is leaps and bounds ahead of ATI now, but the scars run deep!

Happy to be convinced otherwise. I presume if I go with Ryzen, the only other AMD component I would need is the mobo, right?

Firstly, ATI was a graphics company that made GPUs but never CPUs. They were bought out by AMD in 2006, around the Xbox 360 era.

As for current-day AMD CPUs, they are leaps and bounds better than they were even 10 years ago, and each Ryzen iteration seems to have gotten faster and more stable. In DIY desktops they are absolutely smoking Intel in market share and mindshare right now, and for gaming their X3D CPUs are top of the line performance-wise. Intel, meanwhile, has been fumbling its execution for the past decade; its share price has plummeted as a result, as has consumer confidence in the brand.

If you bought a Ryzen CPU, then you would only need a Ryzen-compatible motherboard; you don't need to buy a special PSU or an AMD GPU or anything like that.
 
So AMD is no longer going to release high-end GPUs to compete with Nvidia. That's a sad state of affairs imo.

Here is their latest RDNA4 tech (i.e. the 9000 series), which might be used in the next-gen PS6/XsX2 consoles. As I mentioned before with some PS gamers, I don't think the next consoles will be as powerful as the 4090, even though the PC side will be on the 6x00-series GPUs by then. Even if the consoles get a hybrid RDNA4/5, I don't think it will be competitive with the RTX 4090 with respect to RT performance. We'll just have to see.

 
So AMD is no longer going to release high-end GPUs to compete with Nvidia. That's a sad state of affairs imo.

Here is their latest RDNA4 tech (i.e. the 9000 series), which might be used in the next-gen PS6/XsX2 consoles. As I mentioned before with some PS gamers, I don't think the next consoles will be as powerful as the 4090, even though the PC side will be on the 6x00-series GPUs by then. Even if the consoles get a hybrid RDNA4/5, I don't think it will be competitive with the RTX 4090 with respect to RT performance. We'll just have to see.



I think it does need to be mentioned that the 4090 is a crazy anomaly of a card; it's so head and shoulders above all other cards it's ludicrous. Honestly, I'm shocked Nvidia didn't charge more for it. Best card I've ever owned. I can't imagine consoles making the leap needed to catch up with it in one gen (on the hardware side at least).

I wonder what the 5090 will look like, if they can recreate another ludicrous card, or if it will just be a modest bump from the 4090 with more memory.
 
So AMD is no longer going to release high-end GPUs to compete with Nvidia. That's a sad state of affairs imo.

Here is their latest RDNA4 tech (i.e. the 9000 series), which might be used in the next-gen PS6/XsX2 consoles. As I mentioned before with some PS gamers, I don't think the next consoles will be as powerful as the 4090, even though the PC side will be on the 6x00-series GPUs by then. Even if the consoles get a hybrid RDNA4/5, I don't think it will be competitive with the RTX 4090 with respect to RT performance. We'll just have to see.



I'm not sure if they had some technical issues with the multi-graphics-die chips they had originally planned, or if they needed the supply of interconnects and whatnot for datacenter instead, but it has been clear for a while that they are only targeting the mid-range this time with RDNA4, kind of like RDNA1 with the 5700 XT.

It's a pity, as I would have liked to see the larger chips. In terms of wanting a healthier market, I would prefer both AMD and Intel competing higher up the stack than they currently are.

I hope that with UDNA (previously RDNA5, but now they are merging CDNA and RDNA into one architecture) they start competing higher up the stack again. Supposedly UDNA will have matrix units like Nvidia and Intel do; in fact, AMD has had matrix units in CDNA for a while.

PS6 will not use RDNA4; it should be using UDNA, as the PS6 won't launch until 2027/2028. UDNA itself will launch either very late 2026 or possibly early 2027.

With RDNA4 it's at least not all bad, as AMD seem to be taking ray tracing a little more seriously: they are doubling throughput to 8 ray/box and 2 ray/triangle intersections per cycle and moving to BVH8. Plus I think they will finally launch FSR4, which will supposedly be 100% AI-based.
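Those per-cycle figures are easy to sanity-check with a bit of arithmetic. A minimal sketch: the RDNA4 rates (8 ray/box, 2 ray/triangle) come from the post above, but the unit count, clock speed, and RDNA3-style baseline of 4 ray/box tests per cycle are illustrative assumptions, not published specs:

```python
# Sanity check of the quoted RT throughput figures. The RDNA4 numbers
# (8 ray/box tests per cycle) come from the post above; the unit count,
# clock, and RDNA3-style baseline of 4 ray/box are assumptions.

def peak_tests_per_second(tests_per_cycle: int, units: int, clock_ghz: float) -> float:
    """Theoretical peak intersection tests/second = per-cycle rate * units * clock."""
    return tests_per_cycle * units * clock_ghz * 1e9

# Hypothetical GPU with 64 ray accelerators running at 2.5 GHz:
old_peak = peak_tests_per_second(4, 64, 2.5)  # RDNA3-style box tests
new_peak = peak_tests_per_second(8, 64, 2.5)  # RDNA4-style box tests
print(new_peak / old_peak)  # doubling the per-cycle rate doubles the peak
```

Of course, doubling the theoretical peak rarely doubles real-world RT frame rates, since traversal is not the only bottleneck.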

I am disappointed that they only went low-mid range this generation, but hopefully they can come back with something higher up the stack with UDNA, but we'll see what happens.
 
I think it does need to be mentioned that the 4090 is a crazy anomaly of a card; it's so head and shoulders above all other cards it's ludicrous. Honestly, I'm shocked Nvidia didn't charge more for it. Best card I've ever owned. I can't imagine consoles making the leap needed to catch up with it in one gen (on the hardware side at least).

I wonder what the 5090 will look like, if they can recreate another ludicrous card, or if it will just be a modest bump from the 4090 with more memory.
So AMD is no longer going to release high-end GPUs to compete with Nvidia. That's a sad state of affairs imo.

Here is their latest RDNA4 tech (i.e. the 9000 series), which might be used in the next-gen PS6/XsX2 consoles. As I mentioned before with some PS gamers, I don't think the next consoles will be as powerful as the 4090, even though the PC side will be on the 6x00-series GPUs by then. Even if the consoles get a hybrid RDNA4/5, I don't think it will be competitive with the RTX 4090 with respect to RT performance. We'll just have to see.



Highly unlikely that PS6 will have a GPU that reaches 4090 levels of performance, like Pegasus said, it's such a wild card. Digital Foundry talked about this and said the same.

I'd say 4080 performance would be a great jump from the current 2070 level, more than double the performance (~114% from benchmarks). Combined with a better CPU, smart memory and bandwidth solutions, efficient architecture etc. it'll still be a great basis for devs. Especially with better RT capabilities.

Will be interesting how much further Nvidia will be able to push it in the PC space. If the 5090 is around 45% faster than the 4090, that's fucking crazy. But what after that? Will they be able to deliver that kind of increase with a 6090 as well? Either way, Nvidia pushing their super high-end card into the 2k and above price region will mean that the high-end PC market will remain in a completely different league than consoles, by margins that we haven't seen in the past.
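The percentages being thrown around here compound in a simple way. A quick sketch using this thread's own figures (+114% for the 2070 -> 4080 jump, a rumored +45% per generation from the 4090 onward), which are claims and rumors, not measured data:

```python
# Back-of-envelope compounding of the performance figures quoted in this
# thread: +114% for 2070 -> 4080 and a rumored +45% per generation for
# the 5090 and beyond. These are the thread's claims, not measured data.

def compound(gain_per_gen: float, generations: int) -> float:
    """Relative performance after n generations at a fixed per-generation gain."""
    return (1 + gain_per_gen) ** generations

print(round(compound(1.14, 1), 2))  # 2070 -> 4080: 2.14x
print(round(compound(0.45, 2), 2))  # 4090 -> 6090 at +45% per gen: ~2.1x
```

At a steady +45% per generation, two generations only multiply out to roughly 2.1x, which is why back-to-back 4090-sized leaps are so rare.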
 
Highly unlikely that PS6 will have a GPU that reaches 4090 levels of performance, like Pegasus said, it's such a wild card. Digital Foundry talked about this and said the same.

I'd say 4080 performance would be a great jump from the current 2070 level, more than double the performance (~114% from benchmarks). Combined with a better CPU, smart memory and bandwidth solutions, efficient architecture etc. it'll still be a great basis for devs. Especially with better RT capabilities.

Will be interesting how much further Nvidia will be able to push it in the PC space. If the 5090 is around 45% faster than the 4090, that's fucking crazy. But what after that? Will they be able to deliver that kind of increase with a 6090 as well? Either way, Nvidia pushing their super high-end card into the 2k and above price region will mean that the high-end PC market will remain in a completely different league than consoles, by margins that we haven't seen in the past.

Yup. I think that will be made clear at CES 2025. Rumors are that the 5x00-series boards will have some new form of AI-assisted rendering.
 
So AMD is no longer going to release high-end GPUs to compete with Nvidia.

It's up to Intel then.



I just hope they don't go bankrupt first... 😓
 
It's up to Intel then.



I just hope they don't go bankrupt first... 😓

Currently they're in the low-end market and struggle to reach even 1% market share. Going for the super high end requires a ton of R&D, which is very expensive. I'd love to see Intel bring the heat; the new cards are very promising and the RT performance is very good.
 
I think it does need to be mentioned that the 4090 is a crazy anomaly of a card; it's so head and shoulders above all other cards it's ludicrous. Honestly, I'm shocked Nvidia didn't charge more for it. Best card I've ever owned. I can't imagine consoles making the leap needed to catch up with it in one gen (on the hardware side at least).

I wonder what the 5090 will look like, if they can recreate another ludicrous card, or if it will just be a modest bump from the 4090 with more memory.

I agree 100% on that.
 
I'm not sure if they had some technical issues with the multi-graphics-die chips they had originally planned, or if they needed the supply of interconnects and whatnot for datacenter instead, but it has been clear for a while that they are only targeting the mid-range this time with RDNA4, kind of like RDNA1 with the 5700 XT.

It's a pity, as I would have liked to see the larger chips. In terms of wanting a healthier market, I would prefer both AMD and Intel competing higher up the stack than they currently are.

I hope that with UDNA (previously RDNA5, but now they are merging CDNA and RDNA into one architecture) they start competing higher up the stack again. Supposedly UDNA will have matrix units like Nvidia and Intel do; in fact, AMD has had matrix units in CDNA for a while.

PS6 will not use RDNA4; it should be using UDNA, as the PS6 won't launch until 2027/2028. UDNA itself will launch either very late 2026 or possibly early 2027.

With RDNA4 it's at least not all bad, as AMD seem to be taking ray tracing a little more seriously: they are doubling throughput to 8 ray/box and 2 ray/triangle intersections per cycle and moving to BVH8. Plus I think they will finally launch FSR4, which will supposedly be 100% AI-based.

I am disappointed that they only went low-mid range this generation, but hopefully they can come back with something higher up the stack with UDNA, but we'll see what happens.
Good points, but I'll have to disagree on the timeline here. To push a release of a product in three years' time, they'll have needed to settle on the tech already. Not only that, you have cost to consider. The consoles have never come out with equivalent feature sets at a much, much lower price point. That's like the new GDDR7 VRAM coming out in January at $1k for every 24GB, and then in June the same hardware being introduced by another manufacturer at $500 for the same 24GB. That's highly unlikely.
 
Good points, but I'll have to disagree on the timeline here. To push a release of a product in three years' time, they'll have needed to settle on the tech already. Not only that, you have cost to consider. The consoles have never come out with equivalent feature sets at a much, much lower price point. That's like the new GDDR7 VRAM coming out in January at $1k for every 24GB, and then in June the same hardware being introduced by another manufacturer at $500 for the same 24GB. That's highly unlikely.

It's true that Sony doesn't just take an RDNA generation wholesale; they kind of mix and match what they want. They could have some kind of Frankenstein console with parts from RDNA4, UDNA and even UDNA2, as well as possibly some custom Sony hardware. Hard to say, really.

Regarding settling on tech: they likely have access to Radeon's future roadmap, and have for a while, so they work with whatever timelines AMD gives them for when certain IP blocks and feature sets will be available. It's not like they need to wait for a tech to be completely finalized and released (or just about to release) to include its IP blocks and features. But I think you are right that whatever Sony does for the PS6, it likely won't be just off-the-shelf UDNA, though it will certainly contain UDNA features, if not have UDNA as the base IP.
 
It's true that Sony doesn't just take an RDNA generation wholesale; they kind of mix and match what they want. They could have some kind of Frankenstein console with parts from RDNA4, UDNA and even UDNA2, as well as possibly some custom Sony hardware. Hard to say, really.

Regarding settling on tech: they likely have access to Radeon's future roadmap, and have for a while, so they work with whatever timelines AMD gives them for when certain IP blocks and feature sets will be available. It's not like they need to wait for a tech to be completely finalized and released (or just about to release) to include its IP blocks and features. But I think you are right that whatever Sony does for the PS6, it likely won't be just off-the-shelf UDNA, though it will certainly contain UDNA features, if not have UDNA as the base IP.
And that may be the case, but there are two very real challenges: 1) the cost of bringing new tech down to average consumer prices, and 2) AMD is already behind the curve on RT performance and upscaling.
 
I'll say this: there's no point in releasing a PS6 if it can't beat the 4090 in RT. Just a little better raster performance isn't going to move the needle; that's a PS5 Pro+.

Next gen has to move the needle drastically on the visuals. PS4 -> PS5 was a small jump even though the flop-count jump was decent. I suspect we need a factor of 100x to see substantial gains in visual quality using traditional raster techniques.

I'm very eager to see what Blackwell brings to the table with regards to rendering tech. I skipped the 40xx series, but if the 50xx series brings some awesome gains and features I may jump in, likely with a 5080. I'm just not interested in a 5090 that requires 600W (and the corresponding cooling).
 
I'll say this: there's no point in releasing a PS6 if it can't beat the 4090 in RT. Just a little better raster performance isn't going to move the needle; that's a PS5 Pro+.
But the 4090 is such a big leap over the 3090. I don't think AMD is going to make a GPU that's equivalent to the 4090 in RT; we don't see a single AMD board that can do that now, let alone one that's also much cheaper.
 
I'll say this: there's no point in releasing a PS6 if it can't beat the 4090 in RT. Just a little better raster performance isn't going to move the needle; that's a PS5 Pro+.

Next gen has to move the needle drastically on the visuals. PS4 -> PS5 was a small jump even though the flop-count jump was decent. I suspect we need a factor of 100x to see substantial gains in visual quality using traditional raster techniques.

I'm very eager to see what Blackwell brings to the table with regards to rendering tech. I skipped the 40xx series, but if the 50xx series brings some awesome gains and features I may jump in, likely with a 5080. I'm just not interested in a 5090 that requires 600W (and the corresponding cooling).

Looks like AMD won't be able to match the 4090's RT performance with the upcoming cards. They might be able to do it with their best card in 2027, which would be way too expensive to use in a PS6, and would likely arrive too late to be included anyway.
 
And that may be the case, but there are two very real challenges: 1) the cost of bringing new tech down to average consumer prices, and 2) AMD is already behind the curve on RT performance and upscaling.

You make a great point about cost. I'll also add things like transistor budget for a fixed die size, thermals, power draw, performance per area, etc.

Although I think the majority of the cost comes down to the wafers. If the PS6 uses something like TSMC's N3 process, that doesn't come cheap; I'd imagine that's more of a cost factor than specific IP blocks and which generation they come from.
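Wafer cost is easy to reason about with the standard dies-per-wafer approximation. A rough sketch with illustrative numbers: the ~$20k wafer price, ~300 mm² die, and 70% yield are assumptions for the sake of the arithmetic, not actual TSMC or Sony figures:

```python
import math

# Rough cost-per-die sketch using the standard dies-per-wafer
# approximation. All numbers are illustrative assumptions, not TSMC
# pricing: N3-class wafers are reported to cost on the order of $20k,
# and yield depends heavily on defect density and die size.

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Gross dies per wafer, with the usual correction for edge losses."""
    radius = wafer_diameter_mm / 2
    return int(math.pi * radius ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def cost_per_good_die(wafer_cost: float, die_area_mm2: float, yield_frac: float) -> float:
    """Wafer cost spread over the dies that actually work."""
    return wafer_cost / (dies_per_wafer(die_area_mm2) * yield_frac)

# Hypothetical ~300 mm^2 console APU on a $20k wafer at 70% yield:
print(round(cost_per_good_die(20_000, 300, 0.7), 2))
```

Note how cost per good die grows faster than linearly with die area (and in practice yield falls as dies get bigger too), which is one reason console APUs stay comparatively small.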

I just want to clear one thing up, I'm not making an argument that I think PS6 will match or exceed the 4090 in either raster or RT.

I was just stating that the PS6 will probably use UDNA features heavily, if not UDNA as the base IP, probably mixed and matched with other Radeon and Sony custom IP blocks. I don't believe they will use RDNA4 as the base IP, although I could be wrong.
 
I was just stating that the PS6 will probably use UDNA features heavily, if not UDNA as the base IP, probably mixed and matched with other Radeon and Sony custom IP blocks. I don't believe they will use RDNA4 as the base IP, although I could be wrong.
Yea, that's true. I'll be anxious to see what they come up with and how their true capabilities will be on release. We'll have years to speculate though. ;)
 

People are in an uproar over the pricing! But can you really blame Nvidia? They have zero competition.

Honestly, I feel like Nvidia did themselves a disservice with the naming of these cards. The xx90 series should have been a new tier of card marketed as a "prosumer" card, meant for the low-end private business/artist/creator market that wants a hybrid gaming and work machine; maybe they should have kept the Titan branding. But it never should have been associated with the other gaming cards.

Then the xx80 series could have been called the xx90 series with the above-$1k price point, and marketed as the peak "gaming card".

The xx70 series as it is now could have been the real xx80 and stayed a sub-$1k card that is considered the high-end "gaming card".


Or, you know, just price these things all reasonably. But they are in such a dominant position that they probably don't even care about the gaming market anymore. I can't imagine it makes up more than 10% of their income.
 
lol, OK: if the 5090 approaches €3k, then I'm out. That's just too much, if true. Euro prices are always quite a bit higher than the dollar ones.



If the 5080 is a grand, I'll get it. If it's $1500, I'll buy a 4080 Super for my current PC and will delay building a new one until 2026.

A 5090 won't fit in my current PC anyway, so it would need a new build.
 
Seems like the 5070 will have 12GB again, lmao.

I'll either wait until there is an 18GB version of that card (3GB memory modules vs. 2GB) or a 24GB version of the 5070 Ti.

But I'm also curious how the higher-end Intel cards turn out.
 
Nvidia can kiss my ass with those prices. My 7800 XT hasn't seen a game this year that it can't run at 1440p high at a minimum of 60fps.

Edit: 5090 is $2600, fucking absurd.
 
5080 apparently releasing Jan 21, two weeks after CES.


It is a bit weird that they would release it before the 5090.

Anyone who buys the 5080 on release is just going to be disappointed when they see the specs of the 5090.

 


MSI is preparing the GeForce RTX 5080 GAMING TRIO model which, as we can see, is the OC Edition. The packaging confirms that the card ships with 16GB of GDDR7 memory and a 256-bit memory bus. It is also set to feature a new design, though not a major change from its predecessor. The box confirms it includes three DisplayPort and one HDMI port, but does not specify the version of those ports.

Our focus should be on the platform technology details. Interestingly, there appear to be no changes whatsoever compared to the RTX 40 series; every marketing point and technology mentioned remains exactly the same as on the older model.

 
Part of me is glad they are doing the 5080 first. Hopefully that'll catch some of the FOMO market that would have otherwise bought/scalped the 5090.
 
When it's time to rebuild my PC, I'll be grabbing as much GPU as I can for $500 and not a penny more. These prices are ridiculous. For $2600 I could get a decent used motorcycle.

I read that motorcycle trips often showcase incredible path traced lighting in beautiful environments.
 
I read that motorcycle trips often showcase incredible path traced lighting in beautiful environments.

Actually that makes me realise another reason I don't give a shit about raytracing.

If I want to see beautifully realised realistic environments, I go outside. No videogame is ever going to truly compete with the beauty of reality, and chasing that pointless impossible dream is why we're in this nosedive of exploding costs for minimal improvements, while realistic game dev teams become 90% bloat.

I play videogames for their unrealistic elements. Even with graphics I vastly prefer style over realism.
 
nVidia are fully taking the piss, and I dread to think of the sick joke that the 5060 is going to be. I still haven't forgotten the artificial number bump where everything jumped a price tier and no one kicked up enough of a fuss.

Intel looks very interesting and could potentially be disruptive, but I'm hearing that they aren't the way to go for older games. That makes them a no go for me if true, which is a shame. But for anyone looking to play newer games they seem to be a hell of a value proposition.

I'm still intrigued by AMD, especially given the above. Like some of you I don't care about RT in the slightest, only the price/performance ratio and energy consumption. And maybe some Linux as a bonus. Which is why posts like @Torrent of Pork's catch my attention.
 
Actually that makes me realise another reason I don't give a shit about raytracing.

If I want to see beautifully realised realistic environments, I go outside. No videogame is ever going to truly compete with the beauty of reality, and chasing that pointless impossible dream is why we're in this nosedive of exploding costs for minimal improvements, while realistic game dev teams become 90% bloat.

I play videogames for their unrealistic elements. Even with graphics I vastly prefer style over realism.

You make the common mistake of linking RT to realism.

It's all an artistic choice. RT lighting even gives the devs more freedom to achieve artsy and unrealistic games. On the fly, they can adjust all the parameters like color, density and wavelength; they can distort waves, go for a comic style, exclude objects from the RT, etc. It makes it way easier to realize their vision than traditional lighting with all its complicated methods that require a ton of manual work. RT becoming the norm saves a lot of cost and work for all dev teams, big or small.

Nice try though, Mr. RT derangement syndrome.
 
nVidia are fully taking the piss, and I dread to think of the sick joke that the 5060 is going to be. I still haven't forgotten the artificial number bump where everything jumped a price tier and no one kicked up enough of a fuss.

Intel looks very interesting and could potentially be disruptive, but I'm hearing that they aren't the way to go for older games. That makes them a no go for me if true, which is a shame. But for anyone looking to play newer games they seem to be a hell of a value proposition.

I'm still intrigued by AMD, especially given the above. Like some of you I don't care about RT in the slightest, only the price/performance ratio and energy consumption. And maybe some Linux as a bonus. Which is why posts like @Torrent of Pork's catch my attention.

They keep jumping the price tiers. My 4070 Ti Super was almost £1000; my 1080 Ti was considerably cheaper than that.