Thread: The GPU Thread

Looks disappointing tbh, but the price will determine everything. In a way, it's impressive that the 7800XT with its 60 CUs can keep up with the 6800XT with 72 CUs. The 7800XT should have been the 7800 at most, considering the 6800 also has 60 CUs. It would look better too, as it would be around 15-20% faster than the previous gen. Compared to the 6800XT it is probably barely 5% faster. 🤷🏽‍♂️

The 7700XT has the potential to be very good though. About 15% faster than the 4060Ti, which would put it around the 6800/3070Ti level.

Don't price things stupidly, AMD. If the 7700XT is $349, it's an instant buy for me.
 
  • Strength
Reactions: Mickmrly
Looks like AMD announced FSR3

Still works on pretty much any GPU, still open source.

Combination of FSR2, Anti-Lag+ (this seems new, but there's no info on it) and Fluid Motion Frames (Frame Gen).

They also announced FSRAA, so you can run at native resolution but replace the game's TAA with FSRAA.

Most surprisingly, they announced a driver toggle to get Fluid Motion Frames working at the driver level.

Also supposedly works on consoles too.

I don't know if this includes the new Anti-Lag+ or how it works exactly. It might be a bit similar to RSR vs FSR 1.0, where an engine integration works better but you can still force it on in the driver?

Apparently coming to 12 games this year including Cyberpunk 2077.

Supposedly launches in a few weeks with updates to Forspoken and Immortals of Aveum.

I'm not mad on the idea of frame interpolation no matter who makes it, but let's see how they go with this. Hopefully they also include an improvement in the upscaling (FSR2) portion, as FSR2 has been strangely stagnant with updates as of late, which leads me to believe they may have been holding back some improvements for this? Hard to say.

At least they are finally making a move here; the radio silence for so long made it appear to be vaporware.



So, how does FSR 3 look? At Gamescom, we had a demonstration of both titles running with the new technology active on a Radeon 7900 XTX running at 4K output. Both were running with v-sync on, which AMD recommends for frame-pacing purposes. In the very small Forspoken demo we saw, the game was running locked at 120 frames per second and looked just as a v-synced 120fps should look. The game was running in FSR 2 quality mode providing its own frame-rate boost, with frame-gen then taking you up to the limit. In terms of fluidity and clarity, FSR 3 looked a match for DLSS 3 - a view shared by Alex, Rich and John, who were all present to see the demos in person. A great start for FSR 3.

😮
 
Last edited:
Supposedly FSR3 is designed with solutions in place to mitigate the HUD artifacting issues that affect some DLSS3 titles, so that seems to be a good sign. AMD was apparently very proud of this.

Weirdly enough they mentioned a new version of Anti-Lag, called "Anti-Lag+" but it seems that it only works on the 7000 series cards and only on "supported titles". Not sure if that means a whitelist or what.
 
  • Like
Reactions: Ascend
People here really don't care about AMD GPUs at all. If this was an nVidia announcement it would get a lot of attention. Even the trashy 4060 cards got more attention when announced.
In any case... This is a great video, detailing a lot about FSR3 that was not quite clear during the AMD keynote.



 
Last edited:
People here really don't care about AMD GPUs at all. If this was an nVidia announcement it would get a lot of attention. Even the trashy 4060 cards got more attention when announced.
In any case... This is a great video, detailing a lot about FSR3 that was not quite clear during the AMD keynote.





Probably because most of us own Nvidia cards; we are an enthusiast board that wants flagship performance, and the mid-tier doesn't interest many of us. Also, too many pro-AMD posters come off as AMD shills always trying to hype everything AMD does, almost to the point it seems like they are either paid or just out to be contrarians. It's annoying and just makes some of us want to hate AMD for that alone.

As for the announcements today, I'm actually happy AMD is attacking the middle; bringing this type of performance to the $450 price range will help raise the tide and keep devs aiming higher and higher as the years go on. FSR3 just isn't interesting to most of us Nvidia card owners. DLSS 3.5 was a more interesting announcement to me given the ray tracing performance and quality improvements, and the fact we'll have it on display in Cyberpunk in a few weeks. I'm excited for what FSR3 will mean for the console space though; anything that helps devs hit higher frame rate targets and opens up their tool chest to incorporate more things is a good thing.
 
$499 RX 7800 XT with 16GB… Nvidia will have to lower prices and release a Super line, no doubt. That price is just too good not to take market share.
 
People here really don't care about AMD GPUs at all. If this was an nVidia announcement it would get a lot of attention. Even the trashy 4060 cards got more attention when announced.
In any case... This is a great video, detailing a lot about FSR3 that was not quite clear during the AMD keynote.





I am interested in the cards, but I'm just waiting for the Australian tech stores to release their pricing. Retailers here are very weird with local pricing, so it is usually best to wait a little bit to see how the cards stack up value-wise.

For me it's between the 7700, the 3060 and the Intel A770. I guess I could wait until Intel releases their next gen, but then the 4000-series Ti cards will be out and then I might need to compare those as well. So many potential choices…
 
Probably because most of us own Nvidia cards; we are an enthusiast board that wants flagship performance, and the mid-tier doesn't interest many of us. Also, too many pro-AMD posters come off as AMD shills always trying to hype everything AMD does, almost to the point it seems like they are either paid or just out to be contrarians. It's annoying and just makes some of us want to hate AMD for that alone.

Imagine how the ones that like AMD feel when the whole internet is doing what you claim "AMD posters" are doing. But I digress.

I guess that's where I differ from most users. Every release influences the whole market. So even though I was never getting the 7600, for example, nor would I get an nVidia card, I still follow all the releases and reviews. I'm interested in the whole market, not just what I want to buy.


FSR3 just isn't interesting to most of us Nvidia card owners. DLSS 3.5 was a more interesting announcement to me given the ray tracing performance and quality improvements, and the fact we'll have it on display in Cyberpunk in a few weeks. I'm excited for what FSR3 will mean for the console space though; anything that helps devs hit higher frame rate targets and opens up their tool chest to incorporate more things is a good thing.

You mean the alleged nVidia enthusiasts, I presume, considering that FSR3 will work on nVidia cards as well, including the 3000 and 2000 series that don't have DLSS3. So FSR3 should be interesting to the majority of nVidia users who don't have access to DLSS3.

AMD does more for nVidia users than nVidia does.
 
Still works on pretty much any GPU, still open source.

The reason they make it work on any GPU is because they can't match dlss.

If you've got an Nvidia GPU there's zero reason to use FSR.

Except in those games where AMD pays to leave DLSS out of their releases.

I'm hyped for DLSS 3.5 and hyped that my 4070 will support it.

Crazy to get more and more performance as time goes on.

And yeah I'm gonna ignore crap games like remnant 2 and the other 3 hour long fps game.
 
The reason they make it work on any GPU is because they can't match dlss.
Actually I would imagine it is mainly due to market share. Radeon has probably 15% or less of the discrete GPU market on PC.

For a developer, implementing a closed-source version of FSR that only worked on AMD GPUs, or a subset of AMD GPUs, would mean that the majority of the market couldn't utilize it, which in turn would mean developers probably wouldn't bother implementing it.

Nvidia doesn't have that problem due to their massive market share and mind share with gamers.

Having said that, you are right that at the moment AMD hasn't yet been able to match DLSS2.x in overall quality. No argument here.

If you've got an Nvidia GPU there's zero reason to use FSR.

While this is true for the Turing architecture onwards, older Nvidia cards do not support DLSS as they have no tensor cores. I can imagine plenty of scenarios where, for example, a 1080 Ti or 1050 Ti owner would want to use FSR2, or XeSS for that matter.

But in this case, we are talking about the new FSR3, which is, much like DLSS3, a frame interpolation/frame generation technology. DLSS3 only works on RTX 4000 series GPUs, so Ampere and Turing users are out of luck there. If people are into frame generation then this could be a nice feature for those Nvidia users, while also working on AMD and Intel GPUs; it's hardly something to complain about when it comes for free and many people could benefit.
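
To put that compatibility argument in a concrete (if simplified) form, here's a tiny sketch; the support matrix just encodes the generalization made in this post, with per-game integrations and driver caveats ignored:

```python
# Hypothetical helper illustrating the compatibility argument above.
# The mapping is a simplification of the generalizations in this post
# (per-game integrations, drivers, official support lists are ignored).
FRAME_GEN_SUPPORT = {
    "DLSS 3 Frame Generation": {"RTX 40"},                       # Ada only
    "FSR 3 Fluid Motion Frames": {"RTX 40", "RTX 30", "RTX 20",
                                  "RX 7000", "RX 6000", "Arc"},   # vendor-agnostic
}

def frame_gen_options(gpu_family: str) -> list[str]:
    """Which frame-generation techs a given GPU family could use."""
    return [tech for tech, fams in FRAME_GEN_SUPPORT.items() if gpu_family in fams]

print(frame_gen_options("RTX 30"))  # ['FSR 3 Fluid Motion Frames']
print(frame_gen_options("RTX 40"))  # both options
```

Obviously toy code, but that's the whole point: only Ada owners get DLSS frame generation, while everyone else on that list only gets frame generation through FSR3.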

Except in those games where AMD pays to leave DLSS out of their releases.

Wasn't there an interview the other day where AMD said they haven't asked Bethesda to keep DLSS out of Starfield and that if they want to they can add it and they have AMD's full support? Granted that doesn't guarantee that they never mandated a lack of DLSS in the past but there have been a number of AMD sponsored titles that have launched with DLSS so that would maybe indicate that they don't block it as part of sponsorship, no?


I'm hyped for DLSS 3.5 and hyped that my 4070 will support it.

Crazy to get more and more performance as time goes on.

That is cool, I'm happy for you. I agree getting additional software features over time is great and fair play to Nvidia for continuing to innovate here. Having said that, isn't this exactly what AMD is doing with FSR3? I don't mean the RT reconstruction stuff, but rather offering additional functionality and performance over time?

And yeah I'm gonna ignore crap games like remnant 2 and the other 3 hour long fps game.

Emmm ok...cool story bro. No idea what this has to do with anything I said or the current topic? (FSR3)
 
  • Brain
Reactions: Ascend
Actually I would imagine it is mainly due to market share. Radeon has probably 15% or less of the discrete GPU market on PC.

For a developer, implementing a closed-source version of FSR that only worked on AMD GPUs, or a subset of AMD GPUs, would mean that the majority of the market couldn't utilize it, which in turn would mean developers probably wouldn't bother implementing it.

Nvidia doesn't have that problem due to their massive market share and mind share with gamers.

Having said that, you are right that at the moment AMD hasn't yet been able to match DLSS2.x in overall quality. No argument here.



While this is true for the Turing architecture onwards, older Nvidia cards do not support DLSS as they have no tensor cores. I can imagine plenty of scenarios where, for example, a 1080 Ti or 1050 Ti owner would want to use FSR2, or XeSS for that matter.

But in this case, we are talking about the new FSR3, which is, much like DLSS3, a frame interpolation/frame generation technology. DLSS3 only works on RTX 4000 series GPUs, so Ampere and Turing users are out of luck there. If people are into frame generation then this could be a nice feature for those Nvidia users, while also working on AMD and Intel GPUs; it's hardly something to complain about when it comes for free and many people could benefit.



Wasn't there an interview the other day where AMD said they haven't asked Bethesda to keep DLSS out of Starfield and that if they want to they can add it and they have AMD's full support? Granted that doesn't guarantee that they never mandated a lack of DLSS in the past but there have been a number of AMD sponsored titles that have launched with DLSS so that would maybe indicate that they don't block it as part of sponsorship, no?




That is cool, I'm happy for you. I agree getting additional software features over time is great and fair play to Nvidia for continuing to innovate here. Having said that, isn't this exactly what AMD is doing with FSR3? I don't mean the RT reconstruction stuff, but rather offering additional functionality and performance over time?



Emmm ok...cool story bro. No idea what this has to do with anything I said or the current topic? (FSR3)

100% a fair response.
 
My money is already set aside to get the 7800XT. Funnily enough, I randomly received over $2k in cryptocurrency yesterday for free :D Not bad during a bear market. I sold half of it to USDC to pay for the graphics card and have some spare money left. The other half will stay in my wallet awaiting the bull market.

Now I just have to decide which 7800XT I'm going for... I've always had a good experience with Sapphire. They are my go-to brand, but I notice that they have been a bit overpriced lately, since everyone likes them. We'll wait and see. I'm definitely not going to pre-order though. I'm going to wait for reviews first. But if the performance holds true, this is the best value for money graphics card we've had in a while.
 
My money is already set aside to get the 7800XT. Funnily enough, I randomly received over $2k in cryptocurrency yesterday for free :D Not bad during a bear market. I sold half of it to USDC to pay for the graphics card and have some spare money left. The other half will stay in my wallet awaiting the bull market.

Now I just have to decide which 7800XT I'm going for... I've always had a good experience with Sapphire. They are my go-to brand, but I notice that they have been a bit overpriced lately, since everyone likes them. We'll wait and see. I'm definitely not going to pre-order though. I'm going to wait for reviews first. But if the performance holds true, this is the best value for money graphics card we've had in a while.

Why not grab a 4090?
 
Why not grab a 4090?

Opportunity cost. I try to be efficient with my money even if I have enough, and the 4090 is anything but that, in my case. Graphics cards are depreciating assets, meaning that the majority of the $2k is better spent elsewhere.

And in practice, it doesn't make sense for me. Even the 7800XT is probably overkill for my current 21:9 1080p monitor, so the 4090 is a waste of money; I'd have to upgrade my monitor as well to get anything out of it. Not to mention that there's not really a secondary market for cards where I live. I'd have to send the card to another country to sell it, which limits what I can gain from reselling it. It's one of the reasons I always plan to use my cards for 5 years or so. And it's no secret that I dislike nVidia and wouldn't support them unless there was literally no other choice. If I wanted a higher-end card I'd get one of the 7900 cards, and I'd still have money left over.

Additionally, if I stake the rest of the cryptocurrency, I can get anywhere between 5% and 20% return on it, not counting fluctuations in its price. And I'd like to get a new case and some additional SSDs for more space as well. So yeah, a $500-$600 card does nicely to upgrade my R9 Fury, while leaving some money to buy other stuff and gain some additional interest that beats inflation.
 
  • Like
Reactions: Pagusas
Old GPU designs were wild, I loved it. So boring today. Give me badly rendered CGI action characters. Give me mages and aliens on my GPUs.

Yeston is the only one that does this for their GPUs, although the cards are often quite girly.
 


In this video we unbox the Limited-Edition Starfield AMD RX 7900 XTX & Ryzen 7 7800X3D Bundle! This special edition of their flagship hardware is being produced in a limited run of only 500 and can only be won via a giveaway. I think the kit is absolutely beautiful and can't wait to play Starfield on this awesome kit!
 
Leak suggests that the 7700XT scores ~17k in 3DMark TimeSpy. Allegedly the 7800XT scores somewhere in the 19k ballpark.
By comparison:

6700XT scores ~12.5k
4060Ti scores ~13.4k
3070 scores ~13.6k
6800 scores ~15k
6800XT scores ~17.2k
4070 scores ~17.7k
3080 scores ~17.8k
3080 Ti scores ~19k
6900XT scores ~20k

So the 12GB $449 7700XT scores around 3.6k points higher (roughly 27% faster) than the 4060 Ti, while costing 13% more than the 8GB 4060 Ti, or 10% less than the 4060 Ti 16GB. It really makes the 4060 Ti look more stupid than it already is.

If the 19k is correct for the 7800XT, we're getting 3080 Ti performance at $499, which is good. It would be $100 cheaper than the 4070 while still beating it slightly. It seems a bit too good to be true for this one, although if RDNA3 somehow had perfect per-CU scaling (11% more CUs for ~11% more performance), the scores would make sense. But scaling is normally not linear.

It seems crazy too, for a 54-CU 7700XT to be able to keep up with the 72 CUs of the 6800XT, or even the 60 CUs of the 6800. The same applies to the 60 CUs of the 7800XT keeping up with the 80 CUs of the 6900XT. After all, the 7900XT required 96 CUs to beat the 80-CU 6950XT. That's 20% more CUs for 18% more performance. Unless AMD fixed some sort of bug in RDNA3, the scaling portrayed in these leaks is bogus. It would seem weird if the 7800XT were able to even reach the 6800XT at all with 17% fewer CUs, let alone beat it... AMD may have lied about their performance again just like with the 7900 cards, and these leaks may be AMD themselves trying to market their cards.
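
As a quick back-of-the-envelope check, here's a small script plugging in the leaked scores above (all unverified numbers that could easily be wrong) to show what scaling they would imply:

```python
# Rough sanity check using the leaked TimeSpy scores quoted above.
# All numbers are unverified leaks / approximations, not measured results.
scores = {
    "7700XT": 17000, "7800XT": 19000,
    "6700XT": 12500, "4060Ti": 13400, "3070": 13600,
    "6800": 15000, "6800XT": 17200, "4070": 17700,
    "3080": 17800, "3080Ti": 19000, "6900XT": 20000,
}
cus = {"7700XT": 54, "7800XT": 60, "6800": 60, "6800XT": 72, "6900XT": 80}

def faster(a: str, b: str) -> float:
    """Percentage by which card `a` outscores card `b`."""
    return (scores[a] / scores[b] - 1) * 100

print(f"7700XT vs 4060Ti: {faster('7700XT', '4060Ti'):+.0f}%")  # ~ +27%
print(f"7800XT vs 6800XT: {faster('7800XT', '6800XT'):+.0f}%")  # ~ +10%
print(f"7800XT vs 4070:   {faster('7800XT', '4070'):+.0f}%")    # ~  +7%

# Implied RDNA3 scaling between the two leaked cards:
cu_gain = cus["7800XT"] / cus["7700XT"] - 1          # ~ +11% more CUs
perf_gain = scores["7800XT"] / scores["7700XT"] - 1  # ~ +12% more score
print(f"+{cu_gain:.0%} CUs -> +{perf_gain:.0%} perf (suspiciously close to linear)")
```

If the leak were real, the 7700XT-to-7800XT step would be essentially perfect per-CU scaling, which is exactly the part that looks too good to be true.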


Can't wait for benchmarks, and the performance disappointment.
 


So... It was nVidia that "leaked" the story of AMD allegedly blocking DLSS. I suspected as much. And they're planning on blaming AMD again for their poor performance in Starfield. What a surprise.
 
  • Funny
Reactions: Bolivar

In summary, without ray tracing, the Radeon RX 7800 XT outperforms the GeForce RTX 4070 by almost 7% on average, trails it by around 11.6% with ray tracing enabled, and maintains a slight 0.5% lead overall. Meanwhile, the RX 7700 XT exhibits 16% higher raster performance than the RTX 4060 Ti 16GB; here the presence of ray tracing tips the scales slightly in NVIDIA's favor, but the AMD GPU still keeps an 8.5% lead overall.

  • Radeon RX 7800 XT 16GB vs. RTX 4070 12GB
    • RASTER: +6.9%
    • RT: -11.6%
    • AVG: +0.5%
  • Radeon RX 7700 XT 12GB vs. RTX 4060 Ti 16GB
    • RASTER: +15.9%
    • RT: -5.4%
    • AVG: +8.5%
 
Last edited:
  • Brain
Reactions: Mickmrly
I get that the 7800XT is significantly cheaper than the 6800XT was at launch but uh, basically offering the same performance is nearly 4000-series lame.

Just waiting for this whole shitty generation to be over.
 
  • Funny
Reactions: Ascend
@Ascend How can you seriously think these are good?

Like big whoop, 7700xt is better than the worst 60 series card ever. The bar is fucking low dude.

They're definitely better than the Nvidia equivalents (unless Nvidia drops the price of the 4070 in particular), but exciting? Come on.
 
@Ascend How can you seriously think these are good?
Because they are.

Like big whoop, 7700xt is better than the worst 60 series card ever. The bar is fucking low dude.
Why are you specifically paying attention to the worse of the two? That being said, the 7700XT only looks less good because of the 7800XT.

They're definitely better than the Nvidia equivalents (unless Nvidia drops the price of the 4070 in particular), but exciting? Come on.
Let's see... Let's assume for now that the 7800XT performs exactly the same as the 6800XT.

Considering that everyone was drooling over how good value the RTX 3080 was at its MSRP of $699, and the 6800XT was about the same performance for $649, how is that same performance for $150 less at launch today not good value? Just FYI, the 6800XT MSRP of $649 in 2020 (launch year) is over $760 today. That is using CPI values, which means it's likely worse in reality. Tell me again how the 7800XT is not a good deal.
Considering that the 4070 is $100 more and is either the same or slightly slower than the 7800XT, how is the 7800XT not good value?
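
A rough sketch of that math (the CPI index values below are approximations I'm assuming for late 2020 vs. mid 2023, not official figures):

```python
# Back-of-the-envelope check of the value argument above.
# CPI-U index values are approximate assumptions, not official figures.
cpi_late_2020, cpi_mid_2023 = 260.2, 305.0

msrp_6800xt_2020 = 649
adjusted = msrp_6800xt_2020 * cpi_mid_2023 / cpi_late_2020
print(f"6800XT's $649 launch MSRP is roughly ${adjusted:.0f} in today's dollars")  # ~$760

# Roughly the same performance, launch price vs. launch price:
msrp_7800xt, msrp_4070 = 499, 599
print(f"7800XT launches ${msrp_6800xt_2020 - msrp_7800xt} below the 6800XT's MSRP")      # $150
print(f"and undercuts the 4070 by {(1 - msrp_7800xt / msrp_4070):.0%} at similar perf")  # ~17%
```

Rough numbers, but the direction of the argument holds either way.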

If the 4070 had come out at $499 it would have broken the internet. But once again, because it's AMD, everybody needs to find something wrong with it.

Yes, we could have bought the 6800XT for a while now for around $500. All I'll say is that I normally buy previous gen cards just before they're being phased out, because they are generally better value. It literally was like this for every single previous graphics card that I bought. This time, it is better to buy the new version. Assuming that I'm not ignorant regarding graphics cards, that says something.
 
Because they are.


Why are you specifically paying attention to the worse of the two? That being said, the 7700XT only looks less good because of the 7800XT.


Let's see... Let's assume for now that the 7800XT performs exactly the same as the 6800XT.

Considering that everyone was drooling over how good value the RTX 3080 was at its MSRP of $699, and the 6800XT was about the same performance for $649, how is that same performance for $150 less at launch today not good value? Just FYI, the 6800XT MSRP of $649 in 2020 (launch year) is over $760 today. That is using CPI values, which means it's likely worse in reality. Tell me again how the 7800XT is not a good deal.
Considering that the 4070 is $100 more and is either the same or slightly slower than the 7800XT, how is the 7800XT not good value?

If the 4070 had come out at $499 it would have broken the internet. But once again, because it's AMD, everybody needs to find something wrong with it.

Yes, we could have bought the 6800XT for a while now for around $500. All I'll say is that I normally buy previous gen cards just before they're being phased out, because they are generally better value. It literally was like this for every single previous graphics card that I bought. This time, it is better to buy the new version. Assuming that I'm not ignorant regarding graphics cards, that says something.
I am not saying it's shit because it's AMD; the 7800XT is just as boring and unimpressive as the 7700XT.

I've literally said 4000 series is worse, and sets the low bar.

These new cards suck too, just less. The 7800XT is a smaller chip than the 6800XT, so it really shouldn't even be labelled the same; same problem with Nvidia's lineup getting bumped up in naming tiers despite the 4060 Ti really being a 4050 Ti, etc.

You only think it's good because the fucked-up GPU market has drastically lowered the bar compared to the Nvidia Maxwell/Pascal generations.
 
The contrast between the CPU and GPU markets is remarkable: the CPU competition is fierce, but the GPU makers just couldn't give a shit.

I think that will change next generation with RDNA4 and Blackwell; then maybe some people will see just how terrible this generation is, and I mean it is pretty much the worst generation in graphics to date.
 
The contrast between the CPU and GPU markets is remarkable: the CPU competition is fierce, but the GPU makers just couldn't give a shit.

I think that will change next generation with RDNA4 and Blackwell; then maybe some people will see just how terrible this generation is, and I mean it is pretty much the worst generation in graphics to date.

It's so weird, because I absolutely can see it being the worst generation... but the 4090 is also the best GPU I've ever owned. Such a weird generation: it's produced the best card ever made, value be damned, but also produced the worst cards in a long time, and so many bad cards that it's easily the worst gen.
 
  • Like
Reactions: Chozofication
It's so weird, because I absolutely can see it being the worst generation... but the 4090 is also the best GPU I've ever owned. Such a weird generation: it's produced the best card ever made, value be damned, but also produced the worst cards in a long time, and so many bad cards that it's easily the worst gen.
4090 only exists because of Jensen's ego… he really cares about having the best card on the market.

With the rest of the lineup, they were hoping gamers would accept fake frames as a big generational leap lol.

I actually think his ego is a reason why we're going to get a much better generation this time. Like with Turing (the 2000 series), which was similar to this generation in hoping a new feature (RT) would get people excited, but people weren't.

So I think it's fair to expect 5000 series as a damage control generation like 3000 series was.

As for AMD, they're clearly content to "just" compete with Nvidia and not take this opportunity to steal market share, likely because of the consoles and the server market/laptops.

Intel is the only hope to really shake things up, but for the moment they're having too many issues with drivers, and they also need to focus on the server market and laptops as well.
 
  • Brain
Reactions: Mickmrly
4090 only exists because of Jensen's ego… he really cares about having the best card on the market.

With the rest of the lineup, they were hoping gamers would accept fake frames as a big generational leap lol.

I actually think his ego is a reason why we're going to get a much better generation this time. Like with Turing (the 2000 series), which was similar to this generation in hoping a new feature (RT) would get people excited, but people weren't.

So I think it's fair to expect 5000 series as a damage control generation like 3000 series was.

As for AMD, they're clearly content to "just" compete with Nvidia and not take this opportunity to steal market share, likely because of the consoles and the server market.

Intel is the only hope to really shake things up, but for the moment they're having too many issues with drivers, and they also need to focus on the server market.

Ego or not, I'm glad the 4090 exists; it's the perfect prosumer card, and it makes PC gaming insanely good. It's sad that it's priced out of range for so many users though. A new 1080 Ti is what is needed, a card that defies all value/cost/performance ratios. That was my 2nd favorite card, and even in retirement I keep that thing on the shelf next to my GeForce 2 GTS, ATI 9800 Pro and GeForce 8800. Monsters of the GPU world. The 4090 will be going up next to them once it's retired.
 
  • Like
Reactions: Joe T.
Ego or not, I'm glad the 4090 exists; it's the perfect prosumer card, and it makes PC gaming insanely good. It's sad that it's priced out of range for so many users though. A new 1080 Ti is what is needed, a card that defies all value/cost/performance ratios. That was my 2nd favorite card, and even in retirement I keep that thing on the shelf next to my GeForce 2 GTS, ATI 9800 Pro and GeForce 8800. Monsters of the GPU world. The 4090 will be going up next to them once it's retired.
The 1080 Ti was really better all round for its time, both in price-to-performance and in not having as much CPU overhead from drivers as the 4090.

I hate how hot and stupidly sized the 4090 is as well… for me, I don't own one because I don't want to get hosed on price, no matter how fast it is.