So, how does FSR 3 look? At Gamescom, we had a demonstration of both titles running with the new technology active on a Radeon 7900 XTX at 4K output. Both were running with v-sync on, which AMD recommends for frame-pacing purposes. In the very small Forspoken demo we saw, the game was running locked at 120 frames per second and looked just as a v-synced 120fps game should look. The game was running in FSR 2 quality mode, providing its own frame-rate boost, with frame-gen then taking you up to the limit. In terms of fluidity and clarity, FSR 3 looked a match for DLSS 3 - a view shared by Alex, Rich and John, who were all present to see the demos in person. A great start for FSR 3.
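For a rough sense of the numbers behind that demo, here is a minimal back-of-the-envelope sketch in Python. It assumes FSR 2 Quality mode's published 1.5x per-axis scale factor and that frame generation roughly doubles the rendered frame rate; the function names are purely illustrative and not part of any AMD SDK.

```python
# Back-of-the-envelope numbers for the Forspoken demo described above.
# Assumptions (not from the article): FSR 2 Quality mode renders at a
# 1.5x per-axis reduction, and frame generation inserts one interpolated
# frame between every pair of rendered frames (~2x presented rate).

def fsr_internal_resolution(out_w: int, out_h: int, scale: float = 1.5):
    """Internal render resolution for a given FSR per-axis upscale factor."""
    return round(out_w / scale), round(out_h / scale)

def presented_fps(rendered_fps: float, frame_gen: bool = True) -> float:
    """Presented frame rate when frame generation doubles the rendered rate."""
    return rendered_fps * 2 if frame_gen else rendered_fps

w, h = fsr_internal_resolution(3840, 2160)             # 4K output, Quality mode
print(f"Internal render resolution: {w}x{h}")           # -> 2560x1440
print(f"Presented rate from 60 fps rendered: {presented_fps(60)} fps")  # -> 120
```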
People here really don't care about AMD GPUs at all. If this was an nVidia announcement it would get a lot of attention. Even the trashy 4060 cards got more attention when announced.
In any case... This is a great video, detailing a lot about FSR3 that was not quite clear during the AMD keynote.
Probably because most of us own Nvidia cards: this is an enthusiast board that wants flagship performance, and the mid-tier doesn't interest many of us. Also, too many pro-AMD posters come off as AMD shills, always trying to hype everything AMD does, almost to the point it seems like they are either paid or just out to be contrarians. It's annoying and just makes some of us want to hate AMD for that alone.
FSR3 just isn't interesting to most of us Nvidia card owners; DLSS3.5 was a more interesting announcement to me given the ray tracing performance and quality improvements, and the fact we'll have it on display in Cyberpunk in a few weeks. I'm excited for what FSR3 will mean for the console space though, anything that helps devs hit higher frame rate targets and opens up their tool chest to incorporate more things is a good thing.
Still works on pretty much any GPU, still open source.
Actually I would imagine it is mainly due to market share. Radeon has probably 15% or less of the discrete GPU market on PC.
The reason they make it work on any GPU is because they can't match DLSS.
If you've got an Nvidia GPU there's zero reason to use FSR.
Except in those games where AMD pays to leave DLSS out of their releases.
I'm hyped for DLSS 3.5 and hyped that my 4070 will support it.
Crazy to get more and more performance as time goes on.
And yeah, I'm gonna ignore crap games like Remnant 2 and the other 3-hour-long FPS game.
Actually I would imagine it is mainly due to market share. Radeon has probably 15% or less of the discrete GPU market on PC.
For a developer to implement a closed-source version of their FSR tech, for example, one that only works on AMD GPUs or a subset of AMD GPUs, would mean that the majority of the market wouldn't be able to utilize it. Which in turn would mean developers probably wouldn't bother implementing it.
Nvidia doesn't have that problem due to their massive market share and mind share with gamers.
Having said that, you are right that at the moment AMD hasn't yet been able to match DLSS2.x in overall quality. No argument here.
While this is true for the Turing architecture onwards, older Nvidia cards do not support DLSS as they have no tensor cores. I can imagine plenty of scenarios where, for example, a 1080 Ti or 1050 Ti owner would want to use FSR2, or XeSS for that matter.
But in this case, we are talking about the new FSR3 which is, much like DLSS3, a frame interpolation/frame generation technology. DLSS3 only works on RTX 4000 series GPUs, so Ampere and Turing users are out of luck there. If people are into frame generation then this could be a nice feature for those Nvidia users, while also working on AMD and Intel GPUs; hardly a reason to complain when it comes for free and many people could benefit.
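To make the "frame interpolation/frame generation" idea concrete, here is a deliberately naive sketch. It only illustrates synthesizing a presented frame between two rendered ones; FSR3 and DLSS3 rely on motion vectors and optical flow rather than a plain blend like this, and the function name is made up for the example.

```python
# Deliberately naive illustration: real frame generation (FSR 3, DLSS 3)
# uses motion vectors / optical flow, not a straight average. This only
# shows the basic idea of synthesizing a frame between two rendered frames.
import numpy as np

def naive_midpoint_frame(frame_a: np.ndarray, frame_b: np.ndarray) -> np.ndarray:
    """Blend two rendered frames 50/50 to fake an in-between frame."""
    blended = (frame_a.astype(np.float32) + frame_b.astype(np.float32)) / 2
    return blended.astype(frame_a.dtype)

# Example with two dummy 1080p RGB frames: one black, one white.
prev_frame = np.zeros((1080, 1920, 3), dtype=np.uint8)
next_frame = np.full((1080, 1920, 3), 255, dtype=np.uint8)
generated = naive_midpoint_frame(prev_frame, next_frame)  # mid-grey (~127) everywhere
```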
Wasn't there an interview the other day where AMD said they haven't asked Bethesda to keep DLSS out of Starfield and that if they want to they can add it and they have AMD's full support? Granted that doesn't guarantee that they never mandated a lack of DLSS in the past but there have been a number of AMD sponsored titles that have launched with DLSS so that would maybe indicate that they don't block it as part of sponsorship, no?
That is cool, I'm happy for you. I agree getting additional software features over time is great and fair play to Nvidia for continuing to innovate here. Having said that, isn't this exactly what AMD is doing with FSR3? I don't mean the RT reconstruction stuff, but rather offering additional functionality and performance over time?
Emmm ok...cool story bro. No idea what this has to do with anything I said or the current topic? (FSR3)
My money is already set aside to get the 7800XT. Funnily enough, I randomly received over $2k in cryptocurrency yesterday for free. Not bad during a bear market. I sold half of it to USDC to pay for the graphics card and have some spare money left. The other half will stay in my wallet awaiting the bull market.
Now I just have to decide which 7800XT I'm going for... I've always had a good experience with Sapphire. They are my go-to brand, but I notice that they have been a bit overpriced lately, since everyone likes them. We'll wait and see. I'm definitely not going to pre-order though. I'm going to wait for reviews first. But if the performance holds true, this is the best value for money graphics card we've had in a while.
Why not grab a 4090?
Old GPU designs were wild, I loved it. So boring today. Give me badly rendered CGI action characters. Give me mages and aliens on my GPUs.
In this video we unbox the Limited-Edition Starfield AMD RX 7900 XTX & Ryzen 7 7800X3D bundle! This special edition of their flagship hardware is being produced in a limited run of only 500 units and can only be won via a giveaway. I think the kit is absolutely beautiful and I can't wait to play Starfield on it!
In this video we take a look at the Intel Arc A770 from ASRock, more specifically the Intel Arc A770 Phantom Gaming 16GB OC. Is it worth buying an Intel Arc GPU in 2023? Let's find out.
Btw, will next-gen GPUs be even bigger? At some point they'll come in their own tower, looking at some of the monstrosities this gen.
Because they are.
@Ascend How can you seriously think these are good?
Why are you specifically paying attention to the worse of the two? That being said, the 7700XT only looks less good because of the 7800XT.
Like big whoop, the 7700XT is better than the worst 60-series card ever. The bar is fucking low, dude.
Let's see... Let's assume for now that the 7800XT performs exactly the same as the 6800XT.
They're definitely better than the Nvidia equivalents (unless Nvidia drops the price of the 4070 in particular), but exciting? Come on.
I am not saying it's shit because it's AMD. And B, the 7800XT is just as boring and unimpressive as the 7700XT.
Because they are.
Why are you specifically paying attention to the worse of the two? That being said, the 7700XT only looks less good because of the 7800XT.
Let's see... Let's assume for now that the 7800XT performs exactly the same as the 6800XT.
Considering that everyone was drooling over how good value the RTX 3080 was at its MSRP of $699, and the 6800XT was about the same performance for $649, how is that same performance for $150 less at launch today not good value? Just FYI, the 6800XT MSRP of $649 in 2020 (launch year) is over $760 today. That is using CPI values, which means it's likely worse in reality. Tell me again how the 7800XT is not a good deal.
Considering that the 4070 is $100 more and is either the same or slightly slower than the 7800XT, how is the 7800XT not good value?
If the 4070 had come out at $499, it would have broken the internet. But once again, because it's AMD, everybody needs to find something wrong with it.
Yes, we could have bought the 6800XT for a while now for around $500. All I'll say is that I normally buy previous gen cards just before they're being phased out, because they are generally better value. It literally was like this for every single previous graphics card that I bought. This time, it is better to buy the new version. Assuming that I'm not ignorant regarding graphics cards, that says something.
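A quick sanity check of the CPI-adjusted launch price quoted a few posts up - a minimal sketch, assuming roughly 17% cumulative US CPI inflation between the card's late-2020 launch and 2023 (an approximation, not an official figure):

```python
# Rough inflation check for the launch prices quoted above.
# The 0.17 cumulative CPI figure is an assumption/approximation.
def inflation_adjust(price: float, cumulative_inflation: float = 0.17) -> float:
    """Scale a historical price by an assumed cumulative inflation rate."""
    return price * (1 + cumulative_inflation)

print(f"6800XT   $649 (2020) -> ${inflation_adjust(649):.0f} today")  # ~$759
print(f"RTX 3080 $699 (2020) -> ${inflation_adjust(699):.0f} today")  # ~$818
```

That lands in the same ballpark as the "over $760" figure, give or take the exact CPI window used.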
The contrast between the CPU and GPU markets is remarkable: CPU competition is fierce, but the GPU makers just couldn't give a shit less.
I think that will change next generation with RDNA 4 and Blackwell; then maybe some people will see just how terrible this generation is. And I mean it is pretty much the worst generation in graphics to date.
4090 only exists because of Jensen's ego… he really cares about having the best card on the market.
It's so weird, because I absolutely can see it being the worst generation... but the 4090 is also the best GPU I've ever owned. Such a weird generation: it's produced the best card ever made, value be damned, but also produced the worst cards in a long time, and so many bad cards that it easily is the worst gen.
4090 only exists because of Jensen's ego… he really cares about having the best card on the market.
With the rest of the lineup, they were hoping gamers would accept fake frames as a big generational leap lol.
I actually think his ego is a reason why we're going to get a much better generation this time. Like with Turing, the 2000 series: it was similar to this generation, hoping a new feature (RT) would get people excited, but people weren't.
So I think it's fair to expect the 5000 series to be a damage-control generation like the 3000 series was.
As for AMD, they're clearly content to "just" compete with Nvidia and not take this opportunity to steal market share, likely because of the console and server markets.
Intel is the only hope to really shake things up, but for the moment they're having too many issues with drivers and they also need to focus on the server market.
The 1080 Ti was really better all round for its time, both in price to performance, and it doesn't have as much CPU overhead from drivers as the 4090.
Ego or not, I'm glad the 4090 exists; it's the perfect prosumer card and makes PC gaming insanely good. It's sad that it's priced out of range for so many users though. A new 1080 Ti is what is needed, a card that defies all value/cost/performance ratios. That was my 2nd favorite card and even in retirement I keep that thing on the shelf next to my GeForce 2 GTS, ATI 9800 Pro and GeForce 8800. Monsters of the GPU world. The 4090 will be going up next to them once it's retired.