I stopped posting about this shortly after we came here because it's unpleasant and it feels like talking to a wall. I know I've said some of this before, but I'll put it all here in one place, and I don't really have anything left to say beyond this. If anyone has anything constructive to add, I'll certainly check back.
If there's anything you guys take away from this post, I would encourage you to seriously reevaluate your estimation of AMD and their hardware. I have had a 6900XT for the last 2 years, bought when the pandemic ended and it was on sale at MSRP. Great card, great price, overclocks nicely, awesome temps and efficiency. I don't know if Nvidia still bifurcates their control panel and GeForce Experience, but the user experience with Radeon is leagues better: everything is in one easy-to-navigate place, plus a toggleable in-game overlay. I also can't recall a single software issue in the last two years, whereas I had plenty using a GTX 1080 from 2017-2022. But most importantly:
it's still destroying every single game, new and old, that I throw at it at native resolutions. I would challenge you guys to honestly ask yourselves whether the bigger problem is that graphics and game design are in a bad place. It's taking years to develop new titles, and it's clear no one wants to push the graphical envelope anymore. The few titles that do are still getting shredded by these four-year-old graphics cards, and many of them no longer even bother with Ray Tracing, which I will get into later.
I genuinely don't enjoy writing things like this on the internet anymore, but here it is: I don't believe Nvidia fans who say they want AMD to compete. The reason is that I've seen what happens to Nvidia fans when AMD does. In 2013, the Radeon 200 series had incredible price-to-performance and forced Nvidia to respond with a great lineup at competitive prices: a $330 970 and a $550 980. The magical thing Nvidia fans say they always wanted actually happened - and they completely lost their minds. They initiated a carpet-bombing campaign of threads and posts insisting Radeon GPUs were unusable, failed products that would legitimately damage your PC if you used them. This is the dictionary definition of "delusional," because those allegedly horrific products are exactly what forced Nvidia's hand into creating a competitive GPU market.
The only other time I saw anything like that was the launch of RDNA2 in 2020. You have to remember that people had said for years that AMD would never compete beyond the midrange again and, having gone back to Nvidia myself, I was honestly a little stunned when they did. Not only did RDNA2 compete with the 3070 and 3080, it actually had better efficiency, temperatures, frame pacing, and overclocking than the 30 series. I mention those four metrics specifically because they were supposed to have been the reasons no one could ever choose a Radeon 200 over a GeForce 900. Which is the theme of this post: I don't believe the Nvidia fans anymore, because it's clear they don't actually care about any of these things - they will trash AMD every single time, even when AMD matches performance, undercuts the price, and surpasses Nvidia's overall quality.
This obviously brings us to the reason Nvidia fans said RDNA2 couldn't compete: Ray Tracing. This is the part of the conversation that makes me feel like a completely deranged person, and if someone can help me make sense of it, I would honestly be immensely grateful:
WHERE ARE THE RAY TRACING GAMES?
We were told you couldn't buy RDNA2 because you would miss out on all the biggest PC games of that generation and, if anyone replies critically to this post, I hope you will at least answer this one question: are you willing to admit that this prediction turned out to be untrue? The best-looking games I played in 2022, when the third generation of RTX cards launched, were Warhammer III and Modern Warfare 2 - neither of which had Ray Tracing, and the latter of which had actually dropped the feature after implementing it in earlier titles. Ray Tracing isn't even "dead on arrival" anymore; it's been too many years without a marquee title for that.
This naturally brings us to upscaling and frame generation, and this is another area where it feels like I'm the lone crazy person on earth. It utterly blows my mind and disheartens me to read that PC gamers actually want to play games at fake resolutions and framerates, all for the sake of a feature that game developers aren't even using anymore. This is not why I built a PC, keep upgrading it, and spend time tweaking the settings of my favorite games. I know there are anti-aliasing benefits, but I wish all these resources had gone into proper AA rather than into smoke and mirrors for a feature the industry is clearly not ready for and won't be for a very long time. In the few new Ray Tracing games I do play, my 6900XT still runs everything at high framerates, and I honestly can't even see what the feature is doing. The only games that show what Ray Tracing can do are half a decade old or more at this point, or are updates to games from the gaming stone age like Quake.
Guys, I'm on my knees begging you, please explain to me what I'm getting wrong here.
The one area where AMD is not competitive is against the x090-tier cards, and it needs to be said - those are not gaming GPUs. They are productivity Titans that Nvidia has been forced to market as gaming cards to reinforce the false narrative that there is still at least some area where AMD cannot compete. If Radeon had really wanted to squeeze every last bit of performance out of RDNA3 with a huge die, efficiency thrown out the window, and an absurd price... well, that would be a 4090, and I don't blame them for not making one. I know some of you have one, and you're not really paying full price because you're selling your older card, but it's still a meme, because there are no games today, nor will there be in the next few years, that the 4080/7900XTX can't already destroy.
Of course, this entire argument is absurd - AMD hardware is in the consoles, and we just capped off a whole new generation of a resurrected, reimagined handheld market built largely on their APUs. They are going to be a central player in the gaming industry, much more so than Nvidia in that sense, for a long time to come, and there's nothing Nvidia can do to change that. So the argument has to be restricted to something crazy, like dedicated enthusiast-tier productivity-crossover GPUs, just because people have to perpetuate the narrative.
Steam charts mean nothing. You can't acknowledge that they're inaccurate and also say they must mean something at the same time. That's called cognitive dissonance, and it's psychologically unhealthy for you. Again, if there's a critical reply to this post, I ask that you at least answer this question in addition to the one above: if you think AMD can't compete, then please explain to me what it means that I'm running all my favorite new games on max settings at high resolutions and framerates every night.
If AMD can't compete, it's not because of their hardware. It's because Nvidia fans are crazy and post a lot of really crazy things on the internet.
AMD is certainly important in gaming hardware overall, since they supply the silicon for the consoles and most of the new handheld PC market. In this case, though, we are only talking about the discrete GPU market.
In the PC space, competing across the entire stack is important for mindshare with gamers. Having an ultra-high-end product that matches or exceeds the competition has a trickle-down effect on how gamers perceive the rest of your stack, and that affects what they buy en masse.
For Radeon to dent Nvidia's dominance and gain market share, they need to consistently match or exceed Nvidia at the top performance bracket for a few generations. That's just the way the PC market works, regardless of what any of us here believe.
Personally I own a 7900XTX. It is a great card in terms of performance, I was happy with the price point I bought it at, and I don't regret picking it over a 4080 at all. Having said that, there does seem to have been some kind of late-in-the-game fault that reduced RDNA3's performance by around 15% across the board prior to launch. That's why I say that, from a broader market perspective, it was disappointing - RDNA2, for example, was trading blows with the 3090.
The reason it was disappointing is that I want to see Radeon do well in the market, grow their market share and mindshare, and reduce Nvidia's dominance. I want an overall healthier market that isn't as lopsided towards any one vendor. This is also why I would like to see Intel do well with Battlemage, but let's see how that goes.
Regarding RDNA2, you are definitely spot on: a lot of the Nvidia fanboys who viewed Radeon as a midrange-only budget brand were shocked when RDNA2 competed not just with the 3080 but with the 3090 as well. I saw all kinds of vitriol, fanboy logic, and trolling in any thread relating to RDNA2 at the time, because the truth is it was way too close for comfort for the Nvidia faithful. However, not everybody acted that way. Whether it's consoles, operating systems, or GPU manufacturers, some dumb fanboys are going to exist and shit up wherever they happen to be - an unfortunate reality of life - but I wouldn't dwell on what those guys say any more than on what console warriors say. Nor should it blind us to taking a critical view of how Radeon does in the PC market and discussing what they could do to improve.
Regarding Ray Tracing, personally I'm still not particularly pushed about it as a whole. I do appreciate that it exists and is trying to push the envelope on lighting, but I still think its importance was massively overstated: at the moment most developers seem to treat it as a tick-box exercise, with only a few really putting in the effort to make a substantive implementation. Consoles and the mainstream market will determine when RT finally becomes ubiquitous, and at the moment most mainstream-level GPUs can't really run RT worth a shit regardless of who makes them. Until they can, most developers won't go all in on it - simple reality of the market.
I think the only game where I've really used RT is Dragon's Dogma 2, as I noticed a bit of a difference in Bakbattahl and the performance hit wasn't too bad. Most of the time in other games, I turn it on for a second and either don't really notice a difference, or there's a bit of one but not worth the performance drop, so I normally turn it off again.
But outside of what I personally feel about RT, it is important in the broader enthusiast PC market, and it will continue to gain importance as more capable hardware from all vendors releases and propagates across the stack, especially at the mainstream end. On top of that, in the enthusiast market, if the competition has a graphical feature that you either lack or that they strongly outperform you in, the enthusiast crowd will flock to the one with the feature (and the marketing/mindshare to push it), unless the one with the worse-performing feature is significantly, or at least noticeably, cheaper. In this sense AMD needs to start taking RT more seriously, because once they get "close enough" to Nvidia in RT, it mostly eliminates that selling point Nvidia cards have over Radeon cards in the minds of most people.
If AMD want their Radeon GPUs to compete, they need to take RT more seriously. Even if you or I don't particularly care about RT at present, it is still something Nvidia does better, and given that Radeon want to focus on profit and not become a budget brand, their price points end up close enough to Nvidia's that some people will just pay the little extra to get better RT.
Regarding frame generation, I agree - I don't like the feature conceptually at all, and I've been pretty consistent with that stance. I won't use it regardless of the vendor. Ironically enough, frame generation seems to be one area where AMD are quite competitive with Nvidia. It's not for me, and the general public at large seems to agree: most people don't care about frame generation the way they care about upscaling and reconstruction.
Regarding upscaling/DLSS etc... love it or hate it, it's here to stay and is implemented in almost all games these days. DLSS has continued to improve since the original 2.0 release, and over time Nvidia have fixed most of its artifacts and issues, while FSR upscaling still has many issues and seemed to go without updates for a long time until recently. These upscalers are genuinely useful for gaining performance, and given how shitty most TAA implementations are, they often improve the game's anti-aliasing too.
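To put rough numbers on the performance side (assuming the standard presets both vendors use): Quality mode renders at about 67% of each axis, so a 4K output (3840x2160) is rendered internally at roughly 2560x1440. That's only around 44% of the pixels of native 4K, which is where most of the raw performance gain comes from before the upscaler reconstructs the full image.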
AMD are behind here, especially at lower input resolutions and quality settings, and they need an AI-based solution. Given how slowly FSR upscaling was updated after its initial run of steady updates, and with the AMD big wigs going all in on AI this and AI that for that sweet investor money, I personally suspect they may have been working on an AI-based FSR upscaler for a while. Maybe that's wishful thinking, but it would be a missed opportunity if they weren't. The problem is that the FSR brand has taken a hit in mindshare in the enthusiast market, so they really need to ship something AI-based as soon as possible to turn the ship around. Luckily, people in this market have short memories, so as long as they come out of the gate with something good, I think it will do a lot for them.
Again, back to the lack of high-end competition. With RDNA4 it seems they will only cover the midrange to upper midrange. For people buying in that segment, if the price is right and the performance is good, there will be some great products, but without a higher-end offering it looks like Radeon are retreating from the market a little rather than trying to compete. Now, it seems most likely the MCM versions were cancelled due to CoWoS packaging shortages, with AMD prioritising their datacenter business since it makes them far more money, so that makes sense.
However, as a gamer, it is disappointing that AMD doesn't prioritise competing well in the dGPU market. Coming off RDNA3's performance regression from a bug just before launch, and now this, it looks from the outside like AMD are reducing their focus on the PC market. RDNA5 will be the real test: if they iron out supply and design issues and actually want to compete, then all is well and RDNA4 was just an unfortunate generation sacrificed for the more profitable DC business. But if they pull a similar stunt with RDNA5, it will signal to the market that they are not serious about competing, and they will have to settle for being the budget-friendly option.