Thread: I'm worried about AMD's GPU business & their competitiveness

regawdless

hare-assment
 
Platforms: PC
Things don't look too great for dedicated AMD GPUs. They don't sell well; like I mentioned in the PC OT, more Steam users own an RTX 4090 than any non-integrated AMD GPU. The highest-end, absurdly priced Nvidia card is more popular than the mass-market-oriented AMD cards. Yes, Steam is not a 100% accurate representation of the market, but it's a very good indicator.

[Image: Steam Hardware Survey GPU share chart]


In addition to that, we see AMD being significantly behind in terms of R&D efforts. Very slow progress with FSR, which added frame generation, but the core FSR tech is just bad, with barely any improvements over several years. It's an artefact-ridden shit show if you're playing below 4K and Quality mode. Intel's XeSS is superior, and Nvidia's DLSS runs circles around it, with significantly better image quality while running faster, plus ray reconstruction for RT/PT.

AMD tried to brute force their way to success with raw rasterization performance (with great results in non-RT games) but fell flat on everything else, losing significant market share this generation. Nvidia has a market share of around 88% now, which is crazy.

Which makes me worried, because while I, at the moment, prefer Nvidia cards, I don't want a lazy, arrogant Nvidia. I want AMD to be competitive, I want AMD to keep Nvidia on their toes. The current situation isn't good.

Yes, AMD does huge numbers with their console business, investing in new tech there, and they're likely to be the supplier for next-gen consoles... But we're talking about a CPU/GPU combo, not dedicated cards. I really don't know how much they'll be (financially) able to fight back in the dedicated GPU market next GPU gen. I think they should completely drop the high-end cards, because that's Nvidia country either way, and focus on the mid and low end with everything they have.

What do you think? Will AMD be able to have a good comeback next year?
 
I'm an Nvidia user, have been since the 9xx series, and tbh we got lazy Nvidia for a while.

AMD has been known for being the cheaper alternative, but, at least in my country, AMD saw how expensive Nvidia cards are and wanted a piece of that. They started upping their prices to match Nvidia, without having the same performance or proper RT and scaling (I don't know if that's changed).

The 3xxx series showed what lazy Nvidia does, and tbh, the 40xx series ain't that super great either, while being expensive AF.

I hope Intel will up their game and compete, because I have no faith in AMD at all. I don't really know why they fall so far behind.

Regarding AMD getting a comeback next year: that's the same copium I hear AMD users use each time a new series is launched.
 
The entire GPU market is in a weird place right now.

Nvidia have been untouchable for the better part of two decades at this point, and ultimately success breeds success, so they were simply able to outspend AMD to maintain that lead, even before crypto and now AI gave them more money than they can even spend reinvesting in their products.

And yet…

GPUs have kind of been getting more shit since the 10-series Nvidia cards, with the 1080 Ti arguably being the peak of performance to cost, and it's still a capable card all these years later.

Just as games have graphically plateaued, so too have graphics cards, with increased performance coming only with a significant cost and power draw increase in kind, and with significant specs like VRAM capacity even going backwards despite the rising price tags.

Both the new AMD and Nvidia cards are rumoured to be further disappointments when it comes to performance, and none are expected to have significant VRAM increases. AMD is still going to be behind Nvidia, and seems to have abandoned competing at the high end entirely, leaving the 4090 and 5090 uncontested as the GPUs for those with money to burn, but neither company actually produces the products people want at a price they can afford anymore. The biggest competition Nvidia faces is with itself, as you can see from how old most of the cards in use are. Even the 1060 is still ahead of the 4060.

But there are wider issues at play here. For all the shit Intel has received for its Arc cards, they are gaining ground, and it's AMD they're taking market share from. If you don't care about older games and don't mind putting up with a few hiccups in exchange for a lower price tag, well, Arc has AMD beaten quite handily, with better RT, upscaling and productivity performance to boot.

Nvidia meanwhile might be unassailable now, but they are sitting atop a MASSIVE bubble with regards to the AI market, which they have absolutely gotten themselves neck deep in with their R&D investments, so when that bubble pops, it could have devastating effects on them as a company. I doubt they'll go bankrupt, but they're writing cheques currently that they'll almost certainly not be able to cash in the future.

The rays of hope in all this for AMD, beyond Nvidia simply taking themselves out via hubris, are that they are deeply integrated into the enthusiast market in a way their competitors aren't: consoles offer them millions of GPU sales without them needing to lift a finger on the marketing and distribution side, they dominate the low-end enthusiast niche of devices like the Steam Deck that are becoming increasingly popular, and their CPU business is going from strength to strength, especially with how good Ryzen is and how badly Intel have fucked up recently.

Overall, however, I am pretty pessimistic about the PC hardware market in general.

For what is a vital resource for the modern world, it is extremely vulnerable to disruption and supply chain issues, while also butting up against the limits of what can be squeezed out of physics as we know it to provide the performance gains the world has come to expect, let alone what is demanded.

With political instability, supply chain failures, resource scarcity, economic decline, demographic collapse and military conflict only growing in pressure from here on out, it's very possible we have already seen the peak of what can be expected of the casual enthusiast PC market. We could very well see all the current big players going under, replaced by government-owned manufacturers that only supply what is necessary for the state to limp on, with consumer gaming hardware suddenly one day, likely sooner than we'd expect, simply becoming a thing of the past.
 
While the Steam survey is not 100% accurate, due to internet cafes in China and elsewhere recording multiple copies of the same (normally Nvidia) GPU, it still gives us a rough indication of the market, and Radeon are not exactly setting the world on fire in terms of overall market share. Nvidia is heavily dominant in the discrete GPU market.
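
Just to make the sampling point concrete, here's a tiny back-of-the-envelope sketch (all the numbers are invented, purely to show the mechanism) of how counting the same cafe machines several times would inflate the headline share without the actual install base changing:

```
# Toy illustration (made-up numbers): how repeat-sampling of cafe machines
# could inflate one vendor's share in a hardware survey.

home_amd = 3_000          # hypothetical home PCs with AMD dGPUs
home_nvidia = 7_000       # hypothetical home PCs with Nvidia dGPUs
cafe_nvidia = 1_000       # hypothetical cafe PCs, assumed all Nvidia
cafe_resamples = 4        # each cafe PC surveyed several times via different logins

true_share = (home_nvidia + cafe_nvidia) / (home_amd + home_nvidia + cafe_nvidia)

surveyed_nvidia = home_nvidia + cafe_nvidia * cafe_resamples
survey_share = surveyed_nvidia / (home_amd + surveyed_nvidia)

print(f"actual Nvidia share:   {true_share:.1%}")    # ~72.7%
print(f"surveyed Nvidia share: {survey_share:.1%}")  # ~78.6%
```

With those made-up numbers the skew is only a handful of points, which is why I still treat the survey as a rough indicator rather than throwing it out entirely.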

There are likely a multitude of reasons why Radeon are so far behind in mindshare and market share, and different people will put different weights behind each one, but I imagine some of the big ones are: Nvidia being much better at marketing and understanding the market; AMD as a whole being nearly bankrupt and literally not having the money/resources to properly compete for a long time until recently; often falling behind Nvidia in performance in the pre-RT days; and then of course, when they finally caught up, Nvidia had already innovated and changed the ballgame with RT and DLSS, which Radeon are still behind in.

Of course, a big reason is momentum: once a company reaches a certain critical mass of popularity and mindshare in the market, it becomes very hard to challenge or compete with them, see Amazon, Apple, MS etc. as examples. Nvidia reached that critical mass while AMD as a company was in the process of going under. Now that AMD has been doing well the last few years and actually has money to compete, they have an uphill battle with gamers for mindshare, and they also had large tech debts to catch up on while Nvidia was miles ahead. Add to that a very financially cautious, profit-focused, gun-shy leadership and you get the situation we are in now, where parts of the company are afraid to go all in with that hunger, because they believe gamers won't buy a super high-end product in high enough quantities vs Nvidia to make it profitable or grow market share.

Of course, if you don't aim for the top performance/halo segment and give people unique reasons/features to buy your product, then this won't change. You have to aim for the stars to reach the Moon, etc.

Over the last two years there have been a lot of high-profile departures from the Radeon division. Some of it was older people retiring, while others have moved on to other companies. It's hard to say if this means bad things for Radeon or just people naturally moving on at certain phases in their careers, but from an outside perspective it doesn't exactly inspire confidence without us knowing the ins and outs of it all.

RDNA3 unfortunately was a little disappointing. Initially I was expecting them to improve their RT performance a little more than they did in the end, and then of course the weird hardware bug they have with voltage/clockspeeds meant that each card lost 15-20% perf compared to where it should have been, as they were forced to reduce the clocks. (Remember the "first architecture designed to exceed 3GHz!" slides that marketing forgot to change before the reveal?)

The final cards, at the price points they were sold at, were fine in the market (except the 7900 XT, which was a little overpriced, but that was fixed quickly enough), just disappointing compared to what could have been, and it damaged their competitiveness a little in the eyes of enthusiasts.

Regarding RDNA4, we really don't know much at the moment, lots of rumours, but overall it sounds like a bit of a mixed bag. The good points seem to be that the architecture itself is supposedly pretty solid, with an expected large uplift in RT performance finally. Rumours are that RDNA4 RT was incorporated into the PS5 Pro, which supposedly has a big uplift there, so hopefully this ends up being true, as it would be nice to see Radeon finally take RT more seriously; there were not many changes to the RT hardware from RDNA2 -> RDNA3.

There are also some rumours of some kind of matrix hardware or improved low precision FP that could be useful for some AI related tasks (I'll come back to this).

The bad part seems to be that Radeon abandoned their MCM versions of RDNA4, which included their high-end "halo" parts. Nobody is 100% sure of the reasons for this, but the two leading theories are as follows:

1. Their MCM designs were way too ambitious to properly test and validate in time for the market, so they got cold feet, or a fatal bug or design fault was found that would have forced a massive delay to fix it.

2. The type of MCM designs they were using required CoWoS packaging technology. This is what is being used, for example, in their MI300 datacenter GPUs. At the moment CoWoS supply is extremely limited and every company is trying to secure their own piece of that pie, which further limits supply and increases pricing. Due to these supply issues/shortages, AMD big wigs decided that their limited CoWoS supply was better used in their much more profitable datacenter business rather than their gaming dGPU business.

While the RDNA3 mess-ups might lead us to believe number 1 above, personally I'm leaning more towards the CoWoS shortages being the main reason. If it was just some difficult validation issue with the top halo configs, then the lower-tier MCM versions would still be viable and exist, but if we look at how RDNA4 is supposedly shaping up, there are only two physical dies and both are monolithic rather than using any MCM.

In addition to that we can look at the product codenames. Radeon code names are numbered in the order that they are designed. The ones we are getting are supposedly only Navi44 and Navi48.

It would make sense if Navi44 was always designed to be the entry-level die and therefore monolithic, but it seems that after they decided to cancel all the dies above Navi44, they scrambled to have something in the higher midrange bracket while still being monolithic. So they came up with Navi48, which will probably land somewhere around 4070 Super performance in raster; in RT, no idea, but hopefully a lot better than RDNA3.

So yeah, looking at RDNA3 having some issues and then RDNA4 cancelling the MCM cards and only going mid-range, it doesn't exactly look good for Radeon right now. I suppose if the price is right and the RT doesn't suck, they might have a good value proposition on their hands with their new mid-range cards, so maybe that will move the needle a little in terms of market share, but it's hard to say for certain.

Regarding FSR, it seems that their frame gen technology is surprisingly pretty good, but their upscaling is still way behind the competition. Originally I was excited for FSR2, as it seemed they were updating and iterating on it rapidly, but it's odd that they then had no improvements to upscaling for over a year; now we finally see some improvements, but they are still miles behind. I honestly think they have been working on an AI-assisted version of FSR upscaling for a while, with the hope that the non-AI version will be the fallback in cases where there is no hardware support.

If RDNA4 does have some kind of improved low-precision FP and/or matrix hardware, then this could be used for a potential AI version of FSR. Especially since both AMD themselves and pretty much everyone else are riding the AI marketing bubble and orienting their operations towards AI, it would seem like a big oversight if they were not working on something like this.

If the CoWoS theory is true, and it does seem the most likely, then hopefully either the CoWoS supply situation improves or they come up with some alternate packaging solution to get the job done, assuming they plan to actually compete properly with RDNA5. Because if they pull something like RDNA4 again and only go mid-range with RDNA5, then I don't think they are serious about competing anymore. As always, we will have to wait and see.
 
Another issue to bear in mind, which hits AMD on both their CPU and GPU sales, is that bigger companies like Intel and Nvidia are able to monopolise partnership deals with vendors.

Pre-built PCs almost always have Intel and Nvidia components by default, often without so much as the option for AMD parts.

Even if there were demand for AMD GPUs, most people won't be allowed to have them unless they build their own desktop, which the vast majority simply would never do.

The multi-year partnership deals Nvidia and Intel made, undercutting AMD and simply shutting them out of most of the market, cannot be overcome, or even fought, until the contracts come up for renewal. This is pretty much the only reason Ryzen isn't body slamming Intel Core CPUs these days, as the enthusiast self-build market is overwhelmingly favouring AMD in CPU and motherboard sales currently.

Success breeds success, while failure takes consistency to even begin to have an effect on the industry leader's performance.
 
I honestly don't think there is much AMD can do at this point. Nvidia is just way too far ahead and has way more funding than AMD does for them to be able to catch up. At least AMD is doing well in the CPU space.
 
I stopped posting about this shortly after we came here because it's unpleasant and it feels like talking to a wall. I know I've said some of this before, but I'll put it all here in one place, and I don't really have anything left to say beyond this. If anyone has anything constructive to add, I'll certainly check back.

If there's anything you guys take away from this post, I would encourage you to seriously reevaluate your estimation of AMD and their hardware. I have had a 6900XT for the last 2 years, bought it when the pandemic ended and it was on sale for MSRP. Great card, great price, overclocks nicely, awesome temps and efficiency. I don't know if Nvidia still bifurcates their control panel and GeForce Experience, but the user experience with Radeon is leagues better, with everything all in one easy to navigate place with a toggleable in-game overlay. I have also had no software issues I can recall in the last two years, but I had plenty using a GTX 1080 from 2017-2022. But most importantly: it's still destroying every single game, new and old, I throw at it at native resolutions. I would challenge you guys to honestly ask yourselves whether or not you think the bigger problem is that graphics and game design might be in a bad place. It's taking years to develop new titles and it's clear no one wants to push the graphical envelope anymore. The ones that do are still getting shredded by these four-year-old graphics cards and many of them are no longer even bothering with Ray Tracing, which I will get into later.

I genuinely don't enjoy writing things like this on the internet anymore, but here it is: I don't believe Nvidia fans who say they want AMD to compete. And the reason is that I've seen what happens to Nvidia fans when AMD did. In 2013, the Radeon 200 series had incredible price to performance and forced Nvidia to release a great lineup at competitive prices, with a $350 970 and $500 980. The magical thing Nvidia fans say they always wanted actually happened - and they completely lost their minds. They initiated a carpet bombing campaign of threads and posts insisting Radeon GPUs were unusable failed products that would legitimately damage your PC if you used them. This is the dictionary definition of "delusional," because these allegedly horrific products just literally forced Nvidia's hand to create a competitive GPU market.

The only other time I saw anything like that was the launch of RDNA2 in 2020. You have to remember that people had said for years that AMD would never compete beyond the midrange again and, having gone back to Nvidia myself, I was honestly a little stunned at the time that they did. Not only did RDNA2 compete with the 3070 and 3080, but they actually had better efficiency, temperatures, frame pacing, and overclocking than the 30 series. I mention these specifically because these four metrics were supposed to have been the reasons why no one could ever choose a Radeon 200 over a GeForce 900. Which is the theme of this post: I don't believe the Nvidia fans anymore because it's clear they don't actually care about any of these things - they will always and every time trash AMD even when they match performance, undercut the price, and surpass Nvidia's overall quality.

This obviously brings us to the reason why Nvidia fans said RDNA2 couldn't compete: Ray Tracing. And this is the part of the conversation that makes me feel like I must be a completely deranged crazy person, and if someone can help me make sense of this, I would honestly be immensely grateful:

WHERE ARE THE RAY TRACING GAMES?
We were told you couldn't buy RDNA2 because you would miss out on all the biggest PC games in that generation and, if anyone replies critically to this post, I hope you will at least answer this one question: are you willing to admit that this prediction turned out to be untrue? The best looking games I played in 2022, when the third generation of RTX cards launched, were Warhammer III and Modern Warfare 2, neither of which had Ray Tracing, the latter of which had actually stopped using the feature after implementing it in earlier titles. Ray Tracing isn't even "dead on arrival" anymore, because it's been too many years without marquee titles now.

This naturally brings us to frame generation, and this is another area where it again feels like I'm the lone crazy person on earth. It utterly blows my mind and disheartens me to read that PC gamers actually want to play games at fake resolutions and framerates, for the sake of a feature that no game developers are using anymore. This is not why I built a PC, continue upgrading it, and spend time tweaking all the settings of my favorite games. I know that there are anti-aliasing benefits, but I wish all these resources would have just gone into AA proper rather than the smoke and mirrors for a feature that the industry is clearly not ready for and won't be for a very long time. In the few new Ray Tracing games I play, my 6900XT still runs everything at high framerates and I honestly can't even see exactly what it's doing. The only games that show what Ray Tracing can do are half a decade old or more at this point, or are updates to games from the gaming stone age like Quake.

Guys, I'm on my knees begging you, please explain to me what I'm getting wrong here.

The one area where AMD is not competitive is against the x090 tier cards, and it needs to be said - those are not gaming GPUs. They are obviously productivity Titans that Nvidia has been forced to market as gaming cards to reinforce the false narrative that there is still at least some area where AMD cannot compete. If Radeon really wanted to squeeze every last bit of performance out of RDNA3 with an absurd size, throwing efficiency out the window, and charging an absurd price... well, that would be a 4090, and I don't blame them for not doing it. I know some of you guys have it, and you're not really paying full price, because you're selling your older card, but it's still a meme, because there are no games today, nor will there be any in the next few years, that the 4080/7900XTX can't already destroy.

Of course, this entire argument is absurd - AMD is in every consumer gaming machine on the market and we just capped off a whole new generation of a resurrected and reimagined handheld market. They are going to be a central player in the gaming industry, much more so than Nvidia in that sense, for a long time to come, and there's nothing Nvidia can do to change that. So we have to restrict the argument to something crazy, like dedicated enthusiast tier productivity-crossover GPUs, just because people have to perpetuate the narrative.

Steam charts mean nothing. You can't acknowledge that they're inaccurate and also say they must mean something at the same time. That's called cognitive dissonance, and it's psychologically unhealthy for you. Again, if there's a critical reply to this post, I ask that you at least answer this question in addition to the one above: if you think AMD can't compete, then please explain to me what it means that I'm running all my favorite new games on max settings at high resolutions and framerates every night.

If AMD can't compete, it's not because of their hardware. It's because Nvidia fans are crazy and post a lot of really crazy things on the internet.
 
@Bolivar I'm an Nvidia fan but I still believe competition is necessary. If AMD came out with something that could compete or was better than the 4090 then I'd try that as well.
 
If AMD can't compete, it's not because of their hardware. It's because Nvidia fans are crazy and post a lot of really crazy things on the internet.

Talking about dedicated PC GPUs, AMD struggles to compete, sitting at around 12% market share vs Nvidia with around 88%. Intel GPUs are at under 1%.

That huge gap doesn't happen only because of marketing or brand affiliation. In part, sure, those things have an effect. Trust in a brand also plays into it over time, as do business and supplier relationships.

Like you said, raw rasterization performance-wise, AMD can compete well. So what's the differentiator when customers are asked to pay 500, 800 or even 1000 bucks, but the cards are very comparable in standard game performance?
It comes down to the additional features. In the low- and mid-end market, Nvidia completely crushes AMD with DLSS. You get 30-50% more performance with barely any perceivable visual issues, and basically all somewhat demanding games support it. At 4K and Quality settings, AMD FSR is good; at lower resolutions and quality settings, it totally falls apart, it's an artefact-ridden mess. DLSS, meanwhile - I just played an upcoming UE5 game at 3440x1440 and couldn't make out a real difference between the Quality, Balanced or even Performance modes. And I have a 3080, not even a 40xx card that can add frame gen, which can be very useful depending on the use case. That's a big deal for cheaper cards and a real differentiator. DLSS offers significantly better image quality while running faster. It's a significant gap that hurts AMD.
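
For anyone wondering where that headroom comes from, here's a rough sketch using the per-axis scale factors commonly cited for the DLSS/FSR2 presets (treat these as approximations, not official figures):

```
# Internal render resolution per upscaling preset at 3440x1440,
# using commonly cited per-axis scale factors (approximate).

presets = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
}

native_w, native_h = 3440, 1440

for name, scale in presets.items():
    w, h = round(native_w * scale), round(native_h * scale)
    pixel_ratio = (w * h) / (native_w * native_h)
    print(f"{name:<12} renders ~{w}x{h} ({pixel_ratio:.0%} of native pixels)")
```

The fps gain isn't linear in the pixel count, since the upscaling pass itself and non-resolution-bound work eat into it, but it's easy to see where those 30-50% figures come from.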

Another differentiator is ray tracing, which isn't in every game like you said, but there's a fair amount of them out there and it definitely makes a difference; newer games like Alan Wake 2 or Avatar look incredible, and RT/PT plays a big role in that. When standard performance is more or less a wash, why should customers pick the one brand where a few high-profile, high-end RT games will run significantly worse? Especially in the high-end enthusiast market, those few showcase games make the difference. For mid-range cards, Nvidia enables players to still have a high-end experience with those games. Heck, you can play Cyberpunk or Alan Wake 2 with path tracing very well on a 4070 thanks to DLSS and ray reconstruction, which is impossible with the AMD equivalent.

The general gist is, pricing isn't that much on AMD's side. Buying an Nvidia card, you can do everything that an AMD card does (having great rasterization performance). AMD even outperforms Nvidia in many games, but when both cards perform at a very high level either way, that's not a real differentiator. Buying an AMD card, you can't do the things Nvidia does well (demanding RT and DLSS).

I'm not saying AMD cards are bad, not at all. They're very good cards, you still get a lot of gaming fun out of them, and it's all on a high level. But to edge out the competition, you have to offer more than being mostly on par without a clear advantage while falling short in certain aspects.
 
If AMD can't compete, it's not because of their hardware. It's because Nvidia fans are crazy and post a lot of really crazy things on the internet.
Great post.

I'm currently on a 1660ti and absolutely do not want to upgrade to nVidia because they take the piss.

I think I'm going to look into AMD cards, because I couldn't care less about RT. I just want smooth framerates in a GPU that will last me long enough. You said your card is good for efficiency, which is something I'm always concerned about. I know AMD weren't as good as NV in previous gens, so if that's changed now I'll be pleased.

There's also the Linux drivers, which I plan to take advantage of in my next build.
 
I'm currently on a 1660ti and absolutely do not want to upgrade to nVidia because they take the piss.

I think I'm going to look into AMD cards, because I couldn't care less about RT. I just want smooth framerates in a GPU that will last me long enough.

This seems very much emotionally driven tbh.

AMD generally doesn't offer better performance per watt this gen, from what I've seen. With UE5 becoming the most popular engine by far, you should care about RT for the next few years; hardware Lumen (which is ray tracing) will become more common, especially when the next-gen consoles drop in what, three years or so.

If you want to get the most out of your GPU over a long period of time, DLSS is a crucial feature to keep games running well over the course of the years. My 3080 isn't a top-of-the-line card anymore, but with DLSS I'm still riding the high-end wave, punching way above its weight. Just watch the shit show that is FSR in so many console games, it's bad. AMD users might get lucky though, with TSR from UE5 being available more often in the future, which is way better than FSR, so there's hope (for UE5 games).

Linux drivers are definitely a thing and a valid argument.

Again, AMD cards are very good overall, it'll still be a good purchase, and performance in most games will be on point (as long as they don't do heavy RT or even PT, and you don't want to play the upcoming RTX Remix mods for old games).

The real question is what AMD will be able to offer with the next-gen cards: either more than Nvidia does, OR the same at a more attractive price point. For example, I just bought a monitor with AMD FreeSync instead of Nvidia G-Sync, because it was cheaper and not that much worse.
 
My irrelevant two cents: on the business side, it's a clusterfuck with hilarious brand worship, and I'd rather not engage in it. As for actual experience, I have my AMD card right here working like a dream for everything I play. Chasing the PC trends is a losing game in my eyes, mainly on the financial side of things, and I just can't be bothered with all of that cuckold-level bullshit of Nvidia and their feature locking. Like, shut the fuck up and just let me play the damn games at good FPS... And I think this is important: before getting into PC, I always used to think ray tracing and all that crap was the future, and why would I invest in a PC if I don't get the latest bells and whistles? But now that I do have a PC, I really just don't care, it's all noise.

And finally, Novideo GPUs are still a nightmare on Linux. Like, really, it's torture, with everyone in the open source space trying their best to reduce the pain except for the company itself. AMD GPUs work out of the box, no additional downloads required. So that's another no-Nvidia for me until they get that shit sorted.
 
This seems very much emotionally driven tbh.

If you want to get the most out of your GPU over a long period of time, DLSS is a crucial feature to keep games running well over the course of the years.
It's the opposite. I see worse value for money, it's as simple as that.

I paid about £300 for my 970, then £330 for my 1660ti. 4060ti is £450+ and 4070 is £500+. Go higher than that and it gets worse.

FSR vs DLSS is an important one for those aiming for higher resolutions, and DLSS takes the win at the moment. But I suppose that with me being on 1080p right now I won't have so much need for it anyway. Freesync is a good thing to bring up, because I think most people hope that, much like with G-Sync, the alternative to DLSS will close the gap and be good enough for most.

What I find interesting about AMD is that they don't undercut to gain market share. You would think that going from 12% to even 20+% would outweigh the immediate profit margin losses. Undercut, gain some familiarity with customers, bring in more money overall to fund your R&D. I'd love to know the specifics of why they don't.
 
Of course, this entire argument is absurd - AMD is in every consumer gaming machine on the market and we just capped off a whole new generation of a resurrected and reimagined handheld market. They are going to be a central player in the gaming industry, much more so than Nvidia in that sense, for a long time to come, and there's nothing Nvidia can do to change that.

If AMD can't compete, it's not because of their hardware. It's because Nvidia fans are crazy and post a lot of really crazy things on the internet.

AMD is certainly important in terms of gaming hardware with them supplying hardware for the consoles and most of the new handheld PC market. In this case though we are only talking about the discrete GPU market.

In the PC space, competing across the entire stack is important for mindshare with gamers. Having an ultra high end product that competes with or exceeds the competition has a trickle down effect on the rest of your stack in the minds of gamers and will have an impact on what they buy en masse.

For Radeon to dent Nvidia's dominance and gain more market share they need to consistently compete or exceed Nvidia at the top performance bracket for a few generations. It just is the way it is in the PC market regardless of what any of us here believe.

Personally, I own a 7900XTX. It is a great card in terms of performance, I was happy with the price point I bought it at, and I don't regret picking it over a 4080 at all. Having said that, there does seem to have been some kind of late-in-the-game fault that reduced performance by like 15% across the board on the RDNA3 cards prior to launch, which is why I say that, from a broader market perspective, it was disappointing to see, as RDNA2 was trading blows with the 3090, for example.

The reason it was disappointing is that I want to see Radeon do well in the market, grow their market share and mindshare, and reduce Nvidia's dominance. I want to see an overall healthier market that is not as lopsided towards any one vendor. This is also why I would like to see Intel do well here with Battlemage, but let's see how that goes.

Regarding RDNA2, you are definitely spot on that a lot of the Nvidia fanboys at the time, who constantly viewed Radeon as a mid-range-only budget brand, were shocked when RDNA2 competed not only with the 3080 but actually with the 3090 as well. I saw all kinds of vitriol, fanboy logic and trolling in any thread relating to RDNA2 at the time, because the truth is that it was way too close for comfort for the Nvidia faithful. However, not everybody acted that way, and regardless of whether it is consoles, operating systems or GPU manufacturers, some dumb fanboys are going to exist and shit up wherever they happen to be. That's an unfortunate reality of life, but I wouldn't dwell on what those guys say any more than on what console warriors say. I don't think it should blind us from taking a critical view of how Radeon does in the PC market and discussing what they could do to improve.

Regarding Ray Tracing, personally I'm still not particularly pushed about it as a whole, although I do appreciate that it exists and is trying to push the envelope on lighting. I still think its importance was massively overstated; at the moment most developers seem to view it as a tick-box exercise, with only a few really putting in the effort to make a substantive implementation. Consoles and the mainstream market will determine when RT finally becomes ubiquitous, and at the moment most mainstream-level GPUs can't really run RT worth a shit regardless of who makes them. Until they can, most developers won't go all in balls out on it, simple reality of the market.

I think the only game I've really used RT in is Dragon's Dogma 2, as I noticed a bit of a difference in Bakbattahl and the performance hit wasn't too bad. Most of the time in other games, I turn it on for a second and either don't really notice a difference, or there is a bit of a difference but it's not worth the performance drop, so I normally turn it off again.

But outside of what I personally feel about RT, it is important in the broader enthusiast market in the PC space, and it will continue to gain importance as more capable hardware from all vendors releases and propagates across the stack, especially at the mainstream end. On top of that, in the enthusiast market, if the competition has a graphical feature that you either don't have or that they strongly outperform you in, then the enthusiast crowd will flock to the one with the feature (and the marketing/mindshare to push it), unless the one with the worse-performing feature is significantly, or at least noticeably, cheaper. In this sense AMD needs to start taking RT more seriously, because once they get "close enough" to Nvidia in RT, it mostly eliminates that selling point Nvidia cards have over Radeon cards in the minds of most people.

If AMD want their Radeon GPUs to compete, they need to take RT more seriously. Even if you or I don't particularly care that much about RT at present, it is still something Nvidia does better, and given Radeon want to focus on profit and not become a budget brand, their price points are close enough to Nvidia's that some people will just pay the little extra to get better RT, for example.

Regarding frame generation, I agree, I don't like this feature conceptually at all, and I've been pretty consistent with that stance. I don't like it and will never use it regardless of the vendor. Ironically enough, frame generation seems to be one area where AMD is quite competitive with Nvidia. Personally it is not for me, and I think the general public at large seems to agree, as most people don't seem to care much about frame generation in the same way they care about upscaling and reconstruction.

Regarding upscaling/DLSS etc., love it or hate it, it's here to stay and is implemented in almost all games these days. DLSS has continued to improve since the original 2.0 release, and over time they have fixed most of the artifacts and issues, while FSR upscaling still has many issues and seemed to stop getting updates for a long time until recently. These upscalers are actually useful for gaining more performance, and given how shitty most TAA implementations are, they also tend to improve the AA of the game.

AMD are behind here, especially at lower input resolutions and quality settings. They need an AI-based solution. Given how slowly FSR upscaling was updated after originally getting steady updates, and with the AMD big wigs being all in on AI this and that for that sweet investor money, I personally think they may have been working on an AI-based FSR upscaler for a while. Maybe that is wishful thinking, but it would be a missed opportunity if they weren't. The problem is that the FSR brand has taken a hit in terms of mindshare in the enthusiast market, so they really need to have something AI-related as soon as possible to turn the ship around. Luckily, in this market people have short memories, so as long as they come out of the gate with something good, I think it will do a lot for them.

Again, back to the lack of high-end competition. With RDNA4 it seems they will only have a mid to upper-mid range; for people buying in that segment, if the price is right and the performance good, there will be some great products, but without the higher-end parts it looks like Radeon are retreating from the market a little and not really trying to compete. Now, it seems most likely the MCM versions were cancelled due to CoWoS packaging shortages, with AMD prioritising their datacenter business as it makes them way more money, so that makes sense.

However, as a gamer it is disappointing that AMD doesn't prioritise competing well in the dGPU market. Coming off RDNA3 having a performance regression due to some bug just before launch, and now this, from the outside it looks like AMD are reducing their focus on the PC market. RDNA5 will be the real test: if they iron out the supply and design issues and actually want to compete here, then all is well and RDNA4 was just an unfortunate generation sacrificed for the more profitable DC business. But if they pull a similar stunt again with RDNA5, it will signal to the market that they are not serious about competing, and they will have to settle for being a budget-friendly option.
 
FSR vs DLSS is an important one for those aiming for higher resolutions, and DLSS takes the win at the moment. But I suppose that with me being on 1080p right now I won't have so much need for it anyway.

What I find interesting about AMD is that they don't undercut to gain market share. You would think that going from 12% to even 20+% would outweigh the immediate profit margin losses.

Pricing sucks at the moment, in general in the market. I hope it'll get better next year with the new cards. But I'm not confident.

Regarding DLSS, it's actually the opposite: it gets more important at lower res, when you're struggling with performance on older cards and trying to hit a certain frame rate. It lets you keep good image quality with great performance by rendering at very low internal resolutions, while FSR is basically unusable at low resolutions. It's easily a 30-40% performance gain that you won't get with AMD cards, meaning your low-end card will be able to carry you through the gaming landscape for more years.

The reason AMD isn't able to undercut is that the components are just expensive. There isn't much room for AMD or Nvidia, which is also why the actual raw performance isn't that different. Unfortunately for AMD, Nvidia leverages their whole AI thing from other parts of the company to offer better additional features that make the difference.
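
A quick sketch of why "just undercut them" is harder than it sounds when the bill of materials is that high (all figures invented, purely to show the break-even arithmetic):

```
# Break-even arithmetic for a price cut on a thin hardware margin.
# All numbers are hypothetical.

unit_cost = 400.0       # hypothetical cost to build and ship one card
current_price = 500.0   # hypothetical current selling price
cut_price = 450.0       # hypothetical undercut price

current_margin = current_price - unit_cost   # 100 per card
cut_margin = cut_price - unit_cost           # 50 per card

# Units needed at the lower price to earn the same total gross profit:
volume_multiplier = current_margin / cut_margin

print(f"margin per card: {current_margin:.0f} -> {cut_margin:.0f}")
print(f"a {100 * (1 - cut_price / current_price):.0f}% price cut needs "
      f"{volume_multiplier:.1f}x the unit sales just to break even")
```

On a thin margin like that, even a modest discount has to roughly double unit sales before it pays for itself.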

I hope that there'll be upscaling tech implemented in engines by default, making DLSS etc. obsolete. It would put pressure on Nvidia to focus more on the actual design of the GPUs and level the playing field more, giving AMD a better shot.
 
Aaaaaand here's a plot twist!

If PSSR is as good as it seems and AMD can really use it - great news for PC gamers and future console generations.

 
  • Brain
Reactions: Stilton Disco
I stopped posting about this shortly after we came here because it's unpleasant and it feels like talking to a wall. I know I've said some of this before, but I'll put it all here in one place, and I don't really have anything left to say beyond this. If anyone has anything constructive to add, I'll certainly check back.

If there's anything you guys take away from this post, I would encourage you to seriously reevaluate your estimation of AMD and their hardware. I have had a 6900XT for the last 2 years, bought it when the pandemic ended and it was on sale for MSRP. Great card, great price, overclocks nicely, awesome temps and efficiency. I don't know if Nvidia still bifurcates their control panel and GeForce Experience, but the user experience with Radeon is leagues better, with everything all in one easy to navigate place with a toggleable in-game overlay. I have also had no software issues I can recall in the last two years, but I had plenty using a GTX 1080 from 2017-2022. But most importantly: it's still destroying every single game, new and old, I throw at it at native resolutions. I would challenge you guys to honestly ask yourselves whether or not you think the bigger problem is that graphics and game design might be in a bad place. It's taking years to develop new titles and it's clear no one wants to push the graphical envelope anymore. The ones that do are still getting shredded by these four-year-old graphics cards and many of them are no longer even bothering with Ray Tracing, which I will get into later.

I genuinely don't enjoy writing things like this on the internet anymore, but here it is: I don't believe Nvidia fans who say they want AMD to compete. The reason is that I've seen what happens to Nvidia fans when they did. In 2013, the Radeon 200 series had incredible price-to-performance and forced Nvidia to release a great lineup at competitive prices, with a $350 970 and $500 980. The magical thing Nvidia fans say they always wanted actually happened - and they completely lost their minds. They initiated a carpet-bombing campaign of threads and posts insisting Radeon GPUs were unusable, failed products that would legitimately damage your PC if you used them. This is the dictionary definition of "delusional," because these allegedly horrific products are literally what forced Nvidia's hand to create a competitive GPU market.

The only other time I saw anything like that was the launch of RDNA2 in 2020. You have to remember that people had said for years that AMD would never compete beyond the midrange again and, having gone back to Nvidia myself, I was honestly a little stunned at the time that they did. Not only did RDNA2 compete with the 3070 and 3080, but they actually had better efficiency, temperatures, frame pacing, and overclocking than the 30 series. I mention these specifically because these four metrics were supposed to have been the reasons why no one could ever choose a Radeon 200 over a GeForce 900. Which is the theme of this post: I don't believe the Nvidia fans anymore because it's clear they don't actually care about any of these things - they will always and every time trash AMD even when they match performance, undercut the price, and surpass Nvidia's overall quality.

This obviously brings us to the reason why Nvidia fans said RDNA2 couldn't compete, Ray Tracing, and this is the part of the conversation that makes me feel like I must be a completely deranged crazy person and if someone can help me make sense of this, I would honestly be immensely grateful:

WHERE ARE THE RAY TRACING GAMES?
We were told you couldn't buy RDNA2 because you would miss out on all the biggest PC games in that generation and, if anyone replies critically to this post, I hope you will at least answer this one question: are you willing to admit that this prediction turned out to be untrue? The best looking games I played in 2022, when the third generation of RTX cards launched, were Warhammer III and Modern Warfare 2, neither of which had Ray Tracing, the latter of which had actually stopped using the feature after implementing it in earlier titles. Ray Tracing isn't even "dead on arrival" anymore, because it's been too many years without marquee titles now.

This naturally brings us to frame generation, and this is another area where it again feels like I'm the lone crazy person on earth. It utterly blows my mind and disheartens me to read that PC gamers actually want to play games at fake resolutions and framerates, for the sake of a feature that no game developers are using anymore. This is not why I built a PC, continue upgrading it, and spend time tweaking all the settings of my favorite games. I know that there are anti-aliasing benefits, but I wish all these resources had just gone into AA proper rather than the smoke and mirrors for a feature that the industry is clearly not ready for and won't be for a very long time. In the few new Ray Tracing games I play, my 6900XT still runs everything at high framerates and I honestly can't even see exactly what it's doing. The only games that show what Ray Tracing can do are half a decade old or more at this point, or are updates to games from the gaming stone age like Quake.

Guys, I'm on my knees begging you, please explain to me what I'm getting wrong here.

The one area where AMD is not competitive is against the x090 tier cards, and it needs to be said - those are not gaming GPUs. They are obviously productivity Titans that Nvidia has been forced to market as gaming cards to reinforce the false narrative that there is still at least some area where AMD cannot compete. If Radeon really wanted to squeeze every last bit of performance out of RDNA3 with an absurd size, throwing efficiency out the window, and charging an absurd price... well, that would be a 4090, and I don't blame them for not doing it. I know some of you guys have it, and you're not really paying full price, because you're selling your older card, but it's still a meme, because there are no games today, nor will there be any in the next few years, that the 4080/7900XTX can't already destroy.

Of course, this entire argument is absurd - AMD is in every consumer gaming machine on the market and we just capped off a whole new generation of a resurrected and reimagined handheld market. They are going to be a central player in the gaming industry, much more so than Nvidia in that sense, for a long time to come, and there's nothing Nvidia can do to change that. So we have to restrict the argument to something crazy, like dedicated enthusiast tier productivity-crossover GPUs, just because people have to perpetuate the narrative.

Steam charts mean nothing. You can't acknowledge that they're inaccurate and also say they must mean something at the same time. That's called cognitive dissonance, and it's psychologically unhealthy for you. Again, if there's a critical reply to this post, I ask that you at least answer this question in addition to the one above: if you think AMD can't compete, then please explain to me what it means that I'm running all my favorite new games on max settings at high resolutions and framerates every night.

If AMD can't compete, it's not because of their hardware. It's because Nvidia fans are crazy and post a lot of really crazy things on the internet.

Not much to add to this, this sums it up perfectly. I've been using a 6800xt since launch that I got at msrp directly from AMD and it's still running all the games I throw at it very well. But oh well, people fell into the hype of RT
 
  • Strength
  • Like
Reactions: Pyrate and Optimus
Not much to add to this, this sums it up perfectly. I've been using a 6800xt since launch that I got at msrp directly from AMD and it's still running all the games I throw at it very well. But oh well, people fell into the hype of RT

Nvidia RTX f'ing rocks!

Not that you can't live without raytracing in games, but I use my RTX card for a lot more than just gaming.
 
Last edited:
  • Like
Reactions: regawdless and Mal
I have a 4090 in my gaming rig but I recently picked up a 7900xtx for a linux-based (Bazzite) gaming PC in the living room.

I gotta tell you, I'm impressed by the 7900xtx and have a hard time telling the difference between it and the 4090. I'm impressed and I'm looking forward to seeing what AMD puts out for their 8000 series.
 
  • Like
Reactions: Mal and Pyrate
I have a 4090 in my gaming rig but I recently picked up a 7900xtx for a linux-based (Bazzite) gaming PC in the living room.

I gotta tell you, I'm impressed by the 7900xtx and have a hard time telling the difference between it and the 4090. I'm impressed and I'm looking forward to seeing what AMD puts out for their 8000 series.

Bazzite sounds so cool. I'm using regular Fedora on my battlestation and Kinoite for the work laptop. If I wasn't already super comfortable with Fedora I would just have Bazzite and never think about updates etc.
 
  • Like
Reactions: Bolivar
I don't hold loyalty to any brands, I buy the best service/device for my budget, from whoever provides it at the time.

Having said that, my experience with AMD hasn't been the best. Over 20+ years of owning various hardware and the software that comes with it, I've not only always had some issues, but every single one of my AMD CPUs and GPUs died quite early (within 2-3 years), while I have Intel chips and Nvidia cards still operational after more than 10 years of abuse. The only exception would be my Ryzen 3600, but even then, straight from the factory this thing wouldn't and still can't boot if any core is clocked above 3.6GHz, no matter the voltages.

AMD, unless they slash their prices massively and provide substantial bang for the buck in rasterization, also need to counteract the extras Nvidia provides. Those RTX/Tensor cores aren't just for flashy graphics that chop your frame rate in half; I use them for upscaling old/low-res videos. It makes the difference between taking most of the day to upscale a video and finishing in less than an hour - it's hard to believe the speed difference. Also, being able to inject the latest version of DLSS into a game provides a remarkable image quality upgrade that AMD is nowhere near matching at the moment.
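
For anyone wondering what "injecting the latest DLSS" involves in practice: for most games it comes down to swapping the nvngx_dlss.dll the game shipped with for a newer one (tools like DLSS Swapper automate this). A minimal sketch, with the game folder and the downloaded DLL as placeholder paths:

```python
# Minimal sketch: back up and replace a game's bundled DLSS DLL with a newer one.
# GAME_DIR and NEW_DLL are placeholder paths - adjust them for your own setup.
import shutil
from pathlib import Path

GAME_DIR = Path(r"C:\Games\SomeGame")           # hypothetical game install folder
NEW_DLL = Path(r"C:\Downloads\nvngx_dlss.dll")  # newer DLSS DLL you downloaded

for old_dll in GAME_DIR.rglob("nvngx_dlss.dll"):
    backup = old_dll.with_name(old_dll.name + ".bak")
    if not backup.exists():
        shutil.copy2(old_dll, backup)  # keep the original version around
    shutil.copy2(NEW_DLL, old_dll)     # drop in the newer DLL
    print(f"Replaced {old_dll}")
```

Keeping the .bak copy means you can always roll back if a particular game misbehaves with the newer DLL.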

Planning a 9800X3D with an RTX 5080 for my next PC.
 
  • Strength
Reactions: regawdless
You know that Bazzite is based on Fedora, right?

Of course, but Bazzite being atomic makes it distinct enough from plain Fedora. I'm already using Kinoite, which is also atomic, like I said, but it's quite barebones in some areas, and that becomes a bit of an annoyance on atomic desktops when you're forced to layer packages. As I understand it, Bazzite and the other uBlue images either come with the popular packages bundled in, or have some fancy computer magic scripts that go over my head and can install them for you without layering.
 
I have a 4090 in my gaming rig but I recently picked up a 7900xtx for a linux-based (Bazzite) gaming PC in the living room.

I gotta tell you, I'm impressed by the 7900xtx and have a hard time telling the difference between it and the 4090. I'm impressed and I'm looking forward to seeing what AMD puts out for their 8000 series.

You will notice a difference when trying to max out the best looking games, which I'd say are currently Alan Wake 2, Avatar and Hellblade 2. Which is why people usually buy the highest end GPUs.

Alan Wake 2 at 4K: 44.6 fps on the 4090 vs 24.7 fps on the 7900XTX (24GB)
Alan Wake 2 at 1440p: 79 fps vs 44 fps
Avatar: 60.6 fps vs 40 fps
Hellblade 2 at 4K: 57 fps on the 4090 vs 44 fps

Or take Black Myth Wukong, one of the newest best looking games. Maxed but without RT at 1440p, the 4090 does 85 fps while the 7900XTX does 62 fps. Go for the real high-end visuals with RT at 4K and all the fancy frame gen from both brands, which makes the game look a lot better, and you can forget about any AMD card. We're talking a 5-6x performance advantage for the 4090.

And that's why they aren't selling. The games that aren't the best looking already run great on mid-tier cards. It's not a good value proposition to buy a high priced AMD card while getting last gen Nvidia card performance in some of the best looking games.
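
For scale, here's the same set of figures quoted above run through a quick ratio calculation (only the raw numbers listed in this post; the path-traced, frame-generated case isn't included because no raw figures were given for it):

```python
# Relative 4090 advantage, computed from the fps figures quoted above.
benchmarks = {
    "Alan Wake 2, 4K":                 (44.6, 24.7),
    "Alan Wake 2, 1440p":              (79.0, 44.0),
    "Avatar":                          (60.6, 40.0),
    "Hellblade 2, 4K":                 (57.0, 44.0),
    "Black Myth Wukong, 1440p, no RT": (85.0, 62.0),
}

for game, (fps_4090, fps_7900xtx) in benchmarks.items():
    advantage = fps_4090 / fps_7900xtx - 1
    print(f"{game:33s} 4090 ahead by {advantage:.0%}")
```

So in these titles it's roughly a 30-80% lead before path tracing and frame generation enter the picture; the 5-6x figure only applies to that fully maxed-out RT scenario.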
 
I have a 7800xt in my prebuilt, and it chews up everything I throw at it on Ultra/High 90+fps at 1440p. I am a rather simple man when it comes to gaming, and I don't care about 4k or RT. A 4090 is more expensive than my entire prebuilt, and it's nowhere near worth it to me.

Unless there is some massive paradigm shift in the very nature of the way we play videogames (think full-dive VR à la The Matrix), I don't think there's much room left for raw horsepower to make a prettier image on a 2D monitor. When Hellblade and TLoU can be done on a console SoC, we're well on our way to marginal hardware ROI when it comes to visuals.

Of course tech will always get better, but the real advancement in videogames going forward needs to come from dev creativity.
 
I have a 7800xt in my prebuilt, and it chews up everything I throw at it on Ultra/High 90+fps at 1440p. I am a rather simple man when it comes to gaming, and I don't care about 4k or RT. A 4090 is more expensive than my entire prebuilt, and it's nowhere near worth it to me.

Unless there is some massive paradigm shift in the very nature of the way we play videogames (think full-dive VR à la The Matrix), I don't think there's much room left for raw horsepower to make a prettier image on a 2D monitor. When Hellblade and TLoU can be done on a console SoC, we're well on our way to marginal hardware ROI when it comes to visuals.

Of course tech will always get better, but the real advancement in videogames going forward needs to come from dev creativity.

This, all day, every day. The high priority placed on visuals (and "equity") has been a massive downer.
 
AMD is very fortunate to be the GPU & CPU supplier of choice for consoles. Sony does a lot of R&D, points them in the right direction, and collaborates on smart designs and solutions. I believe Sony will be the only reason AMD remains somewhat relevant in the PC GPU space next gen and doesn't get totally left behind by Nvidia.

Which is good and will lead to well priced and well performing AMD offerings.
 
  • This tbh
Reactions: Stilton Disco
You will notice a difference when trying to max out the best looking games, which I'd say are currently Alan Wake 2, Avatar and Hellblade 2. Which is why people usually buy the highest end GPUs.

Alan Wake 2 at 4K: 44.6 fps on the 4090 vs 24.7 fps on the 7900XTX (24GB)
Alan Wake 2 at 1440p: 79 fps vs 44 fps
Avatar: 60.6 fps vs 40 fps
Hellblade 2 at 4K: 57 fps on the 4090 vs 44 fps

Or take Black Myth Wukong, one of the newest best looking games. Maxed but without RT at 1440p, the 4090 does 85 fps while the 7900XTX does 62 fps. Go for the real high-end visuals with RT at 4K and all the fancy frame gen from both brands, which makes the game look a lot better, and you can forget about any AMD card. We're talking a 5-6x performance advantage for the 4090.

And that's why they aren't selling. The games that aren't the best looking already run great on mid-tier cards. It's not a good value proposition to buy a high priced AMD card while getting last gen Nvidia card performance in some of the best looking games.

I get it, I'll be getting a 5090 on release day even if I have to go the scalper route, but I wanted a linux-based system running Bazzite (for that Steam Deck feel) for the living room, so I had to use an AMD card in order to get HDR working.

Hopefully by the time the 5000-series NVIDIA cards roll out, Linux drivers will be caught up.

That said, I'm not counting fps so the 7900xtx is getting the job done right now.
 
  • Strength
Reactions: regawdless
Hopefully by the time the 5000-series NVIDIA cards roll out, Linux drivers will be caught up.

All depends on Nvidia getting their heads out of their own ass. There's no good reason why AMD works out of the box while Nvidia is absolute garbage. They just need to stop being bitches about it and either open source some of their stack or seriously work on the proprietary excuse they call a graphics driver.
 
  • Like
Reactions: Torrent of Pork
All depends on Nvidia getting their heads out of their own ass. There's no good reason why AMD works out of the box while Nvidia is absolute garbage. They just need to stop being bitches about it and either open source some of their stack or seriously work on the proprietary excuse they call a graphics driver.

I haven't been following too closely, but I think they have started to do both - open source some of their stack and work on the issue in-house.

Steam Deck exploded, and I'd assume NVIDIA would like to be in the running to provide GPUs for similar devices in the future. In order to do so, they are going to need to embrace Linux.

Supposedly people have gotten HDR to work with Gamescope (essentially the Steam Deck front-end) and NVIDIA, but I don't have the patience to fiddle with it. AMD just works out of the box.
 
  • Like
Reactions: Pyrate
I haven't been following too closely, but I think they have started to do both - open source some of their stack and work on the issue in-house.

The part they recently open sourced basically adds nothing of functional value, as far as I understand. I've even read that it apparently only benefits enterprises running AI on their Linux servers, as if that were the real reason behind the decision.

Just yesterday, I helped a friend install their atrocious driver on Fedora. Graphical glitches and whole system crashes occurred immediately after installing the Nvidia driver - and that's just starting the system, we hadn't even got to running games yet.

They have indeed made progress over the last few months, but it's so, sooo far from where they should be, especially given their position in the market.
 
The part they recently open sourced basically adds nothing of functional value, as far as I understand. I've even read that it apparently only benefits enterprises running AI on their Linux servers, as if that were the real reason behind the decision.

Just yesterday, I helped a friend install their atrocious driver on Fedora. Graphical glitches and whole system crashes occurred immediately after installing the Nvidia driver - and that's just starting the system, we hadn't even got to running games yet.

They have indeed made progress over the last few months, but it's so, sooo far from where they should be, especially given their position in the market.

I've only tried it with Nobara, which has a version with NVIDIA drivers already installed. I had zero problems installing and everything worked great, except for HDR.
 
  • Like
Reactions: Pyrate