Thread: [Tom's Hardware] AMD deprioritizing flagship gaming GPUs

TaySan


Tom's Hardware [TH], Paul Alcorn: There's been a lot of anxiety in the PC enthusiast community that, with this massive amount of focus on the data center that AMD has created and your success, there will be less of a focus on gaming. There have even been repeated rumors from multiple different sources that AMD may not be as committed to the high end of the enthusiast GPU market, that it may come down more to the mid-range, and maybe not even have flagship SKUs to take on Nvidia's top-of-stack model. Are you guys committed to competing at the top of the stack with Nvidia?

Jack Huynh [JH]: I'm looking at scale, and AMD is in a different place right now. We have this debate quite a bit at AMD, right? So the question I ask is, the PlayStation 5, do you think that's hurting us? It's $499. So, I ask, is it fun to go King of the Hill? Again, I'm looking for scale. Because when we get scale, then I bring developers with us. So, my number one priority right now is to build scale, to get us to 40 to 50 percent of the market faster. Do I want to go after 10% of the TAM [Total Addressable Market] or 80%? I'm an 80% kind of guy because I don't want AMD to be the company that only people who can afford Porsches and Ferraris can buy. We want to build gaming systems for millions of users. Yes, we will have great, great, great products. But we tried that strategy [King of the Hill] — it hasn't really grown. ATI has tried this King of the Hill strategy, and the market share has kind of been...the market share. I want to build the best products at the right system price point. So, think about price point-wise; we'll have leadership.

TH: Price point-wise, you have leadership, but you won't go after the flagship market?

JH: One day, we may. But my priority right now is to build scale for AMD. Because without scale right now, I can't get the developers. If I tell developers, 'I'm just going for 10 percent of the market share,' they just say, 'Jack, I wish you well, but we have to go with Nvidia.' So, I have to show them a plan that says, 'Hey, we can get to 40% market share with this strategy.' Then they say, 'I'm with you now, Jack. Now I'll optimize on AMD.' Once we get that, then we can go after the top.

TH: This is specifically a client strategy [consumer market]?

JH: This is a client strategy.


We won't know the final verdict on AMD's decisions for its next-gen RDNA 4 Radeon RX 8000 lineup until launch, which is expected to come later this year or early next year. However, given the typical long lead times for chip design and final production, it's a safe bet that AMD's decision is already set in stone. It sure sounds like AMD is ready to cede the performance crown to Nvidia before the battle has even begun, and that doesn't bode well for the pricing of Nvidia's next-gen gaming flagships — a lack of meaningful competition at the top of the stack is never good for the consumer.

Not surprising but still disappointing to read. :( I miss the days of the 290X

Not surprising but still disappointing to read. :( I miss the days of the 290X

They're not just behind in hardware but also software. Even if they closed the gap in raytracing, they're several years behind on the software stack.

They aren't competitive now. Their flagship card doesn't sell.

What does sell are their SoCs. They rule the console and handheld space. How many hundred million gaming chips are playing games today?
 
AMD sort of did this to themselves. During the crypto mining boom, they jacked their prices almost as high as NVIDIA's. They could have tried to undercut NVIDIA and gain some market share, but they took the short-term profits instead. If an NVIDIA card and an AMD card are similarly priced, most gamers will choose NVIDIA for its more advanced ray tracing and software.

AMD needs to focus on value to gain market share, which will let them spend more on R&D and close the gap with NVIDIA. What they did with those short-term profits from the crypto boom, I don't know, but it doesn't seem to have done much for them.
 
It's rather a hot take, but I honestly don't see "high end gaming GPUs" as being all that necessary for actual, y'know, gaming.

They're good for mining, for certain rasterization and rendering tasks, stuff like movie editing, and all the AI/Deep Learning things they wound up useful for.

Gaming, really not so much. Gaming can be done on minuscule amounts of GPU power, in comparison.
 
It's rather a hot take, but I honestly don't see "high end gaming GPUs" as being all that necessary for actual, y'know, gaming.

They're good for mining, for certain rasterization and rendering tasks, stuff like movie editing, and all the AI/Deep Learning things they wound up useful for.

Gaming, really not so much. Gaming can be done on minuscule amounts of GPU power, in comparison.

Gotta agree with this honestly.

High end gaming has been a mess for more than half a decade now. We're deep into diminishing returns, both in the bang you get for your buck (and power draw) and in what can actually be achieved at the high end of AAA gaming.

The 4090 might be without peer, but to achieve that it's insanely expensive and the size of a bus.

Even in the mid tier, things have stagnated.

RAM just isn't keeping up with the demands of high resolutions, which devs aren't accounting for. Compared to the 3050, 3060 and 3070, the 4060 and 4070 should really be the 4050 and 4060. By price they'd all be high end to flagship cards; prices outpaced inflation by a significant margin during the crypto craze and never came down.
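For a rough sense of how fast the pixel workload scales with resolution (just illustrative pixel math, nothing card- or game-specific; real VRAM use is dominated by assets and varies per title):

```python
# Illustrative pixel math only: how much more per-frame work (and framebuffer
# traffic) higher output resolutions demand relative to 1080p.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K": (3840, 2160),
}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.1f} MP, {pixels / base:.2f}x the pixels of 1080p")

# Prints roughly:
# 1080p: 2.1 MP, 1.00x the pixels of 1080p
# 1440p: 3.7 MP, 1.78x the pixels of 1080p
# 4K: 8.3 MP, 4.00x the pixels of 1080p
```

Quadrupling the pixels without a matching jump in memory and bandwidth is a big part of why the current mid tier feels short-changed.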

High end gaming PCs were always the domain of the wealthy few, but these days that's true even of the upper mid tier cards.

Meanwhile low spec gaming has never been better.

Those diminishing returns mean low settings now look much the same as ultra, just with fewer bells and whistles, most of which seem to exist only to appeal to pixel-counting outlets like Digital Foundry, while the graphical mainstays that actually matter are achievable on a toaster.

So yeah, I don't think abandoning the high end space is a bad idea at all, but AMD do need to get their shit together, ensure their software becomes rock solid, and do everything they can to get those prices down.
 
Reactions: Mickmrly and Pyrate
This thread is a smart bunch (yes we are).

High end GPUs aren't a market even the market leader depends on. It matters so little that the market leader is raking it in on AI hardware instead, and the next closest competitor has chosen to focus on power and thermal efficiency while concentrating on budget hardware.

Also, if I were in AMD's shoes I'd be pushing my R&D funds towards ARM architecture. AMD is sort of the king of power efficient x64 chips (laptops, mobiles), but ARM is taking it over. Now, Windows is supporting ARM. The writing is on the wall, and Snapdragon is making big waves.

AMD is in danger.
 
I'm sure this will lead to Nvidia resting on their laurels again and give AMD the opportunity in the future to compete in the high end once more.

It's rather a hot take, but I honestly don't see "high end gaming GPUs" as being all that necessary for actual, y'know, gaming.

They're good for mining, for certain rasterization and rendering tasks, stuff like movie editing, and all the AI/Deep Learning things they wound up useful for.

Gaming, really not so much. Gaming can be done on minuscule amounts of GPU power, in comparison.

You're right, but given where I am in my life, I'm probably always going to buy whatever the highest end card is whenever I do a new PC build.
 
It's rather a hot take, but I honestly don't see "high end gaming GPUs" as being all that necessary for actual, y'know, gaming.

They're good for mining, for certain rasterization and rendering tasks, stuff like movie editing, and all the AI/Deep Learning things they wound up useful for.

Gaming, really not so much. Gaming can be done on minuscule amounts of GPU power, in comparison.

This used to be common knowledge: Nvidia's 90-series GPUs replaced the Titan class, which was never meant for gaming, until a massive lie in 2020 exposed a lot of ugly things about us. The sharks came out to feast; even the console market, which is all about getting the best bang for your buck, was comfortable spending $200, $500 or $1,000 above MSRP to get their hands on a PS5 or Series X.
 
Reactions: Stilton Disco
I have not purchased a "flagship" GPU since the GeForce 256. Since then I've bought the best bang-for-the-buck mid tier card. So if AMD can get their shit together and make a more practical mid tier card than Nvidia, they'd get my money.
 
Reactions: Mickmrly
Even if they don't do high end cards anymore, I'm concerned about their ability to catch up. It's not like Nvidia's strongest differentiators are limited to its high-end cards; Nvidia's mid range cards have two significant advantages at the moment:

- mid range RTX cards beat high-end AMD cards in ray tracing performance, and destroy AMD's mid range cards outright. A 4070 can do path-traced Cyberpunk and Alan Wake 2, for example, while high-end AMD cards struggle. Being able to do that in a few high profile games matters to a lot of consumers.

- you can easily add 30-40% performance to mid range RTX cards because DLSS is so effective. I just tested a bunch of games, really trying to look for differences, and I couldn't see any between native 3440x1440, DLSS Quality and DLSS Balanced. AMD's FSR 2 doesn't come close (rough numbers below).
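To put rough numbers behind that, using the published DLSS input scale factors (Quality ≈ 67% per axis, Balanced ≈ 58%, Performance ≈ 50%); this is just pixel arithmetic, not a benchmark:

```python
# DLSS renders internally at a fraction of the output resolution per axis and
# upscales. Shading fewer pixels is where most of the uplift comes from; actual
# gains vary per game and aren't strictly proportional to pixel count.
output_w, output_h = 3440, 1440
scale_factors = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.50}

native_pixels = output_w * output_h
for mode, scale in scale_factors.items():
    w, h = int(output_w * scale), int(output_h * scale)
    share = 100 * (w * h) / native_pixels
    print(f"{mode}: {w}x{h} internal (~{share:.0f}% of the native pixel count)")

# Prints roughly:
# Quality: 2294x960 internal (~44% of the native pixel count)
# Balanced: 1995x835 internal (~34% of the native pixel count)
# Performance: 1720x720 internal (~25% of the native pixel count)
```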

Frame generation is a mixed bag and Nvidia's ray reconstruction has some ugly issues, but Nvidia will definitely pull something useful out of the magic AI hat for the 50xx series, something AMD doesn't have, widening the gap, especially for low and mid tier cards.

Will AMD be able to undercut Nvidia's prices by enough that the huge DLSS performance boost and RT won't matter that much anymore? How much cheaper would they have to be? Or will they offer 40% more raster performance at the same price? I'm curious what exactly AMD's approach here will be.
 
The thing is, AMD is competitive with nVidia when it comes to rasterization but not even close when it comes to ray tracing. Gamers who buy high-end GPUs will always want the best RT performance possible, so they'll always go with nVidia, and I can understand AMD not trying to compete in that segment.
 
I have a 4090 on my main gaming PC and I picked up a 7900xtx for my living room, linux-based HTPC. I'm hard-pressed to tell the difference between the 4090 and the 7900xtx.

It really makes me wonder why I paid the premium price for the 4090.

If AMD puts out an 8000 series equivalent to the 7900xtx I'm there day one.
 
The thing is, AMD is competitive with nVidia when it comes to rasterization but not even close when it comes to ray tracing. Gamers who buy high-end GPUs will always want the best RT performance possible, so they'll always go with nVidia, and I can understand AMD not trying to compete in that segment.

Also, "performance" isn't pure raster performance anymore. In the real world, basically any game that benefits from it, supports DLSS. So for Nvidia cards, you can add 30-40% of performance, causing them to outperform their AMD equivalents.

My 3080 is still maxing out most games at an acceptable frame rate with DLSS, it's a God sent and added so much value to that card.
 
Reactions: crumbs
Even if they don't do high end cards anymore, I'm concerned about their ability to catch up. It's not like Nvidia's strongest differentiators are limited to its high-end cards; Nvidia's mid range cards have two significant advantages at the moment:

- mid range RTX cards beat high-end AMD cards in ray tracing performance, and destroy AMD's mid range cards outright. A 4070 can do path-traced Cyberpunk and Alan Wake 2, for example, while high-end AMD cards struggle. Being able to do that in a few high profile games matters to a lot of consumers.

- you can easily add 30-40% performance to mid range RTX cards because DLSS is so effective. I just tested a bunch of games, really trying to look for differences, and I couldn't see any between native 3440x1440, DLSS Quality and DLSS Balanced. AMD's FSR 2 doesn't come close.

Frame generation is a mixed bag and Nvidia's ray reconstruction has some ugly issues, but Nvidia will definitely pull something useful out of the magic AI hat for the 50xx series, something AMD doesn't have, widening the gap, especially for low and mid tier cards.

Will AMD be able to undercut Nvidia's prices by enough that the huge DLSS performance boost and RT won't matter that much anymore? How much cheaper would they have to be? Or will they offer 40% more raster performance at the same price? I'm curious what exactly AMD's approach here will be.

DLSS is definitely important and helps offset the cost disadvantage, but I honestly don't think Ray Tracing is a factor for any significant number of potential customers in the mid range.

The tech still comes with a ridiculous performance cost in the games that utilise it well, of which there are still almost none.
 
This has been rumoured for a while for RDNA4. These statements from AMD read like reactive marketing spin: if we assume the CoWoS packaging shortages are what led to AMD prioritizing datacenter and cancelling all of their MCM RDNA4 cards, they can't exactly come out and say that. They'll only have two dies, targeting the mid and low end of the market, so they have to roll with what they have and spin it as a planned strategy and a good thing.

If we take it at face value that they'll have a lot of volume and are aiming for market share, then I hope they do well there, but the reality of the desktop PC GPU market is that you need high end halo cards to create the enthusiasm and brand loyalty that trickle down the stack to your mid and low end buyers. If Radeon doesn't have that, they might find it difficult to gain market share even with the volume.

RDNA4 is rumoured to have much better RT than previous gens; while the MCM parts were scrapped, the actual core architectural IP might still be good. I suppose we'll see whether that's true from PS5 Pro performance, as it's essentially supposed to be using RDNA4 RT cores, or something close to them. Would be nice to see them get closer to Nvidia here, or at least "close enough", since there's a lot of low hanging fruit on the RT hardware acceleration side that Radeon have yet to implement in their GPUs and that could bring some nice performance uplifts. Hopefully this is not just wishful thinking.

It seems that AMD have decided to combine their CDNA and RDNA architectures; they might call it UDNA or something like that. If you remember, they originally split them apart after Vega, with the compute focused stuff going to the datacenter CDNA architecture and the gaming focused IP going to RDNA. What was previously known as RDNA5 might be the first iteration of this, and one implication is that RDNA5 onwards could make use of the matrix multiplication blocks (tensor cores) that AMD has in datacenter cards such as the MI300.
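For anyone wondering what those matrix blocks actually do: conceptually they accelerate small tiled fused multiply-accumulates, D = A·B + C, usually with low-precision inputs and higher-precision accumulation. A toy sketch of the operation itself (purely illustrative, not AMD's or Nvidia's implementation):

```python
# Toy illustration of the tile-level op a matrix engine ("tensor core") performs:
# D = A @ B + C, typically with FP16/BF16 inputs and FP32 accumulation.
# Real hardware runs this on fixed tile sizes (e.g. 16x16) many times per clock.
import numpy as np

TILE = 4  # tiny tile purely for illustration

A = np.random.rand(TILE, TILE).astype(np.float16)   # low-precision input tile
B = np.random.rand(TILE, TILE).astype(np.float16)   # low-precision input tile
C = np.zeros((TILE, TILE), dtype=np.float32)        # higher-precision accumulator

# One matrix multiply-accumulate, done in FP32 like the hardware accumulator would
D = A.astype(np.float32) @ B.astype(np.float32) + C
print(D)
```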

It would also mean they couldn't afford any fuck ups with the architecture like they had with RDNA3, given how much is at stake (DC is big money). A positive read is that they're pooling all of their engineering efforts and we might get AMD tensor cores in desktop GPUs. A negative read is that they're deprioritizing the gaming side and bolting it onto a datacenter focused architecture as an afterthought.

It will be difficult to say until we actually see what they release. I wonder whether they'll keep up this "aim for midrange only to gain volume" strategy from RDNA5 (UDNA?) onwards, or if it's just a face saving message for RDNA4 because they had to scrap their MCM gaming models in favour of the datacenter.

It's worth noting that Nvidia has often used a single architecture for both DC and desktop (or slight variants on the same IP blocks), so it wouldn't exactly be a terrible idea if they can integrate it properly and don't lose focus on gaming.

Regarding RDNA4, I think the best we can hope for is finally improved RT and some nice clock speeds at good efficiency and price for the performance. Hopefully PS5 Pro will give us a preview of what the RT improvement might look like. Of course, if they don't meaningfully improve RT with RDNA4 then...I suppose they'd better price these things well if they want them to sell.
 
I have a 4090 on my main gaming PC and I picked up a 7900xtx for my living room, linux-based HTPC. I'm hard-pressed to tell the difference between the 4090 and the 7900xtx.

It really makes me wonder why I paid the premium price for the 4090.

If AMD puts out an 8000 series equivalent to the 7900xtx I'm there day one.

Something feels wrong here, what resolution do you play at? 😱
 
Something feels wrong here, what resolution do you play at? 😱

4K

I'm telling you, I can't perceive a difference between the 4090 and 7900 xtx

Could be due to Linux drivers, I've heard Nvidia is pretty bad in the Linux space so he could be losing lots of performance. AMD open sources their Linux drivers so they are supposed to be pretty good.

NVIDIA Linux drivers are garbage.

AMD stuff just works, including HDR
 
I have not purchased a "flagship" GPU since the GeForce 256. Since then I've bought the best bang-for-the-buck mid tier card. So if AMD can get their shit together and make a more practical mid tier card than Nvidia, they'd get my money.

They already did. I got a 6800 XT directly from them during the COVID shitshow for $600 CAD, tax and shipping included.

Four years later I can still play everything with it; I've been trying Space Marine 2 on it and it's running great.
 
They already did. I got a 6800 XT directly from them during the COVID shitshow for $600 CAD, tax and shipping included.

Four years later I can still play everything with it; I've been trying Space Marine 2 on it and it's running great.

I would love to see AMD dominate the $300 range with something that competes well against a 5060 at that price point. The xx60 cards, I think, are the sweet spot if you want to be able to play all games for years without breaking the bank or the power budget.
 
Reactions: Mal
I would love to see AMD dominate the $300 range with something that competes well against a 5060 at that price point. The xx60 cards, I think, are the sweet spot if you want to be able to play all games for years without breaking the bank or the power budget.

I want to see them make an ARM-based CPU.

And as much as I love graphics cards, I feel like they'll become a dying market that gets phased out in favor of all-in-one systems. I know it's less cool and will mean a loss of performance for a time, but power-efficient SoCs would seem to have advantages.
 
DLSS is definitely important and helps offset the cost disadvantage, but I honestly don't think Ray Tracing is a factor for any significant number of potential customers in the mid range.

The tech still comes with a ridiculous performance cost in the games that utilise it well, of which there are still almost none.

I have a 4070 Ti Super. I couldn't care less about ray tracing, and that isn't going to change.
 
DLSS is definitely important and helps offset the cost disadvantage, but I honestly don't think Ray Tracing is a factor for any significant number of potential customers in the mid range.

The tech still comes with a ridiculous performance cost in the games that utilise it well, of which there are still almost none.

With Unreal Engine 5 being extremely popular and a ton of studios switching over to it, RT implementations of some form will become the norm within the next few years, especially now that Nvidia has put a lot of RT work into the engine and studios like CDPR are pushing new features into UE5 as well. Lumen, be it the software or hardware path, is already pretty common and it'll become the standard. There are a lot of optimization efforts right now, and they'll pay off quickly.
 
With Unreal Engine 5 being extremely popular and a ton of studios switching over to it, RT implementations of some form will become the norm within the next few years, especially now that Nvidia has put a lot of RT work into the engine and studios like CDPR are pushing new features into UE5 as well. Lumen, be it the software or hardware path, is already pretty common and it'll become the standard. There are a lot of optimization efforts right now, and they'll pay off quickly.

I've been hearing much the same ever since Nvidia introduced the RTX line.

The fact is lighting wasn't broken, or even bad, to begin with. In fact, it has often been lovely and atmospheric in games since the 6th gen.

Ray Tracing really only offers minor improvements in quality in the vast majority of cases, can be near perfectly recreated with prebaked lighting, mostly provides details you won't notice in motion, and is always going to be an additional cost to performance, regardless of how optimised it becomes.

Put it another way; if RTX cards hadn't been created, would we be currently seeing the majority of gamers complaining about the poor quality of lighting in games? Were we before they were announced?

Ray Tracing is just like other tech boondoggles, such as 3D and VR. It answers a question no one was asking while massively inflating the prices of whatever it's tied to.

The issues we have today with GPUs are price, RAM limitations and the steep cost of high resolution.

DLSS helps with that last one, but the first two are vastly more important to address than Ray Tracing will ever be.
 
They tried this strategy with Polaris and RDNA1 and gave the same rationale, that they were trying to scale. It doesn't work. I also don't buy that they're hurting for partnerships or optimization: they just got Space Marine II, and cornering the consoles means running well on AMD is a must-have for every big release. I assume they're just not getting prioritized on wafers, which I assumed were already limited for their dedicated GPU business anyway.
 
I've been hearing much the same ever since Nvidia introduced the RTX line.

The fact is lighting wasn't broken, or even bad, to begin with. In fact, it has often been lovely and atmospheric in games since the 6th gen.

Ray Tracing really only offers minor improvements in quality in the vast majority of cases, can be near perfectly recreated with prebaked lighting, mostly provides details you won't notice in motion, and is always going to be an additional cost to performance, regardless of how optimised it becomes.

Put it another way; if RTX cards hadn't been created, would we be currently seeing the majority of gamers complaining about the poor quality of lighting in games? Were we before they were announced?

Ray Tracing is just like other tech boondoggles, such as 3D and VR. It answers a question no one was asking while massively inflating the prices of whatever it's tied to.

The issues we have today with GPUs are price, RAM limitations and the steep cost of high resolution.

DLSS helps with that last one, but the first two are vastly more important to address than Ray Tracing will ever be.

I disagree with the notion that it's a minor improvement; RT lighting, for example, looks so much more natural. But we've had that dance before, and I won't post an army of comparisons where RT looks clearly better. Lighting in RoboCop looks great thanks to Lumen, and that game runs well despite using a light form of RT.

For me, rasterized lighting always looked off: glowing objects, characters that look out of place, super bland indirect lighting in interiors, especially in open world games, and so on. Getting lighting right was always a problem without a proper solution.

You can get good looking lighting with other techniques, but then you have to use so many tricks that it's a lot of work for the devs. Like in Starfield, where I'm pretty impressed with the lighting overall; notice how demanding that game is even without RT. With current hardware limitations, hybrid solutions are where it's at.

I don't follow the argument that it's not important because nobody was demanding it before. Nobody was screaming for better textures 20 years ago either, but super detailed textures today are important and make a difference.

Either way, gaming will be ray traced, we will own nothing, and we will be happy.