Thread: The GPU Thread
AMD is apparently not interested in being #1. They don't think they can consistently keep up with the expectations of being #1. They prefer to ride the market out as a secondary choice.

Their mistake is that they want to charge as if they are #1. But the market quickly corrects their prices (unlike nVidia).

As for their high-end 8000 series / RDNA 4, they probably wanted the GCD to be chiplets and are having issues getting it to work the way they want. So they simply ditched it and hope to fix things for RDNA 5.
 
I'm pretty sure they want to be #1

They just can't. Whether it's a lack of funding or something else, I don't know for certain, but they lack the engineering talent.

If consoles didn't use AMD, then AMD wouldn't exist today.

Say what you will about Nvidia, but until the crypto boom Nvidia really kept moving the needle.

Post-crypto they just don't care anymore. AMD just sucks, and everyone buys Nvidia even when Nvidia puts out less effort, and they are still better than AMD.
 
  • Brain
Reactions: Mickmrly
Consoles use AMD hardware because AMD is willing to deliver adequate performance within the budget constraints of the console space and has the tech and production capacity for the needed APUs. Nvidia doesn't want to do it at that price, so it's good that we have AMD filling that space, because say what we want about the consoles, they're incredible for the comparatively low cost.

Nvidia has way more R&D budget and it shows. Take FSR vs DLSS for example. DLSS2 has constantly improved and is now a very good option to have, depending on the game and the use case. Now we also have DLSS3, adding another layer of options for extremely demanding games and showing very good results in Cyberpunk, for example. FSR2, on the other hand, has the same issues it had years ago: a ton of artifacting, issues with transparencies, ghosting, problems with particles and vegetation, and anything other than Quality mode being pretty bad, etc.

Development is very expensive and getting qualified people is very hard in many cases. Nvidia is able to do more here.

AMD, meanwhile, uses their available resources to really go for certain aspects like raster performance, with great results.
 
The fact that the CEOs of Nvidia and AMD are rumored to be blood relatives always makes me consider the possibility that they're acting as an oligopoly.
 
What I mentioned is based on leaks, not just blind speculation. They don't want to be #1 in the PC graphics card space. They do want it in pretty much everything else, including CPU, datacenters, servers and AI. They want to be relevant, but not #1.

AMD is more innovative in hardware than people give them credit for, especially considering the limits on their resources compared to both Intel and nVidia. If they weren't, they wouldn't be a factor at all, but they're still here. And even though it's not the majority, many gamers prefer them over nVidia, and for good reason.
 
It's not just a case of cost. Nvidia soured their relationships with both Microsoft and Sony by being, unfortunately, a shitty partner to work with, as pretty much anyone who has worked with them can attest, and partners normally drop them shortly afterwards; look at Apple, EVGA etc... (except Nintendo, for some reason).

Outside of that, AMD definitely has a technology/IP advantage when it comes to APUs, partially because consoles are such a large part of their graphics portfolio that they focus their designs around silicon area efficiency, low power envelopes etc... They do this from the initial design phase of their graphics IP rather than designing something for desktop and then trying to scale it down afterwards.

Of course their APUs are also used in laptops, the Steam Deck and other handhelds, embedded into cars etc...

I don't disagree with the rest of your post though, you are spot on there.

AMD has been growing a lot over the last 7 years or so, and they are much better funded and have way more staff than before. However, they are still much smaller than Nvidia or Intel, and the Radeon group is a smaller subset of AMD again.

In addition to that, they have a lot of tech debt from their near-bankrupt years that they need to catch up on, and ramping up new teams and new staff in something as complex as silicon/graphics design/validation is incredibly difficult. It often takes more than a year for someone to be trained up and to start properly contributing.

One of the ways that AMD competes with much larger companies like Intel or Nvidia is that they are always on the bleeding edge of silicon/packaging technology. For example, jumping to advanced TSMC nodes before Nvidia; they were doing this long before RDNA was a thing, and sometimes it bit them in the ass in the past.

However, this is the main reason they are able to compete with these larger companies. They are the world leaders in MCM chip/silicon design and disaggregation, as Zen has easily shown. On top of that, their new MI300 is absolutely insane from a design and packaging perspective; the fact that they are able to release something so mental is a testament to that. I think @Ascend is right that in gamer land they often don't get the praise they deserve for always being on the bleeding edge technologically when it comes to their silicon designs.

For RDNA3 they tried to do something crazy and unprecedented in consumer graphics. Datacenter stuff is easier because it is intended for compute so you can create multiple dies and connect them together more easily because they don't have to handle graphics.

Unfortunately they tried to do too much for RDNA3 by both doing a massive redesign of the underlying architecture/CU etc... while also disaggregating the GPU components (MCM approach) via their advanced packaging know how.

Long story short, a bug that they missed fucked up RDNA3. It was supposed to have 15-20% better performance across the board while drawing less power than it ended up drawing. They realised it way too late to fix before release; the kind of fix it needed would require a full re-tape-out, which would cost money, resources and manpower and take 6-8 months to complete.

They decided to cut their losses and reassign their valuable validation engineers to the next generation (RDNA4). So we ended up getting a somewhat gimped RDNA3 compared to what it was intended to be.

Remember that had their plans worked out N31 would be nipping at the heels of 4090 for much cheaper with a smaller die and they wouldn't have had to drop the rest of the lineup down a tier.

Moving on to RDNA4, the current rumours don't look good for Radeon group right now. Again they were trying to do something insane for their high end RDNA4 dies, some crazy MCM type stuff with multiple Shader Engine Dies (SED) tiles. Kind of like some of the madness they have done with MI300.

Supposedly (take all these rumours with a grain of salt) getting the top dies working properly would take much longer than they initially forecast, due to their novel disaggregated nature in consumer graphics.

I've heard different reasons why this is: some people say it was a skill issue and the task was just too difficult for them to handle; others say they don't have a problem doing it, but the roadmap has a super aggressive time-to-market window that the top brass don't want to miss. Either way, the supposed idea is that the time it would take them to do it right would push the RDNA4 release so far back that it would end up being too close to RDNA5.

So they decided to cancel N41 and N42 (the two disaggregated dies that had multiple SED tiles) and will instead only release the N43 and N44 midrange parts.

Supposedly they have deferred their multi-SED, Frankenstein-type designs to high-end RDNA5. I don't know if that is just cope from AMD fanboy types or not, but we will have to wait and see. I don't care about jumping on these hype trains for something that won't release until 2026.
 
I like AMD pushing the silicon and how they are willing to take risks. From an innovation and risk perspective, I don't think they do more than Nvidia though. I'd say they are more or less on the same innovation level hardware wise, with Nvidia going all in on AI and raytracing with dedicated hardware on their GPUs, which was a bold move. Their cards are always comparable in raster performance and trade blows, with AMD even being stronger in many cases. In the other aspects like image reconstruction and RT, they're a generation behind, unfortunately.
 
Oh, Nvidia is certainly on the cutting edge with tensor cores, RT cores etc... and their software teams are much larger and more innovative than Radeon group's. No doubt about that, Nvidia certainly innovates.

My point was more to do with actual silicon nodes/advanced packaging. And when I speak of AMD I mean the whole company, so their CPU, APU, consumer graphics, data center GPU etc... In that sense AMD is ahead of everyone, Intel included.

Nvidia has never made an MCM-type disaggregated chip design; this requires years of R&D, budget, resources, expertise and experience, learned pitfalls etc... In these senses AMD is definitely ahead of Nvidia. We saw this when Nvidia was prototyping some kind of MCM version of Hopper for the datacenter, but they ended up cancelling it and going monolithic instead. Nvidia in this sense is easily 5 years behind AMD.

Unfortunately it is not something that you can just throw money at to instantly catch up on overnight. It will take some time to catch up here.

The question that I suppose we as gamers care about is: Can Radeon group leverage this packaging/silicon advantage into an actual viable product that either competes with or outperforms Nvidia's Halo product?

It seems they are aiming for a moonshot-type design, but whether they can actually wrangle it into a working product without dropping the ball is anyone's guess. So far their track record in consumer GPUs is not painting a great picture for them, if these RDNA4 rumours are true.

My take is, it would be nice to see them nail it with RDNA5, but I'm not holding my breath. I'll believe it when I see it.
 

NVIDIA's highly anticipated update to its workstation series has finally arrived, introducing three new models to the lineup. Leading the pack is the RTX 5000, which boasts the ADA AD102 GPU housing 12,800 CUDA cores. Positioned just below the flagship RTX 6000 with its 48 GB memory, the RTX 5000 comes with 32GB of memory and is expected to retail at around $4000.

Expanding the RTX 4000 ADA series, two additional models have been introduced: the RTX 4500 and the RTX 4000. The latter may sound familiar, as NVIDIA had previously released this GPU in a Small Form Factor (SFF) design. The new version features a single-slot design but departs from the low-profile design of its predecessor. While power requirements have risen from 70W to 130W, the boost clock has increased significantly, offering up to 2.2 GHz compared to the previous 1.56 GHz on the SFF variant.

The RTX 4500 ADA features the AD104 GPU, equipped with 7680 CUDA cores and paired with 24GB of memory. Notably, this model incorporates a 192-bit memory bus to reach the 24GB capacity. Power consumption is reported to be around 210W.
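
As a back-of-the-envelope check on how 24GB lines up with a 192-bit bus (my own rough math, not from the announcement; the 18 Gbps data rate below is an assumed figure): 192 bits means six 32-bit GDDR6 channels, and 16Gb (2GB) modules mounted two per channel in clamshell give 4GB per channel.

# Rough sanity check (my own math, not from NVIDIA's announcement).
# Assumptions: 16 Gb (2 GB) GDDR6 modules, two per 32-bit channel (clamshell),
# and an assumed 18 Gbps per-pin data rate for the bandwidth estimate.

BUS_WIDTH_BITS = 192
CHANNEL_WIDTH_BITS = 32        # one GDDR6 module interface
MODULE_CAPACITY_GB = 2         # 16 Gb module
ASSUMED_GBPS_PER_PIN = 18      # not confirmed in the announcement

channels = BUS_WIDTH_BITS // CHANNEL_WIDTH_BITS              # 6 channels
capacity_gb = channels * 2 * MODULE_CAPACITY_GB              # clamshell -> 24 GB
bandwidth_gb_s = BUS_WIDTH_BITS * ASSUMED_GBPS_PER_PIN / 8   # ~432 GB/s

print(f"{channels} channels, {capacity_gb} GB, ~{bandwidth_gb_s:.0f} GB/s")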
 
So I've been looking for GPUs, and I think I'm gonna sell my kidney for an MSI GeForce RTX 4070 VENTUS 3X OC so Jensen can buy a new leather jacket.

God damn, I hope it is worth it.

My machine just randomly crashes with my 2070S, so I just hope the money spent on the GPU was better spent than on a psychiatrist, in case my machine still crashes.
 
I think you'll like it. Frame gen is a legit feature of the 40 series.
 
Well, my 2070S already has gen 1 DLSS, so I know about it.

I can see it's pretty good in games, but I can't help feeling a little disappointed seeing such an expensive card, with everything on, only running 60 to 80 FPS in games like Cyberpunk with "fake" frames.

But I guess that's why the 40 series is called a disappointment.
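
For what it's worth, a rough way to think about the "fake" frames (my own simplification, not measured numbers): frame generation inserts one generated frame between each pair of rendered frames, so the displayed frame rate roughly doubles while responsiveness still tracks the underlying rendered rate.

# Toy illustration (my simplification, not benchmark data): DLSS 3 frame generation
# interleaves one generated frame per rendered frame, so displayed fps roughly
# doubles, while input latency still follows the rendered frame rate (and in
# practice picks up a little extra from holding a frame back).

def frame_gen_estimate(rendered_fps: float) -> tuple[float, float]:
    displayed_fps = 2 * rendered_fps            # ignores generation overhead
    rendered_frame_time_ms = 1000.0 / rendered_fps
    return displayed_fps, rendered_frame_time_ms

for fps in (30, 40, 60):
    shown, ms = frame_gen_estimate(fps)
    print(f"rendered {fps} fps -> ~{shown:.0f} fps shown, ~{ms:.1f} ms per real frame")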

But I'm rolling 1440p on my monitor, so I think it will be good enough for a long time. I'm the type that only buys new hardware when it dies.

My 2070 is 3 or 4 years old, and it died the minute it went out of warranty.

My GF got my old 970 that I bought 8 or 9 years ago; it is still running strong.
 
Uhm yeah. The raytracing support for AMD cards in Ratchet & Clank just arrived.

You'd think that with the game initially developed for AMD hardware and no crazy expensive RTGI or path tracing going on, AMD GPUs would perform well here.

Maxed out at 4K, even with FSR2 Quality mode, a 7900 XTX struggles hard, fluctuating between 25 and 80+ fps. You can't lock the game at 30fps.

 
  • Shocked
Reactions: IrishWhiskey
Like most Sony 1st party games that get ported to PC, isn't R&C Nvidia sponsored? Even though these games were originally designed for AMD consoles the PC branch of the code is generally modified to be optimized for Nvidia GPUs in these cases.

Given that Sony doesn't use DirectX and therefore doesn't use DXR (they have their own API), it probably makes sense that replacing the entire RT implementation with DXR would result in different performance?

Of course, the RT implementation could also just be shit. I have no idea, but I figured it was worth mentioning.
 
  • Brain
Reactions: Mickmrly and Ascend
I just upgraded my monitor to a 4K/144Hz one and my 3080/10 GB is really struggling. That 10 GB really sucks now.

I have a feeling that the 4070Ti with 12 GB would still be too little to justify the upgrade. Do you guys agree?

Can't justify a 4080/4090 at their price points at the moment...
 
  • Like
Reactions: regawdless
Like most Sony 1st party games that get ported to PC, isn't R&C Nvidia sponsored? Even though these games were originally designed for AMD consoles the PC branch of the code is generally modified to be optimized for Nvidia GPUs in these cases.

It's not Nvidia sponsored from what I know. They obviously use it for their marketing because it has DLSS and stuff, but there's no cooperation or tech support from what I've seen. It's three RT effects at the same time, and the more complex the RT gets, the more we see AMD cards struggle, no matter the game. When it's only reflections and maybe shadows, AMD cards hold up ok. Add RTAO or RTGI and they can't deal with it well anymore; it's just their cards not being great at RT.

The 4070 Ti is a perfect 1440p card. At 4K, it's still 18% faster on average than a 3080, so not that much. But it depends on the game. DLSS3 can make it worth it, though.

(chart: 4K average performance comparison)
 
  • Like
Reactions: McHuj
The halt on 40xx cards seems to be real and already kicking in.

All 4070s are out of stock in my country and have to be ordered.

The earliest I can expect to see my MSI 4070 is August 17.

The struggle is real.

Can't you just cap the framerate in AMD software, like with Nvidia?
 
You can cap the fps with many different tools. But performance drops to sub-30fps in places and is generally between 40 and 60fps, so you get an uneven experience either way, no matter where you lock it. I mean theoretically. Thankfully, this game offers a dynamic resolution option, so AMD users can use that to help with the fps at the cost of a bit of image quality. So they'll be able to get a good experience.

I have to do it with my 3080, it's a good way to increase performance and even it out.
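
To sketch what dynamic resolution is doing under the hood (a generic toy controller of my own, not the game's actual code): the render scale gets nudged down when frames take longer than the target and back up when there's headroom.

# Generic dynamic-resolution sketch (my own toy controller, not Insomniac's code):
# nudge the render scale toward whatever hits the target frame time, clamped to a
# sane range, trading resolution for smoother frame pacing.

TARGET_FRAME_MS = 1000.0 / 60.0   # aiming for 60 fps
MIN_SCALE, MAX_SCALE = 0.5, 1.0   # render resolution as a fraction of output

def adjust_render_scale(scale: float, last_frame_ms: float) -> float:
    error = (last_frame_ms - TARGET_FRAME_MS) / TARGET_FRAME_MS
    scale -= 0.1 * error          # simple proportional step
    return max(MIN_SCALE, min(MAX_SCALE, scale))

scale = 1.0
for frame_ms in (22.0, 20.5, 18.0, 15.5, 16.2):   # made-up frame times
    scale = adjust_render_scale(scale, frame_ms)
    print(f"frame took {frame_ms:.1f} ms -> render scale {scale:.2f}")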
 
I hope he's wrong about it, but JayzTwoCents brought up something to keep an eye on: another potential GPU shortage because of the rising interest in AI.



All 4070s are out of stock in my country and have to be ordered.

Where? I took another brief look over in Europe and stock seems plentiful. The Ventus OC you were eyeing is available in Germany, for example.


It's up on Amazon for the UK.
 
I live in Denmark, where it seems to be in remote stock somewhere (I don't know if that's the correct word in English).
 
  • Like
Reactions: Joe T.
The 4070 Ti is a perfect 1440p card. At 4K, it's still 18% faster on average than a 3080, so not that much. But it depends on the game. DLSS3 can make it worth it, though.

(chart: 4K average performance comparison)

Where did you make this chart?
 
I live in Denmark, where it seems to be in remote stock somewhere (I don't know if that's the correct word in English).

Yeah. Checked a few places out of curiosity to see what the situation's like there; the more affordable models seem to be in stock at suppliers/warehouses with delivery times between 1-8 days, while more expensive models are available in store for immediate pickup/delivery.
 
  • Like
Reactions: The_Mike
Yeah, and I would rather pay 20 bucks more for the MSI and wait a little. Seems like a card that's worth the extra money.
 
  • Brain
Reactions: Mickmrly
I hope he's wrong about it, but JayzTwoCents brought up something to keep an eye on: another potential GPU shortage because of the rising interest in AI.





He's not wrong. And crypto mining on Ethereum might also re-emerge in another way.

If you want a card, buy it this year.
 
I was under the impression that Sony had some kind of deal with Nvidia for GeForce Now which meant that most of their PC releases were Nvidia sponsored (unless specifically sponsored individually by AMD/Intel like TLOU).

Maybe I'm wrong about that and they just list the games in their marketing due to DLSS+Other Nvidia tech? I'm not 100% sure.

Yeah, it definitely seems that using too many RT effects at once still has a detrimental effect on Radeon GPUs. Maybe we will see more of a silicon/R&D commitment to RT for RDNA4/5, with some proper fixed-function cores?

However, my point still stands about the game's graphics API calls needing to be entirely replaced with DirectX API calls, as Sony uses their own custom graphics API on PlayStation. This includes the RT implementation, which would have to be rewritten in DXR.

This could help explain why a game originally designed for AMD hardware in consoles could perform worse or suboptimally when ported to PC.

Given that Nvidia is the clear market leader in the PC space with a near monopoly it would make sense that most optimization work and testing would be done for Nvidia GPUs, especially where RT is concerned given Nvidia's strong performance and software engineer support here.

Plus as we said, currently Radeon GPUs are somewhat lacking when it comes to RT hardware particularly when multiple heavy RT effects are used at once.
 
Interesting video comparing the 3080 and 6800XT second-hand market. The 3080 is cheaper and arguably better, but is crippled by its VRAM. The VRAM issue is fixed by using DLSS, however.
The funniest part is that after basically arguing in favor of the 3080 during the whole video, he will use the 6800XT as his daily driver. 🤣

 
Moore's Law Is Dead has become a very reliable leaker tbh. And this vid has awesome info.


The rumored AMD Strix Halo desktop APU is one of the things that has me most hyped up. The APU will have almost midrange performance and is rumored to also have AI accelerators. With that, one can likely run AI models without worrying about RAM limits. Since it's a 2024 product, it is likely to handle 128GB or maybe even 256GB of RAM. Enough to run even some of the largest open source models.

Many of which are likely to be competitive with GPT-4. Imagine coding, image generation, music, voice and sound generation, all private and local.
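
As a rough sanity check on the RAM angle (my own back-of-the-envelope, not from the leak): weight memory is roughly parameter count times bytes per weight, so 128GB covers 70B-class models at 8-bit or 4-bit quantization with room left over for the KV cache.

# Back-of-the-envelope (my own estimate, not from any leak): memory for the
# weights alone is roughly parameter count x bytes per weight; the KV cache and
# runtime add some overhead on top.

def weight_memory_gb(params_billion: float, bits_per_weight: int) -> float:
    return params_billion * 1e9 * bits_per_weight / 8 / 1e9

for params in (13, 70, 180):          # common open-source model sizes (billions)
    for bits in (16, 8, 4):           # fp16, int8, 4-bit quantization
        print(f"{params}B @ {bits}-bit: ~{weight_memory_gb(params, bits):.0f} GB for weights")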

Also, since it has a beefy CPU, it can be paired with an RTX 5000 GPU for high gaming performance.
 
  • Like
Reactions: Ascend
The 4060 Ti 16GB is such a shitty card it's unbelievable. Nvidia fucked up hard on the lower end this gen.
It's really just a 4050 Ti, which would be great if it were called that and cost $300 or less, then $250 or less for the 8GB model. The 4070 should be a 4060 for $400, the 4080 a 4070 for $600, and some new card should slot in as the 4080 between the current 4080 and 4090.

The 4060 Ti being branded as such and at that price is really a shame, as the 1050 Ti was a legendary card for the money: $130-140 for the 4GB model.

@Ascend I regularly skim through MLID videos for news on Intel.
 
I hear there are rumors of Super revisions on the horizon with far better specs for a similar price.
 
Rumors also said, before launch, that the 40xx cards would be a good budget series with reasonable prices.

Yet here we are.

I highly doubt we will see better prices from Nvidia until they have at least sold most of their current stock.