Thread: The GPU Thread
The 1080 Ti was really better all around for its time, both in price to performance, and it doesn't have as much CPU overhead from drivers as the 4090.

I hate how hot and stupidly sized the 4090 is as well… As for me, I don't own one because I don't want to get hosed on price, no matter how fast it is.

The 4090 is actually super tiny... But I'm also watercooling mine. But even before I did that, it's actually a hyper efficient card that runs cool.
 
It is efficient. Some cards are nearly 4 slots though.

By hot I just mean you need to have more and more case cooling/room.

And I do think the 5090 will be bigger.
 
Quiet? You've swapped like 20 of them.

Coil whine is unfortunately not limited to the 4090; the 4080, 7900 XTX, and 7900 XT also suffer from it. Just going by dB, which is generally a lot lower than previous high-end cards. Definitely not value for money, but I don't think it's justifiable to call it hot and loud.
 
By loud I actually do mean coil whine more so than fan noise which can be mitigated with case flow.

Thing about coil whine though, the more watts a card pulls the more it's an issue. Hence why I said it's a loud card. And you've experienced that all too well. It's silly to say it's not loud because it's coil whine and not fan noise.

Coil whine definitely bothers me, so I'd like to see really good 60/70 class cards from the 5000 series as an upgrade for me.
 

Yeah, if they solve the issue I've probably got one more upgrade in me in 2025 for the 50 series.
 
@Hostile_18 also dude I am just like you with swapping stuff with defects, esp. with TVs.

You'd think these manufacturers are losing money due to returns vs. spending an extra $20 or whatever on better capacitors or whatnot, but maybe most people just put up with it and we are in the minority.
 

True. I think I'm just trying to walk a middle ground now between poor and perfect, because the pursuit of perfect I've found is impossible. But likewise I've had a fair few units that no one should have to put up with, units that literally sounded like a beard trimmer.
 
I am not saying it's shit because it's AMD. And besides, the 7800 XT is just as boring and unimpressive as the 7700 XT.

I've literally said 4000 series is worse, and sets the low bar.

These new cards suck too, just less. The 7800 XT is a smaller chip than the 6800 XT, so it really shouldn't even be labeled the same; same problem with Nvidia's lineup getting bumped up in naming tiers despite the 4060 Ti really being a 4050 Ti, etc.

You only think it's good because the fucked up GPU market has drastically lowered the bar compared to Nvidia's Maxwell/Pascal generations.

The naming is whack. We can all agree on that. But I don't care about the naming. I care about value for money, and provided the performance is accurate, the 7800XT is one of the best value for money cards that we've had in a while.

Ah the Maxwell generation. The time when everyone flocked to the GTX 970 that ultimately bit them in the ass with the 3.5GB because rather than buying the superior AMD cards in the previous gen and the direct competitor in the same gen, people simply lingered around the original failure that was Maxwell, that made Maxwell 2.0 look a lot better than it really was. The R9 390 was a better choice than the GTX 970, but of course, the same old mantras were repeated about drivers. Nobody could argue against the 980 & 980 Ti though.

Pascal gen was probably peak value of all time, for nVidia at least. They still managed to screw over the Titan X buyers with the 1080 Ti, but I digress. I generally agree that Pascal was a good gen. I also agree that the bar has generally been lowered, but I wouldn't call it "drastically lowered".

Going back to Pascal... The 1080Ti was $699. Today you can get that same performance in a 6600XT or Arc A770. Those cards cost way under $300 today. That same performance was $499 with the 2070S. So the progress has been there. Even the 4060 at $300 offers slightly better performance than the 1080 Ti, so, it is progress.

Aside from a handful of games, like Starfield, the cards are capable enough at their respective resolutions. This only changes when RT comes into play. I've generally argued against excessive focus on RT because it's too heavy for modern hardware, and this current market is partly the result of focusing too much on techniques that the hardware is not ready for, along with our other favorite market condition: nothing being able to run at playable framerates natively, instead requiring upscaling.

But let's do some more numbers here, picking a good mid-tier card.
970 ($329) -> 1070 ($379) +47% performance, +15% price
1070 ($379) -> 2070 ($499) +34% performance, +32% price
2070 ($499) -> 3070 ($499) +53% performance, +0% price
3070 ($499) -> 4070 ($599) +22% performance, +20% price

AMD's is harder to compare due to their messy gens, but here it goes... I'll use similar performance tiers to the above...
R9 390 ($329)-> RX Vega 56 ($399) +59% performance, +21% price
RX Vega 56 ($399) -> RX 5700XT ($399) +26% performance, +0% price
RX 5700XT ($399) -> RX 6700XT ($479) +35% performance, +20% price
RX 6700XT ($479) -> RX 7700XT ($449) (assuming 6800 performance) +25% performance, -6% price

Tell me. What do you see?

The contrast between the cpu and gpu market is remarkable, where the cpu competition is fierce but the gpu makers just couldn't give a shit less.

I think that will change next generation with RDNA 4 and Blackwell; then maybe some people will see just how terrible this generation is. And I mean it is pretty much the worst generation in graphics to date.

This is only a bad gen if your only point of reference is nVidia.
 
@Ascend The 970 was a freaking amazing card, dude. I used one all the way up until I picked up my 3060. The drama about the 3.5 GB didn't negate how good it was, and it was much better value vs the 980. I could play games like Metro Exodus at medium 1080p at a locked 60 on the 970.

Maxwell was great, including the budget 750ti. Heck the only dud was the 960.

I mean yeah, Vega 56 aged better, but it also ran much hotter, and in terms of games that were out at the time, the 970 was better. That extra 500 MB on the 980 can help with a few texture settings, but in today's context a 980 isn't any more useful than a 970.

In terms of kepler vs amd, aka 680/780 vs 7970/r9 290, that was amd's generation no doubt. Kepler aged like milk.
 

The competitor of the 970 was the R9 390, not the Vega 56. The Vega 56 was the competitor of the 1070. I still don't get why the 970 was so much more popular than the R9 390.



It's a real shame when competing AMD cards like the R9 390 go forgotten in history, but the card that deceived consumers is still remembered as one of the best cards ever. But I guess that's what gamers are like, and now we all pay the price for it.

But that was not the main point of my post. Since you missed the point I was trying to make, let me elaborate.

nVidia performance rose by 39% on average, and increased price by 17% on average.
AMD performance rose by 36% on average, and increased price by 9% on average.
Note that the R9 390 and GTX 970 were pretty much identical in performance, and the 7700XT will likely be just shy of the RTX 4070.
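The per-generation averages quoted above are easy to sanity-check with a few lines of Python (a throwaway sketch; the numbers are copied straight from the tables, and the helper name is made up):

```python
# Generation-over-generation changes from the tables above, as
# (performance %, price %) pairs per generational jump.
nvidia = [(47, 15), (34, 32), (53, 0), (22, 20)]
amd = [(59, 21), (26, 0), (35, 20), (25, -6)]

def averages(gens):
    """Average performance and price change across the listed jumps."""
    perf = sum(p for p, _ in gens) / len(gens)
    price = sum(c for _, c in gens) / len(gens)
    return round(perf), round(price)

print(averages(nvidia))  # -> (39, 17)
print(averages(amd))     # -> (36, 9)
```

The rounded results match the 39%/17% and 36%/9% figures cited.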

Also note that there were two generations where nVidia basically didn't increase performance per dollar at all. They did this with the 2000 series and the 4000 series. That is where they were deliberately hiking their prices.
AMD, on the other hand, even though they are accused of price hiking too, increased performance per dollar every single generation.

So again, this generation only looks bad when nVidia is your only reference. Clearly, AMD is doing better here in terms of pricing and has been quite consistent in its improvement in performance and pricing. If anyone thought the 5700XT was a good card, you cannot think that the 7700XT is actually a bad card. It gives a similar performance bump after all, and even at a slight reduction in price. It does look bad in comparison to the 7800XT, because that one's value is so much higher, and I will be getting it if everything is as it appears.
 

I didn't miss your point; I just don't think you have one, considering power consumption and drivers. It's not all about price to performance, which was close enough. This was the argument back then: AMD competes in performance, but at the same price you can get a much more efficient card and have better support.

I mentioned that the 4090 has high cpu overhead from drivers. Well that used to be amd's problem in comparison to maxwell/pascal.

I mean, since you think amd is doing so great atm, it would actually be gamers such as yourself making things worse. People who see it's all crap and aren't buying are going to (hopefully) fuel a better generation.

I mean most of the problem was people buying cards at scalper prices last gen, but we still need to hold their feet to the fire, which you're not doing.

I'm not defending Nvidia like you are with amd, i'm giving them both shit. AMD if they wanted to could wreck the nvidia lineup sans 4090, but they're following Nvidia here with minimal effort.
 
@Ascend As an olive branch, depending on what card you're coming from and you needed a card now, yeah the 7800xt is the one to get if I had to. But i'm holding out.

Don't get me wrong I see your point that that card is a… not terrible increase in value, which is better than Nvidia. But you must admit things could be much better than they are.

I just can't get over how they are basically just offering a cheaper 6800xt… if the 1080 was just a cheaper 980 we would have laughed at it to no end back then. It's just not exciting, and at best a sobering reality where component costs keep rising and desktop gamers are no longer the focus as opposed to other markets.
 
I didn't miss your point; I just don't think you have one, considering power consumption and drivers. It's not all about price to performance, which was close enough. This was the argument back then: AMD competes in performance, but at the same price you can get a much more efficient card and have better support.
I will never understand people making a big deal out of 50W.
I will also never understand features only mattering when nVidia has them. But I'm not gonna delve into that now.
Long story short, the video in my previous post shows that the power consumption isn't that big of a deal for the R9 390 vs the GTX 970, and that the benefit of the VRAM probably outweighs the power consumption drawback.
But to each their own.

I mentioned that the 4090 has high cpu overhead from drivers. Well that used to be amd's problem in comparison to maxwell/pascal.
The situation flipped with Vulkan/DX12, because AMD still uses a hardware scheduler to this day. The hardware scheduler was limiting them in DX11, but is helping them in lower level APIs like DX12.
nVidia dropped their hardware scheduler for Maxwell, which is one of the main reasons for their reduced power consumption. It gave them more granularity in DX11, and more control over the front end of the graphics pipeline. Basically, the software scheduler uses additional CPU resources. But with DX11 being very single core heavy, nVidia had learnt to offload their drivers and scheduling to the unused CPU cores. This would translate to a lower CPU bottleneck on the primary core used during gaming, and making use of the idle ones helped to boost their driver performance as much as possible in games. Under DX12 and Vulkan, this translated to the appearance of using more CPU resources, but it's really something that has been there since Maxwell.
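The driver-thread offloading described above is essentially a producer-consumer hand-off: the game thread records commands cheaply, and a worker thread on an otherwise idle core does the submission work. A rough sketch in Python (all names hypothetical; real drivers do this in native code, this just illustrates the pattern):

```python
import queue
import threading

# The game thread produces draw commands; a worker thread on a spare
# core consumes and "submits" them, keeping that cost off the primary core.
commands = queue.Queue()
submitted = []

def driver_worker():
    while True:
        cmd = commands.get()
        if cmd is None:        # sentinel: shut the worker down
            break
        submitted.append(cmd)  # stands in for the actual GPU submission

worker = threading.Thread(target=driver_worker)
worker.start()

for i in range(3):             # "game thread": cheap hand-off only
    commands.put(f"draw_{i}")
commands.put(None)
worker.join()
print(submitted)               # -> ['draw_0', 'draw_1', 'draw_2']
```

Under DX11 this hides scheduling cost from the single heavy game thread; under DX12/Vulkan the same worker activity just shows up as extra CPU usage in monitoring tools.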

I mean, since you think amd is doing so great atm, it would actually be gamers such as yourself making things worse.
Oh sure. The one that didn't buy a graphics card since the R9 Fury is making things worse by buying a good value product from the underdog... Definitely not the ones that bought the nVidia RTX 2000 series or are constantly paying more and more for the top end cards...

People who see it's all crap and aren't buying are going to (hopefully) fuel a better generation.
I think it's time we clarify what is exactly crap about it.

I mean most of the problem was people buying cards at scalper prices last gen, but we still need to hold their feet to the fire, which you're not doing.
Whose feet? The 6800XT MSRP was $649 and was considered great value at the time, and it is now available below $500. Scalper prices have been gone for a while.
Let me ask you this. What would the price of the 4070 need to be for it to be a good buy and for this market not to be crappy?

I'm not defending Nvidia like you are with amd, i'm giving them both shit.
I just presented the facts. It sounds like I'm defending AMD because everyone is used to looking at things from the standard nVidia perspective. I'm fully aware that AMD has lied about their performance in the past and that tomorrow we might have a shitshow regarding the performance of these cards.

AMD if they wanted to could wreck the nvidia lineup sans 4090, but they're following Nvidia here with minimal effort.
I see why people say this, but honestly, I don't think it completely follows. The 7900XTX and the 7900XT didn't have pricing that followed nVidia. They clearly undercut nVidia by quite a significant margin, and they didn't go higher with their flagship price compared to the previous gen. And they are currently using the same weird strategy of placing the lower tier card too close in price to the higher tier one, something that nVidia does not do.
They did follow nVidia at the launch of the 6000 series, and at the time everyone thought those cards were good deals (if you could actually buy one at MSRP), except then the mining boom happened. We are now getting better deals than that, which is why I don't get the criticism.

@Ascend As an olive branch, depending on what card you're coming from and you needed a card now, yeah the 7800xt is the one to get if I had to. But i'm holding out.

Don't get me wrong I see your point that that card is a… not terrible increase in value, which is better than Nvidia. But you must admit things could be much better than they are.

I just can't get over how they are basically just offering a cheaper 6800xt… if the 1080 was just a cheaper 980 we would have laughed at it to no end back then. It's just not exciting, and at best a sobering reality where component costs keep rising and desktop gamers are no longer the focus as opposed to other markets.
Technically I don't HAVE to upgrade right now. I can easily wait for the next gen. But I totally see another graphics card hoarding event coming much sooner than most expect. It's only a matter of time before AI and crypto converge, where you can gain crypto coins for using your GPU to process AI requests. I made the mistake of passing on the 5700XT, and I'm not doing that again. The stars are equally aligned at this point.

Things could be better than they are for sure (in a way that's always possible), but unfortunately, I wouldn't count on it becoming better than it is right now. According to recent news, nVidia didn't even update their drivers properly for Starfield because they've allocated more software resources to their AI business. I wouldn't hold my breath for a better GPU market.

As for the 1080 vs 980 comparison, I don't think that follows directly. Those were generally considered at the top of the line at their release, even if we would expect a Ti or Titan release later. The 7700XT and 7800XT cards are mid tier cards, even if the naming suggests otherwise. This gen is definitely the most confusing naming-wise, and in that sense, I don't disagree about the market.

But I honestly don't understand the issue. I bought my R9 Fury in 2016 for $300. Guess which card came out that year. The RX480. I decided to get the Fury at stock clearing prices, because back then there was also a mining craze, but pretty much only AMD GPUs were wanted for it. nVidia's cards sucked at compute back then, so miners didn't buy those too much. The RX480 was regarded as a good card, in spite of it being slower than the R9 Fury. On paper it cost less too, but miners had driven the prices up to be higher than the Fury cards, because it had some advantages for ease of mining. So ultimately, the R9 Fury appeared to be the better deal, so I got that.
But in spite of all those circumstances, the RX480 was still considered a good card. The 5700XT situation is also not much different compared to the Vega 64. That was also considered a good card. The 7800XT is in better standing compared to the 6800XT, than Polaris was against the Fury, or than the 5700XT was against the Vega 64... But somehow that's still not good enough... 🤷‍♂️

As for gamers not being the focus, I fully agree. nVidia is following their bottom line, and AMD put their relatively limited resources elsewhere after gamers basically ignored their good deals for years. I wouldn't be surprised if in five years we only have AMD and Intel for the desktop DIY market. And honestly, I would be extremely happy for that green toxicity to exit the gaming market, as long as all the online green nVidia drones follow suit (not talking about you specifically, but in general).
 
@Ascend I read all of that. Just going to agree to disagree, although I thought I was clear on why I think everything is crap, i.e. both AMD and Nvidia are selling graphics cards with chip sizes one tier lower than previous generations; the 7800 XT should be a 7700 XT and the 4060 Ti is really a 4050 Ti.

What would I say is a good price for the 4070? $450. $450 because it's basically what a 4060 should be, given (A) the VRAM (both bus width and capacity) and (B) the chip size. It's a damn 4060 except in name. The actual 4060 is a damn 4050.

3060 was $329 msrp, so $450 would more than adjust for inflation.
 
@Chozofication I agree that they are both naming their cards as if they are a higher tier. I guess where we disagree is the general scaling in performance & price, at least for the AMD mid tier.

A 4070 at $450 would be great. But we're likely never going back to such prices. The price increase from $329 to $450 definitely accounts for inflation and then some, although nVidia also went from a cheap Samsung 8nm process to the more expensive TSMC 5nm process, so they have to account for that too. The former costs $5k per wafer while the latter costs $16k.
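For a sense of what that wafer-cost jump means per chip, here's a back-of-the-envelope sketch (the dies-per-wafer figure is a made-up round number that ignores yield and die-size differences, so treat it purely as illustration):

```python
# Rough cost per die at the two quoted wafer prices, assuming a
# hypothetical 200 usable dies per wafer in both cases.
def cost_per_die(wafer_cost, dies_per_wafer):
    return wafer_cost / dies_per_wafer

samsung_8nm = cost_per_die(5_000, 200)   # $25 per die
tsmc_5nm = cost_per_die(16_000, 200)     # $80 per die
print(samsung_8nm, tsmc_5nm)
```

With the same die count assumed on both nodes, silicon cost per chip more than triples (16/5 = 3.2x), which is the part of the price increase that isn't just inflation or margin.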
 
Hardware Unboxed: "borderline embarrassing for AMD, pretty crappy after 3 years, but since the market has been terrible, I guess we'll take it."

100% the nicest that can be said about 7800xt.

It'll be interesting to see if Nvidia drops prices or not. In the past I would say so, but maybe they 100% do not give a shit right now.
 
I got hit with an artifacting issue that has apparently been experienced by 3000 or 4000 series owners this year, didn't start for me until the latest Windows 11 update that I didn't download until the end of last month. Only seems to happen while browsing sites with lots of thumbnails for videos, like Youtube, and happened once with Discord probably because the page had a couple of video links from Twitter. It only appears for a second or two at most, if even that long, but is it ever annoying. Doesn't come up during gaming at all.

Pulled this image from Linus Tech Tips' forum that mirrors what I've seen, though it often takes up a larger chunk of the screen and sometimes includes solid rectangles/triangles:



Seen a couple of less than ideal suggested fixes for it like turning off hardware acceleration for browsing (didn't work) and turning off Gsync (seems to be working).

How common is this problem?
 
First one that I could find comparing AIB cards.



Edit: rest comes here as they pop up.

 
@Ascend What model are you thinking of getting?

That Sapphire Nitro+ model, if it is really only $550, I'd probably go for that. You'd think it would be near silent since it has the same insane cooler as the 7900 XTX lol

We will see if it's really only $550 though; hard to believe considering the markup on the 7900 XTX.
 

Definitely eyeing the Nitro.

All sold out on Newegg though, if there was any stock of it in the first place.

Edit:
The Asrock Phantom Gaming is looking good too, tbh, at a cheaper price... And it's in stock... Tempting.


 
Most components on the PCBs seem to be the same across most vendors, but there's a minor difference. The reference card is using a 10 phase power design, with MP2856 controllers. The Sapphire Nitro is using the same thing. But the PowerColor Hellhound and the Asrock Phantom Gaming are using an 11 phase power design, with MP2857 controllers instead.


I also noticed that some interesting things are going on, on the SOC.

AMD Reference Card
PowerColor Hellhound
Sapphire Nitro
Asrock Phantom Gaming
Asus TUF


Notice how some of the SOCs have more components at the top. At first I thought it had to do with the power phases implemented, but the Sapphire Nitro throws that for a loop, because it's 10 phase yet has the same component layout as the 11 phase Asrock Phantom Gaming.

Now I'm wondering which one would be better quality, or if it has any influence at all. Generally, more components would be better, I guess? Although it's probably also a higher chance of failure due to having more of them (but that would probably be negligible nowadays).

I'm probably overthinking this :D

Edit: The 7800XT Nitro+ is available on Amazon, but it's $599 already.

Edit 2: Asrock Phantom Gaming sold out at Newegg.
 
I got hit with an artifacting issue that has apparently been experienced by 3000 or 4000 series owners this year, didn't start for me until the latest Windows 11 update that I didn't download until the end of last month. Only seems to happen while browsing sites with lots of thumbnails for videos, like Youtube, and happened once with Discord probably because the page had a couple of video links from Twitter. It only appears for a second or two at most, if even that long, but is it ever annoying. Doesn't come up during gaming at all.

Pulled this image from Linus Tech Tips' forum that mirrors what I've seen, though it often takes up a larger chunk of the screen and sometimes includes solid rectangles/triangles:



Seen a couple of less than ideal suggested fixes for it like turning off hardware acceleration for browsing (didn't work) and turning off Gsync (seems to be working).

How common is this problem?

First time I hear of this, so, I can't help you. Sorry.
 


Literally all 7800XTs are sold out on NewEgg now. Maybe I should have snagged the Phantom Gaming when it was available instead of waiting for the Nitro. Oh well. Guess I have no choice but to wait a bit longer.
 
I got hit with an artifacting issue that has apparently been experienced by 3000 or 4000 series owners this year, didn't start for me until the latest Windows 11 update that I didn't download until the end of last month. Only seems to happen while browsing sites with lots of thumbnails for videos, like Youtube, and happened once with Discord probably because the page had a couple of video links from Twitter. It only appears for a second or two at most, if even that long, but is it ever annoying. Doesn't come up during gaming at all.

Pulled this image from Linus Tech Tips' forum that mirrors what I've seen, though it often takes up a larger chunk of the screen and sometimes includes solid rectangles/triangles:



Seen a couple of less than ideal suggested fixes for it like turning off hardware acceleration for browsing (didn't work) and turning off Gsync (seems to be working).

How common is this problem?

Fucking AMD drivers! This is why I only buy Nvidia!...oh wait...

But in all seriousness that really sucks. Have you tried a clean driver wipe with DDU and a fresh install of the latest GeForce driver? Probably won't fix your issue but sometimes Windows updates can do weird things with graphics drivers and a clean install might set right whatever has been messed up.

Failing that I suppose just report the issue to Nvidia and hope that they come up with a driver fix at some point? Hopefully the issue gets fixed sooner rather than later.
 
Fucking AMD drivers! This is why I only buy Nvidia!...oh wait...

No joke, first thing that came to mind.

But in all seriousness that really sucks. Have you tried a clean driver wipe with DDU and a fresh install of the latest GeForce driver? Probably won't fix your issue but sometimes Windows updates can do weird things with graphics drivers and a clean install might set right whatever has been messed up.

Failing that I suppose just report the issue to Nvidia and hope that they come up with a driver fix at some point? Hopefully the issue gets fixed sooner rather than later.

Yeah, DDU didn't do me any good. I barely browse at all on the S95B and generally have Vsync turned on anyway while gaming on the 60hz 4K monitor (waiting for price reductions on Asus PG32UQXR) so turning off Gsync hasn't been much of a bother up until now.

I see complaints about this problem going back 8+ months and one post at Nvidia's forums claiming they officially recognized it (D3D11 chromium supposedly being the culprit). Their driver update for Baldur's Gate 3 included it as an open issue, notes on their forum here.

I'm surprised something like this has dragged on so long without getting more exposure.
 

That's the nVidia pass for you. If this was AMD there would be headlines with titles like "AMD drivers strike again".

Ultimately, I know people find me annoying since I constantly hammer the same nail. But it's the negligence of this stuff for the market leader and the constant beating up of the underdog for less that ultimately costs all of us.

In any case... I truly hope you can get this resolved.
 

Yeah, this one is pretty inexcusable for Nvidia. It's been a known bug with Chromium-based browsers for some time. If it was some small no-name browser, maybe I could understand, but we're talking about some of the most popular and most used desktop applications in the world. Every single Nvidia user likely uses their web browser more than any other application; how can you not make it an instant-fix priority?
 
Man the 7800XT Nitro is hard to buy. Just before going to work I checked Amazon. It was in stock, albeit slightly more expensive at $569. Didn't want to be late to work, which is a 15 minute drive, so I thought to myself, I'll buy it when I arrive.

Boom. Sold out. Even at that price.
 

Yeah, they seem to have vanished fast even locally here in Montreal. Hopefully that has more to do with high demand than low supply.
 

Wonder if it's scalpers or pent-up demand. Hoping pent-up demand. Does AMD have a system like Nvidia does to get cards into real fans' hands and not scalpers'? I was able to get my 4090 thanks to Nvidia's program; I have to imagine AMD has something similar.
 

They had it during the mining boom. Not sure if they do now.

Allegedly, AMD sent the 7700XT and 7800XT in the same quantity to different regions with a relatively limited supply to assess the popularity of the cards in each region. This would allow them to ship proportionally to each region's demand, and after the first week they would start shipping larger volumes.

In some European countries where the prices are different, the 7700XT seems to be selling better than the 7800XT. So I guess that makes sense.

Maybe next week I'll have an easier time buying one. It gives me more time to line up everything else I want to buy too, so it's not all bad. Importing things on an island becomes expensive if you ship everything separately.
 
The Asrock doesn't have a BIOS switch. That's good to know. It uses more power too, but still stays cooler than the Nitro, so its cooler seems to be better, which is interesting.




TechPowerUp created a nice summary of a few different variants as well.



And this is a good video showing the current state of the GPU market and how the AMD Starfield Bundle has influenced prices, as well as the launch of the 7800XT cards.
 
Wonder if it's scalpers or pent-up demand. Hoping pent-up demand. Does AMD have a system like Nvidia does to get cards into real fans' hands and not scalpers'? I was able to get my 4090 thanks to Nvidia's program; I have to imagine AMD has something similar.

Scalpers buy up everything these days. My wife wanted a Capitals jersey. It was some new design for a special event last year, I think, and they sold out in minutes and were on eBay and I think StockX right after for $100+ more than retail. It was just a random hockey jersey. So if there's something you want and you can't find it, there's most likely some douche bag with 1,500 of them in his basement.
 
Random, but does anyone here with a 4090 have the original Metro 2033? I want to see if it can run it at 4K120 locked.

YouTube only has 3090 vids and it dips into the 60s, and we're talking about a 2010 game here lol

Might seem silly, but I want my next PC to run games like that and The Witcher 2 perfectly, games from around the time I built my first PC.
 