Trio and Suprim are great. Love those fans. Ventus is good value, but I'd be more tempted by the FE at that price.
I'd be more tempted to sell my kidneys at that price.
Performance mode is much much better in DLSS too.
It's better. That it's "much much" better is highly debatable. And in the case of performance mode, being better doesn't mean it's good.
No lies detected. I only said what I said because the screenshot offered only Quality and Performance mode. Balanced is a viable setting too, especially for DLSS. But there was no comparison for it in that screenshot.
- Games involve movement, so how each solution deals with motion has a huge impact on your image quality. And generally, DLSS copes way better with that.
- Balanced and performance modes are important because they offer the biggest frame rate gains. I often use DLSS balanced in demanding games, and it looks nearly as good as Quality mode.
Again, I simply picked this one screenshot and made the obvious DLSS bias clear, especially with statements like "way way better image quality". It's arguably better overall, but let's keep things down to earth here.
You're only holding onto this one mode (FSR Quality), at this one resolution (4K), knowing that most people don't even have 4K displays and can't or don't target 4K, and knowing that all other modes and resolutions clearly go to DLSS. And in that one mode, at that one resolution, using a game from two years ago, you also exclude movement from your evaluation.
Convenient, that the "correct" image ignores the blurry background.
I just like a correct image. It's probably due to the fact that I spent over 20 years in VFX.
I also want to mention that 4K is obviously the resolution where upscaling tech is most valuable. 1440p is arguably more usable for DLSS than it is for FSR. But once we're getting into usability territory, this is the equivalent of claiming that a graphics card is 100% faster because it produces 20 fps instead of 10 fps, and arguing for that card over the other. Who cares? Nobody plays at those framerates.
And that is typical of blind nVidia followers: irrelevant advantages get propped up as far more relevant and impressive than they really are. And that is, unsurprisingly, happening with DLSS vs FSR too.
After reading the article, I don't see what the problem is...
DLSS vs FSR performance went to DLSS, as it should. And this is primarily the part you're complaining about... Why? DLSS won. What are you complaining about?
Compatibility went to FSR, for obvious reasons. If you want to argue that DLSS requires the Tensor cores on the cards, fine. But nobody complains about the RTX 3000 series being perfectly capable of DLSS 3 while nVidia deliberately locks its own users out. FSR works on everything. It should get points for that.
Game support was given a tie. This is debatable, but overall, it was a tie, and this was the conclusion:
DLSS is the more widespread option, with better performance and more games supported, but it necessitates an Nvidia GPU and isn't an option for anyone gaming on older hardware.
Conversely, AMD was late to the upscaling party with FSR, but has been putting in the work to improve its competing software. I'm hopeful that FSR 3.0 will ensure that Team Red remains competitive against Nvidia in this particular battleground, but until it arrives that's anyone's guess.
As for which one you should use - well, it depends on what GPU you've got and what games you want to play! If you're rocking a brand new RTX 4060, you should obviously be taking advantage of DLSS 3 in all the games supporting it, but if you're using an older GPU or any of AMD's best graphics cards, then FSR is probably the way to go.
I detect no lies. Tell me again what's so bad about the article...? For anyone who keeps up with graphics cards, it's a pretty useless article, but that doesn't mean it's bad. The fact that this article required a thread trying to bash it says everything about how skewed the PC gaming market is towards nVidia. Saying anything other than "nVidia always best" triggers some people.
Still, I'm voting for "incompetent", because it basically adds no real value for the reader. And when you're biased yourself, neutral articles look biased to you.
Wait... I complained about them only taking the performance increase into account, completely ignoring the actual quality of the results and the pictures. Not a word about image quality.
Now you come back and say.... "But they gave performance to DLSS, where's your issue?".
It's hard to take you seriously anymore tbh.
Image quality can be seen as part of performance. After all, all upscaling is a tradeoff between image quality and framerate.
Saying anything other than "nVidia always best" triggers some people.
My only reply to the above.
I'm rather critical of Nvidia
Honest question: why do you simply ignore everything that's being said and go back to some delusional assumptions?
Convenient, that the "correct" image ignores the blurry background.
Of the two, the background, while blurred, is the one I would take. I don't like artifacts in the image at all, and having the main character carry the artifacts is a no-no.
The background, while blurred, is the best of the two that I would take. I don't like artifacts in the image at all and having the main character have the artifacts is a no-no.
That is fair.
Also, I've not seen this blurriness to that degree in Quality mode in any of the games that support DLSS. Can you point me to a game that has this so I can test?
Without frame gen, yes, you are correct. But frame gen allows that reality now (i.e. CP2077).
Is CP2077 fully path traced like Quake 2 RTX or Portal RTX?
Their recently added "Overdrive" mode introduces full path tracing. But it's a "Preview", so it still has some flaws
What flaws are you seeing with this mode? It's probably the most optimized game to date. And rightly so, as Nvidia helped a lot with making that a reality today.

because to make it run in a demanding game like that, the devs had to do a lot of work and it's still a work in progress.
It's pretty feature complete now. I didn't know there was any work in progress, as the game doesn't have any visual bugs and the FPS is very consistent from frame to frame.

Works well though and looks incredible at times.
"at times"?
I played the Overdrive mode a lot and encountered a number of issues. Some examples:
Sometimes there's a lighting bug when switching from the menu back to the game; I also had random, rare light flashes in-game:
In general, the lighting response is delayed, like here with the shadows behind and under the car; look behind the front wheel. They're delayed. It's not a hardware-related issue; it happens on 4090 GPUs as well. Go to a dark room, stand in a corner, and shoot a gun. The muzzle flash will appear, but the environment will light up with a delay.
Also, reflections on fine elements like fences create artefacts at times. Unfortunately, these are low-res clips, but it's visible here nonetheless.
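The delayed lighting response described above is consistent with temporal accumulation in real-time RT denoisers (my assumption, not anything confirmed by the devs): each frame's noisy lighting sample is blended into a history buffer, so a sudden change like a muzzle flash takes many frames to fully show up. A minimal sketch:

```python
# Sketch: temporal accumulation (exponential moving average), the scheme
# many real-time RT denoisers use. A sudden light change takes several
# frames to converge, which looks like delayed lighting in game.
def accumulate(history, new_sample, alpha=0.1):
    # alpha = blend weight of the current frame's noisy sample
    return history * (1 - alpha) + new_sample * alpha

radiance = 0.0            # dark room
for frame in range(10):   # muzzle flash: true lighting jumps to 1.0
    radiance = accumulate(radiance, 1.0)
print(round(radiance, 3)) # ~0.651 — still well below 1.0 after 10 frames
```

Raising `alpha` converges faster but lets more noise through, which is exactly the tradeoff the hardware speed forces on the devs.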
I think I read somewhere that they keep improving it, but can't confirm right now.
Regarding "at times".
You know I'm a big fan of ray tracing and path tracing, and I think the Overdrive mode looks incredible. But the difference, even compared to vanilla, isn't always huge, because the devs did a good job with the standard lighting. I posted comparisons where it was night and day, with OD mode looking two generations ahead. But that's not always the case; especially at daytime, it's often not significant during normal play.
OD vs Vanilla. Of course there are differences, like the shadows in the night shot, or the building in the background in the second one. What I'm saying is, the impact of OD varies: it's mostly visible in indirectly lit scenes, but in the open it's not dramatic.
Wow. Thanks for the info man!
I have never seen the light bloom blow out like that.. like ever.
The delay in showing the ray tracing results will be there for a long time. Every game that uses RT ambient occlusion will have it. The cards just aren't fast enough to render it quickly enough.
I don't agree that the differences between PT and regular RT are small. To me, given what I look for, it's blatantly obvious. The skin on characters is the biggest giveaway: one approach uses the GI light probes and therefore has flat shading, versus actually computing the secondary bounces and seeing shadows with a proper light direction on the skin. This alone undermines every game on the market; the artifact is so jarring to me that it literally makes me reconsider continuing to play a game that doesn't have the correct lighting solution on skin.
Overdrive looks nice, but I'm not sure how much nicer. Like, if it's a 20 fps or greater loss... that's a lot to ask.
One of the main draws of path tracing is the implementation of area lights. If you pay attention to lighting in games, lights come in one of three varieties: 1) point, 2) directional (sun) and 3) spot. This is generally good in practice, but light can take on many more shapes. In CP OD, they show off this feature in spades, with all kinds of lights of varying shapes, and the materials actually take on the luminance of those shapes. Specular doesn't just have round highlights anymore; now they take the shape of the light that's bouncing off of them. That is a HUGE difference.
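One concrete thing area lights buy you is physically plausible soft shadows: by similar triangles, the penumbra width grows with the light's size, which a point light (size zero) can never produce. A rough sketch of that geometry, with hypothetical distances:

```python
# Sketch: soft-shadow penumbra width from similar triangles.
# light_size  = diameter of the area light
# d_occluder  = distance from the light to the shadow caster
# d_receiver  = distance from the light to the surface receiving the shadow
def penumbra_width(light_size, d_occluder, d_receiver):
    return light_size * (d_receiver - d_occluder) / d_occluder

# A point light (size 0) gives razor-sharp shadow edges:
print(penumbra_width(0.0, 2.0, 5.0))  # 0.0
# A 0.5 m area light softens the edge noticeably:
print(penumbra_width(0.5, 2.0, 5.0))  # 0.75
```

Path tracing gets this for free by sampling points across the light's surface; rasterizers have to fake it.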
I believe the reason they were so lazy with the 4060 Ti is that they rely too much on AI/DLSS tech.
I have to replace my 2070 Super at the worst possible time, but I can't afford a 4070. The current 4060 Ti is a joke with 8 GB of VRAM, but a 4060 Ti with 16 GB should be released, or at least properly revealed, soon.
Do you guys think it will be value for my money?
A current 4060 Ti costs 500 USD in my country. A 4060 Ti 16 GB would probably cost 650 USD with taxes included.
Meanwhile, a 4070 costs about 760 USD.
I just feel a 4070 is too expensive, way too expensive, and I was wondering if the Ti series is the new "go to" for budget gamers who can afford a slightly more expensive card.
I can't give you any advice tbh because I hate this GPU gen and will skip it.
Guys, the 5x00 series of cards is rumored to come in early 2025. So it's another long year of gaming on 4x00 series boards.
To me this is good because I get the most time out of the money I spent playing all games at the best possible resolution, graphics features and framerates (assuming the devs don't botch a port).
Damn, misread that as 2024 at first. 2025 is late.
I want at least 16 GB too, but I've heard the 4070 is like 30% faster than the 4060 Ti. If that's true, I can't justify buying a Ti when a 12 GB 4070 offers that much of a performance jump for just about $100 more.
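A quick back-of-the-envelope price-per-performance comparison, assuming the ~30% figure quoted in the thread and the prices mentioned above (650 USD for a hypothetical 4060 Ti 16 GB, 760 USD for a 4070), with performance normalized to the 4060 Ti:

```python
# Rough price-per-performance sketch. The ~30% uplift and both prices
# are assumptions taken from the thread, not benchmark data.
cards = {
    "4060 Ti 16GB": {"price": 650, "perf": 1.0},
    "4070":         {"price": 760, "perf": 1.3},
}
for name, c in cards.items():
    print(f"{name}: {c['price'] / c['perf']:.0f} USD per performance unit")
```

Under these assumptions the 4070 actually costs less per unit of performance (about 585 vs 650 USD), which supports the "just $100 more" argument, before even considering the extra 4 GB of VRAM on the Ti.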
Leaving this here. First 14 minutes.
Yes very silly.
When I get my new GPU (40-series), I'm interested in using DLSS effectively as AA at my native res. I've seen games that support that natively (you choose the same value for output and input res, in this case 2160p), but they seem to be few. I take it I can do it in a hacky way by choosing an output res whose input res comes out to 2160p, based on whatever percentage Quality DLSS renders at when you choose 2160p.
Though I'd want to disable most, if not all, of the added sharpening if I'm doing that. Is that always possible?
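The hacky approach described above (picking an output res whose DLSS Quality input res lands at native 2160p) is just arithmetic, assuming Quality mode renders at roughly two thirds of the output resolution per axis:

```python
# Quality DLSS renders at ~66.7% of the output resolution per axis
# (an assumption for this sketch; other modes use different factors).
QUALITY_SCALE = 2 / 3

def output_res_for_native_input(native_w, native_h, scale=QUALITY_SCALE):
    """Output resolution to request so the internal render res equals native."""
    return round(native_w / scale), round(native_h / scale)

w, h = output_res_for_native_input(3840, 2160)
print(w, h)  # 5760 3240
```

So with a 4K display you'd set a 5760x3240 custom/DSR resolution, enable DLSS Quality, and the internal render would land back at 3840x2160.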
DLSS sharpening has been toned down a lot; it's not going crazy anymore. I've seen a sharpening slider for DLSS in newer games. Nvidia also has deep learning AA, DLAA, which looks great but comes at a cost, and only a few games support it. If you have the compute budget, you can always supersample to above native res, then use DLSS to upscale back to your native res.
Here's an article with some comparison shots between DLSS, DLAA and TAA:
Nvidia DLAA: How it works, supported games and performance vs DLSS
We've tested Nvidia DLAA to see how it compares to DLSS and traditional TAA. Find out everything there is to know about the AI-powered anti-aliasing tech here.
www.rockpapershotgun.com