Thread: Blatantly biased FSR vs DLSS comparison by Techradar: Paid, biased or incompetent?

Is Techradar:

  • Paid - Votes: 7 (58.3%)
  • Biased - Votes: 1 (8.3%)
  • Incompetent - Votes: 4 (33.3%)
  • Total voters: 12
It's better. That it's "much much" better is highly debatable. And in the case of performance mode, being better doesn't mean it's good.

I just like a correct image. It is probably due to the fact that I spent over 20yrs in VFX
 
  • Like
Reactions: Phil_t98
It's better. That it's "much much" better is highly debatable. And in the case of performance mode, being better doesn't mean it's good.

Looking at the insane artefacting all over the screen and overall bad image quality in FSR Performance, I'd say DLSS Performance mode is much, much better than FSR Performance mode.

Go and boot up Jedi Survivor and play it on PC, or use the performance mode on consoles. It's very bad.

DLSS is very clearly way better in that regard. Not saying I like the DLSS performance mode, it's not good enough for me. Not clear enough and not enough details.
 
  • 100%
Reactions: VlaudTheImpaler
- You usually move in games, so how the solutions deal with motion has a huge impact on your image quality. And generally DLSS copes way better with that.

- Balanced and performance modes are important because they offer the biggest frame rate gains. I often use DLSS balanced in demanding games, and it looks nearly as good as Quality mode.
No lies detected. I only said what I said, because the screenshot offered only Quality and Performance mode. Balanced is a viable setting too, especially for DLSS. But there was no available comparison in that screenshot.

My point was that FSR generally has some advantages, but those are simply ignored. Everyone was so busy looking at the smudged character that nobody paid attention to the very obvious and clearly superior background with FSR, which is something that's there constantly.

You're only holding onto this one mode (FSR Quality), at this one resolution (4k), knowing that most don't even have 4k displays and can't / don't target 4k, knowing that all other modes and resolutions clearly go to DLSS. And in that one mode, at that one resolution, using a game from two years ago, you also exclude movement from your evaluation.
Again, I simply picked this one screenshot, and made the obvious DLSS bias clear, especially with statements like "way way better image quality". It's arguably better overall, but let's keep things down to earth here.

I also want to mention that 4K is obviously the most valuable resolution for upscaling tech. 1440p is arguably more usable for DLSS than it is for FSR. But once we're getting into usability territory, this is the equivalent of claiming that a graphics card is 100% faster because it produces 20 fps instead of 10 fps, and therefore arguing for that card over the other. Who cares? Nobody plays at those framerates.
And that is typical of blind nVidia followers: irrelevant advantages get propped up as much more relevant and impressive than they really are. And that is unsurprisingly happening with DLSS vs FSR too.

I remember the days when every nVidia fanboy was trashing consoles for using upscaling. Now it's their favorite thing.

In any case, I'm actually gonna take a look at the article now.

I just like a correct image. It is probably due to the fact that I spent over 20yrs in VFX
Convenient that the "correct" image ignores the blurry background.
 
  • Like
Reactions: Kuranghi
After reading the article, I don't see what the problem is...

DLSS vs FSR performance went to DLSS, as it should. And this is primarily the part you're complaining about... Why? DLSS won. What are you complaining about?

Compatibility went to FSR, for obvious reasons. If you want to argue that DLSS requires the Tensor cores on the cards, fine. But nobody complains about the RTX 3000 series being capable of DLSS 3 while nVidia deliberately locks its own users out. FSR works on everything. It should get points for that.

Game support was given a tie. This is debatable, but overall, it was a tie, and this was the conclusion:

DLSS is the more widespread option, with better performance and more games supported, but it necessitates an Nvidia GPU and isn't an option for anyone gaming on older hardware.

Conversely, AMD was late to the upscaling party with FSR, but has been putting in the work to improve its competing software. I'm hopeful that FSR 3.0 will ensure that Team Red remains competitive against Nvidia in this particular battleground, but until it arrives that's anyone's guess.

As for which one you should use - well, it depends on what GPU you've got and what games you want to play! If you're rocking a brand new RTX 4060, you should obviously be taking advantage of DLSS 3 in all the games supporting it, but if you're using an older GPU or any of AMD's best graphics cards, then FSR is probably the way to go.


I detect no lies. Tell me again what's so bad about the article...? For anyone who keeps up with graphics cards, it's pretty much a useless article, but that doesn't mean it's bad. The fact that this article needed a thread trying to bash it says everything about how skewed the PC gaming market is towards nVidia. Saying anything other than "nVidia always best" triggers some people.

Still, I'm voting for incompetent, because the article basically adds no real value for the reader. And when you're biased yourself, neutral articles look biased to you.
 
I also want to mention that 4K is obviously the most valuable resolution for upscaling tech. 1440p is arguably more usable for DLSS than it is for FSR. But once we're getting into usability territory, this is the equivalent of claiming that a graphics card is 100% faster because it produces 20 fps instead of 10 fps, and therefore arguing for that card over the other. Who cares? Nobody plays at those framerates.
And that is typical of blind nVidia followers: irrelevant advantages get propped up as much more relevant and impressive than they really are. And that is unsurprisingly happening with DLSS vs FSR too.

I think upscaling tech has become very relevant and helps a lot of players enjoy demanding games (RT games of course, but also all the other ones, like A Plague Tale: Requiem) at playable frame rates on their GPUs. A 3060, for example, still offers very good fps with DLSS, and it helps GPUs "punch above their weight".

I had games where it helped a lot and was crucial for getting the performance I wanted. It often boosted things from around 40fps to close to 60fps, which made a big difference for me. Console games all need it, and its significance will only increase, at least that's my prediction.
 
After reading the article, I don't see what the problem is...

DLSS vs FSR performance went to DLSS, as it should. And this is primarily the part you're complaining about... Why? DLSS won. What are you complaining about?

Compatibility went to FSR, for obvious reasons. If you want to argue that DLSS requires the Tensor cores on the cards, fine. But nobody complains about the RTX 3000 series being capable of DLSS 3 while nVidia deliberately locks its own users out. FSR works on everything. It should get points for that.

Game support was given a tie. This is debatable, but overall, it was a tie, and this was the conclusion:

DLSS is the more widespread option, with better performance and more games supported, but it necessitates an Nvidia GPU and isn't an option for anyone gaming on older hardware.

Conversely, AMD was late to the upscaling party with FSR, but has been putting in the work to improve its competing software. I'm hopeful that FSR 3.0 will ensure that Team Red remains competitive against Nvidia in this particular battleground, but until it arrives that's anyone's guess.

As for which one you should use - well, it depends on what GPU you've got and what games you want to play! If you're rocking a brand new RTX 4060, you should obviously be taking advantage of DLSS 3 in all the games supporting it, but if you're using an older GPU or any of AMD's best graphics cards, then FSR is probably the way to go.


I detect no lies. Tell me again what's so bad about the article...? For anyone who keeps up with graphics cards, it's pretty much a useless article, but that doesn't mean it's bad. The fact that this article needed a thread trying to bash it says everything about how skewed the PC gaming market is towards nVidia. Saying anything other than "nVidia always best" triggers some people.

Still, I'm voting for incompetent, because the article basically adds no real value for the reader. And when you're biased yourself, neutral articles look biased to you.

Wait... I complained about them only taking performance increase into account, completely ignoring the actual quality of the results and the pictures. No word on anything regarding image quality. These are image reconstruction techniques, and the author doesn't say a word about the actual reconstructed images. FSR is only competitive at one resolution and in one mode, and it gets soundly beaten by DLSS in image quality everywhere else.

Now you come back and say.... "But they gave performance to DLSS, where's your issue?".

It's hard to take you seriously anymore tbh.
 
Wait... I complained about them only taking performance increase into account, completely ignoring the actual quality of the results and the pictures. No word on anything regarding image quality.

Now you come back and say.... "But they gave performance to DLSS, where's your issue?".

It's hard to take you seriously anymore tbh.

Image quality can be seen as part of performance. After all, all upscaling is a tradeoff between image quality and framerate.
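To put rough numbers on that tradeoff, here's a quick sketch using the commonly reported per-axis scale factors (an assumption on my part, games can and do deviate), showing how many pixels actually get shaded per mode:

```python
# Rough illustration of the quality/framerate tradeoff: fewer shaded pixels buy
# frame time, and the reconstruction has to make up the difference.
# The per-axis scale factors are assumed typical defaults, not guaranteed values.
MODES = {"Native": 1.0, "Quality": 0.667, "Balanced": 0.58, "Performance": 0.50}

def shaded_pixels(width, height, scale):
    """Pixels actually rendered before upscaling to the output resolution."""
    return int(width * scale) * int(height * scale)

for out_w, out_h in [(3840, 2160), (2560, 1440)]:
    for mode, scale in MODES.items():
        px = shaded_pixels(out_w, out_h, scale)
        share = px / (out_w * out_h)
        print(f"{out_w}x{out_h} {mode:12s} {px / 1e6:4.1f} MPix ({share:.0%} of native)")
```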
 
  • Really?
Reactions: regawdless
Image quality can be seen as part of performance. After all, all upscaling is a tradeoff between image quality and framerate.

You can't be serious. They only talk about the performance increase. The topic is "performance", there's not a single word about image quality, and they spend multiple paragraphs on that part of the analysis without any mention of the actual image.

You have a whole long-ass article without one word about image quality differences, when performance and quality differences should be the main part of such an article, especially on a tech site. A tech site completely ignoring the tech side... of course, no issue to be seen.

It's pretty impressive how your strong bias turns off any rational thinking. That's too much for me though. Disappointing. I won't do that dance with you anymore. Have fun.

shaking-head-disappointed.gif
 
My only reply to the above.

I don't care about corporations and have no horse in this race. I'm rather critical of Nvidia and said many times that this gen is a total shit show from them. I would love to see AMD kicking their asses. But you simply ignore everything that's being said and go back to some delusional assumptions.

This whole conversation with you feels very disingenuous, unfortunately. Either way, I learned my lesson.... Again, I think. We did this dance on GAF already, no point in continuing it here.
 
I'm rather critical of Nvidia

giphy.gif


But you simply ignore everything that's being said and go back to some delusional assumptions.
Honest question:

Keeping in mind that nVidia is already borderline a monopoly, is overcharging its customers and artificially segregating features on card generations, why is it so important for everyone to know that DLSS has better image quality than FSR? What can possibly be gained by claiming that this insignificant article is paid off by or biased towards AMD?
 

Curious. I mean, I obviously am very enthusiastic about tech and graphics in general, and therefore appreciate how they push RT and how well they did with image reconstruction. Of course I like the shiniest, best-working stuff. At the same time, I said that they fucked up this gen, that they're fucktards for their pricing in general, that everything outside of the 4090 is a joke gen-to-gen, that the 4090 price is a joke, that there's no price-per-performance increase this gen, that their 4070 cards are embarrassing, that their performance graphs using DLSS3 were disingenuous and manipulative, and that they try to sell this gen's cards through software and marketing, etc.

But sure. I guess I assisted them in killing your dog back then.
 
  • Like
Reactions: The_Mike
For reference, I'll add this comparison by Hardware Unboxed. It's the most extensive FSR vs DLSS analysis done so far, and Digital Foundry approved of and referenced it, for whatever that's worth to you.


nvidia-dlss-2-vs-amd-fsr-2-image-quality-performance-comparison-hardwareunboxed-_1-png.290853

nvidia-dlss-2-vs-amd-fsr-2-image-quality-performance-comparison-hardwareunboxed-_2-png.290854


I'll add it to the first post as well.
 
  • Like
Reactions: MildWolverine
Convenient that the "correct" image ignores the blurry background.
The background, while blurred, is still the one of the two I would take. I don't like artifacts in the image at all, and having the main character show the artifacts is a no-no.

Also, I've not seen this blurriness to that degree in Quality mode in any of the games that support DLSS. Can you point me to a game that has this so I can test?
 
The background, while blurred, is still the one of the two I would take. I don't like artifacts in the image at all, and having the main character show the artifacts is a no-no.
That is fair.

Also, I've not seen this blurriness to that degree in Quality mode in any of the games that support DLSS. Can you point me to a game that has this so I can test?

It's probably an exception for the difference to be this big. Generally it's more subtle, and everything depends on the implementation per game. See here, where the flickering on the line in the top left at the start of the video is worse with DLSS than FSR, even though DLSS is generally better at handling this kind of flickering.



And DLSS does have its bugs... Timestamped: DLSS bugging out, giving a similar effect to the FSR screenshot posted earlier:



It only happened once apparently, while the FSR one was repeatable. But I guess we can all find outliers.
 
Is CP2077 fully path traced like Quake 2 RTX or Portal RTX?

Their recently added "Overdrive" mode introduces full path tracing. But it's a "Preview", so it still has some flaws because to make it run in a demanding game like that, the devs had to do a lot of work and it's still a work in progress.

Works well though and looks incredible at times.
 
  • Brain
Reactions: XOR and zebraStraw
Is CP2077 fully path traced like Quake 2 RTX or Portal RTX?

Yes, CP2077's Overdrive mode, aimed at high-end PC GPUs, uses full path tracing in the game. It absolutely needs frame gen on a 4090 (the recommended GPU) in order to maintain 4K/60FPS, but it's highly scalable and works even on lower-end GPUs (i.e. the 3x00 series).
 
Their recently added "Overdrive" mode introduces full path tracing. But it's a "Preview", so it still has some flaws
What flaws are you seeing with this mode? It's probably the most optimized game to date. And rightly so as Nvidia helped a lot with making that a reality today.

because to make it run in a demanding game like that, the devs had to do a lot of work and it's still a work in progress.
It's pretty feature complete now. I didn't know there was any work in progress as the game doesn't have any visual bugs and the FPS are very consistent from frame-to-frame.

Works well though and looks incredible at times.
"at times"?
 
What flaws are you seeing with this mode? It's probably the most optimized game to date. And rightly so as Nvidia helped a lot with making that a reality today.


It's pretty feature complete now. I didn't know there was any work in progress as the game doesn't have any visual bugs and the FPS are very consistent from frame-to-frame.


"at times"?

I played the Overdrive mode a lot and encountered a number of issues. Some examples:

Sometimes there's a lighting bug when switching from the menu back to the game; I also had rare random light flashes in game:


In general, the lighting response is delayed. Look at the shadows behind and under the car here, behind the front wheel: they're delayed. It's not a hardware-related issue, it happens on 4090 GPUs as well. Go to a dark room, stand in a corner and shoot a gun: the muzzle flash appears instantly, but the environment lights up with a delay.
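For what it's worth, my assumption (not something the devs have confirmed) is that the lag comes from temporal accumulation in the denoiser: the ray-traced lighting gets blended over many frames, so a sudden change like a muzzle flash takes a while to settle. A toy version of that blending:

```python
# Toy sketch of temporal accumulation (a simple exponential moving average).
# This is an assumed explanation for the delayed lighting response, not a
# description of CDPR's actual denoiser. A light that switches on instantly
# still takes several frames to reach full brightness in the accumulated result.
def accumulate(history, sample, alpha=0.1):
    """Blend the current frame's noisy sample into the temporal history."""
    return history + alpha * (sample - history)

accumulated = 0.0
for frame in range(12):
    target = 1.0 if frame >= 2 else 0.0  # e.g. a muzzle flash firing at frame 2
    accumulated = accumulate(accumulated, target)
    print(f"frame {frame:2d}: accumulated lighting = {accumulated:.2f}")
```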



Also, reflections on fine elements like fences create artefacts at times. Unfortunately these are low res clips, but it's visible here nonetheless.


giphy.gif


I think I read somewhere that they keep improving it, but can't confirm right now.

Regarding "at times".

You know I'm a big fan of ray tracing and path tracing, and I think the Overdrive mode looks incredible. But the difference even from vanilla isn't always huge, because the devs did a good job with the standard lighting. I posted comparisons where it was night and day, with OD mode looking two gens ahead. But that's not always the case; especially at daytime, it's often not significant during normal play.

OD vs Vanilla. Of course there are differences, like the shadows in the night shot or the building in the background in the second one. What I'm saying is that the impact of OD varies; it's mostly visible in indirectly lit scenes, but out in the open it's not dramatic.



 
I played the Overdrive mode a lot and encountered a number of issues. Some examples:

Sometimes there's a lighting bug when switching from the menu back to the game; I also had rare random light flashes in game:


In general, the lighting response is delayed. Look at the shadows behind and under the car here, behind the front wheel: they're delayed. It's not a hardware-related issue, it happens on 4090 GPUs as well. Go to a dark room, stand in a corner and shoot a gun: the muzzle flash appears instantly, but the environment lights up with a delay.



Also, reflections on fine elements like fences create artefacts at times. Unfortunately these are low res clips, but it's visible here nonetheless.


giphy.gif


I think I read somewhere that they keep improving it, but can't confirm right now.

Regarding "at times".

You know I'm a big fan of ray tracing and path tracing, and I think the Overdrive mode looks incredible. But the difference even from vanilla isn't always huge, because the devs did a good job with the standard lighting. I posted comparisons where it was night and day, with OD mode looking two gens ahead. But that's not always the case; especially at daytime, it's often not significant during normal play.

OD vs Vanilla. Of course there are differences, like the shadows in the night shot or the building in the background in the second one. What I'm saying is that the impact of OD varies; it's mostly visible in indirectly lit scenes, but out in the open it's not dramatic.





Wow. Thanks for the info man!

I have never seen the light bloom blow out like that.. like ever.

The delay in showing the results of the ray tracing will be there for a long time. Every game that uses RT ambient occlusion will have it. The cards just aren't fast enough to render it quickly enough.

I don't agree about the differences between PT and regular RT being small. For me and what I look for, it's blatantly obvious. The skin on characters is the biggest giveaway: one approach uses GI light probes and therefore gives flat shading, vs. actually computing the secondary bounces and getting shadows with a proper light direction on the skin. This alone destroys every other game on the market; that artifact is so jarring to me that it literally makes me question whether to keep playing a game that doesn't have the correct lighting solution on the skin.
 
  • Cheers
Reactions: regawdless
Wow. Thanks for the info man!

I have never seen the light bloom blow out like that.. like ever.

The delay in showing the results of the ray tracing will be there for a long time. Every game that uses RT ambient occlusion will have it. The cards just aren't fast enough to render it quickly enough.

I don't agree about the differences between PT and regular RT being small. For me and what I look for, it's blatantly obvious. The skin on characters is the biggest giveaway: one approach uses GI light probes and therefore gives flat shading, vs. actually computing the secondary bounces and getting shadows with a proper light direction on the skin. This alone destroys every other game on the market; that artifact is so jarring to me that it literally makes me question whether to keep playing a game that doesn't have the correct lighting solution on the skin.

For me, one of the most impressive advantages of PT is that it all looks natural and cohesive. To me, the daytime shot above doesn't look significantly different between vanilla and PT. But running around, seeing indirectly lit areas, dynamic lights, etc., it all just feels right. Raster lighting, meanwhile, keeps having these glowy, unnatural areas and looks very uneven, with all the known issues (light bleed, no bounce light, SSR fuckery, objects sticking out, floating-looking objects).
 
  • Like
Reactions: VFX_Veteran
Overdrive looks nice, but I'm not sure how much nicer. Like, if it's a 20fps or greater loss... that's a lot to ask.
 
Overdrive looks nice, but I'm not sure how much nicer. Like, if it's a 20fps or greater loss... that's a lot to ask.
One of the main draws of path tracing is the implementation of area lights. If you pay attention to lighting in games, the lights are almost all one of three varieties: 1) point, 2) directional (sun), and 3) spot. This is generally good enough in practice, but light can take on many more shapes. In CP OD they show this feature off in spades, with all kinds of lights of varying shapes, and you can actually see the materials take on the luminance of those shapes. Specular highlights aren't just round anymore; they take the shape of the light that's bouncing off the surface. That is a HUGE difference.
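To make that concrete, here's a minimal sketch (my own toy setup: a Lambertian floor, a downward-facing rectangular emitter, no visibility test) of estimating the direct light arriving at a point. A point light is a single delta sample, while an area light is integrated over the emitter's surface, which is exactly why highlights and shadows pick up its shape and softness:

```python
import math
import random

def point_light_direct(p, n, light_pos, intensity):
    """Direct irradiance from an idealised point light: one delta-function sample."""
    d = [a - b for a, b in zip(light_pos, p)]
    dist2 = sum(c * c for c in d)
    w = [c / math.sqrt(dist2) for c in d]
    cos_surf = max(0.0, sum(a * b for a, b in zip(n, w)))
    return intensity * cos_surf / dist2

def area_light_direct(p, n, corner, edge_u, edge_v, radiance, samples=256):
    """Monte Carlo estimate over a rectangular emitter facing straight down (toy scene)."""
    area = math.dist((0, 0, 0), edge_u) * math.dist((0, 0, 0), edge_v)
    total = 0.0
    for _ in range(samples):
        u, v = random.random(), random.random()
        q = [c + u * eu + v * ev for c, eu, ev in zip(corner, edge_u, edge_v)]
        d = [a - b for a, b in zip(q, p)]
        dist2 = sum(c * c for c in d)
        w = [c / math.sqrt(dist2) for c in d]
        cos_surf = max(0.0, sum(a * b for a, b in zip(n, w)))
        cos_light = max(0.0, w[1])  # emitter normal is (0, -1, 0) in this toy scene
        total += radiance * cos_surf * cos_light / dist2
    return total * area / samples  # pdf of uniform area sampling is 1/area

# Shading point on the floor, normal pointing up, a 1x1 emitter two units overhead.
p, n = [0.0, 0.0, 0.0], [0.0, 1.0, 0.0]
print(point_light_direct(p, n, [0.0, 2.0, 0.0], 10.0))
print(area_light_direct(p, n, [-0.5, 2.0, -0.5], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0], 10.0))
```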
 
Very good points NXGamer made. And I agree that Nvidia is somewhat responsible for this due to DLSS being proprietary. However, there is literally nothing wrong with making proprietary and custom tech.

My biggest fear is that Starfield won't be optimized enough and we will be looking at stuttering frames even on 4090s.
 
I believe the reason they were so lazy with the 4060 Ti is that they rely too much on AI/DLSS tech.

I have to replace my 2070s at the worst time ever, but I cannot afford a 4070. The current 4060 Ti is a joke with 8 GB of RAM, but a 4060 Ti with 16 GB should be released soon, or at least properly revealed.

Do you guys think it will be good value for my money?

A current 4060 Ti costs 500 in my country. A 4060 Ti 16 GB would probably cost 650 USD with taxes included.
Meanwhile, a 4070 costs about 760 USD.

I just feel a 4070 is too expensive, way too expensive, and was wondering if the Ti series is the new "go to" for budget gamers who can afford a slightly more expensive card.
 
I believe the reason they were so lazy with the 4060 Ti is that they rely too much on AI/DLSS tech.

I have to replace my 2070s at the worst time ever, but I cannot afford a 4070. The current 4060 Ti is a joke with 8 GB of RAM, but a 4060 Ti with 16 GB should be released soon, or at least properly revealed.

Do you guys think it will be good value for my money?

A current 4060 Ti costs 500 in my country. A 4060 Ti 16 GB would probably cost 650 USD with taxes included.
Meanwhile, a 4070 costs about 760 USD.

I just feel a 4070 is too expensive, way too expensive, and was wondering if the Ti series is the new "go to" for budget gamers who can afford a slightly more expensive card.

I can't give you any advice tbh because I hate this GPU gen and will skip it.
 
  • Like
Reactions: The_Mike
I believe the reason they were so lazy with the 4060 Ti is that they rely too much on AI/DLSS tech.

I have to replace my 2070s at the worst time ever, but I cannot afford a 4070. The current 4060 Ti is a joke with 8 GB of RAM, but a 4060 Ti with 16 GB should be released soon, or at least properly revealed.

Do you guys think it will be good value for my money?

A current 4060 Ti costs 500 in my country. A 4060 Ti 16 GB would probably cost 650 USD with taxes included.
Meanwhile, a 4070 costs about 760 USD.

I just feel a 4070 is too expensive, way too expensive, and was wondering if the Ti series is the new "go to" for budget gamers who can afford a slightly more expensive card.

What type of settings and fidelity are you going for?
 
Guys, the 5x00 series of cards is rumored to come in early 2025. So it's another long year of gaming on the 4x00 series boards.

To me this is good because I get the most time out of the money I spent playing all games at the best possible resolution, graphics features and framerates (assuming the devs don't botch a port).
 
Guys, the 5x00 series of cards is rumored to come in early 2025. So it's another long year of gaming on the 4x00 series boards.

To me this is good because I get the most time out of the money I spent playing all games at the best possible resolution, graphics features and framerates (assuming the devs don't botch a port).

Damn, misread that as 2024 at first. 2025 is late.
 
  • 100%
Reactions: Ascend
Damn, misread that as 2024 at first. 2025 is late.

Exactly. Suffering an extra year of atrocious prices isn't exactly something to look forward to. And considering nVidia ditched gamers for miners and is now doing the same for AI, I don't count on things getting better. But at least a new gen has the potential for improvement.
 
I believe the reason they were so lazy with the 4060 Ti is that they rely too much on AI/DLSS tech.

I have to replace my 2070s at the worst time ever, but I cannot afford a 4070. The current 4060 Ti is a joke with 8 GB of RAM, but a 4060 Ti with 16 GB should be released soon, or at least properly revealed.

Do you guys think it will be good value for my money?

A current 4060 Ti costs 500 in my country. A 4060 Ti 16 GB would probably cost 650 USD with taxes included.
Meanwhile, a 4070 costs about 760 USD.

I just feel a 4070 is too expensive, way too expensive, and was wondering if the Ti series is the new "go to" for budget gamers who can afford a slightly more expensive card.
I want at least 16GB too, but I heard the 4070 is like 30% faster than the 4060 Ti. If that's true, I can't justify buying a Ti when a 12GB 4070 offers that much of a performance jump for just ~$100 more.
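Quick back-of-the-envelope with the numbers quoted in this thread (both the prices and the ~30% gap are rough assumptions, not benchmarks):

```python
# Rough value comparison using the figures quoted above; treat both the prices
# and the ~30% performance gap as assumptions rather than measured data.
cards = {
    "4060 Ti 16GB": {"price_usd": 650, "relative_perf": 1.00},
    "4070": {"price_usd": 760, "relative_perf": 1.30},
}
for name, card in cards.items():
    value = card["relative_perf"] / card["price_usd"] * 1000
    print(f"{name:13s} relative perf per $1000: {value:.2f}")
```

Under those assumptions the 4070 actually comes out ahead on perf per dollar, which is why I lean that way despite wanting the bigger VRAM pool.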

I'm thinking of waiting for the 5000 series, as hopefully those will have way more RAM if we're lucky, which will be good for next-gen AI models. I also want 1440p path tracing at 60+ fps.

Most of the games I play are low-budget indies, which run fine on my current card. Though I do want to play the future FFXVI PC port on high settings.
 
Leaving this here. First 14 minutes.



Interesting, thanks for posting. Can someone actually confirm that devs have to hand over their code to Nvidia? I know this was the case in the beginning, but I believe it changed, meaning his information might not be accurate. In UE4/UE5 and Unity there are plug-ins, and you only need 11 clicks to have it up and running without any of that stuff; you don't need to hand anything over to Nvidia. Modders implement DLSS with ease, just by dropping the DLSS files in and adjusting some settings.

Doesn't seem like it's an actually valid concern. At the very least, all Unreal Engine and Unity games could easily ship with DLSS.

I checked on Nvidia's site; you can download the DLSS SDK, and I didn't find anything in the license agreement regarding his claims.
 
Yes very silly.

When I get my new GPU (40-series), I'm interested in using DLSS effectively as AA at my native res. I've seen games that support that natively (you choose the same value for output and input res, in this case 2160p), but it seems like only a few. I take it I can otherwise do it in a hacky way, by choosing an output res whose input res works out to 2160p, based on whatever percentage Quality DLSS renders at when you choose 2160p.

Though I'd want to disable most if not all of the added sharpening if I'm doing that. Is that always possible?
 
Yes very silly.

When I get my new GPU (40-series), I'm interested in using DLSS effectively as AA at my native res. I've seen games that support that natively (you choose the same value for output and input res, in this case 2160p), but it seems like only a few. I take it I can otherwise do it in a hacky way, by choosing an output res whose input res works out to 2160p, based on whatever percentage Quality DLSS renders at when you choose 2160p.

Though I'd want to disable most if not all of the added sharpening if I'm doing that. Is that always possible?

DLSS sharpening has been toned down a lot; it's not going crazy anymore. I've seen a sharpening scaler for DLSS in newer games. Nvidia also has deep learning AA, DLAA, which looks great but comes at a cost, and only a few games support it. If you have the compute budget, you can always supersample to above native res and then use DLSS, so it's effectively your native res that gets upscaled.
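For the resolution math from your earlier question, a quick sketch (the per-axis scale factors are the commonly cited defaults and an assumption on my part; individual games can use different ratios):

```python
# Sketch of the "hacky" route: pick an output resolution whose DLSS input
# resolution lands back at your native 2160p. Scale factors are assumed defaults.
DLSS_SCALE = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.50, "Ultra Performance": 0.333}

def internal_height(output_h, mode):
    """Internal render height for a given output height and DLSS mode."""
    return round(output_h * DLSS_SCALE[mode])

def output_height_for_native_input(native_h, mode):
    """Output height to request (e.g. via DSR) so the internal res lands at native."""
    return round(native_h / DLSS_SCALE[mode])

print(internal_height(2160, "Quality"))                 # ~1440: 4K Quality renders at about 1440p
print(output_height_for_native_input(2160, "Quality"))  # ~3238: request roughly 3240p output for a 2160p input
```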

Here's an article with some comparison shots between DLSS, DLAA and TAA:
 
  • Like
Reactions: Kuranghi
DLSS sharpening has been toned down a lot; it's not going crazy anymore. I've seen a sharpening scaler for DLSS in newer games. Nvidia also has deep learning AA, DLAA, which looks great but comes at a cost, and only a few games support it. If you have the compute budget, you can always supersample to above native res and then use DLSS, so it's effectively your native res that gets upscaled.

Here's an article with some comparison shots between DLSS, DLAA and TAA:

Good to hear about the sharpening being toned down; it was too much for me.

Mate, I absolutely love supersampling. I played Dead Space 1 at 4K with 4xSGSSAA + TrSSAA and it was so fucking clean. I didn't manage to take any proper screenshots... well, I took plenty, but Steam was set to trash-quality screenshots, so they're not worth looking at compared to what I saw, unfortunately 😥 lesson learned.

I did take some decent ones of Dead Space 2 DSR'd from 8K to 4K. I believe I could've gotten better overall IQ from GeDoSaTo, but I couldn't get it working on DS2 for some reason. SGSSAA from Nvidia Profile Inspector doesn't really work with DS2; you either get weird white outlines on things, broken shadows or blurry IQ: pick your poison. So I went with DSR instead. DS2 has fantastic contrast/gamma/presentation, so it already looked so nice at 4K, but 8K to 4K was "stop and stare at the screen for 30 seconds at a time" levels of amazing. Aided greatly by my Sony ZD9 of course :p

Stills can't truly capture how good it looked. For example, the burning train station was so brilliant looking; the detail in the displays and the way the little burning fires looked was just... awesome.


I'm more excited to get a new GPU so I can do that with old games like Just Cause 3 or Batman: Arkham Knight than to actually play modern games at 60fps lol
 
  • Star
Reactions: regawdless