Just loaded it up this morning to take a sneak peek before work, and here are some thoughts:
1) Lighting looks amazing (of course) -- BUT -- their artists aren't very good. I would tone down the light scattering propagation with a normalizing factor. I'm not sure if this comes from the actual scattering or if their tone map shader is tuned too high. It could be a small change behind a console variable, somewhat like UE does it. They might address this in a patch.
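To illustrate the kind of knob I mean, here's a toy sketch in Python (not their shader; the cvar name and the normalization itself are made up):

```python
# Toy sketch: tone down a scattering contribution with a normalizing
# factor exposed behind a UE-style console variable. All names made up.
cvars = {"r.Scattering.NormalizeStrength": 0.5}  # 0 = off, 1 = full

def tone_down_scattering(scatter_rgb, cvars):
    """Blend raw scattering toward an energy-normalized version."""
    k = cvars["r.Scattering.NormalizeStrength"]
    total = sum(scatter_rgb)
    if total <= 1.0 or k == 0.0:
        return scatter_rgb  # nothing to tone down
    # Normalized version keeps the hue but caps total energy at 1.0.
    normalized = [c / total for c in scatter_rgb]
    return [raw * (1 - k) + norm * k
            for raw, norm in zip(scatter_rgb, normalized)]

# Blown-out scattering gets pulled back toward unit energy.
print(tone_down_scattering([1.0, 0.8, 0.6], cvars))
```

The point is just that one scalar, exposed at runtime, could let users dial the effect down without waiting for a patch.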
2) Path tracing WITHOUT DLSS 3 hovers in the high 20s/low 30s on an RTX 4090. DLSS 3 more than doubles the frame rate!! Wow!!
Basically, real-time games right now will absolutely REQUIRE some sort of frame generation to get these kinds of visuals; otherwise we are probably two generations away from path tracing at native 4K/60FPS (i.e. 6x00 series cards). A PS6 with no frame generation will absolutely be dead in the water on release.
My issue with all this prediction/reconstruction: I can put up with upscaling artifacts now that the tech has matured a bit, but this motion prediction (I'm aware it's much more complex than the usual real-time MEMC found in TVs) needs to be better before I get on board with it. I spend all day looking at TVs with motion smoothing turned up too high and it's completely horrible. What I've seen of DLSS 3 in Spider-Man, for example, isn't ready for me personally; the artifacts are too visible without me even looking for them. I hope later versions of FG can be hooked into more aspects of the engine so it can deal with the current problem areas.
Recently in Tchia, the amount of trailing artifacts from whatever they're using to clean up the image (or maybe how that interacts with their lighting/reflection methods) is pretty distracting, but I still prefer it to the uneven motion artifacts caused by FG.
Input lag reduction is also a bigger reason for me to play at >30fps than motion clarity; it's a very close second, of course, but I can get used to a fixed framerate as long as it's 30 or above. Horizon 2, for example, I would never play in performance mode because the IQ hit is too much for me; it robs the game of what makes it look so fantastic as an overall package, and the fact that input lag is still quite low in the 30fps mode just cements my choice. I'm presuming there won't be a way for these fake frames to actually reduce lag, given how the technique works, so for me it's just a bandaid like VRR is. (VRR's near-black flashing is a no-go for me: it makes my display's ability to show effectively true blacks pointless and ruins IQ.) I wouldn't bother using FG to go from 80-90 real frames up to 120fps if it introduces the artifacts it does; I'd prefer to use it to hit 60fps in the rare cases where I can't lock to it, like RE4R's resolution mode on PS5. In the case of a game running at 80-90, I'd still rather lock it to 60fps and use the extra frames as overhead.
That way it's rarely active, and only during extreme cases of stress on the GPU. I'm not sure it works that way though: can you set a cap and have FG stay idle while the cap is met?
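To spell out the behavior I'm hoping for, here's a toy policy sketch (purely hypothetical; this is not how DLSS 3 is actually exposed or implemented):

```python
def frames_to_present(real_fps, cap_fps):
    """Toy cap-aware frame generation policy (hypothetical, not DLSS 3).

    Returns (presented_fps, fg_active): if the GPU already meets the cap,
    present only real frames and leave FG idle; otherwise insert one
    generated frame per real frame, then clamp the result to the cap.
    """
    if real_fps >= cap_fps:
        return cap_fps, False  # cap met: FG stays off
    presented = min(real_fps * 2, cap_fps)  # 1 generated frame per real frame
    return presented, True

# FG stays idle when the GPU holds the cap on its own.
print(frames_to_present(72, 60))  # -> (60, False)
# FG only kicks in during heavy scenes that dip below the cap.
print(frames_to_present(41, 60))  # -> (60, True)
```

Under a policy like this, FG would only ever fire as a safety net under the cap, which is exactly the "rarely active" usage I'd want.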
I don't want to just chase ever-higher framerates; I'd much prefer to focus on world interactivity and image quality and keep it locked to 60fps. Most 60fps modes in console games have disappointing IQ too, and every single 120fps mode, without fail, is far too soft. Everyone raved about Uncharted 4 finally running at 60fps on PS5, and while the input lag and smoothness of animation are obviously fantastic, playing it again at 1440p, like I did on my PS4 Pro in the first place, is not exciting for me. Maybe even worse, the cutscenes look awful at 60fps; they were clearly made with 30fps in mind, so it looks really soap-opera-y and robs the game of, dare I say, its cinematic quality. Everyone looked like horrifically puppeted meat mannequins at 60fps in cutscenes.
This is definitely in no small part down to me using an LCD with a slow-response-time panel, so 24 and 30fps are way more acceptable when combined with a good MB implementation. If I were on a monitor I'd be chasing framerate more, but then I'd lose the insane image quality difference between almost all monitors and my ZD9 TV; the contrast and depth of the image are insane compared to fast-panel monitors, with the exception of those QD-OLED monitors, but then you lose large-area brightness to OLED's ABL limitations. It's not even just fullscreen white brightness; it's crazy how much less impactful moderately bright scenes in games look on an OLED vs. an LCD like mine. Standing in the desert in Horizon 2 looks so amazing that I don't miss the increase in depth my OLED TV brings for the LCD-challenging parts of the image.