That makes no sense. Now we are guessing.
You're guessing that HFW could have included a comparable waterworld. It had water, but only in limited areas, nothing like an extensive waterworld. The second game appears to have one. Pretty obvious improvement.
One has to twist logic into pretzels to assume "this could've been done in the first game, they just chose not to", or lean on the unfalsifiable "oh well, there's a difference in team and talent, that's why it looks better".
I'm trying to stay within the definition of what's considered "next-gen" or an improvement over current tech. It makes sense to compare the same teams, not to pit a studio known for less-than-spectacular graphics like FROM against a company that thrives on cinematic, film-quality graphics like SSM.
I mean, everyone should objectively have the same definition of tech, though. What have we seen as far as tech last generation? PBR shaders. That is what pushed the generation forward. This generation it's RT and Nanite. No one disagrees that the Nanite demo for the PS5 was a step ahead of existing games, because it showed tech that was superior. We saw something that beats normal maps and parallax occlusion mapping: the detail held up the closer you got. At the same time, we got path tracing demos from Nvidia, and everyone said games using it would be a few generations away, until CP2077OD came out. What I'm trying to say is that we have objective examples of new tech shipping in a game and looking a generation ahead.
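To make that point concrete: parallax occlusion mapping is the kind of "fake detail" trick last-gen tooling leaned on. Here's a rough, hypothetical CPU sketch of the idea; real versions run in a pixel shader against an actual height texture, and every name and the toy height map below are made up purely for illustration:

```cpp
#include <cmath>
#include <cstdio>

// Hypothetical height field: procedural bumps in [0, 1].
float sampleHeight(float u, float v) {
    return 0.5f + 0.5f * std::sin(u * 40.0f) * std::cos(v * 40.0f);
}

struct Vec2 { float x, y; };

// March a view ray through the height field in texture space and return
// the shifted UV where the ray first dips below the surface. Only the
// texture lookup moves; the geometry itself never changes, which is why
// silhouettes stay flat.
Vec2 parallaxOcclusionUV(Vec2 uv, float viewX, float viewY, float viewZ,
                         float heightScale = 0.05f, int numSteps = 32) {
    // Total UV offset if the ray travelled the full height range.
    Vec2 maxOffset = { -viewX / viewZ * heightScale,
                       -viewY / viewZ * heightScale };
    float stepDepth = 1.0f / numSteps;
    Vec2 stepUV = { maxOffset.x * stepDepth, maxOffset.y * stepDepth };

    float rayDepth = 0.0f;
    float surfaceDepth = 1.0f - sampleHeight(uv.x, uv.y);

    // Step until the ray goes below the height field.
    while (rayDepth < surfaceDepth && rayDepth < 1.0f) {
        uv.x += stepUV.x;
        uv.y += stepUV.y;
        rayDepth += stepDepth;
        surfaceDepth = 1.0f - sampleHeight(uv.x, uv.y);
    }
    return uv;
}

int main() {
    // A grazing view direction exaggerates the parallax shift.
    Vec2 shifted = parallaxOcclusionUV({0.25f, 0.25f}, 0.7f, 0.0f, 0.3f);
    std::printf("shifted UV: %f %f\n", shifted.x, shifted.y);
}
```

The limitation is visible right in the loop: it's all texture trickery on a flat surface, which is why Nanite-style real micro-geometry reads as a generational step beyond it.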
If we have no Nanite or PT this generation, then there can't be a big leap in visuals, as the artists are using the same tools they used last gen. That's why we see two modes in games. Performance mode sacrifices what you could get in order to hit 60FPS. The 30FPS mode gives higher-res textures, better texture filtering, and some RT at a small scale.
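As a rough illustration of that split, here's a hypothetical preset table; the numbers and field names aren't from any actual game:

```cpp
#include <cstdio>

// Hypothetical two-mode split: performance buys frame time by dropping
// features, quality spends it on textures, filtering and small-scale RT.
struct GraphicsPreset {
    int  targetFPS;
    int  texturePoolMB;        // memory budget for streamed textures
    int  anisotropicFiltering; // 4x vs 16x
    bool smallScaleRT;         // e.g. RT shadows or reflections only
};

int main() {
    const GraphicsPreset performanceMode { 60, 3072,  4, false };
    const GraphicsPreset qualityMode     { 30, 4096, 16, true  };
    std::printf("perf: %d fps, RT=%d | quality: %d fps, RT=%d\n",
                performanceMode.targetFPS, performanceMode.smallScaleRT,
                qualityMode.targetFPS, qualityMode.smallScaleRT);
}
```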
I think an easy example that punches a hole in your argument is patches: a game is released with such-and-such graphics, then a year later it gets a patch and can now run on the same hardware at 60fps with no discernible problems elsewhere.
Will you make the argument that the original version of the game was "suboptimized" but the final patched version is "more optimized", and therefore it doesn't count? Perhaps, but if we can list off many examples of this happening in patches, surely it also happens across the lifespan of the console.
Performance-taxing techniques are swapped out for less intensive ones, and middleware gets updated or replaced with more efficient tools. This also happens in the PC space, where driver updates "improve the hardware" by improving utilization and the coordination between PC parts.
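A toy sketch of what "swapping a taxing technique for a less intensive one" can look like, with all names invented for illustration: the patched path returns the same value as the launch path but stops redoing the work every frame.

```cpp
#include <cstdio>
#include <vector>

// "Launch" version: recompute an expensive term every frame.
// (Stand-in for heavy per-frame work; everything here is hypothetical.)
float expensiveTerm(const std::vector<float>& samples) {
    float sum = 0.0f;
    for (float s : samples) sum += s * s;
    return sum / static_cast<float>(samples.size());
}

// "Patched" version: compute once, cache, and only recompute when the
// inputs actually change. The frame gets the same number for less work.
struct CachedTerm {
    float value = 0.0f;
    bool  dirty = true;
    float get(const std::vector<float>& samples) {
        if (dirty) { value = expensiveTerm(samples); dirty = false; }
        return value;
    }
};

int main() {
    std::vector<float> samples(1024, 0.5f);
    CachedTerm cached;
    // Both paths produce the same result; only the per-frame cost differs.
    std::printf("per-frame: %f, cached: %f\n",
                expensiveTerm(samples), cached.get(samples));
}
```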
There's no logical reason why console games wouldn't get more technically impressive as the console generation goes on, and we see this across console history. We even see it on Nintendo Switch, where an ass port like Ark ran poorly on the system but now, after a patch, is one of the best-looking games on it. The hardware itself didn't get more powerful, so what happened?