Thread: Why do console fans overestimate future console hardware?
PS5 was still a crazy good deal for the performance in late 2020, especially considering it was basically impossible to get GPUs.

I don't want to imagine how much insanity went on behind the scenes with all the availability, pricing, supply chain issues.

Both MS and Sony did exceptionally well in my opinion.
 
  • Like
Reactions: Zefah and Nobel 6
Despite the fact that from a developer perspective working to a unified spec versus a broad one is beneficial, since they can optimise for the hardware every step of the way, the tech will age out in short order given how quickly graphics tech is improving. The haven't-had-my-Sunday-morning-coffee conspiratorial side of my brain thinks that while Sony wants those PC double dippers, as it's a significant revenue stream after the fact, they're also quite happy to release those ports sub-optimally at first so that the PS4/5 still looks like a good investment to the Sony faithful. If their games all came out universally running like a hot knife through butter even on a medium-spec PC, people would be losing their shit regarding the Sony brand strength. Sure, they'll patch up the PC release and things will work better (although in H:ZD the Cauldron levels still run like utter dogshit to this day unless you turn down the effects), but it always seems to take them a while. It's a dumb theory, but we've had enough instances of it that I'm very much in the 'wait for patches/maybe a sale' mindset with any Sony PC release versus day 1. And while post-release patches are very much the norm these days for most games, Sony's response time seems to be 'sure, we'll get around to that soon'. Maybe they've gotten better though.
 
I think gamers talk too much about hardware when there's far more interesting stuff to nitpick, compare, and speculate about in the games themselves.
/thread

It's exactly as you said Don. Like, do I want to listen to a biker complain about his adventures on the highway with his boys and them tearing up the towns they visited, or do I want to hear him complain about every piece that makes up his hog that isn't shined up or working to perfection? The answer is neither; I'd rather fuck his wife while he's hangin' with the homos
 
I think the "punches above its weight" truism is pretty true for consoles, and always has been. A $500 PS5 is playing better-looking games than any potato computer you could build from scratch for the same cost. Some PC gamers try to take this away from their console brethren, for some reason.

Unfortunately, instead of gratitude and "knowing one's place", in a manner of speaking, some console fans take this logic a bit too far and conclude that "punches above its weight" means "outperforms hardware more than three times as expensive". This is the territory of delusion, like a Honda fan insisting his souped-up 4-banger could outperform an 8-cylinder supercar.

I think gamers talk too much about hardware when there's far more interesting stuff to nitpick, compare, and speculate about in the games themselves.

Plus it's literally been console gospel up until 5 minutes ago that console developers were forced to wrangle more power out of hardware that was already mid-tier at best when it came out, let alone four or five years down the line. And it was true: some of the most amazing games came out on hardware they simply shouldn't have been possible on. I mean, Todd Howard admitted that they got Morrowind to run on the OG Xbox by secretly rebooting the console during load screens to clear the cache.
 
Last edited:
In all fairness though, last gen's mid-gen console refreshes were hampered by the very weak x86 Jaguar CPUs. This generation, they have better options to work with.

We also don't know what kind of manufacturing deals they have with chip makers, or what kind of tech they're working on.

Overall, it would be silly to expect something like an RTX 40 series this refresh, but it's safe to say we will see much better gains than last gen's refresh.
 
  • Brain
Reactions: Mickmrly and Kadayi
Plus it's literally been console gospel up until 5 minutes ago that console developers were forced to wrangle more power out of hardware that was already mid-tier at best when it came out, let alone four or five years down the line. And it was true: some of the most amazing games came out on hardware they simply shouldn't have been possible on. I mean, Todd Howard admitted that they got Morrowind to run on the OG Xbox by secretly rebooting the console during load screens to clear the cache.

Both console manufacturers are guilty of some hardware BS. Microsoft shipping the Core system with no HDD and not supporting HD-DVD when games were already hitting DVD capacity was a really poor short-term sales move that constrained developers, who had to ensure everything could be loaded into memory. To rub salt in the wound, they also infamously set aside 1GB on their game DVDs for "security measures" that apparently barely needed any of it (it got patched out around the time of Skyrim's release). If you look at all of the games from that period you'll note how much reliance there was on generic repeat assets, both 3D and audio, which was a result of having to manage this. Sony, on the other hand, did go with Blu-ray, but also tried to monopolise developers by introducing the Cell processor, which was apparently a bitch to code for if you were a third-party developer trying to do multi-platform. With all of that said, I think things are a bit easier these days as both console makers are a bit more receptive to developer feedback.
 
Plus it's literally been console gospel up until 5 minutes ago that console developers were forced to wrangle more power out of hardware that was already mid-tier at best when it came out, let alone four or five years down the line. And it was true: some of the most amazing games came out on hardware they simply shouldn't have been possible on. I mean, Todd Howard admitted that they got Morrowind to run on the OG Xbox by secretly rebooting the console during load screens to clear the cache.
I don't think the "wrangle more power" argument applies here anymore. That ended with the PS3. The PS4 was just a mid-tier PC back then. The PS5/XSX now have very low bandwidth (equivalent to the PS4 when it came out) compared to mid-to-high-tier GPUs. Common sense says there won't be a jump from low-bandwidth PS5s to high-bandwidth, path-tracing-ready PS5 Pros in a matter of 2-3 yrs.
 
In all fairness though, last gen's mid-gen console refreshes were hampered by the very weak x86 Jaguar CPUs. This generation, they have better options to work with.

We also don't know what kind of manufacturing deals they have with chip makers, or what kind of tech they're working on.

Overall, it would be silly to expect something like an RTX 40 series this refresh, but it's safe to say we will see much better gains than last gen's refresh.
I don't think so. In fact, I'm not sold on a mid-gen machine at all. Sony and MS had two options at the time the PS5/XSX came to the masses: work 6 yrs on a PS6, or integrate a PS5 Pro into the mix based on AMD's current architecture (which would be 3 yrs old now). One option makes more sense than the other. A PS6 would be a better fit for the timeline right now, as AMD is struggling to put out something even remotely comparable to Nvidia on the ray-tracing side of things.

I always take a glimpse into the future by looking at what the tech companies introduce to PC gamers. AMD's next Navi 4 is rumored to brute-force more RT by putting more SMUs on the chip (not a good sign). And there are no rumors of a frame-generation feature on the newer GPUs either. That leads me to believe they either have at least a scaled-down version already being built for the mid-gen machines, or have none at all and are instead trying to get the PS6 together for better competition.

In all fairness, path tracing shouldn't have been a goal this generation. Clearly not even the high-end 4x00 series can handle it without frame generation. If AMD has no such thing, I don't see how they will be able to brute-force their way to running path-traced games in the next generation (i.e. Navi 4, PS6, etc.).
 
I don't think so. In fact, I'm not sold on a mid-gen machine at all. Sony and MS had two options at the time the PS5/XSX came to the masses: work 6 yrs on a PS6, or integrate a PS5 Pro into the mix based on AMD's current architecture (which would be 3 yrs old now). One option makes more sense than the other. A PS6 would be a better fit for the timeline right now, as AMD is struggling to put out something even remotely comparable to Nvidia on the ray-tracing side of things.

I always take a glimpse into the future by looking at what the tech companies introduce to PC gamers. AMD's next Navi 4 is rumored to brute-force more RT by putting more SMUs on the chip (not a good sign). And there are no rumors of a frame-generation feature on the newer GPUs either. That leads me to believe they either have at least a scaled-down version already being built for the mid-gen machines, or have none at all and are instead trying to get the PS6 together for better competition.

In all fairness, path tracing shouldn't have been a goal this generation. Clearly not even the high-end 4x00 series can handle it without frame generation. If AMD has no such thing, I don't see how they will be able to brute-force their way to running path-traced games in the next generation (i.e. Navi 4, PS6, etc.).

Hmm. Maybe? If the PS6 were releasing within the next 2 years, I would agree for sure. But it may not be until the end of 2027. And that's the absolute earliest.

We still need to see what AMD has up their sleeve with FSR3 and the newer GPUs. There could be a reason for more SMUs that we don't know about (or it may be related to newer FSR techniques).

As of now, we can only take educated guesses.
 
  • Like
Reactions: VFX_Veteran
I'm supremely happy with my Series X for the convenience, ease of use, and dollar-for-dollar value it offers compared to a high-end gaming rig. I'm not obsessed with graphics; I just want to play fun games, and the Series X suits me just fine.

All that said, even though I'm the world's biggest Xbox fan girl, never in a million years did I expect my Series X to compete graphics- or performance-wise with a top-of-the-line PC. That would just be flat-out delusional. Xbox games on a good gaming PC look and perform far better than on a Series X, which is exactly as it should be.

@VFX_Veteran I think we're all pretty sure which crazies you're talking about that overestimate their hardware. I don't recall ever seeing any Xbots claiming the Series consoles meet or exceed good PCs. Nintendo fans most definitely don't brag that the Switch is the equal of a high-end gaming rig. There is only one group of idiots that can't recognize, whether due to incompetence or more nefarious reasons, that Sony sucks at porting their own games.
 
Both console manufacturers are guilty of some hardware BS. Microsoft shipping the Core system with no HDD and not supporting HD-DVD when games were already hitting DVD capacity was a really poor short-term sales move that constrained developers, who had to ensure everything could be loaded into memory. To rub salt in the wound, they also infamously set aside 1GB on their game DVDs for "security measures" that apparently barely needed any of it (it got patched out around the time of Skyrim's release). If you look at all of the games from that period you'll note how much reliance there was on generic repeat assets, both 3D and audio, which was a result of having to manage this. Sony, on the other hand, did go with Blu-ray, but also tried to monopolise developers by introducing the Cell processor, which was apparently a bitch to code for if you were a third-party developer trying to do multi-platform. With all of that said, I think things are a bit easier these days as both console makers are a bit more receptive to developer feedback.

True, and Sony even admitted they made their consoles difficult to code for with the PS3 and Cell, which is why so many 360 games performed better, at least when it came to third parties.
 
  • Brain
  • Like
Reactions: Nobel 6 and Kadayi
I don't think the "wrangle more power" argument applies here anymore. That ended with the PS3. The PS4 was just a mid-tier PC back then. The PS5/XSX now have very low bandwidth (equivalent to the PS4 when it came out) compared to mid-to-high-tier GPUs. Common sense says there won't be a jump from low-bandwidth PS5s to high-bandwidth, path-tracing-ready PS5 Pros in a matter of 2-3 yrs.

But even still, it was hard to believe games that came out on the PS4, like God of War 2018, could run on vanilla PS4 hardware from 2013. Or take Skyrim on the Switch; they put an entire world on that tiny cart. That's why I've always admired consoles: they continue to amaze in spite of their specs.

That said, I'm not arguing that the Pro systems will be a huge increase performance-wise. Because all the games still have to be made for the larger vanilla base anyway, the Pro systems will be marginal, and are largely for enthusiasts, not the market overall.
 
  • Brain
Reactions: Mickmrly
True, and Sony even admitted they made their consoles difficult to code for with the PS3 and Cell, which is why so many 360 games performed better, at least when it came to third parties.

I wasn't aware they admitted to it, but I do recall a rather frank interview (back when he did those sorts of things) with Sir Gaben of Newell, where he talked at length about what an absolute nightmare the Cell processor was to code for. With that said, I think better tools emerged over time to make developers' lives easier. Still, lessons were learned for sure.
 
I wasn't aware they admitted to it, but I do recall a rather frank interview (back when he did those sorts of things) with Sir Gaben of Newell, where he talked at length about what an absolute nightmare the Cell processor was to code for. With that said, I think better tools emerged over time to make developers' lives easier. Still, lessons were learned for sure.


Yeah, here's an article on it. I think I appreciated the PS3 a lot more as time went on, whereas the 360 peaked earlier and suffered from the lack of focus during the Mattrick/Kinect era.
 
  • Like
Reactions: Nobel 6

Yeah, here's an article on it. I think I appreciated the PS3 a lot more as time went on, whereas the 360 peaked earlier and suffered from the lack of focus during the Mattrick/Kinect era.


Damn, that's some 'It's not a bug, it's a feature' mindset from Kaz Hirai right there. :LOL:
 
But even still, it was hard to believe games that came out on the PS4, like God of War 2018, could run on vanilla PS4 hardware from 2013. Or take Skyrim on the Switch; they put an entire world on that tiny cart. That's why I've always admired consoles: they continue to amaze in spite of their specs.
That's one of the biggest misleading assumptions a lot of console junkies believe. 99.9% of fanboys praise a game's looks without knowing what, technically, is leading the pack. None of the console games are doing any advanced graphics tech. The talent is superior to most gaming companies', so they know how to make things look good even when there is nothing demanding on the hardware. CP2077 OD, by contrast, clearly looks superior to other games as a direct result of the tech being used, not the art. This leads to arguments like "imagine if they did this kind of work on a PS6 or PS7".

I remember many people declaring R&C looked like a Pixar movie, and it was nowhere near that kind of tech. It was artistically driven to mimic the colors and environments that made it look similar. But it had nothing to do with the hardware of the PS5.
 
  • This tbh
Reactions: Nobel 6
That's one of the biggest misleading assumptions a lot of console junkies believe. 99.9% of fanboys praise a game's looks without knowing what, technically, is leading the pack. None of the console games are doing any advanced graphics tech. The talent is superior to most gaming companies', so they know how to make things look good even when there is nothing demanding on the hardware. CP2077 OD, by contrast, clearly looks superior to other games as a direct result of the tech being used, not the art. This leads to arguments like "imagine if they did this kind of work on a PS6 or PS7".

I remember many people declaring R&C looked like a Pixar movie, and it was nowhere near that kind of tech. It was artistically driven to mimic the colors and environments that made it look similar. But it had nothing to do with the hardware of the PS5.

I agree for the most part and would add an important factor: devs adjust and balance the visual settings (texture detail, draw distance, shadow quality, etc.) specifically for the console hardware at hand, often with settings that can't even be selected in the PC versions, just customized for the fixed hardware to get the best visual configuration for the target performance. That's often combined with dynamic resolution to avoid fps drops, which still isn't too common on PC.

This is a significant contributor to the "punching above its weight" perception. Add the already-mentioned great talent of the first-party studios, Sony's especially, and you get really good-looking games. PC gamers with mid-range rigs have to wait for DF's recommended settings lol

(I still think the Demon's Souls remake is an absolute beauty. Damn, I want that to make its way to the PC.)
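As an aside for anyone curious: the dynamic res I mentioned is basically a feedback loop on frame time. A minimal Python sketch of the idea (the names, thresholds, and step size here are all made up for illustration, not taken from any real engine):

```python
# Minimal sketch of a dynamic resolution controller: scale the render
# resolution down when frames run over budget, and back up when there
# is headroom. All constants are illustrative only.

TARGET_MS = 16.6          # frame budget for 60 fps
MIN_SCALE, MAX_SCALE = 0.6, 1.0
STEP = 0.05               # how aggressively the scale reacts

def next_scale(current_scale: float, last_frame_ms: float) -> float:
    """Return the render-resolution scale to use for the next frame."""
    if last_frame_ms > TARGET_MS * 1.05:    # over budget -> drop resolution
        current_scale -= STEP
    elif last_frame_ms < TARGET_MS * 0.85:  # plenty of headroom -> raise it
        current_scale += STEP
    # clamp so the image never gets too soft or exceeds native res
    return max(MIN_SCALE, min(MAX_SCALE, current_scale))

# Example: a run of heavy frames pushes the scale down, easy frames
# gradually bring it back toward native.
scale = 1.0
for frame_ms in [16.0, 22.0, 21.0, 14.0, 13.0]:
    scale = next_scale(scale, frame_ms)
```

Real engines layer a lot more on top (hysteresis, per-axis scaling, upscaler integration), but the core loop is this simple, which is why it's such a cheap way to hold a target frame rate on fixed hardware.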
 
That's one of the biggest misleading assumptions a lot of console junkies believe. 99.9% of fanboys praise a game's looks without knowing what, technically, is leading the pack. None of the console games are doing any advanced graphics tech. The talent is superior to most gaming companies', so they know how to make things look good even when there is nothing demanding on the hardware. CP2077 OD, by contrast, clearly looks superior to other games as a direct result of the tech being used, not the art. This leads to arguments like "imagine if they did this kind of work on a PS6 or PS7".

I remember many people declaring R&C looked like a Pixar movie, and it was nowhere near that kind of tech. It was artistically driven to mimic the colors and environments that made it look similar. But it had nothing to do with the hardware of the PS5.

One doesn't need to be a fanboy or a console junkie to see that the games that come out at the end of a system's life are far superior visually to what came out initially. And this shouldn't come as a surprise: every console generation, with perhaps a few exceptions, has seen huge visual advances as developers became more adept with the hardware. I'm not talking about advanced tech; I'm talking about Game Z for System X looking far superior to Game A on the same system a half decade earlier. I call bullshit on any assertion otherwise, because I've seen it happen in pretty much every generation that has existed, and you'd have to be fucking blind to assert otherwise. I mean, are you suggesting that Digital Foundry are a bunch of fucking rubes because they were amazed that Guerrilla got Horizon Forbidden West to run as well as it did on the vanilla PS4? I don't even care for that game, but I'm not stupid enough to think that it wasn't a huge achievement to get it running on a base PS4 with as few sacrifices as were made, and that something like that just wasn't possible in 2014.
 
I don't think so. In fact, I'm not sold on a mid-gen machine at all. Sony and MS had two options at the time the PS5/XSX came to the masses: work 6 yrs on a PS6, or integrate a PS5 Pro into the mix based on AMD's current architecture (which would be 3 yrs old now). One option makes more sense than the other. A PS6 would be a better fit for the timeline right now, as AMD is struggling to put out something even remotely comparable to Nvidia on the ray-tracing side of things.

I always take a glimpse into the future by looking at what the tech companies introduce to PC gamers. AMD's next Navi 4 is rumored to brute-force more RT by putting more SMUs on the chip (not a good sign). And there are no rumors of a frame-generation feature on the newer GPUs either. That leads me to believe they either have at least a scaled-down version already being built for the mid-gen machines, or have none at all and are instead trying to get the PS6 together for better competition.

In all fairness, path tracing shouldn't have been a goal this generation. Clearly not even the high-end 4x00 series can handle it without frame generation. If AMD has no such thing, I don't see how they will be able to brute-force their way to running path-traced games in the next generation (i.e. Navi 4, PS6, etc.).

My take is that most of the market doesn't give a shit about path tracing or ray tracing or whatever. I know a bunch of people who are playing PS5 games on a 1080p display and don't give a shit. They just wanted to play Sony exclusives, or just own the next PlayStation, or what have you. I think it's a profound mistake to treat tech specs as the sole reason people buy console hardware, which is exactly why I'm going to agree with you on Pro releases being a mistake… they're niche products for an audience that is most likely already on board with the platform anyway. Sony and Microsoft would be much smarter to release revised Slim models that hit the $299 (or $199) price point, because there are a fuck ton of people who would like to own one of these systems but aren't invested in the hobby enough to pay $400-$500 to do it.
 
Last edited:
I get that prior to the PS4 things were different, and consoles got a lot of special-sauce hardware that could rival or outperform PC hardware, but those days are long gone.
This is the answer.
Consoles used to have a slight edge at the start of a gen.

But now that's gone; it's just medium-spec PC hardware in a closed box.
People just haven't adapted to this.
 
I don't think it's hard to comprehend the console sentiment: "oh cool, look at the level of graphics I can play for $400". Even PC Reddit is crammed with threads about potato builds and getting DOOM to run on a LeapFrog.

When estimating future hardware, the underlying thinking is the same: "oh cool, all of this graphical glitz for the price of [average expected console price], that's amazing!"

PC gamers are just chronic ACKTUALLY nerds trying to justify their expensive purchase. "Actually that console SSD isn't the fastest SSD, there's a Western Digital drive for $1,100 that outperforms it by 420% according to Digital Foundry tests"
 
I don't think it's hard to comprehend the console sentiment: "oh cool, look at the level of graphics I can play for $400". Even PC Reddit is crammed with threads about potato builds and getting DOOM to run on a LeapFrog.

When estimating future hardware, the underlying thinking is the same: "oh cool, all of this graphical glitz for the price of [average expected console price], that's amazing!"

PC gamers are just chronic ACKTUALLY nerds trying to justify their expensive purchase. "Actually that console SSD isn't the fastest SSD, there's a Western Digital drive for $1,100 that outperforms it by 420% according to Digital Foundry tests"

LOL THIS. The fact that a lot of PC players can name each individual component inside their chassis is embarrassing enough, but when they can actually cite not only its spec but the specs of its competing nerd organs, it's just fucking cringe IMO. And many of them always assume it's a money thing; it isn't, not in my case anyway. I've been blessed and could build 5 high-end PCs and not really feel it, but I never would. Partially because I don't see the value in it, and partially because I'd rather use my very finite free time to play games on a closed platform than pore over specs and benchmark tests to see which part is the absolute best for the next two weeks until another part comes along and replaces it. That would be maddening to me. Conversely, when someone tries to argue that a console is more powerful than a PC that cost thousands of dollars, that's equally embarrassing.

To each their own, but I'd just rather play games when I have the time, along with my other hobbies. Speaking of which, one observation I've made is that the watch nerds get along far better than the gaming nerds. Watch people (most of them anyway) don't sneer at someone buying a G-Shock just because they've got an Omega or a Rolex on their wrist. People into horology can all get together and admire the timelessness of a Rolex Submariner while extolling the insane coolness and retro chic of a Casio F91. There are elitists in the watch space, but far fewer of them, in my experience hanging out on the Watchuseek forums.

The firearms hobby is actually quite similar to gaming. People will shit all over Brand X because they're a Brand Z guy, even though Brand X pumps out thousands of quality, affordable firearms and has a storied name in the industry. People who spent 4 grand on a Wilson Combat 1911 will sneer at someone buying a Springfield or Smith & Wesson, and they'll go on about what a fucking poon someone was for wasting their money on such a piece of shit when they could've saved up and bought a real gun. Same with rifles: elitists will say, "Ah fuck, that rifle isn't even sub-MOA, what a piece of shit!" while the guy with his M&P Sport II is just amazed that he can put shots on paper at 300 yards within a couple inches of each other, and it only cost him $700 or so.

Most people in most markets are somewhat casual. It's those people who keep the market alive, and that's why most companies make products for the casual buyer, on top of the fact that they know the hardcore buyer is going to be there no matter what anyway.
 
Last edited:
One doesn't need to be a fanboy or a console junkie to see that the games that come out at the end of a system's life are far superior visually to what came out initially. And this shouldn't come as a surprise: every console generation, with perhaps a few exceptions, has seen huge visual advances as developers became more adept with the hardware. I'm not talking about advanced tech; I'm talking about Game Z for System X looking far superior to Game A on the same system a half decade earlier. I call bullshit on any assertion otherwise, because I've seen it happen in pretty much every generation that has existed, and you'd have to be fucking blind to assert otherwise. I mean, are you suggesting that Digital Foundry are a bunch of fucking rubes because they were amazed that Guerrilla got Horizon Forbidden West to run as well as it did on the vanilla PS4? I don't even care for that game, but I'm not stupid enough to think that it wasn't a huge achievement to get it running on a base PS4 with as few sacrifices as were made, and that something like that just wasn't possible in 2014.
You have to know the tech and what to look for. Getting sharper textures or adding an environment never seen before isn't evidence of advanced tech, or grounds for claiming developers are too dumb to know what a typical x86 architecture can do. You aren't going to see HFW: DLC 2 suddenly have path tracing at the end of the generation simply because they had to "figure it out" over a 7-yr timespan.
 
  • Like
Reactions: HeresJohnny
You have to know the tech and what to look for. Getting sharper textures or adding an environment never seen before isn't evidence of advanced tech, or grounds for claiming developers are too dumb to know what a typical x86 architecture can do. You aren't going to see HFW: DLC 2 suddenly have path tracing at the end of the generation simply because they had to "figure it out" over a 7-yr timespan.

Yes, current techniques that don't work on current hardware won't work on the same hardware in 7 years. The full-blown ray tracing barely possible on the best PC graphics cards isn't going to be possible on the XSX or PS5 through driver updates or patches. The results we see on screen are a combination of hardware, software, drivers, and middleware, and they can only flex and improve so much.

Spider-Man: Miles Morales has ray tracing on some reflective surfaces. Some fanboys (stupidly) tout this as PROOF that consoles are within touching distance of "full" ray tracing. This is deluded. I'm not a fan of the game, but I have to admit the reflections look great in certain scenes (when low-poly stuff isn't interfering with the effect). Ray tracing used selectively is possible now. And as software improves, the ability to fake ray tracing, to use it selectively in concert with pre-baked lighting, the middleware that lowers the "cost" of implementing these things... all of these will grow. The end result will perhaps be sloppily referred to as "better ray tracing", because the end result will truly look better than what we have now. Underneath, the ray tracing is perhaps just as constrained, but due to artistic vision and better engines, the end result is better.

Those future techniques are capable of bringing consoles closer to the intended effect of ray tracing. More selective usage of ray tracing with the level layout and scenario in mind may allow current consoles to fake "ray tracing" much further. For instance, God of War Ragnarok is absurdly linear and narrow. The tiny size of the levels is kinda embarrassing in a modern game, but on the flip side, every inch of the game looks incredible. The ability to artificially narrow your game and trick the player for the sake of pumping up graphics is not new.

So in however many years, the end result will be games that "do ray tracing better" than current XSX and PS5 games. Trying to speculate about the exact combination of middleware, software improvement, trickery, etc. that will lead to this result is fruitless, because it'll depend on the game and on the shifts of the market.
 
Yes, current techniques that don't work on current hardware won't work on the same hardware in 7 years. The full-blown ray tracing barely possible on the best PC graphics cards isn't going to be possible on the XSX or PS5 through driver updates or patches. The results we see on screen are a combination of hardware, software, drivers, and middleware, and they can only flex and improve so much.

Spider-Man: Miles Morales has ray tracing on some reflective surfaces. Some fanboys (stupidly) tout this as PROOF that consoles are within touching distance of "full" ray tracing. This is deluded. I'm not a fan of the game, but I have to admit the reflections look great in certain scenes (when low-poly stuff isn't interfering with the effect). Ray tracing used selectively is possible now. And as software improves, the ability to fake ray tracing, to use it selectively in concert with pre-baked lighting, the middleware that lowers the "cost" of implementing these things... all of these will grow. The end result will perhaps be sloppily referred to as "better ray tracing", because the end result will truly look better than what we have now. Underneath, the ray tracing is perhaps just as constrained, but due to artistic vision and better engines, the end result is better.

Those future techniques are capable of bringing consoles closer to the intended effect of ray tracing. More selective usage of ray tracing with the level layout and scenario in mind may allow current consoles to fake "ray tracing" much further. For instance, God of War Ragnarok is absurdly linear and narrow. The tiny size of the levels is kinda embarrassing in a modern game, but on the flip side, every inch of the game looks incredible. The ability to artificially narrow your game and trick the player for the sake of pumping up graphics is not new.

So in however many years, the end result will be games that "do ray tracing better" than current XSX and PS5 games. Trying to speculate about the exact combination of middleware, software improvement, trickery, etc. that will lead to this result is fruitless, because it'll depend on the game and on the shifts of the market.
We'll have to agree to disagree here. I believe they are victims of the placebo effect. Instead of new software techniques, they see (at the end of a generation) new environments, more particles, more lighting/shading, possibly better animation, etc. The guys drooling over HFW: Burning Shores don't see a game with more tech than the original HFW. They see the same tech with different biomes, added bug fixes, and polished versions of techniques that were already in place.

The end-of-generation expectation might have been a thing during the PS3 days, when developers truly needed to figure out the hardware, but I think what we see now is pretty consistent with what the devs have to work with.
 

I mean... we have real-world comparisons between launch XsX games and current XsX games, launch PS5 games and current PS5 games.

We could call it "implementation", we could even call it "trickery", but I don't think it's realistic to deny the improvements that get made over the course of a generation. The truism "these are now just mid-budget PCs" is not an actual argument, as consoles have always shared hardware territory with PCs.

Is the point here "yes, it all APPEARS to look better and to have bigger environments, but the tech is not new"? Because we do agree on that point, in the sense that the hardware tech hasn't magically changed inside the console. The improvements are due to software and implementation getting better.
 
Tell me a game that's released now that's showing more tech than 3 yrs ago? I don't see it. HFW: BS is equivalent in tech to HFW. They changed a biome (i.e. the sky), and that's not enough to claim a developer needed time to master the hardware. There is no ray tracing, which indicates they have already tapped out the hardware and it wouldn't lend itself to a leap in looks that justifies the performance cost. When we see game after game from 3rd parties with the same setups, we know the hardware is tapped out. Almost all games have 2 performance modes now. That'll be standard fare for the rest of the generation. I also don't believe Nanite will take hold this generation.
 

Haven't played Burning Shores myself, but I can say right off the bat that the water areas are more complex compared to Forbidden West (which, unless I was missing some massive underwater world, was mostly water tunnels from one area to another), and faster traversal between water, land, and sky implies improved loading.

Demon's Souls and Astro Bot both look nice but God of War Ragnarok shows graphical improvements beyond those two games. I'd take that as another example of improved utilization of the hardware over the lifespan of a console.

Or am I misunderstanding your criteria?
 

I think with this gen, it became way easier for devs to tap into the potential of the hardware. It's the second gen rocking the x86 architecture, the tools have matured a lot, and devs got a very good grip on the hardware early on. Demon's Souls is still up there with the best-looking games, and it released in Nov 2020. Metro Exodus EE released two years ago and still has one of the, if not the, best RT implementations on consoles.

I'm sure we will see further improvements and better-looking games, but it won't be as drastic as with previous gens (on base consoles; mid-gen upgrades could change that). That said, tech and tools always improve while the hardware stays the same. I think UE5 and Lumen will bring some nice improvements regarding lighting, for example.
 
  • Star
Reactions: DonDonDonPata
The same reason people overestimate the importance, significance and usefulness of exclusive nVidia tech.
 
Haven't played Burning Shores myself, but I can say right off the bat that the water areas are more complex compared to Forbidden West (which, unless I was missing some massive underwater world, was mostly water tunnels from one area to another), and faster traversal between water, land, and sky implies improved loading.
That's called an addition to the world -- a biome. Your claim could only be true if there was a waterworld in HFW and Burning Shores' waterworld shows a noticeable leap from that.

A better comparison would be CP2077's RT mode vs. Overdrive mode. Or Spider-Man 2 over Spider-Man: Miles Morales (which we don't know yet).

Demon's Souls and Astro Bot both look nice but God of War Ragnarok shows graphical improvements beyond those two games.
They aren't the same games. GoW: Rag doesn't have anything in it that *should* have been implemented in Demon's Souls.

I believe this is the crux of the console gamers' arguments. You are all comparing games against other games and saying, "This looks better than that for some reason, so I'll write it off as better tech."

That is a completely biased assumption. GoW would always look better than a Souls-type game due to better talent, level design, and VFX teams. None of this relates to better use of the hardware. Different teams, different art direction, different talent and level design.
 
That's called an addition to the world -- a biome. Your claim could only be true if there was a waterworld in HFW and Burning Shores' waterworld shows a noticeable leap from that.

What if HFW's presentation of the world used too many assets so it couldn't present a waterworld at that scale?

I think you're arbitrarily picking what does or doesn't count as improvement to refute an argument made by people on other forums like Icon ERA.

They aren't the same games. GoW:Rag doesn't have anything in it that *should* have been implemented in the game and isn't.

I believe this is the crux of the console gamers' arguments. You are all comparing games against other games and saying, "This looks better than that for some reason, so I'll write it off as better tech."

That is a completely biased assumption. GoW would always look better than a Souls-type game due to better talent, level design, and VFX teams. None of this relates to better use of the hardware. Different teams, different art direction, different talent and level design.

This is an unfalsifiable claim. When I try to give examples -- yes, imperfect examples because we're comparing old games with newer ones -- you have pre-loaded excuses for why they don't count.

You're just being arbitrary. Guess that's one way to never lose arguments about "tech" when you alone get to define what is or isn't "tech".
 
What if HFW's presentation of the world used too many assets so it couldn't present a waterworld at that scale?
That makes no sense. Now we are guessing.

I think you're arbitrarily picking what does or doesn't count as improvement to refute an argument made by people on other forums like Icon ERA.
I'm trying to stay within the definition of what's considered "next-gen" or an improvement over current tech. It makes sense to compare the same teams - not to pick a team that's been known for less-than-spectacular graphics, like FROM, against a company that thrives on cinematic, film-quality graphics, like SSM.

This is an unfalsifiable claim. When I try to give examples -- yes, imperfect examples because we're comparing old games with newer ones -- you have pre-loaded excuses for why they don't count.

You're just being arbitrary. Guess that's one way to never lose arguments about "tech" when you alone get to define what is or isn't "tech".
I mean, everyone should objectively have the same definition for tech though. What have we seen as far as tech last generation? PBR shaders. That is what pushed the generation forward. This generation it's RT and Nanite. No one disagrees that the Nanite demo for the PS5 was a step ahead of existing games, because it showed tech that was superior. We saw something that beats out normal maps and parallax occlusion mapping; it got closer and closer as far as detail was concerned. At the same time, we got path tracing demos from Nvidia, where everyone said that games using it would be a few generations away, until CP2077 OD came out. What I'm trying to say is that we have objective examples of new tech being in a game and looking a generation ahead.
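The Nanite-style pitch can even be put in toy math: instead of faking detail with normal maps, keep picking finer geometry until the projected error on screen drops below a pixel. Rough sketch (the projection formula is standard perspective math; the halving-per-LOD-step model and all numbers are invented for illustration):

```cpp
#include <cassert>
#include <cmath>

// Projected size in pixels of a world-space error e at distance d,
// for a screen of height H pixels and vertical field of view fov:
//   px = e * H / (2 * d * tan(fov / 2))
double projectedErrorPx(double worldError, double distance,
                        double screenHeightPx, double fovRadians) {
    return worldError * screenHeightPx /
           (2.0 * distance * std::tan(fovRadians / 2.0));
}

// Toy LOD picker: refine (here, halve the geometric error per step)
// until the error on screen is sub-pixel. Closer objects need more steps.
int chooseLod(double baseWorldError, double distance,
              double screenHeightPx, double fovRadians) {
    int lod = 0;
    double err = baseWorldError;
    while (projectedErrorPx(err, distance, screenHeightPx, fovRadians) > 1.0
           && lod < 16) {
        err *= 0.5; // each finer level roughly halves the error
        ++lod;
    }
    return lod;
}
```

Same hardware, but detail budget driven by what's actually visible on screen, which is the conceptual jump over normal maps and parallax occlusion mapping.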

If we have no Nanite or PT this generation, then there can't be a big leap in visuals, as the artists are using the same tools they used last gen. That's why we see 2 modes in games: performance mode sacrifices what you could get for 60 FPS, while the 30 FPS mode gives higher-res textures, better texture filtering, and some RT at a small scale.
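The two modes are really just frame-budget arithmetic. A minimal sketch, with all frame costs invented for illustration:

```cpp
#include <cassert>

// 30 fps and 60 fps frame budgets in milliseconds.
constexpr double kBudget30 = 1000.0 / 30.0; // ~33.33 ms per frame
constexpr double kBudget60 = 1000.0 / 60.0; // ~16.67 ms per frame

// A feature fits a mode only if the base frame plus the feature's cost
// stays inside that mode's budget.
bool fits(double baseFrameMs, double featureMs, double budgetMs) {
    return baseFrameMs + featureMs <= budgetMs;
}
```

So a hypothetical 6 ms RT pass on top of a 14 ms base frame fits the 30 FPS budget comfortably but blows the 60 FPS one, which is exactly the quality-vs-performance split we see shipping.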
 
The same reason people overestimate the importance, significance and usefulness of exclusive nVidia tech.
So a game/demo that shows a huge difference in rendering quality from previous techniques is overestimated, but a DLC like Burning Shores is significant because it shows some cinematic gameplay flying through volumetric clouds (which were implemented years ago in other games) and is a Sony exclusive?
 
That makes no sense. Now we are guessing.

You're guessing that HFW could have included a comparable waterworld. The game had limited water areas. It had water, but it didn't have an extensive waterworld. The second game appears to. Pretty obvious improvement.

One has to twist logic into pretzels to assume "this could've been done in the first game, they just chose not to", or to lean on the unfalsifiable "oh well, there's a difference in team and talent, that's why it looks better".

I'm trying to stay within the definition of what's considered "next-gen" or an improvement over current tech. It makes sense to compare the same teams - not to pick a team that's been known for less-than-spectacular graphics, like FROM, against a company that thrives on cinematic, film-quality graphics, like SSM.


I mean, everyone should objectively have the same definition for tech though. What have we seen as far as tech last generation? PBR shaders. That is what pushed the generation forward. This generation it's RT and Nanite. No one disagrees that the Nanite demo for the PS5 was a step ahead of existing games, because it showed tech that was superior. We saw something that beats out normal maps and parallax occlusion mapping; it got closer and closer as far as detail was concerned. At the same time, we got path tracing demos from Nvidia, where everyone said that games using it would be a few generations away, until CP2077 OD came out. What I'm trying to say is that we have objective examples of new tech being in a game and looking a generation ahead.

If we have no Nanite or PT this generation, then there can't be a big leap in visuals, as the artists are using the same tools they used last gen. That's why we see 2 modes in games: performance mode sacrifices what you could get for 60 FPS, while the 30 FPS mode gives higher-res textures, better texture filtering, and some RT at a small scale.

I think an easy example that punches a hole in your argument is patches: a game is released with such-and-such graphics, then a year later it gets a patch and now it can run on the same hardware at 60 fps with no discernible problems elsewhere.

Will you make the argument that the original version of the game was "sub-optimized" but the final patched version is "more optimized", therefore it doesn't count? Perhaps, but if we can list off many examples of this happening in patches, surely the same thing happens across the lifespan of the console.

Performance-taxing techniques are swapped out for less intensive ones, middleware gets updated or replaced with more efficient tools. This also happens in the PC space, where driver updates "improve the hardware" by improving the utilization of said hardware and improving the coordination between PC parts.

There's no logical reason why console games wouldn't get more technically impressive as the console generation goes on, and we see this across console history. We even see this on Nintendo Switch, where ass ports like Ark ran poorly on the system but now, after a patch, it is one of the best-looking games on the system. The hardware itself didn't get more powerful, so what happened?
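The patch scenario is easy to put numbers on: swap one expensive pass for a cheaper equivalent and the same silicon clears a higher frame rate. Toy sketch, all pass costs invented for illustration:

```cpp
#include <cassert>

// Per-frame cost of the major render passes, in milliseconds.
// The breakdown and numbers are made up to illustrate the idea.
struct FrameCosts {
    double geometryMs;
    double lightingMs;
    double postFxMs;
};

// Frame rate is just the reciprocal of total frame time.
double fps(const FrameCosts& c) {
    return 1000.0 / (c.geometryMs + c.lightingMs + c.postFxMs);
}
```

E.g., a launch build spending 12 + 18 + 6 ms sits under 30 fps; patch the lighting pass down from 18 ms to a cheaper 10 ms technique and the identical hardware is suddenly well over 30 fps, no silicon required.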
 
  • Like
Reactions: Nikana
You're guessing that HFW could have included a comparable waterworld. The game had limited water areas. It had water, but it didn't have an extensive waterworld. The second game appears to. Pretty obvious improvement.
That's still not "we learned the hardware more, so we were able to generate a waterworld", even though that's been implemented in games like Sea of Thieves and AC. So improvement, to you, means patching tools and iterating on them until one day you have a game with much better visuals?

There's no logical reason why console games wouldn't get more technically impressive as the console generation goes on, and we see this across console history.
There is a very important reason: the power of the system has been used up, so there is no more room to sacrifice milliseconds for added features.

We even see this on Nintendo Switch, where ass ports like Ark ran poorly on the system but now, after a patch, it is one of the best-looking games on the system.
Batman looked the same even though it was bugged to hell and didn't perform well.


Let's all agree on a standard definition, otherwise we will be here forever arguing over this stuff.

What constitutes a significant leap in visuals over a previous implementation?

I claim:

1) Implementing a new rendering technique that hasn't been done before. Examples: PBR, RT, PT, and Nanite.


Your turn.
 
  • Like
Reactions: Joe T.