Thread: Why do console fans overestimate future console hardware?
That still isn't a case of "we learned the hardware better, so we were able to generate a water world," especially since water worlds have already been implemented in games like Sea of Thieves and AC.

Why not? Why do devs talk about tools and middleware if they made no difference over the course of a system's life? You already cited a difference in "teams and talent". Does improved talent at using the hardware count to you, or does it not?


So to you, improvement means patching tools and iterating on them until one day you have a game with much better visuals?

Do you have a valid reason for excluding this kind of improvement?


There is a very important reason: the power of the system has been used up, so there is no more room to sacrifice milliseconds for added features.

How do you know the "power has been used up"? Is Candy Crush 3 using up the same amount of power as God of War Ragnarok? Or are we back to the handwave of "oh, that's because of a difference in teams and talent"?

Batman looked the same even though it was bugged to hell and didn't perform well.


Let's all agree on a standard definition, otherwise we will be here forever arguing over this stuff.

What constitutes a significant leap in visuals over a previous implementation?

I claim:

1) Implementing a new rendering technique that hasn't been done before. Examples: PBR, RT, PT, and Nanite.


Your turn.

I retort that a "significant leap in visuals" can also be accomplished by implementing the same rendering technique, but in a "cheaper" manner. This is achieved by improving the middleware which facilitates the implementation of said technique, allowing the hardware to allocate resources elsewhere to achieve the same (or often a superior) visual effect in the game. This occurs all the time on PC and on console hardware and is hardly controversial.
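
To put rough numbers on that (all of them made up for illustration): a 30FPS title has about 33.3ms per frame, and every millisecond a cheaper pass saves is budget that can be spent elsewhere. A minimal sketch in C:

#include <stdio.h>

/* Toy numbers, purely illustrative: if a middleware update cuts a
   hypothetical reflection pass from 6 ms to 3 ms, the freed time can
   buy resolution, shadows, denser geometry, etc. */
int main(void) {
    const double frame_budget_ms = 1000.0 / 30.0;   /* ~33.3 ms at 30 FPS */
    const double reflections_before_ms = 6.0;       /* hypothetical cost */
    const double reflections_after_ms  = 3.0;       /* hypothetical cost */
    double freed_ms = reflections_before_ms - reflections_after_ms;
    printf("Frame budget at 30 FPS: %.1f ms\n", frame_budget_ms);
    printf("Freed by the cheaper pass: %.1f ms (%.0f%% of the budget)\n",
           freed_ms, 100.0 * freed_ms / frame_budget_ms);
    return 0;
}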

Or does this still not fall under your arbitrarily defined 'significant leap in visuals over a previous implementation'?
 
Why not? Why do devs talk about tools and middleware if they made no difference over the course of a system's life? You already cited a difference in "teams and talent". Does improved talent at using the hardware count to you, or does it not?
No, it does not count to me. SSM will always have more talented artists and programmers than FROM. Why would I gauge the hardware by comparing SSM's team and FROM's team across two totally different games? I can't, in good faith, say that the PS5 (for example) was mastered by SSM and not FROM, and that we therefore have a leap in visuals from developer to developer.

Do you have a valid reason for excluding this kind of improvement?
I gave you a valid reason with my claim.


How do you know the "power has been used up"? Is Candy Crush 3 using up the same amount of power as God of War Ragnarok? Or are we back to the handwave "oh that's because of a difference of teams and talent".
I know the power has been used up when you see a pattern of usage from developer to developer. I mentioned that the pattern is two modes of gameplay now: Quality and Performance. I then went on to describe exactly how those are used. Let's take the Matrix demo for example. It looks better than most games because they used Nanite and Lumen, both new technologies (although Lumen still uses GI probes). It rendered at 1080p/30FPS on consoles.
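
As an aside, that two-mode pattern is simple enough to state as a config. A minimal sketch in C with made-up values (no actual game ships exactly these numbers):

#include <stdio.h>

/* Illustrative only: the two presets most current-gen games expose. */
typedef struct {
    const char *name;
    int target_fps;        /* what the frame budget is sized for */
    int internal_height;   /* render resolution before upscaling */
    int ray_tracing_on;    /* features traded against frame time */
} DisplayMode;

static const DisplayMode kModes[] = {
    { "Quality",     30, 1800, 1 },  /* spend the budget on features */
    { "Performance", 60, 1080, 0 },  /* spend it on frame rate instead */
};

int main(void) {
    for (int i = 0; i < 2; i++)
        printf("%s: %d FPS target, %dp internal, RT %s\n",
               kModes[i].name, kModes[i].target_fps, kModes[i].internal_height,
               kModes[i].ray_tracing_on ? "on" : "off");
    return 0;
}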

We don't even have a developer at Sony with an equivalent to Nanite yet. GG may or may not have time to put that into their next title before the generation is over. If they *do* get the time, then I can agree that their next game will push tech forward compared to HFW and BS. That would be a logical assumption. And if you agree with me, then we both agree on the standard definition of improving during a gen. Burning Shores doesn't qualify because it isn't using anything new tech-wise. The water biome isn't a feature that would be considered "next-gen" simply because they added that environment to their game.

I retort that a "significant leap in visuals" can also be accomplished by implementing the same rendering technique, but in a "cheaper" manner.
There is no such thing. Find me a better way of implementing object-space reflections without using cube maps, environment maps, or SSR, and without using RT. Give me a better algorithm than parallax occlusion mapping or normal maps for higher mesh detail that doesn't require more compute performance.
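
For reference, the core of standard parallax occlusion mapping is just a linear search along the view ray. Here's a CPU-side C sketch; the procedural depth function stands in for a texture fetch, and all names and constants are illustrative rather than taken from any engine:

#include <math.h>
#include <stdio.h>

/* Stand-in for a depth-map texture fetch (0 = surface top, 1 = deepest). */
static float sample_depth(float u, float v) {
    return 0.5f + 0.5f * sinf(6.2831853f * u) * cosf(6.2831853f * v);
}

/* March along the tangent-space view ray until it dips below the
   depth field, then return the parallax-corrected UV. */
static void pom_uv(float u, float v,
                   float view_x, float view_y, float view_z, /* z > 0 */
                   float height_scale, int steps,
                   float *out_u, float *out_v) {
    float layer_step = 1.0f / (float)steps;
    float du = (view_x / view_z) * height_scale / (float)steps;
    float dv = (view_y / view_z) * height_scale / (float)steps;
    float ray_depth = 0.0f;
    float map_depth = sample_depth(u, v);
    while (ray_depth < map_depth) {  /* still above the surface */
        u -= du;
        v -= dv;
        map_depth = sample_depth(u, v);
        ray_depth += layer_step;
    }
    *out_u = u;
    *out_v = v;
}

int main(void) {
    float u, v;
    pom_uv(0.25f, 0.25f, 0.4f, 0.2f, 0.89f, 0.05f, 32, &u, &v);
    printf("parallax-corrected UV: (%.4f, %.4f)\n", u, v);
    return 0;
}

The per-pixel cost scales linearly with the step count.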

Also, no need to get snide with me. We are all discussing here.
 
I retort that a "significant leap in visuals" can also be accomplished by implementing the same rendering technique, but in a "cheaper" manner. This is achieved by improving the middleware which facilitates the implementation of said technique, allowing the hardware to allocate resources elsewhere to achieve the same (or often a superior) visual effect in the game. This occurs all the time on PC and on console hardware and is hardly controversial.

Or does this still not fall under your arbitrarily defined 'significant leap in visuals over a previous implementation'?

I was about to suggest something similar to what @VFX_Veteran asked after reading the exchanges here. You two appear to be on the same page, more or less, but might be getting tangled up on language/terms that need to be better defined.

Diving into the software side with dev tools complicated things a bit, whereas the thread started exclusively on the hardware side.

Improving dev tools, software tricks/design decisions, and learning from mistakes/experience tends to help studios create more finely-tuned products, which fuels the perception that maybe it was the result of untapped hardware potential.
 
No, it does not count to me. SSM will always have more talented artists and programmers than FROM. Why would I gauge the hardware by comparing SSM's team and FROM's team across two totally different games? I can't, in good faith, say that the PS5 (for example) was mastered by SSM and not FROM, and that we therefore have a leap in visuals from developer to developer.

So if a difference in talent, and in the ability to utilize the hardware, varies between two devs, it would stand to reason that the difference could also exist between a dev in 2019 and the same dev in 2025. Not sure you can claim to be arguing in good faith when I give you an example of the same dev improving things between a game and its DLC and you throw it out arbitrarily.

I gave you a valid reason with my claim.



I know the power has been used up when you see a pattern of usage from developer to developer. I mentioned that the pattern is two modes of gameplay now: Quality and Performance. I then went on to describe exactly how those are used. Let's take the Matrix demo for example. It looks better than most games because they used Nanite and Lumen, both new technologies (although Lumen still uses GI probes). It rendered at 1080p/30FPS on consoles.

We don't even have a developer at Sony with an equivalent to Nanite yet. GG may or may not have time to put that into their next title before the generation is over. If they *do* get the time, then I can agree that their next game will push tech forward compared to HFW and BS. That would be a logical assumption. And if you agree with me, then we both agree on the standard definition of improving during a gen. Burning Shores doesn't qualify because it isn't using anything new tech-wise. The water biome isn't a feature that would be considered "next-gen" simply because they added that environment to their game.


There is no such thing. Find me a better way of implementing object-space reflections without using cube maps, environment maps, or SSR, and without using RT. Give me a better algorithm than parallax occlusion mapping or normal maps for higher mesh detail that doesn't require more compute performance.

lol let me get this straight: do I have to cite the patch notes and whitepapers showing an algorithm has improved?

Also, no need to get snide with me. We are all discussing here.

The premise of your thread is snide: it begins by attacking a strawman and asserting superiority of knowledge, all while you play the persecuted victim fleeing from mean forums that don't appreciate your truth.

Or maybe you didn't mean to be snide. I didn't take it that way, so don't take my disagreement as snide either. If my frustration is coming through in my text, it's because you're slipping between criteria and definitions and not really "discussing" anything.
 
I was about to suggest something similar to what @VFX_Veteran asked after reading the exchanges here. You two appear to be on the same page, more or less, but might be getting tangled up on language/terms that need to be better defined.

Diving into the software side with dev tools complicated things a bit, whereas the thread started exclusively on the hardware side.

Improving dev tools, software tricks/design decisions, and learning from mistakes/experience tends to help studios create more finely-tuned products, which fuels the perception that maybe it was the result of untapped hardware potential.
Exactly.
 
I'll be honest with you guys. With all the fanboys wanting the PS5 to run any game at 4K/120FPS, with Kaz preaching that some Gran Turismo was running at 8K/120Hz, and now with Gran Turismo 7 receiving a 120FPS patch, I thought, "This time I'll be eating crow. The motherfucker made it possible to run on PS5 at 4K/120FPS using magic." I even went back to the game to check, and when I did I was so disappointed that it could only run at 1080p/120FPS.

Point is, I hate it when guys overhype a console like it's the ultimate console of all time and the console itself doesn't deliver. Not even 4K/30FPS. Not even through magic, not even through "secret sauce".
 
So if a difference in talent, and in the ability to utilize the hardware, varies between two devs, it would stand to reason that the difference could also exist between a dev in 2019 and the same dev in 2025. Not sure you can claim to be arguing in good faith when I give you an example of the same dev improving things between a game and its DLC and you throw it out arbitrarily.
You didn't show an example of the same dev improving things between the game and the DLC. We both agree that the DLC's environment happens to be different from that of the main game. How is adding an underwater biome an improvement to the rendering engine? Both games look the same rendering-feature-wise.

lol let me get this straight: do I have to cite the patch notes and whitepapers showing an algorithm has improved?
No, but we should see something in the visuals that points to that fact. For example, we'd see less grainy-looking textures at oblique angles if they implemented 16X anisotropic filtering instead of the 4X the consoles still use (despite being more powerful than last gen's consoles).
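
For instance, bumping the anisotropy level is a one-liner in most graphics APIs. A sketch in OpenGL terms, assuming a current GL context and the near-universal EXT_texture_filter_anisotropic extension (set_texture_anisotropy is my own name, not a GL function):

#include <GL/gl.h>
#include <GL/glext.h>  /* GL_TEXTURE_MAX_ANISOTROPY_EXT */

/* Clamp the requested level to what the driver supports and apply it. */
void set_texture_anisotropy(GLuint tex, GLfloat requested) {
    GLfloat max_supported = 1.0f;
    glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &max_supported);
    glBindTexture(GL_TEXTURE_2D, tex);
    /* e.g. 16.0f for 16X; 4X trades sharpness at oblique angles
       for texture bandwidth. */
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT,
                    requested < max_supported ? requested : max_supported);
}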


The premise of your thread is snide: it begins by attacking a strawman and asserting superiority of knowledge, all while you play the persecuted victim fleeing from mean forums that don't appreciate your truth.
How am I asserting superiority of knowledge? I think you guys know I work in the field and intentionally view me as a "guilty party" because of that fact, despite what I may say. Am I a persecuted victim fleeing from forums of people who don't appreciate my truth, or are these people outright fanboys who want their hardware of choice to always be a superior product and hate what I'm saying?
 
I'll be honest with you guys. With all the fanboys wanting the PS5 to run any game at 4K/120FPS, with Kaz preaching that some Gran Turismo was running at 8K/120Hz, and now with Gran Turismo 7 receiving a 120FPS patch, I thought, "This time I'll be eating crow. The motherfucker made it possible to run on PS5 at 4K/120FPS using magic." I even went back to the game to check, and when I did I was so disappointed that it could only run at 1080p/120FPS.

Point is, I hate it when guys overhype a console like it's the ultimate console of all time and the console itself doesn't deliver. Not even 4K/30FPS. Not even through magic, not even through "secret sauce".
And when I call that out, I'm acting like a persecuted victim. See the irony in that? It's simply pointing out that Sony fanboys are biased as hell and hate anyone telling them the machine isn't doing anything special. I never get that vibe from the Xbox bros; they seem to be more realistic and less prideful.
 
How am I asserting superiority of knowledge? I think you guys know I work in the field and intentionally view me as a "guilty party" because of that fact, despite what I may say. Am I a persecuted victim fleeing from forums of people who don't appreciate my truth, or are these people outright fanboys who want their hardware of choice to always be a superior product and hate what I'm saying?

c'mon, re-read what you just typed out above. Clearly you aren't engaging with this topic "in good faith" if this is actually what you believe.
 
c'mon, re-read what you just typed out above. Clearly you aren't engaging with this topic "in good faith" if this is actually what you believe.
I am absolutely engaging in good faith. I want to know why hardcore fanboys have to hype their opinions and their expectations to high heaven to somehow come out on top of everyone else. The topic is intentionally made to get people to discuss honestly WHY they do that shit. Why not be realistic? In a couple of years, I'll make a topic called "Next-gen expectations and predictions". I bet you a dollar to a dime that the Sony fanboys will come in and declare the PS6 leapfrogging over 3x00 series GPUs and landing in the realm of 4x00 series cards, with path tracing, 4K/60FPS, and DLSS 3 frame generation, all for less than $600. Despite the fact that AMD is significantly behind Nvidia in tech, with no rumor of frame generation or hardware Tensor/RT cores, the PS6 will, for some magical reason, include all of that.
 
So a game/demo that shows a huge difference in rendering quality over previous techniques is overestimated, but a DLC like Burning Shores is significant because it shows some cinematic gameplay flying through volumetric clouds (which have been implemented in other games for years) and is a Sony exclusive?

I fully agree that Sony exclusives are overhyped in general, simply because they are Sony exclusives. It happens whenever there is excess fanaticism. The kicker is that the same applies to nVidia tech, and their graphics cards, because the fanaticism is extremely high for it.
Two prime examples are Hairworks and the RTX 2000 series. The former is pretty much irrelevant nowadays (and was even pretty much irrelevant at the time for anyone wanting to actually play games), and the RTX 2000 series significantly messed up pricing for gaming, because people were overhyping and overpaying for a feature that basically is incapable of actually running on those graphics cards.

The RTX implementation of RT is already showing signs of being excessive when we compare it to the hardware-accelerated Lumen implementation in UE5, not unlike Hairworks. It took two more generations for RT to finally become a somewhat useful feature with the RTX 4000 series, and even then it's still mostly a gimmick, because it doesn't actually significantly enhance gameplay, and the average person wanting to pay at most $500 for a graphics card is still unable to make use of it. And AMD is stupid enough to follow suit.

To me it really feels like gaming has declined, and one of the reasons is the focus on graphics over substance and gameplay. The switch from 2D to 3D enabled great innovations. RT has the potential to enable innovation, but as of now it is merely an image boost so people can brag about their purchase on the internet. It should be combined with in-game physics in some way to enable innovative gameplay.

In reality, you're better off with a Nintendo Switch than with a PS5, Xbox Series X, or PC at this point. But fanaticism doesn't allow people to see it.
 
I fully agree that Sony exclusives are overhyped in general, simply because they are Sony exclusives. It happens whenever there is excess fanaticism. The kicker is that the same applies to nVidia tech, and their graphics cards, because the fanaticism is extremely high for it.
Two prime examples are Hairworks and the RTX 2000 series. The former is pretty much irrelevant nowadays (and was even pretty much irrelevant at the time for anyone wanting to actually play games), and the RTX 2000 series significantly messed up pricing for gaming, because people were overhyping and overpaying for a feature that basically is incapable of actually running on those graphics cards.
Yup, I agree. The issue I have with Nvidia is their stinginess with VRAM. They continuously cripple the lower tiers of cards to drive more sales of the x90 cards (24GB is an outstanding amount for now). I also hate that they close off their custom patents and innovations based on AI and Tensor cores. I'm sure that DLSS 3 frame generation could probably be done on 3x00 series boards. However, this doesn't exempt AMD, which is in a world of trouble right now. They are unable to come up with like-for-like technology in their graphics boards, and they continue to focus on brute-forcing RT with more compute units. Clearly it's not enough for path tracing.

The RTX implementation of RT is already showing signs of being excessive when we compare it to the hardware-accelerated Lumen implementation in UE5, not unlike Hairworks.
Lumen is extremely well done for its time. However, Epic doesn't have a path tracing pipeline that runs in realtime like Nvidia does. I am 99% sure that CDPR paid Nvidia to get the PT tech running so well. Since it's Nvidia, of course they will be able to optimize something like path tracing more than any other developer. This move by Nvidia sort of shits on Epic's Lumen because it has made it obsolete already, and studios haven't even used Lumen yet (I hope my company decides to use it).

It took two more generations for RT to finally become a somewhat useful feature with the RTX 4000 series, and even then it's still mostly a gimmick, because it doesn't actually significantly enhance gameplay,
I can understand your point here. You want enhanced gameplay instead of visuals. Sounds fair to me, even though I'm exactly the audience Nvidia targets with visuals. So I'll take that criticism. 😩
 
Reading through this thread, I feel like Jack in Sideways. While I appreciate the knowledge people have about how the games are made (much like Miles' knowledge of wine), I just see pretty images and play the game. I'm a console-first knucklehead whose only PC is a Steam Deck.

I think they can get a bunch more out of these consoles before a new one is necessary. They want us to buy new hardware. Hogwarts looks great; RE 4 runs and looks awesome in ray tracing mode. Cyberpunk, Dead Space, etc. Jedi and the Insomniac games should look incredible. I don't know, I think these games are starting to look pretty good.

PC elitists are snobs. Nobody cares about your specs. Great, you can run GTA 5 at 600 FPS with mods, and it only cost you two grand. It's not hilarious at all when you all bitch and moan about how terribly the PC versions of games run compared to the console.
 
The thing about consoles is they never ran well to begin with. From a development standpoint, the SNES and N64 suck due to a number of bizarre last-minute decisions. The SNES could barely play the arcade games of its era for a number of reasons. If you think about it for a second, the SNES and Genesis weren't the most powerful hardware on the market, and the differences between the two consoles were unnoticeable; the Genesis's best quality was that it was easier to develop for. Same thing with the N64 vs. PSX vs. Saturn: the easiest to develop for was the PSX, and that thing sucked from almost every standpoint compared to the Saturn.

But look back at the 90s console wars: kids were all over the braindead console-war BS because some guy on a television screen told them their friend's Genesis sucked and didn't have blah blah blah. Same thing with the Saturn: compare the arcade ports of games like Alpha 3 between the two systems and you'll see a difference. The Saturn was almost as good as the N64, but people brushed it off as nothing because some guy on TV told them that the Saturn's hardware sucked compared to the PSX, and they ate it up. Then the same marketing tactics were used against the Dreamcast when the PS2 released, and people didn't care.

TL;DR: the game industry may reap what it sows.
 
Seems like I'm a bit late to this thread, but I mentioned it elsewhere: Diminishing returns

Graphics used to improve significantly without needing too much of an increase in hardware. Look at how Playstation 1 games compare to Playstation 2 games. A night-and-day difference, and that's in spite of the PS2 actually being the weakest console of the sixth gen. There was still a significant jump from sixth gen to seventh gen. Compare original Xbox games to Xbox 360.

But as time goes on and we get closer and closer to photorealism, we see far fewer improvements for the same level of hardware upgrade, yet people still expect the generational changes in graphics that occurred in prior gens. Hell, you even see that in PC games. I always bring up the fact that the time between Quake 1 and Crysis is less than the time between Crysis and Battlefield 2042, and you can really see how progress has slowed down.
 