
Why don't 8th gen games look THAT much better than 7th gen? Will this also be true for 9th gen in 2020?


Estil


(old NA topic from back in 2013 (has it been that long ago?); I thought it'd be appropriate to bring back this discussion seeing as how we're about to enter the 9th gen next year)

 

Now, everyone here knows I'm very much a traditionalist retro gamer.  That being said, I have nothing against modern gaming at all, even though I don't really understand a lot of it (I don't do online gaming, and what is up with all this streaming and cloud stuff?).  But when I look at videos of upcoming X1/PS4 games (like that side-by-side comparison video IGN just did a few days ago), I can't help but wonder... couldn't the PS3/X360 do the same thing if developers really tried to push the system?

See, with previous generation transitions, the difference in graphics (and sound) was like night and day.  Those of you who, like myself, were kids back when the SNES was first released in 1991, remember how blown away we all were by Super Mario World?  Super Mario Bros 3 clearly pushed the NES with everything it had, but Super Mario World was so much more colorful, detailed, and bigger.

When the N64 came out, remember how blown away people were by Super Mario 64 and how much crisper and less jaggedy the visuals were compared to the early PS1 games?  How about when the PS2/GC/Xbox came out and showed 3D games where everything no longer had those sharp points?

UPDATE: Here's another really perfect example of exactly what I mean. I just recently got a copy of Luigi's Mansion for the GC (the launch title, if you recall; I had one right at launch but stupidly sold it years ago). Now, this was only a launch title, and I believe I read somewhere that Nintendo said it was only using about 5% of what the GC was capable of. But even so, it was very VERY clear that even this launch title looked WAY better than even the very best N64 graphics. And I was totally blown away seeing the Super Smash Bros Melee intro for the first time in 2001... I had never seen graphics like it! And again, that was a game that came out right after launch! See, THIS is the kind of graphical leap we all fully expected with each new generation of gaming. But now when I see videos of PS4/X1 games, I often can't help but wonder if the PS3/360 could do those same kinds of graphics if the system were really pushed. With the GC, there was never any doubt the N64 was no match for even the earliest GC games.

See what I mean?  It used to be that the difference between generations was like night and day.  But now when I see videos of, say, PS4 vs PS3, why is it that the PS4 doesn't seem to look all that much better?

Am I just blind?  Am I so out of touch with current gaming and pop culture?  Or have we reached a point where graphics just can't get much better?

 

UPDATE: Like I said, I brought this back because the first 9th gen consoles come out next year (the PS5 and whatever the Xbox 4 ends up being called).  I imagine for those, too, the graphics will hardly look any different? 😞  Whatever happened to the 80s and 90s, when the tech we saw pushed to the limits at trade shows was clearly leaps and bounds ahead of what we could get in a home console/PC?  Maybe I'm just blind... I still haven't gotten around to doing much on PS3, much less PS4 or PS5, so I guess there's that as well...


Administrator · Posted

Diminishing returns. The PS1 and N64 were pioneers in the 3D space. It's going to take some new method of interacting with games to really wow us again like that. VR is an example, though it's still too unpolished for anyone to truly look at it and go "this IS the future".


VR was indeed "the future" in the 90s!  And people were wowed by what it could do.  But when attempts were made to bring such cutting-edge future tech to the home market, it was often so watered down that it couldn't really do the state-of-the-art tech justice (Power Glove, Virtual Boy), and it just wasn't the same...

Or, OTOH, if they did release it with most of the state-of-the-art tech intact, it could be way too expensive, ending up a clear case of trying to do too much too soon; without enough people buying it, it never really got a chance to catch on.  Laserdisc, the 90s "luxury consoles" (3DO, CD-i, Neo Geo, etc.), the IBM Simon, and the Apple Newton are all perfect examples.

Edited by Estil

It depends on what you're playing. There's nothing last gen that can produce a game like Uncharted 4, God of War 2018 or Red Dead Redemption 2. 

There's also the reality that there is an upper limit...reality. Eventually, and soon, gaming will be photo realistic, so the only path forward will be in the hands of the artists in making things not seen in real life look real.


The games you cited, though, came out after the PS4 had been on the market for several years and game creators knew better how to get more out of the system.  It used to be that even the first launch games of a new console blew away the very best of what the previous gen had to offer.  That was the point I was trying to make earlier.


We saw a nice bump in graphical quality and resolution when widescreen aspect ratio became standard and games went from 480i to 720p. Unfortunately, the bump to 1080p wasn't particularly notable to consumers and the industry responded by ignoring framerate and shooting for another resolution bump (4K).

We've certainly seen some improvements in graphical quality in the interim, but I don't think you'll really see anything remarkable until games standardize to 4K/60FPS. That should happen at the start of the next generation, so hopefully we'll see some truly amazing, photorealistic stuff in 2023 and beyond.
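For a sense of scale, each of those jumps multiplies the raw pixel count. Here's a quick back-of-envelope sketch (using 640×480 as a rough stand-in for the old 480-line formats; the other frame sizes are the standard 16:9 ones):

```python
# Pixels per frame for common display formats.
resolutions = {
    "480p": (640, 480),    # rough stand-in for the 480i/480p era
    "720p": (1280, 720),
    "1080p": (1920, 1080),
    "4K": (3840, 2160),
}

names = list(resolutions)
for prev, curr in zip(names, names[1:]):
    pw, ph = resolutions[prev]
    cw, ch = resolutions[curr]
    # Ratio of total pixels between consecutive formats
    print(f"{prev} -> {curr}: {(cw * ch) / (pw * ph):.2f}x the pixels")
```

The 1080p-to-4K step is a full 4x the pixels, a bigger raw jump than 720p to 1080p (2.25x), even though it reads as subtler on a living-room screen.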


Here's another way of looking at it: you know how with the PS1 (the first disc-based console that really caught on) there was a lot more room for data, so it was no trouble at all to have real voices and cutscenes and such (yes, I was in fact blown away in my college days seeing those in Mega Man games, especially the Legends trilogy)?  So it wasn't just a transition from mostly 2D to mostly 3D gaming... but also from "silent games" (you might get a few seconds of spoken dialogue at most; a cartridge couldn't fit much more) to "talkies" (with CD-ROM you had lots of room for things like voice acting).

Thank God we had the good sense to preserve all those "silent" games, as opposed to how most silent films were treated (according to Wikipedia it's guesstimated that around 90% of silent films and 50% of pre-1950 "talkies" are forever lost 😞).  They were either thrown away or ruined (the nitrate film used back then was very flammable and deteriorated quite easily), because when it came to saving anything for posterity, their 'tude for the most part was:

PS: As for "that far ahead", James Avery was just under 45 at the time of this episode...Will Smith reached that age back in 2013.  So yes "that far ahead" has in fact come and gone!

Edited by Estil

31 minutes ago, DoctorEncore said:

We saw a nice bump in graphical quality and resolution when widescreen aspect ratio became standard and games went from 480i to 720p. Unfortunately, the bump to 1080p wasn't particularly notable to consumers and the industry responded by ignoring framerate and shooting for another resolution bump (4K).

We've certainly seen some improvements in graphical quality in the interim, but I don't think you'll really see anything remarkable until games standardize to 4K/60FPS. That should happen at the start of the next generation, so hopefully we'll see some truly amazing, photorealistic stuff in 2023 and beyond.

Oh yes indeed, at the time I did view the transition to HDTV in the early-to-mid 2000s, as well as the transition from dial-up to cable internet, as like the difference between black-and-white and color TV back in the day.  Thing is, though, even though 4K TVs have been quite affordable all things considered, there's still hardly any 4K programming!  Unless I haven't been looking through my digital cable lineup closely enough or something.

Now to put that in perspective: you guys remember the Wonder Years episode from 1988 (it takes place in 1968) called "Christmas", where the family hopes Jack (the father) will splurge on a new color TV for Christmas (which he did, two years later)?  The reason for his hesitation and putting it off was classic sticker shock: $470 for what looks like about a 20-inch screen.  Adjusted for inflation, that would be around $3500!!!  And by that point color TVs had already been around for a little over ten years; a 1958 RCA Victor color TV would've set you back about $800, or roughly $7000 in today's money, and before 1962 only NBC did color programming (hence the colorful peacock), and only once in a while!
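Those "adjusted for inflation" figures are just a price-index ratio. A quick sketch (the CPI values below are approximate annual averages I'm assuming for illustration, not official tables):

```python
# Approximate CPI-U annual averages (assumed for illustration, not
# copied from official statistics).
CPI = {1958: 28.9, 1968: 34.8, 2019: 255.7}

def in_2019_dollars(price, year):
    """Scale a historical dollar price to approximate 2019 dollars."""
    return price * CPI[2019] / CPI[year]

# The $470 color TV from the 1968-set episode, and the $800 1958 RCA set.
print(round(in_2019_dollars(470, 1968)))  # roughly $3500
print(round(in_2019_dollars(800, 1958)))  # roughly $7000
```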

So please think about all that for those of you hoping to score a nice new 4K TV this Black Friday or Cyber Monday or whatever.


I honestly think that we’ve reached a point where clever programming is no longer the star of the show. Hardware is cheap, powerful, and plentiful. Gone are the days when guys like John Carmack were dipping into assembly to streamline routines in order to get Doom to run on our lowly 386s and 486s. The modern mentality is to just throw hardware at it and move on.

 

It’s become a 100-person art project where the majority of the project is spent making 3D models and textures. Since they also tend to aim for realism, diminishing returns become more and more of a factor the closer we get. The early 3D games were SO far from realistic that each step closer seemed huge. Now it’s more just fine detail.


Because processing power and speed are not growing at the same rate, since we are running into the physical limits of silicon chip manufacturing regarding feature size.

 

Nowadays it is more about power consumption improvements. 

 

Another big factor is that we are more or less at a point where the cost-effective tech companies are putting into consoles is not as much of a limitation on how good the graphics look; rather, it’s more on the artists to make things look good or bad. So generally there will be a plateau of graphical quality that is possible in the deadline-driven, team-collaboration environments of game studios.


It's just that we've already passed the point where the artists' visions can be fully realized without any compromises due to hardware limitations. Going forward, it will mostly be about achieving better resolutions and framerates, while being able to keep track of more and more objects/characters within the game space.


Think you need to adjust your bifocals 🤓 

Higher-quality models and assets, better particle effects, better lighting systems like ray tracing, a jump in overall resolution and performance, better draw distances with less pop-in, faster loading with SSDs.  These improvements make an impact.

Go back and play some of those games on older systems.  You are going to see janky-looking characters and environments and some of the fakest fire and explosion effects.  They look a lot worse than you remember.


9 minutes ago, fox said:

Think you need to adjust your bifocals 🤓 

Higher-quality models and assets, better particle effects, better lighting systems like ray tracing, a jump in overall resolution and performance, better draw distances with less pop-in, faster loading with SSDs.  These improvements make an impact.

Go back and play some of those games on older systems.  You are going to see janky-looking characters and environments and some of the fakest fire and explosion effects.  They look a lot worse than you remember.

Exactly how far do you want me to go back?  I mean, I've been a 3rd-6th gen gamer my whole life, so I'd like to think I'm a pretty fair judge of older systems, for the most part.  Then again, I have always played mostly the top blue-chip games, so yeah, if you go further down to the middle of the pack of the lineup, then YES, I most definitely can see how you'd run into those "janky looking characters and environments and some of the fakest fire and explosion effects".  I guess for a console's top blue-chip games, I expect the developers to make the best use of the console they can at that point in time.  Sure, the characters in Final Fantasy 7 do look rather funny/blocky/pointy, but back in 1997 could they in fact have made them better?


1 hour ago, MachineCode said:

I honestly think that we’ve reached a point where clever programming is no longer the star of the show. Hardware is cheap, powerful, and plentiful. Gone are the days when guys like John Carmack were dipping into assembly to streamline routines in order to get Doom to run on our lowly 386s and 486s. The modern mentality is to just throw hardware at it and move on.

That's precisely what I miss most: you had to be really creative to get the most out of the console.  The first really notable example I can think of was when Atari had to figure out how to make a chess game for the 2600 after being threatened with a false-advertising lawsuit if they didn't.  That's just silly though; I mean, the Atari 2600 is such a basic, primitive system, where would someone looking at one on the store shelf get such a crazy idea that it could play chess?


 

And for the NES/SNES, they even used several different kinds of "helper chips" in the cartridges to get far more than what was originally thought possible with the console alone.  BTW, these helper chips are not to be confused with Hamburger Helper, obviously! 😄

 

 

Edited by Estil

2 minutes ago, MachineCode said:

Same here. The reason I became a programmer was the desire to know what made my favorite games tick. I love when people push the limits of systems. Modern AAA gaming is just so homogeneous that it bores the shit out of me. And cheeseburger macaroni is delicious.

 


2 hours ago, MachineCode said:

I honestly think that we’ve reached a point where clever programming is no longer the star of the show. Hardware is cheap, powerful, and plentiful. Gone are the days when guys like John Carmack were dipping into assembly to streamline routines in order to get Doom to run on our lowly 386s and 486s. The modern mentality is to just throw hardware at it and move on.

It’s become a 100-person art project where the majority of the project is spent making 3D models and textures. Since they also tend to aim for realism, diminishing returns become more and more of a factor the closer we get. The early 3D games were SO far from realistic that each step closer seemed huge. Now it’s more just fine detail.

Yes and no. I do agree that "clever programming" is no longer the answer, especially at the assembly level, since computers and modern games are just too complicated to dive that deep. But I don't think hardware is the sole answer either. Not everyone has a top-of-the-line computer to handle poorly optimized content. There are still plenty of games coming out that can't hold a solid 60 fps even on consoles; many can't even reach a solid 30 on release.

There is still the need for tech artists and programmers to keep the hundreds of artists in check on how many triangles or pixels they cram into unnecessarily large models and textures. Besides that, there is still plenty of work for engine and graphics programmers to meet performance goals (but I'm probably showing my bias here).


24 minutes ago, 0xDEAFC0DE said:

There is still the need for tech artists and programmers to keep the hundreds of artists in check on how many triangles or pixels they cram into unnecessarily large models and textures. Besides that, there is still plenty of work for engine and graphics programmers to meet performance goals (but I'm probably showing my bias here).

Of course, it’s not 100% what I said. After all, only a Sith deals in absolutes. But it does seem to have trended in that direction.

 

Out of curiosity, what drives your bias? Do you program game engines for modern games? If so, does the part about work for engine designers still apply as much to individual games in an era where a large portion of games run on the same two third-party engines? Clearly you can’t make a game without programmers, no matter what engine you use.

 

I’m super curious about engine design. It seems so hard to find good information about engine programming from scratch. Everything I find seems to leverage third-party engines or rely heavily on third-party libraries.

Edited by MachineCode

There was a time when graphics wowed me. Jumping from NES to SNES did that, as did going from that to the PlayStation. Every console thereafter had games that were even more beautiful, and yes I love them, but I didn't get as excited over them. I see most games nowadays and just shrug: yeah, looks nice, but will the game be enjoyable and play without being plagued by a ridiculous amount of bugs, or am I going to get bored before even finishing it? Maybe it's my age, but also perhaps expectations; no more surprises. The SNES had a lot of effects the NES just did not have, but it was still sort of the same, just a super-fied version of it; even if it did wow me at first, that probably had to do with my age and the times. The PlayStation, though, was something totally new for me, with 3D environments and FMV and lots of voice and other stuff; it blew me away like no other console before or since. After that, while everything has improved greatly visually, along with how much is going on and how much smoother the games play, the wow is gone, replaced with just easing in and accepting each jump as what we're used to with advances in technology: obviously going to be better, more stuff, smoother playing, better graphics, but been there, done that. What we need is a drastic change like the one from the SNES to the PlayStation, if done right. I do like my graphics when it comes to higher-end games, though: having tried the 360 with the original cables that came with it and then an HD cable, the difference in quality is so massive that it's painful to look at without the highest-quality picture.

And now, looking back and remembering how awed I was over original PlayStation graphics after having played the 360 for over three years... holy shit. Take Lara Croft in any of the Tomb Raider games on the PS1: pointy boobs, blocky ass. The elements are there, but not so much the beauty, because graphics still had some ways to go; they were only a wow factor then because they were so new to the eyes. Fast forward to Underworld on the 360, and Lara in that wetsuit at the beginning... damn, I could look at that all day. Yet aside from details in the environment that really make a game come to life, a pretty picture alone is not enough to draw you in without a little imagination on the gamer's part. I'm drawn in immediately by the visuals of Tomb Raider Underworld or Resident Evil Revelations 2, but for others like Skyrim, Fallout 3, or State of Decay, sacrifices have to be made for that open-worldness: minuscule details in the environment are lacking, so you find yourself not interacting with many objects but passing through them as if they were holograms, relying on the gamer's imagination to fill in the blanks with whatever other content is available. And if a game can't play without hiccups, constant choppiness, or freezing up, then it doesn't matter how pretty it looks, because they didn't program it right to run smoothly on the console it was made for.

Graphics do make a difference. I used to never think so, but nowadays, yeah. I can still play and enjoy PS1 or PS2 games or whatever, because a good game doesn't suddenly become bad just because it doesn't come close graphically to what's out now (or in my case last gen, since I've yet to keep up completely with the times). But there are other things too, like the controls: games now have far better control over everything from character to camera. I can run smooth circles around anything without suddenly stopping to turn my character, and I don't feel like I'm controlling a tank instead of a person. Yet when it comes to the older games of yore, or any 2D game really, simple pixels of various colors on screen don't bother me, which is why I can easily play NES, SNES, Genesis, Game Boy, DS, whatever, from pretty much any generation; the difference is noticeable for sure, but it's not really a big deal, at least for me.

 


3 hours ago, MachineCode said:

Of course, it’s not 100% what I said. After all, only a Sith deals in absolutes. But it does seem to have trended in that direction.

Out of curiosity, what drives your bias? Do you program game engines for modern games? If so, does the part about work for engine designers still apply as much to individual games in an era where a large portion of games run on the same two third-party engines? Clearly you can’t make a game without programmers, no matter what engine you use.

I’m super curious about engine design. It seems so hard to find good information about engine programming from scratch. Everything I find seems to leverage third-party engines or rely heavily on third-party libraries.

Fair point. I think I did misunderstand your position a bit. I will say artists do have more freedom in recent years, which I think was one of your main points.

My bias comes from working at an engine and graphics programming contracting firm that specializes in UE4 for AAA games. Additionally, I'm fairly new, so a decent chunk of my knowledge comes from my coworkers' stories. Since almost all my coworkers are engine and graphics people, I probably do overestimate how many people work on rendering/engine compared to other disciplines.

But I will say a lot of people assume that just because you're using something like UE4, you don't need engine programmers. While that may be true for smaller projects, once you approach AAA you start to hit a lot of limits within the engine. So in order to get the game running with the desired performance, you'll need a team of engine programmers to modify the engine to remove, or at least work around, those problems. A fair amount of the work we do does end up being small fixes where the hardest part is figuring out the cause (especially for me, since I'm a junior), but even I have gotten free rein on a task just to optimize a system.

Engine design from scratch is a lot of work. But it can be a very enjoyable personal project for some. While I hesitate to call my personal engine a full game engine (I mostly focused on rendering), I found it very fun and I mean to get back to developing it someday.

For resources, here's a random collection of things I remember being helpful, off the top of my head:

That's all I can think of at the moment. If you are looking for something more specific, let me know.


Wow. Thank you so much for your reply and all the info. I'm excited to check all of this out. I'm sure I'll have some questions for you along the way, as game-focused programming is a little newer to me. It was always the goal from day 1, but after 20 years I ended up in the more boring fields of web and iOS app development.


There are a few factors:

1) The improvements are there, but they're more subtle. Framerates, resolutions, etc. are all a lot higher and more stable this generation compared to the Xbox 360 and PS3. You really have to sit down with the games for a while to notice.

2) The CPUs of the PS4 and Xbox One are relatively weak (whereas they were relatively strong in the PS3/X360), which prevented things like really advanced AI and physics. This shouldn't be the case with the PS5 and Xbox Two.

3) Ray tracing, if done right, is the generational leap in quality you're looking for. Not sure what the implementation will be like on the PS5 and Xbox Two, but it has a lot of potential:
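For anyone curious what ray tracing actually computes: at its core, a ray tracer fires a ray through every pixel and tests it against the scene's geometry. Here's a toy sketch of the central ray-sphere intersection test (purely illustrative, nothing like a console's actual hardware implementation):

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Distance t along the ray to the nearest sphere hit, or None.

    Solves |origin + t*direction - center|^2 = radius^2, a quadratic
    in t. Assumes `direction` is normalized (so the quadratic's a == 1).
    """
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    b = 2 * (ox * direction[0] + oy * direction[1] + oz * direction[2])
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4 * c
    if disc < 0:
        return None  # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2  # nearer of the two intersections
    return t if t > 0 else None  # ignore hits behind the ray origin

# A ray down the z-axis toward a unit sphere centered 5 units away
# hits its near surface 4 units from the origin.
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1))
```

Real-time ray tracing runs tests like this (plus bounce rays, shadow rays, and denoising) millions of times per frame, which is why dedicated hardware support matters so much for the next generation.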

 

Edited by Sega Genesis Sage