
Does 60 FPS really matter?


Nintegageo


Yeah, it makes a difference; a 60Hz monitor can't display 120fps. And if a game is running at 30fps, that doesn't mean a 60Hz monitor will slow down to 30Hz; it'll just display the same image twice - or rather, the game itself is programmed to repeat the same frame.

But if you have a VRR (variable refresh rate) monitor, it'll properly support whatever framerate the game is running at, as long as it can go that high, of course.
The biggest advantage of that isn't even going for high framerates, but rather being able to smoothly display uneven framerates. Like, imagine 45fps on a 60Hz monitor: you'd have either tearing, or the same frame shown twice, the next one once, etc. And that's with correct frame pacing; some games are famous for running a lot more unevenly than that.
Not having to buffer frames needlessly helps combat input lag too, so it's a really nice improvement that we probably won't see used widely on consoles for quite a while yet.
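
If you want to see that pacing problem in numbers, here's a rough little sketch (plain C with made-up numbers, not tied to any real graphics API) that just counts how many refreshes of a fixed 60Hz display each game frame stays on screen:

```c
#include <stdio.h>

/* Rough sketch, made-up helper: count how many consecutive refreshes of a
 * fixed-rate display end up showing each game frame.  Time is measured in
 * integer "ticks" of 1/(fps*hz) seconds so the comparison is exact. */
static void show_pacing(int fps, int hz, int frames)
{
    for (int f = 0; f < frames; f++) {
        long ready = (long)f * hz;        /* frame f becomes the newest frame */
        long next  = (long)(f + 1) * hz;  /* ...until frame f+1 replaces it   */
        int shown = 0;
        for (long r = 0; r * fps < next; r++)   /* refresh r happens at r*fps */
            if (r * fps >= ready)
                shown++;
        printf("  frame %d shown for %d refresh(es)\n", f, shown);
    }
}

int main(void)
{
    puts("30fps on a 60Hz display (every frame simply doubled):");
    show_pacing(30, 60, 4);
    puts("45fps on a 60Hz display (uneven pacing):");
    show_pacing(45, 60, 6);
    return 0;
}
```

At 30fps every frame lands on exactly two refreshes; at 45fps you get an uneven 2,1,1 pattern, which is exactly the judder a VRR display avoids by only refreshing when a new frame is actually ready.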

  • Thanks 1

14 minutes ago, Sumez said:

And if a game is running at 30fps, that doesn't mean a 60Hz monitor will slow down to 30Hz; it'll just display the same image twice - or rather, the game itself is programmed to repeat the same frame.

I'm pretty sure that isn't the case, or at least it wasn't for the vast majority of the history of video games.  Unless you're selecting a specific refresh rate setting within a game, the game itself shouldn't have any idea what the refresh rate of your display is--it just outputs graphics as it was designed, and doesn't talk to the hardware to figure out how and when to sync up.  In the realm of computer games, where the refresh rate of a display is most easily (and commonly) able to be changed, it's going to end up being something in the monitor, the video card, the operating system, or some combination of those three that does any sort of frame doubling for 30fps games running on a 60Hz screen (with the video card typically being the most common workhorse in that scenario).

Edit:  Games have been programmed to output at certain framerates (30fps, for example), but still won't know what your display's refresh rate is.  At that point, it would be up to the end user to make sure that the software will run properly on their system, or make whatever hardware setting adjustments are required to have the software run correctly, still leaving the software out of the loop regarding direct hardware control or knowledge about what the actual hardware settings are (without user input).

Edited by darkchylde28

Administrator · Posted
19 hours ago, Dr. Morbis said:

I grew up with the NES and I've never given a shit about frame rates in my entire life.  Maybe on FMV shit on 3DO it would matter when it's chugging along at like 8 frames per second?!?!  But man, in the comparison video posted above, I seriously can't even tell the difference.  I guess it's a good thing I'm not a modern gamer, or everyone would be calling me batshit blind...

 

Take your favorite action NES game. Now go play the PAL version.

Tell me if it feels right.


Editorials Team · Posted
1 hour ago, Sumez said:

Where the 30 to 60 fps distinction is important is when it comes to reactive gameplay - the difference is huge and anyone claiming they can't tell must be really good at lying to themselves

Maybe you're the one lying to yourself about it mattering. 😎  

We're middle-aged dads playing at a low level.  A .03 second difference in reaction ain't gonna matter, unless you're some kind of 20 year old trying to take down Ninja while high on energy drinks.

  • Like 1
  • Haha 1

Editorials Team · Posted
5 minutes ago, Gloves said:

How fucking dare you.

Every year, you get older and slower.  The kids stay the same age.  It's already happening!

...my kids will be better than me in less than 4 years.  So I have to crush them in Smash while I still have the chance!

 


1 hour ago, darkchylde28 said:

the game itself shouldn't have any idea what the refresh rate of your display is--it just outputs graphics as it was designed, and doesn't talk to the hardware to figure out how and when to sync up.

So why or how is vsync a thing then? 🙂 

You could argue a game doesn't talk directly to the hardware, but it talks to a driver that does. Or it talks to an OS that talks to a driver that does. Or it talks to a framework that talks to the OS, etc. But the software absolutely knows if it's targeting a 60Hz output or something else. It needs to know.

1 hour ago, darkchylde28 said:

It's going to end up being something in the monitor, the video card, the operating system, or some combination of those three that does any sort of frame doubling for 30fps games running on a 60Hz screen (with the video card typically being the most common workhorse in that scenario).

I think you're overestimating how much those things do for you. 🙂 Yes, video cards have video memory for flat textures and render surfaces that will help you buffer your frame for multiple subsequent outputs, but it's really up to the software how to use that.
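
As one concrete modern illustration of those layers (SDL2 here, purely as an example - the same idea applies to any other framework or API), the game explicitly asks the OS/driver stack what refresh rate it's presenting at and requests vsync'd buffer swaps:

```c
/* A rough sketch (SDL2 as an example framework; error handling trimmed):
 * the game asks the OS/driver stack what refresh rate it's presenting at,
 * and requests vsync'd presentation so its buffer swaps are paced by it. */
#include <SDL2/SDL.h>
#include <stdio.h>

int main(void)
{
    if (SDL_Init(SDL_INIT_VIDEO) != 0) {
        fprintf(stderr, "SDL_Init failed: %s\n", SDL_GetError());
        return 1;
    }

    SDL_Window *win = SDL_CreateWindow("vsync sketch",
        SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED, 640, 480, 0);

    /* PRESENTVSYNC: SDL_RenderPresent waits for the display's refresh,
     * so the game's frame submission is locked to whatever the output does. */
    SDL_Renderer *ren = SDL_CreateRenderer(win, -1,
        SDL_RENDERER_ACCELERATED | SDL_RENDERER_PRESENTVSYNC);

    /* The software can also ask what it's targeting. */
    SDL_DisplayMode mode;
    if (SDL_GetCurrentDisplayMode(0, &mode) == 0)
        printf("display 0 reports %d Hz\n", mode.refresh_rate);

    for (int frame = 0; frame < 180; frame++) {   /* roughly 3 seconds at 60Hz */
        SDL_SetRenderDrawColor(ren, 30, 30, 30, 255);
        SDL_RenderClear(ren);
        /* ...game rendering would go here... */
        SDL_RenderPresent(ren);   /* buffer swap, paced by vsync */
    }

    SDL_DestroyRenderer(ren);
    SDL_DestroyWindow(win);
    SDL_Quit();
    return 0;
}
```

The monitor never "decides" anything on its own here - the software, through those layers, knows (and has to care) whether it's presenting at 60Hz or something else, and decides how its buffers get reused.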


52 minutes ago, Reed Rothchild said:

Maybe you're the one lying to yourself about it mattering. 😎  

Have you played the remastered version of Dark Souls on something like a PS4 or a PC capable of running it at 60fps? It's an immensely satisfying and immediately noticeable difference to the original.

Does that make the original bad? Of course not, you know I love it. I mean, Blighttown was bad, but that's almost a part of its charm at this point.
But just because the game was fine at 30fps, that doesn't mean doubling it doesn't matter 🙂 The difference was great enough for me to pick it over the portability of the Switch version, which is usually a pretty big factor for me.

  • Agree 1

1 hour ago, Reed Rothchild said:

Every year, you get older and slower.  The kids stay the same age.  It's already happening!

...my kids will be better than me in less than 4 years.  So I have to crush them in Smash while I still have the chance!

This is why you play bullet hells to keep your reflexes strong 😎


32 minutes ago, Sumez said:

So why or how is vsync a thing then? 🙂 

You could argue a game doesn't talk directly to the hardware, but it talks to a driver that does. Or it talks to an OS that talks to a driver that does. Or it talks to a framework that talks to the OS, etc. But the software absolutely knows if it's targeting a 60Hz output or something else. It needs to know.

The stuff you're talking about has only come about, really, in the last 15-20 years of ~45 years of computer gaming.  It's only existed at the console level within the last couple of generations (PS4/PS5 and other consoles of those generations).  So yeah, most of computer gaming has never had that sort of access.  Not saying it's not at all a thing, just not present in the vast majority of all games that have ever been released.

Modern stuff, ok, sure, it's all over the place.  Even with modern stuff, if you set your game, or even your OS, to a particular setting, it has zero true idea whether your hardware can actually handle it or not.  Hence in modern iterations of Windows, to this very day, when you change your resolution, screen refresh, etc., a confirmation window pops up saying something akin to "Is this setting correct?  Please click to confirm" with a little timer counting down to reverse the change if you can't see it.

Some software can or does have autodetect features, but not all, and not nearly all of it is accurate, so again, the software isn't actually determining anything at all beyond what it's told to do--either by the developer who sets certain defaults, or the end user who tells it how it should perform.

39 minutes ago, Sumez said:

I think you're overestimating how much those things do for you. 🙂 Yes, video cards have video memory for flat textures and render surfaces that will help you buffer your frame for multiple subsequent outputs, but it's really up to the software how to use that.

Ok, let's take Digger for example.  CGA game, released in 1983.  How does it know if I'm using a 60Hz screen, and to subsequently double each frame that it outputs?  I've seen it run on a 15kHz monitor, a 30Hz one, and a 60Hz one.  Game runs the same regardless, as if something else is handling matching fps to the screen refresh.  So is it the "dumb" CGA card doing the doubling?  MS-DOS?  It's certainly not Digger knowing what it's hooked up to.

As I said, there are more modern games that are smarter about things, but for the vast majority of the history of video games, games have just output whatever the developers told them to, unless the end user told them to do something different.  I've been on the receiving end of this being a problem, as some games were programmed with default video modes that I didn't have, and thus I either had to figure out how to finagle things from the command line in order to get the game to start in some mode that I could see, or have a friend who could see what they were doing walk me through what keystrokes I needed to press, in sequence, in order to get to a setting that worked on my hardware.  If the game knew what was up, I'd have never been in that situation, and yet there we were.


3 hours ago, Gloves said:

 

Take your favorite action NES game. Now go play the PAL version.

Tell me if it feels right.

Exactly my point: you have to bring in a completely different television standard to make a comparison (and this is for a console where it is pretty well known that almost all Japanese NES games released in PAL regions were not re-programmed to make them properly run at their originally intended speed due to Japanese developers not giving two shits about European end users) in order to support your argument.

I'm gonna go ahead and throw that one in the "W" column right now... 😛


Administrator · Posted
4 minutes ago, Dr. Morbis said:

Exactly my point: you have to bring in a completely different television standard to make a comparison (and this is for a console where it is pretty well known that almost all Japanese NES games released in PAL regions were not re-programmed to make them properly run at their originally intended speed due to Japanese developers not giving two shits about European end users) in order to support your argument.

I'm gonna go ahead and throw that one in the "W" column right now... 😛

The point being that one is 60 and the other is 50. And you WILL notice a difference. 

YouTube videos don't do the difference justice. The one referenced is a terrible example; it doesn't even run at 60fps itself (which is an option on YouTube). You're not noticing a difference because it's not being accurately represented. 


Oh, and just to support my earlier claim that I can't tell the difference in fps, until I watched this video a couple of days ago, I had no idea that certain aspects of TMNT on the NES run at 30 fps, which is literally half the speed of almost all other NES games; I would have gone to my grave not ever knowing or caring about its fps rate.  Well I still don't care, but it was an interesting video:

 

Edited by Dr. Morbis

Administrator · Posted
4 minutes ago, Dr. Morbis said:

Oh, and just to support my earlier claim that I can't tell the difference in fps, until I watched this video a couple of days ago, I had no idea that TMNT on the NES runs at 30 fps, which is literally half the speed of almost all other NES games; I would have gone to my grave not ever knowing or caring about its fps rate.  Well I still don't care, but it was an interesting video:

 

Your claim that you can't tell the difference is not being questioned - the fact, though, is that there IS a difference. 

Honestly, to my knowledge you've not TRIED to tell the difference in any valid way. You'd really need to turn on a game set to 60, play it, then set to 30 and repeat. You'd notice the difference guaranteed, barring any mental or visual impairment on your part. 

  • Haha 1

20 minutes ago, Gloves said:

The point being that one is 60 and the other is 50. And you WILL notice a difference.

I can tell the difference in the speed a PAL NES game runs at versus its NTSC counterpart, absolutely, but I will not notice a difference in fps.  I might notice if I were to watch two videos side by side and really study them closely while looking for this attribute specifically, but that's a whole lot different than noticing it while I'm playing a game and focusing on all of the in-game crap that I'm trying to do...

Edited by Dr. Morbis
  • Like 2

Administrator · Posted
1 minute ago, Dr. Morbis said:

I can tell the difference in the speed a PAL NES game runs at versus its NTSC counterpart, absolutely, but I will not notice a difference in fps.  I might notice if I were to watch two videos side by side and really study them closely while looking for this attribute specifically, but that's a whole lot different than noticing it while I'm playing a game and focusing on all of the in-game crap that I'm trying to do...

It's not something you look for, really - it's visual, yes, but it's something you FEEL. Which, again, is why some YouTube video isn't going to do a good job of anything but misrepresenting the actual difference. 


1 minute ago, Gloves said:

Your claim that you can't tell the difference is not being questioned - the fact, though, is that there IS a difference

I don't doubt for a second that there is a difference; I'm just saying that as someone who doesn't play modern games, and specifically someone who doesn't play first person shooters, the difference in question is so negligible for my gaming tastes that it may as well not exist at all...

  • Agree 1

Administrator · Posted
1 minute ago, Dr. Morbis said:

I don't doubt for a second that there is a difference; I'm just saying that as someone who doesn't play modern games, and specifically someone who doesn't play first person shooters, the difference in question is so negligible for my gaming tastes that it may as well not exist at all...

Well that's totally reasonable and fair and I wish you a good day, sir! 

Hrumph. 


Events Team · Posted

The only thing I'll really add to this conversation is that actually playing a game is different than watching a video of a game. If you watch a video comparing 60fps and 30fps, you're relying on trying to visually spot the difference, which can be difficult. Whereas if you were to actually, say, lock a game at 30fps, play for a couple minutes, then lock it to 60fps and play for another couple minutes, you'd be able to actually feel the difference, which is usually more noticeable, particularly in very quick, fast-paced games such as, say, DOOM Eternal.

Either way though, the most important thing is that the game runs at a stable framerate. The higher the better of course, but stable is best either way.

  • Agree 1

14 minutes ago, Gloves said:

You'd really need to turn on a game set to 60, play it, then set to 30 and repeat.  You'd notice the difference guaranteed, barring any mental or visual impairment on your part. 

That's the point right there: I don't even own a game that has an option to choose frame rates.  I'm not a PC gamer: no Steam account, no Blizzard account, etc.  If my NES or SNES or Genesis or whatever would let me choose the frame rate when I inserted a game, I'd gladly attempt your experiment, but it is such a non-issue for the games that I play that I've literally never come across that option on any of the thousands of (cartridge!!!) games that I own...

Edited by Dr. Morbis
  • Like 1

15 hours ago, darkchylde28 said:

I've seen it run on a 15kHz monitor, a 30Hz one, and a 60Hz one.  Game runs the same regardless, as if something else is handling matching fps to the screen refresh.  So is it the "dumb" CGA card doing the doubling?  MS-DOS?  It's certainly not Digger knowing what it's hooked up to.

Not even sure what you're trying to argue here, but trust me when I say there is a bunch of misunderstanding involved. 🙂 

First of all, 15kHz refers to the number of (horizontal) syncs per second, not frames. And it's a rough range, with some upper and lower leeway. 15kHz allows for 60fps progressive video using NTSC, or 50fps with PAL; with interlaced video you get 30fps or 25fps video, but the monitor is still drawing 60 or 50 fields per second. 30Hz monitors are probably a thing, but it sounds unlikely to me that you've been running an MS-DOS game on one.
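
The arithmetic behind that, using the standard NTSC/PAL line rates (nothing Digger- or monitor-specific, just the textbook numbers):

```c
#include <stdio.h>

/* The standard line rates and line counts (textbook NTSC/PAL values). */
int main(void)
{
    double ntsc_line_rate = 15734.0;  /* horizontal syncs per second (NTSC) */
    double ntsc_lines     = 262.5;    /* scanlines per field                */
    double pal_line_rate  = 15625.0;  /* horizontal syncs per second (PAL)  */
    double pal_lines      = 312.5;    /* scanlines per field                */

    printf("NTSC: %.0f / %.1f = %.2f fields per second\n",
           ntsc_line_rate, ntsc_lines, ntsc_line_rate / ntsc_lines);
    printf("PAL:  %.0f / %.1f = %.2f fields per second\n",
           pal_line_rate, pal_lines, pal_line_rate / pal_lines);
    return 0;
}
```

So the ~15kHz figure is the line rate; the 60 or 50 comes from how many of those lines make up a field.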

And I can't answer for how Digger is programmed, but it does need to send graphics to the operating system's video output somehow, and it needs to know when it's safe to draw to the video card's graphics buffer, and when to switch buffers if double buffering is allowed, so yes, it's something the game needs to consider.
Whether the programmers of Digger did this manually, or relied on a framework with the logic already designed for them, is impossible for me to answer. These things get a little hairy on PC systems where there's always a lot of layers between the hardware and the software.
But if you're running Digger on a modern computer, the game isn't interfacing with a real CGA card anyway; it's being rendered via an emulator which handles all of the stuff I've been talking about. I'm willing to bet there are very few MS-DOS games which are internally programmed to handle multiple different refresh rates.

For a simpler comparison look at something like the NES, where the software is much closer to the metal. Every single NES game (and pretty much any other 2D console or arcade game) is programmed around the timing of the video output. At the vblank period between two frames the game's frame logic needs to be done processing, and then the video memory needs to be prepared for the following frame before it starts drawing.

If the game needs to run at a different refresh rate - which is very much a thing on the NES, since you'd have 50Hz output on European systems - that has to be handled in the game's own code. You can't just run the same game at 50Hz and have it play at the same speed - the TV isn't somehow going to double every fifth frame for you automatically. And that's why so many games just run slightly slower at 50Hz.
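
A rough sketch of that loop structure (plain C with made-up numbers standing in for the real 6502 code, just to show the principle) - one logic update per vblank, so the vblank rate effectively IS the game speed:

```c
#include <stdio.h>

/* Rough sketch, made-up numbers: a NES-style game advances its logic exactly
 * once per vblank, so the refresh rate of the video output IS the game speed. */
struct game { double player_x; };

static void update_one_frame(struct game *g)
{
    g->player_x += 1.5;               /* per-frame movement baked into the game */
}

/* Run the vblank-locked loop for one real second at a given refresh rate. */
static double simulate_one_second(int vblanks_per_second)
{
    struct game g = { 0.0 };
    for (int vblank = 0; vblank < vblanks_per_second; vblank++) {
        update_one_frame(&g);         /* logic must finish before the vblank...  */
        /* ...and video memory would only be touched during the vblank window.   */
    }
    return g.player_x;
}

int main(void)
{
    double ntsc = simulate_one_second(60);   /* NTSC console               */
    double pal  = simulate_one_second(50);   /* the same ROM on a PAL unit */
    printf("after 1 second: NTSC x=%.1f, PAL x=%.1f (PAL at %.0f%% speed)\n",
           ntsc, pal, 100.0 * pal / ntsc);
    return 0;
}
```

Same per-frame movement, 50 updates per second instead of 60, and the whole game lands at around 83% speed unless the logic is rebalanced for PAL.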


17 minutes ago, Sumez said:

Not even sure what you're trying to argue here, but trust me when I say there is a bunch of misunderstanding involved. 🙂

I'm arguing that the vast majority of games don't have a clue (or care, really) what output they're hooked up to, that they're programmed to certain specifications, and it's up to the end user to make sure that those things match up in order for the software to work properly.

And I wasn't arguing that 15kHz was apples to apples with the FPS that a monitor/screen is capable of; I simply included that one because it tends to be the odd duck among older computers, with relatively few computers requiring it for proper output, and some software written for those specific systems refusing to function properly unless that specific video frequency is adapted and/or emulated properly.  As such, for a piece of software that's almost 40 years old to NGAF about what display you've got hooked up (when the available options when it was programmed were text mode, CGA, and monochrome/Hercules), it speaks to the software not being in control of anything graphics-wise, and it being up to the video card, OS, etc., to properly interpret.  Your statement was that games themselves would be programmed to double the frame [if programmed to run at 30fps but being run at 60fps], mine is that the software would have to be aware of what the video hardware was in order to do it.

If software had been that smart across the entire breadth of video games being produced, we wouldn't have been seeing blank screens, "random" crashes and reboots, etc., when running software on hardware that doesn't support it.  Sure, nowadays, with much smarter hardware and software, you'll typically get some sort of incompatibility message or warning when you try to mix and match poorly, but before accelerated 3D graphics became the norm, that just didn't happen 99.99% of the time.  End users were expected to read the requirements on the box, know whether they met them or not, and purchase and then run the software accordingly.  And if they chose poorly, they'd have to figure out what was up, because the software itself wouldn't tell them--it would just run what it could, typically resulting in a black screen because the graphical requirements of the software didn't match up to what was available on the system.

