# How important is 16ms (or less)?

Discussion in 'General Discussion' started by G_Trex, Aug 4, 2020.

1. ### G_Trex

Joined:
Apr 20, 2013
Posts:
100
Wasn't sure where to put this since it's not an issue, just a general question about game engines.

We all know (or should know) that when it comes to game performance, we try to get everything running at 16 ms or less. Even Unreal has the same rule.

How stringent is this, and what are the consequences of going over?

What will 16.5 ms do to the game?

I know what the ms is and why it's important to keep it low, I've just never seen anyone discuss why that specific amount is significant or the consequences of exceeding it. I'm curious.

Deleted User likes this.
2. ### Acissathar

Joined:
Jun 24, 2011
Posts:
677
From what I understand, it's to help ensure 60fps / 60 game loops in a second.

1000 ms = 1 second, and 16.6 ms per loop × 60 loops = 0.996 seconds.
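For reference, the conversion works both ways; a quick sketch in plain Python (just the arithmetic, nothing Unity-specific):

```python
def fps_to_ms(fps):
    """Frame-time budget in milliseconds for a target frame rate."""
    return 1000.0 / fps

def ms_to_fps(frame_time_ms):
    """Frame rate achieved at a given per-frame time."""
    return 1000.0 / frame_time_ms

print(round(fps_to_ms(60), 2))  # 16.67 ms per frame
print(round(ms_to_fps(33.33)))  # 30 fps
```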

Rewaken, G_Trex, angrypenguin and 3 others like this.
3. ### AlanMattano

Joined:
Aug 22, 2013
Posts:
1,502
In my personal experience, once a game drops under 20 frames per second (fps) you need to link the user inputs directly to the physics, so that reactions are instant under the hood (even if the game drops frames). If the frame rate is low, the player can suffer nausea, especially in the long run. For example, in my game the frame rate drops most at the airport, which is exactly when the player (pilot) pulls back the joystick to flare, so I try to give good feedback there. For a first-person shooter, Linus demonstrated that 240 fps (4 ms) is better.
A TV or monitor runs at 60 Hz, which equals 60 fps (16.67 ms).

(1 / 60 fps) × 1000 = 16.67 ms

Code (CSharp):
using UnityEngine;

public class FramesPerSecond : MonoBehaviour
{
    // Frame time in milliseconds for a given frame rate.
    public float fps = 60f;
    public float ms;

    void Update()
    {
        ms = (1f / fps) * 1000f;
    }
}
When you move the mouse sideways in a first-person game to rotate the camera, and the frame rate is lower than the monitor's refresh rate, you will see a distorted cut line across the screen. So I presume that is one reason.
Nausea: I experience nausea at 30 fps (33 ms), as on an iPad Pro. The onset of nausea is delayed further as fps increases.
For long play sessions or VR, 90 Hz (11 ms) or 120 Hz (8 ms) is much better.

What will 16.5 ms do to the game?
It is fine for a game that runs on a monitor. It may not be good enough for a comfortable VR experience or a competitive first-person shooter.

How about 17 or 18 ms?
58 fps is not a problem, but there will occasionally be some distortion when the screen updates: screen tearing.

Mobile games usually run at 30 fps. Many games with a low-budget GPU and very high performance demands run at 40 fps at launch. But the user can experience nausea after several hours of playing.

Last edited: Aug 4, 2020
G_Trex and Deleted User like this.
4. ### Ryiah

Joined:
Oct 11, 2012
Posts:
21,937
This. 6.94 ms is 144 FPS. 16.67 ms is 60 FPS. 33.34 ms is 30 FPS. A low frame rate is fine on console hardware because it's been the norm long enough that people have accepted it, but on PC it's expected your game will hit 60 FPS, and many hardcore players try their best to achieve 144 FPS or greater. There are monitors that can handle 300 Hz now.

Rewaken and G_Trex like this.
5. ### EternalAmbiguity

Joined:
Dec 27, 2014
Posts:
3,146
As mentioned above it's about the framerate.

1 frame / 0.0167 second ≈ 60 fps.

Your monitor has a default rate at which it refreshes the screen (typically 60 Hz). If you send "too few" frames, it has to wait until the next refresh, up to an extra 16.6 ms, to display a frame that may have arrived only 5 ms into the current interval. Or it will change the frame in the middle of a screen refresh, which produces screen tearing.
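That wait can be computed directly. A rough sketch in plain Python (hypothetical numbers) of when a finished frame is actually shown on a 60 Hz display with V-sync:

```python
import math

REFRESH_MS = 1000.0 / 60.0  # one 60 Hz refresh interval, ~16.67 ms

def display_time(finish_ms):
    """A frame that finishes rendering at `finish_ms` is held until
    the next refresh boundary before it appears on screen."""
    return math.ceil(finish_ms / REFRESH_MS) * REFRESH_MS

# A frame ready 5 ms into the interval still waits for the next refresh:
shown_at = display_time(5.0)
print(round(shown_at, 2))        # 16.67
print(round(shown_at - 5.0, 2))  # 11.67 ms spent waiting
```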

6. ### kdgalla

Joined:
Mar 15, 2013
Posts:
4,723
60 fps is a common monitor refresh rate because it is comfortable to view for most people. Anything less starts to look choppy, though how irritating that is varies from person to person. It also depends hugely on the type of game and how fast-paced and competitive it is. Lower frame rates make the game feel laggy, and competitive players are very precise and get frustrated when the controls feel anything less than instantaneous. People who are good at online shooter games generally demand 60 fps or more, but for a point-and-click adventure game it wouldn't matter so much.

There have been some big-studio action/adventure games which were released capped at 30 fps on console. Most people were fine with this, but you always hear from a few angry people who refuse to buy these. This is just the first example I happened to find googling. Notice the comments:
https://www.playstationlifestyle.ne...frame-rate-capped-at-30fps-epic-explains-why/

Also, if you can't achieve a consistent frame rate, that will be even more irritating than a lower frame rate.

G_Trex likes this.
7. ### Moonjump

Joined:
Apr 15, 2010
Posts:
2,574
As said, 16.66 ms relates to 60 FPS. You could have a game without fast motion that is fine with 33.33 ms for 30 FPS, or a really fast game that needs a much higher frame rate. And there is Apple ProMotion, which does 120 FPS at 8.33 ms on some of their devices.

And as has been written while I wrote this (delayed by Skype chat with publisher), consistency is important.

G_Trex likes this.
8. ### EternalAmbiguity

Joined:
Dec 27, 2014
Posts:
3,146
Just want to point out that 30 fps on console is actually the rule rather than the exception. This has started to change recently with the introduction of higher-power consoles (PS4 Pro and X1X) and of "performance modes," but usually the experience is 30 fps.

G_Trex, angrypenguin and Joe-Censored like this.
9. ### kdgalla

Joined:
Mar 15, 2013
Posts:
4,723
I don't really notice either way because I'm really bad at video games.

EternalAmbiguity likes this.
10. ### EternalAmbiguity

Joined:
Dec 27, 2014
Posts:
3,146
Well, the main effect which hasn't yet been mentioned is that with a longer frametime (33.3 instead of 16.67) you have more "room" for better graphics or more intensive CPU calculations.

It's a balancing act between performance and presentation.

G_Trex, angrypenguin and aer0ace like this.
11. ### aer0ace

Joined:
May 11, 2012
Posts:
1,513
Somebody had better contact the MPAA, because their movies have been running at 24 FPS for decades. Aside from the Hobbit series, which ran at double that and made a lot of moviegoers nauseous.

But I get it... Games are interactive, and the screen refresh had better be quick enough for me to react to it. A quick Google search for the quickest human reaction time gives 0.15 seconds, with the average at around 0.2 seconds.

I can never really understand people who absolutely require something greater than 30 fps. Sure, you can "tell" the difference between 30 and 60, but I never felt that it was enough to decide whether I would play the game or not.

G_Trex and angrypenguin like this.
12. ### kdgalla

Joined:
Mar 15, 2013
Posts:
4,723
Actually, 24 frames a second is noticeably jarring, and this was noticed in the very early days of film production. Traditional films are projected with a shutter flickering at 72 flashes per second (each frame shown three times) to mask the transition between frames.

Ryiah and aer0ace like this.
13. ### Zuntatos

Joined:
Nov 18, 2012
Posts:
612
High FPS (60 instead of 30) is essential if you want to quickly react and readjust things like mouse aim. The extra frames really help with extrapolating where things are going (both movement on screen and where your aim is heading). Tests show a noticeable improvement even from 60 to 144 Hz, though it is significantly less than the improvement from 30 to 60.

That said, you can get pretty used to 30 fps, and if the game doesn't lean heavily on predicting movement or mouse aim, 30 fps can do fine. The same goes if you are playing as a "casual" whose skill is still more of a limit than the hardware. Anything with a controller can be fine input-wise at 30 fps; stick imprecision is a bigger factor there, I'd guess.

aer0ace likes this.
14. ### ShilohGames

Joined:
Mar 24, 2014
Posts:
3,032
I can tell a huge difference between 30 FPS and 60 FPS. 30 FPS feels terrible to me. I can even feel a big difference between 60 FPS and 144 FPS.

24 FPS is for non-interactive movies. 24 FPS would feel completely horrible in a competitive first person shooter style game. Also, some games run certain visual effects at lower frame rates for stylistic reasons, but still run the game at 60 FPS to make sure it feels smooth to play.

G_Trex, EternalAmbiguity and aer0ace like this.
15. ### ShilohGames

Joined:
Mar 24, 2014
Posts:
3,032
With regards to your question about 17 or 18 ms instead of 16 ms: if you have V-sync disabled, it probably won't affect gameplay too much. But if you have V-sync enabled, each time your frame time is 17 or 18 ms, your game will briefly sync to 30 FPS instead of 60 FPS. That is just how V-sync works, unfortunately. With V-sync enabled, 17 ms frame times cause the game to bounce between 60 FPS and 30 FPS. It is very jarring.
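The arithmetic behind that bounce, sketched in plain Python (illustrative, not engine code): with V-sync a frame can only be swapped on a refresh boundary, so the render time is effectively rounded up to a whole number of 16.67 ms intervals.

```python
import math

REFRESH_MS = 1000.0 / 60.0  # 60 Hz refresh interval

def effective_fps(render_ms):
    """With V-sync on, a frame waits for the next refresh boundary,
    so its effective frame time is a multiple of the interval."""
    intervals = math.ceil(render_ms / REFRESH_MS)
    return 1000.0 / (intervals * REFRESH_MS)

print(round(effective_fps(16.0)))  # 60  (made the deadline)
print(round(effective_fps(17.0)))  # 30  (missed it, waits a full extra refresh)
```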

16. ### Zuntatos

Joined:
Nov 18, 2012
Posts:
612
Triple-buffered V-sync stops this 30-60 jumping, right? But it introduces basically one extra frame of latency on top of the V-sync latency.

17. ### neginfinity

Joined:
Jan 27, 2013
Posts:
13,669

Divide 1000 by the number of milliseconds per frame and you'll get the frame rate of your game.

16.5 ms is roughly 60 fps (1000 / 16.5 ≈ 60.6).

G_Trex likes this.
18. ### Kiwasi

Joined:
Dec 5, 2013
Posts:
16,860
No no no no. That's not the rule at all. The rule is everything related to rendering should come in at 16ms or less. This is so that your frame rate stays consistent.

Everything else can take much longer. Depending on your game, input doesn't need to happen every frame (although it should be close). And some things like path finding can actually take seconds to calculate without affecting your player's experience.

So get everything that relates directly to rendering optimized, and shift everything else off the main thread.
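A minimal sketch of that split, in plain Python rather than Unity's API (the 0.2 s "pathfinding" job and the 16 ms "frame" are made-up numbers): a slow job runs on a background thread while the main loop keeps producing frames.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def find_path():
    """Stand-in for an expensive pathfinding calculation."""
    time.sleep(0.2)  # many times the 16 ms frame budget
    return ["A", "B", "C"]

with ThreadPoolExecutor(max_workers=1) as executor:
    future = executor.submit(find_path)  # kicked off; main loop not blocked

    frames = 0
    while not future.done():
        time.sleep(0.016)  # "render" one frame while the job is in flight
        frames += 1

    path = future.result()

print(path)    # ['A', 'B', 'C']
print(frames)  # roughly a dozen frames ran during the calculation
```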

19. ### unitedone3D

Joined:
Jul 29, 2017
Posts:
162
Dear G_Trex, just my 2 cents.

In my POV, frame rate standards are all over the place... I am also a bit surprised when I hear people say "no difference between 30 or 60"... that amazes me. It may be that some people have difficulty detecting the difference, or don't care at all and put no eye investment into it; it takes a trained eye to see it. 60 fps is fluid; 30 fps is less fluid and feels slightly jarring/sluggish/jerky. There is a lack of something in the movement, though you can still 'read' it well enough.

As others said, 24 fps is the cinema standard for film. Long story, but it was chosen for several reasons: convenience, needing fewer frames, and less film stock back then (now it's digital DCP files). 24 fps also has a 'dreamy' quality called strobing, kind of like when you put your hand in front of the monitor and move it fast: you see discrete 'frames' of your hand moving, the jerky strobing effect of missing frames. It was called dreamy because 24 fps gives an 'old' feeling to films, so they feel like they happened in the past instead of being shot live. Live TV and reality TV depend on a higher frame rate; in film that would be 48 fps, double 24, and it feels more 'lively', like it was shot yesterday.

Just my 2 cents.