So, as I understand it, VSync makes a game wait some milliseconds until my monitor is ready to update, and only then paints the next frame on the screen, right? In the Profiler I see that VSync takes 9-12 ms every frame, and I get 70-80 FPS with it turned on. When I turn it off I get 700 FPS, and most importantly, my game seems to run more smoothly. (I can't say for sure, because of the whole "the human eye can't catch more than 60 frames per second" thing, etc.) My target platforms are Android and iOS, so should I turn VSync off to get smoother rendering? (At least for Android; I know that iOS always forces VSync on.) I am just worried about accidentally breaking some internal Unity *thing* and getting phantom bugs later.
No. Especially not on Android, unless you want to nuke the battery on faster phones. iOS will force VSync on anyway, so you should just explore other avenues.
Thanks! I knew there was something to it. BTW, I profiled it on my phone (with VSync ON) and it does not spend any time on that WaitForSeconds function, and I have smooth movement.
You can't get rid of VSync on either iOS or Android. Also, 60 FPS is not required for all games. 30 might be enough for you, and capping it at 30 will decrease battery consumption and, potentially, overheating.
You should ensure your game runs at the correct pace at both 30 and 60 FPS. With VSync forced on, you will run into timing issues if you don't know what delta timing is.
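A rough illustration of delta timing (plain Python standing in for Unity's `Time.deltaTime`; the function and numbers here are made up for the sketch): movement scaled by the frame's delta time covers the same distance per second no matter the frame rate.

```python
def simulate_movement(fps, speed=5.0, seconds=2.0):
    """Advance a position using delta-timed movement for `seconds`
    of game time at a fixed frame rate."""
    dt = 1.0 / fps               # per-frame delta time (Unity: Time.deltaTime)
    position = 0.0
    for _ in range(int(seconds * fps)):
        position += speed * dt   # units per *second*, not per frame
    return position

# Same distance whether the game runs at 30 or 60 FPS (both ~10.0 units):
print(simulate_movement(30), simulate_movement(60))
```

If you instead moved a fixed amount per frame, the game would literally run twice as fast at 60 FPS as at 30, which is the timing issue being warned about here.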
You can cap the frame rate by setting the VSync count to every v blank or every second v blank. On a 60 Hz display this will give you either 60 or 30 FPS.
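In Unity terms that's `QualitySettings.vSyncCount` (1 = every v blank, 2 = every second v blank). The arithmetic behind the cap, sketched in Python purely for illustration:

```python
def capped_fps(refresh_hz, vsync_count):
    """Effective frame rate when a frame is presented only on every
    `vsync_count`-th v-blank of a `refresh_hz` display."""
    return refresh_hz / vsync_count

print(capped_fps(60, 1))  # 60.0 -> every v blank
print(capped_fps(60, 2))  # 30.0 -> every second v blank
```

The same division explains why a 120 Hz phone capped at every second v blank lands on 60 FPS rather than 30.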
30 FPS is enough? Huh, you must be joking. 30 FPS is a laggy thing. I wish I could have over 80 FPS, but 60 is enough in most cases. Can you tell me why that is? Why can't I set 45 FPS, or say 65 FPS? Is that a device/OS restriction?
Not everyone is making FPS games. A puzzle game is usually fine with 30 FPS, and it really helps with battery and overheating. On iOS at least, you can only have frame rates that are integer divisors of 60 (60, 30, 20, 15, 12, 10, etc.), because all it does is skip an integer number of frames while the screen stays at 60 Hz.
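A small sketch of why a non-divisor target judders (illustrative Python, modeling "each frame waits for the next v-blank" with exact fractions; this is a simplified model, not Unity's actual scheduler):

```python
from fractions import Fraction
from math import ceil

REFRESH_HZ = 60                      # assumed 60 Hz panel

def frame_intervals_ms(target_fps, frames):
    """With VSync on, a frame can only appear on a v-blank boundary,
    so each ideal presentation time is rounded up to the next v-blank.
    Fractions keep the arithmetic exact for this sketch."""
    vblank = Fraction(1000, REFRESH_HZ)                  # ~16.7 ms
    shown = [ceil(Fraction(1000 * i, target_fps) / vblank) * vblank
             for i in range(1, frames + 1)]
    return [float(round(b - a, 1))
            for a, b in zip([Fraction(0)] + shown[:-1], shown)]

print(frame_intervals_ms(30, 4))  # [33.3, 33.3, 33.3, 33.3] -> steady
print(frame_intervals_ms(45, 4))  # [33.3, 16.7, 16.7, 33.3] -> uneven, judder
```

A 30 FPS target lands on every second v-blank, so every interval is identical; a 45 FPS target has to alternate between one- and two-v-blank waits, which shows up as visible stutter even though the average rate is "higher".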
@doarp is correct, and the same applies to Android. It really depends on the type of game you're making. And yes, it is an OS restriction: there is no way to turn VSync off.
That is not entirely true. However, similar to what dragon_script mentioned above, technology, unfortunately, has its limits.
Is this a typo? I thought setting it higher, like 60, would decrease battery consumption, not lowering it to 30?
@optimise rendering 30 frames per second is half as much work as rendering 60 frames per second, so running at 30 FPS will decrease battery consumption compared to 60 FPS.
To clarify, a game that can only run at 30 fps due to needing more CPU / GPU performance than the platform has available is just as bad or potentially much worse for the battery than a game that can run over 60 fps. What @aleksandrk is talking about is a game that can run over 60 fps but is being artificially limited to 30 fps via targetFrameRate or vSyncCount. In that case the CPU & GPU are able to either clock down significantly or potentially sleep (almost completely stop using power) for most of the time between updates, greatly reducing battery consumption.
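To put rough numbers on that (an illustrative Python sketch; `idle_time_ms` is a made-up helper approximating what a cap like `Application.targetFrameRate` buys you, not a Unity API):

```python
def idle_time_ms(work_ms, target_fps):
    """Time per frame the CPU/GPU can clock down or sleep when a game
    that finishes its work in `work_ms` is capped to `target_fps`."""
    frame_budget = 1000.0 / target_fps
    return max(0.0, frame_budget - work_ms)

# A light scene needing 8 ms of work per frame:
print(idle_time_ms(8.0, 30))  # ~25.3 ms idle out of every 33.3 ms frame
print(idle_time_ms(8.0, 60))  # ~8.7 ms idle out of every 16.7 ms frame
```

Capped at 30 FPS the hardware is idle roughly three quarters of the time, versus about half at 60 FPS, which is where the battery savings come from. A game that needs the full budget just to hit 30 gets no idle time at all, matching the point above.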
This is a link explaining why we can only have 30 or 60 FPS and nothing in between: https://support.unity3d.com/hc/en-u...game-flips-between-running-at-30FPS-and-60FPS

Also, I have tried my best to optimize my game: reducing draw calls, using low-poly meshes, using render queues, frustum culling, limiting the camera frustum, and optimizing Update and FixedUpdate calls. But I still don't know why my game runs at 30 to 33 FPS whenever I profile it on my target device. The vehicle selection scene shows nearly 60 FPS (this also includes 50% from WaitForPresent, and its breakdown shows WaitForTargetFPS). But when my game scene starts, it drops to 30 to 33 FPS, and the profiler shows the rest of the time as WaitForTargetFPS, again 50% or more.

I am setting VSync to every v blank through Quality Settings, and I am not setting targetFrameRate or vSyncCount through script. I don't know what the cause could be. I have even turned off the whole scene, leaving just a vehicle running on a plane, and I still get this behaviour. I don't know how I could achieve 60 FPS if WaitForTargetFPS is taking this much time.

PS: Months after this post, I came to know that 30 FPS is enough for most games on Android. Cheers.
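The flip the linked article describes comes straight from VSync rounding: a frame that misses a v-blank waits for the next one, so slightly more than 16.7 ms of work immediately costs a whole 33.3 ms frame. A sketch of that arithmetic (illustrative Python, not Unity's actual timing code):

```python
import math

REFRESH_HZ = 60
VBLANK_MS = 1000 / REFRESH_HZ     # ~16.7 ms per refresh

def vsynced_fps(work_ms):
    """Effective frame rate with VSync on: the frame is held until the
    next v-blank after its CPU+GPU work finishes."""
    vblanks_per_frame = math.ceil(work_ms / VBLANK_MS)
    return REFRESH_HZ / vblanks_per_frame

print(vsynced_fps(15.0))  # 60.0 -> fits within one v-blank
print(vsynced_fps(17.0))  # 30.0 -> just missed, waits a full extra v-blank
```

That wait is exactly what shows up in the profiler as WaitForTargetFPS / WaitForPresent, which is why it balloons whenever the scene's frame time creeps past one v-blank.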
So basically, high-intensity parts kick it into high gear, using more fuel, and low-intensity parts like menus kick it into low gear. So far, at 30 FPS my game's frame rate looks like crap when the camera is rotating; it seems to rotate jaggedly unless it's 60 FPS or higher... hmm...
LOL I'm reading this now in 2023 - you know - when the greatest game in the world, Starfield, is *ace* because you know - it's gotta be at 30 fps! Well, *this comment* didn't age too well!
There are other factors at play, even physics: Rigidbody behaviour depends on discrete vs. continuous vs. continuous dynamic collision detection, and on interpolate vs. extrapolate settings. A lot of FPS camera workarounds use a camera placed outside of the player controller hierarchy, but that then means using LateUpdate in scripts that ping the camera orientation and position, as recommended in the Unity documentation; high mouse sensitivity etc. can also have an impact. Yes, with 60 FPS you are getting more snapshots per second to help "smooth out" the problem. It's like how in calculus the area under a curve is approximated by "slabs" like a bar graph: the more slabs you have, the more accurate the approximation. But the improvement you see at 60 FPS is not the real answer.

Think of it like this, too: you make an MMORPG, and some players in the session have the latest super turbo computers, playing against other "Johnnys" who are accessing it via s***ty old subpar gaming laptops that are laggy. How will the overall game experience likely be handled? The dinosaur Johnnys are not gonna be suddenly throttled up magically to join in at 60 or 120 FPS; it's more likely everyone else gets dropped to 30 FPS as a result. Always think of the "lowest common denominator". It's actually a noble, healthy aspiration to optimize so that your game runs really well at 30 FPS; then you know you've got your bases largely covered. Most people want a game they know can run well at 30 FPS. Um, gosh, I wanna play Starfield, what do you mean I have to have a brand new supercomputer that can only play it at 60 FPS? 120 FPS? How's that gonna work out in the long term?
That's why Bethesda decided to throttle it down to 30 fps, and made sure to let everyone know they were doing it, basically guaranteeing everyone in the known universe would be able to join in on the party and not have to worry about being left out of the fun if their system doesn't happen to be all *latest* & *greatest* 'n all.