
Application.targetFrameRate doesn't work in version 2020.2.0b5.3233

Discussion in '2020.2 Beta' started by Eck0, Oct 4, 2020.

  1. Eck0

    Eck0

    Joined:
    Jun 6, 2017
    Posts:
    38
    When I use Application.targetFrameRate = 100 it doesn't work; I get fps above 100.
     
  2. Peter77

    Peter77

    QA Jesus

    Joined:
    Jun 12, 2013
    Posts:
    5,478
    What fps does it run at precisely if you set it to 100? Is it a huge gap or just like 101?
    Where doesn't it work? In the Editor, in a Build or both?
    What platforms are affected?
     
  3. TheZombieKiller

    TheZombieKiller

    Joined:
    Feb 8, 2013
    Posts:
    119
    Do you have vsync enabled? As far as I know, targetFrameRate is completely ignored in that case.
     
  4. Eck0

    Eck0

    Joined:
    Jun 6, 2017
    Posts:
    38
    It is a huge gap.

    I have tried to force QualitySettings.vSyncCount = 0 and it remains the same.

    This started happening with this version that I have installed. Does the same thing happen to anyone else?
     
  5. Hyp-X

    Hyp-X

    Joined:
    Jun 24, 2015
    Posts:
    364
    I tested on 2020.2.0b5 on Windows 10 in the Editor play mode and
    - Application.targetFrameRate works for me with vSync enabled or disabled
    - Vsync doesn't work (Application.targetFrameRate not set), I get 200+ fps on a standard 60Hz monitor
     
  6. Tautvydas-Zilys

    Tautvydas-Zilys

    Unity Technologies

    Joined:
    Jul 25, 2013
    Posts:
    8,023
    Please don't use Application.targetFrameRate unless you're running on mobile. It's horrible for input latency and will introduce noticeable stuttering.
     
  7. polemical

    polemical

    Joined:
    Jun 17, 2019
    Posts:
    702
    I just tried in 2020.2.0b5 on Windows 10 - setting target framerate to 42 (chosen at random), stats shows fluctuation between 41.6-42.0 but doesn't exceed.

    WebGL (in particular) is the one platform where I've noticed that you never, ever want to use it. In a project that had a consistent 60 fps without setting it, explicitly setting it to 60 caused stuttering and generally poor performance by comparison. Set it to -1 for the best performance. However, as noted above, it appears to work as expected on Android (and is a good approach for either overriding the default of 30 with 60, or throttling the frame rate yourself to keep the temperature down).
     
  8. LeonhardP

    LeonhardP

    Unity Technologies

    Joined:
    Jul 4, 2016
    Posts:
    2,365
    It would be great if you could send us a report for the vsync issue. Is it project/machine specific or does it happen across the board?
     
  9. rz_0lento

    rz_0lento

    Joined:
    Oct 8, 2013
    Posts:
    1,927
    Does VSync even work in the editor? In past it definitely didn't work properly there.

    I'd personally only worry if these don't work in actual builds as that's only place where it matters.
     
  10. Grimreaper358

    Grimreaper358

    Joined:
    Apr 8, 2013
    Posts:
    737
    What's the best way to limit framerate in Unity for PC or console without VSync?

    A lot of games have built-in frame rate limiters, and a lot of players like using them to get a locked FPS if possible and a smooth frametime. Currently, for games that don't support it, most people just use MSI Afterburner with RivaTuner to set a locked framerate.
     
  11. Tautvydas-Zilys

    Tautvydas-Zilys

    Unity Technologies

    Joined:
    Jul 25, 2013
    Posts:
    8,023
    Why don't you want to use VSync? Any way you come up with will be mimicking what VSync does, but will do it considerably worse because it will insert the sleep at the wrong place in the game loop. And you will still get tearing.
     
    GliderGuy and Neonlyte like this.
  12. Grimreaper358

    Grimreaper358

    Joined:
    Apr 8, 2013
    Posts:
    737
    From what I've gathered, not everyone likes to use VSync because you also get input latency when using it. Even if they are using a lower-Hz monitor than what their system can achieve with the game, most would run it unlocked or set a higher target frame rate to get better input latency. Sometimes they might also want to cap the framerate without using VSync, so I'm just trying to see if there's a better way than Application.targetFrameRate.
     
  13. Tautvydas-Zilys

    Tautvydas-Zilys

    Unity Technologies

    Joined:
    Jul 25, 2013
    Posts:
    8,023
    Well the input latency argument just doesn't make sense. Sure, if you disable vsync then your input latency will be slightly improved (I talk a bit about it in the blog post here). But all those improvements go out the window (and it actually gets worse than with vsync on) when you start limiting frame rate artificially. I encourage you to test input latency with a slow motion camera and see the results for yourself with various frame limiting techniques.

    There is this notion that VSync is bad and should be avoided. It's a popular urban legend among PC gamers. There are cases where you might want to turn it off (like you can't hit frame rate that's equal to your refresh rate, or you want to run at significantly higher frame rate), but due to the same reasons you don't want to limit frame rate there either.
     
    GliderGuy likes this.
  14. rz_0lento

    rz_0lento

    Joined:
    Oct 8, 2013
    Posts:
    1,927
    Just to reiterate the above posts:

    The point of a frame limiter on PC is that some gamers want to disable VSync but still have some cap on the rendering: instead of using VSync to limit your frames to, say, 60 or 144 Hz, you limit them to 250 fps.

    If your input updates once per game loop/Unity's Update, having higher rendering rate inevitably does lower the input latency as you are more likely to get input response closer to your rendered frame.

    I'd get the argument that using this mechanism is bad if you actually target framerates close to what VSync would offer, or less, but I don't see how it's bad if the player is fine with tearing or is using G-Sync etc. (If you use G-Sync without VSync enabled in game, it still prevents tearing while the game updates within your monitor's refresh rate range, so you get the best of both worlds: no tearing, and less latency when your system can render frames faster than the monitor can display them.)

    Also, input latency is really something that mainly competitive gamers care about. If it's a casual game, it's not going to be any kind of factor.

    Edit: here's one example:
     
    Last edited: Oct 6, 2020
    goncalo-vasconcelos likes this.
  15. Hyp-X

    Hyp-X

    Joined:
    Jun 24, 2015
    Posts:
    364
    If you want input latency as low as possible, you shouldn't use an engine with a "render thread" that can add +1 frame of latency...
     
  16. Tautvydas-Zilys

    Tautvydas-Zilys

    Unity Technologies

    Joined:
    Jul 25, 2013
    Posts:
    8,023
    I'll repeat myself: if you want lower latency, why limit frame rate at all?
     
  17. Lars-Steenhoff

    Lars-Steenhoff

    Joined:
    Aug 7, 2007
    Posts:
    2,896
    Could be for competitive e-sports: all players at the same high frame rate, 240 fps for example.
     
    goncalo-vasconcelos likes this.
  18. Tautvydas-Zilys

    Tautvydas-Zilys

    Unity Technologies

    Joined:
    Jul 25, 2013
    Posts:
    8,023
    If you're in competitive e-sports, you better get a monitor that can display all those 240 fps...
     
  19. Lars-Steenhoff

    Lars-Steenhoff

    Joined:
    Aug 7, 2007
    Posts:
    2,896
  20. rz_0lento

    rz_0lento

    Joined:
    Oct 8, 2013
    Posts:
    1,927
    If a game renders at a ridiculously high frame rate, it will introduce coil whine on many GPUs, and also increase power consumption, heat, etc. Having some sane cap still gives you most of the gains without all the negatives.

    I agree that for competitive gaming people would want those high-refresh-rate monitors for sure, but it's not possible for everyone, plus running without VSync does help gamers on 60 Hz monitors.
     
    goncalo-vasconcelos likes this.
  21. rz_0lento

    rz_0lento

    Joined:
    Oct 8, 2013
    Posts:
    1,927
    The point was more that you'd better use a monitor that can also show all those frames to get the full gain of the extra frames (instead of just getting lower input latency from a faster game loop iteration). We have a lot of 240 Hz monitors now, and even some 360 Hz ones.
     
    goncalo-vasconcelos likes this.
  22. Grimreaper358

    Grimreaper358

    Joined:
    Apr 8, 2013
    Posts:
    737
    I'm more interested in having a cap for those who want to use it. So far I've seen that hardly anyone on PC uses VSync to cap their framerate; they'd rather let it push the maximum frames possible, or use Adaptive Sync or G-Sync if available. I guess that's more of a mobile and console thing, as even PC gamers with 60 Hz displays don't run games at 60 Hz, but let them run uncapped or cap them manually at whatever framerate they desire. Having it uncapped increases the latency more than having it capped, so providing a limiter is a good option.

    A few games that let you do this now are Modern Warfare/Warzone, Overwatch, CS:GO, and Battlefield 5. A limiter is also useful to reduce GPU load if you want to do something else on your PC, like streaming or recording, while running at a set framerate. Currently, if a game doesn't have the option built in, most people who want this feature will just use RivaTuner Statistics Server, or their GPU driver options, as both AMD and NVIDIA provide an option to do so. Both of these options can cause issues and are usually slower than an in-game limiter.
    I'm just trying to see if there's a better way than Application.targetFrameRate, or is this the only option?


    There's also this video going over all of these options with tests -
     
  23. Tautvydas-Zilys

    Tautvydas-Zilys

    Unity Technologies

    Joined:
    Jul 25, 2013
    Posts:
    8,023
    I can see how those things can be useful. There are still a few issues with having built-in frame limiter.

    Firstly, how do you implement it? Application.targetFrameRate uses Sleep(), which isn't accurate. It's not suitable for high frame rate applications. It works well if you're sleeping 15+ ms per frame (like when you're limiting frame rate to 30 on mobile), but it totally breaks down when you want to limit to something like 240 fps. You have 4 ms per frame; let's say actually doing everything takes 1.5 ms, so now you need to sleep for 2.5 ms. Overshooting by even 1 ms means that you now drop to 200 fps instead of 240 fps. You could use a busy loop and query time over and over again in Update(), but then all the power savings are lost.
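    To put numbers on that overshoot argument, here is a back-of-the-envelope sketch (plain Python for illustration, not Unity code; the figures follow the post above):

```python
def achieved_fps(target_fps, work_ms, oversleep_ms):
    """Frame rate achieved when a limiter sleeps away the rest of the
    frame budget but the OS wakes it up `oversleep_ms` late."""
    budget_ms = 1000.0 / target_fps        # ~4.2 ms per frame at 240 fps
    sleep_ms = budget_ms - work_ms         # requested sleep
    frame_ms = work_ms + sleep_ms + oversleep_ms
    return 1000.0 / frame_ms

# The same 1 ms oversleep is severe at 240 fps and harmless at 30 fps:
print(achieved_fps(240, 1.5, 1.0))  # ~193.5 fps instead of 240
print(achieved_fps(30, 1.5, 1.0))   # ~29.1 fps instead of 30
```

    The asymmetry is the whole point: the higher the target frame rate, the larger a fraction of the frame budget a fixed oversleep eats.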

    Secondly, it has high potential for incorrect usage. Many people think "VSync is bad" and would gladly disable it and limit their game to 60 fps. If they do that, the game will look like crap on 75 Hz or 144 Hz monitors. It will just be a stuttery mess. Now, you might say it won't happen if you set the limit to 240 fps. But then your game won't look good on 360 Hz monitors, because 360 isn't a multiple of 240. Now let's say that in ten years, 1000 Hz monitors get introduced. You see where I'm going with this?

    G-Sync is essentially VSync without a fixed timestep. Anyway, there definitely exists an urban legend that says "VSYNC IS BAD", but it really depends on the specific engine and monitor setup.

    I don't think that statement about the input latency is correct. I did extensive tests with a slow-motion camera in Unity on the Windows standalone player. The absolute lowest latency can be achieved by running at an unlimited frame rate without VSync. Pushing 900 fps actually made the latency lower than the Windows cursor's. The lower the frame rate got (by artificially limiting it), the more the latency increased. When the frame rate was limited to what VSync would have limited it to, the achieved latency was significantly worse than with VSync enabled and QualitySettings.maxQueuedFrames set to 1. This is a natural consequence of VSync forcing the frame to be aligned with the monitor's refresh rate.

    It could be that my findings are incorrect. If you have any proof that they're not, please tell me! Honestly, all this stuff is pretty complicated. But trust me, I have definitely done my part of the research :).
     
  24. Grimreaper358

    Grimreaper358

    Joined:
    Apr 8, 2013
    Posts:
    737

    Yea, I've also seen tests where some engines have higher latency with a framerate limit set, so I guess this includes Unity as well. It's just VSync and unlocked framerate options for PC then.
     
  25. Kamyker

    Kamyker

    Joined:
    May 14, 2013
    Posts:
    574
    As stated in the Battle(non)sense videos, using the GPU at 100% increases latency. I'm running a game that targets these 240-360 Hz monitors, and I'd like to avoid wasting energy (and laptop battery), generating heat, and ultimately shortening hardware lifespan:


    From my esports experience, many players prefer stable fps over the highest fps. When someone has a 240 Hz monitor and their fps is 140-240, they would limit it to 180, as framerate drops are worse than slightly lower but stable fps.

    I've noticed in my game that using targetFrameRate = 240 on my 240 Hz monitor is a bit laggy, so I increased the default fps limit for all players to 500 and it seems better, but players with worse hardware may struggle to get stable fps.
    I guess it is possible, as some other games don't have this issue. Google results show some better methods, like platform-specific calls and sub-ms timers.

    Probably the simplest one is to sleep for less than required, check how long it really slept, and sleep again if the result is far from the target - that could be better, as it's "targetFrameRate", not "maxFps".
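    That undersleep-then-spin idea can be sketched outside Unity; here is a minimal Python illustration (time.sleep standing in for the OS Sleep call; all names here are made up for the sketch):

```python
import time

def wait_until(deadline, slack=0.002):
    """Hybrid frame wait: coarse OS sleep until `slack` seconds before
    the deadline, then spin on the high-resolution clock for the rest."""
    remaining = deadline - time.perf_counter()
    if remaining > slack:
        time.sleep(remaining - slack)    # cheap, but may overshoot slightly
    while time.perf_counter() < deadline:
        pass                             # exact, but burns CPU cycles

def run_frames(target_fps, n_frames, work):
    period = 1.0 / target_fps
    deadline = time.perf_counter() + period
    for _ in range(n_frames):
        work()
        wait_until(deadline)
        deadline += period               # absolute schedule: no drift accumulation

start = time.perf_counter()
run_frames(100, 10, lambda: None)        # 10 frames at 100 fps, ~0.1 s total
elapsed = time.perf_counter() - start
print(elapsed)
```

    The spin at the end is what buys the accuracy; the `slack` parameter trades CPU burn against the risk of the coarse sleep overshooting the deadline.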
     
  26. Kamyker

    Kamyker

    Joined:
    May 14, 2013
    Posts:
    574
    Urban legend???
     
  27. Tautvydas-Zilys

    Tautvydas-Zilys

    Unity Technologies

    Joined:
    Jul 25, 2013
    Posts:
    8,023
    That directly contradicts my testing, though...

    Wouldn't limiting the frame rate to 180 cause the fps to wiggle between 140 and 180?

    Now we're talking about a specific case, great. Did you measure the input latency between VSync on and off with targetFrameRate set? Did you measure the frame rate stability in both cases? Did you consider reducing QualitySettings.maxQueuedFrames to 1 to improve latency, if you're able to achieve such high frame rates in your game, instead of turning VSync off?

    There's no platform-specific sub-ms timer on Windows (I'm assuming Windows, because we're talking about PC gaming and that's where the majority of PC gaming happens). Since Windows is not an RTOS, you cannot guarantee that you will be woken up after 1 ms if you decide to sleep for 1 ms. For all you know, you might be woken up 10 ms later. That's the biggest issue with Application.targetFrameRate. Every time I tested it, it resulted in huge frame timing swings.
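    The wakeup imprecision is easy to observe outside Unity too; for example in Python, whose time.sleep wraps the same kind of OS wait:

```python
import time

# Request a 1 ms sleep repeatedly and record the oversleep. On a
# non-realtime OS the requested duration is only a lower bound; the
# wakeup can arrive several milliseconds late depending on the
# system timer resolution and scheduler load.
oversleeps = []
for _ in range(20):
    t0 = time.perf_counter()
    time.sleep(0.001)
    oversleeps.append((time.perf_counter() - t0) - 0.001)

print(max(oversleeps))  # frequently well above zero
```

    The exact numbers vary by OS and machine, but the oversleep is never negative: the OS promises "at least this long", not "exactly this long".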

    Now do the measurement in Unity (preferably 2020.2 since a lot of work on this went into there) :). It definitely depends on the engine and other factors.
     
    steinbitglis, GliderGuy and Prodigga like this.
  28. Kamyker

    Kamyker

    Joined:
    May 14, 2013
    Posts:
    574
    Yes, but dropping from 240 to 140 is a lot more noticeable and annoying than from 180 to 140.
     
    goncalo-vasconcelos likes this.
  29. Digika

    Digika

    Joined:
    Jan 7, 2018
    Posts:
    96
    What am I reading.jpg

    Because in general (read: in game engines that have implemented frame limiting far better than Unity), we have a wealth of good testing data (including popular tech reviewers like Gamers Nexus or Battle(non)sense) showing that using a frame limiter to cap the framerate 1:1 to the refresh rate, versus using full VSync (also locked 1:1 to the refresh rate), produces 30-40% lower latency. The explanation for that is fairly obvious, and here is a good talk from the Nvidia guys where they discuss the VSync-induced penalty and their (pretty terrible) attempted solution:


    In general, if you can't afford to have the framerate completely uncapped (overheating, power draw, etc.) but you still want lower latency, then you use a frame limiter and take an image-quality hit in the form of screen tearing. For example, external limiters like RivaTuner or Nvidia's new built-in GeForce frame limiter do a very good job with minimal overhead latency. Of course, ideally the engine would do this job properly with no overhead at all, but we know that Unity, for example, can't do it well, so we use external tools. For Overwatch, Valorant, or the CoD games, it is usually recommended to use their built-in limiters, as they do a better job at this.

    Subtle id Tech 6 advertisement.

    Moving goalposts.

    You can't just do it, because it requires quite expensive gear plus experience. Asking a random person on a Unity forum to do something like that is kind of facetious and disingenuous. You know the answer you'll get.
    It would be nice if Battle(non)sense could test Unity engine games, but I don't think he has time to do separate tests for specific niche engines that aren't even relevant to the competitive scene.
     
    Last edited: Oct 19, 2020
  30. Tautvydas-Zilys

    Tautvydas-Zilys

    Unity Technologies

    Joined:
    Jul 25, 2013
    Posts:
    8,023
    It's actually quite simple. You only need a slow-motion camera (anything with a higher frame rate than your monitor's refresh rate will do), and most smartphones nowadays have one. Create a project with a script that draws a texture at the mouse cursor:

    Code (csharp):
    using UnityEngine;

    public class FollowMouse : MonoBehaviour
    {
        Vector2 m_MousePosition;

        [SerializeField]
        Texture m_Texture;

        void Update()
        {
            m_MousePosition = Input.mousePosition;
            // Input.mousePosition has y = 0 at the bottom; GUI coordinates have it at the top
            m_MousePosition.y = Screen.height - m_MousePosition.y - 1;
        }

        void OnGUI()
        {
            GUI.DrawTexture(new Rect(m_MousePosition.x, m_MousePosition.y, 64, 64), m_Texture);
        }
    }
    Then build the project, run it, and film it with the slow-motion camera. Finally, look at how far the texture lags behind the actual cursor.

    Anyway, I tested this extensively and am merely telling you what I discovered. Comparing input latency between Unity and games made in other engines is like comparing apples to oranges. While you may see similar trends, ultimately the game loops are structured differently and will behave differently. My point was this: if your measurements in your game made in Unity are different, we can discuss it and see where we can go from there. But so far there hasn't been any evidence that the "frame limit" thing is better for input latency, and my personal tests directly contradict it.

    EDIT: I just watched the video you linked. They're not comparing 60 fps vsync vs 60 fps no-vsync frame limiter. They're comparing 60 fps vsync vs no-vsync no-frame-limiter. Rendering at a higher frame rate will definitely help with input latency. I actually pointed this out in the blog post I wrote (under "VSync effects on input latency").

    EDIT2: Properly implementing your own frame limiting logic will be possible once I implement this request: https://forum.unity.com/threads/tim...afollow-and-jitter.430339/page-9#post-6392136
     
    Last edited: Oct 19, 2020
    GliderGuy and Prodigga like this.
  31. Digika

    Digika

    Joined:
    Jan 7, 2018
    Posts:
    96
    Please refrain from actually recommending this; we don't need more inaccurate and improperly measured data on this topic (especially when some phones actually retime and upscale from 120 fps to 240 fps for big marketing numbers), it will only make things worse than what we already observe in this thread. Nobody will benefit from this.

    Well, sure, that'd be silly. However, the trends across multiple completely different engines with different architectures are very, very consistent, and Unity seems to diverge from them. Which was the main critique of this thread, or at least of the OP.

    The video was about VSync and where latency comes from, not about a specific use case. The point is: there is no "gamer myth about VSync", it's just simple math and how the flip buffer operates. There is data and experience accumulated over the years that is used as a general guideline, and it is still correct. Of course, with more than 100k games released in just the last decade or so, there is no way to know every edge case like Unity.
     
  32. Tautvydas-Zilys

    Tautvydas-Zilys

    Unity Technologies

    Joined:
    Jul 25, 2013
    Posts:
    8,023
    I'm not sure why you think that. Even 60 fps capture is enough to capture every single frame on a 60 Hz monitor and see how many refresh cycles the texture is behind the cursor. There's nothing "improper" about it and it works very well. I suggest you try it before dismissing it.

    I mean the thing I posted about Application.targetFrameRate comes from research we did on the subject. You're just getting an inferior experience using that in games made in Unity. And again, if you find that this is not correct, I'll gladly take those data points into consideration. So far, however, I've seen zero evidence that it helps at all, and given all the negative consequences it has, my current recommendation is not to use it.

    I'd love to see the math if you don't mind writing it down :).
     
    Last edited: Oct 20, 2020
    GliderGuy likes this.
  33. Digika

    Digika

    Joined:
    Jan 7, 2018
    Posts:
    96
    I think there is a misunderstanding. You are talking about the targetFrameRate Unity provides and saying not to use it because it is very imprecise and leads to negative effects, and nobody argues with that. The OP in this thread actually started with an issue related to this very functionality.

    The reason people were asking for better hooks for a custom in-engine frame limiter has already been mentioned multiple times; I'm not sure how you can keep missing or ignoring it:

    targetFrameRate does not reduce latency in Unity's case, but similar in-engine limiters in other engines, or external limiters, do help reduce it when avoiding VSync, and evidence of that was posted multiple times. Again, as stated above, the thread was opened because that's not really the case with Unity.
     
  34. Tautvydas-Zilys

    Tautvydas-Zilys

    Unity Technologies

    Joined:
    Jul 25, 2013
    Posts:
    8,023
    Alright. Well, this will allow you to implement it, as it will give you a callback at the right place in the player loop to limit the frame rate:

    I still want to see the math you mentioned though!
     
  35. Digika

    Digika

    Joined:
    Jan 7, 2018
    Posts:
    96
    Just basic VSync math/logic: VSync always waits for vblank, whereas if you limit the framerate on the rendering API side, for example (external tools), the engine runs uncapped and the cap happens afterwards, so you get the advantage of lower latency, as the game renders as fast as possible under the limit without waiting for vblank. And obviously, the higher you set the cap, the bigger the reduction, but even at cap == refresh rate it is still faster than VSync. Again, it depends on how the cap is done/handled; in the context of Unity, we know that targetFrameRate is not going to behave in a similar way.
     
  36. Tautvydas-Zilys

    Tautvydas-Zilys

    Unity Technologies

    Joined:
    Jul 25, 2013
    Posts:
    8,023
    How is that so? If you limit to cap == refresh rate, then waiting for the frame limit will take the same amount of time as waiting for vblank. Except now you won't be aligned at the vblank boundary, which means the upper portion of the screen will be displaying frame N while the lower portion is displaying frame N + 1. If you land in the middle of the refresh cycle on a 60 Hz display when you flip the back buffer, then the bottom half of the screen gets 8 ms lower latency, but the upper half gets 8 ms higher latency. On average, you end up with the same thing, but lose out on frame rate stability and motion smoothness.
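    The averaging argument above can be made concrete with a couple of lines of arithmetic (60 Hz figures, flip assumed to land mid-refresh):

```python
refresh_ms = 1000.0 / 60.0   # one scan-out cycle at 60 Hz, ~16.7 ms
flip_at = 0.5                # flip lands halfway through the cycle

gain_below_tear = flip_at * refresh_ms         # ~8.3 ms lower latency below the tear
loss_above_tear = (1 - flip_at) * refresh_ms   # ~8.3 ms higher latency above it
net_average = (gain_below_tear - loss_above_tear) / 2

print(net_average)  # 0.0: no average latency win, plus a visible tear line
```

    Wherever the flip lands, one region of the screen gains exactly what the other loses, so at cap == refresh rate the average latency is unchanged.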
     
  37. Digika

    Digika

    Joined:
    Jan 7, 2018
    Posts:
    96
    As test data on this subject over the last few years across multiple different game engines shows, that is not the case, and latency is much lower than with VSync under the same conditions. Of course there can always be outliers like Unity.
     
  38. TJHeuvel-net

    TJHeuvel-net

    Joined:
    Jul 31, 2012
    Posts:
    569
    Could this be added to the documentation?
     
    GliderGuy likes this.
  39. LeonhardP

    LeonhardP

    Unity Technologies

    Joined:
    Jul 4, 2016
    Posts:
    2,365
    Yes, I have asked the docs team to add it.

    For future reference, the best way to submit feedback and requests regarding the Manual or Scripting Reference is via the built-in feedback function at the bottom of the pages.

    2020-10-21_17-43-49.png
     
  40. rz_0lento

    rz_0lento

    Joined:
    Oct 8, 2013
    Posts:
    1,927
    TBH, I've never had any success with this route. I've reported a few things in the past - probably the transform.up page several times over the years - but there it still sits, claiming it somehow moves GameObjects instead of just saying what it really is (it is a direction vector, and it definitely doesn't move anything on its own):

    This has come up a few times in the past when people new to Unity/math have wondered what that doc page really means. I can tell that the person who wrote it was trying to describe the following snippet, but it's still incorrect and shouldn't read like that for something that's likely among the first things people see when they open the API docs for the first time.
     
    Last edited: Oct 23, 2020
  41. LeonhardP

    LeonhardP

    Unity Technologies

    Joined:
    Jul 4, 2016
    Posts:
    2,365
    Thanks for the feedback, I'll talk to the team about these specific cases. AFAIK, the process of how this feedback is being handled internally has recently been revamped.
     
    GliderGuy, TJHeuvel-net and rz_0lento like this.
  42. Tautvydas-Zilys

    Tautvydas-Zilys

    Unity Technologies

    Joined:
    Jul 25, 2013
    Posts:
    8,023
    And the test data we have on Unity shows that it is the case, even with busy-wait loops that implement exact frame limiting.
     
  43. Kamyker

    Kamyker

    Joined:
    May 14, 2013
    Posts:
    574
    VSync also doesn't work for me in 2020.1; NVIDIA VSync is set to "Use Application Settings".
     
  44. Tautvydas-Zilys

    Tautvydas-Zilys

    Unity Technologies

    Joined:
    Jul 25, 2013
    Posts:
    8,023
    Is this in the editor or the player? If it's in the editor, did you enable "Vsync in game view" option?

    upload_2020-10-21_17-50-53.png
     
  45. Kamyker

    Kamyker

    Joined:
    May 14, 2013
    Posts:
    574
    Player with
    QualitySettings.vSyncCount = 1


    It also doesn't seem to work in the editor; sometimes fps drop after enabling that tickbox, but they are nowhere close to the refresh rate.

    I'll check if VSync works for me at all in other games; maybe it's a problem with my machine, not Unity.
     
    Last edited: Oct 22, 2020
  46. Digika

    Digika

    Joined:
    Jan 7, 2018
    Posts:
    96
    Yes, as we established here: for Unity it is incorrect, but the majority of other engines work that way.

    Could you elaborate on how that would help?
     
  47. Tautvydas-Zilys

    Tautvydas-Zilys

    Unity Technologies

    Joined:
    Jul 25, 2013
    Posts:
    8,023
    That is the right place to insert the "frame limiter" so that input latency isn't negatively impacted. If you want an Application.targetFrameRate-like mechanism and the built-in way doesn't work as well as you wish, you can implement your own there. For instance, you could do a busy wait.
     
  48. Zuntatos

    Zuntatos

    Joined:
    Nov 18, 2012
    Posts:
    563
    So if I understand correctly: say you render in 0.5-1.5 ms and want to draw at 100 fps, for convenience. You need 8.5-9.5 ms of sleep somewhere. Let's ignore the existence of the 'queued frame'.
    case 1) "safe vsync": poll input, run your logic and then sleep for 8.5-9.5 ms waiting for the screen to present. Repeat.
    case 2) "risky vsync": sleep 8 ms, poll input, logic, sleep ~0.5-1.5 ms (safety margin) waiting for screen to present. Repeat.

    This would process inputs happening during the first 8 ms sleep of case 2 in that same frame, whereas in case 1 they would be delayed by one frame (10 ms). But it comes with the risk of mistiming the pre-emptive sleep and possibly missing a frame.

    case 1) is how it currently works, and case 2) would be implementable by users if the proposed "before-PumpOSMessages-EarlyUpdate" existed.
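    The two cases above differ only in where the long wait sits relative to input polling; the age of the input at present time makes that concrete (a trivial sketch using the assumed figures from the post):

```python
frame_ms = 10.0        # 100 fps frame budget from the example above
early_sleep_ms = 8.0   # case 2 sleeps this long before polling input

# Case 1 ("safe"): poll input at frame start, then sleep until present.
# The sampled input is a full frame old by the time it reaches the screen.
case1_input_age = frame_ms

# Case 2 ("risky"): sleep first, then poll just before presenting.
case2_input_age = frame_ms - early_sleep_ms

print(case1_input_age, case2_input_age)  # 10.0 2.0
```

    Case 2 presents input that is roughly 8 ms fresher, at the cost of a much tighter safety margin before the buffer flip.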
     
  49. Tautvydas-Zilys

    Tautvydas-Zilys

    Unity Technologies

    Joined:
    Jul 25, 2013
    Posts:
    8,023
    I don't think there's really a difference between cases 1 and 2 (unless I misunderstood what you wrote). Let me elaborate.

    Waiting for the GPU to flip the backbuffer to the display is done at the end of the frame. So is frame limiting. Effectively, at the end of the frame you will be waiting for whichever finishes later: end of frame limiting sleep or the backbuffer to be flipped.

    The thing I'm exposing allows you to employ different frame limiting techniques depending on how aggressive you want to be. Application.targetFrameRate doesn't give you that flexibility: it's optimized for power consumption rather than for making sure you don't drop frames. It uses an OS sleep to wait, which in many cases is inaccurate and introduces frame time inconsistencies. Currently, it looks something like this (in pseudocode):

    Code (csharp):
    if (vsync > 0 || Application.targetFrameRate == -1)
        return;

    var targetFrameTime = 1.0 / Application.targetFrameRate;
    var waitEnd = m_LastFrameStart + targetFrameTime;
    var timeToWaitMS = waitEnd - GetCurrentTime();
    Sleep((int)timeToWaitMS); // Round down

    // Finish the remainder with a spin wait
    while (GetCurrentTime() < waitEnd)
        YieldProcessor();

    m_LastFrameStart = GetCurrentTime();
    That's basically my whole argument: doing waits like these is much less accurate than proper VSync waiting and will introduce an uneven frame rate. And since you're reducing the frame rate from what it would be without the limiter, you're increasing input latency too. Lastly, it's pretty hard to figure out the right targetFrameRate for a given machine, so unless you let the gamer enter an arbitrary number, it will make your game feel like crap on certain refresh rate/GPU combos.

    If you want to implement, for instance, a busy-wait loop for exact timing at the cost of burning CPU cycles, you will now be able to. And if you can come up with a better algorithm than we did, you'll be able to use that too, without needing us to change the engine.
     
    GliderGuy and Prodigga like this.
  50. Zuntatos

    Zuntatos

    Joined:
    Nov 18, 2012
    Posts:
    563
    In my cases 1 and 2, the start and end of the frame are exactly the same, and they both wait on the same buffer flip. But in one case the input & logic processing is done 8 ms later than in the other.

    To be clear, I am fine with the way Unity does it as presented in the blog - it is a more stable and reliable way that is the better choice for the majority of games, taking into account power draw etc.
     