Time.deltaTime Not Constant: VSync CameraFollow and Jitter

Discussion in 'General Graphics' started by Zullar, Sep 9, 2016.

  1. soulburner

    soulburner

    Joined:
    Mar 25, 2012
    Posts:
    147
"@soulburner In your case, are you running with vsync on?"

It doesn't matter; I got this whether vsync was on or off.

    But later I realized my problem was that the display was being duplicated to the TV, and disabling the cloning fixed that issue. Strangely, enabling it again didn't bring the bug back...

    Anyway, I still know that some of our players experience a terrible stutter, and I can't reproduce/fix it.
     
    herb_nice likes this.
  2. Zullar

    Zullar

    Joined:
    May 21, 2013
    Posts:
    615
    I pinged Unity QA again and they responded.

    "On the internal page, I can see that the latest update has been made a couple of days ago and they've marked its milestone (the version that the problem is hopefully to be fixed in) as the upcoming 2019.1 version. "
     
    hippocoder, Edy and AcidArrow like this.
  3. Edy

    Edy

    Joined:
    Jun 3, 2010
    Posts:
    1,573
    Thanks for the update! Now that I (sort of) understand the problem, I'd really love to know what "the latest update" is about.
     
    herb_nice likes this.
  4. rizu

    rizu

    Joined:
    Oct 8, 2013
    Posts:
    1,191
I recently noticed this again as well (it's probably always been there, I just didn't realize I could measure it). I'm kind of baffled this is still an issue with recent builds :) Any chance we'd get some official Unity response here about the current state?

    I've used Unity since early 4.x and I've always struggled to get perfectly smooth frame updates out of it. Even after fixing GC spikes, physics interpolation, mouse input, etc., frame updates appear somewhat OK but never butter-smooth like in other game engines I've used.

    Edit: this issue still exists on the latest 2019.1.0a12.
     
    Last edited: Jan 2, 2019
    Zullar likes this.
  5. rizu

    rizu

    Joined:
    Oct 8, 2013
    Posts:
    1,191
    Zullar likes this.
  6. Zullar

    Zullar

    Joined:
    May 21, 2013
    Posts:
    615
It still isn't smooth in 2019.1? When Unity QA messaged me in November they said it would be fixed in 2019.1 (see the message I posted above). :[
     
  7. rizu

    rizu

    Joined:
    Oct 8, 2013
    Posts:
    1,191
Yeah, I saw the message. I guess it's some internal target, but since it's not public it might not even happen, hence my curiosity about a staff reply here :)

    Also worth noting that 2019.1 is still in alpha (apparently soon in beta), and it probably won't release for several more months. I'd suspect they're targeting 2019.1 to be out around GDC (late March).
     
    Zullar likes this.
  8. sharkapps

    sharkapps

    Joined:
    Apr 4, 2016
    Posts:
    143
    I have been watching this thread for a while hoping to see a fix for this in Unity 2018, but am disappointed to see how little attention this issue seems to be getting. I mean, this seems like a deal-breaker for anyone who is serious about keeping animation/movement buttery-smooth. I am curious how it is that so many developers have built so many games without flagging this as a show-stopper. What about the studios that are trying to use Unity for Film & Animation projects? How is this acceptable?

    Anyway, I noticed that the latest 2019.1.0a13 seems to have even more variation in deltaTime than in the 2017 and 2018 builds that I tested. It's really disappointing and I hope more developers chime in and ask for something to be done.



     
    MattDavis and Zullar like this.
  9. Nucky9

    Nucky9

    Joined:
    Apr 8, 2016
    Posts:
    10
It took me a while to notice the stuttering, but now that I have, I can't stop seeing it. It seems like some people see it more easily than others, and some systems are better/worse for it, so this thread reassured me that I wasn't crazy, heh.

    Interestingly, I have seen at least one commercial Unity game that doesn't have the issue (or maybe the scenes are complex enough to hide it?), but even Unity's own tutorials do. For now, setting deltaTime to 1/60 has helped my project a lot.
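    For reference, I read "setting deltaTime to 1/60" as using Time.captureFramerate (my assumption; there may be other ways to do it). A minimal sketch:

    Code (CSharp):
    using UnityEngine;

    public class FixedDeltaTime60 : MonoBehaviour
    {
        void Awake()
        {
            // Makes Time.deltaTime exactly 1/60 every frame. Note: as discussed later
            // in this thread, this can disable vsync on some graphics APIs.
            Time.captureFramerate = 60;
        }
    }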
     
    xVergilx, herb_nice and Zullar like this.
  10. Edy

    Edy

    Joined:
    Jun 3, 2010
    Posts:
    1,573
I think I've got something. At least, it's the way that works best for me and provides a way to record smooth videos directly within the Unity Editor. I hope it's also useful to you, or at least helps in finding a more general solution.

    Having understood the problem as described here, a solution is:
    • Force deltaTime to be 1 / refresh rate
    • Ensure VSync is enabled and in effect
    A way to do this in the Unity Editor is:
    • Set Time.captureFramerate = Screen.currentResolution.refreshRate
      This forces deltaTime to be exactly 1 / refresh rate every frame.
    • Open the editor in OpenGL mode (-force-glcore).
      This forces vSync to remain enabled, at least on my Windows machine. When DX11 is used, setting Time.captureFramerate disables vSync. I guess you could force vSync in the driver settings instead, but I haven't tried.
    Now the movement and animations are perfect, with maybe some very sporadic frame drops (as captureFramerate may not be exactly the hardware refresh rate). Cameras, especially smoothed look-at cameras, show perfectly smooth movement when using Time.deltaTime. Here's an example recorded with OBS directly from the Unity Editor:


Note: captureFramerate doesn't fix Time.unscaledDeltaTime. If you're using Time.unscaledDeltaTime you should replace it with one of these alternatives for the fix to work:
    • unscaledDeltaTime = 1 / Screen.currentResolution.refreshRate
    • unscaledDeltaTime = Time.deltaTime / Time.timeScale
      (this one doesn't support very small or zero time scales)
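    For reference, a minimal sketch of those two alternatives (the helper name is mine, not a Unity API):

    Code (CSharp):
    // Hypothetical helper illustrating the two replacements listed above.
    float FixedUnscaledDeltaTime()
    {
        // Alternative 1: derive from the reported refresh rate.
        // return 1f / Screen.currentResolution.refreshRate;

        // Alternative 2: undo the time scale (not valid for very small or zero time scales).
        return Time.deltaTime / Time.timeScale;
    }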
    Here's a simple script I use to test and configure these options:

    Code (CSharp):
    using UnityEngine;

    public class TryFixDeltaTimeSomehow : MonoBehaviour
    {
        [Range(0, 2)]
        public int vSyncCount = 1;
        [Range(-1, 200)]
        public int targetFrameRate = -1;
        [Range(0, 200)]
        public int captureFramerate = 0;
        public bool captureRefreshRate = false;

        void Update()
        {
            QualitySettings.vSyncCount = vSyncCount;
            Application.targetFrameRate = targetFrameRate;

            if (captureRefreshRate)
                Time.captureFramerate = Screen.currentResolution.refreshRate;
            else
                Time.captureFramerate = captureFramerate;
        }
    }

I've also modified the deltaTime test script posted by @Zullar to include the true average frame time, the unscaled delta time and the reported refresh rate, and added an option so the graph shows either Time.deltaTime or the measured frame time (the real time between each Update call):

    Code (CSharp):
    // https://forum.unity.com/threads/time-deltatime-not-constant-vsync-camerafollow-and-jitter.430339/page-2

    using UnityEngine;
    using System.Collections.Generic;

    public class DeltaTimeTestFromUnityForum : MonoBehaviour
    {
        public enum GraphicShows { DeltaTime, FrameTime };
        public GraphicShows graphicShows = GraphicShows.DeltaTime;

        private List<float> listDeltaTime = new List<float>();
        private List<float> listFrameTime = new List<float>();

        private const int pixelWidth = 256;  // for texture2D
        private const int pixelHeight = 128; // for texture2D

        private const int frameTimeSamples = 20;

        private Texture2D texture2D;

        private static readonly Color colorDarkGrey = new Color(0.3f, 0.3f, 0.3f);

        float lastFrameTime;
        float deltaTimeAve;
        float deltaTimeMin;
        float deltaTimeMax;
        float frameTimeAve;

        private void Awake()
        {
            // DontDestroyOnLoad(gameObject);
            for (int i = 0; i < pixelWidth; i++)
            {
                listDeltaTime.Add(0.01f);
            }

            texture2D = new Texture2D(pixelWidth, pixelHeight);
            texture2D.filterMode = FilterMode.Point;

            lastFrameTime = Time.realtimeSinceStartup;
        }

        private void Update()
        {
            float frameTime = Time.realtimeSinceStartup - lastFrameTime;
            lastFrameTime = Time.realtimeSinceStartup;

            listDeltaTime.RemoveAt(0);
            if (graphicShows == GraphicShows.DeltaTime)
                listDeltaTime.Add(Time.deltaTime);
            else
                listDeltaTime.Add(frameTime);

            if (listFrameTime.Count >= frameTimeSamples)
                listFrameTime.RemoveAt(0);
            listFrameTime.Add(frameTime);

            RefreshTexture2D();

            deltaTimeAve = Average(listDeltaTime);
            deltaTimeMin = Mathf.Min(listDeltaTime.ToArray());
            deltaTimeMax = Mathf.Max(listDeltaTime.ToArray());
            frameTimeAve = Average(listFrameTime);
        }

        private void OnGUI()
        {
            GUI.Label(new Rect(10f, 10f, 300f, 20f), "Time.deltaTime = " + (Time.deltaTime * 1000f).ToString("0.000") + "ms");
            GUI.Label(new Rect(10f, 30f, 500f, 20f), "Time.unscaledDeltaTime = " + (Time.unscaledDeltaTime * 1000f).ToString("0.000") + "ms");
            GUI.Label(new Rect(10f, 50f, 500f, 20f), "Screen.currentResolution.refreshRate = " + Screen.currentResolution.refreshRate);

            GUI.Label(new Rect(10f, 80f, 500f, 20f), "Measured Frame Time (Average) = " + (frameTimeAve * 1000f).ToString("0.000") + "ms");
            GUI.Label(new Rect(10f, 100f, 500f, 20f), "Measured Frame Rate (Average) = " + (1 / frameTimeAve).ToString("0.00"));

            if (graphicShows == GraphicShows.DeltaTime)
            {
                GUI.Label(new Rect(10f, 140f, 400f, 20f), "Graphic Shows: Time.deltaTime");
                GUI.Label(new Rect(30f, 280f + texture2D.height * 2, 400f, 20f), "X-Axis: Sample     Y-Axis: Time.deltaTime");
            }
            else
            {
                GUI.Label(new Rect(10f, 140f, 400f, 20f), "Graphic Shows: Measured Frame Time");
                GUI.Label(new Rect(30f, 280f + texture2D.height * 2, 400f, 20f), "X-Axis: Sample     Y-Axis: Measured Frame Time");
            }

            GUI.Label(new Rect(30f, 170f, 300f, 20f), "Time (Average) = " + (deltaTimeAve * 1000f).ToString("0.000") + "ms");
            GUI.Label(new Rect(30f, 190f, 300f, 20f), "Rate (Average) = " + (1f / deltaTimeAve).ToString("0.00"));
            GUI.Label(new Rect(30f, 210f, 300f, 20f), "Min = " + (deltaTimeMin * 1000f).ToString("0.000") + "ms");
            GUI.Label(new Rect(30f, 230f, 300f, 20f), "Max = " + (deltaTimeMax * 1000f).ToString("0.000") + "ms");
            GUI.Label(new Rect(30f, 250f, 300f, 20f), "% Variation = " + ((deltaTimeMax / deltaTimeMin - 1f) * 100f).ToString("0.0") + "%");

            GUI.DrawTexture(new Rect(0f, 280f, Screen.width, texture2D.height * 2), texture2D);
        }

        private void RefreshTexture2D()
        {
            Color[] pixels = texture2D.GetPixels();
            for (int i = 0; i < pixels.Length; i++)
            {
                pixels[i] = new Color(0.0f, 0.0f, 0.0f, 0.9f);
            }
            texture2D.SetPixels(pixels);

            float dtMin = Mathf.Min(listDeltaTime.ToArray());
            float dtMax = Mathf.Max(listDeltaTime.ToArray());
            int yMin = GetY(dtMin);
            int yMax = GetY(dtMax);

            for (int i = 0; i < pixelWidth; i++)
            {
                texture2D.SetPixel(i, yMin, colorDarkGrey);
                texture2D.SetPixel(i, yMax, colorDarkGrey);
            }

            for (int i = 0; i < pixelWidth; i++)
            {
                int y = GetY(listDeltaTime[i]);
                texture2D.SetPixel(i, y, Color.white);
            }
            texture2D.Apply(false);
        }

        private const float texDeltaTimeMax = 0.05f; // for texture2D

        private static int GetY(float deltaTimeIn)
        {
            return Mathf.Clamp(Mathf.RoundToInt(deltaTimeIn / texDeltaTimeMax * pixelHeight), 0, pixelHeight - 1);
        }

        private static float Average(List<float> listFloatIn)
        {
            float average = 0f;
            for (int i = 0; i < listFloatIn.Count; i++)
            {
                average += listFloatIn[i];
            }
            average = average / listFloatIn.Count;
            return average;
        }
    }

    You may use both scripts to check out the effect of each option in your specific system. Here's what I've got in my case. This graph shows the measured frame time, that is, the true time between each Update call, which is the value normally used for Time.deltaTime:

    upload_2019-1-15_18-53-35.png

However, the reported deltaTime is exactly 1/59, matching the reported refresh rate, thanks to Time.captureFramerate being set to the refresh rate:

    upload_2019-1-15_18-58-57.png

    The above result comes from these options:

    upload_2019-1-15_19-8-35.png

Again, in my case it works because I've opened the Unity Editor in OpenGL mode (-force-glcore), which somehow forces vSync to stay enabled. In DX11, vSync is not active when using captureFramerate.

I'm aware this solution is not generic or applicable to all projects, and most probably there are many systems out there with variable results (i.e. sometimes vSync may or may not be in effect).
    • If vSync is not enabled, then the game will run as fast as the hardware permits, without constraints. You may try Application.targetFrameRate, but I haven't got consistent results with it.
    • If the system can't complete each frame in the given time, then frames will drop and the whole "game time" will slow down, as Time.deltaTime won't be adjusted.
    It works for me, as my biggest problem was recording good promo videos. Hopefully the information is useful to you as well.
     
    Last edited: Jan 15, 2019
    Rond, hippocoder, Mr_Dan_O and 5 others like this.
  11. herb_nice

    herb_nice

    Joined:
    May 4, 2017
    Posts:
    159
I agree with most of your post. However, I would be careful using Screen.currentResolution.refreshRate as your refresh rate, as it is hilariously inaccurate on some devices: I've seen 3.33% variance between that value and reality. It's a rough guideline, not a scientific measurement, and will cause your game to play at different rates on different devices.

    ALSO, I have seen Screen.currentResolution.refreshRate return 0, in which case we assume it is really 60 before we do our calculations.

    But you can (and should) measure the refresh rate yourself if you don't want your game playing at different speeds, or if you are synchronizing to external things. We measure the refresh rate by ensuring vSyncCount is non-zero and keeping a running average of frame times that are within range of Screen.currentResolution.refreshRate / QualitySettings.vSyncCount (a sketch follows below). This way it also works at 30fps on mobile, with vSyncCount 2.
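    A minimal sketch of that measurement approach (the class, names and the ±10% tolerance are mine; the exact tolerance is discussed a few posts below):

    Code (CSharp):
    using UnityEngine;

    // Estimate the true refresh rate from Update intervals. Assumes vSyncCount > 0,
    // so intervals cluster around vSyncCount / refreshRate; samples far from that
    // (hitches, editor noise) are rejected before averaging.
    public class RefreshRateMeasurer : MonoBehaviour
    {
        public float tolerance = 0.10f;  // accept samples within +/-10% of the expected interval
        float sum;
        long count;
        float lastTime;

        public float MeasuredRefreshRate { get; private set; }

        void Start()
        {
            lastTime = Time.realtimeSinceStartup;
            MeasuredRefreshRate = ReportedRefreshRate();
        }

        void Update()
        {
            float now = Time.realtimeSinceStartup;
            float frameTime = now - lastTime;
            lastTime = now;

            int vsync = Mathf.Max(1, QualitySettings.vSyncCount);
            float expected = vsync / ReportedRefreshRate();  // one Update every vSyncCount refreshes

            if (Mathf.Abs(frameTime - expected) <= expected * tolerance)
            {
                sum += frameTime;
                count++;
                MeasuredRefreshRate = vsync / (sum / count);
            }
        }

        static float ReportedRefreshRate()
        {
            int reported = Screen.currentResolution.refreshRate;
            return reported > 0 ? reported : 60f;  // reported rate can be 0; assume 60 in that case
        }
    }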

    And yeah, Application.targetFrameRate has never worked right, as far as I have seen.
     
    Last edited: Jan 15, 2019
  12. rizu

    rizu

    Joined:
    Oct 8, 2013
    Posts:
    1,191
Worth mentioning that Screen.currentResolution.refreshRate is an int as well. AFAIK the true refresh rate displays run at is never a round number, so I agree with @herb_nice's suggestion of measuring the average for this purpose (simply using smoothDeltaTime is not enough here, as it still leaves too much variance).

    This is also something I've been meaning to test myself ever since I found this topic, but my current project isn't really at a stage where it matters yet.
     
  13. Edy

    Edy

    Joined:
    Jun 3, 2010
    Posts:
    1,573
Thanks! I've just tested this approach and found the "within range" part to be highly significant. In my case currentResolution.refreshRate reports 59. Taking the average of frame times within ±10% of 59 results in 59.95, which I think is pretty accurate (my display is configured to 60Hz). However, reducing the tolerance to 5% gives a result of 59.76, which seems biased towards the reported refresh rate of 59.

    After some tests I found that a tolerance of ±12% provides a rather good result and stabilizes at 59.94-59.95.

    Other observations:
    • Currently I calculate the average by filtering out the values outside the tolerance, then accumulating them all and dividing by the number of values. Another method is keeping the latest n measurements, which allows the measured rate to recover from "interference" (see the sketch below).
    • Moving the mouse over the Unity Editor distorts the measurements to the point that a stable value of 59.95 may drop below 59.80. If the average is taken from the latest n samples, it can return to the stable value.
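    A small sketch of the "latest n measurements" variant (names and the window size are mine):

    Code (CSharp):
    using System.Collections.Generic;

    // Rolling average over the most recent samples. Unlike a running total, it can
    // recover after interference (e.g. moving the mouse over the editor).
    public class RollingFrameTimeAverage
    {
        readonly Queue<float> samples = new Queue<float>();
        readonly int maxSamples;
        float sum;

        public RollingFrameTimeAverage(int maxSamples = 240)
        {
            this.maxSamples = maxSamples;
        }

        // Feed it accepted (in-tolerance) frame times; returns the current average.
        public float Add(float frameTime)
        {
            samples.Enqueue(frameTime);
            sum += frameTime;
            if (samples.Count > maxSamples)
                sum -= samples.Dequeue();
            return sum / samples.Count;
        }
    }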
     
    rizu and herb_nice like this.
  14. Zullar

    Zullar

    Joined:
    May 21, 2013
    Posts:
    615
Is there any way to tell when the monitor refreshes? For example, let's say you have a 50Hz (20ms) monitor.

    If your Update() calculation takes 4-6ms to compute, then you should have ~14-16ms of idle time and not drop frames. But ***if*** the beginning of your Update loop happens to start computing 15ms after the monitor refresh, then an Update loop that takes 6ms to calculate will actually drop a frame (15 + 6 = 21, which is > 20), while an Update that takes 4ms will not (15 + 4 = 19, which is < 20).

    I guess what I'm trying to say is that there are 2 things required to be smooth:
    1: Finding and matching the monitor refresh rate
    2: Syncing up the beginning of your Update() loop with the monitor refresh time

    If you fail to do #2 then, depending on arbitrary sync timing, sometimes your app will be smooth and sometimes (rarely) it will be jittery, and you can drop frames even if your total Update calculation time is < the monitor refresh interval.

    I'm not sure I'm explaining myself well. I can draw a picture if I'm not being clear enough. But any thoughts on addressing point #2?
     
  15. herb_nice

    herb_nice

    Joined:
    May 4, 2017
    Posts:
    159
    I just had a peek at what tolerance we settled on, and we have +/-11.111%, which is pretty close to what you found. :D

We just do the running total (until there are 0x7ffffffe counted samples, then we just use the previous result; that would take hundreds of days)... after a couple hundred samples it gets pretty stable anyway.

    This is Unity's responsibility. If you have vSync > 0, it should be calling you as soon after vsync as it can.
     
  16. Edy

    Edy

    Joined:
    Jun 3, 2010
    Posts:
    1,573
Nice! And thanks for the info, I think I'll use the running total as well. However, what puzzles me is that just moving the mouse over the editor distorts the measurements. Possibly it's just an editor-only issue.

    What makes you think that's not already the case? Most probably it is, but Unity does a lot of work within its own update loop, not to mention parallelism and jobs, and the time it takes to reach our Update methods varies because of all that. Even our own projects have a ton of Update methods, so one cannot predict when a specific Update will be reached, or how much time has already been spent in other scripts between the vSync trigger and the beginning of that particular Update.

    But that's not the problem. Our CPUs/GPUs have enough power to do all that within each monitor refresh interval. The problem is that Time.deltaTime is not the actual frame time, but the time measured between each Update pass. This averages out to the frame rate in the long term, but individual values vary widely due to Unity's internal work, as we all know.

    The real, definitive solution would be Time.deltaTime reporting 1 / refresh rate on every frame unless frames are dropped (= no frame ready before the end of the current refresh cycle). If frames are dropped, Time.deltaTime should be (1 + droppedFrames) / refreshRate. And the real problem is that, as far as I've understood here, there's currently no way to know when frames are dropped.

    An example: imagine that deltaTime were the precise time from the vSync trigger to the end of the entire Update pass, including Unity's work, Update methods and everything. A deltaTime greater than 1/refreshRate doesn't necessarily mean a frame was dropped. Not only may triple buffering be enabled in the GPU, there may also be a couple of frames already queued in the GPU command buffer waiting to be presented. After that long Update pass there will probably be a "fast" Update pass to keep up the pace and keep the GPU fed with enough frames. That's what's happening here:

    upload_2019-1-17_0-47-39.png

Despite the measured frame times (= intervals between each Update), no frames are dropped at all in this sequence. But since these frame times are used for animation, visual interpolation, movement, etc., everything stutters.

    So forcing deltaTime = 1/60 in this scene produces a perfectly smooth, stutter-free animation. It's not a generic solution because if the workload is high enough then frames will be dropped, and deltaTime would need to adapt to them or everything would slow down. I can't recommend this article enough.
     
    Last edited: Jan 20, 2019
    keeponshading, Shorely and AcidArrow like this.
  17. Edy

    Edy

    Joined:
    Jun 3, 2010
    Posts:
    1,573
Leaving aside extreme cases, we can take for granted that 1 frame = 1 Update call. Thus the average of Time.deltaTime should match the monitor refresh rate. Also, Screen.currentResolution.refreshRate is the reported refresh rate, which is not precise.

    So the method proposed by @herb_nice that I've just tested is: measure the intervals between each Update call, take the values within a range of the reported rate (±12% in my case), and use their average as the precise refresh rate.

    For measuring the Update intervals I've tested Time.realtimeSinceStartup and Time.unscaledDeltaTime, with indistinguishable results. Time.deltaTime itself may be subject to time scaling, so it's not a good candidate.
     
    Shorely likes this.
  18. rizu

    rizu

    Joined:
    Oct 8, 2013
    Posts:
    1,191
AFAIK this happens in builds too, on hardware that is affected by it.

    I think it's actually an old issue that keeps popping up every now and then. Only some people with high-polling-rate (gaming) mice see this. If you open your mouse software and drop the polling rate to 100-200 Hz or so, you won't see it. I've tried to repro this on my end (I have a 1kHz polling rate on my mouse), but I mostly see up to a 0.1ms CPU load difference when moving the mouse cursor, regardless of the setting I use. I recently talked with a user who had this happening even in a completely blank Unity scene, and dropping the mouse polling rate fixed it once again (but obviously we can't tell our players to do that to fix this).
     
    Last edited: Jan 17, 2019
  19. steego

    steego

    Joined:
    Jul 15, 2010
    Posts:
    911
    I'm curious if this problem goes away with adaptive sync, like FreeSync / G-Sync? The whole concept of vsync is outdated anyway, so it's much better if the monitor just displays the image as soon as the GPU has it ready.
     
  20. Edy

    Edy

    Joined:
    Jun 3, 2010
    Posts:
    1,573
    That causes screen tearing. vsync ensures the image is presented between each monitor's hardware update, and not in the middle of it.
     
    Shorely likes this.
  21. rizu

    rizu

    Joined:
    Oct 8, 2013
    Posts:
    1,191
This is exactly what the techniques he mentioned (adaptive sync / FreeSync / G-Sync) solve: they refresh the whole frame on the monitor as soon as it's ready, instead of A) waiting for the monitor's next vertical sync once you have the full frame in the buffer, or B) sending it to the display right away even when the frame is not fully updated (so it tears).
     
  22. rizu

    rizu

    Joined:
    Oct 8, 2013
    Posts:
    1,191
As to @steego's question, I also wonder about this myself. In theory, with adaptive sync we could be right back at the starting point with the issue we have here (unless this tech lets us render only one frame into the future, instead of the 2-3 we do right now; I wouldn't know), although it might still help hide it better. I don't have such a monitor, but NVIDIA recently started supporting adaptive sync besides their own G-Sync tech (so you should now be able to run adaptive display updates on existing FreeSync monitors while using NVIDIA GPUs).
     
  23. Edy

    Edy

    Joined:
    Jun 3, 2010
    Posts:
    1,573
I think there's some misunderstanding here. No incomplete or non-updated frames are sent to the monitor. Monitors refresh the image line by line from top to bottom, reading each line's content from the video buffer. When you "refresh the whole frame" you write it to that buffer (more specifically, you switch the buffer pointer). If you do that while the monitor's "refresh pointer" is in the middle of the screen, the remaining lines are read from, and display, the new frame. This means the top half of the monitor still shows the previous frame, but as you changed the buffer's content in the middle of a refresh cycle, the bottom half shows the new frame. Hence the tearing. The point of vsync is ensuring the video buffer is not modified while the monitor is in the middle of a refresh cycle.

    But again, vsync is NOT the problem here. The problems are:
    • Time.deltaTime is not the monitor's frame time (no matter the sync method(s) or hardware used), but the time between each Update call, which is highly variable in busy multitasking systems such as most devices.
    • There's no way to know when frames have been dropped (= the same frame being presented twice because the game failed to keep the GPU fed with fresh frames, e.g. as a result of scenes that are too complex or too much workload on the device).
     
    Last edited: Jan 20, 2019
  24. rizu

    rizu

    Joined:
    Oct 8, 2013
    Posts:
    1,191
Terminology aside, you probably understood what I was trying to say, as you explained the same thing again :) We could nitpick about the terms here, but it doesn't change what happens in practice.

    What I was trying to tell you, though, is that the syncing methods steego mentioned do NOT cause tearing. You don't have to wait for the monitor's next v-sync at a fixed interval on these types of displays (as long as both the GPU and the monitor support this kind of functionality): the whole point of adaptive sync / FreeSync / G-Sync is that the monitor can be refreshed as soon as new data is available, instead of locking the display to a fixed 60Hz, 144Hz, etc.
     
  25. herb_nice

    herb_nice

    Joined:
    May 4, 2017
    Posts:
    159
I think it would be even harder to control jitter with a fluctuating refresh rate... because the problem remains: you do not know how long it will take to simulate and render the current frame, so you don't know how long your timestep should be. Locking it to vsync simplifies this problem.
     
    Edy likes this.
  26. rizu

    rizu

    Joined:
    Oct 8, 2013
    Posts:
    1,191
Yes, this is what I'm afraid of as well. But to verify any of this personally, I'd need to be able to test it somehow with a FreeSync/G-Sync monitor (I don't have one).
     
  27. CleverEndeavour

    CleverEndeavour

    Joined:
    Mar 4, 2016
    Posts:
    7
    So as I understand this problem:

    ----- Creating a frame begins -------
1. Time.deltaTime is calculated (usually ~16ms, but this bug causes it to fluctuate)

    2. Computation happens (physics, animation, FixedUpdate(), Update(), LateUpdate(), etc.)

    3. The scene, with all the updated positions baked in, is sent to the GPU

    4. GPU rendering happens

    5. The frame is displayed on screen (hopefully exactly 16ms later)
    ----- Frame is done -------

    So my understanding is that the real time between steps 1 and 5 needs to match the deltaTime given in step 1, and if for any reason it doesn't, we get some jitter.

    The big problem being that step 2 can fluctuate wildly, for any number of reasons (system, game programming, hardware). So fundamentally it isn't really possible to guarantee smoothness.



I'm wondering if the idea below has any merit:
    ----- Creating a frame begins -------
    1. Time.deltaTime is calculated (usually ~16ms, but this bug causes it to fluctuate)

    2. Computation happens (physics, animation, FixedUpdate(), Update(), LateUpdate(), etc.)

    3. The scene, with all the updated positions + velocities baked in, is sent to the GPU

    4. The GPU looks at the difference between Time.deltaTime and the actual current time, and uses the velocities to nudge objects slightly to fix any discrepancy

    5. GPU rendering happens

    6. The frame is displayed on screen (it doesn't need to be exactly 16ms later, as the GPU extrapolated the new positions)
    ----- Frame is done -------


    Is what I'm describing, where the GPU nudges objects around, even possible? To me, it seems like the base problem is unsolvable unless GPUs can do a little after-the-fact correction.
     
    Zullar likes this.
  28. herb_nice

    herb_nice

    Joined:
    May 4, 2017
    Posts:
    159
Yes. This is why I quantize deltaTime to refresh rate / vsync count, overriding the value Unity gives in step 1 (rough sketch below).

    This is why you wait for vsync. As long as you can do everything in steps 2-4 in less than 16ms, you are good to go.


    As for your second scenario, I believe it would be very, very hard to correct jitter using that method, even if you did manage the shader magic to make it work. The problem being, you need to know how far to extrapolate, i.e. how long it will take to simulate and render, ahead of time.

    If you are locked to vsync, at least you have a pretty good idea that your simulation step should be refresh rate / vsync count.
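    A rough sketch of that quantization, with my own names (not a Unity API); the refresh rate here would ideally be a measured value, as discussed earlier, rather than the reported one:

    Code (CSharp):
    using UnityEngine;

    // Snap a measured Update interval to whole refresh intervals. Normally this
    // returns one refresh interval (times vSyncCount); it returns two or more
    // only when frames were actually dropped.
    static class DeltaTimeQuantizer
    {
        public static float Quantize(float measuredDelta, float refreshRate)
        {
            int vsync = Mathf.Max(1, QualitySettings.vSyncCount);
            float step = vsync / refreshRate;                          // time per presented frame
            int frames = Mathf.Max(1, Mathf.RoundToInt(measuredDelta / step));
            return frames * step;
        }
    }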
     
    Zullar likes this.
  29. Edy

    Edy

    Joined:
    Jun 3, 2010
    Posts:
    1,573
@CleverEndeavour @herb_nice you're missing a fundamental point: the asynchronous nature of the GPU. I think a more accurate description of the flow is this:

    -----
    1. Time.deltaTime is calculated as the time spent since this step 1 was executed last time.
    2. Computation happens using that Time.deltaTime: animation, physics interpolation, Update, etc.
    3. Scene is sent to the GPU. This is just a set of commands in a buffer. There's no way to know when this specific frame will be actually displayed on the screen.
    4. Unity does its internal stuff: processing, jobs, parallelism, garbage collection, etc.
    5. Unity waits for the GPU to report it's ready to accept new frames, then goes back to 1.
    -----

Pay attention to step 3. The frame is just sent to the GPU and control returns immediately to the CPU. There may be one or two frames already queued. There may be a triple buffer, so frames are rendered but not yet presented. The GPU works in parallel and will present the next available frame after the next vsync (if enabled), without the CPU having any way to know the exact time this happens, nor the precise frame that has been presented. The GPU will just report "ready to accept more frames" in step 5.

    Step 4 is where the problem begins. Unity spends a highly fluctuating amount of time on its internal work. For example, I've noticed that exactly once per second something happens internally that takes a long time. Sometimes even the time spent between steps 1 and 5 is more than the monitor's frame time. But spending more than a frame's worth of time in Update doesn't mean that a frame will be dropped. Remember, there may be a triple buffer, there may be a couple of frames already queued, and the GPU works entirely on its own. What will surely happen is that the GPU has another frame already in the buffer, presents it on time, and tells the CPU "give me more frames, fast!", so there's almost no wait time when reaching step 5 until the pace is recovered.

    The implications of the previous paragraph are profound. Frames are being displayed by the GPU at the correct monitor frame rate, without any frames being dropped. However, Time.deltaTime is computed as the time between two consecutive Update cycles. If some Unity work takes too long, we get a very large deltaTime immediately followed by a very small deltaTime as the GPU quickly requests more frames to keep the buffer fed. So animations, visual interpolation, movement and anything else using Time.deltaTime stutter due to its large variability, despite every single frame being displayed on time in every monitor cycle.

    When vSync is enabled then the GPU accepts frames at the average rate of the screen's refresh rate. But the GPU doesn't require a constant rate, and can handle significant delays without actually missing refresh cycles. However, the delta time calculated by Unity is exclusively CPU-based and causes everything to stutter.

Possible workarounds:
    • If you're sure that vSync is active and you know the screen's refresh rate, you can force Time.deltaTime to the correct value by setting Time.captureFramerate = refreshRate. However, at least in DX11 this disables vsync; OpenGL seems to keep vsync forced. Personally, I've started using the Unity Editor in OpenGL mode just for this.
    • Measure the frame rate yourself (there are some methods described above in this thread), compute a correct deltaTime and ensure your scripts use it instead of Unity's Time.deltaTime (see the sketch below). However, this leaves Unity animation, rigidbody interpolation and other Unity systems out of the picture.
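    As an illustration of that second workaround, a minimal sketch (the class and the measuredRefreshRate field are mine, purely illustrative; it assumes vsync is active and no frames are dropped):

    Code (CSharp):
    using UnityEngine;

    // Hypothetical example: drive movement from a self-computed delta instead of
    // Time.deltaTime. measuredRefreshRate would come from your own measurement
    // (e.g. the running-average approach discussed earlier in the thread).
    public class SmoothMover : MonoBehaviour
    {
        public float measuredRefreshRate = 60f;  // fill in from your own measurement
        public Vector3 velocity = Vector3.forward;

        void Update()
        {
            float dt = 1f / measuredRefreshRate; // assumes no dropped frames
            transform.position += velocity * dt;
        }
    }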
When did all this start in Unity?

This issue wasn't present in Unity 4. I deduce that step 4 above (internal work) was quite constant back then, so large Time.deltaTime variations were unlikely; only garbage collection could cause issues. I still maintain a project in Unity 4 and the movement is perfectly smooth.

    In Unity 5.x many significant features arrived, such as jobs, parallelism and others. This introduced heavy variations in the work Unity does internally, so the values of Time.deltaTime started to vary widely. I've been dealing with this issue since then, and I was unable to record smooth videos until a couple of weeks ago, when I finally understood the problem deeply enough to find a workaround.

    What could Unity do?

As a first, harmless measure, exposing a way to override Time.deltaTime would give us a valuable tool for finding solutions that fit our projects.

    Ideally, when vSync is enabled and active, Time.deltaTime should be the frame time instead of the highly variable time between two Update cycles.

    Apart from that, this problem affects every player in the entire graphics ecosystem. Investigating the issue and collaborating with the people involved would help reach a definitive solution. Please read the article The Elusive Frame Timing for a very detailed description of the problem, its context, history and current state. It all started over a decade ago.
     
    Last edited: Jan 31, 2019
    CorvOrk, elcionap, Zullar and 2 others like this.
  30. herb_nice

    herb_nice

    Joined:
    May 4, 2017
    Posts:
    159
I get the async nature of the GPU, but you still need to predict your deltaTime accurately if you want to avoid jitter, async or not... and with async, getting the error in that prediction below the threshold of visible jitter seems quite improbable.
     
  31. Zullar

    Zullar

    Joined:
    May 21, 2013
    Posts:
    615
    The part that is surprising to me is that a blank new project with basically no objects and no scripts still exhibits the Time.deltaTime variation & jitter.
     
  32. jsip

    jsip

    Joined:
    Jan 25, 2015
    Posts:
    5
I have this issue on PC and it's bugging me. On PS4 any project runs smoothly. On any PC it doesn't.

    I've tried test rigs with nothing but a fresh install of Windows and an empty project, on an 8700K and a 1080 Ti, and still that micro stutter from frame-time fluctuation is there.
     
  33. sharkapps

    sharkapps

    Joined:
    Apr 4, 2016
    Posts:
    143
    @Zullar I am curious if you noticed more variation in 2019.x builds than in 2018 and earlier as I noticed in my quick test above?
     
  34. Edy

    Edy

    Joined:
    Jun 3, 2010
    Posts:
    1,573
Easy. Unity does a growing amount of internal work that takes a different amount of time each Update cycle. Time.deltaTime is measured as the time between each Update call, so it gets a varying value. Anything using Time.deltaTime will then stutter (animations, rigidbody interpolations, custom scripts using Time.deltaTime in Update...), even in an empty project.

    It was a surprise to me as well until I understood the problem. Then it all fit.

    My guess is that the way the PS4 reports it's ready to accept new frames (step 5 in the execution flow described in my post above) is much more stable than on PC, so each Update cycle takes a time that is much closer to the true frame time.
     
  35. Zullar

    Zullar

    Joined:
    May 21, 2013
    Posts:
    615
    I haven't done any testing with 2019.x yet. Are there any improvements you have noticed? Or anything in their patch notes?
     
    Last edited: Jan 28, 2019
  36. Zullar

    Zullar

    Joined:
    May 21, 2013
    Posts:
    615
Maybe this is mincing words, but I think it is incorrect to attribute the source of the jitter / Time.deltaTime variation to loading caused by Unity's internal work. I think the cause is variation in external loading, not internal.

    On Windows machines there are many background applications running. I've noticed that turning on additional background apps (even apps that aren't GPU intensive and don't drop the framerate) increases the Time.deltaTime variation & jitter. You can also see massive Time.deltaTime spikes from things like changing Windows application focus.

    I've also tested my Unity app on two similar PCs. If my Unity app's internal loading (as you state) was the source of the variation, then you'd expect similar variation on both machines. Instead I saw drastically different variation from machine to machine, indicating something else.

    Similarly, if the Unity app itself were the source of the variation, then it would exhibit similar issues on PS4 and the like.

    All these things point to the PC's external background app loading (as opposed to variation in the Unity app's internal loading) as a primary source of variation.

    I would state the cause of the jitter problem like this: Unity's Time.deltaTime is not robust to variation in external loading.

Now I'm speculating, but I believe the cause is more like this. Let's say you have a 60Hz (17ms) monitor. If your Unity app can complete its Update & render in 2ms, that should leave 15ms of idle time. The start of one Update to the next is 17ms, so Unity sets Time.deltaTime = 17ms. However, if another app intermittently uses up 4ms and pushes back the start of your Update, then Unity sets Time.deltaTime = 21ms (instead of 17ms). It does this even though your app can still complete the Update & render in 2ms, which still leaves 11ms of idle time, and doesn't drop a frame. Unity will report a 21ms Time.deltaTime frame and a corresponding 13ms "make-up" frame... causing the jitter even though it's capable of completing both and pushing to the monitor buffer on time. What it should do (we all agree) is report two 17ms Time.deltaTime frames even though the starts of those Update frames are 21ms & 13ms apart.

    So, lots of words, but I think we are in agreement for the most part. Unity should be asking "when will this frame reach the monitor?" instead of "what is the time between the starts of my Update loops?"... because the latter is affected by variation in external loading.

I think the solution looks something like this:
    1: What is the monitor framerate?
    2: How long did my last Update() take to calculate? (This is a predictor of the current frame.)
    3: Starting now, if I perform the Update() & render, when will it complete? Which monitor frame will I hit? Typically it will be able to hit every monitor frame without dropping (unless the load is large). This also must somehow account for physics (which can run at a different loop rate).

    In other words, THE TIME BETWEEN THE STARTS OF UPDATES SHOULD NOT DIRECTLY MATTER! It should only matter insofar as it pushes your render completion back far enough that you drop a frame (which is fairly rare).
     
  37. Edy

    Edy

    Joined:
    Jun 3, 2010
    Posts:
    1,573
Thanks for your detailed reply. We're both building hypotheses that try to explain what we observe. We both agree on the conclusion, but I think your hypothesis is missing some points.

    I take for granted that external factors also affect the timing. But I maintain that the primary cause is internal. My arguments are:
    • Applications built with Unity 4 run perfectly smoothly, without any stutter, even side-by-side with a Unity 2018 application that is stuttering heavily. Any significant external factor should affect both applications equally, but that doesn't happen. Moreover, if you try the Unigine Valley demo, its beautifully perfect 60 fps makes you cry.
    • Differences between platforms and machines could be explained by the GPU requesting frames at more precise rates, possibly due to drivers, settings, and/or requests being processed in a different way.
    Completely agreed. The problem is there's no way to know when a frame reaches the monitor. There's nothing Unity can do to figure that out, at least in a generic/reliable/compatible way.

    That would possibly be a solution. But there's no generic/reliable/compatible way to know that (if one exists at all), not even on Unity's side.

    There's no generic/reliable/compatible way to know when you will drop a frame, neither on the application's side nor on Unity's side.


Modern GPUs (from roughly the last 10 years) run entirely on their own. The CPU composes the frame as a bunch of commands, sends them all to the GPU as a batch, and control returns immediately to the CPU. But that's all. There's no way for the CPU to know when that frame will reach the monitor: triple buffering, the frame queue, the compositor... All the CPU can do is ask the GPU "are you ready to receive a new frame?". If so, the CPU composes a new frame, sends it, and the cycle repeats.

    Even with these limitations there ARE ways to achieve a good 60 fps on PC. Unigine does it. Unity 3D doesn't. First, Time.deltaTime should be exactly 1 / refresh rate. Second, Unity should research some strategy to detect dropped frames. Then, if (and only if) frames are dropped, Time.deltaTime should be (1 + droppedFrames) / refreshRate.
     
    Last edited: Jan 29, 2019
    CorvOrk and Zullar like this.
  38. Zullar

    Zullar

    Joined:
    May 21, 2013
    Posts:
    615
If what you say is true, that there is no way to tell when the monitor pulls from the buffer and actually puts the pixels on the screen, then that's the huge underlying problem for everything. It's really hard (impossible?) to develop a good strategy for smooth frame rates without knowing that critical piece of information.
     
  39. phil_lira

    phil_lira

    Unity Technologies

    Joined:
    Dec 17, 2014
    Posts:
    385
We are working with partners to get better frame pacing on some platforms. This requires multi-lateral collaboration and it's not something that, as I understand it, can be solved properly in a short time. Given this varies per platform, it's complicated.

    Meanwhile, Time.smoothDeltaTime is an acceptable workaround for camera animation if the game doesn't already smooth/damp it.
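    As a rough illustration of that workaround (my own example, not an official recommendation), a camera follow driven by Time.smoothDeltaTime might look like this:

    Code (CSharp):
    using UnityEngine;

    // Hypothetical camera follow using Time.smoothDeltaTime instead of Time.deltaTime.
    // Only useful if the follow isn't already smoothed/damped, per the note above.
    public class SmoothedCameraFollow : MonoBehaviour
    {
        public Transform target;
        public Vector3 offset = new Vector3(0f, 3f, -8f);
        public float followSpeed = 5f;

        void LateUpdate()
        {
            Vector3 desired = target.position + offset;
            // Vector3.Lerp clamps the interpolation factor to [0, 1].
            transform.position = Vector3.Lerp(transform.position, desired, followSpeed * Time.smoothDeltaTime);
        }
    }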
     
    Zullar and AcidArrow like this.
  40. Edy

    Edy

    Joined:
    Jun 3, 2010
    Posts:
    1,573
    Thanks a lot for your reply! It's good to know that you are aware of the problem and looking for a solution.

I agree with you here. But the problem is that animation, rigidbody interpolation, and every internal Unity system use Time.deltaTime. I've already tested Time.smoothDeltaTime: camera movement improves, but a rigidbody you're looking at still stutters because rigidbody interpolation still uses Time.deltaTime.

    As a short-term workaround, some way to override Time.deltaTime globally each Update cycle would be incredibly helpful. Maybe a method like Time.OverrideDeltaTime() or something similar we could call each frame, so the value is then used across all Unity systems. We could then set it to smoothDeltaTime or any other value we calculate ourselves.

    Currently, the only way to override Time.deltaTime is setting Time.captureFramerate. I've got perfectly smooth movement using this technique. However, it has a number of implications (mainly, it disables VSync on DirectX), so it's good for recording product videos but not for shipping final applications.
     
    Last edited: Jan 29, 2019
    Zullar and zyzyx like this.
  41. Edy

    Edy

    Joined:
    Jun 3, 2010
    Posts:
    1,573
That IS the huge underlying problem for everything! Have you really not read this article yet? I've referenced it multiple times in my posts in this thread since @BakeMyCake first referenced it earlier:


    It's long, but comprehensive and highly revealing. Find some time, grab a cup of coffee, and read it in full. I only really understood the problem, and found a local workaround for my specific situation (OpenGL / forced VSync + Time.captureFramerate), after reading this article.
     
    Last edited: Jan 29, 2019
  42. Zullar

    Zullar

    Joined:
    May 21, 2013
    Posts:
    615
I've read it. But it seems too shocking to believe that there is no way to know when the monitor pulls from the buffer and displays the pixels on screen! :p
     
    Last edited: Jan 29, 2019
    Edy likes this.
  43. Zullar

    Zullar

    Joined:
    May 21, 2013
    Posts:
    615
    Glad you are looking into it!

    I believe Unity internally relies on Time.deltaTime for many things such as
    -Rigidbody Interpolation
    -Particle Systems
    -Animations

    and I am unsure how to plug in Time.smoothDeltaTime in place of Time.deltaTime for these things.
     
    zyzyx and AcidArrow like this.
  44. Zullar

    Zullar

    Joined:
    May 21, 2013
    Posts:
    615
    Would it be possible to open a trackable issue# for this so we can follow progress?
     
    Zeitcatcher likes this.
  45. nxrighthere

    nxrighthere

    Joined:
    Mar 2, 2014
    Posts:
    542
I don't know how Time is implemented under the hood, but I believe these jitter issues are directly related to the nature of a wall-clock mechanism, where time jumps inconsistently when it changes and is very vulnerable to the influence of external sources that rely on it. I encountered similar problems while rewriting some legacy stuff in ENet, where a wall clock was used for many internal timers; it was very unstable until we replaced it with a high-resolution monotonic clock.
     
    Last edited: Jan 31, 2019
    Vincenzo likes this.
  46. Edy

    Edy

    Joined:
    Jun 3, 2010
    Posts:
    1,573
That's not the case here. The cause of the stutter in this case is that there's no way for the application to know for sure when a frame was actually displayed on the screen. It's well explained here, and (more briefly) above on this page.
     
  47. nxrighthere

    nxrighthere

    Joined:
    Mar 2, 2014
    Posts:
    542
What you explained in that post is exactly what is called wall-clock time, which is based on guessing and calculations around it. Monotonic time represents the absolute elapsed time, independent of any external sources; the delta time is obtained by subtracting the previous time from the absolute elapsed time, expressed in the SI base unit. To achieve stronger guarantees in a multi-threaded environment, it can be combined with atomic primitives such as the CMPXCHG instruction.
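    For reference, a minimal sketch of a monotonic, high-resolution delta in C# (a general illustration of the idea, not Unity's internal implementation; Edy argues below that Unity's timing is already this precise):

    Code (CSharp):
    using System.Diagnostics;

    // Monotonic delta based on Stopwatch, which is immune to wall-clock adjustments.
    static class MonotonicClock
    {
        static readonly Stopwatch s_Watch = Stopwatch.StartNew();
        static long s_LastTicks;

        public static double NextDeltaSeconds()
        {
            long now = s_Watch.ElapsedTicks;
            double delta = (now - s_LastTicks) / (double)Stopwatch.Frequency;
            s_LastTicks = now;
            return delta;
        }
    }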
     
  48. nxrighthere

    nxrighthere

    Joined:
    Mar 2, 2014
    Posts:
    542
  49. Edy

    Edy

    Joined:
    Jun 3, 2010
    Posts:
    1,573
@nxrighthere measuring the time is not the problem here. I can't see how what I explained above relates to that.

    Time.deltaTime is measured with great precision in Unity. I had already considered possible measurement errors, so I compared Time.deltaTime with measurements taken with QueryPerformanceCounter, the most precise timer available in Windows (<1µs; I built a native C++ DLL just for it). The results are indistinguishable from Time.deltaTime. Indeed, std::chrono::steady_clock is already based on QueryPerformanceCounter on Windows, so using that timer would make no difference (surely it's already being used internally in Unity).

    The problem is not that the measurements are imprecise. The measurements are very precise, but there is real variability in the time spent in each Update cycle, which translates directly into variability in the Time.deltaTime values, causing the stutter.
     
  50. Edy

    Edy

    Joined:
    Jun 3, 2010
    Posts:
    1,573
Again, timing is not the problem! The Time implementation is irrelevant here, as the timing is already as precise as it can be. Please read the article and links I referred you to previously carefully to understand the problem.