
Time.deltaTime Not Constant: VSync CameraFollow and Jitter

Discussion in 'General Graphics' started by Zullar, Sep 9, 2016.

  1. Edy

    Edy

    Joined:
    Jun 3, 2010
    Posts:
    2,364
    In the Editor there IS VSync. An easy way to get perfectly smooth movement in the Editor is:
    1. Force VSync externally, either in the display adapter's settings or by opening the Editor in OpenGL mode (VSync gets forced in OpenGL mode for some reason).

    2. Add this to any of your scripts (OnEnable or Start):
      Code (CSharp):
      Time.captureFramerate = Screen.currentResolution.refreshRate;
      Setting Time.captureFramerate disables VSync. This is why VSync needs to be forced externally.

    3. Collapse the Transform inspector (it seems to cause a double sync for some reason).
    This provides constant Time.deltaTime values and thus perfectly smooth movement synchronized to the display refresh rate. I use this trick to record videos for YouTube directly from the Editor. It's not a generic solution, as it forcibly requires VSync and causes other issues (i.e. no frames will ever be dropped), but it demonstrates the nature of the problem.
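
    For reference, here is step 2 as a complete script. This is only a sketch (the class name is made up), and VSync still has to be forced externally as in step 1, because setting captureFramerate disables Unity's own VSync:
    Code (CSharp):
    using UnityEngine;

    // Sketch only: lock Time.deltaTime to the display refresh interval.
    // VSync must still be forced externally (step 1 above).
    public class LockDeltaToRefresh : MonoBehaviour
    {
        private void OnEnable()
        {
            Time.captureFramerate = Screen.currentResolution.refreshRate;
        }

        private void OnDisable()
        {
            Time.captureFramerate = 0; // back to normal timing
        }
    }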

    No. Currently Time.deltaTime returns the time it took to render the frame, but that is not necessarily the time the frame will stay visible on the screen. If VSync is enabled and the monitor's refresh rate is 60Hz, then that frame will be visible on the screen for 1/60 seconds. Thus, Time.deltaTime should be 1/60 seconds. When that happens (i.e. using the trick described above), you get perfectly smooth motion with Time.deltaTime.

    The problem is that Time.deltaTime can vary hugely on some systems. Typically, value spikes are produced at a regular rate (this is why the problem is named "heartbeat stutter"). Those spikes in Time.deltaTime cause everything to move very fast or very slow for one frame, but that frame is still presented on the screen for 1/60 seconds. As a result, everything that uses Time.deltaTime stutters.

    An image of the varying Time.deltaTime values is included in this forum post, along with a more technical explanation on the problem:
    https://forum.unity.com/threads/tim...afollow-and-jitter.430339/page-4#post-4173886
     
    Last edited: Jun 19, 2019
  2. frosted

    frosted

    Joined:
    Jan 17, 2014
    Posts:
    4,011
    The thing is - if you just use a fixed value in place of Time.deltaTime, you should get smoother motion. That's by design. The problem is that the motion itself will desynchronize from the player's clock. It will appear smoother, but potentially be less accurate (assuming we're dealing with vsync).

    That's because, visually, the eye is less likely to notice what are essentially minute shifts in the flow of time from a dropped frame than it is to notice motion being doubled by the catch-up required to synchronize the movement rate against real time.

    Especially if the machine is under load. You'll get smoother animation by using a constant in place of Time.deltaTime. This in effect matches the amount of time elapsed to the frame rate, and it's why a lot of games were built using a fixed timestep (when the machine lags, the action slows slightly, but the animation remains smooth).
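
    Roughly, the difference between the two looks like this (just a sketch; the class name and numbers are made up):
    Code (CSharp):
    using UnityEngine;

    // Sketch: the same movement written both ways.
    public class MoveRight : MonoBehaviour
    {
        public float speed = 5f; // units per second

        private void Update()
        {
            // Time-based: stays synced to the wall clock, but inherits every
            // spike in Time.deltaTime as a visible hitch.
            //transform.position += Vector3.right * speed * Time.deltaTime;

            // Frame-based: one fixed step per rendered frame. Looks smoother,
            // but drifts from real time whenever frames are dropped.
            transform.position += Vector3.right * speed * (1f / 60f);
        }
    }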

    I believe that's why this isn't considered a bug by Unity, and falls under the feature request header.
     
  3. frosted

    frosted

    Joined:
    Jan 17, 2014
    Posts:
    4,011
    Like, if you just change some code so that instead of moving at Time.deltaTime, it's moving at .016 - you're going to get smoother motion. If you change that to .2, you will also get smoother motion. If you change it to 1.234, you will also get smoother motion.

    You're just doing frame-rate-driven animation instead of time-based animation. Choosing a constant that happens to be 1/screenRefresh or some other number is irrelevant.

    Fixing deltaTime to always be a multiple of 1/screenRefresh that counts missed vblanks would just make the jumps in deltaTime larger, which would likely make the problem (visual stutter) worse. The standard deviation in deltaTime would increase if it were locked to multiples of 1/refresh. If you never drop a frame, this would be akin to using a fixed delta; if you do drop a frame, it'd be higher variance.
     
  4. rz_0lento

    rz_0lento

    Joined:
    Oct 8, 2013
    Posts:
    2,357
    To be honest, Unity has many sources of stuttering that new users can run into before they know better, and not all of them are caused by some ghostly deltaTime variance:

    - Standard input: when you map the mouse to your camera turn using the old built-in input, you immediately risk getting unclean data that can make your camera feel jittery. The same turn with gamepad input will not jitter. If you do your own smoothing (see the sketch after this list) or use a different game engine, you will not suffer from this. I'd imagine 3rd-party input libraries and the new input system can deal with this better, but I haven't tested them in Unity for this particular case to give them a fair judgement.

    - Good old GC spikes: this happens to the best of us because many of Unity's own systems can cause garbage too. People used to fight this in various ways; some even fully disabled GC and manually triggered it in places where it didn't matter (for example, Cuphead does this). Fortunately, the incremental GC in 2019.1+ helps a lot here, so you'll not see similar jumps in builds anymore if you enable it.

    - Physics: Unity runs physics in FixedUpdate, which isn't synced to rendering by default. You see the out-of-sync, unsmooth movement even in Unity's own character controller, which uses a Rigidbody for movement - without interpolation. Even if you use the built-in interpolation, Rigidbody movement will never be totally smooth; I suspect this is ultimately because the delta times vary, or there's something funky in the interpolation code itself. Note that I've implemented fixed-timestep physics with interpolation in UE4 and CE myself (using the same PhysX version Unity uses atm) and haven't seen similar stuttering in those engines as I see with stock Unity physics with interpolation enabled.

    - People simply using FixedUpdate where they shouldn't (I've seen many put camera movement there). If you know how it works, you'll know why this is bad. =) Most new users don't understand FixedUpdate at all and just randomly try to use it.
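
    For the input point above, here's a rough sketch of what "do your own smoothing" could look like (names and constants are just examples, untested):
    Code (CSharp):
    using UnityEngine;

    // Sketch: smooth raw mouse deltas before applying them to the camera,
    // instead of feeding the raw per-frame axis values in directly.
    public class SmoothedMouseLook : MonoBehaviour
    {
        public float sensitivity = 2f;
        public float smoothing = 12f; // higher = snappier, lower = smoother

        private Vector2 smoothed;

        private void Update()
        {
            Vector2 raw = new Vector2(Input.GetAxisRaw("Mouse X"),
                                      Input.GetAxisRaw("Mouse Y")) * sensitivity;

            // Exponential smoothing damps the frame-to-frame noise that makes
            // mouse-driven camera turns feel jittery.
            smoothed = Vector2.Lerp(smoothed, raw,
                                    1f - Mathf.Exp(-smoothing * Time.deltaTime));

            transform.Rotate(-smoothed.y, smoothed.x, 0f);
        }
    }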

    You shouldn't use a constant value drawn from a hat here; you should use a somewhat stable average value that sums to the same total time over a longer period. That way it will not cause your game to drift to a different duration vs. what happens in real time. Also, a small drift is not an issue in most games.
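
    Something along these lines, where other scripts would multiply their movement by Smoothed instead of Time.deltaTime (a rough sketch of the idea, not production code):
    Code (CSharp):
    using UnityEngine;

    // Sketch: a smoothed delta whose sum still tracks real elapsed time, so the
    // game clock doesn't slowly drift away from the wall clock.
    public class AveragedDelta : MonoBehaviour
    {
        private const int WindowSize = 60;
        private readonly float[] samples = new float[WindowSize];
        private int index;
        private float drift; // accumulated (real time - smoothed time)

        public float Smoothed { get; private set; } = 1f / 60f;

        private void Awake()
        {
            for (int i = 0; i < WindowSize; i++) samples[i] = 1f / 60f;
        }

        private void Update()
        {
            drift += Time.unscaledDeltaTime - Smoothed;

            samples[index] = Time.unscaledDeltaTime;
            index = (index + 1) % WindowSize;

            float avg = 0f;
            for (int i = 0; i < WindowSize; i++) avg += samples[i];
            avg /= WindowSize;

            // Pay back a small part of the accumulated drift every frame so the
            // summed smoothed deltas converge back to real time.
            Smoothed = avg + drift * 0.1f;
        }
    }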
     
  5. frosted

    frosted

    Joined:
    Jan 17, 2014
    Posts:
    4,011
    That's a different thing though. The vast majority of stutter is just bad/inefficient code, and you list some of those causes, which I agree with.

    The thing is, physics running in FixedUpdate, and that FixedUpdate running in fixed time increments, is a necessity for accuracy in the physics when you're dealing with effects that don't scale linearly with time. Applying x joules of force over 1 second is different from applying x joules of force across 5 seconds.

    Physics simulation has tons of non-linear effects, so using a variable time increment will lead to a much worse simulation.

    So the physics system sacrifices accuracy to real time for stability in the simulation, which is why FixedUpdate works the way it does, and why it uses a fixed timestep.

    If your goal is to lock animation to be frame-rate dependent, the time increment you choose is irrelevant. It's just a multiplier you're applying to your rate of movement. You will get smoother-seeming animation using this approach, but you're incorrect when you say "it will not cause your game to drift into different duration" - it will, unless you compensate for dropped frames.

    But in the process of compensating for dropped frames you're going to increase the variance in deltaTime, which can create worse stutter, as you're moving in fixed increments of 16.67 ms.
     
  6. rz_0lento

    rz_0lento

    Joined:
    Oct 8, 2013
    Posts:
    2,357
    I didn't mean to question the FixedUpdate use in physics; there's a good reason I've implemented this specific approach in engines that don't support it. The point was only to mention why some people see the stuttering. And besides, fixed-timestep physics can still be butter smooth when you visualize it; that's what interpolation is for.

    The whole point of listing those random reasons was that we can't reliably conclude that some stutter a random user has seen in Unity over the years comes from what we've discussed in this thread, as there are tons of reasons why people get stuttering in Unity, but we're trying to focus on one specific thing in this thread. This comment was mainly aimed at the quotes from Edy, not a response to you specifically.

    Clear frame drops should be treated as exceptions anyway; for example, you'd handle them without the manually smoothed deltaTime, as it can't cover them. Dropped frames are also a totally different thing from what people are trying to achieve here; they happen for a different reason. Also, my comment may not have been clear: I didn't mean you'd use any constant value for deltaTime in an actual game, that's bad. I mean you'd use some value you constantly compute as your true average deltaTime; that way you ultimately advance time by the same amount as without any smoothing.
     
    Last edited: Jun 19, 2019
  7. frosted

    frosted

    Joined:
    Jan 17, 2014
    Posts:
    4,011
    I hear that, and tbh, I think most of your points are pretty reasonable.

    The problem is that when dealing with dropped frames (and there are a lot of dropped frames in games in the wild), you gotta make tradeoffs one way or another.

    Using a fixed value for delta time (without compensation for dropped frames) will provide the smoothest visual animation, but you're going to drift from the player's clock.

    Attempting to keep synchronized against the player's clock at all points will produce delta times that are gonna shift around.

    If all you care about is the smoothest visual animation while consistently running at 60fps, just set the physics period to the refresh rate and use fixedDeltaTime. I believe this should provide what you're looking for.

    You can also just use all the rest of the built in stuff, just set everything to use fixed update instead of update.
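
    Something like this, roughly (untested sketch, names are examples):
    Code (CSharp):
    using UnityEngine;

    // Sketch: lock the physics step to the display refresh rate, then drive
    // movement from FixedUpdate using Time.fixedDeltaTime.
    public class FixedStepSetup : MonoBehaviour
    {
        private void Awake()
        {
            Time.fixedDeltaTime = 1f / Screen.currentResolution.refreshRate;
        }
    }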

    PS: I would not recommend this for stuff you want to publish to the wild unless you really know what you're doing.
     
  8. rz_0lento

    rz_0lento

    Joined:
    Oct 8, 2013
    Posts:
    2,357
    I see that my edit with the additional note was missed; I never meant you'd use any true constant, only that you'd constantly measure and compute a better smoothed deltaTime.
     
  9. frosted

    frosted

    Joined:
    Jan 17, 2014
    Posts:
    4,011
    That's fair. I'm just trying to hammer in the point that - in terms of pure visual clarity - running a constant (any constant) for Time.deltaTime will provide smoother animation (including in the case of dropped frames, where you get visually smoother animation by not compensating for the drop).

    Animation using a constant value is, in effect, frame-rate-dependent animation (which, again, will generally be visually smoother).

    Time.deltaTime is not intended to provide visually smoother animation. It's intended to synchronize the animation against time.

    For physics, it was decided that desynchronizing from the clock was an acceptable tradeoff for more stability. You may feel this way about animation and decide that drifting from the clock is worth it for smoother visuals.

    But I think there's just generally a misconception about what deltaTime is there for (hint: it's not for smoother looking visuals).
     
  10. Zullar

    Zullar

    Joined:
    May 21, 2013
    Posts:
    650
    Time.deltaTime is currently defined in Unity as the time delta from the start of one Update to the start of the next.

    Let's define Time.deltaF2F: the estimated time delta between when the current frame will be displayed on the screen and when the last one was displayed. Time.deltaF2F will almost always be exactly equal to the monitor frame interval... except in cases of dropped frames, when everything gets ugly. Time.deltaF2F does not exist in Unity, but we need it. This is what we need to calculate graphics smoothly. Using anything other than this value will generate visual jitter.

    Exactly. You say Time.deltaTime in its current form should not be used for visuals, and we all agree. The problem is that it IS embedded in visual things like the ParticleSystem, Rigidbody interpolation, Animations and Timelines. Unity should use Time.deltaF2F for all of these, but it doesn't exist.

    Can we all agree that if you open a blank new project, enable VSync, attach a Rigidbody, send it moving at 5m/s, build, and click play, it should move smoothly? It doesn't.

    I think what needs to happen is either
    1: Leave Time.deltaTime as-is and create Time.deltaF2F, which can be used for visuals.
    2: Redefine Time.deltaTime as Time.deltaF2F, then create Time.deltaU2U (Update start to Update start).

    I think we are not being clear with our terminology, so hopefully we can get on the same page with these definitions. Either way the argument is the same: the time delta from the start of one Update to the next should not be used to calculate visuals. Instead we want the time delta from when the last frame was displayed to the estimated time the current frame will be displayed.
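
    To make the terminology concrete, here is roughly what a deltaF2F calculation could look like, assuming VSync is on. This is a pure sketch; nothing like it exists as a Unity API:
    Code (CSharp):
    using UnityEngine;

    // Hypothetical: approximate a "frame-to-frame" delta by rounding the
    // measured delta to a whole number of refresh intervals.
    public static class DeltaF2F
    {
        public static float Get()
        {
            float refreshInterval = 1f / Screen.currentResolution.refreshRate;
            float intervals = Mathf.Max(1f, Mathf.Round(Time.unscaledDeltaTime / refreshInterval));
            return intervals * refreshInterval;
        }
    }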
     
  11. rz_0lento

    rz_0lento

    Joined:
    Oct 8, 2013
    Posts:
    2,357
    Add interpolation to the list and I can agree with it =) (I know you meant it was there; I just mentioned it before it gets brought up as a counter-argument).
     
    Zullar likes this.
  12. rz_0lento

    rz_0lento

    Joined:
    Oct 8, 2013
    Posts:
    2,357
    As a side note, I don't quite see the need for yet another deltaTime variable; we already have Time.smoothDeltaTime, so I'd rather focus on improving that instead, as it's just not good enough. I also don't know which one Unity's internal systems use atm (deltaTime or smoothDeltaTime), or whether they use one for system x and the other for system y.
     
  13. frosted

    frosted

    Joined:
    Jan 17, 2014
    Posts:
    4,011
    This is not what I'm saying.

    I'm saying that Time.deltaTime's primary goal is not presenting the most visually smooth animation possible. The goal is keeping the animation synchronized to time. There is a difference!

    You can, if you want, use a fixed delta to achieve smoother-looking animation if you can completely guarantee 60fps at all times: just set the timestep to 1/60, drive your animation from physics, and turn Rigidbody interpolation off.

    I believe (and I'm not an expert) that the behavior of deltaTime is actually what most games should use, especially if they want to release into the wild where people's framerates are gonna vary; getting things working properly on a toaster is often a high priority, and framerates can fluctuate pretty wildly.
     
  14. Zullar

    Zullar

    Joined:
    May 21, 2013
    Posts:
    650
    I agree that being synchronized to time is important. But having smooth graphics is also important.

    I think the goal of Time.deltaXXX should be twofold:
    1: Synchronized to time (i.e. avoid integration errors, so that if you sum the time deltas over many frames the total won't continue to leak/grow relative to real time).
    2: Visually smooth (i.e. equal to the monitor interval when there is low loading and no dropped frames).

    Currently Time.deltaTime only achieves goal 1, not goal 2. But something like Time.deltaF2F would achieve both goals. Heck, even Time.smoothDeltaTime basically achieves both goals, but the issue is that many Unity components (Animation, Particle System, Rigidbody interpolation, Timelines, etc.) internally rely on Time.deltaTime, and that can't be changed to Time.smoothDeltaTime.

    To state it another way: in most cases, for visual graphics, real time does not directly matter. For smooth visuals what we really need to know is when the frame currently being processed will be displayed by the screen, and there is currently nothing in Unity that calculates this.
     
  15. hellstorm

    hellstorm

    Joined:
    Jun 29, 2015
    Posts:
    39
    Oh wow. I'm so late to the party.

    This is exactly what I'm seeing in vr.
     
    Zullar likes this.
  16. Edy

    Edy

    Joined:
    Jun 3, 2010
    Posts:
    2,364
    Time.deltaTime provides the time between the current and previous frame (literally from the manual). This represents the time your frame/moving object/animation/particle will be visible at its current position for that frame. Time.deltaTime is used to move an object in a direction at n units per second by multiplying n by Time.deltaTime and adding the result to the position (also literally from the manual).

    With VSync enabled the time between current and previous frame is exactly 1/refreshRate. Your frame/moving object/animation/particle will be visible at its current position for exactly 1/refreshRate in the current frame.

    The fact is that Time.deltaTime with VSync not only provides varying values different from the time between the current and previous frame (1/refreshRate), but also produces large spikes regularly (about every second) on many systems.

    As your moving object/animation/particle relies on Time.deltaTime for computing its new position during the movement, each spike produces a single frame where the moving object/animation/particle changes its position too fast or too slow with respect to the correct time between frames (1/refreshRate). As a result, it produces a clearly visible stutter.

    As those spikes causing this stutter are produced at regular intervals (typically about every second), the problem is known as "heartbeat stutter". When one suffers from it, it's easily distinguishable from other types of stutter caused by other reasons (GC collection, heavy load, busy OS, etc).

    I really don't know how to explain it in a simpler way.
     
    Last edited: Jun 20, 2019
    Zullar likes this.
  17. frosted

    frosted

    Joined:
    Jan 17, 2014
    Posts:
    4,011
    2019-06-20_12-28-41.png
    So this is the FPS meter you guys have been using. Since it seems nobody has actually bothered to profile it, this is a profiler shot.

    What's happening is that the FPS meter is allocating 500k worth of memory per frame (mostly string concatenation and array construction). That's producing GC collection on a fairly regular basis, as you can see above, at intervals of around a second. Depending on when the GC fires, it can produce a dropped frame.

    Now @Edy, is that "heartbeat stutter"?
     
  18. Edy

    Edy

    Joined:
    Jun 3, 2010
    Posts:
    2,364
    No.
     
  19. Zullar

    Zullar

    Joined:
    May 21, 2013
    Posts:
    650
    Good catch on the garbage collection. Allocating the color pixel array generated the 0.5MB / frame. Turning it off reduces it from 0.5MB to 2KB per frame.

    This code allows toggling the massive garbage generation on/off (from 500KB/frame down to 2KB/frame). For me, toggling garbage generation on/off has no effect in the build. Garbage generation does add to the jitter in the editor, however.

    Code (csharp):
    using UnityEngine;
    using System.Collections.Generic;

    public class FPS : MonoBehaviour
    {
        private List<float> listDeltaTime = new List<float>();
        private const int pixelWidth = 256; //for texture2D
        private const int pixelHeight = 128; //for texture2D
        private Texture2D texture2D;
        private const float deltaTimeMax = 0.05f; //for texture2D
        private static readonly Color colorDarkGrey = new Color(0.3f, 0.3f, 0.3f);
        private void Awake()
        {
            DontDestroyOnLoad(gameObject);
            for (int i = 0; i < pixelWidth; i++)
            {
                listDeltaTime.Add(0.01f);
            }
            texture2D = new Texture2D(pixelWidth, pixelHeight);
            texture2D.filterMode = FilterMode.Point;
        }
        private void Update()
        {
            listDeltaTime.RemoveAt(0);
            listDeltaTime.Add(Time.deltaTime);
            RefreshTexture2D();
        }

        private bool toggleGC = false;

        private void OnGUI()
        {
            float deltaTimeAve = Average(listDeltaTime);
            float deltaTimeMin = Mathf.Min(listDeltaTime.ToArray());
            float deltaTimeMax = Mathf.Max(listDeltaTime.ToArray());
            GUI.Label(new Rect(10f, 10f, 200f, 20f), "DeltaTime (Average) = " + (deltaTimeAve * 1000f).ToString("0.000") + "ms");
            GUI.Label(new Rect(10f, 30, 200f, 20f), "FrameRate (Average) = " + (1f / deltaTimeAve).ToString("0"));
            GUI.Label(new Rect(10f, 50, 200f, 20f), "Time.deltaTime = " + (Time.deltaTime * 1000f).ToString("0.000") + "ms");
            GUI.Label(new Rect(10f, 70f, 200f, 20f), "deltaTimeMin = " + (deltaTimeMin * 1000f).ToString("0.000") + "ms");
            GUI.Label(new Rect(10f, 90f, 200f, 20f), "deltaTimeMax = " + (deltaTimeMax * 1000f).ToString("0.000") + "ms");
            GUI.Label(new Rect(10f, 110f, 200f, 20f), "% Variation = " + ((deltaTimeMax / deltaTimeMin - 1f) * 100f).ToString("0.0") + "%");
            GUI.DrawTexture(new Rect(0f, 200, Screen.width, texture2D.height * 2), texture2D);

            toggleGC = GUI.Toggle(new Rect(10, 130, 200, 20), toggleGC, "ToggleGC: " + toggleGC);
        }

        Color[] pixels = new Color[pixelWidth * pixelHeight];

        private void RefreshTexture2D()
        {
            if(toggleGC == true)
            {
                pixels = texture2D.GetPixels(); //This generates 0.5MB of garbage
            }

            for (int i = 0; i < pixels.Length; i++)
            {
                pixels[i] = Color.black;
            }
            texture2D.SetPixels(pixels);
            float deltaTimeMin = Mathf.Min(listDeltaTime.ToArray());
            float deltaTimeMax = Mathf.Max(listDeltaTime.ToArray());
            int yMin = GetY(deltaTimeMin);
            int yMax = GetY(deltaTimeMax);
            for (int i = 0; i < pixelWidth; i++)
            {
                texture2D.SetPixel(i, yMin, colorDarkGrey);
                texture2D.SetPixel(i, yMax, colorDarkGrey);
            }
            for (int i = 0; i < pixelWidth; i++)
            {
                int y = GetY(listDeltaTime[i]);
                texture2D.SetPixel(i, y, Color.white);
            }
            texture2D.Apply(false);
        }
        private static int GetY(float deltaTimeIn)
        {
            return Mathf.Clamp(Mathf.RoundToInt(deltaTimeIn / deltaTimeMax * pixelHeight), 0, pixelHeight - 1);
        }
        private static float Average(List<float> listFloatIn)
        {
            float average = 0f;
            for (int i = 0; i < listFloatIn.Count; i++)
            {
                average += listFloatIn[i];
            }
            average = average / listFloatIn.Count;
            return average;
        }
    }
     
    Last edited: Jun 20, 2019
  20. Zullar

    Zullar

    Joined:
    May 21, 2013
    Posts:
    650
    Here's something else I've noticed. I have no idea why it does this... but I thought I'd share in case it sheds some light on our issue.

    Here's a .exe build on Windows. Full screen. Blank new project with nothing in it except the script.
    When I hold my mouse still I have ~30% variation. When I begin to move my mouse I get a huge spike in frame rate jitter. It's very repeatable.


    upload_2019-6-20_13-28-26.png
     
    Last edited: Jun 20, 2019
  21. frosted

    frosted

    Joined:
    Jan 17, 2014
    Posts:
    4,011
    @Zullar, honestly that code really shouldn't be used to measure microsecond performance differences.

    The OnGUI stuff you're doing there is inconsistent in terms of performance overhead. The UI is responding to mouse input, etc., and all kinds of stuff is happening outside Update.

    For the record, I'm not trying to deny there are stutters or whatever. I just tend to think a lot of the problem is in user-end code. I'm sure there are some Unity processes that can also cause a problem now and then, but I don't think the general assessment you guys have come to - that there's a problem with deltaTime - is well founded.
     
    Last edited: Jun 20, 2019
    Ultroman, LaireonGames and xVergilx like this.
  22. Zullar

    Zullar

    Joined:
    May 21, 2013
    Posts:
    650
    The problem exists in a blank new project without a single user script in it.

    Also the garbage generation had no noticeable effect in the build when I toggled it on/off.

    I don't think the core of the problem is user script related.
     
    Last edited: Jun 20, 2019
  23. frosted

    frosted

    Joined:
    Jan 17, 2014
    Posts:
    4,011
    Like, there are a few problems.

    1: VSync and frame rate is a long-known problem. The fact that the monitor dictates when frames are refreshed is an engineering problem that's existed since forever.

    The fix for this problem is really adaptive refresh, G-Sync and FreeSync. Here, the GPU takes control of the frame refresh, not the monitor.

    2: "Heartbeat stutter" - there are problems here that really come from #1. I believe there are no real ways to fix or better address this problem at the moment other than what we've got. Fixing deltaTime into multiples of 1/60 is not going to make things smoother; it's going to guarantee more 32ms delta times instead of 20ms followed by 12ms.

    Think about it: you'll have more 100% variance delta times instead of 20-30%, which may or may not look more correct depending on where the actual monitor refresh cycle is. It could be way worse, and would likely be even worse during high load.


    It's also really worth learning about vsync and where a lot of this stuff is rooted. This video is the best explanation of the history, reasoning and current state of refresh rates that I've ever seen.



    It won't solve the stutter problem, but it might help in understanding why some of these things are the way they are.

    It's also important to remember that the Unity framerate isn't guaranteed to be the actual refresh rate, even with vsync. It's some guesswork that hopes to line up, with no real guarantee that it does. If the Unity timing and the monitor timing aren't lined up perfectly, even when you're rolling at 60fps, I believe it's entirely possible that you can still get the same frame displayed twice, depending on how the timing lines up exactly.

    I could be wrong about that last bit, I'm not an expert, but I'm fairly certain this is the case.
     
  24. Edy

    Edy

    Joined:
    Jun 3, 2010
    Posts:
    2,364
    It's also really worth learning about the heartbeat stutter and the cause of the issue many of us are experiencing, regardless of the deniers who are lucky enough not to suffer/notice it. This article is the best explanation of the history, reasoning, background and current state of the issue that I've ever read:
    https://medium.com/@alen.ladavac/the-elusive-frame-timing-168f899aec92
     
    Last edited: Jun 21, 2019
    IsaiahKelly and Zullar like this.
  25. frosted

    frosted

    Joined:
    Jan 17, 2014
    Posts:
    4,011
    Here is my take on the implications of that link. The critical point is that there is no way to tell when the screen is actually refreshing.

    Imagine monitor refresh is perfectly 16ms and vsync is also 16ms (for easier numbers):
    0 16 32 48 64 <- Monitor
    2 18 34 50 66 <- Unity

    So we are refreshing perfectly at exactly 16ms, 2ms after the monitor's true refresh points. Each frame being presented is shown 14ms after it was calculated and prepared.

    Let's say that something happened here, like the CPU clock frequency changed, and we took 18ms instead of 16ms for that second frame.
    0 16 32 48 64 <- Monitor
    2 20 34 50 66 <- Unity

    Here you'll note that we still haven't dropped a frame. The first frame is shown 14ms after it was prepared, and the second frame is shown 12ms after it was prepared. We still have a fresh frame (on time!) for the monitor. Our next frame is back to being shown 14ms after it was prepared.

    But we took 18ms for that second frame! That's more than 16ms, so we could have dropped a frame. Here's an example with different timing. Here we're 1ms ahead of the monitor. When both are flawless...
    1 17 33 49 65 <- Monitor
    0 16 32 48 64 <- Unity

    But here, when the second frame takes 18ms...
    1 17 33 49 65 <- Monitor
    0 18 32 48 64 <- Unity
    • The screen shows the first frame 1ms after calculation
    • The screen shows the first frame AGAIN 17ms after calculation
    • The screen shows the third frame 1ms after calculation
    We dropped frame 2, and are back onto our original schedule on frame 3.

    Since there is no way to know if we are in situation 1 (14ms ahead of the monitor) or situation 2 (1ms ahead of the monitor) we have no way of knowing if the frame got dropped or not.

    Take a minute here and just take that in. We don't know if we dropped a frame or not. We really can't know.

    What's more, there's so much stuff happening with variable clock speeds and power-saving modes and hardware timing and async processing. Stuff goes wrong. This process will never be 100% reliable; timings will not always line up perfectly.

    So what about locking deltaTime to increments of 1/refresh, what would that look like?
    1 17 33 49 65
    0 18 (50)?

    We present: Frame 1, Frame 1, then on frame 3 we calculate ahead as if we're on Frame 4?

    That suggestion doesn't even make sense once you look at the time line for how these things are happening.

    This whole thing is not an easy thing to fix. You really need G-Sync/FreeSync, so we don't have to worry about monitor refresh rates and can just feed in frames whenever they're ready.
     
    Last edited: Jun 21, 2019
    LaireonGames likes this.
  26. Zullar

    Zullar

    Joined:
    May 21, 2013
    Posts:
    650
    Monitors output a VBlank signal when they refresh. I wonder if, behind the scenes, Unity uses this signal to align the start of its Update loop with the monitor refresh?

    If not, we're doomed :p
     
  27. rz_0lento

    rz_0lento

    Joined:
    Oct 8, 2013
    Posts:
    2,357
    The graphs I showed didn't use that; I literally only logged values for a longer duration and then dumped them into a file all at once after I was done logging. I also use incremental GC, which makes sure GC doesn't put spikes in your builds when vsync is enabled. That being said, all framerate visualizer tools show the inconsistency.
     
    Ultroman and frosted like this.
  28. rz_0lento

    rz_0lento

    Joined:
    Oct 8, 2013
    Posts:
    2,357
    There used to be a bug in Unity related to high polling rate mice that caused something like this. Like mentioned earlier, Unity's built-in input is trouble.
     
  29. frosted

    frosted

    Joined:
    Jan 17, 2014
    Posts:
    4,011
    But saying "the inconsistency" is kinda a bad framing. There is a lot going on in modern hardware, again, stuff like chip clockspeed on the fly adjustments and background processes and asynch gpu and etc, etc. Not to mention stuff like fixedUpdate firing potentially multiple times in some frames and all the calculation overhead hat goes along with that.

    There's also stuff like Timer Resolution (https://randomascii.wordpress.com/2013/07/08/windows-timer-resolution-megawatts-wasted/) which was linked by a Unity engineer (and commented on by a second) which can cause problems on specific systems. He mentioned that standalone player calls timeBeginPeriod but the editor doesn't, meaning that the accuracy of the internal clock for waits will vary from machine to machine in editor.
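
    For what it's worth, timeBeginPeriod is a plain winmm call, so you can poke at it yourself from a script if you want to test the effect in the Editor. Rough sketch, Windows only, use at your own risk:
    Code (CSharp):
    using System.Runtime.InteropServices;
    using UnityEngine;

    // Sketch: request 1 ms Windows timer resolution (what the standalone player
    // reportedly does but the Editor doesn't).
    public class TimerResolutionHack : MonoBehaviour
    {
        [DllImport("winmm.dll")] private static extern uint timeBeginPeriod(uint period);
        [DllImport("winmm.dll")] private static extern uint timeEndPeriod(uint period);

        private void OnEnable()  { timeBeginPeriod(1); }
        private void OnDisable() { timeEndPeriod(1); }
    }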

    It's not like "this number should be consistent but its not" - its more like "we can't guarantee exactly precise timings because complexity and no way to know when an actual frame happens".
     
  30. frosted

    frosted

    Joined:
    Jan 17, 2014
    Posts:
    4,011
    Sorry, I should have made that super clear. Unity doesn't get a vblank signal. The GPU responds directly to the monitor and doesn't let the CPU know when it happened. This is basically the root of the problem. Unity never knows when the new frame gets swapped in, so it needs to keep a separate timer and guess. Which brings us back to deltaTime.
     
    Ultroman and Edy like this.
  31. Edy

    Edy

    Joined:
    Jun 3, 2010
    Posts:
    2,364
    That's correct. Now let's talk about possible solutions.

    The values given to us by deltaTime may not be reliable, as they produce stutter on some systems and in some situations where they shouldn't. Time.deltaTime is used by the internal Unity systems, so fixing deltaTime may be a possible solution.

    There are two methods to modify deltaTime user-side:
    1. Time.captureFramerate accepts an int and forces Time.deltaTime = 1 / the value given. The drawback is that this disables VSync on some systems.
    2. Time.captureDeltaTime (introduced in Unity 2019.2) modifies Time.deltaTime directly. I haven't tested it, but it possibly has the same issue with VSync.
    I've attached the script I use for testing these options (except Time.captureDeltaTime). Add it to your favorite test scene. By default it does nothing but try to enforce VSync. You may then try the different options and examine the results.

    I've verified that forcing VSync somehow and configuring Time.captureFramerate to the refresh rate produces super smooth motion in Unity in the same systems/situations/scenes that suffer stutter, assuming there are no dropped frames. When the GPU drops a frame, the motion slows down (but doesn't stutter) because the same frame will be visible for twice the time while the object moves by the amount of a single frame.

    This solution has two issues:
    1. Vsync may or may not be in effect.
    2. Adapting to dropped frames and/or to the actual frame rate.
    How could we improve this solution to overcome these issues?
     

    Attached Files:

    Aike and sharkapps like this.
  32. Zullar

    Zullar

    Joined:
    May 21, 2013
    Posts:
    650
    How do we know Unity does not internally access and align/phase itself with the VBlank signal, or use some other trickery? I know the Elusive Frame Timing article says that you can't access VBlank. However, my testing has shown the opposite... and it appears Unity's loop is aligned with VBlank. I am not 100% certain of this. I could do some further testing.

    No matter what, I believe the solution to smooth motion during low loading for fixed-refresh-rate screens is...
    1: Time.deltaX used for incremental motion/animations/particles/etc. must be exactly equal to the monitor frame interval.
    2A: The Unity loop start must be aligned with the VBlank signal, or...
    2B: Unity must delay the output of its frame to the screen. For example, for a 60Hz screen, a frame that always takes 16.67ms to complete would be smoother than a frame that sometimes takes 2ms and sometimes takes 10ms (***if*** the VBlank alignment is not guaranteed). However, this will introduce a constant 1 frame of delay.

    No idea how to achieve ^, but I think that's what's needed.

    I can draw pictures explaining if any of that isn't clear.

    Anyhow I'm done with this issue for now. I have spent too much time as-is and have to get back to work. Good luck!
     
  33. Edy

    Edy

    Joined:
    Jun 3, 2010
    Posts:
    2,364
    Because that part happens on the GPU side, and the GPU runs entirely on its own, asynchronously to the CPU. All Unity can do is query the GPU, like "are you ready to receive a new frame?" or "wait until the GPU is ready to accept new frames". When the GPU responds "yes" or "give me a new frame", then the Update cycle begins, a frame is rendered and sent to the GPU, which will present that frame on the screen at its own discretion.

    That said, solutions 2A and 2B can't be implemented. What you describe is how things worked in the 90s and early 00s, when the CPU had direct control of the graphics adapter, but current GPUs work in a completely different way.

    I believe the issue can be resolved or at least improved with a better calculation of Time.deltaTime (like in your solution 1).
     
    Last edited: Jun 22, 2019
  34. frosted

    frosted

    Joined:
    Jan 17, 2014
    Posts:
    4,011
    Improved in what sense? In terms of recording a demo in a fixed environment perhaps, but in terms of actually releasing a game - you really want framerate independent calculation.

    For actual releases, the current deltaTime is going to produce better results than using a fixed timestep.
     
  35. Edy

    Edy

    Joined:
    Jun 3, 2010
    Posts:
    2,364
    @frosted totally agreed. That's what I meant by the second issue with this solution. "Adapting to dropped frames and/or to the actual frame rate" effectively means that you would get a framerate-independent calculation, but one that also produces stutter-less, super smooth motion when the frame rendering rate is fast enough to provide frames at the screen's refresh rate.

    How to achieve that? That's what I would like to discuss. @herb_nice developed a solution based on quantizing deltaTime that seems to work pretty well. His implementation bypasses Time.deltaTime entirely and uses his own time management system. So if we could apply that solution to modify Time.deltaTime as I described above, then we could have a better Time.deltaTime globally in Unity.

    This solution would leave mostly the same two issues to resolve:
    1. VSync gets disabled when modifying Time.deltaTime
    2. The solution should also work with Vsync disabled
    What do we have? We cannot:
    • Know when a frame gets actually displayed on the screen
    • Know for how long a frame will be (or has been) visible on the screen
    • Know whether Vsync is enabled or not
    We can:
    • Count the frames that are sent to the GPU. In Unity, one Update cycle = one frame rendered and sent to the GPU.
    • Measure times, such as the real time spent since the application started, or the time spent from one Update cycle to another (this is the actual Time.deltaTime). A rough sketch of measuring both is below.
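
    Nothing clever, just counting (sketch; the class name is made up):
    Code (CSharp):
    using UnityEngine;

    // Sketch: count frames submitted and real elapsed time to estimate the
    // effective frame rate over the last second.
    public class FrameRateEstimate : MonoBehaviour
    {
        private int frames;
        private float elapsed;

        public float FramesPerSecond { get; private set; }

        private void Update()
        {
            frames++;
            elapsed += Time.unscaledDeltaTime;

            if (elapsed >= 1f)
            {
                FramesPerSecond = frames / elapsed;
                frames = 0;
                elapsed = 0f;
            }
        }
    }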
     
    Last edited: Jun 22, 2019
  36. frosted

    frosted

    Joined:
    Jan 17, 2014
    Posts:
    4,011

    Please review: https://forum.unity.com/threads/tim...afollow-and-jitter.430339/page-6#post-4670426

    This demonstrates why that hack will produce undesirable results and why modifying deltaTime in the way suggested really doesn't make sense.

    Again, please review - the fact that a frame is sent to the GPU doesn't mean it was actually displayed on screen, even with vsync on. The CPU has no way to determine when a frame actually made it to the screen, and this is the root of the problem.

    I know my post above has a lot of numbers and stuff, but please think it through and understand it, as it shows why there's such a problem and why locking deltaTime into increments of 1/refresh will simply make the problem worse.
     
    Last edited: Jun 22, 2019
  37. Edy

    Edy

    Joined:
    Jun 3, 2010
    Posts:
    2,364
    Reviewed thoughtfully. Let me emphasize that we both agree on the fundamentals:
    • We can't know if we've dropped a frame.
    • Unity doesn't get a vblank signal.
    • The GPU responds directly to monitor and doesn't let the cpu know when it happened.
    Done. There's a fundamental flaw in your reasoning. I leave finding it as an exercise. I've explained it at least 4 times in this thread in every imaginable way, and I don't really feel like explaining it once again. I couldn't bring anything new that hasn't already been said.


    Back to the possible solution. I have verified experimentally, beyond any doubt, that locking deltaTime to 1/refreshRate while VSync is active produces stutter-free, super smooth motion (assuming no dropped frames, i.e. light CPU load), while in the same conditions the default deltaTime may cause heavy stutter on some systems. This is a fact. You can verify it yourself easily.

    Problems with this solution still are:
    1. VSync gets disabled when modifying Time.deltaTime
    2. Only locking deltaTime to 1/refreshRate is not frame-rate independent.
    We can count frames sent to the GPU. We can measure time. So for example:
    • If we measure 60 frames sent to the GPU in 1 second, this is an indicator that the frame rate is at least close to 60 frames per second.
    • If we measure 40 frames sent to the GPU in 1 second, this means the frame rate is close to 40 frames per second. If VSync is active and the refresh rate is 60, we can take for granted that frames have been dropped. But we can't know whether VSync is enabled, nor whether the actual correct refresh rate should be 40 or 60.
    Imagine we want to move an object at a rate of 1 meter per second:
    • In the first case (60 frames sent to the GPU in 1 second) we should have moved it at an ideal rate of 1/60 meters each frame, so at the end of the second the object would have moved exactly 1 m.
    • In the second case (40 frames sent to the GPU in 1 second) we should have moved it at an ideal rate of 1/40 meters each frame to reach the 1 meter mark in the same time.
    Of course this is like a "forensic" analysis of an ideal example. The problem is figuring out how to ensure that the object reaches the 1 meter mark after 1 second regardless of the number of frames sent to the GPU, while keeping the movement smooth. Yes, just using the actual Time.deltaTime would do the job, but it causes heavy stutter on some systems, and that's what we're trying to fix.
     
  38. frosted

    frosted

    Joined:
    Jan 17, 2014
    Posts:
    4,011
    I'm sorry Edy, but this is simply incorrect. You are obsessing over deltaTime as if it's a wrong number that you need to correct, without thinking about the actual timing.

    deltaTime is the elapsed time, dude. It's time in the past. Increasing it to 32 from 17 - all that does is skip forward - and if your refresh interval is lined up so that you're sending to the GPU 2ms or more ahead of the monitor refresh, you're also skipping a frame of rendering for no reason.

    Let me try one more diagram.

    Here each period represents 1 millisecond, and for ease, our refresh rate is 5ms.
    Code (csharp):

    // flawless operation, 3ms ahead of monitor refresh
    ....r....r....r....r   <- monitor cycles
    r....r....r....r....   <- unity cycles
    0    5    5    5 <- delta time

    // flawless operation, 1ms ahead of monitor refresh
    r....r....r....r....r   <- monitor cycles
    ....r....r....r....r.   <- unity cycles
        0    5    5    5   <- delta time


    // second unity frame takes 7 ms to calculate (we drop frame 2)
    r....r....r....r....r   <- monitor cycles
    ....r......r..r..r.   <- unity cycles
        0      5  7  3   <- delta time

    // second unity frame takes 7ms to calculate, we lock delta to multiples of 5 (we drop frame 2, then jump ahead to frame 4, and continue as if we're going into frame 5)
    r....r....r....r....r   <- monitor cycles
    ....r......r..r..r.   <- unity cycles
        0      5 10  5 <- delta time

    That example also doesn't even look at the case where we don't actually drop a frame, but by rounding to the next refresh, we decide to skip that frame anyway and jump to the next.

    Like, you're thinking about deltaTime as if it's for the current frame. So if the current frame time is more than the refresh interval, we're going to miss our frame. But that requires being able to predict how long our frame is going to take. And even if you could predict the future, you'd still be jumping ahead and desynchronizing the clock half the time.

    Edy, the end of the article you linked over and over even has a header titled: "Can't we just ... No we can't". He's right, these kinds of kludges don't work and just make the problem worse.
     
    LaireonGames likes this.
  39. Edy

    Edy

    Joined:
    Jun 3, 2010
    Posts:
    2,364
    Thanks for dedicating the time to write the new diagram. I've reviewed it thoughtfully and understood it entirely. Same fundamental flaw in your reasoning, sorry...

    No, I'm not. I'm assuming deltaTime is calculated based on what we know from past frames.
     
    Zullar likes this.
  40. frosted

    frosted

    Joined:
    Jan 17, 2014
    Posts:
    4,011
    Uh...ok man. I'm done here, have fun with your quest. I'm sure your brilliant:
    Code (csharp):
    if( time.deltaTime > 16 ) time.deltaTime = 32;
    else time.deltaTime = 16;
    solution will revolutionize 3D software. And that the "fundamental flaw" in my reasoning, which you simply can't explain, is sound.

    Good luck.
     
  41. Edy

    Edy

    Joined:
    Jun 3, 2010
    Posts:
    2,364
    Totally misunderstood, but as you wish.
    There are plenty of explanations of it in this thread if you want to understand them. Anyway, so long and thanks for all the fish.
     
    Last edited: Dec 28, 2019
    Ultroman, Noisecrime and Obsurveyor like this.
  42. hellstorm

    hellstorm

    Joined:
    Jun 29, 2015
    Posts:
    39
    Zeitcatcher, Edy and Zullar like this.
  43. Lynxed

    Lynxed

    Joined:
    Dec 9, 2012
    Posts:
    121
    Is this related to the amount of polygons? It seems that the more tris there are on the screen, the more pronounced the "heartbeat" amplitude (delay/jerking motion) is. Right now I don't have LODs in my project and sometimes have up to 150 million polys rendered on screen. Stuff is heavily GPU-instanced. That's where I get insane periodic jerking motion and nothing in the profiler (I have clean GC, Enlighten DynamicGI is turned off, and it shows 60 fps with no spikes at all). I'll be making LODs to try to mitigate this, but I can't be sure it will be enough. My game is heavily proc-gen.
     
    Last edited: Jul 1, 2019
  44. Obsurveyor

    Obsurveyor

    Joined:
    Nov 22, 2012
    Posts:
    277
    It will do it with a single cube.
     
  45. Zullar

    Zullar

    Joined:
    May 21, 2013
    Posts:
    650
    On the roadmap:

    Garbage Collector Upgrade

    Scripting
    Experimental support for incremental garbage collection, which should reduce stutters and time spikes.
     
  46. rz_0lento

    rz_0lento

    Joined:
    Oct 8, 2013
    Posts:
    2,357
    This has been available since 2019.1 for builds. On 2019.2+ it should also work in the editor (you still need to enable it from player settings).
     
    Last edited: Jul 4, 2019
    Zullar likes this.
  47. herb_nice

    herb_nice

    Joined:
    May 4, 2017
    Posts:
    170
    briefly tried the Android Optimized Frame Pacing feature in 2019.2.0f1 @rizu mentioned. there is more information about it here: https://developer.android.com/games/optimize/frame-pacing

    it is enabled in a weird place, in the resolution and presentation sub platform menu. and it's disabled by default.

    the game does run smoother on nexus 5 for sure, but appears to perform a bit worse on average on alcatel a30. less stutter in both cases. overall, it is an improvement.

    it does _not_ result in a fixed deltaTime. delta time is still erratic, especially on the alcatel, so there is still some jitter. it's less bad, but it's still bad.

    when used in conjunction with quantized delta time, however, it works quite well. anecdotally better than without "Optimized Frame Pacing"
     
    Zullar likes this.
  48. hellstorm

    hellstorm

    Joined:
    Jun 29, 2015
    Posts:
    39
  49. DMinsky

    DMinsky

    Joined:
    May 20, 2012
    Posts:
    30
    Ok, here is the thing. Everything you're all saying is 100% true. We don't know when any frame will be dropped. There is no doubt, and no one will argue with that. What we really want is a way to have smooth gameplay without a stutter every second or so. Nothing more.
    And I don't know how, but Source/Frostbite/Unreal, you name it, are able to do that. The picture is really smooth in Unreal without any stutter right out of the box, even inside the editor. And I doubt they achieve it by sacrificing time drift.
    As for Unity, it is simply incapable of smoothly moving a single cube in a scene with a single line of code in the whole project. So, please, don't blame bad dev code here. It's simply unachievable, whether with an uncapped frame rate, with vsync, or with a FreeSync monitor.
    In the end, we're just looking for anything, any hack or workaround, to achieve this smoothness in Unity. Real-world experience with Unreal shows that this is possible, at least for the end-user feel.
     
    Last edited: Aug 8, 2019
    Edy, elias_t, DragonSix and 1 other person like this.
  50. herb_nice

    herb_nice

    Joined:
    May 4, 2017
    Posts:
    170
    i've been able to identify and correct the issue in my own game, thanks to the help of this thread. it is not a GC issue. although bad user code can be a problem, it is not the problem we are discussing.

    the topic of this thread is how unity uses a bad unscaledDeltaTime internally when vsync is enabled. the clock is not synced to vsync. i feel you that it's frustrating that an engine can get something so fundamental so subtly wrong, but your lack of comprehension is not the fault of other posters.
     
    Last edited: Aug 8, 2019
    Ultroman and Edy like this.