
Time.deltaTime Not Constant: VSync CameraFollow and Jitter

Discussion in 'General Graphics' started by Zullar, Sep 9, 2016.

  1. Zullar

    Zullar

    Joined:
    May 21, 2013
    Posts:
    615
    I'm running with VSync at a constant 60fps, which works out to a dt of 16.667ms (blank project with just a cube).

    When I call Time.deltaTime it varies wildly. Time.deltaTime can be as low as 2ms and as high as 31ms. The deltaTime "averages" out to be ~16.67ms.

    Why does Time.deltaTime vary wildly? Shouldn't it be a constant 16.667ms assuming the Update() loop has minimal computations and lots of idle time?

    My understanding is the Update loop runs as fast as possible (and will have variable Time.deltaTime) but if VSync is enabled then it should sync the Update rate to the display (60fps). Assuming the Update loop isn't overloaded and it can complete in < 16.667ms then shouldn't Time.deltaTime always be a constant 16.667ms?

    The question relates to camera jitter (camera following the player). Because the Time.deltaTime varies so much it is making the camera jitter.

    Thanks in advance.
     
  2. ShilohGames

    ShilohGames

    Joined:
    Mar 24, 2014
    Posts:
    2,469
    In your camera moving code, multiply Time.deltaTime by the vector for one second of travel to get a scaled down movement vector that matches the current time step. That will eliminate your jitters even when the framerate varies.
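    As a plain-C# sketch of that advice (names are illustrative; in Unity the same multiplication would happen in Update() against transform.position):

    ```csharp
    using System;

    public static class FrameIndependentMove
    {
        // Advance a 1D position by a per-second speed scaled by this frame's delta time.
        // Unity equivalent: transform.position += velocityPerSecond * Time.deltaTime;
        public static float Step(float position, float unitsPerSecond, float deltaTime)
        {
            return position + unitsPerSecond * deltaTime;
        }

        public static void Main()
        {
            // Two uneven frames (2ms + 31ms) cover exactly the distance the elapsed
            // wall-clock time dictates, no matter how the time was split.
            float p = 0f;
            p = Step(p, 500f, 0.002f);
            p = Step(p, 500f, 0.031f);
            Console.WriteLine(p); // 500 * 0.033 = 16.5 units
        }
    }
    ```

    Note this keeps the average speed correct across frames, which is ShilohGames's point; whether it eliminates visible jitter when deltaTime mismeasures the displayed interval is the rest of this thread.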
     
  3. Zullar

    Zullar

    Joined:
    May 21, 2013
    Posts:
    615
    Thanks for the tip, but I don't think that helps. The issue is that my monitor updates at 60Hz (16.667ms) and my Update() loop gets called at 60Hz (16.667ms) on average... but some Time.deltaTimes are 1ms and some are over 30ms.

    So if I multiply anything by Time.deltaTime the actual delta motion will vary wildly (because I am multiplying by anywhere between 1ms and 30ms). This causes jitter. If I multiply by a constant 20ms then I will get smooth motion even though my Time.deltaTime varies wildly from its average of 20ms.

    I've also noticed that the Time.deltaTime varies more in the editor than in the build. The build is a bit smoother.

    But with a blank project, no graphics or code load, and VSync on, there *should* be lots of idle time, and I would think Time.deltaTime would be very close to the ideal 16.667ms... why does it vary so much?

    It's worth emphasizing that my monitor framerate is not varying (it's 60Hz) and my Update framerate is also always 60Hz if you take a 1sec average... it's just the Time.deltaTime that varies drastically. This seems so strange.
     
  4. Eric5h5

    Eric5h5

    Volunteer Moderator Moderator

    Joined:
    Jul 19, 2006
    Posts:
    32,145
    How are you determining Time.deltaTime? If you're using Debug.Log every frame, that causes lag. Better to collect data over a period of time and then output it all at once. I don't get any wild variances here:

    Code (csharp):
    0.01665795
    0.01681304
    0.01697999
    0.016348
    0.016451
    0.01687998
    0.01671803
    0.01662397
    0.01652002
    0.01681399
    0.01668197
    0.01667905
    0.01664799
    0.01576495
    0.01757705
    0.01605797
    0.01727098
    0.01665604
    0.01665896
    0.01667005
    0.01631999
    0.01700497
    0.01698202
    0.01636398
    0.01667804
    0.01665395
    0.01652002
    0.01682001
    0.01666498
    0.016675
    0.01666099
    0.01635301
    0.01698601
    0.016662
    0.01660103
    0.01657498
    But yes it's interesting that it varies at all.
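    A buffered collector along those lines might look like this (a sketch with illustrative names, stripped of the MonoBehaviour scaffolding; in Unity you would call Record from Update() and dump once at the end of the measurement window):

    ```csharp
    using System;
    using System.Collections.Generic;

    public static class DeltaTimeRecorder
    {
        static readonly List<float> samples = new List<float>();

        // Called once per frame with Time.deltaTime. No logging happens here,
        // so recording does not itself disturb the frame times being measured.
        public static void Record(float deltaTime) => samples.Add(deltaTime);

        // Dump everything in one go after the measurement window ends.
        public static string Dump() => string.Join("\n", samples);

        public static void Main()
        {
            Record(0.01665f);
            Record(0.01681f);
            Console.WriteLine(Dump());
        }
    }
    ```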

    --Eric
     
  5. ShilohGames

    ShilohGames

    Joined:
    Mar 24, 2014
    Posts:
    2,469
    You need to assume that the frame rate is going to vary some even when you try to lock the frame rate. Make sure your code compensates the movements based on the deltaTime.

    I don't know for sure why the frame rate in Unity varies, but my hunch is that Unity's code for locking the frame rate actually limits the frame rate to just below the desired frame rate instead of correctly locking it with the vsync. The way vertical sync works, the hardware will experience a brief 30 FPS frame rate if the game frame rate dips to 59 FPS, assuming the monitor vsync is 60 FPS.

    By definition, vsync forces the hardware to sync the frame with the monitor. If the game is slightly slower than the monitor, the hardware will sync at the next opportunity, which leads to 60 FPS, 30 FPS, and 15 FPS. That is one reason I like to simply disable vsync.

    When you get a 31ms frame, that is probably where Unity was running 59 FPS and the hardware dropped it back to 30 FPS. The 2ms frame was probably Unity's attempt to compensate for the 31ms frame.
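    The snapping described above can be sketched numerically: with vsync on, a frame that misses a refresh boundary is held until the next one, so the displayed frame time rounds up to a whole number of refresh intervals (an illustration of the arithmetic, not Unity's actual scheduler):

    ```csharp
    using System;

    public static class VsyncSnap
    {
        // With vsync on, a frame is held until the next refresh boundary, so its
        // effective on-screen duration is a whole multiple of the refresh interval.
        public static double EffectiveFrameTime(double renderTimeMs, double refreshMs)
        {
            return Math.Ceiling(renderTimeMs / refreshMs) * refreshMs;
        }

        public static void Main()
        {
            double refresh = 1000.0 / 60.0; // 16.667 ms at 60 Hz
            Console.WriteLine(EffectiveFrameTime(15.0, refresh)); // fits: one interval (60 FPS)
            Console.WriteLine(EffectiveFrameTime(17.0, refresh)); // just misses: two intervals (30 FPS)
        }
    }
    ```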
     
  6. Eric5h5

    Eric5h5

    Volunteer Moderator Moderator

    Joined:
    Jul 19, 2006
    Posts:
    32,145
    No, it happens even with an empty scene. Unity does run vsync correctly (it waits for the monitor sync, that's what WaitForTargetFPS in the profiler is), and there are no dropped frames. In my case the variance is always within a 60Hz cycle, so I expect that the exact moment in which Time.deltaTime is measured each frame is not done at a set time for whatever reason.

    --Eric
     
  7. MariuszKowalczyk

    MariuszKowalczyk

    Joined:
    Nov 29, 2011
    Posts:
    245
    I know this thread is old, but I would like to know the answer. Why is deltaTime not constant with vsync? Even though the fluctuations may look small, the human brain is very good at seeing whether movement is smooth or not, so even something like a 2ms difference is noticeable.

    Is there anyone who knows? Or maybe someone could ask someone from Unity. I think deltaTime is probably the most often used variable, so we should know what's going on.
     
  8. Dave-Hampson

    Dave-Hampson

    Unity Technologies

    Joined:
    Jan 2, 2014
    Posts:
    149
    I can't explain the 2ms, but I can possibly help you with the camera jitter, assuming you've made some kind of blending camera:

    https://twitter.com/davidhampson/status/823652498334420993
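    The one-liner from that tweet (quoted later in this thread) computes a framerate-independent damping factor: the per-frame decay factors multiply to the same total over any split of the same elapsed time. A plain-C# sketch of that property, with illustrative names:

    ```csharp
    using System;

    public static class DampingScale
    {
        // The fraction of the remaining distance to cover this frame, tuned so
        // that 5% of the gap is closed per 1/60s regardless of frame length.
        public static double Scale(double deltaTime)
        {
            return 1.0 - Math.Pow(0.95, deltaTime * 60.0);
        }

        public static void Main()
        {
            // Remaining gap after one 1/60s frame vs. two 1/120s frames:
            double oneStep = 1.0 - Scale(1.0 / 60.0);
            double twoSteps = (1.0 - Scale(1.0 / 120.0)) * (1.0 - Scale(1.0 / 120.0));
            Console.WriteLine(oneStep);  // ~0.95
            Console.WriteLine(twoSteps); // also ~0.95: framerate independent
        }
    }
    ```

    A naive `scale = 0.05f` per frame would damp faster at higher framerates; the Pow form does not.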
     
  9. Zullar

    Zullar

    Joined:
    May 21, 2013
    Posts:
    615
    I appreciate the response but I'll admit I don't understand that code. What is the intent?
    Code (csharp):
    float scale = 1.0f - (float)System.Math.Pow(0.95, Time.deltaTime * 60.0f); // Framerate independent code, good!
    For context my game is set up top-down w/ a camera locked to the rigidbody networked player.


    So getting things to appear smooth has been very difficult. I have 4 interacting deltaTimes.
    1: FixedUpdate: 0.02s (50Hz) constant rigidbody physics engine
    2: Update: Varies between 0.002s and 0.031s. Average of 0.0167s (60Hz)
    3: Monitor: framerate: Constant 0.0167s (60Hz)
    4: NetworkSendInterval. Lets say 0.10s (10Hz) average with some variation.

    So there are several issues that cause jitter.
    Jitter Cause #1: Rigidbodies: Because FixedUpdate does not match Update you will get jitter. Using 50Hz FixedUpdate and 60Hz Update as an example, you will get a repeating 6-frame jitter sequence: in 5 of 6 frames the rigidbody moves faster than actual, and in 1 of 6 frames there is no motion.
    Frame1: Movement at 60/50 (120% speed)
    Frame2: Movement at 60/50 (120% speed)
    Frame3: Movement at 60/50 (120% speed)
    Frame4: Movement at 60/50 (120% speed)
    Frame5: Movement at 60/50 (120% speed)
    Frame6: Movement at 0 (0%) speed
    Frame7: Movement at 60/50 (120% speed)
    Frame8: Movement at 60/50 (120% speed)
    Frame9: Movement at 60/50 (120% speed)
    Frame10: Movement at 60/50 (120% speed)
    Frame11: Movement at 60/50 (120% speed)
    Frame12: Movement at 0 (0%) speed

    Jitter Cause #2: Non-Rigidbodies: Anything that is not a rigidbody that I move manually (transform.position += velocity * Time.deltaTime) is subject to variations in Time.deltaTime. Even though the monitor framerate is exactly 60Hz (16.67ms) and my code is not overloaded (no dropped frames), my Time.deltaTime varies wildly between 2ms and 31ms (even though the average is 16.67ms). This causes very significant jitter. This is the original post in this thread. I don't understand the cause of the Time.deltaTime variance.

    Jitter Cause #3: Non-Owned Networked Objects: If I snap the position of the object to what was received in the network packet then you will see jitter. So care has to be taken in how the position data is applied (i.e. LERP, add delay, etc.).



    To address issue #1 I detach the mesh & renderer from the rigidbody, lag it slightly (by up to 1 frame), and interpolate. This seems to work really well, and adds virtually no delay like a smoothing filter would. The camera is then attached to the mesh & renderer (not the rigidbody). See in the video how things move smoothly even though the camera is attached to a rigidbody with a FixedUpdate rate.

    To address issue #3 I instantly snap the rigidbody to the data received, but again I detach the mesh & renderer from the rigidbody and LERP it (like a 1st-order filter). This adds noticeable lag/delay but I think it is acceptable. I send commanded position/angle/velocity (in addition to current position/velocity) across the network and perform physics locally in between network updates, which seems to help a lot.

    To address issue #2 I'm at a loss (hence this post). With a Time.deltaTime varying from 2ms to 31ms, using transform.position += velocity * Time.deltaTime generates jitter. So if I am moving an object manually every Update() loop, then what deltaTime do I multiply by? That's my hang-up.

    Thanks in advance.
     
  10. Dave-Hampson

    Dave-Hampson

    Unity Technologies

    Joined:
    Jan 2, 2014
    Posts:
    149
    ... In that case the code isn't relevant. I was referring to a damped camera.

    Could that be part of the problem? You should attach the camera to the GameObject transform, not the rigidbody. The rigidbody will move at physics steps, whereas the transform will be interpolated.

    I don't think this statement is true: many games use a FixedUpdate of 1/50. Unity should interpolate the transform for smooth gameplay. Try simplifying the case (e.g. a box, moved by rigidbody gravity) if you aren't convinced Unity is 'smooth' out of the box.

    Have you tried using the Profiler? Maybe you have significant garbage collection going on on the longer frames?

    Makes sense. I'm not entirely sure how to combat this, but there might be some generic networking wisdom you could apply here, or maybe even some UNET tutorials? Sorry not to be of more help here.

    Again, I think you might be reinventing the wheel here, because I think this is what Unity does by default.
    If you try making a simple example with a cube acting due to physics gravity you should be able to convince yourself of this.

    Yep this all sounds like logical netplay stuff.

    You should always use Time.deltaTime, but the real question is why is it fluctuating? I suspect occasional big garbage collections. Try making a simple test in an empty scene, I think you should find it is more consistent.
    If it does turn out to be Garbage Collection, try looking at https://unity3d.com/learn/tutorials...ion/optimizing-garbage-collection-unity-games
     
  11. Zullar

    Zullar

    Joined:
    May 21, 2013
    Posts:
    615
    Ah, it looks like you are right. If you turn on Rigidbody interpolation it interpolates the 1/50 FixedUpdate to a smooth 1/60 Update to remove those jitters. I did reinvent the wheel...



    Regarding Time.deltaTime jitters, try this script in a blank new project and see what min/max Time.deltaTime values it records. For me most timesteps are close to 16.67ms, but periodically (let it run a few minutes) I get outliers (the worst I've seen is 5ms and 34ms) which show up visually as intermittent jitter.

    Code (csharp):
    using UnityEngine;

    public class MeasureDeltaTime : MonoBehaviour
    {
        private float deltaTimeRunningAve;
        private float filterRatio = 0.01f; //1st order filter
        private float deltaTimeMin;
        private float deltaTimeMax;

        private void Start()
        {
            Application.runInBackground = true;
            Invoke("ResetMinMax", 1f); //clear any startup initial outliers
        }

        private void Update()
        {
            //Debug.Log(transform.position.ToString("0.000"));
            float deltaTimeMS = Time.deltaTime * 1000f;
            deltaTimeRunningAve = (1f - filterRatio) * deltaTimeRunningAve + filterRatio * deltaTimeMS;
            deltaTimeMin = Mathf.Min(deltaTimeMin, deltaTimeMS);
            deltaTimeMax = Mathf.Max(deltaTimeMax, deltaTimeMS);
        }

        private void OnGUI()
        {
            GUI.Label(new Rect(10, 40, 200, 30), "DeltaTime: " + (Time.deltaTime * 1000f).ToString("0.00") + "ms");
            GUI.Label(new Rect(10, 70, 200, 30), "DeltaTimeAve: " + deltaTimeRunningAve.ToString("0.00") + "ms");
            GUI.Label(new Rect(10, 100, 200, 30), "DeltaTimeMin: " + deltaTimeMin.ToString("0.00") + "ms");
            GUI.Label(new Rect(10, 130, 200, 30), "DeltaTimeMax: " + deltaTimeMax.ToString("0.00") + "ms");
            if (GUI.Button(new Rect(10, 160, 200, 30), "Reset Min Max DeltaTime"))
            {
                ResetMinMax();
            }
        }

        private void ResetMinMax()
        {
            deltaTimeRunningAve = Time.deltaTime * 1000f;
            deltaTimeMin = 555.555f; //sentinels, overwritten by the first Update
            deltaTimeMax = -555.555f;
        }
    }
     
  12. Dave-Hampson

    Dave-Hampson

    Unity Technologies

    Joined:
    Jan 2, 2014
    Posts:
    149
    Quick update, since I haven't quite finished my investigations yet.

    I've been writing a program to try and visualise these small 'deltaTime' variations, as well as the bigger spikes, to see if I can determine the cause.

    The smaller variations (between 16.5 and 17.0) are actually only about 2%, so they won't make a huge amount of difference. To confirm this I did a small test: one white bar moving at Time.deltaTime and one using a fixed 1/60 interval. They both moved about as smoothly as each other, so I don't think this is a huge concern.





    The spikes though are a bit strange, I haven't quite got to the bottom of them yet. I'm not sure if it's GC, my laptop throttling power or something else. More news when I have it.
     
  13. Zullar

    Zullar

    Joined:
    May 21, 2013
    Posts:
    615
    Are you getting spikes every now and then too? What's your lowest/highest?

    The big spikes are pretty rare for me. I get maybe 1 big spike every minute (by big I mean >4ms from the 16.67ms average) otherwise everything is close to 16.67ms. This leads to a little jitter every minute or so... not the end of the world, but would be nice to fix if you find out a way.

    Thanks for looking into this!
     
  14. MariuszKowalczyk

    MariuszKowalczyk

    Joined:
    Nov 29, 2011
    Posts:
    245
    I investigated this some time ago and came to the conclusion that a 2D square will always jitter, even if you move it without any delta (like 10 pixels to the right every frame with vsync on). It jitters because of the way screens are refreshed: from top to bottom, line by line, left to right. So the whole square does not change position at the same instant; it moves line by line. Sure, the refresh is really fast, but our eyes/brains can notice this as a lack of smooth movement/jitter/tearing, especially when you are paying attention to it. And it's much easier to notice when you look at a 2D square on a solid background. In a complicated 3D environment it is barely noticeable, if at all.

    The jitter is less noticeable when you move the square from top to bottom, again due to the way the monitor refreshes. When you move the square from left to right and watch in very slow motion, you can see a moment when half of the square has already moved to the right while the other half is still waiting for the refresh to finish. If you move it from top to bottom and the square is a single color, the jitter is less noticeable: the same color is refreshed over the same color, and there is never a point where half of the square has moved down and half has not (remember, the refresh runs left to right, line by line). The jitter is only noticeable at the top and bottom edges of the square.

    Anyway, let's get back to the delta. Sure, 2% is not a lot, but you have to remember that most of the time you will move things at some speed; if it is high enough, the variation scales up. Our brain is really good at noticing errors in smoothness.

    If your object moves at 500 pixels per second and the delta varies from 15.7ms to 17.5ms (as in the samples from one of the posts above), then your object moves:
    0.0157 * 500 = 7.85 pixels per frame
    0.0175 * 500 = 8.75 pixels per frame

    (0.0175 - 0.0166) / 0.0166 * 100 = 5.42%

    The fact that the square can only be moved by whole pixels makes things even more complicated. A one-pixel difference on a ~7-pixel move is:
    1 / 7 * 100 = 14.29%

    I think something like that may be noticeable, even if only sometimes. We should try to avoid this if possible. On top of that, I just want to know what is going on under the hood. We use deltaTime all the time, so it's worth knowing what is going on with it and why. I hope Dave will provide us some answers. Thank you for your help!
     
    Last edited: Apr 28, 2017
  15. Eric5h5

    Eric5h5

    Volunteer Moderator Moderator

    Joined:
    Jul 19, 2006
    Posts:
    32,145
    Just turn on vsync, and you prevent anything like that from being able to happen. (Since any changes to the screen happen between refreshes in that case.) Also, monitors don't really work like that any more; you're thinking of old CRTs.

    --Eric
     
  16. MariuszKowalczyk

    MariuszKowalczyk

    Joined:
    Nov 29, 2011
    Posts:
    245
    As you quoted, I said in my post that I was testing with vsync ON. I am not talking about the famous vsync-off tearing effect, but about the subtle tearing visible even with vsync on. And the cause of the tearing I am talking about is the refreshing.

    If the LCD is not refreshing the screen line by line like you can see in this movie:

    then please tell me how the LCD refreshes.
    Also explain my observation of jitter with vsync on while moving the square 10 pixels per frame.
     
    Last edited: Apr 29, 2017
  17. Eric5h5

    Eric5h5

    Volunteer Moderator Moderator

    Joined:
    Jul 19, 2006
    Posts:
    32,145
    There's no tearing effect with vsync on, and I don't get any jitter either. Different LCDs refresh in different ways; for example some only refresh part of the screen that changed.

    --Eric
     
  18. Dave-Hampson

    Dave-Hampson

    Unity Technologies

    Joined:
    Jan 2, 2014
    Posts:
    149
    I've never seen this myself, but even if so, this isn't going to be specifically a Unity problem, is it? There's nothing an engine could do about it.

    I think there must be something wrong with the maths there, because 500 pixels per second at 60Hz is 8.333 pixels per frame. A variation of 2% would be 0.1667 pixels.

    The test program I wrote was designed to check whether this error of +/-2% would be noticeable in a game. At the moment it doesn't look like it will be. Not that there isn't technically a problem here, but I don't think it's too noticeable. As I say, I'm more worried about the occasional larger time glitches at this point. I've found that for some reason Unity takes longer in the WaitForTargetFPS call than the 16.6667ms it should. I haven't quite figured out why yet. It could be something to do with power saving on my laptop, so I will give it a try on another machine later.
     
    Last edited: May 1, 2017
  19. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    25,618
    I suspect the actual delta time variance increases quite a bit depending on how many threaded features you use in Unity; that is, there should be more spikes. It would be absurd if there weren't, even if the workload is reasonably constant.

    In the old days, one would rightfully expect there to be minimal difference. We have much longer pipelines in engines now, and rendering probably takes place much later, after a heck of a lot of noise and multi-threaded code...

    I guess triple buffering was one solution nobody uses any more. It's kind of a pain though, as I'm noticing console games at 30fps tend to be a fair bit smoother than Unity games at 30fps. I wonder what strategies exist beyond motion blur to smooth this out?
     
  20. Zullar

    Zullar

    Joined:
    May 21, 2013
    Posts:
    615
    All testing was done with a blank new project and a blank scene (with just a cube). No scripts (other than the one that measures Time.deltaTime). I'm using Windows 10, which may have unknown background processes.

    The computer is pretty new. i7 4GHz, 16gb ram, GeForce 1070, only SSD (no disk drive)

    I do notice that if I let Unity "run in background" and then minimize/expand applications this does lead to spikes. But all the spikes reported above occur without losing Unity application focus.



    I agree. The small variations are not noticeable from what I've seen. Only the large spikes.
     
  21. jvo3dc

    jvo3dc

    Joined:
    Oct 11, 2013
    Posts:
    1,388
    Are you testing with a multiple-monitor setup by any chance? Then this is a fairly old issue. The 2ms is an as-fast-as-you-can-render frame, and the 31ms balances that out to the correct average. It's not that there are any other variations; there are just these spikes. So you really have three types of frame: 1/60, "as fast as you can", and (2/60 - "as fast as you can").

    It would be nice if this finally got fixed, because VR setups are generally multiple monitor setups by design.
     
  22. Dave-Hampson

    Dave-Hampson

    Unity Technologies

    Joined:
    Jan 2, 2014
    Posts:
    149
    Hello everyone,

    OK so I've been spending a couple of days on this, performing little experiments inside and outside Unity to try and figure out what has been going on. I won't say I've got to the bottom of everything, but I discovered some stuff which I'll share with you. If anyone else has more findings I'm keen to hear them too!

    The first thing I have discovered is there is a big difference between Editor and Standalone Player on Windows. In the standalone player, the code calls timeBeginPeriod(1) whereas in the Editor it does not. For the difference, have a read of this:

    https://randomascii.wordpress.com/2013/07/08/windows-timer-resolution-megawatts-wasted/

    Now if you think about it, this makes sense, because in the Editor you are typically multitasking, maybe you are at a Game Jam, you want your battery to last as long as possible. In the Standalone you are saying "I want the game to be as smooth as possible". So that's why animation is particularly uneven in the Editor.

    In standalone, I didn't get to the bottom of the occasional 1-frame glitching (although it does seem to be happening a lot less for me recently), but I think I can explain the variance in the deltaTime. I think it's just inherent timing inaccuracy.

    For this I took a standard DX9 program, with a Present call, and added this code to measure the time delta:
    Code (C++):
    if (GetAsyncKeyState(VK_F1) & 0x8000)
    {
        LARGE_INTEGER qpft = {0};
        QueryPerformanceCounter(&qpft);
        long long now = qpft.QuadPart;
        static long long prevTime = 0;
        long long delta = now - prevTime;
        double msec = (double)delta / 2435.778; // QueryPerformanceFrequency ticks per millisecond on this machine

        HWND hwnd = FrameWnd;
        if (hwnd)
        {
            char text[128] = {0};
            sprintf(text, "PresentFrame msec = %8.4f", msec);
            SetWindowTextA(hwnd, text);
            prevTime = now;
        }
    }
    Sure enough the frame delta varied in just the same way: 16.88, 15.77...
    So I think it's just inherent to the PC/Windows system, not a Unity problem per se.

    I'd be very curious if someone else can reproduce (or not reproduce!) my results with a non-Unity program.
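    For anyone wanting to try this from C# rather than C++, System.Diagnostics.Stopwatch wraps QueryPerformanceCounter, so a rough standalone probe might look like the following. This is a sketch, not Dave's program: Thread.Sleep(16) only approximates waiting on a 60Hz Present, so expect additional noise from the Windows scheduler on top of any timer inaccuracy.

    ```csharp
    using System;
    using System.Diagnostics;
    using System.Threading;

    public static class TimerProbe
    {
        // Measure wall-clock deltas around an approximately-16ms wait, to see
        // how much the observed "frame" interval wobbles outside of Unity.
        public static double[] Sample(int frames)
        {
            var deltas = new double[frames];
            var sw = Stopwatch.StartNew();
            long prev = sw.ElapsedTicks;
            for (int i = 0; i < frames; i++)
            {
                Thread.Sleep(16); // stand-in for waiting on vsync/Present
                long now = sw.ElapsedTicks;
                deltas[i] = (now - prev) * 1000.0 / Stopwatch.Frequency; // ms
                prev = now;
            }
            return deltas;
        }

        public static void Main()
        {
            foreach (double d in Sample(30))
                Console.WriteLine($"{d:F3} ms");
        }
    }
    ```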

    I have heard that timing is a lot more consistent on PS4, which makes sense in an environment where there is a lot less going on in terms of other processes.
     
  23. Zullar

    Zullar

    Joined:
    May 21, 2013
    Posts:
    615
    Yes I have 2 monitors.
     
  24. MariuszKowalczyk

    MariuszKowalczyk

    Joined:
    Nov 29, 2011
    Posts:
    245
    That's right, it has nothing to do with Unity; I just posted it because I found it interesting. I tested this on an iMac from 2011.

    I don't know how you calculated your 2%; please post your equation. Using your 8.333 it's still 5%:
    (8.75 - 8.333) / 8.333 * 100 = 5%
    That's the maximum difference, but we are talking about the worst case here, not the average case.
     
  25. Zullar

    Zullar

    Joined:
    May 21, 2013
    Posts:
    615
    FrameRateJitter.png
    X axis is each Update sample point
    Y axis is Time.deltaTime
    5.6.1f1

    If anybody is reading/watching this thread would you mind creating a blank new project and a blank scene with this script. What kind of Time.deltaTime variation do you see (both in the editor and when building an executable).

    Code (csharp):
    using UnityEngine;
    using System.Collections.Generic;

    public class FPS : MonoBehaviour
    {
        private List<float> listDeltaTime = new List<float>();
        private const int pixelWidth = 256; //for texture2D
        private const int pixelHeight = 128; //for texture2D
        private Texture2D texture2D;
        private const float deltaTimeMax = 0.05f; //for texture2D
        private static readonly Color colorDarkGrey = new Color(0.3f, 0.3f, 0.3f);

        private void Awake()
        {
            DontDestroyOnLoad(gameObject);
            for (int i = 0; i < pixelWidth; i++)
            {
                listDeltaTime.Add(0.01f);
            }
            texture2D = new Texture2D(pixelWidth, pixelHeight);
            texture2D.filterMode = FilterMode.Point;
        }

        private void Update()
        {
            listDeltaTime.RemoveAt(0);
            listDeltaTime.Add(Time.deltaTime);
            RefreshTexture2D();
        }

        private void OnGUI()
        {
            float deltaTimeAve = Average(listDeltaTime);
            float deltaTimeMin = Mathf.Min(listDeltaTime.ToArray());
            float deltaTimeMax = Mathf.Max(listDeltaTime.ToArray());
            GUI.Label(new Rect(10f, 10f, 200f, 20f), "DeltaTime (Average) = " + (deltaTimeAve * 1000f).ToString("0.000") + "ms");
            GUI.Label(new Rect(10f, 30f, 200f, 20f), "FrameRate (Average) = " + (1f / deltaTimeAve).ToString("0"));
            GUI.Label(new Rect(10f, 50f, 200f, 20f), "Time.deltaTime = " + (Time.deltaTime * 1000f).ToString("0.000") + "ms");
            GUI.Label(new Rect(10f, 70f, 200f, 20f), "deltaTimeMin = " + (deltaTimeMin * 1000f).ToString("0.000") + "ms");
            GUI.Label(new Rect(10f, 90f, 200f, 20f), "deltaTimeMax = " + (deltaTimeMax * 1000f).ToString("0.000") + "ms");
            GUI.Label(new Rect(10f, 110f, 200f, 20f), "% Variation = " + ((deltaTimeMax / deltaTimeMin - 1f) * 100f).ToString("0.0") + "%");
            GUI.DrawTexture(new Rect(0f, 200f, Screen.width, texture2D.height * 2), texture2D);
        }

        private void RefreshTexture2D()
        {
            Color[] pixels = texture2D.GetPixels();
            for (int i = 0; i < pixels.Length; i++)
            {
                pixels[i] = Color.black;
            }
            texture2D.SetPixels(pixels);
            float deltaTimeMin = Mathf.Min(listDeltaTime.ToArray());
            float deltaTimeMax = Mathf.Max(listDeltaTime.ToArray());
            int yMin = GetY(deltaTimeMin);
            int yMax = GetY(deltaTimeMax);
            for (int i = 0; i < pixelWidth; i++)
            {
                texture2D.SetPixel(i, yMin, colorDarkGrey);
                texture2D.SetPixel(i, yMax, colorDarkGrey);
            }
            for (int i = 0; i < pixelWidth; i++)
            {
                int y = GetY(listDeltaTime[i]);
                texture2D.SetPixel(i, y, Color.white);
            }
            texture2D.Apply(false);
        }

        private static int GetY(float deltaTimeIn)
        {
            return Mathf.Clamp(Mathf.RoundToInt(deltaTimeIn / deltaTimeMax * pixelHeight), 0, pixelHeight - 1);
        }

        private static float Average(List<float> listFloatIn)
        {
            float average = 0f;
            for (int i = 0; i < listFloatIn.Count; i++)
            {
                average += listFloatIn[i];
            }
            average = average / listFloatIn.Count;
            return average;
        }
    }
    I typically see 50-100% Time.deltaTime variation (jitter) in the .exe build (Windows). What do you guys see?

    Any idea how to fix this Unity jitter issue? Still unsure why a blank project with Vsync enabled sees 100% Time.deltaTime variation that causes all this jitter.
     
    Last edited: Jul 12, 2017
  26. Zullar

    Zullar

    Joined:
    May 21, 2013
    Posts:
    615
    Please. Anybody with a Windows machine mind running that script really quick and seeing what Time.deltaTime jitter variation you have?
     
  27. BakeMyCake

    BakeMyCake

    Joined:
    May 8, 2017
    Posts:
    121
    I'm not sure what exactly you're trying to prove and furthermore how you do it, but I ran your script and in editor I get 4-10% variation. In build I get 60-120% jitter. Win7.
     
  28. Zullar

    Zullar

    Joined:
    May 21, 2013
    Posts:
    615
    Thanks for testing. Appreciate it.

    It must be related to my machine. In build...
    -I get 20-100% jitter.
    -My friend only gets 5-7%.
    -You get 4-10%

    I do not know what could be causing my machine to be so bad. I have no programs running and it's a new high-end machine (GeForce 1070, SSD).

    It's impossible for me to make a smooth game when a blank Unity project has ~100% Time.deltaTime variation!
     
  29. Zullar

    Zullar

    Joined:
    May 21, 2013
    Posts:
    615
    Also, it's strange: moving my mouse has a massive effect on jitter. If I hold my cursor still the Time.deltaTime variation is *much* smaller.

    On a fresh reboot, with all programs and tasks closed and my mouse held still, I get 5%. Moving the mouse I get 30%.
     
  30. jvo3dc

    jvo3dc

    Joined:
    Oct 11, 2013
    Posts:
    1,388
    And your friend probably only has one.
     
  31. scvnathan

    scvnathan

    Joined:
    Apr 29, 2016
    Posts:
    74
    I just tried it in 2017.1. I get 5-25% in the editor. In a build I get over 1000% variation:

    upload_2017-7-15_1-52-33.png

    Also note I have two monitors
     
    Zullar likes this.
  32. Zullar

    Zullar

    Joined:
    May 21, 2013
    Posts:
    615
    Anybody have any luck figuring out how to fix this Time.deltaTime variation that causes visual jitter?
     
  33. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    25,618
    My only thoughts are that all VR work going on is probably going to shine a light on anything like this. Those platforms have very specific frame timing goals.

    One thing you need to be aware of when running builds to test: don't have the Unity editor also running in the background. It's an easy mistake to make, and it causes the pauses you're describing. It does the same to Netflix running in the foreground.

    For me the editor is a piglet that bounces around leaving a complete mess.
     
    Zullar likes this.
  34. RocketCar

    RocketCar

    Joined:
    Dec 8, 2017
    Posts:
    1
    I get noticeable jitter in Time.deltaTime even on device (an old iPad). I have one counter that's counting the frames that are being rendered and it's a rock-solid 60 per second. Next to it is a number showing me 1 / Time.deltaTime. You'd expect that to stay stable at 60 too, but it's constantly vibrating between 57 and 63, and on occasional frames it dips to under 45 for no apparent reason. No frames are dropped, it's just a constant 5% variance with the odd 30% slower deltaTime.

    Every frame I'm calculating and visualizing an object's velocity (distance / deltaTime), and it's fairly smooth despite the variance. (Hence why most games, including my own, look fine with it) But I'm also calculating its acceleration (velocity / deltaTime), and that compounded error makes for noticeable noise even when damped.

    For sensitive values I'm going to have to write my own ".quantizedDeltaTime" to round off deltaTime to exactly 90/60/30/15fps and base my math on that instead.
     
    Zullar likes this.
  35. Zullar

    Zullar

    Joined:
    May 21, 2013
    Posts:
    615
    Yes, I'm also considering filtering/locking the delta time (instead of using the varying Time.deltaTime). My only worry is that if frames start to drop or slow down, doing this would create many unintended consequences.

    Since this deltaTime variation occurs on multiple (all?) platforms and regardless of loading (it occurs with a simple blank project) I'm surprised that *everybody* isn't experiencing this issue.
     
  36. petersvp

    petersvp

    Joined:
    Dec 20, 2013
    Posts:
    41
    Everybody.
    Sadly, these days devs don't even care about system integration stuff. Nobody does. Check Steam, for example.
    I experience jitters and stutters with every single Unity game. In Unreal games I have another issue: resolution scale. And Source games (Valve) seem smooth as... baby skin. Spelunky is also smooth. Even in INSIDE I had some stuttering, on an i7 / nVidia 970 / 16 GB DDR4.

    I do have timing issues too: 200 fps with VSync off, and 60/60 dropping to half (30/60) with VSync on. I have no idea what's actually happening, but on a 120Hz monitor it is unacceptable to get locked to 72 or even 48 fps for no reason.

    About system integration, I posted https://fogbugz.unity3d.com/default.asp?975924_otd1ouli7el4jvi0 - hopefully they will address this because it does not happen for me in Source games, and the scene isn't that huge.

    I can basically get perfect timing only at 60/60 (60Hz). Why do I have a 120Hz monitor if the game just jitters all over the place?

    As for the above code: ~100-200% jitter with two monitors on.
    ~15-30% jitter with one monitor only.
    [and a smoother experience]
     
  37. Zullar

    Zullar

    Joined:
    May 21, 2013
    Posts:
    615
    Ugh.

    I was thinking of creating a SmoothTime.deltaTime script. Something along the lines of:
    1: Access the monitor VSync rate (i.e. 60Hz, 120Hz). Is there a way to do this? Calculate vSyncDeltaTime = 1/vSyncRate (i.e. 16.67ms for 60Hz or 8.33ms for 120Hz).
    2: If Time.deltaTime is ~= vSyncDeltaTime, then SmoothTime.deltaTime would return exactly vSyncDeltaTime.
    3: Else, in the event of dropped frames/slowdown, SmoothTime.deltaTime would return the closest whole multiple of vSyncDeltaTime (i.e. for a 60Hz monitor it would return exactly 16.67ms (no dropped frame), 33.33ms (1 dropped frame), 50.00ms (2 dropped frames), etc.).
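
    A minimal sketch of those steps (everything here is hypothetical: the SmoothTime name, the 60Hz fallback, and the assumption that Screen.currentResolution.refreshRate reports the true monitor rate, which isn't guaranteed on every platform):

    Code (csharp):

    using UnityEngine;

    //Hypothetical band-aid: snap Time.deltaTime to the nearest whole multiple
    //of the monitor's VSync interval. Sketch only, not a tested implementation.
    public static class SmoothTime
    {
        public static float deltaTime
        {
            get
            {
                int refreshRate = Screen.currentResolution.refreshRate;
                float vSyncDeltaTime = 1f / (refreshRate > 0 ? refreshRate : 60); //60Hz fallback if rate is unknown

                //1 interval = no dropped frame, 2 = one dropped frame, etc.
                int intervals = Mathf.Max(1, Mathf.RoundToInt(Time.deltaTime / vSyncDeltaTime));
                return intervals * vSyncDeltaTime;
            }
        }
    }

    Anything moved manually could then use SmoothTime.deltaTime in place of Time.deltaTime; engine-internal consumers of Time.deltaTime would of course be unaffected.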

    Do you think some band-aid like this could solve our jitter problems? See any issues with it?

    I think this could improve jitter for manually moved objects. However, things that are moved automatically by Unity (i.e. rigidbody physics) rely internally on Time.deltaTime, I believe, and would therefore still jitter (unless you detach the object's visual mesh from the rigidbody and do a bunch of fancy compensation calcs?).
     
    Last edited: Dec 11, 2017
  38. Zullar

    Zullar

    Joined:
    May 21, 2013
    Posts:
    615
    Issue still exists 2017.2.0f3. I just bug reported
    https://fogbugz.unity3d.com/default.asp?977641_r0jbe37tl5p0h0sk

    If anybody is reading/watching this thread would you mind creating a blank new project and a blank scene with this script. What kind of Time.deltaTime variation do you see (both in the editor and when building an executable).

    Code (csharp):
    using UnityEngine;
    using System.Collections.Generic;

    public class FPS : MonoBehaviour
    {
        private List<float> listDeltaTime = new List<float>();

        private const int pixelWidth = 256; //for texture2D
        private const int pixelHeight = 128; //for texture2D

        private Texture2D texture2D;

        private const float deltaTimeMax = 0.05f; //for texture2D

        private static readonly Color colorDarkGrey = new Color(0.3f, 0.3f, 0.3f);

        private void Awake()
        {
            DontDestroyOnLoad(gameObject);
            for (int i = 0; i < pixelWidth; i++)
            {
                listDeltaTime.Add(0.01f);
            }

            texture2D = new Texture2D(pixelWidth, pixelHeight);
            texture2D.filterMode = FilterMode.Point;
        }

        private void Update()
        {
            listDeltaTime.RemoveAt(0);
            listDeltaTime.Add(Time.deltaTime);
            RefreshTexture2D();
        }

        private void OnGUI()
        {
            float deltaTimeAve = Average(listDeltaTime);
            float deltaTimeMin = Mathf.Min(listDeltaTime.ToArray());
            float deltaTimeMax = Mathf.Max(listDeltaTime.ToArray());

            GUI.Label(new Rect(10f, 10f, 200f, 20f), "DeltaTime (Average) = " + (deltaTimeAve * 1000f).ToString("0.000") + "ms");
            GUI.Label(new Rect(10f, 30f, 200f, 20f), "FrameRate (Average) = " + (1f / deltaTimeAve).ToString("0"));
            GUI.Label(new Rect(10f, 50f, 200f, 20f), "Time.deltaTime = " + (Time.deltaTime * 1000f).ToString("0.000") + "ms");
            GUI.Label(new Rect(10f, 70f, 200f, 20f), "deltaTimeMin = " + (deltaTimeMin * 1000f).ToString("0.000") + "ms");
            GUI.Label(new Rect(10f, 90f, 200f, 20f), "deltaTimeMax = " + (deltaTimeMax * 1000f).ToString("0.000") + "ms");
            GUI.Label(new Rect(10f, 110f, 200f, 20f), "% Variation = " + ((deltaTimeMax / deltaTimeMin - 1f) * 100f).ToString("0.0") + "%");

            GUI.DrawTexture(new Rect(0f, 200f, Screen.width, texture2D.height * 2), texture2D);
            GUI.Label(new Rect(200f, 180f, 400f, 20f), "X-Axis: Sample             Y-Axis: Time.deltaTime");
        }

        private void RefreshTexture2D()
        {
            Color[] pixels = texture2D.GetPixels();
            for (int i = 0; i < pixels.Length; i++)
            {
                pixels[i] = Color.black;
            }
            texture2D.SetPixels(pixels);

            float deltaTimeMin = Mathf.Min(listDeltaTime.ToArray());
            float deltaTimeMax = Mathf.Max(listDeltaTime.ToArray());
            int yMin = GetY(deltaTimeMin);
            int yMax = GetY(deltaTimeMax);

            for (int i = 0; i < pixelWidth; i++)
            {
                texture2D.SetPixel(i, yMin, colorDarkGrey);
                texture2D.SetPixel(i, yMax, colorDarkGrey);
            }

            for (int i = 0; i < pixelWidth; i++)
            {
                int y = GetY(listDeltaTime[i]);
                texture2D.SetPixel(i, y, Color.white);
            }
            texture2D.Apply(false);
        }

        private static int GetY(float deltaTimeIn)
        {
            return Mathf.Clamp(Mathf.RoundToInt(deltaTimeIn / deltaTimeMax * pixelHeight), 0, pixelHeight - 1);
        }

        private static float Average(List<float> listFloatIn)
        {
            float average = 0f;
            for (int i = 0; i < listFloatIn.Count; i++)
            {
                average += listFloatIn[i];
            }
            average = average / listFloatIn.Count;
            return average;
        }
    }
     
    Last edited: Dec 11, 2017
  39. nxrighthere

    nxrighthere

    Joined:
    Mar 2, 2014
    Posts:
    537
    @Zullar With VSync. Editor: ~15.435 ms / ~17.253 ms (~10%). Standalone build: ~16.349 ms / ~17.075 ms (~5%).
    Without VSync. Editor: ~4.510 ms / ~13.447 ms (~200%). Standalone build: ~2.631 ms / ~3.222 ms (~25%).
     
    Last edited: Dec 16, 2017
    Zullar likes this.
  40. Zullar

    Zullar

    Joined:
    May 21, 2013
    Posts:
    615
    @nxrighthere Thanks for running it. I wonder why you see so much less jitter than I do. For your standalone build you see 5%, but I see around 120% when moving the mouse. What kind of hardware are you running? Windows 10? 1 monitor or 2? Does moving your mouse affect the % jitter?
     
  41. nxrighthere

    nxrighthere

    Joined:
    Mar 2, 2014
    Posts:
    537
    GTX 950 OC, AMD FX-4300 (4 GHz), Windows 10 (Build 1709), 1 monitor. I'm using Unity 2017.1.2p4.

    Yes, if I don't move the mouse I get 3%, and if I start moving it the jitter increases by 2-3%.
     
    Last edited: Dec 19, 2017
    Zullar likes this.
  42. Zullar

    Zullar

    Joined:
    May 21, 2013
    Posts:
    615
    Cool, looks like Unity is digging into this. From Unity QA:

    "We have talked with one of our developers and concluded that wanting to set the time step source is a valid feature request. We have sent it to our developers for consideration.

    We have also investigated our frame time variations and it appears we get inconsistent results between systems. We will be creating a request to look into Time.deltaTime variation too."



    Hopefully Time.deltaTime can be smoothed out because it manifests itself in many ways
    -Manual Motion Jitter
    -Physics Rigidbody Interpolated Jitter
    -Particle System Jitter (I think)
    -Animation/Mechanim Jitter (I think)
    -Anything else that relies on Time.deltaTime being smooth
     
    Edy and scvnathan like this.
  43. Edy

    Edy

    Joined:
    Jun 3, 2010
    Posts:
    1,554
    I've been noticing such hiccups for a long time. Indeed, I can no longer record videos using cameras with a smooth "Look At" method because the issue is so clearly noticeable that it hurts. Even Cinemachine cameras show huge hiccups!

    I've tested your script. In the editor I get a consistent variation of about 80 - 120% without doing anything.

    In a build there's even more fun: 1430%. VSync, DX11, full-screen window, and a dumb empty scene. Results are similar in full-screen exclusive mode. Worth noting that I also have two screens.

    upload_2018-1-17_0-53-22.png

    In my case those large hiccups are produced every second:

    upload_2018-1-17_1-8-9.png

    It must be Unity. I've discarded everything else: BIOS (hyperthreading etc), scheduled tasks, background processes, CPU / GPU power options...

    At the same time, under the same conditions, Unigine's "Valley" benchmark runs at such a beautifully smooth 60fps that it makes me cry.

    My hardware: Asus Z170i, Intel Core i7-7700K, GeForce GTX 1070, RAM 32 GB, SSDs...

    I had even suspected that the deltaTime value itself was wrong. So I wrote a native C++ DLL that calls the WinAPI functions QueryPerformanceFrequency and QueryPerformanceCounter and used it in Unity for timing the intervals between each Update in the most precise way. The results are consistent with the deltaTime values, so there is a true delay here. Unity is doing something internally that halts the whole engine every single second. Not to mention the disparity on the regular deltaTime values.
     
    Last edited: May 4, 2018
  44. Zullar

    Zullar

    Joined:
    May 21, 2013
    Posts:
    615
    Thanks for running script. I wonder why you see such a massive amount of jitter.

    The good news: Unity has reproduced this variation and is investigating. They told me they've also seen a jitter difference between different hardware, and that they noticed a sensitivity to background applications running. I noticed this as well... so it's surprising that closing all background applications doesn't change things for you.

    Hopefully Unity can get to the bottom of this. They've been very responsive so I'm crossing my fingers.


    As a band-aid you can use Time.smoothDeltaTime or create a filtered Time.deltaTime script. This can improve things you have control over, but it can't fix things that internally rely on Time.deltaTime (animations, rigidbody interpolation, etc.).
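
    As a rough illustration, a filtered Time.deltaTime script could be as simple as a moving average (a hypothetical component; the FilteredDeltaTime name and the 8-frame window are arbitrary choices, not anything built into Unity):

    Code (csharp):

    using UnityEngine;
    using System.Collections.Generic;

    //Hypothetical filter: average Time.deltaTime over the last few frames.
    //Smooths out frame-to-frame variation, at the cost of lagging slightly
    //behind genuine frame-rate changes.
    public class FilteredDeltaTime : MonoBehaviour
    {
        private const int windowSize = 8; //arbitrary window
        private readonly Queue<float> samples = new Queue<float>();

        public float Value { get; private set; } //use this instead of Time.deltaTime

        private void Update()
        {
            samples.Enqueue(Time.deltaTime);
            while (samples.Count > windowSize)
            {
                samples.Dequeue();
            }

            float sum = 0f;
            foreach (float sample in samples)
            {
                sum += sample;
            }
            Value = sum / samples.Count;
        }
    }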
     
  45. Edy

    Edy

    Joined:
    Jun 3, 2010
    Posts:
    1,554
    I wonder that too. The number of background applications has absolutely no effect in my case. That makes some sense, though, as the hardware is perfectly capable of running all of them simultaneously. It's just Unity doing something that blocks itself.

    I've tested smoothDeltaTime, and even forced a constant delta time (= 1.0f / Screen.currentResolution.refreshRate) with rather acceptable results in both cases. Still, the issue persists and jitter is evident in some situations.

    Thanks for writing the test script!
     
    Zullar likes this.
  46. Zullar

    Zullar

    Joined:
    May 21, 2013
    Posts:
    615
    Unity just messaged me today and said they will be "trying our best to fix VSync algorithm". This is great. That's the underlying root cause of the jitter.

    The way I see the problem: if the CPU/GPU is very fast and has lots of idle time, it should have no problem achieving 60fps. And it does achieve 60fps... but each Time.deltaTime varies wildly when it should be constant. No idea why, but hardware and background apps seem to have an effect.

    But the bottom line with no dropped frames: Time.deltaTime should exactly equal 1f/MonitorRefreshRate.
    A 60Hz monitor should generate a constant 16.67ms Time.deltaTime.

    What I've been arguing is that with VSync enabled, each Time.deltaTime should be an exact multiple of the monitor refresh interval. With dropped frames things get more complicated: with 1 dropped frame Time.deltaTime = 33.3ms, with 2 dropped frames 50ms, etc. The trick is that you probably don't know the frame will be dropped (due to excessive load) until midway through the update loop... so it will probably still jitter. But I think we can accept that if we have massive load from our app (or background apps) and drop frames, the game will not run smoothly. I don't care much about the dropped-frame scenario anyhow, because I won't want my game running under those conditions.

    I think apps should almost always aim to run w/o dropped frames and this is where the smooth focus should be. And to be smooth Time.deltaTime must exactly equal 1f/MonitorRefreshRate (i.e. 16.67ms for a 60Hz monitor). Anything other than this will generate jitter.
     
    Noisecrime, Edy and nxrighthere like this.
  47. Edy

    Edy

    Joined:
    Jun 3, 2010
    Posts:
    1,554
    Great news!

    Please let me know if I can help in any way, i.e. testing scripts, providing feedback, etc.
     
    Zullar likes this.
  48. Edy

    Edy

    Joined:
    Jun 3, 2010
    Posts:
    1,554
    Oh, I have a reply for this. Time.deltaTime is the time interval between the previous two Update calls. I measured it with a precise independent timer using WinAPI via a native C++ DLL, and the results match. This is why background apps have such an effect: Update calls take a bit longer to happen when background apps are also demanding CPU time.

    This would be correct when VSYNC is disabled. But with VSYNC enabled (and idle time enough) I agree it should be constant: it should measure the time the frame will be presented.
     
    Zullar likes this.
  49. Zullar

    Zullar

    Joined:
    May 21, 2013
    Posts:
    615
    Completely agree. Even if the actual time between Updates varies slightly due to background demands, as long as Unity is still able to produce 60fps, the VSync'd Time.deltaTime should be 1f/60, or 16.67ms.

    i.e. if an external app interrupts, uses 10ms of my 16.67ms, and delays my Update call by 10ms, I don't want one deltaTime of 26.67ms followed by one of 6.67ms... I want them both to be 16.67ms (as long as I can complete my Unity loop in < 6.67ms).

    In reality I think we couldn't care less about the time between Update calls. All we care about is the time between monitor frames... which should be a constant 16.67ms with VSync enabled and no dropped frames.

    It gets very complicated and ugly when you start considering dropped frames. But again, I don't care much about this scenario because I don't intend my users to operate under these conditions. I accept it will be choppy no matter what you do once you start dropping frames.
     
  50. Edy

    Edy

    Joined:
    Jun 3, 2010
    Posts:
    1,554
    I may be wrong (I'm not an expert on this), but I can think of a simple way of managing deltaTime and dropped frames when VSync is enabled.

    Every "physical" frame occurs at the screen at the hardware rate. When an Update cycle misses its frame opportunity, there's no way to recover it. There's no point in making a "deltaTime = <6ms" after a long "deltaTime = >25ms". That just makes the average deltaTime match the screen rate, but I don't see any practical meaning in it.

    So when "hardware" frames are dropped, we must ensure the logic in Update can match the varying rate. I can think of this simple algorithm:
    • When an Update cycle makes it into its corresponding frame, then deltaTime = 1/refreshRate.
    • If an Update misses its frame but gets presented in the following one (1 frame dropped), then deltaTime = 2 * 1/refreshRate in the next frame.
    • General case: if an Update cycle has to bypass n frames before it gets presented, then deltaTime = (1+n) * 1/refreshRate in the next frame.
    This gives the underlying logic the opportunity to adapt properly to the varying rate.
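
    A script-side sketch of that rule (Unity doesn't expose which hardware frame an Update was presented into, so n has to be estimated from the measured delta here; the QuantizedDelta name is made up for illustration):

    Code (csharp):

    using UnityEngine;

    //Sketch of deltaTime = (1 + n) * 1/refreshRate, where n is the number of
    //bypassed frames, estimated by rounding the measured delta to whole
    //refresh intervals.
    public static class QuantizedDelta
    {
        public static float Get(float measuredDelta, int refreshRate)
        {
            float interval = 1f / refreshRate; //16.67ms at 60Hz
            int framesUsed = Mathf.Max(1, Mathf.RoundToInt(measuredDelta / interval));
            return framesUsed * interval; //(1 + n) * 1/refreshRate
        }
    }

    Calling Get(Time.deltaTime, 60) would snap a ~33ms measured delta to exactly two refresh intervals, matching the "1 frame dropped" case above.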

    An example: imagine moving an object at 1m/s using deltaTime, like position = position + vel * deltaTime. The position should advance exactly 1m each second. Now imagine each Update cycle takes 20ms due to the complexity of the scene. The screen rate is 60Hz, so every Update misses its frame but gets presented in the following one.

    Using the previous algorithm, deltaTime would consistently be 2 * 1/refreshRate = 2 * 16.67ms = 33.33ms. At 30 Update cycles per second, this allows the object to move exactly 1m each second.

    If the Update cycle goes back to <16ms, then deltaTime would be 16.67ms in the very next frame. There's no point in making it a small value just to match a pointless average.

    Again, I might be wrong, as I may be missing other factors. Maybe the implementation details are highly complex. But from the user's point of view, this is what I'd expect from deltaTime.
     
    Cynicat and Zullar like this.