Help with Time.deltaTime on different platforms!

Discussion in 'Editor & General Support' started by SleepyWolfie, Jan 14, 2021.

  1. SleepyWolfie

    SleepyWolfie

    Joined:
    Jun 8, 2020
    Posts:
    36
    The problem is the following, using this very simple code:

    timeElapsed += Time.deltaTime;
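
    For context, the whole thing is basically just this (simplified, not my exact script; the Text field is only there to display the value):

    using UnityEngine;
    using UnityEngine.UI;

    public class ElapsedTimer : MonoBehaviour
    {
        public Text label;          // assigned in the Inspector, purely for display
        private float timeElapsed;

        void Update()
        {
            // Accumulate the time since the last frame, every frame.
            timeElapsed += Time.deltaTime;
            label.text = timeElapsed.ToString("F2");
        }
    }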

    After hooking this up to a piece of UI (or the log, or whatever), I run it on two devices at the exact same time:

    1) On a common Android phone,

    2) On a Mac Pro,

    Now, I am fully aware that the Mac is many times faster and more powerful than a phone, that it reaches a much higher frame rate, and that Android phones (like many phones) are frame-capped by their refresh rate. I'm also aware that Time.deltaTime is a bit inconsistent (something that will be improved in 2020.2).

    Given my awareness of that, I can't understand how and why the timers are completely off. Time.deltaTime should be frame rate independent (small inconsistencies aside), so the timer shouldn't be off by over 10 seconds (the PC ahead of the phone by 10 seconds) after 40 seconds have elapsed. The PC accumulates delta time MUCH faster than the Android phone. This means that if I had two characters, one on the phone and one on the PC, run a 100-meter race, the PC would always win. How is this possible? Isn't Time.deltaTime the tool that should solve this no matter the frame rate? What am I doing wrong? Is this something I just have to accept? And how would an Android phone compare to other phones (Android and iOS) in this regard?

    Please help; I'd be happy to provide any other info needed. Thanks in advance!

    EDIT: Typo
     
  2. Kurt-Dekker

    Kurt-Dekker

    Joined:
    Mar 16, 2013
    Posts:
    38,736
    How are you measuring this? Are you letting the app start and finish all of its initialization before pressing a button to start the timer on each platform?
     
  3. SleepyWolfie

    SleepyWolfie

    Joined:
    Jun 8, 2020
    Posts:
    36
    I enter a scene and the UI starts counting, just that. Note that this isn't a startup thing: at first the difference might be a few milliseconds, but that difference keeps growing.

    Say at t = 1 the timers might read PC at 0.98s and Android at 0.8s; then at around t = 20, the PC is at 19.2s and the Android at 15s. The difference is enormous and only gets bigger with time.
     
  4. Xarbrough

    Xarbrough

    Joined:
    Dec 11, 2014
    Posts:
    1,188
    If you need this to track time consistently between multiple devices, you're out of luck. It's impossible to let multiple timers run independently on different machines and expect them to stay synchronized. This is true for most commercial software, consumer PCs, and Unity. I'd love to dive into the exact reasons myself, but roughly speaking, computers run on an internal clock with limited precision, and different devices will run at slightly different speeds. This is also the reason you have to synchronize your PC's clock with an online server from time to time. You can go a few days without it, but after several weeks or months, the time will be off.

    The precision difference per frame is very small, but accumulated over many frames it becomes rather large quickly. As to why you're seeing differences this large, I have no clue; I'd be interested to know more about this myself.

    In any case, if you need to synchronize time between multiple devices, you'll either need a server to dictate the time or a hardware clock used as a single source of truth.

    In this Unity blog article, there's some more info.
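
    Very roughly, the "server dictates the time" option could look something like this (just a sketch; the endpoint URL and response format are made up, and a real implementation would account for network latency and errors):

    using System;
    using System.Collections;
    using UnityEngine;
    using UnityEngine.Networking;

    // Each device measures the offset between its own clock and the server's clock once,
    // then derives a shared time from that offset instead of trusting its local timer alone.
    public class ServerClock : MonoBehaviour
    {
        private double offsetSeconds;

        IEnumerator Start()
        {
            // Hypothetical endpoint returning the current Unix time in seconds as plain text.
            using (UnityWebRequest req = UnityWebRequest.Get("https://example.com/epoch-seconds"))
            {
                yield return req.SendWebRequest();
                double serverNow = double.Parse(req.downloadHandler.text);
                double localNow = DateTimeOffset.UtcNow.ToUnixTimeMilliseconds() / 1000.0;
                offsetSeconds = serverNow - localNow;
            }
        }

        // All devices asking the same server end up with (roughly) the same value here,
        // so small per-device drift no longer accumulates into big differences.
        public double SharedTimeSeconds =>
            DateTimeOffset.UtcNow.ToUnixTimeMilliseconds() / 1000.0 + offsetSeconds;
    }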
     
  5. Neto_Kokku

    Neto_Kokku

    Joined:
    Feb 15, 2018
    Posts:
    1,751
    Even if Unity were to provide a completely accurate delta time (which it doesn't), the fundamentals of floating point numbers make it impossible to get the same results when accumulating values like that, because the small imprecisions will accumulate differently.

    For example, incrementing a variable by 0.01 a hundred times will always give a slightly different result than incrementing it by 0.001 a thousand times, even though both "should" end up at exactly 1. You can see this happen on the same machine.
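
    You can see it with a couple of loops (quick sketch, dropped into any Start() method):

    float a = 0f;
    for (int i = 0; i < 100; i++) a += 0.01f;   // "should" be 1.0

    float b = 0f;
    for (int i = 0; i < 1000; i++) b += 0.001f; // "should" also be 1.0

    // Neither is exactly 1, and they differ from each other, because 0.01 and 0.001
    // can't be represented exactly in binary and the rounding error accumulates differently.
    Debug.Log(a.ToString("R") + " vs " + b.ToString("R"));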
     
  6. SleepyWolfie

    SleepyWolfie

    Joined:
    Jun 8, 2020
    Posts:
    36
    Thanks for answering! I understand, and it makes sense. I guess my frustration came from the fact that the difference is enormous, which (like you) I'd like to get to the bottom of, and I will post here if I make progress on it.