Time class floating point accuracy

Discussion in 'Editor & General Support' started by trepan, Feb 14, 2012.

  1. trepan

    trepan

    Joined:
    Feb 11, 2011
    Posts:
    69
    I was just wondering if someone could clarify how the Unity timer class addresses floating point accuracy deterioration - and hopefully confirm that I don't need to worry about it :)

    The docs suggest that critical fields such as Time.time and Time.realtimeSinceStartup are simply floats, but if so, that means their accuracy will degrade over time (as explained here). Hopefully these values are internally maintained as doubles that get converted to single precision upon access?
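To make the worry concrete: the thread is about Unity's C# Time class, but the degradation is plain IEEE-754 single precision, so a small Python sketch (using only struct to emulate float32 rounding; all names here are illustrative, not Unity API) shows how the gap between adjacent representable values grows with the size of the timestamp:

```python
import struct

def to_f32(x: float) -> float:
    """Round a double to the nearest single-precision float (what a C# float holds)."""
    return struct.unpack('f', struct.pack('f', x))[0]

def f32_spacing(x: float) -> float:
    """Gap between x and the next representable float32 above it."""
    bits = struct.unpack('I', struct.pack('f', x))[0]
    return struct.unpack('f', struct.pack('I', bits + 1))[0] - x

# Resolution of a float32 "seconds since startup" timestamp:
print(f32_spacing(60.0))     # after 1 minute:  ~3.8e-06 s
print(f32_spacing(3600.0))   # after 1 hour:    ~2.4e-04 s
print(f32_spacing(36000.0))  # after 10 hours:  ~3.9e-03 s, coarser than many frame deltas
```

The absolute value never drifts, but the resolution coarsens: after ten hours a float32 timestamp can only change in steps of about 4 ms.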
     
  2. shawn

    shawn

    Unity Technologies

    Joined:
    Aug 4, 2007
    Posts:
    551
    You are correct. Natively, these values are doubles; they are just exposed as floats.
     
  3. trepan

    trepan

    Joined:
    Feb 11, 2011
    Posts:
    69
    Thanks for the reply - good to know. ...Although now that I think about it some more, I can see how this approach gives us consistently reliable deltas, but I'm still not sure I understand how Time.time and Time.realtimeSinceStartup aren't going to become inaccurate, since the value we access is just a single-precision float, i.e. we lose the double precision at the point of conversion. Or is there some magic I'm not seeing?

    Thanks again!
     
  4. ZPSheks

    ZPSheks

    Joined:
    Jan 22, 2013
    Posts:
    4
    trepan: I'm pretty sure you're aware of this, but just to clarify for any future readers of this thread:

    The answer to the last question is that we do lose the double precision at the conversion to float. So be particularly wary with long-running instances (attract demos, game servers, mission-critical software) that have strict timing-precision demands. Use some combination of Time.deltaTime, Time.timeSinceLevelLoad and your own high-precision timer for more accurate results.
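A sketch of why keeping your own double (e.g. summing Time.deltaTime into a double field) beats reading the exposed float total — the numbers are illustrative, and the float32 rounding is emulated in Python with struct:

```python
import struct

def to_f32(x: float) -> float:
    """Round a double to the nearest single-precision float."""
    return struct.unpack('f', struct.pack('f', x))[0]

t = 36000.0005              # 10 hours plus half a millisecond, held in a double
print(t - 36000.0)          # the double resolves the 0.5 ms offset (~0.0005)
print(to_f32(t) - 36000.0)  # the float32 copy snaps back to the ~3.9 ms grid: 0.0
```

The internal double keeps sub-microsecond detail at any realistic uptime; the float conversion throws it away.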
     
    Last edited: Jun 5, 2013
  5. Simon-O

    Simon-O

    Joined:
    Jan 22, 2014
    Posts:
    23
    I had a problem with this in a shader using time as a parameter to generate simplex noise (to drive a "twinkle" effect for surface lights when seen from orbit). If the game was left running for a few hours, the precision degraded to such a degree that effects would "stutter". This isn't because the timer lost its place, but rather that after conversion to a float, the representable values were too far apart to appear smooth.

    In the end, I cheated and kept my own count, resetting it to 0 every hour or so. There was a flicker when I reset, but it was so infrequent (and the shader so sparsely used) that it wasn't an issue.
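The workaround above can be sketched like this — wrapping the timer keeps the value small, where the float32 grid is fine (Python stand-in for the shader parameter; PERIOD and the names are illustrative):

```python
import struct

def f32_spacing(x: float) -> float:
    """Gap between x and the next representable float32 above it."""
    bits = struct.unpack('I', struct.pack('f', x))[0]
    return struct.unpack('f', struct.pack('I', bits + 1))[0] - x

PERIOD = 3600.0                  # reset roughly every hour, as in the post

def wrapped(t: float) -> float:
    return t % PERIOD

t = 36000.0 + 120.0              # ten hours in, two minutes past the last wrap
print(f32_spacing(t))            # raw timestamp: ~3.9e-03 s grid -> visible stutter
print(f32_spacing(wrapped(t)))   # wrapped value (120 s): ~7.6e-06 s grid -> smooth
```

The flicker at the reset happens because the noise function isn't periodic in PERIOD; the post accepts that trade-off since the reset is so infrequent.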
     
  6. peterpi

    peterpi

    Joined:
    May 18, 2013
    Posts:
    20
    The UNET source code has various places where absolute time since startup is stored as a float. e.g. NetworkBehaviour.m_LastSendTime and NetworkTransform.m_LastClientSendTime. This will cause problems for long-running games.
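To illustrate the failure mode (with made-up numbers, not UNET's actual send intervals): once the float32 grid at the current timestamp is wider than the send interval, adding one interval can round back to the same value, so an "enough time elapsed?" check built on float-stored times misbehaves. Float32 rounding emulated in Python:

```python
import struct

def to_f32(x: float) -> float:
    """Round a double to the nearest single-precision float."""
    return struct.unpack('f', struct.pack('f', x))[0]

send_interval = 0.001              # hypothetical high send rate (1 kHz)
last_send = to_f32(36000.0)        # last-send time stored as a float, ~10 h in

now = to_f32(36000.0 + send_interval)  # "current time" one interval later
print(now - last_send)                 # 0.0: the whole interval vanished in rounding
```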
     
  7. petersvp

    petersvp

    Joined:
    Dec 20, 2013
    Posts:
    38
    The problem still exists in 2017.1. Time.realtimeSinceStartup should be exposed as a double as well. Even by day 3, a long-running server loses time precision, unless you use custom timing like I do: real Mono timers (such as System.Diagnostics.Stopwatch) and custom time classes.
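The custom-timer idea can be sketched like this — a Python analogue of a Stopwatch-style clock, built on an integer monotonic tick source so resolution never degrades with uptime (GameClock is an illustrative name, not an API from the thread):

```python
import time

class GameClock:
    """Elapsed time from integer monotonic ticks, so resolution never
    degrades with uptime (same idea as System.Diagnostics.Stopwatch in C#)."""

    def __init__(self):
        self._start_ns = time.perf_counter_ns()  # monotonic integer nanoseconds

    def elapsed(self) -> float:
        # The integer tick count never loses precision; only this final
        # division touches floating point, and a double has ample headroom.
        return (time.perf_counter_ns() - self._start_ns) / 1e9

clock = GameClock()
time.sleep(0.01)
print(clock.elapsed())  # roughly 0.01, no matter how long the process has run
```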
     
  8. Flavelius

    Flavelius

    Joined:
    Jul 8, 2012
    Posts:
    593
    It would be very useful if there was a double version exposed. And as it's stored that way internally, is there anything preventing that?
     
  9. tz18

    tz18

    Joined:
    Dec 14, 2017
    Posts:
    1
    Especially if used for scientific purposes where accurate timekeeping is essential, it would be great to have a double precision Time.time exposed.
     
  10. Joe-Censored

    Joe-Censored

    Joined:
    Mar 26, 2013
    Posts:
    5,428
    I would guess that DateTime.Now.Ticks would be a better choice in that use case; it should be accurate to whatever level the OS timer and hardware implementation allow. It is also a long rather than a floating-point variable, so it doesn't degrade in accuracy as time passes.
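The integer-timestamp point, sketched in Python (time_ns as a stand-in for DateTime.Now.Ticks; nanoseconds here, whereas Ticks are 100 ns units):

```python
import time

# Integer timestamps keep full resolution no matter how large they grow,
# unlike a float, whose grid between representable values widens with magnitude.
t0 = time.time_ns()
t1 = time.time_ns()
print(t1 - t0)  # a small non-negative integer; differencing two longs loses nothing
```

One caveat worth noting: DateTime.Now follows the wall clock and can jump if the system clock changes, so for measuring intervals a monotonic source (like the Stopwatch mentioned above) is safer.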
     
  11. tsibiski

    tsibiski

    Joined:
    Jul 11, 2016
    Posts:
    267
    Can someone explain to me why having a double helps? Aren't both doubles and floats going to have potential precision issues in C#? Neither should be compared for equality, because of the potential epsilon difference between two values.

    I thought that was why decimal was used instead for values that must have guaranteed precision? What am I missing here?
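The short answer is that the issue here isn't equality comparison but resolution: both formats have a fixed number of significand bits (24 for float32, 53 for double), so the spacing between representable values grows with magnitude; double just has 29 more bits, keeping its grid far below a microsecond at any realistic uptime. A quick comparison at the ten-hour mark (float32 rounding emulated via struct):

```python
import math
import struct

def f32_spacing(x: float) -> float:
    """Gap between x and the next representable float32 above it."""
    bits = struct.unpack('I', struct.pack('f', x))[0]
    return struct.unpack('f', struct.pack('I', bits + 1))[0] - x

t = 36000.0             # ten hours of uptime, in seconds
print(f32_spacing(t))   # float32 grid (24-bit significand): ~3.9e-03 s
print(math.ulp(t))      # double grid  (53-bit significand): ~7.3e-12 s
```

decimal is about exact decimal representation (e.g. for currency), not about more precision for timekeeping, so it doesn't help here.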
     
  12. Flavelius

    Flavelius

    Joined:
    Jul 8, 2012
    Posts:
    593