How to synchronize time across clients?

Discussion in 'Multiplayer' started by Inaetaru, Aug 12, 2016.

  1. Inaetaru

    Joined:
    Aug 9, 2015
    Posts:
    16
    I would like to synchronize times across networked games so I can calculate the exact time of certain events; however, I'm currently stuck.

    What I'm trying to find out is the difference in Time.unscaledTime between a server and a client. One of them will send a message with a certain event (for example, to spawn a projectile) together with its local time, and the other will calculate when it happened so it can simulate where the projectile actually should be.

    • I'm using Time.unscaledTime because, AFAIK, the accuracy of System.DateTime differs between platforms.
    • The accuracy I would like to reach is < 50 ms.
    - - -

    How I'm trying to solve this

    I was unable to find any document for this, so I made up the following:

    I send ping messages between the server and the client. One is the sender, the other is the receiver:
    • Sender sends ST1 (sender local time #1) - its current time
    • Receiver stores it and sends RT1 (receiver local time #1)
    • Sender replies with ST2
    • Receiver replies with RT2
    The first three are sent over an unreliable channel, the last one over a reliable one. Now I calculate the latency and the remote time offset (both the sender and the receiver have all four values; there is a code sketch after this list):
    • latency1 = (ST2 - ST1) / 2
    • latency2 = (RT2 - RT1) / 2
    • remoteTimeOffset1 = ((ST1 + latency1) - RT1)
    • remoteTimeOffset2 = (ST2 - (RT2 - latency2))
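    Roughly, the per-ping math looks like this. It's a simplified sketch, not the exact attached code: the names are illustrative, and combining the two values into one result per ping is just for brevity (in my code both values go into the sample arrays).
    Code (CSharp):
    // Simplified sketch of the per-ping math (illustrative names, not the attached code).
    // All values are Time.unscaledTime readings in seconds; st* are the sender's, rt* the receiver's.
    public static class PingMath
    {
        public static void Calculate(float st1, float rt1, float st2, float rt2,
                                     out float latency, out float remoteTimeOffset)
        {
            float latency1 = (st2 - st1) / 2f;   // half of the round trip seen by the sender
            float latency2 = (rt2 - rt1) / 2f;   // half of the round trip seen by the receiver

            // Offset such that (remote time + offset) ~= local time, from the sender's point of view.
            float remoteTimeOffset1 = (st1 + latency1) - rt1;
            float remoteTimeOffset2 = st2 - (rt2 - latency2);

            // In the real code both pairs are stored as separate samples;
            // here I just return their means to keep the sketch short.
            latency = (latency1 + latency2) / 2f;
            remoteTimeOffset = (remoteTimeOffset1 + remoteTimeOffset2) / 2f;
        }
    }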
    I store these samples in an array and process them (see the sketch after this list):
    • When there are more than 10 samples, remove outliers using the IQR (interquartile range):
    LowQuartile = 25% quartile
    HighQuartile = 75% quartile
    IQR = HighQuartile - LowQuartile
    remove if value < LowQuartile - 1.5 IQR or value > HighQuartile + 1.5 IQR
    • Remove the highest and lowest values if there are more than 50 samples (keeping 50 samples at most)
    • Calculate the average of the remaining samples - that's the result value (latency or remote time offset)
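    The sample processing, roughly (again a simplified sketch; the helper names and the percentile interpolation are just illustrative, the attached code may differ in details):
    Code (CSharp):
    using System.Collections.Generic;
    using System.Linq;

    public static class SampleFilter
    {
        // Reduces a list of samples (latencies or offsets) to a single value.
        public static float Reduce(List<float> samples)
        {
            var sorted = samples.OrderBy(v => v).ToList();

            // 1) With more than 10 samples, drop IQR outliers.
            if (sorted.Count > 10)
            {
                float lowQuartile  = Percentile(sorted, 0.25f);
                float highQuartile = Percentile(sorted, 0.75f);
                float iqr = highQuartile - lowQuartile;
                sorted = sorted
                    .Where(v => v >= lowQuartile - 1.5f * iqr &&
                                v <= highQuartile + 1.5f * iqr)
                    .ToList();
            }

            // 2) Keep at most 50 samples by trimming the extremes.
            while (sorted.Count > 50)
            {
                sorted.RemoveAt(sorted.Count - 1); // highest
                sorted.RemoveAt(0);                // lowest
            }

            // 3) The result is the plain average of what is left.
            return sorted.Average();
        }

        // Simple linear-interpolation percentile on an already sorted list.
        static float Percentile(List<float> sorted, float p)
        {
            float pos = p * (sorted.Count - 1);
            int lo = (int)pos;
            int hi = System.Math.Min(lo + 1, sorted.Count - 1);
            float frac = pos - lo;
            return sorted[lo] + (sorted[hi] - sorted[lo]) * frac;
        }
    }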
    How I'm testing
    • I run two instances on the same PC with network simulation enabled.
    • With each ping, I also send the current local system time (DateTime.Now).
    • When the ping is processed, I take the remote game time (which is naturally part of the ping) and calculate the delay of the ping from the current local time and the current remote time offset.
    • I add this delay to the remote system time and compare the result with the local system time. The difference is the remote time offset error, plus the DateTime accuracy error, plus the error caused by splitting the game into frames. (Note: it runs at 60 fps.) There is a sketch of this check below.
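    The error measurement looks roughly like this (sketch only; the names and the sign convention of the offset - remote unscaled time + offset = local unscaled time - are how I read my own formulas above, not necessarily the exact attached code):
    Code (CSharp):
    using System;
    using UnityEngine;

    public static class SyncErrorCheck
    {
        // remoteUnscaledTime and remoteSystemTime come with the ping;
        // remoteTimeOffset is the current filtered offset estimate
        // (assumed convention: remote Time.unscaledTime + offset ~= local Time.unscaledTime).
        public static double MeasureErrorMs(float remoteUnscaledTime,
                                            DateTime remoteSystemTime,
                                            float remoteTimeOffset)
        {
            // How long ago, in local unscaled time, the remote timestamps were taken.
            float delay = Time.unscaledTime - (remoteUnscaledTime + remoteTimeOffset);

            // What the local wall clock should read now if the offset estimate were perfect.
            DateTime expectedNow = remoteSystemTime.AddSeconds(delay);

            // Positive = local clock ahead of the estimate; this is the error I'm observing.
            return (DateTime.Now - expectedNow).TotalMilliseconds;
        }
    }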

    What's the problem
    • It's not accurate. When I run it with a simulated latency of 200 ms, the error is 50 - 150 ms. What's more disturbing, it's always plus 50 - 150 ms on the server and minus 50 - 150 ms on the client.
    • When I run it without simulated latency (so the latency should be 0, because it's on the same machine), the error is 5 - 10 ms. That would be more than enough and understandable, because there is a delay caused by splitting the game logic into frames, but again, it's always a positive number on the server and a negative number on the client. I would expect it to oscillate between both signs on both game instances.
    • The higher the simulated latency, the higher the error in synchronizing the times. Even if I use 1000 samples instead of 50.

    Example code
    • Attached
     
