
Bug Network Simulator leads to disconnects

Discussion in 'Multiplayer Tools' started by iddqd, Sep 1, 2016.

  1. iddqd


    Joined:
    Apr 14, 2012
    Posts:
    501
    Hi,

    I'm testing my game on the same machine, and it works fine until I enable the Network Simulator with:
    Latency: 50ms
    Packet Loss: 0%

    With the simulator enabled, the client gets disconnected from the server after about a minute.

    What is the reason for this? It doesn't happen when the Network Simulator is disabled.

    Thanks.
     
  2. srylain


    Joined:
    Sep 5, 2013
    Posts:
    159
    I was having problems a while ago with adding any sort of lag simulation. One of the devs explained that if the engine detects more than 5% packet loss between heartbeats (heartbeats are the packets the engine itself sends to confirm you're still connected), it starts to throttle your connection because it assumes there's network congestion. The problem is that by sending even fewer packets, the throttling compounds more and more until it eventually times you out.

    The dev said that running this before you start the server mostly fixes that problem (it could be your problem too):
    NetworkManager.singleton.connectionConfig.NetworkDropThreshold = 90;
    You can change the value to whatever you want, but the dev said you're usually fine leaving it around 80-90. Before I used that line, I would drop connections with only 5% simulated packet loss; now I can set it up to 80% and it won't disconnect.
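    For context, here's a minimal sketch of where that line would go, assuming a standard UNet NetworkManager setup (the class name MyNetworkManager is made up for the example; NetworkDropThreshold and customConfig are real UNet members):

    ```csharp
    using UnityEngine;
    using UnityEngine.Networking;

    // Hypothetical NetworkManager subclass, just to show where the
    // threshold tweak fits relative to starting the host.
    public class MyNetworkManager : NetworkManager
    {
        void Start()
        {
            // customConfig must be enabled for connectionConfig changes
            // to take effect.
            customConfig = true;

            // Raise the packet-loss percentage at which UNet starts
            // throttling the connection. Must be set before the
            // host/server starts so the config is applied.
            connectionConfig.NetworkDropThreshold = 90;

            StartHost();
        }
    }
    ```

    The key point is ordering: the config is read when the host or server starts, so setting the threshold afterwards has no effect.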
     
  3. iddqd


    Joined:
    Apr 14, 2012
    Posts:
    501
    Yeah thanks, I read about that and set it to 90, but it didn't change anything.

    In any case, how can there not be enough packets if I'm sending 40 messages per second on the same computer, which should have 0% packet loss?

    And if this is the solution, then why not make it the default value in the next Unity update?
     
  4. iddqd


    Joined:
    Apr 14, 2012
    Posts:
    501
    Well, the latency simulation seems to be buggy, or at least incorrectly named.

    If I set the latency to 300ms, my packets arrive only every 300ms, when they should normally arrive every 50ms (20 messages per second). The way I understand latency, a client with a high ping should still receive packets roughly every 50ms from the server, just shifted 300ms later because of the distance to the server.
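    For reference, the simulator can also be driven from code instead of the inspector checkbox, assuming a scene with a UNet NetworkManager (useSimulator, simulatedLatency, and packetLossPercentage are real NetworkManager fields; whether the delay is applied as a fixed offset per packet or as spacing between packets is exactly what's in question here):

    ```csharp
    using UnityEngine.Networking;

    public static class SimulatorSetup
    {
        // Enables UNet's built-in network simulator from code, matching
        // the "Use Network Simulator" inspector options. With true
        // latency you'd expect packets sent 50ms apart to still arrive
        // 50ms apart, each just 300ms late; the behavior described in
        // the post instead spaces arrivals 300ms apart.
        public static void StartSimulatedClient(NetworkManager manager)
        {
            manager.useSimulator = true;          // "Use Network Simulator"
            manager.simulatedLatency = 300;       // milliseconds of simulated delay
            manager.packetLossPercentage = 0f;    // no simulated loss
            manager.StartClient();
        }
    }
    ```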
     
  5. srylain


    Joined:
    Sep 5, 2013
    Posts:
    159
    You could be doubling the delay: if both instances of your game have the Network Simulator enabled, it might be enforced for both of them. Build the client once without the simulator enabled, then run the server from the editor with it enabled. If you want better control over network simulation, look into external tools like Clumsy.
     
  6. iddqd


    Joined:
    Apr 14, 2012
    Posts:
    501
    Thanks - I'll check out Clumsy!