Question High CPU usage on Server builds

Discussion in 'Game Server Hosting' started by MarcoZVincenzi, Dec 28, 2023.

  1. MarcoZVincenzi


    Mar 3, 2022
    We are trying to optimize our linux server build hosted on Multiplay/GSH. We are using LTS 2022 and FishNet as a networking package (similar to Mirror). The default Cloud server density that GSH sets is 7, meaning each instance would only be allowed ~15% of the CPU usage. Unfortunately our build seems to use around 24% of the CPU during load (active gameplay) and about 12% on “idle” (players interacting with the UI in between game sessions).

    When keeping a server density of 7, the server would crash after a couple of minutes of gameplay, as the allowance threshold of 14% was crossed almost immediately. We’ve tentatively reduced the server density to 3 (33.33% allocated to each instance), but we’d like to improve on that, especially since we don’t think the gameplay itself is particularly computationally expensive.

    This is a screenshot of the CPU load in 3 tests we did back-to-back.

    We’ve set an `Application.targetFrameRate` of 60 or 30, but performance was comparable between the two, with only about 1-2% less load when going from 60 to 30. VSync is also set to 0 (`QualitySettings.vSyncCount = 0`).
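    For reference, the setup described above roughly corresponds to something like this (a sketch only; the class name and where it runs are my assumptions, not the actual project code):

    ```csharp
    using UnityEngine;

    public class ServerFrameLimiter : MonoBehaviour
    {
        void Awake()
        {
            // Disable VSync so Application.targetFrameRate takes effect.
            QualitySettings.vSyncCount = 0;

            // Cap the simulation rate. Unity spends the rest of each frame
            // in WaitForTargetFPS, which shows up in the Profiler and can
            // be a busy-wait on the main thread on some platforms.
            Application.targetFrameRate = 30;
        }
    }
    ```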

    We also tried profiling the server remotely, and we saw a pretty evenly-spread graph. On idle, script loops only take about 5-10% of each frame time. During the active gameplay this only rises to 25%.

    The remaining ~85% of the frame is instead spent in `WaitForTargetFPS`, waiting to reach the desired framerate (30 FPS in the screenshot below, but the same holds when we tried 60 FPS). This value is only visible if the VSync flag is turned on in the Profiler. Could this be the reason behind the high CPU load? Seeing how Unity defaults the server density to 7, we thought 1%-14% was the baseline range within which a standard instance would be expected to run, but what kind of values should we expect in real-life usage?

    A couple more notes:
    • The initial connection and setup steps can spike as high as 52%, but only for a couple of seconds, meaning the 2-minute overload window should not be triggered. Same for the server shutdown process.
    • Profiling did add an additional ~4% to the CPU load, but it looks like a fairly constant overhead from the development build which disappears on standard builds. The values written above don’t include this overhead, though the screenshots might.
    • There are minor areas where we know our server logic could be optimized, and we are slowly tackling them, but seeing how the vast majority of the time is spent waiting for the end of the frame, we don’t actually know if that would be of any help.
    • The number of players connected to a server should range between 2 and 8, depending on the gameplay mode.

    If anyone with experience in this area could chime in it’d be greatly appreciated.

  2. MartinTilo


    Unity Technologies

    Aug 16, 2017
    I'd try setting `QualitySettings.vSyncCount = 2` instead of `Application.targetFrameRate = 30`, as that should then do a semaphore wait on the GPU instead of waiting in a busy-loop on the main thread. If your server even needs any rendering to happen, you could also set `UnityEngine.Rendering.OnDemandRendering.renderFrameInterval = 3` or higher.
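    A minimal sketch of those settings combined (the class name, and resetting `targetFrameRate` to its default of -1, are my assumptions):

    ```csharp
    using UnityEngine;
    using UnityEngine.Rendering;

    public class ServerThrottle : MonoBehaviour
    {
        void Awake()
        {
            // Let VSync pace the main loop (a semaphore wait) instead of
            // relying on Application.targetFrameRate's busy-wait.
            QualitySettings.vSyncCount = 2;
            Application.targetFrameRate = -1; // -1 = platform default, no target

            // If the server needs rendering at all, only render every
            // 3rd frame; script updates still run every frame.
            OnDemandRendering.renderFrameInterval = 3;
        }
    }
    ```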
  3. bugfinders


    Jul 5, 2018
    TBH I'm surprised that, as part of the server build optimization, setting the target rate down is not a default.