High CPU usage even with server doing nothing.

Discussion in 'Multiplayer' started by MrsPiggy, Jun 13, 2018.

  1. MrsPiggy

    MrsPiggy

    Joined:
    Jun 13, 2018
    Posts:
    154
    Hi,
    I've seen several topics complaining about different performance issues, but the one I am reporting seems different. If not, I apologize in advance.

    Working with a simple prototype, I noticed that running the Unity executable as server only (even from the command line) causes major CPU usage while doing nothing. So I set up a new project with no assets and just added one game object to the scene with a NetworkManager and NetworkManager HUD component attached to it.

    Then I built and ran it on an i7 MacBook Pro. The empty scene takes ~13% CPU, but as soon as I click the "LAN Server Only" button the process jumps to ~37-40% usage while sitting there waiting for a connection.

    The project is built with Unity 2018.1.4f1 under macOS 10.11. I also tested under Windows 8.1 and 10 with even worse results, as the machine I was using runs an i3 (the process takes 60% CPU doing nothing).

    Am I missing something?
    Is this "normal"?

    Thanks
     
  2. Joe-Censored

    Joe-Censored

    Joined:
    Mar 26, 2013
    Posts:
    11,847
    What have you set Application.targetFrameRate to? If you're not aware, on Win/Mac/Linux, without setting Application.targetFrameRate the player will try to reach the maximum frame rate the hardware can handle, which can be very high when there is no graphics processing to wait on.

    So I usually set Application.targetFrameRate to 60 or so to start with for a server project.
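
    A minimal sketch of what that looks like, assuming a script attached to an object in the server scene (the class name is my own):

    Code (CSharp):
    using UnityEngine;

    // Sketch: cap the main loop so a dedicated server doesn't spin as fast as the hardware allows.
    public class ServerFrameRateCap : MonoBehaviour
    {
        private void Awake()
        {
            QualitySettings.vSyncCount = 0;     // vSync would otherwise override targetFrameRate (irrelevant in headless builds, but harmless)
            Application.targetFrameRate = 60;   // cap the update loop at ~60 iterations per second
        }
    }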
     
    Ellernate likes this.
  3. MrsPiggy

    MrsPiggy

    Joined:
    Jun 13, 2018
    Posts:
    154
    Thanks, I've tried setting the framerate to 60 and even 30 but it doesn't change the situation at all.
    As soon as I hit the "LAN Server" button the CPU jumps to 45% and it goes down when I hit the "Stop Server" button.

    I have no idea why this happens. I hope I am missing something major here, otherwise this is literally unusable.

    Any clues?
     
  4. newjerseyrunner

    newjerseyrunner

    Joined:
    Jul 20, 2017
    Posts:
    966
    How exactly are you determining CPU usage? CPU monitors can be misleading if you don't know what you're looking for. Here is a copy-paste from `top` of one of my servers right now.

    top - 11:47:08 up 165 days, 10:33, 2 users, load average: 0.05, 0.17, 0.22
    Tasks: 376 total, 1 running, 375 sleeping, 0 stopped, 0 zombie
    Cpu(s): 0.5%us, 0.2%sy, 0.0%ni, 99.3%id, 0.0%wa, 0.0%hi, 0.0%si, 0.0%st <- Mostly idle
    Mem: 16359660k total, 16045920k used, 313740k free, 258036k buffers
    Swap: 18415608k total, 320k used, 18415288k free, 13697660k cached

    PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
    8039 apache 15 0 423m 38m 21m S 62.6 0.2 0:04.42 httpd <- Misleading CPU percentage (likely waiting on sockets)
    8023 apache 15 0 423m 38m 21m S 1.7 0.2 0:06.18 httpd


    Instead of looking at the CPU usage of your application, check how idle the CPU is overall. Once you open a socket, you introduce IO wait, and you're also dealing with a listener loop. Both of these can distort what a CPU monitor reports for the process.
     
  5. MrsPiggy

    MrsPiggy

    Joined:
    Jun 13, 2018
    Posts:
    154
    Hi,
    CPU monitors are far from misleading. If your process takes 60% of a single core and you have 6 cores, your overall idle % will still be ~90%, because the per-process figure is measured against one core (60% / 6 cores = 10% of the machine).

    But that doesn't tell me anything about the percentage of resources used by the specific process, which is what I am looking for. And it's not out of curiosity: when the server starts it pushes the fans to top speed, which is really bizarre for something that is sitting there waiting for connections.

    Not really.
    I am very familiar with sockets and IO loops, and there's no universe where a socket waiting for connections affects the CPU in any significant capacity, unless you're running it on a C64.

    So I am very puzzled as to why this is happening, and I can't see a potential error on my part, since the test application is literally made of two components that ship with Unity itself, with no custom code...

    I am also quite surprised this hasn't been reported before... If anyone has an idea I'd be grateful to hear about it.

    Thanks
     
    Last edited: Jun 15, 2018
  6. newjerseyrunner

    newjerseyrunner

    Joined:
    Jul 20, 2017
    Posts:
    966
    Well, I've seen CPUs go high when they're completely idle. Top does weird things.

    Anyway, where in your code is the CPU being used? Are you 100% sure it's not in the listener loop or the event-driver loop? Fast-moving loops report high CPU.
     
  7. MrsPiggy

    MrsPiggy

    Joined:
    Jun 13, 2018
    Posts:
    154
    Sorry, I don't know what you mean by that.

    As I wrote, there is no code.
    I created an empty project, added a single GameObject to the scene with a NetworkManager and NetworkManager HUD.

    Again, I don't know what you mean by that. What I am asking here is why a server that is doing absolutely nothing is taking a huge amount of CPU resources.

    Try opening a .NET socket and leave it there to listen for connections. Then go check the CPU usage for that process and see if you can find any...

    Thanks
     
  8. newjerseyrunner

    newjerseyrunner

    Joined:
    Jul 20, 2017
    Posts:
    966
    There is still plenty of stuff going on. You should still be able to profile the application.

    Sort of. It depends on the type of listener: if it's blocking, then yeah, the CPU won't do anything. I highly doubt Unity uses a blocking socket call, because it may have to deal with multiple requests at the same time. They likely open a listener and then have a loop that constantly polls it. Depending on how fast that loop polls, it can take a significant amount of CPU.

    pseudocode:
    Code (csharp):
    socket = open_socket();
    listen(socket);
    while (true) {
        select(socket, timeout = 0);      // non-blocking poll
        if (FD_ISSET(socket, readFds)) {
            launchNewConnection();
        } else {
            sleep(10);                    // this value will affect CPU usage
        }
    }
    I've not seen Unity's code, but this is how most servers that handle lots of requests work. How many milliseconds it sleeps between checks can greatly affect how the CPU usage appears. A lot of the time people use the value 1, which doesn't really sleep at all; it simply yields the thread.
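
    For illustration, the same poll-then-sleep pattern written as runnable C# (a sketch of the general technique only, not Unity's actual code; the class name and port parameter are arbitrary):

    Code (CSharp):
    using System.Net;
    using System.Net.Sockets;
    using System.Threading;

    // Sketch: a poll-then-sleep accept loop. The Sleep() value decides
    // how busy this thread looks in a CPU monitor.
    class PollingListener
    {
        static void Run(int port)
        {
            var listener = new TcpListener(IPAddress.Any, port);
            listener.Start();
            while (true)
            {
                if (listener.Pending())        // non-blocking: is a connection waiting?
                {
                    var client = listener.AcceptTcpClient();
                    // ... hand the client off to a worker ...
                }
                else
                {
                    Thread.Sleep(10);          // Sleep(1) or Sleep(0) barely sleeps; it mostly yields the thread
                }
            }
        }
    }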
     
  9. MrsPiggy

    MrsPiggy

    Joined:
    Jun 13, 2018
    Posts:
    154
    @newjerseyrunner
    I am sure you're trying to help but this isn't going anywhere.

    Speculating about how Unity might be running the dedicated server is not very useful because:

    a) we have no idea how it works
    b) it makes no sense that an idle server takes 40%+ on a quad-core i7 and over 60% on a dual-core 3 GHz CPU
    (imagine what happens when you run the same thing on a mid-range mobile phone)

    As a side note, non-blocking servers register for events and don't need to run tight, expensive loops. There are a million APIs to do this efficiently and without wasting resources, from low-level BSD sockets up to C# and Java.

    I have written a lousy Java server that runs a non-blocking socket in a tight loop. It's a terrible approach, considering that you can use selectors that keep a single thread blocked until data is available (thus avoiding wasting any CPU cycles). Even that terrible, home-made server takes 5% of the same quad-core i7. Not 45%.
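
    To illustrate the point, the blocking alternative in .NET looks roughly like this (a minimal sketch; the port number is arbitrary). The thread parks inside AcceptTcpClient() and burns essentially no CPU while idle:

    Code (CSharp):
    using System.Net;
    using System.Net.Sockets;

    // Sketch: a listener that blocks until a connection arrives instead of polling.
    class IdleListener
    {
        static void Main()
        {
            var listener = new TcpListener(IPAddress.Any, 7777);
            listener.Start();
            while (true)
            {
                // AcceptTcpClient() blocks this thread until a client connects,
                // so no CPU is spent while waiting.
                using (var client = listener.AcceptTcpClient())
                {
                    // ... handle the connection ...
                }
            }
        }
    }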

    So, yeah.
    Unless I am missing something glaring, something is horribly wrong with how the NetworkManager class is currently implemented.
     
  10. moco2k

    moco2k

    Joined:
    Apr 29, 2015
    Posts:
    294
    You could write a message to Alex @aabramychev. He is active in the forums and very helpful. At the very least he can forward the question to the UNET team.
     
  11. Whippets

    Whippets

    Joined:
    Feb 28, 2013
    Posts:
    1,775
    If it's repeatable with such a simple project, file a bug report.
     
  12. MrsPiggy

    MrsPiggy

    Joined:
    Jun 13, 2018
    Posts:
    154
    Thanks, will do.
     
  13. MrsPiggy

    MrsPiggy

    Joined:
    Jun 13, 2018
    Posts:
    154
    Done, thanks
     
    Whippets likes this.
  14. aabramychev

    aabramychev

    Unity Technologies

    Joined:
    Jul 17, 2012
    Posts:
    574
    For a quick fix, set globalConfig.ThreadAwakeTimeout = 5
    and use NetworkTransport.Init(globalConfig);
     
  15. MrsPiggy

    MrsPiggy

    Joined:
    Jun 13, 2018
    Posts:
    154
    Hi,
    thanks for the reply.
    My test example has no code in it, so I am not entirely sure what the appropriate way to use this snippet is.

    Assuming it must be done very early in the player startup, I've added an empty object with a script and put the code in the Awake() method.

    Code (CSharp):
    private void Awake()
    {
        var cfg = new GlobalConfig();
        cfg.ThreadAwakeTimeout = 5;
        NetworkTransport.Init(cfg);
    }
    This, however, makes things even worse, causing the CPU to skyrocket to ~76% when I hit the Server button.
    If I comment out the code, rebuild, and launch, it goes back to 45-48% after starting the server.

    Any other ideas?
     
  16. MrsPiggy

    MrsPiggy

    Joined:
    Jun 13, 2018
    Posts:
    154
  17. aabramychev

    aabramychev

    Unity Technologies

    Joined:
    Jul 17, 2012
    Posts:
    574
    OK, if you send/receive a lot it is (probably) expected, as the network threads will wake up on any message you receive. If you see this without any traffic on the latest Unity build or patch, it is probably a bug; please file a report about it. On my computer, without any network traffic, I only see the load from the main Unity thread, so ~0... Without a detailed inspection of the project I cannot say more :( sorry about that
     
  18. OldHarryMTX

    OldHarryMTX

    Joined:
    Sep 18, 2018
    Posts:
    45
    Sorry if I raise this old topic from the grave, but I am experiencing the same problem. I noticed that if I start a normal build of my project I see CPU usage of 2-3%, while with a server build it exceeds 20%. The phenomenon occurs even if every GameObject in the scene is disabled.

    I also tried to start the project on a Linux VM, and this is the result:

    [attached screenshot: per-process CPU usage on the Linux VM]

    As you can see, the MainServer process uses 95% of the CPU, while the Arena, a process launched by the MainServer via Process.Start(), uses only 5%. They are both headless. The MainServer is simply a scene with only a canvas and a custom TCP socket, while the Arena is a 3D scenario full of GameObjects, a NavMesh, the same custom TCP socket, and a second networking system that uses Mirror.

    Has anyone managed to understand why it happens??
     
    Last edited: Apr 26, 2020
  19. Joe-Censored

    Joe-Censored

    Joined:
    Mar 26, 2013
    Posts:
    11,847
    Have you already tried the first suggestion in this thread, in comment #2?

    The Profiler can also be helpful to see what your game is busy doing.
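
    For what it's worth, you can also wrap suspect code in custom profiler samples so it shows up under its own label when you attach the Profiler to a development build (a minimal sketch; the class and sample names are my own):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Profiling;

    // Sketch: custom profiler samples around server-side work.
    public class ProfiledTick : MonoBehaviour
    {
        private void Update()
        {
            Profiler.BeginSample("Server.Tick");
            // ... server-side work you want to measure ...
            Profiler.EndSample();
        }
    }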
     
  20. OldHarryMTX

    OldHarryMTX

    Joined:
    Sep 18, 2018
    Posts:
    45
    Yes, I've just finished trying targetFrameRate at 30 fps, and the CPU usage has dropped below 1%. I also think that the difference between the previous usage on my PC (around 20%) and on the Google VM (near 100%) depended on the number of cores available.
     
  21. tony040209_unity

    tony040209_unity

    Joined:
    Mar 21, 2021
    Posts:
    31
    Hello, I also ran into the CPU being too high,
    even though no script was executing at all. I suspect that the UGS server did not completely clean up the previous build.
    Later, I reopened a project in Multiplay and it was normal.
     
  22. earthcrosser

    earthcrosser

    Joined:
    Oct 13, 2010
    Posts:
    122
    I know this may be sort of an old thread, but I ran into this on my current project, and setting the application frame rate also seemed to take care of it, at least for now. I'm on Unity 2022.2.7f1, making a Linux dedicated server build. Every time I ran a test allocation I was kicked out after only a few minutes. I tried making a build that only loaded the most basic scene I could make, with a network manager and some code that goes as far as calling ready-for-players on Multiplay. Everything I tried still ended up with a runaway CPU. Setting the application frame rate seems to have solved it, though, and my test allocation stayed up for the full hour. So now I'm going to go back and try it with my full lobby scene back in place.
     
    aaronpenn44 and NpXAutobot like this.