Help getting 60FPS

Discussion in 'General Graphics' started by Nosada, Feb 5, 2019.

  1. Nosada

    Nosada

    Joined:
    Aug 6, 2013
    Posts:
    69
    Hello,

    I am an independent developer working on a small game by myself, and I am having some issues with frame rate. I monitor my frame rate using 1.0f / Time.smoothDeltaTime. In the editor I get around 60 FPS and it runs smoothly; however, when I create a build I only get around 30 FPS and it seems choppy. I set the target frame rate to 60 in a script, so I am not sure why this is happening. Any insight or advice would be greatly appreciated.
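    For reference, a minimal sketch of the setup described above (class and label names are illustrative, not from my actual project):

```csharp
using UnityEngine;

// Cap the frame rate at 60 and display a smoothed FPS estimate on screen.
// Application.targetFrameRate is only a request/cap, not a guarantee.
public class FpsMonitor : MonoBehaviour
{
    float fps;

    void Awake()
    {
        Application.targetFrameRate = 60;
    }

    void Update()
    {
        // smoothDeltaTime averages over recent frames, so the readout jitters less
        fps = 1.0f / Time.smoothDeltaTime;
    }

    void OnGUI()
    {
        GUI.Label(new Rect(10, 10, 120, 20), $"FPS: {fps:F0}");
    }
}
```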

    Thanks.
     
  2. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    The framerate cannot be set; that setting is only a cap. So if the application cannot reach that speed, it obviously will not run at that speed. If it's faster in the editor, that is likely because the editor is not rendering at as high a resolution.

    Higher resolutions make the computer do more work, and more work takes more time.

    The solution is not to cap anything, and code your game to be framerate independent.

    You also need to learn how to code with delta time; there are a lot of resources on this, so your game becomes "framerate independent".

    This is basic stuff the Learn tutorials go over, plus a whole number of YouTube videos and so forth. All games do this: essentially, you multiply any values that change over time by Time.deltaTime, so that movement is consistent regardless of hardware or options.
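    A minimal sketch of the frame-rate-independent movement described above:

```csharp
using UnityEngine;

// Values that change over time are multiplied by Time.deltaTime,
// so movement covers the same distance per second at 30 FPS or 60 FPS.
public class Mover : MonoBehaviour
{
    public float speed = 5f; // units per SECOND, not per frame

    void Update()
    {
        // Without the deltaTime factor, this would move twice as fast at 60 FPS
        transform.Translate(Vector3.forward * speed * Time.deltaTime);
    }
}
```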
     
    Nosada likes this.
  3. Nosada

    Nosada

    Joined:
    Aug 6, 2013
    Posts:
    69
    Thanks for the response. I am aware of deltaTime and FixedUpdate, and I am using those for movement-based calculations.

    I thought maybe I had too many lights in my scene, deactivated all of them and made a new build, but my framerate did not increase.
     
  4. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Obviously not, because you are guessing. What slows down the CPU is not going to be the same thing that slows down the GPU. You should also use the profiler.

    Typically if the performance changes quite a bit with resolution it will be shader complexity or bandwidth.
     
  5. Nosada

    Nosada

    Joined:
    Aug 6, 2013
    Posts:
    69
    I saw in another post that you said not to use Unity UI. Should I just create my own UI elements without using a canvas? Do you think that could be part of my problem?
     
  6. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Please be careful with game development advice; it's extremely specific to the version of Unity, the situation of that particular game, and so on.

    Perhaps you could show a few profiler screenshots and explain a bit more about your game, such as the pipeline and so on. Also, if you turn off vsync in Unity's quality settings (in the editor menu) you can properly see the framerate - it may be that it's running at 50fps, but because of vsync it's got no choice but to run at 30.
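    For measuring, vsync can also be toggled from a script; a minimal sketch (for a quick test only, not something to ship enabled):

```csharp
using UnityEngine;

// Disable vsync so the real, uncapped frame rate is visible while profiling.
// With vsync on and a 60Hz display, a game that can only manage ~50 FPS
// gets forced down to the next even divisor of the refresh rate: 30 FPS.
public class VSyncTest : MonoBehaviour
{
    void Awake()
    {
        QualitySettings.vSyncCount = 0;   // 0 = off, 1 = sync every vblank
        Application.targetFrameRate = -1; // uncapped, platform default
    }
}
```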
     
  7. Nosada

    Nosada

    Joined:
    Aug 6, 2013
    Posts:
    69
    Well, maybe I am worrying about this frame rate issue too early in the project, because it is not close to finished yet. However, I wanted to make sure that I could solve this problem so that I don't do all this work for nothing, so I decided to monitor the issue and research solutions.

    Maybe it's got something to do with opaque geometry?
    Edit: These screenshots of the profiler were taken with the game running in the editor with vsync enabled.
     

    Attached Files:

    Last edited: Feb 6, 2019
  8. Nosada

    Nosada

    Joined:
    Aug 6, 2013
    Posts:
    69
    Here are screenshots of the profiler with data recorded from the development build, with vsync enabled.
     

    Attached Files:

    • P03.png (97.3 KB)
    • p04.png (101.1 KB)
  9. Antypodish

    Antypodish

    Joined:
    Apr 29, 2014
    Posts:
    10,770
    You know you can click on the profiler graph to get a snapshot of a single frame? That way you would see the CPU usage percentages. Much easier to track issues.
     
  10. Nosada

    Nosada

    Joined:
    Aug 6, 2013
    Posts:
    69
    I'm not sure I understand what you mean. Should I click on the graph to see data for that point? Or are you saying that there is a "graph" option that I can click to view the data differently?
     
  11. Antypodish

    Antypodish

    Joined:
    Apr 29, 2014
    Posts:
    10,770
    What I mean is, you can click on the profiler graph to select a time on the timeline. You will see a vertical line with tooltip times next to it, and the relevant data is automatically displayed below. So basically you can drag/click that vertical line/pointer.

    Also, if you scroll down in the profiler, you will find GPU usage below, and memory somewhere below that.

    However, your scene doesn't seem anywhere near demanding. What target device are you testing on?
     
  12. Nosada

    Nosada

    Joined:
    Aug 6, 2013
    Posts:
    69
    Oh ok, yes I noticed that. No, my scene isn't very demanding. Low poly models and low res textures. I am developing for PC.
     
  13. Nosada

    Nosada

    Joined:
    Aug 6, 2013
    Posts:
    69
    Here is an image of the profiler GPU data from my development build. Opaque geometry seems to be my biggest issue.
     

    Attached Files:

    • p05.png (100.4 KB)
  14. Antypodish

    Antypodish

    Joined:
    Apr 29, 2014
    Posts:
    10,770
    Yep, finally we can see more of what is going on.
    Why do you have over 300 render calls?
    Can you put your textures into atlases?
     
  15. Nosada

    Nosada

    Joined:
    Aug 6, 2013
    Posts:
    69
    I am not sure why there are over 300. I have never put textures in an atlas before; I am unfamiliar with this practice.

    EDIT: An atlas is similar in concept to a sprite sheet?
     
  16. Antypodish

    Antypodish

    Joined:
    Apr 29, 2014
    Posts:
    10,770
    Basically, you have multiple textures in one image.
    Then you use UV offsets to get the correct texture for a mesh.
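    A sketch of the idea, picking one tile out of a hypothetical 2x2 atlas via the material's UV offset and scale:

```csharp
using UnityEngine;

// Select one quadrant of a 2x2 texture atlas by scaling and offsetting
// the material's UVs. Assumes all four sub-textures are the same size.
public class AtlasTile : MonoBehaviour
{
    public int tileX; // 0 or 1 (column in the atlas)
    public int tileY; // 0 or 1 (row in the atlas)

    void Start()
    {
        var mat = GetComponent<Renderer>().material;
        mat.mainTextureScale = new Vector2(0.5f, 0.5f);
        mat.mainTextureOffset = new Vector2(tileX * 0.5f, tileY * 0.5f);
    }
}
```

    Note that accessing `.material` like this creates a per-object material copy, which itself breaks batching; for static scenery it is usually better to bake the atlas UVs directly into the mesh in a modeling tool so everything shares one material.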
     
  17. Nosada

    Nosada

    Joined:
    Aug 6, 2013
    Posts:
    69
    Ok I will try this. Do I need to do this with normal maps and emissive textures too?

    Also, does this mean that Unity basically can't handle more than a handful of textures at a time? I don't understand why this is necessary. I expected that Unity could handle a couple of dozen textures at a time, but I guess not?
     
  18. Antypodish

    Antypodish

    Joined:
    Apr 29, 2014
    Posts:
    10,770
    On a modern PC with a dedicated card it shouldn't be an issue. But it is hard to deduce what else in your project is causing such a slowdown. I suspect something else in the opaque pass is costing your GPU.

    Do you use GPU instancing? (the tick box on the material)
     
  19. Nosada

    Nosada

    Joined:
    Aug 6, 2013
    Posts:
    69
    I hadn't been using it, but I just tried it and it did not seem to improve the frame rate in the development build that I made afterwards. Still, it is probably best to leave it checked, so I will.
    I am running on a fairly new gaming laptop which has a GTX 1060.

    My scene is largely made of quads; could this be an issue?
     
  20. Antypodish

    Antypodish

    Joined:
    Apr 29, 2014
    Posts:
    10,770
    Just wondering, did that scene from the screenshot, with the vending machine, render 300 material calls?
    If so, that is far too much for such a simple scene.
     
  21. Nosada

    Nosada

    Joined:
    Aug 6, 2013
    Posts:
    69
    Yes, the data in the profiler is from that same scene.

    Perhaps it could be related to how many GUI elements I have on my canvas? They are hidden in that image, but I have many GUI elements with alpha transparency. They are inactive until needed.
     
  22. Antypodish

    Antypodish

    Joined:
    Apr 29, 2014
    Posts:
    10,770
    Then you have definitely taken some incorrect approach there, in terms of objects and/or materials.

    You should review what is going on. Check if culling is active, in case you have some objects rendered behind walls or outside the camera frustum.

    But if those are the only objects in your scene, then it needs a review.

    Do you use multiple materials per object, by any chance?
     
  23. Nosada

    Nosada

    Joined:
    Aug 6, 2013
    Posts:
    69
    I have baked occlusion culling. I am not using multiple materials on anything.
     
  24. Antypodish

    Antypodish

    Joined:
    Apr 29, 2014
    Posts:
    10,770
    If you select all visible objects in the scene from the picture, how many objects are there?
     
    Nosada likes this.
  25. Nosada

    Nosada

    Joined:
    Aug 6, 2013
    Posts:
    69
    I do not know how to determine the exact number of objects.
     
  26. Nosada

    Nosada

    Joined:
    Aug 6, 2013
    Posts:
    69
    Here is an image of the scene in the editor. If there is some way to get a count of the objects without doing it manually or writing a script, I do not know it.
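    For what it's worth, it only takes a tiny script to get a rough count; a sketch:

```csharp
using UnityEngine;

// Logs a rough count of scene contents: every GameObject has exactly one
// Transform, and every active Renderer is something that may be drawn.
public class SceneObjectCounter : MonoBehaviour
{
    void Start()
    {
        int objects = FindObjectsOfType<Transform>().Length;
        int renderers = FindObjectsOfType<Renderer>().Length;
        Debug.Log($"GameObjects: {objects}, Renderers: {renderers}");
    }
}
```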
     

    Attached Files:

    • s01.png (512.1 KB)
  27. Antypodish

    Antypodish

    Joined:
    Apr 29, 2014
    Posts:
    10,770
    Nosada likes this.
  28. xVergilx

    xVergilx

    Joined:
    Dec 22, 2014
    Posts:
    3,296
    Use frame debugger to find out what's actually being rendered in the current frame. You can connect it to the build as well.

    By selecting a draw call in debugger, it will show you why it's not batched.

    Note that modern PCs can easily render 3k+ batches (depending on texture size). Though, if you can decrease it, you probably should.

    Also, gaming laptops do not exist. Well, not unless they fall into the absurdly costly category.

    By the way, make sure you're not running your application on the integrated GPU instead of the actual discrete one, because 300 batches is way too low to significantly drop GTX 1060 performance, albeit a mobile one.

    Keypoints:
    - If it's less than 300 verts / submesh - use dynamic batching;
    - If the target is PC or something better than a potato - use GPU instancing, as that will reduce draw calls;
    - Transparent UI still renders, so make sure to disable the actual object if it's fully transparent;
    - Use the deferred rendering path to draw multiple lights without major performance hits.
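    On the transparent-UI point: one common pattern is to deactivate hidden panels entirely rather than leaving them at zero alpha. A sketch (the panel field is hypothetical):

```csharp
using UnityEngine;

// A fully transparent UI element still generates draw calls; deactivating
// the GameObject removes it from rendering entirely until it is needed.
public class PanelToggle : MonoBehaviour
{
    public GameObject hiddenPanel; // hypothetical: an inventory window, pause menu, etc.

    public void Show() => hiddenPanel.SetActive(true);
    public void Hide() => hiddenPanel.SetActive(false);
}
```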
     
    Last edited: Feb 7, 2019
    Nosada and Antypodish like this.
  29. Nosada

    Nosada

    Joined:
    Aug 6, 2013
    Posts:
    69
    Thanks for the information. I will have to double-check and be sure it is not using the integrated GPU. I will definitely look into the frame debugger more. I am learning a lot about optimization now. I will keep looking for a solution. I wish they had taught us this stuff in college lol.
     
  30. Nosada

    Nosada

    Joined:
    Aug 6, 2013
    Posts:
    69
    Yes, I suspected the lights as well, because I still need to work on that. However, I did create a build with all of the lights deactivated and the frame rate was still low. I will continue to look for solutions; I am researching different types of optimization now.
     
  31. Nosada

    Nosada

    Joined:
    Aug 6, 2013
    Posts:
    69
    I am going to try batching my meshes and see if that gives me some improvement.
     
  32. Nosada

    Nosada

    Joined:
    Aug 6, 2013
    Posts:
    69

    It turns out I wasn't running the application on the dedicated GPU. I feel a bit foolish for not realizing that sooner. I looked into this to see if there is some way to get Unity to automatically use the dedicated GPU, but I found no way to do it. Many people seem to have this issue and end up having to tell users to manually switch over themselves. Does anyone know of any solutions? Is there a way to ensure that Unity will run on the dedicated GPU instead of the integrated GPU?
     
    Last edited: Feb 8, 2019
    MadeFromPolygons likes this.
  33. AlanMattano

    AlanMattano

    Joined:
    Aug 22, 2013
    Posts:
    1,501
    Look in the BIOS at startup; there may be an option to disable the integrated GPU.
     
  34. Nosada

    Nosada

    Joined:
    Aug 6, 2013
    Posts:
    69
  35. MadeFromPolygons

    MadeFromPolygons

    Joined:
    Oct 5, 2013
    Posts:
    3,980
    You can actually make a custom shortcut to set it to use a certain GPU. I can't remember the exact command-line text you need to add, but it is possible; we use this to force usage of GPU 2 on our work machines.
     
    Nosada likes this.
  36. xVergilx

    xVergilx

    Joined:
    Dec 22, 2014
    Posts:
    3,296
    Open the Nvidia Control Panel -> Global Settings -> set the preferred GPU to the discrete one there.
     
  37. Nosada

    Nosada

    Joined:
    Aug 6, 2013
    Posts:
    69
    Yes, this sounds like what I need to figure out - if you are talking about the build, and not the editor.
     
    Last edited: Feb 8, 2019
  38. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    1. You are profiling in the editor, where it is already 60fps, and your complaint was that it's slow in standalone builds.

    2. You are at 3-4ms on the GPU, which is tiny. There are 16ms available per frame in a 60fps game, so the GPU is likely not really the problem; rather, the CPU is consuming so much that the GPU doesn't have much of a window to finish its work. This, plus vsync, is what is likely driving your framerate down.

    Stop focusing on the GPU; it's really unlikely to be the issue now that I've seen it's taking 3-4ms.

    3. Look at CPU timings. Sort by time in milliseconds, not %.
     
    MadeFromPolygons likes this.
  39. Nosada

    Nosada

    Joined:
    Aug 6, 2013
    Posts:
    69
    I did profile the development build, as I mentioned previously.
    Like I said, the problem seems to be that the standalone build was using the integrated GPU instead of the dedicated GPU. I had to manually select my dedicated GPU when I ran the build. Once the build was running on the dedicated GPU, the frame rate was good and it ran smoothly.

    My problem now is that I need the build to automatically run on the dedicated GPU without the end user having to select this manually.
     
  40. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Yes, but even so, 3-4ms is not a lot, so it means you do have CPU perf issues.
     
  41. xVergilx

    xVergilx

    Joined:
    Dec 22, 2014
    Posts:
    3,296
    It will, if user decides to run on it.

    There's no way to force it, unless I'm missing something.
     
  42. Nosada

    Nosada

    Joined:
    Aug 6, 2013
    Posts:
    69
    Yes, I don't want the user to have to decide. I would expect that it should naturally choose the better GPU. I don't know why that is not the case. It seems like there may not be a solution for this though.
     
  43. MadeFromPolygons

    MadeFromPolygons

    Joined:
    Oct 5, 2013
    Posts:
    3,980
    No, I meant the editor. I don't think you can force it for the build - and why would you want to? You will not know which GPU your users consider their "main" one.
     
  44. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    It's cropped up a few times over the years in forums, might be worth a search.
     
  45. xVergilx

    xVergilx

    Joined:
    Dec 22, 2014
    Posts:
    3,296
    I think it's more related to how Nvidia handles new games. It might be worth contacting them; maybe they'll add your game to their library that automatically picks which GPU to use.

    Although, it would be better if it was done automatically for any Unity runtime instance instead.

    The same goes for AMD, but it's way worse. Sometimes their drivers completely fail to even let you pick which GPU should be used (for apps on laptops).
     
  46. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Intel's been problematic too, and they have the majority market share here by a very, very large margin.

    AFAIK, with the proper drivers installed, most people can just select the GPU via a menu.
     
    xVergilx likes this.
  47. Nosada

    Nosada

    Joined:
    Aug 6, 2013
    Posts:
    69
    I have been searching for information about this. I found one thread that was relevant.

    In the thread people said that they had customers refunding the game because it was running slow. The customers didn't realize that they had to manually set the GPU. I do not want this to happen to me if I finish my game. I don't want people to return the game just because it wasn't running on the best possible GPU. There must be a way for Unity to identify the best possible GPU and use it.
     
    Antypodish likes this.
  48. xVergilx

    xVergilx

    Joined:
    Dec 22, 2014
    Posts:
    3,296
    You're thinking in a reverse manner. It's not Unity's problem, it's the hardware manufacturer's.
    I don't think there's a way to enforce GPU usage outside of the OS security ring.

    If it's that important, make a special notification for laptop users. But I don't think it's that big of a deal.
    If Unity's runtime is running on the wrong GPU, the rest of the applications will probably encounter the same issue, as this is a global setup thing.
    (Although this issue is somewhat reduced by Nvidia's game library and other companies' similar solutions.)

    Also, get the game running first. Then think about what troubles some minority may encounter.
    Users' lack of knowledge should be mitigated by users, not machines.
     
    Last edited: Feb 9, 2019
  49. Nosada

    Nosada

    Joined:
    Aug 6, 2013
    Posts:
    69
    Yeah, it is starting to seem like this problem is beyond my control. I just don't want users to ask for refunds or give bad reviews over it.
     
  50. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    You should look at how AAA titles deal with it. Pretty much every game's FAQ section has a bit about integrated GPUs, with the advice to switch to "high performance" or a similar option.

    What you can do is detect the hardware on the machine and pop up a little message on startup to advise the user.
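    A sketch of that startup check (the name matching is a crude heuristic, not robust detection - real titles usually pair this with a FAQ entry):

```csharp
using UnityEngine;

// On startup, check which GPU the build actually got and warn the user
// if it looks like an integrated one. SystemInfo reports the device the
// runtime is rendering with, so this catches the wrong-GPU case above.
public class GpuCheck : MonoBehaviour
{
    void Start()
    {
        string gpu = SystemInfo.graphicsDeviceName;
        if (gpu.Contains("Intel") || gpu.Contains("UHD") || gpu.Contains("Iris"))
        {
            Debug.LogWarning($"Running on '{gpu}'. If this machine also has a " +
                "dedicated GPU, please switch this game to it in the driver " +
                "control panel for better performance.");
            // A shipping game would show this in a UI dialog instead of the log.
        }
    }
}
```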

    upload_2019-2-9_17-36-39.png
     
    Nosada likes this.