

Discussion in 'Editor & General Support' started by jordi_boni, Nov 15, 2013.

  1. Nition


    Jul 4, 2012
    Yeah, that is weird. Certainly seems like either a Unity or a graphics driver bug rather than anything you're doing.

    I don't think it belongs in a Gfx.WaitForPresent thread though. The reason you've got big Gfx.WaitForPresent spikes is that your GPU FPS is ~130 and your CPU FPS is like a thousand, so it's spending tons of time waiting on the GPU. The real question is wtf the GPU is doing.
  2. Zolden


    May 9, 2014
    I just tried to load the GPU a lot with a compute shader. And what did I get? Huge Gfx.WaitForPresent. The more I load the GPU, the more time Gfx.WaitForPresent consumes. Conclusion: Gfx.WaitForPresent represents the time the CPU has to wait until the GPU finishes its job for the current Update() frame.
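
    A minimal sketch of that experiment (the "HeavyCompute" shader asset and its "CSMain" kernel are assumptions, not a real asset — assign any deliberately expensive compute shader):

    ```csharp
    using UnityEngine;

    // Saturate the GPU so Gfx.WaitForPresent grows in the profiler.
    public class GpuLoadTest : MonoBehaviour
    {
        public ComputeShader heavyCompute; // assign an expensive shader in the Inspector
        public int dispatchCount = 64;     // raise this to load the GPU harder

        void Update()
        {
            int kernel = heavyCompute.FindKernel("CSMain");
            for (int i = 0; i < dispatchCount; i++)
                heavyCompute.Dispatch(kernel, 256, 256, 1);
            // With the CPU otherwise idle, the time spent waiting for these
            // dispatches shows up as Gfx.WaitForPresent.
        }
    }
    ```

    Increasing dispatchCount should make the Gfx.WaitForPresent bar grow proportionally, which is what the observation above describes.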
  3. JBR-games


    Sep 26, 2012
    So I was having a huge issue with this, and for the life of me I couldn't figure out why I was getting such huge lag in a fairly simple scene. Then after checking this thread I realized (I'm assuming) a Windows 10 update broke my graphics driver. After finding and installing the latest driver, it seems to be working as expected.
    Zolden likes this.
  4. bitinn


    Aug 20, 2016
    Hello to everyone in the gfx.waitForPresent community. I ran into this "high gfx.waitForPresent time" situation earlier today, and would like to share a basic process for troubleshooting it.

    So first of all, let's prepare 2 standalone development builds: one without vsync and one with vsync.
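
    One way to set up the two builds is to toggle vsync from code (a sketch; the same settings can also be changed per quality level in Project Settings):

    ```csharp
    using UnityEngine;

    // Configure the two test builds: vsync off (run uncapped) vs. vsync on.
    public class VsyncSetup : MonoBehaviour
    {
        void Awake()
        {
            // Build A: vsync off, run as fast as the GPU allows
            QualitySettings.vSyncCount = 0;
            Application.targetFrameRate = -1; // uncapped

            // Build B: vsync on (use this line instead of the two above)
            // QualitySettings.vSyncCount = 1; // sync to the display refresh rate
        }
    }
    ```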

    Without Vsync:

    Screen Shot 2018-02-25 at 13.29.45.png

    With Vsync:

    Screen Shot 2018-02-25 at 13.49.00.png

    (It may appear my screenshots are flipped, but they are not: "WaitForTargetFPS" doesn't show when profiling a standalone build, for reasons I don't understand. But the game is absolutely synced to 60fps; I have an in-game fps monitor to double-check it.)

    One would assume that, with vsync set to 0, the game would run as fast as the GPU allows. But confusingly, in my case, it was running just above 60fps (less steady than with vsync, but still around 60fps; the extra time largely shows up as gfx.waitForPresent).

    For a direct comparison, here are graphs showing both the no-vsync build (left) and the vsync build (right); you can see the actual camera rendering costs about the same in each. The scene contains only very simple UI, so by all means it could render faster.

    Screen Shot 2018-02-25 at 14.02.08.png

    Screen Shot 2018-02-25 at 14.02.16.png

    In fact, it was running faster in the Editor, above 90fps; I have an in-game monitor to verify this. In a standalone build, also without vsync, the same scene runs at best around ~60fps, never quite as fast as 90fps.

    Screen Shot 2018-02-25 at 14.11.42.png
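
    An in-game FPS monitor of the kind mentioned above can be as simple as this sketch (not the poster's actual code), smoothing over unscaled frame times:

    ```csharp
    using UnityEngine;

    // Lightweight on-screen FPS counter, independent of vsync and timescale.
    public class FpsMonitor : MonoBehaviour
    {
        float smoothedDelta;

        void Update()
        {
            // Exponential smoothing so the readout doesn't flicker
            smoothedDelta = Mathf.Lerp(smoothedDelta, Time.unscaledDeltaTime, 0.1f);
        }

        void OnGUI()
        {
            float fps = 1f / Mathf.Max(smoothedDelta, 0.0001f);
            GUI.Label(new Rect(10, 10, 200, 25), $"FPS: {fps:F1}");
        }
    }
    ```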

    Then I realized what was going on: I was fillrate bound — not texel fillrate, but pixel fillrate! My MacBook Pro 2015 Intel GPU just can't push that many pixels onto the screen. Changing the resolution or toggling between fullscreen and windowed mode has a huge impact on my game, even though very little texture processing is happening.

    I am also on deferred rendering, which is even more dependent on ROPs (to process the g-buffer).

    So that's it, feel free to point out any mistakes in my summary :)
    Last edited: Feb 25, 2018
    JBR-games and JJC1138 like this.
  5. bitinn


    Aug 20, 2016
    Oh, and one more thing: this is why the Unity GPU profiler is not enabled by default — it doesn't work on every platform. In my case, an "Intel Iris Graphics 6100" on a MacBook Pro 2015. So I had to figure out my issue the hard way.

    Screen Shot 2018-02-26 at 15.44.46.png
    awesomedata likes this.
  6. RafaelF82


    Oct 28, 2013
    I wonder when this GPU profiler actually works.
    I have an AMD R9 290 and an nVidia GTX 745; the profiler doesn't work on either of them, with drivers updated, obviously.
  7. Paul-Kirby


    Jul 27, 2013
    I have an nVidia GTX 960 and the profiler shows data when using DirectX11 in Unity, shown below:

    However, it's blank for DirectX12, shown below:

    What I have noticed is an issue where it's not displaying all the Timeline sections, shown in this video I have created:

    I have not yet tried it in the beta version, so it may or may not already have been fixed.

    Peter77 likes this.
  8. qoobit


    Aug 19, 2014
    Not sure if people are still having this issue, but my specific case got a response a while back: it had to do with the GeForce Experience overlay causing these issues. If you've already disabled VSYNC, try turning off or uninstalling GeForce Experience to see if that helps.
    daisyDynamics likes this.
  9. daisyDynamics


    Dec 29, 2011
    Thank you! Disabling GeForce Experience Overlay instantly fixed the issue - made a massive difference.
  10. Peter77


    Jun 12, 2013
    Excellent catch!

    Did you submit a bug-report to Unity Technologies for this? I'm asking because if they don't have a bug-report for it, they're probably not aware of the problem and thus not going to fix it.
  11. DanJa512


    Dec 14, 2015
    I don't think it's a bug in Unity. It also does this when using the EVGA Precision X software, likely for the same reason (overlay).
  12. GuardHei


    Feb 10, 2018
    Interesting finding. My MacBook Pro 2017 has the same problem. In the Editor, gfx.waitForPresent goes significantly high when "Maximize On Play" is chosen. I'm not sure which graphics card the device is using when running in editor mode, because the MacBook Pro has two GPUs. I originally thought that when I chose High or Ultra quality, the device would automatically pick the most powerful GPU for me, but your finding casts doubt on that.
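
    One way to check which GPU Unity is actually rendering with (a sketch; useful on dual-GPU MacBook Pros):

    ```csharp
    using UnityEngine;

    // Log the active graphics device so you can confirm whether the
    // integrated or discrete GPU is in use.
    public class GpuInfoLogger : MonoBehaviour
    {
        void Start()
        {
            Debug.Log($"GPU: {SystemInfo.graphicsDeviceName}");
            Debug.Log($"Graphics API: {SystemInfo.graphicsDeviceType}");
            Debug.Log($"VRAM: {SystemInfo.graphicsMemorySize} MB");
        }
    }
    ```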
  13. Ogdy


    Dec 26, 2014
    Woah, I'm happy to see that I'm not alone, but I'm confused about why this is still an issue after more than 4 years!

    I don't know if there is a bug with Gfx.WaitForPresent, but there is for sure a bug somewhere. Just test it yourself:

    Make an empty scene and place a high poly model (for example, a subdivided sphere made in Blender) with any shader. Between 0 triangles and a few million, the CPU and GPU time remain under 1 ms for both, but after a certain threshold, it just explodes! Yet the GPU time is still under one ms.

    I don't get how this is possible. Either something is deeply broken in the engine, or the GPU cost isn't under 1 ms as it's written in the editor.
  14. GuardHei


    Feb 10, 2018
    What is stranger to me is that I don't really think my CPU needs to wait for the GPU. My GPU thread only takes 2.1s while the CPU thread takes 24s, with 90% of the CPU taken by Gfx.waitForPresent. If I just ignore the Gfx.waitForPresent time, both threads are roughly equal lengths.
  15. NGC6543


    Jun 3, 2015
    Hi, I'm experiencing this weird 'GFX.WaitForPresent' and need some advice!
    My scene is fairly simple: a 64x64 terrain, boundaries, a car and a target.
    I used the Lightweight Render Pipeline for my target device (Mac Mini, Late 2014) with all the materials switched to LightWeightPipeline/Standard, and an IL2CPP build for performance.
    It works fine on my dev machine (iMac), but on the Mac Mini GFX.WaitForPresent spikes high.
    (Test resolution: 1280x720)

    As you can see, those cat-like gradually growing spikes occur periodically, over and over again.
    One thing I've figured out is that disabling shadows prevents those spikes; my app then runs over 200 FPS on the Mac Mini. The V-sync option was irrelevant.

    I only used hard shadows, with the Shadowmap Resolution set to 1024 and a reasonable shadow distance (60). And there are roughly 10 objects casting shadows.
    So, I'm wondering how this can cause GFX.WaitForPresent.

    Mixed lighting (baked + realtime) caused this issue. After disabling the lightmap, the shadow option no longer causes this spike. I really want to use the lightmap, though...

    Sorry, correction: the Shadow Cascades option causes these spikes. Setting it to 'No Cascades' makes the app run smoothly.
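
    The workaround above can also be applied from code (a sketch; the 60-unit shadow distance matches the value mentioned earlier in this post):

    ```csharp
    using UnityEngine;

    // Disable shadow cascades and keep shadow distance modest to avoid
    // the cascade-related Gfx.WaitForPresent spikes described above.
    public class ShadowTuning : MonoBehaviour
    {
        void Awake()
        {
            QualitySettings.shadowCascades = 0;   // equivalent to "No Cascades"
            QualitySettings.shadowDistance = 60f; // match the scene's needs
        }
    }
    ```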
    Last edited: Aug 2, 2018
  16. jhughes2112


    Nov 20, 2014
    For what it's worth, the distance you render shadows makes a big difference, as does the resolution of your shadow maps. What I noticed is in my game, if you go above "medium resolution", it takes more time to generate the shadow maps, but looks about the same (after I tuned the cascades properly). The shadow distance matters a lot, too, because the farther you render them out, the more objects have to be re-rendered. Check the shadow passes in the shaders being used, too, because in a LOT of shaders, it's doing full complex calculations for what will ultimately be a black pixel. They have to do that only to support punch-alpha shadows, so in some cases you can make that a lot faster just by changing the shadow pass.

    For people who are confused by the fact that the GPU says it's only busy for 1ms but there's a long GFX.WaitForPresent, it's usually because it's only measuring how long it takes to send the instructions to the GPU. If you have a lot for it to render, it will take a long time before the GPU is available again, and you get GFX.WaitForPresent in the meantime. Simplify your shaders, reduce draw distance, find places where your CPU and GPU are waiting on each other, etc. The usual. Sometimes it's an asset pack on the store that's badly behaved, or a badly written shader or plugin. Try toggling things off at runtime until it magically goes faster. If you resize your window larger and your frame rate drops, it's a sure sign you have a fillrate / shader / GPU bottleneck.
    GuardHei likes this.
  17. SoftwareGeezers


    Jun 22, 2013
    Just been hit by this issue and it seems pretty random. A simple scene with 500 particles and a couple of background textures renders at 150fps when not maximised, with 5ms of GFX.WaitForPresent. Maximised, however, I hit 15-25 fps with profiler bumps, from 0 to 90 ms and back down to zero on maybe a half-second interval. I've toggled some aspects like the background and UI off and on, ending up with the same scene and yet a different profile; now I get spikes of 120ms rather than bumps.


    Reducing the number of particles reduces the WaitForPresent.
    Last edited: Aug 28, 2018
  18. Avietry


    Mar 4, 2015
    In my case GFX.WaitForPresent was caused by 4K textures. I reduced them to 512x512 and the problem was solved. Be sure to check each texture's import settings.
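
    One way to enforce that cap project-wide is an editor-side import hook (a sketch; place it in an "Editor" folder, and the 512 cap mirrors the fix described above — adjust to taste):

    ```csharp
    using UnityEditor;

    // Clamp the max import size of every texture so oversized 4K assets
    // can't sneak in and blow up fillrate/bandwidth on weak GPUs.
    public class TextureSizeCap : AssetPostprocessor
    {
        void OnPreprocessTexture()
        {
            var importer = (TextureImporter)assetImporter;
            if (importer.maxTextureSize > 512)
                importer.maxTextureSize = 512;
        }
    }
    ```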
  19. BrownBot


    Feb 27, 2013
    As weird as it sounds, uninstalling the Nvidia overlay and rebooting massively increased my framerate: from 60-odd to 800+ FPS.

    Now getting more what I'd expect from a 1060 GPU.