
GPU usage shoots up when I take Oculus headset off

Discussion in 'AR/VR (XR) Discussion' started by strauboelephant, Feb 10, 2020.

  1. strauboelephant

    strauboelephant

    Joined:
    Jul 22, 2019
    Posts:
    1
    [Attached: Screen Shot 2020-02-11 at 11.22.48 AM.png, headset_off.png, headset_on.png, took_headset_off.png]

    Hi there,

    I'm a long-time listener, first time caller.

    I'm attempting to maintain a Unity experience that uses an Oculus Rift S headset. It runs on a dedicated Alienware PC (Nvidia 1080) in one location, and that's all it needs to do. The game sits dormant most of the day and is only used occasionally. The display setup: there's the Oculus headset and a projector. When the headset isn't in use, the projector displays a video (via ProAV); when the game is being played, it mirrors what you see in the headset.

    I deleted some of the low-quality export settings: because of the way the game is launched (batch script), it seemed to auto-select the lowest existing quality setting every time. However, now that it runs on a higher quality setting, the game has been crashing. According to the logs, it's a ProAV error saying the computer was out of resources. It turns out the build was running at around 60% GPU usage while idle. When I put the headset on, GPU consumption dips (?!!) to 30-40%.
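
    As a possible alternative to deleting quality levels entirely, I've been considering pinning the level from script at startup so the batch launch can't fall back to the lowest one. A minimal sketch, assuming the index matches whatever levels the project actually defines (the component name and index value are placeholders of mine):

    Code (CSharp):

    using UnityEngine;

    // Minimal sketch: force a known quality level at startup so the
    // batch-script launch can't auto-select the lowest one.
    public class PinQualityLevel : MonoBehaviour
    {
        [SerializeField] int qualityLevelIndex = 2; // placeholder: index of the desired level

        void Awake()
        {
            // Second argument applies expensive changes (anti-aliasing, etc.) immediately.
            QualitySettings.SetQualityLevel(qualityLevelIndex, true);
        }
    }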

    When I take the headset off, the VSync process spikes, crushing the framerate to around 15fps. To be clear, this coincides with the GPU/CPU spike. Forums say VSync/WaitForGPU is not a problem, but.... is it not? It certainly feels like one in my use case. What I think is happening: VSync/WaitForGPU spikes CPU/GPU usage, and while the computer is in that elevated-consumption state, other running processes push it over the edge and the game crashes. This never happened (afaik) before I boosted the quality settings.

    I am actively reducing my quality settings, etc. to mitigate, but I still find the spike in CPU and GPU usage somewhat disturbing. Is this known behavior for VSync? Is VSync ever used as a protective process for the GPU? It seems bass-ackwards right now.
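
    For what it's worth, here's the kind of throttle I've been sketching to keep the player from free-running on the mirror display while nobody is wearing the headset. This is a minimal sketch assuming Unity's built-in XR user-presence API (available in 2018.2); the component name and the 30fps cap are placeholders:

    Code (CSharp):

    using UnityEngine;
    using UnityEngine.XR;

    // Minimal sketch: cap the frame rate while the headset is off, then
    // hand frame pacing back to the Oculus runtime when it's put back on.
    public class HeadsetIdleThrottle : MonoBehaviour
    {
        bool throttled;

        void Update()
        {
            bool headsetOff = XRDevice.userPresence == UserPresenceState.NotPresent;

            if (headsetOff && !throttled)
            {
                // targetFrameRate is ignored while vSync is on, so disable vSync first.
                QualitySettings.vSyncCount = 0;
                Application.targetFrameRate = 30; // placeholder idle cap
                throttled = true;
            }
            else if (!headsetOff && throttled)
            {
                // -1 restores the platform's default frame pacing.
                Application.targetFrameRate = -1;
                throttled = false;
            }
        }
    }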

    Apologies if this is a commonly known topic - I didn't see much info in my searches. Maybe I'm digging in the wrong direction? I can give more info about the setup: Unity 2018.2, using NewtonVR, etc etc - let me know what you need to know to tell me what I need to know. :)

    Thanks!
     
    Last edited: Feb 11, 2020