
Question: force the use of the dedicated GPU

Discussion in 'Web' started by andyz, May 26, 2021.

  1. andyz

    Joined:
    Jan 5, 2010
    Posts:
    2,276
When I run WebGL content on my laptop, the content uses the built-in Intel GPU (to the max!) rather than the dedicated GPU. Is there any way to force the use of the dedicated GPU from the website side?

(You can force a web browser to use a particular GPU through the power and NVIDIA settings, but that is not ideal to have to ask end users to do.)
     
  2. jukka_j

    Unity Technologies

    Joined:
    May 4, 2018
    Posts:
    953
    Which Unity version are you using? I recall this was adjusted in some Unity version, although I don't exactly remember which.

    If that is somehow not working, you should be able to explicitly force it by setting

    config['powerPreference'] = 'high-performance';

    in the config object for createUnityInstance, see https://docs.unity3d.com/Manual/webgl-graphics.html
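
    For reference, a minimal sketch of what that could look like in a WebGL template's index.html. The build file names, product info and canvas selector below are placeholders, not anything specific to this project:

    Code (JavaScript):
    // Minimal sketch of an index.html loader call (2020-style WebGL template).
    // Build file names, product info and the canvas selector are placeholders.
    var canvas = document.querySelector("#unity-canvas");

    var config = {
        dataUrl: "Build/mygame.data",
        frameworkUrl: "Build/mygame.framework.js",
        codeUrl: "Build/mygame.wasm",
        companyName: "DefaultCompany",
        productName: "MyGame",
        productVersion: "1.0",
    };

    // Hint the browser to prefer the dedicated (high-performance) GPU.
    // This is only a hint; the browser and OS can still override it.
    config['powerPreference'] = 'high-performance';

    createUnityInstance(canvas, config).catch(function (message) {
        alert(message);
    });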
     
  3. andyz

    Joined:
    Jan 5, 2010
    Posts:
    2,276
    This was with 2019.4.24, will check on it, thanks. Edit: it seems different there - can it be done on 2019 LTS?
    This was previously a big stumbling block with WebGL, along with Safari on Mac complaining about energy use! Though they may have changed that now.
     
    Last edited: May 26, 2021
  4. andyz

    Joined:
    Jan 5, 2010
    Posts:
    2,276
    @jukka_j it looks like the fix landed in 2020.1.15, but I don't see a backport in 2019 LTS - so was that bit backported?
    Can I force it in 2019, and how?
    • WebGL: Added support for PVRTC and RG16 textures. Enable use of high-performance WebGL GPU powerPreference. (1187965)
     
  5. jukka_j

    Unity Technologies

    Joined:
    May 4, 2018
    Posts:
    953
    You should be able to force it by explicitly setting the powerPreference in context creation attributes.

    See 2019.4 documentation at https://docs.unity3d.com/2019.4/Doc...9.1944342225.1621934556-1045188262.1576837928

    Code (JavaScript):
    UnityLoader.instantiate("unityContainer", "%UNITY_WEBGL_BUILD_URL%", {
        Module: {
            "webglContextAttributes": {
                "preserveDrawingBuffer": true,
                "powerPreference": "high-performance"
            },
        }
    });
     
  6. andyz

    Joined:
    Jan 5, 2010
    Posts:
    2,276
    The documentation does not explain that particular attribute, and it does not seem to work like that.
    Well, I guess I will have to try 2020.
     
  7. jukka_j

    Unity Technologies

    Joined:
    May 4, 2018
    Posts:
    953
    Agreed, the documentation is not very thorough here. The "webglContextAttributes" object there maps to the WebGLContextAttributes dictionary in the official WebGL specification at https://www.khronos.org/registry/webgl/specs/latest/1.0/#5.2 . So it should be possible to configure any context attribute field from there.
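
    For what it's worth, those fields correspond to what a plain (non-Unity) page would pass to canvas.getContext - a small illustrative sketch, with a hypothetical canvas id:

    Code (JavaScript):
    // Plain WebGL equivalent of the webglContextAttributes fields above,
    // shown only to illustrate how they map to the WebGL specification.
    var canvas = document.getElementById("glCanvas"); // hypothetical canvas id
    var gl = canvas.getContext("webgl", {
        preserveDrawingBuffer: true,
        // "default", "low-power" or "high-performance" per the spec
        powerPreference: "high-performance"
    });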
     
  8. FactotumDigital

    Joined:
    Mar 8, 2018
    Posts:
    2
    It's an old issue, but I found that by default Windows laptops won't let apps decide which GPU to run on (they default to the integrated one instead of the dedicated one), so these changes didn't have any effect for me.
    You need to go to Windows Settings > System > Display > Graphics, press "Add desktop app", and find your browser executable (in my case C:/Program Files/Mozilla Firefox/firefox.exe). Once you add it, expand its options and set the GPU preference to High performance (with the name of your GPU in brackets).
    Also, for Firefox specifically, you might need to navigate to about:config and set the webgl.enable-privileged-extensions flag to true.
    P.S. Remember to restart the browser after these changes.