How to get the web player to detect and use the dedicated GPU over the integrated one (or vice versa)

Discussion in 'Editor & General Support' started by ironbelly, Mar 28, 2014.

  1. ironbelly

    ironbelly

    Joined:
    Dec 26, 2011
    Posts:
    597
    I have looked far and wide and have yet to find an explanation of how to accomplish this. For the standalone player it's no problem, but I can find nothing for the web player. I have tested hundreds of web players and the results are always the same: Unity uses the integrated GPU even when a dedicated GPU is waiting in the wings. It doesn't matter if I go into my graphics card settings (the Nvidia control panel in my case) and make the dedicated GPU the default for everything, and it doesn't matter if I add Chrome to the list of 3D programs and manually set it to use my GPU. I can't find any way to accomplish this and was hoping that someone has figured it out.

    Thanks
     
  2. Graham-Dunnett

    Graham-Dunnett

    Administrator

    Joined:
    Jun 2, 2009
    Posts:
    4,287
    The Unity web player knows nothing about the machine it's running on. It asks DirectX to start up and simply uses whatever device is provided.
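
    While the player can't choose its adapter, it can at least report which one DirectX provided. A minimal diagnostic sketch, assuming a script attached to any GameObject in the first scene (the class name GpuReporter is illustrative; the SystemInfo calls are Unity's real API):

        using UnityEngine;

        // Diagnostic: attach to any GameObject in the first scene.
        // Logs which graphics adapter DirectX actually handed to the player,
        // so you can tell integrated from dedicated at a glance.
        public class GpuReporter : MonoBehaviour
        {
            void Awake()
            {
                Debug.Log("GPU: " + SystemInfo.graphicsDeviceName);
                Debug.Log("Vendor: " + SystemInfo.graphicsDeviceVendor);
                Debug.Log("API/driver: " + SystemInfo.graphicsDeviceVersion);
                Debug.Log("VRAM (MB): " + SystemInfo.graphicsMemorySize);
            }
        }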
     
  3. ironbelly

    ironbelly

    Joined:
    Dec 26, 2011
    Posts:
    597
    Hey Graham, this doesn't really answer my question directly, although it does in a roundabout way. Could you just let me know whether it is possible at all for a Unity web player to make use of a dedicated GPU on laptops that have both integrated and dedicated chips? If it is possible, could you send over some examples showing it in action? If it isn't possible, is this something that will be addressed in Unity 5? I understand this might be completely out of Unity's hands and a DirectX problem; we just need to know if we should be allocating resources to ensuring GPUs are properly used in our web player games, or if it would be a fool's errand with no solution in sight.

    Thanks in advance
     
    Last edited: Apr 8, 2014
  4. Schubkraft

    Schubkraft

    Unity Technologies

    Joined:
    Dec 3, 2012
    Posts:
    1,073
    Can you add some more details?
    I run Win7 with an Intel HD 4000 / Nvidia NVS 5400 and have the default GPU set to the NVS. When starting the web player it uses the NVS right away.
     
  5. ironbelly

    ironbelly

    Joined:
    Dec 26, 2011
    Posts:
    597
    Windows 7, Nvidia 540M, Intel i7 chip.

    I have set the Nvidia card as the default GPU, then gone to something like nplay.com/BeGone or any other web app, and the GPU stays dark.

    When we build projects and load them in the standalone player, the GPU kicks in fine, but never with the web player.

    @Schubkraft - could you send over a link to a web player that you can get to work? It will help determine whether this is an issue with my hardware or with a particular web player. Thanks
     
    Last edited: Apr 8, 2014
  6. Schubkraft

    Schubkraft

    Unity Technologies

    Joined:
    Dec 3, 2012
    Posts:
    1,073
  7. makeshiftwings

    makeshiftwings

    Joined:
    May 28, 2011
    Posts:
    3,350
    You're talking about Nvidia Optimus, which is a general problem with some Nvidia cards on laptops, not really Unity-specific: the driver fails to realize that 3D rendering is happening and never switches to the real graphics card. A couple of things you can try:

    1) Go to Windows Control Panel -> Power Options -> select the "High performance" plan

    2) In the Nvidia control panel, under Manage 3D settings, set the preferred graphics processor to "High-performance NVIDIA processor"

    3) Try right-clicking your browser icon and, if the option is there, selecting "Run with graphics processor -> Nvidia"

    Some info: http://www.howtogeek.com/136123/htg-explains-what-you-need-to-know-about-nvidia-optimus/
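
    Since none of these switches can be forced from inside a web player build, about the best the game itself can do is detect the integrated chip and tell the player. A minimal sketch, assuming a crude vendor-string heuristic (the class name and the heuristic are illustrative, not from this thread):

        using UnityEngine;

        // Crude runtime check: if the adapter Unity received looks like an
        // Intel integrated chip, show a hint so Optimus users know to force
        // the dedicated GPU via the Nvidia control panel or right-click menu.
        public class IntegratedGpuWarning : MonoBehaviour
        {
            bool onIntegrated;

            void Awake()
            {
                // Vendor-string heuristic (an assumption, not an official
                // check): Optimus machines report the Intel adapter when
                // the dedicated GPU never kicked in.
                onIntegrated = SystemInfo.graphicsDeviceVendor.ToLower().Contains("intel");
            }

            void OnGUI()
            {
                if (onIntegrated)
                {
                    GUI.Label(new Rect(10, 10, 600, 40),
                        "Running on " + SystemInfo.graphicsDeviceName +
                        ". For better performance, set your browser to use the Nvidia GPU.");
                }
            }
        }

    The heuristic is deliberately rough, but it matches the symptom described in this thread: when Optimus never switches over, SystemInfo reports the Intel chip.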
     