How to make a Unity build use the main graphics card instead of the integrated one.

Discussion in 'General Graphics' started by GloriaVictis, Feb 4, 2017.

  1. GloriaVictis

    GloriaVictis

    Joined:
    Sep 1, 2016
    Posts:
    133
    Hello,

    As the topic says: most PC users are not experienced enough to know how (or care) to set the preferred graphics card in the Nvidia control panel, so for many of them our game launches on the integrated card by default.

    This is a BIG part of our total refunds. People launch the game, we show them the message "You are using integrated graphics, please set your default graphics card in the Nvidia panel", but of course they don't give it a second thought and refund with a negative comment saying the game is not optimized at all.

    Is there any way to prevent that? I cannot believe that we are unable to set some kind of flag which would make the game start on the main graphics card by default.

    See that topic for example:
    http://steamcommunity.com/app/327070/discussions/0/133256080236664989/
     
    zetaFairlight likes this.
  2. Fabian-Haquin

    Fabian-Haquin

    Joined:
    Dec 3, 2012
    Posts:
    231
    Hi,

    From experience, it's more of a Windows/DirectX issue than a Unity one.

    From what I've observed, it depends on which graphics card your screen is connected to.
    If you have two screens, one connected to the integrated GPU and the other to the PCI graphics card, performance will always match the weakest card.

    Probably because Windows has to synchronize them, like two DDR RAM modules running at different speeds.

    This topic may help you: http://www.tomshardware.co.uk/answers/id-2701252/monitors-intel-nvidia-gpu.html
     
  3. GoGoGadget

    GoGoGadget

    Joined:
    Sep 23, 2013
    Posts:
    855
    Out of interest, how are you detecting whether they're using an integrated GPU (while also having a dedicated one)? I've encountered this issue in my game as well and haven't found a good solution.
     
  4. GloriaVictis

    GloriaVictis

    Joined:
    Sep 1, 2016
    Posts:
    133
    // Heuristic: sniff the reported device name for signs of an integrated GPU.
    if (SystemInfo.graphicsDeviceName.Contains("Intel") ||
        SystemInfo.graphicsDeviceName.Contains("Integrated") ||
        SystemInfo.graphicsDeviceName.Contains("0D"))
    {
        // Show debug/warning message to the player
    }

    Of course, it's not 100% reliable.
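
    For what it's worth, the PCI vendor ID might be a bit more robust than matching device-name strings (a sketch, assuming spotting the Intel chip is enough for your case; 0x8086 is Intel's vendor ID, 0x10DE is Nvidia, 0x1002 is AMD):

    // Sketch: vendor IDs are more stable than device-name strings.
    if (SystemInfo.graphicsDeviceVendorID == 0x8086)
    {
        // Show the same warning as above
    }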
     
    Ony and GoGoGadget like this.
  5. theANMATOR2b

    theANMATOR2b

    Joined:
    Jul 12, 2014
    Posts:
    7,790
    The experience I've had playing higher-end games on my PC is that the game usually suggests graphics settings based on my system. The previous post mentions the game already does this kind of hardware scan.
    Couldn't the game show a little popup that states: you have two graphics cards, and it is suggested to use this one instead of that one to get the best visual experience?
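
    Something like this could drive that popup (a minimal sketch; warningPanel is a hypothetical UI object you'd build yourself, and note that SystemInfo only reports the GPU the game is actually running on, so you can't easily list the other cards):

    using UnityEngine;

    // Sketch: on startup, show a warning panel if the active GPU looks integrated.
    public class GpuStartupCheck : MonoBehaviour
    {
        // Hypothetical warning UI, assigned in the Inspector.
        public GameObject warningPanel;

        void Awake()
        {
            // 0x8086 is Intel's PCI vendor ID; on a laptop with a dedicated
            // card this usually means we ended up on the integrated chip.
            if (SystemInfo.graphicsDeviceVendorID == 0x8086)
                warningPanel.SetActive(true);
        }
    }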
     
  6. bart_the_13th

    bart_the_13th

    Joined:
    Jan 16, 2012
    Posts:
    485
    Got it from here: http://answers.unity3d.com/questions/732531/how-to-do-i-make-a-windows-build-default-to-using.html
    I haven't tried it though, since I haven't built any PC game with Unity at the moment...
     
  7. GilCat

    GilCat

    Joined:
    Sep 21, 2013
    Posts:
    676
    Has anyone got this working yet?
    I ran into a similar problem recently where it picks the low-end card.
    The thing is that on Mac I get a dropdown menu to choose the card, but not on Windows.
     
  8. gecko

    gecko

    Joined:
    Aug 10, 2006
    Posts:
    2,238
    Same here -- in the resolution dialog, it defaults to the integrated card, not the dedicated card. Did you figure that out?
     
  9. s_guy

    s_guy

    Joined:
    Feb 27, 2013
    Posts:
    102
    I'm struggling with this issue too, and I'm on Unity 5.6, which doesn't even list GPU choices in the resolution dialog. Many devs would prefer to disable that placeholder launcher anyway.

    After a fair amount of research, it looks like we're at the mercy of the OS and drivers to determine the default GPU in these situations.

    According to Nvidia and AMD documentation, some versions of their drivers look for hints of various kinds to make this determination, but in general they will default to the integrated (low-power / low-performance) chip. You can see a few of the options developers have with Nvidia Optimus drivers, for example:

    http://developer.download.nvidia.co...megraphics/files/OptimusRenderingPolicies.pdf

    I've tried all of the self-serve options described in there and none worked for me. They were all a pain because they're well outside the normal dev process most Unity developers are used to (requiring C++ externs, linking static libraries, etc.). It's quite an old doc, but it's the best I could find. I don't even know if modern Nvidia drivers still use "Optimus" tech or these mechanisms for taking GPU hints anymore.

    Interestingly, it looks like Unity tries one of the self-serve options for us to get a good GPU choice. Your standalone builds should have certain exported values automatically set for those drivers to notice, which you can see with the VS dumpbin tool (e.g. dumpbin /exports YourGame.exe).

    [screenshot of the dumpbin output listing those exported names]

    Cross-reference those names with the documentation and you'll see that this is what their drivers supposedly look for to give you the good GPU by default.

    My best hope at this point may be to petition Nvidia to include my game's specs in their drivers, which might work for release, but is likely impractical for pre-release builds for beta testers and press.

    Would be really nice to get a Team Unity response on this issue that people have been fussing over forever without direction.
     
    Last edited: Sep 17, 2018
    GloriaVictis likes this.
  10. s_guy

    s_guy

    Joined:
    Feb 27, 2013
    Posts:
    102
    I did try this and it didn't work. I'm not sure that alone causes linking of the static library anyway. Does it?
     
    Last edited: Sep 13, 2018
  11. GloriaVictis

    GloriaVictis

    Joined:
    Sep 1, 2016
    Posts:
    133
    Bump -- is there any improvement on this matter in newer versions of Unity? I feel we are missing something, but seeing in our analytics how many players run the game on an integrated GPU without even knowing it is very strange.
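
    In the meantime, a cheap sanity check is to log exactly which device each player ended up on (a minimal sketch; wiring the string into your analytics is up to you):

    using UnityEngine;

    // Sketch: record the active GPU so analytics can count integrated-GPU players.
    public class GpuReport : MonoBehaviour
    {
        void Start()
        {
            Debug.Log("GPU: " + SystemInfo.graphicsDeviceName +
                      " (vendor 0x" + SystemInfo.graphicsDeviceVendorID.ToString("X") + ")");
        }
    }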