
Dual GPU Compatibility? Nvidia RTX NVLink - 2x 2080ti

Discussion in 'General Graphics' started by BlackBox_Team, May 29, 2019.

  1. BlackBox_Team

    Joined:
    Feb 15, 2017
    Posts:
    18
    I'm wondering if the Unity team can speak to whether 2018.3+ can utilize the added horsepower of dual RTX cards yet. We have an exhibit install that will finish at 8K, and we were hoping to leverage the power of two RTX 2080 Ti cards to ensure quality and a stable frame rate at that ultra-high resolution.

    I know that there were some advances made by Nvidia VRWorks in unlocking the SLI power of the previous generation of cards, but I can't find anything about the current architecture.
     
    olix4242 likes this.
  2. mgear

    Joined:
    Aug 3, 2010
    Posts:
    9,350
  3. BlackBox_Team

    Joined:
    Feb 15, 2017
    Posts:
    18
    Yeah, that's the only related post I had been able to find, and it links to an SLI Best Practices document "Last updated on 02/15/2011". I imagine that the new NVLink system that replaced SLI functions similarly, but has a much higher-speed bridge at 100 GB/s. We'll be testing dual cards soon, and though I highly doubt we're the first to try this, we'll definitely share what we discover as best we can.

    https://www.pcgamesn.com/nvidia-rtx-2080-ti-nvlink-performance
     
    CloudyVR and olix4242 like this.
  4. mgear

    Joined:
    Aug 3, 2010
    Posts:
    9,350
    If you don't mind, can you test my point cloud plugin on that rig?

    I've tested it with a few different setups (not SLI); results here:
    https://forum.unity.com/threads/released-point-cloud-viewer-tools.240536/page-7#post-4223734

    I'd be interested to know if there's any difference.
    You can download the tester here: https://www.dropbox.com/s/23huto75j8dpm03/PointCloudTesterV1-537p4.zip?dl=0

    - start the exe, select 1024x768, Fastest
    - then press the 0 key five times to add 50 million points
    - don't move the mouse while it's running, to keep all points in view
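
    For anyone recording results, here is a minimal FPS-logger sketch that could be dropped into any Unity scene alongside a test like this. It is not part of the tester download; the file name and the 5-second averaging window are arbitrary choices.

    Code (CSharp):

    using System.IO;
    using UnityEngine;

    // Minimal FPS logger sketch: averages the frame rate over fixed windows
    // and appends each sample to a text file. Attach to any GameObject.
    public class FpsLogger : MonoBehaviour
    {
        const float SampleSeconds = 5f; // arbitrary averaging window
        float accumTime;
        int accumFrames;

        void Update()
        {
            accumTime += Time.unscaledDeltaTime;
            accumFrames++;
            if (accumTime >= SampleSeconds)
            {
                float fps = accumFrames / accumTime;
                File.AppendAllText("fps_log.txt",
                    $"{Time.realtimeSinceStartup:F0}s  {fps:F1} fps\n");
                accumTime = 0f;
                accumFrames = 0;
            }
        }
    }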
     
  5. BlackBox_Team

    Joined:
    Feb 15, 2017
    Posts:
    18
    We love experimenting with photogrammetry and volumetric capture, so we'd enjoy getting a chance to test that in the next few weeks.
     
  6. olix4242

    Joined:
    Jul 21, 2013
    Posts:
    1,962
    I am also interested in how it works - mostly with render textures.
     
  7. CloudyVR

    Joined:
    Mar 26, 2017
    Posts:
    714
    Any update? I too have 2x 2080 Ti for VR and would like to fully utilize them. Why doesn't Unity already do this?
     
  8. bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,329
    The traditional way Nvidia SLI has been implemented by drivers is to render every other frame on alternating GPUs. This means you can render at a higher resolution and/or frame rate, but at the cost of an extra frame of latency.

    You can't do that for VR since latency is a primary concern, so you can't use the default driver behavior. On top of that, if you reuse any data from a previous frame you don't get any benefit at all, since each GPU has to wait for the previous frame to finish before it can start, and reusing previous-frame data is becoming increasingly common in games.

    That means to support SLI you potentially need to significantly rework your rendering pipeline for something only a tiny fraction of a percent of users will have, and even then there's a good chance it won't increase performance much.

    Unity has elected not to spend resources on this. However, Nvidia released their VRWorks plugin for Unity, which does most of the work for you, though it won't work with most post-processing plugins, and might break some other shaders, without extra work from you.
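
    To put rough numbers on that dependency point, here is a toy model (all figures invented, nothing measured) of why alternate-frame rendering roughly doubles throughput only when frames are independent, and degenerates to single-GPU speed when each frame reads the previous one:

    Code (CSharp):

    using System;

    // Toy model of alternate-frame rendering (AFR) scheduling. The 20 ms
    // per-frame cost is an invented number for illustration only.
    class AfrModel
    {
        const double FrameCostMs = 20.0;
        const int Frames = 8;

        static void Main()
        {
            var done = new double[Frames];

            // Independent frames: GPU (n % 2) renders frame n, so frame n
            // only has to wait for frame n-2 on the same GPU.
            for (int n = 0; n < Frames; n++)
                done[n] = (n < 2 ? 0 : done[n - 2]) + FrameCostMs;
            Console.WriteLine($"independent: {Frames} frames in {done[Frames - 1]} ms");

            // Dependent frames (frame n samples frame n-1's output): frame n
            // must wait for the *other* GPU to finish, so the GPUs serialize.
            for (int n = 0; n < Frames; n++)
                done[n] = (n == 0 ? 0 : done[n - 1]) + FrameCostMs;
            Console.WriteLine($"dependent:   {Frames} frames in {done[Frames - 1]} ms");
        }
    }

    Under these made-up numbers the independent case finishes 8 frames in 80 ms and the dependent case takes 160 ms, i.e. no better than one GPU.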
     
  9. CloudyVR

    Joined:
    Mar 26, 2017
    Posts:
    714
    Would it be possible for Unity to use dual GPUs where each eye gets its own dedicated GPU?
     
    Last edited: Dec 23, 2019
  10. nikescar

    Joined:
    Nov 16, 2011
    Posts:
    165
    VRWorks SLI works exactly that way. According to this post HERE you have to use MRS (Multi-Res Shading) instead of SPS (Single Pass Stereo), which makes sense: a single-pass-stereo draw submits both eyes in one pass, so it can't be split across two GPUs.

    HERE is a link explaining how it works and how it cuts down on the SLI latency problem.
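
    For contrast with the AFR model above, a per-eye split like VR SLI renders both eyes of the same frame concurrently, so there is no extra frame of latency. A companion sketch with the same kind of invented numbers (the 1 ms cross-GPU transfer cost in particular is a guess):

    Code (CSharp):

    using System;

    // Toy model of a per-eye GPU split (the VR SLI idea). All numbers are
    // invented for illustration; the transfer cost is a guess.
    class PerEyeModel
    {
        const double EyeCostMs = 10.0;  // assumed GPU time per eye
        const double TransferMs = 1.0;  // guessed cost to copy one eye image
                                        // to the GPU that presents to the HMD

        static void Main()
        {
            // Single GPU renders the two eyes one after the other.
            Console.WriteLine($"one GPU:  {2 * EyeCostMs} ms/frame");

            // Two GPUs render the eyes concurrently; the second GPU's image
            // is then transferred to the presenting GPU. Same-frame latency.
            Console.WriteLine($"two GPUs: {EyeCostMs + TransferMs} ms/frame");
        }
    }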
     
    CloudyVR likes this.
  11. CloudyVR

    Joined:
    Mar 26, 2017
    Posts:
    714
    This sounds great, until you read the reviews of the Unity plugin. Conclusion: dreadful. :(

    Why can't Unity itself target both GPUs to render the individual eyes concurrently?

    I know nothing about GPU scaling; I just wonder why we need SLI in the first place. Why would it not be possible for Unity to just send the left- and right-eye draw calls to two GPUs in parallel, instead of sending both sets of draw calls to a single GPU serially?

    Nvidia SLI seems very proprietary, and scaling up to an arbitrary number of GPUs is complicated stuff. But for VR, I believe that even if we were limited to just two GPUs, that would still be a major improvement over a single one, and it would cover 99.9% of the market (by my estimates :cool:).

    I wonder (purely theoretically): would it be possible to run two instances of the same game on one PC, where each instance uses its own GPU? If so, what stops Unity from running a single game in a similar way, with two 3D engines synced but offset for the left and right eye?
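
    Purely as an illustration of that thought experiment, a tiny launcher could start the same build twice and ask each instance for a different GPU. Assumptions here: "MyVRGame.exe" is a made-up build name, and Unity's -force-device-index player argument only selects a GPU on some platforms, so treat it as a placeholder; syncing the two engines per-eye and compositing into one HMD swapchain, the actual hard part, is not handled at all.

    Code (CSharp):

    using System.Diagnostics;

    // Illustrative sketch only: run two instances of the same player, asking
    // each for a different GPU. "-force-device-index" applies only on some
    // platforms (assumption), and "MyVRGame.exe" is a hypothetical build
    // name. Per-eye sync between the two instances is left unsolved.
    class TwoInstanceLauncher
    {
        static void Main()
        {
            for (int gpu = 0; gpu < 2; gpu++)
            {
                Process.Start(new ProcessStartInfo
                {
                    FileName = "MyVRGame.exe",
                    Arguments = $"-force-device-index {gpu}"
                });
            }
        }
    }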

    I really wish Unity could utilize both GPUs natively, without worrying about losing post-processing effects and/or having to incorporate poorly written plugins into our projects.

    Maybe SLI and combining GPUs is completely the wrong approach for VR (in Unity), maybe not, but currently I see no good solution while frame rates are suffering and the hardware is not the main bottleneck.
     
    Last edited: Jan 11, 2020