
Question: OpenVR wireless pixel streaming to Unity app running on an AR HMD

Discussion in 'AR' started by UnDreaming, Mar 5, 2021.

  1. UnDreaming

    Joined:
    Feb 5, 2013
    Posts:
    3
    Hi,

    I'm working on a small project that allows presenting highly detailed 3D scenes on an AR HMD (not VR) with limited resources.

    Based on my research (mainly NVIDIA's public information on CloudXR), it seemed like the fastest approach would be to use OpenVR with a custom driver, especially since the source 3D package already supports OpenVR.

    The custom driver on a PC properly receives images from the 3D package and communicates input to it.

    Now, I'm planning to connect the HMD to the driver on a PC using sockets.
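
    To make the plan concrete, here's a rough sketch of what I have in mind for the receiving side in Unity. This assumes uncompressed, fixed-size RGBA32 frames streamed back-to-back over TCP; the host, port, resolution and framing below are placeholders, and a real setup would probably need proper video encoding (as CloudXR does), but it shows the plumbing I'm thinking of:

    Code (CSharp):
    using System;
    using System.Net.Sockets;
    using System.Threading;
    using UnityEngine;

    // HMD-side receiver sketch: reads raw RGBA32 frames for each eye from the
    // PC driver over TCP and uploads them into two Texture2Ds.
    // Host, port, resolution and the "left bytes then right bytes" framing are
    // assumptions for illustration only.
    public class EyeFrameReceiver : MonoBehaviour
    {
        public string driverHost = "192.168.0.10"; // PC running the custom OpenVR driver
        public int driverPort = 9999;
        public int eyeWidth = 1440;
        public int eyeHeight = 1600;

        public Texture2D leftEyeTexture;
        public Texture2D rightEyeTexture;

        TcpClient client;
        NetworkStream stream;
        Thread receiveThread;
        byte[] leftBuffer, rightBuffer;
        volatile bool frameReady;
        readonly object frameLock = new object();

        void Start()
        {
            leftEyeTexture  = new Texture2D(eyeWidth, eyeHeight, TextureFormat.RGBA32, false);
            rightEyeTexture = new Texture2D(eyeWidth, eyeHeight, TextureFormat.RGBA32, false);
            leftBuffer  = new byte[eyeWidth * eyeHeight * 4];
            rightBuffer = new byte[eyeWidth * eyeHeight * 4];

            client = new TcpClient(driverHost, driverPort);
            stream = client.GetStream();
            receiveThread = new Thread(ReceiveLoop) { IsBackground = true };
            receiveThread.Start();
        }

        void ReceiveLoop()
        {
            var left  = new byte[leftBuffer.Length];
            var right = new byte[rightBuffer.Length];
            try
            {
                while (true)
                {
                    ReadExactly(left);   // left eye image
                    ReadExactly(right);  // right eye image
                    lock (frameLock)
                    {
                        Buffer.BlockCopy(left,  0, leftBuffer,  0, leftBuffer.Length);
                        Buffer.BlockCopy(right, 0, rightBuffer, 0, rightBuffer.Length);
                        frameReady = true;
                    }
                }
            }
            catch (Exception e)
            {
                Debug.LogWarning("Receive loop stopped: " + e.Message);
            }
        }

        void ReadExactly(byte[] buffer)
        {
            int offset = 0;
            while (offset < buffer.Length)
            {
                int read = stream.Read(buffer, offset, buffer.Length - offset);
                if (read <= 0) throw new Exception("Connection closed");
                offset += read;
            }
        }

        void Update()
        {
            // Texture uploads have to happen on the main thread.
            if (!frameReady) return;
            lock (frameLock)
            {
                leftEyeTexture.LoadRawTextureData(leftBuffer);
                rightEyeTexture.LoadRawTextureData(rightBuffer);
                frameReady = false;
            }
            leftEyeTexture.Apply(false);
            rightEyeTexture.Apply(false);
        }
    }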

    On the driver side, I have access to the texture data for each eye, and I'm wondering how to display it in Unity on the HMD side.

    I'm thinking about using a plane with a material that uses a stereo-capable shader, like the one described here: https://docs.unity3d.com/Manual/SinglePassInstancing.html, but I'm not sure whether that will display properly.
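
    The C# side I'm picturing would just push the two received textures into the plane's material and let the shader pick one of them based on unity_StereoEyeIndex, as in the single-pass instancing docs. The _LeftEyeTex/_RightEyeTex property names below are just placeholders for whatever the shader actually exposes:

    Code (CSharp):
    using UnityEngine;

    // Sketch of binding the streamed per-eye textures to a quad in front of the
    // camera. Assumes the quad's material uses a custom stereo-aware shader that
    // samples _LeftEyeTex or _RightEyeTex depending on unity_StereoEyeIndex;
    // those property names are placeholders, not Unity built-ins.
    [RequireComponent(typeof(Renderer))]
    public class StereoQuadBinder : MonoBehaviour
    {
        public EyeFrameReceiver receiver; // the receiver component from the sketch above

        static readonly int LeftTexId  = Shader.PropertyToID("_LeftEyeTex");
        static readonly int RightTexId = Shader.PropertyToID("_RightEyeTex");

        Material quadMaterial;

        void Start()
        {
            // Accessing .material instantiates a copy, so the shared asset isn't modified.
            quadMaterial = GetComponent<Renderer>().material;
        }

        void LateUpdate()
        {
            if (receiver == null || receiver.leftEyeTexture == null) return;
            quadMaterial.SetTexture(LeftTexId,  receiver.leftEyeTexture);
            quadMaterial.SetTexture(RightTexId, receiver.rightEyeTexture);
        }
    }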

    Does my approach sound reasonable? Are there alternatives that I should consider?

    I've also checked OpenXR, which will probably cover my case natively eventually; however, I believe that, at this point, OpenXR doesn't support a wireless connection to the HMD, which is an important requirement for my project.