
Resolved: Adding VR support to a Custom SRP

Discussion in 'VR' started by davesmeathers, Jan 18, 2021.

  1. davesmeathers

    davesmeathers

    Joined:
    Aug 18, 2015
    Posts:
    7
    I'm trying to add VR support to an existing custom render pipeline using single-pass instanced rendering. Does anyone know if there are any examples, tutorials, or documentation for this?

    This is what I've done so far, mostly through guesswork and poking around in the URP code:
    1. Added the XR Plugin Management package and set it to initialize on startup
    2. Added the Oculus XR plugin package and set it to Single Pass Instanced
    3. Changed my call to SetupCameraProperties() to pass true for the stereoSetup param
    4. Set up instancing like this:
      InitCommandBuffer.EnableShaderKeyword("UNITY_STEREO_INSTANCING_ENABLED");
      InitCommandBuffer.EnableShaderKeyword("STEREO_INSTANCING_ON");
      InitCommandBuffer.SetInstanceMultiplier(2);

    5. Set my test mesh to use a simple shader that has instancing support via the
      UNITY_INITIALIZE_VERTEX_OUTPUT_STEREO and
      UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX macros
    6. Got the XRDisplaySubsystem and used it to fill in the culling parameters instead of using the camera
    7. Rendered to the XRRenderPass renderTarget instead of the camera's render target
    8. Called renderContext.StartMultiEye and StopMultiEye around draw calls that should be rendered in stereo
    9. Called renderContext.StereoEndRender(camera) at the end of rendering, although I'm not sure whether it should go before or after calling Submit()
    It renders to the headset but only in the left eye. Looking at the "both eyes" view in the Game view in the editor shows the same thing: the left eye looks correct but the right eye is black.
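
    For reference, here is a rough sketch of how the steps above might fit together in a RenderPipeline.Render method. This is guesswork in the same spirit as the list: the class name, command buffer name, and the "SRPDefaultUnlit" pass tag are placeholders, and the ordering reflects my current understanding rather than anything official.

    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.Rendering;
    using UnityEngine.XR;

    public class TinyVRPipeline : RenderPipeline
    {
        static readonly ShaderTagId PassTag = new ShaderTagId("SRPDefaultUnlit");

        protected override void Render(ScriptableRenderContext context, Camera[] cameras)
        {
            // The active XR display subsystem, if any.
            var displays = new List<XRDisplaySubsystem>();
            SubsystemManager.GetInstances(displays);
            var display = displays.Count > 0 ? displays[0] : null;

            foreach (var camera in cameras)
            {
                // Step 3: pass true so the stereo view/projection constants get bound.
                context.SetupCameraProperties(camera, display != null);

                if (display == null || camera.cameraType != CameraType.Game)
                    continue; // non-XR path left out to keep the sketch short

                for (int i = 0; i < display.GetRenderPassCount(); ++i)
                {
                    display.GetRenderPass(i, out var xrPass);

                    // Step 6: culling parameters come from the XR display, not the camera.
                    display.GetCullingParameters(camera, xrPass.cullingPassIndex, out var cullingParams);
                    var cullResults = context.Cull(ref cullingParams);

                    // Step 4: enable single-pass instancing and double every instance count.
                    var cmd = new CommandBuffer { name = "XR Setup" };
                    cmd.EnableShaderKeyword("UNITY_STEREO_INSTANCING_ENABLED");
                    cmd.EnableShaderKeyword("STEREO_INSTANCING_ON");
                    cmd.SetInstanceMultiplier(2);

                    // Step 7: bind the XR pass render target. The -1 depth slice
                    // turned out to be important; see the follow-up post below.
                    cmd.SetRenderTarget(xrPass.renderTarget, 0, CubemapFace.Unknown, -1);
                    cmd.ClearRenderTarget(true, true, Color.black);
                    context.ExecuteCommandBuffer(cmd);
                    cmd.Release();

                    // Step 8: wrap the stereo draw calls.
                    context.StartMultiEye(camera);
                    var drawing = new DrawingSettings(PassTag, new SortingSettings(camera));
                    var filtering = FilteringSettings.defaultValue;
                    context.DrawRenderers(cullResults, ref drawing, ref filtering);
                    context.StopMultiEye(camera);
                }

                // Step 9: finish stereo rendering for this camera before Submit().
                context.StereoEndRender(camera);
            }

            context.Submit();
        }
    }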
     
  2. davesmeathers

    davesmeathers

    Joined:
    Aug 18, 2015
    Posts:
    7
    The problem was the way I was setting the render target. Setting it like this allows it to render to both eyes:

    CommandBuffer.SetRenderTarget(BuiltinRenderTextureType.CameraTarget, 0, CubemapFace.Unknown, -1)

    The important bit is using -1 for the depth slice. I didn't find any documentation to explain what this does exactly, but it's what URP does.
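
    As far as I can tell, the last argument selects which slice of an array render target to bind: a specific index binds only that slice, while -1 binds every slice. Since the single-pass instanced eye target is a two-slice texture array, -1 makes both eyes writable at once. A minimal illustration, assuming commandBuffer already exists:

    commandBuffer.SetRenderTarget(BuiltinRenderTextureType.CameraTarget,
                                  0,                    // mip level
                                  CubemapFace.Unknown,  // not a cubemap face
                                  -1);                  // -1 = all array slices (both eyes)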

    I created a small example project with a minimal custom render pipeline that supports VR. Hopefully it will help anyone facing similar issues:

    https://github.com/dsmeathersFireproof/TinyVRSRP
     
  3. KNOTGAMES

    KNOTGAMES

    Joined:
    Jan 23, 2017
    Posts:
    2
    Hi, I'm stuck on a related issue.
    I'm trying to render color and depth to two different textures.
    I've been trying to solve it for a while now and can't figure out a way yet.
    Any advice on how I could achieve it?
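
    The thread doesn't answer this, but as a starting point, here is a minimal sketch of one way to bind separate color and depth targets from a command buffer, assuming single-pass instanced XR (so both targets are two-slice texture arrays). All names here are illustrative, not from the project linked above.

    using UnityEngine;
    using UnityEngine.Rendering;

    public static class SeparateColorDepthSketch
    {
        public static void Bind(CommandBuffer cmd, int width, int height,
                                out RenderTexture colorRT, out RenderTexture depthRT)
        {
            // Color target: two-slice array, no depth bits of its own.
            var colorDesc = new RenderTextureDescriptor(width, height, RenderTextureFormat.ARGB32, 0)
            {
                dimension = TextureDimension.Tex2DArray,
                volumeDepth = 2
            };
            colorRT = new RenderTexture(colorDesc);

            // Depth target: same layout, 24-bit depth format.
            var depthDesc = new RenderTextureDescriptor(width, height, RenderTextureFormat.Depth, 24)
            {
                dimension = TextureDimension.Tex2DArray,
                volumeDepth = 2
            };
            depthRT = new RenderTexture(depthDesc);

            // Bind color and depth separately; -1 again means "all array slices".
            cmd.SetRenderTarget(colorRT, depthRT, 0, CubemapFace.Unknown, -1);
            cmd.ClearRenderTarget(true, true, Color.clear);
        }
    }

    After the pass has executed, depthRT can be sampled like any other texture; whether that fits will depend on what the depth is needed for.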