[SOLVED] possible to let the Camera render to a Texture2DArray with single pass stereo

Discussion in 'AR/VR (XR) Discussion' started by catox, Apr 20, 2017.

  1. catox
    Joined: Dec 19, 2014
    Posts: 15
    I am a developer of an Android-based standalone HMD, not Daydream or Cardboard. In Unity, it uses two Cameras rendering to separate RenderTextures, and a native plugin then draws these two Texture2Ds on one display. AFAIK, the native plugin does the ATW work and the anti-distortion correction for the lenses to achieve the best view.
    Recently, I decided to give the newest Unity 5.6.0f3 a try, and the VR single-pass mode is the most interesting part of it. I tested it on the device without the native plugin by selecting "Split Stereo Display (non head-mounted)" in Unity. It yields around 30% better performance than the two-camera approach.
    Since I have to use the native plugin to achieve the best view (BTW, I could make changes to the native plugin, since its source code is available), I tried several ways in VR single-pass mode to let the Camera render to a Texture2DArray. That would be ideal, because only small changes to the native plugin would make it work. But I failed!
    So I am asking for help here. Is there any possible way in VR single-pass mode to let a Camera render to a Texture2DArray, or something similar that I could use as a workaround?
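    For context, a minimal sketch of such a two-camera setup (class, field, and resolution names are illustrative, not taken from the actual plugin; the native call at the end is a placeholder):

    ```csharp
    using UnityEngine;

    // Hypothetical two-camera stereo rig: each eye camera renders into its
    // own RenderTexture, which a native plugin later composites (with ATW
    // and lens distortion correction) onto the single display.
    public class TwoCameraStereoRig : MonoBehaviour
    {
        public Camera leftEyeCamera;
        public Camera rightEyeCamera;
        public int eyeWidth = 1024;   // illustrative per-eye resolution
        public int eyeHeight = 1024;

        RenderTexture leftEyeTexture;
        RenderTexture rightEyeTexture;

        void Start()
        {
            leftEyeTexture = new RenderTexture(eyeWidth, eyeHeight, 24);
            rightEyeTexture = new RenderTexture(eyeWidth, eyeHeight, 24);

            leftEyeCamera.targetTexture = leftEyeTexture;
            rightEyeCamera.targetTexture = rightEyeTexture;

            // The native plugin would receive the GPU texture handles, e.g.:
            // NativeSetEyeTextures(leftEyeTexture.GetNativeTexturePtr(),
            //                      rightEyeTexture.GetNativeTexturePtr());
        }
    }
    ```
    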

    The best result I got so far is to set up a RenderTexture like this:
    RenderTexture rt = new RenderTexture(width, height, 24);
    rt.dimension = UnityEngine.Rendering.TextureDimension.Tex2DArray;
    rt.volumeDepth = 2;
    and set Camera.stereoTargetEye = StereoTargetEyeMask.Both.
    But when I assign this RenderTexture to the Camera, I only get an image in slice 0, and slice 1 stays black.
    After more tests, I found that right after I set Camera.targetTexture, the value of Camera.stereoEnabled changes from true to false. That would explain why one slice stays black.
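    One possible workaround under these constraints (an untested sketch, not a confirmed solution: it keeps the two-camera rendering cost, so it gives the Texture2DArray layout for the native plugin but not the single-pass performance win, and Graphics.CopyTexture requires matching formats/sizes and CopyTexture support on the device):

    ```csharp
    using UnityEngine;
    using UnityEngine.Rendering;

    // Hypothetical workaround: each frame, copy the two per-eye
    // RenderTextures into the two slices of a Tex2DArray RenderTexture
    // that the native plugin samples.
    public class CopyEyesToArray : MonoBehaviour
    {
        public RenderTexture leftEye;   // regular 2D RenderTexture, left eye
        public RenderTexture rightEye;  // regular 2D RenderTexture, right eye

        RenderTexture eyeArray;

        void Start()
        {
            eyeArray = new RenderTexture(leftEye.width, leftEye.height, 0, leftEye.format);
            eyeArray.dimension = TextureDimension.Tex2DArray;
            eyeArray.volumeDepth = 2;   // slice 0 = left, slice 1 = right
            eyeArray.Create();
        }

        void LateUpdate()
        {
            // CopyTexture(src, srcElement, dst, dstElement):
            // a 2D texture has a single element (0); dstElement selects the slice.
            Graphics.CopyTexture(leftEye, 0, eyeArray, 0);
            Graphics.CopyTexture(rightEye, 0, eyeArray, 1);
        }
    }
    ```

    On most GPUs this copy stays on-device, so it should be much cheaper than a blit, but it does not address the Camera.stereoEnabled issue itself.
    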
     
  2. catox
    Joined: Dec 19, 2014
    Posts: 15