RenderTexture on HoloLens

Discussion in 'AR/VR (XR) Discussion' started by dj0002, Dec 15, 2017.

  1. dj0002

    Joined:
    Dec 15, 2017
    Posts:
    8
    I'm trying to merge the outputs of two cameras on HoloLens (each rendering different content using culling masks and layers). My secondary camera targets a RenderTexture, and my primary camera has a script on it that takes that RenderTexture and additively combines it with the primary camera's own output in a simple shader.
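
    Roughly, the compositing script on the primary camera does something like this (class, field, and shader property names here are just illustrative, not the exact code from the project):

    Code (CSharp):

        using UnityEngine;

        // Attached to the primary camera. Blends the secondary camera's
        // RenderTexture into the primary camera's output with an additive material.
        // The material is assumed to use a simple shader that samples _MainTex
        // and _SecondaryTex and adds them together.
        public class AdditiveCompositor : MonoBehaviour
        {
            public RenderTexture secondaryCameraOutput; // target texture of the second camera
            public Material additiveMaterial;           // material running the additive shader

            private void OnRenderImage(RenderTexture source, RenderTexture destination)
            {
                // Hand the second camera's image to the shader, then blit the
                // primary camera's image through it into the destination.
                additiveMaterial.SetTexture("_SecondaryTex", secondaryCameraOutput);
                Graphics.Blit(source, destination, additiveMaterial);
            }
        }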

    My issue is that while the objects in the primary camera's view look fine, the secondary camera seems to lose perspective when it renders to a texture instead of to the screen: its object doesn't track properly in space, as if it isn't being rendered in stereo.

    Both cameras use the HoloLensCamera prefab supplied by HoloToolkit, and both are set to render to both eyes. Does rendering to a texture instead of to the screen affect stereoscopic rendering?

    I've uploaded a sample project (it was too large to attach to the thread) to demonstrate the issue. In the editor it looks fine, but if you run it on a HoloLens you'll see a sphere rendered as expected and a cube that just looks strange.
     
  2. dj0002

    Joined:
    Dec 15, 2017
    Posts:
    8
    I ended up solving this myself. The solution was really simple and didn't need any RenderTextures or custom shaders at all. All it took was setting the second camera to a higher depth than the first and setting its Clear Flags to Depth Only. The second camera then draws its content over the top of the first camera's output without clearing away what the first camera has drawn.
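
    In case the same settings are easier to read as code, the equivalent of what I set in the inspector looks roughly like this (the field names are just for illustration):

    Code (CSharp):

        using UnityEngine;

        // Rough code equivalent of the inspector settings described above.
        public class OverlayCameraSetup : MonoBehaviour
        {
            public Camera primaryCamera;   // the main HoloLensCamera
            public Camera secondaryCamera; // the camera drawing the overlay content

            private void Awake()
            {
                // Render the second camera after the first one...
                secondaryCamera.depth = primaryCamera.depth + 1;

                // ...and clear only the depth buffer, so the colour output of the
                // first camera stays visible underneath the second camera's content.
                secondaryCamera.clearFlags = CameraClearFlags.Depth;
            }
        }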
     