SteamVR, Canvases, RenderTextures, Oh My

Discussion in 'AR/VR (XR) Discussion' started by ScottHerz, Jun 21, 2018.

  1. ScottHerz

    ScottHerz

    Joined:
    Mar 18, 2014
    Posts:
    24
    I have an existing (traditional-display) scene I'm trying to recreate on the Vive. In the traditional scene, there are two cameras: the main camera and a second "effects" camera attached to it. The effects camera captures a similar view to the main camera. It runs what it captures (a few objects and a camera-space canvas) through a bunch of shaders, including one that crops to a region of interest, and ultimately outputs to a RenderTexture. That mostly transparent RenderTexture is then blit over the main camera's output. The result is some cool effects that appear over the rest of the world.
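
    For reference, the traditional-display compositing is roughly this shape (a sketch; `effectsTexture` and `alphaMaterial` are placeholder names, not the real ones from my project):

    Code (CSharp):
    // Sketch of the traditional-display compositing. The effects camera has
    // already rendered into effectsTexture; this component sits on the main camera.
    private void OnRenderImage(RenderTexture source, RenderTexture destination)
    {
        // Draw the world first
        Graphics.Blit(source, destination);

        // Then composite the mostly transparent effects texture over it,
        // using a material whose shader alpha-blends
        Graphics.Blit(effectsTexture, destination, alphaMaterial);
    }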

    That worked pretty well for my purposes on a traditional display. In moving to the Vive though, I'm having a tough time getting my second camera to create a transparent RenderTexture I can blit over the main camera.

    Either the effects camera isn't creating a stereoscopic render texture, or I'm losing that information during the blit. The effects camera content ends up in exactly the same place on the left and right displays, so you see double. As near as I can tell, my effects camera is set up correctly: its Target Eye is set to render Both Eyes, and the render texture I feed it was created using the descriptor from XRSettings.eyeTextureDesc. I've tried both single-pass and multi-pass rendering.

    In a bit of a flail, I cloned the main VR camera component via reflection and used that to create a render texture. It didn't look right at all, but at least the left/right images had different perspectives. That reinforced my sense that the effects camera isn't getting set up correctly.

    That's a long lead-in to ask: what's the right way to capture the output of a second camera so that shaders can work it over before it's blit over the main VR display?

    Thanks!
     
  2. ScottHerz

    ScottHerz

    Joined:
    Mar 18, 2014
    Posts:
    24
    I'm still struggling with this. Is there a way I can ask this more effectively? I can believe what I wrote isn't easy to follow without some back and forth.

    Thanks!
     
  3. SprAmir

    SprAmir

    Joined:
    May 16, 2018
    Posts:
    9
    I have a similar problem.
    Any new ideas?

    Thanks!
     
  4. ScottHerz

    ScottHerz

    Joined:
    Mar 18, 2014
    Posts:
    24
    I actually did get something to work. I'm not able to write it up this second, but will tonight.
     
  5. ScottHerz

    ScottHerz

    Joined:
    Mar 18, 2014
    Posts:
    24
    Here's what I ended up doing. I make no claims that this is remotely the right way to go about it :) I love being educated on the right way to do something, so hit me!

    First, I added two child GameObjects with Camera components to the parent of my main camera (so all the camera GameObjects are siblings). Each of those gets a component which constantly updates the various camera matrices and renders to a texture. The component has a flag denoting whether it's the left or right camera. Those cameras have a depth (0) lower than the main camera's (1), so they render first.
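
    The overall shape of that per-eye component is roughly this (a sketch; the class and field names are illustrative, and the two methods it calls are the ones shown in the snippets below):

    Code (CSharp):
    // Sketch of the per-eye component described above. Names are illustrative.
    // Requires: using UnityEngine;
    public class EyeEffectsCamera : MonoBehaviour
    {
        public bool IsLeft;                 // which eye this camera renders
        public RenderTexture RenderTexture; // created in Start, read by the compositor

        void Start()
        {
            CreateRenderTexture();
        }

        void LateUpdate()
        {
            UpdateWorldAndProjectionMatrices();
        }
    }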

    I create and assign the RenderTextures like so on Start:
    Code (CSharp):
    // Requires: using UnityEngine; using UnityEngine.XR;
    void CreateRenderTexture()
    {
        RenderTexture = new RenderTexture(XRSettings.eyeTextureDesc);
        GetComponent<Camera>().targetTexture = RenderTexture;
    }
    Next, I make sure that the camera matrices match the real XR camera. I call this as late and as often as I can:

    Code (CSharp):
    private void UpdateWorldAndProjectionMatrices()
    {
        transform.position = Camera.main.transform.position;
        transform.rotation = Camera.main.transform.rotation;

        Camera camera = GetComponent<Camera>();

        // Depending on which camera we are, we need to set up our projection
        // matrix to match the eye we're intending to render

        if (IsLeft) {
            camera.projectionMatrix = Camera.main.GetStereoNonJitteredProjectionMatrix(Camera.StereoscopicEye.Left);
            camera.worldToCameraMatrix = Camera.main.GetStereoViewMatrix(Camera.StereoscopicEye.Left);
        } else {
            camera.projectionMatrix = Camera.main.GetStereoNonJitteredProjectionMatrix(Camera.StereoscopicEye.Right);
            camera.worldToCameraMatrix = Camera.main.GetStereoViewMatrix(Camera.StereoscopicEye.Right);
        }
    }
    So the two cameras are making RenderTextures for the eye they're responsible for. There's another component over on the main camera which blits the correct texture depending on what eye is active. In my case, the two left/right eye cameras generate textures which need to be composited over the world.

    Code (CSharp):
    private void OnRenderImage(RenderTexture source, RenderTexture destination)
    {
        // Draw the world
        Graphics.Blit(source, destination);

        // Composite the texture for whichever eye is currently being rendered
        if (Camera.main.stereoActiveEye == Camera.MonoOrStereoscopicEye.Left) {
            RenderTexture leftTexture = LeftSource.GetComponent<Camera>().targetTexture;
            Graphics.Blit(leftTexture, destination, AlphaMaterial);
        } else {
            RenderTexture rightTexture = RightSource.GetComponent<Camera>().targetTexture;
            Graphics.Blit(rightTexture, destination, AlphaMaterial);
        }
    }
    I'm sure there are better ways to do this. I'd love to learn what they are!
     
    SprAmir likes this.
  6. copperrobot

    copperrobot

    Joined:
    May 22, 2013
    Posts:
    69
    Hey,

    Not sure if I'm late to this. I have an issue where the image being overlaid looks visually correct, and I can add post effects to it etc., but it doesn't seem stable. It appears to shake, or lag behind the current frame by one, something like that. If the Vive is sitting still it's fine, but as soon as it moves it becomes apparent.

    The cameras seem solid - I don't think that's where the problem is creeping in. I think it's in how the render texture is being applied via the Blit.

    Anything I could try?
     
  7. jeo77

    jeo77

    Joined:
    Jan 5, 2012
    Posts:
    5
    I'm not sure if you're using Update() to call the UpdateWorldAndProjectionMatrices() function, but maybe try using OnPreRender() instead?