
How to create stereo RenderTextures and cameras?

Discussion in 'VR' started by trzy, Jul 4, 2020.

  1. trzy

    trzy

    Joined:
    Jul 2, 2016
    Posts:
    90
    I'd like to implement a portal effect in VR. To do this, I've duplicated the XR Rig's CenterEye Anchor camera to observe the other side of the portal and render to a RenderTexture whose dimensions are set programmatically to Screen.width and Screen.height. This works great in the Editor because there is no stereo rendering going on and the center camera is exactly what is used.

    However, this obviously does not work when deployed to my Quest. I'm stumped as to how to proceed. I set Multiview as my rendering mode in the Oculus XR settings, which I believe is equivalent to Single Pass Stereo.

    But how do I create cameras that duplicate the stereo view? How do I create the RenderTexture and have each eye render to the appropriate side? How do I even size that texture?

    I can't find any working examples on the forum.

    EDIT:

    Here's what I tried just now:

    - Modify my portal shader to accept two textures, left and right, for stereo mode.
    - Modify my portal script to disable the single camera and, in stereo mode, create two cameras and two render textures.

    On device, it just renders black.

    Shader

    Code (csharp):
    Shader "Custom/Portal"
    {
        Properties
        {
            _MainTex ("Texture", 2D) = "white" {}
            _LeftEyeTexture ("Texture", 2D) = "white" {}
            _RightEyeTexture ("Texture", 2D) = "white" {}
        }
        SubShader
        {
            Tags { "RenderType"="Opaque" }
            LOD 100
            Pass
            {
                CGPROGRAM
                #pragma vertex vert
                #pragma fragment frag
                // make fog work
                #pragma multi_compile_fog
                #include "UnityCG.cginc"

                struct appdata
                {
                    float4 vertex : POSITION;
                    float2 uv : TEXCOORD0;
                };

                struct v2f
                {
                    float4 screenPos : TEXCOORD0;
                    UNITY_FOG_COORDS(1)
                    float4 vertex : SV_POSITION;
                };

                sampler2D _MainTex;
                float4 _MainTex_ST;
                sampler2D _LeftEyeTexture;
                sampler2D _RightEyeTexture;

                v2f vert (appdata v)
                {
                    v2f o;
                    o.vertex = UnityObjectToClipPos(v.vertex);
                    o.screenPos = ComputeScreenPos(o.vertex); // use the portal's screen position to sample the render texture (which is our screen)
                    UNITY_TRANSFER_FOG(o, o.vertex);
                    return o;
                }

                fixed4 frag (v2f i) : SV_Target
                {
                    float2 uv = i.screenPos.xy / i.screenPos.w; // clip space -> normalized texture coordinates
                    uv = UnityStereoTransformScreenSpaceTex(uv);
                    // sample the texture (UNITY_SINGLE_PASS_STEREO is the macro Unity actually defines)
                #if defined(UNITY_SINGLE_PASS_STEREO)
                    fixed4 col = tex2D(_LeftEyeTexture, uv);
                #else
                    fixed4 col = tex2D(_MainTex, uv);
                #endif
                    // apply fog
                    UNITY_APPLY_FOG(i.fogCoord, col);
                    return col;
                }
                ENDCG
            }
        }
    }
    Portal Script

    Code (csharp):
    using UnityEngine;

    public class Portal : MonoBehaviour
    {
      [Tooltip("Camera observing the other side of the portal.")]
      [SerializeField]
      private Camera m_otherCamera;

      [Tooltip("The other portal transform, which must be the equivalent transform to this portal's.")]
      [SerializeField]
      private Transform m_otherPortal;

      private MeshRenderer m_ourPortalRenderer;

      private void Update()
      {
        Vector3 userOffsetFromPortal = Camera.main.transform.position - transform.position;
        m_otherCamera.transform.position = m_otherPortal.transform.position + userOffsetFromPortal;
        float angularDifferenceBetweenPortalRotations = Quaternion.Angle(transform.rotation, m_otherPortal.rotation);
        Quaternion portalRotationDelta = Quaternion.AngleAxis(angularDifferenceBetweenPortalRotations, Vector3.up);
        Vector3 newCameraDirection = portalRotationDelta * Camera.main.transform.forward;
        m_otherCamera.transform.rotation = Quaternion.LookRotation(newCameraDirection, Vector3.up);
      }

      private void Start()
      {
        if (m_otherCamera.targetTexture != null)
        {
          m_otherCamera.targetTexture.Release();
        }
        Debug.LogFormat("Stereo={0}", Camera.main.stereoEnabled);
        if (!Camera.main.stereoEnabled)
        {
          m_otherCamera.targetTexture = new RenderTexture(Camera.main.pixelWidth, Camera.main.pixelHeight, 24);
          m_ourPortalRenderer.material.mainTexture = m_otherCamera.targetTexture;
        }
        else
        {
          // Disable the camera and attach stereo cameras
          m_otherCamera.enabled = false;
          GameObject left = new GameObject("LeftEye");
          left.transform.parent = m_otherCamera.transform;
          left.tag = m_otherCamera.gameObject.tag;
          //left.transform.localPosition = -Vector3.right * Camera.main.stereoSeparation;
          GameObject right = new GameObject("RightEye");
          right.transform.parent = m_otherCamera.transform;
          right.tag = m_otherCamera.gameObject.tag;
          //right.transform.localPosition = Vector3.right * Camera.main.stereoSeparation;
          Camera leftCamera = left.AddComponent<Camera>();
          Camera rightCamera = right.AddComponent<Camera>();
          leftCamera.CopyFrom(m_otherCamera);
          rightCamera.CopyFrom(m_otherCamera);

          leftCamera.projectionMatrix = Camera.main.GetStereoProjectionMatrix(Camera.StereoscopicEye.Left);
          rightCamera.projectionMatrix = Camera.main.GetStereoProjectionMatrix(Camera.StereoscopicEye.Right);

          leftCamera.targetTexture = new RenderTexture(leftCamera.pixelWidth, leftCamera.pixelHeight, 24);
          rightCamera.targetTexture = new RenderTexture(rightCamera.pixelWidth, rightCamera.pixelHeight, 24);
          leftCamera.enabled = true;
          rightCamera.enabled = true;
          m_ourPortalRenderer.material.SetTexture("_LeftEyeTexture", leftCamera.targetTexture);
          m_ourPortalRenderer.material.SetTexture("_RightEyeTexture", rightCamera.targetTexture);
        }
      }

      private void Awake()
      {
        m_ourPortalRenderer = GetComponentInChildren<MeshRenderer>();
        Debug.Assert(m_otherCamera != null);
        Debug.Assert(m_otherPortal != null);
        Debug.Assert(m_ourPortalRenderer != null);
      }
    }
     
    Last edited: Jul 4, 2020
  2. bricefr

    bricefr

    Joined:
    May 3, 2015
    Posts:
    43
    Funny, I'm actually trying to do the same thing with the same technique (camera + RenderTexture), but using multi-pass... no success yet. I get some results in VR mode, but the image is blurred, as if both eyes were rendered into the texture; it seems the texture rendering is not done with the pass matching the correct eye.

    I may be wrong, but I thought that multi-pass rendering + a RenderTexture with vrUsage set up + stereo-enabled cameras would have been enough... but apparently not. I may be missing something (maybe in my shader, which crops the image rendered by the portal camera to fit the portal mesh).
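    To be concrete, here is roughly the setup I mean (a sketch with placeholder names PortalRenderSetup / portalCamera, not my actual code):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR;

    // Sketch of the multi-pass idea: one stereo-enabled portal camera rendering
    // both eyes into a single two-eye RenderTexture. This is what is NOT working for me.
    public class PortalRenderSetup : MonoBehaviour
    {
        [SerializeField] private Camera portalCamera;  // the camera behind the portal

        private void Start()
        {
            var texture = new RenderTexture(XRSettings.eyeTextureDesc)
            {
                vrUsage = VRTextureUsage.TwoEyes  // supposedly marks it as a stereo target
            };
            portalCamera.stereoTargetEye = StereoTargetEyeMask.Both;
            portalCamera.targetTexture = texture;
        }
    }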

    I've seen techniques using stencils, but they would have too much impact on my level design, so that's not acceptable in my case.

    So, if I make any progress I will post here. Please do so as well ;)

    BTW, in your case (single pass), did you check these pages to adapt your shader?
    https://docs.unity3d.com/Manual/SinglePassStereoRendering.html
    https://docs.unity3d.com/Manual/Android-SinglePassStereoRendering.html
     
    Last edited: Jul 4, 2020
  3. trzy

    trzy

    Joined:
    Jul 2, 2016
    Posts:
    90
    bricefr: vrUsage is a flag used when creating a RenderTexture, but how do you enable stereo on a camera?

    I did some more investigation and found that UNITY_SINGLE_PASS_STEREO is evidently *not* defined in the shader when I build. unity_StereoEyeIndex *is* available when I build for Quest. So I tried using it to select the appropriate texture, but the results are bizarre. Also, the cameras do not appear to track my head rotation (only translation).

    EDIT: Okay, so despite unity_StereoEyeIndex being defined, it is *not* working. The value is always 0.
     
    Last edited: Jul 4, 2020
  4. trzy

    trzy

    Joined:
    Jul 2, 2016
    Posts:
    90
    I figured out a few things. Firstly, Multiview stereo on Quest does not appear to be treated as a single-pass stereo mode by Unity. So I'm back to the default and inefficient Multi Pass mode, which renders the scene twice, once for each eye, one after the other. But it does make unity_StereoEyeIndex available, and now I have a stereo portal.

    The problem now is that the cameras don't replicate the stereo characteristics of the actual VR camera, and I'm not sure why. I assume Camera.main.transform tracks the center point between the eyes -- is that not the case?

    Attempting to manually offset the two virtual cameras I create by the stereo separation does not work. I'm also skeptical that the separation value I'm reading is what is actually used for rendering.

    Here is how it looks in mono (running on PC without stereo) -- perfect:

    Perfect.png

    Now, in VR, clearly wrong. Note the misalignment between the blue map and the orange map (which is on the other side of the portal):

    Imperfect.png

    And here is my code for setting up the cameras:

    Code (CSharp):
    private void Start()
    {
      if (m_otherCamera.targetTexture != null)
      {
        m_otherCamera.targetTexture.Release();
      }

      Debug.LogFormat("Stereo={0}", Camera.main.stereoEnabled);
      Debug.LogFormat("Separation={0}", Camera.main.stereoSeparation);
      Debug.LogFormat("Convergence={0}", Camera.main.stereoConvergence);

      if (!Camera.main.stereoEnabled)
      {
        m_otherCamera.targetTexture = new RenderTexture(Camera.main.pixelWidth, Camera.main.pixelHeight, 24);
        m_ourPortalRenderer.material.SetTexture("_LeftEyeTexture", m_otherCamera.targetTexture);
      }
      else
      {
        // Disable the camera and attach stereo cameras
        m_otherCamera.enabled = false;

        //float separation = 0.5f * Camera.main.stereoSeparation;
        //float convergenceAngle = 90f - Mathf.Atan2(Camera.main.stereoConvergence, separation) * Mathf.Rad2Deg;

        GameObject left = new GameObject("LeftEye");
        left.tag = m_otherCamera.gameObject.tag;
        left.transform.parent = m_otherCamera.transform;
        GameObject right = new GameObject("RightEye");
        right.tag = m_otherCamera.gameObject.tag;
        right.transform.parent = m_otherCamera.transform;

        Camera leftCamera = left.AddComponent<Camera>();
        Camera rightCamera = right.AddComponent<Camera>();
        leftCamera.CopyFrom(m_otherCamera);
        rightCamera.CopyFrom(m_otherCamera);

        leftCamera.fieldOfView = Camera.main.fieldOfView;
        rightCamera.fieldOfView = Camera.main.fieldOfView;
        leftCamera.aspect = Camera.main.aspect;
        rightCamera.aspect = Camera.main.aspect;

        leftCamera.projectionMatrix = Camera.main.GetStereoProjectionMatrix(Camera.StereoscopicEye.Left);
        rightCamera.projectionMatrix = Camera.main.GetStereoProjectionMatrix(Camera.StereoscopicEye.Right);

        Debug.LogFormat("aspect={0}, {1}", Camera.main.aspect, m_otherCamera.aspect);
        Debug.LogFormat("type={0}, {1}", Camera.main.cameraType, m_otherCamera.cameraType);
        Debug.LogFormat("aspect={0}, {1}, {2}", Camera.main.aspect, m_otherCamera.aspect, leftCamera.aspect);
        Debug.LogFormat("fov={0}, {1}, {2}", Camera.main.fieldOfView, m_otherCamera.fieldOfView, leftCamera.fieldOfView);
        Debug.LogFormat("focalLen={0}, {1}", Camera.main.focalLength, m_otherCamera.focalLength);
        Debug.LogFormat("lensShift={0}, {1}", Camera.main.lensShift, m_otherCamera.lensShift);
        Debug.LogFormat("rect={0}, {1}", Camera.main.rect, m_otherCamera.rect);
        Debug.LogFormat("left={0}", Camera.main.GetStereoProjectionMatrix(Camera.StereoscopicEye.Left));
        Debug.LogFormat("right={0}", Camera.main.GetStereoProjectionMatrix(Camera.StereoscopicEye.Right));

        leftCamera.targetTexture = new RenderTexture(Camera.main.pixelWidth, Camera.main.pixelHeight, 24);
        rightCamera.targetTexture = new RenderTexture(Camera.main.pixelWidth, Camera.main.pixelHeight, 24);

        leftCamera.enabled = true;
        rightCamera.enabled = true;

        left.transform.localPosition = Vector3.zero;
        right.transform.localPosition = Vector3.zero;
        left.transform.localRotation = Quaternion.identity;
        right.transform.localRotation = Quaternion.identity;

        //left.transform.localPosition = -Vector3.right * separation;
        //left.transform.localRotation = Quaternion.AngleAxis(convergenceAngle, Vector3.up);
        //right.transform.localPosition = Vector3.right * separation;
        //right.transform.localRotation = Quaternion.AngleAxis(-convergenceAngle, Vector3.up);

        m_ourPortalRenderer.material.SetTexture("_LeftEyeTexture", leftCamera.targetTexture);
        m_ourPortalRenderer.material.SetTexture("_RightEyeTexture", rightCamera.targetTexture);
      }
    }
     
  5. bricefr

    bricefr

    Joined:
    May 3, 2015
    Posts:
    43
    Seems like a problem with your RenderTexture sizes, doesn't it? Have you checked the RenderTexture(XRSettings.eyeTextureDesc) constructor?

    Regarding unity_StereoEyeIndex: have you declared UNITY_VERTEX_OUTPUT_STEREO in your v2f struct, UNITY_INITIALIZE_VERTEX_OUTPUT_STEREO(o) in your vert() function, and UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX(i) in your frag() function? After all of those, I guess UnityStereoScreenSpaceUVAdjust should work... I guess...
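    To be concrete, here's that pattern from the Unity stereo shader docs dropped into a shader shaped like yours (untested sketch; I kept your _LeftEyeTexture/_RightEyeTexture samplers, which are assumed to be declared):

    Code (csharp):
    struct appdata
    {
        float4 vertex : POSITION;
        UNITY_VERTEX_INPUT_INSTANCE_ID      // required by UNITY_SETUP_INSTANCE_ID below
    };

    struct v2f
    {
        float4 vertex : SV_POSITION;
        float4 screenPos : TEXCOORD0;
        UNITY_VERTEX_OUTPUT_STEREO          // carries the eye index to the fragment stage
    };

    v2f vert (appdata v)
    {
        v2f o;
        UNITY_SETUP_INSTANCE_ID(v);
        UNITY_INITIALIZE_VERTEX_OUTPUT_STEREO(o);
        o.vertex = UnityObjectToClipPos(v.vertex);
        o.screenPos = ComputeScreenPos(o.vertex);
        return o;
    }

    fixed4 frag (v2f i) : SV_Target
    {
        UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX(i);   // after this, unity_StereoEyeIndex should be valid
        float2 uv = i.screenPos.xy / i.screenPos.w;
        return unity_StereoEyeIndex == 0 ? tex2D(_LeftEyeTexture, uv) : tex2D(_RightEyeTexture, uv);
    }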
     
    Last edited: Jul 5, 2020
  6. bricefr

    bricefr

    Joined:
    May 3, 2015
    Posts:
    43
    On my side, using a multi-pass, single-camera portal effect: good results in non-VR mode, but not in VR:



    Man, the devil is in the details, like they say... :D
     
    Last edited: Jul 5, 2020
  7. trzy

    trzy

    Joined:
    Jul 2, 2016
    Posts:
    90
    I have not tried these and will take a look. But keep in mind that UNITY_SINGLE_PASS_STEREO isn't even defined, so I'm skeptical this solution will work. For now, I am stuck with multi-pass stereo, where these functions do nothing (everything is simply rendered twice).

    I don't think the render texture size is wrong. Each render texture is the size of the main camera's render target. If the size were wrong, it would mostly affect the fidelity of the portal, not the alignment.

    It appears to me that the stereo cameras I create are not calibrated properly. It might be the stereo separation, the convergence, or something else about the projection matrices. I wonder whether Unity is giving me the wrong projection parameters.

    Is it possible that Camera.main.transform.position is *not* at the exact center between the left and right eye cameras? My math assumes that it is, but I don't see any documentation of how Camera.main behaves in a VR stereo system.
     
  8. trzy

    trzy

    Joined:
    Jul 2, 2016
    Posts:
    90
    bricefr: In your Oculus XR plugin settings, are you using Multi Pass or Multi View rendering?

    Looks like we both have an offset issue. Could you share your code for creating the render texture and the portal camera? (I'm still not clear on how stereo is enabled on virtual cameras.)
     
  9. bricefr

    bricefr

    Joined:
    May 3, 2015
    Posts:
    43
    I have been using multi-pass from the start. I tried switching to Multiview (which is single-pass, as I understand it), but I got the same problem, and it would have too much impact on my shaders (I would also have to create dual cameras for my portals and adapt my shaders to route the rendering to the matching eye, like you're trying to do). My prototype doesn't need much CPU headroom, so I guess multi-pass is acceptable, even on the Quest... for the moment... ;) And I really want to keep the VR implementation as close as possible to the classical implementation. Just to mention: I am using the pure Unity XR package, no Oculus package or any other VR extension whatsoever.

    I'm currently experimenting with the camera settings (which I now instantiate from the main VR camera to be sure, instead of from scratch), the RenderTexture parameters (with or without XRSettings.eyeTextureDesc), and the cutout shader. But it also seems I will need to adapt the stereo projection/view matrices on my camera for this to work... using SetStereo*Matrix.

    Another strange thing: if I completely disable the rotation of the portal camera (meaning only the player's position should be reflected in the portal plane), the left eye still rotates with the player camera (as if the projection/view matrix of the left eye were still driven by the main camera... but the right eye is still good!). Too many things I don't fully understand yet... but I will share my code today, working or not :)

    Regarding your stereo settings: they are parameters on the Camera, so why not read them from the main camera (stereoConvergence, stereoSeparation, ...)? I also saw that you already set the projection matrices of your cameras... which should include the convergence and separation, I guess... What about the view matrices? Shouldn't they be adapted as well? Just guessing here, sorry :p
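    For example, something like this is what I mean by adapting the matrices (pure guesswork sketch; portalCamera is your camera behind the portal, and the view matrices would still need your portal offset folded in somehow):

    Code (CSharp):
    // Guesswork sketch: mirror the main camera's per-eye matrices onto the portal camera.
    Camera main = Camera.main;
    foreach (var eye in new[] { Camera.StereoscopicEye.Left, Camera.StereoscopicEye.Right })
    {
        portalCamera.SetStereoProjectionMatrix(eye, main.GetStereoProjectionMatrix(eye));
        // Copied as-is here; the portal-side transform still has to be applied for a real portal.
        portalCamera.SetStereoViewMatrix(eye, main.GetStereoViewMatrix(eye));
    }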
     
  10. trzy

    trzy

    Joined:
    Jul 2, 2016
    Posts:
    90
    Check this out:

    com.DefaultCompany.QuestUnityXR-20200704-215459[1].jpg



    :cool:

    Perfect alignment. But there is still a problem: there's a lot of jitter with even the slightest head motion. The image is not rigidly locked to my head movement, and it is extremely noticeable. I can post a video if you'd like.

    How I solved it:

    In the XR Rig, you have a CenterEyeAnchor, right? Duplicate it twice and rename the duplicates LeftEyeAnchor and RightEyeAnchor. Remove the Camera from both; a TrackedPoseDriver remains on each. Set one to Left Eye and the other to Right Eye. Now you can compute the exact translation and rotation relative to CenterEyeAnchor (or Camera.main.transform)!

    For example, my LateUpdate looks like this now:

    Code (CSharp):
    private void LateUpdate()
    {
      // Reposition center anchor point at the other side of the portal based on relative position of our head
      // to the portal entrance
      Transform cameraTransform = Camera.main.transform;
      Vector3 userOffsetFromPortal = cameraTransform.position - transform.position;
      m_otherCamera.transform.position = m_otherPortal.transform.position + userOffsetFromPortal;
      m_otherCamera.transform.rotation = m_otherPortal.rotation * Quaternion.Inverse(transform.rotation) * cameraTransform.rotation;

      // Ensure the left and right eye cameras are offset from the center anchor correctly
      if (Camera.main.stereoEnabled)
      {
        m_left.position = m_otherCamera.transform.position + m_leftEye.position - Camera.main.transform.position;
        m_right.position = m_otherCamera.transform.position + m_rightEye.position - Camera.main.transform.position;
        m_left.rotation = m_otherCamera.transform.rotation * Quaternion.Inverse(Camera.main.transform.rotation) * m_leftEye.rotation;
        m_right.rotation = m_otherCamera.transform.rotation * Quaternion.Inverse(Camera.main.transform.rotation) * m_rightEye.rotation;
      }
    }
    A bit messy. m_left is the transform of the virtual left-eye camera observing the portal (I create this object myself and add a Camera to it), and m_leftEye is set via the Inspector to the LeftEyeAnchor driven by the TrackedPoseDriver. I'm just computing the local position and rotation of my virtual cameras from how the left and right eyes are offset.

    I have not yet had time to capture a log of the separation and convergence values, but my suspicion is that they will differ from the values in Camera.main. Will confirm later tonight.
     
  11. trzy

    trzy

    Joined:
    Jul 2, 2016
    Posts:
    90
    Now I am really confused. You need the Oculus plugin for the Unity XR package, right? (Not necessarily the Oculus Integration asset, just the Oculus XR plugin.) When I go to Project Settings -> XR Plug-in Management, there is an Oculus entry.

    I'm also unclear on how you are creating a single render texture and rendering to it. If you can share your code, it would be super helpful for understanding what you are doing. Feel free to reach out privately at bart.trzy at gmail dot com, too.

    See my post above. I think these values are actually incorrect! I also thought the projection matrices included these values, but now I'm confused, because the fix for me (horrible jitter aside) was to offset the cameras manually.
     
  12. bricefr

    bricefr

    Joined:
    May 3, 2015
    Posts:
    43
    Nice! Well done. I understand. And this is using the initial cutout shader you posted? Nothing specific to single-pass rendering in it?

    On my side, the code is pretty simple, because I have a single camera... and I still want to keep one camera :D

    Code (CSharp):
    [Header("General")]
    [SerializeField, Required] private GameObject portalOutput;

    [Header("Setup")]
    [SerializeField, Required] private new Renderer renderer;
    [SerializeField, AssetsOnly, Required] private Material material;
    [SerializeField, Required] private Collider activationArea;
    [SerializeField, Required] private Collider teleportationArea;

    private Camera _camera;
    private Material _defaultMaterial;
    private Material _mirrorMaterial;
    private RenderTexture _renderTexture;

    private bool _active;
    private bool _tracking;
    private Transform _trackedObject;

    private void Start() {

        // instantiate a camera and parent it to the portal output (to follow it if it's moving)
        _camera = Instantiate(PlayerController.instance.head.camera, portalOutput.transform);
        _camera.gameObject.AddComponent<UniversalAdditionalCameraData>();

        _camera.transform.localPosition = Vector3.zero;
        _camera.transform.localRotation = Quaternion.identity;
        _camera.forceIntoRenderTexture = true;

        _camera.stereoTargetEye = StereoTargetEyeMask.Both;
        _camera.depth -= 1;

        // duplicate render texture and material
        _mirrorMaterial = Instantiate(material);

        // create the render texture
        _renderTexture = XRSettings.enabled
            ? new RenderTexture(XRSettings.eyeTextureDesc)
            : new RenderTexture(Screen.width, Screen.height, 24);

        _renderTexture.antiAliasing = 2;
        _renderTexture.vrUsage = VRTextureUsage.TwoEyes; // default seems to be DeviceSpecific

        // link render texture to material
        _mirrorMaterial.mainTexture = _renderTexture;

        // associate camera to render texture
        _camera.targetTexture = _renderTexture;

        // retrieve default material
        _defaultMaterial = renderer.material;

        // default, no mirroring
        SetupTracking();

    }

    private void LateUpdate() {

        if (!_active)
            return;

        // mirroring through the portal
        if (_tracking && _trackedObject != null) {
            _camera.transform.position = portalOutput.transform.TransformPoint(transform.InverseTransformPoint(_trackedObject.transform.position));
            _camera.transform.localRotation = _trackedObject.transform.localRotation;
        }
    }
    And the shader I use is the same as yours... thanks to Brackeys :)

    And I don't use the TrackedPoseDriver; I have my own implementation, which basically does the same thing...

    Code (CSharp):
    using System.Collections.Generic;
    using Sirenix.OdinInspector;
    using UnityEngine;
    using UnityEngine.XR;

    [DefaultExecutionOrder(-30000)]
    public class RoomScaleTracker : MonoBehaviour {

        [Required] public XRNode node;

        /// <summary>
        /// Should the position be tracked?
        /// </summary>
        public bool trackPosition = true;

        /// <summary>
        /// Should the rotation be tracked?
        /// </summary>
        public bool trackRotation = true;

        /// <summary>
        /// Last known position (local space).
        /// </summary>
        public Vector3 lastLocalPosition { get; private set; }

        /// <summary>
        /// Last known rotation (local space).
        /// </summary>
        public Quaternion lastLocalRotation { get; private set; }

    #if ENABLE_VR

        private static bool _initialized;

        private void Awake() {
            if (enabled && !_initialized) {
                _initialized = true;

                var subsystems = new List<XRInputSubsystem>();
                SubsystemManager.GetInstances(subsystems);
                foreach (var t in subsystems) {
                    t.TrySetTrackingOriginMode(TrackingOriginModeFlags.Floor);
                }
            }
        }

        private void Update() {
            Track();
        }

        private void Track() {
            if (!enabled)
                return;

            var device = InputDevices.GetDeviceAtXRNode(node);
            if (device.isValid) {
                if (device.TryGetFeatureValue(CommonUsages.deviceRotation, out var rotation)) {
                    lastLocalRotation = rotation;

                    if (trackRotation)
                        transform.localRotation = rotation;
                }

                if (device.TryGetFeatureValue(CommonUsages.devicePosition, out var position)) {
                    lastLocalPosition = position;

                    if (trackPosition)
                        transform.localPosition = position;
                }
            }
        }

    #endif

    }
     
  13. trzy

    trzy

    Joined:
    Jul 2, 2016
    Posts:
    90
    My fragment shader:

    Code (CSharp):
    fixed4 frag(v2f i) : SV_Target
    {
        float2 uv = i.screenPos.xy / i.screenPos.w; // clip space -> normalized texture

        // sample the texture
        fixed4 col = unity_StereoEyeIndex == 0 ? tex2D(_LeftEyeTexture, uv) : tex2D(_RightEyeTexture, uv);

        // apply fog
        UNITY_APPLY_FOG(i.fogCoord, col);
        return col;
    }
    I still don't understand why the projection matrices aren't handling the convergence and separation. I've confirmed that they do differ (albeit by only one element). I need to sit down and review projection matrices. Off the top of my head it does seem like one value would be insufficient to handle both phenomena...

    Here are the matrices taken from the Quest log:

    Code (csharp):
    07-04 13:59:37.546 31733 31747 I Unity   : left=0.91729    0.00000    -0.17407    0.00000
    07-04 13:59:37.546 31733 31747 I Unity   : 0.00000    0.83354    -0.10614    0.00000
    07-04 13:59:37.546 31733 31747 I Unity   : 0.00000    0.00000    -1.00060    -0.60018
    07-04 13:59:37.546 31733 31747 I Unity   : 0.00000    0.00000    -1.00000    0.00000
    07-04 13:59:37.546 31733 31747 I Unity   :
    07-04 13:59:37.546 31733 31747 I Unity   : (Filename: ./Runtime/Export/Debug/Debug.bindings.h Line: 35)
    07-04 13:59:37.546 31733 31747 I Unity   :
    07-04 13:59:37.546 31733 31747 I Unity   : right=0.91729    0.00000    0.17407    0.00000
    07-04 13:59:37.546 31733 31747 I Unity   : 0.00000    0.83354    -0.10614    0.00000
    07-04 13:59:37.546 31733 31747 I Unity   : 0.00000    0.00000    -1.00060    -0.60018
    07-04 13:59:37.546 31733 31747 I Unity   : 0.00000    0.00000    -1.00000    0.00000
    m02 differs (positive in one, negative in the other).

    And the jitter... hmm....
     
  14. bricefr

    bricefr

    Joined:
    May 3, 2015
    Posts:
    43
    Have you seen this? https://docs.unity3d.com/ScriptReference/Camera.CopyStereoDeviceProjectionMatrixToNonJittered.html
     
  15. bricefr

    bricefr

    Joined:
    May 3, 2015
    Posts:
    43
    OK, I'm starting to understand the problem... the second camera I create at runtime isn't in stereoscopic mode... and I can't seem to force it... The question is: can we set up more than one stereoscopic camera in the same scene in Unity?

    At runtime on the Quest:





    And I did CopyFrom the initial camera...

    Code (CSharp):
    // camera setup
    _camera = _clone.AddComponent<Camera>();
    _camera.CopyFrom(PlayerController.instance.head.camera);

    _camera.transform.localPosition = Vector3.zero;
    _camera.transform.localRotation = Quaternion.identity;
    _camera.forceIntoRenderTexture = true;

    _camera.stereoTargetEye = XRSettings.enabled ? StereoTargetEyeMask.Both : StereoTargetEyeMask.None;

    // URP-specifics
    _clone.AddComponent<UniversalAdditionalCameraData>();
     
  16. bricefr

    bricefr

    Joined:
    May 3, 2015
    Posts:
    43
    OK, apparently setting the targetTexture attribute on a Camera disables its stereo mode... what the...?! I don't see the point of having vrUsage on the RenderTexture, then...
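    Here's a minimal repro sketch of what I'm observing (hedged; head.camera is my rig's VR camera, and the second log is the behavior I see, not something documented):

    Code (CSharp):
    Camera cam = Instantiate(PlayerController.instance.head.camera);
    Debug.Log(cam.stereoEnabled);   // true: the clone starts out stereo

    cam.targetTexture = new RenderTexture(XRSettings.eyeTextureDesc) { vrUsage = VRTextureUsage.TwoEyes };
    Debug.Log(cam.stereoEnabled);   // false: once it renders into a texture, stereo is gone, vrUsage or not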

    I'm screwed. I guess I have to switch to Multiview and build a fake stereoscopic rig with two cameras, like you did. So sad; it seems like only one little thing is missing for a reliable and easy solution to VR-enabled portals or mirrors in Unity...
     
  17. bricefr

    bricefr

    Joined:
    May 3, 2015
    Posts:
    43
    OK, I ended up doing it like you, and it's working.

    For the eye positioning, I implemented the script below. Attach it to your portal cameras, which are themselves attached to the portal's remote origin. Since my transformations are all local-relative (between the player and the portal), it's pretty simple to compute the offset required for alignment. Strangely, I have to override the projectionMatrix every frame. It can surely be optimized, but it works, and the portals - at least mine - are only enabled for short periods of time.

    @trzy Thanks for the discussion, it helped us both find a way to solve this.

    The eye tracker (local space):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR;

    public class LocalEyeTracker : MonoBehaviour {

        [SerializeField] private Camera.StereoscopicEye eye;

        private void Update() {
            var node = eye == Camera.StereoscopicEye.Left ? XRNode.LeftEye : XRNode.RightEye;

            // update relative eye position
            var device = InputDevices.GetDeviceAtXRNode(node);
            if (device.isValid) {
                if (device.TryGetFeatureValue(eye == Camera.StereoscopicEye.Left ? CommonUsages.leftEyeRotation : CommonUsages.rightEyeRotation, out var rotation))
                    transform.localRotation = rotation;

                if (device.TryGetFeatureValue(eye == Camera.StereoscopicEye.Left ? CommonUsages.leftEyePosition : CommonUsages.rightEyePosition, out var position))
                    transform.localPosition = position;
            }

            // update projection matrix
            GetComponent<Camera>().projectionMatrix = PlayerController.instance.head.camera.GetStereoProjectionMatrix(eye);
        }
    }
    Initialization of the cameras:

    Code (CSharp):
    // instantiate the clone and parent it to the portal output (to follow it if it's moving)
    _clone = new GameObject("Portal Clone");
    _clone.transform.parent = portalOutput.transform;
    _clone.transform.localPosition = Vector3.zero;
    _clone.transform.localRotation = Quaternion.identity;

    // retrieve default material
    _defaultMaterial = renderer.material;

    // setup render to texture
    if (XRSettings.enabled) { // stereographic

        portalStereoSystem.SetActive(true);

        portalStereoSystem.transform.parent = _clone.transform;
        portalStereoSystem.transform.localPosition = Vector3.zero;
        portalStereoSystem.transform.localRotation = Quaternion.identity;

        // setup cameras
        portalStereoLeftCamera.CopyFrom(PlayerController.instance.head.camera);
        portalStereoRightCamera.CopyFrom(PlayerController.instance.head.camera);

        // render textures
        portalStereoLeftCamera.targetTexture = new RenderTexture(PlayerController.instance.head.camera.pixelWidth, PlayerController.instance.head.camera.pixelHeight, 24);
        portalStereoRightCamera.targetTexture = new RenderTexture(PlayerController.instance.head.camera.pixelWidth, PlayerController.instance.head.camera.pixelHeight, 24);

        // render texture material
        _mirrorMaterial = Instantiate(stereographicMaterial);
        _mirrorMaterial.SetTexture(LeftEyeTexture, portalStereoLeftCamera.targetTexture);
        _mirrorMaterial.SetTexture(RightEyeTexture, portalStereoRightCamera.targetTexture);

    } else { // monographic

        portalStereoSystem.SetActive(false);

        // camera setup
        _camera = _clone.AddComponent<Camera>();
        _camera.CopyFrom(PlayerController.instance.head.camera);

        _camera.transform.localPosition = Vector3.zero;
        _camera.transform.localRotation = Quaternion.identity;
        _camera.forceIntoRenderTexture = true;

        // URP-specifics
        _clone.AddComponent<UniversalAdditionalCameraData>();

        // create the render texture
        _camera.targetTexture = new RenderTexture(Screen.width, Screen.height, 24) { antiAliasing = 2 };

        // render texture material
        _mirrorMaterial = Instantiate(monographicMaterial);
        _mirrorMaterial.mainTexture = _camera.targetTexture;
    }
    And the portal LateUpdate:

    Code (CSharp):
    _clone.transform.position = portalOutput.transform.TransformPoint(transform.InverseTransformPoint(_trackedObject.transform.position));
    _clone.transform.localRotation = _trackedObject.transform.localRotation;
    My PortalStereoSystem is just a node with two children: one with a camera and a left-eye LocalEyeTracker, and another with a right-eye one.
     
  18. bricefr

    bricefr

    Joined:
    May 3, 2015
    Posts:
    43
    OK, if by "jitter" you mean the rendered image rotating way too fast relative to the VR camera, I get the same phenomenon. Also, the portal camera's rotation around its forward axis does not behave the same as the VR camera's... is there some kind of "adaptation" applied by the XR management package to the camera after updates, just before rendering, to correct/smooth things out?
     
  19. trzy

    trzy

    Joined:
    Jul 2, 2016
    Posts:
    90
    In fact, there is. All VR and AR HMDs require the rendered image to be adjusted just before it hits the display, to account for head motion that occurred after the scene was submitted to the GPU for rendering. There are various techniques and terms for this: reprojection, timewarp, spacewarp, etc.

    I've never actually seen a system where this can be disabled for comparison, so I'm not sure it's the cause of what we are seeing, although it is certainly possible. The portal is rendered from the same head pose as the rest of the scene, so I don't quite understand why it would be. From my reading, Oculus Quest's Asynchronous TimeWarp does not use depth information and operates purely on the texture and mesh that the renderer outputs each frame. Therefore, the fact that the portal is itself a texture with depth information discarded should not be an issue.

    I wonder if this is just latency, and the head pose during LateUpdate is simply not the one the frame is actually rendered with? Maybe it is a full frame behind?

    BTW, thanks for your other posts. I'll have to find some time to study them further.
     
  20. trzy

    trzy

    Joined:
    Jul 2, 2016
    Posts:
    90
    Out of curiosity I got the true convergence and separation values:

    Convergence angle = 0
    Separation = 0.06792906

    This looks better. 68 mm is a reasonable, realistic human IPD. There is no convergence angle (the left and right eye cameras are parallel), which avoids uncomfortable vertical parallax; presumably the difference in the left/right projection matrices creates the asymmetric, off-axis projection for each eye, as below. At some point, for my own edification, I'll derive the projection matrices myself.

    projection.gif
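
    For the record, here's my current sketch of why only m02 differs (my own derivation, so treat it as hedged): for an off-axis frustum with left/right bounds l and r at near plane n, the projection matrix has m00 = 2n / (r - l) and m02 = (r + l) / (r - l). A symmetric frustum has r = -l, so m02 = 0. Offsetting an eye sideways by half the IPD shifts the frustum bounds in the opposite direction by the same amount, so (r - l) is unchanged while (r + l) becomes nonzero with opposite signs for the two eyes. That matches the matrices I logged earlier: every element identical except m02 = ±0.17407.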

    Now, as for the camera jitter: I have one idea that I'll try later. There is an OnPreRender() callback that fires on scripts living on the cameras themselves. This would require reworking my scripts to live on the cameras, though. Maybe the HMD pose is correct at that stage?
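    Something like this is what I have in mind (hypothetical, untested sketch; the full portal math from my LateUpdate would replace the simple pose copy):

    Code (CSharp):
    using UnityEngine;

    // Hypothetical: lives on the portal eye camera itself so we get its OnPreRender,
    // which fires just before this specific camera renders.
    [RequireComponent(typeof(Camera))]
    public class PortalEyePose : MonoBehaviour
    {
      [SerializeField]
      private Transform m_eyeAnchor;  // LeftEyeAnchor or RightEyeAnchor from the rig

      private void OnPreRender()
      {
        // Is the tracked pose fresher here than in LateUpdate?
        transform.SetPositionAndRotation(m_eyeAnchor.position, m_eyeAnchor.rotation);
      }
    }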
     
  21. bricefr

    bricefr

    Joined:
    May 3, 2015
    Posts:
    43
    Well, thanks for the explanations. Very interesting! It reminds me of some of my math courses back in the day... but that's way too far in the past. I may have to take some time to refresh all of this.

    I managed to get perfect rendering: great alignment, perfect FOV matching, great transitions between the portals... In fact, I had a little problem in my portal rotation sync: since I put the equivalent of the TrackedPoseDriver on my portal camera rig (to mimic the player's head-local rotations without having to sync them explicitly), I had to mirror the neck position to the portal, not the head. I also replaced the per-camera sync with the left/right XRNode: I put both cameras at the center (CenterEye) and just override the projectionMatrix in the update loop with the matching stereo projection matrix of the main camera.

    I will post a video if I have time, and maybe a summary of everything we did, for people who are trying (or will try) to do the same thing.

    I'm so glad, because the rendering is awesome. I now have to change the shaders to fade the portal mirror effect in/out with player distance, and check whether I can chain portals, just for fun. I also need to find a way to make my teleport pathfinding system go through the portals... some great days of dev to come :p
     
  22. trzy

    trzy

    Joined:
    Jul 2, 2016
    Posts:
    90
    You managed to get rid of the jitter? How? I hope you can post your updated code. I'm a bit confused when you say you place the cameras at the center anchor. You must create the stereo separation in addition to applying the matrices?
     
  23. bricefr

    bricefr

    Joined:
    May 3, 2015
    Posts:
    43
    Hello there. Sorry for the delay; I just saw the forum notification in my spam folder >_> Can you make a video of your "jitter" thing? English isn't my first language and I'm not sure I understand the phenomenon ;)

    Here is how I set up my portal stereo camera system:









    And here is my LocalEyeTracker code:

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR;

    [RequireComponent(typeof(Camera))]
    [DefaultExecutionOrder(-30000)]
    public class LocalEyeTracker : MonoBehaviour {

        [SerializeField] private Camera.StereoscopicEye eye;

        private Camera _cam;

        private void Awake() {
            _cam = GetComponent<Camera>();
        }

        private void Update() {
            var node = eye == Camera.StereoscopicEye.Left ? XRNode.LeftEye : XRNode.RightEye;

            // update relative eye position
            var device = InputDevices.GetDeviceAtXRNode(node);
            if (device.isValid) {
                if (device.TryGetFeatureValue(eye == Camera.StereoscopicEye.Left ? CommonUsages.leftEyeRotation : CommonUsages.rightEyeRotation, out var rotation))
                    transform.localRotation = rotation;

                if (device.TryGetFeatureValue(eye == Camera.StereoscopicEye.Left ? CommonUsages.leftEyePosition : CommonUsages.rightEyePosition, out var position))
                    transform.localPosition = position;
            }

            // sync camera properties
            _cam.aspect = PlayerController.instance.head.camera.aspect;
            _cam.fieldOfView = PlayerController.instance.head.camera.fieldOfView;
            _cam.projectionMatrix = PlayerController.instance.head.camera.GetStereoProjectionMatrix(eye);
            _cam.nonJitteredProjectionMatrix = PlayerController.instance.head.camera.GetStereoNonJitteredProjectionMatrix(eye);
        }
    }
    I'm not sure it's necessary to sync the aspect and the FOV since the projectionMatrix is overridden just after…

    Hope this help :)

    Let me know if you solved your jitter thing.
     
  24. trzy

    trzy

    Joined:
    Jul 2, 2016
    Posts:
    90
    Thank you! I will try implementing it this way and let you know. Mine was subtly different in that I used the transform of the game object containing the TrackedPoseDriver. So maybe that is the issue for me.

    Re: jitter, I reproduced the effect in this video. You may need to watch closely and carefully. If I record from the headset, the phenomenon does not appear (probably because a third view is instantiated for recording). I reproduced it by adding a frame of delay to the portal cameras and running on my PC.
     
  25. trzy

    trzy

    Joined:
    Jul 2, 2016
    Posts:
    90
    It does not work for me :( The values returned from the left and right eye devices are always the same and do not change with head motion.

    I've uploaded my project source here (hopefully you can take a look -- it's quite small): https://trzy.org/tmp/PortalDemo.zip

    Here is my code (my script is attached to the portal and instantiates the cameras):

    Code (csharp):
    using UnityEngine;
    using UnityEngine.XR;

    [RequireComponent(typeof(Camera))]
    [DisallowMultipleComponent]
    [DefaultExecutionOrder(-30000)]
    public class PortalCamera : MonoBehaviour
    {
      [Tooltip("Where the portal is rendered (entrance to portal).")]
      [SerializeField]
      private MeshRenderer m_portalRenderer;

      [Tooltip("A transform on the local end (entrance) of the portal.")]
      [SerializeField]
      private Transform m_portalLocalObservationPoint;

      [Tooltip("A transform on the remote end (observed by this camera) of the portal corresponding exactly to the local-end point.")]
      [SerializeField]
      private Transform m_portalRemoteObservationPoint;

      [Tooltip("Left eye transform used for stereoscopic rendering. If not set, searches for a game object named LeftEyeAnchor.")]
      [SerializeField]
      private Transform m_leftEyeAnchor;

      [Tooltip("Right eye transform used for stereoscopic rendering. If null, searches for a game object named RightEyeAnchor.")]
      [SerializeField]
      private Transform m_rightEyeAnchor;

      private Camera m_centerCamera;  // for non-stereoscopic rendering
      private Camera m_leftCamera;
      private Camera m_rightCamera;

      private bool TryGetEyePose(ref Vector3 position, ref Quaternion rotation, Camera.StereoscopicEye eye)
      {
        InputDevice device = InputDevices.GetDeviceAtXRNode(eye == Camera.StereoscopicEye.Left ? XRNode.LeftEye : XRNode.RightEye);

        if (!device.isValid)
        {
          Debug.Log("XR device is invalid");
          return false;
        }
        bool success = true;
        success &= device.TryGetFeatureValue(eye == Camera.StereoscopicEye.Left ? CommonUsages.leftEyeRotation : CommonUsages.rightEyeRotation, out rotation);
        success &= device.TryGetFeatureValue(eye == Camera.StereoscopicEye.Left ? CommonUsages.leftEyePosition : CommonUsages.rightEyePosition, out position);
        Debug.LogFormat("sucess={0}, position={1}", success, position);
        return success;
      }

      private void Update()
      {
        if (Camera.main.stereoEnabled)
        {
          // Get eye poses
          Vector3 leftEyePosition = Vector3.zero;
          Vector3 rightEyePosition = Vector3.zero;
          Quaternion leftEyeRotation = Quaternion.identity;
          Quaternion rightEyeRotation = Quaternion.identity;
          if (!TryGetEyePose(ref leftEyePosition, ref leftEyeRotation, Camera.StereoscopicEye.Left) || !TryGetEyePose(ref rightEyePosition, ref rightEyeRotation, Camera.StereoscopicEye.Right))
          {
            return;
          }
          Debug.LogFormat("got here");
          // Reposition center anchor point at the other side of the portal based on relative position of each eye
          // to the portal entrance
          Vector3 leftEyeOffsetFromPortal = leftEyePosition - m_portalLocalObservationPoint.position;
          Vector3 rightEyeOffsetFromPortal = rightEyePosition - m_portalLocalObservationPoint.position;
          m_leftCamera.transform.position = m_portalRemoteObservationPoint.position + leftEyeOffsetFromPortal;
          m_rightCamera.transform.position = m_portalRemoteObservationPoint.position + rightEyeOffsetFromPortal;
          m_leftCamera.transform.rotation = m_portalRemoteObservationPoint.rotation * Quaternion.Inverse(m_portalLocalObservationPoint.rotation) * leftEyeRotation;
          m_rightCamera.transform.rotation = m_portalRemoteObservationPoint.rotation * Quaternion.Inverse(m_portalLocalObservationPoint.rotation) * rightEyeRotation;
          m_leftCamera.projectionMatrix = Camera.main.GetStereoProjectionMatrix(Camera.StereoscopicEye.Left);
          m_rightCamera.projectionMatrix = Camera.main.GetStereoProjectionMatrix(Camera.StereoscopicEye.Right);
          m_leftCamera.nonJitteredProjectionMatrix = Camera.main.GetStereoNonJitteredProjectionMatrix(Camera.StereoscopicEye.Left);
          m_rightCamera.nonJitteredProjectionMatrix = Camera.main.GetStereoNonJitteredProjectionMatrix(Camera.StereoscopicEye.Right);
        }
        else
        {
          // Reposition center anchor point at the other side of the portal based on relative position of our head
          // to the portal entrance
          Transform cameraTransform = Camera.main.transform;
          Vector3 userOffsetFromPortal = cameraTransform.position - m_portalLocalObservationPoint.position;
          m_centerCamera.transform.position = m_portalRemoteObservationPoint.position + userOffsetFromPortal;
          m_centerCamera.transform.rotation = m_portalRemoteObservationPoint.rotation * Quaternion.Inverse(m_portalLocalObservationPoint.rotation) * cameraTransform.rotation;
          m_centerCamera.projectionMatrix = Camera.main.projectionMatrix;
        }
      }

      private void CreateRenderTarget(Camera camera)
      {
        if (camera.targetTexture != null)
        {
          camera.targetTexture.Release();
        }
        camera.targetTexture = new RenderTexture(Camera.main.pixelWidth, Camera.main.pixelHeight, 24);
      }

      // Creates a portal-observing camera representing a stereoscopic eye anchored to center camera
      private Camera CreateEyeCamera(Camera.StereoscopicEye eye)
      {
        GameObject cameraContainer = new GameObject(eye == Camera.StereoscopicEye.Left ? "LeftEye" : "RightEye");
        cameraContainer.tag = m_centerCamera.gameObject.tag;
        cameraContainer.transform.parent = m_centerCamera.transform;
        cameraContainer.transform.localPosition = Vector3.zero;
        cameraContainer.transform.localRotation = Quaternion.identity;

        Camera camera = cameraContainer.AddComponent<Camera>();
        camera.CopyFrom(m_centerCamera);  // can't use main camera because we would inherit pose tracking
        camera.projectionMatrix = Camera.main.GetStereoProjectionMatrix(eye);
        CreateRenderTarget(camera);
        return camera;
      }

      private void Start()
      {
        if (!Camera.main.stereoEnabled)
        {
          CreateRenderTarget(m_centerCamera);
          m_portalRenderer.material.SetTexture("_LeftEyeTexture", m_centerCamera.targetTexture);
        }
        else
        {
          m_leftCamera = CreateEyeCamera(Camera.StereoscopicEye.Left);
          m_rightCamera = CreateEyeCamera(Camera.StereoscopicEye.Right);
          m_portalRenderer.material.SetTexture("_LeftEyeTexture", m_leftCamera.targetTexture);
          m_portalRenderer.material.SetTexture("_RightEyeTexture", m_rightCamera.targetTexture);
          m_centerCamera.enabled = false;
          m_leftCamera.enabled = true;
          m_rightCamera.enabled = true;
        }
      }

      private void Awake()
      {
        m_centerCamera = GetComponent<Camera>();
        Debug.Assert(m_centerCamera != null);
        if (m_leftEyeAnchor == null)
        {
          GameObject leftEyeAnchor = GameObject.Find("LeftEyeAnchor");
          Debug.Assert(leftEyeAnchor != null);
          m_leftEyeAnchor = leftEyeAnchor.transform;
        }
        if (m_rightEyeAnchor == null)
        {
          GameObject rightEyeAnchor = GameObject.Find("RightEyeAnchor");
          Debug.Assert(rightEyeAnchor != null);
          m_rightEyeAnchor = rightEyeAnchor.transform;
        }
        Debug.Assert(m_portalRenderer != null);
        Debug.Assert(m_portalLocalObservationPoint != null);
        Debug.Assert(m_portalRemoteObservationPoint != null);
      }
    }
    And here is the log output on Quest:

    Code (csharp):
    07-14 20:38:58.070  3740  3754 I Unity   : sucess=True, position=(-0.2, 1.0, 0.1)
    07-14 20:38:58.070  3740  3754 I Unity   : (Filename: ./Runtime/Export/Debug/Debug.bindings.h Line: 35)
    07-14 20:38:58.070  3740  3754 I Unity   :
    07-14 20:38:58.070  3740  3754 I Unity   : sucess=True, position=(-0.1, 1.0, 0.1)
    07-14 20:38:58.070  3740  3754 I Unity   : (Filename: ./Runtime/Export/Debug/Debug.bindings.h Line: 35)
    07-14 20:38:58.070  3740  3754 I Unity   :
    07-14 20:38:58.070  3740  3754 I Unity   : got here
    07-14 20:38:58.070  3740  3754 I Unity   : (Filename: ./Runtime/Export/Debug/Debug.bindings.h Line: 35)
    07-14 20:38:58.070  3740  3754 I Unity   :
    07-14 20:38:58.070  3740  3754 I Unity   : sucess=True, position=(-0.2, 1.0, 0.1)
    07-14 20:38:58.070  3740  3754 I Unity   : (Filename: ./Runtime/Export/Debug/Debug.bindings.h Line: 35)
    07-14 20:38:58.070  3740  3754 I Unity   :
    07-14 20:38:58.070  3740  3754 I Unity   : sucess=True, position=(-0.1, 1.0, 0.1)
    07-14 20:38:58.070  3740  3754 I Unity   : (Filename: ./Runtime/Export/Debug/Debug.bindings.h Line: 35)
    07-14 20:38:58.070  3740  3754 I Unity   :
    07-14 20:38:58.070  3740  3754 I Unity   : got here
    07-14 20:38:58.070  3740  3754 I Unity   : (Filename: ./Runtime/Export/Debug/Debug.bindings.h Line: 35)
    07-14 20:38:58.070  3740  3754 I Unity   :
    07-14 20:38:58.083  3740  3754 I Unity   : sucess=True, position=(-0.2, 1.0, 0.1)
    07-14 20:38:58.083  3740  3754 I Unity   : (Filename: ./Runtime/Export/Debug/Debug.bindings.h Line: 35)
    07-14 20:38:58.083  3740  3754 I Unity   :
    07-14 20:38:58.083  3740  3754 I Unity   : sucess=True, position=(-0.1, 1.0, 0.1)
    07-14 20:38:58.083  3740  3754 I Unity   : (Filename: ./Runtime/Export/Debug/Debug.bindings.h Line: 35)
    07-14 20:38:58.083  3740  3754 I Unity   :
    07-14 20:38:58.083  3740  3754 I Unity   : got here
    07-14 20:38:58.083  3740  3754 I Unity   : (Filename: ./Runtime/Export/Debug/Debug.bindings.h Line: 35)
    07-14 20:38:58.083  3740  3754 I Unity   :
    07-14 20:38:58.083  3740  3754 I Unity   : sucess=True, position=(-0.2, 1.0, 0.1)
    07-14 20:38:58.083  3740  3754 I Unity   : (Filename: ./Runtime/Export/Debug/Debug.bindings.h Line: 35)
    07-14 20:38:58.083  3740  3754 I Unity   :
    07-14 20:38:58.083  3740  3754 I Unity   : sucess=True, position=(-0.1, 1.0, 0.1)
    07-14 20:38:58.083  3740  3754 I Unity   : (Filename: ./Runtime/Export/Debug/Debug.bindings.h Line: 35)
    07-14 20:38:58.083  3740  3754 I Unity   :
    07-14 20:38:58.083  3740  3754 I Unity   : got here
    07-14 20:38:58.083  3740  3754 I Unity   : (Filename: ./Runtime/Export/Debug/Debug.bindings.h Line: 35)
    07-14 20:38:58.083  3740  3754 I Unity   :
    I am using the standard render pipeline. Do I have to enable something for tracking to work? I also tried reading the values for the center eye: nothing there either. I confirmed that the InputSystem is running.

    Interestingly, the TrackedPoseDrivers in the XR Rig do update their transforms (they are just one frame behind, making them unusable).
     
    Last edited: Jul 15, 2020
  26. stillwind

    stillwind

    Joined:
    Jun 13, 2016
    Posts:
    6
    I gave up trying to make a portal this way (lacking the kind of expertise you all have), but I did implement a workaround that might be interesting to some.
    I made a stencil mask for each portal which acts as a window onto a hidden layer. When you poke your head through, it hides the main scene, moves the hidden layer onto a normal layer, and reverses that when you step out again. I also had to make a reverse portal for when the player goes entirely through and then decides to walk back to the main scene. A bit fiddly, but it works with 3 portals next to each other, each showing a different hidden layer. It definitely has some drawbacks, but I found the render texture method at least as difficult, and it would probably be very expensive with 3 portals to render for each eye...
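    Roughly, the window mask shader looks like this (a simplified sketch of the idea, not my exact shaders); the hidden-layer objects then use a pass with Stencil { Ref 1 Comp Equal } so they only draw through the window:

    Code (csharp):
    Shader "Custom/PortalWindow"
    {
        SubShader
        {
            Tags { "Queue" = "Geometry-1" }  // draw the mask before the hidden layer
            Pass
            {
                ColorMask 0                  // invisible: writes only the stencil buffer
                ZWrite Off
                Stencil
                {
                    Ref 1
                    Comp Always
                    Pass Replace             // mark these pixels as "inside the window"
                }
            }
        }
    }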
     
  27. a436t4ataf

    a436t4ataf

    Joined:
    May 19, 2013
    Posts:
    984
    I think the Unity XR team said: no, by design. They didn't see a use case for it beyond "multiple HMDs", which is a loooong way from being supported/performant enough to be worth trying.

    So... file a bug report? :) (And paste the case number here so others know it's been reported.)
     