Question View direction with single pass stereo enabled

Discussion in 'Shaders' started by sboys3sev, Jan 30, 2023.

  1. sboys3sev

    Joined:
    Aug 14, 2018
    Posts:
    2
    I am trying to get the view direction in a material shader for ray-marching.

    viewVector = i.worldPos - i.worldCamPos;
    works, but it has precision problems once the camera is a few kilometers away from (0,0,0).

    I then found this way of doing it:
    Code (CSharp):
    float2 uvMod = i.screenPos.xy / i.screenPos.w * 2 - 1;
    viewVector = mul(unity_CameraInvProjection, float4(uvMod, 0, -1)).xyz;
    viewVector = mul(unity_CameraToWorld, float4(viewVector, 0)).xyz;
    This works great at far distances when viewed on desktop, but it results in double vision when viewed in a headset with single pass stereo enabled.

    The closest I have gotten to a working version is by adding an offset in non-stereo screen space with the following code:
    Code (CSharp):
    float2 uvMod = i.screenPosNonStereo.xy / i.screenPosNonStereo.w * 2 - 1;
    uvMod.x += (unity_StereoEyeIndex * 2 - 1) * 0.11;
    viewVector = mul(unity_CameraInvProjection, float4(uvMod, 0, -1)).xyz;
    viewVector = mul(unity_CameraToWorld, float4(viewVector, 0)).xyz;
    This almost works, but both eyes end up with a slight diagonal offset from where they should be, and far-away objects look slightly wrong in the ray-marching shader. I think the 0.11 might come from the stereo separation, but I am not sure, and it might be specific to my headset.

    Here are my structs and vertex function (plus the start of the fragment function):
    Code (CSharp):
    struct appdata {
        float4 vertex : POSITION;
        float2 uv : TEXCOORD0;
        UNITY_VERTEX_INPUT_INSTANCE_ID
    };
    struct v2f {
        float2 uv : TEXCOORD0;
        UNITY_FOG_COORDS(1)
        float4 vertex : SV_POSITION;
        float4 pos : TEXCOORD2;
        float3 worldPos : TEXCOORD5;
        float3 worldCamPos : TEXCOORD4;
        float4 screenPos : TEXCOORD3;
        float4 screenPosNonStereo : TEXCOORD6;
        UNITY_VERTEX_INPUT_INSTANCE_ID
        UNITY_VERTEX_OUTPUT_STEREO
    };
    v2f vert (appdata v) {
        v2f o;
        UNITY_SETUP_INSTANCE_ID(v);
        UNITY_TRANSFER_INSTANCE_ID(v, o);
        UNITY_INITIALIZE_VERTEX_OUTPUT_STEREO(o);
        o.vertex = UnityObjectToClipPos(v.vertex);
        o.pos = v.vertex;
        o.uv = TRANSFORM_TEX(v.uv, _ConfigTex);
        UNITY_TRANSFER_FOG(o, o.vertex);
        o.screenPos = ComputeScreenPos(o.vertex);
        o.screenPosNonStereo = ComputeNonStereoScreenPos(o.vertex);
        o.worldPos = mul(unity_ObjectToWorld, v.vertex).xyz;
        o.worldCamPos = _WorldSpaceCameraPos.xyz;
        return o;
    }
    fixed4 frag (v2f i) : SV_Target {
        UNITY_SETUP_INSTANCE_ID(i);
        UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX(i);
        ...
    How do I properly get this working?
    I am also open to alternative ways of getting the view direction that don't suffer from precision problems when far out in the world.
     
  2. bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,352
    This is a surprisingly hard problem to solve entirely. The main problem is that once you're that far out from the world origin, floating point math simply starts to fall apart regardless of what you do. The only "real" answer is ... don't get that far from the origin.

    However, there are some tricks for making things "less bad".

    Perhaps the simplest solution is to interpolate the camera-relative world position instead of the world position. Basically, subtract _WorldSpaceCameraPos from the world position in the vertex shader instead of in the fragment shader. This will produce a smooth view direction, but it still won't be perfect, and might jitter or shift around with camera movement, which is obviously going to be happening in VR at all times.
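
    A minimal sketch of that (assuming a float3 camRelWorldPos : TEXCOORD# interpolator in the v2f; the name is just for illustration):
    Code (csharp):
    // vertex
    // interpolate the camera relative world position instead of the absolute world position
    o.camRelWorldPos = mul(unity_ObjectToWorld, v.vertex).xyz - _WorldSpaceCameraPos.xyz;

    // fragment
    // the interpolated value is already the (unnormalized) view vector, with no large numbers involved
    float3 viewVector = i.camRelWorldPos;
    float3 viewDir = normalize(viewVector);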

    The most accurate option (apart from keeping the world closer to the origin) is to pass the clip space position from the vertex to the fragment.

    Code (csharp):
    // vertex
    o.vertex = UnityObjectToClipPos(v.vertex);
    // copy of the o.vertex
    o.clipPos = o.vertex; // float4 clipPos : TEXCOORD# in the v2f

    // fragment
    float3 viewSpaceViewDir = mul(unity_CameraInvProjection, i.clipPos).xyz;
    // unity_CameraInvProjection matches the OpenGL projection matrix, so the result needs to be flipped for other APIs
    #ifdef UNITY_REVERSED_Z
    viewSpaceViewDir.y *= -1;
    #endif

    // don't use unity_CameraToWorld, it's not the same as the inverse of UNITY_MATRIX_V
    // however, since the view matrix has no scale, the transpose of its rotation part is identical to the inverse
    // so use mul with the matrix and vector order swapped to get the view space to world space transform
    float3 worldSpaceViewDir = mul(viewSpaceViewDir, (float3x3)UNITY_MATRIX_V);

    // then transform into object space
    float3 objectSpaceRayDir = normalize(mul((float3x3)unity_WorldToObject, worldSpaceViewDir));
    float3 objectSpaceRayOrigin = mul(unity_WorldToObject, float4(_WorldSpaceCameraPos.xyz, 1)).xyz;
    Really, this will only be slightly more accurate than using the camera-relative world position. And unfortunately, that last line to get the ray origin will still add some jittering, because the precision loss has already occurred in the _WorldSpaceCameraPos value (or any other way of getting the camera position) before the value ever reaches the shader. Which is why the only real solution is to never have the camera that far away from the origin.
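
    For what it's worth, the usual way to do that is a "floating origin": when the camera gets too far out, shift the whole scene back so it sits near (0,0,0) again. A rough, untested sketch (class and field names are just for illustration):
    Code (csharp):
    using UnityEngine;

    // Rough floating-origin sketch: periodically shift the whole scene back so the
    // camera stays near the origin and shader math keeps its precision.
    public class FloatingOrigin : MonoBehaviour
    {
        public Transform trackedCamera;  // the camera (or rig) to keep near the origin
        public float threshold = 1000f;  // distance at which the scene gets re-centered

        void LateUpdate()
        {
            Vector3 offset = trackedCamera.position;
            if (offset.magnitude < threshold)
                return;

            // move every root object (including the camera rig) back by the camera's offset
            foreach (GameObject root in gameObject.scene.GetRootGameObjects())
                root.transform.position -= offset;
        }
    }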
     
  3. sboys3sev

    Joined:
    Aug 14, 2018
    Posts:
    2
    Thank you, that method works better than what I came up with. Because my object is very close to the camera, it has far better precision than i.worldPos - _WorldSpaceCameraPos. However, it was still slightly off, and it became extremely warped when changing the FOV through the SteamVR settings. At least in Unity 2019.3, it appears that unity_CameraInvProjection is not correct for single pass stereo; inverse(UNITY_MATRIX_P) produces the correct results.

    The following code produces the correct view direction with good precision.
    Code (CSharp):
    float3 viewSpaceViewDir = mul(inverse(UNITY_MATRIX_P), i.clipPos).xyz;
    float3 worldSpaceViewDir = mul(viewSpaceViewDir, (float3x3)UNITY_MATRIX_V);
    However the inverse function I used is very expensive. Is there any easy way to speed this up?
     
  4. bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,352
    There's no way to make the inverse function faster, but you should be able to get the stereo matrices from the camera using a C# script. You can also combine the projection and view matrices to make things even faster for the GPU.

    Code (csharp):
    void OnPreRender()
    {
        Camera cam = GetComponent<Camera>();
        // second argument of GetGPUProjectionMatrix is "render into texture"; true since VR eyes render into textures
        Matrix4x4[] projMatrix = {
            GL.GetGPUProjectionMatrix(cam.GetStereoProjectionMatrix(Camera.StereoscopicEye.Left), true),
            GL.GetGPUProjectionMatrix(cam.GetStereoProjectionMatrix(Camera.StereoscopicEye.Right), true)
        };
        Matrix4x4[] viewMatrix = {
            cam.GetStereoViewMatrix(Camera.StereoscopicEye.Left),
            cam.GetStereoViewMatrix(Camera.StereoscopicEye.Right)
        };
        // clip = P * V * world, so the inverse view-projection is (P * V)^-1
        Matrix4x4[] invViewProjMatrix = {
            (projMatrix[0] * viewMatrix[0]).inverse,
            (projMatrix[1] * viewMatrix[1]).inverse
        };
        Shader.SetGlobalMatrixArray("_InverseViewProj", invViewProjMatrix);
    }
    Something like that. I might have the eyes reversed, I forget which eye is 0 and which is 1.

    But then in your shader you just need to access it with something like this:
    Code (csharp):
    #if defined(USING_STEREO_MATRICES)
    float4x4 _InverseViewProj[2];
    #define INV_VIEW_PROJ _InverseViewProj[unity_StereoEyeIndex]
    #else
    // warning: this will still be inverted on non-GL APIs
    // note inverse(P * V) = inverse(V) * inverse(P), and matrix * matrix is component-wise in HLSL, so use mul()
    #define INV_VIEW_PROJ mul(transpose(UNITY_MATRIX_V), unity_CameraInvProjection)
    #endif
    You could also calculate a third inverse matrix that uses the non-stereo projection and view matrices and pass that along to the shaders so things work in the editor properly without needing special handling.
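
    Something like this in the same OnPreRender, untested (the _InverseViewProjMono name is just an example):
    Code (csharp):
    // also upload a non-stereo fallback so the same shader works in the editor / when VR is off
    // second argument of GetGPUProjectionMatrix depends on whether this camera renders into a texture
    Matrix4x4 monoProj = GL.GetGPUProjectionMatrix(cam.projectionMatrix, false);
    Matrix4x4 invViewProjMono = (monoProj * cam.worldToCameraMatrix).inverse;
    Shader.SetGlobalMatrix("_InverseViewProjMono", invViewProjMono);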