# Question VR camera eye offset inside HLSL image effects shader

Discussion in 'VR' started by nabilmansour1999, Feb 24, 2023.

1. ### nabilmansour1999

Joined:
May 6, 2020
Posts:
4
Hello everyone,

I am trying to make a ray marcher inside of Unity as an image-effects shader with VR stereo vision.

I have been able to render ray-marched geometry and to calculate the depth information needed so that I can render the ray-marched geometry together with polygonal geometry. You can see that in the picture below:

The red, green, and blue box is the ray-marched object I am using for testing. The ray marcher currently colors surfaces by the absolute value of their normals, which is why those colors are shown. The other boxes and the sphere are regular Unity game objects (polygonal geometry).

My issue appears when I switch to stereo vision on pressing play. I don't have a VR headset on me right now, so I am using Mock HMD just to see how the rendering will look (this project is part of a final project for a course at my university; my friend has a VR headset, and I'm just trying to get the rendering engine for the ray marcher going).

As you can imagine, the following issue occurs when I play in stereo vision:

This is clearly caused by the eye offset: the polygonal geometry shifts slightly to accommodate each eye, while the ray-marched geometry does not. My question is therefore the following: how is that offset calculated, and how would I apply it to the ray marcher?
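For intuition on where the offset lives: each eye's view matrix is the mono view translated by half the eye separation along the camera's right axis, and the eye's world-space position can be recovered as the translation part of the inverse view matrix. A minimal numpy sketch (the separation value and identity orientation are assumptions, not Unity's actual numbers):

```python
import numpy as np

def eye_world_position(view):
    """World-space camera position = translation column of inverse(view)."""
    return np.linalg.inv(view)[:3, 3]

# Hypothetical setup: camera at the origin with identity orientation,
# right eye shifted by half the separation along the camera's right vector.
sep = 0.064                        # typical eye separation in meters (assumption)
right = np.array([1.0, 0.0, 0.0])  # camera right in world space

mono_view = np.eye(4)
right_view = mono_view.copy()
right_view[:3, 3] -= 0.5 * sep * right  # the view matrix moves the world opposite the eye

print(eye_world_position(right_view))   # right eye sits at +0.032 on x
```

This is why sampling the per-eye view (or camera-to-world) matrix, rather than shifting the mono camera position by hand, stays correct under rotation: the offset is baked into the matrix in the camera's own frame.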

One thought I had was that the offset is applied along the camera's right vector, scaled by the separation times the convergence. The following code shows that:

Code (HLSL):
```hlsl
o.ro = _WorldSpaceCameraPos; // ray origin

float2 uv = o.uv;
uv -= 0.5;
uv.x *= _camAspect;
uv *= _camTanFov;
float3 uvPos = normalize(float3(uv, 0.5));

float4x4 VM = UNITY_MATRIX_V;
float4x4 CTW = unity_CameraToWorld;

float4 dir = mul(VM, float4(uvPos, 1.));
dir.xyz /= dir.w;
o.rd = mul(CTW, dir).xyz - o.ro;

#if defined(USING_STEREO_MATRICES)
float stereoOffset = _stereoSep * (2. * unity_StereoEyeIndex - 1.); // from CHATGPT
float3 shift = _camRight * stereoOffset * _stereoConvergance * 2.;
o.ro += shift;
#endif

//o.rd = mul(CTW, dir).xyz - o.ro;
o.rd /= dot(o.rd, _camForward); // scale dir for depth calculation
return o;
```
This is handled inside the vertex shader. This code results in the following:

This might seem to solve the issue, but as soon as I move the camera away or rotate it, the issue pops up again.
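As an aside on the `o.rd /= dot(o.rd, _camForward)` line: after that division the forward component of the ray direction is exactly 1, so a march distance `t` along the scaled ray corresponds directly to an eye-space depth of `t`, which makes comparing against the depth buffer trivial. A small numpy sketch of the same idea (the vectors are made up for illustration):

```python
import numpy as np

forward = np.array([0.0, 0.0, 1.0])       # camera forward (assumed axis-aligned)
rd = np.array([0.3, 0.2, 0.8])            # some unnormalized ray direction
rd_scaled = rd / np.dot(rd, forward)      # same trick as o.rd /= dot(o.rd, _camForward)

# After scaling, dot(rd_scaled, forward) == 1, so marching a distance t
# along rd_scaled lands at eye-space depth exactly t.
t = 4.0
hit = t * rd_scaled
print(np.dot(hit, forward))  # 4.0
```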

Any help would be much appreciated.

2. ### nabilmansour1999

The forums didn't allow for more pictures, so I'll post here how the code affects the camera when it rotates or moves away:

4. ### nabilmansour1999

I have. I do add what they say when rendering, but they only mention texture offsets. Moreover, when I use the stereo variables, nothing seems to change:
Code (HLSL):
```hlsl
o.ro = unity_StereoWorldSpaceCameraPos[unity_StereoEyeIndex]; // ray origin

float2 uv = o.uv;
uv -= 0.5;
uv.x *= _camAspect;
uv *= _camTanFov;
float3 uvPos = normalize(float3(uv, 0.5));

float4x4 VM = unity_StereoMatrixV[unity_StereoEyeIndex];
float4x4 CTW = unity_StereoCameraToWorld[unity_StereoEyeIndex];

float4 dir = mul(VM, float4(uvPos, 1.));
dir.xyz /= dir.w;
o.rd = mul(CTW, dir).xyz - o.ro;

//#if defined(USING_STEREO_MATRICES)
//float stereoOffset = _stereoSep * (2. * unity_StereoEyeIndex - 1.); // from CHATGPT
//float3 shift = _camRight * stereoOffset * _stereoConvergance * 1.55;
//o.ro += shift;
//#endif

//o.rd = mul(CTW, dir).xyz - o.ro;
o.rd /= dot(o.rd, _camForward); // scale dir for depth calculation
return o;
```
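One thing worth checking here: if `unity_StereoCameraToWorld` were exactly the inverse of `unity_StereoMatrixV` (they agree roughly, up to Unity's z-axis convention), then applying VM and then CTW to `uvPos` would round-trip back to `uvPos`, and the per-eye matrices would cancel out of `rd` entirely, which would explain why swapping the stereo variables changes nothing visible. A numpy sketch of that cancellation, with a made-up rotation and eye translation:

```python
import numpy as np

# Hypothetical per-eye view matrix: a rotation plus an eye translation.
theta = 0.3
R = np.array([[ np.cos(theta), 0.0, np.sin(theta), 0.0],
              [ 0.0,           1.0, 0.0,           0.0],
              [-np.sin(theta), 0.0, np.cos(theta), 0.0],
              [ 0.0,           0.0, 0.0,           1.0]])
T = np.eye(4)
T[:3, 3] = [-0.032, 0.0, 0.0]   # eye offset baked into the view (assumption)
VM = T @ R
CTW = np.linalg.inv(VM)         # camera-to-world as the exact inverse (assumption)

uvPos = np.array([0.1, 0.2, 0.5, 1.0])
dir_view = VM @ uvPos
roundtrip = CTW @ dir_view      # mul(CTW, mul(VM, uvPos)) lands back on uvPos

print(np.allclose(roundtrip, uvPos))  # True: the per-eye matrices cancel
```

If that cancellation is what's happening, the ray direction should instead be built from the per-eye inverse-projection and camera-to-world matrices alone, without first multiplying by the view matrix.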

6. ### nabilmansour1999

Thank you, I'll look into it.