VR Post-Processing Effects and Screen Space Coordinates

Discussion in 'AR/VR (XR) Discussion' started by Jonathan_L, Oct 3, 2018.

  1. Jonathan_L

    Jonathan_L

    Joined:
    Jan 26, 2016
    Posts:
    43
    Hi,

    So I have been working on some screen space post-processing effects in VR and noticed that a lot of them don't work as intended and make me feel nauseous. I've also tested some of the effects from Unity's Post Processing Stack, such as ambient occlusion and screen space reflections, and found that they are sometimes hard to look at.

    To help you better understand what I mean, here is an image of a simple distortion effect that uses a texture sampled in screen space.
    sample_left_right.png
    If you look at the top of that structure, you can see that the left and right eyes see two completely different things, which makes it really hard to look at and even somewhat blurry.
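    For context, the effect is basically just offsetting the screen-space UV before sampling the screen. Something like this rough sketch (_DistortTex and _Strength are placeholder names, not my exact setup):

    Shader "Hidden/ScreenSpaceDistortSketch"
    {
        Properties
        {
            _MainTex ("Texture", 2D) = "white" {}
        }
        SubShader
        {
            Cull Off ZWrite Off ZTest Always
            Pass
            {
                CGPROGRAM
                #pragma vertex vert_img
                #pragma fragment frag
                #include "UnityCG.cginc"

                sampler2D _MainTex;
                sampler2D _DistortTex; // placeholder distortion texture
                float _Strength;       // placeholder distortion strength

                fixed4 frag (v2f_img i) : SV_Target
                {
                    // Both eyes receive the same screen-space UV here, so the
                    // distortion pattern is glued to each eye's screen instead
                    // of to anything in the world.
                    float2 offset = (tex2D(_DistortTex, i.uv).rg * 2.0 - 1.0) * _Strength;
                    return tex2D(_MainTex, i.uv + offset);
                }
                ENDCG
            }
        }
    }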

    So I searched around and couldn't find much info on how one might approach solving this problem.
    I did find this thread:
    https://forum.unity.com/threads/the...-vrs-lefteye-image-and-righteye-image.540654/
    which really helped me understand how screen space coordinates work in VR (they work the same as in non-VR).

    I did some testing and found that when I render a texture as a post-processing effect, the image doesn't line up between my left eye and right eye. Here is an example:
    uv_left_right.png
    Although they line up from this perspective, when looking through a headset your eyes expect two different images from two different viewpoints for the stereoscopic effect.

    I placed a quad with the same texture at a local position of (0, -0.052, 0.483) relative to the camera and got this result:
    uv_quad_offset.png
    When looking at this, everything feels natural and is easy to look at, as expected, even though each eye sees a different image.

    So basically what I want to do is sample UV coordinates not in screen space but in this "quad offset" space, to see if screen space effects can be handled better. I am not sure how I would implement this in a shader, because at the shader level I don't think Unity gives access to which eye is being rendered in multi-pass stereoscopic mode. Does anyone have an idea of how I would approach this? Or if there is another solution for screen space post-processing effects, I would love to know.
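    To make this concrete, here is a rough sketch of what I am imagining: compute each fragment's UV by intersecting its view ray with the plane of that virtual quad at (0, -0.052, 0.483), instead of using the raw screen UV. _OverlayTex and _QuadSize are placeholder names, and I am not sure unity_CameraInvProjection is per-eye during a post-processing blit, which is part of what I am asking:

    Shader "Hidden/QuadOffsetSpaceSketch"
    {
        Properties
        {
            _MainTex ("Texture", 2D) = "white" {}
        }
        SubShader
        {
            Cull Off ZWrite Off ZTest Always
            Pass
            {
                CGPROGRAM
                #pragma vertex vert_img
                #pragma fragment frag
                #include "UnityCG.cginc"

                sampler2D _MainTex;
                sampler2D _OverlayTex; // placeholder: texture I want in "quad offset" space
                float _QuadSize;       // placeholder: quad width/height in meters

                fixed4 frag (v2f_img i) : SV_Target
                {
                    // Unproject the fragment to a point on the near plane in
                    // view space (unity_CameraInvProjection always uses
                    // OpenGL-style NDC; platform UV flips ignored for brevity).
                    float2 ndc = i.uv * 2.0 - 1.0;
                    float4 nearPos = mul(unity_CameraInvProjection, float4(ndc, -1.0, 1.0));
                    float3 dir = nearPos.xyz / nearPos.w;

                    // Intersect the view ray with the quad's plane at
                    // view-space z = -0.483 (the camera looks down -z).
                    float t = -0.483 / dir.z;
                    float3 hit = dir * t;

                    // Map the hit point to quad UVs; the quad is centered at
                    // (0, -0.052) in view space. Caveat: this pins the quad to
                    // each eye rather than to the head, so there is no stereo
                    // disparity -- applying the per-eye offset is exactly where
                    // I get stuck without knowing which eye is rendering.
                    float2 quadUV = (hit.xy - float2(0.0, -0.052)) / _QuadSize + 0.5;

                    fixed4 overlay = tex2D(_OverlayTex, quadUV);
                    fixed4 col = tex2D(_MainTex, i.uv);
                    return lerp(col, overlay, overlay.a);
                }
                ENDCG
            }
        }
    }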

    Thanks for reading.
     
  2. BrandonFogerty

    BrandonFogerty

    Joined:
    Jan 29, 2016
    Posts:
    83
    Hi @Jonathan_L,

    You should be able to use the built-in scalar value unity_StereoEyeIndex to detect which eye is currently being rendered. It is 0 for the left eye and 1 for the right eye.
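    For example, something like this minimal sketch (the tints are only there to visualize which eye is which; this assumes multi-pass, for single-pass instanced you would also need the stereo instancing macros):

    Shader "Hidden/EyeIndexTintSketch"
    {
        Properties
        {
            _MainTex ("Texture", 2D) = "white" {}
        }
        SubShader
        {
            Cull Off ZWrite Off ZTest Always
            Pass
            {
                CGPROGRAM
                #pragma vertex vert_img
                #pragma fragment frag
                #include "UnityCG.cginc"

                sampler2D _MainTex;

                fixed4 frag (v2f_img i) : SV_Target
                {
                    fixed4 col = tex2D(_MainTex, i.uv);
                    // unity_StereoEyeIndex is 0 for the left eye, 1 for the right.
                    if (unity_StereoEyeIndex == 0)
                        col.rgb *= fixed3(1.0, 0.5, 0.5); // left eye: reddish tint
                    else
                        col.rgb *= fixed3(0.5, 0.5, 1.0); // right eye: bluish tint
                    return col;
                }
                ENDCG
            }
        }
    }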