
URP cmd.DrawRenderer not working with VR

Discussion in 'Universal Render Pipeline' started by akalegman, Oct 5, 2020.

  1. akalegman

    Joined:
    Feb 23, 2019
    Posts:
    26
    I need some advice with an outline effect I am trying to achieve with VR. I need to draw tagged objects into a RenderTexture using cmd.DrawRenderer as a flat colour which I can then blur and blend back into the scene. I am doing this with a custom ScriptableRendererFeature.
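    Roughly, the kind of pass being described might look like the sketch below. All the names here (OutlineMaskPass, flatColorMaterial, maskTexture, taggedRenderers) are hypothetical, not from the original project:

```csharp
// Hedged sketch of a ScriptableRenderPass that draws tagged renderers
// into a mask texture as a flat colour. All names are hypothetical.
class OutlineMaskPass : ScriptableRenderPass
{
    Material flatColorMaterial;                 // renders a solid colour
    RenderTargetHandle maskTexture;             // the mask RenderTexture
    List<Renderer> taggedRenderers = new List<Renderer>();

    public override void Execute(ScriptableRenderContext context,
                                 ref RenderingData renderingData)
    {
        CommandBuffer cmd = CommandBufferPool.Get("OutlineMask");
        // Note: as discussed later in the thread, this plain
        // SetRenderTarget call is what breaks Single Pass Instanced VR.
        cmd.SetRenderTarget(maskTexture.Identifier());
        cmd.ClearRenderTarget(true, true, Color.clear);
        foreach (var renderer in taggedRenderers)
            cmd.DrawRenderer(renderer, flatColorMaterial);
        context.ExecuteCommandBuffer(cmd);
        CommandBufferPool.Release(cmd);
    }
}
```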

    The effect works for non-VR applications, but when using SteamVR 2.6.1 with Single Pass Instanced I can see both renders of the object in the same eye. I am using the green/red example shader from https://docs.unity3d.com/Manual/SinglePassInstancing.html.

    I can see both the green AND red render of the object in my left eye. How do I draw the object using a custom shader in both eyes?



    This is a screenshot of the left eye. The green capsule is correctly rendered; however, the red capsule should only appear in the right eye.

    Thanks
     
    Last edited: Oct 6, 2020
  2. akalegman
    OK, so I tested the shader by just applying it as a material to the object in front of me. Everything worked as it should: the object was green in the left eye and red in the right eye. Taking the Single Pass Instanced support out of the shader made the object vanish from the right eye, as expected.

    So something is wrong either with DrawRenderer under Single Pass Instanced, or with the way I have implemented it.

    Has anyone had any success along these lines?
     
  3. akalegman
    OK another update, I am getting there!

    I removed all my Blit operations and just drew the objects using the Command Buffer without setting a new Render Target. It actually worked fine, with the green object on the left and the red object on the right. So the issue is not with DrawRenderer, but either with the RT I am drawing to or with the way I am blitting to the final scene.

    I will continue to investigate and report my findings to help others.

    Any advice would be greatly appreciated: how does one Blit when dealing with VR, how should I define my RT, etc.?
     
  4. akalegman
    It looks like when I target a specific RenderTexture to draw these objects into, it ends up drawing both objects into that texture which then gets drawn back into the left eye later down the line.

    If I just draw these objects to screen then they are in the correct eyes. How can I go about setting up my RenderTextures correctly? I essentially want a mask for the left eye and a mask for the right eye, both eyes go through the same process of generating a blurred outline.

    Does the RenderPass Execute get called once for each eye?
     
  5. akalegman
    OK after some more debugging it seems that SetRenderTarget is what is breaking the VR rendering. Even calling SetRenderTarget and passing in my source camera target will cause the DrawRenderer to render to the left eye.

    It is almost as if calling SetRenderTarget messes up the matrices for the objects that later get drawn. Should I be using

    Code (CSharp):
    o.vertex = UnityObjectToClipPos(v.vertex);

    in the vertex shader?
    Am I missing something regarding managing the RT and VR?
     
  6. ThomasZeng

    Unity Technologies

    Joined:
    Jun 24, 2019
    Posts:
    78
    Hi @akalegman,

    From reading your post, it is likely that your SetRenderTarget call only binds the first slice of the target texture. Maybe you could try binding all slices (slice 0 for the left eye and slice 1 for the right eye)?
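    A minimal sketch of what that might look like (maskTexture is a hypothetical RenderTargetHandle): passing a depth slice of -1 binds the whole texture array rather than just slice 0.

```csharp
// "maskTexture" is a hypothetical RenderTargetHandle. The final
// argument (depthSlice = -1) binds ALL slices of the texture array,
// so instanced stereo rendering can write the left eye into slice 0
// and the right eye into slice 1.
var allSlices = new RenderTargetIdentifier(
    maskTexture.Identifier(), 0, CubemapFace.Unknown, -1);
cmd.SetRenderTarget(allSlices);
```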
     
  7. akalegman
    I managed to solve the issue using
    Code (CSharp):
    CoreUtils.SetRenderTarget(cmd, maskRenderTexture.Identifier(), ClearFlag.All, Color.clear);
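    For context, a hedged sketch of how that call might sit inside the pass (the renderer list and material names are hypothetical): CoreUtils.SetRenderTarget, from the core render pipeline library, binds the target with a depth slice of -1, i.e. all slices of the XR texture array, which is why it works where a plain cmd.SetRenderTarget did not.

```csharp
// CoreUtils.SetRenderTarget binds the target with depthSlice -1
// (all slices of the XR texture array) and clears it in one call.
CoreUtils.SetRenderTarget(cmd, maskRenderTexture.Identifier(),
                          ClearFlag.All, Color.clear);

// The tagged objects can then be drawn as before (hypothetical names).
foreach (var renderer in taggedRenderers)
    cmd.DrawRenderer(renderer, flatColorMaterial);
```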
    Here is the final result.

     
  8. akalegman
    @ThomasZeng

    Can I ask you, what is the best way of detecting when VR is in use for a particular view? To elaborate, I am currently using
    Code (CSharp):
    var opaqueDescriptor = XRSettings.enabled ? XRSettings.eyeTextureDesc : cameraTextureDescriptor;
    to create my RenderTexture.

    The issue with this is that XRSettings.enabled is true as soon as we enter play mode, so the scene view breaks when running the game. I am looking for a more dynamic way, so that both the game view and the scene view use their respective descriptors.
     
  9. akalegman
    Scratch that, turns out I can just use the
    Code (CSharp):
    cameraTextureDescriptor
    for both!
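    For reference, a sketch of how the descriptor might be used in Configure(), assuming a hypothetical maskTexture handle: the cameraTextureDescriptor passed in already matches the active view (a 2-slice texture array in VR, a plain 2D texture in the Scene view), which is why no XRSettings branch is needed.

```csharp
public override void Configure(CommandBuffer cmd,
                               RenderTextureDescriptor cameraTextureDescriptor)
{
    // The descriptor already matches the current view, so it works
    // for the game view, the scene view, and VR alike.
    var desc = cameraTextureDescriptor;
    desc.depthBufferBits = 0;                  // colour-only mask
    cmd.GetTemporaryRT(maskTexture.id, desc);  // maskTexture: hypothetical handle
}
```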
     
  10. ThomasZeng
  11. akalegman
    Thanks for the help!
     
  12. akalegman
    @ThomasZeng

    I am using Unity 2020.1.6f1 but I do not have access to cameraData.xrRendering via RenderingData. Is this because ENABLE_VR && ENABLE_XR_MODULE are not defined?

    I have XR enabled, and I am using the new SteamVR beta, which uses the XR Pose Driver.
     
  13. akalegman
    Looks like ENABLE_XR_MODULE is not defined even though I am using the XR Plugin Management Version 3.2.16
     
  14. ChannexDK

    Joined:
    Apr 14, 2015
    Posts:
    15
    If anybody trying to get tagged rendering to work in VR should stumble upon this thread (I did, multiple times), here is what I did to get it working:
    1. Declare and sample the target texture with the _X functions:
    TEXTURE2D_X(_EffectMap);
    SAMPLER(sampler_EffectMap);
    ..
    half4 effects = SAMPLE_TEXTURE2D_X(_EffectMap, sampler_EffectMap, UnityStereoTransformScreenSpaceTex(psIn.screenPos.xy / psIn.screenPos.w));

    This was my first mistake: when rendering in single pass, the target render texture will be a texture array, so it will appear as if nothing is in the texture when in fact it is just not being bound correctly.

    2. In the Configure() function of the ScriptableRenderPass, both slices of the texture array need to be bound (as mentioned by ThomasZeng, who forgot to show how). Instead of calling
    ConfigureTarget(m_EffectTexture.Identifier());
    you (obviously!) call:
    ConfigureTarget(new RenderTargetIdentifier(m_EffectTexture.Identifier(), 0, CubemapFace.Unknown, -1));

    I actually started out thinking it would be another VR-related bug, so I was pleasantly surprised to find out that it was actually working once I figured out how it was supposed to work.