
Reconstructing world position in custom render pass...

Discussion in 'Universal Render Pipeline' started by stu_pidd_cow, Dec 31, 2020.

  1. stu_pidd_cow

    stu_pidd_cow

    Joined:
    Aug 4, 2014
    Posts:
    233
    I'm trying to sample the camera's depth texture to reconstruct the world position in a custom render pass, and I can't find any solution that works. All of these tests were done in Unity 2020.2.1f1, URP 10.2.2. I've enabled the depth texture on the camera, and my near and far planes have reasonable values.

    Firstly, declaring _CameraDepthTexture myself (which is how it works in the legacy renderer) results in a compile error: it's obviously already defined elsewhere. So if I call this:
    Code (CSharp):
    SAMPLE_DEPTH_TEXTURE(_CameraDepthTexture, texcoord)
    I get the following compile error:
    I searched the URP code for the definition, and apparently it requires the sampler now, so if I do this:
    Code (CSharp):
    SAMPLE_DEPTH_TEXTURE(_CameraDepthTexture, sampler_CameraDepthTexture, texcoord)
    Then I get this compile error:
    Alright, so this is a dead end. So I searched the forums for how to do it. I found a few suggestions, but they all failed. This is what I tried:

    Code (CSharp):
    SAMPLE_DEPTH_TEXTURE_PROJ(_CameraDepthTexture, UNITY_PROJ_COORD(uv)) // undeclared identifier 'UNITY_PROJ_COORD' at line 142 (on d3d11)
    _CameraDepthTexture.Sample(sampler_CameraDepthTexture, uv).r // 'Sample': no matching 2 parameter intrinsic method
    SAMPLE_TEXTURE2D_ARRAY(_CameraDepthTexture, sampler_CameraDepthTexture, texcoord, unity_StereoEyeIndex).r // compiles, but is always gray
    More dead ends. So I then looked into how Shader Graph implements the Scene Depth node:
    Code (CSharp):
    void Unity_SceneDepth_Linear01_float(float4 UV, out float Out)
    {
        Out = Linear01Depth(SHADERGRAPH_SAMPLE_SCENE_DEPTH(UV.xy), _ZBufferParams);
    }
    It seems poorly designed that it has to go through a Shader Graph function to work, but I tried it anyway. It compiles, but returns all white. That's three dead ends.
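    For reference, URP 10 also ships a helper include that handles the declaration and sampling for you. A minimal sketch, assuming the include path and function names haven't moved in your URP version (DeclareDepthTexture.hlsl declares _CameraDepthTexture and its sampler itself, so you must not declare them again):

```hlsl
#include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl"
#include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/DeclareDepthTexture.hlsl"

// Sample the camera depth texture via URP's own helper and linearize it.
float SampleLinearDepth(float2 uv)
{
    float rawDepth = SampleSceneDepth(uv);          // non-linear device depth
    return Linear01Depth(rawDepth, _ZBufferParams); // 0..1 linear depth
}
```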

    <RANT>
    I had another thread complaining about these issues. All that would be needed to help me is some documentation on the shader functions (instead of throwing a super basic sample at us and saying "you figure it out"). Documentation for custom post-processing was put on the roadmap back in September 2019. It's absolutely absurd that documentation is something that goes on a roadmap, let alone AFTER coming out of preview. I'm doing this to support an asset on the Asset Store, and my customers have been asking for URP support for well over a year now, and I have to be the doofus who shrugs and says "I can't do anything about it". I have seriously considered discontinuing my asset, since supporting URP has become unbearable and completely not worth the time, effort and stress.
    </RANT>

    Anyways, has anyone had any luck trying to do this?
    Thanks.
     
    Last edited: Dec 31, 2020
  2. caladluin

    caladluin

    Joined:
    Jan 29, 2014
    Posts:
    25
    I was dealing with this kind of issue about 4 days ago...
    I had to:
    1) Make sure that URP is providing a depth texture, by setting the quality presets to all provide one.
    2) On custom render passes, the camera position is all wonky. I followed the approach from here to calculate the camera view frustum: Raymarching Distance Fields: Concepts and Implementation in Unity (adrianb.io), and passed that along with the cameraToWorld matrix into the blit material.
    3) In the shader, you first need to declare "sampler2D _CameraDepthTexture;" along with your other parameters/uniforms so you can access it. It will be filled in automatically, but you must declare it first.
    4) In the vertex shader I use some unfortunate if statements to determine which corner the vertex is, and get the frustum "ray" from that:
    Code (HLSL):
    ...
    half index = 0;
    if (v.uv.x > 0.5f)
    {
        if (v.uv.y < 0.5f)
        {
            index = 2;
        }
        else
        {
            index = 1;
        }
    }
    else if (v.uv.y < 0.5f)
    {
        index = 3;
    }
    ...
    o.ray = _FrustumCornersES[(int)index].xyz;
    o.ray /= abs(o.ray.z);
    o.ray = mul(_CameraInvViewMatrix, o.ray);
    where v is the vertex data, and o is the struct I defined for the output. The "ray" gets interpolated between the verts, giving a view ray for each point on the screen in the fragment shader (note it isn't unit length: it's scaled so its view-space z is 1).
    5) I defined my own "LinearEyeDepth", since I couldn't import it. Copied from here: LinearEyeDepth and Linear01Depth in a Compute Shader returning infinity - Unity Forum, with a couple of fixes:
    Code (HLSL):
    float LinearEyeDepth(float rawdepth)
    {
        float x, y, z, w;
    #if SHADER_API_GLES3 // instead of UNITY_REVERSED_Z
        x = -1.0 + _ProjectionParams.y / _ProjectionParams.z;
        y = 1;
        z = x / _ProjectionParams.y;
        w = 1 / _ProjectionParams.y;
    #else
        x = 1.0 - _ProjectionParams.y / _ProjectionParams.z;
        y = _ProjectionParams.y / _ProjectionParams.z;
        z = x / _ProjectionParams.y;
        w = y / _ProjectionParams.y;
    #endif
        return 1.0 / (z * rawdepth + w);
    }
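    For what it's worth, the core render pipeline library already ships an equivalent, so hand-rolling it shouldn't be necessary once the include chain resolves. A sketch, assuming URP 10's include paths and signatures (these versions take the z-buffer params explicitly instead of reading a global):

```hlsl
#include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl"

// Core.hlsl pulls in com.unity.render-pipelines.core/ShaderLibrary/Common.hlsl,
// which already defines LinearEyeDepth/Linear01Depth with an explicit parameter.
float EyeDepthFromRaw(float rawDepth)
{
    return LinearEyeDepth(rawDepth, _ZBufferParams); // view-space depth in world units
}
```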
    6) Finally, I get the depth in the fragment shader:
    Code (HLSL):
    float depth = LinearEyeDepth(tex2D(_CameraDepthTexture, i.uv).r);
    float3 rd = normalize(i.ray.xyz); // unit-length view ray
    depth *= length(i.ray.xyz);       // convert eye depth to distance along the ray
    float3 worldPos = _WorldSpaceCameraPos + rd * depth;
    Hope this helps!
     
    stu_pidd_cow likes this.
  3. stu_pidd_cow

    stu_pidd_cow

    Joined:
    Aug 4, 2014
    Posts:
    233
    @caladluin Thanks for the reply!
    Everything you said aligns with what I've tried, except for #6. To read the _CameraDepthTexture, I needed to do this:
    Code (CSharp):
    // this should be the only include you need!
    #include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl"

    // declaration
    TEXTURE2D_FLOAT(_CameraDepthTexture);
    SAMPLER(sampler_CameraDepthTexture);

    // reading
    float z = SAMPLE_DEPTH_TEXTURE(_CameraDepthTexture, sampler_CameraDepthTexture, texcoord.xy);
    After that, everything seems to work!
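    From that raw depth value, the world position itself can be reconstructed with a core library helper instead of the frustum-ray approach. A minimal sketch, assuming UNITY_MATRIX_I_VP is bound for the pass (a fullscreen blit from a custom pass may need the inverse view-projection matrix passed in manually):

```hlsl
#include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl"

TEXTURE2D_FLOAT(_CameraDepthTexture);
SAMPLER(sampler_CameraDepthTexture);

// Reconstruct the world-space position behind a screen UV from raw depth.
float3 WorldPosFromDepth(float2 uv)
{
    float rawDepth = SAMPLE_DEPTH_TEXTURE(_CameraDepthTexture, sampler_CameraDepthTexture, uv);
    // ComputeWorldSpacePosition lives in the core RP Common.hlsl; it unprojects
    // NDC + device depth through the inverse view-projection matrix.
    return ComputeWorldSpacePosition(uv, rawDepth, UNITY_MATRIX_I_VP);
}
```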
     
    caladluin likes this.
  4. BenStorch

    BenStorch

    Joined:
    Aug 20, 2017
    Posts:
    7
    Hi, I am having an issue as well with "undeclared identifier 'sampler_CameraDepthTexture'", associated with a couple of shaders using the standard pipeline.
    So I am wondering: what is the simplest script I need to attach to the camera to declare the depth texture?
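    In the Built-in (standard) pipeline the depth texture is requested from the camera rather than from a pipeline asset. A minimal sketch of such a component (the shader still declares "sampler2D _CameraDepthTexture;" itself; this script only makes Unity generate the texture):

```csharp
using UnityEngine;

[RequireComponent(typeof(Camera))]
public class EnableDepthTexture : MonoBehaviour
{
    void OnEnable()
    {
        // Ask the camera to render a depth pass so _CameraDepthTexture exists.
        GetComponent<Camera>().depthTextureMode |= DepthTextureMode.Depth;
    }
}
```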