_CameraDepthTexture works in shadergraph, but not in custom shader

Discussion in 'Shaders' started by ualogic, Nov 6, 2020.

  1. ualogic

    Joined: Oct 15, 2018
    Posts: 25
    Hi, I am having trouble with simple _CameraDepthTexture sampling. In Shader Graph it works perfectly fine using this setup:
    [Attached screenshots: Shader Graph setup]

    In my custom shader I do:

    Code (CSharp):

    float4 scrPos : TEXCOORD0;
    ...
    uniform sampler2D _CameraDepthTexture;

    // Then in the vertex shader
    o.vertex = UnityObjectToClipPos(v.vertex);
    o.scrPos = ComputeScreenPos(o.vertex);

    // Then in the fragment shader
    float2 screenUV = i.scrPos.xy / i.scrPos.w;
    float depth = tex2D(_CameraDepthTexture, screenUV).r;
    return depth;
    And the result is:
    [Attached screenshot: the resulting output]

    And this is what I get if I add depth = Linear01Depth(depth):
    [Attached screenshot: the output with Linear01Depth applied]


    What am I doing wrong?
     
  2. unityuserunity85496

    Joined: Jan 9, 2019
    Posts: 89
    Try multiplying by the camera's far clip plane distance:
    Linear01Depth(depth) * _ProjectionParams.z
    I recall its default is clip space, but I could be wrong.
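
    For reference, here is a minimal sketch of that idea as a complete built-in render pipeline shader. It assumes the camera actually renders a depth texture (for example, its depthTextureMode includes DepthTextureMode.Depth); the shader name and overall structure are illustrative, not the original poster's exact code.

    Code (CSharp):

    Shader "Unlit/DepthVisualize"
    {
        SubShader
        {
            Tags { "RenderType" = "Opaque" }
            Pass
            {
                CGPROGRAM
                #pragma vertex vert
                #pragma fragment frag
                #include "UnityCG.cginc"

                // Declared via the helper macro so the depth texture is
                // sampled with full float precision on all platforms.
                UNITY_DECLARE_DEPTH_TEXTURE(_CameraDepthTexture);

                struct appdata { float4 vertex : POSITION; };

                struct v2f
                {
                    float4 vertex : SV_POSITION;
                    float4 scrPos : TEXCOORD0;
                };

                v2f vert(appdata v)
                {
                    v2f o;
                    o.vertex = UnityObjectToClipPos(v.vertex);
                    o.scrPos = ComputeScreenPos(o.vertex);
                    return o;
                }

                fixed4 frag(v2f i) : SV_Target
                {
                    // Raw, non-linear depth (reversed-Z on some platforms).
                    float rawDepth = SAMPLE_DEPTH_TEXTURE_PROJ(
                        _CameraDepthTexture, UNITY_PROJ_COORD(i.scrPos));

                    // Remap to a linear 0..1 range between near and far.
                    float depth01 = Linear01Depth(rawDepth);

                    // _ProjectionParams.z is the far plane distance, so this
                    // gives eye-space depth in world units, as suggested
                    // above (compare LinearEyeDepth). Kept for illustration.
                    float eyeDepth = depth01 * _ProjectionParams.z;

                    return fixed4(depth01, depth01, depth01, 1);
                }
                ENDCG
            }
        }
    }

    With a perspective camera, the visualized depth01 should shade from black near the camera to white at the far plane, instead of the nearly uniform image that returning the raw depth value produces.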