URP Depth issue

Discussion in 'Shaders' started by Phantom_X, Mar 12, 2021.

  1. Phantom_X

    Phantom_X

    Joined:
    Jul 11, 2013
    Posts:
    314
    Hi,
    I am having an issue with a depth intersection effect using URP.
    I get the correct result in the editor (top picture), but when built on Android I get weird banding (bottom picture).



    here's the simplified version of the code

    Code (CSharp):
    float pixelDepth = IN.screenCoord.z; // view-space depth (ObjectToViewPos.z)
    float2 screenUV = IN.screenCoord.xy / IN.screenCoord.w; // ComputeScreenPos()
    float rawDepth = SAMPLE_TEXTURE2D(_CameraDepthTexture, sampler_ScreenTextures_linear_clamp, screenUV).r;
    float depth = LinearEyeDepth(rawDepth, _ZBufferParams);

    float start = 0;
    float end = 0.5;
    float dist = ((depth - pixelDepth) - end) / (start - end);

    return saturate(dist);
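    As a sanity check, the fade math in that snippet can be reproduced on the CPU. The depth values in this sketch are made up for illustration; it only mirrors the arithmetic of the shader above:

    ```python
    def saturate(x):
        """HLSL saturate(): clamp to [0, 1]."""
        return min(max(x, 0.0), 1.0)

    def intersection_fade(scene_eye_depth, pixel_eye_depth, start=0.0, end=0.5):
        """Fades from 1 where the surfaces touch to 0 at `end` units apart."""
        dist = ((scene_eye_depth - pixel_eye_depth) - end) / (start - end)
        return saturate(dist)

    print(intersection_fade(10.0, 10.0))   # surfaces touching -> 1.0
    print(intersection_fade(10.25, 10.0))  # halfway through the fade -> 0.5
    print(intersection_fade(10.5, 10.0))   # 0.5 units apart -> 0.0
    ```
    
    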
    Am I doing something wrong?
    I'm using unity 2020.3.0 and URP 10.3.2

    Thanks!
     
  2. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,339
    My guess is the Android version is using a 16 bit depth buffer. If you can, push out the near clip on the camera. If that's the problem, even a small amount might help a ton.
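    To illustrate that guess, here is a rough sketch of how much one depth-buffer step is worth in world units at 16 bits vs 24 bits. The near/far values are hypothetical, and non-reversed OpenGL-style depth is assumed for simplicity:

    ```python
    # Hypothetical camera settings.
    near, far = 0.3, 100.0

    # Unity's _ZBufferParams (non-reversed Z, OpenGL-style) for this camera.
    x = 1.0 - far / near
    y = far / near
    z, w = x / far, y / far

    def eye_depth(d):
        """LinearEyeDepth: non-linear depth-buffer value -> view-space depth."""
        return 1.0 / (z * d + w)

    def buffer_value(eye):
        """Inverse of eye_depth: view-space depth -> depth-buffer value."""
        return (1.0 / eye - w) / z

    def quantize(d, bits):
        """Round d to the nearest value representable in a `bits`-deep buffer."""
        steps = (1 << bits) - 1
        return round(d * steps) / steps

    # World-space size of one depth step at ~50 units from the camera:
    d = buffer_value(50.0)
    step16 = eye_depth(quantize(d, 16) + 1.0 / 65535) - eye_depth(quantize(d, 16))
    step24 = eye_depth(quantize(d, 24) + 1.0 / ((1 << 24) - 1)) - eye_depth(quantize(d, 24))
    print(step16, step24)  # roughly 0.13 units per step at 16 bits vs ~0.0005 at 24 bits
    ```

    With a fade width of only 0.5 units, steps of ~0.13 units show up as a handful of visible bands, which matches the screenshot; pushing out the near clip shrinks the step size.
    
    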
     
  3. Phantom_X

    Phantom_X

    Joined:
    Jul 11, 2013
    Posts:
    314
    Hey,
    so I reduced the far clip plane and increased the near clip plane, and it did indeed help a lot. Thanks for that!

    In SRP you could declare your sampler like this sampler2D_float _CameraDepthTexture; and it would use full float precision data.

    Is there any way to do that with the TEXTURE2D(_CameraDepthTexture); way of declaring it?
    I found TEXTURE2D_FLOAT, however its define is just the same as regular TEXTURE2D.

    Or is it really just a 16 bit buffer, so a higher precision sampler won't change anything? (In my build settings I did check the 32 bit depth buffer box.)

    thanks again!
     
  4. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,339
    I honestly don't understand what the point of the _float suffix on the sampler2D definitions is, since on OpenGL ES 3.0 and desktop all sampler2Ds are float precision.

    My guess is the issue isn't anything to do with the shader code, but rather that Unity is creating the camera depth texture as a half precision depth texture on Android, regardless of what the system depth buffer is using. You'd need to use a GPU profiler (one for the device you're using, not Unity's) to find out if that's what's happening, and I have no idea how you would fix it if that is the issue.
     
    Phantom_X likes this.
  5. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,339
    Looks like sampler2D_float adds a highp precision modifier in front of the sampler2D in the GLSL, so it might do something on some platforms.