_CameraDepthTexture in Compute Shader

Discussion in 'High Definition Render Pipeline' started by zellspires, Sep 28, 2020.

  1. zellspires

    zellspires

    Joined:
    Jan 14, 2013
    Posts:
    3
    Hi everyone.

    I'm trying to pass the _CameraDepthTexture global shader property to my compute shader using

    computeShader.SetTextureFromGlobal(kernel, "DepthTexture", "_CameraDepthTexture");

    but I get this error:
    Compute shader (PS_procedural): Property (DepthTexture) at kernel index (2) has mismatching texture dimension (expected 2, got 5).

    So it seems like it's not a plain 2D texture (looking at the TextureDimension enum, 5 would be Texture2DArray), even though inside ShaderVariables.hlsl it is defined as: TEXTURE2D_X(_CameraDepthTexture);

    and it's sampled with:
    float LoadCameraDepth(uint2 pixelCoords)
    {
        return LOAD_TEXTURE2D_X_LOD(_CameraDepthTexture, pixelCoords, 0).r;
    }

    This approach used to work fine when using the legacy rendering pipeline. So what's wrong with _CameraDepthTexture dimensions in HDRP? Am I missing something?
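    For context, the full call site looks roughly like this (a stripped-down sketch; the script name, kernel name and output texture are just placeholders from my test setup):

    Code (CSharp):
    using UnityEngine;

    public class DepthComputeDispatcher : MonoBehaviour
    {
        public ComputeShader computeShader;   // compute shader exposing the DepthTexture property
        public RenderTexture output;          // created with enableRandomWrite = true

        void Update()
        {
            int kernel = computeShader.FindKernel("CSMain");

            // bind the global _CameraDepthTexture to the kernel's DepthTexture slot
            computeShader.SetTextureFromGlobal(kernel, "DepthTexture", "_CameraDepthTexture");
            computeShader.SetTexture(kernel, "Result", output);

            computeShader.Dispatch(kernel, output.width / 8, output.height / 8, 1);
        }
    }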

    Is there another way to get the camera depth texture without using another camera and a render texture? Maybe a Custom Pass?

    [P.S.] Similar thread: https://forum.unity.com/threads/acc...n-hdrp-and-pass-it-to-compute-shaders.539003/
     
    Egad_McDad likes this.
  2. Egad_McDad

    Egad_McDad

    Joined:
    Feb 5, 2020
    Posts:
    39
    So I incidentally just started working on an HDRP post-processing effect that uses a compute shader. I also got the mismatching texture dimension error. I managed to make it go away by declaring my texture in the compute shader using TEXTURE2D_X(). Below is a stripped-down version of the working code:

    Code (CSharp):
    #include "Packages/com.unity.render-pipelines.core/ShaderLibrary/Common.hlsl" // required by the below file (I believe)
    #include "Packages/com.unity.render-pipelines.high-definition/Runtime/ShaderLibrary/ShaderVariables.hlsl" // for TEXTURE2D_X() and RW_TEXTURE2D_X

    #pragma kernel CSMain

    TEXTURE2D_X(_Source);
    RW_TEXTURE2D_X(float4, _Destination);

    [numthreads(8,8,1)]
    void CSMain (uint3 id : SV_DispatchThreadID)
    {
        // how to read from a texture
        float4 myValue = _Source[ COORD_TEXTURE2D_X(id.xy) ];

        // how to write to the texture
        _Destination[ COORD_TEXTURE2D_X(id.xy) ] = myValue;
    }
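    On the C# side I bind those two textures from my post-processing effect's Render() callback, roughly like this (a trimmed sketch with my own member names; it also assumes the destination target can be written as a UAV, otherwise you'd go through an intermediate render texture):

    Code (CSharp):
    // Render() override of the custom post-process volume component (sketch)
    public override void Render(CommandBuffer cmd, HDCamera camera, RTHandle source, RTHandle destination)
    {
        int kernel = m_Compute.FindKernel("CSMain");

        cmd.SetComputeTextureParam(m_Compute, kernel, "_Source", source);
        cmd.SetComputeTextureParam(m_Compute, kernel, "_Destination", destination);

        // one 8x8 thread group per 8x8 pixel tile, one Z group per XR view
        int groupsX = (camera.actualWidth + 7) / 8;
        int groupsY = (camera.actualHeight + 7) / 8;
        cmd.DispatchCompute(m_Compute, kernel, groupsX, groupsY, camera.viewCount);
    }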

    I gleaned this from Accumulation.compute, which is in the HDRP package under Runtime > RenderPipeline > Accumulation > Shaders. I've yet to try fetching the depth buffer, but if I get it working I'll come back and add to this thread.
     
  3. Egad_McDad

    Egad_McDad

    Joined:
    Feb 5, 2020
    Posts:
    39
    As promised, an update:

    I got my compute shader and the associated CustomPostProcessVolumeComponent working.

    Sampling the depth texture was as simple as adding the #includes from my first post and using LoadCameraDepth().
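    For anyone looking for a concrete starting point, a minimal kernel along those lines would look something like this (an untested sketch; the output texture and the grayscale write are arbitrary, and the two includes are the same ones from my earlier post):

    Code (CSharp):
    #include "Packages/com.unity.render-pipelines.core/ShaderLibrary/Common.hlsl"
    #include "Packages/com.unity.render-pipelines.high-definition/Runtime/ShaderLibrary/ShaderVariables.hlsl"

    #pragma kernel CSMain

    RW_TEXTURE2D_X(float4, _Destination);

    [numthreads(8,8,1)]
    void CSMain (uint3 id : SV_DispatchThreadID)
    {
        // _CameraDepthTexture is already declared by the includes,
        // so LoadCameraDepth() can read it directly
        float depth = LoadCameraDepth(id.xy);

        // write the raw (non-linear) depth just to visualize it
        _Destination[ COORD_TEXTURE2D_X(id.xy) ] = float4(depth, depth, depth, 1);
    }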


    I should also note that the effect still didn't render in play mode unless I used Resources.Load() to initialize the compute shader in the volume's Setup(). In other words, if you use a public reference and drag in a compute shader, you will find that it is null upon entering play mode and your effect will not work.
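    For reference, the Setup() part of the volume component looks roughly like this (a sketch with my own names; it assumes the compute shader asset lives in a Resources folder so Resources.Load() can find it):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Rendering;
    using UnityEngine.Rendering.HighDefinition;

    [System.Serializable, VolumeComponentMenu("Post-processing/Custom/Depth Compute Effect")]
    public sealed class DepthComputeEffect : CustomPostProcessVolumeComponent, IPostProcessComponent
    {
        public ClampedFloatParameter intensity = new ClampedFloatParameter(0f, 0f, 1f);

        ComputeShader m_Compute;

        public bool IsActive() => m_Compute != null && intensity.value > 0f;

        public override CustomPostProcessInjectionPoint injectionPoint => CustomPostProcessInjectionPoint.AfterPostProcess;

        public override void Setup()
        {
            // load the compute shader here instead of relying on a serialized reference,
            // otherwise the field comes up null when entering play mode
            m_Compute = Resources.Load<ComputeShader>("MyDepthCompute");
        }

        public override void Render(CommandBuffer cmd, HDCamera camera, RTHandle source, RTHandle destination)
        {
            if (m_Compute == null)
                return;

            // bind _Source/_Destination and dispatch here, as in the earlier sketch
        }

        public override void Cleanup() { }
    }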