
Custom shader not writing to depth buffer

Discussion in 'Shaders' started by dyamanoha_, Jan 31, 2021.

  1. dyamanoha_

    dyamanoha_

    Joined:
    Mar 17, 2013
    Posts:
    43
    Hey there,

I have a fairly simple unlit transparent shader that I would like to have write values to the depth buffer for later use in a post-processing shader. Right now, the color components of the shader draw to the camera just fine, but the depth buffer is empty. Default material objects show up in the depth buffer just fine. Any idea what I'm doing wrong?

    https://pastebin.com/EVgbpCCS - Simple unlit / transparent shader that's on all my scene game objects (except the sphere)
    https://pastebin.com/9tDiwNYh - Script with OnRenderImage(..) where I'm blitting the src render texture to dst, using a shader that currently just displays the depth buffer.
    https://pastebin.com/H9QG3B5y - The post-processing shader

    Thanks ahead of time
     
  2. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    11,833
The depth buffer and camera depth texture are not the same thing. The depth buffer is used when rendering the camera view color. The camera depth texture is rendered separately, prior to rendering the main camera view. For objects to render to the camera depth texture, two things need to be true: they need to use a shader that has a shadow caster pass, and they need to use a material with a render queue less than 2500.

A transparent material (a queue of 3000) will not render to the camera depth texture, regardless of whether it has a shadow caster pass or not.
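
As a minimal sketch of those two requirements (the cam and mat references here are placeholders, not from the original posts):

Code (CSharp):

// Make sure the camera generates a depth texture at all.
cam.depthTextureMode |= DepthTextureMode.Depth;

// Only materials with a render queue less than 2500 (the opaque range) are
// rendered into it, and only if their shader has a ShadowCaster pass.
Debug.Log(mat.renderQueue); // 3000 for a typical transparent shader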
     
    tonytopper, yigitcanoksz and r033 like this.
  3. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    11,833
However, generally you do not want transparent objects in the opaque queue range (<2500), because they sort front to back, and because the sky will render over them unless
ZWrite On
is used. But with ZWrite On, the skybox won’t render behind the object, and neither will any other normal transparent objects. It will also cause problems for directional shadows, as those use the camera depth texture as well: objects behind this transparent object won’t receive shadows, because shadows will be cast onto the transparent object’s depth instead.

    Your best option would be to manually render your object into a depth texture after the opaque queues have been rendered to the camera color target.
     
    tonytopper likes this.
  4. dyamanoha_

    dyamanoha_

    Joined:
    Mar 17, 2013
    Posts:
    43
    Makes sense.

    > ... manually render your object into a depth texture after the opaque queues have been rendered to the camera color target.

    Does this mean that I would need to iterate across all my objects and swap their materials each frame? Then use Camera.Render() with a RenderTexture?

    Also, I can't seem to find a way to hook into the rendering pipeline to do anything between the opaque and transparent draw calls. I'll keep digging.
     
    Last edited: Feb 1, 2021
  5. dyamanoha_

    dyamanoha_

    Joined:
    Mar 17, 2013
    Posts:
    43
    By the way, I'm using the built-in rendering pipeline. Maybe I should be looking at the scriptable / universal pipelines?
     
  6. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    11,833
I’d recommend either looking into replacement shaders, or command buffers using
DrawRenderer()
to iterate over the objects you care about and draw them to your target texture.

    https://docs.unity3d.com/ScriptReference/Rendering.CameraEvent.html
    https://docs.unity3d.com/ScriptReference/Camera.AddCommandBuffer.html
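
As a rough sketch of the command buffer approach (the component and field names below are placeholders, not from this thread):

Code (CSharp):

using UnityEngine;
using UnityEngine.Rendering;

// Placeholder component: draws one renderer with a depth-writing material
// after the opaque queues, via a command buffer.
[RequireComponent(typeof(Camera))]
public class CustomDepthPass : MonoBehaviour
{
    public Renderer targetRenderer; // the object you care about
    public Material depthMaterial;  // a shader that outputs depth

    private CommandBuffer cbuf;

    void OnEnable()
    {
        cbuf = new CommandBuffer { name = "Custom depth pass" };
        cbuf.DrawRenderer(targetRenderer, depthMaterial);
        // AfterForwardOpaque fires after the opaque queues, before transparents.
        GetComponent<Camera>().AddCommandBuffer(CameraEvent.AfterForwardOpaque, cbuf);
    }

    void OnDisable()
    {
        GetComponent<Camera>().RemoveCommandBuffer(CameraEvent.AfterForwardOpaque, cbuf);
        cbuf.Release();
    }
}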
     
  7. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    11,833
    They both have the same limitations in this specific regard, and require the same solution.
     
  8. dyamanoha_

    dyamanoha_

    Joined:
    Mar 17, 2013
    Posts:
    43
    I had no idea Unity's graphics pipeline was extensible like this. Very cool, thanks for the pointers. Here are some notes on what I ended up doing for anyone else reading this in the future. I'm still trying to debug one issue but I'll post about that next.

Unity's built-in pipeline supports two rendering paths: forward and deferred. Depending on which path you're using for your project, you can bind to different CameraEvents.


    credit

Code (CSharp):

private Camera cam;
private CommandBuffer cbuf;

...

void OnEnable()
{
    this.cam.AddCommandBuffer(CameraEvent.AfterForwardOpaque, this.cbuf);
}

The output of draw commands will be written to the color buffer. If your shader outputs depth information, it can be encoded into the color channels (e.g. as a half4). You can set a render target if you need to capture the depth texture for use in a later shader.

Code (CSharp):

private RenderTexture rt;

...

void Update()
{
    this.cbuf.Clear();
    this.rt = RenderTexture.GetTemporary(this.cam.pixelWidth, this.cam.pixelHeight, 16, RenderTextureFormat.Depth);
    this.cbuf.SetRenderTarget(this.rt);
    this.cbuf.ClearRenderTarget(true, true, Color.black);
    foreach (Renderer r in this.ship.GetComponentsInChildren<Renderer>())
    {
        if (r.enabled)
        {
            this.cbuf.DrawRenderer(r, this.depthMat);
        }
    }
}

When allocating a RenderTexture with a depth format specified, it appears that the RenderTexture's color buffer is allocated per the format specification -- in the case of the snippet above, 16 bits per pixel.

The precision of the depth buffer values needs (I think) to match the fragment shader's output type -- so half4 for 16 bit.
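
As an untested alternative (my assumption, not something verified in this thread), the temporary target could instead use a single-channel half-float color format, with the depth material writing depth into the red channel:

Code (CSharp):

// Hypothetical alternative: store depth in a single 16-bit float color
// channel instead of a RenderTextureFormat.Depth target.
this.rt = RenderTexture.GetTemporary(
    this.cam.pixelWidth, this.cam.pixelHeight,
    16,                          // depth buffer bits, used for z-testing only
    RenderTextureFormat.RHalf);  // one 16-bit float channel to hold depth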

In my case, I want to use this depth buffer in a full-screen post-processing shader. To do that, I just bind the render texture to the fullscreen shader material:

Code (CSharp):

void OnRenderImage(RenderTexture source, RenderTexture destination)
{
    this.sfMat.SetTexture("_DepthTex", this.rt);
    Graphics.Blit(source, destination, this.sfMat);
    RenderTexture.ReleaseTemporary(this.rt);
}

Lastly, use SAMPLE_DEPTH_TEXTURE to pull a float from the depth texture.

Code (CSharp):

float depth = SAMPLE_DEPTH_TEXTURE(_DepthTex, i.uv);

     
    tonytopper and kadd11 like this.