Custom shader not writing to depth buffer

Discussion in 'Shaders' started by dyamanoha_, Jan 31, 2021.

  1. dyamanoha_

    Joined: Mar 17, 2013
    Posts: 87
    Hey there,

    I have a fairly simple unlit transparent shader that I would like to have write values to the depth buffer, for later use in a post-processing shader. Right now, the color components of the shader draw to the camera just fine, but the depth buffer is empty. Default material objects show up in the depth buffer just fine. Any idea what I'm doing wrong?

    https://pastebin.com/EVgbpCCS - Simple unlit / transparent shader that's on all my scene game objects (except the sphere)
    https://pastebin.com/9tDiwNYh - Script with OnRenderImage(..) where I'm blitting the src render texture to dst, using a shader that currently just displays the depth buffer.
    https://pastebin.com/H9QG3B5y - The post-processing shader

    Thanks ahead of time
     
  2. bgolus

    Joined: Dec 7, 2012
    Posts: 12,352
    The depth buffer and camera depth texture are not the same thing. The depth buffer is used when rendering the camera view's color. The camera depth texture is rendered separately, prior to rendering the main camera view. For objects to render to the camera depth texture, two things need to be true: they need to use a shader that has a shadow caster pass, and they need to use a material with a render queue less than 2500.

    A transparent material (a queue of 3000) will not render to the camera depth texture, regardless of whether it has a shadow caster pass.
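
    For anyone landing here later: a minimal ShaderLab sketch of those two requirements. The shader name and pass contents are placeholders, and the Fallback line is one common way to inherit a ShadowCaster pass rather than writing one yourself.

    ```
    Shader "Custom/DepthTextureVisible"
    {
        SubShader
        {
            // Queue of 2500 or less ("Geometry" is 2000) so the object is
            // included when the camera depth texture is rendered.
            Tags { "Queue" = "Geometry" "RenderType" = "Opaque" }

            Pass
            {
                // ... your regular forward pass ...
            }
        }
        // Falling back to a built-in shader is an easy way to pick up its
        // ShadowCaster pass.
        Fallback "VertexLit"
    }
    ```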
     
  3. bgolus
    However, generally you do not want transparent objects as part of the opaque queue range (<2500), because they sort front to back, and because the sky will render over them unless
    ZWrite On
    is used. But then the skybox won't render behind the object, and neither will any other normal transparent objects. It will also cause problems for directional shadows, as those use the camera depth texture as well: objects behind this transparent object won't receive shadows, because the shadows will be cast onto the transparent object's depth instead.

    Your best option would be to manually render your object into a depth texture after the opaque queues have been rendered to the camera color target.
     
  4. dyamanoha_
    Makes sense.

    > ... manually render your object into a depth texture after the opaque queues have been rendered to the camera color target.

    Does this mean that I would need to iterate across all my objects and swap their materials each frame? Then use Camera.Render() with a RenderTexture?

    Also, I can't seem to find a way to hook into the rendering pipeline to do anything between the opaque and transparent draw calls. I'll keep digging.
     
    Last edited: Feb 1, 2021
  5. dyamanoha_
    By the way, I'm using the built-in rendering pipeline. Maybe I should be looking at the scriptable / universal pipelines?
     
  6. bgolus
    I’d recommend either looking into replacement shaders, or command buffers using
    DrawRenderer()
    to iterate over the objects you care about and draw them to your target texture.

    https://docs.unity3d.com/ScriptReference/Rendering.CameraEvent.html
    https://docs.unity3d.com/ScriptReference/Camera.AddCommandBuffer.html
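
    For the replacement shader route, the rough shape would be something like this (a sketch, not a drop-in: depthOnlyShader is a hypothetical depth-writing shader, and depthCam is a second, disabled camera mirroring the main one). Camera.RenderWithShader swaps every object's shader for that one render without touching the materials themselves.

    ```
    using UnityEngine;

    [RequireComponent(typeof(Camera))]
    public class ReplacementDepthCapture : MonoBehaviour
    {
        public Camera depthCam;          // disabled second camera mirroring the main one
        public Shader depthOnlyShader;   // hypothetical depth-writing shader
        public RenderTexture depthRT;

        void LateUpdate()
        {
            // Mirror the main camera's settings, then redirect output.
            depthCam.CopyFrom(GetComponent<Camera>());
            depthCam.targetTexture = depthRT;
            // Objects are rendered with depthOnlyShader substituted, matched
            // against the "RenderType" tag of their original shaders.
            depthCam.RenderWithShader(depthOnlyShader, "RenderType");
            depthCam.targetTexture = null;
        }
    }
    ```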
     
  7. bgolus
    They both have the same limitations in this specific regard, and require the same solution.
     
  8. dyamanoha_
    I had no idea Unity's graphics pipeline was extensible like this. Very cool, thanks for the pointers. Here are some notes on what I ended up doing for anyone else reading this in the future. I'm still trying to debug one issue but I'll post about that next.

    Unity supports multiple rendering pipelines; within the built-in pipeline there are two rendering paths, forward and deferred. Depending on which path your project uses, you can bind to different CameraEvents.


    credit

    Code (CSharp):

    private Camera cam;
    private CommandBuffer cbuf;

    ...

    void OnEnable()
    {
        this.cam.AddCommandBuffer(CameraEvent.AfterForwardOpaque, this.cbuf);
    }
    The output of draw commands will be written to the color buffer. If your shader writes depth information, it can be encoded into the color channels (e.g. as a half4). You can set a render target if you need to capture the depth texture for use in a later shader.
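
    As a sketch of the encoding idea (assuming UnityCG.cginc is included; EncodeFloatRGBA packs a [0, 1) float across the four color channels, and i.screenPos is assumed to be the clip-space position passed down from the vertex shader):

    ```
    // Hypothetical fragment shader packing linear 0-1 depth into the color
    // channels when rendering to a color target instead of a depth target.
    half4 frag (v2f i) : SV_Target
    {
        float rawDepth = i.screenPos.z / i.screenPos.w;
        // EncodeFloatRGBA expects a value in [0, 1).
        return EncodeFloatRGBA(saturate(Linear01Depth(rawDepth)));
    }
    ```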

    Code (CSharp):

    private RenderTexture rt;

    ...

    void Update()
    {
        this.cbuf.Clear();
        this.rt = RenderTexture.GetTemporary(this.cam.pixelWidth, this.cam.pixelHeight, 16, RenderTextureFormat.Depth);
        this.cbuf.SetRenderTarget(this.rt);
        this.cbuf.ClearRenderTarget(true, true, Color.black);
        foreach (Renderer r in this.ship.GetComponentsInChildren<Renderer>())
        {
            if (r.enabled)
            {
                this.cbuf.DrawRenderer(r, this.depthMat);
            }
        }
    }
    When allocating a RenderTexture with a depth format specified, it appears that the RenderTexture's color buffer is allocated per the format specification -- in the case of the snippet above, 16 bits per pixel.

    The size of the depth buffer values needs (I think?) to match the fragment shader's output type -- so half4 for 16 bit.

    In my case, I want to use this depth buffer in a full-screen post-processing shader. To do that I just bind the render texture to the full-screen shader material:

    Code (CSharp):

    void OnRenderImage(RenderTexture source, RenderTexture destination)
    {
        this.sfMat.SetTexture("_DepthTex", this.rt);
        Graphics.Blit(source, destination, this.sfMat);
        RenderTexture.ReleaseTemporary(this.rt);
    }
    Lastly, use SAMPLE_DEPTH_TEXTURE to pull a float from the depth texture.

    Code (CSharp):

    float depth = SAMPLE_DEPTH_TEXTURE(_DepthTex, i.uv);
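
    If the sampled value looks almost uniformly white or black when displayed, converting it to a linear 0-1 range helps (a sketch; Linear01Depth lives in UnityCG.cginc and assumes the standard camera projection):

    ```
    float depth = SAMPLE_DEPTH_TEXTURE(_DepthTex, i.uv);
    // Convert non-linear device depth to a linear 0..1 range for display.
    float linear01 = Linear01Depth(depth);
    return fixed4(linear01, linear01, linear01, 1);
    ```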
     
  9. insomanx

    Joined: May 29, 2020
    Posts: 2
    @bgolus Is it possible to blit the depth buffer to a texture? I would like to capture that and not have to use a shadow caster pass when I don't care about shadows.

    I would like the code below to work but it appears that BuiltinRenderTextureType.Depth still produces the camera depth texture, not the actual depth buffer.

    Code (CSharp):

    void Start()
    {
        depthTexture = new(mainCam.pixelWidth, mainCam.pixelHeight, 0, GraphicsFormat.R32_SFloat);
        depthTexture.antiAliasing = 1;
        depthTexture.filterMode = FilterMode.Point;
        depthTexture.useMipMap = false;

        Shader.SetGlobalTexture("wtf", depthTexture);

        cb = new CommandBuffer();
        cb.name = "Selection Commands";
        cb.Clear();
        cb.Blit(BuiltinRenderTextureType.Depth, depthTexture);
        mainCam.AddCommandBuffer(CameraEvent.AfterForwardOpaque, cb);
    }
     
  10. bgolus
    Is it possible to get a texture from an arbitrary depth buffer?

    Yes.*

    Is it possible for you to get a texture from an arbitrary depth buffer?

    No.*

    * Not all APIs and platforms allow you to grab data from arbitrary depth buffers!

    To get a depth texture you can sample, you need a render texture with the Depth or ShadowMap format. This can then be used as a depth buffer and will automatically be resolved to a texture to be sampled. Though like normal render textures you can't be both rendering to and reading from a render texture. So if you want to be able to grab a depth buffer from an arbitrary camera, you need to override the render targets before it renders rather than trying to grab it afterwards.

    Here's some basic example code.
    Code (csharp):

    using System.Collections;
    using System.Collections.Generic;
    using UnityEngine;

    [RequireComponent(typeof(Camera))]
    public class DepthRenderTextureTest : MonoBehaviour
    {
        public bool OnlyRenderToDepth;
        public RenderTexture RTColor;
        public RenderTexture RTDepth;
        public RenderTexture RTDepthCopy;

        void Update()
        {
            Camera cam = GetComponent<Camera>();

            if (RTDepthCopy)
                RenderTexture.ReleaseTemporary(RTDepthCopy);

            RTColor = RenderTexture.GetTemporary(1024, 1024, 0, RenderTextureFormat.Default); // no depth buffer
            RTDepth = RenderTexture.GetTemporary(1024, 1024, 24, RenderTextureFormat.Depth); // only a depth buffer
            RTDepthCopy = RenderTexture.GetTemporary(1024, 1024, 0, RenderTextureFormat.RFloat); // depth buffer copy

            // note: if the objects being rendered have code in their fragment shader, that's still all running
            // even if there is no color buffer to render to. so preferably it should be using replacement
            // shaders, or only see objects with depth-only materials applied.
            if (OnlyRenderToDepth)
                cam.targetTexture = RTDepth;
            else
                cam.SetTargetBuffers(RTColor.colorBuffer, RTDepth.depthBuffer);
            cam.Render();
            cam.targetTexture = null;

            // at this point you can sample the depth render texture
            // can't use CopyTexture unless both RTs are Format.Depth, but that is an alternative!
            Graphics.Blit(RTDepth, RTDepthCopy);

            RenderTexture.ReleaseTemporary(RTColor);
            RenderTexture.ReleaseTemporary(RTDepth);
        }

        void OnRenderImage(RenderTexture src, RenderTexture dst)
        {
            // proof that the copy worked
            Graphics.Blit(RTDepthCopy, dst);
        }
    }
     
  11. insomanx
    You are the man @bgolus! I truly appreciate your generosity in sharing knowledge that few can. I got something that seems to work well enough for my needs. FYSA, I plan to use this for some advanced compositing logic with outlines and occlusion fx. Here's the test code based on your example:

    Code (CSharp):

    using UnityEngine;
    using UnityEngine.Rendering;

    public class DepthBufferTransfer : MonoBehaviour
    {
        public Camera mainCam;

        CommandBuffer cb;
        RenderTexture colorTex;
        RenderTexture depthTex;
        RenderTexture persistentDepthTex;

        Material depthCopyMat;

        void Start()
        {
            int width = mainCam.pixelWidth;
            int height = mainCam.pixelHeight;

            colorTex = new RenderTexture(width, height, 0, RenderTextureFormat.Default); // no depth buffer
            depthTex = new RenderTexture(width, height, 24, RenderTextureFormat.Depth); // only a depth buffer
            persistentDepthTex = new RenderTexture(width, height, 24, RenderTextureFormat.Depth); // only a depth buffer to persist

            // This is used in a shader on a visible quad to test that things are working
            Shader.SetGlobalTexture("wtf", persistentDepthTex);

            // This is the secret sauce: predetermine where the depth buffer will go
            mainCam.SetTargetBuffers(colorTex.colorBuffer, depthTex.depthBuffer);

            // Shader can be found at https://support.unity.com/hc/en-us/articles/115000229323-Graphics-Blit-does-not-copy-RenderTexture-depth
            depthCopyMat = new Material(Shader.Find("Hidden/DepthCopy"));
            depthCopyMat.SetTexture("_DepthTex", depthTex);

            cb = new CommandBuffer();
            cb.Clear();

            // Copy the current depth buffer to use for whatever (only needed if we can't count on the data in depthTex not changing)
            cb.Blit(depthTex, persistentDepthTex, depthCopyMat);

            // Copy color to the screen (this might not be where you want to do this but it's fine for this example)
            cb.Blit(colorTex, BuiltinRenderTextureType.CameraTarget);

            mainCam.AddCommandBuffer(CameraEvent.AfterForwardOpaque, cb);
        }
    }
    And here's an ugly image of my test visuals: https://pasteboard.co/5imYSOb6BWGU.png
     
    Last edited: Sep 23, 2023