
Is it possible to write into a depth render texture using SetRenderTarget

Discussion in 'Shaders' started by bitinn, Jun 11, 2018.

  1. bitinn

    bitinn

    Joined:
    Aug 20, 2016
    Posts:
    952
    Hi all,

    Long story short: I know we can encode a high-precision depth float (32, 24, or 16 bits) into an RGBA texture (8 bits/channel), but I am wondering whether we can write to a render texture that's set to RenderTextureFormat.Depth and avoid the encoding altogether.

    In particular, I want to achieve it with SetRenderTarget(colorTex, depthTex). I have tried quite a few fragment shader output structs, including:

    Code (CSharp):
        struct FragmentOutput {
            half4 color : SV_Target0;
            float depth : DEPTH;
        };
    Code (CSharp):
        struct FragmentOutput {
            half4 color : SV_Target0;
            float depth : SV_Target1;
        };
    But just couldn't get it to work.

    I don't have a strong reason to prefer a depth-only texture over an RGBA texture, and I've heard that writing to a depth render texture is potentially more expensive?

    I just want to double-check they result in the same thing.

    Thx!
     
  2. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    11,834
    If you need the depth you can use a RenderTextureFormat.Depth format, or just use RenderTextureFormat.RFloat instead, which is basically the same thing. The trick is that to be able to read the texture in another shader, you need to assign it not as the depth buffer, but as the color buffer.

    // RFloat color target (the 24 here is a separate depth buffer used for depth testing)
    var rt = new RenderTexture(x, y, 24, RenderTextureFormat.RFloat);
    Graphics.SetRenderTarget(rt);


    Then in the fragment shader output the i.pos.z value. And yes, you are now effectively rendering two depth textures, but there's not an easy way to copy the "real" depth texture.
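    A minimal sketch of the shader side of this (my own wording, not from the thread): read SV_POSITION in the fragment stage and return its z component into the RFloat color target.

    ```hlsl
    // Depth-to-color pass: writes the rasterizer's depth value into the
    // RFloat color buffer so other shaders can sample it as a regular texture.
    struct v2f
    {
        float4 pos : SV_POSITION;
    };

    v2f vert (float4 vertex : POSITION)
    {
        v2f o;
        o.pos = UnityObjectToClipPos(vertex);
        return o;
    }

    float frag (v2f i) : SV_Target
    {
        // When SV_POSITION is read in the fragment stage, pos.z holds the
        // per-pixel depth the hardware would write to the depth buffer
        // (non-linear on most APIs).
        return i.pos.z;
    }
    ```

    Note this stores the raw, non-linear depth; if you need linear eye depth you'd convert it (e.g. with LinearEyeDepth) before writing it out.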


    Trying to figure out how the Scriptable Render Pipelines handle this: they appear to simply set the color format to RenderTextureFormat.Depth, with the fragment shader using ColorMask 0 and returning 0. You can try that and see if anything useful comes out. I would expect there to be some kind of resolve pass to copy the depth texture into an RFloat texture (which is what the default Unity rendering paths do), but I can't find it anywhere.
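    For reference, a depth-only pass along those lines (a sketch of the described approach, not the actual SRP source) might look like:

    ```
    Shader "Hidden/DepthOnlySketch"
    {
        SubShader
        {
            Pass
            {
                ColorMask 0   // disable all color writes; only depth is written
                ZWrite On

                CGPROGRAM
                #pragma vertex vert
                #pragma fragment frag
                #include "UnityCG.cginc"

                float4 vert (float4 vertex : POSITION) : SV_POSITION
                {
                    return UnityObjectToClipPos(vertex);
                }

                fixed4 frag () : SV_Target
                {
                    return 0; // ignored because of ColorMask 0
                }
                ENDCG
            }
        }
    }
    ```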
     
    bitinn likes this.
  3. bitinn

    bitinn

    Joined:
    Aug 20, 2016
    Posts:
    952
    @bgolus I am back to this old question 2 years later; hope you don't mind me asking a follow-up:

    What's so special about RenderTextureFormat.Depth?

    It seems to make cmd.Blit(src, dest) understand that dest is a depth texture, so that it binds the depth buffer as _MainTex instead of the color buffer.

    Thing is, I couldn't find any other way to give cmd.Blit() that hint when dest has a format like RenderTextureFormat.RHalf.
    RenderTextureFormat.Depth also maps to its own GraphicsFormat (value 142), which suggests some internal magic.

    In short, does Unity's cmd.Blit(src, dest, mat) ever allow us to pass the src depth as _MainTex instead of the src color when dest is not RenderTextureFormat.Depth? I am not aware of any solution here.
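    One workaround sketch (my own assumption, not from the thread): skip cmd.Blit's _MainTex special-casing entirely and bind the depth texture yourself under a name of your choosing. Here "_SourceDepth" and copyDepthMaterial are hypothetical names.

    ```csharp
    // Bind the depth RT explicitly as a global texture, then blit with a
    // material whose shader samples _SourceDepth instead of _MainTex.
    var depthRT = new RenderTexture(w, h, 24, RenderTextureFormat.Depth);
    var destRT  = new RenderTexture(w, h, 0, RenderTextureFormat.RHalf);

    var cmd = new CommandBuffer { name = "CopyDepth" };
    cmd.SetGlobalTexture("_SourceDepth", depthRT); // shader reads _SourceDepth
    cmd.Blit(null, destRT, copyDepthMaterial);     // material ignores _MainTex
    ```

    This sidesteps the question of what Blit binds as _MainTex, at the cost of a custom copy shader.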
     
  4. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    11,834
    Somewhere between a lot, and nothing at all.

    Some APIs have explicit depth render texture formats, like OpenGL. Others can use any generic single-channel floating point render texture format ... AFAIK basically everything that's not OpenGL.

    However, I don't know if Unity allows this. Behind the scenes it's picking the appropriate format for the API, and may have code to force the requirement of a depth format in certain cases when it doesn't actually need to. There's a lot of code in Unity's rendering systems still written against OpenGL and its requirements, even as Unity moves away from it.

    I've spent a lot of time and headache trying to work around custom depth-related stuff in Unity, to the point that I basically avoid it entirely and find other solutions.
     
  5. bitinn

    bitinn

    Joined:
    Aug 20, 2016
    Posts:
    952

    So for the depth buffer, I made it work using your tricks.

    For the shadowmap (light depth buffer), I hit a problem: in SRP, DrawShadows() is too smart and only writes into the depth buffer (even though my ShadowCaster pass explicitly writes into the color buffer and doesn't use ColorMask 0).

    I wonder if there is a workaround, or if it's basically not possible.

    (I did see a post saying we can use DrawRenderers() instead, but then the culling is wrong, as shadowcasters outside of the camera view are not taken into account...)

    PS: oh, one more thing, does using a custom render texture format affect our ability to use the stencil buffer? If we use RFloat, are we safe, or do we need anything special?
     
    Last edited: Sep 23, 2020
  6. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    11,834
    That's a question outside my understanding, unfortunately. I think when the graphics API call to set the current depth buffer is made, you give it the buffer to use as the depth texture, and the GPU creates a hidden 8-bit stencil buffer alongside it, as most modern GPUs use a 32-bit depth buffer even when you specify 24-bit depth precision. Some mobile and older desktop GPUs actually use a single 32-bit buffer of which 24 bits are for depth and 8 are for stencil, but I think that's rare now. Unfortunately I haven't touched "real" graphics programming for quite a while now, and actually setting up a depth buffer wasn't something I ever did.
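    For what it's worth, in Unity the stencil buffer comes along with the depth buffer, not the color format, so an RFloat color target should be fine as long as the render texture is created with a depth buffer of 24 bits or more. A hedged sketch of that assumption:

    ```csharp
    // Stencil availability depends on the depth buffer, not the color format.
    // A depth value of 24 (or 32) gives a combined depth+stencil buffer,
    // so Stencil { ... } blocks in shaders should work even with RFloat color.
    var rt = new RenderTexture(1024, 1024, 24, RenderTextureFormat.RFloat);
    Graphics.SetRenderTarget(rt);

    // depth = 0 would mean no depth buffer at all: no depth test, no stencil.
    ```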