
Question: Render and share down-sampled depth texture between cameras?

Discussion in 'Shaders' started by SamKennedy, Jan 25, 2023.

SamKennedy

    Joined:
    Apr 25, 2021
    Posts:
    12
    I have two cameras which are parented as follows:
    • Scene camera (Depth 0)
      • Effects camera (Depth 1)
    Currently, the scene camera renders the entire scene (minus effects) at full resolution, and the effects camera renders some volumetric effects (minus the rest of the scene) at a lower resolution.
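    A minimal sketch of that setup might look like the following. The layer name "Effects", the quarter-resolution target, and the component name are all illustrative assumptions, not something from my actual project:

    ```csharp
    using UnityEngine;

    // Example setup matching the hierarchy above (names are illustrative).
    public class EffectsCameraSetup : MonoBehaviour
    {
        public Camera sceneCamera;   // Depth 0, renders everything except effects
        public Camera effectsCamera; // Depth 1, renders only the volumetrics

        void Start()
        {
            // Scene camera skips the "Effects" layer; effects camera renders only it.
            int effectsLayer = LayerMask.NameToLayer("Effects");
            sceneCamera.cullingMask &= ~(1 << effectsLayer);
            effectsCamera.cullingMask = 1 << effectsLayer;

            // Effects camera renders into a lower-resolution target
            // (quarter resolution here, chosen arbitrarily).
            var rt = new RenderTexture(Screen.width / 4, Screen.height / 4, 0);
            effectsCamera.targetTexture = rt;
            effectsCamera.clearFlags = CameraClearFlags.SolidColor;
            effectsCamera.backgroundColor = Color.clear;
        }
    }
    ```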

    The effects camera needs access to the scene camera's depth buffer; however, because the two cameras render at different resolutions, the effects camera needs a downsampled version of the parent camera's depth buffer.

    The effects camera has a public RenderTexture, so the scene camera can access it from a script. However, I'm not sure about the exact sequence of events or the correct way to implement it in C#.

    Here is what should happen:

    1. Scene camera renders colour + depth at full resolution
    2. Scene camera renders a second pass, only depth, down sampled to match resolution of the effects camera
    3. Effects camera uses down sampled depth texture from step 2
    I know I can override OnPreRender, OnRenderImage and OnPostRender, but I can't figure out how to put all of this together to get the desired effect.
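    For what it's worth, here is a sketch of how I imagine steps 1–3 could fit together on the scene camera (built-in render pipeline). The field names, the global texture name `_SceneDepthLowRes`, and `depthCopyMaterial` (a material whose shader just outputs `_CameraDepthTexture`) are assumptions on my part, not tested code:

    ```csharp
    using UnityEngine;

    // Attach to the scene camera. Assumes "downsampledDepth" is a
    // low-resolution RenderTexture (e.g. RenderTextureFormat.RFloat)
    // matching the effects camera, and "depthCopyMaterial" uses a
    // shader that samples _CameraDepthTexture.
    [RequireComponent(typeof(Camera))]
    public class DepthDownsampler : MonoBehaviour
    {
        public RenderTexture downsampledDepth;
        public Material depthCopyMaterial;

        void OnEnable()
        {
            // Step 1: ask Unity to render a depth texture alongside colour.
            GetComponent<Camera>().depthTextureMode |= DepthTextureMode.Depth;
        }

        void OnRenderImage(RenderTexture src, RenderTexture dst)
        {
            // Step 2: copy the full-resolution depth into the small target.
            // Bilinear filtering does the downsampling here; a custom
            // min/max gather pass would arguably be more correct for depth.
            Graphics.Blit(null, downsampledDepth, depthCopyMaterial);

            // Pass the colour image through unchanged.
            Graphics.Blit(src, dst);

            // Step 3: expose the small depth texture to the effects
            // camera's shaders under an assumed global name.
            Shader.SetGlobalTexture("_SceneDepthLowRes", downsampledDepth);
        }
    }
    ```

    Is OnRenderImage the right hook for the downsampling blit, or should it happen in OnPostRender instead?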