
Depth Texture Shader Replacement

Discussion in 'Shaders' started by Chimera3D, Nov 12, 2014.

  1. Chimera3D

    Joined:
    Jan 27, 2012
    Posts:
    73
    All I am trying to do at the moment is mimic the functionality of the built-in depth texture with shader replacement. For the replacement I'm using the shader below (which is pretty much the same as the built-in depth shader):

    Code (CSharp):
    Shader "Custom/DepthTexture" {
    SubShader {
        Tags { "RenderType"="Opaque" }
        Pass {
            Fog { Mode Off }
    CGPROGRAM

    #pragma vertex vert
    #pragma fragment frag
    #include "UnityCG.cginc"

    struct v2f {
        float4 pos : SV_POSITION;
        float2 depth : TEXCOORD0;
    };

    v2f vert (appdata_base v) {
        v2f o;
        o.pos = mul (UNITY_MATRIX_MVP, v.vertex);
        UNITY_TRANSFER_DEPTH(o.depth);
        return o;
    }

    half4 frag(v2f i) : COLOR {
        UNITY_OUTPUT_DEPTH(i.depth);
    }
    ENDCG
        }
    }
    }
    The code below is attached to a secondary camera (with the camera component disabled) that is supposed to render the depth buffer. The depthTextureShader variable is set to the shader above.

    Code (CSharp):
    using UnityEngine;
    using System.Collections;

    public class GBufferCam : MonoBehaviour {

        public Shader depthTextureShader;
        public RenderTexture gBuffer;

        // Use this for initialization
        void Start () {

            gBuffer = new RenderTexture(Screen.width, Screen.height, 0, RenderTextureFormat.ARGBHalf);
            gBuffer.depth = 24;

        }

        public void GetBuffer() {

            camera.CopyFrom(Camera.main);
            camera.renderingPath = RenderingPath.Forward;

            camera.SetTargetBuffers(gBuffer.colorBuffer, gBuffer.depthBuffer);
            camera.clearFlags = CameraClearFlags.SolidColor;
            camera.RenderWithShader(depthTextureShader, "RenderType");

        }
    }
    When I read from the gBuffer texture in another shader using:

    Code (CSharp):
    float depth = Linear01Depth(UNITY_SAMPLE_DEPTH(tex2D(_GBuffer, i.uv_depth)));
    and display the depth in the fragment shader using:

    Code (CSharp):
    return float4(depth, depth, depth, 1.0);
    my result is incorrect. To my understanding, this is the same process the built-in depth texture shader uses, so can anyone tell me what might be going wrong?
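
    For context, a self-contained test shader built around those two snippets would look roughly like this (only _GBuffer and i.uv_depth come from the snippets above; the shader name and everything else is illustrative):

    Code (CSharp):
    Shader "Custom/ShowDepth" {
    SubShader {
        Pass {
    CGPROGRAM
    #pragma vertex vert
    #pragma fragment frag
    #include "UnityCG.cginc"

    sampler2D _GBuffer;   // assigned from script to the gBuffer render texture

    struct v2f {
        float4 pos : SV_POSITION;
        float2 uv_depth : TEXCOORD0;
    };

    v2f vert (appdata_img v) {
        v2f o;
        o.pos = mul(UNITY_MATRIX_MVP, v.vertex);
        o.uv_depth = v.texcoord.xy;
        return o;
    }

    float4 frag (v2f i) : COLOR {
        // treat the sampled value as a hardware depth value and linearise it to 0..1
        float depth = Linear01Depth(UNITY_SAMPLE_DEPTH(tex2D(_GBuffer, i.uv_depth)));
        return float4(depth, depth, depth, 1.0);
    }
    ENDCG
        }
    }
    }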
     
  2. spraycanmansam

    Joined:
    Nov 22, 2012
    Posts:
    254
    I can't see anywhere in that code snippet where you're actually passing the new depth texture to a material or setting it globally... or if GetBuffer is even getting called :/

    I normally use something like this for my replacement shaders; give it a whirl and see how you go.

    Code (csharp):
    private void Awake()
    {
        camera.CopyFrom(Camera.main);

        var target = new RenderTexture(Screen.width, Screen.height, 16, RenderTextureFormat.Depth);
        camera.targetTexture = target;
        camera.depthTextureMode = DepthTextureMode.None;

        camera.SetReplacementShader(Shader.Find("Hidden/Camera-CustomDepthTexture"), "RenderType");
        Shader.SetGlobalTexture("_GBuffer", target);
    }
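
    'Hidden/Camera-CustomDepthTexture' is my own shader and isn't included here, but since the target is RenderTextureFormat.Depth it's the depth buffer, not the color the fragment returns, that ends up in the texture, so a minimal version can be as simple as this sketch:

    Code (csharp):
    Shader "Hidden/Camera-CustomDepthTexture" {
    SubShader {
        Tags { "RenderType"="Opaque" }
        Pass {
            Fog { Mode Off }
    CGPROGRAM
    #pragma vertex vert
    #pragma fragment frag
    #include "UnityCG.cginc"

    float4 vert (appdata_base v) : SV_POSITION {
        return mul(UNITY_MATRIX_MVP, v.vertex);
    }

    half4 frag () : COLOR {
        // the color written here is ignored for a Depth format target;
        // the rasterised depth is what gets sampled later through _GBuffer
        return 0;
    }
    ENDCG
        }
    }
    }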
     
  3. Chimera3D

    Joined:
    Jan 27, 2012
    Posts:
    73
    Well, I simply trimmed out the part where I send the gBuffer to the shader that uses it. I tested the code above and nothing seems to be getting written into the render texture's depth buffer, or at least it isn't producing the same results as the built-in depth texture.

    The reason I'm trying to make a replacement shader that produces a depth buffer is that the built-in depth texture's buffer is linear, whereas I need the depth to be logarithmic (to preserve precision). Is there a way to store the depth in one of the color channels (or all of them) instead of the depth buffer, since this texture is only being used to store depth and not any other information about what the camera is looking at?
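
    Something along these lines is what I have in mind, writing the depth into the color channels of the ARGBHalf gBuffer instead of relying on the depth buffer (rough sketch, not tested; COMPUTE_DEPTH_01 is the UnityCG macro for linear 0-1 eye depth, and a log mapping would go where the comment is):

    Code (CSharp):
    Shader "Custom/DepthToColor" {
    SubShader {
        Tags { "RenderType"="Opaque" }
        Pass {
            Fog { Mode Off }
    CGPROGRAM
    #pragma vertex vert
    #pragma fragment frag
    #include "UnityCG.cginc"

    struct v2f {
        float4 pos : SV_POSITION;
        float depth : TEXCOORD0;   // linear eye depth, 0 at the camera, 1 at the far plane
    };

    v2f vert (appdata_base v) {
        v2f o;
        o.pos = mul(UNITY_MATRIX_MVP, v.vertex);
        o.depth = COMPUTE_DEPTH_01;   // UnityCG macro: view-space depth scaled by 1/far
        return o;
    }

    half4 frag (v2f i) : COLOR {
        // write the depth straight into the color channels of the ARGBHalf target;
        // a logarithmic remapping of i.depth could be applied here instead
        return half4(i.depth, i.depth, i.depth, 1);
    }
    ENDCG
        }
    }
    }
    Reading it back would then just be a plain tex2D on the gBuffer with no Linear01Depth, since the stored value is already linear.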