Does Unity 2018 understand the SV_DepthGreater semantic?

Discussion in 'Shaders' started by Zergling103, Nov 14, 2018.

  1. Zergling103

    Zergling103

    Joined:
    Aug 16, 2011
    Posts:
    284
    We're hoping to use depth writing to give billboards visual quality similar to the real models they represent. Ambient occlusion and screen-space shadows really expose that the objects in question are flat sprites, and writing depth from the fragment shader fixes this.

    However, when you output to SV_Depth in your fragment shader, the shader suddenly becomes a lot slower. This is because outputting an arbitrary depth tells the graphics card it can no longer do depth testing at the beginning of the drawing process (early-Z), but must instead do it at the very end, after the fragment shader has run.

    This is where SV_DepthGreater comes in to save the day. Because the shader promises the depth it outputs only ever moves in one direction relative to the rasterized value, the graphics card can still perform a conservative early depth test on the initial depth value from the rasterizer, and then re-test the final depth value from the shader. However, guess what?

    Unity's shader compiler doesn't recognize the SV_DepthGreater semantic in 5.6 as far as we can tell. And believe me, we tried. If you have any ideas on how we can enable this feel free to let us know. Perhaps we can force it to do the initial depth test despite using SV_Depth?

    So now we're basically facing the possibility of going on a wild adventure of trying to emulate the same result without a depth write, which is turning out to be a huge amount of work for an inferior result - just because a DX11 feature was never exposed in the API.

    So, does 2018 understand the SV_DepthGreater semantic? Will upgrading from 5.6.5p4 solve all our problems or are we doomed to graphical mediocrity?
     
  2. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    7,887
    The problem is I don't believe SV_DepthGreater exists. I'm pretty sure it's a typo from mjp's conservative rasterization article.

    The real semantic is SV_DepthGreaterEqual. There's also earlydepthstencil which may work for your needs as well (I use it for some of my stuff).

    edit: also, due to Unity's use of a reversed Z buffer, I'm guessing you actually need to use SV_DepthLessEqual instead.
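
    For reference, a minimal sketch of how the conservative depth output semantic is declared in plain HLSL (the function and variable names here are illustrative, not from any particular shader):

```hlsl
// With a conventional Z buffer you would declare SV_DepthGreaterEqual:
// the shader promises the depth it writes is >= the rasterized depth,
// so the GPU can still run a conservative early depth test.
// Unity uses a reversed Z buffer (near = 1, far = 0), so pushing a surface
// away from the camera *decreases* its depth value - hence SV_DepthLessEqual.
float4 PSConservativeDepth(
    float4 pos : SV_Position,
    out float depth : SV_DepthLessEqual) : SV_Target
{
    // Only ever move the surface further from the camera than the
    // rasterized value, as the semantic promises (reversed Z: smaller
    // depth = further away). The offset here is a hypothetical example.
    depth = pos.z - 0.001;
    return float4(1, 1, 1, 1);
}
```

    If the shader breaks that promise and writes a depth nearer than the rasterized value, the result is undefined, since fragments may already have been rejected by the early test.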
     
    Last edited: Nov 14, 2018
  3. Zergling103

    Zergling103

    Joined:
    Aug 16, 2011
    Posts:
    284
    How would one apply earlydepthstencil? #pragma earlydepthstencil in the fragment shader?
     
  4. Zergling103

    Zergling103

    Joined:
    Aug 16, 2011
    Posts:
    284
    Also, apparently neither SV_DepthGreaterEqual nor SV_DepthLessEqual are recognized by the compiler in 5.6.5p4; they spit out errors. Are they recognized in newer versions?
     
  5. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    7,887
    You need #pragma target 5.0 and then add [earlydepthstencil] just before the fragment shader, i.e.:

    Code (csharp):
    [earlydepthstencil]
    half4 frag (v2f i, out float depth : SV_Depth) : SV_Target
    {
        // ... etc
    But be warned it's not alpha test or alpha to coverage friendly. For sprites you'll need tight geometry mesh shapes.

    Works in 2018.2 at least.

    Code (CSharp):
    Shader "Unlit/DepthWriteTest"
    {
        Properties
        {
        }
        SubShader
        {
            Tags { "RenderType"="Opaque" }
            LOD 100

            Pass
            {
                Cull Off

                CGPROGRAM
                #pragma vertex vert
                #pragma fragment frag
                #pragma target 5.0

                #include "UnityCG.cginc"

                struct v2f
                {
                    linear noperspective sample float4 pos : SV_POSITION;
                    float2 uv : TEXCOORD0;
                };

                sampler2D _MainTex;
                float4 _MainTex_ST;

                float LinearToDepth(float linearDepth)
                {
                    return (1.0 - _ZBufferParams.w * linearDepth) / (linearDepth * _ZBufferParams.z);
                }

                v2f vert (appdata_full v)
                {
                    v2f o;
                    o.pos = UnityObjectToClipPos(v.vertex);
                    o.uv = v.texcoord.xy * 2.0 - 1.0;
                    return o;
                }

                half4 frag (v2f i,
                    out float outDepth : SV_DepthLessEqual
                    ) : SV_Target
                {
                    float circle = sqrt(dot(i.uv.xy, i.uv.xy));
                    clip(1 - circle);

                    float zLinear = LinearEyeDepth(i.pos.z);
                    zLinear -= cos(circle * UNITY_PI * 0.5) - 1;

                    outDepth = LinearToDepth(zLinear);
                    return half4(1,1,1,1);
                }
                ENDCG
            }
        }
    }
     
  6. Zergling103

    Zergling103

    Joined:
    Aug 16, 2011
    Posts:
    284
    In what way is it not alpha test friendly? We're using this for impostors of opaque objects, and as you can imagine, they'll require cutout. I suppose alternatively we could write the depth to something insane like +inf to get it to clip instead of using alpha.
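
    A minimal sketch of that idea, assuming Unity's reversed Z buffer where the far plane is 0.0 (names like `_MainTex` and the 0.5 alpha cutoff are illustrative):

```hlsl
// Hypothetical cutout via depth rather than clip(): opaque texels keep the
// rasterized depth, transparent texels are pushed to the far plane.
half4 frag(v2f i, out float outDepth : SV_DepthLessEqual) : SV_Target
{
    half4 col = tex2D(_MainTex, i.uv);
    if (col.a < 0.5)
    {
        // 0.0 is the far plane with Unity's reversed Z buffer, so this
        // fragment should lose the depth test against anything nearer.
        outDepth = 0.0;
        return 0;
    }
    outDepth = i.pos.z; // unmodified rasterized depth for opaque texels
    return col;
}
```

    Whether a far-plane fragment is actually rejected depends on the ZTest mode and what's already in the depth buffer (e.g. the clear value), so this is a sketch of the idea rather than a drop-in replacement for clip().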
     
  7. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    7,887
    I've always had some weirdness with how earlydepthstencil works compared to my expectations. Reading more about it now, the problem is this: everyone expects earlydepthstencil to force early depth and stencil rejection, which it does, but it also forces the early depth and stencil writes to happen before the fragment shader runs. I only just understood that now, since none of the documentation makes it clear, and several AMD and Nvidia papers even make it sound like that is not the case. But early test and write is the behaviour that people using it actually see.

    Basically, earlydepthstencil prevents the use of SV_Depth or clip() and will ignore them. You really do need to use SV_DepthLessEqual for what you're doing. I suspect the cases in the past where I had gotten earlydepthstencil to "work" with SV_Depth were because the compiler was just ignoring earlydepthstencil, probably because I hadn't set the proper #pragma target. Ultimately its purpose is to re-enable the early depth and stencil read/write that normally happens by default, in the specific cases where it would otherwise be disabled.

    It also means that SV_DepthLessEqual does depth testing twice?
     
  8. Zergling103

    Zergling103

    Joined:
    Aug 16, 2011
    Posts:
    284
    One would think so. Once against the initial depth value produced by the rasterizer, and again against the final depth value returned by the fragment shader.