
Question Rendering two layers of semitransparent fabric culls backside

Discussion in 'Shaders' started by felixgruber, May 28, 2021.

  1. felixgruber

    felixgruber

    Joined:
    Jan 16, 2019
    Posts:
    2
Hi everyone! I'm hoping somebody can help me with the following problem that my colleague @d4n3x already pointed out in his previous post here: https://forum.unity.com/threads/ar-rendering-culls-randomly.1111552

We have two layers of semi-transparent fabric inside a model, which look fine when the model is placed in its initial state. This can be seen in the following screenshot:

    img_working.jpg

You can faintly see the back fabric, which is further away from the camera, through the transparent parts of the closer one. That looks totally fine until I start to move the fabric that is closer to the camera upwards (and only upwards, no other translation).

    img_not_working.jpg
Suddenly there is a point where the fabric in the back just "visually disappears" and you can no longer see it through the front one; only the parts on the left side, which are not covered by the front fabric, remain visible.

I can only quote my colleague on what he already stated:
"Since the backside fabric is still active and somehow visible on the side, I thought of a culling problem.
I tried disabling dynamic culling for the meshes, disabled culling on the camera, changed the transparency sort order in the settings to 'perspective', and disabled HDR on the camera since this had some effect in the editor, but it doesn't seem to on the iPad.

    Also attached is the code for the shader of the semi-transparent fabrics as there is maybe something I missed."


Code (CSharp):
    Shader "RGBplusAFalse" {
        Properties {
            _Color ("Main Color", Color) = (1,1,1,1)
            _MainTex ("Base (RGB) Trans (A)", 2D) = "white" {}
            _AlphaTex ("AlphaTex (R)", 2D) = "black" {}
            _Blend ("Blend", Range (0, 1.0)) = 1.0
        }

        SubShader {
            Tags {"Queue"="Transparent" "IgnoreProjector"="True" "RenderType"="Transparent"}
            LOD 300

            // extra pass that renders to depth buffer only
            Pass {
                ZWrite On
                ColorMask 0
            }

            // paste in forward rendering passes from Transparent/Diffuse
            UsePass "Transparent/Diffuse/FORWARD"

            CGPROGRAM
            #pragma surface surf Lambert alpha:fade

            sampler2D _MainTex;
            sampler2D _AlphaTex;
            fixed4 _Color;
            float _Blend;

            struct Input {
                float2 uv_MainTex;
            };

            void surf (Input IN, inout SurfaceOutput o) {
                fixed4 c = tex2D(_MainTex, IN.uv_MainTex) * _Color;
                fixed4 alph = tex2D(_AlphaTex, IN.uv_MainTex);
                o.Albedo = c.rgb;
                o.Alpha = (1 - alph.r) * _Blend;
            }
            ENDCG
        }

        Fallback "Legacy Shaders/Transparent/Diffuse"
    }
Again, thanks in advance to anyone who might be able to point us in the right direction or tell us where we screwed up :)
     
  2. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,352
    Accurate and efficient sorting of real time transparent surfaces is an unsolved problem.
    https://realtimevfx.com/t/transparency-issues-in-unity/4337/2

    The problem you have of one plane disappearing is caused by one line in particular:
    ZWrite On

    To render transparent objects in real time, engines use what's known as the Painter's Algorithm, which basically means objects further away need to be drawn first. To accomplish this, transparent meshes are sorted back to front based on their distance from the camera, using the center of each mesh's spherical bounds. If the meshes intersect, or if they aren't sorted perfectly, this fails. As I mentioned, Unity sorts them by their spherical bounds center, so if you have two flat parallel planes facing the camera and one is moved "up" further than the two planes are apart, then the plane that is actually closer gets treated as further away and renders first. Unity doesn't know, or care, that the planes are, well, planes. It just knows the center of the bounds of one is further away than the other.

    When you use ZWrite On, you're telling the GPU to write to the depth buffer. The depth buffer is used to prevent overlapping geometry from rendering when some parts are further away than what's previously been rendered. For the case of a single 3D mesh, that extra pass will fill in the depth buffer for the entire mesh, and then none of its surfaces but those closest to the camera will be visible. Anything that renders before it will render normally, and anything that renders afterward can only appear closer than that mesh.

    In the case of two planes, which are separate objects, if they're sorted properly back to front, that ZWrite pass doesn't really do anything. The second plane renders closer than the first one, and the depth buffer doesn't prevent anything from rendering. When you move it up, suddenly the "closer" plane is rendering first, and the depth buffer is preventing the "further" plane from rendering. However, removing the extra pass will just mean the "closer" plane renders under the "further" one.


    It might seem like the obvious solution is to sort them "properly", but there isn't really a good way to do that in the general case. Sure, you could write some code to sort them as planes rather than spheres, but there's no right answer if they intersect, and it's relatively expensive for something that's uncommon. This is why Unity's built-in sorting is "wrong" in this case. In 2D games, sprites are sorted by their depth rather than their distance, which avoids the problem there, but that solution assumes the sprites are always perfectly facing the camera, and they can sort incorrectly when that isn't true.


    Again, this is an unsolved problem for the general case! There are no perfect solutions to this! In the link above I mention per-triangle sorting and a pre-pass depth write as workarounds, but that was in reply to the specific case in that thread, which is a single mesh with waves of water. Those won't really work well here. OIT (Order Independent Transparency), like depth peeling, could work, but is costly and difficult to implement. Weighted Blended OIT is an approximation that fails with near-opaque objects, and is better described as a way to make the order of overlapping transparent surfaces difficult to discern rather than being "correct". Alpha to Coverage is a form of dithered transparency; both are options, but they don't always look great.

    Technically there's also ray tracing now, but that's still very expensive.


    However, for this very specific case of two close planes that need to render in a specific order, don't intersect, and are always parallel, you can sort the planes manually: put a Sorting Group component on an immediate parent of the two, and set the mesh renderers' .sortingOrder with a script that checks the depth of each plane against the camera position using a dot product along the planes' common normal.
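
    A sketch of that suggestion, assuming both fabric planes share one normal and sit under a common parent that has a SortingGroup component (the component name PlaneDepthSorter and its fields are made up for illustration):

    ```csharp
    using UnityEngine;

    // Hypothetical helper: attach to the parent that carries the SortingGroup.
    // Each frame it measures both planes' depths along their shared normal and
    // assigns sortingOrder so the further plane always draws first.
    public class PlaneDepthSorter : MonoBehaviour
    {
        public Renderer planeA;              // one fabric layer
        public Renderer planeB;              // the other fabric layer
        public Transform normalSource;       // a transform whose forward is the planes' common normal

        void LateUpdate()
        {
            Camera cam = Camera.main;
            if (cam == null) return;

            Vector3 n = normalSource.forward;

            // Signed distance of each plane from the camera along the shared normal.
            float depthA = Vector3.Dot(planeA.transform.position - cam.transform.position, n);
            float depthB = Vector3.Dot(planeB.transform.position - cam.transform.position, n);

            // Further along the normal = draw first = lower sorting order.
            bool aIsFurther = Mathf.Abs(depthA) > Mathf.Abs(depthB);
            planeA.sortingOrder = aIsFurther ? 0 : 1;
            planeB.sortingOrder = aIsFurther ? 1 : 0;
        }
    }
    ```

    Because both planes are parallel, the dot product against the common normal gives a stable depth comparison that ignores the vertical offset that confuses the bounds-center sort.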
     
  3. felixgruber

    felixgruber

    Joined:
    Jan 16, 2019
    Posts:
    2
    @bgolus Thank you so much for this detailed and exceptionally well explained answer! My colleague and I will try to implement your suggested solution ASAP - thanks again!