
Weird performance drops that depend on camera view direction.

Discussion in 'General Graphics' started by BadHabbits1337, Mar 11, 2019.

  1. BadHabbits1337

    BadHabbits1337

    Joined:
    Jul 27, 2015
    Posts:
    2
    Hi everybody.

    We are developing a game based on a large number of blocks (like Minecraft). When we started implementing a chunk-based world rendering system with custom shaders, we sometimes got big and weird performance degradation. After some research, we found that depending on which side we look at the chunks from, we get high GPU load and the framerate almost halves. At first we thought the problem was in our custom shaders, which we created in Shader Graph, but after stripping the project down completely the problem still exists, even when using the built-in pipeline and the Standard shader, and when the chunks are generated from default built-in cubes.

    It happens all the time, in every setup:
    — While running in the editor
    — While running in a built player
    — This behaviour is reproduced on 4 PCs with completely different GPUs (GTX 1060, HD 4000, GTX 540M, GTX 1050 Ti)
    — And on different Unity versions (2018.3.5f1, 2018.3.8f1, 2019.1.6b1)

    You can see a video of this behaviour here (GPU: Intel HD 4000):


    On more powerful devices, GPU load goes from 30% to 100% (numbers from Task Manager).

    I have also attached a simple project where you can check it yourself.

    In the attachments you can see the Overdraw renders combined in "difference" blending mode, and the result is black, which means the GPU is (probably) doing the same work in both cases.

    The example project has a pretty simple setup, but with custom shaders the performance drop becomes more noticeable and the framerate may drop under 20.

    The question is: why does this happen, and what should I do to avoid this behaviour?
     

    Attached Files:

  2. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,329
    Unity's Overdraw visualization is sort of flawed, or at least it's showing you slightly different data than what you need to debug this problem.

    I'm 100% convinced the issue you're seeing is overdraw related; the difference is the order the triangles are being drawn in. The Overdraw visualization that Unity ships with doesn't take into account the ZWrite and depth rejection that GPUs do.

    For example, here's a bunch of random cubes visualized with Unity's overdraw scene view:
    upload_2019-3-11_13-51-53.png

    Here's that same scene using a custom overdraw shader instead:
    Code (CSharp):
    Shader "Overdraw Visualization"
    {
        Properties {
            _Scale ("Brightness Scale", Range(0.1, 10)) = 1
            [Toggle] _ZWrite ("ZWrite", Float) = 1
            [Enum(UnityEngine.Rendering.CullMode)] _Cull ("Cull", Float) = 2
        }
        SubShader {
            Tags { "RenderType"="Opaque" }
            Pass {
                Blend One One
                ZWrite [_ZWrite]
                Cull [_Cull]

                CGPROGRAM
                    #pragma vertex vert_img
                    #pragma fragment frag
                    #include "UnityCG.cginc"

                    float _Scale;

                    fixed4 frag() : SV_Target
                    {
                        return float4(10.0/255.0, 5.0/255.0, 0.0/255.0, 0) * _Scale;
                    }
                ENDCG
            }
        }
    }
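
    To apply it to a whole scene from a game camera, one option is a replacement shader. This is just a minimal sketch (the class name is made up, and simply dropping the shader on a material works too); the "RenderType" tag means only objects whose shaders are tagged Opaque get drawn with it:
    Code (CSharp):
    using UnityEngine;

    // Attach to a camera to see the scene through the overdraw shader above.
    [RequireComponent(typeof(Camera))]
    public class OverdrawView : MonoBehaviour
    {
        void OnEnable()
        {
            var cam = GetComponent<Camera>();
            // Solid black background so the additive blending reads correctly.
            cam.clearFlags = CameraClearFlags.SolidColor;
            cam.backgroundColor = Color.black;
            // Matches each object's RenderType tag against the replacement
            // shader's SubShader tags ("Opaque" here).
            cam.SetReplacementShader(Shader.Find("Overdraw Visualization"), "RenderType");
        }

        void OnDisable()
        {
            // Back to normal rendering.
            GetComponent<Camera>().ResetReplacementShader();
        }
    }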

    First, with ZWrite off and scaled to 3x brightness. This is just to compare against the built-in colors, as these show the same "data".
    upload_2019-3-11_13-53-35.png

    And finally with ZWrite On, this is closer to the actual overdraw as the GPU "sees" the scene.
    upload_2019-3-11_13-55-25.png

    ZWrite (along with ZTest) tells the shader to use the depth buffer to skip rendering triangles on pixels that have already been rendered to by something closer. For this bunch of random cubes, Unity is sorting the objects by how close they are to the camera, with closer objects rendering first, which results in pretty close to perfect sorting for this case, with only a few areas having pixels drawn to twice.

    For meshes that have lots of faces, like your voxel worlds, it's going to come down to the order the triangles are stored in the index array, and Unity will not sort these for you. The result is that if you're viewing them so they're roughly sorted optimally, the front faces are going to occlude those behind them in the depth buffer and the GPU will skip rendering duplicates. If you're viewing them so they're in the opposite order, then every pixel of every triangle (not culled by facing or frustum) is going to be rendered.
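
    To make "the order the triangles are stored in the index array" concrete, here's a rough sketch (not something Unity does for you; it assumes a single submesh and a readable mesh) that re-sorts a mesh's triangles front-to-back for a camera looking along viewDir:
    Code (CSharp):
    using System.Linq;
    using UnityEngine;

    public static class TriangleSorter
    {
        // Rewrites the index buffer so triangles whose centroids are closest to a
        // camera looking along viewDir come first (front-to-back for that view).
        public static void SortFrontToBack(Mesh mesh, Vector3 viewDir)
        {
            Vector3[] verts = mesh.vertices;
            int[] tris = mesh.triangles;
            int triCount = tris.Length / 3;

            // Order triangle indices by the projection of each centroid onto viewDir.
            int[] order = Enumerable.Range(0, triCount)
                .OrderBy(t => Vector3.Dot(
                    (verts[tris[t * 3]] + verts[tris[t * 3 + 1]] + verts[tris[t * 3 + 2]]) / 3f,
                    viewDir))
                .ToArray();

            int[] sorted = new int[tris.Length];
            for (int i = 0; i < triCount; i++)
            {
                sorted[i * 3]     = tris[order[i] * 3];
                sorted[i * 3 + 1] = tris[order[i] * 3 + 1];
                sorted[i * 3 + 2] = tris[order[i] * 3 + 2];
            }
            mesh.triangles = sorted;
        }
    }
    A static sort like this only helps for one view direction, of course; the general point is that early depth rejection only saves work if the big occluders hit the depth buffer before the geometry they hide.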
     
    Last edited: Mar 11, 2019
  3. BadHabbits1337

    BadHabbits1337

    Joined:
    Jul 27, 2015
    Posts:
    2
    bgolus, thanks a lot, your overdraw shader helped me, because now I can see what is actually going on. (See attached pics.)

    I should rewrite my chunk generation algorithm, because right now it is pretty simple and straightforward: I just combine NxN meshes into CombineInstances in the order of the for loops. But really, I should do it in another manner. First, create the chunk's block caps, which make almost 90% of the underlying geometry "invisible" thanks to the depth test. And after that, each block face in the chunk should be added to the index buffer front to back, row by row, from the visible angle (rough sketch of the ordering part below).
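
    Roughly something like this for the ordering part (just a sketch; ChunkBuilder, Build and blockPositions are made-up names, and the real version should also skip faces hidden by neighbouring blocks):
    Code (CSharp):
    using System.Linq;
    using UnityEngine;

    public static class ChunkBuilder
    {
        // Combines per-block meshes into one chunk mesh, ordering the combines
        // front-to-back along a reference view direction so the index buffer
        // starts with the geometry most likely to occlude the rest.
        public static Mesh Build(Mesh blockMesh, Vector3[] blockPositions, Vector3 viewDir)
        {
            CombineInstance[] combines = blockPositions
                .OrderBy(p => Vector3.Dot(p, viewDir)) // closest along viewDir first
                .Select(p => new CombineInstance
                {
                    mesh = blockMesh,
                    transform = Matrix4x4.Translate(p)
                })
                .ToArray();

            var chunkMesh = new Mesh();
            // Chunks can easily exceed the 16-bit index limit.
            chunkMesh.indexFormat = UnityEngine.Rendering.IndexFormat.UInt32;
            chunkMesh.CombineMeshes(combines, true, true);
            return chunkMesh;
        }
    }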
     

    Attached Files: