
Question GPU triangle culling possible?

Discussion in 'Shaders' started by kenamis, Jun 15, 2020.

  1. kenamis

    kenamis

    Joined:
    Feb 5, 2015
    Posts:
    387
    Hello, I’m looking for some advice on how to do GPU triangle culling. I want to be able to either remove or collapse the triangles of a mesh renderer once it’s submitted for the draw call (i.e. I don’t want to modify the mesh asset). This sounds like a perfect use for a geometry shader, but those aren’t supported everywhere and it’s recommended to use compute shaders instead. To use a compute shader I would need access to the buffer of triangle indices. I think this would be possible using the Graphics API, but I’m trying to use MeshRenderers and SkinnedMeshRenderers too. Is there a way to get that buffer and dispatch a ComputeShader on it before the mesh proceeds to the vertex program?

    I think this is very similar to how the blendshape compute shader works, but I can’t see how it accesses those buffers (vertices in that case).

    Alternatively, I was thinking this could be done in a tessellation shader without actually tessellating, just culling triangles. But I’m worried about the unnecessary overhead of adding the hull and domain stages, and it would also be a pain (if not impossible) to use with shaders made in Shader Graph.
     
    ecurtz likes this.
  2. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,342
    If you use a tessellation shader on a Metal platform (iOS or OSX), then this is automatically converted to a compute shader to do the tessellation before running the basic vertex and fragment shader. But that's "just how Metal works" rather than something you can take advantage of. The tessellation compute shader Metal uses isn't user modifiable outside of the basic hull & domain stuff, which Unity's cross compiler translates for you into the appropriate form.

    Otherwise, no. There's no easy way to have a SkinnedMeshRenderer or MeshRenderer copy its data into a form a compute shader can use beforehand, and AFAIK all the methods available to do so in Unity are CPU based. Unity's skinned mesh renderer in particular is a GPU based skinner, but to use the skinned vertices in a compute shader you have to bake the skinned mesh positions, which is done CPU side, and manually copy those values into a buffer. I've heard of some people replacing Unity's skinned mesh renderer with their own compute based system, but that also basically means writing your own animation system from scratch too.

    Geometry shaders can do this, but there's not really any performance advantage. By the time the geometry shader stage has run, you've already paid the computation cost for those vertices. I guess the real question I have is: what exactly are you trying to do, and why? If you're trying to cull triangles for performance reasons, then there's probably no point. The GPU is already really good at doing this and nothing you can do in the shader will make it any faster.

    If you're looking to reduce overshading, again, the GPU is actually really good at handling this, especially on mobile where opaque overshading isn't really an issue. On PC you can use a two pass shader with the first pass used to prime the depth buffer (rough sketch below). But, again, GPUs are super fast and this might not actually help with perf that much, or at all, depending on your use case.
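
    To illustrate the depth priming idea, a rough built-in-pipeline sketch might look like this (the shader name, the texture property, and the unlit second pass are just placeholders; a real character shader would put its full lighting pass where the second pass is):

    Shader "Hypothetical/DepthPrimedUnlit"
    {
        Properties
        {
            _MainTex ("Texture", 2D) = "white" {}
        }
        SubShader
        {
            Tags { "RenderType"="Opaque" "Queue"="Geometry" }

            // Pass 1: depth-only prepass. No color is written, so the
            // expensive fragment work in the second pass only runs for
            // pixels that end up visible.
            Pass
            {
                ZWrite On
                ColorMask 0

                CGPROGRAM
                #pragma vertex vert
                #pragma fragment frag
                #include "UnityCG.cginc"

                float4 vert (float4 vertex : POSITION) : SV_POSITION
                {
                    return UnityObjectToClipPos(vertex);
                }

                fixed4 frag () : SV_Target
                {
                    return 0; // color output is masked off, only depth matters
                }
                ENDCG
            }

            // Pass 2: the real shading pass. ZTest Equal means only the
            // fragments that survived the prepass get shaded.
            Pass
            {
                ZWrite Off
                ZTest Equal

                CGPROGRAM
                #pragma vertex vert
                #pragma fragment frag
                #include "UnityCG.cginc"

                sampler2D _MainTex;
                float4 _MainTex_ST;

                struct v2f
                {
                    float4 pos : SV_POSITION;
                    float2 uv  : TEXCOORD0;
                };

                v2f vert (float4 vertex : POSITION, float2 uv : TEXCOORD0)
                {
                    v2f o;
                    o.pos = UnityObjectToClipPos(vertex);
                    o.uv = TRANSFORM_TEX(uv, _MainTex);
                    return o;
                }

                fixed4 frag (v2f i) : SV_Target
                {
                    return tex2D(_MainTex, i.uv);
                }
                ENDCG
            }
        }
    }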

    If you're looking to put holes in your mesh, then you can use a geometry shader, or set a vertex's clip space position to be a NaN, which will remove all triangles that reference that vertex.
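
    For reference, a rough sketch of the NaN approach in the built-in pipeline could look like the shader below. The shader name is a placeholder, and it assumes the cull flag is packed into the vertex color alpha; any per-vertex channel you can rewrite at runtime would do. It also leans on undefined behavior (the GPU rejecting NaN positions), so results can vary by hardware:

    Shader "Hypothetical/NaNTriangleCull"
    {
        SubShader
        {
            Tags { "RenderType"="Opaque" }
            Pass
            {
                CGPROGRAM
                #pragma vertex vert
                #pragma fragment frag
                #include "UnityCG.cginc"

                struct appdata
                {
                    float4 vertex : POSITION;
                    float4 color  : COLOR; // assumption: cull flag stored in vertex color alpha
                };

                float4 vert (appdata v) : SV_POSITION
                {
                    float4 pos = UnityObjectToClipPos(v.vertex);

                    // Poison the clip-space position of flagged vertices with NaN.
                    // Any triangle referencing this vertex should then be rejected,
                    // but this is undefined behavior and can differ between
                    // desktop, integrated, and mobile GPUs.
                    if (v.color.a < 0.5)
                        pos = asfloat(0x7FC00000); // quiet NaN splatted to all components

                    return pos;
                }

                fixed4 frag () : SV_Target
                {
                    return fixed4(1, 1, 1, 1);
                }
                ENDCG
            }
        }
    }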
     
  3. kenamis

    kenamis

    Joined:
    Feb 5, 2015
    Posts:
    387
    Thanks for the detailed response.

    I should have mentioned, the purpose is to prevent triangle clipping on runtime-customizable characters. I cull meshes at a "part" level, at a triangle level, and at a per-pixel level. The first and third strategies are straightforward, but the triangle culling is not.

    If I need this purely for visuals and not for a performance gain (though ideally not a loss), would you recommend a geometry shader? If in some instances I end up not culling any triangles but am still using the same shader, would I be paying a high performance price just for the added geometry shader stage?

    I tried the strategy of setting the vertex's clip space position to NaN a while ago and got different results (some not acceptable) on different platforms, which I understand is because the behavior is undefined across GPUs. But maybe I should test this again?
     
    Last edited: Jun 15, 2020
  4. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,342
    A geometry shader is almost guaranteed to come with a performance loss of some kind, and as you said it's not supported on all platforms.

    The NaN position trick is technically "undefined behavior". AFAIK desktop dedicated GPUs will all handle it the same way and reject those triangles, but integrated or mobile ... who knows what those will do.

    There are only two solutions that are 100% guaranteed to work on all platforms.

    A) Modify the mesh data on the CPU side: either pre-build/precompute variations, or modify the mesh at runtime when clothing changes.
    B) Use alpha tested materials with an alpha texture mask.

    Option A is more work, but will render faster assuming this only has to happen "once" when the clothing changes. Option B is less work, but may be slower on some platforms. Probably not slower than geometry shaders on the platforms where the performance impact is noticeable though.
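
    As a rough sketch of option B, something like the cutout shader below would work. The shader name, the _CullMask property, and the unlit shading are all placeholders for this sketch; a real character shader would do the same clip() inside its normal lit fragment shader:

    Shader "Hypothetical/MaskedCutout"
    {
        Properties
        {
            _MainTex  ("Albedo", 2D) = "white" {}
            _CullMask ("Cull Mask (black = hidden)", 2D) = "white" {}
            _Cutoff   ("Alpha Cutoff", Range(0, 1)) = 0.5
        }
        SubShader
        {
            Tags { "RenderType"="TransparentCutout" "Queue"="AlphaTest" }
            Pass
            {
                CGPROGRAM
                #pragma vertex vert
                #pragma fragment frag
                #include "UnityCG.cginc"

                sampler2D _MainTex;
                sampler2D _CullMask;
                float4 _MainTex_ST;
                float _Cutoff;

                struct v2f
                {
                    float4 pos : SV_POSITION;
                    float2 uv  : TEXCOORD0;
                };

                v2f vert (float4 vertex : POSITION, float2 uv : TEXCOORD0)
                {
                    v2f o;
                    o.pos = UnityObjectToClipPos(vertex);
                    o.uv = TRANSFORM_TEX(uv, _MainTex);
                    return o;
                }

                fixed4 frag (v2f i) : SV_Target
                {
                    // Discard any pixel where the mask falls below the cutoff,
                    // e.g. body pixels hidden under a clothing piece.
                    clip(tex2D(_CullMask, i.uv).r - _Cutoff);
                    return tex2D(_MainTex, i.uv);
                }
                ENDCG
            }
        }
    }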
     
    kenamis likes this.