
Resolved Does Unity support triangleadj in geometry shaders?

Discussion in 'Shaders' started by Milun, Jul 13, 2020.

  1. Milun
    Joined:
    Nov 30, 2012
    Posts:
    12
    Hello. I'm a bit new to shaders (and Unity for that matter), but not new to game dev (worked with MonoGame for years prior to this), and I've run into a bit of a problem that has barely any Google results.

    I'm trying to make a Vectrex shader for a game of mine (so far it looks like this):



    I'm drawing these with a LineStream geometry shader, but I've run into a bit of a problem. I need edges that belong to two visible faces to not be drawn. I thought I could easily achieve this by using triangleadj as my input, but it only seems to provide vert data in [0], [1], and [2], while the other 3 elements are all just verts at [0,0,0] (meaning I can get data about the current face, but not the adjacent faces).

    Code (CSharp):
    [maxvertexcount(6)]
    void geom(triangleadj v2g input[6], inout LineStream<g2f> stream)
    {
        // Works:
        float4 p0 = input[0].pos;
        float4 p1 = input[1].pos;
        float4 p2 = input[2].pos;

        // Just gives [0,0,0]
        float4 p3 = input[3].pos;
        float4 p4 = input[4].pos;
        float4 p5 = input[5].pos;
    }
    Googling this issue has yielded results from 5+ years ago stating that Unity doesn't support it, and I was hoping things might have changed since then. Is it still just unsupported?
     
  2. bgolus
    Joined:
    Dec 7, 2012
    Posts:
    12,238
    Adjacency data isn’t supported, no. I know of no game engines that support it. It’s basically something that only gets used by academia.
     
    Milun likes this.
  3. Milun
    Thanks for the reply!

    ...Aw. I guess that explains why I could barely find any information on it. Alright then, I guess I'll have to search for something else. If it's not too much of a hassle, which method would you recommend I use if I want to draw only these edges?



    I know I can get an effect that's kind of close by drawing the model again, slightly enlarged along the normals, with inverted culling, but I don't think it would work for the ears (which are a 2D plane) or the horns (each horn is a tri with two verts in the exact same location; I had to do it this way since Unity doesn't seem to import loose vertices, i.e. ones not belonging to any face).
     
  4. bgolus
    I would recommend a camera depth normals texture based post process outline for the main parts of the mesh, and a line renderer for the antennae.

    There’s also a thread or two somewhere on this forum of someone attempting this as an “on object” post process, where they’re reading the camera depth normals texture in the object’s shader as an alternative to a full post process, so they could limit the effect to specific objects.

    The last option is you could try to store adjacency data for the mesh manually, either by shoving data into the mesh’s unused UVs, or by using a per-triangle index array / structured buffer set on the material.
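    A minimal sketch of that last option, written in Python for brevity (the function name and data layout here are my own, not a Unity API; in practice this would be a C# editor script). For each edge of each triangle, record the "opposite" vertex of the neighbouring triangle, with -1 marking boundary edges:

```python
# Sketch: build a per-edge adjacency array from a flat index buffer,
# mirroring the 3 extra indices triangleadj would supply per triangle.
# For each edge of each triangle we store the "opposite" vertex index
# of the neighbouring triangle, or -1 for an edge with no neighbour.

def build_adjacency(indices):
    edge_map = {}  # undirected edge -> list of (triangle, opposite vertex)
    tri_count = len(indices) // 3
    for t in range(tri_count):
        a, b, c = indices[3 * t: 3 * t + 3]
        for e0, e1, opp in ((a, b, c), (b, c, a), (c, a, b)):
            edge_map.setdefault(frozenset((e0, e1)), []).append((t, opp))

    adjacency = []  # 3 entries per triangle, one per edge
    for t in range(tri_count):
        a, b, c = indices[3 * t: 3 * t + 3]
        for e0, e1, _ in ((a, b, c), (b, c, a), (c, a, b)):
            others = [opp for (tt, opp) in edge_map[frozenset((e0, e1))] if tt != t]
            adjacency.append(others[0] if others else -1)
    return adjacency
```

    The resulting array could then be uploaded as a structured buffer and indexed by primitive ID in the geometry shader.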
     
    Milun likes this.
  5. Milun
    Hm... definitely going to have to look up some of those terms (I'm a bit of a shader noob), but now I'm at least pointed in the right direction! I'll give options 1 and 3 a try, and if all goes well, report my findings. Thank you very much!

    (P.S.: off topic, but I can't resist. Quite the coincidence that my character's head shares a resemblance to your avatar : P).
     
  6. Milun
    ...this is embarrassing, but could you give me a bit more information about that option please? I did some searching, but I'm not quite sure how that would work. I know about storing data in unused UVs (I'm storing line colours in one of them now), but even if I was to store the index of every vert that's adjacent to every other vert, I'm not sure how that would tell me what the 3 adjacent triangles are when it comes to the geometry step.
     
  7. Invertex
    Joined:
    Nov 7, 2013
    Posts:
    1,495
    Since you'd know the IDs of the adjacent verts, you'd have access to those verts' data, so you can get those verts' adjacent vert IDs as well and, voilà, you can construct the adjacent triangles. One potential downside to this solution is vertex poles with more than 4 edges, so you'd have to be careful with how your meshes are constructed.

    Going the structured buffer way could be easier to work with, and the data would only need to be sent when it's needed instead of being embedded in the mesh data that's being moved around everywhere.
     
  8. Milun
    Oh yeeeeee... that would work wouldn't it! Thanks for that; will give it a shot!
     
  9. Milun
    I'm back! Took me a while (not very good with shaders yet), but after learning about and using structured buffers, I managed to make some progress!



    This is exactly the effect I was after! (will share my code once I've cleaned it up).

    However... there was an additional rule I needed to set afterwards. In the image above, the rule is: if an edge has no adjacent tri, or if its adjacent tri has been culled as a backface, then draw it (works perfectly). The other rule I needed was to allow some edges to be drawn at all times. I accomplished this by setting UV values, but then this started to happen:



    (The one on the right was meant to have its mouth and feet always draw lines, like this):



    Edges marked in red are edges which belong to only 1 tri (aka, the edge edges). For whatever reason, if I export the verts with UVs set in Blender, it breaks my tri adjacency calculating .cs code, and I can't really figure out why.

    I guess I'll write back once I figure it out; just wanted to update.

    EDIT: Ok, figured out at least what's causing it. My triangle adjacency assigning script looked through the MeshFilter.sharedMesh.triangles array for triangles which shared two vert indexes. When I set the UVs in Blender, I make sure each tri in the mesh has its own UVs, and that I believe is what's causing them to "unlink" in the MeshFilter.

    EDIT2: After assigning the vert indexes in Blender to uv0.x; success!
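    In case it helps anyone hitting the same split-vertex issue: an alternative to baking vert indexes into uv0.x is to weld vertices by quantized position before searching for shared edges. A rough sketch of the idea, in Python for brevity with names I made up (the real version would live in the C# adjacency script):

```python
# Sketch: remap a triangle index buffer so that vertices sharing the
# same (rounded) position get one canonical ID. UV seams then no longer
# "unlink" triangles when searching for shared edges.

def weld_indices(indices, positions, scale=1e5):
    canon = {}   # quantized position -> canonical vertex ID
    remap = []   # original vertex index -> canonical ID
    for p in positions:
        key = tuple(int(round(v * scale)) for v in p)
        remap.append(canon.setdefault(key, len(canon)))
    return [remap[i] for i in indices]
```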
     
    Last edited: Jul 26, 2020
    bgolus and Invertex like this.
  10. Milun
    So... sorry it took so long (Unity is just a spare time hobby for me), but I've finally gotten around to both publishing my work to a nice Git, and also fixing the many bugs I discovered while cleaning it up and stress testing.

    You can find it here (where hopefully people will understand my ramblings. Years of working as the only programmer for a company has ironically improved my normal social skills, but reduced my ability to communicate with other programmers):
    https://github.com/Milun/unity-solidwire-shader

    Thanks again for all the help!

     
  11. 8Bit_Parallax
    Joined:
    Sep 23, 2015
    Posts:
    1
    Milun, great work! I was working towards the same effect and stumbled across your git files before I saw this thread. You did a great job documenting everything and commenting your code. I can get it working just fine with your Blender/Unity workflow, but I'm a 3DSMax guy, so with your permission, I might branch off it and try getting it working for my 3DS Max / Unity workflow. Max can import python scripts, so that shouldn't be too tricky. I might change your code a bit for some Max-specific options, and modify the Unity script to allow for some additional visual modifications for that Vectrex look (line intensity and colors affected by overlay graphics, etc.). Again, thanks for your perseverance (and everyone's assistance) in solving the 'shared visible face edge' puzzle. You saved me a lot of work and research!
     
  12. burningmime
    Joined:
    Jan 25, 2014
    Posts:
    845
    I'm gonna bump this 'cause... WTF? I just wasted about 3 days before I realized that it silently fails if you try to use triangleadj, putting the main verts in [0,1,2] while [3,4,5] are left as zero. This seems undocumented; in fact the Unity docs https://docs.unity3d.com/Manual/SL-ShaderPrograms.html point you to the HLSL docs, and if you look up geometry shaders there, you'll see https://docs.microsoft.com/en-us/windows/win32/direct3dhlsl/dx-graphics-hlsl-geometry-shader , which actually has an example using triangleadj.

    Is this something that could be added to URP by setting some state bits? This doesn't seem like something that would need much engine support; it would be done by the API and driver, right?
     
  13. bgolus
    The main things you need are to be able to set the mesh topology, and to supply an index array in the correct format. But Unity explicitly does not support the adjacency topology types for its internal mesh format, and it’d need to be set when the data is uploaded to the GPU, which happens fairly deep in Unity’s native code, so I have no idea how you could set it yourself.

    You could set up the mesh vertex data you need to be accessible to the shader using GraphicsBuffers and 2021.2’s Mesh.GetVertexBuffer() to reuse the existing mesh data, then construct an adjacency index list for your mesh that you pass in as a structured buffer, and use the primitive ID to get the offset into the adjacency data.

    And once you get that far, you might try putting all of that into a compute shader to construct the mesh with that rather than using a geometry shader, so it’s faster and works on platforms that support compute but not geometry shaders … like all Apple hardware.
     
    burningmime and lilacsky824 like this.
  14. burningmime
    The reason I wanted to do it in a geometry shader is because I want to be able to simply render the whole scene (or at least a large subset of renderers) in an SRP batch with a replacement material, then let it do dynamic batching, animation for skinned meshes, etc.

    I'm trying to do stencil shadows/shadow volumes by extruding the silhouette edges ( https://developer.nvidia.com/gpugem...-11-efficient-and-robust-shadow-volumes-using )

    Generating special adjacency data for every mesh is a no-go. Geometry Shaders seemed like the easiest way to just make that happen; I don't know if that's possible with compute shaders without totally rewriting pieces of the pipeline.
     
  15. Invertex
    I believe you could write a custom importer that stores adjacency data in the UV channels of the mesh, then do a re-import of all meshes, and it'll automatically take care of it all for you. Then you'll have access to this in compute and can avoid the runtime cost of calculating it.

    It's definitely possible with compute shaders in 2021.2 though (technically possible before then too, but more expensive due to the memory copies needed). With the GraphicsBuffers bgolus mentioned, you can get the GPU-side reference to a mesh's data to be used in the compute shader, without having to modify the pipeline to run mesh data through your compute.
     
    burningmime likes this.
  16. bgolus
    Adjacency data isn't something geometry shaders make "just happen"; they use pre-calculated index data. Just like a triangle mesh has to specify a list of triangles 3 vertex indices at a time, triangle adjacency is the same kind of data, with 3 vertex indices for the triangle and 3 more for the adjacency data. Since Unity doesn’t support calculating that data, you would have to do it yourself, no matter what.
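    For illustration, here's a small sketch (in Python; the function name is mine) of how those 6 indices per triangle interleave in the order D3D expects for triangleadj, to the best of my understanding: v0, adjacent-of-edge-v0v1, v1, adjacent-of-edge-v1v2, v2, adjacent-of-edge-v2v0:

```python
# Sketch: interleave plain triangle indices with per-edge adjacency
# indices into the 6-indices-per-triangle triangleadj layout.
# "adjacency" holds 3 opposite-vertex indices per triangle (-1, or a
# repeated triangle vertex, standing in for "no neighbour").

def interleave_adjacency(indices, adjacency):
    out = []
    for t in range(len(indices) // 3):
        v = indices[3 * t: 3 * t + 3]
        a = adjacency[3 * t: 3 * t + 3]
        out += [v[0], a[0], v[1], a[1], v[2], a[2]]
    return out
```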
     
    burningmime likes this.
  17. burningmime
    So the idea is to get the edge info in some sort of hash-like buffer in a compute shader, use that to find which edges are silhouettes, and extrude from there? At least for 2-manifold meshes that might work if I'm careful about how the hashing is done and collisions are resolved.
     
  18. bgolus
    No need for a hash. Vertex data is all in matching-length sets of arrays for each kind of data (position, normal, UV, color, etc.), where each index of the array is the data for a single vertex. You just need an int array that points to the adjacent vertex index per triangle edge, with some value like -1 to mark edges with no adjacent vertex.
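    To make the intent concrete, here's a rough Python sketch (my own naming, not from this thread's code) of how that per-edge array would be consumed for a silhouette test: an edge is a silhouette if it has no neighbour (the -1 marker, modelled as None here) or if the neighbouring triangle faces away from the view:

```python
# Sketch: silhouette test driven by a per-edge adjacency array.
# An edge is part of the silhouette when it has no neighbouring
# triangle, or when the neighbour is back-facing.

def sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def front_facing(p0, p1, p2, view_dir):
    # Counter-clockwise winding: the triangle faces the viewer when its
    # normal points against the view direction.
    return dot(cross(sub(p1, p0), sub(p2, p0)), view_dir) < 0

def is_silhouette_edge(tri, adj_opp, view_dir):
    # tri = positions of the current (front-facing) triangle; the edge
    # tested is p0-p1. adj_opp = the neighbouring triangle's opposite
    # vertex position, or None when the -1 "no neighbour" marker is hit.
    if adj_opp is None:
        return True  # boundary edge: always drawn / extruded
    p0, p1, _ = tri
    # The neighbour shares edge p0-p1 with reversed winding.
    return not front_facing(p1, p0, adj_opp, view_dir)
```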
     
  19. burningmime
    Ah; OK; that would probably work to emulate triangleadj as long as we assume that no more than 2 triangles share an edge.
     
  20. bgolus
    Also, this terminology makes me think you're thinking about meshes in the format they exist in outside the GPU. Generally speaking, every mesh used in a realtime 3D game is non-manifold, because each vertex can only hold one set of data, so there are lots of things that can create seams via split edges, most commonly UV seams. 2-manifold geometry basically doesn't exist in the real world of GPU rendering.
     
    burningmime likes this.
  21. burningmime
    Yup; that's the problem :). Which is why I think I need a spatial hash or something so that I can detect edges that don't share actual vertices, just position.
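    A sketch of that idea in Python (hypothetical names; quantizing positions stands in for a proper GPU spatial hash): key each edge by its two rounded endpoint positions, so coincident edges from split vertices land in the same bucket even though their indices differ:

```python
# Sketch: bucket triangle edges by quantized endpoint positions, so
# edges that share geometry but not vertex indices (split verts from
# UV/normal seams) still find each other.

def edges_by_position(indices, positions, decimals=5):
    def q(i):
        # Round to guard against float noise from export/import.
        return tuple(round(v, decimals) for v in positions[i])

    edges = {}  # frozenset of 2 quantized positions -> triangle indices
    for t in range(len(indices) // 3):
        a, b, c = indices[3 * t: 3 * t + 3]
        for e0, e1 in ((a, b), (b, c), (c, a)):
            edges.setdefault(frozenset((q(e0), q(e1))), []).append(t)
    return edges
```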
     
  22. burningmime
    Just to follow up: I am generating edge adjacency in a job here, which you can pass as a buffer to a compute shader. This could be adapted to other purposes that need triangleadj, since the output is just the edge indices:

    https://gitlab.com/burningmime/burn...b/master/Assets/src/graphics/ShadowEdgeJob.cs

    It seems MUCH slower to access and generate vertices in a compute shader than in a geometry shader (at least on my GPU). I think it's because the input assembler does some sort of magic with the vertex and index input to the VS/GS, while the CS needs to look up indices and then positions from within the shader code itself.

    Basically, something like this introduces latency that a geometry shader would not have:

    Code (CSharp):
    ByteAddressBuffer _vertices;
    ByteAddressBuffer _indices;
    uniform uint _vertexStride;

    [numthreads(64,1,1)]
    void someComputeShader(uint3 threadIds : SV_DispatchThreadID)
    {
        int3 tri = _indices.Load3(threadIds.x * 12);
        float3 a = asfloat(_vertices.Load3(tri.x * _vertexStride));
        float3 b = asfloat(_vertices.Load3(tri.y * _vertexStride));
        float3 c = asfloat(_vertices.Load3(tri.z * _vertexStride));
        // ...
    }
    A GS with triangle input would get that part for free.
     
    Last edited: Sep 9, 2021