
MaterialPropertyBlock for each submesh and SetPass counts in HDRP

Discussion in 'High Definition Render Pipeline' started by syscrusher, Jan 7, 2020.

  1. syscrusher (Joined: Jul 4, 2015, Posts: 1,104)
    I have C# code that generates a pair of procedural meshes which are, by design, very low-poly. There can be a lot of instances of these pairs in a scene, and the two meshes always use the same Material but with different per-GameObject parameters passed to the shader (which is a custom shader I've created) via a MaterialPropertyBlock.

    To improve performance, I procedurally combine these two meshes during initialization or any time the input meshes change at runtime (which is rare but possible). I create a single combined mesh with two submeshes, and I assign the same Material to each submesh. I can't (currently) allow the submeshes to be merged, as I'll explain below.
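
    The combine step looks roughly like this (a sketch from memory, not my actual source; it assumes the two source meshes are already in the right local space):

    Code (CSharp):
    var combine = new CombineInstance[]
    {
        new CombineInstance { mesh = meshA, transform = Matrix4x4.identity },
        new CombineInstance { mesh = meshB, transform = Matrix4x4.identity },
    };

    var combined = new Mesh();
    // mergeSubMeshes: false keeps each source mesh as its own submesh,
    // so the renderer can still address them by material slot.
    combined.CombineMeshes(combine, mergeSubMeshes: false, useMatrices: true);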

    In order to have a separate MaterialPropertyBlock for each submesh, I've found that I have to treat my single Material as if it were two different ones, i.e. something like the following (approximate; I don't have my source in front of me right now):

    Code (CSharp):
    myMeshRenderer.sharedMaterials = new Material[] { myMaterial, myMaterial };
    This is because the Renderer.SetPropertyBlock() method's second parameter is the material index integer -- in other words, there isn't a way within the mesh to indicate which Material should apply to particular triangles.
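
    The per-submesh property blocks then go through the materialIndex overload of SetPropertyBlock(), roughly like this (the property names and the "_ColorsSwapped" flag are stand-ins for my real shader parameters, purely to illustrate the pattern):

    Code (CSharp):
    var blockA = new MaterialPropertyBlock();
    blockA.SetColor("_PrimaryColor", primaryColor);
    blockA.SetColor("_SecondaryColor", secondaryColor);
    myMeshRenderer.SetPropertyBlock(blockA, 0);   // material index 0 -> first submesh

    var blockB = new MaterialPropertyBlock();
    blockB.SetColor("_PrimaryColor", primaryColor);
    blockB.SetColor("_SecondaryColor", secondaryColor);
    blockB.SetFloat("_ColorsSwapped", 1f);        // stand-in for whatever differs per submesh
    myMeshRenderer.SetPropertyBlock(blockB, 1);   // material index 1 -> second submesh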

    All of the above works perfectly, but I'm getting more SetPass() calls than I'd like. The shader already has two passes because I need a depth-only pass (I'm working to eliminate that requirement, but that's a different topic). With two submeshes, I now get four SetPass() calls per procedural object. If I can combine the submeshes fully, that would reduce from four to two SetPass() calls, and if I can further solve the depth pass problem (which I believe I can), then I'd be down to just one SetPass().

    My custom shader receives (among other parameters) two Color values, one for the "primary color" and one for the "secondary color". Every triangle is rendered using these two colors only, but the details of which color goes where (within the triangle) are exactly opposite between the two submeshes. Other parameters govern some additional cosmetics, but those other parameters are common to both submeshes. In other words, the fragment portion of the shader could be given the same primary and secondary colors for both submeshes -- as long as there is a way for it to determine which submesh it is currently rendering.

    If I could do that, then I wouldn't even need two submeshes -- I could combine the meshes fully, have just one Material and one MaterialPropertyBlock (because the colors are still local to each GameObject).

    To work around this, the only method I've come up with so far is to stash some extra data in an otherwise-unused data channel of the mesh. My custom shader is unlit and textureless, so the UVs aren't used now, nor will they be in the future, which means I could repurpose one of the higher-numbered UV channels. Alternatively, I could use the vertex colors. In either case, the extra data per vertex would simply be the answer to "which submesh contains this vertex?"
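
    The tagging itself would be trivial once the combined mesh exists -- something along these lines for the vertex-color option, given the combined mesh from the sketch above (untested):

    Code (CSharp):
    var colors = new Color[combined.vertexCount];
    for (int sub = 0; sub < combined.subMeshCount; sub++)
    {
        // After CombineMeshes the submeshes don't share vertices, so every vertex
        // referenced by this submesh's index buffer can safely carry its submesh index.
        foreach (int vertexIndex in combined.GetIndices(sub))
            colors[vertexIndex] = new Color(sub, 0f, 0f, 1f);
    }
    combined.colors = colors;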

    If you're still with me after all of this lengthy context, thank you! Here, finally, are my questions:
    1. I haven't found such a thing yet in the documentation, but is there an existing helper function in the shader language that I can use to obtain the integer submesh index? If I can do that, problem solved, and I don't need to stuff any additional data into the mesh.
    2. Failing #1, is there a helper function for the fragment shader to obtain the triangle number (that is, the subscript of the current triangle in the Mesh.triangles array)? Again, if I can interrogate that data, I can solve the problem easily.
    3. Is it a really terrible idea to put values into a secondary UV channel, or the vertex color channel, that my custom shader interprets in unusual ways? It occurs to me that weird values in a UV channel might cause a problem if my Material somehow got switched to a different shader, whereas any arbitrary RGBA data is in theory allowed in a color channel; at worst it might produce some really ugly colors, but it wouldn't actually break anything. So I'm thinking the color channel would be the safer choice, if I really must go down this road.
    4. Does anyone have an alternative method I might not have considered?
    5. Am I more worried about the number of SetPass() calls than I should be? In particular, just how much real-world performance impact will I see from a depth-only pass?
    I suspect #1 and #2 are not viable, because the member properties of the Mesh class are specific to Unity and don't necessarily correspond directly to entities known to the shader program.

    With respect to #3, it's extremely unlikely my Material would get assigned to the wrong shader, because I create the Material (a shared static member) procedurally during initialization.
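
    For what it's worth, that creation looks roughly like this (the shader name here is a placeholder, not my actual shader):

    Code (CSharp):
    private static Material s_sharedMaterial;

    private static Material SharedMaterial
    {
        get
        {
            if (s_sharedMaterial == null)
            {
                // "MyProject/TwoColorUnlit" is a placeholder for the custom shader's name.
                s_sharedMaterial = new Material(Shader.Find("MyProject/TwoColorUnlit"));
            }
            return s_sharedMaterial;
        }
    }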

    Any suggestions or comments welcome. Thanks!
     
  2. gnostici (Joined: Jul 27, 2013, Posts: 23)
    I can't get submeshes to render AT ALL. I can't get hierarchical transforms to work AT ALL.

    I'm beginning to think UI is just plain impossible in Unity ECS. I find people all over having some of the same problems I'm having, and absolutely none of the offered advice has any effect whatsoever. Then there are the handful who say they have it working, yet post no code at all. Meanwhile, we have absolutely zero examples of this functionality in the documentation or even on the official GitHub.

    For a week or so now I've been grinding away at just having quads pointed at the camera that can be moved in the viewport plane, keeping the same screen position regardless of camera angle. I've found that using the Translation and Rotation components only keeps a quad still in the viewport when it sits precisely on the camera's forward vector, one z-unit in front of the camera. Anywhere else at all -- any translation on any axis -- and the quads oscillate on that axis as the camera rotates.

    It's looking hopeless.
     
  3. syscrusher (Joined: Jul 4, 2015, Posts: 1,104)
    My code isn't in ECS, and since I'm still prototyping it's not clean enough to publish yet. If you think some non-ECS code would help, I can try to piece together the sections of my code that deal with this functionality and put it here as snippets, but it would be very much "some assembly required". I didn't create the code with any notion of publishing it as a tutorial; it's just a test harness to help me figure out how things work so I can design the real code in a way that will actually function.
     
  4. gnostici (Joined: Jul 27, 2013, Posts: 23)
    I made the noob mistake of clicking the wrong subforum while having the noob experience of being frustrated. Then I thought I found someone else having a similar issue. Sorry about that.
     
  5. syscrusher (Joined: Jul 4, 2015, Posts: 1,104)
    No harm, no foul. :) Hope you are able to solve your ECS issue.
     