
Graphics.DrawMeshInstanced

Discussion in 'Graphics for ECS' started by Arathorn_J, Jun 26, 2018.

  1. Arnold_2013

    Arnold_2013

    Joined:
    Nov 24, 2013
    Posts:
    256
    I fixed some errors by opening all Shader Graph / VFX Graph assets and re-saving them. Even shaders that I did not make in Shader Graph might be opened by Shader Graph when clicking Edit - I had this with URP/Shaders/AutodeskInteractive... just open it and save. The only catch is that the error is only triggered when the shader is actually used... so I hope I don't miss too many of them :)

    The URP "convert all materials" option has been moved to Window -> Rendering -> Render Pipeline Converter, but so far it has not been the magic bullet for me (it used to be my go-to solution for anything pink-shader related).

    In the editor I can run the game (I have not tried to build anything yet).

    Unity 2021.3.5f1
    Entities + Hybrid Renderer + ... 0.51.0-preview.32
    URP / Shader Graph / VFX Graph 12.1.7...
     
  2. lclemens

    lclemens

    Joined:
    Feb 15, 2020
    Posts:
    703
    The Render Pipeline Converter tools weren't much help. I'm not a Shader Graph expert, and I can't figure out how to open the shader in Shader Graph since the original shader was written by hand in code. From what I'm reading here: https://gamedev.stackexchange.com/questions/183836/convert-from-shader-to-shadergraph it sounds like there is no easy way to just open the .shader file - I either have to try to rebuild it in Shader Graph, or make a new Shader Graph file and integrate the existing functions as a black box. The first method might not even be possible, and the second method seems like it would still give me the same errors, since the shader code wouldn't change.
     
  3. thelebaron

    thelebaron

    Joined:
    Jun 2, 2013
    Posts:
    822
    @lclemens
    I had to change the metadata names used with UNITY_ACCESS_DOTS_INSTANCED_PROP_FROM_MACRO from a double underscore (__) to a single underscore (_), which could be the source of your issues
     
  4. lclemens

    lclemens

    Joined:
    Feb 15, 2020
    Posts:
    703
    I think I figured it out...

    I had to change these lines:

    Code (CSharp):
    #define _Color UNITY_ACCESS_DOTS_INSTANCED_PROP_FROM_MACRO(float4, Metadata__Color)
    #define _SpeedInst UNITY_ACCESS_DOTS_INSTANCED_PROP_FROM_MACRO(float, Metadata__SpeedInst)
    #define _CurTime UNITY_ACCESS_DOTS_INSTANCED_PROP_FROM_MACRO(float, Metadata__CurTime)
    #define _ClipIdx UNITY_ACCESS_DOTS_INSTANCED_PROP_FROM_MACRO(float, Metadata__ClipIdx)
    to these:

    Code (CSharp):
    #define _Color UNITY_ACCESS_DOTS_INSTANCED_PROP(float4, _Color)
    #define _SpeedInst UNITY_ACCESS_DOTS_INSTANCED_PROP(float, _SpeedInst)
    #define _CurTime UNITY_ACCESS_DOTS_INSTANCED_PROP(float, _CurTime)
    #define _ClipIdx UNITY_ACCESS_DOTS_INSTANCED_PROP(float, _ClipIdx)
    And now it works in 2021 with URP 12.1.7 and HRV2 0.51.
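    (For completeness, the C# side that feeds a per-instance property like _SpeedInst typically looks roughly like this - the struct name below is made up for illustration, and the exact MaterialProperty attribute signature depends on the Hybrid Renderer version, so treat it as a sketch:)

    Code (CSharp):
    using Unity.Entities;
    using Unity.Rendering;

    // Per-instance override matching the _SpeedInst shader property above.
    // (Hybrid Renderer 0.51-style attribute; newer Entities Graphics versions
    // drop the MaterialPropertyFormat argument.)
    [MaterialProperty("_SpeedInst", MaterialPropertyFormat.Float)]
    public struct SpeedInstOverride : IComponentData
    {
        public float Value;
    }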
     
  5. lclemens

    lclemens

    Joined:
    Feb 15, 2020
    Posts:
    703
    I have a question about this baked-vertex-animation-in-a-texture method... What is the point of baking the normals? All this time I have been baking both a vertex-position map and a normal map, because zulfajuniadi did it in his GitHub project and other people were doing the same in their vertex-animation projects. Every time I bake, I've been generating two textures as output.

    upload_2022-8-24_19-55-14.png

    But today I decided to make a lit version. I took the SimpleLit URP shader and modified it so that it used the encoded vertex-position texture, and for the normal texture I just used the regular normal map (not the generated one).

    upload_2022-8-24_20-1-43.png

    And as far as I can tell, it looks great! The shadows, lighting, and bump mapping are behaving just like I would have expected.
    ezgif-4-3f1a816974.gif

    And here is the plain unencoded normal map on the vertex-baked model (this model also has alpha clipping).

    upload_2022-8-24_20-9-55.png

    So.... this is probably a stupid question, but why have I been generating vertex-encoded normal maps all this time??
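    (For context, my bake step is doing roughly this for each frame - a simplified sketch, not the actual Animation Cooker code; the method name and texture layout are just illustrative:)

    Code (CSharp):
    using UnityEngine;

    public static class BakeSketch
    {
        // Snapshot one frame of the skinned mesh into the position and normal textures.
        public static void BakeFrame(GameObject rig, SkinnedMeshRenderer smr, AnimationClip clip,
                                     int frame, int frameCount, Texture2D posTex, Texture2D normTex)
        {
            // Pose the rig at this frame's time, then bake the deformed mesh.
            clip.SampleAnimation(rig, (frame / (float)frameCount) * clip.length);
            var baked = new Mesh();
            smr.BakeMesh(baked);

            Vector3[] verts = baked.vertices;
            Vector3[] norms = baked.normals;
            for (int v = 0; v < verts.Length; v++)
            {
                // One texel per vertex per frame: x = vertex index, y = frame index.
                // (A real baker would typically remap positions into a bounded 0..1 range first.)
                posTex.SetPixel(v, frame, new Color(verts[v].x, verts[v].y, verts[v].z));
                normTex.SetPixel(v, frame, new Color(norms[v].x * 0.5f + 0.5f,
                                                     norms[v].y * 0.5f + 0.5f,
                                                     norms[v].z * 0.5f + 0.5f));
            }
            Object.DestroyImmediate(baked);
        }
    }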
     
  6. Arathorn_J

    Arathorn_J

    Joined:
    Jan 13, 2018
    Posts:
    47
    If you don't bake the normals and transform them based on that output, you will get some really odd light reflections: surfaces that should be obscured will get direct-looking lighting, or will reflect in the wrong direction. For example, take a model in T-pose where a vertex on top of an arm has a normal of (0, 1, 0), straight up; several frames later the arm is down at the side and that vertex's normal should point out to the right of the model, like (1, 0, 0) - if you never update it, the lighting won't reflect properly. In your example above you added a bump/normal texture to overlay lighting detail on a textured surface, but it won't change the actual vertex normals, which you still need to update per frame.

    This is easier to visualize in Shader Graph, where you will see the "Normal" input for a bump/normal map and can also separately drive the Vertex Normal and Vertex Position.

    In the screenshot below I'm actually reading a vertex animation texture and calculating both the normal and the position from it, and I'm not populating the normal map at all (though I could if I needed to).

    upload_2022-8-24_21-53-30.png
     
    lclemens likes this.
  7. lclemens

    lclemens

    Joined:
    Feb 15, 2020
    Posts:
    703
    Thanks - that makes perfect sense!

    What about tangents? I don't know a whole lot about what tangents are used for, but several of the vertex-animation-texture libraries I've seen just ignore them. I think they're used for bump mapping along with the normals? Will I need them if I want that bump/normal texture to work?

    Hopefully tangents don't require too much precision, because the most I can squeeze out is 10 bits per axis before I would have to switch to a really large floating-point texture instead.
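    (For reference, the packing I have in mind is roughly this - just a sketch of the idea, not the actual baker code:)

    Code (CSharp):
    using UnityEngine;

    public static class PackSketch
    {
        // Pack a unit vector into 10 bits per axis (30 bits used, 2 spare).
        public static uint PackUnitVector(Vector3 n)
        {
            uint x = (uint)Mathf.RoundToInt((n.x * 0.5f + 0.5f) * 1023f);
            uint y = (uint)Mathf.RoundToInt((n.y * 0.5f + 0.5f) * 1023f);
            uint z = (uint)Mathf.RoundToInt((n.z * 0.5f + 0.5f) * 1023f);
            return x | (y << 10) | (z << 20);
        }

        public static Vector3 UnpackUnitVector(uint packed)
        {
            float x = (packed & 1023u) / 1023f * 2f - 1f;
            float y = ((packed >> 10) & 1023u) / 1023f * 2f - 1f;
            float z = ((packed >> 20) & 1023u) / 1023f * 2f - 1f;
            return new Vector3(x, y, z).normalized;
        }
    }

    10 bits per axis gives steps of roughly 0.002 across the -1..1 range, which I'd expect to be plenty for normals and tangents.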
     
  8. Arathorn_J

    Arathorn_J

    Joined:
    Jan 13, 2018
    Posts:
    47
    I'm not sure about the tangent issue, or whether it would actually need to be modified based on the vertex position. What I've done while developing my shader is run the original Mecanim animator in parallel, pausing frame by frame to compare the surfaces side by side and look for differences. Sorry I can't help more on the tangent calculation, maybe someone else can weigh in.
     
    lclemens likes this.
  9. DreamingImLatios

    DreamingImLatios

    Joined:
    Jun 3, 2017
    Posts:
    3,916
    Some lighting models use tangents, but not all. I know it is pretty hard to do brushed metals without it. You are much more likely to need it for an HDRP project I believe, but it has been a while since I have looked at all the shaders Unity uses across pipelines.
     
    lclemens likes this.
  10. lclemens

    lclemens

    Joined:
    Feb 15, 2020
    Posts:
    703
    Thanks for the info. I went ahead and implemented the tangents to see if it mattered in the modified Simple Lit shader I am using. I could not find a visible difference at all.

    To be honest - I think the models I'm using actually look better without the encoded normals texture. I'm not sure if it's a precision problem or what, but the bump mapping looks better without it in comparison to the original. On a practical level all of my character model movements are not very drastic (run, die, attack, stagger, idle) so it's not really noticeable that the encoded normals texture isn't being used, plus that is one less texture needed on mobile platforms. I think for the moment I will run without the encoded normal texture unless I see an obvious visual artifact.
     
  11. lclemens

    lclemens

    Joined:
    Feb 15, 2020
    Posts:
    703
    Something is perplexing me...

    I built an executable of my Animation Cooker which uses Vertex Animation and sent it to a friend of mine. We both benchmarked with the same high-vertex model and 100k instances (all on-screen). It's definitely GPU-bound because the CPU is hardly touched at all (around 4%).

    On my laptop with an RTX 2060, I got 9fps.
    On his laptop with an RTX 3060, he got 9fps.

    We retested with 10k instances and again got identical numbers.

    This was unexpected to say the least.... any explanations for this odd phenomenon?
     
  12. DreamingImLatios

    DreamingImLatios

    Joined:
    Jun 3, 2017
    Posts:
    3,916
    Not all mobile GPUs are integrated equally. Even if they have the same model number, they could have drastically different power schemes. It could also be that there was minimal generational improvement in whatever is the specific bottleneck for this algorithm.
     
  13. Arathorn_J

    Arathorn_J

    Joined:
    Jan 13, 2018
    Posts:
    47
    At a certain point, with however much data per frame is being sent to the GPU, you hit a throughput limit, and that can cause them to be pretty close. I gather you don't have any sort of frame syncing enabled in the quality settings?
     
  14. lclemens

    lclemens

    Joined:
    Feb 15, 2020
    Posts:
    703
    The app is set to run as fast as possible (before spawning anything it's in the 300fps+ range). But the player build defaulted to having VSync Count set to "Every V Blank". Maybe that's it?

    Another thing is that the build is set for full-screen native resolution. My friend's resolution is 2K and mine is 4K, and he still got the same number. Today I built a 1920x1080 player and ran it, and it got the same numbers as at 4K, so resolution doesn't make a difference.

    So it sounds like it's some sort of throughput bottleneck that the RTX 3060 mobile didn't improve upon over the RTX 2060 mobile. When 100k entities are running I can see in Task Manager that the GPU is pegged at 100% on my 2060. I know that if I choose a different model with fewer vertices the frame rate improves a lot, so I just figured it was triangle throughput, but I'm pretty sure a 3060 has higher triangle throughput than a 2060, so it must be something else.

    It's not super important that I discover the exact root cause or anything... I was just curious.

    Any chance one of you could try it quick on a desktop? Just set the spawn count to 100,000 and then hit the Spawn button and note the FPS. Alt-F4 to exit. https://drive.google.com/file/d/1So-ellBcqjl43GZbH_60u-b-5RcH48fn/view?usp=drive_link
     
  15. Rukhanka

    Rukhanka

    Joined:
    Dec 14, 2022
    Posts:
    171
    lclemens likes this.
  16. inSight01

    inSight01

    Joined:
    Apr 18, 2017
    Posts:
    86
    lclemens likes this.
  17. lclemens

    lclemens

    Joined:
    Feb 15, 2020
    Posts:
    703
    Thanks for running the test. That confirms our suspicions about some sort of bottleneck that doesn't scale with card performance... even the 3070 Ti just barely pulls ahead, while the 3060 desktop card and the 2060 laptop version perform the same.

    Using models with fewer vertices (like 600 to 700 or so) I can keep it above 40fps with 100k instances. Because the vertex count has a huge impact, my theory is that if I started using LODs it could go way higher. I'm not quite sure how to do that in DOTS yet (haven't really investigated it). It's on the todo list.
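    (From what I've read so far - not verified - the authoring side is just a regular LODGroup on the prefab, which the Hybrid Renderer is supposed to pick up during conversion. Roughly something like the sketch below, where the renderer references are placeholders:)

    Code (CSharp):
    using UnityEngine;

    public static class LodSetupSketch
    {
        // Normally you'd set this up in the inspector; shown in code for clarity.
        public static void AddLods(GameObject prefab, Renderer highDetail, Renderer lowDetail)
        {
            var lodGroup = prefab.AddComponent<LODGroup>();
            var lods = new LOD[]
            {
                new LOD(0.50f, new[] { highDetail }), // LOD0 while the object covers >50% of screen height
                new LOD(0.05f, new[] { lowDetail }),  // LOD1 down to 5% of screen height, then culled
            };
            lodGroup.SetLODs(lods);
            lodGroup.RecalculateBounds();
        }
    }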
     
  18. inSight01

    inSight01

    Joined:
    Apr 18, 2017
    Posts:
    86
  19. Arathorn_J

    Arathorn_J

    Joined:
    Jan 13, 2018
    Posts:
    47
    Yes, you definitely need to make sure VSync is disabled, otherwise it won't matter what else you do. As for any other issues, it can just be throughput bottlenecks, but the profiler should show you spikes for graphics calls when it's consistently trying to push through too much and the CPU is waiting on those bottlenecks.
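    (If the quality-settings dropdown doesn't seem to take effect in a build, you can also force it from a startup script - this is just the standard Unity API:)

    Code (CSharp):
    using UnityEngine;

    public class UncapFrameRate : MonoBehaviour
    {
        void Awake()
        {
            QualitySettings.vSyncCount = 0;   // don't wait for the vertical blank
            Application.targetFrameRate = -1; // remove any frame-rate cap
        }
    }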
     
    lclemens likes this.
  20. lclemens

    lclemens

    Joined:
    Feb 15, 2020
    Posts:
    703
    I set the "VSync Count" to "Don't Sync" and tested on my 2060 laptop and it didn't make any difference at all, regardless of the entity count. I don't know why.

    Lol! --- yeah, I think it's safe to say that there are spikes for graphics calls. :)
    upload_2023-5-26_11-30-15.png
     
  21. lclemens

    lclemens

    Joined:
    Feb 15, 2020
    Posts:
    703
    The technique is similar in that it bakes some things into textures, but it's different because the Crowd Animations asset uses GPU skinning - it records only the bone transforms into textures, whereas I'm recording vertex positions (the vertex animation technique). Vertex animation makes larger textures and is less flexible, but it is less work for the GPU, so it can handle more agents at a time. I put it on GitLab - https://gitlab.com/lclemens/animationcooker .
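    To give a rough sense of the size difference, some back-of-the-envelope math (the counts below are made up purely for illustration):

    Code (CSharp):
    // Illustrative numbers only - not from a real model.
    int vertexCount = 2000;   // vertices in the mesh
    int boneCount   = 30;     // bones in the rig
    int totalFrames = 300;    // baked frames across all clips

    // Vertex animation: one texel per vertex per frame.
    int vatTexels  = vertexCount * totalFrames;    // 600,000 texels

    // Bone-texture GPU skinning: roughly one 3x4 matrix (3 texels) per bone per frame.
    int boneTexels = boneCount * 3 * totalFrames;  // 27,000 texels

    The trade-off is that the bone approach still has to do the skinning math per vertex in the vertex shader, which is the extra GPU work I was referring to.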