
4,000 Adams @ 90 FPS

Discussion in 'Shaders' started by RecklessGames_, Mar 7, 2018.

  1. RecklessGames_

    RecklessGames_

    Joined:
    Jul 30, 2010
    Posts:
    222
    [Attached images: moveNewVertexpositoin2.PNG, moveNewVertexpositoin.PNG] In the top image, the mesh on the right is the original and the one on the left is after the shader change.
    The second image shows the vertex function in the shader.

    Hoping to find some insight into what I thought would be a relatively simple conversion of C# logic to shader code.


    I am trying to animate a MeshRenderer through a shader using a pre-baked animation texture.

    This all works fine and dandy in C# on the CPU side. All I'm doing is scaling the vertex positions for each frame into a texture in the 0-1 range, then, in a runtime C# script, reading the texture data back and moving the vertices of the mesh to the desired positions. Simply vertex = animMap.

    But when I use this same logic in the vertex shader function, it all goes sideways.

    C# success test.


    Finally got it close to completion. There is still distortion from floating-point precision loss in the pixel (R, G, B) to vector (x, y, z) conversion. Any insight would be great.

    4,000 Adams and 10,000 Adams


    I originally wanted to complete this project and fix all the issues before making a public repo, but life has other plans. So I cleaned up the code and minimized the project files as best I could, the largest being the Adam models. Anyway, without further ado: the 4,000 Adams @ 90 FPS repo is now public. I hope this helps others find their way to mastering large simulated crowds in Unity. :) (Warning: this is not a finished product in any shape, only a learning project.)

    Repo Link
    https://gitlab.com/RecklessGames5858/RealtimeCrowdSystem


    Standalone Test Application.
    Have Fun. :)
    4,000 Adams @ 90 FPS Standalone Unity Application.
    Unity GPU Instancing Standalone 23.2MB


    The shortcut for locking the camera is the L key; there is also an on-screen toggle.

    Lock Camera = L;

    Use the on-screen toggle to change between GPU skinning (non-instanced) and GPU instancing, then press the Spawn button.

    There is a simple UI for testing your own hardware and checking FPS. The GPU instancing path uses MaterialPropertyBlocks combined with texture arrays to allow multiple colors as well as different animations playing, all on the GPU/shader side. The only CPU-side code is the spawn logic and the UI.
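    For anyone curious how that looks on the shader side, here is a rough sketch of the idea (placeholder names like _AnimTextures, _AnimSlice and _AnimTime, not the repo's actual code): each instance gets its tint, animation slice and playback time from instanced properties set through a MaterialPropertyBlock, and the vertex shader fetches the baked position from the matching slice of a Texture2DArray.

    Code (HLSL):
    // Rough sketch only - placeholder names, not the actual repo shader.
    // Assumes positions baked into a Texture2DArray: one slice per animation,
    // one column per vertex, one row per frame, packed into the 0-1 range.
    #pragma target 3.5               // SV_VertexID and instancing need SM 3.5+
    #pragma multi_compile_instancing
    #include "UnityCG.cginc"

    UNITY_DECLARE_TEX2DARRAY(_AnimTextures);
    float _VertexCount;   // texture width  = vertex count of the mesh
    float _FrameCount;    // texture height = baked frame count

    UNITY_INSTANCING_BUFFER_START(Props)
        UNITY_DEFINE_INSTANCED_PROP(float4, _Color)      // per-instance tint (MaterialPropertyBlock)
        UNITY_DEFINE_INSTANCED_PROP(float,  _AnimSlice)  // per-instance animation (array slice)
        UNITY_DEFINE_INSTANCED_PROP(float,  _AnimTime)   // per-instance playback time, 0-1
    UNITY_INSTANCING_BUFFER_END(Props)

    struct appdata
    {
        float4 vertex : POSITION;
        uint   vid    : SV_VertexID;
        UNITY_VERTEX_INPUT_INSTANCE_ID
    };

    struct v2f
    {
        float4 pos   : SV_POSITION;
        float4 color : COLOR0;
    };

    v2f vert (appdata v)
    {
        v2f o;
        UNITY_SETUP_INSTANCE_ID(v);

        float slice = UNITY_ACCESS_INSTANCED_PROP(Props, _AnimSlice);
        float t     = UNITY_ACCESS_INSTANCED_PROP(Props, _AnimTime);

        // one pixel per vertex per frame; +0.5 hits the pixel center
        float2 uv    = float2((v.vid + 0.5) / _VertexCount, t);
        float3 baked = UNITY_SAMPLE_TEX2DARRAY_LOD(_AnimTextures, float3(uv, slice), 0).rgb;

        v.vertex.xyz = baked * 2.0 - 1.0;   // undo the 0-1 packing (unit-scale assumption)
        o.pos   = UnityObjectToClipPos(v.vertex);
        o.color = UNITY_ACCESS_INSTANCED_PROP(Props, _Color);
        return o;
    }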
     
    Last edited: Jul 11, 2018
  2. StaffanEk

    StaffanEk

    Joined:
    Jul 13, 2012
    Posts:
    380
    How does each vertex know which animation vertex it should sample from?
     
  3. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,352
    Without seeing more of the code I can't say for sure where the error is.

    I can say that if you're using the world-to-object matrix here, you're probably doing something wrong. Additionally, you shouldn't need to decode the w component, since that should always be 1.0 for a position.
     
  4. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,352
    I believe it's being set in an additional UV. Though you could just as easily use SV_VertexID.
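    For reference, the two options look something like this in the vertex input struct (a minimal sketch; SV_VertexID needs #pragma target 3.5 or higher):

    Code (HLSL):
    // Minimal sketch of how the per-vertex index can arrive.
    struct appdata
    {
        float4 vertex : POSITION;
        float2 uv2    : TEXCOORD1;    // option A: index baked into a spare UV channel
        uint   vid    : SV_VertexID;  // option B: index supplied by the GPU (SM 3.5+)
    };
    // Either v.uv2.x or v.vid then selects this vertex's column in the animation texture.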
     
  5. RecklessGames_

    RecklessGames_

    Joined:
    Jul 30, 2010
    Posts:
    222
    Well, from my super basic shader understanding (which must be wrong, lol), I thought that the vertex function inside the vertex shader runs over the vertices in the same order as you would get them on the CPU side when reading mesh.vertices[].

    So you would think of the vertex function more as a for loop running over the vertices of the mesh again and again. The line here, "float4 animMap = Tex2LOD(x+OffsetY,y,0,0);", is supposed to use the OffsetY value to pan over the texture. But I'm new to tex2Dlod as well; perhaps it's not feeding the right texel to the current vertex.
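    (For what it's worth, tex2Dlod expects a sampler plus a single float4 of (u, v, 0, mipLevel), and the u/v coordinates have to identify the current vertex's column and the current frame's row. A minimal sketch of that kind of lookup, with _AnimTex, _VertexCount and _FrameCount as placeholder names rather than the actual shader properties:)

    Code (HLSL):
    // Sketch only - placeholder names. One pixel per vertex per frame, packed into 0-1.
    sampler2D _AnimTex;
    float _VertexCount;    // texture width
    float _FrameCount;     // texture height

    // vertexIndex comes from SV_VertexID or a spare UV; frame is panned over time.
    float3 SampleBakedPosition(float vertexIndex, float frame)
    {
        float u = (vertexIndex + 0.5) / _VertexCount;   // this vertex's column
        float v = (frame + 0.5) / _FrameCount;          // this frame's row
        float4 animMap = tex2Dlod(_AnimTex, float4(u, v, 0, 0));
        return animMap.rgb * 2.0 - 1.0;                 // back from 0-1 to -1..1
    }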
     
  6. RecklessGames_

    RecklessGames_

    Joined:
    Jul 30, 2010
    Posts:
    222
    Is SV_VertexID set by Unity at runtime, and is it in the same order as the CPU-side Mesh.vertices[]?

     
  7. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,352
    As long as the meshes aren't being batched, yes.
     
  8. eron82

    eron82

    Joined:
    Mar 10, 2017
    Posts:
    83
    How do you select the animation?
     
  9. RecklessGames_

    RecklessGames_

    Joined:
    Jul 30, 2010
    Posts:
    222
    Currently, for testing animating a MeshRenderer from a shader/texture, there is a single texture per animation. So simply switching the texture and adjusting the desired frame count in the shader inspector will do it. I will need to devise my own method of storing multiple animations in one texture and passing those offsets to the shader.

    The shader accesses the texture pixel through the tex2Dlod() function, which returns a float4 (R, G, B, A) for the specific pixel desired; then I scale back from the 0-1 color range to a -1 to 1 scale and apply the position directly in the shader's vertex function.
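    A note on that decode: with a standard 8-bit texture each channel only has 256 steps, which is where the distortion mentioned above creeps in. Remapping through the real bounds of the baked animation (or baking to a half/float texture format) keeps the error smaller. A sketch of a bounds-based decode, with _BoundsMin/_BoundsSize as assumed material properties written by the baking script:

    Code (HLSL):
    // Sketch: decode a baked pixel back to an object-space position via the bake bounds.
    float3 _BoundsMin;    // minimum of all baked vertex positions (set by the baking script)
    float3 _BoundsSize;   // max - min of all baked vertex positions

    float3 DecodePosition(float3 packedRGB)
    {
        // packedRGB is 0-1; stretch it back over the original bounds
        // instead of a fixed -1..1 cube, so the 256 steps cover less space.
        return _BoundsMin + packedRGB * _BoundsSize;
    }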
     
  10. RecklessGames_

    RecklessGames_

    Joined:
    Jul 30, 2010
    Posts:
    222
    Successfully animating a MeshRenderer from the (GPU) shader!!! :)

    Thanks to the Unity community for directing me to the proper shader semantic, SV_VertexID, for proper vertex-to-pixel matching.

    Test video of 4,000 & 10,000 MeshRenderers animated by the GPU shader with instancing. No additional optimizations have been performed yet; this is just the shader and Unity's built-in instancing/batching system.

     
    sharkapps likes this.
  11. eron82

    eron82

    Joined:
    Mar 10, 2017
    Posts:
    83
    Amazing! I hope you will share this work!
     
    RecklessGames_ likes this.
  12. riba78

    riba78

    Joined:
    Feb 16, 2018
    Posts:
    33
    Great, you got a great result!!!

    I second eron82... I hope you'll share in more detail what you did, or sharing the project would be even better :)
     
    RecklessGames_ likes this.
  13. MadeFromPolygons

    MadeFromPolygons

    Joined:
    Oct 5, 2013
    Posts:
    3,983
  14. RecklessGames_

    RecklessGames_

    Joined:
    Jul 30, 2010
    Posts:
    222
    Gitlab actually. :)
     
  15. RecklessGames_

    RecklessGames_

    Joined:
    Jul 30, 2010
    Posts:
    222
    Yes, I will be sharing my work. :) But before I release it I want to get the normalize/denormalize right; as it currently stands the precision transfer is not 1:1. The Yi Fei Boon presentation didn't appear to have any loss in precision, so I know it's possible. I'm going to take another look into the vector-direction method, as I now see why it would not have precision loss, but my implementation isn't there yet. I feel I'm close. Thanks.
     
    MadeFromPolygons likes this.
  16. eron82

    eron82

    Joined:
    Mar 10, 2017
    Posts:
    83
    Is it possible to use a high-detail mesh, or is there some limit?
     
  17. RecklessGames_

    RecklessGames_

    Joined:
    Jul 30, 2010
    Posts:
    222
    Yes, it is possible, but every vertex in your model takes one pixel of texture space per animation frame, per animation. For example, a 10,000-vertex mesh with 60 baked frames already needs 600,000 pixels for a single animation.

    So if you're going to go the high-res model route, your textures may become very large. You're most likely going to want to get the texture animation packing method right, and possibly look into texture compression and use interpolation to extract the data.

    An interesting concept is presented here.


    And as always, you still have GPU instancing/batching limitations as well.
    https://docs.unity3d.com/Manual/GPUInstancing.html
     
    MadeFromPolygons likes this.
  18. eron82

    eron82

    Joined:
    Mar 10, 2017
    Posts:
    83
  19. RecklessGames_

    RecklessGames_

    Joined:
    Jul 30, 2010
    Posts:
    222
    I feel that is too broad to answer. Obviously, a lot of what it takes to meet your "best" criteria comes from the details of your project. Both methods have their pros and cons, and the "mesh flipbook" method can also be used to achieve the desired end result. Only understanding how they both work will give you a clear enough picture to decide.
     
  20. RecklessGames_

    RecklessGames_

    Joined:
    Jul 30, 2010
    Posts:
    222
    Standalone test application. The shortcut for locking the camera is the L key; there is also an on-screen toggle.

    Lock Camera = L;

    Use the on-screen toggle to change between GPU skinning (non-instanced) and GPU instancing, then press the Spawn button.

    There is a simple UI for testing your own hardware and checking FPS. The GPU instancing path uses MaterialPropertyBlocks combined with texture arrays to allow multiple colors as well as different animations playing, all on the GPU/shader side. The only CPU-side code is the spawn logic and the UI.

    Have Fun. :)
    4,000 Adams @ 90 FPS Standalone Unity Application.
    Unity GPU Instancing Standalone 23.2MB
     
    riba78 likes this.
  21. eron82

    eron82

    Joined:
    Mar 10, 2017
    Posts:
    83
    Nice! Is it possible to create a plugin?
     
  22. riba78

    riba78

    Joined:
    Feb 16, 2018
    Posts:
    33
    I've tested the standalone. Really, really nice!!!!
    That's not achievable without your instancing and animation method. I tried at home with a GTX 1070 and an i6700 (no overclock) at Full HD resolution. I spawned 18,000 Adams, moved the camera away to include all of the Adam models in the camera frustum, and the FPS stayed around 45-50.
    Simply fantastic... but now I'm too curious to see the whole system :)

    Seriously, I can understand if you don't want to share this; you could build an asset to sell. Otherwise, if you want to share it with us, that would be more than welcome :)
     
  23. RecklessGames_

    RecklessGames_

    Joined:
    Jul 30, 2010
    Posts:
    222
    So I found this article by Unity. Their method is more refined and utilizes a more advanced bone matrix -> vertex approach. The GitHub repo link to the project is in the article. This may be a better approach than my current direct translation of vertex to pixel. I'm surprised this wasn't linked when I googled GPU instancing before. Anyway, I hope this can help others achieve the massive crowd goals for their projects. :)

    https://blogs.unity3d.com/2018/04/16/animation-instancing-instancing-for-skinnedmeshrenderer/
     
    AVitt, riba78 and chelnok like this.
  24. eron82

    eron82

    Joined:
    Mar 10, 2017
    Posts:
    83
    Thank you!
     
  25. riba78

    riba78

    Joined:
    Feb 16, 2018
    Posts:
    33
    Thank you for your reply, but testing the project, it seems good but has worse performance than your demo.
    Have you tried it?
     
  26. MadeFromPolygons

    MadeFromPolygons

    Joined:
    Oct 5, 2013
    Posts:
    3,983
    Has anyone worked out how to get the unity version working with transitions?
     
  27. RecklessGames_

    RecklessGames_

    Joined:
    Jul 30, 2010
    Posts:
    222
    No, I haven't had the chance to run through the system from the Unity post; I just wanted others to see their options. I've been doing contract work, so I haven't had time to get back to my personal projects yet.
     
    riba78 likes this.
  28. RecklessGames_

    RecklessGames_

    Joined:
    Jul 30, 2010
    Posts:
    222
    I originally wanted to complete this project and fix all the issues before making a public repo, but life has other plans. So I cleaned up the code and minimized the project files as best I could, the largest being the Adam models. Anyway, without further ado: the 4,000 Adams @ 90 FPS repo is now public. I hope this helps others find their way to mastering large simulated crowds in Unity. :) (Warning: this is not a finished product in any shape, only a learning project.)

    https://gitlab.com/RecklessGames5858/RealtimeCrowdSystem
     
    AVitt and chelnok like this.
  29. RecklessGames_

    RecklessGames_

    Joined:
    Jul 30, 2010
    Posts:
    222
    Updated: all links are now in the top post of this thread for easier location. Enjoy.
     
  30. AVitt

    AVitt

    Joined:
    Oct 28, 2013
    Posts:
    13
    Wow, Reaper.
    Fantastic work, Reaper. You should call it 20,000 Adams @ 144 Hz, as that's what I was able to get with your demo. :)

     
  31. RecklessGames_

    RecklessGames_

    Joined:
    Jul 30, 2010
    Posts:
    222
    Thank you, I am glad you are having success with the project. I cannot view your YouTube link, however; I am met with a copyright block in my country, I guess. It says your video contains Sony Music Entertainment content? Not sure what that's about; I've never seen this before.
     

  32. AVitt

    AVitt

    Joined:
    Oct 28, 2013
    Posts:
    13
    Ah, it's because I used some music that's copyrighted. I've removed it now and the change is processing; it should work again for you soon.
     
  33. snacktime

    snacktime

    Joined:
    Apr 15, 2013
    Posts:
    3,356
    A couple of notes after having dug into the GPU skinning project Unity put out: https://github.com/Unity-Technologies/Animation-Instancing.

    The Unity project has a much better core approach, as it's based on calculating vertices from bone positions. If you don't care about correct cross fading then you don't need that, but I think that's a rather limited use case myself.

    The issue with the Unity project is that they don't blend bone weights for the vertex calculations in the shader. So cross fading similar animations, say walk/run, looks fine. Try cross fading two dissimilar animations and it starts to get ugly: you see a lot of stretching/distortion that's just not acceptable in most games.
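    To make that concrete, a per-vertex crossfade in the shader amounts to roughly the sketch below (my own illustration, not the Unity project's code): skin the vertex once with the outgoing animation's blended bone matrix, once with the incoming animation's, and lerp by the fade weight.

    Code (HLSL):
    // Sketch only. skinA/skinB are the weight-blended bone matrices for this vertex
    // under the outgoing and incoming animation (each built from its own 4 bone fetches).
    float3 CrossfadeVertex(float3 objectPos, float4x4 skinA, float4x4 skinB, float fade)
    {
        float3 posA = mul(skinA, float4(objectPos, 1.0)).xyz;  // pose from animation A
        float3 posB = mul(skinB, float4(objectPos, 1.0)).xyz;  // pose from animation B
        return lerp(posA, posB, fade);                         // fade: 0 = all A, 1 = all B
    }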

    The good thing is it's fixable; the bad thing is they built up a lot of abstractions that are wrong for a correct solution, one where the shader is aware of all the animations involved, not just a notion of the 'current' animation.

    I managed to refactor it so the shader gets all the animations, so I have a rough version of layer blending. I don't yet have the bone weight blending calculations, but the shader now has the information it needs to do them. Cross fading out of a blend needs some additional normalization of values over both animations that I haven't done yet. The code logic is still a bit twisted, since while I refactored some stuff I was trying to be non-intrusive. But by the time I'm done, most of the existing abstractions will need to go.

    I'll likely open source what I have once it's working well. But it's very focused on PC games where characters are assumed to have customization. So, for example, I just removed all the mobile-specific stuff, went from 16-bit to 32-bit textures, and from half4 to float4 for some stuff in the shader. I also removed instanced rendering, since the project had a ton of complexity just for that; I might add instanced rendering back in once I get it all cleaned up.

    But anyway, the larger point is: if you want to try something like this, make sure you start with an approach that fits your needs. In most cases that's going to be what the Unity project does, not the 9,000 Adams approach.
     
    AVitt likes this.
  34. AVitt

    AVitt

    Joined:
    Oct 28, 2013
    Posts:
    13
    Hi snacktime, yes, that Unity blog is the approach I was originally intending to achieve. However, it didn't exist when I first posted ;)

    I just had trouble calculating the bone transforms correctly, storing them in the texture, and then moving the verts based off that. Reaper kindly helped out and got something working by storing the vertex positions as in the Adam demo, but without lighting/shadows and with some distortion, so there is room for improvement.

    I haven't been able to get the Unity demo working properly yet, though; maybe you could help with that?

    I'm not too fussed about animation transitions right now and just want to get a basic implementation going. It would be nice to have attachments and the ability to have some mesh variation too, but worst case I can do without them.

    Looking at their shader, it is very similar to my original, so it's reassuring that I'm on the right track. I'm not developing for mobile, so their solution isn't ideal for me, plus the fact that I can't get it to work. I'd appreciate a look at your project if that's possible. I'd like to get something working for the weekend, as there is a game dev competition this weekend I would like to use this in.
     
  35. snacktime

    snacktime

    Joined:
    Apr 15, 2013
    Posts:
    3,356
    My version of the code in that instancing project has diverged significantly at this point, and it's in my actual game, not a separate project. But if you could post some information about what isn't working, I might be able to help.

    Their mobile-specific stuff doesn't affect the core flow. My version is targeted at PC game characters, assuming they will all have custom clothing, so I ripped out all the mobile stuff as well as the instancing. I also removed support for multiple skinned meshes and materials, as that added a ton of complexity which IMO is better solved by baking into a single skinned mesh/atlas to start with. I also couldn't handle the codebase itself, kind of a mess, so I've already refactored a lot of it, moving it toward an ECS system with the per-frame bone calculations moved to jobs.
     
  36. AVitt

    AVitt

    Joined:
    Oct 28, 2013
    Posts:
    13
    That would be great, thank you. I understand; your project sounds cool, I hope it goes well.

    Regarding the Unity blog's Animation Instancing project: when I first load the project, it re-imports all the assets, and some of the scripts are not hooked up to the script components on the game objects in the example project. This creates errors and I'm not sure which scripts to add to fix it.

    It complains about not having a prototype object, and again I'm not sure what to put in it.

    You need the AnimationInstancing script attached to generate the animation texture; is it needed at runtime?

    I can build the asset bundle and run it, but with instancing enabled the characters spawn, nothing is actually instanced, and FPS drops after 100+ characters. When instancing is disabled, things spawn but nothing is visible/rendered.

    I wonder if an earlier commit will work better?

    I'm going to try to reverse engineer their code for storing the bone matrices and implement it in my project. I think the calculation I was missing was:

    SkinMatrix = rootBoneMat * skinnedBoneMat * bindPoseBoneMat;

    Then encode the SkinMatrix into a texture.
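    On the GPU side that works out to something like the sketch below (the texture layout and property names are my assumptions for illustration, not the Animation-Instancing project's exact format): each baked frame stores every bone's skin matrix as a run of texels, the vertex carries its bone indices and weights in extra vertex channels, and the vertex shader rebuilds and blends the matrices.

    Code (HLSL):
    // Sketch only - assumed layout: 4 texels per bone (one per matrix row),
    // one texture row per baked frame. Placeholder property names.
    sampler2D _BoneMatrixTex;
    float _BoneTexWidth;    // = boneCount * 4 texels
    float _BoneTexHeight;   // = baked frame count
    float _CurrentFrame;

    float4x4 LoadSkinMatrix(float boneIndex)
    {
        float v = (_CurrentFrame + 0.5) / _BoneTexHeight;
        float4 rows[4];
        for (int i = 0; i < 4; i++)
        {
            float u = (boneIndex * 4.0 + i + 0.5) / _BoneTexWidth;
            rows[i] = tex2Dlod(_BoneMatrixTex, float4(u, v, 0, 0));
        }
        return float4x4(rows[0], rows[1], rows[2], rows[3]);
    }

    // In the vertex function, with boneIds/boneWeights read from extra vertex channels:
    //   float4x4 skin = boneWeights.x * LoadSkinMatrix(boneIds.x)
    //                 + boneWeights.y * LoadSkinMatrix(boneIds.y)
    //                 + boneWeights.z * LoadSkinMatrix(boneIds.z)
    //                 + boneWeights.w * LoadSkinMatrix(boneIds.w);
    //   v.vertex = mul(skin, v.vertex);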

    Here's my original thread https://forum.unity.com/threads/when-animated-gpu-skinned-instancing-goes-wrong.512279/#post-3578426
     
  37. snacktime

    snacktime

    Joined:
    Apr 15, 2013
    Posts:
    3,356
    Forget the demo scene; just start from scratch. Take a character with a single skinned mesh, an animator, and a controller, put the instancing script on it, and then run the generator. The prototype is your character itself for the simple case; it's just a reference to itself. The system uses that to determine the name linked to the data.

    Now just make sure the material on your skinned mesh is using an instancing shader, and it should work.
     
  38. snacktime

    snacktime

    Joined:
    Apr 15, 2013
    Posts:
    3,356
    Also note they store the bone indices for each vertex in UV2.
     
  39. AVitt

    AVitt

    Joined:
    Oct 28, 2013
    Posts:
    13
    Thanks, I tried an earlier version, which worked. It allowed me to build the asset bundle for PC; not sure if that was the difference.

    I will follow your advice and try it with my own character mesh. How would I go about spawning them once I've done that? Can I use their AnimationInstancingMgr, or should I just create my own logic for that?

    I found the performance isn't as good, which I expected, but I was noticing an FPS drop around 900 characters. From your experience, are there still optimizations to be made? I know using ECS as you have would help a lot with the CPU side of things; I think they have more logic on each instance than what I need. I have a boid simulation, which is ideal for jobifying. Before that I was considering offloading it to the GPU too, but I think ECS would be optimal. That wasn't available when I was first working on this.
     
    Last edited: Aug 10, 2018
  40. snacktime

    snacktime

    Joined:
    Apr 15, 2013
    Posts:
    3,356
    Basically the 'prototype' is your prefab of sorts. So spawn multiple copies of that, but the prototype field has to link back to the original, because they key off of the name internally.

    There are a lot of optimizations to be had on the CPU side. For every unique mesh/material combo they loop through every single animation index just to get to one animation. Even though they have all that information in the build-bones pass, they could have just built up an array at that point and saved all the later complex layered iteration.

    In the process of adding layers, which meant sending more data to the GPU per frame, I also consolidated all the property block calls into one, using a single Matrix4x4 to send it all.

    I fixed up those things and moved everything that could be into jobs. So I have one loop that sets some stuff like the position and local-to-world matrix on the transform; it does the minimum and then runs a job which does the rest of the calculations, resulting in a simple list of draw commands that have the info to make a single property block set call and then issue DrawMesh.

    So work on the main thread is down to only what actually has to be there: Unity APIs that are not thread-safe.

    I'm using DrawMesh, so I have a much higher CPU cost as I scale; instancing would be much better. If I were optimizing for really high instance counts, I would add a transform access job to get that one calculation loop off the main thread. With 500 non-instanced characters it uses just under half a millisecond.
     
    AVitt likes this.
  41. peeka

    peeka

    Joined:
    Dec 3, 2014
    Posts:
    113
    I tried this and got it to work, but I can't get Enable Attachment to work. I changed the code from
    allTrans.RemoveAll(q => boneTransform.Contains(q));
    to
    allTrans.RemoveAll(q => !boneTransform.Contains(q));

    so that the bone I want shows up, but after generating the animation, no bone follows the animation.
     
  42. AVitt

    AVitt

    Joined:
    Oct 28, 2013
    Posts:
    13
    OK, I'm able to spawn 1000+ characters with my own mesh now with very little performance hit, which is great!

    Now, how do I control the animations for those instances?

    At the moment they just repeat the first animation, which is "jump start"; not the best for 1000 characters, as it makes me feel nauseous o_O
     
  43. twobob

    twobob

    Joined:
    Jun 28, 2014
    Posts:
    2,058
    So.. you will...
    Blah
    No, You won't :)
     
    Last edited: Sep 10, 2018
  44. snacktime

    snacktime

    Joined:
    Apr 15, 2013
    Posts:
    3,356
    My track record for releasing open source work is better than most here by a fair margin.

    The question isn't whether I would release it; it's a matter of quality. I started with a codebase that's a mess, and I'm not putting my name anywhere near that until I have time to refactor it. That could be a few months, since the work I did was primarily to prove the option was viable, but I don't actually need it until a later point in our game.
     
    chelnok and twobob like this.
  45. twobob

    twobob

    Joined:
    Jun 28, 2014
    Posts:
    2,058
    Lol. I know. I follow your repos on Twithub. Was just poking fun. Forgive me
     
    chelnok likes this.