
Dual quaternion skinning for Unity

Discussion in 'General Graphics' started by mr_madcake, Oct 22, 2017.

  1. mr_madcake

    Joined:
    Jul 17, 2017
    Posts:
    94
    Hi guys.

    I was discouraged to see that Unity does not support dual quaternion skinning, so I took the time and implemented it myself.

    Comparison default skinning vs DQ:



    Features:

    • volume preserving deformations (bye-bye candy wrapper and collapsing shoulders)
    • GPU skinning with compute shaders (only)
    • blend shape support (calculations performed in compute shader)
    • works with any platform that supports compute shaders (Dx11, Vulkan, OpenGL 4.3+)
    • zero GC allocations per frame
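    For readers new to the technique, the core per-vertex math of dual quaternion blending (DLB) can be sketched outside Unity. This is an illustrative Python sketch of the standard formulation, not this repository's compute shader code; all function names are mine:

    ```python
    # Illustrative sketch of dual quaternion skinning (DLB), not the repo's shader.
    # Quaternions are (w, x, y, z) tuples; a dual quaternion is a (real, dual) pair.

    def q_mul(a, b):
        aw, ax, ay, az = a
        bw, bx, by, bz = b
        return (aw*bw - ax*bx - ay*by - az*bz,
                aw*bx + ax*bw + ay*bz - az*by,
                aw*by - ax*bz + ay*bw + az*bx,
                aw*bz + ax*by - ay*bx + az*bw)

    def q_conj(q):
        w, x, y, z = q
        return (w, -x, -y, -z)

    def dq_from_rt(r, t):
        """Dual quaternion for 'rotate by unit quaternion r, then translate by t'."""
        dual = q_mul((0.0,) + tuple(t), r)        # t as a pure quaternion
        return r, tuple(0.5 * c for c in dual)    # dual part = 0.5 * t_quat * r

    def dq_skin(point, bone_dqs, weights):
        """Blend weighted bone dual quaternions, normalize, transform the point."""
        ref = bone_dqs[0][0]
        real, dual = [0.0] * 4, [0.0] * 4
        for (r, d), w in zip(bone_dqs, weights):
            # antipodality fix: flip sign so rotations blend on the same hemisphere
            s = w if sum(a * b for a, b in zip(r, ref)) >= 0.0 else -w
            real = [a + s * b for a, b in zip(real, r)]
            dual = [a + s * b for a, b in zip(dual, d)]
        n = sum(c * c for c in real) ** 0.5       # normalize by real-part magnitude
        r = tuple(c / n for c in real)
        d = tuple(c / n for c in dual)
        rotated = q_mul(q_mul(r, (0.0,) + tuple(point)), q_conj(r))[1:]
        trans = tuple(2.0 * c for c in q_mul(d, q_conj(r))[1:])  # t = 2 * d * r*
        return tuple(a + b for a, b in zip(rotated, trans))
    ```

    Unlike linear blend skinning, the blend happens in rotation space rather than on matrices, which is what preserves volume at twisting joints.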

    github repository


    Is it free?

    The repository is licensed under the MIT License

    A short and simple permissive license with conditions only requiring preservation of copyright and license notices. Licensed works, modifications, and larger works may be distributed under different terms and without source code.

    (Yes)



    P. S.
    The messages below relate to an earlier version of the project.
     
    Last edited: Jul 29, 2019
  2. Peter77

    QA Jesus

    Joined:
    Jun 12, 2013
    Posts:
    6,609
  3. mr_madcake

  4. neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,566
    I thought about doing this kind of project, never had the time.
     
  5. nat42

    Joined:
    Jun 10, 2017
    Posts:
    353
    Ironic URL :p

    Looks really interesting; I'm not up with such things, but why use a fragment shader for the transformation?
     
  6. mr_madcake

    You need to blend [as many as you have vertices] quaternions and apply the resulting quaternions to the vertex positions, which is a hell of a lot of quaternions :). Doing this on the CPU is a total performance killer (I got about 1-2 fps). The good part is that the vertices do not depend on each other and can be calculated in parallel. So why not use the GPU for this?

    I dump the vertex positions, bone indexes, bone weights and quaternions into textures, send them to the GPU for calculation and retrieve the result. It is vastly faster than doing it on the CPU.

    I could probably use compute shaders, but I don't see any significant advantage, and I also did not have time to learn them properly.

    One thing that confuses me about compute shaders is that I have to manually select a constant number of threads, which means I could either use fewer threads than are available, ruining performance, or request more threads than the GPU can provide, so that my shader fails to run.
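    For what it's worth, the usual compute-shader pattern sidesteps this concern: the thread-group size is a fixed constant compiled into the kernel, and the number of groups is chosen per dispatch to cover the workload, with surplus threads returning early after an index check. A sketch of the sizing arithmetic (generic, not Unity API code; the group size of 64 is an arbitrary example):

    ```python
    # Number of thread groups so that groups * group_size covers all items;
    # threads whose index >= item_count simply return early in the shader,
    # so no query of the GPU's available thread count is needed.
    def dispatch_groups(item_count, group_size=64):
        return (item_count + group_size - 1) // group_size  # ceiling division
    ```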

    Also, not every platform supports compute shaders, while fragment shaders work everywhere (or almost everywhere).

    The biggest problem with my approach (I don't know whether it can be solved with compute shaders) is that I get the result of my calculations as a RenderTexture. To get the data back to the CPU I convert it to a Texture2D and call GetPixels().

    It works fine, but every GetPixels() call allocates a new Color[] array, and this forces regular GC.Collect() runs. It is still bearable, but could easily be prevented if GetPixels() accepted an already allocated buffer for its data. It turns out I am not the only one who has stumbled upon this problem: https://feedback.unity3d.com/suggestions/garbage-free-texture2d-dot-getpixels
    If somebody knows how to deal with this bottleneck I would appreciate your input.
     
  7. hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Also, the readback from the GPU causes a stall, as the GPU has to flush everything it's doing, making this technique pretty much only useful for movie making. No bad thing!
     
  8. mr_madcake

    I'm just speaking theoretically here, as I haven't tested it properly yet, but I suppose that in general the number of skinned vertices would be a lot smaller than the number of pixels on screen, so it shouldn't be much of a problem.

    If you know a better way to do this, or which potential problems could arise, please do tell.

    Also, if this script turns out too heavy to replace SkinnedMeshRenderer, you could probably use it at least for the main character, or dynamically enable it based on distance as a kind of LOD.
     
  9. mr_madcake

  10. nat42

    I thought CPU performance would be within an order of magnitude of linear blend skinning, so on the CPU no worse than, say, 3 times slower than doing LBS on the CPU. But it was the use of a fragment shader I was questioning, more than the wish to use the GPU (and there's so much I don't know about Unity that it's reasonable to assume it's my ignorance).

    That sounds less than ideal... I thought generally you'd want to keep the data on the GPU and render it without the CPU massaging the data further. I figured you might use a custom vertex shader to read the data from the texture, but then you've kind of implemented a poor man's version of "transform feedback"/"stream out" (and naively running the transforms in a vertex shader without transform feedback as the material would make you pay to re-transform for every pass)... I'm not sure whether Unity exposes "transform feedback"/"stream out" or keeps such tricks all to itself.

    Well, not all fragment shaders work everywhere, and whether it's efficient everywhere is another matter, though I'm not certain either case necessarily applies here... but if you do want that kind of portability, your textures would need POT dimensions, be limited to 2048 pixels wide/high, and be limited to 4 8-bit channels, right? (too lazy to double-check when that ceased to apply)

    Might be worth double-checking, but I think compute shaders are generally able to spit out data in a form you can feed directly to be rendered, and I'd guess Unity would probably let you use this? (I think I saw a project using compute shaders for skinning on GitHub earlier; might be worth seeing how others have approached similar problems.)

    EDIT: Sorry, took so long posting I missed the additional responses.

    EDIT2: Also, as hippocoder hints at, you really don't want a "stall". Remember the GPU is usually rendering a frame or two ahead; a stall here means the CPU must wait for the GPU to finish everything it has batched up, including but not only your transforms, so it can give you the results (see: https://blogs.msdn.microsoft.com/shawnhar/2008/04/14/stalling-the-pipeline/ ).
     
    Last edited: Oct 23, 2017
  11. mr_madcake

    Indeed it is. However, I struggle to find a better way to do it. :(

    AFAIK you can add custom fields to the vertex structure, but the Mesh class will not support the new fields, which kind of defeats the purpose.

    That's still better than what I currently have.
    However, to keep it efficient I would need a separate UV channel for this. I don't think using uv4 is that bad; I doubt people ever use that many. But I might be wrong.

    I'm going to look for info about vertex shaders now, as I'm not sure how to make it yet. Is it possible to apply my "skinning vertex shader" while keeping whatever material was assigned to the mesh, with its own shader? From what you said, I suppose two vertex shaders would override each other, but can I keep at least the fragment shader? Otherwise you would have to manually add the "vertex skinning" shader code to every shader you use. Or am I missing something?

    Thanks for the link. I was not aware of this.

    Edit:

    Meanwhile I started working on the vertex shader and achieved some success. I was able to completely get rid of the Texture2D and of reading texture data back on the CPU. My vertex shader uses the second set of UV coordinates to sample a texture and retrieve the animated vertex positions.

    However, I discovered that while uv1 and uv2 have float precision, uv3 and uv4 are only half precision, which may not be enough.

    Also, I have not found a way to make my shader "cooperate" with the object's material, so if you want to use your own shader you have to copy the vertex function from mine.

    Also, I noticed some worrying artifacts on my models when using this shader. I'm not sure they are related to my animation script, but it is possible. I will check when I finish writing the code.
     
    Last edited: Oct 24, 2017
  12. mr_madcake

    So in the end the vertex shader did work. However, not without problems :confused:

    1) I get some weird artifacts along UV seams. No idea what could cause them.

    2) Meshes that consist of multiple parts do not seem to work properly. Will test this later.

    3) My script only works in play mode, but the shader tries to adjust vertex positions all the time. Making the script work in edit mode is not a good idea (it needs the SkinnedMeshRenderer for initial parameters and removes it in Start(); after removing it in edit mode it won't come back, and many other things could get messed up in edit mode).

    I think the best solution would be to provide an alternative shader without GPU skinning and switch to the skinning shader in Start().

    Here is the new code:

    https://bitbucket.org/MadC9ke/unity...le-view-default#GPU_Skinning_shader.shader-29

    https://bitbucket.org/MadC9ke/unity...=file-view-default#MadCakeMeshAnimator.cs-192
     
  13. nat42

    The artifacts look to be cracks in the mesh, due to vertices along the seam not having quite the same position (or possibly tessellation) as their neighbour on the other side. This could be an issue with the input mesh (e.g. one that was being masked by other factors such as precision issues) or it might indicate an issue with the skinning (generally you'd want the same vertex position and bone weights to give exactly the same result)... but that's just a guess.
     
  14. nat42

    Oh, I thought you might have found a way to get Unity to perform transform feedback and were doing the skinning in a vertex shader... Since you have UV coords and the vertex shader is used to read vertex data, I assume the cracks might be due to blending performed on the texture samples: you'll want to set the texture to use nearest-neighbour sampling (instead of, say, bilinear sampling).

    Anyway, not sure this is the best approach either (or if there is a best approach to be found)... but if you stick with this one, perhaps you could trade the UV coords for SV_VertexID, see: https://docs.unity3d.com/Manual/SL-ShaderSemantics.html (you'll still want to fix the sampler), though it will rule out some of the possible targets.
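    The effect of the sampler mode on a per-vertex data texture can be illustrated in one dimension (a language-agnostic Python sketch, not Unity code; the data values are made up):

    ```python
    def sample_nearest(texels, u):
        # point sampling: each vertex reads exactly its own texel
        n = len(texels)
        return texels[min(int(u * n), n - 1)]

    def sample_bilinear(texels, u):
        # bilinear filtering: blends the two texels nearest to the sample point
        n = len(texels)
        x = max(u * n - 0.5, 0.0)       # texel centers sit at (i + 0.5) / n
        i = min(int(x), n - 1)
        j = min(i + 1, n - 1)
        f = x - i
        return texels[i] * (1.0 - f) + texels[j] * f

    # one texel of "vertex data" per vertex
    positions = [0.0, 10.0, 20.0, 30.0]

    # a UV slightly off vertex 1's texel center (e.g. due to UV precision)
    u = 0.385
    ```

    With point sampling the slightly-off UV still lands on vertex 1's own data; with bilinear filtering it picks up a fraction of the neighbouring vertex's data, which is exactly the kind of error that shows up as cracks along seams.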
     
  15. theANMATOR2b

    Joined:
    Jul 12, 2014
    Posts:
    7,790
    Until joint-angle deformers are supported in vanilla Unity, as they are out of the box in ALL 3D packages, I believe the simplest approach for production is corrective morphs/blend shapes driven by joint angles.

    I applaud your effort though.
     
  16. mr_madcake

    Thanks ;-)

    Thanks for the idea. I will try it.

    That might be the case. I will double-check it.

    I already tried passing the same skinned vertex positions through different means (applying them through mesh.vertices and through vertex colors) and it worked without any artifacts. So I believe this is not the source of the trouble.
     
  17. mr_madcake

    That fixed it! Thanks!
    I was setting the samplers to nearest neighbour everywhere but forgot it where it mattered most :confused:

    So I am happy to say: it works.
    No artifacts, no GPU stalling, no GC.Collect() issues. Just DQ skinning in all its glory.

    Now it's about making this usable. I will add a default shader with vertex skinning and a quick readme on how to adjust your own shader to work with the script. I will also make an alternative shader to use in Edit mode so the model stays visible :)

    And yeah, I'll try using SV_VertexID instead of passing UV coords, and look for other ways to do it.

    Thanks for your help, everybody.
     
  18. Peter77

    A performance and memory comparison between your DQ implementation and Unity's (CPU/GPU) skinning would be interesting.
     
  19. nat42

    As far as I know it was Ladislav Kavan's papers that popularised dual quaternions for skinning (with an approximation for slerp); he also seems to have a paper advocating just using more bones with LBS as a solution to the candy-wrapper issue ;) https://www.cs.utah.edu/~ladislav/kavan09automatic/kavan09automatic.pdf
     
  20. mr_madcake

    From what I read, using that method requires some additional setup (a training animation). Also, AFAIK it can help with candy-wrapper artifacts, but that's not the only situation where dual quaternions give good results. Thighs, knees and elbows IMHO look much better with properly set-up dual quaternion skinning than with linear skinning, and that's not a candy-wrapper problem.
     
  21. mr_madcake

    For memory it's simple.


    For speed, however, it's not. The built-in SkinnedMeshRenderer does not appear in the profiler (well, I see several jobs connected to it, but I do not see the total time it takes per frame). Also, I'm almost certain that the constant factor of my script is much higher (using C# instead of C++ and calculating quaternions for the bones). However, it might scale better than built-in skinning with a large number of vertices.
     
  22. Peter77

    I was thinking about the actual memory usage, not just a single frame's allocations. I saw in MadCakeMeshAnimator.cs that it keeps various array copies and textures per MadCakeMeshAnimator instance.

    It would be interesting to compare the memory usage of, for example, 1000 SkinnedMeshRenderers and 1000 MadCakeMeshAnimators (total memory allocated for each system).

    Regarding performance, you could toss 1000 skinned objects in a scene and measure the average "frame time" over N seconds. Repeat this with different skinning implementations. Write down numbers and we have something to compare. :)

    PS: Don't forget to disable v-sync for performance tests.
     
  23. mr_madcake

    Well, I made a bool flag in my shader to conveniently switch between edit and play mode, and it all seems to be working.

    Except this o_O



    There is a copy of my mesh that stays in bind pose. It does not actually exist in 3D space, but I can see it through my perfectly animated mesh. I am confused, for lack of a better word. What could it be?
     
  24. nat42

    Reckon that's a shadow
     
  25. mr_madcake

    But a shadow of what? The skinned mesh is the only object in the scene.
    Also, this "shadow" looks like a rendered mesh overlaid onto the surface of the skinned mesh. I have no idea how that could happen, though.
     
  26. nat42

    I'm guessing you don't have the right magic in your shader telling it to run the vertex function for the shadow pass, so when Unity goes to draw your mesh it checks the material and decides to use its own simpler shader (without the vertex shader doing DQB skinning) to draw it into the shadow map. I mentioned shadow passes earlier in the thread, I think.
     
  27. mr_madcake

    Nope, maybe it was another thread :)

    That seems to be the case. Thanks!
    Trying to get it working properly now...

    Edit:
    You, my friend, are the best!
    I added one word to the #pragma and it worked like a charm :)
    For others who might have the same problem: https://docs.unity3d.com/Manual/SL-SurfaceShaders.html (ctrl+f "addshadow")

    I'll soon upload what I have to Bitbucket and run performance tests. Thanks everybody for the help.

    P. S.

    There is still a problem with meshes that consist of multiple parts. I will investigate it after uploading what I have now.

    Edit: new code is here
    https://bitbucket.org/MadC9ke/unity3d-dual-quaternion-skinning/src/

    Edit:
    I found out why it didn't work properly with meshes that consist of multiple parts. I simply forgot to assign the proper texture to all materials in the MeshRenderer ¯\_(ツ)_/¯

    The fix is on Bitbucket already.

    Next I'm going to:
    • make benchmarks
    • write a readme
    • improve usability

    However, one thing still bothers me: using SkinnedMeshRenderer to extract the bones. The problem is that the bone indexes are unrelated to the hierarchy, and only the SkinnedMeshRenderer knows in which order I should put them.
     
    Last edited: Oct 25, 2017
  28. mr_madcake

    Update:
    I hacked Unity's Standard shader and now it works with my skinning.
    The system is almost ready to grab and use. Hurray!

    I also placed a convenient readme file in the "DQ animator" folder.

    Here is how it works at the moment (so that you don't have to look into the repo):
    Code (CSharp):

    Using MadCake's dual quaternion skinning system:

    Create a normal skinned character with SkinnedMeshRenderer
    Add MadCakeMeshAnimator.cs (it will require a MeshFilter component)
    MeshFilter will have an empty mesh by default. Give it the same mesh from SkinnedMeshRenderer

    All materials of the mesh should use a special shader to apply the vertex positions
    The shader is "Madcake's collection/Etc/Standard hacked for skinning"
    Alternatively you can adjust your own shader to work with my skinning system (read below)

    In order to use a custom shader with MadCakeMeshAnimator.cs you need to manually change its code:

    Add these variables to the shader:

           sampler2D _SkinnedVertexTex;
           sampler2D _SkinnedNormalTex;
           bool _DoSkinning;

    Add a vertex function to your shader:

           void vert (inout appdata_full v) {
               if (_DoSkinning) {
                   v.vertex = tex2Dlod(_SkinnedVertexTex, float4(v.texcoord1.x, 0, 0, 0));
                   v.vertex.w = 1;
                   v.normal = normalize(tex2Dlod(_SkinnedNormalTex, float4(v.texcoord1.x, 0, 0, 0)).xyz);
               }
           }

    Add this to your #pragma surface:

           vertex:vert addshadow

    Done!

    vertex:vert registers the function vert
    addshadow tells Unity to use the same vertex function when rendering shadows
    without addshadow you will see shadows from the unanimated mesh, which looks weird

    The mentioned shader works just like the standard shader you always use.
     
    Last edited: Oct 25, 2017
  29. mr_madcake

    So I did some performance testing and the results were quite unexpected:

    It works a lot slower than default skinning. However, it only gets slower as you add more skinned meshes (more bones); it doesn't give a damn how many vertices you have.

    [in the screenshot below I had 100 animated high-poly characters]



    So if I get rid of those performance-killing matrix multiplications it will fly like a feather no matter the polycount.

    Currently I have two ideas:
    • precompute everything that can be precomputed
    • use shaders for the bone transforms as well
     
    Last edited: Oct 26, 2017
  30. nat42

    Do you have separate draw calls for each character (I don't know if instancing is enabled/supported by your code yet)?

    You're skinning on the GPU, so don't expect any per-vertex cost to show up in a breakdown of CPU use; there is no per-vertex cost on the CPU (similarly I don't expect the overhead of draw calls to show either, as that happens outside your process, possibly in kernel land... EDIT: actually the graph appears to show only the work your code is doing on the CPU, so not even Unity's contribution to the performance?)

    That's 80 milliseconds of matrix maths over what period? If that's over 10 seconds at 30 fps, the 4 matrix methods combined cost you about a quarter of a millisecond per frame? ((26+23+15+15) / (10*30))
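    Spelling out that estimate (the per-method figures are the profiler numbers quoted above; the 10-second capture at 30 fps is an assumption):

    ```python
    # combined cost of the four matrix methods over the whole capture
    total_ms = 26 + 23 + 15 + 15      # profiler totals quoted above, in ms
    frames = 10 * 30                  # assumed: 10-second capture at 30 fps
    per_frame_ms = total_ms / frames  # well under a millisecond per frame
    ```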

    Be careful with graphs of CPU use: the maths to get the bone transforms is the only real work your code is doing on the CPU (relatively speaking), so it should dominate the profiling.


    Have you tried variations with Texture2D.Apply()?

    You want updateMipmaps to be false (I presume it already is?)

    Does setting makeNoLongerReadable to true/false make a difference to the framerate? makeNoLongerReadable should be better for the driver, hopefully akin to handing over a pointer to the chunk of memory and saying "this is yours now, do with it what you want"; but I don't know whether there's any catch on the C# side balancing the gains against garbage collection or reallocation of a new texture.
     
    Last edited: Oct 26, 2017
  31. mr_madcake

    So after a lot of tweaking I got my numbers much better. Still far from what I would like to see, though.

    400 skinned characters. SkinnedMeshRenderer:

    400 skinned characters. MadCakeMeshAnimator.cs:


    CPU time is much bigger even though I do almost nothing on the CPU.



    I moved all matrix multiplications to another shader and used transform.rotation and transform.position instead of matrices (their getters are really slow).

    Now the slowest operation is RenderTexture.GetTemporary(), which I consider a good sign.

    The script itself took 17.41 ms even though its children add up to about 11 ms and the script has 3.64 ms of self time.
    No idea where the 4 extra ms came from.

    So looking at what I have, I concluded:

    The bottlenecks at the moment are RenderTexture.GetTemporary(), Graphics.Blit(), transform.get_rotation/get_position, and something slow in Update() that is not shown in the profiler o_O

    I cannot use fewer than 2 Blits, and I don't think the shaders could be made much faster than they are.
    Getting the position/rotation of the bones is unavoidable.

    So I have two options left: optimize Update() and use fewer RenderTextures.

    I can easily reduce the number of calls to RenderTexture.GetTemporary() by packing data from several textures into one. However, whether it will help performance is an open question.

    For the Update loop, I'm going to search for the missing 4 ms and try to optimize it in general.

    P. S.

    I updated the code in https://bitbucket.org/MadC9ke/unity3d-dual-quaternion-skinning to the most recent version.
     
    Last edited: Oct 26, 2017
  32. mr_madcake

    Could you explain what you mean by variations?

    Actually it is not. Apply() does not seem too slow anyway, but I will certainly fix this. Thanks for the advice.

    I suppose makeNoLongerReadable would actually have a negative effect on performance. The Texture2D holds a buffer for its pixels; if you call makeNoLongerReadable it discards that buffer, so the next time you call SetPixels() a new buffer has to be created. Allocating/deallocating the buffer in a loop seems pointless.
    Though that's just my assumption. I will check how it really behaves so we can know for certain.

    Are you talking about this? http://developer.download.nvidia.co...nstancing/doc/SkinnedInstancingWhitePaper.pdf

    If yes, I am not planning to implement it. At least not in the near future.
     
  33. Peter77

    Did you profile the project running in the editor or an actual build? Performance characteristics can differ significantly between the editor and a player. It's recommended to profile a build.
     
  34. mr_madcake

    I profiled in the editor. After I optimize what I can already see can be optimized, I will profile a build and upload the results.

    Edit:
    I grouped several textures into one and performance increased quite a bit.


    There are still some things I could optimize to make it faster. But I suddenly started getting this error: https://forum.unity.com/threads/texture2d-has-out-of-range-width.501936/

    I created a separate thread because I believe the issue is not directly connected to this one. I created a new project just to test it and got the same error. It seems the Texture2D constructor checks the limits on width and height separately, with a hard-coded (or derived from something) maximum of 16384, so a wide texture with a height of 1 pixel cannot be created.
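    One common workaround for this kind of per-dimension limit is to wrap the linear vertex index into rows instead of using an N x 1 texture. A sketch of the layout arithmetic (illustrative Python, not the repository's actual code; 16384 is the limit reported in the error):

    ```python
    MAX_WIDTH = 16384  # per-dimension Texture2D limit reported by the error

    def tex_size(count, max_width=MAX_WIDTH):
        """Width/height of a texture holding `count` texels, one per vertex."""
        width = min(count, max_width)
        height = (count + width - 1) // width   # ceil(count / width)
        return width, height

    def texel_of(index, width):
        """(x, y) texel coordinates for a linear vertex index."""
        return index % width, index // width
    ```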

    What drives me crazy, however, is that I never got this error while developing this script, and wide textures worked fine until now.
     
    Last edited: Oct 26, 2017
  35. mr_madcake

    OK, so it was quite a pain in a painful place, but I got it working despite the new limit on Texture2D size. New code here: https://bitbucket.org/MadC9ke/unity3d-dual-quaternion-skinning

    Though to be honest I'm kind of pissed off by the lack of explanation of why I had to fix perfectly working code: https://forum.unity.com/threads/texture2d-has-out-of-range-width.501936/

    Anyway, it finally works again, so I can get back to optimizations. Packing more data into a single texture proved quite effective, so I'll try to use as few textures as possible.

    Edit:
    I reduced the number of textures in Update() to the logical minimum: one texture for blending the quaternions (size ~ boneCount) and one for applying them to the vertices (size ~ vertCount).

    Tomorrow I will post the benchmark. The code in the repo is updated.
     
    Last edited: Oct 31, 2017
  36. mr_madcake

    I tried to profile in a build but could not get it to work properly. So this is from the editor:

    SkinnedMeshRenderer. 10 characters, 2.6 million triangles each:



    MadCakeMeshAnimator. 10 characters, 2.6 million triangles each:



    SkinnedMeshRenderer. 200 characters, 10,000 triangles each:



    MadCakeMeshAnimator. 200 characters, 10,000 triangles each:



    Detailed profiling. 1 character, 2.6 million triangles:



    Detailed profiling. 100 characters, 10,000 triangles each:



    Conclusion:

    • MadCakeMeshAnimator outperforms SkinnedMeshRenderer with a high polycount and few characters
    • MadCakeMeshAnimator gets significantly slower with many characters, probably due to the high cost of Transform.get_position and Transform.get_rotation compared to getting the transforms directly in C++
    • The performance with many characters is still acceptable (IMHO)
    • Detailed profiling of 100 characters shows such a bad result because detailed profiling multiplies the cost of operations within the script

    P. S.

    If you can't afford the performance loss with many characters you could use MadCakeMeshAnimator selectively.

    Either for important characters only, or with some kind of LOD system dynamically switching from DQ to linear skinning at a distance. You could also let all characters use DQ skinning until the number of active animated characters exceeds a certain threshold.

    P. P. S.

    The current bottlenecks are RenderTexture.GetTemporary and Transform.get_position/Transform.get_rotation.
    If you know how to improve this I would appreciate the advice.

    I tried reusing the same RenderTexture to avoid RenderTexture.GetTemporary, but it turns out Unity does not like keeping a RenderTexture across several frames. The data occasionally gets messed up.
     
    Last edited: Nov 1, 2017
  37. Peter77

    Unfortunately, profiling in the editor isn't very representative. For example, performance can be completely different between the editor and a build, as seen in this thread:
    https://forum.unity.com/threads/difficulties-interpreting-profiler-timeline.486401/#post-3170839

    Perhaps you made the build without the "Development" option checked in the build window? That would prevent the profiler from connecting.

    Did you profile Unity's CPU and GPU skinning, or just one of them?
     
  38. mr_madcake

    My profiler is able to connect and works, but profiling anything heavier than an empty project just chokes my PC.

    I only profiled CPU skinning.
    Alas, GPU skinning requires DX11, which is out of reach with my Windows 7.

    I'm afraid that if you want to compare my script with Unity's default GPU skinning you will have to do it yourself.
    I can upload the project files if you want.
     
  39. Hadeaths

    Joined:
    Apr 5, 2013
    Posts:
    1
    Hi,
    The link is not working anymore, is it possible to get a copy somewhere please?
    BR
     
  40. mr_madcake

    The repository url changed. I updated the link in my first post.
     
  41. lhoyet

    Joined:
    Apr 20, 2015
    Posts:
    6
    Hi int_constantine,
    I'm trying to use your DQ code on Rocketbox characters that we have in our lab, and it completely breaks down. I was wondering if you had any insight as to why. Here is a picture of the character before and after hitting play.



    I've looked at the code, and so far I can see that disabling the following code in StandardHacked.shader displays the static mesh properly, but of course does not animate it.

    Code (CSharp):

        if (_DoSkinning) {
            v.vertex = tex2Dlod(_SkinnedVertexTex, float4(v.uv1, 0, 0));
            v.vertex.w = 1;
            v.normal = normalize(tex2Dlod(_SkinnedNormalTex, float4(v.uv1, 0, 0)).xyz);
        }
    Then if I enable this code but disable the following code in DQBlend, I also manage to display the static mesh, but still with no animation of course, as I'm not doing the skinning anymore.

    Code (CSharp):

        vertexPos = MultiplyQuaternion(RQ, vertexPos);
        vertexPos = MultiplyQuaternion(vertexPos, InvertQuaternion(RQ));

        if (!_OnlyRotation)
            vertexPos += MultiplyQuaternion(TQ * 2, InvertQuaternion(RQ)) / _meshScale;
        else
            vertexPos.xyz = normalize(vertexPos.xyz);
    That makes me believe that the problem is in the DQ computation for the model I am using. Do you have any ideas what the reason could be, or what the next step to solve it might be? I'm currently trying to adapt your shader to perform LBS, both to get a grip on the way you do things and to see how to make this work, before trying to fix your DQ shader for my purposes...
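    As a sanity check on the math itself (not on the shader): the quoted DQBlend lines implement v' = RQ * v * RQ^-1 + (TQ * 2) * RQ^-1, where TQ is the dual part 0.5 * t_quat * RQ. For a unit RQ this should reduce to "rotate by RQ, then translate by t", which is easy to verify numerically. A standalone Python sketch with hand-picked values (_meshScale assumed to be 1):

    ```python
    import math

    def q_mul(a, b):
        aw, ax, ay, az = a
        bw, bx, by, bz = b
        return (aw*bw - ax*bx - ay*by - az*bz,
                aw*bx + ax*bw + ay*bz - az*by,
                aw*by - ax*bz + ay*bw + az*bx,
                aw*bz + ax*by - ay*bx + az*bw)

    def q_conj(q):
        w, x, y, z = q
        return (w, -x, -y, -z)  # inverse of a unit quaternion

    c, s = math.cos(math.pi / 4), math.sin(math.pi / 4)
    RQ = (c, 0.0, 0.0, s)                                # 90 degrees about z
    t = (3.0, 0.0, 0.0)                                  # bone translation
    TQ = tuple(0.5 * x for x in q_mul((0.0,) + t, RQ))   # dual part

    v = (1.0, 0.0, 0.0)
    rotated = q_mul(q_mul(RQ, (0.0,) + v), q_conj(RQ))[1:]        # RQ * v * RQ^-1
    shifted = q_mul(tuple(2.0 * x for x in TQ), q_conj(RQ))[1:]   # (TQ*2) * RQ^-1
    result = tuple(a + b for a, b in zip(rotated, shifted))       # expect (3, 1, 0)
    ```

    If this identity holds numerically but the model still explodes, the bug is more likely in how the per-bone dual quaternions or bind poses are built for that particular rig than in the blend/apply formula itself.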

    Cheers,
     
    Last edited: Apr 30, 2018
  42. mr_madcake

    mr_madcake

    Joined:
    Jul 17, 2017
    Posts:
    94
    I can't draw any firm conclusion from what I can see.
    I suggest we eliminate possibilities one by one to find the cause of this behavior:

    • It has happened to me several times that a Unity update broke my code completely or partially and I had to rewrite it, and it has already happened once to this piece in particular. I will check this script with a model that worked before, using the latest Unity version, and report the results here.
    • It could be something about the model you're using. I have had trouble with many things: for example, models that use more than one material (already fixed). If you upload your model somewhere, I could investigate.
    • Your character is holding something that looks like a VR controller. Are you making a VR project? If so, you might try using my script with the same model in a "normal" project. Something may work differently in a VR project and cause my script to fail.
    edit:
    version 2017.4.1f1: still works
    version 2017.4.2f2: same
     
    Last edited: May 2, 2018
  43. lhoyet

    lhoyet

    Joined:
    Apr 20, 2015
    Posts:
    6
    Thanks for your reply; at least I know it's not a Unity version problem (I'm using 2017.4.1f1).

    About the model:
    - It is to be used in a VR project in the future, but my tests with DQ were made in a separate Unity project that does not involve VR, to check that it at least works.
    - The model also has several materials (3). I could see about uploading the model for you to test if you don't mind, but it's copyrighted, so I just have to be careful that it is not shared afterwards.
    - If it helps, the model was made and exported from 3ds Max, and the skeleton was rigged using the standard 3ds Max Biped.

    And again, thanks for the help!
     
  44. mr_madcake

    mr_madcake

    Joined:
    Jul 17, 2017
    Posts:
    94
    In case someone else has a similar problem:

    For this script to work properly, the root bone of the skeleton and the mesh object(s) must not have any offset/rotation/scale relative to the parent object.

    I will add this to the README later and hopefully fix it some day.
     
  45. local306

    local306

    Joined:
    Feb 28, 2016
    Posts:
    155
    Hey @mr_madcake. Thank you kindly for creating this. I can't believe this is not standard in Unity, given how much better the skinning results are.

    I'm currently trying to get this to work with the Pre-Integrated Skin asset. Unfortunately, my shader knowledge isn't the best and I can't get it to work, even with your instructions (which seem simple enough, but I guess I can't figure it out, haha). When I play from the editor, the mesh squishes down to a plane.

    Have you worked with this skin shader before?
     
  46. vladibalan

    vladibalan

    Joined:
    Sep 14, 2014
    Posts:
    6
    First, thank you for sharing your work @mr_madcake .
    Second, the vert function produced garbled results until I modified float4(v.texcoord1.x, 0, 0, 0) to float4(v.texcoord1.x, v.texcoord1.y, 0, 0).
    After some research I noticed you used float4(v.uv1, 0, 0) in your hacked shader, so I assumed it needs the y coordinate instead of 0 as the second argument. After that it worked like a charm.
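
    Both coordinates matter here because the skinned vertex data lives in a 2D lookup texture: each vertex addresses its own texel through uv1, and with v fixed at 0 only the first texture row is reachable. As a rough illustration (the actual packing is defined by the repository's code; this mapping is hypothetical), a flat vertex index unpacks into a texel-center UV pair like this:

    ```python
    def vertex_index_to_uv(index, tex_width, tex_height):
        """Map a flat vertex index to texel-center UVs in a
        tex_width x tex_height lookup texture (hypothetical row-major layout)."""
        x = index % tex_width   # column within the current row
        y = index // tex_width  # row; this is why uv1.y cannot be dropped
        return ((x + 0.5) / tex_width, (y + 0.5) / tex_height)
    ```

    For any mesh with more vertices than one texture row can hold, y is nonzero for most vertices, which is why zeroing it garbled everything past the first row.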
     
  47. Ldnicon

    Ldnicon

    Joined:
    May 22, 2016
    Posts:
    4
    Hey.

    Is there a way to still use blend shapes? DualQuaternionSkinner removes the SkinnedMeshRenderer component.
     
  48. mr_madcake

    mr_madcake

    Joined:
    Jul 17, 2017
    Posts:
    94

    Currently, no.

    I may implement blend shapes in the future (this is not the first time I've been asked about them), but I have no way of predicting when I will have the time for it.

    There is no way (that I'm aware of) to implement custom skinning without kicking out the SkinnedMeshRenderer.
     
    Last edited: Oct 22, 2018
  49. mr_madcake

    mr_madcake

    Joined:
    Jul 17, 2017
    Posts:
    94
    Hey, @vladibalan
    You're welcome!
    I think (though I'm not sure) the repository had this fixed in readme.md quite some time before your message.
    Kudos for fixing it on your own, though.

    Please look at the repository for the latest code and documentation, as the messages in this thread might be outdated.

    If any issue still persists in the current version, please let me know.

    -------------------------------------------

    I do know that my script breaks Unity's new subsurface scattering in the standard shader.
    I'm 90% sure this will be an easy fix, but I'm quite busy right now.
    It will be fixed once my schedule frees up a bit.

    It seems they've added another set of vertex coordinates (before there were just the main mesh and the shadow mesh; now they've added the SSS mesh). I just need to apply the same transformation there.

    If you really need it, I think your experience would be enough to figure it out, since you already found the forgotten v.texcoord1.y.
     
    Last edited: Oct 22, 2018
    vladibalan likes this.
  50. mr_madcake

    mr_madcake

    Joined:
    Jul 17, 2017
    Posts:
    94
    The script has received a significant update and moved to GitHub.

    Changes:
    • calculations are now performed in compute shaders
    • added blend shape support
    • fixed several bugs
    The latest version of the package can be downloaded from the releases page.
     
    Last edited: Jan 30, 2019
    vladibalan, isidro02139 and Flurgle like this.