
Animation C# jobs in 2018.2a5

Discussion in 'Animation' started by Mecanim-Dev, Apr 4, 2018.

  1. Mecanim-Dev

    Mecanim-Dev

    Joined:
    Nov 26, 2012
    Posts:
    1,675
    Hi guys,

    The animation team has been busy over the last few months implementing the new animation C# jobs.
    We think it's a great addition, as it gives you complete access to the internal data at any point in the animation graph.
    It's already available in 2018.2a5, under the UnityEngine.Experimental.Animations namespace.

    Since this new tech is built on top of the job system, it also has all the same constraints.

    I know that most of you don't have access to the alpha or beta, but I still think you should be aware of what is coming:
    https://docs.google.com/document/d/12ljfvi8knahdElbRuIRijsx0U1T4CyXRQMdOcq6lXRA/edit?usp=sharing
     
    Last edited: Apr 4, 2018
    twobob, mitaywalle, Grizmu and 6 others like this.
  2. optimise

    optimise

    Joined:
    Jan 22, 2014
    Posts:
    2,129
    @Mecanim-Dev, awesome. Btw, are there any performance improvements coming for the Animator?
     
  3. Mecanim-Dev

    Mecanim-Dev

    Joined:
    Nov 26, 2012
    Posts:
    1,675
    @optimise we are currently looking at different solutions to improve performance, but nothing has landed yet.
     
    optimise likes this.
  4. Deleted User

    Deleted User

    Guest

    Awesome, been waiting for this since I first heard about playables! This kind of implementation is exactly what I was hoping for.

    I'm curious what the restrictions on BindSceneProperty are. Would there be a way to feed non-AnimationClip animation data into the job? Like an array of matrices or something, to use as a custom animation source?
     
  5. Mecanim-Dev

    Mecanim-Dev

    Joined:
    Nov 26, 2012
    Posts:
    1,675
    You can read/write any supported intrinsic type from the animation system: float, int, bool.
    Right now, all properties on all components are supported, except for array types like:
    m_Materials.Array.data[0].

    Sure, you have access to all the data flowing through the graph, so any job can inject new values or blend values in your own fancy ways.

    The animation system knows how to blend position, rotation (Euler and quaternion), scale, float, int, and bool. It doesn't know how to deal with matrices directly, because a matrix can have different semantics depending on what you store in it: 3x3 rotation, 3x3 rotation+scale, 3x3 scale, 4x4 affine.
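    To make this concrete, here's a minimal sketch of a job that reads and rewrites one bound float property. It assumes the experimental 2018.2 API (PropertyStreamHandle, bound on the main thread via Animator.BindStreamProperty); names may change in later releases.

    ```csharp
    using UnityEngine;
    using UnityEngine.Experimental.Animations;

    // Sketch only: scales a bound float property (e.g. a Light's intensity)
    // inside the animation stream. The handle is assumed to be created on
    // the main thread with Animator.BindStreamProperty before the job runs.
    public struct ScaleFloatPropertyJob : IAnimationJob
    {
        public PropertyStreamHandle property; // bound to a float property
        public float multiplier;

        public void ProcessRootMotion(AnimationStream stream) { }

        public void ProcessAnimation(AnimationStream stream)
        {
            float value = property.GetFloat(stream);
            property.SetFloat(stream, value * multiplier);
        }
    }
    ```
    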
     
  6. Baste

    Baste

    Joined:
    Jan 24, 2013
    Posts:
    6,335
    Looks good so far!

    I'd love to see some full-scale examples of integration with Jobs/ECS. A good case would be something like handling foot IK for a large number of agents at the same time.
     
    NotaNaN and Deeeds like this.
  7. awesomedata

    awesomedata

    Joined:
    Oct 8, 2014
    Posts:
    1,419
    This is great news, but it's sad that this topic hasn't gained traction! -- Are the number of animators using Unity in the (low) single-digits or something nowadays? Perhaps the "Community Discussion" location of the topic is to blame?

    Either way, the forums seem to be next to dead these days, and nothing less than at least a link to this topic from the Roadmap seems appropriate. :(

    ---

    So, I'm with @Baste on the examples! -- I personally would love to see what a custom mixer using custom curve interpolation would look like with this new system.

    A variety of code examples to "see how things are done" with this should really be mandatory. With no documentation to work with, and without testing ourselves, how are we supposed to know whether a Playables wrapper (or even an Animator?) is required? The sole example uses both, but I don't plan (or want) to use the Playables API for tweaking an animation. Sometimes, I don't even want an Animator in a custom mixer (where it seems I'll have to write all the transform math myself).


    After reading through the early documentation linked to by @Mecanim-Dev, I have some questions:

    1. Can humanoid bone transforms be used with this system? -- If so, what does the code look like to do that?
    2. I'm assuming it is required all the time, but is an "Animator" component actually /required/ to use streaming transforms -- or is that only required for /most/ things (such as using it with the PlayablesAPI)?
    3. If I want to, say, shift the arms of a character up or down by a small amount based on the current rigidbody force being applied to the entire character, would you provide an example of how this would be done?
    4. If all I want to do is define a custom interpolation curve for my animation clips, am I going to have to rewrite the entirety of the animation system to do it?? -- If so, it would be very nice if you guys could spare me the work on figuring that black-box out by reverse-engineering it and copy/paste your test code (or some helper methods I can override?) for that as a starting point (assuming this is most likely the case)


    5. Unrelated, but not entirely: Are you guys ever going to provide a better way to grab all the "bones" from a model's internal model (import) data? -- Or is there a secret way to do this without a for loop (or a call to GetComponentsInChildren, which picks up extra transforms / meshes / skinned-meshes I might have on my prefab)
    6. How soon is the "Animation" scripting component going to be removed? -- I'd really like to base some simple animation stuff on this solid API, but I don't know how future-proof it is. :(
     
    Last edited: May 31, 2018
    NotaNaN and Deeeds like this.
  8. Deleted User

    Deleted User

    Guest

    Hi!

    EDIT: The Unity package no longer exists, the project is now on GitHub.

    We're proud to announce the Animation C# Jobs Samples repository:
    https://github.com/Unity-Technologies/animation-jobs-samples

    This repository contains several samples showing how to use the new Animation C# Jobs feature:
    • SimpleMixer
      How to mix two clips together.
    • WeightedMaskMixer
      How to mix two clips together, but use different weights on each bone during the blend.
    • Look At
      How to add a look-at behaviour to a quadruped creature.
    • Two-bone IK
      How to add a two-bone IK to a character limb.
    • Fullbody IK
      How to add a fullbody IK on a character.
    The number of samples will increase over time; don't hesitate to create new issues or even propose PRs. We hope these samples will help you learn and understand this brand-new feature.

    Don't hesitate if you have any feedback, we'll be glad to address it ;)
     
    Last edited by a moderator: Jul 3, 2018
  9. Serinx

    Serinx

    Joined:
    Mar 31, 2014
    Posts:
    788
    Hello, would it be possible to use this to allow an easy transition from ragdoll into animation?
    E.g. inserting the current bone rotations into the animator and using the standard transition system into a "stand up" animation?
     
    awesomedata likes this.
  10. awesomedata

    awesomedata

    Joined:
    Oct 8, 2014
    Posts:
    1,419
    First, I have a few questions (in addition to my post above):
    1. Do any of these examples show how to do custom interpolation curves?
    2. Also, how performant are these versus manipulating the bones manually?
    3. What sort of limitations are there with these C# jobs? (@Mecanim-Dev mentioned there would be some, but I'm not entirely familiar with the C# jobs system just yet, so I want to know which of these might affect what functionality or classes I have to work with.)
     
    Deeeds likes this.
  11. Deleted User

    Deleted User

    Guest

    Haha, wanted to keep it as a surprise, but we're working on that ;)

    Can't answer all your previous questions right now (sorry), but I can already say that since this feature is only in the 2018.2 beta, I'm expecting more traction when 2018.2 is officially released. And I'm pretty sure there will be a dedicated section for this feature in the "What's new in 2018.2" blog post.

    1. The animation stream won't offer you the possibility to change the interpolation between keys, but you will be able to change the interpolation between streams. For example, you can take the basic linear interpolation in the SimpleMixer and replace it with your own interpolation algorithm.

    2. As @Mecanim-Dev said, we're still working on this part. Right now (and keep in mind that we're still working on it), the gain is not obvious on a single character, but it can be quite good when there are several characters on screen. For now, we're focusing on the API and the new possibilities that the feature brings.

    3. Even internally, we're still discovering what the animation C# jobs can bring. Off the top of my head I don't see limitations, but I'm way too deep in the code; at some point I'll need to take a step back and look at the bigger picture. But that's where your feedback is important, because it'll make us think about ideas that we haven't thought about before.

    Gotta go, take care!
     
  12. recursive

    recursive

    Joined:
    Jul 12, 2012
    Posts:
    669
    Would it be possible to write an animation "mirroring" system with this for non-humanoids? If so, where would be a good place to start? I ran into this issue in a previous version of my project, and if this can solve it now that the project is starting over, that'd be great.
     
    awesomedata and deab like this.
  13. Mecanim-Dev

    Mecanim-Dev

    Joined:
    Nov 26, 2012
    Posts:
    1,675
    Take a look at MixerJob.cs in the new animation samples package. It's not a curve interpolator, but a weighted transform blender with linear interpolation for positions and spherical interpolation for rotations. Basically, you connect two animation sources into this mixer (they could be two animation clip playables, or anything that generates animation) and the mixer will blend all the transforms from both sources.
    Here's the code:
    Code (CSharp):
    using Unity.Collections;
    using UnityEngine;
    using UnityEngine.Experimental.Animations;

    public struct MixerJob : IAnimationJob
    {
        public NativeArray<TransformStreamHandle> handles;
        public NativeArray<float> boneWeights;
        public float weight;

        public void ProcessRootMotion(AnimationStream stream)
        {
            var streamA = stream.GetInputStream(0);
            var streamB = stream.GetInputStream(1);

            var velocity = Vector3.Lerp(streamA.velocity, streamB.velocity, weight);
            var angularVelocity = Vector3.Lerp(streamA.angularVelocity, streamB.angularVelocity, weight);
            stream.velocity = velocity;
            stream.angularVelocity = angularVelocity;
        }

        public void ProcessAnimation(AnimationStream stream)
        {
            var streamA = stream.GetInputStream(0);
            var streamB = stream.GetInputStream(1);

            var numHandles = handles.Length;
            for (var i = 0; i < numHandles; ++i)
            {
                var handle = handles[i];

                var posA = handle.GetLocalPosition(streamA);
                var posB = handle.GetLocalPosition(streamB);
                handle.SetLocalPosition(stream, Vector3.Lerp(posA, posB, weight * boneWeights[i]));

                var rotA = handle.GetLocalRotation(streamA);
                var rotB = handle.GetLocalRotation(streamB);
                handle.SetLocalRotation(stream, Quaternion.Slerp(rotA, rotB, weight * boneWeights[i]));
            }
        }
    }
    So if you look closely, you can change the interpolation mode by simply replacing Vector3.Lerp and Quaternion.Slerp with something that matches your needs.

    We would like to provide a set of custom interpolators like lerp, slerp, bicubic, spring. Suggestions are welcome, as always.
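    As a tiny sketch of that idea (not official API, just an illustration): an easing helper whose output can be fed to the existing Lerp/Slerp calls in MixerJob instead of the raw linear weight.

    ```csharp
    using UnityEngine;

    // Sketch only: easing helpers that could replace the raw linear weight
    // used by MixerJob. Smoothstep keeps the 0..1 range but softens the
    // start and end of the blend.
    public static class BlendEasing
    {
        public static float SmoothStep01(float w)
        {
            w = Mathf.Clamp01(w);
            return w * w * (3f - 2f * w); // classic smoothstep
        }
    }
    ```

    Inside ProcessAnimation you would then write, for example, Quaternion.Slerp(rotA, rotB, BlendEasing.SmoothStep01(weight * boneWeights[i])).
    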
     
    awesomedata likes this.
  14. Mecanim-Dev

    Mecanim-Dev

    Joined:
    Nov 26, 2012
    Posts:
    1,675
    Yes, we do have an AnimationHumanStream that allows you to set/get muscle values and IK effectors. You can even set a pose with muscle values and then request, for example, the global position of any transform on your humanoid. The stream is smart enough to detect such cases and invokes a retarget internally to switch from muscle space to an FK pose.

    Yes, an Animator component is needed, as this new system is built on top of the Playables API.
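    For anyone wondering what that looks like in code, here's a rough sketch of a job driving a humanoid IK goal through AnimationHumanStream. Method names follow the 2018.2 experimental API and are best treated as an approximation.

    ```csharp
    using UnityEngine;
    using UnityEngine.Experimental.Animations;

    // Sketch only: pulls the human stream out of the animation stream and
    // pins the left-foot IK goal to a position supplied from the main thread.
    public struct FootGoalJob : IAnimationJob
    {
        public Vector3 goalPosition;

        public void ProcessRootMotion(AnimationStream stream) { }

        public void ProcessAnimation(AnimationStream stream)
        {
            AnimationHumanStream human = stream.AsHuman();
            human.SetGoalPosition(AvatarIKGoal.LeftFoot, goalPosition);
            human.SetGoalWeightPosition(AvatarIKGoal.LeftFoot, 1f);
            human.SolveIK();
        }
    }
    ```
    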


    I'm currently working on 3 new animation jobs that will allow anybody to blend physics with animation. It's not exactly the example that you want, but it will give you a good hint on how to do what you would like.

    1. PhysicsRigReadJob: this job basically reads the rigidbody position/rotation and injects it into the animation stream.

    2. ComputeTransformVelocities: this job differentiates two animation poses to compute velocities, which can be used to drive the rigid bodies.

    3. PoseMatcher: once a ragdoll has been simulated, this job matches the ragdoll pose against a set of poses that you provide, allowing the system to blend from the ragdoll pose to your animation.

    I'm not sure what you are talking about. Are you talking about the legacy Animation component? If yes, we are not going to remove it, as it can still be useful in some simple cases.
     
    thelebaron and awesomedata like this.
  15. Mecanim-Dev

    Mecanim-Dev

    Joined:
    Nov 26, 2012
    Posts:
    1,675
    One of the hardest constraints is the fact that you cannot use reference types in a job, which means that you cannot access any class from the scripting API.

    So from a job you cannot call rigidbody.AddForce() directly, but you can compute all the velocities, so at least that part would not consume CPU cycles on your main thread.

    Performance-wise, we expect that this will offer new performance improvement opportunities, because you can now offload some compute-intensive tasks to a worker thread rather than doing the same thing on the main thread.
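    A minimal sketch of that split (names are hypothetical; the arrays are allocated and the job is scheduled elsewhere): the job writes velocities into a NativeArray on a worker thread, and the main thread later reads them to drive rigidbodies.

    ```csharp
    using Unity.Collections;
    using UnityEngine;
    using UnityEngine.Experimental.Animations;

    // Sketch only: differentiates bone positions to get velocities on a
    // worker thread. A MonoBehaviour then applies them to rigidbodies on
    // the main thread, since reference types like Rigidbody cannot be
    // touched inside a job.
    public struct BoneVelocityJob : IAnimationJob
    {
        public NativeArray<TransformStreamHandle> bones;
        public NativeArray<Vector3> previousPositions;
        public NativeArray<Vector3> velocities; // read back on the main thread
        public float deltaTime;

        public void ProcessRootMotion(AnimationStream stream) { }

        public void ProcessAnimation(AnimationStream stream)
        {
            for (int i = 0; i < bones.Length; ++i)
            {
                Vector3 position = bones[i].GetPosition(stream);
                velocities[i] = (position - previousPositions[i]) / deltaTime;
                previousPositions[i] = position;
            }
        }
    }
    ```

    On the main thread (e.g. in LateUpdate, after the graph has evaluated) you would then do something like rigidbodies[i].velocity = velocities[i].
    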
     
    MadeFromPolygons and awesomedata like this.
  16. Mecanim-Dev

    Mecanim-Dev

    Joined:
    Nov 26, 2012
    Posts:
    1,675
    Yes, of course you could. You would need to define what mirroring means in your case; mirroring a humanoid is pretty straightforward, as the rig is symmetric.

    The humanoid mirroring system is simply a muscle map:
    LeftLegOpenClose = RightLegOpenClose
    RightLegOpenClose = LeftLegOpenClose

    So if I were to implement this for a generic rig, I would probably create a transform map, and maybe add the mirror plane to allow funky mirror mappings. Humanoid mirroring uses the YZ plane.
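    For a generic rig, the YZ-plane part of that idea can be sketched as plain math (building the transform map itself is left to you):

    ```csharp
    using UnityEngine;

    // Sketch: mirroring a pose across the YZ plane (x -> -x), the same
    // plane humanoid mirroring uses. For a generic rig you would also
    // remap each transform to its counterpart via the "transform map"
    // mentioned above.
    public static class MirrorUtil
    {
        public static Vector3 MirrorPosition(Vector3 p)
        {
            return new Vector3(-p.x, p.y, p.z);
        }

        public static Quaternion MirrorRotation(Quaternion q)
        {
            // Reflecting a rotation across the YZ plane negates the
            // y and z components of the quaternion.
            return new Quaternion(q.x, -q.y, -q.z, q.w);
        }
    }
    ```
    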
     
    recursive likes this.
  17. awesomedata

    awesomedata

    Joined:
    Oct 8, 2014
    Posts:
    1,419
    That looks incredibly useful, and it seems to cover all the bases for interpolation one could want programmatically.

    I'm not clear on this (and I don't want to assume), but does it support AnimationCurve already too? There are some times where I want to program my interpolation, and other times when I want to edit it visually instead.


    You're gonna regret saying that... lol -- I've got plenty! :D -- Here's some:


    Feedback (Important/Specific):


    A) First and foremost:
    1. I want a list of bones directly from the skeleton.

      I want to choose whether to ignore transforms with a "skinned-mesh" component (for example, in a multi-part / multi-mesh animated character, the list of transforms to account for balloons). I want to exclude those transforms from the list of bones automatically (and get a dictionary from this master list too). I can do this manually, but it slows things down.

    2. I want the "pose" to be compatible with this "skeleton" list (perhaps by conversion), just in case I want to reset all of the bone positions at once to a particular pose.

    3. Grabbing a "pose" from a specific AnimationClip at any given time (without being required to add it to an Animator -- instead, I just want to grab it from an array, without worrying about it being an explicit "State" in the Animator and having the CPU overhead of it existing in both places.)

    4. Is a general "MatchTarget" function available that works for a list of bones? -- I want to be able to define a list of bones (for example, a shoulder, forearm, hand and fingers) and have them align exactly with the same parts from another pose (while ignoring the rest of the other pose -- only the arm bones specified, and their children, will match the new pose).

    5. I would also like a "MatchTargetRelative" that would, for example, take the bones of a floppy ear (or a hand) and, no matter where the head (or arm) is at, they would curl the bones toward a "relative" / local transform (i.e. if the ear is curled, the ear will match the same curl even if the head is laying backwards on the ground.) This would also take into account completely separate bones too (say a floating bone that controls the relative position of a witch's floating crystal ball and another floating set of bones that control the location and shape of her animated cartoon-style floating broomstick).

    B) Regarding interpolation:
    1. I have two suggestions off the bat to make our animator lives easier -- first, it would be useful if AnimationCurve itself had an option to automatically generate the types of interpolations you've mentioned (bicubic, spring, etc.) for a set of control points.
    2. Being able to "blend" these interpolations (as they're generated) with the existing curve in the AnimationCurve interface would be even more useful for animators like me (so it doesn't have to be calculated at runtime.) Blending would happen as if the two curves were stacked on top of each other (i.e. the existing curve and the generated curve would be merged -- say you have generated some bicubic interpolation, but you wanted to add a spring to that same bicubic curve and get the resulting curve at design-time) -- Sort of like "additive blending" essentially.

    C) Other important things (so far):
    1. In regards to the AnimationHumanStream -- I want to be clear about this, but I want to also modify some of the regular bone transforms (say the ears or a ponytail) while (simultaneously) modifying the humanoid portions of the model itself.
    2. Is the AnimationHumanStream smart enough to modify regular transforms in the model directly (say the "hips" bone) without duplicating the corresponding humanoid transform's muscle animation?

    3. A visual preview of our animation output in the UnityEditor. Preferably, provide a way to override the functionality of (or hook into) the UnityEditor's animation window to create custom animation previews (and a space for a few custom controls) using this new streaming/jobs system. In most cases, we will need to be able to preview our animation outputs at design-time (rather than runtime) in order to properly author them with this jobs system.

      Please don't make us have to reinvent the wheel here -- We just need a hook into the AnimationWindow (or a separate "custom" animation window) letting us override the preview behavior and make it take into account our modifications to the animation stream. People have been asking for an easy way to author custom keyframes for a long time, but with this new system, it is mandatory that we have the ability to make custom keyframes easily (and have a way to preview them in the editor), so I don't believe it to be beyond this system's scope. Having to go to playmode every time to author some of these animations is a terrible design decision -- especially when the technology is already there to allow us this preview mechanism.

    4. A place in the "custom keyframe inspector" / "custom animation window" that will give animators a place where they can modify AnimationStreaming properties at design-time in the UnityEditor. This would be a place where one could place buttons / controls / information to better-author these new animation styles in a visual way. I could see an EditorWindow or a simple Inspector (or even an extra area in the existing AnimationWindow) being sufficient.
    General feedback so far:

    This is incredible! -- MUCH more than I expected from this new system! :D -- You, sir, totally rock!!


    Yes, I was -- I was almost sure I read you saying that it wasn't going to be removed somewhere before, but then I came across a few posts from others recently (and one from you) that made me doubt that. The post of yours in question was something along the lines of "We're going to remove it only when the new system is able to do everything the Legacy system can do.", and as far as I can tell, that day is nearly here (despite it being a little more convoluted to do with the new system due to the new AnimationStreaming / PlayablesAPI wrapper stuff mixed with the [still terrible in-practice] Mecanim state/variable/transition interface). Thanks for the clarification on this though. I'm totally going to hold you to this! :)


    Sadly, I'm worried this might affect the "custom keyframe editor" / "custom animation window" authoring approach I mentioned, making it impossible to preview our animation streaming in-editor for accurate authoring (particularly with the ragdoll and physics simulations). If it does affect this, then we need some decent alternative to get the final "pose" sampled in-editor somehow. I'd rather the ragdoll / physics not work in editor at all so we can still be able to preview our custom interpolation / etc., but I feel like this could still be possible since we're already calculating our velocity each frame to some extent and we're showing only 1 frame at a time (via scrubbing), so I feel that, even without the "Jobs" part of the system properly implemented in the editor, we can at least get /some/ idea of what the final animation is going to look like at runtime, even if it is via simulation.

    Any thoughts on this visual approach?

    Overall, I really love the direction this is taking so far! -- Lots of things I've wanted to do since the inception of Mecanim are now finally becoming possible!

    Keep up the awesome work so far guys! :)
     
  18. JohnHudeski

    JohnHudeski

    Joined:
    Nov 18, 2017
    Posts:
    126
    We probably need a workflow tutorial. How would this work with playables?
     
    AubreyH and MadeFromPolygons like this.
  19. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Gimmie!

    And <3 forever if you can ease in partial ragdoll - one arm or head...
     
    MadeFromPolygons and awesomedata like this.
  20. Peter77

    Peter77

    QA Jesus

    Joined:
    Jun 12, 2013
    Posts:
    6,618
    Do you think having GetLocalPositionAndRotation and SetLocalRotationAndPosition methods would reduce function calls, thus be beneficial in terms of performance?
     
    MadeFromPolygons likes this.
  21. Baste

    Baste

    Joined:
    Jan 24, 2013
    Posts:
    6,335
    Some questions:

    - How does this compare to using the already built-in Mecanim features? In particular, I've got a large number of agents that are looking at things, which I do by using Animator.SetLookAtPosition/Weight.
    Now, I understand that the LookAt example is a lot more powerful, as it allows for LookAt for non-humanoid agents, but if the old humanoid solution works for me, is there any reason for me to use a LookAtJob instead? Can I expect performance gains compared to setting the LookAtPosition to some transform's .position in a MonoBehaviour's Update?

    - I see that you're not setting the time update mode in any of the examples. Playable graphs default to using the DSPClock update mode, so I've found that I have to set it to GameTime, as otherwise very, very strange things happen. Is there a reason that you're sticking with DSPClock, or is the DSPClock default simply a bug you haven't caught yet?

    - Can I mix an AnimationScriptPlayable with an AnimationClipPlayable through e.g. an AnimationMixerPlayable? If the ScriptPlayable and ClipPlayable target the same bones, what happens?

    Oh, and all of this is looking really great, and the examples are pretty neat. I can't open the example scenes, but I think that's a bug in the package manager stuff, not in the examples.
     
  22. Mecanim-Dev

    Mecanim-Dev

    Joined:
    Nov 26, 2012
    Posts:
    1,675
    The masterpiece of this animation job system is the AnimationScriptPlayable, which is a playable node like AnimationMixerPlayable or AnimationClipPlayable.
    So you can now build your own animation graph and, with an AnimationScriptPlayable, inject custom C# code into the graph to modify the animation stream.
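    As an illustration (a sketch against the 2018.2 experimental API, reusing the MixerJob posted earlier in this thread; the job's field setup is elided), building such a graph might look like:

    ```csharp
    using UnityEngine;
    using UnityEngine.Animations;
    using UnityEngine.Experimental.Animations;
    using UnityEngine.Playables;

    // Sketch only: wires an AnimationScriptPlayable running MixerJob
    // between two clips and an Animator.
    public class MixerGraphExample : MonoBehaviour
    {
        public Animator animator;
        public AnimationClip clipA;
        public AnimationClip clipB;

        PlayableGraph graph;

        void OnEnable()
        {
            graph = PlayableGraph.Create("MixerGraph");
            // Avoid the DSPClock default discussed later in this thread.
            graph.SetTimeUpdateMode(DirectorUpdateMode.GameTime);

            var job = new MixerJob { /* handles, boneWeights, weight ... */ };
            var mixer = AnimationScriptPlayable.Create(graph, job, 2);
            mixer.ConnectInput(0, AnimationClipPlayable.Create(graph, clipA), 0, 1f);
            mixer.ConnectInput(1, AnimationClipPlayable.Create(graph, clipB), 0, 1f);

            var output = AnimationPlayableOutput.Create(graph, "output", animator);
            output.SetSourcePlayable(mixer);
            graph.Play();
        }

        void OnDisable()
        {
            graph.Destroy();
        }
    }
    ```
    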
     
  23. Mecanim-Dev

    Mecanim-Dev

    Joined:
    Nov 26, 2012
    Posts:
    1,675
    Definitely in some use cases.
    The animation stream does store local-space information, so GetLocalXXX/SetLocalXXX is almost free except for the icall; there we would save one icall.
    On the other hand, GetPosition/SetPosition/GetRotation/SetRotation is more CPU-intensive since we need to compute the local space, so for the global versions we would save one icall and one local-space computation.

    So I will add this to the roadmap list and the team will discuss it at our next meeting.
     
    dwilson_magic and Peter77 like this.
  24. Mecanim-Dev

    Mecanim-Dev

    Joined:
    Nov 26, 2012
    Posts:
    1,675
    For such a simple case as Animator.SetLookAt for humanoids only, there would be almost no perf gain, since in this case the main computation is already done on a worker thread by the native code.

    But if you write your own look-at code which includes generic rigs too, then without animation jobs you would need to run it in a LateUpdate, which consumes main-thread CPU cycles, compared to a solution with animation jobs where the computation occurs on a worker thread. So we do expect a performance gain in more complex scenarios.
     
  25. Mecanim-Dev

    Mecanim-Dev

    Joined:
    Nov 26, 2012
    Posts:
    1,675
    Yes, you're right. The default graph update mode is DSP because you can mix audio and video in a PlayableGraph too, so the time needs to be more precise. There is so much detail that this one was simply overlooked when we made the samples.

    Of course you can. With a mixer, depending on the weight of each input, it will blend the results of the animation clip and script playables together.
    So depending on how you build the graph, you can blend things together or overwrite them. We even made a WeightedMaskMixer, which allows you to blend each transform independently, with a weight per transform.
     
  26. awesomedata

    awesomedata

    Joined:
    Oct 8, 2014
    Posts:
    1,419
    Individual Bone-List Interpolation:

    I'm wondering if modifying a group of parts of a rig is supported with its own custom interpolation. Seems like the mention above about being only able to set custom interpolation for the entire animation (and not individual parts) might make this not possible -- but I really hope I'm wrong!



    Transformation Blending/Mixing of an Array/List:

    Can you guys make a method to "simultaneously" apply the custom transform weights to a list/array of bones? This would, for example, be an arm or a leg or a tentacle.



    Deep Copy Transform Array:

    I was recently having issues with a "deep copy" and "shallow copy" of transform arrays. Can you guys provide an easy method that would allow one to grab a full-on "copy" of the model's transforms (or really, any other "reference-type" class?) and place /that/ "deep copy" into an array instead of simply a "reference" to them? -- I'm aware you can jump through a few hoops and use "Instantiate" to do this to some extent, but then a full-on copy of the mesh appears in the scene when you really only want to modify the bone positions.


    Hybrid Ragdolls:

    @Mecanim-Dev -- Would it be possible to create a "hybrid" ragdoll-setup window? This window would keep all the default humanoid joint constraints, but also let us pick individual bones or "chains" of bones that should have limits/constraints applied to them for "hybrid" humanoids. We have the "ragdoll creator" right now, but it's not very user-friendly for the new types of ragdolls we'd be creating.

    Also, this window would let us choose whether or not to generate (and what types of) colliders will be generated (capsule/box(/mesh?)) in the system upon building the ragdoll. Additionally, if one doesn't need to manage a set of colliders but still wants to simulate joint "weights" and "constraints" in the animation joints, he should be able to exclude the colliders (and joint limits) if he likes inside this window (and the system should support these types of joints from the backend as well.)



    Cleaner-Looking Animation "Play" Method:

    Is there any way we can do a faster/cleaner set of calls to "Play" our animations (sort of like the Legacy system)?

    The idea I have is this -- Although you have to create a playables graph, etc. first to play from, is there a way we can make it at least "look" cleaner (or you guys provide a temporary graph to "Play" from) when we just want to play a quick animation? The way I was thinking is perhaps with a single function-call and a few "ref" variables (that could be defined at the beginning of the class) that would be supplied as arguments when calling the "Play" method.



    Alternative Method to Compute Bone Transformation (Locally) using Global Position Input:

    I could be wrong, but a way to do this seems like it could possibly be computed (locally) for almost nothing if you "set" it with its global position and reference the current location of the bone's end (by getting the length from it to its child) and then computing the transform's LocalXXX rotation from the transform's start location and child location and the difference between these two vectors (from the previous frame to the current). This method seems like it could nullify the "position" portion of only one of two ends of the bone (and if the end of the bone lines up with another bone, you can compute LocalRotation/etc. from that bone's alignment with its parent.)

    Basically, this "matching one of the bone ends to its child" changes a "global" transformation to a local one, and makes it faster to compute rotations/etc than full-on transformation might.
     
    NotaNaN likes this.
  27. JohnHudeski

    JohnHudeski

    Joined:
    Nov 18, 2017
    Posts:
    126
    I am guessing playables work in hybrid ECS. This is the question I was trying to ask: how do we use this with ECS?
     
  28. awesomedata

    awesomedata

    Joined:
    Oct 8, 2014
    Posts:
    1,419
    About the "Optimize Transform Hierarchy" in relation to these new animation jobs -- does this option still exist with animation jobs?

    Are there any limitations with using this option in relation to ragdoll-driven physics?
     
  29. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Does there exist the opportunity to have a higher performance attachment system? we tend to track a *lot* of bone transforms to move attachments there each frame (we don't child them).
     
  30. awesomedata

    awesomedata

    Joined:
    Oct 8, 2014
    Posts:
    1,419
    Been looking into ECS a bit lately (the new videos released today by Mike Geig finally cracked it open for me). Judging from the fact that C# jobs are pretty much mandatory for working comfortably with the "component" and "system" parts of the ECS acronym, it seems like if we have access to the AnimationStream mid-frame (particularly with "bone"/vertex transforms rather than the typical "Transform" class, which comparatively has huge overhead the more you scale), much of the "transform" process could actually be left up to the GPU to handle between frames. That's what it seemed to be doing in the spaceship/bullet-hell example of this system -- his computer was clearly 100% GPU-bound, which suggests to me that nearly ALL of the transformation work was going directly to the GPU -- and judging by that, I can only suspect it would be exactly the same with animation.

    Please correct me if I am wrong for thinking this -- it just seems like this new way of doing our "animation" jobs, with near-direct communication with the GPU, could allow direct mesh access (and transformations) across the board, which, in a lot of cases, opens up huge possibilities for "tracking" bone transforms -- and so much more.

    Totally stoked for the new ECS system after today. :D
     
    Last edited: Jun 7, 2018
    NotaNaN and hippocoder like this.
  31. awesomedata

    awesomedata

    Joined:
    Oct 8, 2014
    Posts:
    1,419

    @Mecanim-Dev or @RomainFailliot

    Thinking on this a bit more -- how far off am I in thinking that this "jobs" system would be a good "low-level" place from which to selectively apply our own "blendshapes / transformation data"? Obviously I don't want to animate ALL vertices manually, but what if there was still a small group of them at this level that I did want to manipulate over time?

    Perhaps I'm not a "shader-guru" and want to do this with physics or rigidbodies?

    Maybe I am a "shader-guru" and just wanted to animate uvs / textures / vertex colors / normals over time for fancy shader effects that can't be done with the standard Animator just yet?


    This leads me to "complex" animation states...

    This new ECS system combined with AnimationJobs has got me wondering how more "complex" animation states might be managed with ECS -- any thoughts on providing a good example of something like that?

    I feel like we should have the option of whether to use the Animator class or not (or at least a simplified or more "hardware-direct" version of it) in cases like the above where we want to animate verts/textures/normals/colors/etc.

    Seems like it follows the ECS principles a lot better in this way.
     
    Last edited: Jun 8, 2018
    NotaNaN likes this.
  32. Mecanim-Dev

    Mecanim-Dev

    Joined:
    Nov 26, 2012
    Posts:
    1,675
  33. JohnHudeski

    JohnHudeski

    Joined:
    Nov 18, 2017
    Posts:
    126
    Just when I was thinking motion matching would be great with the new ECS, you guys got Michael Buttner!
     
  34. Mecanim-Dev

    Mecanim-Dev

    Joined:
    Nov 26, 2012
    Posts:
    1,675
    @JohnHudeski I guess you just saw the Unite Berlin keynote. Well, yes, we do have a build of motion matching running with Animation C# Jobs. Our plan is to release it later this year, when it is ready for the public.
     
  35. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Thanks for your hard work, animation team :)
     
  36. awesomedata

    awesomedata

    Joined:
    Oct 8, 2014
    Posts:
    1,419
    Absolutely! -- You guys rock! :D
     
    NotaNaN and (deleted member) like this.
  37. JohnHudeski

    JohnHudeski

    Joined:
    Nov 18, 2017
    Posts:
    126
    @Mecanim-Dev If there is one feature I think Unity is still missing, it is a proper animation inspector. The default one is OK for importing your motion, but for tagging data you sometimes need to run your motion in something like a simulation mode -- also for testing IK, or duplicating the root curve and such for easy event detection.
    Maybe a separate window?

    For those curious: Motion Matching
     
    Last edited: Jun 20, 2018
    awesomedata likes this.
  38. JohnHudeski

    JohnHudeski

    Joined:
    Nov 18, 2017
    Posts:
    126
    Package Errors help

    Assets/animation-jobs-samples/Runtime/AnimationJobs/FullBodyIKJob.cs(6,31): error CS1070: The type `UnityEngine.Experimental.Animations.IAnimationJob' has been forwarded to an assembly that is not referenced. Enable the built in package 'Animation' in the Package Manager window to fix this error.



     
    Last edited: Jun 20, 2018
  39. Mecanim-Dev

    Mecanim-Dev

    Joined:
    Nov 26, 2012
    Posts:
    1,675
    @JohnHudeski which version of unity are you using?

    The Package Manager team recently made a change that broke project compatibility in 2018.2, so I guess you got hit by this issue.

    Depending on your Unity version, check whether the main menu has Help/Reset Packages to Defaults.
    You could try that if you haven't customized the packages included in your project.
     
  40. JohnHudeski

    JohnHudeski

    Joined:
    Nov 18, 2017
    Posts:
    126
    Beta 9.
    I tried beta 12 and it always crashes.
    With beta 9, when I reset the manifest to defaults I don't even see the Package Manager option anymore.
     
  41. Mecanim-Dev

    Mecanim-Dev

    Joined:
    Nov 26, 2012
    Posts:
    1,675
    Beta 12? :confused: We are currently working on releasing beta 10.

    Does it work if you reset to defaults?
     
  42. JohnHudeski

    JohnHudeski

    Joined:
    Nov 18, 2017
    Posts:
    126
    Silly me, 12 was 2018.1. OMG.
    But yeah, the one available from the Unity Hub is 2018.2 b9, and it doesn't.


    OMG, fixed it:
    Package Manager → Built-in Packages.
     
    Last edited: Jun 20, 2018
  43. JohnHudeski

    JohnHudeski

    Joined:
    Nov 18, 2017
    Posts:
    126
    Right now the Package Manager will not even show up in the editor if you don't include this in your manifest:

    Code (JSON):
    {
      "registry": "https://staging-packages.unity.com",
      "dependencies": {
        "com.unity.package-manager-ui": "1.9.11"
      }
    }
     
  44. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Seems fine here. I tried Unity on a fresh machine and the Package Manager shows up without issue. But I have never used the Unity Hub, and I do not intend to.
     
  45. tinyant

    tinyant

    Joined:
    Aug 28, 2015
    Posts:
    127
  46. awesomedata

    awesomedata

    Joined:
    Oct 8, 2014
    Posts:
    1,419
    @Mecanim-Dev


    Can we have --- vvv THIS vvv --- instead of "motion matching" and "Mecanim" please?





    I only recently came across this video -- we should have had this YEARS ago... :'(

    The title is terrible and not many people are even in the audience... but Ubisoft.... GO Ubisoft artist technology! :D
     
    Last edited: Jun 27, 2018
  47. Deleted User

    Deleted User

    Guest

    @awesomedata I was still at Ubisoft when this video was released, memories... :D

    As impressive as I find this video, the humanoid feature in Unity already provides some of the features presented here, like retargeting and foot locking (see the "IK Solver" section).

    That being said, I would also like to make all these features completely generic. We're slowly working on that, but it requires rethinking a good part of the animation workflow, which is not a piece of cake when you have millions of users! :)

    There is already a full-body IK sample in the repository using the Animation C# Jobs. If you don't want to use the Animation C# Jobs, you can also write a regular C# script that changes the humanoid goal weights, but the computation will be done on the main thread (because it will have to be a MonoBehaviour).
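
    As a minimal sketch of that main-thread approach (the target field, bone choice, and weight value below are my own illustrative assumptions, not taken from the sample):

    ```csharp
    using UnityEngine;

    // Hypothetical example: driving a humanoid IK goal weight from a regular
    // MonoBehaviour. Runs on the main thread, unlike an Animation C# job.
    // Requires "IK Pass" to be enabled on the Animator layer.
    [RequireComponent(typeof(Animator))]
    public class HandIKWeight : MonoBehaviour
    {
        public Transform rightHandTarget;          // illustrative target
        [Range(0f, 1f)] public float weight = 1f;  // illustrative default

        Animator animator;

        void Awake()
        {
            animator = GetComponent<Animator>();
        }

        // Called by the Animator during the IK pass.
        void OnAnimatorIK(int layerIndex)
        {
            if (rightHandTarget == null)
                return;
            animator.SetIKPositionWeight(AvatarIKGoal.RightHand, weight);
            animator.SetIKPosition(AvatarIKGoal.RightHand, rightHandTarget.position);
        }
    }
    ```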

    We don't know yet when the feature will be out of experimental. It will depend on the amount of feedback we receive. But rest assured that the goal is to make it stable as soon as possible (but it'll probably not be before 2019.1).
     
    hippocoder, Reanimate_L and tinyant like this.
  48. Deleted User

    Deleted User

    Guest

    As long as the properties are "animatable", it would be possible. But for computing things directly on the GPU, I don't know whether the Animation C# Jobs could help...

    ECS and the Animation C# Jobs aren't exactly the same, as you can see in this blog post. They both use the C# Jobs system, but the Animation C# Jobs API is a subpart of the Playable API.

    ECS is yet another beast to tame ;)
     
    tinyant and awesomedata like this.
  49. Deleted User

    Deleted User

    Guest

    If you look at the WeightedMaskMixer sample, you'll see that you can blend different parts of a skeleton together and define weights for all the bones (but I'm not sure I completely understood your questions).

    If I'm not mistaken, the NativeArray type only does deep copies (references are not supported in C# Jobs).

    There are some functions in UnityEngine.Playables.AnimationPlayableUtilities, like PlayClip().
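
    For instance, a hedged sketch of using PlayClip() (the component and field names here are illustrative):

    ```csharp
    using UnityEngine;
    using UnityEngine.Animations;
    using UnityEngine.Playables;

    // Hypothetical example: play a single clip on an Animator through the
    // Playable API convenience helper, without an AnimatorController.
    [RequireComponent(typeof(Animator))]
    public class PlayOneClip : MonoBehaviour
    {
        public AnimationClip clip;  // assign in the Inspector
        PlayableGraph graph;

        void Start()
        {
            // PlayClip builds and plays a minimal graph for this Animator.
            AnimationPlayableUtilities.PlayClip(GetComponent<Animator>(), clip, out graph);
        }

        void OnDestroy()
        {
            if (graph.IsValid())
                graph.Destroy();  // playable graphs must be destroyed manually
        }
    }
    ```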

    Yep, that is actually the good practice when changing bone transforms: prefer local space; if that's not possible, get all the world transforms, compute the new globals, and set them (never alternate get/set/get/set/... that's the worst thing that could happen).
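
    To illustrate the local-space advice with the experimental API discussed earlier in the thread, a minimal animation job might look like this (the bone handle binding and the rotation offset are illustrative assumptions):

    ```csharp
    using UnityEngine;
    using UnityEngine.Experimental.Animations;

    // Hypothetical example: an animation job that rotates one bone in local
    // space -- one read followed by one write, never interleaved get/set calls.
    public struct RotateBoneJob : IAnimationJob
    {
        // Bound beforehand with animator.BindStreamTransform(boneTransform).
        public TransformStreamHandle bone;
        public Quaternion offset;

        public void ProcessRootMotion(AnimationStream stream) { }

        public void ProcessAnimation(AnimationStream stream)
        {
            // Working in local space avoids the cost of world-space conversions.
            Quaternion localRotation = bone.GetLocalRotation(stream);
            bone.SetLocalRotation(stream, localRotation * offset);
        }
    }
    ```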
     
    awesomedata likes this.
  50. awesomedata

    awesomedata

    Joined:
    Oct 8, 2014
    Posts:
    1,419
    Fair enough! -- I was mainly just daydreaming on what might become possible with ECS's new low-level approach. That video Mike Geig did with the spaceship instancing impressed me because it seemed he didn't even have to mess with shaders. I was thinking that kind of performance (and especially flexibility!) might be REALLY nice if the Animation system could leverage it somehow!



    Indeed -- I was aware of the Jobs system, but I think ECS + PlayablesAPI was what was throwing me off, particularly how ECS works with the PlayablesAPI concept. As far as I understand it, the whole purpose of the PlayablesAPI is for "animatable" things to function properly with Timeline. -- Is that correct?

    If this is so, then I think we need to consider a low-level animation component that can tie into the Jobs system without being forced to be part of the Playables API. As counter-intuitive as it seems, I think it might be a good idea to split the various "types" of animation systems: for example, one that works with skeletal-based animation, another that does vertex transformation, another for texture animation, and so on -- essentially modularizing the systems. Then later, plug in a "link" system to hook the systems up to the Playables API so that they can be used individually as well as with Timeline / ECS. Is that something you've looked at doing?

    I personally want to do low-level animation on a regular basis, but other times IK / skeletal stuff is my main thing, so both animation styles are important to me. I just wonder if we'll be able to animate shader material properties with custom SRP too when it comes out? ( Not sure if you know about that, but I thought it was worth an ask! :D )

    Sorry for all the questions, but thank you for sharing with me what you have time for! :)



    Haha! -- That's quite awesome! :D

    Many props to you! -- I totally respect the minds at Ubisoft in terms of artist tools! -- Ever since the UbiArt engine I've been a fan! -- I'm really glad to see Unity scooping some of you guys up for this! :D



    I would be totally honored if I could help out with that in some way! -- I've been thinking about this very thing a lot over the past few months and have even made a few prototype tools to test out some concepts I believe could achieve a good workflow here. If you're interested in hearing some of my ideas, please let me know! -- I'll even draw diagrams and pictures! :)
     
    tinyant likes this.