
Animation C# jobs in 2018.2a5

Discussion in 'Animation' started by Mecanim-Dev, Apr 4, 2018.

  1. Mecanim-Dev

    Mecanim-Dev

    Unity Technologies

    Joined:
    Nov 26, 2012
    Posts:
    1,623
    @JohnHudeski not yet, but this is on our road map for animation jobs.
     
  2. JohnHudeski

    JohnHudeski

    Joined:
    Nov 18, 2017
    Posts:
    85
    The tears flow endlessly

In other news: is it better to create duplicate mirrored clips, or have it done at runtime à la Mecanim?
     
    awesomedata likes this.
  3. Mecanim-Dev

    Mecanim-Dev

    Unity Technologies

    Joined:
    Nov 26, 2012
    Posts:
    1,623
There is no good answer to this question, because it depends on many factors specific to each project.

Using mirroring at runtime can save you a few thousand MB if your project has many clips, so for a project close to its memory limit that could help a lot.
On the other hand, managing this at runtime will increase the complexity of your code and could introduce bugs.
There is also the performance aspect of the question: mirroring a clip costs a few CPU cycles, so if the project is already close to its performance limit, maybe it can't afford the cost of mirroring all clips at runtime.

So you have to choose between maintainability vs. memory consumption vs. performance.
     
    JohnHudeski likes this.
  4. KospY

    KospY

    Joined:
    May 12, 2014
    Posts:
    63
I recently came across the animation jobs samples on GitHub, and I'm pretty interested in the full-body IK solution.

I'm currently using another solution for full-body IK, but it is pretty complex, not really supported, and really costly on the CPU.
Romain's full-body IK solution, on the contrary, seems really promising: it's a lot simpler, and it uses animation jobs, so it's pretty efficient.
This solution could be great for player full-body IK in VR, but it lacks one essential thing: a head effector (position and rotation, to sync the HMD with the head). I hope there is a plan to add a head effector at some point, because it's heartbreaking to know that such a solution exists but can't be used for players in VR because it's missing such an essential thing.

By the way, if you want to make VR devs happy, just add an arm extender after that and it will be perfect (instead of pulling the body when the hand is too far away, it stretches the arm).

Thanks for taking these features into consideration!

Edit: Also, I'd be happy to know whether finger IK effectors are planned.
     
    Last edited: Sep 19, 2018
  5. Mecanim-Dev

    Mecanim-Dev

    Unity Technologies

    Joined:
    Nov 26, 2012
    Posts:
    1,623
@KospY we do have a head effector which works like a look-at:
https://docs.unity3d.com/ScriptRefe...s.AnimationHumanStream.SetLookAtPosition.html

If you look closely at the fullbodyik sample you should find it. It's called LookAtEffector, and it can affect the whole spine if wanted:
Code (CSharp):
m_LookAtEffector = SetupLookAtEffector(ref job.lookAtEffector, "LookAtEffector");
If it doesn't work exactly the way you'd like, you can create your own by simply driving the head transform with a TransformStreamHandle: feed the VR position/rotation directly into this TransformStreamHandle.
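A minimal sketch of that idea (HeadOverrideJob and its field names are hypothetical, not from the sample):

Code (CSharp):
using UnityEngine;
using UnityEngine.Experimental.Animations; // UnityEngine.Animations in later versions

public struct HeadOverrideJob : IAnimationJob
{
    // Bound once on setup via animator.BindStreamTransform(headBone).
    public TransformStreamHandle head;

    // HMD pose, refreshed from the main thread every frame
    // (e.g. with animationScriptPlayable.SetJobData(job)).
    public Vector3 hmdPosition;
    public Quaternion hmdRotation;

    public void ProcessRootMotion(AnimationStream stream) { }

    public void ProcessAnimation(AnimationStream stream)
    {
        // Stomp the animated head pose with the tracked HMD pose.
        head.SetPosition(stream, hmdPosition);
        head.SetRotation(stream, hmdRotation);
    }
}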

    There is no plan to add finger IK at the moment.

Fun fact: Mecanim has actually supported finger IK since 2009, but it was never exposed in Unity.
     
  6. KospY

    KospY

    Joined:
    May 12, 2014
    Posts:
    63
I'm not looking for a look-at effector. For VR, the goal is to sync the player's head position (the headset) with the head of the character. It should work the same as a hand effector (set position and rotation).

Interesting. Will feeding the position/rotation of the head through a TransformStreamHandle automatically solve the full-body IK? For example, if the player crouches IRL, will the other bones follow accordingly like they do with the hand effector (so it would lower the torso and bend the legs)?

I'm so sad... Why not expose this?
Is it possible to use TransformStreamHandle for that?
     
  7. AubreyH

    AubreyH

    Joined:
    May 17, 2018
    Posts:
    6
    This video comes up from time to time, and gets me quite excited, but... the talk is a little misleading.

I talked to a few people involved. Just want to note that this talk was mostly a prospective view of a technology. I don't believe it moved forward very far. Almost all the animations in there were mockups, so it's best to take it all with a pinch of salt.

    Some of the ideas will work... like the idea of doing variants by animating IK targets/constraints as opposed to the bones themselves, and then applying those to lots of variant rigs... that's fairly plausible and even happens in some games today, but perhaps in more focussed scopes.

Having a kind of general "filter" to convert one animation style to another (e.g. male walk to female walk)... that's where things get a little more, err, "optimistic" (though I share those hopes - retargeting mocap on a cartoon character looks like nightmare fuel to me.)

    You might expect with deep learning we could get toward that sort of thing, possibly. But back then, that wasn't really a thing people knew how to use.

    Anyway, I don't mean to douse your hopes, because there are still some good ideas in there, but from what I've gathered, it didn't actually go anywhere.
     
    Last edited: Oct 22, 2018
    WendelinReich, awesomedata and Baste like this.
  8. awesomedata

    awesomedata

    Joined:
    Oct 8, 2014
    Posts:
    655
    Sure, I thought as much. After all, it was mentioned in the talk, and I watched the entire talk a number of times just to see how much of what was mentioned in there was possible/relevant in Unity these days.


    No worries. That's good information -- thanks for the heads-up. :)

    Unity has kinda lagged a bit in the Animation department ever since Mecanim was revealed, so seeing these C# Animation Jobs really got me inspired.
Thinking about better procedural animation options like these and finding a way to integrate the best of them is what I live for. That's why I tossed this video out there -- I really wanted others to explore this new possibility space in Unity, now that it's possible. There's much that hasn't been thoroughly explored in Unity before, so it would be a great time to get involved, considering native webcam-based facial mocap and non-graph-based animation such as Kinematica are not that far around the corner...



    @RomainFailliot:
I'm not sure if you're involved in Kinematica, but from what I've gleaned from it (and my research into better animation tools), I've come to the conclusion that we'll never truly rid ourselves of the "graph" entirely -- but that's not a bad thing. As much as I support visual tools, I actually think minimizing the graph's use for "state"-based animation, and instead relegating it to the realm of "secondary-motion" animation (such as IK / ragdoll-physics blends -- both a more natural fit for a visual, graph-based interface), is a much better goal to shoot for with something like Kinematica, letting C# Animation Jobs ultimately work together with this secondary visual solution.

    Is this Unity's stance on this atm -- or are they looking at other options for the tool?

Kinematica is a great start to authoring more realistic animation scenarios (without the complexities of the graph), but I think it can go even further -- graph complexity can be relegated to things like that video demonstrates (e.g. the positioning of limbs with regard to weights, inertia acting upon the limbs, or masks for different weapon types, facial animation, etc.) instead of basic physical movement states like runs/jumps/falls/turns/rolls. Visual scripting could play a huge role in this for artists, but the major factor would be that the artist designs the secondary motions in a more mechanical way (graphs based on logical if/thens), while basic body kinematic states would be based around machine-learned Kinematica motion states, which would only rarely be overridden by code or game state -- e.g. in cinematics or victory poses.


BTW -- anyone interested in a collab with me to implement a better design for a generic animation system? The basic idea would be to supplement the Kinematica movements and the new facial mocap systems, when they arrive, with a more graph-based approach to secondary motion (while managing traditional key-based animation as the special case), with masking and other such "pose- / mask- / target- / property-blending" being the primary method of animation instead.
     
  9. AubreyH

    AubreyH

    Joined:
    May 17, 2018
    Posts:
    6
    With any luck, Kinematica will be implemented as a playable graph node of some sort, so you can simply chuck it into a graph, and use an AnimationLayerMixerPlayable to override anything you like.
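A minimal sketch of how that could look, assuming Kinematica exposed some playable node (kinematicaPlayable below is a hypothetical placeholder):

Code (CSharp):
using UnityEngine;
using UnityEngine.Animations;
using UnityEngine.Playables;

public class LayerOverrideExample : MonoBehaviour
{
    public AnimationClip overrideClip;
    PlayableGraph graph;

    void Start()
    {
        graph = PlayableGraph.Create("LayerOverride");
        var output = AnimationPlayableOutput.Create(graph, "Anim", GetComponent<Animator>());

        var mixer = AnimationLayerMixerPlayable.Create(graph, 2);
        var overridePlayable = AnimationClipPlayable.Create(graph, overrideClip);

        // graph.Connect(kinematicaPlayable, 0, mixer, 0); // layer 0: base motion (hypothetical)
        graph.Connect(overridePlayable, 0, mixer, 1);      // layer 1: the override
        mixer.SetInputWeight(0, 1f);
        mixer.SetInputWeight(1, 1f);
        // mixer.SetLayerMaskFromAvatarMask(1, upperBodyMask); // optionally mask the override

        output.SetSourcePlayable(mixer);
        graph.Play();
    }

    void OnDestroy() { graph.Destroy(); }
}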
     
    awesomedata likes this.
  10. mitaywalle

    mitaywalle

    Joined:
    Jul 1, 2013
    Posts:
    62
Hi, I've recreated this procedural animation "between poses" with some standard Unity features:

1. In the model's Animation import settings, add "curves" to the animation.
2. Create a float parameter on your Animator with the same name as the curve (the curve will then automatically drive it).
3. Use this parameter as the blend value in a blend tree between two animations with 2 keyframes each (note: the parameter becomes disabled, since it's curve-driven).
4. ???
5. PROFIT!!

Here is an example:
- 3 animations, all used:
  1. T-pose - 2 identical keyframes
  2. idle - 2 identical keyframes
  3. idle_0 - 5 keyframes, different poses for the fingers
- 2 curves

setup: http://prntscr.com/lam4tz
result: https://imgur.com/a/vk5Ae7z

P.S. The finger animation is made through custom IK: https://imgur.com/a/Luk6c1p
     
    Last edited: Oct 26, 2018
    montenom likes this.
  11. purpl3grape

    purpl3grape

    Joined:
    Oct 18, 2015
    Posts:
    3
Yeah, this looks like it has to be done with the 'hybrid' approach.

- Though to get a pure ECS player, it would have to be sort of 'faked', if I've understood the posts I've come across elsewhere.

1) I think they mentioned setting up a pure Entity and having it follow the 'hybrid' player, which handles the animations only, while the Entity handles the other game logic, leveraging multithreading there.

2) The other approach, baking animations into textures, I've found to have its own limitations too. Switching animations means modifying the material's texture data, and I don't think that can be described within structs. It IS still possible to modify the texture data on the main thread, but not within a job, so an army of characters changing animations, especially across several animation scenarios, may end up causing performance spikes (see the sketch below).
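As a rough illustration of the main-thread-only part, a minimal sketch (the "_AnimTex" property and the shader that samples it are assumptions, not from any specific sample):

Code (CSharp):
using UnityEngine;

public class BakedAnimationSwitcher : MonoBehaviour
{
    public Texture2D[] bakedClips; // one baked animation texture per clip
    MeshRenderer rend;
    MaterialPropertyBlock block;

    void Awake()
    {
        rend = GetComponent<MeshRenderer>();
        block = new MaterialPropertyBlock();
    }

    // Must be called from the main thread; renderer/material state
    // can't be touched from inside a job.
    public void Play(int clipIndex)
    {
        block.SetTexture("_AnimTex", bakedClips[clipIndex]);
        rend.SetPropertyBlock(block);
    }
}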

I feel that option 2) would be suitable for characters that only have one animation to handle, or that seldom change between multiple animations, so you just bake those one or two animations and play them in a loop (like in the Nordeus project example).

But for, say, a massive multiplayer first/third-person game where the characters have a bunch of complex animations that require blending etc., I would go with option 1).

This is just what I've considered so far. But there may be better, more efficient ways of handling animations for massive multiplayer without having to kneel to the 'hybrid' approach.


I'm curious: how would you approach this issue if you'd like to have control over a 'pure Entity' while having animations playing?
     
  12. awesomedata

    awesomedata

    Joined:
    Oct 8, 2014
    Posts:
    655
    Now that we have ECS, C# jobs, and the ability to render a mesh directly to the GPU, we could (theoretically) bypass the CPU (almost) entirely (except for processing a frame's basic data).

The caveat to this is that we'd have to rewrite the animation system to support stuff like vertex and bone animations ourselves, directly (which would kind of suck, but would be possible if we could alter the mesh before drawing it ourselves).
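For the "draw directly, no per-character GameObjects" part, a toy sketch (the vertex-animation shader itself is an assumption and not shown):

Code (CSharp):
using UnityEngine;

public class InstancedCrowd : MonoBehaviour
{
    public Mesh mesh;
    public Material material; // needs GPU instancing enabled; assumed to animate vertices in its shader

    // Graphics.DrawMeshInstanced accepts at most 1023 matrices per call.
    Matrix4x4[] transforms = new Matrix4x4[1023];

    void Start()
    {
        for (int i = 0; i < transforms.Length; i++)
            transforms[i] = Matrix4x4.Translate(new Vector3(i % 32, 0f, i / 32));
    }

    void Update()
    {
        // One call submits the whole crowd; the CPU only uploads matrices.
        Graphics.DrawMeshInstanced(mesh, 0, material, transforms);
    }
}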



I think a lot of that Git project was research into how to get the animations to process/render on the GPU (rather than the CPU) via the shader (way before ECS, I assume), probably using the material to control vertex offsets directly. I've not looked into it heavily, but the approach definitely was limiting, as you say.

As mentioned above, ECS and C# animation jobs pretty much "replace" the need for material storage, since it was only used as a way to bypass CPU bottlenecks by transferring the work directly to the GPU.
     
  13. mf_andreich

    mf_andreich

    Joined:
    Jul 7, 2017
    Posts:
    23
Hi guys! I'm not very experienced with animation jobs... but maybe it's possible to use a PlayableGraph (for animations only) without a target Animator? If it's possible, can I get the positions or rotations of bones without a TransformStreamHandle? I want to get pure data from the blended animations in a job, without Transforms and Animators, for sending directly to a shader.
     
  14. Baste

    Baste

    Joined:
    Jan 24, 2013
    Posts:
    3,519
    I'm trying to use this to modify Humanoid animations. I can't really get it to work, though.

    In essence, to test, I'm just doing a very naive job that's supposed to offset a bone:

Code (csharp):
public struct TestJob : IAnimationJob
{
    public TransformStreamHandle bone;

    public void ProcessRootMotion(AnimationStream stream)
    {
    }

    public void ProcessAnimation(AnimationStream stream)
    {
        var position = bone.GetPosition(stream);
        position += new Vector3(1f, 0f, 0f);
        bone.SetPosition(stream, position);
    }
}

// use:

graph = PlayableGraph.Create();
var animator = GetComponent<Animator>();
var output = AnimationPlayableOutput.Create(graph, "Player", animator);

// this is an AnimationLayerMixerPlayable
var animations = CreateAnimationGraph();

var animJob = new TestJob();
animJob.bone = animator.BindStreamTransform(testBone);
var animJobPlayable = AnimationScriptPlayable.Create(graph, animJob);
animJobPlayable.AddInput(animations, 0, 1f);

output.SetSourcePlayable(animJobPlayable);
graph.Play();
    This doesn't seem to do anything. I suspect that this is due to the model being a Humanoid model. If I modify a muscle in the humanStream, it works as expected:

Code (csharp):
public struct TestJob : IAnimationJob
{
    public float time;

    public void ProcessRootMotion(AnimationStream stream)
    {
    }

    public void ProcessAnimation(AnimationStream stream)
    {
        var muscleHandle = new MuscleHandle(BodyDof.UpperChestLeftRight);
        var humanStream = stream.AsHuman();
        time += stream.deltaTime;
        humanStream.SetMuscle(muscleHandle, humanStream.GetMuscle(muscleHandle) + Mathf.PingPong(time, 2f) - 1f);
    }
}
    So I'm guessing that this is a human/non-human issue? If I move a TransformStreamHandle that's targeting one of the bones used in the human avatar definition (like the shoulder), it moves the red helper gizmos that are shown when I've selected the animator, but that doesn't affect the played animation (and they're already all over the place).

    What's the intended way to move a bone in a human stream?
     
    florianhanke likes this.
  15. Mecanim-Dev

    Mecanim-Dev

    Unity Technologies

    Joined:
    Nov 26, 2012
    Posts:
    1,623
    Hi @Baste,

That should work; we did some IK jobs that change transform rotations on a humanoid.

In that case you are changing the position of the shoulder. Is your rig set up with translation DoF on? That could be one reason why it doesn't work in your case.
     
  16. Baste

    Baste

    Joined:
    Jan 24, 2013
    Posts:
    3,519
The translation DoF was indeed turned off. Turning it on did not help, though; there's still no effect from running the job other than the gizmos moving.

Edit: I also made sure every single FBX file containing the animations used has translation DoF turned on.
     
    Last edited: Nov 21, 2018
  17. Mecanim-Dev

    Mecanim-Dev

    Unity Technologies

    Joined:
    Nov 26, 2012
    Posts:
    1,623
@Baste can you validate whether rotation works?

Anyway, can you log a bug? This is expected to work.
     
  18. awesomedata

    awesomedata

    Joined:
    Oct 8, 2014
    Posts:
    655
    I've had issues with moving bones directly on humanoid characters since 2017.1 -- way before animation jobs.

As mentioned above, moving muscles works fine (assuming you can get the proper values for a pose), but modifying humanoid transforms directly fails without a very specific setup.
     
  19. Baste

    Baste

    Joined:
    Jan 24, 2013
    Posts:
    3,519
Rotation (both absolute and local) works. Position and scale do not.

I imported the same models and animations as generic rigs, and then everything works as expected. I'm uploading the bug report now; I'll post the bug # when I get it.

Update: Just tested this in 2018.3.0b11, and the issue persists. There's an issue in 2018.2 where the IK gizmos start moving around weirdly once I set the position or rotation of TransformStreamHandles, which I thought was a side effect of this bug, but that seems to be fixed.

    Update 2: Bug 1103108
     
    Last edited: Nov 22, 2018
  20. Mecanim-Dev

    Mecanim-Dev

    Unity Technologies

    Joined:
    Nov 26, 2012
    Posts:
    1,623
Scale is not expected to work, as there is no scale DoF in muscle space.
Position should work.

Yes, this is a bug that was fixed in 2018.3.0b11:
• Fixed IK effector being overridden in animation jobs when calling skeleton methods.
As soon as you set a position/rotation in an animation job for a humanoid rig, the IK effector position/rotation was overridden with a bad transform. There is no link between this bug and the one you logged.
     
  21. CyberSlime

    CyberSlime

    Joined:
    Jun 24, 2015
    Posts:
    6
TransformStreamHandle.GetPosition doesn't get the correct world position in 2018.3.0f2. It works in 2018.2.19f1, but in 2018.3.0f2 it only returns a position relative to the root bone of the skeleton, unless applyRootMotion is set to true on the Animator. Is this a bug?
     
  22. Mecanim-Dev

    Mecanim-Dev

    Unity Technologies

    Joined:
    Nov 26, 2012
    Posts:
    1,623
Yes, could you log a bug please? We will investigate.
     
  23. Mi_Ramon

    Mi_Ramon

    Joined:
    Oct 5, 2017
    Posts:
    1
    Using AnimationHumanStream.SolveIK() is currently very expensive. For comparison, this is a test scene with 50x DefaultMale_Humanoid from the sample project using an animator with IK enabled:
    ik_animator_20191.png

    And this is the same scene, but using a playable graph with an animation job that calls AnimationHumanStream.SolveIK():
    ik_jobs_20191.png
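For reference, a minimal sketch of the kind of job involved (the left-foot goal below is a hypothetical stand-in; the actual test job may set more effectors):

Code (CSharp):
using UnityEngine;
using UnityEngine.Experimental.Animations; // UnityEngine.Animations in later versions

public struct SolveIKJob : IAnimationJob
{
    public Vector3 leftFootGoal; // hypothetical effector target

    public void ProcessRootMotion(AnimationStream stream) { }

    public void ProcessAnimation(AnimationStream stream)
    {
        var human = stream.AsHuman();

        // Set one or more effector goals, then run the humanoid IK solver.
        human.SetGoalPosition(AvatarIKGoal.LeftFoot, leftFootGoal);
        human.SetGoalWeightPosition(AvatarIKGoal.LeftFoot, 1f);
        human.SolveIK();
    }
}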


There were some much-needed bug fixes in the 2019.1 alpha, like animation jobs now working with the Burst compiler. Animation jobs also now seem cheaper to schedule. Will some of these be backported to 2018.3 or 2018.4?