
Animation C# jobs in 2018.2a5

Discussion in 'Animation' started by Mecanim-Dev, Apr 4, 2018.

  1. Mecanim-Dev

    Mecanim-Dev

    Joined:
    Nov 26, 2012
    Posts:
    1,675
    @JohnHudeski not yet, but this is on our road map for animation jobs.
     
  2. JohnHudeski

    JohnHudeski

    Joined:
    Nov 18, 2017
    Posts:
    126
    The tears flow endlessly

    In other news: is it better to create duplicate mirrored clips, or have it done at runtime à la Mecanim?
     
    awesomedata likes this.
  3. Mecanim-Dev

    Mecanim-Dev

    Joined:
    Nov 26, 2012
    Posts:
    1,675
    There is no good answer to this question because it depends on many factors specific to each project.

    Mirroring at runtime can save you a few thousand MB if your project has many clips, so for a project close to its memory limit that could help a lot.
    On the other hand, managing this at runtime will increase the complexity of your code and could introduce bugs.
    There is also the performance aspect: mirroring a clip costs a few CPU cycles, so if the project is already close to its performance limit it may not be able to afford mirroring every clip at runtime.

    So you have to choose between maintainability, memory consumption, and performance.
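    To make the runtime option concrete: in the Animator Controller, a humanoid state's Mirror option can be bound to a bool parameter, so a single clip can serve both sides. A trivial sketch, assuming a parameter named "Mirror" (the name is just an example):
    Code (CSharp):
    // The state's Mirror option is bound to the "Mirror" bool parameter in the controller.
    animator.SetBool("Mirror", facingLeft);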
     
    JohnHudeski likes this.
  4. KospY

    KospY

    Joined:
    May 12, 2014
    Posts:
    153
    I recently came across the animation jobs samples on GitHub and I'm pretty interested in the full-body IK solution.

    I'm currently using another solution for full-body IK, but it is pretty complex, not really supported, and really costly on the CPU.
    Romain's full-body IK solution, on the contrary, seems really promising: it's a lot simpler and it uses animation jobs, so it's pretty efficient.
    This solution could be great for player full-body IK in VR, but it lacks one essential thing: a head effector (position and rotation, to sync the HMD with the head). I hope there is a plan to add a head effector at some point, because it's heartbreaking to know that such a solution exists but can't be used for the player in VR since it's missing something so essential.

    By the way, if you want to make VR devs happy, just add an arm extender after that and it will be perfect (instead of pulling the body when the hand is too far, it stretches the arm).

    Thanks for taking these features into consideration!

    Edit: Also, I'd be happy to know if finger IK effectors are planned.
     
    Last edited: Sep 19, 2018
  5. Mecanim-Dev

    Mecanim-Dev

    Joined:
    Nov 26, 2012
    Posts:
    1,675
    @KospY we do have a head effector which works like a look-at:
    https://docs.unity3d.com/ScriptRefe...s.AnimationHumanStream.SetLookAtPosition.html

    If you look closely in the full-body IK sample you should find it. It's called LookAtEffector and it can affect the whole spine if wanted:
    Code (CSharp):
    m_LookAtEffector = SetupLookAtEffector(ref job.lookAtEffector, "LookAtEffector");
    If it doesn't work exactly the way you would like, you can create your own by simply driving the head transform with a TransformStreamHandle, feeding the VR position/rotation directly into that TransformStreamHandle.
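    For illustration, a minimal sketch of that idea (the struct name and the hmdPosition/hmdRotation fields are assumptions, not part of the sample; the handle would be bound with animator.BindStreamTransform on the head bone and the job hooked up through an AnimationScriptPlayable):
    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Animations; // UnityEngine.Experimental.Animations on 2018.x

    public struct HeadOverrideJob : IAnimationJob
    {
        public TransformStreamHandle head; // bound to the head bone of the rig
        public Vector3 hmdPosition;        // updated from the main thread each frame
        public Quaternion hmdRotation;

        public void ProcessRootMotion(AnimationStream stream) { }

        public void ProcessAnimation(AnimationStream stream)
        {
            // Overwrite whatever the clips produced for the head with the tracked HMD pose (world space).
            head.SetPosition(stream, hmdPosition);
            head.SetRotation(stream, hmdRotation);
        }
    }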

    There is no plan to add finger IK at the moment.

    Fun fact: Mecanim has actually supported finger IK since 2009, but it was never exposed in Unity.
     
  6. KospY

    KospY

    Joined:
    May 12, 2014
    Posts:
    153
    I'm not looking for a look-at effector. For VR, the goal is to sync the player's head position (the headset) with the head of the character. It should work the same as a hand effector (set position and rotation).

    Interesting, will feeding the head's position/rotation through a TransformStreamHandle automatically solve the full-body IK? For example, if the player crouches IRL, will the other bones follow accordingly like they do for the hand effectors (so it would lower the torso and bend the legs)?

    I'm so sad... Why not expose this?
    Is it possible to use a TransformStreamHandle for that?
     
  7. AubreyH

    AubreyH

    Joined:
    May 17, 2018
    Posts:
    18
    This video comes up from time to time, and gets me quite excited, but... the talk is a little misleading.

    I talked to a few people involved. Just want to note that this talk was mostly a prospective view of a technology. I don't believe it moved forward very far. Almost all the animations in there were mockups, so it's best to take it all with a pinch of salt.

    Some of the ideas will work... like the idea of doing variants by animating IK targets/constraints as opposed to the bones themselves, and then applying those to lots of variant rigs... that's fairly plausible and even happens in some games today, but perhaps in more focused scopes.

    Having a kind of general "filter" to convert one animation style to another (i.e. male walk to female walk)... that's where things get a little more, err, "optimistic" (though I share those hopes - retargeting mocap on a cartoon character looks like nightmare fuel to me.)

    You might expect with deep learning we could get toward that sort of thing, possibly. But back then, that wasn't really a thing people knew how to use.

    Anyway, I don't mean to douse your hopes, because there are still some good ideas in there, but from what I've gathered, it didn't actually go anywhere.
     
    Last edited: Oct 22, 2018
    WendelinReich, awesomedata and Baste like this.
  8. awesomedata

    awesomedata

    Joined:
    Oct 8, 2014
    Posts:
    1,419
    Sure, I thought as much. After all, it was mentioned in the talk, and I watched the entire talk a number of times just to see how much of what was mentioned in there was possible/relevant in Unity these days.


    No worries. That's good information -- thanks for the heads-up. :)

    Unity has kinda lagged a bit in the Animation department ever since Mecanim was revealed, so seeing these C# Animation Jobs really got me inspired.
    Thinking about better procedural animation options like these and finding a way to integrate the best of them is what I live for. That's why I tossed this video out there -- I really wanted others to explore this new possibility space in Unity, now that it's possible. There's much that hasn't been thoroughly explored before in Unity, so it would be a great time to get involved, considering native webcam-based facial mocap and non-graph-based animation such as Kinematica are not that far around the corner...



    @RomainFailliot:
    I'm not sure if you're involved in Kinematica, but from what I've gleaned from it (and my research into better animation tools), I've come to the conclusion that we'll never truly rid ourselves of the "graph" entirely -- but that's not a bad thing. As much as I support visual tools, I actually think a much better goal for something like Kinematica is to minimize the graph's use for "state"-based animation and instead relegate it to the realm of "secondary-motion" animation (such as IK / ragdoll-physics blends -- both a more natural fit for a visual, graph-based interface), and let C# Animation Jobs ultimately work together with this secondary visual solution.

    Is this Unity's stance on this atm -- or are they looking at other options for the tool?

    Kinematica is a great start for authoring more realistic animation scenarios (without the complexities of the graph), but I think it can go even farther -- graph complexity can be relegated to the things that video demonstrates (i.e. the positioning of limbs with regard to weights, inertia acting upon the limbs, or masks for different weapon types, facial animation, etc.) instead of basic physical movement states like runs/jumps/falls/turns/rolls. Visual scripting could play a huge role in this for artists, but the major factor would be that the artist designs the secondary motions in a more mechanical way (graphs based on logical if/thens), while basic body kinematic states would be based around machine-learned Kinematica motion states, which would only rarely be overridden by code or game state -- e.g. in cinematics or victory poses.


    BTW -- anyone interested in a collab with me to implement a better design for a generic animation system? The basic idea would be to supplement the Kinematica movements and the new facial mocap systems, when they arrive, with a more graph-based approach to secondary motion (while managing traditional key-based animation as the special case), with masking and other such "pose- / mask- / target- / property-blending" being the primary method of animation instead.
     
  9. AubreyH

    AubreyH

    Joined:
    May 17, 2018
    Posts:
    18
    With any luck, Kinematica will be implemented as a playable graph node of some sort, so you can simply chuck it into a graph, and use an AnimationLayerMixerPlayable to override anything you like.
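    Something along those lines could look like this (a loose sketch; baseSource stands in for whatever plays the base motion, and overrideClip/upperBodyMask are placeholders):
    Code (CSharp):
    var graph = PlayableGraph.Create("LayerOverride");
    var output = AnimationPlayableOutput.Create(graph, "output", GetComponent<Animator>());

    var mixer = AnimationLayerMixerPlayable.Create(graph, 2);
    graph.Connect(baseSource, 0, mixer, 0);                                        // layer 0: base motion
    graph.Connect(AnimationClipPlayable.Create(graph, overrideClip), 0, mixer, 1); // layer 1: override

    mixer.SetInputWeight(0, 1f);
    mixer.SetInputWeight(1, 1f);
    mixer.SetLayerMaskFromAvatarMask(1, upperBodyMask); // optional: limit the override to part of the rig

    output.SetSourcePlayable(mixer);
    graph.Play();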
     
    awesomedata likes this.
  10. mitaywalle

    mitaywalle

    Joined:
    Jul 1, 2013
    Posts:
    247
    Hi, I've recreated this procedural animation "between poses" with some standard Unity features.

    1. In the model's / Animation asset's import settings, add "Curves" to the animation,
    2. create a float parameter on your Animator with the same name as the curve (the curve will automatically drive it),
    3. use this parameter as the blend parameter of a blend tree between two animations with 2 keyframes each (attention: the parameter becomes disabled),
    4. ???
    5. PROFIT!!

    Here is an example:
    - 3 animations, all used:
    1. T-pose - 2 identical keyframes,
    2. idle - 2 identical keyframes,
    3. idle_0 - 5 keyframes, different poses for the fingers
    - 2 curves

    setup: http://prntscr.com/lam4tz
    result: https://imgur.com/a/vk5Ae7z

    P.S. The finger animation is made through custom IK: https://imgur.com/a/Luk6c1p
     
    Last edited: Oct 26, 2018
    montenom likes this.
  11. purpl3grape

    purpl3grape

    Joined:
    Oct 18, 2015
    Posts:
    8
    Yea, this looks like it has to be done with the 'hybrid' approach.

    - Though to get a pure ECS player, it would have to be sort of 'faked', if I have understood the posts I've come across elsewhere.

    1) I think they mentioned setting up a pure Entity and having it follow the 'hybrid' player, which handles the animations only, while the Entity handles the other game logic, leveraging multithreading there.

    2) The other approach, baking animations into textures, I've found to have its own limitations too. Switching animations would mean modifying the Material's texture data, but I don't think that can be described within structs. It IS possible to still modify the texture data on the main thread, but not within a job. So an army of characters changing animations, especially if there are several animation scenarios, may end up causing some performance spikes.

    I feel that option 2) would be suitable for characters that only have one animation to handle, or that seldom switch between multiple animations; so just bake that one or two animations and play them in a loop (like in the Nordeus project example).

    But for, say, a massive multiplayer first/third person game where the characters have a bunch of animations that are complex and require blending etc., I would go with option 1).

    This is just what I've considered so far. But there may be better, more efficient ways of handling animations for a massive multiplayer game without having to kneel to the 'hybrid' approach.


    I'm curious how you would approach this issue if you'd like to have control over a 'pure Entity' that has animations playing?
     
    Deleted User and thelebaron like this.
  12. awesomedata

    awesomedata

    Joined:
    Oct 8, 2014
    Posts:
    1,419
    Now that we have ECS, C# Jobs, and the ability to render a mesh directly on the GPU, we could (theoretically) bypass the CPU (almost) entirely (except for processing a frame's basic data).

    The caveat is that we'd have to rewrite the animation system to support stuff like vertex and bone animation ourselves directly (which would kind of suck, but would be possible if we could alter the mesh before drawing it ourselves).



    I think a lot of that git project was research into how to get the animations to process/render on the GPU (rather than the CPU) via the shader (way before ECS, I assume), probably using the material to control vertex offsets directly. I've not looked into it heavily, but the approach definitely was limiting, as you say.

    As mentioned above, ECS and C# animation jobs pretty much "replace" the need for the material-storage trick, since it was only used as a way to bypass CPU bottlenecks by transferring the work directly to the GPU.
     
  13. mf_andreich

    mf_andreich

    Joined:
    Jul 7, 2017
    Posts:
    38
    Hi guys! I'm not very experienced with animation jobs... but maybe it's possible to use a PlayableGraph (for animations only) without a target Animator? If that's possible, can I get the positions or rotations of bones without a TransformStreamHandle? I want to get pure data from blended animations in a job, without Transforms and Animators, for sending directly to a shader.
     
  14. Baste

    Baste

    Joined:
    Jan 24, 2013
    Posts:
    6,334
    I'm trying to use this to modify Humanoid animations. I can't really get it to work, though.

    In essence, to test, I'm just doing a very naive job that's supposed to offset a bone:

    Code (csharp):
    public struct TestJob : IAnimationJob
    {
        public TransformStreamHandle bone;

        public void ProcessRootMotion(AnimationStream stream)
        {
        }

        public void ProcessAnimation(AnimationStream stream)
        {
            var position = bone.GetPosition(stream);
            position += new Vector3(1f, 0f, 0f);
            bone.SetPosition(stream, position);
        }
    }

    // use:

    graph = PlayableGraph.Create();
    var animator = GetComponent<Animator>();
    var output = AnimationPlayableOutput.Create(graph, "Player", animator);

    // this is an AnimationLayerMixerPlayable
    var animations = CreateAnimationGraph();

    var animJob = new TestJob();
    animJob.bone = OutputAnimator.BindStreamTransform(testBone);
    var animJobPlayable = AnimationScriptPlayable.Create(graph, animJob);
    animJobPlayable.AddInput(animations, 0, 1f);

    output.SetSourcePlayable(animJobPlayable);
    graph.Play();
    This doesn't seem to do anything. I suspect that this is due to the model being a Humanoid model. If I modify a muscle in the humanStream, it works as expected:

    Code (csharp):
    public struct TestJob : IAnimationJob
    {
        public float time;

        public void ProcessRootMotion(AnimationStream stream)
        {
        }

        public void ProcessAnimation(AnimationStream stream)
        {
            var muscleHandle = new MuscleHandle(BodyDof.UpperChestLeftRight);
            var humanStream = stream.AsHuman();
            time += stream.deltaTime;
            humanStream.SetMuscle(muscleHandle, humanStream.GetMuscle(muscleHandle) + Mathf.PingPong(time, 2f) - 1f);
        }
    }
    So I'm guessing that this is a human/non-human issue? If I move a TransformStreamHandle that's targeting one of the bones used in the human avatar definition (like the shoulder), it moves the red helper gizmos that are shown when I've selected the animator, but that doesn't affect the played animation (and they're already all over the place).

    What's the intended way to move a bone in a human stream?
     
    florianhanke likes this.
  15. Mecanim-Dev

    Mecanim-Dev

    Joined:
    Nov 26, 2012
    Posts:
    1,675
    Hi @Baste,

    that should work, we did some IK jobs that change transform rotations on a humanoid.

    In that case you are changing the position of the shoulder; is your rig set up with translation DoF on? That could be one reason why it doesn't work in your case.
     
  16. Baste

    Baste

    Joined:
    Jan 24, 2013
    Posts:
    6,334
    The translation DoF was indeed turned off. Changing it did not help, though; there's still no effect from running the job other than moving the gizmos.

    Edit: also made sure every single fbx file containing used animations has translation DoF turned on.
     
    Last edited: Nov 21, 2018
  17. Mecanim-Dev

    Mecanim-Dev

    Joined:
    Nov 26, 2012
    Posts:
    1,675
    @Baste can you validate whether rotation works?

    Anyway, can you log a bug? This is expected to be working.
     
  18. awesomedata

    awesomedata

    Joined:
    Oct 8, 2014
    Posts:
    1,419
    I've had issues with moving bones directly on humanoid characters since 2017.1 -- way before animation jobs.

    As mentioned above, moving muscles works fine (assuming you can get the proper values for a pose), but modifying humanoid transforms directly fails without a very specific setup.
     
  19. Baste

    Baste

    Joined:
    Jan 24, 2013
    Posts:
    6,334
    Rotation (both absolute and local) works. Position and scale do not.

    I imported the same models and animations as generic rigs, and then everything works as expected. I'm uploading the bug report now, will post the bug # when I get it.

    Update: Just tested this in 2018.3.0b11, and the issue persists. There's an issue in 2018.2 where the IK gizmos start moving around weirdly once I set the position or rotation of TransformStreamHandles, which I thought was a side-effect of this bug, but that seems fixed.

    Update 2: Bug 1103108
     
    Last edited: Nov 22, 2018
  20. Mecanim-Dev

    Mecanim-Dev

    Joined:
    Nov 26, 2012
    Posts:
    1,675
    Scale is not expected to work as there is no scale DoF in muscle space.
    Position should work.

    Yes, this is a bug that was fixed in 2018.3b11:
    • Fixed IK effector being overriden in animation jobs when calling skeleton methods.
    As soon as you set a position/rotation in an animation job for a humanoid rig, the IK effector position/rotation was overridden with a bad transform. There is no link between this bug and the one you logged.
     
  21. CyberSlime

    CyberSlime

    Joined:
    Jun 24, 2015
    Posts:
    6
    TransformStreamHandle.GetPosition doesn't return the correct world position in 2018.3.0f2. It works in 2018.2.19f1, but in 2018.3.0f2 it only returns a position relative to the root bone of the skeleton, unless applyRootMotion is set to true on the Animator. Is this a bug?
     
  22. Mecanim-Dev

    Mecanim-Dev

    Joined:
    Nov 26, 2012
    Posts:
    1,675
    Yes, could you log a bug please? We will investigate.
     
  23. Mimimi_Ramon

    Mimimi_Ramon

    Joined:
    Oct 5, 2017
    Posts:
    3
    Using AnimationHumanStream.SolveIK() is currently very expensive. For comparison, this is a test scene with 50x DefaultMale_Humanoid from the sample project using an animator with IK enabled:
    ik_animator_20191.png

    And this is the same scene, but using a playable graph with an animation job that calls AnimationHumanStream.SolveIK():
    ik_jobs_20191.png


    There were some much needed bug fixes in the 2019.1 alpha, like the animation jobs now working with the burst compiler. Animation jobs also now seem cheaper to schedule. Will some of these be backported to 2018.3 or 2018.4?
     
  24. Mecanim-Dev

    Mecanim-Dev

    Joined:
    Nov 26, 2012
    Posts:
    1,675
    We will backport the burst fix to 2018.3 and 2018.4
     
  25. daniel_groundshatter

    daniel_groundshatter

    Joined:
    Nov 21, 2016
    Posts:
    1
    Resolving a TransformStreamHandle in a job will cause a crash in 2019.1.0b1 if the bone you are resolving is part of an optimised rig. Previous versions of Unity that I have tested throw an InvalidOperationException instead. Do you plan to support optimised rigs for animation jobs?
     
  26. MiFrilke

    MiFrilke

    Joined:
    Dec 14, 2016
    Posts:
    41
    Hi, I'd like to follow up on the previously mentioned scheduling improvements we saw in the 2019.1 alpha. Are these also coming to 2018.3/.4, or does the backport only contain the burst fix as you seem to mention explicitly? Or maybe the burst fix automatically allows for better scheduling?

    Also, can you share information about the status of the backport, maybe a projected release date? We're currently in the process of analyzing all performance critical systems in our game and with sometimes around 50 fully animated NPCs on screen, it can get quite expensive. If the mentioned animation job fixes land in 2018.3 some time soon, we might be able to try them out again.

    Thank you
     
  27. Mecanim-Dev

    Mecanim-Dev

    Joined:
    Nov 26, 2012
    Posts:
    1,675
    It shouldn't crash, please log a bug.

    We would like to support optimised rigs, but I don't have any ETA; right now all our resources are going into runtime rigging and the new DOTS animation system.
     
  28. Mecanim-Dev

    Mecanim-Dev

    Joined:
    Nov 26, 2012
    Posts:
    1,675
    The backport will contain only the fix that allows users to use Burst with animation jobs. It should land pretty soon in 2018.x since the original fix has already landed.
     
  29. JohnHudeski

    JohnHudeski

    Joined:
    Nov 18, 2017
    Posts:
    126
    Will the skills acquired from this be transferable once the motion matching API is released?
     
  30. AubreyH

    AubreyH

    Joined:
    May 17, 2018
    Posts:
    18
    Hi, what does DOTS stand for?
     
  31. Baste

    Baste

    Joined:
    Jan 24, 2013
    Posts:
    6,334
    Data Oriented Tech Stack. So ECS/Job System.

    I don't know why Unity decided to start referring to it as DOTS - I know that there's more to it than ECS, but the two acronyms get confusing.
     
    AubreyH likes this.
  32. thelebaron

    thelebaron

    Joined:
    Jun 2, 2013
    Posts:
    857
    By "DOTS animation system", does this refer to just Burst, or to the whole gamut of ECS-native animation in combination with jobs and Burst?
     
  33. YuriyPopov

    YuriyPopov

    Joined:
    Sep 5, 2017
    Posts:
    237
    So I want to use the look-at sample on our enemies so they look at the player, but I can't figure out how to use this new system with an Animator. We still need the Animator for all the other animations on the enemy, such as attacks etc. How do I go about doing this?
     
  34. awesomedata

    awesomedata

    Joined:
    Oct 8, 2014
    Posts:
    1,419
    Documentation is pretty lacking in general with regard to Animation.

    Without a demonstration, it takes hours of testing to even grasp which methods interact, much less how they were intended to do so.

    I'd venture to say you can't use Animators, because Animators handle the low-level transform stuff automatically (on the main thread), but with C# animation jobs everything involving transforms (even bones) suddenly becomes manual.
     
  35. YuriyPopov

    YuriyPopov

    Joined:
    Sep 5, 2017
    Posts:
    237
    You actually can. I found the example in the new Playables documentation, since the job system also seems to use them. Using a mixer, it works just fine.
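    For anyone else hitting this, a rough sketch of one way to wire it up (assuming lookAtJob is an already-bound job struct like the one in the samples; the existing Animator Controller keeps driving attacks/locomotion and the job post-processes its output):
    Code (CSharp):
    var animator = GetComponent<Animator>();
    var graph = PlayableGraph.Create("LookAtGraph");
    var output = AnimationPlayableOutput.Create(graph, "output", animator);

    // Keep the existing state machine as the input of the job.
    var controller = AnimatorControllerPlayable.Create(graph, animator.runtimeAnimatorController);

    var lookAtPlayable = AnimationScriptPlayable.Create(graph, lookAtJob);
    lookAtPlayable.AddInput(controller, 0, 1f);

    output.SetSourcePlayable(lookAtPlayable);
    graph.Play();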
     
    awesomedata likes this.
  36. Miss_Jones

    Miss_Jones

    Joined:
    Sep 21, 2014
    Posts:
    1
    I tried setting up a TransformStreamHandle to affect the head position directly instead of a LookAtEffector. I cannot get it to work.

    I modified the FullBodyIKJob script to include a HeadEffectorHandle:

    Code (CSharp):
    public struct HeadEffectorHandle
    {
        public TransformStreamHandle effector;
        public Vector3 position;
        public Quaternion rotation;
    }
    I then have a method for setting the head effector:

    Code (CSharp):
    private void SetHeadEffector(AnimationStream stream, ref HeadEffectorHandle handle)
    {
        if (handle.effector.IsValid(stream))
        {
            AnimationHumanStream humanStream = stream.AsHuman();
            handle.effector.SetPosition(stream, handle.position);
            handle.effector.SetRotation(stream, handle.rotation);
        }
    }
    Which is called in the ProcessAnimation method:

    Code (CSharp):
    public void ProcessAnimation(AnimationStream stream)
    {
        SetMaximumExtension(stream, ref leftArm);
        SetMaximumExtension(stream, ref rightArm);
        SetMaximumExtension(stream, ref leftLeg);
        SetMaximumExtension(stream, ref rightLeg);

        SetEffector(stream, AvatarIKGoal.LeftFoot, ref leftFootEffector);
        SetEffector(stream, AvatarIKGoal.RightFoot, ref rightFootEffector);
        SetEffector(stream, AvatarIKGoal.LeftHand, ref leftHandEffector);
        SetEffector(stream, AvatarIKGoal.RightHand, ref rightHandEffector);

        SetHintEffector(stream, AvatarIKHint.LeftKnee, ref leftKneeHintEffector);
        SetHintEffector(stream, AvatarIKHint.RightKnee, ref rightKneeHintEffector);
        SetHintEffector(stream, AvatarIKHint.LeftElbow, ref leftElbowHintEffector);
        SetHintEffector(stream, AvatarIKHint.RightElbow, ref rightElbowHintEffector);

        SetHeadEffector(stream, ref headEffector);
        //SetLookAtEffector(stream, ref lookAtEffector);

        SetBodyEffector(stream, ref bodyEffector);

        Solve(stream);
    }
    The binding is set up in the FullBodyIK script:

    Code (CSharp):
    handle.effector = m_Animator.BindStreamTransform(go.transform);
    However, when played it gives me the following error:
    InvalidOperationException: The TransformStreamHandle cannot be resolved.

    What does this error mean and how can I resolve it, so that I can get my head effector to work properly?

    I read somewhere that the Animator's Avatar might be the reason for the error, but removing it causes the FullBodyIK to not work at all.
     
  37. AubreyH

    AubreyH

    Joined:
    May 17, 2018
    Posts:
    18
    I've noticed that ProcessAnimation() does not have access to the input streams' root motion velocities, even though ProcessRootMotion() seems to occur before ProcessAnimation().

    Reading stream.GetInput(0).rootVelocity or angularVelocity from inside ProcessAnimation() always comes back zero, even when there's a non-zero rootVelocity/angularVelocity in the same job's ProcessRootMotion().

    I'm guessing they're on different threads and thus can't talk to one another?

    Any workarounds? Any way I can see how much that node is going to rotate me, without adding a downstream node?

    (The reason I'm doing this is to IK hands into camera space, but the root motion rotates my entire model, so the camera bone itself gets hefted around by the root motion rotation. If I could adjust the target camera position/rotation to take into account the rotation caused by root motion, I could preemptively compensate the camera's position for it.)

    (Then again, maybe I need to re-think this.)
     
  38. AubreyH

    AubreyH

    Joined:
    May 17, 2018
    Posts:
    18
    Actually, thinking about this another way, doesn't the order of execution HAVE to be:
    ProcessRootMotion, OnAnimatorMove, ProcessAnimation?

    Because if I'm trying to solve IK for a world position for my animation, then I need my root motion to have been applied already, right? Otherwise the IK code running in the ProcessAnimation call is dealing with a stale world position for the rig space?

    And I can't predict the outcome of root motion first in any very easy way, right? Not in any blend tree of real complexity, at least.

    If it IS this way already, sorry to bother you. I wasn't able to discern it from the docs. I'll go and try to figure out if this is the order of execution and come back and confirm.
     
    Last edited: May 1, 2019
  39. AubreyH

    AubreyH

    Joined:
    May 17, 2018
    Posts:
    18
    Yep! Just checked, and it does work in that execution order.

    1. ProcessRootMotion
    2. OnAnimatorMove
    3. ProcessAnimation

    Phew! Good stuff! Right! Now I know that my error is somewhere else! I should be updating my effector positions in OnAnimatorMove so that they're ready just in time for the IK.
     
    Last edited: May 1, 2019
  40. Mecanim-Dev

    Mecanim-Dev

    Joined:
    Nov 26, 2012
    Posts:
    1,675
    So we do have two different kinds of transform handles:

    TransformStreamHandle: the transform is animated by the Animator; the transform is a child or grandchild of the Animator's root transform.
    TransformSceneHandle: the transform is not animated by the Animator; the Animator will read the value directly from the transform at the beginning of the Animator frame.

    Usually effectors are not animated by the same Animator; for a look-at, the target is probably something outside of the Animator hierarchy, maybe another character, so you probably need to use a TransformSceneHandle in this case.

    I don't see the whole code so I could be wrong, but if your effector is indeed a child or grandchild of your Animator root then you should log a bug, because it should work.
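    For reference, a minimal sketch of the difference, assuming a job struct with a writable head field (TransformStreamHandle) and a read-only target field (TransformSceneHandle), plus hypothetical headBone/hmdTarget transforms:
    Code (CSharp):
    // headBone is under the Animator hierarchy, hmdTarget is an unrelated scene object.
    job.head   = animator.BindStreamTransform(headBone); // animated by this Animator, writable in the job
    job.target = animator.BindSceneTransform(hmdTarget); // not animated, read back at the start of the frame

    // Inside ProcessAnimation, target.GetPosition(stream)/GetRotation(stream) read the scene value,
    // and head.SetPosition/SetRotation write the result onto the animated bone.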
     
  41. WendelinReich

    WendelinReich

    Joined:
    Dec 22, 2011
    Posts:
    228
    Hi @Mecanim-Dev, I have a question about data flow between a custom IAnimationJob and the main thread. In our case, at least a dozen variables used by our job will be modified in Update() each frame. Right now, the only way to inform the job of these changes seems very convoluted (via PropertySceneHandle, which requires setting up some MonoBehaviour just to exchange data between the main thread and the custom IAnimationJob). So: is there a better way to get new data into the job, or is one in the works?

    Thanks!
     
    Last edited: May 27, 2019
  42. awesomedata

    awesomedata

    Joined:
    Oct 8, 2014
    Posts:
    1,419
    Good luck on getting anything out of these guys right now. -- Either the Unity Animation team is _really_ busy right now with something animation-related (I'm hoping and praying that's the case), or they've all been disbanded and pulled to work on other projects (which has happened before.) I'm really hoping it's the former and not the latter. Animation really needs some TLC these days... :(
     
  43. Mecanim-Dev

    Mecanim-Dev

    Joined:
    Nov 26, 2012
    Posts:
    1,675
    No, those are the only ways to safely send data to a job. You can also create custom properties in the animation stream if you don't want to bind them to a specific component value.
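    If it helps, a rough sketch of the custom-property route (the property name "MyWeight" and the variable names are made up): bind the property once when building the graph, then any job in the graph can read or write it through the stream.
    Code (CSharp):
    // Setup (main thread): creates an animated float in the stream, not bound to any component.
    PropertyStreamHandle weightHandle = animator.BindCustomStreamProperty("MyWeight", CustomStreamPropertyType.Float);

    // Inside a job's ProcessAnimation:
    weightHandle.SetFloat(stream, 0.5f);     // one job injects the value...
    float w = weightHandle.GetFloat(stream); // ...a downstream job (e.g. a mixer) can read or blend it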

    Yes, we are indeed very busy; we are working on DOTS animation and Kinematica.
     
  44. WendelinReich

    WendelinReich

    Joined:
    Dec 22, 2011
    Posts:
    228
    Thanks @Mecanim-Dev! Actually, there seems to be another way: calling AnimationScriptPlayable.SetJobData() each frame in Update(), as shown here. To decide between the two options, could you tell me which is better performance-wise? One call to SetJobData() would surely be faster than modifying 10 custom stream properties using PropertyStreamHandle.SetFloat() each frame?
     
  45. Mecanim-Dev

    Mecanim-Dev

    Joined:
    Nov 26, 2012
    Posts:
    1,675
    It really depends on what you want to do. If you only need to send data to a single job, and this data is not needed by other jobs, then yes, you can simply create public members on your job and use GetJobData/SetJobData to change the values.
    There is a hidden cost to GetJobData/SetJobData:
    the managed job data is stored in native memory;
    when you call GetJobData we memcpy the native memory into the managed job data;
    when you call SetJobData we do the inverse, we memcpy the managed job data into native memory.

    A bigger job struct will take more time to memcpy, but memcpys are usually really fast.

    On the other hand, a PropertyStreamHandle represents an animated value that is injected into the animation stream, which means that a job can inject a value which is then processed by other animation jobs, like a mixer that blends those values with something else. Setting a value in the animation stream with a PropertyStreamHandle is pretty fast once your handle is resolved (it's only one indirection to write the final value), but of course all the other jobs need to handle this animated value.

    So if we come back to the original question:
    if we take into account the inherent cost of all the jobs processing those animated values in the stream, then yes, PropertyStreamHandle is probably more costly than just setting a few values on a job with SetJobData(), but it can do way more than just represent a parametric value for a single job.
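    To make that concrete, a minimal sketch of the SetJobData route (scriptPlayable, MyJob, targetPosition, and target are all hypothetical names):
    Code (CSharp):
    void Update()
    {
        // Copy the job struct out of native memory, change it, and copy it back.
        var job = scriptPlayable.GetJobData<MyJob>();
        job.targetPosition = target.position; // whatever the job needs this frame
        scriptPlayable.SetJobData(job);
    }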
     
    WendelinReich likes this.
  46. coolguyconan

    coolguyconan

    Joined:
    Aug 24, 2018
    Posts:
    4
    Hi @Mecanim-Dev, I've encountered a problem. I set a position using TransformStreamHandle.SetPosition(), but the position is not changed in the next frame, and there is no data in the Animator. Here is my code:
    Code (CSharp):
    public struct JobTest : IAnimationJob
    {
        public NativeArray<TransformStreamHandle> handles;

        public void ProcessRootMotion(AnimationStream stream) { }
        public void ProcessAnimation(AnimationStream stream)
        {
            for (int i = 0; i < handles.Length; ++i)
            {
                Debug.Log(handles[i].GetPosition(stream)); // not changed in the next frame
                handles[i].SetPosition(stream, handles[i].GetPosition(stream) + new Vector3(1, 0, 0));
                Debug.Log(handles[i].GetPosition(stream));
            }
        }
    }
     
  47. Baste

    Baste

    Joined:
    Jan 24, 2013
    Posts:
    6,334
    I'm running into an issue with Animation Jobs, not sure if it's a bug or not.

    The documentation for TransformStreamHandle says that GetPosition and GetRotation return the position/rotation in world space.

    But when checking the values, what they're actually returning is the position/rotation in the Animator's local space. I.e. these give the exact same value:

    Code (csharp):
    // in MonoBehaviour:
    animator.transform.InverseTransformPoint(targetBone.transform.position);

    // in Animation Job, targeting the same bone:
    var bonePosition = boneHandle.GetPosition(stream);
    Is this an error in the docs, or is it a bug?

    I was first assuming that it was a documentation error, but the basic two-bone IK job is giving me bad results, and IK becomes way harder than it needs to be when I don't have straightforward access to the world positions of the bones, so it seems buggy.

    Oh, this is in Unity 2018.4.4f1
     
    Last edited: Jul 30, 2019
  48. Baste

    Baste

    Joined:
    Jan 24, 2013
    Posts:
    6,334
    Oh, and another question:

    If I have an optimized model (using AnimatorUtility.OptimizeTransformHierarchy), can I still get stream handles for the bones in the rig? I tried keeping some bones, but that just generates errors: "Could not resolve 'DEFORM-forearm.R' because it is not a child Transform in the Animator hierarchy."

    Which makes sense, exposed bones are already read-only. Still, I'd really like to write some IK for an optimized character.
     
    awesomedata likes this.
  49. Mecanim-Dev

    Mecanim-Dev

    Joined:
    Nov 26, 2012
    Posts:
    1,675
    It should return world-space position and rotation; if not, it's a bug.

    Normally, when the animation system invokes an animation job, we always read back the current value of the root transform before invoking your job to compute the world space, because if a system other than the animation system moves the root transform between Update() and the animation system update, you would get a bad world-space matrix.

    We had a bug like this a few months ago when the animator was set up with apply root motion set to false and the user was moving the root transform from a MonoBehaviour; can you validate whether that's the case for you?

    The bug has been backported to my knowledge, but maybe something went wrong.
     
    Baste likes this.
  50. Mecanim-Dev

    Mecanim-Dev

    Joined:
    Nov 26, 2012
    Posts:
    1,675
    Yes, this is a known issue with animation stream handles and optimized hierarchies. It's on the roadmap to make it work, but I don't know when it will be ready.
     
    Baste likes this.