Discussion in 'Animation' started by Mecanim-Dev, Apr 4, 2018.
@JohnHudeski not yet, but this is on our road map for animation jobs.
The tears flow endlessly
In other news: is it better to create duplicate mirrored clips, or have the mirroring done at runtime à la Mecanim?
There is no good answer to this question, because it depends on many factors specific to each project.
Mirroring at runtime can save a significant amount of memory if your project has many clips, so for a project close to its memory limit that could help a lot.
On the other side, managing this at runtime will increase the complexity of your code and could create some bugs.
There is also the performance aspect: mirroring a clip costs a few CPU cycles, so if the project is already close to its performance limit, maybe it can't afford the cost of mirroring all clips at runtime.
So you have to choose between maintainability vs. memory consumption vs. performance.
I recently came across the animation jobs samples on GitHub and I'm pretty interested in the full-body IK solution.
I'm currently using another solution for full-body IK, but it's pretty complex, not really supported, and really costly on the CPU.
Romain's full-body IK solution, on the contrary, seems really promising: it's a lot simpler, and it uses animation jobs, so it's pretty efficient.
This solution could be great for player full-body IK in VR, but it lacks one essential thing: a head effector (position and rotation, to sync the HMD with the head). I hope there is a plan to add a head effector at some point, because it's heartbreaking to know that such a solution exists but can't be used for the player in VR as it misses such an essential thing.
By the way, if you want to make VR devs happy, just add an arm extender after that and it will be perfect (instead of pulling the body when the hand is too far away, it stretches the arm).
Thanks for taking these features into consideration!
Edit: I'd also be happy to know whether finger IK effectors are planned?
@KospY we do have a head effector which works like a look-at.
If you look closely in the FullBodyIK sample you should find it; it's called LookAtEffector, and it can affect the whole spine if wanted:
m_LookAtEffector = SetupLookAtEffector(ref job.lookAtEffector, "LookAtEffector");
If it doesn't work exactly like you would like, you can create your own by simply driving the head transform with a TransformStreamHandle: feed the VR position/rotation directly into this TransformStreamHandle.
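A minimal sketch of that suggestion, assuming the 2018.x animation job API (the job name and the HMD fields are hypothetical; you would copy the camera/HMD pose into the job data each frame before the graph evaluates):

```csharp
using UnityEngine;
using UnityEngine.Experimental.Animations; // UnityEngine.Animations in later versions

// Hypothetical job that overwrites the head bone with the HMD pose.
public struct HeadDriverJob : IAnimationJob
{
    public TransformStreamHandle head; // bound via animator.BindStreamTransform(headBone)
    public Vector3 hmdPosition;        // copied from the HMD before evaluation
    public Quaternion hmdRotation;

    public void ProcessRootMotion(AnimationStream stream) { }

    public void ProcessAnimation(AnimationStream stream)
    {
        // Feed the VR pose straight into the stream, replacing the animated head pose.
        head.SetPosition(stream, hmdPosition);
        head.SetRotation(stream, hmdRotation);
    }
}
```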
There is no plan to add finger IK at the moment.
Fun fact: Mecanim has actually supported finger IK since 2009, but it was never exposed in Unity.
I'm not looking for a look-at effector. For VR, the goal is to sync the player's head position (the headset) with the head of the character. It should work the same as a hand effector (set position and rotation).
Interesting, will feeding the head's position/rotation through a TransformStreamHandle automatically solve the full-body IK? For example, if the player crouches IRL, will the other bones follow accordingly like they do with the hand effector (i.e. lower the torso and bend the legs)?
I'm so sad... Why not expose this?
Is it possible to use a TransformStreamHandle for that?
This video comes up from time to time, and gets me quite excited, but... the talk is a little misleading.
I talked to a few people involved. Just want to note that this talk was mostly a prospective view of a technology; I don't believe it moved forward very far. Almost all the animations in there were mockups, so it's best to take it all with a pinch of salt.
Some of the ideas will work... like the idea of doing variants by animating IK targets/constraints as opposed to the bones themselves, and then applying those to lots of variant rigs... that's fairly plausible and even happens in some games today, but perhaps in more focussed scopes.
Having a kind of general "filter" to convert one animation style to another (i.e. male walk to female walk)... that's where things get a little more, err, "optimistic" (though I share those hopes - retargeting mocap on a cartoon character looks like nightmare fuel to me.)
You might expect that with deep learning we could get toward that sort of thing, possibly. But back then, that wasn't really something people knew how to use.
Anyway, I don't mean to douse your hopes, because there are still some good ideas in there, but from what I've gathered, it didn't actually go anywhere.
Sure, I thought as much. After all, it was mentioned in the talk, and I watched the entire talk a number of times just to see how much of what was mentioned in there was possible/relevant in Unity these days.
No worries. That's good information -- thanks for the heads-up.
Unity has kinda lagged a bit in the Animation department ever since Mecanim was revealed, so seeing these C# Animation Jobs really got me inspired.
Thinking about better procedural animation options like these and finding a way to integrate the best of them is what I live for. That's why I tossed this video out there -- I really wanted others to explore this new possibility space in Unity, now that it's possible -- There's much that hasn't been thoroughly explored before in Unity so it would be a great time to get involved, considering native webcam-based facial-mocap and non-graph-based animation such as Kinematica is not that far around the corner...
I'm not sure if you're involved in Kinematica, but (from what I've gleaned from) it (and my research into better animation tools), I've come to the conclusion that we'll never truly rid ourselves of the "graph" entirely -- but that's not a bad thing. As much as I support visual tools, I actually think minimizing the graph's use for "state" -based animation and instead relegating it to the realm of "secondary-motion" animation (such as IK / Ragdoll-Physics blends -- both a more natural "fit" to a visual "graph" -based interface) is a much-better goal to shoot for with something like Kinematica and allow C# Animation Jobs to ultimately work together with this secondary visual solution.
Is this Unity's stance on this atm -- or are they looking at other options for the tool?
Kinematica is a great start to authoring more realistic animation scenarios (without the complexities of the graph), but I think it can go even farther -- Graph complexity can be relegated to things like that video demonstrates (i.e. the positioning of limbs in regards to weights, inertia acting upon the limbs, or masks for different weapon types, facial animation, etc.) instead of basic physical movement states like runs/jumps/falls/turns/rolls. Visual-scripting could play a huge role in this for artists, but the major factor would be that the artist designs the secondary motions in a more mechanical way (graphs that are based on logical if/thens) while basic body kinematic states would be based around machine-learning Kinematic motion states, which would be only rarely overridden by code or game-state -- e.g. in cinematics or victory poses.
BTW -- Anyone interested in a collab with me to implement a better design of a generic animation system? -- The basic idea would be to supplement the Kinematica movements and new facial mocap systems when they arrive with a more graph-based approach to secondary motion (while managing traditional key-based animation as the special case), with masking and other such "pose- / mask- / target- / property- blending" being the primary method of animation instead.
With any luck, Kinematica will be implemented as a playable graph node of some sort, so you can simply chuck it into a graph, and use an AnimationLayerMixerPlayable to override anything you like.
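If Kinematica ever does ship as a playable, the override layering could look something like this sketch (the base/override playables and the `upperBodyMask` AvatarMask are assumed to exist already; only the mixer wiring is the point here, inside a MonoBehaviour's Start()):

```csharp
using UnityEngine;
using UnityEngine.Animations;
using UnityEngine.Playables;

var graph = PlayableGraph.Create("LayeredGraph");
var mixer = AnimationLayerMixerPlayable.Create(graph, 2);

// Layer 0: base motion (e.g. a hypothetical Kinematica playable).
graph.Connect(basePlayable, 0, mixer, 0);
// Layer 1: whatever you want layered on top.
graph.Connect(overridePlayable, 0, mixer, 1);
mixer.SetInputWeight(0, 1f);
mixer.SetInputWeight(1, 1f);
// Optionally restrict the override layer with an AvatarMask.
mixer.SetLayerMaskFromAvatarMask(1, upperBodyMask);

var output = AnimationPlayableOutput.Create(graph, "Anim", GetComponent<Animator>());
output.SetSourcePlayable(mixer);
graph.Play();
```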
Hi, I've recreated this procedural animation "between poses" with some standard Unity features:
1. In the model's Animation import settings, add a curve to the animation.
2. Create a float parameter on your Animator with the same name as the curve (the curve will then drive it automatically).
3. Use this parameter as the blend parameter of a blend tree between two animations with 2 keyframes each (note: the parameter becomes disabled, since the curve drives it).
Here is an example:
- 3 animations, all used:
1. T-pose - 2 identical keyframes
2. idle - 2 identical keyframes
3. idle_0 - 5 keyframes, with different poses for the fingers
- 2 curves
P.S. finger animation made through custom IK: https://imgur.com/a/Luk6c1p
Yeah, this looks like it has to be done with the 'hybrid' approach.
- Though to get a pure-ECS player, it would have to be sort of 'faked', if I've understood the posts I've come across elsewhere.
1) I think they mentioned setting up a pure entity and having it follow the 'hybrid player', which handles only the animations, while the entity handles the other game logic, leveraging multithreading there.
2) The other approach, baking animations into textures, has its own limitations too. Switching animations means modifying the material's texture data, but I don't think that can be described within structs. It IS possible to modify the texture data on the main thread, but not within a job. So an army of characters changing animations, especially across several animation scenarios, may end up causing performance spikes.
I feel that option 2) would be suitable for characters that only have one animation, or that seldom change between animations: just bake that one or two animations and play them in a loop (like in the Nordeus project example).
But for, say, a massive multiplayer first/third-person game where the characters have a bunch of complex animations that require blending etc., I would go with option 1).
This is just what I've considered so far. There may be better, more efficient ways of handling animations for a massive multiplayer game without having to resort to the 'hybrid' approach.
I'm curious, how would you approach this if you'd like to have control over a 'pure entity' with animations playing?
Now that we have ECS, C# jobs, and the ability to render a mesh directly to the GPU, we could (theoretically) bypass the CPU (almost) entirely (except for processing a frame's basic data).
The caveat to this is that we'd have to rewrite the animation system to support things like vertex and bone animation ourselves (which would kind of suck, but would be possible if we could alter the mesh before drawing it ourselves).
I think a lot of that git project was research into how to get the animations to process/render via the GPU (rather than CPU) via the shader (way before ECS I assume), probably using the material to control vertex offsets directly. I've not looked into it heavily, but the approach definitely was limiting as you say.
As mentioned above, ECS and C# animation jobs pretty much "replace" the need to do material storage, since it was only used as a way to bypass CPU bottlenecks by transferring the work directly to the GPU.
Hi guys! I'm not very experienced with animation jobs... but is it possible to use a PlayableGraph (for animations only) without a target Animator? If it is, can I get the positions or rotations of bones without a TransformStreamHandle? I want to get pure data from blended animations in a job, without Transforms or Animators, to send directly to a shader.
I'm trying to use this to modify Humanoid animations. I can't really get it to work, though.
In essence, to test, I'm just doing a very naive job that's supposed to offset a bone:
public struct TestJob : IAnimationJob
{
    public TransformStreamHandle bone;

    public void ProcessRootMotion(AnimationStream stream) { }

    public void ProcessAnimation(AnimationStream stream)
    {
        var position = bone.GetPosition(stream);
        bone.SetPosition(stream, position + new Vector3(1f, 0f, 0f));
    }
}
graph = PlayableGraph.Create();
var animator = GetComponent<Animator>();
var output = AnimationPlayableOutput.Create(graph, "Player", animator);

// this is an AnimationLayerMixerPlayable
var animations = CreateAnimationGraph();

var animJob = new TestJob();
animJob.bone = animator.BindStreamTransform(testBone);
var animJobPlayable = AnimationScriptPlayable.Create(graph, animJob);
animJobPlayable.AddInput(animations, 0, 1f);

output.SetSourcePlayable(animJobPlayable);
graph.Play();
This doesn't seem to do anything. I suspect that this is due to the model being a Humanoid model. If I modify a muscle in the humanStream, it works as expected:
public struct TestJob : IAnimationJob
{
    public float time;

    public void ProcessRootMotion(AnimationStream stream) { }

    public void ProcessAnimation(AnimationStream stream)
    {
        var muscleHandle = new MuscleHandle(BodyDof.UpperChestLeftRight);
        var humanStream = stream.AsHuman();
        time += stream.deltaTime;
        humanStream.SetMuscle(muscleHandle, humanStream.GetMuscle(muscleHandle) + Mathf.PingPong(time, 2f) - 1f);
    }
}
So I'm guessing that this is a human/non-human issue? If I move a TransformStreamHandle that's targeting one of the bones used in the human avatar definition (like the shoulder), it moves the red helper gizmos that are shown when I've selected the animator, but that doesn't affect the played animation (and they're already all over the place).
What's the intended way to move a bone in a human stream?
That should work; we did some IK jobs that change transform rotations on a humanoid.
In that case you are changing the position of the shoulder. Is your rig set up with translation DoF on? That could be one reason why it doesn't work in your case.
The translation DoF was indeed turned off. Turning it on didn't help, though; there's still no effect from running the job other than moving the gizmos.
Edit: I also made sure every single FBX file containing the animations used has translation DoF turned on.
@Baste can you validate whether rotation works?
Anyway, can you log a bug? This is expected to work.
I've had issues with moving bones directly on humanoid characters since 2017.1 -- way before animation jobs.
As mentioned above, moving muscles works fine (assuming you can get the proper values for a pose), but modifying humanoid transforms directly fails without a very specific setup.
Rotation (both absolute and local) works. Position and scale do not.
I imported the same models and animations as generic rigs, and then everything works as expected. I'm uploading the bug report now, will post the bug # when I get it.
Update: Just tested this in 2018.3.0b11, and the issue persists. There's an issue in 2018.2 where the IK gizmos start moving around weirdly once I set the position or rotation of TransformStreamHandles, which I thought was a side-effect of this bug, but that seems fixed.
Update 2: Bug 1103108
Scale is not expected to work as there is no scale dof in muscle space.
Position should work.
Yes this is a bug that was fixed in 2018.3b11
Fixed IK effectors being overridden in animation jobs when calling skeleton methods.
So as soon as you set a position/rotation in an animation job on a humanoid rig, the IK effector position/rotation was overridden with a bad transform. There is no link between this bug and the one you logged.
TransformStreamHandle.GetPosition doesn't return the correct world position in 2018.3.0f2. It works in 2018.2.19f1, but in 2018.3.0f2 it only returns a position relative to the root bone of the skeleton, unless applyRootMotion is set to true on the Animator. Is this a bug?
Yes, could you log a bug please? We will investigate.
Using AnimationHumanStream.SolveIK() is currently very expensive. For comparison, this is a test scene with 50x DefaultMale_Humanoid from the sample project using an animator with IK enabled:
And this is the same scene, but using a playable graph with an animation job that calls AnimationHumanStream.SolveIK():
There were some much needed bug fixes in the 2019.1 alpha, like the animation jobs now working with the burst compiler. Animation jobs also now seem cheaper to schedule. Will some of these be backported to 2018.3 or 2018.4?
We will backport the burst fix to 2018.3 and 2018.4
Resolving TransformStreamHandle in a job will cause a crash in 2019.1.0b1 if the bone you are resolving is part of an optimised rig. Previous versions of Unity that I have tested throw an InvalidOperationException. Do you plan to support optimised rigs for animation jobs?
Hi, I'd like to follow up on the previously mentioned scheduling improvements we saw in the 2019.1 alpha. Are these also coming to 2018.3/.4, or does the backport only contain the burst fix as you seem to mention explicitly? Or maybe the burst fix automatically allows for better scheduling?
Also, can you share information about the status of the backport, maybe a projected release date? We're currently in the process of analyzing all performance critical systems in our game and with sometimes around 50 fully animated NPCs on screen, it can get quite expensive. If the mentioned animation job fixes land in 2018.3 some time soon, we might be able to try them out again.
it shouldn't crash, please log a bug.
We would like to support optimised rigs, but I don't have any ETA; right now all our resources are devoted to runtime rigging and the new DOTS animation system.
The backport will contain only the fix that allows users to use Burst with animation jobs. It should land pretty soon in 2018.x, since the original fix has already landed.
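Once that fix lands, opting an animation job into Burst should just be the usual attribute; a sketch, assuming the Burst package is installed (the job name and body are illustrative only):

```csharp
using Unity.Burst;
using UnityEngine;
using UnityEngine.Experimental.Animations; // UnityEngine.Animations in later versions

[BurstCompile] // picked up by Burst once animation jobs support it
public struct OffsetJob : IAnimationJob
{
    public TransformStreamHandle bone;

    public void ProcessRootMotion(AnimationStream stream) { }

    public void ProcessAnimation(AnimationStream stream)
    {
        // Illustrative body: shift the bound bone sideways.
        var p = bone.GetPosition(stream);
        bone.SetPosition(stream, p + new Vector3(1f, 0f, 0f));
    }
}
```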
Will the skills acquired from this be transferable once the motion matching api is released?
Hi, what does DOTS stand for?
Data Oriented Tech Stack. So ECS/Job System.
I don't know why Unity decided to start referring to it as DOTS -- I know there's more to it than ECS, but the two acronyms get confusing.
By "DOTS animation system", does this refer to just Burst, or the whole gamut of ECS-native animation in combination with jobs and Burst?
So I want to use the look-at sample on our enemies so they look at the player, but I can't figure out how to use this new system with an Animator. We still need the Animator for all the other animations on the enemy, such as attacks etc. How do I go about doing this?
Documentation is pretty lacking in general in regards to Animation.
Without a demonstration, it takes hours of testing to even be able to grasp which methods interact, much less how they were intended to do so.
I'd venture to say you can't use Animators, because Animators handle the low-level transform stuff automatically (on the main thread), whereas with C# animation jobs everything involving transforms (even bones) suddenly becomes manual.
You actually can. I found the example in the new Playables documentation, since the job system also seems to use them. Using a mixer, it works just fine.
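For anyone else hitting this: one way to keep the existing Animator Controller and run a job on top of it is to wrap the controller in an AnimatorControllerPlayable and feed it into the AnimationScriptPlayable. A sketch (here `LookAtJob` stands in for the sample's job struct; the wiring, not the job, is the point):

```csharp
using UnityEngine;
using UnityEngine.Animations;
using UnityEngine.Playables;

// Inside a MonoBehaviour on the enemy, e.g. in Start():
var animator = GetComponent<Animator>();
var graph = PlayableGraph.Create("LookAtGraph");

// Run the enemy's normal state machine as a playable...
var controller = AnimatorControllerPlayable.Create(graph, animator.runtimeAnimatorController);

// ...then post-process its output with the look-at job.
var job = new LookAtJob { /* bind stream handles here */ };
var scriptPlayable = AnimationScriptPlayable.Create(graph, job);
scriptPlayable.AddInput(controller, 0, 1f);

var output = AnimationPlayableOutput.Create(graph, "Enemy", animator);
output.SetSourcePlayable(scriptPlayable);
graph.Play();
```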