Question Face Capture animations on humanoid characters

Discussion in 'Virtual Production' started by godbeygames, Sep 24, 2021.

  1. godbeygames

    Joined:
    Oct 3, 2020
    Posts:
    13
    Hi,

    Is there a suggested workflow for getting facial blendshape animations recorded with the Face Capture app onto a humanoid model? I am able to record facial mocap with my phone just fine, but it looks like the animations that are created are Generic anims, because my model goes into the kneeling muscle test pose while recording the facial anim and I can't get it out of this pose after recording. I don't see a way to convert the recorded anim to Humanoid. Also, applying the anim to a duplicate model that wasn't used to record the facial blendshapes doesn't work (again, probably because it records a Generic anim?).

    Any help on this would be appreciated!
     
  2. akent99

    Joined:
    Jan 14, 2018
    Posts:
    588
    I believe the facial blendshapes are not part of the humanoid Avatar muscle definition - so you would have, say, a Timeline with a base body animation (from body tracking or a separate animation clip) and an override track for the face (I use an avatar mask for the head, but I'm not sure that is necessary). I am not sure if Face Capture captures head movements, for example, which you would want as Humanoid to be reusable. But mouth, eyes, etc. are not part of the standardized humanoid muscles.
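    The head-only mask mentioned above can also be created from an editor script rather than through the import UI - a minimal sketch, with the menu path and asset path purely illustrative:

    ```csharp
    using UnityEngine;
    using UnityEditor;

    // Sketch: create an AvatarMask asset that enables only the humanoid Head
    // body part, for use on a Timeline override track. Editor-only code.
    public static class HeadMaskCreator
    {
        [MenuItem("Tools/Create Head-Only Avatar Mask")]
        static void Create()
        {
            var mask = new AvatarMask();

            // Disable every humanoid body part except the head.
            for (AvatarMaskBodyPart part = 0; part < AvatarMaskBodyPart.LastBodyPart; part++)
                mask.SetHumanoidBodyPartActive(part, part == AvatarMaskBodyPart.Head);

            AssetDatabase.CreateAsset(mask, "Assets/HeadOnly.mask"); // path is illustrative
        }
    }
    ```

    Assign the resulting mask in the override track's Avatar Mask field so only the head is affected by that track.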
     
  3. akent99

    Joined:
    Jan 14, 2018
    Posts:
    588
    Sorry, missed this part. What I would do is look at the animation clip when applied to the first character in a Timeline. What properties of which component does it control? Then I would replace the character with the duplicate. Does it have the same components, etc.? E.g., did the properties in the animation clip turn yellow, meaning they could not find the component and properties to bind to?

    If a pure duplicate, you should be able to see the properties in the inspector window change as you scrub through the clip.

    Assuming that works, I would then look at what the component was. Is it relying on something else to activate it? The Unity-provided stuff had several bits of infrastructure - you may need a separate component or something to point at it. E.g., when you duplicated, maybe some properties are still pointing at the old character.
     
  4. akent99

    Joined:
    Jan 14, 2018
    Posts:
    588
  5. godbeygames

    Joined:
    Oct 3, 2020
    Posts:
    13
    Thanks for replying, akent99. Still having a bit of trouble though.

    What I want to do is use animations from Mixamo, which only work on Humanoid rigs, and also apply the Face Capture anims along with them. I tried putting my Mixamo animation in a Timeline along with the animation from Face Capture, but the two don't seem to play well together. It definitely looks like the Face Capture anim is a Generic animation, and I'm thinking this is where the main problem lies. Setting my character's animation type to Generic, I can get the Face Capture anim to work, but not the Mixamo anim. Setting my character's animation type to Humanoid, I can get the Mixamo animation to work, but not the Face Capture anim.

    Only the blendshapes are being driven by the Face Capture anim; head movement is driven by the Mixamo animation. Even when setting up avatar masks to mask out the appropriate areas, they don't work together.

    My hope is that Unity can add an option for animations created with the Face Capture tool to work with Humanoid rigs. In the meantime, if anyone has a workaround I would love to hear it.
     
  6. akent99

    Joined:
    Jan 14, 2018
    Posts:
    588
    If it makes you feel any better, this is all still giving me lots of problems too! Lol! I could not get the mapping from the Unity Face Capture onto my model for some reason (sigh). So I created a recording of what I do using a different (free) package called VSeeFace. It is not a direct answer to your question - just something quick and dirty I recorded in case it's of any use. I record a "Face" animation that is Generic and layer it over the top of "Avatar" animations (I use a "basic idle" animation clip plus a custom recorded VSeeFace clip for the top half of the body). I show how I recorded them using EVMC4U and EasyMotionRecorder, then drop them into a Timeline with override tracks, etc.

    When you say "I set my character's animation type", I am not sure what that means sorry. I combine Humanoid clips and Generic clips on the same timeline, using Avatar Masks to control which parts of which Humanoid clip to use (I combine an upper body recording with a lower body standard Humanoid animation clip).



    Unity might come back with a correct answer for you using their tools. I played with them, but I already had a working solution, I could not get the mappings to work, and I don't want to use live recordings for the face anyway - I found it too hard in practice to line up multiple recordings. So instead I have a folder of facial expression animation clips and my own script for using VRoid Studio "Blendshape Clips".

    Side note: blendshape clips are something VRM introduced - you can define facial expressions using another level of abstraction, so I can say "angry" but map it through to slightly different blendshapes for different characters. My Angry facial animation clip I can reuse on any character. It is kind of like the Avatar Humanoid stuff, I guess, but you can define your own expressions. I then wrote my own script to make animating the expressions easier - it just maps some properties through to the different weights for the blendshape clips I use. But mixing facial animation clips (which go more directly to the properties) with my script does not work that well - only one can win - so I settled for using clips for facial expressions all the time. Good enough for my purposes.
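    The kind of mapping script described above might look something like this - a hedged sketch, not the actual script from the post; the component name, field names, and blendshape names are all illustrative:

    ```csharp
    using UnityEngine;

    // Hypothetical sketch of an expression-mapping component: expose one
    // animatable "angry" weight and fan it out to whatever blendshapes
    // implement that expression on this particular character.
    public class ExpressionMapper : MonoBehaviour
    {
        [Range(0f, 100f)]
        public float angry;                       // animated by a clip or Timeline

        public SkinnedMeshRenderer face;          // the character's face mesh
        public string[] angryShapes = { "browDown", "mouthFrown" }; // per-character mapping

        void LateUpdate()
        {
            // LateUpdate so this runs after the Animator has written this frame's values.
            foreach (var shapeName in angryShapes)
            {
                int index = face.sharedMesh.GetBlendShapeIndex(shapeName);
                if (index >= 0)
                    face.SetBlendShapeWeight(index, angry);
            }
        }
    }
    ```

    One caveat, as the post notes: if an animation clip also writes the same blendshape weights directly, the two fight each frame and only one writer wins.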
     
  7. godbeygames

    Joined:
    Oct 3, 2020
    Posts:
    13
    Thanks for putting this together! Looks like a possible workaround for my purposes, it'll just take a little while to get it set up. :)
    For the switching animation types, when you select your FBX character in the Project folder and in the inspector go to Rig, is the Animation Type set to Humanoid or Generic?
     
  8. akent99

    Joined:
    Jan 14, 2018
    Posts:
    588
    Ah, that may be the difference. I am using VRoid Studio to create characters (free software!). It exports a VRM file (a standard for virtual reality character models, so different apps, like VRChat, can load it). To load it into Unity, I use the "UniVRM" package, drop the VRM file in, and it creates a prefab - not an FBX file. So I don't have experience using FBX files, sorry.

    (I did grab some from Mixamo in the past, but I opened the FBX file, selected the animation clip, then duplicated it so they got copied into a separate file outside the FBX file. I could delete the FBX file and just have an animation clip I can edit like all the rest. I think it was a matter of getting the right settings when downloading the file from Mixamo, rather than the settings of the FBX file after downloading it - but that was a while ago now sorry.)
     
  9. Sergi_Valls

    Unity Technologies

    Joined:
    Dec 2, 2016
    Posts:
    212
    I'll try to explain why the character goes into the kneeling muscle test pose:
    The Avatar asset tells the underlying animation system what properties are going to be animated. Internally, the animation system writes a default value to those properties (usually zero). When a Humanoid Avatar is set on your character's Animator component, you are telling the animation system that your intention is to animate the muscle, IK and root properties (plus other properties not related to locomotion). When you set a Generic Avatar, you are telling the animation system which properties to animate one by one.
    When trying to play a clip to an Animator that uses a Humanoid Avatar, the animation system will allocate and initialize to zero all the muscles, expecting the clip to contain curves for them. If your clip doesn't contain muscle curves, the character will stay in that default pose.
    The solution is to tell the animation system that your clip doesn't contain muscle curves by using an AvatarMask.
    In order to play the animations in Timeline, add an override track and set the AvatarMask.
    EDIT:
    In Timeline, set a humanoid clip at the root AnimationTrack and set your recorded clip in an Override track.
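    The Timeline layout described above can also be assembled from an editor script - a hedged sketch, assuming the Humanoid body clip and the Face Capture recording already exist as assets (the asset paths and track names here are placeholders):

    ```csharp
    using UnityEngine;
    using UnityEditor;
    using UnityEngine.Timeline;

    // Sketch: build a TimelineAsset with a root AnimationTrack holding a
    // Humanoid body clip, plus an override track (a child AnimationTrack)
    // holding the Face Capture recording.
    public static class FaceOverrideTimeline
    {
        [MenuItem("Tools/Create Face Override Timeline")]
        static void Create()
        {
            var timeline = ScriptableObject.CreateInstance<TimelineAsset>();

            // Root track: the Humanoid body animation (e.g. a Mixamo clip).
            var bodyTrack = timeline.CreateTrack<AnimationTrack>(null, "Body");
            var bodyClip = AssetDatabase.LoadAssetAtPath<AnimationClip>("Assets/BodyIdle.anim");
            if (bodyClip != null) bodyTrack.CreateClip(bodyClip);

            // Child AnimationTrack of an AnimationTrack = override track.
            var faceTrack = timeline.CreateTrack<AnimationTrack>(bodyTrack, "Face Override");
            var faceClip = AssetDatabase.LoadAssetAtPath<AnimationClip>("Assets/FaceTake.anim");
            if (faceClip != null) faceTrack.CreateClip(faceClip);

            AssetDatabase.CreateAsset(timeline, "Assets/FaceOverride.playable");
        }
    }
    ```

    Bind both tracks to the same character's Animator in the PlayableDirector; the override track then layers the recorded face curves over the body clip.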
     
    Last edited: Oct 6, 2021
    godbeygames likes this.
  10. godbeygames

    Joined:
    Oct 3, 2020
    Posts:
    13
    Fantastic! Thank you for the explanation. I will give your suggestion a try.
     
  11. akent99

    Joined:
    Jan 14, 2018
    Posts:
    588
    How do the recordings for the Face Capture app interact with Avatar definitions? For example, what should the AvatarMask be? Or should Face Capture recordings be made Generic because they are orthogonal to the Avatar definition?

    My understanding was that facial expression blend shapes are not part of the Avatar definition (has this changed?), but I was not sure if Face Capture also captured head movements, for example - so is it best to make Avatar-based clips with an AvatarMask of the Head only?
     
  12. Sergi_Valls

    Unity Technologies

    Joined:
    Dec 2, 2016
    Posts:
    212
    FaceCapture recordings do not interact with the avatar definition. They are just animation curves playing on the FaceActor component. The AvatarMask I mentioned before shouldn't be necessary, just the override track.

    If my understanding is correct, the Avatar asset stores all the Transform hierarchy. You can then create an AvatarMask importing the skeleton from the Avatar and uncheck the nodes containing SkinnedMeshRenderers if you want to mask blend shapes (this only applies to the ModelImporter workflow and is unrelated to the FaceCapture and its recorded clips).
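    The transform-based masking described above could be scripted along these lines - a sketch under the assumption that `avatarRoot` is the character's root Transform; it imports the full hierarchy into a mask and then deactivates any node carrying a SkinnedMeshRenderer:

    ```csharp
    using UnityEngine;

    // Sketch: build an AvatarMask from a character's transform hierarchy and
    // uncheck the nodes holding SkinnedMeshRenderers, so blend shape
    // properties on those nodes are masked out.
    public static class RendererMaskBuilder
    {
        public static AvatarMask Build(Transform avatarRoot)
        {
            var mask = new AvatarMask();
            mask.AddTransformPath(avatarRoot, true); // true = include all children

            for (int i = 0; i < mask.transformCount; i++)
            {
                // Note: how paths are stored relative to the root may need
                // adjusting for your hierarchy; this lookup is a best guess.
                var path = mask.GetTransformPath(i);
                var node = string.IsNullOrEmpty(path) ? avatarRoot : avatarRoot.Find(path);
                if (node != null && node.GetComponent<SkinnedMeshRenderer>() != null)
                    mask.SetTransformActive(i, false);
            }
            return mask;
        }
    }
    ```

    As the post says, this only applies to the ModelImporter workflow and is unrelated to Face Capture's recorded clips, which bind to the FaceActor component instead.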

    Face Capture head position and orientation are processed by the FaceActor, using the Mapper asset. The animation clip only contains bindings to the FaceActor component.
     
  13. akent99

    Joined:
    Jan 14, 2018
    Posts:
    588
    So it sounds like a generic (not humanoid) animation clip is fine for Face Capture recordings, drop it in an override track, and no need to use avatar masks. Thanks!