Trouble using Face Capture on Synty Studios Characters

Discussion in 'Virtual Production' started by RybiconZX, Sep 3, 2021.

  1. RybiconZX

    Joined:
    Oct 26, 2013
    Posts:
    3
    I've seen other posts with similar issues and no answers, so maybe I'll try adding more detail. Basically, I got Face Capture working with the sample scene provided by the Package Manager (after a small hiccup with Windows Defender Firewall) and it works great!

    I was able to record a take (.anim file) with the sample scene.

    Then I thought I could apply this animation to a Synty character using a Timeline (honestly I have no idea what I'm doing; I'm a programmer, not an animator) and it predictably messed up the timeline animation. I figured the reason for this is that the sample scene has a head with its own rig, and who knows how that differs from another random rig.

    So then I opened the Face Capture sample scene again, dragged in my Synty character prefab, added the ARKit Face Actor component to it, and created a new mapper file. The Synty models are pretty basic, so I'm just trying to get the head movement working; maybe later I'll try to split out the eyes and blend shapes.

    After messing around with the mapper and getting it looking similar to the SampleHead prefab, I moved to the New FaceDevice object in the scene and dragged my Synty character from the scene onto the Actor field. Once I do that, the character crumples into a ball and their head twists back to the right like they just looked at the Ark of the Covenant or something.

    I'm not sure if this is specific to Synty characters or what, but I cannot get this tool to work for anything other than the SampleHead. If anyone knows how to stop my character crumpling into a ball, or can point me in the right direction, I would be very grateful, thanks :)
     
  2. akent99

    Joined:
    Jan 14, 2018
    Posts:
    588
    Disclaimer - I am still learning, but maybe something in here will help you work it out.

    Not sure if it's the same thing, but do you know how avatars and humanoid animation clips work? They define a standard set of 'muscles' that all humanoid characters can share, so animation clips can be more portable even if the bones etc. are a bit different across models. In addition, there are avatar masks you can apply to a track (or override track) that say 'this animation clip is only doing some of the bits, like the legs, or head, etc.' Otherwise Unity seems to assume the animation clip is going to do everything (and when it does not set values for the other bones, they go into weird default positions). I am wondering if this is what you are seeing. If so, you might like to try creating an avatar mask for just the head and assigning that to the layer with the blendshapes. That way it knows the clip is only meant to control part of the overall character.
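
    If you want to try that without clicking through the inspector, here is a minimal editor-script sketch (untested; the menu path and asset path are just my placeholders) that builds a head-only avatar mask you could assign to the Timeline track:

    Code (CSharp):
    // Put this in an Editor folder. Creates a humanoid AvatarMask asset with
    // only the head enabled, so a track using it leaves the rest of the body alone.
    using UnityEngine;
    using UnityEditor;

    public static class HeadMaskCreator
    {
        [MenuItem("Tools/Create Head-Only Avatar Mask")]
        public static void CreateHeadMask()
        {
            var mask = new AvatarMask();

            // Disable every humanoid body part first...
            for (int i = 0; i < (int)AvatarMaskBodyPart.LastBodyPart; i++)
                mask.SetHumanoidBodyPartActive((AvatarMaskBodyPart)i, false);

            // ...then re-enable just the head.
            mask.SetHumanoidBodyPartActive(AvatarMaskBodyPart.Head, true);

            AssetDatabase.CreateAsset(mask, "Assets/HeadOnlyMask.mask");
            AssetDatabase.SaveAssets();
        }
    }

    (You can get the same result by hand via Assets > Create > Avatar Mask and ticking only the Head section; the script is just the same thing in code.)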

    Note: blendshapes (used to control the face) are kind of interesting in their own right, as they are not part of the humanoid avatar stuff. But I did have some success with a similar sort of project doing this. It might be interesting - if not, there are a few other videos on that channel.

    But if you got the mapping going, you are one up on me! I could not get the mapping right for my character. But I am successfully using another free tool which is good enough for me... for now.
     
  3. peterparnes

    Joined:
    Jul 5, 2016
    Posts:
    3
    @RybiconZX Did you get this working? I am having similar problems and I am an animation beginner as well.
     
  4. DevynCole

    Joined:
    Feb 23, 2018
    Posts:
    8
    Hey! Facial animation is complicated stuff! I'm a tech artist who deals with this stuff every day, so I'll do my best to explain why it won't work with the Synty characters. The ARKit way of face tracking uses blendshapes to drive the animation. Blendshapes are modified versions of the mesh that get created in 3D modeling software. Essentially, the mesh gets duplicated and the verts get pushed around to make a specific facial expression. This expression then gets mapped back to the original model as a blendshape. The original vertex positions can then be blended toward the positions of the specific expression with a float. Depending on the use case, a 3D character can have all kinds of blendshapes; they aren't even specific to just the face - they're used for all kinds of mesh deformation.
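
    To make that concrete, here is a tiny sketch (untested; "jawOpen" is just an example shape name) of what driving a blendshape weight looks like at runtime - conceptually, this is what the face capture mapping is doing for each shape:

    Code (CSharp):
    using UnityEngine;

    public class BlendShapeDriver : MonoBehaviour
    {
        public SkinnedMeshRenderer faceRenderer;

        [Range(0f, 100f)]
        public float jawOpenWeight; // 0 = original verts, 100 = fully blended

        void Update()
        {
            // Look up the shape by name, then blend the vertex positions toward it.
            int index = faceRenderer.sharedMesh.GetBlendShapeIndex("jawOpen");
            if (index >= 0)
                faceRenderer.SetBlendShapeWeight(index, jawOpenWeight);
        }
    }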

    Apple came along and created the ARKit face tracking standard, which uses a specific set of shapes. They document guidelines for what these shapes should be labeled, as well as how they're intended to look and be used. This documentation is a bit confusing for someone who doesn't directly work in Xcode, but if you click around you can find images of each blendshape.
    https://developer.apple.com/documentation/arkit/arfaceanchor/blendshapelocation
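
    A quick way to check what a character actually has is to log its blendshapes and compare the names against Apple's list - a sketch like this (untested) will do it, and a stock Synty character will typically log nothing, which is exactly the problem:

    Code (CSharp):
    using UnityEngine;

    public class BlendShapeLister : MonoBehaviour
    {
        public SkinnedMeshRenderer faceRenderer;

        void Start()
        {
            // Print every blendshape on the mesh so you can compare against ARKit's names.
            Mesh mesh = faceRenderer.sharedMesh;
            for (int i = 0; i < mesh.blendShapeCount; i++)
                Debug.Log($"Blendshape {i}: {mesh.GetBlendShapeName(i)}");
        }
    }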

    Long story short, a character that wasn't made with these blendshapes won't work. BUT! What kind of person would I be if I didn't help you find a solution?! When trying to develop a facial animation pipeline for my studio, I discovered a fantastic Blender plugin (paid) that makes it really intuitive to create blendshapes for an existing character. I don't recommend that someone who has no idea how models work dive straight in, but the documentation for it is thorough enough that it's possible to figure it out with some sand-punching.
    https://blendermarket.com/products/faceit

    It's a fantastic tool that's actively being improved; I use it frequently. Once you've made your blendshapes and have everything set up properly, you can follow Unity's documentation on how to plug it into the Face Capture app and components.

    Hope this helps! Happy to talk more about it and go deeper! Go forth and make cool S***!
     
  5. Sean__R

    Joined:
    Oct 1, 2014
    Posts:
    106
    Hello. Regarding Faceit, is there a way to export the Faceit mocap data from Blender as an FBX with pivot transforms on the face, or to export with bones in the face?

    Or, after the mocap has been applied in Faceit, can you export the Faceit spline controls from Blender as an FBX with mocap anim keys?