
Need to track facial parameters at runtime with procedurally generated face meshes & blendshapes

Discussion in 'AR/VR (XR) Discussion' started by Zante, Sep 8, 2020.

  1. Zante

    Joined: Mar 29, 2008
    Posts: 429
    I've just finished creating a runtime blendshape generator and need to come up with a workflow that would allow developers to capture facial mocap data in real time, mapped onto those same blendshape values.

    Because, in this instance, the meshes and blendshapes are generated at runtime, I cannot configure this in edit mode (which is why many existing solutions don't meet the requirements - they need extensive configuration and setup). The saving grace, however, is that all my blendshape values are consistent across multiple meshes. To that end, I can imagine a workflow in which a dedicated XML or hardcoded script configuration gets applied at runtime to any model generated the same way.
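    To illustrate, this is roughly the shape of the mapping I have in mind - just a sketch, with hypothetical names, applied by name lookup to any runtime-generated SkinnedMeshRenderer:

    Code (CSharp):
    using UnityEngine;

    // Sketch only: pushes captured values onto a runtime-generated mesh
    // by blendshape name, so one config works for every generated model.
    public class BlendshapeRetargeter : MonoBehaviour
    {
        public SkinnedMeshRenderer target; // assigned after runtime generation

        // Called by whatever capture source you have, with a 0..1 weight.
        public void Apply(string blendshapeName, float weight01)
        {
            int index = target.sharedMesh.GetBlendShapeIndex(blendshapeName);
            if (index >= 0)
                target.SetBlendShapeWeight(index, weight01 * 100f); // Unity weights run 0..100
        }
    }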

    I'm just looking for the path of least resistance when it comes to retrieving these values from a webcam. Is anyone aware of existing examples I can use to make this vision a reality for peeps?

     
  2. KyryloKuzyk

    Joined: Nov 4, 2013
    Posts: 1,144
    Are you looking for an Editor-only solution, or do you wish to record blendshapes at runtime in a standalone build?

    For the Editor-only use case, I can recommend my remote plugin plus an iOS device. The plugin can transmit blendshapes from the iOS device back to the Editor. But this solution will not work with a webcam.

    It is possible to modify the plugin to work in standalone builds, but it will require extra coding. I laid out the general idea here:
    https://forum.unity.com/threads/ar-...ject-in-the-editor.898433/page-4#post-6280649
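    For reference, the raw data on the device side comes from AR Foundation's ARKit face subsystem - my plugin just streams the same coefficients back to the Editor. A rough sketch of reading them directly (assuming AR Foundation 4.x with the ARKit Face Tracking package; this is not my plugin's API):

    Code (CSharp):
    #if UNITY_IOS
    using Unity.Collections;
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;
    using UnityEngine.XR.ARKit;

    // Rough sketch: reads ARKit blendshape coefficients through AR Foundation.
    public class FaceCoefficientReader : MonoBehaviour
    {
        [SerializeField] ARFaceManager faceManager;

        void Update()
        {
            var subsystem = faceManager.subsystem as ARKitFaceSubsystem;
            if (subsystem == null)
                return;

            foreach (var face in faceManager.trackables)
            {
                using (var coefficients = subsystem.GetBlendShapeCoefficients(face.trackableId, Allocator.Temp))
                {
                    foreach (var c in coefficients)
                    {
                        // c.blendShapeLocation is the ARKit blendshape name and
                        // c.coefficient its current 0..1 value - forward these
                        // to your own meshes here.
                    }
                }
            }
        }
    }
    #endif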
     
  3. Zante

    Joined: Mar 29, 2008
    Posts: 429
    I was hoping for a real-time solution in which you can animate the character directly, without having to use an intermediary model with its own preconfigured blendshapes. In hindsight, the results seem to work just as well. The video shows my progress remapping the blendshapes at runtime from a real SkinnedMeshRenderer to fake ones on a bone-driven facial rig.
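    The remap itself is nothing fancy - roughly the sketch below, where the bone, the blendshape index, and the angle range are all placeholders you'd tune per rig:

    Code (CSharp):
    using UnityEngine;

    // Sketch: drives a bone on the "fake" rig from a blendshape weight
    // on the tracked mesh. Bone, index, and angle are placeholders.
    public class JawBoneRemapper : MonoBehaviour
    {
        public SkinnedMeshRenderer source; // the tracked mesh
        public Transform jawBone;          // bone on the bone-driven rig
        public int jawOpenIndex;           // blendshape index on the source mesh
        public float maxJawAngle = 25f;    // degrees at full weight, tuned by eye

        void LateUpdate()
        {
            float w = source.GetBlendShapeWeight(jawOpenIndex) / 100f; // 0..1
            jawBone.localRotation = Quaternion.Euler(w * maxJawAngle, 0f, 0f);
        }
    }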

    This is using FaceWare and a normal webcam. I'm going to keep looking for editor tools designed to work in play mode that can use the same apparatus. As is, this requires a streaming server to pipe the data into Unity, which is just plain annoying as a workflow.
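    For anyone wondering about the streaming part, conceptually it boils down to something like the sketch below. The wire format here ("name=weight" UDP datagrams on port 9000) is invented for illustration - it is not Faceware's actual protocol:

    Code (CSharp):
    using System.Net;
    using System.Net.Sockets;
    using System.Text;
    using UnityEngine;

    // Hypothetical receiver: parses "name=weight" UDP datagrams and applies
    // them to a mesh. Port and message format are made up for illustration.
    public class MocapStreamReceiver : MonoBehaviour
    {
        public SkinnedMeshRenderer target;
        UdpClient client;

        void Start() => client = new UdpClient(9000); // assumed port

        void Update()
        {
            IPEndPoint remote = null;
            while (client.Available > 0)
            {
                string msg = Encoding.ASCII.GetString(client.Receive(ref remote));
                var parts = msg.Split('=');
                if (parts.Length != 2 || !float.TryParse(parts[1], out float weight))
                    continue;
                int index = target.sharedMesh.GetBlendShapeIndex(parts[0]);
                if (index >= 0)
                    target.SetBlendShapeWeight(index, weight * 100f); // assumes 0..1 input
            }
        }

        void OnDestroy() => client?.Close();
    }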

     
    Last edited: Sep 9, 2020
  4. KyryloKuzyk

    Joined: Nov 4, 2013
    Posts: 1,144
    I'm sorry, I didn't make it clear. My plugin works in real time, but only in the Editor (standalone builds are not supported out of the box).
    If you have an iOS device, you can use it as a blendshape data source for your models.
     
  5. Zante

    Joined: Mar 29, 2008
    Posts: 429
    Sadly I don't have an iOS device :[ The results above come from a normal webcam though - I'm using an older-gen Microsoft LifeCam. I do have a Kinect 2 that I'm prepping for body-based motion capture, though. Not sure if that's viable as an addition to your tool?
     
  6. KyryloKuzyk

    Joined: Nov 4, 2013
    Posts: 1,144
    My plugin only covers the AR Foundation API, and unfortunately Kinect is not supported by AR Foundation.
    I have a plan to add body tracking support in the future, but again, this will be an iOS-only feature.