
Other Mars Face tracking questions

Discussion in 'Unity MARS' started by jiraphatK, Feb 17, 2021.

  1. jiraphatK


    Joined:
    Sep 29, 2018
    Posts:
    300
We're evaluating whether we should use MARS in our project.
    We are in desperate need of advanced face tracking that works on both platforms.
    I have tried various hacks around AR Foundation but found them massively lacking.

Face Mask, Face painting
    This seems to work fine on both platforms. I can just paint a custom texture and apply it to the ARFace material.

Face Landmark
    This is tricky, since AR Foundation does not give us any information about landmarks. I need to calculate landmark positions from the ARFace's vertex positions. This method works, but it is quite painful, because the vertex positions and ordering differ between the two platforms and there is no guarantee they will stay the same in the next version.
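As a rough sketch of this vertex-based approach (not MARS or AR Foundation code; the vertex indices below are made up, and in practice they differ per platform and version, which is exactly the fragility described):

```python
# Sketch: estimate a face landmark as the centroid of a few mesh vertices.
# NOSE_TIP_INDICES is hypothetical; real ARFace meshes use different
# indices (and orderings) on iOS vs. Android.

NOSE_TIP_INDICES = [4, 5, 195]  # hypothetical indices near the nose tip

def landmark_from_vertices(vertices, indices):
    """Average the selected vertex positions to get one landmark position."""
    picked = [vertices[i] for i in indices]
    n = len(picked)
    return tuple(sum(v[axis] for v in picked) / n for axis in range(3))

# Tiny fake vertex buffer keyed by index:
verts = {4: (0.0, 0.0, 0.1), 5: (0.01, 0.0, 0.1), 195: (-0.01, 0.0, 0.1)}
nose_tip = landmark_from_vertices(verts, NOSE_TIP_INDICES)
```

Any update that renumbers or moves the mesh vertices silently breaks the hard-coded index list, which is why this approach is so brittle.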

BlendShape & Expression detection
    Works out of the box on iOS. On Android, though, I need to use the vertex positions to calculate whether the user is opening their mouth, smiling, blinking, etc., and then adjust the blendshape values accordingly. This does not work well, since on Android those vertices BARELY move (except the vertices around the mouth). In fact, the blink check did not work at all, because the vertices around the eyes did not move when the user blinked. This method also suffers from the same problem as Face Landmark, because I use the vertex positions directly.
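The mouth-open case described above can be sketched as a distance-to-weight mapping (a generic illustration, not platform code; the lip vertex positions and the calibrated distance range are assumptions):

```python
# Sketch: derive a "mouth open" blendshape weight from two mesh vertices.
# closed_dist/open_dist are hypothetical calibration values in meters.

def mouth_open_coefficient(upper_lip, lower_lip,
                           closed_dist=0.005, open_dist=0.04):
    """Map the lip gap onto a 0..1 blendshape weight."""
    dx = upper_lip[0] - lower_lip[0]
    dy = upper_lip[1] - lower_lip[1]
    dz = upper_lip[2] - lower_lip[2]
    gap = (dx * dx + dy * dy + dz * dz) ** 0.5
    # Normalize into [0, 1]; clamp outside the calibrated range.
    t = (gap - closed_dist) / (open_dist - closed_dist)
    return max(0.0, min(1.0, t))
```

This works passably for the mouth precisely because those vertices move a lot; for eyes that barely move between open and blink, the signal is smaller than the noise, matching the failure described above.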

    Face deformation
    none

I read the documentation and found that MARS can solve Face Landmark, but there's nothing about blendshapes.
    Does MARS have blendshape or facial-expression detection that I can use to drive blendshapes? Are there any plans to support face deformation?
     
  2. jmunozarUTech


    Unity Technologies

    Joined:
    Jun 1, 2020
    Posts:
    297
    Hello @jiraphatK,

The short answer is no, we don't support blendshapes; the longer answer (in @mtschoen's own words) is from this thread:
    https://forum.unity.com/threads/arkit-facetracking-with-truedepth.903548/#post-6523851


Blendshapes in particular are a tricky subject in today's AR landscape. As far as I know, ARKit is the only platform to use that particular set of blendshapes. As a result, if you author content to use them, it will likely only work on Apple devices with TrueDepth cameras. It won't, for example, work on Android devices or other potential platforms which support face tracking with head pose estimation but do not provide the same blendshapes as ARKit. It's feasible to try to "translate" blendshape values if they are similar enough, but things break down quickly as the different representations of facial expressions diverge.

Furthermore, as a way of responding to facial expressions, blendshapes can be a bit limiting. They are not all directly mapped to expressions (like smile, frown, wink, etc.), although they can be used this way. They are very good for deforming a mesh that has been authored specifically with the platform's configuration in mind, but not very good at providing general-purpose anchor points for locations on the face. For example, blendshapes do not help me place a mustache on the user's upper lip, because I still need the face mesh to locate the "starting point" for the mouth open/close expression.
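The mesh-deformation role of blendshapes mentioned above boils down to a weighted sum of per-vertex deltas over a base mesh. A minimal, generic sketch (not MARS or ARKit code; the shape name is illustrative):

```python
def apply_blendshapes(base, shapes, weights):
    """Deform a mesh with weighted blendshape deltas.

    base:    list of (x, y, z) base-mesh vertices
    shapes:  dict of shape name -> list of per-vertex (dx, dy, dz) deltas
    weights: dict of shape name -> coefficient in [0, 1]
    Returns the deformed vertex list.
    """
    out = []
    for i, (x, y, z) in enumerate(base):
        for name, deltas in shapes.items():
            w = weights.get(name, 0.0)
            dx, dy, dz = deltas[i]
            x, y, z = x + w * dx, y + w * dy, z + w * dz
        out.append((x, y, z))
    return out
```

Note that the deltas are authored against one specific mesh topology, which is why blendshape content tied to ARKit's set does not transfer to meshes or platforms with different topologies.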

    In MARS, we expose a set of landmarks and expressions which we intend to be a "minimum subset" that all platforms should support. In our ARFoundationFaceTrackingProvider, we utilize the platform-provided mesh to produce these landmarks and expression values based on the position of specific triangles that we know to correspond to facial features. If you want to make googly eyes or fake eyebrows, landmarks are a great way to drag and drop content onto specific facial features. You can read more about this in our documentation:

    https://docs.unity3d.com/Packages/c...acking.html#placing-digital-content-on-a-face

    As a final note, these landmarks are included in the face recordings that come with MARS, and can therefore be simulated in the Editor.

For expression data on Android, we use a set of position ranges for these landmarks to do a basic estimation; however, this approach does not work for all faces, and we do not recommend it for production apps. On iOS with TrueDepth, we actually use blendshape data to calculate the expressions, which is a bit more reliable. The expression system is best suited to future integrations with SDKs that are specifically intended to provide expression/emotion information.
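The range-based estimation described above could look roughly like this (an editorial sketch; the landmark axes and numeric ranges are invented, not the MARS provider's actual values):

```python
# Sketch: an expression is considered active when a landmark coordinate
# falls inside a calibrated range. All names and ranges are hypothetical.

EXPRESSION_RANGES = {
    "mouth_open":    ("jaw_y", -0.05, -0.03),
    "eyebrow_raise": ("brow_y", 0.045, 0.06),
}

def estimate_expressions(landmark_values):
    """landmark_values: dict of axis name -> measured coordinate."""
    active = {}
    for name, (axis, lo, hi) in EXPRESSION_RANGES.items():
        v = landmark_values.get(axis)
        active[name] = v is not None and lo <= v <= hi
    return active
```

Because the ranges are fixed, faces whose resting geometry sits outside them misclassify, which is why this estimation is not recommended for production.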

    If your intent is to create an "animoji" type character based on ARKit blendshapes, this is unfortunately not a workflow that we support in MARS. You are able to access the blendshape data through the AR Foundation session that we use to implement face tracking in MARS, but we do not expose blendshapes through our MARS data types at this time.

    We released a code sample back in 2018 which may help you iterate on characters using ARKit blend shapes. https://github.com/Unity-Technologies/facial-ar-remote

    Hope this helps!
     
  3. jiraphatK


    Joined:
    Sep 29, 2018
    Posts:
    300
Thanks for clarifying.
    Where can I read about this? I can't find information about it in the docs.
     
  4. jmunozarUTech


    Unity Technologies

    Joined:
    Jun 1, 2020
    Posts:
    297
Hello @jiraphatK,

For an easy-to-follow guide on face landmarks, you can check https://docs.unity3d.com/Packages/com.unity.mars@1.2/manual/FaceTracking.html

For a more in-depth guide about landmarks, including how to create your own custom landmarks, you can check https://docs.unity3d.com/Packages/com.unity.mars@1.2/manual/Landmarks.html

With regard to expressions, you can check the FaceExpressionAction, where we have a set of basic expressions.

    Let us know if you have any other questions. :)
     