ARKit function for blendShape--> faceGeometry to modify face mesh

Discussion in 'AR' started by NumesSanguis, Nov 16, 2017.

  1. NumesSanguis

    NumesSanguis

    Joined:
    Nov 9, 2017
    Posts:
    6
    I want to use the blendShapes dictionary to modify a face mesh in Unity.

    In the example of the `Assets\UnityARKitPlugin\Examples\FaceTracking\FaceMeshScene`, the face mesh is being updated by `UnityARFaceMeshManager` script in the same directory. It does this by reading the `anchorData.faceGeometry` received from the iPhone X and updating the faceMesh.

    In another example, `FaceBlendshapeScene`, the script `BlendshapePrinter` is used to retrieve the values of the blendShapes dictionary and output them in the GUI. The problem is that this example only shows how to output these values.

    What I want to do is to retrieve the dictionary, change the blendshape value, and update the mesh in Unity based on that value. E.g. the blendShape dict contains `<"jawOpen", 0.8>` and I want to change it to `<"jawOpen", 0.4>` (an avatar that has trouble opening its mouth).

    However, I cannot find any function in the ARKit plugin that transforms the mesh based on the blendShape dict, or that returns faceGeometry values based on the blendShape dict, which I could then use to update the mesh in Unity.

    According to Apple's developer documentation, it should be possible to create face geometry from the values (0.0 - 1.0) in an `ARFaceAnchor.BlendShapeLocation` dictionary. How do I do this in Unity?
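    For reference, the workaround I'm experimenting with is to skip the ARKit face mesh entirely and instead drive blendshapes on my own avatar's SkinnedMeshRenderer, remapping the coefficients before applying them. Roughly like this (a sketch only — the `ARFaceAnchorUpdatedEvent` event and the `blendShapes` dictionary are what I see in the plugin source, and it assumes the avatar's blendshape names match ARKit's keys):

    ```csharp
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.XR.iOS; // namespace of the Unity ARKit plugin

    public class RemappedBlendshapeDriver : MonoBehaviour
    {
        public SkinnedMeshRenderer avatarRenderer; // avatar mesh with ARKit-named blendshapes

        // Per-coefficient multipliers, e.g. 0.5 for "jawOpen" on an avatar
        // that has trouble opening its mouth.
        Dictionary<string, float> multipliers = new Dictionary<string, float>
        {
            { "jawOpen", 0.5f }
        };

        void Start()
        {
            UnityARSessionNativeInterface.ARFaceAnchorUpdatedEvent += FaceUpdated;
        }

        void OnDestroy()
        {
            UnityARSessionNativeInterface.ARFaceAnchorUpdatedEvent -= FaceUpdated;
        }

        void FaceUpdated(ARFaceAnchor anchorData)
        {
            foreach (KeyValuePair<string, float> entry in anchorData.blendShapes)
            {
                float weight = entry.Value; // 0.0 - 1.0 from ARKit
                float m;
                if (multipliers.TryGetValue(entry.Key, out m))
                    weight *= m; // e.g. jawOpen 0.8 -> 0.4

                // Only works if the avatar has a blendshape with the same name.
                int index = avatarRenderer.sharedMesh.GetBlendShapeIndex(entry.Key);
                if (index >= 0)
                    avatarRenderer.SetBlendShapeWeight(index, weight * 100f); // Unity weights are 0-100
            }
        }
    }
    ```

    This doesn't regenerate the ARKit faceGeometry itself (the tracked face mesh stays as-is); it only drives a separate avatar mesh, which is why I'm still looking for a plugin function that rebuilds the geometry from a modified dictionary.
    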

    Related links:
    https://developer.apple.com/documentation/arkit/arfaceanchor/2928251-blendshapes
    https://developer.apple.com/documentation/arkit/arfacegeometry/2928204-init
    https://developer.apple.com/documentation/arkit/arfaceanchor.blendshapelocation
     
  2. vstreamdigital

    vstreamdigital

    Joined:
    Dec 31, 2015
    Posts:
    1
    I was wondering if there was an implementation of the init() function somewhere too. Being able to build the mesh from blend shapes would be quite useful.

    I want to get each of the blend shapes separately as a target, so our modellers can make better blend shapes for our custom models — we're having a lot of issues with mouth shapes, and we want to use them for recording actors. (Although ideally Apple would have just provided an FBX with the blendshapes already set up.)
     
  3. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    There is a sloth face with blendshapes set up, if you want to use that as a reference. The other reference is the Apple docs, which show a drawing of the expected shape of the face when a particular blendshape is active:

    e.g. for eyeBlinkRight:
     
  4. jilt

    jilt

    Joined:
    Nov 7, 2014
    Posts:
    49
    Exposing the functionality to compose the mesh from blendshapes could still be very handy.
     
  5. User340

    User340

    Joined:
    Feb 28, 2007
    Posts:
    3,001
  6. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
  7. User340

    User340

    Joined:
    Feb 28, 2007
    Posts:
    3,001
  8. BrandStone

    BrandStone

    Joined:
    Jul 21, 2014
    Posts:
    79
    Question: can I install the iPhone Facial AR Remote app on any iPhone X, or do I need to be an Apple developer?

    Thanks
     
  9. AdamBebko

    AdamBebko

    Joined:
    Apr 8, 2016
    Posts:
    168
    This guy has an awesome tutorial. Might be helpful to you guys.

     
  10. thexdd

    thexdd

    Joined:
    Mar 20, 2013
    Posts:
    20
    Does anyone happen to know how to implement something similar when my head is created at runtime? For some reason it just doesn't work at all. What I mean is, in my blendshapes visualizer script,
    `(ARKitFaceSubsystem)_faceManager.subsystem` is always null.
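    The only thing that has gotten me further so far is waiting for the manager's subsystem to exist before casting it, since it seems to be created only after the manager is enabled and the XR loader has initialized. A sketch of what I mean (assuming AR Foundation with the ARKit XR Plugin; `ARKitFaceSubsystem` comes from the `UnityEngine.XR.ARKit` namespace there):

    ```csharp
    using System.Collections;
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;
    using UnityEngine.XR.ARKit;

    public class BlendshapeVisualizer : MonoBehaviour
    {
        [SerializeField] ARFaceManager _faceManager;
        ARKitFaceSubsystem _faceSubsystem;

        IEnumerator Start()
        {
            // The subsystem can be null during the first frames, and stays null
            // forever if the manager is disabled or the session never starts.
            while (_faceManager == null || _faceManager.subsystem == null)
                yield return null;

            _faceSubsystem = (ARKitFaceSubsystem)_faceManager.subsystem;
            Debug.Log("ARKit face subsystem is ready.");
        }
    }
    ```

    If the head (and this script) are instantiated at runtime, the `_faceManager` reference also has to be wired up after instantiation, e.g. via `FindObjectOfType<ARFaceManager>()` — a missing reference looks exactly like a null subsystem.
    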