
ARFoundation Face Tracking step-by-step

Discussion in 'Handheld AR' started by jimmya, Dec 18, 2018.

  1. jimmya

    jimmya

    Unity Technologies

    Joined:
    Nov 15, 2016
    Posts:
    793
    We have just announced the release of the latest version of ARFoundation that works with Unity 2018.3.

    For this release, we have also updated the ARFoundation samples project with examples that show off some of the new features. This post will explain more about the Face Tracking feature that is shown off in the ARFoundation samples.

    The Face Tracking example scenes currently only work on iOS devices that have a TrueDepth camera: iPhone X, XR, XS, XS Max, or iPad Pro (2018).

    The Face Tracking feature is exposed as part of the Face Subsystem, like other subsystems: you can subscribe to events that inform you when a new face has been detected. That event provides basic information about the face, including its pose (position and rotation) and its TrackableId (a session-unique id for any tracked object in the system). Using the TrackableId, you can then get more information about that particular face from the subsystem, including a description of its mesh and the blendshape coefficients that describe the expression on that face. There are also events you can subscribe to that let you detect when a face has been updated or removed.

    As usual, ARFoundation provides some abstraction via the ARFaceManager component. Add this component to an ARSessionOrigin in the scene; when it detects a face, it instantiates a copy of the "FacePrefab" prefab as a child of the ARSessionOrigin. It also updates and removes the generated "Face" GameObjects as the face is updated or removed, respectively.
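    In code, reacting to those add/update/remove notifications might look like the sketch below. This is an illustration only: the facesChanged event and the ARFacesChangedEventArgs type follow later ARFoundation versions, and FaceEventLogger is a hypothetical component name.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Sketch: log faces as the ARFaceManager adds, updates, and removes them.
// Event and property names follow later ARFoundation versions and may
// differ in the release described in this post.
public class FaceEventLogger : MonoBehaviour
{
    [SerializeField] ARFaceManager m_FaceManager;

    void OnEnable()  { m_FaceManager.facesChanged += OnFacesChanged; }
    void OnDisable() { m_FaceManager.facesChanged -= OnFacesChanged; }

    void OnFacesChanged(ARFacesChangedEventArgs args)
    {
        foreach (var face in args.added)
            Debug.Log("Face added: " + face.trackableId);
        foreach (var face in args.updated)
            Debug.Log("Face updated: " + face.trackableId);
        foreach (var face in args.removed)
            Debug.Log("Face removed: " + face.trackableId);
    }
}
```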

    You usually place an ARFace component on the root of the "FacePrefab" prefab above, so that the generated GameObject automatically updates its position and orientation according to the data provided by the underlying subsystem.

    You can see how this works in the FaceScene in ARFoundation-samples. First, create a prefab named FacePrefab that has an ARFace component on the root of the GameObject hierarchy:



    You then plug this prefab reference into the ARFaceManager component that you added to the ARSessionOrigin:



    You will also need to check the "ARKit Face Tracking" checkbox in the ARKitSettings asset as explained here.

    When you build this to a device that supports ARKit Face Tracking and run it, you should see the three colored axes from FacePrefab appear on your face, following its position and orientation as it moves.

    You can also make use of the face mesh in your AR app by getting the information from the ARFace component. Open up FaceMeshScene and look at the ARSessionOrigin. It is set up in the same way as before, with just a different prefab being referenced by the ARFaceManager. You can take a look at this FaceMeshPrefab:



    You can see that the MeshFilter on the prefab is actually empty. This is because the ARFaceMeshVisualizer script component creates a mesh and fills in the vertices and texture coordinates based on the data it gets from the ARFace component. You will also notice that the MeshRenderer contains the material this mesh will be rendered with, in this case a texture with three colored vertical bands. You can change this material to create masks, face paint, etc. If you build this scene to a device, you should see your face rendered with the specified material.
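    A stripped-down version of what ARFaceMeshVisualizer does might look like the following sketch. It is not the sample's actual code: the vertices, indices, and uvs properties and the updated event follow later ARFoundation versions, and SimpleFaceMesh is a hypothetical name.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Sketch: copy vertex data from the ARFace component into a Mesh each
// time the face updates. API names follow later ARFoundation versions.
[RequireComponent(typeof(ARFace), typeof(MeshFilter))]
public class SimpleFaceMesh : MonoBehaviour
{
    ARFace m_Face;
    Mesh m_Mesh;

    void Awake()
    {
        m_Face = GetComponent<ARFace>();
        m_Mesh = new Mesh();
        GetComponent<MeshFilter>().sharedMesh = m_Mesh;
    }

    void OnEnable()  { m_Face.updated += OnFaceUpdated; }
    void OnDisable() { m_Face.updated -= OnFaceUpdated; }

    void OnFaceUpdated(ARFaceUpdatedEventArgs args)
    {
        // Rebuild the mesh from the subsystem-provided face geometry.
        m_Mesh.Clear();
        m_Mesh.SetVertices(m_Face.vertices);
        m_Mesh.SetIndices(m_Face.indices, MeshTopology.Triangles, 0);
        m_Mesh.SetUVs(0, m_Face.uvs);
        m_Mesh.RecalculateNormals();
    }
}
```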


    You can also make use of the blendshape coefficients that are output by ARKit Face Tracking. Open up the FaceBlendShapeScene and check out the ARSessionOrigin. It looks the same as before, except that the prefab it references is now the FaceBlendShapes prefab. You can take a look at that prefab:



    In this case, there is an ARFaceARKitBlendShapeVisualizer that references the SkinnedMeshRenderer GameObject of this prefab so that it can manipulate the blendshapes on that component. It gets the blendshape coefficients from the ARKit SDK and drives the blendshapes on our sloth asset so that it replicates your expression. Build it to a supported device and try it out!
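    Reading the coefficients yourself might look like this sketch, modeled loosely on the sample's visualizer. BlendShapeLogger is a hypothetical name, and the GetBlendShapeCoefficients call follows the ARKit Face Tracking package API, which may differ between versions.

```csharp
using Unity.Collections;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARKit;

// Sketch: fetch ARKit blendshape coefficients for this face each frame.
[RequireComponent(typeof(ARFace))]
public class BlendShapeLogger : MonoBehaviour
{
    void Update()
    {
        var faceManager = FindObjectOfType<ARFaceManager>();
        var subsystem = faceManager != null
            ? faceManager.subsystem as ARKitFaceSubsystem
            : null;
        if (subsystem == null)
            return;

        var face = GetComponent<ARFace>();
        using (var coefficients = subsystem.GetBlendShapeCoefficients(
                   face.trackableId, Allocator.Temp))
        {
            foreach (var c in coefficients)
            {
                // In a real visualizer you would map each location (e.g.
                // JawOpen) to a blendshape index on your SkinnedMeshRenderer
                // and call SetBlendShapeWeight with the coefficient.
                Debug.Log(c.blendShapeLocation + ": " + c.coefficient);
            }
        }
    }
}
```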

    For more detailed coverage of the face tracking support, have a look at the docs.

    This was a summary of how to create your own face tracking experiences with the new ARFoundation release. Please take some time to try it out and let us know how it works out! As always, I'm happy to post any cool videos, demos or apps you create on https://twitter.com/jimmy_jam_jam
     

  2. Staus

    Staus

    Joined:
    Jul 7, 2014
    Posts:
    13
    Awesome! Is there any way of exposing the raw TrueDepth video feed from iOS devices?
    I know that ARFoundation aims to abstract device-specific features away, but it just seems such a pity not to have access to this data when it's sitting right there, and face tracking likely makes use of it anyway.
    Thanks for the good work!
     
  3. jimmya

    jimmya

    Unity Technologies

    Joined:
    Nov 15, 2016
    Posts:
    793
    ARKit Face Tracking does not make this available to end users but abstracts it away. There are other APIs that will expose it, but it is out of scope right now. The depth map may not be as useful as you think: they have to apply a lot of filtering and smoothing in their SDK to enable these features.
     
  4. nbkhanhdhgd

    nbkhanhdhgd

    Joined:
    Apr 19, 2016
    Posts:
    23
    It worked perfectly on iPhone X.
    But the camera defaults to the front camera. Can I switch to the back camera?
     
  5. m3rt32

    m3rt32

    Joined:
    Mar 5, 2013
    Posts:
    73
    ARCore introduced their face tracking support. Do you have any plans for this?
     
  6. xiaocheny

    xiaocheny

    Joined:
    Feb 20, 2018
    Posts:
    1
    What about Android devices? Do the face tracking samples not work there so far?
     
  7. lincode

    lincode

    Joined:
    May 2, 2019
    Posts:
    3
    No, we can't currently. It requires the TrueDepth camera, which for now is only available as a front camera feature of the iPhone X.
     
  8. WR-reddevil

    WR-reddevil

    Joined:
    Apr 16, 2019
    Posts:
    52
    Does anyone know how to identify face parts like the eyes, nose, etc.?
     
  9. Pawnkiller

    Pawnkiller

    Joined:
    Apr 7, 2016
    Posts:
    4
    How can I switch between the assets that appear on the face? I'm able to switch between different textures or different assets, but not between assets and textures.
     
  10. JBMS

    JBMS

    Joined:
    Sep 30, 2017
    Posts:
    6
    I am also trying to determine if eye transforms are supported in ARFoundation or its subsystems.

    With the (now deprecated) Unity ARKit Plugin, we could use leftEyePose and rightEyePose on the ARFaceAnchor.

    Is there an equivalent with ARFoundation, etc?
     
  11. AdrienMgm

    AdrienMgm

    Joined:
    Oct 10, 2017
    Posts:
    3
    Same question here: is it possible to use the eye poses with ARFoundation?
     
  12. marcinpolakowski15

    marcinpolakowski15

    Joined:
    Aug 7, 2019
    Posts:
    2
    I find it completely irresponsible for Unity to deprecate a plugin before its successor has caught up. Eye Gaze tracking is an essential part of the AR experience, and the old ARKit Plugin did this perfectly well.

    Not only is eye gaze apparently unsupported by AR Foundation, we also have total radio silence from Unity on the subject. It's a disgraceful way to treat devs who rely on their technology, the least they could do is reply.
     
  13. davidmo_unity

    davidmo_unity

    Unity Technologies

    Joined:
    Jun 18, 2019
    Posts:
    26
    We understand your frustration and want to address your concerns. I am happy to inform everyone that eye transforms and fixation points will be available in the next preview release of ARFoundation and related packages (ARKit-Face-Tracking).

    A short preview of how it will work: each ARFace gains three new nullable properties: leftEyeTransform, rightEyeTransform, and fixationPoint. Each is exposed as a Unity Transform, so to utilize them you simply need to add your content as a child of the transform. We have also made sure to have a few samples made to assist with utilizing the "new" feature.
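    Based on that description, usage might look like the following sketch. EyeContent and m_EyePrefab are hypothetical names; the leftEyeTransform/rightEyeTransform properties are taken from the preview described above and may change before release.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Sketch: attach content under the eye transforms once they are available.
[RequireComponent(typeof(ARFace))]
public class EyeContent : MonoBehaviour
{
    [SerializeField] GameObject m_EyePrefab; // hypothetical prefab reference

    void Update()
    {
        var face = GetComponent<ARFace>();

        // The properties are nullable: they are null when the device or
        // session does not support eye tracking.
        if (face.leftEyeTransform != null && face.leftEyeTransform.childCount == 0)
            Instantiate(m_EyePrefab, face.leftEyeTransform, false);
        if (face.rightEyeTransform != null && face.rightEyeTransform.childCount == 0)
            Instantiate(m_EyePrefab, face.rightEyeTransform, false);
    }
}
```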

    Unfortunately, I can't speak to what the timeframe for the release will be but I did want to at least address the concerns surrounding this feature as it is a crucial feature for many AR applications.
     
    BuoDev likes this.
  14. GeniusKoala

    GeniusKoala

    Joined:
    Oct 13, 2017
    Posts:
    4
    Hi, is it compatible with the iPhone 11, since it integrates a TrueDepth camera? I hesitate to invest in an iPhone X, but it is quite expensive and rare on the Internet, unlike the iPhone 11, which is new and cheaper. Which iPhone will give the best performance?

    Plus, do I need a Mac to run the live stream from the iPhone in Unity? I need to check the materials I need to do some mocap tests (with the Xsens body suit, which does not have software for macOS, and the iPhone X). Thank you so much for your support! It's my first try at mocap :)
     
  15. jalajshah

    jalajshah

    Joined:
    Mar 5, 2018
    Posts:
    31
    Hello everyone, I want to create an avatar app using Unity and ARFoundation with ARCore. Is that possible?
    I need to use blendshapes with controls for the eyes, nose, mouth, tongue, and ears, but I don't know where to start!
     
  16. rainbowmimizu

    rainbowmimizu

    Joined:
    Sep 19, 2017
    Posts:
    2
    Hello, is it possible to add an option that does not remove the prefab but just holds it in place while the tracked face is missing from the camera?
    I'm using it for facial capture, and suddenly exposing the actor's face is not acceptable... :(
    Maybe I could catch the remove event and duplicate the prefab until a new face appears, but I'm looking for a simpler way.
     
  17. rainbowmimizu

    rainbowmimizu

    Joined:
    Sep 19, 2017
    Posts:
    2
    Found the easiest solution. It seems the SkinnedMeshRenderer is disabled while the camera loses the face.
    Just force it on in Update(), and the face never disappears.

    transform.GetComponent<SkinnedMeshRenderer>().enabled = true;
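    Wrapped up as a component, that workaround might look like the sketch below. KeepFaceVisible is a hypothetical name; note that forcing the renderer on every frame overrides the visualizer's own enable/disable logic, which is exactly the point here.

```csharp
using UnityEngine;

// Sketch: keep the skinned mesh visible even while the camera has lost
// the face, by re-enabling the renderer every frame.
public class KeepFaceVisible : MonoBehaviour
{
    SkinnedMeshRenderer m_Renderer;

    void Awake()
    {
        m_Renderer = GetComponent<SkinnedMeshRenderer>();
    }

    void Update()
    {
        if (m_Renderer != null)
            m_Renderer.enabled = true;
    }
}
```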
     
  18. atorisa

    atorisa

    Joined:
    Dec 1, 2013
    Posts:
    233
    The docs say: "
    facesChanged
    Raised for each new ARFace detected in the environment.
    "
    But in fact, on Android the event is dispatched like an Update function: it continues to fire for the same face as long as my face stays in front of the camera.

    Is this a bug? @davidmo_unity @jimmya @mdurand

    ARFoundation 3.0.1 with the FaceMesh scene from the samples.

    P.S. So, compared with iOS, Android can't remember the face. Is that normal?
     
  19. GeniusKoala

    GeniusKoala

    Joined:
    Oct 13, 2017
    Posts:
    4
    I can run the remote with the Sloth on Unity 2019.2.9f1, but I'm trying to update to 2019.3.x with no success. Has anyone made it work on 2019.3.x?

    Plus, my client app built with 2019.3.7 crashes on launch on my iPhone 11. The Xcode error says that my device does not have ARKit enabled... but built with 2019.2.9f1 it works great.

    Weirdly, it works with my Mac (server scene in Unity 2019.3.7) and my iPhone 11 (built with Unity 2019.2.9f1), whereas on my PC (server scene in 2019.3.7) it does not. I thought the computer needed the same version used to build the client app, but on my Mac it works with two different versions.

    Is there any plan for the Facial AR Remote project to be supported in the future? It's a very interesting project!
     
  20. sameel

    sameel

    Joined:
    Dec 4, 2015
    Posts:
    15
  21. theminer1236

    theminer1236

    Joined:
    Apr 21, 2020
    Posts:
    2
    Hello,
    I was wondering whether it is possible to get plane tracking on the rear camera and face tracking on the front. They both work individually, but when I add them together I can never see the rear camera. There is also a delay in the face tracking when I add plane tracking, and I'm not sure what is happening.
     
  22. waldgeist

    waldgeist

    Joined:
    May 6, 2017
    Posts:
    100
    How can I detect if the device actually supports face tracking?
     
  23. atorisa

    atorisa

    Joined:
    Dec 1, 2013
    Posts:
    233
    First, you can do a basic check with the AR Foundation Support Checker.

    Additionally, you can check the conditions described here:
     
  24. KirillKuzyk

    KirillKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    112
    You can check whether ARFaceManager.descriptor is not null. Please wait until ARSession.state >= ARSessionState.Ready before executing this code.
    Code (CSharp):
    var faceManager = FindObjectOfType<ARFaceManager>();
    var faceManagerDescriptor = faceManager.descriptor;
    if (faceManagerDescriptor != null) {
        Debug.Log("face tracking is supported. To determine supported features, please access faceManagerDescriptor properties.");
    } else {
        Debug.Log("face tracking is not supported");
    }
     
  25. waldgeist

    waldgeist

    Joined:
    May 6, 2017
    Posts:
    100
    Thanks for the fast response.

    Unfortunately, this only works if the Face Tracker has already been activated, which is not what I want, because it forces my camera to the front.

    I need to detect the Face Tracking capability before actually activating it, to show or hide a UI button accordingly.
     
  26. KirillKuzyk

    KirillKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    112
    You can disable the ARFaceManager in your scene and activate it only when you need to actually start face tracking.
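    In code, that toggle might look like the sketch below. FaceTrackingToggle is a hypothetical helper name; it assumes a single ARFaceManager exists somewhere in the scene.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Sketch: enable face tracking only on demand by toggling the manager
// component rather than keeping it active from startup.
public static class FaceTrackingToggle
{
    public static void SetFaceTrackingActive(bool active)
    {
        var faceManager = Object.FindObjectOfType<ARFaceManager>();
        if (faceManager != null)
            faceManager.enabled = active;
    }
}
```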
     
  27. waldgeist

    waldgeist

    Joined:
    May 6, 2017
    Posts:
    100
    Yes, that's exactly what I do to switch the camera between world and user facing. But I found that it needs to be activated to get your code to work. Otherwise, it only reports face tracking as not being available. Since my app starts in world-facing mode, activating the face tracker just to detect capabilities would cause some weird behaviour.
     
  28. waldgeist

    waldgeist

    Joined:
    May 6, 2017
    Posts:
    100
    I checked the code of the "support checker", but it seems to only check whether AR is available, not the existence of a particular feature.

    As for the native docs, that would require me to develop native Unity plugins, right? And it would not guarantee that the Unity implementation of face tracking matches. For instance, face tracking was available in ARCore before Unity supported it there. So I would prefer a reliable way to test capabilities right in ARFoundation, but without having to activate the face tracker first.

    The strange thing is that enabling the face tracker when face tracking is not available prints a message to the iOS logs, but gives you no feedback in code.
     
  29. KirillKuzyk

    KirillKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    112
    @waldgeist I found a clean solution! It can even be called in Awake() and will NOT trigger a camera permission request.
    Code (CSharp):
    public static bool DetermineIfFaceTrackingSupported() {
        var descriptors = new List<XRFaceSubsystemDescriptor>();
        SubsystemManager.GetSubsystemDescriptors(descriptors);
        if (descriptors.Any()) {
            var descriptor = descriptors.First();
            Debug.Log("Face Tracking is supported, supportsEyeTracking: " + descriptor.supportsEyeTracking +
                ", supportsFacePose: " + descriptor.supportsFacePose +
                ", supportsFaceMeshNormals: " + descriptor.supportsFaceMeshNormals +
                ", supportsFaceMeshUVs: " + descriptor.supportsFaceMeshUVs +
                ", supportsFaceMeshVerticesAndIndices: " + descriptor.supportsFaceMeshVerticesAndIndices);
            return true;
        } else {
            Debug.Log("Face Tracking is not supported.");
            return false;
        }
    }
     
    Last edited: Jul 2, 2020 at 6:19 PM
  30. waldgeist

    waldgeist

    Joined:
    May 6, 2017
    Posts:
    100
    Hey, perfect, I will try it out and report back!
     
  31. waldgeist

    waldgeist

    Joined:
    May 6, 2017
    Posts:
    100
    Works, thanks!
     
  32. atorisa

    atorisa

    Joined:
    Dec 1, 2013
    Posts:
    233
    Awesome.

    P.S. A better name would be: public static bool IsFaceTrackingSupported()
     