ARFoundation Face Tracking step-by-step

Discussion in 'AR' started by jimmya, Dec 18, 2018.

Thread Status:
Not open for further replies.
  1. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    We have just announced the release of the latest version of ARFoundation that works with Unity 2018.3.

    For this release, we have also updated the ARFoundation samples project with examples that show off some of the new features. This post will explain more about the Face Tracking feature that is shown off in the ARFoundation samples.

    The Face Tracking example scenes will currently only work on iOS devices that have a TrueDepth camera: the iPhone X, XR, XS, XS Max, or iPad Pro (2018).

    The Face Tracking feature is exposed as part of the Face Subsystem, similar to other subsystems: you can subscribe to events that inform you when a new face has been detected, and that event provides basic information about the face, including its pose (position and rotation) and its TrackableId (a session-unique id for any tracked object in the system). Using the TrackableId, you can then get more information about that particular face from the subsystem, including a description of its mesh and the blendshape coefficients that describe the expression on that face. There are also events you can subscribe to that tell you when a face has been updated or removed.

    ARFoundation, as usual, provides some abstraction via the ARFaceManager component. This component can be added to an ARSessionOrigin in the scene; it will instantiate a copy of the "FacePrefab" prefab as a child of the ARSessionOrigin when it detects a face, and it will update and remove the generated "Face" GameObjects as the face is updated or removed, respectively.
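
    As an illustration (not part of the samples), here is a minimal sketch of watching those events from a script. It assumes the facesChanged event that later ARFoundation versions expose on ARFaceManager (the earliest previews used separate added/updated/removed events); the component name is made up:

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;

    // Hypothetical sketch: log the manager's add/remove events.
    public class FaceEventsLogger : MonoBehaviour
    {
        // The ARFaceManager on your ARSessionOrigin.
        [SerializeField] ARFaceManager m_FaceManager;

        void OnEnable()
        {
            m_FaceManager.facesChanged += OnFacesChanged;
        }

        void OnDisable()
        {
            m_FaceManager.facesChanged -= OnFacesChanged;
        }

        void OnFacesChanged(ARFacesChangedEventArgs args)
        {
            foreach (var face in args.added)
                Debug.Log("Face added: " + face.trackableId);
            foreach (var face in args.removed)
                Debug.Log("Face removed: " + face.trackableId);
        }
    }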

    You usually place an ARFace component on the root of the "FacePrefab" prefab above so that the generated GameObject can automatically update its position and orientation according to the data that is provided by the underlying subsystem.

    You can see how this works in the FaceScene in ARFoundation-samples. You first create a prefab named FacePrefab that has an ARFace component on the root of the GameObject hierarchy:



    You then plug this prefab reference in the ARFaceManager component that you have added to ARSessionOrigin:



    You will also need to check the "ARKit Face Tracking" checkbox in the ARKitSettings asset as explained here.

    When you build this to a device that supports ARKit Face Tracking and run it, you should see the three colored axes from FacePrefab appear on your face, moving and orienting themselves to stay aligned with your face.

    You can also make use of the face mesh in your AR app by getting the information from the ARFace component. Open up FaceMeshScene and look at the ARSessionOrigin. It is set up in the same way as before, with just a different prefab referenced by the ARFaceManager. You can take a look at this FaceMeshPrefab:



    You can see that the MeshFilter on the prefab is actually empty. This is because the ARFaceMeshVisualizer script component will create a mesh and fill in the vertices and texture coordinates based on the data it gets from the ARFace component. You will also notice the MeshRenderer contains the material that this mesh will be rendered with, in this case a texture with three colored vertical bands. You can change this material to create masks, face paint, etc. If you build this scene to a device, you should see your face rendered with the specified material.
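
    To give a feel for what such a visualizer does, here is a rough, simplified sketch. It assumes the vertices/uvs/indices NativeArray properties that ARFace exposes in recent ARFoundation versions and the NativeArray-based Mesh setters of newer Unity releases; the real ARFaceMeshVisualizer handles more edge cases (tracking loss, normals, etc.):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;

    // Hypothetical simplified visualizer: copy ARFace data into a Mesh on each update.
    [RequireComponent(typeof(ARFace), typeof(MeshFilter))]
    public class FaceMeshSketch : MonoBehaviour
    {
        ARFace m_Face;
        Mesh m_Mesh;

        void Awake()
        {
            m_Face = GetComponent<ARFace>();
            m_Mesh = new Mesh();
            GetComponent<MeshFilter>().sharedMesh = m_Mesh;
        }

        void OnEnable()
        {
            m_Face.updated += OnFaceUpdated;
        }

        void OnDisable()
        {
            m_Face.updated -= OnFaceUpdated;
        }

        void OnFaceUpdated(ARFaceUpdatedEventArgs args)
        {
            // ARFace exposes the face geometry as NativeArrays.
            m_Mesh.Clear();
            m_Mesh.SetVertices(m_Face.vertices);
            m_Mesh.SetUVs(0, m_Face.uvs);
            m_Mesh.SetIndices(m_Face.indices, MeshTopology.Triangles, 0);
            m_Mesh.RecalculateNormals();
        }
    }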


    You can also make use of the blendshape coefficients that are output by ARKit Face Tracking. Open up the FaceBlendShapeScene and check out the ARSessionOrigin. It looks the same as before, except that the prefab it references is now the FaceBlendShapes prefab. You can take a look at that prefab:



    In this case, there is an ARFaceARKitBlendShapeVisualizer which references the SkinnedMeshRenderer GameObject of this prefab so that it can manipulate the blendshapes that exist on that component. This component gets the blendshape coefficients from the ARKit SDK and manipulates the blendshapes on our sloth asset so that it replicates your expression. Build it out to a supported device and try it out!
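
    For reference, a trimmed-down sketch of that idea, loosely modeled on the samples. It assumes the ARKitFaceSubsystem.GetBlendShapeCoefficients API from the ARKit Face Tracking package; the component name and the blendshape index mapping are made up:

    Code (CSharp):
    using Unity.Collections;
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;
    using UnityEngine.XR.ARKit;

    // Hypothetical sketch: drive one blend shape on a rig from one ARKit coefficient.
    [RequireComponent(typeof(ARFace))]
    public class BlendShapeSketch : MonoBehaviour
    {
        [SerializeField] SkinnedMeshRenderer m_Renderer;
        [SerializeField] int m_JawOpenBlendShapeIndex; // index on your rig; an assumption

        ARFace m_Face;
        ARKitFaceSubsystem m_Subsystem;

        void Start()
        {
            m_Face = GetComponent<ARFace>();
            var faceManager = FindObjectOfType<ARFaceManager>();
            m_Subsystem = faceManager.subsystem as ARKitFaceSubsystem;
        }

        void Update()
        {
            if (m_Subsystem == null)
                return;

            using (var coefficients = m_Subsystem.GetBlendShapeCoefficients(m_Face.trackableId, Allocator.Temp))
            {
                foreach (var c in coefficients)
                {
                    // ARKit reports coefficients in [0, 1]; Unity blend shape weights are [0, 100].
                    if (c.blendShapeLocation == ARKitBlendShapeLocation.JawOpen)
                        m_Renderer.SetBlendShapeWeight(m_JawOpenBlendShapeIndex, c.coefficient * 100f);
                }
            }
        }
    }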

    For more detailed coverage of the face tracking support, have a look at the docs.

    This was a summary of how to create your own face tracking experiences with the new ARFoundation release. Please take some time to try it out and let us know how it works out! As always, I'm happy to post any cool videos, demos or apps you create on https://twitter.com/jimmy_jam_jam
     

  2. Staus

    Staus

    Joined:
    Jul 7, 2014
    Posts:
    13
    Awesome! Is there any way of exposing the raw TrueDepth video feed from iOS devices?
    I know that ARFoundation aims to abstract device-specific features away, but it just seems like such a pity not to have access to this data when it's just sitting there, and face tracking, for example, likely makes use of it anyway.
    Thanks for the good work!
     
  3. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    ARKit Face Tracking does not make this available to end users but abstracts it away. There are other APIs that will expose it, but that is outside of the scope right now. The depth map may not be as useful as you think: they have to use a lot of filtering and smoothing in their SDK to enable these features.
     
  4. nbkhanhdhgd

    nbkhanhdhgd

    Joined:
    Apr 19, 2016
    Posts:
    23
    It worked perfectly on the iPhone X.
    But the camera defaults to the front camera. Can I switch to the back camera?
     
    ramphulneillpartners likes this.
  5. merpheus

    merpheus

    Joined:
    Mar 5, 2013
    Posts:
    202
    ARCore introduced their face tracking support. Do you have any plans for this?
     
  6. xiaocheny

    xiaocheny

    Joined:
    Feb 20, 2018
    Posts:
    1
    What about Android devices? Are the face tracking samples just not working there so far?
     
    ZenUnity likes this.
  7. lincode

    lincode

    Joined:
    May 2, 2019
    Posts:
    5
    No, we can't currently. It requires the TrueDepth camera, which is only available on the front of the iPhone X right now.
     
  8. Vinayak-VC

    Vinayak-VC

    Joined:
    Apr 16, 2019
    Posts:
    60
    Does anyone know how to identify face parts like the eyes, nose, etc.?
     
  9. Shreenath1322

    Shreenath1322

    Joined:
    Apr 7, 2016
    Posts:
    7
    How can I switch between the assets that appear on the face? I'm able to switch between different textures or different assets, but not between assets and textures.
     
  10. JBMS

    JBMS

    Joined:
    Sep 30, 2017
    Posts:
    6
    I am also trying to determine if eye transforms are supported in ARFoundation or its subsystems.

    With the (now deprecated) Unity ARKit Plugin, we could use leftEyePose and rightEyePose on the ARFaceAnchor.

    Is there an equivalent with ARFoundation, etc?
     
  11. AdrienMgm

    AdrienMgm

    Joined:
    Oct 10, 2017
    Posts:
    4
    Same question here, is it possible to use eyes pose with ARFoundation?
     
    ina likes this.
  12. marcinpolakowski15

    marcinpolakowski15

    Joined:
    Aug 7, 2019
    Posts:
    2
    I find it completely irresponsible for Unity to deprecate a plugin before its successor has caught up. Eye gaze tracking is an essential part of the AR experience, and the old ARKit plugin did this perfectly well.

    Not only is eye gaze apparently unsupported by AR Foundation, we also have total radio silence from Unity on the subject. It's a disgraceful way to treat devs who rely on their technology; the least they could do is reply.
     
    ina likes this.
  13. davidmo_unity

    davidmo_unity

    Unity Technologies

    Joined:
    Jun 18, 2019
    Posts:
    99
    We understand your frustration and want to address your concerns. I am happy to inform everyone that eye transforms and fixation points will be available in the next preview release of ARFoundation and related packages (ARKit-Face-Tracking).

    A short preview of how it will work: each ARFace gets three new nullable properties: leftEyeTransform, rightEyeTransform, and fixationPoint. Each of these is exposed as a Unity Transform, so to utilize them you simply need to add your content as a child of the transform. We have also made a few samples to assist with utilizing the new feature.

    Unfortunately, I can't speak to the timeframe for the release, but I did want to at least address the concerns surrounding this feature, as it is crucial for many AR applications.
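
    For illustration, a minimal sketch of consuming such eye transforms from an ARFace. The property names follow the API as it eventually shipped (ARFace.leftEye / rightEye / fixationPoint), which may differ slightly from the preview names above; the prefab field is a placeholder:

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;

    // Hypothetical sketch: parent content under the eye transforms once they appear.
    [RequireComponent(typeof(ARFace))]
    public class EyeAttachSketch : MonoBehaviour
    {
        [SerializeField] GameObject m_EyeContentPrefab; // your own prefab; an assumption

        ARFace m_Face;

        void Awake()
        {
            m_Face = GetComponent<ARFace>();
        }

        void OnEnable()
        {
            m_Face.updated += OnFaceUpdated;
        }

        void OnDisable()
        {
            m_Face.updated -= OnFaceUpdated;
        }

        void OnFaceUpdated(ARFaceUpdatedEventArgs args)
        {
            // The eye transforms stay null until the provider supports eye tracking.
            if (m_Face.leftEye != null && m_Face.leftEye.childCount == 0)
            {
                Instantiate(m_EyeContentPrefab, m_Face.leftEye);
                Instantiate(m_EyeContentPrefab, m_Face.rightEye);
            }
        }
    }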
     
    BuoDev likes this.
  14. GeniusKoala

    GeniusKoala

    Joined:
    Oct 13, 2017
    Posts:
    97
    Hi, is it compatible with the iPhone 11, since it integrates a TrueDepth camera? I hesitate to invest in an iPhone X because it is quite expensive and rare on the Internet, contrary to the iPhone 11, which is new and cheaper. Which iPhone will give the best performance?

    Plus, do I need a Mac to run the live stream from the iPhone in Unity? I need to check what materials I need to do some mocap tests (with the Xsens body suit, which does not have software for macOS, and the iPhone X). Thank you so much for your support! It's my first try at mocap :)
     
  15. jalajshah

    jalajshah

    Joined:
    Mar 5, 2018
    Posts:
    62
    Hello everyone, I want to create an avatar app using Unity and ARFoundation with ARCore. Is that possible?
    I need to use blend shapes with controllers for the eyes, nose, mouth, tongue, and ears, but I don't know where to start!
     
  16. rainbowmimizu

    rainbowmimizu

    Joined:
    Sep 19, 2017
    Posts:
    5
    Hello, is it possible to add an option to not remove the prefab, but just hold it while the tracked face is missing from the camera?
    I'm using it for facial capture, and suddenly exposing the actor's face is not acceptable... :(
    Maybe I could catch the remove event and duplicate the prefab until a new face appears, but I'm looking for a simpler way.
     
  17. rainbowmimizu

    rainbowmimizu

    Joined:
    Sep 19, 2017
    Posts:
    5
    I found the easiest solution. It seems the SkinnedMeshRenderer is disabled while the camera is missing the face.
    Just force it on in Update(), and the face never disappears.

    transform.GetComponent<SkinnedMeshRenderer>().enabled = true;
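
    Wrapped up as a component, the workaround might look like this minimal sketch (the component name is made up):

    Code (CSharp):
    using UnityEngine;

    // Hypothetical wrapper for the workaround above.
    public class KeepFaceVisible : MonoBehaviour
    {
        SkinnedMeshRenderer m_Renderer;

        void Awake()
        {
            m_Renderer = GetComponentInChildren<SkinnedMeshRenderer>();
        }

        void Update()
        {
            // The visualizer disables the renderer when tracking is lost;
            // re-enabling it every frame keeps the last pose on screen.
            if (m_Renderer != null)
                m_Renderer.enabled = true;
        }
    }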
     
    ilmario likes this.
  18. makaka-org

    makaka-org

    Joined:
    Dec 1, 2013
    Posts:
    1,026
    The docs say: "facesChanged - Raised for each new ARFace detected in the environment."
    But in fact, the event is dispatched like an Update function on Android: it continues to be dispatched for the same face while my face stays in front of the camera.

    Is this a bug? @davidmo_unity @jimmya @mdurand

    ARFoundation 3.0.1 with the FaceMesh scene from the samples.

    P.S. So, compared with iOS, Android can't remember the face. Is that normal?
     
    Parixit_Jadeja likes this.
  19. GeniusKoala

    GeniusKoala

    Joined:
    Oct 13, 2017
    Posts:
    97
    I can run the remote with the sloth on Unity 2019.2.9f1, but I'm trying to update to 2019.3.x with no success. Did anyone make it work on 2019.3.x?

    Plus, my client app built on 2019.3.7 crashes on launch on my iPhone 11. The Xcode error says that my device does not have ARKit enabled... but built on 2019.2.9f1 it works great.

    Weirdly, it works with my Mac (server scene on Unity 2019.3.7) and my iPhone 11 (built on Unity 2019.2.9f1), whereas on my PC (server scene on 2019.3.7) it does not. I thought the computer needed the same version used to build the client app, but on my Mac it works with two different versions.

    Is there any plan for the Facial AR Remote project to be supported in the future? It's a very interesting project!
     
  20. sameel

    sameel

    Joined:
    Dec 4, 2015
    Posts:
    17
  21. theminer1236

    theminer1236

    Joined:
    Apr 21, 2020
    Posts:
    2
    Hello,
    I was wondering if it is possible to get plane tracking on the rear camera and face tracking on the front. They both work individually, but when I add them together, I can never see the rear camera. There is also a delay in the face tracking when I add plane tracking, and I'm not sure what is happening.
     
  22. waldgeist

    waldgeist

    Joined:
    May 6, 2017
    Posts:
    388
    How can I detect if the device actually supports face tracking?
     
  23. makaka-org

    makaka-org

    Joined:
    Dec 1, 2013
    Posts:
    1,026
    First, you can use basic checking with the AR Foundation Support Checker.

    Additionally, you can check the conditions described here:
     
  24. KyryloKuzyk

    KyryloKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    1,145
    You can check whether ARFaceManager.descriptor is not null. Please wait until ARSession.state >= ARSessionState.Ready before executing this code.
    Code (CSharp):
    var faceManager = FindObjectOfType<ARFaceManager>();
    var faceManagerDescriptor = faceManager.descriptor;
    if (faceManagerDescriptor != null) {
        Debug.Log("face tracking is supported. To determine supported features, please access faceManagerDescriptor properties.");
    } else {
        Debug.Log("face tracking is not supported");
    }
     
  25. waldgeist

    waldgeist

    Joined:
    May 6, 2017
    Posts:
    388
    Thanks for the fast response.

    Unfortunately, this only works if the Face Tracker has already been activated, which is not what I want, because it forces my camera to the front.

    I need to detect the Face Tracking capability before actually activating it, to show or hide a UI button accordingly.
     
    BjoUnity3d and ilmario like this.
  26. KyryloKuzyk

    KyryloKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    1,145
    You can disable ARFaceManager in your scene and activate it only when you need to actually start face tracking.
     
  27. waldgeist

    waldgeist

    Joined:
    May 6, 2017
    Posts:
    388
    Yes, that's exactly what I do to switch the camera between world and user. But I found that it needs to be activated to get your code to work; otherwise, it only reports face tracking as not available. Since my app starts in world-facing mode, activating the face tracker just to detect capabilities would cause some weird behaviour.
     
    ilmario likes this.
  28. waldgeist

    waldgeist

    Joined:
    May 6, 2017
    Posts:
    388
    I checked the code of the "support checker", but it seems to only check whether AR is available, not the existence of a particular feature.

    For the native docs, this would require me to develop native Unity plugins, right? And this would not guarantee that the Unity implementation of face tracking matches: for instance, face tracking was available in ARCore before Unity supported it there. So I would prefer a reliable way to test capabilities right in ARFoundation, but without having to activate the face tracker first.

    The strange thing is that enabling the face tracker without face tracking being available prints a message to the iOS logs, but it does not give you any feedback.
     
  29. KyryloKuzyk

    KyryloKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    1,145
    UPDATE:
    It seems the following code does not work in all cases. It reports false on my iPhone 5s, but true on an iPhone 7 (link1, link2).
    Please use this answer instead.


    ORIGINAL WRONG POST:
    @waldgeist I found a clean solution! It can even be called in Awake() and will NOT trigger a camera permission request.
    Code (CSharp):
    using System.Collections.Generic;
    using System.Linq;
    using UnityEngine;
    using UnityEngine.XR.ARSubsystems;

    public static bool IsFaceTrackingSupported() {
        var descriptors = new List<XRFaceSubsystemDescriptor>();
        SubsystemManager.GetSubsystemDescriptors(descriptors);
        if (descriptors.Any()) {
            var descriptor = descriptors.First();
            Debug.Log("Face Tracking is supported, supportsEyeTracking: " + descriptor.supportsEyeTracking
                + ", supportsFacePose: " + descriptor.supportsFacePose
                + ", supportsFaceMeshNormals: " + descriptor.supportsFaceMeshNormals
                + ", supportsFaceMeshUVs: " + descriptor.supportsFaceMeshUVs
                + ", supportsFaceMeshVerticesAndIndices: " + descriptor.supportsFaceMeshVerticesAndIndices);
            return true;
        } else {
            Debug.Log("Face Tracking is not supported.");
            return false;
        }
    }
     
    Last edited: Sep 21, 2020
    IARI likes this.
  30. waldgeist

    waldgeist

    Joined:
    May 6, 2017
    Posts:
    388
    Hey, perfect, I will try it out and report back!
     
  31. waldgeist

    waldgeist

    Joined:
    May 6, 2017
    Posts:
    388
    Works, thanks!
     
  32. makaka-org

    makaka-org

    Joined:
    Dec 1, 2013
    Posts:
    1,026
    Awesome.

    P.S. Good naming is: "public static bool IsFaceTrackingSupported() "
     
    KyryloKuzyk likes this.
  33. Parixit_Jadeja

    Parixit_Jadeja

    Joined:
    Jul 27, 2017
    Posts:
    8
    How do I subscribe to these events? I can only subscribe to the "facesChanged" event, and it does not seem to work on an Android device.
     
  34. makaka-org

    makaka-org

    Joined:
    Dec 1, 2013
    Posts:
    1,026
    Code (CSharp):
    // arFaceManager is the ARFaceManager component on your ARSessionOrigin;
    // subscribe to its facesChanged event, e.g. in OnEnable:
    void OnEnable()
    {
        arFaceManager.facesChanged += ChangeFaces;
    }

    void OnDisable()
    {
        arFaceManager.facesChanged -= ChangeFaces;
    }

    private void ChangeFaces(ARFacesChangedEventArgs args)
    {
        if (args.added.Count > 0)
        {
            DebugPrinter.Print("Face Added");
        }

        if (args.updated.Count > 0)
        {
            DebugPrinter.Print("Face Updated");
        }
    }
     
  35. jinjer

    jinjer

    Joined:
    Oct 12, 2014
    Posts:
    3
    Hello.

    Does anyone know how to change ARFaceManager.facePrefab dynamically?
    Because currently I use this code:
    Code (CSharp):
    var faceManager = GetComponent<ARFaceManager>();
    if (faceManager != null)
    {
        Destroy(faceManager);
    }

    faceManager = gameObject.AddComponent<ARFaceManager>();
    faceManager.facePrefab = somePrefab;

    And when I want to change the prefab, the previous one is still displayed on the screen.

    Destroy(somePrefab) also doesn't help.
     
  36. Parixit_Jadeja

    Parixit_Jadeja

    Joined:
    Jul 27, 2017
    Posts:
    8

    Thanks for the reply. This is working for me.
     
  37. Parixit_Jadeja

    Parixit_Jadeja

    Joined:
    Jul 27, 2017
    Posts:
    8

    I think you should not destroy the ARFaceManager. I tried to change the prefab in several ways, but none of them worked.

    So I have a parent GameObject with all the prefabs I need as its children, and I make that parent GameObject the facePrefab on the ARFaceManager. You can then activate and deactivate the child objects.

    This is how I did it; if this is not an appropriate way or there is a better way to do it, please share.


    Code (CSharp):
    foreach (ARFace face in arFaceManager.trackables)
    {
        face.transform.GetChild(childIndex).gameObject.SetActive(true);
    }
     
    Last edited: Aug 20, 2020
  38. jinjer

    jinjer

    Joined:
    Oct 12, 2014
    Posts:
    3
    Thanks for your suggestion, but it won't work for me.

    I need to download some prefabs and then switch between them at runtime. They are different objects that I need to place on the face.
     
  39. IARI

    IARI

    Joined:
    May 8, 2014
    Posts:
    70
    I tried this, and had someone test our app with it on an iPhone 7 Plus. It appears that it returns true, which doesn't seem right: the iPhone 7 Plus does not have a TrueDepth camera.
    Am I missing something here?
     
  40. KyryloKuzyk

    KyryloKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    1,145
    Could you please share console logs? Is there a chance a descriptor is present, but it doesn't support any face tracking features?
     
  41. Suwas93

    Suwas93

    Joined:
    Feb 8, 2018
    Posts:
    62

    Same...I tried it on iPhone 7 and it says it supports it. Here's what I got: Face Tracking is supported, supportsEyeTracking: True, supportsFacePose: True, supportsFaceMeshNormals: False, supportsFaceMeshUVs: True, supportsFaceMeshVerticesAndIndices: True
     
    IARI likes this.
  42. Suwas93

    Suwas93

    Joined:
    Feb 8, 2018
    Posts:
    62
    This seems to be working though!!
     
  43. IARI

    IARI

    Joined:
    May 8, 2014
    Posts:
    70
    Unfortunately, I could not share logs, because we do not own the device - we were having someone test a build via TestFlight.
    Thanks @Suwas93 - judging from the info you got, it seems that this method is rather unreliable, giving false positives at least for the iPhone 7.

    We need a robust solution for this problem.
    Is a native iOS plugin necessary, or would ARFaceTrackingConfiguration.isSupported possibly do the job?
    https://developer.apple.com/documentation/arkit/arfacetrackingconfiguration
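
    If a native plugin turns out to be necessary, a hedged sketch of bridging ARFaceTrackingConfiguration.isSupported (a real ARKit API) to C# might look like this; all names here are made up:

    Code (CSharp):
    using System.Runtime.InteropServices;

    // Hypothetical bridge; names are made up. The native side would be a .m file
    // in Assets/Plugins/iOS along these lines:
    //
    //   #import <ARKit/ARKit.h>
    //   bool _IsFaceTrackingSupported(void) {
    //       return ARFaceTrackingConfiguration.isSupported;
    //   }
    //
    public static class NativeFaceTrackingCheck
    {
    #if UNITY_IOS && !UNITY_EDITOR
        [DllImport("__Internal")]
        static extern bool _IsFaceTrackingSupported();
    #else
        static bool _IsFaceTrackingSupported() { return false; }
    #endif

        public static bool IsSupported
        {
            get { return _IsFaceTrackingSupported(); }
        }
    }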
     
    Last edited: Sep 21, 2020
  44. IARI

    IARI

    Joined:
    May 8, 2014
    Posts:
    70
    @Suwas93 what about the issue that @waldgeist pointed out - does that just not happen for you, or is it not a requirement?


    I also need to know about the face tracking capability in advance, because I have to show a popup on application start in case it is not supported.
     
  45. IARI

    IARI

    Joined:
    May 8, 2014
    Posts:
    70
    Our previous solution simply checked the device generation against a manual list in code.
    Do not use this code; it isn't a sufficient solution for this problem, and I'm posting it for demonstration only.
    Code (CSharp):
    // Device and DeviceGeneration come from UnityEngine.iOS.
    return (
        Device.generation == DeviceGeneration.iPhoneX
        || Device.generation == DeviceGeneration.iPhoneXS
        || Device.generation == DeviceGeneration.iPhoneXSMax
        || Device.generation == DeviceGeneration.iPhoneXR
        || Device.generation == DeviceGeneration.iPhone11
        || Device.generation == DeviceGeneration.iPhone11Pro
        || Device.generation == DeviceGeneration.iPhone11ProMax
        || Device.generation == DeviceGeneration.iPadPro11Inch
        || Device.generation == DeviceGeneration.iPadPro3Gen
    );
    This is clearly not robust at all, since it requires us to maintain the list manually - but even if we did that, Unity does not know about the newest devices after release. For example, a 4th-gen iPad Pro does not exist in Unity's enum and is instead reported as iPadUnknown.
     
  46. Suwas93

    Suwas93

    Joined:
    Feb 8, 2018
    Posts:
    62
    What exactly do you mean by false positives? I'm waiting for the session to be >= Ready before I check this, and my face tracking manager check returns false.
     
  47. IARI

    IARI

    Joined:
    May 8, 2014
    Posts:
    70
    Sorry - this might have been confusing: I was only referring to the solution using SubsystemManager.GetSubsystemDescriptors.

    By false positive I mean that the function is supposed to return false but instead returns true - in the case of an iPhone 7 Plus, I would call the result a false positive.

    With the solution that works for you, you wait for the session to be ready - but @waldgeist earlier wrote this:

    because it forces my camera to the front.

    I need to detect the Face Tracking capability before actually activating it, to show or hide a UI button accordingly.

    The way I read this, it will activate the camera - and possibly it will even require the camera permission to work at all?
    If I understood that correctly, this will - just as waldgeist pointed out for his case - not be a solution in our case.

    (Sorry for all the "if"s - unfortunately, right now I cannot test anything: I don't have access to any Apple device anymore, because all my Apple test devices were stolen in a burglary this weekend :( )
     
  48. makaka-org

    makaka-org

    Joined:
    Dec 1, 2013
    Posts:
    1,026
    It also doesn't work when the ARFaceManager is disabled on start and is still disabled at the time this code runs.
     
  49. mfuad

    mfuad

    Unity Technologies

    Joined:
    Jun 12, 2018
    Posts:
    335
    Hey everyone, going to close this now outdated thread. If you're still experiencing issues or have questions, feel free to start a new thread and we'll take a look. Thanks!
     
    makaka-org likes this.