
Bug: Face Expression Action for eye open/close not working

Discussion in 'Unity MARS' started by styGGx, Jul 21, 2021.

  1. styGGx

    styGGx

    Joined:
    Aug 27, 2017
    Posts:
    19
    Hi there

As the title explains, I am unable to get either of the eyes' Close or Open events to trigger when running on an Android device.

    It works fine when simulated in the editor's Device View, but not after building to the device.

The smile engaged/disengaged Expression Actions do work, however.

    My project can be located here:
    https://github.com/Petrie65/unity-mars-build-fail
It is an empty project with MARS that swaps out game objects on the eyes/mouth Landmarks based on Face Expression Actions.

    Any advice would be appreciated. Thanks
     
  2. jmunozarUTech

    jmunozarUTech

    Unity Technologies

    Joined:
    Jun 1, 2020
    Posts:
    297
Hello @Stygian65,

Sometimes face expression actions will need some tweaking to make them work. You can do that by modifying settings in the ARCoreFacialExpressionSettings asset.

To do that, you will need to move that package into your project's "Packages" folder, so try this:
1. Navigate to "Packages/Unity MARS AR Foundation Providers" in the Project window
2. In the Project window, right click -> "Show in Explorer" to bring up the location of the package
3. Move the package contents ("com.unity.mars-ar-foundation-providers...") to the "Packages" folder of the project
4. Return to the Unity editor
5. Navigate to the "ARCoreFacialExpressionSettings" asset and modify the expression parameters
After doing this, the asset should be editable for you to modify.
[Attached screenshot: Screen Shot 2021-07-22 at 10.27.51 AM.png]
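For reference, a minimal editor-only sketch along these lines (the class and menu names are made up; only the asset name comes from this thread) can locate the ARCoreFacialExpressionSettings asset and print its serialized fields, which is a quick way to confirm the asset is actually reachable and editable after moving the package:

Code (CSharp):
#if UNITY_EDITOR
using UnityEditor;
using UnityEngine;

// Hypothetical helper: finds the ARCoreFacialExpressionSettings asset by name and logs its
// serialized properties so you can see which expression parameters are available to tweak.
public static class FacialExpressionSettingsInspector
{
    [MenuItem("Tools/MARS Debug/Log ARCore Facial Expression Settings")]
    public static void LogSettings()
    {
        // Search by asset name rather than type so this works without referencing MARS internals.
        var guids = AssetDatabase.FindAssets("ARCoreFacialExpressionSettings");
        if (guids.Length == 0)
        {
            Debug.LogWarning("ARCoreFacialExpressionSettings not found; is the package embedded?");
            return;
        }

        var path = AssetDatabase.GUIDToAssetPath(guids[0]);
        var asset = AssetDatabase.LoadMainAssetAtPath(path);
        Debug.Log($"Found settings asset at {path}");

        // Walk the visible serialized properties instead of assuming specific field names.
        var property = new SerializedObject(asset).GetIterator();
        while (property.NextVisible(true))
            Debug.Log($"{property.propertyPath} ({property.propertyType})");
    }
}
#endif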
     
    mtschoen likes this.
  3. styGGx

    styGGx

    Joined:
    Aug 27, 2017
    Posts:
    19
    Hi @jmunozarUTech

Thanks for getting back to me. I have done as you suggested. I also had to remove mars-ar-foundation-providers from the package manager as there were now duplicate scripts. Unfortunately, after building with these changes, it seems that MARS is not working at all on the device.

I made a script that listens to MARSFaceManager FaceUpdated events to see what values MARS has for the facial expressions. I can manipulate all the values with my face on-device, except for the eyes - they are frozen at 0. But when I run it in the Editor, it clearly detects the values.
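For anyone reproducing this, a small debug component in the spirit of the code azevedco shares later in this thread can dump every expression value MARS reports to the device log. The class name here is made up and the namespaces are assumptions for MARS 1.x, so adjust them to your installed versions:

Code (CSharp):
using Unity.MARS.Data;              // IMRFace, MRFaceExpression (assumed MARS 1.x namespaces;
using Unity.MARS.Providers;         // adjust if your version differs)
using Unity.XRTools.ModuleLoader;   // IFunctionalitySubscriber<>
using UnityEngine;

// Hypothetical debug component: logs every facial expression coefficient MARS reports,
// so values that stay frozen (like the eye-close expressions on Android here) show up in logcat.
public class FaceExpressionLogger : MonoBehaviour, IUsesFaceTracking
{
    IProvidesFaceTracking IFunctionalitySubscriber<IProvidesFaceTracking>.provider { get; set; }

    void OnEnable()  { this.SubscribeFaceUpdated(OnFaceUpdated); }
    void OnDisable() { this.UnsubscribeFaceUpdated(OnFaceUpdated); }

    void OnFaceUpdated(IMRFace face)
    {
        // Expressions maps each MRFaceExpression to its current coefficient.
        foreach (var expression in face.Expressions)
            Debug.Log($"{expression.Key}: {expression.Value:0.00}");
    }
}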
     
    Last edited: Jan 8, 2023
  4. jmunozarUTech

    jmunozarUTech

    Unity Technologies

    Joined:
    Jun 1, 2020
    Posts:
    297
    Hello @Stygian65,

If you moved (not copied) the package, this shouldn't affect any behavior in the project at all; you shouldn't have duplicated folders. Can you try embedding the package with this then?
https://github.com/liortal53/upm-embed

This will embed the package in your project, so you can change the asset I mentioned.
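For what it's worth, Unity's Package Manager scripting API also exposes Client.Embed, which copies an installed package into the project's Packages folder. A minimal editor sketch (the menu path and class name are made up; the package name is the one from this thread) might look like this:

Code (CSharp):
#if UNITY_EDITOR
using UnityEditor;
using UnityEditor.PackageManager;
using UnityEditor.PackageManager.Requests;

// Hypothetical one-off helper: embeds the MARS AR Foundation Providers package so assets
// such as ARCoreFacialExpressionSettings become editable inside the project.
public static class EmbedMarsProviders
{
    static EmbedRequest s_Request;

    [MenuItem("Tools/MARS Debug/Embed MARS AR Foundation Providers")]
    public static void Embed()
    {
        s_Request = Client.Embed("com.unity.mars-ar-foundation-providers");
        EditorApplication.update += Progress;
    }

    static void Progress()
    {
        if (!s_Request.IsCompleted)
            return;

        EditorApplication.update -= Progress;
        if (s_Request.Status == StatusCode.Success)
            UnityEngine.Debug.Log($"Embedded {s_Request.Result.name} {s_Request.Result.version}");
        else
            UnityEngine.Debug.LogError(s_Request.Error.message);
    }
}
#endif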
     
  5. styGGx

    styGGx

    Joined:
    Aug 27, 2017
    Posts:
    19
Ok, thank you. I was able to move the package so that I could modify the ARCoreFacialExpressionSettings and successfully build to the device.

I have tried a bunch of different configurations, but I am still unable to get LeftEyeClose or RightEyeClose to reflect anything other than 0.

Something else worth mentioning: in order to get the Android build to succeed, I had to remove the Unity Content Manager and Unity MARS Companion Core preview packages from the project. I don't think this should affect the core functionality though (for reference: https://forum.unity.com/threads/il2...tandard-version-2-0-0-0.1041868/#post-7331191)
     
  6. styGGx

    styGGx

    Joined:
    Aug 27, 2017
    Posts:
    19
    Hi @jmunozarUTech

I've been trying to get this working, but unfortunately have made no progress. Do you perhaps have some advice on how I could further debug this?

I am really baffled by what could cause this, especially since the rest of the expressions are working. I'd imagine the method used for detecting eyes closed/open would rely on the same tech stack that allows recognition of the rest of the expressions.

    Do you maybe know if other MARS users have successfully built this to an Android device? I would think so because it seems like a pretty common use case.

On a related note, I also tried deploying to an iPhone 6S Plus and could only see the rear-facing camera with no face-tracking functionality. I believe that is because it doesn't have a forward-facing depth camera, so it would only work on an iPhone X or newer. Is that right?
     
  7. jmunozarUTech

    jmunozarUTech

    Unity Technologies

    Joined:
    Jun 1, 2020
    Posts:
    297
    Hello @Stygian65,

We are looking into this; it seems this is not a configuration-related issue.

On the other hand, with iOS face tracking it should just work; it does rely on the depth camera for face tracking, but it will not work with the rear camera, so you will need the front camera to use face tracking.
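A hedged way to check up front whether a device can do face tracking at all (relevant to the iPhone 6S Plus case above) is to ask AR Foundation for a registered face subsystem descriptor; providers generally only register one when the feature is usable on that device, though that behaviour can vary by package version:

Code (CSharp):
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARSubsystems;

// Hypothetical check: logs whether any XR face subsystem is registered on this device.
// On iOS the ARKit face provider needs the front (TrueDepth) camera, so on hardware
// without one no face subsystem is expected to show up.
public class FaceTrackingSupportCheck : MonoBehaviour
{
    void Start()
    {
        var descriptors = new List<XRFaceSubsystemDescriptor>();
        SubsystemManager.GetSubsystemDescriptors(descriptors);

        if (descriptors.Count == 0)
        {
            Debug.LogWarning("No face subsystem registered - face tracking is unavailable here.");
            return;
        }

        foreach (var descriptor in descriptors)
            Debug.Log($"Face subsystem available: {descriptor.id}");
    }
}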
     
    Last edited: Jul 28, 2021
  8. styGGx

    styGGx

    Joined:
    Aug 27, 2017
    Posts:
    19
    Thank you for looking into it.

Today we deployed to an iPhone X and it worked, including the eye expressions. The same build is still not working for the iPhone 6S Plus though; I suppose some older-generation devices might not be able to run it.

We also tried two different Android platforms, so in total we have tried devices from Huawei, Samsung, and Xiaomi, all with the same issue.

    For more context, the relevant versions we are using are the following:
• Unity 2020.3.6f1 (LTS)
    • AR Foundation, AR Subsystems, ARCore, ARKit, AR Face Tracking - 4.1.7
• MARS, MARS AR Foundation Providers, MARS Nav Mesh - 1.3.1
     
  9. amydigiov_Unity

    amydigiov_Unity

    Unity Technologies

    Joined:
    May 24, 2016
    Posts:
    16
    Hi @Stygian65, thank you for bringing this issue to our attention. We have looked into it and confirmed that the eye close expressions are not supported on Android. We are working on both fixing this issue and exposing the expression settings in our 1.4 release.
     
    styGGx likes this.
  10. styGGx

    styGGx

    Joined:
    Aug 27, 2017
    Posts:
    19
    Thanks @amydigiov

    I'll be looking forward to the 1.4 release.
     
  11. TaylorTangXR

    TaylorTangXR

    Joined:
    Dec 29, 2020
    Posts:
    3
May I know the roadmap for when 1.4 will be released to fix the issue described above? Thank you!
     
  12. azevedco

    azevedco

    Joined:
    Mar 2, 2014
    Posts:
    34
    Hello,
I've updated to 1.4.1 and can confirm that files such as "ARCoreFacialExpressionSettings" are in a more accessible location to modify and that eyes-closed detection is "working" for Android.
It is finicky, however. If the user's device is tilted back or forward, it alters the values at which it thinks the eyes are closed. Currently looking into using the device orientation to offset this value.

If anyone who finds this thread is wondering how I'm finding the eyes' closed float values, here's some code to get you started:

Code (CSharp):
// using directives added for completeness; the MARS namespaces below are the typical ones
// in MARS 1.x - adjust them if your package versions differ.
using TMPro;
using Unity.MARS.Data;
using Unity.MARS.Providers;
using Unity.XRTools.ModuleLoader;
using UnityEngine;

public class FacialFeaturesReader : MonoBehaviour, IUsesFaceTracking, IUsesCameraOffset
{
    [SerializeField] private TextMeshProUGUI dbgEyesText = null;

    IProvidesFaceTracking IFunctionalitySubscriber<IProvidesFaceTracking>.provider { get; set; }
    IProvidesCameraOffset IFunctionalitySubscriber<IProvidesCameraOffset>.provider { get; set; }

    private void OnEnable()
    {
        // Hook into MARS face tracking through the functionality injection extension methods.
        this.SubscribeFaceAdded(OnFaceAdded);
        this.SubscribeFaceUpdated(OnFaceUpdated);
        this.SubscribeFaceRemoved(OnFaceRemoved);
    }

    private void OnDisable()
    {
        this.UnsubscribeFaceAdded(OnFaceAdded);
        this.UnsubscribeFaceUpdated(OnFaceUpdated);
        this.UnsubscribeFaceRemoved(OnFaceRemoved);
    }

    private void OnFaceAdded(IMRFace face)
    {
        Debug.Log(face.Expressions.Count + " expressions found for facial tracker");
    }

    private void OnFaceUpdated(IMRFace face)
    {
        // Show the current eye-close coefficients on a debug label whenever the face updates.
        if (dbgEyesText != null)
        {
            dbgEyesText.text = $"Left Eye: {face.Expressions[MRFaceExpression.LeftEyeClose]}" +
                $"\n\nRight Eye: {face.Expressions[MRFaceExpression.RightEyeClose]}";
        }
    }

    private void OnFaceRemoved(IMRFace face)
    {
    }
}
     
    styGGx likes this.