Does AR Foundation support ARKit 3 Simultaneous Front and Back Camera?

Discussion in 'Handheld AR' started by HVRAT, Jul 16, 2019.

  1. HVRAT

    HVRAT

    Joined:
    Aug 28, 2018
    Posts:
    6
    Hi, I wonder: does the latest AR Foundation support the ARKit 3 simultaneous front and back camera feature?

    I checked the RearCameraWithFrontCameraFaceMesh scene from the AR Foundation samples, but it doesn't demonstrate how to use the front camera for face tracking while at the same time using the device's full orientation and position information.

    Does Unity plan to create a demo like that?

    Also, I am curious whether the user can get the human depth and stencil textures from the front camera, just like the human occlusion demo, but with depth from the front camera instead of the back camera. I tried attaching both ARHumanBodyManager and ARFaceManager to the ARSessionOrigin, and it seems that as long as ARHumanBodyManager is turned on, it uses the back camera, not the front.
     
  2. sun_dony

    sun_dony

    Joined:
    Oct 31, 2017
    Posts:
    1
  3. edwon

    edwon

    Joined:
    Apr 24, 2011
    Posts:
    180
    I'm having the same issue. The RearCameraWithFrontCameraFaceMesh sample isn't doing exactly what I want; it seems to just switch back and forth between the front and rear cameras, and I can't tell what it's tracking. What I want to know is how to get the blend shape data from the face (in the form of variables) while doing rear-camera AR with plane tracking, etc.
     
  4. edwon

    edwon

    Joined:
    Apr 24, 2011
    Posts:
    180


    I built a little scene to test the issue. When you turn on the face manager, it automatically switches to the front camera, which is NOT what I want to happen. Did anyone at Unity actually test this? ARKit 3.0 is supposed to be able to do this.

    Stats:
    Unity 2019.1.9f1
    ARFoundation 2.1.1
    ARKit Face Tracking 1.0.1
    ARKit XR Plugin 2.1.1
    Xcode 11.0 beta 4
    iOS 13 beta 4
     
  5. 3d_Artist1987

    3d_Artist1987

    Joined:
    Jul 9, 2012
    Posts:
    716
    Any update?
     
  6. edwon

    edwon

    Joined:
    Apr 24, 2011
    Posts:
    180
    dear lord someone help us
     
  7. davidmo_unity

    davidmo_unity

    Unity Technologies

    Joined:
    Jun 18, 2019
    Posts:
    19
    To use ARKit 3.0 Features you need to upgrade to ARFoundation 2.2 and upgrade your ARKit packages as well.

    Keep in mind that since your post, Xcode and iOS have had beta updates to beta 5.

    EDIT:

    I neglected to add some of the documentation about ARKit 3 features only being available on certain versions which can be found in the readme here: https://github.com/Unity-Technologies/arfoundation-samples

    Also, I was incorrect when I stated that the latest iOS update is beta 5; it's beta 6 now. Xcode has remained the same, however.

    I also neglected to mention that to use this feature you need an iPhone with an A12/A12X Bionic chip, the Apple Neural Engine (ANE), and a TrueDepth camera. Currently, these are the iPhone XS, iPhone XS Max, and iPhone XR.
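    Concretely, the upgrade can be done by editing Packages/manifest.json (or via the Package Manager window with preview packages shown). A sketch of the relevant dependencies, using the preview version strings mentioned in this thread; the exact versions will change as the previews update:

```json
{
  "dependencies": {
    "com.unity.xr.arfoundation": "2.2.0-preview.3",
    "com.unity.xr.arkit": "2.2.0-preview.4",
    "com.unity.xr.arkit-face-tracking": "1.1.0-preview.4"
  }
}
```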
     
    Last edited: Aug 9, 2019
  8. 3d_Artist1987

    3d_Artist1987

    Joined:
    Jul 9, 2012
    Posts:
    716
    Does ARKit 3 support the 2018 11-inch iPad Pro?

    When we use both cameras in our app, the front camera works fine with blend shapes (morphing), but with the back camera, if we enable the AR Face script on the 3D model, it does not augment. And if we disable the AR Face script, it augments OK, but the blend shapes do not work properly.

    I tried the RearCameraWithFrontCameraFaceMesh sample, but it's not working.
    Is there a sample that covers this?

    Unity 2019.2
    Xcode 11.0 beta 5
    ARFoundation 2.2 preview.3
    ARKit Face Tracking 1.1.0 preview.4
    ARKit XR Plugin 2.2.0 preview.4
     
  9. davidmo_unity

    davidmo_unity

    Unity Technologies

    Joined:
    Jun 18, 2019
    Posts:
    19
    The back-facing camera can't augment faces, as augmenting faces requires a TrueDepth camera on the front of the device. I believe the 11-inch iPad Pro has an A12 chip and a TrueDepth front-facing camera, which supports face tracking, but the standard 2018 iPad does not, as it has an A10 chip and no TrueDepth camera.

    https://developer.apple.com/documentation/arkit/arfacetrackingconfiguration?language=objc

    "Face tracking is available only on iOS devices with a front-facing TrueDepth camera (see iOS Device Compatibility Reference). Use the ARFaceTrackingConfiguration isSupported property to determine whether face tracking is available on the current device before offering the user any features that require face tracking."
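    On the Unity side, a rough equivalent of Apple's isSupported check is to query the registered face subsystem descriptors. This is a sketch (the MonoBehaviour name is hypothetical); if no descriptor is registered, the device/runtime provides no face tracking at all:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARSubsystems;

public class FaceTrackingSupportCheck : MonoBehaviour
{
    void Start()
    {
        // Query the registered face subsystem descriptors.
        var descriptors = new List<XRFaceSubsystemDescriptor>();
        SubsystemManager.GetSubsystemDescriptors(descriptors);

        if (descriptors.Count == 0)
            Debug.Log("Face tracking is not available on this device.");
        else
            Debug.Log("Face tracking subsystem found: " + descriptors[0].id);
    }
}
```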
     
  10. 3d_Artist1987

    3d_Artist1987

    Joined:
    Jul 9, 2012
    Posts:
    716
  11. edwon

    edwon

    Joined:
    Apr 24, 2011
    Posts:
    180
    Just tested this with all the proper packages/versions (as listed above) and it's still not working! It still switches to rendering the face camera automatically as soon as ARFaceManager is enabled.
     
  12. edwon

    edwon

    Joined:
    Apr 24, 2011
    Posts:
    180
    This is not a version issue, and I'm testing on an iPhone XR. All of the versions I was using before were supposed to support ARKit 3.0 as well; it said as much in the documentation.

    The issue is simple: the rendered camera switches to the selfie camera when you turn on the face manager. This should not happen with ARKit 3.0; the developer should be in control of which camera is shown.

    The code making this happen is somewhere in the bowels of the ARFoundation/ARKit/ARFaceTracking plugins. An engineer at Unity needs to fix this.
     
  13. 3d_Artist1987

    3d_Artist1987

    Joined:
    Jul 9, 2012
    Posts:
    716
    Have you checked the RearCameraWithFrontCameraFaceMesh sample scene provided in the Unity samples?
     
  14. davidmo_unity

    davidmo_unity

    Unity Technologies

    Joined:
    Jun 18, 2019
    Posts:
    19
    My apologies. I didn't mean to imply that you weren't using the correct device when you clearly stated you were. I was merely adding that as additional information for everyone.

    I also realized what you are saying and I incorrectly addressed it. Maybe I can clear up some confusion by explaining how the activation of two cameras works for ARKit 3. There are 2 configurations we are interested in that govern which camera is being used:

    1. ARWorldTrackingConfiguration - Backfacing Camera
    https://developer.apple.com/documentation/arkit/arworldtrackingconfiguration?language=objc
    2. ARFaceTrackingConfiguration - Frontfacing Camera
    https://developer.apple.com/documentation/arkit/arfacetrackingconfiguration?language=objc

    You would normally select one to use at a given time, as per the instructions found here: https://developer.apple.com/documentation/arkit/choosing_which_camera_feed_to_augment?language=objc. However, in the ARWorldTrackingConfiguration you will notice that there is a flag for face tracking that is marked as beta. This flag is what turns on the front-facing camera while tracking the world.

    ARFoundation ties these configurations to our particular managers (or rather, it ties the ARFaceTrackingConfiguration to the ARFaceManager), and by activating the face manager you are swapping configurations, which is why you see the camera change. The samples show that faces are being tracked when using the plane manager, and when you swap, it enables the face manager to take over and show the front-facing camera. Again, I am sorry I misunderstood the issue you were having.
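    A minimal sketch of that setup (assuming ARFoundation 2.2 with the ARKit XR Plugin 2.2 packages, and manager references wired in the Inspector; the class name here is hypothetical):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Sketch: world tracking (rear camera) plus face tracking at once.
// The ARPlaneManager keeps the world-tracking configuration active,
// while the ARFaceManager requests face data; on ARKit 3 this maps to
// the world-tracking configuration's beta face-tracking flag.
public class DualCameraSetup : MonoBehaviour
{
    [SerializeField] ARPlaneManager planeManager;
    [SerializeField] ARFaceManager faceManager;

    void Start()
    {
        // Enable both; do not stop the plane manager's subsystem, or the
        // session may fall back to the face configuration (front camera).
        planeManager.enabled = true;
        faceManager.enabled = true;
    }
}
```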

    You did mention you were using ARFoundation 2.1, which does not support the ARKit 3 changes; you need to update to 2.2-preview to use these new features. From our samples readme.md, which can be found here: https://github.com/Unity-Technologies/arfoundation-samples

    "ARKit 3 Support
    TL;DR: If you want to checkout the latest and greatest features in ARKit 3, use this master branch, Xcode 11 beta 5, and a device running iOS 13 beta. Otherwise, see the 2.1 branch, which only lacks support for the new ARKit 3 features.

    The master branch includes support for ARKit 3, which is still in beta and requires Xcode 11 beta 5 and iOS 13 beta 5.

    The 2.1 branch is compatible with Xcode 9 and 10 and only lacks the new ARKit 3 features.

    ARFoundation 2.2 provides interfaces for ARKit 3 features, but only Unity's ARKit XR Plugin 2.2 package contains support for these features and requires Xcode 11 beta 5 and iOS 13 beta 5. Unity's ARKit XR Plugin 2.2 is not backwards compatible with previous versions of Xcode or iOS. Unity's ARKit XR Plugin 2.1 will work with the latest ARFoundation (it just doesn't implement the ARKit 3 features).

    While Xcode 11 & iOS 13 are in beta, we will continue to maintain both the 2.2 and 2.1 versions of the packages.

    The same is also true for Unity's ARKit Face Tracking package 1.1: it requires Xcode 11 beta 5 and iOS 13 beta 5."


    So yes, your device has the chip it needs to support these new features.
     
    3d_Artist1987 likes this.
  15. edwon

    edwon

    Joined:
    Apr 24, 2011
    Posts:
    180
    Yes I can see that the sample can show how many faces are being tracked while using the plane manager and showing the rear camera.

    However, there is no clear example of how to get blend shape data from a face while using the plane manager and showing the rear camera. As far as I can tell, I need to use the Face Manager to spawn a face prefab to get access to the blend shape data. Is there a way I can get access to blend shape data without the Face Manager?
     
  16. 3d_Artist1987

    3d_Artist1987

    Joined:
    Jul 9, 2012
    Posts:
    716
  17. davidmo_unity

    davidmo_unity

    Unity Technologies

    Joined:
    Jun 18, 2019
    Posts:
    19
    To start, I was mistaken. The overriding manager in this situation is the PlaneManager, which overrides a face-tracking configuration with a world-tracking configuration. As long as you have ARFoundation 2.2, you can have both managers active and utilize the rear-facing camera while tracking faces.

    Okay, so to acquire blend shape information, you need to get the subsystem and cast it to an ARKitFaceTrackingSubsystem:
    var arkitFaceTrackingSubsystem = (ARKitFaceTrackingSubsystem)faceManager.subsystem;
    and then use the GetBlendShapeCoefficients() function on that subsystem. The sample showing this can be found here: https://github.com/Unity-Technologi...r/Assets/Scripts/ARKitBlendShapeVisualizer.cs

    Getting the subsystem and casting:
    https://github.com/Unity-Technologi...cripts/ARKitBlendShapeVisualizer.cs#L145-L151

    Acquiring Blend Shape information:
    https://github.com/Unity-Technologi...cripts/ARKitBlendShapeVisualizer.cs#L181-L196
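    Putting those pieces together, a minimal sketch based on the ARKitBlendShapeVisualizer sample linked above (assuming ARFoundation 2.2 with the ARKit Face Tracking package; the subsystem class is named ARKitFaceSubsystem in some package versions, and the reader class name here is hypothetical):

```csharp
using Unity.Collections;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARKit;

// Sketch: read ARKit blend shape coefficients for every tracked face.
public class BlendShapeReader : MonoBehaviour
{
    [SerializeField] ARFaceManager faceManager;

    void Update()
    {
        var arkitSubsystem = faceManager.subsystem as ARKitFaceSubsystem;
        if (arkitSubsystem == null)
            return;

        foreach (var face in faceManager.trackables)
        {
            // Coefficients come back as a NativeArray; dispose via using.
            using (var coefficients = arkitSubsystem.GetBlendShapeCoefficients(
                       face.trackableId, Allocator.Temp))
            {
                foreach (var c in coefficients)
                {
                    // c.blendShapeLocation is the ARKit location enum,
                    // c.coefficient is a float in [0, 1].
                    Debug.Log($"{c.blendShapeLocation}: {c.coefficient}");
                }
            }
        }
    }
}
```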
     
  18. 3d_Artist1987

    3d_Artist1987

    Joined:
    Jul 9, 2012
    Posts:
    716
    I totally agree with @edwon. Please provide a sample to avoid these issues.
     
  19. edwon

    edwon

    Joined:
    Apr 24, 2011
    Posts:
    180
    I already found the example you are referencing, but it requires the face manager to be enabled and a spawned ARFace prefab in the scene. However, turning the face manager on switches to the front-facing camera, as I said before! Are you actually testing this stuff out?

    You should really have a specific sample in the ARFoundation-Samples git repo that shows blend shapes being applied to a character that is tracked and shown in the rear camera.

    P.S. It's been more than 2 weeks since I posted in this thread and it's still unresolved.
     
    Last edited: Aug 14, 2019
  20. edwon

    edwon

    Joined:
    Apr 24, 2011
    Posts:
    180
    After many hours of debugging, I found out that this line causes the camera to switch to the front camera if ARFaceManager is on:

    arPlaneManager.subsystem.Stop();

    This reinforces my point that the developer should have explicit control of which camera is shown at any time. Debugging this was hell.
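    One workaround sketch, under the assumption that the subsystem was only being stopped in order to hide plane visuals: leave the subsystem running and toggle the visuals instead (the class name is hypothetical):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Sketch: avoid calling planeManager.subsystem.Stop(), which can rebuild
// the session configuration and flip the camera to the front. Instead,
// keep the subsystem running and just show/hide the plane GameObjects.
public class PlaneVisibilityToggle : MonoBehaviour
{
    [SerializeField] ARPlaneManager planeManager;

    public void SetPlanesVisible(bool visible)
    {
        foreach (var plane in planeManager.trackables)
            plane.gameObject.SetActive(visible);
    }
}
```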
     
    Blarp likes this.
  21. 3d_Artist1987

    3d_Artist1987

    Joined:
    Jul 9, 2012
    Posts:
    716
    Any update?
     
  22. adev39996

    adev39996

    Joined:
    Sep 2, 2019
    Posts:
    3
    Same issue.
     
  23. Blarp

    Blarp

    Joined:
    May 13, 2014
    Posts:
    193
    Last edited: Sep 12, 2019
  24. edwon

    edwon

    Joined:
    Apr 24, 2011
    Posts:
    180
    What issue are you having exactly? I was actually able to get the front and back cameras working simultaneously by following the Rear Camera with Front Facing Camera example scene in https://github.com/Unity-Technologies/arfoundation-samples

    My only issue was that it kept switching to render the front camera (even though it was still tracking both), but that was actually only happening because I was stopping/starting the plane manager for other reasons, which was resetting the AR session and causing the camera to flip.

    I was able to fix my specific issue this way, but it's still a major problem that AR Foundation doesn't provide an explicit way to switch between the cameras. It should not be automated!
     
    Blarp likes this.
  25. davidmo_unity

    davidmo_unity

    Unity Technologies

    Joined:
    Jun 18, 2019
    Posts:
    19
    We are currently working on a way to set up configurations (with explicit camera controls when deploying) that will not only allow for more explicit feature control but will also alert you when a configuration is not viable on the current platform (such as trying to show the front-facing/selfie camera when using face tracking and plane tracking). I do not have a timeline on it quite yet, but we are working on it.
     
    3d_Artist1987 and Blarp like this.
  26. adev39996

    adev39996

    Joined:
    Sep 2, 2019
    Posts:
    3
    @edwon
    Plane anchors with face blend shapes are not working with the back camera. Can you share your working sample?
    It would be very helpful to me in solving the above issue.
     
  27. edwon

    edwon

    Joined:
    Apr 24, 2011
    Posts:
    180
    For me, the example scene in the AR Foundation Samples project on GitHub worked from the beginning. It just wasn't working in my project for the reasons mentioned above. Remember, you need an iPhone or iPad that supports ARKit 3.0, it must have the iOS 13 beta on it, and you need to use the latest Xcode beta when building.
     
  28. edwon

    edwon

    Joined:
    Apr 24, 2011
    Posts:
    180
    Great, explicit camera control is much needed; it should not automatically turn on and off based on which subsystems you're using.

    Also, a better way to check for feature compatibility would be great. Right now you have to explicitly check each device via Device.iPhoneX || Device.iPhoneXR, etc., which is a major pain. It would be great to have something like a Device.iOS.ARKit.3.0Compatible flag or bool to check against.
     
    raphaelnew likes this.