Resolved Does AR Foundation support ARKit 3 Simultaneous Front and Back Camera?

Discussion in 'AR' started by HypeVR, Jul 16, 2019.

Thread Status:
Not open for further replies.
  1. HypeVR

    HypeVR

    Joined:
    Aug 28, 2018
    Posts:
    11
    Hi, I wonder whether the latest AR Foundation supports the ARKit 3 simultaneous front and back camera feature?

    I checked the RearCameraWithFrontCameraFaceMesh scene from the AR Foundation samples, but it doesn't demonstrate how to use the front camera for face tracking while at the same time using the device's full orientation and position information.

    Does Unity plan to create a demo like that?

    Also, I am curious whether the user can get the human depth and stencil textures from the front camera, just like the human occlusion demo, but with depth from the front camera instead of the back camera. I tried attaching both ARHumanBodyManager and ARFaceManager to the ARSessionOrigin, and it seems that as long as ARHumanBodyManager is turned on, it uses the back camera, not the front.
     
  2. sun_dony

    sun_dony

    Joined:
    Oct 31, 2017
    Posts:
    1
  3. edwon

    edwon

    Joined:
    Apr 24, 2011
    Posts:
    266
    I"m having the same issue, the RearCameraWithFrontCameraFaceMesh sample isn't doing exactly what I want, and seems to just switch back and forth between front and rear camera. I can't tell what it's tracking. What I want to know how to do is get the blend shape data from the face (in the form of variables) while doing rear camera AR with plane tracking etc...
     
  4. edwon

    edwon

    Joined:
    Apr 24, 2011
    Posts:
    266


    Built a little scene to test the issue. When you turn on the face manager, it automatically switches to the front camera, which is NOT what I want to happen. Did anyone at Unity actually test this? ARKit 3.0 is supposed to be able to do this.

    Stats:
    Unity 2019.1.9f1
    ARFoundation 2.1.1
    ARKit Face Tracking 1.0.1
    ARKit XR Plugin 2.1.1
    Xcode 11.0 beta 4
    iOS 13 beta 4
     
    mohammedalanwar likes this.
  5. TechnicalArtist

    TechnicalArtist

    Joined:
    Jul 9, 2012
    Posts:
    736
    Any update?
     
  6. edwon

    edwon

    Joined:
    Apr 24, 2011
    Posts:
    266
    dear lord someone help us
     
  7. davidmo_unity

    davidmo_unity

    Unity Technologies

    Joined:
    Jun 18, 2019
    Posts:
    99
    To use ARKit 3.0 features you need to upgrade to ARFoundation 2.2 and upgrade your ARKit packages as well.

    Keep in mind that since your post, Xcode and iOS have had beta updates to beta 5.

    EDIT:

    I neglected to add some of the documentation about ARKit 3 features only being available on certain versions, which can be found in the readme here: https://github.com/Unity-Technologies/arfoundation-samples

    Also, I was incorrect when I stated that the latest iOS update is beta 5; it's beta 6 now. Xcode has remained the same, however.

    I also neglected to mention that to use this feature you need an iPhone with an A12/A12X Bionic chip, the ANE (Apple Neural Engine), and a TrueDepth camera. These are currently the iPhone XS, iPhone XS Max, and iPhone XR.
     
    Last edited: Aug 9, 2019
  8. TechnicalArtist

    TechnicalArtist

    Joined:
    Jul 9, 2012
    Posts:
    736
    Does ARKit 3 support the 2018 11-inch iPad?

    When we use both cameras in our app, the front camera works fine with blend shapes (morphing), but with the back camera, if we enable the AR Face script on the 3D model, it does not augment. And if we disable the AR Face script, it augments OK, but the blend shapes do not work properly.

    I used the RearCameraWithFrontCameraFaceMesh sample, but it's not working.
    Is there a sample that covers this?

    Unity 2019.2
    Xcode 11.0 beta 5
    ARFoundation 2.2 preview.3
    ARKit Face Tracking 1.1.0 preview.4
    ARKit XR Plugin 2.2.0 preview.4
     
  9. davidmo_unity

    davidmo_unity

    Unity Technologies

    Joined:
    Jun 18, 2019
    Posts:
    99
    The back-facing camera can't augment faces, as face augmentation requires a TrueDepth camera on the front. I believe the 11-inch iPad Pro has an A12 chip and a TrueDepth front-facing camera, which supports face tracking, but the standard 2018 iPad does not, as it has an A10 chip and no TrueDepth camera.

    https://developer.apple.com/documentation/arkit/arfacetrackingconfiguration?language=objc

    "Face tracking is available only on iOS devices with a front-facing TrueDepth camera (see iOS Device Compatibility Reference). Use the ARFaceTrackingConfiguration isSupported property to determine whether face tracking is available on the current device before offering the user any features that require face tracking."
     
  10. TechnicalArtist

    TechnicalArtist

    Joined:
    Jul 9, 2012
    Posts:
    736
  11. edwon

    edwon

    Joined:
    Apr 24, 2011
    Posts:
    266
    Just tested this with all the proper packages/versions (as listed above) and it's still not working! It still switches to rendering the face camera automatically as soon as ARFaceManager is enabled.
     
  12. edwon

    edwon

    Joined:
    Apr 24, 2011
    Posts:
    266
    This is not a version issue, and I'm testing on an iPhone XR. All of the versions I was using before were supposed to support ARKit 3.0 as well; it said as much in the documentation.

    The issue is simple: the camera being rendered switches to the selfie cam when you turn on the face manager. This should not happen with ARKit 3.0; the dev should be in control of which camera is shown.

    The code making this happen is somewhere in the bowels of the ARFoundation/ARKit/ARKit Face Tracking plugins. An engineer at Unity needs to fix this.
     
  13. TechnicalArtist

    TechnicalArtist

    Joined:
    Jul 9, 2012
    Posts:
    736
    Have you checked the RearCameraWithFrontCameraFaceMesh sample scene provided in the Unity samples?
     
  14. davidmo_unity

    davidmo_unity

    Unity Technologies

    Joined:
    Jun 18, 2019
    Posts:
    99
    My apologies. I didn't mean to imply that you weren't using the correct device when you clearly stated you were. I was merely adding that as additional information for everyone.

    I also realize now what you are saying and that I addressed it incorrectly. Maybe I can clear up some confusion by explaining how the activation of two cameras works in ARKit 3. There are two configurations we are interested in that govern which camera is being used:

    1. ARWorldTrackingConfiguration - back-facing camera
    https://developer.apple.com/documentation/arkit/arworldtrackingconfiguration?language=objc
    2. ARFaceTrackingConfiguration - front-facing camera
    https://developer.apple.com/documentation/arkit/arfacetrackingconfiguration?language=objc

    You would normally select one to use at a given time, as per the instructions found here: https://developer.apple.com/documentation/arkit/choosing_which_camera_feed_to_augment?language=objc. However, in the ARWorldTrackingConfiguration you will notice there is a flag for face tracking that is marked as beta. This flag is what turns on the front-facing camera while tracking the world. ARFoundation ties these configurations to our particular managers (or rather, it ties the ARFaceTrackingConfiguration to the ARFaceManager), and by activating the face manager you are swapping configurations, which is why you see the camera change. The samples show that faces are being tracked while the plane manager is in use, and when you swap, the face manager takes over and shows the front-facing camera. Again, I am sorry I misunderstood the issue you were having.
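
    (For illustration, here is a minimal MonoBehaviour sketch of that idea, assuming ARFoundation 2.2 with the ARKit XR Plugin 2.2; the component and field names are hypothetical and not taken from the samples.)

    // Hedged sketch: with ARFoundation 2.2 + ARKit XR Plugin 2.2, keeping a world-tracking
    // manager (here ARPlaneManager) enabled alongside ARFaceManager is what keeps the
    // rear-camera (world-tracking) configuration active while faces are still tracked;
    // enabling only the face manager swaps to the face-tracking (front camera) configuration.
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;

    public class RearCameraFaceTrackingToggle : MonoBehaviour
    {
        [SerializeField] ARPlaneManager m_PlaneManager; // holds the world-tracking configuration
        [SerializeField] ARFaceManager m_FaceManager;   // adds face tracking on top of it

        void OnEnable()
        {
            // Both managers enabled together -> combined configuration, rear camera shown.
            m_PlaneManager.enabled = true;
            m_FaceManager.enabled = true;
        }
    }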

    You did mention you were using ARFoundation 2.1, which does not support the ARKit 3 changes; you need to update to 2.2-preview to use these new features. From our samples readme.md, which can be found here: https://github.com/Unity-Technologies/arfoundation-samples

    "ARKit 3 Support
    TL;DR: If you want to checkout the latest and greatest features in ARKit 3, use this master branch, Xcode 11 beta 5, and a device running iOS 13 beta. Otherwise, see the 2.1 branch, which only lacks support for the new ARKit 3 features.

    The master branch includes support for ARKit 3, which is still in beta and requires Xcode 11 beta 5 and iOS 13 beta 5.

    The 2.1 branch is compatible with Xcode 9 and 10 and only lacks the new ARKit 3 features.

    ARFoundation 2.2 provides interfaces for ARKit 3 features, but only Unity's ARKit XR Plugin 2.2 package contains support for these features and requires Xcode 11 beta 5 and iOS 13 beta 5. Unity's ARKit XR Plugin 2.2 is not backwards compatible with previous versions of Xcode or iOS. Unity's ARKit XR Plugin 2.1 will work with the latest ARFoundation (it just doesn't implement the ARKit 3 features).

    While Xcode 11 & iOS 13 are in beta, we will continue to maintain both the 2.2 and 2.1 versions of the packages.

    The same is also true for Unity's ARKit Face Tracking package 1.1: it requires Xcode 11 beta 5 and iOS 13 beta 5."


    Then yes, your device has the chip it needs to support these new features.
     
    TechnicalArtist likes this.
  15. edwon

    edwon

    Joined:
    Apr 24, 2011
    Posts:
    266
    Yes, I can see that the sample shows how many faces are being tracked while using the plane manager and showing the rear camera.

    However, there is no clear example of how to get blend shape data from a face while using the plane manager and showing the rear camera. As far as I can tell, I need to use the Face Manager to spawn a face prefab to get access to the blend shape data. Is there a way I can get access to blend shape data without the Face Manager?
     
  16. TechnicalArtist

    TechnicalArtist

    Joined:
    Jul 9, 2012
    Posts:
    736
  17. davidmo_unity

    davidmo_unity

    Unity Technologies

    Joined:
    Jun 18, 2019
    Posts:
    99
    To start, I was mistaken. The overriding manager in this situation is the PlaneManager, which overrides a FaceTrackingConfiguration with a WorldTrackingConfiguration. As long as you have ARFoundation 2.2, you can have both managers active and utilize the back-facing camera while tracking faces.

    Okay, so to acquire blend shape information, you need to get the subsystem and cast it to an ARKitFaceTrackingSubsystem:
    var arkitFaceTrackingSubsystem = (ARKitFaceTrackingSubsystem)faceManager.subsystem;
    and then use the GetBlendShapeCoefficients() function on that subsystem. The sample showing this can be found here: https://github.com/Unity-Technologi...r/Assets/Scripts/ARKitBlendShapeVisualizer.cs

    Getting the subsystem and casting:
    https://github.com/Unity-Technologi...cripts/ARKitBlendShapeVisualizer.cs#L145-L151

    Acquiring Blend Shape information:
    https://github.com/Unity-Technologi...cripts/ARKitBlendShapeVisualizer.cs#L181-L196
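
    (For completeness, here is a condensed, hedged sketch of what that sample does, assuming the ARKit Face Tracking package's ARKitFaceSubsystem type and its GetBlendShapeCoefficients(TrackableId, Allocator) method as used in ARKitBlendShapeVisualizer.cs; the exact subsystem type name can differ between package versions.)

    // Hedged sketch based on ARKitBlendShapeVisualizer.cs; type names depend on the
    // installed ARKit Face Tracking package version.
    #if UNITY_IOS && !UNITY_EDITOR
    using Unity.Collections;
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;
    using UnityEngine.XR.ARKit;

    [RequireComponent(typeof(ARFace))]
    public class BlendShapeReader : MonoBehaviour
    {
        ARFace m_Face;
        ARKitFaceSubsystem m_FaceSubsystem;

        void Awake()
        {
            m_Face = GetComponent<ARFace>();
            // The platform subsystem is owned by the ARFaceManager in the scene.
            var faceManager = FindObjectOfType<ARFaceManager>();
            m_FaceSubsystem = (ARKitFaceSubsystem)faceManager.subsystem;
        }

        void Update()
        {
            // Each entry pairs an ARKit blend shape location (e.g. JawOpen) with a 0..1 coefficient.
            using (var coefficients = m_FaceSubsystem.GetBlendShapeCoefficients(m_Face.trackableId, Allocator.Temp))
            {
                foreach (var c in coefficients)
                    Debug.Log($"{c.blendShapeLocation}: {c.coefficient}");
            }
        }
    }
    #endif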
     
  18. TechnicalArtist

    TechnicalArtist

    Joined:
    Jul 9, 2012
    Posts:
    736
    I totally agree with @edwon. Please provide a sample to avoid these issues.
     
  19. edwon

    edwon

    Joined:
    Apr 24, 2011
    Posts:
    266
    I already found the example you are referencing, but it requires the face manager to be enabled and a spawned ARFace prefab in the scene, and turning the face manager on switches the camera to the front-facing camera, as I said before! Are you actually testing this stuff out?

    You should really have a specific sample in the ARFoundation-Samples git repo that specifically shows blend shapes being applied to a character that is tracked and shown in the rear camera.

    P.S. It's been more than 2 weeks since I posted in this thread and it's still unresolved.
     
    Last edited: Aug 14, 2019
  20. edwon

    edwon

    Joined:
    Apr 24, 2011
    Posts:
    266
    After many hours of debugging, I found out that this line causes the camera to switch to the front camera if ARFaceManager is on:

    arPlaneManager.subsystem.Stop();

    This reinforces my point that the dev should have explicit control of which camera is shown at any time; debugging this was hell.
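
    (A hedged workaround sketch, with illustrative names: if you only need to hide the detected planes rather than stop plane detection, you could leave the plane subsystem running, so the world-tracking configuration stays active, and just toggle the plane visuals.)

    // Hedged workaround sketch: stopping the plane subsystem lets the session fall back to
    // the face-tracking configuration (front camera); keeping it running and hiding the
    // plane GameObjects avoids the camera flip.
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;

    public class PlaneVisibilityToggle : MonoBehaviour
    {
        [SerializeField] ARPlaneManager m_PlaneManager;

        public void SetPlanesVisible(bool visible)
        {
            foreach (var plane in m_PlaneManager.trackables)
                plane.gameObject.SetActive(visible);
        }
    }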
     
    Blarp likes this.
  21. TechnicalArtist

    TechnicalArtist

    Joined:
    Jul 9, 2012
    Posts:
    736
    Any update?
     
  22. adev39996

    adev39996

    Joined:
    Sep 2, 2019
    Posts:
    3
    Same issue.
     
  23. Blarp

    Blarp

    Joined:
    May 13, 2014
    Posts:
    269
    Last edited: Sep 12, 2019
  24. edwon

    edwon

    Joined:
    Apr 24, 2011
    Posts:
    266
    What issue are you having exactly? I was actually able to get the front and back cameras working simultaneously by following the Rear Camera with Front Facing Camera example scene in https://github.com/Unity-Technologies/arfoundation-samples

    My only issue was that it kept switching to render the front camera (even though it was still tracking both), but that was actually only happening because I was stopping/starting the plane manager for other reasons, which was resetting the AR session and causing the camera to flip.

    I was able to fix my specific issue this way, but it's still a major problem that AR Foundation doesn't provide an explicit way to switch between the cameras. It should not be automated!
     
    Blarp likes this.
  25. davidmo_unity

    davidmo_unity

    Unity Technologies

    Joined:
    Jun 18, 2019
    Posts:
    99
    We are currently working on a way to set up configurations (with explicit camera controls when deploying) that will not only allow for more explicit feature control but also alert you when a configuration is not viable on the current platform (such as trying to show the front-facing/selfie camera while using face tracking and plane tracking). I do not have a timeline on it quite yet, but we are working on it.
     
    TechnicalArtist and Blarp like this.
  26. adev39996

    adev39996

    Joined:
    Sep 2, 2019
    Posts:
    3
    @edwon
    Plane anchors with face blend shapes are not working on the back camera. Can you share your working sample?
    It would be very helpful to me in solving the above issue.
     
  27. edwon

    edwon

    Joined:
    Apr 24, 2011
    Posts:
    266
    For me, the example scene in the AR-Foundation-Samples project on GitHub worked from the beginning. It just wasn't working in my project for the reasons mentioned above. Remember, you need an iPhone or iPad that supports ARKit 3.0, it must have the iOS 13 beta on it, and you need to use the latest Xcode beta when building.
     
  28. edwon

    edwon

    Joined:
    Apr 24, 2011
    Posts:
    266
    Great, explicit camera control is much needed; it should not turn on and off automatically based on which subsystems you're using.

    Also, a better way to check for feature compatibility would be great. Right now you have to explicitly check each device via Device.iPhoneX || Device.iPhoneXR, etc., which is a major pain. It would be great to have something like a Device.iOS.ARKit.3.0Compatible flag or bool to check against.
     
    raphaelnew likes this.
  29. GuyNir

    GuyNir

    Joined:
    Nov 2, 2019
    Posts:
    1
    Hello,
    Having an easy way to select the front or back camera would be a great addition. I sure would like to be able to use different features regardless of whether the video source is coming from the front or back camera.

    Is there any ETA for these features to be available?
    Thank you,
    Guy
     
  30. davidmo_unity

    davidmo_unity

    Unity Technologies

    Joined:
    Jun 18, 2019
    Posts:
    99
    Unfortunately, I do not. It is still being worked on; we have had design reviews about it that went well, but it requires a non-trivial refactor of our original code that is still in progress.

    This is not going to be a "magic bullet" when it comes to allowing certain features to function on configurations that don't support them, but it will try to satisfy the largest set of requested features that the configuration can support and alert you to which ones it cannot. It won't open up the ability to do things regardless of camera, as a lot of functionality is tied to which camera you are using.
     
  31. LT23Live

    LT23Live

    Joined:
    Jul 8, 2014
    Posts:
    97
    Any Update on this?
     
    forman92 likes this.
  32. todds_unity

    todds_unity

    Joined:
    Aug 1, 2018
    Posts:
    324
    We are still working on this.
     
  33. tahafarooq

    tahafarooq

    Joined:
    Sep 18, 2015
    Posts:
    7
    Hi,

    As shown here, you can use the front camera (on specific devices) to detect faces as well as do world tracking simply by enabling the isWorldTrackingEnabled bool:
    https://developer.apple.com/documentation/arkit/arfacetrackingconfiguration

    Does this mean that we can also detect human bodies with the TrueDepth front camera?
    If yes, how can I achieve this while using Unity's ARKit plugin (part of AR Foundation)? I want to use the device's front camera to detect both the face and the human body (joints and all that).
     
  34. todds_unity

    todds_unity

    Joined:
    Aug 1, 2018
    Posts:
    324
    Body tracking on ARKit only works with the rear-facing camera.
     
    Mry_Huang and tahafarooq like this.
  35. tahafarooq

    tahafarooq

    Joined:
    Sep 18, 2015
    Posts:
    7
    Thank you for your prompt reply.

    I have one more question. When I connect my phone to a TV and position it so that the phone's back camera can see me, the TV screen shows everything mirrored horizontally. Is there an easy way to flip the frame received from the camera using ARKit/AR Foundation in Unity, so that I do not see myself mirrored on the TV screen?
     
  36. dakso

    dakso

    Joined:
    Jul 14, 2017
    Posts:
    1
    I want to know how to turn off isWorldTrackingEnabled in AR Foundation.
     
  37. Mry_Huang

    Mry_Huang

    Joined:
    Aug 26, 2019
    Posts:
    6
    How do I modify the isWorldTrackingEnabled value?
     
  38. Mry_Huang

    Mry_Huang

    Joined:
    Aug 26, 2019
    Posts:
    6
    These words may be redundant, but I want to make sure: the iPad's front-facing camera with the A12 chip can't do motion capture, correct?
     
  39. Mry_Huang

    Mry_Huang

    Joined:
    Aug 26, 2019
    Posts:
    6
    Hi! The methods you mentioned are all from Apple; I want to learn how to do this in Unity with AR Foundation.
     
  40. todds_unity

    todds_unity

    Joined:
    Aug 1, 2018
    Posts:
    324
    Correct. Only the rear-facing camera supports motion capture.
     
  41. HypeVR

    HypeVR

    Joined:
    Aug 28, 2018
    Posts:
    11
    Any future plans for providing the human depth and stencil textures from the front-facing camera?
     
  42. todds_unity

    todds_unity

    Joined:
    Aug 1, 2018
    Posts:
    324
    That is functionality Apple would have to build into their ARKit SDK.
     
  43. Sylafrs

    Sylafrs

    Joined:
    Jun 25, 2013
    Posts:
    65
    Hey everyone,

    I have a game in which I'm trying to achieve this behaviour:

    - First, with the rear camera and the plane manager, I detect the planes
    - Then, the user touches the screen to place an object on the ground (let's say a cube)
    - Then, they touch a play button and an animation starts to play
    - The animation frequently swaps the cameras, uses face detection to add something to the user's face (let's say a hat), then swaps back to show the object again.

    The behaviour runs normally, but the rear tracking seems off:
    it can swap the worlds and it shows the hat as desired, but the object I placed on the ground has trouble staying where it should be.

    It starts by levitating, then I can see it in the face 'world', and then it can end up anywhere in the world, surely because the phone moves while in face tracking. My guess is that the object is no longer tracked in this mode.

    How can I ask my device to keep using the rear camera (without showing it) so that the object stays in the right place?
    I haven't figured out how to achieve this from reading the code, the samples, the documentation, and this thread.
    Is it at least possible?

    Best regards,
    Sylafrs
     
  44. RyanJVR

    RyanJVR

    Joined:
    Jan 15, 2014
    Posts:
    9
    Hi,

    Was wondering if it's possible to get the light direction and ambient spherical harmonics from the TrueDepth camera on the front of the iPhone while using the back camera with world tracking?

    Regards
    Ryan
     
  45. scrant

    scrant

    Joined:
    Jun 1, 2017
    Posts:
    73
    @RyanJVR Hi Ryan, did you ever get an answer on this? I know this thread is old, but I'm trying to do exactly the same thing and can't find a good example. Many thanks.
     
  46. KyryloKuzyk

    KyryloKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    1,142
  47. scrant

    scrant

    Joined:
    Jun 1, 2017
    Posts:
    73
    I'm at my wit's end. I've been through every forum post I can find, including this one, and get close to an answer but never quite there. I am trying to have both cameras active on a new iPhone 12 Pro, where the user-facing camera will read the Main Light Direction value and the back-facing camera will track planes and anchors for object placement. The value for the lighting direction will then set the light in the scene to match the environment lighting on all the 3D objects. Seems like the obvious thing that everyone wants to do. How the !@#$ do you get Unity to do that? I know it's in ARKit 4. I've looked at all the GitHub samples, including the one above, but that doesn't solve this. I've looked at possibly using the ConfigurationChooser to do this but have no idea how to use it in this particular situation. Can anyone please help and provide a sample scene or script? Thank you.
     
  48. davidmo_unity

    davidmo_unity

    Unity Technologies

    Joined:
    Jun 18, 2019
    Posts:
    99
    @scrant Unfortunately this is not a supported feature in ARKit. ARKit decides which form of light estimation to use based on the current ARConfiguration in use.

    If you use an ARWorldTrackingConfiguration then you will get the standard light estimation values. See ARLightEstimation in the ARKit Docs.

    If you use an ARFaceTrackingConfiguration then you will get the main light direction and spherical harmonics values. See ARDirectionalLightEstimation.

    These two configurations are actually what control which of the two cameras is displayed: World gives you the back-facing camera and Face gives you the front-facing camera. To use both tracking modes at the same time, you can enable the userFaceTrackingEnabled boolean on the ARWorldTrackingConfiguration, which results in a back-facing camera image and lets you leverage face tracking, but only gives you the basic light estimation (no spherical harmonics) because you are not using an ARFaceTrackingConfiguration. Likewise, if you wish to display the front-facing camera but also do world tracking, you would use the ARFaceTrackingConfiguration with the isWorldTrackingEnabled boolean set to true, but you will not be able to access the basic light estimation features of the ARWorldTrackingConfiguration.
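
    (For reference, a hedged ARFoundation-side sketch of reading whichever light estimation values the active configuration provides, assuming a later ARFoundation (4.x) with the ARCameraManager.frameReceived event and ARLightEstimationData; per the explanation above, mainLightDirection and ambientSphericalHarmonics are only expected to have values while a face-tracking configuration is active.)

    // Hedged sketch (ARFoundation 4.x assumed): apply whatever light estimation the active
    // configuration exposes. mainLightDirection / ambientSphericalHarmonics should only be
    // populated under a face-tracking configuration, as described above.
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;

    public class LightEstimationReader : MonoBehaviour
    {
        [SerializeField] ARCameraManager m_CameraManager;
        [SerializeField] Light m_SceneLight;

        void OnEnable()  { m_CameraManager.frameReceived += OnFrameReceived; }
        void OnDisable() { m_CameraManager.frameReceived -= OnFrameReceived; }

        void OnFrameReceived(ARCameraFrameEventArgs args)
        {
            var estimation = args.lightEstimation;

            if (estimation.averageBrightness.HasValue)
                m_SceneLight.intensity = estimation.averageBrightness.Value;

            if (estimation.mainLightDirection.HasValue) // face-tracking configurations only
                m_SceneLight.transform.rotation =
                    Quaternion.LookRotation(estimation.mainLightDirection.Value);

            if (estimation.ambientSphericalHarmonics.HasValue) // face-tracking configurations only
                RenderSettings.ambientProbe = estimation.ambientSphericalHarmonics.Value;
        }
    }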
     
  49. scrant

    scrant

    Joined:
    Jun 1, 2017
    Posts:
    73
    Thanks @davidmo_unity. Ahh, that's a bummer; I thought this was possible. So what is the best way in Unity to switch between those two configurations and also set the two flags you mentioned (userFaceTrackingEnabled/isWorldTrackingEnabled)? Should we be using the ConfigurationChooser or setting the camera facing direction on the ARCameraManager?
     
  50. scrant

    scrant

    Joined:
    Jun 1, 2017
    Posts:
    73
    And also, @davidmo_unity, wouldn't we be able to hit the user-facing camera, grab the light direction, and then switch to the back camera for tracking and apply the main light direction we found from the user-facing camera? Or would the coordinate systems get messed up?
     
Thread Status:
Not open for further replies.