
Unity Human occlusion has been added to ARFoundation / ARKit

Discussion in 'AR' started by todds_unity, Nov 22, 2019.

  1. todds_unity

    todds_unity

    Unity Technologies

    Joined:
    Aug 1, 2018
    Posts:
    171
    The latest AR Foundation 3.1.0-preview.1 release introduces a new component AROcclusionManager. Adding this new component to the AR camera in your scene provides depth data to the ARCameraBackground component so that any available depth information may be passed to the camera background shader.

    In this initial implementation, the people occlusion functionality of ARKit 3 is used to generate depth information about people detected in the real world. The camera background rendering then uses this depth information, thus allowing the people to occlude any virtual content that appears behind them in the scene.

    At this time, only iOS devices that support the ARKit 3 people occlusion functionality will produce occlusion effects with the new AROcclusionManager. These are devices with the A12, A12X, or A13 chips running iOS 13 (or later).

    Future devices will be supported when depth map functionality is added to the respective SDKs.


    Simply adding the new AROcclusionManager component to the AR camera (along with both the ARCameraManager and ARCameraBackground components) will enable automatic human occlusion to occur on supported devices.

    occlusion-manager.png

    The new AROcclusionManager has 2 parameters: HumanSegmentationStencilMode and HumanSegmentationDepthMode. These two settings allow you to balance the quality of the depth information from ARKit and the performance cost for rendering the occlusion pass.

    HumanSegmentationStencilMode has 4 possible values as follows:
    • Disabled - No human stencil image is produced, and automatic human occlusion is disabled.
    • Fastest - A human stencil image with dimensions 256x192 is produced.
    • Medium - A human stencil image with dimensions 960x720 is produced.
    • Best - A human stencil image with dimensions 1920x1440 is produced.
    HumanSegmentationDepthMode has 3 possible values as follows:
    • Disabled - No human depth image is produced, and automatic human occlusion is disabled.
    • Fastest - A human depth image with dimensions 256x192 is produced.
    • Best - A filtering pass is applied to enhance the 256x192 human depth image.
    Note that the preceding dimensions/behaviors are produced by the ARKit 3 implementation and are subject to change with future devices and/or ARKit SDK versions.

    Modify the HumanSegmentationStencilMode value to alter the boundaries of the human segmentation. Modify the HumanSegmentationDepthMode to alter how the real world depth is measured. Disabling either setting will disable automatic human occlusion.
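    For reference, here is a minimal sketch of setting these two modes from script. It assumes the 3.1.0-preview.1 API (the humanSegmentationStencilMode / humanSegmentationDepthMode properties and matching enums); later AR Foundation versions renamed these properties, so check the version you are on.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class OcclusionSetup : MonoBehaviour
{
    void Start()
    {
        // Assumes this script sits on the AR camera alongside AROcclusionManager.
        var occlusion = GetComponent<AROcclusionManager>();

        // Trade occlusion-boundary quality against rendering cost
        // (see the mode tables above).
        occlusion.humanSegmentationStencilMode = HumanSegmentationStencilMode.Medium;
        occlusion.humanSegmentationDepthMode = HumanSegmentationDepthMode.Fastest;

        // Setting either mode to Disabled turns automatic human occlusion off.
    }
}
```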

    stencil-combined.png
     
  2. newguy123

    newguy123

    Joined:
    Aug 22, 2018
    Posts:
    847
    Looks awesome. Any idea if the Android guys are working on something similar, so that down the line (hopefully soonish) we get the same across AR Foundation for both iOS and Android?
     
  3. justdizzy

    justdizzy

    Joined:
    Mar 8, 2016
    Posts:
    72
    I am using stenciling for "portals" (materials have a stencil ref and compare func, and a mask is in front of the camera), will this collide with that functionality? Also, does it play well with LW/URP?
     
  4. DiveshNaidoo

    DiveshNaidoo

    Joined:
    Apr 1, 2018
    Posts:
    21
    I am using URP with an iPhone 11 Pro, and I get a black screen with the latest preview versions of AR Foundation and ARKit.
     
  5. xzodia

    xzodia

    Joined:
    Jan 24, 2013
    Posts:
    45
    Does this work alongside human body tracking?
     
    Blarp likes this.
  6. DiveshNaidoo

    DiveshNaidoo

    Joined:
    Apr 1, 2018
    Posts:
    21
    Aside from fixing the black screen issue, I have a question. I am using SSAO as a render feature with URP, is it possible to render the depth stencil on top of post processing effects?
     
  7. justdizzy

    justdizzy

    Joined:
    Mar 8, 2016
    Posts:
    72
    Just upgraded my project to 3.1.0-preview.1 and it is still functioning properly (showing camera data as the background). I am using the ARPipelineAsset from a branch of the ARFoundation-samples repo, not the default SRPAsset.
     
  8. DiveshNaidoo

    DiveshNaidoo

    Joined:
    Apr 1, 2018
    Posts:
    21
    Hmm, not sure if I’m doing something wrong because I basically just upgraded an existing project using URP which worked on version 3.0.0. I then created a new project and set everything up to work with URP and still got a black screen. Note I’m using Unity 2020.
     
  9. justdizzy

    justdizzy

    Joined:
    Mar 8, 2016
    Posts:
    72
    Ah, I am on 2019.2.11. I cannot speak to 2020.
     
  10. DiveshNaidoo

    DiveshNaidoo

    Joined:
    Apr 1, 2018
    Posts:
    21
    For me version 3.1.0 preview 1 works with LWRP but not URP. I have tested both 2019.3 and 2020 with URP. 2019.2 with LWRP works.
     
  11. todds_unity

    todds_unity

    Unity Technologies

    Joined:
    Aug 1, 2018
    Posts:
    171
    ARKit people occlusion and ARKit motion tracking cannot be used simultaneously. This is an ARKit restriction.
     
    Blarp likes this.
  12. todds_unity

    todds_unity

    Unity Technologies

    Joined:
    Aug 1, 2018
    Posts:
    171
    I would not expect any functionality collision.

    The automated occlusion has been tested with LWRP 6.9.2 in Unity 2019.2.12f1 and with URP 7.1.5 in Unity 2019.3.0b12.
     
  13. todds_unity

    todds_unity

    Unity Technologies

    Joined:
    Aug 1, 2018
    Posts:
    171
    The AROcclusionManager provides two texture properties humanStencilTexture and humanDepthTexture that can be used in your custom render passes.
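    As a rough sketch of what consuming those two properties could look like (the material and the "_HumanStencil" / "_HumanDepth" shader property names are hypothetical, for illustration only):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class OcclusionTextureBinder : MonoBehaviour
{
    public AROcclusionManager occlusionManager;
    public Material occlusionMaterial;

    void Update()
    {
        var stencil = occlusionManager.humanStencilTexture;
        var depth = occlusionManager.humanDepthTexture;

        // The textures can be null until ARKit starts producing
        // segmentation data, so guard each assignment.
        if (stencil != null) occlusionMaterial.SetTexture("_HumanStencil", stencil);
        if (depth != null) occlusionMaterial.SetTexture("_HumanDepth", depth);
    }
}
```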
     
    VadimPyrozhok and DiveshNaidoo like this.
  14. DiveshNaidoo

    DiveshNaidoo

    Joined:
    Apr 1, 2018
    Posts:
    21
    Thank you for the feedback. I'm unable to get the latest preview version of ARFoundation working with URP, and I've tried both 2019.3 and 2020. I made a barebones project, just setting up the pipeline and forward renderer with the AR background feature, and linked the pipeline in the build settings. The build gives me a black screen. I am using an iPhone 11 Pro Max.
     
  15. DiveshNaidoo

    DiveshNaidoo

    Joined:
    Apr 1, 2018
    Posts:
    21
    Update, it works on 2020.1.0a14
     
    newguy123 likes this.
  16. DiveshNaidoo

    DiveshNaidoo

    Joined:
    Apr 1, 2018
    Posts:
    21
    The black screen when using ARFoundation/ARkit 3.1.0 preview 1 with URP is reproducible by enabling post processing on the camera.
     
    newguy123 likes this.
  17. todds_unity

    todds_unity

    Unity Technologies

    Joined:
    Aug 1, 2018
    Posts:
    171
    DiveshNaidoo and VadimPyrozhok like this.
  18. DiveshNaidoo

    DiveshNaidoo

    Joined:
    Apr 1, 2018
    Posts:
    21
    todds_unity likes this.
  19. Leonardo_Carrasco

    Leonardo_Carrasco

    Joined:
    Aug 2, 2019
    Posts:
    1
    There is no update for android yet
     
    Doraemon231 likes this.
  20. enigmatic

    enigmatic

    Joined:
    Feb 6, 2015
    Posts:
    2
    ARFoundation works for human occlusion! Can I expect it to work for buildings and other physical objects as well?
     
  21. eyalfx

    eyalfx

    Joined:
    Oct 8, 2010
    Posts:
    102
    Any update on support for the new iPad Pro (LiDAR) and ARKit 3.5?
    Also, there was a mention of making a better example that handles depth and stencil for human segmentation. Is that still in the works?
    Thanks.
     
    Last edited: Mar 27, 2020
  22. todds_unity

    todds_unity

    Unity Technologies

    Joined:
    Aug 1, 2018
    Posts:
    171
    The SimpleAR scene in https://github.com/Unity-Technologies/arfoundation-samples contains a working sample. The image from the original post in this thread was made using this scene.
     
  23. sbethge

    sbethge

    Joined:
    Mar 20, 2019
    Posts:
    16
  24. todds_unity

    todds_unity

    Unity Technologies

    Joined:
    Aug 1, 2018
    Posts:
    171
  25. poshaughnessey

    poshaughnessey

    Joined:
    Jul 12, 2012
    Posts:
    44
    @todds_unity are you aware of a sample that translates a pixel in the humanDepthTexture to an x,y,z value in world coordinates, if one, for example, wanted to place a GameObject or a particle at one or more of the spots where a human was detected?
     
  26. todds_unity

    todds_unity

    Unity Technologies

    Joined:
    Aug 1, 2018
    Posts:
    171
    The floating point value of each pixel in the human depth texture is either (1) 0.0 if there is no human in the pixel or (2) the estimated depth in meters from the device camera to the human.

    Using information from the camera matrix, it should be possible to determine the world coordinate position of that human pixel.

    There is no sample currently that demonstrates this functionality.
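    A back-of-the-envelope sketch of that unprojection, assuming the pixel's normalized coordinates already correspond to the camera viewport (in practice the depth texture may additionally need the display transform / aspect-ratio correction applied first):

```csharp
using UnityEngine;

public static class HumanDepthUtil
{
    // uv: normalized (0..1) coordinates of the pixel in viewport space.
    // depthMeters: the value sampled from humanDepthTexture at that pixel.
    public static Vector3? PixelToWorld(Camera arCamera, Vector2 uv, float depthMeters)
    {
        // A value of 0.0 means no human was detected at this pixel.
        if (depthMeters <= 0f)
            return null;

        // Unproject: x, y in viewport space, z = distance from the camera in meters.
        return arCamera.ViewportToWorldPoint(new Vector3(uv.x, uv.y, depthMeters));
    }
}
```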
     
    poshaughnessey likes this.
  27. herra_lehtiniemi

    herra_lehtiniemi

    Joined:
    Feb 12, 2017
    Posts:
    93
    Is there a built-in way in AROcclusionManager to only display the human without the background? (Basically “green screening”)? Maybe using the depth texture? How should one approach this effectively performance wise?
     
  28. zulaman

    zulaman

    Joined:
    May 12, 2017
    Posts:
    26
    I think everyone here is still waiting for similar functionality. The sample that's currently there is from a year ago (ARKit 3.0). It would be great to have a functional sample that is ready to use.
     
  29. EZ_Sherab

    EZ_Sherab

    Joined:
    Mar 21, 2020
    Posts:
    2
    I am using "ARKit XR Plugin 3.1.3" & "AR Foundation 3.1.3",
    but I can't find "AR Occlusion Manager" in "Add Component".

    I can find "AR Occlusion Manager" in "Add Component" with "ARKit XR Plugin 4.0.1" & "AR Foundation 4.0.2",

    but the camera becomes a black screen in 4.0.1 with all settings remaining the same : (

    UPDATE: I can find "AR Occlusion Manager" in 3.1.0, but not 3.1.3....
     
    Last edited: Jun 17, 2020
  30. HidingGlass

    HidingGlass

    Joined:
    Apr 29, 2015
    Posts:
    24
    Is it possible to use the AROcclusionManager in the user-facing camera direction? It seems that AROcclusionManager is somehow forcing the camera to be world-facing.

    I understand if segmentation is not supported in the front facing camera, but can someone point me to where ARFoundation is forcing my camera the other way? I don't see any logs indicating 'user facing camera not supported, switching to world facing' etc. and I haven't tracked down where it's happening yet.
     
  31. todds_unity

    todds_unity

    Unity Technologies

    Joined:
    Aug 1, 2018
    Posts:
    171
    Automatic occlusion via the AROcclusionManager does not work with the front-facing camera.

    Enabling the "Development Build" option in the Build Settings will result in additional information logged including the available configurations on the device, the feature sets supported for each configuration, and which features are met and which features are not met on every configuration change.
     
  32. HidingGlass

    HidingGlass

    Joined:
    Apr 29, 2015
    Posts:
    24
    Thanks that's good to know about the extra logs!

    It seems that for some reason outside of the sample scene I am able to get the raw occlusion mask for the front facing camera which is great. I'm doing a bit of a homebrew solution to blur the background behind the user (similar to portrait camera on iOS). With the depth mask and the camera texture I think I should be able to accomplish this with a shader and re-apply my new 'processed' camera texture. Still trying it out.

    Just for my reference. I'm able to query the 'requested camera' and the 'actual camera', but do you know off the top of your head what subsystem is saying "You can't use the requested camera so here's the other one"? Still trying to track that down.
     
    rocket5tim likes this.
  33. dtaddis

    dtaddis

    Joined:
    Oct 17, 2019
    Posts:
    15
    Sorry to dredge up this thread, but is there a recommended way to switch between the two features?

    We would like to use people occlusion for the majority of our game, but switch to body tracking for a few missions. Then we'll switch back to occlusion (ideally we would use both simultaneously, but we'd live with the switch). Is this possible in Unity at the moment? What's the recommended way to do it?
     
    ina likes this.
  34. dtaddis

    dtaddis

    Joined:
    Oct 17, 2019
    Posts:
    15
    Ok, answered my own question... at runtime you can set the "requestedDepthMode" and "requestedStencilMode" to disabled on the AROcclusionManager component, and the "pose3DRequested" flag on your ARHumanBodyManager component to true (or vice versa), and ARFoundation will redetermine which configuration to use on the fly. Nice! I guess this is the recommended way to do things if you don't want to, or can't, change scenes.
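    Sketched out, that runtime switch could look like the following (property names follow the AR Foundation 4.x API; verify against the version you have installed):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class FeatureSwitcher : MonoBehaviour
{
    public AROcclusionManager occlusionManager;
    public ARHumanBodyManager bodyManager;

    // Disable people occlusion and request 3D body tracking; AR Foundation
    // re-evaluates the session configuration on the next opportunity.
    public void UseBodyTracking()
    {
        occlusionManager.requestedHumanDepthMode = HumanSegmentationDepthMode.Disabled;
        occlusionManager.requestedHumanStencilMode = HumanSegmentationStencilMode.Disabled;
        bodyManager.pose3DRequested = true;
    }

    // The reverse: drop body tracking, request occlusion again.
    public void UseOcclusion()
    {
        bodyManager.pose3DRequested = false;
        occlusionManager.requestedHumanDepthMode = HumanSegmentationDepthMode.Fastest;
        occlusionManager.requestedHumanStencilMode = HumanSegmentationStencilMode.Fastest;
    }
}
```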
     
  35. ina

    ina

    Joined:
    Nov 15, 2010
    Posts:
    1,003
    Can occlusion settings be set at runtime? Such as prefer human vs environment occlusion?
     
  36. maxuntoldgarden

    maxuntoldgarden

    Joined:
    Dec 9, 2019
    Posts:
    3
    That sounds great!
    How did you manage to get the raw occlusion mask from the front facing camera? Are you getting it from AROcclusionManager? I'm trying to mask 3D objects with someone's head occlusion mask while in front-facing camera mode.
     
  37. Zinn_Gi

    Zinn_Gi

    Joined:
    Oct 17, 2018
    Posts:
    2
    Hi all,

    Is human occlusion supported on the latest Android devices?
    If yes, can someone post some working Unity examples?

    Thanks in advance
     
    Treecrotch likes this.
  38. KirillKuzyk

    KirillKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    738
    Human Occlusion is not supported by ARCore.
     
    Zinn_Gi likes this.
  39. lixiongguo

    lixiongguo

    Joined:
    Mar 20, 2020
    Posts:
    1
    I cannot see AROcclusionManager.cs in ARFoundation 3.1.6. Is this version of ARFoundation not available for human occlusion?
     
  40. b9n

    b9n

    Joined:
    Feb 6, 2020
    Posts:
    14
    it worked on 3.1.3
     
  41. mfuad

    mfuad

    Unity Technologies

    Joined:
    Jun 12, 2018
    Posts:
    306
    Hey everyone, apologies for the confusion. The AR occlusion manager was included in AR Foundation 3.1 preview versions but was not considered production ready for AR Foundation 3.1 verified versions. To use the occlusion manager, please use AR Foundation 4.0 verified in Unity 2020.2.
     
  42. rocket5tim

    rocket5tim

    Joined:
    May 19, 2009
    Posts:
    237
    Referring back to the question @HidingGlass posted a while back: "...do you know off the top of your head what subsystem is saying "You can't use the requested camera so here's the other one"? Still trying to track that down."

    I would like to use the front facing camera with the AROcclusionManager, but something is forcing it to always use the back camera even when I specifically try to switch to the front camera in code.
     
  43. KirillKuzyk

    KirillKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    738
    Current implementations of ARKit and ARCore don't support occlusion on the front-facing camera. Here is an extension method I made to check if all features can be satisfied simultaneously using the XRSessionSubsystem.DetermineConfiguration():
    Code (CSharp):

    using JetBrains.Annotations;
    using UnityEngine.Assertions;
    using UnityEngine.XR.ARFoundation;
    using UnityEngine.XR.ARSubsystems;

    public static class ARSessionExtensions {
        public static bool AreAllFeaturesSupportedSimultaneously([NotNull] this ARSession session, Feature features) {
            var subsystem = session.subsystem;
            Assert.IsNotNull(subsystem, $"Please ensure that {nameof(ARSession)} was enabled at least once before calling the {nameof(AreAllFeaturesSupportedSimultaneously)}() method.");
            var config = subsystem.DetermineConfiguration(features);
            return config.HasValue && features.SetDifference(config.Value.features) == 0;
        }
    }
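    A hypothetical caller of the extension, checking whether the user-facing camera and people occlusion can be satisfied together (the Feature flag names are from UnityEngine.XR.ARSubsystems; check them against your AR Foundation version):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class FeatureCheck : MonoBehaviour
{
    public ARSession session;

    void Start()
    {
        // Combine the features we would like to request simultaneously.
        var wanted = Feature.UserFacingCamera | Feature.PeopleOcclusionStencil;

        if (!session.AreAllFeaturesSupportedSimultaneously(wanted))
            Debug.Log("User-facing camera + people occlusion is not supported together on this device.");
    }
}
```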
     
  44. rocket5tim

    rocket5tim

    Joined:
    May 19, 2009
    Posts:
    237
    Thanks for the reply! I'm actually using your ARFoundationsEssentials repo (and your AR Remote plugin) for my testing, thanks so much for all your efforts!

    Without trying the above extension, it sounds like it just confirms that the front camera won't work with my config - is that about right? Simply adding AROcclusionManager to my AR setup is what forces the system to use the back camera only; without the component, I can switch cameras at will.

    @todds_unity or @mfuad, do you guys know why the front camera isn't supported? Is it a hardware limitation of the front camera or a limitation of ARKit itself? And if neither of those, do you know if you're planning to add support for occlusion on the front camera in the future?
     
  45. KirillKuzyk

    KirillKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    738
    Thank you :)

    Yes, this is correct.

    This is a limitation in the native implementation of ARKit/ARCore. AR Foundation is a wrapper around these native implementations and can't provide any extra functionality.
    For example, here is the ARKit documentation that describes all available session configurations. As you can see, the user-facing camera (they call it 'selfie' in the article) is only available in ARFaceTrackingConfiguration.
     
    rocket5tim likes this.
  46. TreyK-47

    TreyK-47

    Unity Technologies

    Joined:
    Oct 22, 2019
    Posts:
    1,699
  47. unity_h36Dp8nmGLPPVA

    unity_h36Dp8nmGLPPVA

    Joined:
    May 19, 2020
    Posts:
    1
    Hi, sorry, my question might be strange; but is there any solution in the AR Occlusion Manager to switch off the depth mode while keeping the stencil mode on?
    In my specific case, I need to use the AR Occlusion Manager to occlude my virtual object consistently with the hand/body, whether the virtual object is behind the hand/body or in front of it. Indeed, I need to disable the depth detection. In the current setting, both modes (Human Segmentation Stencil Mode and Human Segmentation Depth Mode) need to be enabled to make the occlusion manager work, and disabling the depth mode results in a build error. So, is there any way to reach the explained objective? Any help is appreciated!
     
  48. lbaptista95

    lbaptista95

    Joined:
    Jan 4, 2019
    Posts:
    4
    Hi. Is there a way to detect if my body is colliding with some GameObject? I mean, when the camera detects my hand, does it turn my hand into a game object too?
     
    unity_h36Dp8nmGLPPVA likes this.