Human occlusion has been added to ARFoundation / ARKit

Discussion in 'Handheld AR' started by todds_unity, Nov 22, 2019.

  1. todds_unity

    todds_unity

    Unity Technologies

    Joined:
    Aug 1, 2018
    Posts:
    100
    The latest AR Foundation release, 3.1.0-preview.1, introduces a new component, AROcclusionManager. Adding this component to the AR camera in your scene provides depth data to the ARCameraBackground component so that any available depth information can be passed to the camera background shader.

    In this initial implementation, the people occlusion functionality of ARKit 3 is used to generate depth information about people detected in the real world. The camera background rendering then uses this depth information, allowing those people to occlude any virtual content that appears behind them in the scene.

    At this time, only iOS devices that support the ARKit 3 people occlusion functionality will produce occlusion effects with the new AROcclusionManager. These are devices with the A12, A12X, or A13 chips running iOS 13 (or later).

    Future devices will be supported when depth map functionality is added to the respective SDKs.


    Simply adding the new AROcclusionManager component to the AR camera (along with both the ARCameraManager and ARCameraBackground components) will enable automatic human occlusion on supported devices.

    occlusion-manager.png
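The setup above can also be done from script. This is a minimal sketch using the component names from this 3.1.0-preview.1 release; it simply guarantees the three components exist on the AR camera, mirroring what you would do by hand in the Inspector.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Attach to the AR camera. Ensures the components required for
// automatic human occlusion are present.
[RequireComponent(typeof(Camera))]
public class OcclusionSetup : MonoBehaviour
{
    void Awake()
    {
        // ARCameraManager and ARCameraBackground are normally already on
        // the AR camera; AROcclusionManager is the new addition.
        if (GetComponent<ARCameraManager>() == null)
            gameObject.AddComponent<ARCameraManager>();
        if (GetComponent<ARCameraBackground>() == null)
            gameObject.AddComponent<ARCameraBackground>();
        if (GetComponent<AROcclusionManager>() == null)
            gameObject.AddComponent<AROcclusionManager>();
    }
}
```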

    The new AROcclusionManager has two parameters: HumanSegmentationStencilMode and HumanSegmentationDepthMode. These settings let you balance the quality of the depth information from ARKit against the rendering cost of the occlusion pass.

    HumanSegmentationStencilMode has four possible values:
    • Disabled - No human stencil image is produced, and automatic human occlusion is disabled.
    • Fastest - A human stencil image with dimensions 256x192 is produced.
    • Medium - A human stencil image with dimensions 960x720 is produced.
    • Best - A human stencil image with dimensions 1920x1440 is produced.
    HumanSegmentationDepthMode has three possible values:
    • Disabled - No human depth image is produced, and automatic human occlusion is disabled.
    • Fastest - A human depth image with dimensions 256x192 is produced.
    • Best - A filtering pass is applied to enhance the 256x192 human depth image.
    Note that these dimensions and behaviors come from the current ARKit 3 implementation and are subject to change on future devices and/or in future ARKit SDK versions.

    Modify the HumanSegmentationStencilMode value to alter the boundaries of the human segmentation. Modify the HumanSegmentationDepthMode value to alter how real-world depth is measured. Disabling either setting disables automatic human occlusion.
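The two modes can also be changed at runtime from script, for example to drop to a cheaper setting on older devices. A sketch, assuming the property and enum names as they appear in this 3.1 preview (later AR Foundation versions renamed these properties):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Lets you trade occlusion quality against performance at runtime.
public class OcclusionQualityToggle : MonoBehaviour
{
    [SerializeField] AROcclusionManager m_OcclusionManager;

    // Cheapest modes: 256x192 stencil and depth images.
    public void UseFastSettings()
    {
        m_OcclusionManager.humanSegmentationStencilMode = HumanSegmentationStencilMode.Fastest;
        m_OcclusionManager.humanSegmentationDepthMode = HumanSegmentationDepthMode.Fastest;
    }

    // Highest quality: 1920x1440 stencil, filtered depth.
    public void UseBestSettings()
    {
        m_OcclusionManager.humanSegmentationStencilMode = HumanSegmentationStencilMode.Best;
        m_OcclusionManager.humanSegmentationDepthMode = HumanSegmentationDepthMode.Best;
    }
}
```

Setting either mode to Disabled turns automatic human occlusion off entirely.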

    stencil-combined.png
     
  2. newguy123

    newguy123

    Joined:
    Aug 22, 2018
    Posts:
    273
    Looks awesome. Any idea if the Android side is working on something similar, so that down the line (hopefully soonish) we get the same across AR Foundation for both iOS and Android?
     
  3. justdizzy

    justdizzy

    Joined:
    Mar 8, 2016
    Posts:
    61
    I am using stenciling for "portals" (materials have a stencil ref and compare func, and a mask is in front of the camera), will this collide with that functionality? Also, does it play well with LW/URP?
     
  4. DiveshNaidoo

    DiveshNaidoo

    Joined:
    Apr 1, 2018
    Posts:
    16
    I am using URP with an iPhone 11 Pro, and I get a black screen with the latest preview versions of AR Foundation and ARKit.
     
  5. xzodia

    xzodia

    Joined:
    Jan 24, 2013
    Posts:
    31
    Does this work alongside human body tracking?
     
  6. DiveshNaidoo

    DiveshNaidoo

    Joined:
    Apr 1, 2018
    Posts:
    16
    Aside from the black screen issue, I have a question: I am using SSAO as a render feature with URP. Is it possible to render the depth stencil on top of post-processing effects?
     
  7. justdizzy

    justdizzy

    Joined:
    Mar 8, 2016
    Posts:
    61
    Just upgraded my project to 3.1.0-preview.1 and it is still functioning properly (showing camera data as the background). I am using the ARPipelineAsset from a branch of the ARFoundation-samples repo, not the default SRPAsset.
     
  8. DiveshNaidoo

    DiveshNaidoo

    Joined:
    Apr 1, 2018
    Posts:
    16
    Hmm, not sure if I’m doing something wrong, because I basically just upgraded an existing URP project that worked on version 3.0.0. I then created a new project and set everything up to work with URP, and still got a black screen. Note that I’m using Unity 2020.
     
  9. justdizzy

    justdizzy

    Joined:
    Mar 8, 2016
    Posts:
    61
    Ah, I am on 2019.2.11. I cannot speak to 2020.
     
  10. DiveshNaidoo

    DiveshNaidoo

    Joined:
    Apr 1, 2018
    Posts:
    16
    For me, version 3.1.0-preview.1 works with LWRP but not URP. I have tested URP on both 2019.3 and 2020; 2019.2 with LWRP works.
     
  11. todds_unity

    todds_unity

    Unity Technologies

    Joined:
    Aug 1, 2018
    Posts:
    100
    ARKit people occlusion and ARKit motion tracking cannot be used simultaneously. This is an ARKit restriction.
     
  12. todds_unity

    todds_unity

    Unity Technologies

    Joined:
    Aug 1, 2018
    Posts:
    100
    I would not expect any functionality collision.

    The automated occlusion has been tested with LWRP 6.9.2 in Unity 2019.2.12f1 and with URP 7.1.5 in Unity 2019.3.0b12.
     
  13. todds_unity

    todds_unity

    Unity Technologies

    Joined:
    Aug 1, 2018
    Posts:
    100
    The AROcclusionManager provides two texture properties humanStencilTexture and humanDepthTexture that can be used in your custom render passes.
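As a rough sketch of how those properties might be consumed, the snippet below pushes the two textures into a material each frame, e.g. one used by a custom render pass. The `humanStencilTexture` and `humanDepthTexture` names come from the post above; the shader property names (`_HumanStencil`, `_HumanDepth`) are hypothetical placeholders for whatever your own shader declares.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Binds the AROcclusionManager's segmentation textures to a custom
// material so a custom render pass or effect can sample them.
public class OcclusionTextureBinder : MonoBehaviour
{
    [SerializeField] AROcclusionManager m_OcclusionManager;
    [SerializeField] Material m_EffectMaterial;

    void Update()
    {
        var stencil = m_OcclusionManager.humanStencilTexture;
        var depth = m_OcclusionManager.humanDepthTexture;

        // The textures can be null until ARKit delivers its first
        // segmentation frame, so guard each assignment.
        if (stencil != null)
            m_EffectMaterial.SetTexture("_HumanStencil", stencil);
        if (depth != null)
            m_EffectMaterial.SetTexture("_HumanDepth", depth);
    }
}
```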
     
  14. DiveshNaidoo

    DiveshNaidoo

    Joined:
    Apr 1, 2018
    Posts:
    16
    Thank you for the feedback. I’m unable to get the latest preview version of AR Foundation working with URP, and I’ve tried both 2019.3 and 2020. I made a barebones project, just setting up the pipeline and forward renderer with the AR background feature, and linked the pipeline in the build settings. The build gives me a black screen. I am using an iPhone 11 Pro Max.
     
  15. DiveshNaidoo

    DiveshNaidoo

    Joined:
    Apr 1, 2018
    Posts:
    16
    Update: it works on 2020.1.0a14.
     
  16. DiveshNaidoo

    DiveshNaidoo

    Joined:
    Apr 1, 2018
    Posts:
    16
    The black screen when using AR Foundation/ARKit 3.1.0-preview.1 with URP is reproducible by enabling post processing on the camera.
     
  17. todds_unity

    todds_unity

    Unity Technologies

    Joined:
    Aug 1, 2018
    Posts:
    100
    DiveshNaidoo and VadimPyrozhok like this.
  18. DiveshNaidoo

    DiveshNaidoo

    Joined:
    Apr 1, 2018
    Posts:
    16
    todds_unity likes this.
  19. Leonardo_Carrasco

    Leonardo_Carrasco

    Joined:
    Aug 2, 2019
    Posts:
    1
    There is no update for Android yet.
     
  20. enigmatic

    enigmatic

    Joined:
    Feb 6, 2015
    Posts:
    2
    AR Foundation works for human occlusion! Can I expect it to work for buildings and other physical objects as well?
     
  21. eyalfx

    eyalfx

    Joined:
    Oct 8, 2010
    Posts:
    101
    Any update on support for the new iPad Pro (LiDAR) and ARKit 3.5?
    Also, there was a mention of making a better example that handles depth and stencil for human segmentation. Is that still in the works?
    Thanks.
     
    Last edited: Mar 27, 2020 at 7:36 PM
  22. todds_unity

    todds_unity

    Unity Technologies

    Joined:
    Aug 1, 2018
    Posts:
    100
    The SimpleAR scene in https://github.com/Unity-Technologies/arfoundation-samples contains a working sample. The image from the original post in this thread was made using this scene.
     