Official Environmental Occlusion and Depth in ARFoundation

Discussion in 'AR' started by todds_unity, Jun 24, 2020.

Thread Status:
Not open for further replies.
  1. todds_unity

    todds_unity

    Joined:
    Aug 1, 2018
    Posts:
    324
    Updated: Occlusion and depth are now supported in both ARCore and ARKit.

    The AR Foundation, ARCore, and ARKit packages have been published at version 4.1.0-preview.2. These packages add automatic environment occlusion functionality to the existing AROcclusionManager component. With this component configured in your scene, your virtual content integrates more closely with the real world using depth information.

    Depth information, generated for each video frame, is an image in which each pixel represents the distance between the device and real-world objects. With automatic occlusion configured, the camera background rendering uses this depth image to populate the depth buffer before rendering. When the virtual content is rendered, Z-buffer occlusion allows the virtual content and real-world objects to occlude each other, giving the effect that the virtual and real worlds appear as one.

    For ARCore, Google maintains this list of ARCore compatible devices which notes the devices that support the depth API.

    For ARKit, the new iPad Pro running ARKit 4 with iOS 14 supports automatic environment depth occlusion. Note that Xcode 12 is also required.


    To use automatic occlusion, you simply add the AROcclusionManager component to the AR camera (along with both the ARCameraManager and ARCameraBackground components).

    arfoundation-arocclusionmanager-2020-06-24.png

    The AROcclusionManager has three parameters:
    - Environment depth mode
    - Human segmentation stencil mode
    - Human segmentation depth mode

    Human segmentation modes are discussed in this forum post.

    Environment depth mode has four settings:
    - Disabled
    - Fastest
    - Medium
    - Best

    Enable automatic environment occlusion by setting this property to anything other than Disabled. The remaining settings specify the quality of the depth occlusion to perform. However, each additional level of quality comes with more operations that impact your game's performance, so choose the setting that best meets your performance needs.
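
    A minimal sketch of setting this property from a script (assuming the 4.1 preview API; in some preview versions the property is named environmentDepthMode rather than requestedEnvironmentDepthMode):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;
    using UnityEngine.XR.ARSubsystems;

    public class OcclusionSetup : MonoBehaviour
    {
        // The AROcclusionManager on the AR camera.
        [SerializeField]
        AROcclusionManager m_OcclusionManager;

        void Start()
        {
            // Anything other than Disabled enables automatic environment occlusion;
            // Fastest/Medium/Best trade depth quality against performance.
            m_OcclusionManager.requestedEnvironmentDepthMode = EnvironmentDepthMode.Best;
        }
    }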

    arfoundation-arocclusion-example.png

    The arfoundation-samples repository also has a SimpleOcclusion scene preconfigured with this setup. In this scene, you tap the screen to fire a red projectile into the scene and watch as it becomes occluded by real-world geometry.
     
    Last edited: Jun 25, 2020
    jbraam and newguy123 like this.
  2. Natzke

    Natzke

    Joined:
    Feb 5, 2017
    Posts:
    7
    Do you need Xcode 12 to compile for ARKit 4? Still waiting for the iOS 14 beta to download for my iPad, and I haven't come across any development requirements (besides updated packages and the device iOS beta).
     
    dimitris_baud likes this.
  3. rmuk

    rmuk

    Joined:
    Feb 18, 2015
    Posts:
    65
    ARCore's depth API is now out of beta. Will the Occlusion Manager be updated to support that on Android at some point?
     
    ROBYER1 and peterfiveeight like this.
  4. peterfiveeight

    peterfiveeight

    Joined:
    Jun 24, 2013
    Posts:
    9
    I was just about to post the exact same question. Thanks for beating me to it! It would be great to get support for this in ARCore on Android as soon as possible.
     
  5. todds_unity

    todds_unity

    Joined:
    Aug 1, 2018
    Posts:
    324
    We have released support for both ARCore and ARKit depth. The original forum post has been updated.
     
  6. rmuk

    rmuk

    Joined:
    Feb 18, 2015
    Posts:
    65
    Wonderful thanks!
     
  7. dimitris_baud

    dimitris_baud

    Joined:
    Jan 15, 2018
    Posts:
    31
    Is this not supported on an iPad Pro with LiDAR running iOS 13.x (not the iOS 14 beta)?
     
  8. spacemarine

    spacemarine

    Joined:
    Mar 28, 2013
    Posts:
    14
    My Galaxy Tab S6 (2019) says "ARCore unsupported device".
    Please fix this. Thanks~
     
  9. DineshPunni

    DineshPunni

    Joined:
    Mar 31, 2020
    Posts:
    1
    I have the same issue with my Samsung Galaxy S8+
     
  10. todds_unity

    todds_unity

    Joined:
    Aug 1, 2018
    Posts:
    324
    For ARKit, iOS 14 and Xcode 12 are both required for depth to work.
     
    dimitris_baud likes this.
  11. todds_unity

    todds_unity

    Joined:
    Aug 1, 2018
    Posts:
    324
    You may check Google's list of supported devices for compatibility with ARCore.
     
  12. todds_unity

    todds_unity

    Joined:
    Aug 1, 2018
    Posts:
    324
    Which version of the ARCore app ("Google Play Services for AR") is installed on your device?
     
  13. newguy123

    newguy123

    Joined:
    Aug 22, 2018
    Posts:
    1,248
    I'm not understanding this list.
    Are all Android devices on that list supported, or only the ones with notes that say "Supports Depth API"?
     
  14. Cenda

    Cenda

    Joined:
    Jun 3, 2010
    Posts:
    66
    Yes, only the devices with the note "Supports Depth API" support the Depth API :) The rest support just ARCore, without the Depth API.
     
    newguy123 likes this.
  15. dimitris_baud

    dimitris_baud

    Joined:
    Jan 15, 2018
    Posts:
    31
    Does Environmental Occlusion and Depth work out of the box (just with the AROcclusionManager) in URP?
     
  16. weirdmonkey2807

    weirdmonkey2807

    Joined:
    Nov 14, 2018
    Posts:
    1
    What a mess Android is. As good as the Depth API is, there are only about 20 devices that support it out of hundreds of Android phones...
     
    newguy123 likes this.
  17. todds_unity

    todds_unity

    Joined:
    Aug 1, 2018
    Posts:
    324
    Yes.
     
  18. Tomzai

    Tomzai

    Joined:
    Oct 12, 2015
    Posts:
    7
    Is it possible to use the AROcclusionManager to get access to the environment depth texture without having automatic occlusion? My use cases are:
    - I want to use the depth texture for collision tests only
    - I want to generate my own occlusion from the depth texture so that I can try excluding around planar regions that my game content is sitting on
    - Also, occlusion doesn't seem correct when a scale is applied to the AR Session Origin, so I want to see if I can work around that
     
  19. todds_unity

    todds_unity

    Joined:
    Aug 1, 2018
    Posts:
    324
    The ARCameraBackground only looks for the AROcclusionManager on the same game object, the AR camera.

    If you add the AROcclusionManager to any game object other than the AR camera, the AROcclusionManager produces the environment depth texture (if enabled), but the background rendering does not use it for occlusion.
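
    A minimal sketch of that setup, assuming the 4.1 preview API (the component and field names here are illustrative):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;

    public class DepthTextureReader : MonoBehaviour
    {
        // Per the answer above: place this AROcclusionManager on a GameObject
        // other than the AR camera so ARCameraBackground does not use it.
        [SerializeField]
        AROcclusionManager m_OcclusionManager;

        void Update()
        {
            // Null until the subsystem starts producing depth images.
            Texture2D depth = m_OcclusionManager.environmentDepthTexture;
            if (depth != null)
            {
                // Use the depth texture for collision tests or custom occlusion.
            }
        }
    }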
     
  20. Tomzai

    Tomzai

    Joined:
    Oct 12, 2015
    Posts:
    7
    Perfect, thanks! Do you have any tips for converting the values in the depth texture into depth in metres?
     
  21. todds_unity

    todds_unity

    Joined:
    Aug 1, 2018
    Posts:
    324
    The pixel values are in meters.
     
  22. Tomzai

    Tomzai

    Joined:
    Oct 12, 2015
    Posts:
    7
    Handy - thanks again.
     
  23. j0schm03

    j0schm03

    Joined:
    Jun 21, 2018
    Posts:
    16
  24. Dalton-Lima

    Dalton-Lima

    Joined:
    Dec 21, 2016
    Posts:
    19
    I wish I had a checkbox in `AROcclusionManager` to toggle occlusion per platform (iOS, Android). I know I can do that with an extra script, though.

    For the moment I am not quite happy with the occlusion that I get on Android; parts of objects placed close to the floor get clipped, although on iOS the human occlusion works quite well.
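
    A sketch of that extra-script approach (a hypothetical component that enables or disables the AROcclusionManager per platform at startup):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;

    public class PlatformOcclusionToggle : MonoBehaviour
    {
        [SerializeField] bool m_EnableOnAndroid = false;
        [SerializeField] bool m_EnableOniOS = true;

        void Start()
        {
            // Assumes this script sits next to the AROcclusionManager on the
            // AR camera. Disabling the component stops occlusion entirely.
            var occlusion = GetComponent<AROcclusionManager>();
#if UNITY_ANDROID
            occlusion.enabled = m_EnableOnAndroid;
#elif UNITY_IOS
            occlusion.enabled = m_EnableOniOS;
#endif
        }
    }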
     
  25. mrkhnstn

    mrkhnstn

    Joined:
    Jun 23, 2013
    Posts:
    4
    Is it somehow possible to get the human stencil texture in combination with the environment depth texture on iPad + LiDAR + iOS14?
     
  26. VictorChow_K

    VictorChow_K

    Joined:
    Jan 16, 2019
    Posts:
    9
    The environment depth feature does not function in my testing with the Simple Occlusion sample in ARFoundation Samples. Setup:

    - ARFoundation 4.1.0-preview.5 (also tested with preview.2)
    - ARKit XR Plugin 4.1.0-preview.5 (also tested with preview.2)
    - ARSubsystems 4.1.0-preview.5 (also tested with preview.2)
    - Unity 2019.4.3
    - iPadOS 14 developer beta 3 (18A5332f) on an iPad Pro 4th Gen
    - Xcode 12 beta 3 (12A8169g)

    ARKit-specific demos from developer.apple.com appear to still function correctly.

    Any other suggested troubleshooting (e.g. perhaps a public beta iPadOS version)?
     
  27. prono82

    prono82

    Joined:
    Jan 22, 2019
    Posts:
    4
    Yes, me too.
    Did you find any solutions?

     
  28. DreamPower

    DreamPower

    Joined:
    Apr 2, 2017
    Posts:
    103
    I just verified the same problem with my iPad Pro 4th generation, using Unity 2020.1 and ARFoundation 4.1.0-preview.5, iPadOS 14.0 beta 3. I tried the arfoundation-samples latest preview branch, and other demos work, such as Classification Meshes, but neither the SimpleOcclusion nor the DepthImages demo works. I verified that m_OcclusionManager.descriptor?.supportsEnvironmentDepthImage is true.

    This is the output I'm seeing in Xcode's console in SimpleOcclusion, starting with the loading of the UnityARKit plugin:

    Code (CSharp):
    [Subsystems] Loading plugin UnityARKit for subsystem ARKit-Input...
    [Subsystems] UnityARKit successfully registered Provider for ARKit-Input
    [Subsystems] UnityARKit successfully registered Provider for ARKit-Input
    [Subsystems] UnityARKit successfully registered Provider for ARKit-Meshing
    [Subsystems] Loading plugin UnityARKit for subsystem ARKit-Meshing...
    2020-07-26 17:15:42.958376-0700 samples[413:11445] Unbalanced calls to begin/end appearance transitions for <UnityViewControllerStoryboard: 0x10d705240>.
    UnloadTime: 0.476458 ms
    2020-07-26 17:15:44.565873-0700 samples[413:11445] UnityARKit: Updating ARSession configuration with <ARWorldTrackingConfiguration: 0x283d80790 worldAlignment=Gravity lightEstimation=Disabled frameSemantics=Unknown videoFormat=<ARVideoFormat: 0x2829de490 imageResolution=(1920, 1440) framesPerSecond=(60)> autoFocus=Enabled environmentTexturing=None wantsHDREnvironmentTextures=Enabled planeDetection=None collaboration=Disabled userFaceTracking=Disabled sceneReconstruction=None>
    2020-07-26 17:15:44.974870-0700 samples[413:11445] UnityARKit: Updating ARSession configuration with <ARWorldTrackingConfiguration: 0x283d880b0 worldAlignment=Gravity lightEstimation=Disabled frameSemantics=Unknown videoFormat=<ARVideoFormat: 0x2829de490 imageResolution=(1920, 1440) framesPerSecond=(60)> autoFocus=Enabled environmentTexturing=None wantsHDREnvironmentTextures=Enabled planeDetection=None collaboration=Disabled userFaceTracking=Disabled sceneReconstruction=None>
    2020-07-26 17:15:44.975027-0700 samples[413:11620] [General] ARWorldTrackingConfiguration <0x283d88370>: Expected 3 jasper framerate values: 0
    2020-07-26 17:15:46.732889-0700 samples[413:11622] [Sensor] ARDepthSensor <0x283189ba0>: (AVCaptureDeviceTypeBuiltInTimeOfFlightCamera - Back): capture session dropped jasper frame: 1115.505557, LateData
    In the DepthImages demo, I tried touching the button that changes the resolution, but that didn't make it start working.
     
    Last edited: Jul 27, 2020
  29. prono82

    prono82

    Joined:
    Jan 22, 2019
    Posts:
    4
    In the environment of iPadOS 14 beta 3, when the ARKit sample code "Visualizing a Point Cloud Using Scene Depth" (*1) is built with Xcode 12.0 beta 3 (12A8169g), the point cloud is displayed, but when built with Xcode 12.0 beta (12A6159), the point cloud is not displayed. From this result, it seems that Xcode 12.0 beta 3 is required.

    The changelog of ARKit XR Plugin 4.1.0-preview.5 (*2) says "Static library was built with Xcode 12.0 beta 2 (12A6163b)".
    Given the result above, I guess the static library should be built with Xcode 12.0 beta 3.

    *1 https://developer.apple.com/documen...a_point_cloud_using_scene_depth?language=objc
    *2 https://docs.unity3d.com/Packages/com.unity.xr.arkit@4.1/changelog/CHANGELOG.html
     
  30. todds_unity

    todds_unity

    Joined:
    Aug 1, 2018
    Posts:
    324
    j0schm03 likes this.
  31. todds_unity

    todds_unity

    Joined:
    Aug 1, 2018
    Posts:
    324
    iOS 14 beta 3 introduced a breaking change to environment depth.

    The 4.1.0-preview.6 set of AR Foundation packages will work on the new iOS 14 beta 3.
     
    VictorChow_K and prono82 like this.
  32. todds_unity

    todds_unity

    Joined:
    Aug 1, 2018
    Posts:
    324
    iOS 14 beta 1 and beta 2 had an issue with the interaction between the human stencil texture and the environment depth texture.

    iOS 14 beta 3 resolved this issue. Do note that you will need the 4.1.0-preview.6 set of AR Foundation packages to work with iOS 14 beta 3.
     
    mrkhnstn likes this.
  33. Misnomer

    Misnomer

    Joined:
    Jul 15, 2013
    Posts:
    28
    I tried running the Normals Meshing example with AR Foundation 4.0.2 on Unity 2019.4.4f1, and after 8-10 minutes the application freezes. The problem seems to be with the Unity garbage collector and reproduces even if I don't move the iPad (very few meshes are being generated). Has anyone else faced this issue?
     
  34. todds_unity

    todds_unity

    Joined:
    Aug 1, 2018
    Posts:
    324
    This issue should be fixed in the 4.1.0-preview.6 set of AR Foundation packages. We are working on backporting the fix to a future 4.0 verified version.
     
    Misnomer likes this.
  35. stephen_unity653

    stephen_unity653

    Joined:
    Feb 8, 2018
    Posts:
    1
    Is it possible to have environment occlusion and human occlusion working simultaneously?

    So far, setting the AROcclusionManager OcclusionPreferenceMode to "Human Occlusion" seems to make environment occlusion not work, and setting it to "Environment Occlusion" seems to make human occlusion not work.

    Is this intended?
     
  36. Misnomer

    Misnomer

    Joined:
    Jul 15, 2013
    Posts:
    28
    I tried the same example with the latest preview and unfortunately the problem persists. Maybe you can give some general direction/advice on how I can fix it.
     
    Last edited: Aug 4, 2020
  37. j0schm03

    j0schm03

    Joined:
    Jun 21, 2018
    Posts:
    16
    One more issue I've run into. When an AR app is locked in a landscape orientation, the perspective is off and the virtual objects do not appear in their correct locations while in landscape mode. Turn the phone to portrait orientation and everything looks great. Back to landscape and it's off again. It's as if the app is not detecting the initial orientation setting and re-projecting.

    Would setting the FOV Axis on the AR camera to Horizontal fix this issue if the app is locked in landscape?
     
  38. DreamPower

    DreamPower

    Joined:
    Apr 2, 2017
    Posts:
    103
    Environmental Occlusion worked for me (in my own project) with 4.1.0-preview.6 (though it wasn’t as detailed/stable as I hoped it would be). Unfortunately, I don’t have the project available right now to give any specifics/tips, aside from putting the occlusion manager on the camera.
     
  39. Tomzai

    Tomzai

    Joined:
    Oct 12, 2015
    Posts:
    7
    Is it possible to use ARCore's depth capabilities on non-LiDAR iOS devices through AR Foundation? Or would I need to use ARCore without AR Foundation?
     
  40. j0schm03

    j0schm03

    Joined:
    Jun 21, 2018
    Posts:
    16
    In a portrait app, the environmentDepthTexture is rotated 90 degrees. Is there a way to automatically adjust this to portrait?

    We are also noticing that when using GetPixels on the environmentDepthTexture at a given location, the values returned for the r channel are only between 0 and 1. Is this the correct behavior? With the distance in meters being stored in the r value, we anticipated values over 1 meter, but maybe that was a misjudgment on our end. I want to verify it's working correctly. Thank you!
     
  41. j0schm03

    j0schm03

    Joined:
    Jun 21, 2018
    Posts:
    16
    In a portrait app, the environmentDepthTexture is rotated 90 degrees. Is there a way to automatically adjust this to portrait?

    It's my understanding that the values returned by the environmentDepthTexture are in meters, the distance from the camera to the pixel in the environmentDepthTexture. According to this comment:

    We are noticing that when using GetPixel on the environmentDepthTexture at a given location, the values returned for the r channel are only between 0 and 1. Is this the correct behavior? With the distance supposedly in meters, we anticipated values over 1 meter. Or are the Color.r values returned by the environmentDepthTexture actually not the distance in meters, and do we then have to calculate the distance based on the clipping planes etc.?

    For example, doing the following never gives us a value over 1:

    Code (CSharp):
    Color depthValue = m_OcclusionManager.environmentDepthTexture.GetPixel((int)touchPos.x, (int)touchPos.y);
    Debug.Log(depthValue.r);
     
    Last edited: Aug 24, 2020
  42. Tomzai

    Tomzai

    Joined:
    Oct 12, 2015
    Posts:
    7
    From what I can tell, the depth values are in meters, but the format of the texture means that if you use GetPixel you get a value clamped to 1 meter max. To get full-range values, what I ended up doing is blitting the depth texture into a render texture (format RenderTextureFormat.ARGBFloat), then rendering that into a new Texture2D, which I could then use GetPixel on. I'm sure there's a more efficient way, but it worked. I also use a shader when blitting the initial texture that rotates it based on Screen.orientation so I don't have to handle that later on; the initial texture is always in 'Landscape Right' orientation.
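
    A minimal sketch of that readback approach (the names and the rotation material are illustrative; the material would use a shader like the one shared in the next post, and for simplicity this keeps the source texture's dimensions rather than swapping width/height for portrait):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;

    public class DepthReadback : MonoBehaviour
    {
        [SerializeField] AROcclusionManager m_OcclusionManager;
        [SerializeField] Material m_RotateMaterial; // hypothetical material using the rotation shader

        RenderTexture m_DepthRT;
        Texture2D m_ReadableDepth;

        // Returns the depth in meters at the given pixel, or 0 if unavailable.
        public float SampleDepthMeters(int x, int y)
        {
            Texture2D raw = m_OcclusionManager.environmentDepthTexture;
            if (raw == null) return 0f;

            if (m_DepthRT == null)
            {
                // Float format so depth values are not clamped to [0, 1].
                m_DepthRT = new RenderTexture(raw.width, raw.height, 0,
                                              RenderTextureFormat.ARGBFloat);
                m_ReadableDepth = new Texture2D(raw.width, raw.height,
                                                TextureFormat.RGBAFloat, false);
            }

            // Rotate according to the current screen orientation while blitting.
            m_RotateMaterial.SetInt("_Orientation", (int)Screen.orientation);
            Graphics.Blit(raw, m_DepthRT, m_RotateMaterial);

            // Read the GPU render texture back into a CPU-readable Texture2D.
            var previous = RenderTexture.active;
            RenderTexture.active = m_DepthRT;
            m_ReadableDepth.ReadPixels(new Rect(0, 0, m_DepthRT.width, m_DepthRT.height), 0, 0);
            m_ReadableDepth.Apply();
            RenderTexture.active = previous;

            return m_ReadableDepth.GetPixel(x, y).r;
        }
    }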
     
    eco_bach likes this.
  43. j0schm03

    j0schm03

    Joined:
    Jun 21, 2018
    Posts:
    16
    UPDATE:
    This worked perfectly, thank you again for your help! I've been stuck on this for a week and this finally got me over the hump. Thank you!

    ORIGINAL POST:
    Thank you so much for this information! I tried blitting into a rendertexture, but was using a different texture format. I will give that a try and see what results we get.

    Any chance you'd be willing to share the shader you used for rotating? I've been using the shader provided with the DisplayDepthImage example, with some modifications. This may work with the change you recommended, but if not it would be awesome if you were able to share. If not, I definitely understand.

    Thanks for your help!
     
    Last edited: Aug 25, 2020
  44. Tomzai

    Tomzai

    Joined:
    Oct 12, 2015
    Posts:
    7
    No problem. This is the shader I've been using; pass Screen.orientation into "_Orientation":

    Code (CSharp):
    Shader "Unlit/CopyShader"
    {
        SubShader
        {
            Tags { "RenderType"="Opaque" }
            LOD 100
            ZTest Always
            Cull Off
            ZWrite Off

            Pass
            {
                Name "Unlit"

                HLSLPROGRAM
                #pragma vertex vert
                #pragma fragment frag

                #include "UnityCG.cginc"

                struct appdata
                {
                    float4 vertex : POSITION;
                    float2 uv : TEXCOORD0;
                };

                struct v2f
                {
                    float2 uv : TEXCOORD0;
                    float4 vertex : SV_POSITION;
                };

                Texture2D _MainTex;
                SamplerState sampler_MainTex;
                float4 _MainTex_ST;
                int _Orientation;

                v2f vert (appdata v)
                {
                    v2f o;
                    o.vertex = UnityObjectToClipPos(v.vertex);

                    // Flip X
                    o.uv = float2(1.0 - v.uv.x, v.uv.y);

                    if (_Orientation == 1) {
                        // Portrait
                        o.uv = float2(1.0 - o.uv.y, o.uv.x);
                    }
                    else if (_Orientation == 3) {
                        // Landscape left
                        o.uv = float2(1.0 - o.uv.x, 1.0 - o.uv.y);
                    }

                    o.uv = TRANSFORM_TEX(o.uv, _MainTex);
                    return o;
                }

                fixed4 frag (v2f i) : SV_Target
                {
                    return _MainTex.Sample(sampler_MainTex, i.uv);
                }
                ENDHLSL
            }
        }
    }
     
    eco_bach and j0schm03 like this.
  45. j0schm03

    j0schm03

    Joined:
    Jun 21, 2018
    Posts:
    16
  46. DreamPower

    DreamPower

    Joined:
    Apr 2, 2017
    Posts:
    103
    Right now it would work, because only LiDAR devices support creating depth images in ARKit (or mesh reconstruction, for that matter). But that's not guaranteed for the future; it's possible that the next ARKit could add depth sensing and mesh reconstruction via the normal camera for older devices.

    Another option is to check the device model against devices you know have a LiDAR. For that you would parse the string returned from SystemInfo.deviceModel.
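
    A minimal sketch of both checks (the descriptor check uses the supportsEnvironmentDepthImage property mentioned earlier in this thread; the deviceModel handling is illustrative and not an exhaustive LiDAR device list):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;

    public class DepthSupportCheck : MonoBehaviour
    {
        [SerializeField] AROcclusionManager m_OcclusionManager;

        void Start()
        {
            // Option 1: ask the occlusion subsystem whether environment depth
            // is available on this device (forward-compatible).
            bool supportsDepth =
                m_OcclusionManager.descriptor?.supportsEnvironmentDepthImage ?? false;

            // Option 2: inspect the hardware model string (e.g. "iPad8,12")
            // and compare it against models known to have a LiDAR sensor.
            string model = SystemInfo.deviceModel;

            Debug.Log($"Environment depth supported: {supportsDepth}, device model: {model}");
        }
    }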
     
    Last edited: Aug 29, 2020
  47. j0schm03

    j0schm03

    Joined:
    Jun 21, 2018
    Posts:
    16
    Thanks!
     
  48. prono82

    prono82

    Joined:
    Jan 22, 2019
    Posts:
    4
    Do the environment depth features of AR Foundation 4.1.0-preview.7 work with the latest iPadOS 14 beta 7 and Xcode 12.0 beta 6?
    I'm worried that problems similar to those above will occur.
     
  49. serjightu

    serjightu

    Joined:
    Mar 1, 2019
    Posts:
    4
    If I use the ARCore Depth API in my project, I cannot see the entire 3D model. I see only part of the 3D model and the rest is hidden. How can I make the entire 3D model completely visible?
     
  50. unity_8OaaYiLvKPqJSg

    unity_8OaaYiLvKPqJSg

    Joined:
    Dec 19, 2018
    Posts:
    1
    Hello! I've started testing the Depth API, but I have the same problem as j0schm03: I couldn't retrieve depth values.
    I tried to understand the tagged post, but I'm not sure how you did it.

    I tried to convert the Texture2D using ARGBFloat and GetPixels on the converted texture, but I'm still getting values between 0 and 1.

    What am I doing wrong?

    Thanks in advance!
     