
Official Environmental Occlusion and Depth in ARFoundation

Discussion in 'AR' started by todds_unity, Jun 24, 2020.

Thread Status:
Not open for further replies.
  1. todds_unity

    todds_unity

    Joined:
    Aug 1, 2018
    Posts:
    324
  2. thorikawa

    thorikawa

    Joined:
    Dec 3, 2013
    Posts:
    25
On iOS, does environmental occlusion automatically take into account the confidence map provided by ARKit? If so, is there any way to set the threshold for it?
     
  3. todds_unity

    todds_unity

    Joined:
    Aug 1, 2018
    Posts:
    324
    The AR Foundation background shader does not consider the confidence map.

You would need to write your own custom shader to achieve this behavior.
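
    That said, the confidence values can be read on the CPU, e.g. to pick your own threshold. A minimal sketch, assuming ARFoundation 4.1+ (where AROcclusionManager exposes TryAcquireEnvironmentDepthConfidenceCpuImage) and that m_OcclusionManager references the scene's AROcclusionManager:

    Code (CSharp):
    // Minimal sketch, assuming ARFoundation 4.1+: read ARKit's per-pixel depth
    // confidence (0 = low, 1 = medium, 2 = high) on the CPU and count pixels
    // that pass a threshold. Assumes a tightly packed single-channel 8-bit image.
    if (m_OcclusionManager.TryAcquireEnvironmentDepthConfidenceCpuImage(out var confidenceImage))
    {
        using (confidenceImage)
        {
            var plane = confidenceImage.GetPlane(0);
            int highConfidence = 0;
            for (int i = 0; i < plane.data.Length; i++)
            {
                if (plane.data[i] >= 2) // keep only high-confidence depth pixels
                    highConfidence++;
            }
            UnityEngine.Debug.Log($"{highConfidence} of {plane.data.Length} depth pixels are high confidence");
        }
    }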
     
    Last edited: Sep 26, 2020
  4. dimitris_baud

    dimitris_baud

    Joined:
    Jan 15, 2018
    Posts:
    31
    For those spending time trying to understand why Depth is not working for them on the iPad Pro Lidar / ARKit / iOS:

    It seems to be broken on `AR Foundation 4.1.0-preview9` and `AR Foundation 4.1.0-preview10`.

    It works on `AR Foundation 4.1.0-preview5`.

    Maybe it works on some in-between versions too.
     
  5. dimitris_baud

    dimitris_baud

    Joined:
    Jan 15, 2018
    Posts:
    31
    Is anyone else having issues with this specifically on iOS (and iPadOS) 14.0.1?
     
  6. KyryloKuzyk

    KyryloKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    1,145
    Just tested this scene in all these setups and everything seems to be working fine:
    iPad running iOS 14.0.1
    Unity 2019.4.12/2020.1.8
    AR Foundation 4.1.0-preview.7/4.1.0-preview.9/4.1.0-preview.10
    macOS 10.15.5

    Could you please share what exactly is not working and describe your setup? Have you tried the DepthImages scene from samples?
     
  7. MassiveTchnologies

    MassiveTchnologies

    Joined:
    Jul 5, 2016
    Posts:
    87
    For some reason human occlusion doesn't work on iOS 14.0 and iPhone 11 Pro Max. That's very strange.
     
  8. tdmowrer

    tdmowrer

    Joined:
    Apr 21, 2017
    Posts:
    605
    This feature works on my iPhone 11 Pro running iOS 14.0.1 and ARFoundation 4.1.0-preview.11. Can you post the exact versions of iOS and ARFoundation you are using?
     
  9. MassiveTchnologies

    MassiveTchnologies

    Joined:
    Jul 5, 2016
    Posts:
    87
    I tried again on iPhone 11 Pro Max with ARFoundation 4.1.0-preview.11 and iOS 14.1, human occlusion still doesn't work, could be an issue with the project / package / temp files. Human Occlusion does work on iPad Pro (with LiDAR) running iOS 14.1 though, same exact build.
     
  10. tdmowrer

    tdmowrer

    Joined:
    Apr 21, 2017
    Posts:
    605
    The two devices have different capabilities, and depending on what other features you have requested, the iPhone 11 could be choosing a different configuration. In a development build, there is debug output (look in the Xcode console) that tells you what features were requested, which are enabled, and which could not be satisfied. This should help you identify the issue.
     
  11. MassiveTchnologies

    MassiveTchnologies

    Joined:
    Jul 5, 2016
    Posts:
    87
    Oh thanks, didn't know that! I will give that a try.
     
  12. MassiveTchnologies

    MassiveTchnologies

    Joined:
    Jul 5, 2016
    Posts:
    87
I found another issue, this time on Android. If the AR Occlusion Manager's environment depth mode is enabled in the Inspector at the start of the app and is then set to Disabled from a script, environment occlusion remains on. There has to be a way to enable/disable environment occlusion at runtime, since many devices can't handle it without big dips in FPS.

    arom.requestedEnvironmentDepthMode = UnityEngine.XR.ARSubsystems.EnvironmentDepthMode.Disabled;
     
  13. tdmowrer

    tdmowrer

    Joined:
    Apr 21, 2017
    Posts:
    605
    I've just tested this exact scenario on a Pixel 3 and it works as expected. It also works to toggle the AROcclusionManager's enabled property.
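
    For reference, a minimal toggle sketch (assuming `occlusionManager` is wired to the scene's AROcclusionManager in the Inspector):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;
    using UnityEngine.XR.ARSubsystems;

    public class OcclusionToggle : MonoBehaviour
    {
        [SerializeField] AROcclusionManager occlusionManager;

        // Enable or disable environment occlusion at runtime via the requested mode.
        public void SetEnvironmentOcclusion(bool on)
        {
            occlusionManager.requestedEnvironmentDepthMode =
                on ? EnvironmentDepthMode.Best : EnvironmentDepthMode.Disabled;
        }

        // Alternatively, toggling the component itself also stops occlusion.
        public void SetOcclusionComponent(bool on)
        {
            occlusionManager.enabled = on;
        }
    }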

    Can you provide more details? Exact versions of ARFoundation, ARCore XR Plugin, device, and OS?
     
  14. dimitris_baud

    dimitris_baud

    Joined:
    Jan 15, 2018
    Posts:
    31
@tdmowrer is there a way for the "Depth" texture that comes from the `OcclusionManager` to be updated every frame, and not only when the "Human Stencil" is not blank/black? Or am I just doing something wrong?

    Please see this video to see what I mean. Please note that this is using URP and the default Camera Background Material.

    OcclusionManager settings:
    - Environment Depth: Best
    - Human Stencil: Best
    - Human Depth: Best
    - Preference: Prefer Environment Occlusion
     
    Last edited: Nov 4, 2020
  15. Rich_XR

    Rich_XR

    Joined:
    Jan 28, 2020
    Posts:
    9
Hey, wondering if you ever solved this? I am trying to work only with points within a set distance of the device, so I need access to the point location data in 3D space....
     
  16. j0schm03

    j0schm03

    Joined:
    Jun 21, 2018
    Posts:
    16
    First you'll need to generate a Render Texture and Texture2D that you can reuse each frame using something like this:

    Code (CSharp):
    depthRT = new RenderTexture(Screen.width, Screen.height, 24, RenderTextureFormat.ARGBFloat);
    envDepth = new Texture2D(Screen.width, Screen.height, TextureFormat.RGBAFloat, false);
    Then you can Blit to the RenderTexture:

    Code (CSharp):
    Graphics.Blit(m_OcclusionManager.environmentDepthTexture, depthRT, depthGradientMaterial);
    envDepth.ReadPixels(new Rect(0, 0, depthRT.width, depthRT.height), 0, 0);
    envDepth.Apply();
Lastly, you can use GetPixel to get the value from the pixel of your choosing. You'll also need the depthGradientMaterial, which uses a custom shader that Tomzai provided:

    Code (CSharp):
    Shader "Unlit/CopyShader"
    {
        SubShader
        {
            Tags { "RenderType"="Opaque" }
            LOD 100
            ZTest Always
            Cull Off
            ZWrite Off

            Pass
            {
                Name "Unlit"

                HLSLPROGRAM
                #pragma vertex vert
                #pragma fragment frag

                #include "UnityCG.cginc"

                struct appdata
                {
                    float4 vertex : POSITION;
                    float2 uv : TEXCOORD0;
                };

                struct v2f
                {
                    float2 uv : TEXCOORD0;
                    float4 vertex : SV_POSITION;
                };

                Texture2D _MainTex;
                SamplerState sampler_MainTex;
                float4 _MainTex_ST;
                int _Orientation;

                v2f vert (appdata v)
                {
                    v2f o;
                    o.vertex = UnityObjectToClipPos(v.vertex);

                    // Flip X
                    o.uv = float2(1.0 - v.uv.x, v.uv.y);

                    if (_Orientation == 1) {
                        // Portrait
                        o.uv = float2(1.0 - o.uv.y, o.uv.x);
                    }
                    else if (_Orientation == 3) {
                        // Landscape left
                        o.uv = float2(1.0 - o.uv.x, 1.0 - o.uv.y);
                    }

                    o.uv = TRANSFORM_TEX(o.uv, _MainTex);
                    return o;
                }

                fixed4 frag (v2f i) : SV_Target
                {
                    return _MainTex.Sample(sampler_MainTex, i.uv);
                }
                ENDHLSL
            }
        }
    }
    We actually ended up writing a separate shader from this, but try this out.

    I'm sure there are other ways of achieving this, but this works for us!

One last bonus: add your occlusion manager to a separate game object, apart from your AR Camera, if you don't want the occlusion to render and you only want to obtain the depth texture and values :)
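
    For the distance itself: since the copy shader just copies the single-channel depth texture, the depth in meters should end up in the red channel. A hedged sketch of reading it, assuming `envDepth` from the snippet above and an `arCamera` reference (the camera reference is an assumption, not part of the original code):

    Code (CSharp):
    // Sketch: read the depth (in meters) at the screen center from the CPU copy.
    int x = envDepth.width / 2;
    int y = envDepth.height / 2;
    float depthMeters = envDepth.GetPixel(x, y).r;

    // Optionally reproject to a 3D point; arCamera is assumed to be the AR Camera.
    Vector3 worldPoint = arCamera.ScreenToWorldPoint(new Vector3(x, y, depthMeters));
    Debug.Log($"Depth: {depthMeters} m, world point: {worldPoint}");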
     
    Last edited: Nov 30, 2020
  17. FOKSlab

    FOKSlab

    Joined:
    Jun 27, 2017
    Posts:
    30
    Hi,

I'm trying to get XRCpuImages from my iPad Pro with LiDAR, but TryAcquireEnvironmentDepthCpuImage always returns false. I'm using the ARFoundation/ARKit packages 4.1.1 with Unity 2020.2.0b11 and an iPad Pro on iOS 13.4 (I will try upgrading to iOS 14.2).

    I followed the ARFoundation sample scripts to get and convert the raw images (for camera, depth and confidence) but I can only get RGB camera images.

Any idea how to make this work?
     
  18. FOKSlab

    FOKSlab

    Joined:
    Jun 27, 2017
    Posts:
    30
It turns out the iOS version was the source of the problem! I hope this helps anyone who runs into the same issue.
Regards
     
  19. mb13admin

    mb13admin

    Joined:
    May 28, 2017
    Posts:
    22
    Hi tdmowrer,
I'm using Unity 2020.1.16 with the AR Foundation/Subsystems/ARCore/ARKit plugins 4.1.1,
and macOS Big Sur with Xcode 12.2.
However, neither Environment Occlusion nor Human Occlusion works on an iPhone 12 Pro Max running iOS 14.
The project's occlusion works on Android on a Pixel 2 and a Galaxy S9.
    Yesterday, I tried to use ARFoundation 4.1.1 on Unity 2019.4.15 LTS and it worked on iPhone 12 Pro Max.


    OcclusionManager settings:
    - Environment Depth: Best
    - Human Stencil: Best
    - Human Depth: Best
    - Preference: Prefer Environment Occlusion
     
  20. karandeep_nagarro

    karandeep_nagarro

    Joined:
    Sep 7, 2018
    Posts:
    6
    Hi Everyone,

I've read the entire thread and am still wondering: can we capture the 3D reconstruction with its texture and save it locally (as seen in videos posted on YouTube), using LiDAR and Unity?

I have to start working on a PoC using an iPad + LiDAR, and this information will help me decide whether I can work in Unity or will have to go native!

    Thanks in advance
     
  21. PhantomTech_Loui

    PhantomTech_Loui

    Joined:
    Dec 21, 2020
    Posts:
    4
    Hoya,

    I'm having the same issue as others regarding CPU images on the new iPad Pro. Accessing RGB is fine but I can't get a copy of the depth texture from the occlusion manager (despite the fact occlusion works normally for rendering).

    iOS 14.3 (18C66)
    Xcode 12.2 (12B45b)

    AR Foundation 4.1.1
    ARKit XR Plugin 4.1.1
    ARKit Face Tracking 4.1.1
    ARCore XR Plugin 4.1.1
    XR Plugin Management 3.2.16

    Unity 2019.4.9f1

    Please help! D':
     
    Last edited: Dec 22, 2020
  22. KyryloKuzyk

    KyryloKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    1,145
    Could you please share your code?
    Here is an example I made that shows how to access the depth texture on CPU:
    Code (CSharp):
    var occlusionManager = UnityEngine.Object.FindObjectOfType<UnityEngine.XR.ARFoundation.AROcclusionManager>(); // todo cache
    if (occlusionManager.descriptor != null && occlusionManager.descriptor.supportsEnvironmentDepthImage && occlusionManager.TryAcquireEnvironmentDepthCpuImage(out var cpuImage) && cpuImage.valid) {
        using (cpuImage) {
            // Convert the CPU image using the TextureFormat that matches its native format.
            var conversionParams = new UnityEngine.XR.ARSubsystems.XRCpuImage.ConversionParams(cpuImage, UnityEngine.XR.ARSubsystems.XRCpuImageFormatExtensions.AsTextureFormat(cpuImage.format));
            using (var rawData = new Unity.Collections.NativeArray<byte>(cpuImage.GetConvertedDataSize(conversionParams), Unity.Collections.Allocator.Temp)) {
                cpuImage.Convert(conversionParams, rawData);
                // Create the Texture2D with the converted format so LoadRawTextureData interprets the bytes correctly.
                var texture = new UnityEngine.Texture2D(conversionParams.outputDimensions.x, conversionParams.outputDimensions.y, conversionParams.outputFormat, false);
                texture.LoadRawTextureData(rawData);
                texture.Apply();
            }
        }
    }
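
    If you only need the raw values rather than a Texture2D, you can read them straight from the converted buffer. A sketch that would go inside the `using (rawData)` block above, assuming the depth image converts to a single-channel float format such as RFloat:

    Code (CSharp):
    // Sketch: index depth values directly; each float is the distance in meters
    // from the camera for one pixel, stored row by row.
    var depths = rawData.Reinterpret<float>(1); // 1 = size in bytes of the source element (byte)
    int width = conversionParams.outputDimensions.x;
    int height = conversionParams.outputDimensions.y;
    float centerDepth = depths[(height / 2) * width + (width / 2)];
    UnityEngine.Debug.Log($"Depth at image center: {centerDepth} m");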
     
  23. mtellezleon46

    mtellezleon46

    Joined:
    Dec 9, 2020
    Posts:
    4
Hi! How exactly do you get the distance value from the pixel to the device? How do you get that value from the R channel of the RGB...? Thanks for the explanation.
     
  24. PhantomTech_Loui

    PhantomTech_Loui

    Joined:
    Dec 21, 2020
    Posts:
    4
    Hi thanks for answering. We're using the CPU Images sample provided in the AR Foundation Samples repo. I will give your code a shot and report back if any issues.
     
  25. PhantomTech_Loui

    PhantomTech_Loui

    Joined:
    Dec 21, 2020
    Posts:
    4
    Hey I tried out your code but I'm still having the same issue. It's really baffling me because by all accounts this should work. I know the device has LiDAR and supports environment occlusion but for whatever reason ARFoundation just won't let me get at the depth image.

    I added a debug log to the code to help diagnose the issue:
    Code (CSharp):
    Debug.Log(
        (_occlusionManager.descriptor != null) + " " +
        _occlusionManager.descriptor.supportsEnvironmentDepthImage + " " +
        _occlusionManager.TryAcquireEnvironmentDepthCpuImage(out var cpuImage) + " " +
        cpuImage.valid
    );

    // OUTPUT: True True False False
    Looks like everything is in order but the TryAcquireEnvironmentDepthCpuImage() function isn't getting a depth map for whatever reason. Again, I know this device supports depth... Environment occlusion works as normal! This has to be a bug with ARFoundation.

     
  26. KyryloKuzyk

    KyryloKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    1,145
    Ok, your issue is connected to the AR Foundation Editor Remote plugin, not the code itself. And yes, this code should work on iPad Pro 2020.
    Unfortunately, my plugin doesn't currently support Occlusion CPU images, only camera CPU images.
     
  27. PhantomTech_Loui

    PhantomTech_Loui

    Joined:
    Dec 21, 2020
    Posts:
    4
    Bummer. Thanks for the clarification. Any chance this functionality could get patched in?
     
  28. KyryloKuzyk

    KyryloKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    1,145
Yes, I'm planning to add support for Occlusion CPU images, but I can't currently give a timeframe.
     
  29. PSST_Adam

    PSST_Adam

    Joined:
    Aug 26, 2020
    Posts:
    15
    @tdmowrer When requesting environment occlusion through the OcclusionManager (either with human segmentation on or off) the requested feature is never listed in the logs, and occlusion is not enabled. Any idea what would be causing this issue?

    I am running this on an iPhone 12 Pro Max with iOS 14.3, built from a MacBook Pro running Big Sur 11.1, and Unity 2020.1.17f1.
     
  30. danbrown99

    danbrown99

    Joined:
    Dec 14, 2017
    Posts:
    1
Does the Depth API help with detecting planes with low features, such as white walls? I know that the camera isn't able to find features as pixel clusters, but could the Depth API overcome this and recognise a vertical wall, regardless of feature points?
    Thanks
     
  31. KyryloKuzyk

    KyryloKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    1,145
It will help only on iOS devices with a LiDAR scanner. On Android, the Depth API still relies on the camera and is still subject to all the previous limitations.

Please check that you've selected 'Prefer Environment Occlusion' in the AROcclusionManager. Only one feature is supported at a time: either human or environment occlusion.
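
    For reference, the preference can also be set from a script; a one-line sketch (assuming `occlusionManager` references the AROcclusionManager):

    Code (CSharp):
    // Request environment occlusion over human occlusion.
    occlusionManager.requestedOcclusionPreferenceMode =
        UnityEngine.XR.ARSubsystems.OcclusionPreferenceMode.PreferEnvironmentOcclusion;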
     
  32. PSST_Adam

    PSST_Adam

    Joined:
    Aug 26, 2020
    Posts:
    15
    Yeah I had that selected. Looks like there was another component which was forcing it to be disabled, I have it working again now. Thanks.
     
  33. TreyK-47

    TreyK-47

    Unity Technologies

    Joined:
    Oct 22, 2019
    Posts:
    1,822
    Hi everyone, we're going to close this now outdated thread. If you have any additional questions, please post a new thread and we'll be happy to take a look. Thanks!
     