
Unity Environmental Occlusion and Depth in ARFoundation

Discussion in 'AR' started by todds_unity, Jun 24, 2020.

  1. todds_unity

    todds_unity

    Unity Technologies

    Joined:
    Aug 1, 2018
    Posts:
    168
    KnewK likes this.
  2. thorikawa

    thorikawa

    Joined:
    Dec 3, 2013
    Posts:
    10
On iOS, does environmental occlusion automatically consider the confidence map provided by ARKit? If so, is there any way to set the threshold for it?
     
  3. todds_unity

    todds_unity

    Unity Technologies

    Joined:
    Aug 1, 2018
    Posts:
    168
    The AR Foundation background shader does not consider the confidence map.

You would need to write your own custom shader to achieve this behavior.
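
    For anyone looking for a starting point, here is a rough sketch of how you might feed the confidence map into such a shader. This assumes your ARFoundation version exposes AROcclusionManager.environmentDepthConfidenceTexture; the material and shader property names below are placeholders for your own custom shader, not part of AR Foundation:

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;

    public class ConfidenceMaskedOcclusion : MonoBehaviour
    {
        [SerializeField] AROcclusionManager occlusionManager;
        [SerializeField] Material customOcclusionMaterial;       // your own occlusion/background shader
        [Range(0, 2)] [SerializeField] int minConfidence = 1;    // ARKit confidence: 0 = low, 1 = medium, 2 = high

        void Update()
        {
            var depth = occlusionManager.environmentDepthTexture;
            var confidence = occlusionManager.environmentDepthConfidenceTexture;
            if (depth == null || confidence == null)
                return;

            // The custom shader would sample both textures and discard (or ignore)
            // depth samples whose confidence is below _MinConfidence.
            customOcclusionMaterial.SetTexture("_EnvironmentDepth", depth);
            customOcclusionMaterial.SetTexture("_EnvironmentDepthConfidence", confidence);
            customOcclusionMaterial.SetFloat("_MinConfidence", minConfidence);
        }
    }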
     
    Last edited: Sep 26, 2020
    thorikawa likes this.
  4. dimitris_baud

    dimitris_baud

    Joined:
    Jan 15, 2018
    Posts:
    27
    For those spending time trying to understand why Depth is not working for them on the iPad Pro LiDAR / ARKit / iOS:

    It seems to be broken on `AR Foundation 4.1.0-preview9` and `AR Foundation 4.1.0-preview10`.

    It works on `AR Foundation 4.1.0-preview5`.

    Maybe it works on some in-between versions too.
     
  5. dimitris_baud

    dimitris_baud

    Joined:
    Jan 15, 2018
    Posts:
    27
    Is anyone else having issues with this specifically on iOS (and iPadOS) 14.0.1?
     
  6. KirillKuzyk

    KirillKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    343
    Just tested this scene in all these setups and everything seems to be working fine:
    iPad running iOS 14.0.1
    Unity 2019.4.12/2020.1.8
    AR Foundation 4.1.0-preview.7/4.1.0-preview.9/4.1.0-preview.10
    macOS 10.15.5

    Could you please share what exactly is not working and describe your setup? Have you tried the DepthImages scene from samples?
     
  7. MassiveTchnologies

    MassiveTchnologies

    Joined:
    Jul 5, 2016
    Posts:
    60
    For some reason human occlusion doesn't work on iOS 14.0 and iPhone 11 Pro Max. That's very strange.
     
  8. tdmowrer

    tdmowrer

    Unity Technologies

    Joined:
    Apr 21, 2017
    Posts:
    593
    This feature works on my iPhone 11 Pro running iOS 14.0.1 and ARFoundation 4.1.0-preview.11. Can you post the exact versions of iOS and ARFoundation you are using?
     
  9. MassiveTchnologies

    MassiveTchnologies

    Joined:
    Jul 5, 2016
    Posts:
    60
    I tried again on an iPhone 11 Pro Max with ARFoundation 4.1.0-preview.11 and iOS 14.1, and human occlusion still doesn't work; it could be an issue with the project / package / temp files. Human occlusion does work on an iPad Pro (with LiDAR) running iOS 14.1, though, with the same exact build.
     
  10. tdmowrer

    tdmowrer

    Unity Technologies

    Joined:
    Apr 21, 2017
    Posts:
    593
    The two devices have different capabilities, and depending on what other features you have requested, the iPhone 11 could be choosing a different configuration. In a development build, there is debug output (look in the Xcode console) that tells you what features were requested, which are enabled, and which could not be satisfied. This should help you identify the issue.
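
    If it helps, you can also log this from a script. Here is a small sketch that compares what was requested against what the subsystem actually enabled, using the requested/current properties on AROcclusionManager:

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;
    using UnityEngine.XR.ARSubsystems;

    public class OcclusionModeLogger : MonoBehaviour
    {
        [SerializeField] AROcclusionManager occlusionManager;

        EnvironmentDepthMode m_LastEnvironmentDepthMode = (EnvironmentDepthMode)(-1);

        void Update()
        {
            // The current* values only reflect reality once the session has picked
            // a configuration, so poll and log whenever they change.
            var current = occlusionManager.currentEnvironmentDepthMode;
            if (current == m_LastEnvironmentDepthMode)
                return;

            m_LastEnvironmentDepthMode = current;
            Debug.Log($"Environment depth requested: {occlusionManager.requestedEnvironmentDepthMode}, current: {current}");
            Debug.Log($"Human stencil requested: {occlusionManager.requestedHumanStencilMode}, current: {occlusionManager.currentHumanStencilMode}");
            Debug.Log($"Human depth requested: {occlusionManager.requestedHumanDepthMode}, current: {occlusionManager.currentHumanDepthMode}");
        }
    }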
     
  11. MassiveTchnologies

    MassiveTchnologies

    Joined:
    Jul 5, 2016
    Posts:
    60
    Oh thanks, didn't know that! I will give that a try.
     
  12. MassiveTchnologies

    MassiveTchnologies

    Joined:
    Jul 5, 2016
    Posts:
    60
    I found another issue, this time on Android. If the AR Occlusion Manager's environment depth mode is enabled at the start of the app (from the Inspector) and then set to Disabled from a script, environment occlusion remains on. There has to be a way to enable/disable environment occlusion at runtime, since many devices can't handle it without big dips in FPS.

    Code (CSharp):
    arom.requestedEnvironmentDepthMode = UnityEngine.XR.ARSubsystems.EnvironmentDepthMode.Disabled;
     
  13. tdmowrer

    tdmowrer

    Unity Technologies

    Joined:
    Apr 21, 2017
    Posts:
    593
    I've just tested this exact scenario on a Pixel 3 and it works as expected. It also works to toggle the AROcclusionManager's enabled property.
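
    For reference, both approaches look something like this (a minimal sketch; occlusionManager here is a reference to the AROcclusionManager in your scene):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;
    using UnityEngine.XR.ARSubsystems;

    public class OcclusionToggle : MonoBehaviour
    {
        [SerializeField] AROcclusionManager occlusionManager;

        // Hook this up to a UI toggle, for example.
        public void SetEnvironmentOcclusion(bool on)
        {
            // Request a different environment depth mode...
            occlusionManager.requestedEnvironmentDepthMode =
                on ? EnvironmentDepthMode.Fastest : EnvironmentDepthMode.Disabled;

            // ...or, alternatively, toggle the manager component itself:
            // occlusionManager.enabled = on;
        }
    }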

    Can you provide more details? Exact versions of ARFoundation, ARCore XR Plugin, device, and OS?
     
  14. dimitris_baud

    dimitris_baud

    Joined:
    Jan 15, 2018
    Posts:
    27
    @tdmowrer is there a way for the "Depth" texture that comes from the `OcclusionManager` to be updated every frame, and not only when the "Human Stencil" is not blank/black? Or am I just doing something wrong?

    Please see this video to see what I mean. Please note that this is using URP and the default Camera Background Material.

    OcclusionManager settings:
    - Environment Depth: Best
    - Human Stencil: Best
    - Human Depth: Best
    - Preference: Prefer Environment Occlusion
     
    Last edited: Nov 4, 2020
  15. Rich_XR

    Rich_XR

    Joined:
    Jan 28, 2020
    Posts:
    5
    Hey, wondering if you solved this at all? I am trying to work only with points within a set distance around the device, and I need access to the point location data in 3D space...
     
  16. j0schm03

    j0schm03

    Joined:
    Jun 21, 2018
    Posts:
    14
    First you'll need to generate a RenderTexture and Texture2D that you can reuse each frame, using something like this:

    Code (CSharp):
    depthRT = new RenderTexture(Screen.width, Screen.height, 24, RenderTextureFormat.ARGBFloat);
    envDepth = new Texture2D(Screen.width, Screen.height, TextureFormat.RGBAFloat, false);
    Then you can Blit to the RenderTexture and read it back into the Texture2D:

    Code (CSharp):
    Graphics.Blit(m_OcclusionManager.environmentDepthTexture, depthRT, depthGradientMaterial);
    envDepth.ReadPixels(new Rect(0, 0, depthRT.width, depthRT.height), 0, 0);
    envDepth.Apply();
    Lastly, you can use GetPixel to read the value from the pixel of your choosing. You'll also need the depthGradientMaterial, which uses a custom shader that Tomzai provided:

    Code (CSharp):
    Shader "Unlit/CopyShader"
    {
        SubShader
        {
            Tags { "RenderType"="Opaque" }
            LOD 100
            ZTest Always
            Cull Off
            ZWrite Off

            Pass
            {
                Name "Unlit"

                HLSLPROGRAM
                #pragma vertex vert
                #pragma fragment frag

                #include "UnityCG.cginc"

                struct appdata
                {
                    float4 vertex : POSITION;
                    float2 uv : TEXCOORD0;
                };

                struct v2f
                {
                    float2 uv : TEXCOORD0;
                    float4 vertex : SV_POSITION;
                };

                Texture2D _MainTex;
                SamplerState sampler_MainTex;
                float4 _MainTex_ST;
                int _Orientation;

                v2f vert (appdata v)
                {
                    v2f o;
                    o.vertex = UnityObjectToClipPos(v.vertex);

                    // Flip X
                    o.uv = float2(1.0 - v.uv.x, v.uv.y);

                    if (_Orientation == 1) {
                        // Portrait
                        o.uv = float2(1.0 - o.uv.y, o.uv.x);
                    }
                    else if (_Orientation == 3) {
                        // Landscape left
                        o.uv = float2(1.0 - o.uv.x, 1.0 - o.uv.y);
                    }

                    o.uv = TRANSFORM_TEX(o.uv, _MainTex);
                    return o;
                }

                fixed4 frag (v2f i) : SV_Target
                {
                    return _MainTex.Sample(sampler_MainTex, i.uv);
                }
                ENDHLSL
            }
        }
    }
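    Note that the shader reads an _Orientation value, which is presumably meant to be kept in sync with Screen.orientation before the Blit (ScreenOrientation.Portrait is 1 and LandscapeLeft is 3), e.g.:

    Code (CSharp):
    // Keep the copy shader's rotation in sync with the current screen orientation.
    depthGradientMaterial.SetInt("_Orientation", (int)Screen.orientation);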
    We actually ended up writing a separate shader from this, but try this out.

    I'm sure there are other ways of achieving this, but this works for us!

    One last bonus: add your occlusion manager to another game object, separate from your AR Camera, if you don't want the occlusion to show and you only want to obtain the depth texture and values :)
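
    And since the question above was about 3D positions: once you have the screen-aligned depth copy, one rough way to turn a pixel into a world-space point is to unproject it with the AR camera. This is a sketch, assuming the red channel holds eye-space depth in meters (which is how the LiDAR depth map is reported):

    Code (CSharp):
    using UnityEngine;

    public static class DepthSampling
    {
        // x, y are screen pixel coordinates; screenAlignedDepth is the Texture2D
        // produced by the Blit/ReadPixels snippet above.
        public static Vector3 ScreenPixelToWorld(Camera arCamera, Texture2D screenAlignedDepth, int x, int y)
        {
            float depthMeters = screenAlignedDepth.GetPixel(x, y).r;

            // ScreenToWorldPoint treats z as the distance from the camera plane,
            // which matches the eye-space depth stored in the texture.
            return arCamera.ScreenToWorldPoint(new Vector3(x, y, depthMeters));
        }
    }
    From there, limiting yourself to points within a set distance of the device is just a comparison on depthMeters (or on the distance between the returned point and the camera position).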
     
    Last edited: Nov 30, 2020 at 9:50 PM