Question ARFoundation environment occlusion is glitchy

Discussion in 'AR' started by Marks4, Feb 20, 2021.

  1. Marks4

    Marks4

    Joined:
    Feb 25, 2018
    Posts:
    547
    At least when running on Android. Take a look at this video:
    . The model sinks into the ground, parts of it disappear and reappear, and sometimes the object isn't occluded immediately, or not all of it is occluded. Now compare with the ARCore Depth API demonstrations here:
    https://www.youtube.com/watch?v=VOVhCTb-1io and here

    None of these problems happen there. What's going on?
     
    Last edited: Feb 21, 2021
  2. Marks4

    Marks4

    Joined:
    Feb 25, 2018
    Posts:
    547
    Please help, I need to get decent-looking occlusion.
     
  3. Marks4

    Marks4

    Joined:
    Feb 25, 2018
    Posts:
    547
  4. TreyK-47

    TreyK-47

    Unity Technologies

    Joined:
    Oct 22, 2019
    Posts:
    1,821
    Hey there! Bounced this off of the team:

    Different Android devices produce depth images of varying quality. Some Android devices have time-of-flight sensors, which provide more accurate results, but most do not.
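    If you want to account for this variation at runtime, you can query the occlusion subsystem descriptor before relying on depth-based occlusion. A hedged sketch (the property names are from ARFoundation 4.x and the helper class name is hypothetical; verify against your installed version):

    Code (CSharp):
    ```
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;
    using UnityEngine.XR.ARSubsystems;

    // Hypothetical helper: logs whether the running device reports
    // support for environment depth at all.
    public class DepthSupportChecker : MonoBehaviour
    {
        [SerializeField] AROcclusionManager occlusionManager;

        void Start()
        {
            var descriptor = occlusionManager.descriptor;
            if (descriptor != null &&
                descriptor.environmentDepthImageSupported == Supported.Supported)
            {
                Debug.Log("Environment depth is supported on this device.");
            }
            else
            {
                Debug.Log("No environment depth support; occlusion will be degraded or unavailable.");
            }
        }
    }
    ```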
     
  5. Marks4

    Marks4

    Joined:
    Feb 25, 2018
    Posts:
    547
    @TreyK-47 Mr. Dan Miller from Unity said more about this on Discord:
    Is the smoothing feature coming along soon?
     
  6. ekyah411

    ekyah411

    Joined:
    Aug 12, 2019
    Posts:
    6
    I am making an application where an object needs to be viewed from a distance greater than 10 m, and with the default AROcclusionManager the object disappears at this distance.
    I read that the maximum depth is 8 m (correct me if I am wrong), so this is probably what caused the object's disappearance.
    At the moment I am trying to write a simple custom occlusion shader that roughly replicates what the AROcclusionManager does, using AROcclusionManager.environmentDepthTexture, so that I can, for example, make the object transparent when it is too far away.
    I was able to follow the DepthGradient.shader from the AR Foundation samples and could easily convert it to a simple surface shader that does the same thing.
    However, for the occlusion shader I am having trouble figuring out how to calculate the proper real and virtual depths.
    For now I have:
    float virtualDepth = -UnityWorldToViewPos(IN.worldPos).z; // depth of the virtual vertex
    float realDepth = GetDepthInMeter(_DepthTex, IN.screenPos.xy);
    But it's not right. Maybe some transformation is required to convert screenPos to the depth-map UV?
    What could I do?
    Any suggestions for alternative approaches are also welcome.

    Thanks a lot in advance
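    One likely issue with the snippet above: in a surface shader, `screenPos` (from `ComputeScreenPos`) is a homogeneous float4 and needs a perspective divide before it can be used as a texture UV. A hedged sketch of the comparison, reusing the post's own `GetDepthInMeter` helper (`_UnityDisplayTransform` is the matrix ARFoundation's background shaders use; whether you need it here is an assumption to verify):

    Code (HLSL):
    ```
    // Perspective divide: normalize homogeneous screen position to [0,1] UV.
    float2 uv = IN.screenPos.xy / IN.screenPos.w;

    // The depth texture is oriented in sensor space; ARFoundation's background
    // shader remaps it with a display transform. If occlusion looks rotated or
    // stretched, apply the same _UnityDisplayTransform to uv first.

    float realDepth    = GetDepthInMeter(_DepthTex, uv);          // helper from the post
    float virtualDepth = -UnityWorldToViewPos(IN.worldPos).z;     // eye-space depth in meters

    // 1 when the virtual fragment is in front of the real surface, 0 when occluded.
    float visibility = step(virtualDepth, realDepth);
    ```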
     
  7. ekyah411

    ekyah411

    Joined:
    Aug 12, 2019
    Posts:
    6
  8. todds_unity

    todds_unity

    Joined:
    Aug 1, 2018
    Posts:
    324
  9. Marks4

    Marks4

    Joined:
    Feb 25, 2018
    Posts:
    547
    @todds_unity What's this, a new feature? Is decent occlusion finally available?
     
  10. Marks4

    Marks4

    Joined:
    Feb 25, 2018
    Posts:
    547
    Last edited: Jan 10, 2022
  11. Stents

    Stents

    Joined:
    Jan 23, 2014
    Posts:
    18
    @Marks4 This was added in ARFoundation 4.2 if I remember correctly, so you may need to update if you are currently using an older version of ARFoundation.

    Regarding this option, though: I noticed that on most devices I have tested (mainly Samsung devices) it is not really supported. The environment depth map is point filtered rather than bilinearly filtered, which makes the occlusion edges look very jagged and basically unusable.

    This issue can be worked around with a custom material on the ARCameraBackground that does the bilinear filtering in the shader, which makes things look much better. However, this is not possible when using URP as far as I know, since URP requires its own custom background.

    Is this a known issue @todds_unity? Is there a suggested way of filtering the environment depth map when using URP?
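    For anyone attempting the workaround described above, manual bilinear filtering of a point-filtered texture can be done in the shader by sampling the four neighbouring texels and lerping. A hedged sketch (the texture and texel-size names are assumptions; Unity auto-populates `<name>_TexelSize` as (1/w, 1/h, w, h) for a declared texture):

    Code (HLSL):
    ```
    // Manual bilinear sample of a point-filtered depth texture.
    // texelSize.xy = (1/width, 1/height), matching Unity's _TexelSize convention.
    float SampleDepthBilinear(sampler2D depthTex, float2 uv, float4 texelSize)
    {
        float2 pixel = uv / texelSize.xy - 0.5;          // continuous pixel coords
        float2 f     = frac(pixel);                       // fractional offset
        float2 base  = (floor(pixel) + 0.5) * texelSize.xy; // center of lower-left texel

        float d00 = tex2D(depthTex, base).r;
        float d10 = tex2D(depthTex, base + float2(texelSize.x, 0)).r;
        float d01 = tex2D(depthTex, base + float2(0, texelSize.y)).r;
        float d11 = tex2D(depthTex, base + float2(texelSize.x, texelSize.y)).r;

        // Blend horizontally, then vertically.
        return lerp(lerp(d00, d10, f.x), lerp(d01, d11, f.x), f.y);
    }
    ```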
     
  12. todds_unity

    todds_unity

    Joined:
    Aug 1, 2018
    Posts:
    324
    URP does not require a custom background shader with AR Foundation.

    Also, the sampling of the depth texture is not affected by the camera background shader. The temporal smoothing, when enabled, is applied at the native ARCore level.
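    For reference, requesting the temporal smoothing mentioned above takes one line from a script. A minimal sketch (the `environmentDepthTemporalSmoothingRequested` property is from ARFoundation 4.2+; verify against your installed version, and note that whether smoothing actually runs depends on platform support):

    Code (CSharp):
    ```
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;

    // Hedged sketch: request temporal smoothing of the environment depth
    // texture, then check the read-only flag to see if it took effect.
    public class SmoothingToggle : MonoBehaviour
    {
        [SerializeField] AROcclusionManager occlusionManager;

        void Start()
        {
            occlusionManager.environmentDepthTemporalSmoothingRequested = true;
            Debug.Log($"Smoothing enabled: {occlusionManager.environmentDepthTemporalSmoothingEnabled}");
        }
    }
    ```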