Post Processing + ARKit... some depth buffer glitch bug?

Discussion in 'Image Effects' started by Aaron-Meyers, Apr 30, 2019.

  1. Aaron-Meyers

    Aaron-Meyers

    Joined:
    Dec 8, 2009
    Posts:
    305
    I’m surprised I couldn’t find a thread about this... when I try to use any post-processing in my ARKit scene, some really messed up stuff starts happening. I can’t quite tell what’s going on, but it seems like some kind of issue with the depth buffer.

    Here is a video showing what I'm seeing. The camera just has a simple Post Processing Profile with only Bloom turned on. The cubes are just spawned from touching the plane and they have a simple Standard shader material on them.


    The project is on Unity 2018.3.9f1 with the latest version of the Post Processing Stack from the Package Manager (2.1.6).
     
  2. Aaron-Meyers

    Aaron-Meyers

    Joined:
    Dec 8, 2009
    Posts:
    305
    Seems to me that this issue must stem from how the UnityARVideo script sets up its CommandBuffer.
    Code (CSharp):
    m_VideoCommandBuffer.Blit(null, BuiltinRenderTextureType.CurrentActive, m_ClearMaterial);
    In a scene without the Post Processing Stack, a camera's CurrentActive render texture is whatever back buffer is set up in the scene, but whatever render-texture voodoo the post-processing stack is doing seems to conflict with how this works.

    Perhaps calling CommandBuffer.SetRenderTarget before the Blit could resolve this, but I don't really know what to set the render target to.
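    For anyone hitting this later, here is an untested sketch of one possible workaround: instead of blitting to BuiltinRenderTextureType.CurrentActive, target BuiltinRenderTextureType.CameraTarget, which resolves each frame to whatever the camera is actually rendering into (including the post-processing stack's intermediate texture). The field names (m_VideoCommandBuffer, m_ClearMaterial, m_camera) are assumed to match the stock UnityARVideo.cs; only the blit destination changes.

    Code (CSharp):
    // Sketch of a possible change inside UnityARVideo's command buffer setup.
    // Only the blit target differs from the shipped script.
    m_VideoCommandBuffer = new CommandBuffer();
    // CameraTarget resolves to the camera's current destination each frame,
    // so the background blit should land in the same texture the
    // post-processing stack reads from, rather than a stale CurrentActive binding.
    m_VideoCommandBuffer.Blit(null, BuiltinRenderTextureType.CameraTarget, m_ClearMaterial);
    m_camera.AddCommandBuffer(CameraEvent.BeforeForwardOpaque, m_VideoCommandBuffer);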

    If anyone has any ideas about how to deal with this issue, I'd very much appreciate it!
     
  3. lebo47

    lebo47

    Joined:
    Oct 18, 2013
    Posts:
    9
    Did you ever resolve this? Do image effects just not work with ARKit or ARCore?