
AOV Image Sequence for Depth Maps formula used

Discussion in 'High Definition Render Pipeline' started by dusansvilarkovic, Jun 14, 2021.

  1. dusansvilarkovic

    dusansvilarkovic

    Joined:
    Jun 8, 2021
    Posts:
    9
    Hi,
    I am using the Perception package's camera with the HDRP pipeline to capture depth maps from my images, and I would like to know which projection you use when calculating per-pixel relative depth. I need this because I want to convert the [0, 1] values (or whichever range you use; it seems to be [0, 127]) to [0, max_camera_distance] values for further training on these images.

    From the first two formulas in the Mathematics section of Z-buffering - Wikipedia, I gather that the far and near values are the clipping-plane values shown in the Perception Camera Inspector:

    upload_2021-6-14_21-21-4.png

    But I am worried that the Depth term here might cause me problems when calculating:

    upload_2021-6-14_21-21-31.png

    Given that my camera uses a perspective projection, can I rely on this Wikipedia formula for inverting
    relative depth values back to absolute distances:

    upload_2021-6-14_21-22-46.png

    Or do you have your own implementation of this formula?
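    For reference, the formulas I mean (with near plane n, far plane f, and z' the depth-buffer value in [-1, 1]) are, as I read them on that Wikipedia page:

        z' = \frac{f+n}{f-n} + \frac{1}{z}\left(\frac{-2fn}{f-n}\right)

    and the inversion back to eye-space distance:

        z = \frac{2fn}{f + n - z'(f-n)}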
     
    Last edited: Jun 14, 2021
  2. PavlosM

    PavlosM

    Unity Technologies

    Joined:
    Oct 8, 2019
    Posts:
    31
    As you probably know, in computer graphics we typically do the projection math using a 4x4 matrix.
    You can find more information about the projection matrices in Unity here:
    https://docs.unity3d.com/2021.2/Documentation/ScriptReference/Matrix4x4.Perspective.html

    In your code you can get the projection matrix from a camera by accessing the projectionMatrix member, as you can see here:
    https://docs.unity3d.com/2021.2/Documentation/ScriptReference/Camera-projectionMatrix.html
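    To illustrate the structure (a minimal Python sketch, not Unity C#; `perspective` and `clip_planes_from_projection` are hypothetical helpers, not Unity APIs), the OpenGL-convention matrix that Matrix4x4.Perspective produces encodes the clip planes in its third row, so you can even recover near/far from the matrix entries:

```python
import math

def perspective(fov_deg, aspect, near, far):
    # OpenGL-convention perspective projection matrix (written here
    # as a row-major list of rows), mirroring the layout that
    # Matrix4x4.Perspective documents.
    t = 1.0 / math.tan(math.radians(fov_deg) / 2.0)
    return [
        [t / aspect, 0.0, 0.0, 0.0],
        [0.0, t, 0.0, 0.0],
        [0.0, 0.0, -(far + near) / (far - near), -2.0 * far * near / (far - near)],
        [0.0, 0.0, -1.0, 0.0],
    ]

def clip_planes_from_projection(m):
    # Invert A = -(f+n)/(f-n) and B = -2fn/(f-n):
    #   near = B / (A - 1),  far = B / (A + 1)
    A, B = m[2][2], m[2][3]
    return B / (A - 1.0), B / (A + 1.0)

m = perspective(60.0, 16.0 / 9.0, 0.3, 1000.0)
near, far = clip_planes_from_projection(m)  # recovers ~0.3 and ~1000
```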

    The AOV API should give you the raw depth values from the graphics API; it does not do any remapping. However, the values might be quantized (and clamped) to 8 bits if you save them to non-float textures. I'm not familiar with what the Perception package does on top of the AOV API, though, so there might be a limitation there.
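    Assuming those raw values use the standard hyperbolic [0, 1] encoding (0 at the near plane), the inversion to eye-space distance can be sketched like this (Python for illustration; `linearize_depth` is a hypothetical helper, not a Unity API):

```python
def linearize_depth(d, near, far):
    """Convert a raw non-linear [0, 1] depth-buffer value (0 = near
    plane, 1 = far plane) to eye-space distance for a perspective
    projection. For a reversed-Z buffer (1 = near, 0 = far), which
    HDRP uses on modern GPUs, pass 1 - d instead.
    """
    # Derived by inverting z_ndc = f/(f-n) - f*n / ((f-n) * z_eye)
    return near * far / (far - d * (far - near))

# With near = 1 and far = 100:
print(linearize_depth(0.0, 1.0, 100.0))  # 1.0   (near plane)
print(linearize_depth(1.0, 1.0, 100.0))  # 100.0 (far plane)
```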

    Also note that the debug view for Depth in HDRP has a mode that remaps the depth values to a range for visualization purposes:
    upload_2021-6-16_0-45-40.png
    You can check the code used to do this in the DebugFullScreen.shader file in case you need to remap the values in your own code, but note that this remapping is not exposed in the AOV API.
     
    Last edited: Jun 16, 2021