How to reproject pixel coordinates with the correct depth into world coordinates

Discussion in 'Windows Mixed Reality' started by tfisiche, Nov 22, 2019.

  1. tfisiche

    Hi,

    I'm trying to reproject pixel coordinates into world coordinates using PhotoCaptureFrame (from MRTK) on HoloLens.
    I get the projection and camera-to-world matrices from the TryGetProjectionMatrix() and TryGetCameraToWorldMatrix() functions of PhotoCaptureFrame.
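    For reference, this is roughly how I grab the two matrices in the capture callback (a simplified sketch; the class and callback names are just placeholders, and on older Unity versions the namespace is UnityEngine.XR.WSA.WebCam instead of UnityEngine.Windows.WebCam):

    Code (CSharp):

    using UnityEngine;
    using UnityEngine.Windows.WebCam;

    public class PhotoMatrixGrabber : MonoBehaviour
    {
        // Passed to PhotoCapture.TakePhotoAsync as the onCapturedPhotoToMemory callback.
        void OnCapturedPhotoToMemory(PhotoCapture.PhotoCaptureResult result, PhotoCaptureFrame frame)
        {
            // Pose of the photo camera at capture time (camera space -> world space).
            if (!frame.TryGetCameraToWorldMatrix(out Matrix4x4 cameraToWorld))
            {
                Debug.LogWarning("No camera-to-world matrix available for this frame.");
                return;
            }

            // Intrinsics of the photo camera packed into a projection matrix.
            if (!frame.TryGetProjectionMatrix(out Matrix4x4 projection))
            {
                Debug.LogWarning("No projection matrix available for this frame.");
                return;
            }

            // cameraToWorld and projection are then used for the reprojection further down.
        }
    }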
    The problem is that I get the depth from the Unity camera using a shader that samples _CameraDepthTexture. If I'm not mistaken, this gives me a texture with the depth of every pixel from the Unity camera's point of view, which is slightly different from the PhotoCaptureFrame position and rotation.
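    The Unity camera's depth texture is enabled like this (a minimal sketch; the shader itself just samples _CameraDepthTexture and converts the raw value with LinearEyeDepth to get meters):

    Code (CSharp):

    using UnityEngine;

    [RequireComponent(typeof(Camera))]
    public class EnableDepthTexture : MonoBehaviour
    {
        void Start()
        {
            // Ask Unity to render the depth buffer so _CameraDepthTexture
            // becomes available to shaders on this camera.
            GetComponent<Camera>().depthTextureMode = DepthTextureMode.Depth;
        }
    }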
    As a consequence, when I reproject the RGB pixels from the PhotoCaptureFrame using the Unity camera's depth, the result is not perfectly accurate.
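    For context, the unprojection itself is essentially the following (a simplified sketch; pixelX/pixelY, imageWidth/imageHeight and eyeDepthMeters are my own names, and the depth value currently comes from the Unity camera's depth texture rather than from the photo camera, which is exactly where the mismatch comes from):

    Code (CSharp):

    using UnityEngine;

    public static class PhotoReprojection
    {
        // Unprojects one photo pixel at a known eye depth (meters along the camera's
        // forward axis) into world space, using the PhotoCaptureFrame matrices.
        // Assumes the usual Unity convention: camera space looks down -Z and the
        // projection divides by -Z, so only m00, m11, m02 and m12 are needed.
        public static Vector3 PixelToWorld(
            int pixelX, int pixelY, int imageWidth, int imageHeight,
            float eyeDepthMeters, Matrix4x4 projection, Matrix4x4 cameraToWorld)
        {
            // Pixel -> normalized device coordinates in [-1, 1] (image rows grow downward).
            float ndcX = ((pixelX + 0.5f) / imageWidth) * 2f - 1f;
            float ndcY = 1f - ((pixelY + 0.5f) / imageHeight) * 2f;

            // Invert the intrinsics: ndcX = m00 * X / d - m02 and ndcY = m11 * Y / d - m12
            // for a camera-space point (X, Y, -d) at eye depth d.
            float x = (ndcX + projection.m02) * eyeDepthMeters / projection.m00;
            float y = (ndcY + projection.m12) * eyeDepthMeters / projection.m11;
            Vector3 cameraSpacePoint = new Vector3(x, y, -eyeDepthMeters);

            // Camera space -> world space with the pose of the photo camera.
            return cameraToWorld.MultiplyPoint(cameraSpacePoint);
        }
    }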
    What is the correct way to do this reprojection on HoloLens?