
Resolved: ARCameraBackground & raw image misalignment

Discussion in 'AR' started by xsmarty_unity, Nov 19, 2020.

  1. xsmarty_unity

    xsmarty_unity

    Joined:
    Sep 19, 2019
    Posts:
    1
    Hello,
    I have some AR Foundation questions.
    Despite reading extensively, I may have missed the answers if they are already on the forum.

    Q1 :
    I have noticed that on an iPad Pro 2020, the ARCameraBackground image and the raw camera image are not aligned. There is a stretch and crop factor once the raw image is superimposed.
    See here : https://pasteboard.co/JB7STLZ.png

    [Unity 2020.1.9, AR Foundation 4.1.0-preview.12, ARKit 4.1.0-preview.12]

    The question is how to map a pixel in viewport or screen coordinates in Unity to the (u, v) coordinates of the raw texture. The ARCameraBackground texture is center-stretched and cropped, and the amount seems to be device-dependent.

    Subsequently
    Q2 :
    UnityEngine.XR.ARFoundation.ARRaycastManager.Raycast(Input.mousePosition, .. , TrackableType.FeaturePoint) uses Unity screen coordinates, but it looks like it reads the raw coordinates from ARKit, so this function exhibits the same problem as Q1.

    Is there a pixel-perfect way to get the correct depth directly via a Unity helper?

    Q3 :
    I am unable to recreate WorldToViewportPoint() by hand on the iPad.

    Code (CSharp):
        // 1. Local/Camera Space <=> Local   (T1)
        // 2. World Space <=> Local          (T2)
        // 3. View Space                     (T3)
        // 4. Clip Space                     (T4)
        // 5. Screen Space or Viewport (pixels or 0-1)
        Vector3 worldToViewportPoint(Vector3 worldPoint)
        {
            // Vclip = M_projection x M_view x M_model x V_local
            Matrix4x4 GLprojection = Camera.main.projectionMatrix;
            // Note: transform.worldToLocalMatrix differs from
            // Camera.worldToCameraMatrix by a z-axis flip (Unity camera
            // space follows the OpenGL convention, forward = -z), which
            // is why the perspective divide below uses -w instead of +w.
            Matrix4x4 world2local = Camera.main.transform.worldToLocalMatrix;
            Matrix4x4 world2clip = GLprojection * world2local;

            Vector4 point4 = new Vector4(worldPoint.x, worldPoint.y, worldPoint.z, 1.0f);
            Vector4 clip_point = world2clip * point4;  // clip space (-1, +1)
            Vector3 viewport_point = clip_point;

            // Convert clip space (-1, 1) into Unity viewport space (0, 1).
            viewport_point /= -clip_point.w;  // perspective divide (see note above)
            viewport_point.x = viewport_point.x * 0.5f + 0.5f;
            viewport_point.y = viewport_point.y * 0.5f + 0.5f;
            viewport_point.z = -clip_point.w;  // depth along the view axis

            var unityVpPt = Camera.main.WorldToViewportPoint(worldPoint);
            Debug.Assert(viewport_point == unityVpPt,
                "World to viewport not equal, distance is: " + Vector3.Distance(viewport_point, unityVpPt));

            return viewport_point;
        }
    Seems to work in the Editor (Metal), but the assert fires on the iPad (Metal): the result is about 2-3 cm off for a point mid-screen projected at 1 m. The comments reflect my understanding.
    Any ideas why?

    Is there sample code available to recreate this by hand? Did I miss something?
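    For reference, here is the same clip-to-viewport math in plain NumPy, outside Unity, using the standard OpenGL conventions (camera looks down -z, perspective divide by +w; Unity's Camera.worldToCameraMatrix matches this, while transform.worldToLocalMatrix differs by a z-flip, which is where the -w in the C# comes from). The fov/aspect/near/far values are made up for illustration:

```python
import numpy as np

# Hand-built symmetric perspective matrix, OpenGL convention
# (camera looks down -z, NDC components in [-1, 1]).
def perspective(fov_y_deg, aspect, near, far):
    f = 1.0 / np.tan(np.radians(fov_y_deg) / 2.0)
    m = np.zeros((4, 4))
    m[0, 0] = f / aspect
    m[1, 1] = f
    m[2, 2] = (far + near) / (near - far)
    m[2, 3] = 2.0 * far * near / (near - far)
    m[3, 2] = -1.0
    return m

def world_to_viewport(proj, world_to_cam, p_world):
    p = np.append(p_world, 1.0)
    clip = proj @ world_to_cam @ p
    ndc = clip[:3] / clip[3]              # w = -z_view, positive in front
    return np.array([ndc[0] * 0.5 + 0.5,  # x remapped from [-1, 1] to [0, 1]
                     ndc[1] * 0.5 + 0.5,  # y likewise
                     clip[3]])            # depth along the view axis

# A point 1 m straight ahead of a camera at the origin lands at
# the centre of the viewport with depth 1.
proj = perspective(60.0, 9.0 / 16.0, 0.1, 100.0)
vp = world_to_viewport(proj, np.eye(4), np.array([0.0, 0.0, -1.0]))
```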

    Q4 :
    Is there any documentation about the extrinsics between the screen and the sensors?
    Is the AR camera centered on the first RGB camera by default?

    Note 1 :
    For anyone looking :
    XRCameraIntrinsics only returns the intrinsics of the RGB "wide" camera (the one with autofocus) on the iPad Pro 2020.
    I have yet to find anything for the ultra-wide camera, the depth map, or the confidence map.
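    For anyone wanting to use those intrinsics: projecting a camera-space point with the reported pinhole parameters is just the standard fx*x/z + cx formula. A minimal NumPy sketch with placeholder values (not real iPad numbers):

```python
import numpy as np

# Pinhole projection using the kind of values XRCameraIntrinsics reports:
# focalLength (fx, fy) and principalPoint (cx, cy), both in pixels.
# The numbers below are placeholders, not real iPad Pro values.
fx, fy = 1450.0, 1450.0
cx, cy = 960.0, 540.0

def camera_point_to_pixel(p):
    # p is a point in camera space, +z pointing into the scene.
    x, y, z = p
    return np.array([fx * x / z + cx, fy * y / z + cy])

# A point on the optical axis projects exactly onto the principal
# point, regardless of its distance.
pixel = camera_point_to_pixel([0.0, 0.0, 2.0])
```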


    Thanks a lot !
    Cheers.
     
    avilleret and GenieAGC like this.
  2. KirillKuzyk

    KirillKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    412
    I have a suggestion for Q1.
    Recently, I played around with ARCameraFrameEventArgs.displayMatrix.

    ARKitBackground.shader uses the displayMatrix to remap the texture coordinates:
    Code (CSharp):
    // Remap the texture coordinates based on the device rotation.
    float2 texcoord = mul(float3(v.texcoord, 1.0f), _UnityDisplayTransform).xy;
    But when I tried to multiply this matrix by a vector bounded by [0, 1], the result was NOT bounded by [0, 1].

    It seems this matrix contains not only rotation information but also some sort of scale. Maybe that is why the raw camera image is not aligned with what ARKitBackground shows.
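    To illustrate, here is a NumPy sketch of that affine remap, using the shader's row-vector convention. The matrix values are invented for illustration (ARKit supplies the real ones per frame); the point is that a crop scale pushes corner UVs outside [0, 1], and inverting the matrix maps raw-image UVs back to screen UVs, which is what Q1 needs:

```python
import numpy as np

# The shader computes: texcoord = mul(float3(uv, 1), _UnityDisplayTransform).xy,
# i.e. a ROW vector times a 3x3 affine matrix. The matrix below is made up:
# a 90-degree rotation plus a vertical crop scale and a centring translation.
display = np.array([
    [0.0, -1.2, 0.0],
    [1.0,  0.0, 0.0],
    [0.0,  1.1, 1.0],
])

def screen_uv_to_image_uv(uv):
    # Same row-vector convention as the shader.
    return (np.array([uv[0], uv[1], 1.0]) @ display)[:2]

def image_uv_to_screen_uv(uv):
    # Inverting the affine matrix goes the other way:
    # raw camera image UVs back to screen UVs.
    return (np.array([uv[0], uv[1], 1.0]) @ np.linalg.inv(display))[:2]

centre = screen_uv_to_image_uv([0.5, 0.5])  # centre stays at the centre
corner = screen_uv_to_image_uv([0.0, 0.0])  # cropped: falls outside [0, 1]
back = image_uv_to_screen_uv(centre)        # round-trips to [0.5, 0.5]
```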
     
  3. Shaunyowns

    Shaunyowns

    Unity Technologies

    Joined:
    Nov 4, 2019
    Posts:
    328
    KirillKuzyk likes this.