
Resolved: ARCameraBackground & raw image misalignment

Discussion in 'AR' started by xsmarty_unity, Nov 19, 2020.

  1. xsmarty_unity

    xsmarty_unity

    Joined:
    Sep 19, 2019
    Posts:
    1
    Hello,
    I have some ARFoundation questions. Despite reading extensively, I may have missed the information if it is already in the forum.

    Q1 :
    I have noticed that on an iPad Pro 2020, the ARCameraBackground image and the raw camera image are not aligned. There is a stretch and crop factor once the raw image is superimposed.
    See here : https://pasteboard.co/JB7STLZ.png

    [Unity 2020.1.9, ARFoundation 4.1.0p12, ARKit 4.1.0p12]

    The question is: how do I match a pixel in viewport or screen coordinates in Unity to the (u, v) coordinates of the texture? The ARCameraBackground texture is center-stretched and cropped, and seems to be device dependent.

    Subsequently
    Q2 :
    UnityEngine.XR.ARFoundation.ARRaycastManager.Raycast(Input.mousePosition, .., TrackableType.FeaturePoint) uses the Unity screen coordinates, but it looks like it reads the raw coordinates from ARKit, making this function exhibit the same problem as Q1.

    Is there a pixel-perfect way to get the correct depth directly via a Unity helper?

    Q3 :
    I am unable to recreate WorldToViewportPoint() by hand on the iPad.

    Code (CSharp):
    // 1. Local/Camera Space <=> Local (T1)
    // 2. World Space <=> Local (T2)
    // 3. View Space (T3)
    // 4. Clip Space (T4)
    // 5. Screen Space or Viewport (pixels or 0-1)
    Vector3 worldToViewportPoint(Vector3 worldPoint)
    {
        // Vclip = M_projection x M_view x M_model x V_local
        Matrix4x4 GLprojection = Camera.main.projectionMatrix;
        Matrix4x4 world2local = Camera.main.transform.worldToLocalMatrix;
        Matrix4x4 world2clip = GLprojection * world2local;

        Vector4 point4 = new Vector4(worldPoint.x, worldPoint.y, worldPoint.z, 1.0f);
        Vector4 clip_point = world2clip * point4;  // clip space (-1, +1)
        Vector3 viewport_point = clip_point;

        // Convert clip space (-1, 1) into Unity viewport space (0, 1)
        viewport_point /= -clip_point.w;  // normalize by "-w"
        viewport_point.x = viewport_point.x * 0.5f + 0.5f;
        viewport_point.y = viewport_point.y * 0.5f + 0.5f;
        viewport_point.z = -clip_point.w;  // d = worldPoint.z - near_plane; extra negative sign for RHS to Unity LHS?
        var UnityVpPt = Camera.main.WorldToViewportPoint(worldPoint);
        Debug.Assert(viewport_point == UnityVpPt, "World to viewport not equal. Distance is: " + Vector2.Distance(viewport_point, UnityVpPt));

        return viewport_point;
    }
    This seems to work in the Editor (Metal), but asserts on the iPad (Metal): about 2-3 cm off for something mid-screen projected at 1 m. The comments reflect my understanding.
    Any ideas why?

    Is there some code available to recreate this by hand ? Did I miss something ?

    Q4 :
    Is there any documentation about the extrinsics between the screen and the sensors?
    Is the AR camera centered on the first RGB sensor by default?

    Note 1 :
    For anyone looking:
    XRCameraIntrinsics only returns the RGB "wide" intrinsics on the iPad Pro 2020 (the camera with autofocus).
    I have yet to find anything for the ultra-wide camera, the depth map, or the confidence map.


    Thanks a lot !
    Cheers.
     
    ROBYER1, avilleret and GenieAGC like this.
  2. KyryloKuzyk

    KyryloKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    1,128
    I have a suggestion for Q1.
    Recently, I played around with ARCameraFrameEventArgs.displayMatrix.

    ARKitBackground.shader uses the displayMatrix to remap the texture coordinates:
    Code (CSharp):
    // Remap the texture coordinates based on the device rotation.
    float2 texcoord = mul(float3(v.texcoord, 1.0f), _UnityDisplayTransform).xy;
    But when I tried to multiply this matrix by a vector bound to [0, 1], the result was NOT bound to [0, 1].

    It seems like this matrix contains not only the rotation information but also some sort of scale. Maybe this is the reason why the raw camera image is not aligned with what ARKitBackground shows.
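    To make that concrete, here is a minimal sketch of reading the displayMatrix on the CPU and applying the same row-vector multiply the shader does, mapping a display-space UV to a raw camera image UV. This is untested speculation based on the shader line above: the row-vector convention and the class/field layout of the sketch are assumptions, not a confirmed Unity API usage pattern.

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;

    public class DisplayMatrixProbe : MonoBehaviour
    {
        public ARCameraManager cameraManager;
        Matrix4x4 displayMatrix = Matrix4x4.identity;

        void OnEnable()  { cameraManager.frameReceived += OnFrame; }
        void OnDisable() { cameraManager.frameReceived -= OnFrame; }

        void OnFrame(ARCameraFrameEventArgs args)
        {
            // displayMatrix is a nullable Matrix4x4; cache the latest value.
            if (args.displayMatrix.HasValue)
                displayMatrix = args.displayMatrix.Value;
        }

        // Mirrors the shader's mul(float3(uv, 1), _UnityDisplayTransform):
        // a row vector (u, v, 1) times the upper-left 3x2 of the matrix.
        Vector2 DisplayToRawUv(Vector2 displayUv)
        {
            var m = displayMatrix;
            return new Vector2(
                displayUv.x * m.m00 + displayUv.y * m.m10 + m.m20,
                displayUv.x * m.m01 + displayUv.y * m.m11 + m.m21);
        }
    }

    If the matrix does contain a scale/crop term as well as rotation, inverting this mapping (Matrix4x4.inverse) would in principle give the reverse direction, raw UV to display UV.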
     
  3. Shaunyowns

    Shaunyowns

    Joined:
    Nov 4, 2019
    Posts:
    328
    ROBYER1 and KyryloKuzyk like this.
  4. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,450
    Is there any documentation or advice on how to use this displayMatrix, for example with something like Shader Graph? I am stumped.
     
  5. KyryloKuzyk

    KyryloKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    1,128
    I don't have much experience with Shader Graph, but the general idea is to grab the ARCameraFrameEventArgs.displayMatrix from
    ARCameraManager.frameReceived event, then somehow pass it to Shader Graph.
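    A minimal sketch of that idea: subscribe to frameReceived and push the matrix into the material built from your Shader Graph. The property name _DisplayMatrix here is a placeholder; it has to match whatever you expose in your graph (Shader Graph does not surface matrix properties in the Blackboard in all versions, so setting it from script like this may be the only route).

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;

    public class PassDisplayMatrix : MonoBehaviour
    {
        public ARCameraManager cameraManager;
        public Material shaderGraphMaterial;  // material created from your Shader Graph

        void OnEnable()  { cameraManager.frameReceived += OnFrame; }
        void OnDisable() { cameraManager.frameReceived -= OnFrame; }

        void OnFrame(ARCameraFrameEventArgs args)
        {
            if (args.displayMatrix.HasValue)
                shaderGraphMaterial.SetMatrix("_DisplayMatrix", args.displayMatrix.Value);
        }
    }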
     
  6. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,450
    It seems I am basically the first person trying to do this :eek:
     
  7. TropicalCyborg

    TropicalCyborg

    Joined:
    Mar 19, 2014
    Posts:
    28
    No, you're not. I am trying to do the same with Shader Graph and facing the same challenges. What seemed to be a simple task is actually a very hard one to solve.
     
  8. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,450
    I just had to work out a ratio of size fitting between tablet/phone screen sizes using my own shader graph that was close enough for our use case. Unity ARF support were no help to us.
     
    TropicalCyborg likes this.
  9. TropicalCyborg

    TropicalCyborg

    Joined:
    Mar 19, 2014
    Posts:
    28
    Do you have this project on GitHub? It would be very helpful to look at your solution to learn
     
  10. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,450
    I will get back to you on this, as I may be able to share something. It uses some store plugins like a chroma shader to do some masking with smoothed edges.

    It is a very simple solution though: just use the Unity sample with the stencil, look at where it saves the stencil out to a RenderTexture, and assign that RenderTexture to your shader graph. Then you can directly adjust the scaling of that texture in the shader graph. Apply the shader graph to an Image or RawImage in your UI canvas to see the outline lined up over the hand.

    There is also a store plugin we used, something like AR Foundation to editor, which allowed us to replicate how it worked on device. However, if you don't have that, just use some UI sliders connected to the shader graph via scripting to adjust the X and Y scale until it looks right.

    Finally, you will want to code in an if/else to check whether you are running on an iPhone or an iPad, and use a preset X/Y value for each.
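    That last step might look something like this. The scale numbers are placeholders you would tune with the sliders first, and _Scale is an assumed Shader Graph property name, not something from the Unity sample:

    Code (CSharp):
    using UnityEngine;

    public class PerDeviceScale : MonoBehaviour
    {
        public Material shaderGraphMaterial;  // material created from your Shader Graph

        void Start()
        {
            // SystemInfo.deviceModel returns strings like "iPad8,12" or "iPhone13,2".
            // Placeholder presets - tune per device with the UI sliders first.
            Vector2 scale = SystemInfo.deviceModel.StartsWith("iPad")
                ? new Vector2(1.25f, 1.0f)   // tablet preset
                : new Vector2(1.10f, 1.0f);  // phone preset
            shaderGraphMaterial.SetVector("_Scale", scale);
        }
    }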
     
    Last edited: Jun 29, 2021
  11. TropicalCyborg

    TropicalCyborg

    Joined:
    Mar 19, 2014
    Posts:
    28
    Thanks a lot! I'll try it
     
    ROBYER1 likes this.
  12. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,450
    Drop a reply here if you get stuck! My solution isn't easy to copy-paste, but I will help you.