
AR raycast misplaced when viewport rect is modified

Discussion in 'AR' started by citizen_12, Jan 11, 2019.

  1. citizen_12

    Joined:
    Jun 21, 2013
    Posts:
    33
    Playing with the arfoundation-samples project. Decided to add a UX panel that is always visible on the screen, so I created a second UX camera and changed the viewport rect of the AR Camera to only draw the top 75% of the screen (i.e. Y=0.25 instead of Y=0). The UX camera uses the bottom 25%.

    Problem is that now my touches in the AR view are all offset vertically by the height of the UX view. Are we supposed to transform the touch ourselves before passing it to Raycast?
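
    Roughly, the viewport split is equivalent to setting the cameras' normalized viewport rects like this (just a sketch; the camera references are placeholders, and the same values can be set in the inspector):

    Code (CSharp):
        // AR Camera draws the top 75% of the screen (x, y, width, height are normalized).
        arCamera.rect = new Rect(0f, 0.25f, 1f, 0.75f);

        // UX camera draws the bottom 25%.
        uxCamera.rect = new Rect(0f, 0f, 1f, 0.25f);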
     

  2. tdmowrer

    Joined:
    Apr 21, 2017
    Posts:
    605
    There are two versions of the raycast interface:
    • One takes a Ray. The ray is specified in world space, so it is independent of the camera's viewport.
    • The other takes a Vector3, which is really a 2-dimensional screen-space value between (0, 0) and (screen width, screen height). You would need to transform this one yourself, since it assumes the AR camera image takes up the entire screen.
    The easiest thing to do would probably be to use the first version of the Raycast interface (the one that takes a Ray) and pass it a ray from Camera.ScreenPointToRay.
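
    A minimal sketch of that approach, assuming a newer AR Foundation setup where the Ray overload lives on ARRaycastManager (in the AR Foundation version used by these samples the equivalent overloads are on ARSessionOrigin); the arCamera and raycastManager fields are placeholders you would assign in the inspector:

    Code (CSharp):
        using System.Collections.Generic;
        using UnityEngine;
        using UnityEngine.XR.ARFoundation;
        using UnityEngine.XR.ARSubsystems;

        public class ViewportAwareRaycaster : MonoBehaviour
        {
            // AR camera with the modified viewport rect (assign in the inspector).
            [SerializeField] Camera arCamera;

            // Raycast manager on the session origin (assign in the inspector).
            [SerializeField] ARRaycastManager raycastManager;

            static readonly List<ARRaycastHit> s_Hits = new List<ARRaycastHit>();

            void Update()
            {
                if (Input.touchCount == 0)
                    return;

                var touch = Input.GetTouch(0);
                if (touch.phase != TouchPhase.Began)
                    return;

                // ScreenPointToRay respects the camera's viewport rect, so the ray
                // is correct even though the AR camera only draws part of the screen.
                Ray ray = arCamera.ScreenPointToRay(touch.position);

                // The Ray overload is world-space, so no manual offset is needed.
                if (raycastManager.Raycast(ray, s_Hits, TrackableType.PlaneWithinPolygon))
                {
                    Pose hitPose = s_Hits[0].pose;
                    Debug.Log("Hit plane at " + hitPose.position);
                }
            }
        }

    Touches that land in the UX camera's portion of the screen fall outside the AR camera's pixel rect; you can skip those with a check like arCamera.pixelRect.Contains(touch.position).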