Order of FeaturePoints in ARRaycastHit

Discussion in 'Handheld AR' started by BuoDev, Jan 30, 2019.

  1. BuoDev

    BuoDev

    Joined:
    Nov 28, 2018
    Posts:
    45
    I'm working on a rather complex way of placing GameObjects using only the FeaturePoints from ARRaycastHit results: I find the n FeaturePoints nearest to the cone's center and average their positions.

    What is the order of the results in the ARRaycastHit list? Do they have a specific order starting from the cone's center point, or are they added to the list in some arbitrary order?
     
  2. tdmowrer

    tdmowrer

    Unity Technologies

    Joined:
    Apr 21, 2017
    Posts:
    553
    All raycast hits are sorted by distance from the raycast's origin. The nearest hit will be the first element in the list.
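    For example, something like this (just a minimal sketch, assuming AR Foundation's ARRaycastManager; older versions expose the same query on ARSessionOrigin.Raycast):

    Code (CSharp):
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;
    using UnityEngine.XR.ARSubsystems;

    public class NearestFeaturePointExample : MonoBehaviour
    {
        [SerializeField] ARRaycastManager m_RaycastManager; // assigned in the Inspector

        static readonly List<ARRaycastHit> s_Hits = new List<ARRaycastHit>();

        // Returns the nearest feature-point hit for a screen position, if any.
        public bool TryGetNearestFeaturePoint(Vector2 screenPosition, out ARRaycastHit nearestHit)
        {
            if (m_RaycastManager.Raycast(screenPosition, s_Hits, TrackableType.FeaturePoint))
            {
                // Hits are sorted by distance from the raycast origin,
                // so the first element is always the nearest one.
                nearestHit = s_Hits[0];
                return true;
            }

            nearestHit = default(ARRaycastHit);
            return false;
        }
    }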
     
  3. BuoDev

    BuoDev

    Joined:
    Nov 28, 2018
    Posts:
    45
    Thank goodness for that. Using, say, the first 50 hits returned, is there a good way to then more accurately estimate the depth position in case some of the feature points are farther away in the camera's z direction? For instance, if you were to scan a plant on a table, most feature points would be on the plant, but when calculating the feature points' average, the feature points in the background (or foreground) would skew its depth position.
     
  4. BuoDev

    BuoDev

    Joined:
    Nov 28, 2018
    Posts:
    45
    Actually, I think I could take the few hits closest to the raycast's origin and use those for the depth position, and then use the rest of the hits for the x and y position. I think this would work, but I'm curious whether there is another way to filter out hits.
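    Roughly what I have in mind (just an illustrative sketch of one way to read that idea; the hits list is assumed to be non-empty and already sorted by distance, and the names are made up):

    Code (CSharp):
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;

    public static class HitFilteringSketch
    {
        // Depth from the k nearest hits, lateral (x/y) placement from the rest.
        public static Vector3 EstimatePosition(Camera camera, List<ARRaycastHit> sortedHits, int k = 3)
        {
            int nearCount = Mathf.Min(k, sortedHits.Count);

            // Average distance of the k nearest hits defines the depth along the view ray.
            float depth = 0f;
            for (int i = 0; i < nearCount; i++)
                depth += sortedHits[i].distance;
            depth /= nearCount;

            // Average the remaining hit positions for the lateral placement.
            Vector3 lateral = Vector3.zero;
            int restCount = sortedHits.Count - nearCount;
            for (int i = nearCount; i < sortedHits.Count; i++)
                lateral += sortedHits[i].pose.position;
            lateral = restCount > 0 ? lateral / restCount : sortedHits[0].pose.position;

            // Keep the averaged direction from the camera, but clamp its length to 'depth'.
            Vector3 fromCamera = lateral - camera.transform.position;
            return camera.transform.position + fromCamera.normalized * depth;
        }
    }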
     
  5. tdmowrer

    tdmowrer

    Unity Technologies

    Joined:
    Apr 21, 2017
    Posts:
    553
    More accurate than what?

    I think I'm missing context for these questions. What are you trying to do? What is "the depth position" used for?
     
  6. BuoDev

    BuoDev

    Joined:
    Nov 28, 2018
    Posts:
    45
    So I discovered that the ordering is indeed by distance from the origin, i.e. forward from the camera's position.

    After experimenting with the raycast hit sorting, I realised that I had misunderstood what you meant by this. I thought it would be the distance from the raycast origin in screen space, meaning hits closest to the touch position (in screen space) would be first in the list, and hits farther away along the x and y axes would be added in order. Does this make sense?

    I was able to do this by do-while looping the raycast's cone with a condition for a minimum count of FeaturePoints. Is there a more efficient or better way of doing this?

    The goal is to get a minimum count of FeaturePoints in the raycast, say 12 hits, and then average the hits' positions into a single Vector3 for the placement position. If the first raycast has too few hits, the cone angle keeps increasing until the condition is met.

    All of this worked fine, even when dragging by lerping between the object's position and the averaged point position; it's very robust! I'm just not sure whether there is a less expensive way of doing this.
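    Roughly, the loop looks like this (just a sketch; I'm assuming the Ray-based Raycast overload with a pointCloudRaycastAngleInDegrees parameter, and the starting angle, step and cap are made-up values):

    Code (CSharp):
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;
    using UnityEngine.XR.ARSubsystems;

    public class AveragedFeaturePointPlacer : MonoBehaviour
    {
        [SerializeField] ARRaycastManager m_RaycastManager; // assigned in the Inspector
        [SerializeField] Camera m_Camera;

        static readonly List<ARRaycastHit> s_Hits = new List<ARRaycastHit>();

        // Widens the point-cloud raycast cone until at least minHits feature points
        // are hit (or the cone gets too wide), then averages their positions.
        public bool TryGetAveragedPosition(Vector2 touchPosition, int minHits, out Vector3 averagedPosition)
        {
            Ray ray = m_Camera.ScreenPointToRay(touchPosition);
            float coneAngle = 5f;           // starting cone angle in degrees (made up)
            const float maxConeAngle = 30f; // give up past this angle (made up)

            do
            {
                s_Hits.Clear(); // start each attempt with an empty list
                m_RaycastManager.Raycast(ray, s_Hits, TrackableType.FeaturePoint, coneAngle);
                coneAngle += 5f;
            }
            while (s_Hits.Count < minHits && coneAngle <= maxConeAngle);

            averagedPosition = Vector3.zero;
            if (s_Hits.Count == 0)
                return false;

            foreach (var hit in s_Hits)
                averagedPosition += hit.pose.position;
            averagedPosition /= s_Hits.Count;
            return true;
        }
    }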
    ---
    Now, using the hits, I want to somehow get their estimated orientation (something like an estimated polygon normal) and spawn objects at that angle. How can this be done?
     
    Last edited: Jan 31, 2019
  7. tdmowrer

    tdmowrer

    Unity Technologies

    Joined:
    Apr 21, 2017
    Posts:
    553
    If I understand correctly, you want the feature point that is closest to the ray originating from the touch point.

    That's a different geometric query than the raycasting functionality provides, but it is one that you could perform yourself. This Wikipedia article explains how to calculate the distance from a point to a line. You can get the ray from the camera with camera.ScreenPointToRay, and then iterate over each feature point to calculate its distance from the line defined by the ray origin and direction.
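    A rough sketch of that query (plain Unity math, no AR-specific API; featurePointPositions is assumed to hold the current point cloud in world space and to be non-empty):

    Code (CSharp):
    using System.Collections.Generic;
    using UnityEngine;

    public static class FeaturePointUtils
    {
        // Returns the feature point closest to the ray through the given screen point.
        public static Vector3 ClosestPointToScreenRay(Camera camera, Vector2 screenPoint,
            IReadOnlyList<Vector3> featurePointPositions)
        {
            Ray ray = camera.ScreenPointToRay(screenPoint);

            Vector3 closest = Vector3.zero;
            float closestDistance = float.MaxValue;

            foreach (var point in featurePointPositions)
            {
                // Distance from a point to a line: length of the component of
                // (point - origin) perpendicular to the ray direction.
                Vector3 toPoint = point - ray.origin;
                float along = Vector3.Dot(toPoint, ray.direction); // ray.direction is unit length
                float distanceToLine = (toPoint - along * ray.direction).magnitude;

                if (distanceToLine < closestDistance)
                {
                    closestDistance = distanceToLine;
                    closest = point;
                }
            }

            return closest;
        }
    }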

    I'm not sure what you mean by "direction" -- do you mean direction to the ray, or something else?
     
  8. BuoDev

    BuoDev

    Joined:
    Nov 28, 2018
    Posts:
    45
    Yes, you understood correctly. camera.ScreenPointToRay seems to be exactly what I'm looking for!

    So three points can make a polygon. Regardless of where the points are in 3D space (assuming they are not exactly on top of each other), they can be made into a polygon.

    Now, using three feature points, I want to simulate creating a polygon from them. By direction, I meant the polygon's normal direction. In this case it would have been clearer to ask for the global rotation of these three points as a group, i.e. the rotation of this simulated polygon, not each point's individual rotation.
     
  9. tdmowrer

    tdmowrer

    Unity Technologies

    Joined:
    Apr 21, 2017
    Posts:
    553
    The normal of a triangle formed by three non-collinear points is the cross product of two of its edges. So if you have three points A, B & C (clockwise winding), then the left-handed normal is (B - A) x (C - A). You may want to normalize the result so that it is unit length (otherwise its length will be twice the triangle's area).
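    In Unity terms, for example (a small sketch; the three points are assumed to come from your raycast hits, and aligning the up axis with the normal is just one way to turn it into a spawn rotation):

    Code (CSharp):
    using UnityEngine;

    public static class TriangleUtils
    {
        // Left-handed normal of the triangle A-B-C (clockwise winding), unit length.
        public static Vector3 Normal(Vector3 a, Vector3 b, Vector3 c)
        {
            return Vector3.Cross(b - a, c - a).normalized;
        }

        // Rotation that aligns an object's up axis with the triangle normal.
        public static Quaternion RotationFromNormal(Vector3 normal)
        {
            return Quaternion.FromToRotation(Vector3.up, normal);
        }
    }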

    Is this what you're looking for?