
ARKit and ARCore - How does one detect when something from the real world "hits" your object?

Discussion in 'AR/VR (XR) Discussion' started by dberroa, Sep 17, 2017.

  1. dberroa

    dberroa

    Joined:
    Dec 16, 2012
    Posts:
    146
    Think if someone in real life waved their hand and hit the 3D object in AR, how would I detect that? I basically want to know when something crosses over the AR object so I can know that something "hit" it and react.

    Can this be done? If so how?
     
  2. guru20

    guru20

    Joined:
    Jul 30, 2013
    Posts:
    239
    It could probably have been done with Tango (because of the built-in depth sensors), but I don't see how it would be done with ARKit or ARCore, since there isn't actually any depth sensing going on...

    A workaround would be to have it do image detection on your hand at a specified distance, tell it the hand's size/measurement, and then as the hand got bigger or smaller in the image, the change in scale could approximate the depth/distance/z-buffer (a rough sketch of that idea is at the end of this thread)...

    But I haven't actually tried anything like that, so I'm not sure how doable it is (doable, maybe, but probably not built-in or easy) with these single-camera AR SDKs...
     
  3. dberroa

    dberroa

    Joined:
    Dec 16, 2012
    Posts:
    146
    Thanks for the response. I have no clue how to even start with something like that, but I'm assuming that image detection is probably not fast enough to track quick-moving objects. I think my idea is not feasible at the moment with current tech.
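
A minimal sketch of the scale-based depth idea from post #2, assuming you already have some external image/hand detector that gives you the hand's screen-space bounding box each frame. The HandObservation struct, the knownHandWidthMeters and focalLengthPx values, and the detector itself are all assumptions for illustration, not part of the ARKit or ARCore APIs. The pinhole-camera relation distance ≈ realWidth × focalLengthPx / pixelWidth converts apparent size into an approximate depth, and a simple proximity test against the AR object's collider then decides whether that counts as a "hit".

Code (CSharp):

using UnityEngine;

// Hypothetical output of whatever hand/image detector you plug in:
// a screen-space bounding box for the hand, in pixels.
public struct HandObservation
{
    public Vector2 centerPx;  // centre of the hand in screen pixels
    public float widthPx;     // apparent width of the hand in pixels
}

public class ApproximateHandHit : MonoBehaviour
{
    [Tooltip("Real-world width of the user's hand in metres (assumed/calibrated).")]
    public float knownHandWidthMeters = 0.09f;

    [Tooltip("Horizontal focal length of the device camera in pixels (assumed/calibrated).")]
    public float focalLengthPx = 1400f;

    [Tooltip("How close the estimated hand position must get to count as a hit.")]
    public float hitRadiusMeters = 0.05f;

    public Collider targetCollider;  // collider on the AR object

    // Call once per frame with the latest detection result.
    public bool CheckHit(HandObservation hand, Camera arCamera)
    {
        if (hand.widthPx <= 0f) return false;

        // Pinhole-camera approximation: depth = realWidth * focalLength / pixelWidth.
        float depthMeters = knownHandWidthMeters * focalLengthPx / hand.widthPx;

        // Re-project the hand's screen-space centre to a world-space point
        // at the estimated depth along the camera ray.
        Vector3 screenPoint = new Vector3(hand.centerPx.x, hand.centerPx.y, depthMeters);
        Vector3 handWorldPos = arCamera.ScreenToWorldPoint(screenPoint);

        // Treat the hand as a small sphere and test it against the object's collider.
        Vector3 closest = targetCollider.ClosestPoint(handWorldPos);
        return (closest - handWorldPos).sqrMagnitude <= hitRadiusMeters * hitRadiusMeters;
    }
}

Nothing here is built into the single-camera ARKit/ARCore SDKs of that era: the detector, the calibrated hand width, and the focal length all have to be supplied separately, so the depth estimate (and therefore the hit test) is only a rough approximation.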