
Oculus Quest Handtracking and Unity UI

Discussion in 'AR/VR (XR) Discussion' started by Theformand, Dec 23, 2019.

  1. Theformand

    Theformand

    Joined:
    Jan 2, 2010
    Posts:
    271
    So, I would really like a way to interact with Unity UI. The Oculus documentation says to use the PointerPose exposed by OVRHand.cs as the pointer. Raycast and click, that's all I need. I tried making my own HandTrackingRaycaster to use with a Canvas, but it doesn't seem to be working correctly. Does anyone have experience with custom raycasters?
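    For reference, here's roughly the skeleton of what I have (a minimal sketch; HandTrackingRaycaster is just my own class name, and the OVRHand reference is assigned in the inspector):

    Code (CSharp):
    using UnityEngine;

    public class HandTrackingRaycaster : MonoBehaviour
    {
        [SerializeField] private OVRHand hand; // tracked hand from the OVRCameraRig

        private void Update()
        {
            // Skip frames where tracking hasn't produced a usable pointer pose yet.
            if (!hand.IsPointerPoseValid)
                return;

            // PointerPose is a Transform whose forward direction is the pointing ray.
            Ray ray = new Ray(hand.PointerPose.position, hand.PointerPose.forward);
            Debug.DrawRay(ray.origin, ray.direction * 3f, Color.green);

            // ... this is the ray I'm trying to feed into the Canvas raycast / click logic ...
        }
    }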
     
  2. Jichaels

    Jichaels

    Joined:
    Dec 27, 2018
    Posts:
    237
    I put colliders on every UI interactable, on a separate layer. Then I just raycast against them using the correct layer mask. Working fine so far.
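    Something like this (rough sketch; the class name and the OnHandClick message are placeholders I made up):

    Code (CSharp):
    using UnityEngine;

    public class HandUIRaycast : MonoBehaviour
    {
        [SerializeField] private OVRHand hand;          // the tracked hand
        [SerializeField] private LayerMask uiLayerMask; // the dedicated UI collider layer
        [SerializeField] private float maxDistance = 5f;

        private void Update()
        {
            if (!hand.IsPointerPoseValid)
                return;

            Vector3 origin = hand.PointerPose.position;
            Vector3 direction = hand.PointerPose.forward;

            // Only hit colliders on the UI layer, nothing else in the scene.
            if (Physics.Raycast(origin, direction, out RaycastHit hit, maxDistance, uiLayerMask))
            {
                // Treat an index-finger pinch as a click on whatever was hit.
                if (hand.GetFingerIsPinching(OVRHand.HandFinger.Index))
                    hit.collider.SendMessage("OnHandClick", SendMessageOptions.DontRequireReceiver);
            }
        }
    }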
     
  3. Theformand

    Theformand

    Joined:
    Jan 2, 2010
    Posts:
    271
    Hmm, I might have to just go down that route. It's not for a super complicated UI, so I guess it would work. But, you know, it seems a little odd that you should use the physics system for all the different UI interactables (sliders etc.). Thank you.
     
  4. Theformand

    Theformand

    Joined:
    Jan 2, 2010
    Posts:
    271
    Figured it out. On startup, find the OVRInputModule and set its rayTransform to OVRHand.PointerPose. Also find the OVRRaycaster and set its pointer to the OVRHand.PointerPose GameObject. Now you can interact with UI.
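    In code it's something like this (minimal sketch; HandPointerSetup is just my own name for it, and the hand reference is set in the inspector):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.EventSystems; // OVRInputModule lives in this namespace

    public class HandPointerSetup : MonoBehaviour
    {
        [SerializeField] private OVRHand hand; // the hand that should drive the UI pointer

        private void Start()
        {
            // There's typically one of each in the scene, so FindObjectOfType is fine here.
            OVRInputModule inputModule = FindObjectOfType<OVRInputModule>();
            OVRRaycaster raycaster = FindObjectOfType<OVRRaycaster>();

            // Aim the UI ray from the hand's pointer pose instead of a controller.
            inputModule.rayTransform = hand.PointerPose;
            raycaster.pointer = hand.PointerPose.gameObject;
        }
    }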
     
  5. unity_v_goq8ro7aYWyQ

    unity_v_goq8ro7aYWyQ

    Joined:
    Feb 18, 2020
    Posts:
    1
    Please, could you explain how to access OVRHand.PointerPose from the OVRHand.cs script? It doesn't work for me...
     
  6. KevPan

    KevPan

    Joined:
    Feb 14, 2019
    Posts:
    1
    If you have the answer, I am also interested...
     
  7. Theformand

    Theformand

    Joined:
    Jan 2, 2010
    Posts:
    271
  8. zhare86

    zhare86

    Joined:
    Mar 28, 2017
    Posts:
    30
    This works for one hand, right? This way you can only interact with the UI using the hand whose PointerPose you set on the raycaster and input module?

    Has anyone done it so that you can use both hands, like in the system UI?
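    I was thinking something along these lines (untested sketch, all names are mine): each frame, hand the ray over to whichever hand is currently pinching:

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.EventSystems; // OVRInputModule lives in this namespace

    public class TwoHandPointerSwitcher : MonoBehaviour
    {
        [SerializeField] private OVRHand leftHand;
        [SerializeField] private OVRHand rightHand;
        [SerializeField] private OVRInputModule inputModule;
        [SerializeField] private OVRRaycaster raycaster;

        private void Update()
        {
            // Prefer the left hand while it's pinching, otherwise fall back to the right.
            OVRHand active = leftHand.GetFingerIsPinching(OVRHand.HandFinger.Index)
                ? leftHand
                : rightHand;

            if (active.IsPointerPoseValid)
            {
                inputModule.rayTransform = active.PointerPose;
                raycaster.pointer = active.PointerPose.gameObject;
            }
        }
    }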