Question XRHand Oculus Menu Pinch Gesture

Discussion in 'VR' started by Hotshot10101, Jul 19, 2023.

  1. Hotshot10101

    Hotshot10101

    Joined:
    Jun 23, 2012
    Posts:
    215
    I have installed the packages needed for Quest 2 hand tracking, including the XR Interaction Toolkit, XR Hands, and the sample that includes the Hand Visualizer.

    When I run this on my Quest 2 and set down the controllers, I see the hands. I am able to make the right-hand menu gesture; the Oculus menu appears and I can choose to resume or quit the app.

    When I do the same gesture on the left hand, the menu icon appears and then goes away when I pinch.

    What I need to know is how to do something in a script when that happens. How do I hook into that left hand menu pinch gesture?
     
  2. DarkSoulsBoss2

    DarkSoulsBoss2

    Unity Technologies

    Joined:
    Mar 24, 2023
    Posts:
    16
    The menu pinch that you're seeing on the right hand is, I think, raised and recognized by the Oculus OS itself. If you want similar behavior in Unity, you will probably need to write a script that recognizes the specific hand shape by parsing the joints of the hand.
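    A minimal sketch of that joint-parsing approach, using the XRHandSubsystem runtime API from the XR Hands package. It only checks the thumb-tip/index-tip distance, so it is looser than the OS gesture (which also requires the palm to face the headset), and the threshold value is just a starting point:
    Code (CSharp):
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.XR.Hands;

    public class LeftHandPinchDetector : MonoBehaviour
    {
        // Thumb tip / index tip distance (meters) below which we treat the hand as pinching.
        [SerializeField] float pinchThreshold = 0.02f;

        XRHandSubsystem _subsystem;
        bool _wasPinching;

        void Update()
        {
            if (_subsystem == null || !_subsystem.running)
            {
                // Grab the running hand subsystem (created by the hand tracking provider, e.g. OpenXR).
                var subsystems = new List<XRHandSubsystem>();
                SubsystemManager.GetSubsystems(subsystems);
                _subsystem = subsystems.Count > 0 ? subsystems[0] : null;
                if (_subsystem == null)
                    return;
            }

            var hand = _subsystem.leftHand;
            if (!hand.isTracked)
                return;

            if (!hand.GetJoint(XRHandJointID.ThumbTip).TryGetPose(out var thumbPose) ||
                !hand.GetJoint(XRHandJointID.IndexTip).TryGetPose(out var indexPose))
                return;

            bool isPinching = Vector3.Distance(thumbPose.position, indexPose.position) < pinchThreshold;
            if (isPinching && !_wasPinching)
                OnLeftPinch();   // rising edge: the pinch just started this frame
            _wasPinching = isPinching;
        }

        void OnLeftPinch()
        {
            Debug.Log("Left-hand pinch detected");
            // Toggle your menu / raise your own event here.
        }
    }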
     
  3. julienkay

    julienkay

    Joined:
    Nov 12, 2013
    Posts:
    160
    I don't think it's reasonable for every developer to implement this themselves. The right hand is reserved for the Oculus menu, that's correct. For the left hand, performing the gesture should behave exactly like pressing the start/menu button on a left Quest controller. There should be a simple binding usable with the XR Interaction Toolkit / Input System that detects menu button presses regardless of whether controllers or hands are used.
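    For reference, this is roughly what such a binding looks like for a physical left controller today via the Input System; the complaint here is that nothing equivalent fires when hand tracking is active. The control name ("menuButton") is how the left controller's menu/start button is commonly exposed, but it can differ per controller profile, so check the Input Debugger for your setup:
    Code (CSharp):
    using UnityEngine;
    using UnityEngine.InputSystem;

    public class LeftMenuButtonListener : MonoBehaviour
    {
        InputAction _menuAction;

        void OnEnable()
        {
            // Fires for the left controller's menu/start button, but not for the hand gesture.
            _menuAction = new InputAction(type: InputActionType.Button,
                                          binding: "<XRController>{LeftHand}/menuButton");
            _menuAction.performed += _ => Debug.Log("Left menu button pressed");
            _menuAction.Enable();
        }

        void OnDisable() => _menuAction?.Disable();
    }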

    Others are having trouble finding documentation and getting this to work as well (see here).

    What I'm currently doing:
    Code (CSharp):
    private bool _menuPrev;
    private void Update() {
        var state = OVRPlugin.GetControllerState4((uint)OVRInput.Controller.Hands);
        bool menuGesture = (state.Buttons & (uint)OVRInput.RawButton.Start) > 0;
        if (menuGesture && !_menuPrev) {
            ToggleMenu();
        }
        _menuPrev = menuGesture;
    }
    But this relies on the Oculus Utilities; I'd much rather have a cross-platform way of checking for menu button presses that supports hands (with the XR Interaction Toolkit).
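    One possible direction, sketched here under explicit assumptions: recent versions of the XR Hands package expose Meta's aim state (including a menu-pressed flag) as an Input System device when the Meta Hand Tracking Aim OpenXR feature is enabled. The MetaAimHand / MetaAimFlags names and the aimFlags control used below are assumptions to verify against your installed package version:
    Code (CSharp):
    // Sketch only. Assumes the XR Hands package provides UnityEngine.XR.Hands.MetaAimHand
    // with static left/right accessors and an aimFlags control, and that
    // MetaAimFlags.MenuPressed is set for the left-hand menu pinch.
    // Verify these names against your XR Hands version before relying on them.
    using UnityEngine;
    using UnityEngine.XR.Hands;

    public class LeftHandMenuGestureListener : MonoBehaviour
    {
        bool _menuPrev;

        void Update()
        {
            var leftAim = MetaAimHand.left;       // assumed static accessor for the left-hand aim device
            if (leftAim == null || !leftAim.added)
                return;

            var flags = (MetaAimFlags)leftAim.aimFlags.ReadValue();
            bool menuGesture = (flags & MetaAimFlags.MenuPressed) != 0;

            if (menuGesture && !_menuPrev)
                ToggleMenu();                     // your own menu-toggle method
            _menuPrev = menuGesture;
        }

        void ToggleMenu() => Debug.Log("Left-hand menu gesture detected");
    }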