
Question XR Hands - how to listen for Menu Button?

Discussion in 'VR' started by rchapman, Feb 3, 2023.

  1. rchapman

    rchapman

    Joined:
    Feb 13, 2014
    Posts:
    100
    Working with XR Hands - integration has been nice & smooth so far, but I can't find any way to listen for the Oculus Menu Button on the left hand. The Oculus button on the right hand works properly. Currently my implementation uses action-based input and listens for <XRController>/menuButton. I can't find an input that works for XR Hands. I'm thinking I may be missing a configuration step, but I'm not sure which one. I've also tried

    hand.TryGetFeatureValue(UnityEngine.XR.CommonUsages.menuButton, out var menu)


    but this only ever returns false.
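    For reference, here is roughly how that polling attempt sits in context (a minimal sketch — the GetDeviceAtXRNode lookup is just one way to obtain the hand device; the TryGetFeatureValue call is the one above):

    Code (CSharp):

    using UnityEngine;
    using UnityEngine.XR;

    public class MenuButtonPoll : MonoBehaviour
    {
        void Update()
        {
            // One way to obtain the left-hand device; adjust to however you already get "hand".
            var hand = InputDevices.GetDeviceAtXRNode(XRNode.LeftHand);

            // The call from above; with hand tracking it only ever returns false for me.
            if (hand.isValid &&
                hand.TryGetFeatureValue(CommonUsages.menuButton, out var menu) && menu)
            {
                Debug.Log("Menu button pressed");
            }
        }
    }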

    Is there an input action that works or is there some other mechanism to pick this up?

    * Edited to add:

    I should note that I am currently testing in the editor using Link. I haven't tested in headset yet.
     
    Last edited: Feb 3, 2023
  2. jasonboukheir3

    jasonboukheir3

    Joined:
    Apr 13, 2022
    Posts:
    81
    I don't know the answer to your question, but I wonder if your left-hand menu button is bound to Oculus Home and is being intercepted by the headset. What happens when you swap the pause and home buttons in your headset's settings?

    UPDATE: I guess that wouldn't rule out the headset intercepting the input, since you still don't know what the proper input event should be anyway. Hmm...
     
  3. rchapman

    rchapman

    Joined:
    Feb 13, 2014
    Posts:
    100
    I don't think this is what's happening--the Home button is working fine (takes me to the Oculus dash) and I can see the Menu button animation is activated on click.
     
    jasonboukheir3 likes this.
  4. nilagard

    nilagard

    Joined:
    Jan 13, 2017
    Posts:
    77
    I think you should take a deep dive into your controller binding settings via the current XRInteractions file you have assigned to the left/right controller. As jasonboukheir3 said, it might be a conflict between what is actually bound and what you want to use, if I've understood correctly that you are setting this up via code.
     
  5. rchapman

    rchapman

    Joined:
    Feb 13, 2014
    Posts:
    100
    The question is: which input binding is *expected* to deliver this event using the XR Hands package? All XR controllers I've tested use <XRController>/menuButton, but this does not work for XR Hands + Oculus. I can't find one that works in this configuration. The code snippet above is simply an additional attempt I made to determine if the user activated the menu, which also failed.
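    For comparison, the action-based setup that works with physical controllers is essentially this (a sketch — the action name and logging are just illustrative):

    Code (CSharp):

    using UnityEngine;
    using UnityEngine.InputSystem;

    public class MenuActionListener : MonoBehaviour
    {
        InputAction m_Menu;

        void OnEnable()
        {
            // Fires with every XR controller I've tested, but never with XR Hands + Oculus.
            m_Menu = new InputAction("Menu", InputActionType.Button,
                "<XRController>{LeftHand}/menuButton");
            m_Menu.performed += _ => Debug.Log("Menu pressed");
            m_Menu.Enable();
        }

        void OnDisable() => m_Menu.Disable();
    }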
     
  6. rchapman

    rchapman

    Joined:
    Feb 13, 2014
    Posts:
    100
    To clarify the left hand/right hand distinction of the original post--if you're not familiar with Oculus hand tracking, it renders a menu button on your left hand and the Oculus Home button on the right. This matches the menu button on the left controller and home button on the right controller. But the normal mapping for the controller menu button (<XRController>/menuButton) does not work with the hands integration.
     
  7. nilagard

    nilagard

    Joined:
    Jan 13, 2017
    Posts:
    77
    I would need to look at your scripts and controller settings to troubleshoot this and give you a good answer; there's not much I can do without more insight into your setup. I am currently working on a VR project myself with support for the Quest 2, so I might bump into the same issue if I am lucky!
     
  8. rchapman

    rchapman

    Joined:
    Feb 13, 2014
    Posts:
    100
    Sascha-L and jasonboukheir3 like this.
  9. nilagard

    nilagard

    Joined:
    Jan 13, 2017
    Posts:
    77
    I think I stumbled upon something that may help you. As I said earlier, I do not know what your settings are and you still haven't posted them, so I am working blind here. Depending on the XR plugins you use, the controller setup is different and also has to be coded/wired up differently.
    Just try setting every input to the most specific path you can get, e.g. XR Controller -> Oculus Touch Controller -> Oculus Touch Controller Left Hand -> [action], etc. (see the sketch below). Hope it helps somehow.
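    For example, something along these lines (illustrative only — verify the actual layout names in Window > Analysis > Input Debugger, since I don't know which plugin versions you are on):

    Code (CSharp):

    using UnityEngine;
    using UnityEngine.InputSystem;

    public class AddSpecificMenuBinding : MonoBehaviour
    {
        // Assign your existing input actions asset in the Inspector (name is illustrative).
        public InputActionAsset actions;

        void Awake()
        {
            // Look up your menu action; the action path here is just an example.
            var menu = actions.FindAction("LeftHand/Menu");
            if (menu != null)
            {
                // Add a device-specific binding next to the generic <XRController> one.
                // Do this before the actions are enabled.
                menu.AddBinding("<OculusTouchController>{LeftHand}/menuButton");
            }
        }
    }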
     
  10. rchapman

    rchapman

    Joined:
    Feb 13, 2014
    Posts:
    100
    I posted the answer above. Again the question wasn't about my settings or any configuration on my side. The question was about the (lack of) documentation for the XR Hands package and which event is raised when the user clicks the "menu" or "system" button on the left hand for the Meta Aim Hand controller. If you're not familiar with the XR Hands package, please install and test it out to see what I mean.
     
  11. rchapman

    rchapman

    Joined:
    Feb 13, 2014
    Posts:
    100
    In case this is helpful, I created a composite input that takes an Integer + a Mask and turns it into a float that can be used as a button input:

    Code (CSharp):

    #if UNITY_EDITOR
    using UnityEditor;
    #endif
    using UnityEngine;
    using UnityEngine.InputSystem;
    using UnityEngine.InputSystem.Utilities;

    #if UNITY_EDITOR
    [InitializeOnLoad] // Automatically register in editor.
    #endif
    [DisplayStringFormat("{Input}+{Mask}")]
    public class IntFlagsButtonComposite : InputBindingComposite<float>
    {
        // The bound integer control (e.g. a flags value exposed by a device).
        [UnityEngine.InputSystem.Layouts.InputControl(layout = "Integer")]
        public int Input;

        // Bit mask to test against; a plain composite parameter, not a bound control.
        public int Mask;

        public override float ReadValue(ref InputBindingCompositeContext context)
        {
            var val = context.ReadValue<int>(Input);

            // Report 1 (pressed) only when all bits in Mask are set.
            return (val & Mask) == Mask ? 1 : 0;
        }

        public override float EvaluateMagnitude(ref InputBindingCompositeContext context)
        {
            return ReadValue(ref context);
        }

        static IntFlagsButtonComposite()
        {
            // Can give a custom name or use the default (type name with "Composite" clipped off).
            // The same composite can be registered multiple times with different names to
            // introduce aliases.
            //
            // NOTE: Registering from the static constructor using InitializeOnLoad and
            //       RuntimeInitializeOnLoadMethod is only one way. You can register the
            //       composite from wherever works best for you. Note, however, that the
            //       registration has to take place before the composite is first used
            //       in a binding. Also, for the composite to show in the editor, it has
            //       to be registered from code that runs in edit mode.
            InputSystem.RegisterBindingComposite<IntFlagsButtonComposite>();
        }

        [RuntimeInitializeOnLoadMethod]
        static void Init() { }
    }
    Then in my Input Actions I set up something like this along with the other inputs for Menu:

    [Screenshot attachment: upload_2023-2-8_16-11-14.png, showing the Input Actions binding setup for Menu]
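    As a rough idea of what the screenshot shows, the composite can also be wired up in code. This sketch assumes the "Input" part is bound to the Meta Aim Hand's integer aimFlags control and that the mask is MetaAimFlags.MenuPressed — check the actual control path in the Input Debugger before relying on it:

    Code (CSharp):

    using UnityEngine;
    using UnityEngine.InputSystem;
    using UnityEngine.XR.Hands;

    public class MenuFromAimFlags : MonoBehaviour
    {
        InputAction m_Menu;

        void OnEnable()
        {
            m_Menu = new InputAction("Menu", InputActionType.Button);

            // "IntFlagsButton" is the default name the composite registers under
            // (type name with "Composite" clipped off); Mask is its composite parameter.
            m_Menu.AddCompositeBinding($"IntFlagsButton(Mask={(int)MetaAimFlags.MenuPressed})")
                .With("Input", "<MetaAimHand>{LeftHand}/aimFlags"); // control path is an assumption

            m_Menu.performed += _ => Debug.Log("Hand menu button pressed");
            m_Menu.Enable();
        }

        void OnDisable() => m_Menu.Disable();
    }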

    Not sure if this composite will be helpful outside of this use case at all, but here we are. :)
     
    Sascha-L and jasonboukheir3 like this.
  12. nilagard

    nilagard

    Joined:
    Jan 13, 2017
    Posts:
    77
    This looks good to me. As you say, the fault may well be somewhere else then.