
Question Oculus Quest - how to detect A,B,X,Y button presses?

Discussion in 'XR Interaction Toolkit and Input' started by GohanCZ, May 11, 2021.

  1. GohanCZ

    GohanCZ

    Joined:
    Apr 29, 2021
    Posts:
    32
    Hello,

How do I access these buttons with the new XR Interaction Toolkit?

In normal games these checks happen in Update, but as I understand the new XR Interaction Toolkit, buttons can now trigger events, so there is no need to poll for a button press in Update.

How do I detect presses of the A, B, X, Y buttons (at least that is what they are called on the Oculus Quest)?

    Thank you
     
  2. mattouBatou

    mattouBatou

    Joined:
    Sep 2, 2016
    Posts:
    20
In the XRI Default Input Actions asset, you are looking for the bindings for primaryButton and secondaryButton on the left- and right-hand controllers.

    secondaryButton [LeftHand XR Controller] = Y button
    primaryButton [LeftHand XR Controller] = X button
    secondaryButton [RightHand XR Controller] = B button
    primaryButton [RightHand XR Controller] = A button

    XRI-default-input-actions.png XRI-default-input-actions_controller_buttons.png
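If you would rather poll those same buttons directly instead of going through the action bindings, the plain UnityEngine.XR device API exposes the same primary/secondary usages. A rough sketch of that mapping (class name chosen for illustration):

Code (CSharp):
using UnityEngine;
using UnityEngine.XR;

// Illustrative sketch: polls the same primary/secondary button usages
// directly from the XR devices, bypassing the Input System bindings.
public class XRButtonPoll : MonoBehaviour
{
    void Update()
    {
        InputDevice right = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);

        // primaryButton on the right hand corresponds to A on the Quest.
        if (right.TryGetFeatureValue(CommonUsages.primaryButton, out bool aPressed) && aPressed)
        {
            Debug.Log("A button pressed");
        }

        // secondaryButton on the right hand corresponds to B.
        if (right.TryGetFeatureValue(CommonUsages.secondaryButton, out bool bPressed) && bPressed)
        {
            Debug.Log("B button pressed");
        }
    }
}

The same pattern with XRNode.LeftHand gives you X (primaryButton) and Y (secondaryButton).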
     
  3. GohanCZ

    GohanCZ

    Joined:
    Apr 29, 2021
    Posts:
    32
    I see.

So I create an action, add a binding to it, and then assign it to a script that takes it as an "InputActionReference", and execute a method when InputActionReference.action.performed fires, correct?

What I am also curious about is whether there is an option to monitor the button from the "outside", let's say from another script that has a reference to the controller, to check if a certain button is pressed before doing some action.
     
    Last edited: May 14, 2021
  4. mattouBatou

    mattouBatou

    Joined:
    Sep 2, 2016
    Posts:
    20
    "So I create an action, add bind to it and then assign it to a script that takes it as "InputActionReference" and execute method when the action gets InputActionReference.action.performed, correct?"

    Yes, so for the binding I showed in my previous post, this is how you would call a function when the button was pressed. inputActionReference_UISwitcher is, as you said, added to the script via SerializeField and references the input action object created by the input system.

Code (CSharp):
private void OnEnable()
{
    inputActionReference_UISwitcher.action.performed += ActivateUIMode;
}
private void OnDisable()
{
    inputActionReference_UISwitcher.action.performed -= ActivateUIMode;
}
    In the editor it looks like this:

    ExampleInputActionReferencing.png

I guess if for some reason you have GameObjects that hold a reference to the controller but can't have specific input actions assigned to them via serialized fields, you'd just have to make sure any input actions you need are exposed publicly, and you can subscribe in the same way:
inputActionReference_UISwitcher.action.performed += ActivateUIMode;
as above.
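For the "check from the outside" polling case GohanCZ asked about, here is a minimal sketch. It assumes another component exposes a public InputActionReference (the field name uiSwitchActionRef is hypothetical); ReadValue<float>() returns a non-zero value while a button-type action is held:

Code (CSharp):
using UnityEngine;
using UnityEngine.InputSystem;

// Hypothetical sketch: polls a public InputActionReference exposed by
// another component instead of subscribing to performed.
public class ButtonPoller : MonoBehaviour
{
    // Assumed to be assigned from a controller-owning script or the inspector.
    public InputActionReference uiSwitchActionRef;

    void Update()
    {
        // For button bindings, ReadValue<float>() is 1 while held, 0 otherwise.
        if (uiSwitchActionRef != null && uiSwitchActionRef.action.ReadValue<float>() > 0.5f)
        {
            Debug.Log("Button currently held");
        }
    }
}

This keeps the event-driven setup intact while still letting an unrelated script check the current state on demand.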

I'm still learning the new input system though, so there may well be alternative ways of doing this. IMO, passing specific input actions into script components and assigning functions to them is better than doing something like Controller.GetInput("LeftPrimaryButton", DoSomething) in an Update method. Less convenient for quick prototypes? Perhaps, but it allows for multiple controller mappings based on the context of what is going on in your game, which you can later remap from the input actions panel rather than going back into your code to change the button you are listening for.
    I also believe you can allow your users to remap controls at runtime too which is pretty huge.

    So in short:
    • You create an action.
    • Bind a specific controller and button to that action.
    • You then attach functions to action.performed to do stuff when that action was performed (rather than when a given button is pressed).
    • You can then simply go into that action and add or change the controller or button bindings for that action without changing your code (Yep multiple different controllers can bind their buttons to the same actions, no extra code needed!).
    So it is pretty powerful and convenient. I used to use the ReWired asset store package which I believe is what inspired this new input system.
     
    Last edited: May 15, 2021
  5. codemaker2015

    codemaker2015

    Joined:
    Aug 19, 2018
    Posts:
    27
Code (CSharp):
OVRInput.Get(OVRInput.Button.One)
Use this to detect the button press. OVRInput.Get() returns true while the A button is held down. Similarly, you can check the other buttons (OVRInput.Button.Two, and so on).

Code (CSharp):
using UnityEngine;

public class ButtonTest : MonoBehaviour
{
    // Update is called once per frame
    void Update()
    {
        // OVRInput.Get returns true while the button is held down.
        if (OVRInput.Get(OVRInput.Button.One))
        {
            Debug.Log("A button pressed");
        }
    }
}
    For more:
    https://docs.unity3d.com/560/Documentation/Manual/OculusControllers.html
     
  6. sharramon

    sharramon

    Joined:
    Nov 11, 2018
    Posts:
    4
Wait, isn't this not for the XR Interaction Toolkit? The OVR package is something else entirely.
     
  7. StudioZ7

    StudioZ7

    Joined:
    Nov 13, 2021
    Posts:
    10
    Thank you!
    Helped me a lot. Straight and simple.
     
  8. TheKProgrammer

    TheKProgrammer

    Joined:
    Apr 6, 2020
    Posts:
    1
Help! With my version it looks like this and I can't find the primary or secondary buttons. Screenshot.png
     
  9. izzanfuad

    izzanfuad

    Joined:
    Dec 7, 2021
    Posts:
    16
    I also have the same problem

    @mattouBatou do you have any ideas on this?
     
  10. izzanfuad

    izzanfuad

    Joined:
    Dec 7, 2021
    Posts:
    16
Never mind, I think I have found where the input button is. @TheKProgrammer You can add your own action and set bindings according to your controller. Just open your XRI Default Input Actions asset and select the action map you want (for example, XRI LeftHand Interaction). Then right-click, select Add Action, and search for the binding on that specific controller (in this case the Left Hand Controller).
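Once the action exists in the asset, you can look it up and subscribe from a script. A sketch of that, assuming you named the new action "Primary Button" inside the "XRI LeftHand Interaction" map (both names are examples, substitute your own):

Code (CSharp):
using UnityEngine;
using UnityEngine.InputSystem;

// Hypothetical sketch: looks up a custom action by name from the
// XRI Default Input Actions asset and subscribes to it.
public class LeftPrimaryListener : MonoBehaviour
{
    public InputActionAsset inputActions; // assign the XRI Default Input Actions asset

    InputAction primaryButton;

    void OnEnable()
    {
        primaryButton = inputActions.FindActionMap("XRI LeftHand Interaction")
                                    .FindAction("Primary Button");
        primaryButton.performed += OnPrimaryPressed;
        primaryButton.Enable();
    }

    void OnDisable()
    {
        primaryButton.performed -= OnPrimaryPressed;
        primaryButton.Disable();
    }

    void OnPrimaryPressed(InputAction.CallbackContext ctx)
    {
        Debug.Log("Left primary (X) button pressed");
    }
}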
     
    Last edited: Nov 30, 2022
  11. AntAptive

    AntAptive

    Joined:
    Nov 8, 2020
    Posts:
    7
    I'm trying to detect the Menu button on the left controller of an Oculus Quest 2. I've added it in XRI LeftHand Interaction, it works in Play mode, but not in the final build. Does anyone know what I could possibly be doing wrong? Any help is greatly appreciated!
     
  12. izzanfuad

    izzanfuad

    Joined:
    Dec 7, 2021
    Posts:
    16
Try posting a screenshot of your bindings and your script.
     
  13. NexyDev

    NexyDev

    Joined:
    Jan 19, 2022
    Posts:
    2
Why doesn't this work? Do I need to import something or not?
     
  14. NexyDev

    NexyDev

    Joined:
    Jan 19, 2022
    Posts:
    2
Check the Left/Right Hand Interaction action maps.
     
  15. IbiTheDon

    IbiTheDon

    Joined:
    Jul 26, 2023
    Posts:
    6
Doesn't work for me either. All I did was swap "OVRInput.Button.One" with "OVRInput.Button.Two".
     
  16. LabRedSis

    LabRedSis

    Joined:
    May 26, 2023
    Posts:
    1
You need to add OVRInput.Update() in your Update() loop.
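OVRInput's internal state is normally refreshed by an OVRManager in the scene; without one, you have to pump it yourself before querying. A minimal sketch, assuming the Oculus Integration package is imported:

Code (CSharp):
using UnityEngine;

// Minimal sketch: when no OVRManager is present in the scene, OVRInput's
// internal state must be updated manually once per frame before querying it.
public class ManualOVRInputTest : MonoBehaviour
{
    void Update()
    {
        OVRInput.Update(); // refresh controller state for this frame

        if (OVRInput.Get(OVRInput.Button.Two))
        {
            Debug.Log("B button pressed");
        }
    }
}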