Oculus Go Buttons + Handedness in plain Unity [SOLVED]

Discussion in 'AR/VR (XR) Discussion' started by plmx, Jun 13, 2018.

  1. plmx

    plmx

    Joined:
    Sep 10, 2015
    Posts:
    192
    Hi there,

    Can I detect button presses on the Oculus Go controller using plain Unity (no plug-ins from Oculus) input buttons/axes and the InputManager? The docs only seem to address the remote and the touch controllers.

    Secondly, is it possible using plain Unity to detect which hand the user has selected to use for their controller (i.e. right or left)?

    Thanks,

    Philip
     
    Last edited: Jun 13, 2018
  2. JDMulti

    JDMulti

    Joined:
    Jan 4, 2012
    Posts:
    381
I think it's possible, but I haven't tried the remote controller's rotation axis yet.

For example, the back button on the GearVR is basically the Escape key, and by timing the press you can fire an event for either a long or a short press. This same mapping works on the Oculus Go controller. The other buttons, if I'm right, act like the touch button on the GearVR, which is MouseButtonDown / Up (1).

If you have access to the 3DOF axis, you can cast a ray and pass it into Unity's EventSystem through PointerEventData.position.
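As a rough sketch of feeding a controller ray into the EventSystem via PointerEventData.position (the 10 m projection distance is arbitrary):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.EventSystems;

// Sketch: project the controller's forward ray to a screen point and
// ask the EventSystem what UI it hits.
public class ControllerUIRaycaster : MonoBehaviour
{
    public Camera eventCamera; // the camera rendering the UI

    void Update()
    {
        Vector3 screenPoint = eventCamera.WorldToScreenPoint(
            transform.position + transform.forward * 10f);

        var pointerData = new PointerEventData(EventSystem.current)
        {
            position = screenPoint
        };

        var results = new List<RaycastResult>();
        EventSystem.current.RaycastAll(pointerData, results);
        if (results.Count > 0)
            Debug.Log("Pointing at: " + results[0].gameObject.name);
    }
}
```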

But how to get the axis of the Oculus Go 3DOF controller with the native Unity scripting API is still a big question for me. However, while writing this reply I did a quick Google search and came across this Coursera lesson about the Tracked Pose Driver. I hadn't seen this in previous Unity versions, but it looks like it's able to track the Bluetooth controller of the GearVR and the Oculus Go, I think: https://www.coursera.org/learn/mobile-vr-app-development-unity/lecture/BVYOl/the-tracked-pose-driver

    Let me know if it worked. ;)
     
  3. greggtwep16

    greggtwep16

    Joined:
    Aug 17, 2012
    Posts:
    1,418
You can do some things with the Unity API, but not everything. If your game is fairly simple and you just need the buttons and orientation, you'll be all right. Some more advanced things don't work reliably with just the Unity API and will require the Oculus Utilities.
     
  4. JDMulti

    JDMulti

    Joined:
    Jan 4, 2012
    Posts:
    381
    Could you specify a few advanced things which aren't reliable using the Unity API?
     
  5. plmx

    plmx

    Joined:
    Sep 10, 2015
    Posts:
    192
    Getting the rotation is not the issue; this works as expected by using the XR API:

Code (CSharp):
this.transform.localRotation = UnityEngine.XR.InputTracking.GetLocalRotation(controller);
The issue is the controller buttons, which are not listed with IDs on https://docs.unity3d.com/Manual/OculusControllers.html, and the question of handedness, although the latter can probably be solved by just checking both the left and right controllers, as only one will be present at a time.
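Checking both hands could look roughly like this, using XR node states to find whichever controller is actually tracked (a sketch, not tested on-device):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

// Sketch: apply the rotation of whichever hand node is currently tracked.
// On the Go only one controller is present at a time.
public class TrackedControllerRotation : MonoBehaviour
{
    readonly List<XRNodeState> states = new List<XRNodeState>();

    void Update()
    {
        InputTracking.GetNodeStates(states);
        foreach (var s in states)
        {
            bool isHand = s.nodeType == XRNode.LeftHand || s.nodeType == XRNode.RightHand;
            if (isHand && s.tracked)
            {
                transform.localRotation = InputTracking.GetLocalRotation(s.nodeType);
                break;
            }
        }
    }
}
```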
     
    Last edited: Jun 13, 2018
    JoeStrout likes this.
  6. plmx

    plmx

    Joined:
    Sep 10, 2015
    Posts:
    192
    OK, so I solved this issue.

    First, to find out handedness, use

Code (CSharp):
Input.GetJoystickNames()
    It will return either "Oculus Tracked Remote - Right" or "Oculus Tracked Remote - Left".
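A minimal handedness check based on those joystick names might look like this (the string matching assumes the names above):

```csharp
using System.Linq;
using UnityEngine;

// Sketch: infer handedness from the joystick name reported by Unity.
public class GoHandedness : MonoBehaviour
{
    void Start()
    {
        // Expected: "Oculus Tracked Remote - Right" or "Oculus Tracked Remote - Left"
        string remote = Input.GetJoystickNames()
            .FirstOrDefault(n => n.Contains("Oculus Tracked Remote"));
        if (remote != null)
            Debug.Log(remote.Contains("Right") ? "Right-handed" : "Left-handed");
    }
}
```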

    Second, to get input from the Go remote using the InputManager of Unity, create the following axes in the Input inspector:
• The pad on top: This is a button press from a joystick; the button is 9 for the right hand and 8 for the left hand, so create a "Key or Mouse Button" input with positive button name "joystick button 9" or "joystick button 8". For querying, use
  Code (CSharp):
  Input.GetButton("<name you used in input axis>");
• The trigger: This is a joystick axis, namely axis 10 (right) or 9 (left), so create a "Joystick Axis" type axis and select axis 9 or 10. For querying, use
  Code (CSharp):
  Input.GetAxis("<name you used in input axis>");
• The back button: Same setup as the pad, but the button ID is 1 (for both left and right).
Since the button IDs and the trigger axis differ between left and right, you don't necessarily need to check handedness, unless it's required for other purposes.
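Putting the three axes together, a polling script might look like this (the axis names "GoPad", "GoTrigger" and "GoBack" are just examples; use whatever names you entered in the Input Manager):

```csharp
using UnityEngine;

// Sketch: poll the Go remote through the Input Manager axes set up above.
public class GoRemotePoller : MonoBehaviour
{
    void Update()
    {
        if (Input.GetButtonDown("GoPad"))        // touchpad click
            Debug.Log("Pad pressed");

        float trigger = Input.GetAxis("GoTrigger"); // trigger axis
        if (trigger > 0.5f)
            Debug.Log("Trigger pulled: " + trigger);

        if (Input.GetButtonDown("GoBack"))       // back button (ID 1)
            Debug.Log("Back pressed");
    }
}
```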

    Hope this helps someone.
     
    Last edited: Jun 15, 2018
    Claytonious and JoeStrout like this.
  7. greggtwep16

    greggtwep16

    Joined:
    Aug 17, 2012
    Posts:
    1,418
Recentering doesn't work reliably (sometimes it does, sometimes it doesn't), nor do certain parts of the raw gyro, accelerometer, etc. It's certainly come a long way, but it still has a bit to go. At least currently, fairly straightforward games can be accomplished if you use parts of XR.InputTracking and parts of the Input Manager.
     
  8. plmx

    plmx

    Joined:
    Sep 10, 2015
    Posts:
    192
    @greggtwep16 Yes, I remember some issues with recentering myself.

I also think haptic feedback on the controllers can't be triggered, and you can't get the size of the play area (for Rift). Or do you know if these things have been implemented yet?