
Any example of the new 2019.1 XR input system?

Discussion in 'AR/VR (XR) Discussion' started by fariazz, Feb 15, 2019.

  1. fariazz

    fariazz

    Joined:
    Nov 21, 2016
    Posts:
    55
    I'm so excited to see there is a new XR input mapping system in 2019.1: https://docs.unity3d.com/2019.1/Documentation/Manual/xr_input.html

    But I haven't been able to find any code example showing how to use it.

    Does anyone know how to do something as simple as say spawning a cube in the position of the controller when the trigger is pressed?

    Any help would be much appreciated!!
     
    Romenics, MrBenPi and ROBYER1 like this.
  2. nilsdr

    nilsdr

    Joined:
    Oct 24, 2017
    Posts:
    374
    Docs say this, wouldn't that do it?

    Code (CSharp):
    bool triggerValue;
    if (device.TryGetFeatureValue(UnityEngine.XR.CommonUsages.triggerButton,
                                  out triggerValue)
        && triggerValue)
    {
        Debug.Log("Trigger button is pressed");
    }
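
    Building on that snippet, a minimal sketch of the cube-spawning case from the original question might look like this. It is an assumption-laden example, not official sample code: it polls the right-hand controller via `InputDevices.GetDevicesAtXRNode`, and note that `devicePosition` is a tracking-space position, so placing the cube at it directly assumes the camera rig sits at the world origin.

    Code (CSharp):
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.XR;

    // Sketch: spawn a cube at the right controller's position on each trigger press.
    public class CubeSpawner : MonoBehaviour
    {
        static List<InputDevice> devices = new List<InputDevice>();
        bool wasPressed; // remember last frame's state so we spawn once per press

        void Update()
        {
            InputDevices.GetDevicesAtXRNode(XRNode.RightHand, devices);
            if (devices.Count == 0)
                return;

            InputDevice device = devices[0];
            bool pressed;
            Vector3 position;
            if (device.TryGetFeatureValue(CommonUsages.triggerButton, out pressed)
                && pressed && !wasPressed
                && device.TryGetFeatureValue(CommonUsages.devicePosition, out position))
            {
                // Tracking-space position; assumes the rig is at the world origin.
                GameObject cube = GameObject.CreatePrimitive(PrimitiveType.Cube);
                cube.transform.localScale = Vector3.one * 0.1f;
                cube.transform.position = position;
            }
            wasPressed = pressed;
        }
    }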
     
  3. fariazz

    fariazz

    Joined:
    Nov 21, 2016
    Posts:
    55
    Thanks that was helpful!

    The documentation for 2019.1 has been expanded over the last couple of days and it now features multiple examples (none of that code was present when I asked here): https://docs.unity3d.com/2019.1/Documentation/Manual/xr_input.html
     
  4. fariazz

    fariazz

    Joined:
    Nov 21, 2016
    Posts:
    55
    In case it helps anyone, the following script will allow you to call any method when a button is pressed, and another when it's released. This only works for the Button entries here (https://docs.unity3d.com/2019.1/Documentation/Manual/xr_input.html) but it can easily be adapted to work with the axes as well. Just attach it to any game object.


    Code (CSharp):
    using System;
    using System.Collections;
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.Events;
    using UnityEngine.XR;

    namespace Zenva.VR
    {
        public class ButtonController : MonoBehaviour
        {
            static readonly Dictionary<string, InputFeatureUsage<bool>> availableButtons = new Dictionary<string, InputFeatureUsage<bool>>
            {
                {"triggerButton", CommonUsages.triggerButton },
                {"thumbrest", CommonUsages.thumbrest },
                {"primary2DAxisClick", CommonUsages.primary2DAxisClick },
                {"primary2DAxisTouch", CommonUsages.primary2DAxisTouch },
                {"menuButton", CommonUsages.menuButton },
                {"gripButton", CommonUsages.gripButton },
                {"secondaryButton", CommonUsages.secondaryButton },
                {"secondaryTouch", CommonUsages.secondaryTouch },
                {"primaryButton", CommonUsages.primaryButton },
                {"primaryTouch", CommonUsages.primaryTouch },
            };

            public enum ButtonOption
            {
                triggerButton,
                thumbrest,
                primary2DAxisClick,
                primary2DAxisTouch,
                menuButton,
                gripButton,
                secondaryButton,
                secondaryTouch,
                primaryButton,
                primaryTouch
            };

            [Tooltip("Input device role (left or right controller)")]
            public InputDeviceRole deviceRole;

            [Tooltip("Select the button")]
            public ButtonOption button;

            [Tooltip("Event when the button starts being pressed")]
            public UnityEvent OnPress;

            [Tooltip("Event when the button is released")]
            public UnityEvent OnRelease;

            // to check whether it's being pressed
            public bool IsPressed { get; private set; }

            // to obtain input devices
            List<InputDevice> inputDevices;
            bool inputValue;

            InputFeatureUsage<bool> inputFeature;

            void Awake()
            {
                // get label selected by the user
                string featureLabel = Enum.GetName(typeof(ButtonOption), button);

                // find dictionary entry
                availableButtons.TryGetValue(featureLabel, out inputFeature);

                // init list
                inputDevices = new List<InputDevice>();
            }

            void Update()
            {
                InputDevices.GetDevicesWithRole(deviceRole, inputDevices);

                for (int i = 0; i < inputDevices.Count; i++)
                {
                    if (inputDevices[i].TryGetFeatureValue(inputFeature,
                        out inputValue) && inputValue)
                    {
                        // if start pressing, trigger event
                        if (!IsPressed)
                        {
                            IsPressed = true;
                            OnPress.Invoke();
                        }
                    }

                    // check for button release
                    else if (IsPressed)
                    {
                        IsPressed = false;
                        OnRelease.Invoke();
                    }
                }
            }
        }
    }
     
  5. icave_user

    icave_user

    Joined:
    Dec 14, 2017
    Posts:
    1
    Hello! Here is my example implementation of button input for 6DoF VR headsets using controllers like the Vive, Rift, or WMR. This is from a racing game I was working on. Any feedback/questions would be appreciated.

    VRPlayer/Camera Parent Code (Attach only this script to your Camera Parent):
    Code (CSharp):
    using System.Collections;
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.Events;
    using UnityEngine.Serialization;
    using UnityEngine.XR;

    [System.Serializable] // Generic Event holding button value
    public class AButtonEvent : UnityEvent<bool>
    {
        public bool Value { get; set; }

        public void Initialize(bool value, UnityAction<bool> method)
        {
            Value = value;
            AddListener(method);
        }
    }


    public class RacerVR : Racer
    {
        public GameObject LeftAnchor;
        public GameObject RightAnchor;

        private HandController _leftController;
        private HandController _rightController;

        private UnityEngine.XR.InputDevice _leftDevice;
        private UnityEngine.XR.InputDevice _rightDevice;

        // Start is called before the first frame update
        private new void Start()
        {
            base.Start();

            SetDevices();

            // Initialize Hands
            _leftController = LeftAnchor.AddComponent<HandController>();
            _rightController = RightAnchor.AddComponent<HandController>();
        }

        // Update is called once per frame
        private new void Update()
        {
            base.Update();

            // Set Tracked Devices
            SetDevicePosAndRot(XRNode.LeftHand, LeftAnchor);
            SetDevicePosAndRot(XRNode.RightHand, RightAnchor);

            // Set Buttons
            UpdateButtonState(_leftDevice, CommonUsages.gripButton, _leftController.GripEvent);
            UpdateButtonState(_rightDevice, CommonUsages.gripButton, _rightController.GripEvent);

            UpdateButtonState(_leftDevice, CommonUsages.primary2DAxisClick, _leftController.ClickEvent);
            UpdateButtonState(_rightDevice, CommonUsages.primary2DAxisClick, _rightController.ClickEvent);

            UpdateButtonState(_leftDevice, CommonUsages.triggerButton, _leftController.TriggerEvent);
            UpdateButtonState(_rightDevice, CommonUsages.triggerButton, _rightController.TriggerEvent);

            UpdateButtonState(_leftDevice, CommonUsages.menuButton, _leftController.MenuEvent);
            UpdateButtonState(_rightDevice, CommonUsages.menuButton, _rightController.MenuEvent);
        }

        private static void SetDevicePosAndRot(XRNode trackedDevice, GameObject anchor)
        {
            anchor.transform.localPosition = UnityEngine.XR.InputTracking.GetLocalPosition(trackedDevice);
            anchor.transform.localRotation = UnityEngine.XR.InputTracking.GetLocalRotation(trackedDevice);
        }

        private static InputDevice GetCurrentDevice(XRNode node)
        {
            var device = new InputDevice();
            var devices = new List<UnityEngine.XR.InputDevice>();
            UnityEngine.XR.InputDevices.GetDevicesAtXRNode(node, devices);
            if (devices.Count == 1)
            {
                device = devices[0];
                //Debug.Log($"Device name '{device.name}' with role '{device.role.ToString()}'");
            }
            else if (devices.Count > 1)
            {
                device = devices[0];
                Debug.Log($"Found more than one '{device.role.ToString()}'!");
            }

            return device;
        }

        private void UpdateButtonState(InputDevice device, InputFeatureUsage<bool> button,
            AButtonEvent aButtonPressEvent)
        {
            bool tempState;
            bool invalidDeviceFound = false;
            bool buttonState = false;

            tempState = device.isValid // the device is still valid
                        && device.TryGetFeatureValue(button, out buttonState) // did get a value
                        && buttonState; // the value we got

            if (!device.isValid)
                invalidDeviceFound = true;

            if (tempState != aButtonPressEvent.Value) // Button state changed since last frame
            {
                aButtonPressEvent.Invoke(tempState);
                aButtonPressEvent.Value = tempState;
            }

            if (invalidDeviceFound) // refresh device lists
                SetDevices();
        }

        private void SetDevices()
        {
            // Set Controller Devices
            _leftDevice = GetCurrentDevice(XRNode.LeftHand);
            _rightDevice = GetCurrentDevice(XRNode.RightHand);
        }

        private void ShowCurrentlyAvailableXRDevices()
        {
            var inputDevices = new List<UnityEngine.XR.InputDevice>();
            UnityEngine.XR.InputDevices.GetDevices(inputDevices);
            foreach (var device in inputDevices)
            {
                Debug.Log($"Device found with name '{device.name}' and role '{device.role.ToString()}'");
            }
        }
    }
    Hand/Controller Code:
    Code (CSharp):
    using System.Collections;
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.Events;
    using UnityEngine.Serialization;

    [RequireComponent(typeof(Animator))]
    public class HandController : MonoBehaviour
    {
        public bool IsGripPressed;
        public bool IsTriggerPressed;
        public bool IsMenuPressed;
        public bool IsClickPressed;

        // Button Events
        public AButtonEvent GripEvent { get; set; }
        public AButtonEvent TriggerEvent { get; set; }
        public AButtonEvent MenuEvent { get; set; }
        public AButtonEvent ClickEvent { get; set; }

        private Animator _animator;

        void Start()
        {
            InitializeButtons();
            _animator = GetComponent<Animator>();
        }

        private void InitializeButtons()
        {
            (GripEvent = new AButtonEvent()).Initialize(IsGripPressed, OnGripButtonEvent);
            (TriggerEvent = new AButtonEvent()).Initialize(IsTriggerPressed, OnTriggerButtonEvent);
            (MenuEvent = new AButtonEvent()).Initialize(IsMenuPressed, OnMenuButtonEvent);
            (ClickEvent = new AButtonEvent()).Initialize(IsClickPressed, OnClickButtonEvent);
        }

        // Button Functions
        private void OnGripButtonEvent(bool pressed)
        {
            IsGripPressed = pressed;
            _animator.SetBool("GripAnimation", pressed);

            if (pressed)
            {
                Debug.Log("Grip Pressed");
            }
            else
            {
                Debug.Log("Grip Released");
            }
        }

        private void OnTriggerButtonEvent(bool pressed)
        {
            IsTriggerPressed = pressed;
            _animator.SetBool("TriggerAnimation", pressed);

            if (pressed)
            {
                Debug.Log("Trigger Pressed");
            }
            else
            {
                Debug.Log("Trigger Released");
            }
        }

        private void OnMenuButtonEvent(bool pressed)
        {
            IsMenuPressed = pressed;
            if (pressed)
            {
                Debug.Log("Menu Pressed");
            }
        }

        private void OnClickButtonEvent(bool pressed)
        {
            IsClickPressed = pressed;
            _animator.SetBool("ClickAnimation", pressed);

            if (pressed)
            {
                Debug.Log("Click Pressed");
            }
            else
            {
                Debug.Log("Click Released");
            }
        }
    }
     
    superstream1 and harleydk like this.
  6. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    182
    Glad to see some people experimenting with it, and most of these examples look to get the job done.

    Pinging in here to watch this thread for additional questions or feedback. It's seeing a few minor additions in 2019.2 (device connection and disconnection callbacks), and is a start to help us break away from the limitations and manual steps of using the Input Manager.
     
  7. MR_Fancy_Pants

    MR_Fancy_Pants

    Joined:
    Aug 21, 2014
    Posts:
    19
    Heya interesting thread! Thanks for these examples people!

    I have been playing around with the Unity XR SDK since 2018, really happy to see the haptics added to the SDK and the updated input system. Whooohooo!

    Could somebody enlighten me when to use a Device feature and when to use an XRNode?

    For example:
    Device.TryGetFeatureValue(CommonUsages.devicePosition, out pos);
    Returns the world position.

    And this returns the local position :
    pos = UnityEngine.XR.InputTracking.GetLocalPosition(XRNode);
     
  8. plmx

    plmx

    Joined:
    Sep 10, 2015
    Posts:
    308
    I have a question regarding the new XR input system and the Unity UI system. If I have a world space canvas UI and I want to use input from a VR controller to control it (pushing buttons etc.) - is there built-in functionality already to make this happen? I found an older thread by Oculus (https://developer.oculus.com/blog/unitys-ui-system-in-vr/) which talks about subclassing several event system-related classes to make this happen. Is this still required or is there a built-in solution already?

    Thanks,

    Philip
     
  9. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    182
    Hello!

    Those are actually returning the same data. We'd like to wean people off of using XRNodes; in fact, GetLocalPosition has now been tagged as obsolete. The remainder of InputTracking will be made obsolete over time as we migrate people over to the InputDevice(s) APIs. So my personal suggestion is to always use the InputDevice APIs; those will not be going away.


    This is part of a larger, upcoming thing, and is on its way very shortly!
    Sorry it's not yet available.
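
    To see for yourself that the two APIs return the same data, a quick sketch (my own, not Unity sample code) that logs the legacy XRNode position next to the InputDevice feature value for the right controller:

    Code (CSharp):
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.XR;

    // Sketch: both lines should log the same tracking-space position
    // for the same tracked controller.
    public class PositionComparison : MonoBehaviour
    {
        static List<InputDevice> devices = new List<InputDevice>();

        void Update()
        {
            Vector3 nodePos = InputTracking.GetLocalPosition(XRNode.RightHand);

            InputDevices.GetDevicesAtXRNode(XRNode.RightHand, devices);
            Vector3 devicePos;
            if (devices.Count > 0
                && devices[0].TryGetFeatureValue(CommonUsages.devicePosition, out devicePos))
            {
                Debug.Log($"XRNode: {nodePos}  InputDevice: {devicePos}");
            }
        }
    }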
     
  10. plmx

    plmx

    Joined:
    Sep 10, 2015
    Posts:
    308
    @StayTalm_Unity Thanks for your answer. Just to be sure, we are indeed talking about a "touch+click to use" functionality (not a "laser pointer thing"), right? Is this planned for this year, or would you say it's more of a 2020 thing?

    And, if I wanted to implement this beforehand, is the general idea in the Oculus thread (i.e. subclassing InputModule and the Raycaster classes) still the preferred way to go?

    Thanks! Philip
     
  11. plmx

    plmx

    Joined:
    Sep 10, 2015
    Posts:
    308
  12. addent

    addent

    Joined:
    Apr 27, 2019
    Posts:
    35
    So just out of curiosity, below is the script I attach to a game object to have it track with the given XRNode:

    Code (CSharp):
    using UnityEngine;

    public class XRDeviceTracker : MonoBehaviour
    {
        [SerializeField]
        protected UnityEngine.XR.XRNode device;

        void Update()
        {
            this.transform.localPosition = UnityEngine.XR.InputTracking.GetLocalPosition(device);
            this.transform.localRotation = UnityEngine.XR.InputTracking.GetLocalRotation(device);
        }
    }
    It's very simple and clean, but I'm not sure what the equivalent solution would be using an InputDevice instead of an XRNode. Any help would be much appreciated.
     
    Last edited: May 4, 2019
  13. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    182
    Hey @addent
    It's a little less simple, but this would be a close equivalent:
    Code (CSharp):
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.XR;

    public class BasicXRTracker : MonoBehaviour
    {
        // Keep this around to avoid creating heap garbage
        static List<InputDevice> devices = new List<InputDevice>();

        [SerializeField]
        InputDeviceRole role;

        // Update is called once per frame
        void Update()
        {
            InputDevices.GetDevicesWithRole(role, devices);
            if (devices.Count > 0)
            {
                InputDevice device = devices[0];
                Vector3 position;
                if (device.TryGetFeatureValue(CommonUsages.devicePosition, out position))
                    this.transform.position = position;
                Quaternion rotation;
                if (device.TryGetFeatureValue(CommonUsages.deviceRotation, out rotation))
                    this.transform.rotation = rotation;
            }
        }
    }
    However, I would suggest using the TrackedPoseDriver, which is part of the XR Legacy Input Helpers package, because there are some quirks to XR that aren't picked up by this kind of simple tracker. For one, we actually update the tracking at two different times in the frame: once before Update, and once right before we send things off for rendering. The second update is crucial, as it prevents that feeling of your virtual hands and head lagging just a little bit behind their real-world equivalents. As well, the new XR.InputDevice APIs have a second benefit of letting you use events and persistent InputDevice structs to retain the device you are using. This lets you avoid redoing the work of finding the same device repeatedly, and makes it easier to work with devices that you can have more than one of (for example, hardware trackers), where getting the local position or rotation only works for the first one registered.

    That said, here is a slightly more robust XRTracker I whipped up using XR.InputDevice APIs:

    Code (CSharp):
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.XR;

    public class XRTracker : MonoBehaviour
    {
        // Keep this around to avoid creating heap garbage
        static List<InputDevice> devices = new List<InputDevice>();

        [SerializeField]
        InputDeviceRole role;

        InputDevice trackedDevice;

        void OnEnable()
        {
            InputDevices.deviceConnected += OnDeviceConnected;
            InputDevices.deviceDisconnected += OnDeviceDisconnected;
            Application.onBeforeRender += OnBeforeRender;
            InputDevices.GetDevicesWithRole(role, devices);
            if (devices.Count > 0)
                OnDeviceConnected(devices[0]);
        }

        void OnDisable()
        {
            InputDevices.deviceConnected -= OnDeviceConnected;
            InputDevices.deviceDisconnected -= OnDeviceDisconnected;
            Application.onBeforeRender -= OnBeforeRender;
        }

        void Update()
        {
            if (trackedDevice.isValid)
                TrackToDevice(trackedDevice);
        }

        void OnDeviceConnected(InputDevice device)
        {
            if (!trackedDevice.isValid && device.role == role)
                trackedDevice = device;
        }

        void OnDeviceDisconnected(InputDevice device)
        {
            if (device == trackedDevice)
                trackedDevice = new InputDevice();
        }

        void OnBeforeRender()
        {
            if (trackedDevice.isValid)
                TrackToDevice(trackedDevice);
        }

        void TrackToDevice(InputDevice trackedDevice)
        {
            Vector3 position;
            if (trackedDevice.TryGetFeatureValue(CommonUsages.devicePosition, out position))
                this.transform.position = position;
            Quaternion rotation;
            if (trackedDevice.TryGetFeatureValue(CommonUsages.deviceRotation, out rotation))
                this.transform.rotation = rotation;
        }
    }
    That said, I'd still suggest going with the ready-made TrackedPoseDriver mentioned above. It does all this and exposes proper options and settings, and is something we are committed to maintaining and upgrading as new features become available.
    Hope that helps :)
     
    Last edited: May 7, 2019
  14. addent

    addent

    Joined:
    Apr 27, 2019
    Posts:
    35
    Thank-you for the examples and the quick response!

    I tested the examples with Unity 2019.1.1f1 and Unity 2019.2.0a4. The first example works, and I can go with that, but the second example produces the following error:

    Assets\Scripts\XRTracker.cs(25,22): error CS0117: 'InputDevices' does not contain a definition for 'deviceConnected'

    So this looks like it's something very new and not available yet.

    I'll check out the TrackedPoseDriver later today. I avoided the XR Legacy Input Helpers package because, you know... it's legacy. :p

    [Update]: Tried the TrackedPoseDriver... it works fine but feels inconsistent with the naming conventions of the InputDevice APIs. For now, I'm going to go with a hybrid of your two examples:
    Code (CSharp):
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.XR;

    public class BasicXRTracker : MonoBehaviour
    {
        // Keep this around to avoid creating heap garbage
        static List<InputDevice> devices = new List<InputDevice>();

        [SerializeField]
        InputDeviceRole role;

        InputDevice trackedDevice;

        void OnEnable()
        {
            Application.onBeforeRender += OnBeforeRender;
            GetDevice();
        }

        void OnDisable()
        {
            Application.onBeforeRender -= OnBeforeRender;
        }

        void Update()
        {
            if (trackedDevice.isValid)
                TrackToDevice(trackedDevice);
            else
                GetDevice();
        }

        void OnBeforeRender()
        {
            if (trackedDevice.isValid)
                TrackToDevice(trackedDevice);
        }

        void GetDevice()
        {
            InputDevices.GetDevicesWithRole(role, devices);
            if (devices.Count > 0)
                trackedDevice = devices[0];
        }

        void TrackToDevice(InputDevice trackedDevice)
        {
            Vector3 position;
            if (trackedDevice.TryGetFeatureValue(CommonUsages.devicePosition, out position))
                this.transform.localPosition = position;

            Quaternion rotation;
            if (trackedDevice.TryGetFeatureValue(CommonUsages.deviceRotation, out rotation))
                this.transform.localRotation = rotation;
        }
    }
    Thanks for the help!
     
    Last edited: May 8, 2019
    aeiou963 and StayTalm_Unity like this.
  15. plmx

    plmx

    Joined:
    Sep 10, 2015
    Posts:
    308
    @addent, thanks for asking and @StayTalm_Unity, thanks for providing an answer regarding the new APIs. This is the type of support that really makes a difference on the forums. I would also suggest adding this to the docs.

    If it's not too much to ask, might I tempt you into providing a TrackedPoseDriver "best practice" usage example as well? It is immensely helpful for us "users" to see the API applied by those who made it, just to understand the intended usage patterns.

    Philip
     
    a436t4ataf, StayTalm_Unity and addent like this.
  16. plmx

    plmx

    Joined:
    Sep 10, 2015
    Posts:
    308
    @StayTalm_Unity I got around to checking the TrackedPoseDriver class now. I was surprised that I had to enable a package called "XR Legacy Input Helpers" for it to be available. I understood from your post above that the TrackedPoseDriver is in fact what we should use going forward, so why does the package have "Legacy" in its name?
     
    a436t4ataf likes this.
  17. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    182
    @addent
    As for InputDevices.deviceConnected, you are right: it shipped in 2019.2.0a6. Your solution should work most of the time, although the edge case would be if someone disconnects one controller (say, a Vive wand) and reconnects something else (maybe a Knuckles controller). It would lose the InputDevice reference and stop tracking. You may want to check whether the device is valid at the top of the update functions and, if not, do a quick search for any valid device.

    @plmx
    That naming one is complicated, and political. We assumed a few things would ship and replace older systems a little faster than they did. We are hoping to upgrade it to use the InputDevices APIs, and the new Input System as well, but some of that is still incoming.

    As for best practices, I've put that on the backlog too. It's a good idea; not a lot of people know about the TrackedPoseDriver, so we are going to get better exposure and information on it.
     
  18. addent

    addent

    Joined:
    Apr 27, 2019
    Posts:
    35
    Cool. Thanks for info and great job keeping active on the forum! It is very much appreciated.
     
  19. addent

    addent

    Joined:
    Apr 27, 2019
    Posts:
    35
    I've submitted a bug report as well, but I thought I'd mention it here too...

    InputDevice.TryGetFeatureUsages() reports incorrect information with an HTC Vive.
    In particular, it reports that the various "Finger" inputs exist when they do not.

    Ideally this would be fixed to be accurate. I'm trying to avoid writing code specific to each manufacturer's controllers, and TryGetFeatureUsages() would have been a nice way to achieve that.

    An example of incorrect reporting would be:

    Device: OpenVR Controller(Vive. Controller MV) - Right
    Role: RightHanded
    Features:
    Primary2DAxis
    Trigger
    Grip
    Secondary2DAxis <-- Not True! This feature does not exist on Vive Controllers!
    IndexFinger <-- Not True! This feature does not exist on Vive Controllers!
    MiddleFinger <-- Not True! This feature does not exist on Vive Controllers!
    RingFinger <-- Not True! This feature does not exist on Vive Controllers!
    PinkyFinger <-- Not True! This feature does not exist on Vive Controllers!
    PrimaryButton
    SecondaryButton
    GripButton
    Primary2DAxisClick
    TriggerButton
    Primary2DAxisTouch
    DevicePosition
    DeviceRotation
    DeviceVelocity
    DeviceAngularVelocity
    TrackingState
    IsTracked

    The script I used to test what it was reporting is:

    Code (CSharp):
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.XR;

    public class XRFeatureCheck : MonoBehaviour
    {
        static List<InputDevice> devices = new List<InputDevice>();
        static List<InputFeatureUsage> featureUsages = new List<InputFeatureUsage>();

        void Update()
        {
            InputDevices.GetDevices(devices);

            string logString = "";

            foreach (var device in devices)
            {
                logString += "\n\n-------------------------------------------------------";
                logString += "\nDevice: " + device.name;
                logString += "\nRole: " + device.role;
                logString += "\nFeatures: ";

                var featuresFound = device.TryGetFeatureUsages(featureUsages);

                if (!featuresFound)
                {
                    logString += "\n   No Features Found.";
                }
                else
                {
                    foreach (var f in featureUsages)
                    {
                        logString += "\n   " + f.name;
                    }
                }
            }

            Debug.Log(logString);
        }
    }
    [EDIT]
    bug report here...
    https://issuetracker.unity3d.com/is...-features-that-are-not-in-the-vive-controller

    Thanks guys!
     
    Last edited: May 24, 2019
    ROBYER1 likes this.
  20. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    182
    @addent
    That is an unfortunate result of how OpenVR handles buttons and axes. Some of the controllers will use axes reported as not in use, and there is no way to know which buttons will be used by any given controller. I didn't want to attempt to inject controller-specific logic (e.g. if(ViveWand)... else if(WMRController)... else if(OculusTouch)) into an otherwise generic interface (openVR), as that would make Unity responsible for implementing each controller that decides to work with OpenVR.

    As a result, we blast out all potential features, and so every controller used in OpenVR will have the full list of potential input features.
     
    hippocoder likes this.
  21. addent

    addent

    Joined:
    Apr 27, 2019
    Posts:
    35
    Well that's disappointing to hear. If it is not going to report reliable and accurate information, then there's really not much point in using it. Oh well, that's the way the cookie crumbles sometimes. I'll just add my own controller-specific logic about feature availability for now. It's still better than setting up the old input system for everything. Thanks for letting us know. :)

    On a side note, is this stuff based on the OpenXR specification? It looks like it has a fairly well-defined mechanism for reporting features in section 6, "Semantic Paths". It even specifically states the interaction profiles for all the common controllers as XrPaths.

    https://www.khronos.org/registry/OpenXR/specs/0.90/html/xrspec.html#semantic-path
    https://www.khronos.org/registry/Op...rspec.html#semantic-path-interaction-profiles

    It then uses those XrPaths in its ActionSet system to handle the input-to-action mapping.

    https://www.khronos.org/registry/OpenXR/specs/0.90/html/xrspec.html#input

    I guess what I'm getting at is that yes, somewhere at some level there must be controller-specific information. I had hoped that this would be "under the hood" in Unity's XR.InputDevices API, but sounds like that isn't really the case.

    My hope is that this will all get worked out with the new Input System once it's out of preview, and it will in-fact hide controller specifics and have something similar to the InputDevice.TryGetFeatureUsages() that reports accurately.

    Anyhow - thanks again for the update. I have to move on from worrying about inputs for now, but I will continue to follow this thread. It's interesting stuff.
     
  22. Yumby

    Yumby

    Joined:
    Aug 17, 2012
    Posts:
    19
    Could you please explain the proper way to identify multiple devices types with the same XRNode or InputDeviceRole?

    I've not been able to locate a way to reconcile the two. InputTracking.GetNodeStates will return a list of all tracked XRNodeStates, each with a unique ID. But it seems InputDevices.GetDevicesAtXRNode can return multiple devices with no way to correlate them to their XRNodeState.

    For example, in a typical Vive setup with two Lighthouses, calling InputTracking.GetNodeStates will return two unique XRNodeStates with nodeType TrackerReference (one for each Lighthouse), but InputDevices.GetDevicesAtXRNode(XRNode.TrackerReference) will return two devices in the list with no way to match them to their corresponding XRNodeState.

    Similarly, InputDevices.GetDevicesWithRole(InputDeviceRole.TrackingReference, devices) will also return two devices with no way to associate them to an XRNodeState.

    Thank you for any help you can provide!
     
  23. R1PFake

    R1PFake

    Joined:
    Aug 7, 2015
    Posts:
    540
    Maybe I'm missing something, but I'm using TryGetFeatureValue to detect whether the trigger was pressed, passing CommonUsages.triggerButton.
    This returns true for as long as the trigger is held. Is there any API to check whether the trigger was pressed during this frame and then return false on all following frames, even if the trigger is still down?

    Something similar can be done with the "PC" input, for example Input.GetButton vs Input.GetButtonDown, or Input.GetMouseButton vs Input.GetMouseButtonDown.

    A workaround would be to store the state myself and update the variable every frame, but a "TryGetFeatureDown" or something like that would be helpful. Or is that already possible and I'm just doing it wrong?

    Edit: There seems to be an additional TryGetFeatureValue overload with a DateTime "time" parameter. I can't find any documentation about it; what does the time parameter do?

    Edit2: Additional question about CommonUsages.triggerButton: on some controllers (for example Oculus Touch) the "button" feels way too sensitive. I barely touch the trigger and triggerButton already returns true. Is there a way to change the axis value at which the "button" triggers?
    As a current workaround I use the trigger float value instead and check whether it exceeds a certain threshold to register a "click", but there should be a way to fix the CommonUsages.triggerButton sensitivity, otherwise it's not really useful.
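
    For what it's worth, the last-frame workaround described above can be sketched roughly like this (a minimal sketch; the class name and the choice of the right-hand node are illustrative):

    ```csharp
    // Sketch of tracking last-frame button state to get a "down this frame" check.
    // Assumes an XR rig where the right-hand controller reports triggerButton.
    using UnityEngine;
    using UnityEngine.XR;

    public class TriggerDownExample : MonoBehaviour
    {
        InputDevice device;
        bool lastTriggerState;

        void Start()
        {
            device = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);
        }

        void Update()
        {
            bool triggerState;
            if (device.TryGetFeatureValue(CommonUsages.triggerButton, out triggerState))
            {
                // True only on the frame the trigger goes from released to pressed,
                // mirroring Input.GetButtonDown from the legacy input system.
                if (triggerState && !lastTriggerState)
                {
                    Debug.Log("Trigger pressed this frame");
                }
                lastTriggerState = triggerState;
            }
        }
    }
    ```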
     
    Last edited: Jun 29, 2019
  24. HastilyAssembledGames

    HastilyAssembledGames

    Joined:
    Oct 25, 2017
    Posts:
    4
    @StayTalm_Unity
    We just tried using the individual finger features (middleFinger, ringFinger, etc.) on the new Index (Knuckles) controllers, but the returned values are always 0. Is this an issue of these features not *really* being available on the Index controller yet, or is this a bug?
    Or are we doing something wrong? We're on Unity 2019.1, the controllers are working fine otherwise, and other XR features, like the trigger and buttons, seem to work fine too. Also, the grip feature appears to always return either 0 or 1, and not the "weighted average" the docs speak of.
     
  25. SteenPetersen

    SteenPetersen

    Joined:
    Mar 13, 2016
    Posts:
    103
    I'm having an issue where:

    Code (CSharp):
    1. if (device.TryGetFeatureValue(CommonUsages.triggerButton, out trigger) && trigger)
    is firing OK on the Vive controller, but on the Oculus Quest it fires as soon as the trigger is even touched, not pressed. Is there a way to return how far along the axis it is pressed? All I find in the docs is trigger and triggerButton, neither of which returns the axis value.


    @StayTalm_Unity
     
    Last edited: Jun 30, 2019
    vincismurf likes this.
  26. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    182
    @Mandarb
    Yeah, you are doing nothing wrong, and sadly those finger values do not come through for us on OpenVR. The only way you can access them is through SteamVR Input 2.0 and the action-based system. I'd really like to get this fixed; we have the axes available, they are just not reported.

    @SteenPetersen
    You should be able to use CommonUsages.trigger to get the axis (0 to 1) values of the trigger for all Oculus controllers.
    If that doesn't work ping me back, and I can have a quick look for any obvious things that could be wrong.
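
    For reference, reading the axis value suggested above might look like this (a snippet, assuming `device` is a valid InputDevice; the 0.5 threshold is an arbitrary choice):

    ```csharp
    float triggerValue;
    if (device.TryGetFeatureValue(CommonUsages.trigger, out triggerValue))
    {
        // triggerValue is the analog pull in the 0..1 range
        bool pulled = triggerValue > 0.5f; // threshold chosen to taste
        Debug.Log("Trigger axis: " + triggerValue + " pulled: " + pulled);
    }
    ```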
     
  27. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    182
    @R1PFake
    You are correct, this new API doesn't really keep track of last frame changes.

    It's definitely a useful feature to have, but I opted to keep that out of this system for now. The reason being that the Input System is coming along, and that has a really good action abstraction to be able to properly remap and get input state changes, and all the data from InputDevice gets fed along to that system, which is for more than just XR. We are struggling a little with a few shared problems:
    1) Getting good APIs and useful tools out to developers ASAP
    2) Reducing our overall API surface and not solving the same problem in multiple ways.

    And so we have InputDevice to get XR state data now, in a useful, albeit simple format, and are leaning on the incoming Input System to provide a stronger, more useful abstraction and set of higher level gameplay tools.
     
  28. MR_Fancy_Pants

    MR_Fancy_Pants

    Joined:
    Aug 21, 2014
    Posts:
    19
    Heya @StayTalm_Unity,

    Thanks for your support in this thread :) I had another question, not sure if this is part of the new input system or otherwise. But is there a way to set the tracking origin type to either eyeLevel or floorLevel?
     
  29. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    182
    @TheFistCannon2000

    It's on its way!
    It's in the 2019.3 alpha versions at the moment and looks like this:

    Code (CSharp):
    1. namespace UnityEngine.XR
    2. {
    3.     public enum TrackingOriginModeFlags
    4.     {
    5.         Unknown = 0,
    6.         Device = 1,
    7.         Floor = 2,
    8.         TrackingReference = 4
    9.     }
    10.  
    11.     public class XRInputSubsystem
    12.     {
    13.         public extern bool TrySetTrackingOriginMode(TrackingOriginModeFlags origin);
    14.         public extern TrackingOriginModeFlags GetTrackingOriginMode();
    15.         public extern TrackingOriginModeFlags GetSupportedTrackingOriginModes();
    16.  
    17.         public event Action<XRInputSubsystem> trackingOriginUpdated;
    18.  
    19.     }
    20. }
    It's pretty similar to what was in XRDevice, and should look familiar. The main improvements are a boolean result so you know whether you actually *did* change the tracking space, the ability to check which tracking origin modes are supported at the time, and a callback for when the origin *does* update. The callback fires both when you update the tracking origin and when the underlying SDK decides to perform a recenter because it lost its bearings.

    You'll notice that it's on a new class, called XRInputSubsystem. The new subsystem design means that you can have multiple SDKs loaded (once we get that many), and so you can set the tracking origin mode of an individual SDK. Think of a Leap Motion mounted to an HMD. Since you'll be working with InputDevice objects, InputDevice also has a new property called 'InputDevice.subsystem' that will let you get the corresponding subsystem that created that device.

    This is all coming in 2019.3, and is already in the current alpha builds, along with getting boundary points and a few other minor improvements.
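
    A rough usage sketch, assuming the 2019.3 API lands as shown above (names may still change before release):

    ```csharp
    // Request a floor-level tracking origin via the subsystem that owns a device.
    using UnityEngine;
    using UnityEngine.XR;

    public static class TrackingOriginExample
    {
        public static void TrySetFloorOrigin(InputDevice device)
        {
            XRInputSubsystem subsystem = device.subsystem;
            if (subsystem == null)
                return;

            // Check support before switching, then verify the change actually happened.
            if ((subsystem.GetSupportedTrackingOriginModes() & TrackingOriginModeFlags.Floor) != 0)
            {
                bool changed = subsystem.TrySetTrackingOriginMode(TrackingOriginModeFlags.Floor);
                Debug.Log("Floor origin set: " + changed);
            }
        }
    }
    ```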

    Hope that helps!
    -Tom
     
    Last edited: Jul 17, 2019
    addent likes this.
  30. MR_Fancy_Pants

    MR_Fancy_Pants

    Joined:
    Aug 21, 2014
    Posts:
    19
    Heya Tom!

    Awesome thanks for the quick reply and info. I'll try those asap!

    Best,
    Daniel
     
    StayTalm_Unity likes this.
  31. jashan

    jashan

    Joined:
    Mar 9, 2007
    Posts:
    3,307
    We're currently on Unity 2019.1 with our project, and support SteamVR, Oculus SDK, PSVR, maybe Windows MR and one more esoteric VR/AR-platform that has its own SDK. I'm about to port over to SteamVR Input 2.0 and am hoping to be able to feed input captured from other systems into that so that I can code against one single API (also, we'll be using the SteamVR Interaction system, and support Valve Index controllers). We also have some quite elaborate generic tracker support with automatic role assignment and things like that.

    Ideally, at some point I would like to have only SteamVR Input 2.0 and Unity's generic XR input system. One problem with action-based input systems is that they make supporting different systems a lot more complicated.

    Would you recommend waiting to port to the Unity XR Input system until it's more stable (e.g. when 2019.2/2019.3 is released)... or maybe even until the new Input System (with actions) is more stable?
     
  32. nukadelic

    nukadelic

    Joined:
    Aug 5, 2017
    Posts:
    77
    Not sure if this is a bug or not: I noticed that when the Oculus desktop app is closed and I start play mode in the Unity editor, it will not render in VR (the headset display is completely black), but it will still track the headset and controller positions and rotations and render in the Game tab.
     
    Last edited: Jul 27, 2019
  33. vincismurf

    vincismurf

    Joined:
    Feb 28, 2013
    Posts:
    200
    I have noticed that I get poor responsiveness with the XR input system and buttons in scroll rects in Unity.

    I was wondering if you could verify.

    Basically I have a scroll rect of many buttons, I want to select a button by pulling the trigger
    like this:


    I have tried both THIS:

    Code (CSharp):
    1.         bool triggerPulled;
    2.  
    3.         if (controller.TryGetFeatureValue(CommonUsages.triggerButton, out triggerPulled) && triggerPulled)
    4.         {
    5.             if (TriggerPressed != null)
    6.             {
    7.                 TriggerPressed();
    8.             }
    9.  
    10.         }
    AND this
    Code (CSharp):
    1. bool triggerPulled;
    2. float trig = 0;
    3.  
    4.         if (controller.TryGetFeatureValue(CommonUsages.trigger, out trig))
    5.         {
    6.              Debug.Log("CommonUsages.trigger button is pressed trigger " + trig);
    7.         }
    8.  
    9.         triggerPulled = (trig == 1);


    To trigger this

    Code (CSharp):
    1.         if (updateCurveUIInput)
    2.         {
    3.             CurvedUIInputModule.CustomControllerButtonState = triggerPulled;
    4.         }

    Most of the time the button will highlight as though pressed, but the command isn't invoked.

    Usually you have to pull the trigger slowly to get a proper selection.

    I fear this is an inherent problem with scroll rects. It seems that if you move the cursor the slightest bit while pressing a button in the scroll rect, it will think you are moving the scroll rect and consume the event.

    Can you help?

    Anthony
     
    Last edited: Jul 29, 2019
  34. Matt_D_work

    Matt_D_work

    Unity Technologies

    Joined:
    Nov 30, 2016
    Posts:
    202
    The scroll rect and the input systems don't interact in that way; the UI system cannot force input events to be consumed.
    Do you get any other input data (particularly per-frame data, e.g. tracking)? Are you absolutely sure that your code calling TryGetFeatureValue is being hit? What frame rate are you seeing?
     
  35. vincismurf

    vincismurf

    Joined:
    Feb 28, 2013
    Posts:
    200
    There are a couple of solutions I found:

    1) increase the dragThreshold on the event system

    2) add EventTriggers for BeginDrag/Drag/EndDrag on the button to consume the drag event.
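
    Option 2 above could be sketched roughly like this (an illustrative helper, not a drop-in; the handler makes the button the drag target so the ScrollRect never receives the drag):

    ```csharp
    using UnityEngine;
    using UnityEngine.EventSystems;

    public static class ButtonDragBlocker
    {
        // Adds an EventTrigger that consumes BeginDrag/Drag/EndDrag on the button.
        public static void AddTo(GameObject button)
        {
            var trigger = button.AddComponent<EventTrigger>();
            foreach (var type in new[] { EventTriggerType.BeginDrag,
                                         EventTriggerType.Drag,
                                         EventTriggerType.EndDrag })
            {
                var entry = new EventTrigger.Entry { eventID = type };
                entry.callback.AddListener(_ => { }); // consume the event, do nothing
                trigger.triggers.Add(entry);
            }
        }
    }
    ```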
     
  36. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    182
    @vincismurf
    You are onto the underlying problem, which is that the drag threshold for a mouse is way too small for VR. When comparing a tracked controller to a mouse, the tracked controller is very twitchy and has a lot of small movements, especially when you think of it in screen space.

    I suggest you do jack up the drag threshold for VR to something that feels right to you. That should be available in the EventSystem component. If it makes enough sense, I can look into splitting it up between mouse and AR/VR drag thresholds as they do behave pretty differently.
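
    Jacking up the threshold as suggested might look like this (a minimal sketch; the value 40 is an arbitrary starting point):

    ```csharp
    using UnityEngine;
    using UnityEngine.EventSystems;

    public class VRDragThreshold : MonoBehaviour
    {
        // Pixels the pointer must move before a press turns into a drag.
        public int vrDragThreshold = 40; // tune until clicks vs. scrolls feel right

        void Start()
        {
            if (EventSystem.current != null)
            {
                EventSystem.current.pixelDragThreshold = vrDragThreshold;
            }
        }
    }
    ```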
     
    IsDon likes this.
  37. EthanF4D

    EthanF4D

    Joined:
    Jan 9, 2018
    Posts:
    13
    Hi, SteamVR was updated to 1.6.10 on 30 Jul:
    https://steamcommunity.com/games/250820/announcements/detail/1603763772826636893
    These changes break
    Code (csharp):
    1. Input.GetAxis(axis9or10)
    and also
    Code (CSharp):
    1. device.TryGetFeatureValue(CommonUsages.trigger, out float triggerValue)
    which is the Vive controller trigger. Previously it would report 0.0-1.0. Now it only reports 0.0-0.2, and then jumps to 1.0 if >0.2.
     
  38. vincismurf

    vincismurf

    Joined:
    Feb 28, 2013
    Posts:
    200
    I just got a Rift S and noticed that the triggerButton behaves differently than on the Vive and Go.

    This code
    Code (CSharp):
    1. if (controller.TryGetFeatureValue(CommonUsages.triggerButton, out triggerPulled) && triggerPulled)
    2.  
    On a Vive and Go, this only executes when the trigger is pulled past the midway point.

    On a Rift S, if the trigger is touched ever so slightly, the value will be true. Also on the Rift S the code

    Code (CSharp):
    1.         if (controller.TryGetFeatureValue(CommonUsages.trigger, out trig))
    2.        
    The trig value has a delayed reaction, and after you release a pulled-in trigger the trig value can and will be greater than 0, usually a value greater than 1.

    So what is the official Unity way of interacting with the trigger on the Rift S?


    I think the docs should be changed to show something like this

    Code (CSharp):
    1.         triggerPulled = false;
    2.         advancedTriggerPull = false;
    3.         // Recommended new way
    4.         if (controller.TryGetFeatureValue(CommonUsages.triggerButton, out triggerPulled) && triggerPulled)
    5.         {
    6.         }
    7.  
    8.         // Another way to do it
    9.         float trig = 0;
    10.         if (controller.TryGetFeatureValue(CommonUsages.trigger, out trig))
    11.         {
    12.  
    13.         }
    14.  
    15.         advancedTriggerPull = triggerPulled && (trig > 0.5);
     
    Last edited: Aug 12, 2019
  39. nukadelic

    nukadelic

    Joined:
    Aug 5, 2017
    Posts:
    77
    @vincismurf, I also noticed that slightly touching the trigger will set triggerButton to true. Your workaround does the trick. Maybe it would be best to have both triggerButton and triggerTouch (similar to how primaryTouch and primaryButton work right now)?

    This makes it confusing when accessing similarly named input features. Consider that the grip axis will fire CommonUsages.gripButton when it's pressed ~25% down, unlike CommonUsages.primaryButton and the like.

    So instead I implemented a custom class with the following input features added:
    Code (CSharp):
    1. CustomUsages.triggerButton
    2. CustomUsages.triggerTouch
    3. CustomUsages.gripTouch
    4. CustomUsages.gripButton
    It fires the button when the axis is at 50% or more, and fires the touch when it's at 0.5% or more.
    (Not sure how the same method applies to different headsets, since I only have a Rift S.)
     
    Last edited: Aug 15, 2019
    vincismurf likes this.
  40. R1PFake

    R1PFake

    Joined:
    Aug 7, 2015
    Posts:
    540
    How did you implement a custom InputFeatureUsage?
     
  41. nukadelic

    nukadelic

    Joined:
    Aug 5, 2017
    Posts:
    77
    @R1PFake With a TryGetFeatureValue wrapper. It's also possible to create new input feature usages:
    Code (CSharp):
    1. new InputFeatureUsage<bool>("_TriggerTouch");
    Here is the bit I'm using to handle this input:
    Code (CSharp):
    1.         public static class Buttons
    2.         {
    3.             public static InputFeatureUsage<bool> primaryButton         = CommonUsages.primaryButton;
    4.             public static InputFeatureUsage<bool> primaryTouch          = CommonUsages.primaryTouch;
    5.             public static InputFeatureUsage<bool> secondaryButton       = CommonUsages.secondaryButton;
    6.             public static InputFeatureUsage<bool> secondaryTouch        = CommonUsages.secondaryTouch;
    7.             public static InputFeatureUsage<bool> gripButton            = new InputFeatureUsage<bool>("_Grip_Button");
    8.             public static InputFeatureUsage<bool> gripTouch             = new InputFeatureUsage<bool>("_Grip_Touch");
    9.             public static InputFeatureUsage<bool> triggerButton         = new InputFeatureUsage<bool>("_Trigger_Button");
    10.             // Oculus Rift S - this is actually the trigger touch and it works very well
    11.             public static InputFeatureUsage<bool> triggerTouch          = CommonUsages.triggerButton;
    12.             // public static InputFeatureUsage<bool> triggerButton         = new InputFeatureUsage<bool>("_Trigger_Touch");
    13.             public static InputFeatureUsage<bool> menuButton            = CommonUsages.menuButton;
    14.             public static InputFeatureUsage<bool> primary2DAxisClick    = CommonUsages.primary2DAxisClick;
    15.             public static InputFeatureUsage<bool> primary2DAxisTouch    = CommonUsages.primary2DAxisTouch;
    16.         }
    17.  
    18.         bool GetCustomButtonValue( InputFeatureUsage<bool> usage )
    19.         {
    20.             float value = 0f;
    21.  
    22.             if( usage.name.Contains("_Grip") )      value = GetAxisGrip();
    23.             if( usage.name.Contains("_Trigger") )   value = GetAxisTrigger();
    24.        
    25.             if( usage.name.Contains("_Touch") )       return value >= MinValue_Touch;
    26.             if( usage.name.Contains("_Button") )      return value >= MinValue_Button;
    27.  
    28.             return false;
    29.         }
    Code (CSharp):
    1.         static List<InputFeatureUsage<bool>> CustomButtons = new List<InputFeatureUsage<bool>>
    2.         {
    3.             Buttons.gripButton,
    4.             Buttons.gripTouch,
    5.             Buttons.triggerButton,
    6.             Buttons.triggerTouch,
    7.         };
    8.  
    9. // ...
    10. if( CustomButtons.Contains( input_device ) )
    11. {
    12.     is_pressed = GetCustomButtonValue( input_device );
    13. }
    14. // ...
    PS: I've got a few utils and snippets for Unity XR; let me know if anyone is interested and I will share them on GitHub.
     
    Last edited: Aug 15, 2019
    ROBYER1 and R1PFake like this.
  42. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    182
    So, the Rift and Rift S should be setting triggerButton at a reasonable threshold (I suggested maybe 50% or 75%) and not at the 'touch' values. I've forwarded this thread along to the Oculus-specific guys.
     
    vincismurf likes this.
  43. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,453
    I would be interested in these if you are happy to share them - I started writing my player controls using the legacy input system for now, but some nods in the right direction would help a lot. I don't understand the benefit of this new input system over the legacy one, especially if we are using the Tracked Pose Driver from 'Legacy XR Input Helpers'.
     
  44. nukadelic

    nukadelic

    Joined:
    Aug 5, 2017
    Posts:
    77
    Here is the pastebin for the older version I had:
    https://pastebin.com/fNLiwtXD ( Added description and info in discord )
    So the XR inputs can now be used like so:

    Vector2 axis = tracker.GetAxisPrimary();
    bool down = tracker.GetButton( CommonUsages.primaryButton );
    // GetButtonDown( ... ), GetButtonUp( ... ) , HasAxisGrip() , GetAxisGrip() , ...
     
    Last edited: Mar 13, 2020
    ROBYER1 likes this.
  45. vincismurf

    vincismurf

    Joined:
    Feb 28, 2013
    Posts:
    200
    Um, I updated my Oculus to the latest drivers yesterday and now

    Code (csharp):
    1.  
    2. if (controller.TryGetFeatureValue(CommonUsages.triggerButton, out triggerPulled) && triggerPulled)
    3.         {
    4.         }
    5.  
    is always false... Has anyone else noticed this?
     
    Last edited: Aug 29, 2019
  46. Zapan15

    Zapan15

    Joined:
    Apr 11, 2011
    Posts:
    186
    Yes, for me too. On the Quest I do not get any positive results for any button.
     
  47. vincismurf

    vincismurf

    Joined:
    Feb 28, 2013
    Posts:
    200
  48. Zapan15

    Zapan15

    Joined:
    Apr 11, 2011
    Posts:
    186
    Thank you for the info!

    @Unity
    Is this something that will be integrated, or do we have to do this ourselves (change the manifest manually)?
     
  49. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    182
    @Zapan15
    The fixes for the Quest manifest issue are on their way to all Unity releases (2017.x, 2018.x, 2019.x). It will be in the form of a checkbox in the XR settings (where you set Oculus as the SDK to use). It's just making its way through the shipping pipeline now.
     
    ROBYER1 and vincismurf like this.
  50. Zapan15

    Zapan15

    Joined:
    Apr 11, 2011
    Posts:
    186
    Thank you for the feedback. Sounds nice! :)