
Any example of the new 2019.1 XR input system?

Discussion in 'AR/VR (XR) Discussion' started by fariazz, Feb 15, 2019.

  1. Corysia

    Corysia

    Joined:
    Mar 14, 2018
    Posts:
    99
    What is it today? With 1.40, Oculus removed that entry in the manifest. Are we still going to need that proposed checkbox?

    Also, am I right in that if I import the Oculus Integration asset, I'm overriding the Oculus libraries in the Unity Package Manager?

    That was from May. Are the XR Legacy Input Helpers still recommended?

    I'd like to avoid tying myself irrevocably to Oculus, so I'd like to make a purely XR app even though right now I've only a Rift and Quest to work with.
     
  2. rmon222

    rmon222

    Joined:
    Oct 24, 2018
    Posts:
    70
    Hi, I'm new to this new XR input system. Do I need to install "XR Management" and "Oculus XR Plugin" to use it?
    Instead of the legacy TrackedPoseDriver, what component should I be adding to our left and right controller objects?
    I'm on 2019.3.0b4. Thanks in advance.
     
  3. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    152
    @rmon222
    You do not need to install the new XR Plugin packages; XR Input works with both our current and future backends :)

    We don't yet have an XR Input based tracked pose driver, and so I'd continue to use the TrackedPoseDriver. There will be new tools coming, but we will not deprecate the Tracked Pose Driver in the near term, and will make upgrading as painless as we can.
     
    ROBYER1 likes this.
  4. rmon222

    rmon222

    Joined:
    Oct 24, 2018
    Posts:
    70
    Thanks. It works. One question: Is there a new way to get controller velocity or is XRNodeState.TryGetVelocity() the most current?
     
  5. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    152
    The usage is deviceVelocity!
    You may want to check trackingState first, which is a series of flags reporting the currently available tracking elements (e.g. position, velocity, angular acceleration, etc.). Depending on the current state of tracking, you sometimes drop into 3DoF (rotation-only) mode and won't be able to read velocity, in which case it'll return (0,0,0) until tracking improves.
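    A minimal sketch of that check (assuming the 2019-era UnityEngine.XR polling API, and reading CommonUsages.trackingState as InputTrackingState flags):

    ```csharp
    using UnityEngine;
    using UnityEngine.XR;

    public class VelocityReader : MonoBehaviour
    {
        void Update()
        {
            InputDevice hand = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);
            if (!hand.isValid)
                return;

            // Check which tracking elements are currently available before trusting the value.
            if (hand.TryGetFeatureValue(CommonUsages.trackingState, out InputTrackingState state)
                && (state & InputTrackingState.Velocity) != 0
                && hand.TryGetFeatureValue(CommonUsages.deviceVelocity, out Vector3 velocity))
            {
                Debug.Log("Controller velocity: " + velocity);
            }
            // In a 3DoF fallback the Velocity flag is absent and the usage would read (0,0,0).
        }
    }
    ```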
     
  6. rmon222

    rmon222

    Joined:
    Oct 24, 2018
    Posts:
    70
    Many thanks for the quick response.
     
    StayTalm_Unity likes this.
  7. rmon222

    rmon222

    Joined:
    Oct 24, 2018
    Posts:
    70
    There's a bug in the angular velocity of a tracked controller. It doesn't take into account the rotation of the tracked object. In effect, it reports the same angular velocity no matter what direction you point the controller. Adding the rotation of the controller to the reported angular velocity fixes it.
     
  8. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    152
    @rmon222
    That was intentional!
    We want as many values as possible to be in the same global space, so that features don't implicitly depend on each other. The Angular acceleration and velocity are in world space by design, but as you've figured out already, you can add the known rotation in if you'd like to look for local rotations (like trying to identify a twist gesture or something similar).

    There is a bug where one of the SDKs is still local, and I can't find it offhand, but it will be fixed and made global and not the other way around.
     
  9. rmon222

    rmon222

    Joined:
    Oct 24, 2018
    Posts:
    70
    I think I'm talking about the bug you refer to. I'm using the Oculus SDK. Angular velocity seems to be in local space. Adding the rotation of the controller, ie. converting to global space, works.

    Question: if the controllers are missing at init time, do you recommend subscribing to XRDevice.deviceLoaded to discover them later? Is there a better alternative?

    Thanks,
     
  10. roomera

    roomera

    Joined:
    Sep 27, 2016
    Posts:
    17
    Hi Tom,

    Regarding getting or setting the origin mode flags - when I try to access the InputDevice's subsystem, it's always returning null. Am I doing something incorrectly? (I'm on 2019.3.0b3, building for the Quest)

    Code (CSharp):

        private void OnEnable() {
            InputDevices.deviceConnected += OnDeviceConnected;
        }

        private void OnDisable() {
            InputDevices.deviceConnected -= OnDeviceConnected;
        }

        private void OnDeviceConnected(InputDevice device) {
            if (device.characteristics.Is(InputDeviceCharacteristics.TrackedDevice | InputDeviceCharacteristics.HeadMounted)) {
                if (device.subsystem == null) {
                    Debug.Log("Subsystem is null");
                } else {
                    Debug.Log(device.subsystem.GetTrackingOriginMode().ToString());
                }
            }
        }
     
  11. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    152
    @rmon222
    I believe XRDevice.deviceLoaded is more about the HMD, and so if the controller is powered on _after_ the game has started it's still not going to be correct.

    Starting in 2019.2, I would suggest UnityEngine.XR.InputDevices.deviceConnected. It will call back for each new device, and there is a parallel disconnected callback to know when things are no longer available. You can keep the InputDevice structs returned from these for as long as you like so you can have some consistency about which systems are polling off of which devices, and not have to follow the pattern of getting all new devices every frame. Actually, @roomera 's sample shows the two callbacks for getting device connections.

    @roomera
    You aren't doing anything wrong; this is an unfortunate quirk of the transition to our XR Plugin Architecture. InputDevices that come from the built-in SDKs, set in the XR tab of the Player Settings panel, do not have subsystems. Their tracking origin can be set via the XRDevice APIs, Get/SetTrackingSpaceType. With tracking space types, Stationary generally refers to the Device tracking origin mode, and RoomScale generally refers to the Floor tracking origin mode, but it can depend somewhat on the intricacies of the SDK. If you start using the 'Oculus XR Plugin' package and our new architecture, you will start seeing InputDevices that are associated with individual subsystems.
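    For the built-in (non-plugin) path, a sketch of the XRDevice route described above (RoomScale roughly corresponds to a Floor origin, Stationary to a Device origin):

    ```csharp
    using UnityEngine;
    using UnityEngine.XR;

    public class LegacyTrackingSpace : MonoBehaviour
    {
        void Start()
        {
            // Built-in (pre-plugin) SDKs: no per-device subsystem, so use XRDevice instead.
            TrackingSpaceType current = XRDevice.GetTrackingSpaceType();
            Debug.Log("Current tracking space: " + current);

            // Returns false if the SDK can't honor the request.
            if (!XRDevice.SetTrackingSpaceType(TrackingSpaceType.RoomScale))
                Debug.LogWarning("RoomScale not supported; staying in Stationary.");
        }
    }
    ```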
     
    ROBYER1 likes this.
  12. roomera

    roomera

    Joined:
    Sep 27, 2016
    Posts:
    17
    @StayTalm_Unity - So I went ahead and switched from the built-in Oculus SDK to the Oculus XR Plugin, and also added the manifest (which was needed to prevent the Quest from being seen as a Go).

    It works, but now positional tracking for the controllers has a considerable amount of jitter.

    When I switch back to the built-in SDK (and remove the manifest), the jitter goes away. I am not using the Oculus Integration in either case, and was using Tracked Pose Driver.

    Is this a bug, or is it something on my end?
     
    Last edited: Oct 3, 2019
    hippocoder, fherbst and ROBYER1 like this.
  13. rmon222

    rmon222

    Joined:
    Oct 24, 2018
    Posts:
    70
    Yes. roomera's post answered my question immediately. Thank you. I ran into problems making the new XR Input work with the vive so I went back to using SteamVR plugin for vive, and the new XR Input for oculus. Hope to revisit it in a couple months.
     
  14. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    26,726
    Still not able to switch the Quest to floor tracking. Every game on the store seems fine, but mine has this weird offset. Any advice?
     
  15. Matt_D_work

    Matt_D_work

    Unity Technologies

    Joined:
    Nov 30, 2016
    Posts:
    151
    offset in which direction?
     
    hippocoder likes this.
  16. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    26,726
  17. Matt_D_work

    Matt_D_work

    Unity Technologies

    Joined:
    Nov 30, 2016
    Posts:
    151
    Tracking space is its own self-contained space, where (0,0,0) (the local origin of tracking space) is determined by the device itself. In some cases it's where you start tracking; in others it's your marked-out play area, etc.

    There are two main modes that devices use: 'device tracking origin', e.g. some historical device position is regarded as (0,0,0) and the device reports its position relative to that, and 'floor tracking origin', where (0,0,0) is at some point on the floor, with the device reporting its position relative to that point.

    what we generally recommend in terms of game object hierarchy is:

    XR Rig
    |-Camera Offset
    ...|-Head

    (hopefully my ascii art comes through, the camera offset is the child of the XR Rig, and the Head is the child of the Camera Offset.)

    where the 'Head' Game object has a tracked pose driver (in non relative mode) tracking the camera. the 'Camera Offset' node is used to offset the camera up from the floor when using a device tracking origin (since its data does not implicitly contain the height of the device).

    when moving the user, you move the XR Rig game object. the head is then the position in tracking space of the device, the xr rig is its anchor in unity world space. if you want to add controllers, you can put them at the same level as the 'head' game object using a TPD with the 'use relative transform' unticked. controllers also publish their position in tracking space.

    so, by dragging the XR Rig around you're really moving the transform from tracking -> world space around.

    of course you can use whatever hierarchy you want, but you'll need to figure out how to make sure that all your tracking -> world space transforms line up yourself.
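    A sketch of the "move the XR Rig" idea applied to teleport locomotion (the object names follow the hierarchy above; the offset math is an illustration, not an official API):

    ```csharp
    using UnityEngine;

    // Teleporting moves the rig root; the tracked Head keeps applying
    // tracking-space data on top of wherever the rig is anchored.
    public class RigTeleporter : MonoBehaviour
    {
        public Transform xrRig; // the root "XR Rig" object
        public Transform head;  // the tracked "Head" child

        public void TeleportTo(Vector3 worldDestination)
        {
            // Shift the rig so the head (not the rig origin) lands on the
            // destination, preserving the user's height and play-space offset.
            Vector3 headOffset = head.position - xrRig.position;
            headOffset.y = 0f;
            xrRig.position = worldDestination - headOffset;
        }
    }
    ```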
     
    roomera, ROBYER1 and hippocoder like this.
  18. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    26,726
    Hi Matt! Thank you for replying.

    Yep got exactly that so far:
    upload_2019-10-10_19-18-38.png
    Camera, Left Hand and Right Hand have a TrackedPoseDriver component, and are locally positioned at 0,0,0
    The parent (Camera Rig) is currently 0,0,0 and it is that which I'm trying to get an offset from so I can move the Rig up a little or down so that perceptually in VR, the controller would be at my feet if I put it down.

    I realise now that starting in roomscale vs sitting makes the device decide to use completely different coordinates, so I need to figure out how to handle that. I wanted to go with Unity's recommendation to use XR instead of the Oculus stuff, but there is no documentation for it and massive holes in my knowledge.

    From what I am reading from you, I should be looking for some kind of event or callback and modifying the Camera Rig ('Camera Offset' in your example) height each time the user changes that, as it will often be different, even while the app is running.

    I am not using relative because it's marked as deprecated and I want to do everything exactly correct for the latest recommendations by Unity. I don't want to use anything that's going to be deprecated.

    So how do I a) know when the tracking changes from say roomscale to seated, and b) how do I get the offset to the floor for the Camera Offset ?

    It's really confusing just finding those things, or polling for changes. I honestly don't know where to look first, since there seem to be three different APIs each time I google...
     
  19. Matt_D_work

    Matt_D_work

    Unity Technologies

    Joined:
    Nov 30, 2016
    Posts:
    151
    Also, don't forget to make sure that the TPDs have 'use relative transform' unticked.

    Right, the origin of tracking space will change depending on how it's initialized.

    Not quite. In "Floor" tracking mode, the height will be implicit in the data reported by the HMD. As an example, the position reported by the device for a user 1.8 m tall standing directly at the origin of tracking space would be expected to look like (0.0, 1.8, 0.0).

    In "Device" mode, you will need to offset the camera by some value that you decide upon (e.g. from a menu in your app or something similar), as the device will report (0.0, 0.0, 0.0) when a person 1.8 m tall is standing where the device has decided the origin of tracking space is. You need to add an offset to compensate; what that offset is, is up to you. AFAIK there's no API to ask for that.

    a) As far as I'm aware, only you, the application, can change that during the runtime of the app. The user can pick their startup mode (which you can query via the Get version of our API), but after that, it's up to you. The only time this might not hold is if the user does a guardian system reset/reconfig.
    b) See above. It's either given to you in the data, or you'll have to either pick a value or let the user configure it.
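    Putting (a) and (b) together, a sketch of choosing the 'Camera Offset' height from the queried origin mode (the fallback eye height is an assumption you might let the user configure):

    ```csharp
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.XR;

    public class CameraOffsetHeight : MonoBehaviour
    {
        public Transform cameraOffset;         // the "Camera Offset" node from the rig above
        public float fallbackEyeHeight = 1.6f; // assumed default when in Device mode

        void Start()
        {
            var subsystems = new List<XRInputSubsystem>();
            SubsystemManager.GetInstances<XRInputSubsystem>(subsystems);

            bool floorMode = false;
            foreach (var s in subsystems)
                floorMode |= s.GetTrackingOriginMode() == TrackingOriginModeFlags.Floor;

            // Floor origin: height is already in the HMD data, so no extra offset.
            // Device origin: the HMD reports ~(0,0,0), so raise the camera ourselves.
            cameraOffset.localPosition = new Vector3(0f, floorMode ? 0f : fallbackEyeHeight, 0f);
        }
    }
    ```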
     
    Last edited: Oct 10, 2019
  20. nocanwin

    nocanwin

    Joined:
    May 7, 2009
    Posts:
    167
    I can't get trigger values from hardware trackers even though the Vive trackers have that functionality via their pogo pins. Is there a reason why this is missing, or should it work? I ran TryGetFeatureUsages on the hardware tracker input device and it didn't return trigger, so I'm guessing it wasn't implemented. I'd like to switch to XR but this is a blocker for me. We run 4 trackers at the same time, each with custom hardware, and I need to get trigger input from them.

    Any chance we can get this @StayTalm_Unity or @Matt_D_work?
     
    Last edited: Jan 10, 2020
  21. Gamrek

    Gamrek

    Joined:
    Sep 28, 2010
    Posts:
    126

    I was about to build a new VR project and I realised these new changes.

    I am using VRTK to develop my project, and the camera I use is the UnityCameraRig, which is set to stationary mode. After changing to this new XR input system, the camera shifts up from where it's supposed to be, and I cannot reset it using InputTracking.Recenter().

    Visual Studio tells me to use XRInputSubsystem instead, so I typed in XRInputSubsystem, but no usable function comes up in the editor. How do I set the mode back to Stationary and recenter the headset when needed? Do I need to attach a new XRInputSubsystem component, or what?
     
  22. Gamrek

    Gamrek

    Joined:
    Sep 28, 2010
    Posts:
    126

    Sorry, I found what I need from other thread:
    Code (CSharp):
    List<XRInputSubsystem> subsystems = new List<XRInputSubsystem>();
    SubsystemManager.GetInstances<XRInputSubsystem>(subsystems);
    for (int i = 0; i < subsystems.Count; i++)
    {
        subsystems[i].TrySetTrackingOriginMode(TrackingOriginModeFlags.Device);
        subsystems[i].TryRecenter();
    }
    Here is what you need if you want to reset camera or set tracking mode.
     
  23. Shizola

    Shizola

    Joined:
    Jun 29, 2014
    Posts:
    259
    I've been using SteamVR but I'm trying to transition to XR Input. I've started by using TryGetFeatureValue(CommonUsages.devicePosition... to place objects representing my Vive trackers which seems to work fine.

    My problem is when I need to move the player to a specific place. Using SteamVR I just move the camera rig object, and the trackers follow along.

    If I try the same approach in the XR Interaction toolkit demo, they don't move when you move the XR Rig despite being child objects.

    Is there a way to do this or am I just missing something?
     
  24. vincismurf

    vincismurf

    Joined:
    Feb 28, 2013
    Posts:
    200
    If you place your camera AND controllers under a GameObject without the components that use the tracking code, you can move the parent GameObject and its children (the tracked objects) should follow along.
     
  25. Shizola

    Shizola

    Joined:
    Jun 29, 2014
    Posts:
    259
    Thanks for confirming you can do that. Double checked and I was using transform.position instead of localPosition :oops:
     
  26. tgaldi

    tgaldi

    Joined:
    Oct 28, 2015
    Posts:
    87

    This does not work with OpenVR headsets, and it seems InputTracking.Recenter() no longer works either.

    https://forum.unity.com/threads/steam-vr-plugin-reset-position-and-orientation.389588/

    Is there a solution for OpenVR headsets, since there is no OpenVR plugin for the new system?
     
    Last edited: Feb 3, 2020
  27. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    152
    InputTracking.Recenter() should actually still work for everything not under the new Plugin System. If not, that's a bug I need to fix.

    As well, the equivalent to TrySetTrackingOriginMode in the old system is: https://docs.unity3d.com/2019.2/Documentation/ScriptReference/XR.XRDevice.SetTrackingSpaceType.html

    Where
    TrackingOriginModeFlags.Floor = TrackingSpaceType.RoomScale
    &
    TrackingOriginModeFlags.Device = TrackingSpaceType.Stationary

    I understand this 2-phase series of APIs is not ideal at this point in time, I'm sorry.
    Is that not what you are seeing?
     
  28. tgaldi

    tgaldi

    Joined:
    Oct 28, 2015
    Posts:
    87
    Sorry, you are correct; it does still work, but it's marked as deprecated. What should we use for OpenVR if this API is going to be removed and there is no OpenVR plugin for the new system?
     
  29. Charlicopter

    Charlicopter

    Joined:
    May 15, 2017
    Posts:
    24
    @StayTalm_Unity :
    Is there any chance we'll see Valve Index + bone hierarchies working W/O SteamVRInput 2.0 in the next year or two?
    It seems like these bone hierarchies should be a pretty standard framework moving forward - it seems universally generic to me, since it's modeling anatomical characteristics.
     
  30. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    152
    @Charlicopter
    We have a basic Hand feature. It's super basic, just a series of 21 bones (4 bones per finger, 5 fingers, 1 root bone), but gets the data into C#. It's available for Hololens 2 and Magic Leap I believe.

    But 2 things:
    1) We'd like to do better with hands. The data is there, but using it is complicated at the moment. We are looking into better ways to express it, so that gestures and actions can be more easily interpreted and the data can be read and bound to in a more developer-friendly way.
    2) OpenVR is complicated. Valve is now in charge of the OpenVR plugin, so it's up to them if they want to expose it. As well, the bone hierarchies are only available in SteamVR 2.0 (the action map system), and not available through OpenVR. This means that in order to expose those bones, Valve will either need to update their device-based APIs or migrate the entire plugin to an action-based system, which would complicate cross-platform support with Oculus and WMR. I won't be going into any real detail, but this is something we are in the process of solving, and so I would say yes to 'in the next year or two'.
     
    Charlicopter likes this.
  31. Charlicopter

    Charlicopter

    Joined:
    May 15, 2017
    Posts:
    24
    @StayTalm_Unity :
    Cool. Yeah I've been playing with your hand framework a bit but trailed off after realizing the relevant Index Controller features are walled off like that. It doesn't feel right having to implement SteamVRInput 2.0 solely for the purpose of gaining access to finger tracking on the Index when the remainder of the input data is entirely available through Unity.
    Like you said: "The data is there", but it's a bit scattered and lacks cohesiveness at the moment. I'm sure I'm preaching to the choir ;)

    Thanks for the info.
     
    StayTalm_Unity likes this.
  32. Zapgun

    Zapgun

    Joined:
    Jun 3, 2011
    Posts:
    47
    Anyone else having difficulty getting accurate readings from touchpad controls? I've set up a fairly straightforward script to detect click direction on an Oculus Go touchpad, and it seems to be reporting the wrong vectors after a few clicks. (The same problem occurs with primary2DAxisTouch as well, in case you're wondering.) Maybe the scripting is wrong? Suggestions or insights welcome.

    Code (CSharp):

    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.Events;
    using UnityEngine.XR;

    public class Touchpad : MonoBehaviour
    {
        private bool lastPressState = false;

        private List<InputDevice> devicesWithPrimary2DAxis;

        // to check whether it's being pressed
        public bool IsPressed { get; private set; }

        [Tooltip("Event when the button starts being pressed")]
        public UnityEvent OnPress;

        [Tooltip("Event when the button is released")]
        public UnityEvent OnRelease;

        private void Awake()
        {
            devicesWithPrimary2DAxis = new List<InputDevice>();
        }

        void OnEnable()
        {
            List<InputDevice> allDevices = new List<InputDevice>();
            InputDevices.GetDevices(allDevices);
            foreach (InputDevice device in allDevices)
                InputDevices_deviceConnected(device);

            InputDevices.deviceConnected += InputDevices_deviceConnected;
            InputDevices.deviceDisconnected += InputDevices_deviceDisconnected;
        }

        private void OnDisable()
        {
            InputDevices.deviceConnected -= InputDevices_deviceConnected;
            InputDevices.deviceDisconnected -= InputDevices_deviceDisconnected;
            devicesWithPrimary2DAxis.Clear();
        }

        private void InputDevices_deviceConnected(InputDevice device)
        {
            Vector2 discardedValue;
            if (device.TryGetFeatureValue(CommonUsages.primary2DAxis, out discardedValue))
            {
                devicesWithPrimary2DAxis.Add(device); // Add any devices that have a 2D axis.
            }
        }

        private void InputDevices_deviceDisconnected(InputDevice device)
        {
            if (devicesWithPrimary2DAxis.Contains(device))
                devicesWithPrimary2DAxis.Remove(device);
        }

        void Update()
        {
            bool tempState = false;
            Vector2 axis = Vector2.zero;
            foreach (var device in devicesWithPrimary2DAxis)
            {
                device.TryGetFeatureValue(CommonUsages.primary2DAxis, out axis);

                bool primaryPressState = false;
                tempState = device.TryGetFeatureValue(CommonUsages.primary2DAxisClick, out primaryPressState) // did get a value
                            && primaryPressState // the value we got
                            || tempState; // cumulative result from other controllers
            }

            if (tempState != lastPressState) // Button state changed since last frame
            {
                if (!IsPressed)
                {
                    IsPressed = true;

                    OnPress.Invoke();
                    Debug.Log("OnPress: " + GetDirection(axis));
                }
                else if (IsPressed)
                {
                    IsPressed = false;

                    OnRelease.Invoke();
                    // Debug.Log("OnRelease");
                }
                lastPressState = tempState;
            }
        }

        public enum Direction { up, right, down, left, none };

        public Direction GetDirection(Vector2 input)
        {
            Vector2[] directions = new Vector2[] {
                Vector2.up,
                Vector2.right,
                Vector2.down,
                Vector2.left
            };

            Vector2 dir = Vector2.zero;
            float max = Mathf.NegativeInfinity;

            foreach (Vector2 vec in directions)
            {
                float dot = Vector2.Dot(vec, input.normalized);

                if (dot > max)
                {
                    dir = vec;
                    max = dot;
                }
            }
            if (dir == Vector2.up)
                return Direction.up;
            else if (dir == Vector2.right)
                return Direction.right;
            else if (dir == Vector2.down)
                return Direction.down;
            else if (dir == Vector2.left)
                return Direction.left;
            else
                return Direction.none;
        }
    }
     

    Attached Files:

  33. kavanavak

    kavanavak

    Joined:
    Sep 30, 2015
    Posts:
    54
    quick question

    Problem: When my controllers [Vive Wands] are off and I run my game, then in-game turn on the controller, I don't get input from the controller. If the controllers are on before the game starts the input is read fine.

    The input debugger shows input coming into both controllers, but the scripts I have on the loaded controller model won't register the input.

    I set my input device at start:
    Code (CSharp):
    m_Controller = GetComponentInParent<XRController>();
    m_Node = m_Controller.controllerNode;
    m_InputDevice = InputDevices.GetDeviceAtXRNode(m_Node);
    and use TryGetFeatureValue calls during the Update loop:
    Code (CSharp):
    m_InputDevice.TryGetFeatureValue(CommonUsages.trigger, out m_TriggerValue);
    m_InputDevice.TryGetFeatureValue(new InputFeatureUsage<bool>("TriggerButton"), out m_TriggerButtonStatus);
    If unplugged, I get no value for m_InputDevice (GetDeviceAtXRNode), which leads me to believe I need to register for the InputDevices.deviceConnected event to set the device when it loads... Is this correct? And if so, is there a recommended way to do so in general?

    ***
    What I'm doing now to get this to work is moving the GetDeviceAtXRNode call into a function, InitializeDevice(), which I then call any time InputDevices.deviceConnected or InputDevices.deviceDisconnected is triggered (subscribed via OnEnable/OnDisable). It seems to work, but I don't know if there's a better way to solve this issue.
     
    Last edited: Feb 23, 2020
  34. Shizola

    Shizola

    Joined:
    Jun 29, 2014
    Posts:
    259
    Is there a hard limit on the number of input devices Unity can detect? It seems to be 9. Someone posted the same question here but got no reply: https://forum.unity.com/threads/unityengine-xr-openvr-maximum-9-input-devices.721715/

    I'm using XR Input to detect where on the body my 3 Vive trackers are. This works fine, but if I'm in a room with 4 Steam lighthouses, then I have 10 devices total and Unity won't detect 1 of the 3 trackers. Is there any way around this? Cheers.
     
  35. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    152
    @Shizola
    In the plugin architecture, there is no limit, however OpenVR isn't yet on that system.
    Looking into the old source code for the existing Legacy XR Settings, there is an array constant of 12, so we should be able to fill up to there. OpenVR has its own internal device limit of 64. I see nothing that should limit you to 9 on Unity's end.

    Not sure what's going on on the SteamVR end though.
     
  36. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    152
    @kavanavak

    InputDevices only knows about devices currently connected. This means InputDevices.GetDeviceAtXRNode will return an invalid device. You can check this with InputDevice.isValid. That API has to return something, so it returns an effectively null device. The other APIs will return a list of 0 elements if there is no device yet connected for that node.


    What you are doing is correct: Check for any existing devices that match what you need in [Start/Awake/OnEnable], and then register for InputDevices.deviceConnected to get any future device connections. You can often skip checking for deviceDisconnected, because a device that disconnects will simply become invalid (so you can check InputDevice.isValid in the update loop to know when it’s lost).
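    A compact sketch of that pattern (the Right-hand characteristic check and trigger read are illustrative; requires 2019.2+ for the deviceConnected callback):

    ```csharp
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.XR;

    public class RightHandWatcher : MonoBehaviour
    {
        InputDevice m_RightHand;

        void OnEnable()
        {
            // 1) Pick up anything already connected...
            var devices = new List<InputDevice>();
            InputDevices.GetDevicesAtXRNode(XRNode.RightHand, devices);
            if (devices.Count > 0)
                m_RightHand = devices[0];

            // 2) ...then listen for future connections (e.g. a controller powered on later).
            InputDevices.deviceConnected += OnDeviceConnected;
        }

        void OnDisable()
        {
            InputDevices.deviceConnected -= OnDeviceConnected;
        }

        void OnDeviceConnected(InputDevice device)
        {
            if ((device.characteristics & InputDeviceCharacteristics.Right) != 0)
                m_RightHand = device;
        }

        void Update()
        {
            // A disconnected device simply becomes invalid; no disconnect callback needed.
            if (!m_RightHand.isValid)
                return;

            if (m_RightHand.TryGetFeatureValue(CommonUsages.trigger, out float trigger))
                Debug.Log("Trigger: " + trigger);
        }
    }
    ```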
     
  37. Shizola

    Shizola

    Joined:
    Jun 29, 2014
    Posts:
    259
    I just double checked my setup using the SteamVR plugin, all trackers working as expected. Our apps need vive trackers in a room with many lighthouses. I'll have to stick with the old OpenVR set up for now, will test again when/if the new plugin arrives.
     
  38. MaxIzrinCubeUX

    MaxIzrinCubeUX

    Joined:
    Jan 13, 2020
    Posts:
    5
    Getting input from the Cosmos controllers using the XR input system always returns false for everything if SteamVR is part of the project. I couldn't strip it out, so I had to start a clean project and re-import all the assets.
    Just a heads up for anyone encountering this.
     
    ROBYER1 likes this.
  39. josrodes

    josrodes

    Joined:
    Nov 15, 2017
    Posts:
    9
    I'm having issues with the BackButton for the Oculus GO.

    Code (CSharp):
    Controller.TryGetFeatureValue(CommonUsages.menuButton, out backButtonPressed);
    It is not reliable; it returns false intermittently and I can't figure out how to get around it.

    Here is the code I'm using to test it (this works great on the Oculus Quest, BTW), so it seems to be something specific to the Go controller; even if I change it to the left hand I see the same issue.

    Code (CSharp):
    void Update()
    {
        if (rightController.isValid)
        {
            bool backButtonPressed;
            rightController.TryGetFeatureValue(CommonUsages.menuButton, out backButtonPressed);
            if (backButtonPressed)
            {
                buttonPressed.text = "MenuButton Pressed";
            }
            rightController.TryGetFeatureValue(CommonUsages.triggerButton, out backButtonPressed);
            if (backButtonPressed)
            {
                // reset the text with the trigger
                buttonPressed.text = "";
            }
        }
        if (leftController.isValid)
        {
            bool backButtonPressed;
            leftController.TryGetFeatureValue(CommonUsages.menuButton, out backButtonPressed);
            if (backButtonPressed)
            {
                buttonPressed.text = "MenuButton Pressed";
            }
            leftController.TryGetFeatureValue(CommonUsages.triggerButton, out backButtonPressed);
            if (backButtonPressed)
            {
                // reset the text with the trigger
                buttonPressed.text = "";
            }
        }
    }
    @StayTalm_Unity any idea how to get around this issue? Maybe a higher-level event I can hook into?

    Really appreciate any help here,
    Thanks!
     
  40. AgileLens

    AgileLens

    Joined:
    Mar 5, 2019
    Posts:
    10
    Hi all,

    I've been following this thread with interest and finally tried synthesizing some of the information into a single GitHub repo. It contains examples based on the work of @icave_user @fariazz and @dilmer , plus the results of some additional work of my own. Hope it's helpful; please feel free to improve upon it!
    https://github.com/ibrews/XRInputExamples
     
  41. AgileLens

    AgileLens

    Joined:
    Mar 5, 2019
    Posts:
    10
  42. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    984
  43. gjf

    gjf

    Joined:
    Feb 8, 2012
    Posts:
    39
    The reporting from
    Code (CSharp):
    InputDevices.deviceConnected
    on the Quest has changed for some devices.

    One device, running version 15.x of the OS and runtime, presents as "Oculus Touch Controller", whilst another, running 14.x, still shows "Oculus Quest Controller".

    Any insights would be appreciated.
     
  44. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    152
    @josrodes
    If I recall, the Go back button reports a flicker event: it's not reporting whether the back button is held down, only that it has been pressed, and only for one frame. Does that match what you are seeing? If not, and sometimes you get no back button at all, I have a hunch I can test out; please report a bug so I can track it.
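
    If the flicker behavior described above is what you're seeing, one possible workaround is to latch the one-frame press into your own state and query it within a short window. This is an untested sketch (the component and method names here are made up for illustration):

    ```csharp
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.XR;

    // Sketch: latch a one-frame "flicker" press of the Go back button
    // so other code can still observe it a few frames later.
    public class BackButtonLatch : MonoBehaviour
    {
        InputDevice device;
        float lastPressTime = -1f;

        void Update()
        {
            // Re-acquire the device if it isn't valid (e.g. after reconnect).
            if (!device.isValid)
            {
                var devices = new List<InputDevice>();
                InputDevices.GetDevicesAtXRNode(XRNode.RightHand, devices);
                if (devices.Count > 0) device = devices[0];
            }

            // menuButton may only read true for the single frame of the press.
            if (device.TryGetFeatureValue(CommonUsages.menuButton, out bool pressed) && pressed)
                lastPressTime = Time.time;
        }

        // True if the back button was pressed within the last `window` seconds.
        public bool WasBackPressed(float window = 0.1f)
            => lastPressTime >= 0f && Time.time - lastPressTime <= window;
    }
    ```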
     
    a436t4ataf likes this.
  45. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    152
    @gjf
    You are talking about the .name property of the connected device?

    Your app must be using the 2019.2-and-earlier Built-In XR system rather than the new Plugin Architecture: in the new plugins they are all Touch Controllers (because they are now all the same controllers).

    In the old system, looking at the code, it seems we did a simple string comparison on the device model. We were comparing against the in-development name for the Quest rather than "Quest", so I suspect that in 15.x the underlying hardware model name was updated to match the retail name. We don't intend to revert it to "Oculus Quest Controller", so your code should check for "Oculus Quest Controller" only if you need to support OS version 14.x and lower; otherwise rely on checking for "Oculus Touch Controller" on both desktop and standalone platforms.
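
    A minimal sketch of the name check described above, accepting either reported name (the helper name is made up; the strings are the ones from this thread):

    ```csharp
    using UnityEngine.XR;

    static class DeviceNameHelper
    {
        // "Oculus Quest Controller" is what OS 14.x and earlier report under
        // Built-In XR; "Oculus Touch Controller" is reported by 15.x and by
        // the new XR plugins on both desktop and standalone.
        public static bool IsQuestTouchController(InputDevice device)
        {
            return device.name.Contains("Oculus Touch Controller")
                || device.name.Contains("Oculus Quest Controller");
        }
    }
    ```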
     
    gjf and a436t4ataf like this.
  46. unity_ZGfiwpUxe1w-dw

    unity_ZGfiwpUxe1w-dw

    Joined:
    Feb 3, 2019
    Posts:
    1
    Why not just map "joystick button 15" to a new axis called "Triggered" in the Input Manager, then check in an if statement during Update whether it's higher than 0:
    Code (CSharp):
    Input.GetAxis("Triggered") > 0
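
    For reference, a sketch of that legacy Input Manager approach. It assumes an axis named "Triggered" has been defined under Edit > Project Settings > Input with Positive Button set to "joystick button 15" (that mapping is the suggestion above, not something verified here):

    ```csharp
    using UnityEngine;

    public class LegacyTriggerCheck : MonoBehaviour
    {
        void Update()
        {
            // GetAxis returns 0 or 1 for a button mapped as a positive button.
            if (Input.GetAxis("Triggered") > 0f)
            {
                Debug.Log("Trigger pressed");
            }
        }
    }
    ```

    Note this ties you to the legacy input manager and device-specific button indices, which is exactly what the XR input system discussed in this thread is meant to avoid.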
     
  47. gjf

    gjf

    Joined:
    Feb 8, 2012
    Posts:
    39

    Thanks for the quick reply. It's OK, our controller profiles are set up for all the variations, so they'll handle either name.
     
  48. joshb08

    joshb08

    Joined:
    Jan 22, 2019
    Posts:
    3
    I'm having the same issue: I get no back button at all most of the time. Maybe 1 time in 20, if I press it very rapidly, I get a back button press, but, like you said, for only what looks like one frame. I do have a bug reported here: https://fogbugz.unity3d.com/default.asp?1239095_s4bfqrphq79lhr35
     
  49. josrodes

    josrodes

    Joined:
    Nov 15, 2017
    Posts:
    9
    @joshb08 I was able to get it working by checking for the Escape key instead; it reports every event, and you can easily detect a long press, a long-press cancel, and a short press. Hope this helps.

    Code (CSharp):
    private float timeElapsed;

    void CheckForBackButton()
    {
        if (Input.GetKeyDown(KeyCode.Escape))
        {
            timeElapsed = 0f;
        }
        else if (Input.GetKey(KeyCode.Escape))
        {
            timeElapsed += Time.deltaTime;
            if (timeElapsed > .75f)
            {
                // long press
                OVRManager.PlatformUIConfirmQuit();
            }
        }
        else if (Input.GetKeyUp(KeyCode.Escape))
        {
            if (timeElapsed < .25f)
            {
                // short press
            }
            else
            {
                // long-press cancel
            }
        }
    }
     
    joshb08 likes this.
  50. Loths

    Loths

    Joined:
    Aug 18, 2017
    Posts:
    9
    Hey, I've started playing around with XR after making regular games, and I'm really struggling to understand this input system. I'm trying to check whether a button is pressed, and it all seems extremely complicated compared to the Input.GetButton() I'm used to from the old input manager.

    Here's my attempt to get the state of the X button on an Oculus controller, following the examples in the manual:

    Code (CSharp):
    void Start()
    {
        var leftHandDevices = new List<InputDevice>();
        InputDevices.GetDevicesAtXRNode(XRNode.LeftHand, leftHandDevices);

        if (leftHandDevices.Count == 1)
        {
            leftDevice = leftHandDevices[0];
        }
        else if (leftHandDevices.Count > 1)
        {
            Debug.Log("Found more than one left hand!");
        }
    }

    void Update()
    {
        if (leftDevice.TryGetFeatureValue(CommonUsages.gripButton, out buttonX))
        {
            print(buttonX);
        }
    }
    Doesn't register the button press for me. Could I please get some feedback on where I'm going wrong with this?
     