Any example of the new 2019.1 XR input system?

Discussion in 'AR/VR (XR) Discussion' started by fariazz, Feb 15, 2019.

  1. Corysia

    Corysia

    Joined:
    Mar 14, 2018
    Posts:
    92
    What is it today? With 1.40, Oculus removed that entry in the manifest. Are we still going to need that proposed checkbox?

    Also, am I right in that if I import the Oculus Integration asset, I'm overriding the Oculus libraries in the Unity Package Manager?

    That was from May. Are the XR Legacy Input Helpers still recommended?

    I'd like to avoid tying myself irrevocably to Oculus, so I'd like to make a purely XR app, even though right now I've only a Rift and Quest to work with.
     
  2. rmon222

    rmon222

    Joined:
    Oct 24, 2018
    Posts:
    46
    Hi, I'm new to this new XR input system. Do I need to install "XR Management" and "Oculus XR Plugin" to use it?
    Instead of the legacy TrackedPoseDriver, what component should I be adding to our left and right controller objects?
    I'm on 2019.3.0b4. Thanks in advance.
     
  3. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    135
    @rmon222
    You do not need to install the new XR Plugin packages; XR Input works with both our current and future backends :)

    We don't yet have an XR Input based tracked pose driver, and so I'd continue to use the TrackedPoseDriver. There will be new tools coming, but we will not deprecate the Tracked Pose Driver in the near term, and will make upgrading as painless as we can.
     
    ROBYER1 likes this.
  4. rmon222

    rmon222

    Joined:
    Oct 24, 2018
    Posts:
    46
    Thanks. It works. One question: Is there a new way to get controller velocity or is XRNodeState.TryGetVelocity() the most current?
     
  5. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    135
    Use the deviceVelocity usage!
    You may want to check trackingState first, which is a series of flags reporting the currently available tracking elements (e.g. position, velocity, angular acceleration, etc...). Depending on the current state of tracking sometimes you drop into 3DoF (rotation only) mode and won't be able to read Velocity, in which case it'll return (0,0,0) until tracking improves.
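
    A minimal sketch of that pattern (the class name and the choice of XRNode.RightHand are my assumptions, not from the post):

    Code (CSharp):
    ```csharp
    using UnityEngine;
    using UnityEngine.XR;

    // Sketch: check trackingState before trusting deviceVelocity, as suggested
    // above. In 3DoF fallback, Velocity won't be flagged and would read (0,0,0).
    public class VelocityReader : MonoBehaviour
    {
        void Update()
        {
            InputDevice device = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);
            if (!device.isValid)
                return;

            InputTrackingState state;
            if (device.TryGetFeatureValue(CommonUsages.trackingState, out state)
                && (state & InputTrackingState.Velocity) != 0)
            {
                Vector3 velocity;
                if (device.TryGetFeatureValue(CommonUsages.deviceVelocity, out velocity))
                    Debug.Log("Controller velocity: " + velocity);
            }
        }
    }
    ```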
     
  6. rmon222

    rmon222

    Joined:
    Oct 24, 2018
    Posts:
    46
    Many thanks for the quick response.
     
    StayTalm_Unity likes this.
  7. rmon222

    rmon222

    Joined:
    Oct 24, 2018
    Posts:
    46
    There's a bug in the angular velocity of a tracked controller. It doesn't take into account the rotation of the tracked object. In effect, it reports the same angular velocity no matter what direction you point the controller. Adding the rotation of the controller to reported angular velocity fixes it.
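
    The workaround described in that post, sketched out (component name and the use of the right hand are my assumptions; this is only needed on SDKs that report angular velocity in local space):

    Code (CSharp):
    ```csharp
    using UnityEngine;
    using UnityEngine.XR;

    // Sketch of the workaround: rotate the locally-reported angular velocity by
    // the controller's rotation to get a world-space value.
    public class AngularVelocityFix : MonoBehaviour
    {
        void Update()
        {
            InputDevice device = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);
            Vector3 localAngularVelocity;
            Quaternion rotation;
            if (device.TryGetFeatureValue(CommonUsages.deviceAngularVelocity, out localAngularVelocity)
                && device.TryGetFeatureValue(CommonUsages.deviceRotation, out rotation))
            {
                Vector3 worldAngularVelocity = rotation * localAngularVelocity;
                Debug.Log("World-space angular velocity: " + worldAngularVelocity);
            }
        }
    }
    ```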
     
  8. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    135
    @rmon222
    That was intentional!
    We want as many values as possible to be in the same global space, so that features don't implicitly depend on each other. The Angular acceleration and velocity are in world space by design, but as you've figured out already, you can add the known rotation in if you'd like to look for local rotations (like trying to identify a twist gesture or something similar).

    There is a bug where one of the SDKs is still local, and I can't find it offhand, but it will be fixed and made global and not the other way around.
     
  9. rmon222

    rmon222

    Joined:
    Oct 24, 2018
    Posts:
    46
    I think I'm talking about the bug you refer to. I'm using the Oculus SDK. Angular velocity seems to be in local space. Adding the rotation of the controller, ie. converting to global space, works.

    Question: if the controllers are missing at init time, do you recommend subscribing to XRDevice.deviceLoaded to discover them later? Is there a better alternative?

    Thanks,
     
  10. roomera

    roomera

    Joined:
    Sep 27, 2016
    Posts:
    17
    Hi Tom,

    Regarding getting or setting the origin mode flags - when I try to access the InputDevice's subsystem, it's always returning null. Am I doing something incorrectly? (I'm on 2019.3.0b3, building for the Quest)

    Code (CSharp):
    ```csharp
    private void OnEnable() {
        InputDevices.deviceConnected += OnDeviceConnected;
    }

    private void OnDisable() {
        InputDevices.deviceConnected -= OnDeviceConnected;
    }

    private void OnDeviceConnected(InputDevice device) {
        // Flags checked with HasFlag (the enum has no Is() helper).
        if (device.characteristics.HasFlag(InputDeviceCharacteristics.TrackedDevice | InputDeviceCharacteristics.HeadMounted)) {
            if (device.subsystem == null) {
                Debug.Log("Subsystem is null");
            } else {
                Debug.Log(device.subsystem.GetTrackingOriginMode().ToString());
            }
        }
    }
    ```
     
  11. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    135
    @rmon222
    I believe XRDevice.deviceLoaded is more about the HMD, and so if the controller is powered on _after_ the game has started it's still not going to be correct.

    Starting in 2019.2, I would suggest UnityEngine.XR.InputDevices.deviceConnected. It will call back for each new device, and there is a parallel disconnected callback to know when things are no longer available. You can keep the InputDevice structs returned from these for as long as you like so you can have some consistency about which systems are polling off of which devices, and not have to follow the pattern of getting all new devices every frame. Actually, @roomera 's sample shows the two callbacks for getting device connections.

    @roomera
    You aren't doing anything wrong; this is an unfortunate quirk of the transition to our XR Plugin Architecture. InputDevices that come from the built-in SDKs, set in the XR tab of the Player Settings panel, do not have subsystems. They can be set via the XRDevice APIs, Get/SetTrackingSpaceType. With tracking space types, Stationary generally refers to the Device tracking origin mode, and RoomScale generally refers to the Floor tracking origin mode, but it can depend somewhat on the intricacies of the SDK. If you start using the 'Oculus XR Plugin' package and our new architecture, you will start seeing InputDevices that are associated with individual Subsystems.
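
    A minimal sketch of the legacy route described above (for the built-in, pre-plugin SDKs, where no subsystem is involved; the class name is mine):

    Code (CSharp):
    ```csharp
    using UnityEngine;
    using UnityEngine.XR;

    // Sketch for the built-in SDKs: tracking space is set through XRDevice
    // rather than through an XRInputSubsystem, as explained above.
    public class LegacyTrackingSpace : MonoBehaviour
    {
        void Start()
        {
            // RoomScale ~ Floor origin mode, Stationary ~ Device origin mode.
            if (XRDevice.SetTrackingSpaceType(TrackingSpaceType.RoomScale))
                Debug.Log("Now: " + XRDevice.GetTrackingSpaceType());
            else
                Debug.Log("RoomScale not supported; still: " + XRDevice.GetTrackingSpaceType());
        }
    }
    ```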
     
    ROBYER1 likes this.
  12. roomera

    roomera

    Joined:
    Sep 27, 2016
    Posts:
    17
    @StayTalm_Unity - So I went ahead and switched from the built-in Oculus SDK to the Oculus XR Plugin, and also added the manifest (which was needed to prevent the Quest from being seen as a Go).

    It works, but now positional tracking for the controllers has a considerable amount of jitter.

    When I switch back to the built-in SDK (and remove the manifest), the jitter goes away. I am not using the Oculus Integration in either case, and was using Tracked Pose Driver.

    Is this a bug, or is it something on my end?
     
    Last edited: Oct 3, 2019
    hippocoder, fherbst and ROBYER1 like this.
  13. rmon222

    rmon222

    Joined:
    Oct 24, 2018
    Posts:
    46
    Yes. roomera's post answered my question immediately. Thank you. I ran into problems making the new XR Input work with the vive so I went back to using SteamVR plugin for vive, and the new XR Input for oculus. Hope to revisit it in a couple months.
     
  14. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    26,202
    Still not able to switch the Quest to floor tracking. Every game on the store seems fine, but mine has this weird offset. Any advice?
     
  15. Matt_D_work

    Matt_D_work

    Unity Technologies

    Joined:
    Nov 30, 2016
    Posts:
    113
    offset in which direction?
     
    hippocoder likes this.
  16. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    26,202
  17. Matt_D_work

    Matt_D_work

    Unity Technologies

    Joined:
    Nov 30, 2016
    Posts:
    113
    tracking space is its own self-contained space, where 0,0,0 (the local origin of tracking space) is determined by the device itself. in some cases it's where you start tracking, in others it's your marked-out play area, etc.

    there are two main modes that devices use: 'device tracking origin', eg: some historical device position is regarded as 0,0,0 and the device will report its position relative to that; and 'floor tracking origin', where 0,0,0 is at some point on the floor, with the device reporting its position relative to that point.

    what we generally recommend in terms of game object hierarchy is:

    XR Rig
    |-Camera Offset
    ...|-Head

    (hopefully my ascii art comes through, the camera offset is the child of the XR Rig, and the Head is the child of the Camera Offset.)

    where the 'Head' Game object has a tracked pose driver (in non relative mode) tracking the camera. the 'Camera Offset' node is used to offset the camera up from the floor when using a device tracking origin (since its data does not implicitly contain the height of the device).

    when moving the user, you move the XR Rig game object. the head is then the position in tracking space of the device, the xr rig is its anchor in unity world space. if you want to add controllers, you can put them at the same level as the 'head' game object using a TPD with the 'use relative transform' unticked. controllers also publish their position in tracking space.

    so, by dragging the XR Rig around you're really moving the transform from tracking -> world space around.

    of course you can use whatever hierarchy you want, but you'll need to figure out how to make sure that all your tracking -> world space transforms line up yourself.
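
    A rough sketch of that hierarchy in action (object names follow the example above; the teleport method itself is my own illustration):

    Code (CSharp):
    ```csharp
    using UnityEngine;

    // Sketch: the Tracked Pose Drivers on Head/controllers write tracking-space
    // poses into local transforms, so moving the XR Rig root moves the whole
    // tracking space around in Unity world space.
    public class RigTeleporter : MonoBehaviour
    {
        public Transform xrRig;  // root of: XR Rig > Camera Offset > Head
        public Transform head;   // the tracked camera object

        // Move the rig so the user's head ends up over worldTarget (height untouched).
        public void TeleportTo(Vector3 worldTarget)
        {
            Vector3 headOffset = head.position - xrRig.position;
            headOffset.y = 0f; // don't fight the tracked height
            xrRig.position = worldTarget - headOffset;
        }
    }
    ```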
     
    roomera, ROBYER1 and hippocoder like this.
  18. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    26,202
    Hi Matt! Thank you for replying.

    Yep got exactly that so far:
    [screenshot of the scene hierarchy: upload_2019-10-10_19-18-38.png]
    Camera, Left Hand and Right Hand have a TrackedPoseDriver component, and are locally positioned at 0,0,0
    The parent (Camera Rig) is currently at 0,0,0, and that's the object I'm trying to offset: I want to move the Rig up or down a little so that, perceptually in VR, a controller would be at my feet if I put it down.

    I realise now that starting in roomscale vs. sitting makes the device decide to use completely different coordinates, so I need to figure out how to handle that. I wanted to go with Unity's recommendation to use XR instead of the Oculus stuff, but there is no documentation for it and there are massive holes in my knowledge.

    From what I am reading from you, I should be looking for some kind of event or callback and modifying the Camera Rig's (Camera Offset in your example) height each time the user changes it, as it will often be different, even while the app is running.

    I am not using relative because it's marked as deprecated and I want to do everything exactly correct for the latest recommendations by Unity. I don't want to use anything that's going to be deprecated.

    So how do I a) know when the tracking changes from, say, roomscale to seated, and b) get the offset to the floor for the Camera Offset?

    It's really confusing just finding those things, or polling for changes. I honestly don't know where to look first, since there seem to be three different APIs each time I Google...
     
  19. Matt_D_work

    Matt_D_work

    Unity Technologies

    Joined:
    Nov 30, 2016
    Posts:
    113
    also don't forget to make sure that the TPDs have 'use relative transform' unticked.

    right, the origin of tracking space will change depending on how it's initialized.

    not quite. in "Floor" tracking mode, the height will be implicit in the data reported by the hmd. as an example, the position reported by the device for a user 1.8m tall standing directly at the origin of tracking space would be expected to look like (0.0, 1.8, 0.0).

    in "Device" mode, you will need to offset the camera by some value that you decide upon (eg: from a menu in your app or something similar), as the device will report (0.0, 0.0, 0.0) when a person 1.8m tall is standing where the device has decided the origin of tracking space is. you need to add an offset to compensate; what that offset is, is up to you. AFAIK there's no api to ask for that.

    a) as far as i'm aware, only you, the application, can change that during the runtime of the app. the user can pick their startup mode (which you can query via the Get version of our api), but after that, it's up to you. the only time this might not hold is if the user does a guardian system reset/reconfig.
    b) see above. it's either given to you in the data, or you'll have to either pick a value or let the user configure it.
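
    A sketch of that Device-vs-Floor rule (the 1.6f default height, the field names, and reading only the first subsystem are all my assumptions):

    Code (CSharp):
    ```csharp
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.XR;

    // Sketch: raise the 'Camera Offset' node by a chosen eye height only when
    // tracking is device-relative; in Floor mode the height is already in the
    // HMD data, as explained above.
    public class CameraOffsetHeight : MonoBehaviour
    {
        public Transform cameraOffset;      // the 'Camera Offset' node under the XR Rig
        public float deviceModeHeight = 1.6f; // arbitrary; let the user configure it

        void Start()
        {
            var subsystems = new List<XRInputSubsystem>();
            SubsystemManager.GetInstances(subsystems);
            bool floorMode = subsystems.Count > 0
                && subsystems[0].GetTrackingOriginMode() == TrackingOriginModeFlags.Floor;

            cameraOffset.localPosition = floorMode
                ? Vector3.zero
                : new Vector3(0f, deviceModeHeight, 0f);
        }
    }
    ```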
     
    Last edited: Oct 10, 2019
  20. nocanwin

    nocanwin

    Joined:
    May 7, 2009
    Posts:
    145
    I can't get trigger values from hardware trackers, even though the Vive trackers have that functionality via their pogo pins. Is there a reason why this is missing, or should it work? I ran TryGetFeatureUsages on the hardware tracker input device and it didn't return trigger, so I'm guessing it wasn't implemented. I'd like to switch to XR, but this is a blocker for me. We run 4 trackers at the same time, each with custom hardware, and I need to get trigger input from them.

    Any chance we can get this @StayTalm_Unity or @Matt_D_work?
     
    Last edited: Jan 10, 2020
  21. Gamrek

    Gamrek

    Joined:
    Sep 28, 2010
    Posts:
    116

    I was about to build a new VR project and I realised these new changes.

    I am using VRTK to develop my project, and the camera I use is the UnityCameraRig, which is set to stationary mode. After changing to this new XR input system, the camera shifts up from where it's supposed to be, and I cannot reset it using InputTracking.Recenter().

    VS tells me to use XRInputSubsystem instead, so I typed in XRInputSubsystem, but no usable functions show up in the editor. How do I set the mode back to Stationary and recenter the headset when needed? Do I need to attach a new XRInputSubsystem component, or what?
     
  22. Gamrek

    Gamrek

    Joined:
    Sep 28, 2010
    Posts:
    116

    Sorry, I found what I need from other thread:
    Code (CSharp):
    ```csharp
    List<XRInputSubsystem> subsystems = new List<XRInputSubsystem>();
    SubsystemManager.GetInstances<XRInputSubsystem>(subsystems);
    for (int i = 0; i < subsystems.Count; i++)
    {
        subsystems[i].TrySetTrackingOriginMode(TrackingOriginModeFlags.Device);
        subsystems[i].TryRecenter();
    }
    ```
    Here is what you need if you want to reset camera or set tracking mode.
     
  23. Shizola

    Shizola

    Joined:
    Jun 29, 2014
    Posts:
    217
    I've been using SteamVR but I'm trying to transition to XR Input. I've started by using TryGetFeatureValue(CommonUsages.devicePosition... to place objects representing my Vive trackers which seems to work fine.

    My problem is when I need to move the player to specific place. Using SteamVR I just move the camera rig object, and the trackers follow along.

    If I try the same approach in the XR Interaction toolkit demo, they don't move when you move the XR Rig despite being child objects.

    Is there a way to do this or am I just missing something?
     
  24. vincismurf

    vincismurf

    Joined:
    Feb 28, 2013
    Posts:
    201
    If you place your camera AND controllers under a game object that has no tracking components itself, you can move that parent game object and its children (the tracked objects) will follow along.
     
  25. Shizola

    Shizola

    Joined:
    Jun 29, 2014
    Posts:
    217
    Thanks for confirming you can do that. Double checked and I was using transform.position instead of localPosition :oops:
     
  26. tgaldi

    tgaldi

    Joined:
    Oct 28, 2015
    Posts:
    82

    This does not work with OpenVR headsets, and it seems InputTracking.Recenter() no longer works either.

    https://forum.unity.com/threads/steam-vr-plugin-reset-position-and-orientation.389588/

    Is there a solution for OpenVR headsets, since there is no OpenVR plugin for the new system?
     
    Last edited: Feb 3, 2020
  27. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    135
    InputTracking.Recenter() should actually still work for everything not under the new Plugin System. If not, that's a bug I need to fix.

    As well, the equivalent to TrySetTrackingOriginMode in the old system is: https://docs.unity3d.com/2019.2/Documentation/ScriptReference/XR.XRDevice.SetTrackingSpaceType.html

    Where
    TrackingOriginModeFlags.Floor = TrackingSpaceType.RoomScale
    &
    TrackingOriginModeFlags.Device = TrackingSpaceType.Stationary

    I understand this 2-phase series of APIs is not ideal at this point in time, I'm sorry.
    Is that not what you are seeing?
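
    A sketch combining the two routes described above (new subsystems first, legacy XRDevice/InputTracking as the fallback; the helper class is my own illustration):

    Code (CSharp):
    ```csharp
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.XR;

    // Sketch of the two-phase situation above: try the XRInputSubsystem route
    // first; fall back to the legacy APIs when no subsystems exist (i.e. the
    // built-in, pre-plugin SDKs).
    public static class RecenterHelper
    {
        public static void RecenterStationary()
        {
            var subsystems = new List<XRInputSubsystem>();
            SubsystemManager.GetInstances(subsystems);

            if (subsystems.Count > 0)
            {
                foreach (var s in subsystems)
                {
                    // Device origin mode ~ the old Stationary tracking space type.
                    s.TrySetTrackingOriginMode(TrackingOriginModeFlags.Device);
                    s.TryRecenter();
                }
            }
            else
            {
                XRDevice.SetTrackingSpaceType(TrackingSpaceType.Stationary);
                InputTracking.Recenter();
            }
        }
    }
    ```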
     
  28. tgaldi

    tgaldi

    Joined:
    Oct 28, 2015
    Posts:
    82
    Sorry, you are correct; it does still work, but it's marked as deprecated. What should we use for OpenVR if this API is going to be removed and there is no OpenVR plugin for the new system?
     
  29. Charlicopter

    Charlicopter

    Joined:
    May 15, 2017
    Posts:
    6
    @StayTalm_Unity :
    Is there any chance we'll see Valve Index + bone hierarchies working W/O SteamVRInput 2.0 in the next year or two?
    It seems like these bone hierarchies should be a pretty standard framework moving forward - it seems universally generic to me, since it's modeling anatomical characteristics.
     
  30. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    135
    @Charlicopter
    We have a basic Hand feature. It's super basic, just a series of 21 bones (4 bones per finger, 5 fingers, 1 root bone), but gets the data into C#. It's available for Hololens 2 and Magic Leap I believe.

    But 2 things:
    1) We'd like to do better with hands. The data is there, but using it is complicated at the moment. We are looking into better ways to express it, so that gestures and actions can be more easily interpreted and the data can be read and bound to in a more developer-friendly way.
    2) OpenVR is complicated. Valve is now in charge of the OpenVR plugin, so it's up to them if they want to expose it. As well, the bone hierarchies are only available in SteamVR 2.0 (the action map system), and not in OpenVR. This means that in order to expose those bones, Valve will either need to update their device-based APIs or migrate the entire plugin to an action-based system, which would complicate cross-platform support with Oculus and WMR. I won't be going into any real detail, but this is something we are in the process of solving, and so I would say yes for 'in the next year or two'.
     
    Charlicopter likes this.
  31. Charlicopter

    Charlicopter

    Joined:
    May 15, 2017
    Posts:
    6
    @StayTalm_Unity :
    Cool. Yeah I've been playing with your hand framework a bit but trailed off after realizing the relevant Index Controller features are walled off like that. It doesn't feel right having to implement SteamVRInput 2.0 solely for the purpose of gaining access to finger tracking on the Index when the remainder of the input data is entirely available through Unity.
    Like you said: "The data is there", but it's a bit scattered and lacks cohesiveness at the moment. I'm sure I'm preaching to the choir ;)

    Thanks for the info.
     
    StayTalm_Unity likes this.
  32. Zapgun

    Zapgun

    Joined:
    Jun 3, 2011
    Posts:
    47
    Anyone else having difficulty getting accurate readings from touchpad controls? I've set up a fairly straightforward script to detect click direction on an Oculus Go touchpad, and it seems to be reporting the wrong vectors after a few clicks. (The same problem occurs with primary2DAxisTouch as well, in case you're wondering.) Maybe the scripting is wrong? Suggestions or insights welcome.

    Code (CSharp):
    ```csharp
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.Events;
    using UnityEngine.XR;

    public class Touchpad : MonoBehaviour
    {
        private bool lastPressState = false;

        private List<InputDevice> devicesWithPrimary2DAxis;

        // to check whether it's being pressed
        public bool IsPressed { get; private set; }

        [Tooltip("Event when the button starts being pressed")]
        public UnityEvent OnPress;

        [Tooltip("Event when the button is released")]
        public UnityEvent OnRelease;

        private void Awake()
        {
            devicesWithPrimary2DAxis = new List<InputDevice>();
        }

        void OnEnable()
        {
            List<InputDevice> allDevices = new List<InputDevice>();
            InputDevices.GetDevices(allDevices);
            foreach (InputDevice device in allDevices)
                InputDevices_deviceConnected(device);

            InputDevices.deviceConnected += InputDevices_deviceConnected;
            InputDevices.deviceDisconnected += InputDevices_deviceDisconnected;
        }

        private void OnDisable()
        {
            InputDevices.deviceConnected -= InputDevices_deviceConnected;
            InputDevices.deviceDisconnected -= InputDevices_deviceDisconnected;
            devicesWithPrimary2DAxis.Clear();
        }

        private void InputDevices_deviceConnected(InputDevice device)
        {
            Vector2 discardedValue;
            if (device.TryGetFeatureValue(CommonUsages.primary2DAxis, out discardedValue))
            {
                devicesWithPrimary2DAxis.Add(device); // Add any devices that have a 2D axis.
            }
        }

        private void InputDevices_deviceDisconnected(InputDevice device)
        {
            if (devicesWithPrimary2DAxis.Contains(device))
                devicesWithPrimary2DAxis.Remove(device);
        }

        void Update()
        {
            bool tempState = false;
            Vector2 axis = Vector2.zero;
            foreach (var device in devicesWithPrimary2DAxis)
            {
                device.TryGetFeatureValue(CommonUsages.primary2DAxis, out axis);

                bool primaryPressState = false;
                tempState = device.TryGetFeatureValue(CommonUsages.primary2DAxisClick, out primaryPressState) // did get a value
                            && primaryPressState // the value we got
                            || tempState; // cumulative result from other controllers
            }

            if (tempState != lastPressState) // Button state changed since last frame
            {
                if (!IsPressed)
                {
                    IsPressed = true;
                    OnPress.Invoke();
                    Debug.Log("OnPress: " + GetDirection(axis));
                }
                else
                {
                    IsPressed = false;
                    OnRelease.Invoke();
                    // Debug.Log("OnRelease");
                }
                lastPressState = tempState;
            }
        }

        public enum Direction { up, right, down, left, none };

        public Direction GetDirection(Vector2 input)
        {
            Vector2[] directions = new Vector2[] {
                Vector2.up,
                Vector2.right,
                Vector2.down,
                Vector2.left
            };

            Vector2 dir = Vector2.zero;
            float max = Mathf.NegativeInfinity;

            foreach (Vector2 vec in directions)
            {
                float dot = Vector2.Dot(vec, input.normalized);
                if (dot > max)
                {
                    dir = vec;
                    max = dot;
                }
            }

            if (dir == Vector2.up)
                return Direction.up;
            else if (dir == Vector2.right)
                return Direction.right;
            else if (dir == Vector2.down)
                return Direction.down;
            else if (dir == Vector2.left)
                return Direction.left;
            else
                return Direction.none;
        }
    }
    ```
     


  33. kavanavak

    kavanavak

    Joined:
    Sep 30, 2015
    Posts:
    47
    quick question

    Problem: When my controllers [Vive Wands] are off and I run my game, then in-game turn on the controller, I don't get input from the controller. If the controllers are on before the game starts the input is read fine.

    The input debugger shows input coming into both controllers, but the scripts I have on the loaded controller model won't register the input.

    I set my input device at start:
    Code (CSharp):
    ```csharp
    m_Controller = GetComponentInParent<XRController>();
    m_Node = m_Controller.controllerNode;
    m_InputDevice = InputDevices.GetDeviceAtXRNode(m_Node);
    ```
    and use TryGetFeatureValue calls during the Update loop:
    Code (CSharp):
    ```csharp
    m_InputDevice.TryGetFeatureValue(CommonUsages.trigger, out m_TriggerValue);
    m_InputDevice.TryGetFeatureValue(new InputFeatureUsage<bool>("TriggerButton"), out m_TriggerButtonStatus);
    ```
    If the controller is off, I get an invalid m_InputDevice from GetDeviceAtXRNode, which leads me to believe I need to subscribe to the InputDevices.deviceConnected event to set the device when it connects. Is this correct? And if so, is there a recommended way to do so in general?

    ***
    What I'm doing now to get this to work is moving the GetDeviceAtXRNode call into a function "InitializeDevice()" which I then call anytime InputDevices.deviceConnected or InputDevices.deviceDisconnected is triggered (via OnEnable OnDisable) - it seems to work but I don't know if there's a better way to solve this issue
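
    A sketch of that re-initialization pattern (the InitializeDevice name follows the post above; the class wrapper and the choice of XRNode.RightHand are my assumptions):

    Code (CSharp):
    ```csharp
    using UnityEngine;
    using UnityEngine.XR;

    // Sketch: re-resolve the InputDevice whenever devices connect or
    // disconnect, so controllers powered on mid-game are picked up.
    public class ControllerInput : MonoBehaviour
    {
        private InputDevice m_InputDevice;

        void OnEnable()
        {
            InputDevices.deviceConnected += OnDeviceChanged;
            InputDevices.deviceDisconnected += OnDeviceChanged;
            InitializeDevice();
        }

        void OnDisable()
        {
            InputDevices.deviceConnected -= OnDeviceChanged;
            InputDevices.deviceDisconnected -= OnDeviceChanged;
        }

        void OnDeviceChanged(InputDevice device) { InitializeDevice(); }

        void InitializeDevice()
        {
            m_InputDevice = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);
        }

        void Update()
        {
            float trigger;
            if (m_InputDevice.isValid
                && m_InputDevice.TryGetFeatureValue(CommonUsages.trigger, out trigger))
                Debug.Log("Trigger: " + trigger);
        }
    }
    ```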
     
    Last edited: Feb 23, 2020 at 12:10 AM