Official XR Interaction Toolkit 0.10 preview is available

Discussion in 'XR Interaction Toolkit and Input' started by mfuad, Nov 5, 2020.

  1. mfuad

    mfuad

    Unity Technologies

    Joined:
    Jun 12, 2018
    Posts:
    335
    We have just published a new preview of the XR Interaction Toolkit (XRI) that brings a few new features and a lot of bug fixes. For those who want to experiment with XRI, the best way to start is with our samples available at https://github.com/Unity-Technologies/XR-Interaction-Toolkit-Examples. As always, you can refer to our documentation for more information.


    What’s new:
    • Smooth locomotion - move around the virtual environment at a constant rate, using the trackpad or thumbstick of a VR controller
    • Support for Unity Input System - use the input system to drive VR interactions
    • Keyboard/mouse simulation - simulate input without having to deploy to device
    • Improved layout of properties in the Inspector window
    • New samples, including use of the Universal Render Pipeline
    Bug fixes:
    • Fixed some behaviors not supporting multi-object editing in the Inspector
    • Fixed PrimaryAxis2D input from mouse not moving the scrollbars on UI as expected. (1278162)
    • Fixed issue where Bezier Curve did not take into account controller tilt. (1245614)
    • Fixed issue where a socket's hover mesh was offset. (1285693)
    • Fixed issue where disabling parent before XRGrabInteractable child was causing an error in OnSelectCanceling().
    • Fixed Tracked Device Graphic Raycaster not respecting the Raycast Target property of UGUI Graphic when unchecked. (1221300)
    • Fixed XR Ray Interactor flooding the console with assertion errors when sphere cast is used. (1259554) (1266781)
    • Fixed foldouts in the Inspector to expand or collapse when clicking the label, not just the icon. (1259683)
    • Fixed created objects having a duplicate name of a sibling (1259702)
    • Fixed created objects not being selected automatically (1259682)
    • Fixed XRUI Input Module component being duplicated in EventSystem GameObject after creating it from UI Canvas menu option (1218216)
    • Fixed missing AudioListener on created XR Rig Camera (1241970)
    • Fixed several issues related to creating objects from the GameObject menu, such as broken undo/redo and proper use of context object.
    • Fixed issue where GameObjects parented under an XRGrabInteractable did not retain their local position and rotation when drawn as a Socket Interactor Hover Mesh (1256693)
    • Fixed issue where Interaction callbacks (OnSelectEnter, OnSelectExit, OnHoverEnter, and OnHoverExit) are triggered before interactor and interactable objects are updated (1231662, 1228907, 1231482)
    • Fixed compilation issue when AR Foundation package is also installed
    • Fixed the Interactor Line Visual lagging behind the controller (1264748)
    • Fixed Socket Interactor not creating default hover materials, and backwards usage of the materials (1225734)
    • Fixed Tint Interactable Visual to allow it to work with objects that have multiple materials
    Known issues:
    • Teleportation is not functional when the Continuous Move Provider sets Gravity Application Mode to Immediately
    • Teleportation position is overridden by continuous movement when both occur on the same frame
    • Custom reticles get displayed on objects without a custom reticle (1252565)
    • Socket Interactor can apply the wrong rotation to an interactable and cause the interactable to skew in scale when the interactable has a parent with a non-uniform scale (1228990)
    • Socket Interactor does not take the enabled state of the Renderer into account when drawing the hover mesh
    • Adding an interactable to a parent GameObject of an interactable that is being destroyed will cause the colliders to not be properly associated with the new interactable (1231482)
    • Controller connection interruptions disable interactors when using the Controller Manager script from the examples project (1241245)
    • Anchor manipulation in Ray Interactor is inconsistently applied based on the controller class, and deadzone should be configured in the action for Action-based controllers
    • Layer of Grab Interactable does not inherit layer of Interactor when selected, which can cause Continuous Move locomotion of the rig to be pushed away in the wrong direction when the object overlaps with the Character Controller
    • In the example VR project, the Interactor Line Visual only appears in the left eye when using Windows Mixed Reality
    Roadmap
    We now have a public roadmap available for users to see our latest plans, upvote existing feature requests, and/or submit new feature requests. We are currently working towards a public 1.0 release next year (Unity 2021.2). Most of our focus and development efforts now are on bug fixes, UX improvements, and polished documentation & samples. The feature set for public release will primarily reflect what exists today.

    Sharing feedback
    This forum is the best place to open discussions and ask questions. As mentioned above, please use the public roadmap to submit feature requests. If you encounter a bug, please use the Unity Bug Reporter in the Unity Editor, accessible via Help > Report a Bug. Include “XR Interaction Toolkit” in the title to help our team triage things appropriately!
     
  2. harleydk

    harleydk

    Joined:
    Jul 30, 2012
    Posts:
    41
    Great news!
     
  3. dnnkeeper

    dnnkeeper

    Joined:
    Jul 7, 2013
    Posts:
    84
  4. Redtail87

    Redtail87

    Joined:
    Jul 17, 2013
    Posts:
    125
    Try checking this option in the package manager @dnnkeeper
     

    Attached Files:

    dnnkeeper and chris-massie like this.
  5. educa

    educa

    Joined:
    Mar 3, 2015
    Posts:
    14
    When I make an empty new project with this version of the XR toolkit instead of the previous version, my Quest 2 doesn't find its controllers and shows me a red ray shooting in the forward direction at ground level.

    Doing the exact same thing with XR Interaction Toolkit 0.9.4 works flawlessly, so is something wrong with 0.10.0, or should some extra settings be made?
     
    Last edited: Nov 6, 2020
    Ali_V_Quest and Rs like this.
  6. chris-massie

    chris-massie

    Unity Technologies

    Joined:
    Jun 23, 2020
    Posts:
    232
    If you're using the Action-based version of the behaviors, and you're referencing an Input Action contained in an asset on the XR Controller to use for position and rotation tracking, you'll need to ensure the Action is enabled. The XR Controller will only automatically enable Input Actions that are directly defined on the component, and will require you to manage enabling or disabling the externally defined Input Actions. You can add a GameObject to your scene and add the Input Action Manager behavior to it, and then add the Input Action Asset that the Input Actions are defined in to the Action Assets list in the Inspector. That behavior will then enable all the Input Actions in that asset during its own OnEnable.

    You'll also want to make sure Active Input Handling in Edit > Project Settings > Player is set to either Input System Package (New) or Both. While in Play mode, you'll need to make sure the Game view has focus for input to be read properly. You can set an option to lock input to the Game view so you don't have to remember to do that, see https://docs.unity3d.com/Packages/c...tml#using-actions-with-action-based-behaviors.
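    If it helps to see it in script form, here is a minimal sketch of what the Input Action Manager behavior does (the class and field names below are just for illustration, not the actual component):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.InputSystem;

    // Sketch: enable every action in a referenced Input Action Asset while this
    // component is enabled, mirroring what the Input Action Manager behavior does.
    public class EnableActionsOnEnable : MonoBehaviour
    {
        [SerializeField] InputActionAsset m_ActionAsset;

        void OnEnable()
        {
            if (m_ActionAsset != null)
                m_ActionAsset.Enable();
        }

        void OnDisable()
        {
            if (m_ActionAsset != null)
                m_ActionAsset.Disable();
        }
    }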
     
  7. Bentoon

    Bentoon

    Joined:
    Apr 26, 2013
    Posts:
    98
    Awesome ! Nice work

    Quick QUESTION:
    @mfuad @chris-massie
    Do you know if there's any way to port the navigation from EditorVR ...?
    The grip-based pull & scale... like all the other XR creation tools have too (Tilt Brush/Quill/Medium, etc.)

    Would be so useful!
    Let me know if you have any ideas as how to get started
    Thanks

    ~be
     
  8. freso

    freso

    Joined:
    Mar 19, 2013
    Posts:
    73
    Nice. I hope fixing stuff like this (core Unity support) and documentation is high on your priority list.

    How about fixing the Editor inspector documentation link (the question mark) to link to the correct site also?
    For example: clicking the question mark takes me to https://docs.unity3d.com/Manual/script-XRController.html instead of https://docs.unity3d.com/Packages/c...gine.XR.Interaction.Toolkit.XRController.html
    I noticed you renamed these methods from ...Enter to ...Entered, and also made some naming convention changes (uppercase to lowercase, etc., IDE1006). Are these changes following a general Unity rule, or just for this pet project?
     
  9. pseltmann

    pseltmann

    Joined:
    Aug 2, 2019
    Posts:
    2
    I found a potential bug in the ray interactor in Preview 0.10.

    Behaviour in Preview 0.9.4:
    • Interactor has an attach transform from a fingertip
    • Interactor is disabled on startup from a script (Start() => interactor.enable = false; )
    • later on, the interactor gets enabled and shows the ray starting at the fingertip
    In Preview 0.10, the attach transform is reset to position (0, 0, 0) and identity rotation when the interactor is disabled.
    A quick dive into the code showed that OnSelectExiting is called when an interactor gets disabled. In this method, attachTransform is set to m_OriginalAttachTransform. However, m_OriginalAttachTransform is set to Vector3.zero and Quaternion.identity in Awake.

    When changing lines 481 and 482 in the Awake function of the RayInteractor.cs script to

    Code (CSharp):
    m_OriginalAttachTransform.localPosition = attachTransform ? attachTransform.localPosition : Vector3.zero;
    m_OriginalAttachTransform.localRotation = attachTransform ? attachTransform.localRotation : Quaternion.identity;
    it works like in Preview 0.9.4.

    Is this intended or just a bug?
     
  10. mfuad

    mfuad

    Unity Technologies

    Joined:
    Jun 12, 2018
    Posts:
    335
    Hi @freso, yes our focus is on stability and usability. This includes a high priority on bug fixes, polishing our samples and documentation, and UX improvements to ensure the onboarding experience of using this toolkit is optimal.
     
  11. chris-massie

    chris-massie

    Unity Technologies

    Joined:
    Jun 23, 2020
    Posts:
    232
    There's currently nothing built-in to do this out-of-the-box with XRI. One approach would be to create a behavior responsible for converting gestures with the controllers into translation and rotation amounts to either move the XR Rig or the object you are manipulating. You could serialize multiple Input System Actions that would bind to each controller's position, rotation, and grip button. The behavior would listen for the Actions that represent the grip buttons, and when both are pressed, sample and store the position and rotation of each controller, likely taking the average. Then each frame, the behavior would take the difference between the initial sampled pose and the current pose, and convert that into a desired change of position and rotation. All of that would be achievable with the Input System package.

    Depending on how your scene is structured, you could then use the position and rotation deltas you computed with that behavior to either update the Transform of the object you want to manipulate, or the XR Rig, with it pivoting around the object you are manipulating. If you're updating the XR Rig, you can write a new behavior that derives from LocomotionProvider to push those changes to the XR Rig through the LocomotionSystem.

    You may be able to search online for examples to hopefully help out with this locomotion method. Some refer to this style as "move the environment".
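    A very rough sketch of the LocomotionProvider approach, assuming two grip-button Actions and two position Actions bound to the controllers (rotation handling and smoothing are omitted, and the field names are only illustrative):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.InputSystem;
    using UnityEngine.XR.Interaction.Toolkit;

    // "Move the environment" style grab locomotion sketch. The serialized actions
    // are assumptions; assign them to the controllers' grip and position bindings.
    // The actions are assumed to be enabled elsewhere (e.g. by an Input Action Manager).
    public class GrabMoveProvider : LocomotionProvider
    {
        [SerializeField] InputActionProperty m_LeftGrip;
        [SerializeField] InputActionProperty m_RightGrip;
        [SerializeField] InputActionProperty m_LeftPosition;
        [SerializeField] InputActionProperty m_RightPosition;

        Vector3 m_StartMidpoint;
        bool m_Grabbing;

        void Update()
        {
            bool bothPressed = m_LeftGrip.action.ReadValue<float>() > 0.5f &&
                               m_RightGrip.action.ReadValue<float>() > 0.5f;

            // Midpoint of the two controllers in rig-local (tracking) space.
            Vector3 midpoint = (m_LeftPosition.action.ReadValue<Vector3>() +
                                m_RightPosition.action.ReadValue<Vector3>()) * 0.5f;

            if (bothPressed && !m_Grabbing)
            {
                m_StartMidpoint = midpoint;
                m_Grabbing = true;
            }
            else if (!bothPressed)
            {
                m_Grabbing = false;
            }

            if (m_Grabbing && BeginLocomotion())
            {
                // Pull the rig opposite to the hand movement so the world feels grabbed.
                var rigTransform = system.xrRig.transform;
                rigTransform.position += rigTransform.TransformVector(m_StartMidpoint - midpoint);
                m_StartMidpoint = midpoint;
                EndLocomotion();
            }
        }
    }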
     
    Bentoon likes this.
  12. chris-massie

    chris-massie

    Unity Technologies

    Joined:
    Jun 23, 2020
    Posts:
    232
    I created a bug issue (1291475) to get the documentation button fixed.

    Some of the renaming done this version, like changing some properties from PascalCase to camelCase, was to follow Unity style conventions for a package of this type. Most of the properties in the package were already following this convention, but a few didn't match and were updated for consistency.

    The other methods you mentioned were actually split into multiple methods. For example, OnSelectEnter was split into OnSelectEntering and OnSelectEntered. This was done to allow the different phases of an interaction state change, like "select", to be overridden and to change when the public event is invoked for the Interactor and Interactable. Before this change, the public event on the Interactor would be invoked before the Interactable had a chance to process the change, which could lead to undesired behavior. In this new version of the package, both the Interactor and Interactable will finish processing the change before the events are invoked. These public events were also renamed to match the function, for example onSelectEnter to onSelectEntered. The timing of these methods and events is visualized in the documentation, see Extending the XR Interaction Toolkit.
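    A minimal sketch of a subclass hooking both phases, assuming the 0.10 preview signatures that pass the other interactor:

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR.Interaction.Toolkit;

    // Sketch: log the "entering" and "entered" phases of a select.
    // Assumes the 0.10 preview signatures that take an XRBaseInteractor.
    public class LoggingGrabInteractable : XRGrabInteractable
    {
        protected override void OnSelectEntering(XRBaseInteractor interactor)
        {
            base.OnSelectEntering(interactor);
            Debug.Log($"{name}: select entering from {interactor.name}");
        }

        protected override void OnSelectEntered(XRBaseInteractor interactor)
        {
            base.OnSelectEntered(interactor);
            Debug.Log($"{name}: select entered from {interactor.name}");
        }
    }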
     
  13. chris-massie

    chris-massie

    Unity Technologies

    Joined:
    Jun 23, 2020
    Posts:
    232
    This is a bug, I created an issue (1291523) for you to track.

    We'll likely copy the world position and rotation of attachTransform instead of local position to support more hierarchies.
     
    Last edited: Nov 10, 2020
  14. pseltmann

    pseltmann

    Joined:
    Aug 2, 2019
    Posts:
    2
    Great, thanks for the help! Do you know when the fix will be available?
     
  15. mfuad

    mfuad

    Unity Technologies

    Joined:
    Jun 12, 2018
    Posts:
    335
    No specific ETA as we work through our backlog; we'll keep you updated via the Issue Tracker link @chris-massie shared.
     
  16. Bentoon

    Bentoon

    Joined:
    Apr 26, 2013
    Posts:
    98
    Thank you Chris! Very helpful
     
  17. RShiftStudios

    RShiftStudios

    Joined:
    Jul 30, 2020
    Posts:
    13
    Does throwing a grabbable while moving continuously work fine now?
    We implemented our own solution for continuous motion, but the throw was not working properly in the previous version.
     
  18. Mathusalemrex

    Mathusalemrex

    Joined:
    Oct 8, 2017
    Posts:
    5


    "If you're using the Action-based version of the behaviors, and you're referencing an Input Action contained in an asset on the XR Controller to use for position and rotation tracking, you'll need to ensure the Action is enabled. The XR Controller will only automatically enable Input Actions that are directly defined on the component, and will require you to manage enabling or disabling the externally defined Input Actions. You can add a GameObject to your scene and add the Input Action Manager behavior to it, and then add the Input Action Asset that the Input Actions are defined in to the Action Assets list in the Inspector. That behavior will then enable all the Input Actions in that asset during its own OnEnable."



    No clue what any of this means. Once again, imagine I am clueless, which I am...
     
  19. Redtail87

    Redtail87

    Joined:
    Jul 17, 2013
    Posts:
    125
    @Mathusalemrex

    In the new input system package, when you create new action maps, they are disabled by default, meaning the inputs will not work at runtime.

    If you are using the new XR Controllers that take advantage of the Action Maps, you will need to manually make sure the Action Maps are enabled to ensure input works at runtime.

    There is a component already made, called Input Action Manager, that will enable all action maps at runtime on start.

    tldr: put the Input Action Manager component somewhere in your scene referencing your input S.O. if you are using the new input system with XR
     
    chris-massie likes this.
  20. Mischief_Cody

    Mischief_Cody

    Joined:
    Nov 24, 2012
    Posts:
    8
    So I've created a new scene in which I add an action-based XR Rig. I add the Input Action Manager and connect the XRI Default Input Actions. And the controllers work fine, and they are able to interact with a canvas, but the camera is only updating the position, not the rotation.

    I've tried changing the Rotation Action on Tracked Pose Driver (New Input System) from the default centerEyeRotation [XR HMD] to centerEyeRotation [Oculus Headset] and it works, but I'm developing for Steam and can't have only Oculus work. Am I doing something wrong or is there something I'm missing?
     
  21. chris-massie

    chris-massie

    Unity Technologies

    Joined:
    Jun 23, 2020
    Posts:
    232
    It should work fine. If you think there is a bug, follow the steps under "Sharing feedback" in the original post.

    I'm not sure what's going wrong since the more generic XR HMD should work for all supported devices. Make sure Tracking Type is set to Rotation And Position and make sure the Game view has focus by clicking in it with your mouse while playing. To help with debugging, open Window > Analysis > Input Debugger, expand the Actions foldout, and it should have at least two actions, "Main Camera - TPD - Position" and "Main Camera - TPD - Rotation". If you can expand those foldouts, it will show the resolved controls.

    Since you mentioned using Steam: when I test with my Oculus Quest using the deprecated OpenVR Desktop package instead of the XR Plug-in Management packages, the two Actions resolve to /OculusQuest/centereyeposition and /OculusQuest/centereyerotation when the Tracked Pose Driver (New Input System) is using <XRHMD>/centerEyePosition and <XRHMD>/centerEyeRotation for the Actions. (Note that the T button will toggle between the UI selector and the raw string.)
     
  22. Mischief_Cody

    Mischief_Cody

    Joined:
    Nov 24, 2012
    Posts:
    8
    @chris-massie thanks for the reply. I do have the rotation and position tracking type set, the game view is focused, and looking at the input debugger it does show my oculus device under Main Camera - TPD - Position and Rotation. And to support SteamVR I am using the legacy VR system. I do notice that it also works when the rotation action is set to CenterEyeRotation [Any]; is there any specific difference between that and centerEyeRotation [XR HMD]? Would there be any instances that having it be the any variant would produce bugs?
     
  23. chris-massie

    chris-massie

    Unity Technologies

    Joined:
    Jun 23, 2020
    Posts:
    232
    Using [Any] (see wildcards in https://docs.unity3d.com/Packages/c...manual/ActionBindings.html#binding-resolution) will mean that it will be possible to bind to any device layout for centerEyeRotation rather than only those that derive from the XRHMD class. I don't foresee any problems with using that instead, especially if you're not registering any custom device layouts with a centerEyeRotation that you wouldn't want to use.
     
  24. wm-VR

    wm-VR

    Joined:
    Aug 17, 2014
    Posts:
    123
    With the new locomotion system integrated in this package: can the player look through walls when moving the camera too far? It would be convenient if there were a standard solution included that could be checked or unchecked on the component.

    Another question about the socket interactor: is it possible to restrict the socket space to just draggable objects with a specific tag? (Like only red spheres can go into a specific socket, or specific ammo to fill a specific gun type.) Thanks.
     
  25. dilmerval

    dilmerval

    Joined:
    Jun 15, 2013
    Posts:
    232
    Funnily enough, I had the exact same issue and it was driving me insane. It looks like moving my Scene view to my second monitor caused this issue; I moved it back to the main monitor and made sure the Game view had focus, and after doing this everything worked.
     
  26. Skinzart

    Skinzart

    Joined:
    Sep 11, 2020
    Posts:
    14
    Hi, is it possible to implement (OnHoverEnter, OnHoverExit, OnSelectEnter, OnSelectExit, HoverToSelect) in XRRayInteractor with a UI canvas?
    So we don't need to press the controller button, just hover over the UI for a few moments to select... It may also be useful to play audio or send haptic feedback when the ray controller interacts with the UI.
    I have a very short XRRayInteractor that I will use as a touch interaction from the controller to the UI canvas, but I can't because HoverToSelect doesn't work with the UI.
    If that's not possible, I have to create my own custom interactor.
    Thank you in advance
     
  27. yarsrvenge

    yarsrvenge

    Joined:
    Jun 25, 2019
    Posts:
    87
    Hi. I've been running into some problems with spawning an object to be grabbed, which worked okay in 0.7 but not in the latest version. The simplest use case is reaching your hand behind your back to grab ammo. How I have this set up today: I set a trigger when the controller is in the backpack area, and when they press the grab button, I spawn a magazine in front of their hand. At that point, the hand would simply grab it. It appears to be a timing issue now where the magazine is spawned but the grab select does not work. I tried switching to "state" instead of "state changed", and while it works, it introduces a ton of other problems. I feel like there is probably a better way to do this.

    I started going down the "StartManualInteraction" route, but that is not working as expected and no longer uses the standard on select exit events.

    Hoping someone has an idea how to do this more reliably.
     
  28. dnnkeeper

    dnnkeeper

    Joined:
    Jul 7, 2013
    Posts:
    84
    Is there any reason for XRBaseInteractable having so many internal functions we can't override? Is it not supposed to be extendable? I'd like to change OnSelectEntering to preserve the Rigidbody drag setting, which is cleared to 0 when grabbed, and restore it in the Drop function.
    Also, it seems there is no way of making a non-kinematic Rigidbody behave smoothly while in hand. I'd like to separate the visual and physics components of grabbed objects to maintain smooth visuals and interactive behaviour, but it seems cumbersome without making a custom interactable class. Also I'd like to implement multiple grab points for some objects, etc.
     
    Last edited: Nov 16, 2020
  29. Skinzart

    Skinzart

    Joined:
    Sep 11, 2020
    Posts:
    14
    Or maybe make GetCurrentUIRaycastResult public; it would help a lot.
     
  30. dpcactus

    dpcactus

    Joined:
    Jan 13, 2020
    Posts:
    53
    Am I doing something severely wrong, or can it be that none of this works with an HTC Vive? Neither locomotion nor button interaction is working for me. It works like a charm on my Quest though.
     
    XRELABS likes this.
  31. mfuad

    mfuad

    Unity Technologies

    Joined:
    Jun 12, 2018
    Posts:
    335
    Valve's OpenVR Unity plug-in requires their SteamVR plug-in for input. It does not support Unity input, and thus doesn't work with the XR Interaction Toolkit. Please refer to their documentation for more information regarding their requirements. That will change as Valve transitions to OpenXR, and Unity adds initial support for OpenXR. With OpenXR, all input from conformant OpenXR runtimes (including Valve’s runtime) will be routed through Unity and can be accessed through Unity's Input System and the XR Interaction Toolkit.
     
  32. chris-massie

    chris-massie

    Unity Technologies

    Joined:
    Jun 23, 2020
    Posts:
    232
    By default, the player is able to look through walls. You can achieve a simplistic solution that may be sufficient for your game/experience with the current version of the package, though it does involve setting up the XR Rig in a certain way. You can use a Character Controller component on the XR Rig to add a single capsule-shaped collider to represent the body and head of the player. Since it's just one single capsule collider, that does mean that you won't be able to, for example, lean your head out and over the railing of a balcony to look down since your body will be constrained.

    If you add the Character Controller Driver behavior to the rig and set it to listen to the Continuous Move Provider, that behavior will drive the center position and height of the Character Controller based on the camera position whenever that Continuous Move Provider does a locomotion event. If you set Gravity Application Mode to Immediately on the Continuous Move Provider, that will make the events fire every frame instead of just upon thumbstick input. With this configuration, as you move your head/HMD, the Character Controller Driver will try to move the Character Controller to that position. If the wall has a Collider and you try to move your head through it, the rig will be pushed away due to the collision.

    You can use layers and set the Interaction Layer Mask on the Grab Interactable and the Socket Interactor to define which objects are compatible. See https://docs.unity3d.com/Packages/c...t@0.10/manual/index.html#interactionlayermask for more.
     
    Sublinear, wm-VR and LuigiNicastro like this.
  33. chris-massie

    chris-massie

    Unity Technologies

    Joined:
    Jun 23, 2020
    Posts:
    232
    I'm sorry you had frustrations while using it. We're working on better warnings and potentially removing the need to give the Game view focus, but it isn't finalized yet.

    That is currently not possible in the package. Follow the link in the OP to submit feature requests.

    The functions you listed all operate on an Interactable, and those typically require that they have a Collider. UI interaction with a World Space Canvas with the Ray Interactor is achieved with the XRUI Input Module on the Event System, which then executes events by calling methods implemented by behaviors on those objects, similar to how Mouse clicks are handled in a non-VR game/experience.

    If you want to add the ability to hover to select for UI, there are a few ways you could go about implementing it. You could add a custom behavior to the buttons in the UI that implements IPointerEnterHandler and IPointerExitHandler, and trigger a press after hovering for long enough. Another way would be to override XRUIInputModule or write a custom Input Module to do the hover-to-select logic. To send haptic feedback, you would need to call XRBaseController.SendHapticImpulse.
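    For the first approach, here is a minimal dwell-click sketch for a UI Button; the dwell duration and the use of Button.onClick are illustrative choices, not part of the toolkit:

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.EventSystems;
    using UnityEngine.UI;

    // Hover-to-select sketch: after hovering for m_DwellTime seconds,
    // the attached Button's onClick event is invoked.
    [RequireComponent(typeof(Button))]
    public class DwellClick : MonoBehaviour, IPointerEnterHandler, IPointerExitHandler
    {
        [SerializeField] float m_DwellTime = 1.5f;

        Button m_Button;
        float m_HoverTimer;
        bool m_Hovering;

        void Awake() => m_Button = GetComponent<Button>();

        public void OnPointerEnter(PointerEventData eventData)
        {
            m_Hovering = true;
            m_HoverTimer = 0f;
        }

        public void OnPointerExit(PointerEventData eventData)
        {
            m_Hovering = false;
        }

        void Update()
        {
            if (!m_Hovering)
                return;

            m_HoverTimer += Time.deltaTime;
            if (m_HoverTimer >= m_DwellTime)
            {
                m_Hovering = false;
                m_Button.onClick.Invoke();
            }
        }
    }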
     
  34. chris-massie

    chris-massie

    Unity Technologies

    Joined:
    Jun 23, 2020
    Posts:
    232
    You should be able to call XRInteractionManager.SelectEnter to initiate the select through script after you spawn the Interactable object.
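    A rough sketch of that idea, using the public ForceSelect method (which simply calls through to the same select logic); the component and field names here are only illustrative:

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR.Interaction.Toolkit;

    // Sketch: spawn a grab interactable at the hand and force the hand to select it.
    public class SpawnAndGrab : MonoBehaviour
    {
        [SerializeField] XRInteractionManager m_Manager;
        [SerializeField] XRBaseInteractor m_HandInteractor;
        [SerializeField] XRGrabInteractable m_MagazinePrefab;

        public void SpawnIntoHand()
        {
            var magazine = Instantiate(m_MagazinePrefab,
                m_HandInteractor.attachTransform.position,
                m_HandInteractor.attachTransform.rotation);
            m_Manager.ForceSelect(m_HandInteractor, magazine);
        }
    }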

    The functions are protected internal and virtual so you're able to override and call them in a derived class. Some also have an associated event accessible through script or in the Interactable Events foldout in the Inspector window. You should be able to override some methods in XRGrabInteractable to do what you're describing. We'll also look into making more functions in that class, like Drop, virtual.

    That is something we are aware of and is on our roadmap.

    We'll look into making this public.
     
  35. Skinzart

    Skinzart

    Joined:
    Sep 11, 2020
    Posts:
    14
    Thanks for your response, I managed to get it to work using a custom input module that overrides XRUIInputModule ...

    ezgif-6-e0bff7cd5bd6.gif

    Is there a way to modify the Ray Interactor's Transform (position/rotation)?
     
    Last edited: Nov 18, 2020
  36. yarsrvenge

    yarsrvenge

    Joined:
    Jun 25, 2019
    Posts:
    87
    Thanks. I actually ended up solving this by using "ForceSelect" on the interaction manager since it appears to be public. Not sure if there is a difference with select enter but will look at it later.
     
  37. chris-massie

    chris-massie

    Unity Technologies

    Joined:
    Jun 23, 2020
    Posts:
    232
    You can create a child GameObject of the Ray Interactor and set the position/rotation you want, and then set the Ray Interactor's Attach Transform to that object. However, there's currently a bug (1291523) mentioned in post #9 that resets the pose of the originalAttachTransform that is used for the cast. A fix is currently in progress and will be included in the next patch. As a workaround until then, you can subclass XRRayInteractor and override Awake to set the position/rotation of originalAttachTransform to the pose you want.
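    A rough sketch of that workaround, assuming Awake is overridable and originalAttachTransform is accessible to subclasses as described above:

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR.Interaction.Toolkit;

    // Workaround sketch for bug 1291523: copy the Attach Transform pose into
    // originalAttachTransform so the cast uses the intended pose.
    public class PatchedRayInteractor : XRRayInteractor
    {
        protected override void Awake()
        {
            base.Awake();
            if (attachTransform != null)
            {
                originalAttachTransform.localPosition = attachTransform.localPosition;
                originalAttachTransform.localRotation = attachTransform.localRotation;
            }
        }
    }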

    Very cool gif, nice job!

    There's no difference, it just calls the other function.
     
  38. PincerGame

    PincerGame

    Joined:
    Aug 7, 2018
    Posts:
    16
    Hello @chris-massie, do we have a good example for keyboard simulation? How can I add keyboard simulation to the project?
     
    Phanou likes this.
  39. Trekopep

    Trekopep

    Joined:
    Dec 18, 2013
    Posts:
    15
    Is there a way I can manually tell the XRRayInteractor to click UI elements? Elements are correctly hovered, but I can't click them.

    My game is set up using SteamVR Input, which currently has some compatibility issues with Unity (https://github.com/ValveSoftware/unity-xr-plugin/issues/16). The only thing I'm using the XR Toolkit for is to try and interact with UI. I've tried to get it to work by overriding XRRayInteractor to make isSelectActive true when I want it to be, and I've tried using the methods here (https://skarredghost.com/2020/09/25/steamvr-unity-xr-interaction-toolkit-input/). Are isSelectActive and onSelectEntered the right places to be looking, or is there something separate for UI that I'm missing?
     
  40. RShiftStudios

    RShiftStudios

    Joined:
    Jul 30, 2020
    Posts:
    13
    Hey, there is an annoying bug while teleporting and holding an interactable:
    1. Grab an interactable.
    2. Perform a teleport.
    3. The instant the teleport has been performed, release the interactable.

    The result is that the grabbable launches forward at very high speed; this is not the desired behaviour, I would expect it to fall.

    It is reproducible in the example scene provided by you.
     
  41. DamianHell

    DamianHell

    Joined:
    Jul 3, 2017
    Posts:
    1
    Hi!
    We are currently working on a VR production using the XR Interaction Toolkit and are trying to deal with app freezes during scene loading. Both Oculus and SteamVR have systems that can render a splash screen even in that situation. Will the XR Interaction Toolkit get that functionality? The fact that the Oculus plugin is closed and doesn't display any type of splash screen makes the whole plugin troublesome to use in our project.
     
  42. chris-massie

    chris-massie

    Unity Technologies

    Joined:
    Jun 23, 2020
    Posts:
    232
    I don't know of any examples that I can point you to. If you would like to submit a feature request, use the link in the OP to our public roadmap.

    If you would like to implement a solution yourself, adding an in-game keyboard will be fairly complex, but it should be doable with the packages as they exist now. I'm going on the assumption that you are wanting to create a world-space canvas with a keyboard layout, use Ray Interactors to point at key buttons, and use it to type into Input Fields in Unity UI.

    If your use case is limited in scope, you could create a world-space canvas with the keyboard buttons you want to support, use the pointer down and up events of the buttons to notify a custom behavior that the key has been pressed/released, and use it to construct the input characters that you can then make use of. You could then copy it to whatever you need through available scripting APIs. You can look at the examples repo linked in the OP for the scene setup you'll need to support using Ray Interactors to click world-space canvas buttons. You may want to also make considerations for different keyboard layouts for localization.

    If you want to support Input Fields more generically, and have a floating in-game virtual keyboard to type into them, you'll likely want to copy the XRUIInputModule class included in the XR Interaction Toolkit and customize it for adding keyboard simulation. The Input Module is responsible for executing events and also for setting which UI object is the currently selected GameObject in the EventSystem. Since clicking the in-game keyboard buttons would steal focus from the Input Field, you'll need to modify that input module to control whether to skip the call to SetSelectedGameObject so the Input Field remains the selected object even when you pull the trigger on your virtual keyboard. Then you'll need to add to the Process loop of the input module to push a queue of Event objects to the selected Input Field. Your custom behavior that listens to the press/release events of the keyboard buttons would use Event.KeyboardEvent to construct these Event objects. The input module would check if the EventSystem.currentSelectedGameObject has an InputField/TMP_InputField component, and then call the ProcessEvent method on it so it can handle the key press. Finally, you'll need to add in logic somewhere to display the virtual in-game keyboard when the Input Field receives focus.

    There's a lot of details I'm probably missing, but hopefully this sends you down the right path if you want to try writing your own.
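    As a minimal sketch of just the event-pushing part (outside of a custom input module, and assuming each virtual key button calls OnVirtualKey with a string that Event.KeyboardEvent understands, such as "a" or "backspace"):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.EventSystems;
    using UnityEngine.UI;

    // Sketch: push a virtual key press into the currently selected Input Field.
    // Focus handling and the custom input module described above are not shown.
    public class VirtualKeyboardSender : MonoBehaviour
    {
        public void OnVirtualKey(string key)
        {
            var eventSystem = EventSystem.current;
            var selected = eventSystem != null ? eventSystem.currentSelectedGameObject : null;
            if (selected == null)
                return;

            var inputField = selected.GetComponent<InputField>();
            if (inputField == null)
                return;

            // Build a keyboard Event and let the Input Field process it as if typed.
            var keyEvent = Event.KeyboardEvent(key);
            inputField.ProcessEvent(keyEvent);
            inputField.ForceLabelUpdate();
        }
    }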
     
  43. chris-massie

    chris-massie

    Unity Technologies

    Joined:
    Jun 23, 2020
    Posts:
    232
    The Ray Interactor doesn't interact with UI elements through the Collider-based select and hover interaction states; it does so through the XRUIInputModule on the EventSystem GameObject. The isSelectActive property won't play a role with UI interaction. Instead, the XRBaseControllerInteractor.isUISelectActive property is used, which reads the value from XRBaseController.uiPressInteractionState, which is typically driven by the controller trigger.

    If you want to manually cause a UI press through script instead of from the actual controller, you'll need to override the XRBaseController.UpdateInput method to set the XRBaseController.uiPressInteractionState properties based on when you want to activate and deactivate that state for press and release.
     
    Trekopep likes this.
  44. chris-massie

    chris-massie

    Unity Technologies

    Joined:
    Jun 23, 2020
    Posts:
    232
    Please report this bug following the steps in the Sharing feedback section in the OP and we'll get that fixed.
     
  45. Michael316

    Michael316

    Joined:
    Aug 2, 2012
    Posts:
    21
    Prior to using the XR Interaction Toolkit, UI Scrollers would natively scroll with right thumbstick movement when pointing at the UI Scroller, but with the Toolkit, it doesn't. Is this a known issue, or am I missing something? Thanks!

    EDIT: I'm seeing this in the bug fixes, but I'm still having the issue with 0.10 and 1.0 that was recently released.

    • Fixed PrimaryAxis2D input from mouse not moving the scrollbars on UI as expected. (1278162)
     
    Last edited: Dec 2, 2020
    curtispelissier likes this.
  46. chris-massie

    chris-massie

    Unity Technologies

    Joined:
    Jun 23, 2020
    Posts:
    232
    We have an upcoming release that has an additional fix so that mouse input will work with UI as expected when using the XRUIInputModule when Active Input Handling of the project is set to Input System Package (rather than Input Manager (Old) or Both).

    I'm not sure that answers your question though because I don't fully understand what you mean. Can you explain what you mean by using the right thumbstick and what is pointing at the UI scroller?
     
  47. Michael316

    Michael316

    Joined:
    Aug 2, 2012
    Posts:
    21
    Yep, sorry I should have been more clear. So I have a World Space Canvas, a player setup with the standard XRRig, and a UIScroller. When using the VR controllers (Quest 2 FWIW) and aiming at the Canvas elements, all buttons work. The UI Scroller works only if I aim at it and use the trigger to "grab" it and scroll it up and down. What is not working is what worked previously (using the Oculus Integration package) which is: I am unable to aim the Touch Controller ray at the UIScroller element and use the SecondaryAxis2D to scroll up and down.

    The goal is to be able to use the Touch Controller's right thumbstick to scroll UIScrollers by aiming at them and using the stick. Thanks!
     
  48. chris-massie

    chris-massie

    Unity Technologies

    Joined:
    Jun 23, 2020
    Posts:
    232
    Ah, I understand now, thanks. That was a feature of the Oculus Integration asset package; however, we currently don't have the ability to drive IScrollHandler objects, such as Scroll Rect, implemented in the input module included with the XR Interaction Toolkit. Use the public roadmap linked in the OP to submit a feature request for this.

    There are a couple of approaches you could take if you want to implement a solution yourself. You could extend XRUIInputModule and override the DoProcess method to add the step of setting PointerEventData.scrollDelta and execute the event in the hierarchy, similarly to how the mouse wheel is handled when processing mouse input. Unfortunately, at this time most of the code in the input module you would need access to is private or internal, so you would likely need to make a copy of that class and its base class. Another approach would be to add a behavior to the UI element itself that listens for the pointer enter event, reads from an Input System Action bound to the thumbstick on the controller to get the input amount, and uses that to scroll the element in the UI through script.
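    A minimal sketch of that second approach (the action binding and the scroll speed are assumptions):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.EventSystems;
    using UnityEngine.InputSystem;
    using UnityEngine.UI;

    // Sketch: scroll a Scroll Rect with the thumbstick while the pointer hovers it.
    [RequireComponent(typeof(ScrollRect))]
    public class ThumbstickScroll : MonoBehaviour, IPointerEnterHandler, IPointerExitHandler
    {
        [SerializeField] InputActionProperty m_ScrollAction; // bind to e.g. the controller's secondary 2D axis
        [SerializeField] float m_ScrollSpeed = 1f;

        ScrollRect m_ScrollRect;
        bool m_Hovered;

        void Awake() => m_ScrollRect = GetComponent<ScrollRect>();
        void OnEnable() => m_ScrollAction.action.Enable();
        void OnDisable() => m_ScrollAction.action.Disable();

        public void OnPointerEnter(PointerEventData eventData) => m_Hovered = true;
        public void OnPointerExit(PointerEventData eventData) => m_Hovered = false;

        void Update()
        {
            if (!m_Hovered)
                return;

            Vector2 input = m_ScrollAction.action.ReadValue<Vector2>();
            m_ScrollRect.verticalNormalizedPosition = Mathf.Clamp01(
                m_ScrollRect.verticalNormalizedPosition + input.y * m_ScrollSpeed * Time.deltaTime);
        }
    }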
     
  49. Michael316

    Michael316

    Joined:
    Aug 2, 2012
    Posts:
    21
    Perfect - thank you for the detailed response. I'll try both approaches and see which one has the best result. Thanks!
     
  50. daveinpublic

    daveinpublic

    Joined:
    May 24, 2013
    Posts:
    167
    I've got an issue where I can't get any button / joystick input from my XR controllers. They are tracked in 3D space correctly. And my keyboard works using the same InputAction that the XR Controllers use.

    I'm using the action based XR Rig, have both controllers setup with all the references pointed to the presets that come with the package.

    To get the input to control my movement, I created a public InputAction called 'moveStick' in my player's script. This action is shown in the Inspector when I click on the player in my scene. I add the XR right- and left-hand controllers' primary2DAxis, and I also add keyboard up/down/left/right. When I play the game in the Editor, the keyboard buttons move the player. When I play the game in the headset, the controllers don't move the player.

    I've tried doing this with a Player Controller component and an Input Action Map, but it slowed my game down to 5 fps for some reason. I tried turning off the XR controllers' references to see if one was overruling my player script's InputAction. This has been almost a week-long issue now. I really need some help if anyone has an idea.

    Code (CSharp):
    using System.Collections;
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.UI;
    using UnityEngine.InputSystem;
    using TMPro;

    public class Script_Move : MonoBehaviour
    {
        public InputAction moveStick;
        public GameObject objToMove;
        public TextMeshProUGUI debugText;

        private void OnEnable()
        {
            moveStick.Enable();
        }

        private void OnDisable()
        {
            moveStick.Disable();
        }

        private void Awake()
        {
            moveStick.started += OnMoveStick;
            moveStick.canceled += OnMoveStick;
        }

        public void OnMoveStick(InputAction.CallbackContext context)
        {
            debugText.text = "On Move Stick - moved! \nValue type:" + context.valueType + ", \nPhase type: " + context.phase;
            var contextResult = context.ReadValue<Vector2>();

            if (context.started)
            {
                debugText.text = "Context Started: Y = " + contextResult.y.ToString() + ", X = " + contextResult.x.ToString();
                objToMove.transform.Translate(0, 0, contextResult.y);
                objToMove.transform.Rotate(0, contextResult.x, 0);
            }
            else if (context.canceled)
            {
                debugText.text = "Context Cancelled";
            }
        }
    }
    The keyboard works fine, and the XR rig is using the exact same input action that the keyboard is.

    You can see I put a TextMeshPro text item that shows a few variables to me in the scene. The way I have it setup now, when I play with the keyboard, and press the down key, it says:
    "Context Started: Y = -1, X = 0".
    And when I let go it says:
    "Context Cancelled"

    And press right to see it quickly change back to "Context Started: Y = 0, X = 1".
    And let go to quickly see "Context Cancelled".

    But when I build to XR, the message immediately says "Context Started: Y = 1.525879E-05, X = - 1.525879E-05" and it never changes. It never says "Context Cancelled". It shouldn't go over 1 or -1. But it stays at that message from beginning of game till end. Sometimes it doesn't start until I touch a joystick, and then it stays.

    But whether I press up, down, or nothing, it never changes. The only other thing I noticed is when I hit the oculus button, and the oculus context menu comes up, the message says "Context Cancelled". And when I close the oculus menu it goes back to "Context Started: Y = 1.525879E-05, X = - 1.525879E-05".
     
    Last edited: Dec 4, 2020