
Official XR Interaction Toolkit 0.10 preview is available

Discussion in 'XR Interaction Toolkit and Input' started by mfuad, Nov 5, 2020.

  1. chris-massie

    chris-massie

    Unity Technologies

    Joined:
    Jun 23, 2020
    Posts:
    231
    You are subscribing to started, which will only occur once when the control moves away from its default (0, 0) state for a Vector2 control. If you subscribe to performed, it will be invoked whenever the control changes value. You'll have to update your OnMoveStick method for this other phase. See Responding to Actions and Default Interaction in the Input System docs for more explanation about this. Unlike the keyboard arrows for a Composite 2D Vector binding, the thumbstick will constantly be changing value slightly due to the sensitivity of the sensor, even if you are trying to hold it still. If you want to absolutely ensure input is processed each frame even if the value happens not to change, you will need to read the value in a coroutine or Update loop.

    You can also just use polling to read the value rather than subscribing to the events, which can often be simpler. A number of improvements to the action polling API have been made in Input System 1.1 to make it easier to handle these phases; however, that version is still in preview.
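    As a rough illustration (the component and the action setup are just an example, not part of the toolkit), polling in Update could look something like this:

    using UnityEngine;
    using UnityEngine.InputSystem;

    public class StickPoller : MonoBehaviour
    {
        // A Vector2 action bound to something like <XRController>{LeftHand}/thumbstick
        public InputActionReference moveStickAction;

        void OnEnable() => moveStickAction.action.Enable();
        void OnDisable() => moveStickAction.action.Disable();

        void Update()
        {
            // ReadValue returns the current value every frame, even when it hasn't changed.
            OnMoveStick(moveStickAction.action.ReadValue<Vector2>());
        }

        void OnMoveStick(Vector2 value)
        {
            // Process the stick value here.
        }
    }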

    The 1.525879E-05 value you mentioned is within [-1, 1]; it is just a small number written in scientific notation.

    The FPS drop you mentioned should not happen. If you are able to get a reproducible sample and submit a bug report, we can fix that.
     
  2. TimeWalk-org

    TimeWalk-org

    Joined:
    Nov 3, 2014
    Posts:
    38
    And is this now a *newer* version (1.0.0-pre.1) as of November 17, 2020?

     
  3. metaphysician

    metaphysician

    Joined:
    May 29, 2012
    Posts:
    190
    I don't see it in the latest version of Unity 2020.1.15 - maybe it's specific to 2020.2?

    Also, I installed XR Interaction Toolkit 0.10.0 (Nov 5) and I don't see the convenient list of setups under the GameObject > XR menu shown in the manual. All that shows up is 'Convert Main Camera To XR Rig', and it does that but doesn't add controllers. I'm on Oculus Quest through Link, and I've installed the new Input System and the Oculus XR support package. It would be nice to get some working prefabs or samples. I just need head tracking and basic controller action for UI interaction and button/trigger recognition. Any help appreciated!
     
  4. TimeWalk-org

    TimeWalk-org

    Joined:
    Nov 3, 2014
    Posts:
    38
    I am seeing this under Unity 2020.1.16f1
     
  5. chris-massie

    chris-massie

    Unity Technologies

    Joined:
    Jun 23, 2020
    Posts:
    231
    The 1.0.0-pre.1 version of XR Interaction Toolkit is a pre-release package. Pre-release packages are in the process of becoming stable and are expected to be production-ready by the next LTS release (2021 LTS). Starting with the 2021.1 Alpha, Unity is changing the way packages are published and shown in the Package Manager, which is designed to provide clear guidance around a package's readiness and expected support level. Pre-release packages are supported packages. There will be additional iterations of XRI before we get to the 1.0.0 release, so expect a 1.0.0-pre.2.

    There will be more information about this posted in the future.

    Which Unity Editor version are you using? The minimum supported version for XR Interaction Toolkit is 2019.3. The 'Convert Main Camera To XR Rig' menu item is part of XR Legacy Input Helpers (com.unity.xr.legacyinputhelpers), which is a dependency of XR Interaction Toolkit. However, when XRI is properly installed in a supported version, that menu item should be replaced with the other menu items shown in the manual.

    If you are using 2021.1 Alpha, check the Console window to see if there are any warnings or errors. If you are encountering issues, you may have to install newer Release versions of the dependencies. If you have Input System 1.0.1, XR Legacy Input Helpers 2.1.6, and XR Plugin Management 3.2.17 installed, you should be able to install XR Interaction Toolkit successfully.
     
  6. metaphysician

    metaphysician

    Joined:
    May 29, 2012
    Posts:
    190
    Hi @chris-massie, the version I'm currently using is 2020.1.15, but the project was last opened in a version of Unity 2019 (I think it was 2019.2). I'll check all those supported versions of the dependencies later today/tonight and see if any need updating. Thanks for the tip!
     
  7. XRELABS

    XRELABS

    Joined:
    Nov 15, 2019
    Posts:
    1
    Same with me - the new toolkit stopped my controller input completely, so it's useless with the OpenVR Unity package from GitHub. I really hope Vive upgrades and joins the rest of the XR ecosystem like everyone else; I will be sad if I need to discard my Vive over this.
     
  8. metaphysician

    metaphysician

    Joined:
    May 29, 2012
    Posts:
    190
    Well, @chris-massie - your suggestions did seem to work for me. Just a note: I mistakenly installed Unity Mock HMD support, which was a big mistake, as it took over and I could no longer get the project to recognize the Quest headset I was using. Even uninstalling the package didn't shake it loose; I had to completely restart from scratch.

    Now I'm trying to get a stationary demo to work and I'm not getting very far. The older Initial Tracking Demo in the XR Interaction Examples VR project works fine, but it looks like it's using a different input setup than the Action-based one. If I start a blank new scene and add the Stationary Rig, I get HMD tracking, but that's it: no controllers are recognized, and they are set up with the XR/XR Controller (Device-based) script rather than the Action-based one. I'm not sure how to get around this problem. I know the main interaction demo has a manager for the hand controller that seems Action-based, but I don't need all the extra features like teleporting, turning, and grabbing objects - just pointer interaction on a flat UI, really.
     
  9. Trekopep

    Trekopep

    Joined:
    Dec 18, 2013
    Posts:
    15
    Thanks for the reply! I ended up getting it working by copying and overriding the XRRayInteractor.UpdateUIModel method, and replacing

    model.select = isUISelectActive;

    with

    model.select = SteamVR_Input.GetBooleanAction("InteractUI").GetState(mainHand);

    This means that I had to change the UpdateUIModel method in XRRayInteractor to virtual and change two variables to protected, which isn't ideal (I always prefer not to edit code in packages, since if I update the package I'll need to redo the changes), but it got the job done! UI interaction is working as expected.
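    For anyone else doing something similar, the override roughly ends up looking like the sketch below (it only compiles because UpdateUIModel was made virtual in the package source; mainHand is my own SteamVR_Input_Sources field, not toolkit API):

    using UnityEngine.XR.Interaction.Toolkit;
    using UnityEngine.XR.Interaction.Toolkit.UI;
    using Valve.VR;

    public class SteamVRRayInteractor : XRRayInteractor
    {
        // Which hand's SteamVR input source drives the UI click.
        public SteamVR_Input_Sources mainHand = SteamVR_Input_Sources.RightHand;

        // Assumes UpdateUIModel has been changed to virtual in the package.
        public override void UpdateUIModel(ref TrackedDeviceModel model)
        {
            base.UpdateUIModel(ref model);

            // Drive UI selection from the SteamVR "InteractUI" action instead of
            // the toolkit's UI Press state.
            model.select = SteamVR_Input.GetBooleanAction("InteractUI").GetState(mainHand);
        }
    }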


    To comment on your initial suggestions (ignore this if you want, since the issue is now fixed, but I figured I'd put it here for the sake of completion):
    I initially tried what you said with overriding the XRBaseController.UpdateInput method, but couldn't figure out how to get it to work, primarily because I'm not even sure how the XRController hooks up to the XRRayInteractor - i.e. the XRRayInteractor was still doing almost everything I wanted even without an XRController in the scene? Perhaps this is just a side effect of me having something set up differently, since I added the XR Toolkit after I already had most VR functionality implemented. Regardless, I tried adjusting m_UIPressInteractionState.activatedThisFrame/deactivatedThisFrame/active to no avail.
     
  10. chris-massie

    chris-massie

    Unity Technologies

    Joined:
    Jun 23, 2020
    Posts:
    231
    That sounds like it may have been a bug. Even if the package is installed, as long as Mock HMD Loader is unchecked in Edit > Project Settings > XR Plug-in Management, it shouldn't interfere as an HMD source. If you run into that problem again, or if you still have a reproducible example project, please submit a bug report so we can get that fixed.

    Try following the steps from post #6; it should hopefully fix the problems you are seeing.

    We'll look into changing the access level on those to make it easier to override.
     
  11. srizvipk

    srizvipk

    Joined:
    Jul 19, 2017
    Posts:
    2
    Hi,
    Is this issue resolved? I've tried many times with XR Interaction Toolkit 0.10.0 and then 0.9.4, but without any luck.
     
  12. chris-massie

    chris-massie

    Unity Technologies

    Joined:
    Jun 23, 2020
    Posts:
    231
    We are planning on releasing version 1.0.0-pre.2, which includes the fix for that issue, in early January.
     
  13. Skinzart

    Skinzart

    Joined:
    Sep 11, 2020
    Posts:
    14
    Hi @chris-massie, I don't know why, but when I extend XRUIInputModule, mouse interaction can't be used. I just raycast from there to perform hover to select UI, and I don't override any XRUIInputModule function. The function I made works fine, as I posted in this thread before, but I don't know why mouse interaction stops working.

    Edit: Oh, I see in XRUIInputModule that mouse and touch use the old input system, right? And I am now using the new Input System, which makes mouse and touch not work?
     
    Last edited: Dec 22, 2020
  14. vladk

    vladk

    Joined:
    Jul 10, 2008
    Posts:
    167
    Hi guys.

    I'm really new to XR, and when I watch tutorials on YouTube the XRRig script normally allows you to specify the Y Offset with the "Floor" Tracking Origin Mode, yet when I installed the package the only way to specify the Y Offset is to choose "Device". As a result my virtual headset always sticks to the floor and doesn't lift up (no matter if I have the device connected or not). What am I missing?
     
  15. chris-massie

    chris-massie

    Unity Technologies

    Joined:
    Jun 23, 2020
    Posts:
    231
    There is a bug in the current version: when Active Input Handling is set to Input System Package (New), the mouse is not processed correctly in XRUIInputModule. The next version has a fix for that issue, which we are planning on releasing in early January. As a workaround until then, you can change Active Input Handling to Both.
     
  16. chris-massie

    chris-massie

    Unity Technologies

    Joined:
    Jun 23, 2020
    Posts:
    231
    See my response in a related topic Setting camera y offset for XR Rig.

    When the Tracking Origin Mode is set to Floor, the Transform of the Camera should be driven by the Tracked Pose Driver to be the height of the HMD off the floor. The height value reported by the HMD is based off the calibrated settings on the device itself, so if it is working properly, it should match the real distance off the floor. The Camera Y Offset value is only used when the mode is set to Device to define the height from the floor at the position where the HMD's view is reset/recentered.

    Make sure that the Game view has focus for tracking and input to work correctly, and that the XR Plug-in Management settings in Edit > Project Settings have the provider for your device enabled. If you are still having problems, refer to the WorldInteractionDemo scene in the Examples project with the mode of the Rig changed to Floor.
     
  17. Thimo_

    Thimo_

    Joined:
    Aug 26, 2019
    Posts:
    59
    Hey!

    I'm sorry if I missed it somewhere, but is the XR Interaction Toolkit ready for Unity's new Input System, or is that coming in the future? Thanks in advance!
     
  18. Redtail87

    Redtail87

    Joined:
    Jul 17, 2013
    Posts:
    125
    It works with the new input system, just make sure to use the "Action Based" version of the components.
     
  19. vladk

    vladk

    Joined:
    Jul 10, 2008
    Posts:
    167
    I'm sorry, is there any documentation on how to set up objects properly for the XR Interaction Toolkit? What you have online here https://docs.unity3d.com/Packages/com.unity.xr.interaction.toolkit@0.10/manual/index.html is pretty much useless, as it doesn't explain which scripts are necessary for which actions.
    I'm trying to implement dynamic instantiation of the player's avatar via a Photon server, and I can't work out why my avatar can't move or use teleport :( It looks like the XR Toolkit scripts only work if the player's avatar is preset in the scene, with no further instantiation from scripts...
     
    Last edited: Dec 25, 2020
  20. gdarkzone

    gdarkzone

    Joined:
    May 11, 2019
    Posts:
    2
    I've tried the demo scene on the Quest 2 (native Android build, not PCVR/Link) and the input text field is not working - at least, it doesn't behave as I expected (I expect it to bring up the system keyboard so I can type).
    I could force it by adding

    <uses-feature android:name="oculus.software.overlay_keyboard" android:required="false"/>

    to the AndroidManifest.xml file. Now the system keyboard pops up and I can write; however, after writing, the XR Rays won't work. Is there a solution? Thanks for all your hard work.
     
  21. Viikable

    Viikable

    Joined:
    Apr 12, 2018
    Posts:
    10
    Hello,

    I have been trying to work with the toolkit for around a week or so now, also using the new OpenXR plugin, and things are working with mixed levels of success. I am using an HTC Vive, and I have learned the Action-based input system, got my bindings set correctly, and most things work alright: I can teleport, grab objects and release them, I made a button functionality and even a custom PositionRewind that works okay with the XR Rig, and locomotion and snap turning work well. The problem, however, seems to be that the built-in events don't do what they are supposed to, and I'm losing my mind trying to figure out what is wrong.

    So basically, I've verified through debugging that most of the interactions seem to register perfectly all the time; the events are sent both from the Inspector to methods on other objects in the scene, as well as within the XR Toolkit itself (I have debugged the OnHoverEnter event at least). The problem is that the events don't do what they are supposed to, in some cases at least. The Socket Interactor works fine for socketing the object, but it won't display the mesh of the object on hover like it's supposed to, even though the actual method inside the XR Toolkit does get called when the hover happens. I also cannot get any kind of haptic output: the event fires, and I have also tried just calling the SendHapticImpulse method myself, but it just doesn't do anything. Reading the definition of the method, I also don't see how it does what it does, as the method just seems to return itself in the base interactor class, but that's probably just beyond me.

    Also, I initially had a problem where my interactor refused to release the grabbed object (Direct Interactor) when I had bound it to the trigger press, but after binding it to grip it has been working fine; I have no idea why. It also seems that when the controller has the Ray Interactor component on it, the collider you attach to it doesn't work normally: I have a button, and the controller with the Direct Interactor collides properly with it while the other does not. I also don't understand the design decision of not being able to add both Direct and Ray Interactor components to the same controller - I could at least enable and disable them from code when I want to switch between them. How is it intended to work so that the player can both teleport and grab objects directly with the same hand? I personally don't plan to use ray interaction for object grabbing, but the occasional teleportation could be useful for sure. Is the purpose to just make a custom raycaster on the controllers and then hook that into the QueueTeleportAction method?

    EDIT: I made my own raycaster for the hand and use that to queue teleports; I would have preferred to use the one provided by the Ray Interactor but didn't see how to. Also, I have now checked that for some reason my controllers register as not supporting any haptics, which is strange as I know they do support them. I tested via code by first fetching the Vive controller and then with

    bool y = controller.TryGetHapticCapabilities(out var capabilities);
    bool x = capabilities.supportsImpulse;
    bool z = capabilities.supportsBuffer;

    and while y returns true as it should, both x and z here return false, indicating no support for any haptic feedback. What is going on?

    EDIT 2: The first edit was using the InputDevice method of finding the controller. Trying instead with an ActionBasedController reference directly - just drag-dropping one into the Inspector and calling SendHapticImpulse from that (which has a different implementation in the ActionBasedController class) - I get a return value of true, meaning it should work. I just don't get any rumble, though.
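    For reference, the second attempt is essentially the sketch below (the field name and the impulse values are just my own choices, and whether anything actually rumbles still depends on the plugin supporting haptics):

    using UnityEngine;
    using UnityEngine.XR.Interaction.Toolkit;

    public class HapticsTest : MonoBehaviour
    {
        // Drag the hand's ActionBasedController here in the Inspector.
        public ActionBasedController controller;

        public void Buzz()
        {
            // Returns true if the haptic impulse command was accepted.
            bool sent = controller.SendHapticImpulse(0.5f, 0.2f);
            Debug.Log($"Haptic impulse sent: {sent}");
        }
    }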

    Thanks for help!
     
    Last edited: Dec 30, 2020
  22. nrvllrgrs

    nrvllrgrs

    Joined:
    Jan 12, 2010
    Posts:
    62
    While overall I like what the XR Interaction Toolkit has to offer, I'm a little disappointed that Unity didn't take this opportunity to create a generic Interaction Toolkit. The generic toolkit could be used for 2D / 3D pancake games, and then the XR Interaction Toolkit would simply be an extension of it. Such a toolkit would be a boon for the community.

    Again, I'm not knocking what XRIT is - only what else it could have offered in its architecture.
     
  23. Phanou

    Phanou

    Joined:
    Jan 4, 2017
    Posts:
    11
    Hi,

    I built a Windows application with the WorldInteractionDemo.
    It works perfectly with an Oculus Rift S and an Oculus Quest + Link cable.
    I also tried to use it with a Quest without the cable, over Wi-Fi with SideQuest and the Virtual Desktop app.

    I can see my desktop, but there is no way to switch to VR mode on the Quest.
    I stay in the Virtual Desktop environment and just see the WorldInteractionDemo on a flat screen.

    It's like the application doesn't detect the Quest device behind Virtual Desktop.
    I have the impression this problem happens with XR Plug-in Management and not with OpenVR in Player/XR Settings (which is deprecated).
    Do you have any idea?
     
    Last edited: Jan 4, 2021
  24. chris-massie

    chris-massie

    Unity Technologies

    Joined:
    Jun 23, 2020
    Posts:
    231
    If you are instantiating a rig prefab, and the components such as Locomotion System and Teleportation Provider do not have references set for the XR Rig or System, it may be failing to find those objects automatically. Check that those properties are set and are referencing the GameObject you expect. Refer to the Examples project for examples of rig configurations which may help you out.
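    For example, something along these lines after instantiating the prefab (a rough sketch; adjust the lookups to your own prefab layout):

    using UnityEngine;
    using UnityEngine.XR.Interaction.Toolkit;

    public class RigSpawner : MonoBehaviour
    {
        // Prefab containing the XR Rig, Locomotion System, and Teleportation Provider.
        public GameObject rigPrefab;

        void Start()
        {
            var instance = Instantiate(rigPrefab);

            var rig = instance.GetComponentInChildren<XRRig>();
            var locomotionSystem = instance.GetComponentInChildren<LocomotionSystem>();
            var teleportationProvider = instance.GetComponentInChildren<TeleportationProvider>();

            // If these references were left unassigned in the prefab, wire them up explicitly
            // instead of relying on the components finding them automatically.
            if (locomotionSystem != null && locomotionSystem.xrRig == null)
                locomotionSystem.xrRig = rig;
            if (teleportationProvider != null && teleportationProvider.system == null)
                teleportationProvider.system = locomotionSystem;
        }
    }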

    If you believe instantiating prefabs with the XR Interaction Toolkit scripts is not working as you expect, please file a bug report so that we may address it.

    We are working to improve the documentation of the package and we'll use your feedback to make it better.

    Editing the manifest should be fine for Oculus, but we do not have a cross-platform solution in the example project at this time.

    With my testing on the Quest 1, the Ray Interactor continues to work as I expect after dismissing the Oculus system keyboard to give focus back to the application. What are you seeing not work with the Ray Interactor after using the keyboard? And does the same issue occur if you just bring up the Oculus system menu rather than the keyboard? If you have a reproducible example project you can submit, please report a bug so we can look into it.

    If the Socket Interactor is not displaying the mesh on hover, it could be that you have a material assigned to the Interactable Hover Mesh Material property in the Inspector that is not compatible with the Mesh of the Interactable. If a material is not set, the behavior will generate one at runtime that should work for most meshes as it just uses a transparent Standard shader or URP/Lit shader.

    The SendHapticImpulse method on XRBaseController is overridden by both derived classes to implement the logic for executing the haptic impulse command to the XR plugin. If you are using the OpenXR (Preview) plugin in XR Plug-in Management, it currently does not support haptics. Is that the plugin you are using?

    You should be able to use the trigger instead of the grip for the Select Action. Make sure it's set to the hand you expect, and that you saved the Input Actions asset by clicking the Save Asset button in the window (I've forgotten to do that several times). The binding path should be <XRController>{LeftHand}/triggerPressed for the left hand. If that doesn't work, it could be a bug with the HTC Vive controller with OpenXR.

    Depending on your Layer configuration, the invisible Collider you have on the Ray Interactor GameObject may be getting hit by the ray, blocking what you are trying to aim at. The expected setup of the hierarchy has an XR Controller for each Controller Interactor component, which would allow you to have a Direct Interactor for grabbing objects, and a Ray Interactor for teleporting. You aren't limited to one XR Controller per motion controller. You would then need to write a script to manage the enabled state of both interactors. This setup is used in the WorldInteractionDemo scene in the Examples project.
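    A simple version of that management script could look like the sketch below (the action used to switch modes and the field names are up to you; this is not part of the toolkit):

    using UnityEngine;
    using UnityEngine.InputSystem;
    using UnityEngine.XR.Interaction.Toolkit;

    public class InteractorSwitcher : MonoBehaviour
    {
        public XRDirectInteractor directInteractor;     // grabs nearby objects
        public XRRayInteractor rayInteractor;           // aims teleports
        public InputActionReference teleportModeAction; // e.g. a thumbstick press or button

        void Update()
        {
            // While the teleport action is held, use the ray; otherwise use direct grabbing.
            bool teleportMode = teleportModeAction != null &&
                                teleportModeAction.action.ReadValue<float>() > 0.5f;

            rayInteractor.enabled = teleportMode;
            directInteractor.enabled = !teleportMode;
        }
    }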
     
  25. chris-massie

    chris-massie

    Unity Technologies

    Joined:
    Jun 23, 2020
    Posts:
    231
    I'm not too familiar with that setup with Virtual Desktop. I found this Reddit thread with comments that may help.
     
  26. StephaneDucher

    StephaneDucher

    Joined:
    May 31, 2019
    Posts:
    3
    Thanks Chris for this link.
    Unfortunately I didn't find a solution to my problem.
    I really hope it's a configuration or process problem and not a compatibility problem with the XR Interaction Toolkit.
    To summarize:
    - With the Link cable: no problem.
    - With Virtual Desktop over Wi-Fi: the application doesn't use the device.
     
  27. StephaneDucher

    StephaneDucher

    Joined:
    May 31, 2019
    Posts:
    3
    After a discussion with the Virtual Desktop developer, I got an answer: Virtual Desktop doesn't support OpenXR.
    Unfortunately, that means applications made with the XR Interaction Toolkit currently can't use Virtual Desktop :(
     
  28. Viikable

    Viikable

    Joined:
    Apr 12, 2018
    Posts:
    10
    Okay, so haptics just aren't supported in OpenXR at all yet - that is good to know. I'll take a look at that WorldInteractionDemo setup for the dual functionality.
     
  29. LuigiNicastro

    LuigiNicastro

    Joined:
    Feb 7, 2018
    Posts:
    34
    This is amazing, I've been searching for this solution for way too long! Thank you
     
  30. abelguima

    abelguima

    Joined:
    Oct 30, 2015
    Posts:
    3
    Hi,
    I tried to run this new toolkit 0.10, but it didn't work well. When I start and put on the Oculus, the view follows the eyes but you cannot interact with the hands. To get interaction working, I have to go to the Game view outside the Oculus and click on the display with the mouse; after that everything works normally. I'm using Unity 2020.2.
     
  31. CryptopherColumbus

    CryptopherColumbus

    Joined:
    Nov 3, 2020
    Posts:
    6
    Hey Unity Forums,

    I've been encountering an issue for quite some time and have not come up with a proper solution.

    When moving with an XRGrabInteractable whose Movement Type is Kinematic, you encounter jittering. The object will start moving around and won't be smooth. This can best be seen here (I can't add a time stamp, so fast forward to 11:32).

    He also does a great job of explaining the issue.

    I've also bumped the issue here and posted some of the things I've tried https://github.com/Unity-Technologies/XR-Interaction-Toolkit-Examples/issues/30

    I would love to get Unity's take on it, and maybe a possible official fix.
     
  32. abelguima

    abelguima

    Joined:
    Oct 30, 2015
    Posts:
    3
    Interesting - after the Oculus update yesterday, this problem disappeared.
     
  33. vladk

    vladk

    Joined:
    Jul 10, 2008
    Posts:
    167
    Hey guys, here is the thing:

    We've got XRSimpleInteractable - a very useful piece of code. It allows you to specify a lot of colliders for a single interactable object, and it has a lot of basic events like "OnSelectEntered" with an XRBaseInteractor as a parameter. Is there a way, in this event, to get the specific collider that we are actually aiming at?

    Update: Wow... I'm stupid... XRRayInteractor derives from XRBaseInteractor and is basically the object we receive in OnSelectEntered and other similar events :) so basically just cast it to XRRayInteractor and call GetCurrentRaycastHit.
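    In code it's roughly this (a sketch; hook the method up to the interactable's select event in the Inspector):

    using UnityEngine;
    using UnityEngine.XR.Interaction.Toolkit;

    public class HitColliderLogger : MonoBehaviour
    {
        // Assign this to the XRSimpleInteractable's On Select Entered event.
        public void OnSelectEntered(XRBaseInteractor interactor)
        {
            if (interactor is XRRayInteractor rayInteractor &&
                rayInteractor.GetCurrentRaycastHit(out RaycastHit hit))
            {
                Debug.Log($"Selected via collider: {hit.collider.name}");
            }
        }
    }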
     
    Last edited: Jan 12, 2021
  34. mfuad

    mfuad

    Unity Technologies

    Joined:
    Jun 12, 2018
    Posts:
    335
  35. Riiich

    Riiich

    Joined:
    Sep 30, 2014
    Posts:
    19
    An old question but I found a workaround: https://stackoverflow.com/a/70887283/1040562
     
  36. LuigiNicastro

    LuigiNicastro

    Joined:
    Feb 7, 2018
    Posts:
    34
    Hey, I had this working in the past, but for some reason this fix isn't working for me anymore. Would this behave any differently with the most recent version of the toolkit?
     
  37. BelyakovM

    BelyakovM

    Joined:
    Nov 6, 2021
    Posts:
    2
    Did you solve the problem?
     
  38. KevinCastejon

    KevinCastejon

    Joined:
    Aug 10, 2021
    Posts:
    97
    Hey guys, I'm still trying to achieve a "manual grab cancel". I'm working on VR network gaming, and I need the grabbed object to be "ungrabbed" (without the grip button actually being released) when it is "stolen" by another player. I've read a lot, and none of the solutions worked. Is there a simple way to tell the toolkit "you're not grabbing (selecting) this XRGrabInteractable anymore"?
     
  39. KevinCastejon

    KevinCastejon

    Joined:
    Aug 10, 2021
    Posts:
    97
    The only hack I found was to set enabled = false and then back to true on the Interactable...
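    In other words, something like this (a sketch of that workaround, not a proper API; the method name is my own):

    using UnityEngine;
    using UnityEngine.XR.Interaction.Toolkit;

    public class GrabStealHandler : MonoBehaviour
    {
        public XRGrabInteractable grabInteractable;

        // Call this when another player steals the object over the network.
        public void ForceRelease()
        {
            if (grabInteractable != null && grabInteractable.isSelected)
            {
                grabInteractable.enabled = false; // disabling cancels the active selection
                grabInteractable.enabled = true;  // re-enable so it can be grabbed again
            }
        }
    }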