
Official XR Interaction Toolkit Preview Release (0.9)

Discussion in 'XR Interaction Toolkit and Input' started by mfuad, Dec 17, 2019.

Thread Status:
Not open for further replies.
  1. rslarner

    rslarner

    Joined:
    Mar 17, 2020
    Posts:
    1
    I'm having this exact same error. I tried uninstalling and reinstalling Unity. I do have DLLs under the PackageCache folder. Any other things to try?
     
  2. tkw722

    tkw722

    Joined:
    Jan 31, 2018
    Posts:
    1
    Hi All! Long time software developer here that is just scratching the surface of some Quest development using Unity and in particular the new XR Interaction Toolkit. I'm absolutely still learning best practices and the appropriate patterns for how to do common things. That being said, I'm a little stuck at how best to approach something.

    I would like to create a script, which I'm calling InteractorManager, that helps the user toggle between a short-range XR Ray Interactor used for object interaction and a long-range XR Ray Interactor used for teleportation. I'm using the standard XR Rig, and my first instinct was to put a custom script on the XR Rig and hand it the XR Ray Interactor as well as the XR Interactor Line Visual and LineRenderer (so I can adjust the line to cue the user into which interactor they're working with, adjusting it appropriately for each use case). I'm considering some button or stick combo to trigger the switch between the two interactor modes, but I'm stuck at what feels like a pretty basic point: how do I reference the types for XR Ray Interactor and XR Interactor Line Visual? In my script, I can't seem to include UnityEngine.XR.Interactor.Toolkit (if that's the correct include). I've been reading through the manual here: https://docs.unity3d.com/Packages/com.unity.xr.interaction.toolkit@0.9/manual/index.html but have not found much detail on how to work with the whole thing programmatically. Curious if anyone can share any insights on the approach I'm taking here.

    Edit: Turns out I should indeed be able to reference those types and it was just my VSCode not being able to properly reference them. I've got that corrected now so I think I'm all set, please disregard!
     
    Last edited: May 3, 2020
    gjf likes this.
  3. Stacklucker

    Stacklucker

    Joined:
    Jan 23, 2015
    Posts:
    82
    Hello XRI developers. Is there any way to force-select an Interactable, i.e. force-snap it to an interactor? I really think this should be made an easy-to-access option, just like the snap-at-start option you added.

    I am only finding an internal ForceSnap method in the XRInteractionManager class, but I don't know how to access it.
    Any help would be much appreciated; I've been struggling with this for a while now.

    Cheers for all the great work you've put into the package so far.
     
  4. DEGUEKAS

    DEGUEKAS

    Joined:
    Jul 12, 2018
    Posts:
    26
    A question: will Unity XR at some point add the "snap offset" from the OVRGrabbable2 script in Oculus Integration? It helped us adjust the position of the held object.
     
  5. a436t4ataf

    a436t4ataf

    Joined:
    May 19, 2013
    Posts:
    1,932
    Is that different from the attachTransform that's already in XR?
     
  6. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,453
    What's stopping you from setting the 'Attach Transform' to an object childed to your hand transforms that you can move around to offset where grabbables attach?
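    A minimal runtime sketch of that offset-child idea (hypothetical script and offset values; assumes attachTransform is assignable in your XRIT version):

    ```csharp
    using UnityEngine;
    using UnityEngine.XR.Interaction.Toolkit;

    // Creates a child of the interactor's transform and assigns it as the
    // attach point, so grabbables snap to the hand with an offset.
    public class AttachOffset : MonoBehaviour
    {
        public XRBaseInteractor interactor;                          // assigned in inspector
        public Vector3 localOffset = new Vector3(0f, -0.02f, 0.05f); // hypothetical offset

        void Awake()
        {
            var attach = new GameObject("AttachPoint").transform;
            attach.SetParent(interactor.transform, false); // keep local space of the hand
            attach.localPosition = localOffset;
            interactor.attachTransform = attach;
        }
    }
    ```

    Moving the AttachPoint child at runtime then moves where grabbables attach, without touching the interactor itself.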
     
  7. ThisIsNik

    ThisIsNik

    Joined:
    Oct 28, 2019
    Posts:
    9
    Edit: Nevermind, I forgot to include
    base.OnHoverEnter(interactable);



    I've been trying out OnHoverEnter() from the XR Direct Interactor.

    It's being called every frame instead of only when the controller enters hover mode; is this intentional?

    I thought it would behave the same way as OnTriggerEnter() or OnCollisionEnter(), being called only once.
     
    Last edited: May 8, 2020
  8. Aaron-Meyers

    Aaron-Meyers

    Joined:
    Dec 8, 2009
    Posts:
    305
    I think it would be nice to have some kind of control to override where the XR Ray Interactor raycasts from. For instance, I might want to raycast from an extended index finger of a hand, not necessarily the 0,0,0 origin of the XR Controller. Maybe an optional Transform you can drop into a slot in the inspector, so that it would use its Z forward too?
     
  9. tom_willmowski

    tom_willmowski

    Joined:
    May 23, 2015
    Posts:
    3
    Has anyone managed to connect XR Interaction Toolkit with new Unity Input System?
     
  10. hareharu

    hareharu

    Joined:
    Nov 22, 2014
    Posts:
    5
    I am trying to add the ability to interact with different in-game elements by looking at them and pressing some action button. I added an XRRayInteractor on the GameObject with the Camera and set ControllerNode in the XRController script to CenterEye (and disabled the TrackedPoseDriver, since XRController will handle tracking). But, of course, there are no buttons on the headset itself, and there is no way to select the usage buttons on a device other than the one in ControllerNode. With a small script I was able to get this to work when interacting with XRInteractables, using the GetHoverTargets method of the interactor and invoking SelectEnter/SelectExit on the InteractionManager.
    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR;
    using UnityEngine.XR.Interaction.Toolkit;
    using System.Collections.Generic;

    public class ForceInteraction : MonoBehaviour
    {
        public XRNode controllerNode;
        public InputHelpers.Button selectUsage;
        public XRBaseInteractor interactor;
        public XRInteractionManager interactionManager;

        private InputDevice inputDevice;
        private XRBaseInteractable selectedInteractable;
        private List<XRBaseInteractable> hoverTargets = new List<XRBaseInteractable>();
        private bool lastButtonState = false;

        void Start()
        {
            inputDevice = InputDevices.GetDeviceAtXRNode(controllerNode);
        }

        void Update()
        {
            bool currentButtonState = inputDevice.IsPressed(selectUsage, out bool pressed) && pressed;
            if (currentButtonState != lastButtonState)
            {
                if (currentButtonState)
                {
                    interactor.GetHoverTargets(hoverTargets);
                    if (hoverTargets.Count != 0)
                    {
                        // SelectEnter_public / SelectExit_public are wrappers exposed on
                        // XRInteractionManager; the built-in methods are internal in 0.9.
                        interactionManager.SelectEnter_public(interactor, hoverTargets[0]);
                        selectedInteractable = hoverTargets[0];
                    }
                }
                else
                {
                    if (selectedInteractable)
                    {
                        interactionManager.SelectExit_public(interactor, selectedInteractable);
                        selectedInteractable = null;
                    }
                }
                lastButtonState = currentButtonState;
            }
        }
    }
    But now the fun part: how can I get this to work with UI elements on the canvas?
     
    linojon likes this.
  11. bentunnadine

    bentunnadine

    Joined:
    Oct 21, 2016
    Posts:
    9
    Is it possible to use multiple different interactors per hand?

    Currently I have a ray interactor on my left and a direct interactor on my right. The ray interactor is currently just used for teleportation and I would like to have a direct interactor as well for normal interaction. Is there a workaround to get this to work or would I be better rolling my own interactor for this? (I would also like another ray interactor for UI control in menus, but that would be easy enough to simply swap out the hands for "UI Hands" when the menu is accessed)
     
  12. rinkymehra

    rinkymehra

    Joined:
    Mar 20, 2020
    Posts:
    4
    I have the WorldInteractionDemo (Unity 2019.3.0) installed, but when I'm in play mode on the Rift there's just the loading screen and it doesn't move on.

     
    Last edited: May 19, 2020
  13. Aaron-Meyers

    Aaron-Meyers

    Joined:
    Dec 8, 2009
    Posts:
    305
    I have a setup I made with two different ray interactors, one for interacting with UI and the other for teleporting. I wanted the teleport to be like the WMR/HL:Alyx style, so I have a script that turns off the UI ray interactor and turns on the teleport ray interactor when the joystick is in the UP position; then, when the joystick is released, I trigger the teleport and switch back to the other ray interactor.

    So I'd come up with a system for switching between your different interactors.
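    A minimal sketch of that switching logic (hypothetical field names; assumes enabling/disabling the interactor GameObjects registers/unregisters them with the interaction manager, which is how XRIT 0.9 behaves):

    ```csharp
    using UnityEngine;
    using UnityEngine.XR;
    using UnityEngine.XR.Interaction.Toolkit;

    // Switches between a UI ray interactor and a teleport ray interactor
    // based on the thumbstick's vertical axis, as described above.
    public class InteractorSwitcher : MonoBehaviour
    {
        public XRRayInteractor uiRayInteractor;        // hypothetical: assigned in inspector
        public XRRayInteractor teleportRayInteractor;  // hypothetical: assigned in inspector
        public XRNode controllerNode = XRNode.RightHand;
        const float Deadzone = 0.7f;

        void Update()
        {
            var device = InputDevices.GetDeviceAtXRNode(controllerNode);
            device.TryGetFeatureValue(CommonUsages.primary2DAxis, out Vector2 stick);
            bool teleportMode = stick.y > Deadzone;

            // Only one interactor is active at a time; toggling the GameObjects
            // also toggles their line visuals.
            uiRayInteractor.gameObject.SetActive(!teleportMode);
            teleportRayInteractor.gameObject.SetActive(teleportMode);
        }
    }
    ```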
     
    davl3232 likes this.
  14. julietteSta

    julietteSta

    Joined:
    Jan 8, 2018
    Posts:
    1
    Hi,

    I use unity 2019.3.13f1 and XR Interaction Toolkit 0.9.4.

    I have a camera with an XRRayInteractor script and I would like to look at 2 seconds a game object with a SimpleInteractiveXR script to make it pass from « hover » state to « select » state.

    On The Camera :
    camera.PNG

    On the GameObject :
    cylinder.PNG

    To do that :
    In the XRRayInteractor script, I checked from the inspector the "Hover To Select" box and set the time at 2s but after looking at my game object during 2s, its does not go to the « select » state.
    The "Hover" state works fine.

    Here is what I have debugged :

    Line 288 of XRInteractionManager, « if (interactor.isSelectActive) », is always false.
    This variable is set in XRBaseControllerInteractor at line 219 and, in the editor, is linked to "Select Action Trigger". But it seems that each option (State, State Change, or Toggle/Sticky) waits for a user action, and in my case I don't have one.

    Am I missing something, or is it a bug?
    Thank You
     
  15. FishStickies94

    FishStickies94

    Joined:
    Jul 27, 2019
    Posts:
    70
    The lack of any form of smooth locomotion system in Unity is a little annoying, considering all other VR systems have one built in. While it's simple enough to create your own, everything I have tried just fights the velocity-tracking option, causing jitters. It would be good if there were a system that worked with velocity tracking and kept it updated on the player's position, so it could ignore the locomotion movement.
     
    Matt_D_work and a436t4ataf like this.
  16. yarsrvenge

    yarsrvenge

    Joined:
    Jun 25, 2019
    Posts:
    87
    I’ve noticed a couple of things with the toolkit.

    1. This one has been mentioned on here a few times, but I've yet to see anyone provide a hack to fix it. When using the Instantaneous movement type on an interactable, it is not using the attachTransform; it seems to be using the center of the rigidbody instead. It works OK on the other two movement types, but those are completely unusable as far as lag is concerned. This bug poses big problems because I have some grabbables with pieces (each with a collider) that may or may not be enabled. This changes the center of the object and throws off the positioning. I could code around it by having different attach transforms for each attachment type.

    2. This bug is a bit more serious. When using instantaneous, the raycast I have on the gun interactable starts doing weird things (going in weird directions not following transform.forward) This happens only after I do something on the interactable itself. I do some things with the children of this game object like hiding, disabling colliders, (ejecting magazine, reload) and that is when the bug starts to occur. Haven’t been able to nail down what is causing it yet, but might be related to child colliders and the internal list that is stored (mentioned in an earlier post on here)
     
    a436t4ataf likes this.
  17. FishStickies94

    FishStickies94

    Joined:
    Jul 27, 2019
    Posts:
    70
    @yarsrvenge The effect you are getting is because enabling and disabling colliders, or changing them to triggers, affects the center of mass of the object, which affects the grab position in ways I'm not 100% certain of. My current workaround is simply setting the center of mass in Awake. If you need a dynamic CoM I don't know a solution, but otherwise you can simply do this in Awake:

    Code (CSharp):
    Vector3 centerOfMass = rigidBody.centerOfMass;
    rigidBody.centerOfMass = centerOfMass; // assigning it once stops Unity recomputing it when colliders toggle
    That will now fix the center of mass permanently, regardless of which colliders are toggled on or off. You can also manually set this point for Instantaneous, if that is what's affecting the grab position. Though, looking through the interactable script, it does look like it uses the attach point.

    EDIT: I've also managed to get smooth locomotion and velocity tracking with zero jitters. It's very hacky and I wouldn't recommend it, but if anyone else needs it: make velocity tracking happen in Update rather than FixedUpdate, then make the interactable a child of the XR Rig when it's grabbed. I know that's not great, since it's physics calculation that should be done in FixedUpdate, but there is simply no other way to get it to work, and I have a sinking feeling this XR Toolkit will barely be touched by Unity, so it's on us to work out solutions.
     
    Last edited: May 15, 2020
    a436t4ataf likes this.
  18. yarsrvenge

    yarsrvenge

    Joined:
    Jun 25, 2019
    Posts:
    87
    Thanks for the tip on center of mass. Hopefully that will sort out problem number 2. I'll try later today.

    EDIT: That solved #1, since it forces the CoM before I adjust any attachments or anything. I still have an issue with #2 and raycasts. If I drop the weapon and pick it back up, it works fine until I reload. I set up a Debug.DrawLine on the raycast and can see it is not using transform.forward, but I'm not exactly sure what's going on. It just goes in weird directions, sometimes lower, sometimes upper right. This worked fine prior to the toolkit, so obviously something about how Instantaneous works is causing issues with the raycasts after modifying the held object. I'll keep plugging away at it to try and figure it out.
     
    Last edited: May 15, 2020
  19. FishStickies94

    FishStickies94

    Joined:
    Jul 27, 2019
    Posts:
    70
    That's very odd. I only use velocity tracking, so I know nothing of Instantaneous. I would look in the XRGrabInteractable and XRBaseInteractable scripts and see if you can spot anything going on in there.
     
  20. TimeWalk-org

    TimeWalk-org

    Joined:
    Nov 3, 2014
    Posts:
    38
    Was the XR Interaction Toolkit removed from Package Manager in 2020.1? It no longer appears. Nor does the "Show Preview Packages" option. Is this documented anywhere?
     
  21. R1PFake

    R1PFake

    Joined:
    Aug 7, 2015
    Posts:
    540
    What's a good way to "prevent" the player from moving into walls? For example the player can teleport near a wall and then just move in real life to go through it.
     
  22. TimeWalk-org

    TimeWalk-org

    Joined:
    Nov 3, 2014
    Posts:
    38
    SOLVED: in 2020.1, the "Show Preview Packages" is no longer under the Advanced tab in the Package Manager. Instead, go to menu Edit/Project Settings/Package Manager. There you can check the "Enable Preview Packages" option. I found that I had to quit Unity and restart it before the preview packages (e.g. "XR Interaction Toolkit") would appear.

    package manager.JPG
     
    a436t4ataf, ilyaylm and harleydk like this.
  23. brianleake

    brianleake

    Joined:
    Feb 5, 2013
    Posts:
    10
    Curious if anybody else has noticed that LeftBaseController transform (and everything else) isn't being updated for Windows MR in the WorldInteractionDemo scene? Right controller to teleport around works fine, but the Left controller is seemingly disconnected / dead. My controller of course has power, works fine with Windows MR in general and can be seen active and moving around normally elsewhere.
     
  24. mikeNspired

    mikeNspired

    Joined:
    Jan 13, 2016
    Posts:
    82
    I don't have any problems using an HP Reverb or Samsung Odyssey. Everything works for me and my friends using UnityXR with Mixed Reality, apart from certain shaders/particles/UI elements rendering in only one eye unless I'm on URP.
     
  25. RetroFlight

    RetroFlight

    Joined:
    Nov 16, 2019
    Posts:
    41
    Does anyone have a working lever or joystick tutorial using XR?
     
  26. MaskedMouse

    MaskedMouse

    Joined:
    Jul 8, 2014
    Posts:
    1,092
    The XRInteractorLineVisual variable stopLineAtFirstRaycastHit is not even used. It should be used at line 247 to have any effect. I've mentioned this one before, but it seems like it got ignored. In our project we sometimes want to hide elements while keeping them interactable, so they are "discoverable".

    Reticle flickering
    We have a reticle attached to our XRInteractorLineVisual, but for some reason, while hovering over UI, the reticle sometimes flickers. It doesn't happen with all UI.

    Image Raycast Target is not taken into account in UI interaction?
    I have a stretched Image, with Raycast Target turned off, as a visual representation of a stroke around a panel.
    Because Raycast Target being off is not taken into account, my button is untargetable.
    I fixed it by moving the stroke Image higher up in the hierarchy, but I shouldn't have to if Raycast Target is turned off.
     
  27. jiraphatK

    jiraphatK

    Joined:
    Sep 29, 2018
    Posts:
    297
    XR Management is just a system to manage the various plugins (Oculus, OpenVR, etc.). The XR Interaction Toolkit is a higher-level framework that sits on top of that system.
    They are compatible and you can use both in your project.
     
  28. yarsrvenge

    yarsrvenge

    Joined:
    Jun 25, 2019
    Posts:
    87
    There are times where I need to disable an XRGrabInteractable so it cannot be grabbed. It seems as though it cannot be disabled/enabled other than via GameObject.SetActive on the entire object, which I don't want to do. Hoping I am missing something obvious.
     
    a436t4ataf likes this.
  29. R1PFake

    R1PFake

    Joined:
    Aug 7, 2015
    Posts:
    540
    Did you try to disable the collider of the grab object?
     
  30. yarsrvenge

    yarsrvenge

    Joined:
    Jun 25, 2019
    Posts:
    87
    Not sure why I didn't think of that. I'll give that a go tonight and see how that fares. Thanks!
     
  31. R1PFake

    R1PFake

    Joined:
    Aug 7, 2015
    Posts:
    540
    After reading it again I realized that this will not work if the object has a Rigidbody, because it will then not detect any other collisions either.

    I don't have the source code open right now; there should be an easier solution for this.

    If there is really no other solution, you might manage a hacky workaround by calling Physics.IgnoreCollision twice, with the grab object and each of the hand colliders, passing true/false to disable/enable collision for those specific collider pairs instead of disabling all collisions for the whole collider. But I really hope there is an easier solution.
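    That hack might look roughly like this (sketch; the hand collider references are hypothetical and would be assigned in the inspector):

    ```csharp
    using UnityEngine;

    // Toggles collisions between a grabbable's collider and the hand colliders
    // only, leaving its other collisions (floor, walls, etc.) intact.
    public class GrabCollisionToggle : MonoBehaviour
    {
        public Collider grabbableCollider;
        public Collider leftHandCollider;   // hypothetical reference
        public Collider rightHandCollider;  // hypothetical reference

        public void SetGrabbable(bool grabbable)
        {
            // Physics.IgnoreCollision disables contact detection between
            // that specific pair of colliders when the last argument is true.
            Physics.IgnoreCollision(grabbableCollider, leftHandCollider, !grabbable);
            Physics.IgnoreCollision(grabbableCollider, rightHandCollider, !grabbable);
        }
    }
    ```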
     
    linojon likes this.
  32. yarsrvenge

    yarsrvenge

    Joined:
    Jun 25, 2019
    Posts:
    87
    In my particular scenario, I can make the objects kinematic as well as disable the collider(s) when I don't want them to be grabbable. The objects cannot be seen anyway (they are inside a chest/locker/etc.). Once the player opens the container, I re-enable the colliders and turn off kinematic. The whole point is to prevent them from being grabbed, highlighted, and a few other things until the container is open. This is part of a procedural generation engine, so I disable them when they are added in that process.

    It works "OK" for now (I just tested it), but there is of course some slight movement when kinematic is turned off and they fall to the nearest collider, which is barely visible to the player.

    I agree there should be a better way, though. I've already made a few changes to their code to overcome different problems and want to rein that in a bit until they have time to complete it. If not, I will come back to it. OVRGrabbable from Oculus could simply be disabled, which is how I did it before moving to the XR Toolkit.
     
    linojon likes this.
  33. hessex

    hessex

    Joined:
    Jul 24, 2019
    Posts:
    7
    Is there a method to auto-grab an XRGrabInteractable? For instance, allowing an "OnHoverEnter" event to cause the player to grab the item. Or, if a player were to throw an item, having that item re-appear in the player's hand.
     
  34. a436t4ataf

    a436t4ataf

    Joined:
    May 19, 2013
    Posts:
    1,932
    There's a lot wrong with the movement types :). My current plan is to delete them / most of that class and replace it with working versions if Unity doesn't do a rewrite in the next XRIT update - just so much over-simplified about how they've implemented that here, causing weird behaviours, jitters, bad performance, etc. Unity physics and collision (and parenting!) is a lot more complicated than this class seems to think it is...

    Which raycast is this? Interactables don't have a raycast.
     
  35. a436t4ataf

    a436t4ataf

    Joined:
    May 19, 2013
    Posts:
    1,932
  36. a436t4ataf

    a436t4ataf

    Joined:
    May 19, 2013
    Posts:
    1,932
    I'm doing the brutal thing: destroying the XRGrabInteractable component, and re-creating it to re-enable. This works fine, and appears to be the way Unity intended it to work: http://snapandplug.com/xr-input-toolkit-2020-faq/#FAQ:-How-do-I-temporarily-disable-a-Grabbable? - although I agree with you that it would make a lot more sense to have a boolean toggle for this.
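    In code, that destroy-and-recreate approach might be sketched like this (hypothetical helper; note that any serialized settings on the component are lost and the fresh one comes back with defaults):

    ```csharp
    using UnityEngine;
    using UnityEngine.XR.Interaction.Toolkit;

    // Disables grabbing by destroying the XRGrabInteractable component,
    // and re-enables it by adding a fresh one, as described above.
    public static class GrabbableUtil
    {
        public static void SetGrabbable(GameObject go, bool grabbable)
        {
            var grab = go.GetComponent<XRGrabInteractable>();
            if (!grabbable && grab != null)
                Object.Destroy(grab); // unregisters from the interaction manager via OnDestroy
            else if (grabbable && grab == null)
                go.AddComponent<XRGrabInteractable>(); // returns to default settings
        }
    }
    ```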
     
  37. a436t4ataf

    a436t4ataf

    Joined:
    May 19, 2013
    Posts:
    1,932
    From reading the source code: I would strongly advise not to do this. One of the recurring assumptions that Unity made when designing XRIT was that you would only ever want one of everything (in almost all cases this is wrong, it's definitely not right for game-development - but it works for simple demos, and as a "first draft, pre-release" it's OK).

    They may or may not fix this. But the design strongly suggests they expect you to make your own custom interactor for situations exactly like this one.

    (even if they update the core design, your custom interactor will continue to work ... so I'd go with that option, unless you're willing to wait a few months and see what's in the next release)
     
  38. ZeBraNS

    ZeBraNS

    Joined:
    Feb 21, 2015
    Posts:
    40
    Hi,
    I have a question about how to make an object (A) that is placed in a socket on an object (B) act as a "fake" parent, so that when I then move object (A) it also moves object (B). I know how to stop objects being glued to a socket so the object stays in place, but then it is only a "fake" child of the socketed object and does not move the "parent" object.

    Imagine placing a doorknob on a door (it snaps into the socket on the door) and then pulling that doorknob to open the door.
     
  39. yarsrvenge

    yarsrvenge

    Joined:
    Jun 25, 2019
    Posts:
    87
    Sorry, the raycast is not on the interactable; I meant the raycast on my gun GameObject was somehow being interfered with when I used Instantaneous. It was very weird. I solved it by moving to Kinematic and doing a few things to eliminate the jitters.
     
  40. quix22

    quix22

    Joined:
    Sep 6, 2018
    Posts:
    16
    Is there any functionality (in the locomotion) of the XR Interaction Toolkit that allows the player to fly around as well as the teleportation that's available?
     
  41. Virtimed

    Virtimed

    Joined:
    Nov 1, 2017
    Posts:
    29
    Hi,

    I have a use case where I want UI control and select interaction with world objects on the same button (so essentially the trigger does button/UI selection and world-object selection).

    What I notice, though, is that the ray visual will notice the UI element and end the line on the UI, but if a physical interactable is behind it, it will still be interactable.

    I tried to illustrate with an example in the attached image. If that line weren't impeded, it would hit the cube (which I can interact with and pick up using the trigger). But even with that UI element in the way, the cube behind is still interactable.

    I seem able to fix this by putting box colliders on the UI, but that seems an odd setup. Just wondering what's the best way of handling this use case?
     

    Attached Files:

  42. a436t4ataf

    a436t4ataf

    Joined:
    May 19, 2013
    Posts:
    1,932
    Someone asked me about this recently, and I was sure I previously had the same problem but it was fixed quickly - off the top of my head it turned out to be a misconfigured Canvas object (didn't have the right combination of EventSystem, Raycasters, etc).

    Did you already have a Canvas in scene, did you create one yourself (did you already have an ES) ... or did you let XRIT create one for you? (it's very easy to end up with a misconfigured one by accident - there are no warnings on this)
     
  43. Virtimed

    Virtimed

    Joined:
    Nov 1, 2017
    Posts:
    29
    The example in the screenshot is the one from the samples they provide, so I'm fairly sure everything should be fine in terms of setup. I copied a pre-existing, properly set-up canvas in the scene, placed it between the user and the interactables, and this problem occurred.
     
  44. jiraphatK

    jiraphatK

    Joined:
    Sep 29, 2018
    Posts:
    297
    It's been half a year. Is there a plan to update this toolkit?
     
    Aaron-Meyers and a436t4ataf like this.
  45. Aaron-Meyers

    Aaron-Meyers

    Joined:
    Dec 8, 2009
    Posts:
    305
    +1
     
    harleydk and ROBYER1 like this.
  46. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,453
    Is there any way to use hand tracking with this? I know left/right controllers are options, so could left/right hands be too?
     
    alienheretic likes this.
  47. mikeNspired

    mikeNspired

    Joined:
    Jan 13, 2016
    Posts:
    82
    Anyone else looking for a hand posing system like steamVR's skeletal poser, Message me. Looking for beta testers for my hand poser for the xr interaction toolkit.
     
  48. tom_willmowski

    tom_willmowski

    Joined:
    May 23, 2015
    Posts:
    3
    Hey Unity, shouldn't XRBaseInteractable remove itself from XRInteractionManager OnDestroy?
     
  49. dynamoleddisplays

    dynamoleddisplays

    Joined:
    Jun 8, 2020
    Posts:
    1
    This toolkit provides a set of components to enable users to build interactive and immersive experiences quickly and easily. It enables common augmented reality (AR) and virtual reality (VR) interactions without the need to write code, while still making the system extensible for developers who wish to create their own interactions. Additionally, it is compatible with all of our supported AR and VR platforms.
     
  50. mikeNspired

    mikeNspired

    Joined:
    Jan 13, 2016
    Posts:
    82
    Code (CSharp):
    void OnDestroy()
    {
        if (m_RegisteredInteractionManager)
            m_InteractionManager.UnregisterInteractable(this);
    }
    That's the destroy method right there in XRBaseInteractable. Looks like that's exactly what it's doing.
     