Official XR Interaction Toolkit 1.0.0-pre.3 pre-release is available

Discussion in 'XR Interaction Toolkit and Input' started by chris-massie, Mar 20, 2021.

  1. chris-massie

    chris-massie

    Unity Technologies

    Joined:
    Jun 23, 2020
    Posts:
    231
    We have just published a new preview of the XR Interaction Toolkit (XRI) that brings a number of bug fixes and improvements. This version is available now for Unity 2019.4 and later, and will be included in 2021.1 in an upcoming patch release. For those who want to experiment with XRI, the best way to start is with our samples available at https://github.com/Unity-Technologies/XR-Interaction-Toolkit-Examples. As always, you can refer to our documentation for more information.

    Pre-release
    This 1.0.0-pre.3 version of the XR Interaction Toolkit is considered pre-release. Pre-release packages are supported packages in the process of becoming stable, and this one will be production-ready by the end of the upcoming 2021 LTS release. Starting in 2021.1, Unity is changing the way we publish and show packages in the Package Manager; this change is designed to provide clear guidance around a package's readiness and expected support level. For pre-release packages to appear in Package Manager, they must be enabled in Edit > Project Settings > Package Manager. There will be additional iterations of XRI before we get to the final 1.0.0 release.

    What’s new
    For a full list of changes, refer to the Changelog in our documentation.

    Many of the changes and fixes in this version were a direct result of feedback we received from the forum and from reported bugs. Thank you to everyone who took the time to make these issues known to the team and for your feedback!

    Notable changes
    • Derived behaviors which added serialized fields will no longer need custom Editor classes to see them in the Inspector. This is an improvement to the changes made with the previous 1.0.0-pre.2 version where the custom Inspector appearance applied to derived classes, but required writing Editor classes to insert additional values in the Inspector.
    • Added support for Input System touches to AR behaviors. Projects no longer have to enable the old Input manager, and can now set Active Input Handling to Input System Package (New).
    • Fixed multiple bugs with Ray Interactor, including the end of the Projectile Curve lagging behind and appearing bent while moving the controller fast (1291060), being able to still interact with Interactables that were behind UI (1312217), and frame latency during locomotion and while aiming at UI.
    • Additional properties have been added to Ray Interactor, Grab Interactable, and AR components to allow for more tweaking of behavior. Additionally, several methods in Ray Interactor have been added or changed to virtual or public to allow developers to have more control and to get information about UI hits.
    Known issues
    Use the Unity Issue Tracker for Package: XR Interaction to see the active issues.
    • The attach point for Grab Interactables can be inconsistent between Movement Type values (1294410)
    • Custom reticles get displayed on objects without a custom reticle (1252565)
    • Socket Interactor can apply the wrong rotation to an interactable and cause the interactable to skew in scale when the interactable has a parent with a non-uniform scale (1228990)
    • Grab Interactables can cause undesired behavior when using Continuous Move locomotion where the Character Controller can be blocked from moving while holding it, or cause the rig to rapidly move away when the object overlaps with the Character Controller
    • The Default Input Actions sample asset uses the wrong pose bindings for Windows Mixed Reality controllers, causing the controllers to point at the wrong angle
    • Interaction Manager throws an exception when an Interactor or Interactable is registered or unregistered during event handling and processing
    • Interactables with multiple Colliders are removed from valid targets list of Direct and Socket Interactors when any Collider exits bounds
    Deprecated members
    As we approach the final 1.0.0 release, properties, methods, and events that are currently marked as Obsolete or Deprecated will start to be elevated from warnings to errors before ultimately being removed from the package for the final release. This process will begin to happen starting with the next 1.0.0-pre.4 release. As more users begin to use the package, we want to eliminate potential confusion from the different members available and improve the experience while developing in an IDE with code completion and member lists.

    Roadmap
    Use the public roadmap to see our latest plans, upvote existing feature requests, and/or submit new feature requests. We are currently working towards a public 1.0 release this year for Unity 2021.2 (LTS). Most of our focus and development efforts now are on bug fixes, UX improvements, and polished documentation & samples. The feature set for public release will primarily reflect what exists today.

    Sharing feedback
    This forum is the best place to open discussions and ask questions. As mentioned above, please use the public roadmap to submit feature requests. If you encounter a bug, please use the Unity Bug Reporter in the Unity Editor, accessible via Help > Report a Bug. Include “XR Interaction Toolkit” in the title to help our team triage things appropriately!
     
    GDesmoulins likes this.
  2. dpcactus

    dpcactus

    Joined:
    Jan 13, 2020
    Posts:
    53
    Is there a workaround for this issue? I think it's messing with my pistol holsters.
     
  3. chris-massie

    chris-massie

    Unity Technologies

    Joined:
    Jun 23, 2020
    Posts:
    231
    You may be able to edit the model(s) so the scaling is baked into the asset, letting everything import at (1, 1, 1) Transform scale with the same proportions, to avoid the issue. You may also be able to alter the hierarchy so it doesn't have any parents with scaling.
     
  4. Mr_Jigs

    Mr_Jigs

    Joined:
    Apr 18, 2015
    Posts:
    69
    Is it just me, or does the world space canvas now render on top of the scene? Besides my own project, I've also tried the XR Toolkit samples, and there the world space canvas is also rendered on top of everything else. That's not what I was expecting: I would expect a world space canvas to be hidden by objects in front of it. If this is expected behaviour (on Unity's side), how would I fix it? Using Unity 2020.1.7f1 and XR Toolkit 1.0.0-pre3.
     
  5. bestknighter

    bestknighter

    Joined:
    Dec 2, 2014
    Posts:
    18
    I'm having a problem with the Device Simulator. No matter what I do, I can translate and rotate the hands but I can only translate the head. Head rotation is broken for me. I tried changing key binds. I tried using stationary instead of room-scale. Neither worked. I can even translate both hands and the head at the same time. But when I hold "Rotate Mode Override" or press "Toggle Mouse Transform Mode" only the hands rotate while the head simply stops.

    I'm using Unity 2020.3.0f1 and XR Toolkit 1.0.0-pre3 with the new input system in a brand new scene.

    I added the following code below the line at XRDeviceSimulator.cs:1429 to try to debug:


    Debug.Log( "anglesDelta: " + anglesDelta + "\tm_CenterEyeEuler: " + m_CenterEyeEuler + "\tm_HMDState.centerEyeRotation: " + m_HMDState.centerEyeRotation );


    This was the result:

    upload_2021-3-23_11-56-41.png

    I also tried changing the Mouse Rotate Sensitivity, but that didn't work either. It just made the hands rotate faster.

    Any idea on how should I fix this?
     
  6. chris-massie

    chris-massie

    Unity Technologies

    Joined:
    Jun 23, 2020
    Posts:
    231
    Are you using the Built-in Render Pipeline, High Definition Render Pipeline (HDRP), or Universal Render Pipeline (URP)? Which version of the High Definition RP package or Universal RP package are you using if so? Are the objects in front of the UI using materials that write to the depth buffer? Are you experiencing that only when deployed to the HMD device or in the Unity Editor also? When using the VR Example project, I am seeing 3D objects closer than the UI occlude it. You could try updating to the 2020.3.1f1 LTS release.

    If you have an HMD device connected to your PC and have the plug-in provider for it enabled in Edit > Project Settings > XR Plug-in Management under the PC, Mac & Linux Standalone settings tab, it may be conflicting and overriding the simulated rotation. If so, try unplugging it or disabling the plug-in provider.
     
  7. bestknighter

    bestknighter

    Joined:
    Dec 2, 2014
    Posts:
    18
    I don't have any HMD connected. I had both OpenXR and OpenVR Loader plug-ins activated but keeping only OpenXR on didn't solve the issue, sadly. However, inspired by your answer I tried a lot of combinations of settings in the XR Plug-in Management and made it work.

    It turns out that having either the Mock Runtime (XR Plug-in Management > OpenXR > Features) or the Mock HMD Loader (XR Plug-in Management) enabled breaks the head rotation of the simulated HMD. I had to deactivate both for it to work. Which is a bummer, as this makes the HMD camera render as a normal camera, not as an HMD one (I need to see the occlusion mesh).
     
    anishhuey and Seazer2023 like this.
  8. Shizola

    Shizola

    Joined:
    Jun 29, 2014
    Posts:
    475
    Just wanted to say thanks for continuing the development of the toolkit, it's been really helpful.

    I have 2 suggestions -

    - If you create an Action-based Rig, I think the Input Action Manager script should be automatically added to the scene in the same way the XR Interaction Manager script is. At the moment there's no obvious reason why your controllers don't move unless you add this script.

    - It would be good if the toolkit could automatically setup or prompt you to change the project fixed timestep to match the refresh rate of your headset. When I first used the toolkit I thought the jitter when moving physics objects was something to do with the toolkit itself.
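That second suggestion can be sketched in a few lines. This is a hypothetical helper (the component name is my own), assuming UnityEngine.XR.XRDevice.refreshRate reports the headset refresh rate:

```csharp
using UnityEngine;
using UnityEngine.XR;

// Hypothetical helper: align the physics timestep with the headset
// refresh rate to reduce jitter on physics-driven objects.
public class MatchFixedTimestepToHMD : MonoBehaviour
{
    void Start()
    {
        var refreshRate = XRDevice.refreshRate;
        if (refreshRate > 0f) // reported as 0 when no device is present
            Time.fixedDeltaTime = 1f / refreshRate;
    }
}
```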
     
  9. Mr_Jigs

    Mr_Jigs

    Joined:
    Apr 18, 2015
    Posts:
    69
    I am using the Built-in render pipeline, and it was showing in the editor; I don't think I had gotten round to checking it on a device yet. Now, several days, a Windows update, and many restarts later, the problem seems to have disappeared from my own project, apart from one remaining issue: TextMeshPro - Text (UI) assets are still rendered on top of everything else (both in editor and on device). That is to say, the .sdf TMP_Font assets that I copied from a previous project (and different Unity version) exhibit this problem. The default .sdf that comes with the TextMeshPro installation behaves correctly. I will delete the copied versions and recreate them. I will also check the VR Example project for any changes. But that's for later; I've got to run now.
     
  10. chris-massie

    chris-massie

    Unity Technologies

    Joined:
    Jun 23, 2020
    Posts:
    231
    I'm glad you were able to work around it! If you report that bug about it not working when the Mock HMD is enabled, we can fix that in a future version.

    We may be able to detect the situation where a dev has a preset for the ActionBasedController that contains a reference to an Input Action Asset, and there isn't currently an Input Action Manager in the set of opened scenes referencing the same asset, and determine that we should add one to enable the actions. But we can't be sure that's what the dev wanted to do. However, it is a common problem that trips up some of our users, so we'll look into doing this.

    Some of this will be addressed because we are planning on creating a VR template in Unity Hub that will have XR Interaction Toolkit with recommended project settings and a rig already set up so users won't have to manually do those steps to set up the rig and input actions. We are also going to be improving the documentation of the package, especially around new users and the different grab movement types, which should help to explain some of these steps.
     
    Seazer2023, Shizola and bestknighter like this.
  11. Mr_Jigs

    Mr_Jigs

    Joined:
    Apr 18, 2015
    Posts:
    69
    This is getting more interesting by the minute :) This morning I started up the project and the problem was back in the editor. I was about to try upgrading the project as you suggested when I remembered your question about whether the problem also occurred on the device. So I tried it on the device and, lo and behold, it was working there, with the exception of the TextMeshPro issue mentioned above. Also, after exiting play mode, the editor was behaving correctly as well, with the aforementioned exception.
    Next I changed all the font assets to the default font asset that is part of the TextMeshPro package. That worked correctly in the editor and on the device. I deleted the copied font assets and a TextMeshPro material that had somehow made it into my fonts folder, and recreated the font assets. After assigning them, both the editor and the device behaved as expected.
    Next I quit Unity and started it back up. The problem was back in the editor, but disappeared after entering and exiting play mode for the first time with the device connected. I guess I can live with that.
     
  12. Skinzart

    Skinzart

    Joined:
    Sep 11, 2020
    Posts:
    14
    @chris-massie Is there a method to reset the camera forward to match the rig forward, and reset the camera center to the rig center?
     
  13. Thimo_

    Thimo_

    Joined:
    Aug 26, 2019
    Posts:
    59
    @chris-massie

    This might not be a bug per se, but I saw that multiple people encountered it. When I use the AR side of the interaction toolkit, I want to extend it and make custom variants for translating, scaling, rotating, etc. However, some necessary code is locked as private/internal or in other ways. In particular, 'GestureTouchesUtility.cs' is private and I want to access it. Also, the gesture.Cancel() function in ARScaleInteractable.cs is not accessible. It would be amazing if you would make all or most methods publicly available for us to use in custom implementations. Thanks in advance!
     
    Last edited: Apr 13, 2021
  14. dpcactus

    dpcactus

    Joined:
    Jan 13, 2020
    Posts:
    53
    Does the XR Socket Interactor somehow store the (direct) Interactor that put the interactable in it? I'm getting a weird behaviour where a gun keeps popping back into my hand when it's holstered in a socket and I press Select on my controller. I've put the socket a mile away to make sure there is no collider interfering. If so, can I somehow disable that?!

    Edit: Seems like it is because I'm parenting the object...


    Edit 2: Okay, after some more testing, I came to the conclusion that something changed from 2020.3.0 to 2020.3.1 (and 2 too).
    Here is the situation: I'm trying to make a gun holster, which is an XR Socket attached to the player. On Socket Enter, I parent the gun to the holster so it doesn't lag behind when I move the character (teleport is not an option!). This was never a problem until I updated to 2020.3.1, because now, when I put the gun in my XR Socket and make it a child of the socket, I can always select it. The gun will always pop into my hand. It even does that when the Socket isn't connected to the player: you can walk away from it, and the gun will be in your hand once you hit Select.

    This is extremely inconvenient. Reverting to 2020.3.0 now.

    It's probably kind of hard to understand what the issue here is; I'm having trouble myself putting it into accurate words. But maybe someone will investigate this, or add an option to the Grab Interactable so that it parents to the interactor in a way that works better than my spaghetti code.
     
    Last edited: Mar 30, 2021
  15. bestknighter

    bestknighter

    Joined:
    Dec 2, 2014
    Posts:
    18
    Thank you for your attention and suggestion. I just finished submitting the bug. In case someone wants to follow the report: https://fogbugz.unity3d.com/default.asp?1325407_eeatp68lrmr50urv
     
    chris-massie likes this.
  16. chris-massie

    chris-massie

    Unity Technologies

    Joined:
    Jun 23, 2020
    Posts:
    231
    The Camera in a typical XR Rig setup is driven by the tracked HMD pose every frame. This means that if you try to modify the Transform of the Camera, it will be overridden by the Tracked Pose Driver when it updates to match the tracked values reported by the device. You will want to modify the Rig Transform instead to control where the player is.

    There are various convenience methods in XRRig that allow you to update its position relative to the Camera position to help with doing this. The XRRig also has references to the Rig GameObject and the Camera GameObject, so you can do whatever calculations you need to manipulate the Rig.

    What result are you hoping to achieve with what you are asking? There may be another way to control for what you are trying to do rather than trying to realign the Camera.

    If you have a reproducible bug, you can submit a bug report using Help > Report a Bug. There shouldn't be any change in behavior between those versions of Unity.

    We may create an upcoming example to demonstrate having a Socket Interactor parented to the XR Rig to show how to do this.
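To illustrate the rig-based approach described above, here is a rough sketch, assuming the XRRig convenience methods MatchRigUpCameraForward and MoveCameraToWorldLocation behave as their names suggest (the component and its target parameter are hypothetical):

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Hypothetical recenter helper: moves/rotates the Rig (not the Camera,
// which is driven by the tracked HMD pose) so the view lines up with a target.
public class RigRecenter : MonoBehaviour
{
    [SerializeField] XRRig m_Rig;

    public void RecenterOn(Transform target)
    {
        // Rotate the rig about the camera so the camera forward matches
        // the target forward while keeping the rig upright.
        m_Rig.MatchRigUpCameraForward(m_Rig.transform.up, target.forward);

        // Shift the rig so the camera ends up at the target position.
        m_Rig.MoveCameraToWorldLocation(target.position);
    }
}
```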
     
  17. Skinzart

    Skinzart

    Joined:
    Sep 11, 2020
    Posts:
    14
    Umm, it's like holding down the Oculus button for a while, which resets the view. Is this possible?
    Maybe a method like InputTracking.Recenter? Because that one is obsolete.
     
    Last edited: Apr 1, 2021
  18. chris-massie

    chris-massie

    Unity Technologies

    Joined:
    Jun 23, 2020
    Posts:
    231
  19. Ikaro88

    Ikaro88

    Joined:
    Jun 6, 2016
    Posts:
    300
    Hi everyone!
    Does XR Interaction Toolkit work with webvr?
     
  20. chris-massie

    chris-massie

    Unity Technologies

    Joined:
    Jun 23, 2020
    Posts:
    231
    We don't currently support WebVR/WebXR with XR Plug-in Management, so the input data that XRI depends on will not work out of the box. However, as long as you are able to pipe input into the Input System, XRI could work with it since XRI uses the Input System as its input abstraction and doesn't really care about the source of that input.
     
  21. CryptopherColumbus

    CryptopherColumbus

    Joined:
    Nov 3, 2020
    Posts:
    6
    Hey @chris-massie

    I'm currently encountering an issue when using XRGrabInteractables. I have a very large collider on my grab interactable, and I noticed that when I let go of the XR Grab Interactable (let's say on a table) while my interactor is still within the collider, attempting to grab again will not grab the interactable.

    It works half the time, and only really works when I make my interactor leave the collider boundaries and re-enter it.

    Is this by design? If so, is there a way to avoid this? When I let go of the grab interactable I want to be able to grab it again even if my interactor has not left the bounds of the collider.
     
  22. unity_lgPxW1cIGK8_nw

    unity_lgPxW1cIGK8_nw

    Joined:
    Nov 26, 2020
    Posts:
    3
    Hi everyone!
    Does the XR Interaction Toolkit (action-based) work with the HTC Vive Cosmos?
     
  23. chris-massie

    chris-massie

    Unity Technologies

    Joined:
    Jun 23, 2020
    Posts:
    231
    There is a bug where the Direct Interactor does not handle an Interactable with multiple Colliders well. If any of the Colliders associated with the Grab Interactable exits the trigger collider of the Direct Interactor, the Direct Interactor removes it from its list of valid objects it can interact with. It should instead only do that when all of the Colliders exit the trigger collider.

    What you describe of being able to grab it again when the interactor doesn't leave the bounds is supposed to be the functionality, and we'll have to fix that bug to make it work right with multiple colliders.

    As a workaround if this is the issue you are seeing, you will need to either create a single mesh for a Mesh Collider component, or use a single primitive collider type (like Box Collider or Sphere Collider) in the Colliders property of the Grab Interactable in the Inspector.
     
  24. chris-massie

    chris-massie

    Unity Technologies

    Joined:
    Jun 23, 2020
    Posts:
    231
    If you use the HTC OpenXR runtime and you enable the OpenXR Plugin in XR Plug-in Management, it should work.
     
  25. CryptopherColumbus

    CryptopherColumbus

    Joined:
    Nov 3, 2020
    Posts:
    6
  26. stevendimmerse

    stevendimmerse

    Joined:
    Aug 7, 2018
    Posts:
    9

    Is there any workaround for this? We have a box with 2 items inside; the user places the objects in sockets in the box, then the user can teleport with said box to carry the items around. However, sometimes when they teleport, the objects will be offset from the positions they should be at.
     
    Last edited: Apr 27, 2021
    hayhilal likes this.
  27. chris-massie

    chris-massie

    Unity Technologies

    Joined:
    Jun 23, 2020
    Posts:
    231
    Those values I mentioned are now editable through the Inspector or through script on XRGrabInteractable with velocityDamping, velocityScale, angularVelocityDamping, and angularVelocityScale as of 1.0.0-pre.3. They have the same default values as before, but in the upcoming 1.0.0-pre.4 I changed all the default values to 1 so Velocity Tracked functions like Instantaneous or Kinematic by default without that extra smoothing. You should be able to upgrade to 1.0.0-pre.3 and set those values to 1 in the Inspector to get what you want.

    The issue with the Character Controller of the XR Rig colliding with the objects you are holding can potentially be addressed by creating a new layer (Edit > Project Settings > Tags and Layers) for the grab objects, then setting the layer of the GameObject, and then editing the Layer Collision Matrix (Edit > Project Settings > Physics) so the rig and grab objects don't collide. You could also utilize Physics.IgnoreCollision to be more selective about specific Collider collisions you want to turn off.
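As an illustration of the Physics.IgnoreCollision route (the component and its method are hypothetical names for this sketch):

```csharp
using UnityEngine;

// Hypothetical helper: selectively disable collisions between the rig's
// Character Controller and every Collider on a held object.
public class HeldObjectCollisionFilter : MonoBehaviour
{
    [SerializeField] CharacterController m_CharacterController;

    public void SetIgnored(GameObject grabObject, bool ignore)
    {
        // CharacterController derives from Collider, so it can be passed directly.
        foreach (var collider in grabObject.GetComponentsInChildren<Collider>())
            Physics.IgnoreCollision(m_CharacterController, collider, ignore);
    }
}
```

You would call SetIgnored(obj, true) when the object is grabbed and SetIgnored(obj, false) when it is released.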

    The example you described sounds like it may be different from that known issue, however. Socket Interactors currently override the Movement Type of the Grab Interactable to Kinematic, which would mean that the position of that grab object would update at the Physics update rate rather than the game update rate. If the player teleports, it could be causing issues where the Rigidbody is failing to move to the updated target position correctly.

    In the upcoming 1.0.0-pre.4, we have a few fixes and changes to Grab Interactable and Socket Interactor that may address that situation. Socket Interactor will override the Movement Type as Instantaneous instead of Kinematic (XRSocketInteractor.selectedInteractableMovementTypeOverride was changed), and the Rigidbody of the Grab Interactable will now have Is Kinematic set to true while selected for Instantaneous to improve how it's handled by the Physics system.

    If you want to try those changes before the next release, you could create a custom derived behavior of XRSocketInteractor that overrides that property. You could also override XRGrabInteractable.SetupRigidbodyGrab if you want to also make the change to set Is Kinematic on the Rigidbody when its effective Movement Type is not Velocity Tracked.
     
    Last edited: Apr 30, 2021
  28. dpcactus

    dpcactus

    Joined:
    Jan 13, 2020
    Posts:
    53
    I tried this and it seems to work when I test it in editor, but not when I deploy it on Quest.
    Code (CSharp):

    public void RecenterPlayer()
    {
        subSystem = XRGeneralSettings.Instance.Manager.activeLoader.GetLoadedSubsystem<XRInputSubsystem>();
        subSystem.TryRecenter();
    }
    Am I doing something wrong or does the Quest need something specific to recenter the player?
     
  29. R1PFake

    R1PFake

    Joined:
    Aug 7, 2015
    Posts:
    540
    I found a bug in the XRGrabInteractable SetupRigidbodyGrab method.

    The line rigidbody.isKinematic = m_CurrentMoveType == MovementType.Kinematic ignores the MovementType Instantaneous.

    I have a grab object which has IsKinematic set to true in the editor and MovementType Instantaneous; this grab method will incorrectly set IsKinematic to false, which causes issues. See this post for more details on why, but the short version is that the rigidbody and the controller will try to move the object at the same time: https://forum.unity.com/threads/xr-controller-object-spawn-jitter.1106497/

    Visually the controller always seems to "win" because it sets the transform in Update and BeforeRender (with the default settings), but the transform values can be messed up if you use them during the Update method, in my case it caused random spawn object offsets and ray cast misses that use the grab object transform.

    As a workaround, I added a GrabKinematicPatch script to the same object which handles the XRGrabInteractable Selected event and sets isKinematic to true again.
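For anyone hitting the same issue, here is a reconstruction of that workaround (not the poster's exact code), assuming the selectEntered event available in 1.0.0-pre.3:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Reconstruction of the described GrabKinematicPatch workaround:
// re-assert isKinematic after SetupRigidbodyGrab has cleared it.
[RequireComponent(typeof(XRGrabInteractable), typeof(Rigidbody))]
public class GrabKinematicPatch : MonoBehaviour
{
    XRGrabInteractable m_Interactable;
    Rigidbody m_Rigidbody;

    void Awake()
    {
        m_Interactable = GetComponent<XRGrabInteractable>();
        m_Rigidbody = GetComponent<Rigidbody>();
    }

    void OnEnable() => m_Interactable.selectEntered.AddListener(OnSelectEntered);
    void OnDisable() => m_Interactable.selectEntered.RemoveListener(OnSelectEntered);

    void OnSelectEntered(SelectEnterEventArgs args)
    {
        // The event fires after SetupRigidbodyGrab, so this assignment wins.
        m_Rigidbody.isKinematic = true;
    }
}
```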
     
    Last edited: May 8, 2021
  30. AvinashB9

    AvinashB9

    Joined:
    Feb 10, 2021
    Posts:
    14
    Hello @chris-massie,

    I am not sure if this is an issue or intended behavior. I am facing an issue with the activate (trigger) button, specifically for the XRBaseInteractable class or any class derived from it.

    I created a simple cube and added a script component called "TestBaseInteractable", which derives from the XRBaseInteractable class.

    This class has the following script.

    Code (CSharp):

    public class TestBaseInteractable : XRBaseInteractable
    {
        protected override void OnActivated(ActivateEventArgs args)
        {
            Debug.Log($"activated");
        }

        protected override void OnDeactivated(DeactivateEventArgs args)
        {
            Debug.Log($"deactivated");
        }

        protected override void OnSelectEntered(SelectEnterEventArgs args)
        {
            Debug.Log($"select entered");
        }

        protected override void OnSelectExited(SelectExitEventArgs args)
        {
            Debug.Log($"select exited");
        }

        protected override void OnSelectEntering(SelectEnterEventArgs args)
        {
            Debug.Log($"select entering");
        }

        protected override void OnSelectExiting(SelectExitEventArgs args)
        {
            Debug.Log($"select exiting");
        }

        private void OnTriggerEnter(Collider other)
        {
            Debug.Log($"trigger enter");
        }

        private void OnCollisionEnter(Collision collision)
        {
            Debug.Log($"collision enter");
        }
    }
    I performed the following steps:
    1. With my controller touching the cube, I pressed the trigger button. The OnActivated and OnDeactivated methods are supposed to be called, but I do not see any output below for activated.
    upload_2021-5-12_19-49-59.png
    2. If I press the grip button, then I see the output below.
    upload_2021-5-12_19-48-37.png
    3. If I press the grip button first and then press the trigger button, the OnActivated and OnDeactivated methods get called. It works only if I hold the grip button and then press the trigger button, and I see the output below.
    upload_2021-5-12_19-51-55.png

    My question is: how do I get the OnActivated and OnDeactivated methods called just by pressing the trigger button alone? Is it intended behavior, or is it an issue? Is there a workaround to get the OnActivated and OnDeactivated methods called? Which scripts do I need to change to get those methods working?

    Thanks,
    Avinash
     
  31. chris-massie

    chris-massie

    Unity Technologies

    Joined:
    Jun 23, 2020
    Posts:
    231
    Thanks for the post. This has been fixed so that it sets isKinematic to true for both Instantaneous and Kinematic in the upcoming 1.0.0-pre.4 version of the package, which will be released very soon.

    The OnActivated and OnDeactivated methods (along with the associated activated and deactivated events) are only called when the Interactable is selected at the time the activate action is triggered. This is intended behavior; it's for doing some kind of "use" action on an interactable that you've selected/grabbed, such as shooting a gun or turning on a flashlight.

    If you want to modify this so that it also happens for an interactable object that you're just hovering over without first selecting, you'll have to override the ProcessInteractor method. Look to the method in XRBaseControllerInteractor to see how it's calling OnActivated and OnDeactivated. Override the method to add additional code to do the same, but when selectTarget == null, and call it on each element of hoverTargets.

    Also, I know the TestBaseInteractable script is just a test script, but in your actual interactable script you'll want to make sure you call each base method from your overridden methods so that the UnityEvents are invoked. If you don't, any listeners you add to the events in the Inspector won't be triggered.
     
  32. DriesVrBase

    DriesVrBase

    Joined:
    Mar 24, 2020
    Posts:
    65
    Hi, I'm having an issue that I haven't run into before, because it worked earlier. I'm using the XR packages and OpenXR. When I run a scene where I'm the host/server and play the game, everything works as expected; the events from the XR Interaction Toolkit work as intended. Once I become a client and another person is the server, I can't press any buttons. I tried debugging it, and it just doesn't see any input coming through. So as long as I'm not connected to a server, everything is okay? I think this person ran into the same problem but has no fix for it: https://forum.unity.com/threads/unity-xr-problem-at-runtime.913136/ . I believe it is a Unity bug, since it used to work (around 2 months ago were my final tests). Using the new Unity Input System directly, everything seems to work fine, but then I'd have to redo my whole input setup instead of using the XR Interaction Toolkit.
     
  33. homer_3

    homer_3

    Joined:
    Jun 19, 2011
    Posts:
    111
    Can you add a click option for the Snap Turn Provider? Currently, there's only a single action for each hand that can be bound to a turn. So if I set it to the trackpad, just gently touching the trackpad causes a turn, but the typical way this is done is by clicking the trackpad. I made a local change to add 2 more fields that allow me to bind a trackpadPressed action and check whether it is down, if applicable, before returning the left/right hand action results.

    Continuous move should really have the same option as well.

    I can post my local changes if that's helpful.
     
  34. AvinashB9

    AvinashB9

    Joined:
    Feb 10, 2021
    Posts:
    14
    @chris-massie Thanks for the reply. I got it working by adding new methods OnActivateEntered, OnActivateExited, OnActivateEntering, and OnActivateExiting, following the same code flow as OnSelectEntering and the other select methods. That way I don't disturb the Activate and Deactivate methods.

    I appreciate your help.
     
  35. AvinashB9

    AvinashB9

    Joined:
    Feb 10, 2021
    Posts:
    14
    @chris-massie

    I am having an issue with the requirement below.

    I have an XRGrabInteractable on a simple gun mag. In the other hand I have a gun attached to my hand, with an XRSocketInteractor added to a child GameObject that has a box collider set as a trigger.

    When I hover the mag close to the socket interactor's trigger, I want the mag to be force-selected/moved to the target without releasing the grip/grab button. How do I achieve this force select/move when hovering the socket with the mag in hand, or is there a workaround? Do I need to add/modify methods in either the XRGrabInteractable or XRSocketInteractor classes? I appreciate your help on this.

    Thanks,
    Avinash
     
    Last edited: Jun 11, 2021
  36. chris-massie

    chris-massie

    Unity Technologies

    Joined:
    Jun 23, 2020
    Posts:
    231
    When you say that everything works when using the new Unity Input System, do you mean that you only see the issue when running as a client with the older device-based behaviors, as opposed to the newer action-based behaviors which use the Input System? Or that you detect input when using the Input System API directly, but interactions with the XR Interaction Toolkit are not working?

    If you are using the Input System and action-based controllers, I would suggest that you use the Window > Analysis > Input Debugger window to check that the controller devices are connected and that your actions have successfully bound to a control path. One common cause for interactions not working is that the actions used are not enabled. There is an Input Action Manager component that can be used to enable all the input actions in an asset. In your application, it could be that you are enabling those actions when instantiating prefabs as a host/server but not doing so when running as a client.
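
    As a sketch of that last point: the Input Action Manager essentially just enables the action maps in its referenced assets when it comes alive. You could verify the same behavior manually with something like the following (the component and field names here are illustrative, not XRI's own):

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Illustrative only: enables every action map in the referenced asset,
// mirroring what XRI's Input Action Manager component does for you.
// Actions return default values until they are enabled.
public class EnableAllInputActions : MonoBehaviour
{
    [SerializeField] InputActionAsset m_ActionAsset;

    void OnEnable()
    {
        if (m_ActionAsset != null)
            m_ActionAsset.Enable();
    }

    void OnDisable()
    {
        if (m_ActionAsset != null)
            m_ActionAsset.Disable();
    }
}
```

    If a component like this (or the Input Action Manager itself) only exists in the host/server code path, that would explain input working as host but not as client.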
     
  37. chris-massie

    chris-massie

    Unity Technologies

    Joined:
    Jun 23, 2020
    Posts:
    231
    Another way to approach this would be to update the Input Action so it uses a new composite binding that would predicate the Vector2 control with a condition of the trackpad being pressed, similar to the "Button With One Modifier Component." I'll see what we can do to improve turning and moving for trackpad input. I may end up going with your solution to add extra properties to those scripts. Thanks for the feedback!
     
  38. Khang_Pham

    Khang_Pham

    Joined:
    Feb 28, 2021
    Posts:
    13
    Hi there @chris-massie!

    I have a question, if you don't mind me asking:
    In a scene where multiple Interactors are next to each other (e.g. multiple Socket Interactors inside a chest), how does the toolkit prioritize which Interactor will interact with the Interactable?
    I want only one Interactor at a time (e.g. the one closest to the Interactable) to trigger its OnHover event, etc.

    How would one do this?

    As far as I can tell from the documentation, there is only a list for the other way around, m_ValidTargets, which defines the list of Interactables that an Interactor can interact with.
    What I need, however, is a list of all the Interactors that an Interactable can interact with.

    Are you able to give me some documentation that I could read into?
    Your help would be much appreciated!
     
  39. AvinashB9

    AvinashB9

    Joined:
    Feb 10, 2021
    Posts:
    14
    @chris-massie

    Sorry to repost this.

    I am having an issue with the requirement below.

    I have an XRGrabInteractable on a simple gun mag. In the other hand I have a gun attached to my hand, with an XRSocketInteractor added to a child GameObject that has a box collider set as a trigger.

    When I hover the mag close to the socket interactor's trigger, I want the mag to be force-selected/moved to the target without releasing the grip/grab button. How do I achieve this force select/move when hovering the socket with the mag in hand, or is there a workaround? Do I need to add/modify methods in either the XRGrabInteractable or XRSocketInteractor classes? I appreciate your help on this.

    Thanks,
    Avinash
     
  40. Khang_Pham

    Khang_Pham

    Joined:
    Feb 28, 2021
    Posts:
    13
    Dummy code: hook into the hover event of the pistol slot, then transfer the selection over to the socket:

        // 1. End the hand's selection of the mag (e.g. via a select-exit on the hand interactor)
        // 2. Force the socket to select it (interactor first, then interactable):
        interactionManager.ForceSelect(pistolSlot, gunMag);

    Feel free to read the docs on this kind of thing.
    https://docs.unity3d.com/Packages/c...on.toolkit@1.0/manual/index.html#architecture

    There is also a realistic gun guide from Valem that could be helpful for beginners:
    https://docs.unity3d.com/Packages/c...on.toolkit@1.0/manual/index.html#architecture
     
  41. AvinashB9

    AvinashB9

    Joined:
    Feb 10, 2021
    Posts:
    14
    @Khang_Pham Thanks for the solution. I removed my old code and used yours; I didn't know there was a ForceSelect method. Thanks for the help.
     
  42. Khang_Pham

    Khang_Pham

    Joined:
    Feb 28, 2021
    Posts:
    13
    no problem.
    Glad I could help :)
     
  43. chris-massie

    chris-massie

    Unity Technologies

    Joined:
    Jun 23, 2020
    Posts:
    231
    XRI will iterate each Interactor in the order that they are registered with the XR Interaction Manager. A Socket Interactor requires that an Interactable not already be selected in order for the socket to select it. So in effect, if a Grab Interactable is being held by a Direct Interactor and it overlaps with two different Socket Interactors, upon releasing the grip, the first socket that executes after the Direct Interactor will be the one that successfully selects the Interactable that was just dropped.

    All of the Interactors have the same script execution order by default, so their registration order will be indeterminate unless you manually control this.

    The Socket Interactor builds its valid targets list from any collider that enters its trigger collider, and then sorts it by distance. This is so within the scope of a single socket, if there are multiple Interactable objects overlapping with it, it will prioritize the closest one. In your example where there are two Socket Interactors close enough that an Interactable object can overlap both of them, each one will greedily try to select it and show the hover mesh, however only the first one that executes will successfully select it.

    If you want to change the Socket Interactor so it is aware of other Socket Interactors, you'll need to create a new behavior that derives from it so you can override the CanSelect and CanHover methods. Then each socket can check with some other behavior that manages the group of sockets, so that each one is only able to select/hover the interactable object if it is the closest socket to it.
     
  44. Khang_Pham

    Khang_Pham

    Joined:
    Feb 28, 2021
    Posts:
    13
    Thank you very much for the explanation! I had dug into the code and come to the same conclusion.
    The explanation you just wrote is a very good rundown, and it might make sense to add it to the official documentation as well (if it's not already there).

    Overriding the CanSelect and CanHover methods is definitely the easiest way to do it. I will have to dig into the details of how the XR Interaction Toolkit manages its list of interactables first though, since I plan to mimic that logic for my valid-interactors list. Using the same logic you use for the interactable list makes sense, as a solution I write myself could easily hurt performance (checking the distance of all interactors all the time, etc.).

    I appreciate the explanation and am very much looking forward to the next release and the next docs version!
     
  45. VR_Junkie

    VR_Junkie

    Joined:
    Nov 26, 2016
    Posts:
    77
    Hello, when using the XR Interaction Debugger I see there is a value called indexTouch that turns true when I touch the trigger button. I can't seem to find indexTouch when trying to bind an action in my action map. Please advise.
     
  46. chris-massie

    chris-massie

    Unity Technologies

    Joined:
    Jun 23, 2020
    Posts:
    231
    indexTouch is an alias for the triggerTouched control defined as part of the Oculus Touch Controller. Your Input Action may need to be set to an Axis control type for that binding to appear as an option in the binding path.
     
  47. R1PFake

    R1PFake

    Joined:
    Aug 7, 2015
    Posts:
    540
    Looks like version 1.0.0-pre.4 is out already; should this thread be updated?
    By the way, I had version pre.3 and for some reason the Unity Package Manager didn't show that a newer version was available (usually there's a changed icon and an update button), so I had to update to pre.4 manually.
     
  48. VR_Junkie

    VR_Junkie

    Joined:
    Nov 26, 2016
    Posts:
    77
    When binding the action, my Action Type is set to "Value", my Control Type is set to "Axis", and the binding I am using is "triggerTouched [LeftHand Oculus Touch Controller]". I am not getting the touch response when debugging. Am I doing something wrong?
     
  49. chris-massie

    chris-massie

    Unity Technologies

    Joined:
    Jun 23, 2020
    Posts:
    231
    I'm assuming you're able to create the binding for the Input Action in that window now. To read values from an Input Action with ReadValue<float>(), you have to first call Enable() on the Input Action, otherwise it will just return the default value. You can use the Input Action Manager behavior, which will automatically enable all of the input actions contained in the assets it references. You probably already have one in your scene to enable the input actions for selecting and the controller poses, so you can just add your custom asset to the list in the Inspector.

    We'll be making a new post soon. Which version of Unity are you using where you didn't see the update button?
     
  50. XSpitFire

    XSpitFire

    Joined:
    Jan 22, 2018
    Posts:
    15
    I am calling ForceSelect on the manager, but my object does not get selected. As a test, I added the object as the interactor's starting interactable, and on startup it gets selected. I then removed it from the starting interactable and called ForceSelect myself with the same parameters (after the game started), but it does not attach. Any ideas?