Official XR Interaction Toolkit Preview Release (0.9)

Discussion in 'XR Interaction Toolkit and Input' started by mfuad, Dec 17, 2019.

Thread Status:
Not open for further replies.
  1. hareharu

    hareharu

    Joined:
    Nov 22, 2014
    Posts:
    5
    These methods just invoke onActivate/onDeactivate events, and you can invoke them directly.
     
  2. TobySch

    TobySch

    Joined:
    Jun 15, 2018
    Posts:
    8
    @hareharu What do you mean by "invoke them directly"? Directly from the manager?

    I'm facing the same problem as @Utarastas. It would be great if the user could decide whether OnActivate only affects the selected object or not. The current behaviour adds an unnecessary restriction, and the alternative could easily be implemented by handling the events yourself, without having to extend or edit the toolkit's base or manager classes.
     
    Last edited: Aug 31, 2020
  3. emrys90

    emrys90

    Joined:
    Oct 14, 2013
    Posts:
    755
    Has this been abandoned? The last update was 5 months ago. Please, Unity, give us some guidance on whether we should start looking elsewhere and stop using this plugin.
     
  4. MaskedMouse

    MaskedMouse

    Joined:
    Jul 8, 2014
    Posts:
    1,091
    I'm fairly certain they wouldn't abandon this package. It is one of the most essential packages in the XR tech stack.
    Imho, there is a lack of communication towards the community, and a roadmap of what is planned is missing.

    @Matt_D_work is there anything you could share about this, anything other than "work is being done, don't worry"?
    What is taking so long to update this package? Some transparency would be nice.
     
  5. hareharu

    hareharu

    Joined:
    Nov 22, 2014
    Posts:
    5
    I meant, invoke it like any other event. Something like this:
    Code (CSharp):
    using UnityEngine;
    using UnityEngine.InputSystem;
    using UnityEngine.XR.Interaction.Toolkit;

    public class InteractorActivation : MonoBehaviour
    {
        private XRBaseInteractable interactable;
        private XRBaseInteractor interactor;
        private InputAction activation;

        void Start()
        {
            interactable = GetComponent<XRBaseInteractable>();
            interactor = FindObjectOfType<XRBaseInteractor>();
            activation = new InputAction(binding: "<Keyboard>/A");
            activation.performed += _ => interactable.onActivate?.Invoke(interactor);
            activation.canceled += _ => interactable.onDeactivate?.Invoke(interactor);
            activation.Enable();
        }
    }
    Just keep in mind that if you use the interactor somewhere in the event code (for example, to perform different actions depending on whether it is the left or the right hand), you will need to take into account that FindObjectOfType may return either one.
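    If you do need to know which hand is involved, a minimal sketch (assuming the interactor and its XRController live on the same GameObject; the helper name is just something I made up) could look like this:
    Code (CSharp):
    using UnityEngine.XR;
    using UnityEngine.XR.Interaction.Toolkit;

    // Hypothetical helper: look up the XRController next to an interactor
    // to find out which hand it represents before reacting to onActivate.
    public static class InteractorHandUtil
    {
        public static bool IsLeftHand(XRBaseInteractor interactor)
        {
            var controller = interactor.GetComponent<XRController>();
            return controller != null && controller.controllerNode == XRNode.LeftHand;
        }
    }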
     
  6. TobySch

    TobySch

    Joined:
    Jun 15, 2018
    Posts:
    8
    Thanks @hareharu for clarifying.

    After fiddling around for a couple of days, I finally found a solution that worked for me. Maybe this will help you as well, @Utarastas.

    I ended up bypassing the event triggers on the interactables by using a separate controller for the activate action.
    This controller gets activated automatically as soon as my activation button is pressed. You will have to configure this similarly to a teleport controller in your XRControllerManager. The activation controller also uses the select event to trigger actions on the interactables, but we will filter that out later.

    First, I created a separate controller with a custom interactor script on it. This custom interactor basically does nothing and only serves to identify the activation controller later. In my case, it derives directly from 'XRDirectInteractor'.

    Code (CSharp):
    using UnityEngine.XR.Interaction.Toolkit;

    /// <summary>
    /// This class adds no functionality and derives directly from 'XRDirectInteractor'.
    /// Its only purpose is to mark the interactor as an 'activation' interactor so that
    /// selection events can be bypassed on any interactable deriving from XRActivationInteractable.
    /// </summary>
    public class XRDirectActivateInteractor : XRDirectInteractor
    {
    }
    I then created a subclass of (in my case) 'XRGrabInteractable' to filter any 'OnSelectEnter' calls coming from interactors of type 'XRDirectActivateInteractor'.

    Code (CSharp):
    using UnityEngine.XR.Interaction.Toolkit;

    public class XRActivationInteractable : XRGrabInteractable
    {
        protected override void OnSelectEnter(XRBaseInteractor interactor)
        {
            // Bypass the select event when the interactor is an 'XRDirectActivateInteractor'
            // and call the OnActivate method instead.
            if (interactor is XRDirectActivateInteractor)
            {
                base.OnActivate(interactor);
            }
            else
            {
                base.OnSelectEnter(interactor);
            }
        }

        protected override void OnSelectExit(XRBaseInteractor interactor)
        {
            // Bypass the select event when the interactor is an 'XRDirectActivateInteractor'
            // and call the OnDeactivate method instead.
            if (interactor is XRDirectActivateInteractor)
            {
                base.OnDeactivate(interactor);
            }
            else
            {
                base.OnSelectExit(interactor);
            }
        }
    }
    You can then simply use the 'OnActivate' and 'OnDeactivate' events on the interactable; a sketch of wiring them up from code follows below.
    This works fine for me, but be aware that you can no longer use the XRController's 'activate' method in its default manner.
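    As a hedged example (the ActivationResponder name is mine, and it assumes the 0.9-era event signature where the interactor is passed to the listener):
    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR.Interaction.Toolkit;

    // Hypothetical listener for the activation events routed by XRActivationInteractable.
    public class ActivationResponder : MonoBehaviour
    {
        private XRActivationInteractable interactable;

        void Awake()
        {
            interactable = GetComponent<XRActivationInteractable>();
            interactable.onActivate.AddListener(OnActivated);
            interactable.onDeactivate.AddListener(OnDeactivated);
        }

        void OnActivated(XRBaseInteractor interactor) { Debug.Log($"Activated by {interactor.name}"); }
        void OnDeactivated(XRBaseInteractor interactor) { Debug.Log($"Deactivated by {interactor.name}"); }
    }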

    Hope this will help someone.


    Finally, some words to Unity:
    It's very disappointing to see how little support and development goes into this toolkit. In my eyes it has great potential and could help us XR developers a lot. I like the idea of a simple, lightweight toolkit that can be extended for your own needs. But at this stage it is not really usable as an alternative to existing toolkits. Although it is in preview, it does not even seem halfway done. There are a lot of open ends and limitations built into the code, and some functionality may need to be reworked. The LayerMasks, for example, work very counterintuitively.
    Please keep developing this!
    An official statement from Unity on the toolkit would be really nice and helpful. It's difficult to start using a toolkit in your projects without knowing whether it will be developed further in the future.
     
    Shizola and gjf like this.
  7. petegaley

    petegaley

    Joined:
    Aug 27, 2020
    Posts:
    1
    Hi all! I wanted to share a tip that might save you a bit of headache. If you're adding a custom reticle using the XR Interactor Reticle Visual component, make sure the reticle prefab doesn't have a collider on it, or you'll get spammed with never-ending hover / unhover events. Yes, I'm sure that seems really obvious, but it still took me a couple of days of fiddling with every setting to get to the bottom of it :-/
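    If you want a guard against that, a tiny sketch (the component name is mine) you could drop on the reticle prefab itself might look like this:
    Code (CSharp):
    using UnityEngine;

    // Hypothetical safety net: disable any colliders on the reticle instance at runtime
    // so it can never be hit by the interactor's raycast and spam hover/unhover events.
    public class ReticleColliderStripper : MonoBehaviour
    {
        void Awake()
        {
            foreach (var col in GetComponentsInChildren<Collider>(true))
                col.enabled = false;
        }
    }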
     
    Dalton-Lima likes this.
  8. yarsrvenge

    yarsrvenge

    Joined:
    Jun 25, 2019
    Posts:
    87
    I've started to migrate my objects to Velocity Tracked since the rest of my game is making some progress. I was originally going to wait for an update to the XR toolkit, but I'm not sure when that will happen. I've hacked a few things to get it working well for the most part, but I'm noticing that held objects will still go through a collider if they move fast enough. Think of a gun held in the hand going through a wall. I've tried the usual suspects like continuous collision detection and making the wall collider thicker, but no dice. Thought I would see if anyone has experience with this. I suppose it would be possible to detect when the object collides with the wall using a trigger and stop the movement somehow, but that seems like a lot to work around. I assumed the physics engine would handle stopping it.
     
  9. FishStickies94

    FishStickies94

    Joined:
    Jul 27, 2019
    Posts:
    70
    If you want accurate physics, I recommend moving over to a joint-based system instead. We've spent the past few months developing a system on top of the XR Toolkit and it's far better than what's on offer. It's only really needed, though, if physics is a large part of the experience you are developing.
     
  10. yarsrvenge

    yarsrvenge

    Joined:
    Jun 25, 2019
    Posts:
    87
    Thanks. Yeah, today I started integrating a configurable joint that is enabled when there is a collision and disabled once the object is no longer colliding. It seems to be working pretty well, even with smooth locomotion. Time will tell.
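    For anyone curious, a very rough sketch of that idea (the component name, the handRigidbody reference and the locked joint settings are all mine and will need tuning for your particular grab setup):
    Code (CSharp):
    using UnityEngine;

    // Hypothetical sketch: while the held object is touching something, connect it to the
    // hand through a ConfigurableJoint so the physics engine resolves the contact, then
    // remove the joint again once the contact ends.
    public class CollisionJointSwitcher : MonoBehaviour
    {
        public Rigidbody handRigidbody;   // rigidbody that follows the tracked controller
        private ConfigurableJoint joint;
        private int contactCount;

        void OnCollisionEnter(Collision collision)
        {
            contactCount++;
            if (joint == null)
            {
                joint = gameObject.AddComponent<ConfigurableJoint>();
                joint.connectedBody = handRigidbody;
                // Locked here for simplicity; real settings depend on how the grab is driven.
                joint.xMotion = joint.yMotion = joint.zMotion = ConfigurableJointMotion.Locked;
                joint.angularXMotion = joint.angularYMotion = joint.angularZMotion = ConfigurableJointMotion.Locked;
            }
        }

        void OnCollisionExit(Collision collision)
        {
            contactCount = Mathf.Max(0, contactCount - 1);
            if (contactCount == 0 && joint != null)
            {
                Destroy(joint);
                joint = null;
            }
        }
    }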
     
  11. Trekopep

    Trekopep

    Joined:
    Dec 18, 2013
    Posts:
    15
    Hi there! I'm trying to get the XR UI system to work with an XR Ray Interactor, and feel like I must be missing something obvious. My UI buttons show their hover animations correctly when I hover over them, but clicking them doesn't do anything. I also tried Hover To Select, just in case it was a button issue. The test scene I made consists only of an XR UI Canvas with a button on it and a Stationary XR Rig, both created from the GameObject>XR menu.

    As a side note, I want this to work with an existing SteamVR setup, which seems to be working just as well except for the same issue of not being able to select UI. Do XRRayInteractor.isSelectActive, OnSelectEnter, etc. apply to UI as well, or is that exclusively for selecting 3D non-UI GameObjects? (I notice that XR Controller has separate configurations for "Select Usage" and "UI Press Usage") I can override XRRayInteractor to make isSelectActive true when I want, but I'm not sure if that's actually the right place to be looking to get UI to work.
     
  12. cruizea5

    cruizea5

    Joined:
    Sep 18, 2020
    Posts:
    2
    I have the WorldInteractionDemo (Unity 2019.3.0f1) installed, but when I'm in play mode on the Rift there's just the loading screen and it doesn't move on.

    Edit: Issue is solved! :)
     
    Last edited: Oct 11, 2020
  13. Ishnubarak

    Ishnubarak

    Joined:
    Aug 28, 2019
    Posts:
    6
    I suspect you are facing this issue: https://github.com/ValveSoftware/unity-xr-plugin/issues/16
     
  14. yarsrvenge

    yarsrvenge

    Joined:
    Jun 25, 2019
    Posts:
    87
    I am trying to implement a way to change the position of the held object after it is already held, essentially changing the attach transform. Think of holding a bottle in different ways by pulling the trigger while the object is already grabbed. If you've played Saints & Sinners, you know the mechanic.

    You cannot change the attachTransform and have the new position reflected after you have already grabbed it, so I've been trying to set the interactable's local position to the correct attach transform a couple of ways, including Rigidbody.MovePosition, but it does not seem to be working. Curious if anyone has tackled this. I feel like I'm missing something obvious.
     
  15. yarsrvenge

    yarsrvenge

    Joined:
    Jun 25, 2019
    Posts:
    87
    I think I answered my own question. I was updating the wrong attachTransform. If you update the one on the interactor rather than the interactable, you can change it while the object is being held, and it actually worked. There are just some side effects I have to work out.
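    Roughly what I mean, as an untested sketch (the component and field names are mine; note that the attach transform is a child of the controller, so a world-space snap becomes a persistent local offset):
    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR.Interaction.Toolkit;

    // Hypothetical sketch: while the object is held, move the *interactor's* attach
    // transform to a new grip pose and the held XRGrabInteractable follows it.
    public class RegripWhileHeld : MonoBehaviour
    {
        public Transform alternateGripPose;   // assumed: a child transform marking the new grip

        public void ApplyAlternateGrip(XRBaseInteractor interactor)
        {
            interactor.attachTransform.position = alternateGripPose.position;
            interactor.attachTransform.rotation = alternateGripPose.rotation;
        }
    }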
     
    Gustavo-Quiroz likes this.
  16. Shizola

    Shizola

    Joined:
    Jun 29, 2014
    Posts:
    470
  17. yashna0107

    yashna0107

    Joined:
    Sep 6, 2019
    Posts:
    3
    Yes, it is removed from the Package Manager. I had to add it to my project through the manifest.json file.
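    For reference, adding the package to the dependencies block of Packages/manifest.json looks roughly like this (the version string is only an example; use whichever preview version your project needs):
    Code (JSON):
    {
      "dependencies": {
        "com.unity.xr.interaction.toolkit": "0.9.4-preview"
      }
    }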
     
  18. kblood

    kblood

    Joined:
    Jun 20, 2012
    Posts:
    92
    So we cannot use the XR Toolkit in Unity 2020.1?

    I cannot find the UnityEngine.XR.Interaction.Toolkit namespace now. The whole "interaction" part seems to just be gone. I have installed the XR stuff manually, but have been unable to get it to work.
     
    DavidZobrist likes this.
  19. harleydk

    harleydk

    Joined:
    Jul 30, 2012
    Posts:
    41
    I have it in my 2020.1 here.



    That's on a project that used to be 2019.13, then was updated to latest.
     
  20. Shizola

    Shizola

    Joined:
    Jun 29, 2014
    Posts:
    470
    You can still get the toolkit through the package manager. There's a new Package Manager section in Project Settings, and this is now where you enable preview packages.
     
  21. goout88

    goout88

    Joined:
    Sep 18, 2017
    Posts:
    4
  22. goout88

    goout88

    Joined:
    Sep 18, 2017
    Posts:
    4
    Hi, what would you recommend from the existing toolkits?
     
  23. Trekopep

    Trekopep

    Joined:
    Dec 18, 2013
    Posts:
    15
    Ah, thanks for the link! Seems like this is probably the issue.

    Hm, I took a look at the first method, and it turns out I'm already doing that solution: I have the legacy "Virtual Reality Supported" option checked, and I don't have the XR Plugin Management package installed, so it seems like this should work.

    I took a look at the second method, and tried just rewriting InputHelpers. Putting a breakpoint in XRController, line 324
    HandleInteractionAction(controllerNode, m_UIPressUsage, ref m_UIPressInteractionState);

    I can see that it is reading the button as pressed and the InteractionState is being set to active, but the UI still doesn't get selected, which makes me wonder if the reading of the input isn't the real issue here. I dug around for a while, but wasn't able to figure out exactly where the UI selection was supposed to be happening. (I looked at things like XRInteractionManager.SelectEnter, but I'm not sure if the UI actually counts as an XRBaseInteractable, or if that's only for things like 3D objects?)

    To clarify, the only reason I'm trying to use the new XR system is to interact with worldspace UI, and the only thing that appears to not be working is clicking the UI. If there's some way to manually trigger a UI click/release on an XRRayInteractor, that would solve all my problems.
     
  24. freso

    freso

    Joined:
    Mar 19, 2013
    Posts:
    73
    The documentation for XRSocketInteractor says: https://docs.unity3d.com/Packages/c...R.Interaction.Toolkit.XRSocketInteractor.html
    Code (CSharp):
    selectedInteractableMovementTypeOverride
    Gets the movement type to use when overriding the selected interactable's movement (always Kinematic for sockets).
    Always Kinematic for sockets. Why? It just makes the movement of the socketed item jerky. Commenting this out makes the movement stable. It would have been nice if the docs explained WHY it does something, instead of just restating the getter. Does anyone know why?
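    If you'd rather not edit the package source, one possible workaround (assuming the property is overridable and nullable in your toolkit version; check the package source before relying on this) is a socket subclass that returns no override:
    Code (CSharp):
    using UnityEngine.XR.Interaction.Toolkit;

    // Hypothetical socket that stops forcing Kinematic movement on whatever it holds.
    public class NonKinematicSocketInteractor : XRSocketInteractor
    {
        // null = respect the interactable's own Movement Type setting.
        public override XRBaseInteractable.MovementType? selectedInteractableMovementTypeOverride => null;
    }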
     
    Last edited: Oct 18, 2020
    jimt123 likes this.
  25. TimeWalk-org

    TimeWalk-org

    Joined:
    Nov 3, 2014
    Posts:
    38
    I too would like to be able to implement something like the "distance grab" we see in Oculus Home. Aim ray at an object, press GRIP to grab it, push joystick up/down to move object away/toward hand. Seems like a very basic feature but I can find no working XR examples anywhere. Coming someday?
     
  26. harleydk

    harleydk

    Joined:
    Jul 30, 2012
    Posts:
    41
    Methinks that's quite a specific ask, and you'll have to - or rather should - implement it yourself. It wouldn't be hard: look into locking the position of the suitable axis of the interactable and driving it yourself in script.
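    As a very rough sketch of the joystick push/pull part (the component name, speed value and hand choice are all mine, and it assumes the ray interactor moves the held object toward its attach transform):
    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR;
    using UnityEngine.XR.Interaction.Toolkit;

    // Hypothetical sketch: while the ray interactor has something selected, slide its
    // attach transform along the ray based on the thumbstick's vertical axis.
    public class AttachPushPull : MonoBehaviour
    {
        public XRRayInteractor rayInteractor;
        public XRNode hand = XRNode.RightHand;
        public float speed = 2f;   // metres per second at full stick deflection

        void Update()
        {
            if (rayInteractor == null || rayInteractor.selectTarget == null)
                return;

            var device = InputDevices.GetDeviceAtXRNode(hand);
            if (device.TryGetFeatureValue(CommonUsages.primary2DAxis, out Vector2 stick))
            {
                // Stick up pushes the object away, stick down pulls it closer.
                rayInteractor.attachTransform.localPosition +=
                    Vector3.forward * stick.y * speed * Time.deltaTime;
            }
        }
    }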
     
  27. mfuad

    mfuad

    Unity Technologies

    Joined:
    Jun 12, 2018
    Posts:
    334
    Trekopep and P_Jong like this.