
Official XR Interaction Toolkit Preview Release (0.9)

Discussion in 'XR Interaction Toolkit and Input' started by mfuad, Dec 17, 2019.

Thread Status:
Not open for further replies.
  1. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,454
    I am using the UI as a pause menu for my game, and I've noticed that the XR Ray Interactor doesn't always find the UI; it is quite temperamental. I have a reproduction project I can share for this, but it is quite large, so I might make a smaller one. Occasionally, when loading into a scene for the first time, the raycasters won't immediately find the UI, requiring me to walk about or turn with the analogue stick for a while before the raycasters interact with it.

    Shall I report a bug for this?
     
  2. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    182
    @ROBYER1
    Yes, please file a bug, so I can track it properly.
    First instinct is that it has to do with the Ray Interactor already looking at a UI object when enabled, but I'll definitely have to dig in. I wrote the whole ray-UI stuff, and I don't doubt it still has a couple of little bugs hidden in there somewhere.
     
    Last edited: Feb 12, 2020
    harleydk and ROBYER1 like this.
  3. linojon

    linojon

    Joined:
    Aug 25, 2014
    Posts:
    118
    Hi. The Ray Interactor has an Enable Interaction with UI GameObjects checkbox, but the Direct Interactor does not. I can implement direct interactions manually by adding an XR Simple Interactable, constraining the Rigidbody, adding a Collider to the button UI object, and wiring the XR events into the button's Select(), for example. Is better UI interaction planned for the Direct Interactor?
     
  4. MaskedMouse

    MaskedMouse

    Joined:
    Jul 8, 2014
    Posts:
    1,092
    Will the XR Interaction Toolkit supply a way of interacting with a VR input field?
    At this point we have our own home-made VR keyboard, which kind of works, but of course the input field loses focus when interacting with the virtual keyboard.
     
  5. mgear

    mgear

    Joined:
    Aug 3, 2010
    Posts:
    9,442
    Briefly tested the example scene (on Oculus, using a Quest with Link).

    Issue:
    when trying to throw those cubes (any of the 3 types),
    there is about a 0.5 s delay before the cube is released and flies away (and it feels stuck on release, instead of properly following the velocity).
     
    harleydk likes this.
  6. Matt_D_work

    Matt_D_work

    Unity Technologies

    Joined:
    Nov 30, 2016
    Posts:
    202
    Curious. It _should_ be fine. Raise a bug (and take a video if you can!)
     
  7. mgear

    mgear

    Joined:
    Aug 3, 2010
    Posts:
    9,442
    I noticed that adjusting this value to 0.99 helps, but then picking up requires a full press or something:
    upload_2020-2-15_0-38-14.png

    *Comparing this to the SteamVR interaction demo scene, throwing those cubes there feels more natural/responsive.

    But I'll test a bit more first.

    *Some improvement if all the smoothing is removed from the cube:
    upload_2020-2-15_21-3-27.png

    Small video clip comparing box handling with SteamVR (though it's not that clearly visible in the video..)


    Comparison:
    XR Kit: with slow movement, the box stutters; with fast movement, the box lags behind, not perfectly following
    SteamVR: smooth movement, fast follow (since it's parented)

    (Maybe because the box is not parented to the hand in the XR kit? Or some other setting..)
     
    Last edited: Feb 16, 2020
    harleydk likes this.
  8. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,454
    Using the Standalone Input Module from the new Unity Input System causes the UI input module to misbehave.. and by misbehave I mean that if a raycaster is aimed at the UI and you move or turn, it gets stuck on the UI. Most strange!
    upload_2020-2-17_10-30-32.png
     
  9. brunocoimbra

    brunocoimbra

    Joined:
    Sep 2, 2015
    Posts:
    679
    From your screenshot, you are not using the InputModule from the new InputSystem; StandaloneInputModule is from the old InputManager.
     
  10. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,454
    I'm aware of that, but the component being there seems to interfere regardless of whether you use the legacy or new input system. I just took the screengrab after I had reverted to legacy input.
     
  11. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,454
    @StayTalm_Unity Terrain-placed trees seem to always receive the teleporter raycast, no matter what I do with their layer.

    This has been reported at (Case 1220974) [XR Interaction Toolkit] Terrain Trees are picked up by XR Raycasters for Teleporting
     
    Last edited: Feb 18, 2020
    Matt_D_work likes this.
  12. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,454
    **Edit: I am rewriting the input section of this to use the Unity Input System through the new package version. If anyone is interested in the updated version, drop me a message!

    I'm feeling nice and sharing this for anyone looking to set up a free-locomotion character controller for the XR Rig. It's a mishmash of my own logic, the logic in OVRPlayerController, and the Unity Examples FPS character controller. The name is a bit of a lie, as it doesn't use a Rigidbody; it uses the Character Controller component. I won't be able to help people much with it, but take what you want from it and comment out anything you don't need. It takes the form of a LocomotionProvider and has free movement. I cannot state enough that it is not a drag-and-drop solution: it has references to things you won't have or need, so just delete them. Also, the input mostly references the standard Unity input system, so you can adapt it to your own needs by supplying the 'GetInput' function at the bottom of the script with your own input values :)

    Happy free-look locomotion, and I'm interested if anyone can improve it for me also ;)

    There is also a floating-point math glitch when you are exactly at 0,0,0 in the world, where your head will jolt a bit. If you find it annoying, replace line 170 (and so on, where it says something similar) with this:


    Code (CSharp):
    if (Mathf.Round(CurrentDistance * 10) / 10 > 0)
    {
        headsetOffset.position = oldCameraPos - delta;
    }
    It also doesn't play well with slopes, but I have an updated version for that; the logic for that is pretty standard though :)
     

    Attached Files:

    Last edited: Mar 1, 2020
    harleydk, LaCorbiere, shiena and 2 others like this.
  13. UXVirtual

    UXVirtual

    Joined:
    Jan 28, 2016
    Posts:
    2
    Excellent, keen to give this a go. I'm currently working on a physics sandbox demo with support for climbing, projectile weapons, etc., so implementing a free-look locomotion system will be useful.
     
    ROBYER1 likes this.
  14. Matt_D_work

    Matt_D_work

    Unity Technologies

    Joined:
    Nov 30, 2016
    Posts:
    202
    That's due to the way we use the physics system to move the object. It's something we're thinking of ways to minimize. Having the physics timestep match your framerate certainly helps though!
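    The timestep suggestion above can be sketched as a one-off startup tweak. A minimal sketch, not from the thread; the 72 Hz value is an assumption for the Quest, so use your device's actual refresh rate, and note that a smaller fixed timestep makes physics updates more expensive.

    Code (CSharp):
    using UnityEngine;

    // Sketch: align the physics timestep with the headset refresh rate so
    // physics-driven grabbables update once per rendered frame.
    public class MatchPhysicsTimestep : MonoBehaviour
    {
        [SerializeField] float headsetRefreshRate = 72f; // assumption: Quest runs at 72 Hz

        void Awake()
        {
            Time.fixedDeltaTime = 1f / headsetRefreshRate;
        }
    }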
     
  15. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,454
    Drop me a message about it if you have any questions!

    If anyone else wants to collaborate on it or has issues/improvements to suggest, tag me in another forum post about it.
    I don't want to distract from conversation about XR Interaction package here!

    @Matt_D_work I didn't check whether detail meshes are also included in the terrain raycast layer, much like the trees issue I reported above, but it is safe to assume they are. Will there be a workaround for this?

    Currently my playtesters are teleporting up trees, which is not ideal, and we need to use terrain object instancing for Oculus Quest performance reasons. There are of course workarounds, but if the terrain is changing during development it adds quite a lot of extra dev time!
     
  16. seltar_

    seltar_

    Joined:
    Apr 16, 2015
    Posts:
    15
    Is there a way to trigger the Select action on the XRController via script?
    It seems all the fields and methods required to do so are private / internal.
     
    HereLen and MaxInno like this.
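    One workaround for triggering selection from script is to go through the XRInteractionManager rather than the XRController itself. A hedged sketch, assuming the preview's ForceSelect(interactor, interactable) method is accessible in your version (the name and visibility are taken from the preview source and may change):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR.Interaction.Toolkit;

    // Hedged sketch: force an interactor to select an interactable from script,
    // assuming XRInteractionManager.ForceSelect is available in this preview.
    public class ForceSelectOnStart : MonoBehaviour
    {
        public XRBaseInteractor interactor;      // e.g. a hand's Direct Interactor
        public XRBaseInteractable interactable;  // the object to "grab"

        void Start()
        {
            interactor.interactionManager.ForceSelect(interactor, interactable);
        }
    }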
  17. bigdaddio

    bigdaddio

    Joined:
    May 18, 2009
    Posts:
    220
    It could be me, but the whole attach point thing makes very little sense to me. On my XR Rig I have an XR Direct Interactor, which has an attach point; I have an object with an XR Grab Interactable, to which I added an attach transform. When you grab the interactable object, you would think the attach transforms would align. They of course do not. Nothing really makes any sense to me; nothing aligns. How does this work?
     
    a436t4ataf likes this.
  18. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,454
    @StayTalm_Unity Is there any plan to integrate the XRUI Input Module from this with the Unity Input System's Player Input component, to synchronise the input actions with UI interaction?
    upload_2020-2-24_8-29-4.png

    We've been noticing the lack of support for the Unity Input System across all of XR; until the most recent update, XR input devices produced constant errors. Now that is fixed, we are looking to use the Input System, but I fear there may be more barriers along the way.
     
  19. Matt_D_work

    Matt_D_work

    Unity Technologies

    Joined:
    Nov 30, 2016
    Posts:
    202
    New Input System support in XRI is planned. Short term, we'll be putting out a version with a few more bug fixes and an interface for XRController, so it's a little easier to hook up various input back ends until we drop the next major version.
     
    rgbauv and ROBYER1 like this.
  20. PierreLouis29

    PierreLouis29

    Joined:
    Sep 12, 2019
    Posts:
    6
    Hi everyone, I just read all the posts on recent VR dev with Unity and I must say I'm a bit confused.

    If I want to develop with Unity for OpenVR HMDs (Index, Vive, Vive Pro) I should :
    • Use Unity 2019.3
    • Use URP (since it's performance oriented)
    • Install the following packages : XR Management, XR Interaction Toolkit
    • And since OpenVR isn't yet supported by XR Management install these packages : OpenVR Desktop, XR Legacy Input Helpers
    This is the way I've been doing things; please tell me if I'm doing it wrong.

    Also, back in the day I used the SteamVR plugin, which I found way easier to use than the XRI Toolkit. I read in this blog post (https://blogs.unity3d.com/2020/01/24/unity-xr-platform-updates/) that Valve is working on their own VR plugin for Unity. Is this plugin going to be a simple integration of OpenVR support into XR Management, or a full VR plugin that will replace the XR Interaction Toolkit? (a kind of successor to the SteamVR plugin, so to say)

    Thanks
     
  21. Jichaels

    Jichaels

    Joined:
    Dec 27, 2018
    Posts:
    237
    The XR Management plugins are the replacement for OpenVR Desktop (for any SDK, basically), so having both is useless: if you're using OpenVR Desktop, you're not using XR Management.
     
    ROBYER1 likes this.
  22. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,454
    @StayTalm_Unity I am having the hardest time whittling the behaviour down to a specific cause; it happens with the XRI sample scenes too. Could you take a look? It seems to happen more often when loading between scenes, and can happen in the Editor and in builds on the Quest.
     
  23. PierreLouis29

    PierreLouis29

    Joined:
    Sep 12, 2019
    Posts:
    6
    Ok, thank you. Any input about Valve's future SDK? Is it going to be a full SteamVR-like plugin, or just an interface for the XR Management package?
     
    FinalQ likes this.
  24. bigdaddio

    bigdaddio

    Joined:
    May 18, 2009
    Posts:
    220
    Further clarification on the attach point issue: if you have the grab interactable set to Velocity Tracking or Kinematic, then the attach points align. If you set the movement type to Instantaneous, then who knows what's happening; I cannot tell whether it aligns the centers, aligns on the center of the collider, or what.
     
    a436t4ataf, nigel-moore and ROBYER1 like this.
  25. Shizola

    Shizola

    Joined:
    Jun 29, 2014
    Posts:
    476
    In my project I was trying to use this toolkit alongside SteamVR, but I'm having a problem: if a scene is loaded which invokes the SteamVR plugin, the UI interaction of this toolkit seems to stop working.

    You can recreate it just by importing the SteamVR plugin from the Asset Store into the XR toolkit examples project:
    1) Run the Simple Sample scene.
    2) Now run the WorldInteractionDemo scene. The UI interaction won't work until you restart Unity.

    I know most people probably won't be doing this, but I thought it still might be interesting.
     
  26. mikeNspired

    mikeNspired

    Joined:
    Jan 13, 2016
    Posts:
    82
    Has anybody got a good way to start the scene with an interactable object already attached to your hand?

    Right now I'm inheriting from the XRBaseInteractable class.
    I call OnSelectEnter at the start of the class.
    I override OnSelectExit to do nothing so the user can't let go of the item, but I can't get OnActivate to fire without holding the grab button and pulling the trigger.
    This just seems like a roundabout, terrible way to get the job done.

    (I am just trying to have guns attached to the character at the start of the scene.)
     
  27. Matt_D_

    Matt_D_

    Joined:
    Jan 10, 2017
    Posts:
    6
    there should be a "starting interactable" option in the interactor!
     
    createtheimaginable likes this.
  28. mikeNspired

    mikeNspired

    Joined:
    Jan 13, 2016
    Posts:
    82
    You're my hero... Maybe I should actually read the documentation.
     
  29. erizzoalbuquerque

    erizzoalbuquerque

    Joined:
    Jan 23, 2015
    Posts:
    50
    Are you planning to add a VR simulator, like VRTK's, to this example?

    I'm a teacher, and in my class most of the students don't have a VR device, and we don't have enough workstations for each of them at the lab. A VR simulator is nice, so the students can make quick tests at home before using the VR workstations at the university.
     
    createtheimaginable likes this.
  30. harleydk

    harleydk

    Joined:
    Jul 30, 2012
    Posts:
    41
    I think they mentioned earlier how that was in the pipeline. Waiting for this myself :)
     
  31. Z_hannmuell

    Z_hannmuell

    Joined:
    Nov 2, 2017
    Posts:
    4
    Hello there!
    Thanks for the great XR Interaction Toolkit!
    We are going to use it for an upcoming project.
    We found an issue with the XRSocketInteractor: at the moment it is not meant for meshes with multiple materials.

    Here's my fix:

    Code (CSharp):
    //XRSocketInteractor.cs
    //Line 174
    for (int j = 0; j < meshFilter.sharedMesh.subMeshCount; j++)
    {
        Graphics.DrawMesh(
            mesh: meshFilter.sharedMesh,
            matrix: GetInteractableAttachMatrix(hoverTarget, meshFilter.transform.lossyScale * hoveredScale),
            material: interactableHoverMeshMaterial,
            submeshIndex: j,
            layer: gameObject.layer,
            camera: Camera.main);
    }
    Please add this to the package!

    Now we're facing another problem:
    when releasing the grab button at the socket interactor, the object doesn't snap to the right pivot.
    I guess this is because it's using the pivot of the first submesh, not of the overall object mesh,
    or GetInteractableAttachMatrix isn't calculating it the correct way.
     
    Last edited: Mar 3, 2020
    createtheimaginable and ROBYER1 like this.
  32. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,454
    I also have this issue
     
  33. Z_hannmuell

    Z_hannmuell

    Joined:
    Nov 2, 2017
    Posts:
    4
    Found the solution.
    The way the phantom meshes are calculated needs to be changed.

    Code (CSharp):
    //Line 149 in XRSocketInteractor.cs
    //Inverse rotation calculation
    Vector3 finalPosition = attachTransform.position - interactableLocalPosition;
    Quaternion finalRotation = Quaternion.Inverse(attachTransform.rotation * interactableLocalRotation);
    EDIT: ^^ the above solution only worked for one particular case
     
    Last edited: Mar 3, 2020
  34. Matt_D_work

    Matt_D_work

    Unity Technologies

    Joined:
    Nov 30, 2016
    Posts:
    202

    Please lob in a bug for this and we'll get it sorted :) Great catch!
     
  35. prismexpress

    prismexpress

    Joined:
    Nov 15, 2017
    Posts:
    2
    This works for me, but after teleporting, it appears that the Line Renderer from the base controller state (i.e. the straight line, not the projectile curve) stays in the old location for a handful of frames. I'm still trying to wrap my head around how the state machine from the Controller Manager script interacts with the teleportation scripts, so I haven't been able to solve this. Any ideas?

    Here's a video demonstrating the issue:
     
    jiraphatK likes this.
  36. stippy

    stippy

    Joined:
    Mar 1, 2020
    Posts:
    9
    The setup to get it up and running seems to be straightforward (which is really good for a Unity and VR beginner).
    As far as I can see, there is no smooth locomotion system included. That was a major bummer for me, as it seems to be a major system nearly everyone needs (apart from teleportation). I will be looking into how to implement that myself.
     
    ROBYER1 likes this.
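    For anyone wanting to sketch smooth locomotion on top of the toolkit's LocomotionProvider base class, here is a minimal, hypothetical example (the class and field names are made up, and it handles no gravity, slopes, or collisions) that reads the left thumbstick and moves the rig relative to head yaw:

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR;
    using UnityEngine.XR.Interaction.Toolkit;

    // Hypothetical minimal smooth-locomotion provider. Assumes an XR Rig with
    // a LocomotionSystem assigned; no gravity or collision handling.
    public class SimpleSmoothLocomotion : LocomotionProvider
    {
        public float speed = 2f;

        void Update()
        {
            var device = InputDevices.GetDeviceAtXRNode(XRNode.LeftHand);
            if (!device.TryGetFeatureValue(CommonUsages.primary2DAxis, out Vector2 input))
                return;

            var rig = system.xrRig;
            if (rig == null || input.sqrMagnitude < 0.01f)
                return;

            if (CanBeginLocomotion() && BeginLocomotion())
            {
                // Move relative to the camera's yaw so "forward" follows the head.
                float yaw = rig.cameraGameObject.transform.eulerAngles.y;
                Vector3 direction = Quaternion.Euler(0f, yaw, 0f) * new Vector3(input.x, 0f, input.y);
                rig.transform.position += direction * speed * Time.deltaTime;
                EndLocomotion();
            }
        }
    }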
  37. jariwake

    jariwake

    Joined:
    Jun 2, 2017
    Posts:
    100
    Hey. I have been experimenting with XRIT for a while and it seems very useful.

    A couple of questions I could not find answers to:

    - How do I react to the physical controller buttons, like trigger, grip, and A, B, X, Y, etc.?
    - How do I make the teleport line visualizer only visible when, for example, holding the trigger? There are no settings for this in the XRController or XRRayInteractor.
     
    Last edited: Mar 5, 2020
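    In the meantime, the physical buttons can be polled through the device-based XR input API; a minimal sketch using UnityEngine.XR.InputDevices and the documented CommonUsages mappings (this is independent of the XR Interaction Toolkit):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR;

    // Sketch: poll the right controller's buttons each frame via the
    // device-based XR input API.
    public class ReadControllerButtons : MonoBehaviour
    {
        void Update()
        {
            var right = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);

            if (right.TryGetFeatureValue(CommonUsages.triggerButton, out bool trigger) && trigger)
                Debug.Log("Trigger pressed");
            if (right.TryGetFeatureValue(CommonUsages.gripButton, out bool grip) && grip)
                Debug.Log("Grip pressed");
            if (right.TryGetFeatureValue(CommonUsages.primaryButton, out bool primary) && primary)
                Debug.Log("A pressed"); // primaryButton maps to A or X depending on hand
        }
    }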
  38. Matt_D_work

    Matt_D_work

    Unity Technologies

    Joined:
    Nov 30, 2016
    Posts:
    202
    First: we're doing wider input work next! It will be a little while though :) but it's planned. For now you can always poke through what you need from the code.

    As for the line, you'll need to customize the XR Controller for that behavior :) but that's what it's there for!
     
    rgbauv and ROBYER1 like this.
  39. dakomz

    dakomz

    Joined:
    Nov 12, 2019
    Posts:
    40
    What is the workflow for single controller when developing on Rift (or Quest w/ Link) but deploying to Go?

    e.g. do I just pick a hand (like LeftController) and delete it?
     
    ROBYER1 likes this.
  40. jiraphatK

    jiraphatK

    Joined:
    Sep 29, 2018
    Posts:
    300
    I have the same issue. It was especially glaring when doing a snap turn.
     
  41. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,454
    @StayTalm_Unity @Matt_D_work

    I have found the direct cause of the raycasters sometimes not working for UI/menus when changing scenes.

    In ControllerManager.cs's OnEnable function, you call ClearAll() for all controller states, which sometimes somehow disables the ray interactor and line visual when it shouldn't.

    We have commented out the ClearAll() calls in ControllerManager.cs, as below, to avoid this issue. It was happening a lot in builds on the Quest and our testers didn't know why; it happened a few times in the Editor on rare occasions, and I have now successfully caught it!


    Code (CSharp):
    void OnEnable()
    {
        m_LeftTeleportDeactivated = false;
        m_RightTeleportDeactivated = false;

        m_RightControllerState.Initialize();
        m_LeftControllerState.Initialize();

        m_RightControllerState.SetGameObject(ControllerStates.Select, m_RightBaseController);
        m_RightControllerState.SetGameObject(ControllerStates.Teleport, m_RightTeleportController);

        m_LeftControllerState.SetGameObject(ControllerStates.Select, m_LeftBaseController);
        m_LeftControllerState.SetGameObject(ControllerStates.Teleport, m_LeftTeleportController);

        //m_LeftControllerState.ClearAll();
        //m_RightControllerState.ClearAll();
     
  42. Matt_D_work

    Matt_D_work

    Unity Technologies

    Joined:
    Nov 30, 2016
    Posts:
    202
    At the moment, yes. We're hoping to fix that soon.
     
    dakomz likes this.
  43. Matt_D_work

    Matt_D_work

    Unity Technologies

    Joined:
    Nov 30, 2016
    Posts:
    202
    Interesting. Although it seems that the problem is that it's not dropping back into the right state after a load, right? As the OnEnable is clearing all the states (and effectively turning everything off). I'll have a look.
     
    ROBYER1 likes this.
  44. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,454
    Correct. It would even happen in the Editor; perhaps a timing issue?
     
  45. jiraphatK

    jiraphatK

    Joined:
    Sep 29, 2018
    Posts:
    300
    Does anyone know the proper way to swap controllers?
    In ControllerManager.cs they use this
    upload_2020-3-7_23-52-14.png
    to enable and disable the controllers. Why?
    Can I just enable and disable the GameObject instead of doing this?
     
  46. a436t4ataf

    a436t4ataf

    Joined:
    May 19, 2013
    Posts:
    1,933
    Bug: (Case 1225985) - hard crash of the Editor in the InputDevice.TryGetValue API call (muahahahaha!)

    Bug (maybe?): There appears to be no support for the Quest's capacitive-touch trigger button?

    Re: the second item. I eventually found the OculusUsages class (please add it to the documentation here: https://docs.unity3d.com/Manual/xr_input.html#XRInputMappings - I only discovered it existed by accident!), but the capacitive options there always fail on the Quest. From the Oculus/Quest demos, we appear to have capacitive touch hardware on the trigger and grip (although I tried to find an official list from Oculus of which inputs are capacitive - is it every button? some? - I couldn't find any details), yet there seems to be no way to access them in Unity?
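    For the cross-platform touch usages (as opposed to the Oculus-specific OculusUsages), a hedged sketch of what can be polled; whether a given control actually reports touch depends on the device and runtime, so treat these usages as the ones to try rather than a guarantee:

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR;

    // Sketch: poll the cross-platform capacitive-touch usages. Support varies
    // by device and runtime; some controls may never report a value.
    public class ReadCapacitiveTouch : MonoBehaviour
    {
        void Update()
        {
            var right = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);

            if (right.TryGetFeatureValue(CommonUsages.primaryTouch, out bool aTouched))
                Debug.Log("A touched: " + aTouched);
            if (right.TryGetFeatureValue(CommonUsages.secondaryTouch, out bool bTouched))
                Debug.Log("B touched: " + bTouched);
        }
    }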
     
  47. dakomz

    dakomz

    Joined:
    Nov 12, 2019
    Posts:
    40
    Thanks. Is the upcoming support for this flow related to the new Input System, or is it a totally separate thing?
     
  48. dakomz

    dakomz

    Joined:
    Nov 12, 2019
    Posts:
    40
  49. a436t4ataf

    a436t4ataf

    Joined:
    May 19, 2013
    Posts:
    1,933
    Is there a more up-to-date version of the API docs anywhere? I'm trying to reverse-engineer some of the missing/not-yet-written high-level docs (e.g. details on how to make custom Interactables - and trying to debug the situation mentioned earlier in this thread: https://forum.unity.com/threads/xr-interaction-toolkit-preview-release.795684/page-4#post-5451039), and running into places where the API docs seem missing too.

    Without user-guide docs, I can figure things out from the API docs. But without API docs, I have to read through the source code and reverse-engineer the whole system :(.

    For instance:

    "ProcessInteractable(XRInteractionUpdateOrder.UpdatePhase)
    This method is called by the interaction manager to update the interactable. Please see the interaction manager documentation for more details on update order"

    ... but XRInteractionManager has literally zero docs :(. https://docs.unity3d.com/Packages/c...Interaction.Toolkit.XRInteractionManager.html
     
    Last edited: Mar 9, 2020
    createtheimaginable likes this.
  50. a436t4ataf

    a436t4ataf

    Joined:
    May 19, 2013
    Posts:
    1,933
    e.g. digging in the source, there's some really interesting (and super important) stuff about:

    "case XRInteractionUpdateOrder.UpdatePhase.Fixed:"

    ...but none of this is mentioned anywhere in the docs that I can see (?) - the XRInteractionManager page is empty/dead, and the homepage has an "Update Loop" section with only superficial information, which doesn't even mention these different UpdatePhases.
     
    Last edited: Mar 9, 2020