
Unity XR Interaction Toolkit Preview Release

Discussion in 'AR/VR (XR) Discussion' started by mfuad, Dec 17, 2019.

  1. mfuad

    mfuad

    Unity Technologies

    Joined:
    Jun 12, 2018
    Posts:
    18
    We are excited to share the initial preview release (0.9 preview) of the XR Interaction Toolkit, available as a package for Unity 2019.3. The XR Interaction Toolkit enables you to add interactivity to your AR & VR experiences, across our supported platforms, without having to code the interactions from scratch. For more information, please refer to our blog post.

    Let us know what you think
    The main goal of this preview release is to gather feedback on how the XR Interaction Toolkit is working for you so we can fine-tune the experience to fit your needs. Please provide input via this quick survey or feel free to ask questions in this forum thread.

    If you encounter a bug, please use the Unity Bug Reporter in the Unity Editor, accessible via “Help” > “Report a Bug”. Include “XR Interaction Toolkit” in the title to help our staff triage things appropriately! For more details on how to report a bug, please visit this page.
     
    Last edited: Dec 17, 2019
  2. OwlchemyDawson

    OwlchemyDawson

    Joined:
    Aug 28, 2018
    Posts:
    8
    Unfortunately, it seems that installing via the Package Manager is not working, as the required XR Legacy Input Helpers version (1.3.9) has not yet been released. Also, the Package Manager version is 0.9.1 rather than 0.9.0.
     
    ROBYER1 likes this.
  3. OwlchemyDawson

    OwlchemyDawson

    Joined:
    Aug 28, 2018
    Posts:
    8
    Downloading the example project rather than starting with a blank project and trying to install the package worked around the issue for me.
     
    Matt_D_work likes this.
  4. Matt_D_work

    Matt_D_work

    Unity Technologies

    Joined:
    Nov 30, 2016
    Posts:
    91
    0.9.2 of com.unity.xr.interaction.toolkit is now on the Package Manager :) The example repo has also been updated.
     
    Last edited: Dec 17, 2019
    davtam and kavanavak like this.
  5. kavanavak

    kavanavak

    Joined:
    Sep 30, 2015
    Posts:
    45
    Hi, great stuff...

    Docs Feedback:

    1. On the Toolkit docs page (https://docs.unity3d.com/Packages/com.unity.xr.interaction.toolkit@0.9/manual/index.html)
    a. Near the bottom of the page there is a reference to a Known Limitations git repo for issues (https://github.com/Unity-Technologies/com.unity.xr.interaction.toolkit/issues); this repo is either private or doesn't exist.

    2. On the Locomotion docs page (https://docs.unity3d.com/Packages/com.unity.xr.interaction.toolkit@0.9/manual/locomotion.html)
    a. You reference a Primary and Secondary device for the SnapTurnProvider, from what I see on the script this has been replaced by a "Controllers" array. If so, this needs to be updated in the docs.
    b. The Locomotion Systems section states that the example value is set to 600 seconds, while the attached image shows a Timeout of 10. I believe this is an error?
    c. The Teleportation section has a link that doesn't seem to be formatted correctly. "The XR Interaction system also provides various line rendering options. For more information, see the main documentation for the Interaction Package."
     
    Last edited: Dec 17, 2019
    Matt_D_work likes this.
  6. John_Sietsma

    John_Sietsma

    Joined:
    Nov 16, 2010
    Posts:
    30
    Exciting stuff!

    * EDIT * Just noticed the request for bug reporting by the OP. I'll submit through there.

    I just tried it on the Quest and found:
    - The pointer appeared briefly and then disappeared.
    - The controllers' positions only updated every few seconds.

    Unity 2019.3.0f1

    Will there be a mock rig for testing in the editor?
     
  7. Jayjays

    Jayjays

    Joined:
    Nov 4, 2015
    Posts:
    9
    I have the WorldInteractionDemo installed in Unity 2019.3.0f1, but when I enter play mode on the Rift there's just the loading screen and it doesn't move on.
     
    Last edited: Dec 18, 2019
  8. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    109
    @Jayjays
    Some triage:
    1) Do you have the 'Oculus XR Plugin' package installed? Go to Window->Package Manager, and check the left-hand sidebar. This is our latest Oculus plugin.
    2) Once you've installed that package, you'll also want to configure it to load up when entering playmode. To do that, go to Edit->PlayerSettings->XR Management, and in your current platform add the Oculus Loader. This is what loads up the various systems when entering play mode. You shouldn't have to mess with settings otherwise.

    If both those are good, let me know if you get any logs or warnings or errors when entering playmode.
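
    If you want a quick sanity check from a script, something like this will log whether a loader actually came up when entering play mode (a rough sketch using the built-in XRSettings API):

```csharp
using UnityEngine;
using UnityEngine.XR;

// Rough sketch: attach to any GameObject and check the Console on entering
// play mode. If no loader initialized, isDeviceActive will be false and the
// loaded device name will be empty.
public class XRStatusLogger : MonoBehaviour
{
    void Start()
    {
        Debug.Log($"XR device active: {XRSettings.isDeviceActive}, " +
                  $"loaded device: {XRSettings.loadedDeviceName}");
    }
}
```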
     
    Jayjays likes this.
  9. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    109
    @kavanavak
    1) Will check with legal and the boss man; that *is* our source repo. We'll either update the docs to not point there or make it public, depending on what's 'safe'.
    2:a) We changed that mechanic late in the game, so the docs point to the old mechanics. It used to force left & right hands only, and imo wasn't extensible. As we move toward 1.0 I'd like to make XRController the single point of truth for all XR Interaction input, so it now pulls its data off the assigned XRController.
    2:b&c) Will fix both on our next pass of the docs, thanks for the catch!

    Somebody also mentioned elsewhere that there are obsolete warnings when using Unity 2019.3.0f1. Those will not cause issues right now, and the next quick release will remove the warnings and upgrade any existing XRRig assets (prefabs & scenes) to use the new, non-obsolete types.
     
    kavanavak likes this.
  10. Jayjays

    Jayjays

    Joined:
    Nov 4, 2015
    Posts:
    9
    Thanks for this. I have checked those two steps. The errors are attached:
     

    Attached Files:

  11. MaskedMouse

    MaskedMouse

    Joined:
    Jul 8, 2014
    Posts:
    424
    Since I see the legacy Input Helpers I suppose this package will only support the legacy input system.
    Will there be an update to use the new input system?
    It looks like an awesome package but without compatibility with the new input system I can't use it.
     
  12. academyofgames

    academyofgames

    Joined:
    Nov 10, 2018
    Posts:
    3
    Excited to see all this new stuff for XR :)
     
    ibyte likes this.
  13. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    109
    @MaskedMouse
    It supports our intermediary solution 'InputDevice'. The LegacyInputHelpers dependency is to get access to the TrackedPoseDriver, which is also being upgraded.

    Support for the new Input System is on the short-term roadmap. This will bring us a greatly expanded layer of abstraction in the XRController and AR gesture systems, which I'm looking forward to.
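
    In the meantime, reading the intermediary InputDevice layer directly looks roughly like this (a minimal sketch; the 0.25 threshold is just an example value):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

// Minimal sketch: poll the intermediary InputDevice API each frame,
// independent of both the legacy input manager and the new Input System.
public class RightTriggerPoll : MonoBehaviour
{
    void Update()
    {
        var devices = new List<InputDevice>();
        InputDevices.GetDevicesAtXRNode(XRNode.RightHand, devices);
        foreach (var device in devices)
        {
            // CommonUsages.trigger is the analog trigger axis (0..1).
            if (device.TryGetFeatureValue(CommonUsages.trigger, out float trigger)
                && trigger > 0.25f)
            {
                Debug.Log($"Trigger on {device.name}: {trigger}");
            }
        }
    }
}
```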
     
    MaskedMouse likes this.
  14. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    109
    @Jayjays
    Hmmmmmmm,
    That definitely looks like our supplied example project, and those errors are what you would see if the Interaction Toolkit Assembly wasn't properly referenced in your project assemblies.
    I don't want to send you on a wild goose chase, but can you verify that 'XR Interaction Toolkit' is an installed package (in case something in our example got misconfigured), and if it is, maybe run `Assets->Reimport All`.

    I've seen this in development builds, where the package manager gets a little confused and 'drops' a reference, but I would have hoped that got fixed before a final release.
     
  15. HeadClot88

    HeadClot88

    Joined:
    Jul 3, 2012
    Posts:
    701
    Quick question - Are there plans for smooth locomotion and snap turning in the near future?
     
    ROBYER1 likes this.
  16. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    109
    @HeadClot88

    Do you mean having a slower warp time? e.g. when SnapTurning, instead of doing it in one frame, doing it over 0.5 seconds, maybe with a small vignette effect?

    We are gathering improvements from internal teams and partners atm, based on the types of interactions and systems they need, and on general feedback as they get comfortable with the system. You can always extend and replace: most of our components are open source and built on base classes or interfaces. But we will be looking to bake in common or expected functionality to make the out-of-the-box components more robust.

    I don't think that one is on the shortlist, but it definitely feels like something that would be nice to be built in. Our Locomotion System already has a concept of exclusivity locks (can't snap turn while locomoting), so we should be able to extend that out. I'll throw it on the pile :).
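
    For the curious, extending the Locomotion System with your own provider is roughly this shape (a hedged sketch against the 0.9 preview API; names like LocomotionProvider, BeginLocomotion, and EndLocomotion may change before 1.0):

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Hedged sketch: a custom provider that claims the exclusivity lock before
// moving the rig, so it can't run while e.g. a snap turn is in progress.
public class SimpleDashProvider : LocomotionProvider
{
    public float dashSpeed = 5f;

    public void DashToward(Vector3 direction)
    {
        // BeginLocomotion returns false if another provider holds the lock.
        if (!BeginLocomotion())
            return;

        // Move the rig; 'system' is the LocomotionSystem this provider
        // registered with, and xrRig is its tracked rig root.
        system.xrRig.transform.position +=
            direction.normalized * dashSpeed * Time.deltaTime;

        EndLocomotion();
    }
}
```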
     
    davtam and HeadClot88 like this.
  17. HeadClot88

    HeadClot88

    Joined:
    Jul 3, 2012
    Posts:
    701
    Yep that is what I mean when it comes to snap turning. :)

    Is smooth locomotion planned?
     
    ROBYER1 likes this.
  18. Jayjays

    Jayjays

    Joined:
    Nov 4, 2015
    Posts:
    9
    OK, so I restarted the PC, loaded up Unity and the WorldInteractionDemo, and it works fine, so let's not ask any questions about why it wasn't working. Thanks for your suggestions! :)
     
  19. WavyRancheros

    WavyRancheros

    Joined:
    Feb 5, 2013
    Posts:
    84
    Can you open source this toolkit, just like Unity UI? Then making extensions and adaptations would be a lot easier.
     
    MattMaker likes this.
  20. NSWell

    NSWell

    Joined:
    Jul 30, 2017
    Posts:
    47
    There are some issues with ARFoundation.
    1. "AR Translation Interactable":
    Select the chair -> translate it -> the chair floats in the air.
    Video:


    2. "Twist" sometimes needs two fingers, sometimes only one.
    No video for this one!
    ---
    Sorry, my English is not good~
     
    createtheimaginable likes this.
  21. andybak

    andybak

    Joined:
    Jan 14, 2017
    Posts:
    267
    My hot take on the VR side of things. Just playing with the demo scene.

    1. The teleportation demo is very strange on a Vive. Touchpad to enter teleport mode and trigger to actually teleport? Nobody does that!
    2. We need a non-instantaneous teleport mode. Dash or similar.
    3. Held objects seem to jitter a lot
    4. Throwing feels a little off. Things don't go as far as they should.
    5. Passing an object from hand to hand doesn't work.
    6. Hands and controller models would be a nice thing to include
    7. Various missing features: grab and scale objects, scale world, climbing, NewtonVR-esque levers and buttons. I'm sure the community will be happy to add some of these. Do you accept pull requests, or would you prefer everything to be in separate repos? (Actually, that's a viable question for new features, but for fixes and core improvements it has to be PRs. Will you accept those?)

    I'll probably post more observations as they come to me. Very nice start.

    EDIT:

    8. Navmesh based teleporting is easier to work with than custom teleport location objects. Is it supported?

    EDIT 2

    Point 3 - only seems to apply to "Kinematic" interactables.
     
    Last edited: Dec 20, 2019
  22. dwatt-hollowworldgames

    dwatt-hollowworldgames

    Joined:
    Apr 26, 2019
    Posts:
    31
    Does this include an option to have a curved canvas?
     
    VeeRedd and kavanavak like this.
  23. HeadClot88

    HeadClot88

    Joined:
    Jul 3, 2012
    Posts:
    701
  24. kavanavak

    kavanavak

    Joined:
    Sep 30, 2015
    Posts:
    45
    @StayTalm_Unity I might be mistaken, but it looks like this doesn't play nicely with the new Input System's Player Input UI module interface. Is there a current process I'm missing to use the new device-agnostic input system to fall back to gamepads and other input in conjunction with this XR interaction system?
     
  25. dwatt-hollowworldgames

    dwatt-hollowworldgames

    Joined:
    Apr 26, 2019
    Posts:
    31
    To me it's critical. Curved canvases are an essential part of VR development; they're the only way to build a sizeable UI element, and I don't know of any reliable tool that does it. I tried CurvedUI, but it doesn't directly support XR input and falls apart unless the canvas is at least 10m or so away. Looks like I will just have to make one myself that does the job my way.
     
  26. kavanavak

    kavanavak

    Joined:
    Sep 30, 2015
    Posts:
    45
    If you get one to work nicely please share with the community ;)
     
  27. createtheimaginable

    createtheimaginable

    Joined:
    Jan 30, 2019
    Posts:
    18
    @NSWell How did you do that with ARFoundation / XRInteraction and the chair? Are there any Unity videos or tutorials from Unity on how to use ARFoundation and the XRInteraction Toolkit together?

    Could you post the project to github?
     
  28. NSWell

    NSWell

    Joined:
    Jul 30, 2017
    Posts:
    47
    https://github.com/Unity-Technologies/XR-Interaction-Toolkit-Examples
    Here is the demo project. You can download it.
     
  29. createtheimaginable

    createtheimaginable

    Joined:
    Jan 30, 2019
    Posts:
    18
    Thanks NSWell!

    This is a new question for the Unity engineers. Since it is the holidays, I haven't been able to load that example on my iPhone 11 Pro yet! :)

    Does XRInteraction have built-in support for simulated AR planes that we can use in the Unity Editor without loading the AR project onto our device like in the example below or Project MARS?

    https://forum.unity.com/threads/ar-...a-fake-ar-plane-for-testing-in-editor.542061/

    https://blogs.unity3d.com/2019/10/02/labs-spotlight-project-mars/
     
  30. NSWell

    NSWell

    Joined:
    Jul 30, 2017
    Posts:
    47
    Yep, XRInteraction has built-in support for simulated AR planes, but the simulation mode of this demo does not work for me. I don't know why... there's too little documentation.

    I am building the .ipa for my iPhone 11; that works for me.
     
    createtheimaginable likes this.
  31. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    557
    I have many questions, like:

    - Why is the snap turning delayed?
    - Why do we need box colliders behind UI for raycasting interactors to detect the hit?
    - Why is there no smooth locomotion?
    - Why does the teleportation locomotion not allow for world alignment as an option? (If you want to teleport upright on top of a rotated gameobject, this is not supported by default.)
     
  32. Matt_D_

    Matt_D_

    Joined:
    Jan 10, 2017
    Posts:
    5
    it shouldn't be, i'll look into it.

    we did when we had the UI pointer and Ray Interactor separate. we may not need it anymore.

    what do you define as "smooth" locomotion here? do you mean fade in/out, or joystick motion?

    we kept it simple for the initial release. for scenarios where you are world-aligned on top of a rotated object: if you're in room scale, you will clip through the object when translating. so rather than deal with attempting to move the camera "up" the rotated object at this point, we kept it simple. it should be relatively easy to extend as you need it.

    we actually had a _bunch_ of different teleport options, but it was almost impossible to explain to people what they did.
     
    ROBYER1 likes this.
  33. Matt_D_

    Matt_D_

    Joined:
    Jan 10, 2017
    Posts:
    5
    not yet. it does use the same data stream that goes _to_ the new Input System, though. so while we can't use Actions yet, it is the same data, and the usages etc. will get you some of the way for now.
     
  34. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    557
    I will make a repro bug report for the delayed snap turning

    For the UI pointer needing a collider on the canvas: it can still interact without a collider, but the ray interactor wouldn't highlight the line renderer when it was over a button until I added a box collider over it, even though the raycasting was set to use the UI layer.

    For smooth locomotion I meant smooth movement with the joystick. I am porting bits of XR Interaction over to my own rig, but the lack of smooth movement made the demo a bit hard to traverse with teleport only, compared to others like VRTK, Oculus Integration and SteamVR :)
     
    kavanavak likes this.
  35. andybak

    andybak

    Joined:
    Jan 14, 2017
    Posts:
    267
    Slightly surprised you need to ask this. "Smooth locomotion" is fairly widely used to mean joystick motion ("teleport" vs "smooth" is an ongoing debate in the VR community).

    EDIT - this is a fairly comprehensive guide that sticks closely to the terminology that I've found to be most commonly used: https://www.tomshardware.com/uk/picturestory/807-virtual-reality-games-locomotion-methods.html
     
    Last edited: Dec 24, 2019
    ROBYER1 likes this.
  36. anagamedev

    anagamedev

    Joined:
    Oct 21, 2014
    Posts:
    5
    Thanks, guys, for this pack. I'm super excited to try it out but hit this issue:
    I downloaded the XR-Interaction-Toolkit-Examples from GitHub, but when I play the "World Interaction Demo" scene, the teleport function doesn't work, so I cannot access the other interactions, just the ones I'm facing when the demo starts.
    PS: I'm using Unity 2019.3.0f3 and an Oculus Rift CV1
     
  37. anagamedev

    anagamedev

    Joined:
    Oct 21, 2014
    Posts:
    5
    I just tested, and the teleport works just fine in the Locomotion scene. It must be an issue with the other ones.
     
  38. kavanavak

    kavanavak

    Joined:
    Sep 30, 2015
    Posts:
    45
    **UPDATE** I was not seeing the XR DIRECT INTERACTOR, thank you to those who pointed it out!

    It appears that the XR (VR) interaction system is based solely on a raycasting 'force-grab' mechanic, rather than a controller-based, natural-motion hand-grabbing of objects. Is this true, or am I missing something? If it is true, it seems short-sighted and is not the most intuitive way to handle object manipulation in VR.
     
    Last edited: Dec 28, 2019
  39. andybak

    andybak

    Joined:
    Jan 14, 2017
    Posts:
    267
    Could you explain the distinction for me? I'd like to understand properly.
     
  40. alexchesser

    alexchesser

    Joined:
    Sep 15, 2017
    Posts:
    66
    I built this for the Quest and there was some strange behavior with the pointers and the teleport beams.

    The teleportation curve beam and the pointer beam were both active at the same time; I think that should function as an either/or.

    Additionally, there are a few different scenes in the package; it'd be nice if there was a "master" scene and some sort of UI interaction to load from one to another, so the overhead of testing all the scenes is a little lower.

    I love Andybak's feedback above, especially #1 & #2.

    I'll also echo the sentiment that this feels like a really great start.

    My experience so far with the new VR framework stuff (2019.3 / 2020 beta) is that the reorg makes sense - but it would be nice if the deprecation messages were EVEN FRIENDLIER ... link to some document that will tell me where certain features have gone.

    For example: where is the Mock HMD now? I was happily using it in 2019.1 as a built-in on the player menu - now it is gone and I can't find it. (Or technically not gone, but not compatible with other SDK implementations of VR like the Oculus SDK.)
     
    MattMaker and ROBYER1 like this.
  41. createtheimaginable

    createtheimaginable

    Joined:
    Jan 30, 2019
    Posts:
    18
    So I got it working in the Unity Editor with ARFoundation 3.0.1 on a simulated iPad! You click the mouse to add the cube to the simulated or debug plane. :)

    In the example project that uses the PlaceOnPlane script, that script will always return and never run the Update() code that follows, because Input.touchCount does not seem to register mouse clicks in the Unity Editor when using the "Game" pane.

    But there is a workaround! If you use the new Device Simulator (preview 2.0.0), Input.touchCount works. There is a catch, though: since the code thinks it is running on a real device, it disables the debug plane. So you have to delete or uncheck the "Disable Debug Plane on Device" component on the PlacementPlane in the supplied sample project.

    I may have missed it, but the documentation should explicitly state that we should use the Device Simulator when running in the Unity Editor. Also, it would be nice if, when the Input.Touch code runs in the Editor and detects a click in the "Game" pane, it logged a message telling us to use the new Device Simulator. How do I file a bug report for this?

    Code (CSharp):

    void Update()
    {
        if (Input.touchCount == 0)  // Broken in Unity Editor but works in the new Device Simulator
            return;

        Debug.Log("Never gets HERE HERE HERE!!!!");

        var touch = Input.GetTouch(0);

        if (m_RaycastManager.Raycast(touch.position, s_Hits, TrackableType.PlaneWithinPolygon))
        {
            // Raycast hits are sorted by distance, so the first one
            // will be the closest hit.
            var hitPose = s_Hits[0].pose;

            if (spawnedObject == null)
            {
                spawnedObject = Instantiate(m_PlacedPrefab, hitPose.position, hitPose.rotation);
            }
            else
            {
                spawnedObject.transform.position = hitPose.position;
            }
        }
    }
    XRInteraction_iPad_Pro.png
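
    For anyone else hitting this, an Editor-only fallback could look something like the sketch below (untested; PointerHelper is a hypothetical helper, and the idea is just to swap mouse clicks in for touches when running in the Game view):

```csharp
using UnityEngine;

// Untested sketch: a pointer helper that accepts mouse clicks in the Editor,
// since Input.touchCount stays 0 for mouse input in the Game view.
public static class PointerHelper
{
    public static bool TryGetPointerPosition(out Vector2 position)
    {
        position = default;
#if UNITY_EDITOR
        if (Input.GetMouseButton(0))
        {
            position = Input.mousePosition;
            return true;
        }
#endif
        if (Input.touchCount > 0)
        {
            position = Input.GetTouch(0).position;
            return true;
        }
        return false;
    }
}
```

    Update() would then call PointerHelper.TryGetPointerPosition(out var pos) instead of checking Input.touchCount directly.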
     
    Last edited: Dec 26, 2019
  42. RyanYN

    RyanYN

    Joined:
    Aug 13, 2013
    Posts:
    18
    Linear color space support?
     
  43. vice39

    vice39

    Joined:
    Nov 11, 2016
    Posts:
    76
    Are there plans to add a SteamVR XR plugin? Right now there are MagicLeap, Oculus and Windows plugins. Why is SteamVR not on the list?
     
    jashan, Davedub, addent and 2 others like this.
  44. andybak

    andybak

    Joined:
    Jan 14, 2017
    Posts:
    267
    There are examples of this in one of the demo scenes. From what I remember, it's got several types of direct manipulation (depending on how the object is meant to be connected to the hand, i.e. via parenting, physics, etc.).
     
  45. andybak

    andybak

    Joined:
    Jan 14, 2017
    Posts:
    267
    That's odd. I can't check right now (for obvious festive reasons), but I distinctly remember testing grabbing objects with my controller. I remember because I was also trying passing them from hand to hand (which didn't work quite right) and throwing them.

    I can't actually remember distance grab being in the scene I'm thinking of (the one with multiple tables with stuff on them).
     
  46. vice39

    vice39

    Joined:
    Nov 11, 2016
    Posts:
    76
    I am really confused as to why OpenVR is not supported. It's the most used PC VR system; there are 10x more SteamVR headsets on the market than WMR. Is there something I'm missing? Any news on when OpenVR will be supported?
     
    jashan, xsirlith and ROBYER1 like this.
  47. createtheimaginable

    createtheimaginable

    Joined:
    Jan 30, 2019
    Posts:
    18
    Here is a video done by "VR with Andrew" on getting set up with the XRInteraction Toolkit and VR.

     
  48. daveinpublic

    daveinpublic

    Joined:
    May 24, 2013
    Posts:
    33
    I'm not able to use Oculus Integration and XR Management at the same time. If I have Oculus Integration installed, it renders both eyes as if they were one long monitor; it feels like you're cross-eyed. So the solution is to delete Oculus Integration.

    But then I don't have access to Oculus Go input the way I've done it in the past. I wanted to use the latest XR Toolkit etc., but every step has caused me days of problems. Now I feel like I'm almost there, only to find out I have to get Oculus input without Oculus Integration. Can anyone help me find the simplest way to just get the trigger and touchpad input?

    Oculus Go - Current way I get input, with Oculus Integration...

    Code (CSharp):

    // I get the trigger input like this:
    public OVRInput.Controller controller;
    float triggerState = OVRInput.Get(OVRInput.Axis1D.PrimaryIndexTrigger, controller);
    if (triggerState > .25f) { ...

    // I can get the touchpad input like this:
    Vector2 touchpadVR = OVRInput.Get(OVRInput.Axis2D.PrimaryTouchpad);

    // I can see if the large touchpad button is pressed like this:
    if (OVRInput.GetDown(OVRInput.Button.One)) { ...
    Is there a way as simple as this to get this data without Oculus Integration?

    Code (CSharp):

    // I found this here: https://docs.unity3d.com/Packages/com.unity.inputsystem@1.0/manual/QuickStartGuide.html
    var gamepad = Gamepad.current;
    if (gamepad.rightTrigger.wasPressedThisFrame) {...
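
    I also found the XR InputDevice API. Would something like this be the rough device-based equivalent? (Untested sketch; I'm guessing at which CommonUsages map to the Go's controls.)

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

// Untested sketch: polling the XR InputDevice API for a Go-style controller.
public class GoInputSketch : MonoBehaviour
{
    void Update()
    {
        var devices = new List<InputDevice>();
        InputDevices.GetDevicesAtXRNode(XRNode.RightHand, devices);
        foreach (var device in devices)
        {
            // Analog trigger, like OVRInput.Axis1D.PrimaryIndexTrigger.
            if (device.TryGetFeatureValue(CommonUsages.trigger, out float trigger)
                && trigger > 0.25f) { /* trigger held */ }

            // Touchpad position, like OVRInput.Axis2D.PrimaryTouchpad.
            if (device.TryGetFeatureValue(CommonUsages.primary2DAxis, out Vector2 pad)) { /* use pad */ }

            // Touchpad click; possibly primary2DAxisClick rather than
            // primaryButton for OVRInput.Button.One; I'm not sure.
            if (device.TryGetFeatureValue(CommonUsages.primary2DAxisClick, out bool click)
                && click) { /* clicked */ }
        }
    }
}
```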
     
  49. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    557
    There is a nearby-grab implementation; I urge you to look at both of the hands in the examples.
     
    kavanavak likes this.
  50. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    557
    You can reproduce the UI raycasting issue by going to the example scene with the canvas and disabling the box collider on it. It's probably a bit confusing for first-time users, too, as the collider isn't described as necessary in the documentation.

    - On an XR Controller, using Primary Axis 2D as a 'Select Usage' input on the teleporter example doesn't seem to do anything despite changing the Axis to Press threshold

    - If you click in the thumbstick or touchpad with the interaction example scene (to switch between the interactor ray and teleport ray), and hold the trigger button then let go of the thumbstick button and then the trigger after, you won't teleport to that spot until you press the thumbstick button in again!

    Also, the example of swapping between controller states is rather confusing to me - you switch between an interactor and a teleporter controller on each hand on touchpad/analogue stick click, but I can't see where the teleporter is actually hooked up to the teleportation provider and am now scratching my head over how the primary button press translates to telling the teleporter to activate!
     
    Last edited: Dec 27, 2019
    kavanavak likes this.