
Official XR Interaction Toolkit Preview Release (0.9)

Discussion in 'XR Interaction Toolkit and Input' started by mfuad, Dec 17, 2019.

Thread Status:
Not open for further replies.
  1. R1PFake

    R1PFake

    Joined:
    Aug 7, 2015
    Posts:
    542
    That sucks; the default for the Oculus Quest (or any 6DoF headset) should be RoomScale. Things like this are the reason I gave up on the "native" Unity XR support and downloaded the Oculus package a while ago.
     
  2. andybak

    andybak

    Joined:
    Jan 14, 2017
    Posts:
    569
    I think the confusion comes from the fact that the toolkit currently uses the legacy VR input package (which does support OpenVR). But it was stated somewhere above that this is only temporary and it will move to the new system, for which no ETA for OpenVR has been announced - or even whether there will definitely be support.
     
  3. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,454
    Yes, that is the standalone OpenVR package that you can still import if you edit the package manifest; see the notes about it here: https://docs.unity3d.com/Packages/com.unity.xr.openvr.standalone@2.0/manual/index.html

    "NOTE: Support for built-in VR integrations will be removed in Unity 2020.1."
     
  4. Aaron-Meyers

    Aaron-Meyers

    Joined:
    Dec 8, 2009
    Posts:
    305
    Yeah, I guess if you're running the 2020.1 alpha you're in trouble. I agree that the lack of messaging on the future of OpenVR is pretty lame, and hopefully they'll get it together by the time 2020.1 hits beta (lol).

    But if you open the sample package with the current release (2019.3), it works fine without any modifications on OpenVR hardware.
     
  5. d4n3x

    d4n3x

    Joined:
    Jan 23, 2019
    Posts:
    24
    Has anybody figured out how to delete placed AR objects yet?
     
    createtheimaginable likes this.
  6. jackpr

    jackpr

    Unity Technologies

    Joined:
    Jan 18, 2018
    Posts:
    50
    Hi there! Grab Interactables have their parent cleared while they are being interacted with so that they behave properly. Otherwise transform hierarchy shenanigans can cause weird issues (for example: if the parent is scaled and you rotate the child, then the child mesh will stretch).

    I'm working on an option right now that will return a grab interactable to its original place in the scene hierarchy once it is released. For your specific use case I recommend looking into interactable events or having a script on the grab interactable that self-destructs if the parent menu is destroyed.
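    If it helps, here's a minimal sketch of that self-destruct idea (DestroyWithMenu and the menuRoot field are hypothetical names; you'd assign the menu yourself before grabbing):

    Code (CSharp):
    using UnityEngine;

    // Attach to the grab interactable. Destroys it once the menu it came from
    // no longer exists (e.g. the menu was closed while the object was held,
    // so the object is no longer parented under it).
    public class DestroyWithMenu : MonoBehaviour
    {
        [Tooltip("The menu this object belongs to.")]
        public GameObject menuRoot;

        void Update()
        {
            // Unity's overloaded == makes a destroyed object compare equal to null.
            if (menuRoot == null)
                Destroy(gameObject);
        }
    }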
     
  7. kavanavak

    kavanavak

    Joined:
    Sep 30, 2015
    Posts:
    54
    @jackpr Since the menu can be opened and closed repeatedly during the experience, I think I'll just continue to keep all the interactable GOs under root, with the menu enabling or disabling them (rather than triggering destruction and instantiation). Looking forward to updates as your team figures them out.
     
  8. jackpr

    jackpr

    Unity Technologies

    Joined:
    Jan 18, 2018
    Posts:
    50
    Hi there, I've been working on some of your observations.

    I just merged in a PR for this. It will show up in a future release.

    XRGrabInteractables have a ThrowVelocityScale property. Does increasing this value get you the distance and feel you're looking for?
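    If you'd rather tweak it from code than the Inspector, a quick sketch (assuming the serialized field is exposed as a public throwVelocityScale property in your package version):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR.Interaction.Toolkit;

    // Sketch: raise the throw velocity scale on every grab interactable in the
    // scene so thrown objects travel further. Tune the multiplier to taste.
    public class ThrowTuning : MonoBehaviour
    {
        [SerializeField] float m_VelocityScale = 2.5f;

        void Start()
        {
            foreach (var grab in FindObjectsOfType<XRGrabInteractable>())
                grab.throwVelocityScale = m_VelocityScale;
        }
    }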
     
  9. jackpr

    jackpr

    Unity Technologies

    Joined:
    Jan 18, 2018
    Posts:
    50
    When the menu is enabled/disabled, you could toggle m_AllowSelect on the XRGrabInteractables. This will force any held object to be dropped by the hand's interactor.
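    A sketch of what that toggle could look like (hedged: in the current preview, m_AllowSelect may be a private serialized field, so you might first need to expose a public accessor like the hypothetical allowSelect used here):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR.Interaction.Toolkit;

    // Sketch: when the menu is hidden, make its items non-selectable so any
    // held item is dropped by the interactor; re-enable them when it is shown.
    public class MenuSelectToggle : MonoBehaviour
    {
        public XRGrabInteractable[] menuItems;

        public void SetMenuVisible(bool visible)
        {
            foreach (var item in menuItems)
                item.allowSelect = visible; // hypothetical public accessor for m_AllowSelect
        }
    }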
     
  10. linojon

    linojon

    Joined:
    Aug 25, 2014
    Posts:
    118
    Is there a way to use XRI with the Oculus cross-platform local avatar hands? Any examples?
     
  11. Brad-Newman

    Brad-Newman

    Joined:
    Feb 7, 2013
    Posts:
    185
    R2-RT and kavanavak like this.
  12. jashan

    jashan

    Joined:
    Mar 9, 2007
    Posts:
    3,307
    I guess the reason people think that OpenVR support is lacking is because neither Valve, nor SteamVR, nor HTC is on the list of "officially supported platforms": https://docs.unity3d.com/2019.3/Documentation/Manual/XR.html

    Can someone from Unity Technologies please let us know why Valve/SteamVR/OpenVR is missing from that list?

    From the outside, that list certainly looks like politics at work, because from a technological / developer perspective it makes no sense at all to ignore Valve when it comes to VR.

    The Steam/SteamVR ecosystem is huge, and does a lot of things really well. It's also the only one that at least tries to work on Linux and Mac. Also, the Valve Index controllers are currently the most advanced VR controllers ("next-gen" compared to Touch).

    SteamVR Input 2.0 not only solves controller input / action binding in a really good way; with its skeletal input system it also has a really good solution for finger tracking (as supported by Valve Index controllers and, to a much lesser degree, Oculus Touch - but the system also handles all other controllers really well, like the 2016 Vive wands). Valve's skeletal input system should even work really well with Oculus Quest hand tracking, once those things are wired up properly.

    Honestly, without fully supporting SteamVR/OpenVR, Unity's XR Interaction Toolkit is not "cross-platform" at all ... and Unity is several years behind, technologically (the first version of the SteamVR Interaction System is from 2015 or 2016 and, from the looks of it, had everything Unity's system has, if not more ... like proper controller abstractions with the correct controller models loaded at runtime; meanwhile Valve is at 2.5 of their system, and it has evolved a lot since then).

    Don't get me wrong: I'd love to use a native Unity system and would actually much prefer one over a third-party system. But not if the native system lacks support for the largest and most technologically advanced platform. This upsets me a little because I feel pushed back in time to 2015, when Unity introduced its "native VR" support (which was also way behind the SteamVR Unity Plugin at the time, in so many ways, and took several years to catch up).
     
  13. Corysia

    Corysia

    Joined:
    Mar 14, 2018
    Posts:
    108
    Tonight was my first night playing with this, so I picked up all the latest stuff - Unity 2019.3.0f5, the latest preview packages, the latest XR Interaction Toolkit off of GitHub. A quick preview in my Rift showed everything working. I then naively switched to an Android build and stuffed it onto my Quest, just to see what would happen. I was pleasantly surprised to find that it worked!

    I'm very curious to know why MagicLeap is supported, but not SteamVR. While I prefer Oculus devices (today), I can't disenfranchise all those HTC users. I suspect it's in the "Coming Soon" category, but I'd like to hear that officially.

    What I couldn't figure out is why I couldn't open the scene. I see a folder icon where I'd expect to see the normal Unity logo for a scene.
     
  14. linojon

    linojon

    Joined:
    Aug 25, 2014
    Posts:
    118
    Do the Oculus local avatar hands integrate with the XRI toolkit? If so, I'd like advice on how to get that working. TIA.
     
  15. linojon

    linojon

    Joined:
    Aug 25, 2014
    Posts:
    118
    I just downloaded Unity 2020.1.0a19 and it doesn't even have the Player Settings > XR Settings panel at all! So unless they're dropping OpenVR support in Unity 2020.1 altogether (haha!), it's being worked on.
     
  16. Corysia

    Corysia

    Joined:
    Mar 14, 2018
    Posts:
    108
    Not with XR alone, no, I don't expect so. You still have to have Oculus Integration. The OC6 video from September shows the ecosystem pretty well -- though the second presenter has WAY too much energy.
     
  17. linojon

    linojon

    Joined:
    Aug 25, 2014
    Posts:
    118
    Yes, I get that. I've created a new project with both installed and am trying to figure out how to use the XRI XR Rig with Oculus hands.
     
  18. kavanavak

    kavanavak

    Joined:
    Sep 30, 2015
    Posts:
    54
    Is this expected behavior? @jackpr @mfuad @Matt_D_work @StayTalm_Unity

    If there isn't a mesh for the XR raycast to hit (somewhere in the scene behind the UI canvas), the Valid Color Line Visual won't trigger, regardless of whether you're hitting the UI element on the canvas. You can still interact with the UI, but the line visual won't change to indicate that you're able to.

    This can be seen in the included demo (UI CANVAS) by disabling the 'LowPolyIsland' gameObject. The 'sphere' ground mesh is what allows the Valid line to show in this case.

    This doesn't feel like expected behavior, since assuming there will always be a mesh behind the UI seems like a big assumption for the feature to work. Shouldn't the UI itself determine whether the interaction is valid?
     
  19. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,454
    If you'd like to fix it locally in your package code, it's in XRRayInteractor.cs: Line 319
    Change
    Code (CSharp):
    if (raycastPointIndex < positionInLine || ((raycastPointIndex == rayIndex) && (raycastHit.distance <= distance)))
    To

    Code (CSharp):
    if (raycastPointIndex < rayIndex || ((raycastPointIndex == rayIndex) && (result.distance <= distance)))
     
    Last edited: Jan 20, 2020
    designleviathan and kavanavak like this.
  20. createtheimaginable

    createtheimaginable

    Joined:
    Jan 30, 2019
    Posts:
    29
    I am trying to place an AR cube programmatically at (0,1,0) with C# using the XR Interaction Toolkit. There is an example AR file, "PlaceOnPlane.cs", included in the example GitHub repository ( https://github.com/Unity-Technologies/XR-Interaction-Toolkit-Examples ), but it does not seem to work? It is not attached to any of the example's GameObjects. Also, the Placement Interactable seems to completely override the Place On Plane script when I attach it to a GameObject?

    [Image: XRInteraction_iPad_Pro.png]

    Code (CSharp):
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;
    using UnityEngine.XR.ARSubsystems;
    //using UnityEngine.InputSystem;

    /// <summary>
    /// Listens for touch events and performs an AR raycast from the screen touch point.
    /// AR raycasts will only hit detected trackables like feature points and planes.
    ///
    /// If a raycast hits a trackable, the <see cref="placedPrefab"/> is instantiated
    /// and moved to the hit position.
    /// </summary>
    [RequireComponent(typeof(ARRaycastManager))]
    public class PlaceOnPlane : MonoBehaviour
    {
        [SerializeField]
        [Tooltip("Instantiates this prefab on a plane at the touch location.")]
        GameObject m_PlacedPrefab;

        /// <summary>
        /// The prefab to instantiate on touch.
        /// </summary>
        public GameObject placedPrefab
        {
            get { return m_PlacedPrefab; }
            set { m_PlacedPrefab = value; }
        }

        /// <summary>
        /// The object instantiated as a result of a successful raycast intersection with a plane.
        /// </summary>
        public GameObject spawnedObject { get; private set; }

        void Awake()
        {
            m_RaycastManager = GetComponent<ARRaycastManager>();
        }

        void Update()
        {
            if (Input.touchCount == 0)
                return;

            var touch = Input.GetTouch(0);

            if (m_RaycastManager.Raycast(touch.position, s_Hits, TrackableType.PlaneWithinPolygon))
            {
                // Raycast hits are sorted by distance, so the first one
                // will be the closest hit.
                var hitPose = s_Hits[0].pose;

                if (spawnedObject == null)
                {
                    spawnedObject = Instantiate(m_PlacedPrefab, hitPose.position, hitPose.rotation);
                }
                else
                {
                    spawnedObject.transform.position = hitPose.position;
                }
            }
        }

        static List<ARRaycastHit> s_Hits = new List<ARRaycastHit>();

        ARRaycastManager m_RaycastManager;
    }
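    For reference, the fixed (0,1,0) placement part of the question doesn't need a raycast at all; a minimal sketch (PlaceAtFixedPoint is my own name for it):

    Code (CSharp):
    using UnityEngine;

    // Sketch: spawn a prefab at a fixed world position on startup,
    // independent of plane detection or raycasting.
    public class PlaceAtFixedPoint : MonoBehaviour
    {
        [SerializeField] GameObject m_CubePrefab;

        void Start()
        {
            Instantiate(m_CubePrefab, new Vector3(0f, 1f, 0f), Quaternion.identity);
        }
    }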
     
    Last edited: Jan 19, 2020
  21. ibyte

    ibyte

    Joined:
    Aug 14, 2009
    Posts:
    1,048
    Under XR Management, the "Must add at least one installed plugin provider to use this platform for XR" warning should also be shown in the console.
     
  22. alsharefeeee

    alsharefeeee

    Joined:
    Jul 6, 2013
    Posts:
    80
    I just tested it out and it was nicely done, but I am wondering about the performance difference, and whether there are any future plans to use DOTS with it.
     
    Alverik likes this.
  23. ibyte

    ibyte

    Joined:
    Aug 14, 2009
    Posts:
    1,048
    Any plans to integrate object movement with the grab mechanic? Currently the ray touches an object and the object snaps to the controller on grip pull or touch. How would I implement object movement at the end of the ray on grip, where the joystick brings the object closer or moves it farther away, and the trigger snaps the object to the controller like the original remote grip selection?
     
    TimeWalk-org likes this.
  24. mfuad

    mfuad

    Unity Technologies

    Joined:
    Jun 12, 2018
    Posts:
    335
    To folks who are curious about OpenVR support:
    We have a blog post scheduled this Thursday (January 23, 2020) that will cover all of our XR platform support updates, which will also include an update on OpenVR.
     
    bruno1308, ibyte and Shizola like this.
  25. createtheimaginable

    createtheimaginable

    Joined:
    Jan 30, 2019
    Posts:
    29
    I think all of my problems stem from ARRaycastManager API calls not working in the Unity Editor.

    Does anyone have a workaround for this?

    I remember there was this solution from a while ago, but it is no longer working.

    https://forum.unity.com/threads/mock-ar-device-for-in-editor-simulation.546703/
     
    Last edited: Jan 22, 2020
  26. createtheimaginable

    createtheimaginable

    Joined:
    Jan 30, 2019
    Posts:
    29
    Hello mfuad,

    In that coming blog post, is there going to be anything new on simulated planes and testing in the Unity Editor? I have been dead in the water because it does not look like ARRaycastManager calls work in the editor.
     
  27. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,454
    @StayTalm_Unity If you create an XR Grab Interactable, then add an interaction event that is missing its connected GameObject, it will throw lots of errors once you try to pick it up or interact with it.

    This can be recreated in the example scenes by adding an interactable event on a prefab in one scene, with something assigned to the event, then trying to use that prefab in another scene where the event will be missing its assignment.
     
  28. scrant

    scrant

    Joined:
    Jun 1, 2017
    Posts:
    73
    Hi, this seems weird to me too, and I'm having to completely rethink the way my scene/app is architected. Is it always going to be this way? Did you get any answers to your question above? Thanks!
     
  29. kavanavak

    kavanavak

    Joined:
    Sep 30, 2015
    Posts:
    54
    Yeah @scrant:
     
  30. kavanavak

    kavanavak

    Joined:
    Sep 30, 2015
    Posts:
    54
    YES!! Thank you again @ROBYER1

    @jackpr @mfuad @Matt_D_work @StayTalm_Unity can someone confirm that this has been noted on your end? I'd hate to have it overwritten when your team pushes updates.
     
  31. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,454
    It's been acknowledged on their end and is coming in the next update.
     
    kavanavak likes this.
  32. Matt_D_work

    Matt_D_work

    Unity Technologies

    Joined:
    Nov 30, 2016
    Posts:
    202
    As a quick update, we've been fixing a bunch of bugs from the initial feedback, and planning how we're going to attack the larger feedback items we've collated from here, the blog post, and other users.

    We'll try to get another release out shortly. Will post when we have a new version ready to go.
     
    kavanavak, Shizola, ROBYER1 and 4 others like this.
  33. linojon

    linojon

    Joined:
    Aug 25, 2014
    Posts:
    118
    Why are OnDeactivate (and Custom Reticle) not visible in the XR Grab Interactable component in the Inspector, whereas both are visible in XR Simple Interactable? Both derive from XRBaseInteractable.

    [Image: upload_2020-1-22_19-18-48.png]

    [Image: upload_2020-1-22_19-20-12.png]
     
  34. bruno1308

    bruno1308

    Joined:
    Aug 25, 2016
    Posts:
    4
    Drag and Angular Drag are zeroed in OnSelectEnter and not restored after OnSelectExit. It's pretty straightforward to fix by editing the XRGrabInteractable class, but it would be good to have it fixed soon.
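    Until then, a workaround sketch that avoids touching the package code: cache the Rigidbody values yourself and hook the restore method to the interactable's On Select Exit event in the Inspector (the script and method names here are my own):

    Code (CSharp):
    using UnityEngine;

    // Sketch: remembers the Rigidbody's drag values at startup so they can be
    // restored after XRGrabInteractable zeroes them during a grab.
    [RequireComponent(typeof(Rigidbody))]
    public class DragRestorer : MonoBehaviour
    {
        Rigidbody m_Body;
        float m_Drag;
        float m_AngularDrag;

        void Awake()
        {
            m_Body = GetComponent<Rigidbody>();
            m_Drag = m_Body.drag;
            m_AngularDrag = m_Body.angularDrag;
        }

        // Hook this up to the interactable's On Select Exit event.
        public void RestoreDrag()
        {
            m_Body.drag = m_Drag;
            m_Body.angularDrag = m_AngularDrag;
        }
    }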
     
    ROBYER1 likes this.
  35. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,454
    @StayTalm_Unity In the WorldInteractionDemo scene, if you set the teleporter interaction usages from 'Trigger' to something like the primary or secondary 2D axis, there is no way to get it to teleport using the axis values.

    I dove in and debugged, but the interaction handler event is never fired, despite the axis value being higher than the Axis To Press Threshold on the XR Controller script here:

    **Edit: I have found that, for some reason, HandleInteractionAction is only called for the Trigger and Grip buttons; for anything else, like PrimaryAxis2D or SecondaryAxis2D, it doesn't get called.

    Code (CSharp):
    void HandleInteractionAction(XRNode node, string usage, ref InteractionState interactionState)
    {
        float value = 0.0f;
        if (inputDevice.isValid && inputDevice.TryGetFeatureValue(new InputFeatureUsage<float>(usage), out value) &&
            value >= m_AxisToPressThreshold)
        {
            if (!interactionState.active)
            {
                interactionState.activatedThisFrame = true;
                interactionState.active = true;
            }
        }
        // ... (snippet truncated)
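    As a stopgap, you can poll the 2D axis yourself with the XR input APIs and fire your own events when it crosses a threshold (a sketch; the component name and UnityEvent wiring are mine):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Events;
    using UnityEngine.XR;

    // Sketch: polls the controller's primary 2D axis and raises events when its
    // magnitude crosses a threshold, standing in for the unhandled PrimaryAxis2D usage.
    public class Axis2DActivator : MonoBehaviour
    {
        public XRNode node = XRNode.RightHand;
        public float threshold = 0.5f;
        public UnityEvent onAxisPressed;
        public UnityEvent onAxisReleased;

        bool m_Active;

        void Update()
        {
            var device = InputDevices.GetDeviceAtXRNode(node);
            if (device.isValid &&
                device.TryGetFeatureValue(CommonUsages.primary2DAxis, out Vector2 axis))
            {
                bool pressed = axis.magnitude >= threshold;
                if (pressed && !m_Active) onAxisPressed.Invoke();
                if (!pressed && m_Active) onAxisReleased.Invoke();
                m_Active = pressed;
            }
        }
    }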
     
    Last edited: Jan 23, 2020
    kavanavak likes this.
  36. mfuad

    mfuad

    Unity Technologies

    Joined:
    Jun 12, 2018
    Posts:
    335
    Hey everyone, we had to push our platform support blog post to tomorrow (January 24, 2020). It's coming soon, I promise.
     
  37. Matt_D_work

    Matt_D_work

    Unity Technologies

    Joined:
    Nov 30, 2016
    Posts:
    202
    No idea, I'll have a quick look.
     
  38. fherbst

    fherbst

    Joined:
    Jun 24, 2012
    Posts:
    802
    d4n3x and ROBYER1 like this.
  39. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,454
    [Image: upload_2020-1-24_14-29-4.png]

    XR Ray Interactor never shows its states with respect to UI, despite working just fine at clicking the UI itself.

    The ray interactor says it is always 'Hover Active', yet none of the 'On Hover Enter' or 'On Hover Exit' interactor events we have set up are ever triggered.

    Are we wrong in thinking that UI should be considered an Interactable? It doesn't behave as one or show up in the XR Interaction Debugger as an Interactable.

    The XR Interactor Line Visual knows that it is hovering over UI, which is annoying, as we want to use the hover state of the UI to trigger events.
     
    Last edited: Jan 24, 2020
  40. vice39

    vice39

    Joined:
    Nov 11, 2016
    Posts:
    108
    "OpenVR is no longer a Unity supported platform in 2019.3 and beyond. Valve is currently developing their OpenVR XR Plugin, and they will share more information on where to access it once it is available."


    oh crap.
     
    ROBYER1 likes this.
  41. Matt_D_work

    Matt_D_work

    Unity Technologies

    Joined:
    Nov 30, 2016
    Posts:
    202
    Interaction with UI is an entirely different system from the interaction system; UI won't generate interaction system events, unfortunately. What we do is combine the raycasts and switch between the two systems under the hood as needed.
     
  42. kavanavak

    kavanavak

    Joined:
    Sep 30, 2015
    Posts:
    54
  43. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,454
    I wish the documentation stated that clearly. I managed to work around it by using a script with OnEnable/OnDisable attached to a custom cursor/reticle spawned by the interactor on the UI.

    I mostly found it misleading because the raycaster changes colour over UI - so something is exposed there, even if it's as simple as an on-hover/off-hover state that changes the colour of the line. We wanted to use that to trigger events; could it be exposed?
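    For anyone needing the same workaround, a sketch of that reticle script (the names are mine; attach it to the custom reticle prefab assigned on the interactor):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Events;

    // Sketch: the interactor only activates the custom reticle while it hovers
    // UI, so the enable/disable callbacks double as hover enter/exit events.
    public class UIReticleHoverEvents : MonoBehaviour
    {
        public UnityEvent onUIHoverEnter;
        public UnityEvent onUIHoverExit;

        void OnEnable() { onUIHoverEnter.Invoke(); }
        void OnDisable() { onUIHoverExit.Invoke(); }
    }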
     
    wavyeye and nigel-moore like this.
  44. Matt_D_work

    Matt_D_work

    Unity Technologies

    Joined:
    Nov 30, 2016
    Posts:
    202
    Will look into it
     
    wavyeye, kavanavak and ROBYER1 like this.
  45. linojon

    linojon

    Joined:
    Aug 25, 2014
    Posts:
    118
    When you say "OpenVR is no longer a Unity supported platform in 2019.3 and beyond. Valve is currently developing their OpenVR XR Plugin", does that mean that, when available, their plugin will work with the XR Interaction Toolkit like other Unity-supported plugins?
     
    jashan likes this.
  46. Matt_D_work

    Matt_D_work

    Unity Technologies

    Joined:
    Nov 30, 2016
    Posts:
    202
    As per the reply on the blog post: "That is the plan. It should work the same way as all other XR input plugins."
     
  47. fherbst

    fherbst

    Joined:
    Jun 24, 2012
    Posts:
    802
    @Matt_D_work it would be great to get a reply to my post above asking about Input System support - thanks!
     
    ROBYER1 likes this.
  48. fherbst

    fherbst

    Joined:
    Jun 24, 2012
    Posts:
    802
    Unrelated to my previous questions - I just tried to show someone how the new XR Management stuff is supposed to work. It turns out the samples referenced from the blog post, found here on GitHub, don't work. Looking into the issues, it appears someone broke them with a PR two days ago and QA approved it, probably by testing only against a yet-unreleased version of the XR Interaction Toolkit (downloading an older commit of the samples works).
    (Please, in the future, use a "development" branch for stuff like that.)
    ...
    Good timing with the blog post about XR Management, which drives more people into testing this!
     
  49. scrant

    scrant

    Joined:
    Jun 1, 2017
    Posts:
    73
    How does one programmatically select an AR Interactable in code? For the life of me, I can't figure this out. I don't see any options in interactables, interactors, or the interaction manager. I need to have the toolkit select something at various points in AR.
     
    createtheimaginable likes this.
  50. Matt_D_

    Matt_D_

    Joined:
    Jan 10, 2017
    Posts:
    6
    I noticed that on Friday and was hoping to have another release out, but alas, there's one last thing I need to fix first :) Will hopefully be Monday!
     