
Official XR Interaction Toolkit Preview Release (0.9)

Discussion in 'XR Interaction Toolkit and Input' started by mfuad, Dec 17, 2019.

Thread Status:
Not open for further replies.
  1. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,454
Does this affect the common Input Usages XRController bindings already written in the examples? I'm working on migrating a project over to this, but if the input code changes again it could mean an extra load of work!
     
  2. kavanavak

    kavanavak

    Joined:
    Sep 30, 2015
    Posts:
    54
    Yes, you're right, I totally missed it - XR Direct Interactor! thank you :)

    @andybak - you're also right, thanks ;)
     
    alexchesser and ROBYER1 like this.
  3. kavanavak

    kavanavak

    Joined:
    Sep 30, 2015
    Posts:
    54
    BUG? @mfuad @Matt_D_work @StayTalm_Unity
    When you grab objects they're pulled from their position in the Hierarchy and set as root objects. I confirmed that this happens in the demo scenes. Is this supposed to be the default behavior?

It creates issues. Am I missing a way to change this default? I have a menu with grabbable objects: when an object is moved and the menu is closed, the object is no longer parented to the menu and so remains loose in the scene. I could rebuild the project to assume everything interactable has to be a root object, but I would rather not if I don't have to.
     
    Last edited: Dec 28, 2019
    a436t4ataf likes this.
  4. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,454
How do the XR Controller classes behave with 3DoF headset controllers (rotation only)? I was originally using the Tracked Pose Driver with the Arm Model script provided by the XR Legacy Input Helpers package, but that is now quite difficult to implement with the XR Ray Interactor script, and without the XR Controller script I can't seem to set this up easily.
     
  5. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,454
    Did you report the issue where both the curve beam and pointer beam show simultaneously? I am having this issue too on Oculus Quest and Oculus Go!

I have identified that this is caused by a NullReferenceException in the ControllerStates.ClearAll function, run by OnEnable in the ControllerManager script. I can reproduce it by loading another scene or reloading the scene at runtime.

    Reported at Case 1208482

You can fix it in the meantime by commenting out the m_LeftControllerState.ClearAll(); and m_RightControllerState.ClearAll(); lines in the RegisterDevices function:

Code (CSharp):
    void RegisterDevices(InputDevice connectedDevice)
    {
        // Debug.Log("Registering device: " + connectedDevice);
        if (connectedDevice.isValid)
        {
#if UNITY_2019_3_OR_NEWER
            if ((connectedDevice.characteristics & InputDeviceCharacteristics.Left) != 0)
#else
            if (connectedDevice.role == InputDeviceRole.LeftHanded)
#endif
            {
                m_LeftController = connectedDevice;
                // m_LeftControllerState.ClearAll();
                m_LeftControllerState.SetState(ControllerStates.Select);
            }
            // ... (same change for the right controller)
    Last edited: Dec 31, 2019
    alexchesser likes this.
  6. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,454
    Is there a way to set the XRRig tracking space type via script? Or does it follow with the return of XRDevice.GetTrackingSpaceType automatically?

We currently have an issue where XRDevice.GetTrackingSpaceType always returns Stationary for Oculus Link and for builds running on Oculus Quest, regardless of which Guardian tracking mode we have set (Roomscale or Stationary). Reported at Case 1208329

Also, we would be able to detect the device by name and change what we want via script, but XRDevice.Model returns nothing when using the XR Management subsystems package. Reported at Case 1208324

We are working with the Oculus Go and Quest, hence wanting to switch out the rig; the default XRRig in the examples is not suitable for the Go

XR Interaction Toolkit - On Oculus Go, ControllerManager errors due to one controller active - Case 1208341

If you are able to look into any of this it would be really helpful. All three of those cases are stopping us from changing the rig at runtime on the Oculus Quest or Go to support room-scale or stationary, or even just from changing which controllers are active for each platform in the scene.
     
  7. alexchesser

    alexchesser

    Joined:
    Sep 15, 2017
    Posts:
    147
    WOW! Nice work ... thanks.

    I didn't report it because I didn't actually know how (outside of making a forum post).
    Where did you go to open a case on the issue?
     
  8. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,454
    While your project is open in the Unity Editor - Help> Report a Bug

    More info about bug reporting here: https://unity3d.com/unity/qa/bug-reporting
     
  9. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    182
    Hello,
First off, sorry for the delay; I'll run back up this thread and try to answer what I can. I just got back from Christmas vacation.

1) XRRig Tracking Space Type: It works the other way around. XRRig has two properties: 'trackingSpaceType' and 'trackingOriginMode'. We suggest you use the latter once you are on 2019.3; as you have probably noticed, we are migrating quite a few APIs due to XR Management and the new plugin architecture.
XRDevice.GetTrackingSpaceType (or XRInputSubsystem.GetTrackingSpaceType when using XR Management) mirrors the m_TrackingSpaceType (or m_TrackingOriginMode) variable. On Start(), and whenever the property is set, the rig tries to make the underlying SDK mirror its setting. That means if you set the TrackingSpaceType to 'Room' but that is not available on the platform, XRRig would be set to 'Room' while XRDevice.GetTrackingSpaceType() would return 'Stationary'. Is that not what you are seeing?
Without XR Management we don't have a good way to know why a tracking space type could not be set, so it fails quietly. However, in 2019.3 and up, when using XR Management, we will report when the rig is set to a tracking space type that is not supported by the underlying SDK.
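For example, a minimal sketch of driving this from script; this is my own illustration assuming the 0.9 XRRig API described above, so exact names may differ:

```csharp
using UnityEngine;
using UnityEngine.XR;
using UnityEngine.XR.Interaction.Toolkit;

// Illustrative only: asks the rig for floor-relative (room-scale) tracking.
public class RigOriginSetter : MonoBehaviour
{
    void Start()
    {
        var rig = GetComponent<XRRig>();
        // On 2019.3+ with XR Management, prefer trackingOriginMode.
        rig.trackingOriginMode = TrackingOriginModeFlags.Floor;
        // The rig then asks the underlying SDK to mirror this setting; if
        // 'Floor' isn't supported, the device can still report a different mode.
    }
}
```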

2) XRDevice.Model not being valid: This is part of the migration to the new XR Management setup. Starting in 2019.3 you should be getting deprecation warnings advising on the new API that replaces it. Model's replacement is part of InputDevice, specifically the name and manufacturer properties. Since these are per-device, you have a unique name and manufacturer for the HMD as well as for each controller in a standard VR setup, so you need to find the appropriate device. You likely want the HMD, which can be retrieved with InputDevices.GetDevicesWithCharacteristics using the HeadMounted characteristic. InputDevices has a few different APIs you can use to filter for the kinds of devices you want.
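For illustration, a minimal sketch of looking up the HMD this way (HmdInfo is a made-up name):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

// Illustrative only: logs the name/manufacturer of any head-mounted device.
public class HmdInfo : MonoBehaviour
{
    void Start()
    {
        var devices = new List<InputDevice>();
        InputDevices.GetDevicesWithCharacteristics(
            InputDeviceCharacteristics.HeadMounted, devices);
        foreach (var hmd in devices)
            Debug.Log("HMD: " + hmd.name + " (" + hmd.manufacturer + ")");
    }
}
```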

3) I'll investigate these errors after I finish sweeping this thread to see what I missed. I think I may need to add better handling for a 'may be there' controller, and to make sure the overall rig understands whether it's in one-handed or two-handed mode. ControllerManager is part of the sample and easily edited, as is the rig prefab, but in both cases I will look into making it work better on a one-handed device, since cross-platform ease is the whole purpose of this.
     
    P_Jong and MattMaker like this.
  10. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    182
Yeah, that ClearAll() function is written incorrectly and can be called before Initialize(), which results in a null reference.

ClearAll() needs to null-check m_Interactors before looping over it with a constant count. Will fix; sorry that slipped by.
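A hypothetical sketch of that guard (k_InteractorCount and SetActive are illustrative stand-ins, not the sample's actual members):

```csharp
// Hypothetical sketch of the fix described above; the real
// ControllerStates code in the sample may differ.
public void ClearAll()
{
    // OnEnable can run before Initialize(), so the array may not exist yet.
    if (m_Interactors == null)
        return;

    for (int i = 0; i < k_InteractorCount; ++i)
        m_Interactors[i].SetActive(false);
}
```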
     
  11. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    182
    Gotta cede to @Matt_D_work on this one for intended behaviours. He gets back next Monday, so please hold tight :)
     
    kavanavak likes this.
  12. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    182
    ! You are correct.
XRController should also implement the same PoseProvider concept as the TrackedPoseDriver. Right now it gets the raw position/rotation data from the underlying device, but you are correct that this does not allow for arm models, which is a serious setback.

I'll put that in the queue to fix.
Sorry to have you all running into so many early issues. Coming into the new year, I plan to implement a lot of the first round of feedback we've received. Please hold tight, it'll get better :)
     
    kavanavak likes this.
  13. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    182
Nope! Or at least I hope not; it's hard to promise until it's done.
Input Usages are available in both InputDevice and the Input System. I plan to do my best to write an automated updater, and since usages are strings, it should be pretty easy to match old to new once that's finished.
     
  14. alexchesser

    alexchesser

    Joined:
    Sep 15, 2017
    Posts:
    147
So, for my own understanding: should I be looking at this as a replacement for VRTK? Or perhaps a replacement for both VRTK *AND* the Oculus assets pack? I'm partway through the Oculus VR course https://learn.unity.com/course/oculus-vr which leans pretty heavily on VRTK as the bridge.

Questions for anyone who has experience working with both this and VRTK/Oculus: How does this compare to VRTK? Would you bother using this without the Oculus pack? Is it possible to create a VR experience that doesn't pull in anything external beyond this? Is there even any value in doing that?

Especially considering hand tracking on the Quest, which is super cool :) I'm going to make my way through the rest of the course, then take a fresh look at the elements from the perspective of quality, ease of integration, and dependency elimination.

    ... and welcome back from vacation @StayTalm_Unity and the rest of the team!
     
    Antares19 likes this.
  15. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,454
Thank you so much for your detailed and thoughtful responses. I was worried I had overshot on feedback here, but it was difficult to sum it all up in the feedback survey, since I found the issues over the course of a few development sessions.

Alignment with the legacy input system's arm model, plus improved cross-platform support for 3DoF and 6DoF headsets with one or two controllers, will make cross-platform use far easier.
The functionality is almost all there, and you've done a great job unifying the necessary XR functionality here :D

- If the Controller Manager script could include an example of registering/unregistering one or two controllers, switching controller states between, say, a 3DoF headset with one controller and a 6DoF headset with two, that would be really neat. For example, the 3DoF setup could use a ray interactor for grabbables plus a teleporter on touchpad press, while the 6DoF two-controller setup would work better with a direct interactor on one hand and the teleporter on the other. However, I could probably switch this out myself with the Input Device API if that's more advisable.

    Looking forward to the fixes and thanks again, your work is a huge help for my projects!
     
  16. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,454
I would use this as a replacement for VRTK. In my opinion VRTK was a lazy quick-start, and any customisation of the underlying functionality seemed complex due to the way it was written. I've been using just the Unity tracked pose drivers from the XR Legacy Input Helpers package for my project, with things like teleportation/locomotion kept custom to the project. This package also adds UI interaction, which in the past has required hacks/workarounds. Support for VRTK has been on and off; for future-proofing I would use the Unity packages, like the XR Interaction package, as they have the Unity team's support.

My thread here is a good starting point; I should really update it to point people to this package. Lots of good advice there.

    https://forum.unity.com/threads/vrt...-project-starting-development-in-2019.620998/

    Feel free to direct message me if you have any questions about migrating to this package - I don't want to distract from feedback and discussion here.
     
    Antares19, kavanavak and alexchesser like this.
  17. alexchesser

    alexchesser

    Joined:
    Sep 15, 2017
    Posts:
    147
    Thanks for this feedback.

I'd have to agree that some parts of VRTK seem heavier than they need to be. I was doing the "locomotion" unit of the course last night and found that, while I liked and understood the need for interface abstraction, some parts seemed heavier than necessary.

Specifically, it struck me that there was a lot of manual hooking-up of things in the editor, which I thought could lead to situations that are hard to debug.

    I'll definitely reach out if I can't figure out the XRTK after the course, I'm hoping the stuff is intuitive enough to pick up and run with.

It's probably valuable for the Unity team to hear about other frameworks and toolkits when it leads to constructive criticism of the XRTK: what do the other frameworks do well that Unity's native setup doesn't?

Now that they're back from vacation, I'd definitely like to reiterate my desire to have an XR Toolkit scene switcher built into the example scenes, so I could try out a number of the different interactions in a single build. There's so much work in getting something built to the Quest that it'd be nice to have that streamlined.
     
  18. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,454
That would be super easy to make: a simple script with a function that calls LoadScene, and a world-space UI button to call it. Or test them in the editor by changing scenes and hitting Play :) What headset are you using?
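Something like this minimal sketch (DemoSceneSwitcher and LoadSceneByName are made-up names):

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Minimal sketch: hook LoadSceneByName up to a world-space UI Button's OnClick.
public class DemoSceneSwitcher : MonoBehaviour
{
    public void LoadSceneByName(string sceneName)
    {
        // The scene must be added to Build Settings for this to work.
        SceneManager.LoadScene(sceneName);
    }
}
```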
     
  19. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,454
Also, cross-platform Fixed Foveated Rendering, which is available through Vulkan (as yet unreleased for Android VR). Is there any news on that you are able to share, or even just a hint that the XR teams are aware? FFR is almost essential for Android VR and is currently only achievable on Oculus Quest/Go with the Oculus Integration package through OVRPlugin.dll; this is a bit of a pain for us in terms of cross-platform support.

    There was a talk at Oculus Connect about the Vulkan FFR and how it would simplify cross-platform VR development and help with optimization for different platforms :)
     
  20. wwm0nkey

    wwm0nkey

    Joined:
    Jul 7, 2015
    Posts:
    42
One thing I would like is the option to grab from two points rather than just the single attach point; it would help a lot for FPS games.

Also, the option for the object not to just snap to a position in your hand; picking things up can feel weird if the object simply teleports into place.
     
  21. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    182
    @wwm0nkey
    Both good ideas, multiple attach points on an interactable (2 ways to grab one thing) shouldn't be too difficult.
    As for the not-snap, we've been talking about having maybe an attach area, so that any point within that zone can be an 'attach point' which would allow a more organic, 'grab any part of the object' behaviour. Is that what you were thinking?
     
    kavanavak likes this.
  22. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,454
    jashan and linojon like this.
  23. Shizola

    Shizola

    Joined:
    Jun 29, 2014
    Posts:
    475
    My thoughts after playing around with the toolkit a couple of times:

    Bug - For the record, the "all pointers active all the time" bug that Robyer1 reported also happens on PC builds.

Manipulating "Kinematic" or "Velocity driven" objects is pretty unpleasant due to severe juddering. On my Rift S it is very noticeable; on my Index, the juddering is less noticeable as you increase the framerate. At 144 Hz it looks as if a motion blur effect is being applied to the held object. What's weird is that the juddering still seems to apply after the object is thrown: in the scene with apples, if you throw one, you'll notice it's still juddering until it hits the floor, then rolls smoothly as the physics kicks in. I'm not actually sure what the point of these grab types is, as they all seem to have the same behaviour when thrown.

Things I'm surprised by:

    Needing to pull trigger to complete a teleport
    No smooth locomotion
    No swapping objects between hands

    Other things I would like to see:

    Animated hands example
    2 hand grab options
    More teleport options
    More snap placement options.

    It would be great if one of these could be done as an example of how you intend people to extend the toolkit themselves.

Long term, I really hope Unity continues to support this fully. I'd really like the new Input System and UI Elements to be made compatible as soon as possible. Recently someone made MRTK compatible with Oculus Quest hand tracking; it would be great if this toolkit ended up being that versatile. On the whole though, a good start :).
     
    innertalk and ROBYER1 like this.
  24. wwm0nkey

    wwm0nkey

    Joined:
    Jul 7, 2015
    Posts:
    42
Yup, exactly what I was thinking. Having that as an option would be fantastic, because teleportation/snapping is very useful for some objects, but things like boxes would be better with the grab-any-part system. I honestly think with those two options people would get a LOT more use out of the XR kit :)
     
  25. wwm0nkey

    wwm0nkey

    Joined:
    Jul 7, 2015
    Posts:
    42
Re: the judder Shizola mentioned, I also get this, and it can make the experience pretty unpleasant. I also agree with adding teleport options, smooth locomotion/turning, and swapping options if possible; those would be very nice as well.

Also, a reset-world-position option would be really good too.
     
    Last edited: Jan 6, 2020
  26. alexchesser

    alexchesser

    Joined:
    Sep 15, 2017
    Posts:
    147
I'm on a Quest, with a laptop that doesn't support Oculus Link, so the run-it-in-editor option is off the table. The world-space button is what I'm thinking of, or some sort of UI that pops up on a left-hand menu button press. I'm not too worried about adding it myself to the project for local testing and development, but unless someone on @StayTalm_Unity 's side includes it, we'd have to re-add the menu for every new release.
     
  27. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,454
@StayTalm_Unity Regarding the XRRig tracking space type: when using the Oculus Quest with Oculus Link in the editor, XRDevice.GetTrackingSpaceType always returns Stationary, and the same happens in builds running on the Oculus Quest. I have this reported at Case 1208329; is this expected behaviour?
     
  28. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    182
    @ROBYER1 If it's only for the link, then it is not intended behaviour. We will have to check if it's a bug in our SDK or Oculus'.
     
    ROBYER1 likes this.
  29. Matt_D_work

    Matt_D_work

    Unity Technologies

    Joined:
    Nov 30, 2016
    Posts:
    202
    Wow, thank you everyone for the amazing feedback. We're in the process of filtering through everything from here, the blog post comments, and the feedback forms.

    This is all super awesome, especially the bug reports. We'll reply back here with some info once we've got everything filtered through :)
     
  30. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    182
    We know the physics-based movement has jitter. This is due to complications between when physics updates, and when VR updates for rendering in the correctly tracked positions. It can get better, but I'm not sure it can be excellent. Non-physics solutions will always be smoother. That said, it staying active after release sounds like a bug, and I'm curious if what is being seen is the known issue, or something else.
     
  31. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    182
    This should be doable by moving the rig to 0,0,0, and calling XRInputSubsystem.TryRecenter()
    Maybe we can add a helper on XRRig to make it a one call operation.
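A minimal sketch of that two-step recipe, assuming the 2019.3 XRInputSubsystem API (RigRecenter is a made-up name):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

// Illustrative only: attach to the rig root, then call Recenter().
public class RigRecenter : MonoBehaviour
{
    public void Recenter()
    {
        // Step 1: move the rig back to the origin.
        transform.position = Vector3.zero;

        // Step 2: ask every running XR input subsystem to recenter tracking.
        var subsystems = new List<XRInputSubsystem>();
        SubsystemManager.GetInstances(subsystems);
        foreach (var subsystem in subsystems)
            subsystem.TryRecenter();
    }
}
```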
     
    ROBYER1 likes this.
  32. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,454
Is the Physics Tracker from the Unity SuperScience repo any help for this? I'm sure your teams have looked into this internally, but I found it super useful for smooth holding of physics objects in another project.

https://github.com/Unity-Technologies/SuperScience

Also, a world-space UI canvas can still only have one camera assigned to it for raycasting; is there a workaround for this you would suggest (in the case of ever having two or more XR Rigs)?
     
    Last edited: Jan 7, 2020
  33. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    182
    Yes, and we talk to those guys on a semi-regular basis. The current physics implementation in XRI came from them (Labs), and so it's a matter of making the upgrade in a safe, stable way.

As for multiple cameras, the UIInputModule.cs and TrackedDeviceGraphicRaycaster.cs could be modified to make it work, but we opted to avoid that complexity because it's not an issue 'yet': we don't have support in Unity for multiple headsets in the same application instance. And if there are multiple instances of Unity running (one per headset), then in my opinion the UI processing should be done locally, and it is the results of that processing that should be networked. For example, don't network the tracking positions and input states and then check if a button was pressed on the host; check if the button was pressed on the client, send that across the network, and have the host process the results of that button press. So I would suggest each client set their own world camera as the camera on their rig, and not do any direct UI processing for other rigs.

    Does that make sense and work for you? Or are there cases I'm missing here?
     
  34. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,454
Great to hear, and I understand the reason for only one camera being active for UI interaction. I haven't found a use case for having two active; otherwise, as you suggested, I'd do it via networking.

We are currently struggling to work with OpenVR headsets under the new XR Management subsystems, as the OpenVR XR Plugin isn't out yet.

Also, Universal RP through SRP does not render UI or transparent objects on Oculus Go/Quest when we tried out the samples with it, unless you disable MSAA or disable the opaque texture in the URP pipeline settings; it is likely related to this issue: https://github.com/Unity-Technologies/ScriptableRenderPipeline/pull/5060

Hopefully that gets fixed, along with Fixed Foveated Rendering not working with Universal RP either; we assume most of this interaction package and fixes for the above issues will be coming in Unity 2020 releases.
     
  35. kavanavak

    kavanavak

    Joined:
    Sep 30, 2015
    Posts:
    54
    I might be missing something, is there a way to hot-swap controller models during runtime?

The XRController script checks a private bool (m_PerformSetup) to decide whether it should set up the controller model. Short of making this public and tripping it after setting modelPrefab to my next desired prefab, is there a way to hot-swap?

Or should I just have all my controller models loaded at start and turn their respective GameObjects off and on as needed? **1/16/20 EDIT: I ended up doing this GameObject-swap solution, since it works and I didn't get any better suggestions.**
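The GameObject-swap workaround could look something like this minimal sketch (ControllerModelSwapper is a made-up name):

```csharp
using UnityEngine;

// Minimal sketch: keep every controller model instantiated as a child of the
// controller and toggle which one is active, rather than swapping prefabs.
public class ControllerModelSwapper : MonoBehaviour
{
    [SerializeField] GameObject[] m_Models; // all candidate models, pre-loaded

    public void ShowModel(int index)
    {
        // Activate the chosen model and deactivate the rest.
        for (int i = 0; i < m_Models.Length; ++i)
            m_Models[i].SetActive(i == index);
    }
}
```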
     
    Last edited: Jan 16, 2020
  36. Lignaciuk

    Lignaciuk

    Joined:
    Aug 26, 2019
    Posts:
    1
I am trying to get the teleport and ray activated by thumb touch on the Rift, but I can't get touch to work. I updated XR Management, XR Legacy Input Helpers, and the Oculus XR Plugin to the newest versions. Does capacitive touch on Oculus Touch work with XRI?

    Edit:
Of course, I realized 3 minutes after posting this that the activation might be hardcoded somewhere; I have now got it to work.
    @StayTalm_Unity
1) What is "activation" in XRController for? I thought it would invoke the teleport or base state respectively from ControllerManager, but activating the teleport state is hardcoded. Would it be possible to let the user choose which button/touch switches the controller into the teleport state?

2) Could Unity change the HandleInteractionAction method in XRController so that it also accepts boolean values? Thumbstick click (which is probably the most common way to teleport on Oculus right now) is of type bool and doesn't work with this method in XRI 0.9.2.
     
    Last edited: Jan 9, 2020
  37. Murray_Zutari

    Murray_Zutari

    Joined:
    Jun 1, 2017
    Posts:
    45
Hi, I've only been reading up on and playing with it for a day now, but it seems really promising.

I picked up one issue when using the Oculus Quest: if I'm in the scene, leave the Guardian, and come back inside it, it fails to switch back from passthrough; the screen just goes black and the three loading dots appear. The scene I was testing was WorldInteractionDemo.
     
  38. LaCorbiere

    LaCorbiere

    Joined:
    Nov 11, 2018
    Posts:
    29
Hi everyone. I've been trying to get the WorldInteractionDemo to work on the Oculus Quest, but without much joy. I have a big heap of console errors (screen grab attached for a taste). Before I start mucking about with it, does anyone have any idea why I'm getting this? Is it a metadata issue?

    Unity 2019.3.0f3
     

    Attached Files:

    Last edited: Jan 8, 2020
  39. mfuad

    mfuad

    Unity Technologies

    Joined:
    Jun 12, 2018
    Posts:
    335
    Hi @Murray_Aurecon and @Holty71, do you mind following the instructions below on bug reporting? We'll be able to track/manage better that way. Thanks!

    "If you encounter a bug, please use the Unity Bug Reporter in the Unity Editor, accessible via “Help” > “Report a Bug”. Include “XR Interaction Toolkit” in the title to help our staff triage things appropriately! For more details on how to report a bug, please visit this page."
     
  40. LaCorbiere

    LaCorbiere

    Joined:
    Nov 11, 2018
    Posts:
    29
Yes, will do. I'm just not sure if it was a bug or user error!
     
    Last edited: Jan 9, 2020
    mfuad likes this.
  41. d4n3x

    d4n3x

    Joined:
    Jan 23, 2019
    Posts:
    24
    Hi @StayTalm_Unity and the whole community,

I've been using the package for quite a few days now and have run into what I think is a bug when deleting placed objects in AR.
It seems that objects placed with a SelectionInteractable on them are registered with the InteractionManager once placed. When the object is destroyed, the InteractionManager seems to keep a reference to the already-deleted object. I tried this with all the Interactable scripts; only SelectionInteractable seems to cause this error.
So once an object containing that script is deleted, there is no way to place or select objects anymore.

Can somebody reproduce this behaviour, or am I missing something elementary?

    Greets d4n3x
     
  42. mfuad

    mfuad

    Unity Technologies

    Joined:
    Jun 12, 2018
    Posts:
    335
    Hi @d4n3x, do you mind documenting this issue by following the instructions below? We'll be able to properly track and follow up with you better that way. Thanks!

    "If you encounter a bug, please use the Unity Bug Reporter in the Unity Editor, accessible via “Help” > “Report a Bug”. Include “XR Interaction Toolkit” in the title to help our staff triage things appropriately! For more details on how to report a bug, please visit this page."
     
  43. elamd

    elamd

    Joined:
    Jan 23, 2017
    Posts:
    2
When using the supplied AR demo scene on iOS, I'm noticing that if you have horizontal planes at different heights, the supplied cube game object will translate to the highest plane. So if you scan a plane on the floor, then scan one on a table, place the cube on the floor plane, and then hold your finger down on the cube and move it to translate it, the cube will fly upwards to the table-plane height and towards your finger.

Being new to AR Foundation, I'm not sure if this is a bug or the expected behavior.
     
    Last edited: Jan 10, 2020
    createtheimaginable likes this.
  44. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,454
This would be invaluable. I often need to move the rig to 0,0,0 at Start or Awake and then recenter, but the player movement code on our character controller also tries to move the character at the same time or in between (which I have stopped by delaying it). The ideal state would be for the rig to be set to 0,0,0 as an operation on the XRRig before anything else runs, to give people peace of mind that their rig will start at the 0,0,0 of its starting position in the scene.

Something like 'Recenter Rig on Awake': "XRRig will be positioned at its position in the scene on start regardless of the user's position"
     
  45. elamd

    elamd

    Joined:
    Jan 23, 2017
    Posts:
    2
    Hi Guys,

It's not clear to me how to disable AR interactions when you need to. I've tried disabling the XRInteractionManager and the ARPlacementInteractable, and I've looked through the scripts for an appropriate state, but no luck.

    EDIT: I just ended up disabling the ARGestureInteractor.
     
    Last edited: Jan 13, 2020
    createtheimaginable likes this.
  46. bruno1308

    bruno1308

    Joined:
    Aug 25, 2016
    Posts:
    4
Any news on an ETA for OpenVR support? It's something a few people have asked about, but I haven't seen an answer.
    Thanks!
     
    jashan likes this.
  47. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,454
In response to Case 1183256, where I reported that the Oculus Quest defaults to stationary rather than room-scale tracking mode regardless of whether the Guardian is set to 'Stationary' or 'RoomScale', QA have told me that this behaviour is intended.

This defeats the point of having a stationary rig for 3DoF headsets, where the headset is positioned at a set height (due to no positional tracking), versus 6DoF headsets, which should use room-scale as they have full positional tracking.

What do you all think? I'm baffled, as the XRRig has Stationary and RoomScale settings, which seem intended for 6DoF/3DoF, but the Quest always reporting 'stationary' breaks this for us @StayTalm_Unity
     
  48. andybak

    andybak

    Joined:
    Jan 14, 2017
    Posts:
    569
    Any chance of a link? I can't see a way to look up issues via ID.
     
  49. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,454
  50. Aaron-Meyers

    Aaron-Meyers

    Joined:
    Dec 8, 2009
    Posts:
    305
Not sure why people are saying OpenVR isn't supported. I downloaded the sample project from the Unity GitHub and ran it on a Vive. Just take a look at line 15 of the package manifest: https://github.com/Unity-Technologi...1ccf0eca3a6e0682bc0/VR/Packages/manifest.json
     
    bruno1308 likes this.