Discussion in 'AR/VR (XR) Discussion' started by rickmus, Dec 17, 2020.
Hi, I'm also encountering a similar problem. Does anyone have a fix for this issue?
I believe this is an issue with the SteamVR OpenXR runtime. Try temporarily changing the runtime to Oculus (Project Settings -> XR Plug-in Management -> OpenXR -> OpenXR Runtime). You can also try the SteamVR beta version of the runtime to see if the issue is resolved there.
Hi Apoxol, thanks for replying. The Oculus runtime does work fine (the device-based pose), but I would also like to target the HTC Vive, which (I think) should use OpenXR, right?
Ok, the Oculus Desktop beta fixed the problem. But we can't ask our players to switch on the Oculus Desktop beta. Is there a roadmap available for the fix? And why does this problem occur on OpenXR and not the Oculus plugin?
With OpenXR you will not have control over which runtime the player is using. They could be using SteamVR or the Oculus runtime. As long as you target OpenXR you should be able to support both of those devices. Just keep in mind that OpenXR support is still in preview, and the runtimes from Steam and Oculus are still in beta.
This is just a growing pain until the runtimes get out of beta. OpenXR is a preview package, and the runtimes from Oculus and Steam are still going through their beta phases. Once the dust settles this should be a non-issue. The problem occurs on OpenXR and not the Oculus plugin because they are two different things: the Oculus plugin uses the Oculus API, whereas OpenXR/Oculus uses the Oculus OpenXR runtime (which is still in beta and only supported on desktop).
Why does the native Oculus provider get disabled when I enable this package? I wanted to use native Oculus support when possible, and fall back to this plugin when someone uses a Vive or another SteamVR headset. For companies it's terrible to have to install Steam, so only those who have a Vive would use SteamVR. How do I achieve what I want? Do I have to make one build for each platform?
Is it possible to manually add/remove providers at runtime? I mean, I'd like a script that adds providers in an order where OpenXR is the last option and only gets enabled if the others fail.
If you are using OpenXR, your users would not need Steam on their machine unless they are using the Vive. If your user only had an Oculus, they would have the Oculus desktop software on their machine, and it would be the default runtime, so when they launch your app it would work without Steam.
As for why we disable the Oculus plugin when OpenXR is selected I will have to inquire and get back to you.
I don't know if this is the right place to ask, but I'm desperate for a solution and can't find it anywhere. I was using OpenVR in Unity 2019.3 and am now migrating to Unity 2020.2. It seems OpenVR handled "devicePosition" and "deviceRotation" differently from what I'm seeing in XR Plug-in Management: my models have some offset from where they were. What does "devicePosition" pivot around? This isn't clear anywhere, and it seems to differ from device to device. I'll open a separate thread about this issue anyway.
If you are using OpenXR, then devicePosition / deviceRotation are the grip pose, and pointerPosition / pointerRotation are the axis pointing out of the controller. With OpenXR you should also be using action-based controls and binding them to those paths.
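To illustrate the difference, here is a minimal sketch that reads both poses through the device-based UnityEngine.XR API. devicePosition is a standard CommonUsage; the "PointerPosition" usage name is an assumption based on the OpenXR plugin's controller layouts, and the script name is mine:

```csharp
using UnityEngine;
using UnityEngine.XR;

// Sketch: compare the grip pose (devicePosition) with the pointer/aim pose.
// "PointerPosition" is an assumed usage name from the OpenXR controller
// layouts; devicePosition is a standard CommonUsage.
public class PoseProbe : MonoBehaviour
{
    static readonly InputFeatureUsage<Vector3> pointerPosition =
        new InputFeatureUsage<Vector3>("PointerPosition");

    void Update()
    {
        var device = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);
        if (!device.isValid) return;

        if (device.TryGetFeatureValue(CommonUsages.devicePosition, out Vector3 grip))
            Debug.Log($"grip pose: {grip}");
        if (device.TryGetFeatureValue(pointerPosition, out Vector3 aim))
            Debug.Log($"pointer (aim) pose: {aim}");
    }
}
```

Logging both each frame makes the offset between the two poses visible directly in the Console.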
What about when I'm using native oculus provider?
It's also funny that with valve's openvr everything pivots in the correct place, independent from the device
I am not as familiar with the native Oculus plugin, but after looking at the code it looks like there is only devicePosition / deviceRotation for Oculus (in fact, for OpenXR both values are the same for Oculus, now that I think about it). I am unsure whether that position is the grip axis or the pointer axis in the native plugin, though. I would assume it is the pointer axis, or it would be very difficult to align the controllers.
Well, this is my exact problem. It's been very hard to align the controllers; my hand feels pretty weird.
The axis should be pointing with z being the natural forward vector coming out of your hands and x being left/right.
I think this could be better documented
According to the known issues in the documentation: "Haptics is currently not supported. It will be added in a later version of the OpenXR plug-in" --> https://docs.unity3d.com/Packagesemail@example.com/manual/index.html
Is there any word on when haptics will be added? Is it a far off thing or (hopefully) sometime soon?
I have tested this plugin with the Vive and SteamVR, and the controllers won't match the Vive controller models. I have tested both the device and pointer poses.
I'm experiencing a weird bug after switching to OpenXR today, where the audio output device always defaults to the system's default instead of the VR headset. It happens whether I'm using Oculus or SteamVR as the OpenXR runtime -- both in the editor and in builds. I've checked Oculus's settings for VR audio by adding/removing PC audio to the headset, and vice versa; all combinations have no effect. I can get audio out to the headset by setting it as the system's default audio device, but that isn't workable long term. The bug is persistent for both the CV1 and Quest 2. Is this bug already known?
I'm running the latest of everything, including the Oculus beta, to fix the visual bug outlined in the known issues here:
Two quick questions:
How do I call the Oculus GestureMenu from XR?
How can I get the Y value out of TrackingOriginModeFlags.Floor?
Is there an ETA on Quest builds? I feel that's probably the most important thing missing from OpenXR atm (other than haptics). Hoping it'll be ready by the LTS release...
I've been having trouble with using the Oculus Quest through SteamVR/Virtual Desktop, as it won't detect the primary2dAxis.
Looking at the OpenXR specification, the Oculus Touch Controller Profile doesn't have a binding for …/input/thumbstick, which is what the plugin is looking for, instead it seems that the bindings are …/input/thumbstick/x and …/input/thumbstick/y separately.
Could this be a source of the issue? Or is that just a very explicit way they are defining /input/thumbstick to be a 2D Vector?
Using the OculusXR plugin with Link works fine btw.
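For anyone trying to narrow this down, a minimal probe along these lines (standard UnityEngine.XR API; the script name is mine) can show whether primary2DAxis is being reported at all under a given runtime:

```csharp
using UnityEngine;
using UnityEngine.XR;

// Sketch: log the left-hand thumbstick each frame to check whether
// primary2DAxis delivers any values under the current OpenXR runtime.
public class ThumbstickProbe : MonoBehaviour
{
    void Update()
    {
        var device = InputDevices.GetDeviceAtXRNode(XRNode.LeftHand);
        if (device.isValid &&
            device.TryGetFeatureValue(CommonUsages.primary2DAxis, out Vector2 axis))
        {
            Debug.Log($"primary2DAxis: {axis}");
        }
    }
}
```

If this stays silent (or logs only zeros) under SteamVR but works under the Oculus runtime, that would point at the binding rather than the hardware.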
Hello, I'm using OpenXR 0.1.2-preview.2 and it's impossible to quit the game from the SteamVR menu overlay in the headset. The "Quit Game" button only quits the game in VR; the window remains open on the desktop.
Also having the axis input issue.
Other than this, can you reliably get the VR to start from your build (not from Editor)?
When I use legacy VR there is no problem at all; it starts every time with both Oculus and SteamVR. However, with the new plugin, about 90% of the time VR won't even start.
Also when I toggle on the OpenXR plugin the Oculus plugin gets disabled.
Maybe this is on purpose but is there a reason that Unity can't support both? I would prefer to have the option to launch Oculus in their native platform while being able to support Steam VR headsets through OpenXR.
The current situation with the Unity XR plugin system is a bit messy. Cross-platform VR apps are pretty much stuck with legacy VR right now due to lack of support from SteamVR, and I was hoping the OpenXR plugin would be the saviour, but it seems it is still not quite ready yet.
Does anyone know if there is a way to force the FOV of the camera to a different setting? I'm using the Mock Runtime of the OpenXR plugin. When I try to change the FOV in the Inspector, the game view doesn't react, it seems to be overridden to some default value. I wonder how I could force this to a custom projection.
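One thing worth trying: setting Camera.projectionMatrix directly stops Unity from recomputing the projection from the Inspector's FOV value. Whether the XR/Mock Runtime render path honours this override is an assumption, and the script name and FOV value below are mine; this is only a sketch of the approach:

```csharp
using UnityEngine;

// Sketch: force a custom vertical FOV by assigning the projection matrix
// directly. Once projectionMatrix is set, Unity stops deriving it from
// fieldOfView; whether XR rendering respects the override is untested here.
[RequireComponent(typeof(Camera))]
public class ForceProjection : MonoBehaviour
{
    public float verticalFovDegrees = 90f; // hypothetical target value

    void Start()
    {
        var cam = GetComponent<Camera>();
        cam.projectionMatrix = Matrix4x4.Perspective(
            verticalFovDegrees, cam.aspect,
            cam.nearClipPlane, cam.farClipPlane);
    }
}
```

Calling Camera.ResetProjectionMatrix() would hand control back to Unity if this interferes with the runtime.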
So I have now been testing OpenXR using the new HP Reverb G2, and it worked completely flawlessly (for a fortnight or so) up until today, when I was testing a VR multiplayer scene in Unity and suddenly lost all controller tracking. This also caused several Unity crashes in succession. At first I was still able to track my controllers in other projects, but soon that went as well. After reinstalling the Windows Mixed Reality Portal, restarts, etc., I got tracking back once for another scene, and since then no dice anywhere. The input debugger does not seem to recognize that the controllers exist at all.
I get head tracking through OpenXR just fine when I use the WMR openxr runtime, but the steamVR runtime which used to work causes a crash (both in steam and in Unity).
This is very interesting, as I had just tried to show my friend how to utilize his G2 with OpenXR, and he got a similar problem, although for him the WMR runtime gets stuck into a loading screen and then crashes Unity, while with SteamVR (non-beta version) he can get just the HMD tracking. So yeah, I was kinda trying to figure out what was the problem for him with my own testing and happened to run in to the same problem, no controllers tracking at all. (I've submitted a bug log out of this).
The problem is definitely in the OpenXR, as just using the windows mixed reality plugin works fine in the same project. This is all highly confusing though, as things worked so well for a long time, only to just suddenly be unable to detect anything.
So I am asking if anyone has any solutions for this or any idea whatsoever what is the problem? I'm guessing it could be Microsoft's end too but then again the OpenXR sample scene through Windows Mixed Reality Portal runs flawlessly while Unity does not.
I tested with versions 2020.2.0f2, 2020.2.2f1 and 2020.2.3f1 at least and no change. OpenXR is the latest preview2 and input system is latest as well, although tried with the default for OpenXR as well. Confirmed that there is no kind of detection of the controllers whatsoever.
EDIT: Tracking is working again now, came back as randomly as it went. Assuming this course of action might repeat some point again.
Apparently on Thursday a verified 1.0.2 release quietly dropped?
Pretty great, but I didn't see anything posted here—is there a changelog available for what features made it in? Things like Quest support etc.
Same issues as before: no haptics, no Quest support, just an OpenXR runtime debugger?
How the heck is this a 1.0 release?
Yea, I have the same problem... even with the latest version released on Friday.
Is there a timeline on these missing features? No haptics really feels like a step back for VR given it's supposed to be the most immersive medium... Also Quest support is pretty important it being the fastest growing sector of the VR market.
I'm having a similar experience to Viikable's.
I am very happy that this came out but it is too complex and confusing. Not to mention unstable.
I also appreciate the OpenXR project validation tool but please do not make this so obscure.
Create a proper OpenXR wizard that sets up a scene exactly right for the user.
Same as you have for HDRP. Do not break UX consistency.
Installing the right packages and ticking on all the correct settings.
As it is, I have to use a separate tool, the WMR Feature Tool, to set up the project (thankful it exists, but it adds another risk for errors).
What are we supposed to install there for a basic "standard" VR application? (that would be a basic walkthrough) So many options. Also after using the MRTK installer there are so many unnecessary features turned on. I suppose that is Microsoft's doing.
I am trying to use it with the XR Interaction Toolkit, but it fails. It's probably something I'm doing wrong, but why should we fail at such an early stage? Why do we need to set up each and every option on a teleportation rig?
Please create a standard teleportation rig and ship a basic teleportation area with it, like OpenVR does. Nice, easy, simple. We can scale up the teleportation area ourselves, replace it with a different one, or add to it accordingly.
And sure there are other people who need a more specialized or specific setup. These are usually more advanced users, they can take care of it themselves. Add a basic standard teleportation rig for everyone. Make Unity User friendly again.
While I agree that Unity has some issues with teaching users the XR platform workflow (mainly because it's still in flux at the moment), There are many places that have handy tutorials on setting things up. For example, you could try VR With Andrew, who has been a great help to many in working out the systems. https://www.youtube.com/channel/UCG8bDPqp3jykCGbx-CiL7VQ
Thank you. But I think you misunderstood something.
Such users* do not want to learn how to perform a complex procedure.
They want a tool that does what they need with the least effort.
Unity used to be such a tool. Not anymore.
Now, regarding the tutorials. Sure, there are many out there. The vast majority are outdated, or simply parrot the manual, showcasing things you can figure out simply by creating a scene.
Valem is my favorite.
This is a nice Tutorial using the new input system with OpenXR.
*By these users I mean the users of the AEC industry that Unity is so much after, but is losing to Unreal due to its greater visual realism and more user-friendly scene preparation and packaging. These users do not care to learn how to mix and match the numerous systems of Unity and help debug them as they go. They need to present their work yesterday. Renders that usually take a week are done in days, and while you are finishing the renders, a new set of changes has already come up. That is the world of AEC and previz: fast and merciless. Stumbling your way there (as Unity does lately) is considered failure, not the "adventure of development".
Is anyone else having some issues with SteamVR inputs through OpenXR?
I can get the Vive menu button with:
bool menuSuccess = myInputDevice.TryGetFeatureValue(
    CommonUsages.menuButton, out bool isMenuPressed);
but in this case, isMenuPressed is always false.
Also the CommonUsages.primary2DAxisClick appears to return true when the Vive touchpad is touched, when it should only return true when it is fully pressed.
I'm running Unity 2020.2 and OpenXR 1.0.2 and have tried with both SteamVR 1.15.19 and the latest beta and the latest InputSystem 1.0.2
The paradigm with the OpenXR/XRIT/Input System workflow has changed towards using action-based input rather than device-based input.
The video linked in the post above yours actually has a good tutorial on how to set up Action based input for OpenXR.
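For reference, a minimal action-based read of the same menu button might look like the sketch below, assuming the Input System package and the OpenXR controller layouts; the binding path and script name are assumptions on my part, not an official example:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Sketch: an action-based menu-button read. The "<XRController>{LeftHand}/
// menuButton" binding path is an assumption based on the XR controller
// layouts; adjust it via the Input Debugger if your device exposes a
// different control name.
public class MenuButtonAction : MonoBehaviour
{
    InputAction menuAction;

    void OnEnable()
    {
        menuAction = new InputAction(
            type: InputActionType.Button,
            binding: "<XRController>{LeftHand}/menuButton");
        menuAction.Enable();
    }

    void Update()
    {
        if (menuAction.triggered)
            Debug.Log("Menu pressed");
    }

    void OnDisable() => menuAction.Disable();
}
```

In practice you would define the action in an Input Actions asset rather than in code, but the binding path is the same either way.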
This needs to support the VIVE and Quest ASAP. It'll make Unity VR development much less painful when it does.
It already supports Vive. Quest is indeed a sticking point right now, but I would much prefer to have Haptics be a priority.
OpenXR just keeps crashing my whole computer with bright blue screens of death every time I try to enter my scene with my Oculus Rift CV1, on Unity 2019.4.17f1, HDRP 7.5.3, a GTX 970, 8 GB DDR3 RAM, and an i7 4790K on Windows 10.
So far I'll keep using the "legacy" system, which just works perfectly fine.
How in the Sam Hill did you even get it *installed* on 19.4? The OpenXR plugin requires 2020.2+!!
Will the controller offsets be fixed? Or does every device need custom offsets just like SteamVR caused....
I hope so. That is such an annoying issue.
There appears to be a problem reading the float trigger value for Oculus touch controllers. I believe the problem is on line 514 of OculusTouchControllerProfile.cs (1.0.2) where triggerPressed config has "interactionPath = trigger" which is the same interactionPath that trigger uses.
Thank you for posting this.
@Unity, I would love to get confirmation on this and get an idea when something like this might make it into a fix I can update to. I am actively working with a system that requires this to be working.
The problem that I am having is that the trigger value is not coming through when I use OpenXR with SteamVR via a Quest that is either linked via cable or wireless with Virtual Desktop.
The trigger works fine when using Oculus rather than SteamVR.
The Grip works fine, just the trigger doesn't work.
I also notice the Thumb touch doesn't work, but trigger is more important.
After some more looking, the problem doesn't appear to be line 514. There are actually several inputs that are not reporting correctly with the Oculus Touch controllers under SteamVR. The Input Debugger and XR Interaction Debugger screens show things like the primary and secondary buttons not getting values on button presses. If you push down on the thumbstick, that registers as a menu button, primary button, and secondary button press.
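A quick way to cross-check what each runtime actually exposes from script (to compare against the Input Debugger) is to dump every feature usage the controller reports. This uses the standard UnityEngine.XR API; the script name is mine:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

// Sketch: list every feature usage the right-hand controller reports,
// so the SteamVR and Oculus runtimes can be compared side by side.
public class UsageDump : MonoBehaviour
{
    void Start()
    {
        var device = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);
        var usages = new List<InputFeatureUsage>();
        if (device.isValid && device.TryGetFeatureUsages(usages))
        {
            foreach (var usage in usages)
                Debug.Log($"{usage.name} ({usage.type})");
        }
    }
}
```

Running this once under each runtime should make any missing or misrouted usages stand out.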
I've noticed this for me as well, using CV1.
Note that forcing the OpenXR runtime to Oculus fixes this in the editor; however, in builds of the project Unity will still automatically use the SteamVR runtime, so these input issues persist.
I've created a small library that allows you to switch between OpenXR runtimes, just like the editor does. You can prioritize a runtime by setting it, as in the sample code, before initializing the OpenXR plugin.
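For context, the OpenXR loader selects its runtime from the XR_RUNTIME_JSON environment variable when it is set, so an approach along these lines can prefer a runtime before the plugin initializes. The Oculus manifest path below is an assumption and varies per install; treat this as a sketch, not the library's actual code:

```csharp
using System;

// Sketch: point the OpenXR loader at a specific runtime manifest before
// OpenXR initializes. The manifest path is an assumed typical Oculus
// install location and must be adjusted per machine.
public static class RuntimeSelector
{
    public static void PreferOculus()
    {
        Environment.SetEnvironmentVariable(
            "XR_RUNTIME_JSON",
            @"C:\Program Files\Oculus\Support\oculus-runtime\oculus_openxr_64.json");
    }
}
```

Call this before any XR initialization runs (e.g. before XRGeneralSettings starts the loaders), since the loader reads the variable only at startup.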
The Quest 2 is already in first place among headsets on Steam, but we still cannot use its controls in OpenXR.
What a waste of time.
@the_real_apoxol please help us, this is a very important thing in the VR industry.
The latest version of the SteamVR OpenXR runtime (beta 1.16.8) should have a fix for the controls that did not work.