
Bug OpenXR Interaction Features on Quest

Discussion in 'VR' started by sjando, May 11, 2023.

  1. sjando

    sjando

    Joined:
    Nov 6, 2017
    Posts:
    15
    I’ve got a couple issues with the OpenXR XR provider + Meta Quest (I know there’s other options, I'd still like to get past the issues with this one) related to interaction profiles (aka interaction features) that have caused me a bit of pain over the last year or two.

    When enabling the Meta Quest Support feature (or the old Oculus Quest Support feature), a project validation rule is enforced that “only Oculus Touch Interaction Profile and Meta Quest Pro Touch Interaction Profile are supported right now”. This is an error, not a warning, and will prevent builds from succeeding. This is bizarre behaviour for a number of reasons:
    • There’s no reason profiles should be checked in the first place. The whole point of OpenXR input is that the application specifies the profiles it can support, and the runtime can decide which ones it understands. It should not be an error (or even a warning) to list “unsupported” profiles.
    • If Meta adds new supported profiles in a system update and/or new hardware, we’re stuck waiting for an OpenXR plugin update to 'allow' them, as was the case with the Pro Controller profile.
    • If you hack around this check, you’ll find that several of the “unsupported” built-in profiles work just fine right now, including Microsoft Hand Interaction (for hand tracking) and Khronos Simple Controller.
    This is all exacerbated by the fact that the OpenXR plugin project validator unintuitively doesn’t respect the “Ignore build errors” option exposed in the XR Project Validation window (see below). Your build will fail regardless, and I'd love to see that fixed urgently.

    upload_2023-5-11_10-53-46.png

    The second issue is that enabling more than one interaction profile at a time for Quest seems to cause runtime errors (xrSyncActions: XR_ERROR_INVALID_PATH) that break ALL input - even with two interaction profiles that each work fine when used on their own. I’m not sure whether this is a bug in the Meta runtime or in the Unity OpenXR plugin, but wanting to specify at least one hand-tracking interaction profile and at least one controller interaction profile should be fairly common, so it’s hard to imagine this is a particularly exotic scenario. And it certainly works on non-Quest platforms, even other Android-based ones. Anyone else had success with this?

    Maybe the first issue (the heavy-handed validation) is an attempt to avoid users running into the (known) second one? If so, it doesn’t quite work, as even the “supported” combo of Oculus Touch Controller Profile and Meta Quest Touch Pro Controller Profile will manifest this error if used together.
     
  2. DevDunk

    DevDunk

    Joined:
    Feb 13, 2020
    Posts:
    4,214
    To give some of the reasons for this (I think):
    1. OpenXR on Android is not cross-platform yet. This means each vendor has their own runtime support, and a build will not run on other vendors' devices. There is a start on one build for all devices, but that's still some time away.
    2. Vendors can release their own interaction profiles, just like Pico does. Meta not releasing an update for their new controller is also their issue (and the normal interaction profile works on the Pro as well, so there was no big need, I think).
    3. "Ignore build errors" does not mean ignore errors while building, I think; not too sure about this one.

    I have made a build script in the past that makes both a Pico and a Quest build with one button, in order to save me some time.
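    A rough sketch of such a one-button dual build might look like the below. This is a hypothetical example, not the script described above: the PICO loader type name, menu path, and output paths are assumptions - verify them against your SDK versions.

```csharp
#if UNITY_EDITOR
using UnityEditor;
using UnityEditor.XR.Management;
using UnityEditor.XR.Management.Metadata;
using UnityEngine.XR.Management;

public static class DualBuild
{
    const string k_OpenXRLoader = "UnityEngine.XR.OpenXR.OpenXRLoader";
    const string k_PicoLoader = "Unity.XR.PXR.PXR_Loader"; // assumed PICO SDK loader type name

    [MenuItem("Build/Quest + PICO")]
    public static void BuildBoth()
    {
        BuildWithLoader(k_OpenXRLoader, k_PicoLoader, "Builds/quest.apk");
        BuildWithLoader(k_PicoLoader, k_OpenXRLoader, "Builds/pico.apk");
    }

    static void BuildWithLoader(string enableLoader, string disableLoader, string outputPath)
    {
        // Swap the active XR loader for Android before building.
        EditorBuildSettings.TryGetConfigObject(XRGeneralSettings.k_SettingsKey,
            out XRGeneralSettingsPerBuildTarget perTarget);
        var manager = perTarget.SettingsForBuildTarget(BuildTargetGroup.Android).Manager;
        XRPackageMetadataStore.RemoveLoader(manager, disableLoader, BuildTargetGroup.Android);
        XRPackageMetadataStore.AssignLoader(manager, enableLoader, BuildTargetGroup.Android);

        BuildPipeline.BuildPlayer(EditorBuildSettings.scenes, outputPath,
            BuildTarget.Android, BuildOptions.None);
    }
}
#endif
```

    Per-vendor Android manifest tweaks (if any) would go in the same place before each BuildPlayer call.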
     
  3. sjando

    sjando

    Joined:
    Nov 6, 2017
    Posts:
    15
    Thanks for this, that all makes sense. My issue isn’t about one build for multiple headsets, though. I understand that several vendors have proprietary implementations of the OpenXR loader on Android, that there are different Android manifest requirements, etc. Easily scripted around, as you say.

    This is about the parts Unity does distribute and maintain themselves applying arbitrary, nonsensical limitations. As I mentioned, the Microsoft Hand Interaction Profile (that Unity distributes with the OpenXR plugin) works perfectly fine with the Meta Quest Support feature (also distributed by Unity as part of the OpenXR plugin) if you hack around the validation rule, so why does the rule exist? There’s also no downside to enabling an interaction profile that isn’t supported so I just can’t fathom it.

    Yes, Meta could release their own interaction feature for a new controller. I can also trivially make an interaction feature for a new Meta controller once the OpenXR spec is published. But the out-of-box Unity Meta Quest Support feature will fail your build if you attempt to use it because it isn’t whitelisted by Unity, so to speak. So, I’d also have to do one of:
    • Reimplement my own version of Meta Quest Support (which is just the Meta loader library and some manifest processing, so it can be done) without the validation rule, and use it instead
    • Fork/embed com.unity.xr.openxr to allow this “error” to be bypassed, perhaps by respecting the Ignore build errors EditorPref set by the XR Project Validation window (the window that shows the error to begin with, so that would make sense, I reckon)
    • Write an extra OpenXR feature that mutates the validation rule list to effectively disable the built-in rule or at least downgrade it to a warning (this is usually the option I go with)
    It’s a problem/mistake in the implementation of the OpenXR package that should (and could easily) be fixed in a future version.
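    For reference, the third option can be sketched roughly as follows. It assumes the List<ValidationRule> passed to GetValidationChecks is the shared, accumulated rule list and that this feature's checks run after Meta Quest Support's - verify both against your plugin version. The feature name, id, and message match are made up for illustration.

```csharp
using UnityEngine.XR.OpenXR.Features;
#if UNITY_EDITOR
using System.Collections.Generic;
using UnityEditor;
#endif

#if UNITY_EDITOR
[UnityEditor.XR.OpenXR.Features.OpenXRFeature(
    UiName = "Relax Quest Profile Validation", // hypothetical feature
    FeatureId = "com.example.relax-quest-profile-validation",
    BuildTargetGroups = new[] { BuildTargetGroup.Android })]
#endif
public class RelaxQuestProfileValidation : OpenXRFeature
{
#if UNITY_EDITOR
    protected override void GetValidationChecks(List<ValidationRule> rules, BuildTargetGroup targetGroup)
    {
        // Downgrade the Meta Quest Support interaction-profile whitelist rule
        // from a build-blocking error to a warning. Matching on message text
        // is fragile, but avoids forking the package.
        foreach (var rule in rules)
        {
            if (rule.message != null && rule.message.Contains("Interaction Profile"))
                rule.error = false;
        }
    }
#endif
}
```

    Enable the feature alongside Meta Quest Support in the OpenXR feature list and the rule shows up as a warning instead of failing the build.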

    Anyway, you can get around all of this if you know what you’re doing. Just mildly inconvenient. But my second issue (runtime errors with multiple profiles enabled) is a real blocker. One that seems to be coming from the native XRSDK implementation of the OpenXR plugin (if not from the Meta runtime itself), so it isn't as easily worked around or reimplemented. Any insight into that one?
     
  4. DevDunk

    DevDunk

    Joined:
    Feb 13, 2020
    Posts:
    4,214
    Yeah, definitely understand the frustration. Especially if it technically could work, it should be a warning and not a blocking error.
    Sadly, I cannot help much further with finding a workaround.
     
  5. sjando

    sjando

    Joined:
    Nov 6, 2017
    Posts:
    15
    In case anyone else comes across this, an update regarding the second issue (runtime errors with multiple profiles enabled):

    As best I can determine using Unity's OpenXR Runtime Debugger, Unity OpenXR is not actually doing anything wrong here. I'm convinced that Meta's Android runtime doesn't properly conform to the OpenXR spec (or at least interprets it differently from most other runtimes) when it comes to the xrSyncActions call. It returns XR_PATH_UNSUPPORTED when it shouldn't.

    I've confirmed it can be worked around by intercepting Unity's call to xrSyncActions in a custom OpenXRFeature - but it's ugly, and could be implemented more simply and efficiently in the Unity OpenXR plugin itself.
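    For anyone curious, the interception mechanism looks roughly like this (an illustrative skeleton, not my exact code): override OpenXRFeature.HookGetInstanceProcAddr and hand back your own xrSyncActions pointer. The actual rewriting of the XrActionsSyncInfo payload is elided, since the filtering logic depends on your action setup. The feature still needs the usual OpenXRFeature attribute to appear in settings.

```csharp
using System;
using System.Runtime.InteropServices;
using UnityEngine.XR.OpenXR.Features;

public class SyncActionsInterceptor : OpenXRFeature
{
    [UnmanagedFunctionPointer(CallingConvention.Cdecl)]
    delegate int GetInstanceProcAddrDelegate(ulong instance,
        [MarshalAs(UnmanagedType.LPStr)] string name, out IntPtr function);

    [UnmanagedFunctionPointer(CallingConvention.Cdecl)]
    delegate int SyncActionsDelegate(ulong session, IntPtr syncInfo);

    // Keep delegate instances alive so their native thunks aren't collected.
    static GetInstanceProcAddrDelegate s_RealGetProcAddr, s_HookedGetProcAddr;
    static SyncActionsDelegate s_RealSyncActions, s_HookedSyncActions;

    protected override IntPtr HookGetInstanceProcAddr(IntPtr func)
    {
        s_RealGetProcAddr = Marshal.GetDelegateForFunctionPointer<GetInstanceProcAddrDelegate>(func);
        s_HookedGetProcAddr = GetProcAddrHook;
        return Marshal.GetFunctionPointerForDelegate(s_HookedGetProcAddr);
    }

    [AOT.MonoPInvokeCallback(typeof(GetInstanceProcAddrDelegate))]
    static int GetProcAddrHook(ulong instance, string name, out IntPtr function)
    {
        int result = s_RealGetProcAddr(instance, name, out function);
        if (result == 0 /* XR_SUCCESS */ && name == "xrSyncActions")
        {
            // Substitute our wrapper for the runtime's xrSyncActions.
            s_RealSyncActions = Marshal.GetDelegateForFunctionPointer<SyncActionsDelegate>(function);
            s_HookedSyncActions = SyncActionsHook;
            function = Marshal.GetFunctionPointerForDelegate(s_HookedSyncActions);
        }
        return result;
    }

    [AOT.MonoPInvokeCallback(typeof(SyncActionsDelegate))]
    static int SyncActionsHook(ulong session, IntPtr syncInfo)
    {
        // Here you would rewrite the XrActionsSyncInfo pointed to by syncInfo
        // (dropping XrActiveActionSets that have no bindings under the current
        // interaction profile) before forwarding. Filtering logic elided.
        return s_RealSyncActions(session, syncInfo);
    }
}
```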

    If anyone involved with maintaining Unity OpenXR reads this:

    - Please consider removing the "only Oculus Touch Controller is supported right now" validation step from the Meta Quest Support feature. There are good reasons right now to use Microsoft Hand Interaction or Khronos Simple Controller (see note below) instead. A "multiple interaction profiles are not supported on Quest right now" error if you enable more than one at a time would be a reasonable short-term replacement, given the issue discussed in this post. The ability to push past this with the "Ignore build errors" setting in XR Project Validation would also still be appreciated.
    - A built-in workaround to this issue would be great. If an XrActionSet corresponds to an ActionMapConfig in which no child ActionConfig matches the current interaction profile for a given subaction path, don't include that XrActiveActionSet in the call to xrSyncActions. Otherwise, Meta's runtime seems to fail the call (incorrectly by my reading of the spec; but if not, maybe other runtimes I haven't seen will do the same - a problem either way).

    NOTE: Khronos Simple Controller is actually very handy on Quest if you're doing something with simple input needs, since unlike with the Touch Controller profiles, the Meta runtime will bind both hand-tracked and controller inputs to this profile - allowing you to seamlessly switch back and forth between the two input methods at runtime without triggering the multi-profile issue discussed in this post. You still of course need to enable hand tracking in the Android manifest file for this to work on device (having a toggle for that in the Meta Quest Support feature would be great too *nudge nudge*).
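    For reference, the Android manifest entries Meta documents for hand tracking look like the below (check Meta's current docs for your OS version; android:required="false" keeps hands optional so the app still installs on devices without hand tracking enabled):

```xml
<!-- Inside the <manifest> element, alongside other permissions -->
<uses-permission android:name="com.oculus.permission.HAND_TRACKING" />
<uses-feature android:name="oculus.software.handtracking" android:required="false" />
```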
     
  6. billyjoecain

    billyjoecain

    Joined:
    Jul 10, 2015
    Posts:
    10
    I just ran into this, and here is what got me over the hump:

    Uncheck Meta Quest Support. You can follow along with the error icons. Make sure that XR Interaction Toolkit is installed. I could not see the edit button in the second screenshot without it.
     

    Attached Files:

  7. unitypsycurio

    unitypsycurio

    Joined:
    Apr 20, 2020
    Posts:
    2
    Yep, it turns out that for Meta hand tracking you don't need the Hand Interaction profile - poses all work fine without it.

    upload_2023-9-20_14-35-11.png
     
  8. sjando

    sjando

    Joined:
    Nov 6, 2017
    Posts:
    15
    Thanks. Yeah, you can get Meta hand tracking data, including platform-specific aim poses from the XR_FB_hand_tracking_aim extension, in the way you've described, but it isn't quite a remedy for the issues I'm describing.

    XRHandSubsystem in your case exposes data directly through the input system, and it does so with layouts (MetaAimHand and XRHandDevice) that, unlike almost every other XR controller or hand layout, don't inherit XRController. If you have a bunch of input bindings to e.g. <XRController>, they won't work. Not a huge deal, especially as you can probably just bind <TrackedDevice> instead, but a minor inconvenience.
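    To illustrate that workaround: a binding widened to <TrackedDevice> matches the hand layouts as well as controllers. This is an illustrative sketch; it assumes the devices are tagged with the usual RightHand usage.

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

public class AimPoseReader : MonoBehaviour
{
    InputAction m_AimPosition;

    void OnEnable()
    {
        // "<XRController>{RightHand}/devicePosition" would miss MetaAimHand /
        // XRHandDevice; <TrackedDevice> matches controllers and hands alike.
        m_AimPosition = new InputAction(binding: "<TrackedDevice>{RightHand}/devicePosition");
        m_AimPosition.Enable();
    }

    void Update()
    {
        Vector3 position = m_AimPosition.ReadValue<Vector3>();
        // ... drive your pointer/raycast from `position` here.
    }

    void OnDisable() => m_AimPosition.Disable();
}
```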

    More of an issue is that by exposing data directly through the input system, instead of indirectly at a lower level via an XRSDK input subsystem (which is how interaction profiles work), any application code using the XR input APIs (UnityEngine.XR.InputDevices et al., aka not the new Input System) doesn't work and needs to be rewritten.

    I'm mostly thinking about scenarios where you aren't using XR Interaction Toolkit or some other framework or SDK that handles everything for you and you don't care about hand joints, just simple point and click functionality. Maybe you have an existing application that works on a bunch of devices and you just want to support hands on Quest. This could and should be trivial, but it isn't.

    It *is* trivial when you enable the Khronos Simple (if you want both hands and controllers) or Microsoft Hand (if you just want hands) interaction profiles on Quest, since the Quest runtime explicitly supports these. An existing application will 'just work' with hands. The problem is that Unity blocks them on Quest, likely for legacy reasons or through sheer misunderstanding of how the Quest OpenXR runtime operates with respect to hand interaction these days (there was a time when you did need to use XR_FB_hand_tracking_aim, and it was annoying, but that time has long since passed). Perhaps a bug report is a better way to bring this to Unity's attention.