
Bug OpenXR Interaction Features on Quest

Discussion in 'VR' started by sjando, May 11, 2023.

  1. sjando

    sjando

    Joined:
    Nov 6, 2017
    Posts:
    18
    I’ve got a couple of issues with the OpenXR XR provider + Meta Quest (I know there are other options; I'd still like to get past the issues with this one) related to interaction profiles (aka interaction features) that have caused me a bit of pain over the last year or two.

    When enabling the Meta Quest Support feature (or the old Oculus Quest Support feature), a project validation rule is enforced that “only Oculus Touch Interaction Profile and Meta Quest Pro Touch Interaction Profile are supported right now”. This is an error, not a warning, and will prevent builds from succeeding. This is bizarre behaviour for a number of reasons:
    • There’s no reason profiles should be checked in the first place. The whole point of OpenXR input is that the application specifies the profiles it can support, and the runtime can decide which ones it understands. It should not be an error (or even a warning) to list “unsupported” profiles.
    • If Meta adds new supported profiles in a system update and/or new hardware, we’re stuck waiting for an OpenXR plugin update to 'allow' them, as was the case with the Pro Controller profile.
    • If you hack around this check, you’ll find that several of the “unsupported” built-in profiles work just fine right now, including Microsoft Hand Interaction (for hand tracking) and Khronos Simple Controller.
    This is all exacerbated by the fact that the OpenXR plugin project validator unintuitively doesn’t respect the “Ignore build errors” option exposed in the XR Project Validation window (see below). Your build will fail regardless, and I'd love to see that fixed urgently.

    [Screenshot: XR Project Validation window showing the "Ignore build errors" option]

    The second issue is that enabling more than one interaction profile at a time for Quest causes runtime errors (xrSyncActions: XR_ERROR_INVALID_PATH) that break ALL input, even with two interaction profiles that each work fine on their own. I’m not sure whether this is a bug in the Meta runtime or in the Unity OpenXR plugin, but wanting to specify at least one hand tracking interaction profile and at least one controller interaction profile should be fairly common, so it’s hard to imagine this is a particularly exotic scenario. It certainly works on non-Quest platforms, even other Android-based ones. Has anyone else had success with this?

    Maybe the first issue (the heavy-handed validation) is an attempt to avoid users running into the (known) second one? If so, it doesn’t quite work, as even the “supported” combo of Oculus Touch Controller Profile and Meta Quest Touch Pro Controller Profile will manifest this error if used together.
     
  2. DevDunk

    DevDunk

    Joined:
    Feb 13, 2020
    Posts:
    5,063
    To give some of the reasons for this (I think):
    1. OpenXR on Android is not cross-platform yet. This means each vendor has their own runtime support, and a build for one vendor will not run on another vendor's devices. There is a start on one build for all devices, but that's still some time away.
    2. Vendors can release their own interaction profiles, just like Pico does. Meta not releasing an update for their new controller is also their issue (and the normal interaction profile works on the Pro as well, so there was no big need, I think).
    3. "Ignore build errors" does not mean ignore errors while building, I think; not too sure about this one.

    I have made a build script in the past that makes both a Pico and a Quest build with one button, in order to save me some time.
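
    Something along those lines can be as simple as the sketch below (not my actual script; the menu path, output folder and app name are placeholders, and a real script would also switch the vendor-specific loader/manifest settings per target before each build):

    Code (CSharp):
    #if UNITY_EDITOR
    using System.Linq;
    using UnityEditor;
    using UnityEditor.Build.Reporting;
    using UnityEngine;

    public static class DualBuildMenu
    {
        [MenuItem("Build/Build Quest + Pico")]
        public static void BuildBoth()
        {
            BuildFor("Quest");
            BuildFor("Pico");
        }

        static void BuildFor(string variant)
        {
            // NOTE: a real dual-build script would also toggle the vendor-specific OpenXR
            // features / loaders here before kicking off the player build.
            var options = new BuildPlayerOptions
            {
                scenes = EditorBuildSettings.scenes.Where(s => s.enabled).Select(s => s.path).ToArray(),
                target = BuildTarget.Android,
                locationPathName = $"Builds/MyApp-{variant}.apk",
            };
            BuildReport report = BuildPipeline.BuildPlayer(options);
            Debug.Log($"{variant} build finished: {report.summary.result}");
        }
    }
    #endif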
     
  3. sjando

    sjando

    Joined:
    Nov 6, 2017
    Posts:
    18
    Thanks for this, that all makes sense. My issue isn’t about one build for multiple headsets, though. I understand that several vendors have proprietary implementations of the OpenXR loader on Android, that there are different Android manifest requirements, etc. Easily scripted around, as you say.

    This is about the parts Unity does distribute and maintain themselves applying arbitrary, nonsensical limitations. As I mentioned, the Microsoft Hand Interaction Profile (which Unity distributes with the OpenXR plugin) works perfectly fine with the Meta Quest Support feature (also distributed by Unity as part of the OpenXR plugin) if you hack around the validation rule, so why does the rule exist? There’s also no downside to enabling an interaction profile that isn’t supported, so I just can’t fathom it.

    Yes, Meta could release their own interaction feature for a new controller. I can also trivially make an interaction feature for a new Meta controller once the OpenXR spec is published. But the out-of-the-box Unity Meta Quest Support feature will fail your build if you attempt to use it, because it isn’t whitelisted by Unity, so to speak. So I’d also have to do one of the following:
    • Reimplement my own version of Meta Quest Support (which is just the Meta loader library and some manifest processing, so it can be done) without the validation rule, and use it instead
    • Fork/embed com.unity.xr.openxr, allowing this “error” to be bypassed, perhaps by respecting the Ignore build errors EditorPrefs value set by the XR Project Validation window (the window that shows the error to begin with, so that would make sense I reckon)
    • Write an extra OpenXR feature that mutates the validation rule list to effectively disable the built-in rule or at least downgrade it to a warning (this is usually the option I go with)
    It’s a problem/mistake in the implementation of the OpenXR package that should (and could easily) be fixed in a future version.

    Anyway, you can get around all of this if you know what you’re doing. Just mildly inconvenient. But my second issue (runtime errors with multiple profiles enabled) is a real blocker. One that seems to be coming from the native XRSDK implementation of the OpenXR plugin (if not from the Meta runtime itself), so it's not as easily worked around or reimplemented. Any insight into that one?
     
  4. DevDunk

    DevDunk

    Joined:
    Feb 13, 2020
    Posts:
    5,063
    Yeah, definitely understand the frustration. Especially if it technically could work, it should be a warning and not a blocking error.
    Sadly I cannot help much further with finding a workaround.
     
  5. sjando

    sjando

    Joined:
    Nov 6, 2017
    Posts:
    18
    In case anyone else comes across this, an update regarding the second issue (runtime errors with multiple profiles enabled):

    As best I can determine through Unity's OpenXR Runtime Debugger, Unity OpenXR is not actually doing anything wrong here. I'm convinced that Meta's Android runtime doesn't properly conform to the OpenXR spec (or at least interprets it differently from most other runtimes) when it comes to the xrSyncActions call. It returns XR_PATH_UNSUPPORTED when it shouldn't.

    I've confirmed it can be worked around by intercepting Unity's call to xrSyncActions in a custom OpenXRFeature - but it's ugly, and could be implemented more simply and efficiently in the Unity OpenXR plugin itself.
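
    For the curious, the scaffolding for that kind of interception looks roughly like the sketch below (class and feature names are mine, and this version simply forwards the call; the actual workaround also has to rewrite the XrActionsSyncInfo before passing it on):

    Code (CSharp):
    using System;
    using System.Runtime.InteropServices;
    using AOT;
    using UnityEngine.XR.OpenXR.Features;
    #if UNITY_EDITOR
    using UnityEditor;
    using UnityEditor.XR.OpenXR.Features;
    #endif

    #if UNITY_EDITOR
    [OpenXRFeature(
        UiName = "xrSyncActions Interceptor (sketch)",
        FeatureId = "com.example.syncactionsinterceptor",
        BuildTargetGroups = new BuildTargetGroup[] { BuildTargetGroup.Android }
    )]
    #endif
    public class SyncActionsInterceptorFeature : OpenXRFeature
    {
        [UnmanagedFunctionPointer(CallingConvention.Cdecl)]
        delegate int GetInstanceProcAddrDelegate(ulong instance, IntPtr name, out IntPtr function);

        [UnmanagedFunctionPointer(CallingConvention.Cdecl)]
        delegate int SyncActionsDelegate(ulong session, IntPtr syncInfo);

        static GetInstanceProcAddrDelegate s_OrigGetProcAddr;
        static SyncActionsDelegate s_OrigSyncActions;
        // Keep the managed delegates alive; native code holds raw pointers to them.
        static readonly GetInstanceProcAddrDelegate s_GetProcAddrHook = GetProcAddrHook;
        static readonly SyncActionsDelegate s_SyncActionsHook = SyncActionsHook;

        // Unity calls this so a feature can wrap xrGetInstanceProcAddr and thereby any OpenXR function.
        protected override IntPtr HookGetInstanceProcAddr(IntPtr func)
        {
            s_OrigGetProcAddr = Marshal.GetDelegateForFunctionPointer<GetInstanceProcAddrDelegate>(func);
            return Marshal.GetFunctionPointerForDelegate(s_GetProcAddrHook);
        }

        [MonoPInvokeCallback(typeof(GetInstanceProcAddrDelegate))]
        static int GetProcAddrHook(ulong instance, IntPtr name, out IntPtr function)
        {
            int result = s_OrigGetProcAddr(instance, name, out function);
            if (result == 0 /* XR_SUCCESS */ && Marshal.PtrToStringAnsi(name) == "xrSyncActions")
            {
                s_OrigSyncActions = Marshal.GetDelegateForFunctionPointer<SyncActionsDelegate>(function);
                function = Marshal.GetFunctionPointerForDelegate(s_SyncActionsHook);
            }
            return result;
        }

        [MonoPInvokeCallback(typeof(SyncActionsDelegate))]
        static int SyncActionsHook(ulong session, IntPtr syncInfo)
        {
            // This is where the XrActionsSyncInfo could be patched (e.g. dropping active action
            // sets whose maps don't match the current interaction profile). This sketch only forwards.
            return s_OrigSyncActions(session, syncInfo);
        }
    }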

    If anyone involved with maintaining Unity OpenXR reads this:

    - Please consider removing the "only Oculus Touch Controller is supported right now" validation step from the Meta Quest Support feature. There are good reasons right now to use Microsoft Hand Interaction or Khronos Simple Controller (see note below) instead. A "multiple interaction profiles are not supported on Quest right now" error if you enable more than one at a time would be a reasonable short-term replacement given the issue discussed in this post. The ability to push past this with the "Ignore build errors" setting in XR Project Validation would also still be appreciated.
    - A built-in workaround to this issue would be great. If an XrActionSet corresponds to an ActionMapConfig in which no child ActionConfig matches the current interaction profile for a given subaction path, don't include that XrActiveActionSet in the call to xrSyncActions. Otherwise, Meta's runtime seems to fail the call (incorrectly by my reading of the spec, but even if not, other runtimes I haven't seen might do the same - a problem either way).

    NOTE: Khronos Simple Controller is actually very handy on Quest if you're doing something with simple input needs, since unlike with the Touch Controller profiles the Meta runtime will bind both hand-tracked and controller inputs to this profile - allowing you to seamlessly switch back and forth between the two input methods at runtime without triggering the multi-profile issue discussed in this post. You still, of course, need to enable hand tracking in the Android manifest file for this to work on device (having a toggle for that in the Meta Quest Support feature would be great too *nudge nudge*).
     
  6. billyjoecain

    billyjoecain

    Joined:
    Jul 10, 2015
    Posts:
    10
    I just ran into this and here is what got me over the hump:

    Uncheck Meta Quest Support. You can follow along with the error icons. Make sure that XR Interaction Toolkit is installed; I could not see the edit button in the second screenshot without it.
     


  7. unitypsycurio

    unitypsycurio

    Joined:
    Apr 20, 2020
    Posts:
    2
    Yep, it turns out that for Meta hand tracking you don't need the Hand Interaction profile - poses all work fine without it.

     
  8. sjando

    sjando

    Joined:
    Nov 6, 2017
    Posts:
    18
    Thanks. Yeah, you can get Meta hand tracking data, including platform-specific aim poses from the XR_FB_hand_tracking_aim extension, in the way you've described, but it isn't quite a remedy for the issues I'm describing.

    In your case XRHandSubsystem exposes data directly through the Input System, and it does so with layouts (MetaAimHand and XRHandDevice) that, unlike almost every other XR controller or hand layout, don't inherit from XRController. If you have a bunch of input bindings to e.g. <XRController>, they won't work. Not a huge deal, especially as you can probably just bind <TrackedDevice> instead, but a minor inconvenience.
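
    For illustration, this is the kind of rebinding I mean (a minimal sketch using the new Input System; whether the hand devices carry the usual {LeftHand}/{RightHand} usages depends on your XR Hands version):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.InputSystem;

    public class AimPoseReader : MonoBehaviour
    {
        InputAction aimPosition;

        void OnEnable()
        {
            // "<XRController>{RightHand}/devicePosition" would never match MetaAimHand/XRHandDevice,
            // because those layouts don't derive from XRController. The broader TrackedDevice layout does match.
            aimPosition = new InputAction(binding: "<TrackedDevice>{RightHand}/devicePosition");
            aimPosition.Enable();
        }

        void Update() => Debug.Log(aimPosition.ReadValue<Vector3>());

        void OnDisable() => aimPosition.Disable();
    }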

    More of an issue is that by exposing data directly through the Input System, instead of indirectly at a lower level via an XRSDK input subsystem (which is how interaction profiles work), any application code using the XR input APIs (UnityEngine.XR.InputDevices et al., i.e. not the new Input System) doesn't work and needs to be rewritten.
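
    To be concrete, it's this style of code (class name illustrative) that works against any interaction profile surfaced by the OpenXR input subsystem, but sees nothing from XRHandDevice/MetaAimHand and would need rewriting:

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR;

    public class LegacyXRInputExample : MonoBehaviour
    {
        void Update()
        {
            // Classic UnityEngine.XR.InputDevices usage: query the device at an XR node and read a feature value.
            InputDevice device = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);
            if (device.isValid &&
                device.TryGetFeatureValue(CommonUsages.primaryButton, out bool pressed) && pressed)
            {
                Debug.Log("Primary button pressed on right-hand device");
            }
        }
    }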

    I'm mostly thinking about scenarios where you aren't using XR Interaction Toolkit or some other framework or SDK that handles everything for you and you don't care about hand joints, just simple point and click functionality. Maybe you have an existing application that works on a bunch of devices and you just want to support hands on Quest. This could and should be trivial, but it isn't.

    It *is* trivial when you enable the Khronos Simple (if you want both hands and controller) or Microsoft Hand (if you just want hands) interaction profiles on Quest, since the Quest runtime explicitly supports these. An existing application will 'just work' with hands. The problem is Unity blocks them on Quest, likely for legacy reasons or through sheer misunderstanding of how the Quest OpenXR runtime operates with respect to hand interaction these days (there was a time when you did need to use XR_FB_hand_tracking_aim, and it was annoying, but that time has long passed). Perhaps a bug report is a better way to bring this to Unity's attention.
     
  9. ahmedshariff

    ahmedshariff

    Joined:
    Feb 25, 2020
    Posts:
    11
    As a temporary workaround, you could use OpenXR plugin version 1.7.0. In the manifest file:
    [Screenshot of the manifest entry pinning com.unity.xr.openxr to 1.7.0]
     
  10. nguyenvietnamhg

    nguyenvietnamhg

    Joined:
    Mar 21, 2023
    Posts:
    2
    I attempted to do it but encountered the same issue.
     
  11. sjando

    sjando

    Joined:
    Nov 6, 2017
    Posts:
    18
    Not sure every poster in this thread is actually up against the issues I'm describing. Rolling the OpenXR package back won't make any difference; the Unity part of this particular issue is present in every version up to and including the current 1.8.2.

    I stress that this particular issue is unlikely to actually be a problem for most, even if you happen to see the "only touch controller is supported right now" error message. In particular, if you're using a toolkit like the Oculus Integration SDK, XR Hands/XR Interaction Toolkit, MRTK, etc., stop reading here unless you're certain you understand exactly what I'm describing in my previous posts and still believe it is relevant to you.

    By way of update, the second part of the issue I described (XR_ERROR_INVALID_PATH) does indeed appear to have been a bug in the Meta Quest OpenXR runtime that was thankfully quietly fixed circa v57. Now there is no constraint on the Meta side as to which interaction profiles you can use in combination via Unity. (see here for more detail on this problem)

    The other part, the Unity OpenXR package issue, remains as always an unnecessary restriction. You can work around it in various ways, but the simplest is to define and enable your own OpenXR Feature that overrides the troublesome validation error of the built-in Meta Quest Feature:

    Code (CSharp):
    using System;
    using System.Linq;
    using System.Collections.Generic;
    using UnityEngine.XR.OpenXR;
    using UnityEngine.XR.OpenXR.Features;

    #if UNITY_EDITOR
    using UnityEditor;
    using UnityEditor.XR.OpenXR.Features;
    #endif

    /// <summary>
    /// As at OpenXR Plugin <= 1.8.2 (at least), Unity's MetaQuestFeature breaks certain builds under the (incorrect) notion that only Touch Controller interaction profiles are supported.
    /// OpenXR Plugin also incorrectly ignores the "Ignore build errors" setting provided by the XR Management package's validation pipeline, making this tricky to bypass.
    /// This feature removes the troublesome validation rule so 'unsupported' configurations can still be built. Needs to be lower priority than MetaQuestFeature.
    /// Also adds a warning that the use of multiple interaction profiles (even supported ones) causes severe input issues on Quest runtimes before c. v57 (this is a Meta bug, not a Unity bug).
    /// </summary>
    #if UNITY_EDITOR
    [OpenXRFeature(
        UiName = "Unlock Quest Interaction Profiles",
        FeatureId = "com.sjando.unlockquestinteractionprofiles",
        BuildTargetGroups = new BuildTargetGroup[] { BuildTargetGroup.Android },
        Priority = int.MinValue
    )]
    #endif
    public class QuestUnlockInteractionProfilesFeature : OpenXRFeature
    {
    #if UNITY_EDITOR
        protected override void GetValidationChecks(List<ValidationRule> rules, BuildTargetGroup targetGroup)
        {
            if (targetGroup != BuildTargetGroup.Android) return;

            // find the problematic rule
            var troublesomeRule = rules.Where(r => r.message.Contains("supported right now", StringComparison.InvariantCultureIgnoreCase)).FirstOrDefault();

            // remove it
            if (troublesomeRule != null) rules.Remove(troublesomeRule);

            // add a new one (warning only) describing a semi-related bug on the Meta side
            rules.Add(new ValidationRule(this)
            {
                // only count interaction profiles that are actually enabled
                checkPredicate = () => OpenXRSettings.ActiveBuildTargetInstance.GetFeatures<OpenXRInteractionFeature>().Count(f => f.enabled) < 2,
                error = false,
                message = "The use of multiple interaction profiles on Quest runtimes before c. v57 can cause input issues. Please ensure your Quest system software is up to date.",
            });
        }
    #endif
    }
    [Screenshot: Meta Quest Support complaining about a perfectly valid set of interaction profiles]

    [Screenshot: the new feature enabled, unlocking arbitrary interaction profile combinations (with a warning about the <v57 Quest runtime bug if specifying more than one profile)]

    The primary usefulness of doing this is unlocking the ability to use either Khronos Simple Controller (which the Meta runtime will bind to both hand and controller inputs) or Microsoft Hand Interaction (which the Meta runtime will bind to hand input only), perhaps in combination with other profiles (make sure to update to the latest Meta system software if so, to avoid the bug on their side), in order to get simple hand interaction using nothing but the Input System and the stock OpenXR input subsystem.

    [Note: if you want to work with hand input you'll still need to separately enable the hand tracking feature in the Android manifest, which is most easily done with an additional OpenXR Feature that has a build postprocessing step - rough sketch below.]
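
    Something like the following (class name is mine; pre-1.10 this can hook the generated Gradle project via OpenXRFeatureBuildHooks and inject the standard Meta hand tracking permission/feature elements):

    Code (CSharp):
    #if UNITY_EDITOR
    using System;
    using System.IO;
    using System.Xml;
    using UnityEditor.Build.Reporting;
    using UnityEditor.XR.OpenXR.Features;

    // Runs only when the associated feature (here the unlock feature above) is enabled.
    public class HandTrackingManifestHook : OpenXRFeatureBuildHooks
    {
        public override int callbackOrder => 1;
        public override Type featureType => typeof(QuestUnlockInteractionProfilesFeature);

        protected override void OnPreprocessBuildExt(BuildReport report) { }
        protected override void OnPostprocessBuildExt(BuildReport report) { }

        protected override void OnPostGenerateGradleAndroidProjectExt(string path)
        {
            // <path> is the unityLibrary module of the exported Gradle project.
            var manifestPath = Path.Combine(path, "src", "main", "AndroidManifest.xml");
            var doc = new XmlDocument();
            doc.Load(manifestPath);

            const string androidNs = "http://schemas.android.com/apk/res/android";
            XmlElement manifest = doc.DocumentElement;

            // <uses-permission android:name="com.oculus.permission.HAND_TRACKING" />
            var permission = doc.CreateElement("uses-permission");
            permission.SetAttribute("name", androidNs, "com.oculus.permission.HAND_TRACKING");
            manifest.AppendChild(permission);

            // <uses-feature android:name="oculus.software.handtracking" android:required="false" />
            var feature = doc.CreateElement("uses-feature");
            feature.SetAttribute("name", androidNs, "oculus.software.handtracking");
            feature.SetAttribute("required", androidNs, "false");
            manifest.AppendChild(feature);

            doc.Save(manifestPath);
        }
    }
    #endif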

    EDIT: all of this applies to the recently released Unity OpenXR plugin 1.9.1 also
     
    Last edited: Oct 30, 2023
  12. sjando

    sjando

    Joined:
    Nov 6, 2017
    Posts:
    18
    OpenXR plugin 1.10.0, released in the last few days, has resolved the most frustrating part of this issue and renders the workaround above unnecessary. The Meta Quest Support feature no longer cares which extra interaction profiles you enable. It does still complain if you don't have one of Touch Controller or Touch Controller Pro in your profile list (even though it is perfectly valid to use neither), but this at least has been downgraded from an error to a warning and won't break builds.

    As above, you still need to enable hand tracking in the Android manifest to work with hand input, but 1.10.0 adds ProvideManifestRequirementExt to the OpenXRFeatureBuildHooks class to make this easier than ever with a custom feature (Unity, if you're reading this, a setting to control the hand tracking permission/version as part of the Meta Quest Support feature itself would be lovely).

    The one other thing I mentioned in the initial post was the OpenXR plugin not respecting the "Ignore build errors" setting exposed by the XR Project Validation window. That appears to still be the case through 1.10.0 (even though the documentation suggests otherwise).