
Question: Hand tracking not working in builds

Discussion in 'XR Interaction Toolkit and Input' started by momo5ms1, May 2, 2023.

  1. momo5ms1

    momo5ms1

    Joined:
    Feb 10, 2021
    Posts:
    20
    Hello everyone,
    I'm using the XR Interaction Toolkit hand tracking feature in my project, and I noticed that hand tracking doesn't work when I build the project. I read somewhere that I need to change the OpenXR backend type to legacy as a solution, but I can't manage to find where to change this value. So where can I find it? Or if there's any other alternative for making hand tracking work in a build, I'll accept it.
    Cordially.
     
  2. jimmyd14

    jimmyd14

    Joined:
    Jun 10, 2017
    Posts:
    6
  3. momo5ms1

    momo5ms1

    Joined:
    Feb 10, 2021
    Posts:
    20
    Thank you @jimmyd14, but I don't think this is the same problem. The main issue I'm facing is that when I build my project and run it on an Oculus Quest 2, hand tracking does not work. This has happened twice: first when I was using the Oculus SDK for hand tracking, and now with the new XR Interaction Toolkit hand tracking. In both cases hand tracking does not work in the build (I'm building for the Windows platform). I saw in a thread somewhere that I need to change the OpenXR backend type to legacy, but I didn't manage to find where to change that value.
     
  4. Qleenie

    Qleenie

    Joined:
    Jan 27, 2019
    Posts:
    866
    Hand tracking on PCVR with Oculus only works if you are in developer mode in the Oculus software. Then it also works in a build.
     
  5. momo5ms1

    momo5ms1

    Joined:
    Feb 10, 2021
    Posts:
    20
    Thank you for replying @Qleenie, but it doesn't seem to be working in a build. I tried several ways to solve this, and I built it with different Unity versions and on different PCs, but it didn't work even in a development build.
     
  6. ericprovencher

    ericprovencher

    Unity Technologies

    Joined:
    Dec 17, 2020
    Posts:
    261
    First, you need to make sure you've enabled the hand subsystem and the aim extension on the Android build target.

    If you want to test in the Editor, then you need to enable developer mode on your Oculus account and enable the OpenXR developer features in the PC app.

    I have an example project with everything configured that should help.
    https://github.com/provencher/OpenXR_XRI_Sample
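
    As a quick sanity check (a minimal sketch, assuming the com.unity.xr.hands package is installed; the script name is just for illustration), something like this will log whether the hand subsystem was actually created at runtime:

    Code (CSharp):
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.XR.Hands;

    // Logs whether an XRHandSubsystem was created and is running.
    // Attach to any GameObject in the scene.
    public class HandSubsystemCheck : MonoBehaviour
    {
        void Start()
        {
            var subsystems = new List<XRHandSubsystem>();
            SubsystemManager.GetSubsystems(subsystems);

            if (subsystems.Count == 0)
            {
                // Usually means the Hand Tracking Subsystem feature isn't
                // enabled for the active build target in the OpenXR settings.
                Debug.LogWarning("No XRHandSubsystem found.");
                return;
            }

            foreach (var subsystem in subsystems)
                Debug.Log($"XRHandSubsystem found, running: {subsystem.running}");
        }
    }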
     
    momo5ms1 likes this.
  7. momo5ms1

    momo5ms1

    Joined:
    Feb 10, 2021
    Posts:
    20
    Thank you @ericprovencher, I will try it, compare the two projects, and tell you if it solves the problem. But I'm sure that the hand subsystem is enabled on the Android build target.
     
    ericprovencher likes this.
  8. ericprovencher

    ericprovencher

    Unity Technologies

    Joined:
    Dec 17, 2020
    Posts:
    261
    Did you enable hands in your Quest settings on the device?
     
  9. momo5ms1

    momo5ms1

    Joined:
    Feb 10, 2021
    Posts:
    20
    @ericprovencher Yes, and it's working in the Editor but not when I build it for PC.
     
  10. ericprovencher

    ericprovencher

    Unity Technologies

    Joined:
    Dec 17, 2020
    Posts:
    261
    Oh, that's normal. Hand tracking only works in the Editor and in Android builds, not in standalone builds.
     
  11. momo5ms1

    momo5ms1

    Joined:
    Feb 10, 2021
    Posts:
    20
    That seems clear now, but can I ask why, or whether it will be available in the future? I was using Oculus SDK hand tracking when I first faced this issue; now I'm using the XR Interaction Toolkit and the problem persists.
     
  12. ericprovencher

    ericprovencher

    Unity Technologies

    Joined:
    Dec 17, 2020
    Posts:
    261
    This is an issue to direct to the Oculus forums. They support hand tracking in the Editor to enable faster iteration times, but supporting it in PC apps hasn't been a priority for them.

    Funnily enough, though, if you compile your app for OpenXR on PC, you should be able to emulate hands with the OpenXR Toolkit. It modifies the runtime to surface features that engine integrations don't normally expose.
    https://mbucchia.github.io/OpenXR-Toolkit/
     
  13. momo5ms1

    momo5ms1

    Joined:
    Feb 10, 2021
    Posts:
    20
    Thank you @ericprovencher, I managed to make it work. I think the problem was in the Oculus desktop app, where some features were not enabled. Now it works fine, thanks a lot.
     
    ericprovencher likes this.
  14. bmdenny

    bmdenny

    Joined:
    Jul 3, 2012
    Posts:
    8
    Is standalone hand tracking something that is going to be implemented in the future?
     
  15. momo5ms1

    momo5ms1

    Joined:
    Feb 10, 2021
    Posts:
    20
    Sorry for not giving the final update on this thread. @bmdenny I solved it as Eric said: I set up the developer options again as in this screenshot, and everything works fine. Enabling "Developer Runtime Features" seems to have solved it. Hope this helps.
     
    bmdenny and VRDave_Unity like this.
  16. tom2918

    tom2918

    Joined:
    Jun 4, 2018
    Posts:
    8
    I'm facing the same problem using a Quest 3, but the desktop app shows only the first three settings, up to "Demo Mode". Is there another way to activate "Developer Runtime Features"?
     
  17. tom2918

    tom2918

    Joined:
    Jun 4, 2018
    Posts:
    8
    Found the solution: a developer account is needed.
     
    ericprovencher likes this.
  18. BernieRoehl

    BernieRoehl

    Joined:
    Jun 24, 2010
    Posts:
    80
    I'm running into the same problem as everyone else. Oculus hand tracking no longer works in the Unity Editor, which makes development of hand-tracking apps basically impossible.

    In the Quest settings, I've got "Hand and body tracking" enabled, as well as "Auto Switch From Controllers to Hands". Hand tracking works on the Quest, and I can put down the controllers and interact using my hands.

    In the Oculus app on the PC side I'm using the Public Test Channel, and I've enabled "Developer runtime features".

    In Unity, my build target is Android. I'm using the latest available packages, as shown in the screenshot below. I have the Oculus OpenXR runtime selected, and I've got the Hand Tracking Subsystem and Meta Hand Tracking Aim feature groups enabled, also shown in the screenshot below.

    upload_2023-12-12_7-31-11.png


    (the Android tab shows the same, and I've also got Meta Quest Support enabled there)

    The hands are not being tracked. I do see the two menu icons following my hands, but I suspect those are being drawn by the headset. However, they do demonstrate that hand tracking is enabled on the Quest.

    I've opened up the HandVisualizer script and added some debugging statements. The OnTrackingAcquired method is never called, and in the OnUpdatedHands method (which is indeed called), the subsystem.leftHand.isTracked and subsystem.rightHand.isTracked values are always false.
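
    For reference, this is roughly the instrumentation I added (a paraphrased sketch against the XRHandSubsystem events in com.unity.xr.hands, not the exact HandVisualizer code):

    Code (CSharp):
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.XR.Hands;

    // Subscribes to the hand subsystem events and logs per-hand tracking state.
    public class HandTrackingDebug : MonoBehaviour
    {
        XRHandSubsystem m_Subsystem;

        void Update()
        {
            if (m_Subsystem != null)
                return;

            // Poll until the subsystem shows up, then hook the events once.
            var subsystems = new List<XRHandSubsystem>();
            SubsystemManager.GetSubsystems(subsystems);
            if (subsystems.Count == 0)
                return;

            m_Subsystem = subsystems[0];
            m_Subsystem.trackingAcquired += hand =>
                Debug.Log($"Tracking acquired: {hand.handedness}");   // never fires for me
            m_Subsystem.updatedHands += (subsystem, flags, updateType) =>
                Debug.Log($"left: {subsystem.leftHand.isTracked}, " +
                          $"right: {subsystem.rightHand.isTracked}"); // always false, false
        }
    }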

    Have I missed anything, or is this a bug in Unity or on the Oculus end?
     
  19. ericprovencher

    ericprovencher

    Unity Technologies

    Joined:
    Dec 17, 2020
    Posts:
    261
    When you're working in the Editor, the OpenXR settings that get used are the ones on the Standalone build target. Be sure to enable the hand subsystem and Meta aim extensions there as well.
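
    If you want to confirm what's actually active when you press Play, a quick sketch like this (assuming the OpenXR plug-in and com.unity.xr.hands are installed; the script name is illustrative) will log the state of both features for the build target the Editor is using:

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR.OpenXR;
    using UnityEngine.XR.Hands.OpenXR;

    // Logs whether the two relevant OpenXR features are enabled for the
    // active build target (the Standalone settings when running in the Editor).
    public class OpenXRFeatureCheck : MonoBehaviour
    {
        void Start()
        {
            var settings = OpenXRSettings.Instance;
            if (settings == null)
            {
                Debug.LogWarning("No OpenXRSettings; is the OpenXR plug-in active?");
                return;
            }

            var handTracking = settings.GetFeature<HandTracking>();
            var metaAim = settings.GetFeature<MetaHandTrackingAim>();
            Debug.Log($"Hand Tracking Subsystem enabled: {handTracking != null && handTracking.enabled}");
            Debug.Log($"Meta Hand Tracking Aim enabled: {metaAim != null && metaAim.enabled}");
        }
    }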
     
  20. BernieRoehl

    BernieRoehl

    Joined:
    Jun 24, 2010
    Posts:
    80
    When you say standalone build target, do you mean the settings shown in the Android tab of OpenXR? If so, I have the feature groups enabled there as well. See screenshot below.

    upload_2023-12-12_12-1-35.png
     

  21. ericprovencher

    ericprovencher

    Unity Technologies

    Joined:
    Dec 17, 2020
    Posts:
    261
    No, I mean the Standalone tab in that menu, like so. This is where the Editor behavior is configured.
    upload_2023-12-12_12-42-20.png
     
    Verdant88 likes this.
  22. BernieRoehl

    BernieRoehl

    Joined:
    Jun 24, 2010
    Posts:
    80
    Right, but if you check my previous screenshot you'll see that I already have those two feature groups (Hand Tracking Subsystem and Meta Hand Tracking Aim) enabled.
     
  23. ericprovencher

    ericprovencher

    Unity Technologies

    Joined:
    Dec 17, 2020
    Posts:
    261
    Ah, I missed that.

    In that case, the only other step I can recommend is to confirm that you're using the OpenXR runtime in the XR Plug-in Management settings for the Windows Standalone build target.
    upload_2023-12-12_16-51-43.png
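
    To double-check at runtime which plug-in provider actually initialized (a small sketch, assuming com.unity.xr.management; the script name is illustrative):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR.Management;

    // Logs which XR loader initialized; with OpenXR active this should
    // report the OpenXR loader rather than the Oculus one.
    public class ActiveLoaderCheck : MonoBehaviour
    {
        void Start()
        {
            var instance = XRGeneralSettings.Instance;
            var manager = instance != null ? instance.Manager : null;
            if (manager == null || manager.activeLoader == null)
            {
                Debug.LogWarning("No active XR loader; XR may not have initialized.");
                return;
            }

            Debug.Log($"Active XR loader: {manager.activeLoader.name}");
        }
    }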
     
  24. BernieRoehl

    BernieRoehl

    Joined:
    Jun 24, 2010
    Posts:
    80
    Yeah, I am:

    upload_2023-12-12_21-53-25.png

    So, I guess it's a bug. The thing is, I'm not sure whether the bug is with Unity or with the Oculus runtime.

    I'll try grabbing your GitHub example. Either it'll work or it won't, and either way it'll help narrow down the problem.
     
  25. BernieRoehl

    BernieRoehl

    Joined:
    Jun 24, 2010
    Posts:
    80
    ... and the results are interesting.

    I don't see hands when I run your example either. The controllers work, but when I put them down my hands do not appear.

    So it doesn't seem to be anything as simple as an incorrect setting in Unity, and it also doesn't appear to be a Unity bug (unless it was introduced in a very recent version of Unity -- I'm running 2022.3.12f1, which is only slightly ahead of what you used for your example).

    I do see the two menus attached to my palms (hamburger menu tracking my left hand and Oculus menu tracking my right hand), so I know that hand tracking is working on the Oculus side.

    I definitely have "Developer runtime features" enabled:

    upload_2023-12-12_22-12-13.png


    Out of curiosity... when was the last time you tested it? I'm wondering if Oculus changed their runtime and broke something.

    I'm using a Quest 2 for testing (I also have a Quest 1 and a Quest Pro, but I ultimately need to get it working on the Quest 2).
     
  26. Verdant88

    Verdant88

    Joined:
    Jul 2, 2019
    Posts:
    4
    I had the same experience as BernieRoehl. I was building for Android and wondering why hand tracking wasn't working. I was getting this error: "Hand tracking subsystem not found, can't subscribe to hand tracking status. Enable that feature in the OpenXR project settings and ensure OpenXR is enabled as the plug-in provider." I realized I had only enabled the hand tracking subsystem for Android, so I enabled it for Windows as well. Now I can see hand tracking in the Editor and it seems to be working.
     
    momo5ms1 likes this.
  27. xxluky

    xxluky

    Joined:
    Dec 4, 2014
    Posts:
    19
    Is this thread still active? Is it possible to see my hands using XR Hands in Windows builds yet? Does anybody know?
    I can track my hands in the Unity Editor, but I am not able to get it working in a standalone build...

    EDIT: OK, I finally got it working. I couldn't find the developer mode option in the standalone app. I had to enable developer mode in the phone app first in order to see that option in the standalone app as well.
    After enabling developer mode, I can track my hands in a standalone build.
     
    Last edited: Jan 31, 2024
    momo5ms1 likes this.
  28. EmpireStudios

    EmpireStudios

    Joined:
    Oct 23, 2018
    Posts:
    24
    I'm trying to do an Android build, but hands don't show in the app. I can see my hands and use them to open the app on my Quest 2, but when I open it using a hand gesture I get a popup that says "Switch to controllers. To use this app, pick up the controllers." When I open the app with the controllers, I can see the controllers and everything works fine. I'm using a template, so I know all of the settings are accurate: I've installed the XR Hands demo and built the HandVisualizer scene, but no hands appear. The build works fine, but no hands.

    upload_2024-2-12_11-7-24.png