Question Hand tracking not working in build

Discussion in 'XR Interaction Toolkit and Input' started by momo5ms1, May 2, 2023.

  1. momo5ms1

    momo5ms1

    Joined:
    Feb 10, 2021
    Posts:
    20
    Hello everyone,
    I'm using the XR Interaction Toolkit hand-tracking feature in my project, and I noticed that hand tracking does not work when I build the project. I read somewhere that I need to change the OpenXR backend type to legacy as a solution, but I can't manage to find where to change this value. So where can I find it? Or, if there's any other way to make hand tracking work in a build, I'll take it.
    Cordially.
     
  2. jimmyd14

    jimmyd14

    Joined:
    Jun 10, 2017
    Posts:
    3
  3. momo5ms1

    momo5ms1

    Joined:
    Feb 10, 2021
    Posts:
    20
    Thank you @jimmyd14, but I don't think this is the same problem. The main issue I'm facing is that when I build my project and run it on an Oculus Quest 2, hand tracking doesn't work. This has happened twice: first when I was using the Oculus SDK for hand tracking, and now with the new XR Interaction Toolkit hand tracking. In both cases hand tracking doesn't work in a build (I'm building for the Windows platform). I saw in a thread somewhere that I need to change the OpenXR backend type to legacy, but I didn't manage to find where to change that value.
     
  4. Qleenie

    Qleenie

    Joined:
    Jan 27, 2019
    Posts:
    765
    Hand tracking on PCVR with Oculus only works if you are in dev mode in the Oculus software. Then it also works in a build.
     
  5. momo5ms1

    momo5ms1

    Joined:
    Feb 10, 2021
    Posts:
    20
    Thank you for replying, @Qleenie, but it doesn't seem to work in a build. I tried several ways to solve this, building from different Unity versions and on different PCs, but it didn't work even in a development build.
     
  6. ericprovencher

    ericprovencher

    Unity Technologies

    Joined:
    Dec 17, 2020
    Posts:
    157
    So first, you need to make sure you have enabled the hand subsystem and the aim extension on the Android build target.

    If you want to test in the Editor, then you need to enable developer mode on your Oculus account and enable the OpenXR developer features in the PC app.

    I have an example project with everything configured that should help:
    https://github.com/provencher/OpenXR_XRI_Sample
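
    As a quick sanity check that the subsystem made it into a build, a script along these lines can log whether a hand subsystem is present and running (a sketch assuming the com.unity.xr.hands package is installed; the class name and log messages are illustrative, not from this thread):

    ```csharp
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.XR.Hands;

    // Illustrative helper: attach to any GameObject to log hand-subsystem state at startup.
    public class HandTrackingCheck : MonoBehaviour
    {
        void Start()
        {
            var subsystems = new List<XRHandSubsystem>();
            SubsystemManager.GetSubsystems(subsystems);

            if (subsystems.Count == 0)
            {
                // No subsystem usually means the Hand Tracking Subsystem feature
                // is not enabled for this build target under
                // Project Settings > XR Plug-in Management > OpenXR.
                Debug.LogWarning("No XRHandSubsystem found in this build.");
                return;
            }

            foreach (var subsystem in subsystems)
            {
                Debug.Log($"Hand subsystem '{subsystem.subsystemDescriptor.id}' running: {subsystem.running}");
            }
        }
    }
    ```

    If the subsystem exists in the Editor but not in the build, that points at per-build-target OpenXR feature settings rather than the scene setup.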
     
    momo5ms1 likes this.
  7. momo5ms1

    momo5ms1

    Joined:
    Feb 10, 2021
    Posts:
    20
    Thank you @ericprovencher, I will try it, compare the two projects, and tell you if it solves the problem. But I'm sure the hand subsystem is enabled on the Android build target.
     
    ericprovencher likes this.
  8. ericprovencher

    ericprovencher

    Unity Technologies

    Joined:
    Dec 17, 2020
    Posts:
    157
    Did you enable hands in your Quest settings on the device?
     
  9. momo5ms1

    momo5ms1

    Joined:
    Feb 10, 2021
    Posts:
    20
    @ericprovencher Yes, and it's working in the Editor but not when I build for PC.
     
  10. ericprovencher

    ericprovencher

    Unity Technologies

    Joined:
    Dec 17, 2020
    Posts:
    157
    Oh, that's normal. Hand tracking only works in the Editor and in Android builds, not in standalone builds.
     
  11. momo5ms1

    momo5ms1

    Joined:
    Feb 10, 2021
    Posts:
    20
    That seems clear now, but can I ask why, or whether it will be available in the future? I was using Oculus SDK hand tracking when I first faced this issue; now I'm using the XR Interaction Toolkit and the problem persists.
     
  12. ericprovencher

    ericprovencher

    Unity Technologies

    Joined:
    Dec 17, 2020
    Posts:
    157
    This is an issue to direct to the Oculus forums. They support hand tracking in the Editor to enable faster iteration, but supporting it in PC apps hasn't been a priority for them.

    Funnily enough, though, if you compile your app for OpenXR on PC, you should be able to emulate hands with the OpenXR Toolkit. It modifies the runtime to surface features that engine integrations don't normally expose.
    https://mbucchia.github.io/OpenXR-Toolkit/
     
  13. momo5ms1

    momo5ms1

    Joined:
    Feb 10, 2021
    Posts:
    20
    Thank you @ericprovencher, I managed to make it work. I think the problem was in the Oculus desktop app, where some features were not enabled. Now it works fine. Thanks a lot.
     
    ericprovencher likes this.
  14. bmdenny

    bmdenny

    Joined:
    Jul 3, 2012
    Posts:
    8
    Is standalone hand tracking something that is going to be implemented in the future?
     
  15. momo5ms1

    momo5ms1

    Joined:
    Feb 10, 2021
    Posts:
    20
    Sorry for not giving the last update on this thread. @bmdenny, I solved it as Eric said: I set up the developer options again, as in this screenshot, and everything works fine. Enabling Developer Runtime Features seems to have solved it. Hope this helps.
     
    bmdenny and VRDave_Unity like this.
  16. tom2918

    tom2918

    Joined:
    Jun 4, 2018
    Posts:
    8
    I'm facing the same problem using a Quest 3, but the desktop app only shows the first three settings, up to "Demo Mode". Is there another way to activate "Developer Runtime Features"?
     
  17. tom2918

    tom2918

    Joined:
    Jun 4, 2018
    Posts:
    8
    Found the solution: a developer account is needed.
     
    ericprovencher likes this.