
Question Can't figure out hand tracking in Vision Pro

Discussion in 'AR' started by jeffcrouse, May 8, 2024.

  1. jeffcrouse

    Joined: Apr 30, 2010
    Posts: 20
    I have successfully built and deployed many of the example scenes from visionOSTemplate-1.2.3.zip. I am especially interested in the MixedReality sample.

    mixed_reality_demo.gif
    In an effort to understand the project setup, I then tried to create a project that just implements hand tracking, as shown in the MixedReality example. I followed the "Create a Project from Scratch/Mixed Reality" instructions, then added the pieces I thought were relevant from the MixedReality sample.

    Here's what I did. (I'm on a 2020 MacBook Air M1 running Sonoma 14.4.1.)

    1. Create a new project in the hub using Universal3D template and Unity 2022.3.26f1
    2. Go to Project Settings > XR Plug-in Management and click "Install XR Plug-in Management"
    3. Enable the "Apple visionOS" Plug-in Provider in the visionOS tab and wait for it to be installed
    4. When I get the popup warning about "This project uses the new input system package, but the native platform backends for the new input system are not enabled in the player settings... Do you want to enable the backends?" I click "Yes"
    5. In the "Apple visionOS" section of "XR Plug-in Management", set the App Mode to "Virtual Reality - Fully Immersive Space" and when I am prompted to "Install PolySpatial", I click "Yes"
    6. Add "Hand Tracking Usage Description" and "World Sensing Usage Description"
    7. Under "Project Validation", I choose "visionOS MR - Volume", and then "Fix All". This adds an AR Session to the default SampleScene and disables the Splash Screen
    8. While still in Project Settings, go to the "Player" section and change the Company Name so that the project will build properly
    9. Go to Build Settings and change the Build Target to visionOS
    [NOTE: this is where the "Create a visionOS Project from Scratch" instructions end, so the rest I am just guessing or copying from visionOSTemplate-1.2.3.zip]
    10. I noticed that the "Input Settings Package" in visionOSTemplate-1.2.3.zip looks different from mine, so I copied "InputSystem.inputsettings.asset" over from visionOSTemplate-1.2.3.zip
    11. Double-check that the following packages have been installed: com.unity.polyspatial, com.unity.polyspatial.visionos, com.unity.polyspatial.xr, com.unity.xr.hands
    12. I add a VolumeCamera to the scene and create a Volume Camera Window Configuration (in the Resources folder), and assign it to the VolumeCamera
    13. I replicate the "XR Origin" setup from the Mixed Reality sample:
    - Empty GameObject called "XR Origin"
    - Make the Main Camera a child of XR Origin, reset its Transform, and change the clipping planes to 0.1/20
    - Add a "Tracked Pose Driver (Input System)" to the Main Camera and replicate the "centerEyePosition" and "devicePosition" actions in Position Input and Rotation Input. [Side Note: Is this necessary for hand tracking in visionOS?]
    14. Also add the "HandManager" object as a child of XR Origin
    15. Copy over the "Hand Visualizer" script and assign it to HandManager (see the sanity-check sketch after this list)
    16. Copy over the Joint Prefab and assign it to Hand Visualizer > Joint Prefab
    17. Add a cube to the scene, just to make sure it's actually rendering something
    18. Build and deploy to Vision Pro
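
    Sanity check (my own addition, not something from the sample): assuming com.unity.xr.hands is installed and the Apple visionOS loader is active, a small script like the one below, dropped on any GameObject in the scene, should log the right index fingertip position once the hand subsystem is running and a hand is tracked. If nothing gets logged, the problem would be in the subsystem/permission setup rather than in the Hand Visualizer itself.

    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.XR.Hands;

    // Minimal hand-tracking sanity check (hypothetical helper, not part of the template).
    public class HandTrackingCheck : MonoBehaviour
    {
        XRHandSubsystem m_Subsystem;

        void Update()
        {
            // Lazily find a running XRHandSubsystem.
            if (m_Subsystem == null || !m_Subsystem.running)
            {
                var subsystems = new List<XRHandSubsystem>();
                SubsystemManager.GetSubsystems(subsystems);
                foreach (var s in subsystems)
                {
                    if (s.running)
                    {
                        m_Subsystem = s;
                        break;
                    }
                }
                if (m_Subsystem == null)
                    return;
            }

            // Log the right index fingertip pose when the hand is tracked.
            var rightHand = m_Subsystem.rightHand;
            if (!rightHand.isTracked)
                return;

            var indexTip = rightHand.GetJoint(XRHandJointID.IndexTip);
            if (indexTip.TryGetPose(out Pose pose))
                Debug.Log($"Right index tip at {pose.position}");
        }
    }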

    The app builds and deploys, but all I can see is a red cube in my room, and my hands aren't being tracked like they are in the MixedReality sample. What am I doing wrong?

    vision_05.gif

    Could I just start with the visionOSTemplate? Of course. But I'm worried that whatever I am missing will come back to haunt me somehow. So I want to understand how to create the project from scratch.

    Thanks in advance
     
  2. andyb-unity

    Unity Technologies
    Joined: Feb 10, 2022
    Posts: 1,151
    I asked the XR Hands team about this, and they suggested that you ask this question in the visionOS discussions space. From our FAQ:

     
  3. jeffcrouse

    Joined: Apr 30, 2010
    Posts: 20