
Question: Unity Hands not showing up in Oculus Quest 2

Discussion in 'VR' started by rikukojima, Jul 7, 2021.

  1. rikukojima

    rikukojima

    Joined:
    Aug 9, 2020
    Posts:
    1
    I'm using Unity 2020.2.1f1 with Oculus Integration 29.0.

    I built a test scene from Oculus Integration (Assets/Oculus/VR/Scenes/HandTest) with an OVRCameraRig, and set Hand Tracking Support under Input to Hands Only. OVRHandPrefab is assigned to both LeftHandAnchor and RightHandAnchor (and I changed the hand type on the right one to Right Hand).

    Now my problem is that when I build the scene to my Quest, the hands aren't showing up, but I can still do the system gesture to close the program.

    I recorded a video of my situation. If anyone knows the solution, please let me know.
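
    Since the system gesture still works, hand tracking itself is probably running, so the problem is more likely on the rendering side. One way to confirm that is a small debug script on each hand prefab. This is only a rough sketch (the script name is made up; it assumes the OVRHand and OVRSkeleton components that the Oculus Integration puts on OVRHandPrefab):

    Code (CSharp):
    using UnityEngine;

    // Rough debug helper (not part of the Oculus Integration): attach to each
    // OVRHandPrefab instance to separate "tracking is off" from
    // "tracking works but the hand mesh isn't rendered".
    public class HandTrackingDebug : MonoBehaviour
    {
        OVRHand hand;
        OVRSkeleton skeleton;
        SkinnedMeshRenderer meshRenderer;

        void Awake()
        {
            hand = GetComponent<OVRHand>();
            skeleton = GetComponent<OVRSkeleton>();
            meshRenderer = GetComponentInChildren<SkinnedMeshRenderer>();
            InvokeRepeating(nameof(LogState), 1f, 1f);
        }

        void LogState()
        {
            if (hand == null) return;
            Debug.Log($"{name}: tracked={hand.IsTracked}" +
                      $" confidence={hand.HandConfidence}" +
                      $" skeletonInitialized={skeleton != null && skeleton.IsInitialized}" +
                      $" rendererEnabled={meshRenderer != null && meshRenderer.enabled}");
        }
    }

    If the log shows tracked=true while nothing is drawn, the issue is in the visual setup (for example the render pipeline/shader or the anchor hierarchy) rather than in tracking itself.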
     
    Last edited: Jul 7, 2021
  2. korinVR

    korinVR

    Joined:
    Jul 7, 2012
    Posts:
    34
    Can you try updating Oculus XR Plugin to 1.10.0-preview.1?
    It has a hand-tracking fix which might explain your situation.

    > Fixed issue #1325113, where hand tracking was not working on Quest/Quest 2 when the Unity splash screen was disabled
     
  3. SiuSiuSiu

    SiuSiuSiu

    Joined:
    Oct 24, 2021
    Posts:
    1
    Hi, I am hitting the same problem. Did you find a solution in the meantime?
     
  4. HaukeCornell

    HaukeCornell

    Joined:
    Oct 15, 2021
    Posts:
    1
    Have you turned hand tracking on in the Oculus Project Settings?
    [Attached screenshot: upload_2021-11-11_18-9-20.png]
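
    In case it helps, that setting can also be changed from an editor script. This is only a sketch based on how I recall the OVRProjectConfig API in the Oculus Integration; the method and enum names (GetProjectConfig, CommitProjectConfig, HandTrackingSupport.HandsOnly) should be verified against the version you have installed:

    Code (CSharp):
    #if UNITY_EDITOR
    using UnityEditor;

    // Sketch: sets hand tracking support in the Oculus project config from code.
    // Assumes OVRProjectConfig.GetProjectConfig()/CommitProjectConfig() and the
    // HandTrackingSupport enum exist in your Oculus Integration version.
    public static class EnableHandTrackingMenu
    {
        [MenuItem("Tools/Oculus/Enable Hands Only")]
        static void EnableHandsOnly()
        {
            var config = OVRProjectConfig.GetProjectConfig();
            config.handTrackingSupport = OVRProjectConfig.HandTrackingSupport.HandsOnly;
            OVRProjectConfig.CommitProjectConfig(config);
        }
    }
    #endif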
     
  5. ubeshkarthick00

    ubeshkarthick00

    Joined:
    Oct 7, 2022
    Posts:
    1
    XR hand tracking is not visible. Kindly give me a solution to overcome this problem.
     
  6. DevDunk

    DevDunk

    Joined:
    Feb 13, 2020
    Posts:
    5,058
    I think it's a different issue. Make a new thread with all the info about your project.
     
  7. hildekerkhoven

    hildekerkhoven

    Joined:
    Mar 16, 2023
    Posts:
    1
    I have the same problem. I am using Unity 2021.3.24f1 with Oculus Integration 50.0. These are different versions, but I was wondering if you ended up fixing the problem?
     
  8. jaewan0114

    jaewan0114

    Joined:
    Apr 4, 2023
    Posts:
    4
    I'm having the same problem.
    [Attached screenshot: upload_2023-5-25_10-41-31.png]
     
  9. ken_cessna

    ken_cessna

    Joined:
    Nov 25, 2013
    Posts:
    1
    Did you use URP? That could be the reason why the hands aren't visible! ;)
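
    If anyone is unsure whether URP is active, or which shader the hand material ended up with, a quick check with plain Unity APIs (nothing Oculus-specific, the script name is made up) can tell you. Built-in-pipeline shaders on the hand material typically render pink or not at all under URP:

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Rendering;

    // Diagnostic sketch: logs the active render pipeline and the shader used by
    // each skinned mesh under this object (e.g. attach to the OVRCameraRig).
    public class RenderPipelineCheck : MonoBehaviour
    {
        void Start()
        {
            var rp = GraphicsSettings.currentRenderPipeline;
            Debug.Log(rp == null ? "Built-in render pipeline" : "Scriptable render pipeline: " + rp.name);

            foreach (var smr in GetComponentsInChildren<SkinnedMeshRenderer>(true))
            {
                if (smr.sharedMaterial != null)
                    Debug.Log(smr.name + " uses shader '" + smr.sharedMaterial.shader.name + "'");
            }
        }
    }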
     
  10. jaewan0114

    jaewan0114

    Joined:
    Apr 4, 2023
    Posts:
    4
    Didn't use...
    [Attached screenshot: upload_2023-5-26_2-49-52.png]
     


  11. dekatron1

    dekatron1

    Joined:
    May 11, 2020
    Posts:
    1
    I had the same problem: the hands were being detected fine, but the hand prefab either didn't appear or was stuck at the origin on the floor. I had the problem both in editor play mode and in the Oculus build. Here are the two things I tried that worked for me.

    Fix 1 (Temporary fix)
    If the hand prefabs appear on the floor (at the origin) but their positions don't match your actual hands, just do a squeeze motion with both hands, like you're grabbing something. The hands then suddenly appear. I'm not sure why this works.

    Fix 2
    If your OVRHandPrefab objects are parented under LeftHandAnchor and RightHandAnchor, move them out of the anchor objects so that they are parented directly under TrackingSpace in the OVRCameraRig.
    [Attached screenshot: 2023-09-02_20-00-59.png]

    After you do this, check the "Update Root Pose" checkbox in the OVRSkeleton script under OVRHandPrefab. Make sure to do this for both hands.
    [Attached screenshot: 2023-09-02_20-00-04.png]
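
    For anyone who wants to catch the hierarchy mistake from Fix 2 automatically, a small startup check like this works with plain Unity APIs (the script name is made up; the expectation that the hands sit directly under TrackingSpace comes from the fix above):

    Code (CSharp):
    using UnityEngine;

    // Sketch: warns at startup if an OVRHand is still parented under a hand anchor
    // instead of directly under TrackingSpace, per the re-parenting fix above.
    public class HandHierarchyCheck : MonoBehaviour
    {
        void Start()
        {
            foreach (var hand in FindObjectsOfType<OVRHand>())
            {
                var parent = hand.transform.parent;
                string parentName = parent != null ? parent.name : "scene root";
                if (parent == null || parent.name != "TrackingSpace")
                    Debug.LogWarning(hand.name + " is parented under '" + parentName +
                                     "'; consider moving it under TrackingSpace and enabling " +
                                     "Update Root Pose on its OVRSkeleton.");
            }
        }
    }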