Feature Request XR Interaction Toolkit and Hand Tracking 2022

Discussion in 'XR Interaction Toolkit and Input' started by ThomasBERNARD33, Aug 16, 2022.

  1. ThomasBERNARD33

    ThomasBERNARD33

    Joined:
    Apr 20, 2022
    Posts:
    11
    Hello everyone,

I would have liked to post directly in this old thread, but it has been locked.

    Is there any news on hand tracking capabilities for the XR Interaction Toolkit? On the public roadmap it's still marked as planned, the same as nearly two years ago... not exactly "high on our priority list", as Matt_D_work said on 23 January 2020 (public roadmap). I'm sorry if I sound a bit annoyed, but I've been looking for a few weeks and it's hard to find anything on the topic...

    Is there any way today to use hand tracking with the Interaction Toolkit, even with a custom wrapper to make it work? Does anyone know of somebody who does that?

    I've come across different solutions that use a custom interaction toolkit for hand tracking, but they are no longer based on Unity's XR Interaction Toolkit and would not fit my project (recently, UltimateXR started doing it for free).

    Currently, I can easily set up a scene using the XR Interaction Toolkit and add the Oculus Integration package to enable hand tracking in OVRSettings and in the Android Manifest. When I put the Oculus hand prefab under my XR Origin's Left and Right Hand Controllers, I get a hand prefab with fingers moving according to the hand pose, but I get absolutely no tracking of the position of the hand in space. The hand appears at the controller's last known position and reacts only to finger movement.
    [screenshot]

    The issue will probably be resolved in the future by adding a "hand tracked" node in the Input actions...
    [screenshot]

    Anybody have ideas or clues on how to solve this?
     
    PandamoniumDev and andybak like this.
  2. DanMillerU3D

    DanMillerU3D

    Unity Technologies

    Joined:
    May 12, 2017
    Posts:
    26
    The XR Interaction Toolkit does not currently support hand tracking. It's something we're working on, and it's on our roadmap, where you can vote for it and share context.
     
  3. ThomasBERNARD33

    ThomasBERNARD33

    Joined:
    Apr 20, 2022
    Posts:
    11
    I think there is a huge difference between what I understand of your roadmap and what I need. The solution may even have to come from the Oculus/Meta side.
    " [VR/AR] Articulated hand tracking : Use of hand tracking from external cameras "
    It sounds like you are trying to do hand tracking yourself using external cameras (am I wrong?).

    The need is simply for Oculus/Meta to provide a generic way (OpenXR?) to use hand tracking, and for you to add it like any other node in the XRIT.

    @DanMillerU3D any idea of the timeframe for this feature? Two and a half years ago it was on your high-priority list, and it looks like it no longer is. Some paid assets apparently manage to make it work, like UltimateXR, VRIF, and probably others.
     
    YVR_WXJ likes this.
  4. R1PFake

    R1PFake

    Joined:
    Aug 7, 2015
    Posts:
    507
    I was wondering the same thing and found the same old post.

    Sad to read that we still have to import that Oculus package for hand tracking.

    Can we at least add manual support for Oculus Quest 2 hand tracking with https://docs.unity3d.com/2019.3/Documentation/ScriptReference/XR.Hand.html or does it still only work for Magic Leap?

    But as an Oculus Quest 2 dev, it might be better if we migrate to the new official Oculus Interaction toolkit. It means relearning a new toolkit, but I guess it will be worth it in the end, because the Oculus Interaction toolkit will be updated much more frequently and with better Quest feature support.
     
  5. ThomasBERNARD33

    ThomasBERNARD33

    Joined:
    Apr 20, 2022
    Posts:
    11
    Going all-in on Oculus Integration is not a bad take on this, but it's kind of sad not to be able to use more open, cross-compatible tools that many assets link to, like XRIT.

    We are working on a wrapper to bring Oculus Integration into XRIT...

    Let's see how it turns out. Sadly, we probably won't be able to share it because of workplace regulations.
     
  6. VRDave_Unity

    VRDave_Unity

    Unity Technologies

    Joined:
    Nov 19, 2021
    Posts:
    254
    Hey @ThomasBERNARD33
    In terms of the Hand Tracking support, the foundational, cross-platform support is in active development. The current implementation will have the initial flexibility to take advantage of either the OpenXR or Oculus/Meta SDKs as the main provider library. In terms of the full use of hand tracking in XRI for interactions, this work is also in active development alongside the main foundational support. Unfortunately, I cannot give you an exact date, but it's definitely on the 'sooner' side of things.
     
    jakoblacour and Fenikkel like this.
  7. AndyUnityDeveloper

    AndyUnityDeveloper

    Joined:
    Dec 7, 2013
    Posts:
    7
    @ThomasBERNARD33 since you got that far, you could select the "Update Root Pose" setting in the "OVR Skeleton" component of the OVRHandPrefab. This way you will get tracking of the hand in space.

    See also OVRHand.cs, which updates the position by calling OVRPlugin.GetHandState().
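
    In essence, that update boils down to something like this (a minimal sketch, not the actual Oculus code; HandRootPoseFollower is a made-up name):

    Code (CSharp):
    using UnityEngine;

    // Sketch: poll the OVR hand state each frame and apply the root pose to
    // this transform (roughly what OVRSkeleton's "Update Root Pose" does).
    public class HandRootPoseFollower : MonoBehaviour
    {
        public OVRPlugin.Hand hand = OVRPlugin.Hand.HandLeft;
        private OVRPlugin.HandState handState = new OVRPlugin.HandState();

        void Update()
        {
            if (!OVRPlugin.GetHandState(OVRPlugin.Step.Render, hand, ref handState))
                return;
            if ((handState.Status & OVRPlugin.HandStatus.HandTracked) == 0)
                return;

            // OVRPlugin poses use a flipped-Z convention; convert to Unity space.
            transform.localPosition = handState.RootPose.Position.FromFlippedZVector3f();
            transform.localRotation = handState.RootPose.Orientation.FromFlippedZQuatf();
        }
    }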

    Unfortunately this doesn't feed into the interaction toolkit so you won't be able to use it to drive any of its interactions (in my case, I'm interested in pointer rays for UI interactions). I'm looking into a way to bridge the two, if anyone has any ideas...
     
    Atheane likes this.
  8. ThomasBERNARD33

    ThomasBERNARD33

    Joined:
    Apr 20, 2022
    Posts:
    11
    I wrote a custom wrapper to do this. It's a bit jerky, but it was heavily inspired by the XRDeviceSimulator.

    I can't share the code since it was written for a company, but here is the basic principle:

    - You use both the XRRig and the OVRRig at the same time.
    - You cut all input from the XRRig and send it through a custom script from the OVRRig.
    - Your script basically does the exact same thing as the XRDeviceSimulator, but with inputs coming from the OVRRig.

    This way you can choose what you need. I ended up using custom gestures (Interaction SDK) as inputs to replace all my existing button presses. To avoid the latency cost of the XRRig camera, I used a trick: show the OVR camera as Display 1 and disable the Camera component on the XRRig camera. This is just a temporary fix so we can iterate on hand tracking.
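
    To illustrate the principle, here is a rough sketch of the left-hand half (an illustration only, not our actual code; it assumes XRI 2.x's simulated-controller API, and OVRHandToSimulatedController is a made-up name):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.InputSystem;
    using UnityEngine.InputSystem.LowLevel;
    using UnityEngine.XR;
    using UnityEngine.XR.Interaction.Toolkit.Inputs.Simulation;

    // Sketch: feed OVR hand poses and pinches into a simulated XRI controller,
    // the same way XRDeviceSimulator feeds in keyboard/mouse input.
    public class OVRHandToSimulatedController : MonoBehaviour
    {
        private XRSimulatedController leftDevice;
        private XRSimulatedControllerState leftState;

        void OnEnable()
        {
            leftDevice = InputSystem.AddDevice<XRSimulatedController>("XRSimulatedController - Left");
            InputSystem.SetDeviceUsage(leftDevice, UnityEngine.InputSystem.CommonUsages.LeftHand);
            leftState.Reset();
        }

        void OnDisable()
        {
            if (leftDevice != null && leftDevice.added)
                InputSystem.RemoveDevice(leftDevice);
        }

        void Update()
        {
            var handState = new OVRPlugin.HandState();
            if (!OVRPlugin.GetHandState(OVRPlugin.Step.Render, OVRPlugin.Hand.HandLeft, ref handState))
                return;

            bool tracked = (handState.Status & OVRPlugin.HandStatus.HandTracked) != 0;
            leftState.isTracked = tracked;
            leftState.trackingState = (int)(tracked
                ? InputTrackingState.Position | InputTrackingState.Rotation
                : InputTrackingState.None);

            // Copy the hand root pose into the simulated controller pose.
            leftState.devicePosition = handState.RootPose.Position.FromFlippedZVector3f();
            leftState.deviceRotation = handState.RootPose.Orientation.FromFlippedZQuatf();

            // Map an index pinch onto the trigger so XRI select/activate still fires.
            bool indexPinch = (handState.Pinches & OVRPlugin.HandFingerPinch.Index) != 0;
            leftState = leftState.WithButton(ControllerButton.TriggerButton, indexPinch);
            leftState.trigger = indexPinch ? 1f : 0f;

            InputState.Change(leftDevice, leftState);
        }
    }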


    XRDeviceSimulator:
    https://docs.unity3d.com/Packages/c...lkit.Inputs.Simulation.XRDeviceSimulator.html
    From the Unity package in the project: Library\PackageCache\com.unity.xr.interaction.toolkit@2.0.2\Runtime\Inputs\Simulation\XRDeviceSimulator.cs

    Interaction SDK:
    https://developer.oculus.com/documentation/unity/unity-isdk-interaction-sdk-overview/

    [screenshot]
     
    Fenikkel likes this.
  9. AndyUnityDeveloper

    AndyUnityDeveloper

    Joined:
    Dec 7, 2013
    Posts:
    7
    Thank you @ThomasBERNARD33 for the detailed instructions. I ended up writing a custom OpenXR controller device that wraps OVRPlugin. The result should be similar. If anyone needs it I can find some time to share the code here.
     
    Fenikkel likes this.
  10. JakeS_97

    JakeS_97

    Joined:
    Jun 13, 2013
    Posts:
    9
    Hey, I would love to check out the code if you have the time to post it / give some guidance. I'm working on a wrapper for compatibility with both Oculus and Vive hand tracking to interact with XR Interaction Toolkit interactables.
     
  11. AndyUnityDeveloper

    AndyUnityDeveloper

    Joined:
    Dec 7, 2013
    Posts:
    7
    Sure thing! It's the three classes below. Let me know if you run into any issues. When you're done I'd be interested in taking a look at your Vive hand tracking piece too! :)

    Setup:
    - Use the XRRig from XR Interaction toolkit. NOT the OVRCameraRig from Oculus Integration
    - At the moment I still have the OVRManager script enabled on a game object but I don't believe all of it is necessary. I plan to extract just the necessary code. Set Hand Tracking Support to Controllers and Hands.
    - I have Unity 2021.3.9f1, Oculus Integration 43.0, XR Plugin Management 4.2.1, XR Interaction Toolkit 2.1.1
    - Oculus > Tools > OVR Utilities Plugin > Set OVR Plugin to OpenXR
    - Project Settings > XR Plug-in Management > Android > Oculus
    - Project Settings > XR Plug-in Management > Windows > Oculus

    Usage:
    - Add the QuestHandSupport component to a GameObject.
    - Note that when the hands are tracked, the device position/rotation properties are different from the pointer position/rotation, as they should be. So you may want to create additional actions for DevicePosition and DeviceRotation, and add separate GameObjects with their respective XR Controllers (Action-based); see the example bindings after this list. Then put the ray interactors on the pointers, and the hand models on the devices.
    - On all your XR Controllers (Action-based), set the Update Tracking Type to Update. (I don't know why the other settings cause strange behavior.)
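
    For the extra actions mentioned above, the binding paths would look like the following (example only; "<QuestHandDevice>" assumes the Input System's default of registering the layout under the class name):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.InputSystem;

    // Hypothetical example: the same binding paths you would enter by hand in
    // the Input Actions asset for the extra device/pointer pose actions.
    public class QuestHandBindingExample : MonoBehaviour
    {
        private InputAction devicePosition;
        private InputAction pointerPosition;

        void OnEnable()
        {
            // "{LeftHand}" matches the usage assigned in QuestHandDevice.FinishSetup.
            devicePosition = new InputAction(binding: "<QuestHandDevice>{LeftHand}/devicePosition");
            pointerPosition = new InputAction(binding: "<QuestHandDevice>{LeftHand}/pointerPosition");
            devicePosition.Enable();
            pointerPosition.Enable();
        }

        void OnDisable()
        {
            devicePosition.Disable();
            pointerPosition.Disable();
        }
    }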

    QuestHandDevice.cs
    Code (CSharp):
    using System;
    using System.Linq;
    using UnityEngine;
    using UnityEngine.InputSystem;
    using UnityEngine.InputSystem.Controls;
    using UnityEngine.InputSystem.Layouts;
    using UnityEngine.InputSystem.LowLevel;
    using UnityEngine.InputSystem.XR;
    using UnityEngine.Scripting;
    using UnityEngine.XR;
    #if UNITY_EDITOR
    using UnityEditor;
    #endif

    #if UNITY_EDITOR
    [InitializeOnLoad]
    #endif
    [InputControlLayout(displayName = "Quest Hand", stateType = typeof(QuestHandDeviceState))]
    public class QuestHandDevice : XRController, IInputUpdateCallbackReceiver
    {
        public const string INTERFACE_NAME = "QuestHandTrackingInput";

        [Preserve, InputControl(usage = "IsTracked")]
        public new ButtonControl isTracked { get; private set; }

        [Preserve, InputControl(usage = "TrackingState")]
        public new IntegerControl trackingState { get; private set; }

        [Preserve, InputControl(alias = "gripPosition")]
        public new Vector3Control devicePosition { get; private set; }

        [Preserve, InputControl(alias = "gripOrientation")]
        public new QuaternionControl deviceRotation { get; private set; }

        [Preserve, InputControl]
        public Vector3Control pointerPosition { get; private set; }

        [Preserve, InputControl(alias = "pointerOrientation")]
        public QuaternionControl pointerRotation { get; private set; }

        private static bool initialized;

        static QuestHandDevice()
        {
            Initialize();
        }

        [RuntimeInitializeOnLoadMethod]
        public static void Initialize()
        {
            if (initialized)
                return;

            // Register the layout so the Input System can create this device
            // whenever a description with our interface name is added.
            InputSystem.RegisterLayout<QuestHandDevice>(
                matches: new InputDeviceMatcher().WithInterface(INTERFACE_NAME));
            initialized = true;
        }

        protected override void FinishSetup()
        {
            base.FinishSetup();

            isTracked = GetChildControl<ButtonControl>("isTracked");
            trackingState = GetChildControl<IntegerControl>("trackingState");

            devicePosition = GetChildControl<Vector3Control>("devicePosition");
            deviceRotation = GetChildControl<QuaternionControl>("deviceRotation");

            pointerPosition = GetChildControl<Vector3Control>("pointerPosition");
            pointerRotation = GetChildControl<QuaternionControl>("pointerRotation");

            // Assign the left/right-hand usage from the characteristics that
            // QuestHandSupport serialized into the device description.
            var deviceDescriptor = XRDeviceDescriptor.FromJson(description.capabilities);
            if (deviceDescriptor != null)
            {
                if ((deviceDescriptor.characteristics & InputDeviceCharacteristics.Left) != 0)
                    InputSystem.SetDeviceUsage(this, UnityEngine.InputSystem.CommonUsages.LeftHand);
                else if ((deviceDescriptor.characteristics & InputDeviceCharacteristics.Right) != 0)
                    InputSystem.SetDeviceUsage(this, UnityEngine.InputSystem.CommonUsages.RightHand);
            }
        }

        public void OnUpdate()
        {
            OVRPlugin.Hand hand;
            if (usages.Contains(UnityEngine.InputSystem.CommonUsages.LeftHand))
                hand = OVRPlugin.Hand.HandLeft;
            else if (usages.Contains(UnityEngine.InputSystem.CommonUsages.RightHand))
                hand = OVRPlugin.Hand.HandRight;
            else
                throw new ArgumentException("No hand usage assigned to controller");

            var handState = new OVRPlugin.HandState();
            if (!OVRPlugin.GetHandState(OVRPlugin.Step.Render, hand, ref handState))
                return;

            var state = new QuestHandDeviceState();

            bool tracked = (handState.Status & OVRPlugin.HandStatus.HandTracked) != 0;
            state.buttons = (ushort)(tracked ? 1 : 0);
            state.trackingState = (int)(tracked
                ? InputTrackingState.Position | InputTrackingState.Rotation
                : InputTrackingState.None);

            // OVRPlugin poses use a flipped-Z convention; convert to Unity space.
            state.devicePosition = handState.RootPose.Position.FromFlippedZVector3f();
            state.deviceRotation = handState.RootPose.Orientation.FromFlippedZQuatf();
            state.pointerPosition = handState.PointerPose.Position.FromFlippedZVector3f();
            state.pointerRotation = handState.PointerPose.Orientation.FromFlippedZQuatf();

            // One bit per finger, matching OVRPlugin.HandFingerPinch.
            state.fingerIsPinching = (ushort)handState.Pinches;

            if (handState.PinchStrength != null && handState.PinchStrength.Length == (int)OVRPlugin.HandFinger.Max)
            {
                state.thumbPinchStrength = handState.PinchStrength[(int)OVRPlugin.HandFinger.Thumb];
                state.indexPinchStrength = handState.PinchStrength[(int)OVRPlugin.HandFinger.Index];
                state.middlePinchStrength = handState.PinchStrength[(int)OVRPlugin.HandFinger.Middle];
                state.ringPinchStrength = handState.PinchStrength[(int)OVRPlugin.HandFinger.Ring];
                state.pinkyPinchStrength = handState.PinchStrength[(int)OVRPlugin.HandFinger.Pinky];
            }

            InputSystem.QueueStateEvent(this, state);
        }
    }

    QuestHandDeviceState.cs
    Code (CSharp):
    using UnityEngine;
    using UnityEngine.InputSystem.Layouts;
    using UnityEngine.InputSystem.LowLevel;
    using UnityEngine.InputSystem.Utilities;
    using UnityEngine.XR;

    public struct QuestHandDeviceState : IInputStateTypeInfo
    {
        // Four-character code identifying this state format to the Input System.
        FourCC IInputStateTypeInfo.format => new FourCC('Q', 'S', 'T', 'H');

        [InputControl(name = "devicePosition")]
        public Vector3 devicePosition;

        [InputControl(name = "deviceRotation")]
        public Quaternion deviceRotation;

        [InputControl(name = "pointerPosition")]
        public Vector3 pointerPosition;

        [InputControl(name = "pointerRotation")]
        public Quaternion pointerRotation;

        [InputControl(name = "trackingState", usage = "TrackingState", layout = "Integer")]
        public int trackingState;

        [InputControl(name = "isTracked", usage = "IsTracked", layout = "Button", bit = 0)]
        public ushort buttons;

        // One bit per finger; the index pinch is aliased to the trigger so
        // default XRI bindings pick it up.
        [InputControl(name = "thumbIsPinching", layout = "Button", bit = (uint)HandFinger.Thumb)]
        [InputControl(name = "indexIsPinching", layout = "Button", bit = (uint)HandFinger.Index,
            aliases = new[] { "triggerPressed", "indexButton", "indexTouched", "triggerbutton" }, usage = "TriggerButton")]
        [InputControl(name = "middleIsPinching", layout = "Button", bit = (uint)HandFinger.Middle)]
        [InputControl(name = "ringIsPinching", layout = "Button", bit = (uint)HandFinger.Ring)]
        [InputControl(name = "pinkyIsPinching", layout = "Button", bit = (uint)HandFinger.Pinky)]
        public ushort fingerIsPinching;

        [InputControl(name = "thumbPinchStrength", layout = "Axis")]
        public float thumbPinchStrength;

        [InputControl(name = "indexPinchStrength", layout = "Axis")]
        public float indexPinchStrength;

        [InputControl(name = "middlePinchStrength", layout = "Axis")]
        public float middlePinchStrength;

        [InputControl(name = "ringPinchStrength", layout = "Axis")]
        public float ringPinchStrength;

        [InputControl(name = "pinkyPinchStrength", layout = "Axis")]
        public float pinkyPinchStrength;
    }

    QuestHandSupport.cs
    Code (CSharp):
    using System.Linq;
    using UnityEngine;
    using UnityEngine.InputSystem;
    using UnityEngine.InputSystem.Layouts;
    using UnityEngine.InputSystem.XR;
    using UnityEngine.XR;

    public class QuestHandSupport : MonoBehaviour
    {
        private const string LEFT_DEVICE_NAME = "QuestHandDevice Left";
        private const string RIGHT_DEVICE_NAME = "QuestHandDevice Right";

        private bool leftIsAdded;
        private bool rightIsAdded;

        OVRPlugin.HandState leftHandState = new OVRPlugin.HandState();
        OVRPlugin.HandState rightHandState = new OVRPlugin.HandState();

        private void Awake()
        {
            // Make sure the layout is registered before any device is added.
            QuestHandDevice.Initialize();
        }

        private void OnDisable()
        {
            if (leftIsAdded)
                RemoveDevice(LEFT_DEVICE_NAME);

            if (rightIsAdded)
                RemoveDevice(RIGHT_DEVICE_NAME);
        }

        private void Update()
        {
            // Add or remove the Input System devices as hand tracking starts/stops.
            bool leftIsConnected = OVRPlugin.GetHandState(OVRPlugin.Step.Render, OVRPlugin.Hand.HandLeft, ref leftHandState);
            if (leftIsConnected && !leftIsAdded)
            {
                AddDevice(LEFT_DEVICE_NAME, InputDeviceCharacteristics.Left);
                leftIsAdded = true;
            }
            else if (!leftIsConnected && leftIsAdded)
            {
                RemoveDevice(LEFT_DEVICE_NAME);
                leftIsAdded = false;
            }

            bool rightIsConnected = OVRPlugin.GetHandState(OVRPlugin.Step.Render, OVRPlugin.Hand.HandRight, ref rightHandState);
            if (rightIsConnected && !rightIsAdded)
            {
                AddDevice(RIGHT_DEVICE_NAME, InputDeviceCharacteristics.Right);
                rightIsAdded = true;
            }
            else if (!rightIsConnected && rightIsAdded)
            {
                RemoveDevice(RIGHT_DEVICE_NAME);
                rightIsAdded = false;
            }
        }

        private void AddDevice(string name, InputDeviceCharacteristics characteristics)
        {
            // The characteristics are serialized into the description so the
            // device's FinishSetup can assign the left/right-hand usage.
            var deviceDescriptor = new XRDeviceDescriptor { characteristics = characteristics };

            InputSystem.AddDevice(new InputDeviceDescription
            {
                interfaceName = QuestHandDevice.INTERFACE_NAME,
                product = name,
                capabilities = deviceDescriptor.ToJson(),
            });
        }

        public static void RemoveDevice(string name)
        {
            var device = InputSystem.devices.FirstOrDefault(
                x => x.description.interfaceName == QuestHandDevice.INTERFACE_NAME
                     && x.description.product == name);

            if (device != null)
                InputSystem.RemoveDevice(device);
        }
    }
     
    Last edited: Oct 6, 2022
    fdz_ likes this.
  12. HabilityPierre

    HabilityPierre

    Joined:
    Nov 18, 2021
    Posts:
    6
    Hello.
    With the recent rise of "OpenXR awareness" on the HTC and Meta side, has there been any update inside the Unity toolkit? I am myself very interested in cross-platform hand tracking (for the Quest 2 and Vive Focus 3), and it would be awesome not to have to duplicate projects just to modify the hand tracking for each HMD.
    Thanks in advance!
     
    Fenikkel and ImpossibleRobert like this.
  13. AKSamsung

    AKSamsung

    Joined:
    Mar 14, 2023
    Posts:
    1
    Hi. I am new to XR development, and I was exploring the XR Hands package and how the XR Interaction Toolkit uses it in the Hand Interaction Demo. What I don't understand is: how is XRI using XR Hands to interact with the UI/elements? The only XR Hands-specific script I see is PokeGestureDetector.cs, which just detects whether the hand is making the poke gesture and turns the ray interactor on/off accordingly.

    But when I run the app on a Quest device (Quest Pro), the hands are able to grab, pinch, and poke UI elements and 3D objects.

    I want to achieve this same effect, but in the editor (in Play mode without any device connected, similar to a simulator).

    One example of an issue I am facing: when I run the app on Quest, objects go into the hover state as soon as the tip or any part of the hand interacts with the object (and the same for Use). But when I run it in the editor in Play mode, the 3D object only enters hover when the sphere collider intersects with it. See this post on r/unity3d with pictures of what I mean.
     
  14. koirat

    koirat

    Joined:
    Jul 7, 2012
    Posts:
    2,009
    So does it support hand tracking already?
    Should I be able to just pinch and move the objects that are in the HandsDemoScene?
     
  15. ericprovencher

    ericprovencher

    Unity Technologies

    Joined:
    Dec 17, 2020
    Posts:
    146
    Yes, that's supported now. Pinching currently requires the Meta Aim extension, as does ray-based aiming.

    I set up a demo repo with everything configured, if you want to replicate it in your project.
     
    koirat likes this.
  16. koirat

    koirat

    Joined:
    Jul 7, 2012
    Posts:
    2,009
    Thank you for answering.
    So can I assume it works only with Quest headsets for now?
    And maybe other devices from Meta?
     
  17. ericprovencher

    ericprovencher

    Unity Technologies

    Joined:
    Dec 17, 2020
    Posts:
    146
    Hand tracking works with any headset that supports the OpenXR Hand Tracking extension, but that only provides the skeleton. Poking should work, but pinching and rays require the Meta Aim extension. That second extension may be implemented by other vendors, but since it is a Meta extension, we've only tested it on Meta headsets.
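
    For anyone who wants to read the raw skeleton, querying the hand subsystem looks roughly like this (a sketch assuming the com.unity.xr.hands package is installed; HandJointLogger is a made-up name):

    Code (CSharp):
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.XR.Hands;

    // Sketch: find the running XRHandSubsystem and read one joint pose from it.
    public class HandJointLogger : MonoBehaviour
    {
        private XRHandSubsystem handSubsystem;

        void Update()
        {
            if (handSubsystem == null || !handSubsystem.running)
            {
                var subsystems = new List<XRHandSubsystem>();
                SubsystemManager.GetSubsystems(subsystems);
                handSubsystem = subsystems.Count > 0 ? subsystems[0] : null;
                if (handSubsystem == null)
                    return;
            }

            XRHand rightHand = handSubsystem.rightHand;
            if (!rightHand.isTracked)
                return;

            // Joint poses are reported relative to the XR Origin.
            XRHandJoint indexTip = rightHand.GetJoint(XRHandJointID.IndexTip);
            if (indexTip.TryGetPose(out Pose pose))
                Debug.Log($"Right index tip at {pose.position}");
        }
    }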
     
    koirat likes this.
  18. feathered_wing

    feathered_wing

    Joined:
    Dec 5, 2020
    Posts:
    20
    Does gesture tracking currently only support all-in-one devices? After I build for PC VR, it doesn't work properly. Thanks.
     
  19. ericprovencher

    ericprovencher

    Unity Technologies

    Joined:
    Dec 17, 2020
    Posts:
    146
    We only support what the runtimes expose. The hand subsystem does work more broadly, though. If you're looking to support hand tracking on PC VR with Meta devices, I invite you to reach out on their forums.
     
    feathered_wing likes this.
  20. flipwon

    flipwon

    Joined:
    Dec 29, 2016
    Posts:
    179
    Is there any intention of adding a "grab" as well? The only thing stopping me from switching from the Oculus toolkit is not having both the pinch and grab gestures out of the box.
     
  21. pepeInLudus

    pepeInLudus

    Joined:
    Jan 18, 2022
    Posts:
    4
    I am unable to see the hands in the editor with your project as-is, not even with the Oculus options activated that you mention in the repository, neither in the HandVisualizer scene nor in your interactions scene. I am using a Quest 2 over Air Link. Any idea what it could be, or whether a Quest update may have broken it?
     
  22. ericprovencher

    ericprovencher

    Unity Technologies

    Joined:
    Dec 17, 2020
    Posts:
    146
    Have you enabled all the OpenXR options in the Oculus desktop PC app? They're not active by default.

    Note that hands are only supported over Link in Unity, not in standalone Windows builds.
     
  23. pepeInLudus

    pepeInLudus

    Joined:
    Jan 18, 2022
    Posts:
    4
    I guess you are talking about these:

    All options enabled.

    The only way I can see hand tracking working is in an Android build of the HandVisualizer sample scene. It isn't working in the Unity Editor no matter which platform is active.
     
  24. unity_andrewc

    unity_andrewc

    Unity Technologies

    Joined:
    Dec 14, 2015
    Posts:
    201
    Hi @pepeInLudus - you may be missing the same project setup you did for hand tracking to work on Android: repeat it in the Standalone ("Windows, Mac, Linux") build settings. The editor uses the Standalone build settings when playing in-editor.
     
    ericprovencher likes this.
  25. pepeInLudus

    pepeInLudus

    Joined:
    Jan 18, 2022
    Posts:
    4
    Thank you, good to know. I had it set up this way already, but it still isn't working. I give up on this for now; when a new version comes out or something, I'll try again. Thank you both.
     
  26. Helmi172

    Helmi172

    Joined:
    Apr 7, 2020
    Posts:
    10
    Hey, is it possible to get hand tracking to work on the Vive Focus 3? And can you develop your own project with it, including interactions (e.g. grab)?
     
  27. ammars26

    ammars26

    Joined:
    Nov 7, 2016
    Posts:
    12
    Any way to recognize our custom hand gestures or expressions?