
Other How to Oculus Quest Hand Tracking PointerPose Pinch

Discussion in 'VR' started by supernamey923834, Sep 16, 2021.

  1. supernamey923834

    supernamey923834

    Joined:
    Jul 27, 2017
    Posts:
    10
    Install the Oculus Integration package:
    https://developer.oculus.com/downloads/package/unity-integration/31.2
    Version 32 is broken from what I can tell.

    Add the OVRCameraRig prefab > find the hand tracking support enum (options like Controllers Only, Controllers And Hands) and pick the controllers-and-hands one.

    OVRCameraRig in scene > drill down to RightHandAnchor and add the OVRHandPrefab.
    In the prefab's components, change all the hand-type enums to Right Hand if you're setting up the right hand.
    Enable the Mesh Renderer and Skinned Mesh Renderer if they're off. You can also set up a custom hand mesh, but that's a whole other tutorial.

    SUPER make sure the OVRCameraRig transform is reset: position 0,0,0, rotation 0,0,0, scale 1,1,1.
    EVERYTHING breaks otherwise; there is no way to move it by default for now.

    Add a UI Button to the scene, make its Canvas World Space, and futz with it till you get it scaled down and into the camera's view.
    Roughly: width 181.4188, height 51, scale 0.02755119, button position x/y 0.
    On the World Space Canvas, add an OVRRaycaster (stupid name, should be raycatcher).
    Turn off the Graphic Raycaster.

    Add the UIHelpers prefab to the scene.
    Under UIHelpers > HandedInputSelector, turn this component off; its script forces the EventSystem onto the wrong transform.

    Patch the OVRHand.cs file:
    find "private GameObject _pointerPoseGO;" and make it "public GameObject _pointerPoseGO;".
    Go down to Awake() and add, right after the new GameObject() line:
    _pointerPoseGO.name = $"{nameof(PointerPose)}_{HandType}";
    The name is not necessary, but it super helps with debugging in editor mode.
    https://forums.oculusvr.com/t5/Unit...ixes-for-the-Oculus-Integration-28/m-p/864374
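    After the patch, the relevant part of OVRHand.cs looks roughly like this (a sketch; the exact surrounding Awake() code varies by Integration version):

    Code (CSharp):

    // OVRHand.cs -- sketch of the patch described above
    public GameObject _pointerPoseGO;   // was: private GameObject _pointerPoseGO;

    private void Awake()
    {
        // ...existing setup...
        _pointerPoseGO = new GameObject();
        _pointerPoseGO.name = $"{nameof(PointerPose)}_{HandType}";  // the added line
        PointerPose = _pointerPoseGO.transform;
        // ...rest of Awake...
    }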


    Make a script like this roughly

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.EventSystems;  // OVRInputModule lives in this namespace

    public class HandPointerLike : MonoBehaviour
    {
        public OVRInputModule _OVRInputModule;
        public OVRRaycaster _OVRRaycaster;
        public OVRHand _OVRHand;

        void Start()
        {
            // Point the UI input ray at the hand's pointer pose
            _OVRInputModule.rayTransform = _OVRHand.PointerPose;
            _OVRRaycaster.pointer = _OVRHand.PointerPose.gameObject;
        }

        void Update()
        {
            // Reassign every frame in case something else stomps on it
            _OVRInputModule.rayTransform = _OVRHand.PointerPose;
            _OVRRaycaster.pointer = _OVRHand.PointerPose.gameObject;
        }
    }
    Link up _OVRInputModule from UIHelpers > EventSystem.
    Link up _OVRRaycaster from the world canvas object you made earlier.
    Link up _OVRHand from OVRCameraRig > ... > OVRHandPrefab.
    Turn on UIHelpers > LaserPointer > Line Renderer if it's off.

    Hit Play and select the EventSystem; see if its Ray Transform property is pointing to the PointerPose object.
    If not, flip a table and fix something.

    If it works, build to the device and test. The laser should line out from roughly your palm into your viewable space, and pinching seems to act as a press by default (don't know where to turn this off yet when it's not needed). The button should react with a color change.
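    For reference, you can also read the pinch state directly in a script via OVRHand. A minimal sketch (the component reference is an assumption; drag it in from the OVRHandPrefab):

    Code (CSharp):

    using UnityEngine;

    // Sketch: logs when the index-finger pinch starts and stops
    public class PinchLogger : MonoBehaviour
    {
        public OVRHand _OVRHand;  // assign from the OVRHandPrefab
        private bool _wasPinching;

        void Update()
        {
            bool isPinching = _OVRHand.GetFingerIsPinching(OVRHand.HandFinger.Index);
            if (isPinching != _wasPinching)
            {
                Debug.Log(isPinching ? "Pinch start" : "Pinch end");
                _wasPinching = isPinching;
            }
        }
    }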

    The key to this is setting the two lines of code above, which feels like a total hack, but most of Oculus Integration feels that way already. The examples are messy, old code; there are two forms of Hands and no good naming distinction between them; and getting a single bone by id does not seem to be a thing, so you have to loop with the enums.
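    The bone loop mentioned above looks roughly like this with OVRSkeleton (a sketch; names are from the Integration around v31 and may differ in other versions):

    Code (CSharp):

    using UnityEngine;

    // Sketch: find one bone by id by looping, since there is no direct lookup
    public class BoneFinder : MonoBehaviour
    {
        public OVRSkeleton _OVRSkeleton;  // assign from the OVRHandPrefab

        public Transform FindBone(OVRSkeleton.BoneId id)
        {
            foreach (var bone in _OVRSkeleton.Bones)
            {
                if (bone.Id == id)
                    return bone.Transform;
            }
            return null;
        }
    }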

    Anyway, it's been a year and there are ZERO tutorials on how to do this, just two notes in this forum and on Oculus's, which is a wasteland. The one example scene in the folder is HandsInteractionTrainScene, which is a bit more convoluted to sift through.

    ..............

    fun
     
    Last edited: Sep 17, 2021
  2. clooock

    clooock

    Joined:
    May 23, 2021
    Posts:
    4
    Thank you very much!
    I followed your guide and everything works, but I noticed that I can click on a button only by pinching with the right hand (even if the raycast is on the left hand). Is that normal? I don't know how to fix it.
     
  3. zhare86

    zhare86

    Joined:
    Mar 28, 2017
    Posts:
    30
    Thank you, this helped me today.

    I had to do two more things though:
    1. Delete the EventSystem that gets automatically created with the Canvas.
    2. Under UIHelpers, on LaserPointer game object, enable the LineRenderer component.
     
  4. Lorrieto

    Lorrieto

    Joined:
    Apr 16, 2020
    Posts:
    2
    So one thing to note: if OVRInputModule gives you an error, you need to add the namespace "namespace UnityEngine.EventSystems {" and close it at the end of your class. Add it before public class HandPointerLike : MonoBehaviour.
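    In other words (a sketch, with the class body elided):

    Code (CSharp):

    // OVRInputModule lives in the UnityEngine.EventSystems namespace,
    // so wrapping the class there lets the type resolve
    namespace UnityEngine.EventSystems
    {
        public class HandPointerLike : MonoBehaviour
        {
            // ...class body from the first post...
        }
    }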
     
  5. talha-safdar

    talha-safdar

    Joined:
    May 28, 2022
    Posts:
    7