
Oculus VR Touch controller hand position is at an offset

Discussion in 'AR/VR (XR) Discussion' started by mynameisook, Feb 6, 2018.

  1. mynameisook

    Joined:
    May 3, 2016
    Posts:
    3
    I'm working on spawning objects based on where the user's hands currently are with the Oculus Touch controllers. The position I'm spawning objects at is the one returned from OVRInput.GetLocalControllerPosition(OVRInput.Controller.RTouch) (and LTouch); however, the newly spawned objects end up far below the touch controllers. Does anyone have any idea what might be happening here?
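    For reference, this is roughly what I'm doing (simplified; spawnPrefab is a placeholder name):

    Code (CSharp):
    // Simplified sketch of the spawn call, inside a MonoBehaviour.
    // spawnPrefab is a placeholder for the prefab being spawned.
    void SpawnAtRightHand(GameObject spawnPrefab)
    {
        Vector3 handPos = OVRInput.GetLocalControllerPosition(OVRInput.Controller.RTouch);
        Instantiate(spawnPrefab, handPos, Quaternion.identity); // ends up far below the controller
    }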
     
  2. StickyHoneybuns

    Joined:
    Jan 16, 2018
    Posts:
    207
    I don't have experience using Oculus, but your situation doesn't sound Oculus-specific. First, are these objects that you made yourself in a 3D modeling app? If so, I would check your object's origin. Nine times out of ten, when objects aren't spawning correctly for me it has to do with the origin's orientation/location. You should be able to google instructions specific to your modeling program to figure this out.

    If your origin is correct, however, I would run Debug.Log(OVRInput.GetLocalControllerPosition(OVRInput.Controller.RTouch)); and simply compare the logged value to the transform in the Inspector. This should give you plenty of ideas for solving the issue.

    Of course, there is always the band-aid approach of simply offsetting your spawned object's transform.position by the offset you are experiencing.
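    Something like this (untested sketch; spawned and the offset value are placeholders you'd fill in from the logs):

    Code (CSharp):
    // Untested sketch: log where the controller reports itself so you can
    // compare it against the spawned object's transform in the Inspector.
    Vector3 controllerPos = OVRInput.GetLocalControllerPosition(OVRInput.Controller.RTouch);
    Debug.Log("Controller position: " + controllerPos);

    // Band-aid: nudge the spawned object by the measured offset (placeholder value).
    spawned.transform.position += new Vector3(0f, 1.6f, 0f);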
     
  3. Dmano

    Joined:
    Feb 15, 2017
    Posts:
    1
    This is caused by the floor position originally set in the Oculus app when you first set up your Oculus Rift.

    Here is my fix for the same issue:
    1. Create an empty transform.
    2. In the Inspector, reset the transform, i.e. set its position and rotation to x=0, y=0, z=0.
    3. If you have an OVRPlayerController or OVRCameraRig, make sure it is also at the origin, because it determines where your touch controllers are.
    4. Attach the transform you created as a child of the touch controller hand anchor, either in the Inspector or in script. If in script, make a public Transform variable and drag and drop your transform into it.
    5. Now, instead of using OVRInput.GetLocalControllerPosition, just read the position from your child transform, i.e. transform.position, and set your instantiated/spawned objects' positions to it (see the sketch below).
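    A minimal sketch of step 5, assuming the child transform has been dragged into handAnchor (the names are placeholders):

    Code (CSharp):
    using UnityEngine;

    public class SpawnAtHand : MonoBehaviour
    {
        public Transform handAnchor; // child transform parented to the touch controller anchor
        public GameObject prefab;    // the object to spawn

        public void Spawn()
        {
            // handAnchor.position is already in world space, so no conversion is needed.
            Instantiate(prefab, handAnchor.position, handAnchor.rotation);
        }
    }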
     
    MjStrwy likes this.
  4. pgfinke

    Joined:
    Aug 19, 2019
    Posts:
    1
    This thread is somewhat old, and the answer provided by @Dmano above works. But as I am currently starting to learn the Oculus SDK in Unity and this was one of my top hits when searching for the command in question, I wanted to provide a solution that may be a little more satisfying.

    OVRInput.GetLocalControllerPosition returns the position of the controller in the coordinate frame of the tracking space, i.e. in the local space of the game object called "Tracking Space" in the hierarchy of OVRPlayerController. The position argument passed to the Instantiate method, however, is required to be in world space, that is, relative to the world origin, Vector3.zero. To instantiate the game object at the right position, we thus need to convert from local space to world space. Unity provides us with Transform.TransformPoint to transform points/positions from local to world space and Transform.TransformDirection to transform directions/rotations from local to world space; the documentation for both can be found in the Unity Scripting Reference.

    Finally, an (untested) example, where trackingSpace should be set to the transform of the "Tracking Space" game object via the editor.

    Code (CSharp):
    public class InstantiateObject : MonoBehaviour
    {
        public Transform trackingSpace; // reference to the tracking space
        public OVRInput.Controller controller; // the controller to instantiate the object at
        public GameObject toInstantiate; // the game object to instantiate

        public void Spawn()
        {
            Vector3 position = trackingSpace.TransformPoint(OVRInput.GetLocalControllerPosition(controller));
            Vector3 rotation = trackingSpace.TransformRotation(OVRInput.GetLocalControllerRotation(controller));
            Instantiate(toInstantiate, position, rotation);
        }
    }
     
  5. beccannlittle

    Joined:
    Mar 6, 2016
    Posts:
    1
    pgfinke's solution is correct, but the quaternions/eulers are a little mixed up (Transform has no TransformRotation method, and Instantiate expects a Quaternion). Here's a tested/working version:

    Code (CSharp):
    public class InstantiateObject : MonoBehaviour
    {
        public Transform trackingSpace; // reference to the tracking space
        public OVRInput.Controller controller; // the controller to instantiate the object at
        public GameObject toInstantiate; // the game object to instantiate

        public void Spawn()
        {
            Vector3 position = trackingSpace.TransformPoint(OVRInput.GetLocalControllerPosition(controller));
            Vector3 rotation = trackingSpace.TransformDirection(OVRInput.GetLocalControllerRotation(controller).eulerAngles);
            Instantiate(toInstantiate, position, Quaternion.Euler(rotation));
        }
    }
    I'm sure there's a more efficient way to collapse all the angle conversions, but oh well.
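    For what it's worth, the conversions can probably be collapsed by composing the quaternions directly (untested sketch):

    Code (CSharp):
    // Untested: bring the controller's local rotation into world space by
    // composing it with the tracking space's rotation.
    Quaternion rotation = trackingSpace.rotation * OVRInput.GetLocalControllerRotation(controller);
    Instantiate(toInstantiate, position, rotation);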
     
  6. giantkilleroverunity3d

    Joined:
    Feb 28, 2014
    Posts:
    383
    So how does one use this to refer to the left-hand and right-hand transforms?
    In two of my projects I was using the left-hand and right-hand transforms just fine. Now I get no data after the last Quest update.
    Oh, wait.
    Do I use this script twice, dragging the Tracking Space into trackingSpace and the left and right controllers into the controller vars?
     
    Last edited: Nov 9, 2019
  7. dragon376_unity

    Joined:
    Nov 22, 2019
    Posts:
    5
    @beccannlittle, this works really well, thanks.

    giantkilleroverunity3d, exactly, you use this twice:
    Code (CSharp):
    Vector3 rightPosition = trackingSpace.TransformPoint(OVRInput.GetLocalControllerPosition(OVRInput.Controller.RTouch));
    Vector3 leftPosition = trackingSpace.TransformPoint(OVRInput.GetLocalControllerPosition(OVRInput.Controller.LTouch));
    This was tested on an Oculus Quest.
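    If you also need rotations, composing the quaternions should work the same way (untested):

    Code (CSharp):
    // Untested: world-space rotations for each controller.
    Quaternion rightRotation = trackingSpace.rotation * OVRInput.GetLocalControllerRotation(OVRInput.Controller.RTouch);
    Quaternion leftRotation = trackingSpace.rotation * OVRInput.GetLocalControllerRotation(OVRInput.Controller.LTouch);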
     
  8. Cazforshort

    Joined:
    Feb 22, 2016
    Posts:
    16

    Any thoughts on how this would work with the Quest's hand tracking?

    I'm thinking it's going to be something with:
    Code (CSharp):
    PointerPose.localPosition = _handState.PointerPose.Position.FromFlippedZVector3f();
    PointerPose.localRotation = _handState.PointerPose.Orientation.FromFlippedZQuatf();
     
  9. Cazforshort

    Joined:
    Feb 22, 2016
    Posts:
    16
    Can confirm this works as is.
     
  10. devinshay2009

    Joined:
    Feb 23, 2020
    Posts:
    2
    so confusing