Resolved OpenXR standard controller orientation

Discussion in 'VR' started by neginfinity, Dec 10, 2021.

  1. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,561
    Using Unity 2021.2.5f1, Oculus Quest 1 with Virtual Desktop, OpenXR plugin with Oculus Touch Controller configuration.

    Is there such a thing as a "standard controller orientation" in OpenXR?

    Basically, when I run the VR example, the controllers I see in the headset are rotated relative to the actual controller positions in my hands. I'm talking about the sample controller models attached to nodes driven by TrackedPoseDriver.

    After attaching debug geometry to the controllers, I figured out that -Y points forward, Z points up, X points right, and the pivot is located somewhere within the controller's grip.

    Is this orientation the same for all OpenXR devices?
     
  2. the_real_apoxol

    the_real_apoxol

    Unity Technologies

    Joined:
    Dec 18, 2020
    Posts:
    467
    Unfortunately no, at least with the `grip` pose. The `aim` pose (pointerPosition/pointerRotation) is a bit more consistent, as it is based on the actual direction of aim rather than the controller's grip. There are some extensions in the works that will hopefully make this better eventually.
     
    neginfinity likes this.
  3. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,561
    Thanks for the response.

    In this case, I'm referring to GameObjects driven by the Tracked Pose Driver component that comes along with XR, I believe. In my case, the vector pointing FORWARD seems to be the negative Y axis (transform.down of the driven object).
    https://docs.unity3d.com/Packages/c...yEngine.InputSystem.XR.TrackedPoseDriver.html

    Your words imply that TrackedPoseDriver uses the grip pose exposed through the input system, and there's no way to guarantee that the grip pose orientation will be the same on other XR devices.

    Did I understand this right?

    So, basically... does that mean I should implement my own controller that uses the pointer position from the XR controllers if I want a reasonable expectation that, when the build is tested on another headset, the orientation of the "hands" will be roughly the same?
     
  4. the_real_apoxol

    the_real_apoxol

    Unity Technologies

    Joined:
    Dec 18, 2020
    Posts:
    467
    You can just use a different input action in the Tracked Pose Driver that references the aim pose instead of the grip pose. But yes, unfortunately the grip pose is not standard enough; the aim pose generally seems to be a little more reliable.
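
    To spell out what that looks like in practice: in the Unity OpenXR controller profiles, the grip pose is surfaced as devicePosition/deviceRotation and the aim pose as pointerPosition/pointerRotation. A sketch of the bindings the Tracked Pose Driver's actions would point at (exact control paths can vary by plugin version, so treat these as illustrative):

    Position action binding: <XRController>{RightHand}/pointerPosition
    Rotation action binding: <XRController>{RightHand}/pointerRotation

    Binding devicePosition/deviceRotation instead would give the grip pose back.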
     
    neginfinity likes this.
  5. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,561
    Ah, I see now. There are positionInput and rotationInput properties that are not visible in the inspector but are accessible through code.

    I kinda wonder if their not being visible is a glitch.
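
    For reference, a minimal sketch of overriding those properties from code (assuming an Input System version where positionInput/rotationInput have public setters, as described above; the component and action names here are illustrative):

    Code (csharp):
        using UnityEngine;
        using UnityEngine.InputSystem;
        using UnityEngine.InputSystem.XR;

        public class AimPoseOverride : MonoBehaviour {
            // Actions assumed to be bound to pointerPosition/pointerRotation in the inspector.
            [SerializeField] InputActionProperty aimPosition;
            [SerializeField] InputActionProperty aimRotation;

            void OnEnable() {
                var driver = GetComponent<TrackedPoseDriver>();
                if (driver != null) {
                    // Swap the driver's pose source over to the aim pose actions.
                    driver.positionInput = aimPosition;
                    driver.rotationInput = aimRotation;
                }
            }
        }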
     
  6. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,561
    Well, this is very amusing, but:

    * There's no way to specify that I want the pointer position in the TrackedPoseDriver inspector.
    [attached screenshot: upload_2021-12-12_15-33-10.png]

    My initial idea was to hold a reference to it in the inspector of a custom component and override positionAction/rotationAction with the ones from the input asset. BUT.

    I can't save a reference to the TrackedPoseDriver. It does not serialize and is automatically reset back to null:
    [attached screenshot: upload_2021-12-12_15-36-43.png]
    ^^^ I can drag a TrackedPoseDriver in there, but it does not save.


    I tried to enumerate all the TrackedPoseDrivers and find one with the right/left controller:
    I tried to enumerate all the TrackedPoseDrivers and find the one with the right/left controller:
    Code (csharp):
        var wands = GetComponentsInChildren<UnityEngine.InputSystem.XR.TrackedPoseDriver>();
        if (wands.Length > 0){
            foreach(var x in wands){
            }
        }
    But apparently there's no equivalent of the PoseSource dropdown exposed to the scripting side?

    So, what now?

    Do I now need to subclass BasePoseProvider just to get the pointer data?
    And do I even need the TrackedPoseDriver at all if I (apparently) can pull the data through the Input System?
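
    On that last point: a Tracked Pose Driver isn't strictly required; the pose actions can be polled directly each frame. A rough sketch of that approach (action bindings and names are placeholders, and this assumes the transform lives under the XR rig so a local pose is correct):

    Code (csharp):
        using UnityEngine;
        using UnityEngine.InputSystem;

        public class ManualPointerPose : MonoBehaviour {
            // E.g. bound to <XRController>{RightHand}/pointerPosition and pointerRotation.
            [SerializeField] InputActionProperty pointerPos;
            [SerializeField] InputActionProperty pointerRot;

            void OnEnable() {
                pointerPos.action?.Enable();
                pointerRot.action?.Enable();
            }

            void Update() {
                // Apply the polled tracking-space pose to this transform.
                transform.localPosition = pointerPos.action.ReadValue<Vector3>();
                transform.localRotation = pointerRot.action.ReadValue<Quaternion>();
            }
        }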
     
  7. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,561
    Last edited: Dec 12, 2021
    hippocoder likes this.
  8. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,561
    Alright, so, to summarize it.

    * Apparently the VR Template is using outdated, or "legacy", components. That's not very helpful.
    * Switching to the "new" and "proper" Input System components wasn't hard, but...
    * For some reason, using references for actions in the Tracked Pose Driver doesn't work. I think I'm missing something here (nothing happens when the control is initialized).
    * The default device position/rotation for the right/left controller is located roughly in the middle of my palm for the Oculus controller, and it is aligned with the orientation of the grip. It looks like the grip pose is intended for a gun grip.
    * The pointer position is close to where an extended index finger would be, although tilted upwards a bit. It mostly correctly points forward in a decent orientation.
     
  9. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,561
    Alright. This is solved. I can work from here.

    For those interested, it is possible to do this sort of thing.

    First you declare the TrackedPoseDrivers as [SerializeField] fields and assign the controllers to them.
    Code (csharp):
        [SerializeField] UnityEngine.InputSystem.XR.TrackedPoseDriver leftWand;
        [SerializeField] UnityEngine.InputSystem.XR.TrackedPoseDriver rightWand;
        [SerializeField] UnityEngine.InputSystem.XR.TrackedPoseDriver hmdObj;
    That's used in conjunction with the action map class auto-generated from the Input Actions asset.
    Code (csharp):
        VRInputActions vrInput;
    Then, within OnEnable, you can do this:
    Code (csharp):
        public void OnEnable(){
            if (vrInput == null){
                vrInput = new VRInputActions();
            }
            vrInput.Enable();

            if (leftWand){
                //leftWand.rotationAction = vrInput.VRControls.LCtrlRot;
                //leftWand.positionAction = vrInput.VRControls.LCtrlPos;
                leftWand.positionAction = vrInput.VRControls.LPointerPos;
                leftWand.rotationAction = vrInput.VRControls.LPointerRot;
            }

            if (rightWand){
                //rightWand.positionAction = vrInput.VRControls.RCtrlPos;
                //rightWand.rotationAction = vrInput.VRControls.RCtrlRot;
                rightWand.positionAction = vrInput.VRControls.RPointerPos;
                rightWand.rotationAction = vrInput.VRControls.RPointerRot;
            }

            if (hmdObj){
                hmdObj.positionAction = vrInput.VRControls.HMDCenterEyePos;
                hmdObj.rotationAction = vrInput.VRControls.HMDCenterEyeRot;
            }
        }
    VRControls is an action map and "RPointerPos" etc. are actions declared in the Input Actions asset, which is then generated into a C# class. When things are done this way, the Tracked Pose Drivers can use whatever you declared as the pose input in the action map as their source.
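
    For completeness, the kind of bindings such an asset would contain (illustrative; the action names come from the code above, and the control paths from the OpenXR controller profiles):

        VRControls/LPointerPos → <XRController>{LeftHand}/pointerPosition
        VRControls/LPointerRot → <XRController>{LeftHand}/pointerRotation
        VRControls/RPointerPos → <XRController>{RightHand}/pointerPosition
        VRControls/RPointerRot → <XRController>{RightHand}/pointerRotation
        VRControls/HMDCenterEyePos → <XRHMD>/centerEyePosition
        VRControls/HMDCenterEyeRot → <XRHMD>/centerEyeRotation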

    @the_real_apoxol You guys might want to check your documentation on XR configuration. The current docs are a bit outdated and refer to menus that are currently gone.
     
  10. the_real_apoxol

    the_real_apoxol

    Unity Technologies

    Joined:
    Dec 18, 2020
    Posts:
    467
    Sorry, should have mentioned it was a different TrackedPoseDriver :p
     
  11. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    You've lost track.