
Tracked Devices and UI Raycasting

Discussion in 'Input System' started by jdh5259, Apr 23, 2020.

  1. jdh5259

    jdh5259

    Joined:
    Sep 14, 2017
    Posts:
    20
    I am running into an issue interacting with the UI using tracked devices and I was wondering if anyone could provide some insight/guidance. I am using the following combination (version 1.0.0-preview.7):
    • InputSystemUIInputModule
    • Canvas (World Space) with TrackedDeviceRaycaster
    • A tracked device (Oculus Rift hand)
    When I attempt to raycast against my canvas using the tracked device, I do not get any results. I was able to track down the cause: a mismatch in how local and world space positions are handled.

    It appears that the InputSystemUIInputModule treats the tracked device position as a world space position and performs raycasts accordingly. However, my camera rig uses the tracked device positions as local space positions, so the entire rig can move around the scene while maintaining the relative distance between the head and hands. This means that if I move away from the origin and attempt to raycast to a world space UI, the raycaster returns nothing: the input module is still raycasting from a world space position taken directly from the inputs (small numbers, bounded by the play space size), which does not match the actual location of the camera rig.
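
    To illustrate the mismatch (a minimal sketch; `rig` and `trackedDevicePosition` stand in for my rig transform and the position input action):
    Code (CSharp):
        // What the input module raycasts from: the raw tracking-space value.
        Vector3 rawPos = trackedDevicePosition.ReadValue<Vector3>();

        // Where the hand actually is: tracking space transformed by the rig.
        Vector3 actualWorldPos = rig.TransformPoint(rawPos);

        // Once the rig moves off the origin, rawPos != actualWorldPos, so the
        // module's raycast originates from the wrong place and misses the canvas.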

    Obviously this issue is caused by how I am handling my camera rig, but:
    • For VR, is the approach of moving the camera rig around okay, or am I supposed to be moving the world around the camera rig instead?
    • Assuming moving the camera rig around is okay, is it possible to get this UI raycasting working using the InputSystemUIInputModule? I would prefer to use the built-in input module instead of writing a new one just for this case.
    • Am I out of luck and need to use one of the alternatives above?
    Does anyone have any recommendations? Any help is greatly appreciated. Thanks!
     
  2. Matt_D_work

    Matt_D_work

    Unity Technologies

    Joined:
    Nov 30, 2016
    Posts:
    202
    The XR Interaction Toolkit has a great example of how to interact with UI using an interactor. @StayTalm_Unity can provide more details if you need them!
     
  3. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    182
    @jdh5259
    That is definitely a bug. The XR Interaction Toolkit and InputSystemUIInputModule are based on the same patterns; however, because they diverged, some bugs fixed in one didn't get ported over to the other and vice versa.

    You should be allowed to move your rig; in fact, being able to do so is almost essential when working in VR. I'll have to look into the best way to handle that, since Input System bindings are not directly associated with any given GameObject that could convert from local space to world space.
     
  4. jdh5259

    jdh5259

    Joined:
    Sep 14, 2017
    Posts:
    20
    Yeah. This is exactly why I figured I was out of luck when it came to using the InputSystemUIInputModule (in its current state) for this use case. I was able to temporarily work around the issue for my specific case.

    Thanks for getting back to me and acknowledging that it isn't necessarily working as intended.

    Also, just to avoid confusion: the issue described above does not involve the XR Interaction Toolkit.
     
  5. JeffWTD

    JeffWTD

    Joined:
    Dec 17, 2012
    Posts:
    6
    I ran into the same issue and found this thread.
    Is there a quick and dirty workaround to get this working until a proper fix is released?
    Also, is there a bug report or page I can keep an eye on for updates on this?

    Thank you!
     
  6. FlightOfOne

    FlightOfOne

    Joined:
    Aug 1, 2014
    Posts:
    668
    Were you guys able to figure this out? I am still having this problem. Everything works until my rig (or the UI) moves.
     
  7. FlightOfOne

    FlightOfOne

    Joined:
    Aug 1, 2014
    Posts:
    668
    Maybe an override for the pointer origin(s) (a Transform)?
     
    Last edited: Jan 12, 2021
  8. jdh5259

    jdh5259

    Joined:
    Sep 14, 2017
    Posts:
    20
    Unfortunately, I wasn't able to get a generic solution working for my previous issue and had to use an older input module.

    Since then, I have been migrating stuff to the XR Interaction Toolkit, which uses a different input module that supports tracked devices and works with the new Input System.

    Probably not the answer you were hoping to hear but it is all I can really provide as an update from my end.
     
  9. FlightOfOne

    FlightOfOne

    Joined:
    Aug 1, 2014
    Posts:
    668
    Thank you! I will check out the toolkit. I was using my own InputModule but was hoping this would be better than that.

    It's been about 4 years since I started using Unity for VR, and I think it's a little absurd that we still do not have a proper solution, nor much information, for tracked devices interacting with UI. This is such a critical and core component for every game.
     
  10. FlightOfOne

    FlightOfOne

    Joined:
    Aug 1, 2014
    Posts:
    668
    Just want to say thank you! After struggling for 2 days, it finally works with the Toolkit. It is not perfect (it's bloated; I do not want all this other stuff, just the UI interaction) but at least it's a place to start. Hopefully, they will fix the native input module.
     
  11. ShienXIII

    ShienXIII

    Joined:
    Feb 5, 2017
    Posts:
    1
    If anyone is still having this problem, I managed to get it working by setting the Event Camera field of the Canvas to the headset camera.
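    If your canvas lives in a prefab and can't reference the scene camera directly, you can assign it at runtime. A minimal sketch (assuming the headset camera is tagged MainCamera):
    Code (CSharp):
        using UnityEngine;

        public class AssignEventCamera : MonoBehaviour
        {
            void Start()
            {
                // The Inspector's "Event Camera" field is Canvas.worldCamera.
                var canvas = GetComponent<Canvas>();
                canvas.worldCamera = Camera.main;
            }
        }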
     
  12. honWai

    honWai

    Joined:
    Aug 25, 2014
    Posts:
    1
    Hi ShienXIII, can you explain how you set the Event Camera field of the Canvas to the headset camera? I am using a prefab object, and when I click to change the field, there is no camera to select.
     
  13. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    182
    This got fixed in Input System 1.1.0-pre.4.
    The InputSystemUIInputModule now has an 'XR Tracking Origin' field; when set, it transforms all tracked UI positions and rotations from local tracking space to Unity world space.
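    You can set it in the Inspector on the module, or from a script; a minimal sketch (assuming `rig` is your rig's Transform):
    Code (CSharp):
        using UnityEngine.EventSystems;
        using UnityEngine.InputSystem.UI;

        // `rig` is assumed to be the Transform of your camera rig.
        var module = EventSystem.current.GetComponent<InputSystemUIInputModule>();
        module.xrTrackingOrigin = rig; // leave null to keep the old world-space behavior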
     
    jdh5259 and honWai like this.
  14. FlightOfOne

    FlightOfOne

    Joined:
    Aug 1, 2014
    Posts:
    668
    Is this recent? Just to be clear, does this mean, now we can use a tracked device like we used to do with a mouse pointer?
     
  15. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    182
    Yes.
    I actually think the pre.4 version is 'to be released', in that it's coming down the release pipes and will be available soon, but not available yet.

    To be clear, you could use a tracked device like a mouse before, so long as you kept the tracked objects in Unity world space (no parent transform that is off the Unity world origin).

    As well, in the same release, there is a new API in the InputSystemUIInputModule to get the last raycast of a specific input device. This will make it easier to draw rays off the controllers (to see where you are pointing) or to get external access to the hit point. Coming soon! (Not sure of the exact date, but it will be in the next available Input System release.)
     
    FlightOfOne likes this.
  16. cp-

    cp-

    Joined:
    Mar 29, 2018
    Posts:
    78
    Hi @StayTalm_Unity, if I read the changelog correctly, we should also be able to interact with 3D colliders via EventSystem handlers like IPointerClickHandler etc. by using the TrackedDevicePhysicsRaycaster, right?

    I put a TrackedDevicePhysicsRaycaster on the main camera, referenced it, and selected the layers my 3D colliders are on.
    On the controller's XRRayInteractor I selected the same layers for the InteractionLayerMask as well as the RaycastMask.
    The EventSystem has an instance of XRUIInputModule and "Enable XR Input" is checked.

    Still the ray goes right through the colliders.

    For my previous non-VR setup, mouse interaction with the colliders just works with the PhysicsRaycaster (and the same layers in its EventMask).

    Do you maybe have an idea what else has to be in place to make this work?

    Edit: Interaction with world-space canvases works by the way.
     
  17. FlightOfOne

    FlightOfOne

    Joined:
    Aug 1, 2014
    Posts:
    668
    This is great! The only thing is that my tracked devices are child objects of the rig; they are not free-floating in world space. Can we not just use the world position from the controller (tracked device), or does it require a world space (unparented) transform as a parameter?

    Thanks.
     
  18. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    182
    So, if your controllers are not parented to anything, so that the Unity world space position is the same as the position the tracked device reports, then there was no issue before. You can leave the new `XR Tracking Origin` parameter null, and it will work as you'd expect.

    If your controllers *are* parented to something, such as a rig, then the previous issue was that there was no way to tell the UI/Input System how to translate from the tracked device's real world space to Unity world space. You can now set the `XR Tracking Origin` to your rig, and this makes sure that UI raycasting and picking are translated from 'local to rig' to 'Unity world'.

    I'm hoping you can make sense of the above; I'm not sure I explain spaces very well.
     
    FlightOfOne likes this.
  19. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    182
    @cp-
    I know the raycaster can be set to ignore collisions with triggers, which could cause pass-through.
    Sounds like you set the Event Camera, which is good.
    Also, putting the UI GameObject on a layer that's included in the Raycast and Interaction Layer masks on the Ray Interactor is good too.
    And it sounds like pass-through, not just an issue with clicks. I mostly copied off the PhysicsRaycaster, so I don't know offhand.
    I've attached the simplest little working project: it should just show some red interactor rays that go white when you point at a cube, and log pointer enter/exit and Down/Up/Click events. It should at least prove out the routing and basic structure. Maybe it can let you compare and see if you missed something; or, if you have a really simple repro, I can pop it open and see what you could be missing.
     

    Attached Files:

  20. cp-

    cp-

    Joined:
    Mar 29, 2018
    Posts:
    78
    @StayTalm_Unity wow, thanks for the quick response and the demo project; I will check it out immediately!
     
  21. cp-

    cp-

    Joined:
    Mar 29, 2018
    Posts:
    78
    @StayTalm_Unity hey again. I tracked it down and think my issue is with compound colliders. You can easily reproduce it if you move the UIResponder script in your repro project to the parent GameObject: the handlers will not be called in this scenario.
    With default mouse interaction, this works.

    I guess this is due to line 233:
    Code (CSharp):
        var validHit = go.GetComponent<IEventSystemHandler>() != null;
    If we just skip this check, it works:
    Code (CSharp):
        var validHit = true;
    A less drastic change would be:
    Code (CSharp):
        var validHit = go.GetComponentInParent<IEventSystemHandler>() != null;
    ...but I don't know about a) the performance implications and b) whether this check is even needed, as it is not present in the PhysicsRaycaster.
    I guess you need it to report back to the XRRayInteractor so it can change its color on a valid hit?

    Edit: In case anyone else is in dire need of a solution for this, I just uploaded the adapted class as a drop-in replacement for the TrackedDevicePhysicsRaycaster, using the GetComponentInParent approach.
     

    Attached Files:

    Last edited: Jun 8, 2021
  22. FlightOfOne

    FlightOfOne

    Joined:
    Aug 1, 2014
    Posts:
    668
    Makes sense, thanks!
     
  23. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    182
    I'll have to get back to you on this one, but this is my main focus:
    > whether this check is even needed, as it is not present in the PhysicsRaycaster.
    The TrackedDevicePhysicsRaycaster *should* operate identically to the PhysicsRaycaster, so I'll need to chase down why one works with sub-colliders and one does not.

    Edit: Actually @cp-, if you can report it as a bug, that would help keep me honest.
     
  24. FlightOfOne

    FlightOfOne

    Joined:
    Aug 1, 2014
    Posts:
    668
    Hi @StayTalm_Unity ,

    Is this all I need to do for VR to interact with UI?

    1. Set the tracked position and orientation inputs
    2. Add this to canvas: TrackedDeviceRaycaster
    And assuming I have a pointer (e.g. a line renderer) so I can see where it is pointing, everything should work?

    For some reason, it does not seem to be working.

    (screenshots attached)
     
    Last edited: Jun 24, 2021
  25. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    182
    @FlightOfOne
    Quick sanity check: are you on OpenXR or WMR? Off the top of my head, those are the two that have pointer poses and not just device poses.

    That *should* be enough for hover. If you want selection, you will need to bind an XR input to `UI/Click`.
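    For example, a click binding created from code (a sketch; normally you'd just bind this in the Input Actions asset, and `uiModule` and the exact control name depend on your setup):
    Code (CSharp):
        // Assumes: using UnityEngine.InputSystem; using UnityEngine.InputSystem.UI;
        // `uiModule` is a reference to the InputSystemUIInputModule.
        var click = new InputAction(type: InputActionType.Button,
            binding: "<XRController>{RightHand}/triggerPressed");
        uiModule.leftClick = InputActionReference.Create(click);
        click.Enable();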

    Can you hit play and pop open the Input Debugger? Window -> Analysis -> Input Debugger. In play mode there should be a list of actions and their bindings, and you can also sanity check that it's bound.

    Finally, is your rig at the Unity origin? Newer Input System versions (1.1.0-pre.4) have an `XR Tracking Origin` property. Previously, the Input System UI integration took tracking space positions and rotations and put them directly into Unity world space. This new property and bug fix lets you choose a 'rig space', i.e. where to put the real world tracking origin in Unity's world space.

    If all that doesn't work, ping back; maybe I can put together a smallest-case working sample, or debug a small repro that you've got.
     
  26. FlightOfOne

    FlightOfOne

    Joined:
    Aug 1, 2014
    Posts:
    668
    Hi, thanks for the response!

    I am using OpenXR:
    (screenshot attached)

    So, it won't even highlight the button/UI element, let alone click, but yes, I did have this set as well and confirmed the bindings worked. I also tested with IPointerEnterHandler, which gave me no hits.

    I guess I am still unclear on what you mean. It is not free-floating; it is parented to the rig. Here's how it is set up:
    Ignore the XRPointer script (that is my old system); the transform "Right Hand" is just a plain transform, not a tracked pose or anything. Also, there is an EventSystem in the scene (just not in this screenshot).

    (screenshot attached)


    Lastly, I am on Input System 1.1.0-preview.3

    Thanks again! It would be awesome if this would work.
     
  27. FlightOfOne

    FlightOfOne

    Joined:
    Aug 1, 2014
    Posts:
    668
    Hi, @StayTalm_Unity

    So, I tried again with the latest Input System, and the XR Tracking Origin field showed up. This was the missing link. I also figured out how to raycast to get the hit point (i.e. where the laser ends). I am leaving it here hoping it will save some brain cells :)


    Code (CSharp):
        // Make sure the input module converts tracked-space positions using the rig.
        Transform xrRig = transform.root;

        if (!InputModule)
        {
            return;
        }
        else if (xrRig && InputModule.xrTrackingOrigin != xrRig)
        {
            InputModule.xrTrackingOrigin = xrRig;
        }

        // Convert the tracked device's local position to world space.
        Vector3 localPos = trackedDevicePosition.ReadValue<Vector3>();
        Vector3 position = localPos;

        if (transform.parent)
        {
            position = transform.parent.TransformPoint(localPos);
        }

        transform.position = position;
        linePositions[0] = position;

        // Default ray end point: max distance along the device's forward axis.
        Quaternion trackedDeviceRot = trackedDeviceRotation.ReadValue<Quaternion>();
        Vector3 forward = trackedDeviceRot * Vector3.forward;
        Vector3 hitPoint = position + forward * rayDistanceMax;

        // ExtendedPointerEventData <-- new event data type
        extendedPointerEvent.trackedDevicePosition = position;
        extendedPointerEvent.trackedDeviceOrientation = trackedDeviceRot;

        EventSystem.current.RaycastAll(extendedPointerEvent, hits);

        // Canvas intersection point: shorten the ray to the first hit.
        if (hits.Count > 0)
        {
            hitPoint = hits[0].worldPosition;
        }

        hits.Clear();
        linePositions[1] = hitPoint;
    It would be very handy to have all this info in one location, e.g. in a guide. This new system makes it very, very easy to do things with VR + UI, but what's the point of creating these awesome systems if no one knows about them?

    Thanks!
     
    StayTalm_Unity likes this.
  28. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    182
    Hey, sorry I let this fall. Glad to hear that Preview-4 and the xrTrackingOrigin feature helped you out :)
    On the topic of 'this should be documented', I also added a 'GetLastRaycast' function to the InputSystemUIInputModule.
    So you can do something like this:
    Code (CSharp):
        // trackedDevice can be a reference to the device you'd like to check.
        // One easy way to pick one is the `XRController.leftHand` or
        // `XRController.rightHand` static properties.
        InputDevice trackedDevice;
        var hit = m_InputModule.GetLastRaycastResult(trackedDevice.deviceId);
    That avoids having to do the second raycast for the visuals. It should work in a normal Update loop, but be careful with 'EarlyUpdate', since this is just retrieving a cached value from the UI processing pass; too early, and it's one frame behind.
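    For example, to drive a line renderer off the cached result (a sketch; `m_InputModule` and `lineRenderer` are whatever references you hold):
    Code (CSharp):
        var rightHand = UnityEngine.InputSystem.XR.XRController.rightHand;
        if (rightHand != null)
        {
            var hit = m_InputModule.GetLastRaycastResult(rightHand.deviceId);
            if (hit.isValid)
                lineRenderer.SetPosition(1, hit.worldPosition);
        }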

    I'm with you, though; much of the lack of documentation here is on me. I'm not a strong documenter and tend to put it off in favor of more feature development, which makes some of the tricks in the Input System very opaque. I'm working on getting things better documented.

    The way you are doing it is potentially dangerous: it will pick up the greater value of the two hands, because the InputSystemUIInputModule has one binding for all devices and then does a second pass in the InputAction callback to figure out which device/pointer the position/rotation/click events should be assigned to. So it will work if your application binds to the left or right hand exclusively, but if you want to interact with the UI with either hand, you may get some buggy results.
     
    FlightOfOne likes this.
  29. FlightOfOne

    FlightOfOne

    Joined:
    Aug 1, 2014
    Posts:
    668
    Thanks for the new info! I will try that out - it seems even easier. I am only using the right-hand controller's pointer position/rotation, so it hasn't been an issue. The player will have the option to select whether they want a right or left pointer, not both.

    As for the documentation, I understand how it goes with rapid development/changes and docs, and I have yet to find someone who loves documentation, whether reading or writing it, haha. I meant more like a one-page guide (you know, the ones Unity posts from time to time) showing how to make a VR pointer from scratch. That would cover this whole thing. Or one of those videos from @Mike-Geig (lure him into VR) :D

    It is now so, so much easier than what we had to do back in the day. For the first time, I was able to make my AAA-quality pointer using only what Unity provides! And it only took one day (and most of that time was spent looking for information). I think we are headed in the right direction for VR with the new Input System (which is another great feature) and XR Management + OpenXR. Thanks for all the hard work on this!
     
    Last edited: Jul 30, 2021
  30. FlightOfOne

    FlightOfOne

    Joined:
    Aug 1, 2014
    Posts:
    668
    About this: what if I set the input module to the unified pointer behavior? Same result?
     
  31. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    182
    I suspect it won't work well. Running it in my head: the last device to connect would be considered the dominant tracking position (or the right hand if both are connected at start), but the selection/click could be triggered from either hand. But I'm not 100% sure, and I'm not sure it's desirable.

    If you want just one hand for UI pointing, you can use 'usages'. You can tag an InputDevice with extra tags and then use those as filters within your bindings. By default, XR devices have a LeftHand and RightHand usage you can select in the InputAction UI.

    The more advanced technique is that you can create your own usages at runtime, and InputActions will respect those custom usages. If you open the InputAction UI and press the 'T' button next to the binding, you'll see something like `<XRController>{LeftHand}/Trigger`. The {} brackets denote a usage. You can read up on the other elements here.

    Something we have in some work-in-progress systems is to use {DominantHand} and {OffHand} within your paths and then, at runtime, select whether the user is left or right handed. Calling InputSystem.SetDeviceUsage lets you update those usages and select which device you wish to use for a given InputAction.

    A second advanced technique is to use InputActionMap.devices to select which devices you want to restrict binding to. A null array means 'all devices', an empty array means 'no devices', and otherwise it will only bind to devices within that list. That lets you pick an InputDevice reference to use for your action map at runtime. It can also be changed at runtime, although the InputActions must be disabled.
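
    A minimal sketch combining both techniques (assuming Input System 1.1+; the class name and the "UI" action map name are just for illustration):
    Code (CSharp):
        using UnityEngine;
        using UnityEngine.InputSystem;
        using UnityEngine.InputSystem.XR;

        public class DominantHandSelector : MonoBehaviour
        {
            [SerializeField] InputActionAsset actions;

            public void UseRightHand()
            {
                var controller = XRController.rightHand;
                if (controller == null)
                    return;

                // Technique 1: tag the device with a custom usage so bindings
                // like <XRController>{DominantHand}/trigger resolve to it.
                InputSystem.SetDeviceUsage(controller, "DominantHand");

                // Technique 2: restrict a whole action map to this one device.
                // null = all devices, empty array = no devices.
                var map = actions.FindActionMap("UI");
                map.Disable(); // must be disabled while changing devices
                map.devices = new InputDevice[] { controller };
                map.Enable();
            }
        }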
     
    FlightOfOne likes this.
  32. FlightOfOne

    FlightOfOne

    Joined:
    Aug 1, 2014
    Posts:
    668
    @StayTalm_Unity

    Hi, so I ended up using what you recommended (I had started noticing the problem you mentioned), and now all is working well.

    I am having a little issue (and this happened with both methods we talked about) where the mouse pointer has to be locked into the game window for UI interactions to work (it seems clicks do not go through otherwise). If I click off of the game window, the UI stops responding. All other inputs work fine.

    This happens in the editor and in builds. I ended up locking the mouse, but if the user happens to alt+tab for some reason and forgets to click back into the game view (so the mouse can be locked again), the UI stops responding to clicks.

    In short, if the game window is not selected and active (i.e. not in the background), the UI stops responding to clicks.
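
    (The locking I mention is just the standard cursor lock; a minimal sketch:)
    Code (CSharp):
        // Confine and lock the OS cursor to the game window.
        Cursor.lockState = CursorLockMode.Locked;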

    Hope I am making sense and you can give some guidance.

    Thanks!
     
  33. FlightOfOne

    FlightOfOne

    Joined:
    Aug 1, 2014
    Posts:
    668
    @StayTalm_Unity Hi, one more question: I do not see any of the preview Input System packages in the Unity 2021 Package Manager. Is there a way to get these from elsewhere?

    (screenshot attached)
     
  34. FlightOfOne

    FlightOfOne

    Joined:
    Aug 1, 2014
    Posts:
    668
    Just want to leave this here so it might help someone else. If you want both controllers to work with the UI, you must set Pointer Behavior to 'All Pointers As Is'. If you set it to the unified pointer behavior, only the last connected controller (left or right), or the only one connected, will work.
    (screenshot attached)
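    If you prefer to set it from script, a sketch (`uiModule` is assumed to be your InputSystemUIInputModule reference):
    Code (CSharp):
        // Keep each tracked device as its own pointer so both hands work.
        uiModule.pointerBehavior = UnityEngine.InputSystem.UI.UIPointerBehavior.AllPointersAsIs;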
     
  35. dduncanbtw

    dduncanbtw

    Joined:
    Aug 18, 2021
    Posts:
    2
    How do you handle drawing the line from the controller?
    Do you use XR Ray Interactor or a custom component?
    Should I be using the InputSystemUIInputModule or the XRUIInputModule on my Event System?
     
  36. copperrobot

    copperrobot

    Joined:
    May 22, 2013
    Posts:
    69
    @StayTalm_Unity is there a way to raycast to a button using the XR Ray Interactor and, instead of having to pull the trigger, have the button be triggered via an external script? Specifically, I am looking to add a 'touch' screen. The way I am doing so is to detect when the user's finger collides with a 'screen'.

    I was hoping to say 'On collider enter: tell the event system to click whatever is being selected' - but I cannot locate a way to do so.
     
    tenconmar likes this.
  37. tenconmar

    tenconmar

    Joined:
    Mar 15, 2021
    Posts:
    29
    Yeah, I got you. Just use whatever raycast line you have right now and set its render distance to .01 or something small. All you have to do now is set the raycast to constantly "trigger", and you have a touch screen.