
Unity's UI and XR input

Discussion in 'AR/VR (XR) Discussion' started by plmx, May 3, 2019.

  1. plmx

    plmx

    Joined:
    Sep 10, 2015
    Posts:
    308
    Hi,

    simple question: Is there any native Unity support for using a world-space canvas UI with controllers based on Unity's XR namespace? If not, what are the steps to be able to use a controller as I would use a mouse in a normal canvas? I would like to have the controller(s) send clicks and move/drag/drop events to the canvas based on their actual position.

    The only definitive source on this I found was this Oculus article (https://developer.oculus.com/blog/unitys-ui-system-in-vr/), but it is nearly 4 years old and I suspect/hope that some things have changed since then (and, also, I would like to remain SDK-independent and only use Unity).

    Philip
     
    bemrlab likes this.
  2. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    182
    Hello!
    I replied in another thread, but since it was a tangent there, it makes sense to do the same here.

    A generalized solution exists, and it is on its way through our testing and shipping processes. Sorry I can't just drop it here, but stay tuned, and I'll reply to this thread once it's available.
     
    bemrlab and ROBYER1 like this.
  3. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    182
    Moving this over to your own thread:

    Soooo, within 2019 is the plan at the moment, but I don't make those plans, so they can change.
    And it is similar to the Oculus solution you linked, in that it allows 3D tracked devices to trigger world-space UI as if they were mice. So you can use dropdowns, scrollbars, buttons, textboxes, etc. Each tracked device, mouse, and touch is treated independently, so you can still use Spectator mode to have both player and spectator UIs independently controllable, or have one hand using a scrollbar and the other pressing a button.

    It is UI-based, so it uses the EventSystem, replaces things like the StandaloneInputModule, and has a custom UI Graphic Raycaster for 3D devices.
     
    ROBYER1 likes this.
  4. plmx

    plmx

    Joined:
    Sep 10, 2015
    Posts:
    308
    @StayTalm_Unity Thanks for your reply, that sounds pretty good. I think I will go with a minimal custom implementation for now and switch once it is available. Thanks, looking forward to the notification ;-)
     
  5. joelybahh

    joelybahh

    Joined:
    Mar 18, 2015
    Posts:
    66
    I am curious about when this rolls out too.

    I wrote a VR input module, but it seems to be in a constant battle with the StandaloneInputModule; they both kind of work together, but the desktop UI interactions only work when I am constantly moving my mouse.

    When watching the event system debug info, I can see my XR input module keeps overriding the normal one unless I'm constantly moving my mouse; then it gets even weirder when both the VR and desktop users try to interact with their respective UIs at the same time.
     
  6. joelybahh

    joelybahh

    Joined:
    Mar 18, 2015
    Posts:
    66
    I managed to get it working for both VR and Desktop UI interactions simultaneously.

    I had to manually call Process on the StandaloneInputModule from my XRInputModule, because the EventSystem component just grabs the first enabled BaseInputModule it finds and calls Process on that one. Bit of a hack, but it's working great.

    I also tried the same thing by inheriting from the StandaloneInputModule instead, but that doesn't work because of private variables that I need but can't access.

    So I rewrote an 'almost' exact copy of the StandaloneInputModule to work with Unity's generic XR inputs and desktop.
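    The gist of the hack looks roughly like this (a trimmed sketch, not my exact module; the ordering matters, since the EventSystem picks whichever enabled module it finds first):

    Code (CSharp):
        using UnityEngine.EventSystems;

        public class XRInputModule : BaseInputModule
        {
            // The regular desktop module, placed *after* this one on the
            // EventSystem object so the EventSystem picks this module first.
            public StandaloneInputModule desktopModule;

            public override void Process()
            {
                // ...process the XR controller pointers here...

                // Manually drive the desktop module so mouse/keyboard UI
                // keeps working alongside the VR pointers.
                if (desktopModule != null)
                    desktopModule.Process();
            }
        }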
     
  7. mikewarren

    mikewarren

    Joined:
    Apr 21, 2014
    Posts:
    109
    FYI,

    I'm interested in an XR input module as well and asked pretty much the same question here.
    Thanks to @StayTalm_Unity for taking the time to reply to both.

    Any idea (best guess) when the module might be available? I'm just trying to decide if I should wait or attempt to write something on my own.

    I'm wondering if there's a better way to tie into the event system for the raycasting into the scene. I know that operating a uGUI control involves a (graphics) raycast from a camera position, through a screen pixel point, into the scene to do an intersection test. If I already have a world-space raycaster, I'd like to be able to skip the whole screen-point/camera operation.
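    For reference, this is the screen-point flow I mean, driven by hand (a sketch of the stock mechanism, not of the upcoming module):

    Code (CSharp):
        using System.Collections.Generic;
        using UnityEngine;
        using UnityEngine.EventSystems;

        public class ScreenPointUIQuery : MonoBehaviour
        {
            void Update()
            {
                // uGUI's normal flow: a screen-space point, resolved against each
                // canvas by its GraphicRaycaster using the canvas event camera.
                var pointerData = new PointerEventData(EventSystem.current)
                {
                    position = Input.mousePosition
                };
                var results = new List<RaycastResult>();
                EventSystem.current.RaycastAll(pointerData, results);
                if (results.Count > 0)
                    Debug.Log("UI under pointer: " + results[0].gameObject.name);
            }
        }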

    Lastly, any speculation on how the new input system or UIElements efforts will affect uGUI down the road?
     
  8. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    182
    Hello!
    I'm going to do this in reverse order:
    UIElements: That is going to be a different endeavor and something I still need to look into.

    New Input System: It has its own uGUI input module, also written by me, so they share a lot of the same concepts. That one is public: https://github.com/Unity-Technologi.../com.unity.inputsystem/InputSystem/Plugins/UI
    The XRInput-specific one will look very similar; both are based on the UIInputModule and device models in that folder. I tried to make it much more easily extensible compared to the StandaloneInputModule. The basic design is that you inherit from the UIInputModule, create ____Model structs, and, on a per-frame basis, update those and pass them back down to the UIInputModule, which converts them into actual UI events. I wanted to separate the input sources from the internal uGUI concepts and event processing, to untangle hooking uGUI up to new sources, and I'm pretty happy with the result. If you want to start with your own, I'd suggest starting from that point.
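    As a very rough conceptual sketch (these names are made up, and the real UIInputModule does the heavy lifting of the event conversion for you), the shape is:

    Code (CSharp):
        using UnityEngine;
        using UnityEngine.EventSystems;

        // Stand-in for one of the package's ____Model structs:
        // a plain snapshot of a device for this frame.
        public struct TrackedPointerModel
        {
            public Vector3 position;
            public Quaternion orientation;
            public bool selectPressed;
        }

        public class MyTrackedDeviceModule : BaseInputModule
        {
            TrackedPointerModel m_Model;

            public override void Process()
            {
                // Input-source half: refresh the model from your device APIs.
                UpdateModel(ref m_Model);

                // uGUI half: convert the model into pointer events
                // (raycast, enter/exit, press/release, drag...).
                ApplyModelToUI(ref m_Model);
            }

            void UpdateModel(ref TrackedPointerModel model)
            {
                // Read tracking and button state here.
            }

            void ApplyModelToUI(ref TrackedPointerModel model)
            {
                // Build PointerEventData from the model and dispatch UI events,
                // as later posts in this thread demonstrate.
            }
        }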

    Raycasters: You are bang on with that one! If you look at the New Input System's TrackedDeviceRaycaster, I extended the pointerEventData type and created a new raycaster that can handle 3D coordinates not connected to any specific camera. It should look really similar to a few other solutions out in the field. The tricky part was writing the extended pointerEventData so it doesn't get picked up by the 2D graphic raycasters. It does graphics only (no physics raycaster), but it does handle physics and graphics occlusion. That raycaster works well for all uGUI types except the dropdown, due to how the dropdown does its dropping-down effect. But that is being fixed and ported back to 2019.1.
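    If you're rolling your own in the meantime, the core camera-free test is just a ray/plane intersection plus a rect containment check, something like this (a minimal sketch, not the TrackedDeviceRaycaster source):

    Code (CSharp):
        using UnityEngine;

        public static class WorldSpaceUIRaycast
        {
            // Intersect a world-space ray with a RectTransform's plane and
            // check whether the hit point falls inside the rect.
            public static bool RayIntersects(RectTransform rt, Ray ray,
                                             out Vector3 worldPos, out float distance)
            {
                worldPos = Vector3.zero;
                var plane = new Plane(rt.forward, rt.position);
                if (!plane.Raycast(ray, out distance))
                    return false;

                worldPos = ray.GetPoint(distance);
                Vector2 local = rt.InverseTransformPoint(worldPos);
                return rt.rect.Contains(local);
            }
        }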

    ETA: I wanna stress that I'm no authority on this one, but I do know that we are scheduling approximately one month out (it's packaged with a few other thingamabobs too). That does not take into account business decisions, surprises, etc... that may come up, and so this is a thought and not a promise.

    Hope all this info helps!
     
    ROBYER1, aaronfranke and joelybahh like this.
  9. mikewarren

    mikewarren

    Joined:
    Apr 21, 2014
    Posts:
    109
    @StayTalm_Unity, thanks for the information and the hard work. This sounds great.
     
    ROBYER1 likes this.
  10. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,444
    After being told to use Curved UI a LOT, it would be nice to just see world-space canvas UI working in VR, especially as a lot of Unity courseware suggests or implies that using native UI in world space with VR is a straightforward endeavour.

    Keep us posted please!
     
    FlightOfOne and bemrlab like this.
  11. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,444
    Is there any update on this? I keep downloading new alpha versions of Unity in the hopes of seeing this implemented
     
  12. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    182
    Still on its way.
    I wish I could say more, but I cannot. I'm gonna stay true to my word and ping this thread once it's available.
    I can say it will be a separate package, so it won't need a specific alpha version.
     
    ROBYER1 likes this.
  13. RomanMultan

    RomanMultan

    Joined:
    Jun 15, 2018
    Posts:
    3
    @StayTalm_Unity hi, is there any update on this? In addition, will there be any built-in solutions for interacting with UI through a laser pointer (like in SteamVR)?
     
    Last edited: Sep 5, 2019
    ROBYER1 likes this.
  14. dwatt-hollowworldgames

    dwatt-hollowworldgames

    Joined:
    Apr 26, 2019
    Posts:
    104
    Create a GameObject called Line which has the LineRenderer for the pointer and a camera that is disabled and has an FOV of 1. Then use this code in a custom input module. CasterRight is the camera on the right controller pointer; I do this for both left and right hands. Use this as a replacement for the StandaloneInputModule. I also put a script on each canvas that adds itself to the Canvases list. Most of it is derived from VR with Andrew's tutorial video that explains how to do it.



    Code (CSharp):
        using System.Collections.Generic;
        using UnityEngine;
        using UnityEngine.EventSystems;

        // ControllerManager, ControllerState and Buttons are my own wrapper types
        // around the XR input; swap in whatever you use to read the trigger.
        public class XRPointerInputModule : BaseInputModule
        {
            public Camera CasterRight;
            public Camera CasterLeft;
            [HideInInspector]
            public List<Canvas> Canvases;
            public GameObject RightCurrent;
            public GameObject LeftCurrent;
            GameObject RightPressed;
            GameObject LeftPressed;

            public PointerEventData RightData { get; private set; }
            public PointerEventData LeftData { get; private set; }

            protected override void Awake()
            {
                base.Awake();
                RightData = new PointerEventData(eventSystem);
                LeftData = new PointerEventData(eventSystem);
            }

            public override void Process()
            {
                ProcessRight();
                // ProcessLeft() mirrors ProcessRight() with the Left fields.
            }

            private void ProcessRight()
            {
                if (ControllerManager.Right.GetState() != ControllerState.Unavailable && CasterRight != null)
                {
                    // Point every canvas at the right-hand caster camera
                    foreach (Canvas canvas in Canvases)
                    {
                        canvas.worldCamera = CasterRight;
                    }

                    // Reset data; the pointer is always the centre pixel of the narrow-FOV camera
                    RightData.Reset();
                    RightData.position = new Vector2(CasterRight.pixelWidth / 2, CasterRight.pixelHeight / 2);

                    // Raycast
                    eventSystem.RaycastAll(RightData, m_RaycastResultCache);
                    RightData.pointerCurrentRaycast = FindFirstRaycast(m_RaycastResultCache);
                    RightCurrent = RightData.pointerCurrentRaycast.gameObject;

                    // Clear raycast cache
                    m_RaycastResultCache.Clear();

                    // Handle hover
                    HandlePointerExitAndEnter(RightData, RightCurrent);

                    // Handle press / release
                    if (ControllerManager.Right.IsPressed(Buttons.Trigger))
                    {
                        OnPressRight();
                    }
                    else
                    {
                        OnReleaseRight();
                    }
                }
            }

            void OnPressRight()
            {
                // Set raycast
                RightData.pointerPressRaycast = RightData.pointerCurrentRaycast;

                // Check for an object hit and send the down event
                GameObject newPointerPress = ExecuteEvents.ExecuteHierarchy(RightCurrent, RightData, ExecuteEvents.pointerDownHandler);

                // If there is no down handler, try to get a click handler instead
                if (newPointerPress == null)
                {
                    newPointerPress = ExecuteEvents.GetEventHandler<IPointerClickHandler>(RightCurrent);
                }

                // If we exit the element and still have the button pressed, send the up event
                if (RightPressed != null && RightCurrent == null)
                {
                    ExecuteEvents.Execute(RightPressed, RightData, ExecuteEvents.pointerUpHandler);
                }

                // Update RightData
                RightData.pressPosition = RightData.position;
                RightData.pointerPress = newPointerPress;
                RightData.rawPointerPress = RightCurrent;
                RightPressed = newPointerPress; // save the pressed element for the release
            }

            void OnReleaseRight()
            {
                // Send the up event to whatever was pressed
                ExecuteEvents.Execute(RightData.pointerPress, RightData, ExecuteEvents.pointerUpHandler);

                // If we are still over the element we first pressed, send the click event
                GameObject clickHandler = ExecuteEvents.GetEventHandler<IPointerClickHandler>(RightCurrent);
                if (RightPressed != null && clickHandler == RightPressed)
                {
                    ExecuteEvents.Execute(RightPressed, RightData, ExecuteEvents.pointerClickHandler);
                    Debug.Log("Clicked - " + RightPressed);
                }

                // Clear the selected GameObject and reset RightData
                eventSystem.SetSelectedGameObject(null);
                RightData.pressPosition = Vector2.zero;
                RightData.pointerPress = null;
                RightData.rawPointerPress = null;
                RightPressed = null;
            }
        }

    David Watt
    Hollow World Games
    https://www.youtube.com/channel/UCjQTmu4jMw2nNtpNbiUSjrg
     
  15. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,444
    Hi David,

    Thanks for sharing this; it could be a good workaround to use in the meantime until the official Unity VR UI package is released. I will have to do some performance testing, as I'm wondering what the performance implications are of using two additional cameras for the UI at almost all times, especially on lower-end platforms like the Oculus Quest/Go where all the draw calls cost you big time.
     
  16. dwatt-hollowworldgames

    dwatt-hollowworldgames

    Joined:
    Apr 26, 2019
    Posts:
    104
    ROBYER1 likes this.
  17. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,444
    I followed the tutorial that you linked and got it working fine too, thanks for the tip. However, it would obviously be preferable to use the official Unity solution once that is finally released (it feels like it's taking ages).

    Having to set the world-space camera for events is not fun, and hopefully the official Unity solution avoids that limitation for left/right-handed use of UI pointers. @StayTalm_Unity we wait patiently for your solution to come through the pipeline; please let us know when you can share more, as our own projects are being delayed by this.
     
  18. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    182
    I know it *is* taking a long time.
    I'm waiting too.
    But I will post info once I can.
     
    hippocoder and KingOfSnake like this.
  19. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,444
    For anyone wondering what the new XR Interaction package is like, I had a go with it at Unity Unite Copenhagen and passed some feedback on about it, including slow lerping of held object positions (it seems laggy) and some issues with UI deactivating once the UI raycaster is no longer pointing at the UI on buttons, etc.

    It also seems some work is being done on the inputs for it, as the current state of XR input means you have three options: the legacy Input Manager, XR Input in a C# script, or the new Input System coming in 2019.3/2020.

    There is a sneak peek of it here
     
    Matt_D_work and Ostwind like this.
  20. Matt_D_work

    Matt_D_work

    Unity Technologies

    Joined:
    Nov 30, 2016
    Posts:
    202
    thanks robyer1 :) if you have any other feedback drop me a line!

    yep. we're going to launch with a simple input implementation to get an initial version out. We're going to expand the input as soon as I can finish working on it after launch :)
     
    ROBYER1 likes this.
  21. Rib

    Rib

    Joined:
    Nov 7, 2013
    Posts:
    39
    Really looking forward to this becoming available! At the moment I have a system similar to the one by Andrew posted above (except based on the Unity XR input APIs), but it doesn't currently handle dragging/scrolling correctly. I'm kinda blocked by this, but ideally I don't want to spend any significant time developing and debugging yet another InputModule/Raycaster for VR input if Unity already basically has a solution that will make all this hackery redundant.

    To double check, though: is this not going to be available before 2019.3/2020? Is it maybe possible to see some kind of preview package earlier than that?

    Thanks for any clarification!
     
  22. Matt_D_work

    Matt_D_work

    Unity Technologies

    Joined:
    Nov 30, 2016
    Posts:
    202
    It will be a preview package with 19.3.
    We're trying to get it out as soon as we can but we're committing to 19.3.
     
    ROBYER1 and appymedia like this.
  23. Rib

    Rib

    Joined:
    Nov 7, 2013
    Posts:
    39
    Okay, thanks @Matt_D_work!

    Actually, today I was able to iterate on my InputHandler to get drag/drop working, so now I think most canvas interaction is working for me using XR input handling. Hopefully this will tide me over until the new functionality is available.
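    In case it helps anyone stuck on the same part, the drag handling boils down to something like this (a simplified sketch of one way to do it, not my exact handler; pointerData is whatever PointerEventData your module maintains per pointer):

    Code (CSharp):
        using UnityEngine.EventSystems;

        public static class DragSketch
        {
            // Called once per frame from the module's Process(), after the
            // raycast has filled pointerData.pointerCurrentRaycast.
            public static void ProcessDrag(PointerEventData pointerData, bool isPressed)
            {
                if (isPressed && !pointerData.dragging && pointerData.pointerPress != null)
                {
                    // Find a drag handler on (or above) the pressed object and start dragging.
                    pointerData.pointerDrag =
                        ExecuteEvents.GetEventHandler<IDragHandler>(pointerData.pointerPress);
                    if (pointerData.pointerDrag != null)
                    {
                        ExecuteEvents.Execute(pointerData.pointerDrag, pointerData,
                                              ExecuteEvents.beginDragHandler);
                        pointerData.dragging = true;
                    }
                }

                if (pointerData.dragging)
                {
                    if (isPressed)
                    {
                        // Keep feeding drag events while the button is held.
                        ExecuteEvents.Execute(pointerData.pointerDrag, pointerData,
                                              ExecuteEvents.dragHandler);
                    }
                    else
                    {
                        // Button released: end the drag and clear state.
                        ExecuteEvents.Execute(pointerData.pointerDrag, pointerData,
                                              ExecuteEvents.endDragHandler);
                        pointerData.dragging = false;
                        pointerData.pointerDrag = null;
                    }
                }
            }
        }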
     
  24. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,444
    Really looking forward to this. I have a project starting full alpha development soon, looking to use all the cross-platform VR capabilities, including the XR Interaction package. Will there be an announcement or blog post when it is available? (Apologies for seeming impatient; your hard work on this area of Unity is really appreciated, and I want to provide any useful feedback I can once the package is out) :)
     
  25. Matt_D_work

    Matt_D_work

    Unity Technologies

    Joined:
    Nov 30, 2016
    Posts:
    202
    Yep blog post etc when we're ready :)
     
    ROBYER1 likes this.
  26. AccentDave

    AccentDave

    Joined:
    Nov 16, 2015
    Posts:
    43
    Please tell me this will happen in 2019 (the year)
     
  27. hungrybelome

    hungrybelome

    Joined:
    Dec 31, 2014
    Posts:
    336
    Any updates on this? Hoping to use it asap!
     
  28. daveinpublic

    daveinpublic

    Joined:
    May 24, 2013
    Posts:
    167
    Yes, apparently there are a lot of people waiting for this. I find myself refreshing the Package Manager every other day in case it's out, hopefully it's close. (fingers crossed)
     
  29. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    182
    It took a long time, but finally:
    https://blogs.unity3d.com/2019/12/17/xr-interaction-toolkit-preview-package-is-here/
    and
    https://forum.unity.com/threads/xr-interaction-toolkit-preview-release.795684/

    This contains the XR UI integration, and if you saw it at Unite, it now merges the UI and Interactable systems so that UI is a first-class citizen, so to speak. You can use all the same utilities and options of the XRRayInteractor.

    This toolkit will see ongoing improvements; please leave feedback, questions, or opinions about it, and I'll try to keep a close eye on this thread and respond.
     
    hungrybelome likes this.
  30. salvolannister

    salvolannister

    Joined:
    Jan 3, 2019
    Posts:
    50

    Hi StayTalm, I would like to understand how to get the "onHover" event that the XRBaseInteractable has to also fire for UI elements when the laser is hovering over them.

    Which interface implements "IPointerEnterHandler" for the XR UI?
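    To illustrate, I mean something like this standard event-system listener, assuming the XR ray drives uGUI like a mouse pointer:

    Code (CSharp):
        using UnityEngine;
        using UnityEngine.EventSystems;

        // Attach to any UI Graphic; a ray pointer driving uGUI should raise
        // the same enter/exit events a mouse would.
        public class UIHoverListener : MonoBehaviour, IPointerEnterHandler, IPointerExitHandler
        {
            public void OnPointerEnter(PointerEventData eventData)
            {
                Debug.Log("Hover started on " + name);
            }

            public void OnPointerExit(PointerEventData eventData)
            {
                Debug.Log("Hover ended on " + name);
            }
        }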
     
  31. Discmage

    Discmage

    Joined:
    Aug 11, 2014
    Posts:
    57
    Should we be using the XR Interaction Toolkit or the new Input System 1.0 tools for VR input? I've spent basically ALL day on this trying to get even a simple button press from the left motion controller on my Rift to register, and I haven't been able to do it.

    All I want is something like:

    Code (CSharp):
        if (XR.lefthand.aButton.isPressed) {
            // do something!
        }

    What would be the equivalent with the interaction toolkit...or input system for that matter?
     
  32. dwatt-hollowworldgames

    dwatt-hollowworldgames

    Joined:
    Apr 26, 2019
    Posts:
    104
    You will have to write a layer over XR input to do that; that is what I have done, as it only gives you up or down state. I also did it to unify access between various input sources. The new Input System is a waste for me, as it won't recognize controllers it doesn't know. Frankly, with the whole XR Management and SteamVR debacle going on, I am still using legacy VR and SteamVR input. If you are planning to support SteamVR, I highly recommend using only legacy VR and SteamVR input, however awful it may be.
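    For reference, the layer I mentioned wraps calls like the following (plain UnityEngine.XR; note that mapping a named button to a usage is an assumption that varies per device — primaryButton is X on a left Touch controller, for example):

    Code (CSharp):
        using UnityEngine;
        using UnityEngine.XR;

        public class LeftHandButtonQuery : MonoBehaviour
        {
            void Update()
            {
                // Grab the left-hand device each frame; devices can
                // connect and disconnect at runtime.
                InputDevice leftHand = InputDevices.GetDeviceAtXRNode(XRNode.LeftHand);

                // Usages come back through TryGetFeatureValue.
                if (leftHand.TryGetFeatureValue(CommonUsages.primaryButton, out bool pressed) && pressed)
                {
                    Debug.Log("Left primary button is down");
                }
            }
        }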
     
  33. Discmage

    Discmage

    Joined:
    Aug 11, 2014
    Posts:
    57
    Just putting this here because for some reason this was hard to find in documentation and forums... maybe it's just me, but still. Credit to a friend for getting this code to me.

    Also, if you don't have the headset ON, the boolean is not triggered. So while it was working with no errors, it won't actually 'work'. I was not getting any input from the controllers because I did not have the headset on (I assume it's the proximity sensor). This took me a while to figure out, as I assumed I wouldn't have to have the headset on to test just the motion controllers.

    Code (CSharp):
        using UnityEngine;
        using UnityEngine.XR.Interaction.Toolkit;

        public class ControllerManager : MonoBehaviour
        {
            public XRController rightHand;
            public InputHelpers.Button button;

            void Update()
            {
                bool pressed;
                rightHand.inputDevice.IsPressed(button, out pressed);

                if (pressed)
                {
                    Debug.Log("Hello - " + button);
                }
            }
        }
    Put this script on the XRRig, drag in the motion controller, and voila!
     
    InsaneDuane likes this.
  34. MattESqr

    MattESqr

    Joined:
    Feb 5, 2020
    Posts:
    15
    When using the provided InputHelpers in the XR Toolkit, it seems everything is binary, as IsPressed only returns a boolean.

    Does the XR Toolkit provide a way to get the axis values of the joysticks on Oculus controllers, something like GetAxis()? I couldn't find this in the Unity XR Toolkit.
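    For anyone else looking, it seems the controller's underlying InputDevice can be read directly instead of going through InputHelpers (an assumption on my part that this is the intended route; the field wiring mirrors the script above):

    Code (CSharp):
        using UnityEngine;
        using UnityEngine.XR;
        using UnityEngine.XR.Interaction.Toolkit;

        public class JoystickReader : MonoBehaviour
        {
            public XRController rightHand;

            void Update()
            {
                // Analog usages come back as Vector2s/floats rather than booleans.
                if (rightHand.inputDevice.TryGetFeatureValue(CommonUsages.primary2DAxis, out Vector2 stick))
                    Debug.Log("Stick: " + stick);

                if (rightHand.inputDevice.TryGetFeatureValue(CommonUsages.trigger, out float trigger))
                    Debug.Log("Trigger: " + trigger);
            }
        }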
     
  35. alexboost222

    alexboost222

    Joined:
    Jul 1, 2018
    Posts:
    10
    @StayTalm_Unity, hi there! I'm currently trying to set up uGUI for XR with the Unity Input System package v1.0.2. I'm using an HTC Vive headset. First I just followed the guideline from here: https://docs.unity3d.com/Packages/com.unity.inputsystem@1.0/manual/UISupport.html. But instead of using DefaultInputActions, I made a UI action map in my own input actions asset: [screenshot]
    Then I chose my input actions asset in the inspector of my InputSystemUIInputModule component: [screenshot]
    The input actions asset is activated and fires actions; I double-checked this using the debugger.
    On the scene I have a large button on the canvas. The canvas's set of components: [screenshot]
    Scene view with hierarchy: [screenshots]
    And with this setup nothing works at all. When I'm pointing at the button on the canvas with my controller, nothing happens. I dug inside the TrackedDeviceRaycaster and found out that the raycast is performed along the forward axis of the controller (line 111 in TrackedDeviceController.cs). Then I double-checked that I'm pointing my controller's forward axis at the button. Same result: nothing seems to work. Then I started to test different angles of my controller. I literally attempted pointing at the button and then clicking on it from every possible orientation of my controller. Again the same result. Then I returned to the code and, with the debugger, found out that execution always ends on line 194 of TrackedDeviceRaycaster.cs. There is a line:

    Code (CSharp):
        if (RayIntersectsRectTransform(graphic.rectTransform, ray, out worldPos, out distance))

    and it seems that the method RayIntersectsRectTransform for some reason never returns true in my case.
    Any help is appreciated.
     
    Last edited: Oct 21, 2021
  36. alexboost222

    alexboost222

    Joined:
    Jul 1, 2018
    Posts:
    10
    Update: spent some time testing again and found a position of the controller in which everything works fine. My guess is that the tracked device position is not the world position of the controller's GameObject.
     
  37. alexboost222

    alexboost222

    Joined:
    Jul 1, 2018
    Posts:
    10
    Update: found some information here: https://forum.unity.com/threads/tracked-devices-and-ui-raycasting.874951/#post-7287043. It seems that this is a bug in Input System versions lower than 1.1.0.
     
  38. alexboost222

    alexboost222

    Joined:
    Jul 1, 2018
    Posts:
    10