
Unity's UI and XR input

Discussion in 'AR/VR (XR) Discussion' started by plmx, May 3, 2019.

  1. plmx

    plmx

    Joined:
    Sep 10, 2015
    Posts:
    245
    Hi,

    simple question: Is there any native Unity support for using a world-space canvas UI with controllers based on Unity's XR namespace? If not, what are the steps to be able to use a controller as I would use a mouse in a normal canvas? I would like to have the controller(s) send clicks and move/drag/drop events to the canvas based on their actual position.

    The only definitive source on this I found was this Oculus article (https://developer.oculus.com/blog/unitys-ui-system-in-vr/), but it is nearly four years old and I suspect/hope that some things have changed since then (also, I would like to remain SDK-independent and use only Unity).

    Philip
     
    bemrlab likes this.
  2. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    86
    Hello!
    I replied in another thread, but since it was a tangent there, it makes sense to reply here as well.

    A generalized solution exists, and it is on its way through our testing and shipping processes. Sorry I can't just drop it here, but stay tuned, and I'll reply to this thread once it's available.
     
    bemrlab and ROBYER1 like this.
  3. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    86
    Moving this over to your own thread:

    Soooo, within 2019 is the plan at the moment, but I don't make those plans, so they can change.
    And it is similar to the Oculus solution you linked, in that it allows 3D tracked devices to trigger world-space UI as if they were mice. So you can use dropdowns, scrollbars, buttons, textboxes, etc. Each tracked device, mouse, and touch is treated independently, so you can still use Spectator mode to have both player and spectator UIs independently controllable, or have one hand using a scrollbar while the other presses a button.

    It is UI-based, so it uses the EventSystem, replaces things like StandaloneInputModule, and has a custom UI Graphic Raycaster for 3D devices.
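    To sketch the independent-pointer idea (illustrative only, not the shipping code): the module keeps one pointer state per device, along the lines of:

    using System.Collections.Generic;
    using UnityEngine.EventSystems;

    // One PointerEventData per device id keeps hands, mouse, and touch from
    // clobbering each other's hover and press state.
    public class PerDevicePointerState
    {
        private readonly Dictionary<int, PointerEventData> pointers =
            new Dictionary<int, PointerEventData>();

        // deviceId is illustrative, e.g. -1 for the mouse, 0/1 for controllers.
        public PointerEventData GetOrCreate(int deviceId, EventSystem eventSystem)
        {
            if (!pointers.TryGetValue(deviceId, out PointerEventData data))
            {
                data = new PointerEventData(eventSystem) { pointerId = deviceId };
                pointers.Add(deviceId, data);
            }
            return data;
        }
    }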
     
    ROBYER1 likes this.
  4. plmx

    plmx

    Joined:
    Sep 10, 2015
    Posts:
    245
    @StayTalm_Unity Thanks for your reply; that sounds pretty good. I think I will go with a minimal custom implementation for now and switch once the official one is available. Thanks, looking forward to the notification ;-)
     
  5. joelybahh

    joelybahh

    Joined:
    Mar 18, 2015
    Posts:
    62
    I'm curious about when this rolls out as well.

    I wrote a VR Input Module, but it seems to be in a constant battle with the StandaloneInputModule: they both kind of work together, but the desktop UI interactions only work while I am constantly moving my mouse.

    Watching the event system debugger, I can see that my XR Input Module keeps overriding the normal one unless I'm constantly moving my mouse. It gets even weirder when the VR and desktop users try to interact with their respective UIs at the same time.
     
  6. joelybahh

    joelybahh

    Joined:
    Mar 18, 2015
    Posts:
    62
    I managed to get it working for both VR and desktop UI interactions simultaneously.

    I had to manually call Process on the StandaloneInputModule from my XRInputModule, because the EventSystem component just grabs the first BaseInputModule it finds and calls Process on it. Bit of a hack, but it's working great.

    I also tried the same thing but inherited from StandaloneInputModule instead; that doesn't work because of private variables that I need but can't access.

    So I rewrote an almost-exact copy of the StandaloneInputModule to work with Unity's generic XR inputs and desktop.
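    In sketch form, the hack looks something like this (the field and helper names are illustrative, not my actual module):

    using UnityEngine;
    using UnityEngine.EventSystems;

    // The EventSystem only drives the first enabled BaseInputModule it finds,
    // so this module forwards Process() to a StandaloneInputModule reference
    // to keep the desktop UI alive at the same time.
    public class XRInputModule : BaseInputModule
    {
        // Assigned in the inspector; left disabled so the EventSystem never
        // picks it as the active module itself.
        [SerializeField] private StandaloneInputModule desktopModule;

        public override void Process()
        {
            ProcessXRPointers(); // controller raycasts and click events

            if (desktopModule != null)
                desktopModule.Process(); // manually drive mouse/keyboard UI
        }

        private void ProcessXRPointers()
        {
            // controller-specific raycasting and event dispatch live here
        }
    }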
     
  7. mikewarren

    mikewarren

    Joined:
    Apr 21, 2014
    Posts:
    81
    FYI,

    I'm interested in an XR Input Module as well and asked pretty much the same question here.
    Thanks to @StayTalm_Unity for taking the time to reply to both.

    Any idea (best guess) when the module might be available? I'm just trying to decide if I should wait or attempt to write something on my own.

    I'm wondering if there's a better way to tie into the event system for raycasting into the scene. I know that operating a uGUI control involves a (graphics) raycast from a camera position, through a screen pixel point, into the scene to do an intersection test. Since I already have a world-space raycaster, I'd like to be able to skip the whole screen-point/camera step.
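    Roughly what I mean, in sketch form (the helper name is made up; it's just the math):

    using UnityEngine;

    public static class WorldSpaceRaycast
    {
        // True if a world-space ray hits the RectTransform's plane inside its
        // rect, with no camera or screen point involved.
        public static bool RayIntersectsRect(Ray ray, RectTransform rect, out Vector3 worldHit)
        {
            worldHit = default(Vector3);
            var plane = new Plane(rect.forward, rect.position);
            if (!plane.Raycast(ray, out float distance))
                return false;

            worldHit = ray.GetPoint(distance);
            Vector2 local = rect.InverseTransformPoint(worldHit);
            return rect.rect.Contains(local);
        }
    }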

    Lastly, any speculation on how the new Input System or UIElements efforts will affect uGUI down the road?
     
  8. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    86
    Hello!
    I'm going to do this in reverse order:
    UIElements: That is going to be a different endeavor and something I still need to look into.

    New Input System: It has its own uGUI input module, also written by me, so they share a lot of the same concepts. That one is public: https://github.com/Unity-Technologi.../com.unity.inputsystem/InputSystem/Plugins/UI
    The XRInput-specific one will look very similar; both are based on the UIInputModule and device models in that folder. I tried to make it much more easily extensible than the StandaloneInputModule. The basic design is that you inherit from UIInputModule, create ____Model structs, update those on a per-frame basis, and pass them back down to the UIInputModule, which converts them into actual UI events. I wanted to separate the input sources from the internal uGUI concepts and event processing, to untangle hooking uGUI up to new sources, and I'm pretty happy with the result. If you want to roll your own, I'd suggest starting from that point.
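    As a rough illustration of the model idea (the struct below is made up for this post, built on the XR.InputDevices API; the real models live in the folder linked above):

    using UnityEngine;
    using UnityEngine.XR;

    public struct TrackedPointerModel
    {
        public Vector3 position;       // world-space pointer origin
        public Quaternion orientation; // world-space pointer direction
        public bool select;            // trigger state, mapped to pointer press

        // Pull the current state from an XR device once per frame; the input
        // module then turns changes in the model into uGUI events.
        public void UpdateFrom(InputDevice device)
        {
            device.TryGetFeatureValue(CommonUsages.devicePosition, out position);
            device.TryGetFeatureValue(CommonUsages.deviceRotation, out orientation);
            device.TryGetFeatureValue(CommonUsages.triggerButton, out select);
        }
    }

    Per frame you'd do something like model.UpdateFrom(InputDevices.GetDeviceAtXRNode(XRNode.RightHand)) and hand the result down to the module.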

    Raycasters: You are bang on with that one! If you look at the New Input System's TrackedDeviceRaycaster, I extended the pointerEventData type and created a new raycaster that can handle 3D coordinates not tied to any specific camera. It should look really similar to a few other solutions out in the field. The tricky part was writing the extended pointerEventData so it doesn't get picked up by the 2D graphic raycasters. It is graphics-only (no physics raycaster), but it does handle physics and graphics occlusion. That raycaster works well for all uGUI types except the dropdown, due to how the dropdown does its dropping-down effect, but that is being fixed and back-ported to 2019.1.
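    In rough shape (again illustrative, not the actual TrackedDeviceRaycaster):

    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.EventSystems;

    // Hypothetical event data carrying a world-space ray, so ordinary 2D
    // graphic raycasters ignore it and only 3D-aware raycasters respond.
    public class TrackedPointerEventData : PointerEventData
    {
        public Ray worldRay;
        public TrackedPointerEventData(EventSystem eventSystem) : base(eventSystem) { }
    }

    // Attach beside a world-space Canvas; registers like any raycaster but
    // intersects a ray with the canvas instead of a screen point.
    public class WorldRayRaycaster : BaseRaycaster
    {
        public override Camera eventCamera => null; // no camera involved

        public override void Raycast(PointerEventData eventData, List<RaycastResult> resultAppendList)
        {
            if (!(eventData is TrackedPointerEventData tracked))
                return; // mouse/touch pointers keep using the normal raycasters

            var rect = (RectTransform)transform;
            var plane = new Plane(rect.forward, rect.position);
            if (!plane.Raycast(tracked.worldRay, out float distance))
                return;

            Vector3 worldHit = tracked.worldRay.GetPoint(distance);
            Vector2 local = rect.InverseTransformPoint(worldHit);
            if (!rect.rect.Contains(local))
                return;

            // Real code would hit-test the individual Graphics on the canvas.
            resultAppendList.Add(new RaycastResult
            {
                gameObject = gameObject,
                module = this,
                distance = distance,
                worldPosition = worldHit,
            });
        }
    }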

    ETA: I wanna stress that I'm no authority on this one, but I do know that we are scheduling approximately one month out (it's packaged with a few other thingamabobs too). That does not take into account business decisions, surprises, etc. that may come up, so this is a thought and not a promise.

    Hope all this info helps!
     
    ROBYER1, aaronfranke and joelybahh like this.
  9. mikewarren

    mikewarren

    Joined:
    Apr 21, 2014
    Posts:
    81
    @StayTalm_Unity, thanks for the information and the hard work. This sounds great.
     
    ROBYER1 likes this.
  10. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    343
    After being told to use Curved UI a LOT, it would be nice to just see world-space canvas UI working in VR, especially as a lot of Unity courseware suggests or implies that using native UI in world space with VR is a straightforward endeavour.

    Keep us posted please!
     
    bemrlab likes this.
  11. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    343
    Is there any update on this? I keep downloading new alpha versions of Unity in the hope of seeing this implemented.
     
  12. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    86
    Still on its way.
    I wish I could say more, but I cannot; I'm gonna stay true and ping this thread once it's available.
    I can say it will be a separate package, so it won't need a specific alpha version.
     
    ROBYER1 likes this.
  13. RomanMultan

    RomanMultan

    Joined:
    Jun 15, 2018
    Posts:
    1
    @StayTalm_Unity hi, is there any update on this? In addition, will there be any built-in solutions for interacting with UI through a laser pointer (like in SteamVR)?
     
    Last edited: Sep 5, 2019
    ROBYER1 likes this.
  14. dwattintellisoft

    dwattintellisoft

    Joined:
    Apr 26, 2019
    Posts:
    9
    Create a GameObject called Line which has the LineRenderer for the pointer and a camera that is disabled and has an FOV of 1. Then use this code in a custom input module. CasterRight is the camera on the right controller pointer; I do this for both left and right hands. Use this as a replacement for the input module. I also put a script on each canvas that adds itself to the Canvases list. Most of it is derived from VR Andrew's tutorial video, which explains how to do it.



    // Note: ControllerManager, ControllerState, and Buttons are my own wrapper
    // types; RightData/LeftData are created elsewhere (e.g. in Awake) with
    // new PointerEventData(eventSystem).
    public Camera CasterRight;
    public Camera CasterLeft;
    [HideInInspector]
    public List<Canvas> Canvases;
    public GameObject RightCurrent;
    public GameObject LeftCurrent;
    GameObject RightPressed;
    GameObject LeftPressed;

    public PointerEventData RightData { get; private set; }
    public PointerEventData LeftData { get; private set; }

    private void ProcessRight()
    {
        if (ControllerManager.Right.GetState() != ControllerState.Unavailable && CasterRight != null)
        {
            // Point every canvas at the right-hand pointer camera.
            foreach (Canvas canvas in Canvases)
            {
                canvas.worldCamera = CasterRight;
            }

            // Reset the pointer data to the centre of the pointer camera's view.
            RightData.Reset();
            RightData.position = new Vector2(CasterRight.pixelWidth / 2, CasterRight.pixelHeight / 2);

            // Raycast and keep the nearest hit.
            eventSystem.RaycastAll(RightData, m_RaycastResultCache);
            RightData.pointerCurrentRaycast = FindFirstRaycast(m_RaycastResultCache);
            RightCurrent = RightData.pointerCurrentRaycast.gameObject;
            m_RaycastResultCache.Clear();

            // Handle hover.
            HandlePointerExitAndEnter(RightData, RightCurrent);

            // Handle press and release from the trigger.
            if (ControllerManager.Right.IsPressed(Buttons.Trigger))
            {
                OnPressRight();
            }
            else
            {
                OnReleaseRight();
            }
        }
    }

    void OnPressRight()
    {
        // Record the raycast that was active when the press started.
        RightData.pointerPressRaycast = RightData.pointerCurrentRaycast;

        // Check for an object hit and send the down event.
        GameObject newPointerPress = ExecuteEvents.ExecuteHierarchy(RightCurrent, RightData, ExecuteEvents.pointerDownHandler);

        // If there is no down handler, fall back to the click handler.
        if (newPointerPress == null)
        {
            newPointerPress = ExecuteEvents.GetEventHandler<IPointerClickHandler>(RightCurrent);
        }

        // If we left the element while the button is still held, send the up event.
        if (RightPressed != null && RightCurrent == null)
        {
            ExecuteEvents.Execute(RightPressed, RightData, ExecuteEvents.pointerUpHandler);
        }

        RightData.pressPosition = RightData.position;
        RightData.pointerPress = newPointerPress;
        RightData.rawPointerPress = RightCurrent;
        RightPressed = newPointerPress; // remember the pressed element for the release
    }

    void OnReleaseRight()
    {
        // Send the up event to whatever was pressed.
        ExecuteEvents.Execute(RightData.pointerPress, RightData, ExecuteEvents.pointerUpHandler);

        // If we are still over the element we first pressed, send the click event.
        GameObject pointerUpHandler = ExecuteEvents.GetEventHandler<IPointerClickHandler>(RightCurrent);
        if (RightPressed != null && pointerUpHandler == RightPressed)
        {
            ExecuteEvents.Execute(RightPressed, RightData, ExecuteEvents.pointerClickHandler);
            Debug.Log("Clicked - " + RightPressed);
        }

        // Clear the selection and reset the pointer state.
        eventSystem.SetSelectedGameObject(null);
        RightData.pressPosition = Vector2.zero;
        RightData.pointerPress = null;
        RightData.rawPointerPress = null;
        RightPressed = null;
    }

    David Watt
    Hollow World Games
    https://www.youtube.com/channel/UCjQTmu4jMw2nNtpNbiUSjrg
     
  15. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    343
    Hi David,

    Thanks for sharing this; it could be a good workaround to use in the meantime until the official Unity VR UI package is released. I will have to do some performance testing, as I'm wondering what the performance implications are of two additional cameras rendering the UI at almost all times, especially on lower-end platforms like the Oculus Quest/Go where the draw calls cost you big time.
     
  16. dwattintellisoft

    dwattintellisoft

    Joined:
    Apr 26, 2019
    Posts:
    9
    ROBYER1 likes this.
  17. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    343
    I followed the tutorial you linked and got it working fine too; thanks for the tip. However, it would obviously be preferable to use the official Unity solution once that is finally released (it feels like it's taking ages).

    Having to set the world-space camera for events is not fun; hopefully the official Unity solution avoids that limitation for left- and right-handed users of UI pointers. @StayTalm_Unity, we wait patiently for your solution to come through the pipeline. Please let us know when you can share more, as our own projects are being delayed by this.
     
  18. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    86
    I know it *is* taking a long time.
    I'm waiting too.
    But I will post info once I can.
     
    hippocoder and KingOfSnake like this.
  19. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    343
    For anyone wondering what the new XR Interaction package is like, I had a go with it at Unite Copenhagen and passed on some feedback about it, including slow lerping of held-object positions (it seems laggy) and some issues with UI deactivation on buttons etc. once the UI raycaster is no longer pointing at the UI.

    It also seems some work is being done on the inputs for it, as the current state of XR input means you have three options: the legacy input manager, XR Input in a C# script, or the new Input System coming in 2019.3/2020.

    There is a sneak peek of it here
     
    Matt_D_work and Ostwind like this.
  20. Matt_D_work

    Matt_D_work

    Unity Technologies

    Joined:
    Nov 30, 2016
    Posts:
    70
    Thanks ROBYER1 :) If you have any other feedback, drop me a line!

    Yep, we're going to launch with a simple input implementation to get an initial version out, and we'll expand the input as soon as I can finish working on it after launch :)
     
    ROBYER1 likes this.
  21. Rib

    Rib

    Joined:
    Nov 7, 2013
    Posts:
    3
    Really looking forward to this becoming available! At the moment I have a system similar to the one by Andrew posted above (except based on Unity's XR input APIs), but it doesn't currently handle dragging/scrolling correctly. I'm kinda blocked by this, but ideally I don't want to spend any significant time developing and debugging yet another InputModule/Raycaster for VR input if Unity already basically has a solution that will make all this hackery redundant.

    To double-check though: is this not going to be available before 2019.3/2020? Is it maybe possible to see some kind of preview package earlier than that?

    Thanks for any clarification!
     
  22. Matt_D_work

    Matt_D_work

    Unity Technologies

    Joined:
    Nov 30, 2016
    Posts:
    70
    It will be a preview package with 19.3.
    We're trying to get it out as soon as we can but we're committing to 19.3.
     
  23. Rib

    Rib

    Joined:
    Nov 7, 2013
    Posts:
    3
    Okay, thanks @Matt_D_work!

    Actually, today I was able to iterate on my InputHandler to get drag/drop working, so now I think most canvas interaction is working for me using XR input handling. Hopefully that will tide me over until the new functionality is available.
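    For anyone stuck on the same thing, the drag plumbing boils down to something like this (a sketch; the names follow the snippet earlier in the thread and are illustrative, not my actual code):

    using UnityEngine;
    using UnityEngine.EventSystems;

    public static class DragDispatch
    {
        // Call once per frame while the trigger is held.
        public static void ProcessDrag(PointerEventData pointerData)
        {
            if (pointerData.pointerPress == null)
                return;

            // Find the drag handler on (or above) the pressed object.
            GameObject dragTarget = ExecuteEvents.GetEventHandler<IDragHandler>(pointerData.pointerPress);
            if (dragTarget == null)
                return;

            if (!pointerData.dragging)
            {
                ExecuteEvents.Execute(dragTarget, pointerData, ExecuteEvents.beginDragHandler);
                pointerData.pointerDrag = dragTarget;
                pointerData.dragging = true;
            }
            ExecuteEvents.Execute(dragTarget, pointerData, ExecuteEvents.dragHandler);
        }

        // Call once on release.
        public static void EndDrag(PointerEventData pointerData)
        {
            if (pointerData.dragging && pointerData.pointerDrag != null)
                ExecuteEvents.Execute(pointerData.pointerDrag, pointerData, ExecuteEvents.endDragHandler);

            pointerData.pointerDrag = null;
            pointerData.dragging = false;
        }
    }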