Unity's UI and XR input

Discussion in 'AR/VR (XR) Discussion' started by plmx, May 3, 2019.

  1. plmx

    plmx

    Joined:
    Sep 10, 2015
    Posts:
    241
    Hi,

    simple question: Is there any native Unity support for using a world-space canvas UI with controllers based on Unity's XR namespace? If not, what are the steps to be able to use a controller as I would use a mouse in a normal canvas? I would like to have the controller(s) send clicks and move/drag/drop events to the canvas based on their actual position.

    The only definitive source on this I found was this Oculus article (https://developer.oculus.com/blog/unitys-ui-system-in-vr/), but it is nearly 4 years old and I suspect/hope that some things have changed since then (and, also, I would like to remain SDK-independent and only use Unity).
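Until something official ships, the manual approach is roughly: raycast from the controller's pose, track which UI element the ray hits, and dispatch pointer events through `ExecuteEvents`. A minimal sketch, assuming UI elements have colliders attached and using the `UnityEngine.XR` device API (the class and field names here are illustrative, not any shipped solution):

```csharp
using UnityEngine;
using UnityEngine.EventSystems;
using UnityEngine.XR;

// Illustrative sketch: drive a world-space canvas from a tracked
// controller by raycasting from its pose and sending pointer events.
public class ControllerUIPointer : MonoBehaviour
{
    public float maxDistance = 10f;
    GameObject currentTarget;

    void Update()
    {
        // Raycast from the controller's position along its forward axis.
        Ray ray = new Ray(transform.position, transform.forward);
        GameObject hitObject = null;
        if (Physics.Raycast(ray, out RaycastHit hit, maxDistance))
            hitObject = hit.collider.gameObject;

        var eventData = new PointerEventData(EventSystem.current);

        // Fire enter/exit events as the ray moves between elements.
        if (hitObject != currentTarget)
        {
            if (currentTarget != null)
                ExecuteEvents.Execute(currentTarget, eventData, ExecuteEvents.pointerExitHandler);
            if (hitObject != null)
                ExecuteEvents.Execute(hitObject, eventData, ExecuteEvents.pointerEnterHandler);
            currentTarget = hitObject;
        }

        // Translate the trigger into a click (simplified: a real module
        // would track press/release edges rather than clicking per frame).
        var device = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);
        if (device.TryGetFeatureValue(CommonUsages.triggerButton, out bool pressed)
            && pressed && currentTarget != null)
        {
            ExecuteEvents.Execute(currentTarget, eventData, ExecuteEvents.pointerClickHandler);
        }
    }
}
```

A production version would replace the physics raycast with a graphics raycast against the canvas, and handle drag and scroll events as well, which is exactly the complexity the Oculus article walks through.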

    Philip
     
    bemrlab likes this.
  2. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    71
    Hello!
    I replied in another thread, but since it was a tangent there, it makes sense to answer here as well.

    A generalized solution exists, and it is on its way through our testing and shipping processes. Sorry I can't just drop it here, but stay tuned, and I'll reply to this thread once it's available.
     
    bemrlab and ROBYER1 like this.
  3. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    71
    Moving this over to your own thread:

    Soooo, within 2019 is the plan at the moment, but I don't make those plans, so they can change.
    And it is similar to the Oculus solution you linked, in that it allows 3D tracked devices to trigger world-space UI as if they were mice. So you can use dropdowns, scrollbars, buttons, textboxes, etc. Each tracked device, mouse, and touch input is treated independently, so you can still use Spectator mode to have both player and spectator UIs independently controllable, or have one hand using a scrollbar while the other presses a button.

    It is UI-based, so it uses the EventSystem, replaces things like StandaloneInputModule, and has a custom UI Graphic Raycaster for 3D devices.
     
  4. plmx

    plmx

    Joined:
    Sep 10, 2015
    Posts:
    241
    @StayTalm_Unity Thanks for your reply, that sounds pretty good. I think I will go with a minimal custom implementation for now and switch once it is available. Thanks, looking forward to the notification ;-)
     
  5. joelybahh

    joelybahh

    Joined:
    Mar 18, 2015
    Posts:
    58
    I am curious about when this rolls out too.

    I wrote a VR Input Module, but it seems to be in a constant battle with the StandaloneInputModule. They both sort of work together, but the desktop UI interactions only work while I am constantly moving my mouse.

    Watching the event system debugger, I can see my XR Input Module keeps overriding the normal one unless I'm constantly moving my mouse. It gets even weirder when the VR and desktop users try to interact with their respective UIs at the same time.
     
  6. joelybahh

    joelybahh

    Joined:
    Mar 18, 2015
    Posts:
    58
    I managed to get it working for both VR and Desktop UI interactions simultaneously.

    I had to manually call Process on the StandaloneInputModule from my XRInputModule, because the EventSystem component just grabs the first BaseInputModule it finds and calls Process on it. Bit of a hack, but it's working great.

    I also tried inheriting from the StandaloneInputModule instead, but that doesn't work because of private variables that I need but can't access.

    So I rewrote an almost exact copy of the StandaloneInputModule to work with Unity's generic XR inputs and desktop.
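    The forwarding hack described above can be sketched like this (class and field names are illustrative): since EventSystem only processes one input module at a time, the XR module handles its own tracked-device events and then hands off to a StandaloneInputModule it holds a reference to, so mouse and XR input coexist.

    ```csharp
    using UnityEngine;
    using UnityEngine.EventSystems;

    // Sketch of the workaround: forward Process() to a desktop module
    // so EventSystem's single-module limitation doesn't block mouse input.
    public class XRForwardingInputModule : BaseInputModule
    {
        // A StandaloneInputModule living on another (inactive-for-EventSystem)
        // object, assigned in the inspector.
        [SerializeField] StandaloneInputModule desktopModule;

        public override void Process()
        {
            // Handle XR pointer events first: raycast from controllers,
            // translate trigger presses into clicks, etc.
            ProcessXRInput();

            // Then let the standard module handle mouse/keyboard so
            // desktop UI interaction keeps working alongside VR.
            if (desktopModule != null)
                desktopModule.Process();
        }

        void ProcessXRInput()
        {
            // Controller raycasting and event dispatch would go here.
        }
    }
    ```

    As the poster notes, this is fragile; the cleaner route (which the official module takes) is a single module that treats every pointer source, mouse included, uniformly.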
     
  7. mikewarren

    mikewarren

    Joined:
    Apr 21, 2014
    Posts:
    77
    FYI,

    I'm interested in an XR Input Module as well and asked pretty much the same question here.
    Thanks to @StayTalm_Unity for taking the time to reply to both.

    Any idea (best guess) when the module might be available? I'm just trying to decide if I should wait or attempt to write something on my own.

    I'm wondering if there's a better way to tie into the event system for the raycasting into the scene. I know that operating a uGUI control involves a (graphics) raycast from a camera position, through a screen pixel point, into the scene to do an intersection test. If I already have a world-space raycaster, I'd like to be able to skip the whole screen-point/camera operation.

    Lastly, any speculation on how the new input system or UIElements efforts will affect uGUI down the road?
     
  8. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    71
    Hello!
    I'm going to do this in reverse order:
    UIElements: That is going to be a different endeavor and something I still need to look into.

    New Input System: It has its own uGUI Input Module, also written by me, so they share a lot of the same concepts. That one is public: https://github.com/Unity-Technologi.../com.unity.inputsystem/InputSystem/Plugins/UI
    The XRInput-specific one will look very similar; both are based on the UIInputModule and device models in that folder. I tried to make it much more easily extensible than the StandaloneInputModule. The basic design is that you inherit from UIInputModule, create ____Model structs, and on a per-frame basis update those and pass them back down to the UIInputModule, which converts them into actual UI events. I wanted to separate the input sources from the internal uGUI concepts and event processing, to untangle hooking uGUI up to new sources, and I'm pretty happy with the result. If you want to write your own, I'd suggest starting from that point.
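    The update-a-model-then-hand-it-down pattern described above might look roughly like this (a sketch inferred from the description; the struct fields and method names are assumptions, not the package's actual API):

    ```csharp
    using UnityEngine;
    using UnityEngine.EventSystems;

    // Per-device snapshot of input state, refreshed each frame.
    public struct TrackedDeviceModel
    {
        public Vector3 position;       // controller pose this frame
        public Quaternion orientation;
        public bool selectPressed;     // e.g. trigger state
        public bool selectWasPressed;  // last frame, for edge detection
    }

    // Base class owns the uGUI event plumbing; subclasses only
    // gather device state, keeping input sources decoupled from
    // EventSystem internals.
    public abstract class SketchUIInputModule : BaseInputModule
    {
        protected abstract void UpdateModels();

        // Turn a model into UI events: raycast from its pose, diff
        // against last frame, emit enter/exit/press/release.
        protected void ProcessTrackedDevice(ref TrackedDeviceModel model)
        {
            // Event conversion omitted for brevity.
            model.selectWasPressed = model.selectPressed;
        }

        public override void Process()
        {
            UpdateModels(); // subclass feeds models back via ProcessTrackedDevice
        }
    }
    ```

    The payoff of this split is that supporting a new input source means writing one model-updating subclass, not re-deriving the whole StandaloneInputModule.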

    Raycasters: You are bang on with that one! If you look at the New Input System's TrackedDeviceRaycaster, I extended the pointerEventData type and created a new raycaster that can handle 3D coordinates not tied to any specific camera. It should look really similar to a few other solutions out in the field. The tricky part was writing the extended pointerEventData so it doesn't get picked up by the 2D Graphic Raycasters. It does graphics only (no physics raycaster), but it does handle physics and graphics occlusion. That raycaster works well for all uGUI types except the dropdown, due to how the dropdown does its drop-down effect. But that is being fixed and ported back to 2019.1.
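    The core idea of a camera-independent raycast is intersecting a world-space ray with the canvas plane directly. A minimal sketch of that geometry (a hypothetical helper, not the shipped TrackedDeviceRaycaster):

    ```csharp
    using UnityEngine;

    // Intersect a world-space ray with a canvas RectTransform directly,
    // skipping the camera/screen-point conversion entirely.
    public static class WorldSpaceCanvasRay
    {
        // Returns true if the ray hits inside the rect; localPoint is the
        // hit in the rect's local 2D coordinates.
        public static bool Raycast(Ray ray, RectTransform rect, out Vector2 localPoint)
        {
            localPoint = default;

            // Treat the canvas as an infinite plane through the rect.
            var plane = new Plane(rect.forward, rect.position);
            if (!plane.Raycast(ray, out float distance))
                return false;

            // Convert the 3D hit point into the rect's local space and
            // test it against the rect bounds.
            Vector3 worldPoint = ray.GetPoint(distance);
            Vector3 local = rect.InverseTransformPoint(worldPoint);
            localPoint = new Vector2(local.x, local.y);
            return rect.rect.Contains(localPoint);
        }
    }
    ```

    A full raycaster additionally sorts hits by distance and checks occlusion, as described above, but the plane intersection is the piece that removes the camera from the equation.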

    ETA: I want to stress that I'm no authority on this one, but I do know that we are scheduling approximately one month out (it's packaged with a few other thingamabobs too). That does not take into account business decisions, surprises, etc. that may come up, so this is a thought and not a promise.

    Hope all this info helps!
     
    ROBYER1, aaronfranke and joelybahh like this.
  9. mikewarren

    mikewarren

    Joined:
    Apr 21, 2014
    Posts:
    77
    @StayTalm_Unity, thanks for the information and the hard work. This sounds great.
     
  10. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    167
    After being told to use Curved UI a LOT, it would be nice to just see world-space canvas UI working in VR, especially as a lot of Unity courseware suggests or implies that using native UI in world space with VR is a straightforward endeavour.

    Keep us posted please!
     
    bemrlab likes this.
  11. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    167
    Is there any update on this? I keep downloading new alpha versions of Unity in the hopes of seeing this implemented
     
  12. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    71
    Still on its way.
    I wish I could say more, but I cannot. I'm going to stay true to my word and ping this thread once it's available.
    I can say it will be a separate package, so it won't need a specific alpha version.
     
    ROBYER1 likes this.