Hi, simple question: Is there any native Unity support for using a world-space canvas UI with controllers based on Unity's XR namespace? If not, what are the steps to be able to use a controller as I would use a mouse in a normal canvas? I would like to have the controller(s) send clicks and move/drag/drop events to the canvas based on their actual position. The only definitive source on this I found was this Oculus article (https://developer.oculus.com/blog/unitys-ui-system-in-vr/), but it is nearly 4 years old and I suspect/hope that some things have changed since then (and, also, I would like to remain SDK-independent and only use Unity). Philip
Hello! I replied in another thread, but since it was a tangent there, it makes sense to do the same here. A generalized solution exists, and it is on its way through our testing and shipping processes. Sorry I can't just drop it here, but stay tuned, and I'll reply to this thread once it's available.
Moving this over to your own thread: Soooo, within 2019 is the plan at the moment, but I don't make those plans, so they can change. And it is similar to the Oculus solution you linked, in that it allows 3D tracked devices to trigger world-space UI as if they were mice. So you can use dropdowns, scrollbars, buttons, textboxes, etc. Each tracked device, as well as the mouse and touch, is treated independently, so you can still use Spectator mode to have both player and spectator UIs independently controllable, or have one hand using a scrollbar while the other presses a button. It is UI-based, so it uses the EventSystem, replaces things like the StandaloneInputModule, and has a custom UI Graphic Raycaster for 3D devices.
@StayTalm_Unity Thanks for your reply, that sounds pretty good. I think I will go with a minimal custom implementation for now and switch once it is available. Thanks, looking forward to the notification ;-)
I am curious when this rolls out too. I wrote a VR Input Module, but it seems to be in a constant battle with the StandaloneInputModule. They both kind of work together, but the desktop UI interactions only work when I am constantly moving my mouse. When watching the Event System debugger I can see my XR Input Module keeps overriding the normal one unless I'm constantly moving my mouse. It gets even weirder when both the VR and desktop users try to interact with their respective UIs at the same time.
I managed to get it working for both VR and desktop UI interactions simultaneously. I had to manually call Process on the StandaloneInputModule from my XRInputModule, because the EventSystem component just grabs the first BaseInputModule found and calls Execute on it. Bit of a hack, but it's working great. I also tried inheriting from the StandaloneInputModule instead, but that doesn't work because of private variables that I need but can't access. So I rewrote an almost-exact copy of the StandaloneInputModule to work with both Unity generic XR inputs and desktop.
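The chaining described above looks roughly like this. This is a hedged sketch, not the exact code: `XRInputModule` and `ProcessXRPointers` are placeholder names for your own module and pointer logic, and it assumes both modules sit on the EventSystem GameObject with the standalone one disabled so the EventSystem picks the XR module as the active one.

```csharp
using UnityEngine.EventSystems;

public class XRInputModule : BaseInputModule
{
    StandaloneInputModule m_DesktopModule;

    protected override void Start()
    {
        base.Start();
        // The StandaloneInputModule component should be disabled so the
        // EventSystem doesn't select it as the active module itself.
        m_DesktopModule = GetComponent<StandaloneInputModule>();
    }

    public override void Process()
    {
        // Handle tracked-device pointers first (raycasts, clicks, drags).
        ProcessXRPointers();

        // Then manually forward to the standard module so mouse/keyboard
        // UI keeps working; the EventSystem itself only calls Process()
        // on the first active module it finds.
        if (m_DesktopModule != null)
            m_DesktopModule.Process();
    }

    void ProcessXRPointers()
    {
        // your XR pointer handling goes here
    }
}
```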
FYI, I'm interested in an XR Input Module as well and asked pretty much the same exact question here. Thanks to @StayTalm_Unity for taking the time to reply to both. Any idea (best guess) when the module might be available? I'm just trying to decide if I should wait or attempt to write something on my own. I'm wondering if there's a better way to tie into the event system for the raycasting into the scene. I know that operating a uGUI control involves a (Graphics) raycast from a camera position, through a screen pixel point into the scene to do an intersection test. If I already have a world space raycaster, I'd like to be able to skip the whole screen, point, camera operation. Lastly, any speculation on how the new input system or UIElements efforts will affect uGUI down the road?
Hello! I'm going to do this in reverse order:

UIElements: That is going to be a different endeavor and something I still need to look into.

New Input System: It has its own uGUI Input Module, also written by me, so they share a lot of the same concepts. That one is public: https://github.com/Unity-Technologi.../com.unity.inputsystem/InputSystem/Plugins/UI The XRInput-specific one will look very similar; both are based on the UIInputModule and device models in that folder. I tried to make it much more easily extensible than the StandaloneInputModule. The basic design is that you inherit from UIInputModule, create ____Model structs, and on a per-frame basis update those and pass them back down to the UIInputModule, which converts them into actual UI events. I wanted to separate the input sources from the internal uGUI concepts and event processing, to untangle hooking uGUI up to new sources, and I'm pretty happy with the result. If you want to start on your own, I'd suggest starting from that point.

Raycasters: You are bang on there! If you look at the New Input System's TrackedDeviceRaycaster, I extended the pointerEventData type and created a new raycaster that can handle 3D coordinates not connected to any specific camera. It should look really similar to a few other solutions out in the field. The tricky part was writing the extended pointerEventData so that it doesn't get picked up by the 2D Graphic Raycasters. It does graphics only (no physics raycaster), but it does handle physics and graphics occlusion. That raycaster works well for all uGUI types except the dropdown, due to how the dropdown does its dropping-down effect. But that is being fixed and ported back to 2019.1.

ETA: I want to stress that I'm no authority on this one, but I do know that we are scheduling approximately one month out (it's packaged with a few other thingamabobs too). That does not take into account business decisions, surprises, etc. that may come up, so this is a thought and not a promise. Hope all this info helps!
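As a rough illustration of that design (the struct and method names below are made up, not the actual package API; see the linked folder for the real UIInputModule and device models):

```csharp
// Hypothetical sketch of the "device model" pattern: a plain struct holds
// the per-frame state of one tracked device; the module polls the device,
// fills the struct, and hands it to shared code that diffs it against the
// previous frame and emits the corresponding uGUI events.
using UnityEngine;
using UnityEngine.EventSystems;

public struct TrackedDeviceState
{
    public Vector3 position;     // world-space device position
    public Quaternion rotation;  // world-space device orientation
    public bool select;          // "click" control state this frame
}

public class MyXRUIModule : BaseInputModule
{
    TrackedDeviceState m_RightHand;

    public override void Process()
    {
        // 1. Per frame: update the model from the input source.
        ReadRightHand(ref m_RightHand);

        // 2. Pass it back down to be converted into actual UI events
        //    (enter/exit, down/up, click, drag...).
        ProcessTrackedDevice(ref m_RightHand);
    }

    void ReadRightHand(ref TrackedDeviceState state) { /* poll device here */ }
    void ProcessTrackedDevice(ref TrackedDeviceState state) { /* emit events here */ }
}
```

The point of the separation is that adding a new input source only means filling in another model struct, without touching the event-processing code.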
After being told to use Curved UI a LOT, it would be nice to just see world-space canvas UI working in VR, especially as a lot of Unity courseware suggests or alludes that using native UI in world space with VR is a straightforward endeavour. Keep us posted please!
Is there any update on this? I keep downloading new alpha versions of Unity in the hopes of seeing this implemented
Still on its way. I wish I could say more, but I cannot. I'm gonna stay true and ping this thread once it's available, though. I can say it will be a separate package, so it won't need a specific alpha version.
@StayTalm_Unity hi, is there any update on this? In addition, will there be any built-in solutions for interacting with UI through a laser pointer (like in SteamVR)?
Create a GameObject called Line which has the LineRenderer for the pointer, and a camera that is disabled and has an FOV of 1. Then use this code in a custom input module. CasterRight is the camera on the right controller pointer; I do this for both left and right hands. Use this as a replacement for the InputModule. I also put a script on each canvas that adds itself to the Canvases list. Most of it is derived from VR Andrew's tutorial video that explains how to do it.

Code (CSharp):

    public Camera CasterRight;
    public Camera CasterLeft;
    [HideInInspector] public List<Canvas> Canvases;
    public GameObject RightCurrent;
    public GameObject LeftCurrent;
    GameObject RightPressed;
    GameObject LeftPressed;
    public PointerEventData RightData { get; private set; }
    public PointerEventData LeftData { get; private set; }

    private void ProcessRight()
    {
        if (ControllerManager.Right.GetState() != ControllerState.Unavailable && CasterRight != null)
        {
            // set camera for canvases to right camera
            foreach (Canvas canvas in Canvases)
                canvas.worldCamera = CasterRight;

            // reset data and point through the center of the caster camera
            RightData.Reset();
            RightData.position = new Vector2(CasterRight.pixelWidth / 2, CasterRight.pixelHeight / 2);

            // raycast
            eventSystem.RaycastAll(RightData, m_RaycastResultCache);
            RightData.pointerCurrentRaycast = FindFirstRaycast(m_RaycastResultCache);
            RightCurrent = RightData.pointerCurrentRaycast.gameObject;
            m_RaycastResultCache.Clear();

            // handle hover
            HandlePointerExitAndEnter(RightData, RightCurrent);

            // handle press / release
            if (ControllerManager.Right.IsPressed(Buttons.Trigger))
                OnPressRight();
            else
                OnReleaseRight();
        }
    }

    void OnPressRight()
    {
        // set raycast
        RightData.pointerPressRaycast = RightData.pointerCurrentRaycast;

        // check for an object hit and send the down event
        GameObject newPointerPress = ExecuteEvents.ExecuteHierarchy(RightCurrent, RightData, ExecuteEvents.pointerDownHandler);

        // if no down handler, try to get a click handler instead
        if (newPointerPress == null)
            newPointerPress = ExecuteEvents.GetEventHandler<IPointerClickHandler>(RightCurrent);

        // if we exited the element and still have the button pressed, send the up event
        if (RightPressed != null && RightCurrent == null)
            ExecuteEvents.Execute(RightPressed, RightData, ExecuteEvents.pointerUpHandler);

        // set RightData
        RightData.pressPosition = RightData.position;
        RightData.pointerPress = newPointerPress;
        RightData.rawPointerPress = RightCurrent;
        RightPressed = newPointerPress; // save pressed element for later use when released
    }

    void OnReleaseRight()
    {
        // send the up event to whatever was pressed
        ExecuteEvents.Execute(RightData.pointerPress, RightData, ExecuteEvents.pointerUpHandler);

        GameObject pointerUpHandler = ExecuteEvents.GetEventHandler<IPointerClickHandler>(RightCurrent);
        if (RightPressed != null && pointerUpHandler == RightPressed)
        {
            // if we are still over the element we first pressed, send the click event
            ExecuteEvents.Execute(RightPressed, RightData, ExecuteEvents.pointerClickHandler);
            Debug.Log("Clicked - " + RightPressed);
        }

        // clear the selected gameobject
        eventSystem.SetSelectedGameObject(null);

        // reset RightData
        RightData.pressPosition = Vector2.zero;
        RightData.pointerPress = null;
        RightData.rawPointerPress = null;
        RightPressed = null;
    }

David Watt
Hollow World Games
https://www.youtube.com/channel/UCjQTmu4jMw2nNtpNbiUSjrg
Hi David, thanks for sharing this. It could be a good workaround to use in the meantime until the official Unity VR UI package is released. I will have to do some performance testing, as I'm wondering what the performance implications are of two additional cameras rendering the UI almost all the time, especially on lower-end platforms like the Oculus Quest/Go where draw calls cost you big time.
The cameras are unchecked so they don't render; they are only used for the raycast.
I followed the tutorial that you linked and got it working fine too, thanks for the tip. However, it would obviously be preferable to use the official Unity solution once that is finally released (feels like it's taking ages). Having to set the world-space camera for events is not fun, and hopefully the official Unity solution avoids that limitation for left/right-hand users of UI pointers. @StayTalm_Unity, we wait patiently for your solution to come through the pipeline; please let us know when you can share more, as our own projects are being delayed by this.
For anyone wondering what the new XR Interaction package is like: I had a go with it at Unity Unite Copenhagen and passed some feedback on about it, including slow lerping of held object positions (seemingly laggy) and also some issues with UI deactivation on buttons etc. once the UI raycaster is no longer pointing at the UI. It also seems some work is being done on the inputs for it, as the current state of XR input means you have three options: the legacy Input Manager, XR Input in a C# script, or the new Input System coming in 2019.3/2020. There is a sneak peek of it here
Thanks robyer1, if you have any other feedback drop me a line! Yep, we're going to launch with a simple input implementation to get an initial version out. We're going to expand the input as soon as I can finish working on it after launch.
Really looking forward to this becoming available! At the moment I have a system similar to the one by Andrew posted above (except based on the Unity XR input APIs), but it doesn't currently handle dragging/scrolling correctly. I'm kinda blocked by this, but ideally I don't want to spend any significant time developing and debugging yet another InputModule/Raycaster for VR input if Unity already basically has a solution that will make all this hackery redundant. To double-check though: is this not going to be available before 2019.3/2020? Is it not maybe possible to see some kind of preview package earlier than that? Thanks for any clarification!
It will be a preview package with 19.3. We're trying to get it out as soon as we can but we're committing to 19.3.
Okay, thanks @Matt_D_work! Actually, today I was able to iterate my InputHandler to get drag/drop working, so now I think most canvas interaction is working for me using XR input handling. Hopefully this will tide me over until the new functionality is available.
Really looking forward to this, I have a project starting full alpha development soon looking to use all cross-platform VR capabilities including the XR Interaction Package. Will there be an announcement post or blog post when it is available? (Apologies for seeming impatient, your hard work on this area of Unity is really appreciated and I want to provide any useful feedback I can once the package is out)
Yes, apparently there are a lot of people waiting for this. I find myself refreshing the Package Manager every other day in case it's out, hopefully it's close. (fingers crossed)
It took a long time, but finally: https://blogs.unity3d.com/2019/12/17/xr-interaction-toolkit-preview-package-is-here/ and https://forum.unity.com/threads/xr-interaction-toolkit-preview-release.795684/ This contains the XR UI integration, and if you saw it at Unite, it now merges the UI and Interactable systems so that UI is a first-class citizen, so to speak. You can use all the same utilities and options of the XRRayInteractor. This toolkit will see ongoing improvements; please leave feedback, questions, or opinions about it, and I'll try to keep a close eye on it, respond, and listen.
Hi StayTalm, I would like to understand how to get the "onHover" event present on XRBaseInteractable for UI elements as well, when the laser is hovering over them. Which interface implements IPointerEnterHandler for the XR UI?
Should we be using the XR interaction toolkit or the new input system 1.0 tools for VR input? I've spent basically ALL day on this trying to get even a simple button press from the left motion controller on my rift to read and I haven't been able to do it. All I want is something like: if(XR.lefthand.aButton.isPressed){ // do something! } What would be the equivalent with the interaction toolkit...or input system for that matter?
You will have to write a layer over XRInput to do that. That is what I have done, as it only gives you up or down state. I also did it to unify access between various input sources. The new Input System is a waste for me, as it won't recognize controllers it doesn't know. Frankly, with the whole XR Management and SteamVR debacle going on, I am still using legacy VR and SteamVR input. If you are planning to support SteamVR, I highly recommend using only legacy VR and SteamVR input, however awful it may be.
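A minimal sketch of the kind of raw XRInput read such a layer wraps (plain UnityEngine.XR, no toolkit; note this is an assumption-laden example, and CommonUsages.primaryButton, the A/X button on Oculus Touch, still only gives a pressed/unpressed bool):

```csharp
using UnityEngine;
using UnityEngine.XR;

public class LeftHandButtonCheck : MonoBehaviour
{
    void Update()
    {
        // Grab whatever device is currently assigned to the left hand node.
        InputDevice leftHand = InputDevices.GetDeviceAtXRNode(XRNode.LeftHand);

        // TryGetFeatureValue returns false if the device or usage is unavailable.
        if (leftHand.isValid &&
            leftHand.TryGetFeatureValue(CommonUsages.primaryButton, out bool pressed) &&
            pressed)
        {
            // do something!
            Debug.Log("Left-hand primary button is held");
        }
    }
}
```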
Just putting this here because for some reason this was hard to find in documentation and forums... maybe it's just me, but still. Credit to a friend for getting this code to me. Also, if you don't have the headset ON, the boolean is not triggered. So while it was working with no errors, it won't actually 'work'. I was not getting any input from the controllers because I did not have the headset on (I assume it's the proximity sensor). This took me a while to figure out, as I assumed I wouldn't have to have the headset on to test just the motion controllers.

Code (CSharp):

    using System.Collections;
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.XR.Interaction.Toolkit;

    public class ControllerManager : MonoBehaviour
    {
        public XRController leftHand;
        public InputHelpers.Button button;

        void Update()
        {
            bool pressed;
            leftHand.inputDevice.IsPressed(button, out pressed);
            if (pressed)
            {
                Debug.Log("Hello - " + button);
            }
        }
    }

Put this script on the XRRig, drag in the motion controller, and voila!
When using the provided InputHelpers in the XR Toolkit, it seems everything is binary, as IsPressed only returns a boolean. Does the XR Toolkit provide a way to get the analog value of the joystick axes on Oculus controllers, something like GetAxis()? I couldn't find this in the Unity XR Toolkit.
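(The underlying UnityEngine.XR API does expose analog values via feature usages, at least. A sketch of reading the thumbstick that way, assuming a standard right-hand controller:)

```csharp
using UnityEngine;
using UnityEngine.XR;

public class ThumbstickReader : MonoBehaviour
{
    void Update()
    {
        InputDevice right = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);

        // primary2DAxis is the thumbstick/trackpad; both components are in [-1, 1].
        if (right.isValid &&
            right.TryGetFeatureValue(CommonUsages.primary2DAxis, out Vector2 axis))
        {
            Debug.Log("Thumbstick: " + axis);
        }
    }
}
```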
@StayTalm_Unity, hi there! I'm currently trying to set up uGUI for XR with the Unity Input System package v1.0.2. I'm using an HTC Vive headset. First I just followed the guideline from here: https://docs.unity3d.com/Packages/com.unity.inputsystem@1.0/manual/UISupport.html. But instead of using DefaultInputActions I made a UI action map in my own input actions asset, and then chose that asset in the inspector of my InputSystemUIInputModule component. The input action asset is activated and fires actions; I double-checked this using the debugger. On the scene I have a large button on a canvas. With this setup nothing works at all: when I point at the button on the canvas with my controller, nothing happens. I dug inside the TrackedDeviceRaycaster and found that the raycast is performed along the forward axis of the controller (line 111 in TrackedDeviceController.cs). I then double-checked that I was pointing at the button with my controller's forward axis. Same result, nothing seems to work. Then I started to test different angles: I literally attempted pointing at the button and then clicking on it from every possible orientation of the controller. Again the same result. Then I returned to the code, and with the debugger found that execution always ends on line 194 of TrackedDeviceRaycaster.cs. There is a line:

Code (CSharp):

    if (RayIntersectsRectTransform(graphic.rectTransform, ray, out worldPos, out distance))

and it seems that the method RayIntersectsRectTransform for some reason never returns true in my case. Any help is appreciated.
Update: spent some time testing again and found a position of the controller in which everything works fine. My guess is that the tracked object position is not the world position of the controller's game object.
Update: found some information here https://forum.unity.com/threads/tracked-devices-and-ui-raycasting.874951/#post-7287043. It seems that this is a bug in Input System versions lower than 1.1.0.
Update: finally solved. My steps to fix:
- updated the Input System package to 1.1.1
- set the XR Tracking Origin property in the inspector (the XR Rig is the parent of my camera and controller objects)