Hello, I am wondering if there's an easy way to use Unity's UI system in VR, specifically with the Oculus Go. I found an article on Oculus's website, but it targets Unity 5.2; it's outdated and doesn't work properly in Unity 2019, throwing errors that a type cannot be found and that some methods hide inherited members. Basically, I want a simple flat world-space UI in front of the camera, with the controller raycasting toward it to place a virtual cursor for selecting UI elements.
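To make the setup concrete, here's a rough sketch of the kind of thing I'm after (all names are placeholders, not anything from the Oculus article): raycast from the tracked controller's transform and park a small cursor object wherever the ray hits the world-space canvas, assuming the canvas has a collider on it.

```csharp
using UnityEngine;

// Hypothetical sketch: move a cursor quad to where the controller's
// forward ray hits the world-space UI canvas (which needs a BoxCollider).
public class ControllerUICursor : MonoBehaviour
{
    public Transform controller; // tracked controller anchor (e.g. from OVRCameraRig)
    public Transform cursor;     // small quad/sprite acting as the pointer dot

    void Update()
    {
        // Cast up to 10 m along the controller's forward direction.
        if (Physics.Raycast(controller.position, controller.forward,
                            out RaycastHit hit, 10f))
        {
            cursor.position = hit.point;
            // Face the cursor along the surface normal so it lies flat on the canvas.
            cursor.rotation = Quaternion.LookRotation(hit.normal);
        }
    }
}
```

The part I'm missing is how to turn that hit point into actual UI events (hover, click) on the canvas, which is what the Oculus sample apparently did.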
Hello! This is a very in-demand feature, and it's coming down the pipe as we speak. I'll be reporting back on it here: https://forum.unity.com/threads/unitys-ui-and-xr-input.671893/#post-4544146 Sorry to make you wait...
Right now my solution is to design my menus to be driven by the controller's stick/touchpad rather than a pointer. This also makes it much easier if you ever want to expand your game to a PC port, because on PC you'll be using the same Vertical, Horizontal, Submit, and Cancel inputs. I'd highly recommend just using a Standalone Input Module, configuring your axes in Project Settings > Input, and using the OpenVR control scheme. You can create axes like VR_LeftHand_Horizontal and VR_RightHand_Horizontal (made-up names), then in your Standalone Input Module just type in 'VR_RightHand_Horizontal', and the joystick or touchpad on your VR controller (depending on which device you have) will work with your menus. OpenVR is amazing because it lets you target WMR, Vive, and Oculus devices; Unity really knocked it out of the park with the implementation. It's also extremely lightweight: you just build a regular Windows application. It's great.
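If it helps, the wiring-up step can also be done from a script instead of the inspector. This is just a sketch with my made-up axis names from above; the axis strings have to match entries you've actually defined in Project Settings > Input, and the StandaloneInputModule properties shown here are the standard public ones on that component.

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Attach this next to the EventSystem. Points the Standalone Input Module
// at custom VR axes (the names below are placeholders; they must match
// axes defined in Edit > Project Settings > Input).
public class VRMenuInputSetup : MonoBehaviour
{
    void Start()
    {
        var module = GetComponent<StandaloneInputModule>();
        module.horizontalAxis = "VR_RightHand_Horizontal";
        module.verticalAxis   = "VR_RightHand_Vertical";
        // Submit/Cancel can stay on their defaults, or be remapped
        // to controller buttons the same way:
        module.submitButton = "VR_Submit";
        module.cancelButton = "VR_Cancel";
    }
}
```

With that in place, the standard Selectable navigation (Button, Slider, etc.) on your world-space canvas responds to the VR controller exactly as it would to a gamepad.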