
Current state of Unity UI components for VR?

Discussion in 'UI Toolkit' started by mikewarren, May 9, 2019.

  1. mikewarren

    Joined:
    Apr 21, 2014
    Posts:
    109
    I'm about to start building a Unity UI (2019) that will need to run in both a desktop mode (mouse interactions) and in VR (using a world-space raycaster and controller button). I know the Unity UI system is pretty heavily designed around the concept of a screen, so I've been researching articles and implementations like this one from Oculus for working around that restriction.

    https://developer.oculus.com/blog/unitys-ui-system-in-vr/

    It's been several years since that article was written and things inevitably evolve. Are there better resources for using Unity UI components in VR now? Ideally, I'd like to stick to an all-Unity solution that will continue to evolve with new versions.
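
    For reference, the crudest workaround I've seen suggested is to skip the input module entirely and just forward clicks through ExecuteEvents from a controller ray. A rough sketch (the trigger check and field names are just placeholders for whatever SDK ends up in the project):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.EventSystems;

    // Rough sketch: cast a ray from the controller and, if it hits a world-space
    // UI element that has a collider on it, forward a synthetic click through the
    // uGUI event system. Field names and the trigger check are placeholders.
    public class ControllerUIPointer : MonoBehaviour
    {
        [SerializeField] Transform rayOrigin;     // the controller transform
        [SerializeField] float maxDistance = 10f;

        void Update()
        {
            // Placeholder button query; swap in the VR SDK's trigger here.
            if (!Input.GetButtonDown("Fire1"))
                return;

            if (Physics.Raycast(rayOrigin.position, rayOrigin.forward,
                                out RaycastHit hit, maxDistance))
            {
                // Send the click to the first handler on or above the hit object.
                var pointer = new PointerEventData(EventSystem.current);
                ExecuteEvents.ExecuteHierarchy(hit.collider.gameObject, pointer,
                                               ExecuteEvents.pointerClickHandler);
            }
        }
    }

    That handles clicks but not hover or drag, which is why I'm hoping something more complete exists.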

    Thanks.

    Mike
     
  2. uDamian

    Unity Technologies

    Joined:
    Dec 11, 2017
    Posts:
    1,231
    You might want to try the Unity UI forum section:
    https://forum.unity.com/forums/unity-ui-textmesh-pro.60/

    This section is for UIElements, a new retained-mode UI framework that is currently only supported in the Editor. We will have a runtime solution eventually as well, but Unity UI (more commonly called uGUI) is something else. :)
     
  3. mikewarren

    Joined:
    Apr 21, 2014
    Posts:
    109
    @uDamian

    Thanks for the heads-up. I became aware of the UIElements development from the recent blog post. I'm excited by the prospect, but what I really need now is a runtime UI that works with VR (raycast, controller) and mouse input. I'm by no means an expert on the uGUI system, but I was hoping there might be, for instance, a Unity-supported VR InputModule by now.
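
    To illustrate the kind of thing I mean, here's roughly what I was picturing such a module would do, following the pattern from the Oculus article. This is only a sketch; every name is made up, and the screen-position trick is just an approximation of the true world-space raycasting the Oculus sample does with its own raycaster.

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.EventSystems;

    // Sketch of a controller-driven input module: build a PointerEventData whose
    // screen position corresponds to where the controller is aiming, let the
    // EventSystem's raycasters do the hit test, then dispatch enter/exit/click.
    // Not an official API; the Oculus sample instead replaces GraphicRaycaster
    // with a true world-space raycaster (OVRRaycaster).
    public class SimpleVRInputModule : BaseInputModule
    {
        public Transform rayOrigin;    // controller transform
        public Camera eventCamera;     // camera assigned to the world-space Canvas

        PointerEventData pointerData;

        public override void Process()
        {
            if (pointerData == null)
                pointerData = new PointerEventData(eventSystem);

            // Approximate the controller ray as a screen position on the event
            // camera so the stock GraphicRaycaster can be reused unmodified.
            Vector3 aimPoint = rayOrigin.position + rayOrigin.forward * 10f;
            pointerData.position = eventCamera.WorldToScreenPoint(aimPoint);

            eventSystem.RaycastAll(pointerData, m_RaycastResultCache);
            pointerData.pointerCurrentRaycast = FindFirstRaycast(m_RaycastResultCache);
            m_RaycastResultCache.Clear();

            // Fire enter/exit events as the pointer moves between elements.
            GameObject target = pointerData.pointerCurrentRaycast.gameObject;
            HandlePointerExitAndEnter(pointerData, target);

            // Placeholder button query; swap in the VR SDK's trigger here.
            if (target != null && Input.GetButtonDown("Fire1"))
            {
                ExecuteEvents.ExecuteHierarchy(target, pointerData,
                                               ExecuteEvents.pointerClickHandler);
            }
        }
    }

    On desktop I'd presumably still fall back to the standard StandaloneInputModule for mouse input, which is why a supported, switchable solution would be so welcome.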
     
  4. uDamian

    Unity Technologies

    Joined:
    Dec 11, 2017
    Posts:
    1,231