
Feedback You didn't account for some use cases in the event dispatcher (VR/AR)

Discussion in 'UI Toolkit' started by kataS_94, Oct 23, 2021.

  1. kataS_94


    Feb 3, 2016

    I've been doing some tests with the 2021.2 beta. I'm mainly a VR developer and I'm very interested in the future of UI Toolkit for VR. Since you haven't implemented world-space panels yet, I put together some workarounds with a RenderTexture and redirection of events to the PanelEventHandler (converting the event position so it projects properly from the event camera), so I could test things out in VR.
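    For context, here's a minimal sketch of that kind of redirection, assuming a quad with a MeshCollider displaying the panel's RenderTexture, and using PanelSettings.SetScreenToPanelSpaceFunction (available since 2021.2). It's illustrative only, not necessarily the exact workaround in the shared script:

    ```csharp
    using UnityEngine;
    using UnityEngine.UIElements;

    // Sketch: maps pointer screen positions onto a world-space quad that
    // shows the panel's RenderTexture. Assumes this GameObject has a
    // MeshCollider and panelSettings.targetTexture is that RenderTexture.
    public class WorldSpacePanel : MonoBehaviour
    {
        public PanelSettings panelSettings;
        public Camera eventCamera;

        void OnEnable()
        {
            // UI Toolkit calls this to translate screen coords to panel coords.
            panelSettings.SetScreenToPanelSpaceFunction(ScreenToPanel);
        }

        Vector2 ScreenToPanel(Vector2 screenPosition)
        {
            // NaN position means "pointer is not over this panel".
            var invalid = new Vector2(float.NaN, float.NaN);

            Ray ray = eventCamera.ScreenPointToRay(screenPosition);
            if (!Physics.Raycast(ray, out RaycastHit hit, 100f))
                return invalid;
            if (hit.collider.gameObject != gameObject)
                return invalid;

            // Map the texture coordinate of the hit to panel pixels.
            Vector2 uv = hit.textureCoord;
            uv.y = 1f - uv.y; // panel space has a top-left origin
            RenderTexture rt = panelSettings.targetTexture;
            return new Vector2(uv.x * rt.width, uv.y * rt.height);
        }
    }
    ```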

    As you can imagine, in VR we can have two pointers, one per hand (or even more, like a raycast from the head). I have tried different things, like setting one of the pointers to the mouse id (0), setting them to 1 and 2 (base touch id and second touch id), or even setting one to the base touch id and the other to the base pen id (21).
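    For reference, the id ranges I'm talking about come from UnityEngine.UIElements.PointerId; the assignments below are just my experiments, not an official mapping:

    ```csharp
    using UnityEngine.UIElements;

    // Pointer id ranges mentioned above, as exposed by PointerId.
    public static class VrPointerIds
    {
        public static readonly int Mouse      = PointerId.mousePointerId;         // 0 (mouse id)
        public static readonly int LeftHand   = PointerId.touchPointerIdBase;     // 1 (base touch id)
        public static readonly int RightHand  = PointerId.touchPointerIdBase + 1; // 2 (second touch id)
        public static readonly int PenVariant = PointerId.penPointerIdBase;       // 21 (base pen id)
    }
    ```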

    It works fairly well, but the problem is that I don't think you took into account that there can be multiple pointers at the same time that behave mouse-like (each hovering over a different element at the same time), which is the special case here in VR.

    Maybe I'm giving feedback too far in advance, and you have already planned this kind of interaction for the future world-space panel implementation. But just in case, please consider not restricting touch pointer ids from producing simple move events and triggering hover states. Ultimately, I don't even think it is necessary to make a distinction between mouse and touch pointers: just make any pointer id capable of hovering, independently of whether it triggers move events (which would not happen on a touch screen). Also, you should not rely on pointer id order; they should all have the same importance, no matter whether one is higher than another. I know this may be harder to implement, but it will be more robust for any use case. Being able to handle UI with both hands at the same time is a huge attraction in VR.

    In fact, setting one of my pointers to the base touch id and the other to the base pen id allowed me to hover elements with both pointers (because both were marked as primary pointer events), but whenever both pointers are over the panel at the same time, only the one with the highest id hovers items and the other one is ignored.

    Thanks a lot for the hard work on this project. I think it's pretty amazing that we can have some web standards in Unity! Just don't forget about the VR community ^^

    EDIT: I decided to share the script that I use to create world-space panels, for those who are curious or may want to experiment -->
    Last edited: Oct 27, 2021
    vsx-bieber and Max_Aigner like this.
  2. uBenoitA


    Unity Technologies

    Apr 15, 2020
    Hi kataS_94,

    Your use case is a very interesting one, and it's something we don't usually have in mind when testing or designing. By default we mainly try to match the design of web controls, which, as you rightly pointed out, is not the only thing Unity users might be interested in. Definitely something to add to our considerations in the future!

    As for the specific issue about hovering with two pointers, I can tell you that we've fixed a lot of issues with the hover state when using touch or pen input, and there's a good chance that your specific issue has changed between the version you tested and the absolute latest version of Unity. I know the hover state used to be driven by MouseEnterEvents and MouseLeaveEvents, which are only generated by primary pointers, but since 2022.1.0a12 it is aware of PointerEnterEvents and PointerLeaveEvents of all kinds. There's also another fix in the works for the case where multiple pointers are inside the same element and only one of them leaves, which will no longer remove the hover state the way it used to.

    I hope you'll be able to test again when all those fixes are out in public. If it's still behaving the way you describe in the 2022.1 beta, then it might be a good idea to report bugs about it. We're also working hard on better documentation of the exact expected behavior of UI Toolkit, so that you can know whether what you're asking for is a change in the specification or simply a bug fix. Hope this helps!
    Timboc likes this.
  3. kataS_94


    Feb 3, 2016
    Hi uBenoitA,

    Thanks a lot for the reply! I will keep testing when these fixes are released to the public.

    I would also like to mention something I forgot in the main post, related to the screen position data in pointer events. As you can imagine, defining VR/AR pointer events in terms of screen position is kind of pointless. But since that is the way UI Toolkit and Unity's old UI system work, custom VR/AR input modules like the XRUIInputModule from Unity's XR Interaction Toolkit package (currently in pre-release) have to resort to workarounds like setting up a fake screen position in the PointerEventData, so the Canvas can reproject it from the camera to get the actual pointer position within the world-space panel. You could even be interacting with a world-space panel without looking at it, so there would be no camera actually facing the UI.
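    To illustrate the workaround I mean (a sketch of the idea only, not the actual XRUIInputModule code): the module raycasts from the controller, then back-projects the world-space hit through the event camera so the rest of the pipeline can keep thinking in screen coordinates:

    ```csharp
    using UnityEngine;

    // Sketch of the fake-screen-position trick described above
    // (illustrative; names here are my own, not from XRUIInputModule).
    public static class XrPointerProjection
    {
        public static bool TryGetFakeScreenPosition(
            Ray controllerRay, Camera eventCamera, out Vector2 screenPosition)
        {
            screenPosition = default;
            if (!Physics.Raycast(controllerRay, out RaycastHit hit, 100f))
                return false;

            // Back-project the world-space hit through the event camera so
            // downstream code (e.g. PointerEventData.position) receives a
            // screen point, even though the user never pointed "through"
            // that camera.
            screenPosition = eventCamera.WorldToScreenPoint(hit.point);
            return true;
        }
    }
    ```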

    It would be really nice if we could abstract pointer events in UI Toolkit away from this, so they directly provide the local pointer position within the panel, or maybe a Ray object, rather than a screen position. That way we could have cleaner and easier-to-understand implementations of custom input modules for wild use cases like XR. We could even get rid of the reference to the event camera, since we would have a more abstract way of defining the pointer (a Ray) that works for any use case.
    Kleptine and Timboc like this.