Bug Multiple UI RayInteractor won't send clicks

Discussion in 'XR Interaction Toolkit and Input' started by freso, Apr 23, 2021.

  1. freso

    freso

    Joined:
    Mar 19, 2013
    Posts:
    73
    I have two RayInteractors on my hand, each handling different UIs on different layers.
    It seems only one of them can handle clicks at a time.

    When I start with both active, the first one can't click on my in-game UI.
    When I toggle active off/on on the second GameObject, clicking starts working for the first one, but stops for the other.

    The problem is isolated per hand. I'm currently using a device-based controller.
     
    jason-vreps likes this.
  2. jason-vreps

    jason-vreps

    Joined:
    Sep 22, 2017
    Posts:
    16
    Can second this exact scenario. We have a hidden menu that is physically underground, and a render texture displays it on multiple interactable "TVs" in our main playing space. When you ray-over a TV, we activate a second XRRayInteractor below ground (pointing at the actual Unity UI), aimed at the same spot you were hovering.

    This approach worked flawlessly on the deprecated OVR drivers. With XR Interactions, we get limited interactions (hovers, scrollview drags, etc.), but we cannot click buttons!

    EDIT: Confirming the second part of your issue: if I disable and re-enable the main interactor on the left hand, for example, then the float-interactor that points at the Unity UI can click buttons. But then the main left interactor cannot.
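
    For anyone trying to reproduce the toggle workaround described above, a rough sketch follows. Only XRRayInteractor itself comes from the XR Interaction Toolkit; the class, field, and method names here are my own, and the assumption that one frame of being disabled is enough for the UI input module to re-register the interactor is untested:

    ```csharp
    using System.Collections;
    using UnityEngine;
    using UnityEngine.XR.Interaction.Toolkit;

    // Workaround sketch: disable one ray interactor for a frame so the
    // other interactor on the same hand regains UI click handling.
    public class RayInteractorClickWorkaround : MonoBehaviour
    {
        [SerializeField] XRRayInteractor interactorToCycle;

        // Call this when the interactor that should be clicking
        // stops receiving click events.
        public void CycleInteractor()
        {
            StartCoroutine(DisableForOneFrame());
        }

        IEnumerator DisableForOneFrame()
        {
            interactorToCycle.enabled = false;
            yield return null; // wait one frame before re-enabling
            interactorToCycle.enabled = true;
        }
    }
    ```

    Note this only moves the problem around, as described in the edit above: whichever interactor was last re-enabled gets the clicks, and the other loses them.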
     
    Last edited: May 18, 2021
  3. jason-vreps

    jason-vreps

    Joined:
    Sep 22, 2017
    Posts:
    16
    reinfeldx and freso like this.
  4. freso

    freso

    Joined:
    Mar 19, 2013
    Posts:
    73
    @jason-vreps : Yeah, I've noticed the response from Unity is much, much better when you build a sample project. It's basically a requirement if you want to avoid a thousand weird questions from first-line support.

    Good work! :)