
Resolved: Pulling trigger doesn't invoke Activate event

Discussion in 'XR Interaction Toolkit and Input' started by milosp, Jan 6, 2023.

  1. milosp

    milosp

    Joined:
    Dec 31, 2012
    Posts:
    23
    The game object has an "XR Simple Interactable" component, and the Activate and Deactivate events just swap the MeshRenderer material (please see the attached screenshot). The ray changes color when I point at the object, but nothing happens when I pull the trigger.

    If I wire up the same thing for Hover Entered and Hover Exited on that same object, it works, but Activate does not. Why?
     

    Attached Files:

  2. VRDave_Unity

    VRDave_Unity

    Unity Technologies

    Joined:
    Nov 19, 2021
    Posts:
    254
  3. milosp

    milosp

    Joined:
    Dec 31, 2012
    Posts:
    23
    Hi @VRDave_Unity

    Thank you for the answer. Coming from a desktop development (WPF) background, I didn't expect this interaction model, but your answer has led me to two solutions:
    v1: Check `Hover to Select` on the `XR Ray Interactor`; that way the Activated/Deactivated events fire (though the selection feels redundant).
    v2: In XRI Right Hand Interaction > Select, add a triggerPressed binding and use the SelectEntered/SelectExited events instead of Activate.
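    In code, v2 boils down to something like this (untested sketch; the class name and the highlight material field are just for illustration):

```csharp
// Rough sketch of v2: react to Select instead of Activate and swap the material.
// Assumes XRI 2.x; class and field names here are my own, not from the toolkit.
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

[RequireComponent(typeof(XRSimpleInteractable), typeof(MeshRenderer))]
public class SelectMaterialSwap : MonoBehaviour
{
    [SerializeField] Material highlightMaterial;   // material shown while selected

    Material originalMaterial;
    MeshRenderer meshRenderer;
    XRSimpleInteractable interactable;

    void Awake()
    {
        meshRenderer = GetComponent<MeshRenderer>();
        originalMaterial = meshRenderer.sharedMaterial;
        interactable = GetComponent<XRSimpleInteractable>();
    }

    void OnEnable()
    {
        interactable.selectEntered.AddListener(OnSelectEntered);
        interactable.selectExited.AddListener(OnSelectExited);
    }

    void OnDisable()
    {
        interactable.selectEntered.RemoveListener(OnSelectEntered);
        interactable.selectExited.RemoveListener(OnSelectExited);
    }

    void OnSelectEntered(SelectEnterEventArgs args) => meshRenderer.material = highlightMaterial;
    void OnSelectExited(SelectExitEventArgs args)  => meshRenderer.material = originalMaterial;
}
```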

    My assumptions were:
    Hover ~ OnMouseOver
    Activate/Deactivate ~ OnMouseDown/OnMouseUp
     
  4. VRDave_Unity

    VRDave_Unity

    Unity Technologies

    Joined:
    Nov 19, 2021
    Posts:
    254
    I see. Yeah, Activate/Deactivate are more like an additional 'depth' of interaction. I'm not sure what the web/desktop equivalent would be, other than a right-click while left-clicking/dragging:

    Hover ~ OnMouseOver
    Select/Deselect (Grab/Drag) ~ OnLeftMouseDown/OnLeftMouseUp
    Activate/Deactivate ~ (while LeftMouseDown): OnRightMouseDown/OnRightMouseUp

    The exception to this is UI interaction, where Select is equivalent to OnMouseDown/OnMouseUp as a 'click' event rather than a 'grab' type of event. I will take it back to the team to consider adding a checkbox to allow Interactables to be activated without selection.
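    To see that ordering concretely, you can log the three event pairs on an interactable; with the default setup, Activated/Deactivated only fire while the object is selected (rough, untested sketch):

```csharp
// Rough sketch: log Hover/Select/Activate events on an XR Simple Interactable
// to observe the interaction "depth" described above. Assumes XRI 2.x.
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

[RequireComponent(typeof(XRSimpleInteractable))]
public class InteractionEventLogger : MonoBehaviour
{
    void Awake()
    {
        var interactable = GetComponent<XRSimpleInteractable>();
        interactable.hoverEntered.AddListener(_ => Debug.Log("Hover Entered"));
        interactable.hoverExited.AddListener(_ => Debug.Log("Hover Exited"));
        interactable.selectEntered.AddListener(_ => Debug.Log("Select Entered"));
        interactable.selectExited.AddListener(_ => Debug.Log("Select Exited"));
        // With a default ray interactor, these two only fire while the object is selected.
        interactable.activated.AddListener(_ => Debug.Log("Activated"));
        interactable.deactivated.AddListener(_ => Debug.Log("Deactivated"));
    }
}
```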
     
  5. milosp

    milosp

    Joined:
    Dec 31, 2012
    Posts:
    23
    We definitely need new ways of thinking for spatial/gestural interactions, and a new nomenclature. In the long run I can see UI and 3D interactions merging as we all move away from flat UIs at some point.

    Words that come to mind are more abstract ones like "primary" and "secondary" interactions, especially since "Activate" could be confused with activating/enabling a GameObject.

    I have played more with selection, and another discrepancy from desktop development is that none of the four select modes mimics standard single-item selection when triggerPressed is bound to Select: I don't see a way to select one object and then click a second object to move the selection to it (the first trigger click on the second object just deselects the first one), the way a ListBox or RadioButton selection works.
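    Roughly, what I would expect to have to write myself to get that behaviour is something like this (untested sketch; all the names are made up):

```csharp
// Rough sketch of ListBox-style single selection layered on top of XRI:
// treat SelectEntered as a "click" that moves a logical selection tracked here.
// Class, field and material names are made up for illustration. Assumes XRI 2.x.
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

public class SingleSelectGroup : MonoBehaviour
{
    [SerializeField] XRSimpleInteractable[] items;   // the "list items"
    [SerializeField] Material normalMaterial;
    [SerializeField] Material selectedMaterial;

    XRSimpleInteractable current;

    void OnEnable()
    {
        foreach (var item in items)
            item.selectEntered.AddListener(OnItemSelected);
    }

    void OnDisable()
    {
        foreach (var item in items)
            item.selectEntered.RemoveListener(OnItemSelected);
    }

    void OnItemSelected(SelectEnterEventArgs args)
    {
        var item = args.interactableObject as XRSimpleInteractable;
        if (item == null || item == current)
            return;

        // Move the logical selection from the previous item to the new one.
        if (current != null)
            current.GetComponent<MeshRenderer>().material = normalMaterial;

        current = item;
        current.GetComponent<MeshRenderer>().material = selectedMaterial;
    }
}
```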
     
    VRDave_Unity likes this.
  6. gPerry

    gPerry

    Joined:
    Nov 27, 2013
    Posts:
    21
    Wouldn't the 'Allow Hovered Activate' option work for this? It is included on both Direct and Ray interactors, under the Selection Configuration options.
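    If you want to set it from script, I believe the corresponding property on the controller interactors is allowHoveredActivate (rough sketch, assuming XRI 2.1+ where the option exists):

```csharp
// Rough sketch: turn on "Allow Hovered Activate" from code.
// Assumes XRI 2.1+, where controller interactors expose allowHoveredActivate.
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

public class EnableHoveredActivate : MonoBehaviour
{
    void Awake()
    {
        // The same property should also be available on XRDirectInteractor.
        var rayInteractor = GetComponent<XRRayInteractor>();
        if (rayInteractor != null)
            rayInteractor.allowHoveredActivate = true;   // Activate can fire on hover, without Select
    }
}
```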
     
    jayliu50 and VRDave_Unity like this.
  7. VRDave_Unity

    VRDave_Unity

    Unity Technologies

    Joined:
    Nov 19, 2021
    Posts:
    254
    gPerry likes this.
  8. AustinMclEctro

    AustinMclEctro

    Joined:
    May 3, 2017
    Posts:
    16
    I'm experiencing what I believe is an XRI bug: any input action that uses an interaction (e.g. Hold, or Press with Trigger Behavior set to Release Only) does not drive any controller input. I'm seeing that only input actions without interactions work for input (Select, Activate, and UI Press).

    For example, on my LeftHand controller using the XR Controller (Action-based) script, I have the following:
    upload_2023-3-5_15-6-14.png

    The Player/InteractLeft action is as follows, using a Hold interaction, with its single binding using the path <XRController>{LeftHand}/primaryButton. I would expect this to work regardless of using any interaction type, but it does not:
    upload_2023-3-5_15-10-39.png

    Whereas the Player/InteractRight action has no interactions set, and it works:
    upload_2023-3-5_15-9-35.png

    And here is my RightHand XR Controller (Action-based) script:
    upload_2023-3-5_15-12-23.png

    What's going on here? Is this a bug that prevents actions with interactions on them from driving input?
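    In the meantime, the workaround I'm considering is to listen to Player/InteractLeft directly via the Input System instead of routing it through the XR Controller component; roughly (untested sketch):

```csharp
// Rough sketch of a possible workaround: subscribe to the action directly,
// so the Hold interaction is evaluated by the Input System itself.
using UnityEngine;
using UnityEngine.InputSystem;

public class InteractActionListener : MonoBehaviour
{
    [SerializeField] InputActionReference interactAction;   // point this at Player/InteractLeft

    void OnEnable()
    {
        interactAction.action.performed += OnInteract;   // fires when the Hold interaction completes
        interactAction.action.Enable();
    }

    void OnDisable()
    {
        interactAction.action.performed -= OnInteract;
    }

    void OnInteract(InputAction.CallbackContext ctx)
    {
        Debug.Log($"Interact performed via {ctx.interaction?.GetType().Name}");
    }
}
```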

    Some info on my setup:
    • Unity 2021.3.9f1
    • XR Interaction Toolkit 2.0.4
    • XR Plugin Management 4.2.1
    • OpenXR Plugin 1.4.2
    • Using Meta Quest 1 via AirLink
    • Platform: Windows