
Question: Porting from SteamVR to Unity XR

Discussion in 'XR Interaction Toolkit and Input' started by JudahMantell, May 3, 2021.

  1. JudahMantell


    Joined:
    Feb 28, 2017
    Posts:
    476
    So, exactly as the title says, I'm considering porting my in-development app from the SteamVR plugin to the new Unity Input System / XR Interaction Toolkit + OpenXR, because it looks like this system will be the future of XR development.

    I have a few concerns that maybe someone can help me with:
    Right now, I'm not using any of SteamVR's interaction scripts, just their action-based input -- I wrote my own interactions, so all I need to do (theoretically) is switch the input system. The problem is, from what I've messed around with in the UXR toolkit, there isn't much parity between the two.

    For example, I use my own pointer system (though the Ray Interactor looks pretty great too), and the way I handle dragging objects is like this (pseudocode):

    if grab button is changed from up to down in a single frame, set "moving object" to the object pointed at (clicked on, essentially)
    then, while the button is held down, move the object,
    then if the button is released, set "moving object" to null

    Those are three different if statements in Update. While it might not be the most efficient, it is very clear.
    From my understanding, the UXR toolkit doesn't have any of that -- just whether the button is down or not. This makes it really difficult to use for things like this. Am I wrong? Is there a solution to this? Should I even bother porting at all?
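    In Unity C#, that polling pattern looks roughly like this (a minimal sketch using the legacy Input class; the "Grab" button name and the pointer helpers are illustrative stand-ins for the poster's own system):

    ```csharp
    using UnityEngine;

    public class DragController : MonoBehaviour
    {
        // Object currently being dragged; null when no drag is active.
        private GameObject movingObject;

        void Update()
        {
            // 1) Button changed from up to down this frame: start the drag.
            if (Input.GetButtonDown("Grab"))
                movingObject = GetPointedAtObject();

            // 2) Button held down: keep moving the object.
            if (Input.GetButton("Grab") && movingObject != null)
                movingObject.transform.position = GetPointerPosition();

            // 3) Button released: end the drag.
            if (Input.GetButtonUp("Grab"))
                movingObject = null;
        }

        // Hypothetical stubs standing in for the poster's own pointer system.
        private GameObject GetPointedAtObject() { /* e.g. raycast from controller */ return null; }
        private Vector3 GetPointerPosition() { return Vector3.zero; }
    }
    ```
    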

    I might just be misunderstanding, so I apologize if this is a dumb question.

    Thanks so much!
     
  2. R1PFake


    Joined:
    Aug 7, 2015
    Posts:
    540
    This is not related to the XR Toolkit. As far as I understand, you only want to check when a button is pressed, and you can do this with the new Input System.

    https://docs.unity3d.com/Packages/com.unity.inputsystem@0.9/manual/Migration.html#getAxis

    https://docs.unity3d.com/Packages/com.unity.inputsystem@1.1/manual/Interactions.html?q=interaction

    Short version: Make a button-type InputAction for the grip and bind it to your XR grip. Then, in code, register for the action's performed (down) and canceled (up) events. In the performed event handler, start the drag and set your drag object; in the update loop, check whether the drag object is != null and do your drag logic; in the canceled event handler, stop the drag and set the drag object to null again.

    Hint: depending on your input settings, canceled might also be called if your grip is pressed a little but not enough to trigger performed. That's an easy fix, though: just add a check in the canceled event handler to see if your drag object is != null, so you know whether a drag was actually active.
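    A minimal sketch of that event-driven setup (assuming an InputActionReference named grabAction bound to the XR grip; the pointer query and drag logic are placeholders for the poster's own code):

    ```csharp
    using UnityEngine;
    using UnityEngine.InputSystem;

    public class XRDragController : MonoBehaviour
    {
        [SerializeField] private InputActionReference grabAction; // button-type action bound to the XR grip
        private GameObject dragObject;

        void OnEnable()
        {
            grabAction.action.performed += OnGrabPerformed; // fires once on press
            grabAction.action.canceled += OnGrabCanceled;   // fires once on release
            grabAction.action.Enable();
        }

        void OnDisable()
        {
            grabAction.action.performed -= OnGrabPerformed;
            grabAction.action.canceled -= OnGrabCanceled;
            grabAction.action.Disable();
        }

        private void OnGrabPerformed(InputAction.CallbackContext ctx)
        {
            dragObject = GetPointedAtObject(); // placeholder: the poster's own pointer raycast
        }

        private void OnGrabCanceled(InputAction.CallbackContext ctx)
        {
            // Guard: canceled can fire on a light press that never reached performed.
            if (dragObject != null)
                dragObject = null;
        }

        void Update()
        {
            if (dragObject != null)
            {
                // Placeholder drag logic, e.g. follow the controller/pointer.
            }
        }

        private GameObject GetPointedAtObject() { return null; } // hypothetical stub
    }
    ```
    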
     
    Last edited: May 8, 2021
  3. JudahMantell


    Joined:
    Feb 28, 2017
    Posts:
    476
    So you're saying that once I set up the XR interaction system with my OpenXR/whatever, and use the default included bindings, I can just call those from code and use them as any other button with the new input system?
    Thanks!

    EDIT: And you're saying that Performed is pressed once, and cancelled is released once?
    My problem with the original input system was that if I checked if a key was down, it would be true for every frame. That's not what I want here.
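    For reference, the new Input System also covers that "true every frame" concern when polling: InputAction.triggered is true only during the frame in which the action performed, while ReadValue reflects the held state (a sketch; grabAction is assumed to be a button-type action):

    ```csharp
    using UnityEngine;
    using UnityEngine.InputSystem;

    public class GrabPoll : MonoBehaviour
    {
        [SerializeField] private InputAction grabAction;

        void OnEnable()  => grabAction.Enable();
        void OnDisable() => grabAction.Disable();

        void Update()
        {
            // True only in the frame the action performed -- the "pressed once" check.
            if (grabAction.triggered)
                Debug.Log("Grab pressed this frame");

            // Non-zero for every frame the button is held.
            if (grabAction.ReadValue<float>() > 0.5f)
                Debug.Log("Grab held");
        }
    }
    ```
    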
     
    Last edited: May 9, 2021
  4. the_real_apoxol


    Unity Technologies

    Joined:
    Dec 18, 2020
    Posts:
    467