
Official XR Interaction Toolkit 1.0.0-pre.3 pre-release is available

Discussion in 'XR Interaction Toolkit and Input' started by chris-massie, Mar 20, 2021.

  1. Khang_Pham

    Khang_Pham

    Joined:
    Feb 28, 2021
    Posts:
    13
    Check your console for errors. If there are none, check the layers on the object, and also whether the interactable is currently selected by another interactor. In that case you need to call SelectExit on the interactable first.
    Dummy code for SelectExit:

    Code (CSharp):
    _slotOne.interactionManager.SelectExit(_slotOne.startingSelectedInteractable.selectingInteractor, _slotOne.startingSelectedInteractable);
     
  2. ttttkk

    ttttkk

    Joined:
    Oct 2, 2016
    Posts:
    7
    Hi, I came from 0.9 to 1.0 and noticed that the XR offset grab script has been removed in the latest version. Is there a newer way to achieve offset grabbing?
     
    reinfeldx likes this.
  3. R1PFake

    R1PFake

    Joined:
    Aug 7, 2015
    Posts:
    542
    Are there any plans to integrate Oculus Quest hand tracking into the toolkit, so that the hand can be used as a controller/interactor with finger gestures (e.g. pinch)?
     
    reinfeldx likes this.
  4. chris-massie

    chris-massie

    Unity Technologies

    Joined:
    Jun 23, 2020
    Posts:
    231
    I'm not familiar with that behavior. Are you able to paste it here or describe what that script was doing so I can assist? I looked at older versions of the package and did not see it.

    Yes, hand tracking support is planned and not just for the Oculus Quest. I don't have a timeline for when that will be available, but it is currently being worked on.
     
    Shizola and R1PFake like this.
  5. R1PFake

    R1PFake

    Joined:
    Aug 7, 2015
    Posts:
    542
    I have a question and a possible feature request. When you set a (Grab) Interactable's Rigidbody to Continuous or Continuous Dynamic, you get a warning once you pick up the object, because the Rigidbody is made kinematic and Continuous collision detection is not supported for kinematic bodies.

    Once you drop the object, kinematic is set back to false and the warning goes away.

    My question is: can this warning be safely ignored in that use case, or would it be possible for you to handle this case by checking the collision detection mode and setting the Rigidbody to Continuous Speculative while it is selected, then back to the previous mode once it is dropped?

    I currently implement something similar in my own code by changing the mode during select enter/exit to prevent this warning.
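
    For reference, a minimal sketch of that workaround (untested; listener ordering relative to the grab interactable's own kinematic switch is not guaranteed, so treat this as an illustration, not a drop-in fix):

    ```csharp
    using UnityEngine;
    using UnityEngine.XR.Interaction.Toolkit;

    // Swaps the Rigidbody to Continuous Speculative while the object is
    // grabbed (the only continuous mode valid for kinematic bodies),
    // then restores the previous mode when it is dropped.
    [RequireComponent(typeof(XRGrabInteractable), typeof(Rigidbody))]
    public class CollisionModeSwitcher : MonoBehaviour
    {
        Rigidbody m_Rigidbody;
        CollisionDetectionMode m_PreviousMode;

        void Awake()
        {
            m_Rigidbody = GetComponent<Rigidbody>();
            var interactable = GetComponent<XRGrabInteractable>();
            interactable.selectEntered.AddListener(OnSelectEntered);
            interactable.selectExited.AddListener(OnSelectExited);
        }

        void OnSelectEntered(SelectEnterEventArgs args)
        {
            m_PreviousMode = m_Rigidbody.collisionDetectionMode;
            m_Rigidbody.collisionDetectionMode = CollisionDetectionMode.ContinuousSpeculative;
        }

        void OnSelectExited(SelectExitEventArgs args)
        {
            // Restore the mode the Rigidbody had before it was grabbed.
            m_Rigidbody.collisionDetectionMode = m_PreviousMode;
        }
    }
    ```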
     
    Last edited: Oct 9, 2021
  6. AvinashB9

    AvinashB9

    Joined:
    Feb 10, 2021
    Posts:
    14
    Hey Guys,

    I am working on physical rotation of the HMD: when I turn around physically within the tracking space, without using any input (e.g. thumbstick), how do I rotate a GameObject to follow the HMD's rotation?

    I have tried reading the camera's quaternion, but the object does not rotate around its center.

    Any help will be greatly appreciated.

    Thanks,
    Avinash
     
  7. bentoBlox

    bentoBlox

    Joined:
    Nov 17, 2021
    Posts:
    2
    Wait...
    Is the Toolkit gone? I am on 2021 with preview packages enabled, but there is no XR Interaction Toolkit with URP.
    : o
    I had to create a new project as 3D only to get it...?
     
    Last edited: Nov 17, 2021
  8. blorenz_unity

    blorenz_unity

    Joined:
    Nov 21, 2021
    Posts:
    1
    Same issue as @bentoBlox -- I cannot find XRI for 2021.2. I am on Apple Silicon and would prefer not to use Intel installs.
     
  9. chris-massie

    chris-massie

    Unity Technologies

    Joined:
    Jun 23, 2020
    Posts:
    231
    This was mentioned in the pinned post: XR Interaction Toolkit 1.0.0 will be held in pre-release for the 2021 cycle. Check the first bullet in the Known Issues section.

    In that Editor version, you will need to manually edit the Packages\manifest.json file in your project and add this to the dependencies:
    "com.unity.xr.interaction.toolkit": "1.0.0-pre.8",


    We understand that this is not ideal, and we will work to add the 2.0 version of the package to that Editor version as soon as we are able once it is released, so that users can again use the Package Manager window.
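
    For anyone unsure where that line goes: the dependencies block of Packages\manifest.json would look roughly like this (the other entry shown is a placeholder; keep whatever packages your project already lists):

    ```json
    {
      "dependencies": {
        "com.unity.xr.interaction.toolkit": "1.0.0-pre.8",
        "com.unity.xr.management": "4.2.1"
      }
    }
    ```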
     
  10. Mashimaro7

    Mashimaro7

    Joined:
    Apr 10, 2020
    Posts:
    727
    I'm getting 20 errors that I didn't have before importing this package lol

    XRRayDetector not found
    XRInteractorLineVisual not found
    XRBaseController not found.

    I have the XR Plugin Manager, and the two other packages the Git article says are required. What am I missing? Somebody please help, I've been trying to test the VR camera controls on my Unity project for literally days, why does Unity make this so hard? Is there an easier way to do it, without this package?
     
  11. unity_5F3233DDF34AF1474DBC

    unity_5F3233DDF34AF1474DBC

    Joined:
    Dec 11, 2021
    Posts:
    18
    Is it possible to add the components of the XR Interaction Toolkit via the AddComponent function?

    GameObject tt;
    tt.AddComponent<ARSelectionInteractable>();

    My app imports a 3D object from local storage, and that 3D object is passed into a GameObject in the code. I want to add the ARSelectionInteractable component to the GameObject in the script, not in the Editor. I tried this, but ARSelectionInteractable can't be found by AddComponent. Any ideas?
     
  12. VRDave_Unity

    VRDave_Unity

    Unity Technologies

    Joined:
    Nov 19, 2021
    Posts:
    275
    Yes, this should be possible. You will want to make sure your code has:
    Code (CSharp):
    using UnityEngine.XR.Interaction.Toolkit.AR;
    at the top. That will allow you to add the component, but don't forget to set up the object with appropriate colliders and other components so you can interact with it.

    You should also ensure you have installed the AR Foundation package (com.unity.xr.arfoundation) so that all of the AR features of the XR Interaction Toolkit are enabled.
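
    Putting those pieces together, a minimal sketch might look like this (assumes AR Foundation is set up; "importedModel" is a hypothetical reference to the object loaded from storage):

    ```csharp
    using UnityEngine;
    using UnityEngine.XR.Interaction.Toolkit.AR;

    public class RuntimeInteractableSetup : MonoBehaviour
    {
        public GameObject importedModel; // assigned after loading from local storage

        public void ConfigureInteractable()
        {
            // The interactable needs a collider to receive selection gestures.
            if (importedModel.GetComponent<Collider>() == null)
                importedModel.AddComponent<BoxCollider>();

            importedModel.AddComponent<ARSelectionInteractable>();
        }
    }
    ```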
     
  13. VR_Junkie

    VR_Junkie

    Joined:
    Nov 26, 2016
    Posts:
    77
    Hello, I found a bug with UI using the toolkit


    1. What happened

    I am using an Oculus Quest and have the Input System package and the XR Interaction Toolkit package in my project. When I enter a scene, I can use the UI perfectly fine. After switching XR rigs (and thus cameras), the UI breaks. I am using a world-space UI and replacing the event camera reference when I switch rigs. After I switch rigs, UI buttons continue to work, but scrolls and event trigger callbacks do not. I have tested the same flow without VR (using keyboard and mouse) and it works fine, which suggests it is a VR-specific issue.

    2. How can we reproduce it using the example you attached

    Import the Input System package and the XR Interaction Toolkit package into Unity. Make two XR rigs using the XR Interaction Toolkit and turn one off. Set up a UI that works with XR (there are plenty of tutorials) and test. Once the enabled XR rig works with the UI, try switching rigs at runtime and observe that although buttons continue to work, scrolls and event triggers do not.
     
  14. chris-massie

    chris-massie

    Unity Technologies

    Joined:
    Jun 23, 2020
    Posts:
    231
    @VR_Junkie
    If you are using the XR UI Input Module, it caches a reference to the Main Camera in the scene on the first frame. If you are using a setup with two different rigs, each with its own Main Camera, then when you swap which one is activated you will also need to set the new Main Camera on that component via the UIInputModule.uiCamera property. Otherwise, it is likely still using the Camera you deactivated. You will also need to change the Event Camera on each world-space Canvas component in your scene.

    I created an issue in our backlog to consider checking the enabled state of the camera so that the UIInputModule class will recognize this situation and grab the updated Camera.main reference automatically.
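
    A minimal sketch of a rig switch that updates both references (the field wiring and names here are assumptions; the key calls are setting UIInputModule.uiCamera and Canvas.worldCamera):

    ```csharp
    using UnityEngine;
    using UnityEngine.XR.Interaction.Toolkit.UI;

    public class RigSwitcher : MonoBehaviour
    {
        public GameObject rigA;
        public GameObject rigB;
        public XRUIInputModule uiInputModule;   // the XR UI Input Module in the scene
        public Canvas[] worldSpaceCanvases;     // all world-space canvases to update

        public void ActivateRig(GameObject rig)
        {
            rigA.SetActive(rig == rigA);
            rigB.SetActive(rig == rigB);

            var newCamera = rig.GetComponentInChildren<Camera>(true);
            uiInputModule.uiCamera = newCamera;  // refresh the cached UI camera
            foreach (var canvas in worldSpaceCanvases)
                canvas.worldCamera = newCamera;  // refresh each Event Camera
        }
    }
    ```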
     
    VR_Junkie likes this.
  15. VR_Junkie

    VR_Junkie

    Joined:
    Nov 26, 2016
    Posts:
    77
    Another possible bug: we recently switched our XR Plug-in Management provider from Oculus to OpenXR in our project. That worked, but now every time we close and reopen Unity, the hands don't work (the head does, though). If we reimport OpenXR they work again, but on the first open it always breaks.
     
  16. VRDave_Unity

    VRDave_Unity

    Unity Technologies

    Joined:
    Nov 19, 2021
    Posts:
    275
    Hey @VR_Junkie,
    For this last issue (with the hands not working), what hardware and OpenXR runtime are you using? We've seen a couple of things with newer SteamVR + Vive, but I wanted to confirm your configuration.
     
  17. TyI3orG

    TyI3orG

    Joined:
    Mar 22, 2021
    Posts:
    15
    Hi, I am having an issue where, after destroying an XR rig, I get "MissingReferenceException: The object of type 'XRRayInteractor' has been destroyed but you are still trying to access it. Your script should either check if it is null or you should not destroy the object." This stops me from being able to click the UI. Similar to the question above asked by VR_Junkie, I have two XR rigs and switch between them. Using the code you gave VR_Junkie, I got my UI working after the switch. The difference is that at one point I delete one of the rigs and replace it (the replacement rig can't click the UI).
     
  18. VR_Junkie

    VR_Junkie

    Joined:
    Nov 26, 2016
    Posts:
    77
    I am using a Link-cable-connected Oculus Quest 2 and have the Oculus Touch Controller Profile selected. My steps for the switch: I went to the Package Manager and uninstalled the Oculus XR Plugin, the XR Interaction Toolkit, and XR Plugin Management. Then I reinstalled Plugin Management from Edit > Project Settings. After that I went back to the Package Manager and reinstalled the XR Interaction Toolkit. Then I went back to Edit > Project Settings and installed OpenXR and the correct profiles from there.
     
  19. VR_Junkie

    VR_Junkie

    Joined:
    Nov 26, 2016
    Posts:
    77
    Hello, I made a click-and-drag binding. It only works in the Editor, not in a build. What exactly am I doing wrong? https://pastie.io/rhykvg.cpp
     
  20. chris-massie

    chris-massie

    Unity Technologies

    Joined:
    Jun 23, 2020
    Posts:
    231
    Try changing the attribute on your Init method to specify BeforeSceneLoad:
    [RuntimeInitializeOnLoadMethod(RuntimeInitializeLoadType.BeforeSceneLoad)]


    You may also need to add [Preserve] to the class and to the static constructor so they don't get stripped from standalone builds.
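
    Combining both suggestions, the bootstrap class might look something like this (class and method names are placeholders; the binding registration itself depends on your own code):

    ```csharp
    using UnityEngine;
    using UnityEngine.Scripting; // for [Preserve]

    // [Preserve] keeps the class and static constructor from being
    // stripped by managed code stripping in standalone builds.
    [Preserve]
    public static class DragBindingBootstrap
    {
        [Preserve]
        static DragBindingBootstrap() { }

        // BeforeSceneLoad runs the registration before any scene objects
        // (and their input actions) are initialized.
        [RuntimeInitializeOnLoadMethod(RuntimeInitializeLoadType.BeforeSceneLoad)]
        static void Init()
        {
            // Register the custom click-and-drag binding/composite here.
        }
    }
    ```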
     
  21. flipwon

    flipwon

    Joined:
    Dec 29, 2016
    Posts:
    179
    Any update on hand tracking progress?

    I really like the Oculus hand tracking but can't stand their toolkit.
     
  22. VRDave_Unity

    VRDave_Unity

    Unity Technologies

    Joined:
    Nov 19, 2021
    Posts:
    275
    Hi @flipwon,
    We are actively working on hand tracking support in Unity, with support in the XR Interaction Toolkit as well. The current timeline targets the 2023.1 release of the Editor. We will hopefully have experimental packages available before then; stay tuned!

    Can I inquire as to what you don't like about the Oculus toolkit?
     
  23. flipwon

    flipwon

    Joined:
    Dec 29, 2016
    Posts:
    179
    It may just be a familiarity thing, as I'm able to accomplish everything I've set out to, but the overall way the package has been put together doesn't jibe with me. More of a workflow issue than anything, I guess.
     
  24. VR_Junkie

    VR_Junkie

    Joined:
    Nov 26, 2016
    Posts:
    77
    Hello, I am having trouble getting the XR Interaction Toolkit (OpenXR) to work with the Valve Index. It works with the Vive and Oculus but not the Index. I have the Oculus, Vive, and Index controller profiles selected, and my Unity version is 2020.3.5. Hopefully someone here can help me figure this out. Attached is an image of my action map for reference.
     

    Attached Files:

  25. VR_Junkie

    VR_Junkie

    Joined:
    Nov 26, 2016
    Posts:
    77
    Hello, we would like some help regarding the proper way to handle launch options for our game. It is built to work on PC (keyboard and mouse) or in VR using OpenXR, and it requires the OpenXR runtime to be set as active for VR to work.

    Attached are two images: one where we activate the OpenXR runtime through the Oculus app (in which case it only works for Oculus headsets) and one where we activate the OpenXR runtime through SteamVR (works for most headsets, but performance-heavy, so not recommended if you don't need it).

    Is there a way to automatically set the OpenXR runtime through your launch options (with no steps required from the player)? For example, I have an Oculus Quest (with Link cable), so I set OpenXR to Oculus; but if I had a Valve Index or HTC Vive, I would set it to SteamVR. The problem is that with OpenXR our players are now forced to set the correct runtime for their headset to work. If a player owns an HTC Vive as well as an Oculus Quest 2 (with Link cable), they could think the game is broken when using their Vive (assuming OpenXR is active through the Oculus app and not SteamVR). The reason it didn't launch in VR mode for you was most likely that you don't have the OpenXR runtime set to SteamVR (pictured in OpenXR_SteamVR below). This is obviously a concern for us. Please advise.

    Side note: a popular VR game, Phasmophobia, uses OpenXR and appears to have the same issue: https://steamcommunity.com/sharedfiles/filedetails/?id=2791489010
     

    Attached Files: