
Official XR Interaction Toolkit 1.0.0-pre.2 pre-release is available

Discussion in 'XR Interaction Toolkit and Input' started by chris-massie, Jan 26, 2021.

Thread Status:
Not open for further replies.
  1. chris-massie

    chris-massie

    Unity Technologies

    Joined:
    Jun 23, 2020
    Posts:
    232
    We have just published a new preview of the XR Interaction Toolkit (XRI) that brings a number of bug fixes and improvements. For those who want to experiment with XRI, the best way to start is with our samples available at https://github.com/Unity-Technologies/XR-Interaction-Toolkit-Examples. As always, you can refer to our documentation for more information.

    Pre-release
    This 1.0.0-pre.2 version of the XR Interaction Toolkit is considered pre-release. Pre-release packages are supported packages in the process of becoming stable, and will be production-ready by the end of the upcoming 2021 LTS release. Starting with the 2021.1 Alpha, Unity is changing the way packages are published and shown in the Package Manager; the new approach is designed to provide clear guidance around a package's readiness and expected support level. There will be additional iterations of XRI before we get to the final 1.0.0 release.

    What’s new
    For a full list of changes, refer to the Changelog in our documentation.

    Many of the changes and fixes in this version were a direct result of feedback we received from the forum and from reported bugs. Thank you to everyone who took the time to make these issues known to the team and for your feedback!

    Notable changes
    • Added and improved Scripting API documentation and Inspector tooltips.
    • Changed the signature of all interaction event methods (e.g. OnSelectEntering) to take event data through a class argument rather than being passed the XRBaseInteractable or XRBaseInteractor directly. This was done to allow the Interaction Manager to provide additional related data without requiring users to handle additional methods. It also makes it easier to handle the case where a selection or hover is canceled (because the Interactor or Interactable is unregistered as a result of being disabled or destroyed) without duplicating code in separate OnSelectCanceling and OnSelectCanceled methods. See the Changelog for code snippets with instructions on how to upgrade and migrate scripts to the new signatures of these events and methods. Use the Migrate Events button in the Inspector of Interactor and Interactable objects to move any serialized listeners from the old, deprecated events to the new ones.
    • Opened up the custom Editor classes so users can more easily customize the Inspector for derived classes. The custom Editors now also apply to derived classes, so users who override methods in behaviors can keep the customized Inspector rather than falling back to the default.
    • Fixed the XR Ray Interactor clearing a custom aim direction when initializing. (1291523)
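
To illustrate the event signature change described above, here is a minimal before/after sketch (the class name is illustrative; see the Changelog for the authoritative migration steps):

```csharp
using UnityEngine.XR.Interaction.Toolkit;

public class MigratedInteractable : XRBaseInteractable
{
    // Old, deprecated signature (received the Interactor directly):
    //   protected override void OnSelectEntered(XRBaseInteractor interactor)

    // New signature: the event data arrives wrapped in an args class.
    protected override void OnSelectEntered(SelectEnterEventArgs args)
    {
        base.OnSelectEntered(args);
        var interactor = args.interactor; // the object the old signature passed
    }
}
```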

    Known issues
    • Custom reticles get displayed on objects without a custom reticle (1252565)
    • Socket Interactor can apply the wrong rotation to an interactable and cause the interactable to skew in scale when the interactable has a parent with a non-uniform scale (1228990)
    • Grab Interactables can cause undesired behavior when using Continuous Move locomotion where the Character Controller can be blocked from moving while holding it, or cause the rig to rapidly move away when the object overlaps with the Character Controller
    • The end of the XR Interactor Line Visual lags behind and can appear bent when moving the controller fast (1291060)
    • The Hover To Select property on XR Ray Interactor is not functional (1301630)

    Roadmap
    Use the public roadmap to see our latest plans, upvote existing feature requests, and submit new feature requests. We are currently working towards a public 1.0 release this year for Unity 2021.2 (LTS). Most of our development effort is now focused on bug fixes, UX improvements, and polishing documentation & samples. The feature set for the public release will primarily reflect what exists today.

    Sharing feedback
    This forum is the best place to open discussions and ask questions. As mentioned above, please use the public roadmap to submit feature requests. If you encounter a bug, please use the Unity Bug Reporter in the Unity Editor, accessible via Help > Report a Bug. Include “XR Interaction Toolkit” in the title to help our team triage things appropriately!
     
  2. HeyBishop

    HeyBishop

    Joined:
    Jun 22, 2017
    Posts:
    238
    I've installed the XR Interaction Toolkit using the Package Manager.
    Then, I downloaded/extracted the XR-Interaction-Toolkit-Examples from GitHub.
    Imported the extracted directory into my project.

    Now, I have a bunch of errors, including:
    XR-Interaction-Toolkit-Examples-master\VR\Assets\Scripts\BubbleGun.cs(44,26): error CS0246: The type or namespace name 'DeactivateEventArgs' could not be found (are you missing a using directive or an assembly reference?)

    and

    Assets\Samples\XR-Interaction-Toolkit-Examples-master\VR\Assets\Scripts\ComplexCube.cs(41,43): error CS0246: The type or namespace name 'SelectExitEventArgs' could not be found (are you missing a using directive or an assembly reference?)




    upload_2021-1-28_11-21-56.png


    There were loads more before I deleted the AR directory, as it's not relevant to me.


    Not sure if this is a bug, or a user error... have I missed a step?
     
  3. HeyBishop

    HeyBishop

    Joined:
    Jun 22, 2017
    Posts:
    238
    Note: when I deleted BubbleGun.cs and ComplexCube.cs, I was able to run the scene. Not sure what I'm missing out on in the demo!
     
  4. chris-massie

    chris-massie

    Unity Technologies

    Joined:
    Jun 23, 2020
    Posts:
    232
    I would recommend opening the VR example project directly rather than copying that folder into an existing project as you did, since the example depends on some project settings and layers that may not exist in your project. The Examples repository on GitHub contains two Unity projects you can open with Unity Hub: the AR folder (a mobile AR example) and the VR folder. Follow the steps in Getting started to open the VR project so you can play around with the examples. If you want to extract some parts of the VR example to copy into your own project, you can follow the steps to create an Asset package and then import it.

    As for why you were getting those compilation errors, my guess is that an older version like 0.10.0-preview.7 of XR Interaction Toolkit is installed in Window > Package Manager. Expand the foldout for XR Interaction Toolkit in that window, click See other versions, click 1.0.0-pre.2, and then click Install.
     
    HeyBishop likes this.
  5. DominiqueSandoz

    DominiqueSandoz

    Joined:
    Aug 29, 2017
    Posts:
    25
    Hello,

    I have tested the new release with the example projects on GitHub, in the hope that an issue is resolved that has been in the XR Interaction Toolkit since we started using it (0.9 preview).

    Interactables behind UIs are problematic. When pointing the Ray Interactor (with its attached ray visual) at a UI, the visual's length correctly stops at the UI, but any interactable behind the UI still receives events like hover, select, etc., which seems wrong and unintuitive since it is "behind" the UI.
    This has been a great annoyance to our users since we switched to XR Interaction, as they are constantly selecting and changing objects behind UIs just by interacting with the UI.

    It is the same problem as described here by the user virtimed: https://forum.unity.com/threads/xr-...review-release-0-9.795684/page-9#post-5911529 (this is version 0.9)

    As somebody suggested checking the setup again, and since this is a new release, we downloaded the example from GitHub to rule out any error on our side and checked the WorldInteractionDemo scene. The problem also occurs there and is easily recreated, as shown in this video:



    Steps to reproduce:

    1. Take a grab or simple interactable object (in Scene: from "Complex Grab Interactions")
    2. Place it behind a UI panel setup for XR (in Scene: put it on the box "UI Interaction")
    3. Position yourself such that the UI panel is in front of the interactable (in Scene: go to teleport area in the back)
    4. Point the left controller (with ray interactor) at the UI

    Expected result:
    The interactable behind the UI should not react as long as the ray is directed at the UI.

    Actual result:
    The interactable behind the UI turns red when the ray is directed at it, even though there is a UI in between (which successfully shortens the line visual but not the actual raycast).

    Is there something we can do? How can this be resolved?

    What we've also tried

    We've tried to come up with solutions, ranging from ugly to less ugly. Our current strategy is to detect whenever the XR Ray Interactor interacts with a UI. We started to implement our own Ray Interactor, extending XRRayInteractor. We've almost got it, but are stuck on the problem that too many members of XRRayInteractor are inaccessible to our class. It would be very helpful if, e.g., GetCurrentUIRaycastResult were public (as also mentioned by Skinzart in https://forum.unity.com/threads/xr-...10-preview-is-available.1000576/#post-6530717, version 0.10 preview).

    Thanks for any response we can get here

    Dominique
     
    Last edited: Feb 1, 2021
  6. DEGUEKAS

    DEGUEKAS

    Joined:
    Jul 12, 2018
    Posts:
    26
    I have two grabbable objects with the same configuration, one with a box collider and one with a convex mesh collider.
    The object with the box collider works well, but when I try to grab the one with the convex collider, it grabs with a strange offset between the hand and the object.

    I attach a video with what happens:


    The same happens when there are multiple colliders, and when the colliders are on a child object.

    This is the configuration of the objects:
    Sin título.png
     
  7. Skinzart

    Skinzart

    Joined:
    Sep 11, 2020
    Posts:
    14
    Last edited: Feb 3, 2021
  8. DominiqueSandoz

    DominiqueSandoz

    Joined:
    Aug 29, 2017
    Posts:
    25
    Is this thread monitored by Unity or should we post questions/feedback somewhere else?
     
  9. chris-massie

    chris-massie

    Unity Technologies

    Joined:
    Jun 23, 2020
    Posts:
    232
    Thank you for the very detailed post! I agree, that is not desired behavior, and unfortunately there isn't a great way to resolve that as a user with the current version. I created an item in our issue tracking system to open up some of those methods and properties related to the UI raycasts after reading that post by Skinzart, but we have not made those changes yet. I've moved it up in priority.

    A workaround until we release a fix for this would involve something like subclassing XRRayInteractor and overriding GetValidTargets to remove Interactables that are behind UI. You would need to call TryGetUIModel and compare positions. However, this would be pretty ugly without having all of the methods available for you to call or override.

    If you report this bug with the Unity Bug Reporter, our team can provide a public tracking link and give updates on the status of a fix.

    Yes this thread and forum are monitored. Questions and feedback can be posted here, and bugs can be submitted with the Unity Bug Reporter in the Editor.
     
  10. chris-massie

    chris-massie

    Unity Technologies

    Joined:
    Jun 23, 2020
    Posts:
    232
    I'm not exactly sure what's going wrong without looking at the Hierarchy and the model. What could be happening is if the object's pivot point is offset from the center of the model, that difference might cause that offset shown in the video. You can use the Pivot/Center button in the toolbar of the Editor to Toggle Tool Handle Position to visualize the difference. Since the Attach Transform is not set, it will use the pivot position and rotation of the Transform itself to determine how it orients with the Interactor's Attach Transform. You can create a child GameObject and set the Attach Transform on that Interactable to that child to manually adjust where the center should be by moving that Transform.
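
If the pivot can't be fixed in the model itself, the attach point can also be created from script. A hedged sketch (the component and child-object names are hypothetical), assuming the renderer bounds center is a reasonable stand-in for the visual center:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

public class CenteredAttachPoint : MonoBehaviour
{
    void Awake()
    {
        var grab = GetComponent<XRGrabInteractable>();

        // Create a child at the render bounds center and use it as the
        // attach point so grabbing ignores an offset pivot.
        var attach = new GameObject("Attach Point").transform;
        attach.SetParent(transform);
        attach.position = GetComponentInChildren<Renderer>().bounds.center;
        attach.rotation = transform.rotation;
        grab.attachTransform = attach;
    }
}
```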
     
  11. DEGUEKAS

    DEGUEKAS

    Joined:
    Jul 12, 2018
    Posts:
    26
    That's the strange thing: all the pivots are in the center.
     
  12. DominiqueSandoz

    DominiqueSandoz

    Joined:
    Aug 29, 2017
    Posts:
    25
    Thank you very much for this, everything that helps is greatly appreciated!

    This is kind of the approach we had, but we hit another wall by not having access to
    TrackedDeviceModel.implementationData
    . This is what we tried, closely following the (inaccessible)
    GetCurrentUIRaycastResult
    :

    Code (CSharp):
    using System.Collections.Generic;
    using UnityEngine.XR.Interaction.Toolkit;
    using UnityEngine.XR.Interaction.Toolkit.UI;

    public class CustomRayInteractor : XRRayInteractor
    {
        public override void GetValidTargets(List<XRBaseInteractable> validTargets)
        {
            base.GetValidTargets(validTargets);

            if (TryGetUIModel(out TrackedDeviceModel model))
            {
                int raycastPointIndex = model.implementationData.lastFrameRaycastResultPositionInLine;
                if (raycastPointIndex >= 0)
                {
                    validTargets.Clear();
                }
            }
        }
    }
    But as said, we don't seem to be able to access model.implementationData here. I am not sure what you mean by comparing positions, as model.position returns the center of the interactable, which can be anywhere and seems unrelated to where the ray hits the UI. Can you elaborate on this?

    We've tried using the internal bug reporter for this, but it crashed while uploading the sample project. We will try again.

    Thank you so much for your responsiveness.

    Dominique
     
  13. DominiqueSandoz

    DominiqueSandoz

    Joined:
    Aug 29, 2017
    Posts:
    25
    Last edited: Feb 10, 2021
  14. Simianosaurus

    Simianosaurus

    Joined:
    Feb 14, 2013
    Posts:
    14
    I'm seeing similar issues but with one interactable behind another, not just behind UI.

    The ray visuals stop at the first interactable, but the hover event is fired on both.

    I've tried every combination of masks and filters I can think of, without success.
     
  15. DEGUEKAS

    DEGUEKAS

    Joined:
    Jul 12, 2018
    Posts:
    26
    I've found the problem: the error happens when I assign a transform to the Attach Transform field of the XR Direct Interactor.
     
  16. chris-massie

    chris-massie

    Unity Technologies

    Joined:
    Jun 23, 2020
    Posts:
    232
    If you add this line to the manifest element in AndroidManifest.xml, it should show the Oculus system overlay keyboard when the Input Field receives focus:
    Code (csharp):
    <uses-feature android:name="oculus.software.overlay_keyboard" android:required="false"/>
    The keyboard is only dismissed when clicking the icon that looks like a keyboard with a down arrow under it. I don't know if there's a way to configure it so the keyboard is dismissed when clicking outside the keyboard; that would be a question to ask on the Oculus forums.

    You should be able to use the Application.isFocused property and/or the Application.focusChanged event to determine when an Oculus overlay is opened, either the system menu or the system keyboard.
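
A minimal sketch of that focus-based detection (the class name and log messages are illustrative):

```csharp
using UnityEngine;

public class OverlayFocusWatcher : MonoBehaviour
{
    void OnEnable() => Application.focusChanged += OnFocusChanged;
    void OnDisable() => Application.focusChanged -= OnFocusChanged;

    void OnFocusChanged(bool hasFocus)
    {
        // The app loses focus while an Oculus system overlay
        // (system menu or keyboard) is open.
        Debug.Log(hasFocus ? "Overlay closed, app focused" : "Overlay likely open");
    }
}
```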
     
  17. chris-massie

    chris-massie

    Unity Technologies

    Joined:
    Jun 23, 2020
    Posts:
    232
    I was referring to the fact that since GetCurrentUIRaycastResult is private, you would need to derive whether TryGetHitInfo was using the result from Physics hits or from UI. You could do something like this in your overriding GetValidTargets method:
    Code (CSharp):
    if (TryGetHitInfo(out var hitPosition, out var hitNormal, out _, out _) &&
        GetCurrentRaycastHit(out var raycastHit) &&
        (hitPosition != raycastHit.point || hitNormal != raycastHit.normal))
    {
        validTargets.Clear();
    }
    That would let the custom Ray Interactor not hover or select Interactables behind the UI. Again, this is a bug that we intend on fixing in the package, but this is a temporary workaround you can use in the meantime. Note that this solution still does not solve every issue with 3D and UI raycasts causing undesired behavior, such as still being able to hover and select things in the UI when there is an object in front of the UI.
     
  18. chris-massie

    chris-massie

    Unity Technologies

    Joined:
    Jun 23, 2020
    Posts:
    232
    It's intended behavior for multiple Interactables to be hovered by an Interactor at the same time. You can change that by deriving from Ray Interactor and overriding GetValidTargets or CanHover to make it only allow hovering one object at a time. We plan on adding configuration options in the Inspector of Ray Interactor so this can be adjusted without needing to create a custom script.
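
One possible shape for that override, sketched under the assumption that GetValidTargets returns targets sorted nearest-first (as the Ray Interactor's raycast results are); the class name is hypothetical:

```csharp
using System.Collections.Generic;
using UnityEngine.XR.Interaction.Toolkit;

// Hedged sketch: only allow hovering the nearest valid target.
public class SingleHoverRayInteractor : XRRayInteractor
{
    static readonly List<XRBaseInteractable> s_Targets = new List<XRBaseInteractable>();

    public override bool CanHover(XRBaseInteractable interactable)
    {
        if (!base.CanHover(interactable))
            return false;

        // Assumes the first entry is the closest hit along the ray.
        GetValidTargets(s_Targets);
        return s_Targets.Count > 0 && s_Targets[0] == interactable;
    }
}
```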
     
  19. CryptopherColumbus

    CryptopherColumbus

    Joined:
    Nov 3, 2020
    Posts:
    6
    @chris-massie were you ever able to figure out why XR Grab Interactables jitter so much while using the Continuous Move Provider?

    It is very easy to recreate: add a Continuous Move Provider to an XR Rig and set the move speed to, say, 5. If your character is holding an XR Grab Interactable while moving, the grabbable will jitter all over the place.

    I've brought this up with Unity before, but it seems like no one has provided a solution or acknowledged the problem.
     
  20. chris-massie

    chris-massie

    Unity Technologies

    Joined:
    Jun 23, 2020
    Posts:
    232
    The lag in Grab Interactables is due to how the object is moved in Kinematic or Velocity Based modes, and the difference in update frequencies between Update and FixedUpdate. The Movement Type value in the Grab Interactable Inspector controls how the position of the object is updated. With Kinematic (the default), the Rigidbody is moved to a target position during FixedUpdate. With Velocity Based, the Rigidbody is moved by setting its velocity during FixedUpdate. With Instantaneous, the Transform of the object is moved both during Update and right before rendering to the VR device. The Continuous Move Provider updates the rig by either updating the Character Controller or the Transform directly during Update.

    By default, Update occurs more frequently than FixedUpdate. Update might run at 90 Hz, for example, whereas the default FixedUpdate rate is 50 Hz; this is adjustable in Edit > Project Settings > Time by setting Fixed Timestep. This difference is what causes the jitter when you move the controller fast while grabbing something (and it's even more apparent with a high move speed on the Continuous Move Provider).
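
The Fixed Timestep setting can also be changed from script; a minimal sketch (with the caveat that raising the physics rate increases CPU cost):

```csharp
using UnityEngine;

public class MatchPhysicsToHeadset : MonoBehaviour
{
    void Start()
    {
        // Equivalent to Edit > Project Settings > Time > Fixed Timestep.
        // 1/90 s makes FixedUpdate run at roughly the same rate as a 90 Hz headset.
        Time.fixedDeltaTime = 1f / 90f;
    }
}
```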

    Changing the Movement Type to Instantaneous will help to get rid of the jitter. We have plans to solve the jitter caused in a future version of the package by separating the visual component from the physics component of Interactables. That would allow us to move the Rigidbody with the physics timestep, and separately move the visual representation of the object in Update.
     
  21. Simianosaurus

    Simianosaurus

    Joined:
    Feb 14, 2013
    Posts:
    14
    Ah, OK, I didn't spot that mentioned anywhere.
    It seems odd that the visuals are stopped based on the mask but the detection is not. That seems counterintuitive, and I can't imagine it being the expected standard behaviour.

    Cheers for the info, I'll get deriving.
     
  22. DominiqueSandoz

    DominiqueSandoz

    Joined:
    Aug 29, 2017
    Posts:
    25
    Thank you @chris-massie, we did not think of this. We've tried it, and unfortunately it leads to more problems than it solves, as it makes our UIs completely unusable: there seems to be a precision problem causing the ray to shoot through the UI every other frame.

    Is there some estimation as to when the next pre-release is coming?
     
  23. CryptopherColumbus

    CryptopherColumbus

    Joined:
    Nov 3, 2020
    Posts:
    6

    Thank you so much for the response @chris-massie. It's good to hear that the team is aware. Do you have a rough estimate of how long we're talking? A month, or half a year?


    Another issue that I encountered while using the xr interaction toolkit was with using haptics. Maybe the way I'm handling haptics is bad.

    I've been doing something like
    Code (CSharp):
    private void TriggerHapticFeedback()
    {
        if (secondaryBaseInteractor != null)
        {
            primaryBaseInteractor.GetComponent<XRController>().SendHapticImpulse(hapticStrength, hapticDuration);
            secondaryBaseInteractor.GetComponent<XRController>().SendHapticImpulse(hapticStrength, hapticDuration);
        }
        else
        {
            primaryBaseInteractor.GetComponent<XRController>().SendHapticImpulse(hapticStrength, hapticDuration);
        }
    }
    It works great for a bit, but it randomly crashes my game and Unity every so often while, say, shooting a gun.

    When I looked through the Unity logs, I found something like the text file I've attached. Those logs are from a while ago.

    I've recently recreated the issue, so here are some more recent logs. I see the text below repeated over and over in my Unity logs. I've had to disable haptics in my VR game altogether because of this issue.

     

    Attached Files:

  24. AvinashB9

    AvinashB9

    Joined:
    Feb 10, 2021
    Posts:
    14
    Hi Everyone,

    I have been facing an issue with the XRBaseInteractable class.
    I extended XRBaseInteractable, then added a listener to selectEntered in code, but the event doesn't fire the method.
    I have also tried the other way, adding methods under selectEntered in the Inspector, and the event still doesn't fire.

    Any idea why the event is not firing?

    The screenshot below shows the class extending XRBaseInteractable and a method added under the selectEntered event. I have tried by code and also by Inspector, but not both together :).

    Any help with this issue would be appreciated.

    upload_2021-2-11_22-29-28.png

    Thanks,
    Avinash
     
  25. chris-massie

    chris-massie

    Unity Technologies

    Joined:
    Jun 23, 2020
    Posts:
    232
    Fixing those bugs along with the wobbly Line Visual is my highest priority right now. Some or all of those fixes should be included in the next version (1.0.0-pre.3) within the next month.

    As far as the changes to separate the visual from the physics go, that will not happen for the final 1.0 release and will likely require a major version bump to implement. However, it is high on our list, since it will likely need to be done as a prerequisite before we add articulated hand tracking. The timeline is subject to change of course, but we're likely talking somewhere in the half-year to year timeframe.

    It looks like you're triggering haptics fine; it doesn't seem like you're doing anything wrong in that code. Try updating your Oculus device and checking whether there is an updated verified version of the Oculus XR Plugin package in Window > Package Manager. I'll forward this along to the Oculus package team to see if they can help.
     
  26. chris-massie

    chris-massie

    Unity Technologies

    Joined:
    Jun 23, 2020
    Posts:
    232
    That event will notify the listeners within the OnSelectEntered(SelectEnterEventArgs args) method. If you are overriding that method in your script, make sure you call the base method so the event will actually be invoked.
    Code (CSharp):
    protected override void OnSelectEntered(SelectEnterEventArgs args)
    {
        base.OnSelectEntered(args);
        // Your additional code
    }
    The XRBaseInteractable also has Awake and OnEnable methods that do necessary things to register it. If your custom script has either of those two methods, make sure you're calling the base method similarly.
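
Concretely, a derived Interactable that defines those methods would follow this shape (a sketch; the class name is illustrative):

```csharp
using UnityEngine.XR.Interaction.Toolkit;

public class MyInteractable : XRBaseInteractable
{
    protected override void Awake()
    {
        base.Awake();    // required: initializes internal state
        // Your initialization here.
    }

    protected override void OnEnable()
    {
        base.OnEnable(); // required: registers with the Interaction Manager
        // Your enable logic here.
    }
}
```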

    Also verify that the Interactable is actually being selected. Open Window > Analysis > XR Interaction Debugger and check that the Interactable shows that it is selected when you think it is.
     
  27. AvinashB9

    AvinashB9

    Joined:
    Feb 10, 2021
    Posts:
    14
    Thanks @chris-massie for the response and the help. It worked after calling the base method in Awake.

    I have another issue, when using XRGrabInteractable with Velocity Tracking. When I grab and hold an object and move around the game, the object lags behind. With the Instantaneous type there is still a slight lag, but I would like to use the Velocity Tracking type. Is there any workaround until this issue is fixed in a new version?

    Thanks,
    Avinash
     
  28. CryptopherColumbus

    CryptopherColumbus

    Joined:
    Nov 3, 2020
    Posts:
    6

    You're a beast. Thanks for taking a look at this. I think I've narrowed down the issue further: if you apply haptics to both controllers at the same time, you trigger the issue. Would love for the Oculus package team to take a look.
     
  29. chris-massie

    chris-massie

    Unity Technologies

    Joined:
    Jun 23, 2020
    Posts:
    232
    I described some of what's happening up in post #20. For Velocity Tracking, the behavior uses some const multiplier factors for the velocity and angular velocity that introduce some easing, which adds some lag. I'll work on making those available to be tweaked in the Inspector. There will still be some delay until we separate the visual representation from the physics representation of Interactables, but being able to tweak those multiplier factors should help.

    Until a new version is released, if you want to adjust it now, you could counteract XRGrabInteractable.PerformVelocityTrackingUpdate by overriding ProcessInteractable to multiply out those factors from Rigidbody.velocity and Rigidbody.angularVelocity.
     
  30. curtispelissier

    curtispelissier

    Joined:
    Jul 26, 2019
    Posts:
    39
    Hello, I can't scroll ScrollViews with my Oculus joystick. The only thing I can do is "grab" the ScrollView. I read that I must change PointerEvent.scrollDelta, but I don't really know how to achieve this. These are my package versions:
    Code (JavaScript):
    {
      "dependencies": {
        "com.unity.inputsystem": "1.1.0-preview.3",
        "com.unity.ugui": "1.0.0",
        "com.unity.xr.interaction.toolkit": "1.0.0-pre.2",
        "com.unity.xr.legacyinputhelpers": "2.1.7",
        "com.unity.xr.management": "4.0.0-pre.3",
        "com.unity.xr.oculus": "1.7.0",
        "com.unity.xr.openxr": "1.0.2",
     
  31. Flamesilver

    Flamesilver

    Joined:
    Apr 10, 2013
    Posts:
    4
    I've encountered multiple bugs in this release, including the Attach Transform in XRGrabInteractable not working properly in certain modes. And now, when I try to inherit from XRBaseInteractor, any variables I add won't show up in the Inspector no matter what I do (even after trying [System.Serializable] and [SerializeField]).
     
    hayhilal likes this.
  32. ArmanUnity

    ArmanUnity

    Joined:
    Nov 29, 2015
    Posts:
    22
    For AR scaling, I have experienced a bug: if I scale an object down to the minimum scale value or up to the maximum scale value using a pinch, I need to remove my fingers from the screen and start a new pinch gesture before I can control the scale again. For example, if I scale an object down to its minimum scale by pinching (moving my fingers towards each other) and then try to move my fingers apart (to scale up), it doesn't work; but if I end the touch by lifting my fingers and then do a move-fingers-apart gesture, it works. I think it should work continuously without ending the touch. Or is there a parameter I need to adjust in ARScaleInteractable.cs?
     
  33. flipwon

    flipwon

    Joined:
    Dec 29, 2016
    Posts:
    179
    I'm trying my damndest to force an interactor to select a grab interactable when selecting another simple interactable. Does anyone know of a way to do this without janking the XR Toolkit code?
     
  34. chris-massie

    chris-massie

    Unity Technologies

    Joined:
    Jun 23, 2020
    Posts:
    232
    I touched on some of what you would need to do in another post. Unfortunately, that area of UI interaction has not been improved since that post, so you will still need to write your own replacement Input Module that supports translating joystick input into either virtual scroll wheel values or drags. You will hit difficulties due to much of the API still being internal or private, so it will not be an easy set of changes. Look at UIInputModule.ProcessMouse, which calls ProcessMouseScroll in order to send the scroll amount in PointerEventData.scrollDelta to the hovered UI element. You will need to update your version of the ProcessTrackedDevice method to send the scroll amount in a similar manner.

    Use the public roadmap to vote on or submit feature requests for this. I am also planning on opening up the API in XRRayInteractor before the final 1.0 release to significantly reduce the amount of code users will need to write to modify the PointerEventData generated by that class.

    I don't know of any known issues with attach transforms in Grab Interactable; please submit bug reports so that we can address them. As for the Inspector, a change was made in the current version to make the custom appearance also apply to derived classes, which is great if you are just overriding methods, but not ideal if there are additional serialized fields that need to be shown in the Inspector. We are going to fix this so additional SerializeField variables show up in the Inspector automatically without needing to create a custom Editor.

    Until then, you will need to create a custom Editor. Instructions are described in the documentation under Extending the XR Interaction Toolkit. You can also use DrawDefaultInspector if you want to bypass the custom appearance entirely.
    Code (CSharp):
    using UnityEditor;

    [CustomEditor(typeof(ExampleInteractor), true), CanEditMultipleObjects]
    public class ExampleInteractorEditor : Editor
    {
        public override void OnInspectorGUI()
        {
            DrawDefaultInspector();
        }
    }
     
    hayhilal likes this.
  35. chris-massie

    chris-massie

    Unity Technologies

    Joined:
    Jun 23, 2020
    Posts:
    232
    You can change the Elastic Ratio Limit on the AR Scale Interactable to control how much you can expand beyond the range limit before it cancels the gesture. If you want to disable the canceling entirely, set that value to Infinity so you can keep pinching to change size after it hits the elasticity limit controlled by the Elasticity property.
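
If you prefer to set this from code, a sketch (assuming the limit is exposed as the elasticRatioLimit property on ARScaleInteractable; verify the name against your package version):

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit.AR;

public class DisableScaleGestureCancel : MonoBehaviour
{
    void Start()
    {
        // Infinity means the pinch gesture is never canceled at the scale limits.
        var scaleInteractable = GetComponent<ARScaleInteractable>();
        scaleInteractable.elasticRatioLimit = float.PositiveInfinity;
    }
}
```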
     
    ArmanUnity likes this.
  36. chris-massie

    chris-massie

    Unity Technologies

    Joined:
    Jun 23, 2020
    Posts:
    232
    You can manually trigger a selection by using XRInteractionManager.SelectEnter. Add this component to a GameObject and set Source Interactable to the Simple Interactable, and Interactable To Select to the Grab Interactable. The script basically waits until the Simple Interactable is selected, gets the Interactor that did the selection, and then clears the selection before selecting the Grab Interactable. This assumes they are all part of the same Interaction Manager.
    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR.Interaction.Toolkit;

    public class RedirectExample : MonoBehaviour
    {
        [SerializeField]
        XRBaseInteractable m_SourceInteractable;

        [SerializeField]
        XRBaseInteractable m_InteractableToSelect;

        XRBaseInteractor m_SourceInteractor;

        bool m_ForceSelect;

        protected void OnEnable()
        {
            if (m_SourceInteractable != null)
                m_SourceInteractable.selectEntered.AddListener(OnSourceInteractableSelected);
        }

        protected void OnDisable()
        {
            if (m_SourceInteractable != null)
                m_SourceInteractable.selectEntered.RemoveListener(OnSourceInteractableSelected);
        }

        void OnSourceInteractableSelected(SelectEnterEventArgs args)
        {
            // Save off the Interactor that selected the Interactable that triggered this.
            // Wait until Update to force select something else to allow any other listeners to
            // this event to process this selection before this behavior deselects.
            m_SourceInteractor = args.interactor;
            m_ForceSelect = true;
        }

        protected void Update()
        {
            // Requires that the Interactable to select and the Interactor
            // are registered with the same Interaction Manager.
            if (!m_ForceSelect ||
                m_SourceInteractor == null ||
                m_InteractableToSelect == null ||
                m_InteractableToSelect.interactionManager == null ||
                m_InteractableToSelect.interactionManager != m_SourceInteractor.interactionManager)
            {
                return;
            }

            var manager = m_SourceInteractor.interactionManager;

            // First deselect if the Interactor has something selected
            if (m_SourceInteractor.selectTarget != null)
                manager.SelectExit(m_SourceInteractor, m_SourceInteractor.selectTarget);

            // Now select the desired Interactable
            manager.SelectEnter(m_SourceInteractor, m_InteractableToSelect);

            // Done
            m_ForceSelect = false;
        }
    }
     
  37. flipwon
    This is perfect. I searched high and low but didn't think to have a closer look at the manager. Thank you so much; you've saved me a lot of jank :)
     
  38. flipwon
    I noticed when deriving from Interactables that the editor no longer allows us to see our public variables. Is there a reason you've changed the editor on these?
     
  39. chris-massie (Unity Technologies)
    See the last two paragraphs of my post #34. With all the properties and events on those behaviors, the default inspector wasn't great for derived classes, but the change to make the custom Editor apply to derived classes makes it bad for people who have added properties. I'm going to update the Editor classes to take advantage of DrawPropertiesExcluding to draw all of the unknown fields so users won't need to create a custom editor to see them.

    EDIT: If you don't want to create a custom Editor until I fix it, you can also right-click the Inspector tab and switch from Normal to Debug to see the field you added, then switch back to Normal.
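    In the meantime, a custom Editor can use the same approach. A minimal sketch (ExampleInteractor stands in here for whatever derived class you have with added serialized fields):

```csharp
using UnityEditor;

[CustomEditor(typeof(ExampleInteractor), true), CanEditMultipleObjects]
public class ExampleInteractorEditor : Editor
{
    public override void OnInspectorGUI()
    {
        serializedObject.Update();

        // Draw every serialized field except the script reference,
        // including fields added by derived classes.
        DrawPropertiesExcluding(serializedObject, "m_Script");

        serializedObject.ApplyModifiedProperties();
    }
}
```

    Unlike DrawDefaultInspector, this lets you exclude specific properties by name while still picking up any fields Unity serializes on the derived type.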
     
  40. Alex-CG
    I made a thread about this and published a hack to work around this issue.
    https://forum.unity.com/threads/ins...bers-of-derived-classes.1047134/#post-6827297

    I'd also like to know whether the best way to leave feedback is this thread or a new post with the "Feedback" prefix. In any case, I made a new Feedback post, so I'll just link it here to avoid pasting the same text:
    https://forum.unity.com/threads/automatic-references-and-prefabs.1068128/
     
  41. dnnkeeper
    Is there a way to make XRGrabInteractable hold an object at the point where I grabbed it with the Direct Interactor, without snapping the object to the interactor position? It seems like an obvious feature for interacting with objects, but I can't find a straightforward way of implementing such behaviour. I can't override the internal method OnSelectEntering of XRGrabInteractable without forking and modifying the package to achieve this effect.
     
  42. flipwon
    Have you tried changing the interactable's attach point on select?
     
  43. chris-massie (Unity Technologies)
    Either way of leaving feedback is fine with me; we'll see it in either case.

    That feature isn't currently built into the package, but the OnSelectEntering method is protected internal, which means you can override it within a derived class. The signature of your method would be protected override void OnSelectEntering(SelectEnterEventArgs args). You should be able to change the attach Transform pose within that method to achieve what you want.
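    As an illustration, a minimal sketch of that override (the class name OffsetGrabInteractable is made up here) that moves the attach transform to the grab point so the object keeps its current pose instead of snapping:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

public class OffsetGrabInteractable : XRGrabInteractable
{
    protected override void OnSelectEntering(SelectEnterEventArgs args)
    {
        // Lazily create an attach transform we can reposition per grab,
        // so the Inspector-assigned default (if any) is not required.
        if (attachTransform == null)
        {
            var grabAttach = new GameObject("Offset Grab Attach").transform;
            grabAttach.SetParent(transform, false);
            attachTransform = grabAttach;
        }

        // Match the interactor's attach pose at the moment of selection so
        // the object is held where it was grabbed rather than snapping
        // to the controller.
        attachTransform.position = args.interactor.attachTransform.position;
        attachTransform.rotation = args.interactor.attachTransform.rotation;

        base.OnSelectEntering(args);
    }
}
```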
     
  44. Skinzart
    @chris-massie is it possible to add an out GameObject parameter to this?
    public bool TryGetHitInfo(out Vector3 position, out Vector3 normal, out int positionInLine, out bool isValidTarget);
    By the way, when do you plan to release the next version?
     
  45. dpcactus
    I made my own XR Socket Interactor script and extended it with a tag comparison. It worked fine on 0.10.0-preview.7, but since I updated to 1.0.0-pre.2 I cannot see my public variables in the Inspector.

    Code (CSharp):
    using System.Collections;
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.XR.Interaction.Toolkit;

    public class XRSocketInteractorTag : XRSocketInteractor
    {
        public string targetTag;

        public override bool CanSelect(XRBaseInteractable interactable)
        {
            return base.CanSelect(interactable) && interactable.CompareTag(targetTag);
        }

        public override bool CanHover(XRBaseInteractable interactable)
        {
            return base.CanHover(interactable) && interactable.CompareTag(targetTag);
        }
    }
     
  46. Thimo_
    I'm trying to build the key-lock mechanic from the VR escape room tutorial. There's a component that disables the Hands layer mask on the key so that you can't grab it anymore. In my own project, only the Nothing layer mask works. My whole XR Rig GameObject is set on the Hands layer mask, but for some reason I can still grab those objects. Is this a bug, or is there something I missed?

    It happens in every test scene I try except for the escape room scene, using version 1.0.0-pre.2. I also use this version in the escape room project, and there it works as intended.
     
    Last edited: Mar 16, 2021
  47. chris-massie (Unity Technologies)
    1.0.0-pre.3 will be released very soon; it is going through QA and will be published as soon as that completes. In that next version, Ray Interactor will have new methods for obtaining the 3D RaycastHit and/or the UI RaycastResult, and the GameObject can be obtained through those structs.
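    Once that version is out, usage could look something like the sketch below. The method name TryGetCurrent3DRaycastHit is an assumption based on the description above; check the release notes for the final API:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

public class RayHitReader : MonoBehaviour
{
    [SerializeField]
    XRRayInteractor m_RayInteractor;

    void Update()
    {
        // Assumed pre.3 API: retrieve the current 3D raycast hit,
        // then read the GameObject from the standard RaycastHit struct.
        if (m_RayInteractor.TryGetCurrent3DRaycastHit(out RaycastHit hit))
            Debug.Log($"Ray is hitting {hit.collider.gameObject.name}");
    }
}
```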

    See my post #39. 1.0.0-pre.3 fixes this so added fields in derived classes will appear in the Inspector without needing custom Editor classes.

    The Interaction Layer Mask property on Interactors and Interactables uses the Layers defined in the project. The checks that determine whether they are compatible for interaction do not depend on the layer of any GameObject: an Interactor can interact with an Interactable as long as there is any overlap in that Interaction Layer Mask property. The Layer of the GameObject affects whether Physics collisions and raycasts will hit, but that's separate from the Interaction Layer Mask property.
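    To illustrate the distinction, a minimal sketch of toggling grabbability via the Interaction Layer Mask from script (the LockInteractable class and the "Hands" layer name are assumptions for this example):

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

public class LockInteractable : MonoBehaviour
{
    public void LockKey(XRBaseInteractable interactable)
    {
        // Clearing the Interaction Layer Mask prevents all interaction.
        // This is independent of the GameObject's own Layer, which only
        // affects physics collisions and raycasts.
        interactable.interactionLayerMask = 0;
    }

    public void UnlockKey(XRBaseInteractable interactable)
    {
        // Interactors whose Interaction Layer Mask overlaps "Hands"
        // can interact with this Interactable again.
        interactable.interactionLayerMask = LayerMask.GetMask("Hands");
    }
}
```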
     
  48. Alex-CG
    I'm not completely sure if this is the right place to write this, but since this is an issue related to the new VR systems, I think this might be relevant here.

    This last week I've been integrating stereoscopic 360 videos in a project, and I saw there's this repository (last modified 4 years ago):
    https://github.com/Unity-Technologies/SkyboxPanoramicShader

    which has already been integrated into Unity since 2019.3:
    https://docs.unity3d.com/Manual/shader-skybox-panoramic.html

    but there are some lines in the shader Editor (around line 46) getting information from the now-deprecated VR PlayerSettings:

    Code (CSharp):
    // No 3D settings unless PlayerSettings have VR support.
    m_Show3DControl.value = PlayerSettings.virtualRealitySupported;
    if (EditorGUILayout.BeginFadeGroup(m_Show3DControl.faded))
        ShowProp(materialEditor, FindProperty("_Layout", props));
    EditorGUILayout.EndFadeGroup();
    As you can see, this hides the stereoscopic functionality in the shader behind a FadeGroup, so the only solution left is to miraculously find this repository, grab this version of the shader, and modify the editor to show these options, or to implement another solution for stereoscopic videos.
     
  49. Skinzart
    Wow, nice! I can't wait.
    Have you tried SendHapticImpulse on an Oculus Quest? I don't know why it doesn't work.
     
  50. dpcactus
    1.0.0-pre.3 can be installed via the Package Manager, but I cannot find any changelog for it.
     
Thread Status:
Not open for further replies.