Official XR Interaction Toolkit Preview Release (0.9)

Discussion in 'XR Interaction Toolkit and Input' started by mfuad, Dec 17, 2019.

Thread Status:
Not open for further replies.
  1. a436t4ataf

    a436t4ataf

    Joined:
    May 19, 2013
    Posts:
    1,924
    PS: I'm loving XRInputToolkit so far, great for a Preview, so I'm not surprised that I'm running into a lot of walls of (what seems to be) missing features or docs, that's fine. I just want to work out what's a bug (and needs reporting) vs what's my misunderstanding (and needs me to read / learn more :)).

    My end goal is to VR-upgrade this 5-year-old asset: https://assetstore.unity.com/packages/tools/modeling/snap-and-plug-21930 (in use on a bunch of non-VR projects, and some legacy VR projects), probably by creating custom extensions of XRBaseInteractable, so that you can snap any-shaped-objects to any-shaped-objects and use the (existing) asset features to manage the created blobs. From the existing codebase I already have good management of Unity Physics integration, Editor + Runtime/player integration - but it's all mouse-or-touch-screen oriented at the moment. It "works" in VR, but it's not easy to write VR-specific integrations for at the moment.
     
  2. ZeBraNS

    ZeBraNS

    Joined:
    Feb 21, 2015
    Posts:
    40
    Hi,
    is there a simulated camera rig that can be used for testing VR features using mouse and keyboard?
    Something like VRTK has.
     
    ImpossibleRobert and a436t4ataf like this.
  3. a436t4ataf

    a436t4ataf

    Joined:
    May 19, 2013
    Posts:
    1,924
    If anyone else is working on this, please let me know, I'd like to get involved.
     
  4. a436t4ataf

    a436t4ataf

    Joined:
    May 19, 2013
    Posts:
    1,924
    This thread's great, but it's getting painful to find things I vaguely remember seeing in there ... somewhere ... around page 2 ... or page 4 ... or was it 3?

    So I've started a FAQ list from this thread with what I think are the up-to-date answers for common questions that I had (and others had already answered/resolved): http://snapandplug.com/xr-input-toolkit-2020-faq/ plus other things I find/discover elsewhere (e.g. if I find alternative CurvedCanvases I'll link them in too)

    EDIT: oops, URL has changed - the one in my sig I keep updating, and I've now updated this post too.
     
    Last edited: Apr 29, 2020
  5. a436t4ataf

    a436t4ataf

    Joined:
    May 19, 2013
    Posts:
    1,924
    Like this?



    I wanted something that was clickable, but also thumb-stick selectable, so I did both together. If this is what you're looking for, I can share the code (about 150-200 lines, not sure if this is the "correct" way but it's working fine so far).
     
  6. dwatt-hollowworldgames

    dwatt-hollowworldgames

    Joined:
    Apr 26, 2019
    Posts:
    104
    The Curved UI store Asset works fine with xr interactions once you gut it of anything input related.
     
    Matt_D_work and a436t4ataf like this.
  7. stippy

    stippy

    Joined:
    Mar 1, 2020
    Posts:
    9
    I am trying to get smooth locomotion working. However I face an issue with the forward direction. Not sure if my thinking is correct, but:
    (working on oculus quest)
    - My XRrig is parent to the camera offset, which is parent to main camera / left controller / right controller.
    - I am using the left controller axis to move forward (pushing forward = moving forward)
    - I am using the snap turn provider on the XRrig to turn the forward direction via the right hand controller axis --> this works as intended and turns also the camera (as camera is child to XRrig)
    - I would like to additionally adjust the forward direction of the player according to the camera direction (= pushing left controller axis forward means walking forward and when turning the head (=camera), then also the moving direction should be turned in the same way)

    I tried that with
    Code (CSharp):
    system.xrRig.MatchRigUpRigForward(Vector3.up, _camera.transform.forward);
    But as the camera is child to the XRrig, this leads to endless turning, when the camera is initially turned out of zero position.

    Is my approach wrong? Am I missing something obvious?
     
  8. Matt_D_work

    Matt_D_work

    Unity Technologies

    Joined:
    Nov 30, 2016
    Posts:
    202
    thinking about it, because rotating the XR Rig will also rotate the camera forward vector you will have the situation where you will always be chasing whatever relative angle the headset has.

    i assume you want to _move_ the player along the camera's current world forward. which is different to trying to rotate it to match the forward.
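    A minimal sketch of that approach - move the rig along the camera's current world forward, flattened to the horizontal plane, instead of rotating the rig (class and field names here are illustrative, not from the toolkit):

```csharp
using UnityEngine;

// Moves the rig along the headset's current facing, projected onto the
// horizontal plane, instead of rotating the rig to match the camera.
public class CameraRelativeMove : MonoBehaviour
{
    [SerializeField] Transform rig;   // the XR Rig root
    [SerializeField] Transform head;  // the Main Camera under the rig
    [SerializeField] float speed = 2f;

    // Call from Update with the left-thumbstick axis values.
    public void Move(Vector2 input)
    {
        // Flatten camera directions so looking up/down doesn't change speed.
        Vector3 forward = Vector3.ProjectOnPlane(head.forward, Vector3.up).normalized;
        Vector3 right   = Vector3.ProjectOnPlane(head.right,   Vector3.up).normalized;
        rig.position += (forward * input.y + right * input.x) * speed * Time.deltaTime;
    }
}
```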
     
  9. a436t4ataf

    a436t4ataf

    Joined:
    May 19, 2013
    Posts:
    1,924
    The easy way is not to match them up :). If you want to match them up, though, the standard way I'd do it (pre XRInput) is:

    1. read the world direction of the camera, save it to a variable
    2. reset the LOCAL direction of the camera (relative to the rig) to Vector3.forward
    3. call the Match-Up function, using the saved value.

    This should just work, no?
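    A rough sketch of those three steps (untested; note the tracked camera's local rotation can't be written directly, so this zeroes the rig's own rotation instead before calling the preview's MatchRigUpRigForward):

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Re-aligns the rig so the camera's current heading becomes the rig's
// forward, avoiding the feedback loop of matching against a child transform.
public class RigForwardReset : MonoBehaviour
{
    [SerializeField] XRRig rig;
    [SerializeField] Transform cam;  // the tracked Main Camera

    public void ResetForward()
    {
        // 1. save the camera's world heading
        Vector3 savedForward = cam.forward;
        // 2. zero the rig's rotation, so the camera's offset doesn't compound
        rig.transform.rotation = Quaternion.identity;
        // 3. match the rig to the saved heading
        rig.MatchRigUpRigForward(Vector3.up, savedForward);
    }
}
```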
     
  10. stippy

    stippy

    Joined:
    Mar 1, 2020
    Posts:
    9
    Yes, this is correct for sure. But if I do it that way and additionally use the snap turn provider (as is, without changes), the snap turn provider turns the rig more than the camera, so trying to walk forward moves the player sideways.

    Code (CSharp):
    private void Update()
    {
        // Get input of how fast the player wants to move
        _stepSize = -Input.GetAxis("XRI_Left_Primary2DAxis_Vertical");
        // Get the direction the player is looking in
        _lookDirection = _camera.transform.forward;
        // remove the vertical component of the vector
        _lookDirection = Vector3.Scale(_lookDirection, new Vector3(1, 0, 1));
    }

    private void FixedUpdate()
    {
        // move the player
        _rigidbody.MovePosition(transform.position + Time.deltaTime * currentSpeed * _stepSize * transform.TransformDirection(_lookDirection));
    }
     
  11. a436t4ataf

    a436t4ataf

    Joined:
    May 19, 2013
    Posts:
    1,924
    GrabInteractable:

    With the default MovementType (Kinematic) I get major juddering - holding an object at arm's length away, and moving side to side in a medium-slow swipe (about the speed of the jedi "these are not the droids you're looking for" gesture :)), I get 1-2 INCHES of judder, with the object jumping back and forth that much every frame, so it looks doubled, with horrible flicker for the player.

    Switching this to MT = Instantaneous, it almost entirely goes away.

    Is there something wrong / misconfigured?

    I see the same juddering on the Interactor (i.e. the RightHandController/XRController provided by the default XR "Room scale Rig" asset) that XRInputToolkit provides - it's so bad that if I pick up a GrabInteractable, and move it sideways, the red/white line renderer from XRInput's default TrackedDeviceRaycaster/Canvas detector keeps flickering red/white/red/white as it keeps "detecting" that the grab interactiable is under the raycast - oh! no it isn't! - yes it is again - oh! no it isn't! - yes it is again ... etc.

    This seems really really wrong. I worried there was something wrong with my hardware / device latency etc - but the Instantaneous mode shows that's not the case.

    Should we just manually change all the defaults away from Kinematic? At the moment out of the box it's unusable for most people (quickly incites nausea in normal people. I'm fine but I've been testing VR for 4 years, so I've got good at selectively ignoring/tuning out a lot of the visual cues that cause problems :))

    PS: I loved that the attachTransform Just Worked, and was intelligently interpreted - it takes both the offset and rotation into account. Very easy to use well. Although I'm going to extend GrabInteractable with an OnGizmos so that I can visualise the attach points more easily, helps a lot with scene setup.
     
  12. a436t4ataf

    a436t4ataf

    Joined:
    May 19, 2013
    Posts:
    1,924
    Trying to extend XRRayInteractor to add a simple feature ("make the attach point depend on where on the interactable the original ray hit" - the data is all there, it's only a few lines of code change), I ran into two problems:

    1. Can't extend XRRayInteractor directly because all the member variables are on default permissions; this means that although the methods are virtual, in practice they can't be overridden.

    Could the member variables be changed to "protected"? Please? :)

    2. Can't clone XRRayInteractor because it uses a couple of core structs and classes that have been marked as "internal" in the source. I'm not sure why? (SortingHelpers is internal??? .. ImplementationData makes more sense, but still ... since it prevents us from using XRRayInteractor and means we have to re-invent the entire wheel, I'm not convinced it should be private).

    Are we supposed to not extend XR classes/features, and instead re-build our own parallel implementation of everything?
     
  13. a436t4ataf

    a436t4ataf

    Joined:
    May 19, 2013
    Posts:
    1,924
    Question: When an Interactor has grabbed an Interactable, how do we offset the position programmatically?

    UPDATE: see https://forum.unity.com/threads/xr-interaction-toolkit-preview-release.795684/page-6#post-5577148 for a temporary workaround using reflection.

    Answer: (Appears to be) you can't; the XRGrabInteractable method for updating the offset internally (void UpdateInteractorLocalPose(XRBaseInteractor interactor))
    is only called once (in OnSelectEnter), and isn't public/protected :( so you can change the data, but it won't be refreshed, and your changes will be ignored?

    For instance ... if I grab something and then want to move it closer/further away from the hand ... I can't? This is one of the basic/core interactions I've been using a lot in VR with the legacy VR APIs and my custom grab implementations. Currently this is a showstopper for using a lot of XRInputToolkit.

    (c.f. previous post -- as far as I can tell there's currently no way to extend or fix this, because XRGrabInteractable can't be overridden, etc etc etc.

    Can't extend XRGrabInteractable because the Update method is private :(.

    Can't clone XRGrabInteractable because OnSelectEnter/OnSelectExit have been marked "internal".)
     
    Last edited: Mar 11, 2020
  14. a436t4ataf

    a436t4ataf

    Joined:
    May 19, 2013
    Posts:
    1,924
    Re: extending, it currently seems designed so that programmers should NOT extend XRInputToolkit, and should build their own instead?

    From https://docs.unity3d.com/Packages/c...dex.html#extending-the-xr-interaction-toolkit

    "You can use events these to define your own behavior to hover, selection and activation state changes with no additional coding."

    ...and instead of having normal classes with normal permissions, most things are artificially locked down (heavy use of "internal", which has a mixed reputation in OOP precisely because of situations like this :); lots of core methods and vars non-public AND non-protected; etc.).

    Net effect: Most customization is impossible, because only a few "events" are exposed - e.g. core ones like Update aren't exposed, which is where interesting customization is nearly always going to happen.

    Is the goal to add the missing events (including super core stuff like Update, e.g. in the form of OnPreUpdate,OnPostUpdate etc), or is it that we should stop using most of XRInputToolkit and roll our own?
     
  15. a436t4ataf

    a436t4ataf

    Joined:
    May 19, 2013
    Posts:
    1,924
    (The immediate question for me here is whether this is going to be a core feature (which I thought it was), or a smaller niche utility/reference framework that many (most?) people won't be using. Unless the extensibility changes massively, most won't be using it for games because they just can't get it to do anything non-trivial.

    That's fine - it's not too hard to re-write XRGrabInteractable etc from scratch (although it'll take quite a lot of time) - but it means there's nothing for me to support: other devs can't build on it, and I can't add to it. So I'd better build something to replace it now, and advise my existing users to avoid it and roll their own / find a community solution.)
     
  16. Matt_D_work

    Matt_D_work

    Unity Technologies

    Joined:
    Nov 30, 2016
    Posts:
    202
    we certainly want lots of people to use it! if there's things that we can do to help the extensibility please let us know!
    we're still preview for a reason :) and we will have a patch soon with a bunch of internal -> protected/public changes (like OnDestroy! and an IXRController interface)
     
    a436t4ataf likes this.
  17. a436t4ataf

    a436t4ataf

    Joined:
    May 19, 2013
    Posts:
    1,924
    Awesome. That's really helpful to know :). Based on that, I've hacked a temporary workaround for this case which I can delete with the next update.

    If anyone else wants this: if you want to change the grab/attach offset of an object in realtime while it's already grabbed by XRGrabInteractable/XRRayInteractor, this will force the updates using reflection:

    Code (CSharp):
    // requires: using System.Reflection; using UnityEngine.XR;
    //           using UnityEngine.XR.Interaction.Toolkit;
    public void Update()
    {
        var interactor = GetComponent<XRBaseInteractor>();
        var controller = GetComponent<XRController>();

        XRGrabInteractable grabbable = interactor.selectTarget as XRGrabInteractable;
        if( grabbable != null && grabbable.attachTransform != null )
        {
            InputDevice device = InputDevices.GetDeviceAtXRNode( controller.controllerNode );

            Vector2 thumbDirection = IntelligentXRExtensions.AmountMoved(device, CommonUsages.primary2DAxis);
            float moveAwayAmount = thumbDirection.y;

            Vector3 vectorToMoveObjectAlong = grabbable.attachTransform.parent.InverseTransformVector( controller.transform.forward );
            grabbable.attachTransform.localPosition += moveAwayAmount * vectorToMoveObjectAlong;

            // Force XRGrabInteractable to re-read the attach offset (non-public method)
            MethodInfo unity_UpdateInteractorLocalPose = typeof(XRGrabInteractable).GetMethod("UpdateInteractorLocalPose", BindingFlags.NonPublic | BindingFlags.Instance);
            unity_UpdateInteractorLocalPose.Invoke(grabbable, new object[] { interactor });
        }
    }
    NB: this sometimes goes very juddery until you drop the object and re-grab, so I think it's interfering with the internal smoothing algorithm somehow (but we're poking a non-public method, so ... Deal With It).

    NB2: "AmountMoved" is just a thin wrapper for
    "(InputDevice obj).TryGetFeatureValue(moveable, out Vector2 movement)" which adds some error handling/detection stuff.
     
    Last edited: Mar 11, 2020
    vikash_ra1 likes this.
  18. Matt_D_work

    Matt_D_work

    Unity Technologies

    Joined:
    Nov 30, 2016
    Posts:
    202
    there will always be some lag with using physics to move the object. if you have a good repro for extreme juddering we'd love to hear about it.
     
  19. kavanavak

    kavanavak

    Joined:
    Sep 30, 2015
    Posts:
    54
    Is anyone else seeing controllers with the XR Direct Interactor interacting with XR Grab Interactable objects that are inactive in the scene? If not, I can create a test scene, but it feels like a bug.

    I made a small script that disables my hands' Direct Interactors when the grabbable object within reach is inactive - to work around this and keep coding - but this might be something to look into?
     
    Danielsantalla likes this.
  20. Danielsantalla

    Danielsantalla

    Joined:
    Jan 7, 2015
    Posts:
    75
    Anyone knows how to use haptics on the oculus quest? I'm confident that I configured the components correctly and yet I can't feel the haptics
     
  21. a436t4ataf

    a436t4ataf

    Joined:
    May 19, 2013
    Posts:
    1,924
    FYI for other readers / future reference - even with super-simple projects, I can't narrow it down. Literally in the space of 30 seconds I'll go from one interaction which is butter-smooth to a different one which is super-juddery, and back. Nothing changed. If I cancel and re-interact (currently: only using XRGrabInteractable / XRRayInteractor), there's a 2-in-3 chance of horrible juddering, and 1/3 of "oh, it's just fixed itself". Once it fixes itself it's usually fine for a while.

    The juddering (once it happens) is permanent during the interaction - it's not a temporary CPU/GPU problem, it starts and stops with the interaction starting/stopping.

    But I so far - even with super simple test! - cannot spot a pattern to why/why not.

    Last thought: the telltale signs of being CPU-bound (the drug-trip warping of straight lines and wobbling of the 3D world, because Oculus's frame-sending algorithm has run out of data) never coincide with this, so I am pretty certain it's not a raw perf problem.
     
  22. dakomz

    dakomz

    Joined:
    Nov 12, 2019
    Posts:
    40
    Also - how would I differentiate between "fired into the air" and "selected UI menu"... do I need to cache some state somewhere when the UI is hovered, and then check that before firing weapon?
     
  23. a436t4ataf

    a436t4ataf

    Joined:
    May 19, 2013
    Posts:
    1,924
    Does the standard UnityUI "IsPointerOverGameObject" not work? I thought XRInput was integrating with the normal EventSystem/Canvas/UI raycasting, so this should work as normal?
     
    dakomz likes this.
  24. Danielsantalla

    Danielsantalla

    Joined:
    Jan 7, 2015
    Posts:
    75

    I had an issue with this too. When I let go of the grabbed object and destroy it, it destroys the hand. To fix it I had to add a coroutine that waits two seconds after releasing the object and then destroys it.

     
    kavanavak likes this.
  25. dakomz

    dakomz

    Joined:
    Nov 12, 2019
    Posts:
    40
    Hmmm maybe, but I think I actually _want_ to cache the current state via hover events.

    When I try to add functions to invoke on hover though, nothing happens - even though the line renderer does turn white.

    Any ideas? Here's where I'm adding the events, on the RightHandController's XR Ray Interactor component

    upload_2020-3-15_23-8-5.png

    Edit: also couldn't get the IsPointerOverGameObject approach to work, fwiw...
     
    Last edited: Mar 15, 2020
  26. harleydk

    harleydk

    Joined:
    Jul 30, 2012
    Posts:
    41
    I'm having the hardest time getting the XR Grab Interactable to set the parent of the gameobject it's attached to. Case in point: I'm trying to pick up a pair of sunglasses and want to stick it to the main camera of the XR Rig.

    Like so:


    But it won't work when the referenced parent is the same object the Interactable is attached to. If I specify any other gameobject (same configuration, also with an XR Grab Interactable attached) it works fine. Is this to be expected, or should I file a bug report? Thanks in advance.

    **EDIT: this was registered as a bug that will be fixed in a later release.
     
    Last edited: Apr 6, 2020
  27. dakomz

    dakomz

    Joined:
    Nov 12, 2019
    Posts:
    40
    Another question... is there a tutorial video about how to make like an avatar for the hand/controller in the new system? In other words place a mesh at the base that stays with the controller in the various systems (e.g. has 6dof in quest, rotation-only in go, etc.)
     
  28. a436t4ataf

    a436t4ataf

    Joined:
    May 19, 2013
    Posts:
    1,924
    What info is needed in that tutorial? I've been setting this up the same as I would in a non-VR game, using the info in normal Unity tutorials/docs, and it's working OK so far (position your meshes using transforms/parenting, then instead of RigidBody constraints, use TrackedPoseDriver with its different flags to control "how" you want that mesh to move in 3D, if you want to constrain a controller to eg rotate-only).
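    For example, a hypothetical setup script (the component and enum names are from UnityEngine.SpatialTracking's TrackedPoseDriver, which the default rigs use; everything else here is illustrative):

```csharp
using UnityEngine;
using UnityEngine.SpatialTracking;

// Attach to the root of a controller-avatar mesh. TrackedPoseDriver drives
// the transform from the device pose; TrackingType restricts which parts
// of the pose are applied (e.g. rotation-only on 3DoF devices like Go).
public class ControllerAvatarSetup : MonoBehaviour
{
    void Awake()
    {
        var driver = gameObject.AddComponent<TrackedPoseDriver>();
        driver.SetPoseSource(TrackedPoseDriver.DeviceType.GenericXRController,
                             TrackedPoseDriver.TrackedPose.RightPose);
        // RotationOnly mimics a 3DoF controller; use RotationAndPosition for 6DoF.
        driver.trackingType = TrackedPoseDriver.TrackingType.RotationOnly;
        driver.updateType = TrackedPoseDriver.UpdateType.UpdateAndBeforeRender;
    }
}
```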
     
  29. dakomz

    dakomz

    Joined:
    Nov 12, 2019
    Posts:
    40
    Hmm when I tried just parenting it that didn't work. Maybe I had something wonky going on, will try again - thanks.
     
  30. a436t4ataf

    a436t4ataf

    Joined:
    May 19, 2013
    Posts:
    1,924
    Video I recorded for someone else on similar subject (getting fingers + controller + buttons to all animate automatically), is this what you're talking about? Or have I misunderstood? -
     
    harleydk and dakomz like this.
  31. dakomz

    dakomz

    Joined:
    Nov 12, 2019
    Posts:
    40
    great, thanks :)
     
  32. T3ddyTheGiant

    T3ddyTheGiant

    Joined:
    Aug 1, 2018
    Posts:
    11
    I gutted the CurvedUI asset, input works fine now...however the xr caster doesn't respect the curve (not surprising). I'm assuming there isn't a more elegant approach than building off CurvedUI's CustomRaycaster?
     
  33. harleydk

    harleydk

    Joined:
    Jul 30, 2012
    Posts:
    41
    Great! So, the obvious question... 'soon'? ;-)
     
  34. dakomz

    dakomz

    Joined:
    Nov 12, 2019
    Posts:
    40
    Nope, not yet... :\
     
  35. dwatt-hollowworldgames

    dwatt-hollowworldgames

    Joined:
    Apr 26, 2019
    Posts:
    104
    I use my own input module rather than the interaction toolkit, and I haven't seen too many issues. Yes, graphics raycasters will see it as flat, for sure, but it does work. I was never a fan of the CurvedUI input code, as it gave me no end of trouble with canvases attached to things that move. Also, you have to know that the z scale needs to match the x,y scale or the curve is weird.
     
  36. dakomz

    dakomz

    Joined:
    Nov 12, 2019
    Posts:
    40
    Hard-coded workarounds aside, isn't it a bug that the `onHover` event isn't firing - or am I completely misunderstanding what this event does?
     
  37. jiraphatK

    jiraphatK

    Joined:
    Sep 29, 2018
    Posts:
    293
    Have you tried logging it ?
    Mine has the same problem: OnHoverExit gets called when the state changes to OnSelect, and whenever the ray moves off a collider, even if there are still hover targets in the list.
     
  38. a436t4ataf

    a436t4ataf

    Joined:
    May 19, 2013
    Posts:
    1,924
    I found a lot of hands on the asset store that look great but are missing the core animations for VR (and DEFINITELY are missing the specific animations for individual controllers), so I prototyped a modular framework that can import a Hand and re-animate it for VR. Here's the first version importing MonsterHands (https://assetstore.unity.com/packages/3d/characters/vr-monster-hands-86609):



    If anyone else wants this, I'll package it up and put on the asset store. The idea is that it's one-click to import new VR Hands and have them auto-animate correctly, using the XRInput framework. At the moment I'm manually fixing the anims, I'll try IK'ing them but they'll probably still need cleanup by hand. I'm happy to integrate any Hands you're using if I can get a copy myself (I love giving players the choice of swapping hands in and out right now :)).
     
    mrwhip and harleydk like this.
  39. dakomz

    dakomz

    Joined:
    Nov 12, 2019
    Posts:
    40
    nm I guess I was doing something wrong before, works fine now
     
  40. dakomz

    dakomz

    Joined:
    Nov 12, 2019
    Posts:
    40
    Yeah, in this screenshot, `ActionHandler.OnHoverEnter` logs... but nada

     
  41. a436t4ataf

    a436t4ataf

    Joined:
    May 19, 2013
    Posts:
    1,924
    Re: initializing XR ... I just noticed that if I hit Play on a project that has XRInteractionToolkit installed, but no headset plugged in, I get hit with Unity errors, e.g.:

    "Unable to start Oculus XR Plugin.
    UnityEngine.Debug:LogError(Object)
    Unity.XR.Oculus.OculusLoader:Initialize() (at Library/PackageCache/com.unity.xr.oculus@1.2.0/Runtime/OculusLoader.cs:110)
    UnityEngine.XR.Management.XRGeneralSettings:AttemptInitializeXRSDKOnLoad() (at Library/PackageCache/com.unity.xr.management@3.0.6/Runtime/XRGeneralSettings.cs:175)"

    ...on the one hand: this is useful! If for some reason the headset has become physically disconnected or ran out of battery without me noticing (saves me a lot of debugging pain!)

    But on the other hand: this is a disaster! It means a game that is able to work both with and without VR is going to always shove errors to the console :(.

    I *think* the solution is to uninstall / delete the XR plugins (all of them!) and instead manually load them at runtime, and then try/catch errors - but I have no idea how to do this. Can't find any examples of programmatic XR plugin management. Has anyone figured that out yet?
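    One approach that should work with XR Plug-in Management (a sketch, assuming "Initialize XR on Startup" is disabled in Project Settings > XR Plug-in Management, so nothing auto-loads and throws those errors):

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.XR.Management;

// Tries to start XR at runtime; falls back to non-VR if no loader succeeds
// (e.g. headset unplugged or out of battery).
public class ManualXRStart : MonoBehaviour
{
    IEnumerator Start()
    {
        yield return XRGeneralSettings.Instance.Manager.InitializeLoader();

        if (XRGeneralSettings.Instance.Manager.activeLoader == null)
        {
            Debug.Log("No headset available - continuing in non-VR mode.");
            yield break;
        }
        XRGeneralSettings.Instance.Manager.StartSubsystems();
    }

    void OnDestroy()
    {
        var manager = XRGeneralSettings.Instance.Manager;
        if (manager.activeLoader != null)
        {
            manager.StopSubsystems();
            manager.DeinitializeLoader();
        }
    }
}
```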
     
  42. rinkymehra

    rinkymehra

    Joined:
    Mar 20, 2020
    Posts:
    4
    Downloading the example project rather than starting with a blank project and trying to install the package worked around the issue for me!
     
    Last edited: Mar 20, 2020
  43. HereLen

    HereLen

    Joined:
    Feb 5, 2019
    Posts:
    3
    Hi! When is this coming? I need HandleInteractionAction in XRController to be protected (since I want to simulate presses via hand tracking). Or could m_SelectInteractionState, m_ActivateInteractionState and m_UIPressInteractionState be protected instead of private?
    Otherwise, great stuff!
     
    Last edited: Mar 20, 2020
  44. ibyte

    ibyte

    Joined:
    Aug 14, 2009
    Posts:
    1,047
    How do I get the controller velocity so I can set the haptic strength on my controller object when making contact with an object?

    Update: I found out it's not really part of the XR Interaction Toolkit; you read it from the controller's input device:

    Code (CSharp):
    // controllerReference is the XRController component
    controllerReference.inputDevice.TryGetFeatureValue(CommonUsages.deviceVelocity, out Vector3 velocity);
     
    Last edited: Mar 20, 2020
    a436t4ataf likes this.
  45. Thickney

    Thickney

    Joined:
    Mar 3, 2013
    Posts:
    32
    Any reason why the OnTriggerEnter/Exit functions in XRDirectInteractor would just stop working with no scene changes? The interactor was working fine then I renamed the project and when I opened it again these functions weren't being hit anymore.

    The thing I'm interacting with is an XRBaseInteractable nested in an XRGrabInteractable where the grab interactable has interaction layers set to "None" if that makes any difference.

    Restarting my computer seems to be the only thing that fixes it.
     
    Last edited: Mar 22, 2020
  46. mader_enova

    mader_enova

    Joined:
    Sep 24, 2018
    Posts:
    6
    Is it planned to integrate real hand interaction, so the fingers move depending on how I touch the controller, like with the Oculus Integration asset?
    I can't get this running with XRI.
     
  47. Freaking-Pingo

    Freaking-Pingo

    Joined:
    Aug 1, 2012
    Posts:
    310
    I cloned the XR Interaction Toolkit example projects from Github and tried them out on an Oculus Rift S, but a lot of functionality seemed broken. My camera was placed way too high relative to my height and the floor. I tried recalibrating my headset but that did not resolve it. I couldn't teleport in the scene WorldInteractionDemo.unity, but I had no trouble teleporting in Locomotion.unity.

    I was using Unity 2019.3.0f6 and an Oculus Rift S. Anyone have a hunch why this could be, or maybe have experienced the same?
     
  48. a436t4ataf

    a436t4ataf

    Joined:
    May 19, 2013
    Posts:
    1,924
    I think this one should be public:

    Code (CSharp):
    // Calculate the world position/rotation to place this object at when selected
    public class XRGrabInteractable : XRBaseInteractable
    {
    ...
        Vector3 GetWorldAttachPosition(XRBaseInteractor interactor)
        {
            return interactor.attachTransform.position + interactor.attachTransform.rotation * m_InteractorLocalPosition;
        }
    ...
    }
    ...because right now I'm having to copy/paste big chunks of the XRGrabInteractable source code just to be able to integrate with the grab/select system. e.g. this:

    Code (CSharp):
    var attachPosition = grabbable.attachTransform.position;
    var interactor = grabbable.selectingInteractor;
    var m_RigidBody = grabbable.GetComponent<Rigidbody>();
    Vector3 targetPosition = interactor.attachTransform.position + interactor.attachTransform.rotation * grabbable.attachTransform.InverseTransformDirection(m_RigidBody.worldCenterOfMass - attachPosition);
    ...because I went digging through the public interfaces trying to find that data and couldn't find it anywhere? But maybe I've missed something here (the class has no documentation yet on usage/behaviour/etc, so I'm mostly working by reading the source and reverse-engineering the design).

    I checked to see if there was an XRInteractableEvent for this (basically: an equivalent to OnMouseDrag), but couldn't find one (?).

    In case it helps, this is what I was implementing: a ghost-preview of the Grabbed item, that updates based on the XR controller data, but has its own internal data that allows it to snap items into position. Note how in the video the blue ghost is following the grabbed item, and when XR controller releases (currently using
    "XRInteractableEvent onSelectExit"), my code decides either to do nothing (let the XR controller handle the release - all the good stuff you have in there doing velocity-tracking, momenutm, etc) ... or to take over and snap the object to where the ghost was.

    If this is the wrong way to approach this, please let me know - I think it's probably a common use case?



    PS: one of the big challenges I've had so far is the heavy use of "onSomethingSomething" events that have to be hooked up in-editor, and work badly with source code (creating a persistent listener is a nightmare even for advanced C# coders; non-experts have no hope). They are a nice-to-have and I do like them, but really only for small/toy projects, not big complex ones (this is a side-effect of UnityEvent/UnityAction, not specific to XR - but you're inheriting those problems). The "ideal" would be parallel API access to everything they do (easy add/remove of listeners in source code, which shows up in-editor), but I'm fine with just having public access to all key variables and callbacks, so that we can make it a team policy of "never use the onSomething listeners in editor", and we'll be fine.

    FYI to anyone else - if you're trying to do something in code and it's not working / access is blocked, you can often achieve half of it by manually hooking things up in-editor (although debugging will then be a nightmare, since your solution is half code and half GUI-only).
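    For reference, runtime (non-persistent) listeners can at least be added from code - they just won't show up in the inspector. A sketch against the preview's onSelectEnter/onSelectExit events:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Subscribes to the interactable's select events from code instead of the
// inspector; remember to remove listeners symmetrically.
public class GrabLogger : MonoBehaviour
{
    XRGrabInteractable grab;

    void OnEnable()
    {
        grab = GetComponent<XRGrabInteractable>();
        grab.onSelectEnter.AddListener(OnGrab);
        grab.onSelectExit.AddListener(OnRelease);
    }

    void OnDisable()
    {
        grab.onSelectEnter.RemoveListener(OnGrab);
        grab.onSelectExit.RemoveListener(OnRelease);
    }

    void OnGrab(XRBaseInteractor interactor)    { Debug.Log("Grabbed by " + interactor.name); }
    void OnRelease(XRBaseInteractor interactor) { Debug.Log("Released by " + interactor.name); }
}
```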
     
    harleydk likes this.
  49. a436t4ataf

    a436t4ataf

    Joined:
    May 19, 2013
    Posts:
    1,924
    When I pick objects up using XRRayInteractor + XRController + XRGrabInteractable, XR Interaction Toolkit messes-up the rotation of the picked-up object.

    Looking in the code, it appears you're trying to force-change the grabbed-item's rotation to be the rotation that was hardcoded on the interactor object (usually: the player's hand/controller)?

    If so, please make that an optional feature, since I'm finding most of my players hate it - they try to pick something up and it immediately changes orientation, so they then have to snap their wrist back/forth to reinstate the orientation they expected it to have.

    (I'm currently trying to figure out how to undo it by pre-anti-rotating, but my previous efforts here burned a whole day and I couldn't reliably remove the forced-rotation-change. I think I only need to undo the code here, but I'm still not 100% sure why it's inverting the inversion of the transform's rotation :). I also don't understand why it's using the RigidBody's rotation instead of the GameObject/transform's rotation (!). What effect does that have? Is that why the rotation is often incorrect in the scene??):

    Code (CSharp):
    void UpdateInteractorLocalPose(XRBaseInteractor interactor)
    {
        m_InteractorLocalRotation = Quaternion.Inverse(Quaternion.Inverse(m_RigidBody.rotation) * attachTransform.rotation);
    }
     
    harleydk likes this.
  50. ibyte

    ibyte

    Joined:
    Aug 14, 2009
    Posts:
    1,047
    I have a question about transform.forward with the Oculus Rift controllers. I want to hold a sword (and shoot a ray) so that the ray and sword are parallel with the grip. Right now the ray and sword position are parallel with the flat top of the Oculus controller.

    I did notice in the XR Interaction demo scene "WorldInteraction" (trees Apple and sword) and the WorldInteractionDemo that the XRInteractorLineVisual and the XRRayInteractor are oriented in the way I want to see them but I am not sure what I need to do in my situation.
     