
Unity XR Interaction Toolkit Preview Release

Discussion in 'AR/VR (XR) Discussion' started by mfuad, Dec 17, 2019.

  1. a436t4ataf

    Joined:
    May 19, 2013
    Posts:
    830
    Try creating a child object that's rotated in the direction you want, and setting XRRayInteractor.attachTransform to that object? If you provide a value there, it tries to rotate/position grabbed objects to that orientation.

    Or use TrackedPoseDriver yourself and just read the direct values from the controller and create a child that is pre-rotated relative to the controller's normal position?
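    A minimal sketch of the first suggestion, assuming the XRIT 0.9.x preview API (the names `controller` and `attachPoint` are mine, not XRIT's):

```csharp
// Create a pre-rotated child of the controller and hand it to the interactor,
// so grabbed objects snap to that orientation instead of the controller's default.
var attachPoint = new GameObject("AttachPoint").transform;
attachPoint.SetParent(controller.transform, false);        // 'controller' = your XRController's GameObject
attachPoint.localRotation = Quaternion.Euler(45f, 0f, 0f); // whatever orientation you want grabs to use

var rayInteractor = controller.GetComponent<XRRayInteractor>();
rayInteractor.attachTransform = attachPoint;
```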
     
  2. a436t4ataf
    I was wrong :) - there was another (3rd party) script in the scene that was adding itself in onSelectEnter (which means it is invisible in the Editor - I had no way of seeing it was happening until I found it in a project-wide code search) and corrupting the attachTransform rotation before I saw it.
     
    Last edited: Mar 23, 2020
  3. ibyte

    Joined:
    Aug 14, 2009
    Posts:
    963
    Thanks for the suggestions. I have something figured out, but my concern now is that it will not be controller-agnostic. I need to test with the Vive once it is supported.
     
  4. a436t4ataf
    Well, I'd expect the specific "orientation" / rotation is never going to be controller-agnostic, because controllers are all different sizes/shapes/rotations - you'd have to do it once for each one, and different people hold them differently (different games encourage different hand positions, in my experience!) - but after the initial setup of the positions, everything else should be agnostic?
     
  5. ibyte
    I could be wrong - I thought that the controller normal for the Vive follows the grip, but Oculus does not do that? Could be that VRTK took care of those details.
     
  6. a436t4ataf
    Doesn't it depend which model you're using? And those aren't included in XRIT. Off the top of my head, the default Oculus model is aligned so that a "comfortable" hand position (which is definitely not vertical - closer to a pistol grip) has neutral rotation - which makes sense to me: it's the position closest to where most players will start each app / the idle position where they'll hold the controller.
     
  7. a436t4ataf
    @Matt_D_work here's one that's hurting extensibility: the reparenting by "XRGrabInteractable.OnSelectExit" makes perfect sense on its own - but when you try to use it with other game code, it's too rigid a design.

    Parenting in Unity is, admittedly, massively over-used :) (I remember being told by Unity engineers to use it to solve different problems/missing features at least as far back as Unity 3.5 :)), and that means a lot of code/libraries/techniques rely on it. The contract in UnityEditor/Engine has for long semi-officially been: "Unity will never change your parenting of objects out from under you".

    (I've noticed that this is one of the top 3 FAQs people keep asking: why is it re-parenting? how can I prevent it? etc)

    "XRGrabInteractable.retainTransformParent = false;" is a nice workaround for some use-cases. But not others: it doesn't prevent reparenting on pickup, only on drop. This means I am encountering a lot of code where it's easy to integrate half of XRIT with existing libraries/assets, but difficult to handle the other half (XRIT keeps messing-up the hierarchy).

    But I wonder why the reparenting is happening at all?

    Right now we seem to be getting the worst of both worlds:

    1. If XRIT reparented grabbables to the controller, then we'd have super-smooth movement for grabbed objects (which, as noted previously, we don't have - we have the choice of 3 different lagging methods, all with pros/cons)
    2. If XRIT left grabbables where they are, then we'd have 100% compatibility with all legacy code (which we don't have, because even if we disable "retainTransformParent", XRIT still deparents on grab :()

    If there's some reason why deparenting is required, then we need a lot more callbacks. The current "onThingHappening" isn't good enough in such cases, it needs to be changed to:

    1. onThingAboutToHappen // allows us to cancel it! Or alter data before it happens!
    2. onThingIsHappening // the current one: the simple case of "I don't care, just want a callback"
    3. onThingHasHappened // allows us to clean-up afterwards! Remove things that XRIT has overwritten, etc

    ...and that pattern needs to be repeated across all listeners/callbacks. Right now the pattern is that you only send us callbacks for "the thing has already been committed to, you cannot back out ... but whatever you do, we're going to overwrite your changes because we invoked this BEFORE processing the action internally" (i.e. the call to listeners is happening as the first line of code inside each XRIT base class, rather than the last line).

    TL;DR: please remove the deparenting feature :).
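    In the meantime, one hedged workaround sketch (the class name is mine; it assumes OnSelectEnter/OnSelectExit are overridable in your preview version): record the parent yourself on grab, and put it back on release:

```csharp
public class ParentPreservingGrabInteractable : XRGrabInteractable
{
    Transform m_OriginalParent;

    protected override void OnSelectEnter(XRBaseInteractor interactor)
    {
        m_OriginalParent = transform.parent;    // capture before XRIT deparents us
        base.OnSelectEnter(interactor);
    }

    protected override void OnSelectExit(XRBaseInteractor interactor)
    {
        base.OnSelectExit(interactor);
        transform.SetParent(m_OriginalParent);  // undo whatever XRIT did to the hierarchy
    }
}
```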
     
    kavanavak and andybak like this.
  8. a436t4ataf
    Nasty bug: if you create a new XRGrabInteractable in response to an old one being dropped/de-selected, then the interaction manager can (100% reproducibly) "lose" the new one, and will never allow interactors to pick up that (new) grabbable ever again :(.

    In detail: when a grabbable is being dropped (i.e. inside the onSelectExit callback), if you attempt to add a new XRGrabInteractable to any of that grabbable's parent objects, it goes horribly wrong even though they are separate GameObjects.

    This prevents a whole class of workarounds to the "XRIT keeps changing the parent and messing-up the hierarchy" issue mentioned above :).

    UPDATE: I have a workaround, using coroutines (wait a frame - the bug appears to be a call-order thing inside XRIT)

    (Case 1231482)
     
    Last edited: Mar 29, 2020
  9. dariony

    Joined:
    May 16, 2015
    Posts:
    13
    Yes, looking into this (if you've ever used VIU's simulator, check it out)
     
  10. a436t4ataf
    I think the issue I had yesterday (and sent in a bug report for) might be a different view of the same underlying bug you hit here:

    It seems that OnSelectExit does some post-processing *on the entire transform hierarchy* from the object upwards (i.e. its parent, grandparent?, etc?). Certainly, it does something weird to the parent (seems to mask it out). I haven't delved into the source code to figure out what's going on (the XRInteractionManager is completely undocumented right now, and there's so much stuff going on in there with the custom Phases etc that I'm trying to avoid going deep into it).

    However, my workaround that's doing fine so far: use a coroutine, wait a frame (this guarantees that Unity's code for OnSelectExit will have finished processing - that code (spread across many different XRIT classes) seems to be the culprit), then process the object / parent.

    You'll have to write a one line method to call the coroutine, and a two line coroutine - can't do this purely in editorGUI/inspector, I think.
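    Sketched out, that workaround looks something like this (method names are mine; it must live on a MonoBehaviour so it can start coroutines):

```csharp
// Hook this up to the interactable's OnSelectExit UnityEvent in the Inspector.
public void OnDropped(XRBaseInteractable grabbable)
{
    StartCoroutine(ProcessDropNextFrame(grabbable));
}

IEnumerator ProcessDropNextFrame(XRBaseInteractable grabbable)
{
    yield return null; // wait one frame, so XRIT's OnSelectExit processing has fully finished
    // ...now it's safe to touch grabbable.transform.parent, add new
    // XRGrabInteractables to the hierarchy, etc.
}
```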
     
  11. a436t4ataf

    Was this inside an OnSelectExit callback? (cf. my last two posts :) - I think there's a general bug here with the parent/object during the post-processing of OSE)
     
  12. a436t4ataf
    Feature request: the callbacks for OnSelectEnter/Exit, OnHoverEnter/Exit, should have a Collider argument that is "the actual Collider that was hit". This mirrors the standard MonoBehaviour callbacks relating to physics/hits/triggers.

    (because of the way Unity Physics/Havok is designed, a single RB will usually have multiple children, each with a separate Collider, and this is the *only* way to implement a lot of physics setups (anything else fails with Unity's physics; this is how it was designed) ... so it's often hugely important to know which child object the ray/hit happened with)

    For a concrete use-case: I have a robot with arms and legs. I want to pick it up, so it has an XRGrabInteractable. But I want it to behave differently if I pick it up by the head, arm, or leg -- and at the moment, XRIT is discarding that information before it sends me the On****Enter/Exit callbacks.

    UPDATE: there is a method that lets you fetch this data during OnSelectEnter, but it's easy to miss - if you cast your interactor ref to XRRayInteractor, it has a public method GetCurrentRaycastHit, which (during an OnSelectEnter callback) contains the raycast that was used to trigger the OnSelectEnter. You can then query the raycast hit to find the collider, and from there the actual GameObject.
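    A hedged sketch of that workaround, inside an XRGrabInteractable subclass (assuming the 0.9.x signature `bool GetCurrentRaycastHit(out RaycastHit)` - check your version):

```csharp
protected override void OnSelectEnter(XRBaseInteractor interactor)
{
    base.OnSelectEnter(interactor);

    if (interactor is XRRayInteractor rayInteractor
        && rayInteractor.GetCurrentRaycastHit(out RaycastHit hit))
    {
        GameObject grabbedPart = hit.collider.gameObject; // e.g. the head/arm/leg child
        // ...branch your behaviour on grabbedPart here
    }
}
```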
     
    Last edited: Mar 30, 2020
    harleydk likes this.
  13. a436t4ataf
    I also attempted this (to workaround the missing Collider information), but there seems to be another bug:

    Code (CSharp):
    var interactor = (grabbable as XRGrabInteractable).selectingInteractor;
    ...because inside OnSelectEnter, that "selectingInteractor" variable is always null :( :(.

    UPDATE: yep, definitely a bug. The code in XRInteractionManager does:

    1. Tell the Interactor to send all its callbacks
    2. Tell the Interactable to send all its callbacks

    ...but the interactable doesn't update its own data structures until AFTER it has sent out its callbacks. That means that all the Interactor callbacks see stale/incorrect information.

    I guess this should be a two-phase notification instead, something like:

    1. Tell the interactor "onSelectEnter is about to happen"
    2. Ditto for the interactable
    3. Step 1 as before
    4. Step 2 as before

    ... so that member variables can be set internally in 1/2, and be correct during 3/4

    EDIT: Case 1231662
     
    Last edited: Mar 29, 2020
  14. a436t4ataf
    ...and another one:

    You are currently invoking OnSelectExit for multiple incompatible reasons - with no indication of why you invoked it. We need to know the reason it was invoked (or: we need this split into two callbacks, so we get that context info).

    In particular, as documented/described, it seems to mean:

    "The user has let go of the object / moved the controller away"

    ...which is half true. But you also invoke it when:

    "The user has re-selected the object using a different controller, and we're automatically cancelling all existing selections"

    The problem here is that:

    1. "User released an object"
    2. "XRIT cancelled the user's grab"

    ...are fundamentally different, and in most cases require different code paths to be executed. Right now we cannot do that (although I'm trying to hack something into place by subclassing the XR classes, figuring out which method is processing the XR button presses, and then internally keeping track of "did I initiate this OnSelectExit, or is it something that XRInteractionManager did without telling me?").

    UPDATE: workaround for now (ugly but seems to work): Extend XRRayInteractor, override .isSelectActive, and store the value it returns. That value tells you whether you are currently "holding" something on that controller - even if Unity tries to take it away from you. When OnSelectExit is called ... if your stored value is true, then ignore Unity's attempt to cancel your grab (it's wrong! The user hasn't cancelled, Unity has), and execute your custom code as appropriate.
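    The workaround described above, sketched as a subclass (class/property names are mine; assumes isSelectActive is overridable in your preview version):

```csharp
public class TrackedRayInteractor : XRRayInteractor
{
    // True while the user is physically holding the select input on this controller.
    public bool userIsHolding { get; private set; }

    public override bool isSelectActive
    {
        get
        {
            userIsHolding = base.isSelectActive; // cache the real button state
            return userIsHolding;
        }
    }
}

// Then, in an OnSelectExit handler: if the interactor's userIsHolding is still
// true, the user didn't let go - XRIT cancelled the selection itself.
```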
     
    Last edited: Mar 29, 2020
  15. a436t4ataf
    If I understand it correctly: you currently map the Grip button to "select" and the Trigger button to "Activate".

    The "activate"/"select" idea works great in most cases - it seems to be a decent general design. But hardcoding that to buttons works poorly - it would be much better if these were toggleable (NB: you even break the convention yourselves for GUI, where you switch to using the Trigger as the primary interaction :) - which I agree with, it works better! - but this just illustrates how rapidly the hardcoding of buttons falls down as a design).

    The mapping of Grip=Select, Trigger=Activate works for a few simple games/apps - e.g. a simplistic shooting game, where you Grip to pick up a weapon and Trigger to fire it (I think this is the example you give in the docs).

    But it fails where those should be the other way around - e.g. a simplistic destruction game, where you want to Trigger to select which part of the building you're going to pull away, and then Grip to pull it away.

    For what it's worth ... some of my testers so strongly memorize "grip = select" while playing that they struggle to use the GUI, because they're trying to "grip" when they want to press buttons on the GUI. They're following the hardcoding of XRIT :), but I find it's much more natural to redefine the Grip/Trigger mapping on a case-by-case basis - and *most* of my testers agree.

    PS: I looked at subclassing XRRayInteractor and swapping Select vs Activate - but interactors don't have callbacks for Activate :(. So I looked at subclassing XRGrabInteractable, and re-routing all calls to OnSelectEnter to OnActivate instead, and vice-versa. But XRIT's subclassing makes that impossible too, since XRGrabInteractable calls base.OnSelectEnter during OnSelectEnter.

    It looks like there might be some way of achieving this using TrackedDeviceModel - that seems to be how XRRayInteractor is abstracting away its input handling - but I so far can't figure out how TDM and XRRI are connected together - in the editor, they seem unconnected, and I can't see how the XRRI is indicating what it expects.
     
    Last edited: Mar 30, 2020
  16. a436t4ataf
    Final one for today: XRGrabInteractable is marked as "requires RigidBody" - is this really necessary? It causes a lot of problems when using it; it would be much better if it didn't have this line.

    The main problem is: RigidBody in Unity is special (I can't remember if that's the exact word that Unity engineers used when they explained this to me years ago :)), and its presence in a transform hierarchy affects other items in the hierarchy (which is pretty much unique to RB and Colliders) -- so you need to carefully control when and where it appears, and often remove it (temporarily) to prevent the special-handling from messing up your hierarchy.

    For a concrete example: this one line in XRGrabInteractable makes it impossible for a single hierarchy to have more than one XRGrabInteractable - the line causes all the child transforms to turn into fully-fledged physics items (and start moving/falling/etc).

    It only appears to be used after the XRGrabInteractable has received OnSelectEnter, and stops being used when OnSelectExit is called. This suggests it doesn't need that line? It could be removed, making XRGrabInteractable work in a much wider set of situations.

    I would have expected instead that XRGrabInteractable requires a RigidBody to exist only while something is selected - this makes sense!

    Given that you already fire the OnSelectEnter callbacks *before* XRGrabInteractable starts accessing the RigidBody, it would be very easy for 3rd party code to selectively remove the RB and re-instate it during OnSelectEnter, just in time for XRGrabInteractable to need it.
     
    Last edited: Mar 30, 2020
  17. a436t4ataf
    A related problem/wish: make it so that we can change the XRGrabInteractable reference that is currently selected without anything else happening (i.e. swap grab1 for grab2). This would have to be something in the InteractionManager, I think.

    This would enable us to do things like:

    - right hand is holding something, left hand takes an action, effect of action: object in right hand is swapped for a different one (e.g. imagine that left hand is changing the weapon held by right-hand, using left hand as a browser, while right hand is in the middle of - e.g. - firing the gun, and doesn't want to lose it just because the grabbable got switched), without triggering selectexit/reenter (important since those both have a lot of side-effects that can't be ignored/disabled right now)

    - left hand can steal something from right hand (e.g. in Mario Kart-style games where you steal objects from other cars), again without triggering any of the side-effects of the right hand "dropping" one object and "picking up" another, which are undesirable but currently unavoidable.
     
  18. a436t4ataf
    Another wish item for extensibility:

    "Refactor XR Interaction Manager so that we can programmatically select and deselect things."

    This is a big win for letting us integrate XR IT with other systems / features / frameworks etc. I think this could be one of the main features of XR Interaction Manager: to provide an easy, safe, centralised place for code to programmatically trigger select/activate/cancel/etc.

    The methods are there, and I'm currently using a build where I'm invoking them via reflection and it seems to work OK with no negative side-effects (so far :)).

    e.g. this one:
    Code (CSharp):
    internal void ForceSelect(XRBaseInteractor interactor, XRBaseInteractable interactable)
    {
        SelectEnter(interactor, interactable);
    }
    (although for deselect I'm copying your example ;) and invoking SelectExit directly.)
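    For reference, a sketch of how the internal method can be called via reflection (fragile by nature - the name/signature may change in any preview release):

```csharp
using System.Reflection;

static void ForceSelect(XRInteractionManager manager,
                        XRBaseInteractor interactor, XRBaseInteractable interactable)
{
    MethodInfo forceSelect = typeof(XRInteractionManager).GetMethod(
        "ForceSelect", BindingFlags.Instance | BindingFlags.NonPublic);
    forceSelect.Invoke(manager, new object[] { interactor, interactable });
}
```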
     
  19. a436t4ataf
    (last two posts are linked: as a workaround, I've been able to fake the "moving XRGrabInteractable from GameObject to GameObject", by forcing XRIM to do a deselect, and on the very next frame creating a new GrabInteractable elsewhere in the hierarchy and forcing an artificial re-select of something else. In reality the object might get to drop through the air for a single physics-tick, or not (depends on your tick rates), but that's such a small time/distance that it's imperceptible in most cases)
     
  20. jiraphatK

    Joined:
    Sep 29, 2018
    Posts:
    25
    I don't know if this is a bug or not, but when I destroyed an ARSelectionInteractable, it caused the ARGestureInteractor to break and throw null reference errors.
     
  21. andybak

    Joined:
    Jan 14, 2017
    Posts:
    357
    Anyone else think that at this stage a preference for internal/private does more harm than good? It's a delicate balance but on the whole I find people tend to err too much on the side of closing things down. In a library meant for extension and reuse that shouldn't be your default stance.
     
    harleydk and a436t4ataf like this.
  22. a436t4ataf
    Destroying XRGrabInteractable works fine, and neatly triggers OnSelectExit if it was currently selected (*) - so I'm pretty sure the ARSelectionInteractable behaviour is a bug. Make a test case and submit it :)

    (*) - it's possible that I added that feature myself and forgot about it, but I'm pretty sure I did not - I'm pretty sure it worked this way out of the box (and it's what I'd consider the correct behaviour for the API)
     
  23. Bentoon

    Joined:
    Apr 26, 2013
    Posts:
    69
    I'm loving the XR Interaction Toolkit!

    Has anyone figured a way to do a "Pinch Zoom" type of navigation (Like in Tiltbrush / Quill etc) with it?
    For me this is the most basic and intuitive navigation and plays with scale...
    I have been searching the forums and dev sites...
    Let me know
    ~be
     
  24. yulaw2k

    Joined:
    Apr 25, 2014
    Posts:
    23
    Has anyone figured out how to force an item to be stuck to a user's hand when they touch it?
    I see XRRayInteractor has a "Starting Selected Interactable", but that uses a private method which calls ForceSelect on the InteractionManager.

    -Why does this whole system feel so unintuitive? With SteamVR, Oculus, VRTK, or Microsoft's interaction systems, I can hop in, glance at the code for 10 minutes and be like: oh, sweet. But this system just seems confusing - I might actually have to read the entire documentation and still be confused.
     
  25. a436t4ataf
    You mean .. directly, programmatically? (if so: that's what I was talking about in the last few posts).

    Or ... just a way to achieve it in general? (if so: the API has been designed so that you don't use code, you use UnityAction's in the Editor and manually hook everything up through GUI. This works great, unless you need to trigger it in the middle of a script, like I did recently :))

    Up front, of course, let's be clear: this system hasn't been released yet - we're using a pre-release preview! If you choose to do this, you're committing to using a system that hasn't been fully written by its authors yet, and to embracing that confusion, uncertainty, bugs, missing (core!) features, etc. I'm happy doing that because I want to help shape the final release, and because I'm confident enough to work around / hack / add temporary fixes in my own code for now. All of those hacks I want/expect to delete when XRIT goes live as a "public" release.

    On the wider issue of why it is so confusing right now ... my guess: the team really wanted to work with the design of Unity's current/new Input System, which in turn is heavily based on the design of UnityUI (UnityAction etc came with UnityUI / or "New GUI" as it was originally known), and that was a design/architecture that was never fully designed/implemented (*). If you love UnityUI's action system, and have learnt how to make it work from code (which is really hard!), then you're already fine with XRIT.

    There's good reasons for the team going that route, but there's also an exceptionally good reason: Unity is currently/recently overhauling their whole input layer for non-XR stuff (which previously hadn't been updated in something like 10 years? And had a lot of very old code and was missing lots of essential features). XR is a prime example of a new kind of input that will heavily test/verify whether the new input systems "work" as an architecture/approach. So ... the XR team wants to do a good job of implementing XR in a way that will make sense to people coming from the new Input System in non-XR apps.

    So ... what we have is a system (XRIT) that is extremely elegant and easy to use *if you're using it in the narrow way that UnityUI allows*, but works poorly for everyone else - and this is more a side-effect of UnityUI/Input System, rather than a problem with XRIT.

    A recurring theme of my posts in this thread has been: yes, fine, Input System, whatever - but in reality Unity is a commercial game engine, most real games have large programming teams (a large percentage of the team), and they need to focus on code-based features, because that's where most of the effort and innovation in game development happens. So: we also need first-class access in code / code APIs. The response from the team so far seems to be "yes, we would like that too", and my guess is that it is much easier for them to make a usable API than it is for them to integrate their code with a new architecture that no-one has used before because it's brand new - so they started with the harder part (compatibility with Input System), knowing they could easily add a good API (the easy part) on top later.

    There were some bad design choices too (using "internal" is basically always wrong, especially when developing APIs or libraries for public use, but it's a feature that Microsoft has failed to handle well, and it's taking time for the C# community to wake up to what a huge mistake it was ... so that's not unique to Unity :)). But ... well ... it's a preview release! So the right decision is to put out code today that maybe is poor in some ways (so long as it basically works), so that people can review and respond, rather than waiting until you've carefully written everything correctly. So I'm happy they did this (rather than taking longer to publish it).

    There isn't any (yet). This is a preview release :).
     
    gjf likes this.
  26. yulaw2k
    What a great response. Thank you. I haven't read through this forum since it was on like page 2 or 3 until right now, but yes, I see you are doing something I'll have to do. I just thought I must have had the wrong idea, since everything I wanted to access was private or internal.
     
  27. a436t4ataf
    Also check out my (unofficial!) FAQ (link in sig below) - I try to add everything I find/see that sounds FAQ-like as I go along, especially things that don't affect me now but I suspect will bite me in future, when I'll have forgotten the solution by the time they happen :) - but feel free to DM me other FAQs you feel should be there.
     
    gjf likes this.
  28. a436t4ataf
    LMAO - I clicked the link to check it was working, and immediately noticed a FAQ that is a neater answer to a problem I solved the hard way (through reflection!) this weekend. #facepalm. I need to read my own resource more often :)
     
  29. a436t4ataf
    Another bug, but I accept it might be marked "WONTFIX" so I haven't bothered reporting it (I've had no response to bug reports so far, so I'm not sure if they're being read).

    It turns out that if you set your Interactor to trigger on "STATE" not "STATECHANGE", then the Interaction Manager *does not query* the active state of the Interactor when deciding whether to select or not.

    i.e. the InteractionManager (or the XRBaseControllerInteractor? not sure) is apparently caching internally what it "guesses" the state of the Interactor to be. Either way: since isSelectActive is public and virtual, I would expect the interactor/manager to keep reading it each time it needs the value.

    (discovered this because I overrode the property "XRRayInteractor.isSelectActive", and was rather confused when the interaction manager started selecting things *without calling this method* - this happened because I'd switched the interactor's trigger from STATECHANGE, in an attempt to work around issues/bugs in XRBaseControllerInteractor's implementation of detecting the "CHANGE" part of statechange - it doesn't seem to quite work properly, although you only discover this if you force a select)

    So ... maybe two bugs there :).
     
  30. a436t4ataf
    This one is particularly annoying:

    Code (CSharp):
    // XRController.cs:

    /// <summary>Gets the current select interaction state.</summary>
    internal InteractionState selectInteractionState { get { return m_SelectInteractionState; } }
    Making this "internal" is weird. Your own code heavily uses that property for basic functionality - and yet you block everyone else from using it :(.

    I suggest: replace this class with an interface, and provide your own "basic" implementation that is easily swappable. Because right now it pretends to be an abstraction of the controller - but it fails. You've made it non-extensible (your own classes are calling private methods instead of the public ones! ARGH!), and so it can't be subclassed and can't be replaced.
     
    Last edited: Apr 1, 2020
    yulaw2k likes this.
  31. a436t4ataf
    Anyone having problems with grabbables stopping working? This may be why...

    Bug in XRInteractionManager: XRInteractables permanently store an internal list of which colliders they own. Separately, XRInteractionManager internally stores a list of which collider is owned by which interactable; there is no way of updating that list.

    You can "unregister" an interactable with the manager - but this has a bug: if any of the colliders are now owned by something else (and bear in mind: interactables automatically register themselves, so merely adding a GrabInteractable somewhere in your hierarchy can cause this to happen), then the interaction manager removes them from its internal list without checking whether the "unregistering" interactable actually still owned them.

    (and does it check if the collider even exists on that object? No, this is another bug: it uses that permanent, unchanging list on the interactable).

    I believe there should be methods:
    XRInteractable.refreshColliders() = updates the internal list
    XRInteractionManager.refreshRegistration( XRInteractable ) = updates the internal colliders-mapping; does not use the stale data and does not remove colliders from the wrong object (the current bug described above), but instead uses the .refreshColliders method to get an up-to-date list, and checks the reverse-mapping for any orphaned colliders.

    I've added both those methods to my local copy of XRIT and they work fine. Is there anywhere we can submit Pull Requests of fixes for XRIT? I don't think so.
     
  32. a436t4ataf
    Another crash bug:

    If you destroy the grabbable in the OnSelectExit callback (a pretty common thing to want to do), then XRGrabInteractable crashes, because it isn't checking whether the RigidBody still exists before trying to "restore" the settings that it probably didn't even change in the first place.

    Firstly: it should be checking that the RB still exists and is valid.
    Secondly: it possibly shouldn't be "restoring" values that it never changed (because if they've been changed in the meantime, it will now (incorrectly) overwrite them with corrupt data).

    Bug: 1232882
     
    Last edited: Apr 1, 2020
  33. a436t4ataf
    So ... finally got this working, purely cross-platform with XRIT:

    1. Pick up objects, merge/snap them into bigger objects
    2. But also: break them apart by grabbing with two hands and pulling away
    3. ...while still being able to snap together, snap to other stuff, etc.

    There's a few hacks in here, and I've given up trying to do it cleanly, so now I've started directly editing the XRIT source files :(, at least until they push an update that removes all the internal/private flags - but it works! :)

     
    Matt_D_work and gjf like this.
  34. HaMMeRSI

    Joined:
    Jan 12, 2020
    Posts:
    1
    Is there a correct way to change XRRayInteractor behaviour?
    I mean changing the line type: currently (0.9.3), if I change it programmatically from straight to projectile, for example, it throws a runtime error in XRRayInteractorVisual line 385.

    To work around it, I use multiple controller gameObjects with different configs and activate one when necessary, which seems strange to me.
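
    The workaround described above can be sketched roughly like this (component and field names are just illustrative):

    ```csharp
    using UnityEngine;

    // One pre-configured XRRayInteractor GameObject per line type,
    // toggled on/off instead of changing lineType at runtime.
    public class RayModeSwitcher : MonoBehaviour
    {
        public GameObject straightRayController;    // Line Type = Straight Line
        public GameObject projectileRayController;  // Line Type = Projectile Curve

        public void SetProjectileMode(bool projectile)
        {
            straightRayController.SetActive(!projectile);
            projectileRayController.SetActive(projectile);
        }
    }
    ```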
     
  35. harleydk

    harleydk

    Joined:
    Jul 30, 2012
    Posts:
    10
    For what it's worth, here's how I do it: I create a grab transform when I instantiate the XR grabbable object ('piece' in the code below):

    Code (CSharp):

        var grabTransform = new GameObject();
        grabTransform.transform.SetParent(piece.transform);
        grabTransform.transform.localPosition = Vector3.zero;
        grabTransform.transform.localRotation = Quaternion.Euler(0, 0, 180); // rotate to avoid annoying rotation issue

        XRGrabInteractable grab = piece.GetComponent<XRGrabInteractable>();
        grab.attachTransform = grabTransform.transform;
     
    Last edited: Apr 6, 2020
  36. harleydk

    harleydk

    Joined:
    Jul 30, 2012
    Posts:
    10
    a436t4ataf likes this.
  37. nasos_333

    nasos_333

    Joined:
    Feb 13, 2013
    Posts:
    8,347
    Can I use this to build for Android?

    If not, what other easy way is there to test VR without an actual device?

    Finally, I don't see this package in Unity 2019.3.7f1.

    Thanks
     
  38. a436t4ataf

    a436t4ataf

    Joined:
    May 19, 2013
    Posts:
    830
    Question number 3 in the FAQ (below) - you've enabled Preview packages, right?
     
    nasos_333 likes this.
  39. nasos_333

    nasos_333

    Joined:
    Feb 13, 2013
    Posts:
    8,347
    Oh, many thanks, I guess that was the issue :)
     
  40. yulaw2k

    yulaw2k

    Joined:
    Apr 25, 2014
    Posts:
    23
    I was so excited to see the first update in 2-3 months. :(:(:(
    upload_2020-4-8_13-1-33.png
     
    jiraphatK likes this.
  41. Matt_D_work

    Matt_D_work

    Unity Technologies

    Joined:
    Nov 30, 2016
    Posts:
    135
    We've been busy with a few other things, but more XRI releases are coming soon :)
     
  42. a436t4ataf

    a436t4ataf

    Joined:
    May 19, 2013
    Posts:
    830
    Over the past few days, a lot of the bugs I've logged on XRIT have suddenly been accepted/confirmed - so I guessed this was an indication you've caught up with other things and are now starting to work through the March reports :).
     
  43. yulaw2k

    yulaw2k

    Joined:
    Apr 25, 2014
    Posts:
    23
    Where can I view these things?
     
  44. a436t4ataf

    a436t4ataf

    Joined:
    May 19, 2013
    Posts:
    830
    Confirmed bugs should all appear here: https://issuetracker.unity3d.com/ - although confirmation is taking about 4 weeks at the moment (there's always some delay before Unity's QA has finished verifying/reproducing new bugs).

    Unity never really implemented a search feature for this (it's been live for more than 5 years), so finding specific bugs, or bugs on specific components, is hit-and-miss (it's a pity - if they'd used something off the shelf we'd have had this 5 years ago :)) - but my bugs are being renamed by the QA team to start with the text "(XR INTERACTION TOOLKIT)", which might be enough for you to find a lot of them.

    (NB: there is a huge "SEARCH" box at the top of the page. It has never worked. When I put in "xr interaction toolkit" (with quotes, to do an exact match) the first hit is unrelated to XRIT. It's sad, but it's all we've got.)
     
  45. a436t4ataf

    a436t4ataf

    Joined:
    May 19, 2013
    Posts:
    830
    PS: if anyone wants to DM me their bug URLs from the issue tracker, I can copy/paste them onto the end of the FAQ list. It would be a much more effective way of tracking them. But unless people send the URLs, I'm not going to go through the IssueTracker's search function trying to find them myself. (You get two URLs - one to FogBugz, which you should keep private, and a second one to the IssueTracker once your bug is "confirmed", which you should share.)
     
  46. feniks270392

    feniks270392

    Joined:
    Oct 11, 2019
    Posts:
    4
    Hello, everyone! I have a very stupid question. I need to interact in VR with standard canvas elements, such as sliders, dropdowns, etc. I installed the XR Toolkit and XR Manager, and created a simple scene with a cube and a canvas with a dropdown. I can interact with the cube in the scene, but I can't interact with the dropdown. What do I need to do to interact with canvas elements? dropdownInspector.png canvas inspector.png
     
  47. a436t4ataf

    a436t4ataf

    Joined:
    May 19, 2013
    Posts:
    830
  48. feniks270392

    feniks270392

    Joined:
    Oct 11, 2019
    Posts:
    4
    a436t4ataf likes this.
  49. gibbrish123

    gibbrish123

    Joined:
    Feb 29, 2020
    Posts:
    8
    Hi... I need some help with setting up the AR examples for the Interaction Toolkit. I am trying to test the SwitchPlacementPrefab functionality. How do I do this?

    There's a script provided in the project - SwitchPlacementPrefab - which references these prefabs:
    Chair * Table * Kitchen Table 1 * Kitchen Table 2 * TV Table

    My questions:
    1. What do I attach/add the SwitchPlacementPrefab script to?
    2. Where are the prefabs that the script mentions (Chair, Table, etc.)?
    3. How does this get triggered? When can a user switch prefabs?
     
    Last edited: Apr 9, 2020
  50. freso

    freso

    Joined:
    Mar 19, 2013
    Posts:
    13
    I got excited. It looks pretty good. But, after a few hours trying it out, it's lacking.
    I strongly recommend following this style in all events: (source, object). Please, this will make the events 1000% more powerful. The only reason to skip the source would be if it is a Singleton-manager.

    Example:
    XRSocketInteractor has this event: OnSelectEnter(XRBaseInteractable)
    Change to: OnSelectEnter(XRBaseInteractor, XRBaseInteractable)
    When I attach the event to a manager, I want to know who sent it, and what it sent.
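
    Roughly what I mean, sketched out (this is the suggested shape, not the shipped 0.9.x API, which only passes the interactable):

    ```csharp
    using UnityEngine;
    using UnityEngine.Events;
    using UnityEngine.XR.Interaction.Toolkit;

    // Suggested (source, object) event signature.
    [System.Serializable]
    public class SelectEnterEvent : UnityEvent<XRBaseInteractor, XRBaseInteractable> { }

    // A central manager can then tell both who fired and what was selected:
    public class SocketManager : MonoBehaviour
    {
        public void OnSelectEnter(XRBaseInteractor source, XRBaseInteractable selected)
        {
            Debug.Log($"{source.name} selected {selected.name}");
        }
    }
    ```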

    XRSocketInteractor is also confusing. Looking in the inspector, it is not clear how it works. I've gathered from the source that it uses OnTriggerEnter, but does not require a collider. (And it handles scaled objects badly.)

    Requiring a Rigidbody should not be necessary on XRSimpleInteractable.

    Also, please remove all "Gets or sets ..." from the comments/documentation of variables. It's not helping, just cluttering.
     
    jiraphatK likes this.
unityunity