
XR Interaction Toolkit

Discussion in 'AR/VR (XR) Discussion' started by Sek19, Jul 29, 2020.

  1. Sek19

    Sek19

    Joined:
    Dec 12, 2014
    Posts:
    2
I'm a bit lost as to how to work with the XR Interaction Toolkit.
    I watched the tutorial series https://www.youtube.com/playlist?list=PLQMQNmwN3Fvx2d7uNxMkVOs1aUV-vxrlf

I get how to apply these components to objects and get functionality out of the base system. What I'm not really understanding is how they actually communicate with each other to produce that functionality.

For example, XRGrabInteractable has an override void ProcessInteractable, but how does this actually get called by something like the XRController?

    What I'm looking to do is use the XRGrabInteractable for AR and grab objects like you would with a standard VR setup. From what I understand, this class is really meant to be used for XR with controllers so I'm looking to convert it for touch interactions.

This question stems from two areas.

    First, how are these classes interacting with one another? Here's another example (specifically for AR).

    Looking in the ARSelectionInteractable class we have:

Code (CSharp):
        /// <summary>This method is called by the interaction manager
        /// when the interactor first initiates selection of an interactable.</summary>
        /// <param name="interactor">Interactor that is initiating the selection.</param>
        protected internal override void OnSelectEnter(XRBaseInteractor interactor)
        {
            base.OnSelectEnter(interactor);

            if (m_SelectionVisualization != null)
                m_SelectionVisualization.SetActive(true);
        }
From what I gather, the ARGestureInteractor class (in the UnityEngine.XR.Interaction.Toolkit.AR namespace) is firing off an event of some kind to signify that a touch interaction has been made. An example of that code follows:

Code (CSharp):
        /// <summary>
        /// Gets the Tap gesture recognizer.
        /// </summary>
        public TapGestureRecognizer TapGestureRecognizer
        {
            get
            {
                return m_TapGestureRecognizer;
            }
        }
The second part of this question comes from me not really understanding certain areas of programming. Are there any Unity tutorials, or tutorials in general, that people would recommend to help me understand this area? I find myself getting lost when working with most of the XR Interaction Toolkit. I think most of my confusion comes from this event-style system, getters and setters, and how you're supposed to override specific classes to create extensions of them. I'm not really sure where my real gaps are, since I don't know what I don't know, but the XR Interaction Toolkit definitely hits the sweet spot of the areas I've been struggling with.
     
  2. jackpr

    jackpr

    Unity Technologies

    Joined:
    Jan 18, 2018
    Posts:
    35
    Hi there, I'll do my best to fill in the gaps for you.

    > For example, the XRGrabInteractable has an override void ProcessInteractable, but how is this actually getting called to action from something like the XRController?

You may have noticed that when you add XR Interaction Toolkit scripts to a scene, you also get an XRInteractionManager. Interactions are handled by the XRInteractionManager: it is the centralized place from which all ProcessInteractor() and ProcessInteractable() functions are called. All interactors and interactables register with their XRInteractionManager. This registration is handled in the base classes XRBaseInteractable and XRBaseInteractor, from which all of our examples inherit.

In XRInteractionManager, first all of the ProcessInteractor functions are run; this is where interactors look for targets and try to activate them. Second, the ProcessInteractable functions are run; this is where interactables react to being activated.
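    As a rough sketch of that two-phase order (this is illustrative only, not the actual package source, and the real field and method details differ):

    Code (CSharp):
        // Simplified sketch of what the XRInteractionManager does each frame.
        void Update()
        {
            // Phase 1: interactors look for valid targets and request
            // hover/select state changes through the manager.
            foreach (var interactor in m_Interactors)
                interactor.ProcessInteractor();

            // Phase 2: interactables react to the state they are now in
            // (e.g. XRGrabInteractable moves itself toward its interactor).
            foreach (var interactable in m_Interactables)
                interactable.ProcessInteractable();
        }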

To understand how the pieces fit together, look at XRInteractionManager and at the base classes for the example interactors and interactables. If "base class" and "inheritance" are new concepts for you, try looking for tutorials on those concepts or on object-oriented programming in general.
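    To make the inheritance part concrete, here is a hypothetical minimal interactable, using the same OnSelectEnter signature quoted in your ARSelectionInteractable snippet (the class and field names here are made up for illustration). You never call these methods yourself; because XRBaseInteractable registers itself with the manager, the manager invokes your overrides at the right time:

    Code (CSharp):
        using UnityEngine;
        using UnityEngine.XR.Interaction.Toolkit;

        // Hypothetical example: swaps a material while this object is selected.
        public class HighlightOnSelect : XRBaseInteractable
        {
            [SerializeField] Material m_HighlightMaterial;
            Material m_OriginalMaterial;
            MeshRenderer m_Renderer;

            void Start()
            {
                m_Renderer = GetComponent<MeshRenderer>();
                m_OriginalMaterial = m_Renderer.material;
            }

            // Called by the interaction manager when an interactor selects us.
            protected internal override void OnSelectEnter(XRBaseInteractor interactor)
            {
                base.OnSelectEnter(interactor); // keep the base bookkeeping
                m_Renderer.material = m_HighlightMaterial;
            }

            // Called by the interaction manager when the selection ends.
            protected internal override void OnSelectExit(XRBaseInteractor interactor)
            {
                base.OnSelectExit(interactor);
                m_Renderer.material = m_OriginalMaterial;
            }
        }

    The TapGestureRecognizer property you quoted is just plain C#: a getter exposing a private field. The "override" pattern above is the same one ARSelectionInteractable uses, so extending the toolkit mostly means subclassing a base class and overriding the hooks the manager calls.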
     