
Official XR Interaction Toolkit Preview Release (0.9)

Discussion in 'XR Interaction Toolkit and Input' started by mfuad, Dec 17, 2019.

Thread Status:
Not open for further replies.
  1. nrvllrgrs

    nrvllrgrs

    Joined:
    Jan 12, 2010
    Posts:
    62
    Three quick questions:
    1. Is there a way for an XRGrabInteractable to attach via a joint?
2. Is there a way to force an XRBaseInteractor to select an XRGrabInteractable (i.e. is there a way to force a controller to grab an interactable object)?
    3. When dropping an object onto a socket, is there a way for the object to ease into position / rotation instead of an instantaneous snap?
     
    ROBYER1 likes this.
  2. nrvllrgrs

    nrvllrgrs

    Joined:
    Jan 12, 2010
    Posts:
    62
    One more:
1. Is there a community git repository of common extensions to XRI? If so, can you please share the link?
     
  3. a436t4ataf

    a436t4ataf

    Joined:
    May 19, 2013
    Posts:
    1,932
    Joints and grabbing are very different concepts, so you might need to give more detail here. If you mean: "can you pick up and hold an interactable using physics rather than by changing its position?" then: http://snapandplug.com/xr-input-too...it-using-parenting?-Attach-it-using-velocity?

    Again ... what do you mean here? There's multiple ways to do that, with different purposes, in different situations.

    The easiest one, to make a new Interactor start (in Awake) with something already grabbed: http://snapandplug.com/xr-input-too...rabbable-already-held-by-the-controller/hand?

    For the harder ones, e.g. programmatically grabbing during a Scene, there's a chain of 4 or so posts on the previous page detailing my attempts to get this to work. I finally settled on a slightly convoluted and hacky approach using some reflection and careful management of the XR classes' internal data to make sure I don't corrupt anything. I expect it will break in the next update :) so I don't recommend it, but it lets me do testing/development for now.
     
  4. vikash_ra1

    vikash_ra1

    Joined:
    Aug 3, 2019
    Posts:
    2
    Likely this has nothing to do with the XR toolkit, but it causes the grab interaction to not work.
    With the sample asset, I am able to grab the larger bucket but not the smaller one, even though the settings on both of them are similar. The only difference I noted was that the mesh of the smaller bucket showed as a combined mesh at runtime, versus the expected mesh for the larger bucket.
     

    Attached Files:

  5. nrvllrgrs

    nrvllrgrs

    Joined:
    Jan 12, 2010
    Posts:
    62
    I'm migrating from VRTK 3.3 to XRI -- so I'll link to some documentation to explain my needs/desires for XRI.

    VRTK 3.3 has GrabAttachMechanics, which let you specify how an interactable object attaches to the controller. Common ones are VRTK_ChildOfControllerGrabAttach, VRTK_FixedJointGrabAttach, and VRTK_CustomJointGrabAttach. XRI has a ChildOfController equivalent, but I would like a Joint equivalent so that I can attach the hand model to the interactable object and have it collide with the environment.

    VRTK 3.3 has a VRTK_InteractGrab component, which is approximately equal to an XRInteractor component. VRTK_InteractGrab has an AttemptGrab method, which can be used to try to grab the currently touched interactable object. In order to force a grab, I would also have to be able to force a Hover.

    Ultimately, what I am trying to make is a RedirectorInteractable. When the player attempts to grab a redirector (assuming that redirector is referencing a valid GrabInteractable), it grabs the referenced GrabInteractable instead. This would be useful for holsters where weapons can be "summoned" even if the weapon was dropped elsewhere in the environment.

    Another feature VRTK 3.3 has that I have not yet discovered in XRI is VRTK_HeadsetCollisionFade. This just fades the screen when the head collides with geometry. Is there an equivalent?
     
  6. a436t4ataf

    a436t4ataf

    Joined:
    May 19, 2013
    Posts:
    1,932
    I think you'll hit the same hard wall I ran into on the previous page: XRGrabInteractable is required to always be 1:1 with the thing grabbed, because Unity's design of XRIT means that the InteractionManager and other classes are hardcoded with this assumption. (I believe that's the wrong architecture/design decision - I've already commented above that IMHO it's common for games to break that assumption. But it may be that I just haven't figured out a good enough workaround yet.)

    I think you could get away with it by doing:

    1. Spawn an invisible object on the holster
    2. Put a Grabbable on it
    3. When that Grabbable is grabbed, in the callback: teleport the gun into being a child of the invisible grabbable
    4. When that Grabbable is released, in the callback: de-parent the gun, and delete the invisible object
    5. ...and respawn a new invisible object at the holster

    NB: because of some bugs in XRIT (logged above), you probably don't want to move the object from step 1 back into place at step 5. Instead you are much safer to create a new one and delete the old one.

    (I'm wondering if I can adapt this approach for my use-case too ... not convinced, but it might be possible)
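
    Untested, but steps 1-5 would look roughly like this (onSelectEnter/onSelectExit are the 0.9-preview event names and may differ in your version; HolsterProxy, gun and holster are illustration, not XRI API):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR.Interaction.Toolkit;

    // Invisible proxy grabbable that sits on the holster and "redirects" to the real gun.
    public class HolsterProxy : MonoBehaviour
    {
        public Transform gun;       // the real weapon, wherever it is in the scene
        public Transform holster;   // where fresh proxies respawn (step 5)

        void Awake()
        {
            var proxy = GetComponent<XRGrabInteractable>();
            proxy.onSelectEnter.AddListener(OnProxyGrabbed);
            proxy.onSelectExit.AddListener(OnProxyReleased);
        }

        void OnProxyGrabbed(XRBaseInteractor interactor)
        {
            // Step 3: teleport the gun onto the grabbed (invisible) proxy.
            gun.SetParent(transform, worldPositionStays: false);
            gun.localPosition = Vector3.zero;
            gun.localRotation = Quaternion.identity;
        }

        void OnProxyReleased(XRBaseInteractor interactor)
        {
            // Steps 4-5: de-parent the gun, respawn a fresh proxy, delete this one.
            // (May need to be deferred a frame if XRIT objects to the Destroy here.)
            gun.SetParent(null);
            Instantiate(gameObject, holster.position, holster.rotation);
            Destroy(gameObject);
        }
    }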
     
  7. jiraphatK

    jiraphatK

    Joined:
    Sep 29, 2018
    Posts:
    298
    I hacked head fade by creating a reversed-normal sphere around the camera with an unlit always-on-top shader, and fading its alpha based on the distance between the collision bounds and the sphere.
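
    Roughly, the fade driver looks like this (a sketch - the inverted-normal sphere mesh and its unlit transparent material are assumed to already exist, and the mask/distances are arbitrary):

    Code (CSharp):
    using UnityEngine;

    // Fades an inverted sphere parented to the camera, based on how close geometry is.
    public class HeadCollisionFade : MonoBehaviour
    {
        public Renderer fadeSphere;        // inverted-normal sphere around the camera
        public LayerMask environmentMask;  // geometry that should trigger the fade
        public float fadeStart = 0.3f;     // distance where fading begins
        public float fadeEnd = 0.05f;      // distance where the fade is fully opaque

        void LateUpdate()
        {
            float nearest = fadeStart;
            foreach (var col in Physics.OverlapSphere(transform.position, fadeStart, environmentMask))
            {
                Vector3 closest = col.ClosestPoint(transform.position);
                nearest = Mathf.Min(nearest, Vector3.Distance(closest, transform.position));
            }

            // Alpha 0 while clear, 1 when the head is inside geometry.
            Color c = fadeSphere.material.color;
            c.a = 1f - Mathf.InverseLerp(fadeEnd, fadeStart, nearest);
            fadeSphere.material.color = c;
        }
    }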
     
  8. jiraphatK

    jiraphatK

    Joined:
    Sep 29, 2018
    Posts:
    298
    Is there any way to get the nearest object on XRRayInteractor?
    I know there's a GetValidTargets() method, but it seems redundant to call it outside of the class itself, since the RayInteractor has already stored the value in m_CurrentNearestObject. I just want to access that property instead of calculating the nearest object again via GetValidTargets.
    I need the nearest object for outlining objects correctly and other generic gameplay behaviour.
     
    a436t4ataf likes this.
  9. a436t4ataf

    a436t4ataf

    Joined:
    May 19, 2013
    Posts:
    1,932
    Lots of things are currently marked private/internal which shouldn't be. The devs have already responded and said they plan to change this in a future update. So ... almost certainly the answer is "not yet". Until then: modify your local copy of the source, or use reflection (which will probably carry on working even after an update).

    (although it's also worth documenting the things that you think should be public, and giving use-cases, in case the Unity guys assume something isn't needed and keep it private :))
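
    For instance, a reflection sketch for the private field mentioned above (the field name m_CurrentNearestObject comes from the post/screenshot and could be renamed in any update):

    Code (CSharp):
    using System.Reflection;
    using UnityEngine.XR.Interaction.Toolkit;

    public static class XRRayInteractorExtensions
    {
        static readonly FieldInfo s_NearestField = typeof(XRRayInteractor).GetField(
            "m_CurrentNearestObject", BindingFlags.NonPublic | BindingFlags.Instance);

        // Returns whatever the interactor cached as its nearest object, or null if
        // the field was renamed/removed in your XRI version.
        public static XRBaseInteractable GetCurrentNearestObject(this XRRayInteractor interactor)
        {
            return s_NearestField?.GetValue(interactor) as XRBaseInteractable;
        }
    }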
     
    harleydk likes this.
  10. jiraphatK

    jiraphatK

    Joined:
    Sep 29, 2018
    Posts:
    298
    Any idea why the code keeps reverting to its old state every time I reopen my project?
    As for my specific use case: since Unity uses RaycastNonAlloc in the RayInteractor, which passes through everything to check for interactables, every interactable hit by the raycast receives the hover callback. The problem is that I use the hover state to trigger a highlight function, so a lot of interactables get highlighted at once. This also happens with the direct interactor. Say I want to make an object float while hovered, to indicate it is the currently focused object - it turns out every object hit by the RayInteractor, or inside the direct interactor's sphere trigger, floats. How will the user know which one will be selected when all of them are floating?
    I can work around this, but it should not be the default behaviour IMHO.
     
  11. EddieChristian

    EddieChristian

    Joined:
    Oct 9, 2012
    Posts:
    725
    How can I easily tell which hand is holding an object, so I can add offsets to held objects based on which hand is holding them?
     
  12. Polff

    Polff

    Joined:
    May 18, 2017
    Posts:
    30
    Hey, as already stated, the jitter is due to the frames (and with them the controller positions) updating more often than physics. You can reduce the jitter by decreasing the Fixed Timestep in Project Settings so physics updates more often, but be aware that this increases CPU load.
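
    The same change from script, for reference (90 Hz here is just an example matching a 90 Hz headset, not a recommendation):

    Code (CSharp):
    using UnityEngine;

    public class PhysicsRateBooster : MonoBehaviour
    {
        void Awake()
        {
            // Smaller fixed timestep = smoother physics-tracked objects, more CPU.
            Time.fixedDeltaTime = 1f / 90f;
        }
    }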
     
  13. a436t4ataf

    a436t4ataf

    Joined:
    May 19, 2013
    Posts:
    1,932
    The default setup: each controller ("hand") has its own separate XRInteractor instance. All callbacks can access the current interactor (IMHO it should be included in the args for each one, but for some of them you have to fetch it indirectly by asking the grabbable for its "selectingInteractor" (IIRC that's it, but the field may be named slightly differently?)).

    Then do a reference compare on whether you've received the one that's attached to the left GameObject or the right one.

    (Or, if you didn't change the default names of the hand/controller GOs, just look at "interactor.name", which will be the GO name in the Hierarchy and will start with "Left" or "Right" in the default setup.)
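
    A sketch of both approaches (onSelectEnter is the 0.9-preview event name; leftHandInteractor is a reference you'd wire up in the Inspector yourself):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR.Interaction.Toolkit;

    public class HandednessLogger : MonoBehaviour
    {
        public XRBaseInteractor leftHandInteractor; // drag the left controller's interactor here

        void Awake()
        {
            GetComponent<XRGrabInteractable>().onSelectEnter.AddListener(interactor =>
            {
                // Reference-compare against the known left-hand interactor...
                bool isLeft = interactor == leftHandInteractor;
                // ...or, with the default rig names, fall back to the GameObject name.
                bool isLeftByName = interactor.name.StartsWith("Left");
                Debug.Log("Grabbed by " + ((isLeft || isLeftByName) ? "left" : "right") + " hand");
            });
        }
    }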
     
    EddieChristian likes this.
  14. EddieChristian

    EddieChristian

    Joined:
    Oct 9, 2012
    Posts:
    725
    Why are objects in a different position if I use Movement Type: Instantaneous in the XRGrabInteractable script? They sit way below the Attach Point, but if I use the other modes, they correctly move with the Attach Point.

    Does Instantaneous use the mesh's pivot point? Should it be centered?
     
    Last edited: Apr 13, 2020
  15. mightnmagic

    mightnmagic

    Joined:
    Apr 8, 2020
    Posts:
    2
    Hello Unity experts!

    Using the new XR Interaction Toolkit, how would you go about doing, say, magic sprinkles from an XR controller once the user presses a trigger? I.e. I do not want to "grab" or "select" anything; I just want to find out if the user pulls the trigger, and spray magic particles until they let go of it. I thought about making one giant GameObject representing the particles that would be an interactable and listening to onActivate coming from a grab interaction component, but with that approach I could never have real interactables in the game, since the giant one would always win? (Unless I can sort their priority and give the giant one the lowest priority, but this approach just feels like a giant hack...)

    Or is the XR Interaction Toolkit just too high-level for what I'm trying to do, and should I do this using the XR Input system?

    If so, then this leads me to another question: I've yet to manage to get Unity to respond *at all* to any button or trigger clicking on my VR controllers. I am using a Rift S with Unity 2019.3, and I also tried the Unity 2020.1 beta - in both cases I could never get even a trigger value to be nonzero. So I am also wondering if there are well-known issues working with Oculus and Unity XR inputs.

    Using the XR Interaction Toolkit, the XR controller inputs do work, so I guess there's no fundamental problem with my hardware or the Unity version.

    Many thanks, and apologies if my questions have been asked elsewhere - I did take a look but couldn't find any similar threads. I will admit I am new to Unity, so perhaps I just didn't know where to look.
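
    For reference, the plain XR input route I'm considering would look something like this (a rough sketch - the particles field is just a placeholder ParticleSystem, and the 0.1 threshold is arbitrary):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR;

    // Sprays particles while the right-hand trigger is held; no interactable involved.
    public class MagicSprinkles : MonoBehaviour
    {
        public ParticleSystem particles;
        public float pressThreshold = 0.1f;

        void Update()
        {
            InputDevice hand = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);
            if (hand.isValid && hand.TryGetFeatureValue(CommonUsages.trigger, out float value))
            {
                bool pressed = value > pressThreshold;
                if (pressed && !particles.isEmitting) particles.Play();
                else if (!pressed && particles.isEmitting) particles.Stop();
            }
        }
    }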
     
  16. mightnmagic

    mightnmagic

    Joined:
    Apr 8, 2020
    Posts:
    2
    More info on my journey. Oculus support just seems broken. I am using 2019.3; I have copied and pasted exactly the script from the docs, attached it to a game object, and added Debug.Log messages to report if a primary button is pressed, but I never get any positives.

    Here is the script:

    Code (CSharp):
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.Events;
    using UnityEngine.XR;

    [System.Serializable]
    public class PrimaryButtonEvent : UnityEvent<bool> { }

    public class PrimaryButtonWatcher : MonoBehaviour
    {
        public PrimaryButtonEvent primaryButtonPress;
        private bool lastButtonState = false;
        private List<UnityEngine.XR.InputDevice> allDevices;
        private List<UnityEngine.XR.InputDevice> devicesWithPrimaryButton;

        void Start()
        {
            if (primaryButtonPress == null)
            {
                primaryButtonPress = new PrimaryButtonEvent();
            }

            allDevices = new List<UnityEngine.XR.InputDevice>();
            devicesWithPrimaryButton = new List<UnityEngine.XR.InputDevice>();
            InputTracking.nodeAdded += InputTracking_nodeAdded;
        }

        // check for new input devices when new XRNode is added
        private void InputTracking_nodeAdded(XRNodeState obj)
        {
            updateInputDevices();
        }

        void Update()
        {
            bool tempState = false;
            bool invalidDeviceFound = false;
            Debug.Log("Checking Devices +++");
            foreach (var device in devicesWithPrimaryButton)
            {
                bool buttonState = false;
                tempState = device.isValid // the device is still valid
                            && device.TryGetFeatureValue(CommonUsages.primaryButton, out buttonState) // did get a value
                            && buttonState // the value we got
                            || tempState; // cumulative result from other controllers

                if (!device.isValid)
                    invalidDeviceFound = true;

                Debug.Log(string.Format("Device valid {0}  buttonState {1}", device.isValid, buttonState));
            }
            Debug.Log("DONE Checking Devices +++");

            if (tempState != lastButtonState) // Button state changed since last frame
            {
                primaryButtonPress.Invoke(tempState);
                lastButtonState = tempState;
                Debug.Log("Button pressed!!!!!!!!!!!!!!");
            }

            if (invalidDeviceFound || devicesWithPrimaryButton.Count == 0) // refresh device lists
                updateInputDevices();
        }

        // find any devices supporting the desired feature usage
        void updateInputDevices()
        {
            devicesWithPrimaryButton.Clear();
            UnityEngine.XR.InputDevices.GetDevices(allDevices);
            bool discardedValue;

            foreach (var device in allDevices)
            {
                if (device.TryGetFeatureValue(CommonUsages.primaryButton, out discardedValue))
                {
                    devicesWithPrimaryButton.Add(device); // Add any devices that have a primary button.
                }
            }
        }
    }
    And the only output I ever see, regardless of how much I press all the buttons on both controllers:

    Device valid True buttonState False

    (see image)
     

    Attached Files:

  17. freso

    freso

    Joined:
    Mar 19, 2013
    Posts:
    73
    More on the wishlist:
    * I want to be able to control anything programmatically. (For example, make the Grab/Socket interactors grab objects.)
    * XRRayInteractor requires XRController (via XRBaseControllerInteractor), but has no reference to it. Add it.
     
    a436t4ataf likes this.
  18. H8ste

    H8ste

    Joined:
    Nov 18, 2017
    Posts:
    1
    I hope this is the correct place to pose the following question, if not please guide me elsewhere :)

    I'm attempting to implement a gesture for scaling objects. I figured I would do this by implementing a new interactable with its own logic, to ensure replicability. However, I've encountered the issue of 'exclusive selection'. My question is:

    Why is the logic of 'exclusive selection of an interactable' bestowed on the instantiations of the controllers (XRBaseInteractor.requireSelectExclusive) rather than on the interactable(s)?

    Consider the use-case of interactables in one's experience affording both multi- and single-input interactions.
    Or am I going about implementing this gesture incorrectly? Any feedback is appreciated; many thanks.
     
  19. Realitycheckgal

    Realitycheckgal

    Joined:
    Nov 16, 2017
    Posts:
    1
    Does the XR toolkit use the Khronos OpenXR framework?
     
  20. nrvllrgrs

    nrvllrgrs

    Joined:
    Jan 12, 2010
    Posts:
    62
    I created a script that extends TeleportationArea and limits teleportation to valid NavMesh locations in the area. My version requires Odin Inspector, but it could easily be modified to remove that requirement. Enjoy.

    Code (CSharp):
    using UnityEngine.AI;
    using Sirenix.OdinInspector;

    namespace UnityEngine.XR.Interaction.Toolkit
    {
        public class TeleportationNavMesh : TeleportationArea
        {
            #region Fields

            [Header("Navigation")]

            [SerializeField, MinValue(0f)]
            private float m_maxDistance = 0.05f;

            [SerializeField, MinValue(-1)]
            private int m_areaMask = -1;

            [SerializeField, ValueDropdown("GetAgentTypeNames")]
            private string m_agentType = "Humanoid";

            #endregion

            #region Methods

            protected override bool GenerateTeleportRequest(XRBaseInteractor interactor, RaycastHit raycastHit, ref TeleportRequest teleportRequest)
            {
                if (!base.GenerateTeleportRequest(interactor, raycastHit, ref teleportRequest))
                    return false;

                NavMeshQueryFilter filter = new NavMeshQueryFilter()
                {
                    agentTypeID = NavMeshUtil.GetAgentTypeId(m_agentType),
                    areaMask = m_areaMask,
                };

                Vector3 samplePos = raycastHit.point + Random.insideUnitSphere * m_maxDistance;
                //Debug.LogFormat("Sample Position = {0}; Agent Type = {1}; Area Mask = {2}", samplePos.ToString("F3"), filter.agentTypeID, m_areaMask);

                // Find position on NavMesh
                if (NavMesh.SamplePosition(samplePos, out NavMeshHit hit, m_maxDistance, filter))
                {
                    teleportRequest.destinationPosition = hit.position;
                    return true;
                }

                return false;
            }

            #endregion

            #region Editor Methods
    #if UNITY_EDITOR

            private string[] GetAgentTypeNames()
            {
                return NavMeshUtil.GetAgentTypeNames();
            }

    #endif
            #endregion
        }
    }
    Code (CSharp):
    using System.Collections.Generic;

    namespace UnityEngine.AI
    {
        public static class NavMeshUtil
        {
            #region Fields

            private static Dictionary<string, int> s_agentTypeMap;

            #endregion

            #region Methods

            public static string[] GetAgentTypeNames()
            {
                int count = NavMesh.GetSettingsCount();
                var agentTypeNames = new string[count]; // one slot per settings entry

                for (int i = 0; i < count; ++i)
                {
                    int id = NavMesh.GetSettingsByIndex(i).agentTypeID;
                    string name = NavMesh.GetSettingsNameFromID(id);
                    agentTypeNames[i] = name;
                }

                return agentTypeNames;
            }

            public static int GetAgentTypeId(string agentType, bool forceReset = false)
            {
                if (string.IsNullOrWhiteSpace(agentType))
                    return 0;

                if (forceReset || s_agentTypeMap == null)
                {
                    ResetAgentTypeIds();
                }

                return s_agentTypeMap.TryGetValue(agentType, out int value)
                    ? value
                    : 0;
            }

            public static void ResetAgentTypeIds()
            {
                s_agentTypeMap = new Dictionary<string, int>();

                int count = NavMesh.GetSettingsCount();
                for (int i = 0; i < count; ++i)
                {
                    int id = NavMesh.GetSettingsByIndex(i).agentTypeID;
                    string name = NavMesh.GetSettingsNameFromID(id);
                    s_agentTypeMap.Add(name, id);
                }
            }

            #endregion
        }
    }
     
    Last edited: Apr 15, 2020
    harleydk likes this.
  21. nrvllrgrs

    nrvllrgrs

    Joined:
    Jan 12, 2010
    Posts:
    62
    In the PrototypeScene, if you select and unselect a TeleportationAnchor, you cannot select another TeleportationAnchor until you reset the controller (i.e. release the thumbstick). Does anybody know how to resolve this issue?
     
  22. R1PFake

    R1PFake

    Joined:
    Aug 7, 2015
    Posts:
    540
    I'm not sure if this is supported/planned yet, but some kind of "non-VR debug player controller" - one which controls the camera by mouse/keyboard, with two fake controller objects in front of the camera - would be awesome, so you don't have to put on the headset all the time. I played around with a prototype myself, but I think it would be an awesome addition to this toolkit.
     
    nrvllrgrs and ROBYER1 like this.
  23. a436t4ataf

    a436t4ataf

    Joined:
    May 19, 2013
    Posts:
    1,932
    I assure you it is definitely working :).

    EDIT: the simple version: http://snapandplug.com/xr-input-toolkit-2020-faq/#FAQ:-How-do-I-detect-button.IsPressed?

    EDIT: more complex:

    You'll need to add some debugging to figure out at what stage it's failing. Given your output, the only obvious thing that comes to mind is that you're not holding the button down long enough (but that's not especially likely - just saying it's the only obvious one I can think of).

    I would try a super simple setup and see if it works:

    1. get input device explicitly (eg "left hand controller")
    2. Debug.log it and make sure its the one you expected
    3. trygetfeaturevalue ( primary pressed )
    4. trygetfeaturevalue ( secondary pressed )

    ...actually, hang on a sec - I can't see your code while writing (Unity forums GRR!) but there's a difference between checking the existence of a button and checking its state. There's a table in the main Unity manual/docs saying which statuses exist on a per-vendor/model basis - you might be checking for something that isn't mapped on Oculus?

    For what it's worth - I frequently use:

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Events;
    using UnityEngine.XR;
    using Unity.XR.Oculus; // OculusUsages
    // NB: HandController and IntelligentXRExtensions are my own helpers (see below).

    public enum QuestButton
    {
        A,
        B,
        TRIGGER,
        GRIP,
        JOYSTICK_PRESS
    }

    public class GlobalControllerPressHandler : MonoBehaviour
    {
        public HandController handToListenTo = HandController.RIGHT_HAND;
        public QuestButton buttonToListenTo = QuestButton.A;

        public UnityEvent buttonClickedEvent;

        public void Update()
        {
            if( IntelligentXRExtensions.ExistsController(handToListenTo, out InputDevice device ) )
            {
                // initialized so the local is definitely assigned
                InputFeatureUsage<bool> buttonToQuery = CommonUsages.primaryButton;
                switch( buttonToListenTo )
                {
                    case QuestButton.A:
                        buttonToQuery = CommonUsages.primaryButton;
                        break;

                    case QuestButton.B:
                        buttonToQuery = CommonUsages.secondaryButton;
                        break;

                    case QuestButton.GRIP:
                        buttonToQuery = CommonUsages.gripButton;
                        break;

                    case QuestButton.TRIGGER:
                        buttonToQuery = CommonUsages.triggerButton;
                        break;

                    case QuestButton.JOYSTICK_PRESS:
                        buttonToQuery = OculusUsages.thumbTouch;
                        break;
                }

                bool isButtonPressed;
                if( device.TryGetFeatureValue( buttonToQuery, out isButtonPressed) && isButtonPressed )
                {
                    buttonClickedEvent.Invoke();
                }
            }
        }
    }

    ...where the code for ExistsController is complex (it has lots of fallbacks and autodetection I wrote myself), but is basically a wrapper for:

    var handedControllers = new List<UnityEngine.XR.InputDevice>();
    var desiredCharacteristics = UnityEngine.XR.InputDeviceCharacteristics.HeldInHand | UnityEngine.XR.InputDeviceCharacteristics.Controller | UnityEngine.XR.InputDeviceCharacteristics.Left;
    UnityEngine.XR.InputDevices.GetDevicesWithCharacteristics(desiredCharacteristics, handedControllers);
    return handedControllers[0];
     
  24. a436t4ataf

    a436t4ataf

    Joined:
    May 19, 2013
    Posts:
    1,932
    Have you looked at:

    http://snapandplug.com/xr-input-too...ithout-a-headset?-/-Is-there-a-headless-mode?

    Making a fake controller should be easy, but it will be unique for every game! My FPS wants a completely different setup to your top-down RTS, for example :)
     
  25. Aaron-Meyers

    Aaron-Meyers

    Joined:
    Dec 8, 2009
    Posts:
    305
    Are there plans to give us some options for a quick linear interpolation for position on teleport? Or even a quick blackout?
     
  26. AlexSon007

    AlexSon007

    Joined:
    Dec 12, 2019
    Posts:
    2
    The Locomotion Systems section states that the example value is set to 600 seconds, while the attached image shows a Timeout of 10. I believe this is an error?
     
  27. Aaron-Meyers

    Aaron-Meyers

    Joined:
    Dec 8, 2009
    Posts:
    305
    Think I found a bug with XRIT and UI interaction. I've added some Event Triggers to a Slider. Begin Drag gets called just fine but End Drag never fires.
     
  28. T3ddyTheGiant

    T3ddyTheGiant

    Joined:
    Aug 1, 2018
    Posts:
    11
    I just want to say: I'm confused why the toolkit isn't utilizing the new Input System!

    Questions...
    1. Is anyone using both the new Input System and the XR toolkit together? They don't seem to play nicely with each other. I like the functionality that the XR toolkit provides, but I'd also prefer to keep the Input System's robustness.
    2. I've been experimenting with the new Tracked Pose Driver (New Input System) scripts, and they only work when the headset is OFF, or when it is ON but you press Win+Y to regain desktop input/controls. Is there a way I can change this behavior?
     
    Last edited: Apr 22, 2020
  29. jiraphatK

    jiraphatK

    Joined:
    Sep 29, 2018
    Posts:
    298
    Last time I checked, about a month ago, I could not get them to work together. A user told me they can't be used in the same project at the moment.
     
    rgbauv, kavanavak and ROBYER1 like this.
  30. T3ddyTheGiant

    T3ddyTheGiant

    Joined:
    Aug 1, 2018
    Posts:
    11
    I was afraid that was the case. Two potentially really great systems that don't work well / communicate with each other.
     
  31. yulaw2k

    yulaw2k

    Joined:
    Apr 25, 2014
    Posts:
    31
    If you search "Input System" in this thread, you can find posts from Unity saying it's on their roadmap for the next release, or coming soon.
     
  32. LeFx_Tom

    LeFx_Tom

    Joined:
    Jan 18, 2013
    Posts:
    88
    I know I have already asked this somewhere else, but this thread seems more fitting... so forgive me for a partial repost:

    The XR Interaction Toolkit looks very promising, but I don't get the feeling that it is being continuously worked on.
    Is there a roadmap or any set plan for features?
    It would be really nice to have one unified toolset to develop applications for all/most HMDs, and it is a charming idea that it could be Unity's "own" maintained solution... but for that, it would need some sort of reliable development lifecycle.

    So far - and please forgive me if I am wrong - the XRI Toolkit feels a little like a pet project of one or two people at Unity who work on it whenever there is nothing more urgent to be done. There is not much communication about a development schedule; you sometimes get answers like "it is being looked into" or "we are planning to include it in a future release", but then there can easily be silence for a month or two.

    This seems like a problematic approach - together with the S***show (sorry... harsh, but that is what it feels like to us here atm) that is the switch from the "legacy" XR plugin structure to the new XR Management platform, it leaves me a bit puzzled about Unity's plans for the future of XR development.
     
    Brad-Newman and Shizola like this.
  33. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,454
    @StayTalm_Unity I have finally found the cause of this very annoying problem, and right after, quickly produced a repro project using the Samples project for XRI: (Case 1241245) [XR Interaction Toolkit] Controller connection interruptions disable all XR Interactors through Controller Manager Script
     
  34. freso

    freso

    Joined:
    Mar 19, 2013
    Posts:
    73
    More input. :)
    I'm not super sure how the XRInteractionManager is supposed to be used. It says it is an "intermediary between Interactors and Interactables". In that case, why not add UnityEvents to it, so we can hook into them?

    If the events were set up like this, it would be super usable:
    XRInteractable:
    • On[Select|Hover][Enter|Exit](XRBaseInteractable, XRBaseInteractor)
    • OnActivate(XRBaseInteractable, XRBaseInteractor)
    XRInteractor:
    • On[Select|Hover][Enter|Exit](XRBaseInteractor, XRBaseInteractable)
    • OnActivate(XRBaseInteractor, XRBaseInteractable)
    XRInteractionManager:
    • On[Select|Hover][Enter|Exit](XRBaseInteractor, XRBaseInteractable)
    • OnActivate(XRBaseInteractor, XRBaseInteractable)
    • On[Select|Hover][Enter|Exit](XRBaseInteractable, XRBaseInteractor)
    • OnActivate(XRBaseInteractable, XRBaseInteractor)
    In this case I can choose to add individual events to special objects, or intercept the events for all objects directly in the manager. Also, this makes it very friendly programmatically, if we can hook into the events when spawning objects at runtime.
     
  35. a436t4ataf

    a436t4ataf

    Joined:
    May 19, 2013
    Posts:
    1,932
    I'd disagree that UnityEvent is "friendly programmatically"! It's just about usable, but it has major problems and makes large projects horrible.

    UnityEvent is officially (I believe the phrase is "not intended for use" :)) for runtime scripting: from scripts you can only make transient connections, and there's no error handling for this. So either your code works in the Editor while developing and breaks in the game (no warning, no build error), or it works when you build, but your scene keeps being corrupted while you're developing (no warning, no editor error).

    I love UnityEvent for test projects and prototypes, where you can wire up buttons and things rapidly - but it's a nightmare for ongoing game development, where the invisible connections make debugging impossible, and the code to create events from script is just nasty (and - as far as I can tell - is broken for lambdas. Lambdas are more valuable when coding than UnityEvent is, so we normally delete all UnityEvents from a project if it's serious, and ban using them for anything except Button.onClick).

    So ... I'd greatly prefer normal, working code hooks: abstract base classes (used heavily by XR already - but not very flexible) and virtual methods; delegates (much more powerful than the XR abstract base classes, and used heavily throughout the Unity APIs already); callbacks (ditto). Stuff that works in code and is fully compatible with C# :).
     
    harleydk, Shizola and T3ddyTheGiant like this.
  36. freso

    freso

    Joined:
    Mar 19, 2013
    Posts:
    73
    Well, it would be nice if the Unity Inspector could show standard C# events instead of using this workaround with UnityEvents, but aren't they basically the same? (Foo.bar.AddListener(me) vs Foo.bar += me). Maybe I misunderstood your point.

    Anyway, the UnityEvent improvements I suggested are a minor change to the current setup, not a rewrite, and would help at least me. :)

    UnityEvents are far from perfect. Inspector-wise, I think they should be set up on the listener, not the sender. The list should be populated when AddListener is used at runtime. You should not have to create a workaround class for the event (I think I read this is solved in 2020, not sure). But we work with what we have :/

    Re: base classes - there is nothing more annoying than base classes with protected methods that use private members, so you can't actually override them properly. So everything can be implemented in a bad way. (Edit: I recently tried Oculus Integration and it made me sick :p)
     
    a436t4ataf likes this.
  37. a436t4ataf

    a436t4ataf

    Joined:
    May 19, 2013
    Posts:
    1,932
    For XR ... the main problem with UnityEvent is probably that it's not part of XRIT; it's in the core UnityEngine, so it's pretty much guaranteed it won't get changed/improved in any significant way that we might ask for here :) (it would affect too many other Unity packages, would impact too many existing projects, and it has its own strategic direction that is *not* VR/AR).

    So ... while there are good reasons for being compatible with it (e.g. the team have stated they are actively integrating with the new Input System, which is great!), I also feel that - scripting/code-wise - we don't want it to be the primary source of hooks into the API.
     
  38. Shizola

    Shizola

    Joined:
    Jun 29, 2014
    Posts:
    475
    This would be cool. There are so many interesting Unity things on GitHub, but unless you constantly scan Twitter, they're easily missed.

    For example, Half Life style gravity gloves using XR toolkit - https://twitter.com/prvncher/status/1249449911961235458?s=20
     
  39. a436t4ataf

    a436t4ataf

    Joined:
    May 19, 2013
    Posts:
    1,932
    If you want ... go curate it :). Start a new thread and ask for contributions. A simple list of interesting projects would be a great start.

    (FYI - I'd love a curated list, but I wouldn't use a common repository. I'm not a big fan of aggregating Unity extensions - if you try to make a common repo, you always seem to end up with a monolithic mega-repo that is mostly useless for complex projects, or for anyone who just wants one of the features. It's like the worst parts of Boost :) - you get massive bloat, and everything's so interlinked it can't be unpicked.)
     
  40. andybak

    andybak

    Joined:
    Jan 14, 2017
    Posts:
    569
    @Shizola Or bypass the Twitter bit. Follow people on Github, star repos and check your newsfeed to see updates, new repos and who they've starred and followed.
     
    Shizola likes this.
  41. stychu

    stychu

    Joined:
    May 9, 2016
    Posts:
    62
    Having a problem with Unity and Quest. Recently everything was working, but now it's not. I've installed 2019.3.11f1, created a new empty project, added the XR toolkit and XR Management + the Oculus provider, and added an XR Rig. When I enter play mode, nothing happens in the headset; it keeps me in the dash menu, and I see this error in the console:


    DllNotFoundException: OculusXRPlugin
    Unity.XR.Oculus.OculusLoader.Initialize () (at Library/PackageCache/com.unity.xr.oculus@1.3.3/Runtime/OculusLoader.cs:103)
    UnityEngine.XR.Management.XRManagerSettings.InitializeLoaderSync () (at Library/PackageCache/com.unity.xr.management@3.2.9/Runtime/XRManagerSettings.cs:169)
    UnityEngine.XR.Management.XRGeneralSettings.InitXRSDK () (at Library/PackageCache/com.unity.xr.management@3.2.9/Runtime/XRGeneralSettings.cs:203)
    UnityEngine.XR.Management.XRGeneralSettings.AttemptInitializeXRSDKOnLoad () (at Library/PackageCache/com.unity.xr.management@3.2.9/Runtime/XRGeneralSettings.cs:176)
     
  42. jj-unity

    jj-unity

    Unity Technologies

    Joined:
    Jan 4, 2017
    Posts:
    74
    I tried to reproduce this locally with 2019.3.11f1 and everything seems to be working. Do you mind filing a bug with this simple empty project that's causing issues and sending me a link?
     
  43. stychu

    stychu

    Joined:
    May 9, 2016
    Posts:
    62
    Hey there - so I've uninstalled Unity completely, removed all associated files in the AppData folder, and installed again, and that seems to have solved the problem. I can run my scene in play mode and Unity complains no more. I'd created a bug report before that; I'll DM you the link.
     
  44. jj-unity

    jj-unity

    Unity Technologies

    Joined:
    Jan 4, 2017
    Posts:
    74
    Thanks for the bug report! I noticed in your Library/PackageCache folder for the Oculus XR Plugin package that the .dlls were missing, but the associated .meta files were still present. I'm not really sure how that could happen, but I'm glad that you found a solution that works for you.

    If anyone else sees a similar issue, please let us know. I'm hoping this was a one-off case though.
     
  45. FishStickies94

    FishStickies94

    Joined:
    Jul 27, 2019
    Posts:
    70
    I hope someone can help me with this, as it's 6am and for the past 3 hours I've been plugging away trying to do this.

    I am creating a reload system: when a player brings a bullet (XRGrabInteractable) into the reload area, it automatically loads the bullet while removing it from the player's hands.

    The way I do this is on TriggerEnter, calling OnSelectExit on the bullet. It works, but it led me to some odd behaviour. Since I still have the grip down, OnSelectExit only clears the bullet's connection to the Hand (XRDirectInteractor), but not the other way round. I did a few small tests by creating a large trigger collider and bringing my hand through. First off, the bullet would just drop, not taking into account previous velocity (it's velocity-tracked). But since the Hand still had a connection to the bullet, once I released the grip the bullet would then move with the velocity I had when it dropped. No matter how long I waited, once I let go it would trigger OnSelectExit on the Hand.

    I've been through all the scripts in the package and searched the web, and I can't for the life of me see how to trigger OnSelectExit on the controller without releasing the grip. While it's not necessary for the thing I'm doing now, it would be great to know for the future (say, knocking items out of players' hands etc). OnSelectExit in the XRBaseInteractor package files is protected internal as well, so clearly we aren't meant to use it.

    TLDR: how do I force the hand to drop an item, maintaining its velocity, without releasing the grip?
     
  46. NEOF

    NEOF

    Joined:
    Jun 25, 2013
    Posts:
    9
    Hi, I have posted an issue on the xrtoolkitexamples GitHub, but I want to ask here also. Here it is:
    I have reproduced this issue using Photon, but I guess there are other ways to trigger it that I am not aware of. (EDIT: Most likely it is because the object that is a parent of the interactors gets disabled before getting destroyed.)
    DoProcess in XRUIInputModule fails to remove RegisteredInteractables properly when using PhotonNetwork.Destroy and PhotonNetwork.Instantiate in sequence.

    I have a scene which joins Photon and spawns a player with an XR rig and 2 ray interactors. I press a UI item which does Photon.Destroy on the current player and Photon.Instantiate on another prefab with an XR rig and 1 ray interactor + 1 direct interactor.

    When it runs through DoProcess() in XRUIInputModule.cs, it takes the else branch even though registeredInteractable.interactable is null. It should instead remove the interactable from the list.

    Because of that I keep getting this error for the interactable that is not removed properly.

    MissingReferenceException: The object of type 'Transform' has been destroyed but you are still trying to access it. Your script should either check if it is null or you should not destroy the object.
    UnityEngine.Transform.get_position () (at <05f2ac9c8847426992765a22ef6d94ca>:0)
    UnityEngine.XR.Interaction.Toolkit.XRRayInteractor.UpdateUIModel (UnityEngine.XR.Interaction.Toolkit.UI.TrackedDeviceModel& model) (at Library/PackageCache/com.unity.xr.interaction.toolkit@0.9.4-preview/Runtime/Interaction/Interactors/XRRayInteractor.cs:339)
    UnityEngine.XR.Interaction.Toolkit.UI.XRUIInputModule.DoProcess () (at Library/PackageCache/com.unity.xr.interaction.toolkit@0.9.4-preview/Runtime/UI/XRUIInputModule.cs:165)
    UnityEngine.XR.Interaction.Toolkit.UI.UIInputModule.Process () (at Library/PackageCache/com.unity.xr.interaction.toolkit@0.9.4-preview/Runtime/UI/UIInputModule.cs:32)
    UnityEngine.EventSystems.EventSystem.Update () (at C:/Program Files/Unity/Hub/Editor/2019.3.9f1/Editor/Data/Resources/PackageManager/BuiltInPackages/com.unity.ugui/Runtime/EventSystem/EventSystem.cs:377)
     
    Last edited: Apr 29, 2020
    ROBYER1 likes this.
  47. mikeNspired

    mikeNspired

    Joined:
    Jan 13, 2016
    Posts:
    82
    I didn't read your whole post, but they plan on updating all these protected internal methods in the future. The easy fix right now is to make those methods public yourself until the next update. I made OnSelectExit public, and ForceSelect on the manager, to get my inventory working.

    Make sure to also check out the FAQ a436t4ataf made:
    http://snapandplug.com/xr-input-too...I-force-a-hand/controller-to-drop-held-items?
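
    If you'd rather not edit the package files at all, the same method can be reached with reflection (the OnSelectExit name/parameter are guesses based on the 0.9 preview source discussed above - adjust to whatever your version shows):

    Code (CSharp):
    using System.Reflection;
    using UnityEngine.XR.Interaction.Toolkit;

    public static class InteractorForceDrop
    {
        // Invokes the protected-internal OnSelectExit(XRBaseInteractable) via
        // reflection so the package source stays untouched. Fragile across updates.
        public static void ForceDrop(this XRBaseInteractor interactor, XRBaseInteractable held)
        {
            MethodInfo exit = typeof(XRBaseInteractor).GetMethod(
                "OnSelectExit", BindingFlags.NonPublic | BindingFlags.Instance);
            exit?.Invoke(interactor, new object[] { held });
        }
    }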
     
    a436t4ataf and FishStickies94 like this.
  48. FishStickies94

    FishStickies94

    Joined:
    Jul 27, 2019
    Posts:
    70
    I ended up doing the same thing in the InteractionManager script. I'm new to Unity, so I was worried about editing package files, but it works well enough, and I'm glad to know others are doing it that way.
     
    mikeNspired likes this.
  49. a436t4ataf

    a436t4ataf

    Joined:
    May 19, 2013
    Posts:
    1,932
    Make a small example (not using Photon - do it by hand with a small script) and report it as a bug inside the Unity Editor, using the Window > Report a Bug feature.

    I think I already reported the same bug - but Unity has problems with the uploaded scene (I think they might have lost it :)), and I don't have the original any more (the project has moved on), so the bug report will probably get ignored. It's worth you re-reporting it!
     
  50. JRissa

    JRissa

    Joined:
    Mar 20, 2015
    Posts:
    12
    I'm developing on Oculus Go and using the XR toolkit. Everything else is working great, but the second hand controller/raycaster is active and falls to the ground when launching the game. Is there any easy way to disable the unoccupied hand altogether?
     
    Last edited: May 1, 2020
    playfulsystems likes this.