
Official XR Interaction Toolkit Preview Release (0.9)

Discussion in 'XR Interaction Toolkit and Input' started by mfuad, Dec 17, 2019.

Thread Status:
Not open for further replies.
  1. nigel-moore

    nigel-moore

    Joined:
    Aug 28, 2012
    Posts:
    26
    Is there any way to initiate a teleport via code - i.e. not using an XR Ray Interactor? I want my user to be able to select certain teleport anchor locations from a menu but cannot see a way to obtain or generate a TeleportRequest that I could then pass to the QueueTeleportRequest method of the TeleportationProvider class. Is this possible right now?
     
  2. MaskedMouse

    MaskedMouse

    Joined:
    Jul 8, 2014
    Posts:
    1,092
How do I make the XR Ray Interactor's line stop at the point where it intersects UI?
I've looked at the XRInteractorLineVisual script and it does have an
m_StopLineAtFirstRaycastHit
boolean, but it doesn't seem to appear in the Inspector. By default it is set to true, but it doesn't appear to be used anywhere in the code yet.
It just looks terrible: the line is drawn at max length even when UI is in the way.

    XR Interaction Toolkit 0.9.2
     
  3. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,454
    Here https://forum.unity.com/threads/xr-interaction-toolkit-preview-release.795684/page-3#post-5386755
     
  4. MaskedMouse

    MaskedMouse

    Joined:
    Jul 8, 2014
    Posts:
    1,092
  5. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,454
It's just one line in the code that's wrong by a single word. The fix will come in a new version soon. Fret not o_O
     
  6. Matt_D_work

    Matt_D_work

    Unity Technologies

    Joined:
    Nov 30, 2016
    Posts:
    202
Yep, if you look at the code for the teleport interactables, they submit a request to the teleportation provider. You can also submit your own request via code; QueueTeleportRequest is the function.
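Something like this minimal sketch (the TeleportRequest field names are as of the 0.9 preview and may change in later versions):

Code (CSharp):
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

public class MenuTeleporter : MonoBehaviour
{
    // Assign the scene's TeleportationProvider in the Inspector.
    public TeleportationProvider provider;

    // Hook this up to a menu button, passing the chosen anchor's transform.
    public void TeleportTo(Transform anchor)
    {
        var request = new TeleportRequest
        {
            destinationPosition = anchor.position,
            destinationRotation = anchor.rotation
        };
        provider.QueueTeleportRequest(request);
    }
}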
     
    nigel-moore likes this.
  7. Matt_D_work

    Matt_D_work

    Unity Technologies

    Joined:
    Nov 30, 2016
    Posts:
    202
Just published 0.9.3-preview of the Interaction Toolkit, which fixes a bunch of bugs (including the InputHelpers compile error in the latest from the toolkit git repo).

changelog:
Lots of fixes based on user feedback from the blog post / forums
Including:
• Adds pose provider support to the XR Controller MonoBehaviour
• Fixes minor documentation issues
• Fixes passing of objects from hand to hand using direct interactors
• Removes the need for box colliders behind UI to stop line visuals from drawing through them
• Adds ability to put objects back to their original hierarchy position when dropping them
• Fixes null ref in controller states clear
• Fixes missing "OnRelease" event for Activate on Grabbable
• Makes teleport configurable to use either activate or select
     
  8. scrant

    scrant

    Joined:
    Jun 1, 2017
    Posts:
    73
I've been using the XR Toolkit for AR development for about a month now, and while it's mostly great, it isn't ready for prime time when it comes to developers hooking into it (understandable, since it's still in preview). It seems we are supposed to inherit from and override ARBaseInteractable, because inheriting from and overriding ARTranslationInteractable, for instance, does not really allow you to hook in sufficiently. That's unfortunate, because most people will want the translation and rotation gestures, which do 99.9% of what you want, but then you need to tweak or add functionality and you can't.

For instance, the ARBaseGestureInteractable class allows the gesture to start without first checking whether anything is selected:

Code (CSharp):
void OnGestureStarted(DragGesture gesture)
{
    if (m_IsManipulating)
        return;

    if (CanStartManipulationForGesture(gesture))
    {
        m_IsManipulating = true;
        gesture.onUpdated += OnUpdated;
        gesture.onFinished += OnFinished;
        OnStartManipulation(gesture);
    }
}

void OnUpdated(DragGesture gesture)
{
    if (!m_IsManipulating)
        return;

    // Can only transform selected Items.
    if (!IsGameObjectSelected())
    {
        m_IsManipulating = false;
        OnEndManipulation(gesture);
        return;
    }

    OnContinueManipulation(gesture);
}
This causes OnStartManipulation to be called in inherited classes when, in fact, it shouldn't be, because the object may not have been selected yet. Selection is only checked later, when OnUpdated runs, and if the object isn't selected, OnEndManipulation is called. This seems wrong to me, as we've already "started".

Further, in the ARTranslationInteractable class (which, again, does 99% of what we need), m_IsActive is set to true at the end of OnEndManipulation, which causes another whole round of position updates in the next Update call on this class. If we are truly ending the manipulation, it should be completely done:

Code (CSharp):
protected override void OnEndManipulation(DragGesture gesture)
{
    if (!m_LastPlacement.HasPlacementPosition)
        return;

    GameObject oldAnchor = transform.parent.gameObject;
    Pose desiredPose = new Pose(m_DesiredAnchorPosition, m_LastPlacement.PlacementRotation);

    Vector3 desiredLocalPosition = transform.parent.InverseTransformPoint(desiredPose.position);

    if (desiredLocalPosition.magnitude > MaxTranslationDistance)
        desiredLocalPosition = desiredLocalPosition.normalized * MaxTranslationDistance;
    desiredPose.position = transform.parent.TransformPoint(desiredLocalPosition);

    //Anchor newAnchor = m_LastPlacement.Trackable.CreateAnchor(desiredPose);
    var anchorGO = new GameObject("PlacementAnchor");
    anchorGO.transform.position = m_LastPlacement.PlacementPosition;
    anchorGO.transform.rotation = m_LastPlacement.PlacementRotation;
    transform.parent = anchorGO.transform;

    Destroy(oldAnchor);

    m_DesiredLocalPosition = Vector3.zero;

    // Rotate if the plane direction has changed.
    if (((desiredPose.rotation * Vector3.up) - transform.up).magnitude > k_DiffThreshold)
        m_DesiredRotation = desiredPose.rotation;
    else
        m_DesiredRotation = transform.rotation;

    // Make sure position is updated one last time.
    m_IsActive = true;
}
If I inherit from this class, override OnEndManipulation, and try to fine-tune the placement of the object when it's done (snapping to a position, for instance), I don't have access to the m_IsActive variable, so I can't really change the behavior and end up fighting its updates.

Really, I'd have to copy (clone) the ARTranslationInteractable and ARRotationInteractable classes and hack them up, which defeats the purpose of object-oriented code, because the underlying code will ultimately change. I've had to hack the package code, which is bad for the same reasons. I'd love better hooks into these classes, as most people will want this behavior but with a few simple tweaks; right now that's not really possible. Am I thinking about this incorrectly? Will this always be the behavior, or are updates on the way? I was hoping the latest blog post would cover some of this, but it didn't.

    Thanks and nice work so far on the toolkit. Getting so close to great!
     
    createtheimaginable likes this.
  9. kavanavak

    kavanavak

    Joined:
    Sep 30, 2015
    Posts:
    54
    Thanks for the update, keep up the good work :)

    Quick Question:

Is there a preferred/official/recommended way to check whether the left or right controller is on, and if not, to stop the model from being pulled into the scene by the XR Controller script? If so, please direct me to it. Thank you.

Currently, all the scenes (even the demo, it seems) load controller prefabs into the scene if they're assigned on the XRController, regardless of whether the controller is on when the scene loads.
     
  10. createtheimaginable

    createtheimaginable

    Joined:
    Jan 30, 2019
    Posts:
    29
    I am having trouble with that too! I think you are supposed to extend and then override the Interactors and Interactables.

    Extending The XR Interaction Toolkit

Code (CSharp):
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;
using UnityEngine.XR.Interaction.Toolkit.AR;

public class MyARSelectionInteractable : ARSelectionInteractable
{
    public bool m_GestureSelected { get; private set; }

    public override bool IsSelectableBy(XRBaseInteractor interactor)
    {
        if (!(interactor is ARGestureInteractor))
            return false;

        return m_GestureSelected;
    }
}
     
  11. scrant

    scrant

    Joined:
    Jun 1, 2017
    Posts:
    73
Right, so does that mean all I have to do is override as above and return true? Does IsSelectableBy mean selected? And is it called by the interactor? You can't access the m_GestureSelected variable in an override, as it's private, so you'll have to come up with your own. I feel like there should just be a method on either the interactable or the interactor, like SetSelected.
     
    createtheimaginable likes this.
  12. Sb86

    Sb86

    Joined:
    Sep 5, 2018
    Posts:
    5
I have been trying to create a simple menu script for the Oculus Touch controller.
Pressing button A opens the menu, the joystick navigates left and right, the trigger selects, and the A button closes the menu again.
I cannot for the life of me figure out how to achieve this!
I have it working with SteamVR, but I thought this would be a good time to change to the XR Toolkit, as we no longer use the HTC Vive.
Can anyone help me with this?
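In case a concrete starting point helps, here is a minimal polling sketch (my own, not an official recipe) using UnityEngine.XR.InputDevices; CommonUsages.primaryButton maps to the A/X buttons on Oculus Touch, and the menu actions are stubbed out as Debug.Log calls:

Code (CSharp):
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

public class SimpleMenuInput : MonoBehaviour
{
    InputDevice m_RightHand;
    bool m_PreviousA;

    void Update()
    {
        // (Re)acquire the right-hand device if we don't have a valid one yet.
        if (!m_RightHand.isValid)
        {
            var devices = new List<InputDevice>();
            InputDevices.GetDevicesAtXRNode(XRNode.RightHand, devices);
            if (devices.Count > 0)
                m_RightHand = devices[0];
            return;
        }

        // A button toggles the menu (edge detection so it fires once per press).
        if (m_RightHand.TryGetFeatureValue(CommonUsages.primaryButton, out bool aPressed))
        {
            if (aPressed && !m_PreviousA)
                Debug.Log("Toggle menu");
            m_PreviousA = aPressed;
        }

        // Joystick left/right for navigation.
        if (m_RightHand.TryGetFeatureValue(CommonUsages.primary2DAxis, out Vector2 axis))
        {
            if (axis.x > 0.7f) Debug.Log("Navigate right");
            else if (axis.x < -0.7f) Debug.Log("Navigate left");
        }

        // Trigger to select.
        if (m_RightHand.TryGetFeatureValue(CommonUsages.triggerButton, out bool trigger) && trigger)
            Debug.Log("Select");
    }
}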
     
  13. ArdaZeytin

    ArdaZeytin

    Joined:
    Jul 23, 2016
    Posts:
    12
I have started using the XR Toolkit instead of VRTK or the Oculus SDK. I could not find a good/practical way to use touch events. I am curious: the "XR Direct Interactor" component is available on XR controllers, but there are no "Touch" event triggers for interactable objects. Maybe I am missing something, but I recommend adding these events:
    • OnFirstTouchEnter()
    • OnTouchEnter()
    • OnLastTouchExit()
    • OnTouchExit()
A screenshot from my last project is attached. I implemented additional classes based on XRBaseInteractable, which solves it temporarily (a rough sketch follows below). I would like to know whether I am missing something or whether this will be available in a future release. Thanks.
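Roughly, my temporary solution looks like the sketch below, treating a Direct Interactor's hover as "touch". The hover method names are from the 0.9 preview (they may be renamed in later releases), and the UnityEvents are my own additions, not toolkit API:

Code (CSharp):
using UnityEngine.Events;
using UnityEngine.XR.Interaction.Toolkit;

public class TouchableInteractable : XRBaseInteractable
{
    public UnityEvent onFirstTouchEnter;
    public UnityEvent onTouchEnter;
    public UnityEvent onTouchExit;
    public UnityEvent onLastTouchExit;

    int m_TouchCount;

    protected override void OnHoverEnter(XRBaseInteractor interactor)
    {
        base.OnHoverEnter(interactor);
        m_TouchCount++;
        if (m_TouchCount == 1)
            onFirstTouchEnter.Invoke();
        onTouchEnter.Invoke();
    }

    protected override void OnHoverExit(XRBaseInteractor interactor)
    {
        base.OnHoverExit(interactor);
        m_TouchCount--;
        onTouchExit.Invoke();
        if (m_TouchCount == 0)
            onLastTouchExit.Invoke();
    }
}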
     

    Attached Files:

  14. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,454
    Upon upgrading to package version 0.9.3 from 0.9.2, I get this error

##Edit: Just fixed this. If it happens to you, you need to remove the XR Legacy Input Helpers package (it will automatically be re-added, as it is a dependency of the Interaction Toolkit).
Even when deleting and re-adding the package, the error shows up:

    Library\PackageCache\com.unity.xr.interaction.toolkit@0.9.3-preview\Runtime\Interaction\Controllers\XRController.cs(283,35): error CS1061: 'BasePoseProvider' does not contain a definition for 'TryGetPoseFromProvider' and no accessible extension method 'TryGetPoseFromProvider' accepting a first argument of type 'BasePoseProvider' could be found (are you missing a using directive or an assembly reference?)
     
    Last edited: Jan 28, 2020
    ilyaylm, Turtwiggy, JamesClow and 2 others like this.
  15. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,454
@StayTalm_Unity The new improvements to the Controller Manager script are great. I can now show the teleporter with Primary 2D Axis Up using the newly added Activation Usages; however, I can't for the life of me activate the teleportation on release of the joystick through the XR Controller Usages when I make it Activate with Primary 2D Axis down. Is there a way of triggering the teleport on release of the analogue stick, like the standard technique Windows Mixed Reality uses?
     
  16. Matt_D_work

    Matt_D_work

    Unity Technologies

    Joined:
    Nov 30, 2016
    Posts:
    202
Make sure you have 1.3.8 of the LIH (XR Legacy Input Helpers) package. You probably had a 2.x version, which is, despite the higher number, not the latest.
     
    ROBYER1 likes this.
  17. Matt_D_work

    Matt_D_work

    Unity Technologies

    Joined:
    Nov 30, 2016
    Posts:
    202
It's one of the tasks I'll be working on shortly. There are two different callbacks:
InputDevices.deviceConnected and InputDevices.deviceDisconnected

Listening to those will tell us when things are connected/disconnected. The way the ControllerManager is set up right now doesn't make it easy to turn entire controllers on/off, so it'll be getting a bit of a refactor.
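For example, a minimal sketch of subscribing to those callbacks (the model-toggling logic here is only an illustration, not what the refactored ControllerManager will do; InputDevice.characteristics requires Unity 2019.3+):

Code (CSharp):
using UnityEngine;
using UnityEngine.XR;

public class ControllerPresence : MonoBehaviour
{
    public GameObject leftModel;
    public GameObject rightModel;

    void OnEnable()
    {
        InputDevices.deviceConnected += OnDeviceChanged;
        InputDevices.deviceDisconnected += OnDeviceChanged;
    }

    void OnDisable()
    {
        InputDevices.deviceConnected -= OnDeviceChanged;
        InputDevices.deviceDisconnected -= OnDeviceChanged;
    }

    // Show or hide the hand model matching the device that changed state.
    void OnDeviceChanged(InputDevice device)
    {
        if ((device.characteristics & InputDeviceCharacteristics.Left) != 0)
            leftModel.SetActive(device.isValid);
        else if ((device.characteristics & InputDeviceCharacteristics.Right) != 0)
            rightModel.SetActive(device.isValid);
    }
}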
     
    kavanavak likes this.
  18. nigel-moore

    nigel-moore

    Joined:
    Aug 28, 2012
    Posts:
    26
    Perfect, thanks Matt
     
  19. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    182
    @ROBYER1
    WMR-style teleporting was one of my main test cases for that refactor, so it is possible.
    Here is what I did:
    1) Change the ControllerManager's activation button to 'Primary 2D Axis Up'
    2) Change the ControllerManager's deactivation button to Grip (to cancel)
    3) Go to both [Left/Right]TeleportController's XRController and set the Select Usage to 'Primary 2D Axis Up'
    4) Make sure your Teleport GameObject's TeleportationAnchor 'Teleport Trigger' is set to 'On Select Exit'

If the ControllerManager's Activation Usage and the teleport controller's Select Usage match, then it should teleport on release of that usage, so long as a deactivation button wasn't pressed.

    Does that work for you? And is that the behaviour you wanted?
     
    nigel-moore and ROBYER1 like this.
  20. bruno1308

    bruno1308

    Joined:
    Aug 25, 2016
    Posts:
    4
Upon upgrading to package version 0.9.3 from 0.9.2, I lost a nice behaviour; I'm not sure if it's a bug or the new default:
* Using an XRRayInteractor, if I held the Select Usage button while aiming at nothing, and only then aimed at something Interactable, the Interactable would go into my hand (the desired behaviour). After the update, the Interactable will only teleport to my hand if I aim at it beforehand and only then press the Select Usage button.

Could you clarify whether this was an intended change? @StayTalm_Unity @Matt_D_work
     
  21. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,454
    It did work, thanks.

Next issue, which I realise isn't XR Interaction specific, but it will affect anyone using the sample repo on Oculus Quest/Go: using Vulkan Multiview rendering with the Built-in renderer on Oculus Quest/Go on Unity 2019.3 and 2020.1.0a20 renders both eyes overlapped on the left eye (see below). If you attempt to use UniversalRP instead of the Built-in renderer, the left eye renders completely grey and the right eye renders black.

I am a bit miffed with the currently released XR features. @StayTalm_Unity @Matt_D_work @daves, are you able to prompt anyone in the VR department to maintain a simple test project, with a cube and some world-space UI, using Built-in and then also URP + Shader Graph + XR Management + Oculus XR Plugin + Input System + XR Interaction Toolkit all together, to make sure they actually all work together?

    Reported at Case 1215369 for built-in and Case 1215378 for UniversalRP (URP)

    More info here: https://forum.unity.com/threads/fol...ous-systems-supposed-to-work-together.817086/


    doubleup.png
     
    Last edited: Jan 29, 2020
    kavanavak, nigel-moore and Shizola like this.
  22. nigel-moore

    nigel-moore

    Joined:
    Aug 28, 2012
    Posts:
    26
    Is there any way to use Quest's Fixed Foveated Rendering with XRITK? I'm a little concerned that it is only available if you are using the actual OVR camera rig - is that correct? o_O
     
    ROBYER1 likes this.
  23. Unityceit2

    Unityceit2

    Joined:
    Sep 24, 2018
    Posts:
    1
Can you see the XR Interaction Toolkit in the Package Manager? It doesn't appear in my list.

    upload_2020-1-29_16-19-47.png
     
  24. scrant

    scrant

    Joined:
    Jun 1, 2017
    Posts:
    73
So it turns out the above is correct. If you return true from the IsSelectableBy override, the object will become selected. I guess the interactor or interaction manager polls each frame and calls IsSelectableBy on each object. Not obvious from the name of the function (which seems to imply it COULD be selected), but it works.
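So the workaround ends up looking something like this sketch; the gestureSelected flag is my own, since the base class's m_GestureSelected is private:

Code (CSharp):
using UnityEngine.XR.Interaction.Toolkit;
using UnityEngine.XR.Interaction.Toolkit.AR;

public class MySelectionInteractable : ARSelectionInteractable
{
    // Our own selection flag; set it from whatever logic decides selection.
    public bool gestureSelected { get; set; }

    public override bool IsSelectableBy(XRBaseInteractor interactor)
    {
        // Only gesture interactors may select this object.
        if (!(interactor is ARGestureInteractor))
            return false;

        // The interaction manager polls this each frame;
        // returning true makes the object selected.
        return gestureSelected;
    }
}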
     
    createtheimaginable likes this.
  25. Matt_D_work

    Matt_D_work

    Unity Technologies

    Joined:
    Nov 30, 2016
    Posts:
    202
Hmm, anyone else seeing that?
     
  26. Shizola

    Shizola

    Joined:
    Jun 29, 2014
    Posts:
    475
Probably need to toggle "Show preview packages"?
     
    Matt_D_work likes this.
  27. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    182
That got turned into a setting on the XRRayInteractor. You should now see a Select Action Trigger setting, which defaults to State Change. The original behaviour that you want is State.

    Let me know if that helps.
     
    bruno1308 likes this.
  28. Aaron-Meyers

    Aaron-Meyers

    Joined:
    Dec 8, 2009
    Posts:
    305
Just noticed that when dragging a slider with an XRController, the dragging stops as soon as my laser pointer moves outside the bounds of the handle. Is there an option to keep dragging when you move outside? That would be more consistent with how sliders behave with mouse/touch interaction.
     
    skrubbeltrang likes this.
  29. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    182
It *should* be consistent with how mouse and touch behave, as I use the same code for all three; the XR pointer just uses 3D world space instead of 2D screen space. Can you report a bug using the Unity bug tracker? That will make sure it goes on my stack.
     
  30. Aaron-Meyers

    Aaron-Meyers

    Joined:
    Dec 8, 2009
    Posts:
    305
    I reported it. Unfortunately I don't have time to put together a project to upload for reproducing it, but it's pretty straightforward. Thanks!
     
    StayTalm_Unity likes this.
  31. javierbullrich

    javierbullrich

    Joined:
    Jan 31, 2020
    Posts:
    6
Is there any plan to have a simulator for the editor, like VRTK has?

Having to use the headset every single time slows down development.

That's the only thing stopping us from jumping from VRTK to the XR Toolkit.
     
  32. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    182
    @javierbullrich
One of our 'soon' roadmap items is to adopt the Input System package as our input backend: Actions, ActionMaps, etc.

This comes with the ability to create and control custom devices in C#, which gives us a really low-level way to build proper mocks, and lets you customize your mocks to suit very specific needs (see the sketch below).

So, YES! There is a plan :)
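As a rough taste of what that could look like (a sketch only: the MockController/MockControllerState names are made up, and the exact attribute and callback signatures vary between Input System package versions):

Code (CSharp):
using UnityEngine.InputSystem;
using UnityEngine.InputSystem.Controls;
using UnityEngine.InputSystem.Layouts;
using UnityEngine.InputSystem.LowLevel;
using UnityEngine.InputSystem.Utilities;

// Memory layout of the mock device's state.
public struct MockControllerState : IInputStateTypeInfo
{
    public FourCC format => new FourCC('M', 'O', 'C', 'K');

    [InputControl(layout = "Button")]
    public float trigger;
}

[InputControlLayout(stateType = typeof(MockControllerState))]
public class MockController : InputDevice
{
    public ButtonControl trigger { get; private set; }

    protected override void FinishSetup()
    {
        base.FinishSetup();
        trigger = GetChildControl<ButtonControl>("trigger");
    }
}

// In test setup:
//   InputSystem.RegisterLayout<MockController>();
//   var mock = InputSystem.AddDevice<MockController>();
//   InputSystem.QueueStateEvent(mock, new MockControllerState { trigger = 1f });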
     
  33. koebrugge

    koebrugge

    Joined:
    Dec 5, 2019
    Posts:
    4
I really like this new feature!! I have to create a VR experience of an operating room, and I was totally new to both VR and Unity! This made it much easier for me; I now have all the functionality I need without having to wrestle with all the different SDKs!
     
    StayTalm_Unity likes this.
  34. Corysia

    Corysia

    Joined:
    Mar 14, 2018
    Posts:
    108
Just a side-note here: the EditorXR group is working on turning it into a package, too. It will make it easier to "edit VR in VR". I don't know how compatible it will be with XR right away -- hopefully it will be?
     
    harleydk likes this.
  35. Corysia

    Corysia

    Joined:
    Mar 14, 2018
    Posts:
    108
    So I'm left with a choice. I can either do:
    • XR Interaction Toolkit preview - 0.9.3
    • XR Legacy Input Helpers 1.3.8
    OR
    • XR Interaction Toolkit preview - 0.9.2
    • XR Legacy Input Helpers 2.0.6
    Which is the right combination?
     
  36. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    182
@Matt_D_work did an upgrade recently for both packages and will let you know. He's currently under the weather today, so hold tight :).

I'd personally suggest the first combination for now, especially if you want the few improvements we've made.
     
  37. koebrugge

    koebrugge

    Joined:
    Dec 5, 2019
    Posts:
    4

I would like to know this as well, please.
     
    harleydk likes this.
  38. franadoriv

    franadoriv

    Joined:
    Oct 5, 2015
    Posts:
    5
How can I force or simulate a button press for in-editor testing?

Hacking the package to make UpdateInteractionType and InteractionTypes public does the job, but I don't want to modify the original libraries.

I need a legal way to execute this, please:
XRController(instance).UpdateInteractionType(XRController.InteractionTypes.select, true);
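The closest I've found to a "legal" way without editing the package is reflection; a sketch like the one below keeps the package source untouched, though it assumes the method and enum exist exactly as named above and will break whenever the package changes:

Code (CSharp):
using System;
using System.Reflection;
using UnityEngine.XR.Interaction.Toolkit;

public static class XRControllerTestHelper
{
    // Editor/test-only: invokes the internal UpdateInteractionType via reflection.
    public static void SimulateSelect(XRController controller, bool pressed)
    {
        var type = typeof(XRController);
        var method = type.GetMethod("UpdateInteractionType",
            BindingFlags.Instance | BindingFlags.NonPublic);
        var interactionTypes = type.GetNestedType("InteractionTypes",
            BindingFlags.Public | BindingFlags.NonPublic);
        var select = Enum.Parse(interactionTypes, "select");
        method.Invoke(controller, new object[] { select, pressed });
    }
}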
     
  39. alexchesser

    alexchesser

    Joined:
    Sep 15, 2017
    Posts:
    147
You're in trouble with 2020.1 in more ways than one! Last night I tried setting up an XR Interaction Toolkit build, and there were compile errors in the package out of the box. I'm actually poking through the thread here to see whether it's a known issue, whether it's fixable, or whether I just have to roll my own XR interactions in 2020.1.
     
  40. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    182
  41. Thomukas1

    Thomukas1

    Joined:
    Sep 29, 2014
    Posts:
    31
Hi, I'm not sure if this is the right place to ask, but how do I make an object stick to the hand in exactly the place I want? I have created an "attachpoint" object on the hand, at 0,0,0, and it does kinda work, but my object ends up about a metre away from the hand. What values do I need to tweak to make my object stick in the right place? Thanks
     
  42. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    182
    @Thomukas1
It should take the transform of the GameObject with the Interactable MonoBehaviour and set its position/rotation to match the hand's Attach Point.

It's hard to say without seeing the hierarchy of both the hand and the grabbed object, but I suspect you need to move the visual representation of your object closer to the local origin of the GameObject with the Interactable MonoBehaviour.

    Does that work?
     
  43. Thomukas1

    Thomukas1

    Joined:
    Sep 29, 2014
    Posts:
    31
Right, but the object I'm grabbing has a visual representation on it. Here you can see the hierarchy: Screenshot_2.png
I want to grab HandheldCamera. It has this XRGrabInteractable component: Screenshot_3.png
And this is the attach point on the hand interactor: Screenshot_2.png
     

    Attached Files:

  44. alexchesser

    alexchesser

    Joined:
    Sep 15, 2017
    Posts:
    147
    Hey StayTalm!

    The Error is:

Code (CSharp):
Library\PackageCache\com.unity.xr.interaction.toolkit@0.9.3-preview\Runtime\Interaction\Controllers\XRController.cs(283,35): error CS1061: 'BasePoseProvider' does not contain a definition for 'TryGetPoseFromProvider' and no accessible extension method 'TryGetPoseFromProvider' accepting a first argument of type 'BasePoseProvider' could be found (are you missing a using directive or an assembly reference?)
    with the package list:

Code (JavaScript):
{
  "dependencies": {
    "com.havok.physics": "0.1.2-preview",
    "com.unity.collab-proxy": "1.3.5",
    "com.unity.entities": "0.5.1-preview.11",
    "com.unity.ide.visualstudio": "2.0.0",
    "com.unity.ide.vscode": "1.1.4",
    "com.unity.physics": "0.2.5-preview.1",
    "com.unity.test-framework": "1.1.9",
    "com.unity.textmeshpro": "3.0.0-preview.3",
    "com.unity.timeline": "1.3.0-preview.6",
    "com.unity.ugui": "1.0.0",
    "com.unity.xr.interaction.toolkit": "0.9.3-preview",
    "com.unity.xr.legacyinputhelpers": "2.0.6",
    "com.unity.xr.management": "3.0.5",
    "com.unity.xr.oculus": "1.1.5",
    "com.unity.modules.ai": "1.0.0",
    "com.unity.modules.androidjni": "1.0.0",
    "com.unity.modules.animation": "1.0.0",
    "com.unity.modules.assetbundle": "1.0.0",
    "com.unity.modules.audio": "1.0.0",
    "com.unity.modules.cloth": "1.0.0",
    "com.unity.modules.director": "1.0.0",
    "com.unity.modules.imageconversion": "1.0.0",
    "com.unity.modules.imgui": "1.0.0",
    "com.unity.modules.jsonserialize": "1.0.0",
    "com.unity.modules.particlesystem": "1.0.0",
    "com.unity.modules.physics": "1.0.0",
    "com.unity.modules.physics2d": "1.0.0",
    "com.unity.modules.screencapture": "1.0.0",
    "com.unity.modules.terrain": "1.0.0",
    "com.unity.modules.terrainphysics": "1.0.0",
    "com.unity.modules.tilemap": "1.0.0",
    "com.unity.modules.ui": "1.0.0",
    "com.unity.modules.uielements": "1.0.0",
    "com.unity.modules.umbra": "1.0.0",
    "com.unity.modules.unityanalytics": "1.0.0",
    "com.unity.modules.unitywebrequest": "1.0.0",
    "com.unity.modules.unitywebrequestassetbundle": "1.0.0",
    "com.unity.modules.unitywebrequestaudio": "1.0.0",
    "com.unity.modules.unitywebrequesttexture": "1.0.0",
    "com.unity.modules.unitywebrequestwww": "1.0.0",
    "com.unity.modules.vehicles": "1.0.0",
    "com.unity.modules.video": "1.0.0",
    "com.unity.modules.vr": "1.0.0",
    "com.unity.modules.wind": "1.0.0",
    "com.unity.modules.xr": "1.0.0"
  }
}
Uninstalling the XR Interaction Toolkit gets rid of the error. Unity version 2020.1.0a21.

I wasn't sure if it was me doing something wrong, so I wasn't going to open a ticket until I knew whether it was my fault or not, especially seeing as I am basically elbows-deep in alpha and experimental features.

    But if that's not something you've seen, I'll hit the report button and send it along.
     
  45. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,454
Revert to Legacy Input Helpers version 1.3.8 (I think that's what it's called); the 2.0.x version is actually an older version.

@Matt_D_work will anyone fix that incorrect version of the package, or is there a reason for it? A lot of people hit that issue with the Legacy Input Helpers versions on first setup when I tell them to use XRI.
     
    Last edited: Feb 6, 2020
  46. alexchesser

    alexchesser

    Joined:
    Sep 15, 2017
    Posts:
    147
    You're like the MVP around here @ROBYER1 :) thank you!

    (can confirm that fixed it)
     
    Last edited: Feb 6, 2020
    kavanavak, linojon and ROBYER1 like this.
  47. StayTalm_Unity

    StayTalm_Unity

    Unity Technologies

    Joined:
    May 3, 2017
    Posts:
    182
    @ROBYER1 is right!
    I know he is trying to get that fix published.

    Sorry about that, it's a nasty hiccup and the solution is not immediately clear.
     
    Corysia and createtheimaginable like this.
  48. linojon

    linojon

    Joined:
    Aug 25, 2014
    Posts:
    118
Hi, this may be a basic question, but I haven't been able to find an answer. A World Space canvas requires an Event Camera (if one isn't explicitly assigned, we get a warning that Unity will use the slow Camera.main call at runtime). Presumably, in conventional apps this is used to project screen touches or mouse-click positions from the screen viewport into world space.

But in XR, not only are there no screen-space canvases, there is also no screen-space input. And in XRI, the world canvas has a Tracked Device Graphic Raycaster for detecting when an interactor is interacting with a UI element. So my question is: when, if ever, would the camera reference be used? When would there be a raycast from the camera for UI detection in XR? And if not, can we ignore the Event Camera warning and leave it unassigned? I realize it's not a big deal to assign it, so this is just for my understanding. Thanks.
     
  49. mgear

    mgear

    Joined:
    Aug 3, 2010
    Posts:
    9,411
Checking for any news about SteamVR: can this later be used to build SteamVR apps?

**OK, just saw this, so I guess it's coming in some form here too:

    https://forum.unity.com/threads/vr-...deprecated-in-unity-2020.785369/#post-5468898
     
    Last edited: Feb 11, 2020
    jashan likes this.
  50. jashan

    jashan

    Joined:
    Mar 9, 2007
    Posts:
    3,307
Apologies for hijacking this comment. You were probably referring to UIs that are displayed in VR/AR, but as Unity seems to assume that VR projects only ever display to VR (and that is a very wrong assumption), I wanted to chime in:

There are quite a few use cases where VR projects will also render to the flatscreen, and may even have full flatscreen UIs. In fact, we even let players push the whole UI from VR to the flatscreen, or pull the UI from VR to the flatscreen using a flatscreen UI. In our case this works by changing the canvas from world space to screen overlay on the fly, and it works fine.

But we also have permanent screen-overlay UIs, and I would really appreciate it if Unity didn't try to be smart and warn me about that. Of course, any "UI stuff", e.g. in the interaction system, that assumes the UI will only ever be displayed in VR and breaks when that same UI moves to the flatscreen would be a dealbreaker for us.
     