Version 0.3 of EditorXR comes with Editor/libs/input-prototype. Is that an old version of the new Input System that is now available as a Unity package via UPM, or something else? (For comparison, the first snippet at the end of this post shows what reading controller input looks like through the UPM Input System.) Also, how do EditorXR and the XR Interaction Toolkit (also from UPM) interact? There seem to be quite a few areas where both do the same thing, or similar things, but differently. The recent blog post says that EditorXR will use customized controller models and interaction settings if they were set up with the XR Interaction Toolkit ... but how exactly does that work?

I have just spent a little while with EditorXR in Unity 2019.2.17, using Valve Index controllers (which weren't supported and showed up as Vive wands instead), Vive wands, and Oculus Touch controllers (with the original Rift). To be honest, I have very mixed feelings. On one hand, it looks like this might help a lot with a project I'm currently working on, which is a content editor. On the other hand, things like Single Pass rendering being broken (in 2020, when I believe most people have completely abandoned Multi Pass ;-) ), the lack of support for all relevant controller types (Valve Index, the various Windows MR controllers, the new Touch controllers, PS Move ;-) ), and some very unusual UX conventions ("Blink" instead of "Teleport", putting it on a button instead of touchpad / joystick forward as is usual, and putting the spatial menu where teleport usually is) scare me a little.

In my current project, I use SteamVR Input 2.0, which is awesome, plus some hacky fallbacks for native Oculus and PSVR support (roughly the shape sketched in the second snippet below). My main concern is having to deal with yet another input abstraction that does the same as Valve's, just a little differently (and maybe not quite as production-ready, but also larger in scope).
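For reference, here's what I mean by "the new Input System": a minimal sketch of reading XR controller state through com.unity.inputsystem. I'm assuming the generic XRController layout and the "primaryButton" control name here; whether input-prototype is an ancestor of any of this is exactly my question.

```csharp
using UnityEngine;
using UnityEngine.InputSystem.Controls;
using UnityEngine.InputSystem.XR;

// Minimal probe using the UPM Input System (com.unity.inputsystem).
// Assumes the device layout exposes a "primaryButton" control; whether
// it does depends on the actual controller layout at runtime.
public class NewInputSystemProbe : MonoBehaviour
{
    void Update()
    {
        var right = XRController.rightHand;
        if (right == null)
            return; // no right-hand controller connected

        var primary = right.TryGetChildControl<ButtonControl>("primaryButton");
        if (primary != null && primary.wasPressedThisFrame)
            Debug.Log("Primary button pressed on right controller");
    }
}
```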
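And here, very roughly, is the shape of the abstraction I already maintain on top of SteamVR Input 2.0 and native Oculus. It's trimmed down with made-up names (IXRInput, the "Teleport" action), so take it as a sketch of the pattern, not my actual code; the SteamVR side assumes a boolean "Teleport" action is defined in the action manifest. Having to wrap yet another layer like this, shaped just slightly differently, is what I'd like to avoid.

```csharp
using UnityEngine;
using Valve.VR; // SteamVR Unity Plugin 2.x

// Hypothetical per-backend interface; names are made up for this post.
public interface IXRInput
{
    bool TeleportPressed(bool leftHand);
}

// SteamVR Input 2.0 backend: actions come from the action manifest.
public class SteamVRInputBackend : IXRInput
{
    // Assumes a boolean action named "Teleport" exists in the manifest.
    readonly SteamVR_Action_Boolean teleport =
        SteamVR_Input.GetBooleanAction("Teleport");

    public bool TeleportPressed(bool leftHand)
    {
        var source = leftHand ? SteamVR_Input_Sources.LeftHand
                              : SteamVR_Input_Sources.RightHand;
        return teleport.GetStateDown(source);
    }
}

// Hacky native fallback via OVRInput (Oculus Integration).
public class OculusInputBackend : IXRInput
{
    public bool TeleportPressed(bool leftHand)
    {
        var controller = leftHand ? OVRInput.Controller.LTouch
                                  : OVRInput.Controller.RTouch;
        return OVRInput.GetDown(OVRInput.Button.PrimaryThumbstick, controller);
    }
}
```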