EditorXR and (new) Input System / input-prototype / XR Interaction Toolkit?

Discussion in 'EditorXR' started by jashan, Jan 17, 2020.

  1. jashan

    Joined:
    Mar 9, 2007
    Posts:
    3,307
    Version 0.3 of EditorXR comes with Editor/libs/input-prototype. Is that an old version of the new Input System that is now available as a Unity package via UPM? Or something else?

    Also, how do EditorXR and the XR Interaction Toolkit (also from UPM) interact? It seems there are quite a few areas where both do the same thing, or similar things, but differently. In the recent blog posting it says that EditorXR will use customized controller models and interaction settings when they were set up using the XR Interaction Toolkit ... but how exactly does this work?

    I have just spent a little while with EditorXR in Unity 2019.2.17, using Valve Index Controllers (weren't supported and showed Vive wands instead), Vive wands and Oculus Touch controllers (with the original Rift).

    To be honest, I have very mixed feelings: on one hand, it does look like this might help a lot with a project I'm currently working on, which is a content editor. On the other hand, things like Single Pass rendering being broken (in 2020, where I believe most people have completely abandoned Multipass ;-) ), the lack of support for all relevant controller types (Valve Index, the various Windows MR controllers, the new Touch controllers, PS Move ;-) ), and some very unusual UX conventions ("Blink" instead of "Teleport", putting it on a button instead of touchpad / joystick forward as is usual, and putting the spatial menu where teleport usually is) do scare me a little.

    In this current project, I use SteamVR Input 2.0, which is awesome, plus some hacky fallbacks for native Oculus support and PSVR support. My main concern is having to deal with yet another input abstraction that does the same as Valve's, just a little differently (and maybe not quite as production-ready, but also larger in scope).
     
    dimib likes this.
  2. amirebrahimi_unity

    Joined:
    Aug 12, 2015
    Posts:
    400
    Yes, the input-prototype is an older version of the new Input System; that work started in parallel with EditorXR (called EditorVR at the time). One of the things we'll do at some point is update EditorXR to replace the usage of the input-prototype.

    Currently, the XR Interaction Toolkit doesn't integrate with EditorXR. It came after EditorXR, and although some people worked on both while it was in our Labs group, the project was transferred to another group before release. However, now that it is released, we are actively looking (as in, we have one dev looking at this currently) at replacing some of the functionality of EditorXR (e.g. MultiRayInputModule) with equivalent functionality from XRI.

    Thanks for sharing your concerns. I think what you're experiencing is a project that started prior to the release of the first VR headsets and is now four years old, while a lot of standards have since settled, both internally and externally. VR didn't make the splash everyone thought it would right out of the gate, so effectively not many people used EditorVR (as it was named back then). Other projects were started in our Labs group and EditorXR didn't get as much attention. I also moved on to other projects, but tried to give some attention to tasks here and there, such as removing the partner SDK dependencies. Fast-forward to today, where the Quest is a popular platform and naturally more people want to develop for it, so there has been a surge of interest in EditorXR.

    What I can say is that it is actively being developed, with current efforts to bring it to the Package Manager in Unity, and the runtime has been used in our Unity Reflect product. The framework is also core to the MARS project, if you are familiar with that. So please bear with us and work around current issues as best you can. We also happily accept PRs to our repository in case you want to contribute back.
     
    Last edited: Jan 30, 2020
    jashan likes this.
  3. jashan

    Joined:
    Mar 9, 2007
    Posts:
    3,307
    Thank you for the update, this does sound promising! I know the troubles of working on old projects (we started developing Holodance in 2015 ;-) ). I may actually dive into EditorXR and send some PRs. If I understand the architecture correctly, it should be possible to add, for example, support for SteamVR Input 2.0 without breaking support for the new Input System prototype (though it might be wise to coordinate with Unity's work on porting this to the new Input System), or the SteamVR controller attach points to EditorXR controller abstraction bridge that I mentioned somewhere. I'm aware that the former is probably an area where the actual implementation doesn't perfectly match the idea, but at least the second one should be comparatively straightforward.

    I still have to look into Unity's new (XR) input system and whether or not it's compatible with the old VR plugins. I will stay on Unity 2019.3 and the old but properly working VR integrations until OpenVR is supported in the new XR Plugins and Systems approach, with full feature parity to what we currently have. In particular, I certainly won't give up SteamVR Input 2.0 or skeletal input, that stuff is just way too good to be replaced with something else ... and I would still love to see Unity collaborating more closely with Valve on getting the good stuff that Valve does natively into Unity instead of re-inventing the wheel and making it square.

    The biggest issue for me with EditorXR, at the moment, is that Single Pass Stereo Rendering isn't supported because the project that I would integrate this into requires SPSR (not the instanced variant of it, just the "regular" one). I believe it's probably just one or two shaders that break - but my shader-fu is worse than drunken bar fighting, so that's not something I could fix easily. Also, I would primarily use EditorXR as a UI during runtime / in builds to edit proprietary content (i.e. not the scene). Scene editing is also interesting for our project but that's lower-priority / longer-term. So that recent blog posting was obviously what re-sparked my interest even though I was quite excited when EditorVR was first announced.
     
  4. amirebrahimi_unity

    Joined:
    Aug 12, 2015
    Posts:
    400
    You have a few areas where you could slot SteamVR Input 2.0 in. You could create a whole new proxy (i.e. SVR2ViveProxy or something like that) and, instead of using ViveInputToEvents (assigned to m_InputToEvents), wire it up to SteamVR Input. I don't think you'd be able to inherit from TwoHandedProxyBase, though (you may have to fork it).
    https://github.com/Unity-Technologies/EditorXR/blob/development/Scripts/Proxies/Vive/ViveProxy.cs
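
    Roughly, the proxy approach might look like this (an untested sketch following the pattern in the linked ViveProxy.cs; SteamVR2ViveProxy and SteamVR2InputToEvents are placeholder names, nothing from the repo):

    ```csharp
    // Placeholder sketch: a new proxy that assigns a SteamVR Input 2.0-backed
    // component to m_InputToEvents instead of ViveInputToEvents.
    using UnityEngine;

    sealed class SteamVR2ViveProxy : TwoHandedProxyBase // or a fork of it
    {
        public override void Awake()
        {
            // Wire the proxy to a SteamVR Input 2.0 event source rather than
            // the legacy ViveInputToEvents.
            m_InputToEvents = gameObject.AddComponent<SteamVR2InputToEvents>();
            base.Awake();
        }
    }
    ```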

    Another approach would be to hide SteamVR Input inside the ViveInputToEvents/BaseVRInputToEvents class. Essentially, you'd be feeding the input-prototype system from a different low-level system. If you go back in the revision history to when the partner SDKs were required, you can see how that was done.
    https://github.com/Unity-Technologies/EditorXR/blob/development/Scripts/Input/ViveInputToEvents.cs
    https://github.com/Unity-Technologies/EditorXR/blob/development/Scripts/Input/BaseVRInputToEvents.cs
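
    That second approach might look roughly like this (again just a placeholder sketch: the SteamVR_Actions members are generated per-project by the SteamVR Unity plugin from your action manifest, and the event-feeding side would have to mirror whatever BaseVRInputToEvents actually does internally):

    ```csharp
    // Placeholder sketch, not working code. The idea: poll SteamVR Input 2.0
    // actions each frame and feed their values to the input-prototype the
    // same way BaseVRInputToEvents feeds values from the legacy Input API.
    using UnityEngine;
    using Valve.VR;

    class SteamVR2InputToEvents : BaseVRInputToEvents
    {
        void Update()
        {
            // Action names here are examples only; yours come from your
            // SteamVR action manifest.
            bool triggerRight = SteamVR_Actions.default_InteractUI
                .GetState(SteamVR_Input_Sources.RightHand);

            // ...translate these into the button/axis events the
            // input-prototype expects (see the revision history mentioned above).
        }
    }
    ```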

    If you find that you'd have to spend more than a day or two, I'd probably punt on this, as the system is going to be replaced at some point in the future. However, I'm sharing this in the hope that it unblocks you if you must have the new SteamVR Input.

    It's good to hear what use case you're looking to use EditorXR for. The main reason SPSR isn't supported is the editor rendering approach that's used, which is a different path than Game View rendering, and we couldn't make it work with the MiniWorld rendering. However, if you don't need the MiniWorldWorkspace, that could probably be circumvented in a custom fork.
     
    jashan likes this.
  5. mikewarren

    Joined:
    Apr 21, 2014
    Posts:
    109
    Could someone clarify the relationship between the new Input System and XR Input? Is there a roadmap for a unified input system in the works?

    I've been experimenting a little with the XR Interaction Toolkit, and I like that it has some support for uGUI, which I need. But we support CAVEs and other projection displays that integrate separate (COTS) trackers and game controllers which aren't standard, well-known VR systems (Oculus, OpenVR, Windows MR, etc.). I need to figure out how to make the custom tracker/controller look like an XR InputDevice.

    Is there any way to bridge the new input system with the XR input system, if I can code the tracker interface?
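
    For what it's worth, my rough understanding is that a custom device can at least be registered with the new Input System like this (everything here is a sketch I haven't verified end-to-end; CaveTrackerState/CaveTrackerDevice are made-up names, and this feeds the new Input System, which is a separate path from the UnityEngine.XR device list):

    ```csharp
    // Unverified sketch of a custom device for the new Input System
    // (com.unity.inputsystem), standing in for a COTS CAVE tracker.
    using UnityEngine;
    using UnityEngine.InputSystem;
    using UnityEngine.InputSystem.Layouts;
    using UnityEngine.InputSystem.LowLevel;
    using UnityEngine.InputSystem.Utilities;

    // Memory layout of the state the tracker reports each update.
    public struct CaveTrackerState : IInputStateTypeInfo
    {
        public FourCC format => new FourCC('C', 'A', 'V', 'E');

        [InputControl(layout = "Vector3")]
        public Vector3 position;

        [InputControl(layout = "Quaternion")]
        public Quaternion rotation;
    }

    [InputControlLayout(stateType = typeof(CaveTrackerState))]
    public class CaveTrackerDevice : InputDevice
    {
        public static CaveTrackerDevice Create()
        {
            InputSystem.RegisterLayout<CaveTrackerDevice>();
            return InputSystem.AddDevice<CaveTrackerDevice>();
        }
    }

    // Then, whenever the tracker hardware reports a new pose:
    //   InputSystem.QueueStateEvent(device,
    //       new CaveTrackerState { position = p, rotation = q });
    ```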
     
    Shirzad likes this.
  6. Matt_D_work

    Unity Technologies

    Joined:
    Nov 30, 2016
    Posts:
    202
    hi there! :)

    The long-term plan is that we'll eventually move our tooling over to the new Input System. That's going to take us some time, though, and we have no actual release date for that.

    In the meantime, we'll be doing another drop of the XR Interaction Toolkit soon with an IXRController interface, so you can hook up whatever you want behind it (including the new Input System if you need to!).
     
    Shirzad likes this.
  7. mikewarren

    Joined:
    Apr 21, 2014
    Posts:
    109
    Hi @Matt_D_work

    Appreciate the feedback and looking forward to the XR Interaction Toolkit update. Got a rough idea when that might be?

    I thought maybe I could create a proxy XR input device, but I don't see a way to do that without going the XR Management / plugin route. Unfortunately, there's no stereo non-HMD plugin, and I'd like to stay on the managed side if I can.
     
    Shirzad likes this.
  8. Matt_D_work

    Unity Technologies

    Joined:
    Nov 30, 2016
    Posts:
    202
    As soon as I can get it out. I don't really want to give a deadline at this point :) but hopefully no more than a week or so. I'll post back here when it's out!
     
    dimib and Shirzad like this.
  9. mikewarren

    Joined:
    Apr 21, 2014
    Posts:
    109
    That's great, thanks!

    Not looking to hold anyone to a deadline. I've got lots of stuff I'm working on, so a rough schedule just helps me prioritize. It sounds like the new interface might help me out, so it's worth waiting a couple weeks and working on other stuff.

    Really, I appreciate the feedback. Thank you.
     
    Shirzad likes this.
  10. mikewarren

    Joined:
    Apr 21, 2014
    Posts:
    109
    @Matt_D_work

    Did the IXRController interface make it in yet? I've checked out 0.9.4 (07 Apr 2020) and I don't see it in there.
    Just wondering.
     
    Shirzad likes this.
  11. Matt_D_work

    Unity Technologies

    Joined:
    Nov 30, 2016
    Posts:
    202
    Not yet :) doing something a little bit different with that, hopefully out soon.
     
    Shirzad likes this.
  12. T3ddyTheGiant

    Joined:
    Aug 1, 2018
    Posts:
    11
    @Matt_D_work

    I'm also looking forward to this! Would love to have the new input system and xr-toolkit work together, or at least have a layer of abstraction (IXRController interface) to provide compatibility.

    Is there a roadmap for this anywhere?
     
    dimib and Shirzad like this.
  13. SelfishReplicator

    Joined:
    Oct 28, 2015
    Posts:
    2
    We would also appreciate this IXRController interface; we're hoping to write simulations that work in both our CAVE and HMDs. We wrote code for that in the past, but it feels like reinventing the wheel over and over again (we are building a new CAVE now). An abstraction layer would be of great value!