
What's the plan with VR input?

Discussion in 'AR/VR (XR) Discussion' started by greggtwep16, Aug 1, 2017.

  1. greggtwep16

    greggtwep16

    Joined:
    Aug 17, 2012
    Posts:
    1,546
    @joejo @runevision

    Hey unity folks,

    I was wondering what the current plan for VR/XR input is within the Unity API. I've noticed a decent amount of additions to VR InputTracking (XR InputTracking in the 2017.2 beta), but not enough to realistically consider dropping the various SDKs (Oculus, Google VR, Steam VR, etc.) when it comes to input. Furthermore, VR input is still changing pretty rapidly (Steam Knuckles in the fall, perhaps a Samsung ring, and many others). Couple that with the next experimental build of the new input system, and this can obviously play out a number of different ways. At least until now, VR's InputTracking seems separate from the rest because of its coupling to rendering for VR, and it feels like that's unlikely to change with the new input system (plus various VR companies seem to be going out of their way to do things non-standard, like using custom Bluetooth pairing so their controllers don't work for competitors, which probably means Unity won't recognize their input without the vendor SDK in your Unity project in addition to the native DLL).

    That being said, while the position, rotation, fingers touching, etc. aren't typical in non-VR input, the buttons on the controllers do seem to be piped into the more standard Unity Input, and it doesn't appear these are going to show up in VR InputTracking, so you're left with the dual nature of using both sets of APIs?
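    That dual usage looks roughly like the sketch below in 2017-era Unity. This is illustrative only: "LeftTrigger" is an assumed axis name that would have to be defined in the Input Manager with the platform-specific axis number, which is exactly the pain point being described.

```csharp
using UnityEngine;
using UnityEngine.XR; // UnityEngine.VR prior to 2017.2

// Sketch of the "dual API" situation: pose data comes from the
// cross-platform InputTracking class, while button/axis data goes
// through the classic Input class, mapped by axis number per platform.
public class DualApiExample : MonoBehaviour
{
    void Update()
    {
        // Cross-platform tracking: works on all supported XR devices.
        Vector3 handPos = InputTracking.GetLocalPosition(XRNode.LeftHand);
        Quaternion handRot = InputTracking.GetLocalRotation(XRNode.LeftHand);

        // Buttons/axes: legacy Input, with a per-platform Input Manager
        // entry behind the assumed "LeftTrigger" name.
        float trigger = Input.GetAxis("LeftTrigger");
        if (trigger > 0.5f)
            Debug.Log("Left trigger pulled at " + handPos);
    }
}
```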

    What is the overall high-level plan? I've created assets for numerous specific SDKs, but obviously that is a pain and creates an additional point of failure. Right now only basic things can be done through a Unity API that would work on all 6 main VR platforms (Vive, Rift, PSVR, Gear VR, Daydream, and Cardboard). It seems silly to have to integrate similar things into 4 separate SDKs all doing essentially the same stuff differently.
     
    guru20 likes this.
  2. linojon

    linojon

    Joined:
    Aug 25, 2014
    Posts:
    118
    I'm a fan of VRTK
     
    Alverik likes this.
  3. jonlundy

    jonlundy

    Joined:
    Apr 15, 2016
    Posts:
    25
    I started writing my own input manager and quickly realized that I was not going to exceed the functionality of VRTK. It has so much functionality and is being used and tested by so many people that I abandoned my own efforts and started working with it.
     
  4. scvnathan

    scvnathan

    Joined:
    Apr 29, 2016
    Posts:
    75
    Alverik and greggtwep16 like this.
  5. greggtwep16

    greggtwep16

    Joined:
    Aug 17, 2012
    Posts:
    1,546
    VRTK is good, but it really only supports 2 of the 6 platforms I mentioned (Rift/Vive), not PSVR, Gear VR, Daydream (it has experimental support, but it doesn't work), or Cardboard. For the Rift/Vive what it has is good, but even there the downside to the SDK approach is that things get majorly broken from time to time when the maker of the SDK (Oculus, Steam, Google, Sony) decides to switch things up on the C# side, which unfortunately they have done and probably will continue to do. I'm sure even the VRTK author himself would love to get rid of this additional point of failure, and would do so if Unity had a VR input API that was good enough.

    I'm trying to gauge what Unity's high level plans are.
     
    Alverik likes this.
  6. greggtwep16

    greggtwep16

    Joined:
    Aug 17, 2012
    Posts:
    1,546
    One of the bullet points in there is cross-platform controller input, and my questions relate to what that will actually be. Does this coincide with the new input system, or is it an extension of what we see with XR.InputTracking? Also, since I have quite a bit of experience in the area, is there a way to become an alpha tester instead of waiting for the beta? I have participated in many alphas in the past and am active in submitting bug reports. I just would need to know who to ask.

    @joejo @runevision any idea who's heading that up and who to ask?
     
  7. guru20

    guru20

    Joined:
    Jul 30, 2013
    Posts:
    239
    Agreed... the reality is that VR is still such a burgeoning field, and combine that with the fact that it is pretty fragmented (as far as platforms go), and any viable VR project is going to need to be cross-platform to have any sort of ROI / commercial viability...

    Example: There have been over 80 million Xboxes sold, over 60 million PS4s... countless PCs and mobile devices...
    Now, when we look at VR units, it goes something like this:
    Cardboard - ~ 10 million?
    Gear VR - 5 million
    PSVR - ~1 million
    Vive - 0.5 million
    Rift - 0.3 million

    Even though right now I believe the best experiences are on Vive and Rift, clearly the market demand dictates we not develop solely for those platforms... (I can't make the same games for mobile as I can for a powerful VR-ready PC, but I should be able to easily port to PSVR.) We really need a generic input wrapper that can then easily tie into the individual APIs on the backend...

    Re: VRTK -- everybody loves it, and I agree it is pretty cool to have as a "swiss-army knife", but the reality is that, even though it does a lot of different things, it doesn't do any of them perfectly or even all that great... I've found that the features of other inexpensive arc-teleport assets make them preferable, and many of the VRTK features are much better implemented by the Vive plugin (however, even that makes input difficult... it has a new ViveInput system which is completely different from the old framework, thus eliminating cross-compatibility with things like VRTK...)
     
    Alverik likes this.
  8. greggtwep16

    greggtwep16

    Joined:
    Aug 17, 2012
    Posts:
    1,546
    To me this is the biggest problem. In addition to the Steam VR SDK doing this, the Google VR SDK also had a major update that broke backwards compatibility a few months back. It's hard enough doing cross-platform support in general, but breaking changes in SDKs are a killer. I'm sure even the author of VRTK was swearing after that happened.

    Having 4 different SDKs that do similar things differently is not ideal, and while XR.InputTracking is a good start, it isn't a low-level wrapper over everything (it can't do buttons, haptic feedback, touchpads, recentering controllers, etc.), forcing the need for the 4 C# SDKs. I'd really love to be involved in an alpha to provide feedback on the XR toolkit's input portion.
     
  9. scvnathan

    scvnathan

    Joined:
    Apr 29, 2016
    Posts:
    75
  10. greggtwep16

    greggtwep16

    Joined:
    Aug 17, 2012
    Posts:
    1,546
    Those files don't seem like the repository for this. For starters, it's using the old prototype that was built on top of the old input system. None of the new stuff that is supposed to have an experimental build when 2017.2 comes out is in there. Also, the EditorVR stuff only seems to have implementations for Rift/Vive, which makes sense for the editor but wouldn't make sense for cross-platform XR at runtime (it doesn't have anything for PSVR, Gear VR, Daydream, Cardboard, etc.).

    It was interesting to scan through, though.
     
  11. greggtwep16

    greggtwep16

    Joined:
    Aug 17, 2012
    Posts:
    1,546
    @thep3000 @Rene-Damm @jj-unity @amirebrahimi_unity @unity_andrewm @JeffDUnity3D @r3dthrawn

    Perhaps one of you would know, or know who would be in the best position to ask, regarding input and VR at a high level going forward? I'd certainly love to start ditching the 4 separate SDKs for the main 6 VR platforms. I love the additions in 2017.1 and the 2017.2 beta, but there seem to be a few things that prevent removing the vendor C#-side SDKs and using Unity APIs exclusively.
     
    Last edited: Aug 3, 2017
  12. Brad-Newman

    Brad-Newman

    Joined:
    Feb 7, 2013
    Posts:
    185
    Very much looking forward to any news on this front / the XR Foundation Toolkit!
     
    Alverik likes this.
  13. virror

    virror

    Joined:
    Feb 3, 2012
    Posts:
    2,963
    VRTK actually supports all those platforms and more as well.
    PSVR is a bit special because of NDAs and such, so VRTK can't really integrate with it publicly, but there are integrations available if you are a PSVR dev, and not just from official sources.
     
  14. greggtwep16

    greggtwep16

    Joined:
    Aug 17, 2012
    Posts:
    1,546
    It doesn't support all the major platforms.

    https://github.com/thestonefox/VRTK

    PSVR is special, so I could see that only being mentioned on the private forums; I can't comment either way. However, it certainly doesn't support all the major VR platforms, especially on the mobile side, and the ones marked as experimental, like Daydream, don't use the current Google SDK and won't function without errors unless you use older versions.
     
  15. virror

    virror

    Joined:
    Feb 3, 2012
    Posts:
    2,963
    That list is only kind of correct.
    Gear VR works out of the box just using the Rift SDK.
    Daydream seems to be a bit broken atm, yeah. You are right about that. But feel free to make a PR for it; it's a community project after all : )
    WMR is in the works atm and will also be supported shortly.
     
    Alverik likes this.
  16. Akshara

    Akshara

    Joined:
    Apr 9, 2014
    Posts:
    100
    I too would like to hear something regarding the XR Foundation Toolkit, along with the status of Editor VR.
     
    Alverik likes this.
  17. SiliconDroid

    SiliconDroid

    Joined:
    Feb 20, 2017
    Posts:
    302
  18. greggtwep16

    greggtwep16

    Joined:
    Aug 17, 2012
    Posts:
    1,546
    If you're talking about eventually, sure, but if you're talking about today, with the InputTracking class on the 6 major VR platforms you certainly don't have a Unity API base that you can do everything on top of. They are making progress (2017.1, 2017.2, and 2017.3 have certainly added fields to the InputTracking API), but there are still pretty basic things you need to reach into the platform-specific SDK for with input. I'd love to think by 2018.2 we'd get there, but I'm a bit discouraged by Unity recently canning their latest input rewrite attempt. I have my fingers crossed for next summer, but if I were a guessing man it will probably take longer.

    That's also assuming new schemes like the Valve knuckles or something else doesn't take off.
     
  19. SiliconDroid

    SiliconDroid

    Joined:
    Feb 20, 2017
    Posts:
    302
    Yup, I realise (all too well) we as devs have to juggle with multiple SDKs/APIs to be compatible with various platforms.

    The main reason for the holdup is probably not throwing enough man-hours at it (understandably, as VR/XR devs account for only a small percentage of users, I assume).

    The total man-hours required are relatively low, though, as the problem space is small, certainly for just input devices:

    The Valve Knuckles, or any other potential future input device, will be composed of some set of these standard input components:
    • Vector3 Tracking Rot
    • Vector3 Tracking Pos
    • Vector2 Joysticks (or touchpads)
    • float signal providers (triggers, finger sensors)
    • float battery level
    • bool digital buttons (Dpads etc)
    An API with getters for these low-level components would give complete coverage and be future-proof. Under the hood: a C++ base class, with each platform interface inheriting from it and implementing some set of override functions. Platform SDK changes would just require small changes in each derived class; all the Unity-side glue is in the base class.
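    The base-class idea above could be sketched as follows, in C# rather than C++ for brevity; all type and member names here are invented for illustration.

```csharp
// Each platform backend derives from this and hides its vendor SDK,
// so an SDK change touches only one derived class.
public abstract class XRControllerBase
{
    public abstract float[] GetPosition();      // x, y, z
    public abstract float[] GetRotation();      // Euler angles
    public abstract float GetTrigger();         // 0..1
    public abstract bool GetButton(int index);  // digital buttons, D-pads
    public abstract float GetBatteryLevel();    // 0..1
}

// Example derived class standing in for one platform's SDK glue.
public class StubController : XRControllerBase
{
    public override float[] GetPosition() => new[] { 0f, 1.2f, 0f };
    public override float[] GetRotation() => new[] { 0f, 90f, 0f };
    public override float GetTrigger() => 0.8f;
    public override bool GetButton(int index) => index == 0;
    public override float GetBatteryLevel() => 0.5f;
}
```

    Game code would only ever see XRControllerBase, never the concrete platform class.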

    Any further abstraction ought to be built in Unity IDE scope, as C# scripts/prefabs.

    Maybe I'm missing something here? (probably:confused:)
     
    Last edited: Nov 4, 2017
    Alverik likes this.
  20. greggtwep16

    greggtwep16

    Joined:
    Aug 17, 2012
    Posts:
    1,546
    You have to remember that Unity has stated many times that they are redoing all of input, not just XR. Even non-XR input for normal game controllers cross-platform has been really bad for many years. Unity canning their latest attempt after a few years and starting from scratch does indicate that they're struggling.

    It's not that the underlying data structures for these types are something we haven't seen before (bool (button), float (axis), Vector3 (position), Quaternion (rotation)); it's that the Unity API has no consistent way to abstract all of these and make them the same. Unity doesn't even do this well for gamepads currently. You don't have an abstraction for "left trigger"; you have to look up in nonexistent docs what "axis number" the left trigger is, and then you find out that even for the same controller manufacturer it's a different number depending on whether it's on Windows, Android, etc. You don't even have a way to bind after the app is started and say something like leftTrigger.bind(axis7); essentially the only way to even do this is to not use Unity Input and use Rewired instead.
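    The kind of runtime binding described above could look something like this minimal sketch; the class and the control name are hypothetical, not part of any Unity API.

```csharp
using System.Collections.Generic;

// A logical control name ("LeftTrigger") mapped to a platform-specific
// axis number, rebindable after startup -- the leftTrigger.bind(axis7)
// idea from the post, generalized to a lookup table.
public class ControlBindings
{
    private readonly Dictionary<string, int> _axisByControl =
        new Dictionary<string, int>();

    // Rebind at any time, e.g. when a different controller connects.
    public void Bind(string control, int axisNumber)
    {
        _axisByControl[control] = axisNumber;
    }

    // Game code resolves the logical name, never hard-coding numbers.
    public int Resolve(string control)
    {
        return _axisByControl[control];
    }
}
```

    At startup a per-platform table (Windows vs. Android, Oculus vs. OpenVR) would fill in the bindings, and gameplay code would stay identical across platforms.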

    On top of that, most XR manufacturers are not following USB/Bluetooth standards, so that their controllers don't work on their competitors' platforms (especially on mobile). This makes using the native side of their SDK mandatory for Unity (this is not the same as the Utility packages you see in your project; it's the .dll or .aar that is included in your build). Unity would have to consistently abstract this into their API, and they have not. If you compare the pages below, it's the same story: different button/axis numbers and no API to rebind. And this is before we start to see mainstream use of other platforms like OSX as well.

    https://docs.unity3d.com/Manual/OculusControllers.html
    https://docs.unity3d.com/Manual/OpenVRControllers.html


    Being able to write one line of code in a Unity API for pulling the trigger on the left-hand controller and have it work on Vive, Rift, PSVR, Daydream, and Gear VR is a ways off. That's not even getting into more advanced things like resetting the controller or the boundary.
     
    Alverik likes this.
  21. SiliconDroid

    SiliconDroid

    Joined:
    Feb 20, 2017
    Posts:
    302
    I think that's the crux of the problem, and thus it would be best to build this abstraction on the Unity user side. The free market would soon produce all the required easy-to-use abstractions; several systems might then compete, and over time the best solution would evolve.

    Controllers being reset would be simple to handle. Boundary redefinition: yes, Unity would probably have to abstract that, but how hard can it be:

    Code (CSharp):
    bool bYes = XR.IsBoundaryNew();
    List<Vector2> lvBoundary = XR.GetBoundary();
     
  22. Matt_D_work

    Matt_D_work

    Unity Technologies

    Joined:
    Nov 30, 2016
    Posts:
    202
    I can confirm that we're actively working on cross platform input and interaction for XR. Can't say anything re: dates or final features, as we're still very much in development.

    It is a problem we're committed to solving, and we hope we'll be able to talk/show more in the future.

    In the meantime, make sure you check out the Tracked Pose Driver as it will help simplify some cross platform tracking setups.
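    For reference, the Tracked Pose Driver mentioned above can be added in the Inspector or wired up from script, roughly as sketched below. The namespace and enum names follow the 2017.2-era component and may differ in other versions.

```csharp
using UnityEngine;
using UnityEngine.SpatialTracking;

// Attach to the GameObject that should follow the left controller;
// the Tracked Pose Driver then updates its transform each frame.
public class LeftHandRig : MonoBehaviour
{
    void Start()
    {
        var driver = gameObject.AddComponent<TrackedPoseDriver>();
        driver.SetPoseSource(TrackedPoseDriver.DeviceType.GenericXRController,
                             TrackedPoseDriver.TrackedPose.LeftPose);
    }
}
```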
     
  23. greggtwep16

    greggtwep16

    Joined:
    Aug 17, 2012
    Posts:
    1,546
    Implementing abstractions over different but similar interfaces is rarely about technical challenge. It's about coming up with a good general base to translate the concrete implementations (Rift, Vive, Google, Sony, Gear VR, etc.) to. Unity still doesn't have a good abstraction on top of normal gamepad input 10+ years later, so much so that most serious games use Rewired. It was also disappointing to hear that their latest attempt, which was 1-2 years in the making, was recently canned.

    That being said, hopefully we'll be surprised next year; only time will tell.
     
    SiliconDroid likes this.
  24. SiliconDroid

    SiliconDroid

    Joined:
    Feb 20, 2017
    Posts:
    302
    A common superset base class for general input (XR included) is just some keystrokes away; just the right combo of keystrokes is all. :D

    I'm new to Unity; this is my first game built with it (for Gear VR/Daydream), so I'm somewhat ignorant of the existing issues with general input and standard game controllers.

    So I just checked out Rewired and all its 5-star reviews lamenting "this is how Unity input should be". Hmmm, I'm surprised general input has not been nailed down yet. It's a shame Unity can't hire the main Rewired dev; he seems to have done a good job. A good, easy general input API is kind of a core requirement for a multiplatform engine.

    I suppose that kind of outsourcing is what has been done with TextMeshPro (which I love) and the new GUI system (which I have yet to explore).

    Yes let us hope so.
     
    Last edited: Nov 9, 2017
  25. kavanavak

    kavanavak

    Joined:
    Sep 30, 2015
    Posts:
    54
    JoeStrout likes this.
  26. Krxtopher

    Krxtopher

    Joined:
    Nov 30, 2013
    Posts:
    4
    I second the request for an update on this front. I'm just digging back into Unity after a few years away using Unreal Engine. Unreal Engine has had nice input abstractions for XR controllers for years now. I didn't realize how spoiled (and how productive) I was until I tried to do some simple cross-VR-device input in Unity. Using a 3rd-party abstraction solution from the Asset Store isn't an option for me, because I'm creating my own assets to share on the Asset Store and therefore can't be dependent on other licensed assets.
     
    ROBYER1 likes this.