
[QUESTION-BEST PRACTICE] Handling Multiplatform VR Input: VRTK or not?

Discussion in 'AR/VR (XR) Discussion' started by FlamingVorpalCow, Dec 6, 2019.

  1. FlamingVorpalCow

    FlamingVorpalCow

    Joined:
    Oct 29, 2014
    Posts:
    6
    Hi! I'm interested in starting to program for VR (I have a Rift CV1 headset, and a Rift S arriving in January to replace it). The thing is, I see a lot of discussion about how to properly create a multiplatform version of a game that works on Oculus and Vive. Here's what I've gathered, and after that I'll post my actual question:
    My options are:
    1. Work in Unity using OpenVR (for SteamVR) and the Oculus SDK to publish on the Oculus app. OpenVR handles input from the HMD and controllers for both Vive and Rift.
    2. Use VRTK for interactions, which would make the game multiplatform regardless of SDK.
    3. Write my own code using the new XR systems that are in beta and will be production-ready in 2020.
    What I've read around here and/or am thinking based on recent events:
    • Using VRTK or the Oculus prefabs may be faster to deploy but harder to fully customize (some people recommend ditching them and working from scratch), something that should be easily feasible with the new XR systems.
    • Unity and Oculus just released a course using VRTK and the Oculus integration, so that should be some kind of "recommended approach" by Unity. But why do that when you're about to release a new XR system that would make this approach obsolete in 1-2 months?
    • Oculus recommending VRTK through this course makes me think it is a good approach, but I want to make sure I invest my time in the most productive way, so if this is not the best way, I'd like input on what to do. I don't want to build my game, have to rip the plugin out halfway through, and lose a lot of work.
    So, now to the actual question:
    • What is the best course of action to create a multiplatform VR game (Oculus + Vive)?
    • What tools should I concentrate on?
    • Is there anything I'm missing here?

    TL;DR: What is the best course of action to create a multiplatform VR game (Oculus + Vive)? Using SDKs/VRTK, or building from scratch on the new Unity XR systems?
     
  2. JoeStrout

    JoeStrout

    Joined:
    Jan 14, 2011
    Posts:
    9,682
    I wasn't too crazy about VRTK when I looked at it. I've been using the SDKs directly and have been perfectly happy with it. I'll probably switch to the new XR systems when they're ready, but till then, wrapping the SDKs in your own platform abstraction layer is not hard to do.
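    For reference, a minimal sketch of what such a platform abstraction layer could look like. Every name here (IVRInput, VRInputFactory, the implementations) is made up for illustration, not from any SDK, and the actual vendor calls are shown only as comments; the real bodies would wrap the Oculus/OpenVR APIs.

```csharp
using System;

// Hypothetical abstraction layer: game code depends only on this interface.
public interface IVRInput
{
    bool GetTriggerPressed();
}

// One implementation per SDK. The commented lines show where the real
// vendor calls (OVRInput, SteamVR actions) would go.
public class OculusInput : IVRInput
{
    public bool GetTriggerPressed()
    {
        // return OVRInput.Get(OVRInput.Button.PrimaryIndexTrigger);
        return true; // stubbed for illustration
    }
}

public class OpenVRInput : IVRInput
{
    public bool GetTriggerPressed()
    {
        // return triggerAction.GetState(SteamVR_Input_Sources.Any);
        return true; // stubbed for illustration
    }
}

public static class VRInputFactory
{
    // Chosen once at startup; swapping SDKs (or moving to Unity's new
    // XR input later) only touches this factory and the wrappers.
    public static IVRInput Create(string platform)
    {
        return platform == "Oculus" ? (IVRInput)new OculusInput()
                                    : new OpenVRInput();
    }
}
```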
     
  3. nomand

    nomand

    Joined:
    Dec 23, 2008
    Posts:
    44
    VRTK is all event system driven and is geared towards a no-code inspector/hierarchy based setup, which is a huuuuuge pain if all you want to do is something like OnPickup() or anything custom.
    I did some testing, and using VRTK for a basic vanilla locomotion setup showed a very significant fps drop on the Quest compared to a built-in Tracked Pose Driver solution, so personally I would wait for the native XR system to be released and, if necessary, use VRTK to prototype in the meantime.
     
  4. linojon

    linojon

    Joined:
    Aug 25, 2014
    Posts:
    118
    The approach I take is to build for one platform first, using the XR classes as much as possible and isolating any device-specific code only where necessary, focusing on the core architecture of the app: features, behaviors, data management, etc. Then, when ready, I port to the next platform, using compiler symbols and the Bridgexr toolkit to swap out SDK dependencies. (Disclosure: my company wrote Bridgexr for this purpose; it's on the Asset Store.)
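    To illustrate the compiler-symbol part (this is not Bridgexr's actual API): OCULUS_SDK and STEAMVR_SDK below are assumed custom scripting define symbols you would set per build target in Player Settings, and the vendor calls are comments only.

```csharp
// Sketch of isolating SDK-specific code behind compiler symbols.
// OCULUS_SDK / STEAMVR_SDK are assumed custom scripting define symbols,
// not built-in Unity defines.
public static class Haptics
{
    // Returns which backend handled the pulse, for illustration.
    public static string Pulse(float strength)
    {
#if OCULUS_SDK
        // OVRInput.SetControllerVibration(1f, strength, OVRInput.Controller.RTouch);
        return "oculus";
#elif STEAMVR_SDK
        // The SteamVR haptic action call would go here.
        return "steamvr";
#else
        return "none"; // no VR define set in this build
#endif
    }
}
```

    The rest of the game calls Haptics.Pulse() and never sees a vendor type, so only this one file changes per platform.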
     