
Unity Native VR Support Best Practise / Project Setup

Discussion in 'VR' started by appymedia, Aug 5, 2019.

  1. appymedia

    appymedia

    Joined:
    May 22, 2013
    Posts:
    94
    Hi all,

    I'm new to VR with Unity and have only dabbled with Unity outside of VR previously, so I'm trying to get started in VR dev. I'm using Unity 2019.2.0f1 and it looks like the VR template I saw in there previously has vanished. I'm not entirely sure why, and I can't see any official response on the issue (a few others have asked), so I'm trying to establish what exactly is the best way to set up a project for VR.

    I quite like the idea of using Unity's native XR features so I don't have to install or count on 3rd party assets. I believe the new Unity input system will / does support XR tracking and input, but it's still a bit early and there aren't a lot of docs and so on to help guide a newbie like me.

    I've installed 2019.1.13f1 to get access to the old VR template so I can see how the XR rig is set up and so on. It uses the XR Legacy Input Helpers package and the Tracked Pose Driver, which gets HMD tracking set up at the right height via a script.
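    For anyone else poking at this, the same setup can be sketched in code (a minimal sketch, assuming the XR Legacy Input Helpers package is installed; normally you'd just add the Tracked Pose Driver component in the Inspector):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.SpatialTracking; // from the XR Legacy Input Helpers package

    // Attach to the camera under the XR rig so the HMD drives its transform.
    public class HmdTracking : MonoBehaviour
    {
        void Start()
        {
            var driver = gameObject.AddComponent<TrackedPoseDriver>();
            // Track the headset's centre-eye pose (position + rotation)
            driver.SetPoseSource(TrackedPoseDriver.DeviceType.GenericXRDevice,
                                 TrackedPoseDriver.TrackedPose.Center);
        }
    }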

    I still have to solve general input for triggers, buttons etc., but from what I've read I don't think that will be too tricky using the 'old' input system.

    I just wanted to check: is this the current best practice for a VR setup in 2019.2.0f1? Is it Unity's intention to transition us over to the new input system and make the XR Legacy Input Helpers package actually legacy? It seems like they are still current.

    It would be really good to have an idea about teleporting and general scene interaction in VR as well somewhere, although to be fair that's more on me as the dev. Still, it would be nice! I'm guessing I can snoop in on packages like VRTK to get an idea of how I might begin.

    It would be handy to have all of this in an updated template, Unity guys, so we know exactly where to start. As a newcomer(ish) it's a bit confusing and I'm not really sure I'm taking the right approach on things. Maybe I'm dense as well, lol.

    What’s everyone else doing? Cheers all :)

    **EDIT** I found a sticky in the forums https://forum.unity.com/threads/xr-plugins-and-subsystems.693463/ that I'm digesting. Lots of things are changing, and when you haven't experienced everything previously and are just jumping in, it's really confusing :(
     
    Last edited: Aug 5, 2019
  2. appymedia

    appymedia

    Joined:
    May 22, 2013
    Posts:
    94
    Ok, so I've read the sticky I pasted in my first comment, and the plot thickens.

    It seems that, on top of what I'm already trying to understand about using VR, the whole way it's incorporated into Unity is changing via the XR Management plugin, which is in preview, in an effort to decouple XR SDK updates etc. from engine releases.

    Yet another choice, and another cog in a confusing number of options for a newcomer.

    I'm still looking for the recommended and supported way to use VR in Unity 2019.2, from setup of plugins / packages to scene setup, the XR rig and controller input.

    I'm looking at Unreal as well, and they have a template which gives you their native VR support and drops you into VR with teleporting, object selection etc. It has well-commented blueprints and matching docs; the on-boarding experience is great. I'm not comparing engines or starting any flame war (each engine has its strengths / weaknesses from what I understand), but I'd love to have the same sort of experience with Unity, if possible, to help us newbies get going.
     
    Last edited: Aug 5, 2019
  3. appymedia

    appymedia

    Joined:
    May 22, 2013
    Posts:
    94
    Just thought I'd report back on my experiences with a few setups I've tried...

    ** Unity 2019.2.0f1 + LWRP 1.6.1 (LWRP Template) + Oculus Integration 1.39

    - Needed to switch to single-pass stereo rendering, otherwise the left eye renders both left and right views and the right eye is blank
    - Needed to enter 'fake' 999999 IDs in Avatar - Oculus Rift App ID & Go / Quest ID to see hands / controllers
    - Hand / controller models are still pink after using 'upgrade project materials to LWRP materials'
    - Teleporting doesn't seem to work correctly and you fall through the floor in the locomotion demo scene

    ** Unity 2019.2.0f1 + LWRP 1.6.1 (LWRP Template) + Oculus Integration 1.38

    - Needed to switch to single-pass stereo rendering, otherwise the left eye renders both left and right views and the right eye is blank
    - Needed to enter 'fake' 999999 IDs in Avatar - Oculus Rift App ID & Go / Quest ID to see hands / controllers
    - Hand / controller models are still pink after using 'upgrade project materials to LWRP materials'
    - Teleporting doesn't seem to work correctly and you fall through the floor in the locomotion demo scene

    ** Unity 2019.1.13f1 + LWRP 1.6.1 (VR Preview Template) + Oculus Integration 1.39

    - Needed to enter 'fake' 999999 IDs in Avatar - Oculus Rift App ID & Go / Quest ID to see hands / controllers
    - Hand / controller models are still pink after using 'upgrade project materials to LWRP materials'
    - Teleporting doesn't seem to work correctly and you fall through the floor in the locomotion demo scene

    ** Unity 2019.1.13f1 + 'Old' 3D Renderer + Oculus Integration 1.38

    - Teleporting doesn't seem to work correctly and you fall through the floor in the locomotion demo scene

    ** Unity 2018.4.5f1 (recommended by Oculus) + 'Old' 3D Renderer + Oculus Integration 1.39

    - Teleporting doesn't seem to work correctly and you fall through the floor in the locomotion demo scene
    - Needed to enter 'fake' 999999 IDs in Avatar - Oculus Rift App ID & Go / Quest ID to see hands / controllers

    I did partially fix the broken teleporting by adjusting the project's fixed timestep to match my device's refresh rate, so in my case Rift S at 80Hz: 1/80 = 0.0125. But teleporting was still not working about 10% of the time and just 'sticking', and when it did work there was a delay that didn't feel right. Obviously the 'pink' rendering could be fixed by opting for the 'old' Unity renderer, but I was just noting what wasn't working well out of the box. I thought I'd have a good reference by using the 'official' Oculus Integration package, but my experience hasn't been great.
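    In case it helps anyone, the fixed-timestep tweak is just a one-liner (a sketch assuming an 80Hz Rift S; it's equivalent to editing the value under Edit > Project Settings > Time):

    Code (CSharp):
    using UnityEngine;

    public class MatchFixedTimestep : MonoBehaviour
    {
        void Awake()
        {
            // Match the physics step to the headset refresh rate (80Hz -> 0.0125s)
            Time.fixedDeltaTime = 1f / 80f;
        }
    }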

    I can get the built-in native Unity XR working OK via the Tracked Pose Driver and 'think' I have a correct rig setup. I've nicked the height setup script from the now-old VR LWRP template, but I still have no real reference for how to implement teleporting and grabbing etc. The only other thing I can think of is VRTK, but I'm not that confident I'll have a good experience there, as a quick look at the docs invites me to install the Oculus Integration, which I was seeing issues with as above (I only tried the teleporting scene, which makes me wonder what other issues I'll run into). Those of you using VRTK: what versions of Unity and VRTK are you running? Is it easy to run with out of the box, and a decent reference point to help you understand and create your own mechanics?

    Maybe I'm doing something wrong? Ultimately all I'm after is a good reference point with some simple VR mechanics like teleporting / grabbing etc. that I'm not battling through to get to the start line. Or is this just the norm?

    Any advice appreciated.
     
  4. Foj

    Foj

    Joined:
    Apr 11, 2013
    Posts:
    17
    appymedia,
    I have been following your efforts across various threads and appreciate your input. I'm in a similar predicament to you, only a few months down the line. Following a Unity tutorial, I have been trying to use an OVRCameraRig with VRTK4 - I really cannot get my head around VRTK; just trying to implement some simple player axis locomotion has taken forever and is still annoyingly bugged. I am targeting the Oculus Rift.
    So I am wondering where you ended up, if you don't mind sharing?
    I am thinking of giving the UnityXR package a go; any advice on this?
    And, finally, any documentation you recommend?

    If you are still looking here and have the time to reply. Thank You.
     
    Last edited: May 24, 2020
  5. appymedia

    appymedia

    Joined:
    May 22, 2013
    Posts:
    94
    Hi @Foj

    I ended up using Unity's built-in XR support last year but then took quite a break from VR dev due to personal circumstances. I'm just easing back into VR dev again and still getting back up to speed with everything that's changed.

    I've tried the new XR Plugin Management, which seems to work pretty well and is really simple to get going. I've stuck with the older rendering pipeline for now; URP might be fine, I've not checked. I'm just taking a look at the new XR Interaction Toolkit now, which I'm hoping makes life pretty easy as a launch pad to tweak :)
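    As a taster of what the toolkit workflow looks like, making an object grabbable is roughly this (a sketch assuming the preview XR Interaction Toolkit package; in practice you'd add these components in the Inspector):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR.Interaction.Toolkit;

    // Makes the object this is attached to grabbable by XR controllers.
    // The object also needs a Collider for the interactors to hit.
    public class MakeGrabbable : MonoBehaviour
    {
        void Start()
        {
            gameObject.AddComponent<Rigidbody>();
            gameObject.AddComponent<XRGrabInteractable>();
        }
    }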

    Obviously, if you need any Oculus platform things like avatars etc., you'll need to import the Oculus Integration as well and use the needed parts.

    That's what's working / being looked at for me right now. Hope it helps :)
     
    Foj likes this.
  6. Foj

    Foj

    Joined:
    Apr 11, 2013
    Posts:
    17
    Hi again @appymedia,

    thank you for the quick response, I need to turn on notifications!

    I've decided to go the UnityXR route, though little information is available, and I have had experience of companies failing to continue support, so fingers crossed Unity keeps plugging away with this. It's a personal project, so no pressure.

    I am looking at trying to bring in the Oculus Integration prefabs for animated controller hands, and I am a bit stuck on this. I have been following "VR with Andrew" on YouTube, which has got most stuff going, but his animated hands look like hard work and the end result is a little clunky.

    I am sure all these things should be provided for us to simply drop in and reskin by now :D

    Anyway, good luck with your project and maybe talk VR/UnityXR again at some point.
     
    appymedia likes this.
  7. Cottontech

    Cottontech

    Joined:
    Jul 27, 2020
    Posts:
    2
    Hi, very interesting reading and pretty much where I am today. I have abandoned the 'legacy' options and what appears to be a fairly top-heavy OVR (Oculus Integration Package) install. Ultimately I have industrial simulation in mind and want to be as light as possible on the back end; I already have a solid C# background and expect that DOTS will be a future requirement, as I will be modelling thousands of discrete particles.

    I too have started with XR, and with a few added components I have a simple landscape, can look around at Mount Lonely and its neighbouring tree, and watch as a car-sized sphere with rigidbody physics plummets from a kilometre above me to bounce and roll away. Riveting. I easily enabled Snap Turn, but now need to embark on a rough to-do list which I could use a pointer on: moving (plain old D-pad walking), a wireframe avatar cylinder, a simple interaction layer to monitor all button and controller/HMD events, various graphic includes such as hands and controllers, and some guidance on physics, collisions and how to drop those events into a C# event stack.

    Anyway, bit of a brain dump; it would be good to pick up on how you guys went?
     
  8. InsaneDuane

    InsaneDuane

    Joined:
    Nov 1, 2019
    Posts:
    7
    I went head first into Unity VR using an Oculus Rift S about one year ago with no experience in programming or game development. I started with the OVR Integration package and now I can do all kinds of cool stuff, but the OVR package has a large overhead and kind of boxed me in in certain regards. Recently, about a week ago, I downloaded the XR Interaction Toolkit, and I can finally, as of today, do what I was doing with OVR and more! My first hurdle was just moving around (I don't like teleporting). I pieced together the following script:

    Code (CSharp):
    using UnityEngine;

    public class MoveCameraAllDirections : MonoBehaviour
    {
        [SerializeField] private float moveSpeed = 5.0f;

        void Update()
        {
            float ahead = Input.GetAxis("Vertical");
            float sidestep = Input.GetAxis("Horizontal");

            if (sidestep != 0)
            {
                // Strafe relative to the camera's right vector
                transform.position += Camera.main.transform.right * sidestep * moveSpeed * Time.deltaTime;
            }

            if (ahead != 0)
            {
                // Move relative to where the camera is facing
                transform.position += Camera.main.transform.forward * ahead * moveSpeed * Time.deltaTime;
            }
        }
    }
    This allowed me to move relative to where I was looking in any direction, even through the ground or out into space. I slapped a NavMeshAgent on it and now I am grounded. If I want to fly, I just disable the NavMeshAgent.
    Next was the controller input problem. I tried MANY different ways and learned some clever things, but just today I found a simple solution.
    Code (CSharp):
    private void Update()
    {
        // JoystickButton5 maps to the right grip on Oculus Touch (see the list below)
        if (Input.GetKeyDown(KeyCode.JoystickButton5))
        {
            ToggleRifle();
        }
    }
    This allowed me to get a single trigger on the button. I kind of mapped all the JoystickButtons.

    *JoystickButton0 = button A
    *JoystickButton1 = button B
    *JoystickButton2 = button X
    *JoystickButton3 = button Y
    *JoystickButton4 = button GripLeft
    *JoystickButton5 = button GripRight
    *JoystickButton6 = button MenuLeft
    *JoystickButton7 = button MenuRight? Not working, probably overwritten (Oculus Menu)
    *JoystickButton8 = button JoystickPress(LeftController)
    *JoystickButton9 = button JoystickPress(RightController)
    *JoystickButton10 = button A & B When they are "touched" (pressing has no effect)
    *JoystickButton11 = button B Only when "touched" (pressing has no effect)
    *JoystickButton12 = button X Only when "touched" (pressing has no effect)
    *JoystickButton13 = button Y Only when "touched" (pressing has no effect)
    *JoystickButton14 = button TriggerLeft
    *JoystickButton15 = button TriggerRight
    *JoystickButton16 = button Joystick Left "touched"
    *JoystickButton17 = button Joystick Right "touched"
    *JoystickButton18 = button Joystick Right "touched"

    I am not sure how they will translate to another XR system but this works for the Oculus Rift S.
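    If you want something that should carry across devices, the UnityEngine.XR InputDevices API exposes named usages instead of raw button numbers. A sketch, using only the standard CommonUsages (anything like ToggleRifle above would hang off this the same way):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR;

    public class PortableControllerInput : MonoBehaviour
    {
        void Update()
        {
            // Named usages (gripButton, triggerButton, primaryButton, ...)
            // map to the corresponding physical button on each XR runtime.
            InputDevice right = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);
            if (right.TryGetFeatureValue(CommonUsages.gripButton, out bool grip) && grip)
            {
                Debug.Log("Right grip held");
            }
        }
    }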

    BTW, it took WEEKS to figure out the whole OVR 'Avatar' thing and I hated it. I ended up finding a simple solution to adding hands in OVR, but I am not using that now. Thankfully a YouTuber named Valem had a link to a pair of hands that work using the XR Toolkit.
    The link is in the description. Just place the LeftHandPresence onto the LeftHandController of the XRRig (same for the right).
    I hope this helps newbies. There is an advantage to figuring all this out by yourself, but the time it takes....
     
  9. InsaneDuane

    InsaneDuane

    Joined:
    Nov 1, 2019
    Posts:
    7
    Check out my response :)
     
  10. appymedia

    appymedia

    Joined:
    May 22, 2013
    Posts:
    94
    Hey guys, sorry about the slow response! I took quite a break from dev due to family illness, but am finding some time again now to jump back in.

    Given the break, I decided to re-evaluate tooling in general, and so far I have found myself more productive using other engines than Unity for the VR (and non-VR) projects I have.

    I look at each project and choose the tooling to match what I want to achieve for a given timeframe, my skillset, etc.; for me it's becoming harder to find reasons to choose Unity. There's lots of competition out there excelling in areas that cover the scope of my projects / requirements better.

    That's my current experience, who knows what the future holds :)
     
  11. appymedia

    appymedia

    Joined:
    May 22, 2013
    Posts:
    94
    Actually, a bit of advice from people who have been more in 'the tech transition period' of Unity than I have might help me frame Unity better...

    Things feel in a difficult 'middle ground' now, regarding graphics pipelines in particular. If you have a new project which you would like to target mobile and desktop, let's say (and include VR), which do you choose and why?

    In my last comment I used the word 'productive'; more accurately, I've had a more 'pleasant' experience in other engines so far. To be fair, I do get things done faster in Unity and obviously still use it, but it feels quite painful, disjointed at times, and a fight. That said, maybe I'm faster in Unity through familiarity; I'm nowhere near as well versed in the other engines I'm using.

    What I also highly value is a great developer experience, that feels quite impacted with Unity right now. Anybody got any suggestions and ways they deal with that whilst still taking advantage of some of the newer tech?

    p.s. Apologies for completely de-railing this thread :p
     
    Last edited: Oct 4, 2020