
Is there an updated tutorial for getting started with Oculus Go and 2019.1?

Discussion in 'VR' started by aaronfranke, May 27, 2019.

  1. aaronfranke

    Joined:
    Jan 23, 2017
    Posts:
    20
    Is there an updated tutorial on getting started with VR development for the Oculus Go in Unity 2019.1? I can find many tutorials online for Unity 2018.x, but I believe the process is different in 2019.1 because it now comes with the Android SDK?

    I have Unity 2019.1 installed via Unity Hub with the Android SDK box checked. I also installed Android Studio and the JDK separately, just in case I need them. My Oculus Go has developer mode enabled.

    What project template do I choose? Just 3D? VR Lightweight RP? Something else? And what do I need to change and configure once I open the project, so that it will use my Oculus Go?
     
  2. Tibor0991

    Joined:
    Dec 11, 2016
    Posts:
    27
    The process in Unity 2019 isn't any different from Unity 2018.x; just keep these points in mind:
    • Given Unity 2019's recent post-processing bugs, the only workable option is: use LWRP, set rendering to Single Pass, and DO NOT USE any post-processing effect. Your game will look extremely dull without any color filter or bloom effect, but until the Unity team decides to focus on all the half-baked VR features they've implemented and left to rot, you're stuck like this.
    • Install the latest Oculus Integration package from the Asset Store; don't trust the one in the Package Manager, and be very careful about updating it, since Oculus is also prone to releasing half-baked features.
    • The Oculus VR system is still designed around a GazePointer instantiated by the plugin itself, but no such pointer is properly included in the files Oculus provides, so you must build one from scratch (just follow the complaints in the console; it's not that hard).
    • The OVR plugin recognizes the main camera as a VR camera and automatically hooks it up to the headset rotation, but you might still want to start from the OVRCameraRig prefab provided with the plugin.
    • OVREventSystem is, you guessed it, "some assembly required": create an empty GameObject and add the EventSystem and OVRInputModule components, then set the pointer prefab and the ray source (usually either the RightHandAnchor transform or the CenterEyeAnchor camera, both from the OVRCameraRig prefab). There's a rough script version of this after the list.
    • World-space UI needs an OVRGraphicRaycaster component to replace the default GraphicRaycaster on the Canvas object.
    • Last but not least, add an OVRPhysicsRaycaster to the OVRCameraRig prefab so that the OnPointer... events get invoked on objects.
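    If it helps, here's roughly what that event system and raycaster wiring looks like as a script. Treat it as a sketch: it assumes the Oculus Integration is imported and an OVRCameraRig is in the scene, and the exact component and field names (OVRInputModule.rayTransform, OVRGraphicRaycaster, OVRPhysicsRaycaster) can shift between Integration versions, so check them against the package you actually imported.

    Code (CSharp):

    using UnityEngine;
    using UnityEngine.EventSystems;
    using UnityEngine.UI;

    // Sketch of the setup described above. Component and field names are taken
    // from the bullet points and may differ between Oculus Integration versions.
    public class OculusUiSetup : MonoBehaviour
    {
        public OVRCameraRig cameraRig;      // instance of the OVRCameraRig prefab
        public Canvas worldSpaceCanvas;     // your world-space UI canvas

        void Awake()
        {
            // "Some assembly required": empty GameObject + EventSystem + OVRInputModule.
            var eventSystem = new GameObject("OVREventSystem");
            eventSystem.AddComponent<EventSystem>();
            var inputModule = eventSystem.AddComponent<OVRInputModule>();

            // Ray source: the controller anchor for laser-style pointing,
            // or cameraRig.centerEyeAnchor for gaze-based pointing.
            inputModule.rayTransform = cameraRig.rightHandAnchor;
            // The gaze pointer / cursor itself still has to be assigned (or built
            // from scratch, as noted above); that part is Inspector work.

            // World-space UI: swap the default GraphicRaycaster for the OVR one.
            var oldRaycaster = worldSpaceCanvas.GetComponent<GraphicRaycaster>();
            if (oldRaycaster != null)
            {
                Destroy(oldRaycaster);
            }
            worldSpaceCanvas.gameObject.AddComponent<OVRGraphicRaycaster>();

            // Physics raycaster on the rig so OnPointer... events reach 3D colliders.
            cameraRig.gameObject.AddComponent<OVRPhysicsRaycaster>();
        }
    }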
     
  3. unityuserunity85496

    Joined:
    Jan 9, 2019
    Posts:
    89
    It's very breaky. The first build takes FOREVER: it sits on "Compiling shader variants" with an n of 9000+.
    So far I can't get glow to work at all; I might have to step back to 2018.
    But here's some code for using the default XR controls instead of the Oculus package from the Asset Store.

    The default XR code works a treat once you really understand the examples in their docs.
    You can poll for joystick input, but it's weird and the data you get back is less than useful at times.

    Code (CSharp):

    using UnityEngine;

    // Drop this on any GameObject (the class name is arbitrary); it polls the
    // right-hand controller through the built-in UnityEngine.XR API.
    public class XRControllerPoller : MonoBehaviour
    {
        UnityEngine.XR.InputDevice device;

        void Start()
        {
            device = UnityEngine.XR.InputDevices.GetDeviceAtXRNode(UnityEngine.XR.XRNode.RightHand);
        }

        void Update()
        {
            // primary2DAxisTouch is true while a finger rests on the touchpad.
            bool isOnTap = false;
            if (device.TryGetFeatureValue(UnityEngine.XR.CommonUsages.primary2DAxisTouch, out isOnTap))
            {
                // magic
            }
        }
    }
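
    You can poll the touchpad position and the trigger the same way. Here's a quick sketch meant to sit in the same class; I haven't run this exact snippet on a Go, but primary2DAxis (a Vector2) and triggerButton are the standard CommonUsages for those inputs.

    Code (CSharp):

    // Add to the same MonoBehaviour as above and call it from Update().
    void PollMoreInputs()
    {
        // Touchpad position, roughly -1..1 on each axis.
        Vector2 touchpad;
        if (device.TryGetFeatureValue(UnityEngine.XR.CommonUsages.primary2DAxis, out touchpad))
        {
            // e.g. use touchpad.x / touchpad.y for swipe-style menu navigation
        }

        // Trigger as a simple pressed / not pressed bool.
        bool triggerPressed;
        if (device.TryGetFeatureValue(UnityEngine.XR.CommonUsages.triggerButton, out triggerPressed) && triggerPressed)
        {
            // handle the trigger press
        }
    }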