Is Unity 2018 XR enough?

Discussion in 'VR' started by Corysia, Oct 8, 2018.

  1. Corysia

    Corysia

    Joined:
    Mar 14, 2018
    Posts:
    108
    Unity 2018.2 has native XR. Is that enough? I was reading the docs and set up my environment to be None/Oculus/OpenVR and then turned on Oculus spatial sound. But at that point, the docs pretty much stopped.
     
  2. JoeStrout

    JoeStrout

    Joined:
    Jan 14, 2011
    Posts:
    9,859
    It's almost enough. I still use the Oculus SDK to get the controller position and touch/button inputs.
     
  3. Corysia

    Corysia

    Joined:
    Mar 14, 2018
    Posts:
    108
    I was looking at these two tutorials:


    And in them, he gets some Oculus hands animating and grabbing without the Oculus SDK.

    But there's so much more to figure out. How would I stop someone from walking through a wall?
    How would I use the Mock HMD for those times when I can work on my app but don't have access to my HMD?

    I've a zillion 'newbie to vr' questions.
     
  4. JoeStrout

    JoeStrout

    Joined:
    Jan 14, 2011
    Posts:
    9,859
    I think the key is to just take them one at a time. Start building something, and focus on whatever's immediately blocking your progress.

    Walking through a wall isn't an issue until you have people walking around (which you probably shouldn't — it makes some people sick).

    I don't know what Mock HMD is; I work on my app in the editor (without the headset) via a few simple scripts that fake head and controller movements with the mouse and keyboard.

    But the key is, you don't need to understand everything before you begin. Throw a cube on a plane, run it on your HMD, and look around. Then take it from there.
     
    Corysia likes this.
  5. Corysia

    Corysia

    Joined:
    Mar 14, 2018
    Posts:
    108
    Mock HMD is an option under Player settings for XR. You've got Oculus and OpenVR by default. There's also None and Mock HMD. I was hoping the Mock HMD would let me do what you're describing -- walking around with mouse & keyboard, kind of like the Emulator mode in VRTK.

    I spend about 8 hours a day on a machine that doesn't have an HMD, and only two hours in the evening with one that does. If you could share those scripts, that'd be a godsend to me. I was considering trying to use the First Person Controller from the Standard Assets and activating it when falling back to the None XR configuration when an HMD isn't detected. But...I don't know how. =)
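    Roughly what I'm picturing, if it's even possible (untested sketch, assuming Unity 2018's UnityEngine.XR namespace; the rig object names are placeholders for my own objects, not Unity API):

    Code (CSharp):
        using UnityEngine;
        using UnityEngine.XR;

        // Rough sketch: enable a desktop fallback rig when no HMD is detected.
        // "vrRig" and "desktopRig" are placeholder names for my own objects.
        public class RigSelector : MonoBehaviour {

            public GameObject vrRig;       // camera set up for the HMD
            public GameObject desktopRig;  // e.g. a First Person Controller

            void Start() {
                bool hmdPresent = XRDevice.isPresent;
                vrRig.SetActive(hmdPresent);
                desktopRig.SetActive(!hmdPresent);
                if (!hmdPresent) XRSettings.enabled = false;  // render as a normal camera
            }
        }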

    And yes, I'm trying to do very baby steps. I'm not new to Software Dev, just Unity and VR. My project is a simple escape room. A locked door, a table with a key on it. You have to pick up the key and insert it in the lock...hopefully also need to turn it...then the door will unlock. Once you walk thru the door, you win. A very simple project that could become a tutorial.
     
    JoeStrout likes this.
  6. JoeStrout

    JoeStrout

    Joined:
    Jan 14, 2011
    Posts:
    9,859
    I'm traveling this week and don't have easy access to my code at the moment. Also note that I develop for Oculus Go; I don't have a Rift (and haven't yet gotten into Vive). So it might be a little different (it sounds like you're developing for Rift).

    But I'll try to come back this evening or tomorrow, and post some code if somebody else doesn't beat me to it.
     
  7. Corysia

    Corysia

    Joined:
    Mar 14, 2018
    Posts:
    108
    That would be great, thanks. Yes, I own a Rift with Touch. I hope to deploy to both Rift and Quest when it's out. I wouldn't mind Vive, too, but I can't test that since I don't own one. Mostly, I want two controllers, but I may change my mind there. I have access to Daydream, so I might. I don't know that I'm even planning on something that will actually require two hands.
     
  8. Innovine

    Innovine

    Joined:
    Aug 6, 2017
    Posts:
    522
    It seems sufficient for my vive project so far. I've got the headset, controller pos, re-centering, axis and button input (not that I like how this works, but it works) and a few experimental jabs at haptic vibrations.
    I've no idea what'd be involved in adding oculus support if i one day wanted to, but the actual amount of code to get the above working was surprisingly low. I don't use the mock headset stuff.
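    For reference, the tracking part is basically just this (a sketch from memory, using UnityEngine.XR; attach it to an object under the same parent as the camera — buttons and axes go through the regular Input Manager joystick mappings):

    Code (CSharp):
        using UnityEngine;
        using UnityEngine.XR;

        // Sketch: drive a transform from the native XR tracking data (Unity 2018).
        public class TrackedController : MonoBehaviour {

            public XRNode node = XRNode.RightHand;  // or XRNode.LeftHand

            void Update() {
                transform.localPosition = InputTracking.GetLocalPosition(node);
                transform.localRotation = InputTracking.GetLocalRotation(node);
            }
        }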
     
    JoeStrout and Corysia like this.
  9. JoeStrout

    JoeStrout

    Joined:
    Jan 14, 2011
    Posts:
    9,859
    OK, I promised some code, so here's what we're using in Beatron.

    First, the camera: you don't need any code at all for that on the headset of course. But for testing, I attach this script to the camera, so I can look around in the editor:
    Code (CSharp):
        using System.Collections;
        using System.Collections.Generic;
        using UnityEngine;

        public class MousePitchYaw : MonoBehaviour {

            public KeyCode requiredModifier = KeyCode.None;

            public float yawSpeed = 3;
            public float pitchSpeed = 3;

            private float yaw = 0;
            private float pitch = 0;

            void Awake() {
                // This script only needs to happen in the editor.  Not on the device.
                #if !UNITY_EDITOR
                this.enabled = false;
                #endif
            }

            void Update() {
                if (requiredModifier != KeyCode.None && !Input.GetKey(requiredModifier)) return;

                yaw += Input.GetAxisRaw("Mouse X") * yawSpeed;
                pitch = Mathf.Clamp(pitch - Input.GetAxisRaw("Mouse Y") * pitchSpeed, -90, 90);
                transform.localRotation = Quaternion.Euler(pitch, yaw, 0);
            }
        }
    Is simple, no? In this game I set requiredModifier to LeftShift, so that I only turn my virtual head while shift-mousing; then I use the same script, but with regular (requiredModifier=None) mousing, to control the controller.

    The setup for that is a little unobvious; it looks like this:

    [Attached image: upload_2018-10-14_7-51-42.png — the Player hierarchy]

    ...So we have Player, which is just a root-level container (and is what you'd move around if you have any movement in your game); that contains the camera, and a ControllerHolder container, which is where you attach the MousePitchYaw script for moving the controller in the editor, as shown above. Then below that is the actual controller, which is what should get moved by OVRInput.GetLocalControllerRotation and OVRInput.GetLocalControllerPosition.
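    In other words, the controller object just gets something like this each frame (a sketch; I'm assuming the Oculus Utilities package for OVRInput, and RTrackedRemote is the Go controller — on a Rift you'd use the Touch controller values like RTouch instead):

    Code (CSharp):
        using UnityEngine;

        // Sketch: drive the controller transform with the two OVRInput calls above.
        // Requires the Oculus Utilities package for OVRInput.
        public class OVRControllerFollow : MonoBehaviour {

            public OVRInput.Controller controller = OVRInput.Controller.RTrackedRemote;

            void Update() {
                transform.localPosition = OVRInput.GetLocalControllerPosition(controller);
                transform.localRotation = OVRInput.GetLocalControllerRotation(controller);
            }
        }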

    Finally, to simulate the controller inputs, I get all my input through a little input-abstraction layer like so:

    Code (csharp):
        using System.Collections;
        using System.Collections.Generic;
        using UnityEngine;

        public class OculusInput : MonoBehaviour {

            [Tooltip("ControllerPlayback module used for testing/debugging")]
            public ControllerPlayback playback;

            // Virtual buttons which we adapt to both desktop and controller inputs:
            public enum VButton {
                Left = 0,
                Right,
                Up,
                Down,
                Trigger,
                Back,
                SwipeLeft,
                SwipeRight,
                SwipeUp,
                SwipeDown
            }
            const int VButton_count = 10;

            bool[] currBtnState = new bool[VButton_count];
            bool[] prevBtnState = new bool[VButton_count];

            Vector2 touchDownPos;
            Vector2 touchUpPos;
            float touchDownTime;

            static OculusInput _instance;
            public static OculusInput instance { get { return _instance; } }

            void Awake() {
                _instance = this;
            }

            void Update() {
                for (int i=0; i<VButton_count; i++) prevBtnState[i] = currBtnState[i];

                #if UNITY_EDITOR
                bool shift = Input.GetKey(KeyCode.LeftShift) || Input.GetKey(KeyCode.RightShift);
                currBtnState[(int)VButton.Left] = Input.GetKey(KeyCode.LeftArrow) && !shift;
                currBtnState[(int)VButton.Right] = Input.GetKey(KeyCode.RightArrow) && !shift;
                currBtnState[(int)VButton.Up] = Input.GetKey(KeyCode.UpArrow) && !shift;
                currBtnState[(int)VButton.Down] = Input.GetKey(KeyCode.DownArrow) && !shift;
                currBtnState[(int)VButton.Trigger] = Input.GetKey(KeyCode.Space) || Input.GetMouseButton(0);
                currBtnState[(int)VButton.Back] = Input.GetKey(KeyCode.Escape);
                currBtnState[(int)VButton.SwipeLeft] = Input.GetKey(KeyCode.LeftArrow) && shift;
                currBtnState[(int)VButton.SwipeRight] = Input.GetKey(KeyCode.RightArrow) && shift;
                currBtnState[(int)VButton.SwipeUp] = Input.GetKey(KeyCode.UpArrow) && shift;
                currBtnState[(int)VButton.SwipeDown] = Input.GetKey(KeyCode.DownArrow) && shift;

                if (playback != null && playback.enabled) currBtnState[(int)VButton.Trigger] = playback.triggerIsDown;

                #else
                if (OVRInput.Get(OVRInput.Button.PrimaryTouchpad)) {
                    Vector2 pos = OVRInput.Get(OVRInput.Axis2D.PrimaryTouchpad);
                    float ang = Mathf.Atan2(pos.y, pos.x) * Mathf.Rad2Deg;
                    currBtnState[(int)VButton.Left] = (ang > 135 || ang < -135);
                    currBtnState[(int)VButton.Right] = (ang < 45 && ang > -45);
                    currBtnState[(int)VButton.Up] = (ang > 45 && ang < 135);
                    currBtnState[(int)VButton.Down] = (ang < -45 && ang > -135);
                } else {
                    currBtnState[(int)VButton.Left] = false;
                    currBtnState[(int)VButton.Right] = false;
                    currBtnState[(int)VButton.Up] = false;
                    currBtnState[(int)VButton.Down] = false;
                    currBtnState[(int)VButton.SwipeLeft] = false;
                    currBtnState[(int)VButton.SwipeRight] = false;
                    currBtnState[(int)VButton.SwipeUp] = false;
                    currBtnState[(int)VButton.SwipeDown] = false;

                    if (OVRInput.Get(OVRInput.Touch.PrimaryTouchpad)) {
                        Vector2 pos = OVRInput.Get(OVRInput.Axis2D.PrimaryTouchpad);
                        if (touchDownTime == 0) {
                            touchDownTime = Time.time;
                            touchDownPos = pos;
                        }
                        touchUpPos = pos;   // (update this continually during the touch)
                    } else if (touchDownTime > 0) {
                        // Touch-up: trigger a swipe if we have moved a sufficient distance
                        // since touch-down, within the last second.
                        if (Time.time - touchDownTime < 1 && Vector2.Distance(touchDownPos, touchUpPos) > 0.5f) {
                            // Convert to degrees so the sector tests below match the ones above.
                            float ang = Mathf.Atan2(touchUpPos.y - touchDownPos.y, touchUpPos.x - touchDownPos.x) * Mathf.Rad2Deg;
                            Debug.Log("Swipe from " + touchDownPos + " to " + touchUpPos + " angle: " + ang);
                            currBtnState[(int)VButton.SwipeLeft] = (ang > 135 || ang < -135);
                            currBtnState[(int)VButton.SwipeRight] = (ang < 45 && ang > -45);
                            currBtnState[(int)VButton.SwipeUp] = (ang > 45 && ang < 135);
                            currBtnState[(int)VButton.SwipeDown] = (ang < -45 && ang > -135);
                        }
                        touchDownTime = 0;
                    }
                }
                currBtnState[(int)VButton.Trigger] = OVRInput.Get(OVRInput.Button.PrimaryIndexTrigger);
                currBtnState[(int)VButton.Back] = OVRInput.Get(OVRInput.Button.Back);
                #endif
            }

            public static bool Get(VButton btn) {
                return _instance.currBtnState[(int)btn];
            }

            public static bool GetDown(VButton btn) {
                return _instance.currBtnState[(int)btn] && !_instance.prevBtnState[(int)btn];
            }

            public static bool GetUp(VButton btn) {
                return !_instance.currBtnState[(int)btn] && _instance.prevBtnState[(int)btn];
            }

        }
    This is a singleton, and Get/GetDown/GetUp are static. So, for example, wherever I want to know if the trigger was just pressed, I call OculusInput.GetDown(OculusInput.VButton.Trigger), and this works both on the device and in the editor (using spacebar or mouse button for the trigger).
     
    Corysia likes this.
  10. Corysia

    Corysia

    Joined:
    Mar 14, 2018
    Posts:
    108
    Thank you very much! Both for the code and for taking the time to document that so well!

    Hopefully, I can play around with this some this evening. I also want to see what happens with the FirstPersonController camera. I have an idea of switching cameras if no HMD is present for a non-VR experience. But I noticed the FirstPersonController is very similar to the MainCamera and maybe it's just an overload. Perhaps the HMD works with it as well -- I haven't had a chance to try that out.
     
    JoeStrout likes this.