Status of camera stacking for vr cockpit

Discussion in 'Universal Render Pipeline' started by RogueStargun, Jun 21, 2020.

  1. RogueStargun

    Joined:
    Aug 5, 2018
    Posts:
    296
    I have a 3D space shooter I've been working on for mobile that I'd like to try out in VR. Of course this means having a cockpit, and I was wondering whether a camera-stacked cockpit for VR is workable with URP right now.
     
  2. ShilohGames

    Joined:
    Mar 24, 2014
    Posts:
    3,021
    Yes, I have tested URP 7.4.1 with Unity 2019.4 in a space game project of mine. I successfully stacked the cameras. Camera stacking works better with URP now than it did with the built-in renderer.

    I also decided to use Beautify 2 for URP instead of Unity's post processing, and that also worked with stacked URP cameras.

    I have a 3D cockpit at the origin that I rotate to match the player's ship rotation. I keep the cockpit at the origin so it does not jitter in VR.
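    A minimal sketch of that cockpit-at-origin idea (the script name and the ship field are made up for illustration):

    Code (CSharp):
    using UnityEngine;

    // Hypothetical anchor script: the cockpit never leaves the world origin,
    // so large world coordinates (and their float precision loss) never reach
    // the cockpit camera; only the rotation follows the simulated ship.
    public class CockpitAnchor : MonoBehaviour
    {
        public Transform ship; // the physically simulated ship, far from the origin

        void LateUpdate()
        {
            transform.position = Vector3.zero;
            transform.rotation = ship.rotation;
        }
    }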
     
  3. Innovine

    Joined:
    Aug 6, 2017
    Posts:
    522
    I am trying this in Unity 2019.3.0b6 and I have a world camera, and a cockpit camera. If only one of them is rendering, everything is OK, but when both are rendering, the world cam (lowest depth) is really jittery and shaky, while the cockpit cam is fine. Any idea what might be causing that?
     
  4. RogueStargun

    Joined:
    Aug 5, 2018
    Posts:
    296
    Innovine, I figured out the trick to eliminate the camera jitter. You need to have the worldcam update its position and rotation in LateUpdate rather than Update!

    It took me forever to figure this out, but once implemented, the results are dramatic!
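    For reference, the trick reads roughly like this as code (a sketch; the WorldCameraFollow name and ship field are hypothetical):

    Code (CSharp):
    using UnityEngine;

    // Moving the world camera in LateUpdate, after physics and head tracking
    // have run for the frame, removes most of the visible jitter between the
    // two stacked cameras.
    public class WorldCameraFollow : MonoBehaviour
    {
        public Transform ship; // the ship the world camera follows

        void LateUpdate()
        {
            transform.position = ship.position;
            transform.rotation = ship.rotation;
        }
    }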
     
  5. Innovine

    Joined:
    Aug 6, 2017
    Posts:
    522
    I found an alternate solution... in my VR setup code I found a call to XRDevice.DisableAutoXRCameraTracking(Camera.main, true), and I changed the second argument to false. That worked for me. I have a second script on the other cameras which calls InputTracking.GetNodeStates(states), then TryGetPosition() on a tracked state, and assigns the result to the transform. It's really a long time since I wrote that, so I have no idea whether it's still a good approach, but it appears to work OK. I'll come back to it some day and try some more experiments :)
     
  6. RogueStargun

    Joined:
    Aug 5, 2018
    Posts:
    296
    Innovine, can you explain a bit more clearly what you did with the InputTracking.GetNodeStates?
     
  7. Innovine

    Joined:
    Aug 6, 2017
    Posts:
    522
    Sure, but it's a looong time since I looked at this, and I have NO idea if it is correct.
    It works fine for me in my current Stationary setup with VR enabled. Haven't looked at other options in a while:

    My main VR manager has this:
    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR;

    public class XR_Setup : MonoBehaviour
    {
        public bool disableAutoCameraTracking = true;
        public bool stationary = false;

        public bool useVR = true;

        void Awake()
        {
            if (useVR)
            {
                EnableVR();
            }
            else
            {
                DisableVR();
            }
        }

        public void EnableVR()
        {
            XRSettings.enabled = true;

            Debug.Log("Setting VR tracked camera control to external");
            XRDevice.DisableAutoXRCameraTracking(Camera.main, disableAutoCameraTracking);

            Camera.main.transform.localPosition = Vector3.zero;
            Camera.main.transform.rotation = Quaternion.identity;

            if (stationary)
            {
                Debug.Log("Setting tracking space type to Stationary");
                XRDevice.SetTrackingSpaceType(TrackingSpaceType.Stationary);
            }

            useVR = true;
        }

        public void DisableVR()
        {
            XRSettings.enabled = false;
            useVR = false;
        }
    }

    And on my camera I have this:
    Code (CSharp):
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.XR;

    public class XR_TrackedDevice : MonoBehaviour
    {
        public XRNode _trackedDeviceType;
        public bool _isTracked = false;
        public ulong _uniqueID;

        private List<XRNodeState> states;

        void Start()
        {
            states = new List<XRNodeState>();
        }

        // Update is called once per frame
        void Update()
        {
            InputTracking.GetNodeStates(states);
            foreach (XRNodeState nodeState in states)
            {
                if (nodeState.nodeType == _trackedDeviceType)
                {
                    _isTracked = nodeState.tracked;

                    // Try to get the position
                    Vector3 position;
                    if (nodeState.TryGetPosition(out position))
                    {
                        transform.localPosition = position;
                    }

                    // Try to get the rotation
                    Quaternion rotation;
                    if (nodeState.TryGetRotation(out rotation))
                    {
                        transform.localRotation = rotation;
                    }

                    // Set the uniqueID
                    _uniqueID = nodeState.uniqueID;

                    // More available states exist, including acceleration,
                    // velocity and angular acceleration
                    // ...
                }
            }
        }
    }

    And I also call this to re-center the headset:
    Code (CSharp):
    InputTracking.Recenter();

    Like I said, this works for my current case, but I have no idea if it's right or wrong.
     
  8. RogueStargun

    Joined:
    Aug 5, 2018
    Posts:
    296
    Thanks. I've noticed the LateUpdate still has a bit of a jitter to it, so I'm going to try to use your technique!
     
  9. RogueStargun

    Joined:
    Aug 5, 2018
    Posts:
    296
    Also, an alpha video!
     
  10. Innovine

    Joined:
    Aug 6, 2017
    Posts:
    522
    Nice. How are you doing the NPC movement?
     
  11. RogueStargun

    Joined:
    Aug 5, 2018
    Posts:
    296
    Hey we're on the forum at the same time!
    NPCs are being moved with rigidbody physics.
    2D A* (with some tweaks) is used to navigate around obstacles, and I plan to use Behavior Designer for behavior planning.

    The AI is by far the most complicated part of the game, but shooting down an enemy plane in VR is hard enough that a simple AI will be challenging to dogfight.

    I did some more experiments, and I found that my original solution only works some of the time, and your solution also causes jitter.

    When the FPS is high, it's not noticeable, but when frames drop, the jitter is unbearable.

    I think the explanation is that the tracked pose driver typically uses an update type of "Update and Before Render", which smooths things out. The avatar camera that mirrors this camera needs to update not only in Update but also before render, which should in theory smooth out the jitter.
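    One way to sketch that idea is with Application.onBeforeRender, Unity's callback that fires just before rendering, which is roughly when "Before Render" tracking updates run (the script and field names here are hypothetical):

    Code (CSharp):
    using UnityEngine;

    // Hypothetical mirroring script: copies the pose of the XR-tracked camera
    // both in Update and again just before rendering, so the mirrored camera
    // never lags the head-tracked one by a partial frame.
    public class MirrorHeadCamera : MonoBehaviour
    {
        public Transform hmdCamera; // the XR-tracked camera to mirror

        void OnEnable()  { Application.onBeforeRender += SyncPose; }
        void OnDisable() { Application.onBeforeRender -= SyncPose; }

        void Update() { SyncPose(); }

        void SyncPose()
        {
            transform.localPosition = hmdCamera.localPosition;
            transform.localRotation = hmdCamera.localRotation;
        }
    }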
     
  12. RogueStargun

    Joined:
    Aug 5, 2018
    Posts:
    296
    I finally found the solution. It's so simple I'm kicking myself.
    Instead of writing a script to handle the "world camera" that is attached to the ship, simply use another tracked pose driver with the "Update Type" set to Update and Before Render.

    It's so ridiculously obvious, and doesn't require any scripting!
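    For anyone following along, the same component setup can also be done from code. This is a sketch assuming the legacy TrackedPoseDriver from UnityEngine.SpatialTracking; the WorldCameraSetup name is made up:

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.SpatialTracking;

    // Attach to the world camera: adds a TrackedPoseDriver configured the same
    // way as the HMD camera's driver, with "Update and Before Render" so both
    // cameras sample the head pose at the same points in the frame.
    public class WorldCameraSetup : MonoBehaviour
    {
        void Awake()
        {
            var driver = gameObject.AddComponent<TrackedPoseDriver>();
            driver.SetPoseSource(TrackedPoseDriver.DeviceType.GenericXRDevice,
                                 TrackedPoseDriver.TrackedPose.Center);
            driver.trackingType = TrackedPoseDriver.TrackingType.RotationAndPosition;
            driver.updateType = TrackedPoseDriver.UpdateType.UpdateAndBeforeRender;
        }
    }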
     
  13. Innovine

    Joined:
    Aug 6, 2017
    Posts:
    522
    Interesting, I've not noticed any jitter in my scene, even when the framerate drops. My cockpit controls are jerking around and getting reprojected but I never thought the outside looked wrong. I'll take a closer look today.

    I guessed the AI would be hard which is why I asked :) I have not added any yet, but my project is more of a flight sim with no combat, so perhaps I'll just run other traffic on rails and avoid any real calculations :)

    One other thing comes to mind when I see your video: do you use a real joystick or a virtual one, and what are your thoughts on that? I have two real joysticks (one for translation and one for rotation) which I really like using, but when I additionally try to use VR controllers to press cockpit buttons, I bang them into my desk and joysticks and it ruins my immersion. I know VTOL VR uses virtual sticks, but I don't think they compare to real ones.
     
  14. Innovine

    Joined:
    Aug 6, 2017
    Posts:
    522
    Ahh, both my cameras have the tracking script on them :)
     
  15. RogueStargun

    Joined:
    Aug 5, 2018
    Posts:
    296
    I'm using the Unity Input System, so it supports the virtual joystick, the Oculus thumbstick, gamepads, and joysticks at the same time.

    Flying and shooting in VR is quite hard, so a very simple AI can feel very intelligent (just look at Half-Life: Alyx, where the enemy soldiers barely take cover!)
     
  16. Innovine

    Joined:
    Aug 6, 2017
    Posts:
    522
    Just upgraded to 2020.1 and find that the API calls above are obsolete. Thanks, Unity, re-doing all my VR S*** was just what I wanted to do today. But hey, I got a new package manager.