Resolved A nasty gotcha: ScreenPointToRay issues on non-XR cameras in project with XR enabled

Discussion in 'VR' started by BlakeSchreurs, Jul 19, 2021.

BlakeSchreurs

    Joined:
    Aug 10, 2016
    Posts:
    51
Bottom-Line-Up-Front: If you're using Camera methods on a non-XR camera in a project with XR enabled, check your Target Eye setting. It defaults to "Both" when it should probably default to "None".

This one took me far too long to find. I'm documenting it here for others, and for my future self if I ever hit this problem again and can't find the solution.

    I do a lot of XR work. However, I often test behaviors of objects outside of VR in little test scenes. This helps with the constant donning/doffing of a headset, especially when there's a behavior I'm working on that isn't related to the XR portions of the scene. NPC movement, that kind of thing.

So, I'd created a scene, kept the default camera, set up a classic "boxes doing things" kind of scenario, and wired it so that if I clicked on the ground, my NPC would move there. Click the mouse, cast a ray into the world, walk to the position clicked on the screen. Super basic stuff.
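For reference, that click-to-move setup looks roughly like this. This is a minimal sketch, not the post's actual code: the class name `ClickToMove` and the `npc` field are my own, and I use the classic `Input` API for brevity even though the project in question used the new Input System.

```csharp
using UnityEngine;

// Minimal click-to-move sketch (names are illustrative, not from the post).
public class ClickToMove : MonoBehaviour
{
    public Transform npc;  // the object that should move to the clicked point
    public Camera cam;     // the scene's (non-XR) camera

    void Update()
    {
        if (Input.GetMouseButtonDown(0))
        {
            // Convert the 2D mouse position into a ray through the scene.
            Ray ray = cam.ScreenPointToRay(Input.mousePosition);

            // Draw the ray so a wrong direction (the bug below) is obvious
            // in the Scene view.
            Debug.DrawRay(ray.origin, ray.direction * 100f, Color.red, 2f);

            if (Physics.Raycast(ray, out RaycastHit hit))
            {
                // Teleport for brevity; real code would move the NPC over time.
                npc.position = hit.point;
            }
        }
    }
}
```

With the Target Eye bug described below, the `Debug.DrawRay` line makes the symptom visible immediately: the ray points far more downward than the cursor suggests.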

Except it didn't work. I'd click, and the target position was always wrong. Really wrong. I used debug rays and saw the ray was going in the wrong direction (far more downward than expected). At first I thought this was due to the new Input System; maybe the mouse position was in a different coordinate space or something. That wasn't it at all. I created a test/sample project in the same version of Unity (2020.3.11) and tested it there. The code worked as expected. I copied the scene over to the initial Unity project... and it still didn't work. Debugging showed the mouse positions were right (and in the same coordinate space), but ScreenPointToRay was returning different results.

When I set the projects side by side, I noticed that the images weren't identical. The field of view in the malfunctioning project was far wider. I reset the window layout on both, made sure each was exactly half the screen, and the game images still weren't the same. Since one project had XR enabled, the Camera component was slightly different and had a setting for Target Eye. This was set to "Both" by default, even though this camera had never been converted into an XR camera rig. Once I set Target Eye to "None (Main Display)", everything worked as expected.
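For anyone who prefers to enforce this from code rather than the Inspector, the Target Eye setting is exposed as `Camera.stereoTargetEye`. A small sketch (the component name `ForceMonoCamera` is my own):

```csharp
using UnityEngine;

// Ensure a desktop-only camera ignores XR eye rendering, even in an
// XR-enabled project. Attach this to the non-XR camera.
[RequireComponent(typeof(Camera))]
public class ForceMonoCamera : MonoBehaviour
{
    void Awake()
    {
        // Equivalent to setting Target Eye to "None (Main Display)"
        // in the Inspector.
        GetComponent<Camera>().stereoTargetEye = StereoTargetEyeMask.None;
    }
}
```

This can be handy for test scenes that get copied between XR and non-XR projects, since the camera then behaves the same in both.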

    So that was a thing. Hidden problem, easy solution, hope my writeup saves someone some time in the future.