
Question: An issue rendering cameras to render textures

Discussion in 'General Graphics' started by darkshredder11, Aug 19, 2021.

  1. darkshredder11

     Joined: Aug 18, 2017
     Posts: 1
    So I wasn't EXACTLY sure where to post this, because it involves XR plugins, but the issue occurs regardless of whether I'm in VR or just viewing the regular game view within Unity, so I figured it was safe to call it a graphics issue.

    Essentially the issue is this: I've been trying to set up a portal effect in VR. First, just to see if it was even possible, I copied the project files from this tutorial: Coding Adventure: Portals - YouTube, and tried to replace the player camera in the slicing scene with the VR camera rig provided by the Oculus Integration package on the Asset Store.

    At first I was just running it in Unity's game view to see whether adding the XR plugins had changed anything, and suddenly the inside of each portal (an object called "screen", whose material's texture is a render texture drawn by a camera looking out of the other portal) was only rendering the world's skybox. No other layers were visible. I checked the previews of the cameras in question and they seemed to display fine; it was only on the render texture that nothing else would show.

    Just to test, I changed the Clear Flags of the camera to something OTHER than Skybox, expecting the same result (no layers visible, just a solid colour background, for example), but with any clear flag other than Skybox the portals suddenly started displaying everything in the camera's view again. However, this caused depth problems: objects seen through the portal had scrambled depth (the trunk of a tree in the scene was drawn over the top of the tree itself, and the car in the scene could be seen through the floor after it drives off the platform). The depth was the least of my worries at that point, though, because the view of the game in VR was even worse.
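    For context, the setup is roughly this (a stripped-down sketch, not the tutorial's actual scripts; all the names here are mine): a second camera draws the view from the other portal into a RenderTexture, and that texture is assigned to the material on the portal's screen quad.

    Code (CSharp):
    using UnityEngine;

    public class PortalScreen : MonoBehaviour
    {
        public Camera portalCam;   // camera sitting at the linked portal
        public Renderer screen;    // the "screen" quad inside this portal's frame
        RenderTexture viewTex;

        void Start()
        {
            // The portal camera draws into a texture instead of the display...
            viewTex = new RenderTexture(Screen.width, Screen.height, 24);
            portalCam.targetTexture = viewTex;
            // ...and that texture becomes the screen quad's material texture.
            screen.material.mainTexture = viewTex;
        }
    }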

    Now, I DID do some searching and found another post on the Unity forums discussing a VERY similar issue to mine: Interaction between a few Unity subsystems causes weird "game-breaking" render bug - Unity Forum. In short, it discussed three errors, one being that cameras would not render correctly when displayed picture-in-picture through another camera unless they were set to any clear flag except Skybox. It was noted that the issue ONLY appears after installing ANY XR management plugin, which I found to be the case as well; everything worked fine after removing Unity's default XR management plugin. In the final post, though, the user mentioned that the bug was supposedly fixed as of Unity 2021.2.0a10, yet I am currently using 2021.2.0b7 and it still appears.
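    The workaround from that thread boils down to something like this (my own hypothetical test script, not anything posted there): switch any camera that feeds a render texture away from the Skybox clear flag at runtime.

    Code (CSharp):
    using UnityEngine;

    public class ClearFlagsWorkaround : MonoBehaviour
    {
        void Start()
        {
            // Only touch enabled cameras that render into a render texture.
            foreach (Camera cam in Camera.allCameras)
            {
                if (cam.targetTexture != null && cam.clearFlags == CameraClearFlags.Skybox)
                {
                    cam.clearFlags = CameraClearFlags.SolidColor;
                    cam.backgroundColor = Color.black;
                }
            }
        }
    }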

    There are some differences in my case, though. For starters, that post seems to imply the skybox issue is only present in the Unity game view, whereas mine also occurs in VR after building to the device. Also, I'm building for the Oculus Quest, which is an Android platform. When I switched the build target to Windows and tested in the game view, everything worked fine even with the clear flags set to Skybox, so the issue is ONLY present with an Android build target. I tried to make the rendering settings of the Android platform identical to those of the Windows platform, but most of the settings are either missing or go by different names on Android, so I couldn't really narrow down what about the Android platform causes this.

    Now, going back to my earlier point about the VR view being worse with a non-Skybox clear flag: the view inside the portal DOES render, but it looks as though TWO different cameras are rendering at the same time, one in the left eye and one in the right. It would make enough sense if you said "Yeah, well VR uses a stereoscopic view, so it's showing one perspective of the inside of the portal for each eye", if it weren't for the fact that it actually seems to be rendering BOTH sides of the portal: the one I'm currently standing at (the forest) in my right eye, and the other portal (the desert) in my left eye. My best guess is that it IS caused by the stereoscopic rendering in VR, since the shader and scripts that move and draw the cameras likely weren't designed with multiple perspectives in mind, AND the portals started working again when I switched the game to render monoscopically. (However, the Oculus Quest doesn't seem to support monoscopic rendering; it only displays an image in my left eye and leaves the right eye in a black void.)
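    For what it's worth, this is the direction I've been thinking about to fix the per-eye problem: a rough sketch (not the tutorial's or Oculus Integration's actual code; every name here is mine) that renders the portal view once per eye using the per-eye view and projection matrices the XR plugin reports, instead of the rig's single mono transform.

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR;

    public class StereoPortalRenderer : MonoBehaviour
    {
        public Camera playerCam;       // the XR rig's center-eye camera
        public Camera portalCam;       // disabled camera, rendered manually below
        public Transform thisPortal;   // portal the player is looking through
        public Transform linkedPortal; // portal whose surroundings should appear on the screen
        public Material screenMat;     // material on this portal's screen quad
        RenderTexture leftTex, rightTex;

        void LateUpdate()
        {
            EnsureTextures();
            RenderEye(Camera.StereoscopicEye.Left, leftTex);
            RenderEye(Camera.StereoscopicEye.Right, rightTex);
            // The screen shader would then sample the matching texture per eye,
            // e.g. based on unity_StereoEyeIndex (these property names are made up).
            screenMat.SetTexture("_LeftEyeTex", leftTex);
            screenMat.SetTexture("_RightEyeTex", rightTex);
        }

        void RenderEye(Camera.StereoscopicEye eye, RenderTexture target)
        {
            // Transform that carries a pose from this portal's space to the linked portal's space.
            Matrix4x4 portalOffset = linkedPortal.localToWorldMatrix * thisPortal.worldToLocalMatrix;

            // Use the per-eye view/projection reported by the XR plugin, shifted through the portal.
            portalCam.worldToCameraMatrix = playerCam.GetStereoViewMatrix(eye) * portalOffset.inverse;
            portalCam.projectionMatrix = playerCam.GetStereoProjectionMatrix(eye);

            portalCam.targetTexture = target;
            portalCam.Render();
        }

        void EnsureTextures()
        {
            if (leftTex == null)
            {
                leftTex = new RenderTexture(XRSettings.eyeTextureWidth, XRSettings.eyeTextureHeight, 24);
                rightTex = new RenderTexture(XRSettings.eyeTextureWidth, XRSettings.eyeTextureHeight, 24);
            }
        }
    }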

    Other details worth noting: the first issue is present across the recommended versions for 2018, 2019, and 2020, AND the newest beta build at the time (2021.2.0b7.3246), although I only tested the second issue on 2020 and 2021 (and, like I said, the second issue is probably just the scripts needing to be adjusted for stereoscopic rendering). I also tried changing the Oculus camera rig to use separate left- and right-eye cameras instead of the single center-eye camera it uses by default, and attached the main camera script from the portals scene ONLY to the left eye to see if that would stop two different camera views from rendering, but it made no difference (a sketch of that experiment is below).
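    For reference, that left/right-eye experiment looked roughly like this (again, hypothetical names of my own, not the Oculus Integration's scripts): two eye cameras, each restricted to a single eye via stereoTargetEye, with the portal's main camera script on the left one only.

    Code (CSharp):
    using UnityEngine;

    public class EyeCameraSetup : MonoBehaviour
    {
        public Camera leftEyeCam;   // this one also gets the portal's main camera script
        public Camera rightEyeCam;

        void Start()
        {
            // Restrict each camera to one eye so only the left eye is driven by the portal script.
            leftEyeCam.stereoTargetEye = StereoTargetEyeMask.Left;
            rightEyeCam.stereoTargetEye = StereoTargetEyeMask.Right;
        }
    }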

    I've been Googling variations of the same five questions for 2 days now, so if you have any ideas as to why these issues may be occurring, or how to fix them, any help would be appreciated!