
Bug: Nested rendering messes with the selected object outline

Discussion in 'Universal Render Pipeline' started by MadWatch, Mar 19, 2022.

  1. MadWatch

    MadWatch

    Joined:
    May 26, 2016
    Posts:
    112
Hello everyone,

I ran into a weird but annoying problem when doing nested camera rendering in the scene view.

I'm trying to do something similar to the PlanarReflections in the BoatAttack demo. I have an object named Background with a child camera and a script that subscribes to the RenderPipelineManager beginCameraRendering event. When the event fires, the script creates a temporary texture and calls RenderSingleCamera() to render into it.

    It works, but it prevents the selection outline from being shown in the scene view if the selected object is outside of the frustum of the child camera.

    1.png

In the image above, a mesh sphere is selected and is within the frustum of Background's camera. The orange outline is displayed as expected.

    2.png

Now if I just move the sphere to the right, out of Background's camera frustum, the outline isn't displayed anymore (yes, the sphere is still selected).

Here is the code:
    Code (CSharp):
    [ExecuteInEditMode]
    public class Background : MonoBehaviour
    {
      // Components
      private Camera mCamera;

      // Internals
      private RenderTexture mTexture;


      #region Init
      private void Awake()
      {
        // Get the camera used to render the background
        mCamera = GetComponentInChildren<Camera>();

        // The camera should not be enabled because we control it manually
        Debug.Assert(!mCamera.enabled);
      }
      #endregion


      #region Utils
      private void ReleaseTexture()
      {
        if (mTexture != null)
        {
          RenderTexture.ReleaseTemporary(mTexture);
          mTexture = null;
        }
      }
      #endregion


      #region Events
      private void OnEnable()
      {
        RenderPipelineManager.beginCameraRendering += OnCameraRenderBegin;
        RenderPipelineManager.endCameraRendering   += OnCameraRenderEnd;
      }

      private void OnDisable()
      {
        RenderPipelineManager.beginCameraRendering -= OnCameraRenderBegin;
        RenderPipelineManager.endCameraRendering   -= OnCameraRenderEnd;
        ReleaseTexture();
      }
      #endregion


      #region Render
      private void OnCameraRenderBegin(ScriptableRenderContext context, Camera cam)
      {
        #if UNITY_EDITOR
        Awake();
        #endif

        if (mCamera == null)
          return;

        // The camera may not cover the whole screen, so get the target size in pixels
        int frameWidth  = cam.scaledPixelWidth;
        int frameHeight = cam.scaledPixelHeight;

        // Get a temporary texture to render the background into
        mTexture = RenderTexture.GetTemporary(frameWidth, frameHeight, 24);

        // Make the camera render into that texture
        mCamera.targetTexture = mTexture;

        // Make the pipeline render the camera
        UniversalRenderPipeline.RenderSingleCamera(context, mCamera);
      }

      private void OnCameraRenderEnd(ScriptableRenderContext context, Camera cam)
      {
        ReleaseTexture();
      }
      #endregion
    }
    I first encountered this problem while scripting my own render pipeline, then reproduced it with URP. It seems that the context.Cull() call inside RenderSingleCamera() is what causes the issue. I assume it is a bug, but maybe I missed something? Is there any way to work around this issue?
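
    For what it's worth, one idea I'm considering (a sketch only, I haven't verified that it preserves the outline) is to guard the handler so the nested render only runs for Game cameras and never re-enters for the child camera itself, since RenderSingleCamera() raises beginCameraRendering again. The obvious downside is that the background would no longer appear in the scene view at all:

    Code (CSharp):
    private void OnCameraRenderBegin(ScriptableRenderContext context, Camera cam)
    {
        // Hypothetical guard: skip our own child camera (which would otherwise
        // re-enter this handler) and any non-Game camera, e.g. the scene view
        // camera whose selection outline pass seems to be affected.
        if (cam == mCamera || cam.cameraType != CameraType.Game)
            return;

        // ... rest of the nested rendering as in the script above ...
    }

    But that is more of a band-aid than a fix, so I'd still like to know whether the culling behaviour is a bug.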

    Thanks