
Help Wanted How to efficiently create blurred texture from camera view to use as backdrop

Discussion in 'Universal Render Pipeline' started by uwdlg, Jul 24, 2021.

  1. uwdlg

    uwdlg

    Joined:
    Jan 16, 2017
    Posts:
    70
    Hi,

    here's what I'm trying to do: when transitioning from my standard first person view to a detail view, I want to smoothly blur and darken the current view and then superimpose an object to be inspected. Here's how I'm trying to accomplish this (code below):
    1. Render the current view from a second camera, ignoring UI and other layers, into a RenderTexture
    2. Create a new Texture2D and use ReadPixels() to copy the contents from the RenderTexture
    3. Graphics.Blit() from this Texture2D to a new RenderTexture at half the original resolution, using a Kawase blur shader
    4. Set this smaller RenderTexture on a RawImage of a screen-space Canvas acting as the backdrop
    5. Activate an overlay camera looking at the Canvas and the (inactive) detail object
    6. Lerp the RawImage alpha from 0 to 1 to create the illusion of the background getting progressively blurrier
    7. Activate the detail object
    8. Make the overlay camera the new base camera and deactivate the previous base camera
    I'm targeting WebGL which is why I don't just "animate" the blur offset in the shader but instead want to run the texture-read-heavy blur shader as rarely as possible.
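    Condensed, steps 1-4 amount to roughly the following (just a sketch; `w`, `h` and the fields are stand-ins for what the full coroutine further down does):
    Code (CSharp):
    ```csharp
    // Sketch of steps 1-4; identifiers match the full coroutine below.
    backdropCamera.Render();                               // 1. render without UI layers
    RenderTexture.active = backdropCamera.targetTexture;
    var tex = new Texture2D(w, h, TextureFormat.RGB24, false);
    tex.ReadPixels(new Rect(0, 0, w, h), 0, 0);            // 2. GPU -> CPU copy
    tex.Apply();
    var half = new RenderTexture(w / 2, h / 2, 0);         // 3. half-resolution target
    Graphics.Blit(tex, half, blurMaterial);                //    single Kawase blur pass
    backdropImage.texture = half;                          // 4. assign to backdrop RawImage
    ```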
    This works in the editor and I'm happy with how it looks. However, in WebGL builds the resulting RawImage texture seems to be fully transparent (I excluded the alpha-lerping part and put another image behind it, which is all I see). I can also see the smaller RenderTexture for about one frame. I thought the contents of my coroutine below should all run in one frame after the initial WaitForEndOfFrame (the reason for all the RenderTexture.active and Texture2D juggling is that Graphics.Blit() requires a Texture2D as source).
    By running the different parts in isolation, I found that the Graphics.Blit() line is the one causing the transparent-texture issue. I could get around this by using the Texture2D from the second step above together with a Material carrying the blur shader on the RawImage, but as I said, I would like to avoid running the blur shader over and over.
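    That per-frame workaround would look roughly like this (a sketch; it keeps the blur shader running on every draw, which is exactly what I want to avoid on WebGL):
    Code (CSharp):
    ```csharp
    // Workaround sketch: no second Blit, the RawImage's material blurs every frame.
    backdropImage.texture = texture;        // the Texture2D from step 2
    backdropImage.material = blurMaterial;  // blur shader now runs per frame
    ```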
    I tried changing the Texture formats of the RenderTexture and Texture2D but no luck.
    I also recall reading somewhere that Graphics.Blit() doesn't work with URP, but I couldn't find anything relevant in the official docs and am wondering why it works fine in the editor. I tried looking into custom render passes, but I only found examples for "permanent" post-processing-type cases where the shader used for blitting is again run repeatedly.
    How can I get this to work, or is my current approach the wrong way to go? (On the off chance this is a platform-specific bug, as I originally suspected, I also posted in the WebGL subforum, but without any traction as of yet.)

    The transition code:
    Code (CSharp):
    private IEnumerator CaptureBackdropTexture()
    {
        yield return new WaitForEndOfFrame();

        int width = backdropCamera.pixelWidth;
        int height = backdropCamera.pixelHeight;

        // Render the backdrop camera into its target texture and read it back.
        var lastRenderTarget = RenderTexture.active;
        RenderTexture.active = backdropCamera.targetTexture;

        backdropCamera.Render();

        var texture = new Texture2D(width, height, TextureFormat.RGB24, false);
        texture.ReadPixels(new Rect(0, 0, width, height), 0, 0, false);
        texture.wrapMode = TextureWrapMode.Clamp;
        texture.Apply(false);

        RenderTexture.active = lastRenderTarget;

        // Blur into a half-resolution render texture with a single blit.
        _renderTexture = new RenderTexture(width / 2, height / 2, 24, GraphicsFormat.R8G8B8A8_UNorm);

        Graphics.Blit(texture, _renderTexture, blurMaterial);

        Destroy(texture);
        detailCamera.gameObject.SetActive(true);
        baseCamera.GetUniversalAdditionalCameraData().cameraStack.Add(detailCamera);

        backdropImage.texture = _renderTexture;

        StartCoroutine(FadeInBackdrop());
    }

    private IEnumerator FadeInBackdrop()
    {
        float t = 0;
        backdropCanvas.gameObject.SetActive(true);
        while (t < 1)
        {
            // Fade towards a darkened, fully opaque backdrop.
            var backdropImageColor = Color.Lerp(Color.white, new Color(0.5f, 0.5f, 0.5f), t);
            backdropImageColor.a = Mathf.Lerp(0, 1, t);
            backdropImage.color = backdropImageColor;
            t += Time.deltaTime;
            yield return null;
        }
        backdropImage.color = new Color(0.5f, 0.5f, 0.5f, 1);

        detailObject.SetActive(true);

        // Promote the detail camera to base and retire the old base camera.
        baseCamera.GetUniversalAdditionalCameraData().cameraStack.Remove(detailCamera);
        detailCamera.gameObject.SetActive(true);
        detailCamera.GetUniversalAdditionalCameraData().renderType = CameraRenderType.Base;
        baseCamera.enabled = false;
    }
    The blur shader (adapted from https://github.com/tomc128/urp-kawase-blur/blob/master/Assets/KawaseBlur/KawaseBlur.shader and converted to URP by me when I thought that was the cause of the problem):
    Code (CSharp):
    Shader "Custom/TextureBlur"
    {
        Properties
        {
            _MainTex ("Texture", 2D) = "white" {}
            _Offset ("Offset", float) = 0.5
        }

        SubShader
        {
            Tags { "RenderType"="Opaque" "RenderPipeline" = "UniversalPipeline" }
            LOD 100

            Pass
            {
                HLSLPROGRAM
                #pragma vertex vert
                #pragma fragment frag

                #include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl"

                struct appdata
                {
                    float4 vertex : POSITION;
                    float2 uv : TEXCOORD0;
                };

                struct v2f
                {
                    float2 uv : TEXCOORD0;
                    float4 vertex : SV_POSITION;
                };

                TEXTURE2D(_MainTex);
                SAMPLER(sampler_MainTex);

                CBUFFER_START(UnityPerMaterial)
                float4 _MainTex_TexelSize;
                float4 _MainTex_ST;
                float _Offset;
                CBUFFER_END

                v2f vert(appdata v)
                {
                    v2f o;
                    o.vertex = TransformObjectToHClip(v.vertex.xyz);
                    o.uv = TRANSFORM_TEX(v.uv, _MainTex);
                    return o;
                }

                half4 frag(v2f input) : SV_Target
                {
                    float2 res = _MainTex_TexelSize.xy;
                    float i = _Offset;

                    // 9-tap Kawase kernel: center, diagonals, and axis neighbors.
                    half4 col;
                    col.rgb = SAMPLE_TEXTURE2D(_MainTex, sampler_MainTex, input.uv).rgb;
                    col.rgb += SAMPLE_TEXTURE2D(_MainTex, sampler_MainTex, input.uv + float2(i, i) * res).rgb;
                    col.rgb += SAMPLE_TEXTURE2D(_MainTex, sampler_MainTex, input.uv + float2(i, -i) * res).rgb;
                    col.rgb += SAMPLE_TEXTURE2D(_MainTex, sampler_MainTex, input.uv + float2(-i, i) * res).rgb;
                    col.rgb += SAMPLE_TEXTURE2D(_MainTex, sampler_MainTex, input.uv + float2(-i, -i) * res).rgb;

                    col.rgb += SAMPLE_TEXTURE2D(_MainTex, sampler_MainTex, input.uv + float2(0, i) * res).rgb;
                    col.rgb += SAMPLE_TEXTURE2D(_MainTex, sampler_MainTex, input.uv + float2(0, -i) * res).rgb;
                    col.rgb += SAMPLE_TEXTURE2D(_MainTex, sampler_MainTex, input.uv + float2(i, 0) * res).rgb;
                    col.rgb += SAMPLE_TEXTURE2D(_MainTex, sampler_MainTex, input.uv + float2(-i, 0) * res).rgb;

                    col.rgb /= 9.0f;
                    col.a = 1;

                    return col;
                }
                ENDHLSL
            }
        }
    }
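    The only tunable parameter is _Offset, so driving the blur strength from C# presumably boils down to this (a sketch, assuming a Material built from this shader):
    Code (CSharp):
    ```csharp
    // Assumed usage: blurMaterial wraps the Custom/TextureBlur shader above.
    blurMaterial.SetFloat("_Offset", 1.5f);               // larger offset -> wider Kawase kernel
    Graphics.Blit(sourceTexture, targetRT, blurMaterial); // _MainTex is bound by Blit itself
    ```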
     
  2. uwdlg

    Okay, I got it working now, but this seems to be a lot of work to get a simple Blit to work:
    I created a separate Renderer set to ignore the same layers as the camera I use to capture the background and assigned it to that camera. I then added a ScriptableRendererFeature that injects a ScriptableRenderPass at "After Rendering Post Processing", which calls Blit() like this:
    Code (CSharp):
    public override void Execute(ScriptableRenderContext context, ref RenderingData renderingData)
    {
        ref CameraData cameraData = ref renderingData.cameraData;
        RenderTargetIdentifier cameraTarget = (cameraData.targetTexture != null)
            ? new RenderTargetIdentifier(cameraData.targetTexture)
            : BuiltinRenderTextureType.CameraTarget;

        CommandBuffer cmd = CommandBufferPool.Get();
        cmd.SetGlobalTexture("_MainTex", m_Source);

        Blit(cmd, m_Source, cameraTarget, material);

        context.ExecuteCommandBuffer(cmd);
        CommandBufferPool.Release(cmd);
    }
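    For reference, the boilerplate around that Execute() looks roughly like this (a sketch with assumed names, not my exact feature class):
    Code (CSharp):
    ```csharp
    using UnityEngine;
    using UnityEngine.Rendering;
    using UnityEngine.Rendering.Universal;

    // Hypothetical minimal wrapper around the Execute() shown above.
    public class BlitBlurFeature : ScriptableRendererFeature
    {
        class BlitBlurPass : ScriptableRenderPass
        {
            public Material material;
            public RenderTargetIdentifier m_Source;

            public override void Execute(ScriptableRenderContext context, ref RenderingData renderingData)
            {
                // ... the Execute() body shown above ...
            }
        }

        public Material blurMaterial;
        BlitBlurPass m_Pass;

        public override void Create()
        {
            m_Pass = new BlitBlurPass
            {
                material = blurMaterial,
                renderPassEvent = RenderPassEvent.AfterRenderingPostProcessing
            };
        }

        public override void AddRenderPasses(ScriptableRenderer renderer, ref RenderingData renderingData)
        {
            m_Pass.m_Source = renderer.cameraColorTarget; // camera color as blit source
            renderer.EnqueuePass(m_Pass);
        }
    }
    ```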
    Isn't there some way I can get the Blit functionality to work without all these steps?
     
    Last edited: Jul 30, 2021