
Feedback How to Blit in URP 12 - Documentation Needed

Discussion in 'Universal Render Pipeline' started by daneobyrd, Dec 13, 2021.

  1. BruceKristelijn

    BruceKristelijn

    Joined:
    Apr 28, 2017
    Posts:
    107



    So I think I understand why these issues happen. I am developing for the Quest 2, and enabling the opaque texture causes these issues even without any render pass enabled. So maybe if I can find a way to prevent the passes from enabling the opaque texture it could work, but I guess it is a hardware limitation.
     
  2. wwWwwwW1

    wwWwwwW1

    Joined:
    Oct 31, 2021
    Posts:
    761
  3. chentianqin

    chentianqin

    Joined:
    Jan 7, 2021
    Posts:
    3
    If I want to blit from a source render texture that was allocated by calling commandBuffer.GetTemporaryRT(), how can I do that? It reports an error when I call Blitter.BlitCameraTexture().
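    A minimal sketch of one likely cause, assuming the error is that Blitter.BlitCameraTexture expects RTHandle arguments while GetTemporaryRT only yields a shader-property ID; wrapping the identifier in an RTHandle is one way around it (TempRTBlit, k_TempId and the material parameter are illustrative names, not from this thread):

    ```csharp
    using UnityEngine;
    using UnityEngine.Rendering;

    // Hypothetical helper: lets a CommandBuffer temporary RT be used with the
    // Blitter API, which takes RTHandle parameters rather than raw identifiers.
    static class TempRTBlit
    {
        static readonly int k_TempId = Shader.PropertyToID("_MyTempRT");

        public static void BlitFromTemporary(CommandBuffer cmd, RenderTextureDescriptor desc,
                                             RTHandle destination, Material material)
        {
            cmd.GetTemporaryRT(k_TempId, desc);

            // RTHandles.Alloc can wrap an existing RenderTargetIdentifier; the
            // handle does not own the texture, it only references it.
            RTHandle source = RTHandles.Alloc(new RenderTargetIdentifier(k_TempId));

            Blitter.BlitCameraTexture(cmd, source, destination, material, 0);

            cmd.ReleaseTemporaryRT(k_TempId);
            source.Release(); // releases the wrapper, not the underlying temp RT
        }
    }
    ```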
     
    shotoutgames likes this.
  4. StrangeWays777

    StrangeWays777

    Joined:
    Jul 21, 2018
    Posts:
    31
    I have done as you said but I am getting an error saying "'VolumetricRendering.SetupRenderPasses(ScriptableRenderer, ref RenderingData)': no suitable method found to override".

    There is almost no documentation on this and what I do find is always outdated. Any help would be massively appreciated. Thanks in advance.
     
  5. daneobyrd

    daneobyrd

    Joined:
    Mar 29, 2018
    Posts:
    101
    I ran into this recently; you need to change the RenderingData parameter to `in` instead of `ref`.
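    For anyone else hitting the "no suitable method found to override" error, a minimal sketch of the expected signatures in recent URP versions (the feature class name is illustrative):

    ```csharp
    using UnityEngine.Rendering.Universal;

    public class ExampleFeature : ScriptableRendererFeature
    {
        public override void Create() { }

        // AddRenderPasses still takes RenderingData by ref...
        public override void AddRenderPasses(ScriptableRenderer renderer, ref RenderingData renderingData) { }

        // ...but SetupRenderPasses declares it with `in` (read-only reference).
        // Writing `ref` here is what triggers "no suitable method found to override".
        public override void SetupRenderPasses(ScriptableRenderer renderer, in RenderingData renderingData)
        {
            // Assign pass targets here, e.g. from renderer.cameraColorTargetHandle.
        }
    }
    ```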
     
    StrangeWays777 likes this.
  6. StrangeWays777

    StrangeWays777

    Joined:
    Jul 21, 2018
    Posts:
    31
    Thank you! It worked.

    I still need to figure out why I keep getting the warning
    'CommandBuffer: temporary render texture not found while executing RenderClouds (SetGlobalTexture)',
    but now I am one step closer!

    SOLVED:
    I had to set the source for the RenderPass, which ended up looking like this.

    Code (CSharp):
    public override void SetupRenderPasses(ScriptableRenderer renderer, in RenderingData renderingData)
    {
        if (settings.cloudRenderer == null)
        {
            Debug.LogWarningFormat("Missing Blit Material. {0} blit pass will not execute. Check for missing reference in the assigned renderer.", GetType().Name);
            return;
        }

        cloudsPass.source = renderer.cameraColorTargetHandle;
    }
     
    Last edited: Jun 28, 2023
  7. cecarlsen

    cecarlsen

    Joined:
    Jun 30, 2006
    Posts:
    858
    I need help with this. All these new methods in URP are utterly confusing.

    I am rendering the scene using a depth-only pass into the color target of the camera, which is a RenderTexture asset of type RFloat. That works. Now I want the depth to be Linear01, so I thought I would add a blit to the CommandBuffer in the RendererFeature, but this seems to break the depth rendering. I am not entirely sure what is going on. When I add the Blitter blit, the scene content disappears, and it seems like it is also no longer cleared, because whatever I blit into the texture accumulates.

    Any hints are greatly appreciated.


    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Rendering;
    using UnityEngine.Rendering.Universal;
    using UnityEngine.Rendering.RendererUtils;

    public class RenderToDepthTextureFeature : ScriptableRendererFeature
    {
        public LayerMask layerMask;
        public RenderPassEvent renderPassEvent = RenderPassEvent.BeforeRenderingSkybox;

        RenderToDepthTexturePass _scriptablePass;
        Material _fixMaterial;

        public override void Create()
        {
            _scriptablePass = new RenderToDepthTexturePass( layerMask, _fixMaterial );
            _scriptablePass.renderPassEvent = renderPassEvent;

            _fixMaterial = CoreUtils.CreateEngineMaterial( Shader.Find( "Hidden/RenderToDepthTextureFeatureFix" ) );
        }

        public override void AddRenderPasses( ScriptableRenderer renderer, ref RenderingData renderingData )
        {
            renderer.EnqueuePass( _scriptablePass );
        }

        public override void SetupRenderPasses( ScriptableRenderer renderer, in RenderingData renderingData )
        {
            if (renderingData.cameraData.cameraType == CameraType.Game)
            {
                // Calling ConfigureInput with the ScriptableRenderPassInput.Color argument
                // ensures that the opaque texture is available to the Render Pass.
                _scriptablePass.ConfigureInput( ScriptableRenderPassInput.Color );
                _scriptablePass.SetTarget( renderer.cameraColorTargetHandle );
            }
        }

        protected override void Dispose(bool disposing)
        {
            CoreUtils.Destroy( _fixMaterial );
        }

        class RenderToDepthTexturePass : ScriptableRenderPass
        {
            ProfilingSampler _profilingSampler;
            ShaderTagId _depthOnlyId;

            LayerMask _layerMask;
            Material _fixMaterial;

            RTHandle _colorTargetHandle;

            public RenderToDepthTexturePass( LayerMask layerMask, Material fixMaterial )
            {
                _layerMask = layerMask;
                _depthOnlyId = new ShaderTagId( "DepthOnly" );
                _profilingSampler = new ProfilingSampler( nameof( RenderToDepthTexturePass ) );
                _fixMaterial = fixMaterial;
            }

            public void SetTarget( RTHandle colorHandle )
            {
                _colorTargetHandle = colorHandle;
            }

            public override void OnCameraSetup( CommandBuffer cmd, ref RenderingData renderingData )
            {
                ConfigureTarget( _colorTargetHandle );
            }

            public override void Execute( ScriptableRenderContext context, ref RenderingData renderingData )
            {
                CommandBuffer cmd = CommandBufferPool.Get();

                using( new ProfilingScope( cmd, _profilingSampler ) )
                {
                    RendererListDesc descrp = new RendererListDesc( _depthOnlyId, renderingData.cullResults, renderingData.cameraData.camera );
                    descrp.renderQueueRange = RenderQueueRange.opaque;
                    descrp.layerMask = _layerMask;
                    RendererList rendererList = context.CreateRendererList( descrp );

                    // TEST
                    //cmd.SetRenderTarget( _colorTargetHandle );

                    cmd.ClearRenderTarget( clearDepth: true, clearColor: true, Color.black );
                    cmd.DrawRendererList( rendererList );

                    // TEST
                    //cmd.CopyTexture( renderingData.cameraData.renderer.cameraColorTargetHandle, _colorTargetHandle );

                    // From the example: https://docs.unity3d.com/Packages/com.unity.render-pipelines.universal@16.0/manual/renderer-features/how-to-fullscreen-blit.html
                    // "Do not use the cmd.Blit method in URP XR projects because that method has compatibility issues with the URP XR integration.
                    // Using cmd.Blit might implicitly enable or disable XR shader keywords, which breaks XR SPI rendering."
                    Blitter.BlitCameraTexture( cmd, _colorTargetHandle, _colorTargetHandle, _fixMaterial, pass: 0 );
                    //cmd.Blit( _colorTargetHandle, _colorTargetHandle, _fixMaterial ); // NO NO
                }

                context.ExecuteCommandBuffer( cmd );
                cmd.Clear();

                CommandBufferPool.Release( cmd );
            }
        }
    }

    Code (CSharp):
    Shader "Hidden/RenderToDepthTextureFeatureFix"
    {
        SubShader
        {
            Tags { "RenderType"="Opaque" "RenderPipeline" = "UniversalPipeline"}
            LOD 100
            ZWrite Off Cull Off
            Pass
            {
                Name "RenderToDepthTextureFeatureFix"

                HLSLPROGRAM
                #include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl"
                // The Blit.hlsl file provides the vertex shader (Vert),
                // input structure (Attributes) and output structure (Varyings)
                #include "Packages/com.unity.render-pipelines.core/Runtime/Utilities/Blit.hlsl"

                #pragma vertex Vert
                #pragma fragment frag

                TEXTURE2D_X( _CameraOpaqueTexture );
                SAMPLER( sampler_CameraOpaqueTexture );

                float frag( Varyings input ) : SV_Target
                {
                    UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX( input );

                    float depth = SAMPLE_TEXTURE2D_X( _CameraOpaqueTexture, sampler_CameraOpaqueTexture, input.texcoord ).r;

                    // TODO: Depth to linear.
                    // https://teodutra.com/unity/shaders/urp/graphics/2020/05/18/From-Built-in-to-URP/
                    //depth = Linear01Depth( depth, _ZBufferParams );

                    // TEST
                    depth += 0.1;

                    return depth;
                }
                ENDHLSL
            }
        }
    }
    I can see that both the depth pass and the blit pass are operating on the same texture. I really don't get it.

    DepthOnlyPass.png

    BlitFixPass.png

    If I log the name of the _colorTargetHandle (defined in the RenderToDepthTexturePass class), I see that it's called "_CameraColorAttachmentA". But in the Frame Debugger, it's called "_CameraColorAttachmentA_640x576_R32_SFloat_Tex2D". Does this have something to do with the issue?

    Also, if I compare renderingData.cameraData.renderer.cameraColorTargetHandle with _colorTargetHandle inside RenderToDepthTexturePass.Execute, they are the same! So why should I go through the hassle of passing the RTHandle from the RendererFeature to the RenderPass, as recommended in the official Blit example?
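    One thing that stands out in the code above (a guess, not a confirmed diagnosis): Blitter.BlitCameraTexture is called with the same RTHandle as both source and destination, and reading from and writing to one render target in a single blit is undefined behavior on most GPUs. A sketch of ping-ponging through a temporary handle instead (the pass, field, and texture names are illustrative):

    ```csharp
    using UnityEngine;
    using UnityEngine.Rendering;
    using UnityEngine.Rendering.Universal;

    // Sketch of a ping-pong blit: instead of blitting a texture onto itself,
    // apply the material while copying into a temporary target, then copy back.
    class PingPongBlitPass : ScriptableRenderPass
    {
        RTHandle _tempHandle;
        RTHandle _colorTargetHandle;
        Material _fixMaterial;

        public void Setup(RTHandle colorTarget, Material fixMaterial, in RenderingData renderingData)
        {
            _colorTargetHandle = colorTarget;
            _fixMaterial = fixMaterial;

            var desc = renderingData.cameraData.cameraTargetDescriptor;
            desc.depthBufferBits = 0;
            // Lazily (re)allocates the temporary target to match the camera.
            RenderingUtils.ReAllocateIfNeeded(ref _tempHandle, desc, name: "_TempColorCopy");
        }

        public override void Execute(ScriptableRenderContext context, ref RenderingData renderingData)
        {
            CommandBuffer cmd = CommandBufferPool.Get();

            // 1) Apply the material while copying color -> temp.
            Blitter.BlitCameraTexture(cmd, _colorTargetHandle, _tempHandle, _fixMaterial, 0);
            // 2) Copy the result back into the camera color target.
            Blitter.BlitCameraTexture(cmd, _tempHandle, _colorTargetHandle);

            context.ExecuteCommandBuffer(cmd);
            cmd.Clear();
            CommandBufferPool.Release(cmd);
        }

        public void Dispose() => _tempHandle?.Release();
    }
    ```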
     
    Last edited: Sep 28, 2023
  8. RebelEggGames

    RebelEggGames

    Joined:
    Dec 12, 2022
    Posts:
    27
    It took me several hours to make it work. Unity docs are outdated and they suck.
    Since the docs were useless, and forums, Google and GPT-4 didn't help either, I had to go over a very complex (but working!) post process someone else made, and what I found out is that basically, in order to avoid render stacking, you have to draw the renderers yourself before blitting.
    I am pretty sure this is not expected behaviour; a post-process should affect only the current frame, not the next one (WTF Unity?).

    The example I prepared is good enough for my current problem; however, bear in mind that you might need to change the input to the "DrawRenderers" method depending on what you want to render and how (e.g. transparent objects).

    I have just finished it, so this is the first solution I've come up with; while it solves the main problem (post-process stacking), I'm pretty sure it can be improved.

    Full code:

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Rendering;
    using UnityEngine.Rendering.Universal;

    public class OutlineRendererFeature : ScriptableRendererFeature
    {
        [System.Serializable]
        public class OutlineRendererSettings
        {
            public RenderPassEvent Event = RenderPassEvent.AfterRenderingTransparents;
            public RenderQueueRange Range;
            public LayerMask mask;
            public Shader postProcessShader = null;
            public float scale = 1.0f;
        }

        public OutlineRendererSettings settings = new OutlineRendererSettings();

        private Material material;
        private OutlineRendererPass outlineRendererPass;

        public override void Create()
        {
            if (settings.postProcessShader == null)
            {
                Debug.LogError("Missing post-process shader in OutlineRenderer settings.");
                return;
            }

            material = CoreUtils.CreateEngineMaterial(settings.postProcessShader);
            outlineRendererPass = new OutlineRendererPass(name, settings.Event, settings.Range, settings.mask, material);
        }

        public override void AddRenderPasses(ScriptableRenderer renderer, ref RenderingData renderingData)
        {
            renderer.EnqueuePass(outlineRendererPass);
        }

        public override void SetupRenderPasses(ScriptableRenderer renderer, in RenderingData renderingData)
        {
            if (renderingData.cameraData.cameraType == CameraType.Game)
            {
                outlineRendererPass.ConfigureInput(ScriptableRenderPassInput.Color);
                outlineRendererPass.SetTarget(renderer.cameraColorTargetHandle, settings.scale);
            }
        }

        protected override void Dispose(bool disposing)
        {
            CoreUtils.Destroy(material);
        }
    }

    class OutlineRendererPass : ScriptableRenderPass
    {
        private RTHandle cameraColorTarget;
        private Material postProcessMaterial;
        private float scale;

        private FilteringSettings filteringSettings;
        private RenderStateBlock renderStateBlock;
        private ShaderTagId shaderTagId = new ShaderTagId("OutlineRenderer");
        private string profilerTag;
        private ProfilingSampler customProfilingSampler;

        public OutlineRendererPass(Material material)
        {
            this.postProcessMaterial = material;
        }

        public OutlineRendererPass(string profilerTag, RenderPassEvent renderPassEvent, RenderQueueRange renderQueueRange, LayerMask layerMask, Material material)
        {
            this.profilerTag = profilerTag;
            customProfilingSampler = new ProfilingSampler(profilerTag);
            this.renderPassEvent = renderPassEvent;
            filteringSettings = new FilteringSettings(renderQueueRange, layerMask);
            this.postProcessMaterial = material;
            renderStateBlock = new RenderStateBlock(RenderStateMask.Nothing);
        }

        public void SetTarget(RTHandle colorHandle, float scale)
        {
            cameraColorTarget = colorHandle;
            this.scale = scale;
        }

        public override void OnCameraSetup(CommandBuffer cmd, ref RenderingData renderingData)
        {
            if (cameraColorTarget == null)
            {
                Debug.LogError("cameraColorTarget is null!");
                return;
            }
            ConfigureTarget(cameraColorTarget);
        }

        public override void Execute(ScriptableRenderContext context, ref RenderingData renderingData)
        {
            if (renderingData.cameraData.camera.cameraType != CameraType.Game)
            {
                return;
            }

            if (postProcessMaterial == null)
            {
                Debug.LogError("postProcessMaterial is null! This should never happen.");
                return;
            }

            CommandBuffer cmd = CommandBufferPool.Get();
            using (new ProfilingScope(cmd, customProfilingSampler))
            {
                context.ExecuteCommandBuffer(cmd);
                cmd.Clear();

                SortingSettings sortingSettings = new SortingSettings(renderingData.cameraData.camera);
                sortingSettings.criteria = SortingCriteria.CommonOpaque;
                DrawingSettings drawSettings = new DrawingSettings(shaderTagId, sortingSettings);
                drawSettings.perObjectData = PerObjectData.None;

                context.DrawRenderers(renderingData.cullResults, ref drawSettings, ref filteringSettings, ref renderStateBlock);

                postProcessMaterial.SetFloat("_Scale", scale);
                Blitter.BlitCameraTexture(cmd, cameraColorTarget, cameraColorTarget, postProcessMaterial, 0);
            }
            context.ExecuteCommandBuffer(cmd);
            cmd.Clear();

            CommandBufferPool.Release(cmd);
        }
    }
    <shameless-self-promotion>
    If this helps & you want to thank me -> go buy my game on Steam & send me feedback!
    </shameless-self-promotion>

    Good luck! :D
     
    Last edited: Oct 5, 2023
    nasos_333, cecarlsen and thelebaron like this.
  9. arkano22

    arkano22

    Joined:
    Sep 20, 2012
    Posts:
    1,891
    Working on URP blitting for my assets, and just have to say this is an absolute clusterfuck.

    Lack of/incorrect/outdated documentation, no clear gold standard to follow, and lots of boilerplate code required even for the simplest post process effect.

    Hooray SRPs.
     
  10. RebelEggGames

    RebelEggGames

    Joined:
    Dec 12, 2022
    Posts:
    27
    I feel you man, working with URP is a nightmare if you want to create some post-process effects...
    Also, the lack of a grab pass is just... ugh, you need to do 2 separate post processes to work around this problem...
     
  11. arkano22

    arkano22

    Joined:
    Sep 20, 2012
    Posts:
    1,891
    It's not just that things are extremely lacking compared to the built-in pipeline, it's the huge amount of time wasted on the simplest of things. I just spent an entire day debugging through a custom renderer feature, only to realize that Blitter.BlitCameraTexture passes the input texture to the shader as "_BlitTexture" instead of "_MainTex". Furthermore, it is necessary to use a specific vertex shader for this method to work at all.

    Does this follow Unity's own conventions? No. Was it a necessary naming change? Probably not. Is it mentioned in the method's documentation? Nope. I had to use the Frame Debugger to figure out where the problem was, then wade through URP's source code to get an answer.

    At this point I've spent more time reverse-engineering Unity's SRPs than working on my own stuff... it's nerve-wracking. Sorry for the rant.
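    For reference, a minimal sketch of a shader body that follows those Blitter conventions (the surrounding Shader/SubShader/Pass framing is omitted; _BlitTexture, sampler_PointClamp, Vert and Varyings all come from Blit.hlsl):

    ```hlsl
    // Minimal Blitter-compatible shader body: Blit.hlsl declares _BlitTexture,
    // the clamp samplers, the Vert vertex shader and the Varyings struct.
    HLSLPROGRAM
    #include "Packages/com.unity.render-pipelines.core/Runtime/Utilities/Blit.hlsl"

    #pragma vertex Vert      // provided by Blit.hlsl; required for Blitter.* calls
    #pragma fragment frag

    half4 frag(Varyings input) : SV_Target
    {
        // The Blitter binds the source as _BlitTexture, not _MainTex.
        return SAMPLE_TEXTURE2D_X(_BlitTexture, sampler_PointClamp, input.texcoord);
    }
    ENDHLSL
    ```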
     
    Last edited: Oct 31, 2023
  12. nasos_333

    nasos_333

    Joined:
    Feb 13, 2013
    Posts:
    13,286
    So pretty much they broke everything they could with these new changes.

    This is just unbelievable.

    I tried to replace things based on the above and nothing works; I get multiple errors no matter what I do.

    The whole thing looks broken.
     
    Last edited: Nov 1, 2023
    goncalo-vasconcelos likes this.
  13. nasos_333

    nasos_333

    Joined:
    Feb 13, 2013
    Posts:
    13,286
    Has anyone managed to do a simple blit with the Blitter, writing alpha values from the shader to the destination render texture?

    No matter what I tried, the transparency value from the blit shader is not written to the output render texture.
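    Two things worth checking here (assumptions, not a confirmed fix for this case): the destination render texture needs a format that actually stores an alpha channel, and the blit material must not discard alpha via blending or a ColorMask. A sketch of allocating an explicit RGBA target (the helper and texture names are hypothetical):

    ```csharp
    using UnityEngine;
    using UnityEngine.Experimental.Rendering;
    using UnityEngine.Rendering;

    static class AlphaTargetExample
    {
        // Sketch: an R11G11B10 or RGB-only destination silently drops alpha,
        // so request a color format with an alpha channel explicitly.
        public static RTHandle AllocAlphaTarget(int width, int height)
        {
            return RTHandles.Alloc(
                width, height,
                colorFormat: GraphicsFormat.R16G16B16A16_SFloat, // keeps alpha
                name: "_AlphaBlitTarget");
        }
    }
    ```

    On the shader side, also make sure the pass actually writes alpha (e.g. `Blend Off` and no `ColorMask RGB`).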
     
  14. RaventurnPatrick

    RaventurnPatrick

    Joined:
    Aug 9, 2011
    Posts:
    248
    I can only agree. We tried writing some custom camera effects with URP 12 and used the official documentation, such as here: https://docs.unity3d.com/Packages/c...renderer-features/how-to-fullscreen-blit.html

    However, even this simple example does not work as soon as you need transparency, for example.
    In general, the whole concept of renderer features seems very overcomplicated compared to how it was done in built-in.
     
  15. echu33

    echu33

    Joined:
    Oct 30, 2020
    Posts:
    62
    Hello everyone, I've encountered a similar issue: the URP documentation's example didn't work on transparent objects when blitting. So after tracing the official FullScreenPassRendererFeature (used for the full-screen Shader Graph feature) I've got a working one. (I'm using URP 14.)

    Below is my renderer feature code:
    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Rendering;
    using UnityEngine.Rendering.Universal;

    public class ColorBlitRendererFeature : ScriptableRendererFeature
    {
        public enum InjectionPoint
        {
            BeforeRenderingTransparents = RenderPassEvent.BeforeRenderingTransparents,
            BeforeRenderingPostProcessing = RenderPassEvent.BeforeRenderingPostProcessing,
            AfterRenderingPostProcessing = RenderPassEvent.AfterRenderingPostProcessing
        }
        public InjectionPoint injectionPoint = InjectionPoint.AfterRenderingPostProcessing;
        public bool fetchColorBuffer = true;

        /// <summary>
        /// A mask of URP textures that the assigned material will need access to. Requesting unused requirements can degrade
        /// performance unnecessarily as URP might need to run additional rendering passes to generate them.
        /// </summary>
        public ScriptableRenderPassInput requirements = ScriptableRenderPassInput.None;

        public Material passMaterial;

        internal bool showAdditionalProperties = false;

        public int passIndex = 0;

        /// <summary>
        /// Specifies if the active camera's depth-stencil buffer should be bound when rendering the full screen pass.
        /// Disabling this will ensure that the material's depth and stencil commands will have no effect (this could also have a slight performance benefit).
        /// </summary>
        public bool bindDepthStencilAttachment = false;

        private FullScreenRenderPass m_FullScreenPass;

        /// <inheritdoc/>
        public override void Create()
        {
            m_FullScreenPass = new FullScreenRenderPass(name);
        }

        /// <inheritdoc/>
        public override void AddRenderPasses(ScriptableRenderer renderer, ref RenderingData renderingData)
        {
            if (renderingData.cameraData.cameraType == CameraType.Preview || renderingData.cameraData.cameraType == CameraType.Reflection)
                return;

            if (passMaterial == null)
            {
                Debug.LogWarningFormat("The full screen feature \"{0}\" will not execute - no material is assigned. Please make sure a material is assigned for this feature on the renderer asset.", name);
                return;
            }

            if (passIndex < 0 || passIndex >= passMaterial.passCount)
            {
                Debug.LogWarningFormat("The full screen feature \"{0}\" will not execute - the pass index is out of bounds for the material.", name);
                return;
            }

            m_FullScreenPass.renderPassEvent = (RenderPassEvent)injectionPoint;
            m_FullScreenPass.ConfigureInput(requirements);
            m_FullScreenPass.SetupMembers(passMaterial, passIndex, fetchColorBuffer, bindDepthStencilAttachment);

            renderer.EnqueuePass(m_FullScreenPass);
        }

        /// <inheritdoc/>
        protected override void Dispose(bool disposing)
        {
            m_FullScreenPass.Dispose();
        }

        public class FullScreenRenderPass : ScriptableRenderPass
        {
            private Material m_Material;
            private int m_PassIndex;
            private bool m_CopyActiveColor;
            private bool m_BindDepthStencilAttachment;
            private RTHandle m_CopiedColor;

            public FullScreenRenderPass(string passName)
            {
                profilingSampler = new ProfilingSampler(passName);
            }

            public void SetupMembers(Material material, int passIndex, bool copyActiveColor, bool bindDepthStencilAttachment)
            {
                m_Material = material;
                m_PassIndex = passIndex;
                m_CopyActiveColor = copyActiveColor;
                m_BindDepthStencilAttachment = bindDepthStencilAttachment;
            }

            public override void OnCameraSetup(CommandBuffer cmd, ref RenderingData renderingData)
            {
                // FullScreenPass manages its own RenderTarget.
                // ResetTarget here so that ScriptableRenderer's active attachment can be invalidated when processing this ScriptableRenderPass.
                ResetTarget();

                if (m_CopyActiveColor)
                    ReAllocate(renderingData.cameraData.cameraTargetDescriptor);
            }

            internal void ReAllocate(RenderTextureDescriptor desc)
            {
                desc.msaaSamples = 1;
                desc.depthBufferBits = (int)DepthBits.None;
                //RenderingUtils.ReAllocateIfNeeded(ref m_CopiedColor, desc, name: "_CameraColorTexture"); works too, why?
                RenderingUtils.ReAllocateIfNeeded(ref m_CopiedColor, desc, name: "_FullscreenPassColorCopy");
            }

            public void Dispose()
            {
                m_CopiedColor?.Release();
            }

            public override void Execute(ScriptableRenderContext context, ref RenderingData renderingData)
            {
                ref var cameraData = ref renderingData.cameraData;
                CommandBuffer cmd = CommandBufferPool.Get();

                using (new ProfilingScope(cmd, profilingSampler))
                {
                    // To be able to blit to the current camera color, we need to copy the current result into a RT.
                    // In this case the RT is m_CopiedColor.
                    if (m_CopyActiveColor)
                    {
                        CoreUtils.SetRenderTarget(cmd, m_CopiedColor);
                        Blitter.BlitTexture(cmd, cameraData.renderer.cameraColorTargetHandle, new Vector4(1, 1, 0, 0), 0.0f, false);
                    }

                    if (m_BindDepthStencilAttachment)
                    {
                        CoreUtils.SetRenderTarget(cmd, cameraData.renderer.cameraColorTargetHandle, cameraData.renderer.cameraDepthTargetHandle);
                    }
                    else
                    {
                        CoreUtils.SetRenderTarget(cmd, cameraData.renderer.cameraColorTargetHandle);
                    }

                    Blitter.BlitTexture(cmd, m_CopyActiveColor ? m_CopiedColor : null, new Vector4(1, 1, 0, 0), m_Material, m_PassIndex);
                }
                context.ExecuteCommandBuffer(cmd);
                cmd.Clear();

                CommandBufferPool.Release(cmd);
            }
        }
    }
    And here's the shader code to test

    Code (CSharp):
    Shader "ColorBlit"
    {
        Properties
        {
            _Intensity("_Intensity", Range(0,1)) = 0
        }
        SubShader
        {
            Tags { "RenderType"="Opaque" "RenderPipeline" = "UniversalPipeline"}
            LOD 100
            ZWrite Off Cull Off
            Pass
            {
                Name "ColorBlitPass"

                HLSLPROGRAM
                #include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl"
                // The Blit.hlsl file provides the vertex shader (Vert),
                // input structure (Attributes) and output structure (Varyings)
                #include "Packages/com.unity.render-pipelines.core/Runtime/Utilities/Blit.hlsl"

                #pragma vertex Vert
                #pragma fragment frag

                float _Intensity;

                half4 frag (Varyings input) : SV_Target
                {
                    UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX(input);
                    float4 color = SAMPLE_TEXTURE2D(_BlitTexture, sampler_PointClamp, input.texcoord);
                    return lerp(color, 1-color, _Intensity);
                }
                ENDHLSL
            }
        }
    }
    After all, a full-screen blit is just putting a quad mesh that has exactly the same size as the screen, and then the shader draws something on the quad.

    To be able to read the current screen color, we need to copy the current CameraColorTarget into a temporary RT. Because you can't read from and write to the same render target at the same time, an extra copy to fetch the current color into another RT is required; in the above code's case, that RT is m_CopiedColor. Then, when we want to perform the blit of our own post effect, we pass that RT as an input texture.

    Unity's Blitter class and its shader (Blit.hlsl) simply handle the quad's position/scale so the mesh matches the screen size during the draw, and they expect _BlitTexture as the input texture.

    [ Some extra info if you don't want to include Blit.hlsl ]
    If you don't want Blit.hlsl involved at all, you can replace the call to Blitter.BlitTexture with something like:

    Code (CSharp):
    using (new ProfilingScope(cmd, profilingSampler))
    {
        // To be able to blit to the current camera color, we need to copy the current result into a RT.
        // In this case the RT is m_CopiedColor.
        if (m_CopyActiveColor)
        {
            CoreUtils.SetRenderTarget(cmd, m_CopiedColor);
            // Use the Blitter API for simplicity.
            Blitter.BlitCameraTexture(cmd, cameraData.renderer.cameraColorTargetHandle, m_CopiedColor);
        }

        if (m_BindDepthStencilAttachment)
        {
            CoreUtils.SetRenderTarget(cmd, cameraData.renderer.cameraColorTargetHandle, cameraData.renderer.cameraDepthTargetHandle);
        }
        else
        {
            CoreUtils.SetRenderTarget(cmd, cameraData.renderer.cameraColorTargetHandle);
        }
        // Now, when we want to perform the blit with our own post-effect material,
        // we can pass the m_CopiedColor RT as a texture input.
        m_Material.SetTexture("_BaseMap", m_CopiedColor);
        cmd.DrawMesh(RenderingUtils.fullscreenMesh, Matrix4x4.identity, m_Material);
    }
    RenderingUtils.fullscreenMesh and similar APIs will give you a four-vertex quad mesh; in the vertex shader we simply use its vertex positions as clip-space positions.

    Code (CSharp):
    Varyings vert(Attributes input)
    {
        Varyings output;
        UNITY_SETUP_INSTANCE_ID(input);
        UNITY_INITIALIZE_VERTEX_OUTPUT_STEREO(output);

        output.positionCS = float4(input.positionOS.xyz, 1.0);

        #if UNITY_UV_STARTS_AT_TOP
            output.positionCS.y *= -1;
        #endif

        output.uv = input.uv;
        return output;
    }
    Then, in the fragment shader, we use our own input texture (in this case _BaseMap):
    Code (CSharp):
    TEXTURE2D(_BaseMap);
    SAMPLER(sampler_BaseMap);

    float _Intensity;

    half4 frag (Varyings input) : SV_Target
    {
        UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX(input);
        float4 color = SAMPLE_TEXTURE2D(_BaseMap, sampler_BaseMap, input.uv);
        return lerp(color, 1-color, _Intensity);
    }
    Some feedback for Unity's URP team:

    1. Maybe just modify and add some comments to the FullScreenPassRendererFeature, then put it in the URP documentation as an example. I believe it would make things clearer for users.

    2. I agree that using Blit.hlsl to abstract the underlying low-level complexity from the user is probably necessary. But the way Blit.hlsl works is quite different from how Unity has always done other parts of its shaders.

    For example, we have macros such as TransformObjectToWorld and ComputeScreenPos for when we need to transform positions into other spaces. Unity handles the complex part, but at least the user can easily understand what's going on.

    This is not the case for Blit.hlsl. Right now the user just includes Blit.hlsl and everything just works, since even the vertex program and fragment program are already defined inside it. But things won't work if the user doesn't completely follow the Blitter API from the C# script.

    This leads to harder-to-understand shader code (tracing URP shader code isn't that intuitive for users who don't have Rider). What might make things simpler for users is to expose a macro like ComputeFullScreenQuadClipPos (I just invented that name) and raise awareness of cmd.DrawQuad/DrawTriangle/DrawMesh(fullscreenMesh), then let users do the blit themselves.


    :) have a nice day.
     
    Last edited: Jan 15, 2024
  16. Themuffin

    Themuffin

    Joined:
    Jul 12, 2017
    Posts:
    3
    Very nice write-up; this works for me, albeit this whole URP blitting API with its tons of boilerplate is still very confusing.
    I'm a bit stuck though: I want to be able to access the camera depth texture as well, but no luck. How would one extend this feature to output depth into
    Code (CSharp):
    _InputDepthTexture
    so that I can use it in ColorBlit.hlsl?
     
  17. echu33

    echu33

    Joined:
    Oct 30, 2020
    Posts:
    62
    Accessing URP's camera depth texture works as usual: enable Depth Texture in your URP asset.

    upload_2024-3-20_10-21-30.png

    Then in your shader you just declare:

    TEXTURE2D_X(_CameraDepthTexture);
    SAMPLER(sampler_CameraDepthTexture);

    or simply add
    Code (CSharp):
    #include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/DeclareDepthTexture.hlsl"
    in your shader.
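    To then turn the raw sample into a linear value, a sketch of a blit fragment using URP's helpers (pass framing omitted; SampleSceneDepth comes from DeclareDepthTexture.hlsl, Linear01Depth and _ZBufferParams from the core shader library, and Varyings/Vert assume the Blit.hlsl setup from the earlier examples):

    ```hlsl
    #include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl"
    #include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/DeclareDepthTexture.hlsl"
    #include "Packages/com.unity.render-pipelines.core/Runtime/Utilities/Blit.hlsl"

    #pragma vertex Vert
    #pragma fragment frag

    half4 frag(Varyings input) : SV_Target
    {
        UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX(input);

        // Raw device depth from _CameraDepthTexture.
        float rawDepth = SampleSceneDepth(input.texcoord);

        // Remap to a 0..1 linear range using the camera's projection params.
        float linearDepth = Linear01Depth(rawDepth, _ZBufferParams);
        return half4(linearDepth.xxx, 1);
    }
    ```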