Feedback How to Blit in URP 12 - Documentation Needed

Discussion in 'Universal Render Pipeline' started by daneobyrd, Dec 13, 2021.

  1. daneobyrd

    daneobyrd

    Joined:
    Mar 29, 2018
    Posts:
    101
    Please see: URP 14 Blitter Overview


    Edit #2: A lot has changed since I posted this. See this post and others for more recent details about changes for Blitting, the SRP Blitter Class, and other miscellaneous details about RTHandles.


    Edit #1: Jump to @ManueleB's message for a summary of URP blitting as of Unity 2021/URP 12.



    I don't quite understand why there is so little communication or documentation about blitting in SRP.

    Recently there was a new section added to the documentation -- "How To" -- which includes a blit tutorial with example code: How to perform a full screen blit in Single Pass Instanced rendering in XR

    Funnily enough, the example code doesn't use any of Unity's many different included blit functions.

    I recently talked about this on another forum post.

    I know there are currently issues with cmd.Blit() and XR but that is no reason for there to be so few examples of using blit outside of internal SRP code and the latest Photomode package.

    In the Photomode package the ScriptableRenderPass BlitRenderPass uses Blit(CommandBuffer, source, destination, Material, pass index);.

    However, the destination texture ("_AfterPostProcessTexture") is an existing texture from the SRP pipeline, not a user-created texture. Anecdotally, I consistently see posts about custom shaders/materials failing with blit, resulting in gray or black textures being blit to the screen.

    There are very few examples where the blit destination is a user-created texture.

    One example of this can be found in this HDRP post. In that specific case, the solution was to use the XR macro SAMPLE_TEXTURE2D_X since HDRP uses XR macros for render targets to support stereo rendering.*

    *Those macros are defined differently (in Core.hlsl) based on the texture dimension set when the RTHandle system is initialized.
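    To make the pattern from the Photomode example concrete (a custom-material blit whose destination is a user-created texture), here is a minimal, unofficial sketch inside a ScriptableRenderPass, assuming URP 12-era APIs; "_CustomBlitResult" and m_Material are illustrative names, not engine ones:

    Code (CSharp):
    // Sketch only (URP 12-era APIs), not Unity sample code: blit the camera color into a
    // user-created temporary texture with a custom material. "_CustomBlitResult" and
    // m_Material are illustrative names.
    static readonly int s_CustomBlitResult = Shader.PropertyToID("_CustomBlitResult");

    public override void Execute(ScriptableRenderContext context, ref RenderingData renderingData)
    {
        CommandBuffer cmd = CommandBufferPool.Get("CustomBlit");

        RenderTextureDescriptor desc = renderingData.cameraData.cameraTargetDescriptor;
        desc.depthBufferBits = 0; // color-only destination
        cmd.GetTemporaryRT(s_CustomBlitResult, desc, FilterMode.Bilinear);

        // ScriptableRenderPass.Blit(cmd, source, destination, material, passIndex)
        Blit(cmd, renderingData.cameraData.renderer.cameraColorTarget,
             new RenderTargetIdentifier(s_CustomBlitResult), m_Material, 0);

        // Release once later passes no longer need the result.
        cmd.ReleaseTemporaryRT(s_CustomBlitResult);

        context.ExecuteCommandBuffer(cmd);
        CommandBufferPool.Release(cmd);
    }

    (Later replies in this thread recommend moving away from cmd.Blit-based helpers such as ScriptableRenderPass.Blit; the sketch is only meant to show the pattern being discussed.)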
     
    Last edited: Sep 20, 2023
    ModLunar, thelebaron, atr0phy and 4 others like this.
  2. rz_0lento

    rz_0lento

    Joined:
    Oct 8, 2013
    Posts:
    2,361
    I made an example about this a while ago. I put the old-style cmd.Blit approach there first, and you can then follow the commits to see what kinds of changes are required or possible for upgrading it to the current setup: https://github.com/0lento/URP_RendererFeature_SPI_Example/commits/main

    If you don't want to use the opaque texture, just ignore the last commit there. Also worth noting that 2022.1+ URP will have an HDRP-style RT handling scheme, so things will change slightly there (it doesn't require huge changes to a simplified sample like this).
     
    Last edited: Dec 13, 2021
    daneobyrd likes this.
  3. burningmime

    burningmime

    Joined:
    Jan 25, 2014
    Posts:
    845
    I agree we need some official blog post or something. I've seen a lot of Unity internal code being inconsistent and doing stuff like this:

    Code (CSharp):
    1. Camera camera = rd.cameraData.camera;
    2. cmd.SetViewProjectionMatrices(Matrix4x4.identity, Matrix4x4.identity);
    3. cmd.DrawMesh(RenderingUtils.fullscreenMesh, Matrix4x4.identity, material, 0, pass);
    4. cmd.SetViewProjectionMatrices(camera.worldToCameraMatrix, camera.projectionMatrix);
    This is blit-like, but it won't set all the same states, so it might work differently with MSAA on vs. off (easy to fix in the shader, but a real WTF moment if it comes up two months after you wrote the original code).

    (EDIT: that's what the page linked by OP suggests -- drawing the mesh directly instead of using Blit)
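    For comparison, a hedged sketch of the same draw with the render target and source texture bound explicitly, which is roughly what a hand-rolled blit has to do; source/destination are assumed RenderTargetIdentifiers and "_SourceTex" is just an illustrative property name (matching the TEXTURE2D_X(_SourceTex) convention seen in some URP post-processing shaders):

    Code (CSharp):
    // Sketch only: fullscreen-mesh "blit" with explicit state. source/destination are assumed
    // RenderTargetIdentifiers supplied by the pass; material samples "_SourceTex" in its shader.
    Camera camera = rd.cameraData.camera;
    cmd.SetGlobalTexture("_SourceTex", source);
    cmd.SetRenderTarget(destination, RenderBufferLoadAction.DontCare, RenderBufferStoreAction.Store);
    cmd.SetViewProjectionMatrices(Matrix4x4.identity, Matrix4x4.identity);
    cmd.DrawMesh(RenderingUtils.fullscreenMesh, Matrix4x4.identity, material, 0, pass);
    cmd.SetViewProjectionMatrices(camera.worldToCameraMatrix, camera.projectionMatrix);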
     
    Last edited: Dec 13, 2021
    daneobyrd likes this.
  4. daneobyrd

    daneobyrd

    Joined:
    Mar 29, 2018
    Posts:
    101

    I forgot to mention how many different blit shaders and HLSL files are used as well. It makes sense to have so many blit options considering how versatile and ubiquitous blitting is throughout the render pipeline. However, that doesn't make it any less confusing. Many of the internal passes have their own ways of blitting, including drawing the mesh directly.

    Here is an assortment of many internal files I could remember and those that appear when you search for blit in the Graphics repository. All files listed are hyperlinks.
    Core RP:
    /Runtime/Utilities/
    Universal RP:
    /Shaders
    /Utils
    /PostProcessing
    • Many files in this folder contain TEXTURE2D_X(_SourceTex), which appears in other URP blit implementations.
    /Runtime
    High Definition RP:
    /Runtime
    /Compositor/Shaders
    /Core/CoreResources
    /Lighting/Shadow/
    /Debug
    /RenderPipeline/Utility/
    /ShaderLibrary
    Post Processing:
    /PostProcessing
    /Shaders/Builtins/
    /Runtime/Effects
    Bloom.cs - uses cmd.BlitFullScreenTriangle();
    DepthOfField.cs - uses cmd.BlitFullScreenTriangle();​

    and many more.

    (As of December 14, 2021)

    April 13, 2023 Edit:
    Fixed URLs to redirect to correct directories in Unity-Technologies/Graphics.
    Since posting this, Unity reorganized the Graphics repo and added a new `Packages/` subfolder.
     
    Last edited: Sep 20, 2023
    tmonestudio, DrViJ and Euri like this.
  5. ManueleB

    ManueleB

    Unity Technologies

    Joined:
    Jul 6, 2020
    Posts:
    110
    For historical reasons URP doesn't currently have a proper standard in terms of "how to Blit".
    I agree that this is very confusing, and it is something we are trying to document better as we speak.
    This is an important ongoing discussion at the moment, since reviewing the state of our blits is also an important preparation step for some core changes in URP: better NativeRenderPass and RenderGraph support. Following these best practices early on existing projects should also make your life easier when upgrading to the next URP releases.

    I'll try to do a quick summary here, but you should expect documentation coming soon:

    1) cmd.Blit() should not be used. The main reason is that it is a bit of a "black box": it contains a lot of built-in logic for changing states, binding textures and setting render targets. All of this happens under the hood, not transparently from an SRP point of view, which can cause issues. Other big issues with cmd.Blit(): it "breaks" NativeRenderPass and RenderGraph compatibility, so any pass using cmd.Blit will not be able to take advantage of these, and it also doesn't work well in XR. Its usage might also be deprecated in future URP versions.

    2) The same obviously applies to any utilities/wrappers relying on cmd.Blit() internally, so, for example, RenderingUtils.Blit should be avoided as well.

    3) The current How to perform a full screen blit in Single Pass Instanced rendering in XR is a good example to follow. Under the hood cmd.Blit() does pretty much the same thing, except in this case everything is handled at the SRP level, which is the way to go. I think the fact that the page mentions XR is a bit confusing, since this is a perfectly valid way of doing a blit on all platforms, so I am looking at updating that page. It is also in need of some sample code changes*, since in its current state it no longer works on 22.1 because of the recently introduced RTHandles support.

    4) The SRP Blit API "to use" is the Core Blitter: it is already used by other pipelines, and refactoring our existing passes so that they use it instead of cmd.Blit is on our short-term roadmap. This might also include modifying and improving the Blitter API to accommodate any URP requirements, so expect some changes in this area soon.
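    To give point 4 some shape, here is a hedged sketch (not an official sample) of a Blitter-based pass body, assuming 2022.1+ RTHandles, an m_Source/m_Destination pair set up by the renderer feature, and a material whose shader includes Blit.hlsl:

    Code (CSharp):
    // Sketch only: Blitter-based blit inside ScriptableRenderPass.Execute (2022.1+ / RTHandles).
    var cmd = CommandBufferPool.Get("BlitterExample");

    // Blit.hlsl-based shaders read the source through _BlitTexture/_BlitScaleBias,
    // which BlitCameraTexture binds for you.
    Blitter.BlitCameraTexture(cmd, m_Source, m_Destination, m_Material, 0);

    context.ExecuteCommandBuffer(cmd);
    CommandBufferPool.Release(cmd);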


    *a quick preview of the sample code changes needed for RTHandles support:

    Replace the content of AddRenderPasses and override the new SetupRenderPasses callback (more info in the RTHandles upgrade guide)


    Code (CSharp):
    1. public override void AddRenderPasses(ScriptableRenderer renderer, ref RenderingData renderingData)
    2.         {
    3.             if (renderingData.cameraData.cameraType == CameraType.Game)
    4.                 renderer.EnqueuePass(m_RenderPass);
    5.         }
    6.    
    7.         public override void SetupRenderPasses(ScriptableRenderer renderer, in RenderingData renderingData)
    8.         {
    9.             if (renderingData.cameraData.cameraType == CameraType.Game)
    10.             {
    11.                 //Calling ConfigureInput with the ScriptableRenderPassInput.Color argument ensures that the opaque texture is available to the Render Pass
    12.                 m_RenderPass.ConfigureInput(ScriptableRenderPassInput.Color);
    13.                 m_RenderPass.SetTarget(renderer.cameraColorTarget, m_Intensity);
    14.             }
    15.         }
     
    Last edited: Dec 14, 2021
  6. daneobyrd

    daneobyrd

    Joined:
    Mar 29, 2018
    Posts:
    101
    Thank you so much for the transparency and all the information! It’s exciting to see URP going through such big changes in the last two major releases.

    I look forward to seeing Core Blitter integrated with URP passes. Expanding support for NativeRenderPass and RenderGraph, as well as the shift to RTHandles, seems like it might make writing our own passes much easier.
     
    Last edited: Mar 27, 2023
  7. rz_0lento

    rz_0lento

    Joined:
    Oct 8, 2013
    Posts:
    2,361
    I'm not convinced this will make actual usage simpler. Unity could provide more helper functions to reduce the current boilerplate, sure, but the changes so far haven't reduced much of our own code complexity; it's just structured in a slightly different way.

    IMHO the biggest thing that would take the pain away for engine users would be to give more varied examples of how one can use this. That XR doc page is a good start, but it doesn't explain everything. It would be nice to have a set of renderer feature samples as part of the URP examples that specifically handle more intermediate processing, which you typically see in more advanced effects.

    I'm also personally hoping that the RenderGraph change will happen sooner rather than later so we can finally get closer to a more mature code base with URP.
     
    DrViJ, revolute and daneobyrd like this.
  8. rz_0lento

    rz_0lento

    Joined:
    Oct 8, 2013
    Posts:
    2,361
    I was looking at this page in the PR for the doc update:
    https://github.com/Unity-Technologi...ion~/renderer-features/blit-best-practices.md
    I'm actually now more confused than enlightened. According to your comment here and what was written on that PR's page, the SRP Blitter API would be the way to go, yet the only practical example is for cmd.DrawMesh.

    Why even have a cmd.DrawMesh example if we are not supposed to use it? Why not have an example for the recommended route?
     
  9. daneobyrd

    daneobyrd

    Joined:
    Mar 29, 2018
    Posts:
    101
    Where is it recommended not to use cmd.DrawMesh? I thought the recommendation was not to use cmd.Blit() or any functions that wrap or rely on cmd.Blit().
     
    DrViJ likes this.
  10. rz_0lento

    rz_0lento

    Joined:
    Oct 8, 2013
    Posts:
    2,361
    They instead recommended SRP Blitter:
     
  11. rz_0lento

    rz_0lento

    Joined:
    Oct 8, 2013
    Posts:
    2,361
    I just realized that the SRP Blitter uses RTHandles, so it wouldn't even have worked prior to 2022.1's URP, so I guess this all makes more sense. My assumption is that once this version matures further, the SRP Blitter will get adopted more in URP (and I do get that the mentioned doc page is not something that's actually in use yet).

    I would expect the blitting example to use the Blitter API if it's being recommended, but I get that these things don't happen overnight :)
     
  12. ManueleB

    ManueleB

    Unity Technologies

    Joined:
    Jul 6, 2020
    Posts:
    110
    On top of the RTHandles availability issue, which makes the SRP Blitter not usable on versions < 22.1, there are still scenarios where you might be better off writing your "custom blit" using DrawMesh, like the how-to page shows.

    The SRP Blitter will be improved in the future to facilitate URP adoption, but for now, for example, you are unable to do things like explicitly setting Load/Store actions if you are optimizing for mobile.
    With DrawMesh you can also easily write generic full screen quad renderers, which don't necessarily need to blit a source texture to a destination one: as an example, a very common current pattern for post-processing effects doing similar work (e.g. ColorGradingLUT) is to call cmd.Blit(null, RT). This is not the best use of blit and works much better as a simple draw to a full screen quad mesh.
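    As a rough illustration of that last point (a sketch only; rt and material are placeholders, and the view/projection handling mirrors the snippet earlier in this thread): a sourceless "blit" into a render target is just a render-target switch plus a fullscreen draw:

    Code (CSharp):
    // Sketch only: replacing cmd.Blit(null, rt, material) with an explicit fullscreen draw.
    cmd.SetRenderTarget(rt, RenderBufferLoadAction.DontCare, RenderBufferStoreAction.Store);
    cmd.SetViewProjectionMatrices(Matrix4x4.identity, Matrix4x4.identity);
    cmd.DrawMesh(RenderingUtils.fullscreenMesh, Matrix4x4.identity, material, 0, 0);
    // Restore the camera matrices afterwards if later commands rely on them.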

    Another example of a useful custom blit implementation is something like the URP CopyDepth pass, where you need to add some extra MSAA resolve logic so your shader can use Tex2DMS samples depending on whether the source texture is MSAA'd or not.

    The SRP Blitter will be extended and improved to cover more use cases over time, but DrawMesh will always give you full flexibility and customization if needed.

    TL;DR: Use SRP Blitter if what you need to do is available and implemented in that API, use the DrawMesh approach for any other cases, avoid cmd.Blit() :)
     
    Last edited: Dec 16, 2021
    Euri, daneobyrd, burningmime and 2 others like this.
  13. ManueleB

    ManueleB

    Unity Technologies

    Joined:
    Jul 6, 2020
    Posts:
    110
    forgot to add: we plan to add more samples using Blitter once we start converting all URP passes to use it
     
    DrViJ, Meatloaf4, daneobyrd and 2 others like this.
  14. Steamc0re

    Steamc0re

    Joined:
    Nov 24, 2014
    Posts:
    144
    My game absolutely relies on Graphics.Blit, which works in 2021.1.28f1 and URP 11.0.0.
    I am blitting a Material to a Texture (not in a pass!!! Just to bake a VERY heavy dynamic material to a texture ONCE and then GetPixels() to an array); that's it.

    Graphics.Blit(null, renderTexture, material); no longer works.

    What do I use now?
     
    Reahreic likes this.
  15. ekakiya

    ekakiya

    Joined:
    Jul 25, 2011
    Posts:
    79
    I'm using DrawProcedural for blit, with GetFullScreenTriangleTexCoord and GetFullScreenTriangleVertexPosition.
    If you blit with DrawProcedural, you must cancel the flip indicated by UNITY_UV_STARTS_AT_TOP, except when you blit from RT to RT, but that's it.
    The flip for a blit with DrawMesh looks more complicated; you must consider the current camera matrix.
     
    Last edited: Dec 24, 2021
  16. Steamc0re

    Steamc0re

    Joined:
    Nov 24, 2014
    Posts:
    144
    I know it's a lot to ask but do you have a code snippet?
     
  17. ekakiya

    ekakiya

    Joined:
    Jul 25, 2011
    Posts:
    79
    Code (CSharp):
    1. //Script
    2. _BlitMaterial.SetTexture(Shader.PropertyToID("_BlitSrcTex"), _SrcTexture);
    3. _CmdBuffer.SetRenderTarget(new RenderTargetIdentifier(_DstTexture), RenderBufferLoadAction.DontCare, RenderBufferStoreAction.Store);
    4. _CmdBuffer.DrawProcedural(Matrix4x4.identity, _BlitMaterial, 0, MeshTopology.Triangles, 3, 1, null);
    5.  
    6. //VertexShader
    7. output.positionCS = GetFullScreenTriangleVertexPosition(input.vertexID);
    8. float2 uv = GetFullScreenTriangleTexCoord(input.vertexID);
    9. #if defined _CANCEL_FLIP
    10. uv.y = 1.0 - uv.y;
    11. #endif
    12. output.xyAndUv = uv.xyxy * float4(DST_TEXTURE_SIZE, 1.0, 1.0);
    13.  
    14. //FragmentShader
    15. float4 color = LOAD_TEXTURE2D(_BlitSrcTex, input.xyAndUv.xy);
    16. // ..and/or..
    17. float4 color = SAMPLE_TEXTURE2D_LOD(_BlitSrcTex, s_linear_clamp_sampler, input.xyAndUv.zw, 0.0);
    And I set the _CANCEL_FLIP flag manually from script if the source texture is a regular texture (not a RenderTexture) or the destination is the backbuffer, and we're running on a non-OpenGL API.
    Sample works just fine, but Load is useful in some cases, like resolving an MSAA buffer.
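    For completeness, a rough sketch of how that flag could be toggled from script, assuming the shader declares the keyword (e.g. with #pragma multi_compile_local _ _CANCEL_FLIP); the variable names follow the snippet above:

    Code (CSharp):
    // Sketch only: enable _CANCEL_FLIP under the conditions described above.
    bool sourceIsPlainTexture    = !(_SrcTexture is RenderTexture);
    bool destinationIsBackBuffer = _DstTexture == null;             // illustrative convention for this sketch
    bool uvStartsAtTop           = SystemInfo.graphicsUVStartsAtTop; // false on OpenGL-style APIs

    if ((sourceIsPlainTexture || destinationIsBackBuffer) && uvStartsAtTop)
        _BlitMaterial.EnableKeyword("_CANCEL_FLIP");
    else
        _BlitMaterial.DisableKeyword("_CANCEL_FLIP");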

    URP has good samples of using a DrawProcedural quad for blit: Script side, Shader side.
    HDRP has good samples of using a DrawProcedural triangle for blit: Script side, Shader side.
     
    Last edited: May 17, 2022
    daneobyrd and Steamc0re like this.
  18. Steamc0re

    Steamc0re

    Joined:
    Nov 24, 2014
    Posts:
    144
    I'm afraid this is out of my depth. I am using Shader Graph to perform millions of distance calculations to fill a voxel array. It used to be that I just did Graphics.Blit(null, renderTexture, material); and then ReadPixels(); then GetPixels(), which fills the array. Doing these millions of calculations linearly on the CPU to fill the array instead (with all of the other calcs that go into it; this is simplified) takes 8-20 seconds depending on the volume. A 3-million-voxel volume on the GPU takes 0.8s.

    I do not have a source texture; I just want to copy the current state of the procedural material to a Texture2D. It was as easy as Blit(null, renderTexture, material), done. Now I have no idea what to do with the above code, as it won't do anything without a source texture. Also, I'm using Shader Graph and the output shader code is thousands of lines long (as I said, the above explanation is simplified; there's a LOT that goes into determining the values of the distance calculation).
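    (For reference, the old 2021.1 / URP 11 workflow described above looked roughly like this; width, height and the texture formats are illustrative:)

    Code (CSharp):
    // Sketch of the one-off bake: render the procedural material once into a RenderTexture,
    // read it back to the CPU, and fill the voxel array from the pixels.
    var rt = new RenderTexture(width, height, 0, RenderTextureFormat.ARGBFloat);
    Graphics.Blit(null, rt, material);                  // "bake" the material output into rt

    var readback = new Texture2D(width, height, TextureFormat.RGBAFloat, false);
    RenderTexture.active = rt;
    readback.ReadPixels(new Rect(0, 0, width, height), 0, 0);
    readback.Apply();
    RenderTexture.active = null;

    Color[] voxels = readback.GetPixels();              // fill the array from the baked texture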
     
  19. DrViJ

    DrViJ

    Joined:
    Feb 9, 2013
    Posts:
    163
    The link to the best practices for blit is missing; do you know on which branch/commit it can be found? I am trying to find out how to correctly use the Blitter API and don't understand it yet. I corrected the example to work with DrawMesh and sent it to the Unity team via the feedback form, but I can't see any example of Blitter usage.

    There is just an outdated example with XR and a fullscreen-rect cmd.DrawMesh in the official documentation:
    https://docs.unity3d.com/Packages/c...ines.universal@13.1/manual/how-to.html?q=blit
     
    Last edited: Mar 29, 2022
    ChristopherKerr likes this.
  20. DrViJ

    DrViJ

    Joined:
    Feb 9, 2013
    Posts:
    163
    @ManueleB Sorry for bumping; could you please point me in the right direction for finding examples of Blitter API usage? Does Unity have a public repository with examples, or maybe a special branch inside the Graphics repo?
     
  21. rz_0lento

    rz_0lento

    Joined:
    Oct 8, 2013
    Posts:
    2,361
    That git branch was just a WIP branch for merging the doc page you linked here.
     
    DrViJ likes this.
  22. ManueleB

    ManueleB

    Unity Technologies

    Joined:
    Jul 6, 2020
    Posts:
    110
    tmonestudio, daneobyrd, DrViJ and 3 others like this.
  23. DrViJ

    DrViJ

    Joined:
    Feb 9, 2013
    Posts:
    163
    Thank you so much for the links! I've read the pull requests and tried to implement a simple blit like in the provided tutorial, but using Blitter instead of cmd.DrawMesh(). However, I am getting strange behaviour: I get glitches when the screen size is decreased, but everything works fine when I increase the size. The behaviour is the same in Editor/Play modes.
    Glitchy behaviour:


    RTHandle reference sizes and scaled sizes look correct. What could be the reason for this behaviour? Am I doing something wrong in the shader?

    The code:

    ColorBlitPass.cs:
    Code (CSharp):
    1. using UnityEngine;
    2. using UnityEngine.Rendering;
    3. using UnityEngine.Rendering.Universal;
    4.  
    5. internal class ColorBlitPass : ScriptableRenderPass
    6. {
    7.     private Material m_Material;
    8.     private RTHandle m_CameraColorTarget;
    9.     private float m_Intensity;
    10.  
    11.     public ColorBlitPass(Material material)
    12.     {
    13.         m_Material = material;
    14.         UpdateIntensity();
    15.         renderPassEvent = RenderPassEvent.BeforeRenderingPostProcessing;
    16.     }
    17.  
    18.     public void SetIntensity(float intensity)
    19.     {
    20.         m_Intensity = intensity;
    21.         UpdateIntensity();
    22.     }
    23.  
    24.     public void SetTarget(RTHandle colorHandle)
    25.     {
    26.         m_CameraColorTarget = colorHandle;
    27.     }
    28.  
    29.     public override void OnCameraSetup(CommandBuffer cmd, ref RenderingData renderingData)
    30.     {
    31.         //Nothing here yet.
    32.     }
    33.  
    34.     public override void Execute(ScriptableRenderContext context, ref RenderingData renderingData)
    35.     {
    36.         Debug.Log("Ref size " + m_CameraColorTarget.referenceSize);
    37.         Debug.Log("Scaled size " + m_CameraColorTarget.GetScaledSize(m_CameraColorTarget.referenceSize));
    38.         Debug.Log("---");
    39.         var camera = renderingData.cameraData.camera;
    40.         if (camera.cameraType != CameraType.Game || m_Material == null)
    41.             return;
    42.  
    43.         var cmd = CommandBufferPool.Get();
    44.         Blitter.BlitCameraTexture(cmd, m_CameraColorTarget, m_CameraColorTarget, m_Material, 0);
    45.         context.ExecuteCommandBuffer(cmd);
    46.         cmd.Clear();
    47.         CommandBufferPool.Release(cmd);
    48.     }
    49.  
    50.     private void UpdateIntensity()
    51.     {
    52.         if (m_Material != null)
    53.             m_Material.SetFloat("_Intensity", m_Intensity);
    54.     }
    55. }
    ColorBlitRendererFeature.cs:
    Code (CSharp):
    1. using UnityEngine;
    2. using UnityEngine.Rendering;
    3. using UnityEngine.Rendering.Universal;
    4.  
    5. internal class ColorBlitRendererFeature : ScriptableRendererFeature
    6. {
    7.     public Shader m_Shader;
    8.     public float m_Intensity;
    9.  
    10.     Material m_Material;
    11.  
    12.     ColorBlitPass m_RenderPass = null;
    13.  
    14.     public override void AddRenderPasses(ScriptableRenderer renderer, ref RenderingData renderingData)
    15.     {
    16.         if (renderingData.cameraData.cameraType == CameraType.Game)
    17.         {
    18.             //Calling ConfigureInput with the ScriptableRenderPassInput.Color argument ensures that the opaque texture is available to the Render Pass
    19.             m_RenderPass.ConfigureInput(ScriptableRenderPassInput.Color);
    20.             m_RenderPass.SetIntensity(m_Intensity);
    21.             renderer.EnqueuePass(m_RenderPass);
    22.         }
    23.     }
    24.  
    25.     public override void SetupRenderPasses(ScriptableRenderer renderer, in RenderingData renderingData)
    26.     {
    27.         base.SetupRenderPasses(renderer, renderingData);
    28.         m_RenderPass.SetTarget(renderer.cameraColorTargetHandle);
    29.     }
    30.  
    31.     public override void Create()
    32.     {
    33.         if (m_Shader != null)
    34.             m_Material = new Material(m_Shader);
    35.  
    36.         m_RenderPass = new ColorBlitPass(m_Material);
    37.     }
    38.  
    39.     protected override void Dispose(bool disposing)
    40.     {
    41.         CoreUtils.Destroy(m_Material);
    42.     }
    43. }
    ColorBlit.shader:
    Code (CSharp):
    1. Shader "ColorBlit"
    2. {
    3.         SubShader
    4.     {
    5.         Tags { "RenderType"="Opaque" "RenderPipeline" = "UniversalPipeline"}
    6.         LOD 100
    7.         ZWrite Off Cull Off
    8.         Pass
    9.         {
    10.             Name "ColorBlitPass"
    11.  
    12.             HLSLPROGRAM
    13.             #pragma vertex Vert
    14.             #pragma fragment Frag
    15.             #include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl"
    16.             #include "Packages/com.unity.render-pipelines.core/Runtime/Utilities/Blit.hlsl"
    17.  
    18.             SAMPLER(sampler_BlitTexture);
    19.             uniform float _Intensity;
    20.  
    21.             half4 Frag (Varyings input) : SV_Target
    22.             {
    23.                 UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX(input);
    24.                 float4 color = SAMPLE_TEXTURE2D_X(_BlitTexture, sampler_BlitTexture, input.texcoord);
    25.                 return color * float4(0, _Intensity, 0, 1);
    26.             }
    27.             ENDHLSL
    28.         }
    29.     }
    30. }
     
    ChristopherKerr and daneobyrd like this.
  24. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,464
    CommandBuffer.Blit works in the standard pipeline with VR/XR. It is the best developer experience by far. The point of having Unity, which is a great engine overall, is to make the developer experience easier. I realize it's a black box and does a lot of things under the hood; I leave that to Unity's talented engineering team to maintain and keep working properly regardless of rendering pipeline. That way we, as Unity users, can focus on our games/apps rather than spending (at least for me) quite literally weeks trying to blit in VR with URP.
     
    ModLunar and Reahreic like this.
  25. daneobyrd

    daneobyrd

    Joined:
    Mar 29, 2018
    Posts:
    101
    Luckily all the toiling we have done (across many threads) won't be for nothing. Soon all will be well and blitting will be simpler.
     
    Last edited: Jun 7, 2022
    ChristopherKerr and DrViJ like this.
  26. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,464
    I appreciate you, man; you've at least got things documented today on how to work around it and get it functional. I do worry about future Unity versions breaking things, hence why I would love for cmd.Blit to just work :)
     
    Reahreic and daneobyrd like this.
  27. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,464
    Sounds like Blitter might be the solution, at least for future Unity versions...?
     
  28. DrViJ

    DrViJ

    Joined:
    Feb 9, 2013
    Posts:
    163
    Last edited: Apr 2, 2022
    tmonestudio and ChristopherKerr like this.
  29. daneobyrd

    daneobyrd

    Joined:
    Mar 29, 2018
    Posts:
    101
    See this reply further up in the thread for more on Blitter availability.

     
    DrViJ likes this.
  30. DrViJ

    DrViJ

    Joined:
    Feb 9, 2013
    Posts:
    163
    Oh, exactly, sorry, I forgot about it :)
     
  31. DrViJ

    DrViJ

    Joined:
    Feb 9, 2013
    Posts:
    163
    The glitch appears when the _BlitScaleBias x and y parameters are less than 1.

    So this code works fine:
    Code (CSharp):
    1. var cmd = CommandBufferPool.Get();
    2. Blitter.BlitTexture(cmd, m_CameraColorTarget, new Vector4(1,1,0,0), m_Material, 0);
    3. context.ExecuteCommandBuffer(cmd);
    4. cmd.Clear();
    5. CommandBufferPool.Release(cmd);
    And this one glitches:
    Code (CSharp):
    1. var cmd = CommandBufferPool.Get();
    2. Blitter.BlitCameraTexture(cmd, m_CameraColorTarget, m_CameraColorTarget, m_Material, 0);
    3. context.ExecuteCommandBuffer(cmd);
    4. cmd.Clear();
    5. CommandBufferPool.Release(cmd);
    By the way, I see changes in BlitCameraTexture; it looks like it was a bug that is going to be fixed:
    https://github.com/Unity-Technologies/Graphics/pull/7115/files

    From this:
    Code (CSharp):
    1. Vector2 viewportScale = new Vector2(source.rtHandleProperties.rtHandleScale.x, source.rtHandleProperties.rtHandleScale.y);
    To this:
    Code (CSharp):
    1. Vector2 viewportScale = source.useScaling ? new Vector2(source.rtHandleProperties.rtHandleScale.x, source.rtHandleProperties.rtHandleScale.y) : Vector2.one;
     
    Last edited: Apr 3, 2022
  32. linnspitz

    linnspitz

    Joined:
    Apr 3, 2019
    Posts:
    5
    Trying to implement this. I can only get the shader to affect the skybox, which is not what I need. I saw that in the documentation one screenshot also shows only the sky being affected, and in the next one it's an overlay over everything, without an explanation. What am I missing? Thanks! Screenshot 2022-06-22 175531.png
     
  33. echu33

    echu33

    Joined:
    Oct 30, 2020
    Posts:
    62
    upload_2022-8-17_22-16-22.png
    You need to enable the Opaque Texture toggle inside the renderer asset.
     
  34. daneobyrd

    daneobyrd

    Joined:
    Mar 29, 2018
    Posts:
    101
    You can also use ConfigureInput(ScriptableRenderPassInput.Color); to make the opaque texture available in your scriptable render pass (regardless of the renderer asset settings).
     
  35. jordansean10

    jordansean10

    Joined:
    Aug 28, 2020
    Posts:
    5
    I had this same issue; checking Opaque Texture on the renderer asset as suggested, along with checking Native RenderPass on the renderer data object, worked for me.
     

    Attached Files:

    • blit.png
  36. najati

    najati

    Joined:
    Oct 23, 2017
    Posts:
    45
    This seems like an excellent, succinct example of how to use RTHandles + Blitter to get a full screen effect going, but I can't seem to get this to work in 22.1 with the core/universal render pipelines at 13.1.8, even after including the fix from a few comments further down.

    The effect is all black regardless of the intensity value. If I change the fragment shader to return just float4(0, _Intensity, 0, 1), then I see the shade of green with no background, as expected.
    The linked video leads me to believe this should be working in 22.1 and 13.1.8 - am I mistaken?

    Thanks!

     
    ladismad likes this.
  37. 8bitgoose

    8bitgoose

    Joined:
    Dec 28, 2014
    Posts:
    448
    I sort of got this working; however, I am no longer drawing the transparent parts of the scene. How does one make sure to include transparencies when doing a full screen overlay?
     
  38. thelebaron

    thelebaron

    Joined:
    Jun 2, 2013
    Posts:
    857
    I too am having difficulty rendering transparent or cutout materials with the new blitting system. Quite honestly, I wish those examples from the UniversalRenderingExamples repo (blit material, Sobel, etc.) were either kept up to date or included as standard. Why is Render Objects still experimental after all this time?
     
    ladismad likes this.
  39. bluescrn

    bluescrn

    Joined:
    Feb 25, 2013
    Posts:
    642
    The new Blitter class is likely to catch a lot of people out while trying to update RendererFeatures etc, as it's completely incompatible with old blitting shaders.
     
    ElliotB likes this.
  40. 8bitgoose

    8bitgoose

    Joined:
    Dec 28, 2014
    Posts:
    448
    Is this the new blitter class in URP 15 or something else?
     
  41. thelebaron

    thelebaron

    Joined:
    Jun 2, 2013
    Posts:
    857
    Does the new Blitter class not handle transparency?

    Copy-pasting https://docs.unity3d.com/Packages/c...renderer-features/how-to-fullscreen-blit.html line for line doesn't render anything transparent, so I'm not sure if there's something more I am missing or if the new Blitter is just feature-incomplete.
     
  42. ElliotB

    ElliotB

    Joined:
    Aug 11, 2013
    Posts:
    289
    This is definitely the case for me; it's a lot of work to update, so I wanted to add my voice to this. I do respect the idea behind the Blit API, though, and can see it will be good once ironed out.

    The current Blit.hlsl Varyings are built by functions which explicitly incorporate _BlitScaleBias for compatibility with the RTHandle API. However, Unity's own handling of RTHandle scaling is not always 'pixel perfect', which makes it an absolute nightmare to do perfect dither effects that need to account for RT scaling (see e.g.
    https://forum.unity.com/threads/_sc...render-target-subregion.1336277/#post-8442380 ). I haven't finished porting my shaders yet, but I'm fretting right now that even if I do, they aren't going to work because the Blit API Varyings don't provide access to lower-level stuff like the raw position without _BlitScaleBias.
     
  43. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,464
    A solution that does not require shader updates would be nice. Honestly, if command buffer blit just worked and existing shaders worked, it would be the best solution. The black box argument just doesn't sell it for me…
     
  44. BruceKristelijn

    BruceKristelijn

    Joined:
    Apr 28, 2017
    Posts:
    108
    I have the same issue but am unsure how to resolve this, did you manage to resolve this somehow?
     
    Lechuza likes this.
  45. wwWwwwW1

    wwWwwwW1

    Joined:
    Oct 31, 2021
    Posts:
    769
    Hi, you should copy the scene color right before your PP effect and replace "_CameraOpaqueTexture" with it.

    For example, for a color inversion effect (a rough sketch follows this list):
    • Create an RTHandle named "sceneColor" to store the scene color copy.
    • Allocate the RTHandle with RenderingUtils.ReAllocateIfNeeded().
    • Use Blitter.BlitCameraTexture() or cmd.Blit() to copy "renderingData.cameraData.renderer.cameraColorTargetHandle" to "sceneColor".
    • Use yourMaterial.SetTexture() to pass "sceneColor" to the PP material, or use cmd.SetGlobalTexture() to set it as a global texture, just like the opaque texture.
    • In the PP shader, replace the opaque texture with the sceneColor texture.
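    A rough sketch of those steps inside a ScriptableRenderPass (URP 14-era APIs; "sceneColor" and "_SceneColorCopy" are illustrative names, not engine ones):

    Code (CSharp):
    private RTHandle sceneColor;

    public override void OnCameraSetup(CommandBuffer cmd, ref RenderingData renderingData)
    {
        var desc = renderingData.cameraData.cameraTargetDescriptor;
        desc.depthBufferBits = 0; // color copy only
        RenderingUtils.ReAllocateIfNeeded(ref sceneColor, desc, FilterMode.Bilinear, name: "_SceneColorCopy");
    }

    public override void Execute(ScriptableRenderContext context, ref RenderingData renderingData)
    {
        var cmd = CommandBufferPool.Get("CopySceneColor");
        var cameraColor = renderingData.cameraData.renderer.cameraColorTargetHandle;

        // Copy the current camera color so the PP material can read it while writing to the camera target.
        Blitter.BlitCameraTexture(cmd, cameraColor, sceneColor);
        cmd.SetGlobalTexture("_SceneColorCopy", sceneColor);

        context.ExecuteCommandBuffer(cmd);
        CommandBufferPool.Release(cmd);
    }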
    If your project is using URP 14 or higher, the new full screen renderer feature is a good option for single pass PP effects.

    Add "Color" to the Requirements.
    FullScreenPass_URP14.jpg

    Then you can access the scene color copy with this node, or a texture named "_BlitTexture".
    URPSampleBufferNode.jpg
     
    cecarlsen and BruceKristelijn like this.
  46. BruceKristelijn

    BruceKristelijn

    Joined:
    Apr 28, 2017
    Posts:
    108
    Thank you for the reply! I am, however, using URP 12.1.8 in Unity 2021.3.16f LTS, so some methods don't really exist for this version (I think). I am currently looking deeper into it using the principles you explained to me! Thanks again!
     
    wwWwwwW1 likes this.
  47. BruceKristelijn

    BruceKristelijn

    Joined:
    Apr 28, 2017
    Posts:
    108
    I am having a fair bit of trouble getting the scene color as an RTHandle. Is there a way in URP 12?
     
  48. BruceKristelijn

    BruceKristelijn

    Joined:
    Apr 28, 2017
    Posts:
    108
    I am probably just out of luck for now. Transparent surfaces on the Quest 2 seem to completely break rendering; it might be because of the changes to the shader. Ah well, thanks for the suggestions.

    upload_2023-2-15_16-47-27.png
    As you can see, the transparent window just gets weird artifacts, and when I apply a fullscreen blit the whole screen gets these artifacts.
     
  49. wwWwwwW1

    wwWwwwW1

    Joined:
    Oct 31, 2021
    Posts:
    769
    I think you can access the camera's color RTHandle via renderingData.cameraData.renderer.cameraColorTargetHandle.

    As for the weird color on the transparent window, what kind of post-processing effect is that? If it needs depth information, you should also enable depth write in the window's shader.
     
  50. BruceKristelijn

    BruceKristelijn

    Joined:
    Apr 28, 2017
    Posts:
    108
    I will give this a shot. Thanks!

    The window is a regular transparent box. These artifacts are also observable on world space UI and other transparent geometry with just built-in shaders.
     
    wwWwwwW1 likes this.