
[Solved] How to use Blit with a CustomPostProcessVolumeComponent?

Discussion in 'High Definition Render Pipeline' started by McDev02, Mar 17, 2020.

  1. McDev02
    I am porting a former ImageEffect to HDRP as a CustomPostProcessVolumeComponent and have run into a few issues.

    1. When I follow these instructions I am unable to add the post effect to the BeforePP list, only to the AfterPP list. That seems to be fine for my needs, but having raw, non-tonemapped values could be beneficial in some cases.

    2. Is there a way to change the order of post-processing components? Each one seems to have a predefined priority; the position of the component in the Inspector has no impact.

    3. How to use Blit with this system? Solution

    My shader consists of the following steps:
    For each iteration:
        Blit: BlurX pass
        Blit: BlurY pass

    This produces a blurred image that I need in the final output pass, which I could do like this:

    Code (CSharp):
    m_Material.SetTexture("_MaskTexture", blurredTexture);
    HDUtils.DrawFullScreen(cmd, m_Material, destination);
    The main question, however, is how to perform the blits. Here is what I tried, but the Blit calls are not working (see also the note after the code):

    Code (CSharp):
    public override void Render(CommandBuffer cmd, HDCamera camera, RTHandle source, RTHandle destination)
    {
        if (m_Material == null)
            return;

        m_Material.SetFloat("_Pixel", Pixel.value);
        m_Material.SetFloat("_Amount", Amount.value);
        m_Material.SetFloat("_Amount2", SecondAmount.value);
        m_Material.SetFloat("_Threshold", Threshold.value);
        m_Material.SetTexture("_InputTexture", source);

        RenderTexture tmp1 = RenderTexture.GetTemporary(destination.rt.width, destination.rt.height);
        RenderTexture tmp2 = RenderTexture.GetTemporary(destination.rt.width, destination.rt.height);

        cmd.Blit(source.rt, tmp1);

        for (int i = 0; i < Iterations.value; i++)
        {
            cmd.Blit(tmp1, tmp2, blurMaterial, 0); // BlurX
            cmd.Blit(tmp2, tmp1, blurMaterial, 1); // BlurY
        }

        m_Material.SetTexture("_MaskTexture", tmp1);
        // UnsharpMask pass
        HDUtils.DrawFullScreen(cmd, m_Material, destination, null, 0);

        RenderTexture.ReleaseTemporary(tmp1);
        RenderTexture.ReleaseTemporary(tmp2);
    }
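    Note: one detail worth flagging, separate from the fix found later in this thread. RenderTexture.GetTemporary/ReleaseTemporary run immediately, while the CommandBuffer executes later in the frame, so the temporaries can be recycled before the recorded blits actually run. A command-buffer-friendly sketch of the same loop is below; the IDs _BlurTemp1/_BlurTemp2 are illustrative, and whether cmd.Blit binds the source the way an HDRP full-screen shader expects is a separate question (the working approach later in the thread uses RTHandles and HDUtils.DrawFullScreen instead).
    Code (CSharp):
    // Sketch only: request the temporaries through the command buffer itself.
    int tmp1 = Shader.PropertyToID("_BlurTemp1"); // illustrative IDs, not from the original shader
    int tmp2 = Shader.PropertyToID("_BlurTemp2");
    cmd.GetTemporaryRT(tmp1, camera.actualWidth, camera.actualHeight, 0, FilterMode.Bilinear);
    cmd.GetTemporaryRT(tmp2, camera.actualWidth, camera.actualHeight, 0, FilterMode.Bilinear);

    cmd.Blit(source, tmp1); // copy the camera colour into the first temporary

    for (int i = 0; i < Iterations.value; i++)
    {
        cmd.Blit(tmp1, tmp2, blurMaterial, 0); // BlurX
        cmd.Blit(tmp2, tmp1, blurMaterial, 1); // BlurY
    }

    // Bind the blurred result globally; assumes the material does not also set _MaskTexture per-material.
    cmd.SetGlobalTexture("_MaskTexture", tmp1);
    HDUtils.DrawFullScreen(cmd, m_Material, destination, null, 0);

    cmd.ReleaseTemporaryRT(tmp1);
    cmd.ReleaseTemporaryRT(tmp2);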
     
    Last edited: Mar 19, 2020
  2. Olmi
    Hi @McDev02

    1. You probably didn't read carefully enough?

    "The injectionPoint override allows you to specify where in the pipeline HDRP executes the effect. There are currently three injection points:".

    This line in the example allows you to set the execution point:
    Code (CSharp):
    public override CustomPostProcessInjectionPoint injectionPoint =>
        CustomPostProcessInjectionPoint.AfterPostProcess;
    I remember it worked fine for me sometime in August last year, but things might have changed since then.

    3. Check Keijiro's HDRP post-processing examples to see how blits are handled there.

    You need to reserve an RTHandle for each render pass you need: one for your horizontal pass, one for your vertical pass, then render the final result into the destination (the incoming RTHandle in your Render() override). Use HDUtils.DrawFullScreen to render each shader pass.

    At least that's how I've done it; while there's no proper documentation available, it's mostly guesswork and trial and error. Not sure if it's the right way, but it worked for me.
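    A minimal sketch of that idea, assuming a blur shader with a horizontal pass 0 and a vertical pass 1, plus a composite material (m_Material) whose pass 0 reads the blurred mask; the field and property names are illustrative, not taken from McDev02's project:
    Code (CSharp):
    // Allocate the intermediate handles once (lazily here) and release them in Cleanup() via RTHandles.Release().
    if (rth1 == null)
        rth1 = RTHandles.Alloc(scaleFactor: Vector2.one, colorFormat: GraphicsFormat.R16G16B16A16_SFloat);
    if (rth2 == null)
        rth2 = RTHandles.Alloc(scaleFactor: Vector2.one, colorFormat: GraphicsFormat.R16G16B16A16_SFloat);

    // Horizontal pass: read the camera colour, write into the first handle.
    _prop.SetTexture("_InputTexture", source);
    HDUtils.DrawFullScreen(cmd, blurMaterial, rth1, _prop, 0);

    // Vertical pass: read the first handle, write into the second.
    _prop.SetTexture("_InputTexture", rth1);
    HDUtils.DrawFullScreen(cmd, blurMaterial, rth2, _prop, 1);

    // Final pass: read the original colour plus the blurred mask, write to the destination HDRP handed in.
    _prop.SetTexture("_InputTexture", source);
    _prop.SetTexture("_MaskTexture", rth2);
    HDUtils.DrawFullScreen(cmd, m_Material, destination, _prop, 0);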
     
  3. McDev02
    Thanks @Olmi

    But it just doesn't work. I made a few checks to confirm that the materials and shaders are fine, and they are.
    For example, when I change it to var lastRT = source.rt; then "_MaskTexture" is set and I can use it in the shader, but the way I do it below the texture remains black.
    If I call HDUtils.DrawFullScreen(cmd, m_Material, destination, _prop, 1); I also get what I expect.
    The way I create the RTHandles is taken from the example you posted, the Streak effect.

    Code (CSharp):
    public RTHandle GetNewRTHandle(HDCamera camera)
    {
        var width = camera.actualWidth;
        var height = camera.actualHeight;
        const GraphicsFormat RTFormat = GraphicsFormat.R16G16B16A16_SFloat;
        var rt = RTHandles.Alloc(scaleFactor: Vector2.one, colorFormat: RTFormat); // RTHandles.Alloc(width, height, colorFormat: RTFormat);

        rtHandles.Add(rt);
        return rt;
    }

    public override void Render(CommandBuffer cmd, HDCamera camera, RTHandle source, RTHandle destination)
    {
        if (m_Material == null)
            return;

        m_Material.SetFloat("_Pixel", Pixel.value);
        m_Material.SetFloat("_Amount", Amount.value);
        m_Material.SetFloat("_Amount2", SecondAmount.value);
        m_Material.SetFloat("_Threshold", Threshold.value);
        m_Material.SetTexture("_InputTexture", source);

        if (rth1 == null)
            rth1 = GetNewRTHandle(camera);
        if (rth2 == null)
            rth2 = GetNewRTHandle(camera);

        HDUtils.DrawFullScreen(cmd, m_Material, rth1, _prop, 1);

        var lastRT = rth1;

        //for (int i = 0; i < Iterations.value; i++)
        //{
        //    cmd.Blit(tmp1, tmp2, blurMaterial, 0);
        //    cmd.Blit(tmp2, tmp1, blurMaterial, 1);
        //}

        m_Material.SetTexture("_MaskTexture", lastRT);
        HDUtils.DrawFullScreen(cmd, m_Material, destination);
    }
     
  4. McDev02
    Alright, this shows that posting the full source is beneficial. The issue was the difference between LOAD_TEXTURE2D_X and LOAD_TEXTURE2D: for my own RTHandles I have to use LOAD_TEXTURE2D. I don't plan to support VR, so I don't mind, but there may also be a way to allocate the RTHandles in an XR-aware way in case this becomes necessary (see the sketch after the code below).
    Here is some insight:
    https://forum.unity.com/threads/hdrp-post-processing-shaders-accessing-gbuffers.726158/

    Shader head:
    Code (HLSL):
    half _Pixel, _Amount, _Amount2, _Threshold, _DepthBias, _Fac;
    TEXTURE2D_X(_InputTexture); // source provided by HDRP: XR-aware, read with LOAD_TEXTURE2D_X
    TEXTURE2D(_MaskTexture);    // my own RTHandle: plain 2D texture, read with LOAD_TEXTURE2D
    Code (CSharp):
    public override void Render(CommandBuffer cmd, HDCamera camera, RTHandle source, RTHandle destination)
    {
        if (m_Material == null)
            return;

        m_Material.SetFloat("_Pixel", Pixel.value);
        m_Material.SetFloat("_Amount", Amount.value);
        m_Material.SetFloat("_Amount2", SecondAmount.value);
        m_Material.SetFloat("_Threshold", Threshold.value);
        m_Material.SetTexture("_InputTexture", source);

        if (rth1 == null)
            rth1 = GetNewRTHandle(camera);
        if (rth2 == null)
            rth2 = GetNewRTHandle(camera);

        //_prop.SetTexture(Shader.PropertyToID("_InputTexture"), source);
        HDUtils.DrawFullScreen(cmd, m_Material, rth1, _prop, 3);

        for (int i = 0; i < Iterations.value; i++)
        {
            _prop.SetTexture(Shader.PropertyToID("_MaskTexture"), rth1);
            HDUtils.DrawFullScreen(cmd, m_Material, rth2, _prop, 1);

            _prop.SetTexture(Shader.PropertyToID("_MaskTexture"), rth2);
            HDUtils.DrawFullScreen(cmd, m_Material, rth1, _prop, 2);
        }

        m_Material.SetTexture("_MaskTexture", rth1);
        HDUtils.DrawFullScreen(cmd, m_Material, destination);
    }
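    On the VR note above: RTHandles.Alloc can also be given the XR slice count and texture dimension, which should let the shader keep using TEXTURE2D_X / LOAD_TEXTURE2D_X for your own buffers as well. A sketch, based on how HDRP allocates its own XR-aware buffers (not taken from this project, and untested here):
    Code (CSharp):
    // Sketch: allocate the RTHandle as an XR-aware texture array so the shader
    // can declare it with TEXTURE2D_X and read it with LOAD_TEXTURE2D_X.
    var rt = RTHandles.Alloc(
        scaleFactor: Vector2.one,
        slices: TextureXR.slices,          // 1 without XR, more with single-pass instanced stereo
        dimension: TextureXR.dimension,    // Tex2DArray when XR texture arrays are in use
        colorFormat: GraphicsFormat.R16G16B16A16_SFloat,
        name: "Blur Mask");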
     
  5. Olmi
    Good to hear you got it working. Yes, it would be good to also have the shader side of the code visible, as quite a few things have changed compared to the built-in render pipeline.