Custom Post Processing on LWRP

Discussion in 'Universal Render Pipeline' started by piter00999, Sep 14, 2019.

  1. piter00999

    piter00999

    Joined:
    Jan 5, 2018
    Posts:
    19
    Hi there!

    For a couple of days now I've had a problem and I can't find a solution. I want to create a custom post-processing effect that works with LWRP, and I want to avoid using Post Processing Stack V2 as the base for my effect. I'm looking for something like OnRenderImage from the legacy render pipeline. I already tried adding a command buffer to the camera, but that doesn't work on LWRP either... So my question is: is there any other way to apply a fullscreen image effect using LWRP?
     
    superjayman likes this.
  2. larsbertram1

    larsbertram1

    Joined:
    Oct 7, 2008
    Posts:
    6,900
    i am not really into this, but command buffers in lwrp are now ScriptableRenderPasses,
    and you can schedule them using:

    • IAfterDepthPrePass
    • IAfterOpaquePass
    • IAfterOpaquePostProcess
    • IAfterRender
    • IAfterSkyboxPass
    • IAfterTransparentPass
    --> documentation

    i hope this helps a bit with your further search.
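    in case it helps, here is a minimal sketch of the newer equivalent, assuming a recent LWRP/URP version where passes are injected through a ScriptableRendererFeature and scheduled with a RenderPassEvent (the feature and pass names here are just examples, and the mapping in the comments is approximate):
    Code (CSharp):
    using UnityEngine.Rendering;
    using UnityEngine.Rendering.Universal; // or UnityEngine.Rendering.LWRP on older LWRP versions

    public class InjectionPointExampleFeature : ScriptableRendererFeature
    {
        // A pass that does nothing; only the scheduling is of interest here.
        class NoOpPass : ScriptableRenderPass
        {
            public override void Execute(ScriptableRenderContext context, ref RenderingData renderingData) { }
        }

        private NoOpPass m_Pass;

        public override void Create()
        {
            m_Pass = new NoOpPass();
            // Rough equivalents of the old injection interfaces:
            //   IAfterOpaquePass      -> RenderPassEvent.AfterRenderingOpaques
            //   IAfterSkyboxPass      -> RenderPassEvent.AfterRenderingSkybox
            //   IAfterTransparentPass -> RenderPassEvent.AfterRenderingTransparents
            //   IAfterRender          -> RenderPassEvent.AfterRendering
            m_Pass.renderPassEvent = RenderPassEvent.AfterRendering;
        }

        public override void AddRenderPasses(ScriptableRenderer renderer, ref RenderingData renderingData)
        {
            renderer.EnqueuePass(m_Pass);
        }
    }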
     
  3. superjayman

    superjayman

    Joined:
    May 31, 2013
    Posts:
    185
    How can you do custom post-processing effects in LWRP/URP? Can someone give clear instructions on the steps involved? E.g. I want world-space normals and motion vectors, as I want to implement a special motion blur.
     
  4. superjayman

    superjayman

    Joined:
    May 31, 2013
    Posts:
    185
    This seems like it should be trivial: where are the world-space normals and motion vectors?
     
  5. piter00999

    piter00999

    Joined:
    Jan 5, 2018
    Posts:
    19
  6. piter00999

    piter00999

    Joined:
    Jan 5, 2018
    Posts:
    19
    Thanks for this! I think I'm getting close, but I still can't push my image effect to the screen buffer. I've written a custom render pass which applies the image effect to the screen texture, but I can't push the final image to the screen buffer, and I don't have a clue why. The Frame Debugger shows that my effect is in fact rendered.

    Here is the code of the custom render pass:
    Code (CSharp):
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.Rendering;
    using UnityEngine.Rendering.Universal; // or UnityEngine.Rendering.LWRP on older LWRP versions

    public class CustomRenderPassFeature : ScriptableRendererFeature
    {
        class CustomRenderPass : ScriptableRenderPass
        {
            private int screenCopyID;

            // This method is called before executing the render pass.
            // It can be used to configure render targets and their clear state, and to create temporary render target textures.
            // When empty, this render pass will render to the active camera render target.
            // You should never call CommandBuffer.SetRenderTarget. Instead call <c>ConfigureTarget</c> and <c>ConfigureClear</c>.
            // The render pipeline will ensure target setup and clearing happens in a performant manner.
            public override void Configure(CommandBuffer cmd, RenderTextureDescriptor cameraTextureDescriptor)
            {
                screenCopyID = Shader.PropertyToID("_ScreenCopyTexture");
                cmd.GetTemporaryRT(screenCopyID, cameraTextureDescriptor);
            }

            // Here you can implement the rendering logic.
            // Use <c>ScriptableRenderContext</c> to issue drawing commands or execute command buffers
            // https://docs.unity3d.com/ScriptReference/Rendering.ScriptableRenderContext.html
            // You don't have to call ScriptableRenderContext.submit, the render pipeline will call it at specific points in the pipeline.
            public override void Execute(ScriptableRenderContext context, ref RenderingData renderingData)
            {
                CommandBuffer buffer = CommandBufferPool.Get("MOPP");
                buffer.name = "Vignette";

                buffer.Blit(BuiltinRenderTextureType.CurrentActive, screenCopyID);

                Material VignetteMaterial = GetMaterial(VignetteShader);
                buffer.Blit(screenCopyID, BuiltinRenderTextureType.CameraTarget, VignetteMaterial, 0);

                context.ExecuteCommandBuffer(buffer);
                CommandBufferPool.Release(buffer);
            }

            /// Cleanup any allocated resources that were created during the execution of this render pass.
            public override void FrameCleanup(CommandBuffer cmd)
            {
                cmd.ReleaseTemporaryRT(screenCopyID);
            }

            private const string VignetteShader = "Hidden/VignetteShader";
            private Dictionary<string, Material> Materials = new Dictionary<string, Material>();

            public Material GetMaterial(string shaderName)
            {
                Material material;
                if (Materials.TryGetValue(shaderName, out material))
                {
                    return material;
                }
                else
                {
                    Shader shader = Shader.Find(shaderName);

                    if (shader == null)
                    {
                        Debug.LogError("Shader not found (" + shaderName + "), check that the missing shader is in the Shaders folder; if not, reimport this package. If this problem occurs only in a build, try adding all shaders in the Shaders folder to Always Included Shaders (Project Settings -> Graphics -> Always Included Shaders)");
                    }

                    Material NewMaterial = new Material(shader);
                    NewMaterial.hideFlags = HideFlags.HideAndDontSave;
                    Materials.Add(shaderName, NewMaterial);
                    return NewMaterial;
                }
            }
        }

        CustomRenderPass m_ScriptablePass;

        public override void Create()
        {
            m_ScriptablePass = new CustomRenderPass();

            // Configures where the render pass should be injected.
            m_ScriptablePass.renderPassEvent = RenderPassEvent.AfterRendering;
        }

        // Here you can inject one or multiple render passes in the renderer.
        // This method is called when setting up the renderer once per-camera.
        public override void AddRenderPasses(ScriptableRenderer renderer, ref RenderingData renderingData)
        {
            renderer.EnqueuePass(m_ScriptablePass);
            Debug.Log("Render pass added!");
        }
    }
    Also, here is evidence from the Frame Debugger that my effect is rendered, but not to the screen buffer (I need to use something different from BuiltinRenderTextureType.CameraTarget as my image effect target, but I don't know what): vignetteRednering.png
     
  7. larsbertram1

    larsbertram1

    Joined:
    Oct 7, 2008
    Posts:
    6,900
    me neither. but i would look into that "final blit pass" and which source texture it uses.
     
  8. piter00999

    piter00999

    Joined:
    Jan 5, 2018
    Posts:
    19
    Okay, finally found the proper camera target :D! For anyone looking for this, you can get the target reference here:

    Code (CSharp):
    public override void AddRenderPasses(ScriptableRenderer renderer, ref RenderingData renderingData)
    {
        cameraRenderTarget = renderer.cameraColorTarget;
        renderer.EnqueuePass(renderPass);
    }
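    For reference, here is a minimal sketch of how the whole feature might look with that target wired through, assuming the pass keeps the camera color target in a field and blits back into it instead of BuiltinRenderTextureType.CameraTarget (class and field names here are illustrative, not the exact code from this thread):
    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Rendering;
    using UnityEngine.Rendering.Universal; // or UnityEngine.Rendering.LWRP on older LWRP versions

    public class VignetteFeature : ScriptableRendererFeature
    {
        class VignettePass : ScriptableRenderPass
        {
            public RenderTargetIdentifier cameraColorTarget; // set from AddRenderPasses each frame
            public Material material;                        // the vignette material
            private int screenCopyID;

            public override void Configure(CommandBuffer cmd, RenderTextureDescriptor cameraTextureDescriptor)
            {
                screenCopyID = Shader.PropertyToID("_ScreenCopyTexture");
                cmd.GetTemporaryRT(screenCopyID, cameraTextureDescriptor);
            }

            public override void Execute(ScriptableRenderContext context, ref RenderingData renderingData)
            {
                CommandBuffer cmd = CommandBufferPool.Get("Vignette");
                // Copy the camera color target, run the effect, and write the result back to the same target.
                cmd.Blit(cameraColorTarget, screenCopyID);
                cmd.Blit(screenCopyID, cameraColorTarget, material, 0);
                context.ExecuteCommandBuffer(cmd);
                CommandBufferPool.Release(cmd);
            }

            public override void FrameCleanup(CommandBuffer cmd)
            {
                cmd.ReleaseTemporaryRT(screenCopyID);
            }
        }

        public Material vignetteMaterial;
        private VignettePass m_Pass;

        public override void Create()
        {
            m_Pass = new VignettePass { material = vignetteMaterial };
            m_Pass.renderPassEvent = RenderPassEvent.AfterRendering;
        }

        public override void AddRenderPasses(ScriptableRenderer renderer, ref RenderingData renderingData)
        {
            m_Pass.cameraColorTarget = renderer.cameraColorTarget;
            renderer.EnqueuePass(m_Pass);
        }
    }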
     
    morepixels and Tartiflette like this.
  9. Tartiflette

    Tartiflette

    Joined:
    Apr 10, 2015
    Posts:
    84
    Blitting to renderer.cameraColorTarget does make the whole thing work quite nicely. I found that the two renderPassEvent values that work are RenderPassEvent.AfterRendering (only if post-process AA is disabled) and RenderPassEvent.BeforeRenderingPostProcessing.
    It's obviously not the most efficient solution due to the extra blit, but it's definitely nice to be able to at least test some custom effects.
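    In code terms that is just the renderPassEvent assignment in the feature's Create(), something like this (a sketch, reusing the m_ScriptablePass field from piter00999's CustomRenderPassFeature above):
    Code (CSharp):
    public override void Create()
    {
        m_ScriptablePass = new CustomRenderPass();

        // Option 1: run just before the built-in post-processing.
        m_ScriptablePass.renderPassEvent = RenderPassEvent.BeforeRenderingPostProcessing;

        // Option 2: run after everything; reported to work only when post-process AA is disabled.
        // m_ScriptablePass.renderPassEvent = RenderPassEvent.AfterRendering;
    }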
     
  10. piter00999

    piter00999

    Joined:
    Jan 5, 2018
    Posts:
    19
    Interesting, in my case it works even with events like RenderPassEvent.AfterRenderingOpaques without problems, maybe because for now I'm just testing this with a simple vignette shader, so I'm not using depth buffers etc. Anyway, I found another problem with this rendering method... performance. On my now quite old phone (Xiaomi Redmi Note 4) a simple vignette effect takes something like 30 ms! This is waaaay too much, but I don't have a clue why it happens.

    I tried different things to optimize this:
    - creating the command buffer and filling it with commands only once at startup (not worth it, almost zero impact on performance)
    - using temporary render textures with different parameters than cameraTextureDescriptor has; this actually gave a significant boost in performance, especially changing depthBits to 0, because I don't use depth anyway (see the sketch below)

    After these optimizations I'm now getting rendering times of around 30 ms for just the vignette :( (for comparison, the vignette effect from the post-processing stack built into LWRP takes 2-3 ms). So it may be some bug connected to my device, or I'm just missing something important in my code.
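    A minimal sketch of that descriptor tweak, assuming it lives in the Configure() override of the CustomRenderPass shown earlier (the MSAA line is an extra assumption that may also help on mobile, it is not from the original post):
    Code (CSharp):
    public override void Configure(CommandBuffer cmd, RenderTextureDescriptor cameraTextureDescriptor)
    {
        screenCopyID = Shader.PropertyToID("_ScreenCopyTexture");

        // Copy the camera descriptor, then strip what a simple vignette doesn't need.
        RenderTextureDescriptor descriptor = cameraTextureDescriptor;
        descriptor.depthBufferBits = 0; // no depth buffer needed for this effect
        descriptor.msaaSamples = 1;     // assumption: skip MSAA on the temporary copy

        cmd.GetTemporaryRT(screenCopyID, descriptor);
    }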
     
  11. olli_vrcoaster

    olli_vrcoaster

    Joined:
    Sep 1, 2017
    Posts:
    24
    Any progress on this? I would like to start using the Universal Render Pipeline, but all the stuff I read makes it feel very limited, especially with custom post-processing in mind.
     
  12. piter00999

    piter00999

    Joined:
    Jan 5, 2018
    Posts:
    19
    Hey, as far as I know, a renderer feature is still the only option for custom post-processing. About the limits of URP/LWRP: around a month ago I wanted to port my volumetric lights to support URP, but I had to stop because I didn't find any way to get the shadow map texture, which is necessary for this effect. Oh, and the post-processing stack built into URP lacks SSAO. Tbh I don't know why URP/LWRP in its current state is called production-ready and not alpha or something xD.
     
  13. olli_vrcoaster

    olli_vrcoaster

    Joined:
    Sep 1, 2017
    Posts:
    24
    :D Yeah, that's weird to me too, but maybe they just need more people using it to boost the development... so they're trying to get more people to use it?
    Or maybe it was just a contract with Oculus to get it done as soon as the Quest was out, and now they have to officially call it "ready" to keep the agreement, idk. ^^
    ... Whatever, I'm still happy that they're trying to get the best performance out of the engine and don't hesitate too long to introduce larger changes to reach that goal.
    By "Render Feature" do you mean the steps you described above? The ones that gave you 30 ms overhead? Or did I miss something?
     
  14. piter00999

    piter00999

    Joined:
    Jan 5, 2018
    Posts:
    19
    Yep, a renderer feature. But after trying it a couple more times, I'm not sure why I was getting such a big overhead back then. I still do the same thing (copy the screen to a texture -> apply some shaders etc. -> copy the result to the destination texture) and everything works with reasonable rendering times, so maybe it was a problem with some specific LWRP version or something.
     
    olli_vrcoaster likes this.