Question Custom pass: Post Processing does not apply to pixels rendered from RenderTexture.

Discussion in 'High Definition Render Pipeline' started by cubrman, Aug 31, 2020.

  1. cubrman

    Joined:
    Jun 18, 2016
    Posts:
    412
    I want to create an effect where an opaque mesh blocking the camera slowly fades from view. I want this to happen automatically, without ever switching materials to transparent. A custom pass seems to be the way to go, but my models look completely different when I draw them into a render texture first and then alpha-blend it into the scene:

    Meshes drawn normally:
    upload_2020-8-31_6-43-30.png

    Meshes drawn into my custom render target and then alpha-blended into the final scene:
    upload_2020-8-31_6-41-37.png

    The only explanation I could find for why the results are so different is that post processing does not apply to the pixels that were alpha-blended from the render texture. I have no idea how to fix it.

    I tried Unity 2020.2 alpha as well as Unity 2019.4.

    HDRP version 8.1.0

    Here is the project file:
    https://drive.google.com/file/d/1zpbYc4u2_1jFtuiHZ2bB6xa2Zx9gsfdj/view?usp=sharing
     
    Last edited: Aug 31, 2020
  2. antoinel_unity

    Unity Technologies

    Joined:
    Jan 7, 2019
    Posts:
    265
    Hello,

    I think custom passes may not be the best tool to make this kind of effect. Let me explain:
    Here, what you want is to fade some objects based on their distance to the camera. That implies having both the color of the objects to fade (in a separate buffer) and the color behind those objects (in the main camera color buffer), and then blending the two together before the post processes.
    Custom passes let you render objects into an arbitrary buffer, so you can draw the fading objects into that buffer and then blend it into the camera color buffer afterward (using the depth, for example).

    So what's the issue? Because those objects only exist in this separate buffer and not in the main one, they won't interact properly with the rest of the scene (no shadows, no screen-space effects, etc.), which leads to visual artifacts down the line.

    This is mainly why I think you should create a custom Shader Graph to do this effect instead (using an alpha clip threshold on an opaque material with dithering, or something similar). In the end, that is more performant than rendering the objects into a separate buffer and then doing a blend.
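
    For what it's worth, here is a minimal sketch of how such a dithered fade could be driven from C#, assuming the Shader Graph exposes a float property (called "_Fade" here, which is an assumption, not something from the project above) that feeds the alpha clip / dither threshold:

    Code (CSharp):
    using UnityEngine;

    // Minimal sketch: drives a hypothetical "_Fade" property exposed by a custom
    // Shader Graph (alpha clip threshold with dithering) based on the distance
    // between this object and the main camera. Property name and distances are
    // assumptions.
    [RequireComponent(typeof(Renderer))]
    public class DistanceFade : MonoBehaviour
    {
        public float fadeStartDistance = 5f; // fully visible beyond this distance
        public float fadeEndDistance = 1f;   // fully faded out at this distance

        static readonly int FadeId = Shader.PropertyToID("_Fade");

        Renderer rend;
        MaterialPropertyBlock props;

        void Awake()
        {
            rend = GetComponent<Renderer>();
            props = new MaterialPropertyBlock();
        }

        void LateUpdate()
        {
            var cam = Camera.main;
            if (cam == null)
                return;

            float distance = Vector3.Distance(cam.transform.position, transform.position);
            // 0 = fully faded (object right in front of the camera), 1 = fully visible.
            float fade = Mathf.InverseLerp(fadeEndDistance, fadeStartDistance, distance);

            rend.GetPropertyBlock(props);
            props.SetFloat(FadeId, fade);
            rend.SetPropertyBlock(props);
        }
    }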
     
  3. Remy_Unity

    Unity Technologies

    Joined:
    Oct 3, 2017
    Posts:
    704
    Hello,

    I looked a bit at the project and couldn't find anything obvious that explains this, for the moment.

    My only finding is that it seems to be an exposure (or lack of precision in the data?) issue.

    I duplicated and moved the objects a bit to get direct rendering and the custom pass side by side.
    Here is your result:
    upload_2020-8-31_17-18-59.png

    And here is the same, but with the fixed exposure increased to 12:
    upload_2020-8-31_17-19-38.png
     
  4. Remy_Unity

    Unity Technologies

    Joined:
    Oct 3, 2017
    Posts:
    704
    OK, issue found!
    As I suspected, your buffer doesn't have a high enough range to store the highly exposed data.
    On line 40 of Outline.cs, you declare OutlineBuffer as R16G16B16A16_SNorm, where you should use R16G16B16A16_SFloat.
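
    For anyone who lands here with the same symptom, the corrected allocation looks roughly like this. This is a sketch based on the standard HDRP custom pass outline example (HDRP 8.x API), not the exact contents of Outline.cs, so the surrounding arguments may differ slightly in the actual project:

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Experimental.Rendering;
    using UnityEngine.Rendering;
    using UnityEngine.Rendering.HighDefinition;

    // Sketch of the relevant part of the custom pass, based on the standard
    // outline example; the real Outline.cs may differ slightly.
    class OutlineBufferSketch : CustomPass
    {
        RTHandle outlineBuffer;

        protected override void Setup(ScriptableRenderContext renderContext, CommandBuffer cmd)
        {
            // A float format keeps HDR values above 1. SNorm clamps to [-1, 1],
            // which is why the custom-pass result broke once exposure was applied.
            outlineBuffer = RTHandles.Alloc(
                Vector2.one, TextureXR.slices, dimension: TextureXR.dimension,
                colorFormat: GraphicsFormat.R16G16B16A16_SFloat, // was R16G16B16A16_SNorm
                useDynamicScale: true, name: "Outline Buffer");
        }

        protected override void Execute(ScriptableRenderContext renderContext, CommandBuffer cmd, HDCamera hdCamera, CullingResults cullingResult)
        {
            // The drawing/blending code from the original pass goes here.
        }

        protected override void Cleanup()
        {
            outlineBuffer?.Release();
        }
    }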
     
  5. cubrman

    Joined:
    Jun 18, 2016
    Posts:
    412
    Damn guys, THANKS A LOT! This was very useful indeed!

    @antoinel_unity I implemented this very system in the built-in pipeline exactly the way you suggested, but after switching to HDRP I had to look for other solutions because of an exceptionally poor Shader Graph experience (there is no way to extend the full Lit shader: you have to build it up from scratch, and even then you get only a tenth of the Lit shader's functionality; and there is no way to write more than one custom node, e.g. one for the pixel shader and one for the vertex shader, without the graph immediately throwing errors). I know how to overcome the issues you raised: I will render the depth buffer data as well and set the MeshRenderer to cast shadows only, to mitigate the problems you mentioned.
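
    In case it helps anyone else reading this, the "shadows only" part of that plan is just a matter of switching the renderer's shadow casting mode; a minimal sketch (the fade hook-up around it is assumed, not part of the project above):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Rendering;

    // Sketch: while the object is being faded through the custom pass, keep it
    // casting shadows in the main view but stop drawing its surfaces there.
    [RequireComponent(typeof(MeshRenderer))]
    public class ShadowsOnlyWhileFading : MonoBehaviour
    {
        MeshRenderer meshRenderer;

        void Awake() => meshRenderer = GetComponent<MeshRenderer>();

        public void BeginFade()
        {
            // The mesh keeps its shadows, but the main camera no longer renders it.
            meshRenderer.shadowCastingMode = ShadowCastingMode.ShadowsOnly;
        }

        public void EndFade()
        {
            meshRenderer.shadowCastingMode = ShadowCastingMode.On;
        }
    }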