
Custom Pass API

Discussion in 'High Definition Render Pipeline' started by iamarugin, Aug 28, 2019.

  1. iamarugin

    iamarugin

    Joined:
    Dec 17, 2014
    Posts:
    883
  2. rz_0lento

    rz_0lento

    Joined:
    Oct 8, 2013
    Posts:
    2,361
    It's merged now, along with the custom post-processing pass (which is a separate thing from custom passes).
     
  3. glbranimir

    glbranimir

    Joined:
    Jun 18, 2019
    Posts:
    4
    Where can I find some samples? Whatever I put in my custom pass or the fullscreen pass, only a black screen appears.

    Unity 2019.3
    HDRP 7.1.2
     
  4. iamarugin

    iamarugin

    Joined:
    Dec 17, 2014
    Posts:
    883
    I have the same problem. I think we should wait until 7.1.3 is released.
     
  5. glbranimir

    glbranimir

    Joined:
    Jun 18, 2019
    Posts:
    4
    I want to make a
    _CameraNormalsTexture
    containing the screen-space normals so I can inject it into Shader Graph (I really want to get edge detection with normals working, but I can't get it to work either way; everything seems legacy).
     
  6. iamarugin

    iamarugin

    Joined:
    Dec 17, 2014
    Posts:
    883
    If you are using deferred rendering, I think you already have the normals in the G-Buffer, so you can sample them in a custom pass before post-processing.
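    For example, a fullscreen custom pass fragment shader could decode the normal buffer roughly like this (a sketch based on HDRP's NormalBuffer.hlsl helpers; the include path and function names can differ between HDRP versions, so verify against yours):
    Code (csharp):
    #include "Packages/com.unity.render-pipelines.high-definition/Runtime/Material/NormalBuffer.hlsl"

    float4 FullScreenPass(Varyings varyings) : SV_Target
    {
        // Pixel coordinate of the current fragment
        uint2 positionSS = (uint2)varyings.positionCS.xy;

        // Decode the world-space normal stored in the normal buffer / G-Buffer
        NormalData normalData;
        DecodeFromNormalBuffer(positionSS, normalData);

        // Visualize the normal (remap from [-1,1] to [0,1])
        return float4(normalData.normalWS * 0.5 + 0.5, 1);
    }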
     
  7. antoinel_unity

    antoinel_unity

    Unity Technologies

    Joined:
    Jan 7, 2019
    Posts:
    262
    PolyCrusher, nasos_333 and rz_0lento like this.
  8. nasos_333

    nasos_333

    Joined:
    Feb 13, 2013
    Posts:
    13,348
    Hi, thanks for the information.

    Is it possible to use this system to render the scene to a refraction texture?
     
  9. nasos_333

    nasos_333

    Joined:
    Feb 13, 2013
    Posts:
    13,348

    I installed 7.1.2 in Unity 2019.3.0b7 and get the following error:

    The name 'GetCameraBuffers' does not exist in the current context

    I see on GitHub that this function exists; does this mean I have to use the GitHub version and not the official Package Manager one?
     
  10. antoinel_unity

    antoinel_unity

    Unity Technologies

    Joined:
    Jan 7, 2019
    Posts:
    262
    Ha yes, these changes aren't yet in the Package Manager; this repository uses all the latest custom pass features that will be in the next 7.x package for 2019.3.

    Do you mean rendering objects into the refraction color pyramid that HDRP uses, or allocating a custom refraction buffer and rendering objects into it?
     
  11. nasos_333

    nasos_333

    Joined:
    Feb 13, 2013
    Posts:
    13,348
    Thanks for the information.

    I want to get refraction in my custom shader in HDRP in general. I managed to read the color pyramid correctly, but when I try to apply it to my water it is projected wrong and moves with the camera, so I am not sure how to use it in the general case, and the Lit shader's vertex code is too complex to decipher easily. That is why I would like to use a texture and old-pipeline functions to go about it.

    Is there a simple way to insert the correct vertex-to-color-pyramid UV conversion in the vertex shader, e.g. some function that does that?

    I describe the issue in more detail here:

    https://forum.unity.com/threads/col...2-version-of-hdrp-but-works-in-2018-3.764147/

    I use the old GrabPass vertex-to-screen conversion libraries, and I suppose that is why I get the texture shifting in my water shader, but I am not sure how to fix it for the new texture and sampling function in HDRP.

    In Unity 2018.3 HDRP I just used the color pyramid texture and it worked correctly with the old pipeline code, like a GrabPass texture; I assume something changed in the Unity 2019.2 HDRP version.
     
  12. antoinel_unity

    antoinel_unity

    Unity Technologies

    Joined:
    Jan 7, 2019
    Posts:
    262
    I think what you're missing is the RTHandle handling in shaders. To support multiple camera resolutions, we introduced a system that allocates render targets at the maximum resolution of all rendering cameras and then sets the viewport for each camera.
    I'd recommend using the functions we provide to sample the camera depth/color in this file: https://github.com/Unity-Technologi...ntime/ShaderLibrary/ShaderVariables.hlsl#L294
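    For instance, in a custom pass shader the RTHandle-aware helpers can be used roughly like this (a sketch; the helper names SampleCameraColor/SampleCameraDepth are assumed from ShaderVariables.hlsl and may vary between HDRP versions):
    Code (csharp):
    // uv is a [0,1] full-screen UV for the current viewport.
    // These helpers apply _RTHandleScale internally, so no manual scaling is needed.
    float  depth = SampleCameraDepth(uv);
    float3 color = SampleCameraColor(uv, 0);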
     
    SKoptev and nasos_333 like this.
  13. nasos_333

    nasos_333

    Joined:
    Feb 13, 2013
    Posts:
    13,348
    Hi,

    Thanks for the input. I use something similar and do my own scaling; I do:
    Code (csharp):
    SAMPLE_TEXTURE2D_X_LOD(_CustomColorTexture, s_trilinear_clamp_sampler, Myuv * MyScaling.xy, 0);
    where Myuv = UNITY_PROJ_COORD(i.grabPassPos)

    instead of
    Code (csharp):
    SAMPLE_TEXTURE2D_X_LOD(_CustomColorTexture, s_trilinear_clamp_sampler, uv * _RTHandleScale.xy, 0);
    because I am converting from my old shader and cannot use all the libraries directly, as I get multiple conflicts between the HDRP core includes and the CG includes.

    With any "MyScaling" value I tried, a given static camera view seems to work OK, but when I move or pan the camera the image projected on my water plane distorts and moves. So I tend to think the problem is the UV part that I changed, where I use i.grabPassPos like the older pipeline's GrabPass-calculated UVs (computed with a specific function in the vertex shader).

    Is there a clean way to get these UVs in the vertex shader without the voodoo code in the HDRP Lit shader? :)
    Because that would solve the issue, I suppose. E.g. getting the hit.positionSS that the HDRP Lit shader uses to sample the texture, but from my vertex shader, which has the vertex.xyz information; or, if there is a minimal include I could use to query similar UVs, that would help a lot.

    The hit.positionSS is defined and used in lines 1726 to 1751 of the HDRP Lit shader (link:
    https://github.com/Unity-Technologi...high-definition/Runtime/Material/Lit/Lit.hlsl )

    My full color-pyramid-related code is in the post below - link:
    https://forum.unity.com/threads/col...hdrp-but-works-in-2018-3.764147/#post-5087762

    Compared to my old code, I mainly include new libraries so I can use the SAMPLE_TEXTURE2D_X_LOD functionality:
    Code (csharp):
    #include "Packages/com.unity.render-pipelines.core/ShaderLibrary/API/D3D11.hlsl"
    #include "Packages/com.unity.render-pipelines.high-definition/Runtime/ShaderLibrary/TextureXR.hlsl"
    In the vertex shader I do:
    Code (csharp):
    ComputeScreenAndGrabPassPos(o.pos, o.screenPos, o.grabPassPos);
    In the fragment shader I do the following to get the scene color (with distortion applied) and the depth:
    Code (csharp):
    half4 screenWithOffset = i.screenPos + distortOffset;
    half4 grabWithOffset = i.grabPassPos + distortOffset;

    half4 rtRefractionsNoDistort = LOAD_TEXTURE2D_X_LOD(_ColorPyramidTexture, UNITY_PROJ_COORD(i.grabPassPos).xy, 0);

    half refrFix = LOAD_TEXTURE2D_X_LOD(_CameraDepthTexture, UNITY_PROJ_COORD(grabWithOffset).xy, 0).r;
    where
    Code (csharp):
    inline void ComputeScreenAndGrabPassPos (float4 pos, out float4 screenPos, out float4 grabPassPos)
    {
        #if UNITY_UV_STARTS_AT_TOP
            float scale = -1.0;
        #else
            float scale = 1.0;
        #endif

        screenPos = ComputeScreenPos(pos);
        grabPassPos.xy = ( float2( pos.x, pos.y*scale ) + pos.w ) * 0.5;
        grabPassPos.zw = pos.zw;
    }

    // and from UnityCG.cginc, ComputeScreenPos is given as

    inline float4 ComputeScreenPos(float4 pos) {
        float4 o = ComputeNonStereoScreenPos(pos);
    #if defined(UNITY_SINGLE_PASS_STEREO)
        o.xy = TransformStereoScreenSpaceTex(o.xy, pos.w);
    #endif
        return o;
    }
    EDIT: I posted a video showcasing what I see and the code I use to get the refraction on the water.



    EDIT 2:
    I suppose another way to go about it is to Blit from a camera that renders the scene without the water (with modified near and far planes) to get the refraction texture, but I first want to see if it is possible through the shader and render pipeline, since reusing the already-created color pyramid texture would be an extra performance gain.

    Excuse the long post :). I have been trying for 4 days to solve this issue, and I feel I am getting closer but am still missing something. I will now try to recreate the raycasting from the HDRP Lit shader and get the hit.positionSS position that way to feed the texture sampling function, if I manage to do it without conflicting with my CG-based code.
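    A possibly relevant helper for this conversion lives in the core ShaderLibrary's Common.hlsl (a sketch only; the exact name and overloads should be checked against the installed HDRP version):
    Code (csharp):
    // Convert a world-space position to the normalized device coordinates / UVs
    // used for screen-space texture sampling (assumed helper from Common.hlsl)
    float2 positionNDC = ComputeNormalizedDeviceCoordinates(positionWS, UNITY_MATRIX_VP);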
     
    Last edited: Oct 22, 2019
  14. antoinel_unity

    antoinel_unity

    Unity Technologies

    Joined:
    Jan 7, 2019
    Posts:
    262
    Sadly, not currently. We focused on ShaderGraph for custom shader creation, so right now writing custom shaders in HDRP by hand is extremely hard and may break in future updates, as there is not yet a clean API defined.

    Sorry, but I don't see anything wrong in the code you posted, and if you follow our refraction code in the Lit.hlsl file it should work.
     
  15. lil_sichen

    lil_sichen

    Joined:
    Apr 13, 2017
    Posts:
    10
    Is there a way to output a second camera's DepthNormals texture (previously stored in _CameraDepthNormalsTexture) to a RenderTexture in HDRP? I see @antoinel_unity already has an example of storing just the depth in the CameraDepthBake example, but I'm not sure how to extract DepthNormals in a similar fashion. Any advice?
     
  16. antoinel_unity

    antoinel_unity

    Unity Technologies

    Joined:
    Jan 7, 2019
    Posts:
    262
    Yes, you can get the normal buffer from a camera as well, but note that in my CameraDepthBake example I only render objects from my second camera's point of view into a depth buffer (I bind only a depth buffer before rendering my objects):
    Code (CSharp):
    // targetTexture is a renderTarget that only has a depth buffer and no color
    CoreUtils.SetRenderTarget(cmd, targetTexture, ClearFlag.Depth);
    HDUtils.DrawRendererList(renderContext, cmd, RendererList.Create(result));
    If you bind a render target that also has a color buffer (using the depth and color accessors), you'll be able to get the object color and depth in a single render target.

    Then you can combine that with the material override parameter in the RendererListDesc API (example here: Fur.cs). With an unlit ShaderGraph that outputs normals as color, you'll be able to get the normals rendered into the render texture's color buffer.
     
  17. lil_sichen

    lil_sichen

    Joined:
    Apr 13, 2017
    Posts:
    10
    Thank you! That's very insightful. I actually tried that approach before and couldn't get it to work properly, so I gave up. I ended up reconstructing the normal from the depth texture as a temporary solution. I'll try your solution later and attach an example here for everyone else (if I get it to work).

    HDRP is really powerful and I hope the team can provide more detailed documentation and examples so we can fully utilize it!
     
    antoinel_unity likes this.
  18. lil_sichen

    lil_sichen

    Joined:
    Apr 13, 2017
    Posts:
    10
    Here's how you can render the world normal into a RenderTexture's color buffer and the depth into its depth buffer with custom passes:
    Code (CSharp):
    using System.Collections;
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.Rendering;
    using UnityEngine.Rendering.HighDefinition;
    using UnityEngine.Experimental.Rendering;

    public class CameraNormalDepthBake : CustomPass
    {
        [Tooltip("Camera to render the normal and depth from")]
        public Camera bakingCamera;
        [Tooltip("Target texture will store normal in RGB and depth in A")]
        public RenderTexture targetTexture;
        Material m_NormalMaterial;
        public LayerMask cullingMask = 0;
        ShaderTagId[] m_ShaderTags;

        protected override void Setup(ScriptableRenderContext renderContext, CommandBuffer cmd)
        {
            m_NormalMaterial = CoreUtils.CreateEngineMaterial(Shader.Find("Hidden/WorldNormal"));
            m_ShaderTags = new ShaderTagId[2]
            {
                new ShaderTagId("DepthOnly"),
                new ShaderTagId("DepthForwardOnly"),
            };
        }

        protected override void Execute(ScriptableRenderContext renderContext, CommandBuffer cmd, HDCamera hdCamera, CullingResults cullingResult)
        {
            if (!bakingCamera || !targetTexture)
            {
                return;
            }

            const int forwardOnlyPassIndex = 5;

            var result = new RendererListDesc(m_ShaderTags, cullingResult, bakingCamera)
            {
                rendererConfiguration = PerObjectData.None,
                renderQueueRange = RenderQueueRange.all,
                // Use the world normal shader with the forward-only pass index
                overrideMaterial = m_NormalMaterial,
                overrideMaterialPassIndex = forwardOnlyPassIndex,
                sortingCriteria = SortingCriteria.BackToFront,
                excludeObjectMotionVectors = false,
                layerMask = cullingMask,
            };

            var p = GL.GetGPUProjectionMatrix(bakingCamera.projectionMatrix, true);
            Matrix4x4 scaleMatrix = Matrix4x4.identity;
            scaleMatrix.m22 = -1.0f;
            var v = scaleMatrix * bakingCamera.transform.localToWorldMatrix.inverse;
            var vp = p * v;
            cmd.SetGlobalMatrix("_ViewMatrix", v);
            cmd.SetGlobalMatrix("_InvViewMatrix", v.inverse);
            cmd.SetGlobalMatrix("_ProjMatrix", p);
            cmd.SetGlobalMatrix("_InvProjMatrix", p.inverse);
            cmd.SetGlobalMatrix("_ViewProjMatrix", vp);
            cmd.SetGlobalMatrix("_InvViewProjMatrix", vp.inverse);
            cmd.SetGlobalMatrix("_CameraViewProjMatrix", vp);
            cmd.SetGlobalVector("_WorldSpaceCameraPos", Vector3.zero);

            // Draw normal to the RenderTexture's color buffer, and depth to its depth buffer
            CoreUtils.SetRenderTarget(cmd, targetTexture.colorBuffer, ClearFlag.All);
            HDUtils.DrawRendererList(renderContext, cmd, RendererList.Create(result));
        }

        protected override void Cleanup()
        {
            CoreUtils.Destroy(m_NormalMaterial);
        }
    }
    The World Normal Unlit ShaderGraph Material:
    upload_2020-5-12_2-53-56.png

    RenderTexture settings:
    upload_2020-5-12_2-54-31.png
     
    cultureulterior likes this.
  19. lil_sichen

    lil_sichen

    Joined:
    Apr 13, 2017
    Posts:
    10
    I had a new problem though. I want to use a RenderTexture that doesn't have a depth buffer, and instead stores the world normal as RGB and the depth as A in its color buffer. This way I can sample the texture in the Visual Effect Graph using the Sample Texture2D node.

    Does anyone have any advice on how to do this?

    Edit:
    Just use the code above, but in the Unlit Shader Graph plug Scene Depth into the Alpha output:
    upload_2020-5-15_22-52-45.png
     
    Last edited: May 16, 2020
  20. antoinel_unity

    antoinel_unity

    Unity Technologies

    Joined:
    Jan 7, 2019
    Posts:
    262
    Did you try to output the pixel depth (from the screen space position in ShaderGraph) in the alpha channel of the unlit master node?
     
    lil_sichen likes this.
  21. lil_sichen

    lil_sichen

    Joined:
    Apr 13, 2017
    Posts:
    10
    Oof sorry I'm not familiar with Shader Graph and I totally missed the alpha output. Yep it totally works!
     
    antoinel_unity likes this.
  22. sergiusz308

    sergiusz308

    Joined:
    Aug 23, 2016
    Posts:
    235
    @antoinel_unity I have several custom passes for HDRP; while they work perfectly in the Editor, they are gone in the build - the effect doesn't work, and no error messages are provided (in either the build or player logs).

    What could be the reason for this?
     
  23. Bordeaux_Fox

    Bordeaux_Fox

    Joined:
    Nov 14, 2018
    Posts:
    589
    Maybe the shaders were stripped from the build. Check out the shader stripping settings.
     
  24. luosiri

    luosiri

    Joined:
    Dec 19, 2013
    Posts:
    11
    I have the same problem. I use a Unity custom pass for screen-space water rendering; it works fine in the Editor, but after building, everything is black.
     
  25. mandicLuka

    mandicLuka

    Joined:
    Aug 20, 2018
    Posts:
    13
    @lil_sichen I tried to use your code on Unity 2020, but I can't make it work. I have 4 cameras from which I try to sample the depth buffer, but it seems as though I sample from the main camera every time.

    I posted my problem here: https://forum.unity.com/threads/cus...n-from-multiple-cameras.1127984/#post-7255991.
     
  26. antoinel_unity

    antoinel_unity

    Unity Technologies

    Joined:
    Jan 7, 2019
    Posts:
    262
    Yes, this code only works for HDRP versions before 10.x. To make it work on Unity 2020.x and above, you can use the CustomPassUtils.RenderFrom functions.

    There is an example of this here: https://github.com/alelievr/HDRP-Cu...enes/CameraDepthBaking/CameraDepthBake.cs#L45
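    A rough sketch of what that looks like (the exact CustomPassUtils overloads vary between HDRP versions, and bakingCamera/depthTexture here are assumed fields, so treat this as a starting point rather than a verified implementation):
    Code (CSharp):
    protected override void Execute(CustomPassContext ctx)
    {
        // Render the scene depth as seen from bakingCamera into depthTexture
        // (an RTHandle), restricted to cullingMask. Check CustomPassUtils in
        // your HDRP version for the exact parameter list.
        CustomPassUtils.RenderDepthFromCamera(ctx, bakingCamera, depthTexture, ClearFlag.Depth, cullingMask);
    }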
     
    mandicLuka likes this.
  27. mandicLuka

    mandicLuka

    Joined:
    Aug 20, 2018
    Posts:
    13
    antoinel_unity likes this.
  28. lil_sichen

    lil_sichen

    Joined:
    Apr 13, 2017
    Posts:
    10
    Hello. I'm trying to use a custom pass to render an ID texture based on GameObjects' layers. I'm currently trying to use a RenderTexture with GraphicsFormat.R8_UNorm to store the ID texture. For example, the Default layer should output a red value of x, the Rock layer a red value of y, the Water layer a red value of z, and so on...

    Is this possible to achieve with a custom pass?
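    One possible approach, sketched below (untested; the _LayerId property, the idMaterial override shader, and the layer indices are hypothetical): bind the R8_UNorm target once, then draw each layer with an override material after setting a per-layer ID:
    Code (CSharp):
    // Inside CustomPass.Execute, assuming the HDRP 10+ CustomPassContext API
    int[]   layers   = { 0, 8, 4 };            // hypothetical indices for Default, Rock, Water
    float[] layerIds = { 0.1f, 0.5f, 0.9f };   // example ID values written to the red channel

    CoreUtils.SetRenderTarget(ctx.cmd, idTexture, ClearFlag.All); // idTexture: an R8_UNorm RTHandle
    for (int i = 0; i < layers.Length; i++)
    {
        // The hypothetical override shader reads _LayerId and outputs it as its color
        ctx.cmd.SetGlobalFloat("_LayerId", layerIds[i]);
        CustomPassUtils.DrawRenderers(ctx, 1 << layers[i], overrideMaterial: idMaterial);
    }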