
Official Introduction of Render Graph in the Universal Render Pipeline (URP)

Discussion in 'Universal Render Pipeline' started by oliverschnabel, Oct 2, 2023.

  1. oliverschnabel

    oliverschnabel

    Unity Technologies

    Joined:
    Mar 13, 2018
    Posts:
    45
    Hello Unity community,

We are ready to share the new RenderGraph-based version of URP with you! You might have seen it on our roadmap over the last year, and many PRs landing in the Graphics repo. We are close to shipping it, and you can now start trying it out using the latest 2023.3 alpha release! You can expect some changes during the alpha, especially based on your feedback. Currently, it is still hidden behind a scripting define (see the documentation below).

    With this post, we aim to share our work early and discuss with you the next steps. So let us know what you think!

    Why RenderGraph?
Render Graph is a foundational system that automatically optimizes runtime resources for rendering. It simplifies the development of render features in our render pipelines while improving performance over a wide range of potential pipeline configurations. It also reduces the likelihood of the bugs that arise when features are optimized manually.

URP is highly extensible, and with RenderGraph, performance can now be automatically optimized for ScriptableRenderPasses that are added to your project. This leads to better GPU performance when you extend URP. As part of this project, we’ve improved RenderGraph to use the NativeRenderPass API, which optimizes GPU bandwidth on tile-based (mobile) GPUs.

    The benefits for you are:
    • Enhanced Extensibility and Customization: The Render Graph API allows you to access more frame resources in your custom passes and share data between passes. For example, you can now access the G-buffer for your effects.

    • Stricter and Safer API: The new APIs help you ensure your Renderer Features/Custom Passes are both robust across many platforms and optimized automatically. This prevents mistakes that would lead to rendering issues or performance problems.

    • Optimized GPU Performance: While this release focuses on the foundation, and we see potential to improve performance even further in future releases, current results show an average improvement of 1 ms of GPU time per frame, significantly reducing bandwidth waste and improving both device thermal state and battery life. You can also customize URP more easily to get more performance out of it.
    What Changes?
    All URP features have been converted to use RenderGraph under the hood. Apart from a slight difference in performance, nothing changes in your project if you haven’t extended URP.

    The main difference is your access to RenderGraph in the modified ScriptableRenderPass class. This allows you to benefit from the automatic performance optimization that RenderGraph offers when extending URP. However, this new API is tightly coupled to the new foundation, so you’ll need to upgrade your RenderFeatures and ScriptableRenderPass classes. The previous API will not work with RenderGraph.

    You can find details on how to get started here:
    • Render Graph documentation
    • Code Samples can be found in the Package Manager samples (see reply #224)
    • There is a new Custom Post-Processing template that you can access in the assets window through Create > Rendering > URP Post-processing Effect (Renderer Feature with Volume) that highlights how to support RG and Non-RG at the same time (see reply #191)
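    As a rough sketch of what the upgrade looks like (the class and pass names here are hypothetical, and the body is elided), a RenderGraph-era pass overrides RecordRenderGraph instead of the old Execute method, declaring its resources on a builder before setting the render delegate:

```csharp
using UnityEngine.Experimental.Rendering.RenderGraphModule;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

// Hypothetical minimal skeleton: the old Execute(ScriptableRenderContext, ref RenderingData)
// override is replaced by RecordRenderGraph, which records one or more passes into the graph.
class MyUpgradedPass : ScriptableRenderPass
{
    class PassData { /* inputs the render function needs */ }

    public override void RecordRenderGraph(RenderGraph renderGraph, ContextContainer frameData)
    {
        using (var builder = renderGraph.AddRasterRenderPass<PassData>("My Pass", out var passData))
        {
            // Declare the pass's inputs and render targets on the builder here
            // (e.g. builder.UseTexture(...)), then set the function that records commands.
            builder.SetRenderFunc((PassData data, RasterGraphContext context) =>
            {
                // Record raster commands via context.cmd
            });
        }
    }
}
```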

    We’d love to hear from you!
    You can test RenderGraph in the 2023.3 alpha release (Unity 2023.3.0a18 or later; see the documentation above) and see how it works. We’d love to hear how it can benefit your project.

    We encourage thoughts, questions, and constructive feedback as we progress towards the final stages of this feature. Your input is vital to us!

    Stay tuned for upcoming details, updates, and insights related to this feature.

    The render pipeline team

    -----
    Updates:
    09-Oct 2023: Edited min version number to 2023.3.0a8, since this reflects changes shown in the alpha documentation. Added a link to the "Perform a full screen blit in URP" file to the documentation.

    15-Dec 2023: Updated documentation reflecting changes in 2023.3.0a18
    • For new projects, Render Graph in URP is now enabled by default in Unity 2023.3.0a18 and later.
    • Added Compatibility Mode (RenderGraph disabled)
    • New API to Set Global Textures
    • Renamed API UseTextureFragment to SetRenderAttachment in the RenderGraphBuilder
    • Introduction of Unsafe Passes, Updates on the Render Graph Viewer for Debugging
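    For illustration, the UseTextureFragment rename only changes the builder call that binds a render attachment; a sketch of the before/after (note the earlier code samples in this thread still use the pre-a18 name):

```csharp
// Before 2023.3.0a18, binding the destination as color attachment 0:
builder.UseTextureFragment(destination, 0);

// From 2023.3.0a18 on, the same call is named:
builder.SetRenderAttachment(destination, 0);
```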
     
    Last edited: Mar 28, 2024
  2. ElliotB

    ElliotB

    Joined:
    Aug 11, 2013
    Posts:
    295
    I'm excited for the RenderGraph changes but also nervous about the work involved. Are there any examples/tutorials that show how to modify an existing simple render feature, like a full screen blit? Also, do you have a very rough ETA of when this feature will be 'on by default', so that we can plan support for it?

    Cheers,
    Elliot
     
  3. ManueleB

    ManueleB

    Unity Technologies

    Joined:
    Jul 6, 2020
    Posts:
    110
    We are working on adding simple code examples for different common use case scenarios. For a simple blit, here is an example render feature:


    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Experimental.Rendering.RenderGraphModule;
    using UnityEngine.Rendering;
    using UnityEngine.Rendering.Universal;

    public class CopyRenderFeature : ScriptableRendererFeature
    {
        class CopyRenderPass : ScriptableRenderPass
        {
            // This class stores the data needed by the pass, passed as a parameter to the delegate function that executes the pass
            private class PassData
            {
                internal TextureHandle src;
            }

            // This static method executes the pass and is passed as the RenderFunc delegate to the RenderGraph render pass
            static void ExecutePass(PassData data, RasterGraphContext context)
            {
                Blitter.BlitTexture(context.cmd, data.src, new Vector4(1, 1, 0, 0), 0, false);
            }

            // This is where the renderGraph handle can be accessed.
            // Each ScriptableRenderPass can use the RenderGraph handle to add multiple render passes to the render graph
            public override void RecordRenderGraph(RenderGraph renderGraph, ContextContainer frameData)
            {
                string passName = "Copy To Debug Texture";

                // This simple pass copies the active color texture to a new texture. This sample is for API demonstration purposes,
                // so the new texture is not used anywhere else in the frame; you can use the frame debugger to verify its contents.

                // Add a raster render pass to the render graph, specifying the name and the data type that will be passed to the ExecutePass function
                using (var builder = renderGraph.AddRasterRenderPass<PassData>(passName, out var passData))
                {
                    // UniversalResourceData contains all the texture handles used by the renderer, including the active color and depth textures
                    // The active color and depth textures are the main color and depth buffers that the camera renders into
                    UniversalResourceData resourceData = frameData.Get<UniversalResourceData>();

                    // Fill up the passData with the data needed by the pass

                    // Get the active color texture through the frame data, and set it as the source texture for the blit
                    passData.src = resourceData.activeColorTexture;

                    // The destination texture is created here.
                    // It has the same dimensions as the active color texture, but no depth buffer, being a copy of the color texture.
                    // We also disable MSAA, as we don't need multisampled textures for this sample.

                    UniversalCameraData cameraData = frameData.Get<UniversalCameraData>();
                    RenderTextureDescriptor desc = cameraData.cameraTargetDescriptor;
                    desc.msaaSamples = 1;
                    desc.depthBufferBits = 0;

                    TextureHandle destination = UniversalRenderer.CreateRenderGraphTexture(renderGraph, desc, "CopyTexture", false);

                    // We declare the src texture as an input dependency to this pass, via UseTexture()
                    builder.UseTexture(passData.src);

                    // Set up the render target via UseTextureFragment, which is the equivalent of the old cmd.SetRenderTarget
                    builder.UseTextureFragment(destination, 0);

                    // We disable culling for this pass for demonstration purposes; normally this pass would be culled,
                    // since the destination texture is not used anywhere else
                    builder.AllowPassCulling(false);

                    // Assign the ExecutePass function to the render pass delegate, which will be called by the render graph when executing the pass
                    builder.SetRenderFunc((PassData data, RasterGraphContext context) => ExecutePass(data, context));
                }
            }
        }

        CopyRenderPass m_CopyRenderPass;

        /// <inheritdoc/>
        public override void Create()
        {
            m_CopyRenderPass = new CopyRenderPass();

            // Configures where the render pass should be injected.
            m_CopyRenderPass.renderPassEvent = RenderPassEvent.AfterRenderingOpaques;
        }

        // Here you can inject one or multiple render passes in the renderer.
        // This method is called once per camera when setting up the renderer.
        public override void AddRenderPasses(ScriptableRenderer renderer, ref RenderingData renderingData)
        {
            renderer.EnqueuePass(m_CopyRenderPass);
        }
    }
     
    PDE26jjk, _geo__, JesOb and 3 others like this.
  4. ManueleB

    ManueleB

    Unity Technologies

    Joined:
    Jul 6, 2020
    Posts:
    110
    And this one is a Blit pass using a custom material/shader:

    Render Feature:

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Experimental.Rendering.RenderGraphModule;
    using UnityEngine.Rendering;
    using UnityEngine.Rendering.Universal;
    using UnityEngine.Serialization;

    public class BlitWithMaterialRenderFeature : ScriptableRendererFeature
    {
        class BlitWithMaterialPass : ScriptableRenderPass
        {
            private Material m_BlitMaterial;

            public BlitWithMaterialPass(Material blitMaterial)
            {
                m_BlitMaterial = blitMaterial;
            }

            // This class stores the data needed by the pass, passed as a parameter to the delegate function that executes the pass
            private class PassData
            {
                internal TextureHandle src;
                internal TextureHandle dst;
                internal Material blitMaterial;
            }

            // This static method executes the pass and is passed as the RenderFunc delegate to the RenderGraph render pass
            static void ExecutePass(PassData data, RasterGraphContext context)
            {
                Blitter.BlitTexture(context.cmd, data.src, new Vector4(1, 1, 0, 0), data.blitMaterial, 0);
            }

            private void InitPassData(RenderGraph renderGraph, ContextContainer frameData, ref PassData passData)
            {
                // Fill up the passData with the data needed by the passes

                // UniversalResourceData contains all the texture handles used by the renderer, including the active color and depth textures
                // The active color and depth textures are the main color and depth buffers that the camera renders into
                UniversalResourceData resourceData = frameData.Get<UniversalResourceData>();

                // The destination texture is created here.
                // It has the same dimensions as the active color texture, but no depth buffer, being a copy of the color texture.
                // We also disable MSAA, as we don't need multisampled textures for this sample.

                UniversalCameraData cameraData = frameData.Get<UniversalCameraData>();
                RenderTextureDescriptor desc = cameraData.cameraTargetDescriptor;
                desc.msaaSamples = 1;
                desc.depthBufferBits = 0;

                TextureHandle destination = UniversalRenderer.CreateRenderGraphTexture(renderGraph, desc, "BlitMaterialTexture", false);

                passData.src = resourceData.activeColorTexture;
                passData.dst = destination;
                passData.blitMaterial = m_BlitMaterial;
            }

            // This is where the renderGraph handle can be accessed.
            // Each ScriptableRenderPass can use the RenderGraph handle to add multiple render passes to the render graph
            public override void RecordRenderGraph(RenderGraph renderGraph, ContextContainer frameData)
            {
                string passName = "Blit With Material";

                // This simple pass copies the active color texture to a new texture using a custom material. This sample is for API demonstration purposes,
                // so the new texture is not used anywhere else in the frame; you can use the frame debugger to verify its contents.

                // Add a raster render pass to the render graph, specifying the name and the data type that will be passed to the ExecutePass function
                using (var builder = renderGraph.AddRasterRenderPass<PassData>(passName, out var passData))
                {
                    // Initialize the pass data
                    InitPassData(renderGraph, frameData, ref passData);

                    // We declare the src texture as an input dependency to this pass, via UseTexture()
                    builder.UseTexture(passData.src);

                    // Set up the render target via UseTextureFragment, which is the equivalent of the old cmd.SetRenderTarget
                    builder.UseTextureFragment(passData.dst, 0);

                    // We disable culling for this pass for demonstration purposes; normally this pass would be culled,
                    // since the destination texture is not used anywhere else
                    builder.AllowPassCulling(false);

                    // Assign the ExecutePass function to the render pass delegate, which will be called by the render graph when executing the pass
                    builder.SetRenderFunc((PassData data, RasterGraphContext context) => ExecutePass(data, context));
                }
            }
        }

        BlitWithMaterialPass m_BlitWithMaterialPass;

        public Material m_BlitColorMaterial;

        /// <inheritdoc/>
        public override void Create()
        {
            m_BlitWithMaterialPass = new BlitWithMaterialPass(m_BlitColorMaterial);

            // Configures where the render pass should be injected.
            m_BlitWithMaterialPass.renderPassEvent = RenderPassEvent.BeforeRenderingTransparents;
        }

        // Here you can inject one or multiple render passes in the renderer.
        // This method is called once per camera when setting up the renderer.
        public override void AddRenderPasses(ScriptableRenderer renderer, ref RenderingData renderingData)
        {
            renderer.EnqueuePass(m_BlitWithMaterialPass);
        }
    }
    Shader:

    Code (CSharp):
    Shader "BlitWithMaterial"
    {
       SubShader
       {
           Tags { "RenderType"="Opaque" "RenderPipeline" = "UniversalPipeline"}
           ZWrite Off Cull Off
           Pass
           {
               Name "BlitWithMaterialPass"

               HLSLPROGRAM
               #include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl"
               #include "Packages/com.unity.render-pipelines.core/Runtime/Utilities/Blit.hlsl"

               #pragma vertex Vert
               #pragma fragment Frag

               // Our frag function takes as input a struct containing the screen-space coordinate we use to sample our texture.
               // It writes to SV_Target0; this has to match the index set in the UseTextureFragment(sourceTexture, 0, …) call in our render pass script.
               float4 Frag(Varyings input) : SV_Target0
               {
                   // This is needed so we account for XR platform differences in how they handle texture arrays
                   UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX(input);

                   // Sample the texture using SAMPLE_TEXTURE2D_X_LOD
                   float2 uv = input.texcoord.xy;
                   half4 color = SAMPLE_TEXTURE2D_X_LOD(_BlitTexture, sampler_LinearRepeat, uv, _BlitMipLevel);

                   // Modify the sampled color
                   return half4(0, 1, 0, 1) * color;
               }

               ENDHLSL
           }
       }
    }
     
    Last edited: Oct 2, 2023
    Kirsche, AljoshaD and ElliotB like this.
  5. ElliotB

    ElliotB

    Joined:
    Aug 11, 2013
    Posts:
    295
    Is there going to be a period where both routes are supported, or is the intention to move URP wholly to RenderGraph when the time comes? I'd seen the GitHub repo previously, where it had elements of both, but it wasn't clear if that was just while getting things running (there was a fair bit of duplication as a result).
     
  6. AljoshaD

    AljoshaD

    Unity Technologies

    Joined:
    May 27, 2019
    Posts:
    252
    You can also try out the KeepFrame sample in the URP package samples. It's upgraded to RenderGraph.

    The idea is to have this on by default in 23.3. We're still building confidence before making that decision, though.

    It's indeed the intention to move URP wholly to RenderGraph when the time comes.
     
  7. kripto289

    kripto289

    Joined:
    Feb 21, 2013
    Posts:
    509
    I hope "ScriptableRendererFeature" will be removed? In HDRP I can use a simple "volume" feature at runtime, without manually adding countless features through the editor.
    P.S. Right now the only way is to use "UniversalAdditionalCameraData.scriptableRenderer.EnqueuePass".

    I hope that with render graph I can use the same universal custom pass API for URP/HDRP?
    Or will there be two different versions again?

    If you plan to completely break the old URP API, I will be glad if there is a single API for URP and HDRP. I'm begging.
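    A minimal sketch of that runtime-enqueue approach (assuming the pass instance is created and assigned elsewhere; hooking the per-camera callback is one common way to do it, not necessarily the only one):

```csharp
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

// Sketch: enqueue a pass at runtime without creating a ScriptableRendererFeature asset,
// by hooking URP's per-camera setup callback.
public class RuntimePassInjector : MonoBehaviour
{
    ScriptableRenderPass m_Pass; // hypothetical: your pass instance, assigned elsewhere

    void OnEnable()  => RenderPipelineManager.beginCameraRendering += OnBeginCamera;
    void OnDisable() => RenderPipelineManager.beginCameraRendering -= OnBeginCamera;

    void OnBeginCamera(ScriptableRenderContext context, Camera camera)
    {
        if (m_Pass != null)
            camera.GetUniversalAdditionalCameraData().scriptableRenderer.EnqueuePass(m_Pass);
    }
}
```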
     
    SAMYTHEBIGJUICY likes this.
  8. phil_lira

    phil_lira

    Unity Technologies

    Joined:
    Dec 17, 2014
    Posts:
    584
    Hello everyone, here's a link to an example on How to Blit using Render Graph API and Blitter API. Let us know about any API feedback and we will update the API and docs.
     

    AljoshaD and ElliotB like this.
  9. ElliotB

    ElliotB

    Joined:
    Aug 11, 2013
    Posts:
    295
    The thing that worries me most about API changes to URP is if it causes a loss of functionality. If we identify things that you can't do with the new API that you could do with the old, will the team be receptive to those changes? Historically it feels like most URP suggestions are ignored - like the pipeline is going wherever it's been decided to go, regardless of what users expect from it.
     
  10. AljoshaD

    AljoshaD

    Unity Technologies

    Joined:
    May 27, 2019
    Posts:
    252
    The render pass interface RecordRenderGraph(RG, ContextContainer frameData) has been designed so it can be adopted by HDRP. HDRP currently doesn't expose RenderGraph in the HDRP CustomPass. In 23, HDRP will not adopt it yet, but for the next version we plan to unify the extension APIs using this new interface.
     
    kripto289 likes this.
  11. AljoshaD

    AljoshaD

    Unity Technologies

    Joined:
    May 27, 2019
    Posts:
    252
    Yes, very much so. Our goal is to have no functional regressions; you should be able to do more with the new API, not less. Fixing any regression you find is a top priority. However, the old API was not as thoroughly designed and offers far fewer guardrails, so some things might have worked by accident (on some platforms) that the stricter API now prevents.
     
    nasos_333 and ElliotB like this.
  12. ManueleB

    ManueleB

    Unity Technologies

    Joined:
    Jul 6, 2020
    Posts:
    110
    You should be able to do things with the new API that were not possible before, e.g. accessing the actual RTHandle of every single resource, or using framebuffer fetch and native render passes enabled by default on TBDR devices.

    As Aljosha said, there might have been "undefined behaviours"/hacks that worked out of luck before, being undefined or technically incorrect. In those cases you would need to find a proper way to implement them, since the API is now much safer and, as a consequence, stricter.

    Of course, if any valid functionality is missing, our priority is to fix it ASAP, and that's why we are asking for feedback ahead of time.
     
    Last edited: Oct 3, 2023
    DrViJ, _geo__ and ElliotB like this.
  13. ManueleB

    ManueleB

    Unity Technologies

    Joined:
    Jul 6, 2020
    Posts:
    110
    Adding a few more preview samples:

    How to draw geometry using RendererLists + RenderGraph (replacing the old cmd.DrawRenderers)


    Code (CSharp):
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.Experimental.Rendering.RenderGraphModule;
    using UnityEngine.Rendering;
    using UnityEngine.Rendering.RendererUtils;
    using UnityEngine.Rendering.Universal;

    public class RenderListRenderFeature : ScriptableRendererFeature
    {
        class RendererListPass : ScriptableRenderPass
        {
            // Layer mask used to filter objects to put in the renderer list
            private LayerMask m_LayerMask;

            // List of shader tags used to build the renderer list
            private List<ShaderTagId> m_ShaderTagIdList = new List<ShaderTagId>();

            public RendererListPass(LayerMask layerMask)
            {
                m_LayerMask = layerMask;
            }

            // This class stores the data needed by the pass, passed as a parameter to the delegate function that executes the pass
            private class PassData
            {
                public RendererListHandle rendererListHandle;
            }

            // Sample utility method that showcases how to create a renderer list via the RenderGraph API
            private void InitRendererLists(ContextContainer frameData, ref PassData passData, RenderGraph renderGraph)
            {
                // Access the relevant frame data from the Universal Render Pipeline
                UniversalRenderingData universalRenderingData = frameData.Get<UniversalRenderingData>();
                UniversalCameraData cameraData = frameData.Get<UniversalCameraData>();
                UniversalLightData lightData = frameData.Get<UniversalLightData>();

                var sortFlags = cameraData.defaultOpaqueSortFlags;
                RenderQueueRange renderQueueRange = RenderQueueRange.opaque;
                FilteringSettings filterSettings = new FilteringSettings(renderQueueRange, m_LayerMask);

                ShaderTagId[] forwardOnlyShaderTagIds = new ShaderTagId[]
                {
                    new ShaderTagId("UniversalForwardOnly"),
                    new ShaderTagId("UniversalForward"),
                    new ShaderTagId("SRPDefaultUnlit"), // Legacy shaders (no gbuffer pass) are considered forward-only for backward compatibility
                    new ShaderTagId("LightweightForward") // Legacy shaders (no gbuffer pass) are considered forward-only for backward compatibility
                };

                m_ShaderTagIdList.Clear();

                foreach (ShaderTagId sid in forwardOnlyShaderTagIds)
                    m_ShaderTagIdList.Add(sid);

                DrawingSettings drawSettings = RenderingUtils.CreateDrawingSettings(m_ShaderTagIdList, universalRenderingData, cameraData, lightData, sortFlags);

                var param = new RendererListParams(universalRenderingData.cullResults, drawSettings, filterSettings);
                passData.rendererListHandle = renderGraph.CreateRendererList(param);
            }

            // This static method executes the pass and is passed as the RenderFunc delegate to the RenderGraph render pass
            static void ExecutePass(PassData data, RasterGraphContext context)
            {
                context.cmd.ClearRenderTarget(RTClearFlags.Color, Color.green, 1, 0);

                context.cmd.DrawRendererList(data.rendererListHandle);
            }

            // This is where the renderGraph handle can be accessed.
            // Each ScriptableRenderPass can use the RenderGraph handle to add multiple render passes to the render graph
            public override void RecordRenderGraph(RenderGraph renderGraph, ContextContainer frameData)
            {
                string passName = "RenderList Render Pass";

                // This simple pass clears the current active color texture, then renders the scene geometry associated with the m_LayerMask layer.
                // Add scene geometry to your own custom layers and experiment with switching the layer mask in the render feature UI.
                // You can use the frame debugger to inspect the pass output.

                // Add a raster render pass to the render graph, specifying the name and the data type that will be passed to the ExecutePass function
                using (var builder = renderGraph.AddRasterRenderPass<PassData>(passName, out var passData))
                {
                    // UniversalResourceData contains all the texture handles used by the renderer, including the active color and depth textures
                    // The active color and depth textures are the main color and depth buffers that the camera renders into
                    UniversalResourceData resourceData = frameData.Get<UniversalResourceData>();

                    // Fill up the passData with the data needed by the pass
                    InitRendererLists(frameData, ref passData, renderGraph);

                    // Make sure the renderer list is valid
                    if (!passData.rendererListHandle.IsValid())
                        return;

                    // We declare the RendererList we just created as an input dependency to this pass, via UseRendererList()
                    builder.UseRendererList(passData.rendererListHandle);

                    // Set up the render targets via UseTextureFragment and UseTextureFragmentDepth, the equivalent of the old cmd.SetRenderTarget(color, depth)
                    builder.UseTextureFragment(resourceData.activeColorTexture, 0);
                    builder.UseTextureFragmentDepth(resourceData.activeDepthTexture, IBaseRenderGraphBuilder.AccessFlags.Write);

                    // Assign the ExecutePass function to the render pass delegate, which will be called by the render graph when executing the pass
                    builder.SetRenderFunc((PassData data, RasterGraphContext context) => ExecutePass(data, context));
                }
            }
        }

        RendererListPass m_ScriptablePass;

        public LayerMask m_LayerMask;

        /// <inheritdoc/>
        public override void Create()
        {
            m_ScriptablePass = new RendererListPass(m_LayerMask);

            // Configures where the render pass should be injected.
            m_ScriptablePass.renderPassEvent = RenderPassEvent.AfterRenderingOpaques;
        }

        // Here you can inject one or multiple render passes in the renderer.
        // This method is called once per camera when setting up the renderer.
        public override void AddRenderPasses(ScriptableRenderer renderer, ref RenderingData renderingData)
        {
            renderer.EnqueuePass(m_ScriptablePass);
        }
    }
     
    nasos_333, customphase and DrViJ like this.
  14. ManueleB

    ManueleB

    Unity Technologies

    Joined:
    Jul 6, 2020
    Posts:
    110
    Framebuffer fetch sample:

    Feature:

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Experimental.Rendering.RenderGraphModule;
    using UnityEngine.Rendering;
    using UnityEngine.Rendering.Universal;
    using UnityEngine.Serialization;

    public class FrameBufferFetchRenderFeature : ScriptableRendererFeature
    {
        class FrameBufferFetchPass : ScriptableRenderPass
        {
            private Material m_BlitMaterial;
            private Material m_FBFetchMaterial;

            public FrameBufferFetchPass(Material blitMaterial, Material fbFetchMaterial)
            {
                m_BlitMaterial = blitMaterial;
                m_FBFetchMaterial = fbFetchMaterial;
            }

            // This class stores the data needed by the pass, passed as a parameter to the delegate function that executes the pass
            private class PassData
            {
                internal TextureHandle src;
                internal Material material;
            }

            // This static method executes the pass and is passed as the RenderFunc delegate to the RenderGraph render pass
            static void ExecuteBlitPass(PassData data, RasterGraphContext context)
            {
                Blitter.BlitTexture(context.cmd, data.src, new Vector4(1, 1, 0, 0), data.material, 0);
            }

            // This static method executes the pass and is passed as the RenderFunc delegate to the RenderGraph render pass
            static void ExecuteFBFetchPass(PassData data, RasterGraphContext context)
            {
                context.cmd.DrawProcedural(Matrix4x4.identity, data.material, 1, MeshTopology.Triangles, 3, 1, null);

                // Other ways to draw a fullscreen triangle/quad:
                //CoreUtils.DrawFullScreen(context.cmd, data.fbFetchMaterial, null, 1);
                //Blitter.BlitTexture(context.cmd, new Vector4(1, 1, 0, 0), data.fbFetchMaterial, 1);
            }

            private void BlitPass(RenderGraph renderGraph, ContextContainer frameData, TextureHandle destination)
            {
                string passName = "InitialBlitPass";

                // This simple pass copies the active color texture to a new texture using a custom material. This sample is for API demonstration purposes,
                // so the new texture is not used anywhere else in the frame; you can use the frame debugger to verify its contents.

                // Add a raster render pass to the render graph, specifying the name and the data type that will be passed to the ExecutePass function
    52.             using (var builder = renderGraph.AddRasterRenderPass<PassData>(passName, out var passData))
    53.             {
    54.                 // UniversalResourceData contains all the texture handles used by the renderer, including the active color and depth textures
    55.                 // The active color and depth textures are the main color and depth buffers that the camera renders into
    56.                 UniversalResourceData resourceData = frameData.Get<UniversalResourceData>();
    57.                
    58.                 // Get the active color texture through the frame data, and set it as the source texture for the blit
    59.                 passData.src = resourceData.activeColorTexture;
    60.                 passData.material = m_BlitMaterial;
    61.                
    62.                 // We declare the src texture as an input dependency to this pass, via UseTexture()
    63.                 builder.UseTexture(passData.src);
    64.  
    65.                 // Setup as a render target via UseTextureFragment, which is the equivalent of using the old cmd.SetRenderTarget
    66.                 builder.UseTextureFragment(destination, 0);
    67.                
    68.                 // We disable culling for this pass for the demonstrative purpose of this sample, as normally this pass would be culled,
    69.                 // since the destination texture is not used anywhere else
    70.                 builder.AllowPassCulling(false);
    71.  
    72.                 // Assign the ExecutePass function to the render pass delegate, which will be called by the render graph when executing the pass
    73.                 builder.SetRenderFunc((PassData data, RasterGraphContext context) => ExecuteBlitPass(data, context));
    74.             }
    75.         }
    76.        
    77.         private void FBFetchPass(RenderGraph renderGraph, ContextContainer frameData, TextureHandle source, TextureHandle destination)
    78.         {
    79.             string passName = "FrameBufferFetchPass";
    80.            
    81.             // This simple pass copies the target of the previous pass to a new texture using a custom material and framebuffer fetch. This sample is for API demonstrative purposes,
    82.             // so the new texture is not used anywhere else in the frame, you can use the frame debugger to verify its contents.
    83.  
    84.             // add a raster render pass to the render graph, specifying the name and the data type that will be passed to the ExecutePass function
    85.             using (var builder = renderGraph.AddRasterRenderPass<PassData>(passName, out var passData))
    86.             {
    87.                 // Fill the pass data
    88.                 passData.material = m_FBFetchMaterial;
    89.                
    90.                 // We declare the src texture as an input dependency to this pass, via UseTexture()
    91.                 //builder.UseTexture(passData.blitDest);
    92.                 builder.UseTextureFragmentInput(source, 0, IBaseRenderGraphBuilder.AccessFlags.Read);
    93.  
    94.                 // Setup as a render target via UseTextureFragment, which is the equivalent of using the old cmd.SetRenderTarget
    95.                 builder.UseTextureFragment(destination, 0);
    96.                
    97.                 // We disable culling for this pass for the demonstrative purpose of this sample, as normally this pass would be culled,
    98.                 // since the destination texture is not used anywhere else
    99.                 builder.AllowPassCulling(false);
    100.  
    101.                 // Assign the ExecutePass function to the render pass delegate, which will be called by the render graph when executing the pass
    102.                 builder.SetRenderFunc((PassData data, RasterGraphContext context) => ExecuteFBFetchPass(data, context));
    103.             }
    104.         }
    105.        
    106.         // This is where the renderGraph handle can be accessed.
    107.         // Each ScriptableRenderPass can use the RenderGraph handle to add multiple render passes to the render graph
    108.         public override void RecordRenderGraph(RenderGraph renderGraph, ContextContainer frameData)
    109.         {
    110.             // This pass showcases how to implement framebuffer fetch: this is an advanced TBDR GPU optimization
    111.             // that allows subpasses to read the output of previous subpasses directly from the framebuffer, reducing greatly the bandwidth usage.
    112.             // The first pass BlitPass simply copies the Camera Color in a temporary render target, the second pass FBFetchPass copies the temporary render target
    113.             // to another render target using framebuffer fetch.
    114.             // As a result, the passes are merged (you can verify in the RenderGraph Visualizer) and the bandwidth usage is reduced, since we can discard the temporary render target.
    115.  
    116.             // The destination textures are created here,
    117.             // the texture is created with the same dimensions as the active color texture, but with no depth buffer, being a copy of the color texture
    118.             // we also disable MSAA as we don't need multisampled textures for this sample.
    119.                
    120.             UniversalCameraData cameraData = frameData.Get<UniversalCameraData>();
    121.             RenderTextureDescriptor desc = cameraData.cameraTargetDescriptor;
    122.             desc.msaaSamples = 1;
    123.             desc.depthBufferBits = 0;
    124.                
    125.             TextureHandle blitDestination = UniversalRenderer.CreateRenderGraphTexture(renderGraph, desc, "BlitDestTexture", false);
    126.             TextureHandle fbFetchDestination = UniversalRenderer.CreateRenderGraphTexture(renderGraph, desc, "FBFetchDestTexture", false);
    127.            
    128.             BlitPass(renderGraph, frameData, blitDestination);
    129.            
    130.             FBFetchPass(renderGraph, frameData, blitDestination, fbFetchDestination);
    131.         }
    132.     }
    133.  
    134.     FrameBufferFetchPass m_FbFetchPass;
    135.    
    136.     public Material m_BlitColorMaterial;
    137.     public Material m_FBFetchMaterial;
    138.  
    139.     /// <inheritdoc/>
    140.     public override void Create()
    141.     {
    142.         m_FbFetchPass = new FrameBufferFetchPass(m_BlitColorMaterial, m_FBFetchMaterial);
    143.  
    144.         // Configures where the render pass should be injected.
    145.         m_FbFetchPass.renderPassEvent = RenderPassEvent.BeforeRenderingTransparents;
    146.     }
    147.  
    148.     // Here you can inject one or multiple render passes in the renderer.
    149.     // This method is called when setting up the renderer once per-camera.
    150.     public override void AddRenderPasses(ScriptableRenderer renderer, ref RenderingData renderingData)
    151.     {
    152.         renderer.EnqueuePass(m_FbFetchPass);
    153.     }
    154. }
    155.  
    156.  
    157.  
    Shader:


    Code (CSharp):
    1. Shader "FrameBufferFetch"
    3. {
    4.    SubShader
    5.    {
    6.        Tags { "RenderType"="Opaque" "RenderPipeline" = "UniversalPipeline"}
    7.        ZWrite Off Cull Off
    8.        Pass
    9.        {
    10.            Name "InitialBlit"
    11.  
    12.            HLSLPROGRAM
    13.            #include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl"
    14.            #include "Packages/com.unity.render-pipelines.core/Runtime/Utilities/Blit.hlsl"
    15.  
    16.            #pragma vertex Vert
    17.            #pragma fragment Frag
    18.  
    19.            // Our frag function takes as input a struct containing the screen-space coordinate we use to sample our texture. It writes to SV_Target0, which has to match the index set in the UseTextureFragment(sourceTexture, 0, …) call in our render pass script.
    20.            float4 Frag(Varyings input) : SV_Target0
    21.            {
    22.                // this is needed so we account for XR platform differences in how they handle texture arrays
    23.                UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX(input);
    24.  
    25.                // sample the texture using the SAMPLE_TEXTURE2D_X_LOD
    26.                float2 uv = input.texcoord.xy;
    27.                half4 color = SAMPLE_TEXTURE2D_X_LOD(_BlitTexture, sampler_LinearRepeat, uv, _BlitMipLevel);
    28.              
    29.                // Modify the sampled color
    30.                return color;
    31.            }
    32.  
    33.            ENDHLSL
    34.        }
    35.      
    36.        Tags { "RenderType"="Opaque" "RenderPipeline" = "UniversalPipeline"}
    37.        ZWrite Off Cull Off
    38.        Pass
    39.        {
    40.            Name "FrameBufferFetch"
    41.  
    42.            HLSLPROGRAM
    43.            #include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl"
    44.            #include "Packages/com.unity.render-pipelines.core/Runtime/Utilities/Blit.hlsl"
    45.  
    46.            #pragma vertex Vert
    47.            #pragma fragment Frag
    48.  
    49.            FRAMEBUFFER_INPUT_X_HALF(0);
    50.  
    51.            // Our frag function takes as input a struct containing the screen-space coordinate we use to sample our texture. It writes to SV_Target0, which has to match the index set in the UseTextureFragment(sourceTexture, 0, …) call in our render pass script.
    52.            float4 Frag(Varyings input) : SV_Target0
    53.            {
    54.                // this is needed so we account for XR platform differences in how they handle texture arrays
    55.                UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX(input);
    56.  
    57.                // read the current pixel from the framebuffer
    58.                float2 uv = input.texcoord.xy;
    59.                half4 color = LOAD_FRAMEBUFFER_X_INPUT(0, input.positionCS.xy);
    60.              
    61.                // Modify the sampled color
    62.                return half4(0,0,1,1) * color;
    63.            }
    64.  
    65.            ENDHLSL
    66.        }
    67.    }
    68. }
    69.  
     
  15. JesOb

    JesOb

    Joined:
    Sep 3, 2012
    Posts:
    1,110
    How do we draw geometry without hardcoded culling? How can we supply our own list of renderers/meshes/submeshes?
     
  16. ManueleB

    ManueleB

    Unity Technologies

    Joined:
    Jul 6, 2020
    Posts:
    110
  17. JesOb

    JesOb

    Joined:
    Sep 3, 2012
    Posts:
    1,110
    It will not add any lighting or anything from the scene; it is not a replacement for drawing renderers.
     
  18. ManueleB

    ManueleB

    Unity Technologies

    Joined:
    Jul 6, 2020
    Posts:
    110
    RendererList is the new API used for this by both URP and HDRP, and it gives you the same functionality as DrawRenderers

    https://docs.unity3d.com/ScriptReference/Rendering.RendererList.html

    note that this is not a RG related change, URP has been using RendererLists since 22
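    For reference, here is a condensed sketch of how a RendererList is created and consumed inside RecordRenderGraph. This follows the shape of the RendererListPass sample earlier in this thread; the exact overloads (in particular CreateDrawingSettings taking the frame data containers) may still change during the alpha, so treat the names as assumptions:

    Code (CSharp):
    ```csharp
    // Sketch only: assumes the 23.3 alpha RenderGraph APIs used in the samples above.
    // Runs inside RecordRenderGraph, within the AddRasterRenderPass builder scope.
    UniversalRenderingData renderingData = frameData.Get<UniversalRenderingData>();
    UniversalCameraData cameraData = frameData.Get<UniversalCameraData>();
    UniversalLightData lightData = frameData.Get<UniversalLightData>();

    // Build drawing/filtering settings on top of the culling results URP already produced.
    var shaderTag = new ShaderTagId("UniversalForward");
    var sortFlags = cameraData.defaultOpaqueSortFlags;
    var drawSettings = RenderingUtils.CreateDrawingSettings(shaderTag, renderingData, cameraData, lightData, sortFlags);
    var filterSettings = new FilteringSettings(RenderQueueRange.opaque, m_LayerMask);

    // Create the RendererList handle and declare it as a dependency of this pass.
    var listParams = new RendererListParams(renderingData.cullResults, drawSettings, filterSettings);
    passData.rendererList = renderGraph.CreateRendererList(listParams);
    builder.UseRendererList(passData.rendererList);

    // Then, in the render function:
    // context.cmd.DrawRendererList(data.rendererList);
    ```

    The filtering settings (layer mask, render queue range) are where you control which renderers end up in the list, instead of issuing per-object draws yourself.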
     
  19. Baste

    Baste

    Joined:
    Jan 24, 2013
    Posts:
    6,342
    Is there any plan to make any API alternatives that are a bit less boilerplaty? I guess I could just copy-paste your example, but it feels like a bit much to require over 50 lines of code (without whitespace or comments!) to implement "get a named asset that blits a material to the screen".

    All in all this looks good, but I'd love to see some higher level features. That'd achieve two things:
    - Easier to find simple versions of the feature, so using this is achievable without deep knowledge of a pretty low-level API
    - A requirement for you to maintain the high level feature so we don't have to rewrite our code every Unity version update if we just want a simple blit to screen.
     
  20. ManueleB

    ManueleB

    Unity Technologies

    Joined:
    Jul 6, 2020
    Posts:
    110
    Yeah, as you can see, most of the RG setup code across the different samples I posted is 90% the same. We plan to add as many high-level wrappers as possible for the most common operations, so eventually the average user render feature should become a few lines of code, i.e. Blit(rg, source, target, material). This way "high level", non-advanced users ideally shouldn't be exposed to the RG itself at all.

    The low-level API is more verbose and powerful and allows for much more customization, but the next step on our side will definitely be about making it more user friendly.

    Since this is a call for early feedback, we just want users to start using the low-level API and give feedback on that.
     
    saskenergy and Baste like this.
  21. sacb0y

    sacb0y

    Joined:
    May 9, 2016
    Posts:
    937
    Yay more coexistence!
     
    arkano22 and LightJockey like this.
  22. kripto289

    kripto289

    Joined:
    Feb 21, 2013
    Posts:
    509
    Is it possible to mix compute shader dispatches and other raster commands in the same pass?
    I don't know why there are separate "RasterCommandBuffer" and "ComputeCommandBuffer" types, and there are no examples of "ComputeCommandBuffer".

    for example, the pseudocode of what I am currently using

    cmd.DrawProcedural //rendering shoreline mask
    cmd.DispatchCompute //compute foam relative to the shoreline mask
    cmd.DispatchCompute //compute blur pass
    cmd.BlitTriangle //render foam to screen
     
  23. ManueleB

    ManueleB

    Unity Technologies

    Joined:
    Jul 6, 2020
    Posts:
    110
    For us to be able to merge subpasses optimally and guarantee an optimal RT setup, we have raster passes that only allow rasterization operations (Draw, etc.) and compute passes that allow dispatches. So you shouldn't be able to do a dispatch in a raster pass, since that would break the RenderPass setup (in terms of Vulkan subpasses, where a render pass is made of a set of subpasses). The API is therefore stricter and requires you to use the type of pass that matches your need.

    In your case you can do what you need by scheduling RasterPass -> ComputePass -> RasterPass

    pseudo code:

    Code (CSharp):
    1. using (var builder = renderGraph.AddRasterRenderPass<PassData1>("DrawProceduralPass", out var passData))
    2.             {
    3.                 // initialize pass
    4.                
    5.                 // ...
    6.                
    7.                 builder.SetRenderFunc((PassData1 data, RasterGraphContext context) =>
    8.                 {
    9.                     context.cmd.DrawProcedural(...);
    10.                 });
    11.             }
    12.  
    13.             using (var builder = renderGraph.AddComputePass<PassData2>("DispatchesPass", out var passData))
    14.             {
    15.                 // initialize pass
    16.                
    17.                 // ...
    18.                
    19.                 builder.SetRenderFunc((PassData2 data, ComputeGraphContext context) =>
    20.                 {
    21.                     // do stuff
    22.                    
    23.                     context.cmd.DispatchCompute(...);
    24.                    
    25.                     // do more stuff
    26.                    
    27.                     context.cmd.DispatchCompute(...);
    28.                 });
    29.             }
    30.  
    31.             using (var builder = renderGraph.AddRasterRenderPass<PassData3>("BlitPass", out var passData))
    32.             {
    33.                 // initialize pass
    34.                
    35.                 // ...
    36.                
    37.                 builder.SetRenderFunc((PassData3 data, RasterGraphContext context) =>
    38.                 {
    39.                     Blitter.BlitTexture(...);
    40.                 });
    41.             }

    We are also adding samples to show ComputePass usages
     
    PolyCrusher, kripto289 and AljoshaD like this.
  24. optimise

    optimise

    Joined:
    Jan 22, 2014
    Posts:
    2,133
    Hi. Any plan to utilize Burst, or even better, more DOTS tech, to improve the performance of Render Graph to the next level?
     
  25. optimise

    optimise

    Joined:
    Jan 22, 2014
    Posts:
    2,133
    GPU performance still needs a lot more optimization. Currently it's still extremely slow on mobile platforms, especially Android. On an Android Mi 9T Pro (Snapdragon 855), GPU stalling is around 13ms+. See CASE IN-56966 for a repro project.

    (attached profiler screenshot)
     
  26. AljoshaD

    AljoshaD

    Unity Technologies

    Joined:
    May 27, 2019
    Posts:
    252
    No. We are adding caching of the compiled graph so the RG compiler only needs to run when you modify the graph, not every frame. This removes most of the per-frame RG CPU cost.
     
    optimise, JesOb, ElliotB and 3 others like this.
  27. optimise

    optimise

    Joined:
    Jan 22, 2014
    Posts:
    2,133
    I see. How about the GPU cost? Is it possible to significantly reduce the GPU cost for Case IN-56966 in post #25 too?

     
  28. AljoshaD

    AljoshaD

    Unity Technologies

    Joined:
    May 27, 2019
    Posts:
    252
    That case seems to be specific to entities graphics. It is unrelated to RenderGraph. The benefits should be the same.
     
  29. Kabinet13

    Kabinet13

    Joined:
    Jun 13, 2019
    Posts:
    152
    Is this in HDRP too? Seems like more or less free performance.
     
  30. AljoshaD

    AljoshaD

    Unity Technologies

    Joined:
    May 27, 2019
    Posts:
    252
    Yes, the compiler caching will work for both URP and HDRP. The compiler time that is removed is not that significant on high end platforms with fast CPUs though, somewhere between 0.1-0.3ms, but every bit helps of course.
     
    ElevenGame likes this.
  31. AljoshaD

    AljoshaD

    Unity Technologies

    Joined:
    May 27, 2019
    Posts:
    252
    Hi everyone,
    an update on our progress: the stabilization work on RenderGraph is progressing well. We decided to ship the new and improved URP version with RenderGraph in Unity 23.3.a13; the RenderGraph checkbox is visible in the global settings from that version. Currently, RG is still off by default, but we expect to have RG on by default around the early 23.3 beta. Since we are still in the alpha phase, we are making some last changes to the APIs based on the feedback we have gotten from internal and external testing. Although we are entering the last weeks before beta, we'd still like to hear your feedback if you have any. You can see the last bits landing in the graphics repo.

    We are still working on the RG compiler caching. We first needed to refactor the compiler, which we are completing now. Additionally, we're further optimizing the main-thread CPU cost of URP to make room for RenderGraph. Next, we'll start adding more helper functions to reduce the amount of boilerplate you need to write with the new lower-level API.
     
    Last edited: Nov 6, 2023
    colin299, ElliotB, ManueleB and 9 others like this.
  32. nasos_333

    nasos_333

    Joined:
    Feb 13, 2013
    Posts:
    13,471

    Hi,

    I am trying to use the new Blitter method for full-screen rendering, and I cannot use it in place of the mesh-based method, since it does not seem to be equivalent: I cannot write to the render texture's transparency.

    I need to upgrade to the new Blitter due to this warning:
    warning CS0618: 'RenderingUtils.fullscreenMesh' is obsolete: 'Use Blitter.BlitCameraTexture instead of CommandBuffer.DrawMesh(fullscreenMesh, ...)'

    I followed the notes below, but even in this simple sample, when I write the result to a render texture with transparency and change the alpha in the shader, the result is always opaque:
    https://docs.unity3d.com/Packages/c...rsal@14.0/manual/customize/blit-overview.html

    I attach a photo showing how I try to pass alpha to the render texture, and the result, with a small modification of the code suggested in the Unity documentation above.

    Is Render Graph going to resolve this issue and make everything compatible? Also, will the Blitter get an upgrade to work correctly with transparency, so the current renderer features can be converted?

    Other relevant threads:
    https://forum.unity.com/threads/hdr-and-alpha-in-urp-not-possible.1145996/
    https://forum.unity.com/threads/scr...is-inaccessible-in-custom-renderpass.1356629/
    https://forum.unity.com/threads/wha...ringutils-fullscreenmesh-is-obsolete.1405636/
    https://forum.unity.com/threads/urp-hdr-alpha-blending-problem.1315299/
    https://forum.unity.com/threads/urp-full-window-partial-transparency.963743/

    Also, if the CommandBuffer.DrawMesh(fullscreenMesh) method gets deprecated, does this mean that all our effects will stop working, since the Blitter is not an option and does not fully replace the mesh-based method, as the warning falsely suggests? Is there any plan to handle that, or will this be a showstopper for using Unity?
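    For reference, the Blitter-based replacement that the warning points at looks roughly like this in a non-RenderGraph URP 14 render pass. This is a sketch only; m_Material and m_Destination are placeholders for your own material and RTHandle:

    Code (CSharp):
    ```csharp
    // Sketch of the suggested replacement for cmd.DrawMesh(fullscreenMesh, ...).
    // m_Material / m_Destination are hypothetical members set up by the render feature.
    public override void Execute(ScriptableRenderContext context, ref RenderingData renderingData)
    {
        CommandBuffer cmd = CommandBufferPool.Get("FullscreenBlit");

        // Blitter binds the source as _BlitTexture; the shader should use Vert from Blit.hlsl.
        RTHandle source = renderingData.cameraData.renderer.cameraColorTargetHandle;
        Blitter.BlitCameraTexture(cmd, source, m_Destination, m_Material, 0);

        context.ExecuteCommandBuffer(cmd);
        CommandBufferPool.Release(cmd);
    }
    ```

    Note that whether alpha survives the blit depends on the destination texture's format and the material's blend state and ColorMask, not on Blitter itself, which may be related to the opaque result described above.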

    Thanks
     
    Last edited: Nov 19, 2023
  33. colin299

    colin299

    Joined:
    Sep 2, 2013
    Posts:
    183
    >Currently, RG is still off by default but we expect to have RG on by default around the early 23.3 beta

    Hello, what will happen when a user upgrades their project from 2023.2 to 2023.3 (where RenderGraph is enabled by default) if the project contains assets whose renderer features/passes don't support RenderGraph?

    I attempted to run a Unity 2022.3 project in Unity 2023.3 and turned RG on; it seems that if any asset's renderer feature/pass doesn't support RenderGraph, the pass is ignored with a warning log stating "RecordRenderGraph is not implemented; the pass ____ won't be recorded in the current RenderGraph."

    I understand that RenderGraph is important for auto RenderTexture (RT) management, and I appreciate it. However, based on the example RenderGraph code from previous replies, it appears that all renderer features/passes must be entirely rewritten in a new manner. This could potentially take months if there are hundreds of renderer features/passes to support.

    It isn't easy for an asset developer to ensure their assets run on all Unity versions. For example:

    Code (CSharp):
    1. #if !UNITY_2022_2_OR_NEWER
    2. //RenderTargetHandle rendering....
    3. #endif
    4. #if UNITY_2022_2_OR_NEWER
    5. // RTHandle rendering...
    6. #endif
    7. #if UNITY_2023_3_OR_NEWER
    8. // RG rendering...(many of us, including me, may not have a good idea of what to write here)
    9. #endif
    Therefore, I would appreciate some advice from Unity's staff for asset developers:

    - Should asset developers be concerned about the release of RenderGraph in 2023.3?
    - How challenging is it to support RenderGraph, especially compared to the "RenderTargetHandle to RTHandle" transition?
    - Is there any guideline for supporting RenderGraph, similar to the document about converting from RenderTargetHandle to RTHandle?

    Thank you.
     
  34. nasos_333

    nasos_333

    Joined:
    Feb 13, 2013
    Posts:
    13,471
    From my understanding, the Graph is a replacement for renderer features, and we will need to redo everything in the Graph.

    That is why it is referred to as a Graph-based URP version. It will probably not be the same as the previous one at all.

    The biggest question is when these APIs will stop changing and offer a solid, generic base that makes the transition transparent, rather than asking us to redo everything every few months.
     
  35. AljoshaD

    AljoshaD

    Unity Technologies

    Joined:
    May 27, 2019
    Posts:
    252
    A project that upgrades will have RenderGraph turned off, we'll likely call this "Compatibility mode". This will give you time to upgrade your RenderFeatures.

    There will be a compatibility mode but there will be strong incentive to upgrade your assets to the new API. Since the compatibility mode is there, you'll have time to upgrade.


    We've upgraded a number of assets and it can be quite straightforward. We'll share more docs on how to upgrade efficiently soon.


    We'll have a lot of documentation and samples. We'll also add helper functions to make it more straightforward. We expect to share more soon in December.
     
    Last edited: Nov 20, 2023
    colin299 and JesOb like this.
  36. nasos_333

    nasos_333

    Joined:
    Feb 13, 2013
    Posts:
    13,471
    So what if some of the effects are not upgradable? Or take years of development time to convert, given it is a totally different scheme?

    How are we supposed to convert complex code and rendering without spending an infinite amount of time redoing everything, in a potentially impossible-to-convert scenario?

    Also, moving code to the Graph is extremely cumbersome; it is like slowing development down by a factor of ten, which will make it extremely hard to convert anything.

    If the renderer features are not working, there effectively is no compatibility mode; it is as good as non-existent, as anyone can turn the Graph on and see a broken project. Having this mode does not help with how badly all users' projects will break.

    Has the team really thought well enough about such a change, given the millions of projects that could potentially be broken forever?

    Plus, everything done with a graph is usually vastly slower than normal code. Has this been addressed, and is it 100% guaranteed that a converted effect will make sense and not be much slower and useless?

    Also, is there a single reason, besides not wanting to spend extra time, not to allow the existing renderer features to be adapted to run on the new system and work together with it?

    Because frankly, I would rather work with code than slow down my development for months, trying to create in the Graph what can be done in code in minutes or hours.

    Thanks
     
    Last edited: Nov 20, 2023
  37. vallis_unity

    vallis_unity

    Joined:
    Apr 14, 2022
    Posts:
    73
    Is there a sample/example which copies the temporary buffer back over the camera's color buffer? Arrived here from https://forum.unity.com/threads/urp-13-1-8-proper-rthandle-usage-in-a-renderer-feature.1341164/
     
    PatSalvatore likes this.
  38. oliverschnabel

    oliverschnabel

    Unity Technologies

    Joined:
    Mar 13, 2018
    Posts:
    45
    We are very careful with the introduction of RenderGraph and are aware of the implications. We want to help users and the ecosystem with this transition and let everyone over time benefit from this new URP backend. For developers, the main difference is the adjustments to the ScriptableRenderPass API. Please have a look at the preview documentation describing the main differences.

    It has nothing to do with a visual graph system like ShaderGraph, VFX Graph or Visual Scripting. It is sometimes also referred to as a Frame Graph (see some resources around that Design Pattern here).

    We will provide dozens of examples of how to transition your code, and we also plan to add more utility functions to avoid unnecessary boilerplate code.
     
    nasos_333 and AljoshaD like this.
  39. AljoshaD

    AljoshaD

    Unity Technologies

    Joined:
    May 27, 2019
    Posts:
    252
    Every effect is upgradable. Some existing effects might make use of a hack and work on some platforms but actually have unsupported behavior. So these might require rework to work properly with the new API that offers more guardrails and doesn't allow unsupported behavior.
    Even complex RenderFeatures should take at most a few days to convert once all our learning materials are ready to share.

    RenderGraph is a new and improved API to code RenderPasses. It's not a visual graph. You can find the info in the document we shared above.


    Yes, very much so, and we discussed it with many experts and many asset store providers. Unfortunately any change requires work, but we made sure that the benefits for our users and the wider community are worth it, and that our approach minimizes both the work and complexity required to upgrade. We have also ensured that there is a compatibility mode, so you have time to upgrade your assets.
     
    nasos_333 likes this.
  40. oliverschnabel

    oliverschnabel

    Unity Technologies

    Joined:
    Mar 13, 2018
    Posts:
    45
    This is an example CopyRenderFeature.cs that records a rendering command to copy, or blit, the contents of the source texture to the color render target of the render pass.

    Maybe this is a helpful example for you?

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Experimental.Rendering.RenderGraphModule;
    using UnityEngine.Rendering;
    using UnityEngine.Rendering.Universal;

    public class CopyRenderFeature : ScriptableRendererFeature
    {
        // Render pass that copies the camera's active color texture to a destination texture.
        // To simplify the code, this sample does not use the destination texture elsewhere in the frame. You can use the Frame Debugger to inspect its contents.
        class CopyRenderPass : ScriptableRenderPass
        {
            // This class stores the data that the render pass needs. The RecordRenderGraph method populates the data and the render graph passes it as a parameter to the rendering function.
            class PassData
            {
                internal TextureHandle copySourceTexture;
            }

            // Rendering function that generates the rendering commands for the render pass.
            // The RecordRenderGraph method instructs the render graph to use it with the SetRenderFunc method.
            static void ExecutePass(PassData data, RasterGraphContext context)
            {
                // Records a rendering command to copy, or blit, the contents of the source texture to the color render target of the render pass.
                // The RecordRenderGraph method sets the destination texture as the render target with the UseTextureFragment method.
                Blitter.BlitTexture(context.cmd, data.copySourceTexture, new Vector4(1, 1, 0, 0), 0, false);
            }

            // This method adds and configures one or more render passes in the render graph.
            // This process includes declaring their inputs and outputs, but does not include adding commands to command buffers.
            public override void RecordRenderGraph(RenderGraph renderGraph, ContextContainer frameData)
            {
                string passName = "Copy To Debug Texture";

                // Add a raster render pass to the render graph. The PassData type parameter determines the type of the passData out variable.
                using (var builder = renderGraph.AddRasterRenderPass<PassData>(passName, out var passData))
                {
                    // UniversalResourceData contains all the texture handles used by URP, including the active color and depth textures of the camera.
                    UniversalResourceData resourceData = frameData.Get<UniversalResourceData>();

                    // Populate passData with the data needed by the rendering function of the render pass.

                    // Use the camera's active color texture as the source texture for the copy.
                    passData.copySourceTexture = resourceData.activeColorTexture;

                    // Create a destination texture for the copy based on the settings, such as dimensions, of the textures that the camera uses.
                    // Set msaaSamples to 1 to get a non-multisampled destination texture.
                    // Set depthBufferBits to 0 to ensure that the CreateRenderGraphTexture method creates a color texture and not a depth texture.
                    UniversalCameraData cameraData = frameData.Get<UniversalCameraData>();
                    RenderTextureDescriptor desc = cameraData.cameraTargetDescriptor;
                    desc.msaaSamples = 1;
                    desc.depthBufferBits = 0;

                    // For demonstrative purposes, this sample creates a transient, or temporary, destination texture.
                    // UniversalRenderer.CreateRenderGraphTexture is a helper method that calls the RenderGraph.CreateTexture method.
                    // It simplifies your code when you have a RenderTextureDescriptor instance instead of a TextureDesc instance.
                    TextureHandle destination =
                        UniversalRenderer.CreateRenderGraphTexture(renderGraph, desc, "CopyTexture", false);

                    // Declare that this render pass uses the source texture as a read-only input.
                    builder.UseTexture(passData.copySourceTexture);

                    // Declare that this render pass uses the temporary destination texture as its color render target.
                    // This is similar to cmd.SetRenderTarget prior to the RenderGraph API.
                    builder.UseTextureFragment(destination, 0);

                    // RenderGraph automatically determines that it can remove this render pass because its results, which are stored in the temporary destination texture, are not used by other passes.
                    // For demonstrative purposes, this sample turns off this behavior to make sure that RenderGraph executes the render pass.
                    builder.AllowPassCulling(false);

                    // Set the ExecutePass method as the rendering function that RenderGraph calls for the render pass.
                    // This sample uses a lambda expression to avoid memory allocations.
                    builder.SetRenderFunc((PassData data, RasterGraphContext context) => ExecutePass(data, context));
                }
            }
        }

        CopyRenderPass m_CopyRenderPass;

        public override void Create()
        {
            m_CopyRenderPass = new CopyRenderPass();

            // Configure the injection point at which URP runs the pass.
            m_CopyRenderPass.renderPassEvent = RenderPassEvent.AfterRenderingOpaques;
        }

        // URP calls this method every frame, once for each Camera. This method lets you inject ScriptableRenderPass instances into the scriptable Renderer.
        public override void AddRenderPasses(ScriptableRenderer renderer, ref RenderingData renderingData)
        {
            renderer.EnqueuePass(m_CopyRenderPass);
        }
    }
     
  41. nasos_333

    nasos_333

    Joined:
    Feb 13, 2013
    Posts:
    13,471
    Oh I see, it's not an actual graph-based system :), that is a relief :)

    Thanks for all the input on this. I just finished converting all my effects to the new Blitter function, which took a few weeks, and I was worried I would need to redo the work on a graph, which could have been endgame given the complexity of my effects.

    Hopefully we can have the full documentation and examples soon, the sooner the better in this case, so users of the assets don't end up with broken projects.
     
    oliverschnabel likes this.
  42. nasos_333

    nasos_333

    Joined:
    Feb 13, 2013
    Posts:
    13,471

    Found a bug, not sure if it is major or minor, as the code still seems to compile
     
  43. nasos_333

    nasos_333

    Joined:
    Feb 13, 2013
    Posts:
    13,471
    Hi,

    How do we blit one render texture to another? I don't see any of the examples covering multiple chained blits, which are the basis of any complex effect. I also cannot find a way to set a render target inside the static execute section as the target for the new Blitter method with a raster command buffer, or to blit one render texture to another (with or without a material pass).

    Before, I used the code below:
    Code (csharp):
    Blitter.BlitCameraTexture(cmd, m_CameraColorTarget, _handleA, _material, 20);
    cmd.Blit(_handleA, previousFrameTexture);
    where

    Code (csharp):
    public void SetTarget(RTHandle colorHandle, float intensity)
    {
        m_CameraColorTarget = colorHandle;
        m_Intensity = intensity;
    }

    RTHandle m_CameraColorTarget;
    RTHandle _handleA;

    _handleA = RTHandles.Alloc(xres, yres, colorFormat: GraphicsFormat.R32G32B32A32_SFloat, dimension: TextureDimension.Tex2D);
    and

    Code (csharp):
    RenderTexture previousFrameTexture;

    previousFrameTexture = new RenderTexture((int)(rtW / ((float)downSampleAA)), (int)(rtH / ((float)downSampleAA)), 0, format); // RenderTextureFormat.DefaultHDR // v0.7
    previousFrameTexture.filterMode = FilterMode.Point;
    previousFrameTexture.Create();
    Can the builder be passed into the static execute function, so that the below can be done multiple times?

    Set up as a render target via UseTextureFragment, which is the equivalent of using the old cmd.SetRenderTarget:
    Code (csharp):
    builder.UseTextureFragment(passData.dst, 0);
    So that the below can then write to the target defined above, for example:

    Code (csharp):
    Blitter.BlitTexture(cmd, data.m_CameraColorTarget, new Vector4(1, 1, 0, 0), _material, 20);
    Thanks
     
    Last edited: Nov 20, 2023
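    [Editor's note] One hedged reading of the API shown earlier in this thread: a raster pass's color target is fixed at record time via UseTextureFragment, so each blit hop becomes its own AddRasterRenderPass. The sketch below chains two blits (camera color to an intermediate texture, then intermediate to a second texture) under the 23.3 alpha API. The pass names, the AddBlitPass helper, and the m_Material field are illustrative assumptions, not confirmed API guidance.

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Experimental.Rendering.RenderGraphModule;
    using UnityEngine.Rendering;
    using UnityEngine.Rendering.Universal;

    class ChainedBlitPass : ScriptableRenderPass
    {
        class PassData
        {
            internal TextureHandle source;
            internal Material material;
        }

        Material m_Material; // hypothetical effect material, assigned elsewhere

        static void ExecutePass(PassData data, RasterGraphContext context)
        {
            // Writes to whichever render target was declared with UseTextureFragment for this pass.
            if (data.material != null)
                Blitter.BlitTexture(context.cmd, data.source, new Vector4(1, 1, 0, 0), data.material, 0);
            else
                Blitter.BlitTexture(context.cmd, data.source, new Vector4(1, 1, 0, 0), 0, false);
        }

        // Hypothetical helper: record one blit hop as its own raster pass.
        void AddBlitPass(RenderGraph renderGraph, string name, TextureHandle src, TextureHandle dst, Material mat)
        {
            using (var builder = renderGraph.AddRasterRenderPass<PassData>(name, out var passData))
            {
                passData.source = src;
                passData.material = mat;
                builder.UseTexture(src);            // declare read
                builder.UseTextureFragment(dst, 0); // declare write (replaces cmd.SetRenderTarget)
                builder.AllowPassCulling(false);    // keep the pass even if its output is unused in this sketch
                builder.SetRenderFunc((PassData data, RasterGraphContext context) => ExecutePass(data, context));
            }
        }

        public override void RecordRenderGraph(RenderGraph renderGraph, ContextContainer frameData)
        {
            var resourceData = frameData.Get<UniversalResourceData>();
            var cameraData = frameData.Get<UniversalCameraData>();

            var desc = cameraData.cameraTargetDescriptor;
            desc.msaaSamples = 1;
            desc.depthBufferBits = 0;

            TextureHandle handleA = UniversalRenderer.CreateRenderGraphTexture(renderGraph, desc, "_HandleA", false);
            TextureHandle secondTarget = UniversalRenderer.CreateRenderGraphTexture(renderGraph, desc, "_SecondTarget", false);

            // Two chained hops: camera color -> handleA (with material), handleA -> secondTarget (plain copy).
            AddBlitPass(renderGraph, "Effect Blit", resourceData.activeColorTexture, handleA, m_Material);
            AddBlitPass(renderGraph, "Copy Blit", handleA, secondTarget, null);
        }
    }

    Because RenderGraph orders passes from their declared reads and writes, the second pass runs after the first. A texture that must persist across frames (a history buffer) would instead need to be imported into the graph, which this sketch does not cover.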
  44. nasos_333

    nasos_333

    Joined:
    Feb 13, 2013
    Posts:
    13,471
    After looking more into it, I think the major problem is not porting the effects.

    It is that the API is not ready, and is years away from being streamlined and usable.

    Please allow at least 2 years after the API is ready before the final non-beta release, so we have a slight chance of porting our effects over. That would be around 3 to 4 years from now.

    So far it seems impossible to port; there is no clear way to do any of the million things involved in making our previous effects work.

    This means we will need to deprecate our assets for months or years after this has been officially released.

    There is no Blit anymore, no way to set render targets, no way to know what renders where, or how to set render textures and their properties. No way even to do a simple blit of one texture to another.

    I get the feeling that to emulate just one of our previous blits, we need to copy the whole code below and use a builder for each operation, essentially bringing our execute workflow inside the graph function rather than the static one, which is solely for shader operations rather than setting targets etc.
    Like creating a render pass and multiple static functions for each blit, or something.

    Code (csharp):
    using (var builder = renderGraph.AddRasterRenderPass<PassData>(passName, out var passData))
    {
        // UniversalResourceData contains all the texture handles used by URP, including the active color and depth textures of the camera
        UniversalResourceData resourceData = frameData.Get<UniversalResourceData>();

        // Populate passData with the data needed by the rendering function of the render pass

        // Use the camera's active color texture as the source texture for the copy
        passData.copySourceTexture = resourceData.activeColorTexture;

        // Create a destination texture for the copy based on the settings, such as dimensions, of the textures that the camera uses.
        // Set msaaSamples to 1 to get a non-multisampled destination texture.
        // Set depthBufferBits to 0 to ensure that the CreateRenderGraphTexture method creates a color texture and not a depth texture.
        UniversalCameraData cameraData = frameData.Get<UniversalCameraData>();
        RenderTextureDescriptor desc = cameraData.cameraTargetDescriptor;
        desc.msaaSamples = 1;
        desc.depthBufferBits = 0;

        // For demonstrative purposes, this sample creates a transient, or temporary, destination texture.
        // UniversalRenderer.CreateRenderGraphTexture is a helper method that calls the RenderGraph.CreateTexture method.
        // It simplifies your code when you have a RenderTextureDescriptor instance instead of a TextureDesc instance.
        TextureHandle destination =
            UniversalRenderer.CreateRenderGraphTexture(renderGraph, desc, "CopyTexture", false);

        // Declare that this render pass uses the source texture as a read-only input
        builder.UseTexture(passData.copySourceTexture);

        // Declare that this render pass uses the temporary destination texture as its color render target.
        // This is similar to cmd.SetRenderTarget prior to the RenderGraph API.
        builder.UseTextureFragment(destination, 0);

        // RenderGraph automatically determines that it can remove this render pass because its results, which are stored in the temporary destination texture, are not used by other passes.
        // For demonstrative purposes, this sample turns off this behavior to make sure that RenderGraph executes the render pass.
        builder.AllowPassCulling(false);

        // Set the ExecutePass method as the rendering function that RenderGraph calls for the render pass.
        // This sample uses a lambda expression to avoid memory allocations.
        builder.SetRenderFunc((PassData data, RasterGraphContext context) => ExecutePass(data, context));
    }
    This is massively cumbersome and complex, plus we simply cannot tell whether it works that way or not, or how to set it up properly overall.

    Rendering a single image using a single shader is not a viable example; it is useless for more complex effects.

    This seems like a disaster so far. It seems like all our effort of the last 5 years to build our effects as URP renderer features is now completely destroyed and useless.

    Another massive issue is the requirement to have static methods in our code now. I really don't want to use anything static, and it should be avoided at all costs.
     
    Last edited: Nov 20, 2023
  45. _geo__

    _geo__

    Joined:
    Feb 26, 2014
    Posts:
    1,367
    That sounds awesome.

    I have a question about the specialized 2D URP renderer you guys wrote. Will the 2D URP renderer be upgraded to RG or will it be dropped?


    These quotes make me worry:

    As an asset dev I worry because this all feels very rushed. I mean Unity 2023.3 is scheduled for release in APRIL 2024. That's in 4-5 months from now (+ a holiday season), yet here Unity is asking for low level API feedback. The title of the thread is "Introduction of Render Graph". Heck, that's something you post A YEAR+ before such a change. Not 6 months before you ship the final thing.

    I personally find the thought of collecting very rudimentary feedback 5 months before release questionable.

    Anyways, my biggest question is: how long will the compatibility mode be around? How will it be supported?
    I especially worry about the "but there will be strong incentive" part. To me this sounds like Unity will not give any support or help on it. I can already see Unity employees telling us to "port to RG because it's better" when we start to report bugs for the compatibility mode.

    Not everyone can immediately stop all other work and do the ports of the render features. I for one have learned (the hard way) to not start any porting work right away since things are shifting too frequently (early URP being a prime example) or have too many bugs.

    I worry this will be beta-tested by LTS early adopters, and us asset devs will take the hit by having to explain it all to those users with "your asset broke my project" support requests. Because: yes, people will use it immediately if you promise them a magic button for better performance. No, they don't care what alpha, beta, or LTS means.

    Again my apologies for the language. I feel cynical today.
    The API itself looks like a nice improvement to me (still too much boilerplate but that has been asked already).

    Thank you for listening
     
    D1234567890, halley and nasos_333 like this.
  46. Kabinet13

    Kabinet13

    Joined:
    Jun 13, 2019
    Posts:
    152
    Worth noting that the next actual LTS is going to be Unity 6, which will have several extra months in the oven before release, partly to help with stabilizing, but also to give time to adapt to changes like this. I for one am massively in favor of fewer, stabler, more feature-rich releases.
     
  47. colin299

    colin299

    Joined:
    Sep 2, 2013
    Posts:
    183
    Thank you for the example! It clearly demonstrates what the RG API does. For any simple drawing/blit pass with 0 to 1 render target switch, I believe supporting the RG system is quite straightforward.

    I'd be interested in a more complex example. Let's say you extend the CopyRenderPass renderer feature to render a blurred version of "CopyTexture". The final output would be a texture that stores various levels of blurred textures in smaller mipmaps. Then, in the transparent queue, any object shader could sample this blurred "CopyTexture" using SAMPLE_TEXTURE2D_LOD with a blurriness value, typically for frosted glass objects. Would this still be manageable with the RG system?

    I am under the impression that each horizontal blur and each vertical blur blit requires a separate renderGraph.AddRasterRenderPass, since cmd.SetRenderTarget() can no longer be used in ExecutePass (the static function used by SetRenderFunc). Is my assumption correct?
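    [Editor's note] That assumption is consistent with the sample earlier in the thread: the color attachment is fixed per pass, so every render-target switch in the old cmd.SetRenderTarget style becomes its own AddRasterRenderPass. The sketch below records one pass per downsample level under the 23.3 alpha API; the _BlurMip names, the loop structure, and the m_BlurMaterial field are illustrative assumptions, not official guidance.

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Experimental.Rendering.RenderGraphModule;
    using UnityEngine.Rendering;
    using UnityEngine.Rendering.Universal;

    class BlurChainPass : ScriptableRenderPass
    {
        class PassData
        {
            internal TextureHandle source;
            internal Material material;
        }

        Material m_BlurMaterial; // hypothetical blur material, assigned elsewhere

        public override void RecordRenderGraph(RenderGraph renderGraph, ContextContainer frameData)
        {
            var resourceData = frameData.Get<UniversalResourceData>();
            var cameraData = frameData.Get<UniversalCameraData>();

            var desc = cameraData.cameraTargetDescriptor;
            desc.msaaSamples = 1;
            desc.depthBufferBits = 0;

            TextureHandle source = resourceData.activeColorTexture;

            // One raster pass per blur level, each writing a half-resolution target.
            for (int i = 0; i < 4; i++)
            {
                desc.width = Mathf.Max(1, desc.width / 2);
                desc.height = Mathf.Max(1, desc.height / 2);
                TextureHandle target = UniversalRenderer.CreateRenderGraphTexture(
                    renderGraph, desc, "_BlurMip" + i, false);

                using (var builder = renderGraph.AddRasterRenderPass<PassData>("Blur Level " + i, out var passData))
                {
                    passData.source = source;
                    passData.material = m_BlurMaterial;

                    builder.UseTexture(passData.source);   // read previous level
                    builder.UseTextureFragment(target, 0); // write this level
                    builder.AllowPassCulling(false);       // outputs are unused in this sketch

                    builder.SetRenderFunc((PassData data, RasterGraphContext ctx) =>
                        Blitter.BlitTexture(ctx.cmd, data.source, new Vector4(1, 1, 0, 0), data.material, 0));
                }

                source = target; // this level feeds the next
            }
        }
    }

    Exposing the final chain to transparent-queue shaders would additionally require binding it as a global texture, which this sketch does not cover.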
     
  48. AljoshaD

    AljoshaD

    Unity Technologies

    Joined:
    May 27, 2019
    Posts:
    252
    It is upgraded to RG.


    Our goal is to make our users and our asset store providers successful. For our users, RG will mean more stability and better performance when extending the rendering. For asset creators, it means an API that is better defined and better documented, that should be more productive once learned, and that offers performance by default. We are dedicated to supporting asset providers in this transition. We will closely monitor the progress of asset conversion in the marketplace and make sure we support the compatibility mode during this transition. How we approach this exactly will be based on your feedback.
     
    _geo__ likes this.
  49. ManueleB

    ManueleB

    Unity Technologies

    Joined:
    Jul 6, 2020
    Posts:
    110
    The samples posted in this thread are for very minimal use cases, just to show how to do the basic operations with the new API; think of them as building blocks.

    If you want to look at more complex effects and advanced usage, you can look at the URP source: PostProcessPassRenderGraph.cs implements the full URP post-processing stack with the new API. We plan to publish more advanced samples and tutorials in the future.
     
  50. AljoshaD

    AljoshaD

    Unity Technologies

    Joined:
    May 27, 2019
    Posts:
    252
    Do you have any specific feedback on any of the APIs that we can take into account?

    I realize there is a lot to learn, and we are working on the documentation. Hopefully the guide shared in the first post gives a good intro, and Manuele has already shared a number of samples. Getting this API right is very important to us, so that you have a solid API and can be more productive after the initial learning curve.

    To make it the best it can be, and much improved compared to the previous API, we have run three rounds of user testing over the last half year before sharing on our asset provider forum and on this public forum. We have collected feedback from many stakeholders, including internal teams. We are currently making some last improvements to the low-level API that has been shared with you, and will then continue adding helper functions to simplify many common operations such as Blit. The idea is that simple things should be straightforward without much boilerplate. We are creating many code samples for specific use cases (17 so far). We plan to finish this in roughly one month and share it with you then. We are confident that we can make this a solid foundation for your work, and appreciate any feedback that can help make it better.
     
    ali_mohebali, Kabinet13 and nasos_333 like this.