Official Introduction of Render Graph in the Universal Render Pipeline (URP)

Discussion in 'Universal Render Pipeline' started by oliverschnabel, Oct 2, 2023.

  1. ElliotB

    ElliotB

    Joined:
    Aug 11, 2013
    Posts:
    297
    Aye, greatly appreciated!
     
  2. camerondus

    camerondus

    Joined:
    Dec 15, 2018
    Posts:
    63
    When do you plan to release the tools that will make it easier to upgrade? Hopefully sooner rather than later; I'd like to have my asset Render Graph ready before April :)
     
  3. nasos_333

    nasos_333

    Joined:
    Feb 13, 2013
    Posts:
    13,538
    I see, thanks for the feedback on this, that way seems better.

    But if a user has started a 2023.3 project and has already switched to Render Graph, and then buys and imports one of my assets that doesn't yet support it, how is this case handled? I assume it won't switch back to the previous system, and that is the main issue.

    I really can't think of a way to avoid this; some users will just get confused and leave low ratings. There is no working around that, unless URP is renamed to URP RG to make clear it is a new pipeline, perhaps.

    Another way would be to default to the old method for all projects, new or imported, with a big warning when switching to Render Graph that some effects won't work. That would be a big help, I suppose.
     
    Last edited: Feb 15, 2024
    BOXOPHOBIC likes this.
  4. ElliotB

    ElliotB

    Joined:
    Aug 11, 2013
    Posts:
    297
    Probably education and making the most of the new tooling to make users aware of the differences. You could chat with the asset store folks to explore additional compatibility flags like 'supports RG' etc. I haven't removed reviews in the past because I find it easier to just explain the issue in the reply and then future users can see it easily.

    Out of interest, what does the resource viewer show when in compatibility mode?
     
    AljoshaD likes this.
  5. AljoshaD

    AljoshaD

    Unity Technologies

    Joined:
    May 27, 2019
    Posts:
    281
    It does not work with compatibility mode. There's no RenderGraph to show.
     
    ElliotB likes this.
  6. ElliotB

    ElliotB

    Joined:
    Aug 11, 2013
    Posts:
    297
    Apologies if this was said somewhere - is there a define statement for rendergraph/2023.3 'not compatibility mode'?
     
  7. BOXOPHOBIC

    BOXOPHOBIC

    Joined:
    Jul 17, 2015
    Posts:
    529
    I guess the solution for asset store devs would be to have a scripting define so both APIs can be used, and the compatibility mode option could be removed altogether. I haven't used either of them, so I don't know if that would be possible.
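    For illustration, a rough sketch of what that asset-side split could look like. MYASSET_RENDER_GRAPH here is not a Unity symbol; it would be something the asset defines itself (e.g. via an asmdef version define or Scripting Define Symbols), so treat the whole thing as a sketch:

    Code (CSharp):
    using UnityEngine.Rendering;
    using UnityEngine.Rendering.Universal;
    #if MYASSET_RENDER_GRAPH
    using UnityEngine.Rendering.RenderGraphModule;
    #endif

    public class MyAssetPass : ScriptableRenderPass
    {
    #if MYASSET_RENDER_GRAPH
        // Render Graph path (2023.3+ with Compatibility Mode disabled).
        public override void RecordRenderGraph(RenderGraph renderGraph, ContextContainer frameData)
        {
            // ... record raster/compute passes with the Render Graph API ...
        }
    #else
        // Legacy / Compatibility Mode path.
        public override void Execute(ScriptableRenderContext context, ref RenderingData renderingData)
        {
            // ... existing CommandBuffer-based code ...
        }
    #endif
    }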
     
  8. AljoshaD

    AljoshaD

    Unity Technologies

    Joined:
    May 27, 2019
    Posts:
    281
    What if we create a setting on the RenderFeature that you can turn on to show a standard Unity message about Render Graph when your non-compatible asset is used with RG? The message would link to the documentation and clarify that it's expected that not all assets support RG yet. We, as Unity, can help set expectations with our users so that your asset doesn't get blamed for that. Just brainstorming here, but happy to hear your thoughts.
     
  9. AljoshaD

    AljoshaD

    Unity Technologies

    Joined:
    May 27, 2019
    Posts:
    281
    You want to check in C#? You can check the global settings; the API was shared in one of the earlier messages.
     
  10. ElliotB

    ElliotB

    Joined:
    Aug 11, 2013
    Posts:
    297
    I was wondering if there was a way to check via #if statements, as there is for UNITY_2023_3_OR_NEWER for instance?
     
  11. AljoshaD

    AljoshaD

    Unity Technologies

    Joined:
    May 27, 2019
    Posts:
    281
    I don't think so. I'm curious about your use case. I assume using UNITY_2023_3_OR_NEWER and the global setting to check for compatibility mode should cover your needs.
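    For example, something along these lines; the exact settings type and property names (RenderGraphSettings, enableRenderCompatibilityMode) are an assumption here and may differ between versions, so treat this as a sketch rather than a guaranteed API:

    Code (CSharp):
    #if UNITY_2023_3_OR_NEWER
    using UnityEngine.Rendering;
    using UnityEngine.Rendering.Universal;

    public static class RenderGraphCompat
    {
        // True when URP runs the Render Graph path, i.e. Compatibility Mode is disabled
        // in the URP graphics settings. Assumed API; verify against your URP version.
        public static bool IsRenderGraphActive()
        {
            var settings = GraphicsSettings.GetRenderPipelineSettings<RenderGraphSettings>();
            return settings != null && !settings.enableRenderCompatibilityMode;
        }
    }
    #endif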
     
  12. ElliotB

    ElliotB

    Joined:
    Aug 11, 2013
    Posts:
    297
    If compatibility mode is more or less equivalent to pre-2023.3 behavior, it would reduce the complexity of the code: I could check 'if 2023.3 and not compatibility mode', rather than checking for 2023.3, then checking the global settings, then checking <2023.3 to switch between the different behaviours.

    (Disclaimer: I still haven't ported my asset yet due to time, so I'm running on some assumptions here.)
     
  13. JesterGameCraft

    JesterGameCraft

    Joined:
    Feb 26, 2013
    Posts:
    454
    I haven't really been following this whole thread, just the last few pages, so this might have been mentioned already; if so, please ignore. What about asking developers on the Asset Store what version of Unity and which pipeline they're using, and then only showing assets that meet that criteria when they search? Or, if they select an asset they want to purchase, show a compatibility message before they buy it, e.g. that it will only work with Unity 202x or higher and with URP RG. Or ask them to enter the Unity version and pipeline they're using before pressing purchase. I know this info is in asset descriptions already, but a popup before purchase reiterating it might stop someone from purchasing an asset that will not work in their project.
     
  14. DrViJ

    DrViJ

    Joined:
    Feb 9, 2013
    Posts:
    164
    Hello dear Unity Team,
    First, I would like to express my gratitude for the examples and links to documentation files provided in this thread. They have been immensely helpful in understanding the Render Graph API and the Render Graph pattern itself. The approach appears very user-friendly, and I particularly enjoyed using the render graph viewer. The concept of separating resource handling from operations on those resources seems promising. We have encountered numerous bugs due to such errors, and the idea of separating these operations presents itself as a straightforward solution.

    While using it, I have two questions regarding the debugging process of render features.
    1. Is there a static analyzer or perhaps a tool to debug situations when a user makes a mistake?

    For example, in RecordRenderGraph:
    1. We set the render target but forget to call builder.UseRendererList(passData.RenderersList) at line 39 of the example code.

    During execution:
    1. We clear the render target with red.
    2. Call DrawRendererList.
    As a result, the `DrawRendererList` is ignored and we get a red screen, but we receive no warnings to help identify the cause. Is a special tool planned to find such errors? (Of course, this one is simple to find, but the situation can be more complex.)
    Here is an example of such render feature code:

    Code (CSharp):
    1. using System.Collections.Generic;
    2. using UnityEngine;
    3. using UnityEngine.Rendering;
    4. using UnityEngine.Rendering.Universal;
    5. using UnityEngine.Rendering.RenderGraphModule;
    6.  
    7. public class OutlineRenderGraphFeature : ScriptableRendererFeature
    8. {
    9.     class CustomRenderPass : ScriptableRenderPass
    10.     {
    11.         private class PassData
    12.         {
    13.             internal RendererListHandle RenderersList;
    14.         }
    15.  
    16.         private readonly List<ShaderTagId> m_ShaderTagIdList = new();
    17.         private readonly LayerMask m_LayerMask;
    18.         private readonly Material m_MaskMaterial;
    19.  
    20.         public CustomRenderPass(LayerMask layerMask, Material maskMaterial)
    21.         {
    22.             m_LayerMask = layerMask;
    23.             m_MaskMaterial = maskMaterial;
    24.         }
    25.  
    26.         static void ExecutePass(PassData data, RasterGraphContext context)
    27.         {
    28.             context.cmd.ClearRenderTarget(true, true, Color.red);
    29.             context.cmd.DrawRendererList(data.RenderersList);
    30.         }
    31.  
    32.         public override void RecordRenderGraph(RenderGraph renderGraph, ContextContainer frameData)
    33.         {
    34.             const string passName = "Custom Render Pass";
    35.             using var builder = renderGraph.AddRasterRenderPass<PassData>(passName, out var passData);
    36.             UniversalResourceData resourceData = frameData.Get<UniversalResourceData>();
    37.  
    38.             PrepareRenderList(renderGraph, frameData, ref passData);
    39.             //We break everything by forgetting to use RenderList, but how can we debug this?
    40.             //builder.UseRendererList(passData.RenderersList);
    41.  
    42.             builder.SetRenderAttachment(resourceData.activeColorTexture, 0);
    43.             builder.SetRenderAttachmentDepth(resourceData.activeDepthTexture);
    44.  
    45.             builder.SetRenderFunc((PassData data, RasterGraphContext context) => ExecutePass(data, context));
    46.         }
    47.  
    48.         private void PrepareRenderList(RenderGraph renderGraph, ContextContainer frameData, ref PassData passData)
    49.         {
    50.             // Access the relevant frame data from the Universal Render Pipeline
    51.             var universalRenderingData = frameData.Get<UniversalRenderingData>();
    52.             var cameraData = frameData.Get<UniversalCameraData>();
    53.             var lightData = frameData.Get<UniversalLightData>();
    54.  
    55.             var sortingCriteria = cameraData.defaultOpaqueSortFlags;
    56.             var renderQueueRange = RenderQueueRange.opaque;
    57.             var filterSettings = new FilteringSettings(renderQueueRange, m_LayerMask);
    58.  
    59.             ShaderTagId[] forwardOnlyShaderTagIds =
    60.             {
    61.                 new("UniversalForwardOnly"),
    62.                 new("UniversalForward"),
    63.                 new("SRPDefaultUnlit"), // Legacy shaders (do not have a gbuffer pass) are considered forward-only for backward compatibility
    64.                 new("LightweightForward") // Legacy shaders (do not have a gbuffer pass) are considered forward-only for backward compatibility
    65.             };
    66.             m_ShaderTagIdList.Clear();
    67.  
    68.             foreach (ShaderTagId sid in forwardOnlyShaderTagIds)
    69.                 m_ShaderTagIdList.Add(sid);
    70.  
    71.             DrawingSettings drawSettings = RenderingUtils.CreateDrawingSettings(m_ShaderTagIdList, universalRenderingData, cameraData, lightData, sortingCriteria);
    72.             drawSettings.overrideMaterial = m_MaskMaterial;
    73.             var param = new RendererListParams(universalRenderingData.cullResults, drawSettings, filterSettings);
    74.             passData.RenderersList = renderGraph.CreateRendererList(param);
    75.         }
    76.     }
    77.  
    78.     [SerializeField] private LayerMask m_LayerMask;
    79.     [SerializeField] private Shader m_MaskShader;
    80.  
    81.     private Material m_MaskMaterial;
    82.  
    83.     private CustomRenderPass m_ScriptablePass;
    84.  
    85.     public override void Create()
    86.     {
    87.         if (m_MaskShader != null)
    88.             m_MaskMaterial = CoreUtils.CreateEngineMaterial(m_MaskShader);
    89.  
    90.         m_ScriptablePass = new CustomRenderPass(m_LayerMask, m_MaskMaterial)
    91.         {
    92.             renderPassEvent = RenderPassEvent.AfterRenderingPostProcessing
    93.         };
    94.     }
    95.  
    96.     protected override void Dispose(bool disposing)
    97.     {
    98.         if (m_MaskMaterial != null)
    99.         {
    100.             CoreUtils.Destroy(m_MaskMaterial);
    101.             m_MaskMaterial = null;
    102.         }
    103.     }
    104.  
    105.     public override void AddRenderPasses(ScriptableRenderer renderer, ref RenderingData renderingData)
    106.     {
    107.         renderer.EnqueuePass(m_ScriptablePass);
    108.     }
    109. }
    110.  

    2. My second question concerns the preparation of such RenderLists. This process was somewhat cumbersome in previous versions and has become even more challenging now.
    For example, to prepare a list and render it with a special material, we have to:
    1. Gather various types of data, depending on our task.
    2. Prepare filter settings.
    3. Prepare `ShaderTagIds`.
    4. Prepare drawing settings.
    5. Create `RendererListParams`.
    6. Convert all this into a `RenderListHandle`.

    Code (CSharp):
    1. private void PrepareRenderList(RenderGraph renderGraph, ContextContainer frameData, ref PassData passData)
    2. {
    3.     // Access the relevant frame data from the Universal Render Pipeline
    4.     var universalRenderingData = frameData.Get<UniversalRenderingData>();
    5.     var cameraData = frameData.Get<UniversalCameraData>();
    6.     var lightData = frameData.Get<UniversalLightData>();
    7.     var sortingCriteria = cameraData.defaultOpaqueSortFlags;
    8.     var renderQueueRange = RenderQueueRange.opaque;
    9.     var filterSettings = new FilteringSettings(renderQueueRange, m_LayerMask);
    10.     ShaderTagId[] forwardOnlyShaderTagIds =
    11.     {
    12.         new ShaderTagId("UniversalForwardOnly"),
    13.         new ShaderTagId("UniversalForward"),
    14.         new ShaderTagId("SRPDefaultUnlit"), // Legacy shaders (do not have a gbuffer pass) are considered forward-only for backward compatibility
    15.         new ShaderTagId("LightweightForward") // Legacy shaders (do not have a gbuffer pass) are considered forward-only for backward compatibility
    16.     };
    17.     m_ShaderTagIdList.Clear();
    18.     foreach (ShaderTagId sid in forwardOnlyShaderTagIds)
    19.         m_ShaderTagIdList.Add(sid);
    20.     DrawingSettings drawSettings = RenderingUtils.CreateDrawingSettings(m_ShaderTagIdList, universalRenderingData, cameraData, lightData, sortingCriteria);
    21.     drawSettings.overrideMaterial = m_MaskMaterial;
    22.     var param = new RendererListParams(universalRenderingData.cullResults, drawSettings, filterSettings);
    23.     passData.RenderersList = renderGraph.CreateRendererList(param);
    24. }
    In previous versions, I could accomplish this in a more straightforward (albeit still cumbersome) manner:

    Code (CSharp):
    1. var desc = new RendererListDesc(m_ShaderTagIdList.ToArray(), renderingData.cullResults, renderingData.cameraData.camera)
    2. {
    3.     overrideMaterial = m_MaskRenderObjectMat,
    4.     sortingCriteria = renderingData.cameraData.defaultOpaqueSortFlags,
    5.     renderQueueRange = RenderQueueRange.all,
    6.     layerMask = m_Feature.Settings.LayerMask,
    7. };
    8. var rendererList = context.CreateRendererList(desc);
    So, is there a way to prepare such render lists in a more concise manner, especially in situations when I simply want to render something on a special layer with a special mask?
     
  15. AMoulin

    AMoulin

    Unity Technologies

    Joined:
    Mar 29, 2022
    Posts:
    8
    @wwWwwwW1 The new CreateSkyboxRendererList() function just landed in the RenderGraph API; it will be released in 2023.3.0b10. Thanks for the feedback!
     
    Last edited: Feb 26, 2024
    PaulMDev, wwWwwwW1 and DrViJ like this.
  16. cholleme

    cholleme

    Unity Technologies

    Joined:
    Apr 23, 2019
    Posts:
    31
    @DrViJ
    Thanks for the feedback, it's very valuable for improving the API.

    On (1), yes, this is very unfortunate behavior. If anything goes wrong with the renderer list creation or access, it will currently simply render an empty list. It's easy to detect this case, so we'll update the error handling to return this error to the user instead of simply doing nothing.

    On (2), this is indeed a bit verbose. I agree we should probably have a few helper functions for common use cases, but I still think you'll occasionally have to use the "full" API. It is very powerful, as it's essentially the main "rendering" function we have to trigger any rendering on the scene graph. It's very difficult to predict all use cases. Layers, shader tags, material or shader replacement, sorting, which parameters to set up, ... are all very specific to one particular use case. But I agree it probably makes sense to rethink the helpers we have in the new render graph context; that would mean directly returning RendererListParams instead of separate DrawingSettings, and directly taking the ContextContainer as an argument. This would probably almost halve the code needed to set things up.

    Also note that if you prefer the RendererListDesc, you can still use it. It internally gets translated into a RendererListParams, so it's essentially a slightly higher-level API for the same thing. But RendererListParams is the more powerful of the two and allows you to do everything you could do with the old ScriptableRenderContext.DrawRenderers.
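    Purely as an illustration of that direction, a helper along these lines (the name and parameters are made up for this sketch, not an announced API) would already remove most of the boilerplate for the common case:

    Code (CSharp):
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.Rendering;
    using UnityEngine.Rendering.Universal;

    public static class RendererListUtils
    {
        // Hypothetical convenience wrapper: builds RendererListParams straight from the frame data,
        // so callers don't assemble DrawingSettings/FilteringSettings by hand for the common case.
        public static RendererListParams CreateOpaqueRendererListParams(
            ContextContainer frameData, List<ShaderTagId> shaderTags, LayerMask layerMask, Material overrideMaterial = null)
        {
            var renderingData = frameData.Get<UniversalRenderingData>();
            var cameraData = frameData.Get<UniversalCameraData>();
            var lightData = frameData.Get<UniversalLightData>();

            DrawingSettings drawSettings = RenderingUtils.CreateDrawingSettings(
                shaderTags, renderingData, cameraData, lightData, cameraData.defaultOpaqueSortFlags);
            drawSettings.overrideMaterial = overrideMaterial;

            var filterSettings = new FilteringSettings(RenderQueueRange.opaque, layerMask);
            return new RendererListParams(renderingData.cullResults, drawSettings, filterSettings);
        }
    }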
     
    PaulMDev, ElliotB and DrViJ like this.
  17. rrtt_2323

    rrtt_2323

    Joined:
    Mar 21, 2013
    Posts:
    43
    I found out that this feature is already available in URP 14; will it work there, or do I have to use 2023 for it?
     
  18. AljoshaD

    AljoshaD

    Unity Technologies

    Joined:
    May 27, 2019
    Posts:
    281
    RenderGraph is only supported in 2023.3. Although the code is there, RG is hidden in older versions; it's not tested and we don't backport fixes.
     
    DrViJ likes this.
  19. JackyMooc

    JackyMooc

    Joined:
    May 18, 2022
    Posts:
    4
    I'm on 2023.3.0b8 and it seems like having bloom enabled in a post processing volume stops my UI toolkit documents from rendering.

    Turning off Render Graph (Graphics -> Compatibility Mode) makes the UI show up again.

    Anyone encountering this?
     
  20. AMoulin

    AMoulin

    Unity Technologies

    Joined:
    Mar 29, 2022
    Posts:
    8
    Hi @JackyMooc, we have currently one bug fix related to Screen Space UI + Render Graph that should land soon and that might be related. When you open the Render Graph Viewer, do you see the Screen Space UIToolkit pass at the end of the pipeline?
     
  21. JackyMooc

    JackyMooc

    Joined:
    May 18, 2022
    Posts:
    4
    Glad to hear that there might be a fix coming soon!

    Yeah looks like the Screen Space UI Toolkit pass isn't present
    upload_2024-3-1_8-0-4.png
     
  22. tomekkie2

    tomekkie2

    Joined:
    Jul 6, 2012
    Posts:
    990
    How could I set up a render feature similar to the one in @DrViJ's post,
    i.e. rendering a RenderersList, but to a custom global texture instead of activeColorTexture?

    I wrote the code below based on RendererListRenderFeature and KeepFrameFeature,
    but it gives me a number of errors (
    Render Graph Execution error,
    Exception: Mismatch in Fragment dimensions,
    InvalidOperationException: Trying to use a texture (_MainLightShadowmapTexture) that was already released or not yet created. Make sure you declare it for reading in your pass or you don't read it before it's been written to at least once,

    and so on)
    Code (CSharp):
    1. using System.Collections.Generic;
    2. using UnityEngine;
    3. using UnityEngine.Experimental.Rendering;
    4. using UnityEngine.Rendering;
    5. using UnityEngine.Rendering.Universal;
    6. using UnityEngine.Rendering.RenderGraphModule;
    7.  
    8. public class MaskRenderGraphFeature : ScriptableRendererFeature
    9. {
    10.     class CustomRenderPass : ScriptableRenderPass
    11.     {
    12.         private class PassData
    13.         {
    14.             internal RendererListHandle RenderersList;
    15.         }
    16.  
    17.         private readonly List<ShaderTagId> m_ShaderTagIdList = new();
    18.         private readonly LayerMask m_LayerMask;
    19.         private readonly Material m_MaskMaterial;
    20.  
    21.         RTHandle m_Destination;
    22.  
    23.         public void Setup(RTHandle destination)
    24.         {
    25.             m_Destination = destination;
    26.         }
    27.  
    28.         public CustomRenderPass(LayerMask layerMask, Material maskMaterial)
    29.         {
    30.             m_LayerMask = layerMask;
    31.             m_MaskMaterial = maskMaterial;
    32.         }
    33.  
    34.         static void ExecutePass(PassData data, RasterGraphContext context)
    35.         {
    36.             context.cmd.ClearRenderTarget(true, true, Color.red);
    37.             context.cmd.DrawRendererList(data.RenderersList);
    38.         }
    39.  
    40.         public override void RecordRenderGraph(RenderGraph renderGraph, ContextContainer frameData)
    41.         {
    42.             UniversalCameraData cameraData = frameData.Get<UniversalCameraData>();
    43.  
    44.             if (cameraData.camera.cameraType != CameraType.Game)
    45.                 return;
    46.      
    47.             const string passName = "Custom Render Pass";
    48.             using var builder = renderGraph.AddRasterRenderPass<PassData>(passName, out var passData);
    49.             UniversalResourceData resourceData = frameData.Get<UniversalResourceData>();
    50.             TextureHandle destination = renderGraph.ImportTexture(m_Destination);
    51.             if (!destination.IsValid())
    52.                 return;
    53.  
    54.             PrepareRenderList(renderGraph, frameData, ref passData);
    55.             // Declare the renderer list on the builder so the pass can use it
    56.             builder.UseRendererList(passData.RenderersList);
    57.  
    58.             builder.SetRenderAttachment(destination, 0, AccessFlags.Write);
    59.             builder.SetRenderAttachmentDepth(resourceData.activeDepthTexture);
    60.  
    61.             builder.SetRenderFunc((PassData data, RasterGraphContext context) => ExecutePass(data, context));
    62.         }
    63.  
    64.         private void PrepareRenderList(RenderGraph renderGraph, ContextContainer frameData, ref PassData passData)
    65.         {
    66.             // Access the relevant frame data from the Universal Render Pipeline
    67.             var universalRenderingData = frameData.Get<UniversalRenderingData>();
    68.             var cameraData = frameData.Get<UniversalCameraData>();
    69.             var lightData = frameData.Get<UniversalLightData>();
    70.  
    71.             var sortingCriteria = cameraData.defaultOpaqueSortFlags;
    72.             var renderQueueRange = RenderQueueRange.opaque;
    73.             var filterSettings = new FilteringSettings(renderQueueRange, m_LayerMask);
    74.  
    75.             ShaderTagId[] forwardOnlyShaderTagIds =
    76.             {
    77.                 new("UniversalForwardOnly"),
    78.                 new("UniversalForward"),
    79.                 new("SRPDefaultUnlit"), // Legacy shaders (do not have a gbuffer pass) are considered forward-only for backward compatibility
    80.                 new("LightweightForward") // Legacy shaders (do not have a gbuffer pass) are considered forward-only for backward compatibility
    81.             };
    82.             m_ShaderTagIdList.Clear();
    83.  
    84.             foreach (ShaderTagId sid in forwardOnlyShaderTagIds)
    85.                 m_ShaderTagIdList.Add(sid);
    86.  
    87.             DrawingSettings drawSettings = RenderingUtils.CreateDrawingSettings(m_ShaderTagIdList, universalRenderingData, cameraData, lightData, sortingCriteria);
    88.             drawSettings.overrideMaterial = m_MaskMaterial;
    89.             var param = new RendererListParams(universalRenderingData.cullResults, drawSettings, filterSettings);
    90.             passData.RenderersList = renderGraph.CreateRendererList(param);
    91.         }
    92.     }
    93.  
    94.     [SerializeField] private LayerMask m_LayerMask;
    95.     [SerializeField] private Shader m_MaskShader;
    96.  
    97.     private Material m_MaskMaterial;
    98.  
    99.     private CustomRenderPass m_ScriptablePass;
    100.     RTHandle CustomTextureHandle;
    101.  
    102.     public override void Create()
    103.     {
    104.         if (m_MaskShader != null)
    105.             m_MaskMaterial = CoreUtils.CreateEngineMaterial(m_MaskShader);
    106.  
    107.         m_ScriptablePass = new CustomRenderPass(m_LayerMask, m_MaskMaterial)
    108.         {
    109.             renderPassEvent = RenderPassEvent.AfterRenderingOpaques
    110.         };
    111.     }
    112.  
    113.     protected override void Dispose(bool disposing)
    114.     {
    115.         if (m_MaskMaterial != null)
    116.         {
    117.             CoreUtils.Destroy(m_MaskMaterial);
    118.             m_MaskMaterial = null;
    119.         }
    120.     }
    121.  
    122.     public override void AddRenderPasses(ScriptableRenderer renderer, ref RenderingData renderingData)
    123.     {
    124.         var descriptor = renderingData.cameraData.cameraTargetDescriptor;
    125.         descriptor.msaaSamples = 1;
    126.         descriptor.depthBufferBits = 0;
    127.         descriptor.graphicsFormat = GraphicsFormat.R8G8B8A8_SRGB;
    128.         var textureName = "_CustomTex";
    129.         RenderingUtils.ReAllocateIfNeeded(ref CustomTextureHandle, descriptor, FilterMode.Bilinear, TextureWrapMode.Clamp, name: textureName);
    130.  
    131.         m_ScriptablePass.Setup(CustomTextureHandle);
    132.  
    133.         renderer.EnqueuePass(m_ScriptablePass);
    134.     }
    135. }
     
    Last edited: Mar 6, 2024
    grandw49536 and nasos_333 like this.
  23. tomekkie2

    tomekkie2

    Joined:
    Jul 6, 2012
    Posts:
    990
    After replacing activeDepthTexture with cameraDepthTexture, I have finally managed to get my mask feature to work with RenderGraph.
    It looks like my custom texture and activeDepthTexture conflict with each other (possibly due to the "Mismatch in Fragment dimensions" exception), but it fortunately tolerates cameraDepthTexture.
    This is the camera view:
    Image Sequence_001_0000.jpg

    and the corresponding custom mask texture:
    Image Sequence_003_0000.jpg
     
    Last edited: Mar 6, 2024
    AljoshaD and oliverschnabel like this.
  24. oliverschnabel

    oliverschnabel

    Unity Technologies

    Joined:
    Mar 13, 2018
    Posts:
    49
    The Unity Editor 2023.3.0b10 is now available. It contains the URP Render Graph Samples: "A Collection of scripts with some examples of RenderGraph and how it is used within the Universal Render Pipeline."

    You can find these through the Package Manager under the Universal RP Samples tab:
    upload_2024-3-8_14-5-5.png

    These are code-only examples with comments to explain the API usage. Some of them have already been shared in this thread.
    upload_2024-3-8_14-10-50.png

    The URP 3D Sample has also been upgraded to Render Graph recently, and can be downloaded through the hub. The Renderer Features in this demo support both Render Graph, as well as the Compatibility Mode (Render Graph Disabled).

    Here are other Render Graph related items from the release notes:

    New 2023.3.0b10 Entries since 2023.3.0b5

    Fixes
    • SRP Core: Added CreateSkyboxRendererList in Render Graph API. (UUM-60100) First seen in 2023.3.0a1.
    • SRP Core: Rendering Debugger - Fixed Render Graph Debug Display Reset behaviour. (UUM-62760) First seen in 2023.3.0b3.
    • Graphics: Fixed an issue where it was possible for ReadPixels to crash on Metal API while using render graph due to a bad state with the depth target. (UUM-44404)
    Improvements
    • SRP Core: Improved BeginRenderPass CPU performance in the Native Render Pass Render Graph (URP).
    • SRP Core: Improved execution performance with Render Graph.
    • SRP Core: Small improvements for the resource pooling system in Render Graph.
    • SRP Core: Improving and unifying render graph profiling markers.
    • SRP Core: Small optimization, frame allocation checks of Render Graph resource pool
     
    echu33, PolyCrusher, Kirsche and 8 others like this.
  25. tomekkie2

    tomekkie2

    Joined:
    Jul 6, 2012
    Posts:
    990
    I need to use the camera depth texture.

    If the depth texture is specified in inspector settings:

    camera > rendering > depth texture
    or
    universal render pipeline asset > rendering > depth texture​

    its format is D32_SFloat_S8_UInt.

    If I try to ensure depth availability, independent of the above settings, by using:
    m_ScriptablePass.ConfigureInput(ScriptableRenderPassInput.Depth);

    its format changes to: R32_SFloat

    and then I am getting this error:
    InvalidOperationException: Trying to SetRenderAttachmentDepth on a texture that has a color format R32_SFloat. Use a texture with a depth format instead. (pass 'Mask Pass' resource '_CameraDepthTexture')

    Is this an expected behavior or am I doing anything wrong?

    In URP 16.0 (or compatibility mode) the depth format was not affected by ConfigureInput.
     
    Last edited: Mar 10, 2024
  26. oliverschnabel

    oliverschnabel

    Unity Technologies

    Joined:
    Mar 13, 2018
    Posts:
    49
    I don't think that this is expected. To make sure the depth texture is generated, you either have to enable the "Depth Texture" option, or you need to set "ScriptableRenderPassInput.Depth" as an input for your custom pass (ConfigureInput(ScriptableRenderPassInput.Depth)). So what you are doing should be correct.

    Where do you call ConfigureInput? If you do it in RecordRenderGraph, it might be error-prone. It is recommended to call it inside the constructor of the ScriptableRenderPass.
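    For example (minimal sketch; the pass name is a placeholder and the actual recording is elided):

    Code (CSharp):
    using UnityEngine.Rendering;
    using UnityEngine.Rendering.Universal;
    using UnityEngine.Rendering.RenderGraphModule;

    class DepthSamplingPass : ScriptableRenderPass
    {
        public DepthSamplingPass()
        {
            renderPassEvent = RenderPassEvent.AfterRenderingOpaques;
            // Request the depth texture once, at construction time,
            // instead of calling ConfigureInput every frame from RecordRenderGraph.
            ConfigureInput(ScriptableRenderPassInput.Depth);
        }

        public override void RecordRenderGraph(RenderGraph renderGraph, ContextContainer frameData)
        {
            // Pass recording elided; this sketch only shows where ConfigureInput belongs.
        }
    }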

    Or do you have multiple ConfigureInput calls in your ScriptableRenderPass? If you want to combine them, you would need to do something like this:

    Code (CSharp):
    1. public ScriptableRenderPassInput requirements = ScriptableRenderPassInput.Depth | ScriptableRenderPassInput.Color | ScriptableRenderPassInput.Normal;
    2. ...
    3. m_pass.ConfigureInput(requirements);
    But I am not sure if that is the issue here.
     
    tomekkie2 likes this.
  27. nasos_333

    nasos_333

    Joined:
    Feb 13, 2013
    Posts:
    13,538
    Hi,

    Is it generally OK if we use a second scene camera to render to a depth-only render texture and pass it as a global shader variable for use in shaders?

    Thanks
     
  28. oliverschnabel

    oliverschnabel

    Unity Technologies

    Joined:
    Mar 13, 2018
    Posts:
    49
    I just found out about another limitation on depth that might be happening in your case: in some scenarios (e.g. the deferred renderer) we use a color format (R32) to store the cameraDepthTexture from the CopyDepthPass. This is done because on Metal we cannot do framebuffer fetch with depth textures, so we store the GBuffer depth as a color attachment so that the deferred lighting pass can access it while staying on tile.

    Maybe a breakdown of the differences between UniversalResourceData.cameraDepth, UniversalResourceData.cameraDepthTexture, and UniversalResourceData.activeDepthTexture could also help you when working with depth:
    • cameraDepthTexture: This is a copy of the depth attachment, generated either in a depth prepass or in a copy depth pass. It is already resolved, so no MSAA resolve needs to be done manually. It's recommended to avoid manual resolves in URP, as the NRP backend typically handles this automatically using hardware depth resolve when possible. Prefer using cameraDepthTexture rather than binding the actual depth attachment yourself.
    • cameraDepth: This represents the offscreen depth attachment, which is the main URP depth target until just before the final blit. After the final blit, the depth target becomes the backbuffer (swapchain) depth.
    • activeDepthTexture: This is a pointer to the current active depth attachment. It targets either the offscreen depth (cameraDepth) or the backbuffer depth, depending on the frame setup.
    Given these options, if your goal is to read the current depth texture in your custom pass, you may consider using UniversalResourceData.activeDepthTexture most of the time. This ensures compatibility with various frame setups and allows your feature to work seamlessly regardless of the rendering configuration.
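    In RecordRenderGraph that would look roughly like this (fragment only, assuming the usual PassData setup from the earlier examples in this thread):

    Code (CSharp):
    UniversalResourceData resourceData = frameData.Get<UniversalResourceData>();

    using (var builder = renderGraph.AddRasterRenderPass<PassData>("Sample Depth", out var passData))
    {
        // Declare a read dependency on the active depth so Render Graph keeps the resource
        // alive and schedules this pass after the passes that wrote it.
        builder.UseTexture(resourceData.activeDepthTexture, AccessFlags.Read);
        builder.SetRenderAttachment(resourceData.activeColorTexture, 0);
        builder.SetRenderFunc((PassData data, RasterGraphContext context) =>
        {
            // Draw something here that samples the depth.
        });
    }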
     
  29. JackyMooc

    JackyMooc

    Joined:
    May 18, 2022
    Posts:
    4
    Last edited: Mar 11, 2024
  30. ElliotB

    ElliotB

    Joined:
    Aug 11, 2013
    Posts:
    297
    What should we do if we want to write the depth texture, e.g. if we have post-processing that changes the scene depth before transparents? Also, is the above really for RenderGraph? If so, the API seems increasingly convoluted.

    Cheers!
     
  31. cholleme

    cholleme

    Unity Technologies

    Joined:
    Apr 23, 2019
    Posts:
    31
    @tomekkie2 so your original "Mismatch in Fragment dimensions" problem is that you mix MSAA and non-MSAA buffers for color (1 sample) and depth (4 samples in my test), which is not possible (the error message will be updated to log this so you don't have to dig in the debugger for it). But why is this happening?

    If URP is using MSAA, then resourceData.activeDepthTexture is a multisampled buffer, but in your code you explicitly override the MSAA setting:
    Code (CSharp):
    1.         var descriptor = renderingData.cameraData.cameraTargetDescriptor;
    2.         descriptor.msaaSamples = 1;
    So you need to take cameraTargetDescriptor and not override the MSAA setting if you want to use it with the URP depth buffer.
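    In other words, the allocation would look something like this (only the removed msaaSamples override differs from the earlier snippet):

    Code (CSharp):
    var descriptor = renderingData.cameraData.cameraTargetDescriptor;
    // Keep descriptor.msaaSamples untouched so the mask texture has the same sample count
    // as URP's depth attachment; only strip the depth bits for a color-only mask.
    descriptor.depthBufferBits = 0;
    descriptor.graphicsFormat = GraphicsFormat.R8G8B8A8_SRGB;
    RenderingUtils.ReAllocateIfNeeded(ref CustomTextureHandle, descriptor, FilterMode.Bilinear, TextureWrapMode.Clamp, name: "_CustomTex");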

    The correct fix depends on what the render feature wants to achieve. But most likely you'll want your mask to be generated with MSAA if URP is using MSAA; then later, when you use your texture in a shader (and assuming the texture doesn't have bindMS set), Unity's automatic MSAA resolve handling will kick in and you'll be able to use the mask texture as a regular 2D texture. This avoids you having to do too much custom MSAA handling.
    However, if your mask is very particular about its MSAA behaviour, you'll have to generate a multisampled mask to a bindMS=true texture first, then in a second pass do a custom resolve that is appropriate for the sort of mask you need. (This is not really RenderGraph related; it's mostly about supporting MSAA in your effect.)

    The other depth buffer you tried to use, the one available through the settings or Require calls, is a copy meant for reading through the "_CameraDepthTexture" global in the shader to apply depth effects, not really for doing depth testing against. This should probably be cleaned up and documented in more detail, though.
     
    tomekkie2 likes this.
  32. oliverschnabel

    oliverschnabel

    Unity Technologies

    Joined:
    Mar 13, 2018
    Posts:
    49
    CreateRendererList can be used with RendererListParams and RendererListDesc. The first one accepts RenderStateBlock as a NativeArray in stateBlocks, and expects a similarly sized NativeArray of ShaderTagId for tagValues if you assign the stateBlocks.

    The latter has a stateBlock argument. The use of these should be similar to the original API.

    This is an (arbitrary) example of CreateRendererList with RenderStateBlock here:
    Code (CSharp):
    1. private void InitRendererLists(ContextContainer frameData, ref PassData passData, RenderGraph renderGraph)
    2.         {
    3.             // Access the relevant frame data from the Universal Render Pipeline
    4.             UniversalRenderingData universalRenderingData = frameData.Get<UniversalRenderingData>();
    5.             UniversalCameraData cameraData = frameData.Get<UniversalCameraData>();
    6.             UniversalLightData lightData = frameData.Get<UniversalLightData>();
    7.          
    8.             var sortFlags = cameraData.defaultOpaqueSortFlags;
    9.             RenderQueueRange renderQueueRange = RenderQueueRange.opaque;
    10.             FilteringSettings filterSettings = new FilteringSettings(renderQueueRange, m_LayerMask);
    11.          
    12.             ShaderTagId[] forwardOnlyShaderTagIds = new ShaderTagId[]
    13.             {
    14.                 new ShaderTagId("UniversalForwardOnly"),
    15.                 new ShaderTagId("UniversalForward"),
    16.                 new ShaderTagId("SRPDefaultUnlit"), // Legacy shaders (do not have a gbuffer pass) are considered forward-only for backward compatibility
    17.                 new ShaderTagId("LightweightForward") // Legacy shaders (do not have a gbuffer pass) are considered forward-only for backward compatibility
    18.             };
    19.          
    20.             m_ShaderTagIdList.Clear();
    21.             RenderStateBlock[] renderStateBlocks = new RenderStateBlock[forwardOnlyShaderTagIds.Length];
    22.  
    23.             RenderStateBlock customRenderStateBlock = new RenderStateBlock(RenderStateMask.Stencil)
    24.             {
    25.                 stencilState = new StencilState(true, passOperation: StencilOp.Invert),
    26.                 stencilReference = 255
    27.             };
    28.             int index = 0;
    29.             foreach (ShaderTagId sid in forwardOnlyShaderTagIds) {
    30.                 m_ShaderTagIdList.Add(sid);
    31.                 renderStateBlocks[index++] = customRenderStateBlock;
    32.             }
    33.              
    34.          
    35.             DrawingSettings drawSettings = RenderingUtils.CreateDrawingSettings(m_ShaderTagIdList, universalRenderingData, cameraData, lightData, sortFlags);
    36.  
    37.             var param = new RendererListParams(universalRenderingData.cullResults, drawSettings, filterSettings);
    38.  
    39.             param.tagValues = new NativeArray<ShaderTagId>(m_ShaderTagIdList.ToArray(), Allocator.Temp);
    40.             param.stateBlocks = new NativeArray<RenderStateBlock>(renderStateBlocks, Allocator.Temp);
    41.             passData.rendererListHandle = renderGraph.CreateRendererList(param);
    42.         }

    This is one of the core utilities we are working on to cover common DrawRenderers use cases, as a way to quickly add a pass that draws renderers, similar to ScriptableRenderContext.DrawRenderers.

    With RenderGraph, only the BlitTexture overloads that do not set a destination are available, so you have to set the depth+stencil buffer yourself using:

    Code (CSharp):
    1. builder.SetRenderAttachmentDepth(...)
    Then in the execute function of the pass you can modify the stencil just like you could before (e.g. using the ShaderLab Stencil command: https://docs.unity3d.com/2022.1/Documentation/Manual/SL-Stencil.html). Blitter isn't really involved in this; in the end it just draws a full-screen triangle.


    Maybe it was a little misleading. I just wanted to highlight the different depth buffers you find in UniversalResourceData:
    upload_2024-3-12_8-41-44.png

    This is how to set the depth as render attachment with write flags in the RecordRenderGraph function:

    Code (CSharp):
    1.  
    2. builder.SetRenderAttachment(resourceData.activeColorTexture, 0);
    3. builder.SetRenderAttachmentDepth(resourceData.activeDepthTexture, AccessFlags.Write);
    4.  
    In the Create function it is then defined at which renderPassEvent the custom pass is executed:

    Code (CSharp):
    1.  
    2. public override void Create()
    3.     {
    4.         m_ScriptablePass = new RendererListPass(m_LayerMask);
    5.  
    6.         // Configures where the render pass should be injected.
    7.         m_ScriptablePass.renderPassEvent = RenderPassEvent.BeforeRenderingTransparents;
    8.     }
    9.  
    There is also backBufferDepth in the resourceData. URP needs backBufferColor/Depth and cameraColor/Depth; the typical frame renders to the offscreen textures first, then does a final blit to the backbuffer.

    no target texture? backbuffer = swapchain
    target texture? backbuffer = the output texture

    Probably we should have exposed only the active texture, which is just a pointer to one of the two (e.g. cameraDepth or backBufferDepth). That should be enough, and cameraDepth/backBufferDepth would be internal only, because you would not want to target the backbuffer before the final blit, since it will be overwritten anyway.
     
    JackyMooc, ElliotB and tomekkie2 like this.
  33. tomekkie2

    tomekkie2

    Joined:
    Jul 6, 2012
    Posts:
    990
    I have tried to convert a very simple blit feature, i.e. TintFeature.

    After my conversion it works in both compatibility mode and RenderGraph mode.

    It uses a very simple TintBlit shader.

    My discovery is that in compatibility mode the camera opaque texture is available as _CameraColorTexture, while in RenderGraph mode it is not.
    Instead, it is available as _CameraOpaqueTexture.
    This means that in order to get the old feature to work, I had to restore the _CameraColorTexture availability, so I added this statement:
    Shader.SetGlobalTexture("_CameraColorTexture", source);


    Code (CSharp):
    1. Shader "Custom/TintBlit"
    2. {
    3.         Properties
    4.     {
    5.         [MainColor] _Color("Color", Color) = (1,0,1,1)
    6.     }
    7.  
    8.     SubShader
    9.     {
    10.         Tags { "RenderType"="Opaque" "RenderPipeline" = "UniversalPipeline"}
    11.         LOD 100
    12.         ZWrite Off Cull Off
    13.         Pass
    14.         {
    15.             Name "TintBlitPass"
    16.  
    17.             HLSLPROGRAM
    18.             #include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl"
    19.             // The Blit.hlsl file provides the vertex shader (Vert),
    20.             // input structure (Attributes) and output structure (Varyings)
    21.             #include "Packages/com.unity.render-pipelines.core/Runtime/Utilities/Blit.hlsl"
    22.  
    23.             #pragma vertex Vert
    24.             #pragma fragment frag
    25.  
    26.             TEXTURE2D(_CameraColorTexture);
    27.             SAMPLER(sampler_CameraColorTexture);
    28.  
    29.             float4 _Color;
    30.  
    31.             half4 frag (Varyings input) : SV_Target
    32.             {
    33.                 float4 color = SAMPLE_TEXTURE2D(_CameraColorTexture, sampler_CameraColorTexture, input.texcoord);
    34.                 return color * _Color;
    35.             }
    36.             ENDHLSL
    37.         }
    38.     }
    39. }
    Code (CSharp):
    1. using UnityEngine;
    2. using UnityEngine.Rendering;
    3. using UnityEngine.Rendering.Universal;
    4. using UnityEngine.Rendering.RenderGraphModule;
    5.  
    6. internal class TintFeature : ScriptableRendererFeature
    7. {
    8.     public Shader m_Shader;
    9.     public Color m_Color;
    10.  
    11.     Material m_Material;
    12.  
    13.     TintPass m_RenderPass = null;
    14.  
    15.     public override void AddRenderPasses(ScriptableRenderer renderer,
    16.                                     ref RenderingData renderingData)
    17.     {
    18.         if (renderingData.cameraData.cameraType == CameraType.Game)
    19.         {
    20.             m_RenderPass.ConfigureInput(ScriptableRenderPassInput.Color);
    21.  
    22.             m_RenderPass.Setup(renderer.cameraColorTargetHandle, m_Color);
    23.             renderer.EnqueuePass(m_RenderPass);
    24.         }
    25.     }
    26.  
    27.  
    28.     public override void Create()
    29.     {
    30.         m_Material = CoreUtils.CreateEngineMaterial(m_Shader);
    31.         m_RenderPass = new TintPass(m_Material);
    32.    
    33.     }
    34.  
    35.     protected override void Dispose(bool disposing)
    36.     {
    37.         CoreUtils.Destroy(m_Material);
    38.     }
    39. }
    40.  
    41. internal class TintPass : ScriptableRenderPass
    42. {
    43.     ProfilingSampler m_ProfilingSampler = new ProfilingSampler("TintBlit");
    44.     Material m_Material;
    45.     RTHandle m_CameraColorTarget;
    46.     Color m_Color;
    47.  
    48.     public TintPass(Material material)
    49.     {
    50.         m_Material = material;
    51.         renderPassEvent = RenderPassEvent.BeforeRenderingPostProcessing;
    52.     }
    53.  
    54.     public void Setup(RTHandle colorHandle, Color color)
    55.     {
    56.         m_CameraColorTarget = colorHandle;
    57.         m_Color = color;
    58.     }
    59.  
    60.     public override void Execute(ScriptableRenderContext context, ref RenderingData renderingData)
    61.     {
    62.         var cameraData = renderingData.cameraData;
    63.         if (cameraData.camera.cameraType != CameraType.Game)
    64.             return;
    65.  
    66.         if (m_Material == null)
    67.             return;
    68.  
    69.         CommandBuffer cmd = CommandBufferPool.Get();
    70.         using (new ProfilingScope(cmd, m_ProfilingSampler))
    71.         {
    72.             m_Material.SetColor("_Color", m_Color);
    73.             Blitter.BlitCameraTexture(cmd, m_CameraColorTarget, m_CameraColorTarget, m_Material, 0);
    74.         }
    75.         context.ExecuteCommandBuffer(cmd);
    76.         cmd.Clear();
    77.  
    78.         CommandBufferPool.Release(cmd);
    79.     }
    80.     class PassData
    81.     {
    82.         public TextureHandle source;
    83.         public Material material;
    84.         public Color color;
    85.     }
    86.     static void ExecutePass(RasterCommandBuffer cmd, RTHandle source, Material material, Color color)
    87.     {
    88.         if (material == null)
    89.             return;
    90.         Shader.SetGlobalTexture("_CameraColorTexture", source);
    91.         material.SetColor("_Color", color);
    92.         Blitter.BlitTexture(cmd, source, new Vector4(1, 1, 0, 0), material, 0);
    93.     }
    94.     public override void RecordRenderGraph(RenderGraph renderGraph, ContextContainer frameData)
    95.     {
    96.         UniversalResourceData resourceData = frameData.Get<UniversalResourceData>();
    97.         UniversalCameraData cameraData = frameData.Get<UniversalCameraData>();
    98.  
    99.         if (cameraData.camera.cameraType != CameraType.Game)
    100.             return;
    101.  
    102.         using (var builder = renderGraph.AddRasterRenderPass<PassData>("TintBlit", out var passData))
    103.         {
    104.             TextureHandle source = resourceData.cameraOpaqueTexture;
    105.  
    106.             // When using the RenderGraph API the lifetime and ownership of resources is managed by the render graph system itself.
    107.             // This allows for optimal resource usage and other optimizations to be done automatically for the user.
    108.             // In the cases where resources must persist across frames, between different cameras or when users want
    109.             // to manage their lifetimes themselves, the resources must be imported when recording the render pass.
    110.             //TextureHandle destination = renderGraph.ImportTexture(m_Destination);
    111.  
    112.             if (!source.IsValid())
    113.                 return;
    114.  
    115.             passData.source = source;
    116.             passData.material = m_Material;
    117.             passData.color = m_Color;
    118.             builder.UseTexture(source, AccessFlags.Read);
    119.             builder.SetRenderAttachment(resourceData.activeColorTexture, 0, AccessFlags.Write);
    120.  
    121.             builder.SetRenderFunc((PassData data, RasterGraphContext context) =>
    122.             {
    123.                 ExecutePass(context.cmd, source, m_Material, m_Color);
    124.             });
    125.         }
    126.     }
    127. }
     
    Last edited: Mar 13, 2024
  34. ElliotB

    ElliotB

    Joined:
    Aug 11, 2013
    Posts:
    297
    Is there a defined symbol that we can check in HLSL, e.g. '#if URP_COMPATIBILITY_MODE'? This is probably the bare minimum requirement for allowing shaders to support both modes if there are changes to any part of the HLSL URP API.

    Separately, do you know when 2023.3 is going to release and turn RG on by default? There is now a Unity Asset Store sale in mid-April, so I expect a lot of Asset Store creators are going to get a nasty surprise if this drops in the middle of the sale (URP breaking changes have been released in the middle of a major sale before, Dec 2021 IIRC, with an immediate wave of negative reviews).
     
  35. nasos_333

    nasos_333

    Joined:
    Feb 13, 2013
    Posts:
    13,538
    I can't believe they are actually going to release this in April; it is not even a year away from being something concrete and optimized. The coding alone will take years to become usable in a normal way; right now it takes something like 5-10x more code to get the same results.

    Also, releasing it and breaking all existing projects will create chaos of unbelievable proportions.

    I just hope the management of the coders sees the issue and stops this April release in time, before the ultimate disaster that is going to happen.

    Even if everything were perfect, I would need at the very least another 6 months to properly port all my systems to the graph; that would be the bare minimum, and only if all my time went to that alone.

    April is just a few days away, so if this goes forward I will have to put a huge disclaimer in all assets saying that Unity deliberately broke the image effects backend, that the assets do not support this breaking of the system, and that there will be a new version later specifically for the new Unity platform, which is not URP but URP RG.

    Trying to rush out something half-ready with a million issues just to cover for this crazy situation will only create a vastly worse and more chaotic issue in the assets, as trying to make them run in this environment will lead to a trillion bugs for users, some now and many more later, since there will be huge breaking changes until it is completed a few years from now.
     
    kripto289, Trigve and Neonage like this.
  36. tomekkie2

    tomekkie2

    Joined:
    Jul 6, 2012
    Posts:
    990
    Thanks @cholleme.
    I just rewrote this:

    Code (CSharp):
    1.         var descriptor = renderingData.cameraData.cameraTargetDescriptor;
    2.         descriptor.msaaSamples = 1;
    - from the previous version of the script - without realizing what it meant.
    After removing it, all the depth textures seem to work as expected.
     
  37. scottjdaley

    scottjdaley

    Joined:
    Aug 1, 2013
    Posts:
    163
    I just spent about 4 full days converting all of my render features in my project. There was quite a big learning curve and I made a lot of stumbles along the way, but I definitely feel like I have a better grasp on how the rendering stack works with the new API. With the old API, I was never quite sure how to set up all the texture handles correctly and had to rely on reading through the internal render features.

    Here are some of my thoughts:

    1. Creating RenderLists with stencil settings is really awkward, both with the old and new API.
    While this works, it is very verbose if you just want a single stencil setting applied to the whole render list. In your example, you set a stencil state for each forward shader tag. But in all the internal render features I've seen, they set a single stencil state and a single ShaderTagId.none. I'm not totally sure how this works, but I assume that if the tag is set to none, it applies the blend and stencil to all of the tags in the render list? If so, why can't we just pass an empty array of shader tags? It always complains if the shader tag array and state block array have different lengths.

    There is also this handy internal method that most of the internal render features use. I just copied this code into my project so that I could do the same thing, but I would love to see it added to the public API:
    Code (CSharp):
    1.         static ShaderTagId[] s_ShaderTagValues = new ShaderTagId[1];
    2.         static RenderStateBlock[] s_RenderStateBlocks = new RenderStateBlock[1];
    3.         // Create a RendererList using a RenderStateBlock override is quite common so we have this optimized utility function for it
    4.         internal static void CreateRendererListWithRenderStateBlock(RenderGraph renderGraph, ref CullingResults cullResults, DrawingSettings ds, FilteringSettings fs, RenderStateBlock rsb, ref RendererListHandle rl)
    5.         {
    6.             s_ShaderTagValues[0] = ShaderTagId.none;
    7.             s_RenderStateBlocks[0] = rsb;
    8.             NativeArray<ShaderTagId> tagValues = new NativeArray<ShaderTagId>(s_ShaderTagValues, Allocator.Temp);
    9.             NativeArray<RenderStateBlock> stateBlocks = new NativeArray<RenderStateBlock>(s_RenderStateBlocks, Allocator.Temp);
    10.             var param = new RendererListParams(cullResults, ds, fs)
    11.             {
    12.                 tagValues = tagValues,
    13.                 stateBlocks = stateBlocks,
    14.                 isPassTagName = false
    15.             };
    16.             rl = renderGraph.CreateRendererList(param);
    17.         }
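    A typical call site for that copied helper might look roughly like this (fragment, assuming the same frame data and fields as the earlier examples in this thread; the stencil values are arbitrary):

    Code (CSharp):
    var stencilBlock = new RenderStateBlock(RenderStateMask.Stencil)
    {
        stencilState = new StencilState(true, passOperation: StencilOp.Replace),
        stencilReference = 1
    };

    DrawingSettings drawSettings = RenderingUtils.CreateDrawingSettings(
        m_ShaderTagIdList, universalRenderingData, cameraData, lightData, cameraData.defaultOpaqueSortFlags);
    FilteringSettings filterSettings = new FilteringSettings(RenderQueueRange.opaque, m_LayerMask);

    // The helper takes the culling results by ref, so copy them to a local first.
    CullingResults cullResults = universalRenderingData.cullResults;
    CreateRendererListWithRenderStateBlock(renderGraph, ref cullResults, drawSettings, filterSettings, stencilBlock, ref passData.rendererListHandle);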
    Personally, I think the whole render list creation could be improved a lot. Perhaps there could be some kind of builder API which sets reasonable defaults for everything, but has methods to override things when you want them. Maybe something like:

    Code (CSharp):
    1. RenderListBuilder builder = new RenderListBuilder();
    2. builder.SetSortingCriteria(/* ... */);
    3. builder.SetLayerMask(/* ... */);
    4. builder.SetRenderLayerMask(/* ... */);
    5. builder.SetRenderQueueRange(/* ... */);
    6. builder.SetBlendState(/* ... */);
    7. builder.SetStencilState(/* ... */);
    8. builder.SetRenderStateMapping(/* ... */); // if you want different stencil or blend per forward shader tag
    9.  
    10. RenderListHandle renderList = builder.Build();
    The advantage of this kind of code is that it flattens all of the millions of different parameters in this API, making it a lot more discoverable and easy to use. Right now, you have to remember where each setting is located (DrawingSettings, FilteringSettings, RenderStateBlock, etc.). Anyways, it sounds like this is already being worked on, so I'm excited to see some improvements there.

    2. I love the new API for declaring input and output textures. Much more intuitive than the old system. However, there were several times where I forgot to declare an input texture. From what I can tell, nothing breaks and the shaders can still read those global textures (assuming the creation of those textures hasn't been culled). I'm not sure if it is possible, but I wish RenderGraph would tell me when I'm forgetting to declare an input if the shader does in fact read from that texture. Otherwise, things can get culled unexpectedly. (However, the RenderGraphViewer does help identify these mistakes!)

    3. One of the questions I often ran into with the old API was whether to make a new ScriptableRenderPass for each shader pass, or if I should just do multiple Blits and SetRenderTarget calls in a single ScriptableRenderPass. With the new API, I can no longer have multiple SetRenderTarget calls in a single RasterRenderPass, but I can still add multiple RasterRenderPasses to a ScriptableRenderPass. So I still have the same question. What is considered the best practice?

    At first I tried to make separate ScriptableRenderPasses for each shader pass and used a custom ContextItem to pass data between the passes. This worked, but it actually seems easier to just have a single ScriptableRenderPass and share the data using a private variable instead. In fact, that seems to be what most of the internal render features are doing.

    There is also the possibility of using an UnsafePass which lets you call SetRenderTarget multiple times. I assumed this was only there to help with the transition of old render features to RenderGraph, but then I saw this comment in the samples:
    I have some render features that would seem to meet these criteria. For example, in my outline feature, I'm doing a bunch of ping-pong blits back and forth between the same two textures. For those cases, should I be using an UnsafePass? (I also just discovered the framebuffer fetch technique, which could be even better.)

    4. Are there any best practices about when to access the ContextContainer and when to create new TextureHandles? A lot of the examples access the ContextContainer inside the AddRasterRenderPass scope, but from what I can tell, this could be moved out of that scope, above the AddRasterRenderPass call. Personally, I think it would be cleaner if the contents of the AddRasterRenderPass scope only contained method calls to the builder.

    I think what makes this awkward is that AddRasterRenderPass returns the PassData as an output parameter. That means you can only populate it after calling AddRasterRenderPass, which explains why the examples do their render texture setup inside this scope. But perhaps there could be an overload of AddRasterRenderPass that takes the PassData as an input parameter instead?

    5. Calling material.SetVector in one render pass seems to have an effect on previously scheduled render passes.

    For my outline render feature, I needed to blit the same material multiple times with different values for that material property. In each render pass, I called material.SetVector with the new value. But for some reason, all shader passes were using the same value (whatever I set last). I confirmed this with the FrameDebugger. Not sure if this behavior is expected or not, but it definitely tripped me up and took a while to debug.

    One solution was to set AllowGlobalStateModification(true), but that forces a sync point. So the only other solution I found was to pass a material property block while performing the blit. But unfortunately, the Blitter API has no way to pass material properties or a MaterialPropertyBlock to Blitter.BlitTexture. Instead, you have to draw the fullscreen triangle manually with a DrawProcedural call or use CoreUtils.DrawFullScreen after setting the Blit texture and scale bias yourself in a material property block. I would like to see Blitter get an overload for passing your own MaterialPropertyBlock or some other way to pass material properties.
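    In case it helps anyone, here's roughly the workaround I ended up with. This is a sketch rather than drop-in code: it assumes the shader uses the core Blit.hlsl fullscreen-triangle vertex path (so it samples _BlitTexture/_BlitScaleBias), and PassData/_OutlineParams are placeholders for my own names:

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Rendering;
    using UnityEngine.Rendering.RenderGraphModule;

    class PassData
    {
        public TextureHandle source;
        public Vector4 outlineParams;
        public Material material;
    }

    // Per-blit material properties via a MaterialPropertyBlock, drawing the
    // fullscreen triangle manually instead of going through Blitter.BlitTexture.
    static void ExecutePass(PassData data, RasterGraphContext context)
    {
        // Cache this block somewhere persistent in real code to avoid per-frame garbage.
        var props = new MaterialPropertyBlock();
        props.SetTexture("_BlitTexture", (RTHandle)data.source); // TextureHandle -> RTHandle is valid at execute time
        props.SetVector("_BlitScaleBias", new Vector4(1f, 1f, 0f, 0f));
        props.SetVector("_OutlineParams", data.outlineParams);   // the per-pass value I used to set with material.SetVector

        // Fullscreen triangle; shader pass 0 is a placeholder.
        context.cmd.DrawProcedural(Matrix4x4.identity, data.material, 0, MeshTopology.Triangles, 3, 1, props);
    }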

    Apologies for the giant wall of text! Overall, I find the new API much much easier to work with. Thanks for all the hard work!
     
    tomekkie2 and AljoshaD like this.
  38. tomekkie2

    tomekkie2

    Joined:
    Jul 6, 2012
    Posts:
    990
    I just couldn't get it to work this way or using RendererListDesc, but only using:
    Code (CSharp):
    1.             NativeArray<ShaderTagId> tagValues = new NativeArray<ShaderTagId>(new ShaderTagId[1] { ShaderTagId.none }, Allocator.Temp);
    2.             NativeArray<RenderStateBlock> stateBlocks = new NativeArray<RenderStateBlock>(new RenderStateBlock[1]{ _renderStateBlock }, Allocator.Temp);
    3.             param.tagValues = tagValues;
    4.             param.stateBlocks = stateBlocks;
     
  39. ElliotB

    ElliotB

    Joined:
    Aug 11, 2013
    Posts:
    297
    Hi URP team,

    I've spent the weekend porting my package to support RG. I'm a fan of the changes, although it was a serious chore to port everything over with backwards compatibility in mind! Nevertheless, I was able to get it working in a weekend (~10 passes, including custom blits and object draws and fairly deep integration with URP). A key requirement for me is that my package needs to support versions earlier than 2023.3, ideally as far back as 2021. Here are some thoughts and feedback:
    • Release is very polished and I think the API is in a pretty good place. Everything worked more or less as expected.
    • The RG API allows a lot more control over the pipeline which is fantastic. I think _even more control_ could be given with very little work, by moving more of the pipeline configuration into ContextItems.
    • The tooling is good, the render graph analyser is very helpful.
    • The documentation is a bit rough at the moment and is probably going to undermine the release.
    • ContextItem workflow was great once I got the hang of it. Feels very data-oriented.
    • There's also a question about whether a major API change to URP is a good idea - it's probably going to dent adoption in the short term - but that's all in the past now.

    1. Sharing non-RG and RG execution across previous editor versions - ExecutePass.

    The examples and URP code present a way to achieve backwards compatibility through 'compatibility mode'; this requires moving the functionality into a static method (usually called ExecutePass), to which you hand the required data, often bundled up into a nice PassData struct. Both the RG and non-RG paths can then reuse the execution code, each preparing the PassData struct in its own way.

    While this works nicely if you only want to support 2023.3, it becomes tedious if you want to support something earlier due to the incompatibility of types used in new and old engine versions, e.g. TextureHandle/RTHandle, RendererList/RendererListHandle, CommandBuffer/RasterCommandBuffer. There are a lot of implicit conversions between old and new types, but these are often context sensitive. For example, TextureHandle can implicitly convert to RTHandle, but only during the execution stage. As a result, the best approach I can come up with is to unpack the PassData struct into old types which are fed into the execution function. It's very boilerplate heavy:

    PassData:
    Code (CSharp):
    1.         public class PassData
    2.         {
    3.             public RendererListHandle metadataObjectsToDraw;
    4.             public ProPixelizerSettings settings;
    5.             public bool previewCamera;
    6.             public Matrix4x4 projection;
    7.             public Matrix4x4 view;
    8.         }
    RG:
    Code (CSharp):
    1.                 builder.SetRenderFunc((PassData passData, RasterGraphContext context) =>
    2.                 {
    3.                     ExecutePass(context.cmd,
    4.                         passData.metadataObjectsToDraw,
    5.                         ref passData.settings,
    6.                         passData.previewCamera,
    7.                         passData.projection,
    8.                         passData.view
    9.                         );
    10.                 });
    non-RG
    Code (CSharp):
    1.                 ExecutePass(
    2. #if UNITY_2023_3_OR_NEWER
    3.                     CommandBufferHelpers.GetRasterCommandBuffer(buffer),
    4. #else
    5.                     buffer,
    6. #endif
    7.                     renderList,
    8.                     ref Settings,
    9.                     renderingData.cameraData.cameraType == CameraType.Preview,
    10.                     projectionMatrix,
    11.                     viewMatrix
    12.                     );
    Execute pass signature:
    Code (CSharp):
    1. #if UNITY_2023_3_OR_NEWER
    2.         public static void ExecutePass(IRasterCommandBuffer command,
    3. #else
    4.         public static void ExecutePass(CommandBuffer command,
    5. #endif
    6.             RendererList metadataObjectsToDraw,
    7.             ref ProPixelizerSettings settings,
    8.             bool previewCamera,
    9.             Matrix4x4 projection,
    10.             Matrix4x4 view
    11.             )
    12.         {
    2. Documentation is rough

    (referring to the pdf linked in earlier posts)

    The alpha documentation is helpful for the broad strokes, but not much otherwise. I've so far found it more useful to hunt for ideas/examples in the GitHub Graphics repo, looking at the Runtime URP passes. The pdf is often inconsistent with itself and leaves a lot of detail out. One example: a two-line snippet will be shown, but it's not clear which function or scope (execution/build) the code belongs in. Other posts have already mentioned this. Another issue is that the code examples are all screenshots, which makes copy and paste hard and is not accessibility-friendly. Sometimes one snippet will define a function, and the next snippet will invoke a function of the same name but a different signature.

    The 'RenderGraph concepts' section of the user guide talks about OnBeginRenderGraphFrame as a place to initialise variables used for the frame, but then none of the examples or Runtime/Passes do this.

    My honest feeling is that the supporting resources aren't currently up to scratch and are going to severely undermine the release, if it's truly only a few weeks away. It's going to be a damn shame if all the good work on the new API is drowned out in user complaints.

    3. Could CommandBuffer implement IRasterCommandBuffer?

    This would make it a lot easier to share utility functions between non-RG paths and RG.

    Code (CSharp):
    1.         public static void MyUtilityFunction(IRasterCommandBuffer buffer, float myVar)
    2.         {
    3.             // do something, eg:
    4.             buffer.SetGlobalFloat(SOME_CONST_NAME, myVar);
    5.         }
    On that note - why does RasterCommandBuffer have SetViewProjectionMatrices, but CommandBuffer has these as two separate methods, SetViewMatrix and SetProjectionMatrix?

    Likewise, some common interface between CameraData and UniversalCameraData would be great.

    4. Convert RenderTextureDescriptor to TextureDesc?

    TextureDesc is required for builder.CreateTransientTexture; a conversion from RenderTextureDescriptor would be very useful. On that note, no URP passes seem to use CreateTransientTexture - is it only used in HDRP?
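    For what it's worth, this is roughly the shape of helper I'd want. It's only a sketch; the TextureDesc field names below are from memory and may not match the struct exactly:

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Rendering;
    using UnityEngine.Rendering.RenderGraphModule;

    // Hypothetical RenderTextureDescriptor -> TextureDesc conversion helper.
    static TextureDesc ToTextureDesc(RenderTextureDescriptor src, string name)
    {
        return new TextureDesc(src.width, src.height)
        {
            colorFormat = src.graphicsFormat,
            depthBufferBits = (DepthBits)src.depthBufferBits,
            msaaSamples = (MSAASamples)src.msaaSamples,
            useMipMap = src.useMipMap,
            autoGenerateMips = src.autoGenerateMips,
            enableRandomWrite = src.enableRandomWrite,
            dimension = src.dimension,
            slices = src.volumeDepth,
            name = name
        };
    }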

    5. When to dispose of NativeArray resources?

    There are still some places where resources must be managed by ourselves. For instance, I have a DrawObjects-like pass which uses NativeArrays of RenderStateBlock and ShaderTagId as input into RendererListParams to control drawing. In the non-RG path it's clear when the buffer has been submitted and when I can dispose of these resources. In the RG path it's not clear. I would guess I need to do something like 'builder.Use....' to tell RG to manage the disposal of these resources on my behalf, but I can't find an equivalent function for NativeArrays. This scenario would be quite common for compute shaders working on NativeArrays generated from DOTS/Unity ECS.

    The best I can come up with for now is to store the NativeArrays with persistent allocation as fields on the render pass itself, reuse them each pass, and implement a Dispose method on the pass to dispose the arrays (which I call from the RenderFeature's Dispose). It feels very hacky!
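    In case it's useful to anyone, the pattern looks roughly like this (a sketch; MyDrawPass and the array sizes are placeholders):

    Code (CSharp):
    using System;
    using Unity.Collections;
    using UnityEngine.Rendering;
    using UnityEngine.Rendering.RenderGraphModule;
    using UnityEngine.Rendering.Universal;

    // Persistent NativeArrays owned by the pass, disposed from the feature.
    public class MyDrawPass : ScriptableRenderPass, IDisposable
    {
        NativeArray<ShaderTagId> m_TagValues;
        NativeArray<RenderStateBlock> m_StateBlocks;

        public MyDrawPass()
        {
            m_TagValues = new NativeArray<ShaderTagId>(1, Allocator.Persistent);
            m_StateBlocks = new NativeArray<RenderStateBlock>(1, Allocator.Persistent);
        }

        public override void RecordRenderGraph(RenderGraph renderGraph, ContextContainer frameData)
        {
            // build RendererListParams from m_TagValues / m_StateBlocks here, as in the non-RG path
        }

        public void Dispose()
        {
            if (m_TagValues.IsCreated) m_TagValues.Dispose();
            if (m_StateBlocks.IsCreated) m_StateBlocks.Dispose();
        }
    }

    // In the owning ScriptableRendererFeature:
    //     protected override void Dispose(bool disposing) { m_Pass?.Dispose(); }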

    6. How to set texture inputs to custom materials using deferred execution of draw calls?

    All execution is deferred through command buffers, so how should we set local texture properties on custom materials/shaders used for fullscreen blitting? So far I've got by with setting global texture properties (for which methods are available on the command buffer) and using these from the shader. However, it seems part of the benefit of the new API is a better definition of state, which discourages these kinds of global operations. Is there a better way to approach this using the new API, something like 'command.setTexture(material, texture)'? I can see there is SetComputeTextureParam for compute shaders; is there an equivalent for material blit workflows?

    7. Blitter.BlitCameraTexture only has CommandBuffer inputs, whereas Blitter.BlitTexture has RG-friendly variants

    We can't use BlitCameraTexture from a RasterGraphContext. The example uses Blitter.BlitTexture, but this won't respect resolution scaling. It seems from the URP code that we should use an UnsafeGraphContext and 'CommandBuffer cmd = CommandBufferHelpers.GetNativeCommandBuffer(rgContext.cmd)'. Is there a way of performing a blit, with support for resolution scaling, using a RasterGraphContext?
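    For reference, the unsafe-pass workaround I'm using for now looks roughly like this (a sketch; the pass name, PassData fields, material and destination handle are placeholders):

    Code (CSharp):
    // Unsafe pass that unwraps the native CommandBuffer so Blitter.BlitCameraTexture
    // (which accounts for resolution scaling) can be used.
    using (var builder = renderGraph.AddUnsafePass<PassData>("My Blit Pass", out var passData))
    {
        passData.source = resourceData.cameraColor; // from frameData.Get<UniversalResourceData>()
        passData.destination = destinationHandle;   // placeholder target
        passData.material = m_Material;

        builder.UseTexture(passData.source, AccessFlags.Read);
        builder.UseTexture(passData.destination, AccessFlags.Write);
        builder.SetRenderFunc((PassData data, UnsafeGraphContext context) =>
        {
            CommandBuffer native = CommandBufferHelpers.GetNativeCommandBuffer(context.cmd);
            // TextureHandle converts to RTHandle during execution.
            Blitter.BlitCameraTexture(native, data.source, data.destination, data.material, 0);
        });
    }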

    8. Would like extra control over URP's Opaque/Transparent passes

    UniversalRenderingData holds the opaque and transparent layer masks. I was really excited when I saw this, because I thought this was going to allow me to control them! Unfortunately they are readonly, and it looks like they aren't actually used by the DrawObjects passes either - InitRenderLists takes the UniversalRenderingData in as an input, and then flatly ignores it to use m_FilteringSettings instead. That's a real shame! It would be great to see some of this configuration moved to ContextItems so we can intercept and modify the settings before the draw occurs.

    It's actually even harder than before to modify these now, because I have no access to the UniversalRenderer instance so I can't perform reflection to get the hidden DrawObjects instances for opaques and transparents.

    9. non-RG methods such as Configure marked as Obsolete


    These code paths are used when running in compatibility mode, so they aren't really obsolete and we can't really delete them yet. With the methods marked as obsolete, my package is now generating a load of compilation warnings, even though the code is 'correct' for this version of Unity. For anyone else having this problem, just scatter '#pragma warning disable/restore 0618' throughout your code.
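    For example, something like this around the compatibility-mode members (a sketch; depending on which warnings you actually hit, you may need 0672 as well for overriding obsolete members):

    Code (CSharp):
    #pragma warning disable 0618 // obsolete non-RG API, still needed for compatibility mode
    public override void Configure(CommandBuffer cmd, RenderTextureDescriptor cameraTextureDescriptor)
    {
        ConfigureTarget(m_ColorTarget); // placeholder: whatever your non-RG setup does
    }
    #pragma warning restore 0618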

    10. OnCameraSetup gone

    I asked about this back in January but didn't get a reply. If the method is gone, where should we place per-camera calculations? Is the intention that we just hook into the very earliest event of the frame and calculate everything we need there, e.g. BeginRendering?

    11. Passes that only modify Global State get culled

    This is mentioned in the docs, and the solution is to add the hacky workaround builder.AllowPassCulling(false). Would be nicer to have some sort of detection for global state.

    12. What happens if I don't have the correct UseTexture declarations in an UnsafeRenderPass?

    Because of the BlitCameraTexture issue above, I need to use UnsafeRenderPass in most places. There are quite a lot of UseTexture declarations and it's not impossible for a user to make an error here - what happens if they do?

    13. Everything is Data


    I think it's going to be very confusing for first-time users that everything is data, but there are two types of 'Data'. Some of these are ContextItems, which hold additional data that you require for drawing, e.g. CameraData. Others have a more specific scope and are created solely to cache data for the execution of a particular render pass in the render graph (these are usually named 'PassData' throughout URP and the examples). It might be worth emphasising the differences in the docs. Also, 'data' is a confusing name because everything in a program is data! Probably too late, but something like PassResources or PassInputs would be clearer IMO than PassData. Alternatively, ExecutionResources would also make clear that it's intended for use in the execution phase.

    Cheers,
    Elliot
     


  40. scottjdaley

    scottjdaley

    Joined:
    Aug 1, 2013
    Posts:
    163
    Is there any way to use framebuffer fetch in an unsafe render pass?

    With a raster pass, we have SetInputAttachment, but that function isn't available with the unsafe builder.

    For some context, I'm trying to use framebuffer fetch in an outline shader that needs to ping pong write back and forth between two render textures. At first I tried using raster passes with SetInputAttachment, but this results in the following error:

    Render pass 'Outline Silhouette Pass/Outline Jump Flood Init Pass/Outline Jump Flood Single Axis Pass/Outline Jump Flood Single Axis Pass/Outline Jump Flood Single Axis Pass/Outline Jump Flood Single Axis Pass/Outline Jump Flood Single Axis Pass/Outline Jump Flood Single Axis Pass/Outline Jump Flood Single Axis Pass/Outline Jump Flood Single Axis Pass' validation: The render pass exceeds(10) the limit of allowed subpasses(8).


    I'm not entirely sure why I'm seeing this error. But my best guess is that RenderGraph is compiling all of these render passes into one and offsetting the render attachments accordingly. So even though my render passes alternate writing between two textures, RenderGraph uses separate render attachment slots for each pass. If so, I don't think there is any way around this other than using an unsafe pass and configuring the render targets manually.

    Why doesn't RenderGraph check this subpass limit before deciding to compile render passes together?

    It seems like if more than 8 passes are theoretically capable of being compiled together, it should stop at 8 to stay under this limit.
     
  41. cholleme

    cholleme

    Unity Technologies

    Joined:
    Apr 23, 2019
    Posts:
    31
    @scottjdaley Thanks again for the very valuable feedback. Here is some clarification on the issues you raised:
    1. We're looking at the design now and planning to start development work soon, so this will feed straight into that :). But yes, I share your feeling of forgetting where the settings are spread over the different blocks. There are technical and historical reasons for that, but as a user this is not your concern. :)

    2. Yes, we understand it's annoying. We're looking into adding more validation, but it's difficult to catch all cases. A sort of worst-case example here is referencing a global texture from a material applied to an object in the scene. This is very hard to catch due to its dynamic nature. But we are actively looking at ways to improve error handling as part of the beta.

    3. Scriptable render pass vs render graph pass

    When to start a new scriptable render pass?
    My guideline would be to use the least number of passes, but do keep in mind that each pass can execute at a different time in the frame, which is why a ScriptableRenderFeature can contain several passes. This makes it possible to do work at several points in the frame from within the same feature. I see a pass more as a "callback at a certain point in the frame".
    Nuances to the rule:
    - When you have several effects, it might be reasonable to have a pass encapsulate a "reusable" section, say a "copy" pass that you can use in several places. Of course, you could also make a function that you call in several places; it's really up to you what feels best here.

    When to start a new render graph pass?
    Here the advice would again be to use the least number of passes, with the following (again flexible) constraints:
    - If you want to switch render targets it's (obviously) a new raster pass
    - If you want to read something you've previously written as a framebuffer input, again it must be a new pass
    - If you see yourself adding passes from a for loop or something similar (e.g. "for each mip, add raster pass"), consider whether it makes much sense to have them as separate passes. Each pass will be tracked and optimized by the render graph compiler, but usually in these "loopy" passes there is not much to actually optimize in the end. You could consider moving the loop inside the pass and having a single UnsafePass + SetRenderTarget internally (rough sketch below). I wouldn't "overdo" this rule: if it's a reasonable number of passes, I wouldn't start merging them manually using UnsafePass; the render graph is there to handle that for you. But, for example, the URP bloom downsample passes are merged like this for speed.
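    Very roughly, merging such a loop into a single unsafe pass could look something like this (just a sketch, not tested; the names and iteration count are placeholders):

    Code (CSharp):
    using (var builder = renderGraph.AddUnsafePass<PassData>("Ping Pong", out var passData))
    {
        passData.textureA = texA;          // the two ping-pong targets
        passData.textureB = texB;
        passData.material = blitMaterial;
        passData.iterations = 8;

        builder.UseTexture(passData.textureA, AccessFlags.ReadWrite);
        builder.UseTexture(passData.textureB, AccessFlags.ReadWrite);
        builder.SetRenderFunc((PassData data, UnsafeGraphContext context) =>
        {
            CommandBuffer cmd = CommandBufferHelpers.GetNativeCommandBuffer(context.cmd);
            TextureHandle src = data.textureA, dst = data.textureB;
            for (int i = 0; i < data.iterations; i++)
            {
                cmd.SetRenderTarget((RTHandle)dst); // switch targets inside the pass instead of adding a new raster pass
                Blitter.BlitTexture(cmd, (RTHandle)src, new Vector4(1, 1, 0, 0), data.material, 0);
                (src, dst) = (dst, src);            // ping pong
            }
        });
    }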

    4. The value of the out data parameter is internally pooled; doing it like that is mainly to avoid garbage (or forcing users to deal with it more explicitly) when adding passes. I don't really share the feeling that it's "bad" to access the context data in the using block. It's sort of expected in C# to access things from the outer scope of a using scope. In a way it's a lot of syntax sugar for a BeginBuildingPass/EndBuildingPass. If you have a single pass you could even write the following as the first line of your RecordRenderGraph function to make it less magicky:

    Code (CSharp):
    1. using var builder = renderGraph.AddRasterRenderPass<DrawGizmosPassData>("Draw Gizmos Pass", out var passData, Profiling.drawGizmos);
    2. // ... work on the builder until the end of the current scope
    To my mind, SetRenderFunc is where the real concern about accessing local variables lies; that is why I would recommend always making your execute function a separate static function instead of an inline lambda, to avoid mistakes. (Even though our own code doesn't do that a lot...)

    5. Yes, this is by design; a material behaves like any other C# object instance. If you schedule a pass, it has a reference to this object (a material in this case); if something later modifies this same object, the changes are of course visible to everyone with a reference to the same instance. So if you modify a material referenced by an added but not yet executed pass, you'll see the modified version used later during execution. If you want several passes scheduled with different material settings, you'll have to create several materials or handle this in other ways, like the material property blocks or globals you suggest. But we understand/felt the same pain, so we are working on blitter helper functions that expose a material property block :)

    I think you were maybe expecting this to behave differently because the standard Unity Renderer class actually silently copies materials (as in deep copies returning a new instance) if you access them through the Renderer.material property. That is the difference between renderer.material (new copy) and renderer.sharedMaterial (shared object reference), but this is a Renderer feature, not a material feature. This has led to many problems and confusion, so we don't really want to replicate it here, in favor of "standard" C# behaviour.
     
    ElliotB, scottjdaley and JesOb like this.
  42. cholleme

    cholleme

    Unity Technologies

    Joined:
    Apr 23, 2019
    Posts:
    31
    @scottjdaley Regarding your last post: it's a bug we'd discovered internally as well. It shouldn't merge more than 8 passes (= API limit), but it does. We're working on fixing that right now.
     
    scottjdaley and JesOb like this.
  43. scottjdaley

    scottjdaley

    Joined:
    Aug 1, 2013
    Posts:
    163
    Thanks for the responses! Glad to hear that a lot of these pain points are being worked on!

    That makes sense; not sure why I didn't think of it as just being normal C# reference semantics. But I think the confusion for me came from two things:
    1. The different execution times for different calls inside a RenderFunc. For example, if the RenderFunc calls material.SetVector() followed by CoreUtils.DrawFullScreen(), then the former happens immediately (modifying the material instance) and the latter happens when the CommandBuffer is played back. I realize that this is exactly the point of a CommandBuffer, but it is still easy to forget. I almost wish that the RenderFunc could only contain deferred CommandBuffer-backed calls to avoid this mistake.
    2. Since we are "recording" a render pass, it is unclear whether our code is played back verbatim or whether some kind of optimization is happening. For example, I thought it might be converting the rendering instructions to some kind of faster, lightweight unmanaged representation internally and then playing that back each frame.
     
  44. customphase

    customphase

    Joined:
    Aug 19, 2012
    Posts:
    247
    I'm trying to render objects via Render Graph using a separate camera, but I'm having issues with Hybrid Renderer objects not outputting anything. So I'm getting culling results in BeginContextRendering and enqueueing my pass:
    Code (CSharp):
    1. private void OnBeginContextRendering(ScriptableRenderContext src, List<Camera> cameras)
    2. {
    3.     var pipeline = GraphicsSettings.currentRenderPipeline as UniversalRenderPipelineAsset;
    4.     if (pipeline == null) return;
    5.  
    6.     _renderCamera.TryGetCullingParameters(out var cullingParameters);
    7.     var cullingResults = src.Cull(ref cullingParameters);
    8.  
    9.     var renderPass = new MapHeightmapRenderPass
    10.     {
    11.         CullingResults = cullingResults,
    12.         ProjMatrix = GL.GetGPUProjectionMatrix(_renderCamera.projectionMatrix, true),
    13.         ViewMatrix = _renderCamera.worldToCameraMatrix,
    14.         Viewport = viewport,
    15.         HeightmapRT = _heightmapRT,
    16.         renderPassEvent = RenderPassEvent.BeforeRendering
    17.     };
    18.     pipeline.scriptableRenderer.EnqueuePass(renderPass);
    19. }
    Then in my pass I'm creating the renderer list and trying to render it:
    Code (CSharp):
    1. public override void RecordRenderGraph(RenderGraph renderGraph, ContextContainer frameData)
    2. {
    3.     var rtDesc = new RenderTextureDescriptor(Constants.ChunkSize, Constants.ChunkSize, RenderTextureFormat.Depth, 16);
    4.     rtDesc.autoGenerateMips = false;
    5.     rtDesc.shadowSamplingMode = ShadowSamplingMode.RawDepth;
    6.     var tempRT = UniversalRenderer.CreateRenderGraphTexture(renderGraph, rtDesc, "MapHeightmapRenderTemporaryRT", true);
    7.     var filteringSettings = new FilteringSettings(RenderQueueRange.all, Physics.AllLayers);
    8.     var drawingSettings = new DrawingSettings(new ShaderTagId("DepthOnly"), new SortingSettings());
    9.     drawingSettings.enableInstancing = true;
    10.     var rendererListParam = new RendererListParams(CullingResults, drawingSettings, filteringSettings);
    11.     var rendererList = renderGraph.CreateRendererList(rendererListParam);
    12.  
    13.     if (!tempRT.IsValid()) return;
    14.  
    15.     using (var builder = renderGraph.AddRasterRenderPass<PassData>("MapHeightmapRenderPass",
    16.                out var passData, profilingSampler))
    17.     {
    18.         passData.RendererList = rendererList;
    19.         passData.ViewMatrix = ViewMatrix;
    20.         passData.ProjMatrix = ProjMatrix;
    21.         passData.TemporaryRT = tempRT;
    22.         builder.AllowPassCulling(false);
    23.         builder.UseRendererList(passData.RendererList);
    24.         builder.SetRenderAttachmentDepth(passData.TemporaryRT);
    25.         builder.SetRenderFunc((PassData data, RasterGraphContext context) =>
    26.         {
    27.             context.cmd.SetViewProjectionMatrices(data.ViewMatrix, data.ProjMatrix);
    28.             context.cmd.DrawRendererList(data.RendererList);
    29.         });
    30.     }
    31. }
    The issue is that regular objects render correctly and output their depth; however, Hybrid Renderer objects do not output anything. Both use the default URP Lit shader. There are draw calls for them with the correct meshes, but they are just not outputting anything. Am I missing something here? Do I need to do something else to make Hybrid Renderer work? Maybe metadata is not set or something?

    (Screenshots attached.)
     
    Last edited: Mar 20, 2024
  45. customphase

    customphase

    Joined:
    Aug 19, 2012
    Posts:
    247
    Never mind, figured it out. The issue had nothing to do with Hybrid Renderer; the objects weren't showing up because culling was inverted due to an incorrect projection matrix. Apparently you're not supposed to use GL.GetGPUProjectionMatrix at all. Just using the raw projection matrix with context.cmd.SetViewProjectionMatrices(camera.worldToCameraMatrix, camera.projectionMatrix) is the right way.
     
  46. ElliotB

    ElliotB

    Joined:
    Aug 11, 2013
    Posts:
    297
    Hi again,

    Quick question! Is this the correct way to determine whether compatibility mode is enabled (e.g. if we need to modify which passes get added accordingly)? Thanks!

    Code (CSharp):
    1. GraphicsSettings.GetRenderPipelineSettings<RenderGraphSettings>().enableRenderCompatibilityMode
     
  47. ElliotB

    ElliotB

    Joined:
    Aug 11, 2013
    Posts:
    297
    I'm currently finding some unexpected API behavior when trying to request a camera resource in a particular pass. The gist is:

    Code (CSharp):
    1. var resources = frameData.Get<UniversalResourceData>();
    2. ...
    3.             using (var builder = renderGraph.AddUnsafePass<PassData>(GetType().Name, out var passData))
    4.             {
    5. ...
    6.                 passData.cameraColor = resources.cameraColor;
    7. ...
    8.                 builder.UseTexture(passData.cameraColor, AccessFlags.Read);
    It works fine in the Scene view and the Game view, but I find that clicking any material (to draw the material preview in the inspector) produces an absolute wall of red console error messages every draw frame:

    ArgumentException: Trying to use an invalid resource (pass ProPixelizerLowResRecompositionPass).
    UnityEngine.Rendering.RenderGraphModule.RenderGraphBuilders.CheckResource (UnityEngine.Rendering.RenderGraphModule.ResourceHandle& res, System.Boolean dontCheckTransientReadWrite) (at ./Library/PackageCache/com.unity.render-pipelines.core/Runtime/RenderGraph/RenderGraphBuilders.cs:500)
    UnityEngine.Rendering.RenderGraphModule.RenderGraphBuilders.UseResource (UnityEngine.Rendering.RenderGraphModule.ResourceHandle& handle, UnityEngine.Rendering.RenderGraphModule.AccessFlags flags) (at ./Library/PackageCache/com.unity.render-pipelines.core/Runtime/RenderGraph/RenderGraphBuilders.cs:197)
    UnityEngine.Rendering.RenderGraphModule.RenderGraphBuilders.UseTexture (UnityEngine.Rendering.RenderGraphModule.TextureHandle& input, UnityEngine.Rendering.RenderGraphModule.AccessFlags flags) (at ./Library/PackageCache/com.unity.render-pipelines.core/Runtime/RenderGraph/RenderGraphBuilders.cs:280)
    UnityEngine.Rendering.RenderGraphModule.RenderGraphBuilders.UnityEngine.Rendering.RenderGraphModule.IBaseRenderGraphBuilder.UseTexture (UnityEngine.Rendering.RenderGraphModule.TextureHandle& input, UnityEngine.Rendering.RenderGraphModule.AccessFlags flags) (at <751c003238ce4e0f87fd9e6326ab907b>:0)
    ProPixelizer.ProPixelizerLowResRecompositionPass.RecordRenderGraph (UnityEngine.Rendering.RenderGraphModule.RenderGraph renderGraph, UnityEngine.Rendering.ContextContainer frameData) (at Assets/ProPixelizer/SRP/Passes/ProPixelizerLowResRecompositionPass.cs:143)
    UnityEngine.Rendering.Universal.ScriptableRenderer.RecordCustomRenderGraphPasses (UnityEngine.Rendering.RenderGraphModule.RenderGraph renderGraph, UnityEngine.Rendering.Universal.RenderPassEvent injectionPoint) (at ./Library/PackageCache/com.unity.render-pipelines.universal/Runtime/ScriptableRenderer.cs:1180)
    UnityEngine.Rendering.Universal.UniversalRenderer.OnMainRendering (UnityEngine.Rendering.RenderGraphModule.RenderGraph renderGraph, UnityEngine.Rendering.ScriptableRenderContext context) (at ./Library/PackageCache/com.unity.render-pipelines.universal/Runtime/UniversalRendererRenderGraph.cs:1117)
    UnityEngine.Rendering.Universal.UniversalRenderer.OnRecordRenderGraph (UnityEngine.Rendering.RenderGraphModule.RenderGraph renderGraph, UnityEngine.Rendering.ScriptableRenderContext context) (at ./Library/PackageCache/com.unity.render-pipelines.universal/Runtime/UniversalRendererRenderGraph.cs:806)
    UnityEngine.Rendering.Universal.ScriptableRenderer.RecordRenderGraph (UnityEngine.Rendering.RenderGraphModule.RenderGraph renderGraph, UnityEngine.Rendering.ScriptableRenderContext context) (at ./Library/PackageCache/com.unity.render-pipelines.universal/Runtime/ScriptableRenderer.cs:1127)
    UnityEngine.Rendering.Universal.UniversalRenderPipeline.RecordRenderGraph (UnityEngine.Rendering.RenderGraphModule.RenderGraph renderGraph, UnityEngine.Rendering.ScriptableRenderContext context, UnityEngine.Rendering.Universal.ScriptableRenderer renderer) (at ./Library/PackageCache/com.unity.render-pipelines.universal/Runtime/UniversalRenderPipelineRenderGraph.cs:9)
    UnityEngine.Rendering.Universal.UniversalRenderPipeline.RecordAndExecuteRenderGraph (UnityEngine.Rendering.RenderGraphModule.RenderGraph renderGraph, UnityEngine.Rendering.ScriptableRenderContext context, UnityEngine.Rendering.Universal.ScriptableRenderer renderer, UnityEngine.Rendering.CommandBuffer cmd, UnityEngine.Camera camera) (at ./Library/PackageCache/com.unity.render-pipelines.universal/Runtime/UniversalRenderPipelineRenderGraph.cs:24)
    UnityEngine.Rendering.Universal.UniversalRenderPipeline.RenderSingleCamera (UnityEngine.Rendering.ScriptableRenderContext context, UnityEngine.Rendering.Universal.UniversalCameraData cameraData) (at ./Library/PackageCache/com.unity.render-pipelines.universal/Runtime/UniversalRenderPipeline.cs:801)
    UnityEngine.Rendering.Universal.UniversalRenderPipeline.RenderSingleCameraInternal (UnityEngine.Rendering.ScriptableRenderContext context, UnityEngine.Camera camera, UnityEngine.Rendering.Universal.UniversalAdditionalCameraData& additionalCameraData) (at ./Library/PackageCache/com.unity.render-pipelines.universal/Runtime/UniversalRenderPipeline.cs:638)
    UnityEngine.Rendering.Universal.UniversalRenderPipeline.RenderSingleCameraInternal (UnityEngine.Rendering.ScriptableRenderContext context, UnityEngine.Camera camera) (at ./Library/PackageCache/com.unity.render-pipelines.universal/Runtime/UniversalRenderPipeline.cs:619)
    UnityEngine.Rendering.Universal.UniversalRenderPipeline.Render (UnityEngine.Rendering.ScriptableRenderContext renderContext, System.Collections.Generic.List`1[T] cameras) (at ./Library/PackageCache/com.unity.render-pipelines.universal/Runtime/UniversalRenderPipeline.cs:430)
    UnityEngine.Rendering.RenderPipeline.InternalRender (UnityEngine.Rendering.ScriptableRenderContext context, System.Collections.Generic.List`1[T] cameras) (at <14ae4cb93fc34d8e8f54b19c066671bf>:0)
    UnityEngine.Rendering.RenderPipelineManager.DoRenderLoop_Internal (UnityEngine.Rendering.RenderPipelineAsset pipelineAsset, System.IntPtr loopPtr, UnityEngine.Object renderRequest, Unity.Collections.LowLevel.Unsafe.AtomicSafetyHandle safety) (at <14ae4cb93fc34d8e8f54b19c066671bf>:0)
    UnityEngine.GUIUtility:processEvent(Int32, IntPtr, Boolean&)


    The error line refers to:
    Code (csharp):
    1. builder.UseTexture(passData.cameraColor, AccessFlags.Read);

    Sometimes, it is even thrown by URP's internal DrawObjectsPass:
    Code (csharp):
    1. ArgumentException: Trying to use an invalid resource (pass Draw Objects Pass).
    2. UnityEngine.Rendering.RenderGraphModule.RenderGraphBuilders.CheckResource (UnityEngine.Rendering.RenderGraphModule.ResourceHandle& res, System.Boolean dontCheckTransientReadWrite) (at ./Library/PackageCache/com.unity.render-pipelines.core/Runtime/RenderGraph/RenderGraphBuilders.cs:500)
    3. UnityEngine.Rendering.RenderGraphModule.RenderGraphBuilders.UseResource (UnityEngine.Rendering.RenderGraphModule.ResourceHandle& handle, UnityEngine.Rendering.RenderGraphModule.AccessFlags flags) (at ./Library/PackageCache/com.unity.render-pipelines.core/Runtime/RenderGraph/RenderGraphBuilders.cs:197)
    4. UnityEngine.Rendering.RenderGraphModule.RenderGraphBuilders.UseTexture (UnityEngine.Rendering.RenderGraphModule.TextureHandle& input, UnityEngine.Rendering.RenderGraphModule.AccessFlags flags) (at ./Library/PackageCache/com.unity.render-pipelines.core/Runtime/RenderGraph/RenderGraphBuilders.cs:280)
    5. UnityEngine.Rendering.RenderGraphModule.RenderGraphBuilders.Dispose (System.Boolean disposing) (at ./Library/PackageCache/com.unity.render-pipelines.core/Runtime/RenderGraph/RenderGraphBuilders.cs:129)
    6. UnityEngine.Rendering.RenderGraphModule.RenderGraphBuilders.Dispose () (at ./Library/PackageCache/com.unity.render-pipelines.core/Runtime/RenderGraph/RenderGraphBuilders.cs:110)
    7. UnityEngine.Rendering.Universal.Internal.DrawObjectsPass.Render (UnityEngine.Rendering.RenderGraphModule.RenderGraph renderGraph, UnityEngine.Rendering.ContextContainer frameData, UnityEngine.Rendering.RenderGraphModule.TextureHandle colorTarget, UnityEngine.Rendering.RenderGraphModule.TextureHandle depthTarget, UnityEngine.Rendering.RenderGraphModule.TextureHandle mainShadowsTexture, UnityEngine.Rendering.RenderGraphModule.TextureHandle additionalShadowsTexture, System.UInt32 batchLayerMask) (at ./Library/PackageCache/com.unity.render-pipelines.universal/Runtime/Passes/DrawObjectsPass.cs:320)
    8. UnityEngine.Rendering.Universal.UniversalRenderer.OnMainRendering (UnityEngine.Rendering.RenderGraphModule.RenderGraph renderGraph, UnityEngine.Rendering.ScriptableRenderContext context) (at ./Library/PackageCache/com.unity.render-pipelines.universal/Runtime/UniversalRendererRenderGraph.cs:1094)
    9. UnityEngine.Rendering.Universal.UniversalRenderer.OnRecordRenderGraph (UnityEngine.Rendering.RenderGraphModule.RenderGraph renderGraph, UnityEngine.Rendering.ScriptableRenderContext context) (at ./Library/PackageCache/com.unity.render-pipelines.universal/Runtime/UniversalRendererRenderGraph.cs:806)
    10. UnityEngine.Rendering.Universal.ScriptableRenderer.RecordRenderGraph (UnityEngine.Rendering.RenderGraphModule.RenderGraph renderGraph, UnityEngine.Rendering.ScriptableRenderContext context) (at ./Library/PackageCache/com.unity.render-pipelines.universal/Runtime/ScriptableRenderer.cs:1127)
    11. UnityEngine.Rendering.Universal.UniversalRenderPipeline.RecordRenderGraph (UnityEngine.Rendering.RenderGraphModule.RenderGraph renderGraph, UnityEngine.Rendering.ScriptableRenderContext context, UnityEngine.Rendering.Universal.ScriptableRenderer renderer) (at ./Library/PackageCache/com.unity.render-pipelines.universal/Runtime/UniversalRenderPipelineRenderGraph.cs:9)
    12. UnityEngine.Rendering.Universal.UniversalRenderPipeline.RecordAndExecuteRenderGraph (UnityEngine.Rendering.RenderGraphModule.RenderGraph renderGraph, UnityEngine.Rendering.ScriptableRenderContext context, UnityEngine.Rendering.Universal.ScriptableRenderer renderer, UnityEngine.Rendering.CommandBuffer cmd, UnityEngine.Camera camera) (at ./Library/PackageCache/com.unity.render-pipelines.universal/Runtime/UniversalRenderPipelineRenderGraph.cs:24)
    13. UnityEngine.Rendering.Universal.UniversalRenderPipeline.RenderSingleCamera (UnityEngine.Rendering.ScriptableRenderContext context, UnityEngine.Rendering.Universal.UniversalCameraData cameraData) (at ./Library/PackageCache/com.unity.render-pipelines.universal/Runtime/UniversalRenderPipeline.cs:801)
    14. UnityEngine.Rendering.Universal.UniversalRenderPipeline.RenderCameraStack (UnityEngine.Rendering.ScriptableRenderContext context, UnityEngine.Camera baseCamera) (at ./Library/PackageCache/com.unity.render-pipelines.universal/Runtime/UniversalRenderPipeline.cs:1016)
    15. UnityEngine.Rendering.Universal.UniversalRenderPipeline.Render (UnityEngine.Rendering.ScriptableRenderContext renderContext, System.Collections.Generic.List`1[T] cameras) (at ./Library/PackageCache/com.unity.render-pipelines.universal/Runtime/UniversalRenderPipeline.cs:415)
    16. UnityEngine.Rendering.RenderPipeline.InternalRender (UnityEngine.Rendering.ScriptableRenderContext context, System.Collections.Generic.List`1[T] cameras) (at <14ae4cb93fc34d8e8f54b19c066671bf>:0)
    17. UnityEngine.Rendering.RenderPipelineManager.DoRenderLoop_Internal (UnityEngine.Rendering.RenderPipelineAsset pipelineAsset, System.IntPtr loopPtr, UnityEngine.Object renderRequest, Unity.Collections.LowLevel.Unsafe.AtomicSafetyHandle safety) (at <14ae4cb93fc34d8e8f54b19c066671bf>:0)
    18. UnityEngine.GUIUtility:ProcessEvent(Int32, IntPtr, Boolean&)
    19.  
    I'm not making any assignments to cameraColor in my passes. The errors only occur in preview cameras.

    I also get the following error - but I believe this may just be caused by the draw erroring out, and not a cause itself:
    Code (csharp):
    1. InvalidOperationException: The previously scheduled job ZBinningJob writes to the Unity.Collections.NativeArray`1[System.UInt32] ZBinningJob.bins. You must call JobHandle.Complete() on the job ZBinningJob, before you can write to the Unity.Collections.NativeArray`1[System.UInt32] safely.
    2. Unity.Collections.LowLevel.Unsafe.AtomicSafetyHandle.CheckWriteAndThrowNoEarlyOut (Unity.Collections.LowLevel.Unsafe.AtomicSafetyHandle handle) (at <14ae4cb93fc34d8e8f54b19c066671bf>:0)
    3. Unity.Collections.LowLevel.Unsafe.NativeArrayUnsafeUtility.GetUnsafePtr[T] (Unity.Collections.NativeArray`1[T] nativeArray) (at <14ae4cb93fc34d8e8f54b19c066671bf>:0)
    4. UnityEngine.Rendering.Universal.Internal.ForwardLights.PreSetup (UnityEngine.Rendering.Universal.UniversalRenderingData renderingData, UnityEngine.Rendering.Universal.UniversalCameraData cameraData, UnityEngine.Rendering.Universal.UniversalLightData lightData) (at ./Library/PackageCache/com.unity.render-pipelines.universal/Runtime/ForwardLights.cs:200)
    5. UnityEngine.Rendering.Universal.UniversalRenderer.OnBeforeRendering (UnityEngine.Rendering.RenderGraphModule.RenderGraph renderGraph) (at ./Library/PackageCache/com.unity.render-pipelines.universal/Runtime/UniversalRendererRenderGraph.cs:861)
    6. UnityEngine.Rendering.Universal.UniversalRenderer.OnRecordRenderGraph (UnityEngine.Rendering.RenderGraphModule.RenderGraph renderGraph, UnityEngine.Rendering.ScriptableRenderContext context) (at ./Library/PackageCache/com.unity.render-pipelines.universal/Runtime/UniversalRendererRenderGraph.cs:802)
    7. UnityEngine.Rendering.Universal.ScriptableRenderer.RecordRenderGraph (UnityEngine.Rendering.RenderGraphModule.RenderGraph renderGraph, UnityEngine.Rendering.ScriptableRenderContext context) (at ./Library/PackageCache/com.unity.render-pipelines.universal/Runtime/ScriptableRenderer.cs:1127)
    8. UnityEngine.Rendering.Universal.UniversalRenderPipeline.RecordRenderGraph (UnityEngine.Rendering.RenderGraphModule.RenderGraph renderGraph, UnityEngine.Rendering.ScriptableRenderContext context, UnityEngine.Rendering.Universal.ScriptableRenderer renderer) (at ./Library/PackageCache/com.unity.render-pipelines.universal/Runtime/UniversalRenderPipelineRenderGraph.cs:9)
    9. UnityEngine.Rendering.Universal.UniversalRenderPipeline.RecordAndExecuteRenderGraph (UnityEngine.Rendering.RenderGraphModule.RenderGraph renderGraph, UnityEngine.Rendering.ScriptableRenderContext context, UnityEngine.Rendering.Universal.ScriptableRenderer renderer, UnityEngine.Rendering.CommandBuffer cmd, UnityEngine.Camera camera) (at ./Library/PackageCache/com.unity.render-pipelines.universal/Runtime/UniversalRenderPipelineRenderGraph.cs:24)
    10. UnityEngine.Rendering.Universal.UniversalRenderPipeline.RenderSingleCamera (UnityEngine.Rendering.ScriptableRenderContext context, UnityEngine.Rendering.Universal.UniversalCameraData cameraData) (at ./Library/PackageCache/com.unity.render-pipelines.universal/Runtime/UniversalRenderPipeline.cs:801)
    11. UnityEngine.Rendering.Universal.UniversalRenderPipeline.RenderSingleCameraInternal (UnityEngine.Rendering.ScriptableRenderContext context, UnityEngine.Camera camera, UnityEngine.Rendering.Universal.UniversalAdditionalCameraData& additionalCameraData) (at ./Library/PackageCache/com.unity.render-pipelines.universal/Runtime/UniversalRenderPipeline.cs:638)
    12. UnityEngine.Rendering.Universal.UniversalRenderPipeline.RenderSingleCameraInternal (UnityEngine.Rendering.ScriptableRenderContext context, UnityEngine.Camera camera) (at ./Library/PackageCache/com.unity.render-pipelines.universal/Runtime/UniversalRenderPipeline.cs:619)
    13. UnityEngine.Rendering.Universal.UniversalRenderPipeline.Render (UnityEngine.Rendering.ScriptableRenderContext renderContext, System.Collections.Generic.List`1[T] cameras) (at ./Library/PackageCache/com.unity.render-pipelines.universal/Runtime/UniversalRenderPipeline.cs:430)
    14. UnityEngine.Rendering.RenderPipeline.InternalRender (UnityEngine.Rendering.ScriptableRenderContext context, System.Collections.Generic.List`1[T] cameras) (at <14ae4cb93fc34d8e8f54b19c066671bf>:0)
    15. UnityEngine.Rendering.RenderPipelineManager.DoRenderLoop_Internal (UnityEngine.Rendering.RenderPipelineAsset pipelineAsset, System.IntPtr loopPtr, UnityEngine.Object renderRequest, Unity.Collections.LowLevel.Unsafe.AtomicSafetyHandle safety) (at <14ae4cb93fc34d8e8f54b19c066671bf>:0)
    16. UnityEngine.GUIUtility:ProcessEvent(Int32, IntPtr, Boolean&)
    * Ironically the first error code block was not rendering in the forum, so I've had to remove the embed. Apologies.
     
  48. ElliotB

    ElliotB

    Joined:
    Aug 11, 2013
    Posts:
    297
    Sometimes this console message spams wildly:

    Code (csharp):
    1. Attempting to render to a depth only surface with no dummy color attachment
    2. UnityEngine.GUIUtility:ProcessEvent (int,intptr,bool&)
    There's no indication of where it comes from though :(. It continues if I disable all render features, so it seems like it comes from URP internals.
     
    Last edited: Mar 20, 2024
  49. Neonage

    Neonage

    Joined:
    May 22, 2020
    Posts:
    288
    The Project Settings > Graphics page keeps forgetting its scroll position and URP tab selection.
    This is very annoying when switching Compatibility Mode; please fix.

    (Screenshot attached.)
     
  50. Neonage

    Neonage

    Joined:
    May 22, 2020
    Posts:
    288
    Seems like the depth texture is not being created in After Transparents mode.
    It works if Depth Priming or SSAO is enabled.