
Official Introduction of Render Graph in the Universal Render Pipeline (URP)

Discussion in 'Universal Render Pipeline' started by oliverschnabel, Oct 2, 2023.

  1. _geo__

    _geo__

    Joined:
    Feb 26, 2014
    Posts:
    1,405
    Thanks for the quick reply (appreciated).

    Could you be a bit more specific on the timeline? While I am planning for 2024, I would like to know when I have to schedule RG porting work. Will Compatibility Mode be available throughout 2023 LTS (Unity 6), or what's the timeline here? Is April 2024 still the target for RG hitting the shelves?

    Basically, what I want to know is when exactly I can expect users who use only default settings to hit me with RG requests. I'd like a date, month, or quarter on that, please.

    I'd also be interested in how the "strong incentive to upgrade" will be implemented.

    Thank you.
     
    Last edited: Nov 21, 2023
    ElliotB likes this.
  2. nasos_333

    nasos_333

    Joined:
    Feb 13, 2013
    Posts:
    13,549
    The strong incentive is that in a few months none of our effects will work, in a global sense, so we will be forced to port everything or leave the store.

    The same happened when the pipelines were introduced, when the backend changed to the new RTHandles, and then changed again to use the new Blitter, etc.

    It is a rather strong incentive, given how much work has gone into these assets, to kill them for this reason.

    Hi, indeed with the samples I managed to get some of the rendering back, at least the simpler direct shader-to-output one. So the good news is that the very complex stuff on the shader side all works so far.

    Now I still cannot pass multiple textures in one of the passes, e.g. when I try to do

    _material.SetTexture("_MainTex", tmpBuffer1A); before I use the builder to set and send the pass, I get

    InvalidOperationException: Current Render Graph Resource Registry is not set. You are probably trying to cast a Render Graph handle to a resource outside of a Rend...

    It seems I can only insert a single texture for the blit so far, that of the source parameter in Blitter.BlitTexture, which also has a specific name, so I would need to change my shader side as well, which is not desirable.

    tmpBuffer1A is defined as a global TextureHandle tmpBuffer1A;

    Then it is initialized inside RecordRenderGraph:

    tmpBuffer1A = UniversalRenderer.CreateRenderGraphTexture(renderGraph, desc, "tmpBuffer1A", true);

    and is passed with ref tmpBuffer1A to a function that does the blit.

    I have tested that both tmpBuffer1A, in which I try to save the background, and tmpBuffer2A, which has my volumetric effects, are showing properly, but when I try to pass the tmpBuffer1A result of the previous step to the next one, which renders tmpBuffer2A so I can blend the volumetrics with the background, it fails.

    Is there a way to insert more textures to send to the relevant names inside our own shaders?

    Thanks

    EDIT: I managed to send a RenderTexture to the custom named variable by adding it in the final static call; now I need to see how to pass the textures between the passes, as I currently use the camera RenderTexture as input, since it is harder to get the outer textures in the static method.

    Also, I am wondering how I can control the texture size, e.g. downsample a RenderTexture versus the full screen resolution.
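    [Editor's note] One way to pass several textures into a single raster pass, sketched against the RG API used elsewhere in this thread. This is a hedged sketch, not an official recipe: the pass name, the "_VolumetricTex" shader property, and the surrounding fields are made up.

    ```csharp
    // Hedged sketch: declare each extra texture as a read dependency with
    // UseTexture(), then bind it on the material inside the render function,
    // where a TextureHandle can legally be resolved to a texture.
    private class PassData
    {
        internal Material material;
        internal TextureHandle background;   // e.g. tmpBuffer1A
        internal TextureHandle volumetrics;  // e.g. tmpBuffer2A
    }

    using (var builder = renderGraph.AddRasterRenderPass<PassData>("Blend Volumetrics", out var passData))
    {
        passData.material = _material;
        passData.background = tmpBuffer1A;
        passData.volumetrics = tmpBuffer2A;

        builder.UseTexture(passData.background);    // reads must be declared so RG orders the passes
        builder.UseTexture(passData.volumetrics);
        builder.UseTextureFragment(destination, 0); // this pass's render target

        builder.SetRenderFunc((PassData data, RasterGraphContext ctx) =>
        {
            // Binding here, during pass execution, avoids the
            // "Resource Registry is not set" exception seen above.
            data.material.SetTexture("_VolumetricTex", data.volumetrics);
            Blitter.BlitTexture(ctx.cmd, data.background, new Vector4(1, 1, 0, 0), data.material, 0);
        });
    }
    ```

    For the texture-size question: downsampled targets can be created by shrinking the descriptor (desc.width /= 2; desc.height /= 2;) before each CreateRenderGraphTexture call.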
     
    Last edited: Nov 21, 2023
  3. AljoshaD

    AljoshaD

    Unity Technologies

    Joined:
    May 27, 2019
    Posts:
    282
    Yes, that's when the 23.3 tech release comes out of beta. Ideally you will have your assets converted by that time to (also) work with RG, since RG will be the default when that tech release ships. But users will still be able to revert to Compatibility Mode (non-RG) if some of their assets are not yet compatible with RG, giving you more time to support RG.


    Yes. We view Compatibility Mode as a way to facilitate upgrading a project. And to give adequate time for asset providers to support RG in their assets.


    We don't have all the details yet; we'll work with you to find the best approach. In terms of incentive, we expect assets that make good use of the new RG API to have better performance than with the current API, so users will prefer assets with RG. RG will also be the default setting, and the non-RG/Compatibility Mode will be deprecated. New projects will benefit from using RG, but of course adoption will be slowed down by the available assets in the store. Once the majority of assets support RG, we expect to prioritize performance improvements and fixes for RenderGraph. As part of the incentive, we also want to minimize the amount of time it will take you to support RG in an asset. Once you're past the initial learning curve, assets could be converted in a day or two at most.
     
    _geo__ likes this.
  4. nasos_333

    nasos_333

    Joined:
    Feb 13, 2013
    Posts:
    13,549
    I can't see how, at any expertise level with the Graph, one can convert some of the more complex systems in 2 days.

    I think it will take me a week just to find out how to do a temporal AA effect, as passing textures between the components is very unclear so far, as is how to control the texture sizes, etc.

    There is nothing similar to the previous methods, and until I learn all the aspects it could be months of work, as each asset has different requirements.

    Realistically, a very complex asset could take a few months or so the first time to perfect again from scratch for the new system.
     
  5. AljoshaD

    AljoshaD

    Unity Technologies

    Joined:
    May 27, 2019
    Posts:
    282
    We've actually tested this, and it's certainly possible. Based on our experience upgrading assets over the last months, we'll share a best practices guide with clear steps on how to approach this.

    We're still making changes from the previous feedback rounds and just landed some API improvements. We also brought RenderGraph itself out of experimental, into production.

    The best place to start is with this alpha documentation. We'll share an updated version next week with the new API changes. We'll also update the sample code shared in this forum thread.

    The new helper functions that we will add soon will also simplify things significantly.
     
    Kmsxkuse and nasos_333 like this.
  6. nasos_333

    nasos_333

    Joined:
    Feb 13, 2013
    Posts:
    13,549
    Great, hopefully we can directly replace our blits and temporary render textures as in the previous workflow; in that case it could indeed be possible to port faster. This would help a lot, so I will probably wait for the latest API release before proceeding.
     
  7. nasos_333

    nasos_333

    Joined:
    Feb 13, 2013
    Posts:
    13,549
    By the way, how do we create a global TextureHandle that keeps its value from the previous frame?

    I did not see an initialization region for that.

    Thanks

    EDIT: Scratch that; I found that I have to use RTHandles for globals, and now temporal AA also seems to work :)
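    [Editor's note] For readers hitting the same issue: RenderGraph textures are transient, so a buffer that must survive to the next frame (e.g. a TAA history) can be kept as a persistent RTHandle and imported into the graph each frame. A minimal sketch; the field and texture names are made up.

    ```csharp
    // Hedged sketch: persistent history buffer for a temporal effect.
    RTHandle m_History; // allocated once, survives across frames

    public override void RecordRenderGraph(RenderGraph renderGraph, ContextContainer frameData)
    {
        var desc = frameData.Get<UniversalCameraData>().cameraTargetDescriptor;
        desc.depthBufferBits = 0;
        desc.msaaSamples = 1;

        // (Re)allocate the persistent RTHandle outside of RenderGraph's transient pool
        RenderingUtils.ReAllocateIfNeeded(ref m_History, desc, FilterMode.Bilinear, TextureWrapMode.Clamp, name: "_TAAHistory");

        // Import it so passes this frame can read last frame's contents and write this frame's
        TextureHandle history = renderGraph.ImportTexture(m_History);
        // ... use 'history' like any other TextureHandle in the passes below ...
    }
    ```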
     
    Last edited: Nov 21, 2023
    AljoshaD likes this.
  8. nasos_333

    nasos_333

    Joined:
    Feb 13, 2013
    Posts:
    13,549
    Hi,

    I have now managed to get most of my core systems to work :). I have only one issue that may need to be handled.

    I render into the 3D texture with URP's direct camera rendering function UniversalRenderPipeline.RenderSingleCamera() and get the warning below:

    "Trying to render to a rendertexture without a depth buffer. URP+RG needs a depthbuffer to render."

    Adding depth to a 3D texture is not desirable, so could this be handled so that no such message is emitted, at least in the 3D texture case?

    Thanks a lot in advance

    Another issue is that the RenderRequest API does not work as a replacement for the above function either.

    https://github.com/Unity-Technologies/Graphics/commit/76150c8114170b7e2359ccd786c02b22bafdb125

    I get an error in BeginContextRendering:
    Recursive rendering is not supported in SRP (are you calling Camera.Render from within a render pipeline?).
     
    Last edited: Nov 22, 2023
  9. nasos_333

    nasos_333

    Joined:
    Feb 13, 2013
    Posts:
    13,549
    About the last statement: I think if something works, it works. Adding a layer to stop it working will not solve anything, imo.

    For example, on platforms where blitting with the source and destination being the same texture is not a problem, it is a powerful tool and faster.

    Removing this and forcing the use of two passes to do the same thing can make things less optimized, I suppose.

    It would be best if this was enforced only on the platforms that do have an issue, for example.

    I think something similar has happened with the swapchain crashes, which is one of the most massive bugs in game graphics today.

    Trying to stop the rare case where a GPU race could stall the program, by adding a timeout that crashes everything, was definitely not the solution, especially since it crashes randomly even when the frame rate is 30fps and the GPU is clearly not stalling when using older Unity and NVIDIA drivers.
     
    Last edited: Nov 24, 2023
  10. _geo__

    _geo__

    Joined:
    Feb 26, 2014
    Posts:
    1,405
    Last edited: Nov 25, 2023
  11. kripto289

    kripto289

    Joined:
    Feb 21, 2013
    Posts:
    539
    You can render custom passes without the "render feature" editor.
    I do not know why the Unity devs have not added this to the API description yet.

    Code (CSharp):
    void OnEnable()
    {
        RenderPipelineManager.beginCameraRendering += OnBeforeCameraRendering;
    }

    void OnDisable()
    {
        RenderPipelineManager.beginCameraRendering -= OnBeforeCameraRendering;
    }

    private void OnBeforeCameraRendering(ScriptableRenderContext context, Camera cam)
    {
        var data = cam.GetComponent<UniversalAdditionalCameraData>();
        if (usePass1) data.scriptableRenderer.EnqueuePass(YourScriptableRenderPass1);
    }
     
    AljoshaD and nasos_333 like this.
  12. Le_Tai

    Le_Tai

    Joined:
    Jun 20, 2014
    Posts:
    443
    Looking at the doc, it seems that the frame buffer after post-processing can be accessed? If so, it would remove a massive pain point for me. I hope it will also work when FXAA or FSR is on, unlike the latest version of URP.

    Can you create an example that shows how to support the existing SRP API, the built-in render pipeline, and RG with minimal code duplication? I understand more complex things are very different between them, but simply blitting the screen to another texture should be possible with some shimming + preprocessor, right? About 70% of users use the last 2 LTS versions, so asset store publishers will have to maintain the old SRP API for quite a while. And the last time it was discussed, most users were still using BIRP.
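    [Editor's note] A rough sketch of the kind of shim being asked for, using version defines. UNITY_2023_3_OR_NEWER is a real define; the class and method names are made up, the RTHandle allocation is omitted, and this is a sketch under those assumptions, not an official recommendation.

    ```csharp
    // Hedged sketch: one pass source file targeting both the RG path (2023.3+)
    // and the older Execute path, funneling into one shared blit helper.
    public class ScreenCopyPass : ScriptableRenderPass
    {
        RTHandle m_Destination; // allocated elsewhere for the non-RG path (allocation omitted)

        // Shared logic: the only place the actual copy happens
        static void DoCopy(CommandBuffer cmd, RTHandle source, RTHandle destination)
        {
            cmd.SetRenderTarget(destination);
            Blitter.BlitTexture(cmd, source, new Vector4(1, 1, 0, 0), 0, false);
        }

    #if UNITY_2023_3_OR_NEWER
        class PassData { internal TextureHandle src; internal TextureHandle dst; }

        public override void RecordRenderGraph(RenderGraph renderGraph, ContextContainer frameData)
        {
            var resourceData = frameData.Get<UniversalResourceData>();
            var desc = frameData.Get<UniversalCameraData>().cameraTargetDescriptor;
            desc.depthBufferBits = 0;

            // Using an unsafe pass so the shared CommandBuffer-based helper can be reused
            using (var builder = renderGraph.AddUnsafePass<PassData>("Screen Copy", out var passData))
            {
                passData.src = resourceData.activeColorTexture;
                passData.dst = UniversalRenderer.CreateRenderGraphTexture(renderGraph, desc, "_ScreenCopy", false);
                builder.UseTexture(passData.src);
                builder.UseTexture(passData.dst, AccessFlags.Write);
                builder.SetRenderFunc((PassData data, UnsafeGraphContext ctx) =>
                    DoCopy(CommandBufferHelpers.GetNativeCommandBuffer(ctx.cmd), data.src, data.dst));
            }
        }
    #endif

        // Compatibility Mode / older URP versions
        public override void Execute(ScriptableRenderContext context, ref RenderingData renderingData)
        {
            var cmd = CommandBufferPool.Get("Screen Copy");
            DoCopy(cmd, renderingData.cameraData.renderer.cameraColorTargetHandle, m_Destination);
            context.ExecuteCommandBuffer(cmd);
            CommandBufferPool.Release(cmd);
        }
    }
    ```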
     
    GoGoGadget and Lars-Steenhoff like this.
  13. nasos_333

    nasos_333

    Joined:
    Feb 13, 2013
    Posts:
    13,549
    Hi,

    Some feedback: I have a shader in Shader Graph and would like to use it for post-processing, as I did in the previous system.

    The issue is that I need to replace the vertex shader with the full-screen one, and if I use the render-to-texture option in the graph there seem to be many issues, like not being able to use the world space node.

    I fixed it by exporting the code and simply replacing the vert with the Vert function from the blit sample, but it would be nice if this could be automated from Shader Graph, without needing to export the code and without having any nodes disabled.

    Thanks
     
  14. _geo__

    _geo__

    Joined:
    Feb 26, 2014
    Posts:
    1,405
    Thanks (not sure if I knew at the time). But that's only part of what that particular API question is about. It's about disabling/enabling or modifying properties of existing features at runtime. This was easily done in Built-In, yet in SRPs it's (partially) hidden away and we have to use reflection to do it (SSAO in URP, for example).

    SSAO might be a bad example because it also depends on shader stripping being off, but in general a "scriptable" render pipeline should expose features, not hide them away. That's what I'd like to be fixed in the new API. I had to use this reflection approach in my settings asset so I could disable SSAO even if the user had added it.
     
    Last edited: Nov 25, 2023
    OCASM likes this.
  15. nasos_333

    nasos_333

    Joined:
    Feb 13, 2013
    Posts:
    13,549
    Hi,

    How can we set a specific material, e.g. the one used to draw the depth normals texture?

    Also, I read in the manual that there is a normals texture that can contain depth if a certain depth prepass is used, but I am not sure how to apply that.

    "cameraNormalsTexture:
    Camera normals texture. Contains the scene
    depth if the DepthNormals Prepass pass is
    executed
    (see ConfigureInput)
    "

    Is it possible to have a version of the above code that will write to the depth normals texture, using the given layer mask?

    Thanks in advance
     
  16. ManueleB

    ManueleB

    Unity Technologies

    Joined:
    Jul 6, 2020
    Posts:
    110
    Yeah, with RasterPass each blit targeting a render target with different dimensions would require a separate RasterPass. In general, RasterPass doesn't provide much advantage in these scenarios, since passes using targets at different resolutions cannot be merged.

    So for scenarios like this we are providing an "UnsafePass" API, which lets you forgo the benefits and optimizations of RasterPasses and just use the old cmd.SetRenderTarget and the rest of the old CommandBuffer interface, so you should be able to reuse most of your old API code.
    Some of the reasons for using UnsafePass are:
    • in some cases you are sure RasterPass optimizations wouldn't bring much in terms of performance improvements (i.e. the bloom downsample/upsample pyramid of your example)
    • you are happy to sacrifice performance for ease of use (i.e. writing a debug-only pass)
    • in general, to make life easier when supporting both the non-RG and RG paths
    • to simplify the process of porting to RG: first port everything using UnsafePass, then optimize later by converting to RasterPass
    UnsafePass is currently not available in the public release, but it should be there with the next tech stream release, so VERY soon (a few days?). We are updating the document to include UnsafePasses and use-case scenarios.

    The following UnsafePass example shows how to implement a simple downsample effect within a single UnsafePass, instead of multiple RasterPasses:


    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Rendering.RenderGraphModule;
    using UnityEngine.Rendering;
    using UnityEngine.Rendering.Universal;

    public class UnsafePassRenderFeature : ScriptableRendererFeature
    {
        class UnsafePass : ScriptableRenderPass
        {
            // This class stores the data needed by the pass, passed as a parameter to the delegate function that executes the pass
            private class PassData
            {
                internal TextureHandle src;
                internal TextureHandle dest;
                internal TextureHandle destHalf;
                internal TextureHandle destQuarter;
            }

            // This static method executes the pass and is passed as the RenderFunc delegate to the RenderGraph render pass
            static void ExecutePass(PassData data, UnsafeGraphContext context)
            {
                // Set the RenderTarget manually for each blit. Each SetRenderTarget call would require a separate RasterCommandPass if we wanted
                // to set up RenderGraph for merging passes when possible.
                // In this case we know that these 3 subpasses are not compatible for merging, because the RenderTargets have different dimensions,
                // so we simplify our code to use an unsafe pass, also saving RenderGraph processing time.

                // copy the current scene color

                CommandBuffer unsafeCmd = CommandBufferHelpers.GetNativeCommandBuffer(context.cmd);

                unsafeCmd.SetRenderTarget(data.dest);
                Blitter.BlitTexture(unsafeCmd, data.src, new Vector4(1, 1, 0, 0), 0, false);

                // downscale x2

                unsafeCmd.SetRenderTarget(data.destHalf);
                Blitter.BlitTexture(unsafeCmd, data.dest, new Vector4(1, 1, 0, 0), 0, false);

                unsafeCmd.SetRenderTarget(data.destQuarter);
                Blitter.BlitTexture(unsafeCmd, data.destHalf, new Vector4(1, 1, 0, 0), 0, false);

                // upscale x2

                unsafeCmd.SetRenderTarget(data.destHalf);
                Blitter.BlitTexture(unsafeCmd, data.destQuarter, new Vector4(1, 1, 0, 0), 0, false);

                unsafeCmd.SetRenderTarget(data.dest);
                Blitter.BlitTexture(unsafeCmd, data.destHalf, new Vector4(1, 1, 0, 0), 0, false);
            }

            // This is where the renderGraph handle can be accessed.
            // Each ScriptableRenderPass can use the RenderGraph handle to add multiple render passes to the render graph
            public override void RecordRenderGraph(RenderGraph renderGraph, ContextContainer frameData)
            {
                string passName = "Unsafe Pass";

                // This simple pass copies the active color texture to a new texture, then downsamples the source texture twice. This sample is for API demonstration purposes,
                // so the new textures are not used anywhere else in the frame; you can use the frame debugger to verify their contents.
                // The key concept of this sample is the UnsafePass usage: these passes are unsafe and allow commands like SetRenderTarget() which are
                // not compatible with RasterRenderPasses. Using UnsafePasses means that the RenderGraph won't try to optimize the pass by merging it inside a NativeRenderPass.
                // In some cases using UnsafePasses makes sense, for example when we know that a set of adjacent passes are not mergeable; this can optimize the RenderGraph
                // compile times, on top of simplifying the setup of multiple passes.

                // add an unsafe render pass to the render graph, specifying the name and the data type that will be passed to the ExecutePass function
                using (var builder = renderGraph.AddUnsafePass<PassData>(passName, out var passData))
                {
                    // UniversalResourceData contains all the texture handles used by the renderer, including the active color and depth textures.
                    // The active color and depth textures are the main color and depth buffers that the camera renders into
                    UniversalResourceData resourceData = frameData.Get<UniversalResourceData>();

                    // Fill up the passData with the data needed by the pass

                    // Get the active color texture through the frame data, and set it as the source texture for the blit
                    passData.src = resourceData.activeColorTexture;

                    // The destination textures are created here.
                    // The first texture is created with the same dimensions as the active color texture, but with no depth buffer, being a copy of the color texture;
                    // we also disable MSAA as we don't need multisampled textures for this sample.
                    // The other two textures each halve the resolution of the previous one.

                    UniversalCameraData cameraData = frameData.Get<UniversalCameraData>();
                    RenderTextureDescriptor desc = cameraData.cameraTargetDescriptor;
                    desc.msaaSamples = 1;
                    desc.depthBufferBits = 0;

                    TextureHandle destination = UniversalRenderer.CreateRenderGraphTexture(renderGraph, desc, "UnsafeTexture", false);
                    desc.width /= 2;
                    desc.height /= 2;
                    TextureHandle destinationHalf = UniversalRenderer.CreateRenderGraphTexture(renderGraph, desc, "UnsafeTexture2", false);
                    desc.width /= 2;
                    desc.height /= 2;
                    TextureHandle destinationQuarter = UniversalRenderer.CreateRenderGraphTexture(renderGraph, desc, "UnsafeTexture3", false);
                    passData.dest = destination;
                    passData.destHalf = destinationHalf;
                    passData.destQuarter = destinationQuarter;

                    // We declare the src texture as an input dependency to this pass, via UseTexture()
                    builder.UseTexture(passData.src);

                    // UnsafePasses don't set up their outputs using UseTextureFragment/UseTextureFragmentDepth; you should declare your writes with UseTexture instead
                    builder.UseTexture(passData.dest, AccessFlags.Write);
                    builder.UseTexture(passData.destHalf, AccessFlags.Write);
                    builder.UseTexture(passData.destQuarter, AccessFlags.Write);

                    // We disable culling for this pass for the demonstration purposes of this sample, as normally this pass would be culled,
                    // since the destination texture is not used anywhere else
                    builder.AllowPassCulling(false);

                    // Assign the ExecutePass function to the render pass delegate, which will be called by the render graph when executing the pass
                    builder.SetRenderFunc((PassData data, UnsafeGraphContext context) => ExecutePass(data, context));
                }
            }
        }

        UnsafePass m_UnsafePass;

        /// <inheritdoc/>
        public override void Create()
        {
            m_UnsafePass = new UnsafePass();

            // Configures where the render pass should be injected.
            m_UnsafePass.renderPassEvent = RenderPassEvent.AfterRenderingTransparents;
        }

        // Here you can inject one or multiple render passes in the renderer.
        // This method is called once per camera when setting up the renderer.
        public override void AddRenderPasses(ScriptableRenderer renderer, ref RenderingData renderingData)
        {
            renderer.EnqueuePass(m_UnsafePass);
        }
    }
    If you bind your result as a global texture (check the RG doc shared in the first post for "how to set globals"), then any pass after it can sample it in its material shaders.
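    [Editor's note] As a concrete sketch of that suggestion: the raster pass builder exposes SetGlobalTextureAfterPass for binding a pass output to a global shader property. The "_MyEffectResult" name and the surrounding variables are made up; this is a sketch, not an official snippet.

    ```csharp
    // Hedged sketch: publish this pass's output under a global shader name so any
    // later pass's material can sample it without further plumbing.
    using (var builder = renderGraph.AddRasterRenderPass<PassData>("Publish Effect Result", out var passData))
    {
        builder.UseTextureFragment(result, 0); // 'result' is this pass's render target

        // After the pass runs, shaders can sample the texture as _MyEffectResult
        builder.SetGlobalTextureAfterPass(result, Shader.PropertyToID("_MyEffectResult"));

        builder.SetRenderFunc((PassData data, RasterGraphContext ctx) =>
        {
            // ... draw into the render target here ...
        });
    }
    ```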
     
    Last edited: Nov 27, 2023
    TSWessel, DrViJ, Kabinet13 and 2 others like this.
  17. nasos_333

    nasos_333

    Joined:
    Feb 13, 2013
    Posts:
    13,549
    Hi,

    Is it possible to have a sample render graph renderer feature that can be assigned to a second scene camera and render depth, normals, and depth-normals textures for selected scene objects, by layer, from that camera?

    That would be very helpful.

    Thanks
     
  18. nasos_333

    nasos_333

    Joined:
    Feb 13, 2013
    Posts:
    13,549

    Hi,

    Adding to my request above (a sample render graph renderer feature for a second scene camera that renders depth, normals, and depth-normals textures for selected objects by layer): it would also be great to have a toggle option to compute normals from the mesh normals alone or combined with the texture normals. Both are useful for different scenarios.

    Thanks
     
    Last edited: Nov 29, 2023
  19. ManueleB

    ManueleB

    Unity Technologies

    Joined:
    Jul 6, 2020
    Posts:
    110
    You can check the RG implementation in Packages/com.unity.render-pipelines.universal/Runtime/Passes/DepthNormalOnlyPass.cs for something similar.
     
    nasos_333 likes this.
  20. nasos_333

    nasos_333

    Joined:
    Feb 13, 2013
    Posts:
    13,549
    Thanks, will check on it
     
  21. Yoraiz0r

    Yoraiz0r

    Joined:
    Apr 26, 2015
    Posts:
    91
    Hello, I'm reading on the adjustments being made to RenderGraph and I wanted to express one experience I had trouble creating with the old renderpasses, despite them being less verbose. I'd like to request either information that confirms this is achievable, or a way to be added with RenderGraph to achieve this, please.

    There was no direct method to draw certain renderers as they should be drawn in a specific context.
    I'm talking about a specific group of renderers, not based on a layer mask.
    For the sake of simplicity, assume that at runtime I collect a group of IEnumerable<Renderer> and I want to iterate over each of those specific renderers and draw them in a specific way (such as with a specific material).

    Over my research into this I stumbled upon multiple approaches, but they all turned out to be duds.
    CommandBuffer.DrawRenderer misses lighting information and relies on a render pass; without a quick way to set up said lighting information, it is quite cumbersome.

    I don't quite understand yet why ScriptableRenderContext.DrawRenderers does not have an overload that takes actual renderers and doesn't suffer from the lack of lighting information.

    I've got a use case where I want to take a small list of objects that use any material (not based on a specific shader; I cannot guarantee they have any particular layer, material, or any particular shader tag for a special pass on their material) and render them into a screen-sized render texture for later processing.

    Imagine something akin to Unity's scene view orange highlight on selection. It has to work on any mesh using any material in any layer (except, ideally, with lighting information for the override material to consume).

    Due to the low object count I intend for this use case (under 10 of them at any given point), performance hasn't been a real concern, but the actual way to do it has been making me scratch my head.

    To summarize the request: I'd like a way to provide an actual list of renderers (anything IEnumerable<Renderer>, ideally) to be rendered with a custom material, in a custom pass, without using layer masks. Particularly targeting Mesh Renderers and Skinned Mesh Renderers, though Particle System renderers would be good too.

    Also, separately from the request above, if I understand this correctly, with RenderGraph it's still possible to create a 'grab pass'-like pass, right? Having only one pass containing opaque objects has been a pain point for me, as shockwaves that can also distort other transparent effects on screen have a lot of value.
     
    colin299, kripto289 and saskenergy like this.
  22. nasos_333

    nasos_333

    Joined:
    Feb 13, 2013
    Posts:
    13,549
    Hi,

    Is there any reason that could disallow rendering to the final screen buffer?

    Code (csharp):
    //FINAL
    using (var builder = renderGraph.AddRasterRenderPass<PassData>("Color Blit Resolve4aqa", out var passData, m_ProfilingSampler))
    {
        builder.AllowPassCulling(false);
        builder.AllowGlobalStateModification(true);
        passData.BlitMaterial = _material;
        // Similar to the previous pass, however now we set the destination texture as input and the source as output.
        passData.src = builder.UseTexture(tmpBuffer1A, IBaseRenderGraphBuilder.AccessFlags.Read);
        builder.UseTextureFragment(resourceData.activeColorTexture, 0, IBaseRenderGraphBuilder.AccessFlags.Write);
        // We use the same BlitTexture API to perform the Blit operation.
        builder.SetRenderFunc((PassData data, RasterGraphContext rgContext) =>
            ExecuteBlitPass(data, rgContext, 37, passData.src));
    }
    I try to write to resourceData.activeColorTexture, but nothing is rendered. I have set a global variable with the actual result just above this, which I can sample in a shader, but the final result is not written to the screen by the above blit function that samples the global variable. I only get the original screen.

    Same if I use the simpler blit function that takes my last render handle (which I set to the global that previews correctly) and try to also write it to the camera.

    I use an if statement in the code, so the first branch of the "if" does write the result, and I wonder if that has something to do with it.

    I assume one of my calls before the last call that writes to the camera buffer may also affect it somehow.

    If I put the above code before any other blit (e.g. when copying the camera buffer to a temporary texture handle), it works, but of course the result is wrong.

    Thanks
     
    Last edited: Dec 3, 2023
  23. GoGoGadget

    GoGoGadget

    Joined:
    Sep 23, 2013
    Posts:
    868
    Can I just say, calling a non-UI-based API "Render Graph" (when Unity already has multiple other 'graphs') is a prime example of why marketing people, and not graphics programmers, should design names for things.

    Name aside, before I have a play around with it, I'm just echoing what some others have said here in asking for less boilerplate when this is forced upon us. Unity 2014 through 2019 managed to have the most pleasant and simple blitting/post-processing API by far; it was legitimately joyful to use (and still performed well!).

    Is there really no way a Render Graph can handle a blit between two sizes of texture normally? Edit: Thanks Artem for clarifying below.

    It would be a shame if a bloom post-processing effect (or any of the other many types of post FX that use RTs of different sizes in the chain, i.e. sun shafts, DoF, etc.) ends up being a half-normal, half-'unsafe'-pass monstrosity.
     
    Last edited: Dec 7, 2023
    kripto289 likes this.
  24. nasos_333

    nasos_333

    Joined:
    Feb 13, 2013
    Posts:
    13,549
    You can create the downsized texture holders at initialization.
     
    GoGoGadget likes this.
  25. AljoshaD

    AljoshaD

    Unity Technologies

    Joined:
    May 27, 2019
    Posts:
    282
    This is a general truth indeed :)
    In our defense, it is actually a graph. And the name RenderGraph / Frame Graph is widely used in the industry.


    That's great to hear, we should keep that indeed!
     
    ElliotB and GoGoGadget like this.
  26. nasos_333

    nasos_333

    Joined:
    Feb 13, 2013
    Posts:
    13,549
    Hi,

    How do we clear the current render target with the skybox?

    e.g. the equivalent of:
    RenderTexture.active = currentRT;
    GL.ClearWithSkybox(false, Camera.main);

    Thanks
     
  27. Neonage

    Neonage

    Joined:
    May 22, 2020
    Posts:
    288
    I see that the built-in passes still use the old execute code / resource management when RenderGraph is disabled.
    Why can't RenderGraph code be executed in the old render loop, instead of throwing "Execute is not implemented, the pass won't be executed in the current render loop"?
    All these ever-changing APIs were already a nightmare to deal with, and this S*** quadruples the fun.
     
    nasos_333 likes this.
  28. nasos_333

    nasos_333

    Joined:
    Feb 13, 2013
    Posts:
    13,549
    Because, as with every single change since the inception of the pipelines, it is left to the end user to adapt to any radical changes, rather than abstracting them away on the backend.

    For example, if Render Graph is selected, the backend should just run the Blit with the Render Graph function, without requiring any changes to our code.
     
  29. ManueleB

    ManueleB

    Unity Technologies

    Joined:
    Jul 6, 2020
    Posts:
    110
    To run in the non-RG path you need to implement Execute(); in RG you implement RecordRenderGraph(). If you implement both (like the built-in passes), you can run on both paths. You can look at how the built-in passes share code between RG and non-RG, which is also explained in the document linked at the start of this thread.
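    A minimal sketch of that dual-path pattern, assuming the 2023.3 alpha API (the exact signatures — RecordRenderGraph(RenderGraph, ContextContainer) — and helpers such as CommandBufferHelpers.GetRasterCommandBuffer have changed between alpha builds, so treat the names as illustrative):

    ```csharp
    // Hedged sketch of a pass that implements both entry points so it runs on
    // both the RenderGraph path and the Compatibility Mode path.
    using UnityEngine.Rendering;
    using UnityEngine.Rendering.RenderGraphModule;
    using UnityEngine.Rendering.Universal;

    class DualPathPass : ScriptableRenderPass
    {
        class PassData { }

        // Shared body, written against RasterCommandBuffer so both paths can call it.
        static void ExecutePass(RasterCommandBuffer cmd)
        {
            // ... actual draw/blit commands go here ...
        }

        // Compatibility Mode (non-RG) entry point.
        public override void Execute(ScriptableRenderContext context, ref RenderingData renderingData)
        {
            CommandBuffer cmd = CommandBufferPool.Get("DualPathPass");
            ExecutePass(CommandBufferHelpers.GetRasterCommandBuffer(cmd));
            context.ExecuteCommandBuffer(cmd);
            CommandBufferPool.Release(cmd);
        }

        // RenderGraph entry point.
        public override void RecordRenderGraph(RenderGraph renderGraph, ContextContainer frameData)
        {
            using (var builder = renderGraph.AddRasterRenderPass<PassData>("DualPathPass", out PassData passData))
            {
                builder.SetRenderFunc((PassData data, RasterGraphContext ctx) => ExecutePass(ctx.cmd));
            }
        }
    }
    ```

    The design point is that the shared body takes a RasterCommandBuffer, which the compatibility path can supply via a wrapper helper and the RenderGraph render function supplies natively, so the rendering logic lives in one place.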
     
    ElliotB, tmonestudio and nasos_333 like this.
  30. AljoshaD

    AljoshaD

    Unity Technologies

    Joined:
    May 27, 2019
    Posts:
    282
    Indeed, we spent a lot of time making sure that both the RenderGraph version of URP and the non-RG version are available in 23, so that we can offer a Compatibility Mode that still runs the previous ScriptableRenderPass API. This was done to make it easier to upgrade to 23.3, but also to give our ecosystem more time to provide upgraded assets.


    I assume you mean running the old API with RG? Unfortunately, the ScriptableRenderPass API and the URP foundation (i.e. RenderGraph) are tightly coupled. The purpose of the new API is to allow RenderGraph to automatically optimise extensions. The behaviour of the new API is also much better defined. We investigated supporting the old API in the RG path, but it seemed technically impossible to guarantee the same behaviour in all use cases and on all platforms. Instead, we have Compatibility Mode, and we also added the "UnsafePass" (documentation is coming soon) to simplify upgrading to RenderGraph.
     
    nasos_333 likes this.
  31. nasos_333

    nasos_333

    Joined:
    Feb 13, 2013
    Posts:
    13,549
    Hi,

    I also need to add that the coexistence of the two is crucial, so we can toggle between the two versions quickly and debug issues during porting.

    If it were not for the ease of direct comparison, it would take forever to convert more complex effects, so Compatibility Mode is a very welcome feature until the Graph is fully adopted.
     
    oliverschnabel and AljoshaD like this.
  32. Neonage

    Neonage

    Joined:
    May 22, 2020
    Posts:
    288
    No, I mean automatically running RG in old render loop functions, without touching the old API.

    Supporting old functions means having to maintain two different resource management processes and boilerplate data structures, for old and new, which defeats the whole purpose of this automation.
     
  33. AljoshaD

    AljoshaD

    Unity Technologies

    Joined:
    May 27, 2019
    Posts:
    282
    I think we mean the same thing when I mentioned running the old API with RG in the render loop functions (and features).
     
  34. oliverschnabel

    oliverschnabel

    Unity Technologies

    Joined:
    Mar 13, 2018
    Posts:
    49
    Hello Everyone,

    thank you for all the great feedback so far! We are happy to see that some of you have already been successful in incorporating the API changes into existing effect assets. We know that any adjustments to APIs need to happen with care, and we want to support you as best we can on the journey to benefit from the introduction of RenderGraph in URP.

    For new projects, Render Graph in URP is now enabled by default in Unity 2023.3.0a18 and later. If you open a project created in an earlier Unity version which did not use Render Graph, Unity enables the option Compatibility Mode (Render Graph Disabled). This option lets you migrate the existing effects to the new API as described in this document.

    We updated the alpha documentation with the following additions:

    New API to Set Global Textures: In certain cases you might want to bind textures to so-called global slots, meaning they are available to any pass in the pipeline. The main use case for this feature is to make textures available to opaque draw passes, such as draw renderer lists.

    Renamed API UseTextureFragment to SetRenderAttachment in the RenderGraphBuilder

    Introduction of Unsafe Passes: On top of RasterPass and ComputePass, the RenderGraph API now provides an UnsafePass type, which lets you opt out of the benefits and optimizations of RasterPasses and use the old cmd.SetRenderTarget and all the legacy CommandBuffer interfaces. This allows you to reuse more of the old API code.
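    Based on that description, an unsafe pass might be sketched roughly as follows (hedged: AddUnsafePass, UnsafeGraphContext, and CommandBufferHelpers.GetNativeCommandBuffer are taken from the alpha documentation and may change before release; the resource setup is elided):

    ```csharp
    // Hedged sketch of an UnsafePass; API names follow the 2023.3 alpha docs.
    using UnityEngine;
    using UnityEngine.Rendering;
    using UnityEngine.Rendering.RenderGraphModule;
    using UnityEngine.Rendering.Universal;

    class LegacyBlitPass : ScriptableRenderPass
    {
        class PassData
        {
            public TextureHandle source;
            public TextureHandle destination;
        }

        public override void RecordRenderGraph(RenderGraph renderGraph, ContextContainer frameData)
        {
            using (var builder = renderGraph.AddUnsafePass<PassData>("LegacyBlitPass", out PassData passData))
            {
                // Resources must still be declared so the graph can track them.
                // (Filling in source/destination is omitted in this sketch.)
                builder.UseTexture(passData.source);
                builder.UseTexture(passData.destination, AccessFlags.Write);

                builder.SetRenderFunc((PassData data, UnsafeGraphContext ctx) =>
                {
                    // Inside an unsafe pass the legacy CommandBuffer interface is
                    // allowed again, at the cost of RasterPass merging/optimizations.
                    CommandBuffer cmd = CommandBufferHelpers.GetNativeCommandBuffer(ctx.cmd);
                    cmd.SetRenderTarget(data.destination);
                    Blitter.BlitTexture(cmd, data.source, new Vector4(1, 1, 0, 0), 0, false);
                });
            }
        }
    }
    ```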

    Updates on the Render Graph Viewer for Debugging

    We are currently working on more utility functions to remove unnecessary boilerplate code and reduce the overhead on your side. We will share them in the new year.

    Thank you for your feedback! Let us know here if we can assist you further.
     
    DrViJ, dnach, JesOb and 4 others like this.
  35. optimise

    optimise

    Joined:
    Jan 22, 2014
    Posts:
    2,137
    Will Unity officially make the Render Graph high-level APIs SRP-agnostic, so that the same API calls work nicely on both URP and HDRP?
     
  36. AljoshaD

    AljoshaD

    Unity Technologies

    Joined:
    May 27, 2019
    Posts:
    282
    Yes, the RecordRenderGraph API is SRP agnostic and can be adopted by HDRP in the next release. HDRP already uses RenderGraph under the hood, but the CustomPass does not yet expose RenderGraph.
     
  37. nasos_333

    nasos_333

    Joined:
    Feb 13, 2013
    Posts:
    13,549
    Hi,

    Thanks for the info

    Will the renamed API break current effects, or will their code be auto-converted when imported into the newer Unity version, for example?
     
  38. AljoshaD

    AljoshaD

    Unity Technologies

    Joined:
    May 27, 2019
    Posts:
    282
    Since these are changes between alpha patch versions, I don't expect them to auto upgrade. That said, the team might have managed to set that up for some of these changes. We don't expect breaking changes anymore now that we are almost at the beta cycle.
     
  39. adamgolden

    adamgolden

    Joined:
    Jun 17, 2019
    Posts:
    1,558
    As of 2023.3.0a18 (vs. 0a17) some assets in a project created recently broke. There was no auto-upgrade - but thanks to this thread I was able to learn about the compatibility mode, and turning that on plus commenting out some code that was saying #if UNITY_2023_1_OR_NEWER and making it use the #else part only / older way appears to have entirely fixed the project. For anyone else, it's under Project Settings->Graphics,
    disabling_render_graph.png

    So my concern is.. how long can we count on this compatibility mode being available? Will it be available through Unity 6 LTS cycle? I can imagine this affects a lot of assets we have purchased, so we're at the mercy of publishers to release updates (and we will have to re-purchase new "Unity 6-compatible" versions of some assets as well, where the publishers find it's a lot of work.. which I've noticed was already announced by one shortly after 0a18 was released). Thanks!
     
    _geo__ and ElliotB like this.
  40. AljoshaD

    AljoshaD

    Unity Technologies

    Joined:
    May 27, 2019
    Posts:
    282
    Our goal is to make our users and our asset store providers successful. For our users, RG will mean more stability and better performance when extending the rendering. For asset creators, it means an API that is better defined, better documented, more productive once learned, and that offers performance by default. We're dedicated to supporting asset providers in this transition. We will closely monitor the progress of asset conversion in the marketplace and make sure we support Compatibility Mode during this transition. How we approach this exactly will be based on your feedback.
     
  41. adamgolden

    adamgolden

    Joined:
    Jun 17, 2019
    Posts:
    1,558
    My one suggestion would be a comprehensive "upgrade guide" - so for users that are comfortable modifying code, we're able to make assets work again without compatibility mode or contacting asset support, and for publishers to be able to update their assets without spending time combing through threads looking for "how do I.." about each thing that changed - because a very well-known and veteran publisher already stating we'll have to pay for a new version that will be for Unity 6 does not bode well. Thanks for the quick reply.
     
    oliverschnabel likes this.
  42. nasos_333

    nasos_333

    Joined:
    Feb 13, 2013
    Posts:
    13,549
    Hi,

    From converting my systems so far, it can take a crazy amount of time, essentially redoing all image effects in both the C# setup and the shader code.

    I have done all sorts of combined code in shaders to make this work with all effects; I think it is one of the hardest changes by miles, beaten only by the adaptation from BiRP to the pipelines.

    I had to deconstruct how the full-screen quad is done and adapt all my vertex shaders to it. The code is also 3 to 4 times longer, as simple one-line blits are now complex functions with multiple setup lines.

    So I would expect publishers to go for a separate 2023.3 version, as there is a crazy amount of extra work and ingenuity involved in converting all effects. Supporting 2021, 2022, and 2023 in the same code is extremely hard now, as there are so many #if conditions, so going 2023-native seems like the only way.

    How the pricing will work depends on the publisher; personally, per my usual policy, I plan a very small update fee for the 2023.3 versions, if any at all, even with all the massive work involved.

    Also, since I had to restructure the C# code so much, in some cases I see lower performance due to lost optimizations, as it is much harder to set up lower-resolution render targets and keep track of them. I expect to keep optimizing for months after the initial releases; getting back to the previous state is more than massive work, more like a new, endless cycle of optimizing for the new system all over again.
     
    Last edited: Dec 15, 2023
  43. nasos_333

    nasos_333

    Joined:
    Feb 13, 2013
    Posts:
    13,549
    Now on to the good things, since they need to be mentioned.

    One major one is that there is no need to clear the render targets manually. This is certainly a big plus. It also helps during conversion.

    Another perk is that it forces a certain structure that might be less prone to bugs or issues in code.

    Maybe there is some eventual performance benefit in some cases, since the graph is optimized before submission to rendering. I have not yet seen this, but I can see how it might help if the effects are structured correctly.
     
    adamgolden likes this.
  44. flyer19

    flyer19

    Joined:
    Aug 26, 2016
    Posts:
    126
    Can Render Graph enable a node-based visual render pipeline system, like the Frostbite Engine's FrameGraph?
     
  45. adamgolden

    adamgolden

    Joined:
    Jun 17, 2019
    Posts:
    1,558
    I would rather pay a small upgrade fee than see publishers abandon their assets because of this, which I truly expect to see happening in some cases, because many (most?) of you already aren't making much in return for the amount of time that needs to be invested (in updating for new editors, improving, supporting), and as you've indicated, this is truly significant and costly when considering the number of hours involved.

    I understand what's happening now is a part of progress - Unity can't just endlessly continue to support old systems, and expecting seamless coexistence of all ways old and new in the same project isn't realistic. However, it would benefit Unity, publishers and users to invest the time providing documentation for each of the changes and error messages that will now appear, and explain what you need to do to achieve the same things / replicate the broken functionality in Render Graph. If a publisher like yourself can simply CTRL+F/search for the error message or old code somewhere and then see clear instructions for how to do it in Unity 6.. compared to each of you investing much effort researching and experimenting.. this replacement system would end up being less overwhelming for all concerned. If Unity wants everyone using their latest editor as soon as possible, it can't just break projects and make things obsolete without notice and without a clear path to repair/upgrade.. especially if this "compatibility mode" isn't going to stick around for long.
     
    JesOb and Neonage like this.
  46. adamgolden

    adamgolden

    Joined:
    Jun 17, 2019
    Posts:
    1,558
    Another consideration about a lasting "compatibility mode" (assuming enabling it means Render Graph becomes unavailable/disabled): as publishers start updating their assets to use Render Graph when the editor version is detected as new enough, a conflict arises where we can either turn the mode off and have the new versions of assets working, or keep it on but be stuck using older versions of those assets. So there will be a time when some assets we're using will be broken regardless of which mode we set, and we'll have to pay close attention to keeping multiple versions of assets on hand. Just a thought - we'll get through the transition eventually, but for the immediate future I imagine this will be a thorn in the side of many.
     
  47. Yoraiz0r

    Yoraiz0r

    Joined:
    Apr 26, 2015
    Posts:
    91
    I downloaded Unity 2023.3.0a18 today in hopes of trying the RenderGraph, but I'm getting memory leaks inconsistently when trying to enter playmode, and the stack trace points to RenderGraph...
    The root seems to be

    Unity.Collections.NativeList`1<UnityEngine.Rendering.RenderGraphModule.NativeRenderPassCompiler.PassRandomWriteData>:.ctor

    With that in mind, I moved back to 2022.3.15f1 to collect feedback to bring about scriptable render passes here.
    Doing my due diligence on this post (to which I still didn't get a response :c), I tried to look at possible improvements RenderGraph could bring. One sore spot, as I mentioned above, is that I still cannot find a single clear way to set up lighting information for an object rendered manually, regardless of its shader, layer, or pass.

    Using this example code, I'd like to ask how this would look in the context of RenderGraph, which seems to want to deprecate the usage of CommandBuffer entirely.
    Code (CSharp):
    public override void Execute(ScriptableRenderContext context, ref RenderingData renderingData)
    {
        var cmdBuffer = CommandBufferPool.Get();
        cmdBuffer.Clear();

        var renderer = MySpecialClass.GetRenderer();
        var pass = MySpecialClass.GetRenderPassIndex();
        if (renderer)
        {
            LightProbeUtility.ProperSetSHCoefficients(renderer.transform.position, renderer, cmdBuffer);
            cmdBuffer.DrawRenderer(renderer, renderer.sharedMaterial, 0, pass);
        }

        context.ExecuteCommandBuffer(cmdBuffer);
        CommandBufferPool.Release(cmdBuffer);
    }
    All LightProbeUtility does is set unity_SHAr/g/b unity_SHBr/g/b unity_SHC through the command buffer before calling DrawRenderer. Would there be any limitation there for rendergraph?

    Also, the above code has revealed some undefined behavior between Forward+ and Deferred / Forward, which I'd like to touch on. It seems that Forward+ is giving DrawRenderer some information that would otherwise not be available to it. I wrote a thread for it here.
     
  48. ElliotB

    ElliotB

    Joined:
    Aug 11, 2013
    Posts:
    297
    I agree this will probably kill a large number of assets. It's not as simple as charging an upgrade fee like Unity suggests, because some users will review-bomb in reaction to that, and reviews are key to asset store sales. There's also a second aspect to the rewrites, which is a very real and lingering threat of developer burnout. There are myriad bugs and behavior quirks in URP across different engine versions (check my posts for concrete examples), and it's already 'not fun' to ship a package that is rightfully expected to work across maybe 4 editor versions, >5 platforms, and >4 APIs. Periodically throwing in major rewrites will be the breaking point at which people say 'F*** it' and abandon their package.

    Leaving on a positive note - progress must obviously be made, and it's good to see the engine improve. I'd also definitely love to see less of URP hidden as internal; it's telling how many forum users refer to needing reflection to access and modify URP in the ways needed (e.g. modifying passes on the renderer, existing targets, etc.). Hopefully we see that.
     
    adamgolden likes this.
  49. Well, the race to the bottom should end some time when it comes to pricing... Asset store creators who are too shy to charge decent money for their assets are doing themselves a disservice.

    Yes, I am not a publisher; I am a user, and I advocate for higher prices because of this instability.
     
  50. retired_unity_saga

    retired_unity_saga

    Joined:
    Sep 17, 2016
    Posts:
    296
    How do we set up our own RenderGraph/scriptable feature to do exactly what base URP 17 does, but force all lights to calculate as legacy vertex lit, while still supporting Forward+?

    I just want Forward+ and true Vertex Lights.