
Feedback Wanted: Scriptable Render Pipelines

Discussion in 'Graphics Experimental Previews' started by Tim-C, May 9, 2017.

  1. nasos_333

    nasos_333

    Joined:
    Feb 13, 2013
    Posts:
    13,358
    I had a closer look at the HDRP and realized that Book of the Dead uses a different version than the one you download standalone. For example, the sun shafts and fog are based on this changed version.

    I wonder why HDRP seems so much more limited than the standard pipeline in this regard. I mean, HDRP had to be changed to get the same results we got with the standard pipeline directly; that is not encouraging to say the least, and it seems like the API is not even close to fully realized yet.

    For example, why not build the HDRP on top of the Book of the Dead pipeline version, which enables effects like the sun shafts and fog? Is there a reason why the official HDRP is cut down?

    Thanks
     
  2. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    How is it limited? Have you explored the volumetric fog stuff?
     
  3. nasos_333

    nasos_333

    Joined:
    Feb 13, 2013
    Posts:
    13,358
    In the sense that they had to augment / change it to make the volumetric fog and lighting work.

    I understand that pipelines are changeable, but the way they are provided as two fixed pipelines makes it impractical to change either one to get a custom effect, since Asset Store customers will be incompatible, as will all other assets.

    I really don't understand the logic behind all this. I am lost as to how to proceed to be globally compatible with other assets and the official pipeline releases.

    Imagine that Unity staff themselves had to change the pipeline to get the proper effect. If that is how it works, how is everyone else supposed to go about it?

    When I first read about the 3 different incompatible pipelines I already thought this was a terrible idea; now I see that there are actually not just 3, but any number of possible pipelines that a user may keep or use.

    It is beyond crazy to build an Asset Store asset on such a setup, or to make anything custom with these pipelines, because it won't work in any actual project that uses the official release ones.

    Another issue is that those pipelines are inserted as a closed package that can't be edited, so it is not possible to override them.

    I can only imagine forcing users of the asset to use a custom local version, not the Unity release ones, and be incompatible with all other store assets that use the HDRP, which of course is not truly a solution.
     
    Last edited: Mar 14, 2019
    DaveL99 likes this.
  4. tibi_fake

    tibi_fake

    Joined:
    Jul 31, 2018
    Posts:
    7
    How about giving the job system access to the CullResults? For example, there could be an IJobParallelForCullResults that would have an Execute with MaterialPropertyAccess/RendererAccess and TransformAccess; we could use that, for example, to assign motion vectors and similar things.
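    Purely as an illustration of this proposal: IJobParallelForCullResults, RendererAccess and MaterialPropertyAccess do not exist in Unity today, so the sketch below only uses real types (NativeArray, TransformAccess) and marks the proposed parts in comments.
    Code (CSharp):
    using Unity.Collections;
    using UnityEngine;
    using UnityEngine.Jobs;

    // Hypothetical job scheduled against the cull results, one Execute per visible renderer.
    struct AssignMotionDataJob // : IJobParallelForCullResults   <-- proposed interface, does not exist
    {
        [ReadOnly] public NativeArray<Vector3> previousPositions; // cached positions from last frame

        // Proposed signature; a RendererAccess / MaterialPropertyAccess parameter would go here too.
        public void Execute(int index, TransformAccess transform)
        {
            Vector3 motion = transform.position - previousPositions[index];
            // ...hand "motion" off to a per-renderer motion vector / material property buffer
        }
    }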
     
  5. CarlLee

    CarlLee

    Joined:
    Mar 4, 2015
    Posts:
    8
    Just read the source code; for LWRP it's very easy. There are interfaces like IAfterOpaquePass, etc. Just implement one and attach it to the camera you want to register the pass to.
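    From memory of the preview-era LWRP source, the hookup looked roughly like the sketch below: a MonoBehaviour on the camera implements the injection interface and returns a pass. The exact interface name and GetPassToEnqueue signature changed between LWRP preview versions (and the interfaces were removed later, as noted further down the thread), so treat this as an illustration only; MyCustomPass stands in for your own ScriptableRenderPass subclass.
    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Experimental.Rendering.LightweightPipeline;

    // Attach to the camera whose rendering should get the extra pass.
    public class MyAfterOpaqueHook : MonoBehaviour, IAfterOpaquePass
    {
        MyCustomPass m_Pass; // placeholder for your own ScriptableRenderPass subclass

        // LWRP looked for components implementing these interfaces on the camera and
        // enqueued the returned pass at the matching injection point.
        public ScriptableRenderPass GetPassToEnqueue(RenderTextureDescriptor baseDescriptor,
                                                     RenderTargetHandle colorHandle,
                                                     RenderTargetHandle depthHandle)
        {
            return m_Pass ?? (m_Pass = new MyCustomPass());
        }
    }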
     
  6. Adam-Bailey

    Adam-Bailey

    Joined:
    Feb 17, 2015
    Posts:
    232
    As far as I interpret it, isn't this why they are so heavily pushing Shadergraph over hand-written shaders? So shaders are mostly set up in there and just converted to use a different master node?
     
  7. nasos_333

    nasos_333

    Joined:
    Feb 13, 2013
    Posts:
    13,358
    This is even worse, as a shader graph can never be as versatile as writing code. They can streamline the process, but they destroy one key aspect in doing so.
     
  8. MadeFromPolygons

    MadeFromPolygons

    Joined:
    Oct 5, 2013
    Posts:
    3,981
    It doesn't change the fact that by authoring in there you will maintain compatibility. It may not be the answer you wanted, but it is an answer with a solution.
     
  9. Kichang-Kim

    Kichang-Kim

    Joined:
    Oct 19, 2010
    Posts:
    1,011
    Hmm, with Core RP Library 5.x and PostProcessing 2.1.3, some type names conflict, which is annoying when editing scripts. Is this intended naming?

    Unity.Rendering.XXXParameter
    Unity.Rendering.Postprocessing.XXXParameter

    (XXX can be Float, Int, and so on; see the alias sketch below)
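    One C# workaround, not an answer on whether the naming is intended: alias the type you actually mean. The PostProcessing namespace below is real; for the core RP side, substitute the full namespace of the parameter type from your installed package version.
    Code (CSharp):
    // Disambiguate the colliding short name with a using alias.
    using FloatParameter = UnityEngine.Rendering.PostProcessing.FloatParameter;

    public class BloomIntensityTweaker
    {
        // Unambiguous now, even with both packages referenced.
        public FloatParameter intensity = new FloatParameter { value = 0.5f };
    }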
     
  10. PixelPup

    PixelPup

    Joined:
    Mar 6, 2018
    Posts:
    18
    Hey everyone, I am having a heck of a time trying to get volumetric lighting to work like in the original HDRP demos. I feel like I have enabled everything everywhere, but I cannot seem to locate the Volumetric Lighting Controller override. This seems to be critical to controlling the overall look, and it just isn't an option for me. I have tried in 2018.3 and in the new 2019.1 beta and I am at a loss. Unless I load an older project that has it, I can't seem to use it. Was this deprecated and replaced, and if so, with what? Thanks
     
  11. KausBorealis

    KausBorealis

    Joined:
    Sep 28, 2016
    Posts:
    8
    Hi, can someone explain to me why the culling API is so bogus? It makes SRP virtually useless for me. I can't seem to find any way to populate CullingResults manually, to avoid extra work and/or do my own culling. And of course that also means a way to iterate meshes and perform draw calls. Where's all of it? The current HD/LWRP looks like it was barely intended to be modified; you almost immediately stumble into native code.
     
  12. Taki24

    Taki24

    Joined:
    Jun 6, 2018
    Posts:
    9
    Hi, as I looked through this thread I noticed that some people had the problem with
    "the camera list passed to the render pipeline is either null or empty" (LWRP) after building the application.

    In the Package Manager you just have to update the LWRP package; that solved the problem for me.

    Sorry for the off-topic.
     
  13. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,464
    Now that LWRP is out of preview I am completely stumped at how to inject ScriptableRenderPass. Anyone have an example or documentation? I can find none...
     
  14. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,464
    Those interfaces are now gone in final release... what is the replacement?
     
  15. Grimreaper358

    Grimreaper358

    Joined:
    Apr 8, 2013
    Posts:
    789
    The new method is to use Custom Render Passes. I don't see any docs on it yet, so this video is the best resource I know of that talks about it, along with another in the GDC Vault. It was mentioned that you can do all of that with scripting as well, but I haven't seen anything on that, so I don't know where to point you.

     
    syscrusher likes this.
  16. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,464
    I was able to get a pass injected with a scriptable render feature; unfortunately there are two showstoppers for my effect:

    1] executing a command buffer blit will call the blit multiple times if the shader has multiple passes, even if you explicitly specify a pass (the call in question is sketched below), which kills performance
    2] there is no event to capture the cascade shadow map, which is death to any volumetric shadow effect
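    For reference, a minimal sketch of the blit call being described, pulled out of the pass boilerplate because the ScriptableRenderPass/ScriptableRendererFeature signatures differ between LWRP versions; the source, destination and material arguments are whatever your render feature configured.
    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Rendering;

    static class BlitHelper
    {
        public static void DoBlit(ScriptableRenderContext context,
                                  RenderTargetIdentifier source,
                                  RenderTargetIdentifier destination,
                                  Material material)
        {
            CommandBuffer cmd = CommandBufferPool.Get("MyEffect Blit");
            // Pass index 0 is requested explicitly; the report above is that the blit
            // still runs once per pass of the shader despite this argument.
            cmd.Blit(source, destination, material, 0);
            context.ExecuteCommandBuffer(cmd);
            CommandBufferPool.Release(cmd);
        }
    }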
     
  17. konsic

    konsic

    Joined:
    Oct 19, 2015
    Posts:
    995
    Where is the depth-only mode for a second camera in HDRP?
     
  18. pastaluego

    pastaluego

    Joined:
    Mar 30, 2017
    Posts:
    196
    Is there a way to hook into a certain render queue value to blit the current camera contents to a render texture using SRP? I'm still trying to wrap my head around it. For example, before objects at render queue 3005 are rendered, blit the camera contents to a render texture, like before.
     
  19. Haiisam

    Haiisam

    Joined:
    Jun 18, 2017
    Posts:
    5
    I tried to change the LWRP asset, or rather its parameters, but I ran into a problem:
    Error CS0234: The type or namespace name "LightweightPipeline" does not exist in the namespace "UnityEngine.Experimental.Rendering" (are you missing an assembly reference?)
    What could be the problem, or am I doing something wrong?
    Generally, what I want is the ability to change the resolution of shadows through UI, as well as their distance and cascades, the rendering resolution, and to enable/disable HDR, the SRP Batcher and soft shadows (a rough sketch of accessing the asset at runtime is below).

    Editor version: 2019.1.0f2
    LWRP version: 5.7.2
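    For what it's worth, a likely cause: in LWRP 5.x the scripts moved out of UnityEngine.Experimental.Rendering.LightweightPipeline (the namespace in that error) into UnityEngine.Rendering.LWRP. Below is a minimal sketch of touching the active asset at runtime, assuming LightweightRenderPipelineAsset and a settable shadowDistance are exposed in 5.7.2; check the asset class in your installed package to see which settings actually have setters.
    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Rendering;
    using UnityEngine.Rendering.LWRP; // LWRP 5.x namespace (assumed for 5.7.2)

    public class PipelineSettingsMenu : MonoBehaviour
    {
        // Hook this up to a UI slider, for example.
        public void SetShadowDistance(float distance)
        {
            var asset = GraphicsSettings.renderPipelineAsset as LightweightRenderPipelineAsset;
            if (asset != null)
                asset.shadowDistance = distance; // assumes this property exposes a setter in your version
        }
    }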
     
  20. karl_kent

    karl_kent

    Joined:
    Feb 1, 2018
    Posts:
    10
    In HDRP, anyone have any idea why clip() does not work (or does not work properly) outside of the float ADD_IDX(GetSurfaceData(..)) function in LitDataIndividualLayer.hlsl?
     
  21. RunninglVlan

    RunninglVlan

    Joined:
    Nov 6, 2018
    Posts:
    182
    Hi, is this a bug or am I just missing something?
    Using RenderPipelineManager.endCameraRendering, this example renders as expected in the Scene view, but incorrectly (on top of other objects) in the Game view.
    (screenshot attached)
    Without GraphicsSettings.renderPipelineAsset, and using Camera.onPostRender instead, the Scene and Game views are rendered the same way.
    (screenshot attached)
    Yeah, and the last time I tested it was with Unity 2019.2.0b1 and Lightweight RP 6.5.3.
    Code (CSharp):
    void OnEnable() {
        // Camera.onPostRender += OnRendered;
        RenderPipelineManager.endCameraRendering += OnRendered;
    }
    void OnDisable() {
        // Camera.onPostRender -= OnRendered;
        RenderPipelineManager.endCameraRendering -= OnRendered;
    }
    private void OnRendered(ScriptableRenderContext context, Camera camera) {
        OnRendered(camera); // OnRenderObject() from https://docs.unity3d.com/ScriptReference/GL.html
    }
    And one more thing. I got it to work correctly, but I needed to reset all LWRP asset fields to their defaults. Why does unexpected behavior happen when some of these fields are changed: Quality's HDR and MSAA, Advanced's SRP Batcher?
    For example, it stops rendering (or renders incorrectly) in the Game view if HDR is enabled, if the SRP Batcher is disabled, or if MSAA is enabled.

    Oh, I just found out that the plain GL example from the documentation doesn't work at all with LWRP in the Game view. Changing LWRP asset settings doesn't change anything.
     
    Last edited: May 10, 2019
  22. monark

    monark

    Joined:
    May 2, 2008
    Posts:
    1,598
    Did you have any luck doing this?
    Looks like DrawRendererSettings isn't even in 2019.1
     
  23. konsic

    konsic

    Joined:
    Oct 19, 2015
    Posts:
    995
    When will 5.14 be released?
     
  24. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    When the VFX and supporting packages are the same version on staging?
     
  25. elbows

    elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,502
    hippocoder likes this.
  26. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Unity tended to (in the past) just release LWRP + SRP + VFX + HDRP of the same version, all at the same time.

    So I'm just assuming in that respect :)
     
  27. DaveL99

    DaveL99

    Joined:
    Jul 5, 2018
    Posts:
    22
    I am also currently very puzzled as to how this is intended to work.

    I have seen some issues raised regarding functionality that was available in the legacy built-in pipeline but is no longer available in HDRP/LWRP. Things like pre/post camera rendering hooks, command-buffer insertion, SetReplacementShader etc., and the general response is that with SRP you can just change it / write your own.

    However, I have yet to find any examples or discussion as to how to make reasonable, minor customisations to the existing HDRP/LWRP. (It's possible that maybe I just haven't been looking in the right places?).

    All I have seen is the Book of the Dead approach, of making a local copy of the entire package. I can't imagine this is ultimately the intended workflow for everyone though, as then you have essentially forked a major part of the engine - making tracking against bug-fixes/improvements/new-features in subsequent official releases significantly more difficult.

    Am I missing something here?
     
    nasos_333 and Korindian like this.
  28. dave_sf_42

    dave_sf_42

    Joined:
    May 28, 2019
    Posts:
    41
    One thing that would complement SRP is to make it possible to implement a version of SRC.DrawRenderers() which draws through a compute shader, by binding VB/IB data to a compute shader, invoking the shader, then calling CommandBuffer.DrawProcedural (or DrawProceduralIndirect) on an output of the compute shader.

    This is useful for using compute shaders to emulate geometry-shader-like behaviour on Metal (which lacks geometry shaders).

    The specific area this has come up in is Unity-SRP-VXGI, which uses both compute shaders and geometry shaders.

    It works efficiently on Windows, because DX11 supports geometry shaders and compute shaders.

    On Mac OpenGL, Unity supports geometry shaders but not compute shaders.

    On Mac Metal, Unity supports compute shaders, which are powerful enough to emulate geometry shaders. However, there appears to be no way to get Unity to efficiently send VB/IB data to a compute shader, or to draw through a compute shader.

    A few ways this could work:

    (a) create calls like Mesh.GetNativeIndexBuffer_AsComputeBuffer() and Mesh.GetNativeVertexBuffer_AsComputeBuffer(), to make it possible to bind existing VB/IB buffers to compute shaders. With this, is it possible to walk renderers manually to draw them? (i.e. write a full custom implementation of SRC.DrawRenderers()?) If so, this is sufficient. If not...

    (b) create a delegate version of SRC.DrawRenderers() which hands each drawing primitive to the delegate. This would allow the delegate to use the calls in (a) to bind VB/IB data to a compute shader, issue the compute shader, then call DrawProcedural or DrawProceduralIndirect on a compute shader output buffer.

    (c) alternatively, make a state mode for SRC.DrawRenderers() which does (a) and (b) in a more structured way.

    ?? or something else.. (a sketch of the "draw through a compute shader" half that already exists today is below)
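    For illustration, here is a sketch of the half of this that is already possible with existing API: dispatch a compute kernel that writes expanded geometry into a buffer, then draw it with CommandBuffer.DrawProceduralIndirect. The missing piece the post is asking for is the input side; the meshVertices buffer below is assumed to be a manually uploaded copy of the vertex data, not a native VB/IB binding, and the kernel/property names are made up.
    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Rendering;

    public static class ComputeExpandDraw
    {
        // Records "expand on compute, then draw the result" into a command buffer.
        public static void Record(CommandBuffer cmd, ComputeShader expandShader,
                                  ComputeBuffer meshVertices,   // manually uploaded copy of the VB
                                  ComputeBuffer expandedOutput, // kernel writes generated triangles here
                                  ComputeBuffer indirectArgs,   // IndirectArguments buffer: vertex count, instance count, ...
                                  Material drawMaterial, int vertexCount)
        {
            int kernel = expandShader.FindKernel("Expand"); // kernel name is an assumption
            cmd.SetComputeBufferParam(expandShader, kernel, "_InputVertices", meshVertices);
            cmd.SetComputeBufferParam(expandShader, kernel, "_Output", expandedOutput);
            cmd.DispatchCompute(expandShader, kernel, Mathf.CeilToInt(vertexCount / 64f), 1, 1);

            drawMaterial.SetBuffer("_Output", expandedOutput); // vertex shader reads the generated geometry
            cmd.DrawProceduralIndirect(Matrix4x4.identity, drawMaterial, 0,
                                       MeshTopology.Triangles, indirectArgs);
        }
    }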
     
    Last edited: May 31, 2019
    misabiko, ekakiya and syscrusher like this.
  29. nasos_333

    nasos_333

    Joined:
    Feb 13, 2013
    Posts:
    13,358
    That is the thing, I am not sure it is intended to work; more likely it was intended to impress with HDRP and feel more optimized with LWRP, for PR reasons.

    There can't really be a reason to have 3 incompatible pipelines, other than a rushed, unorganized release imo.

    Unity has become more complicated than Unreal and less unified, and such complexity was not its major advantage for sure.

    What is stranger is that we are offered a complex way of making our own rendering pipeline, which might be ok in general, yet we have to conform to two specific pipelines and everything becomes incompatible if we don't.
     
    Last edited: May 31, 2019
    interpol_kun likes this.
  30. Redhook_Galen

    Redhook_Galen

    Joined:
    Nov 21, 2018
    Posts:
    11
    I've been trying to insert a post-processing effect before the transparency rendering, but I've been running into issues.

    The first thing I tried was just setting the PostProcessEvent to BeforeTransparent for a PostProcessEffectRenderer. But the Render function doesn't even get called when set to that; it works fine for AfterStack or BeforeStack.

    So then I started trying to use a ScriptableRendererFeature to do the same insertion. I found this example https://gist.github.com/phi-lira/46c98fc67640cda47dcd27e9b3765b85 but the _MainTex global shader texture is set to the Scene view's color texture. I do get the right camera's depth texture, though.

    I tried manually setting the RenderTargetIdentifier to the right camera, but no luck so far.

    Unity 2019.1.1f1
    LWRP 5.13.0
     
  31. Redhook_Galen

    Redhook_Galen

    Joined:
    Nov 21, 2018
    Posts:
    11
    Okay wait, I got the color texture to kind of work with:
    Code (CSharp):
    cmd.SetGlobalTexture("_MainTex", m_ScriptableRenderer.cameraColorTarget);

    I want to set it to the render pass event Before Rendering Transparent, but if I do so the view just goes black. It works for After Rendering Transparent, but I don't want this effect on top of transparent particle effects.

    If I use the Frame Debugger it actually works fine.
     
  32. Peter77

    Peter77

    QA Jesus

    Joined:
    Jun 12, 2013
    Posts:
    6,618
    Considering how much time and effort Unity Technologies spent to implement LWRP, for example, I wonder how many studios that use Unity also have the luxury of a team of world-class rendering engineers working on a custom pipeline for such a time frame.
     
  33. syscrusher

    syscrusher

    Joined:
    Jul 4, 2015
    Posts:
    1,104
    This is the challenge I am facing as well, trying to make a published asset compatible with SRP. Andy Touch was kind enough to spend some time with me at Unite last year, but the APIs he suggested I could use have morphed since then as the HDRP has evolved. (I don't blame Unity, or Andy, for that, because HDRP is clearly labeled as a preview...it's just unfortunate that this change caught me.)

    The documentation gap you mentioned is hindering me as well. I am not a graphics programmer and don't have time to become one. I wrote a custom shader in the Standard Pipeline and am using rendering hooks to draw procedural geometry. I don't want or need to create a custom render pipeline, because that's not what my tool is for. I just need to make it work correctly for customers who are exploiting SRP. I give Unity the benefit of the doubt here because I know how these things go -- in the real world, documentation always seems to lag feature development, but Unity as a company generally does catch up eventually in their docs. So I'm willing to be patient, but will still "bump" the need for this as a "to-do" for the team. :)

    Another difficulty for asset developers whose tools aren't specifically focused on one pipeline is that there are enough differences between LWRP and HDRP that it's difficult to make even simple tools that can work in both pipelines. Over time, I would hope that Unity refactors as much of the SRP as possible to put more into a general SRP API and minimize how much is pipeline-specific.
     
  34. nasos_333

    nasos_333

    Joined:
    Feb 13, 2013
    Posts:
    13,358
    I just hope Unity realizes the vast mistake they have made with the new pipelines system and v2.0 of the store before everything gets destroyed. Right now, as a publisher, I don't see how it is even remotely possible to support this pipeline system, and I am already looking at the Unreal store and the Google store as alternatives should Unity deprecate their main current pipeline.

    My revenue is also now tiny compared to the effort and to previous years (mainly because of v2.0 of the store so far, is my guess), so if the required development time goes 10-fold due to required pipeline support, I can't see how it will be possible to keep the same level of upgrades and support for store users.
     
  35. elbows

    elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,502
    I will judge the pipelines as they relate to asset store items based on what happens in the first year after HD pipeline is out of preview. I have certainly seen plenty of complaints and concerns from asset store developers over recent years, and I can appreciate many of their concerns. For now I'm going to assume that the future will be a mixed bag - some concerns will be solved, some probably will not. Some asset store developers won't get on board, or will be stuck waiting ages for Unity to make key changes, others will be able to embrace things with enthusiasm from an earlier stage.

    At this stage, personally I need the pipelines more than I need assets from the store. Deprecated assets and interoperability issues between assets by different authors, sometimes due to limitations of the legacy render pipeline, were an old pain that I will be glad to swap for the brave new pains of the scriptable render pipelines. I look forward to adding some store assets to the pipeline mix one day, and I know that in the meantime it is not easy for authors of certain kinds of assets. I wish these complaints from asset store devs were at least met with more thorough public responses from Unity, but I am missing part of the picture because I am not an asset store dev myself, so I cannot see whatever forums exist for their eyes only.
     
    syscrusher likes this.
  36. syscrusher

    syscrusher

    Joined:
    Jul 4, 2015
    Posts:
    1,104
    To a certain extent, I think part of the problem is a disconnect between needs of two distinct audiences. The people who are early adopters of SRP in general, and especially of HDRP, seem to be graphics/rendering specialists who are driven by the need for SRP's capabilities. While some asset developers are also shader or rendering wizards, there are also those like myself who aren't. My asset is a tool that happens to require one custom shader to do its job; it's not a rendering tool. For this type of developer, the need is not to "exploit" SRP but simply to "support" it for customers who are using SRP in their projects.

    None of the preceding is meant to complain or criticize anyone, only to offer one possible explanation of why this has been a problematic transition for some asset devs.

    As a new development, I note from another thread (https://forum.unity.com/threads/big-collection-of-free-shader-graph-nodes.539208/#post-4479508) that there is now a Custom Function Node in Shader Graph (I'm not sure what version actually added it, but it's present in the latest 5.16 Preview). I'm excited about this because I think this will give me the majority of what I need.
     
  37. nasos_333

    nasos_333

    Joined:
    Feb 13, 2013
    Posts:
    13,358
    The problem is not the pipelines themselves; it is that Unity is now like a changing platform every few months, which creates the issue, along with the lack of support or documentation for this ever-changing landscape.

    It is like moving to a whole new engine every few weeks: first it was the Standard material, then making image effects for the Post Processing Stack, which no longer works in the 2019 version, then making them for the new pipelines, one of which is not even ready but everyone wants to use it (and that includes me), except it is just not ready for that because it misses key functionality.

    And all pipelines are incompatible with each other and with any other custom one!!! Which is crazy in the first place.

    So it is impossible to keep up like that. And I am sure game developers will have the same issues if they try to do anything above normal and have a project that needs time to make. Or if they simply want to port to different platforms, which is what Unity was famous for.

    Jumping to incompatibility every few weeks is just a terrible strategy.
     
    Last edited: Jun 3, 2019
  38. elbows

    elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,502
    That's why I'm waiting a good while until it's out of preview before judging: they warn that things change during preview that should not change in the same way, or at the same pace, once the preview period is over.

    I certainly haven't waited until it's out of preview to use it myself, but I'm also not harassing asset store devs to support HDRP at this time either.

    It is not like moving engines every few weeks. And I happen to like their strategy of completely separate pipelines; I don't want the compromises in either pipeline that cross-compatibility would bring. Once both pipelines are more mature I would like them to look at which bits could be unified more effectively, because there may be elegant solutions that were not obvious back when the pipelines were in their infancy.
     
  39. pastaluego

    pastaluego

    Joined:
    Mar 30, 2017
    Posts:
    196
    Is it possible to get the screen contents at something more specific than the before/after opaque pass, maybe by providing a specific render queue value? It's hard to do some screen effects in 2D when everything is on the transparent queue and I can't capture the screen at specific render times in an efficient way without using GrabPass.
     
  40. CGBull

    CGBull

    Joined:
    Feb 12, 2017
    Posts:
    82
    ekakiya, Adam-Bailey and Onigiri like this.
  41. elbows

    elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,502
    I should probably say that although I have a vaguely positive view about the longer-term asset store & render pipeline issues, I do not deny that things can be painful at the moment.

    For example, Unity's own flagship HDRP 'Unity Icon Collective Buried Memories' series of assets doesn't support 2019.x at the moment, and that includes the 2nd pack, which was only released yesterday. 2019.x support is to follow 'in the coming months'.

    Hopefully things are on track to stabilise (in terms of breaking changes) by the time HDRP is out of preview, so this sort of unfortunate situation won't be seen so much after this year.
     
  42. tibi_fake

    tibi_fake

    Joined:
    Jul 31, 2018
    Posts:
    7
    colin299, tspk91, ekakiya and 2 others like this.
  43. Peter77

    Peter77

    QA Jesus

    Joined:
    Jun 12, 2013
    Posts:
    6,618
    Good observation! I agree; rather than hard-coding to get specific pipelines working, it would be better to revisit and improve the ShaderGraph API.
     
    tspk91, JesOb and interpol_kun like this.
  44. ekakiya

    ekakiya

    Joined:
    Jul 25, 2011
    Posts:
    79
    I agree with that too! It makes VFX Graph difficult to use with your own SRP.
    VFX Graph itself is easy to use with your own SRP (thanks to the render pipe settings path), but VFX Graph and Shader Graph must be used together in a VFX artist's workflow.
     
  45. bitinn

    bitinn

    Joined:
    Aug 20, 2016
    Posts:
    961
    Having used Unity SRP for over a year at this point, I have some personal complaints.

    - Many bugfixes happen in the next major release (which makes you want to fork)
    - API locking and hard dependencies (modifying LWRP is a pain)
    - You don't know what's being worked on (roadmap?)

    And I don't know why no one ever bothers to update the OP for each pinned thread (Core, HDRP, LWRP); they are so outdated at this point that I'd consider a blank OP with a link to the package documentation better...

    original post: https://twitter.com/bitinn/status/1141202282945531906
     
    sand_lantern and andybak like this.
  46. G1NurX

    G1NurX

    Joined:
    Dec 25, 2012
    Posts:
    69
    We attempted to replace shaders based on distance.
    It's very inefficient to do shader replacement on material instances with respect to CPU and memory resources, since all the information is already available in the rendering pipeline (distance in Z).
    We hacked the built-in pipeline to do it, and it works well.
    SRP should provide more APIs to developers.
     
  47. G1NurX

    G1NurX

    Joined:
    Dec 25, 2012
    Posts:
    69
    In our game, we implemented a PVS occlusion culling system, but found it is hard to integrate nicely with the existing rendering pipeline, and the same goes for SRP.

    There are two options we can choose: enabling/disabling GameObjects, or changing the layers of GameObjects (both are sketched below).
    Neither solution is perfect.

    1) Enabling/disabling GameObjects has an obvious performance overhead, because the render object has to be added to / removed from the scene.

    2) The layer solution is also inefficient. Though the toggling overhead is low, layer filtering in the pipeline does not happen early enough; a lot of calculations still have to be done before objects are filtered out by layer.

    A solution with better custom occlusion culling integration is expected.
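    Just to make the two workarounds concrete, a trivial sketch of both (plain Unity API, nothing SRP-specific; treating layer 31 as a "hidden" layer excluded from every camera's culling mask is an assumption):
    Code (CSharp):
    using UnityEngine;

    public static class PvsVisibilityToggle
    {
        const int HiddenLayer = 31; // assumed to be excluded from all cameras' cullingMask

        // Option 1: activate/deactivate -- simple, but the render object gets added to /
        // removed from the scene's render lists, which is the overhead described above.
        public static void SetVisibleByActivation(GameObject go, bool visible)
        {
            go.SetActive(visible);
        }

        // Option 2: layer swap -- cheap toggle, but layer filtering happens late in the
        // pipeline, so per-object work is still done before the object is culled out.
        public static void SetVisibleByLayer(GameObject go, int visibleLayer, bool visible)
        {
            go.layer = visible ? visibleLayer : HiddenLayer;
        }
    }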
     
    JesOb and Peter77 like this.
  48. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    But this is actually something they are working on with DOTS rendering. It's more efficient with SRP batching to replace shaders based on distance via DOTS than it is to overwhelm the shader. I might be missing your use case though; apologies if so.
     
  49. ekakiya

    ekakiya

    Joined:
    Jul 25, 2011
    Posts:
    79
    Using a Scriptable Render Pipeline produces a continuous 40B GC.Alloc:
    32B initially, +8B per camera, and an additional 32B when setting a render texture as the camera's Target Texture.
    The GC.Alloc shows up as PostLateUpdate.FinishFrameRendering -> GC.Alloc in a development build, and as PlayerLoop -> GC.Alloc in the Editor.
    The built-in renderer doesn't produce this GC.Alloc.
    Tested on 2018.4.2f1, 2019.1.7f1 and 2019.3.0a with LWRP, my own SRP, and HDRP (without the camera RT part).
    Case number is 1165372.

    By the way, Render(Camera[]) is called per camera object, more than once per frame even without vSyncCount.
    LWRP's SortCameras and HDRP's async frame counting look like they treat this Render function inconsistently.
    Don't we need something like a BeginTimeFrameRendering as another hook point? (A minimal skeleton of where this entry point sits is below.)
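    For anyone following along, a minimal custom-SRP skeleton (2019.x-era API, written from memory; check the RenderPipeline base class in your core RP version, and note the companion RenderPipelineAsset is omitted) just to show where the per-frame Render(Camera[]) entry point sits, which is the call observed above to run more than once per frame:
    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Rendering;

    public class LoggingRenderPipeline : RenderPipeline
    {
        protected override void Render(ScriptableRenderContext context, Camera[] cameras)
        {
            // Observation point for the behaviour described above: this can be invoked
            // several times per frame, each time with a different camera array.
            Debug.Log($"Render() called with {cameras.Length} camera(s)");

            foreach (var camera in cameras)
            {
                context.SetupCameraProperties(camera);
                // ...culling, drawing, etc. would go here...
            }
            context.Submit();
        }
    }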