Feedback Wanted: Scriptable Render Pipelines

Discussion in 'Graphics Experimental Previews' started by Tim-C, May 9, 2017.

  1. SebLagarde

    SebLagarde

    Unity Technologies

    Joined:
    Dec 30, 2015
    Posts:
    437
  2. JakubSmaga

    JakubSmaga

    Joined:
    Aug 5, 2015
    Posts:
    416
    @SebLagarde Are there any plans on built-in support for virtual texturing?
     
  3. SebLagarde

    SebLagarde

    Unity Technologies

    Joined:
    Dec 30, 2015
    Posts:
    437
    Copy/paste: Everything is on the roadmap, no ETA; it's a matter of priority, human resources, etc... :)
     
    JakubSmaga likes this.
  4. Kiori

    Kiori

    Joined:
    Jun 25, 2014
    Posts:
    159
    Hey guys, have you tested the performance of the lightweight renderer vs. the current one (with forward rendering) on mobile?
    Like through a bunnymark or similar? Do you do benchmark testing at all?

    Thanks!
     
  5. JakubSmaga

    JakubSmaga

    Joined:
    Aug 5, 2015
    Posts:
    416
    Kiori likes this.
  6. JakubSmaga

    JakubSmaga

    Joined:
    Aug 5, 2015
    Posts:
    416
  7. SebLagarde

    SebLagarde

    Unity Technologies

    Joined:
    Dec 30, 2015
    Posts:
    437
    0.1.29 requires 2018.1b7, which is not out yet.
     
  8. Karearea

    Karearea

    Joined:
    Sep 3, 2012
    Posts:
    313
    Patience is a virtue... Working nicely with b7, thanks.
     
  9. Llockham-Industries

    Llockham-Industries

    Joined:
    Aug 4, 2013
    Posts:
    262
    Edit - Just realized B6 is out with a compatible release, will try in the new version and report back.
    Edit 2 - Bugs occur as above in B6, including the crash.
    Edit 3 - Copying the depth directly into the camera target (as done in beta release 6) resolves the issue. :) Crash should probably still be fixed though.

    Just some general bug reporting - Not sure if this is the best place. If there is a better place that will reach you guys immediately let me know. This is using Unity 2018.1.0b2 and the respective tag/version in your GitHub repo.

    I'm having trouble with scene view wireframe, selection outline and grid behavior.
    I'm setting the filled depth buffer (24 bit, depth format, created explicitly as non-temporary renderTexture) at the end of the render loop, similar to your HD pipeline.

    If there is no depth buffer set, and I then try to select something with the selection outline enabled the engine crashes.

    (Direct3D11 / Vulkan) If the depth buffer is set, the grid appears instead in the top panel (tabs, play, pause, next frame buttons etc.), until I select something, then the grid and wireframe outline appear in the correct positions.

    (Direct3D12) If the depth buffer is set, the grid and wireframe outline appears very far below the expected position. The selection outline appears in the correct position.

    Pic :
    SceneViewDebug.png

    I realize, given the nature of SRP, that there are many places I could have gone wrong, so this could very easily be a fault on my end, though the crash at least should be addressed.
     
    Last edited: Feb 15, 2018
  10. Kumo-Kairo

    Kumo-Kairo

    Joined:
    Sep 2, 2013
    Posts:
    331
    @Tim-C do we have any news regarding custom code injection in LWRP? I see that there are hardcoded PostProcessing passes "After Opaque" and "After Transparent", but they point to the built-in PostProcessing stack.
    Do you have any info on that?

    Right now I'm starting to think about IL code injection into the LWRP dll.
     
  11. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    25,448
    I don't understand - why not just copy and modify the LWRP source code with your own shaders?
     
  12. Kumo-Kairo

    Kumo-Kairo

    Joined:
    Sep 2, 2013
    Posts:
    331
    I can do it for my own projects; it's a wonderful solution. I'm absolutely a fan of this new system and will use it extensively in future projects.

    But not quite so for Asset Store packages. I can't really provide a post-processing "injection" instead of the standard one, as it's hardcoded right now. So users who use the built-in LWRP version (for obvious reasons) can't really benefit from alternative solutions.
    Also think of the endless Asset Store packages like UI blur, fast glass refractions, custom fast shadows (based on projection instead of shadow mapping, for example) - it's not really clear how to make them work with LWRP.

    I have posted in this thread regarding this a few pages ago, and it was very nice of Tim to reply to all of my questions. There was an ongoing internal talk between Unity Tech guys regarding hook points, so I have decided to check how things are going.
    This is that answer I was referring to https://forum.unity.com/threads/fee...e-render-pipelines.470095/page-5#post-3369158
    Cheers
     
  13. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    25,448
    That's the thing though, once you start pulling SRP to general purpose you lose the benefits of it and may as well use builtin. SRP comes with shader graph, which is going to be what most people (our little dev shop included) will ever need. Asset store sales will suffer in the near term, yes. And in my view, an acceptable tradeoff because I like performance.

    Perhaps the key here is educating people how to modify SRP correctly, and start selling pipelines, with instructions on how to integrate into other pipelines.

    Isn't that middleware 101 historically?
     
  14. Kumo-Kairo

    Kumo-Kairo

    Joined:
    Sep 2, 2013
    Posts:
    331
    Yes, this is exactly what I am looking into right now.

    But my question was aimed at specific "inner talk" about hook points. So I'm not suggesting anything, I'm just interested
     
    hippocoder likes this.
  15. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    7,123
    What kind of inner hooks would you be looking for? Per object? I assume you'd want something similar to what the grab pass is capable of, right? Like a command buffer you could attach to a renderer rather than a camera or light?

    That would be interesting.
     
  16. Nexusmaster

    Nexusmaster

    Joined:
    Jun 13, 2015
    Posts:
    53
    Hi, really great to see SRP finally working on my PC, thanks for that! It might be asking too much, but maybe somebody can push me in the right direction: I'm trying to get DrawProceduralIndirect working with the HD pipeline. So far I can render an object with DrawProceduralIndirect, but it is not shaded, just unlit! I put my code in the RenderOpaqueRenderList method in HDRenderPipeline.cs. Any advice on how to proceed? Do I need to add it in an extra lighting function?

    Code (CSharp):
    // ... this is in an external start function
    vertsBuffer = new ComputeBuffer(testMeshObj.vertexCount, sizeof(float) * 3);
    vertsBuffer.SetData(testMeshObj.vertices);

    normalsBuffer = new ComputeBuffer(testMeshObj.normals.Length, sizeof(float) * 3);
    normalsBuffer.SetData(testMeshObj.normals);

    trisBuffer = new ComputeBuffer(testMeshObj.triangles.Length, sizeof(int));
    trisBuffer.SetData(testMeshObj.triangles);

    trisArgs = new ComputeBuffer(4, sizeof(int), ComputeBufferType.IndirectArguments);
    trisArgs.SetData(new int[] { testMeshObj.triangles.Length, 1, 0, 0 });
    Code (CSharp):
    // ... this happens at the end of RenderOpaqueRenderList
    bt.mpb.SetBuffer("vertsBuffer", bt.vertsBuffer);
    bt.mpb.SetBuffer("normalsBuffer", bt.normalsBuffer);
    bt.mpb.SetBuffer("trisBuffer", bt.trisBuffer);
    cmd.DrawProceduralIndirect(bt.defaultMatrix, bt.proMatFarGlobal, bt.pass, MeshTopology.Triangles, bt.trisArgs, 0, bt.mpb);
    Code (HLSL):
    // in VertMesh.hlsl
    StructuredBuffer<float3> vertsBuffer;
    StructuredBuffer<float3> normalsBuffer;
    StructuredBuffer<int> trisBuffer;

    VaryingsMeshType VertMesh(AttributesMesh input)
    {
        // ...
        uint index = trisBuffer[input.id];
        positionWS = TransformObjectToWorld(vertsBuffer[index]);
        normalWS = TransformObjectToWorldNormal(normalsBuffer[index]);
    #ifdef ATTRIBUTES_NEED_TANGENT
        tangentWS = float4(cross(float3(0, 1, 0), normalWS), tangentWS.w);
    #endif
     
    Last edited: Feb 19, 2018
  17. Tim-C

    Tim-C

    Unity Technologies

    Joined:
    Feb 6, 2010
    Posts:
    2,093
  18. Tim-C

    Tim-C

    Unity Technologies

    Joined:
    Feb 6, 2010
    Posts:
    2,093
    The hard part here is that it's really nasty whichever way you go about it. Each pipeline is different, so it has very different data structures and pipeline execution (there was even divergence in the old Unity forward vs. deferred command buffer hook points that caused issues).

    There could be some simple hook points added that apply to the pipelines we have:
    * void AfterOpaque(ScriptableRenderContext ctx, RenderTargetIdentifier color, RenderTargetIdentifier depth)
    * void BeforeTransparent(ScriptableRenderContext ctx, RenderTargetIdentifier color, RenderTargetIdentifier depth)
    * void AfterTransparent(ScriptableRenderContext ctx, RenderTargetIdentifier color, RenderTargetIdentifier depth)
    * void AfterEverything(ScriptableRenderContext ctx, RenderTargetIdentifier color, RenderTargetIdentifier depth)

    Then we need to talk about how they should be exposed. Generic hook points that you can add callbacks to on assembly reload? Or would you want to inherit from the class and override them (less flexible)?
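    To make the two exposure options concrete, here is a rough sketch. Everything below is hypothetical, mirroring the signatures listed above; none of it is a shipped API, and the pipeline class name is illustrative.

```csharp
// Option A: generic hook points exposed as static events that
// extensions attach callbacks to, e.g. on assembly reload.
// (Hypothetical - the event name mirrors the proposal above.)
public static event System.Action<ScriptableRenderContext,
    RenderTargetIdentifier, RenderTargetIdentifier> afterOpaque;

// Option B: inherit from the pipeline class and override a virtual.
// Less flexible: only one override per pipeline type, no stacking.
public class MyPipeline : LightweightPipeline // hypothetical base
{
    protected override void AfterOpaque(ScriptableRenderContext ctx,
        RenderTargetIdentifier color, RenderTargetIdentifier depth)
    {
        // issue extra draws against color/depth here
    }
}
```

    Option A lets multiple assets coexist on one pipeline; Option B keeps the call order explicit but forces users onto a single subclass.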
     
  19. Kumo-Kairo

    Kumo-Kairo

    Joined:
    Sep 2, 2013
    Posts:
    331
    @Tim-C
    I have two points on this:
    It's either something like the current camera command buffers (something like you have described), assembly reload hooks, or inheritance/composition/ordered interface implementation -
    Or just scrap the whole idea of hook points altogether, so there's absolutely no way of dealing with it using the built-in SRP pipelines, just like @hippocoder said.

    The first option will enable Asset Store publishers to create more custom content for the built-in pipelines.
    The second option will probably reduce internal complexity and the fear of breaking Asset Store stuff (which is certainly a good thing).
     
  20. f1ac

    f1ac

    Joined:
    May 23, 2016
    Posts:
    64
    Noticed another case when upper body IK should be automatically disabled - when character moves in the direction more than 90 degrees relative to camera view ray. Try to hold down 'S' key in 3rd person view with IK enabled - character will run towards the camera alternating between looking left and right which looks weird.
     
  21. Llockham-Industries

    Llockham-Industries

    Joined:
    Aug 4, 2013
    Posts:
    262
    @Kumo-Kairo - agreed, hook points seem very limited. If you're doing anything meaningful you will need a bunch of data from within the pipeline; having just the render context and render targets is very limited. But then providing culling data, lighting data, shadow data, etc. will be difficult to make work, and it would change with every pipeline.

    Instead, each pipeline should be uniquely modular, and it would be up to the author of the pipeline to add hook points for their users, as well as document how to use them (i.e. how to interpret the set-up lighting/shadow data in a shader, etc.). You could do this for each of your authored pipelines; it would set a strong example for other developers, but enforcing it on all pipelines seems like a surefire way to destroy the flexibility of what you've built.

    I would suggest instead documenting the hell out of this thing and encouraging Asset Store creators to do the same for their pipelines. It would also be nice to have optional code in packages that is only used when another prerequisite package is included. This would allow asset creators to build code specifically for each of the pipelines they plan to support. Though that would then require some sort of versioning and automatic code updating built into the packages, so you don't end up with a mess of incompatible packages of different versions.

    Don't be discouraged though, it's well worth it. What you've built, even in its current, somewhat messy form, is incredibly awesome. Speaking of -

    Trying to get CoreUtils.DrawFullScreen() to work. I'm using the corresponding shader function:

    GetFullScreenTriangleVertexPosition(vertexID)

    to generate the clip-space positions and texcoords, like you guys do in your deferred lighting pixel shader (HD pipeline). This only serves to confuse me more. The function itself draws a single procedural triangle without any indices, though the shader method seems like it should take a quad (and it works perfectly when blitting, which uses a quad), and I'm not quite sure how a mesh can be drawn without indices..?

    I can't blit, as I'm using unbound multisampled inputs and targets and blitting forces them to resolve (possibly a bug? possibly intended behaviour?). I could just draw a quad... but I want that 8% performance gain so I can waste it on something else completely trivial...

    Any ideas?
    Edit : Nevermind - got it :D
     
    Last edited: Feb 22, 2018
    Seneral likes this.
  22. Cybexx

    Cybexx

    Joined:
    Dec 4, 2008
    Posts:
    23
    The Scriptable Render Pipeline doesn't seem to work in Unity 2018.1.0b8. If I import either the Lightweight or HD pipeline into a blank project using the Package Manager, I see this:

    Code (CSharp):
    C:/ProgramData/Unity/cache/packages/packages.unity.com/com.unity.render-pipelines.core@0.1.28/CoreRP/Shadow/ShadowUtilities.cs(250,51): error CS0227: Unsafe code requires the `unsafe' command line option to be specified
    And any custom shaders referencing pipeline files cannot open those files. I also cannot create new pipeline asset files, and existing asset files are blank and unassignable. Everything seems to work in beta 7.
     
  23. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    25,448
    Those having problems with unsafe errors in beta 8 should create a text file named smcs.rsp (ensure it is that and not smcs.rsp.txt) containing:

    Code (CSharp):
    -unsafe
    and place it in the root Assets folder, then reimport.
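    If you prefer doing this from a terminal, a minimal sketch (assuming the current directory is the Unity project root, so that Assets/ is directly below it):

```shell
# Create Assets/smcs.rsp containing just the -unsafe compiler flag.
mkdir -p Assets
printf -- '-unsafe\n' > Assets/smcs.rsp
cat Assets/smcs.rsp   # prints: -unsafe
```

    Then switch back to the editor and let it reimport.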
     
    Llockham-Industries likes this.
  24. Cybexx

    Cybexx

    Joined:
    Dec 4, 2008
    Posts:
    23
    Thanks, that seems to fix everything.
     
  25. Cybexx

    Cybexx

    Joined:
    Dec 4, 2008
    Posts:
    23
    Alternatively you can check the "Allow 'unsafe' Code" checkbox under /PlayerSettings/[platform]/Other Settings. Which appears to do the same thing.

    Edit: Wait nope that doesn't actually work I just got a false positive when I tried it.
     
    Last edited: Feb 23, 2018
  26. Karearea

    Karearea

    Joined:
    Sep 3, 2012
    Posts:
    313
    I found that beta 8, the latest HD SRP 0.1.32 from the package manager, checking Allow ‘Unsafe’ and deleting the library folder before reopening the project all worked.
     
    Llockham-Industries likes this.
  27. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    25,448
    Really not necessary.
     
  28. Tim-C

    Tim-C

    Unity Technologies

    Joined:
    Feb 6, 2010
    Posts:
    2,093
    Hi. In Beta 8 a scripting change went in requiring all asmdefs to declare whether unsafe code is allowed. We didn't get notified in time and this broke SRP :(

    If you update SRP via the Package Manager to the version tagged 1.0.0-beta you should be good to go. As a note: please remove Shader Graph before doing this (via the Package Manager), as we have changed dependency ordering and if you don't remove it you may run into issues.
     
  29. f1ac

    f1ac

    Joined:
    May 23, 2016
    Posts:
    64
    I was playing with SSS and noticed two things:
    1. There is no sign of backscattering, at least I couldn't find it. I guess it is not implemented? Notice the hand transmits red light, but the reflection is pure white:
    sss.PNG sss_back.PNG

    2. SSS blurs or desaturates textures quite a bit. Is it supposed to work this way?
     
  30. GameDevCouple_I

    GameDevCouple_I

    Joined:
    Oct 5, 2013
    Posts:
    1,847
    Looks almost like it's doing a bit of bloom on that area :S
     
  31. SebLagarde

    SebLagarde

    Unity Technologies

    Joined:
    Dec 30, 2015
    Posts:
    437
    Sorry, can't tell without more details (settings, light intensity, etc.).

    Regarding vocabulary, this is SSS:

    upload_2018-2-27_20-19-30.png

    And this is Transmission (ears):

    upload_2018-2-27_20-19-51.png

    SSS and Transmission share the same "Scattering" parameter on the diffusion profile. The diffusion profile also allows you to control the specular intensity (0.028 for skin, or 1.4 IOR). On a material with subsurface scattering enabled you have an option to enable transmission or not, and there is a material type named "translucent" that does transmission only.

    Note that to get the most from the SSS effect you need a crazy amount of detail in the normal map (your hand doesn't seem to have any normal map).

    >SSS blurs or desaturates textures quite a bit. Is it supposed to work this way?
    SSS is blurring, so yes, it blurs :). You can use Texturing mode: Post-scatter on the diffusion profile to get back more detail.
    Post-scatter mode targets scanned data (as SSS is already present in the scan).
    Pre- and post-scatter applies 50% before the blur and 50% after the blur and is for CG assets (i.e. not scans).
    More info about texturing modes can be found here: https://developer.nvidia.com/gpugems/GPUGems3/gpugems3_ch14.html
     
    f1ac likes this.
  32. f1ac

    f1ac

    Joined:
    May 23, 2016
    Posts:
    64
    @SebLagarde ,
    >Sorry can't tell without more details (Settings, light intensity etc..)
    Surface type: Opaque
    Material Type: SSS
    Enable transmission: true
    Base color, normal, and mask maps present (except the detail channel). The normals don't have much detail, though.
    Diffusion profile: skin
    Thickness: 0.75
    Other settings at defaults.
    White spot light, 60lm, about 10cm from the hand.


    I have figured it out, increased light intensity to 900lm, and thickness to 1 to decrease transmission.

    Some red noisy backscattering finally visible:
    sss_back900lm.PNG

    Transmitted light is more intense:
    sss900lm.PNG


    Made a quick real life comparison with a 900lm flashlight:

    red backscattering is more intense and wide, not sure how to replicate this.
    IMG_20180227_215812_HDR.jpg

    transmission is less intense. Shader probably needs Thickness max limit to be increased to 2 or 3 to properly model lower transmission in real life.
    IMG_20180227_215841_HDR.jpg
     
  33. SebLagarde

    SebLagarde

    Unity Technologies

    Joined:
    Dec 30, 2015
    Posts:
    437
    >Made a quick real life comparison with a 900lm flashlight:
    Remark:
    Only point lights match real-world light bulbs in HD; for artistic reasons we decided to match the spot light to a point light, so a spot light is really just a point light with occlusion.
    So regarding your comparison with the real world, I am not sure that 900lm will match what you have (I expect that 900lm is over the whole sphere, and the reflector in the flashlight then focuses the light into a given solid angle), so maybe try a higher intensity.

    >Some red noisy backscattering finally visible:
    I suppose this is due to undersampling of the SSS effect. I am not sure what you call "backscattering" here; there is no backscattering (there is a bit in the Disney diffuse model that we use for very rough surfaces, but that is not what we see here).

    We have added a mode (not recommended for games) on the HD asset named "very high SSS quality"; you can try that to reduce the noise (but you can't ship a game with it).

    >Shader probably needs Thickness max limit to be increased to 2 or 3 to properly model lower transmission in real life.

    Thickness is in real-world units (mm, iirc); you can control it from the diffusion profile with the remap widget.
    So a 0..1 thickness will be remapped to the min mm / max mm of the remap widget.

    Important: All SSS and Transmission are based on real-world metrics. You need a mesh at the correct scale (Unity is 1 == 1m), with a correct value for thickness (a skin blur kernel should be around 8mm iirc; this is the value displayed in the greyed-out scattering field).

    Last point: have you enabled a tonemap?
     
  34. f1ac

    f1ac

    Joined:
    May 23, 2016
    Posts:
    64
    @SebLagarde ACES tonemapping + eye adaptation at defaults, 1:1 model scale. I was trying to say that for the default skin profile, transmission is exaggerated relative to SSS in the direction opposite to the light (the undersampled red ring, or backscattering - this term is often used in scientific articles). Increasing the light intensity to simulate a flashlight with a reflector doesn't change this.

    Regarding the decision to match the spot light to a point light - it would probably be easy enough to add a dropdown with options like "full sphere intensity" (the way it is now) and "spot intensity" to avoid confusion and to assist users when they are trying to match intensity to a real-world spot light. LED manufacturers probably specify intensity for a specific solid angle, not a full sphere, and then it gets reduced by the reflector, so calculations get messy. With that dropdown, users could just select "spot intensity" and specify the value from the light's datasheet. It could then be recalculated to the value used currently ("full sphere intensity") before it reaches the shader. Just a suggestion :)
     
  35. petersx

    petersx

    Joined:
    Mar 5, 2015
    Posts:
    215
    Where can we find it?
     
  36. RubenCUR

    RubenCUR

    Joined:
    Mar 29, 2017
    Posts:
    2
    Hi, when trying to build the Nightmares scene in the LightweightAndShaderGraph project for Windows x86, I'm getting the following error:

    C:/ProgramData/Unity/cache/packages/staging-packages.unity.com/com.unity.render-pipelines.lightweight@0.1.21/Data/LightweightPipelineAsset.cs(2,19): error CS0234: The type or namespace name `ProjectWindowCallback' does not exist in the namespace `UnityEditor'. Are you missing an assembly reference?

    Does anyone know what reference needs to be set?
     
  37. Tim-C

    Tim-C

    Unity Technologies

    Joined:
    Feb 6, 2010
    Posts:
    2,093
    SRP + Upgrading + Extension Developers

    SRP is a big step away from how things have traditionally been done with graphics in Unity. Instead of thinking of Unity as an engine with one renderer, it is now a platform that (custom) renderers can be plugged into.

    This opens Unity up in a number of ways:
    • Game-specific stylized rendering
    • Experimental rendering algorithms
    • Optimized rendering for a specific game
    When undertaking this new approach we made some core, low-level decisions on how we want the new system to work, and a number of these decisions diverge from existing Unity. I want to take some time to explain these differences and why we made the decision to make these changes.

    The Start of a Journey
    SRP has existed in an experimental state for the last year, and in that time we have received a lot of feedback from advanced users on the API and similar. From this we have managed to get a long way towards having a core SRP that we are happy with. That being said, as we have been bringing our recommended pipelines public we have been receiving some more detailed feedback on how you would like to interact with them, specifically:
    • Callbacks from SRP into extension features
    • A shader writing system that is not Shader Graph

    Right now we are investing a large amount of effort into polishing off V1 for 2018.1, but as we move forward towards 18.2 and 18.3 we are going to be taking a closer look at the pain points that you have with the system. SRP isn't something that we are finished with; it's our new core rendering architecture, and we want it to be amazing.

    When we talk about moving to the SRP world of rendering it is important to think of it as a new rendering system. There are similarities (we still use existing data structures, for example), but it is a new system. Similar to porting a feature to the ECS / Job system, you are going to need to port features to SRP if you want them to be compatible with SRP. Legacy Unity rendering will still exist for the foreseeable future and it's okay if you want to wait to port your tools and assets. What we really want from you is help understanding your needs and how we can help you while still keeping the core design philosophy behind SRP.

    Fixing Bad Decisions
    The Unity renderer has grown somewhat organically over the past 10 years, and along the way a number of questionable decisions have snuck in. These decisions have led to the inability to do big optimisation work in the rendering code (as we would have to break user projects), as well as holding back some aspects of future work. When designing SRP we decided to rethink the callback structure at the low level, going for the approach of issuing fewer callbacks so that we can offer a cleaner, more optimised experience.

    Removing Camera Rendering Callbacks
    As pre-SRP Unity stands, there are a number of callbacks issued to sibling scripts of a Camera component; specifically:
    • OnPreCull
    • OnPreRender
    • OnPostRender
    • OnRenderImage
    As these callbacks stand they offer ways to inject extra rendering code into Unity. The issue that arises when attempting to port them to the SRP world is multifaceted:
    • Existing plugins that use these are built with a very, very deep implicit contract with Unity
      • Assuming Camera.main is set
      • The current render target setup is 'known' pre-call
      • You are using the legacy post processing (C++, little script control)
    • Invoked by high-level Unity
      • Callbacks (generally) have no arguments
        • What camera is being used, etc.
      • Designed as a 'primitive' injection method
        • We have SRP where you can do much more
    • They were invoked mid camera render. This was generally bad, as we would have some rendering state configured and user code would then mutate this state (by calling a nested render, or just smashing some framebuffer state). This is where a large number of the bugs in our backlog come from. The tough part is that fixing any bug here normally results in regressions against implicit, undocumented behaviour.

    • Always invoked, even if you don't want or need them
      • Overhead for no reason
    When we started working on SRP we looked into these callbacks and decided that making a breaking change here is advantageous to everyone in the long term, even if there is some short-term pain. When it comes to SRP we have added two new callbacks to the RenderPipeline class:
    • public static event Action<Camera[]> beginFrameRendering;
      Called when SRP rendering begins. Do things like building frame-dependent geometry or similar from this callback. Called by default in the HD / LW pipes. If you write a custom pipe you need to issue the call yourself.

    • public static event Action<Camera> beginCameraRendering;
      Called when camera rendering begins (per camera). Do things like camera-dependent effects (planar reflection) or similar from this callback. Called by default in the HD / LW pipes. If you write a custom pipe you need to issue the call yourself.

    A big advantage these callbacks have over the legacy rendering callbacks is that they are not injected in the middle of rendering anything! This means that state is generally safe and preserved and you can do 'whatever you want' from within them!
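    A sketch of subscribing to these from a MonoBehaviour follows. It assumes the 2018.1-beta layout, where the RenderPipeline class (and these static events) live in UnityEngine.Experimental.Rendering; the exact namespace may change as SRP leaves preview.

```csharp
using UnityEngine;
using UnityEngine.Experimental.Rendering; // assumed 2018.1 beta namespace

public class PerCameraSetup : MonoBehaviour
{
    void OnEnable()
    {
        RenderPipeline.beginFrameRendering  += OnBeginFrame;
        RenderPipeline.beginCameraRendering += OnBeginCamera;
    }

    void OnDisable()
    {
        // Always unsubscribe, or stale delegates survive domain reloads.
        RenderPipeline.beginFrameRendering  -= OnBeginFrame;
        RenderPipeline.beginCameraRendering -= OnBeginCamera;
    }

    void OnBeginFrame(Camera[] cameras)
    {
        // Once per frame: build frame-dependent geometry here.
    }

    void OnBeginCamera(Camera camera)
    {
        // Once per camera, before any of its render state is configured:
        // e.g. update a planar reflection texture for this camera.
    }
}
```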

    Cleaning Up Object Callbacks + Best Practices
    Object rendering callbacks in Unity are a two-sided coin. They offer a large amount of flexibility, but they can cause some really weird side effects if not used properly. Many of the issues users experience when customising rendering in Unity happen when using these callbacks. The real culprit is that they are issued in the middle of rendering. What this means is that OnWillRenderObject will be called during a camera render; when a camera is rendering there is a lot of state (some local, like the camera state, and some global, like the graphics device configuration), and during OnWillRenderObject this state can be mutated, leading to issues in subsequent rendering steps. Further issues arise when you look deeply into how these callbacks should work and the ramifications for the rest of the rendering data model. These callbacks are issued against objects that have passed culling; that means they are visible on screen and we have built an optimised data block representing how they can be drawn. At this stage, if a callback enables or disables a renderer, what should be the expected behaviour?
    • We allow the modification to affect the current list of renderable objects; this means the render tasks cannot be started until all user code has finished executing, and it makes the internal code much more complex due to the introduction of corner cases.
    • We disallow modification of the 'current' state, and changes take effect next frame.
    As it stands right now (including in SRP) we take option one, as it offers the most flexibility. That being said, there are dragons in this code, and there are projects that depend on undefined behaviour. Because of this we have decided to leave many of these callbacks in... But I will make some recommendations.

    • OnWillRenderObject - Called
      Still called in SRP (it is used by some internal systems, like animation). This callback runs just before an object is rendered, once it is known to be visible. I highly recommend AGAINST using this. It introduces a sync point on the main thread in Unity. One common thing to do is to call a Camera render here or similar; please, please don't do this - it's really fragile.

    • OnRenderObject - Not Called
      This was called at a (somewhat) well-defined part of the rendering pipeline and allowed users to inject custom draw calls. In SRP there isn't a 'well defined' point anymore, as all pipelines are different. This functionality was also somewhat superseded by camera command buffers in the old render pipeline. Instead, per-pipeline callbacks are being considered (talked about below).

    • OnBecameVisible - Called
      Still called

    • OnBecameInvisible - Called
      Still called
    A best-practices solution to working with callbacks (on a per-object basis) is to build a system that combines beginFrameRendering and OnBecameVisible. Anything that requires updates while visible should register itself in OnBecameVisible, and then each frame the beginFrameRendering call should update everything registered. This also means that you will only do the work once per frame instead of once per camera render.
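    That pattern could be sketched like this. It is a hypothetical helper, not an official API: the registry, the PerFrameUpdate method, and the use of [RuntimeInitializeOnLoadMethod] to hook the event once are all illustrative choices, and the RenderPipeline event namespace is assumed from the 2018.1 betas.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Experimental.Rendering; // assumed 2018.1 beta namespace

public class VisibleOnlyUpdater : MonoBehaviour
{
    // All instances currently visible to at least one camera.
    static readonly HashSet<VisibleOnlyUpdater> s_Visible =
        new HashSet<VisibleOnlyUpdater>();

    [RuntimeInitializeOnLoadMethod]
    static void Hook()
    {
        // Runs once per frame, not once per camera render.
        RenderPipeline.beginFrameRendering += cameras =>
        {
            foreach (var v in s_Visible)
                v.PerFrameUpdate();
        };
    }

    void OnBecameVisible()   { s_Visible.Add(this); }
    void OnBecameInvisible() { s_Visible.Remove(this); }

    void PerFrameUpdate()
    {
        // Expensive work that only matters while something is on screen.
    }
}
```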

    Pipeline Specific Callbacks
    As it currently stands pipelines issue no callbacks to external systems (i.e there are no hook points). This is something we are investigating but there are a number of difficulties with this approach.

    As it stands now the pipelines have a number of passes and are quite tight and self contained what this means is that assumptions can be made between passes and optimizations applied. Every time there is a callback or hook point it means that these assumptions may not be true and it can lead to compromises needing to be made in the design of the system.

    For the Lightweight pipeline we have done some experimentation with hook points (code can be found here); from our testing it seems that this will work, but it has other downsides. Refactoring the passes (reordering them, for example) becomes much harder, as there are now external dependencies on the hook points. I'm not saying that we won't be adding these kinds of things as we move forward, but we are trying to find a better way. Right now our approach is to offer a minimal set of things, then grow the offering as needs become apparent. We don't want to dig another hole for ourselves like we have with the current rendering architecture.

    Material Upgrading
    This is more pipeline-specific. In 18.1 we have two new rendering pipelines, Lightweight and HD; they have different audiences and I won't talk much about that here. What I want to talk about is upgrading materials from legacy Unity to these pipelines.

    Note from the start that there are some big differences. In HD the material model and default texture packing are completely different, and the lighting model is drastically altered. This means there is no such thing as a 1:1 upgrade from an existing Unity project to an SRP; some content reauthoring will always be needed.

    For upgrading, out of the box we provide a number of scripts to go from legacy Unity to both HD and LW. These scripts work because they know which textures and parameters belong to the shaders, so they can remap from the old materials to the new. If you have custom shaders, the upgrade scripts don't know how things should map to the new materials. If you really want a custom upgrader for your material, you can clone the SRP into your project and write one; it is possible to extend both the LW and HD upgraders.
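    As a rough illustration, a custom upgrader written against the `MaterialUpgrader` base class found in the SRP repository might look like the sketch below. The shader and property names are made up for the example, and the helper method names (`RenameShader`, `RenameTexture`, `RenameFloat`) should be checked against the version of the repo you have cloned.

    ```csharp
    // Hypothetical upgrader mapping a custom legacy shader onto the LW standard shader.
    // Assumes the MaterialUpgrader base class from the SRP repository.
    public class MyCustomShaderUpgrader : MaterialUpgrader
    {
        public MyCustomShaderUpgrader()
        {
            // Point the material at the new pipeline's shader...
            RenameShader("Custom/MyOldShader", "LightweightPipeline/Standard (Physically Based)");
            // ...and remap old property names to their new equivalents.
            RenameTexture("_MyAlbedo", "_MainTex");
            RenameFloat("_MyGloss", "_Glossiness");
        }
    }
    ```

    This is exactly what the built-in upgrade scripts do for the legacy Standard shader; a custom upgrader just supplies the mapping knowledge the scripts cannot infer.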

    Shader Upgrading
    This is a much more difficult topic. As it stands, shaders in Unity have a number of passes, and these passes have expected outputs; for example, a GBUFFER pass is expected to write to multiple render targets in a specific way, with a specific data encoding. Upgrading shaders from one pipeline to another is therefore highly non-trivial. If you have custom shaders that you wish to port to the new pipelines, manual steps must be taken: the shaders must be written to support the specific rendering pipeline. Unity has a concept of subshaders; when rendering an object, Unity selects a valid subshader to use based on a number of rules. In SRP we have added an extra subshader selection tag: "RenderPipeline". This name maps to the supported render pipeline, and it is possible to add multiple subshaders to a shader, each with a different tag. An example of using this tag can be found here.
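    In ShaderLab terms, a multi-pipeline shader could be structured roughly like this. The tag values shown ("LightweightPipeline", "HDRenderPipeline") are the ones used in the SRP repo at the time of writing; verify them against the shaders in the version you are targeting.

    ```shaderlab
    Shader "Custom/MultiPipelineExample"
    {
        SubShader
        {
            // Selected when the Lightweight pipeline is active.
            Tags { "RenderPipeline" = "LightweightPipeline" }
            // ... passes written against the LW pipeline's pass contract ...
        }
        SubShader
        {
            // Selected when the HD pipeline is active.
            Tags { "RenderPipeline" = "HDRenderPipeline" }
            // ... passes written against the HD pipeline's pass contract ...
        }
        SubShader
        {
            // No RenderPipeline tag: fallback for the legacy built-in pipeline.
            // ... legacy passes ...
        }
    }
    ```

    Unity walks the subshaders in order and picks the first one whose tags match the active pipeline, so the untagged legacy subshader should come last.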

    Shader Future
    The biggest issue with shaders as they currently stand is that any change to the expected inputs/outputs of passes breaks the rendering contract, and thus potentially the rendering in a project. In 18.1 we are moving towards a world where updating something in our shader library does not break projects. The way we are doing this for 18.1 is by leveraging a shader graph system. The graph is an extensible system that allows you to write custom nodes / master nodes; essentially it turns the generated shader into an artefact of the graph. This level of indirection means that if we change a core API or similar, the node or template can be updated, and on the next project load the shader will be regenerated. This helps substantially with updating the contracts we have in the rendering pipeline without breaking user projects. For the start of the 2018 release cycle we are concentrating our efforts here, as it fills a very big hole in the Unity product that artists and content creators have wanted for years.

    “What about surface shaders?” I hear you ask. This is something we are discussing internally and may start exploring during 2018; we can't make promises here. We have done a few minor internal prototypes, but we need to decide whether we want to maintain a surface shader system in parallel with the shader graph.
     
    Last edited: Mar 1, 2018
  38. Tim-C

    Tim-C

    Unity Technologies

    Joined:
    Feb 6, 2010
    Posts:
    2,093
    Seems like you are on an old version. If you go into the package manager window you can update to the latest :)
     
    RubenCUR likes this.
  39. AcidArrow

    AcidArrow

    Joined:
    May 20, 2010
    Posts:
    5,815
    In real life our hands also have bones, which reduce the transmission a lot, so it's not exactly a fair comparison.
     
  40. RubenCUR

    RubenCUR

    Joined:
    Mar 29, 2017
    Posts:
    2
    Thanks! I'm going to update the package now.
     
  41. SoxwareInteractive

    SoxwareInteractive

    Joined:
    Jan 31, 2015
    Posts:
    366
    @Tim-C: Thank you very much for this detailed post.

    My asset currently uses the GL class to draw some lines after rendering finished (in OnPostRender() as suggested by the Manual). I see that there are events called before rendering starts which is already great, but is there a way (without manipulating the default SRPs) to use GL calls to draw something after rendering has finished? Is the GL class still supported in the future or is it going to be deprecated?
     
  42. glasshandstudios

    glasshandstudios

    Joined:
    Mar 15, 2017
    Posts:
    6
    Hey guys! So far I'm having a blast with the HD SRP on my current project. One thing I noticed is that using post-processing with VR capture reorients the cubemaps. Turn it off, and the cubemaps and equirectangular images are perfect. I updated to the latest HD pipeline and Post Processing v2.0.1-beta. Thank you!
     
  43. konsic

    konsic

    Joined:
    Oct 19, 2015
    Posts:
    666
    Will 2018.1 have different lighting and rendering system ?
     
  44. petersx

    petersx

    Joined:
    Mar 5, 2015
    Posts:
    215
    If anyone is wondering where to find it: you need SRP HD version 0.1.33 installed together with Postprocessing 0.2.0, and then the checkbox appears in the HDRenderPipelineAsset inspector.
    Below is an example of a manifest.json file that works with the beta 9 version:

    {
      "registry": "https://staging-packages.unity.com",
      "dependencies": {
        "com.unity.render-pipelines.high-definition": "0.1.33",
        "com.unity.package-manager-ui": "1.8.1",
        "com.unity.postprocessing": "0.2.0"
      }
    }
     
  45. GameDevCouple_I

    GameDevCouple_I

    Joined:
    Oct 5, 2013
    Posts:
    1,847
    From looking at the shaders on GitHub, it looks like the actual coding of shaders isn't changing too much (there are some syntax changes, but nothing major like all commands or types changing).

    I am super happy to see this, as until now I thought all the practice I've put into writing shaders would count for nothing! It actually looks more in line with standard HLSL, which is a good thing IMO!

    EDIT: such as this: https://github.com/Unity-Technologi...line/LWRP/Shaders/LightweightCopyDepth.shader
     
  46. rubentorresexozet

    rubentorresexozet

    Joined:
    Dec 8, 2016
    Posts:
    2
    1) What is the criteria used for splitting some draw commands in ScriptableRenderContext and some other in CommandBuffer?

    2) Is there any way I can copy a texture into RAM and not VRAM?

    3) A bit off-topic, but I am using RenderDoc to better visualize the results. Is there any way of filtering the draw commands to Camera.Render? I am wasting lots of time searching for that node in every capture.

    Thanks for the great work.
     
  47. Benjamin_Overgaard

    Benjamin_Overgaard

    Joined:
    Jul 20, 2015
    Posts:
    13
    I'm having some problems understanding how to use the pipeline assets.

    Using the HD Render Pipeline, where can I write a custom lighting model? Earlier, I could just write a custom deferred script and drag it into the custom deferred shading slot in graphics settings.

    Now, when I add a custom deferred shader in the HD Render Pipeline Resources asset, I don't see any changes - not even when I leave the field empty. I only see changes when I leave the Compute Shader fields empty. What am I missing? :)
     
  48. petersx

    petersx

    Joined:
    Mar 5, 2015
    Posts:
    215
    Hi,

    Is there a chance to fix the Lit shader? In beta 9 with SRP HD 0.1.33, when we insert a texture into the Detail Map and set Detail AlbedoScale to 0, we get a disco effect.
    In the previous version (0.1.27), a Detail albedo scale of 0 was neutral to the material.




    BTW, the new SSS is awesome ! If this continues, Unity will be the best realtime SSS renderer.

     
  49. tatoforever

    tatoforever

    Joined:
    Apr 16, 2009
    Posts:
    3,863
    @Tim-C
    Thanks for making the future of SRP a lot clearer. And yes, please don't think about compatibility; think of usability, performance, flexibility and extensibility first, the rest comes after. At least we all know where the future of rendering in Unity is heading, and that is good for those of us investing our next projects in Unity; it helps us make better decisions.
    Lastly, don't neglect other important parts of Unity, the new terr*in (ehem * cough *)... :oops: Keep up the transparency and don't leave 4-year-old promise threads in limbo, it is not healthy for either party. :)
    I think if it weren't for SRP and all this modularity and extensibility of the engine, I would have long since looked elsewhere.
    Keep up the nice work, I love how SRP is shaping up.
    Regards,
     
    Last edited: Mar 7, 2018
    GameDevCouple_I and Gokcan like this.
  50. Gokcan

    Gokcan

    Joined:
    Aug 15, 2013
    Posts:
    200
    I have been using Unity for almost 7 years and follow Unity developers on the forums every day. @Tim-C has somehow taken part in the development of all these features, among them: UI, Image Effects, Shader Graph, SRP, and maybe many more we don't see in the background. I really want to thank you for all these great features you have given us :) And now a new challenge: why not a new Terrain system? :)