Feedback Wanted: Scriptable Render Pipelines

Discussion in 'Graphics Experimental Previews' started by Tim-C, May 9, 2017.

  1. DreamingImLatios

    DreamingImLatios

    Joined:
    Jun 3, 2017
    Posts:
    436
    Hi.

I'm a student at a university who has been watching this project from the shadows for a while, and this summer on weekends I think I would like to try building a couple of SRPs for the experience. I have two projects in mind that would greatly benefit from custom pipelines. One of them is a game in which different scenes use different visual styles with different effects, weather, etc. The other is an FPS somewhat similar to Splatoon with a "paint the world" effect (I got a prototype working in 5.6 using particles, texture arrays, and a fake lightmap, though the mechanics are different and I would eventually want to add a full fluid simulation). So if you don't mind, I have quite a few questions on things I have run into so far.

First, what exactly is the purpose of using a factory system and producing pipeline instances? I couldn't really figure out how that is useful, and some of the Unity pipelines don't seem to bother with it (BasicRenderPipeline just uses static functions, and the mobile deferred pipeline just passes rendering back to the asset). While I really like the idea of splitting run-time data from serialized data, Unity automatically creates and deletes pipeline instances whenever something is changed in the asset from the editor, which means any run-time data registered into the instance from script during Start() would get broken if an artist decided to modify the shadow settings during play mode. And I'm not sold on the idea of generating a new pipeline instance every frame to handle dynamic events. What am I missing here?

Second is just a nitpick, but why is GetCullingParameters in CullResults rather than a method called GetParametersFromCamera in CullingParameters?

Third, what exactly are your plans for managing lightmaps? Will we have control over when Enlighten performs a meta pass after requesting a renderer update? Will we be able to store our own custom texture transform matrices in renderer components that work for multiple lightmaps and other kinds of world-space maps? For example, when I built the FPS prototype, I had to create a baked black directional light and specify a small lightmap resolution to get the entire baked lightmap into a single lightmap atlas, so that I could use the baked lightmap UVs to index my texture array. But in the future I would love to be able to just automatically have Unity pack the baked lightmap UVs to fill a single atlas, and then after painting the particles to the texture array, update global illumination on either the CPU or GPU (not sure which will be easier/more performant) using either Enlighten or my own system, and then draw the scene. Are there any plans to make something like this feasible?

    Fourth, regarding discussion of callbacks within the render pipeline, I imagine it would be something like this:
    1. I build up a list of CullResults for things that need callbacks. (C#)
    2. I call a function in Unity API that takes a list of CullResults and spits out a 2D list of Renderers. (C++)
    3. I use this 2D list to send out appropriate messages as I build up my rendering instructions. (C#) Would I use SendMessage for this?
    4. I submit my context. (C++)

    Would this be an efficient approach compared to how Unity is currently doing things?

    Fifth, will we be able to customize CameraEvents to attach command buffers to?

Sixth, will we be able to define our own render queue enumerations instead of simply using "Opaque", "Transparent", etc.? More specifically, will there be an easy way to specify them from shaders and such?

    Seventh, how would the following use case be possible in SRPs (assuming it is possible)?
I have a particular type of enemy whose body is emitting fire. However, I want full control over the style of the fire, so I write a compute shader that takes in the deformed mesh (hopefully from GPU skinning) and outputs mesh data for the fire for that frame. I have over 100 of these fire enemies in my scene, but only about 10 of them will be on screen at once, and I only need to update the fire when the enemy is on screen. I want to render the enemy mesh during the opaque pass. Then in the transparent pass I want to run the compute shader on only the visible enemies and then indirectly draw the fire. I then want to use a similar technique for smoke enemies, water enemies, etc. The only way I can imagine doing this would be to use a custom callback that sends out a reference to a command buffer to fill to all the objects after culling objects on a specific render queue (hence why I asked about custom render queue enumerations).
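To make the idea concrete, here is a sketch of the per-object hook described above. Everything here is invented for illustration (there is no such callback in the current API): a hypothetical FireEffect component that the pipeline would invoke only for visible instances, filling a command buffer with a compute dispatch and an indirect draw.

Code (CSharp):
```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Hypothetical per-object hook, invoked by a custom SRP after culling
// objects in a custom "fire" queue range. The class, method, and buffer
// names are all made up; only the CommandBuffer calls are real API.
public class FireEffect : MonoBehaviour
{
    public ComputeShader fireCompute;   // builds this frame's fire mesh
    public Material fireMaterial;       // used for the indirect draw
    ComputeBuffer m_FireVertices;       // output of the compute pass
    ComputeBuffer m_DrawArgs;           // [vertexCount, instanceCount, 0, 0]

    // Called by the pipeline only for visible instances, so off-screen
    // enemies never pay for the simulation.
    public void FillCommandBuffer(CommandBuffer cmd)
    {
        int kernel = fireCompute.FindKernel("CSMain");
        cmd.SetComputeBufferParam(fireCompute, kernel, "_FireVertices", m_FireVertices);
        cmd.DispatchCompute(fireCompute, kernel, 64, 1, 1);
        cmd.DrawProceduralIndirect(transform.localToWorldMatrix,
            fireMaterial, 0, MeshTopology.Triangles, m_DrawArgs);
    }
}
```

The smoke and water variants would follow the same shape with different compute kernels and materials.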

Eighth, I'm noticing in some of the examples the use of an AdditionalLightData script. I have a sinking feeling this is going to lead to a lot of artist frustration, as an artist could easily forget to add this script when creating a light. It could also lead to confusion when changing the light component's parameters has no effect, because the actual data lies within the AdditionalLightData script. For example, maybe I want the light intensity to be calculated from a lightbulb type and wattage so that I can simulate a sketchy electrical system. Would it be possible to get a minimal version of a light (and probes, and maybe even cameras) that we could inherit from, which has the normal MonoBehaviour messages (or at least the important ones)? Maybe this minimal class would only contain info for culling (like a bounding box and such, hidden from the editor)? And then provide some way to print a warning when a user adds the regular Unity light? (This might already be possible. I'm not very good at editor scripting.) Are there alternative solutions far superior to this idea?

    Ninth, can we get callbacks for when a pipeline asset gets assigned in the editor to that particular pipeline (as well as a callback for when a pipeline gets removed)?

    Tenth, which shader variables does SetupCameraProperties actually set up?

    And finally, is it a good idea for me to be trying to build my own SRPs this early? Am I asking too many questions?

I really like where Scriptable Render Pipelines are going. Aside from the things above, everything is really intuitive. It is easy to cull what you want to cull. I have full customization over shadows, light falloffs, and styles, the ability to do crazy interactions between multiple lights and cameras, the ability to apply filtering to the skybox by drawing it first and then running compute shaders whose results I can use for other shading effects, and all sorts of stuff.

And things for the most part just work. The easiest evidence that anyone can try is to Debug.Log the order the cameras get passed in the array. You'll find it is sorted by the cameras' depth values, just as one would expect!
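For example, a quick sketch inside a custom pipeline's render entry point (assuming the 2017-era experimental signature, which has shifted between preview releases):

Code (CSharp):
```csharp
public override void Render(ScriptableRenderContext context, Camera[] cameras)
{
    // The cameras array arrives sorted by Camera.depth, lowest first.
    foreach (var camera in cameras)
        Debug.Log(camera.name + " (depth " + camera.depth + ")");
    // ... culling, drawing, context.Submit() ...
}
```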
     
  2. Tim-C

    Tim-C

    Unity Technologies

    Joined:
    Feb 6, 2010
    Posts:
    2,093
    Lots of questions :)

There is a separation here between runtime data and configuration data. The Asset is used to configure settings for the pipeline, and the runtime version is an instance of the created pipeline. The idea is that the runtime version can cache things like RenderTextures, Shaders, Materials and similar. This is important because it's possible to have more than one pipe of the same type active at once. For example, a number of our tools instantiate an instance of the current pipe with 'debug settings' for rendering (material window, look dev, scene view). If these shared the same instance as the game view, then each time a render happened all the render textures would need to be resized for the view. Giving each context its own instance prevents this. You can still have 'runtime' settings embedded in the instance, and changing these won't recreate the instance; only settings on the asset do that. It's a heavy operation and should never be done every frame.
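A minimal sketch of that asset/instance split, against the 2017-era experimental API (the exact base class names changed between preview releases, so treat these as illustrative, not canonical):

Code (CSharp):
```csharp
using UnityEngine;
using UnityEngine.Experimental.Rendering;

// Serialized configuration: lives in the project, edited by artists.
public class MyPipelineAsset : RenderPipelineAsset
{
    public int shadowResolution = 1024;

    // Called whenever Unity needs a fresh runtime instance, e.g. after
    // an asset setting changes. Heavy, but never per-frame.
    protected override IRenderPipeline InternalCreatePipeline()
    {
        return new MyPipelineInstance(this);
    }
}

// Runtime instance: caches resources derived from the asset settings.
public class MyPipelineInstance : RenderPipeline
{
    readonly RenderTexture m_ShadowMap;

    public MyPipelineInstance(MyPipelineAsset asset)
    {
        m_ShadowMap = new RenderTexture(
            asset.shadowResolution, asset.shadowResolution, 24);
    }

    public override void Render(ScriptableRenderContext context, Camera[] cameras)
    {
        base.Render(context, cameras);
        // culling, drawing, context.Submit() etc. go here
    }
}
```

Each editor context (game view, scene view, look dev) gets its own MyPipelineInstance, so their cached render textures never fight over sizes.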

    We are already changing this :)

Our plan is for Unity to expose a number of lightmapping modes, and the pipeline you write should support a subset of these. The pipeline advertises which modes it supports, and the lighting UI changes to only show the supported modes. We are not currently going to support custom lightmapping / scripting the lightmapper.

Currently we are going to keep this list opaque and have callbacks like "SendVisibilityChangedCallbacks(cullResults)"; this is for performance reasons. I would really, really not want to use SendMessage for this.

    CameraEvents don't exist in vanilla SRP as you have access to everything when you write one. You could add support for camera events into your SRP but it will mean a lot more work.

The queue is just a number, and rather arbitrary. You don't need to use the given names, and if you want you can set up your own names / numbers. In the DrawRendererSettings you can specify the queue ranges to draw.
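For example, filtering a custom queue value when drawing might look like this. The experimental types moved around between preview releases, so treat the signatures as a sketch of the idea rather than exact API, and "OpaqueMoonlit" is an invented convention:

Code (CSharp):
```csharp
// Illustrative only: assumes the 2017-era experimental draw/filter types.
const int kOpaqueMoonlit = 2115;  // our invented queue value

var drawSettings = new DrawRendererSettings(camera, new ShaderPassName("MyPass"));
var filterSettings = new FilterRenderersSettings(true)
{
    // Only draw renderers whose material queue falls in our custom range.
    renderQueueRange = new RenderQueueRange { min = kOpaqueMoonlit, max = kOpaqueMoonlit }
};
context.DrawRenderers(cullResults.visibleRenderers, ref drawSettings, filterSettings);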

    I need to think about this one a little bit.

We have some changes coming that allow you to write a custom Light / Camera inspector for your SRP. In this you can show additional light settings in the normal light inspector and hide any normal settings that don't make sense for your SRP.

What's the use case?

[quote]
    Tenth, which shader variables does SetupCameraProperties actually set up?
    [/quote]
We will eventually be removing this in favour of more fine-grained control; it calls deep into the Unity engine and sets up a bunch of stuff in many places. What I'm basically saying is that we don't have an easily visible list atm. But it looks like:

Code (csharp):

void Camera::SetCameraShaderProps(ShaderPassContext& passContext, const CameraRenderingParams& params)
{
    float overrideTime = -1.0f;
#   if UNITY_EDITOR
    if (m_State.m_AnimateMaterials)
        overrideTime = m_State.m_AnimateMaterialsTime;
#   endif // if UNITY_EDITOR
    ShaderLab::UpdateGlobalShaderProperties(overrideTime);

    GfxDevice& device = GetGfxDevice();
    BuiltinShaderParamValues& shaderParams = device.GetBuiltinParamValues();

    shaderParams.SetVectorParam(kShaderVecWorldSpaceCameraPos, Vector4f(params.worldPosition, 0.0f));

    Matrix4x4f worldToCamera;
    Matrix4x4f cameraToWorld;
    CalculateMatrixShaderProps(params.matView, worldToCamera, cameraToWorld);
    shaderParams.SetMatrixParam(kShaderMatWorldToCamera, worldToCamera);
    shaderParams.SetMatrixParam(kShaderMatCameraToWorld, cameraToWorld);

    // Get the matrix to use for cubemap reflections.
    // It's camera to world matrix; rotation only, and mirrored on Y.
    worldToCamera.SetPosition(Vector3f::zero);  // clear translation
    Matrix4x4f invertY;
    invertY.SetScale(Vector3f(1, -1, 1));
    Matrix4x4f reflMat;
    MultiplyMatrices4x4(&worldToCamera, &invertY, &reflMat);
    passContext.properties.SetMatrix(kSLPropReflection, reflMat);

    // Camera clipping planes
    SetClippingPlaneShaderProps();

    const float projNear = GetProjectionNear();
    const float projFar = GetProjectionFar();
    const float invNear = (projNear == 0.0f) ? 1.0f : 1.0f / projNear;
    const float invFar = (projFar == 0.0f) ? 1.0f : 1.0f / projFar;
    shaderParams.SetVectorParam(kShaderVecProjectionParams, Vector4f(device.GetInvertProjectionMatrix() ? -1.0f : 1.0f, projNear, projFar, invFar));

    Rectf view = GetScreenViewportRect();
    shaderParams.SetVectorParam(kShaderVecScreenParams, Vector4f(view.width, view.height, 1.0f + 1.0f / view.width, 1.0f + 1.0f / view.height));

    // But as depth component textures on OpenGL always return in 0..1 range (as in D3D), we have to use
    // the same constants for both D3D and OpenGL here.
    double zc0, zc1;
    // OpenGL would be this:
    // zc0 = (1.0 - projFar / projNear) / 2.0;
    // zc1 = (1.0 + projFar / projNear) / 2.0;
    // D3D is this:
    zc0 = 1.0 - projFar * invNear;
    zc1 = projFar * invNear;

    Vector4f v = Vector4f(zc0, zc1, zc0 * invFar, zc1 * invFar);
    if (GetGraphicsCaps().usesReverseZ)
    {
        v.y += v.x;
        v.x = -v.x;
        v.w += v.z;
        v.z = -v.z;
    }
    shaderParams.SetVectorParam(kShaderVecZBufferParams, v);

    // Ortho params
    Vector4f orthoParams;
    const bool isPerspective = params.matProj.IsPerspective();
    orthoParams.x = m_State.m_OrthographicSize * m_State.m_Aspect;
    orthoParams.y = m_State.m_OrthographicSize;
    orthoParams.z = 0.0f;
    orthoParams.w = isPerspective ? 0.0f : 1.0f;
    shaderParams.SetVectorParam(kShaderVecOrthoParams, orthoParams);

    // Camera projection matrices
    Matrix4x4f invProjMatrix;
    InvertMatrix4x4_Full(params.matProj.GetPtr(), invProjMatrix.GetPtr());
    shaderParams.SetMatrixParam(kShaderMatCameraProjection, params.matProj);
    shaderParams.SetMatrixParam(kShaderMatCameraInvProjection, invProjMatrix);

#if GFX_SUPPORTS_SINGLE_PASS_STEREO
    // Set stereo matrices to make shaders with UNITY_SINGLE_PASS_STEREO enabled work in mono
    // View and projection are handled by the device
    device.SetStereoMatrix(kMonoOrStereoscopicEyeMono, kShaderMatCameraInvProjection, invProjMatrix);
    device.SetStereoMatrix(kMonoOrStereoscopicEyeMono, kShaderMatWorldToCamera, worldToCamera);
    device.SetStereoMatrix(kMonoOrStereoscopicEyeMono, kShaderMatCameraToWorld, cameraToWorld);
#endif
}

void setup()
{
    if (m_State.m_UsingHDR)
        passContext.keywords.Enable(keywords::kHDROn);
    else
        passContext.keywords.Disable(keywords::kHDROn);

    GraphicsHelper::SetWorldViewAndProjection(device, NULL, &params.matView, &params.matProj);
    SetCameraShaderProps(passContext, params);
}
[quote]
    And finally, is it a good idea for me to be trying to build my own SRPs this early? Am I asking too many questions?
    [/quote]
     
  3. Noisecrime

    Noisecrime

    Joined:
    Apr 7, 2010
    Posts:
    1,512
Was hoping to play around with this, but I'm stuck at the first hurdle and don't seem able to download a fully working version of SRL.

I'm not familiar with GitHub, so I just use the GitHub Desktop software to deal with it. Unfortunately this means that when cloning it will download the latest version (master). I then tried reverting back to the 156fb11 commit, but it told me there were merge conflicts that had to be resolved first and, yeah, no idea what to do about that. I'm not even sure if the conflict refers to the current local version or to reverting to the 156fb11 version.

I then tried downloading the 156fb11 zip and the PostProcessing Stack V2. Unfortunately something is messed up, and the PPS just spews out errors until I disable the component. In some scenes (e.g. LDRenderPipelineVikingVillage) I have a missing prefab, but no idea what it was or if it's important. So even when I have a 'working version', it's unclear whether this really is working or whether things are broken/missing intentionally.

Can anyone provide a clear series of command lines for doing the GitHub stuff - assuming that the result is a working version of SRL for a specific Unity version?


So with regard to pushing updates and experimental builds of SRL, I would say:
    • The GitHub instructions aren't very useful for someone unaccustomed to GitHub.
    • The fact that the master version is generally on a build not available to alpha testers, let alone beta testers, is painful.
    My suggestion then is that periodically Unity deploys a working version of the project for a specific Unity beta/alpha version that is available to the public. Though I dislike having multiple beta/alpha installs, I'd rather that and be able to get in and play with SRL immediately than the current situation, where I have nothing or spend most of my time trying to get a working version.


As for SRLs themselves, it's too early for any real comments, but I am somewhat confused by the pipeline asset apparently needing to be assigned by hand to the Graphics ScriptableRenderLoopSettings property. This would appear to be an awful decision, not least because if I hadn't happened to read a GUI popup in the GDC2017 demo, I would never have known to swap pipeline assets and would have been very stuck/confused as to why it wasn't working, or why I just had a black screen.

    Are there any plans to change this?
Why can't it be changed via script at runtime? Or if it can, shouldn't these SRL test demos be doing so, instead of the tester trying to work out which pipeline asset goes with which demo?
     
  4. phil_lira

    phil_lira

    Unity Technologies

    Joined:
    Dec 17, 2014
    Posts:
    383
    Hi Noisecrime,

    From the commandline issuing the following should be enough:

Code (CSharp):
    git clone https://github.com/Unity-Technologies/ScriptableRenderLoop
    cd ScriptableRenderLoop
    git checkout unity-2017.1b5   # or whatever is the latest tag
    git submodule update --init --recursive
From what you're reporting, the problems you're having with missing prefabs and errors in the PostProcessing are due to not having the submodules checked out. The above command lines will do everything for you. Also, it's important to match the tag with the correct Unity version. You can run the git tag command to see all available tags.

I'll take a look at upgrading the GitHub instructions page to be more informative.

    Thanks for the feedback. There's currently a plan to deploy SRP in a more elegant way. @Tim-C knows more about it.

The SRP can be assigned both through the inspector interface and by script (in the GitHub project the test scenes have a script that changes the pipeline per scene). I guess most of the confusion comes from the fact that it's not explicit which scenes in the project should work with each pipe, and the fact that, as of now, if no pass is valid the SRP won't render anything.

For the scenes, we can improve this by making the pipeline configuration more explicit/automatic. As for the pipeline rendering nothing when unmatched, IIRC there's a plan to fall back to an error shader, similar to what happens when no pass is suitable in legacy.

    Best,
    Felipe
     
    Last edited: Jun 7, 2017
    Andreas_Lokko and Noisecrime like this.
  5. DreamingImLatios

    DreamingImLatios

    Joined:
    Jun 3, 2017
    Posts:
    436
Oh wow! I totally was not expecting such a detailed response! This answered a lot of my questions. Embedding custom data right into the inspectors of lights and cameras sounds amazing! That actually makes things like implementing custom camera events safer too.

    So how could I write code like this in my shader's subshader block:
    Tags { "Queue"="OpaqueMoonlit" }

    Where somewhere else in my C# code I define OpaqueMoonlit to be equal to 2115?

    In addition, how would I change the dropdown options in the "Render Queue" setting in the material inspector?
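In the meantime, one workable pattern with today's API (not a promised feature) is to lean on the fact that the queue is numeric: the shader declares an offset from a built-in queue name, and C# keeps the named constant. Everything named here is an invented convention:

Code (CSharp):
```csharp
// C# side: "OpaqueMoonlit" is just a constant we define ourselves.
public static class MyQueues
{
    // 2000 is the built-in "Geometry" queue, so 2115 = Geometry + 115.
    public const int OpaqueMoonlit = 2115;
}

// A material can also be pushed into the queue from script:
// material.renderQueue = MyQueues.OpaqueMoonlit;
```

On the ShaderLab side the equivalent tag is Tags { "Queue" = "Geometry+115" }, since the tag parser only understands the built-in names plus a numeric offset; it won't resolve a custom name like "OpaqueMoonlit".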


Right now, per-object configuration is still a bit lacking, both for the example I gave previously and for light and probe sorting. If I were on low-end mobile (which I usually never am) and wanted to sort lights by intensity, by distance, or both (taking into account a custom falloff equation), I would not be able to. I feel like either a new variant of command buffers is needed, or command buffers need to be attachable to materials (with some way to access per-instance data), or both, or some other more intuitive solution.

I'm kind of curious whether, and if so how, surface shaders could be incorporated into SRP. They were nice for the black-box render pipelines, and they might be nice for when people start sharing their custom pipelines with each other, but probably not the highest priority.

    Anyways, I think I'm going to hold off on building my render pipeline until either some new information on how to do things or a new iteration of the API arrives. In the meantime I think I'm going to work on building the non-SRP aspects of some games that could really take advantage of SRPs.

    Thanks again for all the information! It's exciting stuff!
     
  6. MaT227

    MaT227

    Joined:
    Jul 3, 2012
    Posts:
    614
Hi @phil_lira @Tim-C, I have been following this awesome feature since the beginning and I must say that it is an incredible and fantastic initiative. :D
    I would like to know if there is any ETA (Unity version, weeks, months?) for this, and whether it can be used in a production environment. I am mainly talking about the LD pipe.

    Thanks a lot !
     
  7. scvnathan

    scvnathan

    Joined:
    Apr 29, 2016
    Posts:
    74
    Tim-C responded earlier in this thread about LD pipe's stability:
So it looks like in 2017.1 (July) it will be usable for some, and in 2017.2 (November?) usable for most.
     
  8. MaT227

    MaT227

    Joined:
    Jul 3, 2012
    Posts:
    614
    Thanks for the answer @scvnathan. Seems that I missed that post.
     
  9. MaT227

    MaT227

    Joined:
    Jul 3, 2012
    Posts:
    614
    I am also wondering if we can add properties to lights in SRL.
     
  10. MaT227

    MaT227

    Joined:
    Jul 3, 2012
    Posts:
    614
  11. WilkerLucio

    WilkerLucio

    Joined:
    May 2, 2017
    Posts:
    18
So, for anyone who needs a video explaining the stuff, I've found this:

Hope it helps with understanding the loops better.
     
    OCASM and Lex4art like this.
  12. Quatum1000

    Quatum1000

    Joined:
    Oct 5, 2014
    Posts:
    724
1) Would it be possible to access the reflection/specular pipeline as well, to e.g. enhance the current smoothness-based main reflection, which is based on a handful of blended pre-rendered cubemaps?

2) Currently, to build a clear-coat metallic material, I have to hack the internal built-in reflection shader and exhaust some BRDF channels.

The whole discussion needs creative and eloquent people who have the ability to introduce the possibilities of the new system. Programmers are not able to do this.

    If a game designer or shader artist doesn't know what the HDRP can offer, this discussion about HDRP seems to be academic entertainment for programmers.

    Why hire an expensive programmer into my company if I do not know what the new system can do for me, or what definite benefit I get from it?
     
  13. Tim-C

    Tim-C

    Unity Technologies

    Joined:
    Feb 6, 2010
    Posts:
    2,093
The best thing to do here is similar to what the HDRenderPipe does. They have an additional light data component they attach to their lights. They then use this data for HD-specific light settings that can be configured in the inspector and then passed through to the shader :)
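A minimal sketch of that pattern (the class and field names here are invented; HDRenderPipe's actual component differs):

Code (CSharp):
```csharp
using UnityEngine;

// Extra, pipeline-specific settings that ride along with a built-in Light.
// RequireComponent means this component can't exist without a Light; the
// pipeline can fall back to defaults when the component is missing, which
// softens the "artist forgot to add the script" problem raised earlier.
[RequireComponent(typeof(Light))]
public class MyPipelineLightData : MonoBehaviour
{
    [Range(0f, 1f)]
    public float volumetricStrength = 0.5f;  // invented example setting
    public bool affectsSpecular = true;
}
```

During light setup the pipeline then calls light.GetComponent<MyPipelineLightData>() and substitutes defaults when it returns null.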
     
  14. MaT227

    MaT227

    Joined:
    Jul 3, 2012
    Posts:
    614
  15. Reanimate_L

    Reanimate_L

    Joined:
    Oct 10, 2009
    Posts:
    2,372
    @Tim-C any new release for the SRP?
     
    Alverik likes this.
  16. Tim-C

    Tim-C

    Unity Technologies

    Joined:
    Feb 6, 2010
    Posts:
    2,093
There is some new API landing in 2017.1 that we are keeping a little sneaky till we push out a reference of it in use.

We now have support for RenderPasses. These allow you to stay on tile and use framebuffer fetch in a nice way on tiled GPU architectures. Should be a general win.
     
  17. elbows

    elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,293
Recently I've noticed @hippocoder asking a number of Asset Store developers who make rendering-related add-ons for Unity whether they are going to support SRPs. This caused two questions to loom large in my mind.

1. Isn't it a bit early for this? Especially for assets that will want to target the HDPipeline for desktop etc., given that it was still described as being at an early stage of development in May.

2. A lot of the conversation, presentations and documentation to date tends to focus on the idea of people taking one of the C# pipelines Unity is making and sharing, and modifying it for their own needs. Is there much of anything around that focuses on how Asset Store devs can make their stuff work with one of Unity's pipelines that their customers may be using? Those users will likely want to use more than one render-related asset store item with the pipeline, making the scenario where each asset comes with its own complete, customised pipeline a bit messy (to say the least), if I've understood SRPs right?

Anyway, these questions aren't supposed to be a criticism of hippocoder; it's just that once I had thought of them, I really wanted to know the answers! And it's quite possible that question 2 simply stems from ignorance on my part, and me not having the time or complete technical vocabulary to deduce the answer(s) from the existing conversation & docs.
     
    hippocoder likes this.
  18. elbows

    elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,293
I also ask it in the spirit of wanting to be as helpful to asset store devs as possible; it would be nice if some general advice for them was put out now, and a good guide with them in mind released at the appropriate time.

I'm not an asset store developer myself (yet), but I noted that some of them got a bit burnt out by the number of rendering changes made to Unity at various points in the 5.x cycle. So whilst some of them are no doubt interested in what they could achieve with this new degree of rendering control as this pipeline stuff matures, I also sense trepidation and groaning from some, and I want to ease their pain! Which at this stage mostly just means getting the right advice in place in terms of the when, and a little something about the how. Even if it's just a few sentences here in this thread that I can point to when the subject arises. Cheers.
     
    Last edited: Jun 22, 2017
  19. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    25,474
    I'm not actually asking on behalf of Unity. Blue badges don't work for Unity, we are just proven to be beneficial to the community and Unity alike, and aren't paid or anything. Call us helpful but passionate people that mingle between.

    But in my experience you generally want to spread awareness of something long before the something lands, otherwise you, and everyone else will be sat around waiting. Unity isn't going to wait. It'll release what it has as soon as it can do so, unlike the old days when features needed to be held back for a new paid release.

    In this case, SRP is a really big deal and totally the future of Unity so we want people really working toward that ASAP not later. It will take asset authors a while to even learn where it fits in with their assets or if it's even useful. As for V2 post stack, that's much more relevant for people doing post FX that needs to play ball or be a little more optimised.

    So I spend time informing people of Unity's basic tech direction early as possible, so we don't have to wait ages for some form of improvement, and that these asset authors can also feed back any early problems to Unity staff.

    It's just one of the general things I do, to be helpful rather than any agenda.
     
    Alverik likes this.
  20. elbows

    elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,293
Oh don't get me wrong, I never thought you were doing it on behalf of Unity. I've done similar myself in the past, most obviously when it comes to Metal supporting compute shaders in 5.6, which I've gone on about all over the place on the forums.

So I'm all for the spreading of awareness, and really what I was getting at is that, by you doing so at this stage, I found myself with questions I need answers to in order to join in with such conversations with asset store developers in a way that puts a little more meat on the bones.
     
  21. elbows

    elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,293
    For the sake of clarity because I do go on a bit, I will explicitly state the two nuggets of info I have been fishing for:

A tiny bit more timescale info that carries on from the planned experimental-to-standard story for the LD Pipe, but for the HD Pipe. It sounds like this is further away for the HD Pipe, so I assume talk of timescales will likely be kept looser, but some sense for asset store devs of, e.g., whether to plan for 2017 or 2018 for the desktop HD Pipe being considered the new ready standard that users might start clamouring to have supported in their assets.

Some advice from Unity about exactly how they envisage multiple third-party asset store assets fitting into the way the pipeline works, the practicalities of shipping assets that make use of a particular pipeline, and/or reassurance if there are no obvious major tech issues here.
     
    Last edited: Jun 22, 2017
    Noisecrime likes this.
  22. Noisecrime

    Noisecrime

    Joined:
    Apr 7, 2010
    Posts:
    1,512
I still want to get around to playing with this SRP. I have a perfect project for it (recursive portal rendering), and I'm hoping that SRP will show a reasonable performance gain due to being able to share various data sets between various operations (e.g. rendering the depthNormalsTexture using the same culling data).

However, currently I feel that there is so little information about the system, and that it appears to be in such a state of high flux and out of sync with the Unity builds that developers can get, that it just seems like an uphill battle. Every time I look at the source code project I have a bunch of questions (e.g. why does it appear that SRP requires custom shaders for each type?) that I can't find the answers to, but which I don't want to post in case I'm missing some detail somewhere; I also want to avoid spamming the thread with a question every few days. I sort of feel I need to actually try to implement my project and post the questions that arise directly from it.

What I would like to see is periodic updates to this thread providing absolute links to Unity experimental builds that support a specific SRP branch/tag. Heck, I'd even prefer just a single download from Unity that provides the editor and SRP library in a single package, so I don't have to try to work out which builds work with which branches. Just something to help get developers started instead of wasting their time matching Unity builds to branches.

I'd also like to see some updated documentation outlining the current state, direction and plans for SRP. It doesn't have to be well written, just comprehensive in providing important, up-to-date information.

Most importantly, I feel it's high time such an important feature got its own sub-forum, and not just a few threads dotted about. That way developers who are playing with this tech can post issues, bugs, questions and findings that others can help with or learn from. I may not have the time to write my own SRP, but I do have the time to read a few threads a day about the issues or successes other developers are having and learn from the information they're posting, arming me with great information for when I do have time to play with it.

    I don't know, I just feel like perhaps UT could be doing more to ease entry into SRP: providing more complete demo projects that aren't broken when you try to run them, perhaps a video or Twitch live stream explaining the current system in depth, anything really.

    I realize that this can be a large amount of additional effort and work, but currently I feel the only people who truly understand the tech are the UT developers making it, and without greater input from customers it runs the danger of missing required features or bug reports.
     
  23. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    25,474
    SRP definitely needs its own graphics sub-forum, if only to draw attention to it, because there's a lot of data here that's really specific and should not be lost. I left a note for the staff; don't know what is happening there.
     
    Alverik, ArthurT and MaT227 like this.
  24. MaT227

    MaT227

    Joined:
    Jul 3, 2012
    Posts:
    614
    Totally agree.
     
  25. DreamingImLatios

    DreamingImLatios

    Joined:
    Jun 3, 2017
    Posts:
    436
    Definitely agree with a sub forum, though I don't think it is a big deal until the next iteration of the API. Speaking of which, it's actually super awesome there's still a planned update for 2017.1 given that a while ago I remember it being stated somewhere that Unity had internally already forked 2017.2 and that there would be no new features for 2017.1, as well as the fact that the current SRP GitHub is on 2017.2. That goes to show how much they value our feedback!

    This is just speculation on my part, but I'm guessing Unity's main purpose for HDRP right now is as a means to test their design and uncover pitfalls and snares before cleaning out all of the legacy code with a fresh, SRP-friendly system. I wouldn't recommend this to artists right now. In its current state, it is more a matter of whether or not graphics programmers are on board with the direction of the framework. Someone from Unity, please correct me if I'm wrong on this.

    Looking at the API and the changes being made and promised, a lot is still in flux, but the big-picture design is more stable than you might expect. The major hurdle SRP had to overcome was finding a way to make SRPs work inside Unity: you have one or more scene editors, a game view, material previews, and possibly other previews, and Unity had to make all of that work while providing real-time feedback. Currently they have a pretty good solution to the problem. (I won't say it is perfect yet, because I can't figure out how to stop Unity from rebuilding my instances, reallocating my shadow maps, and breaking all my events in play mode when I just want to adjust the atmospheric density, which lives on the asset for serialization benefits; all that actually needs to happen is a loop over all instances copying the new value.) The general idea hasn't changed much since its inclusion in public builds: you have an asset which generates instances for all the different windows, and in those instances you have your rendering loop, in which you fill a context with commands to cull data, execute command buffers, and draw renderers, plus whatever other logic you want to put in there. And the fact that people have mostly been asking about features rather than questioning the design choices means either that not enough attention is coming this way or that the design choices are tough to beat (I think it is a little of both).
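    To make the asset/instance split concrete, here is a toy, non-Unity sketch of the idea (all names here are mine, not the real SRP API): the asset is serialized settings plus a factory, and editing the asset invalidates the current instance, so runtime state registered on the old instance is lost when a fresh one is built from the new settings.

    ```csharp
    using System;

    // Toy model of the asset/instance split (illustrative names only).
    // The "asset" holds serialized settings plus a version counter; the
    // factory call copies those settings into a runtime "instance".
    (float atmosphericDensity, int version) asset = (1.0f, 0);

    (float density, int builtFromVersion) CreatePipeline() =>
        (asset.atmosphericDensity, asset.version);

    var instance = CreatePipeline();

    // An artist tweaks the asset in the inspector during play mode...
    asset = (0.5f, asset.version + 1);

    // ...so the editor throws the stale instance away and builds a new
    // one, which is why runtime state registered on the old instance
    // (shadow maps, event subscriptions) gets broken.
    if (instance.builtFromVersion != asset.version)
        instance = CreatePipeline();

    Console.WriteLine(instance.density); // prints 0.5
    ```

    This is exactly the behavior I complain about above: the rebuild is wholesale, even when copying one changed value into the live instances would have sufficed.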

    I actually know the answer to this (to some extent). First off, many SRPs will redefine the shader constants, making the older shaders incompatible. This is actually a good thing, as it means we have more control over which globals are stored in the shaders. I don't have to use Unity's time conventions anymore; I could define a time convention where x is milliseconds, y is seconds, z is minutes, and w is hours. I can then package this in a CBuffer updated once per frame, along with wind settings and other per-frame data, and get a small performance boost. Also, an SRP will generally have custom shaders so that it can take advantage of SRP-specific optimizations, such as tiled lighting.
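    As a toy illustration of that custom time convention (plain C#, nothing Unity-specific; in a real SRP you would upload the resulting vector as a shader global once per frame):

    ```csharp
    using System;

    // Pack elapsed time into the custom convention described above:
    // x = milliseconds, y = seconds, z = minutes, w = hours.
    static (float x, float y, float z, float w) PackTime(double elapsedSeconds) =>
        ((float)(elapsedSeconds * 1000.0),
         (float)elapsedSeconds,
         (float)(elapsedSeconds / 60.0),
         (float)(elapsedSeconds / 3600.0));

    var t = PackTime(90.0); // 90 seconds into the run
    Console.WriteLine($"{t.x} {t.y} {t.z} {t.w}"); // 90000 90 1.5 0.025
    ```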

    The SRP 2017.1 beta 5 release works for me, and I don't believe the API has changed in any 2017.1 beta since. They switched to 2017.2 only a couple of days after release. Something to note is that in 2017.2 they moved RenderPipeline.cs into the engine; that confused me for a while until I got a chance to test things out myself. I also recommend checking out the Unity package in the initial post containing the presentation project. No git required, and in my opinion it provides an easier code template to start experimenting with. A fun little challenge is to try to add lighting to the scene.

    Looking forward to the new API!
     
    WilkerLucio, scvnathan and Noisecrime like this.
  26. misharg

    misharg

    Joined:
    May 30, 2017
    Posts:
    5
    Tested ScriptablePipeline for a few weeks. Here’s what I’ve come up with.

    As a general goal, I was trying to create a ScriptablePipeline for low end mobile devices. The biggest requests I have are to help support forward, single pass, and per object point lights on these low end mobile devices.
    1. Per-object point lights on low-end devices, like the iPhone 5, which don't support compute buffers. The current LightweightPipeline.FillLightIndicies requires compute buffers, which means light indices cannot be used on low-end mobile devices.
    2. Use shader preprocessor keywords per object to turn the point-light calculations in the vertex/fragment shaders on and off. For our shader, running the point-light calculations for all objects costs 25% of GPU device utilization. Only half of our objects are in range of a point light, and our tests showed that GPU device utilization could drop by half (to 12%), which is a huge reduction.
    3. Static objects with single-pass per-object point lights. Allowing objects to use both lightmaps and single-pass point lights would help immensely with our performance. Currently, the only way to support point lights on static objects is through multi-pass lighting, which is prohibitively expensive for low-end mobile devices. In the case where all objects in our scene are in range of 4 point lights, single-pass lighting costs 25% GPU device utilization while multi-pass lighting costs 45%. The old Unity pipeline appears to assume objects get either lightmaps or single-pass point lights (Unity uses vertex lit), but not both. The scriptable render pipeline would be more useful if it were customizable enough to allow objects to get both lightmaps and single-pass point lights.
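    The per-object on/off decision in point 2 boils down to a bounds-versus-light-range test. Here is a minimal, non-Unity sketch of that test (the function name and the keyword it implies are hypothetical, not part of any Unity API):

    ```csharp
    using System;

    // Returns true if a renderer (approximated by a bounding sphere) is in
    // range of any point light. If not, a shader variant with the point-light
    // code compiled out (e.g. a hypothetical POINT_LIGHTS_OFF keyword) could
    // be drawn for that object instead, saving the per-pixel light math.
    static bool InRangeOfAnyLight(
        (float x, float y, float z) objPos, float objRadius,
        ((float x, float y, float z) pos, float range)[] lights)
    {
        foreach (var l in lights)
        {
            float dx = objPos.x - l.pos.x;
            float dy = objPos.y - l.pos.y;
            float dz = objPos.z - l.pos.z;
            float maxDist = l.range + objRadius; // sphere vs. sphere
            if (dx * dx + dy * dy + dz * dz <= maxDist * maxDist)
                return true;
        }
        return false;
    }

    var lights = new[] { ((0f, 0f, 0f), 5f) }; // one light at origin, range 5
    Console.WriteLine(InRangeOfAnyLight((4f, 0f, 0f), 1f, lights));  // True
    Console.WriteLine(InRangeOfAnyLight((10f, 0f, 0f), 1f, lights)); // False
    ```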

    The ScriptablePipeline seems like it could be a great feature for helping Unity support both low-end mobile devices and high-end PCs. In contrast, my attempt to write a custom renderer using CommandBuffer.DrawMesh/DrawRenderer was time-consuming and needed more work to lower CPU usage. However, it did show that if the scriptable pipeline supported the above features, our expected GPU device utilization for point lights would drop to 12% (iPhone 5, 30 fps). This would enable us to make our Unity games look much better.
     
  27. benblo

    benblo

    Joined:
    Aug 14, 2007
    Posts:
    469
    Is your GDC presentation available as a video? I couldn't find it in the GDC Vault...
     
  28. Elecman

    Elecman

    Joined:
    May 5, 2011
    Posts:
    1,331
  29. WilkerLucio

    WilkerLucio

    Joined:
    May 2, 2017
    Posts:
    18
    Elecman likes this.
  30. Elecman

    Elecman

    Joined:
    May 5, 2011
    Posts:
    1,331
    Thanks, that is exactly what I am looking for.
     
  31. phil_lira

    phil_lira

    Unity Technologies

    Joined:
    Dec 17, 2014
    Posts:
    383
    Hi @misharg thanks for the feedback. That's very important.

    Addressing your comments:
    In order to get a performant way of doing per-object light lists on old devices, some changes need to be made to the pipeline core API. I'm working on them at the moment. We should have it available soon (probably in the next beta).

    We've briefly discussed exposing the list of visible renderers to the C# API. That would allow a next level of customization in the pipeline. In your specific case, it would let you get the light offset/count per object and set the keywords yourself. The reason we don't want to do this as a general case in the engine is that, depending on the game and light setup, keywords would break batching, which greatly increases the cost of state management, especially on mobile.

    The lightweight pipeline already supports this, unless something broke recently.
     
  32. MaT227

    MaT227

    Joined:
    Jul 3, 2012
    Posts:
    614
    In the older version, it seems that you were doing some light sorting based on intensity. Wasn't that good enough?

    I would like to know if there's an example of how to handle cookies, and whether (and how) it's possible to handle image effects?
    Thanks a lot!
     
  33. phil_lira

    phil_lira

    Unity Technologies

    Joined:
    Dec 17, 2014
    Posts:
    383
    Hi @MaT227

    The previous sort was global. Ideally we want to sort and cull lights per object. For instance, the example below has 4 pixel lights set in the pipeline, but since they are sorted and culled per object, the cube on the left is shaded with a different set of lights from the cube on the right, giving the impression we run more pixel lights in the shader.

    [Disclaimer: The artifacts in the image are due to the gif compression, not to the render]
    https://i.gyazo.com/07ee1df2bd10d95d31d6fc5315a49e37.gif
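    The idea can be sketched in plain C#. The scoring heuristic below (intensity over squared distance) is my own illustration, not the pipeline's actual internal scoring:

    ```csharp
    using System;
    using System.Linq;

    // Rank each light's approximate contribution at an object's position and
    // keep the strongest N. Each object ends up with its own light list, so
    // two objects can be shaded by entirely different sets of lights.
    static int[] TopLightsForObject(
        (float x, float y, float z) objPos,
        ((float x, float y, float z) pos, float intensity)[] lights,
        int maxLights) =>
        lights
            .Select((l, i) =>
            {
                float dx = objPos.x - l.pos.x;
                float dy = objPos.y - l.pos.y;
                float dz = objPos.z - l.pos.z;
                float d2 = Math.Max(dx * dx + dy * dy + dz * dz, 1e-4f);
                return (index: i, score: l.intensity / d2);
            })
            .OrderByDescending(s => s.score)
            .Take(maxLights)
            .Select(s => s.index)
            .ToArray();

    // Five equal lights along the x axis; a cube at x = 0 and one at x = 5
    // each get a different top-4 set, like the two cubes in the gif.
    var lights = new[] {
        ((0f, 0f, 0f), 1f), ((1f, 0f, 0f), 1f), ((2f, 0f, 0f), 1f),
        ((3f, 0f, 0f), 1f), ((5f, 0f, 0f), 1f) };
    Console.WriteLine(string.Join(",", TopLightsForObject((0f, 0f, 0f), lights, 4))); // 0,1,2,3
    Console.WriteLine(string.Join(",", TopLightsForObject((5f, 0f, 0f), lights, 4))); // 4,3,2,1
    ```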

    Light cookies are still not implemented in Lightweight. It should be done in about a couple of weeks.

    For Postprocessing there's a quick guide here: https://github.com/Unity-Technologies/PostProcessing/wiki/(v2)-Quickstart
     
    Lex4art and MaT227 like this.
  34. MaT227

    MaT227

    Joined:
    Jul 3, 2012
    Posts:
    614
    Nice, this is indeed better :D

    Thanks for the clarifications.

    Thanks again, I didn't know about the PostProcessing customisation, that's awesome !

    You are doing awesome work with the Postprocessing and the SRP !
     
    phil_lira likes this.
  35. MaT227

    MaT227

    Joined:
    Jul 3, 2012
    Posts:
    614
    Oh, I forgot to ask about point light shadows @phil_lira, are they planned for the Lightweight pipeline?
     
  36. misharg

    misharg

    Joined:
    May 30, 2017
    Posts:
    5
    Thanks for your reply, looking forward to Scriptable Pipeline.

    Yeah, I can see how it would be better for state management if the default behaviour preserved batching; otherwise the CPU cost would increase in general. However, the light computations can cost more than what batching saves, so breaking batching can still lower overall GPU utilization. For us, that next level of customization seems necessary; thanks for looking into it.

    Sorry, I didn't mean to imply that this wasn't already a feature of the scriptable pipeline. Things like this are actually one of the reasons we want to use the ScriptablePipeline over the old Unity pipeline.
     
  37. phil_lira

    phil_lira

    Unity Technologies

    Joined:
    Dec 17, 2014
    Posts:
    383
    @MaT227 It was not originally planned for the lightweight pipeline. However, I think it would be nice to have for devs who still want to use the lightweight pipeline but target more powerful devices than we originally planned. It's not on the priority list now, so it might take a while.
     
    Alverik likes this.
  38. phil_lira

    phil_lira

    Unity Technologies

    Joined:
    Dec 17, 2014
    Posts:
    383
    @misharg I'll ping you when I have more information about it. Meanwhile, feel free to comment on any existing/non existing feature and feedback you come up with. This is the way we improve the SRP to suit everyone's needs. :D
     
  39. Tim-C

    Tim-C

    Unity Technologies

    Joined:
    Feb 6, 2010
    Posts:
    2,093
    Yeah, exposing the list of renderers is both good and bad. It allows more flexibility, but it also allows some really slow stuff to be done on the C# side that will make rendering slower (setting material property blocks on all the renderers, things like that). It's something we want to avoid; instead we'll probably have a deferred interface for filtering. Something like this:
    Code (csharp):
    // Sketch of a possible deferred filtering API:
    var cullResults;
    FilterResultHandle fourLightObjects = cullResults.FilterOnLights(4);
    cmdBuffer.SetKeyword("FOUR_LIGHTS");
    DrawRenderers(fourLightObjects);
    And so you can change the keywords between each execution block.
     
    Noisecrime likes this.
  40. elbows

    elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,293
    So, any guidance for asset store devs as per my previous posts?
     
  41. elbows

    elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,293
    Also, is it possible to give any clues as to when you will next tag a release for a particular Unity version? E.g. will there be a version that's more up to date than the last one (May, 2017.1b5) that is tagged to work on, for example, the initial release version of 2017.1?
     
  42. DreamingImLatios

    DreamingImLatios

    Joined:
    Jun 3, 2017
    Posts:
    436
    How bad is this exactly? It's not like Unity doesn't already have things like this. Heck, lately I ran into an issue where an assignment operation (assigning an 8k-poly mesh to MeshCollider.sharedMesh) cost me 20 ms. I'm writing a native C++ plugin to get around that issue. It's kind of like the Find function: a performance killer when used incorrectly, but when used appropriately (like during initialization while generating a procedural world) it could save a lot of programming time. Then again, if the mere possibility of exposing renderers incurs a big performance cost in SRP even when the feature isn't used, then it's a bad idea. That's at least my view; you probably know something I don't.

    Personally, I think it would be really cool to have a DrawRenderersWithCommandBuffer function, in which we could indirectly set up CBuffer parameters from the top 8 (yet to be determined) nearest lights: arrays of light colors, light positions/directions, custom light cookie data, and even dispatch a compute shader ahead of time to re-orient some flowers to point towards the light without clipping into the plant-covered character (sorry, I like to do a lot of weird fantasy stuff).

    Here's a made-up example with a made-up API (and probably made-up syntax too, because I'm tired):
    Code (CSharp):
    var lightSorter = new LightSorter();
    lightSorter.sortMode = LightSorterMode.INTENSITY_FALLOFF;
    lightSorter.falloffCurve = LightSorter.GetPropertyFromAdditionalLightData<FantasyWorldAdditionalLightDataComponent>("m_FalloffCurve");
    var lightSortResults = lightSorter.Sort(8);
    CommandBufferIndirect cmdi = new CommandBufferIndirect();
    for (int i = 0; i < 8; ++i)
    {
        cmdi.SetGlobalVector(LightPositionShaderIDs[i], lightSortResults.GetPropertyFromTransform("position", i));
        cmdi.SetGlobalColor(LightColorShaderIDs[i], lightSortResults.GetProperty("color", i));
        cmdi.SetGlobalTexture(LightCookieShaderIDs[i], lightSortResults.GetPropertyFromAdditionalLightData<FantasyWorldAdditionalLightDataComponent>("m_LightCookieRTI", i));
    }
    cmdi.SetComputeVectorParam(FlowerFollowerComputeShader, "NearestLight", lightSortResults.GetPropertyFromTransform("position", 0));
    cmdi.SetComputeBuffer(FlowerFollowerComputeShader, FlowerFollowerComputeKernelID, "FlowerMesh", cmdi.RenderIndirect.VertexDataToComputeBuffer(VertexAttributes.POSITIONS | VertexAttributes.COLORS | VertexAttributes.NORMALS | VertexAttributes.TANGENTS));
    cmdi.SetComputeBuffer(FlowerFollowerComputeShader, FlowerFollowerComputeKernelID, "VertexDisplacementMatrices", cmdi.RenderIndirect.GetPropertyFromAdditionalRendererData<FlowerLightFollower>("m_VertexDisplacementMatricesBuffer"));
    cmdi.SetComputeBuffer(FlowerFollowerComputeShader, FlowerFollowerComputeKernelID, "VertexDisplacementMatricesIndices", cmdi.RenderIndirect.GetPropertyFromAdditionalRendererData<FlowerLightFollower>("m_VertexDisplacementMatricesIndicesBuffer"));
    cmdi.DispatchCompute(FlowerFollowerComputeShader, FlowerFollowerComputeKernelID, FlowerComputeTGX, FlowerComputeTGY, FlowerComputeTGZ);
    cmdi.SetGlobalBuffer(FlowerFollowVertexDisplacementMatricesShaderID, cmdi.RenderIndirect.GetPropertyFromAdditionalRendererData<FlowerLightFollower>("m_VertexDisplacementMatricesBuffer"));
    cmdi.SetGlobalBuffer(FlowerFollowVertexDisplacementMatricesIndicesShaderID, cmdi.RenderIndirect.GetPropertyFromAdditionalRendererData<FlowerLightFollower>("m_VertexDisplacementMatricesIndicesBuffer"));
    cmdi.RenderIndirect.Draw();
    context.DrawRenderersWithCommandBufferIndirect(DrawSettings, cmdi);
    Now obviously this example reeks of reflection and serialization issues, to name a few. But if there were a way to have this level of power, where SRP could adapt to any use case rather than a limited few (such as single-mode light sorting) while still maintaining performance, provided whoever programs the SRP profiles and doesn't do anything stupid, well, I know I would have a lot of stupid fun...

    Anyways, looking forward to the next API update! Besides per-object concerns, custom ShaderLab tag values for built-in types, the awaited lightmapping API, and the few other random things already addressed, I have no more feedback to give, unfortunately.

    Time estimates are always appreciated, but I totally get there's a lot of uncertainty and challenges for something so game-changing. We appreciate what you do!
     
  43. Tim-C

    Tim-C

    Unity Technologies

    Joined:
    Feb 6, 2010
    Posts:
    2,093
    The current goal is to get the HDPipe into a feature-complete, but rough around the edges, state for October. From there we will spend a few months rounding it out, doing performance passes, and making it a real first-party citizen. I would say that if you are an asset maker, October is a good time, as the API will be solid and the team will have time to take on requests (such as missing hook points and similar).

    We are going back and forth internally on this currently, but we want to do something similar to cameras, where there are hook points letting you inject code in places like 'after opaque', 'after transparent', and that kind of thing, so third parties can hook in. One other nice thing is that if there is a real issue with hooking into the pipe, you can make improvements and we can PR them into the mainline if it makes sense for the general user.
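    The hook-point idea can be sketched outside Unity as a tiny registry of named injection points that the render loop invokes at fixed stages. Everything below is illustrative (my own names), not the real SRP API:

    ```csharp
    using System;
    using System.Collections.Generic;

    // Minimal hook-point registry: the render loop exposes named injection
    // points and runs whatever callbacks third parties registered there.
    var hooks = new Dictionary<string, List<Action<List<string>>>>();

    void Register(string point, Action<List<string>> callback)
    {
        if (!hooks.TryGetValue(point, out var list))
            hooks[point] = list = new List<Action<List<string>>>();
        list.Add(callback);
    }

    void Invoke(string point, List<string> frame)
    {
        if (hooks.TryGetValue(point, out var list))
            foreach (var cb in list) cb(frame);
    }

    // The "pipeline": each frame renders opaques, then transparents,
    // invoking the matching hook point after each stage.
    List<string> RenderFrame()
    {
        var frame = new List<string> { "Opaque" };
        Invoke("AfterOpaque", frame);
        frame.Add("Transparent");
        Invoke("AfterTransparent", frame);
        return frame;
    }

    // A third-party asset injects an outline pass after opaques:
    Register("AfterOpaque", f => f.Add("OutlinePass"));
    Console.WriteLine(string.Join(" -> ", RenderFrame()));
    // Opaque -> OutlinePass -> Transparent
    ```

    The nice property, as described above, is that the third party never touches the pipeline source; it only registers against the published injection points.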
     
  44. Noisecrime

    Noisecrime

    Joined:
    Apr 7, 2010
    Posts:
    1,512
    Just out of interest, could SRP be used to create a pre-5.6 forward rendering system? I keep running into issues with the new allowMSAA setting in 5.6 and above, mostly I think due to the Resolve.AA that happens. I've not checked the SRP source in a while, so I'm unsure whether the resolve can be avoided while still having MSAA?
     
  45. elbows

    elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,293
    Thanks for the brilliant, extremely helpful reply. Great amount of clarity, cheers! :)
     
  46. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    25,474
    I think we are probably going to need a SRP graphics sub forum. @Tim-C too soon?
     
    Alverik and chiapet1021 like this.
  47. Tim-C

    Tim-C

    Unity Technologies

    Joined:
    Feb 6, 2010
    Posts:
    2,093
    Yes it can be :) You'll be able to reuse the existing shaders and just write a pipeline for all the other passes and things like that. It would be a lot of work, though (the old pipelines do some really weird stuff due to legacy).

    Can you describe the issue more? If you had msaa turned on pre 5.6 you would still have to resolve it to do image filtering and similar.
     
    Noisecrime likes this.
  48. DreamingImLatios

    DreamingImLatios

    Joined:
    Jun 3, 2017
    Posts:
    436
    Just realized you released the new API with the 2017.2 beta! At first I was a little disappointed that it wasn't back-ported to 2017.1, where things are stable, but then I cracked open the documentation (I can't actually open the editor due to some licensing issue/bug).

    THE RENDERPASS API IS AMAZING!

    I'm psyched! I prefer deferred rendering, as most of the crazy effects I'd do work in screen space. And in the case of my FPS project, I'm also doing a bunch of MRT rendering of PBR textures for painted objects. This new API makes all of this stuff about 10 times easier and more intuitive, plus it comes with performance benefits! Also, the per-target blend modes are really nice! If there was a way to do this before in Unity, I am unaware of it and had to rely on CopyTexture commands and do the blending directly in the shader.

    Quick feature request: Can we bind a render texture array slice to a RenderPassAttachment?

    I don't have much to say on the ScriptableCullingParameters/CameraProperties API as the documentation isn't quite complete/clear yet, but I like the direction it is going. I'm guessing it is pretty new as not even the public repo uses it.

    I totally get why you haven't had time to address features like lightmaps and per-object stuff. I just hope that as SRP develops, more and more of the API reaches RenderPass level of intuitiveness!

    Now that 2017.2 is out with the new API, are you going to create a new thread there?
     
    Noisecrime likes this.
  49. Noisecrime

    Noisecrime

    Joined:
    Apr 7, 2010
    Posts:
    1,512
    Cool. As you say, probably a lot of work, but at least it gives options.

    I've been working on a recursive stencil portal rendering system (which I'm desperate to find some time to update to SRP, as I think that would be cool), and the result in 5.6 was just a blank (skybox) view. The whole rendering chain was broken, though I think in the end it may have just been due to some loss of the stencil buffer from MSAA resolves. In the same project I noticed massive numbers of resolves due to the recursive rendering (via Camera.Render()) required.

    Thinking about it, my project might still be broken, as I simply circumvent Unity's allowMSAA and instead create my own render texture with AA and render directly to that, thus avoiding Resolve.AA after each and every Camera.Render() call until I've finished rendering all my cameras.

    I guess the main point is that something changed in 5.6, which I assume is how MSAA is dealt with, that ends up potentially destroying information between multiple camera renders, as well as introducing needless resolves when they are not required. Most likely this is just an unfortunate side effect of Unity being a generic engine, versus the developer's special knowledge of the render system they want to create. Funnily enough, that's exactly the sort of thing SRP is going to solve ;)

    Having said that, I recently found a very bizarre situation in my fixed version where the final Resolve.AA appears to add content to the view that is not from my render texture at all. No idea what is going on there; it really doesn't make sense, so I need to dig into it further.


    A simpler and more obvious issue with 5.6 MSAA is that ScreenSpace-Overlay canvases have lost all AA capability! I submitted this as Bug 927346 (QA just marked it as reproducible). In a simple test, create a 1 x 128 UIImage and then arbitrarily rotate it. Prior to 5.6, if you had MSAA enabled it would be nicely anti-aliased; since 5.6, nothing, it's just aliased. Interestingly, ScreenSpace-Camera does not have that problem, but it sadly breaks other aspects of my code. I'm hoping this bug was just an oversight that can be fixed, as AA on a canvas can be very important in some cases.
     
  50. JakubSmaga

    JakubSmaga

    Joined:
    Aug 5, 2015
    Posts:
    416
    Since the 2017.1 release is next week and this 2017.1 beta sub-forum is probably going to be closed, shouldn't SRP already have its own sub-forum?