
Unity Focusing on Shaders

Discussion in 'Shaders' started by marcte_unity, Sep 18, 2021.

  1. marcte_unity

    marcte_unity

    Unity Technologies

    Joined:
    Sep 24, 2018
    Posts:
    11
    We know that the introduction of the Scriptable Render Pipelines brought a lot of complexity to writing shaders in Unity. We are working on improving that experience. As you are likely aware, the features we write today take almost a year to release in a shipped Unity Editor, so please keep that broad timeline in mind. We will share frequent updates on these plans and give you access to our work in alphas and betas for early feedback whenever possible.

    Requirements
    We’ve read these threads, your tweets, and your blog comments, and we’ve talked with some of you directly. We know the following features are critical:
    1. You need to hand-write shaders and have that code work across pipelines and upgrade seamlessly to new versions of Unity.
    2. You need a straightforward way to do that without a ton of boilerplate code.
    3. You need these shaders, and ones created in Shader Graph, to be modular, composable, and work with custom Render Pipelines.
    4. You need to create your own templates/targets for any render pipeline and let TAs and artists extend those in Shader Graph or via code.
    The following system is designed to meet those requirements.

    Design
    We are adding the concept of composable Shader Blocks to our existing Shader Lab language. A Shader Block is an isolated piece of shader functionality that declares a public interface in the form of inputs and outputs.

    Render pipelines will provide Shader Templates, each exposing a set of Customization Points (name TBD). Shader Blocks can be plugged into these Customization Points, and their inputs/outputs will be automatically matched to pass data between them. You will also be able to control this data flow manually. A completed template outputs a subshader; the final shader is then composed from one or more subshaders.
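    As a purely illustrative sketch (the actual syntax has not been announced; every keyword, name, and property below is a placeholder, not confirmed Unity syntax), a Shader Block declaring its public interface might look something like this:

```hlsl
// HYPOTHETICAL syntax sketch - the real Shader Block grammar is still TBD.
// A block is an isolated unit that declares typed inputs and outputs;
// _DetailMap and _DetailTiling are placeholder material properties.
ShaderBlock "DetailNormal"
{
    Inputs
    {
        float2 uv;            // matched automatically to an upstream output named "uv"
        float3 surfaceNormal; // or wired manually if auto-matching isn't wanted
    }
    Outputs
    {
        float3 normalWS;      // consumed by whatever the template plugs in next
    }
    void Execute()
    {
        float3 detail = UnpackNormal(
            SAMPLE_TEXTURE2D(_DetailMap, sampler_DetailMap, uv * _DetailTiling));
        normalWS = normalize(surfaceNormal + detail);
    }
}
```

    A template's Customization Point would then accept one or more such blocks, matching outputs to inputs by name and type.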

    Underlying this system will be an internal, intermediate representation of the Shaders that we’re calling the Shader Foundry. All Shader Block code, as well as Shader Graph files, will produce this unified representation, allowing both to use the same set of templates to output Shaders for each required pipeline.

    Of course this is all very high-level, but over the next few months we’ll show you examples of the code and how to use it so we can refine our APIs based on your input.

    Benefits
    The biggest benefit of this system is that it supports every feature of the render pipelines. No more waiting for Shader Graph to add a node or workflow - URP and HDRP will expose their features directly. It also provides stability across upgrades so the shader code you write in it will work for years. Additional benefits:
    1. Customizability - the provided customization points give a lot of power over the final result. If they are insufficient, you can create your own template, either from the ground up or based on an existing one, to limit or expand the end user's ability to control the shader.

    2. Reusability - Shader Blocks are interchangeable, mix-and-match pieces. Unity’s internal functionality will be written as blocks that you can incorporate into your own shaders. The Asset Store will use the same system as well.

    3. Specialization - We will include the concept of pipeline-specific blocks and branches so functionality can be adjusted based on which pipeline is in use, along with other performance, platform, and feature-support branches.

    4. The limitations of the system are driven by the render pipelines. For example, HDRP will not support custom lighting due to the complexity of its lighting system, while URP will support providing your own lighting function because URP’s lighting system is not as deeply integrated with the other parts of the rendering pipeline and is designed to be highly flexible. As part of our goal to provide a unified authoring environment across pipelines, we are finding ways to reference common concepts (like motion vectors) in a pipeline-agnostic way.

    Surface Shaders?
    A big question on many minds is, “Will this provide a feature-for-feature replacement for the BuiltIn Render Pipeline’s Surface Shaders?”. Our goal is to provide as much parity and support as possible within the constraints of a modern, performant graphics pipeline. Some of the features and syntax that Unity added to Surface Shaders are kind of bonkers! And you all did amazing, beautiful things with them! But that kind of deep access to every facet of the renderer’s internals just isn’t realistic in a modern rendering architecture. We’ll provide a lot more detail on this in the future, including a feature-by-feature comparison. For the Surface Shader features we can’t support, we’ll gather your input on what you used them for and do our best to provide alternative solutions.

    Timeline
    So when can you get your hands on this? It won’t all come right away. Over the next year, we will release the Shader Foundry layer - including a public, supported C# API for constructing shader code in the Foundry. We will also update our Shader Graph Targets to take advantage of the new system. Along with that, we will start releasing previews of the new ShaderLab features that will become the preferred way to write shaders for Unity. If you prefer, you will still be able to write directly in the low level languages you use today. Since this all has a lot of nuance and we need to get it right, we’ll be reviewing the work in progress with a lot of people both internally and externally before we finalize and release it.

    Summary
    This is a high-level plan, and we understand if you have questions or are just skeptical! Our best answer will be to execute and deliver a fantastic solution. Please let us know where you'd like additional clarity, or if there are use cases we haven't captured here. We welcome your feedback and we look forward to the incredible experiences you’ll build with it!
     
    Kirsche, GliderGuy, grizzly and 21 others like this.
  2. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,072
    Sounds great! I'm very interested to see the details and to port some things to the system once it's available. Unfortunately, given the timeline, that leaves maybe another 5 years of supporting the existing mess, but it seems like eventually we will have nice things.

    I'm glad you're copying the stackable concept from Better Shaders, it will pay off big time, as being able to write and compose shaders from separate authors, written in text or graph, is kind of a holy grail for studios and asset store users. I'm still unclear if allowing multi-stacking (multiple copies of the same subshader added to one shader) was the right choice in Better Shaders, since it has to mutate the names of properties/keywords/variables to work, but it is a powerful feature if you choose to allow it. (Note, wasn't clear if you have some kind of blackboard for data sharing between blocks, but that is needed as well. Might even want the equivalent of [RequireComponent()] for shaders?)

    I'm also very interested in how you plan to handle optimizations and some of the more esoteric cases - for instance, optimizing out interpolator usage in shader variant scenarios, or handling things like centroid interpolation on texcoords. In Better Shaders I don't have the user write these structures by hand, which greatly reduces the complexity of writing shaders and keeps things standard and well named (which I suspect you'll do as well, given the shader graph/text approach here). Mostly it just kinda works and strips what you don't use. But there are times when one variant might not use some data and you want to optimize it out, and making the parser detect and do this automatically would be a lot of work. So I ended up going for an "opt-in" approach with these kinds of optimizations, where you can add an #if _FOO check around something like texcoord3 in your appdata structure via an option. Same for something like using centroid interpolation.
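    Spelled out by hand, the opt-in stripping described here looks roughly like the following (illustrative HLSL only; the `_FOO` keyword and field names are invented for the example):

```hlsl
// What an "opt-in" interpolator optimization expands to in plain HLSL.
struct appdata
{
    float4 vertex    : POSITION;
    float2 uv        : TEXCOORD0;
#if defined(_FOO)
    // Only declared in variants compiled with the _FOO keyword, so other
    // variants pay no vertex-fetch or interpolator cost for it.
    float4 texcoord3 : TEXCOORD3;
#endif
};

struct v2f
{
    float4 pos : SV_POSITION;
    // centroid interpolation avoids MSAA edge artifacts, but must be
    // requested explicitly on the interpolator.
    centroid float2 uv : TEXCOORD0;
#if defined(_FOO)
    float4 extra : TEXCOORD1;
#endif
};
```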

    I do think what you expose in a graph and what you expose in text can be different. For instance, adding compute support to a text based shader is a matter of adding some code to the shader (not the template), but adding support to a shader graph is not trivial, requiring whole new constructs and nodes. So in a way, I think the new approach, which allows for both within the same shader, can actually relax the requirements on the shader graph somewhat. Programmers can write a block which reads the compute data and provides the data to the blackboard, and artists can just plug it in and graph away. In fact, many things which the shader graph currently doesn't support could be easily handled this way (terrain shaders, etc).

    Question:
    - Will this target BiRP as well as SRPs? If not, we will still have to write for multiple pipelines until it either does or BiRP is removed.
     
    GliderGuy, cxode, Edy and 15 others like this.
  3. ElliotB

    ElliotB

    Joined:
    Aug 11, 2013
    Posts:
    38
    Sounds very interesting and I look forward to hearing more!
    This bit did concern me a little. Do you have any idea what limitations will be imposed in the future? I may be misunderstanding, but this feels at odds with the goals of the scriptable render pipelines (exposing more control and low level access to the user to enable 'non-standard' things to be done)
     
    cxode and OCASM like this.
  4. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    28,028
    +1

    Is there a rough ETA? Even when SpaceX designs rockets, they have a rough ETA :)
     
  5. valarus

    valarus

    Joined:
    Apr 7, 2019
    Posts:
    296
    Unity is not rocket science. :)
     
  6. bac9-flcl

    bac9-flcl

    Joined:
    Dec 5, 2012
    Posts:
    815
    This is very welcome news! Looking forward to trying the system out when it becomes available - the lack of a simple, maintainable shader programming solution similar to surface shaders has been the single biggest issue I've had with URP/HDRP.
     
  7. marcte_unity

    marcte_unity

    Unity Technologies

    Joined:
    Sep 24, 2018
    Posts:
    11
    Jason! Thanks for your great feedback. You get it, and the points you've highlighted (like name mangling) are exactly what we're working out and debating internally. Seeing how people use Better Shaders has given us invaluable insight into the opportunities and pitfalls awaiting us! And you're exactly right - this will free Shader Graph to focus on providing fabulous artist workflows without the expectation of exposing every nook and cranny of the domain.

    Regarding optimizations, that's one of the major strengths of the Foundry concept. Instead of splicing strings around, we'll have a full data representation of the shader blocks that we can reason about along multiple axes - platform support, quality levels, etc. I think your "opt-in" approach is a good one for when automatic optimization fails. Once we're a little further along, I'm sure we'll have some great conversations about how best to Make Go Fast.

    As for BuiltIn Render Pipeline support, that's the hidden driver behind the BiRP Target we're releasing for Shader Graph in Unity 2021.2. We're already testing this new work using that target so it can be a bridge for our customers to reduce the effort required to upgrade to SRP when they're ready.

    ElliotB, we're not imposing any limitations on what SRP already supports. As I said, we'll have the Surface Shader feature comparison out as soon as we can.

    Hippocoder and Valarus, we're not rocket scientists. :D As I said, "Over the next year...we'll start releasing previews." As soon as we're confident in a more specific timeline, we'll share.
     
    Last edited: Sep 20, 2021
    GliderGuy, Ruchir, cxode and 5 others like this.
  8. ElliotB

    ElliotB

    Joined:
    Aug 11, 2013
    Posts:
    38
    Sounds great, thank you! I've really enjoyed the new possibilities from SRP so I'm excited to see what comes next.
     
  9. aleksandrk

    aleksandrk

    Unity Technologies

    Joined:
    Jul 3, 2017
    Posts:
    2,136
    @jbooth Thank you for the feedback!
    Blocks produce and consume data using inputs and outputs. Passing data from one block to another is as simple as linking an output of a producer to an input of a consumer. We'll also make some rules for passing data around implicitly so that one doesn't have to type in everything.

    Definitely!
     
    cxode, OCASM and hippocoder like this.
  10. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    28,028
    Great, so if I have a block that only works on vertex and is really optimised, the result will get passed to fragment just like the old surface shaders? That way I can calculate some pretty expensive things at a lower granularity. Also, what about interpolation for that? I'm assuming it'll just work if I packed a spare "texcoord". I think a lot of that should be cleared up.

    My current way of working (so I remain sane) is to just go directly to a function node, work in my HLSL and exit to the final output (base, emission, etc).

    One of my needs was to be able to do something to the colour after all lighting has been calculated by Unity. Would that be possible too? It's not the maximum priority, but it'd be nice to do dithering myself, or other stuff at the end like fast object-based grading/tone mapping, and I can't currently without losing hair or using an unlit shader... not ideal.

    I tend to develop for VR and low power devices but have a huge bag of tricks that sadly died when URP rolled around due to lack of flexibility and access.

    How much control do we have?
     
    OCASM likes this.
  11. aleksandrk

    aleksandrk

    Unity Technologies

    Joined:
    Jul 3, 2017
    Posts:
    2,136
    Yep. We'll provide a mechanism to pass data from a block in one stage to a different block in a following stage. You'll also be able to control how it's interpolated.
    Do you think explicit control over what gets packed with what will be needed? We had an automagic system for interpolator packing in mind.
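    For context, "interpolator packing" here means merging several small values into as few full float4 interpolators as possible, since slot count (not component count) is the scarce resource on many GPUs. A hand-written illustration (names invented for the example):

```hlsl
// Hand-packed interpolators: two float2 UV sets share one TEXCOORD slot.
struct v2f
{
    float4 pos  : SV_POSITION;
    float4 uv01 : TEXCOORD0; // xy = uv0, zw = uv1 - unpacked in the fragment stage
};
```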

    This will be possible once the render pipeline you work on provides a customisation point in its templates. I suppose you're talking about the lack of finalColor from surface shaders in SRP land, is that right?
     
    valarus and cxode like this.
  12. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    28,028
    Yeah, I just want to do grading and the like in "mesh" shaders, as I can't really afford to do it as a post process. I'm actually not using any post processing at all (it's all local to the object shaders).

    Turns out in 2021 it's still faster to do the old tricks (for VR and Switch, at least).
     
    mahdi_jeddi likes this.
  13. Jes28

    Jes28

    Joined:
    Sep 3, 2012
    Posts:
    813
    Automatic is a good default, especially if we can actually see how the data gets packed.
    But sometimes we want to pack data in a specific way, because additional data can be derived from that format. So I guess we will ask for a manual option eventually.

    Dunno what @hippocoder wants to say, but I've fallen many times into situations where I want to replace some block at the project level, not the specific-shader level - e.g. pre/post-process any input of the master node, or replace the SampleTexture2D node for all shaders (or some selection) in the project, etc.

    The main thought is that we want to be able to pre/post-process all shaders (or their internal pieces) in a project (from the store or any other source) to meet a specific project art style.
    For example: Unreal has the concept of a Base Material, and to apply virtual texturing you replace the texture sampler in that base material with a Virtual Texture sampler:
    • When adding a virtual texture to a Material Graph, UE assigns the Virtual sampler type automatically. However, should you make the expression a Texture Sample Parameter that you can use in Material Instances, keep in mind that the base material applies the Virtual sampler type to all child instances.
    Would it be possible to do something like this?

    Would it be possible to create a mesh with a very custom format and custom data of custom data types :) and feed it through a special custom shader block to URP?
    I mean, we have a specially packed vertex format and unpack it so the rest of the vertex shader can use it as normal. This would additionally require mesh previews to know which block to use to unpack the vertex data and actually render the preview.
     
  14. aleksandrk

    aleksandrk

    Unity Technologies

    Joined:
    Jul 3, 2017
    Posts:
    2,136
    This seems orthogonal to interpolator packing. The data will be rearranged back to the same layout when unpacking interpolators anyway.
    We thought about this. We haven't settled on exactly how to do this yet. Probably some name-based solution.
    This depends on the template design on the SRP side :) Sounds like a reasonable thing to be able to customise.
     
    Jes28 likes this.
  15. Jes28

    Jes28

    Joined:
    Sep 3, 2012
    Posts:
    813
    My thoughts on this:
    - Type-based for nodes (or GUID-based if the node is an asset)
    - Port-UID-based for master node ports (master node ports must be stable)
    - Some sort of shader inheritance, so all shaders inherit a BaseShader and we can change that base shader per project to change the whole project. Maybe that BaseShader looks like a MasterNode from inside other shaders
    - Shader selection based on tags (e.g. tag all VFX shaders and then apply a replacement by tag)

    I think we need the concept of redefining shader outputs (MasterNode as a small subshader) and
    the concept of redefining shader inputs (like redefining TextureSamplers, LightData, ShadowData, VertexData, LightmapData, MainTexture (well-known texture input), PerInstanceDataSource...)

    Hope this is useful :)

    P.S.: With this big construction kit I wish to, again, ask for on-demand runtime shader compilation instead of the current, hard-to-deal-with compilation of 100500 shader variants at build time :)
     
  16. aleksandrk

    aleksandrk

    Unity Technologies

    Joined:
    Jul 3, 2017
    Posts:
    2,136
    Nope :)
    This will all be Editor side, outputting a .shader file.

    All noted and will be given thought :)
    Thank you!
     
    Jes28 likes this.
  17. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,072
    This seems very "shader graph centric", does this work with text based shaders? Just don't want to end up in another case where only half the tool works from scripting.

    I think the trick here is around things like conditional compilation and which stage something is used in. In Better Shaders, the default thing is to just pass anything used by the shader over to the pixel shader. However, this can be wasteful if you are only using vertex color in the vertex stage, or only using it when some keyword is set, or unpacking one type of data into another type before sending to the next stage. So I've added various opt-in settings for stuff like this, where you can say "This is only used in this stage, so don't pass it" or "add this #ifdef around this interpolator". It's not "performance by default", rather "tell me some conditionals or guarantees and I'll optimize this out".

    Ideally this would all be automatic, but then you'd run into the surface shader issue of having to generate all the variants to figure out what cases need what data, which basically capped that system to a small number of keywords or the generator would explode. Better Shaders attempts to do everything with a really dumb parser; and I think that's actually good- but this is one area where more knowledge of the code would be useful.

    The problem with global replacement is usually you don't mean global. For instance, shaders are used in UI drawing, editor code, etc, and may not want to be modified. So usually these are scoped to something like "Surfaces". I would suggest handling this via the blocks system - if Unity is going to ship their shaders as blocks to build from, then you can insert a global block and have all your shaders use that as the base instead of Unity's.

    What might be nice is if templates can have a list of blocks they include added on a project basis. Let's take "Curved World" as an example- you need to add a vertex modifier function to every surface's shader, but not ones for UI and such. You could modify the template to do this, but then when you upgrade Unity it breaks as the new template overwrites it. Or you could go to every shader you use and add the block. But both are brittle in their own ways. If instead, you could go to the "URP Lit Template" and add a project wide custom block there, then all the shaders would get recompiled with this new code automatically.
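    As a purely hypothetical sketch (placeholder syntax and names, nothing confirmed by Unity), such a project-wide vertex-modifier block added to a template might look like:

```hlsl
// HYPOTHETICAL: a "Curved World" style block a template could include for
// every surface shader in the project. _CurveStrength is a placeholder
// property; the ShaderBlock grammar itself is invented for illustration.
ShaderBlock "CurvedWorldBend"
{
    Inputs  { float3 positionWS; }
    Outputs { float3 positionWS; } // modified in place, passed downstream
    void Execute()
    {
        // Bend geometry downward with squared horizontal distance from the camera.
        float3 offset = positionWS - _WorldSpaceCameraPos;
        positionWS.y -= dot(offset.xz, offset.xz) * _CurveStrength;
    }
}
```

    The appeal of attaching this at the template level is exactly as described: every shader built from the template picks it up on recompile, and a Unity upgrade that replaces the template doesn't silently drop the modification.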
     
  18. aleksandrk

    aleksandrk

    Unity Technologies

    Joined:
    Jul 3, 2017
    Posts:
    2,136
    Yes, we're working on syntax.
    Depends on how you break up the shader into blocks - this can become N+M instead of N*M, which can be OK.
    If you can base a template on another template, this one may work. This will need more thinking anyway :)
     
  19. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,072
    Right - I had considered analyzing each stage separately, but because my parser is particularly stupid, it would need to be a lot smarter about things like shared functions used in any of the stages, etc. And the #ifdef case could likely be handled by tracing the control flow and pushing defines onto a stack which can be checked, giving each use of, say, .vertexColor, a scope of defines it exists within. Having optimizations like these be automatic would be really nice, as even in traditional shaders you can spend a lot of time managing this kind of stuff, and conditional code compilation hides bugs like nothing else.
     
    GliderGuy, hippocoder and aleksandrk like this.
  20. BattleAngelAlita

    BattleAngelAlita

    Joined:
    Nov 20, 2016
    Posts:
    389
    URP/HDRP exclusive?
     
  21. Edy

    Edy

    Joined:
    Jun 3, 2010
    Posts:
    2,044
    I need a clarification on this: If I have a Unity Material using one of these magic shaders we're debating about, will I be able to change the render pipeline in the project and this material will just work, including built-in? Without going pink or requiring an irreversible upgrade?
     
    Last edited: Sep 21, 2021
  22. sabint

    sabint

    Joined:
    Apr 19, 2014
    Posts:
    20
    What I would really like to see is ability to easily re-use shader code from existing shader libraries that I've developed in the past, or open-source libraries like Lygia. Currently, this is difficult because of two reasons:
    1. Unity has introduced a convention of needing a "_float" or "_half" suffix on custom functions. That means I'd need to modify all my existing functions to be able to use them as custom functions in the graph. (Can someone remind me why this restriction exists?)
    2. Adding a custom function would be a lot less painful if Shader Graph automatically detected input/output parameters by parsing the function's signature. Right now, it involves adding each field one by one and selecting a type from a drop-down. Do this for many functions, and it becomes incredibly tedious.

    I'd really like to request Unity to make custom functions easier to use. A lot of shader writers need/want to maintain the core of their logic in HLSL files.
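    For reference, the suffix in question is the precision suffix Shader Graph appends when calling a file-based custom function; the function and parameter names below are my own example, but the `_float`/`_half` convention itself is the documented one:

```hlsl
// Wrapping an existing library function for the Custom Function node.
// Shader Graph calls Desaturate_float or Desaturate_half depending on the
// graph's precision setting, which is why the suffix is required on the
// definition; the node itself references the name without the suffix.
void Desaturate_float(float3 In, float Amount, out float3 Out)
{
    float luma = dot(In, float3(0.299, 0.587, 0.114));
    Out = lerp(In, luma.xxx, Amount);
}

// A matching _half variant is needed if the graph may run at half precision.
void Desaturate_half(half3 In, half Amount, out half3 Out)
{
    half luma = dot(In, half3(0.299, 0.587, 0.114));
    Out = lerp(In, luma.xxx, Amount);
}
```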
     
    Supergeek likes this.
  23. marcte_unity

    marcte_unity

    Unity Technologies

    Joined:
    Sep 24, 2018
    Posts:
    11
    @BattleAngelAlita No, we plan to provide as much Built-In support as is feasible. SRP is our primary focus, but we want to enable the _all_ RP use case and make it easier for people to upgrade to URP/HDRP when they're ready.

    @Edy That's the goal! The shaders may appear different based on the pipeline, and may need pipeline-specific paths. They're just code, not magic. ;) If a shader uses an HDRP-specific feature, and does not provide an alternative path, it could certainly appear pink in URP. But the ability to fix that will exist, without irreversible upgrades.

    @sabint I'll let someone else tackle #1, but for #2, that's an obvious win that we'd like to provide, but it's not on our roadmap yet. We are definitely investing in making custom functions easier - one of our long-term goals for the Shader Foundry is that you can write a block with code, then expose that block automatically inside Shader Graph as a custom node.
     
    GliderGuy, sabint, Oxeren and 2 others like this.
  24. BattleAngelAlita

    BattleAngelAlita

    Joined:
    Nov 20, 2016
    Posts:
    389
    Custom too? Or like a shadergraph?
     
  25. aleksandrk

    aleksandrk

    Unity Technologies

    Joined:
    Jul 3, 2017
    Posts:
    2,136
    Yes, custom SRPs as well.
     
  26. marcte_unity

    marcte_unity

    Unity Technologies

    Joined:
    Sep 24, 2018
    Posts:
    11
    Yeah, as Aleks said, once we've converted our internal RP support (URP/HDRP/BiRP) to the new Template system and are confident it's ready, we will make that API public so everyone can create Templates specific to their custom RPs.
     
    JoNax97 likes this.
  27. Edy

    Edy

    Joined:
    Jun 3, 2010
    Posts:
    2,044
    Cool, I understand that. I just expect Unity to provide a standard set of shaders that look reasonably good in the three official RPs (URP/HDRP/BiRP). That's the Holy Grail for Asset Store publishers and multi-platform projects.
     
    Last edited: Sep 22, 2021
    Supergeek and Ruchir like this.
  28. Supergeek

    Supergeek

    Joined:
    Aug 13, 2010
    Posts:
    102
    You have developers begging you to stop making things so complicated, but instead you're just adding new systems. That's the new normal for Unity.

    Unity is killing itself by adding extensions and new systems instead of fixing or consolidating anything. Unity has built an engine ecosystem so confusing that it requires a dedicated expert in every dev team just to figure out how to match project requirements to the version of Unity with the systems that will support those requirements.

    Unity needs a dedicated in-house game development team to teach them just how painful it is to work with their own engine. Not just the very limited sample scenes and demos you've made available, but REAL, full games, for 2D and 3D.

    Unity has lost the plot. They took the "democratizing game development" mantra and misinterpreted it; they made the engine as complex as the workings of a government. I've wasted years on it, but I think I'm done with Unity.
     
    Last edited: Sep 27, 2021
    GliderGuy, hippocoder, Edy and 4 others like this.
  29. adslitw

    adslitw

    Joined:
    Aug 23, 2012
    Posts:
    171
    @Edy - not sure if you're aware but @jbooth already released his 'better shaders' asset which does exactly that - it's absolutely amazing. You're an absolute lifesaver on a daily basis @jbooth!

    (I still only use BiRP, but the Better Lit Shader is my now default shader for most things).
     
  30. Edy

    Edy

    Joined:
    Jun 3, 2010
    Posts:
    2,044
    @adslitw Of course, I know Jbooth's work :) That's what Unity should have provided since the beginning, but instead they chose to go through all this multi-year SRP madness. The fact that it took ~4 years for them just to start considering this feature seriously denotes the severe lack of vision they're suffering. Unfortunately I can't but agree with @Supergeek's post.
     
    Last edited: Sep 28, 2021
    hippocoder likes this.
  31. SonicBloomEric

    SonicBloomEric

    Joined:
    Sep 11, 2014
    Posts:
    950
    What? Did you actually read the original post? The entire goal of this is to simplify and unify the current mess - not just "add new systems"...

    Apparently you didn't actually read it? These new systems they're adding have one goal: "to fix and consolidate the scriptable render pipelines [and even the BiRP]."

    It says a lot that @jbooth has reacted positively to this post and followed up with constructive and pointed questions.
     
    Last edited: Sep 27, 2021
  32. Supergeek

    Supergeek

    Joined:
    Aug 13, 2010
    Posts:
    102
    We've been promised simplifications and fixes and improvements how many times over the past decade?

    Look at this promise:
    Just wait another YEAR for a fix! And how well do you think this new system will be implemented? As well as the other new systems implemented in the past few years? Completely bug-free, I'm sure... just like the new new UI system, and the Jobs system that's still in development after 3 years.

    Says it all right there. Fixing the broken things isn't a priority. Or maybe it's just beyond the ability of their engineers to fix the spaghetti code they've got to work with. Not to knock the engineers; if management won't prioritize fixes, the front line developers can't just go wild west and refuse to do what management tells them to do. And doing fixes of this scope "in their spare time" just isn't feasible, no matter what your skill level.

    Since the new CEO took over, the impression I've gotten is that he's pushing the Unity developers hard to implement all the buzz words and market to business instead of indie developers like they used to, and giving their engineers little time to work on the massive technical debt they've incurred by slapping new systems over old broken ones. They've given a big middle finger to the indie developers that built the company for them.
     
    hippocoder and Edy like this.
  33. SonicBloomEric

    SonicBloomEric

    Joined:
    Sep 11, 2014
    Posts:
    950
    Because I actually read the original post, I am fairly confident that the new system will be well-implemented. This wasn't a "brand new solution" that someone cooked up and decided to make the hot new thing to distract Unity's graphics team with. They took heavy doses of inspiration from @jbooth's Better Shaders system. And from the conversation between @jbooth and the Unity devs, it appears as though they are building a system that improves upon some of the areas where Better Shaders cut some corners in order to be feasible for a one-person team to build part time.

    First, nowhere did anyone say that the SRP teams were going to drop everything and work only on this new unifying technology. My expectation is (and @marcte_unity and @aleksandrk can chime in and correct me if I'm wrong) that the SRP teams will continue to add features and fix bugs as they have been while some folks on the teams work to build this unifying technology.

    Second, to a very great many Unity users, the fact that you cannot easily write a shader for an SRP by hand, let alone in an SRP-portable way, is a massive problem. Specifically, it is the massive problem that this proposes to fix. If you have a selection of bugs/issues that are not this problem, then it may be helpful to post a new forum post or file a bug report (or both) to discuss such problems and raise awareness.

    Oh? Where, then, is Unity's Nanite response? Why didn't they drop everything a year ago and build a new SRP that focuses on similar "next-gen" rendering techniques?

    But... this entire topic is about fixing the "massive technical debt they've incurred". Did you not read the post? This single new system would effectively enable everyone to write custom, hand-written shaders in a way that is more flexible than what we had "back in the good ole days" with the Built-In Render Pipeline prior to the SRPs.

    If you have some criticism where the proposed system falls short for your workflow in some way, then this is an excellent place to post it!

    However, if you're simply getting frustration off your chest, then is it safe to assume that you've done so and we can go back to making this thread about the proposed solution?
     
    ledshok, lilacsky824 and JoNax97 like this.
  34. AcidArrow

    AcidArrow

    Joined:
    May 20, 2010
    Posts:
    8,259
    "Over the next year we'll start releasing previews" sounds like a timeline of at least 3 years from now for a "production ready" version of this feature, am I wrong in that assumption?
     
    GliderGuy, florianBrn, OCASM and 2 others like this.
  35. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,072
    Most likely not, with another few years after that until the vast majority of Unity users are on versions where it's included. The problem is that Unity spent 5 years digging themselves into a hole by not solving this problem ahead of time, and now they need to both write this new stack and refactor the shader graph and URP/BiRP/HDRP shaders to run off the new system. Combine that with the massive bureaucratic overhead of a large company/team, backwards compatibility with old data, a long QA/Branch Merge process, and a bit of not-invented-here syndrome, and basically Better Shaders is the only viable solution for something like the asset store for the next 5 years, give or take a year.

    That said, eventually there's a chance they ship a more robust or feature complete system than I have. I would suspect that they might not support some of the things that I do (centroid interpolation, multi-stacking, etc), but will likely cover a wider range of shader templates (decals, HDRP specific features like hair shaders, etc). Further, I'll be free of having to reverse engineer all their shader code every year, which is the primary reason I don't add those templates and only focus on cross pipeline shaders.

    In many ways, this is one of the reasons I'm always in favor of Unity focusing on features which improve the core of the engine over things like front end tooling, and supporting things which would help the asset store become a more professional/supported/profitable space for solutions. Small teams can move much faster than Unity can, as well as provide lots of variations on possible approaches to front end tooling (just look at how many approaches to terrain generation there are, each with various advantages). Unity has no doubt spent 50x what Amplify has spent on a shader graph, with more questionable results. I wrote and shipped a next gen surface shader system in a few weeks of work. And while I know Unity wants to become more like Unreal, with top quality tools and workflows as default, IMO that works against one of Unity's strengths, which has always been flexibility. And to be quite frank, I have many times said right to Unity's teams that they've never gotten a front end tool past the 90% mark, and they still haven't, so there's something in the DNA of the company's approach which just isn't good at that - and in my career I've found that it's really hard to fix things in the core culture of how a company produces things, and it's better to double down on what they do well.

    In the end, I just want this fixed. I predicted this problem when I was pitched SRPs at GDC in 2017 or something, and I was absolutely right. And I've spent the last several years likely becoming a bad word around Unity's office for driving this issue. Now they'll spend millions over the next few years digging their way out of it. But that's just kinda what it is now - like it or not, they are chasing a sunk cost fallacy with their SRP decisions, and that's going to take a lot of time to retrofit.
     
    Last edited: Sep 28, 2021
  36. marcte_unity

    marcte_unity

    Unity Technologies

    Joined:
    Sep 24, 2018
    Posts:
    11
    @Supergeek, @Edy, and others - I hear you. Unity has changed a lot, and it hasn't been easy in a lot of ways. Thanks for the honest feedback, and I hope you'll give our new system a chance when you're ready.

    You are correct sir.

    A not-insignificant benefit of starting with the Foundry layer! By releasing the stable API first, we're setting everyone up for success because whatever gaps you see in our offering, you can fill them yourself by customizing the tools to meet your specific needs.

    If the feature you're looking for is "the next Surface Shaders" in an LTS version, that's probably a pretty good guess. This is an incremental plan where we will release production-ready features that build on each other, starting with the Foundry API. We've rushed a lot in the last few years, and it hasn't worked out too well...for any of us. We're going to communicate our plans, show you our work as we go, iterate from your feedback, and deliver work we're confident in.

    Thanks to everyone for your input so far, and we look forward to the ongoing dialog as we share more concrete pieces of what we're working on.
     
    GliderGuy, AcidArrow, Edy and 4 others like this.
  37. newguy123

    newguy123

    Joined:
    Aug 22, 2018
    Posts:
    794
    Save some time and money and pay jbooth a million bucks to do all this next month :p

    ...but seriously though, the new system sounds amazing. Let's hope it will be everything we all want it to be.
     
    ledshok and Edy like this.
  38. sabint

    sabint

    Joined:
    Apr 19, 2014
    Posts:
    20
    Thanks for replying @marcte_unity . Here's an example of why the convention of needing `void function_float(..., out float retVal)` is killing me. I have a library of re-usable shader functions that I share between projects.
    Take for example this function:

    Code (CSharp):
    float random3D(float3 p) {
      ...
      return somevalue;
    }
    If I want to use this function in the shader graph as a custom function, I have to do this:
    Code (CSharp):
    float random3D(float3 p) {
      ...
      return somevalue;
    }

    void random3D_float(float3 p, out float retVal) {
      retVal = random3D(p);
    }
    It also makes reusing code harder. Example:
    Code (CSharp):
    // Normally
    float3 pos = pos + random(p) * dir;

    // With Unity-mandated syntax, extra statements are needed for every call.
    float r;
    random_float(p, r); // HLSL out parameters take no keyword at the call site
    float3 pos = pos + r * dir;
    Also, there are excellent shader libraries on github, like Lygia, designed with the goal of being reusable across platforms.
    While I can easily use these functions via `#include "xyz.hlsl"` in unlit shaders or ShaderLab, none of them can be used in the Shader Graph without modification.

    TLDR: it is a very bad pattern for Unity to require developers to use this kind of syntax.
     
    OCASM, hippocoder, Edy and 2 others like this.
  39. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,072
    So will this new system use some kind of AST analysis on variants to determine feature usage in the various structures (appdata, v2f, etc.)?

    Right now in Better Shaders it follows the rule of "If you type .VertexColor, I assume you use it" ignoring that it might only be for a specific shader variant. Then I allow the user to optimize this through various options. For instance, they can:

    Code (CSharp):
    // in a vertex tint shader
    BEGIN_OPTIONS
        VertexColorRequire "_VERTEXTINT"
    END_OPTIONS

    // in a vertex mask shader
    BEGIN_OPTIONS
        VertexColorRequire "_LAYERVERTEXMASK"
    END_OPTIONS
    and the resulting code will be:

    Code (CSharp):
    struct VertexOutput
    {
        #if _VERTEXTINT || _LAYERVERTEXMASK
            float3 VertexColor;
        #endif
    };
    This works, but if someone adds a third shader that uses .VertexColor and doesn't make this contract, the shader generator has to remove this optimization (and throw a warning) otherwise the shader will break in some cases, as VertexColor wouldn't always be there.

    This is all fine for my use cases, but it gets more complex with things like texcoords, which the lighting system sometimes uses depending on which pipeline and lighting modes you're in. Ideally each variant's structures would contain only what that variant/stage actually needs, which would require either generating those structures per variant, or successfully constructing an #if/#endif whose conditions match the variants.
     
    JoNax97, Ruchir and SonicBloomEric like this.
  40. aleksandrk

    aleksandrk

    Unity Technologies

    Joined:
    Jul 3, 2017
    Posts:
    2,136
    @jbooth yes, we want this system to make such optimizations automagically. It may not be there right from the start, though - we'll see about the timeline.
     
  41. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,072
    So another thought: if we're AST'ing the shader, doing analysis on it, and outputting proper code, can we fix the sampler issues in Unity as part of this? The issues I'm talking about include:

    1. Sampler settings are now tied to textures, but a sampler gets stripped if you never sample its source texture with it, even when another texture is still sampled with that sampler.

    Code (CSharp):
    TEXTURE2D(_Foo);
    TEXTURE2D(_Bar);
    SAMPLER(sampler_Foo); // will pull its sampling state from _Foo

    o.Albedo = SAMPLE_TEXTURE2D(_Foo, sampler_Foo, uv);
    #if _DEBUG
        o.Albedo = SAMPLE_TEXTURE2D(_Bar, sampler_Foo, uv);
    #endif
    On Windows (but not on Mac), this will complain that sampler_foo is missing (not sampler_Foo). This is because sampler_Foo has been stripped in the debug variant: the compiler stripped the code that samples the texture _Foo, didn't realize that the _Bar texture sample uses its sampler, and so stripped the sampler as well.

    This is very non-obvious until you've encountered it enough. And in a complex shader, you end up just always sampling from albedo and using the result somehow to avoid this.
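    To make the hack concrete - this is just a sketch of the "always sample it anyway" pattern, not a recommendation - you force a data dependency on the _Foo sample so the compiler can't eliminate it:

    Code (CSharp):
    half4 albedo = SAMPLE_TEXTURE2D(_Foo, sampler_Foo, uv);
    #if _DEBUG
        // Keep a dependency on the _Foo sample so sampler_Foo survives stripping.
        albedo = SAMPLE_TEXTURE2D(_Bar, sampler_Foo, uv) + albedo * 0;
    #endif
    o.Albedo = albedo;

    Whether a given compiler folds the `albedo * 0` term away anyway varies by platform and optimization level, which is exactly why this whole class of workaround is fragile.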

    2. I want to share samplers as much as possible, but still have user control over settings.

    Right now, I share samplers as much as possible because, well, MicroSplat uses a lot of textures. So the user sets, say, albedo to bilinear and normal to trilinear, but sees no difference because albedo/normal share the same sampler.

    ----

    Anyway, this all feels like it needs a big cleanup and is obviously a throwback to DX9-style samplers, which were directly associated with textures. But what we end up with now is a mess, where the shader author has to be really careful about sharing samplers and runs into seemingly nonsensical errors, and the user has a bunch of controls on textures that may or may not work. In most cases, the number of sampler-setting combinations a shader actually uses is fewer than 16 - it's usually just clamped vs. unclamped, plus possibly some filtering options.
     
    OCASM, Jes28 and GliderGuy like this.
  42. aleksandrk

    aleksandrk

    Unity Technologies

    Joined:
    Jul 3, 2017
    Posts:
    2,136
    @jbooth you can also use inline sampler state, but you have to hardcode it.
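    For reference, the hardcoded form looks like this - Unity parses the inline sampler's name for filter and wrap modes (a minimal sketch):

    Code (CSharp):
    Texture2D _MainTex;
    // Name is parsed by Unity: "point" = filter mode, "clamp" = wrap mode.
    SamplerState my_point_clamp_sampler;

    half4 frag(float2 uv : TEXCOORD0) : SV_Target
    {
        return _MainTex.Sample(my_point_clamp_sampler, uv);
    }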

    Note that separate sampler state is supported on a subset of platforms - for example, there's no support on GLES. The compiler won't complain (and I'd like to fix that part), but it will use the state specified on the texture.

    The issue with the compiler stripping the texture may be solvable without code analysis - I'll check that tomorrow.
     
    bac9-flcl likes this.
  43. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,072
    Yeah, and believe me I've thought about exposing this stuff in the material myself (and do in a few places in MS). With MS it's achievable because it can regenerate the code, but it's much nicer to not have to write those frameworks, and with the new keyword limit it's much easier to not end up in code-gen land. I could do it via keywords/macros, but then you'd end up generating 100 megs of shader variants because someone flipped some setting that should be settable per texture, so that seems like a very bad idea.

    Those platforms are dead to me - but I'd guess that with embedded devices and cheap phones, Unity has to deal with them for quite a while longer.

    My understanding from Aras was that it was somewhere in the DX compiler and couldn't be fixed. That said, it's bitten me and users of Better Shaders quite a few times, and as shaders get more modular (stackables, etc) it, and the sampler count limit, becomes a larger issue.

    One thing that might make sense is allowing the compiler to handle inline samplers declared multiple times gracefully (thinking in separate combined shader stackables). That way, if 5 different things declare "my_point_clamp_sampler", it's like "great, just declare that once and we'll all use it". Then the code works when combined or independent, and you get optimal sampler counts by just following the convention.
     
    bac9-flcl and Invertex like this.
  44. aleksandrk

    aleksandrk

    Unity Technologies

    Joined:
    Jul 3, 2017
    Posts:
    2,136
    The stripping happens in the compiler, definitely. I just want to double-check that there's no way around that. If that's the case, this cannot be fixed whether we analyse the shader code or not.

    That's a good idea! Thanks :)
     
    Last edited: Oct 18, 2021
  45. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,072
    Thoughts on lighting customization:

    So I've generally steered away from changing the lighting models in Unity for assets (done it for projects a few times). For assets, there's too much risk (changing code, lots of lighting mode combinations) and potential confusion (only works in forward, etc). However, I have some thoughts on how this might work in a more modular system like we are talking about here. Clearly most things done to make this more modular will still have to be done per pipeline and rendering mode, because the code is pretty tightly coupled. Further, this is where Unity does most of their changes, and you can find deprecated functions in, say, the URP forward lighting code already.

    There's a number of things someone might want to change. Usually it's:

    - Enable/Disable some feature (vertex lighting, fog, etc)
    - Change the BRDF functions
    - Modify or replace the input or output data of one of these functions (Posterize light data, replace baked data with own)

    It seems to me that a fair bit of customization could be done without having to write an entirely new template. For instance, one approach would be that if the compiler detects you've written a LightingPhysicallyBased function, it calls yours instead of the built-in one (and errors if you've provided multiple). Likewise, things like GetMainLight and GetAdditionalLight could call out to a ProcessLightData function if it exists, so you can posterize the attenuation of a light before it's used. And instead of having an OverrideGI property like HDRP does, you could just override the function which gets that data with your own. This all seems relatively trivial once you are in code generation land.
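    To make that concrete - ProcessLightData here is a hypothetical hook, not an existing API, and the field names are just URP's current ones - the generator could splice in something like:

    Code (CSharp):
    // Hypothetical: if the generator finds this function, every light fetched
    // via GetMainLight / GetAdditionalLight is passed through it first.
    Light ProcessLightData(Light light)
    {
        // Posterize attenuation into 3 bands for a toon look.
        light.distanceAttenuation = floor(light.distanceAttenuation * 3.0) / 3.0;
        return light;
    }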

    Now there's some argument to be made for "Just write your own template if you want to customize lighting", which is valid and sometimes the right answer - but that will always be a brittle solution, whereas it seems like some of these things can be safely overridden in a modular way, allowing a "toon shading" stackable to exist without effectively carrying the weight of managing an SRP template, and in a way that doesn't break on most Unity releases. To further add to the safety, I'd suggest all these functions take structures as inputs (and possibly outputs) so that data can be easily added to them without changing the API. (Seems like this has already been done for a number of URP forward lighting conventions.)
     
    Kirsche, rz_0lento, OCASM and 5 others like this.
  46. marcte_unity

    marcte_unity

    Unity Technologies

    Joined:
    Sep 24, 2018
    Posts:
    11
    Thanks, Jason, that's another great use case to call out, and it fits into our overall design that we're iterating on right now. "Just write your own template" may be where we start, but only so we can prove out a solid foundation on which to build more powerful abstractions (like specific function overrides). We're excited to share more as our plans solidify - and really grateful for the continued engagement. Keep the use cases coming, they really help us ensure we're on the right path.
     
    Jes28, Edy, BOXOPHOBIC and 3 others like this.