URP docs on writing shaders?

Discussion in 'Universal Render Pipeline' started by Lex-DRL, Feb 14, 2020.

  1. Lex-DRL

    Lex-DRL

    Joined:
    Oct 10, 2011
    Posts:
    137
    TL;DR:
Unity graphics team, please document the shader part of the new URP. Shader Graph is great, but manual shader programming is still a thing, especially for complex shaders, and especially on old mobile platforms (where each shader instruction counts, and we all know that the majority of mobile players have really outdated hardware).

    ----------------------------

I've read what seems to be every documentation page / blog post about URP available and have found no info on how to write your own custom shaders for the new render pipeline. Is there a huge tutorial I'm somehow missing?

Yeah, there are some comments in the shaders shipped with the URP package itself, but a lot of things are still very unclear.
For example, what are the LightModes of the passes I need to define, and what are they used for? I see "Universal2D", "NormalsRendering" and "UniversalForward" in the "Universal Render Pipeline/2D/Sprite-Lit-Default" shader, but it's unclear what exactly these passes do and why they're there.

Why do I need the Universal2D pass if I already have UniversalForward? How are they different?
Why is NormalsRendering there? Is it a separate pass to generate normals for the deferred path? If so, where are all the other passes? If not, why would I calculate normals in a separate extra pass rather than in the main one?

I thought comparing one built-in URP shader to another would clarify this, but it actually raised even more questions. If I compare this shader with "Universal Render Pipeline/Lit":

What's the DepthOnly pass used for?
Why doesn't it have the same NormalsRendering pass the other shader has?
Why does neither of them have any pass related to albedo/metallic/roughness rendering for the deferred path?
Where can I find a full list of the LightModes picked up by URP by default?
Which of classic Unity's built-in variables/macros are available in URP shaders, and which aren't?

All Google finds on the subject are links to the same shaders on GitHub and a few Chinese websites. That's it.
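To illustrate what I mean, here's roughly the pass structure I'm trying to understand, sketched from the shipped "Universal Render Pipeline/Lit" shader (the comments are my guesses, not documentation, and the HLSL bodies are omitted):

```shaderlab
// Sketch of a multi-pass URP shader skeleton; pass tags are taken from
// the shipped URP shaders, but what each pass actually does is my guess.
Shader "Custom/MinimalURPLitSketch"
{
    SubShader
    {
        Tags { "RenderPipeline" = "UniversalPipeline" }

        Pass
        {
            // Main forward pass: lighting, albedo, etc. (presumably)
            Tags { "LightMode" = "UniversalForward" }
            // ... HLSLPROGRAM with vert/frag ...
        }
        Pass
        {
            // Depth prepass? Used for the camera depth texture? Unclear.
            Tags { "LightMode" = "DepthOnly" }
        }
        Pass
        {
            // Shadow-caster pass; at least this one is familiar.
            Tags { "LightMode" = "ShadowCaster" }
        }
    }
}
```

Which passes are required, which are optional, and what URP does with each of them is exactly the part that's undocumented.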
     
    Last edited: Feb 14, 2020
  2. BattleAngelAlita

    BattleAngelAlita

    Joined:
    Nov 20, 2016
    Posts:
    151
The URP sources are on GitHub; all those things are pretty easy to understand.
     
  3. andybak

    andybak

    Joined:
    Jan 14, 2017
    Posts:
    340
Don't excuse the lack of docs. "Read the source" is meant to be advice in addition to "Read the docs".
     
    Moritz5thPlanet and hippocoder like this.
  4. Lex-DRL

    Lex-DRL

    Joined:
    Oct 10, 2011
    Posts:
    137
Pardon my French, but... is it? Really?
I might be looking in the wrong place (I've looked here, for example, and it is NOT documented at all, nor is the code self-explanatory), but unless you manage to learn the entire general SRP (a MUCH bigger thing), it's not "pretty easy to understand" at all. And SRP as a whole is an odd thing to force users to learn if they don't intend to extend the RP yet. Yes, learning the SRP approach is "the right thing to do" if you migrate to it, but it shouldn't be required if you just need to update your custom shaders to make them work with URP. Am I missing something here?

Don't get me wrong, I believe the new SRP approach is great. But the lack of documentation is definitely an issue, especially given that SRP is already treated as both a stable and a default part of the engine.

    P.S.: @BattleAngelAlita, even though you said it's pretty easy to understand, I still haven't figured out what the above passes are used for. Can you point me to a place where: a) all the supported pass types are listed; b) it's explained what they actually should do?
     
    Last edited: Feb 17, 2020
  5. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    3,986
Back when I was working at a studio with full source access and expensive enterprise support, I asked enterprise support for documentation on the shader side of the light pipeline. The response I got was "the shader code changes too fast to document", which basically sounds like graphics coders getting away with excuses because no one understands what they do. Yet now many of us are required to keep up with three sets of undocumented and rapidly changing shader code, on what has become the most unfriendly shader pipeline in the business.

Nice job, Unity.
     
    tehusterr, Mese96, andybak and 3 others like this.
  6. colin299

    colin299

    Joined:
    Sep 2, 2013
    Posts:
    154
I started learning to write URP shaders using this template:
https://gist.github.com/phi-lira/225cd7c5e8545be602dca4eb5ed111ba
It is quite easy to understand compared to the URP source on GitHub.

This template gives you almost everything you need. For example, I changed the lighting model to do NPR, and you can add lots of logic on top of it, like vertex animation + recalculated normals, or planar reflections, all starting from the above template.

But I assume the shader will break each time Unity upgrades URP. Unity recommends using Shader Graph so that upgrading is safe, but I found it very limited: I can't even do multi-pass or multi_compile. I can't imagine how to create a game using only the shader graph; to me it is just not possible without code.
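For anyone curious, swapping the lighting model in a template like that is roughly this kind of change. The function below is my own illustration of a toon-style replacement for a PBR lighting call, not the template's actual API; the `Light` struct fields follow URP's Lighting.hlsl conventions:

```hlsl
// Hypothetical toon-style lighting function, the kind of thing you'd
// drop in place of the template's PBR lighting call. Names here are
// illustrative; only the Light struct fields come from URP itself.
half3 ToonLighting(half3 albedo, half3 normalWS, Light light)
{
    half NdotL = saturate(dot(normalWS, light.direction));
    // Hard two-band ramp instead of a smooth Lambert term.
    half band = NdotL > 0.5h ? 1.0h : 0.3h;
    return albedo * light.color * band * light.shadowAttenuation;
}
```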
     
    Lex-DRL likes this.
  7. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    26,436
I agree, Shader Graph is currently limited for many things. I have to copy the shader, go add a few things, change a few precisions (WebGL, I see you; ES 2.0 precision, I see you too!), and right now it's mostly suited to doing tweaks to the PBR or unlit shaders.

You can do a lot of variation, but that's all it is for now: variation. And that's mostly enough for me. There are always those shaders that need a specific ColorMask or Offset and so on, and these require you to copy the shader source and edit it every single time.

    I don't know why the graph is so limiting in these simple fundamental areas. I don't get it.

    Is it because we're too stupid to be allowed such power?
     
    andybak and neoshaman like this.
  8. colin299

    colin299

    Joined:
    Sep 2, 2013
    Posts:
    154
If you want to develop a simple uber shader in URP:
-Shader Graph = too limited: can't control multi-pass or multi_compile, can't even control stencil/ZTest/Cull Front. Not an option; don't waste time learning it / supporting it unless you are providing nodes for your customers.
-Using an asset like Amplify Shader Editor = it works, you can do most things in URP, and it is surprisingly good. If you don't want to code, go for it! But it's still not great for a programmer.
-Surface shaders = they don't exist in URP.
-Writing your own vert/frag shader = complete control, but not easy, and it breaks every time Unity upgrades URP.

I chose "writing your own vert/frag shader". I am willing to accept doing shader fixes every time I upgrade URP, because I just don't have another option.
     
    Lex-DRL likes this.
  9. BattleAngelAlita

    BattleAngelAlita

    Joined:
    Nov 20, 2016
    Posts:
    151
    Lex-DRL
In the pre-surface-shader era, writing shaders was the same: look at the sources, write similar things.
     
  10. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    3,986
Except Unity closed off the API for writing custom Shader Graph nodes, so now you basically have to hack the assemblies and then fix it every time they change the internals of the shader graph. I pulled my shader graph node asset rather than deal with this anymore. It's just a crap fest for shaders right now in Unity, and while I keep getting lip service about fixing it from individual Unity employees who aren't actually responsible for shaders in Unity, the rest of the team is either quiet or tells us to just use the shader graph.
     
  11. colin299

    colin299

    Joined:
    Sep 2, 2013
    Posts:
    154
Do you mean Unity removed support for custom MASTER nodes?
I see an asset doing really well using the Custom Function node alone (which is just a node that points to an .hlsl file and calls a function in it; you can also write include "xxx.hlsl"):
https://assetstore.unity.com/packages/vfx/shaders/lux-urp-lwrp-essentials-150355

And he attempts to remove all logic from the PBR master node using this setup:
https://medium.com/@larsbertram1/lwrp-and-custom-lighting-in-shader-graph-6a7c48008a1d

So now he is using a PBR master node, but with results similar to a custom master node.
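For anyone unfamiliar with the Custom Function node: in file mode it just points at an .hlsl file like the one below and calls the function whose name matches the node's "Name" field, with a `_float` or `_half` suffix for the target precision. The file and function names here are my own illustration:

```hlsl
// MyCustomLighting.hlsl -- a file a Custom Function node could reference.
// The node's "Name" field would be "SimpleLambert"; Shader Graph appends
// the precision suffix when it resolves the function.
void SimpleLambert_float(float3 Normal, float3 LightDir, float3 LightColor,
                         out float3 Out)
{
    // Plain Lambert diffuse term; inputs/outputs map to the node's ports.
    Out = LightColor * saturate(dot(normalize(Normal), normalize(LightDir)));
}
```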
     
  12. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    3,986

No, they removed support for custom nodes. Period. They basically made the whole API internal, so you can't inherit from the classes, etc. Yes, you can write some HLSL and call it from your graph with what they are now calling a custom function node, but what you can't do is create an actual node like the ones they write (without hacking the assemblies or providing your own version of the entire shader graph code with your modifications), because the C# code is closed off. This means your node cannot have options via dropdowns and other things which modify the code it's going to produce, i.e. the "IsNormal" checkbox on a texture sampling node to decode the normal, or the "object space" modifier that lets you decide what space an input is in. Without these, my Stochastic node could still be done by turning every combination of options it provides into a unique HLSL file with copies of 99% of the code and a "custom function node" for each one, which means you'd end up with nodes like:

    StochasticSampler_LumBlend_NotNormal
    StochasticSampler_LumBlend_IsNormal
    StochasticSampler_HeightBlend_RChannel_NotNormal
    StochasticSampler_HeightBlend_RChannel_IsNormal
    StochasticSampler_HeightBlend_GChannel_NotNormal
    StochasticSampler_HeightBlend_GChannel_IsNormal
    StochasticSampler_HeightBlend_BChannel_NotNormal
    StochasticSampler_HeightBlend_BChannel_IsNormal
    StochasticSampler_HeightBlend_AChannel_NotNormal
    StochasticSampler_HeightBlend_AChannel_IsNormal
    StochasticSampler_LumBlend_NotNormal_Chained
    StochasticSampler_LumBlend_IsNormal_Chained
    StochasticSampler_HeightBlend_RChannel_NotNormal_Chained
    StochasticSampler_HeightBlend_RChannel_IsNormal_Chained
    StochasticSampler_HeightBlend_GChannel_NotNormal_Chained
    StochasticSampler_HeightBlend_GChannel_IsNormal_Chained
    StochasticSampler_HeightBlend_BChannel_NotNormal_Chained
    StochasticSampler_HeightBlend_BChannel_IsNormal_Chained
    StochasticSampler_HeightBlend_AChannel_NotNormal_Chained
    StochasticSampler_HeightBlend_AChannel_IsNormal_Chained

God forbid I want to add another option to it. They made this change after many of us released nodes on the Asset Store, so we all had to hack the assemblies to keep everything working. After they broke the interface again, I got tired of dealing with it, removed the asset from the store, and now only sell it for Amplify's shader graph.
     
    goncalo-vasconcelos and colin299 like this.
  13. colin299

    colin299

    Joined:
    Sep 2, 2013
    Posts:
    154
I see: you can't create options for customers in a custom node and can't do multi_compile, so you end up copying code for each keyword combination. Sounds like hell to me. But how did you support it? Did you write code to generate shader code for each combination?

I don't know which happened first, but if they actually first told people "shader graph is production-ready" and then removed an important API after lots of people had developed things using it, I will blacklist that product forever.
     
  14. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    3,986
Yeah, shader graphs just write code. It's not even a multi-compile thing, just basic code gen. If I want to give the user the option of which channel to pull some data from in a texture, I can expose an enumeration for them to choose from, then have the node write out .r, .g, .b, or .a at the end of the sample. Without this, I need 4 nodes for that one option.

For instance, Unity's texture sampling node has a type enumeration. When you set it to Normal, it generates the code to call the UnpackNormal function on the resulting sample before outputting it.

They closed off the API in 2019.2, well after LWRP was "final". Pretty sure it wasn't in preview at that point.
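In other words, the node's C# just decides which line of HLSL ends up in the generated shader. A sketch of what that codegen choice boils down to (not actual Shader Graph output):

```hlsl
// What a Type/channel enum on a sampling node amounts to in the
// generated code: one variant is emitted at graph compile time,
// no multi_compile or runtime branch involved.

// Type == Default:
half4 color = SAMPLE_TEXTURE2D(_MainTex, sampler_MainTex, uv);

// Type == Normal: same sample, plus a decode step appended by codegen:
half3 normalTS = UnpackNormal(SAMPLE_TEXTURE2D(_BumpMap, sampler_BumpMap, uv));
```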
     
    goncalo-vasconcelos and colin299 like this.
  15. Lex-DRL

    Lex-DRL

    Joined:
    Oct 10, 2011
    Posts:
    137
To be fair, they finally did what we were all asking for: a way to have true current-/next-gen graphics while still being able to run fast on low-end hardware. It's a really big task, and for the most part they've nailed it. It took UT almost a decade to get there, though...

    Huge thanks, man! At a glance, it looks like exactly what I was looking for.

And in the pre-GPU era literally everything was done as a mind-f*ing hack. Remember how perspective was achieved in the original DOOM?
Not a valid argument. Anything can be justified with "remember how bad it was back then?"

I do the same, but I've made my own "interaction layer": a separate cginc (a few of them, actually) containing all the functions that deal with Unity's built-in stuff that might change. From the actual shader code, I call only my own functions.
This way, I only need to update my code in one place when upgrading to a newer Unity version.
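In practice such a layer is just a thin wrapper include; something along these lines (the `My_` names and the `MY_PIPELINE_URP` define are mine for illustration, only the wrapped built-ins come from Unity):

```hlsl
// MyShaderLib.hlsl -- the only place that touches Unity's own includes.
// When an upgrade renames or moves a built-in, only this file changes.
#if defined(MY_PIPELINE_URP)
    #include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Lighting.hlsl"
    // URP path: main light comes from GetMainLight().
    half3 My_MainLightColor() { return GetMainLight().color; }
#else
    #include "UnityCG.cginc"
    #include "Lighting.cginc"
    // Built-in RP path: main light color is the _LightColor0 global.
    half3 My_MainLightColor() { return _LightColor0.rgb; }
#endif
```

Actual shader passes then call only `My_*` functions and never touch Unity's includes directly.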
     
    goncalo-vasconcelos likes this.
  16. colin299

    colin299

    Joined:
    Sep 2, 2013
    Posts:
    154
Are you writing something like your own surface shader (extracting all the common code/lighting into your include files), but for URP only?
So every new shader can be written with no/minimal duplicated/boilerplate code? (Just like writing surface shaders in the built-in RP.)
     
  17. KokkuHub

    KokkuHub

    Joined:
    Feb 15, 2018
    Posts:
    143
IMO they didn't "nail" it at all: they botched it. You have to choose between "next-gen" graphics and running on low-end hardware, and there is no path for switching between one and the other without recreating all your materials and lighting setup manually.

    SRP's biggest problem is the lack of a strong shader generation backend. If you create a custom SRP, you cannot use shaders written for another. You can't even use graphs from another SRP. For SRP to really take off, it needs a system to bridge a shader/graph and the SRP's internal implementation.
     
    Last edited: Feb 21, 2020
    neoshaman likes this.
  18. andybak

    andybak

    Joined:
    Jan 14, 2017
    Posts:
    340
    I'm currently working on an app that supports both URP and HDRP and it's not perfect but I'm getting a fair amount of reuse from shaders, lights etc.

1. Simple shader graphs use a master node that works on either pipeline.
2. Otherwise, put all nodes apart from the master node into a subgraph and have top-level graphs that differ only in the master nodes.
3. Lights need adjusting when switching pipelines. I'm still working out whether I need two sets of lights or whether I can simply apply a parameter adjustment to each light based on the current pipeline.

    It's far, far from perfect and my requirements are unusual (it's not a game) but it's kinda working.

The thing I'm more worried about is the seeming lack of attention to custom pipelines, which seems to negate the main stated reason for switching to this system. The VFX Graph has hardcoded support for the official pipelines, which means no VFX for you if you use a custom pipeline. There are probably other things you lose, but this is enough for me to assume that custom pipelines aren't a realistic option.

    That seems very short-sighted.
     
    Last edited: Feb 21, 2020
  19. KokkuHub

    KokkuHub

    Joined:
    Feb 15, 2018
    Posts:
    143
Shaders should all use a common abstraction that an SRP can consume to generate its final shaders. The lack of such a thing is the biggest problem with SRPs going forward, and it makes writing a custom SRP very unappealing.
     
    jbooth likes this.
  20. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    3,986
Custom SRPs are dead on arrival because Unity has hard-coded their tools to only work with their own SRPs to shortcut development. It's a real shame, because the idea is solid, but the vision to see it through has just been lacking. There would always be the chance that a custom SRP would be different enough that it wouldn't have any commonality with the traditional pipelines (something like Dreams comes to mind), but with a solid abstraction layer most things would at least run between most SRPs, and sometimes flawlessly.

Custom SRPs are very useful in things like mobile development, or when you really want a unique look, but each day writing one locks you out of more of Unity.
     
    hippocoder, Mese96, KokkuHub and 2 others like this.