
URP docs on writing shaders?

Discussion in 'Universal Render Pipeline' started by Lex-DRL, Feb 14, 2020.

  1. Lex-DRL

    Lex-DRL

    Joined:
    Oct 10, 2011
    Posts:
    137
    TL;DR:
    Unity graphics team, please document the shader part of the new URP. Shader Graph is great, but manual shader programming is still a thing, especially for complex shaders and especially on old mobile platforms (where every shader instruction counts, and we all know that the majority of mobile players have really outdated hardware).

    ----------------------------

    I've read what seems to be every documentation page / blog post about URP available and have found no info on how to write your own custom shaders for the new render-pipeline. Is there a huge tutorial I'm somehow missing?

    Yeah, there are some comments in the shaders shipped with the URP package itself, but a lot of things are still very unclear.
    For example, what are the LightMode tags of the passes I need to define, and what are they used for? I see "Universal2D", "NormalsRendering" and "UniversalForward" in the "Universal Render Pipeline/2D/Sprite-Lit-Default" shader, but it's unclear what exactly these passes do and why they are there.

    Why do I need the Universal2D pass if I already have UniversalForward? How are they different?
    Why is NormalsRendering there? Is it a separate pass to generate normals for the deferred path? If so, where are all the other passes? If not, why would I calculate normals in a separate extra pass and not in the main one?

    I thought comparing one built-in URP shader to another would clarify this, but it actually raised even more questions. If I compare this shader with "Universal Render Pipeline/Lit":

    What's the DepthOnly pass used for?
    Why doesn't it have the same NormalsRendering pass the other shader has?
    Why does neither of them have any pass related to albedo/metallic/roughness rendering for the deferred path?
    Where can I find a full list of the LightMode tags picked up by URP by default?
    Which of Unity's classic built-in variables/macros are available in URP shaders and which aren't?

    The only things Google finds on the subject are links to the same shaders on GitHub and a few Chinese websites. That's it.
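    For reference, this is roughly the pass layout I'm asking about (heavily stripped down, everything except the tags omitted):

    Shader "Universal Render Pipeline/2D/Sprite-Lit-Default"
    {
        SubShader
        {
            // Which of these does URP invoke, when, and for what?
            Pass { Tags { "LightMode" = "Universal2D" }       /* ... */ }
            Pass { Tags { "LightMode" = "NormalsRendering" }  /* ... */ }
            Pass { Tags { "LightMode" = "UniversalForward" }  /* ... */ }
        }
    }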
     
    Last edited: Feb 14, 2020
  2. BattleAngelAlita

    BattleAngelAlita

    Joined:
    Nov 20, 2016
    Posts:
    182
    The URP sources are on GitHub; all those things are pretty easy to understand.
     
  3. andybak

    andybak

    Joined:
    Jan 14, 2017
    Posts:
    358
    Don't excuse the lack of docs. "Read the source" is meant to be advice in addition to "Read the docs".
     
    Moritz5thPlanet and hippocoder like this.
  4. Lex-DRL

    Lex-DRL

    Joined:
    Oct 10, 2011
    Posts:
    137
    Pardon my French, but... is it? Really?
    I might be looking in the wrong place (I've looked here, for example, and it is NOT documented at all, nor is the code self-explanatory), but unless you manage to learn the entire general SRP (a MUCH bigger thing), it's not "pretty easy to understand" at all. And SRP as a whole is an odd thing to force users to learn if they don't intend to extend the RP yet. Yes, learning the SRP approach is "the right thing to do" if you migrate to it, but it shouldn't be required if you just need to update your custom shaders to make them work with URP. Am I missing something here?

    Don't get me wrong, I believe the new SRP approach is great. But the lack of documentation is definitely an issue, especially since SRP is already treated as a stable, default part of the engine now.

    P.S.: @BattleAngelAlita, even though you said it's pretty easy to understand, I still haven't figured out what the above passes are used for. Can you point me to a place where: a) all the supported pass types are listed; b) it's explained what they actually should do?
     
    Last edited: Feb 17, 2020
  5. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    4,192
    Back when I was working at a studio with full source access and expensive enterprise support, I asked enterprise support for documentation on the shader side of the light pipeline. The response I got was "the shader code changes too fast to document", which basically sounds like graphics coders getting away with excuses because no one understands what they do. Yet now many of us are required to keep up with three sets of undocumented and rapidly changing shader code, in what has become the most unfriendly shader pipeline in the business.

    Nice job, Unity.
     
  6. colin299

    colin299

    Joined:
    Sep 2, 2013
    Posts:
    160
    I started learning to write URP shaders using this template:
    https://gist.github.com/phi-lira/225cd7c5e8545be602dca4eb5ed111ba
    It is quite easy to understand compared to the URP source on GitHub.

    This template gives you almost everything you need. For example, I changed the lighting model to do NPR, and you can add lots of logic on top (vertex animation + recalculated normals, planar reflection, etc.), all starting from the above template.

    But I assume the shader will break each time Unity upgrades URP. Unity recommends using Shader Graph so upgrades are safe, but I found it pretty useless: I can't even do multi-pass or multi_compile, and I can't imagine how you would create a game using only Shader Graph. To me it is just not possible without code.
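
    For anyone who doesn't want to open the gist right away, the skeleton it boils down to is roughly this (just a sketch; the include path and macro names are the ones current URP versions ship and may move between releases):

    Shader "Custom/UrpUnlitSketch"
    {
        Properties
        {
            _BaseMap ("Base Map", 2D) = "white" {}
            _BaseColor ("Base Color", Color) = (1, 1, 1, 1)
        }
        SubShader
        {
            Tags { "RenderType" = "Opaque" "RenderPipeline" = "UniversalPipeline" }

            Pass
            {
                Tags { "LightMode" = "UniversalForward" }

                HLSLPROGRAM
                #pragma vertex vert
                #pragma fragment frag
                #include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl"

                // All per-material uniforms live in UnityPerMaterial so the SRP Batcher accepts the shader.
                CBUFFER_START(UnityPerMaterial)
                    float4 _BaseMap_ST;
                    half4 _BaseColor;
                CBUFFER_END

                TEXTURE2D(_BaseMap);
                SAMPLER(sampler_BaseMap);

                struct Attributes { float4 positionOS : POSITION; float2 uv : TEXCOORD0; };
                struct Varyings  { float4 positionCS : SV_POSITION; float2 uv : TEXCOORD0; };

                Varyings vert(Attributes IN)
                {
                    Varyings OUT;
                    OUT.positionCS = TransformObjectToHClip(IN.positionOS.xyz);
                    OUT.uv = TRANSFORM_TEX(IN.uv, _BaseMap);
                    return OUT;
                }

                half4 frag(Varyings IN) : SV_Target
                {
                    return SAMPLE_TEXTURE2D(_BaseMap, sampler_BaseMap, IN.uv) * _BaseColor;
                }
                ENDHLSL
            }
        }
    }

    From there, the lighting model, vertex animation and so on are just more code inside vert/frag.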
     
    Johannski and Lex-DRL like this.
  7. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    26,729
    I agree, Shader Graph is currently limited for many things. I have to copy the shader, go add a few things, change a few precisions (WebGL, I see you; ES2.0 precision, I see you too!), and right now it's just most suited to making tweaks to the PBR or unlit shaders.

    You can do a lot of variation, but that's all it is for now - variation. And that's mostly enough for me. There are always those shaders that need a specific ColorMask or Offset and so on, and these require you to copy the shader source and edit it every single time.

    I don't know why the graph is so limiting in these simple fundamental areas. I don't get it.

    Is it because we're too stupid to be allowed such power?
     
    andybak and neoshaman like this.
  8. colin299

    colin299

    Joined:
    Sep 2, 2013
    Posts:
    160
    If you want to develop a simple uber shader in URP:
    - Shader Graph = too limited; can't control multi-pass or multi_compile, can't even control stencil/ZTest/Cull Front. Not an option; don't waste time learning it / supporting it unless you are providing nodes for your customers.
    - Using an asset like Amplify Shader Editor = it works, you can do most things in URP, and it is surprisingly good. If you don't want to code, go for it! But it's still not great for a programmer.
    - Surface shaders = they don't exist in URP.
    - Writing your own vert/frag shader = complete control, but not easy, and it breaks every time Unity upgrades URP.

    I chose "writing your own vert/frag shader". I am willing to accept fixing shaders every time I upgrade URP, because I just don't have another option.
     
    Lex-DRL likes this.
  9. BattleAngelAlita

    BattleAngelAlita

    Joined:
    Nov 20, 2016
    Posts:
    182
    Lex-DRL
    In the pre-surface-shader era, writing shaders was the same: look at the sources, write similar things.
     
  10. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    4,192
    Except Unity closed off the API for writing custom Shader Graph nodes, so now you basically have to hack the assemblies and then fix it every time they change the internals of the Shader Graph. I pulled my Shader Graph node asset rather than deal with this anymore. It's just a crap fest for shaders right now in Unity, and while I keep getting lip service about fixing it from individual Unity employees who aren't actually responsible for shaders in Unity, the rest of the team is either quiet or tells us to just use the Shader Graph.
     
    Noisecrime, gigazelle and Recon03 like this.
  11. colin299

    colin299

    Joined:
    Sep 2, 2013
    Posts:
    160
    Do you mean Unity removed support for custom MASTER nodes?
    I see an asset doing really well using the Custom Function node alone (which is just a node that points to a .hlsl file and calls a function in it; you can also write include "xxx.hlsl"):
    https://assetstore.unity.com/packages/vfx/shaders/lux-urp-lwrp-essentials-150355

    And he attempts to remove all logic from the PBR master node using this setup:
    https://medium.com/@larsbertram1/lwrp-and-custom-lighting-in-shader-graph-6a7c48008a1d

    So he is still using a PBR master node, but the result is similar to a custom master node.
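
    For anyone who hasn't used it: the .hlsl behind a Custom Function node is just a plain void function with out parameters, and the function name has to end with the precision suffix the node is set to (MyToonLight here is a made-up example):

    // CustomLighting.hlsl - referenced from a Custom Function node in File mode.
    // The _float suffix matches the node's Float precision; a _half variant would serve Half precision.
    void MyToonLight_float(float3 Normal, float3 LightDir, float3 Albedo, out float3 Color)
    {
        // hard two-band toon ramp, just to show where the custom lighting code goes
        float ndl = saturate(dot(normalize(Normal), normalize(LightDir)));
        Color = Albedo * (ndl > 0.5 ? 1.0 : 0.25);
    }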
     
  12. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    4,192

    No, they removed support for custom nodes. Period. They basically made the whole API internal, so you can't inherit from the classes, etc. Yes, you can write some HLSL and call it from your graph with what they are now calling a Custom Function node, but what you can't do is create an actual node like the ones they write (without hacking the assemblies or providing your own version of the entire Shader Graph code with your modifications), because the C# code is closed off. This means your node cannot have options via dropdowns or other things which modify the code it's going to produce, e.g. the "IsNormal" checkbox on a texture sampling node to decode the normal, or the "object space" modifier that lets you decide what space an input is in. Without these, my Stochastic node could still be done by turning every combination of option it provided into a unique HLSL file with copies of 99% of the code and a Custom Function node for each one, which means you'd end up with nodes like:

    StochasticSampler_LumBlend_NotNormal
    StochasticSampler_LumBlend_IsNormal
    StochasticSampler_HeightBlend_RChannel_NotNormal
    StochasticSampler_HeightBlend_RChannel_IsNormal
    StochasticSampler_HeightBlend_GChannel_NotNormal
    StochasticSampler_HeightBlend_GChannel_IsNormal
    StochasticSampler_HeightBlend_BChannel_NotNormal
    StochasticSampler_HeightBlend_BChannel_IsNormal
    StochasticSampler_HeightBlend_AChannel_NotNormal
    StochasticSampler_HeightBlend_AChannel_IsNormal
    StochasticSampler_LumBlend_NotNormal_Chained
    StochasticSampler_LumBlend_IsNormal_Chained
    StochasticSampler_HeightBlend_RChannel_NotNormal_Chained
    StochasticSampler_HeightBlend_RChannel_IsNormal_Chained
    StochasticSampler_HeightBlend_GChannel_NotNormal_Chained
    StochasticSampler_HeightBlend_GChannel_IsNormal_Chained
    StochasticSampler_HeightBlend_BChannel_NotNormal_Chained
    StochasticSampler_HeightBlend_BChannel_IsNormal_Chained
    StochasticSampler_HeightBlend_AChannel_NotNormal_Chained
    StochasticSampler_HeightBlend_AChannel_IsNormal_Chained

    God forbid I want to add another option to it. They made this change after many of us released nodes on the Asset Store, so we all had to hack the assemblies to keep everything working. After they broke the interface again, I got tired of dealing with it, removed the asset from the store, and now only sell it for Amplify's shader graph.
     
    goncalo-vasconcelos and colin299 like this.
  13. colin299

    colin299

    Joined:
    Sep 2, 2013
    Posts:
    160
    I see: you can't create options for customers in a custom node and can't do multi_compile, so you end up copying code for each keyword combination. Sounds like hell to me. But how did you support it? Did you write code to generate shader code for each combination?

    I don't know which happened first, but if they actually first told people "Shader Graph is production-ready" and then removed an important API after lots of people had developed things using it, I would black-list that product forever.
     
  14. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    4,192
    Yeah, shader graphs just write code. It's not even a multi-compile thing, just basic code gen. If I want to give the user an option of which channel to pull some data from in a texture, I can expose an enumeration for them to choose, then have the node write out .r, .g, .b, or .a at the end of the sample. Without this, I need 4 nodes for that option.

    For instance, Unity's texture sampling node has a type enumeration. When you set it to normal, it generates the code to call the UnpackNormal function on the resulting sample before outputting it.

    They closed off the API in 2019.2, well after LWRP was "Final". Pretty sure it wasn't in preview at that point.
     
  15. Lex-DRL

    Lex-DRL

    Joined:
    Oct 10, 2011
    Posts:
    137
    To be fair, they finally did what we were all asking for: a way to have true current-/next-gen graphics while still being able to run fast on low-end hardware. It's a really big task, and for the most part they've nailed it. It took almost a decade for UT to make it, though...

    Huge thanks, man! At a glance, it looks like exactly what I was looking for.

    And in the pre-GPU era literally everything was done as a mind-f*ing hack. Remember how perspective was achieved in the original DOOM?
    Not a valid argument. Anything can be justified with: "Remember how bad it was back then?"

    I do the same, but I've made my own "interaction layer": a separate cginc (a few of them, actually) containing all the functions that deal with Unity's built-in stuff that might change. From the actual shader code, I call only my own funcs.
    This way I at least only need to update my code in one place when upgrading to a newer Unity version.
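
    To illustrate (the file and function names below are made-up examples, not my actual code), the "interaction layer" is nothing more than thin wrappers like this:

    // my_pipeline_bridge.hlsl - the ONLY place that touches Unity's built-ins directly.
    // Assumes URP's Core.hlsl / Lighting.hlsl are already included by the shader.
    // When an upgrade renames something, only these one-liners change, not the shaders.
    #ifndef MY_PIPELINE_BRIDGE_INCLUDED
    #define MY_PIPELINE_BRIDGE_INCLUDED

    float4 My_ObjectToClipPos(float3 positionOS)
    {
        // built-in RP version would be: return UnityObjectToClipPos(positionOS);
        return TransformObjectToHClip(positionOS); // URP
    }

    half3 My_MainLightDirection()
    {
        // built-in RP version would read _WorldSpaceLightPos0.xyz
        return GetMainLight().direction; // URP
    }

    #endif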
     
  16. colin299

    colin299

    Joined:
    Sep 2, 2013
    Posts:
    160
    Are you writing something like your own surface shader (extracting all common code/lighting into your include files), but for URP only?
    So every new shader can be written with no/minimal duplicated boilerplate code (just like writing a surface shader in the built-in RP)?
     
  17. KokkuHub

    KokkuHub

    Joined:
    Feb 15, 2018
    Posts:
    221
    IMO they didn't "nail" it at all: they botched it. You have to choose between "next gen" graphics and running on low end hardware and there is no path for switching between one or the other without recreating all your materials and lighting setup manually.

    SRP's biggest problem is the lack of a strong shader generation backend. If you create a custom SRP, you cannot use shaders written for another. You can't even use graphs from another SRP. For SRP to really take off, it needs a system to bridge a shader/graph and the SRP's internal implementation.
     
    Last edited: Feb 21, 2020
    Noisecrime and neoshaman like this.
  18. andybak

    andybak

    Joined:
    Jan 14, 2017
    Posts:
    358
    I'm currently working on an app that supports both URP and HDRP and it's not perfect but I'm getting a fair amount of reuse from shaders, lights etc.

    1. Simple shader graphs use a master node that works on either pipeline
    2. Otherwise - put all nodes apart from the master node into a subgraph and have top level graphs that differ only in the master nodes
    3. Lights need adjusting when switching pipelines. I'm still working out whether I need two sets of lights or whether I can simply apply a parameter adjustment to each light based on the current pipeline.

    It's far, far from perfect and my requirements are unusual (it's not a game) but it's kinda working.

    The thing I'm more worried about is the apparent lack of attention to custom pipelines, which seems to negate the main stated reason for switching to this system. The VFX Graph has hardcoded support for the official pipelines, which means no VFX for you if you use a custom pipeline. There are probably other things you lose, but this is enough for me to assume that custom pipelines aren't a realistic option.

    That seems very short-sighted.
     
    Last edited: Feb 21, 2020
  19. KokkuHub

    KokkuHub

    Joined:
    Feb 15, 2018
    Posts:
    221
    Shaders should all use a common abstraction that an SRP can consume to generate its final shaders. The lack of such a thing is the biggest problem with SRPs going forward, and it makes writing a custom SRP very unappealing.
     
    jbooth likes this.
  20. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    4,192
    Custom SRPs are dead on arrival because Unity has hard coded their tools to only work with their SRPs to shortcut development. It’s a real shame because the idea is solid, but the vision to see it through has just been lacking. There would always be the chance that a custom SRP would be different enough that it wouldn’t have any commonality with the traditional pipelines (something like Dreams comes to mind), but with a solid abstraction layer most things would at least run between most SRPs and sometimes flawlessly.

    Custom SRPs are very useful for things like mobile development or when you really want a unique look, but each day writing one locks you out of more of Unity.
     
    Noisecrime, Fenixake, Recon03 and 6 others like this.
  21. Recon03

    Recon03

    Joined:
    Aug 5, 2013
    Posts:
    465
    Sounds about right... They pulled some of this nonsense with Substance, which made it hard to deal with, and making anything third-party has been a pain (a buggy mess). Hey Unity, START making a damn game. Just maybe you will see the pain of what we go through with your closed-off NONSENSE for everything. I just don't get it, and Unity wonders why we go to Unreal. It gets exhausting dealing with this crap. A lot of us gave up and just use LTS, or even older versions, or another engine, because Unity just doesn't understand the needs of the experienced developers who use it (you would think they would listen). Yeah, I have gotten plenty of lip service (exhausting); in some areas it seems to have gotten worse.
     
  22. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    26,729
    It's got to the point where I quite fancy just rolling my own lights and doing a vert/frag shader. That would be lovely and fast.
     
  23. KokkuHub

    KokkuHub

    Joined:
    Feb 15, 2018
    Posts:
    221
    I'm still on built-in, but if I had to use URP the first thing I would do is rewrite the hell out of many of its systems (like I'm having to do with built-in and PPv2, where possible). Why? Because it's built around the feature sets of ancient phone hardware and doesn't use compute shaders anywhere (other than PPv2's super-expensive fixed-quality AO and a tiny bit of the PPv2 color grading LUT generation).

    Many things, like bloom and FXAA, can be made to run more efficiently using compute shaders than plain old fragment shaders. Post processing like fog and color grading could be done in compute without ping-ponging between render targets, by modifying the buffers in place. Even tiled lighting itself can be made more efficient, by classifying the tiles based on light count and using indirect dispatches to execute specialized shaders instead of having a single shader full of dynamic branches.
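
    Just to be concrete about "modifying the buffers in place", I mean the compute path can read and write the same target, along the lines of this (an illustration of the idea only, not PPv2/URP code):

    // ColorGrade.compute - grades the camera color target in place, no ping-pong RT needed
    #pragma kernel ColorGrade

    RWTexture2D<float4> _Color;   // camera color target bound as a UAV
    float _Exposure;

    [numthreads(8, 8, 1)]
    void ColorGrade(uint3 id : SV_DispatchThreadID)
    {
        float4 c = _Color[id.xy];
        c.rgb *= _Exposure;       // stand-in for the real fog/grading math
        _Color[id.xy] = c;
    }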
     
    hippocoder likes this.
  24. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    26,729
    URP and HDRP are generic solutions really so I get where Unity is coming from. At least we're able to write our own SRPs which is one of the big benefits of this whole thing.
     
  25. KokkuHub

    KokkuHub

    Joined:
    Feb 15, 2018
    Posts:
    221
    When the thing ever gets actual programming documentation, stops changing wildly on every release, and Shader Graph gets support for custom SRPs, that is.

    It's also a bit annoying that certain somewhat high-level logic pieces are still at the other side of the C++ fence, like culling and shadow generation.
     
    Recon03 likes this.
  26. nasos_333

    nasos_333

    Joined:
    Feb 13, 2013
    Posts:
    8,351
    In practice though, from working with them, URP and HDRP seem 100% pointless so far: most effects I make are comparable or more limited in URP, and HDRP is way slower than custom standard-pipeline shaders, because you are now forced to use an extremely heavy core to do simple effects.

    Maybe I'm missing some qualities there, but my grass shader is slower in HDRP, I miss point light shadows in URP, and after actual implementations everything seems like a worst case compared to standard so far.

    Maybe if I spend another decade optimizing and writing custom stuff for the pipelines, with 1000x the complexity, I'll manage to do what standard already did.

    For sure, all of the above is not what I would ask for. From URP I would ask for a clear-cut, better-performing pipeline with the same or more features, and we don't get that at all. From HDRP I would ask for great graphics with more optimization than standard, and it is far from that: it looks very slow and heavy so far, and I don't see much there that I could not already do with standard, with much faster performance.
     
    Recon03 likes this.
  27. nasos_333

    nasos_333

    Joined:
    Feb 13, 2013
    Posts:
    8,351
    But then you get cut off from the whole Asset Store and everything on the other pipelines, so is there a realistic indie-developer scenario where this feature would be used?
     
    Recon03 likes this.
  28. phil_lira

    phil_lira

    Unity Technologies

    Joined:
    Dec 17, 2014
    Posts:
    545
    Regarding shader documentation, we started converting the "writing vertex and fragment shaders" manual for URP.
    On Shader Graph, I'm not fully up to date on the roadmap, but I know they plan to expose more control.
     
    chrismarch, Recon03, Lex-DRL and 3 others like this.
  29. Recon03

    Recon03

    Joined:
    Aug 5, 2013
    Posts:
    465


    Yup... this is what we see as well... it's really gotten on my nerves.


    Not far from the truth, sadly.
     
    Last edited: May 16, 2020
  30. Recon03

    Recon03

    Joined:
    Aug 5, 2013
    Posts:
    465

    I really hope so... You guys do this with PhysX as well when it comes to origin shifting and what is exposed, and you did the same with what Substance needs from the Unity API. No idea why you guys limit us, but c'mon already. Every investor and client I work with, after 13 years of being here, tells me to use UE or another engine because of some of this nonsense. I hate to complain, and I finally like Unity's direction on other things, but I wish you guys would expose more and have proper docs for this stuff and for the physics-related stuff, and give Substance the support it needs. AAA games and companies use it for a reason, yet you make it harder for us. It's like you don't want awesome games made in Unity. I've worked in this industry for over 20 years, and sure, some things are going in a better direction, but we are begging you to expose more and to have more docs. This is a waste of time and exhausting; there are many of us who do this for a living, and you make it easy for us to use other engines. We want to use Unity too, but some days I wonder.

    PS: I hate to complain, but it's getting exhausting. I hope this is not just lip service, no offense.
     
    Fenixake likes this.
  31. Recon03

    Recon03

    Joined:
    Aug 5, 2013
    Posts:
    465

    Great question......
     
  32. Epiplon

    Epiplon

    Joined:
    Jun 17, 2013
    Posts:
    29
    Wish there was some info on how to write a shader that is compatible with both the standard and universal render pipelines.
     
  33. Fenixake

    Fenixake

    Joined:
    Jan 25, 2014
    Posts:
    109
    Unity has made a bone handle that they use for Mecanim when you adjust the pose for the humanoid rig, but they could not figure out that the thing is very useful when you want to see your character's bones in the editor (or at least let you access it o_O). Now they've made that animation package that required rewriting engine code, and it's still bad to work with xD. I feel like Unity should be bought by Unreal or something to start doing things right. :cool: Unreal will get an explosion of users since they updated their free licence; I smell some Unity on a grill in the coming years.
     
  34. Johannski

    Johannski

    Joined:
    Jan 25, 2014
    Posts:
    655
    I'm subscribed to this thread to get relevant information on URP shader docs, not your opinion on why Unreal should buy Unity. If you want to discuss that, please open a thread in the general discussion section: https://forum.unity.com/forums/general-discussion.14/
    It's not that I don't believe Unity might have made mistakes with their animation solutions, but your reply is not helping this particular thread in any way.
     
    jbooth and florianhanke like this.
  35. Fenixake

    Fenixake

    Joined:
    Jan 25, 2014
    Posts:
    109
    It helps keep the thread hot, so that Unity devs can see it! (They won't read what I say anyway, so they just might read what others say they need from URP/SRP in general.)
     
  36. castor76

    castor76

    Joined:
    Dec 5, 2011
    Posts:
    1,645
    Man... we really need this doc. I can't even begin to write a simple shader that works with SRP batching for sprites. Anyone with info on where I can get the default shader source for the URP sprite shader?

    I mean, when you click on Unity's own 2D Renderer Sprite-Lit shader, it says it is not SRP Batcher compatible... Gahhh...

    I am completely confused about how to make my shader or my material batching-friendly for the URP 2D Renderer...
     
    Last edited: May 31, 2020 at 10:02 AM
  37. BattleAngelAlita

    BattleAngelAlita

    Joined:
    Nov 20, 2016
    Posts:
    182
    Just wrap all your uniforms with CBUFFER_START(UnityPerMaterial) ... CBUFFER_END.
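    i.e. in the HLSLPROGRAM block, something like this (the property names are just examples):

    // One constant buffer holding every per-material property, identical across all passes.
    // Miss a per-material property here and the shader shows up as not SRP Batcher compatible.
    CBUFFER_START(UnityPerMaterial)
        float4 _MainTex_ST;
        half4  _Color;
    CBUFFER_END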
     
