Can Unity please try documenting changes to the shader system? This is absurd.

Discussion in 'Universal Render Pipeline' started by jbooth, Feb 24, 2020.

  1. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
    Every URP release I spend many hours diffing the changes to their shaders to try to see what has changed and what I need to do to update my shaders. This has reached an absurd level of support cost, and is simply unsustainable with:

    - No real documentation
    - No abstraction like surface shaders to work with
    - No warning or information about changes
    - No template that shows us what each stage of the shader needs in order to work across all features (i.e., you need these bits for VR here, here, and here, these for tessellation, these for shadows, etc.)
    - Multiple pipelines and versions to work across

    Unity has basically turned my life into a nightmare of support issues.
     
  2. echologin

    echologin

    Joined:
    Apr 11, 2010
    Posts:
    1,078
    You are not alone!

    Going through the same thing here.
     
    rthom likes this.
  3. Rich_A

    Rich_A

    Joined:
    Nov 22, 2016
    Posts:
    338
    Unity, please fix this.

    Without extensive asset support, one of Unity's primary advantages over Unreal Engine is gone, and the sparse core feature set becomes very apparent.

    The slowdown on the Asset Store (i.e. the decreased pace of new releases and of general asset development) has become very obvious in the last year.
     
  4. GoGoGadget

    GoGoGadget

    Joined:
    Sep 23, 2013
    Posts:
    864
    Don't get me started on this. I can't imagine what it's like working with surface shaders, but at least with Post-Processing it's a nightmare trying to keep track of what Unity has decided to change the API to with each new version of the engine.

    Just so any Unity folks reading this know: what you have done with these "agile" (read: not stable), "modern" (read: over-engineered, overly complex) rendering pipelines is make it much harder for commercial shader asset creators to update and maintain tools for Unity. I hope you've given the Asset Store team a heads-up that they'll be making less revenue in this area come 2021, because with how hard it is to develop and maintain these sorts of tools now, fewer developers will pursue this option. I would love to be proven wrong, but in all the AS groups I'm a part of, this is the consensus.
     
    april_4_short likes this.
  5. Flurgle

    Flurgle

    Joined:
    May 16, 2016
    Posts:
    389
    It really saddens me that Unity is treating these awesome developers like total dirt. This isn't the first time Jason has complained about these sorts of issues. And it's not just about URP, but about how Unity treats its asset developers in general.
     
    hopeful likes this.
  6. Le_Tai

    Le_Tai

    Joined:
    Jun 20, 2014
    Posts:
    442
    Speculation:

    I think as Unity grows bigger, they're starting to have the Google problem:
    - No one wants to maintain old products, because to get promoted you want to launch something that sounds good to the boss, e.g. "performance by default".
    - Profit.
    - Rinse and repeat. Who cares about the thing I launched anyway?
     
    hopeful likes this.
  7. phil_lira

    phil_lira

    Unity Technologies

    Joined:
    Dec 17, 2014
    Posts:
    584
    Hey all. I apologize for the upgrade pain and the lack of documentation.

    A dedicated tech writer for URP recently joined the team. We are mapping out everything that's missing in order of priority, and we have already flagged the missing documentation about renderers and shaders as high-priority documentation issues.

    Starting with 7.2.0, we have been writing upgrade guides for minor and major versions of the pipeline. This document lives in the URP documentation.

    To clarify, when you talk about templates, do you mean shader templates like the "create new" Unlit or PBR shader for URP?
    As for the macros for VR and tessellation, I'll bring these up in the docs meeting so we have an action plan to map and document them.

    This has been some of the harshest feedback we have received. We are putting a big effort into decreasing the fragmentation pain for users.
     
  8. Lars-Steenhoff

    Lars-Steenhoff

    Joined:
    Aug 7, 2007
    Posts:
    3,527
    Are URP and HDRP going to be merged in the future, so we can focus on just one render pipeline but with different shaders for different features, like it used to be in the old days?

    Or will there be conversion tools to convert between pipelines (not just the standard shader)?
    Are there going to be less frequent releases, so there are fewer versions to worry about?

    What is the plan for decreasing the fragmentation?
     
    valarnur and cxode like this.
  9. Rich_A

    Rich_A

    Joined:
    Nov 22, 2016
    Posts:
    338
    The ability to scale down HDRP to Nintendo Switch would mostly solve the fragmentation issue. URP could then be a thoroughly stylised/mobile/VR solution.
     
  10. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
    The lack of a surface shader replacement is the real mistake. There really doesn't need to be a difference between the pipelines for most shaders; they all use the same inputs to the lighting equation, with HDRP having a few extras that URP will likely gain cheaper versions of eventually (SSS, for instance).

    Right now I use the code output from the Shader Graph as a template, inserting various comments into it so my shader generation code can insert what it needs. I believe ASE is doing the same thing. But this has proved difficult to maintain because the shader structure between HDRP and URP is so vastly different and changes rapidly between versions. As a result, in URP I've been maintaining an older version of the URP shader, because if I adopt the new structure I'm going to lose tessellation and custom vertex data being passed to the pixel shader, breaking a bunch of stuff. In the latest release I've managed to handle the shadow cascade issues and the changes to the shadow pass, but I think there's a bunch of VR stuff that's been added that I'm missing, and none of this was mentioned in the upgrade guide either.
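    For illustration, here is a heavily cut-down sketch of what such a comment-marked template can look like. The real templates are the full Shader Graph output for each pipeline with all of its passes; the marker names below are invented, and a generator tool simply string-replaces them:

        // Minimal URP template with hypothetical injection markers.
        Shader "Hidden/Example/GeneratedTemplate"
        {
            Properties
            {
                _BaseColor ("Base Color", Color) = (1,1,1,1)
                // %PROPERTIES%  <- generator inserts per-shader material properties
            }
            SubShader
            {
                Tags { "RenderType" = "Opaque" "RenderPipeline" = "UniversalPipeline" }
                Pass
                {
                    Name "Unlit"
                    HLSLPROGRAM
                    #pragma vertex vert
                    #pragma fragment frag
                    // %PRAGMAS%  <- instancing / tessellation / VR keywords go here
                    #include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl"

                    CBUFFER_START(UnityPerMaterial)
                        half4 _BaseColor;
                        // %CBUFFER%  <- per-material uniforms
                    CBUFFER_END

                    struct Attributes { float4 positionOS : POSITION; };
                    struct Varyings  { float4 positionCS : SV_POSITION; };

                    // %FUNCTIONS%  <- the user's surface code is injected here

                    Varyings vert (Attributes IN)
                    {
                        Varyings OUT;
                        OUT.positionCS = TransformObjectToHClip(IN.positionOS.xyz);
                        return OUT;
                    }

                    half4 frag (Varyings IN) : SV_Target
                    {
                        // %SURFACE_CALL%  <- call into the injected surface function
                        return _BaseColor;
                    }
                    ENDHLSL
                }
                // A real template also needs ShadowCaster, DepthOnly, DepthNormals and Meta
                // passes, each of which has to be kept in sync with the pipeline version.
            }
        }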

    The thing is, it's really not that hard for you to implement a surface-shader-like system; I managed to hack one together in a day. The issue is maintaining it through your changes, since this isn't something I can really sell on the store to recoup costs. I outline the basic approach here:

    https://forum.unity.com/threads/making-srp-shaders-easier-to-write.775085/

    Unity seems to have a big problem changing direction once it makes a decision, which is hard in a large company. But whoever over there is in charge of this stuff needs to pull the emergency brake and get this done, because it's completely wrecking SRP as a viable platform for assets and studios. If I were working at a studio right now writing custom shaders and dealing with these constant changes, I'd be just as pissed. Not to mention you've hardcoded things in the shader and VFX graphs for your specific SRPs, which means custom SRPs are basically dead if you want to use those tools, and you've closed off the node graphs so they are not extensible without forking the entire Shader Graph or VFX Graph system or hacking the assemblies.
     
  11. Rich_A

    Rich_A

    Joined:
    Nov 22, 2016
    Posts:
    338
    Unity, why not just make Jason Booth a sweet offer, get him to fix all this stuff for you directly, and integrate his assets as core components of the engine for Plus/Pro users?

    Valve brought in Lars Doucet, who cut through all the bureaucracy and revamped the Steam product search engine.

    Integrated mesh baking is badly needed, and the Splat assets could be integrated into the Terrain system.

    It seems to have worked pretty well for TMP and ProGrids in Unity's own history.
     
    funkyCoty, NotaNaN, Bwacky and 18 others like this.
  12. Neto_Kokku

    Neto_Kokku

    Joined:
    Feb 15, 2018
    Posts:
    1,751
    My prediction is one of the SRPs will be eventually axed with the surviving one absorbing all the features.

    We are in the age of cross-platform games, and the dual-SRP situation makes it impossible to do something like Fortnite: a single project that can be built for platforms ranging from mid-range mobile all the way to high-end PC.
     
  13. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
    Well, Unreal effectively has both HDRP and URP features within one rendering stack, at least from a user's point of view. You're not asked to rewrite your shaders for mobile; some features just don't work there or are too expensive to use. The general trend is going to be for URP to take on more and more features and become a more general renderer. There's no technical reason it can't support raytracing or other high-end features, or gain new passes for tile-based rendering or whatever else they want to add to it.

    However, the whole point of having two separate pipelines was likely to make the code easier to manage, since it gets really hard to make changes with all of that running in one code base. Once the vision shifts from "let's have a high-end SRP and a low-end one" to URP being everything for everyone, it becomes difficult to argue against feature creep, because the vision for what it needs to do is much wider. It's no longer about having a performant renderer for low-end devices; it's about replacing the standard renderer with its full feature set (hence adding deferred, etc.), leading to a similar level of complexity to the one they were originally trying to get away from.

    However, the real promise of SRP, in my opinion, is being able to really customize your pipeline: someone writing a new one that gives the game a very different look, a new lighting model, etc. Or custom rendering research. But the uptake on this doesn't seem to be happening, and again, the missing shader abstraction and the closed API on the Shader Graph are likely the reasons. Once you go that route, you can't use the Shader Graph anymore, all shaders are low level, you can't use anything from the store, and upgrades are a lot of work. Instead, demand seems to be pushing the system back toward new versions of the old ways of doing things: you don't write a top-down SRP, you extend the existing one with custom passes and such.

    With a proper shader abstraction, I suspect this whole experience wouldn't feel that different from Unreal or past Unity: you could switch between URP and HDRP, and everything would still work except for the features not supported by one renderer or the other. In an ideal world, this shader abstraction layer would have belonged to the SRP itself, and wouldn't care whether the shader functions were written in a shader graph or in a surface-shader-like system; it is some C# that knows how to generate the full shader, with all passes and lighting functions, so that as I customize my SRP I can customize how that final shader looks without requiring my regular shaders to be rewritten.

    Note that Unity already built all of this to allow their Shader Graph to work on both pipelines; they just decided to hardcode it into the Shader Graph rather than separate the idea of abstracting shaders from plugging nodes together visually, and then made everyone on the asset store either use that graph or build the same abstraction layer themselves with no help or documentation.
     
  14. andybak

    andybak

    Joined:
    Jan 14, 2017
    Posts:
    569
    Yeah. One of the first things I checked when Shadergraph was released was the editor API. I wanted to see how to harness it programmatically.

    I was rather surprised to find out the answer was "you can't". Surely it would have been a better architecture all round - even putting aside using it programmatically?

    To summarize your point: Unity created the thing that would solve all our problems but tangled it up in the UI so it couldn't be used separately.
     
    cxode likes this.
  15. xVergilx

    xVergilx

    Joined:
    Dec 22, 2014
    Posts:
    3,296
    The real answer to this from Unity is, I think, "use Shader Graph", which sucks.
    Everything below applies to HDRP, but URP should be in nearly the same state (in terms of shaders).

    The real issue is the lack of documentation for everything shader-related.
    (Comments in the code are fine, as long as they can be found, which is sometimes not that easy.)

    Personally, I've managed to port a custom terrain (CMU, with a little help from larsbertram1) to HDRP, and it wasn't that bad, to be honest.
    Once you've realised that templates == the surface shader abstraction, it's not that hard.
    Plus there's already a terrain shader, so merging features was relatively easy.

    The really hard part is finding out what is located where and how it works internally.
    Fortunately the shader sources are included with the package, which makes this less time consuming.

    I'm also working in my spare time on porting SpeedTree v6 to HDRP.
    There was no way to find out why the pipeline behaves the way it does (see https://forum.unity.com/threads/weird-black-box-hall-of-mirrors-effect-in-hdrp.833074/), so I dumped that idea and started rebuilding it with ASE.

    That is time consuming, but at least I've got my trees back. I still need to finish porting the wind, but I think I'll deal with it.

    TL;DR:
    - If it's large and there's no one to help you with the shader port, use Shader Graph or ASE to turtle-port it (slowly, but reliably).
    - If it's large and there's already a shader in the pipeline, merging features manually should be relatively easy (although time consuming).
    - For small / new shaders it's better to just use Shader Graph / ASE.
    - I'd love some documentation on how the shaders work, in both HDRP and URP.
    - SpeedTree sucks at porting their bloody shaders. (C'mon, why the heck can one dude with a weekend port code faster than actual industry veterans?)


     
  16. Le_Tai

    Le_Tai

    Joined:
    Jun 20, 2014
    Posts:
    442
    Porting a shader for your own use is different from porting it for other people. The primary problem is that you don't know what magic template to use to make things work correctly on all platforms (OpenGL, DirectX, Metal, Vulkan) and with all features: instancing, XR (single-pass and multi-pass), dynamic resolution, and probably many more.

    Currently I have a problem with HDRP where the framebuffer is sometimes a Texture2D and other times a Texture2DArray. I'll probably figure it out soon, but the point is that I couldn't possibly have known such a problem existed without happening to stumble upon it. God knows how many other gotchas there are.
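    For what it's worth, this particular gotcha is what the *_X texture macros in the SRP shader libraries are meant to hide: with XR single-pass instanced rendering the camera targets become Texture2DArrays, otherwise they are plain Texture2Ds. A minimal sketch (the texture name is just a placeholder, and the relevant SRP ShaderLibrary includes are assumed):

        // Declares either a Texture2D or a Texture2DArray depending on the XR mode.
        TEXTURE2D_X(_SourceTexture);
        SAMPLER(sampler_SourceTexture);

        half4 SampleSourceColor(float2 uv)
        {
            // SAMPLE_TEXTURE2D_X expands to the correct sample call for either type,
            // provided the stereo eye index has been set up for the fragment stage
            // (UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX in the vertex-to-fragment path).
            return SAMPLE_TEXTURE2D_X(_SourceTexture, sampler_SourceTexture, uv);
        }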

    When these gotchas hit customers, what you get is bad reviews.
     
    Last edited: Mar 2, 2020
  17. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
    And everything you write breaks on every update, so you get the joy of figuring that out and writing something that works across all versions, a burden Unity doesn’t even have to deal with.
     
  18. colin299

    colin299

    Joined:
    Sep 2, 2013
    Posts:
    181
    We are not providing shaders on the Asset Store, but we need a way to write shaders safely for URP, with minimal maintenance cost when we upgrade Unity/URP in the future.

    Our final solution was to build our own abstraction layer: an intermediate fake shader text file similar to the built-in pipeline's surface shaders, which we call a "surface shader" internally.
    Then we wrote a simple shader code-gen editor tool which accepts two inputs:
    - a single template shader (a real URP shader with the correct custom lighting model, but containing minimal high-level surface logic)
    - all the surface shaders, where each surface shader defines high-level logic such as what the albedo, emissive, normal, etc. are, plus the cull, blend, properties, etc. (imagine this text file is a shader graph, but in code)

    The code-gen editor tool then combines these two inputs by injecting each surface shader's code into the correct places inside the template shader, and outputs a real, final URP shader for each surface shader.

    It is basically 90% the same as what ASE is currently doing, but we use code (the surface shader text file) instead of nodes to define a shader's logic, because we found that code is more version-control friendly, the development cost for this tool is much lower, and code is more flexible and simpler than nodes. The downside of this method is that it is absolutely not artist-friendly, but we don't care, because our artists don't touch shaders anyway.

    If we update Unity/SRP, we only need to fix our template shader, then trigger the tool to regenerate all the real shaders.
    If we need to support HDRP in the future, we only need to write an extra template shader, then trigger the tool to regenerate all the real shaders.
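    As a sketch of what one of those intermediate "surface shader" text files might contain (the directives and function signature here are invented for illustration; the real format is whatever the in-house tool defines, and the tool would emit the matching texture/sampler declarations from the property list):

        // Hypothetical "surface shader" input file consumed by the code-gen tool.
        // %CULL%        Back
        // %BLEND%       Opaque
        // %PROPERTIES%  _BaseMap ("Albedo", 2D) = "white" {}
        //               _EmissionMap ("Emission", 2D) = "black" {}

        // High-level surface function the template's fragment stage calls:
        void UserSurface(float2 uv, inout half3 albedo, inout half3 emission)
        {
            albedo   = SAMPLE_TEXTURE2D(_BaseMap, sampler_BaseMap, uv).rgb;
            emission = SAMPLE_TEXTURE2D(_EmissionMap, sampler_EmissionMap, uv).rgb;
        }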
    ------------------------------------------
    I believe this shader code-gen editor tool, or a complete surface shader system, should be provided by Unity inside the SRPs, because I believe many developers are doing exactly the same thing. In a real development environment, not everything can be made with Shader Graph/ASE (and I'm not even talking about performance).

    Code and abstraction are the answer, not nodes and graphs.
     
    Last edited: Mar 3, 2020
  19. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
    Absolutely. If every developer is basically writing their own abstraction layer, then it's hundreds of thousands of wasted man-hours being pushed onto us because of their lack of foresight. The exact reason people use a licensed engine is so it solves these kinds of problems.

    The fact that Unity remains silent and continues to ignore this problem says a lot: they don't seem to mind wasting our time. I know they've read this thread, as well as many others over the years, and yet they have NEVER once commented on the issue. Yes, let's all stick our heads in the sand and hope the problem goes away; that's exactly how you should manage a problem like this.
     
  20. nxrighthere

    nxrighthere

    Joined:
    Mar 2, 2014
    Posts:
    567
    This applies to pretty much every area of the engine today; you are basically a free QA workforce. I feel your pain and it's sad. Unity today is 70% about marketing and sales and 30% about creating an actual product for developers. I know companies who are slowly investigating and migrating to alternatives, so it's just a matter of when you finally draw the line.

    (Attached image: Unity_2019.png)
     
  21. Vincenzo

    Vincenzo

    Joined:
    Feb 29, 2012
    Posts:
    146
    This image, however funny it may be, is the bitter truth.

    Older Unity stuff is getting no real support or progress; it's basically deprecated in every way except official wording. The two-year LTS support is a lie.

    The new stuff is alpha quality at best; calling it production-ready is a big lie too.
    Bug reporting is a giant hassle, so it hardly happens, and the bugs don't really get fixed either.

    Companies' livelihoods are on the line, both asset developers and professionals who make games.

    We are all looking to move on from Unity to alternatives, and I've seen many companies switch already.

    Unity today has turned into a company with a lot of salespeople and Unite events, but no actual engineering. Sad.
     
  22. a436t4ataf

    a436t4ataf

    Joined:
    May 19, 2013
    Posts:
    1,933
    I fully agree with Jason, but ... don't lose sight of the bigger picture: Unity has changed (and is still changing!) the world of game development, largely because their engine was fundamentally modular and it was easy to replace any part you didn't like.

    The comic is funny, but ... this is what (professional) developers always wanted! We spent many years begging for this, and Unity gave it to us. It is an amazingly awesome thing for us!

    e.g. UnityUI sucks in some big ways. And UIElements is editor-only (argh!). So ... I opened up UnityUI, reverse-engineered their layout algorithm, and ... replaced it with a complete implementation of CSS3 Flexbox, at much higher performance (and much, much easier to use than UIElements) ... and it works seamlessly in runtime/player builds. It's all my Christmases at once :) (I have a free version pending Asset Store review right now.)

    That is ... just amazing. If you'd experienced the hell of Unreal Engine 2 or even Unreal Engine 3, you'd know how lucky we are to have an engine where it is so easy to say "meh, Unity's implementation of X sucks; fine! I'll write my own! And just replace theirs, and everything will continue to Just Work."

    As for "no actual engineering": that is manifestly not true. The engineering quality and output from Unity in recent years is a huge step up from what it used to be. I've been with Unity since version 2, and actively making games since 3.x; if you had lived through those days you wouldn't believe how much better it is today :)

    ** All of which underlines why Jason's complaints are so timely: with the current plans Unity has (not) made and is (not) enacting, ShaderGraph/SRP are in many ways a return to the worst of Unity's history, and we've become used to much, much better in recent years. So it looks really bad by comparison to the rest of Unity. **
     
    AdamBebko, cxode, xVergilx and 2 others like this.
  23. MadeFromPolygons

    MadeFromPolygons

    Joined:
    Oct 5, 2013
    Posts:
    3,983
    I still cannot understand why, after so long, Unity has not at least said "yes, we will make surface shaders for URP and HDRP".

    How many people, across how many years, need to complain about this and potentially migrate engines before it becomes an actual priority?

    I'm sorry, but the URP and HDRP management do not currently seem to understand what the userbase actually wants.

    @jbooth we went from having no shader graph, with Unreal making us look like we were lagging behind the times but still being able to make beautiful things (look at games like Sable, which are made with the built-in pipeline), to having a shader graph and pipelines that are so unusable that no one is making anything revolutionary shader-wise anymore.

    Every time I try to get up to speed on URP and HDRP, they've changed so much, and there's such a lack of documentation, that I give up. I can't be the only one feeling this way. I feel for you @jbooth, because I cannot imagine what it is like to be trapped having to support these nightmarishly designed pipelines.

    It's a shame, because if they actually listened and acted on what everyone is saying, it wouldn't take THAT much work to turn this crappy situation into a complete win for Unity AND users all round.

    I noticed yet again, @jbooth, that every one of your questions was answered by Unity EXCEPT the surface shader one, which is a shame, as that's THE problem everyone really has.

    Unity team TL;DR: when you first introduced surface shaders, you explained the need for them. That need hasn't gone away. You understood why they were necessary back then with a SINGLE pipeline; why would they not be necessary now that there are three incompatible ones? I still, for the life of me, cannot figure out your thinking on this one.
     
    Last edited: Mar 4, 2020
  24. transat

    transat

    Joined:
    May 5, 2018
    Posts:
    779
  25. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    Unity should rename URP to BURP (Bloated Unfinished Render Pipeline).
     
  26. MadeFromPolygons

    MadeFromPolygons

    Joined:
    Oct 5, 2013
    Posts:
    3,983
    BUURP

    Bloated Undocumented Unfinished Render Pipeline

    :D
     
  27. BattleAngelAlita

    BattleAngelAlita

    Joined:
    Nov 20, 2016
    Posts:
    400
    > I still cannot understand why after so long unity have not at least said "yes we will make a surface shader for URP and HDRP".
    Because a shader graph can be advertised to lots of artists who are familiar with spaghetti shaders in Maya/Blender/Unreal/etc.
    On the other hand, surface-like shaders are only of interest to a few shader geeks.
     
  28. andybak

    andybak

    Joined:
    Jan 14, 2017
    Posts:
    569
    And it's shader-geeks that make the simple, reusable assets that form part of the ecosystem that is one of Unity's biggest strengths.

    The point was that we could have had both - most of the work needed for a shader framework has already been done. A cleaner architecture in Shader Graph would have allowed us to make use of it.
     
  29. MadeFromPolygons

    MadeFromPolygons

    Joined:
    Oct 5, 2013
    Posts:
    3,983
    It's not one or the other, though. They can easily provide both. It's the "shader geeks" who literally drive Unity's ecosystem; the shader graphs everyone has used since 2014 were developed solely by said geeks.

    We have both artists and shader programmers on our team; in a modern game development environment, it shouldn't be the case that only one of them has proper tools for making shaders.

    Also some things are almost impossible to make (and optimise) using spaghetti shaders.
     
    xVergilx, Ryiah, Vincenzo and 2 others like this.
  30. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    21,205
    Only if you completely ignore the problems we had back then.

    Mono
    Both the compiler and the framework were insanely out of date. MonoBehaviours are heavy and the Unity API is not thread-safe, meaning anyone wanting to develop a complex game with large numbers of things on screen had to either build their own framework or use a third-party one like Entitas (ECS).

    Input
    Unity's legacy input system had no rebinding support, among other problems, which meant most people ended up using a third-party solution like Rewired or building their own system on top of the official one.

    Networking
    Unity has never had a solid networking solution (UNET is both insanely heavy on resources and rapidly falls apart once you start trying to scale past a co-op game or anything similarly complex), which is why most people went with a third party like Photon.

    UIs

    IMGUI. Good enough for editors, worthless for just about everything else. Most people ended up using a third party like NGUI or DarkForge. Once Unity UI came out it was clearly good enough to kill off the majority of the third parties, and once UI Elements comes out I will very eagerly abandon Unity UI too, for many reasons.
     
    Last edited: Apr 15, 2020
  31. Neto_Kokku

    Neto_Kokku

    Joined:
    Feb 15, 2018
    Posts:
    1,751
    Yes, it's an unwritten rule that if you want to get anything actually done with Unity, you will either have to buy some third-party assets or spend on R&D, due to many out-of-the-box features being barebones, insufficient, non-performant, a pain to use for anything more involved than a prototype, or simply non-existent. So the average Unity developer has already experienced a version of that image when picking which assets to purchase.
     
  32. Flurgle

    Flurgle

    Joined:
    May 16, 2016
    Posts:
    389
    @Ryiah What is DarkForge? Or did you mean something else?
     
  33. nxrighthere

    nxrighthere

    Joined:
    Mar 2, 2014
    Posts:
    567
    It was one of the custom GUI solutions from about 6-7 years ago. Third parties are what have been carrying Unity for a long time.
     
  34. MP-ul

    MP-ul

    Joined:
    Jan 25, 2014
    Posts:
    230
    Unity has no AAA game released and we are in 2020. Meanwhile, Epic has Fortnite, which has made millions since its release, and because of that they have given a ton of stuff away for free to Unreal users. Unity gives us only broken stuff... but at least it is giving something; the models are usually good, but the code parts are a mess most of the time and unusable. =))
     
    Elfstone likes this.
  35. cxode

    cxode

    Joined:
    Jun 7, 2017
    Posts:
    268
    Supposedly they are working on some first-party AAA titles; I've heard that mentioned several times by Unity employees.
     
  36. Flurgle

    Flurgle

    Joined:
    May 16, 2016
    Posts:
    389
    @Fenixake
    That is the point, though. Unreal started as a games company, while Unity, an engine company, didn't cater to massive multi-million-budget AAA studios (which rely on the mission-critical ability to look deep into the source code, elaborate pipelines, etc.); their primary target was the small/medium-sized company or indie dev ("democratizing").

    With DOTS, SRP, the slow but steady open-sourcing, and the new roadmap, that could all change.
     


  37. MP-ul

    MP-ul

    Joined:
    Jan 25, 2014
    Posts:
    230
    Man, have you seen how many people work at that company? Crytek went broke and still has an engine superior to Unity in terms of graphics. Meanwhile, Godot is getting there, and Unigine 2 now has a free version that does everything you can't do with Unity: a fast, ultra-responsive editor plus a beast of a shader system (minus some shader compiling time). But other than that, I've taken my hands off Unity until they fix it properly.
     
    Vincenzo likes this.
  38. bluescrn

    bluescrn

    Joined:
    Feb 25, 2013
    Posts:
    642
    But there are huge numbers of small teams and solo developers who don't need (or want to learn) DOTS and who certainly aren't going to develop a custom scriptable render pipeline.

    These developers now have huge amounts of experience with 'Unity Classic' and can get great results from it, but 'New Unity' is starting to look as unfamiliar as an entirely new game engine.

    Many people came to Unity for its rapid prototyping capabilities. DOTS doesn't really seem to fit with that, nor does XML-based UI, nor the whole analysis-paralysis situation of starting a new Unity project these days (summed up by the Unity in 2016 vs 2019 image above).

    While 'New Unity' may end up being great for larger teams making bigger games, there's a risk that the huge changes in development will be detrimental to a rather large group of indie/mobile devs.

    (Although really, there's still not a lot of competition. Very few mobile developers seem to be using UE4, and C++ isn't very appealing to most small indie teams)
     
    kexei likes this.
  39. Neto_Kokku

    Neto_Kokku

    Joined:
    Feb 15, 2018
    Posts:
    1,751
    Mobile developers tend to gravitate towards solutions which support the most devices out of the box, which currently means Unity with the built-in renderer. That same motivation will likely slow down URP adoption on mobile, which creates a negative feedback loop: fewer developers using URP leads to fewer compatibility bugs being reported, which leads to URP having lower compatibility, which leads to fewer developers using URP.

    Meanwhile, other engines are upping their game. UE4 greatly improved its mobile support after Fortnite and is seeing an uptick in adoption for high-end mobile games in Asia. Free engines like Monogame and Godot are improving at a steady pace, with the former having been used by quite a few indie hit games like Axiom Verge and even Streets of Rage 4.

    With "classic" Unity pretty much having its development frozen (the built-in pipeline is stated to only receive bug fixes and MonoBehaviours are unlikely to receive any major improvements), that's what the "new Unity" is going to compete with, whenever it's "ready".
     
  40. TheOtherMonarch

    TheOtherMonarch

    Joined:
    Jul 28, 2012
    Posts:
    867
    Why does this have to be so hard? I was happy writing shaders or using other people's shaders. Now all I've got is a crappy, limited shader graph.
     
  41. bluescrn

    bluescrn

    Joined:
    Feb 25, 2013
    Posts:
    642
    And if you *do* want to use Shader Graph, you're forced to use one of the less mature render pipelines; there's no support for it with the 'classic' render pipeline.

    There are artists who'd love to be able to use Shader Graph, but most real-world projects are still using 'Unity Classic'.
     
  42. TheOtherMonarch

    TheOtherMonarch

    Joined:
    Jul 28, 2012
    Posts:
    867
    Shader Graph is not really artist-friendly. It is like a GUI layer on top of a shader. You literally need to know how a shader works to use Shader Graph.

    The decal graph is the most limited piece of S*** ever created.
     
    BattleAngelAlita likes this.
  43. Mauri

    Mauri

    Joined:
    Dec 9, 2010
    Posts:
    2,665
    I mean... it's kind of a given that you need a basic understanding of how shaders work.

    Or of anything, really. You literally need to know how to code in <insert lang here> in order to make a game. If you're using a visual scripting tool, you need a basic understanding of how code works.
     
    Last edited: Apr 27, 2020
    xVergilx likes this.
  44. bluescrn

    bluescrn

    Joined:
    Feb 25, 2013
    Posts:
    642
    Many artists are technical enough to know they want to do something like 'sample a second texture using a second UV set and blend it with the main texture based on its alpha channel', and would have more chance of doing that with a shader graph than with 'scary shader code' (even though it would be a very simple surface shader).
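    For reference, that exact request is only a handful of lines as a built-in surface shader, which is the level of simplicity people are asking to get back (a minimal sketch):

        Shader "Examples/DetailBlend"
        {
            Properties
            {
                _MainTex   ("Main Texture",   2D) = "white" {}
                _DetailTex ("Detail Texture", 2D) = "white" {}
            }
            SubShader
            {
                Tags { "RenderType" = "Opaque" }
                CGPROGRAM
                #pragma surface surf Standard fullforwardshadows
                #pragma target 3.0

                sampler2D _MainTex;
                sampler2D _DetailTex;

                struct Input
                {
                    float2 uv_MainTex;    // first UV set
                    float2 uv2_DetailTex; // second UV set
                };

                void surf (Input IN, inout SurfaceOutputStandard o)
                {
                    fixed4 mainCol   = tex2D(_MainTex,   IN.uv_MainTex);
                    fixed4 detailCol = tex2D(_DetailTex, IN.uv2_DetailTex);
                    // Blend the detail texture over the main texture by its alpha.
                    o.Albedo = lerp(mainCol.rgb, detailCol.rgb, detailCol.a);
                }
                ENDCG
            }
            FallBack "Diffuse"
        }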
     
    transat likes this.
  45. transat

    transat

    Joined:
    May 5, 2018
    Posts:
    779
    @bluescrn That makes me think... perhaps the most artist-friendly shader editor would use plain English instead of nodes? Add voice recognition and you could be sitting in front of your computer saying "Sample a second texture. Actually, make that two textures. Blend the first with..." while seeing the changes live on screen. In written form, each paragraph would be a macro, and you could copy sentences or paragraphs from one shader to another. You could include comments by adding "in order to..." to a sentence. Pseudo-coded design instead of nodal design.

    But to get back to the topic...

    Unity, please document the changes! I've experienced issues going from 8.01 to 8.10 and I don't know exactly what's changed.
     
  46. phil_lira

    phil_lira

    Unity Technologies

    Joined:
    Dec 17, 2014
    Posts:
    584
    Hey all, sharing some updates on this. We are converting the "writing vertex/fragment shaders" example pages in the Unity manual to URP. I'm also going to write some more shader examples that go beyond the simple ones in the manual.
    With that in mind, what sort of examples would you like to see?

    I'm also interested in knowing a little bit more about the scenarios you would want Surface Shader support for.
    Do you want Surface Shaders because there are things you can't do today with Shader Graph, and writing vertex/fragment shaders is too much work or too complex?
    Do you want Surface Shaders because you prefer to write shader code, but we are lacking some bits in the shader library or docs that would make that easier for you?

    I guess it's clear to me there's a gap here we need to close to better support you. I'm interested in knowing more about your use cases, so we can see whether we can close this gap by adding features to Shader Graph and by making shader building blocks / an API for the URP shader library, plus docs and examples.
     
  47. mgear

    mgear

    Joined:
    Aug 3, 2010
    Posts:
    9,448
    I'd certainly prefer writing shaders in code; a visual editor feels like too much clicking and dragging.

    For example, in code you can easily copy-paste (from the docs/web), comment/uncomment lines to quickly test, swap swizzles, invert values, etc., literally all in one or two keystrokes.
     
    TheSmokingGnu likes this.
  48. arkano22

    arkano22

    Joined:
    Sep 20, 2012
    Posts:
    1,929
    Hi there,

    I completely agree with jbooth. There needs to be much better documentation for both URP and HDRP shaders.

    Some level of abstraction is also a must. It's just ridiculous that even basic things like the base albedo tint color for Standard/Lit materials have different names in each pipeline ("_Color" in built-in vs "_BaseColor" in URP). Common operations like converting positions from object space to clip space, or getting the main directional light color, are also done in completely different ways, using differently named functions and structures that live in different include files. I'm not even talking about a high-level abstraction layer like surface shaders, or an all-powerful shader graph, but come on. That's three times the amount of work even for extremely basic shader-dependent functionality that needs to work somewhat similarly across all pipelines: user pain that could easily have been avoided by a consistent naming convention and a basic library of common functions.
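    For anyone who hasn't hit this yet, a quick side-by-side of what that looks like in practice (names as of roughly 2019.3 / URP 7.x; exact include paths vary by version):

        // Built-in render pipeline (UnityCG.cginc and friends):
        //     float4 posCS = UnityObjectToClipPos(v.vertex);
        //     fixed4 tint  = _Color;               // Standard shader albedo tint
        //     fixed3 main  = _LightColor0.rgb;     // main directional light color
        //
        // URP (Packages/com.unity.render-pipelines.universal/ShaderLibrary/Lighting.hlsl):
        //     float4 posCS = TransformObjectToHClip(input.positionOS.xyz);
        //     half4  tint  = _BaseColor;           // Lit shader albedo tint
        //     half3  main  = GetMainLight().color; // main directional light color
        //
        // HDRP: _BaseColor again for the Lit tint, TransformObjectToHClip from the SRP
        // core for the clip-space transform, but light data comes from HDRP's own
        // light-loop structures rather than a GetMainLight()-style helper.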
     
    Last edited: Apr 29, 2020
    TheSmokingGnu and Rich_A like this.
  49. Neto_Kokku

    Neto_Kokku

    Joined:
    Feb 15, 2018
    Posts:
    1,751
    A use case would be having an easier migration path from built-in to SRP. Re-creating shaders from scratch in SG in a large project is significant work, while with text-based surface shaders it is at least possible to automate part of the process via string replacement and polyfill defines.
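    A sketch of the "polyfill defines" idea: a small include that maps a few common built-in names onto their URP equivalents, so lightly ported shader code keeps compiling. The URP-side names below exist in the URP/SRP core libraries; the coverage here is purely illustrative:

        #ifndef BUILTIN_POLYFILL_INCLUDED
        #define BUILTIN_POLYFILL_INCLUDED

        #include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl"

        // Map a handful of built-in helpers and types onto URP equivalents.
        #define UnityObjectToClipPos(pos)     TransformObjectToHClip((pos).xyz)
        #define UnityObjectToWorldNormal(n)   TransformObjectToWorldNormal(n)
        #define UnityWorldSpaceViewDir(posWS) (GetCameraPositionWS() - (posWS))
        #define fixed  half
        #define fixed3 half3
        #define fixed4 half4

        #endif // BUILTIN_POLYFILL_INCLUDED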
     
    TheSmokingGnu and phil_lira like this.
  50. colin299

    colin299

    Joined:
    Sep 2, 2013
    Posts:
    181

    Thanks for replying here. I will describe my exact experience of starting shader development with URP, from day one to now. See if it helps; it will be a bit long, but I hope it explains what I expected from surface shaders / Shader Graph.

    Imagine I am the only "shader guy" on the team who writes HLSL lit shaders, while the artists can only produce shaders using a node editor like Shader Graph/ASE, mostly doing texture blending / masking / UV animation on high-level properties like albedo / normal / smoothness.

    1. First day in URP: "wow, the new Shader Graph is so cool, I can sample the color/depth texture with just a few clicks, perfect for VFX!" And it really works; I am very happy about this. It is like an official ASE for URP.
    2. Then I need to create something more complex: a toon shader with an outline and cel-shaded lighting for characters (the visual result looks like this - https://github.com/ColinLeung-NiloCat/UnityURPToonLitShaderExample).
    3. First I tried creating it in Shader Graph, and gave up the same day, because I found that in Shader Graph I can't add a second pass for the outline and I can't edit stencil settings.
    4. OK, I decided I really can't rely on Shader Graph for this task; I need to write shader code.
    5. Found this: https://gist.github.com/phi-lira/225cd7c5e8545be602dca4eb5ed111ba - very, very important to me.
    6. Slowly replaced (5)'s code with my own code to make the shader look like a toon shader, and found that writing shaders in URP feels much better than in built-in, because of URP's light abstraction (with automatic shadowAttenuation!) and single-pass light loop design (instead of the macro hell and ForwardAdd design in built-in).
    -------------------------------------------------
    So far so good, for a long time. We decided to go full URP without ever going back to built-in.
    -------------------------------------------------
    7. But as development goes on, this character shader becomes larger and larger, because the artists can't use Shader Graph to do what they want with it; instead they send me requests like "can you make this area glow with this mask and make the intensity animate with a sine wave? can you blend these 2 textures using this mask? can you do this... do that...", and I write more and more functions into this shader because of these requests.
    8. I use shader_feature to make these special functions toggleable for performance reasons (a small sketch of the pattern follows this list).
    9. Found that writing everything inside one big uber shader is just very hard; this shader has become too long, with lots of shader_feature options, and is very hard to read.
    10. I really need a way to separate the high-level logic (albedo, normal, emission, AO, ...) and the core lighting logic into different files.
    11. There is no good solution to this problem right now; I expect this shader will only get longer and longer in the future.
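    A tiny sketch of the shader_feature pattern from point 8 (the keyword, properties and function are made up for illustration; in the real shader there are many such toggles):

        #pragma shader_feature_local _GLOW_ON   // toggled per-material from the inspector

        TEXTURE2D(_GlowMask);  SAMPLER(sampler_GlowMask);  // hypothetical mask texture
        half4 _GlowColor;                                   // hypothetical glow tint
        half  _GlowSpeed;                                   // hypothetical animation speed

        half3 ApplyGlow(half3 color, float2 uv)
        {
        #if defined(_GLOW_ON)
            half mask = SAMPLE_TEXTURE2D(_GlowMask, sampler_GlowMask, uv).r;
            color += _GlowColor.rgb * mask * (sin(_Time.y * _GlowSpeed) * 0.5 + 0.5);
        #endif
            return color;
        }

        // Every such request adds another keyword and another #if block,
        // and the uber shader keeps growing.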

    To sum up, to me the whole shader development problem is:
    - 99% of the time the artists only care about high-level shader properties like albedo/normal/position/emission/AO/smoothness,
    - while the shader guy (me) only cares about low-level shader code like custom lighting, which receives the artists' high-level shader properties as input.

    But because the whole character shader is 100% code, the artists can only send me requests, and they can't do anything until I finish writing each request into the character shader.
    I also need to spend lots of time processing these requests.

    I just can't find a way to improve this situation.
    ===================================================================
    To me, the ideal way to develop a custom lit shader in URP is:
    - the shader guy (me) writes a template shader, which is exactly like an ASE template shader (it contains the lighting and handles all the extra passes like outline/ShadowCaster/DepthOnly), without defining high-level properties like albedo/normal/smoothness, the same idea as an abstract class in C#.
    - the artist/tech artist uses Shader Graph / a surface shader to define only the high-level properties like albedo/normal/smoothness, without caring about how the lighting is actually done.
    - Shader Graph is just a visual tool that writes surface shader files, and a surface shader can be used with any custom template shader.


    This is the workflow I expected from URP, but it turns out URP is not what I hoped for, because the shader guy (me) can't even write a custom lit master node for the artists to use in Shader Graph.

    I am quite sure adding more features to Shader Graph will NOT solve this problem, because the real problem is that people want to write their own custom lit master node, and they can't.

    Imagine artists could use Shader Graph as usual to produce a lit shader, but could now choose a custom lit master node written by the shader guy (me), or another custom lit master node from the Asset Store, instead of having only two options, URP's Unlit/PBR master nodes.
    This is the solution I have hoped for for years.
    This is how Shader Graph could be much, much more useful to me.
     
    Last edited: Apr 29, 2020
    FM-Productions, Hoagen, Fewes and 8 others like this.