
Official: Focusing on Shaders

Discussion in 'Shaders' started by marcte_unity, Sep 18, 2021.

Thread Status:
Not open for further replies.
  1. aleksandrk

    aleksandrk

    Unity Technologies

    Joined:
    Jul 3, 2017
    Posts:
    3,026
    Thank you for your input!
@jbooth we want to approach variant management a bit differently, yes. Not only for the blocks, though. Blocks do offer an opportunity to have conditionals on both levels (whole blocks and within blocks), but we may as well provide something for the existing shaders.
    @BOXOPHOBIC conditionals on the block level make this possible as well. We'll think about the best approach here - perhaps we could have this for plain shaders as well.
     
  2. BOXOPHOBIC

    BOXOPHOBIC

    Joined:
    Jul 17, 2015
    Posts:
    509
Yes, ShowIf can be used; I use custom shader GUIs and drawers for all my shaders, but it doesn't solve the issue of having many variants.

    The idea is to be able to use those "variant" defines for everything. It could be a new type of Toggle / Enum, that sets some "variant" keywords. Something like this. So this would show/hide props automatically, set any pragma, add a new multi_compile keyword, etc. Just throwing some ideas here...

Code (CSharp):
    Properties
    {
        _Color("Color", Color) = (0,0,0,0)
        _MainTex("_MainTex", 2D) = "white" {}

        // new toggle type
        [VariantToggle(_SNOW_ON)] _Snow("_Snow", Float) = 0

#if variant(_SNOW_ON)
        _SnowTex("_SnowTex", 2D) = "white" {}
        [Toggle(_SNOW_HIGH_ON)] _Snow_High("_Snow_High", Float) = 0
#endif

    }

    SubShader
    {
#if variant(_SNOW_ON)
        Tags { "RenderType"="Opaque" }
#else
        Tags { "RenderType"="Penguin" }
#endif

        CGINCLUDE
#if variant(_SNOW_ON)
        #pragma target 4.0
#else
        #pragma target 3.0
#endif

        ENDCG

        Pass
        {
            Name "Unlit"

            CGPROGRAM

#if variant(_SNOW_ON)
            #include "FancySnow.cginc"
            #pragma multi_compile_local _SNOW_HIGH_ON
#endif
     
  3. BOXOPHOBIC

    BOXOPHOBIC

    Joined:
    Jul 17, 2015
    Posts:
    509
    Can't wait to try it out!
     
  4. burningmime

    burningmime

    Joined:
    Jan 25, 2014
    Posts:
    845
I recently ran into a use case that may be relevant here. I needed to add animated vertex deformation to some meshes (e.g., squash and stretch, wobble, twist, extrusion along a Bezier curve, etc.). For example (slowed down to show the effect):

    deform.gif

    I ended up creating replacement materials for every material my characters used (easy enough for standard URP lit but tricky with the cloth and hair shaders I'm using, and an all-around time consuming effort) that altered the object space position input to each VS. Then at runtime, I replace the materials with the deformed versions and copy all the properties over. But that made me realize just how silly it is that something as abstract and generally applicable as vertex deformation is tied to the way that light is transmitted through hair or the color of shirt that a character is wearing. A vertex deformer could run on any vertex that has a position and normal, which is pretty much any mesh, and would be compatible with most vertex shaders (let's ignore the tangents in the room), meaning it's not actually tied much to the material.

    Would an API like this allow me to inject a deformation layer into an arbitrary shader via script (as an editor preprocess step), or is that a little bit outside its bailiwick?
     
    Last edited: Dec 8, 2021
  5. Invertex

    Invertex

    Joined:
    Nov 7, 2013
    Posts:
    1,550
That does sound like it would be a bit outside the scope, as it would involve modifying shaders that aren't part of this system. But having that kind of support built into shaders written with this system would be nice, though I imagine there may be some edge-case issues to figure out.

Regardless, there is another approach you could take: using a ComputeShader step to modify the mesh data that lives on the GPU. Earlier this year Unity exposed an API to directly reference the GPU-side mesh data and modify it from compute. Then whatever shaders are rendering that mesh will see those changes, without any need to modify those shaders.

This can potentially be much more performant too: instead of the vertex program in every shader pass running on the mesh doing those same calculations, you only do them once per frame from compute.
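The approach described above can be sketched roughly as follows, assuming Unity 2021.2+ where Mesh.GetVertexBuffer is available. The compute shader asset, its "Deform" kernel, and the _Vertices/_Time names are hypothetical, not part of any Unity API:

```csharp
using UnityEngine;

// Sketch only: dispatch a (hypothetical) compute kernel over a mesh's
// GPU-side vertex buffer so every shader rendering the mesh sees the result.
[RequireComponent(typeof(MeshFilter))]
public class GpuDeform : MonoBehaviour
{
    public ComputeShader deformCS; // assumed to contain a "Deform" kernel
    GraphicsBuffer vertexBuffer;

    void OnEnable()
    {
        var mesh = GetComponent<MeshFilter>().sharedMesh;
        // Allow raw (ByteAddressBuffer) access to the vertex data from compute.
        mesh.vertexBufferTarget |= GraphicsBuffer.Target.Raw;
        vertexBuffer = mesh.GetVertexBuffer(0);
    }

    void Update()
    {
        int kernel = deformCS.FindKernel("Deform");
        deformCS.SetBuffer(kernel, "_Vertices", vertexBuffer);
        deformCS.SetFloat("_Time", Time.time);
        // One thread per vertex, assuming the kernel uses [numthreads(64,1,1)].
        deformCS.Dispatch(kernel, (vertexBuffer.count + 63) / 64, 1, 1);
    }

    void OnDisable() => vertexBuffer?.Dispose();
}
```

The kernel itself would read and write vertex positions at the right byte offsets in the raw buffer, using the layout reported by the mesh's vertex attributes.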
     
  6. burningmime

    burningmime

    Joined:
    Jan 25, 2014
    Posts:
    845
    Oh, I just assumed they were read-only. So are they safe to modify by the time a camera starts rendering? 'Cause if so, problem solved!

Although I still think it would be cool to inject things into existing materials, and do so via an editor script. That would also help out Asset Store assets that require special shaders for effects (for example, Bakery's lightmapped specular or Magio's people-turning-to-stone effect).
     
  7. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
That is the whole idea of Stackables in Better Shaders, which we've been discussing the new system having an equivalent of in this thread.

Stackables allow you to combine multiple shaders in a kind of signal-chain approach. For instance, I have integrations for Bakery, Vegetation Studio, Trax, and other assets included with Better Shaders; adding these to your shaders is as simple as including them, or stacking them in a scriptable-object editor to create a new shader that has the features of both. Further, since it's a signal chain, you can write components and stack them together for different effects. This is how the features of the Better Lit shader are done. I have a Lit layer, with Wetness, Snow, and other effects stacked on it like Photoshop layers. You can take those stackables and put them onto other shaders written with Better Shaders, as well as fully test them independently as their own shaders.

You can even multi-stack: the texture layers in Better Lit Shader are just the same stackable stacked multiple times.

    So yeah, the idea here is that you could write your vertex deformation code, then apply it to your existing shaders without having to modify them.
     
  8. Invertex

    Invertex

    Joined:
    Nov 7, 2013
    Posts:
    1,550
I believe that for the skinned meshes you appear to be using in the video, it's safe, because you're modifying the skinning deformation output buffer that the shaders actually use, not the source buffer that is deformed from each frame. So your deformation one frame should not carry over to the next frame, just like in a shader.

When it comes to regular non-skinned meshes though, in OnEnable() you would usually want to create a secondary buffer (and destroy it in OnDisable()) to store the original mesh shape before your deformations are applied, and then use that as the source data whenever applying your deformations to the actual target mesh buffer that the shaders will be using. Otherwise your changes will be additive and build up each frame.

But in terms of safety, it does not modify the CPU-side mesh data. So the true source data is always safe and can be refreshed on the GPU by telling the Mesh component to update the data (but it's not ideal to have to send the mesh data to the GPU each frame, which is why it's better to create that source-data buffer in OnEnable, as I mentioned).
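A hedged sketch of that double-buffering pattern for non-skinned meshes; the copy step assumes Graphics.CopyBuffer is available in your Unity version (a trivial compute kernel can do the copy instead):

```csharp
using UnityEngine;

// Sketch only: keep a pristine copy of the vertex data so per-frame
// deformation always starts from the original shape instead of accumulating.
[RequireComponent(typeof(MeshFilter))]
public class DeformBuffers : MonoBehaviour
{
    GraphicsBuffer source; // untouched snapshot of the original vertices
    GraphicsBuffer target; // the mesh's live GPU vertex buffer

    void OnEnable()
    {
        var mesh = GetComponent<MeshFilter>().sharedMesh;
        mesh.vertexBufferTarget |= GraphicsBuffer.Target.Raw;
        target = mesh.GetVertexBuffer(0);
        source = new GraphicsBuffer(GraphicsBuffer.Target.Raw,
                                    target.count, target.stride);
        Graphics.CopyBuffer(target, source); // snapshot the original shape once
    }

    void Update()
    {
        // Each frame: read from 'source', write the deformed result into
        // 'target' (dispatch your deformation compute kernel here).
    }

    void OnDisable()
    {
        source?.Dispose();
        target?.Dispose();
    }
}
```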
     
    Last edited: Dec 9, 2021
  9. burningmime

    burningmime

    Joined:
    Jan 25, 2014
    Posts:
    845
Thanks! I was more worried about the timing of it. The GPU is going to do the skinning itself, so if you did the deformation sometime before that, the skinning would overwrite it. You'd need to find the perfect moment to inject into the render pipeline, after skinning is done but before rendering. I guess I should just test it and find out :)

Thanks; that makes sense. I guess asking for it to be injected into arbitrary materials is a little much; that would require Unity being able to rewrite any shader, which sounds like a nightmare. Maybe it could be done at the bytecode layer, but from re-reading the OP, it sounds like the Shader Foundry thing is going to be source-based.
     
  10. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
    Trying to use #pragma shader_feature_local_fragment has tripped me up several times now:

Since it's a pragma, if you write shader_feature_fragment_local, you get no warning but things just don't work. Now that these have more than one qualifier, it's pretty easy to get the order wrong, so it would be really nice if it either threw an error or accepted the qualifiers in either order. I prefer the error, because if a third one comes, ugh...

Also, it appears this (the fragment extension) doesn't work on Oculus in 2019.4. It does work on Oculus in 2021, but at some point the fragment option was added and wasn't really reliable until much later. So you might want to note that in the docs. I didn't bother to file a bug because I'm sure it would just be greeted with the "upgrade" response.
     
  11. AcidArrow

    AcidArrow

    Joined:
    May 20, 2010
    Posts:
    11,794
Yup, I've already done that a bunch of times; debugging it wasn't fun at all.
     
  12. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
On the other hand, if they're building an AST for each stage and doing analysis of what's being used, marking stages for features should no longer be required, right? It should just be able to tell if _FOO is used in the fragment stage.
     
  13. AcidArrow

    AcidArrow

    Joined:
    May 20, 2010
    Posts:
    11,794
I don't know if I would like it to be fully automatic (especially if, in typical Unity fashion, it doesn't really let me know what stage it ends up being, or that information is hard to get).

But an error if I mark it as fragment and then use it outside that stage would be very much appreciated; that way, if a shader has a keyword declared with fragment and I don't see an error in the console, I know what stage it's used in.
     
  14. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
    Yeah, I suppose this gets to the point of knowing what the system is doing automatically vs. manually specifying it. Better Shaders will automatically handle a bunch of things for you, such as what data gets interpolated across the v2f stages. It scans your code and figures this out (did you write .VertexColor in your code?), but can't handle variants automatically (#if USE_VERTEXCOLOR), so you have to write some option blocks that create a little contract that turns into an #if #endif around the data for you.

    But when developing, I will spit out the raw shader it generates to inspect exactly what it's doing, both to make sure Better Shaders is doing the right things, but also to make sure I haven't accidentally caused it to include something it shouldn't and ruined an optimization or broken the contract I've told it I've made.

    So I guess there's some question of what Unity will provide here with their system. Some of the things that have been talked about happen below the level of having a "shader output"- for instance, being able to determine the appdata/v2f structures for each variant by scanning the code. Currently that has to be handled by wrapping the data in various #if #endif checks.

    But if the shader generation is going to automatically infer that for each variant, you wouldn't be able to easily see that from looking at some kind of shader output, unless that output was for a given set of keywords after the stripping has happened. And while you'd still be able to go into RenderDoc and see exactly what it's doing, it might be nice to have some kind of way to check things which are done "automagically" to make sure they are doing what you think they are. I do believe that, if the system really does do proper analysis, this will be required less than in something like Better Shaders which does a bit of a compromise around these kinds of issues.
     
    AcidArrow likes this.
  15. aleksandrk

    aleksandrk

    Unity Technologies

    Joined:
    Jul 3, 2017
    Posts:
    3,026
    Just FYI... This only works on graphics APIs that have the fragment stage separate from the others. The APIs that combine stages together (GL, GLES, VK) treat it the same way as a directive without the stage suffix.
    Oculus shouldn't be affected by having "_fragment" at all.
     
  16. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
    Well in 2019.4, if you use _fragment, none of those features are recognized as on and the shader just falls back to having them all turned off. If you remove _fragment, everything works fine. In 2021, however, the shader still works with the _fragment modifier.

    So in this case, by not working, I don't mean that it's not optimizing the shader correctly, I mean the shader doesn't render correctly at all.
     
  17. aleksandrk

    aleksandrk

    Unity Technologies

    Joined:
    Jul 3, 2017
    Posts:
    3,026
@jbooth looks like there was a bugfix in 2020 that didn't go into 2019 :(
We'll fix it.
     
  18. AcidArrow

    AcidArrow

    Joined:
    May 20, 2010
    Posts:
    11,794
So after I'd stumbled upon this and taken a whole day to debug it, I... just did it once again.

    The thing is, to be sure, I opened the Unity docs to see the formatting:
    https://docs.unity3d.com/2021.2/Documentation/Manual/SL-MultipleProgramVariants.html
    upload_2021-12-13_13-16-18.png

    The manual has them wrong...

@aleksandrk can you ping the docs folks to fix it? I clicked "Report a problem" on the docs page, but the last time I did that for a different docs problem, it took around three years for the page to be updated with correct information.
     
  19. aleksandrk

    aleksandrk

    Unity Technologies

    Joined:
    Jul 3, 2017
    Posts:
    3,026
    I'd rather fix it so that the order is not important :)
     
  20. AcidArrow

    AcidArrow

    Joined:
    May 20, 2010
    Posts:
    11,794
    Well, if we’re getting greedy, why not both? :p
     
  21. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
Except that's not going to magically backport its way into existing versions. Personally, I'd rather have it error; if it supports both orders in some versions but not others, then we get stuff working fine in some versions and not others, with no idea why.
     
  22. graskovi

    graskovi

    Joined:
    May 28, 2016
    Posts:
    14
Does this mean that for the Shader Foundry system the shader code itself will actually be written in C#? I know that the syntax of the Unity Mathematics library is shader-based; will the C# API be an HPC# API?
     
  23. aleksandrk

    aleksandrk

    Unity Technologies

    Joined:
    Jul 3, 2017
    Posts:
    3,026
    No. The code is still going to be in HLSL.
     
  24. Grimreaper358

    Grimreaper358

    Joined:
    Apr 8, 2013
    Posts:
    789
With this, could we do material layering through Shader Graph?
     
  25. funkyCoty

    funkyCoty

    Joined:
    May 22, 2018
    Posts:
    727
    Honestly you guys should just hire this jbooth guy.
     
  26. aleksandrk

    aleksandrk

    Unity Technologies

    Joined:
    Jul 3, 2017
    Posts:
    3,026
  27. Ruchir

    Ruchir

    Joined:
    May 26, 2015
    Posts:
    934
    Any updates on this? :D
     
  28. aleksandrk

    aleksandrk

    Unity Technologies

    Joined:
    Jul 3, 2017
    Posts:
    3,026
    On what exactly? :)
    On the main topic or on some of the side posts/questions?
     
  29. Grimreaper358

    Grimreaper358

    Joined:
    Apr 8, 2013
    Posts:
    789
There's progress on GitHub: https://github.com/Unity-Technologies/Graphics/commits/shader-foundry/staging

There are other branches for Shader Foundry as well.

GitHub is your friend if you want to keep an eye on upcoming features/progress (whatever is made public) for most of the graphics work. Currently, it seems like two things are happening.

Shader Foundry is being developed (as mentioned in the main post) along with Shader Graph 2.0 (built on GTF, Graph Tools Foundation, the newer graph API), which seems to be the new UI and UX for handling Shader Foundry and its features.
     
  30. Ruchir

    Ruchir

    Joined:
    May 26, 2015
    Posts:
    934
    On the main topic :D
     
  31. aleksandrk

    aleksandrk

    Unity Technologies

    Joined:
    Jul 3, 2017
    Posts:
    3,026
    We're working on it :)
    We'll share more info when we're ready to do so.
     
  32. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
    One thing that would be really useful is some sense of keyword hierarchy to help the compiler eliminate variants. For instance, the bakery stackable for Better Lit Shader has these keywords:

Code (CSharp):
    #pragma multi_compile _LIGHTMAPMODE_STANDARD _LIGHTMAPMODE_RNM _LIGHTMAPMODE_SH _LIGHTMAPMODE_VERTEX _LIGHTMAPMODE_VERTEXDIRECTIONAL _LIGHTMAPMODE_VERTEXSH

    #pragma shader_feature_local USEBAKERY
    #pragma shader_feature_local BAKERY_VERTEXLMMASK
    #pragma shader_feature_local BAKERY_SHNONLINEAR
    #pragma shader_feature_local BAKERY_LMSPEC
    #pragma shader_feature_local BAKERY_BICUBIC
    #pragma shader_feature_local BAKERY_VOLUME
    #pragma shader_feature_local BAKERY_VOLROTATION
    #pragma shader_feature_local BAKERY_COMPRESSED_VOLUME
That big multi_compile adds a lot of variants to every shader, but they are only needed if USEBAKERY is true. So the whole system gets massively larger even if you're not using it. I currently provide a stripper to help with this, but if I could express some form of parent/child relationship, the compiler could skip a massive number of variants automatically.
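For context, the kind of stripper mentioned above can be written against Unity's IPreprocessShaders editor build callback. A rough sketch follows; the class and the choice of _LIGHTMAPMODE_RNM as a representative child keyword are illustrative, not Better Shaders' actual stripper:

```csharp
using System.Collections.Generic;
using UnityEditor.Build;
using UnityEditor.Rendering;
using UnityEngine;
using UnityEngine.Rendering;

// Sketch only: drop Bakery lightmap-mode variants whenever the parent
// USEBAKERY feature is disabled for that variant.
class BakeryVariantStripper : IPreprocessShaders
{
    static readonly ShaderKeyword useBakery = new ShaderKeyword("USEBAKERY");
    static readonly ShaderKeyword rnm = new ShaderKeyword("_LIGHTMAPMODE_RNM");
    // ...one ShaderKeyword per non-default _LIGHTMAPMODE_* child keyword.

    public int callbackOrder => 0;

    public void OnProcessShader(Shader shader, ShaderSnippetData snippet,
                                IList<ShaderCompilerData> data)
    {
        for (int i = data.Count - 1; i >= 0; i--)
        {
            var keywords = data[i].shaderKeywordSet;
            // Parent off + non-default child on => the variant can never be used.
            if (!keywords.IsEnabled(useBakery) && keywords.IsEnabled(rnm))
                data.RemoveAt(i);
        }
    }
}
```

The pain point is that this has to ship as a separate editor script alongside the asset; a parent/child declaration in the shader itself would let the compiler skip those variants without a user-installed stripper.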
     
  33. marcte_unity

    marcte_unity

    Unity Technologies

    Joined:
    Sep 24, 2018
    Posts:
    17
    That's a really cool idea, Jason, especially for people sharing assets or making significant additions to a shader library. We'll definitely keep it in mind once we have time to overhaul how we manage variants.

    I know what many of you are thinking - "Why not just do exactly what Jason said, it couldn't take more than a few weeks, right? This is a corporate management brush off!" And yeah. We have coders on the team who could whip out a system like that really quickly. But that's how we end up with individual features that don't talk to each other, don't really make sense together, and cause bugs, regressions, and conflicts. For better or worse (hopefully mostly better), we've chosen as a company to slow down, think carefully about what we're building, and ship larger coherent workflows instead of bandaid fixes and one-off features. It's not perfect yet but it's starting to pay off in quality.

    We do plan some bandaids for variants, because those are totally out of hand and the pain you're all feeling is unsustainable! But for things like Jason proposed, that involve adding major new concepts to the process of working with shaders in Unity, we're going to take our time and get it as close to "right" as possible. And keep the concepts coming! We won't get it right if we don't hear your ideas, both on what's wrong now as well as good ways of doing it better.
     
    Last edited: Feb 16, 2022
  34. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
My biggest fear with the proposed direction you're heading in is that you get constrained by the framework you're building, or that the framework is not 'staged' enough in terms of how code is processed. Having an AST available to understand the code and produce better output is a great tool, and having a lot of the structure shared with the shader graph seems great, but if it means adding new features is painful or slow, or requires intimate knowledge of the system, then a simple text-fragment mashup like I've built is actually superior, because it's infinitely flexible and not bound by other systems' needs.

For instance, the original mistake of Shader Graph was not separating the pipeline abstraction from the graph tool; had an abstraction layer been built separately, a shader graph or a text parser could have fed that same abstraction. So I guess what I would recommend is keeping a couple of oddball cases in mind, like the various interpolation modifiers you can add to a v2f struct, or how much work adding support for newer HLSL features (say, interfaces, etc.) is going to be. In a solution like mine, adding these things was trivial, because it's really just a matter of putting the right text into the right places, and I don't need to understand what any of that text means. But in a shader graph, many of those things can have much higher integration costs, and it would be a failure to have the entire shader system in Unity anchored to the restrictions of the visual paradigms of a graph. The same is true for any AST you build: if it needs to understand the new constructs, then integrating new features has a much higher cost.

Also, please let me know when there is an example of what a text-based shader might look like in this new system. I've been monitoring the depot a bit, but so far it seems like it's all SG restructuring.
     
  35. abovenyquist

    abovenyquist

    Joined:
    Feb 13, 2014
    Posts:
    14
    Unity is always buying things and integrating them with Unity. Here it seems like it would make the most sense for Unity to just buy Better Shaders from jbooth and make it the official solution.
     
  36. BOXOPHOBIC

    BOXOPHOBIC

    Joined:
    Jul 17, 2015
    Posts:
    509
We have businesses to run here and taxes to pay. Our assets are getting more and more outdated by the day; we are currently working with 9 pipelines with no documentation for the changes, many hours a day spent on support, and a barely usable Shader Graph for anything a bit more complex, just because Unity has failed to provide a shader abstraction in 5 years.

So, I'm just wondering: when is the new shader system planned to be released? Thanks!
     
  37. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
From an asset-production standpoint, you're likely 3-5 years out at minimum. Even if they release it next year, which given what we've seen in terms of updates is unlikely, it will only work in some very new version of Unity, so it won't be viable for assets for some time.
     
  38. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
Frankly, it is less risk to use Jason's Better Shaders, because supporting all those pipelines yourself is already 9 people's worth of work. If Jason stopped tomorrow, you would still have saved 8 staff. Dramatic, but you get the picture. I think that product is targeted more at the asset store developer.
     
  39. Invertex

    Invertex

    Joined:
    Nov 7, 2013
    Posts:
    1,550
Which 9 pipelines are these? As far as I'm aware, there are only really 3, or 4 if you count the Hybrid Renderer, but that utilizes the URP/HDRP shaders.
     
  40. BOXOPHOBIC

    BOXOPHOBIC

    Joined:
    Jul 17, 2015
    Posts:
    509
I was a bit optimistic and expected to see something in 2022.2. Of course, that's not the case. I just tested Shader Graph as an alternative to Amplify in the newly released beta, and there are no new features there either, nor features that would help productivity, like:

    - Shortcuts!
    - Portals (register variable)
    - Custom attributes on properties
    - Support for some basic stuff like global arrays
    - The need to add _float in includes, making any existing library unusable
    - Option to always disable the previews
    - Function switches (one of the greatest Amplify features)
    - Pipeline Switch node
    - Option to add defines, stencils, etc.
    - Bugs not fixed in years (I wonder if instancing is fixed in BiRP)
    - And a lot more

Basically, the features that make Amplify great compared to SG. And of course, I suggested most of the above on the roadmap platform probably 3 years ago. But anyway, who cares what some developer thinks who's been around on the store for 5-6 years, working with all these tools on a daily basis; some upper management person knows better.

We had tools like Strumpy Shader Editor, Shader Forge, and Amplify, but no, Unity is not taking the best of any of them, but reinventing the wheel, and making it square.
     
  41. AcidArrow

    AcidArrow

    Joined:
    May 20, 2010
    Posts:
    11,794
    I think it's at least 2 more years away.
     
  42. BOXOPHOBIC

    BOXOPHOBIC

    Joined:
    Jul 17, 2015
    Posts:
    509
I have an ecosystem of 8-9 assets built around Amplify, which, as one of the biggest Asset Store developers, couldn't keep up with the pipeline mess! So rewriting everything in Better Shaders would mean a lot of work, would probably require changes (which would mean changes for all my users), would introduce new bugs, and for what? What if Jason decides tomorrow to make music instead of shaders?
     
  43. BOXOPHOBIC

    BOXOPHOBIC

    Joined:
    Jul 17, 2015
    Posts:
    509
Yes, 3 pipelines, but count the major Unity releases as well: none of the shaders built for one pipeline work across Unity versions, except the built-in surface shaders, which use the abstraction we need and work from Unity 5 to 2022.2 without issues.

Why do I have to care how the decals code changes in each HDRP version? Or how the lighting is calculated, or all the tons of keywords defined when some feature is enabled, etc.? All I need to care about is how my albedo, normal, metallic, etc. should look. Surface shaders!
     
  44. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
Yeah, but what you are doing is the work of one person per pipeline; it's a ridiculous cognitive load. It's why I never want to publish my terrain engine or tools.
     
  45. BOXOPHOBIC

    BOXOPHOBIC

    Joined:
    Jul 17, 2015
    Posts:
    509
Exactly that! Instead of going forward and releasing products that help developers, or should I say "democratising game development", we are still trying to figure out why one of the tens of features in HDRP is not working with my shaders.
     
  46. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
And deploying that to users in a slick way is additional work: having to make specific support packages, etc.
     
  47. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
    URP7.x, URP8.x, URP9.x, URP10.x, URP11.x, URP12.x, URP13.x, HDRP7.x, HDRP8.x, HDRP9.x, HDRP10.x, HDRP11.x, HDRP12.x, HDRP13.x, Standard

    That's to support every version of Unity 2019 and up, as every one of these either added a new feature or breaking change and required some work to support.

    I round that down by not supporting tech releases, so URP7.x, URP10.x, URP13.x, HDRP7.x, HDRP10.x, HDRP 13.x and Standard.

It would not be a big delta to have Amplify output the Better Shaders format as a target, and then have Better Shaders be the core abstraction. With Better Shaders' stackable system, this means users could write their shaders in graph or text and combine them together (i.e., I write a nice base shader in text, someone stacks a snow shader done in a graph onto it, everything works). To me this is the dream version of shader authoring, but writing a shader graph just for Better Shaders doesn't make a ton of sense, since it's not like Better Shaders pays the bills exactly.

This is far less likely than Unity killing this initiative halfway through. LWRP was production-ready for almost a month before it was deprecated. You think you're any safer relying on Unity? Their track record for abandoning features and assets is far worse than most of the good asset store developers'.
     
  48. BOXOPHOBIC

    BOXOPHOBIC

    Joined:
    Jul 17, 2015
    Posts:
    509
Better Shaders + Amplify would be a dream come true. Did you ever check what would be needed to make it happen? The template system is quite versatile, but I currently don't know the ins and outs of Better Shaders or what would be needed to make it work. I'm happy to help with anything if you ever decide to take a look at it. And before anybody says this is off-topic: we are focusing on shaders.
     
  49. DEEnvironment

    DEEnvironment

    Joined:
    Dec 30, 2018
    Posts:
    437
I agree with Jason about not trying to officially support tech releases, and even more so pre-releases. As developers, we need to ask which versions we should support.

First, let's look at how Unity releases API versions and their relationship with Unity versions. The Unity release cycle has 3 main stages:
Unity xxxx_1 = Pre-release
Unity xxxx_2 = Tech release
Unity xxxx_3 = Official release (LTS)
    upload_2022-7-19_18-22-18.png

xxxx_1 Pre-release
This cycle commonly uses the API from the last official release; however, from experience, it is usually missing large chunks of code, and developers should avoid pre-releases unless they fully understand they're not intended for production use.

xxxx_2 Tech release
This cycle is a good starting point when asking what's new. It's almost ready for production but may still have small bugs; this cycle is sort of a battle-testing ground to flush out problems.

xxxx_3 Official release
This is the most stable cycle and the one suggested for production use.

Let's have a look at the history of the HDRP 10x API and see what we can learn.
Unity 2020.3 has 36 versions, from API 10.3.2 through API 10.9.0, plus one more, 10.10.0, as of last week.

The main transition from pre-release into tech release happened in 10.2.2, shortly after the tech-release cycle started. This release had major changes, such as removing the distortionvectors pass and adding the scenepicking pass.

Looking ahead into 11x, the pre-release for 12x, Unity again has missing chunks of code. In this case, Unity is missing the entire section for the new picking pass that was added in 10x, basically making 11x impossible to support correctly. This history can also be found in the older 6x, 7x, and 8x cycles, showing this is not a one-off but rather a well-documented trend of Unity repeating the mistake of missing sections in pre-releases.

What did we learn?
Pre-release = "bad egg", so take a lesson from history and wait just a bit longer; you are much better off developing against later cycles.

Below is a screenshot that will help others understand what we are talking about regarding the relationship between API versions and Unity official-release cycles. We basically know every breaking change in detail; some have been fixed, and others are forgotten to the past, hopefully never to return.

Note: 13x is a pre-release; however, it is the first pre-release I've found that isn't fully messed up or missing massive chunks of code. For me, I am still skipping it for more dangerous ground in the 14x beta, since that will eventually turn into something supported.

    upload_2022-7-19_18-30-42.png

    upload_2022-7-19_18-31-20.png
     
  50. DEEnvironment

    DEEnvironment

    Joined:
    Dec 30, 2018
    Posts:
    437
@aleksandrk
Hello sir,
just a heads up about the 12x API adding fragment with fog:
#pragma shader_feature_local_fragment _ENABLE_FOG_ON_TRANSPARENT

It had trouble working with custom shaders, and we are still stuck using
#pragma shader_feature_local _ENABLE_FOG_ON_TRANSPARENT

I have no idea why, but this seems to be the only frag change that still has stability issues.

I wish we could have just one pragma instead of needing 3 or 4 for just one MATERIAL_FEATURE.
I don't know why we need pragma / pragma + local / pragma + local + raytrace instead of just one keyword.

Do you have any plans to clean all this up?
     