
Official Block Shaders: Surface Shaders for SRPs and more (Public Demo Now Available!)

Discussion in 'Graphics Experimental Previews' started by dnach, Oct 19, 2022.

  1. burningmime

    burningmime

    Joined:
    Jan 25, 2014
    Posts:
    845
    That's not at all the impression I get from Unity's recent moves. The trend over the last 5+ years was away from the asset store model (at least for graphics plugins).

    HDRP is an example of a very Unreal-like approach -- make stuff difficult to extend and provide tools that cover 90% of use cases for artists. With the old Unity, you could buy a volumetric cloud solution from the asset store, a sky/weather solution from the asset store, a water system from the asset store, a GI solution from the asset store, a couple different skin/hair/foliage solutions... HDRP has all of this baked in. Some of it is a bit worse or less performant than a dedicated asset, but it's *there*, and it's free (or part of your subscription fee), and it's easy, and it's probably going to be supported for a few years (and AAA studios love it, since they can pay for reliable support instead of being at the whim of an asset store developer).

    Meanwhile, if you want to create a shader that works with HDRP or URP and sell it on the asset store, you've been left high and dry for years, reverse engineering the whole thing yourself. They pulled custom nodes from Shader Graph, and every version bump of URP/HDRP has broken a thousand different assets on the store.

    Block Shaders are one of the first steps Unity has taken toward supporting plugin/asset developers on the SRPs.
     
    tsukimi, Gooren and echu33 like this.
  2. sacb0y

    sacb0y

    Joined:
    May 9, 2016
    Posts:
    936
    Again, I'm still not seeing it beyond "general unity issues".

    You should really get down to specific use cases.

    I can understand wanting BIRP to stick around and be supported beyond providing an upgrade or conversion path. But #1, it's way too late for that, and #2, URP at this point has borderline feature parity, and the people who made URP likely still work at Unity, unlike the people who made BIRP. That's just a guess, but still.

    And even URP's base feature set (parity aside) still feels old and very outdated for anything but a phone game currently, maybe Switch, especially if someone started a game today to release in 2-3 years. So, what, you want them to replace URP with BIRP so it can ultimately be relegated to phones? Cause that's more or less what's happening. Once it's easier to make an HDRP-focused game also release on lower-end devices (through URP), even if HDRP has some issues, it's still the better choice long term.

    In the end I don't understand why that seems like the core of your problem. Cause it's the aspect that's least likely to change.

    We're WAY beyond fads in terms of transitional technology right now on multiple fronts, anyone can see that.
     
  3. wwWwwwW1

    wwWwwwW1

    Joined:
    Oct 31, 2021
    Posts:
    784
    I agree that Block Shader support in BIRP is necessary, because there are things Shader Graph cannot do.

    People who are using BIRP can spend time migrating all their Surface Shaders to Block Shaders and reuse them once URP becomes the default RP.

    But I disagree that Unity should:
    1. Spend time implementing Forward+ & SRP Batching in BIRP (slowing down the progress of the SRPs).
    2. Add BIRP Surface Shader (DX9-style grammar) code to HDRP and support all features (complex lighting models, ray tracing, compute, tessellation) with DX9 grammar.
    3. Deprecate the Deferred rendering path, so that people are forced to use HDRP for Deferred.
    4. Deprecate DOTS after 1.0 (to be released soon) and let users wait for another solution.
    Some of these are obviously difficult or impossible for decision makers to consider (there's no turning back).

    As for the first one:
    I remember that people can increase the maximum light limit in the BIRP settings (at a high cost in performance).

    BIRP lighting uses one pass per light, so SRP Batching may not be very useful in BIRP, because there will still be many draw calls.
     
  4. marcte_unity

    marcte_unity

    Unity Technologies

    Joined:
    Sep 24, 2018
    Posts:
    17
    All this feedback is useful, and it's fantastic that people are so engaged in this topic. It means our team is working on something important - which we know, and you know, but it's helpful to be able to point to a 100-reply thread to remind the rest of Unity of the value!

    I do want to address some of the general feedback we've received - we're a team of 4 devs, 1 product manager, and a lead (me). We're working as hard as we can on shipping the best Shader Management tools that we can, for all the million-plus customers we support. For those who have never worked at a 6,000-person company: our team has very little influence on what the other 5,994 people do. I've never even met 90% of them! We can try our best to influence large-scale decisions like "should we maintain this old system or replace it", but we have no direct control over that. So know that the people responding in this channel are fully committed to building Shader tools that will last for the next decade, and the more specific feedback we get, the better they will be.

    I'm not saying you can't yell at us about "Unity problems" - we pass that sentiment along internally as best we can. We're just paying the most attention to constructive feedback that can improve what the 6 of us deliver next year.
     
    florianBrn, rdjadu, Ziddon and 25 others like this.
  5. echu33

    echu33

    Joined:
    Oct 30, 2020
    Posts:
    68
    I agree. Maintaining a URP shader with a custom lighting model is a crazy pain right now. Even just supporting Unity 2019-2021 means custom code for URP 7, 10, and 12, re-writing all the shader passes' include files for each version :rolleyes:.

    And HDRP is another beast; I was forced to step back from custom-coding it and just use Shader Graph with custom nodes instead. The out-of-the-box results of this Unreal-like approach are really good compared to what URP has to offer, but it does limit the freedom of plugin/asset development.

    So I'm really looking forward to this Block Shader system.

    Keep up the good work!
     
  6. RegdayNull

    RegdayNull

    Joined:
    Sep 28, 2015
    Posts:
    68
    What about stencil? Or will it be a case of "you do not need it"?
     
  7. FronkonGames

    FronkonGames

    Joined:
    Feb 24, 2022
    Posts:
    16
    It would be great to be able to create post-processing shaders with this system, have you considered it?
     
  8. burningmime

    burningmime

    Joined:
    Jan 25, 2014
    Posts:
    845
    Do you mean combining multiple post-processing effects into an "uber-post" kind of thing? E.g. you have SSAO, SSGI, and an outline pass that all need to reference the depth and normal textures, so you do them in one big shader to reduce the costs of sampling and memory bandwidth?

    If so, that's a great idea!!!!!
     
    JesOb likes this.
  9. FronkonGames

    FronkonGames

    Joined:
    Feb 24, 2022
    Posts:
    16
    I meant full screen effects, but that's a good idea.
     
  10. aleksandrk

    aleksandrk

    Unity Technologies

    Joined:
    Jul 3, 2017
    Posts:
    3,042
    Configuring stencil will definitely be supported.

    Definitely.
    I think it's most interesting for adding effects that can be done without running a separate pass; but making a custom post-processing uber-shader with this would be an easy deal as well.
     
  11. dnach

    dnach

    Unity Technologies

    Joined:
    Mar 9, 2022
    Posts:
    90
    As Aleks mentioned above, Stencil will definitely be supported. In fact, it should already work in the public demo linked in the announcement (first post), by declaring the stencil commands inside a Block declaration:
    Code (CSharp):
    Block SomeBlock
    {
        Commands
        {
            Stencil
            {
                Ref <ReferenceValue>
                Comp <CompareOperation>
            }
        }
    }
    When we release Block Shaders the syntax will be somewhat different though, in order to simplify/unify the RenderState declarations.
     
    burningmime likes this.
  12. RegdayNull

    RegdayNull

    Joined:
    Sep 28, 2015
    Posts:
    68
    That's nice. Will wait for the full release. Hope it will be soon.
     
  13. Grix

    Grix

    Joined:
    Mar 23, 2014
    Posts:
    12
    Thanks for this; it's one of the only things holding me back on BIRP, because my project depends on surface shaders.

    May I ask what the estimated time for this to be released is? I am new to Unity, so I don't know how fast things usually progress from the alpha stage. Are we talking years until it's in production, or might it already be released in the first 2023.x version?
     
  14. Razmot

    Razmot

    Joined:
    Apr 27, 2013
    Posts:
    346
    float4/half4 vertex position?

    Will it be easy to use a float4 vertex position (or a half4 position)?
    It's a very interesting feature and optimization for procedural meshes that are built with
    Code (CSharp):
    new VertexAttributeDescriptor(VertexAttribute.Position, VertexAttributeFormat.Float16, 4)
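    For context, here's a fuller sketch of that half-precision setup using the standard Mesh vertex-buffer API (the vertex count and the packed data array are illustrative placeholders, not part of the original post):
    Code (CSharp):
    using Unity.Mathematics;
    using UnityEngine;
    using UnityEngine.Rendering;

    // Sketch: store positions as four half-precision floats (8 bytes per vertex)
    // instead of three full floats (12 bytes per vertex).
    Mesh mesh = new Mesh();
    mesh.SetVertexBufferParams(vertexCount,
        new VertexAttributeDescriptor(VertexAttribute.Position, VertexAttributeFormat.Float16, 4));
    mesh.SetVertexBufferData(packedPositions, 0, 0, vertexCount); // packedPositions: half4[] vertex data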
     
  15. aleksandrk

    aleksandrk

    Unity Technologies

    Joined:
    Jul 3, 2017
    Posts:
    3,042
    We'll share when we're ready :)

    The system will make it possible in general (we don't limit the type for declaring attributes).
     
    cecarlsen likes this.
  16. burningmime

    burningmime

    Joined:
    Jan 25, 2014
    Posts:
    845
    Is this something that could be put into packages in the package manager and declared as a requirement in the manifest file instead of requiring users to download a completely separate editor?
     
  17. aleksandrk

    aleksandrk

    Unity Technologies

    Joined:
    Jul 3, 2017
    Posts:
    3,042
    @burningmime It will be part of the Editor, not a package.
     
    OCASM, saskenergy, JesOb and 2 others like this.
  18. imblue4d

    imblue4d

    Joined:
    May 27, 2016
    Posts:
    110
    What is the probability of this making it to 2023.1?
     
  19. Onigiri

    Onigiri

    Joined:
    Aug 10, 2014
    Posts:
    494
    2023.1 is already in beta, so... zero
     
  20. imblue4d

    imblue4d

    Joined:
    May 27, 2016
    Posts:
    110
    I see, thanks. I forgot that it's not a standalone package.
     
  21. ElliotB

    ElliotB

    Joined:
    Aug 11, 2013
    Posts:
    295
    I've had a read through the documentation, but I still can't find any mention of adding extra passes, e.g. for better outlining. Do block shaders/templates/foundry offer a way to create a shader template with an additional pass?
     
    LooperVFX likes this.
  22. aleksandrk

    aleksandrk

    Unity Technologies

    Joined:
    Jul 3, 2017
    Posts:
    3,042
    @ElliotB Yes, we plan to support this.
     
    ElliotB and LooperVFX like this.
  23. SoyUnBonus

    SoyUnBonus

    Joined:
    Jan 19, 2015
    Posts:
    43
    Hi everyone! I work at Milkstone Studios (we've released Ziggurat 2 and Farm Together lately, maybe someone here knows them?) and I have a huge library of cginc files that I've created while releasing games made in Unity (we started back in 2014, after leaving XNA).

    I compose them by using lots of defines in the shader code, and it would be a waste to throw all that knowledge out the window.

    Currently, all of our games use the built-in pipeline, as there's always something we can't do with URP/HDRP. We usually want games that look good enough but still run on the Nintendo Switch, for example. So URP was the only choice, but it missed a lot of features that worked in built-in.

    I would really like to move over to URP with Forward+ in our future titles, as it seems it's 'almost there' compared to built-in, but being able to keep re-using my shader code library is a must, so this Block Shader thingy sounds great. Ease of use, but still text-based. I really like where this is going, at least for now.
     
    Last edited: Mar 7, 2023
    OCASM, Ofx360 and Edy like this.
  24. peter226

    peter226

    Joined:
    Apr 6, 2015
    Posts:
    5
    Hey there! This is seriously cool! I wonder, will there be a Visual Studio extension supporting this new workflow? I would love it if there were IntelliSense suggesting available keywords and parameters, and a way to find where something is declared, even if it's in a different file. Not long ago I found out JetBrains Rider has some amount of support for these with the current shader workflow, but I am really not comfortable with that IDE.
     
  25. MKGameDev

    MKGameDev

    Joined:
    May 14, 2013
    Posts:
    70
    I'm a bit late to the party, but also have some thoughts on this. I'm speaking from the point of view of a graphics focused asset store publisher.

    I think some kind of a new system is needed and it looks very promising at first glance. I haven't followed all of the graphics related news, so I got a bit confused at first. If I remember correctly there were some kind of "SRP Surface Shaders" on the public roadmap and I also read about the "Block Shaders", but it turned out the Block Shaders are the actual replacement for the Surface Shaders and the future of shader development.

    I really like the "blocky" approach on this. It feels very modular and maybe creating shaders will be fun again in the future. However it feels like this system is too much for people who quickly want to create shaders without messing around with the tech (shader graph users), but lacking for people who want to create shaders from scratch (code based).

    The current problems:
    1) Every few months you can expect your shaders to break completely.
    2) Every few months you have to re-learn the whole shader library of each render pipeline (URP, HDRP).
    3) Every few months you have to figure out all new, changed, or removed shader features by yourself. This includes keywords, render loop related shader code, new properties, and render pipeline library changes.
    4) Every few months you have to merge the changes of the previous version with the new version (from 2019.4 up to 2023.2 there are like 3 different light loops you have to handle in the fragment shader of the Universal Forward pass).
    5) Every few months you have to test your shaders against every new feature, render pipeline setup, and platform.
    6) You have to somehow manage duplicate code you can't avoid. For example, for forward-only shaders (URP) you have to manage the "UniversalForward" and "UniversalForwardOnly" LightMode tags in the forward rendering pass (since 2021.2, where deferred rendering was introduced).
    Your options here:
    - Manage multiple shader files
    - Manage duplicate passes using the PackageRequirements block
    - Create a custom importer and change the tag accordingly
    7) Super important pre-processor definitions have been missing since the beginning. Why are there no "UNIVERSAL_RP_ACTIVE", "HIGH_DEFINITION_RP_ACTIVE" and "BUILT_IN_RP_ACTIVE" pre-processor definitions for both C# scripting and shaders? Something like this should be set based on the active render pipeline asset.
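    A rough sketch of the kind of check that could back such definitions today (there is no built-in symbol; the type-name matching below is purely illustrative):
    Code (CSharp):
    using UnityEngine.Rendering;

    // Illustrative helper: infer the active pipeline from the assigned RP asset.
    // GraphicsSettings.currentRenderPipeline is null when built-in is active.
    static string ActivePipeline()
    {
        var rp = GraphicsSettings.currentRenderPipeline;
        if (rp == null) return "BUILT_IN_RP_ACTIVE";
        string name = rp.GetType().Name;
        if (name.Contains("Universal")) return "UNIVERSAL_RP_ACTIVE";
        if (name.Contains("HDRenderPipeline")) return "HIGH_DEFINITION_RP_ACTIVE";
        return name;
    }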

    In addition, I want to point out that everything mentioned above is something you have to do without any proper documentation. At the moment I don't support HDRP, because URP alone is already bringing me close to a mental breakdown. And URP is nothing in comparison to HDRP.

    What I would expect from the new system in the best case:
    1) Creating one shader file which compiles on built-in, URP, and HDRP. No pink materials in any case.
    2) It updates and works correctly on every editor / render pipeline release.
    3) The ability to create all types of shaders: regular 3D, 2D, particles, terrain [...].
    4) Control over pass-related settings and ShaderLab commands (blending, tags, ZWrite, ZTest, [...]).
    5) Control over the different shader stages (the block shader approach could be really nice here). This should also include the input and output structs of each stage.

    What I'm scared of:
    Based on my experience of the last years, it's kinda hard to have any hope or trust here. I've experienced too many pink materials. You no longer have the goodwill. Basically every package is "alpha", "beta", "preview" or "experimental", but not "stable". And the whole shader library feels the same.
    At the moment I'm in a state where I try to avoid as many predefined shader macros and functions as possible.
    I can't even express my frustration.
    Unless you prove that you are able to move all of your existing shaders onto that system (the Lit shaders of HDRP and URP, the terrain shaders of URP and HDRP, just to name a few examples), it's hard to have any trust.

    What would help a lot:
    - Every new major update of the editor and render pipelines should come with clear documentation on how to upgrade shaders. Describe exactly what is required to add or modify in terms of keywords, ShaderLab commands, functions, and macros. This alone would be a game changer, and I would value it higher than the new system.
    - Everyone on the graphics team should be forced to upgrade a fully lit shader from the old version to the new version, every time a new update arrives. After the fifth iteration you'd have a really good understanding of the pain some people experience.

    My current approach:
    So I'm kind of in a corner at the moment. Currently I'm creating my own type of shader system, but I also really like the demo you provided.
    My own system is kind of a mix of Surface Shaders and the modular approach of Block Shaders.
    The following code compiles into a working physically based shader on both BIRP and URP across 2019.4 to 2023.1 (as of the latest stable test I did):
    Code (mkshader):
    Shader
    {
        //<-----| Main Settings |----->
        Name "MK/Shader/Lit"
        Type Regular
        LightingType PhysicallyBased
        CustomEditor "MK.Debug.UnlitEditor"

        //<-----| Toggleable System Features |----->
        Emission On
        Occlusion On
        Height On

        //<-----| Blending |----->
        Blending
        {
            ForwardBase One Zero
            ForwardAdd One One
        }

        //<-----| Cull |----->
        Cull {}

        //<-----| ZWrite |----->
        ZWrite {}

        //<-----| ZTest |----->
        ZTest {}

        //<-----| ColorMask |----->
        ColorMask {}

        //<-----| Properties |----->
        Properties
        {
            [MainColor] _AlbedoColor ("", Color) = (1,1,1,1)
        }

        //<-----| Uniforms |----->
        Uniforms
        {
            uniform half4 _AlbedoColor;
        }

        //<-----| Defines |----->
        Defines
        {
            #define INDIRECT_ISOTROPIC
            #define WORKFLOW_METALLIC
            #define FRESNEL_HIGHLIGHTS
            #define SPECULAR_ISOTROPIC
        }

        //<-----| HLSL Includes Pre |----->
        HLSLIncludesPre {}

        //<-----| HLSL Includes Post |----->
        HLSLIncludesPost {}

        //<-----| Shader Variants |----->
        ShaderVariants
        {
            ForwardBase ForwardAdd UniversalForward ShadowCaster Meta Shader_Feature_Local SM25 __ _MK_ALBEDO_MAP
        }

        //<-----| Surface Init |----->
        inline void SurfaceInit(inout MKSystemSurfaceData systemSurfaceData, inout MKSurfaceData surfaceData, inout MKUserData userData)
        {
            //systemSurfaceData.albedo = _AlbedoColor;
        }

        //<-----| Surface Unlit |----->
        inline void SurfaceUnlit(in MKSystemSurfaceData systemSurfaceData, in MKSurfaceData surfaceData, inout MKUserData userData, inout half4 result) {}

        //<-----| Surface Lighting |----->
        inline void SurfaceLighting(in MKSystemSurfaceData systemSurfaceData, in MKSystemLightingData systemLightingData, in MKSurfaceData surfaceData, in MKLight light, in MKLightData lightData, inout MKUserData userData, inout half4 result)
        {
            half rawDiffuse = ComputeScalarDiffuse(PASS_LIGHTING_ARGS);
            result.rgb = rawDiffuse * light.color * systemLightingData.diffuseRadiance * light.combinedAttenuation;
            result.rgb += ComputeScalarSpecular(PASS_LIGHTING_ARGS) * light.color * systemLightingData.specularRadiance * light.combinedAttenuation;
        }

        //<-----| Surface Composite |----->
        inline half4 SurfaceComposite(in MKSystemSurfaceData systemSurfaceData, in MKSystemLightingData systemLightingData, in MKSurfaceData surfaceData, in MKUserData userData)
        {
            return half4(systemLightingData.directMain.rgb + systemLightingData.directAdditional.rgb + systemLightingData.indirect.rgb, systemSurfaceData.alpha);
        }
    }
    It's kinda hard to make a decision in my situation: should I stick with my own system, or hope for the new one?

    Is there any ETA for a possible beta release?
     
    Last edited: Mar 29, 2023
    tsukimi, ontrigger, NotaNaN and 6 others like this.
  26. OCASM

    OCASM

    Joined:
    Jan 12, 2011
    Posts:
    328
    No new info on the GDC roadmap session...
     
  27. SoyUnBonus

    SoyUnBonus

    Joined:
    Jan 19, 2015
    Posts:
    43
    They can't test every permutation, so things are bound to break. It would be enough if the LTS version were stable, with the previous ones existing to test whether shaders work or not (they're 'technical previews').

    Should you wait or keep your system? Work with what you currently have. At this point, unless I had something stable enough, I would rather use my own code. You never know how much time this is going to take.
     
    OCASM and MKGameDev like this.
  28. JesOb

    JesOb

    Joined:
    Sep 3, 2012
    Posts:
    1,109
    The Block Shaders idea looks awesome :)

    This is somewhat late, but:

    Does the Block Shaders architecture take into account the Vulkan shader object extension (VK_EXT_shader_object) that was just released?
    https://www.khronos.org/blog/you-can-use-vulkan-without-pipelines-today

    More specifically, do Block Shaders have entities/blocks at the shader-stage level that can be reused and, in the future, compiled into separate VkShaderObjects?
     
  29. aleksandrk

    aleksandrk

    Unity Technologies

    Joined:
    Jul 3, 2017
    Posts:
    3,042
    @Jes28 The extension provides a way to compile shader stages individually. It wouldn't make much sense to do anything with regard to this extension at the Block Shader level.
    The ShaderLab level would be a far better fit here, but since it's an extension, we'd need a fallback for drivers that don't support it.
     
  30. Wolfos

    Wolfos

    Joined:
    Mar 17, 2011
    Posts:
    953
    The blocks seem like a good idea, and the syntax is much nicer than Shaderlab.
     
  31. itsjase

    itsjase

    Joined:
    Apr 18, 2017
    Posts:
    4
    Not sure if this has already been answered, but a big use of surface shaders for me is being able to use StructuredBuffers in a custom vertex shader and still use the existing Unity lighting.

    Will Block shaders make this possible?
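    For context, the C# side of that pattern today looks roughly like this (the buffer name and the position array are illustrative, not part of the original post):
    Code (CSharp):
    using UnityEngine;

    // Illustrative: upload positions once, then read them in a custom vertex
    // shader via a StructuredBuffer<float3> named _Positions.
    ComputeBuffer positions = new ComputeBuffer(positionArray.Length, sizeof(float) * 3);
    positions.SetData(positionArray);
    material.SetBuffer("_Positions", positions);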
     
    burningmime and LaireonGames like this.
  32. aleksandrk

    aleksandrk

    Unity Technologies

    Joined:
    Jul 3, 2017
    Posts:
    3,042
    Yes, this will be possible.
     
    itsjase likes this.
  33. Just_Lukas

    Just_Lukas

    Joined:
    Feb 9, 2019
    Posts:
    34
    Wouldn't it be more time-efficient for you guys to just update the existing ShaderLab syntax and design a nice HLSL API for custom shaders, rather than inventing a brand new syntax & compiler from scratch?
     
  34. aleksandrk

    aleksandrk

    Unity Technologies

    Joined:
    Jul 3, 2017
    Posts:
    3,042
    What exactly do you mean by this?
     
  35. Just_Lukas

    Just_Lukas

    Joined:
    Feb 9, 2019
    Posts:
    34
    I mean for your team to update the ShaderLab language to get rid of some legacy syntax (like the {} after texture definitions), add new types such as Gradient or Curve, and then create an HLSL API just like surface shaders were in the legacy RP.
     
  36. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
    I don't know; I agree that the surface shader syntax is a dead end. There's too much "magic" in it, like i.viewDir changing spaces based on whether you write to o.Normal, etc. A new, cleaner syntax is needed. That said, some of the proposed changes go further than I would have gone and are going to require a lot of work on my end to update my shaders, and I suspect I'm not the only one who will feel that way.

    Also, Gradient and Curve aren't shader things - they can be represented as math or textures - and since there are a dozen ways you might want to represent them (i.e. send bezier data, rasterize a curve in each channel of a texture, pack multiple curves together to save samplers, etc.), it doesn't make sense for them to be first-class citizens of a shader language. Having a well-formed library of tools to do stuff like this, sure, but not at the language level.
     
    Prodigga, LooperVFX and cecarlsen like this.
  37. Wolfos

    Wolfos

    Joined:
    Mar 17, 2011
    Posts:
    953
    Surface shader syntax is definitely worth a complete overhaul instead of marginal improvement.
    Just look at how ridiculously inconsistent this is:
    Code (CSharp):
    Shader "Example/Diffuse Simple" { // Shaderlab
        SubShader { // Shaderlab
          Tags { "RenderType" = "Opaque" } // Shaderlab
          CGPROGRAM // Shaderlab
          #pragma surface surf Lambert // CG macro
          struct Input { // CG
              float4 color : COLOR; // CG
          }; // CG
          void surf (Input IN, inout SurfaceOutput o) { // CG
              o.Albedo = 1; // CG
          } // CG
          ENDCG // Shaderlab
        } // Shaderlab
        Fallback "Diffuse" // Shaderlab
    } // Shaderlab
     
  38. aras-p

    aras-p

    Joined:
    Feb 17, 2022
    Posts:
    75
    That's not "surface shader syntax" though, is it? It's "shaderlab syntax" (start & end of the snippet) and "HLSL" (in the middle). Yes, in retrospect it probably would have been way cleaner if Unity had shader syntax that's similar to the .FX files syntax from Direct3D circa 2002-2010 -- i.e. most of the text is HLSL, anything extra is "things that look similar enough to HLSL".
     
  39. SoyUnBonus

    SoyUnBonus

    Joined:
    Jan 19, 2015
    Posts:
    43
    First finish block shaders with shaderlab, then whatever is next. We scream when Unity tries to reinvent the wheel by scrapping everything and starting over and that's exactly what you're proposing. It's not that I don't think ShaderLab is getting old, but let's do things one step at a time. First we need some UNITY between rendering pipelines, and Block Shaders is trying to fix that.
     
    Edy and aras-p like this.
  40. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
    I think people are using "surface shaders" and "ShaderLab" somewhat interchangeably, when they are actually two different things (myself included sometimes). ShaderLab is the syntax around shader definitions like properties, fallbacks, etc., and surface shaders are the abstraction layer around the lighting system.

    For instance:

    Code (CSharp):
    void surf (Input IN, inout SurfaceOutput o) { // CG
        o.Albedo = 1; // CG
    } // CG
    This is actually both CG and surface shaders: the code is in CG, but that function expects to be called via the surface shader framework.

    The surface shader framework needs to die and be re-written - however, I do agree that changing ShaderLab is more debatable. For instance, changing the way properties are defined is going to create a metric ton of work for my stuff, with no advantage that I can currently see (not saying there isn't any, just that I currently don't see any benefit). So from my point of view, having some backwards compatibility there would be super useful. But I would also suspect that not changing ShaderLab may incur additional costs or restrictions - and what I don't want is having to rewrite the shaders more than once, because it's already going to take me 6 months to a year to do it as it is - a task which provides no business value to my work except righting a mistake Unity made 7 years ago that has been a thorn in my side ever since.

    So what I might suggest, without full knowledge of the issues, is that writing a shader in the new system should still work with familiar syntax for properties and such, while using the new features, like blocks, might require the new syntax. At least from my current understanding, changing things like the property definitions is specifically about the block system - and while I'm a massive proponent of modular shaders, the majority of shaders out there were not written with this in mind.

    As an example, let's take Better Shaders. It has its own modular shader stacking system, and lots of people use it to write shaders. If the property syntax changes, then every user's shaders will need to be rewritten, or I'd have to come up with some kind of property translator that does the conversion (right now it has no concept of what a property is). If I can use the old syntax, it's quite possible I can write a wrapper layer to have Better Shaders output to the new format automatically, without having the users rewrite all of their shaders.

    The same is true for something like MicroSplat, which is going to take many months to rewrite under a new system unless that new system can be treated like I treat the current SRP adapters - simple wrappers for the code/structs/functions.

    And I think this would also be true of many monolithic shaders out there.

    And yes, I'd be OK with losing block functionality in these cases - a larger re-write will be necessary for that (which would likely change the whole product design, since you'd want to share snow across any shader instead of each product having its own snow system, etc.), and I won't have to do the entire thing before I can use the new system to abstract SRP issues.
     
  41. AcidArrow

    AcidArrow

    Joined:
    May 20, 2010
    Posts:
    11,960
    We do that because they seem to be unable to execute that concept well. Every feature where they tried to do that has been sub-par, even after waiting months or years for refinement (which doesn't really happen).

    So we scream when they do that because they suck at it, not because it's a bad idea in general.

    I do agree with the rest of your post, though, that we need some "UNITY". Although I doubt I will be using the block shader system unless it really impresses me when it comes out (which I highly doubt...).
     
  42. BOXOPHOBIC

    BOXOPHOBIC

    Joined:
    Jul 17, 2015
    Posts:
    519
    I don't really understand why the properties syntax needs changes. I think the old system worked perfectly fine for many, many years and I didn't see anyone complaining about it.

    Edit: Are there any benefits to the new syntax?
     
    Last edited: Apr 26, 2023
  43. SoyUnBonus

    SoyUnBonus

    Joined:
    Jan 19, 2015
    Posts:
    43
    Have you tried jbooth's Better Shaders? (There's a free version on the Asset Store that works fine with built-in.) I've been tinkering with it the past couple of weeks and it's pretty amazing. I've turned my quite large shader codebase, with tons of #defines, into a much better modular system that makes reusing code far easier.

    It's not groundbreaking, as I'm basically doing the same as before; it's just waaay more convenient and easier to maintain.

    If Block Shaders is like that, I'll jump right away!
     
    Last edited: Apr 26, 2023
    dnach and jbooth like this.
  44. AcidArrow

    AcidArrow

    Joined:
    May 20, 2010
    Posts:
    11,960
    Sure, but I’m not holding my breath.
     
    SoyUnBonus likes this.
  45. Wolfos

    Wolfos

    Joined:
    Mar 17, 2011
    Posts:
    953
    I'm confused, isn't block shaders a complete overhaul?
     
  46. Wolfos

    Wolfos

    Joined:
    Mar 17, 2011
    Posts:
    953
    I guess so. My main gripe is how inconsistent the metadata definition is. We've got the shader name outside of the block, an array of tags, then a #pragma inside the CG block to define the lighting function and vert/frag/surface programs.
    I have to look it up every single time I write one, despite using it for over a decade.

    Block shaders at least looks more consistent with that.
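
    For anyone who hasn't written one recently, a minimal built-in surface shader shows the three different syntaxes in play side by side:

    ```
    Shader "Example/ClassicSurface"   // name lives outside any block
    {
        Properties { _MainTex ("Albedo (RGB)", 2D) = "white" {} }
        SubShader
        {
            Tags { "RenderType" = "Opaque" }   // metadata as a tag array

            CGPROGRAM
            // entry point and lighting model declared via a pragma,
            // in yet another syntax
            #pragma surface surf Standard fullforwardshadows

            sampler2D _MainTex;
            struct Input { float2 uv_MainTex; };

            void surf (Input IN, inout SurfaceOutputStandard o)
            {
                o.Albedo = tex2D(_MainTex, IN.uv_MainTex).rgb;
            }
            ENDCG
        }
        FallBack "Diffuse"
    }
    ```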
     
  47. burningmime

    burningmime

    Joined:
    Jan 25, 2014
    Posts:
    845
    From a quick glance at the docs and some small tests, it's surprisingly similar syntax-wise. The block concept is new, but maps in a straightforward way if you just need to customize a surface, and there are minor cleanups of property syntax in places. Anyone who is familiar with surface shaders should be right at home.
     
  48. SoyUnBonus

    SoyUnBonus

    Joined:
    Jan 19, 2015
    Posts:
    43
    I'm treating it as a 'Better Shaders' clone. Same as the official UI was a clone of NGUI back in the day.

    It's a way of reusing old code but organizing it better. Sort of like a Surface Shaders Plus.
    For example, pairing code with properties so that you can create shaders by mixing blocks. You should check out the Better Shaders docs to get the whole picture, it's pretty neat.

    Not saying that Block Shaders is going to be like that, though! But I hope it'll be similar (fingers crossed).
     
    Last edited: May 3, 2023
  49. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
    @aleksandrk
    Hey, I was wondering if you could give me some insight into shader variants and build/memory sizes. Better Lit Shader has a ton of variants (all shader_feature_local or shader_feature_local_fragment), and we've been seeing the memory for it balloon up to 500 MB+ in builds; then someone removes a single material and suddenly it's down to around 150 MB. One theory I have is that there's some kind of bug preventing things from being stripped that should be, but this never happens in a simple repro project and I don't have a good repro from any users. But despite the strangeness we're seeing, just reducing the memory for the shaders would be a nice win. As an example, building the demo in BiRP 2021.3 LTS/OSX creates:

    23 unique materials of the Lit shader: 70 MB
    2 of the Lit/Tessellation shader: 18 MB
    1 of the Lit/Alpha shader: 3.2 MB
    1 of the Unlit shader: 1.1 MB

    This seems like a lot, especially considering a single variant of the Unlit shader has far fewer keywords than a lit shader (since, well, lighting), and it's a meg just for that. The variant in question is a reasonably simple shader, but includes all the potential functionality in the actual code, since everything is turned on/off with shader_feature.
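
    To illustrate the pattern being described (keyword names here are made up, not Better Lit Shader's actual keywords): each shader_feature declaration doubles the potential variant count, and the compiler only keeps combinations actually used by materials in the build.

    ```
    // Each line adds a keyword axis; N axes yield up to 2^N variants.
    // shader_feature (unlike multi_compile) lets unused variants be
    // stripped at build time based on the materials included.
    #pragma shader_feature_local _ _SNOW_ON             // all stages
    #pragma shader_feature_local_fragment _ _WETNESS_ON // fragment stage only
    ```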

    For comparison, the standard Lit shader is 0.8 MB if I include it in the build by making a cube. Note that my unlit shader is bigger than that, and has no lighting. And my lit alpha shader is 3x that size, despite having essentially the same features as the Unity one after all variants are stripped.

    So I'm trying to understand what I can do to reduce total memory size. I know some platforms can use precompiled shaders while others compile on the fly, but how does Unity store them in the build and in memory? I.e., does it compile them into some intermediate format, or is all the text included raw in the build, bloating build size? Does it keep that around in memory? Does it inline all includes, or keep them as separate resources allowing repeated code to be included once? And when they are loaded and any variants generated, what consumes most of that memory? Properties and CBuffers cannot be #ifdef'd around (you could for the cbuffer, but it would break the SRP Batcher), so is that consuming additional space in shader memory?

    One user reported that switching from my packed shaders to the source better lit shaders fixed it for them. I don't know how true this is because I don't have a good repro, but both are effectively turned into shaders via ScriptableAssetImporters, so in theory they should be no different outside of the editor, right?
     
    Last edited: Jun 1, 2023
    OCASM likes this.
  50. aras-p

    aras-p

    Joined:
    Feb 17, 2022
    Posts:
    75
    Source files, includes etc. -- none of that exists at build time. Whatever shader variants are "needed for the build" (which is a complex process, also potentially influenced by any shader variant filtering C# code) are included.

    For each variant, for each graphics API that is included in the build, the variant in question is compiled into whatever the target platform / graphics API expects to use. That is DXBC bytecode for d3d11, DXBC or DXIL bytecode for d3d12, (compressed) SPIR-V for Vulkan, transpiled GLSL for OpenGL, and proprietary formats for consoles.

    Additional data that is stored for each variant * each graphics API is "binding information" -- which constant buffers are used by the shader, their layout, as well as other resources (textures, samplers, other buffers).
     
    LooperVFX and graskovi like this.