
Official Block Shaders: Surface Shaders for SRPs and more (Public Demo Now Available!)

Discussion in 'Graphics Experimental Previews' started by dnach, Oct 19, 2022.

  1. dnach

    dnach

    Unity Technologies

    Joined:
    Mar 9, 2022
    Posts:
    89
    Greetings from the Shader team!

We heard your feedback regarding the need for SRP-compatible surface shaders, as well as for a better way to integrate with Unity’s shader system, and have been hard at work on a meaningful solution, which we are now happy to announce in the form of Block Shaders.

Block Shaders introduce a streamlined, modular, text-based shader authoring workflow, and let you override and extend the functionality of existing shaders without the complexity of modifying the original shader source - in a similar fashion to BiRP Surface Shaders. In addition, Block Shaders aim to help unify shader authoring across the render pipelines.

We are at an alpha stage of development and need to hear from you! To that end, we have prepared a public prototype for you to experiment with, so you can provide your valuable input as early as possible. To access the demo, please refer to the demo's Quickstart Guide.

Manifesting as a new shader asset type and syntax, Block Shaders can override and extend the functionality of shader "templates", which will be provided by the Render Pipelines or created freely by users. For example, a Block Shader can override and extend a lit template’s vertex or surface descriptions:

    Code (BlockShader):
BlockShader "MyShader" implements "LitTemplate" {

    Blocks {

        Block VertexBlock
        {
            Interface
            {
                inout float3 PositionOS;
                inout float3 NormalOS;
                [Property] in float _Deformation;
            }
            void apply(inout Interface self)
            {
                // HLSL goes here...
                self.PositionOS += self.NormalOS * _Deformation;
            }
        }

        Block SurfaceBlock
        {
            Interface
            {
                in float4 UV;
                out float3 Albedo;
                out float Metalness;
                out float Roughness;
                out float Alpha;
            }
            void apply(inout Interface self)
            {
                self.Albedo = float3(1.0, 0.0, 0.0); // set albedo to red
                self.Metalness = 0.0; // set metalness to non-metallic
                self.Roughness = 0.5; // set roughness to semi-rough
                self.Alpha = 1.0;
            }
        }
    }

    Implementation Vertex
    {
        VertexBlock;
    }

    Implementation Surface
    {
        SurfaceBlock;
    }

}

    Users are free to declare their own custom Blocks, and define their Block’s interfaces in order to freely pass data between Blocks. In the following example, an interpolator is passed between a Vertex and Surface Block in order to implement custom fog:

    Code (BlockShader):
Block VertexBlock
{
    Interface
    {
        inout float3 PositionOS;
        out float FogIntensity;
    }
    void apply(inout Interface self)
    {
        float3 positionWS = TransformObjectToWorld(self.PositionOS);
        float4 positionCS = TransformWorldToHClip(positionWS);
        float2 pos = positionCS.xy / positionCS.w;
        self.FogIntensity = min(1, dot(pos, pos) * 0.5);
    }
}

Block SurfaceBlock
{
    Interface
    {
        in float FogIntensity;
        inout float3 Color;
    }
    void apply(inout Interface self)
    {
        self.Color = lerp(self.Color, float3(1.0, 1.0, 1.0), self.FogIntensity);
    }
}


Block Shaders can be created from available Block Templates, which define the shader’s passes and stages. Templates are constructed using a collection of linked Blocks - shader functions with a set interface (akin to Shader Graph nodes) that can be reused and shared as assets and libraries across shaders, projects and teams.

    Templates also expose public Blocks called “Customization Points” along the shader’s flow of execution, which define the template’s public interface to be implemented and extended by Block Shaders, using any number of user defined Blocks.

    By authoring Blocks, as well as utilizing existing Block libraries, users can override and extend shaders in a reusable and modular fashion:

[Image: for illustration purposes only]
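To sketch the idea in code (note: the template syntax below is purely hypothetical - the demo and final feature may use a different form), a template could declare a Customization Point that Block Shaders then implement:

Code (BlockShader):
// Hypothetical sketch only - not actual demo syntax.
Template "SimpleLitTemplate"
{
    // A public Block along the shader's flow of execution.
    // Block Shaders implementing this template may attach any
    // number of their own Blocks to this interface.
    CustomizationPoint Surface
    {
        in float4 UV;
        out float3 Albedo;
        out float Alpha;
    }
}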


The Universal and High Definition Render Pipelines will both provide a library of Block Templates, corresponding to their existing shader libraries, to be overridden and extended by user-authored Block Shaders. In addition to the familiar vertex or surface descriptions, templates could provide any number of customization points, for use cases such as custom lighting functions or image-based lighting overrides:

    Code (BlockShader):
Block DirectionalLightingBlock
{
    Interface
    {
        in float3 Normal;
        inout float3 Color;
    }
    void apply(inout Interface self)
    {
        self.Color *= saturate(dot(self.Normal, _MainLightPosition.xyz)) * _MainLightColor.rgb;
    }
}

Block ImageBasedLightingBlock
{
    Interface
    {
        inout float3 ReflectionWS;
        [Property] UnityTextureCube _CubeMap;
        out float3 IBL;
    }
    void apply(inout Interface self)
    {
        self.IBL = texCUBE(self._CubeMap, self.ReflectionWS).rgb;
    }
}

In order to unify shader authoring across pipelines, we are evaluating whether URP and HDRP should provide templates under cross-pipeline Template Groups, which define a shared minimal interface. Users could then author Block Shaders that target a specific template group in order to maintain compatibility with both pipelines, with HDRP potentially implementing more complex shading and utilizing more optional customization points than the URP equivalent.
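As a hypothetical sketch of what that could look like (the "LitTemplateGroup" name and the targeting syntax are invented for illustration - template groups are still being evaluated):

Code (BlockShader):
// Hypothetical: the shader targets a cross-pipeline template group
// rather than a pipeline-specific template; the active pipeline
// would bind its own template from the group.
BlockShader "MyCrossPipelineShader" implements "LitTemplateGroup" {

    Blocks {
        Block SurfaceBlock
        {
            Interface
            {
                out float3 Albedo;
                out float Alpha;
            }
            void apply(inout Interface self)
            {
                self.Albedo = float3(0.2, 0.6, 1.0);
                self.Alpha = 1.0;
            }
        }
    }

    Implementation Surface
    {
        SurfaceBlock;
    }
}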

Thanks to their generic nature, Block Shaders also streamline the creation of any type of shader, depending on the templates created by both Unity and the community. For example, a developer or Asset Store creator could create a volumetric ray marching template - allowing technical artists to customize their own cloud or smoke shaders in a simplified manner.

This new workflow is made possible by the Shader Foundry API, a new intermediate layer that provides a C# representation of Block Shaders and Templates and allows for better integration with Unity’s shader system. The Shader Foundry API will allow developers to programmatically create and configure Block Shader Templates, in order to integrate and maintain shaders and tools across pipelines and engine versions.
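As a rough illustration only (every type and method name below is hypothetical - the actual Shader Foundry API will differ), programmatic template construction might look along these lines:

Code (CSharp):
// Hypothetical sketch - not the real Shader Foundry API.
var container = new ShaderContainer();
var template = new TemplateBuilder(container, "MyLitTemplate");

// Expose a public Customization Point for user Blocks to implement.
template.AddCustomizationPoint("Surface", surfaceInterface);

// Register the template so Block Shaders can implement it.
container.RegisterTemplate(template.Build());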

    We hope to land Shader Foundry and Block Shaders as early as possible, including official URP and HDRP template support. We will continue to improve and extend Block Shaders, and provide support for additional features such as compute or raytracing shaders, so please share your feedback and any feature or functionality you would like to see - either directly in this thread or via the Block Shaders Survey.

You can follow our progress in the Shaders forum as well as on our public roadmap, and look forward to an announcement when the Foundry is available in the public betas. We can't wait to see what amazing shaders and block libraries you create using Block Shaders!
     
  2. atomicjoe

    atomicjoe

    Joined:
    Apr 10, 2013
    Posts:
    1,869
This looks DEFINITELY EXCITING!
Will this be a core feature only available in future Unity versions, or will it be compatible with the current Unity 2021 LTS?
     
  3. Edy

    Edy

    Joined:
    Jun 3, 2010
    Posts:
    2,508
Long awaited, very much necessary! Maybe a silly question, but is there no compatibility / interoperability with the built-in RP? Just wanting to know if we'll still be 2+ years either stuck on legacy built-in or tied to Better Shaders (the Asset Store package)...
     
    M_MG_S and atomicjoe like this.
  4. dnach

    dnach

    Unity Technologies

    Joined:
    Mar 9, 2022
    Posts:
    89
    @atomicjoe
    I'm afraid we do not have any plans to backport Block Shaders and Shader Foundry at the moment.

    @Edy
Since you can use "Surface Shaders" with BiRP, this seems like a lower priority for us at the moment. With that in mind, would Block Shaders support for BiRP be something you'd be interested in? If so, please fill out the survey and let us know. We will definitely consider this when planning our roadmap for rolling out Block Shaders.
     
    Edy likes this.
  5. Invertex

    Invertex

    Joined:
    Nov 7, 2013
    Posts:
    1,546
At the very least, you can have ShaderGraph shaders target BiRP alongside the others now.
It does seem strange to me that adding support for BiRP wouldn't be as simple as having a third compilation target that maps the outputs to BiRP syntax, given they already do that with ShaderGraph...

    But anyways, excited to see this is finally here and to try this out!
     
    Last edited: Oct 19, 2022
    Yuchen_Chang and BenjaminSimsMK like this.
  6. atomicjoe

    atomicjoe

    Joined:
    Apr 10, 2013
    Posts:
    1,869
Ok, so let me recap: Block Shaders are not compatible with current or older Unity versions, not compatible with BiRP, and not compatible with Surface Shaders.

What we asked for was to port Surface Shaders to the SRPs, for retro- and inter-compatibility with ALL render pipelines and 10+ years of shaders, and what we got is yet another layer of incompatibility on top of Unity.

    Nice.
     
  7. atomicjoe

    atomicjoe

    Joined:
    Apr 10, 2013
    Posts:
    1,869
    Yet another disappointment in the long list of Unity disappointments.
    I'm out.
     
  8. dnach

    dnach

    Unity Technologies

    Joined:
    Mar 9, 2022
    Posts:
    89
We could very well explore providing BiRP compatibility for Block Shaders if there is enough interest, so we are definitely not ruling that out.

    The demo we provided includes "Simple Lit" templates that allow you to override the Vertex and Surface descriptions of a shader, similarly to BiRP Surface Shaders - so we recommend you check that out.

Being much more generic, Block Shaders will allow the Render Pipelines to expose any public interface they see fit for their provided shader templates. Users can also freely define their own templates and their available interfaces. This could potentially match, or even exceed, the functionality provided by the existing BiRP Surface Shaders.

We are also exploring the concept of "Template Groups" in order to unify the shader authoring experience across pipelines. In practice, this could allow you to author Block Shaders that target certain template groups, achieving cross-pipeline compatibility for your authored shaders!
     
    LooperVFX, OCASM and Edy like this.
  9. Invertex

    Invertex

    Joined:
    Nov 7, 2013
    Posts:
    1,546
    The main issue with re-using old surface shaders was the barrier to entry of copying their functionality to the newer pipelines. This system should in theory make it very easy to translate old surface shaders to the new pipelines, especially once some templates are made for it.
     
    alexanderameye likes this.
  10. atomicjoe

    atomicjoe

    Joined:
    Apr 10, 2013
    Posts:
    1,869
    First of all, let me be clear that this is not against you or your team in particular but about Unity's direction in general and there is nothing personal in it:

You'll have to excuse my lack of confidence in any potentiality after all these years of lackluster SRP performance and compatibility compared to BiRP.
With this you had the opportunity to bridge the gap between BiRP and the SRPs, incentivize adoption of the SRPs by BiRP users, and leverage more than a decade of shaders by simply supporting Surface Shaders on the SRPs. Instead, you opted for yet another new layer of incompatibility on top of the SRPs, doubling down on everything the community has criticized the SRPs for.
You keep pushing the SRPs and trying to bury BiRP, and yet the community refuses to let BiRP die. You should ask yourselves WHY, instead of doubling down on something the users don't really want.
The problem here is the mindset.
     
  11. Saniell

    Saniell

    Joined:
    Oct 24, 2015
    Posts:
    193
+1 to what @atomicjoe is saying. A big problem for adopting the SRPs is the fact that we can't easily port shaders; now you make a tool that could solve it and completely neglect this fact. It's just annoying to see Unity acting like BiRP is not a thing, again and again. That's not exactly how you make people change their toolset, you know.

It's like if Blender 2.8 had decided to just straight up remove old shortcuts instead of providing compatibility options. You don't exactly need a survey to understand why that would not work well.

    That said, I have a question regarding syntax

    Code (csharp):
Implementation VertexDescription
{
    FogBlock()->(VertexFogIntensity: FogInterpolator);
}
Why use the -> operator instead of just FogBlock(VertexFogIntensity: FogInterpolator)?
And what can the Implementation section contain anyway, besides calls to Blocks?

Also, how is the template selected, or whatever is used to generate a shader? Would it be possible to select the template from the block shader itself, so that assets like GPUInstancer could provide custom templates that work with their own rendering systems?
     
    BenjaminSimsMK and atomicjoe like this.
  12. aleksandrk

    aleksandrk

    Unity Technologies

    Joined:
    Jul 3, 2017
    Posts:
    3,014
    If this was really simple to do, it would've been done ages ago.

    It may be simple enough to make a translation tool that converts a subset of surface shaders to the new system. We'll evaluate this in due time.

    The Implementation section can have a list of block invocations.
    As to the syntax, it goes like this:
    Code (CSharp):
BlockName(input0: inVar0, ...)->(output0: outVar0, ...); // Full version
BlockName(inout0: var0, ...); // When overridden inputs and outputs are symmetric
BlockName; // No overrides
    A template is specified in the block shader.
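For instance, a hypothetical Implementation section combining all three forms (block and variable names invented for illustration) could read:

Code (CSharp):
Implementation Surface
{
    FogBlock(Intensity: FogAmount)->(Color: FinalColor); // full version
    TintBlock(Color: FinalColor);                        // symmetric override
    DitherBlock;                                         // no overrides
}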
     
    LooperVFX, OCASM, Luxxuor and 4 others like this.
  13. Edy

    Edy

    Joined:
    Jun 3, 2010
    Posts:
    2,508
    Definitely. Oh, damn it, definitely! My understanding was that all this was intended to finally provide RP interoperability between all RPs supported by Unity.

    What we desperately need since the very arrival of SRPs is some way to have our shaders and materials so they just work no matter the actual RP being used in the project (BiRP, URP, HDRP, future/other RP...). Of course they may look different depending on the RP used. Of course some features will be available in some RPs and not in others. But they will *just work* instead of showing a pink material.

You guys even gave a talk at Unite Copenhagen 2019 entirely dedicated to creating BiRP-URP compatible shaders and effects! It's a very unexpected surprise that BiRP is now a "lower priority", given that it's the only RP that is truly universal still today! Many of us have been stuck here since forever, as upgrading to any non-BiRP pipeline has always been a one-way destructive path for our projects.

    Sure, I'll fill the survey if that helps. Thank you for your understanding!
     
    Last edited: Oct 21, 2022
  14. Saniell

    Saniell

    Joined:
    Oct 24, 2015
    Posts:
    193
If you can actually build a tool that converts surface shaders to block shaders, it would also serve as proof that the new system has no regressions compared to the previous one. So I think that would be a good idea.

And the syntax makes perfect sense to me now, thanks. So far the idea looks better than surface shaders. Those were weird and built around edge cases, so I like this new approach more.
     
    LooperVFX likes this.
  16. Lars-Steenhoff

    Lars-Steenhoff

    Joined:
    Aug 7, 2007
    Posts:
    3,526
Same for Amplify shaders: they can work on all pipelines. What is needed is more unity between pipelines, not just between URP and HDRP.
     
  17. Edy

    Edy

    Joined:
    Jun 3, 2010
    Posts:
    2,508
    Please, please do. Compatibility between *all* RPs supported by Unity has been desperately needed for too many years.

Young people might not remember this, but years ago one of Unity's mottos was "author once, build many". The introduction of the SRPs blew all that up, leaving many of us stuck with BiRP or Better Shaders.
     
  18. atomicjoe

    atomicjoe

    Joined:
    Apr 10, 2013
    Posts:
    1,869
    That's not what we need. We need full Surface Shaders compatibility.
    Sorry if that's not what you wanted to hear, but that's just the way it is.
    If that's not easy to do with the current state of SRPs, then maybe that is what should be fixed, instead of putting band-aids over a bleeding wound.
     
  19. sacb0y

    sacb0y

    Joined:
    May 9, 2016
    Posts:
    873
I can kinda get why people want BiRP support; it would help with the transition for sure. I imagine on Unity's end, since BiRP will DEFINITELY be deprecated at some point, it doesn't seem worth the effort. Because as soon as URP/HDRP is more dominantly used, they'll move on completely. They just need to make a more convincing argument to do so.

    And the side effect of adding support for this is people may stick with BIRP for longer than they'd like.

But I think the reason they should is because when that deprecation happens, hopefully the transition will be much smoother. I've already long left BiRP behind personally, but there's still the issue of assets and such that might not be SRP compatible. So I could see benefits on the Asset Store side in the end, if people could more easily update their assets.

But I'm certain this will complicate matters; sure, it CAN be done, but people already complain about Unity being slow as it is. And maybe some dude who knows BiRP in and out can make a tool for it, but how many people at Unity are at that level and available for that job?

    At least for me personally, the need is low priority compared to the desire to see things move forward as an SRP user.
     
    thelebaron, gigazelle, noio and 2 others like this.
  20. Saniell

    Saniell

    Joined:
    Oct 24, 2015
    Posts:
    193
Don't you think being able to transition without redoing all of your shaders from scratch would be a nice argument?

This kind of thinking doesn't apply to the majority that still use BiRP. Mostly because it's not only incompatible, but actually better than the glorified mobile renderer that is URP, and simpler than the parameter hell that is HDRP. But I'm getting off track now, so gonna shut up :rolleyes:
     
  21. Edy

    Edy

    Joined:
    Jun 3, 2010
    Posts:
    2,508
    Not at all. Adding support for BiRP would allow us to finally move to URP and HDRP.

The need to support BiRP for full compatibility is what forces us to stay on BiRP. If we had cross-RP compatibility, we'd just move to HDRP and write HDRP shaders and materials with the confidence that they would still work in BiRP and URP. That's what many of us desperately need to leave BiRP behind.

Then Unity may sunset BiRP at any point later, once anyone can use any SRP and URP has reached feature parity with BiRP.
     
    Last edited: Oct 21, 2022
    LaireonGames, mitaywalle and sacb0y like this.
  22. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
    BiRP compatibility is needed, though I get why unity keeps wanting to kill it. But it should be easy enough for them to write a template for it, if it isn't, then something is very wrong with this new system.

That said, I think sticking with the old surface shader format would not only have been a mistake, but would have caused all kinds of headaches for users as well as developers, as some features of that system simply would not work. Further, that syntax was rife with issues - such as the magic variables (viewDir: what space is it in?) and implied vertex interpolators from texture names, etc. It also would have prevented composition of shader blocks, which Better Shaders proved is a very desirable feature.

    However, it does mean people like me will have to do yet another translation of all of our shader code to yet another new format, which will take a few months given how much shader code I have. So we'll just throw it on the 2 years of unpaid labor Unity has already forced me to do or lose my business.

As for the actual system, I'll need to spend some time with it and figure out if I'll be able to port my work to it. Immediate things that come to mind: how do I modify things like texcoords to have centroid sampling without modifying the template, since raw v2f structures don't seem to be exposed? Are tessellation stages supported? Etc.
     
  23. sacb0y

    sacb0y

    Joined:
    May 9, 2016
    Posts:
    873
I think Unity's coexistence effort, if done properly, will mostly nullify this problem, even if BiRP isn't supported. I understand there's some risk, but at some point the feature set will have to make BiRP obsolete.

    Like I said it makes sense for transition.

At some point things will cross the Rubicon, where people are annoyed when anything new supports BiRP. :p
     
  24. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
    So unzipping the mac editor from the doc file just produces a large file with a .out extension. What am I supposed to do with this?
     
    Walter_Hulsebos likes this.
  25. aleksandrk

    aleksandrk

    Unity Technologies

    Joined:
    Jul 3, 2017
    Posts:
    3,014
    The template will likely not have to declare the texcoord for, say, main texture sampling - this will be in the block shader. So you'll have full control there.
    Tessellation support will be there at some point, but it's not available in the prototype.
     
    OCASM likes this.
  26. joshua_davis

    joshua_davis

    Unity Technologies

    Joined:
    Jan 5, 2021
    Posts:
    9
    @jbooth The internal file is a ".tar" so you can just rename and unzip it. Not sure why the internals of the file have no extension.
     
  27. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
    Since i can't run the editor, I'm reading the docs:

    - float2 is not listed as a supported type? nor int2, int3, int4?
    - having a color type instead of a float4 seems like a mistake
    - the property syntax has changed - is there a reason for this? Does this invalidate all material property attributes, etc?
    - only surface and fragment shader support?
    - no way to add a custom pass without modifying the template, for things like an outline pass
- How are Texture2Ds passed to functions? What about samplers, _ST, and _TexelSize? Examples seem to use tex2D semantics
- How are template inputs handled? I see the examples use normalTS, but what if my normal is in world space? Do I have to pay the cost of putting it into tangent space and having the template convert it back? Or can I set either and have it do the right thing somehow, by declaring which space my output is in?


    Code (CSharp):
Block DirectionalLightingToon
{
    Interface
    {
        [Property] in UnityTexture2D DissuseMap;
        [Property] in float RimAmount;

What's a DissuseMap? Does that mean it's not used? I've had enough of that second property over the last few years, thanks ;)

    Suggestions:
- Consider something like the SurfaceData structure I use in Better Shaders to make shader authoring easier for new users. This structure contains a bunch of precomputed variables for the user, like "WorldSpaceViewDir", "TangentSpaceViewDir", "WorldSpaceCameraPosition", etc. Learning to compute all that stuff requires a lot of Unity-shader-specific knowledge and also differs between pipelines (camera relative vs. not, etc.), so having a simple set of variables or functions in one place that does it in a cross-platform way will make everyone's lives much easier than crawling through source files.
     
    Walter_Hulsebos, noio and sirleto like this.
  28. joshua_davis

    joshua_davis

    Unity Technologies

    Joined:
    Jan 5, 2021
    Posts:
    9
Also, the folder contents are a little odd (it's the result of our build system). Internally you'd want to add x64/Release/Unity.app via the Hub, I believe. You may get a warning about it being produced by an unverified source because the exe is not signed. I have been told that on mac you can allow this either via System Preferences->Security & Privacy, which has a tab for “allow apps downloaded from:“, or when you try to open the binary it may pop up asking for explicit approval.
     
  29. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
    Ok, unzipped the tar, but it says the editor is not a valid application - also tried adding it to the hub, which also rejected it.
     
  30. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
    When added through the hub, it warns me and I click got it, but then when trying to open the project it can't find the editor version..
     
  31. joshua_davis

    joshua_davis

    Unity Technologies

    Joined:
    Jan 5, 2021
    Posts:
    9
    I'm not sure which docs you're reading, but:
    - float2, int2, etc... are supported
    - there is no color type at the moment, although we may want to have one. Currently you tag a floatN property as [Color] to tell the system to add the same attribute to the material property.
    - textures are passed using UnityTexture2D (and other UnityTexture types) which combine the ST data all together.
- In the prototype you cannot add a custom pass in the shader without creating a new template, but we plan to allow that long term. There's also a planned system to build one template as an extension of another.
- Templates provide data to the customization points for you to use. It's up to the template to provide values in multiple spaces if that's desired.
- DissuseMap: I assume this is a typo of DiffuseMap
     
    LooperVFX, OCASM and sirleto like this.
  32. jessebarker

    jessebarker

    Unity Technologies

    Joined:
    Dec 13, 2016
    Posts:
    6
Did you get through the hoop-jumping in the System Preferences->Security & Privacy bypass? Once you've got it runnable, I would maybe run the editor directly with the `-projectPath` command line option just to confirm it all works. I'm not sure how the Hub deals with alpha builds in the wild.
     
  33. Invertex

    Invertex

    Joined:
    Nov 7, 2013
    Posts:
    1,546
Amplify is a node-based shader editor, just like ShaderGraph. Amplify shaders themselves are just regular shaders that have been compiled out of the Amplify editor for each of the pipelines. ShaderGraph supports all 3 as well, and you could just as easily include ShaderGraph shaders in your store asset and add a shadergraph package dependency.
     
    noio likes this.
  34. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
Yeah, I had to do it via the command line because the "Anywhere" option isn't available by default...

- Adding via the Hub complains that it's not a signed Unity application
- Double clicking it says "the application Unity cannot be opened"
- "open Unity.app" via terminal: "Application cannot be opened for an unexpected reason"
- Going into Unity.app/Contents/MacOS and trying "open Unity" just opens a text editor with a bunch of goop in it.
     
  35. dnach

    dnach

    Unity Technologies

    Joined:
    Mar 9, 2022
    Posts:
    89
    Thank you for your feedback!
    As @aleksandrk mentioned above, we definitely want to provide a way to convert BiRP surface shaders into SRP compatible Block Shaders and will investigate adding this to our roadmap.


    Thanks for trying this out Jason! We were not able to reproduce the issue you are experiencing on macOS so far, but will try to repro and keep you posted.

Joshua already addressed the above points, but just to add:

- The templates provided with the demo offer Vertex and Surface customization points (to override parts of the vertex/fragment shader stages, respectively), and these would likewise be provided by RP templates, along with any other relevant public Blocks.

As mentioned in the original post, we are also planning to support additional shader stages. The linked survey has some questions regarding feature prioritization, including shader stage support (e.g. Compute, Geometry, Tessellation, Raytracing) - so it would be very useful if you could provide your input!

- The provided templates are for demonstration purposes only, but RP-provided templates will definitely take such considerations into account and provide a sensible interface to avoid redundant transformations (for your example, both normalWS and normalTS could be provided).

    - The example shaders "Assets\BlockShaders\ExtraExamples\Properties Blocks\PropertiesShader.blockShader" and "Assets\Tests\LegacyShaders\PropertyTypes\UnitySamplerStateProperty.blockShader" both provide some reference on using sampler states, so you can check these out.

    - Regarding the shader property name typo, I quickly fixed that and reuploaded the package.
     
    sirleto likes this.
  36. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
    Code (CSharp):
[Property(uniformName = "SamplerState")][SamplerState(filterMode = Trilinear, anisotropicLevel = 8, wrapMode = MirrorOnceV, depthCompare = true)] UnitySamplerState MySamplerState;
So does this mean we can expose sampler states to materials in a non-texture-bound way? That would be a big help in fighting the sampler-stripping issues that arise when sharing samplers, or when passes are run that don't use all the outputs. For instance:

    Code (CSharp):
TEXTURE2D(_Foo);
TEXTURE2D(_FooNormal);
SAMPLER(sampler_Foo);

void SurfaceFunc(Input i, inout Output o)
{
    // this will break on URP 2021's depth normal pass because albedo is not used and the sampler gets stripped
    o.Albedo = SAMPLE_TEXTURE2D(_Foo, sampler_Foo, i.uv);
    o.Normal = SAMPLE_TEXTURE2D(_FooNormal, sampler_Foo, i.uv);
}
     
  37. joshua_davis

    joshua_davis

    Unity Technologies

    Joined:
    Jan 5, 2021
    Posts:
    9
    You can either have texture + sampler bundles or you can declare an inline sampler state. Using "UnityTexture2D" is a bundled texture + sampler. Using [SamplerState] creates a deduplicated inline sampler state. This follows the same rules as described here: https://docs.unity3d.com/Manual/SL-SamplerStates.html.
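
    For reference, the two options look roughly like this in plain HLSL terms. This is a sketch following the rules on the linked manual page, not the exact code Block Shaders generate:

    ```hlsl
    // Bundled: the sampler follows the "sampler" + texture-name convention
    // and is configured from the texture asset's import settings.
    Texture2D _MainTex;
    SamplerState sampler_MainTex;

    // Inline sampler state: Unity parses the name itself to configure it
    // (here: point filtering, clamp wrap mode), and duplicates are shared.
    SamplerState my_point_clamp_sampler;

    float4 SampleExample(float2 uv)
    {
        return _MainTex.Sample(my_point_clamp_sampler, uv);
    }
    ```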
     
  38. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
    Ok, so no way to have that exposed to the user then. I'm also assuming I can create a texture without a sampler state, right? I have hundreds of textures in some of my shaders, so I can't have a sampler for every texture, but I still need the user to specify things like wrap mode (each texture just uses the albedo one). That also means I'll still have to hack around samplers getting stripped while they are still being used by other textures, which is a drag but no worse than it is now.
     
    noio likes this.
  39. Saniell

    Saniell

    Joined:
    Oct 24, 2015
    Posts:
    193
    Would it be possible to add the ability to expose sampler properties to the editor somehow? For example, selecting filtering from the shader, or setting it from C#. That would be really nice.
     
  40. joshua_davis

    joshua_davis

    Unity Technologies

    Joined:
    Jan 5, 2021
    Posts:
    9
    I believe there is currently no way in the prototype to declare a texture without a sampler. We can definitely add that if it's something Unity already supports.

    Block Shaders are currently only a layer on top of ShaderLab, so if something isn't supported in ShaderLab, it isn't supported here either. Since there is currently no way to expose sampler states to materials, this isn't something we could easily support in the MVP.
     
    Saniell likes this.
  41. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
    Yeah, I get it; I just got excited for a minute there. I develop on Mac, and this often bites me when someone runs on Windows (Mac doesn't strip the samplers the way DX does), and often not because of a change I made. For example, when URP added the depth normals pass, all my existing code broke because there was suddenly a pass that used only normals and not albedo.

    Now if I could just get the editor to run so I can try this thing...
     
  42. joshua_davis

    joshua_davis

    Unity Technologies

    Joined:
    Jan 5, 2021
    Posts:
    9
    We're also investigating the Mac situation. I believe I have an installer build queued up.
     
    jbooth likes this.
  43. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
    This reminds me that it might be valuable to consider an 'escape hatch' to raw code within your constructs. One advantage of Better Shaders is that it really just outputs code by combining text blocks, so for the most part it doesn't need any translation or understanding. In this case, if I could output a raw texture declaration, I could have declared a texture without you adding any features to the system. That's not to say textures without samplers shouldn't be added, but I can imagine hitting other advanced cases where something isn't supported.

    It also makes me wonder whether the system should encourage combined samplers via tex2D and the like at all, or whether it's better to force users to understand from the start that textures and samplers are separate things, since that is how all graphics APIs work these days. The idea of combined texture/samplers is really DX9 heritage at this point, right?
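
    To illustrate the distinction being discussed (a rough HLSL sketch, not code from the prototype):

    ```hlsl
    // DX9-style combined texture + sampler: one sampler per texture, baked in.
    // sampler2D _Legacy;
    // float4 c = tex2D(_Legacy, uv);

    // Modern separate objects (the DX10+/Metal/Vulkan model): one sampler
    // can serve any number of textures.
    Texture2D _Albedo;
    Texture2D _NormalMap;
    SamplerState s_shared;

    float4 SampleBoth(float2 uv)
    {
        float4 albedo = _Albedo.Sample(s_shared, uv);
        float4 normal = _NormalMap.Sample(s_shared, uv);
        return albedo + normal;
    }
    ```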
     
    VirtusH, noio, OCASM and 4 others like this.
  44. joshua_davis

    joshua_davis

    Unity Technologies

    Joined:
    Jan 5, 2021
    Posts:
    9
    To go into some detail: the inputs to blocks need to be known types that aren't hidden behind macros, since we run before the preprocessor. I believe Unity still expects the SAMPLER and TEXTURE macros to be used for cross-platform support, so this causes problems.
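
    As a simplified illustration of the macro problem (the real SRP Core definitions differ per platform; these expansions are only indicative):

    ```hlsl
    // On D3D-like targets the macros expand to separate objects...
    #define TEXTURE2D(textureName) Texture2D textureName
    #define SAMPLER(samplerName)   SamplerState samplerName
    // ...while on older GL-era targets they can expand differently.
    // A parser that runs before the preprocessor sees only "TEXTURE2D(_Foo);"
    // and cannot know what type _Foo actually is.
    ```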

    ShaderGraph added UnityTexture2D and similar structures, which hide a lot of this, and Block Shaders currently use those structures. We'll look into them further and determine what we use and how we address the texture/sampler split. In an ideal world we'd create wrappers for each opaque type available, maybe with an optional combined texture + sampler.

    Having 'escape hatches' to raw code is tricky for uniforms/properties. One of the goals is to reduce global usage, allow passing around and redirecting inputs, and simplify declarations of uniforms/properties. This means a property ends up being more of a "request" that the block makes; the system figures out what to actually declare and how to pass it in. Backdoors always end up being tricky because you can't reason about them, and they're hard to change or update later.
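
    In other words, instead of declaring a global uniform directly, a block states what it needs and the system wires it up. A hypothetical sketch using the syntax from the original post (the names are illustrative only):

    ```
    Block SurfaceBlock
    {
        Interface
        {
            // A "request": the system decides how _Tint is actually
            // declared, whether it is shared, and how it is passed in.
            [Property] in float4 _Tint;
            inout float3 BaseColor;
        }
    }
    ```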
     
  45. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
    Until the DX9-isms are completely gone (like having sampler state options based on the selected texture), I expect the idea of combined samplers will remain.


    While that's very true about backdoors, when has a system ever truly been 100% wrapped correctly? ShaderLab allowed backdoors into HLSL/GLSL code so you could write it directly, with the full understanding that it isn't portable or upgradable. Shader Graph allows the same with its HLSL node, which is being used to hack in all kinds of interesting things Shader Graph couldn't do without it. This might not hold for things like properties, but we already have use cases of this within the system that have proved incredibly valuable, and most languages allow some form of it (unsafe blocks, inline asm in C++, etc.). There is always a need to get to the lower layer and bypass the abstraction for one reason or another.


    On another note: how will custom editors for blocks be handled and combined into an interface for a shader built out of reusable blocks? In Better Shaders I handle this by letting blocks declare custom editors that receive a list of properties they can display; a top-level custom editor then iterates through the blocks, sending each block's editor its data.
     
    noio likes this.
  46. aleksandrk

    aleksandrk

    Unity Technologies

    Joined:
    Jul 3, 2017
    Posts:
    3,014
    Unfortunately, this is not the case. The separate sampler state support that comes with OpenGL is entirely different, and we're limited by this.

    We do provide raw HLSL support here: you can say your block needs an HLSL include file, and put your HLSL declarations there.
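
    As a sketch of that escape hatch (the include mechanism's exact syntax isn't shown in this thread, and the file name is made up), the raw declarations would live in a plain HLSL file that the block pulls in:

    ```hlsl
    // MyDeclarations.hlsl -- ordinary HLSL, opaque to the Block Shader parser.
    TEXTURE2D(_Foo);
    TEXTURE2D(_FooNormal);
    SAMPLER(sampler_Foo);   // one sampler shared by both textures
    ```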
     
    jbooth likes this.
  47. JesOb

    JesOb

    Joined:
    Sep 3, 2012
    Posts:
    1,109
    Maybe it would be better to make a new tool for 2023+ that bakes projects targeting 2024+ to support only DX11 (maybe), DX12, Metal, and Vulkan?

    Drop support for abandoned tech and stop limiting yourselves.
    For older platforms we already have plenty of alternatives in Unity and on the Asset Store.
     
    LooperVFX, OCASM, NotaNaN and 4 others like this.
  48. neontropics

    neontropics

    Joined:
    Apr 12, 2013
    Posts:
    1
    This looks really cool! Any ideas on timelines for when a public beta could be released?

    I gave up on URP and stuck with BiRP because the amount of setup and trial and error needed just to convert very simple shaders was insane. This seems to fix that and lead to a world where writing URP shaders is actually fun and productive.

    I agree with everyone that straight up BiRP support would be awesome, but a sensible system like this means that it’d be much more straightforward to convert shaders.
     
    VirtusH, LaireonGames, Edy and 2 others like this.
  49. sacb0y

    sacb0y

    Joined:
    May 9, 2016
    Posts:
    873
    I'd agree with this if Vulkan on Android felt more complete... but my personal experience, with things like the game crashing when using more than 1 GB of RAM, says otherwise.
     
  50. aleksandrk

    aleksandrk

    Unity Technologies

    Joined:
    Jul 3, 2017
    Posts:
    3,014
    Unfortunately, GLES3 is the minspec for new Android devices. URP has to work there as well :)
     
    sacb0y likes this.