Material.SetFloat works, but MaterialPropertyBlock doesn't

Discussion in 'Shaders' started by LukasKiefer, Apr 23, 2020.

  1. LukasKiefer

    LukasKiefer

    Joined:
    Jul 25, 2017
    Posts:
    9
    I am developing a shader whose depth can be offset forward or backward at runtime, so that I can draw specific objects on top of others dynamically.

    Right now, our app uses three cameras and camera stacking, and I shuffle my objects around between layers.
    I thought a shader depth offset should be much faster.

    This is the shader I have for testing:

    Code (CSharp):
    Shader "DepthTest" {
      Properties {
        _Color ("Color", Color) = (1,1,1,1)
        _Offset ("Z Offset", Float) = 0
      }
      SubShader {
        Tags { "RenderType"="Opaque" "Queue"="Geometry" }
        Pass {
          Offset [_Offset], [_Offset]
          Lighting Off
          Color [_Color]
        }
      }
    }
    This is the script to test modifying the shader offset:

    Code (CSharp):
    using UnityEngine;

    [RequireComponent(typeof(Renderer))]
    public class RenderqueueTester : MonoBehaviour
    {
        private Renderer renderer;
        private Material mat;
        [SerializeField] private float maxVal = 1000f;
        [SerializeField] private float val;
        private MaterialPropertyBlock PropBlock;

        [SerializeField] private bool UsePropertyBlock = false;

        [SerializeField] private string propertyName = "_Offset";

        void Start()
        {
            renderer = GetComponent<Renderer>();
            mat = renderer.sharedMaterial;
            PropBlock = new MaterialPropertyBlock();
        }

        void Update()
        {
            val = Mathf.Sin(Time.time) * maxVal;

            if (UsePropertyBlock)
            {
                renderer.GetPropertyBlock(PropBlock);
                PropBlock.SetFloat(propertyName, val);
                renderer.SetPropertyBlock(PropBlock);
            }
            else
            {
                mat.SetFloat(propertyName, val);
            }
        }
    }
    This is what my test scene looks like. I want to change the depth of the red box at runtime so it renders in front of or behind the other boxes.



    As you can see, I have two methods: one using a MaterialPropertyBlock and the other working on the material itself.

    I want to use the MaterialPropertyBlock, but it's just not working, even though I have used this method before without problems. It seems like a depth offset cannot be modified per renderer?
    mat.SetFloat works without problems, but that would bring a lot of problems with material instances etc.

    So, I have some questions:

    1. Why does this work on the material but not with a MaterialPropertyBlock?
    2. Is there a better way to specify a depth offset, for example:
    "this renderer should be drawn on top of that one", changeable at runtime?
    3. Sometimes an error pops up and the rendering "breaks":
    Assertion failed on expression: 'SUCCEEDED(hr)'. Why is that?
     
    Last edited: Apr 23, 2020
  2. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,343
    Material property blocks can't modify render state properties, only shader uniforms. In other words, any value that's outside of a CGPROGRAM block can't be modified by a MaterialPropertyBlock. Your example shader above makes this a little confusing because there isn't any apparent CGPROGRAM block, but that's because it's written as a deprecated fixed function shader, for which Unity automatically generates a vertex fragment shader. You can see what the actual shader looks like by selecting it in the editor and clicking the "Show generated code" button.
    For that shader it looks like this:
    Code (csharp):
    Shader "DepthTest" {
    Properties {
     _Color ("Color", Color) = (1.000000,1.000000,1.000000,1.000000)
     _Offset ("Z Offset", Float) = 0.000000
    }
    SubShader {
     Tags { "QUEUE"="Geometry" "RenderType"="Opaque" }
     Pass {
      Tags { "QUEUE"="Geometry" "RenderType"="Opaque" }
      Offset [_Offset], [_Offset]
    CGPROGRAM
    #pragma vertex vert
    #pragma fragment frag
    #pragma target 2.0
    #include "UnityCG.cginc"
    #pragma multi_compile_fog
    #define USING_FOG (defined(FOG_LINEAR) || defined(FOG_EXP) || defined(FOG_EXP2))

    // uniforms
    half4 _Color;

    // vertex shader input data
    struct appdata {
      float3 pos : POSITION;
      UNITY_VERTEX_INPUT_INSTANCE_ID
    };

    // vertex-to-fragment interpolators
    struct v2f {
      fixed4 color : COLOR0;
      #if USING_FOG
        fixed fog : TEXCOORD0;
      #endif
      float4 pos : SV_POSITION;
      UNITY_VERTEX_OUTPUT_STEREO
    };

    // vertex shader
    v2f vert (appdata IN) {
      v2f o;
      UNITY_SETUP_INSTANCE_ID(IN);
      UNITY_INITIALIZE_VERTEX_OUTPUT_STEREO(o);
      half4 color = _Color;
      float3 eyePos = mul (UNITY_MATRIX_MV, float4(IN.pos,1)).xyz;
      half3 viewDir = 0.0;
      o.color = saturate(color);
      // compute texture coordinates
      // fog
      #if USING_FOG
        float fogCoord = length(eyePos.xyz); // radial fog distance
        UNITY_CALC_FOG_FACTOR_RAW(fogCoord);
        o.fog = saturate(unityFogFactor);
      #endif
      // transform position
      o.pos = UnityObjectToClipPos(IN.pos);
      return o;
    }

    // fragment shader
    fixed4 frag (v2f IN) : SV_Target {
      fixed4 col;
      col = IN.color;
      // fog
      #if USING_FOG
        col.rgb = lerp (unity_FogColor.rgb, col.rgb, IN.fog);
      #endif
      return col;
    }
    ENDCG
     }
    }
    }
    Notice the Offset is outside of the CGPROGRAM block, and the _Color is inside it, but not inside any function or struct. (It's also conveniently labelled with the comment // uniforms.)

    You're looking to force the sorting of objects. Note that Offset only works here because the object you're moving is opaque. Opaque objects are sorted per pixel by the depth buffer, so moving an object further from or closer to the camera also changes the depth it writes. If this object and the objects it sorts against were transparent, that setting wouldn't do anything, since transparent objects generally don't write to the depth buffer. For those you have to use the material queue, which also rules out material property blocks.
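    Since the queue is render state, changing it at runtime means touching the material itself. A minimal sketch of what that could look like (the queue value here is just illustrative; accessing .material instantiates a per-renderer copy):

```csharp
using UnityEngine;

[RequireComponent(typeof(Renderer))]
public class QueueOverride : MonoBehaviour
{
    void Start()
    {
        // renderQueue is render state, so it can't go in a
        // MaterialPropertyBlock; it has to be set on the material.
        // .material creates a per-renderer instance so other
        // objects sharing the material are unaffected.
        Material mat = GetComponent<Renderer>().material;

        // Transparent queue starts at 3000; 3001 draws after the
        // default transparent objects. Illustrative value only.
        mat.renderQueue = 3001;
    }
}
```

    Keep in mind the instanced material should be destroyed when you're done with it, or it leaks.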

    It's also good to be aware that the implementation of Offset is device specific. Different GPUs / devices may not behave exactly the same, so a value that pushes an object over another on one GPU might not on another, or might push it further than you want. In most implementations the amount it "pushes" also changes with distance from the camera, so as you get closer / further away it'll get pushed less / more. Basically, don't use Offset for anything you want precise control over. It's best used when you have two surfaces right on top of each other and need to push one slightly closer to avoid z fighting, not to make sure something renders over a completely separate object.

    The "best" option for sorting objects, regardless of whether they're opaque or transparent, is to control their render order explicitly using the material's queue, or each renderer component's sortingOrder setting along with stencils. All renderer components have a sortingOrder property, but it's only exposed in the inspector & serialized for UI & sprite renderers; it can be set with a custom script on anything. As mentioned before, opaque objects sort by depth buffer, so their render order alone doesn't change the final rendered image, which is why I mentioned stencils. You'd have objects write to and read from the stencil buffer so they mask and are masked by each other irrespective of depth. That also requires modifying the material directly, since stencil settings are part of the render state, so it too excludes the use of a material property block.


    So ... let's look back at Offset. Is there a way to replicate Offset within the shader code, in a way that works with uniforms and is consistent across platforms? Why yes, there is.

    Hang onto your britches for this one; this is modifying the generated code above:
    Code (csharp):
    // uniforms
    half4 _Color;
    float _Offset; // add this to your uniforms

    // in the vert function, add this to the end of the function
      // transform position
      o.pos = UnityObjectToClipPos(IN.pos);

      // offset z depth in world units
      float3 viewPos = UnityObjectToViewPos(IN.pos);
      if (_Offset != 0.0 && viewPos.z < -_ProjectionParams.y)
      {
        // clamp to the near plane if the vertex started visible
        viewPos.z = min(-_ProjectionParams.y - 0.00001, viewPos.z - _Offset);
      }

      // calculate the clip space position of the depth offset position
      float4 depthOffsetPos = mul(UNITY_MATRIX_P, float4(viewPos, 1.0));

      // project the new depth into the original clip space position
      o.pos.z = (depthOffsetPos.z / depthOffsetPos.w) * o.pos.w;

      return o;

    As for the assertion error: no idea. There are a lot of less-than-useful error messages that show up in the log from Unity's internal code. Not much you can do about most of them, since it's not code you can look at.
     
    bns_YoheiMiyake likes this.
  3. LukasKiefer

    LukasKiefer

    Joined:
    Jul 25, 2017
    Posts:
    9
    Wow, thank you so much for this thorough explanation!
    Now I understand the workings behind it. Shaders are still a bit of uncharted territory for me.

    I did a quick test and it seems to work flawlessly.

    Even better this way, specifying the offset in world space units rather than something arbitrary.

    Now I have to test it in my project; I'm expecting quite big performance gains going from three fullscreen cameras to one.

    Is there an easy and simple way to replace the vertex shader for my materials that use the Standard shader? For the custom ones I'll have to do it myself anyway...
     
  4. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,343
    Not really, no. You can use a Surface Shader with the Standard lighting model to get something closer to the Standard shader, but getting all of the features of the Standard shader is a bigger pain. Also, this particular technique has lots of potential problems when it comes to lighting and shadows.
     
  5. LukasKiefer

    LukasKiefer

    Joined:
    Jul 25, 2017
    Posts:
    9
    Okay, thank you. I'll keep that in mind and check for complications.

    I think it can work; I'm not using real shadows, and the lighting in my shaders is faked.
    A lot of my materials use matcaps, and those should be fine, I think.