
GLES2/3 error with bit field operations.

Discussion in 'Shaders' started by Paul_H23, Jul 31, 2020.

  1. Paul_H23

    Paul_H23

    Joined:
    Jun 19, 2019
    Posts:
    21
    I've seen various posts about this, and one ticket that claims to be closed, but given the evidence I'm seeing I can't see how it can be. I'm trying to compile a shader (source below) on iOS, targeting the simulator for now, but realistically the problem is going to show on any GLES3 device. The same, or worse, happens when targeting GLES2, which is what I want to do on Android to support the widest range of devices. I'm currently at a complete loss as to how to solve this. It appears to me that the Unity shader compiler is errantly emitting GLES4 features for both GLES2 and GLES3 targets, which is crazy.


    Below is the shader, and after that the error message in Xcode when running on the simulator, simulating an iPhone 11 Pro Max running iOS 13.5.

    As you can see, the Unity compiler seems to insert an implementation of bitfieldInsert to address the problem on GLES3, but then goes on to use bitfieldExtract, which is GLES4 only.

    https://www.khronos.org/registry/OpenGL-Refpages/gl4/html/bitfieldExtract.xhtml
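    For reference, bitfieldExtract(value, offset, bits) just pulls `bits` bits out of `value` starting at bit `offset`, so it can be written with plain shifts and masks. A quick C sketch of the unsigned case (the function name here is mine, not anything Unity generates), using the same byte layout my shader assumes for _Neighbourhood (edges cell in the low byte, corners cell in the next byte):

    ```c
    #include <assert.h>
    #include <stdint.h>

    /* Shift-and-mask equivalent of GLSL's unsigned
       bitfieldExtract(value, offset, bits):
       extract `bits` bits starting at bit `offset`. */
    static uint32_t bitfield_extract_u32(uint32_t value, int offset, int bits)
    {
        uint32_t mask = (bits >= 32) ? 0xFFFFFFFFu : ((1u << (uint32_t)bits) - 1u);
        return (value >> (uint32_t)offset) & mask;
    }

    int main(void)
    {
        /* e.g. 0x0203 -> edges cell 3 (low byte), corners cell 2 (next byte) */
        uint32_t neighbourhood = 0x0203u;
        assert(bitfield_extract_u32(neighbourhood, 0, 8) == 3u); /* edges   */
        assert(bitfield_extract_u32(neighbourhood, 8, 8) == 2u); /* corners */
        return 0;
    }
    ```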

    ------------------------------------------

    Shader "Unlit/AnimatedTransition"
    {
        Properties
        {
            _MainTex ("Texture", 2D) = "white" {}
            _MaskTex ("Mask", 2D) = "white" {}
            _Color ("Color", Color) = (1,1,1,1)
            _Frames ("Frame Count", Int) = 5
            _Speed ("Per Frame Duration", Float) = 0.5
            [Toggle(CLAMP_MASK)] _ClampMask ("Clamp Mask?", Float) = 0

            [PerRendererData] _Neighbourhood ("Neighbourhood", Int) = 15
        }
        SubShader
        {
            Tags {
                "RenderType" = "Transparent"
                "Queue" = "Transparent"
                "IgnoreProjector" = "True"
            }
            LOD 100
            Blend SrcAlpha OneMinusSrcAlpha
            Lighting Off

            Pass
            {
                CGPROGRAM
                #pragma vertex vert
                #pragma fragment frag
                #pragma shader_feature CLAMP_MASK

                #include "UnityCG.cginc"

                struct appdata
                {
                    float4 vertex : POSITION;
                    float2 uv : TEXCOORD0;
                };

                struct v2f
                {
                    float2 uv : TEXCOORD0;
                    float4 vertex : SV_POSITION;
                };

                sampler2D _MainTex;
                sampler2D _MaskTex;
                fixed4 _Color;
                uint _Neighbourhood;
                float4 _MainTex_ST;
                int _Frames;
                float _Speed;

                fixed4 shot (sampler2D tex, float2 uv, float dx, float dy, int cell, int row) {
                    return tex2D(tex, float2(
                        (uv.x * dx) + (cell * dx),
                        (uv.y * dy) + (row * dy)
                    ));
                }

                v2f vert (appdata v)
                {
                    v2f o;
                    o.vertex = UnityObjectToClipPos(v.vertex);
                    o.uv = TRANSFORM_TEX(v.uv, _MainTex);
                    return o;
                }

                fixed4 frag (v2f i) : SV_Target
                {
                    float frame = fmod(_Time.y / _Speed, _Frames);
                    int current = floor(frame);

                    int edges_cell = _Neighbourhood & 0xFF;
                    int corners_cell = (_Neighbourhood & 0xFF00) >> 8;
                    float dx = 1.0 / 16.0;
                    float tex_dy = 1.0 / (2.0 * _Frames);
                    float mask_dy = 1.0 / 2.0;

                    fixed4 edge_mask = shot(_MaskTex, i.uv, dx, mask_dy, edges_cell, 1);
                    fixed4 corner_mask = shot(_MaskTex, i.uv, dx, mask_dy, corners_cell, 0);
                    fixed4 mask = min(edge_mask, corner_mask);

                    #ifdef CLAMP_MASK
                    mask.a = mask.a < 1.0 ? 0.0 : 1.0;
                    #endif

                    fixed4 edge_col = shot(_MainTex, i.uv, dx, tex_dy, edges_cell, (current * 2) + 1);
                    fixed4 corner_col = shot(_MainTex, i.uv, dx, tex_dy, corners_cell, (current * 2));
                    fixed3 combined_col = (corner_col.rgb * corner_col.a) + (edge_col.rgb * (1.0 - corner_col.a));
                    float combined_a = edge_col.a + (corner_col.a * (1.0 - edge_col.a));

                    fixed4 col = (fixed4(combined_col.r, combined_col.g, combined_col.b, combined_a) * mask.a) * _Color;

                    return col;
                }
                ENDCG
            }
        }
    }




    -------- failed compiling:
    fragment evaluation shader
    ERROR: 0:71: Invalid call of undeclared identifier 'bitfieldExtract'

    Note: Creation of internal variant of shader 'Unlit/AnimatedTransition' failed.
    -------- Shader compilation failed
    #version 300 es
    precision highp float;
    precision highp int;
    #define HLSLCC_ENABLE_UNIFORM_BUFFERS 1
    #if HLSLCC_ENABLE_UNIFORM_BUFFERS
    #define UNITY_UNIFORM
    #else
    #define UNITY_UNIFORM uniform
    #endif
    #define UNITY_SUPPORTS_UNIFORM_LOCATION 0
    #if UNITY_SUPPORTS_UNIFORM_LOCATION
    #define UNITY_LOCATION(x) layout(location = x)
    #define UNITY_BINDING(x) layout(binding = x, std140)
    #else
    #define UNITY_LOCATION(x)
    #define UNITY_BINDING(x) layout(std140)
    #endif
    uniform vec4 _Time;
    uniform mediump vec4 _Color;
    uniform uint _Neighbourhood;
    uniform int _Frames;
    uniform float _Speed;
    UNITY_LOCATION(0) uniform mediump sampler2D _MaskTex;
    UNITY_LOCATION(1) uniform mediump sampler2D _MainTex;
    in highp vec2 vs_TEXCOORD0;
    layout(location = 0) out mediump vec4 SV_Target0;
    vec3 u_xlat0;
    mediump vec4 u_xlat16_0;
    ivec3 u_xlati0;
    vec4 u_xlat1;
    mediump vec4 u_xlat16_2;
    mediump vec4 u_xlat16_3;
    mediump vec4 u_xlat16_4;
    mediump float u_xlat16_5;
    float u_xlat6;
    mediump float u_xlat16_6;
    bool u_xlatb12;
    float u_xlat18;
    int u_xlati18;
    uint u_xlatu18;
    int int_bitfieldInsert(int base, int insert, int offset, int bits) {
    uint mask = ~(uint(0xffffffff) << uint(bits)) << uint(offset);
    return int((uint(base) & ~mask) | ((uint(insert) << uint(offset)) & mask));
    }
    void main()
    {
    u_xlat0.x = _Time.y / _Speed;
    u_xlat6 = float(_Frames);
    u_xlat0.x = u_xlat0.x / u_xlat6;
    #ifdef UNITY_ADRENO_ES3
    u_xlatb12 = !!(u_xlat0.x>=(-u_xlat0.x));
    #else
    u_xlatb12 = u_xlat0.x>=(-u_xlat0.x);
    #endif
    u_xlat0.x = fract(abs(u_xlat0.x));
    u_xlat0.x = (u_xlatb12) ? u_xlat0.x : (-u_xlat0.x);
    u_xlat0.x = u_xlat6 * u_xlat0.x;
    u_xlat6 = u_xlat6 + u_xlat6;
    u_xlat6 = float(1.0) / u_xlat6;
    u_xlat0.x = floor(u_xlat0.x);
    u_xlati0.x = int(u_xlat0.x);
    u_xlati0.z = int(u_xlati0.x << 1);
    u_xlati0.x = int(int_bitfieldInsert(1,u_xlati0.x,1,31) );
    u_xlat0.xz = vec2(u_xlati0.xz);
    u_xlat18 = u_xlat6 * vs_TEXCOORD0.y;
    u_xlat1.z = u_xlat0.z * u_xlat6 + u_xlat18;
    u_xlat0.z = u_xlat0.x * u_xlat6 + u_xlat18;
    u_xlatu18 = bitfieldExtract(_Neighbourhood, 8, 8);
    u_xlat18 = float(int(u_xlatu18));
    u_xlat1.yw = vs_TEXCOORD0.yx * vec2(0.5, 0.0625);
    u_xlat1.x = u_xlat18 * 0.0625 + u_xlat1.w;
    u_xlat16_2 = texture(_MainTex, u_xlat1.xz);
    u_xlat16_3.x = (-u_xlat16_2.w) + 1.0;
    u_xlati18 = int(uint(_Neighbourhood & 255u));
    u_xlat18 = float(u_xlati18);
    u_xlat0.x = u_xlat18 * 0.0625 + u_xlat1.w;
    u_xlat16_4 = texture(_MainTex, u_xlat0.xz);
    u_xlat16_3.xyz = u_xlat16_3.xxx * u_xlat16_4.xyz;
    u_xlat16_3.xyz = u_xlat16_2.xyz * u_xlat16_2.www + u_xlat16_3.xyz;
    u_xlat16_5 = (-u_xlat16_4.w) + 1.0;
    u_xlat16_3.w = u_xlat16_2.w * u_xlat16_5 + u_xlat16_4.w;
    u_xlat0.y = vs_TEXCOORD0.y * 0.5 + 0.5;
    u_xlat16_0.x = texture(_MaskTex, u_xlat0.xy).w;
    u_xlat16_6 = texture(_MaskTex, u_xlat1.xy).w;
    u_xlat16_5 = min(u_xlat16_6, u_xlat16_0.x);
    u_xlat16_0 = u_xlat16_3 * vec4(u_xlat16_5);
    SV_Target0 = u_xlat16_0 * _Color;
    return;
    }
    -------- failed compiling:
     
  2. aleksandrk

    aleksandrk

    Unity Technologies

    Joined:
    Jul 3, 2017
    Posts:
    1,536
    This is actually a GLES 3.1 feature. There's no GLES4; you've mixed it up with GL 4 :)

    If you want your shader to run on GLES2 hardware, don't do any integer math there. It's not guaranteed to be supported on all hardware.
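    For the byte extraction in your shader, a float-only version should work: floor(fmod(n, 256.0)) for the low byte and floor(n / 256.0) for the high byte, with _Neighbourhood passed as a Float property instead of an Int. A sketch of the same arithmetic in C, just to show the two forms agree:

    ```c
    #include <assert.h>
    #include <math.h>

    /* Float-only equivalents of (n & 0xFF) and ((n & 0xFF00) >> 8),
       valid for n in [0, 65535] - the kind of arithmetic GLES2
       fragment shaders support without integer ops. */
    static float low_byte(float n)  { return floorf(fmodf(n, 256.0f)); }
    static float high_byte(float n) { return floorf(n / 256.0f); }

    int main(void)
    {
        float neighbourhood = 15.0f;          /* the default _Neighbourhood */
        assert(low_byte(neighbourhood)  == 15.0f);
        assert(high_byte(neighbourhood) == 0.0f);

        float packed = 2.0f * 256.0f + 3.0f;  /* corners cell 2, edges cell 3 */
        assert(low_byte(packed)  == 3.0f);
        assert(high_byte(packed) == 2.0f);
        return 0;
    }
    ```

    One caveat: declare the uniform highp (or keep the packed values small), since mediump float is only guaranteed about 10 bits of mantissa on GLES2 hardware.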

    Which Unity version are you using?
     
  3. Paul_H23

    Paul_H23

    Joined:
    Jun 19, 2019
    Posts:
    21
    You're right, my bad, I was getting mixed up between the GL and GLES specs. However, it still doesn't explain why Unity's shader compiler uses GLES 3.1 features when targeting GLES2. Surely that's a bug?

    Looks like I'll have to rethink the shader to work on GLES2.

    I'm using Unity 2019.4.6f1.
     
  4. KokkuHub

    KokkuHub

    Joined:
    Feb 15, 2018
    Posts:
    349
    Because bitwise operations are a GLES3 feature; they aren't part of the GLES2 spec at all.
     
  5. Paul_H23

    Paul_H23

    Joined:
    Jun 19, 2019
    Posts:
    21
    I get that, but that's no reason to inject function calls that simply don't exist in GLES2. If I use bitwise operations on GLES2 and the shader doesn't compile because of them, that's on me, and I can change it. If Unity injects missing functions on GLES2, that's on them, and it isn't right: injecting functions into the transpiled shader code that can't possibly exist on the target platform seems like a bug. It's like a compiler inserting x64 instructions into an x86 executable just because the code uses a feature that isn't available on x86.

    Ideally, if I'm using bitwise arithmetic in a shader and then targeting GLES2, the Unity shader compiler should warn or error at that point; that would be the perfect solution. But just adding functions that can't possibly compile doesn't seem like a sensible solution under any circumstances, to me anyway. Happy to be proven wrong with an example where it makes sense.
     
    aleksandrk likes this.
  6. aleksandrk

    aleksandrk

    Unity Technologies

    Joined:
    Jul 3, 2017
    Posts:
    1,536
    @Paul_H23 can you please submit a bug report?
     