I need to use a texture array in my GLSL shader, targeting WebGL 2. It works perfectly in the editor, but not when running on the web.

Code (CSharp):
Debug.Log("supports2DArrayTextures: " + SystemInfo.supports2DArrayTextures);

reports true, and texture arrays are supposed to work according to the ES3 specs. The Unity docs say:

Platform Support
Texture arrays need to be supported by the underlying graphics API and the GPU. They are available on:
- Direct3D 11/12 (Windows, Xbox One)
- OpenGL Core (Mac OS X, Linux)
- Metal (iOS, Mac OS X)
- OpenGL ES 3.0 (Android, WebGL 2.0)
- PlayStation 4

What is happening here? Is this a bug in Unity, or am I doing something wrong?

This is the shader:

Code (GLSL):
Shader "testshader"
{
    Properties
    {
        _MainTex("Base (RGB)", 2D) = "white" {}
    }
    SubShader
    {
        Tags { "Queue" = "Geometry" }
        Pass
        {
            GLSLPROGRAM

            #ifdef VERTEX
            varying vec2 TextureCoordinate;

            void main()
            {
                gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
                TextureCoordinate = gl_MultiTexCoord0.xy;
            }
            #endif

            #ifdef FRAGMENT
            precision mediump sampler2DArray;

            uniform sampler2D _MainTex;
            uniform sampler2DArray textureArray;
            varying vec2 TextureCoordinate;

            void main(void)
            {
                // Sample slice 0 of the array; the z component selects the slice.
                vec3 uvz = vec3(TextureCoordinate, 0);
                vec4 c = texture(textureArray, uvz);
                gl_FragColor = c;
            }
            #endif

            ENDGLSL
        }
    }
}

I always get white from the sampler, no matter what is in the texture array. The textures are in the correct format (RGBA32) and size (1024x1024).

This is the script:

Code (CSharp):
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class Test : MonoBehaviour
{
    public Material testMaterial;
    public Texture2DArray texturesArray;
    public List<Texture2D> textures;

    private Texture2DArray createTexture2DArray(List<Texture2D> textures, int size)
    {
        Texture2DArray texture2DArray = new Texture2DArray(size, size, textures.Count, TextureFormat.RGBA32, false);
        texture2DArray.filterMode = FilterMode.Bilinear;
        texture2DArray.wrapMode = TextureWrapMode.Repeat;

        for (int i = 0; i < textures.Count; i++)
        {
            // Copy each texture into its slice of the array texture
            Graphics.CopyTexture(textures[i], 0, 0, texture2DArray, i, 0);
        }
        return texture2DArray;
    }

    void Start()
    {
        texturesArray = createTexture2DArray(this.textures, 1024);
        testMaterial.SetTexture("textureArray", texturesArray);
        Debug.Log("supports2DArrayTextures: " + SystemInfo.supports2DArrayTextures);
    }
}

I have also tested with an HLSL equivalent, using this: https://gist.github.com/Spaxe/57de76c993a4dc8bb37275acbe597ef2 Same story there: it works in the editor but not in the browser.

It looks to me like

Code (CSharp):
Graphics.CopyTexture(textures[i], 0, 0, texture2DArray, i, 0);

is not working in WebGL, as I get the same result in the editor if I omit this call.
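One quick way to narrow this down is to log Unity's copy-texture capability next to the array-support flag. This is a minimal diagnostic sketch using two standard SystemInfo properties; it doesn't fix anything, it only shows which copy path is available:

Code (CSharp):
// A platform can support 2D texture arrays while still reporting
// no support for Graphics.CopyTexture.
Debug.Log("supports2DArrayTextures: " + SystemInfo.supports2DArrayTextures);
Debug.Log("copyTextureSupport: " + SystemInfo.copyTextureSupport);

If copyTextureSupport comes back as None, the copy step, not the shader, is the culprit.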
Okay, answering my own question here, as usual... I found a workaround:

Code (CSharp):
if (SystemInfo.copyTextureSupport == UnityEngine.Rendering.CopyTextureSupport.None)
    texture2DArray.SetPixelData(t.GetRawTextureData(), 0, i);
else
    Graphics.CopyTexture(t, 0, 0, texture2DArray, i, 0);
...
texture2DArray.Apply();

The texture formats must match exactly (here, both the source textures and the array are RGBA32), since SetPixelData copies the raw bytes with no format conversion.
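For completeness, here is how the workaround can slot into the createTexture2DArray method from the question. This is a minimal sketch, assuming the source textures are marked Read/Write enabled in their import settings (GetRawTextureData requires a readable texture) and use the same RGBA32 format as the array:

Code (CSharp):
private Texture2DArray createTexture2DArray(List<Texture2D> textures, int size)
{
    Texture2DArray texture2DArray = new Texture2DArray(size, size, textures.Count, TextureFormat.RGBA32, false);
    texture2DArray.filterMode = FilterMode.Bilinear;
    texture2DArray.wrapMode = TextureWrapMode.Repeat;

    // WebGL reports CopyTextureSupport.None, so fall back to a CPU-side copy there.
    bool canCopyOnGpu = SystemInfo.copyTextureSupport != UnityEngine.Rendering.CopyTextureSupport.None;

    for (int i = 0; i < textures.Count; i++)
    {
        Texture2D t = textures[i];
        if (canCopyOnGpu)
        {
            // GPU-side copy; never touches system memory.
            Graphics.CopyTexture(t, 0, 0, texture2DArray, i, 0);
        }
        else
        {
            // Raw byte copy into slice i, mip 0. Only valid because the
            // source and the array share the exact same format (RGBA32).
            texture2DArray.SetPixelData(t.GetRawTextureData(), 0, i);
        }
    }

    // Apply uploads the CPU-side buffer to the GPU, so it is only needed
    // (and only safe) on the SetPixelData path; on the CopyTexture path it
    // would overwrite the slices that were just copied on the GPU.
    if (!canCopyOnGpu)
        texture2DArray.Apply(false);

    return texture2DArray;
}

If your Unity version predates Texture2DArray.SetPixelData, texture2DArray.SetPixels32(t.GetPixels32(), i) is an equivalent, slower fallback.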