Read Texture2DArray from GPU to CPU for compute shader

Discussion in 'Shaders' started by sstrong, Nov 15, 2019.

  1. sstrong

    sstrong

    Joined:
    Oct 16, 2013
    Posts:
    1,370
    How can I read Textures back from a RWTexture2DArray<float4> in a compute shader to my C# code? I'd like to do something like the following abbreviated code:

    Code (CSharp):
        Texture2DArray tex2DArray = new Texture2DArray(width, height, numTextures, TextureFormat.ARGB32, false, true);
        ..
        computeShader.SetTexture(kernelId, rwTex2DArray, tex2DArray);
        ..
        computeShader.Dispatch(kernelId, threadGroupX, 1, 1);
        ..
        // READ Texture2DArray from GPU to CPU

        // Dispose of Texture2DArray
        UnityEngine.Object.Destroy(tex2DArray);
     
  2. Olmi

    Olmi

    Joined:
    Nov 29, 2012
    Posts:
    877
    Hi @sstrong,

    You need to use a RenderTexture with its dimension set to TextureDimension.Tex2DArray, and in your compute shader use RWTexture2DArray.

    Then read the data back on the CPU side with AsyncGPUReadback. You can then copy that data into separate Texture2Ds slice by slice with GetPixels/SetPixels.
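    Something like this, just a rough untested sketch (the slice count, kernel/property names, and ARGB32 format are assumptions from your snippet):

    Code (CSharp):
        // RenderTexture configured as a 2D array so the compute shader can write to it
        RenderTexture rt = new RenderTexture(width, height, 0, RenderTextureFormat.ARGB32);
        rt.dimension = UnityEngine.Rendering.TextureDimension.Tex2DArray;
        rt.volumeDepth = numTextures;   // number of slices in the array
        rt.enableRandomWrite = true;    // required for RWTexture2DArray access
        rt.Create();

        computeShader.SetTexture(kernelId, "rwTex2DArray", rt);
        computeShader.Dispatch(kernelId, threadGroupX, 1, 1);

        // Request a readback of mip 0; the callback fires once the GPU data is ready
        UnityEngine.Rendering.AsyncGPUReadback.Request(rt, 0, request =>
        {
            if (request.hasError) return;
            var pixels = request.GetData<Color32>(0);   // data for layer/slice 0
            // copy pixels into a Texture2D with SetPixels32, etc.
        });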

    Check this thread, it has all the critical information you need:
    https://forum.unity.com/threads/upl...e-as-texture2darray-in-compute-shader.495137/

    I don't know if it's possible to read that array back without AsyncGPUReadback. Texture2DArray doesn't have ReadPixels, as you probably noticed, and I don't know of any other way.
     
  3. sstrong

    sstrong

    Joined:
    Oct 16, 2013
    Posts:
    1,370
    Yeah, I've seen that thread. I wanted something I could read synchronously, as it's part of code that does other things too. I'm already using Texture2DArray as an input without needing a RenderTexture with the dimension parameter (that works fine).

    Currently I'm using a ComputeBuffer for this scenario, but I'd like to switch to RWTexture2DArrays.
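
    For reference, the ComputeBuffer version reads back synchronously with GetData, roughly like this (untested sketch, buffer/kernel names are placeholders):

    Code (CSharp):
        // One float4 per pixel per slice
        ComputeBuffer buffer = new ComputeBuffer(width * height * numTextures, sizeof(float) * 4);
        computeShader.SetBuffer(kernelId, "resultBuffer", buffer);
        computeShader.Dispatch(kernelId, threadGroupX, 1, 1);

        // Synchronous readback - stalls the CPU until the GPU work completes
        Vector4[] results = new Vector4[width * height * numTextures];
        buffer.GetData(results);
        buffer.Release();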
     