
AsyncGPUReadbackRequest.getData<Byte> with RenderTexture

Discussion in 'General Graphics' started by pyjamaslug, Aug 21, 2019.

  1. pyjamaslug

    Joined: Jul 5, 2017
    Posts: 51
    I am doing image capture with AsyncGPUReadbackRequest, using two different approaches: one captures the screen image and the other captures from a RenderTexture.

    The screen-image capture sets up the request like this (in OnRenderImage):
    Code (CSharp):
    inputBundle.request = AsyncGPUReadback.Request(source, 0, TextureFormat.RGBA32);
    and deals with it in Update like this:
    Code (CSharp):
    outputBundle.image = req.request.GetData<Byte>(0).ToArray();
    I then just hand the byte array to a PNG encoder and it works perfectly.
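    For context, the whole screen-capture flow boils down to something like this (a trimmed-down sketch; the class name and queue scaffolding here are not my exact bundle code):
    Code (CSharp):
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.Rendering;

    public class ScreenReadback : MonoBehaviour
    {
        // Pending readback requests, oldest first.
        private readonly Queue<AsyncGPUReadbackRequest> requests = new Queue<AsyncGPUReadbackRequest>();

        private void OnRenderImage(RenderTexture source, RenderTexture destination)
        {
            // Ask for the frame back as RGBA32 (one byte per channel).
            requests.Enqueue(AsyncGPUReadback.Request(source, 0, TextureFormat.RGBA32));
            Graphics.Blit(source, destination);
        }

        private void Update()
        {
            while (requests.Count > 0 && requests.Peek().done)
            {
                var req = requests.Dequeue();
                if (req.hasError)
                    continue;

                // Raw RGBA32 bytes, ready for the PNG encoder.
                byte[] image = req.GetData<byte>(0).ToArray();
                // ... hand 'image' to the encoder here.
            }
        }
    }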
    With the capture from a RenderTexture, the capture is scheduled during Update like this:
    Code (CSharp):
    currentRT = RenderTexture.active;
    if (rT != null)
    {
        Destroy(rT);
    }
    rT = new RenderTexture(outputBundle.width, outputBundle.height, 24, RenderTextureFormat.RFloat);
    GetComponent<Camera>().targetTexture = rT;
    RenderTexture.active = rT;
    then executed during OnRenderImage like this:
    Code (CSharp):
    inputBundle.request = AsyncGPUReadback.Request(rT);
    inputQueue.Enqueue(inputBundle);
    finally, it is retrieved like this:
    Code (CSharp):
    maskBundle.image = req.request.GetData<float>(0).ToArray();
    This works OK, but I end up with an array of floats when I really want an array of bytes that I can hand off to a compression stream for writing out to a file. If I change GetData to ask for a Byte array, I get wrong data back. I'd appreciate suggestions for how to get the RenderTexture request to return a valid byte array.
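    For reference, here is the whole RenderTexture path condensed into one place (a simplified sketch, with the same caveats as above about the bundle bookkeeping):
    Code (CSharp):
    // Update: render into a 32-bit float target.
    rT = new RenderTexture(outputBundle.width, outputBundle.height, 24, RenderTextureFormat.RFloat);
    GetComponent<Camera>().targetTexture = rT;

    // OnRenderImage: no destination format is given, so the readback
    // comes back in the target's own format (RFloat).
    inputBundle.request = AsyncGPUReadback.Request(rT);

    // Update, once the request is done: one float per pixel.
    maskBundle.image = req.request.GetData<float>(0).ToArray();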
     
  2. joelv

    Unity Technologies

    Joined: Mar 20, 2015
    Posts: 203
    The reason you get a float array seems to be that your render target has a float format (RenderTextureFormat.RFloat) and you have not asked for an automatic image conversion, as you did with the first request (AsyncGPUReadback.Request(rT) vs. AsyncGPUReadback.Request(source, 0, TextureFormat.RGBA32)).

    You could render to a byte buffer, request conversion to RGBA32 or do a manual conversion of the float array.
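    For the manual conversion, something like this would do (assuming the RFloat values are normalized 0..1 and you want one byte per pixel; the class and method names are just for illustration):
    Code (CSharp):
    using UnityEngine;

    public static class ReadbackConversion
    {
        // floatData: the result of request.GetData<float>(0).ToArray() on an RFloat target,
        // assumed to hold normalized 0..1 values.
        public static byte[] ToBytes(float[] floatData)
        {
            var bytes = new byte[floatData.Length];
            for (int i = 0; i < floatData.Length; i++)
            {
                // Map a normalized 0..1 float to a single 0..255 byte.
                bytes[i] = (byte)(Mathf.Clamp01(floatData[i]) * 255f);
            }
            return bytes;
        }
    }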

    Hope this helps.
     
  3. pyjamaslug

    Joined: Jul 5, 2017
    Posts: 51
    Actually, I know that's what I am doing; it's the only way I can get sensible data back from the request. When I render to a byte-oriented RenderTexture such as RenderTextureFormat.ARGB32 and request a conversion to TextureFormat.RGBA32, I can't get it to work at all.
    Do you have an example of how it is meant to work?

    EDIT:
    I just tried out your suggestion to use a byte-oriented RenderTexture and re-confirmed that it doesn't work. I used the image-capture camera, so I can be sure the shader is correct (it is a Unity standard shader), and changed it to render to a texture. Here's what it looks like:
    Schedule the capture during Update:
    Code (CSharp):
    rT = new RenderTexture(outputBundle.width, outputBundle.height, 24, RenderTextureFormat.ARGB32);
    GetComponent<Camera>().targetTexture = rT;
    RenderTexture.active = rT;
    and request it back during OnRenderImage
    Code (CSharp):
    inputBundle.request = AsyncGPUReadback.Request(rT, 0, TextureFormat.RGBA32);
    I tried it with and without the request for format conversion and neither worked.

    Can you be more precise about what you mean by 'render to a byte buffer'?
     
    Last edited: Aug 25, 2019
  4. ataulien

    Joined: Nov 7, 2017
    Posts: 5
    What do you mean by "valid byte array"? I'm also using AsyncGPUReadback to store a float RenderTexture to disk, and I'm just writing the bytes given by GetData<byte>(...). Keep in mind that the datatype you specify in GetData does not do any conversion; it only selects a type and puts the bits from the GPU resource, as they are, into that datatype.

    For 32-bit floating-point render targets, you can combine 4 bytes to get a valid float back. I'm not sure whether AsyncGPUReadback can do format conversion. If needed, you can use the float values to calculate the RGBA32 values yourself, or use Graphics.ConvertTexture (https://docs.unity3d.com/ScriptReference/Graphics.ConvertTexture.html) to convert your render target to a different format, but that needs an intermediate render target to hold the converted data.
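    To illustrate the "bits, not conversion" point: on an RFloat target the bytes you get from GetData<byte> are just the raw 32-bit float bit patterns, four per pixel, so you can recombine them yourself. A rough sketch (the helper name is made up, and it assumes the usual little-endian byte layout):
    Code (CSharp):
    using System;

    public static class FloatReadbackUtil
    {
        // rawBytes: request.GetData<byte>(0).ToArray() from an RFloat render target.
        // Every 4 consecutive bytes are the bit pattern of one 32-bit float pixel.
        public static float[] BytesToFloats(byte[] rawBytes)
        {
            var pixels = new float[rawBytes.Length / 4];
            for (int i = 0; i < pixels.Length; i++)
                pixels[i] = BitConverter.ToSingle(rawBytes, i * 4);
            return pixels;
        }
    }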
     
  5. npatch

    Joined: Jun 26, 2015
    Posts: 247
    Is it possible to read back just the depth buffer of the RenderTexture?
     
  6. pyjamaslug

    Joined: Jul 5, 2017
    Posts: 51
    I mean a byte array that gives me back exactly the same data, in the same order, that I put into it in the shader. You'll see in my first example that I am doing exactly that: the result is requested as a byte array, but the source is the display buffer. Of course I tried the same thing with a RenderTexture, but when I do, I get incorrect data back. You say you have got it to work: how does your implementation differ from mine, apart from replacing GetData<float> with GetData<byte>?
     
  7. pyjamaslug

    Joined: Jul 5, 2017
    Posts: 51
    I need 32 bits and depth buffers are not always that deep.
     
  8. npatch

    Joined: Jun 26, 2015
    Posts: 247
    I managed to do it, but I added an intermediate render texture where I convert the 16-bit depth (1..0) value to a regular color buffer value. Unfortunately I can't find a way to straight up ask for the depthBuffer from the async readback; it seems to default to the color buffer.
    As for the 32-bit depth, the RenderTexture constructor only supports up to 24 bits.

    UPDATE: The weird thing is, I just noticed that if you create a RenderTexture using a RenderTextureDescriptor, depthBufferBits can be set to 32. On the other hand, RenderDoc tells me that the Camera Depth Texture is D32S8_TYPELESS (specifically DXGI_FORMAT_R32S8X24_TYPELESS), and according to the docs D32 means 32-bit depth. But even if I set depthBufferBits to other values, nothing changes, at least as far as RenderDoc is concerned.
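    Roughly what I ended up with looks like this (a simplified sketch; the depth-to-color material/shader is the part you have to supply yourself):
    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Rendering;

    public class DepthReadback : MonoBehaviour
    {
        // Material whose shader samples _CameraDepthTexture and writes the
        // depth value into the color output (shader not included here).
        public Material depthToColor;

        private RenderTexture depthRT;

        private void Start()
        {
            GetComponent<Camera>().depthTextureMode |= DepthTextureMode.Depth;
            depthRT = new RenderTexture(Screen.width, Screen.height, 0, RenderTextureFormat.RFloat);
        }

        private void OnRenderImage(RenderTexture source, RenderTexture destination)
        {
            // Convert depth into a plain color target first, then read that back.
            Graphics.Blit(source, depthRT, depthToColor);
            AsyncGPUReadback.Request(depthRT, 0, request =>
            {
                if (!request.hasError)
                {
                    var depths = request.GetData<float>(0); // one depth value per pixel
                    // ... use or store 'depths' here
                }
            });
            Graphics.Blit(source, destination);
        }
    }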
     
    Last edited: Aug 26, 2019