
AsyncGPUReadbackRequest.hasError usually true

Discussion in '2018.1 Beta' started by Dave-Taylor, Jan 18, 2018.

  1. Dave-Taylor

    Dave-Taylor

    Joined:
    Jul 7, 2012
    Posts:
    21
    I'm trying to get AsyncGPUReadback.Request to work. On the resulting AsyncGPUReadbackRequest, I'm seeing AsyncGPUReadbackRequest.done as true and AsyncGPUReadbackRequest.hasError as true, but the API doesn't give me a way to learn more about the nature of the error.

    I've created a RenderTexture with these parameters as the target texture of a camera:

    camera.targetTexture = new RenderTexture(1, 1, 24, RenderTextureFormat.ARGBFloat, RenderTextureReadWrite.Linear);

    After I've rendered it, inside OnPostRender(), I call:

    textureRequest = AsyncGPUReadback.Request(camera.targetTexture, 0, TextureFormat.RGBAFloat);

    (btw, I find it really confusing that there are different texture types for RenderTextureFormat's and TextureFormat's)

    Then in subsequent frames I poll until textureRequest.done is true. When it is, I find that textureRequest.hasError is usually true and only occasionally false. When hasError is false, I grab the data like this:

    NativeArray<Color> pixels = textureRequest.GetData<Color>(0);

    When it succeeds, it looks like the data it's putting into pixels[0] is correct. It just doesn't usually succeed.

    FYI, I'm using four cameras, each with its own target texture, and duty-cycling them over four frames so that they don't interfere with each other and have time to get the data back.

    Any theories about how to track this down? This has become a blocking issue.
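    For reference, here's the whole setup as one component, in case that helps spot the problem. This is an illustrative sketch for a single camera, not my actual code; the class and field names are my own, and in the 2018.1 beta AsyncGPUReadback lives under UnityEngine.Experimental.Rendering:

    ```csharp
    using Unity.Collections;
    using UnityEngine;
    using UnityEngine.Experimental.Rendering; // AsyncGPUReadback in the 2018.1 beta

    // Hypothetical sketch of the pattern described above, for a single camera.
    [RequireComponent(typeof(Camera))]
    public class LocatorReadback : MonoBehaviour
    {
        Camera cam;
        AsyncGPUReadbackRequest textureRequest;
        bool requestPending;

        void Awake()
        {
            cam = GetComponent<Camera>();
            cam.targetTexture = new RenderTexture(1, 1, 24,
                RenderTextureFormat.ARGBFloat, RenderTextureReadWrite.Linear);
        }

        void OnPostRender()
        {
            // Issue at most one request at a time for this camera.
            if (!requestPending)
            {
                textureRequest = AsyncGPUReadback.Request(
                    cam.targetTexture, 0, TextureFormat.RGBAFloat);
                requestPending = true;
            }
        }

        void Update()
        {
            // Poll in later frames; only read the data when hasError is false.
            if (requestPending && textureRequest.done)
            {
                requestPending = false;
                if (!textureRequest.hasError)
                {
                    NativeArray<Color> pixels = textureRequest.GetData<Color>(0);
                    Color texel = pixels[0]; // whatever the locator shader encoded
                }
            }
        }
    }
    ```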
     
  2. Dave-Taylor

    Dave-Taylor

    Joined:
    Jul 7, 2012
    Posts:
    21
    Nevermind. My own bug. :)
     
  3. laurentlavigne

    laurentlavigne

    Joined:
    Aug 16, 2012
    Posts:
    6,221
    What was your own bug?
     
  4. Dave-Taylor

    Dave-Taylor

    Joined:
    Jul 7, 2012
    Posts:
    21
    I was generating indexes into my old cameras using this line:
    int index = (frame - numFramesStored - 1) % numFramesStored;
    The fix was this:
    int index = (frame - (numFramesStored - 1)) % numFramesStored;

    I was basically looking at the current frame in addition to the older frames and just generally upsetting things.
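    For concreteness, here's a quick walk-through of the two formulas with numFramesStored = 4 (plain arithmetic, with an example frame number of my choosing):

    ```csharp
    int numFramesStored = 4;
    int frame = 10; // example frame; slots cycle as frame % numFramesStored

    // Slot being written this frame: 10 % 4 == 2.
    int current = frame % numFramesStored;

    // Buggy: (10 - 4 - 1) % 4 == 1 -- a slot whose readback is still in flight.
    int buggy = (frame - numFramesStored - 1) % numFramesStored;

    // Fixed: (10 - (4 - 1)) % 4 == 3 -- the oldest stored slot, written
    // numFramesStored - 1 frames ago, so its readback has had time to finish.
    int fixedIndex = (frame - (numFramesStored - 1)) % numFramesStored;
    ```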
     
  5. laurentlavigne

    laurentlavigne

    Joined:
    Aug 16, 2012
    Posts:
    6,221
    ah! these bugs take me forever to find...
     
  6. Claytonious

    Claytonious

    Joined:
    Feb 16, 2009
    Posts:
    900
    Out of curiosity, what are you using this feature for?
     
  7. Dave-Taylor

    Dave-Taylor

    Joined:
    Jul 7, 2012
    Posts:
    21
    I'm using a ComputeShader to render a curved surface on the GPU. When I point at a spot on the surface, I want to know which texel I'm pointing at. My approach is what I call a "locator camera": a camera with a 1x1 render target, aligned with the pointing device. That camera uses a different shader which takes the same geometry but renders out a single pixel encoding the coordinates of the texel I'm pointing at.

    Using Texture2D.ReadPixels() in 2017.3.0b1 causes really bad hitches that show up as enormous loads in Gfx.ReadbackImage. It's pretty clear that's the culprit.

    Using 2018.1.0b3, I am getting huge Gfx.WaitForPresent spikes, and I'm finding it frustrating to isolate the cause of the problem from there. I don't see any specific issues with either ReadPixels() or using AsyncGPUReadback.Request(), but I don't really know where to begin in debugging these Gfx.WaitForPresent spikes.
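    For contrast, the synchronous path looks roughly like this (a sketch, assuming `cam` is the locator camera; the blocking ReadPixels call is what surfaces as Gfx.ReadbackImage in the profiler):

    ```csharp
    // Illustrative synchronous readback of the 1x1 locator target.
    Texture2D readback = new Texture2D(1, 1, TextureFormat.RGBAFloat, false);
    RenderTexture.active = cam.targetTexture;
    readback.ReadPixels(new Rect(0, 0, 1, 1), 0, 0); // stalls until the GPU catches up
    RenderTexture.active = null;
    Color texel = readback.GetPixel(0, 0); // encoded texel coordinates
    ```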
     
  8. Dave-Taylor

    Dave-Taylor

    Joined:
    Jul 7, 2012
    Posts:
    21
    FYI, what I've decided is that I cannot rely on Unity to get information back from the GPU, because it's broken in 2017, and while fixed in 2018, something else is broken and impenetrable. It's the classic Unity rock & hard place. So I'm sticking to 2017 and treating the GPU as a device that can never be read. It requires extra work on the CPU side to estimate where the pointer hits, and then I'll curve the pointer, so that it always hits the estimated position on the curved surface.

    This is probably going to create discontinuities in the feeling of smoothness, but it'll obviate the need for PID controllers to cope with the lag getting information back from the GPU. Overall, this is probably for the best, as it's easy to get PID controllers into an unstable state with frame rate variations, and Unity tends to be more reliable at adding new features than fixing the ones you desperately need.
     
  9. laurentlavigne

    laurentlavigne

    Joined:
    Aug 16, 2012
    Posts:
    6,221
    I've had tons of crashes with AsyncGPUReadback and other weird behaviors. When it works, it works, but with so much black-boxiness it's difficult to figure out, and I'm reaching the same conclusion.
    Experimental really means experimental.
     
  10. elbows

    elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,502
    And beta really means beta. Sometimes the problems are of our own making, but this stuff is so new that we might initially reach other conclusions.

    I haven't started to use this stuff properly yet, but I'm quite excited about it and I'm trying to stock up on patience before I do. Bugs and limitations in Unity development are far from ideal, and I might moan at Unity about them. But at beta times like these I often end up messing with the bleeding edge, and I don't expect Unity to offer bleeding-edge experimental stuff while somehow insuring me against all the things that inevitably come with that territory.
     
    laurentlavigne likes this.
  11. JulienF_Unity

    JulienF_Unity

    Unity Technologies

    Joined:
    Dec 17, 2015
    Posts:
    324
    Async Readback is experimental in 2018.1 for two reasons:
    • It is implemented only on a subset of platforms (currently DX11 and DX12)
    • Its API is still subject to change
    Other than that, the current implementation should be functional on DX. We want to take advantage of the beta phase to get users to test it on a wide variety of configurations and projects, and to report any bugs or issues they hit.
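    Since support is limited to a subset of platforms, callers can guard the feature at runtime. A minimal sketch, assuming SystemInfo.supportsAsyncGPUReadback is available in your version (the helper name here is illustrative):

    ```csharp
    using UnityEngine;
    using UnityEngine.Experimental.Rendering; // AsyncGPUReadback in the 2018.1 beta

    // Hypothetical guard: request async readback where supported, else fall back.
    public static class ReadbackHelper
    {
        public static void RequestOrFallback(RenderTexture target)
        {
            if (SystemInfo.supportsAsyncGPUReadback)
            {
                AsyncGPUReadback.Request(target, 0, TextureFormat.RGBAFloat);
            }
            else
            {
                // Fall back to a synchronous Texture2D.ReadPixels path here.
            }
        }
    }
    ```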

    This is an advanced feature, and tutorials and examples should be made to help users get the most out of it. At the moment there is only the scripting API documentation.

    @laurentlavigne If you encounter crashes, this is definitely not expected and very valuable for us to be able to gather more information about it to help us fix the issues.

    @Dave-Taylor The issue you are mentioning with Gfx.WaitForPresent is interesting. I'd like to have more information about it:
    • Do you see it without any GPU readback at all (either the old synchronous one or the new async one)? Do you see a difference between sync and async?
    • Did simply porting your project from 2017.3 to 2018.1 show these spikes, i.e. did you get big performance regressions/instabilities just by switching versions?
     
    Last edited: Jan 21, 2018
    richardkettlewell likes this.
  12. laurentlavigne

    laurentlavigne

    Joined:
    Aug 16, 2012
    Posts:
    6,221
    @JulienF_Unity I think those crashes were caused by a stride mismatch. I didn't know that could crash Unity.

    I encourage you to keep releasing APIs at such an early stage. It makes for fun exploration, and we're used to poking at a black box ;)