
Compute shader not returning any data when running on integrated graphics.

Discussion in 'Shaders' started by sharkweek42, Aug 21, 2017.

  1. sharkweek42

    Joined: Jul 30, 2013
    Posts: 14
    I have a data processing pipeline that runs across multiple kernels in the same compute shader. Everything works beautifully on systems with a dedicated GPU, but I recently discovered that parts of it return no data on PCs with integrated graphics, even though SystemInfo.supportsComputeShaders returns true and SystemInfo.graphicsShaderLevel returns 50. I've reproduced this on several different integrated graphics systems, and I get no error messages or warnings, so I'm not sure why it isn't working. Here's the general process:

    First kernel: Takes integers from a small array and writes them into a very large array after manipulating their positions. The small arrays come from individual files, so this kernel is dispatched many times depending on how many files are being read.
    Second kernel: Takes integers from the large array, divides each by a large constant to get a float, packs that float into a float4, and writes it into a large float4 array. This is the part that isn't working (a rough sketch of the C# side is below). I was originally doing it in a single dispatch call, but breaking it into smaller chunks like the first kernel didn't help. I also tried hardcoding the output value (writing float4(1,1,1,1) into the output buffer), and the buffer is still empty when I read it back after the dispatch. Adding a yield return new WaitForSeconds(2) before and after the GetData call, just to give the GPU some extra time, made no difference: the array is still completely empty after GetData.
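
    For reference, here's roughly what the C# side of that second step looks like (a minimal sketch; the buffer names, kernel name, divisor, and thread-group size are placeholders rather than my exact code):

    Code (CSharp):

    using UnityEngine;

    // Minimal sketch of the second-kernel dispatch (all names and sizes are placeholders).
    public class SecondKernelDispatch : MonoBehaviour
    {
        public ComputeShader processingShader;    // the compute shader asset
        const int elementCount = 1024 * 1024;     // size of the large array

        void RunSecondKernel(ComputeBuffer intBuffer)
        {
            // Output buffer of float4s: stride = 4 floats * 4 bytes = 16 bytes per element.
            var outputBuffer = new ComputeBuffer(elementCount, sizeof(float) * 4);

            int kernel = processingShader.FindKernel("ConvertToFloat4");  // placeholder kernel name
            processingShader.SetBuffer(kernel, "_InputInts", intBuffer);
            processingShader.SetBuffer(kernel, "_Output", outputBuffer);
            processingShader.SetFloat("_Divisor", 65535f);                // the "large number" used in the divide

            // Assumes [numthreads(64,1,1)] in the shader.
            processingShader.Dispatch(kernel, elementCount / 64, 1, 1);

            // GetData blocks until the GPU work finishes, then copies the results back.
            var results = new Vector4[elementCount];
            outputBuffer.GetData(results);

            outputBuffer.Release();
        }
    }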

    I'm not really sure what to do next. I tried creating a test project with a compute shader that fills a few very large output buffers with garbage data, but I couldn't reproduce the issue there. I'd be fine with switching over to a CPU-based data processing pipeline on these systems, but I'm not sure how to detect the integrated graphics card.
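
    As far as I can tell, SystemInfo doesn't expose a direct "is this integrated graphics" flag, so the best I've come up with is a heuristic based on the adapter name and reported graphics memory, something like this (the name strings and the 1024 MB cutoff are guesses to tune, not tested values):

    Code (CSharp):

    using UnityEngine;

    public static class GpuDetection
    {
        // Heuristic only: Unity doesn't report "integrated vs. dedicated" directly,
        // so this guesses from the adapter name and the reported graphics memory.
        public static bool LooksLikeIntegratedGpu()
        {
            string name = SystemInfo.graphicsDeviceName.ToLowerInvariant();

            // Common integrated adapter names (Intel HD/UHD/Iris, AMD APU "Vega" graphics).
            bool nameMatch = name.Contains("intel") || name.Contains("iris") || name.Contains("vega");

            // Integrated parts usually report a small amount of (shared) memory, in MB.
            bool lowMemory = SystemInfo.graphicsMemorySize <= 1024;

            return nameMatch || lowMemory;
        }

        public static bool CanUseGpuPipeline()
        {
            // Use the CPU fallback on integrated graphics even though
            // supportsComputeShaders reports true there.
            return SystemInfo.supportsComputeShaders && !LooksLikeIntegratedGpu();
        }
    }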
     
  2. sharkweek42

    Joined: Jul 30, 2013
    Posts: 14
    For what it's worth: I discovered I can get around this issue by writing my output values to a large array of plain floats, then building the color array from those floats on the CPU. For some reason integrated graphics cards really don't seem to like float4 output arrays of any size. The texture is alpha-only anyway, so it's not much of a hassle to stick the output value in the alpha channel of a color. It would be really annoying if I needed more than one color channel in the output array, though.
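
    In case anyone else runs into this, the workaround looks roughly like this (a sketch with made-up names; the real code is part of a larger pipeline):

    Code (CSharp):

    using UnityEngine;

    public static class AlphaReadback
    {
        // Reads back a plain float buffer and builds the Color array on the CPU,
        // instead of writing float4s from the compute shader.
        public static Color[] ReadAlphaValues(ComputeBuffer floatBuffer, int count)
        {
            var alphas = new float[count];
            floatBuffer.GetData(alphas);   // one float per element works fine on integrated GPUs

            // Pack the value into the alpha channel; the texture is alpha-only anyway.
            var colors = new Color[count];
            for (int i = 0; i < count; i++)
                colors[i] = new Color(0f, 0f, 0f, alphas[i]);

            return colors;
        }
    }

    The resulting color array then goes into the texture with SetPixels and Apply as usual.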