
ComputeShader.SetInts failing (or me failing)?

Discussion in 'Shaders' started by cecarlsen, Apr 29, 2019.

  1. cecarlsen

    cecarlsen

    Joined:
    Jun 30, 2006
    Posts:
    864
    I just can't get ComputeShader.SetInts to play well.

    I've created a simple example to illustrate the issue. Here I upload four ints to a const array, dispatch a kernel that copies them to a buffer, and read them back again. For some reason only the first value is copied. The example produces this result on my machine (Windows 10, GTX 1080, Unity 2019.1):

    Upload: 1, 2, 3, 4
    Download: 1, 0, 0, 0

    What am I missing?

    ComputeShader:
    Code (CSharp):
    #pragma kernel Test

    RWStructuredBuffer<int> _IntBuffer;

    int _IntValues[4];

    [numthreads(1,1,1)]
    void Test( uint i : SV_DispatchThreadID )
    {
        _IntBuffer[i] = _IntValues[i];
    }
    Dispatch Script:
    Code (CSharp):
    using UnityEngine;

    public class ComputeShaderSetInts : MonoBehaviour
    {
        [SerializeField] ComputeShader _computeShader;

        void Awake()
        {
            int testKernel = _computeShader.FindKernel( "Test" );
            int[] intValues = { 1, 2, 3, 4 };
            Debug.Log( "Upload: " + intValues[0] + ", " + intValues[1] + ", " + intValues[2] + ", " + intValues[3] );
            ComputeBuffer intBuffer = new ComputeBuffer( intValues.Length, sizeof( int ) );
            _computeShader.SetBuffer( testKernel, "_IntBuffer", intBuffer );
            _computeShader.SetInts( "_IntValues", intValues );
            _computeShader.Dispatch( testKernel, intValues.Length, 1, 1 );
            intBuffer.GetData( intValues );
            Debug.Log( "Download: " + intValues[0] + ", " + intValues[1] + ", " + intValues[2] + ", " + intValues[3] );
            intBuffer.Release();
        }
    }
     
  2. cecarlsen

    cecarlsen

    Joined:
    Jun 30, 2006
    Posts:
    864
    Oddly enough, if you upload to an int4 instead, it works. Like so:

    Code (CSharp):
    #pragma kernel Test

    RWStructuredBuffer<int> _IntBuffer;

    //int _IntValues[4];
    int4 _IntValues;

    [numthreads(1,1,1)]
    void Test( uint i : SV_DispatchThreadID )
    {
        //_IntBuffer[i] = _IntValues[i];
        if( i == 0 ) _IntBuffer[i] = _IntValues.x;
        else if( i == 1 ) _IntBuffer[i] = _IntValues.y;
        else if( i == 2 ) _IntBuffer[i] = _IntValues.z;
        else if( i == 3 ) _IntBuffer[i] = _IntValues.w;
    }
    It works for matrices as well (like int4x4). The problem is that I need single indexing.

    Is SetInts broken?
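    If the shader side stays an int4, the per-component branches may not even be necessary: HLSL generally allows a dynamic component subscript on a vector. A minimal sketch of that variant (not verified on every compiler/target):

    Code (CSharp):
    #pragma kernel Test

    RWStructuredBuffer<int> _IntBuffer;

    int4 _IntValues;

    [numthreads(1,1,1)]
    void Test( uint i : SV_DispatchThreadID )
    {
        // Dynamic component index into the int4; valid for i in 0..3.
        _IntBuffer[i] = _IntValues[i];
    }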
     
    Last edited: Apr 30, 2019
  3. mjoos_eth

    mjoos_eth

    Joined:
    Dec 8, 2015
    Posts:
    2
    I found the same issue today. I'm using 2018.3.7f1.
    The documentation (https://docs.unity3d.com/ScriptReference/ComputeShader.SetInts.html) says:

    This function can be used to set int vector, int array or int vector array values

    It works for vectors (SetVectorArray) and matrices (SetMatrixArray), but not for ints (SetInts) or floats (SetFloats).
    I've sent Unity a bug report.
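    For anyone needing a stopgap, the vector path can also carry small int values as floats. A minimal sketch, assuming the shader declares "float4 _Values[1]" and casts back to int (names are illustrative; exact only for values a float can represent precisely):

    Code (CSharp):
    using UnityEngine;

    public class SetVectorArrayStopgap : MonoBehaviour
    {
        [SerializeField] ComputeShader _computeShader;

        void Awake()
        {
            // Shader side (assumed): float4 _Values[1]; read back as (int)_Values[0].x etc.
            // SetVectorArray already matches the 16-byte constant buffer stride, so no manual padding is needed.
            Vector4[] packed = { new Vector4( 1f, 2f, 3f, 4f ) };
            _computeShader.SetVectorArray( "_Values", packed );
        }
    }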
     
    Mytino and cecarlsen like this.
  4. mjoos_eth

    mjoos_eth

    Joined:
    Dec 8, 2015
    Posts:
    2
    Mytino likes this.
  5. unity_leaves

    unity_leaves

    Joined:
    Nov 24, 2019
    Posts:
    1
    Code (CSharp):
    using UnityEngine;

    public class ComputeShaderSetInts : MonoBehaviour
    {
        [SerializeField] ComputeShader _computeShader;

        void Awake()
        {
            int testKernel = _computeShader.FindKernel("Test");
            // Pad the CPU-side array so each logical int occupies a full 16-byte slot,
            // matching the constant buffer layout of "int _IntValues[4]" in the shader.
            int[] intValues = new int[4 * 4];
            intValues[0] = 1;
            intValues[4] = 2;
            intValues[8] = 3;
            intValues[12] = 4;
            Debug.Log("Upload: " + intValues[0] + ", " + intValues[1] + ", " + intValues[2] + ", " + intValues[3]);
            ComputeBuffer intBuffer = new ComputeBuffer(intValues.Length, sizeof(int));
            _computeShader.SetBuffer(testKernel, "_IntBuffer", intBuffer);
            _computeShader.SetInts("_IntValues", intValues);
            _computeShader.Dispatch(testKernel, intValues.Length, 1, 1);
            intBuffer.GetData(intValues);
            Debug.Log("Download: " + intValues[0] + ", " + intValues[1] + ", " + intValues[2] + ", " + intValues[3]);
            intBuffer.Release();
        }
    }
    Reference material: https://www.cnblogs.com/murongxiaopifu/p/9697704.html
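    The compute shader this pairs with is not shown in the post; presumably it is unchanged from the first post. A sketch under that assumption:

    Code (CSharp):
    #pragma kernel Test

    RWStructuredBuffer<int> _IntBuffer;

    // Each element of a constant buffer array starts on a 16-byte boundary,
    // so _IntValues[1] reads the value the script placed at flat index 4, and so on.
    int _IntValues[4];

    [numthreads(1,1,1)]
    void Test( uint i : SV_DispatchThreadID )
    {
        _IntBuffer[i] = _IntValues[i];
    }
    Dispatching 4 groups rather than intValues.Length (16) would also be enough here; the script above simply reuses the padded array's length.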
     
    wolfand13 and BayernMaik like this.
  6. LU15W1R7H

    LU15W1R7H

    Joined:
    Jul 7, 2020
    Posts:
    1
    Can confirm, same behaviour here.
    This should really be fixed to work with plain int[] arrays too.
     
  7. Mytino

    Mytino

    Joined:
    Jan 28, 2019
    Posts:
    16
    TL;DR: it works as long as each of your CPU-side ints has 3 unused ints of padding following it.

    I tried SetInts today in the hope that newer versions had solved it. It didn't work at first, but then I looked at the documentation; maybe it has been updated recently, because what it says there helped me out a bit.
    I tried "int4 offset[9]" in my compute shader instead of "int offset[9]" for my 9 values, and created a 9 * 4 sized array on the CPU side, placing each int I need at positions 0, 4, 8, 12, etc., and then it actually worked (after also using ".x" to access the values). It turns out that if you lay the array out this way on the CPU side, you don't even need "int4": I could keep "int offset[9]" just like before (because of implicit casting, I think), which I found out after reading this: https://cmwdexint.com/2017/12/04/computeshader-setfloats/
     
    voxelltech likes this.
  8. voxelltech

    voxelltech

    Joined:
    Oct 8, 2019
    Posts:
    44
    WORKS like a charm!!