Is there any way that actually works to send and receive 64-bit ints to and from a compute shader? I've been asking ChatGPT and got a bunch of approaches that don't work.
@Maxeaman Minimal working example:

Code (CSharp):

// Example of using 64-bit unsigned integers in a Unity compute shader
// Requirements: Unity 2020.2.0a8 or later and DX12 as the active graphics API
using UnityEngine;

public class UnsignedInteger64 : MonoBehaviour
{
    [SerializeField] ComputeShader _ComputeShader;

    void Start()
    {
        if (_ComputeShader == null) return;

        ComputeBuffer reader = new ComputeBuffer(4, sizeof(ulong), ComputeBufferType.Default);
        reader.SetData(new ulong[] { 172439890993963ul, 657367095657329ul, 277347196953998ul, 844613309877278ul });
        _ComputeShader.SetBuffer(0, "_Reader", reader);

        ComputeBuffer writer = new ComputeBuffer(2, sizeof(ulong), ComputeBufferType.Default);
        _ComputeShader.SetBuffer(0, "_Writer", writer);

        _ComputeShader.Dispatch(0, writer.count, 1, 1); // execute compute shader

        ulong[] result = new ulong[2];
        writer.GetData(result);
        Debug.Log("172439890993963 + 657367095657329 = " + result[0].ToString()); // 829806986651292
        Debug.Log("277347196953998 + 844613309877278 = " + result[1].ToString()); // 1121960506831276

        reader.Release();
        writer.Release();
    }
}

Code (HLSL):

#pragma kernel CSMain
#pragma use_dxc

StructuredBuffer<uint64_t> _Reader;
RWStructuredBuffer<uint64_t> _Writer;

[numthreads(1,1,1)]
void CSMain (uint3 id : SV_DispatchThreadID)
{
    _Writer[id.x] = _Reader[id.x * 2] + _Reader[id.x * 2 + 1];
}
Maybe I don't have the right requirements? When I copy that exactly, the answer comes out as 0. How can I check? I'm using Unity 2021.3 and I have the latest version of DirectX (DirectX 12, checked using dxdiag).

There is another possible problem, though. While trying to figure this out earlier, it was suggested I use uint64_t, but that type was undefined, so I made a #define for it. It still didn't work, and even after I removed the #define from my code, the type still showed up as green in my VS editor, with no error reported. I tried your code in a new project, but could Unity be remembering that previous definition somewhere?

Edit: I tried adding #undef uint64_t, compiling and running it twice, then removed it and compiled and ran it twice, and got the same results each time. I have also checked that my GPU drivers are the correct versions.