
Feature Request Compute Shader Set Bools?

Discussion in 'Shaders' started by segant, Jan 15, 2022.

  1. segant

    segant

    Joined:
    May 3, 2017
    Posts:
    196
    Last edited: Jan 15, 2022
  2. Neto_Kokku

    Neto_Kokku

    Joined:
    Feb 15, 2018
    Posts:
    1,751
    While you can use bools in local variables in shaders, GPUs don't actually support storing bools in memory buffers, which is where the values you pass from the CPU are stored under the hood. At most you could pack your bools into the bits of an int and unpack those in the shader, but that's only worthwhile if you have hundreds of thousands of them.
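    For reference, the bit-packing idea on the C# side looks roughly like this (an untested sketch; the kernel and buffer names are placeholders, not anything from this thread):

        using UnityEngine;

        public class PackedBoolExample : MonoBehaviour
        {
            public ComputeShader shader;   // kernel name "CSMain" is an assumption
            ComputeBuffer packedBuffer;

            void Start()
            {
                bool[] flags = new bool[1024];   // example data

                // Pack 32 bools into each uint.
                uint[] packed = new uint[(flags.Length + 31) / 32];
                for (int i = 0; i < flags.Length; i++)
                    if (flags[i])
                        packed[i / 32] |= 1u << (i % 32);

                packedBuffer = new ComputeBuffer(packed.Length, sizeof(uint));
                packedBuffer.SetData(packed);

                int kernel = shader.FindKernel("CSMain");
                shader.SetBuffer(kernel, "_PackedFlags", packedBuffer);

                // The HLSL side would unpack with something like:
                // StructuredBuffer<uint> _PackedFlags;
                // bool flag = (_PackedFlags[i >> 5] >> (i & 31)) & 1;
            }

            void OnDestroy()
            {
                packedBuffer?.Release();
            }
        }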
     
    segant likes this.
  3. Invertex

    Invertex

    Joined:
    Nov 7, 2013
    Posts:
    1,550
    To elaborate on what Neto is saying, the reason GPUs don't properly support bools in buffers is that most modern GPUs deal in 32-bit floats, with a pipeline sized for 128-bit packs of data (float4). There are some older mobile platforms that use lower-precision value types, but it's still nowhere near as low as a single bit or byte (except in niche cases like Tensor cores using Int16/8).

    So simply use an integer array and set 0 or 1. Or, if you're looking to save more space, use
    .SetData()
    with a ComputeBuffer that holds an array of bytes. And in the most extreme case, you can use Neto's suggestion of bit packing to store multiple values per element.
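    The int-array version would look something like this (an untested sketch; kernel and buffer names are placeholders):

        using UnityEngine;

        public class IntFlagExample : MonoBehaviour
        {
            public ComputeShader shader;   // kernel name "CSMain" is an assumption
            ComputeBuffer flagBuffer;

            void Start()
            {
                bool[] flags = new bool[256];   // example data

                // One 4-byte int per flag: simple, but 32x the memory of bit packing.
                int[] asInts = new int[flags.Length];
                for (int i = 0; i < flags.Length; i++)
                    asInts[i] = flags[i] ? 1 : 0;

                flagBuffer = new ComputeBuffer(asInts.Length, sizeof(int));
                flagBuffer.SetData(asInts);

                int kernel = shader.FindKernel("CSMain");
                shader.SetBuffer(kernel, "_Flags", flagBuffer);

                // The HLSL side would read it as:
                // StructuredBuffer<int> _Flags;
                // if (_Flags[id.x] != 0) { ... }
            }

            void OnDestroy()
            {
                flagBuffer?.Release();
            }
        }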
     
    Last edited: Jan 15, 2022
    segant likes this.