
Question Fragment shader does not read what Compute Shader wrote

Discussion in 'Shaders' started by Bovine, Jul 30, 2022.

  1. Bovine

    Bovine

    Joined:
    Oct 13, 2010
    Posts:
    195
    Hi there, hopefully you can help.

    I am lacking some fundamentals in shader programming, so I'm clearly missing something here.

    I have a compute shader that is writing this:

    Code (CSharp):
    Lighting[id.xy] = float4(0.2f, 0.4f, 0.6f, 1.0f);
    Where Lighting is as follows:

    RWTexture2D<float4> Lighting;

    If I then read the RGBA from that same texture in shader graph, I get these values:

    (attached screenshot: the RGBA values read back in Shader Graph)

    I can only presume that I am not packing the texture correctly in the compute shader, or that I am reading it incorrectly and some conversion needs to occur that I am completely missing.

    Secondary to this: I don't actually want to deal with floats - I actually want to store 4 bytes, each with a value of 0-255, so if anyone has any thoughts on how I can do that without presumably having to do:

    Code (CSharp):
    integerValueAsFloat / 255.0f
    And then, in addition, how I would read that byte back out in the fragment shader (Shader Graph, but I could use a Custom Function node?).
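    Something like this is what I have in mind (the names and values here are made up) - writing the index as a normalised float and rounding it back out:

    ```hlsl
    // Compute side: store a 0-255 light index in one channel by
    // normalising it into the 0-1 range an 8-bit UNorm texture holds.
    uint lightIndex = 137;                          // hypothetical value
    Lighting[id.xy] = float4(lightIndex / 255.0, 0.0, 0.0, 1.0);

    // Fragment side (e.g. in a Custom Function node): recover the
    // index; round() guards against tiny precision errors.
    float4 packedCol = _Lighting[uint2(pixel)];     // unfiltered load
    uint recovered = (uint)round(packedCol.r * 255.0);
    ```
    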

    Alternatively, can I share a structured buffer between a compute shader and a Shader Graph shader, so I can write this data to a structured buffer instead (if that's possible?) and then read it out in the fragment shader?

    Thanks
    Ian H
     
  2. Bovine

    Bovine

    Joined:
    Oct 13, 2010
    Posts:
    195
    Oh, interesting - I suddenly thought "maybe it's the colour space". I clearly do not understand colour space, because setting the project back to Gamma from Linear fixes this whole problem.

    However, if anyone has an explanation for that, or an answer to my second question there... that would be handy.
     
  3. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,352
    What format is the texture? I suspect you’re using an sRGB format, which gets the sRGB to linear conversion applied when sampled, but not when written to as a RWTexture.
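    For illustration (not Unity's exact implementation), the sRGB-to-linear transfer function that the hardware applies on sampling looks roughly like this, which is why the values come back smaller than what was written:

    ```hlsl
    // Approximate sRGB-to-linear conversion, as applied by the
    // hardware when sampling an sRGB-format texture.
    float SRGBToLinear(float s)
    {
        return (s <= 0.04045) ? s / 12.92
                              : pow((s + 0.055) / 1.055, 2.4);
    }

    // Writing 0.2 / 0.4 / 0.6 through a RWTexture stores those raw
    // values, but sampling converts them: roughly 0.033 / 0.133 / 0.319.
    ```
    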
     
    Bovine likes this.
  4. Bovine

    Bovine

    Joined:
    Oct 13, 2010
    Posts:
    195
    Thanks - in Unity I followed some compute shader tutorials.

    So in my compute shader I declare:

    Code (CSharp):
    RWTexture2D<float4> Result;
    And then in my C# code I create it as:

    Code (CSharp):
    renderTexture = new RenderTexture(64, 64, 0, RenderTextureFormat.ARGB32);
    renderTexture.enableRandomWrite = true;
    renderTexture.Create();
    computeShader.SetTexture(0, "Result", renderTexture);
    So I can take a look and see if there are more options - or whether I don't actually need a render texture at all and can use a regular texture?

    I don't need to read the texture back, just do some computation on it and attach it to a shader graph shader.

    I am also finding the compute shader code very picky - the branching does not seem predictable and I wonder if it's being ignored; certainly the compiled shader is much simpler if I omit the [branch] attributes.
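    To show what I mean, this is the kind of branch I'd expect [branch] to preserve (ShadeWithLight is a made-up helper here):

    ```hlsl
    // I'd expect this to execute only one side per thread, but the
    // compiler seems free to flatten it and evaluate both sides.
    [branch]
    if (lightIndex > 0)
        colour = ShadeWithLight(lightIndex);   // hypothetical helper
    else
        colour = ambientColour;
    ```
    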

    Cheers
    Ian H
     
  5. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,352
    ARGB32 defaults to an sRGB format, which is where your problem is coming from.

    You can see the constructor has a third optional input for "read write".
    https://docs.unity3d.com/ScriptReference/RenderTexture-ctor.html

    That is an enum with three options: Default, Linear, and sRGB.
    https://docs.unity3d.com/ScriptReference/RenderTextureReadWrite.html
    Code (csharp):
    renderTexture = new RenderTexture(64, 64, 0, RenderTextureFormat.ARGB32, RenderTextureReadWrite.Linear);
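    If it helps, I believe the GraphicsFormat overload of the constructor makes the same thing explicit - something like:

    ```csharp
    using UnityEngine;
    using UnityEngine.Experimental.Rendering;

    // Request the non-sRGB (UNorm) 8-bit format explicitly instead
    // of relying on the RenderTextureReadWrite enum.
    var renderTexture = new RenderTexture(64, 64, 0, GraphicsFormat.R8G8B8A8_UNorm);
    renderTexture.enableRandomWrite = true;
    renderTexture.Create();
    ```
    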
     
    Bovine likes this.
  6. Bovine

    Bovine

    Joined:
    Oct 13, 2010
    Posts:
    195
    Thanks - I was planning to poke around in there later, so that gives me a massive head start.

    I'll ping back when I've done so, but looks like this should fix it so thanks a lot!
     
  7. Bovine

    Bovine

    Joined:
    Oct 13, 2010
    Posts:
    195
    Yep, worked a treat, thanks a lot - I should probably have dug a bit deeper into that constructor.

    One question, if you're able to answer? I am actually storing integer values in my texture from my compute shader, but I'm having to work in floats and divide everything by 255. Is there a way for me to just write a uint as the RGBA value? And what about reading them back in the fragment shader? Everything appears to come out as a float, and it feels a bit unnecessary to have to read a float, split it, and multiply it all by 255.

    Thanks again for your help.
     
  8. Invertex

    Invertex

    Joined:
    Nov 7, 2013
    Posts:
    1,550
    Why bother working with the 0-255 system?
    You could just use the Color class, which uses the 0-1 range, instead of Color32, avoiding the 255 mul/div steps. If you want the user to input 0-255 values for ease of use, then do that conversion on their input and when displaying the UI.
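    For example, the implicit conversions handle the scaling for you (values here are arbitrary):

    ```csharp
    using UnityEngine;

    // Color32 stores 0-255 bytes; Color stores 0-1 floats.
    Color32 asBytes = new Color32(51, 102, 153, 255);
    Color asFloats = asBytes;   // each channel divided by 255 -> (0.2, 0.4, 0.6, 1.0)
    ```
    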
     
  9. Bovine

    Bovine

    Joined:
    Oct 13, 2010
    Posts:
    195
    The texture is not used as a colour but as a lookup for significant lights - the RGBA values are each an index into a light array, so in my compute shader I am writing a light index (a value 0 to 255) into each colour component. I'd rather just write the value than have to convert it to a float and back again.
     
  10. Invertex

    Invertex

    Joined:
    Nov 7, 2013
    Posts:
    1,550
    So what's the purpose of using the RWTexture2D then? You could just use a RWStructuredBuffer<uint> (HLSL has no byte type) and access it in your Shader Graph using a Custom Function node with the buffer declared in it.
    If you want groups of 4, make the length *4 and multiply the index by 4, adding 1, 2, 3 for the y, z, w components - or pack four 8-bit values into each uint.
    Or you can make a four-value struct and use a RWStructuredBuffer<ByteStruct> and skip the index offsetting.
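    As a rough sketch of the Custom Function side (the buffer name, _Width, and the function name are all made up here):

    ```hlsl
    // One uint per pixel; four 8-bit light indices packed into its bytes.
    StructuredBuffer<uint> _LightIndices;
    uint _Width;   // row stride in pixels, set from C#

    void GetLightIndices_float(float2 pixel, out float4 indices)
    {
        uint packed = _LightIndices[(uint)pixel.y * _Width + (uint)pixel.x];
        indices = float4( packed        & 0xFF,
                         (packed >> 8)  & 0xFF,
                         (packed >> 16) & 0xFF,
                         (packed >> 24) & 0xFF);
    }
    ```
    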
     
    Last edited: Aug 1, 2022
    Bovine likes this.
  11. Bovine

    Bovine

    Joined:
    Oct 13, 2010
    Posts:
    195
    No purpose - it's just that I'm new to shader programming, so all the tutorials write to a texture. But as you can see above in my first post, I was wondering if using structured buffers would be possible; I've not explored it yet, but it sounds like it will be possible, so thanks.

    Perhaps you can advise me further? Given I have to share this structured buffer between my compute shader and my custom function, how and where would I declare it? Given I need to be able to call SetData() on the buffer from C#, should I declare it externally to both and include it in both the custom function file and the compute shader?

    Is reading/writing a structured buffer as performant as - or perhaps even better than - a texture?

    Thanks in advance.
     
  12. Invertex

    Invertex

    Joined:
    Nov 7, 2013
    Posts:
    1,550
    Yes, you would define it in both of those.
    Bind it to the pixel shader with https://docs.unity3d.com/ScriptReference/Graphics.SetRandomWriteTarget.html

    Reading the buffer can be more performant since there's no extra sampler work going on. On the other hand, the GPU can prefetch texture samples and run further code that isn't dependent on the sample. But these days it would be similar if not faster with the buffer, plus the memory/bandwidth savings.
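    A rough sketch of the C# side, assuming a buffer named _LightIndices and kernel 0 (all names hypothetical):

    ```csharp
    using UnityEngine;

    // One uint per pixel, shared between the compute shader and the
    // Shader Graph material.
    var buffer = new ComputeBuffer(64 * 64, sizeof(uint));

    // Compute side: bind by the name declared in the .compute file.
    computeShader.SetBuffer(0, "_LightIndices", buffer);
    computeShader.Dispatch(0, 64 / 8, 64 / 8, 1);

    // Fragment side: a read-only StructuredBuffer just needs
    // material.SetBuffer; SetRandomWriteTarget is for RW access.
    material.SetBuffer("_LightIndices", buffer);
    ```
    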
     
    Bovine likes this.
  13. Bovine

    Bovine

    Joined:
    Oct 13, 2010
    Posts:
    195
    Thanks a lot, this was a key piece I was missing - I'll try to put it together.

    My immediate problem is that, having debugged several problems and fixed an artefact (coming from mipmaps I didn't know were enabled) and a bunch of other things, my compute shader does not run properly on my Quest 2 :(

    The structured buffer provides some other benefits too - assuming my Q2 supports those!