
Question: Single-channel difference for RG16 vs RGBA32

Discussion in 'General Graphics' started by Forberg, Nov 21, 2022.

  1. Forberg

     Joined: Oct 27, 2018
     Posts: 25
    I am trying to create a two-channel texture in the RG16 format, since I do not need all four channels. I was previously using RGBA32 without any problems.

    I can create the texture in the desired format, but the results in the individual channels look different.

    I use a byte array with every fourth element set to the red-channel byte for RGBA32, and another array with every second byte set to the red-channel byte for RG16.

    The result is roughly the same, but the RG16 texture looks a lot brighter, with far less contrast.
    Does anyone know what is happening here?

    Code (CSharp):

    // Pack one source channel into an interleaved byte array;
    // _textureChannelCount is 4 for RGBA32 and 2 for RG16.
    byte[] pixels = new byte[colors.Length * _textureChannelCount];

    for (int j = 0; j < colors.Length; j++)
    {
        pixels[j * _textureChannelCount] = colors[j].r;
    }

    diffuseHeight.SetPixelData(pixels, 0);
    It also seems to work for RGB24 but NOT for R8.
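
    For reference, here is a minimal, self-contained sketch of the setup being described; the texture size, the _textureChannelCount value, the colors array, and the trailing Apply call are assumptions, since the post does not show them:

    Code (CSharp):

    // Assumed setup for a two-channel RG16 texture (not from the original post).
    int width = 256, height = 256;
    int _textureChannelCount = 2;                   // 2 bytes per pixel for RG16

    // The last constructor argument is the 'linear' flag that turns out
    // to matter here (see the next post).
    Texture2D diffuseHeight = new Texture2D(width, height, TextureFormat.RG16, false, false);

    Color32[] colors = new Color32[width * height]; // assumed source data

    byte[] pixels = new byte[colors.Length * _textureChannelCount];
    for (int j = 0; j < colors.Length; j++)
    {
        pixels[j * _textureChannelCount] = colors[j].r; // red channel only
    }

    diffuseHeight.SetPixelData(pixels, 0);
    diffuseHeight.Apply(false);                     // upload to the GPU, no mips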
     
  2. Forberg

     Joined: Oct 27, 2018
     Posts: 25
    Solved it!

    The constructor boolean "linear" of Texture2D is ignored for RG16 and R8: those formats are treated as linear data regardless of passing "false", so no sRGB-to-linear conversion is applied when the texture is sampled.
    To fix it, I had to convert my bytes prior to putting them into the texture:

    Code (CSharp):

    private byte GammaToLinear(byte color)
    {
        // Normalize the gamma-encoded (sRGB) byte to the 0..1 range.
        float gamma = color * (1.0f / 255.0f);
        // Convert from gamma (sRGB) space to linear space.
        float linear = Mathf.GammaToLinearSpace(gamma);
        // Round rather than truncate when converting back to a byte.
        return (byte)Mathf.RoundToInt(linear * 255.0f);
    }
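
    To show where this conversion slots in, here is a sketch of the packing loop from the first post with the fix applied (the loop and variable names mirror that post; the exact call site is an assumption):

    Code (CSharp):

    // Convert each gamma-encoded byte to linear before upload, since
    // RG16/R8 textures are sampled without an sRGB-to-linear conversion.
    for (int j = 0; j < colors.Length; j++)
    {
        pixels[j * _textureChannelCount] = GammaToLinear(colors[j].r);
    }

    // Example: sRGB mid-gray 128 maps to roughly 55 in linear space,
    // which is why the uncorrected RG16 texture looked too bright.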
     
  3. c0d3_m0nk3y

     Joined: Oct 21, 2021
     Posts: 674