[SOLVED] Pixel color in RenderTexture not matching with Material Color

Discussion in 'General Graphics' started by Bidule200, Aug 17, 2015.

  1. Bidule200

    Bidule200

    Joined:
    Apr 26, 2014
    Posts:
    18
    Hello everyone,
    here is my program:

    - I have a Camera rendering into a RenderTexture;

    - Every object in the scene seen by this camera has the Unlit/Color shader, so the camera only sees their pure material color;

    - Every object in the scene chooses a unique Color in Start(), applies it to its Material, then adds it to a List<Color>;

    - I transfer the pixels from the RenderTexture to a Texture2D with Texture2D.ReadPixels(), then I transfer all the Texture2D pixels into a Color[] array with Texture2D.GetPixels();

    - I do a foreach loop on every Color in the Color[] array and check whether the pixel color matches a Color in the List<Color> (see the sketch just after this list);
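
    Roughly, the readback and comparison part looks like this (simplified; the class and field names are just for illustration):

    using System.Collections.Generic;
    using UnityEngine;

    public class ColorReadback : MonoBehaviour
    {
        public RenderTexture renderTexture;                   // the camera's target texture
        public List<Color> knownColors = new List<Color>();   // filled by the objects in Start()

        void ReadAndCompare()
        {
            // Copy the RenderTexture into a readable Texture2D.
            var tex = new Texture2D(renderTexture.width, renderTexture.height, TextureFormat.RGBA32, false);
            RenderTexture.active = renderTexture;
            tex.ReadPixels(new Rect(0, 0, renderTexture.width, renderTexture.height), 0, 0);
            tex.Apply();
            RenderTexture.active = null;

            // Compare every pixel against the stored material colors.
            foreach (Color pixel in tex.GetPixels())
            {
                foreach (Color known in knownColors)
                {
                    if (pixel == known)
                    {
                        // match found
                    }
                }
            }
        }
    }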


    And now my problem:

    - The colors match when the objects in the scene choose pure colors like Color.red or Color.white, but never with subtler tints like yellow or anything else;

    - I found that the problem resides in a conversion from Linear to Gamma space: a material color of (0.5, 0.5, 0.5) gives a color of (0.216, 0.216, 0.216) in the Color[] array (see the snippet after this list);

    - Problem is: I've done everything to make the colors match. I have checked that my project's lighting is in Linear space, that the Texture2D is in Linear space, etc.
    Even when the floats in the colors are the same, the comparison does not match :(

    - So now I have a Material.color of rgba(0.5, 0.5, 0.5, 1), I convert it so it is stored in the List<Color> as (0.216, 0.216, 0.216, 1), and it is identified in the Texture2D pixels as (0.216, 0.216, 0.216, 1), but the comparison says it's not the same color, so I can't match them even though they have the exact same values :mad:
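
    For reference, the conversion itself is easy to reproduce with Color's .linear / .gamma properties (just a quick sanity-check snippet; the printed values are approximate):

    using UnityEngine;

    public class GammaCheck : MonoBehaviour
    {
        void Start()
        {
            Color gray = new Color(0.5f, 0.5f, 0.5f, 1f);
            Debug.Log(gray.linear);  // roughly (0.214, 0.214, 0.214, 1.000), close to the 0.216 I read back
            Debug.Log(gray.gamma);   // roughly (0.735, 0.735, 0.735, 1.000)
        }
    }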



    Please... does anyone have an idea how to resolve this problem?

    Thank you !


     
    Last edited: Aug 17, 2015
    Deon-Cadme likes this.
  2. MSplitz-PsychoK

    MSplitz-PsychoK

    Joined:
    May 16, 2015
    Posts:
    1,278
    I'm not really sure what the problem is, but maybe I can help you debug.

    I'm sure you've already noticed, but Color does have .gamma and .linear properties that convert back and forth between gamma and linear color space.


    I assume this means you have two colors which you believe should be the same when you compare them, but you aren't getting that result from the comparison. Just to see if maybe there's a little floating point difference to ruin the comparison, check the difference between the two colors.

    What I mean is: if you have color 'a' and color 'b', compute "Color c = b - a". Color components can go negative, so you can check the components of c. If all components of c are 0, then it is not a floating-point problem and you'll have to work out how to test between the color spaces.
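
    For example, something along these lines (just a sketch, the class and method names are made up):

    using UnityEngine;

    public static class ColorDebug
    {
        // Logs the component-wise difference between two colors you expect to be equal.
        public static void LogDifference(Color a, Color b)
        {
            Color c = b - a;  // components can go negative
            Debug.LogFormat("diff = ({0}, {1}, {2}, {3})", c.r, c.g, c.b, c.a);

            // Also log both sides run through .linear / .gamma, to see which
            // color space the two values actually line up in.
            Debug.Log(a.linear + "  vs  " + b);
            Debug.Log(a.gamma + "  vs  " + b);
        }
    }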

    I'll keep helping where I can if you keep me posted.
     
  3. Bidule200

    Bidule200

    Joined:
    Apr 26, 2014
    Posts:
    18
    Here is the solution:

    1. When you define (create) a Color, it's already in "Gamma space".
    In my project, the color was being converted because I assumed that colors were created in Linear and then converted to Gamma, but it is actually the opposite:
    if you set your RenderTexture and Texture2D to "Linear", it will convert the color from Gamma to Linear.
    Solution: stick to "sRGB" (the constructor's default setting) and nothing will be converted.

    2. Color introduces float inaccuracy; use Color32 instead.
    Even when sticking with sRGB (Gamma) colors, a Color defined as (0.5, 0.5, 0.5, 1) was read back as (0.502, 0.502, 0.502, 1) from the texture pixel.
    So I changed everything to Color32 and the byte channels stay accurate (see the sketch below) :)
    (128, 128, 128, 255)
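
    In code it boils down to something like this (simplified; the class and field names are just for illustration, and it assumes the RenderTexture is also left on the default sRGB setting):

    using System.Collections.Generic;
    using UnityEngine;

    public class ColorLookup : MonoBehaviour
    {
        public List<Color32> knownColors = new List<Color32>();  // filled by the objects in Start()

        Texture2D Readback(RenderTexture rt)
        {
            // 1. Keep everything in sRGB: the last constructor argument ("linear")
            //    stays false, so the pixel values are read back without a Gamma/Linear conversion.
            var tex = new Texture2D(rt.width, rt.height, TextureFormat.RGBA32, false, false);
            RenderTexture.active = rt;
            tex.ReadPixels(new Rect(0, 0, rt.width, rt.height), 0, 0);
            tex.Apply();
            RenderTexture.active = null;
            return tex;
        }

        bool ContainsKnownColor(Texture2D tex)
        {
            // 2. Compare bytes (Color32) instead of floats (Color), so
            //    (128, 128, 128, 255) stays exactly (128, 128, 128, 255).
            foreach (Color32 pixel in tex.GetPixels32())
            {
                foreach (Color32 known in knownColors)
                {
                    if (pixel.r == known.r && pixel.g == known.g &&
                        pixel.b == known.b && pixel.a == known.a)
                        return true;
                }
            }
            return false;
        }
    }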
     
    Last edited: Aug 24, 2015
    Games4Stream and kilik128 like this.