
Question Texture2D.GetPixel returning the wrong colors, including ones that are not in the texture

Discussion in 'Scripting' started by canslp, Aug 31, 2023.

  1. canslp

    canslp

    Joined:
    May 6, 2019
    Posts:
    36
    Okay, so maybe I am completely misusing this function, but I need to get the colors from a texture and I am trying to use Texture2D.GetPixel(x,y). It seems to be giving me colors that are not only not the color at that pixel position, but often not in the texture at all.

    Here's my texture, the pixel positions I am sampling, and the output:

    [Attached images: the source texture, the pixel positions being sampled, and the logged output.]

    It seems like not only are the pixels I am inputting not the ones that are actually being sampled, but most of the colors being returned are actually interpolations between other colors. Am I using this function wrong? The textures are definitely imported without filtering (nearest neighbor) and without compression, but the colors that come back are either ones that would only exist if the texture were being filtered, and/or from completely the wrong part of the image (black and white are completely absent). What am I doing wrong?
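
    For reference, the sampling code is roughly along these lines (a simplified sketch, not the exact script; "sourceTex" and the coordinates are placeholders):

    Code (CSharp):
    using UnityEngine;

    public class PaletteSampler : MonoBehaviour
    {
        public Texture2D sourceTex;

        void Start()
        {
            // Sample one pixel per expected colour cell along the bottom row and log it.
            for (int x = 0; x < sourceTex.width; x++)
            {
                Color c = sourceTex.GetPixel(x, 0);
                Debug.Log($"({x},0) -> {c}");
            }
        }
    }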
     
  2. spiney199

    spiney199

    Joined:
    Feb 11, 2021
    Posts:
    5,769
    Not sure it's really related to your error, but 0,0 is the bottom left of the image, not the top left.
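
    If you've been counting rows from the top of the image, something like this quick sketch flips the coordinate ("tex" and "rowFromTop" are placeholder names):

    Code (CSharp):
    // Convert a row index counted from the top of the image into
    // GetPixel's bottom-left origin.
    Color SampleFromTop(Texture2D tex, int x, int rowFromTop)
    {
        int yFromBottom = tex.height - 1 - rowFromTop;
        return tex.GetPixel(x, yFromBottom);
    }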
     
  3. canslp

    canslp

    Joined:
    May 6, 2019
    Posts:
    36
    You're right, I just noticed that and updated it. The texture is still being sampled wrong though.
     
  4. spiney199

    spiney199

    Joined:
    Feb 11, 2021
    Posts:
    5,769
    What if you get a less lossy sample with, say, GetPixels32, and iterate through the collection instead? Or perhaps LoadRawTextureData.

    Haven't played with texture stuff much, but the one time I did I believe I needed to use the methods working with Color32 to get the right results.
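
    Something roughly like this (untested sketch):

    Code (CSharp):
    // Read every pixel once with GetPixels32 and index into the returned array.
    // The array is laid out row by row, starting from the bottom-left of the texture.
    void LogAllPixels(Texture2D tex)
    {
        Color32[] pixels = tex.GetPixels32();
        for (int y = 0; y < tex.height; y++)
        {
            for (int x = 0; x < tex.width; x++)
            {
                Color32 c = pixels[y * tex.width + x];
                Debug.Log($"({x},{y}) -> ({c.r},{c.g},{c.b},{c.a})");
            }
        }
    }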
     
  5. Bunny83

    Bunny83

    Joined:
    Oct 18, 2010
    Posts:
    3,495
    Depending on the platform, texture sizes may have certain alignment limitations; the minimum width / height may be 4 or 8 in many cases. It also highly depends on how you imported the texture in the first place. By default, textures are scaled and resampled to the closest power of two unless you change the import settings. From what you've shown (assuming each colored area is one pixel), your texture seems to be 10x2 pixels, so it would probably be imported as 16x2, 16x4 or 16x8 pixels. You have to set the import mode to advanced and switch the texture's "Non Power Of Two" (NPOT) scaling to "None". Though note that non-power-of-two textures are bad for performance, so you really should keep your source image resolutions at a power of two (8, 16, 32, 64, 128, 256, ...).

    There are other settings which may be quite important when you actually use this texture as a texture. First of all, the filter mode should probably be set to "Point" and not bilinear or trilinear.

    Another common problem is gamma information stored in the texture, which can change the colors depending on whether they are gamma corrected or not. There are several potential issues here. First, there's the sRGB setting, which defines whether the texture should use gamma or linear color space. On top of that, the Unity importer behaves differently depending on whether the imported image contains gamma information or not. I can't go into more detail here as it highly depends on the image format used.

    Have you checked the actual width / height in code? GetPixel will clamp or wrap around, depending on the texture's wrap settings, if the indices are outside the valid range.
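
    For example, a quick check along these lines (sketch) shows what Unity actually imported:

    Code (CSharp):
    // Log the size the texture actually has at runtime. If the source image is
    // 10x2 but this prints 8x2 or 16x2, the importer has resized it.
    void CheckTexture(Texture2D tex)
    {
        Debug.Log($"{tex.name}: {tex.width}x{tex.height}, format={tex.format}, " +
                  $"filter={tex.filterMode}, wrap={tex.wrapMode}");
    }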
     
    canslp and spiney199 like this.
  6. AcidArrow

    AcidArrow

    Joined:
    May 20, 2010
    Posts:
    10,977
    Can you show the texture's import settings? I think your texture is getting resized.
     
  7. canslp

    canslp

    Joined:
    May 6, 2019
    Posts:
    36
    Oh, you're right, it was totally shrinking it down to 8 pixels wide. That's insane that it does that by default, and it's also crazy that GetPixel, a function that takes two integers, will return the color between two pixels if the size is wrong. Thanks!
     
  8. Bunny83

    Bunny83

    Joined:
    Oct 18, 2010
    Posts:
    3,495
    That's because your GPU is optimised to work with power-of-two textures. NPOT textures aren't even supported on some older platforms, and some even require square power-of-two textures. There are several reasons for that: memory layout, compression, but most importantly mipmaps. So you should always try to create your textures at a size that is a power of two; in your case, make it 16 pixels wide.

    GetPixel doesn't do that. When the image is imported, it is resized to the nearest power of two. Of course, when you resize an image to a different size, you get a blend between the original pixels. How the texture is resized can also be specified. However, you just want to avoid the resizing in the first place, so make your textures a power of two ^^.
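
    For completeness, if you really needed to keep the odd size, an editor script along these lines would turn the scaling off together with the other relevant import settings (untested sketch; the asset path and menu name are placeholders):

    Code (CSharp):
    #if UNITY_EDITOR
    using UnityEditor;
    using UnityEngine;

    public static class PaletteImportFix
    {
        [MenuItem("Tools/Fix Palette Import Settings")]
        static void Fix()
        {
            string path = "Assets/Textures/palette.png"; // placeholder path
            var importer = (TextureImporter)AssetImporter.GetAtPath(path);
            importer.npotScale = TextureImporterNPOTScale.None;   // keep the original size
            importer.filterMode = FilterMode.Point;               // no bilinear blending
            importer.textureCompression = TextureImporterCompression.Uncompressed;
            importer.mipmapEnabled = false;
            importer.isReadable = true;                           // required for GetPixel
            importer.SaveAndReimport();
        }
    }
    #endif

    But like I said, simply authoring the source image 16 pixels wide is the easier fix.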