
Question 2D Texture array adding alpha channel?

Discussion in 'General Graphics' started by JasonBrunner-SM, Mar 3, 2023.

  1. JasonBrunner-SM

    Joined:
    Nov 3, 2022
    Posts:
    2
    Using Unity 2021.3.18f1
    If I change an RGB texture's "Texture Shape" from 2D to 2D Array, Unity adds an alpha channel, making it an RGBA texture and doubling the file size. Even if I change the "Alpha Source" to None, this new empty alpha channel will not go away.

    Is there a reason for this? I'd like to use arrays instead of atlases, but not if the result is wasted memory from empty, forced alpha channels. Is there a way around this that I'm maybe missing?
    Many thanks in advance for any help.
     
  2. c0d3_m0nk3y

    Joined:
    Oct 21, 2021
    Posts:
    622
    What platform are we talking about here?

    If we are talking DirectX, there isn't really a 24-bit format: https://learn.microsoft.com/en-us/windows/win32/api/dxgiformat/ne-dxgiformat-dxgi_format

    Unity has an R8G8B8 format, but if you are using the DirectX backend, it most likely gets mapped to R8G8B8A8 anyway. You can take a RenderDoc capture to see what you actually get.

    It is also possible that the 2D texture is using texture compression (DXT, BC) but the 2D texture array isn't. Either way, you can find out with RenderDoc.
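    For reference, the storage arithmetic behind that compression point can be sketched out. This is a rough back-of-the-envelope calculation, not Unity's actual allocator: it assumes the standard per-pixel costs of uncompressed RGB24/RGBA32 and of BC1 (DXT1, 8 bytes per 4x4 block) versus BC3 (DXT5, 16 bytes per 4x4 block), and an example 1024x1024 array with 8 slices.

    ```python
    def texture_bytes(width, height, slices, bytes_per_pixel):
        """Raw size of the top mip level of a texture or texture array."""
        return int(width * height * slices * bytes_per_pixel)

    w, h, slices = 1024, 1024, 8  # example dimensions, not from the original post

    sizes = {
        "RGB24":      texture_bytes(w, h, slices, 3),    # uncompressed, no alpha
        "RGBA32":     texture_bytes(w, h, slices, 4),    # uncompressed, with alpha
        "DXT1 (BC1)": texture_bytes(w, h, slices, 0.5),  # 8 bytes per 4x4 block
        "DXT5 (BC3)": texture_bytes(w, h, slices, 1),    # 16 bytes per 4x4 block
    }
    for fmt, size in sizes.items():
        print(f"{fmt:11} {size / 2**20:5.1f} MiB")

    # Uncompressed RGB24 -> RGBA32 only grows by 4/3 (~33%), but
    # DXT1 -> DXT5 is exactly 2x, which would match the "double the
    # file size" observation if the array dropped to an alpha-capable
    # compressed format.
    ```

    So an exact doubling is more consistent with a DXT1-to-DXT5 format change than with a plain extra uncompressed channel, which is another reason to check the real format in RenderDoc.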
     
    Last edited: Mar 5, 2023
  3. danamuise

    Joined:
    Sep 15, 2022
    Posts:
    31
    Looking at the docs, a 2D texture array has four channels, RGBW, where W is the frame index number for the array.