RenderTexture does not produce any fully opaque pixels

Discussion in 'Universal Render Pipeline' started by BoaNeo, Mar 22, 2020.

  1. BoaNeo

    BoaNeo

    Joined:
    Feb 21, 2013
    Posts:
    56
    I'm trying to render some 3D models into a texture to use as icons in the UI. I've done this before with no issues, but this time I'm using the Universal Render Pipeline (I'm not sure if that's the cause of my problems, but I suspect it is).

    The problem is that the output texture has no fully opaque pixels in it. The highest alpha value of any pixel in the resulting texture is 181. The exact same 3D mesh renders fine in the normal viewport, and both the offscreen and view cameras are set up identically (same lights, even).

    For now, I've had to resort to having a stupid loop between ReadPixels and Apply where I clamp all alpha values above 181 to 255. It works, but it's both slow and ugly (it creates a rougher edge than I would like).
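    The CPU-side workaround described above might look roughly like this sketch. The method and variable names (`ReadWithAlphaClamp`, `iconRT`) are illustrative, not from the original post:

    ```csharp
    // Hypothetical sketch of the workaround described above: read the
    // RenderTexture back into a Texture2D, clamp near-opaque alpha up
    // to 255, then re-upload with Apply. Slow, as the poster notes.
    Texture2D ReadWithAlphaClamp(RenderTexture iconRT)
    {
        RenderTexture previous = RenderTexture.active;
        RenderTexture.active = iconRT;

        var tex = new Texture2D(iconRT.width, iconRT.height, TextureFormat.ARGB32, false);
        tex.ReadPixels(new Rect(0, 0, iconRT.width, iconRT.height), 0, 0);

        // CPU pass: force any alpha at or above the observed cap (181)
        // to fully opaque. This is what produces the rough edges.
        Color32[] pixels = tex.GetPixels32();
        for (int i = 0; i < pixels.Length; i++)
        {
            if (pixels[i].a >= 181)
                pixels[i].a = 255;
        }
        tex.SetPixels32(pixels);
        tex.Apply();

        RenderTexture.active = previous;
        return tex;
    }
    ```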

    Does anyone have any clue why this might happen?
     
  2. weiping-toh

    weiping-toh

    Joined:
    Sep 8, 2015
    Posts:
    192
    Um... you might want to change the format of the render texture you are initializing for the copy. I believe RGB565 does not support alpha, so the Blit will discard the alpha values.
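    For reference, the suggestion above amounts to requesting an 8-bit-per-channel format with an alpha channel when creating the render target; the sizes and camera assignment here are just placeholders:

    ```csharp
    // RGB565 has no alpha channel, so explicitly request an
    // alpha-capable format (ARGB32) when creating the RenderTexture.
    var rt = new RenderTexture(256, 256, 24, RenderTextureFormat.ARGB32);
    rt.Create();
    // iconCamera.targetTexture = rt;  // render the offscreen camera into it
    ```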
     
  3. BoaNeo

    BoaNeo

    Joined:
    Feb 21, 2013
    Posts:
    56
    565 does not have alpha, true, but I'm using ARGB with 8 bits per channel. That is also not the problem: the alpha is capped at 181, so no pixels are fully opaque...
     
  4. Elvar_Orn

    Elvar_Orn

    Unity Technologies

    Joined:
    Dec 9, 2019
    Posts:
    162
    Hey BoaNeo,
    That seems a bit weird.

    What URP and Unity versions are you using?

    Would you mind reporting a bug for us with a sample so we can take a look?
     
  5. BoaNeo

    BoaNeo

    Joined:
    Feb 21, 2013
    Posts:
    56
    Sure,

    Bug report #1230507

    It's Unity 2019.3.0f6
     