Blending transparent rendertexture with scene gives fringing, with OR without premultiplied alpha?

Discussion in 'General Graphics' started by hazel_koop, Sep 8, 2020.

  1. hazel_koop

    hazel_koop

    Joined:
    Apr 9, 2019
    Posts:
    29
    Hey, I'm having a weird situation with drawing a transparent rendertexture into the scene... I've filed a bug report, but I thought I'd ask here too in case anyone has already run into this: the semitransparent fringe in my transparent rendertexture isn't being blended correctly, even when I blend it premultiplied.


    Here is a scene that demonstrates the problem; the camera on the left draws into a rendertexture which is cleared to zero-alpha black; the camera on the right clears to the skybox and draws to the screen. The squares at the back are textures, and the vertical halves on the right display the left camera's rendertexture in two different ways (see below). All of these materials are unlit.




    The left and right source textures are identical other than one having a semitransparent hole in the middle (demonstrated here blending to a red checkerboard), so in theory, if the two were to be alpha blended correctly, the result would just be identical to the right image.




    Those two vertical halves in front of the right camera are displaying the left camera's rendertexture; the bottom one draws with standard alpha, and the upper one draws with premultiplied alpha.



    The transparent fringe in the bottom one shows up too dark -- this makes sense, since the color values written into the rendertexture were already blended with the zero-alpha black (so they're effectively premultiplied), which means we must render it premultiplied instead.
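
    A quick numeric sketch (not Unity code) of why the straight-alpha redraw darkens the fringe, assuming for now that the rendertexture's alpha channel holds the texture's original alpha:

    ```python
    # Half-transparent white texel drawn onto an RT cleared to zero-alpha black,
    # then that RT drawn over the scene two ways.
    def lerp(a, b, t):
        return a + (b - a) * t

    src_rgb, src_a = 1.0, 0.5            # half-transparent white texel
    rt_rgb = lerp(0.0, src_rgb, src_a)   # blended with black -> 0.5 (premultiplied)

    bg = 0.8                             # background behind the RT quad
    straight = lerp(bg, rt_rgb, src_a)   # straight alpha: 0.65 -- too dark
    premul = rt_rgb + bg * (1 - src_a)   # premultiplied blend: 0.90
    direct = lerp(bg, src_rgb, src_a)    # drawing the texture directly: 0.90
    ```

    The straight-alpha path multiplies the already-premultiplied color by alpha a second time, which is exactly the dark fringe described here.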

    However, while pure black and white blend properly in the top (premultiplied) half, grey areas in the fringe now show up too BRIGHT!

    What could be causing this?

    My first guess was that since it only affected greys, it could be related to gamma -- but the bug still happens when the project is set to 'linear' (though it IS somewhat lessened...)

    Also, no combination of enabling/disabling "sRGB" and "alpha is transparency" on the transparent source texture makes any difference to the result (and I've checked the channels in the inspector to ensure that the texture in the asset is NOT premultiplied).
     
  2. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,348
    The source texture isn't the issue here. The render texture itself is the problem. Regardless of what you render to a render texture, the result is a premultiplied texture. You might be able to fix all of this by setting your project to use gamma color space instead of linear. Linear color space complicates all of this tremendously, and I've actually never totally wrapped my head around how to get it to be perfect.
     
  3. hazel_koop

    hazel_koop

    Joined:
    Apr 9, 2019
    Posts:
    29
    I am in gamma color space; I only set it to linear temporarily, just to see whether that would help (it didn't).
     
  4. hazel_koop

    hazel_koop

    Joined:
    Apr 9, 2019
    Posts:
    29
    Also, if it were premultiplied, then the top half of the image wouldn't have that white glow (the top half is drawing premultiplied)
     
  5. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,348
    Ah, then the last problem is that all of Unity's built-in shaders do not handle writing to the alpha channel correctly. Most of the time what gets rendered to the alpha channel is never used, so it's not important, but the default blend mode for something like traditional alpha blending is:
    Blend SrcAlpha OneMinusSrcAlpha

    The problem is that this blend mode applies those factors to the alpha channel as well, so the alpha is multiplied with itself. The value stored in the alpha of the render texture is therefore the square of the alpha rather than the original alpha.
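
    A one-line numeric sketch of what that does to a texel's alpha on its way into a rendertexture cleared to zero alpha:

    ```python
    # Blend SrcAlpha OneMinusSrcAlpha, applied to the alpha channel itself:
    # result = src * SrcAlpha + dst * (1 - SrcAlpha)
    def blend(src, dst, src_factor, dst_factor):
        return src * src_factor + dst * dst_factor

    a = 0.5
    stored_a = blend(a, 0.0, a, 1 - a)   # RT cleared to alpha 0 -> a*a = 0.25
    ```

    So a texel that should store alpha 0.5 ends up storing 0.25, which is why the grey fringe misbehaves even when the redraw itself blends premultiplied.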

    You need a custom shader that uses:
    Blend SrcAlpha OneMinusSrcAlpha, One OneMinusSrcAlpha


    Or a premultiplied alpha shader that multiplies the RGB color value by the alpha in the shader and uses:
    Blend One OneMinusSrcAlpha


    Also note, all of Unity's built-in premultiplied alpha shaders are wrong and do not do premultiplied alpha correctly.
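
    Sanity-checking the premultiplied fix numerically: multiply RGB by alpha in the shader, composite everything with "over" (the behavior of Blend One OneMinusSrcAlpha), and the rendertexture round trip matches drawing the texture straight into the scene:

    ```python
    # Premultiplied "over" operator: Blend One OneMinusSrcAlpha on all channels.
    def over(src_rgb, src_a, dst_rgb, dst_a):
        return (src_rgb + dst_rgb * (1 - src_a),
                src_a + dst_a * (1 - src_a))

    tex_rgb, tex_a = 0.6, 0.5                      # grey, half-transparent texel
    pm_rgb = tex_rgb * tex_a                       # premultiplied in the shader

    rt_rgb, rt_a = over(pm_rgb, tex_a, 0.0, 0.0)   # into the cleared RT
    bg = 0.8
    via_rt, _ = over(rt_rgb, rt_a, bg, 1.0)        # RT composited over the scene
    direct = tex_rgb * tex_a + bg * (1 - tex_a)    # drawing directly: same value
    ```

    The alpha stored in the RT stays 0.5 (not 0.25), so the second composite lands on the same result as the direct draw.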
     
  6. hazel_koop

    hazel_koop

    Joined:
    Apr 9, 2019
    Posts:
    29
    Ahhhh that explains it!! Thank you!

    I'm drawing a large variety of things into the rendertarget with Unity's default shaders (in my actual project), so I would definitely like to work around this in my "rendertarget back into scene" shader if possible -- is that what you're describing with your second shader? I tried to implement what you described, but it just made it look the same as attempting to draw it with straight alpha:


    instead I tried taking the square root of the alpha to undo the extra self-multiplication, and that ALMOST worked but gave an unfortunate rounding error for values near zero:


    luckily it looks like bumping up the color precision to 16bpc is enough to work around it!!


    it bothers my optimizer's instincts to use twice as much VRAM to work around this, but if it means not going in and manually rewriting every shader that has to go through this system, I think I'll take the hit ;)

    Thank you so much for clarifying this!!
     
  7. hazel_koop

    hazel_koop

    Joined:
    Apr 9, 2019
    Posts:
    29
    Just for future reference -- it turns out that there are actually some Unity shaders that DON'T have that alpha bug (which means that compensating for it universally will give a black fringe in those cases instead...)

    So far I've had to hard-code in a check for whether the graphic was drawn by the Sprites/Default shader; I'll just add in more cases as I notice them from here...