
Bug 2D Renderer - Outputting incorrect alpha into rendertexture/framebuffer

Discussion in '2D' started by Ferazel, Apr 23, 2021.

  1. Ferazel

    Ferazel

    Joined:
    Apr 18, 2010
    Posts:
    517
    In my game I have a system that takes a screengrab of a character composed of many layers and renders it to a render texture, so that I can fade out the character without any alpha transparency overlay issues. The problem occurs when I try to do this with the 2D renderer.
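    For reference, the grab-and-fade setup is roughly like the sketch below. The component and field names (CharacterGrabFader, grabCamera, fadeQuad) and the texture size are just placeholders for illustration, not the actual project code.

        using UnityEngine;

        // Rough sketch of the grab-and-fade setup (names and sizes are placeholders).
        public class CharacterGrabFader : MonoBehaviour
        {
            public Camera grabCamera;   // camera using the 2D Renderer, set up to see only the character's layers
            public Renderer fadeQuad;   // quad that displays the grabbed texture with an alpha-blended material
            RenderTexture grabTexture;

            public void Grab()
            {
                if (grabTexture == null)
                    grabTexture = new RenderTexture(1024, 1024, 0, RenderTextureFormat.ARGB32);

                grabCamera.targetTexture = grabTexture;
                grabCamera.Render();                 // flatten the layered character into one texture
                grabCamera.targetTexture = null;

                fadeQuad.material.mainTexture = grabTexture;
            }

            public void SetFade(float alpha)
            {
                // Fading one flattened texture avoids the overlap artifacts you get when
                // fading each of the character's layers individually.
                Color c = fadeQuad.material.color;
                c.a = alpha;
                fadeQuad.material.color = c;
            }
        }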

    In this example project I have a camera using the 2D Renderer that outputs to a render texture.

    The ordering of the sprites is:
    1) BG Quad alpha = 1.0
    2) Gradient/Circles

    This is what it looks like in the 2D renderer when output directly to the screen:
    upload_2021-4-23_15-3-23.png
    This is what happens when I render that to a RenderTexture and put it into a transparent (alpha blended) shader.
    upload_2021-4-23_15-3-56.png
    You can see that the renderer is not outputting a texture with alpha 1.0; the sub-objects are overwriting the alpha values in the render texture. I'm not sure why, since I looked at the shader and the blend func is SrcAlpha, OneMinusSrcAlpha, so I'm left thinking this is a bug?
    Looking at the render texture's alpha channel, you can see that there is weird haloing going on around the semi-transparent pixels of the render texture.
    upload_2021-4-23_15-13-34.png
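    For what it's worth, my working theory (an assumption on my part, I haven't read the shader source) is that the pass blends the alpha channel with the same SrcAlpha/OneMinusSrcAlpha factors as the color channels. In that case a semi-transparent source pixel actually lowers the alpha already stored in the render texture, which would produce exactly this kind of halo, whereas a separate alpha blend (the two-factor form of ShaderLab's Blend statement, e.g. One OneMinusSrcAlpha for alpha) would keep an opaque target opaque. The toy math below shows the difference:

        // Toy math, not Unity API: the alpha left in the render target after blending one
        // semi-transparent pixel (srcA) over an existing pixel (dstA).
        static float NonSeparateAlpha(float srcA, float dstA)
        {
            // Blend SrcAlpha OneMinusSrcAlpha applied to the alpha channel too:
            return srcA * srcA + dstA * (1f - srcA);   // srcA = 0.5 over dstA = 1.0 -> 0.75 (halo)
        }

        static float SeparateAlpha(float srcA, float dstA)
        {
            // Blend ..., One OneMinusSrcAlpha for the alpha channel:
            return srcA + dstA * (1f - srcA);          // srcA = 0.5 over dstA = 1.0 -> 1.0 (stays opaque)
        }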

    I have submitted a Unity bug (1331392), but I'd appreciate any help in case I'm missing something here. If I switch the shader from Sprite-Lit-Default to Sprite-Default (the unlit one), it alpha blends properly and the render texture looks correct.
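    If anyone wants to verify the alpha values numerically rather than eyeballing the channel view, a readback along these lines works (just a sketch; the RGBA32 readback format and GetPixels loop are my assumptions about a reasonable way to check, not what the project actually does):

        using UnityEngine;

        // Reads the render texture back to the CPU and logs the lowest alpha value found,
        // which makes the halo easy to spot numerically.
        static void LogMinAlpha(RenderTexture rt)
        {
            var previous = RenderTexture.active;
            RenderTexture.active = rt;

            var readback = new Texture2D(rt.width, rt.height, TextureFormat.RGBA32, false);
            readback.ReadPixels(new Rect(0, 0, rt.width, rt.height), 0, 0);
            readback.Apply();
            RenderTexture.active = previous;

            float minAlpha = 1f;
            foreach (Color p in readback.GetPixels())
                minAlpha = Mathf.Min(minAlpha, p.a);

            Debug.Log("Minimum alpha in render texture: " + minAlpha);
            Object.DestroyImmediate(readback);
        }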
     
    Last edited: Apr 23, 2021