Rendertexture transparency from camera alpha instead of camera depth

Discussion in 'General Graphics' started by 00christian00, Sep 20, 2016.

  1. 00christian00
     Joined: Jul 22, 2012
     Posts: 1,035
    I don't know if it is a bug, but if I want a RenderTexture with a transparent background, it seems that Unity uses the depth buffer as the alpha channel instead of the frame buffer's alpha.

    So for example, if I set up the camera as a typical overlay with clear flags set to "Depth Only", the result in the RenderTexture differs from what I see through the camera. This happens because a shader that writes to the frame buffer but not to the depth buffer doesn't appear: its depth is zero, so the alpha in the texture is zero.
    So I have some alpha-blended trees that don't show up in the texture now. Is there any way to solve this?
    Is it a bug or a known limitation?
     
  3. rrh
     Joined: Jul 12, 2012
     Posts: 331
    I don't know the answer, but you've explained what my problem is.
     
  4. rrh
     Joined: Jul 12, 2012
     Posts: 331
    And given your clue that the depth buffer is the problem, I set the depthBuffer on the render texture to None, and it works for me.

    In my case I'm rendering a single object, so if there are any other problems with having no depthBuffer, I'm not encountering them.
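    For anyone setting this up from script rather than in the inspector, a minimal sketch of that workaround (`overlayCamera` and the texture size are illustrative; the constructor signature is the standard Unity one):

```
// A depth parameter of 0 creates the RenderTexture without a
// depth/stencil buffer, matching the "depthBuffer = none" fix above.
var rt = new RenderTexture(512, 512, 0, RenderTextureFormat.ARGB32);
overlayCamera.targetTexture = rt;
```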
     
  5. monark
     Joined: May 2, 2008
     Posts: 1,598
    Did you ever find a solution to this?
    I've seen a bunch of different answers to this issue, but none of them solves it in my case.
     
  6. monark
     Joined: May 2, 2008
     Posts: 1,598
    Actually, I found the solution.

    Add these lines:
    Blend SrcAlpha OneMinusSrcAlpha, One One
    ColorMask RGBA

    And add keepalpha to your surface pragma line.

    In my case:
    #pragma surface surf SimpleSpecular vertex:vert keepalpha
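    To show where those pieces sit, here is a minimal sketch of a surface shader using them (the shader name, Lambert lighting model, and texture property are illustrative, not the actual shader from this thread):

```
Shader "Custom/OverlayKeepAlpha" {
    Properties {
        _MainTex ("Texture", 2D) = "white" {}
    }
    SubShader {
        Tags { "Queue"="Transparent" "RenderType"="Transparent" }
        // Separate blend factors for color and alpha: color blends
        // normally, alpha accumulates so the RenderTexture keeps
        // coverage information instead of ending up at zero.
        Blend SrcAlpha OneMinusSrcAlpha, One One
        ColorMask RGBA

        CGPROGRAM
        // keepalpha stops the generated surface shader from forcing
        // the output alpha to 1 (i.e. from emitting ColorMask RGB).
        #pragma surface surf Lambert keepalpha

        sampler2D _MainTex;
        struct Input { float2 uv_MainTex; };

        void surf (Input IN, inout SurfaceOutput o) {
            fixed4 c = tex2D(_MainTex, IN.uv_MainTex);
            o.Albedo = c.rgb;
            o.Alpha = c.a;
        }
        ENDCG
    }
}
```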
     
  7. jvo3dc
     Joined: Oct 11, 2013
     Posts: 1,520
    This has to do with applying the alpha twice: once when rendering to the RenderTexture, and again when rendering the RenderTexture as an overlay. It has nothing to do with Unity accidentally using the depth buffer as alpha.

    The easiest solution is to assume premultiplied alpha when rendering the RenderTexture as an overlay. So with blending:
    Blend One OneMinusSrcAlpha

    No change is required to the rendering into the RenderTexture, and no separate blend states are required for the alpha channel.
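    As an illustration, the shader that draws the RenderTexture onto the screen could look something like this minimal fixed-function sketch (the shader name and queue are assumptions, not from this thread):

```
Shader "Custom/PremultipliedOverlay" {
    Properties { _MainTex ("RenderTexture", 2D) = "white" {} }
    SubShader {
        Tags { "Queue"="Overlay" }
        // Premultiplied blending: the RT's RGB was already scaled by
        // alpha during scene rendering, so don't multiply by SrcAlpha
        // again - that would apply the alpha twice.
        Blend One OneMinusSrcAlpha
        ZWrite Off
        Pass {
            SetTexture [_MainTex] { combine texture }
        }
    }
}
```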
     
  8. monark
     Joined: May 2, 2008
     Posts: 1,598
    You will need the keepalpha and ColorMask RGBA though, otherwise it just gets replaced with ColorMask RGB and the alpha is completely wiped out.
    The separate alpha blend is optional.
     
  9. jvo3dc
     Joined: Oct 11, 2013
     Posts: 1,520
    Not in a fragment shader, but I can imagine this is required in a surface shader.
     
  10. monark
      Joined: May 2, 2008
      Posts: 1,598
    Sorry, yes, I was talking about a surface shader in this case.
     
  11. RavenWits
      Joined: Jan 18, 2019
      Posts: 2
    monark, can you share the shader with us if you don't mind?