Premultiply Shader in Render Texture = sadness

Discussion in 'General Graphics' started by naked_chicken, Sep 17, 2017.

  1. naked_chicken

    naked_chicken

    Joined:
    Sep 10, 2012
    Posts:
    186
    So I have the following texture:


    This texture is applied using a Premultiplied Alpha shader:

    Code (CSharp):
    Shader "Unlit/blendTest"
    {
        Properties
        {
            _MainTex ("Texture", 2D) = "white" {}
        }
        SubShader
        {
            Tags {
                "Queue" = "Transparent"
                "IgnoreProjector" = "True"
                "RenderType" = "Transparent"
            }
            Blend One OneMinusSrcAlpha
            Lighting Off ZWrite Off
            Cull Off

            Pass
            {
                CGPROGRAM
                #pragma vertex vert
                #pragma fragment frag

                #include "UnityCG.cginc"

                struct appdata
                {
                    float4 vertex : POSITION;
                    float2 uv : TEXCOORD0;
                };

                struct v2f
                {
                    float2 uv : TEXCOORD0;
                    float4 vertex : SV_POSITION;
                };

                sampler2D _MainTex;
                float4 _MainTex_ST;

                v2f vert (appdata v)
                {
                    v2f o;
                    o.vertex = UnityObjectToClipPos(v.vertex);
                    o.uv = TRANSFORM_TEX(v.uv, _MainTex);
                    return o;
                }

                fixed4 frag (v2f i) : SV_Target
                {
                    // sample the texture
                    fixed4 col = tex2D(_MainTex, i.uv);
                    return col;
                }
                ENDCG
            }
        }
    }
    That renders fine and looks like this:

    That looks as I would expect. Buuuuuut...


    Now the issue. If I instead set the camera to render out to a RenderTexture I get this:


    Any part that would be considered additive does not show up in the RenderTexture. I've tried everything I can think of to no avail. Hoping somebody here can help.
     
  2. Torbach78

    Torbach78

    Joined:
    Aug 10, 2013
    Posts:
    296
    I'd like an answer to this burning question as well since it affects my fx
     
  3. samizzo

    samizzo

    Joined:
    Sep 7, 2011
    Posts:
    487
    I don't see that behaviour. This is what it looks like for me:
    test.png

    On the right is a GameObject with MeshRenderer using your shader, and on the left is a GameObject with MeshRenderer using the RenderTexture that resulted from rendering what was on the right. The background colour for the Camera was set to that blue colour. The left GameObject is rendered using the Unlit/Texture shader.

    Where is your second screenshot from? Is that the game view, or the editor? What does the camera preview look like for the camera which is generating your RenderTexture?

    -sam
     
  4. jvo3dc

    jvo3dc

    Joined:
    Oct 11, 2013
    Posts:
    1,520
You're applying the alpha twice there. That commonly happens when using RenderTextures. Don't forget that the alpha is also stored in the RenderTexture, and if you use standard alpha blending (SrcAlpha OneMinusSrcAlpha) when rendering with the RenderTexture, you'll indeed lose those additive parts.

A typical pixel in the additive part of that RenderTexture has an alpha value of 0, so with standard alpha blending it shows nothing.

In this case you can fix it by using the same blending mode when rendering the RenderTexture: One OneMinusSrcAlpha. You also have to make sure you clear the RenderTarget to a black color.
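To make the arithmetic concrete, here's a minimal numeric sketch of that blend math (plain Python, not Unity code; the pixel values are made up for illustration):

```python
# A premultiplied "additive" pixel: full red with alpha 0.
src_rgb, src_a = (1.0, 0.0, 0.0), 0.0
# Whatever color is already in the destination buffer.
dst_rgb = (0.2, 0.2, 0.2)

# Standard alpha blending: Blend SrcAlpha OneMinusSrcAlpha
standard = tuple(s * src_a + d * (1.0 - src_a) for s, d in zip(src_rgb, dst_rgb))
# alpha is 0, so the source is multiplied away: the result is just dst

# Premultiplied blending: Blend One OneMinusSrcAlpha
premult = tuple(s * 1.0 + d * (1.0 - src_a) for s, d in zip(src_rgb, dst_rgb))
# the source survives at full strength: roughly (1.2, 0.2, 0.2), pure addition
```

With alpha 0 the destination factor is the same in both modes, so the only difference is the source factor: SrcAlpha multiplies the additive parts away, One keeps them.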
     
  5. Torbach78

    Torbach78

    Joined:
    Aug 10, 2013
    Posts:
    296
    I understand, but where are these changes made exactly? I need to communicate this precisely to colleagues
     
  6. samizzo

    samizzo

    Joined:
    Sep 7, 2011
    Posts:
    487
You can use the same shader when rendering with the RenderTexture. Also set the clear colour to black on the camera that is generating the RenderTexture.

    -sam
     
  7. naked_chicken

    naked_chicken

    Joined:
    Sep 10, 2012
    Posts:
    186
Yeah, using the same shader is about where I landed. The problem I have specifically is that we've got kind of a convoluted multi-stage rendering setup:
    1. The asset gets rendered to a RenderTexture.
    2. That RenderTexture is applied to a RawImage UI element.
    3. The UI is rendered to another RenderTexture, which has some filters applied to it.
    4. The UI RenderTexture gets combined with the scene.

    So the problem with using the same shader on the UI element is that while that will make it look fine, the UI RenderTexture will hit the same issue.
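For what it's worth, premultiplied "over" compositing chains cleanly: if every stage clears to transparent black and composites with One OneMinusSrcAlpha (on both RGB and alpha), the staged result should match a direct render. A rough numeric sketch of that (plain Python with made-up pixel values, treating pixels as premultiplied RGBA tuples):

```python
def over(src, dst):
    # Premultiplied "over" operator: Blend One OneMinusSrcAlpha on RGB and alpha.
    sr, sg, sb, sa = src
    dr, dg, db, da = dst
    k = 1.0 - sa
    return (sr + dr * k, sg + dg * k, sb + db * k, sa + da * k)

clear = (0.0, 0.0, 0.0, 0.0)   # RenderTexture cleared to transparent black
asset = (1.0, 0.0, 0.0, 0.0)   # additive red pixel (premultiplied, alpha 0)
scene = (0.1, 0.2, 0.3, 1.0)   # opaque background

rt = over(asset, clear)        # stage 1: asset rendered into the RenderTexture
staged = over(rt, scene)       # stage 2: RenderTexture composited over the scene
direct = over(asset, scene)    # single-pass render for comparison

assert staged == direct        # the additive red survives both routes
```

The catch is that every intermediate stage, including the UI filter passes, has to keep using the premultiplied blend, and the blend state has to write alpha into each RenderTexture (in ShaderLab, Blend One OneMinusSrcAlpha applies the same factors to alpha unless you specify separate alpha factors).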

I thought about using a shader that converts the premultiplied output back to regular blending, but if I do that then I might as well use regular blending on the asset in the first place (which is not desirable, as it changes the look).

    Sigh, well I learned much about the Render Target pipeline this weekend and I have several options moving forward. None of them are perfect so we'll just have to choose the lesser of two weevils. Thanks very much for the insights.
     
  8. samizzo

    samizzo

    Joined:
    Sep 7, 2011
    Posts:
    487
    Why not write a custom shader that does exactly what you need?

    -sam