
Shader alpha not working on iOS builds only.

Discussion in 'Shaders' started by Phedg1, Apr 4, 2019.

  1. Phedg1

    Phedg1

    Joined:
    Mar 3, 2015
    Posts:
    106
    I'm experiencing an issue with my shader, but only when built to iOS. I'm trying to render a screenshot on screen. I learned that the method for getting correct transparency in a screenshot is to take one screenshot with a black background and one with a white background, then use a formula to correct the colours. I created a shader with two extra texture properties to contain these black and white screenshots. The guts of it is:

    Code (CSharp):
    fixed4 frag(v2f IN) : SV_Target
    {
        half4 blackColor = tex2D(_BlackTex, IN.texcoord);
        half4 whiteColor = tex2D(_WhiteTex, IN.texcoord);
        float alphaMultiplier = 1 + blackColor.r - whiteColor.r;
        half4 color = half4(1, 1, 1, 0);

        color.r = blackColor.r / alphaMultiplier;
        color.g = blackColor.g / alphaMultiplier;
        color.b = blackColor.b / alphaMultiplier;
        color.a = alphaMultiplier;

        #ifdef UNITY_UI_CLIP_RECT
        color.a *= UnityGet2DClipping(IN.worldPosition.xy, _ClipRect);
        #endif

        #ifdef UNITY_UI_ALPHACLIP
        clip(color.a - 0.004);
        #endif

        return color;
    }
    This works on Android, as well as in the editor on my PC and Mac. However, on iOS the parts of the screenshot that are completely transparent show as completely opaque (and black, though the black is to be expected).
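    For what it's worth, the extraction formula itself checks out on the CPU. A minimal Python sketch (my own, not Unity code) that round-trips a pixel through both backgrounds:

    ```python
    def composite(c, a, background):
        """Standard alpha blending of (colour c, alpha a) over a background."""
        return c * a + background * (1 - a)

    def extract(black, white):
        """Recover (colour, alpha) from captures over black and over white."""
        alpha_multiplier = 1 + black - white   # same formula as the shader
        if alpha_multiplier == 0:
            return 0.0, 0.0                    # fully transparent pixel
        return black / alpha_multiplier, alpha_multiplier

    # A half-transparent mid-grey pixel round-trips exactly:
    c, a = 0.5, 0.5
    black = composite(c, a, 0.0)               # capture over black -> 0.25
    white = composite(c, a, 1.0)               # capture over white -> 0.75
    print(extract(black, white))               # -> (0.5, 0.5)
    ```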

    Code (CSharp):
    color.r = alphaMultiplier;
    color.g = alphaMultiplier;
    color.b = alphaMultiplier;
    color.a = 1;
    I then changed the shader so the colour of each pixel is defined by its opacity: opaque parts of the image should show as white and transparent parts as black. Given that transparent areas show as black in my build, I was expecting the whole screenshot to be white. It wasn't; the correct parts of the image were black and white.
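    As a sanity check on that expectation (a CPU sketch, not Unity code), the greyscale value this test renders is just alphaMultiplier:

    ```python
    def alpha_multiplier(black, white):
        # Same formula as the shader: 1 + blackColor.r - whiteColor.r
        return 1 + black - white

    # An opaque pixel reads the same over both backgrounds -> renders white.
    print(alpha_multiplier(0.5, 0.5))   # -> 1.0
    # A fully transparent pixel reads 0 over black, 1 over white -> renders black.
    print(alpha_multiplier(0.0, 1.0))   # -> 0.0
    ```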

    Code (CSharp):
    color.r = alphaMultiplier;
    color.g = alphaMultiplier;
    color.b = alphaMultiplier;
    color.a = 0;
    I then tried hard-coding the alpha to 0 for all pixels, to see whether the iOS build was capable of showing true transparency at all. It was: the image was invisible.

    Code (CSharp):
    color.r = blackColor.r;     // 0, 0
    color.g = whiteColor.r;     // 255, 1
    color.b = alphaMultiplier;  // 0, 0
    color.a = 1;
    Lastly, I tried setting the rgb values of the pixels from the black and white textures. Shaders are difficult to debug, so my intention was to open a screenshot of the build in an image editor and "debug" the values there. Those values, taken from a section of full transparency, are shown in the comments above. They are exactly as they ought to be for the shader to be working properly.

    I really don't know what to do from here. All three of my tests rendered the shader as it would behave in a perfect world; I can't find any fault with them. However, despite all the evidence to the contrary, it still does not work properly outside the tests. Every pixel that is partially transparent renders perfectly; it's only completely transparent pixels that don't work, and only on iOS builds. The only thing I can think of that I haven't tested is that the alpha was hard-coded in all my tests, though I don't see how that should make a difference. All my tests indicated that everything should be rendering correctly. Please help me.
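    One observation about the formula itself, offered as a guess rather than a confirmed diagnosis: at a fully transparent pixel the black capture reads 0 and the white capture reads 1, so alphaMultiplier is 0 and color.rgb becomes 0/0, which is NaN in floating point. How a GPU treats NaN in blending and clipping is hardware-dependent, which would fit a bug that appears only on some devices. A CPU sketch of the arithmetic:

    ```python
    import math

    # Values at a fully transparent pixel, per the debug test above:
    black, white = 0.0, 1.0
    alpha_multiplier = 1 + black - white   # 0.0
    try:
        r = black / alpha_multiplier       # the shader's blackColor.r / alphaMultiplier
    except ZeroDivisionError:
        # Python traps on 0/0; GPUs instead produce NaN (or an
        # implementation-defined value) and carry on.
        r = float("nan")
    print(math.isnan(r))                   # -> True
    ```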

    Code (CSharp):
    color.r = blackColor.r / alphaMultiplier;
    color.g = blackColor.g / alphaMultiplier;
    color.b = blackColor.b / alphaMultiplier;
    color.a = min(max(blackColor.r, 0.25), 0.5);

    EDIT: As one final test, I used my main code for rgb, and for the alpha I used a variable that I min/maxed into the range 0.25 to 0.5. Normal areas of the screen are semi-transparent, but the areas that should be fully transparent are once again opaque. I officially have no clue. Any help would be really, REALLY appreciated.
     
    Last edited: Apr 4, 2019
  2. Phedg1

    Phedg1

    Joined:
    Mar 3, 2015
    Posts:
    106
    @runevision @gabrielw_unity I know this isn't your area but you guys are the only people I know who work with Unity. Can you please help me get this in front of the right people because it really isn't behaving in a way that makes any sense. I think something is seriously wrong here.
     
  3. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    6,979
    Why not just use the alpha from the screenshot? If your camera's clear color is 0,0,0,0, then the resulting texture you get from using ReadPixels should already have alpha. No fancy extraction needed (assuming the shaders rendering to the screen are properly outputting alpha as well).

    That said, try adding this after the color.a = alphaMultiplier:
    color = saturate(color);
     
  4. Phedg1

    Phedg1

    Joined:
    Mar 3, 2015
    Posts:
    106
    Sadly saturate(color) did not make a difference. I was able to borrow a different iPhone today and install my TestFlight build. It works perfectly on this new iPhone, just not on mine. Mine is an iPhone 6s running iOS 10.2.1, the other was an iPhone 7s running iOS 11.2.6.
     
  5. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    6,979
    One last guess, get rid of the clip()?