
Implement Alpha To Coverage / Better handling of alpha for VR on mobile

Discussion in 'Shaders' started by GilG, Feb 7, 2020.

  1. GilG

    GilG

    Joined:
    Jan 30, 2017
    Posts:
    27
    Hello,

    I'm trying to implement alpha to coverage, or some other solution to mitigate the alpha-test issues we have on the low-resolution screens in VR.

    I'm following this logic: https://medium.com/@bgolus/anti-aliased-alpha-test-the-esoteric-alpha-to-coverage-8b177335ae4f

    And I'm using Shader Graph.
    We were already able to implement the shader code as-is, but I would like to use it with URP and Shader Graph so I can create shaders more easily.

    So far, I have created these nodes:
    [Attached screenshot: Shader Graph node setup]

    And I reference this HLSL script in the Custom Function node:
    Code (CSharp):
    void CalcMipLevel_float(float2 texture_coord, out float mipLevel)
    {
        float2 dx = ddx(texture_coord);
        float2 dy = ddy(texture_coord);
        float delta_max_sqr = max(dot(dx, dx), dot(dy, dy));
        mipLevel = max(0.0, 0.5 * log2(delta_max_sqr));
    }

    void CalcMipLevel_half(half2 texture_coord, out half mipLevel)
    {
        half2 dx = ddx(texture_coord);
        half2 dy = ddy(texture_coord);
        half delta_max_sqr = max(dot(dx, dx), dot(dy, dy));
        mipLevel = max(0.0, 0.5 * log2(delta_max_sqr));
    }

    void AlphaToCoverage_float(Texture2D _MainTex, float _Cutoff, float2 _uv, SamplerState _SS, out float4 _Color)
    {
        //float _MipScale = 0.25;

        _Color = SAMPLE_TEXTURE2D(_MainTex, _SS, _uv);

        // rescale alpha by mip level (if not using preserved coverage mip maps)
        // Original : col.a *= 1 + max(0, CalcMipLevel(i.uv * _MainTex_TexelSize.zw)) * _MipScale;
        //float mipLevel;
        //CalcMipLevel_float(_uv * _MainTex_TexelSize.zw, mipLevel);
        //_Color.a *= 1 + mipLevel * _MipScale;

        // rescale alpha by partial derivative
        // Original : col.a = (col.a - _Cutoff) / max(fwidth(col.a), 0.0001) + 0.5;
        _Color.a = (_Color.a - _Cutoff) / max(fwidth(_Color.a), 0.0001) + 0.5;
    }

    void AlphaToCoverage_half(Texture2D _MainTex, half _Cutoff, half2 _uv, SamplerState _SS, out half4 _Color)
    {
        // #define AlphaToMask On;
        // Don't know how to handle _MipScale right now, but here is the value it should have:
        //half _MipScale = 0.25;

        _Color = SAMPLE_TEXTURE2D(_MainTex, _SS, _uv);

        // rescale alpha by mip level (if not using preserved coverage mip maps)
        // Original : col.a *= 1 + max(0, CalcMipLevel(i.uv * _MainTex_TexelSize.zw)) * _MipScale;
        //half mipLevel;
        //CalcMipLevel_half(_uv * _MainTex_TexelSize.zw, mipLevel);
        //_Color.a *= 1 + mipLevel * _MipScale;

        // rescale alpha by partial derivative
        // Original : col.a = (col.a - _Cutoff) / max(fwidth(col.a), 0.0001) + 0.5;
        _Color.a = (_Color.a - _Cutoff) / max(fwidth(_Color.a), 0.0001) + 0.5;
    }

    As I lack general HLSL and Unity knowledge, I'm kind of stuck.
    I have already tested a lot of things and followed a lot of threads, but I can't get a result; there is always something blocking me, and I can't be sure of everything I'm doing.
    I would be glad if someone could take a look at the code.
     
  2. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,238
    Alpha to coverage is a rendering feature that relies on MSAA's coverage samples, and Shader Graph doesn't support it. You can't use it with Shader Graph at all.

    However, the snippet of code you posted is about scaling the alpha over the mip levels to keep the alpha-tested coverage the same, which is separate from alpha to coverage and can be used with plain alpha testing just as well. The mip level hack I presented in that article is unnecessary now, as you can just enable the "Mip Maps Preserve Coverage" option in the texture asset's import settings. That calculates correctly scaled mip maps to retain the apparent "opacity" of the original texture.
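
    The other half of the technique, keeping the cutoff edge roughly a pixel wide, can still live in Shader Graph as a Custom Function node feeding the master node's Alpha input. A minimal sketch (the function name and the Alpha/Cutoff inputs are just whatever you wire up in the graph, not anything Unity defines):
    Code (CSharp):
    // Sharpens a sampled alpha around the cutoff so the alpha-tested edge stays
    // roughly one pixel wide at any distance (the fwidth rescale from the article).
    void SharpenAlpha_float(float Alpha, float Cutoff, out float Out)
    {
        Out = (Alpha - Cutoff) / max(fwidth(Alpha), 0.0001) + 0.5;
    }

    void SharpenAlpha_half(half Alpha, half Cutoff, out half Out)
    {
        Out = (Alpha - Cutoff) / max(fwidth(Alpha), 0.0001) + 0.5;
    }
    Since the rescale recenters the edge around 0.5, the graph's Alpha Clip Threshold should then be left at 0.5.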
     
    GilG and Subliminum like this.
  3. GilG

    GilG

    Joined:
    Jan 30, 2017
    Posts:
    27
    Hello @bgolus !
    Thanks for your answer.

    I thought I had seen you talking about this on the forums :). The topic is really interesting, even more so with VR.

    So I was trying to reinvent a wheel that already exists :p.
    I still have a couple of questions though.


    Inside the texture import settings, there is an Alpha Cutoff Value.
    Is this value always the one used for preserve coverage, rather than the value we enter on the material?

    Regarding the fact that Shader Graph doesn't support alpha to coverage: if we were to code our own master node, would that be possible, and why doesn't anybody do it?

    Finally, when we use a billboard shader, the alpha-tested textures are super stable, even compared to preserve coverage. Why is that? I guess it's down to the way the alpha is handled, but it works so well in a headset...


    Sorry for all these questions but the alpha issue is a very big one for us.
     
  4. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,238
    It's looking at how many pixels are opaque in the original texture at that cutoff, and scaling the smaller mip maps to match that percentage as closely as possible. A low cutoff means more of the texture is opaque, which means the smaller mips need their values scaled up to match. Leaving it at the default of 0.5 can be a good compromise if your artists want to adjust the cutoff on the material, but then the apparent opacity of alpha-tested surfaces might end up too opaque or too transparent again. Some AAA studios remove the option entirely and hard code the 0.5 cutoff, requiring artists to tweak the texture assets instead.

    Because you can't really code your own master nodes. Not without a ton of hassle. There's a reason the examples of custom master nodes out there are all about a year old now: Unity has made it nearly impossible for people to make their own.

    Now, you can take the generated code from a Shader Graph and add AlphaToMask On at the top of the forward pass, but then you're stuck maintaining two versions of the shader: the Shader Graph one that doesn't do alpha to coverage, and the generated vertex/fragment one that does.
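
    For reference, it ends up looking roughly like this in the copied shader (pass names differ between URP versions, so treat this as a sketch rather than exact generated code):
    Code (CSharp):
    Pass
    {
        Name "ForwardLit"
        Tags { "LightMode" = "UniversalForward" }

        // added by hand after copying the generated Shader Graph code
        AlphaToMask On

        ZWrite On
        Cull Back

        HLSLPROGRAM
        // ... generated vertex/fragment code, unchanged ...
        ENDHLSL
    }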

    Not sure. It might just be that the alpha cutoff on the material is set to a significantly different value than the one on the texture. Or it could be that having the object "shrink" due to mip maps is less obvious than it getting slightly too fat with bilinear filtering, and you might need to enable trilinear filtering?
     
  5. GilG

    GilG

    Joined:
    Jan 30, 2017
    Posts:
    27
    Thanks for the tip, I have some really low values by default (0.3 / 0.1). I will test this on the devices during the week :).

    I checked the Unity asset "ShaderGraphEssentials"; it adds some master nodes.
    Adding AlphaToMask On manually is a nice possibility then. Maybe we should script something to add it automatically.
    It's too bad we can't do that directly in Shader Graph. I thought the technique was more broadly used. Maybe Shader Graph is the problem here :p.

    I will check with trilinear; I thought I already had, but I didn't recheck everything with preserve coverage.
    I know it was even disabled for the billboard textures and they were still rendered perfectly.

    Thanks so much for the information!
    Have a nice day ;)
     
  6. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,238
    Yeah, a few people are still trying to sell assets, but Unity keeps finding ways to make it harder on them with each patch (seemingly unintentionally). The short version is that Unity has marked more and more of the SRP code (including Shader Graph) as internal in the C# code. This means it can't be extended without essentially hacking the compiled code at runtime to strip the internal access, and it also means every update to an SRP potentially breaks it.

    No one outside of Unity (or even the few people inside Unity I've talked to) seems to know why it's all marked as internal. Just removing that classification from the classes would make it easily extendable by store assets and wouldn't cause any problems.
     
    tspk91 and Desoxi like this.
  7. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Desoxi and Deleted User like this.
  8. Desoxi

    Desoxi

    Joined:
    Apr 12, 2015
    Posts:
    195
    Any news on this? Is it still impossible to implement alpha to coverage with shader graph?
     
  9. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,238
    Support looks like it was added to HDRP's Shader Graph at some point; at least it appears to be in the latest version on GitHub. It's not yet in the version on the Package Manager, and not in URP.
     
    Desoxi likes this.
  10. Desoxi

    Desoxi

    Joined:
    Apr 12, 2015
    Posts:
    195
    Thank you @bgolus. Then I think the way to go for now is to write the shader by hand in URP.
     
  11. desenholdb

    desenholdb

    Joined:
    Oct 4, 2012
    Posts:
    12
    Just reporting that I'm going through the same thing. Gonna finish it by hand and keep the two versions until it is in URP's shadergraph.
     
    hippocoder likes this.
  12. moonaanii

    moonaanii

    Joined:
    Aug 28, 2017
    Posts:
    2
    @desenholdb May I ask what you meant by saying 'by hand'? I'm stuck on this problem...
     
  13. Alehr

    Alehr

    Joined:
    Nov 16, 2011
    Posts:
    17
    @moonaanii I think he means just adding "AlphaToMask On" to the generated Shader Graph code. This works fine, but as others have mentioned, it's a mind-numbingly dumb hassle for something that should really just be a checkbox in the master node settings.
     
    tspk91, hippocoder and Desoxi like this.
  14. tspk91

    tspk91

    Joined:
    Nov 19, 2014
    Posts:
    130
    @bgolus So you'd say that, as things currently stand, it's best to just write shaders for URP ourselves, right? That's what I'm doing for Quest development, but I always feel uncomfortable that I'm not doing things the "official" way.
     
  15. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Unity doesn't really do official ways. That would imply things have been planned long enough without a sudden change of direction.

    I don't think you can rely on the shader source being problem-free forever, but you can rely on it for the current major version for a while yet. If you want both safety and source, you should use Shader Graph with a custom function node.
     
  16. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,238
    Except you can't enable A2C this way.
     
  17. jubaerjams8548

    jubaerjams8548

    Joined:
    Jun 8, 2020
    Posts:
    33
    What would be the cheapest and fastest way to implement alpha for mobile devices?
     
  18. jiaozi158

    jiaozi158

    Joined:
    May 24, 2020
    Posts:
    23
    Hi, there are mainly two ways to implement transparency in Unity:

    1. Alpha Blend (set the material's surface type to Transparent)
    • Supports semi-transparency.
    • Always rendered in the Forward path, so at most 8 lights in URP (unless using Forward+).
    • Not taken into account by some post-processing effects (Depth of Field, ...).
    • Has transparent sorting issues by default. To mitigate this, enable Depth Write in the shader (if using Shader Graph, set "Depth Write" to "Force Enabled").
    • Severe overdraw for something like overlapping grass.


    2. Alpha Clipping (check the box)
    • The effective alpha can only be 0 or 1.
    • The material is still opaque, so it supports the Deferred rendering path and post-processing effects like Depth of Field.
    • Needs Alpha-To-Coverage to anti-alias the clipped edges under MSAA (built in since URP 14; you may need to implement it yourself in Built-in RP shaders).
    • No transparent sorting issues by default.
    • Severe overdraw can still exist, but the cost can be reduced by enabling Depth Priming in the URP asset.

    Since there are many limitations to Alpha Blend in URP/Built-in RP, I suggest:
    • Compare their performance on your target mobile devices before making a decision.
    • Keep in mind that some features are only available with Alpha Clipping, so your decision can also depend on the project's needs. (A small shader sketch of the two approaches follows below.)
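
    To make the difference concrete, here is a minimal hand-written sketch of the two fragment shaders (assuming the URP core shader library macros; property names like _BaseMap, _BaseColor and _Cutoff are just the usual conventions, not tied to any particular shader):
    Code (CSharp):
    TEXTURE2D(_BaseMap); SAMPLER(sampler_BaseMap);
    half4 _BaseColor;
    half _Cutoff;

    // 1. Alpha Blend: the pass uses "Blend SrcAlpha OneMinusSrcAlpha" and "ZWrite Off",
    //    and the object renders in the Transparent queue.
    half4 FragTransparent(float2 uv : TEXCOORD0) : SV_Target
    {
        half4 col = SAMPLE_TEXTURE2D(_BaseMap, sampler_BaseMap, uv) * _BaseColor;
        return col; // col.a decides how much of the background shows through
    }

    // 2. Alpha Clipping: the pass keeps "ZWrite On" and the object stays in the opaque/AlphaTest queue.
    half4 FragClipped(float2 uv : TEXCOORD0) : SV_Target
    {
        half4 col = SAMPLE_TEXTURE2D(_BaseMap, sampler_BaseMap, uv) * _BaseColor;
        clip(col.a - _Cutoff);       // discard pixels below the cutoff entirely
        return half4(col.rgb, 1.0);  // surviving pixels are fully opaque
    }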
     
  19. jubaerjams8548

    jubaerjams8548

    Joined:
    Jun 8, 2020
    Posts:
    33
    I heard that additive blending is faster than alpha blending... is it true?

    I mean Blend SrcAlpha One vs. regular alpha blending; which wins, in your experience?
     
  20. wwWwwwW1

    wwWwwwW1

    Joined:
    Oct 31, 2021
    Posts:
    631
    Sorry, I'm not sure which one is faster.

    But most of the time, when transparency is slow, it's because of overdraw (it consumes the GPU's fill rate).

    There is no real way around this, because overlapping transparent pixels must each be shaded in order to support semi-transparency.

    As for Alpha Clipping objects, they are still opaque.

    Alpha Clipping does not support semi-transparency, so it's possible to shade each pixel only once, which reduces overdraw. (This needs a depth prepass; see the sketch after this list.)

    (Skip this if you already know, or don't want to know, what a depth prepass is.)
    • A depth prepass renders only the depth (into the camera depth buffer) before the "color" (lighting) passes.
    • Then, when rendering "color", the ZTest is changed from "Less Equal (LEqual)" to "Equal".
    • ZTest Equal means a pixel is only shaded if its depth is equal to the value already in the camera depth buffer.
    • This way, opaque overdraw only happens during the cheap depth-only rendering.
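
    Very roughly, a hand-written two-pass version of that idea looks like this (a simplified sketch; URP's real DepthOnly pass and Depth Priming setup are more involved):
    Code (CSharp):
    // Pass 1: depth only. Writes depth, outputs no color (and clip()s if alpha clipping is used).
    Pass
    {
        Name "DepthOnly"
        Tags { "LightMode" = "DepthOnly" }
        ZWrite On
        ColorMask 0
        // vertex/fragment program that only outputs position
    }

    // Pass 2: lighting. Relies on the depth written above.
    Pass
    {
        Name "ForwardLit"
        Tags { "LightMode" = "UniversalForward" }
        ZWrite Off
        ZTest Equal   // only shade pixels that "won" the depth prepass
        // full lighting vertex/fragment program
    }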

    Someone may tell you that Alpha Clipping can be slow on mobile and that you shouldn't use it.

    That is because of the way Alpha Clipping works: it discards pixels whose alpha is below the cutoff, and a shader that can discard pixels breaks a technique called Early-Z on most mobile GPU architectures.

    The point of Early-Z (rejecting hidden fragments before they are shaded) is to reduce overdraw on opaque objects.

    But that discussion is about choosing between "Opaque" and "Alpha Clipping Opaque".

    The fact is that both Alpha Clipping and Alpha Blending (including multiply, pre-multiply, additive) can be slow, and the reason is overdraw.
    So what is the cheapest and best way to implement alpha on mobile?

    I can only say it depends.

    "Opaque Alpha Clipping + depth prepass" should be faster than "Transparent", especially when a depth prepass already exists anyway (e.g. to generate the depth texture so something like SSAO has depth information before the opaque pass).

    But a depth prepass does not exist in the Built-in RP, which means you'll need to add it manually if possible. (It's called "Depth Priming" in URP.)

    What I can share are some tricks for reducing overdraw.

    Opaque (Alpha Clipping):
    Try using a depth prepass (Depth Priming with "Forced" mode in URP 12 and above).

    Transparent:
    Use a sprite mesh that fits the visible pixels tightly rather than a full quad; in the image below, the middle (tight) mesh has less overdraw than the quad on the right.

    But make sure the mesh is not too high-poly.

    Picture from https://thegamedev.guru/unity-ui/sprite-vs-image/:
    [Image: a sprite rendered on a full quad vs. on a tightly fitted mesh]


     
  21. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,238
    Yes!* Additive blending requires two or three fewer operations than traditional alpha blending; it's a single add (or an add and a multiply if using the alpha) compared to an add, a subtract, and two multiplies.

    * This only actually matters if you're rendering on an OpenGL ES 1.0 or DirectX 7.0 class GPU, and that class of GPU is effectively dead. On modern GPUs there's no difference, because the blend is done by hardware that can do any blend mode at the same "speed". And even if that weren't the case, the difference between two or three math operations on modern GPUs, even low-end mobile GPUs, is nigh unmeasurable.
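
    For concreteness, these are the blend modes being compared and the math they imply (the operation counts above come straight from these equations):
    Code (CSharp):
    // Additive: Blend SrcAlpha One
    //   result = dst + src.rgb * src.a                  -> 1 multiply + 1 add
    //   (plain "Blend One One" is just: result = dst + src.rgb -> 1 add)

    // Traditional alpha blend: Blend SrcAlpha OneMinusSrcAlpha
    //   result = src.rgb * src.a + dst * (1 - src.a)    -> 2 multiplies + 1 subtract + 1 add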
     
  22. patrickb2022

    patrickb2022

    Joined:
    Oct 15, 2022
    Posts:
    2
    @bgolus Hi, do you know if anything has changed about AlphaToMask with Shader Graph?
     
  23. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,238
    Nope.
     
    patrickb2022 likes this.