Dark edge around textures drawn into a RenderTexture

Discussion in 'General Graphics' started by Ross_S, Feb 8, 2018.

  1. Ross_S


    Joined:
    Jul 10, 2012
    Posts:
    29
Hello folks,
I'm attempting to paint into a RenderTexture (RT from now on). I've attached images of the RT camera and the RT setup, and of the plane that displays the RT in the scene.
I draw textures for one frame in front of the RT camera, which doesn't clear... and so it slowly fills the texture up.
All works sweetly except I get these dark edges where the alpha of the texture is very low. See the image. It's as if it is combining with some imaginary black colour somewhere along the line. But I'm baffled...
If there is already a colour there, it works perfectly, as you can see where the pink and white lines touch each other... it's only if the space on the RT was previously transparent. Screen Shot 2018-02-08 at 12.48.49.png

I've tried fiddling with the shader on the blob that's being drawn in front of the RT camera, but I can't seem to find a shader that fixes it. I'm using Sprites/Default in the attached image, and I've tried loads of different ones. Unlit/Transparent half seems to work, but it seems to replace the alpha that's already on the texture plane with the alpha from the blob, so drawing a single blob will always have almost clear edges, even if that plane was already filled with white.

    Any help here would be much appreciated.
    Thanks!

    Screen Shot 2018-02-08 at 12.19.17.png Screen Shot 2018-02-08 at 12.19.26.png Screen Shot 2018-02-08 at 12.25.21.png
     
  2. brownboot67


    Joined:
    Jan 5, 2013
    Posts:
    365
That's pre-multiplied alpha darkening the edges of your texture. In the import settings for the lines/blobs, whatever, you want to uncheck "Alpha Is Transparency" and make sure your source texture's RGB is solidified.
     
  3. Ross_S


    Joined:
    Jul 10, 2012
    Posts:
    29
Thanks brownboot... it sounds like you've got the right answer there, only I can't quite make the breakthrough and get it to actually work. I've created a solid white square in Photoshop with an alpha channel, so that when I import it into Unity it looks the same as my previous blob... but it still has the dark edging. Is there some trick in the export from Photoshop? I've tried saving as PNG, PSD and TIFF, all doing the same thing. I've tried every combination of the Alpha Source and sRGB settings on import... ummmmm...
    Thanks!
    Screen Shot 2018-02-08 at 15.09.37.png Screen Shot 2018-02-08 at 15.09.42.png Screen Shot 2018-02-08 at 15.15.49.png
     
  4. bgolus


    Joined:
    Dec 7, 2012
    Posts:
    7,947
@brownboot67 is correct that pre-multiplied alpha is the problem here, but not in the way he's thinking, and disabling the Alpha Is Transparency setting will have no effect, or may even make it worse. The problem is that the image produced by rendering into a render texture has been premultiplied by the alpha. This is the expected outcome: the color by necessity was multiplied by the alpha when it was rendered, for the blend to work properly.

    The short version is you should be using a pre-multiplied shader for the material used to draw the render texture in the scene. I believe the "Sprites/Default" shader is such a shader, or alternatively the "Particles/Alpha Blend Premultiply". That may fix it for you well enough with just that change. Though you’ll probably have bright edges instead of dark edges if you do just that.


    Quick rundown of why:
    You have your texture that's solid white with an alpha. That texture as it exists in Unity and on the GPU is still solid white.
    You have an alpha blend shader you're using to render that texture to the screen.
    You have your render texture.

The render texture in its initial state is the color (0.0, 0.0, 0.0, 0.0): black with zero alpha. Your texture will be (1.0, 1.0, 1.0, 1.0) in the center: white with 100% alpha. Let's assume you're using an alpha blend shader. The blend mode for an alpha blend shader will be Blend SrcAlpha OneMinusSrcAlpha.

    That translates into: multiply the shader's output color by the output alpha, and multiply the existing color by one minus the output alpha, and add them together. That's written out as this (where Src is the output from the shader and Dst is the color in the render texture prior to the shader):
    Result = Src.rgb * Src.a + Dst.rgb * (1 - Src.a)

    The end result is the color is white.
    (1, 1, 1) * 1 + (0, 0, 0) * (1 - 1) == (1, 1, 1)
    This is exactly as you'd expect it. This is the math for your basic lerp.

Let's change this to a pixel from the edge of your texture, presumably (1.0, 1.0, 1.0, 0.0). Now that same math works out to a color of (0, 0, 0), or black. Again, probably what you expect. Now let's look at a pixel color of (1.0, 1.0, 1.0, 0.5). The result of that is a color of (0.5, 0.5, 0.5), or grey.
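The three cases above can be checked with a few lines of plain arithmetic. This is just the blend equation from earlier, nothing Unity-specific; `alpha_blend` is a made-up helper name mirroring Blend SrcAlpha OneMinusSrcAlpha:

```python
def alpha_blend(src_rgb, src_a, dst_rgb):
    # Blend SrcAlpha OneMinusSrcAlpha:
    # Result = Src.rgb * Src.a + Dst.rgb * (1 - Src.a)
    return tuple(s * src_a + d * (1 - src_a) for s, d in zip(src_rgb, dst_rgb))

clear = (0.0, 0.0, 0.0)  # render texture cleared to transparent black

print(alpha_blend((1.0, 1.0, 1.0), 1.0, clear))  # center pixel -> (1.0, 1.0, 1.0), white
print(alpha_blend((1.0, 1.0, 1.0), 0.0, clear))  # edge pixel   -> (0.0, 0.0, 0.0), black
print(alpha_blend((1.0, 1.0, 1.0), 0.5, clear))  # half alpha   -> (0.5, 0.5, 0.5), grey
```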

You might at this point start to see the problem. The color in the render texture isn't a solid color; it's fading out to black. You could clear the render texture to white, but then you'd just have white fringes on your colored parts. And the alpha blend shader you're likely using to display the render texture is going to multiply the color by the alpha again, making it even darker.


    So what is this premultiplied thing that keeps being mentioned? Well, it's that part where the texture's color gets multiplied by the alpha. If you use a premultiplied shader, the blend mode changes to Blend One OneMinusSrcAlpha. The assumption is the color in the texture has already been premultiplied by the alpha, so it skips the part where it needs to do that. Your render texture's color has been premultiplied, and thus you should use a shader that assumes that.
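Running the same kind of arithmetic sketch shows why the premultiplied blend is the right one here (again just plain math; `premul_blend` is a made-up name for Blend One OneMinusSrcAlpha):

```python
def premul_blend(src_rgb, src_a, dst_rgb):
    # Blend One OneMinusSrcAlpha: src.rgb is assumed to already be
    # multiplied by src.a, so it's added in directly
    return tuple(s + d * (1 - src_a) for s, d in zip(src_rgb, dst_rgb))

def alpha_blend(src_rgb, src_a, dst_rgb):
    # Blend SrcAlpha OneMinusSrcAlpha, for comparison
    return tuple(s * src_a + d * (1 - src_a) for s, d in zip(src_rgb, dst_rgb))

# The render texture holds premultiplied colors: a half-alpha white blob is
# stored as (0.5, 0.5, 0.5) with alpha 0.5. Composited over a white background:
white_bg = (1.0, 1.0, 1.0)
print(premul_blend((0.5, 0.5, 0.5), 0.5, white_bg))  # -> (1.0, 1.0, 1.0), correct
# A straight alpha blend multiplies by the alpha a second time and darkens it:
print(alpha_blend((0.5, 0.5, 0.5), 0.5, white_bg))   # -> (0.75, 0.75, 0.75), grey fringe
```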


    That’s not the end though. Really you want to treat the color and the alpha differently when rendering the dots. Normally the alpha of the render buffer doesn't matter, but in this case since you're going to be reusing it, it does. If you have a solid white texture with an alpha you want to be doing a normal alpha blend for the color, and premultiplied blend for just the alpha value. This is possible with a custom shader, but it’s way easier to just use a premultiplied texture and shader for the blobs too.
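For what the custom-shader route would look like: ShaderLab lets you give RGB and alpha separate blend factors in one statement, e.g. Blend SrcAlpha OneMinusSrcAlpha, One OneMinusSrcAlpha. A sketch of that combined blend in plain arithmetic (the helper name and the half-alpha stamp values are made up for illustration):

```python
def blend_separate(src, dst):
    # ShaderLab: Blend SrcAlpha OneMinusSrcAlpha, One OneMinusSrcAlpha
    # straight-alpha blend for RGB, "over"-style accumulation for alpha
    sr, sg, sb, sa = src
    dr, dg, db, da = dst
    rgb = tuple(s * sa + d * (1 - sa) for s, d in zip((sr, sg, sb), (dr, dg, db)))
    a = sa + da * (1 - sa)
    return rgb + (a,)

rt = (0.0, 0.0, 0.0, 0.0)                      # cleared render texture
rt = blend_separate((1.0, 1.0, 1.0, 0.5), rt)  # stamp a half-alpha white blob
rt = blend_separate((1.0, 1.0, 1.0, 0.5), rt)  # stamp it again in the same spot
print(rt)  # -> (0.75, 0.75, 0.75, 0.75)
```

Note the result: color 0.75 at alpha 0.75 is exactly premultiplied white at 75% coverage, so the render texture stays consistent as blobs pile up.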

    In photoshop make a solid black layer for the background. Then add a solid white color for the layer above. Select the image’s alpha as a selection and apply it as an alpha mask for the white layer. It might seem like it’d be easier to just copy the alpha into the RGB, but don’t, it’s not quite the same. Save this as your image. Personally I just use the straight .psd file in the Unity asset folder btw.


So now you’ve set both the material used to draw the render texture in the scene and the material used to draw the blobs into the render texture to use a premultiplied shader, and made the blob texture itself premultiplied. And now it should look fine.


    There’s one last step you may need to do if after all that it still doesn’t look right. Try setting the render texture to be sRGB. It probably also doesn’t need a depth buffer.
     
  5. Ross_S


    Joined:
    Jul 10, 2012
    Posts:
    29
    Amazing! Thanks - that's above and beyond the call of duty.
So... making the RenderTexture plane Sprites/Default didn't make any difference, but using Particles/Alpha Blend Premultiply does. It appears to fix it, except when I leave the brush in one position: then, as the very low alpha edge pixels of the blob are repeatedly rendered on top of each other, you get some solid black pixels and some other weird artefacts, as per the picture.
I guess I'm still not rendering/creating the blob texture correctly. The best result I got for the blob texture was actually using Sprites/Default; if I used Particles/Alpha Blend Premultiply again, it appeared almost additive.
    I think I created the psd texture just as you instructed - it creates an extra channel called Layer 1 Mask... It all seems to be working... except the artefacts as per the image...
    However... I mainly just wanted to say thanks... I think this is good enough for now... if the brush is left in one spot then i quite quickly scale down its size so this problem is avoided...
I know it kind of looks like the texture is compressed or something... but both my blob and RT are definitely ARGB32 and RGBA32
    :)
    Screen Shot 2018-02-09 at 14.49.24.png
     
  6. bgolus


    Joined:
    Dec 7, 2012
    Posts:
    7,947
It’s not compression, but likely an artifact of quantization. Does it happen when the blob is white? My guess is that when the blob is colored, some pixels’ Texture RGB * Color RGB quantizes down to a darker value. I’m not sure how to fix that. You could try using ARGBHalf.
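One way to see how 8-bit storage can darken a tinted low-alpha pixel (a sketch of the quantization effect, not the exact artifact; the tint and alpha values are made up for illustration):

```python
def q8(x):
    # store a 0..1 value in one 8-bit channel and read it back
    return round(x * 255) / 255

a = 3 / 255              # a very low-alpha edge pixel
tint = 0.4               # blob color applied to the white texture
stored = q8(tint * a)    # premultiplied value: 0.4 * 3 = 1.2 -> rounds down to 1/255
recovered = stored / a   # ~0.333 instead of 0.4: visibly darker than intended
print(stored * 255, recovered)
```

A 16-bit float format like ARGBHalf has enough precision near zero that these low-alpha products survive the round trip, which is why it could help.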
     