
Question: Overlapping transparent shaders

Discussion in 'Shader Graph' started by LimeVector, Nov 8, 2020.

  1. LimeVector

    LimeVector

    Joined:
    Mar 23, 2017
    Posts:
    6
    Hi there,

    For the past few weeks/months I have been struggling with this issue.
    Basically, I have two spheres both using a shader set to transparent (and using the screen position node). When they overlap, the one in front completely ignores the one in the back.

    The following video shows what I mean: two objects, both using a transparent shader. The green one distorts, and the red one acts as see-through (for the purposes of this question). As can be seen, the green sphere is not visible through the red sphere (when the red sphere is in front of the green one).


    How can I make it so that the green sphere is visible when it is behind the red sphere?
     
  2. koirat

    koirat

    Joined:
    Jul 7, 2012
    Posts:
    2,073
    Are your spheres writing to the Z-buffer?
    They should not be.
    Also, I see a sorting problem: green should be rendered first and then red.
     
  3. LimeVector

    LimeVector

    Joined:
    Mar 23, 2017
    Posts:
    6
    // Are your spheres writing to the Z-buffer?
    To be honest, I don't know. I created them in Shader Graph, so whatever the defaults are there is what my shaders use.

    // Also, I see a sorting problem: green should be rendered first and then red.
    I think the video might not have displayed it properly, sorry for that, but the red sphere is actually in front of the green sphere. If the red sphere had been behind the green sphere, it would not have been visible.

    Maybe if it helps I'll link my shaders:

    Green Sphere Shader (distortion):
    https://i.imgur.com/Bjz6sH2.png

    Red Sphere Shader (see-through):
    https://i.imgur.com/LJpXeOF.png
     
  4. koirat

    koirat

    Joined:
    Jul 7, 2012
    Posts:
    2,073
    By "first" I don't mean closer to the camera, but drawn first to the buffer.
    Show me your materials.
     
  5. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,352
    The Scene Color node samples from the Opaque Texture, so named because it’s a copy of what was rendered just after all opaque objects (and the sky) have rendered, but before any transparent objects. If you have an object that uses the scene color node, no transparent objects will show up behind it.

    There is no real workaround to this. That’s the limitation you have to work within if you want to use the Scene Color node. If you don’t need distortion, you should be using the master node’s alpha input and blend mode to get the look you want.
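
    For reference, in a hand-written URP shader the Scene Color node roughly corresponds to sampling _CameraOpaqueTexture. A sketch, with the texture and macro names as URP's shader library defines them (exact includes vary by URP version):

    ```hlsl
    // Rough equivalent of the Scene Color node in a hand-written URP shader.
    // _CameraOpaqueTexture is the copy taken after all opaques (and the sky)
    // but before any transparents -- so nothing transparent can ever show up
    // in what this sample returns.
    TEXTURE2D_X(_CameraOpaqueTexture);
    SAMPLER(sampler_CameraOpaqueTexture);

    half3 SampleDistortedSceneColor(float2 screenUV, float2 distortionOffset)
    {
        // Offset the screen UV to fake refraction, then read the opaque copy.
        float2 uv = screenUV + distortionOffset;
        return SAMPLE_TEXTURE2D_X(_CameraOpaqueTexture,
                                  sampler_CameraOpaqueTexture, uv).rgb;
    }
    ```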
     
  6. LimeVector

    LimeVector

    Joined:
    Mar 23, 2017
    Posts:
    6
    Well, the materials are default as well. I just slapped the shader onto the materials and applied the materials to the objects.
     
  7. LimeVector

    LimeVector

    Joined:
    Mar 23, 2017
    Posts:
    6
    Yeah, I am aware of the alpha input, but I only used this for the purpose of the video.

    The problem is that this phenomenon really hinders me in what I want to do. The following video shows more clearly what I mean by "hinders".


    I would expect the green distorted sphere to be visible inside the warp (as in: warped as well), but it isn't.

    Once again, these are default, unchanged materials with the shaders slapped onto them. If you want, I could link the shader graphs again.

    Isn't there a way to force transparent shaders to be written to the color buffer? Since you said there is no workaround, I'm assuming not.
     
  8. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,352
    Correct, the limitation you’re seeing is “by design” for the URP. There is no workaround other than not using the URP or HDRP.

    You have to use the original built in renderer if you want to be able to stack multiple transparent objects that warp the screen. If you want a node based material editor, look into Amplify Shader Editor.
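
    The built-in renderer can do this because ShaderLab's GrabPass takes a fresh screen copy per object, including any transparents already drawn. A minimal sketch (the shader name and pass contents are illustrative):

    ```shaderlab
    Shader "Example/StackableDistortion"
    {
        SubShader
        {
            Tags { "Queue" = "Transparent" "RenderType" = "Transparent" }

            // An unnamed GrabPass copies everything rendered so far -- including
            // transparent objects drawn earlier -- into _GrabTexture, once per
            // object. That per-object copy is what lets distortion effects
            // stack, and also what makes this approach expensive.
            GrabPass { }

            Pass
            {
                // ... vertex/fragment program that samples _GrabTexture with an
                // offset screen UV to produce the distortion ...
            }
        }
    }
    ```

    (A named GrabPass, e.g. GrabPass { "_GrabTexture" }, grabs only once per frame, which is cheaper but brings back the same stacking problem.)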
     
  9. koirat

    koirat

    Joined:
    Jul 7, 2012
    Posts:
    2,073
    I'm quite surprised myself.
    So no distortion effects like heat haze or shockwaves if you want other transparent objects behind them?
    Also, why is it implemented that way? Is this a well-known technological limitation (something like intersecting transparent objects)?
     
  10. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,352
    Because to be able to read a texture of the scene you need to make a copy of the current render target. That’s relatively expensive to do, especially if you have to do it a lot of times. If you’re doing it between rendering two objects, you have to stall the entire GPU while it’s in the middle of rendering to make the copy. It’s unfortunate because a lot of effects can’t be easily done with the system they’ve gone with, but it is what it is.

    Technically, with HDRP there is a workaround, but it’s not really a general-purpose one. You can use raytracing instead of the scene color to do real refraction through multiple layers of transparency.
     
  11. LimeVector

    LimeVector

    Joined:
    Mar 23, 2017
    Posts:
    6
    Well, that's a shame. But does that mean that if I want this kind of effect, I have to avoid URP altogether, or would it still be possible in URP using Amplify Shader Editor?
     
  12. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,352
    Amplify Shader Editor works with both the built in renderers and URP. However this is a fundamental limitation of the URP, regardless of the tools used to make the shader.
     
  13. LimeVector

    LimeVector

    Joined:
    Mar 23, 2017
    Posts:
    6
    Alright, I understand it a lot better now. Thank you for your help!