
Migrating Planet Shader to Shader Graph

Discussion in 'Shaders' started by Eraph, Aug 11, 2018.

  1. Eraph

    Eraph

    Joined:
    Aug 15, 2015
    Posts:
    45
    I had a nice shader made up based on Nils Daumann's excellent Planet Shader as described on the Unify Wiki. Unfortunately, my attempts to make it compatible with the new Shader Graph for the scriptable Lightweight Render Pipeline have met with very limited success. In this post I would like to build up an understanding of what's going on in the original shader script, and to seek help on how it could be migrated to nodes in the graph editor.
    I'm mostly interested in making the atmosphere work, but I'll cover the whole shader anyway.

    Assumptions (Correct me if I'm wrong!)
    • If it works in the lightweight render pipeline, it will work in the high-definition render pipeline, so I will be targeting lightweight.
    • There is a one-to-one mapping between shader graphs and passes, so for each pass a new shader must be created.
    Shader Inputs
    • MainTex (Texture2D) - applied to the planet to give it land, sea, etc.
    • Color (Color) - gives a tint to the planet texture.
    • AtmoColor (Color) - the colour of the atmosphere glow.
    • Falloff (Float) - determines the spread of the atmosphere glow over the planet.
    • Transparency (Float) - opacity of the atmosphere glow into space.
    • Transparency Planet (Float) - opacity of the atmosphere glow over the planet.
    Float in the original shader corresponds to Vector1 in the shader graph editor.

    Pass 1: Planet Texture and Surface Glow
    This pass gives the planet itself its texture, plus a flat surface glow that will essentially blend into the space-facing glow from the next pass.
    There are two parts to this, and this is where my understanding of shaders gets a bit fuzzy.

    Code (csharp):
    v2f vert(appdata_base v)
    {
        v2f o;

        o.pos = mul(UNITY_MATRIX_MVP, v.vertex);
        o.normal = mul((float3x3)_Object2World, v.normal);
        o.worldvertpos = mul(_Object2World, v.vertex).xyz;
        o.texcoord = TRANSFORM_TEX(v.texcoord, _MainTex);

        return o;
    }
    So it looks like this is concerned with transforming the vertex data (position, normal and texture coordinates) into the spaces the fragment shader expects. I'm not sure how relevant this is, as I've managed to get nice-looking planet textures on a sphere without worrying about this part. Is this already being handled in the new render pipelines?
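    For reference, here's roughly how I think those lines map onto the new SRP shader library if you were writing the pass by hand. This is just a sketch (the function names are from the SRP core shader library, and as far as I can tell Shader Graph generates this kind of boilerplate for you anyway):

    Code (csharp):
    // Rough SRP-era equivalents of the vertex transforms above (sketch only):
    o.pos          = TransformObjectToHClip(v.vertex.xyz);   // was mul(UNITY_MATRIX_MVP, v.vertex)
    o.normal       = TransformObjectToWorldNormal(v.normal); // was mul((float3x3)_Object2World, v.normal)
    o.worldvertpos = TransformObjectToWorld(v.vertex.xyz);   // was mul(_Object2World, v.vertex).xyz
    o.texcoord     = TRANSFORM_TEX(v.texcoord, _MainTex);    // tiling/offset still comes from _MainTex_ST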

    Code (csharp):
    float4 frag(v2f i) : COLOR
    {
        i.normal = normalize(i.normal);
        float3 viewdir = normalize(_WorldSpaceCameraPos - i.worldvertpos);

        float4 atmo = _AtmoColor;
        atmo.a = pow(1.0 - saturate(dot(viewdir, i.normal)), _FalloffPlanet);
        atmo.a *= _TransparencyPlanet * _Color;

        float4 color = tex2D(_MainTex, i.texcoord) * _Color;
        color.rgb = lerp(color.rgb, atmo.rgb, atmo.a);

        return color * dot(_WorldSpaceLightPos0, i.normal);
    }
    A lot of this code maps directly to nodes in the shader graph editor, and indeed I reckon the Fresnel Effect node might simplify it somewhat further (see the note after this list). The first five lines set up the "shape" of the atmosphere output as well as the colors and alphas.
    1. Feed a Normal Vector node into a Normalize node.
      1. The output looks exactly the same; is it necessary to normalize? Maybe it's because we're working with a sphere, but if you're creating a planet...
    2. Add a Subtract node, feeding the Position property of a Camera node into input A, and the Out property of a Position node into input B.
      1. In the original code, it uses the property worldvertpos which I'm presuming is using the Position node with Space set to World.
    3. Feed the output of the Subtract node into a Normalize node.
    4. Feed the outputs of the Normalize nodes from before into the input properties of a Dot Product node.
    5. Feed the output of the Dot Product node into a Saturate node.
      1. Again, the output seems almost identical. Is this a sort of belt-and-braces approach to ensure no matter what, this will produce the optimum result?
    6. Feed the output of the Saturate node into the B property of a Subtract node. The A property should already be set to 1.
    7. Feed the output of the Subtract node into the A property of a Power node, and feed the Falloff Planet input into the B property. Starting to look like a somewhat convincing inner glow yet?
    8. Create a Multiply node and pass in the Transparency Planet and Color inputs.
    9. Feed the output of that Multiply node into another Multiply node, and pass in the output of the Power node from before.
      • The next two steps aren't necessary.
    10. Feed the AtmoColor input into a Split node, and then feed the RGB properties of that node into their corresponding inputs in a Combine node (R to R, G to G, B to B, but not A).
      • You could probably multiply the output of A with what happens next if you want to combine the input alpha with the calculated alpha
    11. Feed the output of the last Multiply node from before into the A property of the Combine node.
      • Uh oh, looks like a solid circle... Is this because alphas aren't shown in previews?
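    Incidentally, steps 1 to 7 above compute exactly what a single Fresnel Effect node gives you. In code they boil down to something like this (a sketch; the variable names are just illustrative):

    Code (csharp):
    // What steps 1-7 (and the Fresnel Effect node) reduce to:
    float fresnel = pow(1.0 - saturate(dot(normalize(normal), normalize(viewDir))), _FalloffPlanet);
    // This is the value the original shader stores in atmo.a before multiplying by _TransparencyPlanet.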
    All going well you should have something like this:


    From what I understand, the next few lines blend the output we've just built on top of the planet texture we want to use, before the final line factors in the position of the light source.
    1. Feed the Texture input into a Sample Texture 2D node.
    2. Feed the RGBA output into property A of a Multiply node, and feed the Color input into property B.
      • This simply tints the color of the planet texture.
    3. Feed the output of the Multiply node into a Split node.
    4. Feed the R, G and B outputs into the corresponding inputs of a Combine Node. Ensure the A input is set to 1.
    5. Feed the RGB output into the A property of a Lerp node.
    6. Feed the RGB output of the last Combine node from the previous steps into the B property of the Lerp node.
    7. From that same Combine node, feed the RGBA into a Split node.
    8. Feed the A output of the Split node into the T property of the Lerp node.
      • Alright, we have a nice glow!
    With one line to go, this is where you should be:


    Edit: That's it for Pass 1! The rest of this post discusses a line of code that has no effect, but has been left in for posterity's sake.

    But unfortunately I'm struggling with this last line. Here's how far I get...
    1. Feed the output of the Lerp node into property A of a Multiply node.
    2. Feed the output of the Normalize node that is attached to the Normal Vector node into the A property of a Dot Product node.
    3. Feed the output of the Dot Product node into property B of the Multiply node just created.
    But what goes in to the B property of the Dot Product node?
    Taking a look at that last line we have this part:
    dot(_WorldSpaceLightPos0, i.normal)

    Where does _WorldSpaceLightPos0 come from? What does it correspond to in the node editor? The closest thing I've found is a Light Probe node, but I've no idea how to use it; nothing I pass into it changes the output. This is kind of crucial to getting the atmosphere effect!
     
    Last edited: Aug 11, 2018
  2. Eraph

    Eraph

    Joined:
    Aug 15, 2015
    Posts:
    45
    Pass 2: Atmosphere Glow
    This pass grows the original shape out from its normals and applies a fading atmosphere effect. As before, there are two parts.

    Code (csharp):
    v2f vert(appdata_base v)
    {
        v2f o;

        v.vertex.xyz += v.normal * _Size;
        o.pos = mul(UNITY_MATRIX_MVP, v.vertex);
        o.normal = mul((float3x3)_Object2World, v.normal);
        o.worldvertpos = mul(_Object2World, v.vertex);

        return o;
    }
    Most of this we don't need to worry about, but there is one line we're particularly interested in here:
    v.vertex.xyz += v.normal*_Size;

    This pushes each vertex out along its normal by the amount specified (for a unit sphere, a value of 0.1 pushes the surface out by 10%).
    To achieve this, the render pipeline package in use must be version 2+.
    1. Feed the Size input into the A property of a Multiply node.
    2. Feed the output of a Position node into the B property of that same Multiply node.
    3. Set the Space property of the Position node to Object.
    4. From the same Position node, feed the output to the B property of an Add node.
    5. Feed the output of the Multiply node into the A property of the Add node.
      • You could skip the Add node altogether if you're happy to use values such as 1.1 to say size is 110% of original, rather than 0.1 as a 10% addition to the original size.
    6. Feed the output of the Add node into the Position property of the PBR Master node.
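    Expressed as code, the node chain above does the same job as the original line. A sketch only: positionOS and normalOS stand in for the Object-space Position and Normal Vector node outputs, and on a unit sphere centred at the origin they point the same way, which is why the Position node can stand in for the normal here.

    Code (csharp):
    // Object-space offset fed into the PBR Master node's Position input:
    float3 offsetPositionOS = positionOS * _Size + positionOS;   // Multiply + Add (steps 1-5)
    // Or skip the Add node and treat _Size as a total scale, e.g. 1.1 for "110% of the original":
    // float3 offsetPositionOS = positionOS * _Size;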
    The next chunk of code deals with the colors and fades of the atmosphere.

    Code (csharp):
    float4 frag(v2f i) : COLOR
    {
        i.normal = normalize(i.normal);
        float3 viewdir = normalize(i.worldvertpos - _WorldSpaceCameraPos);

        float4 color = _AtmoColor;
        color.a = dot(viewdir, i.normal);
        color.a *= dot(i.normal, _WorldSpaceLightPos0);
        color.a = saturate(color.a);
        color.a = pow(color.a, _Falloff);
        color.a *= _Transparency;
        return color;
    }
    Again we see that line
    color.a *= dot(i.normal, _WorldSpaceLightPos0);
    which bgolus assures us is not required. So we should be able to smash through this one.
    1. Feed a Position node (you could use the same one from before) into the A property of a Subtract node.
      • Is there any harm in using a geometry node such as Position twice in the same graph?
    2. Feed the Position property of a Camera node into the B property of the Subtract node.
    3. Feed the output of the Subtract node into a Normalize node.
    4. Feed a Normal Vector node into a Normalize node.
    5. Change the Space property of the Normal Vector node to Object.
    6. Feed the output of both Normalize nodes into a Dot Product node.
    7. Feed the output of the Dot Product node into a Saturate node.
      • At this point we've skipped the line with _WorldSpaceLightPos0.
    8. Feed the output of the Saturate node into the A property of a Power node.
    9. Feed the Falloff input into the B property of the Power node.
    10. Feed the output of the Power node into the A property of a Multiply node.
    11. Feed the Transparency input into the B property of the Multiply node.
    12. Feed the output of the Multiply node into the Alpha property of the PBR Master node.
    13. Feed the AtmoColor input into the Albedo property of the PBR Master node.
    14. Use the cog icon on the PBR Master node to set the Surface property to Transparent, with the Blend mode set to Alpha.
    At this point we have something almost resembling a glow on one side of the sphere, and the graph should look like this:


    Unfortunately this is where I get stuck! There are a couple of key steps missing that I don't know how to do with a shader graph.

    Code (csharp):
    Cull Front
    This line is key; I believe this is what gives the authentic atmosphere glow effect. I would expect to find an option for that on the PBR Master node, but there is nothing. Is it possible to set the culling mode in Shader Graph yet?

    Code (csharp):
    Blend SrcAlpha One
    This one is even trickier. It specifies the blend mode with objects behind it. Again I would expect to find this on the PBR Master node, and indeed there are some blend modes given already, but none does quite what I want. Can we make our own? I have an inkling this actually relies on the previous pass (see the very first post) to decide how it renders, so if we can't do multiple passes in a shader graph, will this be impossible?

    I feel like I'm really close to cracking this, just a couple more steps in the right direction and I think we'll have a working atmosphere for use with the new pipelines! As before, all feedback is welcome, I'm learning a lot looking at this but I've no doubt you guys have plenty more to teach me!
     
    Last edited: Aug 12, 2018
  3. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,336
    Steps 10 & 11 are pointless. While it’s accurate to what the original shader is doing, it was done purely so the original author didn’t have to think of a name for a new variable. Skip the whole split and combine here and just plug the output of the multiply into the lerp. I would even just put the full RGBA values of both the texture and the atmosphere color straight into the lerp rather than doing the two splits and combines.

    You can even save an instruction by multiplying the lerp output by the _Color rather than multiplying both the texture and atmosphere separately prior to the lerp.
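    In code terms that simplification is roughly this (just a sketch, not the exact generated shader):

    Code (csharp):
    // Original:   lerp(tex * _Color, _AtmoColor, atmo.a)
    // Simplified: multiply by _Color once, after the lerp
    float3 rgb = lerp(tex2D(_MainTex, i.texcoord).rgb, _AtmoColor.rgb, atmo.a) * _Color.rgb;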

    Nothing. You don’t do the last dot product at all. That dot product is your basic Lambertian diffuse lighting. Unity’s Shader Graph is set up not to allow the creation of custom lighting models, so you have to use the existing one. Lucky for you, the Lightweight pipeline’s diffuse is already a straight Lambert, and the HD pipeline’s diffuse is equivalent to Lambert when using a smoothness of zero. So use a PBR master node and plug the output of the lerp into the Albedo. Set the metallic and smoothness to zero, and you’re done.
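    For reference, the dot product being dropped here is just the standard Lambert term the pipeline already applies for you (a sketch; lightDir and lightColor come from the pipeline, not from the graph):

    Code (csharp):
    // Basic Lambertian diffuse, which the Lightweight pipeline's lighting already does:
    float3 lit = albedo * saturate(dot(normalWS, lightDir)) * lightColor;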
     
    Last edited: Aug 11, 2018
  4. Eraph

    Eraph

    Joined:
    Aug 15, 2015
    Posts:
    45
    Thanks for your input! You're absolutely right about steps 10 and 11; there's no change after routing straight into the Lerp node.
    Having a look at a sphere with that shader applied, it does look like it's in pretty good shape. I was expecting the relative position of the light to have an effect, but it doesn't look necessary, right enough.
     
  5. Eraph

    Eraph

    Joined:
    Aug 15, 2015
    Posts:
    45
    Apologies for the bump, but I have added the second pass here. It covers the atmosphere glow part, but there are a couple of things I don't know how to do with shader graph yet.
     
  6. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,336
    The second pass is not possible to accurately recreate in Shader Graph. In the base pass the Lambert shading is the last step, and because the lighting models that both the HD and LW pipelines use are very close to that model, you can just use that.

    The atmosphere pass on the other hand is taking the Lambert shading and modifying it. That means it’s a custom lighting model, and not just Lambert, and thus is impossible to recreate using Shader Graph alone. Those are not operations you can do prior to passing it on as it’s directly modifying the cosine curve that the dot product produces.

    The solutions are to make do with the built in lighting model, or use an unlit shader and a shader property to pass in the light direction.

    This is just additive blending. It can be that, or Blend One One; the only difference is that the former multiplies the output color by the output alpha. I don't know which blend mode Shader Graph uses when you choose additive blend, but you can produce the same results with either by multiplying the color value by the alpha in the shader beforehand instead of relying on the blend mode.
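    Spelled out as code, that equivalence looks like this (sketch):

    Code (csharp):
    // Blend SrcAlpha One:  framebuffer += output.rgb * output.a
    // Blend One One:       framebuffer += output.rgb
    // Premultiplying in the shader makes the two behave identically:
    color.rgb *= color.a;   // then plain additive (One One) blending gives the same result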

    This one is a little harder. There is no way to set the culling mode to front, only the default Cull Back, or Cull Off by checking two sided. So you'll have to abuse two sided and the Front-Facing node to make the front face black.
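    A minimal sketch of that workaround, assuming Two Sided is enabled and using the Is Front Face node (isFrontFace below is just a stand-in for that node's output):

    Code (csharp):
    // Emulating "Cull Front" with two-sided rendering: force front faces to black so,
    // with additive blending, only the back faces of the atmosphere shell contribute.
    float3 atmosphereRGB = _AtmoColor.rgb * atmosphereAlpha;
    float3 outputRGB = isFrontFace ? float3(0, 0, 0) : atmosphereRGB;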
     
  7. Eraph

    Eraph

    Joined:
    Aug 15, 2015
    Posts:
    45
    Again, thanks for the feedback! I've actually hacked a workaround for Cull Front: before feeding the position into the Position property of the PBR Master node, multiply it by -1. I think this doesn't play well with the normals on the faces though, so I might have to figure out a way to sort them out as well. I'll keep working on it anyway.
     
  8. MadeFromPolygons

    MadeFromPolygons

    Joined:
    Oct 5, 2013
    Posts:
    3,976

    On the point of custom lighting models, have you or anyone else managed to work out a way to hack this to work? I recall that someone (@PhilSA, I believe) had done toon lighting using Shader Graph a month or two back, and I was wondering how this sort of thing (or any custom lighting) might be achieved in the current Shader Graph + SRP?
     
  9. Eraph

    Eraph

    Joined:
    Aug 15, 2015
    Posts:
    45
    Haven't had much of a chance to look at this recently but here's how far I got. Inverting the positions of vertices works well as a hack for achieving a Cull Front effect. The image below shows the results of two different blend types on the PBR Master node.
    On the left is Alpha blend, on the right is Premultiply blend.



    What I found is that you can export the shader graph to code. I tried this with the intention of setting the Cull and Blend properties manually (after reverting vertex positions to original locations), and it looks like it's very close, but with an obvious halo. Maybe I missed something in the shader... but I would much rather not have to export and edit the code every time I make a change to the graph!

     
  10. Saturn1004

    Saturn1004

    Joined:
    Nov 14, 2014
    Posts:
    42
    I ended up making my own planet shader in shader graph for LWRP before ever seeing this post. Here's what I ended up with. Shader Graph for Planet + Atmosphere included.
    I didn't really make it for anyone's eyes but my own so sorry it's not super polished.
    If anyone has any ideas to improve it I'd love to hear them, especially if anyone knows how to get a more hazy atmosphere and less of a solid line.

    LWRP-Planet.jpg

    LWRP-Planet-Shader-Graph.jpg

    Custom function code to get main light direction:
    #if SHADERGRAPH_PREVIEW
    Direction = half3(0.5, 0.5, 0);
    #else
    Light light = GetMainLight();
    Direction = light.direction;
    #endif
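    If you'd rather keep it in a file-based Custom Function than paste the string into the node, the same code is typically wrapped like this (a sketch; the function name and precision suffix are just an example, with a single Vector3 output called Direction added on the node):

    Code (csharp):
    // MainLight.hlsl - file-mode Custom Function wrapper (node Type = File, Name = "MainLight").
    void MainLight_half(out half3 Direction)
    {
    #if SHADERGRAPH_PREVIEW
        // No real light exists in the node preview, so use a fixed direction.
        Direction = half3(0.5, 0.5, 0);
    #else
        Light light = GetMainLight();
        Direction = light.direction;
    #endif
    }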

    Link to the Shader Graph: https://mega.nz/#!XN1TCIJD!VcbMW_pIU4YUhbFCEiCrhyUkeesAcyQsm1rp9NsXUiw
    Feel free to use it.

    The game it's going to be used in can be found here if anyone cares: https://play.google.com/store/apps/details?id=com.NullReferenceGames.ExoplanetsOnline&hl=en_US
     
    Last edited: Aug 4, 2019
  11. A132LW

    A132LW

    Joined:
    Jun 21, 2017
    Posts:
    37
    If your RP were HDRP, you could plug the fresnel output into distortion to create a blur.
    I do not know of an alternative for LWRP.

    Maybe multiply the fresnel with a texture or add some clouds?

    Also, note that the atmosphere of Earth from space is not very blurry, so a set of layered fresnels ought to work:


    Like this - to get a ramped outline color:
    upload_2019-8-4_13-13-27.png

    Hope this helps.
     


  12. Eraph

    Eraph

    Joined:
    Aug 15, 2015
    Posts:
    45
    Thanks for chipping in, guys! Must admit I haven't looked at this in a while but good to see there's still interest!
     
  13. dispatch_starlost

    dispatch_starlost

    Joined:
    Nov 17, 2017
    Posts:
    37
    Subtracting a bright "border" fresnel from a larger "glow" fresnel can give the effect you're after. In these screenshots I have a generic planet sphere, plus an atmosphere sphere with the fresnel effect shader on it. P.S. I'm using your custom function code, cheers :D

    The atmosphere sphere is slightly larger than the planet. I imagine you could keep tweaking to get something pretty close to what you're after.
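    Roughly what the graph is doing, written out as code (the property names here are made up):

    Code (csharp):
    // "Glow minus border" fresnel: a broad soft falloff with a thin bright rim subtracted out.
    // _BorderPower is typically higher (tighter rim) than _GlowPower (wider falloff).
    float border = pow(1.0 - saturate(dot(normalWS, viewDirWS)), _BorderPower);
    float glow   = pow(1.0 - saturate(dot(normalWS, viewDirWS)), _GlowPower);
    float atmosphereAlpha = saturate(glow - border) * _Transparency;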

    planet-shader-1.jpg

    planet-shader-2.jpg

    planet-shader-3.jpg

    planet-shader-4.jpg
     
  14. JLW

    JLW

    Joined:
    Jul 5, 2018
    Posts:
    1
    Hi, I know this post is a few months old, but would someone post a pic of how they set up the Custom Function node for getting the light direction? I'm getting an initialization error.

    Custom function code to get main light direction:
    #if SHADERGRAPH_PREVIEW
    Direction = half3(0.5, 0.5, 0);
    #else
    Light light = GetMainLight();
    Direction = light.direction;
    #endif

    Thanks
     
  15. fct509

    fct509

    Joined:
    Aug 15, 2018
    Posts:
    108
    Can someone tell me how to convert "TRANSFORM_TEX(v.texcoord, _MainTex)" over to Shader Graph? More specifically, I'm converting a shader that calls TRANSFORM_TEX([procedurally generated values], _MainTex), where the values are partially based on v.texcoord; it then does even more processing to that result before sampling a different texture.

    I'm working on converting an old Unity 5.x project over to HDRP, and I was lucky enough to be able to convert most things over without too much trouble. Yet, this one item, well, people write some crazy code to create realistic water. Only, this water is also controlled via scripts, so I can't just swap it out for another shader unless I match all the properties and their functions.

    I would start a new thread, but it looks like this is something that needed to be figured out for the conversion of the planet shader.

    Edit:
    Well, it took a bit, but I figured it out. I forgot to create the _ST properties for my textures. TRANSFORM_TEX takes the _MainTex (or whichever second argument is passed in) and appends "_ST" to the name in order to get the _MainTex_ST vector4 (or whichever texture's _ST you're using). So what I really wanted was (_MainTex_ST.xy * coords.xy) + _MainTex_ST.zw. Wow, I know I haven't written any custom shaders for Unity's standard render pipeline these last two years, but I do feel a bit embarrassed with how long it took me to remember the _ST properties.
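    For anyone else who hits this, the macro is just a scale-and-offset, so the Shader Graph equivalent is straightforward (sketch):

    Code (csharp):
    // TRANSFORM_TEX(tex, name) expands to roughly:  tex.xy * name##_ST.xy + name##_ST.zw
    float2 uv = v.texcoord.xy * _MainTex_ST.xy + _MainTex_ST.zw;
    // In Shader Graph, a Tiling And Offset node fed with the _ST.xy and _ST.zw values does the same thing.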
     
    Last edited: Oct 22, 2020