
Shader Forge - A visual, node-based shader editor

Discussion in 'Assets and Asset Store' started by Acegikmo, Jan 11, 2014.

  1. Cascho01

    Joined:
    Mar 19, 2010
    Posts:
    1,347
    Hi,

    I would like to compare Skyshop/Marmoset against Shader Forge concerning ambient lighting.
    Is Shader Forge capable of producing the same ambient lighting quality as Skyshop?
    Can Skyshop do more?

    Thanks!
    Carsten
     
  2. Acegikmo

    Joined:
    Jun 23, 2011
    Posts:
    1,294
    It all depends on how your textures look!
    If your textures are very simply colored, you can put each texture in a channel, and then multiply the R channel by a color, or lerp between two colors using the R channel as a T value, and so forth :)
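    Something like this, in raw code terms (just a sketch; the texture and color parameters are made up for illustration):

    Code (csharp):
    // Sketch: one grayscale mask per channel of a packed texture.
    float3 TintPackedChannel(sampler2D packedTex, float2 uv, float3 colorA, float3 colorB)
    {
        float4 masks = tex2D(packedTex, uv);
        // Multiply variant: masks.r * colorA
        // Lerp variant: the R channel drives the blend between two flat colors.
        return lerp(colorA, colorB, masks.r);
    }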


    They can both do pretty ambient lighting!
    However, if you're talking IBL ambient lighting, you'll still need a pretty cubemap to make it without seams, which Skyshop is very good at generating.
    So, one doesn't replace the other, they complement each other - you can use Skyshop cubemaps in Shader Forge, plus more integrated support coming up soon!

    With SF you can of course customize your ambient light to be based on whatever you want ;)
     
  3. Cascho01

    Joined:
    Mar 19, 2010
    Posts:
    1,347
    Yes, I meant IBL. And I will probably use my own cubemaps. What about Shader Forge and HDR cubemaps for IBL?
     
  4. Murgilod

    Joined:
    Nov 12, 2013
    Posts:
    10,084
    That's mostly what I've been doing for my monochromatic stuff, but right now I'm trying to pack materials a bit for a terrain engine replacement thing since Unity's terrain isn't really fitting my needs.
     
  5. Acegikmo

    Joined:
    Jun 23, 2011
    Posts:
    1,294
    That works as well!
    You just need to unpack them, which you can do by multiplying the RGB with the Alpha * 8. (Assuming 8 is the RGBM packing constant)
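    In code terms, that unpack is just this (a minimal sketch, assuming the 8x RGBM constant):

    Code (csharp):
    // Decode an RGBM-packed HDR color: RGB scaled by the
    // range multiplier stored in alpha (packing constant 8 assumed).
    float3 DecodeRGBM8(float4 rgbm)
    {
        return rgbm.rgb * (rgbm.a * 8.0);
    }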

    Also, if you're using your own cubemaps, and you want to use MIP blurring, make sure the MIPs are seamless. (Skyshop does this)
     
  6. Filto

    Joined:
    Mar 15, 2009
    Posts:
    713
    Hi. Seems like a great tool!

    I looked at your custom Blinn-Phong tutorial, and at the beginning you create a shader with just color and light attenuation that can be used for local ambient lighting. I have always wanted a point light source for local ambient lighting (as you have in 3ds Max, for instance), which would be extremely useful for creating richer lighting in a quick, simple way. So my question is this: can you somehow tag the light source and thereby decide which parts of the shader should be calculated? For instance, if the light source is tagged as ambient, it disregards the diffuse calculations. This might be a ridiculous thing to ask, but my basic understanding of how shaders work isn't the best :)
     
  7. Acegikmo

    Joined:
    Jun 23, 2011
    Posts:
    1,294
    The per-light information you get is very limited; however, you can use the alpha channel of the light color :)
    Read from the alpha channel, and interpret the light as an ambient light if the alpha is 1, and as a regular light if the alpha is 0.
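    A minimal sketch of that, assuming a forward-lit vert/frag shader where _LightColor0 holds the current light's color:

    Code (csharp):
    // Treat the light as ambient when its color alpha is 1, regular when 0.
    // regularTerm / ambientTerm are whatever your shader computes for each case.
    float3 ApplyAmbientFlaggedLight(float3 regularTerm, float3 ambientTerm)
    {
        return lerp(regularTerm, ambientTerm, _LightColor0.a);
    }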
     
  8. JoeW97

    Joined:
    Nov 3, 2013
    Posts:
    56
    Hi

    If I take the separate r,g,b outputs from a texture to process them individually, how do I combine them back together to provide an input to Diffuse?

    Thanks
     
  9. Acegikmo

    Joined:
    Jun 23, 2011
    Posts:
    1,294
    Depends on how you want to combine them! But I presume you want to use the append node :)

    (Hold Q and click to create an append node)
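    In code terms, append just builds a vector out of its inputs, e.g.:

    Code (csharp):
    // What two chained append nodes compile down to, roughly:
    float3 recombined = float3(processedR, processedG, processedB);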
     
  10. Murgilod

    Joined:
    Nov 12, 2013
    Posts:
    10,084
    I did a quick test since I've never had to explicitly do this, but you should be able to do this with a pair of append nodes.



    For instance, here's a test structure I whipped together. As you can see, when I used append to join the separated channels (that I inverted, for the sake of example) I was able to get the same result as if I inverted the RGB channel outright. I admit, there's probably a better way to do this, but this is at least functional.
     
  11. JoeW97

    Joined:
    Nov 3, 2013
    Posts:
    56
    Thanks, that seems like a usable way :) Surprised there isn't a 3-way combine node (or 4, with alpha)
     
  12. imaginaryhuman

    Joined:
    Mar 21, 2010
    Posts:
    5,834
    I'm trying to figure out how to efficiently pack a large number into an RGBA 8-bit per channel texture, and then reconstruct it in the shader. I've done it where I break the number down into its 8-bit parts and like /255, or /65535 etc and then in the shader multiply the components and add the results together etc, but it seems like a lot of instructions. I'd love to have float textures but not on Unity free. Any suggestions for an optimal way to unpack a large number from multiple components and treat it as a single number? It can be integer or float, just needs to have a pretty decent range like at least 0-65000 or so.

    I know I can append or mix channels but each is still treated as an individual number within a vector. Any such thing as a vector3 to float converter?
     
  13. Acegikmo

    Joined:
    Jun 23, 2011
    Posts:
    1,294
  14. Acegikmo

    Joined:
    Jun 23, 2011
    Posts:
    1,294
    Might be possible, but it sounds like you'd get precision issues when using compressed textures. Something like this, perhaps?

    R * 2^24 + G * 2^16 + B * 2^8 + A
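    In shader code, that reconstruction would look roughly like this sketch (note a 32-bit float only holds about 24 bits of integer precision, so the lowest bits will suffer):

    Code (csharp):
    // Rebuild a large number from four 8-bit channels.
    // tex2D returns 0..1 per channel, so scale back to 0..255 bytes first.
    float DecodePackedValue(float4 packed)
    {
        float4 bytes = floor(packed * 255.0 + 0.5); // back to 0..255 integers
        return bytes.r * 16777216.0  // 2^24
             + bytes.g * 65536.0     // 2^16
             + bytes.b * 256.0       // 2^8
             + bytes.a;
    }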
     
    Last edited: Feb 8, 2014
  15. hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Now it's out in the wild, what kind of Marmoset integration does it have, and importantly, what speed/optimisations do you have planned? :) There's an awful lot going on in frag...
     
  16. Acegikmo

    Joined:
    Jun 23, 2011
    Posts:
    1,294
    The Skyshop support is coming soon, sometime within the next two/three weeks I would guess!

    As for vert/frag optimizations - it's not that high of a priority, but it should be done at some point. I'm not sure when though
     
  17. Murgilod

    Joined:
    Nov 12, 2013
    Posts:
    10,084
    What if I append the RGB values of a texture together, use the alpha channel data to define where a character's clothes are, and then tint them with a colour node? It seems like this would be an easy way to implement outfit customisation without eating up a texture lookup.
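    Expressed as code, what I have in mind is roughly this (a sketch, with _TintColor as a hypothetical material property):

    Code (csharp):
    // Alpha marks the clothing area; tint only that area.
    float4 texSample = tex2D(_MainTex, uv);
    float3 tinted = texSample.rgb * _TintColor.rgb;
    float3 result = lerp(texSample.rgb, tinted, texSample.a);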
     
  18. Acegikmo

    Joined:
    Jun 23, 2011
    Posts:
    1,294
  19. Murgilod

    Joined:
    Nov 12, 2013
    Posts:
    10,084
    Excellent, I figured as much! I guess I'll have to set up a simple Photoshop script for that then.
     
  20. imaginaryhuman

    Joined:
    Mar 21, 2010
    Posts:
    5,834
    Yah that's pretty much what I'm doing already. Just hoped maybe there was a swizzle or something interesting that could do it with less calculation.
     
  21. imaginaryhuman

    Joined:
    Mar 21, 2010
    Posts:
    5,834
    But then how would you give the object alpha transparency around the edges at the same time as having tinted clothing?
     
  22. Murgilod

    Joined:
    Nov 12, 2013
    Posts:
    10,084
    Well, this is for 3D models and I'm working with almost exclusively flat colours/shading, so that's kinda a non-issue.
     
  23. Lockestone

    Joined:
    Feb 8, 2014
    Posts:
    3
    Sorry if this has been asked before (had a quick thread search and couldn't see it), but does this support real-time shadows on top of lightmapping? When I flick the 'support lightmapping' option in the lighting settings of the shader, real-time shadows are no longer rendered on lightmapped objects.
     
  24. Acegikmo

    Joined:
    Jun 23, 2011
    Posts:
    1,294
    That's currently not supported. Do Unity's own shaders support real-time shadows + baked? I haven't checked in a long time.
     
  25. Acegikmo

    Joined:
    Jun 23, 2011
    Posts:
    1,294
    Since it's Saturday, I'm adding some unnecessary features after having done a code cleanup sweep :)
    Hopefully this makes the menu less poppy and more tied together:

     
  26. Lockestone

    Joined:
    Feb 8, 2014
    Posts:
    3
    Ah, darn. Yeah, in forward rendering, without the switching between far and near maps to facilitate close-up real-time shadows like in deferred, shadows will get drawn onto lightmapped objects with the standard diffuse/diff+norm etc. shaders.
     
  27. Acegikmo

    Joined:
    Jun 23, 2011
    Posts:
    1,294
    Cool, I'll have to sort that out then :)
    It should be possible in SF too; it's just a miss on my end.

    Also, since this got to the bottom of the previous page:

     
  28. donzen

    Joined:
    Oct 24, 2009
    Posts:
    54
    Yes it does.
     
  29. Murgilod

    Joined:
    Nov 12, 2013
    Posts:
    10,084
    Last edited: Feb 10, 2014
  30. Acegikmo

    Joined:
    Jun 23, 2011
    Posts:
    1,294
    I'm not sure how possible it is to do that. It assumes you can iterate through edges, which, as far as I know, you can't do unless you're in a DX11 geometry shader.

    Shaders are super fast to calculate on the GPU, but they are a bit oblivious to their surroundings. They calculate, in rough terms, like this:

    Code (csharp):
    for each triangle {
        for each vertex {
            VertexShader();
        }
        for each pixel inside {
            FragmentShader();
        }
    }
    All without knowing anything about the adjacent triangles. So, for instance, I don't think you can detect whether or not an edge is hard/soft, simply because you need to know about neighboring triangles to tell. The vertices don't know about each other either, so it's a bit of a mess.
     
  31. bac9-flcl

    Joined:
    Dec 5, 2012
    Posts:
    829
    It's possible, but I think you need to use a custom lightmapping shader with manually assigned .exr maps grabbed from the level folder. Goes like this:

    Code (csharp):
    Shader "Lightmapped Lit"
    {
        Properties
        {
            _MainTex      ("Diffuse", 2D)          = "white" {}
            _Color        ("Main Color", Color)    = (1,1,1,0.5)
            _AmbientColor ("Ambient Color", Color) = (1,1,1,0.5)
            _LightmapNew  ("Lightmap", 2D)         = "white" {}
        }

        Category
        {
            SubShader
            {
                ZWrite On
                Tags { "RenderType"="Opaque" }
                LOD 400
                CGPROGRAM
                    #pragma target 3.0
                    #pragma multi_compile_builtin
                    #pragma surface surf BlinnPhong nolightmap vertex:vertLocal

                    sampler2D _MainTex;

                    sampler2D _LightmapNew;
                    float4 _LightmapNew_ST;

                    fixed4 _Color;
                    fixed4 _AmbientColor;

                    struct Input
                    {
                        float2 lmUV;
                        float2 mainUV;
                    };

                    void vertLocal (inout appdata_full v, out Input o)
                    {
                        o.mainUV = v.texcoord.xy;
                        o.lmUV = v.texcoord1.xy * _LightmapNew_ST.xy + _LightmapNew_ST.zw;
                    }

                    void surf (Input IN, inout SurfaceOutput o)
                    {
                        half4 main_color = tex2D (_MainTex, IN.mainUV);
                        main_color.rgb *= DecodeLightmap(tex2D(_LightmapNew, IN.lmUV));

                        half4 c  = main_color * _Color * 2; // * 2 used so that neutral color would be 128,128,128 and coloration would be possible without darkening
                        o.Albedo = c.rgb / _AmbientColor.rgb; // You can use (0.5 * UNITY_LIGHTMODEL_AMBIENT.rgb), but it's buggy and not always returning the proper color from the render settings
                    }
                ENDCG
            }
        }
        FallBack "Diffuse"
    }
    With that shader you can, for example, bake a piece of modular geometry once and reuse that lightmap across multiple levels.

    There are a few important things:

    - DecodeLightmap instead of the usual sampling (the lightmap is 32-bit, and nothing will be properly interpreted otherwise)
    - The lightmap file you've extracted from a level should be marked for import as Lightmap
    - You have to cancel out the ambient yourself, because the noambient flag will not work for this particular case and because UNITY_LIGHTMODEL_AMBIENT very often returns incorrect values.
    - Static batching very often introduces a floating-point error into the texture offset, which is not a problem for traditional textures but screws lightmaps royally, as even a 0.000625 offset error kills proper alignment for lightmaps stretched over any large objects. Either disable static batching or don't use the static flag on objects with that shader.

    Actually, DecodeLightmap would be a nice node addition.
     
    Last edited: Feb 10, 2014
  32. Murgilod

    Joined:
    Nov 12, 2013
    Posts:
    10,084
    I figured that was the case, but I just wanted to make sure. I can think of a way to fake it, but the overhead would be too high to use it for any real amount of polygons.
     
  33. mandydark2

    Joined:
    Aug 13, 2012
    Posts:
    17
    Hello again, friend!
    I liked the last update and the possibility of having the SceneColor, SceneDepth and BlendDepth nodes.

    If possible, I would like you to post an example here of how to achieve real-time reflection without using cubemaps; I just want a surface with reflection, like a polished floor.

    Thanks
     
  34. Acegikmo

    Joined:
    Jun 23, 2011
    Posts:
    1,294
    Currently you can't do that, as you can't pass matrix properties into SF shaders. It is a planned feature, but it's taking a while!
     
  35. mandydark2

    Joined:
    Aug 13, 2012
    Posts:
    17
    Then a shader that simulates water can't be built, because it would be obvious that the water doesn't reflect the objects floating on its surface.
     
  36. Murgilod

    Joined:
    Nov 12, 2013
    Posts:
    10,084
    Okay, if I can't check what faces are adjacent, can I check what pixels are adjacent? Like, could I take a texture that paints each face as either Red, Green, or Blue like this:



    So that none of the adjacent faces have the same colour, then use that to calculate the line? It's an additional texture lookup, but I'm really trying hard to not have to compromise on my aesthetic.
     
  37. Acegikmo

    Joined:
    Jun 23, 2011
    Posts:
    1,294
    I'm not sure how that would help. If you want to detect borders between triangles, you still can't do it with just a fragment shader, unless you bake it using vertex colors, in some way. You'd either need to use DX11 geometry shaders or post processing effects
     
  38. Murgilod

    Joined:
    Nov 12, 2013
    Posts:
    10,084
    Eugh, I figured as much, but it was worth a shot. My development and release platform is OSX and I'm using Unity Free, so DX11 and Post Processing aren't viable options, unfortunately.
     
  39. Acegikmo

    Joined:
    Jun 23, 2011
    Posts:
    1,294
    Another way of getting proper outlines would be to smooth-shade your entire mesh and bake the hard edges into a normal map instead of having them in the geometry. That way you won't get any splits in your mesh.
     
    Murgilod likes this.
  40. bac9-flcl

    Joined:
    Dec 5, 2012
    Posts:
    829
    Seconded, I would definitely recommend that. This thread has a lot of valuable advice on the subject, by the way:
    http://www.polycount.com/forum/showthread.php?t=107196 (some other sticky threads in that section are relevant too)

    There is no downside in going full smooth.
     
  41. Murgilod

    Joined:
    Nov 12, 2013
    Posts:
    10,084
    EDIT: And then I realised I could save myself a lot of work if I just stored the 4 R channels in one texture, the 4 G channels in another, and the 4 B channels in a third.


    PREVIOUS STUPID IDEA:
    So I think I've found a way to store 4 radically different textures in 3 lookups.

    1. Use RGBA to store 4 modified greyscale versions of an image in a single texture.
    2. In the second and third texture RGBA slots, store 4 hue(r1,b1,r2,b2) and 4 saturation(g1,a1,g2,a2) values.
    3. Convert and combine the HSL data to RGB data.
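    For reference, the conversion in step 3 would lean on a standard hue-to-RGB helper, roughly like this sketch:

    Code (csharp):
    // Cheap standard hue (0..1) to RGB conversion.
    float3 HueToRGB(float h)
    {
        float r = abs(h * 6.0 - 3.0) - 1.0;
        float g = 2.0 - abs(h * 6.0 - 2.0);
        float b = 2.0 - abs(h * 6.0 - 4.0);
        return saturate(float3(r, g, b));
    }

    // HSV-style reconstruction (close enough to the HSL idea above) from a
    // grayscale value v, hue h and saturation s, each read from a packed channel.
    float3 ReconstructColor(float v, float h, float s)
    {
        return ((HueToRGB(h) - 1.0) * s + 1.0) * v;
    }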

    Now, this has a couple of obvious problems. The first is that you'll lose a bit of colour accuracy, as you'll be going from a 360º hue range to a 256-value hue range. The second is that it seems like this will require a fair few operations. Now, is this idea completely ridiculous or am I on to something here?
     
    Last edited: Feb 11, 2014
  42. metlapig

    Joined:
    Sep 24, 2013
    Posts:
    3
    Hi,
    I'm a bit new to shader creation. I was wondering if there is a way to create a random offset in a UV, so that each time a particle is created there will be an offset in the "noise". That way all the particles don't look like each other.

    This is the shader I am creating, which allows a transparency noise (and animation) and another perturbing noise (and animation). What I'd like to do is have the perturb start in a random location in the texture UV space. I can get it to look like it should work by adding in variables and putting their input into the UV panner (so when I change the values, it looks like the texture is moving), but it doesn't seem to carry over into Unity.
     
  43. Acegikmo

    Joined:
    Jun 23, 2011
    Posts:
    1,294
    Randomness is hard in shaders.
    You should be able to do it through vertex colors, as you can pass random data from the particle system to the shader, but it looks like you're already using all channels for that.
    Can't think of a straightforward solution to it at the moment
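    For reference though, if a channel were free, the hookup would be roughly this (a sketch; vertColor is the interpolated vertex color, _NoiseTex a stand-in for your noise texture):

    Code (csharp):
    // Shuriken can write per-particle random data into vertex color;
    // a spare channel then becomes a per-particle UV offset.
    float2 offsetUV = uv + float2(vertColor.a, vertColor.a);
    float4 noise = tex2D(_NoiseTex, offsetUV);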
     
  44. imaginaryhuman

    Joined:
    Mar 21, 2010
    Posts:
    5,834
    You could store 5 textures in the space of 3 if you go with 6 bits per channel. It reduces the color resolution from 16 million to 262144, and ideally you just chop off the lower 2 bits from the color values... and maybe use a little ordered dithering to approximate subtler colors. Then you get 2 free bits per color channel, which for 3 textures is 4x2x3 = 24 bits, so you get 4 x 18-bit textures and 1 x 24-bit texture in the space of 3 RGBA textures. But I guess you can't really use a compressed texture format then, and you have to split/recombine the channels in the shader.
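    The split would go roughly like this sketch, using float math only (DX9-level shaders have no bit operations):

    Code (csharp):
    // Split an 8-bit channel into its top 6 bits (the color part)
    // and the 2 spare bits.
    float byteVal = floor(channelSample * 255.0 + 0.5); // 0..255 integer
    float spare2  = fmod(byteVal, 4.0);                 // low 2 bits, 0..3
    float color6  = (byteVal - spare2) / 252.0;         // top 6 bits, rescaled to 0..1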
     
  45. imaginaryhuman

    Joined:
    Mar 21, 2010
    Posts:
    5,834
    Pseudo-random number generators use various operations on numbers to generate a random output. Various modulos and stuff. So you could implement a pseudo-random generator based on the fragment's x/y coords as the `seed`, then feed it into the UV offset.
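    The classic cheap one is a sine hash, e.g. (a well-known snippet; fine for visual noise, not for anything statistical):

    Code (csharp):
    // Pseudo-random 0..1 value from a 2D seed (e.g. fragment UV or position).
    float Rand(float2 seed)
    {
        return frac(sin(dot(seed, float2(12.9898, 78.233))) * 43758.5453);
    }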
     
  46. Acegikmo

    Joined:
    Jun 23, 2011
    Posts:
    1,294
    I think he wanted it per-particle though, rather than per-fragment
     
  47. imaginaryhuman

    Joined:
    Mar 21, 2010
    Posts:
    5,834
    Oh. I guess, like you said, the vertices need to be fed an index or something?
     
  48. Murgilod

    Joined:
    Nov 12, 2013
    Posts:
    10,084
    Yeah, it's a lot of splitting and recombining, and likely an optimisation that wouldn't offer much benefit except on really specific platforms.
     
  49. voxi

    Joined:
    Dec 3, 2013
    Posts:
    60
    Hello,

    I can mask vertex displacement in DX9 with vertex colors. When I try to do the same thing with a texture, my example mesh goes pink, like there is an error.

    I have a feeling that masking vertex offset with a UV-mapped texture is not supported; am I right about this? Did I forget to multiply another value when I am using a texture?

    Thanks in advance, sorry if this is a stupid question :/
     
  50. Acegikmo

    Joined:
    Jun 23, 2011
    Posts:
    1,294
    It should work; how does your node tree look?