Shader Forge - A visual, node-based shader editor

Discussion in 'Assets and Asset Store' started by Acegikmo, Jan 11, 2014.

  1. GMM

    GMM

    Joined:
    Sep 24, 2012
    Posts:
    301
Thanks, I will try that out. Is there a good way to log what information is being processed by the shader (e.g. returning view direction values)? It would make it a bit easier to figure out the data my shader is processing.
     
  2. brook0907

    brook0907

    Joined:
    May 27, 2013
    Posts:
    2
Please make that Spherical Area Lights tutorial, it looks super cool. I was trying it but haven't figured it out yet.

    I just got shader forge and started playing with it.
    It's really good for fast prototyping. Thanks.
     
  3. ok8596

    ok8596

    Joined:
    Feb 21, 2014
    Posts:
    40
    @Marco Sperling
Wow, thanks for your fast reply :)
I got this shader working using your advice.
    http://i.imgur.com/gijZzQL.png

    But I found another problem like this.
    http://i.imgur.com/vEj3fir.png
Both the object and the white block use the same shader. The white block receives shadows on its top and side, but the side shadow is not correct; I think the top shadow is appearing on the side. Is it possible to avoid casting shadows onto the reverse side of the plane?
     
  4. Marco-Sperling

    Marco-Sperling

    Joined:
    Mar 5, 2012
    Posts:
    620
    First thing I would try is to set your LUT texture asset to "Clamp". This way your UVs never exceed the 0-1 range.
Next thing you could investigate is your light source's shadow bias value. Unity's shadow system does not behave well across a wide range of scene scales. You either get peter panning or shadow acne, sometimes both, if your scene uses very small or very large assets (compared to a default cube of 1 unit).
     
  5. ok8596

    ok8596

    Joined:
    Feb 21, 2014
    Posts:
    40
Great. I solved it by lowering the shadow bias value. (I couldn't connect a Clamp node to the texture asset node...)
    Thanks for your teaching.
     
  6. Marco-Sperling

    Marco-Sperling

    Joined:
    Mar 5, 2012
    Posts:
    620
    Glad the shadow thing solved your problem.

    Oh and I didn't mean a clamp node inside the shader graph. What I meant was the texture asset itself in your project view. Find the texture there and change its settings to something like this:
    $LUT_clamp.jpg

This (Wrap Mode: Clamp) should be done for textures that are used for "looking up values" inside a shader, like you are doing with the ramp texture (which is in fact a kind of lookup texture, or LUT). Again: this applies when the UV coordinates used to look up the values are derived from shader calculations that could exceed the 0-1 range, and repeating the LUT would give unwanted graphical results.
This is by no means an absolute rule. You might find yourself in a situation where you actually want your lookup texture (LUT) to repeat when the UV coordinates go past the 0-1 range.
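To make the Clamp-vs-Repeat difference concrete, here is a minimal Python sketch of how a 1D lookup might behave under each wrap mode. The LUT is just a toy list with nearest-sample lookup; `sample_lut` is a hypothetical helper, not anything from Unity or Shader Forge.

```python
# Hypothetical sketch of sampling a 1D lookup texture (LUT) under the two
# wrap modes described above. 'u' is the UV coordinate produced by shader
# math, which may leave the 0-1 range.

def sample_lut(lut, u, wrap_mode="clamp"):
    """Return the LUT entry for coordinate u (nearest sample)."""
    if wrap_mode == "clamp":
        u = min(max(u, 0.0), 1.0)      # Wrap Mode: Clamp -> coordinate pinned to 0-1
    else:  # "repeat"
        u = u % 1.0                    # Wrap Mode: Repeat -> coordinate wraps around
    index = min(int(u * len(lut)), len(lut) - 1)
    return lut[index]

ramp = [0.0, 0.25, 0.5, 0.75, 1.0]     # a toy 5-texel ramp texture

# A lighting term that exceeds 1.0: Clamp keeps the bright end of the ramp,
# Repeat wraps back to the dark end and causes the kind of artifact above.
print(sample_lut(ramp, 1.2, "clamp"))   # stays at the bright end: 1.0
print(sample_lut(ramp, 1.2, "repeat"))  # wraps to the dark end: 0.0
```
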
     
  7. hyogast

    hyogast

    Joined:
    Feb 24, 2014
    Posts:
    7
    Is there some node like the DepthBiasedAlpha from UDK?

I'm doing a water shader, and when I did it in UDK, I used the DepthBiasedAlpha node to smooth the parts where the "water" touches a mesh. It kind of blurs those parts.

    Any idea?

    Cheers!
     
  8. Acegikmo

    Acegikmo

    Joined:
    Jun 23, 2011
    Posts:
    1,294
    Thanks Marco Sperling for helping people in here :)

    http://acegikmo.com/shaderforge/nodes/?search=DepthBlend
     
  9. marggob

    marggob

    Joined:
    Feb 24, 2013
    Posts:
    65
Can I use object-space normal maps? I need to transfer information from a texture as world-space vectors.
     
  10. Acegikmo

    Acegikmo

    Joined:
    Jun 23, 2011
    Posts:
    1,294
    Sure, you can use the transform node to transform from/to tangent space :)
     
  11. drewski58

    drewski58

    Joined:
    Jul 3, 2012
    Posts:
    15
    First of all, kudos for such an awesome product, especially for those of us who aren't coders.

I do have a question about alpha/alpha clipping. I have a graffiti texture on a wall that has a corresponding grayscale map with black as transparent. I've sort of managed to get this to work by putting the grayscale map into a 2D texture node and then connecting any of the R, G, or B channels to Alpha Clip. The problem is that certain things in the graffiti texture aren't drawing, especially those that are close to white.

    I realize I'm not doing this correctly, so any help on how to create this kind of effect would be greatly appreciated.

    Thanks!
     
  12. Acegikmo

    Acegikmo

    Joined:
    Jun 23, 2011
    Posts:
    1,294
    Could you take a screenshot of the node tree?
     
  13. djweinbaum

    djweinbaum

    Joined:
    Nov 3, 2013
    Posts:
    533
Perhaps I'm doing something wrong, or there is something peculiar in my setup, but attaching a flat normal color (128, 128, 255) into the normals plug drastically changes the look of my preview ball. While the full complexity of normal maps is well above my understanding, I'm fairly certain that a flat normal color should yield results identical to diffuse-only if it's shading correctly. To be certain I don't have some funky setting, I created a fresh new project and imported a fresh new Shader Forge. When I add a normal and compile, I get these strange artifacts at the poles of my ball. My ball looks fine when there's no normal attached.

    $questionable_normals.jpg
     
  14. Acegikmo

    Acegikmo

    Joined:
    Jun 23, 2011
    Posts:
    1,294
That's because you're inserting a packed normal (0 to 1 range). When normal maps are passed into the shader, they are unpacked to a -1 to 1 range, so a flat normal is (0, 0, 1), not (0.5, 0.5, 1) :)
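The packed-vs-unpacked convention is just a remap of each channel; a tiny Python sketch of the arithmetic:

```python
# A normal map stores components packed into the 0..1 range (128/255 ~ 0.5);
# the shader unpacks them to -1..1 before lighting, so the "flat" normal
# color (0.5, 0.5, 1.0) becomes the vector (0, 0, 1) pointing straight out.

def unpack_normal(packed):
    """Map a packed 0..1 normal-map color to a -1..1 direction vector."""
    return tuple(2.0 * c - 1.0 for c in packed)

flat_color = (0.5, 0.5, 1.0)           # the (128, 128, 255) flat-normal color
print(unpack_normal(flat_color))       # (0.0, 0.0, 1.0)
```
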
     
  15. djweinbaum

    djweinbaum

    Joined:
    Nov 3, 2013
    Posts:
    533
    Woah, I had no idea. I figured I must have been doing something wrong. That explains it. Sorry to trouble you. I've been packin' my normals since I was a small baby.
     
  16. drewski58

    drewski58

    Joined:
    Jul 3, 2012
    Posts:
    15
    I have two examples of this effect... the first one is okay, although you can see that the edges are completely hard.
    $graffiti example 1a.png $graffiti example 2a.png

    It seems this works only because the texture has an absolute black edge.

The second example has parts of the graffiti that won't display, which I believe is due to the more varied grayscale image I'm using for the alpha clip. $graffiti2 example 2a.png
    This shows the graffiti with no shader. Now with the same setup as before here's what I get. $graffiti2 example 1a.png $graffiti2 example 3a.png
    Note that the effect is the same no matter which channel I connect to alpha clip.

    Again, I know this has got to be the wrong way to do this, so any help would be greatly appreciated. What I would ultimately like to achieve is a soft edge fall-off based on the gray-scale, if this is possible.

    Thanks!
     
  17. Acegikmo

    Acegikmo

    Joined:
    Jun 23, 2011
    Posts:
    1,294
    Soft falloff is possible if you plug it into Alpha instead of Alpha clip, but you may get sorting issues in that case
     
  18. drewski58

    drewski58

    Joined:
    Jul 3, 2012
    Posts:
    15
    I have played around with the Alpha channel, but I'm still wondering if there is a better way to set up the alpha clips than what I'm currently doing.
     
  19. Acegikmo

    Acegikmo

    Joined:
    Jun 23, 2011
    Posts:
    1,294
  20. Murgilod

    Murgilod

    Joined:
    Nov 12, 2013
    Posts:
    10,083
  21. P1st3ll1

    P1st3ll1

    Joined:
    Dec 13, 2012
    Posts:
    69
    Hey guys !

    I'm having some problems with my shader...

I made a comparison of what the mesh should look like versus what it looks like with my shader...

    $Shaders.png

    And here's my node tree (it's a mess, I know XD )

    $sf_testshader_2272014.png

    Help me, please X(
     
  22. P1st3ll1

    P1st3ll1

    Joined:
    Dec 13, 2012
    Posts:
    69
Never mind... I fixed it... The blending mode was set to Alpha Blend when it should have been Off.
     
  23. helscar

    helscar

    Joined:
    Nov 2, 2010
    Posts:
    9
Hi,
first I need to say that this tool is awesome, and thanks for your work!
I'm using it to make a PBL metallic surface, and I've encountered a weird bug where objects don't receive shadows, but shadows are cast behind the mesh. Everything works fine with a standard material. I'm in forward rendering with blend mode off and auto sort.

    Do you have any ideas ?
     

    Attached Files:

    Last edited: Feb 27, 2014
  24. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
So I had a little time tonight and decided to stress test Shader Forge a bit. My current work doesn't require anything too fancy, but I wanted to see how it did under some load. So, I wrote a 2-octave value noise with an animated domain warp, no textures, all procedural. I've attached an image of the node graph as well as a copy of the shader file if anyone wants to play with it. Ideally, the node graph that produces a single octave of value noise would be a single node with inputs for UV, frequency, and amplitude. As you'll notice, the quality of the effect far exceeds what you can produce using a texture (due to precision lost to bit depth and resolution).
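This is not jbooth's actual graph (that's in the attachment), but the general technique he names can be sketched in a few lines: value noise interpolates random lattice values with a smoothstep fade, octaves are summed at doubling frequency, and the domain warp offsets the lookup coordinates by another noise sample. The hash function here is an illustrative sin-based one, not anything from Shader Forge.

```python
import math

# Sketch of 2-octave value noise with an animated domain warp.

def hash1(ix, iy):
    """Deterministic pseudo-random value in [0, 1) for a lattice point."""
    h = math.sin(ix * 127.1 + iy * 311.7) * 43758.5453
    return h - math.floor(h)

def value_noise(x, y):
    """Bilinearly interpolated lattice noise with smoothstep weighting."""
    ix, iy = math.floor(x), math.floor(y)
    fx, fy = x - ix, y - iy
    fx = fx * fx * (3 - 2 * fx)   # smoothstep fade
    fy = fy * fy * (3 - 2 * fy)
    a, b = hash1(ix, iy), hash1(ix + 1, iy)
    c, d = hash1(ix, iy + 1), hash1(ix + 1, iy + 1)
    return (a * (1 - fx) + b * fx) * (1 - fy) + (c * (1 - fx) + d * fx) * fy

def warped_noise(x, y, t=0.0):
    """Two octaves of value noise sampled through an animated domain warp."""
    wx = x + value_noise(x + t, y)        # warp offset, animated by time t
    wy = y + value_noise(x, y + t)
    return (value_noise(wx, wy)           # octave 1: base frequency
            + 0.5 * value_noise(2 * wx, 2 * wy)) / 1.5  # octave 2

print(warped_noise(0.3, 0.7, t=1.0))
```

Because everything is computed analytically, the result has none of the bit-depth or resolution limits of a baked noise texture, which is the point jbooth makes above.
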

    Some UI feedback:

- It would be really nice if the graph window would scroll when dragging a connection near the edge; that way you could create far connections without having to drag two nodes to a common location.
- The current connection curves get messier than I'd like on a large graph, simply because they create a wide arc when going backwards. They look great on small graphs, though.
    - Multiple select and comment areas are really needed for managing this level of complexity
    - I'd really like a Vector2 and Vector3 type for parameters; only having float or vector4 means I'm adding extra controls that don't do anything, or having multiple lines for what I'd like to be one line.
    - This whole shader would be better done as two custom HLSL nodes, as graphing all this out is kind of insane.
- I would love it if the vector types had convenience outputs for the component parts, in the way that a texture has the R, G, and B channels available for extraction. This would reduce the need for the component mask node and make the graph a little smaller.
- I would love it if plugging a vector3 into a Vector2 input just used the first two channels of the vector3. Again, fewer component masks and a faster workflow.
    - Add/Subtract and Multiply should assume a second input of 0/0 and 1 respectively if no input is provided. This makes them act as a NOOP node, or placeholder that continues to work even when values are missing. Actually, I suggest similar values on most, if not all, inputs. The more things always produce a result the better, IMO.
    - It would be nice if a texture reference could be put into a shader without exposing it for users to adjust; I don't know if the Unity framework would easily allow this, but it's nice to be able to have fixed/private textures in the graph in the same way that you have constants and user-values.

    When you have this many nodes on the canvas, every interaction gets really slow, even if auto-compile is turned off. This leads me to believe that it's doing something to generate previews (or some other operation which involves touching every node) after each interaction.

    Additionally, the raw compile time for the shader graph is very slow compared to some other ones I've used or built, and the current previews do not animate. Some of this might be in Unity's domain, as I know little about what compiler they use, but I also suspect a fair bit comes from generating previews for the nodes. When you have this many nodes, a large cut/paste can take 20 seconds and changing a value can take 1-2.

    One way to speed up previews (and get them animating) is to generate extra code in the shader for the preview system which gets conditionally compiled out when not being used to render previews. When rendering previews, set a shader constant with the node id and switch on it in the shader, something like:

    float node1 = 1.0;
    float4 node2 = float4(1,1,0,0);
    if (_node_number == 1)
    o.diffuse = node1.xxxx;
    else if (_node_number == 2)
    o.diffuse = node2.rgba;

The main advantage here is that you can generate and compile a single shader, then render every node's preview quickly. If you only render what's on the screen, it's even faster, and if you do it at the resolution of the node on the screen, it's blazingly fast. Again, I have no idea how much time is being spent in Unity vs. Shader Forge or on previews, but it's slow enough to be disruptive even on smaller graphs, and I've noticed that changing some values doesn't cause the shader to recompile (which I'm assuming is because of the slowness).

    Anyway, that's it for now; it's so nice to have a solid shader graph in Unity, and despite the strain on compile times it held up to the abuse rather well!
     

    Attached Files:

  25. GamebookStudio

    GamebookStudio

    Joined:
    Sep 18, 2013
    Posts:
    10
    First of all, awesome tool. Saw the Alpha on polycount and now finally got it for our project :)

    Some questions:
    1.
I have a problem with converting some of my old node trees from UDK to Shader Forge. I assume Unity (or Shader Forge?) has different conventions for vector data (View Direction, Light Direction, etc.). I tried to rebuild a rim light shader like http://udn.epicgames.com/Three/MaterialExamples.html#Rim Lighting and I couldn't wrap my head around this whole Transform thing.

    some UI stuff
2. When I load a saved shader with Shader Forge, my normal map texture asset always forgets the normalmap/bump setting. So after loading it looks OK, then it compiles and is broken afterwards. Reactivating the settings and waiting for a second recompile is quite annoying.
3. After closing Shader Forge with a compiled shader, sometimes the last change (e.g. renaming a slider or property) isn't saved.
4. Deleting node names with Del isn't working, only with Backspace? E.g. clicking somewhere in the text and pressing Del, or selecting all and pressing Del.
5. When zoomed out, the Alt selection is offset, selecting more nodes outside the selected area.
     
  26. Marco-Sperling

    Marco-Sperling

    Joined:
    Mar 5, 2012
    Posts:
    620
    I second that. But I think this has been suggested at userecho (bugtracking and feature requests service used for SF development)

    I agree. But again, this has been suggested at userecho before I think.

    Custom cg nodes would be great. Please go to userecho and vote for it :)

    You can find a request on userecho for this, too. Pls vote for it - I would love to see this feature ;)

Personally, I like strict programming languages. This way you are forced to double-check your work. Lazy assumptions like these default parameters can cause headaches later on when you think your shader works (after all, you see something other than pink), but under the hood it doesn't, because you've forgotten to plug something meaningful into one or two nodes. Again, this is personal preference. Maybe adding a toggle to the graph editor could turn these defaults on or off. Or throw warnings at compile time.
     
  27. Kacer

    Kacer

    Joined:
    Oct 26, 2012
    Posts:
    1
    Thank you for this awesome tool, its really really great, and a huge improvement over unity's standard material editor :)

    I have a question though.

    I've been making my own reflective shader, and i've been stuck for a while now.

    I've got two issues, the first one is this:

    $tO5YhUE.png

The blue lines show which direction the ceiling reflection is going in, and it doesn't match up with the reflection I see on my walls.

    This one here might be related as well: http://i.imgur.com/VVViWvB.png

The white square on the floor is a reflection of the ceiling, yet the scale is all wrong again, and it stays under my camera at all times instead of acting like a real reflection.

Writing this, I'm beginning to think it's because I'm using a cubemap and not "real" reflections as such; having the white square move and show other portions of the ceiling as reflections on the floor would require me to "move around" inside the cubemap.

Lastly: for some reason my lightmaps aren't showing up on my models. I've made all the settings correct and I've got two UV channels, yet none of the lightmaps are showing up in the viewport, even when I've ticked "Lightmap support" in Shader Forge. I am getting one error, though, but it's a GUI error, so I don't know how bad it is: "Material doesn't have a texture property '_MainTex'"

    here's a screenshot of my shader: http://i.imgur.com/CE8nKm0.png

All in all, I'm somewhat confused about this, so help me please :)
     
  28. P1st3ll1

    P1st3ll1

    Joined:
    Dec 13, 2012
    Posts:
    69
    How do I "make" a Light Vector ?
     
  29. lazygunn

    lazygunn

    Joined:
    Jul 24, 2011
    Posts:
    2,749
I hope to have this very soon; I'm actually excited about it, as I think it's exactly what's needed. I cannot wait for the custom code node, that would be brilliant, and in the spirit of that I'm going to echo jbooth's insistence on having a few proper dedicated noise nodes. I'm a relative neophyte to shading, and I had the misfortune to run across shadertoy.com and now I want to raymarch everything. What's really useful for raymarching is noise; for example, say you were to write a cloud shader with this (I'm hoping something like this would be fine): fractional Brownian motion (fBm) noise is perfect for clouds, and in fact it's popular for lots of things.

    This page explains fbm: http://iquilezles.org/www/articles/warp/warp.htm
    And results can look like this: $gfx00.jpg

It's common to animate noise using time, or maybe use the date as a seed.

    And there are several other well known noise techniques on his articles page http://iquilezles.org/www/index.htm# under Procedural Content

That site has several noise functions that any shader writer needing values to dirty something up, create ripples on a water surface, or create smoke would be very grateful for. These functions aren't that hard to implement and would be really useful even without the custom code node, which is probably a tough thing to implement. I mean, look at how many nodes jbooth had to use to create the noise to test, when it could have been a lot more straightforward.

Sorry for being so intolerable haha, I only recently found out how important these things can be in shader writing and I'm now considering myself enlightened; this is such a good opportunity to have this functionality. Great work so far, though, and once I have it, I can't wait to see what I make with it.
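For readers following along, the fBm construction iq's article describes boils down to summing octaves of a basic noise at doubling frequency and halving amplitude. A minimal Python sketch under that assumption (the hash-based 1D noise here is illustrative only, not from the article):

```python
import math

# Minimal fBm sketch: octaves of smooth hash noise, each twice the
# frequency and half the amplitude of the previous one.

def noise1d(x):
    """Smoothly interpolated hash noise in [0, 1) along one dimension."""
    i, f = math.floor(x), x - math.floor(x)
    f = f * f * (3 - 2 * f)                     # smoothstep fade
    def h(n):
        v = math.sin(n * 127.1) * 43758.5453
        return v - math.floor(v)
    return h(i) * (1 - f) + h(i + 1) * f

def fbm(x, octaves=4):
    """Fractional Brownian motion: summed octaves, normalized to [0, 1)."""
    total, amplitude, frequency, norm = 0.0, 1.0, 1.0, 0.0
    for _ in range(octaves):
        total += amplitude * noise1d(x * frequency)
        norm += amplitude
        amplitude *= 0.5                        # each octave half as strong
        frequency *= 2.0                        # ...and twice as detailed
    return total / norm

print(fbm(3.7))
```

Feeding time into `x` gives the animated-noise behavior mentioned above; warping `x` by another `fbm` call gives the domain-warp effect from iq's warp article.
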
     
  30. Acegikmo

    Acegikmo

    Joined:
    Jun 23, 2011
    Posts:
    1,294
    That's strange. Have you double-checked the blending mode? Have you checked the render path in the camera itself?

    http://shaderforge.userecho.com/top...gether-unless-they-are-both-visible-onscreen/

    Yeah, I'll probably do another pass on the connection behavior/shape at some point :)

    I've been thinking about box commenting, but I think, in the end, a node-nesting feature would make box comments superfluous (But I may be wrong)

    Currently it's due to how Unity works. There is no interface for setting V2/V3 values:
    Value: http://docs.unity3d.com/Documentation/ScriptReference/Material.SetFloat.html
    Vector 4: http://docs.unity3d.com/Documentation/ScriptReference/Material.SetVector.html

    Both custom code node and nested nodes are planned :)
    http://shaderforge.userecho.com/top...des-to-encapsulate-frequently-used-functions/

    It would be nice, but I'd have to make some sort of special interface for it, such as only showing it when the cursor is nearby, etc.
Otherwise, it creates a ton of UI clutter, especially since they are colored, and I'm trying to keep it as minimalistic as possible (which is getting increasingly difficult)
    http://shaderforge.userecho.com/topic/371216-component-mask-per-node/

    Yeah this has been a bit of a mess on my end, due to designing the code in a bit of a stupid way. I may find a good way of making this work well at some point.

    Possibly - I suppose it would be easier to test/iterate. Feel free to suggest it on UserEcho :)

It's possible, yeah! Again, feel free to request it on UserEcho!

    It does! And it's a bit of a stupid system. I'm currently generating the node previews on the CPU, through C#, for every pixel, which is not ideal at all, to say the least. If you want a temporary "solution", you can click on settings, and turn off node preview rendering :)

The shader compile time is rather long, and there's not much I can do about it. It's compiling all branches for all selected platforms, as well as running an optimization pass. However, shader compilation is much faster in an upcoming release, which SF will benefit greatly from :)

    Since the shader compile time is slow as it is, this would force me to compile two shaders every time, which may end up being a bit too slow in the end. Ideally, I should have pre-compiled shaders per-node, and simply render quads on top of each other, using the previous results as RT inputs :)

    Thanks a bunch for testing it properly!



    "Making" a light vector is quite simple, just make a Vector 3 and treat it as a light vector when doing custom lighting, and you're set :)
    If you want to read from the actual light vector, you can just add the node called Light Direction.
     
  31. GrumpyCoder

    GrumpyCoder

    Joined:
    Oct 9, 2012
    Posts:
    2
Anyone having issues with the parallax node? Am I doing something wrong? $ParallaxError.JPG

    It seems like this is a bug.
     
  32. Murgilod

    Murgilod

    Joined:
    Nov 12, 2013
    Posts:
    10,083
    I tried recreating the same thing and it does the same for me. It also throws up a lot of debug errors.
     
  33. BradHerman

    BradHerman

    Joined:
    Jan 10, 2013
    Posts:
    21
I'm running into some vertex offset issues. I'm trying to do a simple height map based on world-coordinate UVs. As soon as I connect a Texture2D to anything connected to Vertex Offset, I end up with a console full of errors.

The relevant messages are below. I know that when writing shaders by hand I can use a 2D texture in the vertex shader, but for the life of me I keep having issues in Shader Forge. Did I miss something obvious, or is this a limitation of the current beta?

Shader error in 'test/heightmap': Program 'vert', function "tex2Dlod" not supported in this profile (maybe you want #pragma glsl?) at line 404


I want to use tex2Dlod; this does work with #pragma glsl.
I have seen it in several other shaders.

If I pass in a vec3, I can get the vertices to move uniformly. If I try to put a masked component into any of those three channels, it changes the type enough to break the shader.

    Any ideas?


    $sf_heightmap_03022014.png
    Here I'm throwing in the texture2D, I know it won't give me a heightfield just like this, however I didn't think it should error.

This is the Cg/HLSL code I'm trying to replicate:

Code (csharp):
    float3 worldPos = mul(_Object2World, v.vertex).xyz;
    float dist = clamp(distance(_WorldSpaceCameraPos.xyz, worldPos) / _LodFadeDist, 0.0, 1.0);
    float lod = _MaxLod * dist;
    float ht = 0.0;
    ht += tex2Dlod(_height, float4(worldPos.xz / _GridSizes.x, 0, lod)).x;
    ht += tex2Dlod(_height, float4(worldPos.xz / _GridSizes.y, 0, lod)).y;
    $sf_heightmap_03022014_Ydisp.png
    Simple Vector moves the grid up
     
  34. BradHerman

    BradHerman

    Joined:
    Jan 10, 2013
    Posts:
    21
    Turns out you need to pipe a Zero into MIP for normal and offset nodes to work correctly.
     
  35. digitalsalmonfx

    digitalsalmonfx

    Joined:
    Oct 9, 2013
    Posts:
    7
    $Tess.jpg

I'd like to use vertex offset to produce the mesh on the right, but as far as I can tell you can't use a texture asset in Vertex Offset. The alternative is to use DX11 displacement, but as shown on the left, the mesh normals don't seem to recalculate after displacement, as shown by the Fresnel diffuse.

    Any suggestions?
     
  36. Marco-Sperling

    Marco-Sperling

    Joined:
    Mar 5, 2012
    Posts:
    620
@BradHerman and digitalsalmonfx:
You can definitely use a texture asset to do the vertex offset.
That's what I did in a shader back before SF was officially released, and it still works. Make sure to untick OpenGL compilation under "Target renderers", because that profile does not support the tex2Dlod instruction required for reading a texture inside the vertex shader.

For correct normals it gets somewhat trickier. You can use the heightmap to calculate your own normals. But I don't think you can easily get that faceted look you are after.
     
    Last edited: Mar 3, 2014
  37. digitalsalmonfx

    digitalsalmonfx

    Joined:
    Oct 9, 2013
    Posts:
    7
Blimey, I didn't even see that Brad was trying to do almost the exact same thing.

You are correct that unticking OpenGL fixed the issue! Many thanks!

As for normal recalculation: perhaps using OnWillRenderObject to run a RecalculateNormals every frame :/ Seems dirty and might not even work...

Edit: FYI, using a height map to displace in 3ds Max, then baking out a normal pass from an orthographic camera, strengthened proportionally to the vertex offset, seems viable. Will report back.
     
    Last edited: Mar 3, 2014
  38. helscar

    helscar

    Joined:
    Nov 2, 2010
    Posts:
    9
Yeah, I checked. In Vertex Lit the grey object just disappears. Deferred and Forward work as I said earlier: weird shadows that go through the mesh in Forward, and no shadows in Deferred. We'll probably be in Forward in the end, and being unable to properly show shadows could be a problem.
     
  39. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
I think they are really different features. If I'm making a reusable node from a section of a graph, I'm defining inputs and outputs with understandable names, saving it to disk as an asset I can reuse, and giving it a name and menu location that make sense. In essence, I'm defining an API for my node. Box comments don't require any setup, and let you annotate and drag around sections of the graph easily. If I change something inside a box comment, it's not reflected back to some asset on disk, changing it everywhere.

    I understand your concern. Personally, I'd prefer the clutter on the nodes instead of in the graph. That said, I would consider treating the outputs of a split vector as X,Y,Z,W instead of as R,G,B,A and not coloring them. That would likely help with visual overload and also makes more conceptual sense to me since the output of a vector is not necessarily color.

Wow, so if you were to add something like a noise node, you'd have to write that function in both C# and GLSL? Moving that rendering to the GPU would definitely be high on my list if I were you, simply because it's going to become a bottleneck for integrating nodes. For instance, a user-created node (via the custom HLSL node) is going to completely break the preview system, because you won't have a C# version available.

    So basically have one shader for each node which only performs the operations of that node given the input of the previous node. Then when a connection changes, you render each new node with the result of the last. Yeah, that'd be pretty fast, especially since you could precompile the shaders for each node once and re-use them (every add node uses the same shader, etc).
     
  40. lazygunn

    lazygunn

    Joined:
    Jul 24, 2011
    Posts:
    2,749
I don't know if I just got ignored, but the post above explains it hugely, especially the reticence about having noise nodes: anything needing very quick updates, where the noise would be animating, would have to be shader-based. Sorry for sounding pushy in my previous post.

On the upside, I'll have this soon and I'm very inspired, especially with the adoption of very UDK-ish nodes to get a lot of things moved over, including remaking an ocean shader I'm working on and probably trying to move the entire UDK library over to Shader Forge, released by supplying only the Shader Forge data, so you'd need Shader Forge to use it. Should keep me busy and educate me fully on the process.

After reading the material on The Order: 1886 (here), and owning Jove, which works on implementing the same idea of layered materials, with a 'supermaterial' from which sub-materials are derived and composited, I'm wondering how possible making a similar system with Shader Forge might eventually be. Perhaps I'm asking a lot, but with the above-mentioned idea of making an API for a node, if the material UI could behave in a way where elements were added based on these 'APIs', then making something like The Order's system might be possible. Maybe eventually, anyway. I'd certainly have a go. Anyway, back to reading until money arrives, and then a busy night ahead with this.
     
  41. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
The type of system The Order uses is quite possible in a shader graph. UE4 uses this workflow for various materials (there's even a video on the net somewhere about it). The basic workflow is that you create a shader for each type of material in your game (metal, rock, etc.), then drag those materials into a new shader graph; each material in the shader graph just has outputs for all of its output values (diffuse, normal, etc.), which you can then lerp and mask like you would with any other shader node. I haven't looked under the hood of any of these systems, but I wouldn't be surprised if they just packed all of the output values into a float4x4 so that you could use the standard operators on them (lerp, add, etc.).

The systems I've seen using this approach generally use physically based lighting and deferred rendering. Using PBL helps reduce texture usage for these systems, because a given surface only needs to define roughness (greyscale) with constant values for the other specular terms, and a lot of hardware is still limited in how many textures you can use in a single shader. If your final result is composed of 5 materials, being able to stuff the roughness into a spare channel saves you 5 texture samplers versus having a full specular/power map for each surface. Deferred rendering helps because you only pay the cost of this (potentially) expensive shader once for each light that hits it.

Anyway, you could build all of this right now in Shader Forge, but it would require the shaders for each material to be baked entirely into one shader, which would not be the most efficient workflow. But the point is, it's possible to get the same result.
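The compositing jbooth outlines (each base material exposing a bundle of outputs, the graph lerping whole bundles with a mask) can be sketched numerically. This is a toy Python illustration of the idea, not any engine's API; the channel names and values are made up.

```python
# Layered-material sketch: each material is a bundle of output channels,
# and blending two materials is just a per-channel lerp driven by a mask
# (which would normally come from a texture, per pixel).

def lerp(a, b, t):
    return a + (b - a) * t

def blend_materials(mat_a, mat_b, mask):
    """Per-channel lerp between two material output bundles (dicts)."""
    return {key: lerp(mat_a[key], mat_b[key], mask) for key in mat_a}

rock  = {"albedo": 0.4, "roughness": 0.9, "metallic": 0.0}
metal = {"albedo": 0.8, "roughness": 0.3, "metallic": 1.0}

# mask = 0.0 keeps pure rock; mask = 1.0 gives pure metal; 0.5 is an even mix
print(blend_materials(rock, metal, 0.5))
```

Packing the channels into one matrix (as jbooth speculates with a float4x4) would let the same lerp operate on the whole bundle in a single instruction in a real shader.
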
     
  42. lazygunn

    lazygunn

    Joined:
    Jul 24, 2011
    Posts:
    2,749
I think I'd have a go at it! If not straight away, then definitely when I have a full sense of the system. It seems quite sensible; the idea of layering materials intuitively makes a lot of sense to me as an artist, even if I just made it for myself. I'm wondering if I could use the exercise of converting the rather great-looking UDK library into Shader Forge equivalents to make it all work under a unified PBR structure. It seems like specifying a variety of lighting models shouldn't be too hard, so 'interesting' cases like cloth and translucent materials can be taken into account.
     
  43. Acegikmo

    Acegikmo

    Joined:
    Jun 23, 2011
    Posts:
    1,294
    Sorry if I may seem a bit unresponsive as of late. I've been quite busy with a lot of things, and I've not been feeling very well in general. I do read everything in here, but I don't always reply. I'm glad you guys are helping each other!

    To answer a few of the things above:

    @digitalsalmonfx: If you want proper normals when using a texture for vertex offsetting, you should also use a normal map in order to get realistic normals. You can also check for partial derivatives using your vertex offset function, which also works when doing dynamic offsetting, although it will be more expensive.

    @Chariots: I've never even seen the Dependency keyword before! Is it enough to add those to the shader, did you try it?

    @lazygunn: I'm hoping that the custom code node and the nested node functionality will essentially allow users to create any noise function they want, and wrap it nicely into a single node. I'm probably not going to explore noise any further than what is already in presently: http://acegikmo.com/shaderforge/nodes/?search=noise
     
  44. lazygunn

    lazygunn

    Joined:
    Jul 24, 2011
    Posts:
    2,749
    I'm sure the custom nodes will be fine, i'll look forwards to their inclusion, meanwhile today starting this moment I finally get to have a go at this, looking forwards to it all
     
  45. AndreasAustPMX

    AndreasAustPMX

    Joined:
    Mar 2, 2014
    Posts:
    9
Hello Joachim, SF is awesome, I'm really liking this tool :)

Would you explain these icons, please?
    $sf.JPG

Is there a way to benchmark a shader's performance?

    Thx ;)
     
  46. Marco-Sperling

    Marco-Sperling

    Joined:
    Mar 5, 2012
    Posts:
    620
These icons represent vertex shader instruction count, fragment shader instruction count, and texture lookups (from left to right).
To benchmark shader performance you might use the Unity profiler: just instance a bunch of meshes with the desired material and compare it to other shaders.
     
  47. AndreasAustPMX

    AndreasAustPMX

    Joined:
    Mar 2, 2014
    Posts:
    9
    Thx Marco!

The Unity Profiler is not exact enough :)
Two shaders, same texture, different calculations, but the same result...
     
  48. GamebookStudio

    GamebookStudio

    Joined:
    Sep 18, 2013
    Posts:
    10
Sorry for bumping my questions, but does anyone have an idea about the different coordinate systems in UDK and Unity?
Or can someone post a rim light shader node tree? :)

     
  49. Acegikmo

    Acegikmo

    Joined:
    Jun 23, 2011
    Posts:
    1,294
UDK's vectors are in tangent space by default, but in SF they are in world space, so this would work in UDK, but not in SF:

    rim = 1 - dot( (0,0,1) , viewDir )

    Because in tangent space, (0,0,1) is the normal direction.

    In SF, you can either use the normal dir, view dir, dot, and one-minus nodes, or simply use the fresnel node :)
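The arithmetic behind that rim term is easy to check numerically. A small Python sketch of the "one minus dot" construction, with the world-space surface normal standing in for UDK's tangent-space (0,0,1):

```python
# rim = 1 - dot(N, V): zero when the surface faces the camera, one at
# grazing angles (the silhouette), which is what makes it a rim light.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def rim(normal, view_dir):
    """Rim term from a unit surface normal and unit view direction."""
    return 1.0 - dot(normal, view_dir)

normal  = (0.0, 0.0, 1.0)         # world-space surface normal
facing  = (0.0, 0.0, 1.0)         # view direction straight at the surface
grazing = (1.0, 0.0, 0.0)         # view direction along the surface

print(rim(normal, facing))        # 0.0: no rim when viewed head-on
print(rim(normal, grazing))       # 1.0: full rim at the silhouette
```

In Shader Forge this is exactly the Normal Dir., View Dir., Dot, and One Minus node chain Acegikmo lists; the Fresnel node bundles the same construction into one node.
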
     
  50. bariscigal

    bariscigal

    Joined:
    Oct 26, 2009
    Posts:
    46
Any chance you could share how you integrated another matrix for projection? I think someone else asked the same question with no answer, but I wanted to try my chances :)