Discussion in 'Assets and Asset Store' started by Acegikmo, Jan 11, 2014.
I think I'm overlooking something simple but is there a 'smoothstep' function somewhere?
I've released a pack of animated dissolve shaders made with Shader Forge to the Asset Store, now available for just $10.
Asset store page: https://www.assetstore.unity3d.com/#/content/14635
Forum thread is here: http://forum.unity3d.com/threads/225565-NEW-Animated-Dissolve-shaders-(for-Shader-Forge)
Shader Forge ownership is currently required.
Currently no! But it is a planned node. In which situation did you intend to use it?
You could make a manual smooth function for now:
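For reference, the node setup would implement the standard smoothstep formula: clamp the input into [0, 1], then apply the Hermite curve 3t² − 2t³. A Python sketch of what the manual node chain computes (illustrative only, not Shader Forge code):

```python
def smoothstep(edge0, edge1, x):
    """Standard smoothstep: remap x from [edge0, edge1] to [0, 1],
    clamp, then apply the Hermite curve 3t^2 - 2t^3."""
    t = (x - edge0) / (edge1 - edge0)
    t = max(0.0, min(1.0, t))  # clamp to [0, 1]
    return t * t * (3.0 - 2.0 * t)
```

In Shader Forge this maps onto a Subtract, Divide, Clamp 0-1, and a couple of Multiply nodes.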
First off, love this tool. Even for someone who is experienced with coding and shaders, this can save so much time. Visual representation is why I love Unity.
Secondly, I have a question about transparency. I have a multi-pass shader I wrote using the alpha test tag. The first pass renders all opaque geometry, followed by transparent geometry, so I can have something akin to glass in a window without having to create multiple meshes/textures (however, I'm using vertex colors, so the textures are not accurate). Using the depth buffer helps me achieve this, but I end up with artifacts on transparent geometry drawing over itself. Is there a way to have multiple passes? If not, should I break my mesh into multiple pieces and use two shaders (one for opaque and one for transparent)?
Thirdly, is there a way to include the skybox in the preview, as this helps to see the resulting transparency work properly against a background.
Thanks for your amazing work! Look forward to using this more.
You don't have control over passes in SF, so in that case you're better off creating two shaders.
There's no way of having a skybox background in the previewer at the moment, but it would be nice
Hey, can you estimate when you will do the nested nodes (preferably exportable to other shaders)? I am testing some complex shaders and the trees get so huge it's uncanny! It would be great to be able to test certain things in a separate shader and then export the compound and use it in the final shader.
Not sure! I haven't started working on it, but I can see several cascading issues with shareable nested nodes already.
For instance, if I edit a nested node that I'm using across multiple shaders, I would have to force a recompile of all shaders using it etc.
This then gets even more complicated if I'll allow nested nodes within other nested nodes, in which case I need to recursively check changes, and sounds like an area of potential risk of asset links breaking, prefabs going missing, etc.
It's a big topic, but if people think I should prioritize it, I will
New update seems very nice, thank you.
Playmaker 1.7.6 (newest version) has a conflict with your plugin. It will not let the Inspector expose public variables (Texture2D, Color32, etc.). It seems to pick and choose which it will and will not expose. (Playmaker is a visual scripting system for Unity; it is really popular.)
Just an FYI; maybe it will let you know of some bugs or limitations in your plugin. I have a feeling it is Playmaker's fault, as they tweak the GUI very heavily.
Not a big deal for me; I plan on pre-making all of my shaders in their own test project folder, so I don't accidentally break my game.
Can you please add the feature to zoom in and out in the node view in the next update?
Possibly; there's quite a lot to do at the moment, though. I also happen to be sick right now, so things are going very slowly.
I've been playing about with different types of texture blending, and different ways to drive the blend, this week. One of these uses heightmaps to modulate the blending so that, for example, grass would start to creep between cobblestones and only actually cover the tops of the stones when fully blended. I have this mostly working but the join between the two is sharp, I'm planning on using smoothstep to smooth out this join while keeping the nice height-sensitive bits in it.
I did start to implement Perlin's "smootherstep" function, but the node tree for that just got painful with the amount of mathematical operators it uses.
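For anyone curious, Perlin's smootherstep is the quintic 6t⁵ − 15t⁴ + 10t³, which has zero first and second derivatives at both ends (hence the large operator count when built from basic math nodes). A Python sketch of the formula for reference:

```python
def smootherstep(edge0, edge1, x):
    """Perlin's smootherstep: remap, clamp, then apply 6t^5 - 15t^4 + 10t^3."""
    t = (x - edge0) / (edge1 - edge0)
    t = max(0.0, min(1.0, t))  # clamp to [0, 1]
    # Horner form keeps the multiply count down: t^3 * (t * (6t - 15) + 10)
    return t * t * t * (t * (t * 6.0 - 15.0) + 10.0)
```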
Amazing product, a must have. So many possibilities now for people like me who haven't gotten into shader programming.
It would be awesome if you could export/import templates so people could trade their customizations, rather than working from a screenshot or compiled shader!
So awesome! Keep it up.
Hmmm, speaking of Perlin, Molt, can anyone graciously point me in the direction of recreating a noise function without having to resort to using textures?
I'm attempting to simply do vertex offsetting with noise over time... an undulating/rippling surface.
Joachim, I wanted to congratulate you on making such an invaluable tool! I'm already in love with Shader Forge and you keep making it better!
That being said, I'm afraid Shader Forge may be too good. I know less is better when shaders are concerned, but I have yet to notice any performance issues while stress-testing even my most complex ones (one I replicated in UDK caused it to immediately crash). Do you plan on adding any performance-related info to the FAQ page? Or maybe a tutorial video explaining performance dos-and-don'ts? The node documentation mentions very little about node cost, and I think users new to Shader Forge (and shaders in general) could benefit from some sort of chart or overview of node costs.
Also, I would love to see a Function node that could be plugged in to the If node to externally trigger that node cluster, or a Rand (Random) node that can generate a value between a user-selected range. Or are these kinds of scripting nodes not possible in Unity?
Anyway, lovely job so far. Can't wait to see what you add next! (And get better soon!)
Thanks a bunch Niiromaus! Glad you like SF
When it comes to performance, there are essentially two major things to watch out for:
1. Instruction count - The numbers below the 3D preview will show you how many GPU instructions the shader will cost, per-vertex and per-fragment/per-pixel, as well as how many texture lookups it has. Per-vertex is obviously cheaper, so for instance, if you have a complex vertex animation tree of 60 instructions, it's still much cheaper than 60 instructions per-fragment.
2. Dependent texture reads - Avoid using a texture to figure out UV coordinates at which to sample another texture. This can get heavy, and is also the reason why Parallax mapping is generally expensive. It may not be that big of an issue on modern PC GPUs though.
Nested nodes is a planned feature, that has gotten quite a bit of priority now due to lots of people requesting it
Generating random values on a GPU doesn't seem to be possible with Unity. That said, you could use a texture with noise and read from it, to get pseudo random numbers, which is likely much faster too (This is how Unity's SSAO works, for instance)
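The approach the reply describes can be sketched on the CPU side: bake random values into a texture once, then sample it in the shader with some varying coordinate to get pseudo-random numbers. A hypothetical Python sketch of the idea (the function names are illustrative; in Unity you'd bake this into a `Texture2D` and sample it in the shader):

```python
import random

def make_noise_texture(size, seed=0):
    """Pre-generate a size x size grid of grayscale noise on the CPU.
    This stands in for the noise Texture2D you'd bake once and reuse."""
    rng = random.Random(seed)  # fixed seed makes the texture reproducible
    return [[rng.random() for _ in range(size)] for _ in range(size)]

def sample(tex, u, v):
    """Point-sample the texture with wrapping UVs, like a repeat-mode lookup."""
    size = len(tex)
    return tex[int(v * size) % size][int(u * size) % size]
```

Feeding a varying input (screen position, time, world position) into the UVs yields different pseudo-random values per pixel, which is the trick the SSAO comparison refers to.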
Had no idea about this. This'll definitely help with some optimisation later on.
Such an awesome tool, thank you Joachim!
I had asked for a tutorial on how to create a detail normal map in Shader Forge, but after seeing how it is done in UDK, I successfully created the detail normal map in Shader Forge myself!
Sorry for the ugly screenshot, it was randomly created ^^
Hej Joachim, incredible tool!
I have a bit of a newbie request, I was wondering how to go about building a toon-ramp shader with some kind of lambert shading, something like Team Fortress 2, I tried to build a custom lighting shader and got something close, but it was too dark (?)... not sure how to do this the easiest way
Nice! Glad you like it
I've started working on a proper normal blending node for 0.23, so it will be even easier soon, and also more accurate
I suspect you'll want something like this:
I was playing around to see how hard it would be to make an 8-texture terrain shader. It was surprisingly easy with Shader Forge!
Until I ran out of texture samplers when I started playing around with normal maps. So now my question is: How would Addpass support work, or is it even possible?
You can't override that really. Possibly if you force DX11 so you go beyond SM3.0, though that would also force it to Windows/360 only.
That said - 16 textures is a massive amount for just one shader. Isn't 11 enough?
(Splatmap + 5 diffuse w. blendmask in the A channel + 5 normals)
I like a little more variety in my terrain. That being said...I could probably pare it down. Maybe sacrifice a few normals for unbumped diffuses.
You can overcome this limit using other tricks. For instance, you can pack two normal maps into one texture by only using 2 channels per normal map and reconstructing the z component of the normal map manually. You can often derive the information in one texture from another, which is a common technique for creating specular maps from the diffuse channels. You could also try packing multiple textures into a single image and frac'ing the UV coordinates (and you'll need to pad the textures a bit so the filtering doesn't cause issues).
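The last trick mentioned, packing multiple textures into one atlas and frac'ing the UVs, amounts to wrapping the tiling UV into [0, 1] and then scaling/offsetting it into one tile's sub-region. A small Python sketch of the UV math (illustrative; the function name is my own, not a Shader Forge node):

```python
def atlas_uv(u, v, tile_x, tile_y, tiles=2):
    """Map a tiling UV into one tile of a tiles x tiles texture atlas.
    The modulo stands in for shader frac(), so the sub-texture still repeats;
    the offset/scale then selects the packed sub-texture."""
    fu = u % 1.0
    fv = v % 1.0
    return ((tile_x + fu) / tiles, (tile_y + fv) / tiles)
```

This is also why the padding matters: bilinear filtering near a tile border will otherwise blend in texels from the neighboring packed texture.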
How can I reconstruct the Z component for a normal map?
z = sqrt(1 - x*x - y*y)
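Since a normal is unit length (x² + y² + z² = 1), the z component follows from the other two as z = sqrt(1 − x² − y²). A Python sketch; note that if the x/y come straight from a texture you first remap them from the [0, 1] storage range to [−1, 1]:

```python
import math

def reconstruct_normal_z(x, y):
    """Recover z of a unit normal from its x/y components.
    x and y are assumed already remapped to [-1, 1].
    max() guards against tiny negative values from rounding."""
    return math.sqrt(max(0.0, 1.0 - x * x - y * y))
```

This assumes the normal points out of the surface (z ≥ 0), which holds for tangent-space normal maps.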
Great tool Acegikmo!
I have a couple of noob questions:
1) Please, can you explain to me exactly how to build, with Shader Forge, a simple Diffuse shader that works exactly like the standard Unity Diffuse shader?
2) I have added a fallback shader (i.e. Unity's Diffuse) to a complex Shader Forge shader, to allow it to render something (i.e. not black) in SM 2.0.
But when I use Unity's graphics emulation to test Shader Model 2, the Shader Forge shader always renders the mesh black.
Is there a special trick to using the "fallback" parameter?
Thanks in advance!
1) Make a Texture2D node named MainTex, and multiply it by a Color node called Color. Then under lighting settings, check Lightmapping, Light Probes, and Double incoming light. Set normal quality to Interpolated. I think that's as close as you get
2) Not sure! Currently SF makes SM3.0 only shaders, I would expect the fallback to be handled in the editor though, but I guess it's not. Does it apply the fallback in a build?
Many thanks for the quick answer
About question 2:
Unfortunately I don't have an SM 2.0 card to test it on right now; normally I use the editor function (it seems to work fine).
Some other shaders that I have bought from the Asset Store work correctly when I set graphics emulation to SM 2.0.
From the Unity3D docs, it seems that the fallback parameter is handled directly by the shader language:
So it's not really clear to me why, if I set a standard Unity shader (like Diffuse) in SF's fallback field, it doesn't work in the emulated SM 2.0 mode.
It could be that you're using different names in the shaders. The blackness could be a working shader whose texture is empty, or whose color is black, or similar.
Make sure your diffuse node is called "MainTex" (So that the internal name becomes _MainTex), and the Color node is called "Color".
Other than that I'm not sure what it could be
You are right! ...I made a spelling error when writing the color parameter.
Now my SF "standard diffuse" works fine!
Then I extended it, only changing "Face Culling" from "Backface culling" to "Double sided".
It seems to work fine, but when using an ORTHOGRAPHIC CAMERA a strange visual problem appears:
Is this a bug, or must I set some other parameter to use double-sided culling in SF?
Thanks for the tips, I'll play around with it and see what I can come up with!
thanks for fixing the normal issue, works as it should now
Can Shader Forge make a glow shader?
Anyone know how to make or fake a Sobel edge detect? I am trying to make a nice outline for my cartoon game. I found this tutorial:
But I need to have a Sobel input for the tutorial. It seems like everything else I need is here.
The question isn't whether shader forge can make a given shader, it's more a question of whether you can use it to make one.
I think glow just needs to output some Emission value and then needs a post-processing Image Effect (pro only) to apply a glow.
I was just wondering: is there a way to make a lens flare type shader with your tool? I've been trying to mess around to make one, but when I set the sorting to "overlay" it doesn't render last, so I suppose I'm doing it wrong. In all honesty, I don't know the proper method for making a lens flare. Another way I was thinking of is having the shader set its depth test to always, and using the z-depth + a distance test to fade out the prop if it's behind another object. That way I can have the bleed-over effect that lens flares do, but also have it fade away when the center nears the edge of another mesh.
And along with that, is it possible to make a custom post process with SF? I suppose I can just set the depth test to always, but would that create possible sorting issues with other transparencies? And if that's all it takes, is that the proper pipeline for creating a post process with SF?
Pro isn't the issue for me. My problem is I'm a Shader forge noob trying to learn. I've seen a few "Glow" shader scripts in the asset store, I've been trying to figure out where to start. This should get me started. Thx
With the built-in shaders, Unity uses the Main Color alpha to set the opacity of the Glow camera effect.
In Shader Forge this apparently does not work.
Not sure if you can get your hands on that in Shader Forge, but your Sobel input is just an image; use either your whole screen, the scene Z buffer, or the scene normals. You can definitely do it with textures in Shader Forge.
To understand what you are doing here: open an image in Photoshop, duplicate it as a new layer, blur it, and set its layer mode to Difference. That's your edge detection.
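The Sobel operator itself is just two 3×3 convolution kernels, one per gradient direction, with the edge strength being the gradient magnitude. A Python sketch of what the texture-based node setup would compute per pixel:

```python
# 3x3 Sobel kernels for horizontal (X) and vertical (Y) gradients
SOBEL_X = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
SOBEL_Y = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]

def sobel_magnitude(img, x, y):
    """Gradient magnitude at (x, y); img is a 2D list of grayscale values.
    (x, y) must be at least one pixel away from the image border."""
    gx = gy = 0.0
    for j in range(3):
        for i in range(3):
            p = img[y + j - 1][x + i - 1]
            gx += SOBEL_X[j][i] * p
            gy += SOBEL_Y[j][i] * p
    return (gx * gx + gy * gy) ** 0.5
```

In a shader you'd do the same thing with nine offset texture samples; thresholding the magnitude gives the cartoon outline.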
Hi everyone!
I need some help! Shader Forge is already giving me deep sleep problems...
I've spent three days and nights trying, but I can't build the following shaders:

An AMBIENT OCCLUSION shader.
- This should be possible by inverting the Light Attenuation and Light Direction nodes and giving them a soft blend incidence over the normals and camera direction...
- Basically, invert the light falloff so it has the opposite effect (i.e. fake ambient occlusion) projected onto the textures...

A WORN EDGES shader.
- This should be possible by calculating both normals and tangents, adding or subtracting them from world space with a Transform from world to local, and dividing by the normal position... That would give a mask of the object's edges; then we would just need a (fake procedural) noise to distort it...
- Also, a shader to project blended textures onto a 6-plane base... That is, calculating the triplanar world X, Y and Z positions, translating world to object coordinates, but doing a half divide and thus duplicating each X, Y and Z into two planes, so that the mixed xy, yz, zx, x, y, z project the texture coordinates...

And gee... I know it's technically possible, but I've been pulling my hair out for almost four days, I can't sleep, and no amount of effort gets me there! :razz:
Can someone please experiment with this and, if you achieve anything, throw up some node screenshots showing how, pleeease! Or at least put it in the store, or sell the method, or something... :wink:
I'm already going hysterical and can't spend much more time trying and failing!
Thanks a million!
I'm loving Shader Forge so far; it definitely makes creating shaders less daunting, but the lack of full documentation and tutorials can be frustrating.
One thing that I think would be incredible would be a guide on converting from UDK's shaders to Shaderforge. There are WAY more resources available for UDK's system, and being able to learn a concept from that and adapt it to Shaderforge would make learning this stuff easier.
I convert shaders (well, surface shaders) from UDK to Shader Forge all the time. Really the only thing to take note of is that "saturation" in UDK is "clamp" in Shader Forge. Other than that, things are basically the same. There are a few differences, but most of them are pretty easy to figure out.
Hey thanks for the Photoshop tip.
My question is about Shaderforge though, can you tell us how to do it in ShaderForge please?
I am sure others will thank you too. There is a way to make screenshots in the SF editor, in the top left-hand corner.
How do I remove a connection in Shader Forge?
Alt+mouse, or click on the connection area and right-click, or press Del... I can't remember.
Hold down ALT and right click on the inputs or outputs of a node. Alternatively, you can hold ALT and drag with the right mouse button down to 'cut' through multiple connections.
Thank you! :-D
Man..... this is so 5 stars!
Shader Forge Beta 0.23 now released:
• You can now zoom the node view - which took ages, but now it's finally working \/
• Added node: Normal Blend. This node will combine two normals; a detail normal perturbed by a base normal
• Added node: Blend. Photoshop-style blending with 18 blending modes:
- Darken, Multiply, Color Burn, Linear Burn
- Lighten, Screen, Color Dodge, Linear Dodge
- Overlay, Hard Light, Vivid Light, Linear Light, Pin Light, Hard Mix
- Difference, Exclusion, Subtract, Divide
• The Component Mask node will now display outputs per-channel, in addition to the combined one. For example, if you mask out RGB, you will now get 4 connectors, RGB, R, G and B
• There’s now a checkbox under settings, where you can enable/disable node-preview updating, in case you’re having performance issues in large node trees
• Fixed a bug where you couldn’t use depth nodes in the refraction input
• Fixed a bug where the Length node output the wrong component count
• Fixed a bug where objects in the node view could be selected through the side panels
• Fixed a bug where the screenshot button overlapped the toolbar when expanding settings
Click to see all changelogs
(Don't forget to delete the old Shader Forge before updating to this one!)
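As a side note on the new Blend node: the Photoshop-style blend modes listed in the changelog reduce to simple per-channel formulas. A Python sketch of a few of them, with values in [0, 1] (illustrative reference math, not SF's exact implementation):

```python
def multiply(a, b):
    """Darkens: the product is never brighter than either input."""
    return a * b

def screen(a, b):
    """Lightens: the inverse of multiplying the inverted inputs."""
    return 1.0 - (1.0 - a) * (1.0 - b)

def overlay(a, b):
    """Multiply in the dark half, screen in the bright half; base value a picks."""
    return 2.0 * a * b if a < 0.5 else 1.0 - 2.0 * (1.0 - a) * (1.0 - b)

def difference(a, b):
    """Absolute difference between the two inputs."""
    return abs(a - b)
```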
Also, I totally missed almost the entire last page of comments! I suppose that's what I get for relying on email notifications
I'm glad you guys are helping each other with shaders
This is a strange issue, I'm not sure what's causing it, but I think it might be due to two things, either that SF does a single-pass double-sidedness rather than double-pass double-sidedness, which is an approximation that can cause issues on low-poly objects, or shadows not handling face culling properly at the moment (It's a known issue). Does it work better on a high poly mesh?
I'm all ears for which parts you find most confusing, and what you would like to have more documentation on
I've actually planned to make a "UDK to SF" tutorial video, hehe. But like someone already said, they're very similar, not that much differs
Is there a way I can have multiple passes with Shader Forge?
I need to have two passes:
1) first the alpha transparency texture (with the glitchy z order)
2) second the alpha clip, on top, (which has a correct z order)
This way I'd have a transparent material with correct z order, and you'd still see the semi-transparent anti-aliasing on the edges.
Is this possible?