Discussion in 'Assets and Asset Store' started by Amplify_Ricardo, Sep 13, 2016.
The tool doesn't support multi-pass shaders yet, but support is coming soon (you're not the only one asking for that shader type).
Thanks for the update. The transparency (almost) stays the same now. There is a new problem, however:
A new line was added:
#pragma multi_compile _ALPHAPREMULTIPLY_ON
This actually has a very small impact, but:
Refraction changed from:
color = lerp( Refraction( i, o, ( ( tex2D( _Refraction,uv_Refraction) * _Opacity ) * 0.6 ), _ChromaticAberration ), color, o.Alpha );
color.rgb = color.rgb + Refraction( i, o, ( ( tex2D( _Refraction,uv_Refraction) * _Opacity ) * 0.6 ), _ChromaticAberration ) * ( 1 - color.a );
color.a = 1;
You should be able to test this with the shader I've sent you. The result is this:
You create one by changing the type of all your shader properties to "Instanced".
Both the new pragma and the changes to the refraction output are only meant to affect the Refraction port when connected, not how transparency works. Are they affecting you outside of that?
Also, those changes are necessary to take reflection into account when doing refraction. If your sample changed significantly with the new update, try reducing specularity, maybe even setting it to zero. That should be enough; if not, the opacity amount and albedo color also have an impact. Just looking at your image, I think you want your albedo and specularity at zero.
The left image is from before, the right image from opening and saving the shader in the new editor versions without changing anything.
If that is supposed to happen, I will try to change some values and see if I can get it to work like before again.
All of our Property Nodes (Float, Vector[2 - 4] and Color ) have a Parameter dropdown called Type on their Node Properties window which can be configured to Instanced Property.
By selecting this option for at least one property, you automatically set your shader to use GPU Instancing and configure that property to be Per Instance. Please note that this option needs to be set on each property you want to supply per-instance data for.
Also don't forget that you need to set your instanced properties via MaterialPropertyBlock for your material to correctly make use of GPU Instancing.
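For reference, setting an instanced property via MaterialPropertyBlock from a script looks something like this (a minimal sketch; the property name "_TintColor" is an assumption and should match whichever property you marked as Instanced in your shader):

```csharp
using UnityEngine;

public class InstancedTint : MonoBehaviour
{
    // Reuse one block to avoid per-frame allocations
    static MaterialPropertyBlock block;

    void Start()
    {
        if (block == null) block = new MaterialPropertyBlock();

        var rend = GetComponent<Renderer>();
        rend.GetPropertyBlock(block);
        // Set the per-instance value; instancing keeps these in a single batch
        block.SetColor("_TintColor", Random.ColorHSV());
        rend.SetPropertyBlock(block);
    }
}
```

Attach this to each object sharing the material and every instance gets its own color without breaking batching.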
A new build was uploaded into our website.
Here are the Release Notes:
Release Notes v0.5.1 dev 005:
Added new Curvature Parameter for 'Parallax Occlusion Mapping' Node
Added 'World To Object' node
Added 'Object To World' node
Fixed issues on:
'World Normal' node
'World Tangent' node
'World Bitangent' node
'World Reflection' node
'Register Local Var' node
Tessellation used with Custom Shadow Caster
Master Node Debug port usage
Mip Level not being used with 'Texture Sampler' nodes on Vertex Ports
Here's an image demonstrating the new POM Curvature parameter.
Happy shader creations!
Now this is beautiful man. I may have to use this eventually for complex structures...
Does anyone know how to use the fresnel node? Trying to emulate the fresnel used by alloy
Has anyone tried to combine two Substances together with an alpha? Say I have one Substance with grass, and another Substance with dirt that has an alpha for dirt positioning. I'd like to combine/layer them together within a shader. Thanks.
Glad to see some more great work coming out of the Amplify Shader Team!
I haven't been able to sit down and check out some of the new versions, and I swear I saw someone do this, but I didn't think it was possible previously. So my question is: can the Local Vertex Offset input on the root node take Texture Sample inputs now? Last I checked, this didn't work, and I was hoping it would get added eventually!
Has anything with Local Vertex Offset nodes and modifications changed in the recent couple of versions? I find the changelist updates a bit vague.
I always thought that if you wanted to keep the pseudo-procedural nature of Substance materials, you needed to use the Substance shader/plug-in with Unity. Which means you're limited to keeping all your authoring within that one shader.
If you want to take data from a Substance material and use it to create masks to blend in-engine, then you'll need to bake out the textures you made in Substance, then put the shader together in Unity using a custom shader.
If I'm wrong on any point on this with Substance and Unity, please someone let me know! I think I'm right with this one, though. ^_^
EDIT: I neglected to mention that if you go through the trouble of pre-setting up the blending in Substance, then you can expose controls to make that work in Unity - but that's all inside of one Substance material. Blending two Substance materials in Unity would be equal to blending two shaders, which isn't impossible, it's just a lot more involved.
There is a node (Substance Sample) which lets you use the textures a Substance generates. You can drag the Substance into the node, and it will create all the outputs you have defined within your Substance so you can hook them up to your shader. What I'm trying to do is connect two Substance Sample nodes together and blend them with an alpha which is part of my diffuse map.
What I wanted to try, instead of using one Substance that takes time to update when you make changes, is to break it apart into smaller Substances and blend them together inside a shader, hopefully speeding up the updates. I don't know if it will help, but I wanted to try.
--- UPDATE ---
I figured it out. I used a Lerp node to combine the two, and then a Component Mask node for the alpha.
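For anyone else attempting this, the blend boils down to something like the following (a sketch of the node setup described above; the property names and the choice of alpha channel are illustrative assumptions):

```hlsl
// Two Substance-sampled albedos blended by a mask stored in the
// diffuse map's alpha channel (the "Component Mask" step below)
float4 grassSample = tex2D(_GrassAlbedo, uv);
float4 dirtSample  = tex2D(_DirtAlbedo, uv);
float  mask        = tex2D(_MainTex, uv).a; // Component Mask -> alpha
float3 albedo      = lerp(grassSample.rgb, dirtSample.rgb, mask);
```

The Lerp node generates exactly this `lerp` call, with the masked alpha driving the interpolation.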
This looks very nice, what's the difference with the old POM?
Also why does this displacement shader look so nice? https://www.assetstore.unity3d.com/en/#!/content/82066
I thought a displacement shader also changed the normals...
As the author of that shader explains:
Using a normal map still requires a mesh with correct tangents, so if you have e.g. a standard plane and displace it in the shader, what happens to the mesh normals and tangents? Nothing. They are not modified and do not reflect the displacement effect. The result is bad light interaction, with or without a normal map.
Beast has such good visual results because it handles normal/tangent calculation inside the tessellation pass.
Yes, you can check our Tessellation example, where we use a Texture Sample to read values from a heightmap and apply them as a Vertex Offset.
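The generated vertex code amounts to something like this (a sketch, not the sample's exact output; `_HeightMap` and `_Displacement` are assumed property names). Note that the vertex stage has no derivatives, so `tex2Dlod` with an explicit mip level is required:

```hlsl
// Offset each vertex along its normal by a heightmap sample
void vert(inout appdata_full v)
{
    float height = tex2Dlod(_HeightMap, float4(v.texcoord.xy, 0, 0)).r;
    v.vertex.xyz += v.normal * height * _Displacement;
}
```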
Concerning the release notes being vague, we were trying to present them in a short, simple way so they'd be fast to read.
But you are absolutely right that we were being too vague. We will be more specific about what changed from now on.
Apologies for being late in helping you out. Really glad you figured out how to achieve what you wanted!
We'll add a help box to this node to better explain what each of its inputs does:
In short, they are all used in the fresnel equation:
Bias + Scale * pow(1.0 + dot(I, Normal), Power)
(I is the Incident Vector)
Here are some links ( Link1 , Link2 ) with a nice explanation of the Fresnel effect.
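Written out as shader code, the node's output is simply that equation applied per pixel (a sketch; `viewDir` is assumed to be the normalized direction from the surface toward the camera, so the incident vector I is its negation):

```hlsl
// Bias + Scale * pow(1.0 + dot(I, Normal), Power), per pixel
float3 I = -viewDir; // incident vector: camera toward surface
float fresnel = _Bias + _Scale * pow(1.0 + dot(I, worldNormal), _Power);
```

With `_Bias = 0`, `_Scale = 1`, the result goes to 1 at grazing angles (dot approaches 0) and to 0 when looking straight at the surface, which is the typical rim-lighting setup.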
Oh yeah!!! I saw this Substance node entered into Amplify but haven't been able to use it. That's great that you can use it that way.
Yeah had I known I would've suggested the Lerp to blend that way. Good work! Thanks for the info.
Ah yes! Did I see that there was some sort of update to tessellation somewhere? I thought I had, and that I might be able to leverage it for simple vertex offset. The effect I want to achieve is just a simple vertex offset, without going as far as tessellating the geometry - although that's a capability I'm glad is more versatile in Amplify!
The copy/paste of all parameters from one material to another - simply brilliant. This option saves me many hours.
I can't wait for material functions
Gotcha, displacement only moves the vertices after sub-d occurs. How do we change the normal to match that awesome look in ASE?
I just got Amplify Shader for work. I love the integration (I'm an Unreal Engine guy).
So I'm looking for a POM setup. Maybe somebody can help me? I couldn't find anything in the manual.
Morning, guys... still working on my shaders for my game, and after finding the Blend Normals node, I can add up to 3 extra normal maps on top of the main normal map. Now I am trying to add the ability to mask areas on my model where I want my detail to be... I tried using a Lerp node and the normal blending gets all messed up for me. Unless there is a Normalize node, I don't see this working well... I am open to suggestions.
What is POM ?
Parallax Occlusion Mapping
ASE is quickly becoming my preferred shader editor... but is anyone else noticing the fact that when you press space in the node editor to bring up the quick node menu it actually puts a space into the search field? Like [Space] -> [m] for multiply, etc.. except in ASE, it puts a space into the search field instead of just showing the menu causing it to not find anything and show no results. Using Unity 5.5.2f1 on Mac with latest build of ASE.
Same here - maybe the next update can handle it. (Win10, latest ASE)
Awesome, glad to know that you liked it, it's a great time saver. "Material Functions" are going to be a game changer
Did the sample we sent you directly solve the problem?
That issue will be corrected soon; it should be resolved in the next update.
Thank you for trying it out, we really appreciate it. We would be happy to help, have you seen the included POM sample?
Yeah, I saw that a few minutes after my post. This is a great start for my POM work. Keep up the great work!
I was curious about your answer.
To be honest, asking in public and answering in private is not the best way to help your customers, especially the noob ones like me.
Yep... and it looks really good... I might get bold and squeeze in one more normal map.
Don't worry; unless the user requests otherwise, we always post discussed content in the forum, including possible solutions. We just wanted to be sure that this one worked well before publishing it. It will actually be included in @Amplify_RnD_Rick's post later in the day.
Thank you for confirming it
Just wanted to let you know that the bug where node connections go outside the window is still there in your latest version:
The node previews also seem to be bigger than they should be in the texture nodes:
Is there any Node like this in the Shader Editor ?
So a transform from Tangent to World Space?
Like @Amplify_Ricardo mentioned we were just awaiting confirmation to share the fix.
And here it is:
Yes, unfortunately we haven't been able to tackle this issue yet. Apologies for the delay on this.
We don't have a node which transforms Tangent to World space yet, but we are planning to add it soon!
We also uploaded a new build into our website.
Here are the release notes:
Release Notes v0.5.1 dev 006:
Added new Helper Window accessible via the right most button on the graph window
(De)Activating Tessellation and Outlines forces shader to save
Expanded the amount of nodes with available preview
Added fail-safe to continue loading shader if non-existent community nodes are detected
Added Normal Map unpacking to 'Texture Array' node and updated its sample
Fixed issues on:
Debug Port usage
'Flipbook UV Animation' when property nodes are connected to rows and column input ports
Not configuring 'Texture Array' node ports after read
Register/Get Local Var mechanics
Adding a space on the node palette search when opening it via space bar
Default values on input port internal data if an exception is caught
Ignoring color masks setup on certain situations
Happy shader creations and have an awesome weekend!
EDIT: Guys there is a really small issue with the version we just uploaded on our website. Already on top of it.
EDIT2: Just uploaded a new version. All good now.
Hello, I'm new to Amplify Shader Editor and I'd like some tips about a shader I'd like to create.
I'd like to create a tree shader (for billboards) that uses translucency and that becomes unlit when the sun is under the horizon (so during the night).
I've started with the "ASESampleShaders/Traslucency" shader for customization.
I've seen that there is a "World Space Light Pos" node. Could that be the right one to use? Why does its node properties window show "dir/pos" together? What is the output of that property?
Any other suggestions for achieving what I'm looking for are more than welcome.
Thanks in advance.
Hey guys... if you are still taking feature requests, will you be adding a shader previewer so that we don't have to keep saving the shader to see it on our models? It would speed us up immensely. I would also love to see some form of information in the graph regarding shader performance... I am making a lot of shaders with many features, and I would love to see how they stack up and how much waste I'm dealing with.
Theoretical question: is it possible to create a shader that uses the curvature (pointiness) of the mesh geometry as a mask to blend additional effects/layers? In Blender this curvature/pointiness property is very useful in my metal materials as a mask for roughness/grunge/dust/paint/whatever layers (all done via nodes; the user side is friendly - only sliders). I've seen it in Lumion3D - it is possible in realtime, but that is a different 3D engine... in Unity I've only found this, but it is offline and based on baking texture masks... The only thing I've found in Amplify is "Screen Space Curvature" - but there is no description of what it is or how stable this screen-space approach is...
I'm just wondering... if I create extra slots such as Detail Albedo, POM or any others on the shader and don't use them at all, will this cost any extra performance?
Btw Great job Amplify! Really loving the Asset.
Hello every one!
I started using Amplify Shader around its first release, and coming from Shader Forge, I'm really impressed by the results. I find it much simpler to use, and oh god, the constant improvements are really enjoyable.
That said, I wanted to know: has anyone been able to create a shader for a skybox, or a camera post effect? Is it possible, or is the suite only for surface shaders?
That is REALLY impressive! Maybe a stupid question, but what are the differences between tessellation and POM curvature? I know that tessellation creates geometry by subdividing polygons, but what are practically the advantages of each technique?
Also a last question: in Shader Forge, there was an "Object Scale" node that let you read each scale axis of a transform and use it as a float (useful for getting homogeneous texture scales without a triplanar setup). However, I couldn't find a similar node in ASE. The closest I found was using "Object To World Matrix" and breaking out the components, but none of the broken-out components was a scale factor alone (always combined with a rotation factor). Did I miss an obvious node?
Is it possible with ASE to convert a bump map to a normal map? Right now I'm converting the grayscale texture to a normal map, but it would be easier to feed the bump map to the shader without converting.
The Custom Expression node is very convenient, because it can often replace a lot of nodes that would otherwise make the graph less manageable, but I think it could still use some work:
1) Could you generate internal variable names for the inputs? Sometimes I get the
"Shader error in 'ScX/HiBe/WorldRotatorPlate': declaration of "d" conflicts with previous declaration at (130) at line 135 (on d3d9)"
error, and sometimes multiple nodes with the same variable names work.
It is a bit annoying, and naming the time t differently in each of my expressions just so it does not "collide" (while it is actually exactly the same) feels unnecessary. I even had a simple expression node that failed on its own with the above message because some casting took place, but I have no simple example for reproduction.
2) You allow line feeds in the editor (which, BTW, is hard to use because of the black cursor on a dark background). The shader looks fine after saving, but when you try to reopen it, the expression node is broken and only contains the expression up to the first line feed. If you accidentally save the shader then, the expression is gone for good. The only way to save it is by manually editing the ASE comment part of the shader.
3) Just selecting a node (not just Custom Expressions) marks the editor as dirty.
You can, but those effects usually need a curvature texture. The example provided does work, but because of the way derivatives work in the fragment shader, it will look faceted instead of smooth. You can bake that kind of texture with Blender, xNormal, Marmoset Toolbag or, my favorite, Knald. You can then use that curvature map to do all sorts of things, and yes, you can create that shader with this editor.
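As a rough sketch of what a baked curvature map looks like in use (property names here are illustrative, not from any specific sample):

```hlsl
// Blend a "worn" layer over the base material using a baked curvature
// map: high curvature (edges) gets wear, low curvature stays clean
float curvature = tex2D(_CurvatureMap, uv).r;
float wearMask  = smoothstep(_WearStart, _WearEnd, curvature);
float3 albedo   = lerp(baseColor.rgb, wornColor.rgb, wearMask);
```

The same mask can drive roughness, dust, or paint layers; only the textures being lerped change.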
It will. Remember that the way the Unity Standard shader works is that it compiles different versions of the same shader, each only using what's needed, while here you are creating just one. You can however try to mimic that result with shader keywords to activate/deactivate certain features of your shader, but more often than not you'll find it easier to just copy an existing shader and add or remove the needed parts.
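The keyword approach mentioned above looks something like this (a sketch; the keyword and property names are assumptions). `shader_feature` keywords let Unity strip the disabled variants at build time, which is how the Standard shader avoids paying for unused slots:

```hlsl
// Declared once near the other pragmas
#pragma shader_feature _DETAIL_ON

// ...later, inside the surface/fragment function:
#ifdef _DETAIL_ON
    // Only compiled into the variant that has the keyword enabled
    albedo *= tex2D(_DetailAlbedo, uv * _DetailTiling).rgb * 2.0;
#endif
```

The keyword is then toggled per material with `Material.EnableKeyword("_DETAIL_ON")` / `DisableKeyword`.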
You can't create screen effects yet. As far as skyboxes go, I'm not sure; I'd say it's totally doable, but I don't know if there are specific details about them, since I've never created one. That's one for my book to cover.
Tessellation vs POM really depends on a lot of things. First, you have to take into account that tessellation only runs on shader model 4.5 and above, so your target platform matters. Second, both have a bunch of different options that increase/decrease the processing power needed to run them. Third, tessellation actually changes the geometry by subdividing polygons, which can have implications for shadows and distance detail, while POM is just a shift in the UVs of the texture to simulate depth, which (depending on the quality setting) can impact your screen fill rate. It's a complicated issue, and the best way to deal with it is to test different settings under different conditions, measure the performance, and decide what's best.
No, you didn't miss anything; the unity_Scale variable no longer exists, hence the editor doesn't have a node for it. See here, where Unity provides some examples.
While it is somewhat possible, it's best that you use Unity's importer for that. Creating a normal map from a grayscale means using derivatives to see where the normal is being shifted; it's unnecessarily expensive, and it's hard to recreate the same look when you can simply use a traditional normal map. The only reason to do it is when you have procedural stuff happening.
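For the procedural case, one common variant of this is central differencing on the height texture (a sketch; `_MainTex_TexelSize` is the texel size Unity fills in automatically for a `_MainTex` property, and `_Strength` is an assumed control). Note it costs four extra samples per pixel, which is the expense mentioned above:

```hlsl
// Derive a tangent-space normal from a grayscale height map
float2 ts = _MainTex_TexelSize.xy;
float hL = tex2D(_MainTex, uv - float2(ts.x, 0)).r;
float hR = tex2D(_MainTex, uv + float2(ts.x, 0)).r;
float hD = tex2D(_MainTex, uv - float2(0, ts.y)).r;
float hU = tex2D(_MainTex, uv + float2(0, ts.y)).r;
// Slope in x/y becomes the normal's x/y; z controls bump strength
float3 normalTS = normalize(float3(hL - hR, hD - hU, 2.0 * _Strength));
```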
You can already do that by turning auto-save on and using Unity's previewer to see the changes.
We'll take a look at those issues and fix them as soon as possible.
We've just uploaded a new build into our website.
Here are the release notes:
Release Notes v0.5.1 dev 008:
Majorly improved Previews update speed
Added LOD levels to previews ( sampler and texture arrays )
Added many more node previews
Updated TriplanarProjection and ParallaxMappingIterations samples
Optimization on drawing wires
'World Normal', 'World Reflection' and 'Fresnel' input ports now modify their previews
Improved Nodes Graph internal ordering to correctly create connections on shader load
Fixed issues on:
Using line feed on 'Custom Expression' node code area
Wires and previews displaying on top of the title bar
Order issues on 'Commentary' node
Changes to previews:
They are now much faster and render without delays. We added a bunch of missing previews, most importantly for texture arrays ( really useful since Unity inspector doesn't have them ).
Previews now also respect LODs for Samplers as well as your Project Gamma/Linear settings.
Happy shader creations!
I am still trying to figure out how to use the Fresnel node, so I can closely match the look of Alloy. ASE is growing more powerful by the day. Has anyone figured out how to create NPR with this? After playing games like JoJo's Bizarre Adventure and the latest Guilty Gear with their comic-book-style shading, I am also hoping to add this to my shader set.
I think it should be possible; just keep in mind that for the Guilty Gear shading, it required custom vertex normals and UVs to get the same style.
Cool, downloading it right now.
By the way, is there a "guide" to know what different colors mean in debug mode?
Or how to make use of it, for that matter.
Does anyone know why this is happening to me when using textures? The left one is using the typical Sprites-Default sprite shader; the right one, the shader shown.
Extruding the edges in the image settings actually makes them more visible...
Another approach to skin shader.
If you open our helper window via its button you'll see what each color represents.
You'll also have a list in there indicating all available keyboard shortcuts.
We'll be adding additional information in there that we or you guys find useful.
The shader you shared is missing a critical step. To take your sprite's alpha into account, you need to connect your Texture Sample's alpha channel to the Master Node's Opacity port.
You should be doing something like this: