Discussion in 'Assets and Asset Store' started by Amplify_Ricardo, Sep 13, 2016.
Any thoughts on the issue here?
Just wanted to say that Vertex To Fragment is working fine after the latest update. Thanks !
Haven't got the time yet but I didn't forget you. Later today I'll try to get back to you.
Good to know
No rush, just checking
Is it normal that after changing the port order in a material function, the connected nodes in the shader are wrong?
Is this only in the editor, or does it affect the shader itself?
My append problems:
So I finally took a look at it.
You need to reconstruct the normals according to the changes to the vertices. This means you need some kind of analytical solution (some math) to calculate the new normal. What you are doing right now is just scaling the normals along the Y axis, which doesn't make any difference besides making the final result brighter or darker.
There's a solution where you use the derivatives to reconstruct the normals (DDX and DDY), but they only work in the fragment function and will produce a faceted look. What you should do instead is find the correct math for the new normals. I just stacked some nodes together that made sense and I got this:
While this is better it's still not correct, but hopefully it should give you a hint on what to do. There are a few articles on the web about it, not an easy subject.
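One common analytical approach, assuming the vertex offset comes from a heightmap, is to sample the neighbouring heights and rebuild the tangent-space normal from central differences (this could live in a Custom Expression node, for example). The function name, parameters, and texel step below are all placeholders, not the exact graph from the screenshot:

```hlsl
// Sketch: rebuild a tangent-space normal after a heightmap-driven
// vertex offset, using central differences on the heightmap.
// tex2Dlod is used so this also works inside the vertex function.
float3 ReconstructNormal(sampler2D heightMap, float2 uv, float strength)
{
    float2 texel = float2(1.0 / 512.0, 0.0); // heightmap texel size (assumed 512px)
    float hL = tex2Dlod(heightMap, float4(uv - texel.xy, 0, 0)).r;
    float hR = tex2Dlod(heightMap, float4(uv + texel.xy, 0, 0)).r;
    float hD = tex2Dlod(heightMap, float4(uv - texel.yx, 0, 0)).r;
    float hU = tex2Dlod(heightMap, float4(uv + texel.yx, 0, 0)).r;
    // Slopes along U and V become the X/Y of the tangent-space normal.
    return normalize(float3((hL - hR) * strength, (hD - hU) * strength, 2.0 * texel.x));
}
```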
Nice catch, it shouldn't do anything to the shader code, it's just a preview error, I'll fix it for the next build. Thx for the report
Ahhhh, now I understand. That was by design actually, but I'll see what I can do. The reason it was by design is that the connections depend on the order of connection; we can't rely on that, because at any time you could load a shader where the order could be different, resulting in a different outcome. We may need to rethink this logic to make it more practical, but for now just use a break-to-components or a mask between the two appends.
You're right, I checked with an ASE shader on a substance and it works, but what I want to do is a shader that blends 4 substances with a mask (one per RGBA channel) and preserves the exposed parameters of each substance.
Something like that, but the Unity shader doesn't actually offer enough possibilities.
I'm not sure this is the best place to post, but maybe I'm wrong.
I have a scene with a spherical terrain. Depending on the position of the user (north or south pole), I need a different lighting (sun or moon for example). I would like to use one directional light per pole. However, when I use the Mixed Lighting option, static objects get their direct lighting popping from one light to the other.
Do you know if there is a way to limit the range of a directional light ? (I think the answer is negative).
Alternatively, is there a way to create with ASE a custom directional light ? (parallel rays, constant intensity ..., but with a limited range).
Thanks for the sample, I'll look into it further!
Awesomely great tool!
Wondering if int/float/matrix arrays are planned on the roadmap?
Unity always uses the last indicated shader as the fallback, so any custom fallback above "Diffuse" will be ignored. You can download a test scene here.
Hey there, I forgot to warn you about something from the sample you provided, you should do this:
The reason I'm repeating the switch just before the vertextofrag node is that, the way you had set up your shader, it would generate the vertex function code even if you were using the per-pixel variant. While you weren't using it when the toggle was off, the shader was still assigning the code to an interpolator. Having the switch just before the vertextofrag node, with the False statement empty, makes it so no special code gets assigned to any interpolator, thus saving instructions.
Check out the Global Array node; it's a community node until we figure out a good way to do it, but it should do what you need
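The node reads a global array that you fill from script. A minimal sketch of the script side (the property name and array size are placeholders; they must match what you set on the node):

```csharp
using UnityEngine;

// Sketch: feeding a Global Array node from script.
// "_MyGlobalArray" is a placeholder name and must match the node,
// and the array length must match the size configured on the node.
public class SetGlobalArray : MonoBehaviour
{
    void Start()
    {
        float[] values = new float[4] { 0.1f, 0.5f, 0.8f, 1.0f };
        Shader.SetGlobalFloatArray("_MyGlobalArray", values);
    }
}
```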
Say whaaaa!? If that's the case I see why you are having problems with it, the link you provided isn't available tho.
I need to further test this and I'll bring some kind of solution if necessary.
Have you guys thought about supporting geometry shaders?
My world has been a lie :C
You are right, it only picks the last one. According to the docs, fallbacks are the equivalent of replacing them with the subshaders of that fallback, so I assumed that having more than one would follow the same logic as subshaders and run them in order. But I guess that's not the case; it will only do it for the last one. The reason we kept the other diffuse fallback is because of the shadow caster. We'll have to change it a bit to use our own shadow caster. Oh well... :C
Thx for reporting this and sorry for the inconvenience, consider it fixed for the next release
That's one of the reasons for templates; when fully fledged they should be able to handle geometry shaders as well. You could probably do some things with them already, as long as it stays contained.
I would like to make a shader that takes the positions of specific objects into account (multiple), but I can't figure out how to get the position of even one object (in world space, I think). Is there a way to do that currently?
From what I understood, there should be a way to make a global variable to set a vector, which I could then access from code?
I don't know if it can be useful for you, but the Force Shield Shader (a community shader) uses a method to send coordinates to the shader. A script attached to the object with the shader gets the impact point between another object and the shielded object and updates a shader variable with this info.
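A minimal sketch of that script-side approach, sending an object's world position to a shader property ("_ObjectPos" is a placeholder name that must match the Vector property exposed in the graph):

```csharp
using UnityEngine;

// Sketch: push this object's world position into a shader property
// every frame. "_ObjectPos" is a placeholder and must match the
// property declared in the ASE graph.
public class SendPositionToShader : MonoBehaviour
{
    public Renderer targetRenderer; // renderer using the ASE material

    void Update()
    {
        // Per-material version:
        targetRenderer.material.SetVector("_ObjectPos", transform.position);

        // Or set it globally, so every shader declaring the
        // property can read it without a material reference:
        // Shader.SetGlobalVector("_ObjectPos", transform.position);
    }
}
```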
Oh, I was wondering if there was a community shader I could look at. That IS actually very helpful, thank you! Maybe I can figure the rest out on my own now ^^
In UE4, shaders used for particles have parameters (which are used by the particle). Is this possible with ASE (or maybe this is a Unity limitation)?
Can you implement this feature? (A texture instance, similar to the Texture Array instance.)
I have a normal map and I want to flip the green channel. In Unreal Engine 4 we found an option to do it, "Flip Green Channel". Can we do the same with Amplify Shader?
Sorry for the wait, here's the latest build for this week.
Grab it at the usual place.
Release Notes v1.3.1 dev 04:
Added support for Texture Arrays with 'Parallax Occlusion Mapping' node
Fixed issue with 'Static Switch' node duplicating code
'Static Switch' node now properly allows the use of the same keyword
Fixed issue with Int ports generating black previews
Fixed issue where 'Custom Standard Lighting' node was generating garbage code when connected multiple times
Fixed dynamic baked lightmapping for 'Indirect Diffuse Light' node
Default fallback is now only added if shader doesn't use its own
Changed 'Template Parameter' node to mimic the same look from the equivalent property nodes
Changed some labels and warning texts to be more clear on what's going on for texture objects
Int port color now uses the same color as float nodes
Added ASE custom inspector to the default templates
Do you mean being able to change a property of the material that you use in your particle system? If so, this is already possible: you can override the material your particles use, and that material is an ordinary material that can have its own properties. The best example is the property for soft particles.
That would require quite the change, but I'll see what I can do about it. Thx for the suggestion.
That is tied to architectural design choices in both engines; asking for that is kind of like asking to flip the Y and Z axes. You can, however, write a pre-import script that flips the green channel of normal maps, but then you need to control which ones you want to flip and which you don't; the quick and dirty way is to use the path name as the logic. So, in short, yes it's possible, but it's not exactly a toggle inside ASE, it's a separate script that does something to the file on import. Look here.
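A rough sketch of such an import script, using the quick-and-dirty path-name convention described above (the "FlipGreen" folder name is just an example convention, not anything built in):

```csharp
using UnityEngine;
using UnityEditor;

// Sketch: editor-side AssetPostprocessor that flips the green channel
// of any texture imported from a path containing "FlipGreen"
// (a made-up naming convention; adapt it to your project).
public class FlipGreenChannel : AssetPostprocessor
{
    void OnPostprocessTexture(Texture2D texture)
    {
        if (!assetPath.Contains("FlipGreen")) return;

        Color[] pixels = texture.GetPixels();
        for (int i = 0; i < pixels.Length; i++)
            pixels[i].g = 1.0f - pixels[i].g; // invert green

        texture.SetPixels(pixels);
        texture.Apply(true); // true = rebuild mipmaps
    }
}
```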
Happy weekend everyone, keep the reports and suggestions coming
Hi Amplify developers and community!
I'm trying to work out which asset would suit me best, ShaderWeaver or ASE. I am confident in C#, but the little I have tinkered with shaders has been tricky to say the least - hard enough to put me off exploration and creative thinking! I am interested primarily in 2D image effects.
I have downloaded the ShaderWeaver free trial and produced an effect like this,
very easily indeed, using three nodes. I was surprised how intuitive and fun it was.
I am aware that ASE will provide more flexibility and a steeper learning curve. By way of a test case, would anyone quantify or demonstrate how easy it would be to produce a similar effect to the one shown above in ASE? How many nodes would it require?
Which node in ASE is equivalent to Absolute World Position in UE4 ?
does ASE support Texture2DArrays?
Any idea how to make realistic lava and molten?
Weird issue on the latest OSX, MacBook Pro 2013. When I duplicate a node with Cmd+D I cannot select the duplicated node; it seems that somehow the key is locked. When clicking on empty space, Divide nodes are created at every click. After pressing Escape I can select the duplicated node, but it was duplicated twice.
I can reproduce this almost every time. Sometimes duplication works just fine, only the node is placed somewhere outside the visible screen, so I have to zoom out to find it.
@Amplify_Paulo - I saw the post about using the Static Switch node. In the latest update there is this fix:
Fixed issue with 'Static Switch' node duplicating code
Is it a fix for the situation below? Or is this the way the connections should be done?
Also, it seems that there's a problem with UnityGI giXXX = gi when used in the vertex function:
Shader error in 'Custom/Undefined Variable': undeclared identifier 'gi' at line 60 (on d3d11)
Sample shader attached.
There's always a duality in these tools. Of course it's nice to have a one-click tool that does what you want, but most of the time it comes at the price of flexibility, like you mentioned. That particular effect, I'm not really sure how many nodes it would require, mostly because I'm not sure what it's doing; I'm guessing it's only panning a changing texture to lerp between two others? Maybe 10 nodes? Just a guess. Keep in mind that you can create the effect you want and save it into what we call a shader function, which is a separate file containing a set of connected nodes that can be used inside any other shader. So while you may take your time learning how to create that specific effect, once you get it the way you want you can save it as a shader function and reuse it in any shader as just one node, effectively reducing the complexity to just one. I know this doesn't fully answer your question, but without more information on that effect it's hard for me to say how complex it really is.
It's called "World Position". Keep in mind the differences in scale, though: Unreal units are centimeters whereas Unity units are meters, so the two nodes might produce slightly different-looking results.
Just a couple of ideas. You could try using two panning textures at different speeds to generate a moving pattern that resembles lava; you would then use this somewhere in the emission output. You might also want to mask the emission with some other texture to simulate hot and cold places, and you could even use vertex offset if you want to make it "alive".
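As a rough sketch of those ideas in shader code (texture names, panning speeds and the brightness boost are all made-up values to illustrate the technique, not a finished lava shader):

```hlsl
// Sketch: two copies of the same texture panned at different speeds,
// multiplied together to break up the pattern, masked by a
// hot/cold texture, and used as emission.
float3 LavaEmission(sampler2D lavaTex, sampler2D maskTex, float2 uv, float time)
{
    float2 uvA = uv + time * float2(0.03, 0.01);   // slow drift
    float2 uvB = uv + time * float2(-0.07, 0.02);  // faster counter-drift
    float3 lava = tex2D(lavaTex, uvA).rgb * tex2D(lavaTex, uvB).rgb * 4.0;
    float hotMask = tex2D(maskTex, uv).r;          // 1 = hot, 0 = cold
    return lava * hotMask;
}
```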
Thx for reporting, can you tell when this started to happen? Does it also happen if you use Ctrl+C, Ctrl+V?
That image is just a suggestion on what you should be doing, it basically saves some instruction counts.
It's true, these lighting nodes are not prepared to be used in vertex functions. I'll see what I can do, though for some I'm pretty sure I can't do anything besides removing the error (which I should have done already :| )
I see, thanks for the info.
I understand. The error didn't occur in previous versions, so I guess it has something to do with the fix for dynamic baked lightmapping on the 'Indirect Diffuse Light' node.
I haven't tried this yet but does anyone know if it is possible to create a billboard effect for something like trees or leaves?
Yes it does, and unfortunately it doesn't seem like I can get the baked lightmaps to work from the vertex function. This means it will only show the dynamic one, which was the previous behavior. I'm going to fix that so it at least retains the dynamic baked light, and also do something for indirect specular, which suffers from the same problem.
If you mean creating billboards, yes you can create them, applying them to trees, leaves or grass is just a matter of ... well.. using them. Keep in mind tho that you might need to do some extra work to make things like wind work with them.
Hi all, I made a cutting shader. I am able to cut my mesh using a plane defined in space and clipping via the "Opacity Mask" channel. I'm in forward rendering with MSAA enabled.
I noticed that the clipped borders of my mesh aren't antialiased. Is it something related to my shader or to Unity's render pipeline?
RenderType is set to "Transparent cutout".
Is there a specific example scene that might give me some insight on how to do that? I've figured out how to mask things by vertex color (which I wanted so some planes would billboard and others wouldn't) and have some fake wind effects, but I haven't figured out how to make the geometry always face the camera.
MSAA uses geometry to figure out which edges to anti-alias; anything that isn't fully opaque won't be caught by it, so transparent or masked shaders, for instance, won't be anti-aliased. It's not a limitation of Unity or the shader, it's a limitation of the MSAA technique. Try post-process AA like TXAA; those should be able to pick up these kinds of cases.
We don't have any sample that uses it to be honest, mostly because it's just one node or option. What I mean is, you have an option called Billboard in the main panel that you can activate and configure, and you also have the equivalent node called billboard that you connect to the local vertex output.
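For reference, what a camera-facing billboard does under the hood is roughly this (a sketch for a quad centred at its pivot, not the exact code the node or option generates):

```hlsl
// Sketch: spherical billboard. Transform the pivot into view space,
// then re-apply the local XY offset there, so the quad's plane is
// always perpendicular to the camera's view axis.
float4 BillboardClipPos(float3 localPos)
{
    float4 viewPivot = mul(UNITY_MATRIX_MV, float4(0.0, 0.0, 0.0, 1.0));
    float4 viewPos = viewPivot + float4(localPos.x, localPos.y, 0.0, 0.0);
    return mul(UNITY_MATRIX_P, viewPos);
}
```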
I created a ground shader with ASE that applies a vertex offset as a height/depth map. However, I have the impression that this vertex offset is not used when I do Global Illumination baking, at least in the clustering step. Is it possible to combine both (vertex offset + GI) and, if so, do I need to do something in particular?
The ASE version is the latest, but I did not use ASE for a while; I am currently on Unity 2017.1.0f3. Yes, Cmd+C, Cmd+V produces two duplicates too, but I can click and select without creating other nodes.
Awesome update, thanks for the hard work.
Some features I would like to see, If I may:
- Copy & paste from one canvas to another doesn't keep the property names. Pretty annoying
- Any plan to be able to create a Standard Surface shader that could handle different rendering modes (Opaque, Fade, ...) like the standard one does, using keywords?
Right now I have a master shader for our game with some additional features that we needed, and I had to create 4 different ones to handle the rendering modes.
When I need to add a new feature, I have to modify all the variants. I used a shader function, which is way easier than before, but... I'm pretty sure you can figure out something even better.
- Still not able to add Header/Label/Group in the properties to make it nicer for users ?
Thanks again for this awesome package
I'm having some trouble with a skin shader that I'm developing. What I'm trying to do is combine my head/mouth/eyelash textures into one. But since the eyelashes will have transparency (double-sided), I'm trying to apply transparency (double-sided) on my entire model using an opacity mask but my internal geometry is showing through (mouth and eyes).
My character's body also has the same problem. I have a white texture for the opacity mask, and the clothing has its own shader. Any thoughts?
On a side note, as far as performance is concerned: is it better to combine all of this into one shader, or create shaders/materials for each individual piece (head/mouth/eyelashes)?
Congrats on Unity Awards 2017 nomination! https://awards.unity.com/
We are very happy and humbled to be part of this year's nominees; it's an honor seeing our product next to so many fantastic entries.
A very special thank you to all ASE users, we cannot thank you enough for all your support.
Hi Amplify team and Unity users,
I have some questions about POM. Actually, I'm working on an eye shader with parallax mapping; it compiles, but the parallax goes "outward" instead of "inward".
When I used Parallax Occlusion Mapping, which I think is the right way to get the iris to sit "inside", I got a compile error.
I already checked the shaders proposed in the examples, but my brain is burning
Have a nice day,
What exactly is failing? Is it a static baking that doesn't show? only partially? Also, have you tried clearing and rebaking? Send us screen and/or a shader.
Thx for the suggestions; some of them we already have plans to do, it's more of a priority thing. The only one where we don't have any plans to change is the rendering modes. You see, the Unity standard shader controls the rendering modes by using a special material inspector. We don't want the shaders ASE produces to be bound to the editor; we want you to be able to share that shader with anyone without also having to deal with dependencies. But of course this could be something you really want, so what you can do is simply override the material inspector of the ASE shader with one you created to do whatever you want. So you can create an inspector that mimics Unity's and set your shader to use it. You can control the flow of the shader with shader keywords, just like Unity does.
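A minimal sketch of such a custom inspector (the class name and the "_ALPHAFADE_ON" keyword are placeholders; the keyword would be declared by a Static Switch in the graph, and the shader would reference the inspector with `CustomEditor "MyShaderGUI"`):

```csharp
using UnityEngine;
using UnityEditor;

// Sketch: a custom material inspector that toggles a keyword
// declared by a Static Switch in the ASE graph.
// "_ALPHAFADE_ON" is a placeholder keyword name.
public class MyShaderGUI : ShaderGUI
{
    public override void OnGUI(MaterialEditor materialEditor, MaterialProperty[] properties)
    {
        Material mat = materialEditor.target as Material;

        bool fade = mat.IsKeywordEnabled("_ALPHAFADE_ON");
        bool newFade = EditorGUILayout.Toggle("Fade Mode", fade);
        if (newFade != fade)
        {
            if (newFade) mat.EnableKeyword("_ALPHAFADE_ON");
            else mat.DisableKeyword("_ALPHAFADE_ON");
        }

        // Draw the regular property list below the custom toggle.
        base.OnGUI(materialEditor, properties);
    }
}
```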
What are the by-hand modifications? Most likely those can be translated to node instructions or keyword settings; if it's something we can change to support, we'll take a look. Share that information with us; you can send an email or PM if you prefer.
You should turn the cast shadows option on to let the depth be written properly. (Deactivate shadows in the renderer if you really need to.)
There are all sorts of issues with transparency ordering; your best bet is to avoid dealing with them as much as possible, so I would use a different shader for the eyelashes. In fact, I would even consider using alpha test (with or without dithering) to save on overdraw. The only other good solution is multi-pass shaders, which is something we are still working on template support for; this is for cases like hair.
Thanks a lot :]
The nomination itself already means a lot to us.
I have a weird question... is it possible to write a shader to bake a custom dynamic reflection probe and try to generate shadows from the cubemap?
I have an eye shader that solves your issue. It includes the ability to have cataracts
Woot, nice KRGraphics! Can you screenshot how you construct it, or just share how you plug your POM node?
I just wonder what Amplify Shader Editor can do for the deferred rendering path?