
[Best Tool Asset Store Award] Amplify Shader Editor - Node-based Shader Creation Tool

Discussion in 'Assets and Asset Store' started by Amplify_Ricardo, Sep 13, 2016.

  1. Amplify_Ricardo

    Amplify_Ricardo

    Joined:
    Jul 19, 2013
    Posts:
    2,360
    I don't mean to be blunt, but whether TVE has a problem without ASE is not the point. The point here is that you're having issues when using a 3rd-party package. Contact the TVE developer first. If the 3rd-party developer concludes that the problem is on our end, we will resolve it directly with the developer; we don't provide support for 3rd-party packages directly, so we're not going to speculate about possible causes without some additional data.

    Just to be clear, our approach would be different if you could replicate it without TVE and not using a Beta/Alpha distribution.

    Contact the developer, we can take it from there.

    As for your deadline, please keep in mind that it might take up to a week or so to update ASE for any given official Unity release; possibly less, as we begin our work before it is released.
     
  2. perholmes

    perholmes

    Joined:
    Dec 29, 2017
    Posts:
    235
    Still, it's a shader made in Amplify, and double-clicking it crashes Amplify. Is it really possible for anything done in Amplify to crash Amplify itself? That sounds like JavaScript managing to crash Chrome.

    But it's OK, I can work around it, maybe make my few Amplify edits on another machine running LTS versions, and wait and see. I've also filed a report with the TVE developer, but it just feels far-fetched that he's even able to affect this.

    Anyway, I'm on a time scale measured in years, and I can wait. Just wanted to report a potential problem from the future.
     
  3. Amplify_Ricardo

    Amplify_Ricardo

    Joined:
    Jul 19, 2013
    Posts:
    2,360
    There's a myriad of possible issues/conditions that could crash ASE, or any other Unity extension for that matter. It's not far-fetched to assume extensions will conflict in some way; we see this almost every week. I'm not saying this is the case here; it could very well be our previews not working on the Beta/Alpha version for some reason.

    If you are able to replicate it with an LTS version, that's something we can follow up on.

    We appreciate you taking the time to report it. Like many other Beta/Alpha issues that crop up, there's a good chance this will be a non-issue on release. We will keep an eye out for any possible issues on our end.
     
  4. Cactus_on_Fire

    Cactus_on_Fire

    Joined:
    Aug 12, 2014
    Posts:
    635
    Is there a way to use the new stochastic 2D or triplanar sampling for normal maps? It works amazingly with diffuse maps, where it completely eliminates tiling like magic, but I couldn't get it to work with normals.
     
  5. castor76

    castor76

    Joined:
    Dec 5, 2011
    Posts:
    2,331
    Just a "friendly" nagging for early 2D Renderer support for URP 12... please. :D
    We can't try Unity 2021.2 because all the shaders won't compile for 2021.2...
     
  6. delaneyking

    delaneyking

    Joined:
    Jan 11, 2017
    Posts:
    8
    Hiya,
    I created a "see player through wall" shader, and wanted to make this a shader function that I can use across several shaders. I was thinking it would be smart if I synced all the values across all shaders, and obviously I will need to pass in the player's screen position and the size of the hole.

    As I want to tune this effect across all my materials, I was wondering if I can just fetch the variable values from a single source scriptable object and have them update across all materials that use the function. The manual mentions FETCH as an option for variables but doesn't actually say how.

    Any advice?
     
  7. perholmes

    perholmes

    Joined:
    Dec 29, 2017
    Posts:
    235
    I've made a proof of concept for stackable texture layers in Amplify. I've put it on GitHub as AmplifyLayerCompiler at https://github.com/perholmes/AmplifyLayerCompiler

    Basically, the idea is to create one super-duper texture layer with all the bells and whistles, and then stack it multiple times in Amplify. You'd use this for height-blending terrain textures or having smudge/damage layers on objects.

    This is currently not possible in Amplify, because a function doesn't get new property names or keywords if it's used a second time. If a function imports a texture or has a setting, this is the same setting for all uses of the function in the shader. Thus, every layer has the same settings. The only solution is to feed all the values from the outside, and that turns into a whopper quickly.

    This is a proof of concept that works, but it's not the right final solution if everyone agrees the idea is good and Amplify should support it natively. More on this below.

    For this demo, the texture layer function just mixes a texture with its input. So it has (a) a texture input, (b) an opacity slider, and (c) a static switch. These properties and keywords have _LAYERNUM appended to their names.

    upload_2021-8-8_23-15-52.png

    The _LAYERNUM suffix is simply appended to property names and keywords:

    upload_2021-8-8_23-21-59.png

    The layer compiler simply generates 4 separate functions with all these names replaced. So _ENABLELAYER_LAYERNUM becomes _ENABLELAYER, _ENABLELAYER1, _ENABLELAYER2 etc. More on the naming below.

    upload_2021-8-8_23-16-57.png

    These separate functions are then daisy-chained in the outer shader:

    upload_2021-8-8_23-18-14.png

    And since all the properties have suffixes that identify the layer, all the properties are individually addressable in a custom GUI:

    upload_2021-8-8_23-19-7.png

    The checkboxes are keywords that control the Static Switch. The rest are properties.

    On the naming, the first layer doesn't get a suffix at all. "_BaseMap_LAYERNUM" becomes "_BaseMap", "_BaseMap1", "_BaseMap2". This is so that the first layer has proper universal names for things, which will match if you change to another shader.
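    In pseudo-Python, the renaming step boils down to something like this (a simplified sketch for illustration only, not the actual AmplifyLayerCompiler code; `expand_layer_names` is a hypothetical name):

```python
def expand_layer_names(source: str, num_layers: int) -> list[str]:
    """Generate one copy of the function source per layer, replacing the
    _LAYERNUM placeholder with that layer's suffix.

    Layer 0 gets an empty suffix so its property names stay universal
    (e.g. _BaseMap), matching other shaders; later layers get 1, 2, ...
    """
    copies = []
    for layer in range(num_layers):
        suffix = "" if layer == 0 else str(layer)
        copies.append(source.replace("_LAYERNUM", suffix))
    return copies
```

    So a function containing "_BaseMap_LAYERNUM" expanded to 3 layers yields copies containing "_BaseMap", "_BaseMap1", and "_BaseMap2", exactly the naming described above.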

    This works, although it's already annoying to have to right-click and run the layer compiler every time you modify the texture layer. But this is just a conversation starter. If people like it, I think the two following solutions are more correct:

    Ideas for more correct solutions follow.
     
    Last edited: Aug 8, 2021
  8. perholmes

    perholmes

    Joined:
    Dec 29, 2017
    Posts:
    235
    Amplify Layer Compiler continued:

    MOST CORRECT SOLUTION:

    The most transparent way is if each usage of a function has a parameter called Layer Number:

    upload_2021-8-8_23-27-27.png

    Then people can simply add "_LAYERNUM" to properties or keywords that they want to adopt the layer number. Usage is then simply to use the function 4 times, but set them to 1, 2, 3, 4. Default option would be No Layers. I tried to hack this in Amplify, but my knowledge of the framework is too weak.

    LESS CORRECT SOLUTION:

    A function could have a parameter inside the function itself called "Compile To X Number Of Layers", and every time the function is saved, an additional X number of files are generated, one for each layer. This is what I do now, just more automatically. But it's spammy, and there's some manual labor in inserting the numbered functions one by one. And generating separate files is really an unnecessary step if the only goal is to get each layer rendered into the final shader with its own layer identity in the names.

    Best,

    Per
     
  9. Amplify_Ricardo

    Amplify_Ricardo

    Joined:
    Jul 19, 2013
    Posts:
    2,360
    You can unpack the Normal manually after the Procedural Sample node using the Unpack Scale Normal node.

    I'm afraid the answer is still the same. I wish we had more resources to help you here but, as a policy, no Alpha/Beta or Experimental support is provided; we can't support URP 12 until it's officially made available by Unity.

    Hey there,

    Sounds interesting, are you looking to sync a specific property or the resulting calculation?

    The purpose of Fetch is to access a variable declared in a cginc or in the actual shader template, for example. A Global property is probably your best option here, as you could set it via a simple script and apply it to all shaders/materials.

    You can even pass an array of positions if that's something you could have use for: http://amplify.pt/forum/viewtopic.php?f=23&t=640

    https://i.imgur.com/DCWHVGM.gif

    A good example of how to pass a specific value is our Smear example, located in the built-in samples.


    Greetings,

    That's a cool take, it's awesome seeing users expand on what ASE offers! We actually have "Material Layers" coming down the line, so this is something we will eventually tackle. I think there's also some room for optimization/sampler reduction here. We're also working on something new for ASE that will fit this type of use perfectly. (unannounced)

    That said, I will share this with our Discord community, you should join us if you have not already.

    https://discord.gg/zVdqVSp

    I will also ping our devs as they are not on Forum duty right now, I'm sure they will be very happy to see what you built, and possibly provide some insight.

    Thanks!
     
    Cactus_on_Fire likes this.
  10. perholmes

    perholmes

    Joined:
    Dec 29, 2017
    Posts:
    235
    Hi, thanks, I'm glad you're warm to the idea. This is primarily a conversation starter, in addition to solving my problem right now. In the end, I'm just trying to achieve the ability to stack fully featured functions and have each instance keep an independent identity.

    There are probably many ways to achieve this broad goal. In the end, though, it does have to end up with uniquely named properties and keywords for each instance, since this is the only way they can be addressed in an editor.

    The problem with my solution is that it doesn't do nesting. I don't know that this is important. I don't yet have a use for nesting, but once it's a feature, people will do it, and then you're dealing with the fallout. Maybe just prevent nesting for now.
     
  11. perholmes

    perholmes

    Joined:
    Dec 29, 2017
    Posts:
    235
    Indeed, sampler reduction would be dynamite. I don't believe even Better Shaders does this (which my approach is obviously inspired by), but I wondered about that. With more layers you have more identical samplers just sitting around.
     
  12. Cactus_on_Fire

    Cactus_on_Fire

    Joined:
    Aug 12, 2014
    Posts:
    635
    Terrain Blend 01.png
    Working on a tessellation-based terrain shader. It finds edges from the terrain normals and blends the grass with rocks automatically. Other than the trees and wooden structures, everything is displaced from the terrain.
     
    Amplify_Ricardo likes this.
  13. djancool

    djancool

    Joined:
    May 1, 2016
    Posts:
    6
    Hello all,

    When I use the "Switch by Face" or "Face" node, Amplify adds:
    HALF4 frag(VertexOutput IN , half ase_vface : VFACE) : SV_Target

    After updating to a new Unity version this breaks the shader and needs to be replaced with:
    float4 frag(VertexOutput IN, FRONT_FACE_TYPE ase_vface : FRONT_FACE_SEMANTIC) : SV_Target

    For now I make that change manually, but I'd like to be able to replace the injection that Amplify does. Is there a way to change this injection? Thanks in advance.
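    The manual edit above could be scripted as a post-process over the generated shader text (a hypothetical workaround, not an ASE feature; the regex is an assumption based on the signature shown):

```python
import re

# Match the fragment signature ASE injects for "Switch by Face",
# tolerating whitespace and case variations in the generated code.
OLD = re.compile(
    r"HALF4\s+frag\s*\(\s*VertexOutput\s+IN\s*,\s*half\s+ase_vface\s*:\s*VFACE\s*\)\s*:\s*SV_Target",
    re.IGNORECASE,
)

# Portable replacement using Unity's front-face macros.
NEW = "float4 frag(VertexOutput IN, FRONT_FACE_TYPE ase_vface : FRONT_FACE_SEMANTIC) : SV_Target"

def patch_shader(text: str) -> str:
    """Rewrite the injected VFACE signature to the macro-based form."""
    return OLD.sub(NEW, text)
```

    Run against each generated .shader file after saving from ASE; it leaves files without the old signature untouched.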
     
  14. perholmes

    perholmes

    Joined:
    Dec 29, 2017
    Posts:
    235
    Hi,

    I have a strange problem: an expression node upstream from a static switch always evaluates. Here I have an expression that's evaluating some pass-level global state by shifting the colors: red to green, green to blue, etc.

    upload_2021-8-10_10-36-20.png

    I was expecting that when the switch is off, the nodes upstream from the True input wouldn't evaluate. But they affect the global state regardless of the state of the switch. And I do see it always running in the generated code: the upstream results are always calculated, and the switch only chooses, via a keyword, which result to use.

    upload_2021-8-10_10-35-33.png

    Is there something I've misunderstood? Because if every upstream branch always evaluates, you're no better off than with a runtime "if".

    Now, I'm finding this out because this node is an expression, and it operates on a global matrix and writes back to it. It also means it's not a problem to also wrap this in "#ifdef", so this is more a matter of principle. Weren't we expecting the compiled shader variant to remove results that aren't used? Or is it because I call a function from the expression node that the compiler can't figure out that it's upstream from an unused node, and always executes it?

    And in general, isn't the static switch the correct way to disable large sections of nodes in a shader variant? Is there a risk that it happens often that the upstream nodes from an unused input actually do evaluate?

    UPDATE: I can confirm after more testing that it's only the expression node that's always being evaluated. Anything I put after the expression but before the True input gets wrapped into the #if KEYWORD section of the static switch.

    UPDATE 2: Actually, I see everywhere now that when you use an expression, most optimizations go away. I get it, of course: it's impossible for you to reason about dependencies in the text written into the expression. But basically, expressions always evaluate, even if they're before static switches that are off.
     
    Last edited: Aug 10, 2021
  15. Amplify_Ricardo

    Amplify_Ricardo

    Joined:
    Jul 19, 2013
    Posts:
    2,360
    Very nice!

    Hello,

    Which Unity version and shader type are breaking? Perhaps this is something we should examine on our end.

    To confirm, what's your current ASE version? (Window > Amplify Shader Editor > About)

    Greetings,

    Our Static Switches are equivalent to what you'd expect in manually coded shaders; if not, we'd be happy to test specific cases with replicable examples. (support@amplify.pt)

    However, in most cases it's best to check the compiled code, as the compiler will handle that 99% of the time, meaning the Expression will not be calculated in the actually executed shader. Can you elaborate on what you tested to determine that it's executed, so that we may best examine any possible issues?

    Using Custom Expression by itself doesn't negate optimizations; it always depends on how it's used. And a Static Switch is not equivalent to using regular IF or other Logical Operator nodes.

    If you prefer, we're available to discuss this via the official support email.
     
  16. perholmes

    perholmes

    Joined:
    Dec 29, 2017
    Posts:
    235
    I've tested the above example again, and the code in the expression always runs, regardless of the state of the switch. It seems the compiler refuses to optimize it out.

    The two things that stick out are first, that the expression operates on a global (I now have my logic directly in the expression):

    upload_2021-8-10_13-0-54.png

    And secondly, Amplify always wraps expressions in a function call:

    upload_2021-8-10_13-1-34.png

    The compiler refuses to optimize this out when the result isn't used in the switch, either because I operate on a global, or because it's wrapped in a function call.
     
  17. djancool

    djancool

    Joined:
    May 1, 2016
    Posts:
    6

    I am using ASE version: 1.8.8
    and unity version: 2020.3.14f1
     
  18. Amplify_Ricardo

    Amplify_Ricardo

    Joined:
    Jul 19, 2013
    Posts:
    2,360
    Hey there, no revision number? If you're using the old 1.8.8 r1, please update and resave your shader. You can do this via the Asset Store (v1.8.9.012) or our website (v1.8.9.013).

    If the problem persists, please forward the shader to support@amplify.pt for further examination.
     
  19. shadowmatt

    shadowmatt

    Joined:
    Jun 2, 2013
    Posts:
    10
    Hey all. I'm new to Unity/Amplify Shader, and I was wondering if there is a way to create a shader that follows the shape of the object, applying a procedural texture. If I use the procedural node it kind of gives me what I'm after, but it is blurred as standard. If I apply a texture to a Triplanar Sample it kind of works, but if I rotate the object it warps and blurs the texture (bottom image).

    What I'm after is the ability to have a few shapes mapped with the checkerboard, but where, if two objects intersect, their textures line up. I understand using world space makes this work, but then any surface not along an axis warps said texture.

    Screenshot 2021-08-17 at 13.12.50.png Screenshot 2021-08-17 at 13.13.04.png
     
  20. Amplify_Ricardo

    Amplify_Ricardo

    Joined:
    Jul 19, 2013
    Posts:
    2,360

    Hey there!

    As we mentioned via Discord, you can do some manual calculations to account for this:
    https://forum.unity.com/threads/accessing-screen-position-after-vertex-position-change.673684/



    In your case, use the Grab Screen Pos node.
     
  21. shadowmatt

    shadowmatt

    Joined:
    Jun 2, 2013
    Posts:
    10
    Thanks I will give it a go.
     
    Last edited: Aug 17, 2021
  22. Barliesque

    Barliesque

    Joined:
    Jan 12, 2014
    Posts:
    84
    Hi!
    I'm applying a vertex offset and trying to use a screen-space effect. Unfortunately, the Screen Position node is calculated without taking vertex offset into account. Is there any way to change that? Or is this to be a feature request?
     
  23. Amplify_Ricardo

    Amplify_Ricardo

    Joined:
    Jul 19, 2013
    Posts:
    2,360
    Hey,

    This post above might help https://forum.unity.com/threads/bes...ader-creation-tool.430959/reply?quote=7425401

    Let us know if that's not the case, we could use additional details.
     
  24. Barliesque

    Barliesque

    Joined:
    Jan 12, 2014
    Posts:
    84
    :rolleyes: I should've read that one a little closer!

    Yes, that does resolve the problem. I think it's still worth requesting the feature of a tickbox on nodes like Screen Position that would allow it to be optionally calculated after vertex offset is applied. The difference being: tick a box versus adding 8 new nodes; it keeps complexity down and makes life a lot easier.

    Thanks for your help!
     
    Last edited: Aug 20, 2021
  25. Amplify_Ricardo

    Amplify_Ricardo

    Joined:
    Jul 19, 2013
    Posts:
    2,360

    No problem!

    Absolutely, we'll include a Shader Function for that later in the day. Until then, you can always create your own; it's quite handy for reusable graphs: https://wiki.amplify.pt/index.php?title=Unity_Products:Amplify_Shader_Editor/Manual#Shader_Functions
     
  26. adamz

    adamz

    Joined:
    Jul 18, 2007
    Posts:
    1,106
    @Amplify_Ricardo Hey, is there a way to deform the surface of a mesh relative to another mesh? For example; I'd like to have the surface of the balloon indent when my VR finger touches it. Any thoughts would be appreciated!
     
  27. Amplify_Ricardo

    Amplify_Ricardo

    Joined:
    Jul 19, 2013
    Posts:
    2,360
    Yes, but not shader to shader; you will have to pass in specific data that will in turn be used to deform the mesh. A simple way would be to pass the fingertip position to the balloon material and use that to control the position of the Sphere Mask, for example; it can even be an array.

    http://amplify.pt/forum/viewtopic.php?f=23&t=640
     
    adamz likes this.
  28. mjako64

    mjako64

    Joined:
    Feb 9, 2017
    Posts:
    5
    Hello! Firstly just want to say thanks for making such an amazing product. It's intuitive and robust, and just all around great to use.

    Recently I've run into an issue with lightmapping when using the masked/cutout blend mode. I have trusses on the bottom of these catwalks that are set up with an alpha mask.

    2021-08-26_10-28-28.png

    2021-08-26_09-16-14.png

    On the left side I baked with the Standard shader set to "Cutout", and on the right side with a similar shader set up in ASE with the "Masked" blend mode preset. On the Standard shader version the mask is taken into account when baking, but on the ASE material I guess the whole plane is just being considered opaque, without any masking, resulting in everything underneath it turning pitch black. I also had double-sided global illumination checked on both materials. Our team is currently using the built-in render pipeline on 2019.4.20f1.

    Sorry in advance if this has already been asked about. I couldn't find any results so far.

    Edit: I did a little more testing and this applies to direct lighting as well if the lights are set to cast shadows.
     
    Last edited: Aug 26, 2021
  29. Amplify_Ricardo

    Amplify_Ricardo

    Joined:
    Jul 19, 2013
    Posts:
    2,360
    Hey there, thank you, hope we can help!

    Double-sided global illumination is usually one of the culprits; please try naming your Albedo "_MainTex" (click the padlock to force the property name).

    Alternatively, send the shader over to support@amplify.pt and we'll have a quick look.
     
  30. mjako64

    mjako64

    Joined:
    Feb 9, 2017
    Posts:
    5
    And now it's working perfectly :D
    Thanks for the help, you guys are awesome!
     
    hopeful likes this.
  31. Amplify_Ricardo

    Amplify_Ricardo

    Joined:
    Jul 19, 2013
    Posts:
    2,360
    Awesome, this way Unity's lightmapping process knows what to look for.

    Happy to help!
     
  32. Cactus_on_Fire

    Cactus_on_Fire

    Joined:
    Aug 12, 2014
    Posts:
    635
    I couldn't find a way to apply texture-based depth offset using ASE. Is there a template or tutorial for it?
     
  33. Amplify_Ricardo

    Amplify_Ricardo

    Joined:
    Jul 19, 2013
    Posts:
    2,360
    I'm afraid there are no tutorials for that; HDRP does have a specific input for Depth offset.
     
  34. Cactus_on_Fire

    Cactus_on_Fire

    Joined:
    Aug 12, 2014
    Posts:
    635
    I noticed that the transmission feature dims the transmission lighting when it's viewed from the side, possibly based on its world normals. Is there a way to prevent that, so the backlighting from the transmission texture is the same intensity from all angles?
     
  35. Amplify_Ricardo

    Amplify_Ricardo

    Joined:
    Jul 19, 2013
    Posts:
    2,360
    I don't think we have much control over this, what's the Shader Type and Renderer used? You might be looking at creating your own effect instead of using the provided Unity Transmission effect; this will depend on the renderer.
     
  36. Cactus_on_Fire

    Cactus_on_Fire

    Joined:
    Aug 12, 2014
    Posts:
    635
    It's the surface/standard shader using the forward path. I'm using the old Unity 2019 3D renderer, not any of the new pipelines. The grass shaders simply apply the transmission to the back of the lit faces without light dimming, so I doubt Unity multiplies the transmission output by the viewing angle by default.
     
  37. Amplify_Ricardo

    Amplify_Ricardo

    Joined:
    Jul 19, 2013
    Posts:
    2,360
    I see; with Surface Shaders there's a good chance we're just accessing what Unity provides there. I'll confirm with the devs. In any case, since you're using the Built-in renderer and Forward, you could possibly handle this in a Custom Lighting shader; that should give you the control you require, but it would involve creating it from scratch.
     
  38. Cactus_on_Fire

    Cactus_on_Fire

    Joined:
    Aug 12, 2014
    Posts:
    635
    Ok thanks, the dimming disappeared with custom lighting.
     
    Amplify_Ricardo likes this.
  39. Sisay

    Sisay

    Joined:
    Dec 6, 2012
    Posts:
    29
    RenderingMode:{Opaque:{_Cutoff}, Cutout:{}, Fade:{_Cutoff}, Transparent:{_Cutoff}}}

    How can I add a selection in the material between Opaque, Cutout, and Transparent to the standard Amplify shader?
     
  40. Amplify_Ricardo

    Amplify_Ricardo

    Joined:
    Jul 19, 2013
    Posts:
    2,360
    That's beyond the scope of the shader provided; it would require quite a lot of customization. You should consider creating individual shaders instead of an "uber shader".

    Shader Functions will help you share graph operations between shader variants.
     
  41. Cactus_on_Fire

    Cactus_on_Fire

    Joined:
    Aug 12, 2014
    Posts:
    635
    Is there a way to mirror the same normals to the other side of the faces in double-sided shaders?
    I'm on the standard 3D template, not any of the new pipelines.
     
    Last edited: Sep 7, 2021
  42. Amplify_Ricardo

    Amplify_Ricardo

    Joined:
    Jul 19, 2013
    Posts:
    2,360
    Something like this should help!
     
  43. Cactus_on_Fire

    Cactus_on_Fire

    Joined:
    Aug 12, 2014
    Posts:
    635
    Thanks, it worked!
     
    Amplify_Ricardo likes this.
  44. Cactus_on_Fire

    Cactus_on_Fire

    Joined:
    Aug 12, 2014
    Posts:
    635
    Uh, I noticed that it causes strange flickering when I'm viewing the model near the edges now. It's more obvious on smooth models.
     
  45. Amplify_Ricardo

    Amplify_Ricardo

    Joined:
    Jul 19, 2013
    Posts:
    2,360
    Can't say we've come across that before; is it in the actual object shader, or could it be some kind of post-processing? An example would be very helpful.
     
  46. Cactus_on_Fire

    Cactus_on_Fire

    Joined:
    Aug 12, 2014
    Posts:
    635
    The one with the dark flickering is the exact same shader, with just the local vertex normal nodes I added from your post.

     
  47. i16yue

    i16yue

    Joined:
    Aug 2, 2015
    Posts:
    3
    Unity: 2019.4.29
    URP: 7.7.1
    ASE: Version: 1.8.9.012 • Jul 12, 2021

    ShaderError:
    invalid subscript 'normalizedScreenSpaceUV'
    failed to open source file: 'Packages/com.unity.render-pipelines.universal/ShaderLibrary/UnityGBuffer.hlsl'

    UnityGBuffer.hlsl only appears in URP 10 (which only works in Unity 2020).

    My project has a lot of bugs running in Unity 2020, so is there any solution that supports 2019?
     
  48. Amplify_Ricardo

    Amplify_Ricardo

    Joined:
    Jul 19, 2013
    Posts:
    2,360
    Greetings,

    When did this problem first appear?

    If you're updating to 2020, you'll have to update your shaders to the SRP version used by 2020 by opening and resaving them. Be sure to get the latest ASE version first.
     
  49. Amplify_Ricardo

    Amplify_Ricardo

    Joined:
    Jul 19, 2013
    Posts:
    2,360
    Not really sure in this particular case but we'd be happy to examine a small sample. (support@amplify.pt)
     
  50. matjumon

    matjumon

    Joined:
    Mar 20, 2019
    Posts:
    6
    Hello! I just started using ASE and I must say I'm loving it so far!

    I have a question though. We are trying to set up a code/shader library for reuse in future projects; the idea is to set it up so that we can import it as a local package through the Package Manager. However, since by default ASE #include directives reference files via absolute paths, shaders I make in the library project work fine there, but break when we import the local package into a separate project (i.e. the shader looks for the HLSL file in Assets/ instead of Packages/).

    Is there any way to change this behavior and have the #include use a relative path instead?
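    For context, the kind of rewrite we're currently doing by hand looks like this (a hypothetical cleanup script, not an ASE feature; the package id and folder names are placeholders for our actual setup):

```python
from pathlib import Path

# Placeholder names: substitute your real package id and the library
# folder the shaders originally lived in under Assets/.
PKG = "Packages/com.mycompany.shaderlib"
OLD_PREFIX = "Assets/ShaderLib"

def fix_includes(shader_text: str) -> str:
    """Point absolute Assets/ include paths at the package instead."""
    return shader_text.replace(f'#include "{OLD_PREFIX}', f'#include "{PKG}')

def fix_all(root: str) -> None:
    """Rewrite every .shader file under root in place."""
    for path in Path(root).rglob("*.shader"):
        path.write_text(fix_includes(path.read_text()))
```

    It works, but it has to be rerun every time a shader is resaved in ASE, which is why a relative-path option in ASE itself would be much nicer.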
     