
[Best Tool Asset Store Award] Amplify Shader Editor - Node-based Shader Creation Tool

Discussion in 'Assets and Asset Store' started by Amplify_Ricardo, Sep 13, 2016.

  1. DGordon

    DGordon

    Joined:
    Dec 8, 2013
    Posts:
    649
    How does having a C# script set properties on the shader affect performance? Will it break batching or anything like that? I can easily just have a few ints I check against in the shader that get set by a component on a per-tile basis, but I don't know if that's the best way to handle the above use case.

    Thanks again.

    Also, alternatively, is there any way to check where a given object intersects with my floor, and based on that it draws moss around the model shape?

    Sorry for all the questions. I'm not a shader guy at all, so I may be asking some very basic stuff. It's a testament to your product that I can even attempt this :).
     
    Last edited: Jul 13, 2018
  2. sacb0y

    sacb0y

    Joined:
    May 9, 2016
    Posts:
    874
    This helps with my shadow issue as filtering point lights works as a solution.

    My question is: why does the 'world position' node react to point lights at all if connected to custom lighting on its own?

    I guess I'm expecting custom lighting to give me raw data, but it seems like it's got other data too.
     
  3. Amplify_Borba

    Amplify_Borba

    Joined:
    Jul 24, 2017
    Posts:
    538
    Hey, no problem! A shader's performance depends mostly on the complexity of its operations ( for example, multiplication being faster than division ), so it's more about what you do with the values passed via scripting than feeding them into the shader.

    You can either resort to scripting in order to determine where there are intersections, or do it through the shader using nodes such as the Depth Fade node, which allows you to get the intersection on the camera point of view.
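    The math behind that depth-based intersection test can be sketched independently of any node setup. A minimal Python sketch (illustrative names; not ASE's generated code):

```python
def depth_fade(scene_depth, surface_depth, fade_distance):
    # 0 where the surface touches the scene geometry behind it,
    # rising to 1 once the gap exceeds fade_distance (a saturate
    # of the normalized depth difference)
    gap = (scene_depth - surface_depth) / fade_distance
    return max(0.0, min(1.0, gap))
```

    Masking a moss texture by `1 - depth_fade(...)` would then concentrate it where a model meets the floor.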

    Feel free to also check out our ForceShield and Smear samples, which provide examples on how you can use scripting together with shaders.


    Happy to help!

    Regarding your question, all nodes pass a certain type of data to the ports they are connected to, which will be interpreted whether or not it comes from a correct / expected node setup. Like I've said in the previous reply, Custom Lighting expects users to do the lighting calculation; however, it does not ignore any other data that you feed into it.
     
  4. wetcircuit

    wetcircuit

    Joined:
    Jul 17, 2012
    Posts:
    1,409
    I'm getting a "not completely initialized" error when I try to add tessellation to my shaders…
    Unity2018.2 (latest version of ASE)

    Shader error in 'Velv': 'vertexDataFunc': output parameter 'v' not completely initialized at line 66 (on metal)

    Compiling Vertex program with UNITY_PASS_DEFERRED LIGHTPROBE_SH UNITY_HDR_ON INSTANCING_ON
    Platform defines: UNITY_ENABLE_REFLECTION_BUFFERS UNITY_USE_DITHER_MASK_FOR_ALPHABLENDED_SHADOWS UNITY_PBS_USE_BRDF1 UNITY_SPECCUBE_BOX_PROJECTION UNITY_SPECCUBE_BLENDING UNITY_ENABLE_DETAIL_NORMALMAP SHADER_API_DESKTOP UNITY_LIGHT_PROBE_PROXY_VOLUME UNITY_LIGHTMAP_FULL_HDR


    Just activating Tessellation breaks the shader:
    Screen Shot 2018-07-14 at 2.51.40 PM.png
     
    Last edited: Jul 14, 2018
  5. DGordon

    DGordon

    Joined:
    Dec 8, 2013
    Posts:
    649
    Any idea when ASE will work with HDSRP?
     
  6. DigitalK064

    DigitalK064

    Joined:
    Jul 6, 2018
    Posts:
    8
    Hi.
    Can anyone help me create a new template using the Sprites/Diffuse shader? I'm doing a 2D game, but I would like to have proper lighting while still being able to create sprite shaders, not using the default 3D diffuse ones. I've tried following the documentation, but the shader is a surface one, so I have no idea what to do. Any help would be greatly appreciated!
    Link to the shader if needed: https://pastebin.com/MB6Bts44
     
  7. o1o101

    o1o101

    Joined:
    Jan 19, 2014
    Posts:
    639
    @Amplify_Borba I have several house models using a custom building shader I made in Amplify, and I need to use a baked AO map on each different house model. I would like for them to all share the same materials so they can batch. Is there any way to have separate AO textures while still having all houses share the same materials, or will I have to create different materials for each house and assign them?
    By the way I am not talking about a texture AO map but a regular baked AO map for the model in a separate UV set.
    I also can't use vertex colors for baking the AO, or Unity's lightmapper.
     
  8. Amplify_Borba

    Amplify_Borba

    Joined:
    Jul 24, 2017
    Posts:
    538
    There is a known Unity issue that could be related to the error you're experiencing, in which tessellation does not work with GPU Instancing, causing a compile error.

    Could you also confirm that you have GPU Instancing disabled on the material?

    Also, are you using tessellation in OSX?


    Although we have no official ETA, it's currently being worked on by our developer, so it should be available soon!


    Hello! We provide a 2D Sprite template with ASE, but you'll have to do the illumination calculations manually.

    The shader you shared is a Surface shader with the Lambert light model, so you can simply set up a Standard Surface shader with this light model if you're not comfortable with writing custom shaders, but do note that you'll have to keep a few things in mind, such as making sure that you have a Texture Sample node with its property name set to _MainTex, for example.


    In this situation, you'll have to make sure that all the house textures are available in the material that's going to be shared by all the houses. I would suggest creating an atlas or using a Texture Array, as its index is a property that can be marked as GPU Instanced and passed through a material property block by script.
     
    Last edited: Jul 16, 2018
  9. DigitalK064

    DigitalK064

    Joined:
    Jul 6, 2018
    Posts:
    8
    So is there any easy way to make lit sprite shaders with ASE? Is using the standard surface shader slower than using the sprites shader?
     
  10. bitinn

    bitinn

    Joined:
    Aug 20, 2016
    Posts:
    961
    Hi all,

    Just a quick question: I want to build my own toon shader with Amplify, and I want to use deferred shading because I want it to work with SSR. Is my only option to modify the standard shader base directly?

    (Given its light model is hardwired and Amplify shader's custom lighting model is forward-only.)

    Thx in advance!
     
  11. daville

    daville

    Joined:
    Aug 5, 2012
    Posts:
    303
    Hi, I'm testing Post Effects... and I'm having trouble finding out which nodes I can use and how to get info from the scene.
    Question 1: I want to obtain the z-depth of the scene and plug it into the Remap node. How do I do that?

    upload_2018-7-17_1-47-23.png

    I made this other example I want to replicate, there I'm using Scene Depth

    upload_2018-7-17_1-50-30.png

    it should look something like this

    upload_2018-7-17_1-51-23.png

    ==============================================================================

    Second question: what other info similar to render passes can I get from the scene? Like, can I get the shadows? Or normals? Or light direction? Intersection?
     
  12. halley

    halley

    Joined:
    Aug 26, 2013
    Posts:
    2,433
    I am loving my first foray into ASE. I've owned ShaderGraph since it was a beta, so I was bummed to see it languish. Then I tried out the graph editor for LWRP but was underwhelmed.

    One of the really big strengths of ASE is the large number of examples. Here I demo two simple extensions to a classic "burn effect" because I didn't want my scripts to have to micro-manage the effect. I trigger the animation once and the shader takes care of the animation aspect.



    I haven't benchmarked the shaders. I'm a bit afraid of how much slower it might be than the standard shader, when it has not yet been triggered to animate. But I don't really know a good way to benchmark shaders. What is the best approach to measuring shader performance and load on a GPU, without a thermometer?
     
  13. KRGraphics

    KRGraphics

    Joined:
    Jan 5, 2010
    Posts:
    4,467
    @Amplify_Borba So, I think I just found the issue with my eye and hair shader, since I am using custom lighting.

    My only way around this currently is to make the shader Standard Lighting for now... and pray it works the way I want.
     
    wetcircuit likes this.
  14. Amplify_Borba

    Amplify_Borba

    Joined:
    Jul 24, 2017
    Posts:
    538
    If you intend to make your own custom template or do the light calculations yourself, it can be a bit overwhelming if you're a beginner to shader development. However, you can always use a Standard Surface shader since it gives you access to all of Unity's Light Models through a few clicks.

    In the screenshot shared below, I've added a Texture Sample with its property name set to _MainTex, connected it to the Albedo port and selected Lambert as the Light Model, as an example:



    In general, Shader Templates are faster than Standard Surface shaders, since you can choose which features to enable and use, but it ultimately depends on the complexity of the shader itself.


    Unfortunately, since our Custom Lighting model is Forward only, we don't recommend using it for Deferred shading.


    Hello! You'll need to use the Screen Depth node in place of the Scene Depth from the SF sample and to make sure that your camera is properly set up as well.

    You should be able to get all the information through our Camera and Screen and Surface Data nodes, and we also provide two post-processing samples, Sobel and FilmGrain, that you can examine.

    If you believe that there's any issue with the shader, feel free to send us a sample over to support@amplify.pt for further examination, thanks!


    Hey there, thank you for your support, we're happy you're enjoying ASE!

    Regarding your question, there are a few tools that allow for a certain degree of benchmarking, such as AMD's Shader Analyzer, RenderDoc and GPUOpen. Still, most of it boils down to experience and knowledge of shader language, as shader performance will ultimately depend on the complexity of the created networks and on specific considerations such as using the lowest precision possible for each data type and performing multiplications instead of divisions, for example.


    Apologies if I failed to mention that detail before, please let us know if you come across any issues!
     
    daville likes this.
  15. Cactus_on_Fire

    Cactus_on_Fire

    Joined:
    Aug 12, 2014
    Posts:
    675
    I'm thinking of switching from Shader Forge to ASE, since my current problem can't be solved with Shader Forge and support for Shader Forge is over for newer Unity versions. I can't use the light attenuation and light direction nodes in deferred shading mode; they get locked down and it doesn't allow me to use them.

    Does ASE support the light direction and light attenuation nodes for the deferred shading option?
     
    Last edited: Jul 17, 2018
  16. wetcircuit

    wetcircuit

    Joined:
    Jul 17, 2012
    Posts:
    1,409
    I think GPU Instancing was the issue, thank you…. I wasn't sure because I backtracked to turn off everything I could remember enabling and got something working. It feels like praying to Santeria candles when it starts working again and you don't know why it broke in the first place, haha…. I feel more confident hearing it confirmed.

    Unity - RenderCameraObject 2018-07-14 at 18.31.15.png

    I am using OSX (the project player is set to OSX), if that is what you mean…?
     
    hopeful likes this.
  17. daville

    daville

    Joined:
    Aug 5, 2012
    Posts:
    303
    Thanks, it works, now I can have access to the Depth info
    upload_2018-7-17_8-49-33.png
    And now I can do crazy things with that :D

    I was able to get the Light Direction, Color and Position correctly, but I still can't get the Normal Direction or World Position from the Post Effect

    upload_2018-7-17_8-52-21.png

    I would like to get info on the normals and world position of the objects seen by the camera, similar to render passes... I assume that if I can get access to the render passes in red, I could calculate some others; since I already have the light direction, I should be able to do a dot product with the normals to get more info, and so on.
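    For reference, the dot product with the normals mentioned above is the classic Lambert diffuse term. A minimal sketch in plain Python (illustrative, not a post-effect implementation):

```python
def lambert(normal, light_dir):
    # N·L clamped to zero: diffuse intensity from a surface normal
    # and a normalized direction towards the light
    d = sum(n * l for n, l in zip(normal, light_dir))
    return max(0.0, d)
```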

    upload_2018-7-17_8-57-3.png

    Sorry if I'm asking too much; I already tried the nodes from the documentation, but most of them return a solid color, so perhaps I'm doing something wrong.
     
  18. KRGraphics

    KRGraphics

    Joined:
    Jan 5, 2010
    Posts:
    4,467
    Will do. I will post my findings
     
  19. KRGraphics

    KRGraphics

    Joined:
    Jan 5, 2010
    Posts:
    4,467
    SWEET!!! Just updated my Advanced Eye Shaders to Full Deferred

    Screenshot 2018-07-17 12.00.48.png
    KRGraphics Advanced Eye (OLD)

    Screenshot 2018-07-17 12.00.56.png
    KRGraphics Advanced Eye (Full Deferred Rendering)

    Screenshot 2018-07-17 11.58.39.png

    Screenshot 2018-07-17 12.01.04.png


    @Amplify_Borba The conversion was easier than I thought... just a few lerp nodes and that did the trick... of course, a few things are not working correctly now, such as cataracts working only with a flat grey, and Iris Specularity... but I hope you guys will get custom lighting working with deferred rendering, especially for hair (I put that on hold).
     
    wetcircuit likes this.
  20. Amplify_Borba

    Amplify_Borba

    Joined:
    Jul 24, 2017
    Posts:
    538
    Hello, thank you for your interest!

    Unfortunately, Custom Lighting is Forward Only, but if you're comfortable with shader programming, I recommend looking into implementing your own Deferred Custom Lighting type using our flexible Template System.

    Apologies for any inconvenience caused.


    Thank you for confirming that the issue was caused due to GPU Instancing, glad to know that the culprit was found!

    I had asked in regards to OSX since there have been reports of Unity's tessellation not working correctly on Mac in the past, but since the above is confirmed we can disregard this.


    Glad to be of assistance!

    You may get the normals in View Space through the setup in the screenshot below:



    The Object to World node, if its input is left unconnected and its default internal value of (0,0,0,1) is used, will return the current game object's actual position in the world.

    We've also recently introduced the Transform Position and Transform Direction nodes to allow for transforming between spaces.


    Awesome, thank you for sharing your progress!

    Unfortunately, we've already looked at what needs to be added for Deferred custom lighting and, sadly, it seems to be a wasted effort, at least for now. While Unity does allow you to add custom behavior to some parts of the Deferred pass, it will not work correctly in most situations.

    Although we haven't totally given up on this, for now it seems to be out of the scope of ASE. Because Deferred treats everything the same way, doing the lighting in screen space, it defeats the purpose of having one special object behave differently from the rest; that's what forward rendering is for. In order to change this and create true custom lighting for deferred, the internal shader would have to be changed.
     
  21. KRGraphics

    KRGraphics

    Joined:
    Jan 5, 2010
    Posts:
    4,467
    And that would be a monumental task to get working... I am still waiting to get skin shaders working (not a programmer), like Pre-Integrated and Screen Space Subsurface Scattering.
     
  22. bitinn

    bitinn

    Joined:
    Aug 20, 2016
    Posts:
    961
    Unfortunately, that's exactly what I am trying to do: I need deferred shading due to lighting and post-processing requirements, but I also need a custom light model to create a toon-ish style.

    I know I can modify the standard shader itself to achieve this. Just wondering if ASE can still help me, like:

    - I want to dump the standard base, as it's the main bottleneck for me (both performance and shading limits): if I can create a custom node that does the simple BRDF calculation, then I just need a diffuse shader base, which should work just fine in the deferred path.

    - I know this is possible because I am not using any of the rendering options: not even ambient light or lightmaps. So really most of the standard shader features are not useful for me, but I still want to use ASE to author my shader logic, if possible.

    Does it sound doable to you?
     
    Last edited: Jul 17, 2018
  23. xxPillsxx

    xxPillsxx

    Joined:
    Oct 10, 2012
    Posts:
    31
    Hi. It seems like there's a bug: when targeting iOS, if you use the Toggle Switch node and connect a Color from a shader function to the input, the shader will break. I have successfully reproduced it in new empty projects. Using Unity 2018.2.0f2.
    Edit: Reinstalling Unity solved this.

    Edit 2: So while it doesn't make the shader purple anymore, it still doesn't work properly (the whole object disappears altogether). After carefully reconstructing the shader function, I've found out that connecting a shader function input of type sampler2D to a texture sample will break the shader on iOS (particularly Metal).
     
    Last edited: Jul 18, 2018
  24. Amplify_Borba

    Amplify_Borba

    Joined:
    Jul 24, 2017
    Posts:
    538
    Unfortunately, at least for now, we won't be able to offer support for implementing deferred in custom lighting, for the reasons stated in a previous reply.

    We apologize for any inconvenience caused but, as previously stated, we've tried to tackle this before and hit several roadblocks, so we will have to revisit this at a later time to see if it's indeed possible to get it to work without any issues.

    I've registered your interest so that we may get back in touch when we have any developments regarding this matter, thank you for understanding.


    Hello, glad to know the issue has been solved by reinstalling Unity, thank you for reporting back!
     
    bitinn likes this.
  25. xxPillsxx

    xxPillsxx

    Joined:
    Oct 10, 2012
    Posts:
    31
    You didn't read my second edit. After further investigation, it seems like if you don't connect anything to a shader function input of type Sampler2D, it will break the shader, but only on iOS; all other platforms are fine. This is annoying because not all the inputs have to be used, and right now I have to make variants of the shader depending on which ports I need.
     
  26. Amplify_Borba

    Amplify_Borba

    Joined:
    Jul 24, 2017
    Posts:
    538
    Oh, it seems that I forgot to refresh the tab before replying!

    Thank you for sharing details regarding this issue, I'll pass it along to the developer for further investigation and will let you know as soon as we have any developments.
     
  27. Amplify_RnD_Rick

    Amplify_RnD_Rick

    Joined:
    Feb 15, 2016
    Posts:
    528
    Hey guys,

    We just finished uploading a new build into our website.

    And here are the release notes.
    Release Notes v1.5.4 dev 06:
    • Fixes:
      • Fixed issue on 'Toggle Switch' node not taking Custom Attributes into account
      • Fixed out of bounds exception over 'Static Switch' node
      • Fixed issue on templates port linking behavior
      • Fixed issues on 'Texture Array', 'Triplanar Sample' and 'Unpack Scale Normal' nodes on Lightweight SRP
      • Fixed issue on shader function directives not being written on template based shaders
      • Fixed serialization issue on saving/loading shader function directives
      • Fixed issue with using 'Texture Array' node inside shader functions
      • Fixed shader compilation errors on 'Parallax Occlusion Mapping'
        • Now it does not generate code if no 'Texture Object' node is connected to it
      • Fixed issue on 'Grab Screen Color' node not updating reference list correctly when copy/pasted
      • Fixed issue with Undo'ing a 'Grab Screen Color' node on reference mode
    • Improvements:
      • Min and Max Samples options on 'Parallax Occlusion Mapping' node are now inline options
    We would like to apologize for the delay on delivering this latest build, but hopefully we will be able to drop a proper Highly-Defined reason over the next week! :D

    Hope you all have an amazing weekend and happy shader creations!
     
    Last edited: Jul 20, 2018
    KRGraphics and marcatore like this.
  28. bitinn

    bitinn

    Joined:
    Aug 20, 2016
    Posts:
    961
    Hi all,

    Is this issue known with lightweight rendering pipeline on 2018.2? (I am using SRP lightweight 3.0.0-preview)

    Screen Shot 2018-07-21 at 23.10.55.png

    My shader is a really basic one:

    Screen Shot 2018-07-21 at 23.08.31.png

    My observation is: with scene lighting disabled, this warning continues to be thrown, and the object is transparent in the scene. Perhaps it's a Unity 2018.2 issue, but I have seen similar issue reports that were fixed in 2018.2.

    Screen Shot 2018-07-21 at 23.20.27.png
    Screen Shot 2018-07-21 at 23.20.33.png

    I will submit a bug report to Unity, but hopefully someone has seen this problem and has a workaround.
     
  29. FeastSC2

    FeastSC2

    Joined:
    Sep 30, 2016
    Posts:
    978
    Hello again!

    I'm trying to recreate the shader built in this tutorial but I can't figure out how to do it in Amplify.



    If you should try and succeed in recreating it in Amplify without meshes I'd be very curious! ;)
    I attached a package to show where I'm at in the shader, but if you don't have the time that's alright of course.

    Here are the main reasons I'm not capable of recreating the effect:

    1) I want to twist the UVs of a shader I have in a similar way to this tutorial:
    I never quite understood how to use normals to modify the UVs in a way that makes sense.
    Is that how to do it? When building the shader, however, it does not do what I want.
    https://i.imgur.com/4zCznIP.png

    2) There are some interesting nodes in Shadero to create things like Twist, Spherize and a bunch of other effects that one could find in the filter effects of Photoshop. How can I recreate these kinds of effects in Amplify Editor?
    -> I assume it's possible to do it using Photoshop with its filters and then creating some textures that I would multiply with my UVs, but what kind of base texture should I apply the Photoshop filters to in order to properly multiply it with my UVs?
    https://i.imgur.com/kqGpAWT.png (Here I'm trying with a B&W gradient from bottom to top, but I'm really guessing.)
     

    Attached Files:

  30. Cleverlie

    Cleverlie

    Joined:
    Dec 23, 2013
    Posts:
    219
    Hi guys, I'm trying to make a simple shader that offsets the Y position of the vertices depending on an animated flipbook texture. First off, I wanted to report a couple of bugs that I encountered, and after that I wanted to ask for a bit of help.

    So the first bug I've found is that you can't use the "Flipbook UV Animation" node if you are using the local vertex offset output. I had to switch to the "Flipbook" function node that is also provided with ASE (although I don't know why you keep both, since they seem to have the same functionality; the only difference is that one is implemented as a node and the other as a shader function).
    To reproduce this, I will upload the shader I made with an example scene; in the shader editor you must use the Debug Switch that changes from one flipbook to the other.

    The other bug I found is that if you try to have two flipbook nodes coexisting, errors are also thrown. In this case I tried having one flipbook node, and another one with "start frame" offset by one, so that I have the current frame and the next frame and can lerp between those two depending on time; this is to do a smooth blend between frames via a Lerp node. This didn't work.
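    For reference, the frame math behind that current/next-frame blend can be sketched as follows (illustrative Python, not the actual node code; a 4x4 sheet is assumed in the comments):

```python
import math

def flipbook_frames(time, speed, columns, rows):
    # the sheet holds columns * rows frames, advancing at `speed` frames/s
    total = columns * rows
    f = (time * speed) % total
    cur = int(math.floor(f))       # current frame index
    nxt = (cur + 1) % total        # the "start frame offset by one" frame
    blend = f - cur                # lerp factor for the smooth blend

    def uv_offset(i):
        # tile origin of frame i within the sheet
        return (i % columns) / columns, (i // columns) / rows

    return uv_offset(cur), uv_offset(nxt), blend
```

    Sampling the texture at both offsets and lerping by `blend` gives the smooth transition described above.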


    So now for the question: I managed to make the shader work without the smooth frame blending, using the flipbook shader function as I mentioned before. The shader changes the local vertex offset in Y depending on an animated heightmap.

    The problem I have now is that the vertex normals stay untouched. I know that you have a recently added example shader which tries to solve this problem by reconstructing vertex normals via "derivation" in a flag model.
    My problem is that I can't really understand the math involved or what you are actually doing. I would really appreciate it if someone, whether from Amplify or an experienced user / mathematician, could explain what is going on when reconstructing normals this way, so I can reuse this example or the technique for my own shaders.

    thanks in advance!
     

    Attached Files:

  31. Amplify_Borba

    Amplify_Borba

    Joined:
    Jul 24, 2017
    Posts:
    538
    Thank you for reporting this, I'll pass it along to the developer for further investigation and we'll get back in touch as soon as we have any additional information to share.


    Hello, this might be possible to achieve, since Unity has a global variable named _Wind, which contains the wind information, and the animation parameters should go through the mesh's vertex color. I'd recommend looking into Unity's built-in shaders, specifically TerrainEngine.cginc, which contains an AnimateVertex(...) function that includes the process.

    If you'd rather create your own custom effect to simulate wind movement, you could use a wave function that generates some pseudo-random values based on the position of your vertices, and that also accepts a time variable. Then it would be a matter of using vertex color to filter what you want or don't want to move ( ie: paint the base vertices black so they don't move with the function ).
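    As a sketch of that wave-function idea (illustrative Python; in a shader the same math would run per vertex, and the constants are arbitrary):

```python
import math

def wind_offset(pos, time, strength=0.1):
    # pseudo-random phase derived from the vertex position, so
    # neighbouring vertices don't sway in lockstep
    phase = pos[0] * 12.9898 + pos[2] * 78.233
    return math.sin(time * 2.0 + phase) * strength

def apply_wind(pos, mask, time):
    # `mask` is the vertex color red channel: base vertices painted
    # black (0) stay put, tips painted white (1) sway fully
    x, y, z = pos
    return (x + wind_offset(pos, time) * mask, y, z)
```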


    Hello, although that asset has specific nodes that facilitate achieving certain effects, you should be able to achieve the equivalent with ASE.

    In short, there are two ways of creating the effect you're looking for, either by using textures to offset the UVs and deform them, or through math, but it'll ultimately depend on the type of effect you want to create and how you choose to approach it.

    I'd recommend looking into our RadialUVDistortion sample and its Radial UVDistortion function, which you can edit to examine how we're manipulating UVs, as an example, or even at our FakeLiquid sample.



    You may use any software to generate textures, and the example you've shared is adequate; however, simply using the textures will not be enough in most cases, and you'll likely have to perform some additional calculations. You can also generate masks or gradients through math, as per the examples below.



    For more advanced effects, you can also consider using flowmaps.

    I'm including a sample for your convenience: FlowSample.zip


    Hello, I've managed to replicate the first Flipbook node issue and will pass it along to the developer for further investigation, thank you for reporting it!

    The second issue has proven a bit elusive, could you provide additional details?

    The original node was a community submission, while the Function is an extended alternative to it that was created by our developer and is editable / customizable. We can't remove the node as it would break backwards compatibility, which is why we keep both of them available.

    Regarding your question, the idea behind normal reconstruction is that you need to calculate the new vertex position not once but three times: one is the original that you already have, and the other two are mostly similar but with a small offset in tangent X and Y. With these three points you can calculate a new normal.

    In the sample that we provide, we have the code for the vertex transformation inside a shader function so that we can reuse it in those 3 points.
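    Numerically, the three-point idea can be checked like this (a plain Python sketch, assuming a height function h(x, z) displacing a flat plane; not the sample's shader code):

```python
def reconstruct_normal(h, x, z, eps=1e-3):
    # evaluate the displaced vertex and two neighbours offset
    # along the tangent (x) and bitangent (z) -- three points total
    p0 = (x, h(x, z), z)
    px = (x + eps, h(x + eps, z), z)
    pz = (x, h(x, z + eps), z + eps)
    ex = tuple(a - b for a, b in zip(px, p0))
    ez = tuple(a - b for a, b in zip(pz, p0))
    # cross product of the two edges, ordered so a flat plane gives +Y
    n = (ez[1] * ex[2] - ez[2] * ex[1],
         ez[2] * ex[0] - ez[0] * ex[2],
         ez[0] * ex[1] - ez[1] * ex[0])
    length = sum(c * c for c in n) ** 0.5
    return tuple(c / length for c in n)
```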


    Unfortunately, we can't seem to replicate this issue, could you provide additional information, such as Unity and ASE versions and the correct steps to ensure that it can be replicated?
     

    Attached Files:

    Last edited: Jul 23, 2018
    Cleverlie likes this.
  32. Cleverlie

    Cleverlie

    Joined:
    Dec 23, 2013
    Posts:
    219

    Hi, I kinda understand what you mean, but I think I'm missing some basic or intermediate trigonometry. In any case, I tried to take the reconstruction shader sample and use it to reconstruct the normals in my own shader, but it didn't work.

    I'm attaching a unitypackage with a scene and everything so you can see what I'm seeing; check that the reflections on the plane are not changed, they are still flat like a perfect mirror.

    reff.jpg


    Regarding the second bug I reported, I'll make a test scene for that so you can reproduce it.

    Any advice on why my shader is still not working?

    EDIT: I'm re-uploading the unitypackage because it was missing a shader function.
     

    Attached Files:

    Last edited: Jul 23, 2018
  33. daville

    daville

    Joined:
    Aug 5, 2012
    Posts:
    303
    Any plans to add support to create Post Processing Stack V2 Image effect shaders?
     
  34. Amplify_Borba

    Amplify_Borba

    Joined:
    Jul 24, 2017
    Posts:
    538
    Thank you for the sample, I'll have to pass this along to the developer for further investigation.

    We'll get back in touch as soon as we have any additional information.


    Unfortunately, ASE currently does not create Stack compatible shaders, nor do we supply camera components ready to use with their system ( a PostProcessing template example is included in the ASE package ).

    Although there's no official ETA for this, it's definitely something that we're going to support in the near future!


    I've tried to replicate it once more without any success, which leads me to believe that it might somehow be related to the shaders being used; as a result, it would be extremely helpful if you could share a simple sample in which the issue can be replicated.

    Feel free to send it to support@amplify.pt, thanks!
     
  35. OP3NGL

    OP3NGL

    Joined:
    Dec 10, 2013
    Posts:
    267
    How does one do a shader where sprites loop once or ping-pong?
     
  36. Amplify_Borba

    Amplify_Borba

    Joined:
    Jul 24, 2017
    Posts:
    538
    Hello!

    Depending on the effect you're trying to create, you have a few ways to deal with sprite transitions and looping.

    You can consider using some scripting in order to manipulate a Texture Sample's texture and implement the transition / looping logic, and you can also make use of the Texture Array node or the Flipbook UV Animation node ( useful sample in this discussion ).

    We also have a sample included in the ASE package that shows how you can make use of a texture atlas, the ReadFromAtlasTiled, which might also be an option.
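    Whichever of those routes is taken, the time remapping behind "loop once" and "ping-pong" is simple; the remapped time then drives the frame selection. An illustrative Python sketch (the ping-pong behaviour mirrors Unity's Mathf.PingPong):

```python
def pingpong(t, length=1.0):
    # bounce t back and forth between 0 and length
    t = t % (2.0 * length)
    return 2.0 * length - t if t > length else t

def play_once(t, length=1.0):
    # clamp so the animation holds its last frame after one pass
    return min(t, length)
```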
     
  37. Crocomodo

    Crocomodo

    Joined:
    Nov 21, 2017
    Posts:
    29


    The white color (in picture B) is affecting other colors when multiplied. I have tried changing the color space to Linear in the project settings, but the problem still persists. I also used floor to check if the light attenuation node outputs values higher than 1, and used negate to check if it outputs negative numbers, so I am pretty sure the white is 1 when rounded. I am really lost here, please help. **Tested with the step function and still get the same result too.**
     

    Attached Files:

    Last edited: Jul 24, 2018
  38. Amplify_Ricardo

    Amplify_Ricardo

    Joined:
    Jul 19, 2013
    Posts:
    2,389
    Hello,

    Can you describe the effect you intend to achieve?

    Please note that direct lights return 0 or 1, while point lights do provide a gradient to work with. Depending on your requirements, you'll likely have to filter which light is used by your shader. The World Space Light Pos node could come in handy, as its Type output is 0 for directional lights and 1 for any other.
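    That Type output can serve as a simple multiplicative mask. An illustrative Python sketch of the idea (not ASE node code):

```python
def keep_directional(attenuation, light_type):
    # light_type is 0 for directional lights and 1 for any other,
    # so this zeroes out point/spot contributions
    return attenuation * (1.0 - light_type)
```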

    Check out our Light Reveal tutorial for a more practical example.
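To illustrate the filtering idea outside of ASE, here is a small Python model. The function `filter_custom_light` is hypothetical; it only mirrors the Unity convention that the light position's w component (`_WorldSpaceLightPos0.w`) is 0 for directional lights and 1 for others:

```python
def filter_custom_light(light_color, light_atten, light_pos_w):
    """Model of filtering a light's contribution in a custom lighting setup.

    light_pos_w follows Unity's _WorldSpaceLightPos0.w convention:
    0.0 for directional lights, 1.0 for point/spot lights.
    This example keeps only the directional contribution."""
    directional_weight = 1.0 - light_pos_w  # 1 for directional, 0 otherwise
    return tuple(c * light_atten * directional_weight for c in light_color)

# Directional light passes through, point light is masked out:
print(filter_custom_light((1.0, 0.5, 0.25), 1.0, 0.0))  # (1.0, 0.5, 0.25)
print(filter_custom_light((1.0, 0.5, 0.25), 1.0, 1.0))  # (0.0, 0.0, 0.0)
```

In a graph, the equivalent is multiplying your custom lighting result by `1 - Type` (or by `Type`, to keep point lights instead).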

    Thanks!
     
    Last edited: Jul 25, 2018
  39. Crocomodo

    Crocomodo

    Joined:
    Nov 21, 2017
    Posts:
    29
Thanks for pointing out this tutorial. I fixed my issue with the World Space Light Pos node just as you said. I just want the color to come out the same as the input. The problem I found when investigating further is that, when using a point light, the Custom Lighting port is actually lit without any light node in the shader. When I multiply the color by the light type, which should return 1, the color magically corrects itself. I don't know if this is normal behavior or not. Here is the picture:
     
  40. DGordon

    DGordon

    Joined:
    Dec 8, 2013
    Posts:
    649
    Any way to have this work with the Uber shaders? I would love to use this to blend some textures which then get fed in to the uber shader stuff. I want their POM :/.
     
  41. dadude123

    dadude123

    Joined:
    Feb 26, 2014
    Posts:
    789
    Hi,
I'm doing some simple distance-field stuff, but fwidth-based anti-aliasing is giving me some real headaches.

    The idea is really simple! Basically I want to draw stripes/lines, and my approach to do that is just
    sin(uv.x * numberOfLinesIWant)
    which I then feed into some step function.

    1) No surprises when using the normal "step" function. Hard, aliased edges.
    In case you are wondering how the described graph can create angled lines: I simply used a UV Rotator node.


    2) Now, hard edges are ugly, so I want some anti-aliasing. The common trick is using a step function with a very narrow gradient, with the narrowness determined by the derivatives (FWidth).
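For reference, that narrow-gradient trick usually boils down to a smoothstep centered on the threshold, with the gradient width driven by fwidth. A CPU-side sketch in Python, with `width` passed in explicitly since screen-space derivatives only exist on the GPU (`step_aa` is an illustrative name, not ASE's implementation of its Step Antialiasing node):

```python
def smoothstep(edge0, edge1, x):
    """Standard GLSL/HLSL smoothstep: clamped cubic Hermite interpolation."""
    t = min(max((x - edge0) / (edge1 - edge0), 0.0), 1.0)
    return t * t * (3.0 - 2.0 * t)

def step_aa(threshold, value, width):
    """Antialiased step: a gradient roughly one 'width' wide around the
    threshold. On the GPU, width would be fwidth(value) = |ddx| + |ddy|."""
    return smoothstep(threshold - width, threshold + width, value)

# Hard step would snap to 0 or 1; the AA step gives a gradient instead:
print(step_aa(0.5, 0.5, 0.25))  # 0.5 (mid-gradient exactly at the threshold)
print(step_aa(0.5, 0.8, 0.25))  # 1.0 (fully inside the shape)
print(step_aa(0.5, 0.0, 0.25))  # 0.0 (fully outside)
```

When the stripes become roughly one pixel wide, `width` approaches the full period of the signal, which is where this scheme starts to break down.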

    The following image and video are using the built-in "Step Antialiasing" node in ASE.

    At first everything is great:



But now it seems this technique is very sensitive to scaling, or to moving the value range around somehow.
The strange issue can be replicated in at least two ways, as far as I've found.

    The first one is simply increasing the number of lines.
    Here I'm manually increasing the line count from 10 to ~80.



    At the end the strange 2x2 pixel blocks become clearly visible.
    Here's the exact same thing using the normal "step" function instead:




    3)
    After some tinkering I was starting to think that maybe the Step-AA implementation is somehow wrong(?),
    so I created my own distance-field AA thingy, along with a "smoother" FWidth, and a simple strength control.
    Outside the image to the right, there are just some static switch nodes to allow switching between the different modes (HardStep, BuiltIn, MyCustomAA)




    This video shows the difference between all 3 modes:


    No boxes checked: Hard ("Step" node)
    First box: ASE "Step Antialias"
    Both boxes: My custom AA
    (Maybe the ASE StepAA is more technically correct, but my custom AA gives smoother results)

Anyway, now my problem is drawing very thin lines.
For example, it seems a 1px-wide line can't be correctly anti-aliased by this technique (no matter which implementation).

There are always strange artifacts, and it seems like the AA is skipping alternating pixel rows/columns.



Here's a supposedly ideal / easy case: a 45° angle, exactly 1px wide:



Am I doing something wrong? Is there some technique I can use to get better anti-aliasing for very thin lines?
Or is this maybe an inherent limitation of the fwidth idea/approach itself?
Any ideas on how to work around it?
     
    petersx likes this.
  42. Amplify_Borba

    Amplify_Borba

    Joined:
    Jul 24, 2017
    Posts:
    538
    No problem, we're happy to help!

That's correct, you will have to filter any light information required; simply plugging in a color node will not prevent it from being affected by the spot light. Alternatively, you can plug your color node directly into the Emission channel.

    In short, everything seems to be working as expected, do note that if you intend to use the Custom Lighting port you'll have to create your own unique node-based lighting model through our specific lighting nodes.


    Hello! Unfortunately, we do not offer support for the Uber shader, apologies for the inconvenience.


    Hello, could you provide us a sample for further examination on our side, in order for us to best help you?

    Feel free to send it over to support@amplify.pt, we'll be happy to investigate it for any potential issues, thanks!
     
  43. SpyChar

    SpyChar

    Joined:
    Jul 15, 2012
    Posts:
    11
You can have POM in ASE as well. Have you tried it? I'd say it's pretty great.
     
    Amplify_Borba likes this.
  44. Anonymous225

    Anonymous225

    Joined:
    Nov 5, 2016
    Posts:
    1
Hi. I am having trouble finding the Substance Sample node. I have tried dragging in a substance, but it does not create a node, and I cannot find it in the list of nodes. Please help.
     
  45. jjobby

    jjobby

    Joined:
    Nov 28, 2009
    Posts:
    161
    Hello, I intend to create a fluffy shader like Dandelion in KH3. You can see the image below.

Which technique or Amplify template should I look into to achieve this?
     
    Last edited: Jul 27, 2018
  46. dadude123

    dadude123

    Joined:
    Feb 26, 2014
    Posts:
    789
    I figured it out!
    It's the fact that DDX/DDY gives the same result for all pixels inside every 2x2 pixel block.
    After some research it seems like this is an inherent rendering limitation.

    But since those functions seem to be commonly used for other stuff as well, maybe you guys have an idea how the accuracy can be improved? Is there a way to smooth it out a bit somehow?
     
  47. Amplify_RnD_Rick

    Amplify_RnD_Rick

    Joined:
    Feb 15, 2016
    Posts:
    528
    Hey guys,

    Sure, no problem at all.
This is an optimization on our end for surface shaders, to save interpolator usage.
We create that dummy 2D property so that only one uv_ interpolator is created, which is then shared across multiple 2D properties. Its contents are defined automatically by the surface shader's generated code.
We then apply each texture's tiling and offset on top of that interpolator in the surface function to correctly sample the current texture.
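In generated surface-shader code, applying a texture's tiling and offset on top of a shared interpolator is the familiar `uv * ST.xy + ST.zw` pattern (Unity's `TRANSFORM_TEX` macro). A Python model of that step, where `apply_tiling_offset` is an illustrative helper, not ASE's actual generated code:

```python
def apply_tiling_offset(uv, st):
    """Apply a texture's Tiling (st[0], st[1]) and Offset (st[2], st[3])
    to a shared UV interpolator, mirroring Unity's TRANSFORM_TEX macro:
    uv * _Tex_ST.xy + _Tex_ST.zw."""
    return (uv[0] * st[0] + st[2], uv[1] * st[1] + st[3])

# One shared interpolator, two textures with different tiling/offset:
shared_uv = (0.25, 0.5)
print(apply_tiling_offset(shared_uv, (2.0, 2.0, 0.0, 0.0)))    # (0.5, 1.0)
print(apply_tiling_offset(shared_uv, (1.0, 1.0, 0.5, 0.25)))   # (0.75, 0.75)
```

This is why a single interpolator suffices: the per-texture scale and bias are cheap per-fragment math, while interpolators are a limited resource.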

Which Unity version are you working with? We still don't have Substance support in ASE on Unity 2018.
But rest assured that we are already talking to Allegorithmic to figure out how we can use their plugin with ASE.

From that still image it's quite hard to see the behavior, but I'm guessing it might be some kind of fur technique.
Could you give us more info on the effect? Maybe share an animated gif or a youtube link?

Yes, you are 100% correct. For example, DDX for the leftmost pixels of a block is exactly the same as for the rightmost ones.
This is because, during rasterization, the GPU organizes multiple instances of the fragment shader in blocks of 2x2 pixels.
It's kinda tricky; to be honest, I'm really not sure how you can improve accuracy on your end. Let me think a bit about this, and I'll let you know if I figure something out.
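That quad behavior can be modeled on the CPU to see why fwidth-based AA turns blocky: within each 2x2 quad, ddx is a single shared difference. A small Python sketch of this, where the helper `quad_ddx` illustrates the assumption rather than the exact hardware algorithm:

```python
def quad_ddx(values):
    """Model of hardware ddx over one row of pixel values: within each
    2x2 quad, ddx is the right-minus-left difference, shared by both
    pixels of the quad. This is why fwidth looks blocky at 2x2 scale."""
    out = []
    for i in range(0, len(values) - 1, 2):
        d = values[i + 1] - values[i]
        out.extend([d, d])  # both pixels in the quad share the derivative
    return out

# A quadratic signal: the true derivative changes every pixel,
# but the quad-level ddx only changes every two pixels.
row = [x * x for x in range(6)]   # [0, 1, 4, 9, 16, 25]
print(quad_ddx(row))              # [1, 1, 5, 5, 9, 9]
```

Since the derivative is constant across each quad, any AA width computed from it can only change in 2x2 steps, matching the blocky artifacts described above.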


    In other news.
    We've just uploaded a new build into our website.

    Here are the release notes.

    Release Notes v1.5.4 dev 07:
    • New Templates:
      • HD PBR
      • HD Unlit
    • Fixes:
      • Fixed issue on incorrectly capturing module tags
      • Fixed issues with 'Flipbook UV Animation' node
      • Fixed multiple issues with Directives usage under shader functions with templates
      • Fixed issue on 'Custom Expression' node loading on Call mode
      • Fixed issue on unpacking normals with scale on templates over multiple nodes
  • Changed tessellation on the Procedural Wall sample shader from edge-based to distance-based to prevent Metal issues on Mac

    With this build, we are releasing two new templates that are intended to be used over the HD render pipeline. These templates were created over the HD 2.0.8 Preview.
    Please bear in mind that they are still under development.

There are some nodes on our end that will not work on SRP; e.g., Grab Screen Color can no longer be used, since SRP doesn't support grab passes.

Also, some info is still not available/created over HD, so some nodes may throw shader compilation errors ( e.g. getting the fog color from the Fog and Ambient Colors node ).

But please report not only compilation errors but also any strange visual results you get when using these templates, so we can fix and improve them as quickly as possible.

    Hope you guys have a great weekend and happy HD shader creations! :D
     
    petersx and hopeful like this.
  48. jjobby

    jjobby

    Joined:
    Nov 28, 2009
    Posts:
    161
Hi, thank you for your reply. This is a part of the KH3 trailer. I hope it helps.


Yes, I think it's some kind of fur, but not the usual hair-like fur. It's more like a cotton ball, which I have no idea how to create with a shader. I'm no shader expert, and creating everything from scratch is still very hard for me. It would be a great help if there is an existing template or some info/hint I could start with.

    By the way, it's nice to see the support of HD rendering in new version.
     
  49. Cleverlie

    Cleverlie

    Joined:
    Dec 23, 2013
    Posts:
    219
Hi Amplify people, have you been able to take a look at the scene I uploaded to reproduce the bug with normal reconstruction not working? For now, I'm faking the effect by feeding a normal from the heightmap into the normal output node. This is at least roughly the result I was expecting from reconstructing the vertex normals, but it's quite a bit more expensive, since it's a fragment calculation instead of a vertex calculation, and it also doesn't work for all possible scenarios. Thanks!
     
    Last edited: Jul 27, 2018
  50. Amplify_Borba

    Amplify_Borba

    Joined:
    Jul 24, 2017
    Posts:
    538
    Hello, thank you, we've been working hard on this feature and we're glad it's out!

From the video it's still not entirely clear whether that effect is fur or not; in any case, a proper fur shader would require ASE to support geometry shaders, which it does not yet.

    As a possible alternative, you could look into our tessellation sample and use that technique to try and mimic the look, perhaps through the use of some overlapping layers with different noise values.


    Hello, the reported issue is still under investigation by our developer, we'll definitely get back in touch as soon as we have any further information to share, apologies for the inconvenience.