[Best Tool Asset Store Award] Amplify Shader Editor - Node-based Shader Creation Tool

Discussion in 'Assets and Asset Store' started by Amplify_Ricardo, Sep 13, 2016.

  1. Victoralm

    Victoralm

    Joined:
    Dec 31, 2015
    Posts:
    30
    Hi all,

    I started with ASE yesterday and I have some doubts... This is my attempt so far. It's a little bit messy, but I'm trying my best.



    - Is this the correct way to add fresnel to the water?
    - On the Sea Foam group, I was trying to mask by the vertex y axis position, together with the noise. Is that the correct way to do it?
     
    syscrusher likes this.
  2. Amplify_Ricardo

    Amplify_Ricardo

    Joined:
    Jul 19, 2013
    Posts:
    2,389
    Hey there,

    That's already quite advanced for someone who started yesterday!

    What's on your mind, any issues?

    I can take a quick look.
     
    syscrusher likes this.
  3. Victoralm

    Victoralm

    Joined:
    Dec 31, 2015
    Posts:
    30
    Hi, thanks for the reply!

    I was mixing some tutorials; that was the only way for me to end up with this... :p
    I did some things with Shader Graph, but ASE has some different nodes and sometimes I get a little lost...
     
  4. Amplify_Ricardo

    Amplify_Ricardo

    Joined:
    Jul 19, 2013
    Posts:
    2,389
    No problem!

    Nothing wrong with that, we encourage you to check different sources, as the concepts you learn with Shader Graph, or even the Unreal Material editor, can be applied here as well; minus specific node/engine differences, of course.

    In any case, you can usually find similar nodes on our end and vice-versa, as they are both creating the same type of shaders. I would definitely recommend joining our Discord channel; it's the best place to ask questions about it, and you usually get a reply on the same day.

    Amplify Creations Discord Channel

    Not sure about your use of Fresnel here, or the shoreline. I want to say that the height use is alright, but the way it's being mixed might not be what you're looking for.

    This multiplication is only going to darken your albedo.
    upload_2020-8-17_15-45-14.png

    The thing about shader development is that, more often than not, there's more than one way to do things. It's not apparent what exactly you have in mind, so the best way to go about this is to break down your shader into smaller/specific parts (questions) so that you can isolate specific issues/results and improve them.
     
    Victoralm likes this.
  5. Victoralm

    Victoralm

    Joined:
    Dec 31, 2015
    Posts:
    30
    Following your recommendation, I got rid of the fresnel multiplication, and it really looks better.
    I've also changed the normal mixing, and it appears to be better now too.
    I'm trying to use the shoreline as a mask to smooth the edge where the water meets another surface and to add some foam there.
    Thanks a lot!!

    ScreenshotASE_01.png ScreenshotASE_02.jpg
     
  6. arnoob

    arnoob

    Joined:
    May 16, 2014
    Posts:
    155
    Amplify_Ricardo likes this.
  7. Amplify_Ricardo

    Amplify_Ricardo

    Joined:
    Jul 19, 2013
    Posts:
    2,389
    Be sure to saturate before plugging into the Alpha input of the Lerp node; it expects a 0-1 value.
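
    For intuition: the shader-side lerp is unclamped, so an alpha outside 0-1 overshoots the endpoints, much like Mathf.LerpUnclamped in C#. A tiny illustration (not ASE code, just the same arithmetic):

    Code (CSharp):

    using UnityEngine;

    public class LerpClampExample : MonoBehaviour
    {
        void Start()
        {
            float a = 0f, b = 10f, t = 1.5f;                        // t is outside the 0-1 range
            Debug.Log(Mathf.LerpUnclamped(a, b, t));                // 15: overshoots past b, like an unclamped lerp
            Debug.Log(Mathf.LerpUnclamped(a, b, Mathf.Clamp01(t))); // 10: clamping (saturating) first keeps the blend between a and b
        }
    }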

    Have you had the chance to check our WaterSample shader? It does some interesting things you might find useful; all you need to do is add some waves ;)
     
    Victoralm likes this.
  8. wechat_os_Qy00smRDz5wLDD6s_CsEfuIOc

    wechat_os_Qy00smRDz5wLDD6s_CsEfuIOc

    Joined:
    Aug 18, 2020
    Posts:
    6
    Hello, I want to use custom lighting in the URP pipeline so I can perturb the ambient light and create a hair effect, but it is very difficult for an artist to read PBR lighting. Can you provide a way to connect a PBR lighting model for URP? Also, I am a user in mainland China, and it is difficult for us to access youtu.be. Do we have a good place to learn? Thank you very much for your hard work. ASE saved my life; we can control the rendering effect ourselves instead of praying for a programmer to make it!
     
  9. Amplify_Ricardo

    Amplify_Ricardo

    Joined:
    Jul 19, 2013
    Posts:
    2,389
    Hello, thanks for using ASE!

    For a URP Custom Lighting example, be sure to download our Stream files (since you can't watch YouTube): https://tinyurl.com/s57tu7k

    The Sample uses Lightweight shaders but you can simply change the shader type to URP.

    Please keep in mind that Custom Lighting is, by definition, not PBR; we don't have a readily available option for that. If you're comfortable with shader development, I'd recommend looking into our Template System; perhaps you can create your own implementation.
     
  10. wechat_os_Qy00smRDz5wLDD6s_CsEfuIOc

    wechat_os_Qy00smRDz5wLDD6s_CsEfuIOc

    Joined:
    Aug 18, 2020
    Posts:
    6
    Thank you very much for your reply. My English is generated by translation software, so it may not read smoothly. I mean the anisotropic effect of hair. One way to do this is to perturb the highlights along the tangent. But if I want to combine that with a PBR effect, I can't control the perturbation of specular and smoothness in the PBR template. So one option is to reconstruct the PBR lighting in the unlit template; in that case, what is the order for building the PBR links in the unlit template? I'm very sorry, it's too difficult for an artist to understand the PBR lighting model. My hair effect is as follows. In fact, it's still based on Blinn-Phong; I hope to base it on PBR. Damn, it seems it's not convenient to upload pictures, so I decided to go over and watch your video. In short, I love ASE. I hope you'll always be there for my career!
     

    Attached Files:

  11. Amplify_Ricardo

    Amplify_Ricardo

    Joined:
    Jul 19, 2013
    Posts:
    2,389
    Might be hard to understand each other, but we can try. Alternatively, I would definitely recommend asking someone who speaks English to help out.

    To be clear, Custom Lighting cannot be mixed with PBR; PBR is not Custom Lighting. You cannot "link" the Lit PBR template to the Unlit one.

    If you're using Custom Lighting with the Unlit Shader Type, be sure to use the SRP Additional Light node as demonstrated in the sample linked, otherwise it won't function correctly.

    upload_2020-8-19_10-43-12.png
     
  12. wechat_os_Qy00smRDz5wLDD6s_CsEfuIOc

    wechat_os_Qy00smRDz5wLDD6s_CsEfuIOc

    Joined:
    Aug 18, 2020
    Posts:
    6
    Thank you very much, I think I understand. In China, studying art tends to crowd out other courses, so by the time we become "game artists" our English is generally not good. I watched your video. The guy in the video is handsome. Is that you? One thing I'm still a little confused about is how the specular workflow should be "linked". The metallic workflow I can understand!
     
  13. Amplify_Ricardo

    Amplify_Ricardo

    Joined:
    Jul 19, 2013
    Posts:
    2,389

    Sorry, don't quite understand. Here's some info on Specular PBR.
    1. Specular BRDF
    2. PBR Theory
    3. Implementing PBR in Unity
     
  14. wechat_os_Qy00smRDz5wLDD6s_CsEfuIOc

    wechat_os_Qy00smRDz5wLDD6s_CsEfuIOc

    Joined:
    Aug 18, 2020
    Posts:
    6
    Thank you. I can do it with this! The information available in our country is not very complete.
     
  15. RevenantDevs

    RevenantDevs

    Joined:
    Mar 3, 2019
    Posts:
    2
    Please help me!!

    The Custom Outline Toon Shader does not work when GPU instancing is enabled.
    If I instance a prefab using CustomOutlineToonShader multiple times, only one prefab shows; the others are transparent.
    In addition, performance is poor: below 30 frames per second on a real Android device (Samsung Galaxy Note 8).

    I'm using Unity 2019.3.15f1 Personal on the Android platform.
     
    Last edited: Aug 20, 2020
  16. Cactus_on_Fire

    Cactus_on_Fire

    Joined:
    Aug 12, 2014
    Posts:
    675
    Caustics shader made with ASE :)
     
    wetcircuit and syscrusher like this.
  17. Amplify_Ricardo

    Amplify_Ricardo

    Joined:
    Jul 19, 2013
    Posts:
    2,389
    Hope it helps!

    Instancing seems to be working; can you provide additional details?

    Regarding performance, we'd also need additional details; is the drop below 30 present after adding that specific shader?

    A devbuild profiler shot would be great to help identify the issue.

    Looking very nice! Care to share some details on your approach?
     
  18. Cactus_on_Fire

    Cactus_on_Fire

    Joined:
    Aug 12, 2014
    Posts:
    675
    Sure. Basically there is a panning caustics texture distorted with a distortion map, and it's layered on top of itself at slightly offset distances with 16 samples. That's how I get those organic refraction shapes.

     
  19. Amplify_Ricardo

    Amplify_Ricardo

    Joined:
    Jul 19, 2013
    Posts:
    2,389
    The result is pretty cool, thanks for sharing!

    I would recommend maybe using a Shader Function there; it might help simplify it for other users.
     
    syscrusher and Cactus_on_Fire like this.
  20. Cactus_on_Fire

    Cactus_on_Fire

    Joined:
    Aug 12, 2014
    Posts:
    675
    Oh that's nice. I'll definitely use that for any shader block I need to repeat from now on.
     
    Amplify_Ricardo likes this.
  21. syscrusher

    syscrusher

    Joined:
    Jul 4, 2015
    Posts:
    1,104
    Shader functions are great. I'm making a complicated UI shader right now and am using them extensively.

    If I may, I have two further comments on your shader:
    1. In addition to the suggestion from @Amplify_Ricardo for shader functions, I also suggest you investigate the Relay Node and the feature of putting "dots" on your longer connection "wires" to organize them.
    2. All else aside, I showed your node graph to my wife, who is a retired librarian (extremely smart, but absolutely not a techie). Both of us agree that your node graph is a beautiful artistic expression in its own right. :)
     
    Amplify_Ricardo likes this.
  22. syscrusher

    syscrusher

    Joined:
    Jul 4, 2015
    Posts:
    1,104
    I saw the reply from the Amplify team, but also wanted to mention a mockup I posted earlier in this thread, to someone who asked a very similar question about URP. I'm not sure if your project needs to also support URP, but here's the link in case it's useful for you: https://forum.unity.com/threads/bes...er-creation-tool.430959/page-110#post-5616622
     
    Amplify_Ricardo likes this.
  23. Cactus_on_Fire

    Cactus_on_Fire

    Joined:
    Aug 12, 2014
    Posts:
    675
    It's like a very organized subway system :)
     
    Amplify_Ricardo and syscrusher like this.
  24. syscrusher

    syscrusher

    Joined:
    Jul 4, 2015
    Posts:
    1,104
    I have several unrelated general questions, so I hope nobody minds that I'm grouping them in a single post, in no particular order. None of these are urgent; they are just things that have come up as I've been working on a current project as well as exploring some creative ideas. I'll use a numbered list in case someone has an answer for just one item, to make it easier to refer to that question without quoting my whole post.
    1. Is there a good way to pass an array of arbitrary Vector2, Vector3, or Vector4 data to the shader from my C# code in Unity or via the Unity inspector? The intent is that this data is "just numbers" without implicit semantics of being something specific. I can think of several ways to do this, but I'm not in love with any of them:
      • The obvious way to pass an array of Vector4 would be to provide a Texture and treat its pixels as Vector4 instead of RGBA data inside the shader. I would use the VertexID, which I can already obtain from ASE's templates or from my custom template, as an array index to look up what I need. This would work, but it seems really clumsy to me and perhaps counter-intuitive to anyone who uses my shader (though I could hide this property from the C# custom editor, and manage it internally with C# code to set the shader properties).
      • Since the data is related to the object's vertices, I could pass it in some of the unused UV arrays for the Mesh, then use the TexCoord Node in ASE to extract that data. This has some problems, though:
        • I've tried, but so far have found no way to prevent this data from being interpolated before it's presented to the fragment function. For UV coordinates used in the expected way, interpolation is exactly what's needed, but I need the raw data as it would exist in the vertex function.
        • Most of the time, the only UV channels in use are UV0 and UV1 (or 1 and 2, depending on context) -- the first for the primary texture mapping, and the second for lightmap baking. But I have no way of being sure my nonstandard use of the highest-numbered UV channel would conflict with something else. Perhaps a workaround for this would be to provide a shader property that allows the user to choose a UV channel they know is not otherwise occupied. That, however, becomes slightly problematic because ASE's TexCoord Node wants the channel number chosen in a dropdown list at shader compile time -- I would have to make a custom node to expose that as a property.
        • To support arbitrary user-provided Meshes, I need to avoid modifying the asset file containing the Mesh. This implies that the Mesh object I have in C# and pass to the shader is not identical to the one that was read from the asset file at startup. That's probably okay, but as a programmer it makes me nervous that I might introduce bugs that could be hard to diagnose.
      • I could do something like the previous method, but use the Vertex Color instead of an unused UV channel. In my situation, however, I'd really like to use the Vertex Color for its original intended purpose in my application, so I hope to avoid consuming this singular channel. Also, the Vertex Colors are interpolated just like the rest of the vertex data, so this approach suffers the same three limitations as the preceding method.
      • See my question #2 below -- if there is an easy answer to that, then I can achieve almost everything I wanted to accomplish by precomputing some things in the vertex function and then passing the raw numbers -- without interpolation -- to the fragment function. If I can do that, I'll forget about trying to pass this hypothetical raw data array.
    2. Is there a way for the vertex function to pass a small amount of data (a few Vector4 values) from its current vertex, then have the fragment function be able to retrieve this data -- without interpolation -- for all three vertices of that fragment's current triangle? I know ASE has a Global Array Node, but after reading the documentation I am not clear about the scope of global values (are they per shader, per renderer, per instanced renderer, per vertex, per polygon, or per fragment?), their persistence (that is, will the global values be retained from one frame to the next?), or whether or not they are interpolated when presented to the fragment function.

    3. For the Global Array Node, how does one write data to an element? The documentation shows inputs or node parameters for the length of the array, the data type, the index of the desired element, and the output value -- but there is no data input, as far as I can tell.

    4. Is there a way for my transparent shader's fragment function to determine if the source fragment's color has been modified by any other object using the same shader? That is, is there something akin to the stencil buffer that is shader-wide scope so one such buffer is shared by all uses of this shader in the current frame?

    5. Alternatively, is there a way to detect if the current fragment for a transparent shader has the lowest depth of any fragment that will be rendered in the current frame by this shader? In other words, can my fragment function leave the source fragment untouched unless it knows this is the last time this fragment will be touched this frame by this shader?

    6. As a general question, what technique do others use to avoid some of the depth artifacts that result when two transparent objects overlap in screen space and their depth is close enough that Unity does not depth-order them predictably?

    7. In looking at the code, both the editor scripts and the generated shader code, I notice that the VertexID Node is not among the fields that can be declared in the ase_vdata and ase_interp preprocessing directives for custom templates. This causes a lot of grief if one is making a custom template where the vertex ID is to be a required part of the template's vertex data and fragment data structures. All the other such fields, such as normals and bitangents, can be added to these directives. Could Amplify please consider "promoting" this important data element to first-class status in ase_vdata and ase_interp template directives?
    Beyond these user-type questions, I have a hypothetical question I'd like to put to the Amplify team.

    I'm partway through developing a couple of custom nodes that have in common the fact that they need one or more dedicated "control" inputs that are for specific purposes and therefore accept only one input type. This is in addition to a variable number from 1 to N of regular data input ports.

    When digging through the code, I've found that the parts that handle dynamic sizing of input counts (as is done in nodes such as Add and Multiply, among others) tacitly assume that all of the inputs are dynamic and will allow the user to delete all but one of them. For nodes with some number of fixed-function control inputs, there needs to be a way to declare those separately from the regular data ports.

    I've been working on a patched version of the node logic that avoids that assumption, and adds a variable indicating how many control ports exist at the start of the input set. Any code that iterates through inputs to delete the unconnected ones and resize the list excludes the specified number at the beginning. Mostly it involves changing some for loops so they don't have zero hardwired as the starting value for the loop counter. (It's a bit more involved than that, but that's the core of the changes.)

    Rather than patch ASE itself, I've built my customization as a subclass. But to get this functionality, I couldn't just add my new feature and still call the base method. The existing method is rather long and monolithic, and the parts I needed to change aren't exposed as part of the API -- it's an "all or nothing" situation for that method. Instead, I had to copy and paste the method from the stock source code and then modify it. That much duplicated code isn't good for maintainability.

    So, after all that introductory material, here's my question: If I get this working and am willing to share my code with the Amplify team, would they consider adding this feature to the mainstream code base? I have no problem with Amplify just taking my code and running with it, although they might have a better way to do this and would want to start from scratch. In particular, it might be cleaner to have two entirely separate categories of input port -- controls and data -- and keep them in separate lists, rather than having a single list with a boundary integer to define the number of control ports. I was trying to implement this with minimal intrusiveness because I can't set expectations for future versions of the upstream code -- but if Amplify did it in house, you're not under that constraint. :)

    Thanks for listening to all of this, and I apologize to all and sundry for the length of this post. I've been building up questions for a few weeks now and finally got around to formally writing them up.

    As always, thanks to the Amplify team for a product that has become one of my very favorite Unity tools, and thanks to anyone (Amplify or others) who have suggestions or advice for my questions. :)
     
  25. Cactus_on_Fire

    Cactus_on_Fire

    Joined:
    Aug 12, 2014
    Posts:
    675
    How can I create a shader function node? I can't see the node by default in the ASE canvas.
     
  26. o1o101

    o1o101

    Joined:
    Jan 19, 2014
    Posts:
    639
    Hey, I've been hoping to use this for a few months; is there any way of accessing it now?

    All the best
     
  27. arnoob

    arnoob

    Joined:
    May 16, 2014
    Posts:
    155
    Amplify_Ricardo likes this.
  28. Amplify_Ricardo

    Amplify_Ricardo

    Joined:
    Jul 19, 2013
    Posts:
    2,389
    Hey there!

    Those are some really good questions, let me get one of our developers on board as this touches on some of our more complex systems.


    It's actually its own asset: go to Create > Shader Function, just as you'd create a new shader file (double-click to open it).

    I'm afraid it's still in the queue; let me check with one of our developers for a possible ETA.

    Apologies for the inconvenience.
     
    syscrusher likes this.
  29. Amplify_Paulo

    Amplify_Paulo

    Joined:
    Jul 13, 2017
    Posts:
    297
    1. Passing arrays must be done using C#; the inspector doesn't support it natively, though you could create your own array inspector for shaders. (See the sketch after point 7 below.) To be completely honest, I'm not totally sure what exactly you are describing or asking here.

    2. Global values are per "global variable". Meaning, the value is set in a way that all shaders using that variable see the same value; it's the equivalent of a static variable in C#, regardless of whether it's accessed in the vertex or fragment function. They do persist over time. There are ways to create interpolators that don't interpolate but are still considered interpolators (they occupy interpolator memory, they simply skip the interpolation math), but we currently don't support these because they don't work in surface shaders and we haven't found a good way to create them.

    3. The Global Array node was not done by us, so our support for it is limited, but the general idea is that you first create an array and then pass the whole array to the shader; you don't write to a single element, you change the whole array and resend it (the sketch below also shows this pattern). Check Unity's documentation for more information.

    4. Technically, there's no such thing as a "transparent shader"; shaders behave in a way that simulates transparency, and there are systems to deal with transparency as a whole. Shaders don't have any information about other shaders unless you somehow pass that information to them. For this particular question it may be technically possible by changing Unity's internal shaders or the rendering pipeline shaders and scripts, akin to how the "Opaque Texture" in URP works. But not only would it cost performance, I wouldn't even know where to start due to the complexity of the pipelines. Natively there's no such thing.

    5. No, for the same reason. Shaders render to a texture; there's no previous memory that you store and reuse in a future shader unless you somehow do it manually, which is expensive.

    6. They fake it somehow. You seem to be talking about "order-independent transparency", which is a complex topic in itself and requires complex solutions. If I'm not mistaken, HDRP has such a solution, but I have no idea how it works. I'm not sure either if it's screen-space related; it seems to me like the typical transparency sorting issue, but I may be wrong.

    7. I guess we could; there's already partial support for it, so it shouldn't be too difficult to add. Can't give you any ETA, though.
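
    To illustrate points 1 and 3 with a rough C# sketch (the uniform names, sizes, and values here are placeholders, not ASE properties): per-material arrays go through Material.SetVectorArray against a uniform array declared in the shader body, and the Global Array pattern is just rebuilding the whole managed array and resending it.

    Code (CSharp):

    using UnityEngine;

    // Sketch only; the shader side would declare plain uniforms (not Properties), e.g.:
    //     float4 _MyFloat4Array[200];
    //     float  _WavePhases[16];
    public class ShaderArrayExample : MonoBehaviour
    {
        const int VectorCount = 200;
        const int PhaseCount = 16;

        readonly Vector4[] vectors = new Vector4[VectorCount];
        readonly float[] phases = new float[PhaseCount];
        Material material;

        void Start()
        {
            material = GetComponent<Renderer>().material;    // instanced material for this renderer only
            for (int i = 0; i < VectorCount; i++)
                vectors[i] = new Vector4(i, 0f, 0f, 1f);      // raw data, no color/UV semantics implied

            // Point 1: per-material upload of a Vector4 array. The GPU-side array keeps
            // the length of the first upload, so size it to the maximum you will need.
            material.SetVectorArray("_MyFloat4Array", vectors);
        }

        void Update()
        {
            // Point 3: no per-element writes from C# -- rebuild the whole array and resend it.
            for (int i = 0; i < PhaseCount; i++)
                phases[i] = Mathf.Repeat(Time.time + i * 0.25f, 1f);
            Shader.SetGlobalFloatArray("_WavePhases", phases); // visible to every shader that declares it
        }
    }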

    About your custom nodes, did you take a look at the Custom Expression node, Static Switch node, or Function Switch node? All three of them have N inputs without being "dynamic inputs"; the Custom Expression node may be too complex, but the last two are easy to understand and function in a similar way to what you are describing. You simply select how many inputs you want and they show up. Isn't this what you want, or am I oversimplifying it?

    I'm not sure if I'm answering your questions; maybe with a clear example I could be more useful.
     
    syscrusher likes this.
  30. syscrusher

    syscrusher

    Joined:
    Jul 4, 2015
    Posts:
    1,104
    For what it's worth, I really like that approach, because "where are my shader functions?" is an easy self-answered question and not a deep dive into the internals of the ASE source code to figure out where those reside. +1!
     
  31. ccjay

    ccjay

    Joined:
    Mar 24, 2014
    Posts:
    20
    I am looking for ideas or a reference on how to approach this.

    I want to overlay a texture / decal onto an object, ignoring its UV map and essentially "projecting" it onto the object, whilst being able to rotate / scale / move it around through attributes on the material. The use cases are things like tattoos, decals, or whatever, on objects where the texture I'm projecting may not match the UV map I'm projecting onto.

    I know Unity has literal "Projectors" but I assume that has some kind of overhead and wouldn't be a good choice for this?

    I initially tried just adding the texture as a "detail mask" over the diffuse, and it worked well enough, but whenever I hit the seam of a UV map it would get broken or cut off.

    I also tried experimenting with Triplanar Mapping.. but wasn't able to get the results I wanted.
     
  32. syscrusher

    syscrusher

    Joined:
    Jul 4, 2015
    Posts:
    1,104
    Good evening, and thanks for a quick but very detailed response. I'll see if I can clarify my questions (inline, below your replies) and add some use-case examples.

    I guess I made a mess of the question phrasing! Passing the data from C# not only isn't a problem, that is exactly what I want to do anyway. :)

    The question is meant to be about data types, rather than the mechanism. I need to pass an array of a few hundred Vector4 data elements to the shader, but I want my shader code to interpret these as raw Vector4 data and not with any of the built-in shader semantics. The supported types for shader parameters include float, float2, float3, float4, and so on. I can declare "float4 _MyFloat4", but I can't declare "float4[200] _MyFloat4Array".

    Obviously I can use C# to make a "texture" that has the float4 values packed as RGBA, then hand that to the shader as a parameter, and I know how to do that. My question to you is, however, "Is there a better way to hand that sort of raw numeric data from C# to a shader?"
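
    For context, the texture-packing fallback I'm describing would look roughly like this (a minimal sketch; the property name and values are placeholders):

    Code (CSharp):

    using UnityEngine;

    // Packs 200 Vector4 entries into a 200x1 float texture, one texel per entry.
    public class PackedVectorTexture : MonoBehaviour
    {
        const int Count = 200;

        void Start()
        {
            var colors = new Color[Count];
            for (int i = 0; i < Count; i++)
                colors[i] = new Color(i, i * 0.5f, 0f, 1f);  // raw floats, not colors; RGBAFloat keeps full precision

            var tex = new Texture2D(Count, 1, TextureFormat.RGBAFloat, false)
            {
                filterMode = FilterMode.Point,               // no filtering: each texel is one data entry
                wrapMode = TextureWrapMode.Clamp
            };
            tex.SetPixels(colors);
            tex.Apply(false, false);                         // upload to the GPU, keep the CPU copy for later updates

            // In the shader, sample with something like (vertexId + 0.5) / Count as the U coordinate.
            GetComponent<Renderer>().material.SetTexture("_DataTex", tex);
        }
    }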

    Thanks, that analogy to static C# variable helps a lot.

    Are these values interpolated between the vertex function and the fragment function, or can I rely on them being unaltered as I pass them along? If they are not interpolated, that's good news for me, because I can solve a lot of my problems with global variables if that's true.

    Ah, I didn't realize it was a community node, nor was I aware that Unity has something like this that the node is utilizing -- I thought the node was yours and was all "stand alone" within ASE. I'll go consult the Unity docs and take a peek at the node source. Thanks for the heads-up about that being a documented Unity feature.

    Understood about the notion of "transparent shader". My shader is an unlit one, and I'm setting the alpha of its output and using the alpha blend mode. I used the phrase "transparent shader" as shorthand to describe its behavior from the point of view of a designer using my material, rather than the shader internals. I should have been more precise in wording. :)

    It sounds as if what I suspected is true: This is a rabbit hole down which I do not really want to venture. It was worth asking the question, but I think I'll take your implied advice and leave this one alone. :)

    Understood. At one stage of my development, I was using a depth-only prepass, which of course cost performance, but I removed that when I discovered it really didn't help the cosmetic artifacts.

    Yes, agreed. I think I'm just frustrated that even in a small scene (render distances in "tens of meters" not "kilometers") the precision is so poor. I set the shader's precision from "half" to "float" thinking that would help, but it really doesn't.

    Fair enough. I actually did it here, in an experimental sandbox, and got it working -- but I reverted my changes when I decided they were deep enough that I wasn't willing to maintain a local fork. If I had changed a few lines of code, I would have carefully documented it and kept my fork, but there was enough "delta" that I decided it's not worth the risk. I've got a workaround. :)

    That's a reasonable request. I'll post a follow-up with more details, but to do that I want to go back and look at the node list and see if I can cite some other similarities besides what we've mentioned.
     
  33. Amplify_Ricardo

    Amplify_Ricardo

    Joined:
    Jul 19, 2013
    Posts:
    2,389
    Hey there,

    I'm afraid your options are a bit limited; there's no practical way around providing UV coordinates of some kind in order to avoid seams. Triplanar mapping is probably not ideal for this case, but I can't know for sure without an example shot.

    - Use Projectors; costly.
    - Using HDRP, you can add mesh decals directly on top of your geometry; this would be better than using a simple projector.
    - Create a secondary UV channel in order to avoid seams; this would be used only by your decal, can be easy to implement, and is not too performance intensive.

    Hope it helps.
     
  34. Cactus_on_Fire

    Cactus_on_Fire

    Joined:
    Aug 12, 2014
    Posts:
    675


    Another attempt to make caustics. This one is 100% procedural and doesn't use textures. It creates a normal map from the 3d noise and uses it to push the 3d noise to the edges to create this effect.
     
    syscrusher and wetcircuit like this.
  35. Amplify_Ricardo

    Amplify_Ricardo

    Joined:
    Jul 19, 2013
    Posts:
    2,389
    Very nice, and without textures!
     
  36. Scornz

    Scornz

    Joined:
    Jun 15, 2014
    Posts:
    27
    Hello!

    I have been experimenting with Amplify and got this nice "world" reveal shader under my belt. I started out by making it transition between two textures, and then implemented it by having it replace the second texture with just opacity instead. However, when the opacity is used, it just makes the whole object completely see-through.




    It's all based on distance from a point, so the effect happens in a spherical shape, as shown here.



    Now, I would like to replace that "transparent" intersection between the "distance sphere" and mesh with an emissive white color, but I have no idea how to accomplish this. Any advice?
     
    Last edited: Aug 25, 2020
  37. syscrusher

    syscrusher

    Joined:
    Jul 4, 2015
    Posts:
    1,104
    Which render pipeline, render mode, and ASE template are you using?

    In ASE there's an option to capture and share your node graph. If you're comfortable doing that, it would help others provide more focused solution ideas. :)
     
  38. Scornz

    Scornz

    Joined:
    Jun 15, 2014
    Posts:
    27
    Thanks for the quick reply! I'm using the built-in render pipeline, and I don't believe I'm using a template at all -- just the default surface shader. I'm also using a custom blend mode so I can have a transparent material with Z-write on, since sometimes the material is completely opaque.

    I didn't know there was a way to share the node graph! I believe this is what you were talking about? http://paste.amplify.pt/view/raw/ed755511

    I also took a screenshot of the nodes.
     
  39. Amplify_Ricardo

    Amplify_Ricardo

    Joined:
    Jul 19, 2013
    Posts:
    2,389
    Seems to be doing what's expected; you're not rendering backfaces.

    Try setting its Cull Mode to Off and using the Switch By Face node; we use something similar in our Simple Potion built-in sample.

    upload_2020-8-25_18-6-28.png
     
  40. Scornz

    Scornz

    Joined:
    Jun 15, 2014
    Posts:
    27
    Thank you! This is great, but I don't think I was being specific enough in my original post. I am looking to replace the intersection between that "distance sphere" and the mesh with a solid color. Very similar to how this mesh is being "cut", and looks as if it is solid, instead of just a hollow mesh. How would I approach this?

     
  41. ccjay

    ccjay

    Joined:
    Mar 24, 2014
    Posts:
    20
    Can you elaborate on this option in a little more detail?
    Do you mean creating a second UV map for my mesh? How would I overlay that over the first?

    (This sounds like a really good solution since I could eliminate any cutting along the seams; I'm just a little unsure how to have two UVs working simultaneously on a single material / shader.)
     
  42. syscrusher

    syscrusher

    Joined:
    Jul 4, 2015
    Posts:
    1,104
    You don't really overlay the UVs themselves, but use each of the UV channels to sample an appropriate set of texture maps. The UVs are stored as part of the mesh data (each vertex has coordinates U and V for each of however many UV channels are defined for that mesh).

    The first UV channel is the one most modelers will use for basic unwrapping of materials, the one with which you're familiar. That's UV1.

    UV2 is almost always reserved for lightmap UVs. Modelers will deliberately overlap UV coordinates for polygons in UV1. For instance, suppose you have an interior wall with a wood panel texture. The modeler is likely to just "project from view" while in orthographic mode, to get the UV coordinates for the front and back faces at the same time. The upper-left corner might be UV (0,1) on the front side but (1,1) on the back, for instance. That's a perfectly reasonable practice for modelers reusing textures multiple times in a model, especially if the desired result is two sides of an object that are mirror images of each other.

    But in the game engine, lightmapping needs to project onto unique parts of the lightmap, which essentially is a texture that Unity creates automatically. Think about that example wall -- there could be a lamp on one side of it but not the other. So even if the wood panels have identical (but mirrored) texture data, the lightmap does not.

    To fix this, Unity and some other game engines have a way to automatically unwrap models into the UV2 channel during import, if the modeler didn't already do this. The resulting UV unwraps are not visually pleasing, but it doesn't matter.

    In any case, don't use UV2 for your custom UVs.

    But there can be UV3, UV4, and sometimes more on the model, and these are rarely used. So you could assign different UV coordinates for a detail texture, decals, or whatever, on those channels.

    In ASE, the texture sampler and texture coordinates nodes have dropdown properties to select which UV they will observe for sampling the texture. If you've set up your UV3 or UV4 for your purposes, just use a separate texture sampler to get the pixel data from that texture, and make sure to tell ASE to have that node use the special UV channel.

    Map your other textures (the main albedo, normal, metallic, etc.) as usual, on UV1. No conflicts. :)
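
    If you end up assigning that extra UV set from script rather than in the modeling tool, a rough sketch could look like this (the planar projection is only a placeholder; Mesh channel index 2 is the third UV set, what I've been calling UV3 above). Cloning the mesh keeps the source asset untouched:

    Code (CSharp):

    using System.Collections.Generic;
    using UnityEngine;

    public class DecalUVBuilder : MonoBehaviour
    {
        void Start()
        {
            var filter = GetComponent<MeshFilter>();
            // Instantiate() clones the shared mesh, so the asset on disk is left untouched.
            var mesh = Instantiate(filter.sharedMesh);
            filter.sharedMesh = mesh;

            var vertices = mesh.vertices;
            var decalUVs = new List<Vector2>(vertices.Length);
            foreach (var v in vertices)
                decalUVs.Add(new Vector2(v.x, v.y));   // placeholder: simple planar projection onto the local XY plane

            mesh.SetUVs(2, decalUVs);                  // channel 2 -> the UV set your decal sampler should read
        }
    }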
     
    Amplify_Ricardo and ccjay like this.
  43. ccjay

    ccjay

    Joined:
    Mar 24, 2014
    Posts:
    20
    Amazing explanation - Thanks.
    Will dig into this a bit and see where I get.
     
    syscrusher likes this.
  44. Cactus_on_Fire

    Cactus_on_Fire

    Joined:
    Aug 12, 2014
    Posts:
    675
    So I got a weird bug with shader functions. First of all, it's amazing, and it's even better that you can nest functions within functions, which cleans up a huge mess of noodles while making shaders. The problem is that I can't seem to get the function inputs to behave according to their sub-functions in the final shader. You can see in the first pic that all functions work correctly in their previews, where different integer inputs slightly rotate the texture normals, while the final compiled shader on the material doesn't seem to read those inputs for rotating the normals. When I plug any of these nodes into emission, it shows the exact same thing, as if the integer inputs aren't different, while the canvas previews show the correct outputs. The first pic is the shader and the second pic is the shader function that each box reads from.



     
    Last edited: Aug 26, 2020
  45. owenchen1994

    owenchen1994

    Joined:
    Oct 26, 2019
    Posts:
    7
    A quick question about the Vertex Position node.
    The documentation says the Vertex Position node outputs the vertex position in object space and doesn't change with the object's transform. Then why does the color change when I move the object?

    vertexPosition2.gif
     
  46. Amplify_Ricardo

    Amplify_Ricardo

    Joined:
    Jul 19, 2013
    Posts:
    2,389
    That can be achieved with Stencils; it's a good place to start.



    Not sure what could be happening there, can you share a small sample? We're available via support@amplify.pt or direct forum PM.

    Do you get the same thing with a regular Unity cube or sphere? I noticed the window; any particle-related stuff?
     
  47. Cactus_on_Fire

    Cactus_on_Fire

    Joined:
    Aug 12, 2014
    Posts:
    675
    I just sent it through the support email. It would be great if you could take a look at it today.
     
  48. Amplify_Ricardo

    Amplify_Ricardo

    Joined:
    Jul 19, 2013
    Posts:
    2,389
    Thanks, we'll see what we can do.

    Which texture goes where? The material was corrupted for some reason; what Unity version are you using?
     
  49. Amplify_Ricardo

    Amplify_Ricardo

    Joined:
    Jul 19, 2013
    Posts:
    2,389
    OK, I see now that you're using Int nodes; using a Float node should resolve the issue for now. Just be sure to also adjust your input node on the shader function, and save both the SF and the shader so they update. Let me confirm with our devs whether this is something specific to the calculations made when using this type.
     
  50. Amplify_Ricardo

    Amplify_Ricardo

    Joined:
    Jul 19, 2013
    Posts:
    2,389
    That seems to be it; the preview is definitely misleading in this case, as just confirmed with one of our developers. It is indeed specific to your Int division, since the order of operations is taken into account. In your particular case, you just need to add a Float to your Divide so that it's cast appropriately to float; this is required if you want to divide two integers and keep a fractional part.

    Unless you absolutely need an Int, use Floats as the default go-to.

    upload_2020-8-26_11-44-13.png

    I can see that after saving the SF and the shader, each shader function now outputs a different value.
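
    The truncation is the same integer-division behavior you'd see in C#; a quick illustration of why the cast matters:

    Code (CSharp):

    using UnityEngine;

    public class IntDivisionExample : MonoBehaviour
    {
        void Start()
        {
            int a = 3, b = 4;
            Debug.Log(a / b);         // 0    : int / int truncates, like the Int nodes above
            Debug.Log(a / (float)b);  // 0.75 : casting one operand to float keeps the fraction
        }
    }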