In the feature request thread... OK, I'll bite. Why is this functionality being removed? Yes, it was kind of clunky to implement, but it was mostly copy/paste and the end result is something that's easy for regular users. The "replacement", which as far as I can tell isn't in the current version of the package released with 2019.1, sounds like it will make it impossible to provide standalone content to be used in Shader Graph, and instead require manual editing of text within the graph? Unless I'm really missing something, that sounds like a huge step backward rather than an improvement. It creates a middle ground where everybody can sort of do stuff, but not easily, rather than a more difficult task done once by a developer. My impression from @Kink3d's comment is that this might be a bit of a debate internally as well, so as a potential user of this feature I'd like to get my feedback in early. We don't mind doing up-front work if it makes the overall workflow better!
I can answer a little bit of this question, though a lot of the in-depth stuff will likely need to be checked with the team first to make sure we're all NDA-safe. The new custom node allows for inline text editing as well as referencing a file path for an HLSL include. This is easier for people who really want the functionality, as now they can write straight HLSL in their favored IDE with all the bells and whistles and just drop it into the graph. If you need something small and quick (less than five lines, maybe?) then you can inject it straight in. In terms of reusing the nodes (i.e. the task-done-once-by-a-developer sort of process), you can wrap the custom node in a subgraph. When the final shader is compiled everything gets flattened out anyway, so multiple subgraphs don't add any extra performance cost to the final shader. Now that subgraphs are nestable, this is a much more viable option. In terms of the package release... yeah, the internal process to update that is not very fast and we didn't get as far ahead of it as we should have for this release. It is available when updating through the Package Manager, though, and each of the SRP templates available should have a note in the readme about it. It is definitely not ideal (we're working on fixing it), but they are here and functional.
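To make the file-based workflow described above concrete, here is a minimal sketch (the file name, function name, and parameters are all hypothetical, not from any post in this thread): an HLSL file containing a wrapped function, which a Custom Function node in File mode would reference by path and function name.

```hlsl
// MyNodes.hlsl -- hypothetical include file for a Custom Function node.
// The _float suffix matches the graph's float precision mode; the node's
// Name field would be set to "Desaturate" (without the suffix).
void Desaturate_float(float3 In, float Amount, out float3 Out)
{
    // Blend the input color toward its luminance (Rec. 601 weights).
    float luma = dot(In, float3(0.299, 0.587, 0.114));
    Out = lerp(In, luma.xxx, Amount);
}
```

The node's input and output ports map to the function's parameters in order, with outputs declared as out parameters.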
So will there be a way to "package" a subgraph up and have it appear as a node? That might be the best of both worlds if it was easy enough to register/install.
That's already built in -- when you create a subgraph, it's automatically added to Shader Graph's "Create Node" menu with all the rest of the nodes. Sharing them between projects is the same as sharing any Unity asset between projects, just with an added step of updating a filepath reference if you're using that in the Custom Node inside a subgraph.
Sorry, I meant for external use. e.g. as an Asset Store item or other shared project. Is there a way to add subgraphs from an external source? I think I saw something about cut and paste between graphs, but it would be great if I could add some sort of subgraph asset to my project and have it show up. (Maybe this is what it does, and I just missed it?)
Yes, that's exactly what it does. If you take your subgraph file and any HLSL include files and share them, they'll load into the project normally (just update the filepath reference for the custom node). They share just like regular Unity assets.
Alright then, I'll shut up until I can test the new implementation. It sounds like it should cover my use case.
Where can I find an example of using the 'Custom Function Node'? I tried to upgrade my CodeNodes from 2018.3 (4.10) to 2019.1 (5.10) and am having a hard time finding out what is and isn't supported from HLSL. For example: - Do I need to implement three methods just to cover the precisions (float, half, fixed)? - Is there a way to expose default values?
GDC 2019 Shader Graph Updates from Matt Dean (from alexandral_unity's sticky post) explains why the system was removed and the new alternative (which is only included in newer, non-verified versions of the package). In my opinion, the combination of SubGraph (the newer non-verified version, which allows typing and renaming of output slots) and the new node is a nearly complete replacement for the CustomFunctionNode class. One thing I would love to see in SubGraph is the ability to not only set constant default values for the input slots, but also bind something like UV0 to a Vector2 slot as a default (as was possible in the CustomFunctionNode class).
Completely closing the door on custom nodes really sucks. I'm glad that subgraphs are becoming more powerful, but unless you embed a full visual scripting system inside subgraphs, they will never be as powerful as custom nodes, no matter how much work you put into them. The main thing that not having code run during shader generation prevents is the density of functionality a given node can have. As a simple example, let's say our node needs to sample a texture and select some data from it. In code, it's trivial to have an option on the node which the user can set, and to write the code accordingly, so that there is no runtime cost incurred. In a subgraph, however, I cannot do this. My choices are either to write multiple copies of the node and have the user select the correct one, or to have runtime properties and code select the correct data, which is less performant. This compounds quickly as subgraphs are nested or functions gain more options. I use code generation like this in MicroSplat, which has several hundred options - a level of complexity simply not possible in any commercial shader graph, with the kind of optimizations that just aren't possible in a graph either. A possible improvement would be a way to expose enums and options to the user of the node, along with an internal branching structure which decides which graph gets chosen based on those settings, but that still ends up duplicating graph nodes all over the place. Given the removal of surface shaders, the introduction of SRPs that make hand-written shaders a nightmare, and the removal of custom code from the shader graph, it really feels like Unity is trying its best to make it hard to write shaders in their engine. I would suggest, if they think the graph is a fully capable replacement for our ability to write shaders, that they rewrite their entire shader library using a graph and see if it holds up to the task.
Rewrite Lit, LayeredLit, the Standard shader, a decent terrain shader, etc. Also rewrite any complex node you have without using the internal API - because that is what you are asking us to do. Until you can do that, with an overall better workflow, your graph is not a replacement for code. As a side note, the idea presented in that GDC talk that shaders should become entirely artist-driven and not require code is a foolish notion. The best work with graph-based systems comes when coders and artists are able to work together, not when one is shut out of the process. In any visual scripting system, the ideal case is that coders can solve hard code-centric problems and expose the results as simple nodes for the graph user. If this weren't the case, then the only nodes you would ever add to a shader graph would be the language intrinsics themselves, right? Imagine if this same set of outcomes was pushed on visual scripting:
- Make writing C# in the engine extremely hard and unmaintainable across new Unity versions
- Make C# not portable unless you use visual scripting
- Focus on artists and designers writing all game code, to the point of shutting coders out of the process
- Provide no way to write custom nodes
Sounds pretty crazy, right? But this is effectively the approach being taken here.
@jbooth Totally agree that there need to be ways of parameterizing and branching at the shader-generation level. Re-reading my post, it could sound like I'm fully happy with the new solution... I also think there needs to be more flexibility. But I understand why the API was made private. Going out of preview means the current features have to be maintained, and the public API was very hard to maintain. A programmer could fix compile errors when the API changes, but the real problem is the shaders that already use those nodes. If the deserialization of a node fails, you have to fix every shader that uses it. It's not just a matter of replacing the node; it also means redoing the wiring and the options on the node. If a central node of your shader breaks, that could mean you have to recreate (and rethink) the shader. That's why I think the decision to remove (or rather, "replace") the API was right, and I hope some sort of parameterization and branching will be provided soon. (I think the Shader Graph team was aware, even before disabling the API, that there needs to be an alternative to the advanced functions of the API... they just didn't have time to implement it before going out of preview.)
Those are all solvable problems, though; Unity has an auto-updating system for API changes, and serialization is a well-known issue in Unity. I highly doubt they are going to create a way to do branching in the shader-generation code, but even if they did, doing all of this in nodes is going to be very time-consuming to work with. It also doesn't address other issues that something like the Amplify Shader Editor nodes address, like being able to adjust pragmas at the node level. I really do believe that some form of C# API is the right way to go for this, as this is the very API they use themselves to add nodes - otherwise they should get rid of all non-intrinsic nodes and eat their own dogfood for all the complex ones (noise, POM, etc.). I love visual graph systems for domain-specific use cases, but sometimes code is just infinitely superior, and forcing everyone into a non-extensible node graph seems very un-Unity.
So, I converted my Stochastic Height Sampling node for 2019.1 today. Basically it's way worse than the custom node system, for both me and the user.
For me:
- I had to duplicate my sampling function five times, once for each option of which channel (or luminance) to pull the blending operator from.
- It becomes 6 subgraphs which I have to maintain, and 6 subgraphs in the menu instead of 1 node with options.
- White-on-white for input/output naming? Really?
- Nodes are cumbersome to write, when all I'm really doing is wrapping a function.
- Functions need _float appended to them for no apparent reason. God forbid I need to do _half and _fixed versions of them all for some reason.
For users:
- 6 subgraphs instead of one node with options.
- Subgraphs are not sorted the way you'd want them to be. Since there is now a subgraph for each channel option, you'd want to see them in RGBA-then-Luminance order. Instead they get alphabetized.
- Users have to manually unpack the normal now, since providing that as an option would mean another 6 subgraphs.
- Users have to plug in UVs instead of being able to use default UV values.
Having worked with many shader graphs, and having written one for one of my old companies' custom engines, this actually makes Unity's the worst one I've worked with in terms of being able to extend it and provide good interfaces for the user - except for Shader Forge when it was closed source. I think I'm going to look into hacking the assembly files on script initialization so that I can continue to work with the node API instead of releasing this to my users. It might be more of a pain for me in the long run when the API changes, but at least I won't have to ship a mess to my users.
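To illustrate the _float suffix complaint above, here is a sketch of what the per-precision duplication looks like (StochasticSample is a made-up name and the bodies are placeholders, not the actual asset's code): one copy of the wrapper per precision the graph may request.

```hlsl
// Hypothetical illustration of the per-precision suffix requirement:
// the same wrapper duplicated for float and half precision.
void StochasticSample_float(float2 UV, float Blend, out float4 Out)
{
    Out = float4(UV, Blend, 1.0); // placeholder body
}

void StochasticSample_half(half2 UV, half Blend, out half4 Out)
{
    Out = half4(UV, Blend, 1.0); // same logic at half precision
}
```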
After reading this thread, I really can't help but feel that the removal of the Node API and the impossibility of MasterNode creation was a terrible decision that needs to be re-evaluated by Unity. These decisions seem to accomplish nothing except sabotaging Shader Graph as a tool and making it less powerful for the entire community. The community benefits immensely from seemingly "non-accessible" features such as the Node API and custom MasterNodes, because the few people who know how to use those things can create powerful tools and then share their work online and on the store. Everyone wins. If I remember correctly, the main argument for this change was that the Node API was too much trouble when it comes to upgradability/compatibility. But you could totally just let node authors worry about upgrade-related issues if there ever are any, and everyone would be happy, I think. Unity doesn't have to worry about upgradability for such specialized features. It's kind of like when the Playables API was in its early experimental stage... or like ECS right now. Sometimes there are massive refactors and parts of the API disappear, but that's okay. I'm okay with using these APIs and dealing with these sorts of big changes by myself, without any auto-upgrade solutions, because it's better than not having access to them at all. ______ tl;dr: My suggestion is to just make it all public and feel free to refactor it as much as you want without worrying about us. I think this would be a reasonable middle ground that at least does not jeopardize the entire tool.
I support Mr. jbooth. Shader Graph is not a replacement for plain shader code; it is cumbersome and hard to read, analyze, and debug in the case of a truly complex shader - not just a simple demo for yet another happy "ShaderGraph beginner guide" on YouTube.
Well, the alternative is that we're all going to hack the assemblies and get access anyway. There are now at least 3 assets I know of on the store doing this, and I suspect more to come. This will only create pain and grief later, when we all have to trawl through a private API and see what broke.
For anyone interested, I ported the MainLight node to a 'Custom Function' for use in 2019.1. See this post. Writing custom shader code still seems to work pretty reasonably. Imports are a bit buggy, though.
I had a rather simple custom node drawing a hex grid in world position projected onto the x-z plane. It looked like this: https://pastebin.com/XLBxJMQb Now that that no longer works, I set it up as a subgraph with the code as inline text, which works fine. But I'm struggling with what I'd need to do to make it work when I want to load it from a file. What kind of boilerplate do I need to add? (I get "failed to open source file" when I just put the string into a file and reference it.) The beauty of the old system and the inline text version is that you don't really need to know HLSL to write custom nodes; you just specify the inputs and outputs and do some maths on them, without worrying about all that confusing boilerplate.
@AurelWu There isn't much more HLSL involved for an external file than what you have already. First of all, the 'failed to open source file' error leads me to believe that you weren't pointing to the correct file location. The Source field should have a reference like Assets/[YourFilePath]/[YourFile].hlsl, relative to the project directory. Secondly, rather than putting the code directly into a file, you have to wrap it in a function. In the Name field you then put the name of that function. (The actual function name should end in _float or similar; this suffix should be excluded from the Name field. It will give an appropriate error message about which function it's looking for if it doesn't find it.) Finally, the in and out parameters of the custom function should be given as parameters to the function in the HLSL file. Naming doesn't matter, only order. Return values should be added as out parameters, to which the resulting values can be assigned. Which should result in something like this: HexNode.hlsl Code (CSharp): void HexNode_float(float3 worldPosition, float2 gridSize, float borderWidth, out float value) { ... value = ... } Edit: fixed the missing out keyword for future reference, good catch!
That was easier than expected. Thanks a lot! (Your code is missing the ", out float value" parameter, but that was obvious to figure out.) As for the path, I had entered an absolute path, which doesn't seem to work, instead of one relative to the project directory.
Leaving a solution for converting AE Tuts' render probe type refraction node for LWRP to the new custom node: Code (CSharp): Out = refract(normalize(viewDirection), normalize(normalDirection), IOR);
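For anyone using the file-based variant instead of inline text, the same one-line snippet could be wrapped like this (the function name and parameter names are illustrative, not from the original node):

```hlsl
// Refraction.hlsl -- illustrative file-based wrapper for the
// one-line refraction snippet above.
void Refraction_float(float3 ViewDirection, float3 NormalDirection,
                      float IOR, out float3 Out)
{
    Out = refract(normalize(ViewDirection), normalize(NormalDirection), IOR);
}
```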
How do I input a sampler2D/3D to a custom node? I get "cannot implicitly convert from 'Texture3D<float4>' to 'sampler3D'".
Well said @jbooth . This decision is very unfortunate and will strongly influence the further positioning of Unity in the industry and in its own community. This will not only change the attractiveness of the platform for engineers but also for artists, as they will have to rely solely on Unity's update cycles and their provided tech. I hope the key decision makers at Unity are reconsidering their path.
I was just wondering if I can pipe out the shadows as a Texture2D... This has been baffling me for a long time. Sadly I don't know any HLSL. Is it actually possible? (I can imagine not - because if it were, why isn't it there already?)
Do it for Amplify - it sells twice as many copies as assets made for Unity's graph, is an open framework, and works on all 3 rendering pipelines.
So how do you use instancing and StructuredBuffers with this new approach? Or how about shader semantics? For example, how would you do a custom shader that does something similar to what the (undocumented, btw) LinearBlendSkinningNode does?
Ya, I just found out custom nodes are no longer a thing, and this is going to royally suck for advanced shaders - not just the ones we create, but third-party shaders that will now be far more complex than necessary, and far more complex to make simple changes to, the way we can with the built-in shaders. The whole paradigm they have been selling of SRP letting you customize everything is quickly turning out to be something very different than advertised. The whole "no more black boxes" mantra is getting a darker and darker grey every day. Obviously they think what they have created is too complex for most people to deal with, so they just closed it off. And it actually has turned into that case: I took a relatively simple shader and tracked all the related commits, and the bits and pieces scattered all over that are needed to support it are rather insane. I don't see how they can even reason about it all well internally. The complexity they have created just to do simple things, all to support their pretty graph - I'm sure there are more than a few people inside Unity questioning the value of that.