
Feedback Wanted: Shader Graph

Discussion in 'Graphics Experimental Previews' started by Kink3d, Jan 10, 2018.

Thread Status:
Not open for further replies.
  1. Elecman

    Elecman

    Joined:
    May 5, 2011
    Posts:
    1,332
    Does anyone know how the Fog node is supposed to be used? The shader code functions (MixFog() for LW and EvaluateAtmosphericScattering() for HD) have both inputs and outputs but the shader graph node only has two outputs.
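    For context, the underlying functions blend the lit colour toward the fog colour, so presumably the node's two outputs (Color and Density) are meant to be combined manually before the master node. A rough HLSL sketch of that blend (names are illustrative, and whether Density needs inverting may depend on the pipeline version):

```hlsl
// Sketch: reproducing MixFog()-style blending from the Fog node's outputs.
// fogColor and density would come from the Fog node; color is the shaded result.
half3 ApplyGraphFog(half3 color, half3 fogColor, half density)
{
    // Blend toward the fog colour by the fog amount. If Density is 1 near
    // the camera in your version, use (1 - density) instead.
    return lerp(color, fogColor, density);
}
```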
     
  2. GameDevCouple_I

    GameDevCouple_I

    Joined:
    Oct 5, 2013
    Posts:
    2,146
    As someone with many ideas for 2D games, I also agree with this sentiment. Unity effectively has all the right pieces to be a real competitor to pure 2D engines such as the UbiArt engine, but so far hasn't fully realised this. (Although the work on Sprite Shape, 2D rigging tools and Tilemaps has bridged the gap a lot, so really it's just a case of getting decent 2D lighting and we'll be dancing and singing!)
     
  3. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    25,844
    But I don't want 2D lighting or shadows. I want 3D lighting and shadows :) You can do it with custom shaders, but I'd like to use graphs where possible...

    Although it's fine as a resolved issue for us (we're doing a 3D game), my side projects and projects from friends could use some Shader Graph love.
     
  4. Jick87

    Jick87

    Joined:
    Oct 21, 2015
    Posts:
    124
    Is the Depth node working correctly in HDRP currently? I'm getting some odd results. I've tried a lot of suggestions from various posts in this thread (like this one), but none of them seem to work for me right now.

    I'm using Unity 2018.3.0b11 with HDRP 4.2.0 and Shader Graph 4.2.0.

    Not sure a screenshot would show the issue well; I'd need to make a video, I guess. Basically, it seems like the depth isn't being properly transformed or something, so it kind of moves across the object (a plane in this case) as the camera moves.

    It seems to work for other people in this thread though. Perhaps something broke in one of the newer beta minor versions? Or maybe I'm doing something wrong?

    I'm basically just trying to get basic depth blending on a plane for a water effect, which is what others here have done it seems like.

    Thanks for any help!
     
  5. McDev02

    McDev02

    Joined:
    Nov 22, 2010
    Posts:
    634
    So I just tried out Shader Graph and it's a charm. But there are a few things. Maybe these have already been posted, but anyway. All tested on the latest version of HDRP.

    • After each tiny change a progress bar pops up. At least until this is fixed (make it async at the very least), let us simply deactivate the preview and compilation. I coded shaders for 10 years without it and I need a quick workflow!
    • I see no way to adjust Properties inside Shader Graph for preview purposes, which makes the preview function kind of useless.
    • Please add Vertex Shader support, at least to manipulate Normals.
     
    Last edited: Nov 27, 2018
  6. Pimpace

    Pimpace

    Joined:
    Sep 21, 2017
    Posts:
    33
    Shadergraph_main_preview_bug.jpg

    In the main preview window you see a bucket custom mesh; then I tried to switch to the sphere and zoom out...

    Sorry if this has been posted, but since SG 2.0.8 or 2.0.9 the main preview window doesn't refresh itself, neither on mouse interaction nor when changing the preview shape. It only refreshes if I resize the preview inside the Shader Graph window, or resize the Shader Graph window itself.

    Is this going to be sorted out in a future release? (It's very annoying.)
     
  7. Krishna_psk

    Krishna_psk

    Joined:
    Feb 2, 2018
    Posts:
    6
    Friends, are there any tutorials on Shader Graph for beginners? I'd like to learn about it in detail for my game, especially stencils. Any links would be of great use to me.
     
  8. tweedie

    tweedie

    Joined:
    Apr 24, 2013
    Posts:
    278
    Can we change 'Vector 1' to just Float? To me, 'Vector 1' is a misnomer: vectors have magnitude and direction, and the notion of a direction is meaningless in one dimension. That is literally the definition of a scalar value.

    I'd be interested to hear if there was a reason for this naming. It would've been less surprising if they were all Float, Float2, Float3 etc, especially as this is what it is under the hood anyway. Sorry if this has already been discussed.
     
    dadude123 likes this.
  9. tweedie

    tweedie

    Joined:
    Apr 24, 2013
    Posts:
    278
    We wanted to be able to add a prefix UsePass before the body of a shader constructed in Shader Graph. After some digging we found where we could inject that, so I thought I'd detail it here:

    In HDLitMasterNode we added some properties, with serialized backing fields:

    Code (CSharp):
    [SerializeField]
    private string _prefixUsePass;

    public string prefixUsePass {
        get { return _prefixUsePass; }
        set {
            if (_prefixUsePass == value)
                return;
            _prefixUsePass = value;
            Dirty(ModificationScope.Graph);
        }
    }

    [SerializeField]
    private string _suffixUsePass;

    public string suffixUsePass {
        get { return _suffixUsePass; }
        set {
            if (_suffixUsePass == value)
                return;
            _suffixUsePass = value;
            Dirty(ModificationScope.Graph);
        }
    }

    Then at the end of the HDLitSettingsView constructor, we added two more PropertyRows for a Prefix / Suffix Use Pass:

    Code (CSharp):
    ps.Add(new PropertyRow(new Label("Prefix Use Pass")), (row) =>
    {
        row.Add(new TextField(), (textField) =>
        {
            textField.value = m_Node.prefixUsePass;
            textField.OnValueChanged(ChangePrefixUsePass);
        });
    });
    ps.Add(new PropertyRow(new Label("Suffix Use Pass")), (row) =>
    {
        row.Add(new TextField(), (textField) =>
        {
            textField.value = m_Node.suffixUsePass;
            textField.OnValueChanged(ChangeSuffixUsePass);
        });
    });


    And the corresponding methods to actually change these values:

    Code (CSharp):
    void ChangePrefixUsePass(ChangeEvent<string> evt)
    {
        if (Equals(m_Node.prefixUsePass, evt.newValue))
            return;

        m_Node.owner.owner.RegisterCompleteObjectUndo("Prefix Use Pass Changed");
        m_Node.prefixUsePass = evt.newValue;
    }

    void ChangeSuffixUsePass(ChangeEvent<string> evt)
    {
        if (Equals(m_Node.suffixUsePass, evt.newValue))
            return;

        m_Node.owner.owner.RegisterCompleteObjectUndo("Suffix Use Pass Changed");
        m_Node.suffixUsePass = evt.newValue;
    }

    Then, in HDLitSubshader's GetSubShader() we added the following code just after opening the Indent(), and another similar chunk further down for the suffix pass:

    Code (CSharp):
    if (!string.IsNullOrEmpty(masterNode.prefixUsePass)) {
        subShader.AddShaderChunk(string.Format("UsePass \"{0}\"", masterNode.prefixUsePass));
    }

    This was enough for us to inject a single pass pre/post the body of the lit shader. Might be useful for anybody wanting similar functionality. The result looks like this in the editor:



    The only limitation we've found is that when the injected pass is changed (for example, if it's in another graph), you also have to recompile any graphs using it (as UsePass literally just copy/pastes the text from the other shader). I haven't detailed the suffix pass too much as we're still breaking down the HDLit shader.

    The only other things to note are that it recompiles after every key press, because we didn't spend any time telling it not to serialize until you've finished typing, and that the text fields draw beyond the bounds of the menu when the pass name is longer than the field.

    Hope this is helpful for somebody!
     
    wyatttt, Elecman and hippocoder like this.
  10. tweedie

    tweedie

    Joined:
    Apr 24, 2013
    Posts:
    278
    Also, just to cram more stuff into this thread ( :) ), here's some feedback from our team:

    - Hotkeys for standard math functions; (A) Add, (M) Multiply etc
    - Creating/Naming a property should first default the reference name to the same as the title, prefixed with an underscore.
    - Does PerRendererData still work (guess this is more of an HDRP question)? If so, how do we make properties use PerRendererData.
    - Please default node previews to be hidden, or at least expose a preference setting for this
    - Dockable properties/preview panels. But preferably just a panel on the side of the window. Really can't imagine a scenario where I'd care about dragging these around, but perhaps that's just me.
    - Are the MousePosition vectors on the corners of the Drag selection box just for developers? Seems an odd thing to leave in
    - Reducing the amount of recompilation (particularly when adding/changing properties) would be great. Currently really breaks the flow.
    - If Vectors could have a connection line for each of their components (V2's to have 2 lines, V3s get 3) rather than just a colour difference, that would be lovely. This is as much an accessibility thing as it is a standard.
    - Hotkey for hiding preview, and another for showing it on selected nodes. (Not a toggle, if multiple nodes are selected their previews shouldn't flip-flop).
    - Hotkey for globally hiding/showing previews.
    - Save the last typed-text in AddNode menu.
    - Alt+Click / RightClicking a pin (socket?) to break the connection to it.
    - Another key for dragging a connection out of the pin/socket and plugging it into another. EDIT: Just discovered you click-drag the line, not the socket, to do this.
    - Grid background? Grid snapping?
    - Is there a code node / custom / logic? Or is the expected behaviour to write a custom node in C#
    - Nodes that are dragged upwards cause the graph to scroll upwards well before I'd expect them to (at least 200 pixels away from the window's upper border). This is surprisingly frustrating.
    - Can outputs from subgraphs only be V4s? If so this is really frustrating.
    - Is the only way to visibly convert a V4 into a V2 to Split it and then construct a new Vector2 from it? Using Channel mask seems to keep the type as V4 and, presumably, set B/A to 0.
    - Ability to add Comments
     
    Last edited: Nov 28, 2018
    Elecman, McDev02 and Jick87 like this.
  11. Jick87

    Jick87

    Joined:
    Oct 21, 2015
    Posts:
    124
    I'm not sure why they went with Vector1. I guess just as some sort of consistency with the others?

    As for "Vector" instead of "Float", I guess they were trying to pick something generic because these are not necessarily floats in the actual code. It depends on the "precision" that is set; they can end up being float, half, fixed, etc. I noticed this when going through some of the code for the built-in nodes: they never specifically use "float", "half", etc., but instead do string replacement with a value called "precision". I'm not sure where that value gets set, but that seems to be how they are doing it right now. This is one example.
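    As an illustration of that substitution pattern (not the actual Shader Graph source, which is more involved), the templating looks something like:

```csharp
// Illustrative sketch only: templating generated shader code on a
// "precision" token, similar in spirit to what the built-in nodes do.
string template = "{precision}4 Unity_Add({precision}4 A, {precision}4 B) { return A + B; }";
string generated = template.Replace("{precision}", "half");
// generated is now: half4 Unity_Add(half4 A, half4 B) { return A + B; }
```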
     
  12. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    25,844
    There are also radial vectors, position vectors and so on, a vector does not need to represent a direction in maths or in programming. For example in physics, it's common to define a point as a vector which can have a length from 0 to n.

    For shaders (HLSL) they can represent a direction, a position, a force, a colour, pretty much everything. GPUs describe them as Vector internally anyway, and have specialised vector processing so I am going to guess this was part of the rationale behind Vector.

    Doesn't really bother me. If you want to be properly bothered by this, head toward ECS land, particularly Unity mathematics library.
     
  13. rizu

    rizu

    Joined:
    Oct 8, 2013
    Posts:
    1,194
    I recently did an HD Lit particle shader with SG using the HD Lit Graph, but found that SG is still pretty inflexible for making a general-purpose "ubershader". In general I've found SG a really cool concept, but it's still limited to specific-purpose shaders and not really suitable for the more generic things you'd use in production.

    I've tried to give SG a fair chance a few times, but I keep going back to Amplify Shader Editor as it doesn't have the same limitations (or at least it gives you ways to work around them).

    Here are a few things I've come across so far:

    - You can't rename sub graph outputs. In case you have many outputs, it becomes really tedious to remember what output is what (this still baffles me that it's like this by design, hence listing it first).

    - You can't expose the final shader's public properties directly from a sub graph. If you expose a property it just becomes an "input" for the node on the main graph, which usually means you expose a ton of the SAME parameters on the sub graph and on your main graph. I'd love "expose" on a sub graph to literally expose the value in the material, with a separate checkbox for "input". Or just make sub graph nodes on the main graph default to exposing inputs that are left unconnected. Right now you end up with stuff like this (see how the ShurikenBaseColor and ShurikenDistortion sub graph nodes' inputs are just replicated as main graph properties; this is totally unnecessary duplicate work and makes the whole thing more bloated than it needs to be in this case).

    - You can't quickly create new properties while wiring the graph itself, for example in above case, one could at least have "create new property" option when you drag the wire out of the node input (like on UE4 materials) and let it default to same name as in the input source.

    - If you've used same property on your graph multiple times (to minimize the wiring mess) and then collapse the nodes into new sub graph (without properties feeding the nodes selected), you get duplicate properties in the new subgraph for the same thing so you always need to manually clean up (as it can't detect you actually had two property ref nodes using same ref).

    - If you do the same as above but select the property nodes as well before collapsing to sub graph, new graph does get the properties but everything is disconnected. I honestly don't know how these are designed to be used as every way I try seems to be suboptimal.

    - You can't expose the master node's properties. Imagine you want to toggle some expensive but fancy feature on/off; right now you can't. You have to duplicate the graph and make a variant there. This is where you'd think you'd save time by making more sub graphs, but due to most of the points mentioned above, it's not all that simple.

    - Exposed property types in general are lacking. SG only exposes what is needed by its own nodes, but if your shader requires some custom input type to be fed into a custom node, you don't have that option (again, I've had to revert to ASE for this in the past).

    - I'd REALLY love reroute nodes, things can get messy and I like to keep things organized. I know "pins" are on the internal roadmap but I'd love to get them already :)

    It's possible I've missed some workflows that would make some of these less of a pain for me, if so, please tell me :)
     
  14. tweedie

    tweedie

    Joined:
    Apr 24, 2013
    Posts:
    278
    Yeah this is certainly an argument. On Desktop everything is handled in full precision - here's a quote from the docs - perhaps this is why "precision" is used in its place, and some fancy platform-dependent stuff happens to shuffle the type appropriately.

    "One complication of float/half/fixed data type usage is that PC GPUs are always high precision. That is, for all the PC (Windows/Mac/Linux) GPUs, it does not matter whether you write float, half or fixed data types in your shaders. They always compute everything in full 32-bit floating point precision."

    (https://docs.unity3d.com/Manual/SL-DataTypesAndPrecision.html)

    EDIT: Thinking about this a little more, I think this reasoning explains the most. Calling it Float would imply that level of precision when it might be less. I'm still not sold on calling it a Vector1 though. "Vector" is a nice label to avoid implying precision, but I just don't think it's worth a 1D value being called that for the sake of consistency with the other types. Especially when there is explicitly a "Color" class and we're not just expected to use a Vector4.

    ---

    Radial/Unit vectors inherently have a direction, as they operate around an angle. Positions just define a direction from the origin, with a magnitude being the distance from the origin. I have personally never come across a 1D vector in physics, that is just a scalar.

    If this is the case, and GPUs call them that behind the scenes, I can understand why the decision was made. But it is distinctly odd to me that in HLSL we write float, float2, float3, yet in Shader Graph we're using V1, V2 etc. Parity is always nice.

    All the more reason for consistency imo!
     
    Last edited: Nov 27, 2018
    Jick87 likes this.
  15. DigitalSalmon

    DigitalSalmon

    Joined:
    Jul 29, 2013
    Posts:
    42
    Vector | A quantity having direction as well as magnitude, especially as determining the position of one point in space relative to another.

    Just chipping in with another vote for a rename - scalar or value both seem more intuitive. The concept of a Vector1 trades accuracy/logic for a consistency no one is asking for.
     
    Elecman, Mark_01 and tweedie like this.
  16. thelebaron

    thelebaron

    Joined:
    Jun 2, 2013
    Posts:
    313
    Is it possible to create decals with Shader Graph yet? I've seen this asked in this thread without answers (unless I missed something).
     
    EricLowry likes this.
  17. Karearea

    Karearea

    Joined:
    Sep 3, 2012
    Posts:
    322
    @Kink3d - I'm just in the middle of converting a few old shaders. At the moment I've been working on an ice material that utilises the camera opaque texture generated by the LWRP for a fake refraction effect.

    I'm generally really happy with how it looks, but at the moment it suffers from the classic problem of foreground objects bleeding into the distortion, as per this thread. I've tried to implement some nodes to match the suggested fixes, but I'm not really sure how to translate it to shadergraph completely.

    Attached is where I got to after trying a few different approaches, I'd really appreciate any advice on how to do this properly. Thank you!
     

    Attached Files:

  18. darkydoodle

    darkydoodle

    Joined:
    Oct 27, 2018
    Posts:
    49
    Did anyone manage to build a project with custom nodes? I get errors for all of my custom nodes:

    error CS0234: The type or namespace name 'ShaderGraph' does not exist in the namespace 'UnityEditor' (are you missing an assembly reference?)
    error CS0246: The type or namespace name 'Vector1' could not be found (are you missing a using directive or an assembly reference?)

    And so on… It works perfectly well in Play mode in the editor, though.
     
    P_Jong likes this.
  19. rizu

    rizu

    Joined:
    Oct 8, 2013
    Posts:
    1,194
    Don't include UnityEditor stuff in your build. Do you even need the custom nodes in the build? Shouldn't the relevant code be stored in the shader files themselves?
     
    darkydoodle likes this.
  20. dgoyette

    dgoyette

    Joined:
    Jul 1, 2016
    Posts:
    2,067
    Like rizu says, the CodeFunctionNode doesn't need to be included in the build. You can put those scripts in a folder named Editor, or wrap the whole script in "#if UNITY_EDITOR" to avoid it being added to the build.
     
    P_Jong and darkydoodle like this.
  21. darkydoodle

    darkydoodle

    Joined:
    Oct 27, 2018
    Posts:
    49
    I thought something along those lines, but when I moved my custom node folder out of the Assets folder, the nodes disappeared from the graph. I'll take a closer look. Thanks both of you :)
     
  22. rizu

    rizu

    Joined:
    Oct 8, 2013
    Posts:
    1,194
    It still has to be found by the editor, so you can't just take it out of Assets. You can put it in Assets/Editor etc., or just wrap the code in #if UNITY_EDITOR ... #endif as mentioned above.
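    To illustrate, a custom node wrapped this way might look like the sketch below. MyCustomNode and its function are placeholders, and the CodeFunctionNode API shown is the 2018-era one, which may differ between package versions:

```csharp
#if UNITY_EDITOR
using System.Reflection;
using UnityEditor.ShaderGraph;

// The whole class compiles only in the editor, so player builds
// never see the UnityEditor.ShaderGraph namespace.
[Title("Custom", "My Custom Node")]
public class MyCustomNode : CodeFunctionNode
{
    public MyCustomNode() { name = "My Custom Node"; }

    protected override MethodInfo GetFunctionToConvert()
    {
        return GetType().GetMethod("MyCustomFunction",
            BindingFlags.Static | BindingFlags.NonPublic);
    }

    static string MyCustomFunction(
        [Slot(0, Binding.None)] Vector1 In,
        [Slot(1, Binding.None)] out Vector1 Out)
    {
        Out = default(Vector1); // dummy assignment to satisfy the C# compiler
        // The returned HLSL string is what actually becomes the node body.
        return @"{ Out = In * 2; }";
    }
}
#endif
```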
     
    darkydoodle likes this.
  23. darkydoodle

    darkydoodle

    Joined:
    Oct 27, 2018
    Posts:
    49
    Ok, thanks, I've never written UNITY_EDITOR stuff before.
    While I'm at it, any news on instancing / MaterialPropertyBlock access from Shader Graph custom nodes? Also, is there no way to force a node to execute in the vertex shader?
     
  24. Karearea

    Karearea

    Joined:
    Sep 3, 2012
    Posts:
    322
    I've set up a branch utilising LinearEyeDepth as per @ssartell's guide above. This should avoid refraction artifacts around foreground objects. There's no discernible difference in the material; is there anything obviously amiss with the graph (beyond the Position output being split to 'A' rather than 'B', which I've fixed)?

    Edit: Turns out sampling _CameraDepthTexture works as expected..
     

    Attached Files:

    Last edited: Dec 1, 2018
  25. tertle

    tertle

    Joined:
    Jan 25, 2011
    Posts:
    1,856
    Slight issue I ran into today.

    I have a custom master node that was upgraded. It appeared to work fine in Shader Graph, but appeared pink in the actual game. No errors; the shader appeared to compile fine.

    I finally clicked "show generated code" in Shader Graph trying to figure out what was wrong, and only then was a NullReferenceException thrown (which was a mistake on my part in the sub shader code).
    But for whatever reason, before this the exception was completely suppressed, even when making a build.
    I don't feel like errors should be suppressed. It was a simple fix as soon as I got my hands on the actual exception.
     
  26. Reanimate_L

    Reanimate_L

    Joined:
    Oct 10, 2009
    Posts:
    2,379
    @wyatttt for Shader Graph node outputs, instead of using a Split node to separate channels, why not just add additional output slots for each channel, just like the outputs of the Split node?
     
  27. Elecman

    Elecman

    Joined:
    May 5, 2011
    Posts:
    1,332
    It would be nice if the Shader Graph developers themselves provided some more feedback on the feedback.
     
    Last edited: Dec 2, 2018
    GameDevCouple_I likes this.
  28. Reanimate_L

    Reanimate_L

    Joined:
    Oct 10, 2009
    Posts:
    2,379
    yeah it's been quiet lately
     
  29. z_space

    z_space

    Joined:
    Jul 17, 2016
    Posts:
    10
    Sorry, I think I missed something. You said sampling _CameraDepthTexture works as expected, but how exactly did you sample it in the shader graph? Thanks!
     
  30. Karearea

    Karearea

    Joined:
    Sep 3, 2012
    Posts:
    322
    Sure thing, see below. You need to make an unexposed Texture2D property for it, then sample it with the screen position. I had thought that's what the Scene Depth node was doing under the bonnet, but evidently not.
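    For anyone finding this later, the graph described above is the node equivalent of roughly this HLSL (a sketch; in LWRP, _CameraDepthTexture is declared by the pipeline when the depth texture is enabled, and screenUV stands for the screen-position UV):

```hlsl
// Sketch of what the graph does: sample the camera depth texture at the
// fragment's screen position, then linearise it for eye-space comparisons.
float rawDepth = SAMPLE_TEXTURE2D(_CameraDepthTexture, sampler_CameraDepthTexture, screenUV).r;
float sceneEyeDepth = LinearEyeDepth(rawDepth, _ZBufferParams);
// Compare sceneEyeDepth against the fragment's own eye depth for
// depth-blend effects such as soft water edges.
```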
     

    Attached Files:

    Jesus likes this.
  31. z_space

    z_space

    Joined:
    Jul 17, 2016
    Posts:
    10
    Thanks, but doing this gets me an error about the redefinition of "_CameraDepthTexture". This is in a fresh, empty HDRP scene.
     
  32. Karearea

    Karearea

    Joined:
    Sep 3, 2012
    Posts:
    322
    Ah, this is LWRP; I'm not sure if HDRP generates the same depth texture. In my experience you'll get that error if you've added that depth texture variable but also have a Scene Depth node somewhere in the graph.
     
  33. GameDevCouple_I

    GameDevCouple_I

    Joined:
    Oct 5, 2013
    Posts:
    2,146
    I am sure they are very busy and are taking note, but yeah, I do feel a bit more communication regarding the feedback given (even just marking each item with a quick "good", "bad", "not relevant" etc.) would do wonders.

    I see a lot of good feedback given, but it's hard to tell how relevant any of it is currently due to the API being in constant flux.
     
    wyatttt likes this.
  34. einWikinger

    einWikinger

    Joined:
    Jul 30, 2013
    Posts:
    34
    Having a node that provides an explicit interface between the vertex and fragment shaders would be very nice, e.g. to move complex computations so they are computed per-vertex and then interpolated for the fragment shader (i.e. an "Interpolator" node).

    My current use case is performing some complex UV transformations that are applied to all vertices of a specific triangle. Currently this transformation has to be computed for every fragment, where a per-vertex computation would suffice because it doesn't matter whether the result is interpolated or computed for each fragment.

    (Something like http://wiki.amplify.pt/index.php?title=Unity_Products:Amplify_Shader_Editor/Vertex_To_Fragment )
     
    Last edited: Dec 4, 2018
  35. KnightPista

    KnightPista

    Joined:
    May 18, 2015
    Posts:
    38
    I've just updated shader graphs from 2018.2 to 2018.3.b12 and all Sub-Graph nodes in the shader graphs now have disconnected input properties. The property nodes are still there, but they're not connected as inputs to the sub-graphs.
     
    Last edited: Dec 6, 2018
  36. Jick87

    Jick87

    Joined:
    Oct 21, 2015
    Posts:
    124
    Is it possible to get the post-modified vertex position in Shader Graph for use in custom nodes? I have some nodes fed into the Position slot on an HDLit master node which modify the position. I then try to use SAMPLE_TEXTURE2D in a custom node utilizing Binding.ObjectSpacePosition to do some custom UV work based on the object-space position, but in that case the position is not the modified one but the original, so the texture does not follow the vertices.

    I had assumed the position you feed into the Position slot would modify the position value used everywhere else in the shader, but that doesn't seem to be the case, at least not in a custom node.
     
  37. Daoyee

    Daoyee

    Joined:
    May 1, 2018
    Posts:
    2
    Beautiful solution that I've been looking for! However, I'm confused about the Scene Depth node, as I could not find it in Shader Graph. Could you share more on this?
    Thank you!

    Update: found it in the 2019.1 beta.
     
    Last edited: Dec 7, 2018
  38. minhdaubu2

    minhdaubu2

    Joined:
    Jun 10, 2014
    Posts:
    69
    Hello. I want to use a CustomEditor for a shader generated by Shader Graph, but I don't know how.
    Can anyone help me?
     
  39. wyatttt

    wyatttt

    Unity Technologies

    Joined:
    May 9, 2018
    Posts:
    288
    You mean on the node itself? Like a Vector3 node would have 3 Vector1 output ports instead of 1 Vector3 output?
     
  40. wyatttt

    wyatttt

    Unity Technologies

    Joined:
    May 9, 2018
    Posts:
    288
    This is something we are looking out for as we change things with ShaderGraph
     
    noio likes this.
  41. wyatttt

    wyatttt

    Unity Technologies

    Joined:
    May 9, 2018
    Posts:
    288
    1. You should be able to use MaterialPropertyBlocks with Shader Graph materials. We now wrap properties in CBUFFER macros.
    2. Could you explain this a bit more for me? Do you mean forcing a routine to execute per vertex and then passing that data to the fragment shader?
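    For anyone trying point 1, it's the standard MaterialPropertyBlock pattern; the only Shader Graph-specific part is that "_BaseColor" below stands in for whatever reference name you gave the property on the graph's blackboard:

```csharp
using UnityEngine;

// Per-renderer override of a Shader Graph property without creating a
// material instance. "_BaseColor" is a placeholder reference name.
public class TintOverride : MonoBehaviour
{
    static readonly int BaseColorId = Shader.PropertyToID("_BaseColor");

    void Start()
    {
        var rend = GetComponent<Renderer>();
        var block = new MaterialPropertyBlock();
        rend.GetPropertyBlock(block);           // preserve any existing overrides
        block.SetColor(BaseColorId, Color.red); // set the per-renderer value
        rend.SetPropertyBlock(block);
    }
}
```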
     
  42. wyatttt

    wyatttt

    Unity Technologies

    Joined:
    May 9, 2018
    Posts:
    288
    I believe a fix for this has been made. IIRC, it required a change in the Unity source and not just something within the ShaderGraph / SRPs packages. So it will probably require that you update Unity
     
  43. wyatttt

    wyatttt

    Unity Technologies

    Joined:
    May 9, 2018
    Posts:
    288
    I believe this is mostly for precision reasons. Vector1/2/3 could be float, float2, float3 or half, half2, half3, etc. in the shader. While this is a feature we have yet to implement, I think that's why we opted for Vector as the naming convention
     
    noio likes this.
  44. Karearea

    Karearea

    Joined:
    Sep 3, 2012
    Posts:
    322
    I tend to collapse my nodes as soon as I’ve wired them in anyway, so I’d be down for one xyz output, with individual x,y,z outputs stacked below.
     
  45. Elecman

    Elecman

    Joined:
    May 5, 2011
    Posts:
    1,332
    Can you please expose the w component of the mesh tangent? Currently the mesh tangent is only accessible as a Vector3, but in the mesh data it is a Vector4.
     
  46. Reanimate_L

    Reanimate_L

    Joined:
    Oct 10, 2009
    Posts:
    2,379
    Yes, one Vector3 output and 3 Vector1 outputs.
     
  47. Aladine

    Aladine

    Joined:
    Jul 31, 2013
    Posts:
    138
    Hey folks,

    Does Shader Graph support replacement shaders? If not, is there any news about that?

    thanks
     
  48. webjeff

    webjeff

    Joined:
    Mar 11, 2015
    Posts:
    32
    I am trying to replicate a shader I made myself inside Shader Graph, but I don't think it's possible in the current state.

    I have a 2-pass shader: one pass draws the object normally using ZTest LEqual, then another pass with ZTest Greater draws it as a solid color (fast) when it's behind a building.

    Is this possible in Shader Graph?

    Thanks!
     
  49. Jick87

    Jick87

    Joined:
    Oct 21, 2015
    Posts:
    124
    As far as I know, multiple passes are not currently possible in Shader Graph. However, you could make two different shaders (one for each pass you need), export the code, and combine them manually. You won't be able to work with the combined shader in Shader Graph after that, but it should at least be a decent workaround. You could also keep the two graphs around in case you want to tweak them later; you'd just have to go through the process of combining them again whenever you change them.
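    For reference, the hand-written two-pass pattern being replicated looks roughly like this in ShaderLab (a sketch with the per-pass shading elided; the shader and pass names are illustrative):

```shaderlab
// Sketch: the two-pass "draw a silhouette when occluded" pattern that
// currently has to be assembled by hand outside Shader Graph.
Shader "Custom/OccludedSilhouette"
{
    SubShader
    {
        // Pass 1: normal rendering where the object is visible.
        Pass
        {
            Name "Visible"
            ZTest LEqual
            // ... regular lit shading ...
        }
        // Pass 2: runs only where the object fails the normal depth
        // test, i.e. where it is hidden behind other geometry.
        Pass
        {
            Name "Occluded"
            ZTest Greater
            ZWrite Off
            // ... cheap solid-colour output ...
        }
    }
}
```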
     
  50. wyatttt

    wyatttt

    Unity Technologies

    Joined:
    May 9, 2018
    Posts:
    288
    So kind of like the SampleTexture2D node, where there's a port for RGBA and then ports for each individual channel on the same node. Makes sense. I'll make the suggestion.
     
    Reanimate_L likes this.