Feedback Wanted: Shader Graph

Discussion in 'Graphics Experimental Previews' started by Kink3d, Jan 10, 2018.

Thread Status:
Not open for further replies.
  1. wyattt_

    wyattt_

    Unity Technologies

    Joined:
    May 9, 2018
    Posts:
    424
    If you are able to recreate the graph in Shader Graph, you might be able to get it to work with the PostProcessing stack and an Unlit Shader Graph. I've had some luck with that when doing post fx.
    https://docs.unity3d.com/Packages/com.unity.postprocessing@2.0-preview/manual/index.html#scripting
    Scroll down just a bit and see the "Writing Custom Effects" section.
     
  2. wyattt_

    wyattt_

    Unity Technologies

    Joined:
    May 9, 2018
    Posts:
    424
    In what way is this not working? I.e., what results are you expecting, and how are you using the node?
     
  3. angelomoro

    angelomoro

    Joined:
    Sep 4, 2018
    Posts:
    13
    Hi there!

    Is it possible to get the scene's lighting data into the Light Direction input? This is needed to enhance the "Toon Ramp" example in the ShaderGraph_ExampleLibrary, since that graph relies on a manually defined Vector3 "Light Direction".

    Thanks!
     
  4. Jesus

    Jesus

    Joined:
    Jul 12, 2010
    Posts:
    504
    Yes, but for now it's best to do it with just one directional light, usually the main directional light used for the sun (etc).

    Use a script to grab the light direction from the directional light gameobject. Then, in that same script, use Shader.SetGlobalVector.
    See here: https://docs.unity3d.com/530/Documentation/ScriptReference/Shader.SetGlobalVector.html
    Give it a useful name, like ManualLightDirection. Also, watch out: SetGlobalVector needs a Vector4, and a direction is only a Vector3, so you have to append a 0 on the end.

    Then in your shader you just use the same as the example.

    On the blackboard you should have a vector4, rename it to ManualLightDirection as both name and reference name, and UNTICK exposed (this will make it vanish from the usual material inspector since it doesn't need to be there).

    For point lights, it's different. Instead of sending the direction, you send the position. This is because the light direction changes based on the position of the rendered pixel, if that makes sense. I might include it in a later demo.
     
    angelomoro likes this.
  5. Desoxi

    Desoxi

    Joined:
    Apr 12, 2015
    Posts:
    195
    Well, i created the custom node like i posted in a previous post like this:

    Code (CSharp):
    using System.Collections;
    using System.Collections.Generic;
    using System.Reflection;
    using UnityEditor.ShaderGraph;
    using UnityEngine;

    [Title("Custom Noise", "Y Rotation Matrix")]
    public class YRotationMatrixCustomNode : CodeFunctionNode
    {
        public YRotationMatrixCustomNode()
        {
            name = "YRotationMatrixCustomNode";
        }

        protected override MethodInfo GetFunctionToConvert()
        {
            return GetType().GetMethod("RotationMatrixY", BindingFlags.Static | BindingFlags.NonPublic);
        }

        static string RotationMatrixY(
            [Slot(0, Binding.None)] Vector1 v,
            [Slot(1, Binding.None)] out Matrix3x3 Out)
        {
            return @"
    {
        Out = float3x3(0, v, 0,
                       0, 0, 0,
                       0, 0, 0);
    }
    ";
        }
    }
    And using it in the shader graph as node gives me this as a result:
    Node.PNG

    And this in the console:

    Code (CSharp):
    1. Assertion failed on expression: 'success'
    2. UnityEditor.EditorApplication:Internal_CallUpdateFunctions()
    I tried mat3 and other initializations as well, like float3x3(0) etc., but none of them worked.
    Maybe I am really missing something obvious, though this is not the first custom node I'm writing.

    EDIT: The same happens with Matrix3x3 or Matrix2x2 as out type.
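    For comparison, a conventional Y-axis rotation matrix body would look something like the HLSL below. This is only a sketch of what the returned string could contain; it is not a confirmed fix for the assertion error, which seems tied to the Matrix3x3 output type itself:

    ```hlsl
    {
        // Standard rotation about the Y axis by angle v (in radians)
        Out = float3x3( cos(v), 0, sin(v),
                        0,      1, 0,
                       -sin(v), 0, cos(v));
    }
    ```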
     
  6. Hugo-Habets

    Hugo-Habets

    Joined:
    Jan 14, 2016
    Posts:
    3
    Hi all,

    I'm trying to sample from a Texture3D in a CodeFunctionNode. The slots:

    [Slot(4, Binding.None)] Texture3D Volume,
    [Slot(5, Binding.None)] SamplerState Sampler,

    ...and the test code:

    float4 c = SAMPLE_TEXTURE3D(Volume, Sampler, float3(0.5, 0.5, 0.5));

    ...gives me:

    Compilation error in graph at line 165 (on d3d11):
    undeclared identifier '_RayMarchVolume_436F3918_Sampler'
    UnityEditor.EditorApplication:Internal_CallUpdateFunctions()

    Any ideas what's wrong?
     
  7. Desoxi

    Desoxi

    Joined:
    Apr 12, 2015
    Posts:
    195
    Can you post the full custom node code? And please use the "Insert Code" button at the top of the text input.
     
  8. Hugo-Habets

    Hugo-Habets

    Joined:
    Jan 14, 2016
    Posts:
    3
    Sure no problem, here it is:

    Code (CSharp):
    using System.Reflection;
    using UnityEditor.ShaderGraph;
    using UnityEngine;

    [Title("Test", "RayMarchVolumeTest")]
    public class RayMarchVolumeTest : CodeFunctionNode
    {
        public RayMarchVolumeTest()
        {
            name = "RayMarchVolumeTest";
        }

        protected override MethodInfo GetFunctionToConvert()
        {
            return GetType().GetMethod("RayMarchVolumeFunction",
                BindingFlags.Static | BindingFlags.NonPublic);
        }

        static string RayMarchVolumeFunction(
            [Slot(0, Binding.WorldSpacePosition)] Vector3 Position,
            [Slot(1, Binding.ViewSpacePosition)] Vector3 CameraPosition,
            [Slot(2, Binding.ObjectSpacePosition)] Vector3 ObjectPosition,
            [Slot(3, Binding.None)] Vector1 Steps,
            [Slot(4, Binding.None)] Texture3D Volume,
            [Slot(5, Binding.None)] SamplerState Sampler,
            [Slot(6, Binding.None)] out Vector3 HitPosition)
        {
            HitPosition = new Vector3();
            return @"
    {
        float4 c = SAMPLE_TEXTURE3D(Volume, Sampler, float3(0.5, 0.5, 0.5));
        HitPosition = c;
    }";
        }
    }
    Thanks for the help!
     
  9. angelomoro

    angelomoro

    Joined:
    Sep 4, 2018
    Posts:
    13
    That's the ticket! Of course it will do with the main (Sun) light. Thank you very much :D
     
  10. jason_sone

    jason_sone

    Joined:
    Jul 7, 2017
    Posts:
    8
    In the standard renderer, I can use transparent PNG images for my buttons, but with the LWRP, the alpha channel is gone. What am I doing wrong?
     
  11. Desoxi

    Desoxi

    Joined:
    Apr 12, 2015
    Posts:
    195
    Hey! I got it working with exactly the same code, though the way there is extremely weird.
    You have to do the following:

    1. Compile your custom node code like you already did.
    2. Create the custom node in Shader Graph (it will be pink and output the errors you already posted).
    3. Create another "SamplerState" node and connect it to the sampler state input of your custom node.
    4. Go back to your custom node code in C# and change something (add a space somewhere, etc.) so it has to recompile.
    5. Go back to Unity and you will see that the error no longer appears and your custom node is not pink but grey.

    I think this is some kind of bug, because the slot should get a default sampler state as input, but I couldn't find a Binding for that in C#.
     
    Hugo-Habets likes this.
  12. Hugo-Habets

    Hugo-Habets

    Joined:
    Jan 14, 2016
    Posts:
    3
    Yes, you've got it! Thank you so much!

    I guess this is something still under construction at the moment, since I couldn't find any example code that reads a texture (all graphs I found use the texture nodes out of the box). There is also a SampleTexture3DNode that actually does compile when used in a Slot, but at the moment using it breaks the "Create Node" menu. :)

    Thanks again!
     
    Desoxi likes this.
  13. equalsequals

    equalsequals

    Joined:
    Sep 27, 2010
    Posts:
    154
    Similarly, the ability to define attributes on the ShaderProperties would be nice. Colors do this with the HDR attribute, but it seems very bespoke. It would be nice to be able to flag a given field as PerRendererData if your graph is intended to play nicely with MaterialPropertyBlocks.
     
  14. Jesus

    Jesus

    Joined:
    Jul 12, 2010
    Posts:
    504
    Use this code, assign your primary directional light.

    Code (CSharp):
    using System.Collections;
    using System.Collections.Generic;
    using UnityEngine;

    [ExecuteInEditMode]
    public class CustomLightingCore : MonoBehaviour {

        public Light MainLight;                                      //  The important light we want to use.

        public Vector4 MainLightDirectionV4;                         //  The holding vector where we append a 0 so it's a Vector4, not a Vector3.
        public Color MainLightColor;                                 //  Grab the colour of the light.
        public float MainLightIntensity;                             //  Grab the light intensity.

        public string LightVectorName = "_ManualLightDirection";     //  The name we're calling this, which MUST match the shader.
        public string LightColorName = "_ManualLightColor";          //  The name we're calling this, which MUST match the shader.
        public string LightIntensityName = "_ManualLightIntensity";  //  The name we're calling this, which MUST match the shader.

        // Update is called once per frame
        void Update () {

            MainLightDirectionV4 = new Vector4(-MainLight.transform.forward.x, -MainLight.transform.forward.y, -MainLight.transform.forward.z, 0.0f);
            MainLightColor = MainLight.color;
            MainLightIntensity = MainLight.intensity;

            Shader.SetGlobalVector(LightVectorName, MainLightDirectionV4);
            Shader.SetGlobalColor(LightColorName, MainLightColor);
            Shader.SetGlobalFloat(LightIntensityName, MainLightIntensity);
        }
    }
    Then, create this graph:
    customlightingbase_graph.png

    Try out some of the features by plugging the various 'Preview' nodes into the Color output.

    Also, you can see I forgot to make a texture property; create a Texture 2D node and make it a property so you can try your own texture in each material. Plug that into the Sample Texture 2D node at the top.

    You can see the light is computed as a dot product of the light direction and the normal direction. NOTE the minus signs ahead of the light direction components in the code above; you must include these or the light will be backwards. The base lighting is then multiplied by the light colour and intensity, and at the top by the texture for the first and second previews. The third preview skips the colour/intensity.

    The second-to-last preview comes straight from a remap: this is half lighting. Valve did this a fair bit in the Half-Life 2 games to make the lighting softer and more diffuse; it's like the light source is the entire grey cloudy sky instead of a single point like the sun. It also works for the areas in HL2 that have giant glowing screens.

    This half lighting is what you use for the ramp input. If you have your own gradient texture, combine the half lighting and 0.5 (or the other way around) and use that as the UV for your toon ramp gradient texture.

    At the bottom you can see a toon ramp/cutoff. It uses a remap and a slider to lerp between a shadowed colour and a lit colour. NOTE I use a remap and a clamp with values like (0.49-0.51) to (0-1) because that gives a softer line; if you use step(0.5) you get a harsh on/off line that's ugly. I've attached this to a slider adjusted in steps of 0.01 so you can control the cutoff in the inspector.


    EDIT: note that this can grab light direction, colour and intensity from a directional light object. It does NOT get shadowed, isn't affected by lightmaps and may misbehave if you disable the light/component.
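    The lighting math described above can be sketched in HLSL. This is only an illustration of the graph's node chain, not the exact generated code; worldNormal stands in for whatever normal input the graph uses, and the global property names match the script:

    ```hlsl
    float3 N = normalize(worldNormal);
    float3 L = normalize(_ManualLightDirection.xyz);

    float NdotL = dot(N, L);                   // basic Lambert term, -1..1
    float halfLight = NdotL * 0.5 + 0.5;       // half lighting: remap -1..1 to 0..1

    // Base lit colour: lighting times light colour and intensity
    float3 lit = halfLight * _ManualLightColor.rgb * _ManualLightIntensity;

    // Soft toon cutoff: remap a narrow band (0.49-0.51) to (0-1) and clamp,
    // rather than a hard step(0.5), to avoid an ugly aliased edge
    float cutoff = saturate((halfLight - 0.49) / (0.51 - 0.49));
    ```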
     
    drallcom3 and angelomoro like this.
  15. Jesus

    Jesus

    Joined:
    Jul 12, 2010
    Posts:
    504
    Also, I've made this available.

    Let me know how it runs!

    Download link (~25mb) by clicking the picture on THIS WEBPAGE.

    LWRP_Challenge_readme_pic.png

    README:

    Download project, unzip, then open project with Unity.

    Requires Unity 2018.2

    Unity Package Manager

    RP-Core + Lightweight RP + Shader Graph Editor v3.0.00

    Post Processing v2.0.11

    For performance measurements, note that the Render Pipeline Asset has a resolution scale of 2.0 – effectively 2x supersampling AA. If you have performance issues, try turning that value to 1-1.5; that should make it run faster.

    The Reflection Probe capture doesn’t happen every frame unless you’re in play mode, use that to get full reflections.

    If you need to regenerate terrain, clouds, etc, the options are in the Control Panel GameObject in the scene hierarchy.
     
    jackytop and wyattt_ like this.
  16. pointcache

    pointcache

    Joined:
    Sep 22, 2012
    Posts:
    579
    The default decal shader is not fit for my use case; I need split blend masks per channel (see https://forum.unity.com/threads/decal-shader-problems.551920/ ).
    Upon looking at the decal directory in HDRP, I realized that I don't understand any of it and won't be able to simply copy and modify the shader to my needs.
    So my last bet would be to try to recreate it in Shader Graph. However, I understand deferred decals only on a theoretical level. I know that the map channels modify the G-buffer contents, but I am not able to find a G-buffer node. Can anyone point me in the right direction?
     
  17. Desoxi

    Desoxi

    Joined:
    Apr 12, 2015
    Posts:
    195
    Is there a way to manually edit and add some lines of code to the generated shader code?
     
  18. angelomoro

    angelomoro

    Joined:
    Sep 4, 2018
    Posts:
    13
    It's working as expected; the shader reacts great to the Direction. I was confusing the Direction (a space vector) with the Rotation (the Quaternion representation).

    I'm eager to try out the Outline, it looks fantastic in the Graph.

    Thanks!
     
  19. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,352
    Right click on the Master Node in the graph view and select Show Generated Code.
    Then copy that into a new .shader file and modify as you want.
     
  20. konsic

    konsic

    Joined:
    Oct 19, 2015
    Posts:
    995
    How to make glass material in HDRP with refraction ?
     
    angelomoro likes this.
  21. Desoxi

    Desoxi

    Joined:
    Apr 12, 2015
    Posts:
    195
    Tried that already but it never shows any code and outputs an error in the console (don't know which exactly right now because I'm on my phone).
     
  22. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,352
    My best guess is you assigned a folder name for your shader. That breaks that feature.
     
  23. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Hi all, wondered if the shader graph will enable us to replicate the HDRP lit shader as normal, but force the shader to draw on top of other opaque meshes. I need it so that some objects will render in front without a second camera - such as a melee weapon. I wasn't able to see any of the required functionality to implement this unfortunately...
     
  24. Grimreaper358

    Grimreaper358

    Joined:
    Apr 8, 2013
    Posts:
    789
    This will need to be done with the stencil buffer, so I guess they have to implement that into the Master Node settings at some point. It's been requested by other people here for other effects too, so people are looking forward to something like this, especially for first-person games.
     
    Desoxi and hippocoder like this.
  25. Grimreaper358

    Grimreaper358

    Joined:
    Apr 8, 2013
    Posts:
    789
    Not really a Shader Graph topic, but it's pretty simple:
    Change the material surface type to Transparent
    Enable double-sided if needed
    Reduce the Base Color alpha (set the color to what you want; black for just clear glass)
    Scroll down to the Transparency Inputs
    Refraction Model - Sphere if the object is rounded; if flat, use Plane
    SSRay Model - Set to Proxy
    Index of Refraction - Adjust to your liking

    That should be it; refraction should be looking pretty sweet
    Refraction.png
     
    hippocoder and konsic like this.
  26. pointcache

    pointcache

    Joined:
    Sep 22, 2012
    Posts:
    579
    I would also like to know. upload_2018-9-8_13-46-46.png

    First question about procedural shapes, does anyone know the answer?

    RTFM, guys. From the docs:

    Note that in order to preserve the ability to offset the shape within the UV space, the shape will not automatically repeat if tiled. To achieve a repeating rounded rectangle effect, first connect your input through a Fraction Node.
     
    Last edited: Sep 8, 2018
  27. konsic

    konsic

    Joined:
    Oct 19, 2015
    Posts:
    995
    What are the Shader Graph equivalents of the radial gradient exponential, parallax occlusion, and flipbook nodes in these materials:

     
    angelomoro likes this.
  28. pointcache

    pointcache

    Joined:
    Sep 22, 2012
    Posts:
    579
    I've seen a Flipbook node in Shader Graph.
     
    konsic likes this.
  29. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,352
    There are no built-in equivalents to the first two nodes you listed.

    The first one can mostly be replicated easily with a Distance node, though I don't know the exponential density formula being used; it's likely just your basic exponential fog equation.

    The parallax occlusion node is harder. People have already released custom nodes which do this, but the node graph alone doesn't offer the functionality required to do it efficiently.
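    The basic exponential falloff mentioned above can be sketched like this; the density value and the centre point are assumptions to tune, and uv stands in for whatever coordinate input the graph feeds into the Distance node:

    ```hlsl
    float d = distance(uv, center);        // what the Distance node computes
    float gradient = exp(-density * d);    // exponential falloff: 1 at the centre, fading outward
    ```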
     
    konsic likes this.
  30. Desoxi

    Desoxi

    Joined:
    Apr 12, 2015
    Posts:
    195
    Is there a way to sample depth from a texture inside a custom node?
     
  31. wyattt_

    wyattt_

    Unity Technologies

    Joined:
    May 9, 2018
    Posts:
    424
    "Show Generated Code" is broken atm. You can right-click the Master Node in a Shader Graph, use "Copy Shader", and paste the contents of your clipboard into a new .shader file and make edits to that shader. At that point, however, you won't be working with Shader Graph anymore.
     
    Desoxi and bgolus like this.
  32. Desoxi

    Desoxi

    Joined:
    Apr 12, 2015
    Posts:
    195
    I was able to copy the Shader Graph file into a new project and use the "Show Generated Code" feature there. But the Copy Shader way is much quicker, thanks!
     
  33. cAyouMontreal

    cAyouMontreal

    Joined:
    Jun 30, 2011
    Posts:
    315
    Correct me if I'm wrong, but a depth texture to me is just a greyscale of depth, so you can just use one channel (red, for example), where 0 means far and 1 means near.
     
  34. Desoxi

    Desoxi

    Joined:
    Apr 12, 2015
    Posts:
    195
    Yes once it is sampled correctly it is. But until then you have to use something like this to do the sampling

    Code (CSharp):
    Linear01Depth(SAMPLE_DEPTH_TEXTURE(_CameraDepthTexture, sampler_CameraDepthTexture, i.texcoordStereo));
    where "_CameraDepthTexture" is declared like this in an HLSL block:


    Code (CSharp):
    HLSLINCLUDE
        TEXTURE2D_SAMPLER2D(_CameraDepthTexture, sampler_CameraDepthTexture);
    ENDHLSL
    At least it's done this way in post-process effects. I'm not sure how to declare such properties in a node in Shader Graph (I don't think that is possible yet). Also, we would have to include StdLib.hlsl for the Linear01Depth method to work (though we could easily implement it ourselves, so that's not the main problem).

    What I am doing right now is to do the sampling inside a custom post-process effect, write the depth into an RGB texture as greyscale like you mentioned, and then sample that texture inside another shader.

    The only problem I'm facing is that this texture seems to jitter, although no anti-aliasing is activated (especially not TAA). I couldn't find a solution to this so far, but I'm still searching.

    I thought the jitter could be a result of me doing the depth buffer sampling inside the custom post-process effect, and that is why I asked whether it is possible to sample the depth buffer directly from a custom Shader Graph node.
     
    awesomedata likes this.
  35. EricLowry

    EricLowry

    Joined:
    Apr 12, 2015
    Posts:
    17
    Hi there,

    I've been experimenting with using ShaderGraph to create some UI elements.

    The basic idea is to use an Unlit ShaderGraph material on an UI.Image component to be able to render images with some fancy layering/blending happening.

    The problem is that it works PERFECTLY in the editor (even when switching to scene view while on play), but the image just shows up as a black rectangle in the play window.

    Someone seems to have posted a similar question here: https://gamedev.stackexchange.com/q...shadergraph-for-ui-elements-with-transparency
    And as far as I can tell, UI shading would require a specific output type other than PBR or Unlit?
    Is there a way around this?

    _____

    Here is an example ShaderGraph I'm using:
    ShaderGraph1.jpg

    And the results (in the editor, and in the game):
    ShaderGraph2.jpg

    Note that adding an image with the same alpha channel as the Image component's sprite will just produce a black box that encompasses any visible pixels:
    ShaderGraph3.jpg
     
    Last edited: Sep 13, 2018
    awesomedata likes this.
  36. Jesus

    Jesus

    Joined:
    Jul 12, 2010
    Posts:
    504
    @EricLowry I think it's something to do with stencil values. I remember doing similar things in Shader Forge and you had to get them right (or expose them to the material inspector) otherwise it would do that. UI elements are rendered in-scene fine, because of...reasons? I could never figure that out.
     
    EricLowry likes this.
  37. EricLowry

    EricLowry

    Joined:
    Apr 12, 2015
    Posts:
    17
    Okkaaayy?
    I'm not quite sure how this is supposed to work, but it seems that the limits of ShaderGraph don't currently allow me to output the correct type of transparency to be used with UI Images?

    Or am I really misunderstanding this?
     
  38. LostPhilosopher

    LostPhilosopher

    Joined:
    Feb 25, 2018
    Posts:
    23
    I was wondering, will we be able to use Shader Graph in 2D games or the 2D game mode of Unity in the future? Or do we have to change the pipeline / convert effects into sprites and then merge them with the desired objects one by one for the desired effects?
     
  39. Soaryn

    Soaryn

    Joined:
    Apr 17, 2015
    Posts:
    328
    Is there a way to customize the sub shader graph output node to output a specific type?
    Currently it seems that it always defaults to Vector4. I'd like it to show Vector1. While I understand it will auto convert upon use, the previews from that point on become rather broken.

    Alternatively, is there a way to change what preview type the SubGraph shows in the main graph?
     
    sand_lantern and Desoxi like this.
  40. Desoxi

    Desoxi

    Joined:
    Apr 12, 2015
    Posts:
    195
    Yes, this is an especially big problem when converting a pretty big graph into a subgraph: you then can't connect the previous Vector1 nodes with the Vector4 nodes for whatever reason. Sometimes it works when bridging them over a Preview node, but most of the time it simply doesn't work and the whole subgraph is useless.
     
    sand_lantern likes this.
  41. wyattt_

    wyattt_

    Unity Technologies

    Joined:
    May 9, 2018
    Posts:
    424
    We have Depth nodes on the way, I believe.

    If you're using Lightweight/HDRP, you might have to enable the depth texture via the pipeline asset settings (HDRP probably enables it by default). If those nodes aren't in the latest package via Package Manager and you're not on 2018.3, you can create a Texture2D property, change its reference name to "_CameraDepthTexture", unflag the "Exposed" toggle, then add that node to your graph. But, as you said, you'll have to do the conversion to linear space yourself. With the actual Depth node, you won't have to do that; it should be linearized already.
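    For the manual linear-space conversion, Unity's built-in Linear01Depth helper is essentially the following (a sketch; it relies on the built-in _ZBufferParams vector, where x = 1 - far/near and y = far/near for a conventional depth buffer):

    ```hlsl
    // Converts a raw device-depth sample to linear 0..1 depth (near..far)
    float Linear01Depth(float rawDepth)
    {
        return 1.0 / (_ZBufferParams.x * rawDepth + _ZBufferParams.y);
    }
    ```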
     
    awesomedata likes this.
  42. MadeFromPolygons

    MadeFromPolygons

    Joined:
    Oct 5, 2013
    Posts:
    3,982
    Does anyone know how to do full-screen effect shaders for LWRP or HDRP? I'm trying to work out how to do what was previously done via Graphics.Blit in a C# script on the camera.
     
  43. Desoxi

    Desoxi

    Joined:
    Apr 12, 2015
    Posts:
    195
    I think you were trying it in OnRenderImage like we had to before the new render pipelines. I had the same problem, but this helped a lot:

    https://github.com/Unity-Technologies/PostProcessing/wiki/Writing-Custom-Effects

    And here is an example of how to export the depth texture of a camera (though I don't recommend it, because the resulting texture seems to jitter and no one here could tell me how to change that):

    https://forum.unity.com/threads/how...n-onrenderimage-is-not-getting-called.552091/
     
    jackytop and MadeFromPolygons like this.
  44. Desoxi

    Desoxi

    Joined:
    Apr 12, 2015
    Posts:
    195
    Okay, I didn't try that approach with the reference name yet. As I posted above, I tried to "export" the texture via a custom post-process effect to a render texture (but there is a problem where the resulting texture jitters; I don't know why, no one here seems to know either, and no, there is no TAA enabled).

    How can I control which camera writes to "_CameraDepthTexture" when I have two cameras and I'm using the reference-name approach?
     
  45. Desoxi

    Desoxi

    Joined:
    Apr 12, 2015
    Posts:
    195
    Okay, I tried to sample the Texture2D with the reference name "_CameraDepthTexture", but the problem now is that the editor says the following:

    Code (CSharp):
    Shader error in 'test': redefinition of '_CameraDepthTexture' at line 230 (on d3d11)

    Compiling Vertex program
    Platform defines: UNITY_ENABLE_REFLECTION_BUFFERS UNITY_USE_DITHER_MASK_FOR_ALPHABLENDED_SHADOWS UNITY_PBS_USE_BRDF1 UNITY_SPECCUBE_BOX_PROJECTION UNITY_SPECCUBE_BLENDING UNITY_ENABLE_DETAIL_NORMALMAP SHADER_API_DESKTOP UNITY_LIGHT_PROBE_PROXY_VOLUME UNITY_LIGHTMAP_FULL_HDR
    And three other exceptions which basically say the same but on other lines.

    The same happens in Shader Graph v3.3.0 with a slightly different message:

    Code (CSharp):
    Compilation error in graph at line 238 (on d3d11):
    redefinition of '_CameraDepthTexture'
    UnityEditor.EditorApplication:Internal_CallUpdateFunctions()
     
  46. wyattt_

    wyattt_

    Unity Technologies

    Joined:
    May 9, 2018
    Posts:
    424
    @Desoxi, Where/how is it jittering and are you trying to use it in a Shader Graph after copying the buffer?
     
  47. wyattt_

    wyattt_

    Unity Technologies

    Joined:
    May 9, 2018
    Posts:
    424
    These are my guesses. Post a gif if it's not too much trouble and I can tell ya if that's the case or not.

    Typical Depth Buffer "Jitter" Scenario:
    Typically it will "jitter" because of when (actual time during the render process) you are accessing the current Camera's Depth Texture. The depth buffer gets filled as opaque geometry is drawn to the screen and isn't "filled" until that has been done, so, if your shader is used as part of the opaque geometry queue, the depth buffer for that frame has not been filled yet and it is probably using the depth buffer from the previous frame. Your depth buffer is basically lagging behind one frame. To fix this, you generally set the queue of the shader to "Transparent" since transparent geometry gets rendered after opaque geometry (at which point the depth buffer will be filled)

    Your situation:
    I believe the situation might be similar for you. You copy the depth buffer via post-processing to another buffer and use that in your shader but post-processing is done after opaque/transparent geometry is drawn which means your shader has already been used for rendering and you are copying what is now the previous frame's depth buffer to your intermediate depth buffer. Now it's the next frame and your shader is again used in the render pass but now you are using old depth buffer data stored in that intermediate buffer

    Side Note:
    It might also "jitter" because there is also the SceneView camera, which has its own depth pass, so you might at times actually be using the SceneView camera's depth buffer in addition to your main camera's. This should be rectified when entering Play Mode without the SceneView active.

    I'll post this on the other thread too
     
    Last edited: Sep 19, 2018
    awesomedata and Desoxi like this.
  48. Arnklit

    Arnklit

    Joined:
    May 18, 2015
    Posts:
    48
    Is it possible to do a shell-style fur shader with Shader Graph? After messing around with it for a while, I don't really see any way to make it render multiple versions of your mesh.
    This is the style of thing I'm thinking about: http://www.reddotgames.pl/unitystore/furfx/furForces.png
     
  49. pointcache

    pointcache

    Joined:
    Sep 22, 2012
    Posts:
    579
    How do I make an overlay shader that renders on top of everything? I need it to be transparent, with additive blending, rendering on top of everything, in HDRP.
     
  50. wyattt_

    wyattt_

    Unity Technologies

    Joined:
    May 9, 2018
    Posts:
    424
    Totally doable but you can't generate the geometry with Shader Graph. You can make the shell shader with Shader Graph though. Look at the Graphics.DrawMesh-related API for rendering multiple versions of the same Mesh. Currently working on the same thing myself!

    Ideally, you'd use Graphics.DrawMeshInstanced but that may not work as expected at the moment since instanced data buffers are not supported in the current release of Shader Graph

    Graphics.DrawMesh API: https://docs.unity3d.com/ScriptReference/Graphics.DrawMesh.html
    Info on GPU Instancing: https://docs.unity3d.com/Manual/GPUInstancing.html
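    A minimal sketch of the shell loop described above, using the Graphics.DrawMesh API with a MaterialPropertyBlock per shell. The _ShellOffset property name is an assumption; your Shader Graph would expose a matching property that pushes vertices out along the normal:

    ```csharp
    using UnityEngine;

    public class FurShells : MonoBehaviour
    {
        public Mesh mesh;             // base mesh to draw repeatedly
        public Material furMaterial;  // material built from your shell Shader Graph
        public int shellCount = 16;
        public float shellSpacing = 0.005f;

        void Update()
        {
            for (int i = 0; i < shellCount; i++)
            {
                // Each shell gets its own offset via a MaterialPropertyBlock,
                // so all shells share one material without extra material instances.
                var props = new MaterialPropertyBlock();
                props.SetFloat("_ShellOffset", i * shellSpacing); // hypothetical graph property
                Graphics.DrawMesh(mesh, transform.localToWorldMatrix, furMaterial, 0, null, 0, props);
            }
        }
    }
    ```

    Note that property-block values only reach the graph if the reference name matches an exposed property, so check the blackboard's reference field.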
     
    Last edited: Sep 18, 2018