
Question on Combining Vertex/Fragment and Surface Shaders

Discussion in 'Shaders' started by Slashking, Jul 8, 2019.

  1. Slashking

    Joined:
    Feb 4, 2017
    Posts:
    3
    So I don't normally ask questions on here (usually someone out there has already asked and answered the same thing), but I couldn't find an answer to this specific problem so far.

    So I'll keep it short and simple. I've got a vertex and fragment shader: I displace my model with a noise texture in the vertex shader and combine two textures in the fragment shader (a main texture and an overlay texture), all pretty basic stuff. Now I'm sure what I want is very easy and simple to achieve as well, but I couldn't find an answer so far. I want to take this output (the displacement and the textures) and add some basic lighting to it. To my knowledge this is done using a surface shader, and I've got it all mostly set up already, but how do I take the output of the vertex and fragment shader and plug it into my surface shader for further computation?

    So tl;dr: how do I take the output of a vertex and fragment shader and process it further in a surface shader?

    I'll attach the shader I've got so far below:


    Code (CSharp):
    Shader "Unlit/PlayerShader"
    {
        Properties
        {
            _MainTex ("Texture", 2D) = "white" {}
            _SideTex ("Texture", 2D) = "white" {}
            _OverlayTex ("Texture", 2D) = "black" {}
            _TintColor ("Tint Color", Color) = (1,1,1,1)
            _NoiseSpeed ("Noise Speed", Float) = 0.25
            _NoiseStrength ("Noise Strength", Float) = 0.25
        }
        SubShader
        {
            Tags { "RenderType"="Opaque" }
            LOD 100

            Pass
            {
                CGPROGRAM
                #pragma vertex vert
                #pragma fragment frag
                // make fog work
                #pragma multi_compile_fog

                #include "UnityCG.cginc"

                struct appdata
                {
                    float4 vertex : POSITION;
                    float2 uv : TEXCOORD0;
                };

                struct v2f
                {
                    float2 uv : TEXCOORD0;
                    UNITY_FOG_COORDS(1)
                    float4 vertex : SV_POSITION;
                };

                sampler2D _MainTex;
                sampler2D _SideTex;
                sampler2D _OverlayTex;
                float4 _MainTex_ST;
                float4 _TintColor;
                float _NoiseSpeed;
                float _NoiseStrength;

                v2f vert (appdata v)
                {
                    v2f o;

                    // sample the noise texture in the vertex stage (tex2Dlod, since
                    // derivatives aren't available here) and scale the vertex along
                    // its object-space position
                    float4 colVal = tex2Dlod(_SideTex, float4(v.uv + (_Time.y * _NoiseSpeed), 0, 0));
                    v.vertex.xyz *= 1 + (colVal.x * _NoiseStrength - (_NoiseStrength / 2));

                    o.vertex = UnityObjectToClipPos(v.vertex);
                    o.uv = TRANSFORM_TEX(v.uv, _MainTex);

                    UNITY_TRANSFER_FOG(o, o.vertex);
                    return o;
                }

                fixed4 frag (v2f i) : SV_Target
                {
                    // sample the main texture
                    fixed4 col = tex2D(_MainTex, i.uv) * _TintColor;
                    // where the overlay has alpha, show it instead of the main texture
                    fixed4 overlayCol = tex2D(_OverlayTex, i.uv);
                    if (overlayCol.w > 0.1) {
                        col = overlayCol;
                    }
                    // apply fog
                    UNITY_APPLY_FOG(i.fogCoord, col);
                    return col;
                }

                ENDCG
            }

            CGPROGRAM
            #pragma surface surf Lambert

            struct Input {
                float4 color : COLOR;
            };
            void surf (Input IN, inout SurfaceOutput o) {
                // empty so far -- this is where the frag output should end up
            }

            ENDCG
        }
    }
     
  2. Slashking

    Slashking

    Joined:
    Feb 4, 2017
    Posts:
    3
    So I figured it out: the fragment shader's logic can be merged into the surface shader's surf function, and the surface shader's vertex stage can be overridden with a custom vert function (see the sketch below).

    For completeness' sake, I'd still be interested in how/whether it's possible to transfer the output of a fragment shader either way, though ^^
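
    Roughly like this, reusing the properties and logic from the shader above (a minimal, untested sketch of the merge; the shader name is just a placeholder):

    Code (CSharp):
    Shader "Custom/PlayerSurfaceShader"
    {
        Properties
        {
            _MainTex ("Texture", 2D) = "white" {}
            _SideTex ("Texture", 2D) = "white" {}
            _OverlayTex ("Texture", 2D) = "black" {}
            _TintColor ("Tint Color", Color) = (1,1,1,1)
            _NoiseSpeed ("Noise Speed", Float) = 0.25
            _NoiseStrength ("Noise Strength", Float) = 0.25
        }
        SubShader
        {
            Tags { "RenderType"="Opaque" }
            LOD 100

            CGPROGRAM
            // vertex:vert makes the generated vertex stage call our vert function
            #pragma surface surf Lambert vertex:vert

            sampler2D _MainTex;
            sampler2D _SideTex;
            sampler2D _OverlayTex;
            float4 _TintColor;
            float _NoiseSpeed;
            float _NoiseStrength;

            struct Input {
                float2 uv_MainTex; // tiling/offset applied automatically
            };

            // same displacement as the old vertex shader
            void vert (inout appdata_full v)
            {
                float4 colVal = tex2Dlod(_SideTex, float4(v.texcoord.xy + (_Time.y * _NoiseSpeed), 0, 0));
                v.vertex.xyz *= 1 + (colVal.x * _NoiseStrength - (_NoiseStrength / 2));
            }

            // the old frag logic, written to Albedo instead of SV_Target
            void surf (Input IN, inout SurfaceOutput o)
            {
                fixed4 col = tex2D(_MainTex, IN.uv_MainTex) * _TintColor;
                fixed4 overlayCol = tex2D(_OverlayTex, IN.uv_MainTex);
                if (overlayCol.w > 0.1) {
                    col = overlayCol;
                }
                o.Albedo = col.rgb;
            }
            ENDCG
        }
        FallBack "Diffuse"
    }

    Lighting and fog then come from the generated passes, so the manual Pass and the fog macros aren't needed anymore.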
     
  3. Invertex

    Joined:
    Nov 7, 2013
    Posts:
    1,550
    The surface shader is a vert and frag shader. A surface shader is just giving you a way to generate a fragment shader that has all the shading functions done for you; all you need to do is feed texture/color values into the outputs you want to use, like albedo, normal, emission, height, etc...

    So no, there's no way to transfer the data from the frag to the surf, because the surf *is* the frag once it compiles.

    (you could render to a render texture and use that RT as the input for a surface shader... but that's a pretty costly thing to do and only niche scenarios require that kind of thing)
     
    xVergilx likes this.
  4. Slashking

    Joined:
    Feb 4, 2017
    Posts:
    3
    I see, so merging the functions as I did is the only real way of going about things then?

    Seems a bit weird to me that there's no real way of communicating between two different passes in a shader, well at least if those passes are vert+frag and a surface shader. But I did fix my problem, so it's fine I guess ^^
     
  5. bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,348
    Basically, yes. Technically there are other ways of passing information between passes, using shared read/write buffers for example, but this isn't generally something you want to rely on: it can be very slow, and isn't available on all platforms.

    Generally speaking, a shader pass has two stages: the vertex shader and the fragment shader. Both stages have access to any properties set on the material, or otherwise passed to it from the application (user-set global shader properties, the current time, the camera's view and projection matrices, the object's transform matrices, optionally some subset of the lighting data, etc.).

    The vertex stage additionally has access to mesh data, one vertex at a time per invocation, and outputs data that gets passed to the fragment shader. The fragment stage gets access to the interpolated vertex stage output, depending on where in the triangle that invocation is currently rendering, and outputs a single color value that is immediately written into the current render target using the current blend mode.

    Any data calculated during an invocation of the vertex or fragment shader that isn't output to either the vertex-to-fragment interpolators or the render target is forgotten as soon as that invocation finishes. Each stage and invocation has no access to any other stage or invocation* outside of those narrowly controlled paths.

    * Fragment shaders have some limited communication between multiple fragment shader invocations, as GPUs actually calculate 4 pixels at a time in 2x2 groups: you can get the difference between the current invocation and the invocation to the side or above/below via the partial derivative functions ddx, ddy, and fwidth. GPUs use this functionality to derive texture mip levels, but it can be used for other purposes as well, like anti-aliased lines or per-pixel surface curvature.
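
    For example, a minimal fragment-stage sketch of the anti-aliased line idea (the _LineWidth property and the line position are made up for illustration):

    Code (CSharp):
    // draws an anti-aliased vertical line at u = 0.5, using fwidth to get
    // a smoothing band that is always about one pixel wide on screen
    float _LineWidth; // hypothetical property, e.g. 0.05

    fixed4 frag (v2f i) : SV_Target
    {
        // distance in UV space from the line
        float dist = abs(i.uv.x - 0.5);
        // fwidth(dist) = |ddx(dist)| + |ddy(dist)|: how much dist changes
        // between this invocation and its neighbors in the 2x2 group
        float aa = fwidth(dist);
        // smooth the edge over that one-pixel band instead of a hard cutoff
        float lineMask = 1.0 - smoothstep(_LineWidth - aa, _LineWidth + aa, dist);
        return lerp(fixed4(0, 0, 0, 1), fixed4(1, 1, 1, 1), lineMask);
    }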

    Like @Invertex mentioned, Surface Shaders are vertex fragment shaders, or rather vertex fragment shader generators. You can see the actual shader code used by the game by clicking the "Show Generated Code" button in the inspector. Each shader pass can only have one #pragma vertex and one #pragma fragment, or the shader compiler will tell you to eff off. The actual function names are totally arbitrary, which confuses some people, since the vertex functions for vertex fragment shaders and surface shaders are usually named the same in examples. But for a vertex fragment shader that function is the main entry point for the vertex stage, while for surface shaders the generated main vertex function is named vert_surf, and the vertex function you define is just another function the main vertex function calls.
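
    To make that concrete, a side-by-side sketch of the two #pragma setups (names arbitrary, as noted above):

    Code (CSharp):
    // vertex fragment shader: "vert" and "frag" ARE the stage entry points
    #pragma vertex vert
    #pragma fragment frag

    // surface shader: the compiler generates the real entry points (e.g. vert_surf),
    // and the "vert" you supply here is just a helper the generated code calls
    #pragma surface surf Lambert vertex:vert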
     
    FM-Productions, ecv80 and Slashking like this.