Help for my custom strategic map terrain shader?

Discussion in 'Shaders' started by ComteJaner, Jun 23, 2018.

  1. ComteJaner

    Joined: Jun 9, 2013
    Posts: 9
    Hello, I am working on a grand strategy project and am currently working on the terrain rendering. I want to avoid using a splatmap to texture the terrain, and since I don't need complex blends (at most 3 textures blending together on a single triangle), I think I could use vertex data to achieve what I want.

    I will explain my idea with a basic example on a single triangle:

    I use the vertex color (r, g, b) and 3 other float values d1, d2, d3 (stored in an unused UV channel or something).

    This triangle will have 3 vertices: v1, v2, v3.

    v1 color data (1,0,0), d1=idx1, d2=0, d3=0
    v2 color data (0,1,0), d1=0, d2=idx2, d3=0
    v3 color data (0,0,1), d1=0, d2=0, d3=idx3

    idx1, idx2 and idx3 being the indices of the textures I want to use at each corresponding vertex.

    Then in the fragment shader, supposing the interpolation (barycentric) coordinates are i1, i2, i3, we have the color in the fragment = (i1, i2, i3), and I can recover idx1 = d1/i1, idx2 = d2/i2 and idx3 = d3/i3.

    I worry about division precision...
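    Here is roughly the recovery step I have in mind in the fragment shader (an untested sketch with made-up names; I would round to try to counter the float error):

    Code (CSharp):
    //Untested sketch: recover the per-vertex texture indices after interpolation.
    //color.rgb is the interpolated (i1, i2, i3); texData is the interpolated (d1, d2, d3).
    float3 w = i.color.rgb;
    float3 d = i.texData;

    //Dividing undoes the interpolation, and round() should absorb small float
    //error -- but it becomes unstable where a weight is very close to 0.
    float idx1 = round(d.x / max(w.x, 0.0001));
    float idx2 = round(d.y / max(w.y, 0.0001));
    float idx3 = round(d.z / max(w.z, 0.0001));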

    I am a total shader beginner, so can someone help me a little bit? Is it a good idea? Would you have some outline of the code? Should I take another approach, use a command buffer or something else?

    Thanks in advance.
     
  2. ModLunar

    Joined: Oct 16, 2016
    Posts: 374
    This sounds like a really great idea! I'm actually doing something very similar, though I'm not sure why you need the division part. What I've been working on is texturing a voxel terrain using a similar technique. I calculated that a byte was enough to store one of my "texture weights", as I call them, so I can store 4 of these weights in one 4-byte integer. It would be challenging to support "as many as you want" textures blending at any given spot, so in my opinion it's a good idea to only care about the few most important textures with the highest weights at each spot, since it saves a lot of room memory-wise.
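    For reference, that packing could look something like this on the C# side when building the mesh data (a hypothetical sketch, not my exact code):

    Code (CSharp):
    //Hypothetical sketch: pack four 8-bit texture weights (each 0-255)
    //into one 32-bit int, e.g. while building the mesh data on the CPU.
    static int PackWeights(byte w0, byte w1, byte w2, byte w3) {
        return w0 | (w1 << 8) | (w2 << 16) | (w3 << 24);
    }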

    In my approach, my mesh may use UVs (TEXCOORD0), so I didn't want to take that channel up with the texture weights.

    I also found out that TEXCOORD1 and TEXCOORD2, the next two UV sets, are typically used for pre-calculated lighting techniques (Baked GI for TEXCOORD1, and Realtime GI for TEXCOORD2) -- at least in my case, since I'm using a surface shader.

    But the fourth UV set in TEXCOORD3 is unused, so that's where I put my texture weights.

    With my project, I haven't gotten to the part where I actually support more than 4 textures, but I think I'll be using an additive surface shader to render the additional textures, four at a time per pass. But I still need to look into that part. So far, this is kind of what I have:


    Code (CSharp):
    Shader "2kPS/Voxel Terrain 2D Base" {
        Properties {
            _Layer0 ("Texture 0", 2D) = "white" {}
            _Layer1 ("Texture 1", 2D) = "white" {}
            _Layer2 ("Texture 2", 2D) = "white" {}
            _Layer3 ("Texture 3", 2D) = "white" {}

            _Layer0_ST ("Texture 0 Tiling & Offset", Vector) = (1, 1, 0, 0)
            _Layer1_ST ("Texture 1 Tiling & Offset", Vector) = (1, 1, 0, 0)
            _Layer2_ST ("Texture 2 Tiling & Offset", Vector) = (1, 1, 0, 0)
            _Layer3_ST ("Texture 3 Tiling & Offset", Vector) = (1, 1, 0, 0)
        }

        SubShader {
            Tags {
                "RenderType" = "Opaque"
                "Queue" = "Geometry-100"
            }
            LOD 200

            CGPROGRAM

            #pragma surface surf Standard vertex:vert fullforwardshadows noinstancing
            #pragma target 3.0

            sampler2D _Layer0;
            sampler2D _Layer1;
            sampler2D _Layer2;
            sampler2D _Layer3;

            float4 _Layer0_ST;
            float4 _Layer1_ST;
            float4 _Layer2_ST;
            float4 _Layer3_ST;

            struct VertexInput {
                float4 vertex : POSITION;
                float3 normal : NORMAL;
                //texcoord is also a special name the surface shader will recognize and
                //do stuff with that I don't want, so I needed to name my TEXCOORD0
                //something else. Also not something starting with "uv" either. rawUV works
                //well -- it's untouched by Unity's auto-generated surface shader stuff
                //float2 texcoord : TEXCOORD0;
                float2 rawUV : TEXCOORD0;
                float2 texcoord1 : TEXCOORD1; //Baked GI
                float2 texcoord2 : TEXCOORD2; //Realtime GI

                //cVal stands for control value -- the normalized blending weights, each
                //in range [0, 1], for the 4 most important textures at a given vertex.
                //This cVal alone supports up to 4 textures (since it has 4 components xyzw or rgba)
                half4 cVal : TEXCOORD3;
            };

            struct Input {
                //This would be SO wasteful! I wouldn't take 4 UV channels when I could just use
                //one (TEXCOORD0), and store the _ST values from the Properties block.
                //float2 uv_Layer0;
                //float2 uv_Layer1;
                //float2 uv_Layer2;
                //float2 uv_Layer3;

                float2 rawUV : TEXCOORD0;
                half4 cVal : TEXCOORD3;

                //For terrains with more than 4 textures, they'll need to make use of
                //the additive pass shader as well as this base shader.
            };

            void vert(inout VertexInput v, out Input o) {
                UNITY_INITIALIZE_OUTPUT(Input, o);

                //My custom transfer of stuff! (The interpolators) :)
                o.cVal = v.cVal;
                o.rawUV = v.rawUV;
            }

            void surf(Input IN, inout SurfaceOutputStandard o) {
                half4 cVal = IN.cVal;

                fixed4 color0 = tex2D(_Layer0, TRANSFORM_TEX(IN.rawUV, _Layer0));
                fixed4 color1 = tex2D(_Layer1, TRANSFORM_TEX(IN.rawUV, _Layer1));
                fixed4 color2 = tex2D(_Layer2, TRANSFORM_TEX(IN.rawUV, _Layer2));
                fixed4 color3 = tex2D(_Layer3, TRANSFORM_TEX(IN.rawUV, _Layer3));

                fixed4 finalColor = color0 * cVal.r
                    + color1 * cVal.g
                    + color2 * cVal.b
                    + color3 * cVal.a;

                //Albedo is a fixed3, so take just the rgb
                o.Albedo = finalColor.rgb;
            }
            ENDCG
        }
        FallBack "Diffuse"
    }

    So just a couple of notes:

    • I called my 4 textures _Layer0, _Layer1, _Layer2, and _Layer3

    • Unity allows you to scale and offset each texture individually by using the name of the texture property (in your shader), suffixed by "_ST", originally standing for Scale & Translation (the offset)

    • With surface shaders, I could have used variables in my Input struct called uv_Layer0, uv_Layer1, uv_Layer2, etc. (because I named my textures _Layer0, _Layer1, etc.). But each of those uses a TEXCOORD semantic behind the scenes! I didn't want to waste 4 texture coordinate semantics just for that, so I made my "rawUV" as TEXCOORD0 and transformed the rawUV in my surface shader individually for each texture. That left me the room in TEXCOORD3 to store my texture weights, which I called "cVal" (I tried to be short with its name haha)

    • I used a vertex shader (with vertex:vert in the surface shader #pragma) to transfer my custom data from the vertex stage to the surface shader's Input struct. This was for my "rawUV" and "cVal". This works like sending stuff from the vertex shader to the fragment shader -- the values get interpolated by the GPU automatically, so I just set them equal to each other and the magic happens with all that barycentric goodness that we don't need to handle :)

    ---

    Now, a disclaimer: I have not had any feedback on the performance of doing the TRANSFORM_TEX in the surface function to avoid all those TEXCOORDs being eaten up by the uv_Layer0, etc. surface shader convention. But each one should just be a component-wise vector multiplication and addition.

    By the way, that TRANSFORM_TEX I use in the surface function is a macro defined by Unity that applies the _ST variable to the UVs you pass into it. The syntax for it is:

    TRANSFORM_TEX(float2 uv, [nameOfTextureHere]), and it'll use [nameOfTextureHere]_ST (you must have the _ST variable defined, which is a float4 holding the xy scale in its xy components and the xy offset in its zw components).
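    For reference, the macro in UnityCG.cginc boils down to a single multiply-add:

    Code (CSharp):
    //The macro as defined in UnityCG.cginc:
    #define TRANSFORM_TEX(tex, name) (tex.xy * name##_ST.xy + name##_ST.zw)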

    ---

    Anyway, that's a lot of information -- I hope it's not overwhelming. You might know a lot of it already, but I hope it helps! Good luck with what you're doing :D
     
    Last edited: Jun 24, 2018
  3. brownboot67

    Joined: Jan 5, 2013
    Posts: 375
    MicroSplat/MegaSplat do exactly what you're looking for.
     
  4. ComteJaner

    Joined: Jun 9, 2013
    Posts: 9
    Thank you a lot! Unity is truly an amazing community and you are amazing. How would you support more textures? For my terrain I need to support 20-something different textures, which is why I want to store "indexes" in the vertices of the triangles and not only blend factors as you do -- otherwise I would be limited in the number of textures by the amount of data I can store in the vertices. My problem is that the idx values I store will be messed up by the interpolation, and I need to retrieve the original values by countering the interpolation blend, hence the division! But I worry that the division will not be precise enough for my purpose...

    I will be sure to show you my shader when/if I get it working!

    MicroSplat seems a little overkill and complicated for what I need, from what I've seen. I prefer a simpler shader with only the features I need, so I have more control and can add my own features without having to read and understand a long and complex shader file. From the geometric requirement in their docs (3 vertex colors in one triangle), I guess they use a similar technique to achieve this result!
     
  5. ModLunar

    Joined: Oct 16, 2016
    Posts: 374
    Of course, I'm glad to help! There's not enough talk going around about shaders :p

    And oh yeah, you're right! Haha sorry, I haven't modified the shader to support more than 4 textures yet, but it's definitely within reach from where the shader is so far. I've been focusing on another area of my terrain code lately, so I haven't gotten around to it.

    I think I'm going to need to write a second surface shader that uses "decal:add" in the "#pragma surface" line, which will make that second shader an additive one. With that, you can have multiple additive passes drawing the terrain, each pass drawing 4 more textures on top, reusing the _Layer0, _Layer1, _Layer2, and _Layer3 texture properties and setting them each time through C#.

    I'm getting closer though to actually implementing this, and when I do, I'd love to share the skeleton of how I did it :) but in the meantime, hopefully this information points you in the right direction. Good luck!
     
    Last edited: Jun 26, 2018
  6. ModLunar

    Joined: Oct 16, 2016
    Posts: 374
    Hey, so I actually started working on the additive shader approach for my voxel terrain now!

    I've hit a roadblock with the additive shader, though. I created a new, very empty surface shader, added "decal:add" to the surface pragma, and made it just output black to test, since adding (0, 0, 0) everywhere shouldn't change the resulting colors on the screen. This is the gist of my additive shader:

    Code (CSharp):
    SubShader {
        Tags {
            "RenderType" = "Opaque"
            "Queue" = "Geometry-99"
        }
        LOD 200

        CGPROGRAM
        #pragma surface surf Standard decal:add fullforwardshadows noinstancing
        #pragma target 3.0

        void surf(Input IN, inout SurfaceOutputStandard o) {
            o.Albedo = 0;
        }
        ENDCG
    }
    (And I changed the render queue to be one after my base pass, which was Geometry-100).

    But I see it literally drawing black over my mesh (which is just a cube right now), so I don't know how that's additive. Unless I should be somehow disabling lighting and stuff on the additive pass?

    ...Okay, literally as I was about to grab the picture, maybe I forgot about something I did? Unity recompiled something, and now it works like an additive pass should, generally speaking... xD And now I'm sad that I don't know what I fixed or how. Anyway:

    [screenshot: the additive pass now rendering correctly]

    My mesh has one submesh (https://docs.unity3d.com/ScriptReference/Mesh-subMeshCount.html), so normally there'd be only one material. But through my C# code, I figure out how many materials are needed to draw all the textures (assuming each material can handle 4 textures), create an array of materials, and assign it to my terrain's MeshRenderer:

    [screenshot: the MeshRenderer's materials array populated from C#]

    It turns out that having 1 submesh (so 1 material it'd normally need) but assigning 2 materials makes the mesh get drawn with both materials (or however many there are in the array)!
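    A minimal sketch of that C# side (the class and field names here are hypothetical, not my actual code):

    Code (CSharp):
    using UnityEngine;

    //Hypothetical sketch: a mesh with 1 submesh that is assigned 2 materials
    //gets drawn once per material (the base shader, then the additive shader).
    public class TerrainMaterialSetup : MonoBehaviour {
        public Material baseMaterial;     //base surface shader (Geometry-100)
        public Material additiveMaterial; //decal:add surface shader (Geometry-99)

        private void Awake() {
            MeshRenderer meshRenderer = GetComponent<MeshRenderer>();
            meshRenderer.sharedMaterials = new Material[] { baseMaterial, additiveMaterial };
        }
    }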

    ---

    So all the good news aside, this is still problematic for me, because I use TEXCOORD3 for my blending weights on the textures. So I'm like -- would I really need to continuously switch out uv4 on my Mesh through C# and make it get drawn again for every set of 4 blending values?

    I'm trying to use only the 4 most important textures at each given vertex by storing their weights and texture indices, but the shader would need all of the textures available at once as properties to be able to use any given texture at any given vertex.

    So I'm thinking of settling on supporting maybe 8-16 textures maximum for now, doing it all in one surface shader by using one sampler and sampling the 8-16 textures with it. With this, I'll have just one shader -- not a base shader plus an additive one. The samplers (called "SamplerStates") vs. textures distinction is explained here in the docs, and also in a big (super useful!) thread on the forums here.

    But basically, we're limited more by how many samplers we can use (sampler2D, sampler3D, samplerCUBE, etc.) than by how many textures we use (Texture2D, Texture3D, TextureCube, etc.). When writing shaders in Unity, though, the default is that each texture comes with its own sampler (so if I'm understanding this correctly, when you write
    sampler2D _MainTex;
    in your CGPROGRAM, that creates both a sampler2D and a Texture2D).

    Luckily, there are ways to define just one sampler and then sample multiple textures with it (see the Unity docs link earlier in this paragraph). They say we're limited to 16 samplers but up to 128 textures in DirectX 11 / Shader Model 5.0. I still need to test if this will work though xD so... guess we'll find out!
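    To give an idea, here's a sketch of what that could look like with the macros from that docs page (untested on my end; the layer names are placeholders, and the separate-sampler path needs #pragma target 3.5 or higher):

    Code (CSharp):
    //One texture that brings its own sampler...
    UNITY_DECLARE_TEX2D(_Layer0);
    //...and extra textures declared without samplers of their own.
    UNITY_DECLARE_TEX2D_NOSAMPLER(_Layer1);
    UNITY_DECLARE_TEX2D_NOSAMPLER(_Layer2);

    void surf(Input IN, inout SurfaceOutputStandard o) {
        //Sample every texture through _Layer0's sampler.
        fixed4 c0 = UNITY_SAMPLE_TEX2D(_Layer0, IN.rawUV);
        fixed4 c1 = UNITY_SAMPLE_TEX2D_SAMPLER(_Layer1, _Layer0, IN.rawUV);
        fixed4 c2 = UNITY_SAMPLE_TEX2D_SAMPLER(_Layer2, _Layer0, IN.rawUV);

        //Placeholder blend -- the real weights would come from cVal.
        o.Albedo = (c0.rgb + c1.rgb + c2.rgb) / 3;
    }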
     
    Last edited: Jul 6, 2018