[ShaderGraph] Highlighting edges

Discussion in 'Shaders' started by ProjectClass41, Sep 18, 2018.

  1. ProjectClass41

    ProjectClass41

    Joined:
    Dec 12, 2016
    Posts:
    3
    I am trying to create an "edge highlight" shader for a low-poly look. I would like to control the color of the edges of the mesh triangles.

    I cannot seem to come up with a solution to this. Does anyone know how this can be accomplished?
    I would like to change the color of these using shadergraph.
    [Attached image: the mesh rendered in wireframe mode]

    Thank you in advance!
     
  2. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,230
    There are three main ways of doing wireframe rendering.

    Option 1 is to tell the GPU to render in wireframe mode. That's what's being done in the image above. There's no option to turn that on from shaders, Shader Graph or otherwise; it's something you have to enable globally from script just before rendering an object.
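
    For illustration, a minimal sketch of that approach using GL.wireframe and the built-in pipeline's camera callbacks (note this toggles wireframe for everything the camera draws, not just one object):

    Code (CSharp):
    using UnityEngine;

    // Sketch of option 1: flip the GPU into wireframe mode around a camera's rendering.
    // Attach to the camera object. OnPreRender/OnPostRender are built-in pipeline callbacks.
    [RequireComponent(typeof(Camera))]
    public class WireframeCamera : MonoBehaviour
    {
        void OnPreRender()
        {
            GL.wireframe = true;   // everything drawn by this camera renders as wireframe
        }

        void OnPostRender()
        {
            GL.wireframe = false;  // restore normal filled rendering for everything else
        }
    }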

    Option 2 is to render the lines manually. This means taking the mesh, iterating over every edge, and drawing a line using GL calls or a line renderer, or otherwise constructing a custom mesh of very thin geometry to represent those lines. This is almost all done from script; again, almost nothing important happens in the shader.
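
    A rough sketch of the GL-calls variant, assuming you supply your own unlit material for the lines. This naive version draws shared edges twice and re-reads the mesh every frame, so treat it as a starting point only:

    Code (CSharp):
    using UnityEngine;

    // Sketch of option 2: draw every triangle edge of the mesh with immediate-mode GL lines.
    [RequireComponent(typeof(MeshFilter))]
    public class WireframeLines : MonoBehaviour
    {
        public Material lineMaterial; // any unlit material; its first pass is used for the lines

        void OnRenderObject()
        {
            Mesh mesh = GetComponent<MeshFilter>().sharedMesh;
            Vector3[] verts = mesh.vertices;
            int[] tris = mesh.triangles;

            lineMaterial.SetPass(0);
            GL.PushMatrix();
            GL.MultMatrix(transform.localToWorldMatrix); // draw in this object's space
            GL.Begin(GL.LINES);
            GL.Color(Color.white);
            for (int i = 0; i < tris.Length; i += 3)
            {
                // three edges per triangle; shared edges get drawn twice in this naive version
                GL.Vertex(verts[tris[i]]);     GL.Vertex(verts[tris[i + 1]]);
                GL.Vertex(verts[tris[i + 1]]); GL.Vertex(verts[tris[i + 2]]);
                GL.Vertex(verts[tris[i + 2]]); GL.Vertex(verts[tris[i]]);
            }
            GL.End();
            GL.PopMatrix();
        }
    }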

    Option 3 is to use barycentric coordinates. Shaders don't generally have access to the barycentrics directly, and this is doubly true for shader graph. The hack is to store the barycentric coordinates in the vertex colors or in some unused UVs so they can be accessed. This requires preprocessing your mesh from script.
     
  3. ProjectClass41

    ProjectClass41

    Joined:
    Dec 12, 2016
    Posts:
    3
    I thought about using a script, but I figured it would be too expensive at runtime.

    I noticed that Shader Graph has a "Vertex Color" node. Is there nothing that can be done with that to achieve the effect? My goal is to have low-poly water where the faces that are above "sea level" get a white outline along their wireframe edges. I feel like that should be possible but I'm not sure how.

    Is there at least a way to change the color of geometry whose position has been displaced by Shader Graph above a certain height from its original position?
     
  4. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,230
    That node just gives you access to the mesh asset's vertex colors. They're called vertex colors because all mesh data is stored in the vertices: position, normal, UVs, etc. They're no more related to wireframe rendering than any other part of a mesh or shader, and they cannot be written to from the shader. Like I mentioned above, though, they can be used to store barycentric information so the shader can make use of it.

    This tutorial has an example of how to do wireframe rendering in the shader.
    https://catlikecoding.com/unity/tutorials/advanced-rendering/flat-and-wireframe-shading/
    However, it makes use of a geometry shader to pass the barycentric coordinates to the fragment shader. This isn't possible with Shader Graph as it doesn't expose geometry shaders, and they're not available on many platforms anyway. So instead you have to bake the barycentric coordinates into the mesh from C#. This only needs to be done once before you use the mesh.
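
    For reference, a minimal sketch of what that one-time baking step could look like (the class and method names here are placeholders, not from the tutorial). It splits the mesh so each triangle owns its own three vertices and writes one barycentric corner into each vertex color; a real version would also copy UVs, normals, and any other attributes you need:

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Rendering;

    // Sketch: bake barycentric coordinates into vertex colors so a Shader Graph
    // (via the Vertex Color node) can reconstruct distance-to-edge per pixel.
    public static class BarycentricBaker
    {
        public static Mesh Bake(Mesh source)
        {
            Vector3[] srcVerts = source.vertices;
            int[] srcTris = source.triangles;

            // One unique vertex per triangle corner, so shared vertices never
            // end up needing two different barycentric colors.
            Vector3[] verts = new Vector3[srcTris.Length];
            Color[] colors = new Color[srcTris.Length];
            int[] tris = new int[srcTris.Length];

            for (int i = 0; i < srcTris.Length; i += 3)
            {
                verts[i]     = srcVerts[srcTris[i]];
                verts[i + 1] = srcVerts[srcTris[i + 1]];
                verts[i + 2] = srcVerts[srcTris[i + 2]];

                colors[i]     = new Color(1, 0, 0); // corner A
                colors[i + 1] = new Color(0, 1, 0); // corner B
                colors[i + 2] = new Color(0, 0, 1); // corner C

                tris[i]     = i;
                tris[i + 1] = i + 1;
                tris[i + 2] = i + 2;
            }

            Mesh baked = new Mesh();
            baked.indexFormat = IndexFormat.UInt32; // splitting can push past 65k vertices
            baked.vertices = verts;
            baked.colors = colors;
            baked.triangles = tris;
            baked.RecalculateNormals();
            return baked;
        }
    }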


    [Image: example of barycentric coordinates displayed as the diffuse color]

    There are assets on the store that can do this for you, like this one:
    https://assetstore.unity.com/packag...ame-shader-the-amazing-wireframe-shader-18794
    The shaders included in those assets likely won't work with the new render pipelines, but they should store the data in the mesh so you can access it and implement the technique shown in the first link.
     
  5. ProjectClass41

    ProjectClass41

    Joined:
    Dec 12, 2016
    Posts:
    3
    Awesome, thanks! I'll see if I can make it work with the Lightweight Render Pipeline.

    I appreciate the research you did. Thanks again.
     
  6. mgear

    mgear

    Joined:
    Aug 3, 2010
    Posts:
    8,956
  7. chris_tmssn

    chris_tmssn

    Joined:
    Jan 5, 2016
    Posts:
    2
    I just recently needed the same effect and wanted to do it in Shader Graph too... I knew that tutorial already and combined it with another tutorial from the same author, https://catlikecoding.com/unity/tutorials/procedural-grid/, and extended the grid script with just a few lines to write the barycentric coords as vertex colors.
    Code (CSharp):
    using UnityEngine;

    [RequireComponent(typeof(MeshFilter), typeof(MeshRenderer))]
    public class Grid : MonoBehaviour {

        public int xSize, ySize;

        private Mesh mesh;
        private Vector3[] vertices;

        private void Awake () {
            Generate();
        }

        [ContextMenu("generate")]
        private void Generate () {
            GetComponent<MeshFilter>().mesh = mesh = new Mesh();
            mesh.name = "Procedural Grid";

            // The three barycentric corners, encoded as vertex colors (R, G, B).
            Color[] coords = new[]
            {
                new Color(1, 0, 0),
                new Color(0, 1, 0),
                new Color(0, 0, 1),
            };

            vertices = new Vector3[(xSize + 1) * (ySize + 1)];
            Vector2[] uv = new Vector2[vertices.Length];
            Vector4[] tangents = new Vector4[vertices.Length];
            Vector4 tangent = new Vector4(1f, 0f, 0f, -1f);
            Color32[] vertexColors = new Color32[vertices.Length];
            for (int i = 0, y = 0; y <= ySize; y++) {
                for (int x = 0; x <= xSize; x++, i++) {
                    vertices[i] = new Vector3(x * 5, y * 5);
                    // Cycling the corner by (x - y) mod 3 gives every triangle of the grid
                    // three distinct barycentric colors, even though vertices are shared.
                    vertexColors[i] = coords[(int)Mathf.Repeat(x - y, 3)];
                    uv[i] = new Vector2((float)x / xSize, (float)y / ySize);
                    tangents[i] = tangent;
                }
            }
            mesh.vertices = vertices;
            mesh.uv = uv;
            mesh.colors32 = vertexColors;
            mesh.tangents = tangents;

            int[] triangles = new int[xSize * ySize * 6];
            for (int ti = 0, vi = 0, y = 0; y < ySize; y++, vi++) {
                for (int x = 0; x < xSize; x++, ti += 6, vi++) {
                    triangles[ti] = vi;
                    triangles[ti + 3] = triangles[ti + 2] = vi + 1;
                    triangles[ti + 4] = triangles[ti + 1] = vi + xSize + 1;
                    triangles[ti + 5] = vi + xSize + 2;
                }
            }
            mesh.triangles = triangles;
            mesh.RecalculateNormals();
        }
    }
    Then you can simply use the Vertex Color node in Shader Graph and do the same operations as in the wireframe tutorial.
     
