
Help Wanted Can I achieve this edge highlight effect?

Discussion in 'Shader Graph' started by jkorhonen222, Feb 18, 2020.

  1. jkorhonen222

    jkorhonen222

    Joined:
    Jan 21, 2020
    Posts:
    4
    So I have a low poly game and I'd like to color the edges of a mesh like so (white lines on the edges): low-poly-rock-AEkmL47-600.jpg. I could just paint those manually, but this would save me a lot of time.
     
  2. unit_dev123

    unit_dev123

    Joined:
    Feb 10, 2020
    Posts:
    990
  3. jkorhonen222

    jkorhonen222

    Joined:
    Jan 21, 2020
    Posts:
    4
    Yeah, I might have to, even though I don't know anything about that.
     
  4. andybak

    andybak

    Joined:
    Jan 14, 2017
    Posts:
    463
    You can do this but you'll need to preprocess your mesh to add extra info into a spare UV channel about which edges should be visible.

    You can then use this in shader graph to get "distance from visible edge".

    Can you figure out the first part? If so I can help with the second.
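
    A rough sketch of what that preprocessing could write out, assuming you already have some way of deciding which vertices sit on visible edges (the EdgeFlagBaker name, the edgeVertexIndices input and the choice of UV channel 3 are all placeholders, not anything prescribed in this thread):

    Code (CSharp):
    using System.Collections.Generic;
    using UnityEngine;

    public static class EdgeFlagBaker
    {
        // Writes a 0/1 "on a visible edge" flag per vertex into UV channel 3,
        // which Shader Graph can read with a UV node set to UV3.
        // Channels 1 and 2 are left alone because lightmapping may use them.
        public static void BakeEdgeFlags(Mesh mesh, HashSet<int> edgeVertexIndices)
        {
            var uvs = new List<Vector2>(mesh.vertexCount);
            for (int i = 0; i < mesh.vertexCount; i++)
            {
                float flag = edgeVertexIndices.Contains(i) ? 1f : 0f;
                uvs.Add(new Vector2(flag, 0f)); // y is unused
            }
            mesh.SetUVs(3, uvs);
        }
    }

    The interpolated value in that channel is then what the graph can treat as a rough "distance from visible edge" per fragment.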
     
  5. Erfan_SheikhHoseini

    Erfan_SheikhHoseini

    Joined:
    Feb 7, 2019
    Posts:
    22
    I think the easiest way is to just use textures.
    If you use programs like Substance, it won't take time at all because it does it automatically.
     
  6. jkorhonen222

    jkorhonen222

    Joined:
    Jan 21, 2020
    Posts:
    4
    I'm such a noob that I don't yet understand the UV channels...
     
  7. jkorhonen222

    jkorhonen222

    Joined:
    Jan 21, 2020
    Posts:
    4
    I could try that. I have used Blender to paint the textures.
     
  8. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    9,842
    You will have to do it with a texture baked outside of Unity. Shader Graph is not capable of this effect without pre-baking that data into a texture beforehand. This isn't a limitation of Shader Graph, but rather a limitation of GPU shaders in general.
     
  9. mngmra

    mngmra

    Joined:
    Feb 15, 2018
    Posts:
    1
  10. PJRM

    PJRM

    Joined:
    Mar 4, 2013
    Posts:
    300
    I'd love to have some help with getting this kind of result using Shader Graph only.
    I bet there are people with the knowledge to share how, but they don't want to. Sad
     
  11. AcidArrow

    AcidArrow

    Joined:
    May 20, 2010
    Posts:
    7,454
     
  12. Mauri

    Mauri

    Joined:
    Dec 9, 2010
    Posts:
    2,158
    See @bgolus' reply above. He has a high level of expertise and experience when it comes to shaders in general.
     
  13. PJRM

    PJRM

    Joined:
    Mar 4, 2013
    Posts:
    300
    Damn, I hate you guys. :p

    I wish Uber had used a shader for this... That's my inspiration.
    Now I don't know if I should start another topic to ask whether a texture can be given a border (just like 2D/UI sprites have, so the image doesn't stretch) when the UV reuses a tile across a wide face, just so I don't have to bake the entire model. (My laziness :p)
     
  14. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    9,842
    So, that’s a Planetary Annihilation commander. I wrote or worked on most of the shaders for that game and was one of the people who created the asset pipeline for creating those textures. For commanders, it’s 100% hand UV’d and painted (with some help from stuff like Substance). For the original units, one of my coworkers came up with a scheme that used a swatch of premade textures with edge details painted on them; you picked the one you wanted and then UV’d the mesh to that (or left it using the default UVs, which was often fine). The export tool then automatically generated a uniquely UV’d atlas texture, similar to what is used for light maps, and the edges (for the wireframe shader) were generated in an extra pass from the UVs themselves by a Photoshop script.

    I’d always wanted to redo it so it generated more in-engine, which I had a plan laid out to do, and would have made user generated content a lot easier to get in the game, but it never happened. It also had the benefit of not locking user generated content into the same exact look as existing stuff, but I don’t think anyone really made use of that.
     
    florianBrn and Raptosauru5 like this.
  15. koirat

    koirat

    Joined:
    Jul 7, 2012
    Posts:
    748
    You can do it semi-automatically in engine.
    You could generate the texture manually or at runtime by analyzing edge angles and painting onto a texture in the places specified by the UVs.

    Doable, but not very easy, since you would have to take into account UV stretch relative to the vertex layout.
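
    A very rough sketch of just the painting part, assuming the hard edges are already available as pairs of UV coordinates (the hardEdgeUVs input is hypothetical) and ignoring seams and stretch entirely:

    Code (CSharp):
    using System.Collections.Generic;
    using UnityEngine;

    public static class EdgeMaskPainter
    {
        // Rasterises each hard edge as a thin white line into a mask texture by
        // stepping along the segment between its two UV endpoints.
        public static Texture2D Paint(List<(Vector2 uvA, Vector2 uvB)> hardEdgeUVs, int size)
        {
            var tex = new Texture2D(size, size);
            tex.SetPixels32(new Color32[size * size]); // clear to transparent black

            foreach (var (uvA, uvB) in hardEdgeUVs)
            {
                int steps = Mathf.Max(1, Mathf.CeilToInt(Vector2.Distance(uvA, uvB) * size));
                for (int s = 0; s <= steps; s++)
                {
                    Vector2 uv = Vector2.Lerp(uvA, uvB, (float)s / steps);
                    tex.SetPixel(Mathf.RoundToInt(uv.x * (size - 1)),
                                 Mathf.RoundToInt(uv.y * (size - 1)), Color.white);
                }
            }
            tex.Apply();
            return tex;
        }
    }

    Reused tiles, seams and uneven texel density (the caveats above) all break this naive rasterisation, which is why the line width would end up varying across the model.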
     
  16. andybak

    andybak

    Joined:
    Jan 14, 2017
    Posts:
    463
    You could also do it by checking the angle between faces at each edge. If the faces are nearly co-planar - don't draw an edge. Otherwise do.

    I've done something similar in my code except I already know which faces are meant to be co-planar (I'm using a half-edge mesh so my meshes have faces that aren't just triangles).

    When I generate the Unity mesh, I store a 1 in each vertex that's on a visible edge and a 0 on vertices on invisible edges.

    In the shader you can use the interpolated value to get a gradient - or smoothstep it to get a wireframe.
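
    For meshes where the co-planar faces aren't known up front, a sketch of that angle test (vertex-index based, so vertices duplicated along UV or normal seams aren't welded here, and the 30° threshold is arbitrary) that could produce the set of visible-edge vertices used in the earlier sketch:

    Code (CSharp):
    using System.Collections.Generic;
    using UnityEngine;

    public static class HardEdgeFinder
    {
        // Returns the vertex indices that sit on "hard" edges, i.e. edges whose
        // two adjacent triangles meet at more than angleThreshold degrees.
        public static HashSet<int> Find(Mesh mesh, float angleThreshold = 30f)
        {
            var verts = mesh.vertices;
            var tris = mesh.triangles;
            var edgeNormals = new Dictionary<(int, int), List<Vector3>>();

            // Collect the face normal of every triangle touching each edge.
            for (int t = 0; t < tris.Length; t += 3)
            {
                int a = tris[t], b = tris[t + 1], c = tris[t + 2];
                Vector3 n = Vector3.Cross(verts[b] - verts[a], verts[c] - verts[a]).normalized;
                AddEdge(edgeNormals, a, b, n);
                AddEdge(edgeNormals, b, c, n);
                AddEdge(edgeNormals, c, a, n);
            }

            var result = new HashSet<int>();
            foreach (var kv in edgeNormals)
            {
                // Edge shared by exactly two faces that are clearly not co-planar.
                if (kv.Value.Count == 2 &&
                    Vector3.Angle(kv.Value[0], kv.Value[1]) > angleThreshold)
                {
                    result.Add(kv.Key.Item1);
                    result.Add(kv.Key.Item2);
                }
            }
            return result;
        }

        static void AddEdge(Dictionary<(int, int), List<Vector3>> map, int i, int j, Vector3 n)
        {
            var key = i < j ? (i, j) : (j, i);
            if (!map.TryGetValue(key, out var list)) map[key] = list = new List<Vector3>();
            list.Add(n);
        }
    }

    The interpolated 0..1 value can then go through a Smoothstep node in the graph to control how wide and how soft the line looks.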
     
  17. MattRix

    MattRix

    Joined:
    Aug 23, 2011
    Posts:
    111
  18. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    9,842
    Doing an edge detection post process on the screen space normals is totally doable. That used to be an effect that shipped with Unity's Standard Assets package back in the Unity 5.0 days.
    https://docs.unity3d.com/550/Documentation/Manual/script-EdgeDetectEffectNormals.html
    That was basically the same technique, but using Sobel or Roberts Cross instead of Prewitt and applying it as a multiply instead of as an overlay.

    However, Unity has no existing Object ID map to use, nor do rendered objects have an inherent unique ID that you could reuse, so you'd have to handle that yourself somehow.
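
    The per-pixel comparison itself is tiny; as a CPU-side illustration only (the real effect runs as a full-screen shader over a camera normals texture), a Roberts Cross style edge metric over a buffer of normals could look like this:

    Code (CSharp):
    using UnityEngine;

    public static class NormalEdgeDetect
    {
        // Roberts Cross on a normals buffer: compare each pixel's normal with its
        // diagonal neighbours and return an edge strength in 0..1.
        public static float EdgeStrength(Vector3[,] normals, int x, int y, float sensitivity = 1f)
        {
            int w = normals.GetLength(0), h = normals.GetLength(1);
            if (x + 1 >= w || y + 1 >= h) return 0f;

            // The two diagonal differences of the classic Roberts Cross kernel.
            float d1 = (normals[x, y] - normals[x + 1, y + 1]).magnitude;
            float d2 = (normals[x + 1, y] - normals[x, y + 1]).magnitude;
            return Mathf.Clamp01(Mathf.Sqrt(d1 * d1 + d2 * d2) * sensitivity);
        }
    }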
     
  19. yzRambler

    yzRambler

    Joined:
    Jan 24, 2019
    Posts:
    36
    Another method for solving this problem is:

    1. Render the target object into a texture (call it PNMap) in real time, saving only the normal for each pixel. PNMap is not quite the same as a normal map: even if the object already has a normal map, this step still needs to be done, because the normals written into PNMap are the flat normals of the triangle planes. This step only needs to be done once.

    2. Render the target object again. In the fragment shader, sample some pixels around the current pixel, such as four or eight, from PNMap. If the normals of the sampled surrounding pixels are obviously different, the current pixel must be on an edge, so shade it with the color you want.

    Furthermore, you can calculate a weight from the distance between the current pixel and the edge (the fewer sampled pixels with a different normal, the larger the distance).
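
    As a sketch of that weighting in C# terms (the actual sampling would happen in the fragment shader against the PNMap; the names and the 0.1 threshold are placeholders): count how many of the sampled neighbours have a clearly different normal and map that count to an edge intensity.

    Code (CSharp):
    using UnityEngine;

    public static class EdgeWeight
    {
        // More neighbours with a different normal means the pixel is closer to the
        // edge, so the returned intensity is the fraction of differing neighbours.
        public static float FromNeighbours(Vector3 centre, Vector3[] neighbours, float threshold = 0.1f)
        {
            if (neighbours.Length == 0) return 0f;
            int differing = 0;
            foreach (var n in neighbours)
            {
                if (Vector3.Dot(centre.normalized, n.normalized) < 1f - threshold)
                    differing++;
            }
            return Mathf.Clamp01((float)differing / neighbours.Length);
        }
    }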
     
    Last edited: Mar 3, 2021 at 3:15 PM