
Question Can I achieve this edge highlight effect?

Discussion in 'Shader Graph' started by jkorhonen222, Feb 18, 2020.

  1. jkorhonen222

    jkorhonen222

    Joined:
    Jan 21, 2020
    Posts:
    4
So I have a low poly game and I'd like to color the edges of a mesh like so (white lines on the edges): low-poly-rock-AEkmL47-600.jpg I could just paint those manually, but this would save me a lot of time.
     
    markantonybowley likes this.
  2. unit_dev123

    unit_dev123

    Joined:
    Feb 10, 2020
    Posts:
    989
  3. jkorhonen222

    jkorhonen222

    Joined:
    Jan 21, 2020
    Posts:
    4
Yeah, I might have to, even though I don't know anything about that.
     
  4. andybak

    andybak

    Joined:
    Jan 14, 2017
    Posts:
    569
    You can do this but you'll need to preprocess your mesh to add extra info into a spare UV channel about which edges should be visible.

    You can then use this in shader graph to get "distance from visible edge".

    Can you figure out the first part? If so I can help with the second.
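As a rough, hypothetical sketch of that preprocessing step (plain Python rather than the Unity API, just to show the idea): bake a 0/1 flag per vertex for edges you've already classified as visible, then export those flags into the spare UV channel.

```python
def bake_edge_flags(vertex_count, visible_edges):
    """Return one float per vertex: 1.0 if the vertex lies on a
    visible edge, else 0.0.  visible_edges is a set of (i, j)
    vertex-index pairs already classified as 'visible'."""
    flags = [0.0] * vertex_count
    for i, j in visible_edges:
        flags[i] = 1.0
        flags[j] = 1.0
    return flags

# Example: a quad split into two triangles, where only the
# diagonal edge (1, 2) should be highlighted.
flags = bake_edge_flags(4, {(1, 2)})
print(flags)  # [0.0, 1.0, 1.0, 0.0]
```

In Unity you'd then write each flag into a spare UV channel (e.g. the x component of mesh.uv2) so the shader can read it.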
     
  5. Erfan_SheikhHoseini

    Erfan_SheikhHoseini

    Joined:
    Feb 7, 2019
    Posts:
    32
I think the easiest way is to just use textures.
If you use a program like Substance, it won't take any time at all because it does it automatically.
     
  6. jkorhonen222

    jkorhonen222

    Joined:
    Jan 21, 2020
    Posts:
    4
I'm such a noob that I don't yet understand UV channels...
     
  7. jkorhonen222

    jkorhonen222

    Joined:
    Jan 21, 2020
    Posts:
    4
I could try that. I have used Blender to paint the textures.
     
  8. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,352
You will have to do it with a texture baked outside of Unity. Shader Graph is not capable of this effect without pre-baking that data into a texture beforehand. This isn't a limitation of Shader Graph, but rather a limitation of GPU shaders in general.
     
  9. mngmra

    mngmra

    Joined:
    Feb 15, 2018
    Posts:
    1
  10. PJRM

    PJRM

    Joined:
    Mar 4, 2013
    Posts:
    303
I'd love to have some help with this kind of result using Shader Graph only.
I bet there are people with the knowledge to share how, but they don't want to. Sad
     
  11. AcidArrow

    AcidArrow

    Joined:
    May 20, 2010
    Posts:
    11,794
     
  12. Mauri

    Mauri

    Joined:
    Dec 9, 2010
    Posts:
    2,665
    See @bgolus reply above. He has a high level of expertise and experience when it comes to Shaders in general.
     
  13. PJRM

    PJRM

    Joined:
    Mar 4, 2013
    Posts:
    303
Damn, I hate you guys. :p

I wish Uber had used a shader for this... That's my inspiration.
Now I don't know if I should open another topic to ask whether a texture can use a border (just like 2D/UI sprites do to avoid stretching the image) when the UV uses a repeated tile on a wide face, just so I don't have to bake the entire model. (My laziness :p)
     
  14. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,352
So, that’s a Planetary Annihilation commander. I wrote or worked on most of the shaders for that game and was one of the people who created the asset pipeline for those textures. For commanders, it’s 100% hand-UV’d and painted (with some help from tools like Substance). For the original units, one of my coworkers came up with a scheme that used a swatch of premade textures with edge details painted on them; you picked the one you wanted and then UV’d the mesh to it (or left it using the default UVs, which was often fine). The export tool then automatically generated a uniquely UV’d atlas texture, similar to what’s used for light maps, and an extra pass to generate edges (for the wireframe shader) was generated from the UVs themselves in a Photoshop script.

    I’d always wanted to redo it so it generated more in-engine, which I had a plan laid out to do, and would have made user generated content a lot easier to get in the game, but it never happened. It also had the benefit of not locking user generated content into the same exact look as existing stuff, but I don’t think anyone really made use of that.
     
    Farage, florianBrn and Raptosauru5 like this.
  15. koirat

    koirat

    Joined:
    Jul 7, 2012
    Posts:
    2,074
You can do it semi-automatically in engine.
You could generate the texture manually or at runtime by analyzing edge angles and painting onto the texture at the places specified by the UVs.

Doable, but not very easy, since you would have to take UV stretch relative to the vertex layout into account.
     
  16. andybak

    andybak

    Joined:
    Jan 14, 2017
    Posts:
    569
You could also do it by checking the angle between faces at each edge: if the faces are nearly co-planar, don't draw an edge; otherwise, do.

    I've done something similar in my code except I already know which faces are meant to be co-planar (I'm using a half-edge mesh so my meshes have faces that aren't just triangles).

    When I generate the Unity mesh, I store a 1 in each vertex that's on a visible edge and a 0 on vertices on invisible edges.

    In the shader you can use the interpolated value to get a gradient - or smoothstep it to get a wireframe.
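A minimal sketch of that edge classification, in plain Python just for illustration (the 30-degree threshold is an arbitrary choice; pick whatever suits your art style):

```python
import math

def face_normal(a, b, c):
    """Unit normal of triangle (a, b, c), each a 3-tuple of floats."""
    u = [b[i] - a[i] for i in range(3)]
    v = [c[i] - a[i] for i in range(3)]
    n = [u[1] * v[2] - u[2] * v[1],
         u[2] * v[0] - u[0] * v[2],
         u[0] * v[1] - u[1] * v[0]]
    length = math.sqrt(sum(x * x for x in n))
    return [x / length for x in n]

def edge_is_visible(n1, n2, threshold_deg=30.0):
    """An edge shared by faces with normals n1, n2 is 'visible'
    when the faces are far from co-planar."""
    cos_angle = max(-1.0, min(1.0, sum(a * b for a, b in zip(n1, n2))))
    return math.degrees(math.acos(cos_angle)) > threshold_deg

# Two faces meeting at 90 degrees -> visible edge;
# two co-planar faces -> invisible edge.
print(edge_is_visible([0, 1, 0], [1, 0, 0]))  # True
print(edge_is_visible([0, 1, 0], [0, 1, 0]))  # False
```

On the shader side, something like smoothstep(1.0 - lineWidth, 1.0, flag) on the interpolated per-vertex value turns the soft gradient into a crisp line.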
     
  17. MattRix

    MattRix

    Joined:
    Aug 23, 2011
    Posts:
    121
    Lex4art likes this.
  18. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,352
    Doing an edge detection post process on the screen space normals is totally doable. That used to be an effect that shipped with Unity's Standard Assets package back in the Unity 5.0 days.
    https://docs.unity3d.com/550/Documentation/Manual/script-EdgeDetectEffectNormals.html
That was basically the same technique, but using Sobel or Roberts Cross instead of Prewitt, and applying it as a multiply instead of as an overlay.

    However Unity has no existing Object ID map to use, nor do rendered objects have an inherent unique ID that you could reuse. So you'd have to handle that yourself somehow.
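For reference, the Roberts Cross kernel mentioned above is only a few lines. This is a plain-Python illustration over one channel of a hypothetical screen-space buffer, not the actual Standard Assets code:

```python
def roberts_cross(img):
    """Roberts Cross edge magnitude for a 2D list of floats
    (e.g. one channel of a screen-space normals buffer)."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h - 1):
        for x in range(w - 1):
            # Diagonal differences against the pixel down-right
            # and the pair across the other diagonal.
            gx = img[y][x] - img[y + 1][x + 1]
            gy = img[y][x + 1] - img[y + 1][x]
            out[y][x] = (gx * gx + gy * gy) ** 0.5
    return out

# A hard vertical boundary between two flat regions produces a
# response only along the boundary column.
img = [[0.0, 0.0, 1.0, 1.0] for _ in range(4)]
edges = roberts_cross(img)
print(edges[0])  # [0.0, 1.414..., 0.0, 0.0]
```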
     
  19. yzRambler

    yzRambler

    Joined:
    Jan 24, 2019
    Posts:
    37
Another method for solving this problem:

1. Render the target object into a texture (call it the PNMap) in real time, saving only the normal for each pixel. Note that the PNMap is not quite the same as a normal map; even if the mesh already has a normal map, this step is still needed, because the normals stored in the PNMap are the flat per-triangle normals. This step only needs to be done once.

2. Render the target object again. In the fragment shader, sample a few pixels around the current pixel (four or eight, say) from the PNMap. If the normals of the sampled neighbours differ noticeably, the current pixel must lie on an edge, so shade it with the color you want.

Furthermore, you can compute a falloff value from the distance between the current pixel and the edge (if only a few of the sampled pixels have a different normal, the distance is larger).
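To make step 2 concrete, here is a rough CPU-side illustration in Python (a real implementation would do this per-fragment in the shader; the 0.9 dot-product threshold is an arbitrary pick):

```python
def edge_weight(pnmap, threshold=0.9):
    """pnmap: 2D grid of unit face normals (3-tuples).  A pixel is
    near an edge when a 4-neighbour's normal differs noticeably
    (dot product below threshold).  Returns the fraction of
    differing neighbours per pixel, usable as an edge weight."""
    h, w = len(pnmap), len(pnmap[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            hits, total = 0, 0
            for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                if 0 <= ny < h and 0 <= nx < w:
                    total += 1
                    dot = sum(a * b for a, b in zip(pnmap[y][x], pnmap[ny][nx]))
                    if dot < threshold:
                        hits += 1
            out[y][x] = hits / total
    return out

up, right = (0.0, 1.0, 0.0), (1.0, 0.0, 0.0)
pnmap = [[up, up, right, right] for _ in range(2)]
weights = edge_weight(pnmap)
print(weights[0])  # [0.0, 0.333..., 0.333..., 0.0]
```

The weight peaks where the flat normals change, i.e. along the hard edge.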
     
    Last edited: Mar 3, 2021
  20. MalyaWka

    MalyaWka

    Joined:
    Jan 2, 2015
    Posts:
    30
  21. MattRix

    MattRix

    Joined:
    Aug 23, 2011
    Posts:
    121
  22. fuzzy3d

    fuzzy3d

    Joined:
    Jun 17, 2009
    Posts:
    228
  23. MalyaWka

    MalyaWka

    Joined:
    Jan 2, 2015
    Posts:
    30
    Rendering a UV layout for all models in a large scene will be very heavy. A large number of textures will take up a lot of memory.

    It's like with baked shadow maps - sometimes it's better to use Real-time Shadow Volume to save memory.
     
  24. fuzzy3d

    fuzzy3d

    Joined:
    Jun 17, 2009
    Posts:
    228
    That's not what I really meant...
     
  25. triangle4studios

    triangle4studios

    Joined:
    Jun 28, 2020
    Posts:
    33
I am sorry, but this is completely false. GPU shaders do not prevent you from achieving this result. To the asker, I will offer a piece of advice:

This is considered a "holy grail" in game development, so few people will share how to achieve it. Old tutorials did, but few of them work these days, and most people will intentionally lead you astray on this one.
So forget all that. Look at the facts.
Blender achieved it without using textures.
You can too. Period.
Blender is open source. My recommendation is to rifle through Blender's source files and learn how it is done. Remember, though: right now they are an industry leader in many things and are giving Autodesk a long and vigorous run for their money. So it may be a journey of discovery if you are a new coder.
     
  26. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,352
Yes, Blender can do it, or at least approximate it. Eevee can do it by rendering the objects into a full screen depth texture and/or a full screen normals texture, and then doing an inverse SSAO pass with that. SSAO usually requires 3+ more passes (an initial gather pass, blurring passes, maybe temporal reprojection), depending on how it's done, to get something that's not needlessly expensive or noisy. All this is done to produce a screen-space edge texture that is then sampled by the object's actual shader.

None of those are things you can do by just adding code to Shader Graph, because Unity doesn't have built-in support for it. So yes, you still have to generate a texture somewhere else. It could be offline (which is the least expensive option for real-time rendering), or by writing a bunch of additional code to generate something like the screen-space edging effect the OP was going for, but likely at a much lower quality than they'd be comfortable with.

Cycles doesn't count either, as it's not a real-time renderer using shaders and meshes in the traditional way being discussed here; it's a CPU / GPU compute renderer.
     
  27. Captain_Flaush

    Captain_Flaush

    Joined:
    Apr 20, 2017
    Posts:
    65
    Hello,

I would like to create such an effect. Could you please give me some insight into how you compute the information you store in the second UV channel?

    Thanks!