
Resolved Is there a way to calculate vertex tension for stretched mesh areas to set alpha?

Discussion in 'Shader Graph' started by fataldevelop, Sep 25, 2020.

  1. fataldevelop

    fataldevelop

    Joined:
    Jul 5, 2014
    Posts:
    13
    Hi everyone,

    I want to stretch a mesh and apply alpha to the stretched area. Is that possible? I've found a video as an example:


    I would be glad to get some advice.
     
    Last edited: Sep 25, 2020
  2. florianBrn

    florianBrn

    Joined:
    Jul 31, 2019
    Posts:
    25
    If you use Vertex Position to stretch the mesh, I guess you could use what you plug there to lerp between alpha and color.
     
  3. polemical

    polemical

    Joined:
    Jun 17, 2019
    Posts:
    747
    I imagine if it's being stretched via a shader, you could do this by using the distance of the actual vertex position from its position after stretching: at the maximum possible stretch distance, the alpha is 0 (or whatever minimum), and right where the vertex normally sits it's 1 (or whatever maximum).
     
  4. fataldevelop

    fataldevelop

    Joined:
    Jul 5, 2014
    Posts:
    13
    No, I just have a blendshape animation, so I think I have to calculate each polygon's area (the distances between the polygon's vertices) and, if the area is more than some threshold, apply alpha. Do you think there's a way to do this using Shader Graph?
     
  5. polemical

    polemical

    Joined:
    Jun 17, 2019
    Posts:
    747
    Yeah, I've done something similar before in SG - might take a run at an example for you this morning, but it's been a long night for me. If not, I'll try tonight. Either way I'll follow up with something functional, or a reason why it wasn't possible. Obviously it'd be far better to let the GPU handle this than to take on the overhead of any other approach.
     
  6. polemical

    polemical

    Joined:
    Jun 17, 2019
    Posts:
    747
    So far so good :D
    BubbleGumAlpha.jpg
    Edit: The above is the default Sphere mesh
     
    fataldevelop and florianhanke like this.
  7. polemical

    polemical

    Joined:
    Jun 17, 2019
    Posts:
    747
    BubbleGumAlpha_ShaderGraph.jpg
    Note that in the previous post's image I had the unstretched/stretched colors switched, so if you're copying the properties to try it, set the alpha channel of _ColorStretched to around 0 and _ColorUnstretched to almost 1.

    Anyway, it looks like it can be done - the above is about as simple a configurable bulge+alpha as I could manage. Bulge direction here is calculated as the direction from the origin through the given vertex; you could easily change the origin of the bulging by subtracting an offset from the Position node output before sending it into Normalize. You could animate any of the values using Multiply/Add/etc. and a Time node, or by calling SetFloat/SetVector/etc. from a script. And of course you'd get much nicer/smoother bulging with more vertices in the geometry.
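    For anyone following along without opening the graph, the bulge math described above can be sketched in C#. This is just an illustration of the node chain (Subtract -> Normalize -> scale -> Add); the names bulgeOrigin and bulgeAmount are illustrative, not actual graph properties:

```csharp
using UnityEngine;

public static class BulgeSketch
{
    // Mirrors the graph's vertex displacement on the CPU for clarity only.
    // bulgeOrigin and bulgeAmount are illustrative names (assumptions).
    public static Vector3 Bulge(Vector3 vertexPos, Vector3 bulgeOrigin, float bulgeAmount)
    {
        // Direction from the bulge origin through the vertex;
        // the graph's default origin is (0,0,0).
        Vector3 dir = (vertexPos - bulgeOrigin).normalized;

        // Push the vertex outward along that direction.
        return vertexPos + dir * bulgeAmount;
    }
}
```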

    Edit: Attached package of the ShaderGraph, Material and Prefab.
     

    Attached Files:

    Last edited: Sep 26, 2020
    fataldevelop likes this.
  8. fataldevelop

    fataldevelop

    Joined:
    Jul 5, 2014
    Posts:
    13
    Wow, this is really cool, thank you for your time. I took a look at your approach, but unfortunately in my case I have an already-stretched mesh (I don't stretch it with a shader), so the only way I can think of is to calculate the triangles' areas and, when an area is big enough, apply alpha.

    Via script it would look like
    Code (CSharp):
    void ApplyAlphaToStretchedTriangles(Mesh mesh)
    {
        var triangles = mesh.triangles;
        var vertices = mesh.vertices;

        for (int i = 0; i < triangles.Length; i += 3)
        {
            Vector3 corner = vertices[triangles[i]];
            Vector3 a = vertices[triangles[i + 1]] - corner;
            Vector3 b = vertices[triangles[i + 2]] - corner;

            // Triangle area is half the magnitude of the cross product
            float area = Vector3.Cross(a, b).magnitude * 0.5f;

            //if (area > threshold)
            //{
            //    apply alpha to vertex colors...
            //}
        }
    }
     
  9. polemical

    polemical

    Joined:
    Jun 17, 2019
    Posts:
    747
    No problem, it was interesting to try. Using vertex color alpha would work - the downside is that real-time vertex animation on the CPU is expensive, especially for something like this. First, you'll have a lot of vertices to process for a smooth "bubbling" effect (depending on the quality you're looking for). Second, you'll have to consider neighboring triangles: setting three vertices' alpha means adjacent triangles sharing those vertices are affected as well, and if they don't share vertices, it won't blend smoothly without averaging with the adjacent triangles' results. So you could end up processing 3x the number of vertices you need to if you're not careful, and only seeing a third of the changes because triangles overwrite each other's results (think of a pyramid with 3 sides and a bottom - only 4 vertices, but if you process each triangle, that's 12 vertex visits and you've processed each vertex three times).

    You might be better off calculating per vertex instead, maybe using the sum of distances to connected vertices divided by their count to get the value to compare with the threshold. To optimize, you could generate and cache a lookup table (a list of .triangles indices used by each vertex) so you're not re-determining which triangles share which vertices every frame. Unless your triangles all have pretty much the same area, an area-based approach won't give a uniform effect, whether per face or per vertex - smaller triangles should bulge as well but wouldn't. Also, there are a number of ways to set mesh data ranging from slow to fast, and the fast ways are considerably more involved than just setting .colors = result.
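    The cached lookup idea above could be sketched like this: a per-vertex list of connected-vertex indices, built once from mesh.triangles so the per-frame loop never re-scans the triangle list. This is a hedged sketch with illustrative names, not polemical's actual code:

```csharp
using System.Collections.Generic;
using UnityEngine;

public static class NeighborCache
{
    // Build once and cache: for each vertex index, the indices of vertices
    // it shares a triangle edge with.
    public static List<int>[] BuildNeighbors(Mesh mesh)
    {
        int[] tris = mesh.triangles;
        var neighbors = new List<int>[mesh.vertexCount];
        for (int v = 0; v < neighbors.Length; v++) neighbors[v] = new List<int>();

        for (int i = 0; i < tris.Length; i += 3)
        {
            int a = tris[i], b = tris[i + 1], c = tris[i + 2];
            AddPair(neighbors, a, b);
            AddPair(neighbors, b, c);
            AddPair(neighbors, c, a);
        }
        return neighbors;
    }

    static void AddPair(List<int>[] n, int a, int b)
    {
        // Record each edge once per direction, skipping duplicates.
        if (!n[a].Contains(b)) n[a].Add(b);
        if (!n[b].Contains(a)) n[b].Add(a);
    }
}
```

    Per frame you'd then average each vertex's distance to its cached neighbors and compare that against the threshold, without touching .triangles again.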

    I'm pretty sure any real-time solution will need its calculations to be relative to the unstretched vertices to be generic. If you have an "already stretched" mesh (i.e. it was just modeled/designed with bulges), there's no actual data available to define what "stretched" means, as it's purely artistic. And since the vertices are already how you want them and you're not bulging procedurally, you might be better off just painting vertex alpha to make the bulges more transparent?
     
    fataldevelop likes this.
  10. fataldevelop

    fataldevelop

    Joined:
    Jul 5, 2014
    Posts:
    13
    omg, you are right - my method would just overwrite the vertex colors, and I'd have to add additional vertices to avoid a gradient on unstretched triangles. This is getting more complicated than I thought.

    Actually, I do have the unstretched mesh. I have a blendshape animation where I'm blending the unstretched mesh to the stretched mesh, so technically I could compare the stretched triangles' areas, or the distances between vertices, to the unstretched ones to calculate the difference. You are right, this would be more accurate.

    Maybe manually painting vertex color on the bulges could be a good solution, but as far as I can see there's no simple way to transfer vertex color animation into Unity, because 3D software like Blender or Cinema 4D doesn't create blendshape animation with vertex color animation; it just animates vertex positions. And I assume that blendshapes in Unity don't support vertex color animation either. So for this method I'd have to write a custom vertex color animation script, which would surely not be very fast on the CPU (when I just set the .colors property).

    So I think I'll try to write a script that calculates alpha on stretched vertices (so I don't have to paint all the 3D models manually) and saves mesh.colors, so the alpha isn't calculated at runtime. Then at runtime I'd just animate the vertex colors.
     
  11. polemical

    polemical

    Joined:
    Jun 17, 2019
    Posts:
    747
    If you have the reference model, you can get bulge alpha per vertex from the distance between corresponding vertices. Then specify a "max tension" as the furthest distance any vertex would be from the reference, and lerp the alpha by that distance divided by max tension, between the unstretched and stretched alpha values. Something like..
    Code (CSharp):
    float alphaUnstretched = 0.9f;
    float alphaStretched = 0.1f;
    float maxTension = 0.25f; // maximum bulge distance
    Color color = new Color(0.94f, 0.53f, 0.76f, alphaUnstretched);
    Vector3[] vertices = mesh.vertices; // the mesh with bulge(s)
    Vector3[] refVertices = refMesh.vertices; // the reference mesh (unstretched)
    Color[] colors = new Color[vertices.Length]; // or = mesh.colors; (if you know it has them)
    for (int v = 0; v < vertices.Length; v++)
    {
        colors[v] = color; // only necessary if not using pre-existing colors (i.e. a new Color array)
        colors[v].a = Mathf.Lerp(alphaUnstretched, alphaStretched, Mathf.Clamp01(Vector3.Distance(vertices[v], refVertices[v]) / maxTension));
    }
    mesh.colors = colors;
    Edit: Added "color" variable + setting the color first if it would be null
     
    Last edited: Sep 27, 2020
  12. KokkuHub

    KokkuHub

    Joined:
    Feb 15, 2018
    Posts:
    666
    Bake the original vertex positions into a UV channel (those support X, Y, and Z too), then it is just a matter of measuring the distance between the deformed vertex and its original position in the vertex shader.
     
    fataldevelop and polemical like this.
  13. polemical

    polemical

    Joined:
    Jun 17, 2019
    Posts:
    747
    Indeed, that was my original suggestion - and with the unstretched meshes available, if you can get the unstretched vertices XYZ into the 2nd UV channel, this would work:
    BubbleGumAlpha_Using_UV2s.jpg

    However, it appears you can only assign arrays of Vector2s to a mesh's UV channels procedurally in Unity. I just confirmed the following workaround works, though: for each vertex in the stretched mesh, set a Vector2(X, Y) to uv2[v] and a Vector2(Z, 0) to uv3[v], where X, Y and Z are the corresponding unstretched vertex position.
    BubbleGumAlpha_Using_UVs_Vector2s_Only.jpg
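    The two-channel workaround can be scripted roughly like this (assuming the stretched and unstretched meshes have matching vertex ordering; names are illustrative). Worth noting that Mesh.SetUVs also accepts a List<Vector3>, which can store XYZ in a single channel if your setup reads it:

```csharp
using UnityEngine;

public static class ReferenceBaker
{
    // Bake the unstretched vertex positions into the stretched mesh's UV
    // channels: XY into uv2 and Z into uv3, matching the graph above.
    public static void BakeReferencePositions(Mesh stretched, Mesh unstretched)
    {
        Vector3[] refVerts = unstretched.vertices; // must match the stretched vertex order

        var uv2 = new Vector2[refVerts.Length];
        var uv3 = new Vector2[refVerts.Length];
        for (int v = 0; v < refVerts.Length; v++)
        {
            uv2[v] = new Vector2(refVerts[v].x, refVerts[v].y);
            uv3[v] = new Vector2(refVerts[v].z, 0f);
        }

        stretched.uv2 = uv2;
        stretched.uv3 = uv3;
    }
}
```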
     
    fataldevelop likes this.
  14. polemical

    polemical

    Joined:
    Jun 17, 2019
    Posts:
    747
    This would be for color and alpha only (since you'd be applying it to the bubbled mesh), provided the unstretched vertex positions are in 2nd UV channel:

    BubbleGumAlpha_Using_UVs_ColorAndAlphaOnly_V2.jpg

    Edit: Replaced above with a faster approach (avoid the Split nodes) and renamed properties. "StretchDistanceMax" is the distance from the original vertex position at which the output should be entirely ColorStretched and AlphaStretched.
     
    Last edited: Sep 27, 2020
    fataldevelop likes this.
  15. fataldevelop

    fataldevelop

    Joined:
    Jul 5, 2014
    Posts:
    13
    Thank you guys. I baked the unstretched X,Y into the UV1 channel and Z,0 into the UV2 channel, used your Shader Graph example, and with just the "stretched color" (without alpha) it works well on my BubbleGum example, attached. I feel like I made my first Shader Graph shader by myself =) At least I'm starting to understand what is happening within these nodes. In your example I swapped the incoming connections on the Divide node to get the correct color.

    But when I change the surface type to Transparent, I get strange behavior: the mesh becomes transparent in areas that shouldn't be transparent. Do you know what I'm missing?
    Screenshot 2020-09-28 at 16.51.14.png

    I attached package of example.
     

    Attached Files:

    Last edited: Sep 28, 2020
  16. polemical

    polemical

    Joined:
    Jun 17, 2019
    Posts:
    747
    You have an issue with your normals. I just wrote a quick procedural fix, here's before and after (with shadows disabled on the light so as not to interfere with the comparison):
    before_FixNormals.jpg after_FixNormals.jpg
    Code (CSharp):
    1. using System.Collections.Generic;
    2. using UnityEngine;
    3. public class FixNormals : MonoBehaviour
    4. {
    5.   public bool runAtAwake = true;
    6.   public float smoothing = 0.001f;
    7.   private Mesh mesh;
    8.   void Awake()
    9.   {
    10.     if (!runAtAwake) return;
    11.     Debug.Assert(smoothing >= 0, "smoothing must be >= 0");
    12.     MeshFilter mf = GetComponent<MeshFilter>();
    13.     Debug.Assert(mf != null, "No MeshFilter");
    14.     Debug.Assert(mf.sharedMesh != null, "MeshFilter has no Mesh");
    15.     mesh = Instantiate(mf.sharedMesh);
    16.     Vector3[] vertices = mesh.vertices;
    17.     Vector3[] normals = mesh.normals;
    18.     List<int> matches = new List<int>();
    19.     for (int v = 0; v < vertices.Length; v++)
    20.     {
    21.       matches.Clear();
    22.       for (int w = 0; w < vertices.Length; w++)
    23.       {
    24.         float d = Vector3.Distance(vertices[v], vertices[w]);
    25.         if (d <= smoothing) matches.Add(w);
    26.       }
    27.       Vector3 normal = Vector3.zero;
    28.       for (int m = 0; m < matches.Count; m++) normal += normals[matches[m]];
    29.       normal /= (float)matches.Count;
    30.       normal.Normalize();
    31.       for (int n = 0; n < matches.Count; n++) normals[matches[n]] = normal;
    32.     }
    33.     mesh.normals = normals;
    34.     mf.sharedMesh = mesh;
    35.   }
    36.   void OnDestroy()
    37.   {
    38.     if (mesh != null) DestroyImmediate(mesh);
    39.   }
    40. }
    Attach it to your BubbleGumMesh object (the one with the MeshFilter) and hit Play. You'd want to actually save the revised mesh and remove the component afterwards.

    Edit: fixed an assert statement. Edit #2: moved the smoothing value assert earlier. Edit #3: now applying the result to all matches (subsequent comparisons would be irrelevant that way / fixes an issue that occurred to me). Also added (float) before matches.Count; it made me nervous not having it there even though it probably isn't necessary :D ..finally Edit #5: added normal.Normalize() before applying to matches. Edit #6! Fixed line 15 redefining mesh, as the reference wouldn't be kept in the private mesh field for OnDestroy (not that this would be used at runtime.. but for anyone deriving something from this, might as well correct it to avoid a leak).
     
    Last edited: Sep 29, 2020
    fataldevelop likes this.
  17. fataldevelop

    fataldevelop

    Joined:
    Jul 5, 2014
    Posts:
    13
    Your script makes the mesh very smooth; it looks cool. But it still has issues with a transparent material (if set to Opaque, no issues). Alpha is not set, so the material should not be transparent. I've tested it with the standard renderer too, same issues. Maybe you don't see it on your machine, so I made a gif animation. I checked the polygon normals in Cinema 4D and everything should be fine, but something is wrong.
    final_5f724de01ad799008045e056_424904.gif
     
  18. polemical

    polemical

    Joined:
    Jun 17, 2019
    Posts:
    747
    Getting a 3D model to sort properly with transparency is a well-known, frequently encountered issue that's elaborated upon here and elsewhere - if you search for something like "unity transparent models depth sorting" you'll find plenty of info and suggested workarounds. I'd just be searching too. There might be an easier way, but my initial guess is that you'll need to split the model into chunks - I can't think of another way to solve having multiple front-facing surfaces behind each other in the same mesh.
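    As a minimal illustration of the chunking idea (and only that - this is a sketch, not the approach used later in the thread), triangles can be bucketed into a coarse spatial grid, with each bucket becoming its own mesh that Unity can depth-sort independently:

```csharp
using System.Collections.Generic;
using UnityEngine;

public static class MeshChunker
{
    // Split a mesh's triangles into per-cell meshes on a coarse grid, so each
    // chunk can be rendered and depth-sorted as a separate draw.
    public static List<Mesh> SplitByGrid(Mesh source, float cellSize)
    {
        int[] tris = source.triangles;
        Vector3[] verts = source.vertices;
        var buckets = new Dictionary<Vector3Int, List<int>>();

        for (int i = 0; i < tris.Length; i += 3)
        {
            // Bucket each triangle by the grid cell its centroid falls in.
            Vector3 c = (verts[tris[i]] + verts[tris[i + 1]] + verts[tris[i + 2]]) / 3f;
            var cell = new Vector3Int(Mathf.FloorToInt(c.x / cellSize),
                                      Mathf.FloorToInt(c.y / cellSize),
                                      Mathf.FloorToInt(c.z / cellSize));
            if (!buckets.TryGetValue(cell, out var list))
                buckets[cell] = list = new List<int>();
            list.Add(tris[i]);
            list.Add(tris[i + 1]);
            list.Add(tris[i + 2]);
        }

        var chunks = new List<Mesh>();
        foreach (var kv in buckets)
        {
            var m = new Mesh { vertices = verts, triangles = kv.Value.ToArray() };
            m.RecalculateBounds(); // bounds drive the per-renderer sort order
            chunks.Add(m);
        }
        return chunks;
    }
}
```

    For simplicity each chunk keeps the full vertex array; a real implementation would remap indices to a compacted per-chunk vertex list and copy the other channels (normals, UVs, colors) across too.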
     
    fataldevelop likes this.
  19. polemical

    polemical

    Joined:
    Jun 17, 2019
    Posts:
    747
    BubbleGumAlpha_almost_there_1.jpg BubbleGumAlpha_almost_there_2.jpg

    Progress! I couldn't resist. It still has issues / not ready yet, but it's close. I'll follow up in a day or two, I'm still tinkering and it's messy (..and this isn't the only thing I'm working on).
     
    fataldevelop likes this.
  20. fataldevelop

    fataldevelop

    Joined:
    Jul 5, 2014
    Posts:
    13
    Now I see that this is a common problem; I'd never used transparent 3D models and didn't know about it.

    It looks great, I'll follow your posts. Your help is already huge.
     
    polemical likes this.
  21. polemical

    polemical

    Joined:
    Jun 17, 2019
    Posts:
    747
  22. fataldevelop

    fataldevelop

    Joined:
    Jul 5, 2014
    Posts:
    13
    Hi, have you found a workaround without splitting mesh in chunks? Because this one looks just perfect :)
     
  23. polemical

    polemical

    Joined:
    Jun 17, 2019
    Posts:
    747
    The image above is via drag-and-drop onto a Component; the splitting is hands-free/behind the scenes. It's drawn procedurally via Graphics.DrawMesh - no additional GameObjects, not even a MeshRenderer or MeshFilter if you select "Cached" mode (data serialized as an array of floats). You can also select "Instanced", then drop another BubbleGumSplit object into "Instance Of" and have it drawn by that without holding any data itself.

    I've been putting off creating a solution like this for a while given the complexity, but have been meaning to for ages because there are quite a few things it'd be useful for. It's all working now - it's the Editor side that I'm not happy with yet. I need to do a custom Inspector, because it's unwieldy without one / too many things can get confusing between the different modes and what applies or not. There are "Cached", "Realtime" and "Instanced" modes - all functional.

    BubbleGumSplit_pre-inspector.jpg

    The above should give you the general idea. I actually just got the remaining functionality I wanted it to have done this morning. So I'm down to optimizing, doing a custom inspector, sample scene, packaging. I'm guessing late tomorrow I'll post it wherever it's at and you can have at it.

    Note that this isn't a magic bullet or anything - sorting operations for transparency and out-of-control batch counts prevent this kind of thing from being applied to everything in a scene. I wouldn't use this for a mobile game. What I've done is just made *being able to* easy, without sorting additional mesh assets or chopping meshes yourself or needing a prefab with 100 GameObjects in it each having a little mesh chunk. To me that kind of thing is a showstopper.

    With this, you just drop the mesh and material in, tweak some properties and you're done.
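    The drawing side of such a component presumably boils down to something like the following per frame - a hedged sketch of Graphics.DrawMesh usage, not DoubleGum's actual code (ChunkDrawer and its fields are illustrative names):

```csharp
using UnityEngine;

// Draws pre-split chunks every frame with no child GameObjects, in the
// spirit of the component described above. Illustrative sketch only.
public class ChunkDrawer : MonoBehaviour
{
    public Mesh[] chunks;      // the pre-split transparent chunks
    public Material material;  // the transparent material to draw them with

    void Update()
    {
        // Each chunk is submitted with this object's transform matrix, so
        // moving, rotating or scaling the GameObject affects all of them.
        foreach (Mesh chunk in chunks)
            Graphics.DrawMesh(chunk, transform.localToWorldMatrix, material, gameObject.layer);
    }
}
```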
     
    fataldevelop likes this.
  24. polemical

    polemical

    Joined:
    Jun 17, 2019
    Posts:
    747
    I'll clarify/correct a few points re workflow while I'm here. You can either:
    1. Create an empty GameObject, add this Component, drop a Mesh and a Material to it.
    2. Add this to an existing GameObject with a MeshRenderer and MeshFilter, it will copy references from both.

    With "Instance Of" selected, the mesh data is shared by however many instances there are. You can still translate, rotate and scale each instance and it will follow (via matrix multiplication of the mesh data with the relevant transform). You can also still select a different material and different shadow cast/receive options for each instance. This approach would make sense if you had several glasses on a table, or seriously impressive soap bubbles.

    "Caching" is the important point. It cuts runtime processing in my test down from 5ms to 1ms (or less).
    BubbleGumSplit_benchmarking.jpg

    When you select Caching mode, it only deserializes at runtime. It's pretty fast, and you don't need to include the source mesh in your build unless you're using it elsewhere (or for shadow casting: a non-transparent material with "shadows only" set on an attached MeshRenderer/MeshFilter, since you can't cast shadows from a transparent one - this approach is shown in the image you commented on). Also, I have the split count pretty high for a sphere; I just kept increasing it until I could zoom in and pan around without transparency depth glitches.

    At any time during gameplay you can change properties and just call .Generate() on the component - it is functional in Play Mode and builds.

    Also note that any material or mesh is fine. The bulge processing is optional, though it's required by the shader in question; you can just uncheck it and select any other material you want. The serialization I've done analyzes the mesh data and only includes the necessary channels - vertex position plus whichever others the mesh has, out of normals, tangents, colors, uv, uv2 and uv3. You could modify it to support more UV channels than the first 3 if you needed them.

    I've considered making it easier to add bulge data by dropping your bulged mesh into a slot. Then when it generates the data, if it sees that slot isn't null, it'll use the vertex positions in it for UV2/UV3. That way you could easily just drop an unstretched mesh into "Mesh" slot and stretched mesh into "Bulged Mesh" (or whatever I call it) and it would automatically fill the channels. I'm not sure if that will make it in for the initial share, but if not I'll make it the first priority for any revision.
     
    fataldevelop likes this.
  25. polemical

    polemical

    Joined:
    Jun 17, 2019
    Posts:
    747
    DoubleGum_shadows.jpg DoubleGum_intersection.jpg
    It's pretty much wrapped up for now (code, custom inspector, shader, a couple of prefabs).

    The above shots and the package are 2019.4.11f1 with URP 7.3.1 Verified.
    I've also tested briefly in 2020.2.0b4 with URP 10.1.0-preview.2.

    The title of this asset is tentatively DoubleGum :p ..whether I end up taking it any further or this was just a fun couple of days, it's been [mostly] a pleasure. Another tool in the toolbox - for everyone.

    Please keep in mind the pink mesh was shared here by @fataldevelop with the hope of getting something like this to work. Any subsequent release of DoubleGum will not contain it. That said, I do have other things to attend to for the next while that I've been putting off, so any further work probably won't happen any time soon.

    Enjoy :)
     

    Attached Files:

    fataldevelop likes this.
  26. polemical

    polemical

    Joined:
    Jun 17, 2019
    Posts:
    747
    I noticed if you select Cached mode with a Prefab, there's an issue with "change delta" and serialization - the solution is either:
    - Pick REALTIME mode instead.
    - or Unpack Prefab (which is only a single GameObject anyway) if you want to use caching.

    I'm pretty sure this isn't something I can do anything about, because a scene that's 300KB when it's NOT a Prefab can be like 15MB when it is. The editor is struggling somehow - I'll file a bug report with a minimal repro at some point if I do anything else with this and it turns out not to be a bug on my end.. and seeing as it only happens with Prefabs, I don't think it is.
     
    fataldevelop likes this.