
Question Getting an unmodified vertex position in fragment shader

Discussion in 'Shader Graph' started by Amnimatic, Apr 23, 2021.

  1. Amnimatic

    Amnimatic

    Joined:
    Aug 14, 2020
    Posts:
    1
    I've been making a vertex shader that heavily alters the vertex position. However, I'd also like to get the unmodified vertex position in the fragment shader so I can reliably cull geometry that would otherwise wrap around itself.

    Sadly, it doesn't seem like I can save variables in the vertex shader and pass them into the fragment shader, like I can in ShaderLab. Is there a way to get the unmodified position other than baking positions into a UV channel?

    Any feedback, ideas, or even "nope ur stupid" is appreciated.
     
    Last edited: Apr 23, 2021
  2. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,352
    At this time, baking the original vertex positions into the UVs is the only workaround short of manually modifying the generated shader code.
     
  3. lilacsky824

    lilacsky824

    Joined:
    May 19, 2018
    Posts:
    171
    It seems that since 2021.2 you can use a custom interpolator: send the original vertex position to a custom interpolator, then read it back in the fragment stage.
    I've been waiting for this feature for so long; before it was released I always had to hand-write shaders.
    [image: name.jpg]

    The left side is the position read from the custom interpolator; the right side is the position read from the Position node.
    [image: custom.jpg]
     
  4. pauldrummond

    pauldrummond

    Joined:
    Oct 30, 2014
    Posts:
    145
    I'm working on something similar, where I bake the original vertex positions into secondary UV maps. I want to compare these with the current vertex position and output the distance between the points; this float value will then be used as a mask. Unfortunately, I just can't make it work. Is the problem that the Position node is evaluated in the vertex stage of the shader, while the UV is read in the fragment stage? I tried using a Custom Function node too, but that isn't working either. Any suggestions are much appreciated.
     
  5. pauldrummond

    pauldrummond

    Joined:
    Oct 30, 2014
    Posts:
    145
    I made some progress but the stored mesh vertex positions don't seem to be right once in the shader. I've been at it for a while so am tired and could be making a stupid mistake. Does this workflow sound right?

    1. I grab a copy of the mesh.vertices array. Each item in the array is a Vector3, with each component in the -0.5 to +0.5 range, corresponding to the local vertex x, y and z coordinates.
    2. I save the contents of this array to the uv3 array, adding (0.5, 0.5, 0.5) to each item to move the components into the 0-1 range, as this is what the uv3 array expects.
    3. In the shader (vertex stage) I compare the current vertex position with the uv3 stored value (subtracting (0.5, 0.5, 0.5) to restore the original coordinates). Even when I confirm the values from the script match, the shader output does not. It's as if the mesh vertex positions have been shifted down slightly.

    As far as I understand, Unity can write a Vector3 to the uv3 array via script. Shader Graph can read these back, or is it limited to uv x and y only?
     
  6. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,352
    Don't do this. All you need to do is:
    Code (csharp):
    mesh.SetUVs(3, mesh.vertices);
    Anything more and you're overthinking it.

    The mesh UVs don't "expect" a range between 0.0 and 1.0; they're unbounded arbitrary float values (apart from 32 bit floating point limitations). There's no need to try to remap them to any kind of limited range. Just set the actual vertex positions and then get them from the UV node in the shader.

    You also mentioned the "uv3 array", which makes me think you are using mesh.uv3 ... which you should not be. That's legacy functionality for setting mesh UVs, and those older array properties are limited to Vector2 arrays. They are also 1-based, meaning mesh.uv3 would be "UV2" in the Shader Graph's UV node, which is 0-based. The mesh.SetUVs() function is also 0-based, so the first argument to that function lines up with the UV index the Shader Graph node uses.

    What stage a node runs in depends on what the graph ultimately plugs into. If it ends up plugging into one of the Block Nodes in the Vertex section of the Master Stack, all the connected nodes run in the vertex shader. If they plug into one in the Fragment section, they all run in the fragment shader. If a single set of nodes connects to both, those nodes get run twice: once in the vertex shader and once in the fragment shader.
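    [Editor's note: a minimal sketch of the baking step described above. The component name is made up, and it assumes a MeshFilter on the same GameObject.]

    ```csharp
    using System.Collections.Generic;
    using UnityEngine;

    // Hypothetical helper: bakes each vertex's rest-pose local position
    // into UV channel 3 (the "UV3" index in Shader Graph's UV node).
    public class BakeRestPositions : MonoBehaviour
    {
        void Start()
        {
            Mesh mesh = GetComponent<MeshFilter>().mesh;

            // SetUVs is 0-based and accepts Vector3 data directly, so the
            // raw local-space positions are stored as-is; no 0-1 remapping.
            mesh.SetUVs(3, mesh.vertices);
        }
    }
    ```
    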
     
  7. pauldrummond

    pauldrummond

    Joined:
    Oct 30, 2014
    Posts:
    145
    Thanks. I'll try this out. I am using mesh.SetUVs(3... but will skip any value adjusting.
     
    Last edited: Sep 18, 2021
  8. pauldrummond

    pauldrummond

    Joined:
    Oct 30, 2014
    Posts:
    145
    I've dropped the (0-1) remapping but am getting the same results. In my test shader if I take the current vertex position and plug it into the position node in the Vertex stage of the Master Stack it behaves as you would expect. There is no visible change to the mesh because it's simply re-applying the correct vertex position.

    However, if I connect the output from my saved UV(3) values instead the whole mesh changes position slightly. In the debugger I can see the saved UV values are identical to the mesh vertex positions, so I'd expect no change to the mesh.

    Am I missing something obvious here? What I'm trying to do is compare the current vertex position with a corresponding saved value in the UV.

    Thanks.

     
  9. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,352
    Are you using a skinned mesh? If so, this is probably all working exactly as described. On the right is the mesh in its original vertex position; on the left is the skinned position. They're often not the same, even in the "bind pose", which is what I expect is happening here.

    If you take that mesh and use it with a MeshRenderer instead, my expectation is swapping the shader between the regular vertex position and the UV3 vertex position won’t show the offset you’re seeing.

    So what you need to do is apply the skinning to each vertex to get the "original" position, or, more easily, get the baked mesh from the skinned mesh renderer and copy the vertex positions from that.
    https://docs.unity3d.com/ScriptReference/SkinnedMeshRenderer.BakeMesh.html
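    [Editor's note: a rough sketch of that approach, with a made-up component name. Note that writing to sharedMesh modifies the shared mesh asset; in a real project you may prefer to instantiate a copy first.]

    ```csharp
    using UnityEngine;

    // Hypothetical helper: snapshot the skinned renderer's current (rest)
    // pose with BakeMesh and store those positions in UV3, so the shader
    // compares against the pose as it is actually rendered.
    public class BakeSkinnedRestPose : MonoBehaviour
    {
        void Start()
        {
            var smr = GetComponent<SkinnedMeshRenderer>();

            // BakeMesh writes the current skinned vertex positions into a
            // plain Mesh; call it while the character is in its rest pose.
            var baked = new Mesh();
            smr.BakeMesh(baked);

            smr.sharedMesh.SetUVs(3, baked.vertices);

            Destroy(baked); // the snapshot is no longer needed
        }
    }
    ```
    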
     
  10. pauldrummond

    pauldrummond

    Joined:
    Oct 30, 2014
    Posts:
    145
    Yes, it's a skinned mesh. Testing on a normal MeshRenderer works as expected. Makes sense. Thanks for that. Why would the values appear the same in the console debug though?
     
  11. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,352
    Because those are the default vertex positions of the mesh. That's why the mesh doesn't move when you use it with a mesh renderer. A skinned mesh renderer is always applying the bone weights, either on the CPU or GPU, but that doesn't modify the base mesh data itself; rather, it produces a transformed copy of the mesh that gets used for rendering. The BakedMesh() function gets you access to one of those modified meshes. On the GPU it's either using that modified mesh uploaded from the CPU, or using a copy created by the GPU on the GPU.
     
  12. pauldrummond

    pauldrummond

    Joined:
    Oct 30, 2014
    Posts:
    145
    What should I be saving at the start to act as my reference vertex positions? A copy of the skinned mesh before any other changes are made to it? I'm trying to find the differences between the current mesh vertex positions and a default rest pose.
     
  13. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,352
    You need to use the BakeMesh() function (which I typoed as "BakedMesh" in the previous reply) to get the vertex positions of that default rest pose.
    https://docs.unity3d.com/ScriptReference/SkinnedMeshRenderer.BakeMesh.html

    The "default pose" of a skinned mesh is still a skinned mesh, meaning the positions of the bones are still modifying the positions of the vertices. And there's no guarantee that the "real" starting vertex and bone positions are exactly at the bind pose positions.

    It's probably weird to think that the default pose isn't just the mesh's vertex positions, especially if you were the one who rigged the mesh and you know you placed the bones on it "where it was standing". But asset creation tools apply all sorts of hidden transforms to meshes that may not be obvious. You may have a character mesh that you know is positioned with its feet flat on the plane you want and with the pivot placed at 0,0,0. You may have reset the transform, flattened the hierarchy, or done whatever processing your modelling tool needs to remove any unwanted transforms ... but in my experience that isn't enough. The only 100% reliable way I've found to remove all transforms, in every modelling tool I've ever used, is to make a box, append your character mesh to it, then delete the box geometry. I've worked on several projects in the past that were especially fussy about the original mesh vertex positions and had to do this. Max, Maya, Blender, Softimage: all of them have a hidden transform history that there's no other way to clear.

    Unity imports the data exactly as the official FBX tool (which is what they use) hands it over. And the official FBX tool hands over a mesh with the vertices and bones using that hidden offset, with the bones then moved to the default pose you're expecting, which in turn also moves the rendered mesh. 3ds Max is even weirder: because it uses Z-up, the original vertex positions of the mesh are rotated 90 degrees.
     
  14. pauldrummond

    pauldrummond

    Joined:
    Oct 30, 2014
    Posts:
    145
    Thank you.