
How can I grab shader Vertex Offsets and store into Mesh?

Discussion in 'Shaders' started by TheoKain, Nov 3, 2018.

  1. TheoKain

    Joined: Nov 20, 2016
    Posts: 14
    I'm fairly new to this pipeline and have spent all day researching and testing, but with little success. I'm looking for a detailed walkthrough of getting vertex offset data from the shader's vertex program into a C# mesh.vertices array.

    What I've tried so far (I'm using a custom Amplify shader for the vertex program):
    - output the vertex offsets to the albedo,
    - in C#, create a RenderTexture,
    - Graphics.Blit into it,
    - ReadPixels from it, and then
    - GetPixels into a Color array, copied into a Vector3[] array to feed back into mesh.vertices.

    Code (CSharp):
    RenderTexture resultBuffer, temp;
    temp = RenderTexture.GetTemporary(rendTexSize, rendTexSize, 0, RenderTextureFormat.ARGBFloat, RenderTextureReadWrite.Linear);
    resultBuffer = RenderTexture.GetTemporary(rendTexSize, rendTexSize, 0, RenderTextureFormat.ARGBFloat, RenderTextureReadWrite.Linear);

    // Run the material's shader (all passes) with temp as the source.
    Graphics.Blit(temp, resultBuffer, mat, -1);
    RenderTexture.ReleaseTemporary(temp);

    // Read the float render texture back to the CPU.
    Texture2D decTex = new Texture2D(resultBuffer.width, resultBuffer.height, TextureFormat.RGBAFloat, false);
    decTex.filterMode = FilterMode.Point;
    RenderTexture.active = resultBuffer;
    decTex.ReadPixels(new Rect(0, 0, resultBuffer.width, resultBuffer.height), 0, 0);
    decTex.Apply();
    RenderTexture.active = null;
    Color[] colors = decTex.GetPixels();

    // Unpack each pixel's RGB into a Vector3 offset.
    Vector3[] results = new Vector3[colors.Length];
    for (int i = 0; i < colors.Length; i++)
    {
        results[i].x = colors[i].r;
        results[i].y = colors[i].g;
        results[i].z = colors[i].b;
    }
    RenderTexture.ReleaseTemporary(resultBuffer);
    I vaguely get the concept of packing the vertex offsets into a 2D texture and then reading them back from the buffer via a RenderTexture... but the step-by-step process escapes me.

    I'm targeting OpenGL ES 3.0, so unfortunately I can't use compute shaders.

    Any help would be very much appreciated.
     
  2. bgolus

    Joined: Dec 7, 2012
    Posts: 12,329
    I don't know of any tutorials on this, but Keijiro's Skinner does the vertex-positions-to-texture thing.
    https://github.com/keijiro/Skinner

    The key parts are the SkinnerMesh.cs and Replacement.cginc files. The script creates a copy of the mesh and converts it to draw as points rather than triangles, and sets the UVs to correspond to the vertex index. The shader then renders each vertex to the appropriate pixel within the render texture, writing out whatever data you want. The mesh-preparation half looks roughly like the sketch below.
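
    Something like this (an untested sketch; the class name, method name, and texSize parameter are mine, not Skinner's):

    Code (CSharp):
    using UnityEngine;

    public static class VertexBakeMesh // hypothetical helper, not from Skinner
    {
        // Returns a copy of the source mesh set up so a shader can write
        // one pixel per vertex into a texSize x texSize render texture.
        public static Mesh Convert(Mesh source, int texSize)
        {
            var mesh = Object.Instantiate(source);
            int count = mesh.vertexCount;

            // Draw every vertex as an individual point instead of triangles.
            var indices = new int[count];
            for (int i = 0; i < count; i++) indices[i] = i;
            mesh.SetIndices(indices, MeshTopology.Points, 0);

            // Store each vertex's target pixel in the UVs so the vertex
            // shader knows which pixel of the render texture to write to.
            var uvs = new Vector2[count];
            for (int i = 0; i < count; i++)
            {
                uvs[i] = new Vector2((i % texSize + 0.5f) / texSize,
                                     (i / texSize + 0.5f) / texSize);
            }
            mesh.uv = uvs;
            return mesh;
        }
    }

    The shader side then uses that UV to position each point in clip space instead of doing the usual object-to-clip transform.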

    I would not expect Amplify Shader Editor to give you enough control to do this easily. And I would not expect this to end up being faster than manipulating the vertices on the CPU, since ReadPixels can be very slow.
     
  3. TheoKain

    Joined: Nov 20, 2016
    Posts: 14
    @bgolus Thanks for the helpful hints. I think I found SkinnerMesh.cs => https://github.com/keijiro/Skinner/blob/master/Assets/Skinner/SkinnerSource.cs ... Anyway, I may have broken parts of my brain trying to get this working; I think this is way over my head and probably not worth the time, as you mentioned. Instead of moving the vertices through the shader in realtime, I'll redesign that part of my system to move the mesh vertices at discrete intervals on the CPU, something like the sketch below.
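
    Roughly what I have in mind (untested; the component name, interval field, and Displace function are placeholders for whatever my system actually does):

    Code (CSharp):
    using UnityEngine;

    [RequireComponent(typeof(MeshFilter))]
    public class CpuVertexOffset : MonoBehaviour // hypothetical example component
    {
        public float interval = 0.1f; // seconds between updates
        Mesh mesh;
        Vector3[] baseVertices, workVertices;
        float timer;

        void Start()
        {
            // .mesh gives an instance, so the shared asset stays untouched.
            mesh = GetComponent<MeshFilter>().mesh;
            baseVertices = mesh.vertices;
            workVertices = new Vector3[baseVertices.Length];
        }

        void Update()
        {
            timer += Time.deltaTime;
            if (timer < interval) return;
            timer = 0f;

            // Offset every vertex on the CPU at a fixed interval.
            for (int i = 0; i < baseVertices.Length; i++)
                workVertices[i] = baseVertices[i] + Displace(baseVertices[i]);

            mesh.vertices = workVertices;
            mesh.RecalculateNormals();
            mesh.RecalculateBounds();
        }

        // Placeholder displacement; swap in whatever the shader was doing.
        Vector3 Displace(Vector3 p)
        {
            return Vector3.up * (Mathf.Sin(p.x * 4f + Time.time) * 0.1f);
        }
    }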

    On a side note, I did learn a lot more about shaders than I knew two days ago. If I could, I would write a shader that takes in SV_VertexID (which I read is OpenGL ES 3.0 compatible), whose values match the mesh data's indices, outputs the transformed vertex positions to a render texture in object or world space (I presume) in the same order as the IDs, and then reads the texture data back into a vector array to feed to the mesh. Despite poring over the Keijiro source files, there are just too many gaps in my knowledge to piece together a working solution. I hate to give up on this path, but I guess that's why there are so few guides for this out there.
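
    For anyone finding this later, the vertex program I had in mind would look roughly like this (untested; _TexSize is a made-up property the script would set, SV_VertexID needs #pragma target 3.5, and the mesh would still have to be drawn as points, as described above):

    Code (CSharp):
    struct v2f
    {
        float4 pos : SV_POSITION;
        float3 objPos : TEXCOORD0;
    };

    float _TexSize; // width/height of the square render target

    v2f vert(float4 vertex : POSITION, uint id : SV_VertexID)
    {
        v2f o;
        // Map the linear vertex index to a pixel in the render target...
        float2 pixel = float2(id % (uint)_TexSize, id / (uint)_TexSize);
        // ...then to clip space, so this point lands exactly on that pixel.
        // (Watch out for the flipped Y axis on non-OpenGL platforms.)
        float2 uv = (pixel + 0.5) / _TexSize;
        o.pos = float4(uv * 2.0 - 1.0, 0.0, 1.0);
        // Pass the (offset) object-space position through as the payload.
        o.objPos = vertex.xyz;
        return o;
    }

    float4 frag(v2f i) : SV_Target
    {
        // Write the position into the float target's RGB channels.
        return float4(i.objPos, 1.0);
    }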
     
  4. bgolus

    Joined: Dec 7, 2012
    Posts: 12,329