Passing data from .cs to shader

Discussion in 'Shaders' started by vijaykiran, Jun 3, 2019.

  1. vijaykiran

    vijaykiran

    Joined:
    Mar 4, 2015
    Posts:
    7
    Hello everyone,

I need to perform a color animation of timestep data on a quad. There are about 30 timesteps, each with 4,000 float values. These float values represent intensities of a certain property (let's say 'co') at specific locations on the quad. In other words, there is a 'co' intensity value for each <x,y> location on the quad, and these <x,y> locations are the same across all timesteps.

    I have a CPU version of this where I load all the timestep data into memory in Start(), and then animate through the timesteps in a loop, stepping through the float values from timestep 0 to timestep 30 and rendering them on sprites. See image here: https://www.dropbox.com/s/bae021rghaly99d/co_animation.png

    The C#-only version tanks my framerate down to about 15 FPS, and I figured I might get a better framerate if I used a shader. To avoid having to pass 4,000 <x,y> positions to the shader separately, I procedurally created a quad with as many vertices as there are points in my timestep data.

    Code (CSharp):
    private void ProcedurallyGenerateQuad()
    {
        GetComponent<MeshFilter>().mesh = mesh = new Mesh();
        mesh.name = "Procedural Quad";

        vertices = new Vector3[xSize * ySize];
        int vertexIndx = 0;
        for (int i = 0; i < ySize; i++)        // 3750/25 = 150 rows
        {
            for (int j = 0; j < xSize; j++)    // 25 columns
            {
                vertices[vertexIndx] = new Vector3(cfdx[j + xSize * i], cfdz[j + xSize * i], cfdy[j + xSize * i]);
                vertexIndx++;
            }
        }
        mesh.vertices = vertices;

        int[] triangles = new int[(xSize - 1) * (ySize - 1) * 6];
        for (int ti = 0, vi = 0, y = 0; y < (ySize - 1); y++, vi++)
        {
            for (int x = 0; x < (xSize - 1); x++, ti += 6, vi++)
            {
                triangles[ti] = vi;
                triangles[ti + 3] = triangles[ti + 2] = vi + 1;
                triangles[ti + 4] = triangles[ti + 1] = vi + xSize;
                triangles[ti + 5] = vi + xSize + 1;
            }
        }
        mesh.triangles = triangles;
        mesh.RecalculateNormals();
    }
    This way, the shader readily has access to the vertices, and theoretically I'd just need to compute colors from the 'co' intensities in the fragment shader. The only missing part is the 'co' intensity data itself. This is where my confusion began. I attempted to pass a float array from .cs to the shader like below:

    Code (CSharp):
    Renderer rend;
    rend = GetComponent<Renderer>();
    rend.material.shader = Shader.Find("Go_Find_Shader_Name");
    rend.material.SetFloatArray("_co_Array_in_shader", coArrayInCSFile);
    In the shader, I defined an array like below:
    Code (Cg):
    float _co_Array_in_shader[4000];
    I did not see any error messages, so I assumed that the GPU received all 4,000 float values. The issue is that only one float value per fragment should be enough to compute the color I need, but the way I am passing the array makes all 4,000 float values available to every fragment, which is overkill. I am unable to figure out how to distribute these 4,000 float values across the different fragments in the fragment shader.

    I also tried another hack-job approach: packing the quad's normals with 'co' intensity values, since I do not really care about the normals generated by mesh.RecalculateNormals().

    Code (CSharp):
        // commented out normals calculation and injected my own data
        // mesh.RecalculateNormals();
        co_co2_h2_normals = new Vector3[xSize * ySize];
        for (int i = 0; i < co_co2_h2_normals.Length; i++)
        {
            // Temporarily use [30] and later make it
            // a variable that changes in FixedUpdate()
            co_co2_h2_normals[i] = new Vector3(co_val[30][i], co2_val[30][i], h2_val[30][i]);
        }
        mesh.normals = co_co2_h2_normals;
    The issue with the above is that once the normals are passed, I cannot update normals without creating a new quad again in FixedUpdate().

    I am missing something really simple that would allow me to send a float array from .cs such that every fragment gets only one value from the array.

    I hope I was clear in my question. Any help or direction is deeply appreciated.

    Sincerely,
    Vijay.
     
  2. Invertex

    Invertex

    Joined:
    Nov 7, 2013
    Posts:
    1,550
    I'm not sure I get what you mean here. If the screen resolution changes, or the quad is closer to or further from the camera, more or less of its surface is going to be covered in fragments to be rendered, so you can't really map data statically to specific fragments, unless perhaps you were just rendering to a render texture. How would you expect a specific fragment to know which element in the array to take from?

    The closest thing I can think of here: if your mesh has the same number of vertices as the array you're sending, then you could simply have a float or color field in your vert-to-frag struct that you assign to in your vertex program, like
    o.vertExtraColor = _co_Array_in_shader[i.vertID];
    where i.vertID is an
    int vertID : SV_VertexID;
    in your input struct to the vertex program.

    This will result in a value passed into the fragment
    i.vertExtraColor
    that is specific to that fragment's position within the triangle, interpolated between the values of the 3 verts that make up that triangle (standard barycentric interpolation). I'm not sure if that's what you're looking for here, though.
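
    Roughly, putting those pieces together in a minimal unlit shader might look like the sketch below (untested, written from memory; the shader name and the grayscale output are just placeholders, and it assumes the _co_Array_in_shader array from your C# snippet). Note that SV_VertexID needs at least #pragma target 3.5.

    Code (Cg):
    Shader "Custom/CoIntensitySketch"
    {
        SubShader
        {
            Pass
            {
                CGPROGRAM
                #pragma vertex vert
                #pragma fragment frag
                #pragma target 3.5   // SV_VertexID requires shader model 3.5+
                #include "UnityCG.cginc"

                // Filled from C# with rend.material.SetFloatArray("_co_Array_in_shader", ...)
                float _co_Array_in_shader[4000];

                struct appdata
                {
                    float4 vertex : POSITION;
                    uint vertID   : SV_VertexID;   // index of this vertex in the mesh
                };

                struct v2f
                {
                    float4 pos            : SV_POSITION;
                    float  vertExtraColor : TEXCOORD0; // per-vertex 'co' value, interpolated per fragment
                };

                v2f vert (appdata i)
                {
                    v2f o;
                    o.pos = UnityObjectToClipPos(i.vertex);
                    // Look up this vertex's 'co' intensity by its vertex index
                    o.vertExtraColor = _co_Array_in_shader[i.vertID];
                    return o;
                }

                fixed4 frag (v2f i) : SV_Target
                {
                    // Map the interpolated intensity to whatever color ramp you want;
                    // plain grayscale here just to show the value arriving per fragment.
                    return fixed4(i.vertExtraColor, i.vertExtraColor, i.vertExtraColor, 1);
                }
                ENDCG
            }
        }
    }

    The C# side would stay exactly as in your first post, just with Shader.Find pointed at this shader's name.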