Baking Vertex Positions in Blender and Animated them in Unity via Shader

Discussion in 'General Discussion' started by berk_unity75, Oct 17, 2022.

  1. berk_unity75

    berk_unity75

    Joined:
    Aug 20, 2021
    Posts:
    3
    Hello everyone!

    Not sure if this is the right section; if it is not, I am sorry.

    Let's say I have a standard sphere in Blender with 482 vertices. I want to reshape this sphere into a cube with the same mesh topology, baking the target vertex positions (as a texture or as vertex colors, it doesn't matter). So I export the mesh as a sphere, but the baked data holds the cube's vertex positions.

    In Unity I want to set the position of each vertex from the baked texture or colors. In a nutshell, that is what I am trying to achieve. I know I can use blend shapes and such, but I want to learn vertex position baking and repositioning vertices in a shader.

    I found a couple of Python scripts that bake vertex positions to a texture. I tried to feed that into the vertex position input in Shader Graph, but it didn't work (surprisingly!). Then I found a script that bakes vertex positions as vertex colors, which is much better I guess because I don't need UV coordinates, but that didn't work either.

    I know we can bake animations and play them back in a shader with vertex animation textures. I am trying to achieve the same thing, but without animation.

    Thanks for the help!!
     
  2. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    12,401
    This is called blend shapes. FBX supports them, and you can control them via a script.

    If for some reason you don't want to use blend shapes, you can bake positions into texture coordinates. That way you'll maintain full float precision. Getting that data out of Blender might be tricky, plus you'd need two sets of UV coordinates for the positional data, as Unity allows only 2 floats per UV set (using the simple Mesh API, at least), even though GPUs support 4.
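    As a minimal sketch of the two-UV-set packing described above (the channel layout, XY in one set and Z in the other, is my own assumption, and the function names are hypothetical; in Unity you would push these lists into the mesh's UV channels, e.g. via `Mesh.SetUVs`):

    ```python
    # Sketch: pack per-vertex target positions into two 2-float UV sets,
    # since the simple Mesh API exposes only (u, v) per UV channel.
    # Assumed layout: UV set 1 = (x, y), UV set 2 = (z, unused).

    def pack_positions_to_uvs(positions):
        """positions: list of (x, y, z) target positions, one per vertex."""
        uv1 = [(x, y) for (x, y, z) in positions]
        uv2 = [(z, 0.0) for (x, y, z) in positions]
        return uv1, uv2

    def unpack_positions_from_uvs(uv1, uv2):
        """Reassemble (x, y, z) from the two UV sets (what the shader would do)."""
        return [(a[0], a[1], b[0]) for a, b in zip(uv1, uv2)]

    # Example: three hypothetical cube-corner targets survive the round trip
    # exactly, because floats are stored unmodified (no quantization).
    cube_targets = [(-1.0, -1.0, -1.0), (1.0, -1.0, 1.0), (0.5, 2.0, -3.0)]
    uv1, uv2 = pack_positions_to_uvs(cube_targets)
    assert unpack_positions_from_uvs(uv1, uv2) == cube_targets
    ```

    The point of the scheme is that, unlike an 8-bit texture or vertex colors, nothing is quantized: each coordinate is carried as a full float.
    
    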

    Also, there's not much to learn about it on the shader side, and if you're using Shader Graph, the number of things you can do will definitely be very limited.
     
  3. berk_unity75

    berk_unity75

    Joined:
    Aug 20, 2021
    Posts:
    3
    I see, thanks a lot. Well, what do you think about baking vertex positions to vertex colors?
     
  4. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    12,401
    You'll lose some precision and will have limited range. Vertex color normally uses 8-bit ARGB and cannot be negative, meaning only 256 possible values per x/y/z axis. You can use the normal-map color trick, of course, to store negative values (compressed.rgb = uncompressed.rgb * 0.5 + 0.5), but you'll be limited to a ±1 cube for the values, AND you'd still need to store normals for the new shape anyway. In texture coordinates you can store any value you can stuff into a float.
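    A quick sketch of the precision cost of the normal-map trick mentioned above (the function names are hypothetical; this just round-trips one coordinate through the 8-bit encoding):

    ```python
    # Encode a coordinate in [-1, 1] into an 8-bit channel using the
    # normal-map remap (v * 0.5 + 0.5), then decode it back.

    def encode_8bit(v):
        # [-1, 1] -> [0, 1] -> quantized integer 0..255
        return round((v * 0.5 + 0.5) * 255)

    def decode_8bit(b):
        # integer 0..255 -> [0, 1] -> [-1, 1]
        return (b / 255) * 2.0 - 1.0

    # The quantization step is 2/255 ≈ 0.0078, so the worst-case
    # round-trip error is half a step, i.e. at most 1/255 ≈ 0.0039.
    x = 0.337
    error = abs(decode_8bit(encode_8bit(x)) - x)
    assert error <= 1.0 / 255 + 1e-9
    ```

    That ~0.004-per-axis error is usually invisible for normals, but for positions it means vertices snap to a 256³ grid inside the ±1 cube, which can show up as visible stepping on a smooth deformation.
    
    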

    Basically, if you want to truly experiment with shaders, Built-in renderer with code is better.
    https://docs.unity3d.com/Manual/SL-SurfaceShaderExamples.html
    Though those articles still don't teach you how to write a bare vertex/fragment shader rather than a surface shader.
     
  5. berk_unity75

    berk_unity75

    Joined:
    Aug 20, 2021
    Posts:
    3

    Thanks for the help!
     
  6. Ne0mega

    Ne0mega

    Joined:
    Feb 18, 2018
    Posts:
    560
    I did this eventually with vertex colors and Shader Graph, but it is way too complex to lay out or explain here. I had to write my own scripts and data-container system in Unity to transfer the data for dynamic blend-shaping of a common model, and it is even more complex because I am using skinned mesh renderers, which use shared meshes, which are a nightmare to swap around. This took me months to perfect.

    But, it is performant and works very well in the end.

    I'd be interested in a link to the python script you found that bakes vertex positions to vertex colors! Sounds great!