Is it possible to convert mesh vertex colors to texture?

Discussion in 'General Graphics' started by Bagazi, Apr 5, 2021.

  1. Bagazi

    Bagazi

    Joined:
    Apr 18, 2018
    Posts:
    611
I want to convert the vertex colors of a mesh to a texture map in PNG format or something similar. Is it possible? :)
     
  2. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,329
Possible, but better done in an application other than Unity. To do it in Unity you need your mesh to have pre-existing unique UVs, and to write a custom shader & script to do it. Whereas Blender / 3DS Max / Maya / etc. are entirely capable of rendering vertex colors to a texture out of the box, as well as automatically generating unique UVs if you do not already have them. The term you want to search for is "bake vertex colors to a texture" and you'll find plenty of resources.
     
    Sab_Rango and Bagazi like this.
  3. Bagazi

    Bagazi

    Joined:
    Apr 18, 2018
    Posts:
    611
Oh, I see. By the way, could vertex colors achieve the same or nearly the same visual effect via Unity's standard lighting system? If so, maybe I don't need to bake them..
     
  4. Przemyslaw_Zaworski

    Przemyslaw_Zaworski

    Joined:
    Jun 9, 2017
    Posts:
    327
    Quick test:

Code (CSharp):
using UnityEngine;

public class BakeVertexColorMap : MonoBehaviour
{
    public Mesh SourceMesh;
    public Shader BakeVertexColorMapShader;
    public int Resolution = 2048;

    void Start()
    {
        if (SourceMesh != null)
        {
            // Render the mesh, unwrapped into its UV layout, into an offscreen target.
            RenderTexture renderTexture = new RenderTexture(Resolution, Resolution, 0);
            renderTexture.Create();
            Material material = new Material(BakeVertexColorMapShader);
            RenderTexture currentTexture = RenderTexture.active;
            RenderTexture.active = renderTexture;
            GL.Clear(false, true, Color.black, 1.0f);
            material.SetPass(0);
            Graphics.DrawMeshNow(SourceMesh, Vector3.zero, Quaternion.identity);
            // Read the result back to the CPU and save it as a PNG in the Assets folder.
            Texture2D texture = new Texture2D(Resolution, Resolution, TextureFormat.ARGB32, false);
            texture.ReadPixels(new Rect(0, 0, Resolution, Resolution), 0, 0);
            RenderTexture.active = currentTexture;
            byte[] bytes = texture.EncodeToPNG();
            System.IO.File.WriteAllBytes(System.IO.Path.Combine(Application.dataPath, "VertexColors.png"), bytes);
            Destroy(material);
            Destroy(texture);
            renderTexture.Release();
        }
    }
}
Code (CSharp):
Shader "Bake Vertex Color Map"
{
    SubShader
    {
        Pass
        {
            ZTest Off
            ZWrite Off
            Cull Off
            CGPROGRAM
            #pragma vertex VSMain
            #pragma fragment PSMain

            // Instead of projecting the mesh with a camera, place every vertex
            // at its UV coordinate in clip space, so the rasterizer fills the
            // texture's UV layout with the interpolated vertex colors.
            void VSMain (inout float4 vertex:POSITION, inout float2 uv:TEXCOORD0, inout float4 color:COLOR)
            {
                float2 texcoord = uv.xy;
                texcoord.y = 1.0 - texcoord.y;
                texcoord = texcoord * 2.0 - 1.0;
                vertex = float4(texcoord, 0.0, 1.0);
            }

            float4 PSMain (float4 vertex:SV_POSITION, float2 uv:TEXCOORD0, float4 color:COLOR) : SV_TARGET
            {
                return color;
            }
            ENDCG
        }
    }
}
The left sphere has a material which visualises vertex colors; the right one has an Unlit/Texture material with the baked vertex color texture. Unfortunately this script still has to be fixed, due to visible seams.
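One common way to reduce those seams is to dilate the baked image afterwards: flood-fill the background pixels bordering each UV island with the nearest baked color, so bilinear filtering and mipmapping don't pull the clear color in at island edges. A minimal CPU-side sketch, assuming pure black is only ever the cleared background (the class and method names here are made up, not a Unity API):

```csharp
using UnityEngine;

public static class VertexColorBakeUtil
{
    // Grows UV islands outward by copying each colored pixel into
    // adjacent background (pure black) pixels. Run a few passes so
    // filtering doesn't bleed the background into the islands.
    // Caveat: genuinely black vertex colors would be treated as
    // background; a real implementation should use a coverage mask.
    public static void DilateEdges(Texture2D texture, int passes = 4)
    {
        int w = texture.width, h = texture.height;
        Color[] src = texture.GetPixels();
        for (int p = 0; p < passes; p++)
        {
            Color[] dst = (Color[])src.Clone();
            for (int y = 0; y < h; y++)
            for (int x = 0; x < w; x++)
            {
                if (src[y * w + x] != Color.black) continue; // already covered
                // Take the first non-background 4-neighbour, if any.
                if (x > 0 && src[y * w + x - 1] != Color.black) dst[y * w + x] = src[y * w + x - 1];
                else if (x < w - 1 && src[y * w + x + 1] != Color.black) dst[y * w + x] = src[y * w + x + 1];
                else if (y > 0 && src[(y - 1) * w + x] != Color.black) dst[y * w + x] = src[(y - 1) * w + x];
                else if (y < h - 1 && src[(y + 1) * w + x] != Color.black) dst[y * w + x] = src[(y + 1) * w + x];
            }
            src = dst;
        }
        texture.SetPixels(src);
        texture.Apply();
    }
}
```

Calling something like this on the readback texture right before EncodeToPNG should hide most one- or two-pixel seams; it won't fix seams caused by mismatched UV island borders themselves.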

    upload_2021-4-6_14-59-54.png

For testing, here's a script which generates vertex colors at runtime:
    https://docs.unity3d.com/ScriptReference/Mesh-colors.html
     
    april_4_short and Bagazi like this.
  5. Bagazi

    Bagazi

    Joined:
    Apr 18, 2018
    Posts:
    611
By the way, is there any unique benefit of a texture compared to vertex colors?
     
  6. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,329
    Depends on what you're trying to do with vertex coloring. If you're trying to mimic diffuse lighting, you can certainly get something similar with Unity's lighting system. There's even a built in vertex lit shader that you can use if you want to match the "look" of vertex coloring as close as possible.

The only benefit textures have over vertex color is that you're not limited to the density and structure of the vertices to define how the color is shaped. You can do thin lines, or in some cases smoother gradients, with a texture that wouldn't be possible with vertex colors. There are also a handful of shader effects that can be done with textures that can't be as easily accomplished with vertex colors, like having the color wiggle around on the surface, though usually you can get pretty close by moving the vertex positions instead.

    That and most of Unity's built in shaders don't support vertex colors by default. Mainly just UI and particle shaders.

    But if you already have a mesh that has vertex coloring, there's not really a ton of reason to bake it to a texture unless you explicitly need color that's more detailed than the mesh.
     
  7. Bagazi

    Bagazi

    Joined:
    Apr 18, 2018
    Posts:
    611

Thanks very much. My game objects are very simple, like pixel 3D objects. For instance, like
Chameleon upload_2021-4-7_2-3-0.png . In this game there are visual effects like mirror reflections, some shadow effects and some blur effects. I'm wondering whether the same or nearly the same quality could be achieved just with the vertex colors of my 3D GameObjects, or whether there is any difficulty with baking or other Unity lighting settings. I use a vertex color shader from ProBuilder which is supported by the Unity lighting system, but I'm not sure whether that is enough, or whether there are drawbacks to using vertex colors instead of textures..
     
  8. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,329
    Chameleon Run isn't even using vertex colors. They're using materials on cubes that have a color property. Even then I'm pretty sure they're using textures on those cubes to give the color some minor splotchy variation in brightness, as well as using ambient occlusion, reflections (both real on the ground and baked on the objects), blob shadows, and basic real time lighting. It's much more than vertex colors alone.
     
    Bagazi likes this.
  9. Bagazi

    Bagazi

    Joined:
    Apr 18, 2018
    Posts:
    611


This script works, thanks very much :D
     
  10. Bagazi

    Bagazi

    Joined:
    Apr 18, 2018
    Posts:
    611
Is there any lesson about building such a simple but comfortable, high-quality scene in Unity? :D
     
  11. Bagazi

    Bagazi

    Joined:
    Apr 18, 2018
    Posts:
    611
Yes, some visible seams also appear in my case..

I am a little confused about the shader part.. What is the theory behind the algorithm? I want to do more research.
     
  12. calpolican

    calpolican

    Joined:
    Feb 2, 2015
    Posts:
    425
Hi, sorry to necro this post.
I was also thinking lately about baking polypaint to textures. Doing it in an external app is fine for me.
...But I'm sort of wondering if this would be a good solution for my use case. The use case is this:
In my game's city landscapes, I need to layer textures, the same way I would in a Terrain asset (it's a post-apocalyptic game, so there's sand everywhere, a broken version of the floor, dirt, etc.). For this, having textures you can polypaint makes a lot of sense to me.
Yet, my floor is mostly flat, so I don't need the dense high polycount that polypainting requires (with polypaint, if you want resolution, your mesh needs to have lots of vertices).
So, I figured I could use a dense high-poly version of the floor to polypaint the masks in Unity: paint the sand, the dirt, etc., then bake the final mask to a texture, then swap the floor mesh for the low-poly version and use a material that masks the layers using a texture instead of the polypaint. The floor could remain low-poly, but still with a very decent resolution for the layered material's mask, potentially better than with polypaint.
Well, that's mostly the idea.
I also thought of other things, like having many of these different materials. In which case I could have a different layered material for the road, another one for the street, yet another one for the floor in the interiors, etc. This would successfully break the four-texture-limit tyranny of the URP terrain, and the layers would be applied per floor area, without the area needing to be a square like terrains tend to be.
I even thought about having a secondary UV, so I can change the distribution of the modular floor tiles to break the tiling pattern even more, but still have a unified UV for the mask that you can paint consistently (not sure if a secondary UV would make the material heavier, though).
So, it sounds feasible to me, but I don't know enough to tell whether this is a good idea or not.
     
    Last edited: Oct 19, 2021
    Bagazi likes this.
  13. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,329
    You're kind of just describing the mesh workflow for MicroSplat. Sans the baking step since it "just works" even with a bunch of different material layers. And there are tools that let you make it no more expensive than having 4 layers (by adjusting the layer weights so no more than 4 textures are ever used per pixel).

    Otherwise what you're describing is also what tools like Substance Painter exist to handle.
     
    calpolican likes this.
  14. calpolican

    calpolican

    Joined:
    Feb 2, 2015
    Posts:
    425
That sounds really good. So, it's not crazy at all. I'm not using any store assets in my project, so I can't use MicroSplat; I'll have to make my own solution.
If you can provide more info about this, I'd appreciate it.
The thing is, I need it to work directly in the editor with my baked lights and colors and all the game assets; I couldn't switch back and forth to Substance.

I made a material in Shader Graph (so that I can customize it easily). It has 4 textures, but you can put a 5th in the alpha channel or use it for something else. I'm going to make another network for the other maps, like the terrain has.
Not sure if I blended the normals right.
    z-MaterialBlend.jpg z-LayerColorBlend.jpg z-LayerNormalBlend.jpg
     
    Last edited: Oct 20, 2021
  15. calpolican

    calpolican

    Joined:
    Feb 2, 2015
    Posts:
    425
I'm pretty sure I can place my meshes in the scene inside another dummy object, duplicate them, combine the duplicates, and subdivide the result by script to polypaint them together.
The part where I think it gets tricky is when you create the secondary UV for the mask. I should do that before combining the items.
What I'm thinking is:
You iterate through each child mesh. You take their UVs, reduce them in size, and offset their position, so that you can lay them all out in one single square without any overlap between them. Then, after you have made a combined mesh and polypainted it, you bake the polypaint into that secondary UV space. As long as you don't delete the high-poly version, you could edit your changes and re-bake as much as you want.
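The shrink-and-offset layout step described above could be sketched as a naive grid packer, assuming each child mesh already has non-overlapping UVs inside 0..1 (the class and method names are made up for illustration):

```csharp
using UnityEngine;

public static class AtlasUVPacker
{
    // Packs each mesh's existing UVs into its own cell of an n-by-n grid,
    // writing the result into the secondary UV channel (uv2).
    public static void PackIntoGrid(Mesh[] meshes)
    {
        int cells = Mathf.CeilToInt(Mathf.Sqrt(meshes.Length));
        float cellSize = 1f / cells;
        for (int i = 0; i < meshes.Length; i++)
        {
            // Cell (column, row) for mesh i, scanning left to right, bottom up.
            Vector2 offset = new Vector2(i % cells, i / cells) * cellSize;
            Vector2[] uv = meshes[i].uv;
            Vector2[] uv2 = new Vector2[uv.Length];
            for (int v = 0; v < uv.Length; v++)
                uv2[v] = uv[v] * cellSize + offset; // shrink, then shift into the cell
            meshes[i].uv2 = uv2;
        }
    }
}
```

Note this gives every mesh the same cell size regardless of its world-space area, so texel density will be uneven; a real packer would weight the cells the way the lightmapper does.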

Now, the thing that worries me is: imagine I have two instances of the same mesh. If I want them to have different sets of secondary UVs, will I have to duplicate the mesh file? How do lightmaps handle this?
     
    Last edited: Oct 20, 2021
  16. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,329
    I was talking about tools that come with MicroSplat.

The main trick with MicroSplat, apart from using texture arrays instead of multiple individual textures for layers, is it figures out what the top N layers are per pixel and only samples the texture arrays at those N layers, where N is a setting you can adjust to control how expensive the shader is. There are a lot of small details in how you make that work well, including trying to make sure that you don't actually ever have more than N layers needed per pixel in the masks you use; otherwise you can get some hard lines showing up in places. Which is why MicroSplat has tools to help avoid that, or at least clean up the masks after they've been hand painted.

    This also only really works if you use texture arrays for complicated reasons on how GPUs handle texture sampling in shaders. The short version is if you have a shader that samples 4 different textures, even if you have branches to not sample the texture, it's best to assume it's going to be sampling all of the textures all of the time. For texture arrays you can have your code always sample the texture array 4 times, but change which layers you're sampling from on each pixel, that means you can have dozens of layers without needing to sample all of them all of the time.
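The idea in shader code, roughly (a hedged sketch using Unity's built-in texture array macros, not MicroSplat's actual code; the assumption that the top-4 layer indices and weights arrive as two float4s per pixel is mine):

```hlsl
// Albedo array with one slice per material layer.
UNITY_DECLARE_TEX2DARRAY(_Layers);

float4 SampleTopLayers(float2 uv, float4 indices, float4 weights)
{
    // Always exactly 4 samples, regardless of how many layers exist;
    // only the slice index changes per pixel.
    float4 c = 0;
    c += UNITY_SAMPLE_TEX2DARRAY(_Layers, float3(uv, indices.x)) * weights.x;
    c += UNITY_SAMPLE_TEX2DARRAY(_Layers, float3(uv, indices.y)) * weights.y;
    c += UNITY_SAMPLE_TEX2DARRAY(_Layers, float3(uv, indices.z)) * weights.z;
    c += UNITY_SAMPLE_TEX2DARRAY(_Layers, float3(uv, indices.w)) * weights.w;
    return c / max(dot(weights, 1.0), 1e-4); // normalise in case weights don't sum to 1
}
```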

    But all of that is moot if you just want to bake everything down to a single texture input. You can do what you're currently doing since you don't really care how efficient it is.

For objects that need to be lightmapped, first each mesh has special UVs for the lightmap that are supposed to be guaranteed to be unique per triangle with no overlaps. These can either be the secondary UVs that are imported, or Unity can try to generate them automatically. Then each renderer component has a lightmap index and lightmap scale & offset assigned to it that say which lightmap texture to use, and where that renderer's UVs sit within the texture. So if you have multiple identical meshes of the same scale in the same lightmap, they'll be scaled down and given different offsets. The packing basically just assumes each mesh's lightmap UVs take up a square area with a relatively consistent texel-to-world scale for that mesh, and it tries to give each mesh within a lightmap a similar relative amount of "texel world space", depending on the settings you have for that renderer's lightmap scale. Then it does square packing to sort and pack all of those squares into the UV space.
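On the shader side, that per-renderer remap is just a multiply-add. A minimal built-in-pipeline fragment sketch of how the scale & offset get applied:

```hlsl
// uv2 comes from the mesh's lightmap UV channel (TEXCOORD1).
// unity_LightmapST is set per renderer: xy = scale, zw = offset,
// placing this instance's UVs into its rectangle of the shared lightmap.
float2 lightmapUV = uv2 * unity_LightmapST.xy + unity_LightmapST.zw;
half3 baked = DecodeLightmap(UNITY_SAMPLE_TEX2D(unity_Lightmap, lightmapUV));
```

A custom atlasing tool would need to do the same thing: either bake that scale & offset into each instance's UVs, or pass it per renderer via a material property block.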

You'd have to replicate this, or update the mesh UVs for each mesh after you bake. There are already free tools out there for atlasing multiple objects into a single texture / material that do a lot of this kind of thing, if you want to search around. There are some that bake down the lighting too, but AFAIK those are all paid assets.