Discussion in 'Assets and Asset Store' started by Phong, Nov 20, 2012.
Will try to take a look at this later today.
Thanks a lot. I know it's a lot to ask, but I really tried everything and found no solution so far.
Is there an easy way to export a baked mesh to another project?
Hi Lzardo2012, I tried the model and was able to bake it. I have sent you a PM with a Unity package that has the test bake scene.
Yes, you may need to "bake into prefab" so that the baked mesh is a project asset instead of an instance in the scene. You can export everything by selecting the assets/folders you want to export in the project view and selecting "Assets Menu -> Export Package". You can export:
Texture Bake Result
Prefabs (can contain TextureBaker and MeshBaker components)
The exported package can be imported into a different project. Note that you can export from lower versions of Unity to higher versions of Unity but not the other way around.
Well, in the meantime I randomly came across a weird workflow that sort of does what I want. If I bake into a scene object, and select an existing mesh to overwrite (I'm just overwriting a sphere), that works as one would think. But then if I try to copy that mesh to another project it's still a sphere. But if I restart the first project and copy the mesh somewhere else, the mesh is correct, not a sphere anymore.
Very weird, and not very intuitive (why not just make a new mesh instead of hijacking an existing one?)
You can drag another mesh asset into the Mesh slot as you mention. Using a "Sphere" is a bit sketchy if it is the Unity built-in sphere mesh. I think if you bake into that, then all of your spheres will become your baked mesh, until Unity re-creates it, which usually happens when the editor restarts.
The sphere mesh that comes with meshbaker, sorry. I don't really understand why baking into that mesh doesn't change the size of the mesh, and remains a sphere when copied to another project... until I restart the first project in Unity and then the mesh is correct.
Not the sphere that comes with Mesh Baker, the sphere that comes with Unity (GameObject -> 3D Object -> Sphere).
You will get different results depending on which Mesh you use:
If you use a built-in Unity 3D primitive (Sphere, Cube, etc.): these meshes are generated by Unity and shared. If you overwrite one, then all your Spheres, Cubes, etc. will change until Unity re-generates it, and your changes will be lost.
If you use a mesh that is part of an imported model (.FBX file): your changes will survive until the .FBX file gets reimported. When the mesh is reimported from the .FBX file, your changes will be overwritten.
A Mesh saved as a .asset file: this is the only safe way to save a mesh if you want your changes to persist.
Thank You very much!
Hey, please, I am desperate, been trying all day. I followed your instructions in ep 8, and in your Blender file with the customizable character, when I add (for example) armor on the player as an additional skinned mesh, the animation doesn't play for the additional mesh; it is like it is being ignored. The armor is not being affected by the torso.
My steps: opened your .blend file in Blender, added the mesh I wanted, parented it as an object to the Player object, added an Armature modifier with Player as root bone. The animation does play in Unity, but the armor does not move. What should I do? Do animations get cancelled when I add an additional bone?
In order to debug the problem I would suggest testing each part separately until we locate the problem.
1) Is the problem with the changes to the Blender file? To check this, test that the Blender model works on its own in an empty scene. Drag the Blender model into an empty scene and move the torso bone. If the armor does not move, then the problem is with the changes to the Blender model. If the armor moves, then this part is good.
2) Try using the model without baking things together. Does the armor work in this case? If not, then there is something wrong with how the models are being put together into the character rig.
3) Try the baked version. There are a few things I can think of to check:
Is the "render type" of the MeshBaker settings set to "skinned mesh"?
Could the animator culling settings be affecting this (try "always animate")?
Check the editor log after the mesh bake. Any errors or warnings? Were the expected number of meshes baked?
(Solved) OK, I should have: 1) parented the armor mesh to the root with automatic weights (or made my own weights); 2) set the avatar automatically configured by Unity to Humanoid. A question: does the size of a texture affect performance? For example, would a 1024x1024 take more CPU power to load than a 512x512?
Glad to hear that it is working.
A single image at a different size, 1024 vs. 512, usually does not affect performance, but there are cases where it can:
When downloading your game or asset bundle, the download will be bigger.
If there are more textures in the game than can fit into GPU memory then textures need to be unloaded and reloaded on some platforms.
You can't exceed the total ram on the target platform.
Most games have a texture budget for each scene (depending on the platform). The budget is usually designed so that the scene will play acceptably well on the low end for that platform. It is important that the total size of all textures in a scene does not exceed the budget.
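If it helps to picture the budget math, here is a rough sketch (not Unity's exact accounting; `texture_bytes` is just an illustrative helper) of how much uncompressed GPU memory a texture takes, mipmaps included:

```python
def texture_bytes(width, height, bytes_per_pixel=4, mipmaps=True):
    """Rough uncompressed GPU memory for a texture.

    The mip chain adds roughly one third on top of the base level
    (1 + 1/4 + 1/16 + ... -> 4/3).
    """
    base = width * height * bytes_per_pixel
    return int(base * 4 / 3) if mipmaps else base

# Doubling each side quadruples the base memory footprint:
print(texture_bytes(1024, 1024))  # about 5.6 MB with mipmaps
print(texture_bytes(512, 512))    # about 1.4 MB with mipmaps
```

So a handful of 1024 textures where 512s would do can eat a scene's whole budget on a low-end device.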
That's what I'm doing. But the changes only persist after I restart Unity. If I copy the asset from the project folder to somewhere else before restarting, then the .asset file doesn't change.
Anyway, I've noticed that flipping a model with - scale has a tendency to screw up lighting (eg. if a light is coming from above and you flip vertically, in the baked mesh the top is dark and the bottom is lighted.) Is this a known issue?
Re: mesh changes not persisting. I think the mesh asset needs to be marked as "dirty". Mesh Baker only does this if the output is "Bake Into Prefab".
Re: flipped normals. I think that is an issue with any mesh in Unity, not just baked meshes. The exact behavior will depend on how the shader does the lighting. Most shaders will take the mesh normal and multiply it by the transform matrix. If the scale (in the transform matrix) is (-1, -1, -1), then the normal will be flipped. When the lighting calculations are done, it is as if each vertex/pixel is being lit with light from the opposite direction.
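To illustrate, here is a small sketch (plain Python, not shader code; the helper names are made up) of what happens to a simple N·L lighting term when the normal is multiplied through a (-1, -1, -1) scale:

```python
def transform_normal(normal, scale):
    """Apply a diagonal scale to a normal, then renormalize."""
    n = [c * s for c, s in zip(normal, scale)]
    length = sum(c * c for c in n) ** 0.5
    return [c / length for c in n]

def lambert(normal, light_dir):
    """Simple N.L diffuse term, clamped at zero like most shaders."""
    return max(0.0, sum(a * b for a, b in zip(normal, light_dir)))

up = [0.0, 1.0, 0.0]     # surface normal pointing up
light = [0.0, 1.0, 0.0]  # light shining down from above

print(lambert(transform_normal(up, (1, 1, 1)), light))     # 1.0 -> fully lit
print(lambert(transform_normal(up, (-1, -1, -1)), light))  # 0.0 -> dark
```

This matches the symptom described above: after the flip, the top of the mesh reads as facing away from the light, so it goes dark while the underside lights up.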
Can you adopt this behaviour, or at least expose it as a toggle, for other modes too?
In this scenario the lighting is fine before the bake. It applies to the standard shader. Perhaps there's a way to detect this, then modify the scale and rotation of the affected sub-objects prior to baking to avoid the issue?
Video here: https://www.dropbox.com/s/6cdtchn978y3xo7/MeshBaker lighting issue.mp4?dl=0
Re: the mesh changes not persisting. Is there a reason you are not using the Bake Into Prefab feature? The problem with adding this functionality to the "bake into scene object" is that this feature is designed around the idea of runtime baking and baking meshes that remain in memory. There is already some code that tries to detect scene changes and clean up meshes that were created while the scene is running. I am honestly worried about the complexity added when trying to detect the situation (is the game running or not, is the mesh an asset or not, was the mesh assigned externally or created internally by the mesh combiner). The purpose of "Bake Into Prefab" is to handle baking into assets. You could also write a script that handles your situation.
Create a mesh
Assign it to the mesh baker
Bake into it
Save the mesh as a project asset
Re: The scale issue. I will see if I can reproduce that issue.
Honestly the workflow just seems way worse. In Bake Into Prefab:
Create a GameObject, create a prefab from it, set the prefab to the bake target, bake, drag the instantiated GameObject with the mesh filter *out* of the prefab (because it's a child of the Prefab instead of the root object, which isn't useful), and then create a second prefab from that.
I set the bake target exactly one time, hit bake, it creates an object in the scene (Which is nice since I can immediately compare the result to the original and make sure everything lines up), and then I just drag that into the project folder and I have a prefab I can export.
Also, the same issue appears even when you bake into a prefab directly: the mesh (.asset file) is only updated *after* you close Unity. At least, that's when the file size actually changes.
Thanks. I imagine the video makes it super straightforward, but let me know if you need any help with that one.
Hi, my custom shader has a gradient color property, and each material has a different color.
It seems that neither [Custom Shader Property Names] nor [Blend Non-Texture Properties] can keep the color for each material.
Is it possible to keep every material's color individually? Thanks.
The combined material only has one set of color properties so it can only do one color at a time.
One solution would be to write your shader so that it uses textures instead of colors. You could use a small, solid color texture in place of each color and set the UVs of all your vertices to .5, .5.
Then Mesh Baker can build an atlas and adjust the UVs of the source meshes so that each mesh would sample a different part of the atlas to get its color.
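As a sketch of the idea (the helper name is made up, and a real setup would also need padding between cells so filtering doesn't bleed colors), here is how the single UV coordinate for each mesh could be picked from a grid atlas of solid colors:

```python
def color_cell_uv(index, cells_per_row):
    """UV at the center of cell `index` in a square grid atlas of
    solid-color tiles (cells numbered left-to-right, bottom-to-top)."""
    col = index % cells_per_row
    row = index // cells_per_row
    size = 1.0 / cells_per_row
    return ((col + 0.5) * size, (row + 0.5) * size)

# Four solid-color tiles in a 2x2 atlas: every vertex of the mesh that
# should use color 3 gets this one UV coordinate.
print(color_cell_uv(3, 2))  # (0.75, 0.75)
```

Because every vertex of a given mesh shares one UV, it always samples the middle of its tile, so each mesh keeps its own flat color from the shared combined material.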
@Phong I'm having an issue with the material combination step in the latest version of Mesh Baker (3.28.1). I generated the Mesh Baker using the Analyse Scene & Generate Bakers wizard. Then I click "bake materials into combined material". The screenshot below is using the default MeshBaker settings in the SampleScene that comes with the Lightweight Render Pipeline. I get a similar problem if I check the "Consider Mesh UVs" box. I get atlases that look correct, but the combined mesh doesn't appear to be using the right UVs.
Regarding persisting prefab changes: are you saving the scene after you bake? You need to save the scene after baking for the changes to persist. This is a strange "Unity thing". Some changes to assets that you make while a scene is open are not persisted until you save the scene. It can be confusing because some changes are persisted immediately (editing assets in the project view, creating and destroying assets), but other changes, like editing properties of an asset, are not persisted until the scene is saved.
Regarding the flipped normals. I am able to reproduce the problem and understand the issue. It will take me a few days to implement a solution.
If you used "Consider Mesh UVs" and re-baked the atlas, did you then also re-bake the mesh? I often forget to do that (re-bake the mesh) after updating the atlas. If that doesn't solve the issue, let me know and I will look into this.
I figured out part of the problem. I'm using LWRP and the name of the main albedo texture in the LWRP shaders is _BaseMap not _MainTex, so the automatically created material has the wrong texture applied. I swapped the albedo texture and got the below results (This is with the "consider mesh uvs" checkbox checked). Most of it's working except for the top part of the drywall. The drywall is a single mesh with two materials.
Hello! What if I have houses with openable doors and windows? Will baking the meshes and atlases make opening the doors impossible?
Ahhh, if there are shaders with non-standard property names, then these can be added in Custom Property Names. You should be able to add _BaseMap to that list.
Re: the upper part of the wall not working, you should be able to use the multiple materials feature to map that to a different submesh so that it can have a different material. See this video:
It is about dealing with meshes that have out of bounds UVs (tiling) and multiple materials. Very common with trees and buildings.
You have a few options.
1) You can use the Batch Prefab Baker to bake an atlas and prefabs with modified UVs but not combine the meshes. You can let Unity do the combining with static batching (doors and windows are left non-static).
2) You can combine into a skinned mesh although I would only recommend doing this with VERY low poly geometry.
3) You can add and remove from the combined mesh at runtime (remove a static window or door that the player is interacting with) and enable the separate mesh version of that window or door.
Thanks. I watched the tutorial video and tried some different settings. This is the best I've been able to produce (the merged version is on the right). The material atlas looks perfect; it's just that the UVs of the final mesh don't seem to be right. It's only one draw call, so I can live with the results; this is just an easy-to-reproduce case you might want to investigate.
Can I deactivate the door and window GameObjects in the house prefab and just bake and combine everything else?
Yes, when combining the textures and meshes, exclude the door and window game objects from the list of Objects To Combine.
How does 'fix out of bounds' UVs work? Does it rearrange the UVs to fit in the texture bounds, or does it just tile the texture to cover the out of bound area?
It ignores the texture bounds. It measures the UV rectangle actually used by each source mesh:
The used UV rectangle can be larger than the source texture (tiling)
The used UV rectangle can be smaller than the source texture (extracting a small part of another atlas)
The used UV rectangle is baked into the output atlas. Note that different parts of a single source texture can be copied multiple times to different used rectangles in the output atlas. This can happen if you have several different props that share a source atlas. For example a flower_pot prop and a chair might use different parts of a large source atlas. You may not want the entire large source atlas copied to the output atlas. ConsiderUVs will extract just the parts of the large source atlas used by the flower_pot and chair and bake only these parts to different parts of the combined atlas. If the used rectangles overlap, they will be merged into a single used rectangle. Otherwise they will be baked to separate used rectangles.
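A rough sketch of what "measuring the used UV rectangle" means (an illustration of the idea, not Mesh Baker's actual code):

```python
def used_uv_rect(uvs):
    """Axis-aligned (x, y, width, height) rectangle actually covered by
    a mesh's UVs. It may extend past [0, 1] (tiling) or cover only a
    small corner of it (sampling part of a larger source atlas)."""
    us = [u for u, _ in uvs]
    vs = [v for _, v in uvs]
    return (min(us), min(vs), max(us) - min(us), max(vs) - min(vs))

# A tiling mesh: UVs run from 0 to 3 in U, so three horizontal repeats
# of the source texture must be baked into the output atlas.
print(used_uv_rect([(0, 0), (3, 0), (3, 1), (0, 1)]))  # (0, 0, 3, 1)

# A prop that only samples the upper-right quarter of a big atlas:
# only that quarter needs to be copied to the output atlas.
print(used_uv_rect([(0.5, 0.5), (1.0, 0.5), (1.0, 1.0), (0.5, 1.0)]))
```

The first case grows the atlas (the tiled area is baked out flat); the second shrinks it (only the used sub-rectangle is copied).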
It looks like UV packing is only done on a per-object basis, meaning it will not repack the UVs from different objects to make the packing more efficient. I can see the reason behind it to some extent, but I really need to bake my atlas more efficiently using UV packing across different objects, because I have a lot of objects with wasted UV space.
Now you may say that is the objects' fault, but the objects are meshes generated from voxels, so they contain a lot of wasted space.
I know what I am asking is a bit different from what Mesh Baker does, but I was crossing my fingers since I own the asset and was hoping this might be possible?
Sadly Mesh Baker cannot help with this. There might be another asset that does. Probuilder has a UV editor, but it may only work with probuilder meshes.
I need to know before purchasing this, Can it still use LOD after combining?
If you have a group of props close together, each with an LODGroup, then you can combine the LOD0s together, the LOD1s together, the LOD2s together, and so on, and make a new combined LODGroup that uses the combined meshes.
You can also use the Batch Prefab Baker. The idea here is create modified versions of your prop prefabs that all share the same material. You can use these prefabs in the scene instead of your originals. The main benefit here is static batching which works with LODGroups. You can let Unity do the combining (static batching) and LOD groups still work and are handled automatically. All the props in a static batch must share a material which Mesh Baker helps with.
Thanks. This asset also helps with combining atlases, right? I have a few atlases and wish to combine them into one big atlas so they can share one material.
EDIT with question:
Is it compatible with the Imposter System asset used for LOD? Just asking.
Hello @Phong. I'm playing with the free version of Mesh Baker, and it seems it decreases the quality of combined textures compared to the original textures, even when the size of the combined texture is large enough to avoid any significant scaling of the original textures. You can see this effect in the screenshots below:
Below you can see the parts of the normal maps corresponding to these windows for both cases as an example. They are copied from the complete normal maps and have the same size, but Mesh Baker makes the composite normal map more blurry compared to the original.
Is something wrong with my textures (I can send you a simple project reproducing this), or did I miss something essential in the documentation?
Thanks in advance.
Re: combining atlases. Yes, this is possible, but the problem you will run into is that the source atlases are probably large, and each platform has a maximum texture size. You could easily run into a situation where the quality is downgraded because the atlases need to be shrunk to fit.
A way to fix this is to use "considerUVs". If that is enabled, then only the parts of the atlas that are actually used by the models get copied. This is great if you are using a flowerpot that uses a tiny part of a huge atlas and don't want to include the whole atlas in your build.
Re: Imposter System. I am not familiar with this asset but it should work. The atlases (.png textures) and meshes created by Mesh Baker are no different from the atlases and meshes created any other way. If the LOD system uses regular Unity atlases and meshes then this should work.
Likely you are running into a case where the atlas is too full and textures have to be reduced in size to fit them in. Take a look at your atlas, how does it look? Some things to check:
Try checking "Resize Power Of Two". This helps power of two textures pack very efficiently when padding is added.
Check the size of the result texture on the TextureImporter after it is generated. The TextureImporter might be forcing it to a small size.
Do any of your source models have a lot of tiling? This can use up a lot of space quickly. If necessary you can use the multiple materials feature so that those textures can live on their own submesh (material). Then they are free to tile without making many copies of the texture.
Check the console after baking the textures. There should be a report that explains what was done. It might explain what is using up so much space.
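As a rough illustration of the "Resize Power Of Two" point (made-up helper names, not Mesh Baker's actual packing code): padding pushes a power-of-two texture just past a power-of-two boundary unless the texture is shrunk slightly first:

```python
def next_power_of_two(n):
    """Smallest power of two >= n."""
    p = 1
    while p < n:
        p *= 2
    return p

def atlas_slot(width, padding):
    """Width of the power-of-two slot a padded tile occupies in the atlas."""
    return next_power_of_two(width + 2 * padding)

# A 512-wide texture with 1px padding on each side spills into a 1024
# slot, wasting almost half the row; shrunk to 510 first, it packs snugly.
print(atlas_slot(512, 1))  # 1024
print(atlas_slot(510, 1))  # 512
```

That wasted space is one way an atlas fills up faster than the raw texture sizes would suggest.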
Bought the asset and it works as intended. Question: which bake mode preserves the LOD system again?
Thanks for reply.
I tried to increase 'Max Atlas Size' to 8192 and 'Max Tiling Bake Size' to 4096 (which seems maximum reasonable values) to give Mesh Baker more space but generated normal map is 4096 X 2048 as it was previously.
The screenshot in my first message was taken after combining textures with 'Resize Power-Of-Two' checked and 'Atlas Padding' set to 1. I tried playing with these settings and with the 'Consider Mesh UVs' checkbox as well. The generated normal map is always noticeably more blurry than the original normal maps.
It is 4096 X 2048 (checked with image viewer application) and the same size in Unity Editor (displayed with default import setting as well as with 'Override for PC, Mac & Linux Standalone' 'Max Size' set to 4096).
As I can see no tiling appears in generated texture and normal map.
There are 2 warning types:
1) Texture has out of bounds UVs that effectively tile by (1.0, 1.0) tiling will be baked into a texture with maxSize:4096
2) Some of the textures have widths close to the size of the padding. It is not recommended to use _resizePowerOfTwoTextures with widths this small. But unchecking 'Resize Power Of Two' does not seem to make the generated normal map less blurry.
The only method that helped reduce the blurring is overriding the size of the original textures in the Unity Editor import settings. The generated normal map is still more blurry than the original normal map, but applying a sharpening filter to it in an application like Photoshop or Gimp should make it acceptable.
Just in case I attached original normal map and generated normal map (area corresponding to original normal map is in upper left corner).
At this time you will need to manually create an LODGroup for the combined mesh(es):
Add all your meshes to the TextureBaker
Delete the "MeshBaker" child of the TextureBaker
Use the MeshBakerGrouper to re-create children of the TextureBaker, split by LOD level
Bake your meshes. You should have one per LOD level.
Create an LODGroup, add the new combined meshes as children, and set them as the LOD levels.
Another way to do it is to use the Batch Prefab Baker. You can bake your prefabs so that you get new prefabs (with original LODs). The meshes will have been modified to use a combined material. You can use these in your scene instead of your original prefabs. If you use Static Batching, then you get the benefit of combined meshes (because the props share materials) and the LOD system.
This sounds odd. There may be a hard-coded atlas size limit of 4096 in the free version; it is overdue for an update. It sounds like there are too many images being packed and the atlas needs to be downsized, although in this case I would expect the atlas size to be 4096x4096, not 4096x2048.
Another strange thing is the warning "Some of the textures have widths close to the size of the padding". Are some of your source textures very small? (Usually padding is only a few pixels, 1-8.) Are these the images that are becoming blurry? With very small images you can run into issues with the "Filter Mode" setting in the Texture inspector. What is the "Filter Mode" of your source textures and of the atlas?
The minimum texture size is 64x64. The submesh/material that becomes blurry has a 512x512 normal map and a 256x256 texture. But one of the materials has no textures at all, only a color selected for the Main Color property of the Bumped Diffuse shader. A few of the materials have no normal map assigned.
All of them have the default Bilinear filter mode.
Batch Prefab Baker is the one that will create a new prefab with its original LODs attached, is it? Thanks a million, will experiment after work.
Texture atlasing at runtime: what do I need to do to run this process frame by frame, in steps? I don't care about the time it takes.
There is an example scene that bakes textures at runtime, "SceneBakeTexturesAtRuntime". The script should show you how to run it as a coroutine, which I think is what you want, as well as how to bake all the textures at once.
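The idea of spreading the work across frames can be sketched in plain Python with a generator (this is an analogy for a Unity coroutine, not the actual Mesh Baker API; `bake_in_steps` is a made-up name):

```python
def bake_in_steps(textures, per_frame=2):
    """Process a few textures per 'frame', yielding control between
    batches -- the same shape as a coroutine that yields while building
    an atlas so the game keeps rendering in between."""
    done = []
    for i, tex in enumerate(textures):
        done.append("packed " + tex)  # stand-in for one chunk of atlas work
        if (i + 1) % per_frame == 0:
            yield list(done)          # hand control back to the 'engine'
    yield list(done)                  # final yield with everything packed

steps = list(bake_in_steps(["a", "b", "c", "d", "e"]))
print(len(steps))      # control was handed back 3 times
print(len(steps[-1]))  # all 5 textures packed by the last step
```

In Unity the equivalent is a coroutine that does a slice of work, then `yield`s until the next frame, which is what the sample scene's script demonstrates.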
Why is the triangle count increasing after using Mesh Baker?