Discussion in 'Assets and Asset Store' started by guycalledfrank, Jun 14, 2018.
You mean a GT 750 or a GTX 750 Ti?
My bad, I meant a GT 705 (though the exact specs don't appear in online benchmarks: they list 4 ROPs, while my card reports 8): http://gamesystemrequirements.com/gpu/geforce-gt-705-oem
The GTX 1080 is 29.5× more powerful according to this.
Seems that your GPU is very weak... it's a PCIe 2 GPU :\
Not planned, but maybe I'll give it a try some day.
What bothers me is the amount of data required for "surfels visible from this texel" storage.
Bakery is really cool: set up like V-Ray, fast, and without tons of re-renders.
Hello. I tried Bakery in a scene and it seems really nice; I'm really willing to use it, but I often have a lot of issues in Unity 2018.2.5. It breaks my project and I get 8 compile errors.
I hope it can be fixed as soon as possible, or that I can at least understand what is broken. Also, I tried Bakery in an empty scene and it worked, but trying to bake the example scenes again did not work: example_sponza_day, for example, ended up with a grayed-out texture, as if no lightmaps were assigned and rendered. I await your feedback. All the best, Xavier
I'm not even sure where I can download it. Is it a beta? According to latest docs, SceneManager does contain all these functions: https://docs.unity3d.com/ScriptReference/SceneManagement.SceneManager.GetActiveScene.html
I am not aware of anything radical happening to the SceneManager class. Perhaps it is safer to use a more stable Unity version (e.g. 2018.2.2).
EDIT: based on this reddit thread, it seems such a problem can happen if you have your own class named "SceneManager", as it interferes with Unity's class.
Unity 2018.2.11 is the latest non-beta.
As far as I've read, the asset is great and I'm going to buy it anyway, but I was curious whether you are planning to support Radeon at some point?
Hi! Unfortunately Radeon support is tricky, as Bakery depends on the OptiX and OptiX denoiser libraries. Porting the ray-tracing part to something like Radeon Rays (that's what Unity's GPU lightmapper uses) might be possible, and I may try it some day, but there is no analogous denoising library. I could write an independent denoising library, but my knowledge here is limited to cheap algorithms like bilateral blur (often used to blur SSAO and such), and the quality wouldn't be as good.
And because upcoming versions of OptiX are likely going to take full advantage of RTX, I'm not going to abandon the NV version. That means I would have to support both NV and AMD versions and make sure every feature works, and it's a hell of a burden.
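For anyone curious what a bilateral blur actually does: it averages each texel with its neighbours, but weights every neighbour by both spatial distance and difference in value, so noise is smoothed while hard lighting edges survive. A minimal, unoptimized single-channel sketch in Python/NumPy (purely illustrative, not Bakery's code):

```python
import numpy as np

def bilateral_blur(img, radius=3, sigma_space=2.0, sigma_value=0.1):
    """Naive bilateral filter on a 2D float array: each output texel is a
    weighted average of its neighbours, where the weight falls off with
    both spatial distance and difference in value (edge-preserving)."""
    h, w = img.shape
    out = np.zeros_like(img)
    for y in range(h):
        for x in range(w):
            acc, wsum = 0.0, 0.0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        ws = np.exp(-(dx * dx + dy * dy) / (2 * sigma_space ** 2))
                        wv = np.exp(-((img[ny, nx] - img[y, x]) ** 2) / (2 * sigma_value ** 2))
                        acc += img[ny, nx] * ws * wv
                        wsum += ws * wv
            out[y, x] = acc / wsum
    return out
```

With a small `sigma_value`, a sharp step in the image barely bleeds across the edge, which is exactly why this family of filters is popular for SSAO and lightmap denoising, and also why it can't match a learned denoiser like OptiX's on heavy Monte Carlo noise.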
Got it, thanks for the quick reply
A few words on current 1.4 development? Please?
4 directional modes are working. Vertex baking now supports everything. Added Bakery shader supporting new directional (RNM, SH) modes, all vertex baking features as well as baked specular. Fixed many bugs. Also added a HDRP-compatible lit shader, but the only feature it currently supports is SH. Working on LWRP-compatible shader. Definitely want people to try SH lightmaps on all render pipelines
But yeah, this version is taking longer than usual, so I'm going to stop myself from adding stuff after the last shader is finished, test it thoroughly, and release.
Thanks for the update! I can't wait for the 1.4 drop.
I haven't had a chance to test this yet, or really know the best way to compare for certain. But is it actually limited to only cards in SLI, or do you use that to mean only those connected to a single board? From what I understand CUDA can be made to work across multiple CUDA devices regardless of SLI. If it isn't already so, is there a chance to make it work?
I'm looking for a replacement for Unity's default baker and this looks really good!
I was just wondering if Bakery supports the new HDRP with custom shaders yet? I saw an archvis project earlier in this thread that was seemingly using this with HDRP, but I don't know if it used any custom shaders.
I'm asking because I'm extensively using a custom lit shader graph that samples everything from texture arrays to reduce draw calls.
If HDRP is supported, are the new light shapes available yet (rect and line)?
If it can, then it probably does. I didn't implement anything extra regarding work sharing across devices, but given my understanding of CUDA driver and the feedback from multi-GPU users, it seems to work.
Yes, if you have a HDRP shader supporting regular lightmaps, it should work with Bakery as well. I'm only adding shaders for new fancy directional modes and vertex baking. Typical lightmaps, including shadowmask and (soon to be added) directional data work with all shaders.
Bakery uses its own light source components. There is a Light Mesh component for example, allowing you to make a light of any shape. The only limitation is that you can't currently calculate a shadowmask for rects/lines, but I'll look into it (Light Meshes already have mask generating code, just needs some scripting).
Thanks for the reply!
I was thinking in terms of a custom shader's albedo being read properly when indirect lighting is calculated, like a green wall casting green bounced indirect light onto nearby surfaces. Is that supported, or do you assume a grey/neutral albedo when indirect light is calculated? Or perhaps read the material's "_MainTex" and assume that's the albedo?
If it currently works with built-in lightmappers, then it should work with Bakery. I'm using the same meta pass.
Only when it can't locate the meta pass - e.g. in case of completely custom shaders not based on surface/HDRP includes and without meta pass definition added by author.
Awesome, thank you! All my custom stuff should work fine then. Wasn't aware of the meta pass thing for shader -> albedo (I wrote a simple cpu lightmapper some years ago).
Will purchase tonight and try it out.
Accidentally came across this thread on the forum. With every page I read, my eyes got bigger and bigger.
But when I saw the SH and vertex light baking screenshots, I decided to purchase immediately!
I had gotten used to the fact that light-baking tech is in a deep pit of degradation nowadays. Unity switched from Beast to the cursed Enlighten instead of fixing the former; UE4 got rid of vertex light baking and turned Lightmass into a light mess.
Baking lightmaps became a torture... Not to mention RNM or proper packing like in Quake1 at least...
This Bakery lightmapper bakes lights of hope in me!
To be honest, I haven't felt such excitement in a long time.
Biggest respect to the author for this amazing project and careful attention to small important things!
Can't wait to try v1.4!
To make the story even shadier, they already had a running 64-bit Unity 4.7 that would have allowed much larger Beast scene bakes, as Beast itself was 64-bit, but the need to sell you the Unity 5 stuff made them ditch that version...
Damn, I totally forgot about that, what a painful time...
Looks like this asset does everything for lighting out of the box!
I just wonder whether lightmaps are memory-consuming. What's the average size per resolution, from 1K to 4K? I plan to create a massive number of assets (procedural level parts, thousands), so memory could be a concern with lightmaps.
I'd ideally need lightmaps under 100 KB, for meshes around 50² in Unity units.
They are regular texture assets, the size is no different to what built-in lightmappers would produce.
It can be indeed - maybe try it with Progressive on low settings just to get the idea.
Also, you can bake per-vertex with Bakery. In case you have relatively uniform tessellation, it might be a better choice than textures. Note that vertexLM requires modifying the shader though (or using included shaders).
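Since lightmaps are regular textures, their footprint is just standard texture math (nothing Bakery-specific). A rough estimate, assuming common compressed/uncompressed formats:

```python
def lightmap_bytes(resolution, bits_per_texel, mips=True):
    """Rough memory footprint of a square lightmap texture.
    bits_per_texel examples: 4 for DXT1/BC1, 8 for DXT5/BC6H/BC7,
    32 for uncompressed RGBA32. A full mip chain adds about 1/3
    on top of the base level."""
    base = resolution * resolution * bits_per_texel // 8
    return base * 4 // 3 if mips else base

# e.g. a 1K HDR lightmap at 8 bits/texel with mips:
# lightmap_bytes(1024, 8)  -> about 1.33 MB
```

So staying under the ~100 KB budget mentioned above would mean something like a 256² map at 4 bits/texel (32 KB without mips) per mesh, i.e. it's the chosen resolution and compression format that decide memory, not the lightmapper.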
Hi, at some point some objects in my scene stopped getting lightmaps. It happens absolutely randomly: in one bake attempt some objects are missing from the lightmaps, and in the next attempt it's different ones. I don't see any pattern. Any idea about this?
No idea, unfortunately. I can send a beta of v1.4 though, as it might have something fixed.
Wow, that would be great!
Done! Make sure you restart Unity before package import, since it can't overwrite previously used DLLs otherwise.
Did you have to restart Unity a lot to replace the library while developing the tool?! I can imagine how painful that must have been.
If a DLL file was used by Unity at least once, it becomes forever locked and can't be replaced. Quite disappointing, yes.
Bakery itself is a pile of exes/dlls/scripts/shaders, so I didn't have to restart it for everything. The core part is ftrace.exe, and it has command-line interface.
DLL interface is used for a bunch of libraries where it's useful to pass a D3D11 texture pointer right to my code (e.g. texture export).
Maybe this has been asked before, but is it possible to implement visual feedback during the baking process, like the native Unity bakers do?
A black screen is kind of boring, and inspecting the baking progress would be a big benefit.
Of course, render speed shouldn't suffer because of this.
An obvious drawback of progressive-like feedback is that it would require you to store the whole scene in video memory (and regular RAM as well). That would mean reducing available memory for the lightmapper. Currently Bakery comes with an "Unload scenes" checkbox enabled by default for this very reason. For example, Adam demo environment from the asset store takes 2 GB of VRAM just by itself.
To update the lightmaps during rendering, I would also have to store these lightmaps in uncompressed form (because texture compression is pretty slow) further cluttering VRAM.
Therefore any fancy preview would be only useful for simple scenes which are fast to bake anyway, so I'm not sure about it.
However, what I could do is something like "render a frame from editor camera" button. I can write a fast simple (well not so simple) camera path tracer using Bakery's lighting algorithms, so you could get the idea before baking.
We're using GitLab, and we're noticing that every time we push/pull to another computer, the lighting is messed up everywhere, like there's a major lightmap offset or something. Is there something happening with a local cache that isn't being transferred over, or that could be contributing to the issue? Thank you!
I need to bake only a small part of a complex scene. In that scene I have a lot of objects incompatible with bakery, but those don't need GI at all.
The object I need to bake works perfectly when it's isolated on its own scene but, when it's merged in the global scene, everything goes wrong.
I've tried different approaches:
- The geometry to bake is the only Static in the scene. Any other is not static.
- Hide everything in the scene but the baking geometry, switching off the activate checkbox, so the only objects active and visible are those I need to bake. Then, bake.
In both cases Bakery is trying to bake ALL the scene geometry, including the objects that are not static, even the hidden, deactivated, ones. Or, at least, they appear in the progress bar as being included in the calculations. There's a lot of invalid objects there, so Bakery gives an error or crashes.
I also tried baking in a separate scene; that works OK. Once baked, I saved it as a prefab and then included the prefab in the main scene. No luck: I lose the baked lightmaps in the process.
Tried the experimental render feature. AFAIK this is an option to bake in albedo textures instead of generating a lightmap. I just get an error.
In a line: I need a way to make a partial bake of a scene or, even better, a method to bake some objects in a controlled scene and then be able to import the baked version into different scenes.
Anyone a clue on this?
Thanks in advance.
In fact, with the Progressive GPU Lightmapper from the Unity beta, I can't bake scenes that Bakery bakes without problems.
That's because all the RAM gets taken by Unity itself. With Bakery I have a lot of free RAM to use.
Yeah, I was just explaining it today to another user. Version control is a problem at the moment.
Apart from your scene data, there are 2 interesting files in the project called ftGlobalStorage.asset and ftLocalStorage.asset.
Local Storage asset should not be pushed to repo. It should belong to an individual machine. Global storage should be pushed. The idea is that you get the update (data + global storage), it detects the difference between global/local storages, syncs UV padding on your models and updates local storage. I'm not sure about the whole system, hence it is not even mentioned in the docs, but eventually it should evolve into something usable.
I'm noticing Unity is unable to properly merge scripted asset files. I wonder if it would help if I just convert global storage to e.g. JSON.
Full explanation of existing mechanism:
- You import a model with auto-generated UVs.
- Unity unwraps every mesh in the model individually for lightmapping. The padding value (spacing between UV charts aka "pack margin") is constant for the whole model and identical between meshes.
- Unity's own atlas packer as well as Bakery packer scale each mesh differently in the lightmap atlas, so the amount of texels used per mesh is based on its world-space area.
- Surprise, now the pack margin value doesn't make any sense. It was supposed to mean "empty texels between charts", but now it's scaled. Tiny objects have bleeding or large objects have huge empty holes between charts. This is how it works in Unity out of the box. Bleeding on small objects is especially annoying.
- Now you install Bakery and it has this "Adjust UV padding" checkbox enabled by default. It's not even a part of the lightmapper, but without it it's hard to get decent results.
- Bakery calculates texel size of every mesh in the atlas and THEN, if the checkbox is enabled, it calculates the optimal pack margin value per-mesh (not per model) and, if it's different from currently used, forces model reimport. Then ftModelPostProcessor.cs comes in and calls the unwrapping function individually for each mesh with different values. Model post processor is the only way to alter the values of a model asset.
- The optimal padding values are stored inside the global storage asset. This is where model post processor gets the data from, and it's also used to check if model reimport is necessary.
- When you import the project to another machine, it looks into global storage and sees "model A should have modified padding". It then looks into local storage and sees "model A is not even mentioned here", and so it calls reimport and marks it locally.
- (not so) fun fact: if the padding value changed, but the model is already marked inside local storage, it won't force reimport. I fixed it for v1.4, it now also compares the values between global/local.
However, according to what users say, it's still not sufficient - for some reason many version control systems refuse to replace/merge the global storage asset, so I need to fix that too. As I mentioned, porting the asset to something simpler like JSON may help it. I'll try it soon.
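The core of the mechanism above can be sketched in a few lines (a toy illustration; the function and storage names are hypothetical stand-ins, not Bakery's actual data layout):

```python
def optimal_padding(base_margin, atlas_scale):
    """If the atlas packer scales a mesh's UV charts by atlas_scale,
    the unwrap pack margin must be scaled inversely so the gap between
    charts stays a constant number of lightmap texels."""
    return base_margin / atlas_scale

def meshes_to_reimport(global_storage, local_storage):
    """A machine pulling the project reimports a model whenever the
    shared (global) padding record differs from, or is missing in,
    that machine's local record."""
    return [mesh for mesh, pad in global_storage.items()
            if local_storage.get(mesh) != pad]
```

So a tiny mesh that the packer shrinks to half scale needs double the unwrap margin to keep the same empty-texel gap, and the global/local comparison is what decides which models get force-reimported after a pull.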
What exactly makes them incompatible?
How exactly it goes wrong?
This is 100% not possible. Objects are filtered by static/active flags. How do you know it tried to bake them?
- Create a new scene.
- Create an empty object named "!ftraceLightmaps"
- Put your stuff in the scene and bake.
- !ftraceLightmaps object now should have a script with some data attached.
- Create a prefab out of everything including this object.
- This prefab should always be lightmapped now.
There is no such option. Perhaps you've mistaken something else for it.
Apart from the prefab trick above, you can bake in a separate scene and then load both scenes.
Hey Frank, thanks for the FAST!!! answer.
Just edited my message. I thought the hidden or not-static objects are part of the calculations because I see references to them in the progress bar. I'm leaving now, but tomorrow I'll re-check this.
My "incompatible" objects have normal maps; others are procedural geometry from the Dreamteck spline generator, and/or animated (non-static) geometry.
It's not me but my colleague who told me he was trying that "experimental render". He's already gone home, I'll check this with him tomorrow.
But first of all, we'll try the trick you described.
We're absolute newbies with Unity and, of course, with Bakery. I apologize if I posted something too weird; we're still trying to understand the basics of the platform.
And again, thanks a lot for the quick help!
I seem to be running into the same problem: Rendering occlusion probes gets stuck and never finishes. I'm on Bakery 1.3 and Unity 2018.2.11f1. Also, my build target is iOS. If I turn on real-time reflection probes in Quality Settings, it fixes the problem and the occlusion probes finish rendering.
Leaving real-time reflection probes turned on is problematic though, because they are expensive and I'm targeting mobile. Maybe you could have Bakery check whether real-time reflection probes are turned on before trying to render occlusion probes? If they're turned off, maybe Bakery could turn them on, do the render, and then turn them back off?
Loving Bakery so far. Looking forward to 1.4 and directional lightmaps!!
I might sound ridiculous, but I've never used Unity's lightmapping solutions because I don't know how to bake lightmaps for procedural assets.
The "Render Selected" option in Bakery seems to be the answer to this need. Am I right? Like, does it allow baking a given room with emissive materials, and making that room a prefab on the go?
I'm trying to figure out what units to use on my light intensities, I noticed that I get much more correct results when I bump an area light on my ceiling from 10 brightness to 100 brightness and tonemap down, but that feels kind of arbitrary, and I'd love to be able to just look up fixture wattages and convert them to lumens or lux or whatever.
Do you have a standard lighting unit that you use? If not, is there a range of brightness you can recommend staying within to get clean bakes?
Having an option to use real world units on the light scripts would be a really nice feature!
Hi @guycalledfrank - I had a suggestion,
Since Bakery clearly bakes out the scene on a light-by-light basis and then composites the results together, would it be possible to add a "minimal re-bake" mode, where Bakery could cache the results of a previous bake, and if only a single light had changed, Bakery could only re-bake and re-composite the contribution from that light?? This would theoretically allow for much faster iteration, and it seems like Unity already employs similar 'GI Caching'?
Ah yeah, sorry about that - I think I already fixed it for the upcoming v1.4
No, Render Selected is similar to normal Render, but it skips rendering some stuff you don't need. Currently it also has some bugs, like breaking the shadowmask (also fixed in 1.4). To get baked prefabs, read my trick from post #1088.
I'm definitely planning to add lumen/lux/candela options to all lights
As for area lights, if I remember correctly from my Mitsuba comparisons, their brightness is radiance, or as Mitsuba docs put it, "emitted radiance in units of power per unit area per unit steradian".
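For reference while those unit options don't exist yet: the standard relation for converting a fixture's lumen rating to the radiance-like quantity Mitsuba describes (for an ideal diffuse, i.e. Lambertian, area emitter) is L = Φ / (π · A). A hedged sketch, not Bakery's actual conversion:

```python
import math

def lumens_to_luminance(luminous_flux_lm, area_m2):
    """Photometric luminance of an ideal Lambertian area emitter.
    A diffuse surface emitting flux Phi over area A into a hemisphere
    has luminance L = Phi / (pi * A), in cd/m^2 (lm / (sr * m^2)) --
    the photometric analogue of 'power per unit area per steradian'."""
    return luminous_flux_lm / (math.pi * area_m2)

# e.g. a ~1700 lm tube modelled as a 0.1 m^2 area light:
# lumens_to_luminance(1700, 0.1)  -> about 5411 cd/m^2
```

This also explains why doubling an area light's size at a fixed brightness value makes the scene brighter: the brightness parameter acts like radiance/luminance, so total emitted flux grows with surface area.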
Yes, it's already there. It's not well documented though:
- Bake the scene normally.
- Go to experimental settings mode and disable "Export scene" and "Update unmodified lights".
- Bake again. It should now skip scene export and cached lights that didn't change. GI, denoising and seam fixing will run as usual.
There is actually a cache of previously baked values for every light, and it compares them. It might be a bit out of date, and perhaps I didn't add some newer options.
Note that if you currently bake another scene, it will overwrite geometry data in the cache and break the feature. Perhaps I should make it more usable.
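The idea behind this caching can be sketched as follows (a toy illustration; the cache layout and function names are hypothetical, not Bakery's actual format): since the final lightmap is the sum of per-light contributions, only lights whose settings changed need re-rendering before re-compositing.

```python
def composite(cache, lights, render_light):
    """cache: dict mapping light name -> (settings, contribution).
    render_light(light) returns that light's lightmap contribution
    (here just a flat list of texel values). Re-renders a light only
    if its settings changed or it isn't cached, then sums everything."""
    for light in lights:
        name, settings = light["name"], light["settings"]
        if name not in cache or cache[name][0] != settings:
            # only changed/new lights pay the render cost
            cache[name] = (settings, render_light(light))
        # unchanged lights reuse their cached contribution
    total = [sum(vals) for vals in zip(*(c[1] for c in cache.values()))]
    return total, cache
```

The caveat from the post above maps directly onto this sketch: the cached contributions are only valid for one scene's geometry, so baking a different scene in between invalidates (overwrites) the cache.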
Hi @guycalledfrank! The "render a frame from editor camera" idea seems really awesome! If you have a huge scene, it could help you quickly get a general idea of whether your lights and mood are right, or over- or under-exposed, without waiting to bake the whole scene. It could help iterate even faster!
When I say a room, I mean a single mesh, so I wouldn't need anything other than this mesh/room (to select).
In short, should I simply create a new scene, put my individual rooms in it, bake, and turn them into prefabs one by one?
You will need some scripting to do that. The script would have to iterate over rooms and (roughly):
- Mark the room static and everything else dynamic.
- Call the baking coroutine from the ftRenderLightmap class.
- Find the !ftraceLightmaps object, put it inside the room hierarchy, create the prefab.
- Delete the !ftraceLightmaps object (or the whole already baked room).
Long story short, there is no "make these gameobjects baked prefabs" button at the moment. You can script it, but there are many very different use cases preventing me from writing a one-fits-everyone built-in function.
I don't think I'll manage step 2 because, sadly, I don't have much programming skill, even more so when it comes to complex things like lightmapping...
I guess I'll just have to give your asset a try.
Thanks for making such a great tool. I only bought it yesterday and it's already been miles beyond Unity's solution.
I have however been having issues with failed bakes and occasional editor crashes when baking. I'm getting the following error in the console:
ArgumentOutOfRangeException: Argument is out of range.
Parameter name: index
System.Collections.Generic.List`1[System.Single].get_Item (Int32 index) (at /Users/builduser/buildslave/mono/build/mcs/class/corlib/System.Collections.Generic/List.cs:633)
ftBuildGraphics+<ExportScene>c__Iterator0.MoveNext () (at Assets/Editor/x64/Bakery/scripts/ftBuildGraphics.cs:3079)
I'm using Unity 2017.3.1f1 with the following settings in Bakery and it fails every time:
Any help would be much appreciated, thank you.
This error seems to be linked to GI VRAM optimization. I heard of similar reports, but so far was unable to reproduce it. Have you tried baking without it?