Ha, indeed, it happens in URP. However, there is no reason to disable "Baked Global Illumination". Progressive won't do anything unless you ask it to, or have "Auto Generate" enabled. Thanks

Why not use LODGroups? Bakery can generate unique lightmaps per LOD. You can later disable the LODGroup component and use your own switching logic.

Can't see the red errors on your screenshot. Any details about them?

Note that apart from the lightmap files, you'll also need to sync:
- the scene
- model *.meta files, in case "Adjust UV padding" was enabled
See: https://geom.io/bakery/wiki/index.p...2Fother_version_control_system_with_Bakery.3F

Interesting. Bakery relies on the OptiX library for parallelization. Also, in ftRenderLightmap.cs you can find this line:

const uint deviceMask = 0xFFFFFFFF;

(not sure if it's in v1.71 or was added later in the GitHub patches). Anyway, it is supposed to tell it which GPUs to use:
0xFFFFFFFF = all
0 = first
1 = second
I wonder if setting it to e.g. "1" actually makes your second card do anything? Curious to know if it just can't run them in parallel, or can't use the second one at all. Also, do you have RTX mode enabled?

Bakery doesn't perform anything on the GPU unless you bake. Sounds like it's simply Unity destroying it, so I suspect your GPU was likely defective from the start. Did you try any GPU-intensive games? Do they crash?

The only background functionality it has is checking if you changed any checkboxes/sliders in the window, and saving them if you did. This is done in a plain C# script. Lightmap injection happens only once, when you load a new scene. It also just uses a simple C# script to assign textures to renderers.

You can do Bakery->Utilities->Clear baked data->Clear all to completely get rid of the hidden object (also close the Bakery window first, so it doesn't reinitialize it).

You can use a separate lightmap group for this object: https://geom.io/bakery/wiki/index.php?title=Manual#Bakery_Lightmap_Group_Selector
and set its render mode to e.g. Full Lighting.

He's good!
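For anyone wanting to try the deviceMask experiment mentioned above, the edit is a one-line change in ftRenderLightmap.cs. The value meanings below are taken straight from this thread, not from official documentation, so treat this as a sketch:

```csharp
// In ftRenderLightmap.cs (presence depends on Bakery version / GitHub patches).
// Per this thread: 0xFFFFFFFF = all GPUs, 0 = first GPU, 1 = second GPU.
const uint deviceMask = 1; // try forcing the second card, to see if it does anything at all
```

If the bake still only loads one card with this set, that would suggest Bakery can't use the second GPU rather than merely failing to run them in parallel.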
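The "simple C# script to assign textures to renderers" mentioned above boils down to Unity's standard lightmap API. A minimal sketch of that idea (this is NOT Bakery's actual code; the field names, single-lightmap array, and scale/offset value are assumptions for illustration):

```csharp
using UnityEngine;

// Sketch: inject a baked lightmap into a renderer at scene load time.
// Not Bakery's real implementation; it only shows the standard Unity API involved.
public class LightmapInjectSketch : MonoBehaviour
{
    public Texture2D bakedColor;      // baked lightmap texture (hypothetical asset)
    public Texture2D bakedDirection;  // optional directional map, may be null
    public Renderer targetRenderer;
    public Vector4 scaleOffset = new Vector4(1, 1, 0, 0); // UV tiling/offset into the atlas

    void Start()
    {
        // Register the texture in the global lightmap array...
        var data = new LightmapData
        {
            lightmapColor = bakedColor,
            lightmapDir = bakedDirection
        };
        LightmapSettings.lightmaps = new[] { data };

        // ...then point the renderer at it.
        targetRenderer.lightmapIndex = 0;
        targetRenderer.lightmapScaleOffset = scaleOffset;
    }
}
```

Since this runs once per scene load and is plain C#, it has no ongoing GPU or per-frame cost, which matches the point above about Bakery doing nothing in the background.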