Official Progressive GPU Lightmapper preview

Discussion in 'Global Illumination' started by KEngelstoft, Sep 26, 2018.

  1. Cascho01

    Cascho01

    Joined:
    Mar 19, 2010
    Posts:
    1,347
    As written above, I don't have the time to prepare and document an example right now. That's on me.
    Have you ever tried Bakery on HDRP?
    Without going into detail, there are a few more issues between Bakery and HDRP; Frank indirectly confirmed this.

    Please don't get me wrong, I'm one of Bakery's biggest fans.
    Hopefully I will be able to prepare an example of my problem soon.
     
  2. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Yeah, I've used Bakery for a long time. I have a pretty good idea why you're getting that, but I don't want to guess. It seems to me you could just send the Bakery author a scene with a few boxes like the ones in your screenshot.

    You may be having problems with PLM as well - PLM is far from finished in my view.
     
  3. Cascho01

    Cascho01

    Joined:
    Mar 19, 2010
    Posts:
    1,347
    No problem, you're welcome to guess here ;).
    (The geometry is a spline with an extrusion modifier in 3ds Max, nothing more.)
    This particular problem does not appear in PLM; the same geometry bakes fine.
     
  4. UnityLighting

    UnityLighting

    Joined:
    Mar 31, 2015
    Posts:
    3,874
    PLM is really fast on my RTX 3060 Ti 8GB... but you can't bake a large lightmap size like 4096x4096 on the 3060 Ti 8GB in most cases, so use a 2048x2048 lightmap size with a very high lightmap resolution (100-200) instead.
    It's about 15x faster than my Ryzen 5 3600XT (1.5 min vs 27 min bake time).
     
  5. Cascho01

    Cascho01

    Joined:
    Mar 19, 2010
    Posts:
    1,347
    Sorry, what are you comparing, GPU vs CPU or 4096 vs 2048?
     
  6. UnityLighting

    UnityLighting

    Joined:
    Mar 31, 2015
    Posts:
    3,874
    Both.
    2048 is the limit for an 8GB GPU.
    The GPU is faster than the CPU for baking.
     
  7. Cascho01

    Cascho01

    Joined:
    Mar 19, 2010
    Posts:
    1,347
    Wrong, I bake 4096x4096 lightmaps on my RTX 2080 Super (8GB).
    Just make sure the lightmap resolution in your Scene's Lighting settings is not too high; it's 75 in my case, doing archviz interiors.
    Open Task Manager and keep an eye on the GPU during baking.
    When I go above 7.5GB, the Progressive Lightmapper falls back to the CPU.

    Right!
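    For a sense of scale, here is a rough back-of-envelope for lightmap memory (a sketch only; RGBAHalf is a common lightmap format, but the channel count and the working-buffer multiplier below are assumptions, not Unity's actual internals):

```python
# Rough, hypothetical estimate of lightmap memory while baking.
# Format (RGBAHalf) and the 8x working-buffer factor are assumptions.
def lightmap_bytes(size, channels=4, bytes_per_channel=2):
    """Memory for one size x size lightmap texture."""
    return size * size * channels * bytes_per_channel

for size in (1024, 2048, 4096):
    one = lightmap_bytes(size) / 2**20   # MiB for the final texture
    working = 8 * one                    # assumed intermediate buffers during the bake
    print(f"{size}x{size}: ~{one:.0f} MiB final, ~{working:.0f} MiB while baking")
```

    The final texture itself is small; it is the intermediate baking buffers (plus scene geometry and textures) that push an 8GB card toward the fallback threshold.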
     
    fuzzy3d likes this.
  8. UnityLighting

    UnityLighting

    Joined:
    Mar 31, 2015
    Posts:
    3,874
    4096 is not useful for small objects... It's useful for baking large objects like terrains... You can use a very high lightmap resolution for small objects instead.
     
  9. soleron

    soleron

    Joined:
    Apr 21, 2013
    Posts:
    580
    I typically do not bake higher than 1024, and it used to work fine. But recently, from 2020.3 onward, I started having GPU memory issues again, like when the GPU lightmapper had first been introduced and I had only 3GB of VRAM. It could be that there are some GPU memory management issues, or whatever changes they made after 2020.3 are hogging a lot of GPU memory. The editor has become slower too, and shaders take longer to compile.

    My project used to bake fine and super fast, even at 2K, until Unity 2020.2.
    I have tried 2020.3, 2021.1, and 2021.2, because I need the cloud feature, but no luck.

    My very complex scene used to bake in 20-25 minutes, even at 60+ resolution.
    I am running a 2070 8GB card.
     
    Last edited: Feb 15, 2022
    forestrf likes this.
  10. ebaender

    ebaender

    Joined:
    Oct 29, 2020
    Posts:
    97
    Is the lightmapper totally broken on AMD GPUs? I can't bake 4096x4096 with my 6900 XT, and Unity never even tries to use more than 5 of the 16 GB during baking.
     
  11. KEngelstoft

    KEngelstoft

    Unity Technologies

    Joined:
    Aug 13, 2013
    Posts:
    1,366
    I have been using it with an AMD RX 6800 XT and it works fine. You have to provide some more context about what "broken" means, ideally by filing a bug report so we can investigate with exact version info, logs, etc.: Unity QA - Bug Reporting - Unity (unity3d.com)
     
  12. ebaender

    ebaender

    Joined:
    Oct 29, 2020
    Posts:
    97
    By broken I mean I cannot use it at all; it falls back to the CPU lightmapper every time, as if I didn't have enough VRAM. Will submit a bug report later today.
     
    KEngelstoft likes this.
  13. UnityLighting

    UnityLighting

    Joined:
    Mar 31, 2015
    Posts:
    3,874
    I have a simple question:
    Do the Lightmap Parameters affect only the Enlighten lightmapper, or do they also affect GPU PLM?
     
  14. AcidArrow

    AcidArrow

    Joined:
    May 20, 2010
    Posts:
    11,735
    Most of the settings in the Baked GI section do affect PLM (Anti-aliasing, Pushoff, Baked Tag, Limit Lightmap Count and Backface Tolerance).

    The rest don't do anything for PLM.
     
    andreiagmu likes this.
  15. UnityLighting

    UnityLighting

    Joined:
    Mar 31, 2015
    Posts:
    3,874
    So the rest of the Lightmap Parameters don't do anything?
     
  16. AcidArrow

    AcidArrow

    Joined:
    May 20, 2010
    Posts:
    11,735
    For PLM here are the ones that do something:
    upload_2022-3-6_19-15-6.png
     
    andreiagmu and UnityLighting like this.
  17. andreiagmu

    andreiagmu

    Joined:
    Feb 20, 2014
    Posts:
    175
    Is there any way to force the GPU Lightmapper to work even if I have less than 4GB VRAM?
    I'm aware that in that case, I won't be able to ask for support if I have issues, but I'm still willing to try it.
    I have a 2GB GPU at the moment. I just wanted to make sure before spending on a new GPU. :p
     
  18. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Not very well, no. Expect it to crash constantly. Don't bother: 2GB barely leaves anything left over after the editor, the OS and so on. Just wait a bit and invest in a cheap GPU with bags of RAM.

    Even 8GB will be unstable in some larger scenes. More is better!
     
    andreiagmu and newguy123 like this.
  19. andreiagmu

    andreiagmu

    Joined:
    Feb 20, 2014
    Posts:
    175
    Thanks for the quick answer!
    I asked because, when I used the GPU lightmapper in Unity 2019 (before the minimum requirement was raised to 4GB), it worked flawlessly in my scenes. I currently have an AMD Radeon R7 350 2GB, and I didn't have any issues with the GPU lightmapper at the time.
    But I understand the need for the extra VRAM to avoid issues.

    And thanks for the advice about 8GB and larger scenes! :)
    Guess I'll have to go for a 12GB GPU next. That won't be cheap in my country (Brazil)... :confused:
     
  20. Reanimate_L

    Reanimate_L

    Joined:
    Oct 10, 2009
    Posts:
    2,788
    Shouldn't the GPU lightmapper's tiling help with the VRAM usage? I mean, that system was created to reduce memory usage and requirements by splitting the lightmap (when baking) into small chunks of texture based on the currently available VRAM.
     
  21. AcidArrow

    AcidArrow

    Joined:
    May 20, 2010
    Posts:
    11,735
    Yes, but AFAIK it currently still has too many points of failure, and too much it doesn't quite support, for it to be really useful.
     
  22. ebaender

    ebaender

    Joined:
    Oct 29, 2020
    Posts:
    97
    Is it just not possible to bake with HDR skyboxes? I've been trying forever to get rid of the artifacts produced by the sun and backlit clouds in the HDRI:

    hdr-skybox-artifacts.png

    I read the manual page about the lightmapper, which says that you need to increase environment samples beyond 500 for HDR skyboxes, but that seems to do nothing at all. The exposure of the skybox is also as low as it can go without looking too dark. The improvement from increased environment samples is barely noticeable, even at 30,000 samples, which is the maximum that is realistically possible within working hours. 40,000 would already take a whole day on my 6900 XT, and even if that yielded a 30% improvement over 30,000 samples, it would still be nowhere near acceptable.

    Maybe I could try increasing samples further if the lightmapper weren't so broken that it only uses a third of the available VRAM on the 6900 XT, but that problem has been ignored for so long that I guess I can give up now. Might as well just sell it and go back to my GTX 970, since that barely bakes any slower.

    baking.png
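    For what it's worth, the diminishing returns are expected from plain Monte Carlo math, independent of Unity: noise falls off only as 1/sqrt(N), so each extra batch of samples buys less. A quick sketch:

```python
import math

# Relative firefly noise for N environment samples, assuming standard
# Monte Carlo convergence (error ~ 1/sqrt(N)); not Unity-specific.
def relative_noise(samples):
    return 1.0 / math.sqrt(samples)

base = relative_noise(500)  # the manual's suggested starting point
for n in (500, 5_000, 30_000, 40_000):
    print(f"{n:>6} samples: ~{relative_noise(n) / base:.1%} of the 500-sample noise")
```

    By this estimate, going from 30,000 to 40,000 samples only reduces noise by about 13%, which matches the barely-noticeable improvement described above; clamping the bright source (or importance sampling it) attacks the variance directly instead.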
     

    Attached Files:

    Last edited: Jun 9, 2022
  23. Bordeaux_Fox

    Bordeaux_Fox

    Joined:
    Nov 14, 2018
    Posts:
    589
    Sadly, coming back after years and trying out the GPU lightmapper, it is still in bad shape.
    It often crashes Unity. I'm using the latest LTS.

    1. Bug: Viewing the Baked Lightmap Scene View during baking caused a crash.
    2. Bug: Sometimes, after baking and entering Play mode, I get error messages that the main camera was "destroyed", resulting in just a blue Game view. To get rid of this bug, I have to restart Unity. I have a static scene with no scripts destroying the main camera...

    Clearly, I'm wondering when the GPU lightmapper will finally get out of "Preview".
     
    OBiwer likes this.
  24. Haze-Games

    Haze-Games

    Joined:
    Mar 1, 2015
    Posts:
    189
    I'm not sure if this will help you, but that seems like a crazy amount of samples. Try your regular 1024 value, and switch to Non-Directional if you are using Directional lightmaps.

    I've had these white artifacts with GPU PLM absolutely everywhere in my scene with Directional mode, as soon as there is at least one Area Light.

    It could be something else for you, but it's worth a try (it took me many days to figure out).

    PS: Re-created my post as I had originally replied to the wrong one ;)

    Cheers
     
  25. Mauri

    Mauri

    Joined:
    Dec 9, 2010
    Posts:
    2,664
    Those scattered bright spots you see are called "fireflies", and they happen in every lightmapper, and even in offline ray tracers as well. When using an HDRI sky, make sure to remove any super bright areas in it, like the sun. For the sun, simply use a directional light.
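    Conceptually, the fix is just clamping the HDRI's brightest texels before baking; an illustrative sketch on raw linear values (the 10.0 threshold is an arbitrary assumption; in practice you'd do this on the real .hdr/.exr in an image tool):

```python
# Clamp a tiny, super-bright sun disc in linear HDR data so it can't
# dominate environment sampling (illustrative only; tune the threshold per sky).
def clamp_hdr(pixels, max_value=10.0):
    return [min(v, max_value) for v in pixels]

sky = [0.8, 1.2, 5000.0, 0.9]    # one texel thousands of times brighter
print(clamp_hdr(sky))            # -> [0.8, 1.2, 10.0, 0.9]
```

    Then re-add the sun's energy with a directional light, as suggested above.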
     
  26. KEngelstoft

    KEngelstoft

    Unity Technologies

    Joined:
    Aug 13, 2013
    Posts:
    1,366
    How does it look when you check the Multiple Importance Sampling option? That way you will spend your environment samples on the bright areas of the sky instead of doing random sampling.
     
  27. KEngelstoft

    KEngelstoft

    Unity Technologies

    Joined:
    Aug 13, 2013
    Posts:
    1,366
    Did you file a bug report for the crash? We don't have any active cases about the problems you encountered so please do create bug reports so we can investigate and fix your issues.
    https://support.unity.com/hc/en-us/articles/206336985-How-do-I-submit-a-bug-report-
     
  28. novaVision

    novaVision

    Joined:
    Nov 9, 2014
    Posts:
    518
    Any estimate of when Progressive GPU will work on Mac? For now, I get the error when pulling a commit with a lightmap generated on Windows using Progressive GPU. Likewise, I can't generate a lightmap using Progressive GPU on Mac.
     
  29. theNfan

    theNfan

    Joined:
    Dec 30, 2020
    Posts:
    25
    Does anyone have an educated guess as to how a mobile RTX 3050 Ti 4GB (GPU lightmapper) would do against a Ryzen 6800 (CPU lightmapper)?

    Edit: Is a 4GB card even viable at all? I get the impression that this is the bare minimum, but may not work for larger scenes anyway.
     
    Last edited: Jul 14, 2022
  30. theNfan

    theNfan

    Joined:
    Dec 30, 2020
    Posts:
    25
    I tried it with 2021.3.6 and a GTX 680 2GB, and it still bails out immediately because it does not have 4GB. Still looks like a hard-coded limit? I thought I might give it a try with smaller lightmap sizes, but there really seems to be no way.
     
  31. AcidArrow

    AcidArrow

    Joined:
    May 20, 2010
    Posts:
    11,735
    I think 4GB is still the minimum regardless.
     
  32. theNfan

    theNfan

    Joined:
    Dec 30, 2020
    Posts:
    25
    Sorry for the many questions, but here's another one:
    Does anyone have experience with using eGPUs for baking? Is the performance loss acceptable, or do the bandwidth limitations hit too hard?
     
  33. KEngelstoft

    KEngelstoft

    Unity Technologies

    Joined:
    Aug 13, 2013
    Posts:
    1,366
    eGPUs work okay in my experience; PCI Express bandwidth is not the limiting factor. I have used an AMD RX 580, an AMD RX 5700 and an AMD Vega Frontier Edition in a Razer enclosure plugged into a MacBook Pro, and they all worked fine with the GPU lightmapper.
     
    theNfan likes this.
  34. novaVision

    novaVision

    Joined:
    Nov 9, 2014
    Posts:
    518
    @KEngelstoft any comment?
     
  35. KEngelstoft

    KEngelstoft

    Unity Technologies

    Joined:
    Aug 13, 2013
    Posts:
    1,366
    Hi, the GPU lightmapper works on Mac as far as I know. If it doesn't work in your project, please file a bug report so we have all the information we need to investigate. Your post doesn't contain any information about what doesn't work, so I can't provide any solutions here.
     
  36. novaVision

    novaVision

    Joined:
    Nov 9, 2014
    Posts:
    518
    Here is what I see when trying to use Progressive GPU on my MacBook Pro 2015 (Radeon GPU):
    unity_GPU_error.png

    I asked because the Unity manual says:
     
  37. KEngelstoft

    KEngelstoft

    Unity Technologies

    Joined:
    Aug 13, 2013
    Posts:
    1,366
    The error is unrelated to the amount of memory; this looks like no OpenCL devices are found at all. The editor log can shed some more light on this, look for: -- Listing OpenCL platforms(s) --
     
  38. theNfan

    theNfan

    Joined:
    Dec 30, 2020
    Posts:
    25
    So, I got my hands on a Ryzen 6900HS laptop with the Radeon 680M iGPU and an RTX 3050 Ti 4GB dGPU.
    To my surprise, I can indeed use the 680M for baking! That's unexpected, since it only reports 512MB of VRAM (of course, it does not actually have dedicated "VRAM" at all). Performance is about half of the 3050, which is in line with performance in games, but still better than using the CPU.
    I used the Backyard asset and got roughly these times for light baking:
    CPU: 22:40 at 12 MRays/s
    Radeon: 13:28 at 50 MRays/s
    RTX: 7:20 at 100 MRays/s

    I guess CPU and GPU MRays are somehow measured differently?
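    A quick sanity check, multiplying each reported time by its throughput (the times and MRays/s figures are taken from the post above; the arithmetic is the only addition):

```python
# time (seconds) and reported throughput (MRays/s) from the bakes above
bakes = {
    "CPU (Ryzen 6900HS)": (22 * 60 + 40, 12),
    "Radeon 680M":        (13 * 60 + 28, 50),
    "RTX 3050 Ti":        (7 * 60 + 20, 100),
}
for name, (seconds, mrays_per_s) in bakes.items():
    total_grays = seconds * mrays_per_s / 1000  # implied total rays, in GRays
    print(f"{name}: ~{total_grays:.1f} GRays traced in total")
```

    The implied totals (~16 GRays on the CPU vs ~40-44 GRays on the GPUs) differ by 2.5x or more for the same scene, so the two backends most likely count rays differently rather than one doing 2.5x the work.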
     
  39. York_1003415

    York_1003415

    Joined:
    Mar 9, 2022
    Posts:
    1
    Just to report on the "Failed to find a suitable OpenCL device" problem.

    My original project was on Unity 2019.1.10; I then updated Unity to 2021.3.6.
    I still had the same "Failed to find a suitable OpenCL device" problem when selecting Progressive GPU as the lightmapper.
    After that error, I can't select Progressive GPU, and it keeps showing "Falling back to CPU lightmapper."
    CPU: Intel i9-10980XE
    GPU: 2x NVIDIA GeForce RTX 2080 Ti
    OS: Windows

    It was not a VRAM size problem; it turned out that OpenCL really couldn't see my GPU devices.
    But I had already installed NVIDIA driver 516.93 using the quick installation mode.
    Through Device Manager (GPU card properties -> driver details) I found my OpenCL.dll (in my case under C:\Windows\SysWOW64\).
    It showed that the OpenCL.dll was not signed, meaning the NVIDIA installer might skip installing OpenCL.dll if you already have one.
    So: delete that OpenCL.dll, reinstall NVIDIA driver 516.93, select custom installation, and check the complete installation box.
    After this second driver installation, I got a new OpenCL.dll under the same path,
    and I can now generate lighting with Progressive GPU.
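    For anyone hitting the same thing, a quick way to check whether an OpenCL loader can be loaded at all (a hypothetical diagnostic script, not something Unity itself runs; a missing or broken OpenCL.dll fails here before any device enumeration even starts):

```python
import ctypes
import platform

# Try to load the OpenCL ICD loader that the GPU lightmapper depends on.
def has_opencl_loader():
    name = "OpenCL.dll" if platform.system() == "Windows" else "libOpenCL.so.1"
    try:
        ctypes.CDLL(name)
        return True
    except OSError:
        return False

print("OpenCL loader loads fine" if has_opencl_loader()
      else "OpenCL loader missing or broken")
```

    If this fails on Windows even with the driver installed, a stale OpenCL.dll like the one described above is a plausible culprit.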
     
  40. latas

    latas

    Joined:
    Oct 9, 2013
    Posts:
    149
    Hi, I'm currently using Unity 2020.3.37f1, which at the time of this post is the latest LTS for 2020. I read in the documentation that, if there are two GPUs in the computer, Unity will use the one not used by the Editor for the GPU lightmapper. In my case I've got two NVIDIA RTX 3080 cards. The first issue is that the dropdown list in the GPU PLM settings only shows one card.
    In the editor log, I've got two.


    [00:00:14] Enlighten: Precompute started.
    -- Listing OpenCL platforms(s) --
    * OpenCL platform 0
    PROFILE = FULL_PROFILE
    VERSION = OpenCL 3.0 CUDA 11.7.99
    NAME = NVIDIA CUDA
    VENDOR = NVIDIA Corporation
    EXTENSIONS = cl_khr_global_int32_base_atomics cl_khr_global_int32_extended_atomics cl_khr_local_int32_base_atomics cl_khr_local_int32_extended_atomics cl_khr_fp64 cl_khr_3d_image_writes cl_khr_byte_addressable_store cl_khr_icd cl_khr_gl_sharing cl_nv_compiler_options cl_nv_device_attribute_query cl_nv_pragma_unroll cl_nv_d3d10_sharing cl_khr_d3d10_sharing cl_nv_d3d11_sharing cl_nv_copy_opts cl_nv_create_buffer cl_khr_int64_base_atomics cl_khr_int64_extended_atomics cl_khr_device_uuid cl_khr_pci_bus_info cl_khr_external_semaphore cl_khr_external_memory cl_khr_external_semaphore_win32 cl_khr_external_memory_win32
    -- Listing OpenCL device(s) --
    * OpenCL platform 0, device 0
    DEVICE_TYPE = 4
    DEVICE_NAME = NVIDIA GeForce RTX 3080
    DEVICE_VENDOR = NVIDIA Corporation
    DEVICE_VERSION = OpenCL 3.0 CUDA
    DRIVER_VERSION = 516.59
    DEVICE_MAX_COMPUTE_UNITS = 68
    DEVICE_MAX_CLOCK_FREQUENCY = 1710
    CL_DEVICE_MAX_CONSTANT_BUFFER_SIZE = 65536
    CL_DEVICE_HOST_UNIFIED_MEMORY = false
    CL_DEVICE_MAX_MEM_ALLOC_SIZE = 2684223488
    DEVICE_GLOBAL_MEM_SIZE = 10736893952
    DEVICE_EXTENSIONS = cl_khr_global_int32_base_atomics cl_khr_global_int32_extended_atomics cl_khr_local_int32_base_atomics cl_khr_local_int32_extended_atomics cl_khr_fp64 cl_khr_3d_image_writes cl_khr_byte_addressable_store cl_khr_icd cl_khr_gl_sharing cl_nv_compiler_options cl_nv_device_attribute_query cl_nv_pragma_unroll cl_nv_d3d10_sharing cl_khr_d3d10_sharing cl_nv_d3d11_sharing cl_nv_copy_opts cl_nv_create_buffer cl_khr_int64_base_atomics cl_khr_int64_extended_atomics cl_khr_device_uuid cl_khr_pci_bus_info cl_khr_external_semaphore cl_khr_external_memory cl_khr_external_semaphore_win32 cl_khr_external_memory_win32
    * OpenCL platform 0, device 1
    DEVICE_TYPE = 4
    DEVICE_NAME = NVIDIA GeForce RTX 3080
    DEVICE_VENDOR = NVIDIA Corporation
    DEVICE_VERSION = OpenCL 3.0 CUDA
    DRIVER_VERSION = 516.59
    DEVICE_MAX_COMPUTE_UNITS = 68
    DEVICE_MAX_CLOCK_FREQUENCY = 1800
    CL_DEVICE_MAX_CONSTANT_BUFFER_SIZE = 65536
    CL_DEVICE_HOST_UNIFIED_MEMORY = false
    CL_DEVICE_MAX_MEM_ALLOC_SIZE = 2684223488
    DEVICE_GLOBAL_MEM_SIZE = 10736893952
    DEVICE_EXTENSIONS = cl_khr_global_int32_base_atomics cl_khr_global_int32_extended_atomics cl_khr_local_int32_base_atomics cl_khr_local_int32_extended_atomics cl_khr_fp64 cl_khr_3d_image_writes cl_khr_byte_addressable_store cl_khr_icd cl_khr_gl_sharing cl_nv_compiler_options cl_nv_device_attribute_query cl_nv_pragma_unroll cl_nv_d3d10_sharing cl_khr_d3d10_sharing cl_nv_d3d11_sharing cl_nv_copy_opts cl_nv_create_buffer cl_khr_int64_base_atomics cl_khr_int64_extended_atomics cl_khr_device_uuid cl_khr_pci_bus_info cl_khr_external_semaphore cl_khr_external_memory cl_khr_external_semaphore_win32 cl_khr_external_memory_win32
    User defined PlatformAndDeviceIndices 0 0, name:NVIDIA GeForce RTX 3080
    Found command line argument OpenCL-PlatformAndDeviceIndices 0 0
    -- GPU Progressive lightmapper will use OpenCL device 'NVIDIA GeForce RTX 3080' from 'NVIDIA Corporation'--
    use -OpenCL-PlatformAndDeviceIndices <platformIdx> <deviceIdx> as command line arguments if you want to select a specific adapter for OpenCL.

    Also, it looks like there is a user-defined parameter selecting device 0 0 on the command line, but I'm not launching Unity with any command-line arguments. I also checked Unity Hub, and there are no parameters set there.

    So, can you explain to me what's happening?

    Thanks.
     
  41. KEngelstoft

    KEngelstoft

    Unity Technologies

    Joined:
    Aug 13, 2013
    Posts:
    1,366
    How odd! I might have an excuse now for getting a pair of 3080s to investigate ;-) Can you please try starting the Editor with -OpenCL-PlatformAndDeviceIndices 0 1 as a command line argument and report back:
    a) Is the second 3080 device now used?
    b) Do two devices now appear in the dropdown?
     
  42. jiraphatK

    jiraphatK

    Joined:
    Sep 29, 2018
    Posts:
    300
    Can I use HDRP path tracing as a kind of real-time preview for PLM? Will the path-traced result be similar enough to the final baked result?
     
  43. KEngelstoft

    KEngelstoft

    Unity Technologies

    Joined:
    Aug 13, 2013
    Posts:
    1,366
    The HDRP path tracer does a full material evaluation, whereas the progressive lightmapper only does the diffuse part, so the output will not be identical.
     
    jiraphatK likes this.
  44. jiraphatK

    jiraphatK

    Joined:
    Sep 29, 2018
    Posts:
    300
    Okay. I sent you a new roadmap idea :).
    I think it's the feature that makes Bakery so good, even with its limited hardware support.
    Please, please consider it. We'll be stuck with lightmapping for the foreseeable future, and this single feature alone would help a lot in the light-baking workflow, instead of having to make an educated guess and end up wasting time anyway, as in the current workflow.:confused:
     
  45. KEngelstoft

    KEngelstoft

    Unity Technologies

    Joined:
    Aug 13, 2013
    Posts:
    1,366
    We are already working on it ;-)
     
    jiraphatK likes this.
  46. KEngelstoft

    KEngelstoft

    Unity Technologies

    Joined:
    Aug 13, 2013
    Posts:
    1,366
    I managed to reproduce this and have logged a bug: IN-14135.
     
  47. latas

    latas

    Joined:
    Oct 9, 2013
    Posts:
    149
    Sorry for the delayed answer. That command-line argument was already in place; it didn't work. I decided to switch to an RTX 3090 to have more memory, but the issue is not fixed.
     
  48. germanban

    germanban

    Joined:
    Jul 8, 2018
    Posts:
    12
    Hello! I've been having some serious problems with the GPU lightmapper since around... October, I think?
    I'm using 2019.1.8f1 and a 2080 Ti. (I have also tested this on an older computer with a 980, and the problem is exactly the same.)

    Newer NVIDIA drivers just made the GPU lightmapper stop working. I get "OpenCL Error. Falling back to CPU lightmapper. Error callback from context: CL_INVALID_KERNEL".
    The newest driver that seems to work right now is 517.48. Right after that one, the GPU lightmapper apparently broke in all the Unity versions I tried.

    Yesterday I tried the very latest driver, and I got some hope, because with it the GPU lightmapper seemed to work in 2019.4 and a 2020 build. The problem is, my game is built on 2019.1.8, and it's impossible for me to update the Unity version for this project. I can stay on the older drivers (in fact, the driver Windows puts me on after a DDU is 516.94, so it's an even earlier one) but, if for some reason I had to update... would I be completely out of luck? Is there any hope that a new NVIDIA driver will fix this issue? I've read this thread, and some people seemed to fix similar issues by fiddling with .dll files, but none of that has helped me so far.

    Thanks in advance!
     
  49. KEngelstoft

    KEngelstoft

    Unity Technologies

    Joined:
    Aug 13, 2013
    Posts:
    1,366
    In Unity 2023.2.0a1 and later, the minimum spec for the GPU lightmapper is 2 GB of GPU memory.
     
    giorgos_unity likes this.
  50. KEngelstoft

    KEngelstoft

    Unity Technologies

    Joined:
    Aug 13, 2013
    Posts:
    1,366
    We have removed the preview label from the GPU lightmapper in 2023.2.0a6. :D A big thank you to everyone who gave feedback and filed bugs during the preview period.

    Since 2023.1, Unity provides a "Baking Profile". It can be found in the Lighting window when using the GPU backend in on-demand mode, and it offers users a tradeoff between performance and GPU memory usage when baking lightmaps.

    With this improvement, we have removed the fallback from the GPU to the CPU. Instead of silently falling back to the CPU lightmapper, the bake process will now stop and provide a clear Console message explaining why. With the lower memory consumption, however, we expect the number of failures to be significantly lower.

    Note that some Scenes will simply not fit into GPU memory for light baking. This can become noticeable when processing large Scenes with many objects, and/or many large textures, for instance for transparency.

    If you encounter issues, please use the bug reporter instead of posting here - it is much faster for us to notice and fix issues that way.