
Official Progressive GPU Lightmapper preview

Discussion in 'Global Illumination' started by KEngelstoft, Sep 26, 2018.

  1. KEngelstoft

    KEngelstoft

    Unity Technologies

    Joined:
    Aug 13, 2013
    Posts:
    1,366
In your case, probes should not be the cause of the high memory usage. Memory usage from other parts of the scene is highly scene dependent. When the lightmapper falls back to CPU, a report of the memory usage is printed in the editor log.
     
    Kichang-Kim likes this.
  2. Bordeaux_Fox

    Bordeaux_Fox

    Joined:
    Nov 14, 2018
    Posts:
    589
Disabling the denoiser does not prevent the glowing artefacts.

Also, after adding small baked point lights to my ceilings and trying to bake again, my scene looks like hell.
Is the baked indirect light from my point light setup with that small range really that strong? Before I bake, my point light looks like in the screenshot.
     
    Last edited: Jan 28, 2020
  3. KEngelstoft

    KEngelstoft

    Unity Technologies

    Joined:
    Aug 13, 2013
    Posts:
    1,366
A 5000 lumen point light will certainly add a lot of energy to your scene, but whether this is to be considered a strong light is highly scene dependent. The sun would be way more intense, for instance.
I can't deduce the root cause of the artefacts from your screenshots. Would you mind sharing your scene as part of a bug report so we can have a look and fix it? Here is a guide on how to report: https://unity3d.com/unity/qa/bug-reporting
    Thanks!
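To get a feel for "the sun would be way more intense", here is a rough photometric sketch (back-of-envelope numbers for illustration, not anything from the lightmapper itself): an ideal point light spreads its luminous flux over the full sphere, and illuminance falls off with the square of distance, while direct sunlight is on the order of 100,000 lux.

```python
import math

def point_light_illuminance(lumens: float, distance_m: float) -> float:
    """Illuminance (lux) at distance_m from an ideal point light.

    Luminous intensity I = flux / (4*pi) in candela; illuminance E = I / d^2.
    """
    intensity = lumens / (4 * math.pi)   # candela
    return intensity / distance_m ** 2   # lux

# A 5000 lm point light at 1 m gives roughly 400 lux, several orders
# of magnitude below direct sunlight (~100,000 lux).
e = point_light_illuminance(5000, 1.0)
```

This is only the idealized inverse-square model; actual baked results also depend on range, falloff settings, and surfaces in the scene.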
     
  4. sas67uss

    sas67uss

    Joined:
    Feb 8, 2020
    Posts:
    81
    Hi
I have had a very bad result from using the GPU lightmapper. Please look at this picture, especially the tree shadows. I use Unity 2019.2.19 LWRP, and the tree material is Unlit with the surface set to Opaque and Alpha Clipping enabled.
     

    Attached Files:

    Last edited: Feb 12, 2020
  5. uy3d

    uy3d

    Unity Technologies

    Joined:
    Aug 16, 2016
    Posts:
    187
    What you're seeing are artifacts caused by hitting a lot of invalid surfaces. Have you flagged the foliage on your trees as double sided?
     
  6. sas67uss

    sas67uss

    Joined:
    Feb 8, 2020
    Posts:
    81
I found a solution:
the material Render Face must be set to "Both".
Otherwise, Double Sided GI must be enabled on the material.
But this issue did not happen when I used the Bakery GPU lightmapper asset, although the material render face was not set to Both.
Do the Unity Technologies guys have a comment?
Given that the Unity GPU lightmapper has been slowed down in 2019.3, I think Unity must change their algorithms.
     
  7. sas67uss

    sas67uss

    Joined:
    Feb 8, 2020
    Posts:
    81
How to flag as double sided?
     
  8. uy3d

    uy3d

    Unity Technologies

    Joined:
    Aug 16, 2016
    Posts:
    187
In URP this is achieved by setting Render Face to Both, as you have already done. For the built-in render pipeline there is a checkbox on the material.
     
  9. PerunCreative

    PerunCreative

    Joined:
    Nov 2, 2015
    Posts:
    113
I have a few questions:
• Does the progressive lightmapper support multi-GPU setups?
• Is there a solution for network-distributed baking?
• Which is generally faster, NVIDIA (OptiX) or AMD (Radeon Pro)?
     
  10. kristijonas_unity

    kristijonas_unity

    Unity Technologies

    Joined:
    Feb 8, 2018
    Posts:
    1,080
    • It does not yet.
    • None at the moment.
• After the most recent update to the Radeon Pro denoiser, it is now just as fast as OptiX.
     
    PerunCreative likes this.
  11. Dwight_P

    Dwight_P

    Joined:
    Feb 26, 2013
    Posts:
    42
Anyone else having issues with a GeForce GTX 970 being unable to allocate memory, no matter what the lightmapping settings are? The GPU has 4 GB of memory and has been updated to the most recent drivers. Even dropping everything to the absolute minimum will not work. In fact, dropping to the minimum possible sample rates results in the process stalling.

    1583267336571574248756628001613.jpg

    Edit: Unity version 2019.3.3
     
    Last edited: Mar 3, 2020
  12. KEngelstoft

    KEngelstoft

    Unity Technologies

    Joined:
    Aug 13, 2013
    Posts:
    1,366
Does turning off 'Prioritize View' make a difference? Do you have tons of lights or Light Probe Groups in your scene? The editor log contains more details of the memory usage when it fails to allocate memory. Please share it if possible.
     
  13. Dwight_P

    Dwight_P

    Joined:
    Feb 26, 2013
    Posts:
    42
I did get it somewhat working. I had to turn off Prioritize View, Multiple Importance, and Compress Lightmaps. Even then, I am not able to push much. I'm currently away from my PC, or I'd take a screenshot, but my current settings are:

    Direct Samples: 20
    Indirect Samples: 50
    Environment Samples: 50
    Bounces: 2
    Filtering: Auto
    Lightmap Resolution: 16
    Lightmap Padding: 2
    Lightmap Size: 256
    Ambient Occlusion: Off
    Directional Mode: Directional
    Indirect Intensity: 1
    Albedo Boost: 1
    Lightmap Parameters: Default- LowResolution

I have a single directional light in the entire scene which is mainly there for ambience (it's a horror game). The rest of my lights are point and area lights, all set to Mixed. I had to set their profiles all to low. There are only around 30 lights currently in the scene, and it's nowhere near finished. Still, it will take about 6 hours to build. I am just hoping it doesn't run out of memory during that time and stop. Originally I had the atlas size at 512, but about an hour in it ran out of memory and reverted to CPU.

If it fails again, I'll post the memory information from the editor log. It may help weed out other performance issues.



Edit: So I was able to get more information on my lighting setup:
Light Profiles (104 total: primarily point lights)
    upload_2020-3-4_23-10-48.png

    Emissive Mats:
    upload_2020-3-4_23-10-48.png

    Light Profiles:
    upload_2020-3-4_23-27-52.png

    Building Profile:
    upload_2020-3-4_23-28-56.png

And here is all it says in the editor log:
Code (CSharp):
[Licensing::Module] Successfully connected to LicensingClient on channel: LicenseClient-Dwight
Entitlement-based licensing initiated
[LicensingClient] Licenses Updated successfully in LicensingClient

LICENSE SYSTEM [202034 19:11:47] Next license update check is after 2020-03-05T01:14:30

LICENSE SYSTEM [202034 19:11:47] 00330-80000-00000-AA127 != 00330-80000-00000-AA680

Built from '2019.3/staging' branch; Version is '2019.3.3f1 (7ceaae5f7503) revision 8186542'; Using compiler version '191627012'
OS: 'Windows 10  (10.0.0) 64bit' Language: 'en' Physical Memory: 12269 MB
[Licensing::Module] Serial number assigned to: "F4-NQWJ-ZGNU-MQXA-DPZA-XXXX"\nBatchMode: 0, IsHumanControllingUs: 1, StartBugReporterOnCrash: 1, Is64bit: 1, IsPro: 0
[Package Manager] Server::Start -- Port 53949 was selected

COMMAND LINE ARGUMENTS:
C:\Program Files\Unity\Hub\Editor\2019.3.3f1\Editor\Unity.exe
Exiting without the bug reporter. Application will terminate with return code 0
    Also using the HDRP Lit Shader for all models.
     

    Attached Files:

    Last edited: Mar 5, 2020
  14. KEngelstoft

    KEngelstoft

    Unity Technologies

    Joined:
    Aug 13, 2013
    Posts:
    1,366
Your editor log is almost empty; it appears to be from a different run than the one where you actually baked the scene. Please consider reporting a bug: if you use the bug reporter, we can get the correct editor log and also look at your scene to analyze what is causing it to run out of memory. Thanks!
     
  15. rmon222

    rmon222

    Joined:
    Oct 24, 2018
    Posts:
    77
Hi, I'm using the GPU lightmapper and 2019.3.4. I get 22 2K lightmaps at 4 texels/unit resolution. I'm afraid that if I bump that up to 20 texels/unit, the number of lightmaps will explode. Attached is a picture of lightmap indices 6 to 10. They all look similar. Is this expected? Thanks
    Capture.PNG
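As a rough sanity check on the fear above (my own arithmetic, not from the thread): lightmap texels cover surface area, so the texel count scales with the square of texel density. Going from 4 to 20 texels/unit multiplies it by 25, and with a fixed lightmap size the number of lightmaps grows by roughly the same factor.

```python
def lightmap_scale(old_density: float, new_density: float) -> float:
    """Factor by which total lightmap texels grow when texel density changes.

    Texels cover 2D surface area, so the count scales with density squared.
    """
    return (new_density / old_density) ** 2

factor = lightmap_scale(4, 20)   # 25.0
maps = 22 * factor               # ~550 lightmaps of the same size, all else equal
```

Packing efficiency and per-object overrides will shift the exact count, but the quadratic growth is the dominant effect.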
     
  16. Zylex

    Zylex

    Joined:
    Nov 25, 2008
    Posts:
    238
We would love to use GPU baking, but in previous versions it always seemed to run out of memory, making it impossible to bake. We have pretty big scenes with terrains which we would like to bake at a lightmap size of 4096 to increase batching and get a high terrain lightmap resolution.

Now on my current hardware with 8GB of VRAM this does not seem possible, and it seems it might take a while before this is improved. But the real question then is: what graphics card would work? Do we really need to buy a 48GB video card to make this work? Or would 16GB already do the trick?
     
  17. xVergilx

    xVergilx

    Joined:
    Dec 22, 2014
    Posts:
    3,296
Or you could just buy Bakery and save yourself lots of $ instead (and potentially sanity). If you're using an Nvidia + Windows setup, that is.

PLM is light years away from being GPU stable. At least that's how it is for me and my large scenes.
     
    Thomas-Pasieka likes this.
  18. KEngelstoft

    KEngelstoft

    Unity Technologies

    Joined:
    Aug 13, 2013
    Posts:
    1,366
16 GB will allow you to bake most scenes, but it depends on which Unity version you are on. With the ray space tiling feature that recently landed in 2020.2, lightmaps, and especially 4K maps, now use less memory, but 8GB can still be pushing it if there are a ton of large assets in your scene (the OS, the Editor, other apps and the light baker all need to fit). This is why a cheaper alternative to a super-high-memory GPU would be to use two GPUs: one for the editor and one dedicated to light baking.
     
  19. JamesArndt

    JamesArndt

    Joined:
    Dec 1, 2009
    Posts:
    2,932
Yep, and this was a big drawback to Bakery. I would start my project on my desktop with an Nvidia-based GPU and store the project in the cloud. Opening that same project on my AMD GPU laptop, Bakery would no longer function in my project. That limitation really sucked and became the reason I couldn't rely on Bakery for my projects. I need hardware-agnostic tools as much as possible.
     
    newguy123 likes this.
  20. fct509

    fct509

    Joined:
    Aug 15, 2018
    Posts:
    108
I haven't tried this with GPU baking, but I learned (through experience) that you can fix out-of-memory errors by increasing the page file size in the system settings if you're using Windows. I have a system with 32 GB of memory, but I kept getting out-of-memory errors on the bake jobs on a project I used to work on. I ended up increasing the paging file size to around 6 GB and it solved the problem. While I don't think that increasing the paging file size will be much help when doing GPU baking, it might be worth a try.
     
  21. KEngelstoft

    KEngelstoft

    Unity Technologies

    Joined:
    Aug 13, 2013
    Posts:
    1,366
    Please don't mess with the page file, it will not make a difference for the GPU lightmapper.
     
  22. StenCG

    StenCG

    Joined:
    Mar 26, 2015
    Posts:
    66
Is IES support expected? Building a mesh in front of the light is too archaic.
     
  23. KEngelstoft

    KEngelstoft

    Unity Technologies

    Joined:
    Aug 13, 2013
    Posts:
    1,366
  24. StenCG

    StenCG

    Joined:
    Mar 26, 2015
    Posts:
    66

    Attached Files:

  25. KEngelstoft

    KEngelstoft

    Unity Technologies

    Joined:
    Aug 13, 2013
    Posts:
    1,366
    GPU lightmapper is still in preview. Baked LOD support for GPU lightmapper was added in 2020.1.0a20.
     
  26. sstrong

    sstrong

    Joined:
    Oct 16, 2013
    Posts:
    2,255
I have a lighting data asset that is approx. 1GB in 2018.4 LTS; however, in the Lighting tab it says 19 Directional Lightmaps with Shadowmask, 19x512x512, 25.3MB. What is consuming all the space in the lighting asset? I'm using the Progressive GPU lightmapper.
     
  27. KEngelstoft

    KEngelstoft

    Unity Technologies

    Joined:
    Aug 13, 2013
    Posts:
    1,366
    Precomputed Realtime GI data perhaps? Do you have that enabled?
     
  28. fendercodes

    fendercodes

    Joined:
    Feb 4, 2019
    Posts:
    192
    I'm getting the following error: OpenCL Error. Falling back to CPU lightmapper. Error callback from context: Max allocation size supported by this device is 1.50 GB. 1.85 GB requested.

There is a single light probe group with about 100 probes in our scene. I've set the resolution to just 2 for testing, but my Nvidia GTX 1060 6GB keeps switching back to CPU. Here is the memory log... any ideas?

    *********************************************
    * Lightmapper - GPUMemoryStatistics - START *
    *********************************************
    -------------------
    Overview:
    Category Overview 3017.339 MBs
    OpenCLEnvironmentBuffers 2.096 MBs
    OpenCLCommonBuffers 1892.981 MBs
    -------------------
    Details:
    Category OpenCLEnvironmentBuffers 2.096 MBs
    env_mipped_cube_texels_buffer 2.096 MBs
    Category OpenCLCommonBuffers 1892.981 MBs
    goldenSample_buffer 0.500 MBs
    sobol_buffer 0.203 MBs
    albedoTextures_buffer 0.225 MBs
    emissiveTextures_buffer 0.021 MBs
    transmissionTextures_buffer 1892.031 MBs
    Category OpenCLRenderBuffers-probe-0 561.104 MBs ProbeCount:17
    lightSamplesCompactedBuffer 8.000 MBs
    outputShadowmaskFromDirectBuffer 0.004 MBs
    shadowmaskExpandedBuffer 16.000 MBs
    pathRaysCompactedBuffer_0 48.000 MBs
    pathRaysCompactedBuffer_1 48.000 MBs
    pathIntersectionsCompactedBuffer 32.000 MBs
    transparentPathRayIndicesCompactedBuffer 4.000 MBs
    pathThroughputExpandedBuffer 16.000 MBs
    pathLastPlaneNormalCompactedBuffer 4.000 MBs
    pathLastInterpNormalCompactedBuffer 4.000 MBs
    pathLastNormalFacingTheRayCompactedBuffer 1.000 MBs
    directSampleCountBuffer 0.001 MBs
    indirectSampleCountBuffer 0.001 MBs
    environmentSampleCountBuffer 0.001 MBs
    lightRayIndexToPathRayIndexCompactedBuffer 4.000 MBs
    expandedTexelsBuffer 0.003 MBs
    sampleDescriptionsExpandedBuffer 8.000 MBs
    lightRaysCompactedBuffer 48.000 MBs
    lightOcclusionCompactedBuffer 16.000 MBs
    positionsWSBuffer 0.004 MBs
    originalRaysExpandedBuffer 16.000 MBs
    outputProbeDirectSHDataBuffer 0.040 MBs
    outputProbeIndirectSHDataBuffer 0.040 MBs
    outputProbeOcclusionBuffer 0.004 MBs
    inputLightIndicesBuffer 0.004 MBs
    probeSHExpandedBuffer 144.000 MBs
    probeOcclusionExpandedBuffer 144.000 MBs
    Category OpenCLRenderBuffers-probe-1 561.158 MBs ProbeCount:21
    lightSamplesCompactedBuffer 8.000 MBs
    outputShadowmaskFromDirectBuffer 0.007 MBs
    shadowmaskExpandedBuffer 16.000 MBs
    pathRaysCompactedBuffer_0 48.000 MBs
    pathRaysCompactedBuffer_1 48.000 MBs
    pathIntersectionsCompactedBuffer 32.000 MBs
    transparentPathRayIndicesCompactedBuffer 4.000 MBs
    pathThroughputExpandedBuffer 16.000 MBs
    pathLastPlaneNormalCompactedBuffer 4.000 MBs
    pathLastInterpNormalCompactedBuffer 4.000 MBs
    pathLastNormalFacingTheRayCompactedBuffer 1.000 MBs
    directSampleCountBuffer 0.002 MBs
    indirectSampleCountBuffer 0.002 MBs
    environmentSampleCountBuffer 0.002 MBs
    lightRayIndexToPathRayIndexCompactedBuffer 4.000 MBs
    expandedTexelsBuffer 0.005 MBs
    sampleDescriptionsExpandedBuffer 8.000 MBs
    lightRaysCompactedBuffer 48.000 MBs
    lightOcclusionCompactedBuffer 16.000 MBs
    positionsWSBuffer 0.007 MBs
    originalRaysExpandedBuffer 16.000 MBs
    outputProbeDirectSHDataBuffer 0.061 MBs
    outputProbeIndirectSHDataBuffer 0.061 MBs
    outputProbeOcclusionBuffer 0.007 MBs
    inputLightIndicesBuffer 0.007 MBs
    probeSHExpandedBuffer 144.000 MBs
    probeOcclusionExpandedBuffer 144.000 MBs
    *********************************************
    * Lightmapper - GPUMemoryStatistics - STOP *
    *********************************************
     
  29. Zylex

    Zylex

    Joined:
    Nov 25, 2008
    Posts:
    238
In this case, if you have two GPUs, would 8GB suffice, or is it still better to get a 16GB video card as a separate baker? Ideally we would like to bake a scene on GPU with around 20-30k mesh renderers.
     
  30. KEngelstoft

    KEngelstoft

    Unity Technologies

    Joined:
    Aug 13, 2013
    Posts:
    1,366
It looks like the transmission textures are taking up more memory than your Nvidia GPU can support in a single allocation. It can allocate up to 25% of total physical memory (1.5GB in your case) and 1892 MB is simply too much. Either reduce some transmission textures in size, use the CPU lightmapper, or use a GPU with a larger max alloc size (an AMD GPU or an Nvidia GPU with more memory).
I have added this case to our backlog so we can give a better user experience in the future. Thanks for reporting this.
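The arithmetic behind this reply can be sketched directly from the posted log (an illustration of the 25% single-allocation rule described above, which matches the OpenCL minimum guarantee for CL_DEVICE_MAX_MEM_ALLOC_SIZE of a quarter of device memory; the buffer size is taken from the log earlier in the thread):

```python
def max_single_alloc_mb(total_vram_mb: float, fraction: float = 0.25) -> float:
    """Largest single allocation the device supports, per the 25% rule above."""
    return total_vram_mb * fraction

limit = max_single_alloc_mb(6 * 1024)   # 1536 MB, i.e. the 1.5 GB from the error
requested = 1892.031                    # transmissionTextures_buffer from the log
assert requested > limit                # single buffer exceeds the cap -> CPU fallback
```

Shrinking the transmission textures until that one buffer fits under the cap is what makes the difference, not total free VRAM.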
     
  31. fendercodes

    fendercodes

    Joined:
    Feb 4, 2019
    Posts:
    192
    @KEngelstoft Thanks, makes sense. Except, what exactly are "transmission textures"?
     
  32. KEngelstoft

    KEngelstoft

    Unity Technologies

    Joined:
    Aug 13, 2013
    Posts:
    1,366
If you are targeting 4K lightmap resolution, 8GB isn't enough yet; we are working on improving memory usage to allow this on smaller video cards too.
     
  33. KEngelstoft

    KEngelstoft

    Unity Technologies

    Joined:
    Aug 13, 2013
    Posts:
    1,366
Transparent or cutout textures from materials applied to the renderers that affect GI in your scene. This is very scene dependent, but it usually comes from foliage, windows or other transparent objects.
     
  34. Zylex

    Zylex

    Joined:
    Nov 25, 2008
    Posts:
    238
Thanks for your reply. I understand that 8GB isn't enough, so we are willing to invest in hardware; however, we are not sure what IS enough VRAM for GPU baking on a scene where we want 4K lightmap resolution and have around 30k static mesh renderers plus terrain. Any advice on this?

You mentioned that 16GB works for most scenes, but in this case would 24GB suffice? Or is 48GB needed? There is no real way for us to tell at the moment, so I hope you can shed some more light on this.
     
    Last edited: May 28, 2020
  35. DefyGravity

    DefyGravity

    Joined:
    Mar 26, 2020
    Posts:
    6
    Hello, I just wanted to thank the Unity developers for their continued improvements to the Progressive GPU Lightmapper. After upgrading to Unity 2019.3 we saw a tremendous decrease in render times. We are using an NVIDIA Quadro RTX 8000 and we were getting rendering errors before 2019.3. This card has 48 GB of video ram and is enough ram for our very large scene. Using Progressive CPU we had bake times of 3.5 days, now it takes 20 minutes using the same settings for Progressive GPU! That's for 42 2K lightmaps. This is such a huge time saver for us!

    Thanks again!
     
    Gametyme and KEngelstoft like this.
  36. KEngelstoft

    KEngelstoft

    Unity Technologies

    Joined:
    Aug 13, 2013
    Posts:
    1,366
I am sorry, but that is really scene dependent. Those 30k static meshes could be a hundred megabytes or many gigabytes, I have no way of knowing. Having many thousands of lights or just a single light also makes a difference, and the resolution of transparency textures plays a part too... The list goes on, so I am not able to advise you on the upper limit your scene could take up.
We are already working on improving memory usage and will have more reasonable memory consumption before going out of preview.
     
  37. Milanis

    Milanis

    Joined:
    Apr 6, 2018
    Posts:
    55
Oh my, it's so sad. At the company we're using 2018.4 (that's fixed, no changes possible).

One particular map's light bake takes around 3 hours on CPU.

Now here comes the crazy part, and I really wish there were some workaround to make it work:

I can render the scene with the GPU in under 5 minutes, resulting in 4x 2K lightmaps. (GTX 1080, 8GB, my own home office equipment)

Problem (that's fixed, no changes): we are using 4K lightmaps in production. With the same light bake settings, but with 4K selected in the dropdown instead of 2K, the GPU fails.
I am not the big tech guy here, but I really wonder what the issue is when the settings and results would be the same.
I am aware that 2018.x needs more GPU RAM... but why, when the 4x 2K maps above worked like a charm, which is literally that one 4K map?

I can probably just beg for help, but can't there be some hack or lightmap stitcher to bake these 4x 2K maps into one 4K map?

    Side notes:
    - No, I can't afford a bigger card.
    - No, the production can't upgrade to newer Unity versions. (2018.4)
    - No, it is not allowed to use anything else than Unity light bake solutions.

    It's a real-world production problem that needs real-world solutions in this case.

    Kind regards.
     
    Last edited: May 28, 2020
  38. fct509

    fct509

    Joined:
    Aug 15, 2018
    Posts:
    108
I think you said that your GPU can bake the information into four images at 2K resolution, but that the bake fails when you try to do a single 4K image (which contains the same number of pixels as the four 2K images).

I wonder if the reason you're able to bake 4 images at 2K, but not a single image at 4K, is that the GPU is unloading one or more images (at the 2K settings) to make sure everything fits in memory. Maybe it works a bit on one image, unloads it from the GPU, loads a different image into the GPU and works on that for a bit, unloads that, loads another one to work on for a bit, and so on and so forth.

Besides the image resolution, there are a few things you can do to decrease the memory use of your bake. I don't remember any of them off the top of my head, because it's been a while since I last had to really mess around with the sizes of the lightmaps. I suspect that lowering the settings I'm thinking of will also lower the resulting quality of the bake.

Anyway, I'm really curious what KEngelstoft has to say about this. Good luck to you, Milanis.
     
  39. KEngelstoft

    KEngelstoft

    Unity Technologies

    Joined:
    Aug 13, 2013
    Posts:
    1,366
Yes, you are on to something! It is indeed baking just one of your four 2K maps at a time, whereas for the 4K map it needs to allocate 4 times as much memory to bake. We are not backporting new features because we don't want to risk destabilizing the LTS release, so you have to stick with the CPU lightmapper on 2018.4.
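The memory math in this answer is easy to make concrete (my own illustration of the ratio; actual bytes per texel depend on format and working buffers): a 4K map has four times the texels of a 2K map, so baking it in one go needs roughly four times the peak working memory of baking four 2K maps one at a time.

```python
def texels(size: int) -> int:
    """Texel count of a square lightmap with the given edge size."""
    return size * size

# Four 2K maps baked one at a time: the peak working set is one 2K map.
peak_2k = texels(2048)
# One 4K map baked in one go: the whole map must be resident at once.
peak_4k = texels(4096)

assert peak_4k == 4 * peak_2k   # same total texels, 4x the peak allocation
```

This is why the four 2K bakes fit in 8GB while the single 4K bake does not, even though the final output covers the same area.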
     
  40. fct509

    fct509

    Joined:
    Aug 15, 2018
    Posts:
    108
I was thinking a bit more about this, and I was wondering if there's anything they can do to free up some video memory for the bakes. For instance, can they close the tabs/windows with the Scene and/or Game views? Can they decrease the number of monitors they are connected to? What if they shut down any other programs that might be using the graphics card? Can they lower the resolution on their monitor? I'm not saying that these things will make it work, but I am wondering if they might free up some additional resources for the bake.
     
  41. Gametyme

    Gametyme

    Joined:
    May 7, 2014
    Posts:
    618
The GPU lightmapper is now working for really large scenes in 2020.1b10 and 2020.2a10. These scenes never worked before for me. They also have a ton of light probes. They do still fall back to CPU if I try to use a resolution greater than 1024. Is there any advantage to that? I have a 4K CPU lightmap baking now to see if I can notice a difference in quality during gameplay.
     
  42. WildStyle69

    WildStyle69

    Joined:
    Jul 20, 2016
    Posts:
    318
FWIW - I spent a lot of time messing around with resolutions and trying to free up memory to no avail; it was only when I purchased a decent second graphics card that I was able to use the GPU lightmapper effectively, without Unity switching back to the CPU version at higher resolutions. Even now, from time to time something goes wrong and it switches back to CPU, but not often, and those cases normally result in a crash, so I have to kill the Unity process and restart. Then the GPU version works fine again.

    // Wildstyle
     
  43. Gametyme

    Gametyme

    Joined:
    May 7, 2014
    Posts:
    618
    What gpu were you using?
     
  44. KEngelstoft

    KEngelstoft

    Unity Technologies

    Joined:
    Aug 13, 2013
    Posts:
    1,366
    Please report the crash if possible, so we can investigate and fix it. https://unity3d.com/unity/qa/bug-reporting
     
  45. Laurentius1984

    Laurentius1984

    Joined:
    May 7, 2019
    Posts:
    3

    Hello,

I did the above, but still got the same errors:

"OpenCL Error. Falling back to CPU lightmapper. Error callback from context: CL_INVALID_PROGRAM"

and

OpenCL CPU device GeForce GTX 540 Ti from NVIDIA Corporation has less than 4 GB of global memory, ignoring device for lightmapping (getting this error after downgrading my newest driver to version 416.34)

also

Failed to find a suitable OpenCL device, falling back to CPU lightmapper.

and

[PathTracer] LoadShaders job with hash: 98e46fb452b0c763474d20b0a9131eff failed with exit code 0.


I also tried to find OpenCL in the Editor folder, but could not find it!
     
  46. Laurentius1984

    Laurentius1984

    Joined:
    May 7, 2019
    Posts:
    3



Okay, I have updated Unity to 2019.4.

Although the lighting generation works with CPU (very slowly), with GPU it falls back to CPU, giving the following messages:

• OpenCL Error. Falling back to CPU lightmapper. Error callback from context: CL_MEM_OBJECT_ALLOCATION_FAILURE
• OpenCL Error. Falling back to CPU lightmapper. Error callback from context: CL_MEM_OBJECT_ALLOCATION_FAILURE error executing CL_COMMAND_NDRANGE_KERNEL on GeForce GTX 750 Ti (Device 0).
• OpenCL Error. Falling back to CPU lightmapper. Error callback from context: CL_MEM_OBJECT_ALLOCATION_FAILURE
• Asset file 'Assets/Scenes/samplescene.unity.meta' and meta file 'Assets/Scenes/SampleScene.unity.meta' has inconsistent casing. Renaming meta file succeeded. UnityEngine.GUIUtility:processEvent(Int32, IntPtr)
     
    JamesArndt likes this.
  47. chillcarrier

    chillcarrier

    Joined:
    May 24, 2020
    Posts:
    4
Hi, I'm having a hard time with my GeForce GTX 1660 Ti not being recognized by the Progressive GPU lightmapper (in Unity 2019.4.0f1; same problem under 2019.3.14f1).
I'm using the latest official drivers (446.14), and in the editor log the GeForce is mentioned, but not under OpenCL platforms, just here:

[Optix] context is using local device 0: GeForce GTX 1660 Ti - 6144MB VRAM (available: 4719MB)


Baking always uses my i5-9600K.
I also tried copying the driver's opencl64.dll directly into the /editor/ directory and renaming it to opencl.dll, as suggested in another thread, but without any luck. It then just gives me

Failed to find a suitable OpenCL device for the GPU Lightmapper, falling back to CPU lightmapper. Please install the latest graphics driver.


in the console on startup.

I installed the card today and already had the Unity installations on my system. Could that be the problem? Do I have to reinstall them? Then again, it did not recognize my old GT 730 either.
So yeah, I've run out of ideas and would be grateful for any input.


Edit: Found the solution on the Nvidia forum under OpenCL issues. I had to fall back to driver version 442.72; apparently 446.14 disabled OpenCL, as GPU-Z also showed me. Really glad it finally works. :)
     
    Last edited: Jun 23, 2020
    genify, IN_Ron and JamesArndt like this.
  48. Bordeaux_Fox

    Bordeaux_Fox

    Joined:
    Nov 14, 2018
    Posts:
    589
I again have problems with baking a rather big map in Unity 2019.4.
Sadly, I cannot go higher than a resolution of 256 px, otherwise the GPU bake fails because it says my RTX 2060 does not have enough memory.

Any tips on how to get a higher resolution with my hardware setup?
I have already followed some general optimizations: I excluded all vegetation and small objects from lightmapping (they use light probes or Light Probe Proxy Volumes). Basically I just bake the big building walls and the ground.

By the way, do light probes also have a big influence on the GPU memory required to bake the scene?
     
  49. Grimreaper358

    Grimreaper358

    Joined:
    Apr 8, 2013
    Posts:
    789

    You can also turn texture details to low (Quality Settings) when you are going to bake. This will free up some of your VRAM
     
  50. KEngelstoft

    KEngelstoft

    Unity Technologies

    Joined:
    Aug 13, 2013
    Posts:
    1,366
Hi, the GeForce GTX 540 Ti only has 1 GB of GPU memory; you need a card with 4 GB, as indicated in the error message.