
Official Progressive GPU Lightmapper preview

Discussion in 'Global Illumination' started by KEngelstoft, Sep 26, 2018.

  1. KEngelstoft

    KEngelstoft

    Unity Technologies

    Joined:
    Aug 13, 2013
    Posts:
    1,366
    Baked LOD will not be available for the GPU lightmapper in 2019.x; you should use the CPU lightmapper if you need this.
     
  2. Vagabond_

    Vagabond_

    Joined:
    Aug 26, 2014
    Posts:
    1,148
    @KEngelstoft I finally found the reason for the black patches on meshes all over the scene in every project I tested!

    It is the "Pushoff" value. I never suspected it might cause any issue, because the default of 0.0001 seems correct compared with other renderers I know, but it does produce issues with both the CPU and GPU lightmappers!

    Note that the objects in the images below are placed a couple of kilometres away from the origin!

    Result with the default pushoff of 0.0001 - incorrect (this value is set in all the built-in presets, which is why I did not get good results with any preset loaded). Could this be a floating-point precision issue with the Radeon Rays API? But it also happens with the CPU lightmapper!

    GPU
    upload_2019-8-16_22-23-19.png

    CPU
    upload_2019-8-16_22-29-36.png



    Result with pushoff set to 0.001 - correct. I created a custom preset and changed the value!

    GPU

    upload_2019-8-16_22-16-40.png


    CPU
    upload_2019-8-16_22-30-39.png
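
    In case anyone wants to script the workaround instead of editing presets by hand, here is a minimal editor sketch (my own convenience script, nothing official) that creates a LightmapParameters asset with the larger pushoff; the asset path and the 0.001 value are just the ones from my test, adjust them for your project, then assign the asset as the default Lightmap Parameters in the Lighting window.

    Code (CSharp):
    using UnityEditor;

    // Convenience sketch: create a LightmapParameters asset with a larger pushoff.
    // The asset path and the 0.001 value are only examples.
    public static class PushoffPresetCreator
    {
        [MenuItem("Tools/Create High-Pushoff Lightmap Parameters")]
        static void Create()
        {
            var parameters = new LightmapParameters
            {
                pushoff = 0.001f // default is 0.0001, which gave me the black patches
            };
            AssetDatabase.CreateAsset(parameters, "Assets/HighPushoff.giparams");
            AssetDatabase.SaveAssets();
        }
    }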
     

    Attached Files:

    Lars-Steenhoff likes this.
  3. Haze-Games

    Haze-Games

    Joined:
    Mar 1, 2015
    Posts:
    189
    Hi again guys,

    I've tried again to bake with the GPU PLM in the latest 2019.2.1f1: same error with my RTX 2080 or my GTX 1070, even with the lowest settings possible. I've also tried forcing another GPU (Intel Graphics) to render the Editor scene; it doesn't help.

    I still get the same error:

    Code (CSharp):
    (Filename: C:\buildslave\unity\build\Editor/Src/GI/Progressive/OpenCL/OpenCLCheck.cpp Line: 134)
    OpenCL Error: 'GetBuffer(kRRBuf_lightRayIndexToPathRayIndexBuffer).EnqueueClearBuffer(openCLState)' returned -4 (CL_MEM_OBJECT_ALLOCATION_FAILURE)!
    OpenCL Error. Falling back to CPU lightmapper. Error callback from context: CL_MEM_OBJECT_ALLOCATION_FAILURE error executing CL_COMMAND_NDRANGE_KERNEL on GeForce RTX 2080 (Device 0).
    It works perfectly fine on 2019.1.14f1.

    Any ideas to help with this? I've got the latest NVIDIA drivers; should I roll back to the one recommended in the very first post? It seems like a pretty old driver, so I haven't tried that yet.

    Thanks,
    Charles
     
  4. Vagabond_

    Vagabond_

    Joined:
    Aug 26, 2014
    Posts:
    1,148
    Hi, I am using a GTX 1060 with this latest 2019.2.1f1 and it is working great.
    Have you tried the suggestion of cleaning the compute cache folder?
    I assume you are on Windows.
    Just go to this location and delete the "ComputeCache" folder; I actually deleted everything in the NVIDIA folder!
    C:\Users\"Your User Name"\AppData\Roaming\NVIDIA\ComputeCache
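
    If you would rather not dig through Explorer, here is a quick convenience sketch (my own script, nothing official) that deletes the same folder from inside the Editor; it does nothing more than the manual deletion above:

    Code (CSharp):
    using System;
    using System.IO;
    using UnityEditor;
    using UnityEngine;

    // Convenience sketch: delete NVIDIA's OpenCL ComputeCache for the current user.
    // Same effect as removing the folder manually; restart the Editor afterwards.
    public static class ComputeCacheCleaner
    {
        [MenuItem("Tools/Delete NVIDIA ComputeCache")]
        static void DeleteCache()
        {
            string cache = Path.Combine(
                Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData),
                "NVIDIA", "ComputeCache");

            if (Directory.Exists(cache))
            {
                Directory.Delete(cache, recursive: true);
                Debug.Log("Deleted " + cache);
            }
            else
            {
                Debug.Log("No ComputeCache folder found at " + cache);
            }
        }
    }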
     
    skullthug and Haze-Games like this.
  5. Haze-Games

    Haze-Games

    Joined:
    Mar 1, 2015
    Posts:
    189
    Hi, thank you for replying!

    I did indeed delete that folder; I also deleted all the NVIDIA folders in all three of Roaming / Local / LocalLow and did a full reinstall of the latest drivers, and that didn't help either :(

    I just discovered that it works in a very small level, but strangely it doesn't work in a real level; that same level works fine as soon as I downgrade back to 2019.1.14f1.

    I will try this again, and maybe delete parts of the level gradually to see if a specific object could be causing this error (or the level size, but it would be weird if it's just "too big for my RTX 2080" even with a 32x32 lightmap size and resolution set to 1 in 2019.2.1f1, whereas I can bake at 2k lightmap size and resolution 28 on that same level in 2019.1.14f1)... confused :D
     
  6. Vagabond_

    Vagabond_

    Joined:
    Aug 26, 2014
    Posts:
    1,148
    It does not make sense to me; it must be some kind of issue. Unfortunately I can't think of a fix.
    Have you tried other OpenCL-based applications, like the Blender renderer benchmarks or the Lux renderer, just to be sure they are working?
    Also, there are things you can do to allow the driver (or the OS, I'm not sure) to use more memory for OpenCL, such as setting "GPU_FORCE_64BIT_PTR=1"; search the web for OpenCL environment variables. I know there are some entries that can be added to improve memory allocation - mostly for AMD cards, but it's worth looking into. A rough sketch of setting it is below.
    Also, do you have the latest Windows updates?
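
    Here is that sketch, assuming the variable name above (it is mostly documented for AMD's OpenCL stack, so no promises it changes anything on NVIDIA); restart the Editor so the new value is picked up:

    Code (CSharp):
    using System;
    using UnityEditor;

    // Rough sketch only: set GPU_FORCE_64BIT_PTR=1 for the current user.
    // Mostly documented for AMD OpenCL; no guarantee it helps on NVIDIA.
    public static class OpenCLEnvSetup
    {
        [MenuItem("Tools/Set GPU_FORCE_64BIT_PTR")]
        static void SetVariable()
        {
            Environment.SetEnvironmentVariable("GPU_FORCE_64BIT_PTR", "1",
                EnvironmentVariableTarget.User);
        }
    }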

    Another thing you can try (I would try it for sure, but I doubt Unity would recommend it) is to go to the 2019.1 Unity install folder, copy the OpenCL.dll file and paste it into the 2019.2 folder. Something may have changed if the 2019.2 version ships a DLL compiled against a newer Radeon Rays SDK version (just guessing)! Just be sure to back up the OpenCL.dll from the 2019.2 folder first, so you don't have to reinstall if it doesn't work!
     
    Haze-Games likes this.
  7. Haze-Games

    Haze-Games

    Joined:
    Mar 1, 2015
    Posts:
    189
    Thanks a lot for these suggestions and your lengthy reply :) I'm going to try this and keep you posted on what happens :)
     
    Vagabond_ likes this.
  8. KEngelstoft

    KEngelstoft

    Unity Technologies

    Joined:
    Aug 13, 2013
    Posts:
    1,366
    Are you using the Optix denoiser? Does it work if you disable it?
     
  9. Haze-Games

    Haze-Games

    Joined:
    Mar 1, 2015
    Posts:
    189
    Hello,
    Thanks for your reply.

    I tried with the Filter set to None too and all settings at a minimum; it doesn't help, unfortunately. I've tested the DLL replacement suggested above, but there doesn't seem to be any OpenCL.dll, so I tried various other DLLs; that didn't help either.

    EDIT: It happens on multiple machines with various settings, so maybe it is specific to my project, as other scenes seem to work fine. I did clear all caches, delete the lighting data, delete all NVIDIA folders, reinstall the latest drivers, etc.

    Thanks,
    Charles
     
  10. Vagabond_

    Vagabond_

    Joined:
    Aug 26, 2014
    Posts:
    1,148
    Does the project architecture matter? I mean, if the Windows build target is set to 32-bit, try setting it to 64-bit!
     
    Haze-Games likes this.
  11. icefallgames

    icefallgames

    Joined:
    Dec 6, 2014
    Posts:
    75
    Unfortunately seemingly tiny changes to a level can cause the GPU lightmapper to blow up and fail. We just had an issue (with 2019.2.0f1) where suddenly a level started failing to bake, even when the settings were all turned way down. After losing hours and hours of dev time, we finally narrowed it down to too many light probe groups (someone happened to try disabling all light probe groups in the Light Probes tab of light explorer, and suddenly the bakes worked fine).
     
  12. Haze-Games

    Haze-Games

    Joined:
    Mar 1, 2015
    Posts:
    189
    Thanks, I'm going to try this and tell you how it goes :)
     
  13. Gametyme

    Gametyme

    Joined:
    May 7, 2014
    Posts:
    618
    For me it works on small scenes, but on anything big it falls back to the CPU baker. This is using a GTX 1080 Ti.
     
    Last edited: Aug 21, 2019
  14. fendercodes

    fendercodes

    Joined:
    Feb 4, 2019
    Posts:
    191
    @icefallgames @Aze_ We are also getting the same issue, with the GPU lightmapper unable to bake when there are any Light Probes on the map. This seems to be a regression in the latest Unity. Is there a bug tracked for this?

    I tried to bake the same map with the CPU lightmapper, and it doesn't work either; it seems to run out of memory after a while.
     
  15. KEngelstoft

    KEngelstoft

    Unity Technologies

    Joined:
    Aug 13, 2013
    Posts:
    1,366
    Hi, memory management for many light probe groups will land in 2020.1. For 2019.x please combine probe positions into one light probe group.
     
  16. fendercodes

    fendercodes

    Joined:
    Feb 4, 2019
    Posts:
    191
    @KEngelstoft Thanks! That completely fixed the issue and I baked without errors. Do you happen to know what the lack of baking on LOD Groups in GPU Lightmapper actually means? Does it mean any static object with a LOD Group component gets no indirect lighting baked?

    For any onlookers who want to save time: copy/pasting Light Probes between groups is not trivial. You have to select all the light probes, hit Ctrl-C, go to the new light probe group and select one of the light probes in that group before pressing Ctrl-V. We had hundreds of probes, so this saved a lot of time. A rough script for doing the merge automatically is sketched below.
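
    This is only a sketch I put together, not an official tool: it concatenates the probePositions of every LightProbeGroup in the open scene into the first group and deletes the others, converting through world space because probe positions are local to each group's transform. It assumes each group lives on its own GameObject; back up your scene before running it.

    Code (CSharp):
    using System.Linq;
    using UnityEditor;
    using UnityEngine;

    // Rough sketch: merge every LightProbeGroup in the open scene into the first one.
    // Assumes each group lives on its own GameObject. Back up the scene first.
    public static class LightProbeGroupMerger
    {
        [MenuItem("Tools/Merge Light Probe Groups")]
        static void Merge()
        {
            var groups = Object.FindObjectsOfType<LightProbeGroup>();
            if (groups.Length < 2)
                return;

            var target = groups[0];
            var merged = target.probePositions.ToList();

            for (int i = 1; i < groups.Length; i++)
            {
                foreach (var local in groups[i].probePositions)
                {
                    // probePositions are local to each group, so go via world space.
                    var world = groups[i].transform.TransformPoint(local);
                    merged.Add(target.transform.InverseTransformPoint(world));
                }
                Undo.DestroyObjectImmediate(groups[i].gameObject);
            }

            Undo.RecordObject(target, "Merge Light Probe Groups");
            target.probePositions = merged.ToArray();
        }
    }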
     
    WildStyle69 and KEngelstoft like this.
  17. fendercodes

    fendercodes

    Joined:
    Feb 4, 2019
    Posts:
    191
    I'm also assuming lightmapping on Terrains isn't working yet? I get good results with CPU but bad with GPU.
     
  18. PhaseQuad

    PhaseQuad

    Joined:
    Jun 14, 2017
    Posts:
    39
    [PathTracer] InitializeLightmapData job with hash: b9f2fdc6080521d91d4e0858a1dc35e5 failed with exit code 2. How do I fix this?

    This is a nightmare. I haven't been able to create lightmaps for one city map for more than a month; why do I get this error? On the CPU the bake takes more than 20 hours; on the GPU I get this error.

    Please provide a script that ignores this error or something; I can pay for that, because I really need to create the lightmaps.
     
  19. KEngelstoft

    KEngelstoft

    Unity Technologies

    Joined:
    Aug 13, 2013
    Posts:
    1,366
    Baked lightmaps for LODs are not implemented in the GPU lightmapper, so all the LODs will be there from the point of view of the light rays, giving you severe overlap problems. We are working on a solution, but for now you have to stick with the CPU lightmapper if you need this.
     
  20. KEngelstoft

    KEngelstoft

    Unity Technologies

    Joined:
    Aug 13, 2013
    Posts:
    1,366
    Perhaps you are using an old version; there are no known issues with terrain in the latest 2019.3 version. If you find something, please report a bug. Thanks!
     
  21. KEngelstoft

    KEngelstoft

    Unity Technologies

    Joined:
    Aug 13, 2013
    Posts:
    1,366
    This error code means out of memory, not GPU memory but physical main memory (RAM). Try closing other applications you have running while baking or use a lower lightmap resolution.
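
    For anyone who prefers to drop these settings from a script rather than the Lighting window, a quick sketch (the values are only examples, not recommendations):

    Code (CSharp):
    using UnityEditor;

    // Quick sketch: lower the bake settings from script before starting a bake.
    // The values are only examples; tune them for your scene.
    public static class LowMemoryBake
    {
        [MenuItem("Tools/Bake With Reduced Settings")]
        static void Bake()
        {
            LightmapEditorSettings.maxAtlasSize = 512;    // smaller lightmap atlases
            LightmapEditorSettings.bakeResolution = 10f;  // fewer texels per unit
            Lightmapping.BakeAsync();
        }
    }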
     
  22. fendercodes

    fendercodes

    Joined:
    Feb 4, 2019
    Posts:
    191
    @KEngelstoft Apologies, it looks like it might be a general lighting issue and not just related to the GPU. Thought I'd show you a screenshot anyway. Do you know why the shadows from trees would be blocky like that on the terrain?

    There are also several instances in my game where mixed lights do not seem to render any direct light either. I'm using Forward rendering and the pixel light count is 4.
     

    Attached Files:

  23. PhaseQuad

    PhaseQuad

    Joined:
    Jun 14, 2017
    Posts:
    39
    Oookaay, thank you. I spent a month trying to understand this error... Why not write "Out of memory" in the console? That is the standard message in other engines and programs.
    The problem is random: I'll start baking 3-4 times and get this error at the start or at the finish. 16 GB of RAM, about 12 free, and 4 GB of VRAM - how much is needed for a normal bake?

    Okay, I have more problems:
    1) Why is the scene dark after baking?
    2) Why do some objects have white or black spots, and how can I quickly fix this? I don't have this problem in Unreal, so I don't think the problem is in the meshes. Also, sometimes the spots disappear; how does that work?

    Sorry, but I am really tired. After one hard month I am hitting these problems again and again... Unfortunately I can't upgrade my PC at this time.
     

    Attached Files:

    Last edited: Aug 27, 2019
  24. KEngelstoft

    KEngelstoft

    Unity Technologies

    Joined:
    Aug 13, 2013
    Posts:
    1,366
    Hi, I finally got an eGPU to test with today and it works with the Editor. I found that if an OpenCL device is ignored for lightmapping (for instance because it has too little memory), it does not count when specifying the device index on the command line, so you have to subtract the number of ignored devices from the index yourself: -OpenCL-PlatformAndDeviceIndices 0 3 becomes -OpenCL-PlatformAndDeviceIndices 0 2 in your case. Hope this helps.
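
    If you script your Editor launches, here is a rough sketch of passing that switch from C# (both paths below are placeholders, adjust them for your machine):

    Code (CSharp):
    using System.Diagnostics;

    // Rough sketch: launch the Editor with an explicit OpenCL platform/device index.
    // Both paths are placeholders; adjust them for your machine.
    public static class EditorLauncher
    {
        public static void LaunchWithDevice(int platformIndex, int deviceIndex)
        {
            Process.Start(
                @"C:\Program Files\Unity\Editor\Unity.exe",
                "-projectPath \"C:\\Projects\\MyProject\" " +
                $"-OpenCL-PlatformAndDeviceIndices {platformIndex} {deviceIndex}");
        }
    }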
     
    Last edited: Aug 29, 2019
  25. KEngelstoft

    KEngelstoft

    Unity Technologies

    Joined:
    Aug 13, 2013
    Posts:
    1,366
    Hi @Stygian, I finally managed to get hold of a RX 5700 and reproduced your issue. The OpenCL compiler in the Navi driver is more strict than previous versions so a few fixes had to be made to our kernels. The case number is 1180454 for reference and I think the fix will land in 2019.3 or 2020.1. The Radeon VII kernel compiler is less strict so that card works with the current versions of Unity (and it has 16 GB of memory so it is a good choice for the GPU lightmapper).
     
    Lars-Steenhoff likes this.
  26. Lars-Steenhoff

    Lars-Steenhoff

    Joined:
    Aug 7, 2007
    Posts:
    3,526
    Just got a Radeon VII in a sonnet e-gpu for my MacBook 2016.
    It's working fine in Mojave and unity 2019.3.0b2.

    On some scenes I get LightmapRasterize.cpp(574) in RasterTriangle - Triangle failed to raster! v1
    What does this mean?
     
  27. KEngelstoft

    KEngelstoft

    Unity Technologies

    Joined:
    Aug 13, 2013
    Posts:
    1,366
    Hi Lars, it means that a triangle either has very close to zero area or is somehow shaped in a way that the rasterization code doesn't like. What is printed after v1? There should be a lot of indices and original triangle()...
    Please share the offending scene/object with us in a bug report so we can fix it. This is not GPU-lightmapper-specific code, so the CPU lightmapper will likely hit the same problem.
     
  28. Lars-Steenhoff

    Lars-Steenhoff

    Joined:
    Aug 7, 2007
    Posts:
    3,526
    Thanks, I will look for the offending model.
     
  29. Ghost_Tales

    Ghost_Tales

    Joined:
    Sep 22, 2018
    Posts:
    6
    Hello there, I recently got around to baking parts of my project and ran into a problem. No matter how low I set the lighting settings, the lightmapper fails to bake in GPU mode. I have a 980 Ti, so memory should not be an issue. The errors I get are the following:

    "OpenCL Error. Falling back to CPU lightmapper. Error callback from context: CL_MEM_OBJECT_ALLOCATION_FAILURE error executing CL_COMMAND_NDRANGE_KERNEL on GeForce GTX 980 Ti (Device 0)"

    "OpenCL Error. Falling back to CPU lightmapper. Error callback from context: CL_MEM_OBJECT_ALLOCATION_FAILURE"

    "OpenCL Error. Falling back to CPU lightmapper. Error callback from context: CL_MEM_OBJECT_ALLOCATION_FAILURE error executing CL_COMMAND_WRITE_BUFFER on GeForce GTX 980 Ti (Device 0)"

    Any suggestions or help would be appreciated.

    My Unity Version is 2019.2.5f1
     
  30. Polkatuba

    Polkatuba

    Joined:
    Oct 31, 2014
    Posts:
    79
    As I replied in the other thread, reducing the lightmap resolution helped for a while. But when my scene started to grow again, the same error appeared. Now I have deleted all light probes and I'm able to bake with the GPU again. Deleting light probes might not be a proper fix, but have you tried that?
     
  31. icefallgames

    icefallgames

    Joined:
    Dec 6, 2014
    Posts:
    75
    It looks like it only has 6GB?

    - try reducing the number of light probe groups you have (not light probes)
    - reduce your lightmap size. I have an 8GB card, and usually can't bake any higher than a 512 lightmap size for our levels.
    - make sure your graphics drivers are up to date - if it fails no matter what the settings, even for small scenes, this might be the issue
     
  32. Ghost_Tales

    Ghost_Tales

    Joined:
    Sep 22, 2018
    Posts:
    6


    6 GB is overkill for the scene I'm trying to bake, trust me lol. There's almost nothing here. I figured out that having multiple small terrains sometimes causes it to fail, so I just created one giant terrain. Problem solved.

    There are no light probe groups in the scene at the moment.

    Another strange problem is that if I scale any mesh up too high, the GPU fails the bake. Very odd. I'm trying to upscale some background structures, and bringing them past 1.5X the size causes the GPU bake to fail.

    Drivers are up to date. Light map size is on 512. Just got home today and tried it on my personal PC (1080Ti) and the same issues persisted.


    Edit: Is anybody able to shed some light on why upsizing meshes past a certain size can cause the GPU light bake to fail?
     
    Last edited: Sep 23, 2019
  33. velacorp

    velacorp

    Joined:
    Sep 28, 2019
    Posts:
    3
    NVIDIA GPUs have a limit of 25% of memory usage when using OpenCL, so you can only use around 1.5 GB on a 980 Ti for baking.
     
  34. WildStyle69

    WildStyle69

    Joined:
    Jul 20, 2016
    Posts:
    318
    This saved me, I was about to throw my system out the window.. thanks so much!

    Successfully baked all my scenes at 2048 resolution in around 6 minutes... amazing stuff!! :D

    // WildStyle
     
  35. illinar

    illinar

    Joined:
    Apr 6, 2011
    Posts:
    863
    So, for me the GPU lightmapper always crashes, even with a single 1024 map. 1050 Ti, 8 GB RAM.

    Will this be resolved in the future? Will getting another 8GB of RAM help? Or will it just allow me to bake 1024 and nothing higher?
     
  36. Vagabond_

    Vagabond_

    Joined:
    Aug 26, 2014
    Posts:
    1,148
    A GTX 1050 Ti has at most 4 GB of memory.
    How much memory does your GPU have - 2 GB or 4 GB?
    When you say 8 GB, are you referring to system memory or GPU memory? System memory is only used to prepare and cache data that will be uploaded to GPU memory for computation. 8 GB of system memory could be enough, depending on the scene. A small scene could also fit in 2 GB or 4 GB of GPU memory, but for lightmapping it is recommended to have 6 GB or more!

    To get a proper answer you may need to share the errors that Unity throws in the console!
     
  37. illinar

    illinar

    Joined:
    Apr 6, 2011
    Posts:
    863
    There are no errors; it always crashes with maps bigger than 512. I didn't check the logs.

    Will it ever be able to get around this memory limitation? Other GPU lightmappers do.
     
  38. Vagabond_

    Vagabond_

    Joined:
    Aug 26, 2014
    Posts:
    1,148
    I have successfully baked small scenes on a GTX 750 Ti with 2 GB of memory before, even at 2K resolution!
    Your hardware looks capable, and if Unity crashes then the problem might be the GPU drivers or something else!
    If you run out of memory, Unity will throw an error rather than crash; it has always notified me when the PC is running out of memory!

    What version of Unity are you using? Could you try a newer version and see if it still happens? I've never experienced crashes on the GTX 750 Ti or GTX 1060, whatever version I'm using!

    P.S. - there is also a folder that caches GPU kernels. You may want to clean up that folder, as the system may be using an older kernel. Go back through the thread and search for the cached kernels folder!
     
  39. illinar

    illinar

    Joined:
    Apr 6, 2011
    Posts:
    863
    It's been crashing in all versions over the past 6 months, I think. It also seems to have gotten worse and crashes with smaller scenes lately, but I'm not sure.

    Is this what you mean regarding the kernel:

    So it's probably not that, if it's supposed to be fixed already.
     
  40. Vagabond_

    Vagabond_

    Joined:
    Aug 26, 2014
    Posts:
    1,148
    So if you are on a Windows PC, go to this folder ->
    C:\Users\...your user name...\AppData\Roaming\NVIDIA\ComputeCache
    delete everything (or back it up if you want), restart, and try again!
     
  41. illinar

    illinar

    Joined:
    Apr 6, 2011
    Posts:
    863
    Thanks. Didn't work, unfortunately. I restarted Unity after deleting the folder; I don't know if you meant the PC instead.
     
  42. KEngelstoft

    KEngelstoft

    Unity Technologies

    Joined:
    Aug 13, 2013
    Posts:
    1,366
    Please report a bug; it should never crash. 2020.1 alpha 9 has lower memory usage, please give that version a try.
     
    illinar likes this.
  43. Ryukra

    Ryukra

    Joined:
    Dec 1, 2017
    Posts:
    3
    The GPU PLM didn't work until I removed all probes from the scene, including the reflection probe. Strange.
    Unity 2019.2.8f
    Default HDRP scene.
    750 Ti with 2 GB VRAM.
     
  44. JamieVRcade

    JamieVRcade

    Joined:
    Oct 21, 2012
    Posts:
    32
    Hey everyone,

    I have been struggling with this problem since the inception of GPU lightmapping in Unity. Today we solved it. Although this might not be a practical solution, I can tell you that it's the only one we found and I have been making crazy good lightmaps since.

    I replaced my 1080 Ti with an AMD Radeon RX 580. This is the only component I have changed. We tried multiple AMD and Nvidia GPUs and this is the ONLY ONE that works.

    The CPU is a 7th generation Intel i7 7700K. DDR4-2200 RAM.

    All of my problems are gone.
     
    Last edited: Oct 12, 2019
    naby-pixiv likes this.
  45. xVergilx

    xVergilx

    Joined:
    Dec 22, 2014
    Posts:
    3,296
    Will there ever be a proper fix for the terrain baking issue?

    The GPU lightmapper always runs out of memory when baking even a medium-sized terrain, e.g. 1.5x1.5 km.
    I get that a 1050 Ti might not cut it anymore, but come on. Shouldn't it be optimized to bake large objects more slowly but consume less VRAM?

    Terrain alone makes the GPU lightmapper completely unusable for me. Please consider fixing this at some closer point in time.
     
  46. Vagabond_

    Vagabond_

    Joined:
    Aug 26, 2014
    Posts:
    1,148
    Windows 10 blocked Unity 2019.2 from using the GPU while trying to bake a small scene (a notification I saw for the first time), and then the editor hung!
     
  47. Vagabond_

    Vagabond_

    Joined:
    Aug 26, 2014
    Posts:
    1,148
    Related to the post above: I reinstalled Windows recently and saw the option to install the "Studio Drivers", which is what I currently have installed.

    However, is either driver pack recommended, or should both work? Unity hangs even if the lightmapping option is disabled and only the environment lighting is baked.

    P.S. The GPU works in UE4 and even works with the realtime ray tracer. Maybe you would like to test these drivers, because they are recommended for users who are not playing games but are using the GPU for work in general!

    Thanks!

    upload_2019-10-23_9-45-27.png

    Unity gets stuck right away after starting the bake!

    upload_2019-10-23_9-46-33.png
     
  48. KEngelstoft

    KEngelstoft

    Unity Technologies

    Joined:
    Aug 13, 2013
    Posts:
    1,366
    Yes, there will be a lower GPU memory footprint terrain baking code path in a future version of Unity, but I don't have an ETA for that yet.
     
  49. KEngelstoft

    KEngelstoft

    Unity Technologies

    Joined:
    Aug 13, 2013
    Posts:
    1,366
    Either driver should be fine; if not, please open a bug report so we can resolve this.
     
  50. xVergilx

    xVergilx

    Joined:
    Dec 22, 2014
    Posts:
    3,296
    Well, lower memory consumption won't matter unless the actual algorithm changes.

    Right now my scene is an estimated 12 GB of VRAM in size, so there's no way the Progressive GPU lightmapper will handle that in a single chunk, lower consumption or not.

    Either Progressive gets chunked baking, or it's a complete bust for me.