Official Progressive GPU Lightmapper preview

Discussion in 'Global Illumination' started by KEngelstoft, Sep 26, 2018.

  1. Rich_A

    Rich_A

    Joined:
    Nov 22, 2016
    Posts:
    338
    Any forecast/confirmation on when this is exiting preview?

    I've tried Bakery, but the quality is just not the same as progressive CPU (2017.4), and the bake times weren't great either. So for a future project, progressive GPU is critical.

    Could you please add that information to the first post in this thread?
     
  2. McDev02

    McDev02

    Joined:
    Nov 22, 2010
    Posts:
    664
    Just encountered this issue here. At first I thought the GPU lightmapper was buggy, but it seems to have issues with scenes at very small scales.
    The normal scale is the real-world scale of a city, but my scene in the images on the right has a scale of 0.006370144, as I am using it for an AR project.

    The CPU lightmapper seems fine with this, but the GPU one isn't. Maybe this can give you a hint and could be improved. For preview purposes the GPU is still fine in this case.

    lightmap.jpg
     
  3. fguinier

    fguinier

    Unity Technologies

    Joined:
    Sep 14, 2015
    Posts:
    146
    Interesting! Thanks for the repro. Can you open a bug with this sample scene, please?
     
  4. Milvaa

    Milvaa

    Joined:
    May 20, 2018
    Posts:
    4
    Hello, I am using a laptop with a GTX 1050 and an Intel 630, so I tried to set Unity to use my 1050 for baking, but I keep getting this error.

    Failed to connect to player ip: D:\Program Files\Unity 2018.3.14f\Editor\Unity.exe -OpenCL-PlatformAndDeviceIndices 1 0
    UnityEditorInternal.ProfilerDriver: DirectIPConnect(String)
    UnityEngine.GUIUtility: ProcessEvent(Int32, IntPtr)

    I've also tried using just -OpenCL-PlatformAndDeviceIndices 1 0, as well as removing the OpenCL.dll file from the Unity Editor folder, but I get the same error. I am sure I am doing something wrong, but I don't know what exactly.

    Unity version: 2018.3.14f1
     
    Last edited: Jun 7, 2019
  5. fguinier

    fguinier

    Unity Technologies

    Joined:
    Sep 14, 2015
    Posts:
    146
    The error above seems to be related to connecting the Unity profiler to a standalone player instance. Can you detail your setup, please?
     
  6. KEngelstoft

    KEngelstoft

    Unity Technologies

    Joined:
    Aug 13, 2013
    Posts:
    1,366
    We are hoping to remove the preview label in 2020.x but it is still too early to say exactly when it is going to happen.
     
    fguinier likes this.
  7. Milvaa

    Milvaa

    Joined:
    May 20, 2018
    Posts:
    4
    Which setup exactly? The project setup or my hardware?

    The project is in Unity 2018.3.14f1; it's a standard 3D project, without HDRP or LWRP. Unity itself does use my GTX 1050, but when I try the Progressive GPU lightmapper it uses my Intel HD 630.

    PS: I have a silly workaround at the moment: whenever I want to bake, I set the Unity Editor to use my Intel GPU so it can bake with my GTX. But I would like to be able to use my GTX for both the editor and baking.
     
    Last edited: Jun 11, 2019
  8. ade72

    ade72

    Joined:
    Mar 20, 2019
    Posts:
    17
    Hello everybody, I am new to this forum. Thank you for the great discussion so far!
    Most of the hints for getting the progressive GPU lightmapper running are for Windows users (like deleting a specific .dll), so I am trying to get it running on macOS. I run a new Mac mini with an internal GPU and an eGPU (Radeon VII). When I choose "Progressive GPU", something strange happens: when I hit "Generate Lighting", Unity starts the bake with the Radeon VII (super fast, and the name of the GPU is shown in the baking window). After 2 or 3 seconds (maybe another pass?) it seems to switch to the (slow) CPU (I guess, because the name of the Radeon disappears and NO other name is shown), and the baking is slower (it needs about 25 seconds for a very, very small test setup). The Console shows the typical errors like "OpenCL Error, falling back to CPU lightmapper", and this: OpenCL Error. Falling back to CPU lightmapper. Error callback from context: [CL_INVALID_KERNEL] : OpenCL Error : clEnqueueNDRangeKernel failed: invalid kernel 0x0.
    I tried to start Unity from the command line, like the hint above, selecting the platform and device number, but this does not work. Any ideas? Thanks in advance!
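
    For reference, the launch I tried looks roughly like this (the exact path depends on your install; the flag is the one quoted in the log below):

    /Applications/Unity.app/Contents/MacOS/Unity -OpenCL-PlatformAndDeviceIndices 0 1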

    Here is the Editor.log; I think the interesting part is:

    -- GPU Progressive lightmapper will use OpenCL device 'AMD Radeon VII Compute Engine' from 'AMD'--
    use -OpenCL-PlatformAndDeviceIndices <platformIdx> <deviceIdx> or -OpenCL-ForceCPU as command line arguments if you want to select a specific adapter for OpenCL.
    OpenCL Error. Falling back to CPU lightmapper. Error callback from context: [CL_INVALID_BUILD_OPTIONS] : OpenCL Error : clBuildProgram failed: Invalid build options "-D APPLE -D KERNEL_INCLUDE_VERSION=201903180 -D USE_SAFE_MATH -cl-std=CL1.2 -I /Users/mypath/Caches und lokale Backups/Unity Editors/2019.2.0b5/Unity.app/Contents/Resources/OpenCL/kernels/"
     
    Last edited: Jun 11, 2019
  9. fguinier

    fguinier

    Unity Technologies

    Joined:
    Sep 14, 2015
    Posts:
    146
    Hi ade72,

    Thanks for the kind words and the detailed, to-the-point log section! This looks like a bug. Can you report it using the editor? It might be driver related, so please be sure to state your hardware and macOS version.
     
  10. fguinier

    fguinier

    Unity Technologies

    Joined:
    Sep 14, 2015
    Posts:
    146
    This is fixed starting in 2019.1 :)
     
  11. Milvaa

    Milvaa

    Joined:
    May 20, 2018
    Posts:
    4
    Thanks a lot for the response! I am halfway through a project, so it's not an option to update Unity atm, but I am glad it's fixed and I will be checking it out later :)
     
    fguinier likes this.
  12. ade72

    ade72

    Joined:
    Mar 20, 2019
    Posts:
    17
    Thank you! I used the editor for a bug report. I found out that on a nearly "empty" scene, baking with the Radeon VII works well. The error above only occurs when I switch on "Contribute Global Illumination" for an object. It must have something to do with lightmaps...
     
  13. sewy

    sewy

    Joined:
    Oct 11, 2015
    Posts:
    150
    Have you tried using the GPU lightmapper with LOD Groups?
    When using the CPU, every LOD bakes well, but the GPU has visible problems at intersections.

    CPU: CPU.JPG

    GPU: GPU.JPG

    EDIT: Double Sided Global Illumination is already checked; if it isn't, the artifacts are colorful rather than black.
     
    ROBYER1 likes this.
  14. KEngelstoft

    KEngelstoft

    Unity Technologies

    Joined:
    Aug 13, 2013
    Posts:
    1,366
    Hi, the GPU lightmapper doesn't support baked LOD groups. This is documented in the first post. We are working on a solution.
     
  15. sewy

    sewy

    Joined:
    Oct 11, 2015
    Posts:
    150
    Hello KEngelstoft,

    Apologies for that; I missed it in the post, and a forum search for "LOD" didn't return any results.
    Do you have an ETA for this, please?
     
  16. Aki-DAI

    Aki-DAI

    Joined:
    Jul 30, 2015
    Posts:
    5
    • Intel Core i7-9700 CPU @ 3.60GHz
    • 32 GB RAM
    • Nvidia GeForce RTX 2060
    • GeForce Experience updated to version 430.86, released on May 27th, 2019
    • Unity 2018.4.1f1, standard built-in render pipeline
    Trying to bake a lightmap with the GPU lightmapper preview, but I got this OpenCL kernel error:

    Unity2018_4_GPUlightmapperFellBack2CPULightmapper.PNG

    The same 3D project used to bake very well with the GPU lightmapper on my older GTX 970 desktop under the same version of Unity, and that old computer also has the latest GeForce driver. Is the RTX 2060 the troublemaker?
     
  17. KEngelstoft

    KEngelstoft

    Unity Technologies

    Joined:
    Aug 13, 2013
    Posts:
    1,366
    Hello @Aki-DAI, I have previously tested the GPU lightmapper on a 2080, and the Turing GPU generation works fine with it. You may have run into a compiler caching problem: Nvidia drivers cache compiled OpenCL kernels to disk in %AppData%\NVIDIA\ComputeCache on Windows. Please close the Editor, delete that folder, and try again. If this doesn't work, I recommend trying the latest 2019.3 alpha release to check whether it works there.
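
    For example, from a Command Prompt with the Editor closed, clearing the cache is a one-liner (a sketch; the path is the one given above):

    rmdir /s /q "%AppData%\NVIDIA\ComputeCache"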
     
  18. Yany

    Yany

    Joined:
    May 24, 2013
    Posts:
    96
    Guys, I plan to build a new computer based on Ryzen, and Unity's GPU lightmapper may affect my choice of GPU. I already own a great Sapphire Vega 64, but I'm considering selling it and buying a Radeon 5700 (XT); afaik, though, the Vega 64 has much more horsepower for this kind of computation than AMD's new-generation GPU. Or should I look at the other side and consider a 2070 (Super)? Could somebody benchmark these cards with a scene and the lightmapper? (I guess it's not an obvious question, but I did not want to go in blind :)). Thanks.
     
  19. optimise

    optimise

    Joined:
    Jan 22, 2014
    Posts:
    2,129
    Hi. I found a critical bug, Case 1170228: both Progressive GPU (Preview) and Progressive CPU bake lighting forever, and the estimated baking time keeps increasing, while Enlighten (Deprecated) bakes the same scene perfectly in a short time. While baking, the ETA keeps increasing and the scene changes from very bright to very dim. It is supposed to bake the lightmap progressively, but there are no further changes after that. It is a simple scene.

    Progressive GPU.png

    Progressive GPU dim.png
     
  20. INGTONY

    INGTONY

    Joined:
    Oct 13, 2014
    Posts:
    24
    I have a strange issue here. Working with Unity 2018.3, I noticed that the GPU load is quite low during baking (2.1%), and the CPU load is low as well (4%). I don't get any overlap errors or CPU fallback, nothing! It just makes me curious: the bake time is in the same range as similar bakes I have done with the CPU, so my question is why it isn't using 100% of either the CPU or the GPU. I'm getting pretty low usage, but it's baking in a decent time. I don't know if some setting or driver is wrong, and maybe I could get better performance and squeeze all the power out of my GPU. My setup is a 6950X and 2 Titan X cards. Any advice?
     
  21. Haze-Games

    Haze-Games

    Joined:
    Mar 1, 2015
    Posts:
    189
    Hi guys, with the latest 2019.2.0f, I'm instantly getting this error after starting a bake with the GPU lightmapper:

    Code (CSharp):
    OpenCL Error. Falling back to CPU lightmapper. Error callback from context: CL_MEM_OBJECT_ALLOCATION_FAILURE error executing CL_COMMAND_NDRANGE_KERNEL on GeForce GTX 1070 (Device 0).
    I've tried all parameters I can think of:
    • Very small lightmap sizes of 32x32
    • With or without denoising
    • Deleted the NVIDIA/ComputeCache folder
    • Latest nVidia Drivers
    • With / without multiple importance sampling
    • With / without prioritize view
    • With 0 and 2 bounces
    There seems to be nothing I can do; I even deleted the Library folder and reimported the entire project.

    Happens on 2 different machines:
    • Machine 1 with GTX 1070 - 16 GB RAM - Win 10 x64
    • Machine 2 with RTX 2080 - 16GB RAM - Win 10 x64
    Please note that GPU baking was working fine on this project on 2019.1.10f1. I'm using the default legacy rendering, Subtractive Lighting Mode for an Android Project. This issue also happens on the PC (Windows) version of the project, using Shadowmask.

    Would there be some additional files / folders I should be deleting when upgrading? I'm using the latest Unity Hub to install new releases.

    Here's some additional info from the Editor.log:
    Code (CSharp):
    (Filename: C:\buildslave\unity\build\Editor/Src/GI/Progressive/OpenCL/OpenCLCheck.cpp Line: 134)

    OpenCL Error: 'GetBuffer(kRRBuf_lightRayIndexToPathRayIndexBuffer).EnqueueClearBuffer(openCLState)' returned -4 (CL_MEM_OBJECT_ALLOCATION_FAILURE)!
    OpenCL Error. Falling back to CPU lightmapper. Error callback from context: CL_MEM_OBJECT_ALLOCATION_FAILURE error executing CL_COMMAND_NDRANGE_KERNEL on GeForce RTX 2080 (Device 0).
    Any help would be appreciated!

    Thanks,
    Charles
     
    Last edited: Jul 30, 2019
  22. Vagabond_

    Vagabond_

    Joined:
    Aug 26, 2014
    Posts:
    1,148
    I don't get the same issue as you, but I was really curious, so I just tried a simple scene, and again hit the old issue telling us to fix objects in a 3D program, and I got an error as well. Pretty disappointing right after the simplest test.

    Oh, and the objects are not actually lightmapped!

    upload_2019-7-30_20-43-41.png
     
    Haze-Games likes this.
  23. ikisarov

    ikisarov

    Joined:
    Mar 17, 2015
    Posts:
    11
    Hello!
    Is anyone here working in architecture or interior design? Please tell me, what are the ideal settings in this window? How many direct and indirect samples do you need?

    Screen.png
     
  24. Haze-Games

    Haze-Games

    Joined:
    Mar 1, 2015
    Posts:
    189
    Well, it largely depends on the scene: what objects you have, spacing, scales, etc. (it seems). Your sample values look very high to me. For example, I usually use:
    • Direct: 512
    • Indirect: 1024
    • Environment: 1024 (not sure; this is a new setting and I can't get the latest version to work, as my post above states)
    • Bounces: 2
    • Lightmap Resolution: 18
    • Lightmap Size: 1024
    • Lightmap Parameters: High Quality
    • Ambient Occlusion: disabled (I cannot see any visual difference, and it takes ages to compute here)
    These provide good quality for me; if I want extra resolution for shadows, I bump the Lightmap Resolution up to 24.

    I hope this helps as a starting point.
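
    A side note for anyone scripting bakes: in later Unity versions (2020.1+), these values map onto the LightingSettings API. Below is a rough, hypothetical sketch; the property names are assumed from that API, so verify them against your version's docs:

    Code (CSharp):
    // Hypothetical sketch for Unity 2020.1+ (editor-only script).
    // Property names assumed from the LightingSettings API; values are
    // the preset from the list above.
    using UnityEditor;
    using UnityEngine;

    public static class BakePreset
    {
        [MenuItem("Tools/Apply Bake Preset")]
        static void Apply()
        {
            var settings = new LightingSettings
            {
                lightmapper = LightingSettings.Lightmapper.ProgressiveGPU,
                directSampleCount = 512,
                indirectSampleCount = 1024,
                environmentSampleCount = 1024,
                maxBounces = 2,
                lightmapResolution = 18f, // texels per unit
                lightmapMaxSize = 1024,
                ao = false                // ambient occlusion off
            };
            Lightmapping.lightingSettings = settings; // applies to the active scene
        }
    }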
     
    Last edited: Jul 30, 2019
    ikisarov likes this.
  25. KEngelstoft

    KEngelstoft

    Unity Technologies

    Joined:
    Aug 13, 2013
    Posts:
    1,366
    It appears that high GPU memory usage is the problem; please confirm this by looking at the 'dedicated GPU memory usage' graph for your 2080 in the Windows Task Manager. The combined usage of the Editor and the lightmapper may exceed the physical amount of memory. Please note the GPU memory usage before starting the bake and the usage once it returns the error.
    One way to alleviate memory usage is to run the Editor on the 1070 and use another dedicated GPU, like your 2080, for baking, using a command line parameter to specify a specific GPU for baking. See the 'How to select a specific GPU for baking' section in the first post of this thread.
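
    For example, a launch that selects the second OpenCL platform's first device could look like this (the install and project paths are hypothetical; -projectPath is the standard Unity command line flag, and the OpenCL flag is the one quoted in the logs earlier in this thread):

    "C:\Program Files\Unity\Editor\Unity.exe" -projectPath "C:\MyProject" -OpenCL-PlatformAndDeviceIndices 1 0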
     
    Haze-Games likes this.
  26. KEngelstoft

    KEngelstoft

    Unity Technologies

    Joined:
    Aug 13, 2013
    Posts:
    1,366
    Please try the recently released 2019.2 version, where CPU and GPU utilization is much better. Assign a dedicated GPU to the GPU lightmapper and another to the Editor for best performance, as described in the post above.
     
  27. KEngelstoft

    KEngelstoft

    Unity Technologies

    Joined:
    Aug 13, 2013
    Posts:
    1,366
    The objects are lightmapped; in your screenshot, the sphere is set to 'Contribute Global Illumination'.
    Have a look here for how to fix the overlap: https://docs.unity3d.com/Manual/ProgressiveLightmapper-UVOverlap.html
     
  28. salex1

    salex1

    Joined:
    Jul 9, 2015
    Posts:
    29
    I know you are working hard on this, but I have also noticed that lightmapping tends to have these issues, and it worries me. For example, in one project with ordinary Unity primitives (cube, cylinder, sphere) I got a huge number of lightmap overlap errors! Shouldn't these built-in Unity models be properly lit automatically by default? I mean, they are simple objects! Suppose we use those simple objects for prototyping, with different shapes like cubes: a bunch of overlap errors will appear, and there's no way each of these can be adjusted manually. What about slightly more complex geometry? Meanwhile, Enlighten doesn't have these problems at all!
     
  29. fct509

    fct509

    Joined:
    Aug 15, 2018
    Posts:
    108
    Hi, I'm working on a Unity Editor extension that makes use of compute shaders, and I was wondering how the Unity team managed to get the GPU lightmapper to run without freezing the main thread.

    I'm currently using Editor Coroutines with a "yield return null" between dispatch calls, but the entire Unity Editor freezes up if I issue too many dispatch calls this way. I was hoping to dispatch from a background thread, but the Unity API doesn't allow that.
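
    A minimal sketch of the pattern I mean (assumes the Editor Coroutines package, Unity.EditorCoroutines.Editor; the shader asset, kernel name, and uniform are placeholders):

    Code (CSharp):
    // Editor-only sketch: batch compute dispatches inside an editor
    // coroutine, yielding between batches so the Editor can pump events.
    using System.Collections;
    using Unity.EditorCoroutines.Editor;
    using UnityEngine;

    public static class BatchedDispatch
    {
        public static void Run(ComputeShader shader, int totalGroups)
        {
            EditorCoroutineUtility.StartCoroutineOwnerless(Dispatch(shader, totalGroups));
        }

        static IEnumerator Dispatch(ComputeShader shader, int totalGroups)
        {
            int kernel = shader.FindKernel("CSMain"); // placeholder kernel name
            const int groupsPerBatch = 16;            // small batches keep the Editor responsive
            for (int offset = 0; offset < totalGroups; offset += groupsPerBatch)
            {
                int count = Mathf.Min(groupsPerBatch, totalGroups - offset);
                shader.SetInt("_GroupOffset", offset); // placeholder uniform
                shader.Dispatch(kernel, count, 1, 1);
                yield return null;                     // give the Editor a frame between batches
            }
        }
    }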

    As a team that has managed to work around this problem, do you have any advice?

    Thanks,
    -Francisco
     
  30. Haze-Games

    Haze-Games

    Joined:
    Mar 1, 2015
    Posts:
    189
    Hi,
    Thanks for your reply. I currently uninstalled the latest version because of this issue and reverted back. I will take some time to re-upgrade next week to come back to you with these values and try what you suggested.

    However there must be a technical issue regarding memory consumption, as it works perfectly before this latest version with a single RTX 2080 with all scenes of my project. This issue is present even on a simple test scene with 2 small objects and 32x32 size and 10 samples and low quality preset.

    I assume I should be able to bake safely with these low settings using a 8GB GDDR6 RTX 2080.

    I'll have a look at the consumption and get back to you.

    Thanks,
    Charles
     
  31. KEngelstoft

    KEngelstoft

    Unity Technologies

    Joined:
    Aug 13, 2013
    Posts:
    1,366
    We are dispatching calls to OpenCL from a separate thread; the main thread is not doing any OpenCL work at all.
     
  32. KEngelstoft

    KEngelstoft

    Unity Technologies

    Joined:
    Aug 13, 2013
    Posts:
    1,366
    Yes, that should work just fine. Please file a bug report so we can investigate this issue and fix it.
     
  33. rasmusn

    rasmusn

    Unity Technologies

    Joined:
    Nov 23, 2017
    Posts:
    103
    I agree with you that this is not acceptable. However, the solution is unfortunately not trivial; it's far more multi-faceted than it seems at first glance. There are several intertwined sub-issues at play. Please see my posts here and here where I discuss some of them.

    We are aware of the problems and are addressing them as we speak, but we have to handle one issue at a time. I am currently finalizing my work on an improvement to the Mesh Importer that will make it easy to generate lightmap UVs with no texel bleeding (see the first of the posts linked above for more detail). The existing Mesh Importer setting "Pack Margin" can be a pain to use, and this change will greatly improve on that.

    We have not yet decided which particular packing issue is up next, but rest assured that we will continue our work in this area.
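
    In the meantime, one possible stopgap is to raise the pack margin in bulk from script. A hypothetical sketch using the existing ModelImporter API (the margin value is just an example):

    Code (CSharp):
    // Hypothetical sketch (editor-only): force lightmap-UV generation with
    // a larger pack margin for every imported model.
    using UnityEditor;

    class LightmapUVPostprocessor : AssetPostprocessor
    {
        void OnPreprocessModel()
        {
            var importer = (ModelImporter)assetImporter;
            importer.generateSecondaryUV = true;  // generate lightmap UVs on import
            importer.secondaryUVPackMargin = 8f;  // the "Pack Margin" setting mentioned above
        }
    }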
     
    Lars-Steenhoff and KEngelstoft like this.
  34. fct509

    fct509

    Joined:
    Aug 15, 2018
    Posts:
    108
    Well, that's a shame; I was hoping for a way to use compute shaders, since I already know how to do that. I guess I'm going to be looking into OpenCL.

    Thanks for the info,
    -Francisco
     
  35. salex1

    salex1

    Joined:
    Jul 9, 2015
    Posts:
    29
    It is good to hear that you are working on it. I think these things are very important, especially for the future of GPU lightmapping! I also see a lot of nice new features coming...
     
  36. Stygian

    Stygian

    Joined:
    Aug 12, 2016
    Posts:
    3
    Hey everybody! I've had loads of issues trying to bake lighting for our game with any lightmapping system in Unity, but recently the GPU lightmapper at lower settings seemed like a godsend, until we started trying to lightmap entire levels: CL memory crashes reverting to CPU baking, blotchy shadows in the darker areas of the environments, you name it. Following this thread, I decided to pick up a Radeon GPU to try to make use of more of the card's memory, and hopefully bake a level without having to babysit and min/max the settings in case of failure. My dev machine is optimal, so we don't have to go over that: 6700K, 32 GB RAM, Windows 10 with the latest patches, liquid cooling, no overclocking, a Founders Edition 1080 Ti, and a newly purchased Radeon 5700 XT.
    Tested on 2019.1 (both versions) and 2019.2.
    Here's the current problem:

    The Radeon 5700 XT does not work with GPU lightmapping. What?!

    I have a fresh install of Windows 10 and a fresh install of the latest Radeon drivers, and I am unable to get the GPU lightmapper to work. Unity recognizes the Radeon 5700 XT as gfx1010 in the Editor log, and at bake time I repeatedly get CL_INVALID errors before it reverts to CPU baking (aka "100 hours for half a level"). I've also tried running my 1080 Ti as the bake device using the command line settings, but it still uses gfx1010 as the baking card. Glad I didn't buy a VII, but I need as much baking memory as possible. Any ideas?
     
    Last edited: Aug 4, 2019
  37. Alimohammadlo

    Alimohammadlo

    Joined:
    Jun 20, 2017
    Posts:
    16
    Please fix the lightmap baking algorithm; it has problems everywhere. I can't use realtime GI, and when I go back to a level, Unity crashes. Please fix it, and please rework the lightmap window and algorithm, please...
     
  38. Alimohammadlo

    Alimohammadlo

    Joined:
    Jun 20, 2017
    Posts:
    16
    The Unity lightmap system is the problem!!! :(:mad:
     
  39. iamthwee

    iamthwee

    Joined:
    Nov 27, 2015
    Posts:
    2,149
    Isn't GPU baking limited by the VRAM on the graphics card? So if your scene has insane geometry, only the CPU would work. Much like big Hollywood studios only using CPU farms to render graphics?
     
  40. Stygian

    Stygian

    Joined:
    Aug 12, 2016
    Posts:
    3
    Not insane geometry. Our levels are VERY reasonable, if not downright minimalistic, when it comes to detail. This has been a fundamental issue for years baking with Enlighten, and it still feels the same with Progressive CPU (if not actually slower than Enlighten). For a reasonable production turnaround time, none of the current solutions offered by Unity seem to allow for a shippable-quality product when there is next to zero memory management while baking a large level. Why must the lightmapper try to load ALL lightmap-contributing geometry at the same time when calculating? I can bake small sections of the level if the others are hidden; this seems like a simple divide-and-conquer scenario, and yet I can only try to make an entire level bake within a 2.75 GB budget, or bake one small segment of a level for a screenshot.

    Also, iamthwee: it's limited to 25% of the total VRAM, which is roughly 2.75 GB for me; I'm baking with a 1080 Ti and 11 GB of VRAM. For the record, I've worked on "Hollywood style" productions, and we used Octane, a GPU rendering system, running on computers FILLED with 1080 Ti cards.

    Other members of my team cannot attempt to bake the same settings as I can merely because they have NVIDIA cards with 8 GB or less total VRAM. I picked up an AMD card with 8 GB, the 5700, but apparently it is incompatible with the GPU lightmapper.
     
    Last edited: Aug 5, 2019
  41. KEngelstoft

    KEngelstoft

    Unity Technologies

    Joined:
    Aug 13, 2013
    Posts:
    1,366
    Please open a bug report so we can take a look at the error log. The 5700 card should work just fine but you might have hit a driver issue.
     
  42. Stygian

    Stygian

    Joined:
    Aug 12, 2016
    Posts:
    3
    This would unfortunately require me to buy the card again and test it. I returned it after testing with fresh installs of Windows, Radeon drivers, and Unity. I'm having a difficult enough time trying to get other team members with less memory on their NVIDIA cards to be able to bake levels without hitting the memory limitations. My 1080 Ti is the only card with enough memory overhead to bake and denoise our levels, and it took hours of experimentation to find the right combination.
     
  43. Alimohammadlo

    Alimohammadlo

    Joined:
    Jun 20, 2017
    Posts:
    16
    I am using the CPU because my CPU is powerful and my GPU is poor: my GPU is a GTX 1050 Ti, and my processor is a Core i7 9700K.
     
  44. Alimohammadlo

    Alimohammadlo

    Joined:
    Jun 20, 2017
    Posts:
    16
    I am using realtime GI. After 3 hours it gets to level 7/11, at 35 jobs, and then Unity is gone (it crashes). How do I fix this? My level uses LOD Groups, and I checked everything: the UV lightmap geometry is at 0.2 texels per mesh, the UV normalize checkbox is enabled, the lightmaps are static, and the resolution is set to 0.2. My PC configuration: a Core i7 9700K CPU, 16 GB of DDR4 3200 MHz RAM, a 240 GB SSD for projects, a 115 GB SSD for Windows, a 10 TB HDD, and a GTX 1050 Ti GPU.
     
  45. deltamish

    deltamish

    Joined:
    Nov 1, 2012
    Posts:
    58
    The GPU Progressive lightmapper is crashing and reverting back to the CPU lightmapper. I am rendering at default settings with the Optix denoiser at 4096 resolution, running on an RTX 2060. Any idea why this might be happening?

    The same happened with my old card and with Unity 2018.x.
     
  46. Vagabond_

    Vagabond_

    Joined:
    Aug 26, 2014
    Posts:
    1,148
    You are most likely running out of memory. I doubt 6 GB of GPU memory is enough to render 4K lightmaps, as a lot of data is already loaded into memory (such as the scene representation), and the data needed for denoising is also large. Try baking at 1K or 2K.

    I think 4K is only suitable for high-end GPUs with at least 8 GB of memory, and even 8 GB might not be enough depending on the scene complexity!
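
    Back-of-the-envelope (assuming float32 RGBA storage, which may differ per pipeline): a single 4096×4096 buffer is 4096 × 4096 × 4 channels × 4 bytes, or about 268 MB, and the baker keeps several such buffers resident on top of the scene representation, so 4K targets add up fast.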
     
  47. Rich_A

    Rich_A

    Joined:
    Nov 22, 2016
    Posts:
    338
    From reading these posts, it's not looking promising that we will have a reliable GPU lightmapper, ready for production, in 2019.4 LTS... I was looking forward to dumping a lot of my graphical assets (Bakery, etc.) and relying more on core Unity features and HDRP.

    I think the Unity version system needs to be reviewed. Why should an LTS release fall arbitrarily at (roughly) the end of the year? It should occur once every 12-18 months, when the core software is stable. Instead we see things like Nested Prefabs (ostensibly 'released' in 2018.3) still getting fundamental functionality changes and improvements (with an impact on actual production workflow) all the way into 2019...

    I think x.1 and x.2 versions, followed by an x.3 LTS version, over an approximately 18 month cycle would be the way to go. The idea would be that x.1 has features or changes which breaks old projects, but still presents an upgrade path to x.3, x.2 has major new features that need testing, and x.3 is released when everything is ready and the Unity team is comfortable supporting it (and backporting improvements and fixes!) for two years.
     
    Hypertectonic and Vagabond_ like this.
  48. Vagabond_

    Vagabond_

    Joined:
    Aug 26, 2014
    Posts:
    1,148
    This is exactly what I was thinking as well; I totally agree!
    Otherwise it all feels constantly broken.
    New features should be added up to x.2, and then everything should be optimized, fixed, and improved, with no new features added in x.3!
     
    Last edited: Aug 9, 2019
    Rich_A likes this.
  49. Vagabond_

    Vagabond_

    Joined:
    Aug 26, 2014
    Posts:
    1,148
    When OpenImageDenoiser is selected, the lightmaps are not updated and just stay black.

    Also, some objects could not be baked correctly on the GPU.

    With the CPU lightmapper (prioritize view), a custom procedural mesh baked correctly, and all the containers baked correctly as well.

    upload_2019-8-9_15-55-6.png

    The same test switched to the GPU lightmapper (GTX 1060, 6 GB memory): the procedural mesh cannot be baked, and the containers don't look correct either. I do not know why!

    upload_2019-8-9_16-4-55.png
     
  50. KEngelstoft

    KEngelstoft

    Unity Technologies

    Joined:
    Aug 13, 2013
    Posts:
    1,366
    Please try the GPU lightmapper in the latest 2019.3 alpha and let me know if it works there (that version has submesh support). If it still doesn't work, please file a bug report so we can fix it. Thanks!
     
  51. Kichang-Kim

    Kichang-Kim

    Joined:
    Oct 19, 2010
    Posts:
    1,011
    Hi, is there any information about baked LODs? The first page doesn't give any ETA for baked LODs. I think this is a key feature of the progressive lightmapper, and I want to know whether it will be available in the 2019.x cycle (I hope).
     
    sewy likes this.