Official Progressive GPU Lightmapper preview

Discussion in 'Global Illumination' started by KEngelstoft, Sep 26, 2018.

  1. mircea_02

    mircea_02

    Joined:
    Jun 18, 2015
    Posts:
    3
    So, I bought a second-hand RX 580 with 4 GB of memory for $90, to try this improvement that the whole planet has been expecting for some years now (quote: "Revolutionizing render times and workflows for realistic light effects was one of the dominant themes at GDC 2018")

    First try: run! Baking time is greatly improved, quality is good!
    Second try: Unity crash.
    Third try: system crash.
    System restart, render again:
    Fourth attempt: ...... nothing happens!
    System restart again, clear the previous bake, render again:
    Fifth try: going again.
    It looks like, in between 5 or 6 system crashes, you can try experimenting with some variables.
    I look forward to the next functional version (the Unity 2019 generation).

    However, I have a question:
    How much would it cost Unity to work with the creator of "Bakery" (found on the Asset Store)?
    I mean... I do not know, but it seems to me that what this asset has done is at least 5-6 years ahead of Otoy/Octane, Unity/Radeon Rays, etc.
    In fact, what "Bakery" has done is so revolutionary, so perfectly functional, and so advanced as technology compared to everything that could be produced for many years from now!
    I think a letter to its development team (as I understand it, one guy from Moscow) would have been enough, and I honestly believe he would have appreciated it and helped Unity.


    Meanwhile, I've reinstalled my old GTX 780 with NVIDIA drivers, and I use "Bakery" for perfect (quality and light-speed) baking:
    scenes of 70,000-300,000 vertices, high-resolution maps, in 7-8 minutes at most, with a wonderfully simple "bonus" solution: everything you bake can be prefabbed into reusable models to put into other instances, other scenes, or other projects.
     
    xVergilx and Lars-Steenhoff like this.
  2. Lars-Steenhoff

    Lars-Steenhoff

    Joined:
    Aug 7, 2007
    Posts:
    3,527
    Yes, Bakery is great. I believe development goes faster outside of the Unity development environment because Bakery does not have to go through Unity's internal QA; the Unity forums are the QA, and all the feedback is integrated directly.

    But I'm very happy Unity is making a GPU lightmapper, because it supports Mac; Bakery is PC-only for now.
     
  3. ApexofReality

    ApexofReality

    Joined:
    Feb 14, 2016
    Posts:
    102
    Do you guys think 16 GB is overkill? The new Vega VIIs are coming out next month and I might consider getting one. Also, do the tensor cores in the RTX 2080 etc. help performance at all? Do you think buying an RTX 2080 right now is a good idea, or is it better to wait for the Vega VII?
     
  4. Lars-Steenhoff

    Lars-Steenhoff

    Joined:
    Aug 7, 2007
    Posts:
    3,527
    You know, there is no overkill in terms of memory; the more the better. :)
     
    ApexofReality likes this.
  5. ApexofReality

    ApexofReality

    Joined:
    Feb 14, 2016
    Posts:
    102
    Great! Many may be disappointed by the Vega VII card, but I sure am excited about all those compute units and that massive amount of DRAM!
     
  6. screenname_taken

    screenname_taken

    Joined:
    Apr 8, 2013
    Posts:
    663
    I think that to use the tensor cores directly, compatibility needs to be written into the software?
    Unless I'm mistaken.
     
  7. cmorait

    cmorait

    Joined:
    Nov 22, 2017
    Posts:
    59
    Same here with my Asus ROG 702ZC gaming laptop, which has an AMD Ryzen CPU and an RX 580.

    I will wait for the newer version of the light mapper.
     
  8. cmorait

    cmorait

    Joined:
    Nov 22, 2017
    Posts:
    59

    Thanks.
     
  9. KEngelstoft

    KEngelstoft

    Unity Technologies

    Joined:
    Aug 13, 2013
    Posts:
    1,366
    We are improving memory usage of the GPU lightmapper but while it is in preview the 16 GB Vega VII or another card with 16 GB is needed to bake a 4K lightmap. A card with 11 GB will run out of memory in 2019.1 when baking a 4K lightmap.
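    A rough back-of-envelope calculation (illustrative numbers only, not the lightmapper's actual internals) shows why 4K lightmaps are so demanding: a single float32 RGBA buffer at 4096x4096 is already 256 MB, and a progressive bake keeps several working buffers of that size on top of the scene's mesh and texture data.

```python
# Back-of-envelope VRAM estimate for a GPU lightmap bake.
# The buffer count and texel format are assumptions for illustration,
# not Unity's actual internals.

def lightmap_buffer_mb(resolution, channels=4, bytes_per_channel=4):
    """Size in MB of one float32 RGBA buffer at a square resolution."""
    return resolution * resolution * channels * bytes_per_channel / (1024 ** 2)

def estimated_bake_mb(resolution, num_buffers=8):
    """Total for several working buffers (direct, indirect, AO, ...)."""
    return num_buffers * lightmap_buffer_mb(resolution)

for res in (1024, 2048, 4096):
    print(f"{res}: one buffer {lightmap_buffer_mb(res):.0f} MB, "
          f"~{estimated_bake_mb(res):.0f} MB for 8 working buffers")
```

    Going from 2K to 4K quadruples every buffer, which is part of why an 11 GB card can run out of memory where a 16 GB card survives.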
     
  10. KEngelstoft

    KEngelstoft

    Unity Technologies

    Joined:
    Aug 13, 2013
    Posts:
    1,366
    The Optix denoiser can use the tensor cores. The rest of the lightmapper is not using them at the moment.
     
  11. Lyje

    Lyje

    Joined:
    Mar 24, 2013
    Posts:
    169
    @KEngelstoft A few questions:
    Is it still the case that NVIDIA cards won't use system RAM but AMD cards will if necessary?
    If so, and I know it's an NVIDIA driver thing, is it a limitation of their OpenCL 1.2 implementation, and does that limitation go away with OpenCL 2.0? Does the lightmapper use 2.0 if available?
    Thanks!
     
  12. Lune

    Lune

    Joined:
    Oct 20, 2012
    Posts:
    62
    I am working on a laptop and Unity is not finding the NVIDIA GTX 1060; this is in Unity 2018.0.3.0f2, according to the OpenCL log.
     
  13. Vagabond_

    Vagabond_

    Joined:
    Aug 26, 2014
    Posts:
    1,148
    As mentioned a few times in the thread above, you have to go to the Unity install directory and remove the OpenCL.dll file. This step is no longer needed in 2019.x, and if I am not mistaken this fix should be ported to Unity 2018.3, but it is not there yet.
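    In case it helps, the workaround can be scripted. This is just a sketch (the install path is an assumption; adjust it to your machine), and it renames the DLL instead of deleting it so the step is reversible:

```python
# Sketch of the Unity 2018.3 OpenCL.dll workaround (Windows).
# Renames the bundled OpenCL.dll so Unity picks up the GPU driver's
# own OpenCL runtime instead. Reversible: rename the .bak file back.
from pathlib import Path

def disable_bundled_opencl(editor_dir):
    """Rename <editor_dir>/OpenCL.dll to OpenCL.dll.bak.
    Returns True if a file was renamed, False if none was found."""
    dll = Path(editor_dir) / "OpenCL.dll"
    if dll.exists():
        dll.rename(dll.with_name(dll.name + ".bak"))
        return True
    return False

# Hypothetical install location - adjust to your Unity version and path:
# disable_bundled_opencl(r"C:\Program Files\Unity\Editor")
```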
     
  14. KEngelstoft

    KEngelstoft

    Unity Technologies

    Joined:
    Aug 13, 2013
    Posts:
    1,366
    Whether a GPU can swap to system memory depends on both hardware and driver support and this is outside of our control. Rest assured we are working on decreasing the memory usage of the GPU lightmapper.
    We are using OpenCL 1.2 at the moment because that is the version supported on macOS. We will eventually move away from OpenCL so we have no plans on using the OpenCL 2.0 feature set.
     
    Lyje likes this.
  15. Vagabond_

    Vagabond_

    Joined:
    Aug 26, 2014
    Posts:
    1,148
    Really, just out of curiosity, what other API could you rely on that supports multiple platforms and hardware vendors?
     
  16. Jesper-Mortensen

    Jesper-Mortensen

    Unity Technologies

    Joined:
    Mar 15, 2013
    Posts:
    232
    It will be a set of APIs for ray tracing services and vendor agnostic compute for the kernels.
     
    Vagabond_ likes this.
  17. Jesper-Mortensen

    Jesper-Mortensen

    Unity Technologies

    Joined:
    Mar 15, 2013
    Posts:
    232
    I'm personally a big fan of Bakery. It's the perfect example of an Asset Store package, as it addresses a slice of our users. I believe it is built on top of a mature rendering API called OptiX, which also helps with development velocity; thus it only runs on NVIDIA devices. We need to serve all our users, so we are going in a different direction.
     
  18. Mac92

    Mac92

    Joined:
    Jan 18, 2019
    Posts:
    3
    Hello!

    I have the following issue: I'm baking a bunch of scenes with the GPU baker in Unity 2018.3 on a GTX 1070 graphics card, and everything bakes nicely (I've deleted the OpenCL.dll as suggested here and it works, thanks a bunch!).

    Anyway, all of my scenes bake okay, but one scene refuses to bake. It actually will bake, but it takes 10 hours even at a small sample size, and the bake result is all broken up. If I use the CPU baker on that scene it works correctly. I've deleted the scene and copied in two of my other working scenes, and it was still broken. Then I copied and re-exported my Blender file a bunch of times, and it's still broken.

    I'm guessing the problem is either a random baker bug or the Blender file, because when I export any other Blender file it bakes correctly. What do you think the problem with the Blender file and the bake could be? I'm exporting FBX into Unity, and I can't share the file since it's a project I'm working on under an NDA.
     
  19. screenname_taken

    screenname_taken

    Joined:
    Apr 8, 2013
    Posts:
    663
    Yeah, it still needs fixing. There is a scene of mine for which the CPU lightmapper makes different lightmaps (ones that are more correct than the GPU results).
     
  20. cmorait

    cmorait

    Joined:
    Nov 22, 2017
    Posts:
    59
    Do you recommend buying a Vega VII for a desktop build for GPU light baking instead of a Ti, for the reason you describe above?

    Will you fix that problem in the final release of the GPU lightmapper? I am asking because the Ti is faster than the Vega VII and also supports other raytracing applications that are CUDA-based.
     
  21. ApexofReality

    ApexofReality

    Joined:
    Feb 14, 2016
    Posts:
    102
    Yes, Unity will fix the problem with the VRAM. At the moment, the Vega VII is a good choice.
     
  22. cmorait

    cmorait

    Joined:
    Nov 22, 2017
    Posts:
    59
    I have seen strange behavior with the GPU lightmapper. The first time I bake, it utilizes my RX 580 at maximum speed (1077 MHz). If I stop the bake and start it again, it drops to 300, 600, or 910 MHz. I just tried a large archviz project to test the speed difference. To avoid the problem I have to reboot the PC and then bake again. I noticed the problem because a specific scene seemed to take longer than usual to finish baking.

    I hope the GPU lightmapper will be developed at a faster pace, as going back to CPU baking is a pain. Even if you have a 32-core CPU, light baking on the GPU is by far superior in terms of baking performance.

    One scene that took 18.5 minutes on an RX 580 took more than 5 hours on a Ryzen 7 1700. That is huge.

    I hope the next release is available soon, as the problem that the GPU lightmapper cannot bake a scene when light probes are enabled is very limiting for our workflows.
     
    Last edited: Jan 20, 2019
  23. KEngelstoft

    KEngelstoft

    Unity Technologies

    Joined:
    Aug 13, 2013
    Posts:
    1,366
    The crash when probes are in the scene has been fixed and backported to 2018.3.2f1 (released now). Get it here: https://unity3d.com/get-unity/update.

    As for the drop in clock speed, are you sure GPU temperature is still within a reasonable range and that the editor window has focus while baking the second time? Does restarting Unity give the same speed increase as rebooting the machine?
     
  24. KEngelstoft

    KEngelstoft

    Unity Technologies

    Joined:
    Aug 13, 2013
    Posts:
    1,366
    Please file a bug report with the problematic scene or a comparable asset that reproduces the problem, if at all possible. This will make it a lot easier for us to fix. Please also try the latest 2019.x alpha to see if this has been fixed already by the numerous improvements we have made since 2018.3.
    If the CPU backend doesn't reproduce the problem, you have found a GPU-lightmapper-specific bug, and that is something we would like to fix. I don't think this is related to Blender.
     
  25. KEngelstoft

    KEngelstoft

    Unity Technologies

    Joined:
    Aug 13, 2013
    Posts:
    1,366
    Please report this so we can fix it.
     
  26. cmorait

    cmorait

    Joined:
    Nov 22, 2017
    Posts:
    59
    The temperature maxes out at 65 °C. That is very strange behavior. If I close the editor and fire it up again, the same thing happens: only 300 MHz. I will investigate more.

    I would also like to ask what you recommend for the Radeon GPU workload type. There are two modes: one is graphics, and the other is compute, which is supposed to be used for mining and other compute applications.

     
  27. bodzi0x95

    bodzi0x95

    Joined:
    May 16, 2018
    Posts:
    12
    Hi, I have a problem with the lightmapper in 2018.3. In Unity 2018.1.9f the Progressive Lightmapper baked lightmaps in at most 10 minutes on a big scene. After the Unity update that changed to 16 hours... I updated my graphics drivers and now "preparing bake" lasts forever with both the Progressive CPU and GPU lightmappers.

    I tried deleting the OpenCL file, with no result.
     
  28. KEngelstoft

    KEngelstoft

    Unity Technologies

    Joined:
    Aug 13, 2013
    Posts:
    1,366
    You shouldn't have to change the GPU workload type.
     
  29. Mac92

    Mac92

    Joined:
    Jan 18, 2019
    Posts:
    3
    Thanks for the answers. I tried with the 2019 beta build, but I'm getting the same broken result. Then I tried a lot of different things, none of which worked. The last thing I tried did work: I started deleting parts of the model until I found what caused the problem.



    This is the part of the model causing problems: it has a bunch of modifiers on it (array/mirror, some weighted bevels, etc.). As you can see in the picture, the blue selected part arrays and mirrors around the center.



    If I select these windows and isolate them as a new mesh, everything bakes correctly.

    Basically, when those three windows are removed from that mesh into a new object/mesh, the GPU baker starts working. Before that, only the CPU baker worked correctly.

    I've attached the Blender file here where everything is a single mesh. That one, if exported to FBX, doesn't work.

    I've also attached 2 FBX files. One of them has everything joined and is named "thisdoesntwork". The other has only the windows isolated and is named "thisworks".

    I hope that helps.
     

    Attached Files:

  30. fguinier

    fguinier

    Unity Technologies

    Joined:
    Sep 14, 2015
    Posts:
    146
    Thanks svami, amazing breakdown of the problem. Can I ask that you open a bug report (you can link to the forum)? This way it will be linked to you directly and you will be notified when we resolve it.
     
  31. Mac92

    Mac92

    Joined:
    Jan 18, 2019
    Posts:
    3
    I've opened the bug report, thanks for the help and good luck!
     
    Last edited: Jan 21, 2019
  32. screenname_taken

    screenname_taken

    Joined:
    Apr 8, 2013
    Posts:
    663
    Will do. I was just thinking that the difference might just come down to something that the GPU lightmapper doesn't support just yet.
     
  33. Omzy

    Omzy

    Joined:
    Jun 28, 2013
    Posts:
    31
    I've been trying to get the lightmapper to work in Unity 2019.1.0a14. This is on my desktop, a pretty new machine with a 1070 (8 GB VRAM). It works on a tiny test scene, but on a larger, complex scene I'm getting:

    OpenCL Error. Falling back to CPU lightmapper. Error callback from context: CL_MEM_OBJECT_ALLOCATION_FAILURE

    There is no OpenCL.dll that I can see in this version; it looks like it was renamed to OpenRL.dll? Even so, renaming that file to something else has no effect.
     
    M_R_M likes this.
  34. Vagabond_

    Vagabond_

    Joined:
    Aug 26, 2014
    Posts:
    1,148
    First of all, in Unity 2019 you do not have to touch any OpenCL.dll files. That is only for 2018.3, and only while no fix is provided yet.

    So, baking a scene can fill your GPU memory very fast; you are obviously running out of GPU memory. Note that the scene itself takes memory to store all the mesh and texture data plus all the rendering pipeline stuff, and the GPU lightmapper also uses a lot of memory to store all the data needed for its calculations.

    The lightmapping team has noted that it is working on improving the memory management, but it all also depends on the texture resolutions your objects use, how many vertices you are baking in total, etc., and most important may be the lightmap resolution and scale. For example, baking 2K or 4K lightmaps on a big scene will definitely fill your GPU memory quickly.

    So try starting with a lightmap scale of 1 and increasing in steps of 1, and be sure to keep your lightmaps at a resolution of 1K for a big scene. You cannot just bake a big scene at a scale of 40 with 2K lightmaps!

    P.S. OpenRL.dll is a different file, I think for use with the CPU Progressive Lightmapper. You do not have to touch or rename this file; otherwise your CPU lightmapper will probably fail!
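    To get a feel for how fast this grows, here is a toy texel-budget calculation (all numbers made up for illustration): total texels scale with surface area times the square of the texels-per-unit setting, so doubling the resolution quadruples the memory.

```python
import math

# Toy texel-budget estimate: how many lightmap atlases a bake needs.
# Surface area and packing efficiency are made-up illustrative numbers.

def lightmaps_needed(surface_area, texels_per_unit,
                     lightmap_size=1024, packing_efficiency=0.7):
    """Approximate number of atlases of lightmap_size^2 texels needed."""
    total_texels = surface_area * texels_per_unit ** 2
    texels_per_map = lightmap_size ** 2 * packing_efficiency
    return math.ceil(total_texels / texels_per_map)

# Doubling texels-per-unit roughly quadruples the atlas count:
for tpu in (10, 20, 40):
    print(tpu, lightmaps_needed(2000, tpu))
```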
     
  35. AcidArrow

    AcidArrow

    Joined:
    May 20, 2010
    Posts:
    11,793
    Is having the editor window in focus a requirement? I'm guessing it has more to do with the OS and the drivers than with Unity itself, but I hope I can still browse a web page or two while baking in Unity.
     
  36. screenname_taken

    screenname_taken

    Joined:
    Apr 8, 2013
    Posts:
    663
    You don't have to have it in focus; that wouldn't make sense. I did a bake while doing something else.
     
  37. Vagabond_

    Vagabond_

    Joined:
    Aug 26, 2014
    Posts:
    1,148
    For me, when the editor is not in focus it updates at lower rates, and I noticed that the GPU lightmapper is slower (I have not checked that recently, though). I had a similar issue with a GPU lightmapper of my own, which subscribed to the editor update method in order to advance the baking process; when out of focus, the update method does not run at 60 fps (the value may differ, but it should be constant, at least from what I remember from writing that lightmapping plugin) the way it does when in focus.
     
    Last edited: Jan 22, 2019
  38. screenname_taken

    screenname_taken

    Joined:
    Apr 8, 2013
    Posts:
    663
    Wow. I haven't noticed that myself, but that sounds wrong.
     
  39. Lars-Steenhoff

    Lars-Steenhoff

    Joined:
    Aug 7, 2007
    Posts:
    3,527
    It could be outside of Unity's control; the OS might give fewer resources to apps that are not in focus.
     
  40. MAK11

    MAK11

    Joined:
    Nov 24, 2017
    Posts:
    11
    Isn't that related to NVIDIA GPUs? They have a hard time rasterizing when their CUDA cores are fully loaded with compute work, especially OpenCL (for example, in DCC software, when rendering on the GPU the viewport drops below 5 fps, while Radeon GPUs manage to balance things better in those cases).
     
  41. Vagabond_

    Vagabond_

    Joined:
    Aug 26, 2014
    Posts:
    1,148
    I'm not seeing this behavior with Blender, for example. It plays animations and renders with the GPU at the same speed whether it is in focus or not!

    You can create an EditorWindow script and add the Update() method. When not in focus, the editor window seems to run at around 10 fps (updated 10 times per second), and when it is in focus it is much faster; it should be at 60 fps.

    I just did a test bake using the GPU. When the editor is in focus it takes 55 seconds in total, and when it is not in focus it takes 110 seconds.

    It simply turns out that you will have to wait longer for a bake to finish if you are browsing the web or doing something else, even a light task like editing text documents in another application!
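    A clean 2x factor like that is consistent with a tick-fed pipeline: if each editor update submits one batch of work and the GPU finishes it before the next tick arrives, the wall time is set by the tick rate rather than the GPU. A toy model (all numbers invented to illustrate the mechanism, not measured from Unity):

```python
# Toy model of a tick-fed GPU bake: one batch of rays is submitted per
# editor update; if the GPU drains the batch before the next tick, it
# sits idle. All numbers are invented for illustration.

def bake_seconds(total_batches, ticks_per_second, gpu_batch_seconds=0.01):
    """Wall time when exactly one batch is submitted per editor tick."""
    tick_interval = 1.0 / ticks_per_second
    # The slower of "next tick arrives" and "GPU finishes" gates progress.
    per_batch = max(tick_interval, gpu_batch_seconds)
    return total_batches * per_batch

in_focus = bake_seconds(1100, ticks_per_second=20)      # editor in focus
out_of_focus = bake_seconds(1100, ticks_per_second=10)  # editor in background
print(in_focus, out_of_focus)
```

    Halving the tick rate doubles the bake time whenever the GPU is fast enough to outrun the ticks, which matches the 55-second vs 110-second observation.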
     
    Last edited: Jan 23, 2019
    Lars-Steenhoff likes this.
  42. fguinier

    fguinier

    Unity Technologies

    Joined:
    Sep 14, 2015
    Posts:
    146
    Yes, that is a problem we currently have with the GPU lightmapper: the lower-frequency tick when the editor is out of focus might not generate enough work to keep the GPU fully fed (if the GPU is fast enough).
     
  43. Omzy

    Omzy

    Joined:
    Jun 28, 2013
    Posts:
    31
    I've reduced my settings as you recommended, but I still get the same error. Here is a picture of my settings, in case you can spot anything I'm doing wrong. The error occurs a few seconds after pressing the Generate button. There are 120-ish point/spot lights and 3 area lights in my scene. I've taken care to make sure the ranges are 15 or below, with tight angles (100 degrees or less) for the spotlights. The area lights are quite large in scale and were meant to replace tons of point lights as a performance improvement (although I have no idea whether they are better or worse). For comparison, Bakery GPU completes the same scene in around 12 hours, so theoretically I should be able to bake it with this lightmapper, right? If you're wondering why I don't stick with Bakery: it has created some extremely puzzling artifacts in my scene that the developer hasn't figured out either, so I'm hoping this lightmapper can do the trick.

    https://gyazo.com/22bd9de48d37c83ad2d3cf37599e8adf

    Edit: I have been reading the Introduction to Realtime GI Lighting articles and realized I have hundreds of small static scene-dressing objects with no light probes. I'm going to try to optimize that and limit the number of 'charts' in the scene. Hopefully that will help.
     
    Last edited: Jan 24, 2019
  44. KEngelstoft

    KEngelstoft

    Unity Technologies

    Joined:
    Aug 13, 2013
    Posts:
    1,366
    Hi! Posting the editor.log from after the allocation failure can help us figure out what is going wrong here.
     
  45. Omzy

    Omzy

    Joined:
    Jun 28, 2013
    Posts:
    31
    I finally got it to work! The scene just had too many lightmap-static objects. Setting all the small items to use blend probes (reducing the number of charts) fixed it. The scene baked in about 1 minute, which is ridiculous. I can certainly up the settings now, lol.
     
    fguinier and Vagabond_ like this.
  46. Vagabond_

    Vagabond_

    Joined:
    Aug 26, 2014
    Posts:
    1,148
    Yes, it's not worth baking small objects anyway. Achieving good lightmapping is kind of tricky in general, so you have to learn it. Light probes are perfect for small objects; conversely, they do not work well for big objects, because a big object might start looking weird (out of place) as the light intensity is interpolated between probes.

    Now you can crank up the settings and get some nice indirect lighting and shadows :)
     
  47. Omzy

    Omzy

    Joined:
    Jun 28, 2013
    Posts:
    31
    Ok, I knew it was too good to be true. Ever since I baked with Bakery, I've had a strange bug, some kind of shader or memory issue, but it only occurs in my builds, not in the editor in play mode. After my success baking with the GPU lightmapper, I deleted Bakery (just deleted the folders), and I'm still suffering from the same issue. Here's a screenshot. Any idea what is plaguing my scene? It looks perfectly fine in the editor and in play mode.


    https://gyazo.com/212bb58c6df8add90a80d6d899c34671
     
  48. Vagabond_

    Vagabond_

    Joined:
    Aug 26, 2014
    Posts:
    1,148
    I think Bakery may have left some stuff behind, even hidden in the scene. So one thing to try is to copy everything from your scene and paste it into another scene with the same lighting settings, then try baking again with the GPU lightmapper. Also, contact Frank (the Bakery developer) and ask him how to properly remove the plugin. It uses native plugins, which should be deleted from Windows Explorer with the editor closed (I will assume you are using Windows)!
     
  49. Vagabond_

    Vagabond_

    Joined:
    Aug 26, 2014
    Posts:
    1,148
    Another thing would be to disable all the lights in your scene and try baking again. The lights might still have some uninitialized components left over after you deleted Bakery! What you can do is force a script recompile; this way any hidden (uninitialized) component added to the lights will become visible, and you will be able to remove them, if there are any at all.
     
  50. Omzy

    Omzy

    Joined:
    Jun 28, 2013
    Posts:
    31
    Thanks for the advice, I'll let you know how it goes!