
SEGI (Fully Dynamic Global Illumination)

Discussion in 'Assets and Asset Store' started by sonicether, Jun 10, 2016.

  1. castor76

    castor76

    Joined:
    Dec 5, 2011
    Posts:
    2,517
    I think it will help a lot if there is some kind of in-depth documentation on how to make the best use of this asset.

    Things like the best proposed method to prevent light leaking, and how to create simpler geometry for faster performance.

    I can understand that simpler geometry means faster calculation, but I am not entirely sure how to shape this simpler version of the geometry around the complex one in order to maximize performance and quality at the same time. For example, if I have a small detail with stair-like steps, is it better to create a simple slope-looking geometry? If so, where does it need to be? Over the detail, or under it?

    Also, for VR it is not practical yet, since it needs to run at 60-90+ fps. But I think it might be calculating the voxel tracing for each eye, so avoiding that alone could possibly make this usable in VR.
     
  2. castor76

    castor76

    Joined:
    Dec 5, 2011
    Posts:
    2,517
  3. SteveB

    SteveB

    Joined:
    Jan 17, 2009
    Posts:
    1,451
    Looking good, especially since there's zero VR optimization (e.g. calculating the solution once vs. twice/each eye)!
     
    RB_lashman likes this.
  4. VisionPunk

    VisionPunk

    Joined:
    Mar 9, 2012
    Posts:
    711
    RB_lashman likes this.
  5. castor76

    castor76

    Joined:
    Dec 5, 2011
    Posts:
    2,517
    Yeah, I am getting just around 60 fps in the Unity Editor. When I tested it using only one eye, it goes to about 120 fps. So assuming the VR optimization can cut the voxel calculation in half, I am hoping it can reach above 100 fps, and then with other optimizations that may come along, it could hit 120 fps, giving me actual room to do the game logic, effect rendering, etc.

    But I think it is close... I am running it on a GTX 970, so making it run at 90 fps or more is going to be tough.

    It's also a shame that it has to run in deferred (which I am OK with for GI purposes), but that means no MSAA, so I had to run the screen-space AA supplied by Unity, which does some blurring on the edges.

    I am just imagining myself wielding a flaming sword of death, running around this kind of dungeon and slashing everything in my way!
     
    Baldinoboy, RB_lashman and SteveB like this.
  6. SteveB

    SteveB

    Joined:
    Jan 17, 2009
    Posts:
    1,451
    I know I sound like a broken record, but while the 970 is the minimum for VR (I'm rocking a 980 Ti), in the realm of graphics goodies, such as turning the resolution up from 100% to 150% (still talking VR pixel density), realtime global illumination is exactly the kind of 'goodie' that makes it completely reasonable to require a faster video card for VR. Of course you want to market to as many consumers as possible, but again, I don't think it's unreasonable.

    Then again, it's beta 0.81, so it can only go up! Right?! :D
     
    spraycanmansam and RB_lashman like this.
  7. castor76

    castor76

    Joined:
    Dec 5, 2011
    Posts:
    2,517
    Yeah true.

    It's fine as a developer to ask for a higher spec from players, but it's a bit too risky to focus on a minority when the VR market itself isn't that big yet.

    The problem is that people will always ask for an option to turn GI off, and that means there has to be another way to light the scene without GI. That kind of totally different setup is very difficult to maintain while developing. If there were an option to turn the quality down within the same lighting method, that would be totally doable. But making the game run a totally different lighting setup? That's not a place I want to go, to be honest.

    After doing some tests, I am also wondering if there is some way to make the voxel calculation even cheaper by skipping calculation where occlusion is happening. Kind of similar to what Unity's occlusion culling does: if the camera can't see beyond a wall, skip the calculation for what's behind it, etc. I am not sure, but maybe this is being done already :D
     
    SteveB and RB_lashman like this.
  8. VisionPunk

    VisionPunk

    Joined:
    Mar 9, 2012
    Posts:
    711
    Yeah, and I was thinking (post VR optimization + forward rendering support) that if the asset works with Valve's The Lab Renderer, it might be a real killer for VR. The Lab Renderer gives Vive developers big performance gains with forward rendering and many lights, and has an adaptive quality feature to always maintain 90 FPS. Again, I don't understand the rendering technicalities but I guess I could base the GI shading on just a couple of lights while still having additional lights.

    Just a bit excited for this tech and VR. I've always found pre-processed lighting a pain :).
     
    Last edited: Jul 27, 2016
  9. chiapet1021

    chiapet1021

    Joined:
    Jun 5, 2013
    Posts:
    605
    Well right now, SEGI doesn't support point and spot lights, so it's hard to say how Sonic will implement that sort of support. If I understand correctly, deferred rendering performance is fairly agnostic to the number of lights in the scene, whereas forward rendering is more tied to that number. I have a feeling that's part of the challenge in implementing forward rendered object/shader support.
     
  10. castor76

    castor76

    Joined:
    Dec 5, 2011
    Posts:
    2,517
    As you can see from my video, point lights sort of become redundant... they can almost be replaced by emissive materials. But I am not so sure about spot lights.

    I have tried placing tens of emissive materials in the scene and it made almost no difference.

    However, if you want very sharp shadows... that is another story.
     
    RB_lashman likes this.
  11. chiapet1021

    chiapet1021

    Joined:
    Jun 5, 2013
    Posts:
    605
    Yeah, I want to say SE said this waaaaay back in one of his pre-Asset Store threads, but since emissive materials contribute to GI, it's essentially "area lights for free." I feel like area lights are more visually accurate than point lights, particularly for specular highlights. Spotlights might be more difficult to mimic with just emissive materials.
     
    RB_lashman likes this.
  12. SteveB

    SteveB

    Joined:
    Jan 17, 2009
    Posts:
    1,451
    No doubt. Lower the resolution, or take those emissive lights of yours and make them regular point lights if GI is turned off in the options? It's not going to be as beautiful, but at least it's still realtime illumination.

    Actually, that's interesting; I never thought about that. Find where volumes cannot be seen and cull them from the calculation. The tough part comes in when you have to decide where a light may still contribute even if occluded, but yeah, some compromise should be doable. @sonicether ?
     
  13. IronDuke

    IronDuke

    Joined:
    May 13, 2014
    Posts:
    132
    Oooh, I agree! That sounds very promising! :D And you could probably make it really easy to implement if Unity exposes the occlusion data from the standard occlusion culling done by the camera. (Or just shamelessly copy-paste it and fix it up to work right :cool:)

    --IronDuke
     
    SteveB likes this.
  14. Remerbr

    Remerbr

    Joined:
    Mar 20, 2015
    Posts:
    8
    Hi guys! I'm looking into getting SEGI for my project, but I have a quick question about normal-mapped assets that use smoothing groups. With Enlighten, I get AWFUL results even when using Directional GI mode. Here's an image showcasing what I'm talking about.


    As you can see in the 2nd image, it looks like Enlighten calculates bounced light strictly on the mesh geometry, with no regard for normal maps. Once a normal map is applied that alters the way light hits the overall shape of the mesh, it gets all funky. I'm just curious whether SEGI does the same thing. Almost all of my meshes use this style of modeling/texturing, and there's no way to make the lighting look clean and consistent with Enlighten. I really want to get SEGI, but I'm afraid to spend the money if it will be just as useless for my purposes.
     
    RB_lashman likes this.
  15. sonicether

    sonicether

    Joined:
    Jan 12, 2013
    Posts:
    265
    Did you get your issue sorted out? If not, send me a PM. We might have to dive into this deeper to find the solution.

    Currently, if you set a non-black sky color in SEGI, a constant ambient term of that color will be added to the scene for anything outside the voxel volume.

    Medium resolution is 128x128x128 and high resolution is 256x256x256. So, "high" is the next power of two after "medium". Doubling resolution in 3 dimensions results in 8 times the data. I'll investigate non-powers-of-two resolutions.
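    The resolution math above is easy to verify. A quick sketch (the 4 bytes per voxel is purely an illustrative assumption; the real asset may store more per cell):

```python
# Voxel count and approximate storage for SEGI's Medium vs. High settings.
# Assumes 4 bytes per voxel (one RGBA8 texel) purely for illustration.
def voxel_bytes(x, y, z, bytes_per_voxel=4):
    return x * y * z * bytes_per_voxel

medium = voxel_bytes(128, 128, 128)  # 128^3 cells
high = voxel_bytes(256, 256, 256)    # 256^3 cells

print(medium)          # 8388608 bytes  (8 MiB)
print(high)            # 67108864 bytes (64 MiB)
print(high // medium)  # 8: doubling each dimension gives 2^3 = 8x the data
```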

    Yep, adding more emissive materials doesn't cost anything more than the additional geometry that needs to be voxelized. If you're using simple geometry, you could probably use hundreds of "area lights" this way.

    I've thought about this, and more specifically, frustum culling. The problem is that not being able to see something directly doesn't mean that you wouldn't be able to see its lighting influence on the things that you can see. Imagine a scene where you're standing in a hallway and there's an open door to a room in front of you such that you can't see into the room much. If SEGI used occlusion culling, the lighting within the room wouldn't be visible in the hallway because the objects that would cast that light would be culled because they're not visible. I doubt that there would be a simple way to extend classic occlusion culling to consider this, though I admit I don't know the specifics of occlusion culling. I feel that improving the efficiency of voxelization in general would be more worthwhile than investigating this further because of the problems I mentioned.
     
  16. sonicether

    sonicether

    Joined:
    Jan 12, 2013
    Posts:
    265
    SEGI should handle this situation just fine, since it uses the same normals that are used to calculate Unity's direct lighting.
     
    Remerbr and RB_lashman like this.
  17. Remerbr

    Remerbr

    Joined:
    Mar 20, 2015
    Posts:
    8
    Jesus Christ man talk about a fast response haha. That's AWESOME! Will be getting this ASAP.
     
    RB_lashman likes this.
  18. IronDuke

    IronDuke

    Joined:
    May 13, 2014
    Posts:
    132
    Ouch. Not sure why I didn't think of that. Okay then, that makes perfect sense. Ignore my sandwich-addled brain for now.:p

    --IronDuke
     
    RB_lashman likes this.
  19. CaptainMurphy

    CaptainMurphy

    Joined:
    Jul 15, 2014
    Posts:
    746
    In a manner of speaking, yes. I use a script to keep Tenkoku's ambient color system matched to the sky color in SEGI, and use the alpha of the Tenkoku color as the intensity of the sky. It works decently but is still not the same as the editor. It seems that if you have no sky color set, the build treats the bounces differently for some reason.
     
    RB_lashman likes this.
  20. castor76

    castor76

    Joined:
    Dec 5, 2011
    Posts:
    2,517
    Also, the problem with regular point lights and such is that if their range is large and they cast no shadows, there will be light leaking everywhere. One of the most important aspects of SEGI that stands out is that it handles occlusion and shadowing automatically, and with a bit of tweaking and careful management, we can avoid large-scale light leaking even if the range of light emission is large.
     
    RB_lashman likes this.
  21. DivergenceOnline

    DivergenceOnline

    Joined:
    Apr 19, 2015
    Posts:
    244
  22. Pix10

    Pix10

    Joined:
    Jul 21, 2012
    Posts:
    850
    @sonicether You've said emissive materials are as lightweight as their geometry, which is great, but what about emissive textures?
     
    RB_lashman likes this.
  23. castor76

    castor76

    Joined:
    Dec 5, 2011
    Posts:
    2,517
    I am not sure, but I think an emissive texture input to the emissive material should work?

    Edit - I have just tried this and it works.

    I have also tried the single-pass stereo rendering VR setting, and it renders at 2 fps :p So clearly something isn't working right.
     
    RB_lashman likes this.
  24. Assembler-Maze

    Assembler-Maze

    Joined:
    Jan 6, 2016
    Posts:
    630
    Hello!

    How is SEGI progressing on large outdoor scenes? I have a big island and the bake times are terrible, so I was thinking of moving to SEGI.

    So my question is: has anyone used it with a big terrain (a Unity terrain, not meshes) with a lot of SpeedTrees and such in it?
     
    Last edited: Jul 29, 2016
  25. StaffanEk

    StaffanEk

    Joined:
    Jul 13, 2012
    Posts:
    380
    That sounds like it's worth a shot.
     
    SteveB likes this.
  26. Pix10

    Pix10

    Joined:
    Jul 21, 2012
    Posts:
    850
    Yes it works, and works well. I'm asking about performance overhead.
     
  27. castor76

    castor76

    Joined:
    Dec 5, 2011
    Posts:
    2,517
    I don't think there is much performance overhead in using emissive materials, even if there are lots of them. I think performance issues have more to do with how complex the geometry is that the voxel cone tracing bounces around, not the number of emissive materials you use.
     
    RB_lashman likes this.
  28. Pix10

    Pix10

    Joined:
    Jul 21, 2012
    Posts:
    850
    Not materials, textures, as in per-pixel emission. Think stained glass window vs flat uniform color.
     
  29. chiapet1021

    chiapet1021

    Joined:
    Jun 5, 2013
    Posts:
    605
    Isn't the overhead the same in general (not just for SEGI), no matter how complex or uniform the emission texture is?
     
  30. castor76

    castor76

    Joined:
    Dec 5, 2011
    Posts:
    2,517
    There should be no overhead in using a texture instead of a plain material color. The technology behind this lighting is not per-pixel emission; it is based on voxel cone tracing, and it uses its voxel-traced data combined with the deferred shader. You will never get "per-pixel" accuracy lighting with SEGI. What you will probably end up getting, if you use a texture for emission, is the averaged emissive value in the particular voxel where the polygon with the emissive material (texture) is rendered.

    If you are looking for very sharp, per-pixel-accurate lighting, I don't think you will get it here. What I will probably end up doing is using the emission texture for visual detail only, and using custom material/geometry separately for all the lighting work to get the best performance.

    Having said that, I am beginning to hate SEGI for making me stuck with it now! I can't go back to normal Unity lighting. I can never go back now! And since I can't go back, I really want to see the future optimizations and improvements come quickly! :D
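    The "averaged emissive value per voxel" idea above can be shown with a toy sketch (this is not SEGI's actual voxelization code, just an illustration of per-pixel emission collapsing to one value per voxel cell):

```python
# Toy model: all emissive texels covered by one voxel cell average together,
# so a detailed "stained glass" pattern emits only its mean color.
def voxel_average(texels):
    n = len(texels)
    return tuple(sum(t[c] for t in texels) / n for c in range(3))

# Half bright red glass, half black leading, all inside a single voxel:
texels = [(1.0, 0.0, 0.0)] * 8 + [(0.0, 0.0, 0.0)] * 8
print(voxel_average(texels))  # (0.5, 0.0, 0.0): the detail is lost, only the average emits
```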
     
    RB_lashman likes this.
  31. Pix10

    Pix10

    Joined:
    Jul 21, 2012
    Posts:
    850
    I know how it works and what it looks like, as I do have the plugin. I just want to know if there's any overhead to texture-based emission, and if there are any plans related to it; whether it's coming from the already available GI sampling (which I'd expect) or otherwise, the quality is fine and as expected.
     
  32. chiapet1021

    chiapet1021

    Joined:
    Jun 5, 2013
    Posts:
    605
    If you already have the beta asset, couldn't you validate for yourself whether complex emission textures have a significant impact on performance?
     
  33. Pix10

    Pix10

    Joined:
    Jul 21, 2012
    Posts:
    850
    I'm kind of busy with my actual job. It's just a question; it doesn't require a debate or me running huge tests.

    Please forget I asked. It really isn't that important, purely curiosity. >_>
     
    RB_lashman likes this.
  34. castor76

    castor76

    Joined:
    Dec 5, 2011
    Posts:
    2,517
    Sonic, do you think it is possible to have resolutions like 128x32x128?

    For some games, it is possible to limit the voxel volume like that to optimize performance. I am trying to see if SEGI is usable in a first-person, closed-in dungeon game, where the level design is more or less flat. This means I don't need to calculate lighting above or below certain world Y coordinates.
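    The potential savings are straightforward to estimate (a quick sketch; whether SEGI can actually support non-cubic volumes is a question for the author):

```python
# Voxel counts for a cubic vs. a height-limited GI volume.
cubic = 128 * 128 * 128  # 2,097,152 voxels
flat = 128 * 32 * 128    # 524,288 voxels
print(cubic // flat)     # 4: a quarter of the voxels to store and trace
```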
     
  35. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Hey guys, let the author answer questions if you don't know ;)
     
  36. adamlukegoodwin

    adamlukegoodwin

    Joined:
    May 10, 2015
    Posts:
    15
    Before I got into real-time 3D many moons ago, I was waiting up to an hour for a GI pass in mental ray... May the gods of the digital dimensions bless the author! I can't wait to see where this goes in the future.

    Anyway, I played with this a bit over the weekend to see which of my 3rd-party shaders work. Glad to say it's looking good for UBER so far. Going to try a bit of vertex painting to blend two different shaders next...
     
    SteveB and RB_lashman like this.
  37. adamlukegoodwin

    adamlukegoodwin

    Joined:
    May 10, 2015
    Posts:
    15
    Vertex painting with two UBER shaders works. Here are two materials I made - clean and dirty..
     
    RB_lashman likes this.
  38. adamlukegoodwin

    adamlukegoodwin

    Joined:
    May 10, 2015
    Posts:
    15
    Quick and dirty test with tessellation cranked right up. I am sold. Especially as this is still in beta testing.
     
    RB_lashman likes this.
  39. zenGarden

    zenGarden

    Joined:
    Mar 30, 2013
    Posts:
    4,538
    Tessellation is GPU-intensive; if you add SEGI, I am not sure you'll be able to run both in a real game scenario?
     
    RB_lashman likes this.
  40. adamlukegoodwin

    adamlukegoodwin

    Joined:
    May 10, 2015
    Posts:
    15
    You're right there zenGarden. Tessellation is a bit messy too. Just promising to see it working well with various UBER shaders.
     
    RB_lashman likes this.
  41. S_Darkwell

    S_Darkwell

    Joined:
    Oct 20, 2013
    Posts:
    320
    @sonicether: I'm having the same issue in my project. Build is noticeably darker than in-editor.
     
  42. Arganth

    Arganth

    Joined:
    Jul 31, 2015
    Posts:
    277
  43. SteveB

    SteveB

    Joined:
    Jan 17, 2009
    Posts:
    1,451
    @sonicether ...just how much are you loving the Vive? :D

    Anyway, any thoughts on reducing the calculation to one camera?
     
    RB_lashman likes this.
  44. CaptainMurphy

    CaptainMurphy

    Joined:
    Jul 15, 2014
    Posts:
    746
    I am using the open source volumetric in our Unicorn project. It looks good when used correctly.
     
    arnoob, Baldinoboy, zenGarden and 3 others like this.
  45. sonicether

    sonicether

    Joined:
    Jan 12, 2013
    Posts:
    265
    Yep, I've got a solid idea for that. I'll have to switch SEGI's resources to be stored in static variables so if there are two instances there's still only one set of data. Then, using some sort of static variable, I can have each instance of SEGI check if voxelization has already occurred for the current frame and skip voxelization if that's the case.
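    That shared-data scheme reads roughly like this pattern (a simplified Python sketch of the idea, with hypothetical names; the real implementation would live in SEGI's C# image effect):

```python
# Sketch: all SEGI instances share one voxel data set via class-level
# ("static") state, and only the first instance to render in a given
# frame performs the voxelization work.
class SEGIInstance:
    shared_voxels = None   # one data set shared by every instance
    voxelized_frame = -1   # last frame voxelization ran

    def render(self, frame):
        if SEGIInstance.voxelized_frame != frame:
            SEGIInstance.shared_voxels = f"voxel data for frame {frame}"
            SEGIInstance.voxelized_frame = frame
            return True    # this camera did the voxelization
        return False       # another camera already voxelized this frame

left_eye, right_eye = SEGIInstance(), SEGIInstance()
print(left_eye.render(0))   # True  - first camera voxelizes
print(right_eye.render(0))  # False - second camera reuses the shared data
print(left_eye.render(1))   # True  - new frame, voxelize again
```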

    I also wanted to probe you guys about an idea that a few others have mentioned. I recently created some functionality that makes it so that only half of the GI volume is voxelized per-frame and the half that is voxelized swaps for each frame. This obviously results in a significant speedup in voxelization, especially with High voxel resolution, but it did cause an issue. Basically, if the camera or voxel volume moves, it means that the half of the GI volume that wasn't updated for that frame contains invalid data. I tried setting up something where the invalid volume scrolls its contents when this happens, but it doesn't completely solve the issue, and there are still very noticeable "pops" when the volume is moved. The only way I see to completely solve this issue is to have the entire voxel volume updated whenever it moves. For any frames where SEGI has to voxelize the entire volume instead of only half, obviously there will be an increased rendering cost. I'll keep trying to find a solution other than this, but meanwhile, what do you guys think? Should I include it at least as an option so you guys can see how it performs/behaves?
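    The half-volume update scheme, including the full re-voxelization when the volume moves, can be modeled like this (a simplified sketch, not the actual implementation; the volume split and method names are hypothetical):

```python
# Sketch of interleaved voxelization: update one half of the GI volume per
# frame, falling back to a full (more expensive) update when the volume moves,
# since the stale half would otherwise hold invalid data and cause "pops".
class InterleavedVoxelizer:
    def __init__(self):
        self.frame = 0

    def halves_to_update(self, volume_moved):
        if volume_moved:
            halves = [0, 1]             # moving invalidates both halves
        else:
            halves = [self.frame % 2]   # otherwise alternate halves
        self.frame += 1
        return halves

v = InterleavedVoxelizer()
print(v.halves_to_update(False))  # [0] - even frame: first half only
print(v.halves_to_update(False))  # [1] - odd frame: second half only
print(v.halves_to_update(True))   # [0, 1] - volume moved: full update
```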

    Since Unity 5.4 just came out, I'll be working on using the new velocity buffer to solve temporal sampling artifacts. I'll let everyone know how that's going.
     
  46. eskovas

    eskovas

    Joined:
    Dec 2, 2009
    Posts:
    1,373
    Sounds good to me.
    Have you tried, instead of not voxelizing half of the volume, voxelizing that half with a less detailed voxel structure to avoid the invalid data?

    Edit:
    Also, sharing SEGI data between multiple cameras would be very much appreciated. For example, I use scopes with a camera and a render texture, and that camera doesn't use SEGI because I don't want it to also do voxelization, so it just sees direct light.
     
    arnoob and RB_lashman like this.
  47. zenGarden

    zenGarden

    Joined:
    Mar 30, 2013
    Posts:
    4,538
    Why not use some smooth progressive blending between the old invalid data and the new lighting data as it comes in?
    This could be an approximate, cheap solution for people who need performance more than accuracy?

    I would like that.
     
    arnoob and RB_lashman like this.
  48. ksam2

    ksam2

    Joined:
    Apr 28, 2012
    Posts:
    1,079
    It is on the roadmap ("Support for forward-rendered objects"). I think lots of people are waiting for this, but it's still at 0%. Please work on this too, thanks.
     
    zenGarden and arnoob like this.
  49. castor76

    castor76

    Joined:
    Dec 5, 2011
    Posts:
    2,517
    What about a non-cubic voxel volume shape as a performance optimization? Is that even possible? Does the voxel volume have to be a cube?

    When I tried SEGI in VR, I found that updating the voxel calculation isn't the main issue; the main performance bottleneck may come down to the rendering resolution instead. I tried disabling Update GI, but it didn't make a lot of difference (on the Low setting).
     
    arnoob likes this.
  50. DivergenceOnline

    DivergenceOnline

    Joined:
    Apr 19, 2015
    Posts:
    244
    What I think is that having a 50 m range renders all other optimizations pointless, and I haven't seen the voxel volume cascades milestone budge in the past month.
     
    arnoob likes this.