SEGI (Fully Dynamic Global Illumination)

Discussion in 'Assets and Asset Store' started by sonicether, Jun 10, 2016.

  1. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    5,571
    Well "render" to cubemap IS real time i mean, it can provide you that to some capacity, that's why I was proposing it.

    In fact while I was thinking of a custom mode, unity provide way to do that with its cubemap it seems ... but I only see it for reflection, not sure about lighting.
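
    Something like the minimal sketch below is what I mean (it uses Unity's built-in realtime reflection probe API; the exact settings are placeholders, and whether the result is usable for lighting rather than just reflections is exactly the open question):

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Minimal sketch: a realtime reflection probe re-rendered only on demand.
public class RealtimeCubemapCapture : MonoBehaviour
{
    ReflectionProbe probe;

    void Start()
    {
        probe = gameObject.AddComponent<ReflectionProbe>();
        probe.mode = ReflectionProbeMode.Realtime;
        // Render only when asked, not every frame.
        probe.refreshMode = ReflectionProbeRefreshMode.ViaScripting;
        // Spread the six face renders over several frames to amortize cost.
        probe.timeSlicingMode = ReflectionProbeTimeSlicingMode.IndividualFaces;
        probe.RenderProbe();
    }

    // Call whenever the environment changes enough to matter.
    public void Recapture() => probe.RenderProbe();
}
```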
     
    RB_lashman likes this.
  2. Ninlilizi

    Ninlilizi

    Joined:
    Sep 19, 2016
    Posts:
    291
This is only limited by available VRAM... but it does support, and has been tested with, map sizes up to a couple of kilometres.

Can't speak to performance yet; it hasn't been tested on a range of hardware, just dev boxes with all the transistors. But small indoor maps, where you just want a bit of shadow or whatever, are where it performs best. And while it is real-time, suddenly transforming a huge chunk of the map will be very expensive to re-encode, and isn't advisable unless you're really clever about it. The best situation is where you only have the typical small actors or a few lights moving about the scene. My own performance yardstick is: better than SEGI, and runs at 90fps+ in VR.
     
    RB_lashman and Baldinoboy like this.
  3. Ninlilizi

    Ninlilizi

    Joined:
    Sep 19, 2016
    Posts:
    291
    Rendering a cubemap will still be cheaper than doing full GI.
     
    RB_lashman likes this.
  4. Baldinoboy

    Baldinoboy

    Joined:
    Apr 14, 2012
    Posts:
    1,418
So I want high-detail dynamic GI with little to no performance impact on a 20km² map. This will work well for that, right? ;)

    Sounds good, really looking forward to seeing it. Now you'll need to send weekly screenshots to sate our GI appetite.
     
    RB_lashman likes this.
  5. Ninlilizi

    Ninlilizi

    Joined:
    Sep 19, 2016
    Posts:
    291
No, it won't work well for that. Not in the first release, at least. Let's be realistic: it scales to maybe 4 km max before you run short on VRAM, and the higher the detail, the larger the performance hit. There's no paging out of stale assignments to allow for limitless coverage; that'll be a 2nd-version feature.

    It's still GI, and GI is expensive however you slice it. You can make good GI, but it still has overheads in the ballpark of RTX, with improvements in certain applications.
     
    RB_lashman likes this.
  6. Oniros88

    Oniros88

    Joined:
    Nov 15, 2014
    Posts:
    57
The huge chunks would be map generation only, at the start of the level (and only for maps that are generated; most will be premade). The most common scenarios would be doors opening and light leaking in, time of day/weather changing, or someone shutting off the lights in a building. I think the "biggest" one we have planned would be an underground base map where one of the rooms (a hangar or missile silo) has a big circular ceiling door that can be opened, leaking outside light inside.

    Would that be fine? And thanks for the info. We are really, really, really looking forward to this non-hardware-dependent (RTX) GI solution. Do you have any idea what the price will be?
     
    RB_lashman likes this.
  7. Baldinoboy

    Baldinoboy

    Joined:
    Apr 14, 2012
    Posts:
    1,418
Yeah, I understand, it was just a joke. Don't expect to get a playable 20km map working in Unity with any lighting.
     
    RB_lashman likes this.
  8. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    5,571
Cubemaps.

    Basically a variant of this:
    https://community.arm.com/developer...s/dynamic-soft-shadows-based-on-local-cubemap

    but with render feedback to get some GI.
     
    RB_lashman likes this.
  9. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    5,571
If your map is essentially a "flat" open world, you could just sample the terrain texture + some decals/masking, using the normal direction's intersection with a proxy plane, then sample the mipmap based on distance to the plane.

    But then you could do the Breath of the Wild trick... that is, a cubemap around the camera, reprojected onto the environment:
    https://twitter.com/flogelz/status/1175053137712955392

    Video on the tweet.

    You can do a lot of approximations with cubemaps, and they are generally good enough. A rough sketch of the camera-capture part is below.
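
    In Unity terms it could look like this (the "_EnvCube" global name is just a placeholder, and shaders would have to sample and reproject it themselves; in practice you'd use a dedicated, disabled capture camera rather than the main one):

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Sketch of the Breath of the Wild trick: a small cubemap rendered around
// the camera at a low rate and exposed globally for shaders to reproject.
[RequireComponent(typeof(Camera))]
public class CameraCubemapAmbient : MonoBehaviour
{
    public int faceSize = 64;       // diffuse is low frequency, so low res is fine
    public float interval = 0.25f;  // ~4 captures per second

    Camera cam;
    RenderTexture envCube;
    float nextCapture;

    void Start()
    {
        cam = GetComponent<Camera>();
        envCube = new RenderTexture(faceSize, faceSize, 16)
        {
            dimension = TextureDimension.Cube,
            useMipMap = true,
            autoGenerateMips = true
        };
    }

    void LateUpdate()
    {
        if (Time.time < nextCapture) return;
        nextCapture = Time.time + interval;

        cam.RenderToCubemap(envCube);                 // renders all six faces
        Shader.SetGlobalTexture("_EnvCube", envCube); // shaders reproject this
    }
}
```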
     
    Baldinoboy and RB_lashman like this.
  10. Baldinoboy

    Baldinoboy

    Joined:
    Apr 14, 2012
    Posts:
    1,418
Hey @neoshaman, I was just joking about a map that large. I will be working on a larger scene this week, but I'm not planning on GI.

    Sorry if you already mentioned it, but how are cubemaps being used for diffuse lighting?
     
    Last edited: Jun 1, 2020
    RB_lashman likes this.
  11. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    5,571
Easy: just sample the cubemap in the direction of the normal. It's called IBL, or light-probe lighting. Ideally you should convolve the cubemap based on the BRDF (basically blurring using the shape of the BRDF curve), but for an approximation it's good enough to just sample the mipmap whose blur is the closest match.

    SH (spherical harmonics) are really just mathematical cubemaps (a kind of low-res format; the JPEG of cubemaps, used in games as very aggressive cubemap compression) whose diffuse response is close to a BRDF-convolved cubemap, but I have no idea how to update them in real time from an environment capture.

    So cubemap it is. Anyway, if you're capturing a cubemap just to update an SH, you might as well use the cubemap directly. Also, diffuse is low frequency, so you can get away with a relatively low-resolution cubemap, which makes it faster to render; and if you can render proxy low-poly objects with lower-res textures (basically just render the distant LODs), it's better still.

    I know there's a roughness-curve approximation of the BRDF for Blinn-Phong specular reflection that uses mipmap sampling of a cubemap to replace the complex convolution. For Lambert convolution, I don't know if there's a ready-made roughness curve that gets close to an actual convolution (the simplest technique, brute force, is to use every pixel of the cubemap as a light source and accumulate ndotl * pixel area, and it's done offline; see the sketch below).

    https://learnopengl.com/PBR/IBL/Diffuse-irradiance
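
    That brute-force accumulation looks something like this (a CPU-side sketch assuming a readable cubemap; the face-to-direction mapping follows the usual cube conventions and signs may need adjusting, and a proper version would also weight each texel by its solid angle):

```csharp
using UnityEngine;

// Offline brute-force Lambert convolution: treat every cubemap texel as a
// directional light and accumulate its N·L-weighted color.
public static class IrradianceBruteForce
{
    public static Color Irradiance(Cubemap cube, Vector3 normal)
    {
        Color sum = Color.black;
        float weightSum = 0f;

        for (int face = 0; face < 6; face++)
        for (int y = 0; y < cube.height; y++)
        for (int x = 0; x < cube.width; x++)
        {
            Vector3 dir = TexelDirection((CubemapFace)face, x, y, cube.width);
            float ndotl = Vector3.Dot(normal, dir);
            if (ndotl <= 0f) continue;  // texel is behind the surface

            sum += cube.GetPixel((CubemapFace)face, x, y) * ndotl;
            weightSum += ndotl;
        }
        return weightSum > 0f ? sum / weightSum : Color.black;
    }

    // Map a texel on a cube face to its (approximate) world-space direction.
    static Vector3 TexelDirection(CubemapFace face, int x, int y, int size)
    {
        float u = 2f * (x + 0.5f) / size - 1f;
        float v = 2f * (y + 0.5f) / size - 1f;
        switch (face)
        {
            case CubemapFace.PositiveX: return new Vector3( 1f, -v, -u).normalized;
            case CubemapFace.NegativeX: return new Vector3(-1f, -v,  u).normalized;
            case CubemapFace.PositiveY: return new Vector3(  u, 1f,  v).normalized;
            case CubemapFace.NegativeY: return new Vector3(  u,-1f, -v).normalized;
            case CubemapFace.PositiveZ: return new Vector3(  u, -v, 1f).normalized;
            default:                    return new Vector3( -u, -v,-1f).normalized;
        }
    }
}
```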

edit:
    It's also worth noting you wouldn't need to update the cubemap frequently; say 4fps is a common rate, or just update based on a displacement delta from a spatial hash (e.g. update only after 5m of movement), with exceptions for sudden changes. A sketch of that gating is below.
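
    The displacement-delta gate is just a quantize-and-compare (a sketch; the 5m cell size and the sudden-change flag are things your game code would drive):

```csharp
using System;
using UnityEngine;

// Sketch: re-capture the cubemap only when the camera crosses into a new
// 5 m grid cell (a trivial spatial hash) or a sudden change is flagged.
public class CubemapUpdateGate : MonoBehaviour
{
    public float cellSize = 5f;
    public Action onRecapture;   // hook your cubemap render call up here
    public bool suddenChange;    // set from gameplay code (light switch, door...)

    Vector3Int lastCell = new Vector3Int(int.MinValue, 0, 0);

    void Update()
    {
        var cell = Vector3Int.FloorToInt(transform.position / cellSize);
        if (cell != lastCell || suddenChange)
        {
            lastCell = cell;
            suddenChange = false;
            onRecapture?.Invoke();
        }
    }
}
```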
     
    Last edited: Jun 1, 2020
    Baldinoboy and RB_lashman like this.
  12. Ninlilizi

    Ninlilizi

    Joined:
    Sep 19, 2016
    Posts:
    291
Regarding the price for Nigiri: nothing has been settled on yet. But I am interested in what people think is fair for something that took two years of my life (not to mention a small crowd of enthusiastic supporters and the time of two additional programmers) and is a decent product.
     
    Last edited: Jun 1, 2020
    RB_lashman and neoshaman like this.
  13. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    5,571
€100 min; you have been going in depth on this.
     
    RB_lashman, DMeville and mgear like this.
  14. Stardog

    Stardog

    Joined:
    Jun 28, 2010
    Posts:
    1,598
    €50-80 makes it comparable to other assets.
     
    RB_lashman likes this.
  15. DMeville

    DMeville

    Joined:
    May 5, 2013
    Posts:
    403
....there are other dynamic GI assets on the Asset Store? It's hard to compare it to other assets, as every other GI solution has only ever been talked about, never "finished" and released. Something actually released and working seems worth much more than that, at least to me (and I've been following all the different systems for years, waiting for more than just promises....)
     
    Last edited: Jun 1, 2020
    Acissathar and RB_lashman like this.
  16. Mauri

    Mauri

    Joined:
    Dec 9, 2010
    Posts:
    2,096
None currently. Though there's the upcoming MadGoat SSGI (no ETA yet) and various GitHub projects.

    Honestly, with Unity working on their own native solutions (that is: RTX-based GI and... DDGI?), I don't think there will be a need for SEGI or similar assets anymore.
     
    DMeville likes this.
  17. Acissathar

    Acissathar

    Joined:
    Jun 24, 2011
    Posts:
    592
Assuming that is still on track (I haven't been able to find any updates since it was announced in 2019), it won't be part of any Unity release until 2021.1.

    Sitting on a project until then, hoping the move to a new version doesn't break anything (SRPs always seem to break something) and that the GI implementation actually works as expected, is probably not feasible for a fair chunk of people needing a GI solution.
     
    Last edited: Jun 1, 2020
    RB_lashman likes this.
  18. Ninlilizi

    Ninlilizi

    Joined:
    Sep 19, 2016
    Posts:
    291
Interesting.... MadGoat is doing screen space.

    Nigiri has a dual-encoder pipeline: the primary is also screen space, but a slower-updating secondary encoder also handles everything off-screen. A best-of-both-worlds solution. Both are toggleable, so users can decide for themselves whether fast screen space is good enough and only assign important things, such as lights, to the more expensive secondary. It works out well, with the two encoders feeding a common octree injection pipeline to ensure physically identical inputs from both.
     
    RB_lashman likes this.
  19. razzraziel

    razzraziel

    Joined:
    Sep 13, 2018
    Posts:
    169
Has anyone tried these? How's the performance, and how do they look?
     
    RB_lashman likes this.
  20. Ninlilizi

    Ninlilizi

    Joined:
    Sep 19, 2016
    Posts:
    291
They're really good example projects if you're looking to roll your own. But on the whole, all are about as polished as SEGI, which is either good or bad depending on your expectations and requirements. I've poked at nearly all the public projects between my previous nkgi release of early 2018 and the yet-to-be-unveiled Nigiri 2.0.
     
    RB_lashman and razzraziel like this.
  21. Baldinoboy

    Baldinoboy

    Joined:
    Apr 14, 2012
    Posts:
    1,418
I'd say $80-$120. It's worth more than that, of course, but you also want it to sell well. At that price, a lot of the people who'd buy it just to play with it would wait for a sale, but actual developers would be more than willing to get it in that range. If well advertised and reviewed, it would sell like crazy on sale at $40-$60. I know it's not ideal to have most of your sales at that price, but it would probably be enough to cover your work.
     
    RB_lashman likes this.
  22. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    5,571
Also, people will expect support, so you need that $100 to cover it, especially with the tumultuous Unity RP changes. And you should probably do separate paid add-ons for major new features, because otherwise it won't be sustainable.

    Also, you didn't get paid for the 2 years you spent developing this; you've already invested $160,000+ of your time. That's a lot, and it's not clear you'll ever break even: you'd need to sell 1,600 licenses at my suggested price, and more once sale discounts are factored in.
     
    LapidistCubed and RB_lashman like this.
  23. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    5,571
There is a new contender: the Godot guy has a new GI volume solution.
    https://twitter.com/reduzio/status/1267301593378172930 (the tweet below it has some explanation)
    At first glance it looks like a light propagation volume mixed, via SDFs, with the DDGI concept, plus the occlusion idea from voxel cone tracing to augment DDGI's visibility probes. It does sound like madness.

    edit: the LPV trick might be the visibility query
     
    Last edited: Jun 1, 2020
    RB_lashman likes this.
  24. Ninlilizi

    Ninlilizi

    Joined:
    Sep 19, 2016
    Posts:
    291
I really want to aim for accessibility. Yes, I need to recoup the time spent when I wasn't doing anything else that earns money, but I don't want to price out the hobbyists while doing it. As such, I was leaning towards a tiered pricing structure: a supporters/hobbyists/non-commercial tier in that range, then a commercial-usage licence above that, and full source-code access above that. Along with pre-sale demos, instructional videos, and open documentation, so people can be confident in what they're buying, and in the nature of this particular beast, before putting any money down. Too many store assets are a leap of faith in what you're actually getting. I have a particular itch to do better than that.
     
  25. Back_Buffer

    Back_Buffer

    Joined:
    Jun 22, 2015
    Posts:
    2
Here are my tests with SEGI. By the way, thanks for the product!!
     
    AntonioModer likes this.
  26. razzraziel

    razzraziel

    Joined:
    Sep 13, 2018
    Posts:
    169
  27. ivanmotta

    ivanmotta

    Joined:
    Jun 19, 2013
    Posts:
    23
Hi folks! Just wanted to tell you that @Ninlilizi 's version of SEGI works on 2018.4, and I just launched a game that fully uses it. It's Legally Addicted, a procedural dungeon-crawler set inside an office building. You play as John Smoke, desperate to step out for a little nicotine after his first day at a big company. All procedural, physics-breaking furniture, imaginary demons, and all that jazz. Made by just me as a side project over the last 3 years.

    It's available on itch.io and gamejolt: sensingames.indie.af/legallyaddicted/
    Some more info is on my website: www.sensingames.com
     
    Last edited: Jul 22, 2020
  28. ksam2

    ksam2

    Joined:
    Apr 28, 2012
    Posts:
    1,047
Hi, the chromatic aberration in this pic looks nice. Is it the same as the post-processing chromatic aberration?
     
  29. florianalexandru05

    florianalexandru05

    Joined:
    Mar 31, 2014
    Posts:
    1,246
It's the old one, from Post Processing v1.
     
    ksam2 likes this.
  30. ksam2

    ksam2

    Joined:
    Apr 28, 2012
    Posts:
    1,047
What happened to the Nigiri project? I got it from GitHub; performance was great, but it was totally unusable when moving the camera around :confused:
     
    RB_lashman likes this.
  31. Ninlilizi

    Ninlilizi

    Joined:
    Sep 19, 2016
    Posts:
    291
Still in progress. The GitHub code is 2 years old now; the current version isn't publicly available. There was a round of testing a month ago, which found there are still some problems that mean it's not ready for general release and that certain systems will need re-engineering. That has pushed back its availability some, along with the current state of the world, which meant spending some months fighting off possible homelessness.
    I am currently working on integrating support for the Nvidia OptiX denoiser, after which there'll be another round of testing.
     
    Duende, ftejada, nirvanajie and 4 others like this.
  32. Ninlilizi

    Ninlilizi

    Joined:
    Sep 19, 2016
    Posts:
    291
  33. ksam2

    ksam2

    Joined:
    Apr 28, 2012
    Posts:
    1,047
    Last edited: Oct 21, 2020
    RB_lashman likes this.
  34. Ninlilizi

    Ninlilizi

    Joined:
    Sep 19, 2016
    Posts:
    291
The current code is fully SVO-based, with more accurate material handling.

    It is a complete rewrite that has nothing in common with the old (and admittedly ghastly awful) GitHub code.

    The downside of using a sparse octree for data storage is the traversal cost.
    The upside is support for huge and outdoor scenes running to several kilometres, along with tiny voxels that allow a lot of previously impossible accuracy and detail.

    But traversal is expensive, which means you can afford far fewer samples, which means you have a lot more noise to deal with. Which is why, a couple of months ago, I realized this wasn't ready for the more general release I was hoping for at the time, and am now incorporating some RTX features to handle that more elegantly.
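
    To illustrate the tradeoff for anyone unfamiliar (a generic sketch of octree lookup, not my actual code): a dense 3D texture answers a sample in one fetch, while a sparse octree walks one level per step, so every sample pays O(depth) indirections.

```csharp
using UnityEngine;

// Generic sparse-voxel-octree lookup sketch. Each query descends from the
// root one level at a time, so a single sample costs O(tree depth)
// indirections versus one fetch for a dense 3D texture.
public static class SvoSampleSketch
{
    public struct SvoNode
    {
        public int childBase;     // index of first child slot in the pool, -1 at leaves
        public uint validMask;    // bit i set if child i actually exists
        public Vector3 radiance;  // payload stored at leaves
    }

    // p is a point inside the unit cube [0,1)^3 covered by the octree.
    public static Vector3 Sample(SvoNode[] pool, Vector3 p, int maxDepth)
    {
        int node = 0;                 // root node at index 0
        Vector3 lo = Vector3.zero;    // lower corner of the current cell
        Vector3 half = Vector3.one;   // current cell size, halved each level

        for (int depth = 0; depth < maxDepth; depth++)
        {
            if (pool[node].childBase < 0) break;  // hit a leaf early
            half *= 0.5f;
            int cx = p.x >= lo.x + half.x ? 1 : 0;
            int cy = p.y >= lo.y + half.y ? 1 : 0;
            int cz = p.z >= lo.z + half.z ? 1 : 0;
            int child = cx | (cy << 1) | (cz << 2);
            if ((pool[node].validMask & (1u << child)) == 0)
                return Vector3.zero;              // empty space: nothing stored
            lo += new Vector3(cx * half.x, cy * half.y, cz * half.z);
            node = pool[node].childBase + child;  // assumes 8 slots per childBase
        }
        return pool[node].radiance;
    }
}
```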
     
    Ne0mega, ftejada, RogueCode and 6 others like this.
  35. RogueCode

    RogueCode

    Joined:
    Apr 3, 2013
    Posts:
    216
    The pic looks great!

    I have a few questions if you've got a few minutes:

- Will this support on-demand updating? With SEGI I currently tell it to update, then stop updates until something in the level changes (imagine a game like The Sims, where basically nothing changes until you build a wall, etc.). While not updating, SEGI still renders the last result, which saves a ton of performance.

- If the answer to the above is yes, how will moving objects be handled? I obviously wouldn't be moving anything highly emissive, but can I move an object and have it receive light from surrounding objects at its new position? In SEGI this seems to work fine, even without updating the GI (not too sure how; I haven't really tested it properly).

    - How would I go about being in the next round of testing?

- Will emissive materials glow correctly, much like they do in SEGI? In SEGI it's possible to roughly emulate a point or area light by setting an invisible sphere/cube's material emission really high.

- Will it work in HDRP/LWRP as well as built-in?

As a side note, I'm working on another small project in HDRP that's all pre-built levels, so I'm jumping back into Unity's lightmapping world, and the whole experience somehow feels even more painful than it did 5 years ago when I last tried.

    Thanks :)
     
    ftejada and RB_lashman like this.
  36. jjejj87

    jjejj87

    Joined:
    Feb 2, 2013
    Posts:
    524
    Nin, I hope all this is HDRP. If so, consider my money yours.
     
    RB_lashman likes this.
  37. andywatts

    andywatts

    Joined:
    Sep 19, 2015
    Posts:
    92
AFAIK it's the default render pipeline.
    I guess the HLSL compute shader work could be ported to the SRPs.
     
    RB_lashman likes this.