
SEGI (Fully Dynamic Global Illumination)

Discussion in 'Assets and Asset Store' started by sonicether, Jun 10, 2016.

  1. Ninlilizi

    Ninlilizi

    Joined:
    Sep 19, 2016
    Posts:
    294
    This is only limited by available VRAM... but it does support, and has been tested with, map sizes up to a couple of kilometres.

    Can't speak to performance yet. It hasn't been tested on a selection of hardware, just dev boxes with all the transistors. Small indoor maps where you just want a bit of shadow are where it performs best. While it's real-time, suddenly transforming a huge chunk of the map will be very expensive to re-encode, and isn't advisable unless you're really clever about it. The best situation is a scene with only the typical small actors, or a few lights moving about. My own performance yardstick is: better than SEGI, and runs at 90fps+ in VR.
     
    RB_lashman and Baldinoboy like this.
  2. Ninlilizi

    Ninlilizi

    Joined:
    Sep 19, 2016
    Posts:
    294
    Rendering a cubemap will still be cheaper than doing full GI.
     
    RB_lashman likes this.
  3. Baldinoboy

    Baldinoboy

    Joined:
    Apr 14, 2012
    Posts:
    1,526
    So I want high-detail dynamic GI with little to no performance impact for a 20km² map. This will work great for that ;)

    Sounds good, really looking forward to seeing it. Now you will need to send weekly screenshots to sate our GI appetite.
     
    RB_lashman likes this.
  4. Ninlilizi

    Ninlilizi

    Joined:
    Sep 19, 2016
    Posts:
    294
    No, it won't work well for that. Not in the first release, at least. Let's be realistic: it runs up to maybe 4km max before you'll run short on VRAM, and the higher the detail, the larger the performance hit. There's no paging out of stale assignments yet to allow for limitless coverage. That will be a second-version feature.

    It's still GI. GI is expensive however you slice it. You can make good GI, but it still has overheads, in the same ballpark as RTX, with improvements in certain applications.
     
    RB_lashman likes this.
  5. Oniros88

    Oniros88

    Joined:
    Nov 15, 2014
    Posts:
    150
    The huge chunks would be map generation only, at the start of the level (for the maps that are generated; most will be premade). The most common scenarios would be doors opening and light leaking in, the time of day/weather changing, or someone shutting off the lights in a building. I think the "biggest" one we have planned is an underground base map in which one of the rooms (a hangar or missile silo) has a big circular ceiling door that can be opened, letting outside light leak inside.

    Would that be fine? And thanks for the info. We are really, really looking forward to this non-hardware-dependent (no RTX required) GI solution. Do you have any idea what the price will be?
     
    RB_lashman likes this.
  6. Baldinoboy

    Baldinoboy

    Joined:
    Apr 14, 2012
    Posts:
    1,526
    Yeah, I understand, just a joke. I don't expect to get a playable 20km map working in Unity with any lighting.
     
    RB_lashman likes this.
  7. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,492
    Cubemap.

    Basically a variant of this:
    https://community.arm.com/developer...s/dynamic-soft-shadows-based-on-local-cubemap

    but with render feedback to get some GI.
     
    RB_lashman likes this.
  8. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,492
    If your map is essentially a "flat" open world, you could just sample the terrain texture + some decals/masking, using the normal direction's intersection with a proxy plane, then sample the mipmap based on distance to the plane.

    But then you could do the Breath of the Wild trick: a cubemap around the camera, reprojected onto the environment.
    https://twitter.com/flogelz/status/1175053137712955392

    Video on the tweet.

    You can do a lot of approximations with cubemaps, and they are generally good enough.
     
    Baldinoboy and RB_lashman like this.
  9. Baldinoboy

    Baldinoboy

    Joined:
    Apr 14, 2012
    Posts:
    1,526
    Hey @neoshaman, was just joking about that large a map. I will be working on a larger scene this week, but am not planning on GI.

    Sorry if you already mentioned it, but how are cubemaps being used for diffuse lighting?
     
    Last edited: Jun 1, 2020
    RB_lashman likes this.
  10. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,492
    Easy: just sample the cubemap in the direction of the normal. It's called IBL or light-probe lighting. Usually you should convolve the cubemap based on the BRDF (basically blurring using the shape of the BRDF curve), but as an approximation it's good enough to just sample the mipmap and find the closest correlation.

    SH (spherical harmonics) are really just a mathematical cubemap (a kind of low-res format, the JPEG of cubemaps; in games we use them as very low-res cubemap compression) whose diffuse response is close to a BRDF-convolved cubemap, but I have no idea how to update them in real time from an environment capture.

    So cubemap it is. Anyway, if you capture a cubemap to update an SH, you may as well use the cubemap directly. Also, diffuse is low frequency, so you can use a relatively low-resolution cubemap to do the trick, which makes it faster to render; if you can use low-poly proxy objects with lower-res textures (basically just rendering the distant LODs), even better.

    I know there is a roughness-curve approximation of the BRDF using mipmap sampling for Blinn-Phong specular reflection with cubemaps, to replace the complex convolution. I don't know whether there is a ready-made roughness curve for the Lambert convolution that gets close to an actual convolution (the simplest technique, brute force, is to use every pixel of the cubemap as a light source and accumulate ndotl * pixel area, and it's done offline).

    https://learnopengl.com/PBR/IBL/Diffuse-irradiance
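
    That brute-force Lambert convolution can be sketched in plain Python; the coarse spherical grid below is just a stand-in for cubemap texels, and every name is made up for illustration:

```python
import math

def sphere_samples(steps=16):
    """Stand-in for cubemap texels: (direction, solid_angle) pairs
    covering the whole sphere on a coarse latitude/longitude grid."""
    samples = []
    d_theta = math.pi / steps
    d_phi = 2 * math.pi / (2 * steps)
    for i in range(steps):
        theta = (i + 0.5) * d_theta                # polar angle
        for j in range(2 * steps):
            phi = (j + 0.5) * d_phi                # azimuth
            direction = (math.sin(theta) * math.cos(phi),
                         math.sin(theta) * math.sin(phi),
                         math.cos(theta))
            solid_angle = math.sin(theta) * d_theta * d_phi
            samples.append((direction, solid_angle))
    return samples

def irradiance(normal, radiance, samples):
    """Treat every 'texel' as a directional light and accumulate
    n.l * radiance * solid_angle over the hemisphere above normal."""
    total = 0.0
    for direction, solid_angle in samples:
        ndotl = sum(n * d for n, d in zip(normal, direction))
        if ndotl > 0.0:
            total += radiance(direction) * ndotl * solid_angle
    return total / math.pi  # a uniform sky of 1 then yields 1
```

    Run per texel of an output cubemap (or projected onto SH), this is the kind of offline convolution the learnopengl article above walks through; real implementations do it on the GPU.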

    edit:
    Also, it's worth noting you wouldn't need to update the cubemap frequently. Something like 4fps is a common rate, or just update based on a displacement delta from a spatial hash (e.g. only after moving 5m), with an exception for sudden changes.
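
    That update heuristic (a low fixed rate, plus a coarse spatial hash, plus an escape hatch for sudden changes) fits in a few lines; everything here, names and thresholds included, is illustrative:

```python
class CubemapUpdatePolicy:
    """Decide when an environment cubemap is worth re-rendering:
    on a low fixed rate (~4 fps), when the camera leaves its 5 m
    grid cell, or when a sudden scene change is flagged."""
    def __init__(self, interval=0.25, cell=5.0):
        self.interval = interval
        self.cell = cell
        self.last_time = float("-inf")
        self.last_cell = None

    def _cell_of(self, pos):
        # spatial hash: snap the position to a coarse grid
        return tuple(int(c // self.cell) for c in pos)

    def should_update(self, now, pos, sudden_change=False):
        moved = self._cell_of(pos) != self.last_cell
        due = (now - self.last_time) >= self.interval
        if sudden_change or moved or due:
            self.last_time = now
            self.last_cell = self._cell_of(pos)
            return True
        return False
```

    Hooked into the render loop, the cubemap would only be re-captured on frames where `should_update` returns True.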
     
    Last edited: Jun 1, 2020
    Baldinoboy and RB_lashman like this.
  11. Ninlilizi

    Ninlilizi

    Joined:
    Sep 19, 2016
    Posts:
    294
    Regarding the price for Nigiri: nothing has been settled upon. But I am interested in what people think is fair for something that took 2 years of my life (not to mention a small crowd of enthusiastic supporters and the time of 2 additional programmers) and is a decent product.
     
    Last edited: Jun 1, 2020
    RB_lashman and neoshaman like this.
  12. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,492
    100€ minimum; you have been going in-depth.
     
    RB_lashman, DMeville and mgear like this.
  13. Stardog

    Stardog

    Joined:
    Jun 28, 2010
    Posts:
    1,910
    €50-80 makes it comparable to other assets.
     
    RB_lashman likes this.
  14. DMeville

    DMeville

    Joined:
    May 5, 2013
    Posts:
    418
    ....there are other dynamic GI assets on the Asset Store? It's hard to compare it to other assets, as every other GI has only ever been talked about and never "finished" and released. For something to be released and working seems worth much more than that, at least to me (and I've been following all the different systems for years, waiting for more than just promises...)
     
    Last edited: Jun 1, 2020
    Acissathar and RB_lashman like this.
  15. Mauri

    Mauri

    Joined:
    Dec 9, 2010
    Posts:
    2,663
    None currently. Though there's the upcoming MadGoat SSGI (no ETA yet) and various GitHub projects.

    Honestly, with Unity working on their own native solutions (that is: RTX-based GI and... DDGI?), I don't think there will be a need for SEGI or similar assets anymore.
     
    DMeville likes this.
  16. Acissathar

    Acissathar

    Joined:
    Jun 24, 2011
    Posts:
    677
    Assuming that is still on track (I haven't been able to find any updates since it was announced in 2019), it won't be a part of any Unity release until 2021.1.

    Sitting on a project until then, hoping the move to a new version doesn't break anything (SRP always seems to break something) and that the GI implementation actually works as expected, is probably not feasible for a fair chunk of people needing a GI solution.
     
    Last edited: Jun 1, 2020
    RB_lashman likes this.
  17. Ninlilizi

    Ninlilizi

    Joined:
    Sep 19, 2016
    Posts:
    294
    Interesting... MadGoat is doing screen space.

    Nigiri has a dual-encode pipeline, where the primary is also screen space, but a slower-updating secondary encoder handles everything off-screen. A best-of-both-worlds solution. Both are toggleable, so users can decide for themselves whether fast screen space is good enough, and assign only important things, such as lights, to the more expensive secondary. It works out well, with the two encoders feeding a common octree injection pipeline to ensure physically identical inputs from both.
     
    RB_lashman likes this.
  18. razzraziel

    razzraziel

    Joined:
    Sep 13, 2018
    Posts:
    395
    Has anyone tried these? How's the performance and the look?
     
    RB_lashman likes this.
  19. Ninlilizi

    Ninlilizi

    Joined:
    Sep 19, 2016
    Posts:
    294
    They're really good example projects if you're looking to roll your own. But on the whole, all are about as polished as SEGI, which is either good or bad depending on your expectations and requirements. I've poked at nearly all the public projects between my previous nkgi release of early 2018 and the yet-to-be-unveiled Nigiri 2.0.
     
    RB_lashman and razzraziel like this.
  20. Baldinoboy

    Baldinoboy

    Joined:
    Apr 14, 2012
    Posts:
    1,526
    Would say $80-$120. It's worth more than that, of course, but you also want it to sell well. At that price, a lot of people who would buy it just to play with it would wait for a sale, but actual developers would be more than willing to get it in that range. If well advertised and reviewed, it would sell like crazy on sale at $40-$60. I know it's not ideal to have most of your sales at that price, but it probably would be enough to cover your work.
     
    RB_lashman likes this.
  21. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,492
    Also, people will expect support, so you need that 100$ to cover it, especially with the tumultuous Unity RP changes. You should probably also do separate paid add-ons for major new features, because otherwise it won't be sustainable.

    Also, you didn't get paid for the 2 years you spent developing this: you have already invested 160,000$+ of your time. That's a lot, and it's not clear you will ever break even; you would need to sell 1,600 licenses at my suggested price for that, and more once you account for sales.
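
    Spelled out, with the caveat that the yearly rate and the store's revenue share below are my assumptions, not figures from this thread:

```python
# Two years of full-time work, valued at an assumed ~80 000 $ / year.
dev_cost = 2 * 80_000                     # the ~160 000 $ figure above
price = 100                               # suggested licence price

licences_ignoring_cut = dev_cost / price  # the 1 600-licence figure

store_cut = 0.30                          # assumed store revenue share
net_per_sale = price * (1 - store_cut)
licences_to_break_even = dev_cost / net_per_sale
print(round(licences_to_break_even))      # 2286, before any discounts
```
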
     
    LapidistCubed and RB_lashman like this.
  22. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,492
    There is a new contender: the Godot guy has a new GI volume solution.
    https://twitter.com/reduzio/status/1267301593378172930 (the tweet below it has some explanation)
    At first glance it looks like a light propagation volume mixed, via SDFs, with the DDGI concept, plus the occlusion idea from voxel cone tracing to augment DDGI's visibility probes. It does sound like madness.

    edit: the LPV trick might be the visibility query
     
    Last edited: Jun 1, 2020
    RB_lashman likes this.
  23. Ninlilizi

    Ninlilizi

    Joined:
    Sep 19, 2016
    Posts:
    294
    I really want to aim for accessibility. Yes, I need to recoup the time spent when I wasn't doing anything else that earns money, but I don't want to price out the hobbyists while doing it. As such, I was leaning towards the idea of a tiered pricing structure: a supporters/hobbyists/non-commercial tier in their range, then a commercial-usage licence above that, and full source-code access above that. Along with pre-sale demos, instructional videos, and open documentation, so people can be confident in what they're buying, and in the nature of this particular beast, before putting any money down. Too many store assets are a leap of faith in what you're actually getting. I have a particular itch to try to do better than that.
     
  24. Back_Buffer

    Back_Buffer

    Joined:
    Jun 22, 2015
    Posts:
    2
    Here are my tests with SEGI. By the way, thanks for the product!!
     
    VirtualPierogi and AntonioModer like this.
  25. razzraziel

    razzraziel

    Joined:
    Sep 13, 2018
    Posts:
    395
  26. ivanmotta

    ivanmotta

    Joined:
    Jun 19, 2013
    Posts:
    23
    Hi folks! Just wanted to tell you that @Ninlilizi's version of SEGI works on 2018.4, and I just launched a game that fully uses it: Legally Addicted, a procedural dungeon-crawler inside an office building. You play as John Smoke, desperate to get out for a little nicotine after his first day at a big company. All procedural, physics-breaking furniture, imaginary demons, and all that jazz. Made by just me as a side project over the last 3 years.



    It's available on itch.io and gamejolt: sensingames.indie.af/legallyaddicted/
    Some more info is at my website: www.sensingames.com
     
    Last edited: Jul 22, 2020
    chingwa, Ne0mega, DuncanIdaho and 4 others like this.
  27. ksam2

    ksam2

    Joined:
    Apr 28, 2012
    Posts:
    1,079
    Hi, the chromatic aberration in this pic looks nice. Is that the same as the post-processing chromatic aberration?
     
  28. florianalexandru05

    florianalexandru05

    Joined:
    Mar 31, 2014
    Posts:
    1,794
    It's the old one from PP v1.
     
    ksam2 likes this.
  29. ksam2

    ksam2

    Joined:
    Apr 28, 2012
    Posts:
    1,079
    What happened to the Nigiri project? I got it from GitHub; performance was great, but it was totally unusable when moving the camera around :confused:
     
    RB_lashman likes this.
  30. Ninlilizi

    Ninlilizi

    Joined:
    Sep 19, 2016
    Posts:
    294
    Still in progress. The GitHub code is 2 years old now; the current version isn't publicly available. There was a round of testing a month ago, which found there are still some problems, meaning it is not ready for general release yet and certain systems will need re-engineering. That has pushed back its availability some, along with the current state of the world, which had me spending some months fighting off possible homelessness.
    I am currently working on integrating support for the Nvidia OptiX denoiser, after which there will be another round of testing.
     
    Duende, ftejada, nirvanajie and 4 others like this.
  31. Ninlilizi

    Ninlilizi

    Joined:
    Sep 19, 2016
    Posts:
    294
  32. ksam2

    ksam2

    Joined:
    Apr 28, 2012
    Posts:
    1,079
    Last edited: Oct 21, 2020
    RB_lashman likes this.
  33. Ninlilizi

    Ninlilizi

    Joined:
    Sep 19, 2016
    Posts:
    294
    Current code is fully SVO-based, with more accurate material handling.

    It is a complete rewrite that has nothing in common with the old (and admittedly ghastly) GitHub code.

    The downside of using a sparse voxel octree for data storage is the traversal cost.
    The upside is support for huge and outdoor scenes running to several kilometres, along with tiny voxels that allow a lot of previously impossible accuracy and detail.

    But traversal is expensive, which means you can afford far fewer samples, which means you have a lot more noise to deal with. That's why, a couple of months ago, I realized this wasn't ready for the more general release I was hoping for at the time, and am now incorporating some RTX features to handle the noise more elegantly.
     
    Ne0mega, ftejada, RogueCode and 6 others like this.
  34. RogueCode

    RogueCode

    Joined:
    Apr 3, 2013
    Posts:
    230
    The pic looks great!

    I have a few questions if you've got a few minutes:

    - Will this support on-demand updating? With SEGI I am currently telling it to update, then stopping updates until something in the level changes (imagine a game like sims where basically nothing changes until you build a wall etc). While not updating, SEGI still renders the last result, and saves a ton on performance.

    - If the above answer is yes, how will moving objects be handled? I obviously wouldn't be moving things that are highly emissive, but can I move an object and have it receive the light from surrounding objects at its new position? In SEGI this seems to work fine, even without updating the GI (not too sure how, and I haven't really tested it properly).

    - How would I go about being in the next round of testing?

    - Will emissive materials glow correctly, much like they do in SEGI? In SEGI it is possible to roughly emulate a point or area light by setting the emission on an invisible sphere/cube's material really high.

    - Will it work in HDRP/LWRP as well as built-in?

    As a side note, I'm working on a small other project in HDRP which is all pre-built levels so jumping back into the lightmapping world of Unity, and the whole experience somehow feels even more painful than 5 years ago when I last tried.

    Thanks :)
     
    ftejada and RB_lashman like this.
  35. jjejj87

    jjejj87

    Joined:
    Feb 2, 2013
    Posts:
    1,115
    Nin, I hope all this is HDRP. If so, consider my money yours.
     
    RB_lashman likes this.
  36. andywatts

    andywatts

    Joined:
    Sep 19, 2015
    Posts:
    110
    Afaik it targets the default render pipeline.
    I guess the HLSL compute shader work could be ported to SRP.
     
    RB_lashman likes this.
  37. nukadelic

    nukadelic

    Joined:
    Aug 5, 2017
    Posts:
    73
    What's the current state of this project? Could we use it for VR? If so, any tips on how to set it up?
     
  38. Duende

    Duende

    Joined:
    Oct 11, 2014
    Posts:
    200
    Hi folks, do we still not have a solution for dynamic global illumination at runtime in Unity? Could someone catch me up?
     
  39. nasos_333

    nasos_333

    Joined:
    Feb 13, 2013
    Posts:
    13,288
    My very old GI Proxy system still works as an emulation of GI. I worked on the URP version demo lately; here is a sample.

    It's a rather different approach than SEGI, but the principle is not far from how real GI works.

    I emulate the bounce lights using point lights without shadows.

    I am also in the process of porting SEGI to URP, but this could take any amount of time, as it's very complex. I'm also working on a screen-space GI approach.
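
    The point-light trick described above is essentially the classic instant-radiosity idea; a minimal sketch under that reading, with every name (and the surface offset) invented for illustration:

```python
def bounce_lights(sun_dir, sun_color, hits):
    """For each surface point the sun reaches, spawn a shadowless
    virtual point light tinted by the surface albedo and weighted
    by the cosine of the incidence angle (n.l)."""
    vpls = []
    for position, normal, albedo in hits:
        # sun_dir points from the sun toward the scene, hence the minus
        ndotl = -sum(n * d for n, d in zip(normal, sun_dir))
        if ndotl <= 0.0:
            continue  # surface faces away from the sun: no bounce
        color = tuple(a * c * ndotl for a, c in zip(albedo, sun_color))
        # nudge the light off the surface so it illuminates neighbours
        offset = tuple(p + 0.1 * n for p, n in zip(position, normal))
        vpls.append((offset, color))
    return vpls
```

    In Unity terms, each returned tuple would become a small-range point Light with shadows disabled, refreshed at a low rate much like the cubemap updates discussed earlier in the thread.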
     
    ftejada, x4000 and florianalexandru05 like this.
  40. florianalexandru05

    florianalexandru05

    Joined:
    Mar 31, 2014
    Posts:
    1,794
    My personal preference shifted to Bakery in the end. I know what everyone is thinking: it's not real-time, but baking the GI volumes comes close to having real-time GI in my opinion. Having baked GI with a volume that captures the GI and even the shadow is great; it is kind of like probes, but better. This is something I always wanted from SEGI, like CryEngine's SVOGI, where you could bake the voxels in place and then it had little to no impact on performance.

    For real-time objects like trees, grass and plants (but mostly trees), where you can't have baked GI, I used a shader with baked vertex AO for the missing self-shading; the rest of the lighting and shading comes from the volume, and the end results are fast and work excellently.

    Bakery by itself is pretty accurate and fast; you can have path-traced GI. I compared the results with the likes of SEGI and SSRT, and they don't come close to the accuracy of Bakery, just saying... It also supports directional and SH lightmaps, so it fixes the ugly normal-map shading as well. I want to recommend this tool to anyone who might have overlooked it like me. It's still not fully real-time, but it's good at what it does; not sure if moving lights would work, probably not.

    >>>Video examples of baked volumes.<<<

     
  41. one_one

    one_one

    Joined:
    May 20, 2013
    Posts:
    621
    As I can't find much about it in the store description or manual: in what way does Bakery serve as a realtime GI tool? The geometry itself won't be able to change in realtime, I assume (so dynamically generated levels are not an option), and from the looks of it, it also doesn't seem like the light intensity can be adjusted? E.g. going from noon to dusk?
     
    florianalexandru05 likes this.
  42. florianalexandru05

    florianalexandru05

    Joined:
    Mar 31, 2014
    Posts:
    1,794
    Yes, you are right, that's what I'm saying: it won't do moving lights or changing geometry, but if you have a static scene it's a pretty good option for people who want to make static levels, and the Bakery volumes are a better alternative to real-time probes. That's about it. For the things you and most people want, you need a proper real-time solution, which no one has managed to deliver from what I've noticed, not yet anyway; hence we have some talented devs making new GI solutions here on this thread. Unreal Engine 5's Lumen does impress me, though.
     
    one_one likes this.
  43. x4000

    x4000

    Joined:
    Mar 17, 2010
    Posts:
    353
    Are we just talking about URP and the traditional pipeline at the moment? I seem to recall some promising-looking stuff in HDRP that was based around fast point lights. I really don't have a proper sense of how much of my audience I'd be cutting out with HDRP yet, so it's on my list to investigate next year or the year after. For the moment I'm just planning projects that don't require that level of lighting fidelity.

    Here it is -- UPGEN: https://assetstore.unity.com/packag...een-camera-effects/upgen-lighting-hdrp-169744
     
  44. Duende

    Duende

    Joined:
    Oct 11, 2014
    Posts:
    200
    Thank you all for your responses. :)

    Yes, I purchased your asset a long time ago, but it did not fit my project correctly.

    That is very interesting. :eek: I'll keep an eye on that, although I would be interested in HDRP.

    Yes, as one_one told you, I am interested in a dynamic, runtime solution, since my game is procedural and everything is created at runtime.

    I did not know your asset; it is quite interesting and may serve as a temporary solution until someone releases dynamic, runtime GI.

    So Unity still doesn't provide a solution for this? Have they not even talked about it at a conference?
     
    nasos_333 likes this.
  45. x4000

    x4000

    Joined:
    Mar 17, 2010
    Posts:
    353
    Dynamic GI is one of those features that seems really sexy and like it's going to really make things amazing. But the reality is that in the PC market, most of the potential customer base would be excluded from using it. So if your game doesn't look great without dynamic GI, then it's just not going to look great for most people.

    The next gen of consoles obviously support some of that stuff more directly, but how many people really have a PS5 right now, for instance? Even on the PS4, you have to support the original version as well as the PS4 Pro.

    I find myself really tempted by a lot of technologies all the time, but I am reminded that even AAA studios struggle with these things, and they have buckets of money and staff to throw at the problem. It's possible to get a lot of really great results that are attractive by focusing on smart art direction and also by using some of the things that exclude fewer customers, like HDRP and screen-space solutions. Even that still shuts out a lot of the market.

    Back in 2016 I spent a hell of a lot of time chasing certain pieces of technical fidelity, ranging from IK through various lighting elements, and ultimately what I should have realized is that even the AAA titles just don't have this stuff mastered. You look at something like Red Dead Redemption 2, which has amazing graphics, and you'll still see clothes clipping through other clothes, physics oddities, IK issues, and hair that behaves a bit strangely if you look too closely at it.

    Feeling beholden to solve all of these problems in a manner superior to what Rockstar could do is a bit of a trap I fell into, and I wonder how often that happens. As developers, we see lots of flaws in anything we do, and certainly there will always be gamers criticizing every aspect of it. But there's a serious point of diminishing returns, and even harm (reduced customer access, spiraling dev time), that comes when you chase too much fidelity where you could instead focus on art direction and making things look great by artistic rather than technical means.

    Anyway, when it comes to "why has Unity not done XYZ", I can only speculate, but I'd guess it's this sort of mindset. Unreal is pushing hardware to its limits because that's what they're known for, and it's their main competitive advantage. Unity is easier to learn and more flexible in a ton of ways, so if Unreal doesn't stay ahead of them in the visual-fidelity department, they're never going to pick up new developers.

    TLDR: I think that chasing fidelity past a certain point in any technical area is a bit of a trap. It's good to make beautiful things, but it's not great to be on the bleeding edge if you're a small company.
     
    Duende, cxode and nasos_333 like this.
  46. DMeville

    DMeville

    Joined:
    May 5, 2013
    Posts:
    418
    The team at Amplify is apparently working on a realtime, no-bake GI system that will be able to "run anywhere" (so without RTX). Very promising, as Amplify is known for actually making (and releasing) quality tools, but there's not a lot of info out there about it yet, nor any ETA; just a few teaser screenshots on their Twitter. https://twitter.com/fozeta/status/1449366793282334726?s=20
     
    Last edited: Nov 22, 2021
    Duende, cxode, TerraUnity and 2 others like this.
  47. razzraziel

    razzraziel

    Joined:
    Sep 13, 2018
    Posts:
    395
    Good to see they're working on something. Realtime GI is one of Unity's sore spots. I hope they release it soon.
     
  48. x4000

    x4000

    Joined:
    Mar 17, 2010
    Posts:
    353
    Amplify do such good work. I'm amazed Unity hasn't acquired them.
     
  49. Duende

    Duende

    Joined:
    Oct 11, 2014
    Posts:
    200
    Yes, you are absolutely right; that is why, for all the graphical aspects of my game (lighting, shaders, clouds, vegetation, etc.), I'm using assets, so that I can concentrate all development on the game itself and the gameplay, since I'm making the game alone.

    Would my game be boring or bad without those graphical aspects? No, but if with (relatively) little effort I can make it look better because other people developed a package with graphical enhancements, then perfect. :p That is why I was interested in your asset: although it is not perfect GI, it looks pretty good and could serve as a temporary solution until someone releases dynamic, runtime GI.

    About this: I know dynamic GI is very difficult to create; that's why we still don't have a good asset in the store, nor has Unity given us a solution. But it's not impossible, and many years have passed since projects like SEGI appeared. And it's not just Unreal; even Godot has dynamic GI. At no point was I criticizing Unity, though; I have been working with this engine for many years (with many more to come), I just wanted to know if they had announced any plans for this. There are other, more important things about this engine that do deserve criticism. :D

    That's wonderful news. I have some assets from the Amplify team and they do a great job.

    Thanks again, everyone, for the responses.
     
  50. x4000

    x4000

    Joined:
    Mar 17, 2010
    Posts:
    353
    Lots of engines have dynamic GI, for sure. Unity has a challenge in that they support something like six pipelines, plus custom ones. If you've got fewer pipelines, don't need feature parity between them, and know you have a certain set of shader features, a lot of things become easier. But at the same time, you cut out a lot of options.

    I expect that we'll see dynamic GI, but how reliable it is, and how many people's machines it works on, will be the question. Have you released many commercial games? Having some strange thing not work on just a subset of customers' computers is incredibly frustrating, and it's most common in the graphics space. Most code-focused assets will work the same regardless of where you run them, just a bit slower or faster depending on the computer.

    But when it comes to things like dynamic grass, for example, there are a lot of techniques that use features that aren't in Metal or Vulkan. If you just focus on Windows computers that have DX11 or DX12, that does hit most of the market and most of the features, but it's still super frustrating.

    Earlier this year I was chasing a bug involving some sort of NaN propagation between HDR cameras that only happened on OSX on certain Radeon cards. I had to buy a 12th laptop just to be able to replicate the thing, and I don't even remember what the resolution was in the end. It sucked up a lot of time and money on a niche issue.

    As time passes and you're supporting more hardware, or assets get deprecated, it can be pretty challenging to upgrade and keep customers happy. If you're targeting just consoles you don't have that moving target; I'm just advising caution. I particularly like Amplify because they've been around forever and seem to have the means to support their stuff.
     
    hopeful likes this.