
Unity Enlighten deprecation and replacement solution

Discussion in 'Global Illumination' started by Jesper-Mortensen, Jun 19, 2019.

  1. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    4,156
    It's like you aren't reading anything I said :p I can't make sense of your answer; I'm too confused, so I'm dropping it. I wasn't fighting anyway.
     
  2. OCASM

    OCASM

    Joined:
    Jan 12, 2011
    Posts:
    239
    I thought it was very clear. DDGI is a very limited technique that provides low quality results compared to per-pixel ray tracing :D
     
  3. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    4,156
    I think it's more that I was breaking the technique down into its components and showing how its insights can be built on to get better quality at lower cost. I.e. I read the technique conceptually.

    For example, I use "light probe" to mean the spatial and angular data, which can be "encoded" using various support structures (texture pixels, cubemaps, SH grids, etc...), which is why I use words like volume, light probe, etc. in isolation.

    Basically, TO ME, DDGI offers an insight for getting faster and better results using a caching mechanism while solving the interpolation issue caused by "sparse" sampling (leaking). It's not inherently as coarse as the presented implementation shows. We can apply the same insight with different structures (which is what I was discussing). Anyway, that's what ALL implementations of RTX-based RTGI are actually doing now (using different support structures, not just light probes), so opposing the two literally doesn't make sense to me.

    Meanwhile, it seems your perception is bogged down by the usual implementations, which marry multiple concepts into a single solution. I.e. you see a light probe as the data, the support structure, and the implementation all at once (here that would be LPPV in Unity's lexicon). I moved away from LPPV in my explanation, which might be the misunderstanding.

    I think it's because I'm planning other, cheaper (less accurate) implementations of GI, so I'm not reading it like a user. I'm sorry if we weren't speaking the same language lol. You are judging the current implementation; I'm judging the technical concept. An apples-and-Linux comparison, essentially. :rolleyes:
     
  4. SamOld

    SamOld

    Joined:
    Aug 17, 2018
    Posts:
    86
    It's not that simple. It optimises for different types of quality.

    DDGI:
    - Infinite bounces
    - Low frequency indirect and area shadows (but still normal shadowmapping for direct, which is where high frequency information mostly is)
    - No high frequency information from very close geometry
    - Few rays per unit space, allowing many more rays per unit angle
    - Caching and asynchronous update giving constant performance and even more budget for angular collection
    - Works more reliably over large scenes because it has more rays to collect far away information (potentially over multiple frames), giving a higher quality result

    Per pixel:
    - Only one bounce (affordably; could supplement from screen space, but that's quite subpar and will cause light pop-in)
    - True indirect shadows, but limited by few angular rays which could cause artefacts
    - High frequency information from close geometry is available, but at the expense of far away objects
    - No caching (maybe temporal reprojection?) or asynchronous update

    In particular, the single bounce and poor long range reliability of a purely per pixel approach mean that it's not necessarily higher quality. Some scenes would look better with one, and others with the other. I still think that the ultimate solution is likely to be one that uses per pixel traces for very local detail and DDGI for everything outside of the nearest probes giving the best of both worlds at a hopefully affordable additional cost.

    Unity's goal should be to have a solution that covers the widest variety of use cases.
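    As a back-of-the-envelope illustration of the "few rays per unit space, many more rays per unit angle" point above (all numbers here are invented for the comparison, not taken from any shipping implementation):

```python
# Back-of-the-envelope ray budgets (all numbers invented for illustration).

def rays_per_frame_per_pixel(width, height, spp):
    """Per-pixel tracing: every screen pixel fires `spp` rays each frame,
    so each shaded point gets only `spp` angular samples."""
    return width * height * spp

def rays_per_frame_probes(num_probes, rays_per_probe, frames_per_update):
    """Probe field: far fewer sample points, each with many angular rays,
    amortized over several frames because probes update asynchronously."""
    return num_probes * rays_per_probe // frames_per_update

per_pixel = rays_per_frame_per_pixel(1920, 1080, 1)   # ~2.1M rays/frame, 1 angular sample per point
probes = rays_per_frame_probes(16 * 16 * 8, 256, 4)   # ~131K rays/frame, 256 angular samples per probe
print(per_pixel, probes)
```

    Even with a smaller total ray count, the probe field gets hundreds of angular samples per point, which is the trade-off behind the lists above.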
     
    keeponshading likes this.
  5. OCASM

    OCASM

    Joined:
    Jan 12, 2011
    Posts:
    239
    It's always good to exercise the mind with hypothetical concepts but when it comes to production software I think proven solutions are the way to go :p
     
  6. OCASM

    OCASM

    Joined:
    Jan 12, 2011
    Posts:
    239
    Sure, a combination of probes for far-away sources and RTGI for the close-to-mid range would be fine. But just a couple of things:

    - PPRTGI can handle infinite bounces, but just like DDGI it introduces ghosting. As an example, check out Sonic Ether's Minecraft GI.
    - Shadow maps are very expensive for wide penumbras, and DDGI is too coarse for contact-hardening area shadows.
    - Ray lengths can be quite large for per-pixel RT. Metro's AO uses lengths of 50m+.
    - Caching is indeed limited, so performance-wise DDGI certainly wins.
     
  7. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    4,156
    Yeah, but then LPPV would win; RTX is too slow and doesn't scale, and it doesn't match the current market, so it's still experimental (the few games on high-end machines are just technical demos) :p

    Also, most of the stuff I'm talking about is actually proven: we have been using light probe DATA in all sorts of things already, with all sorts of structures (lightmap SH is used in The Last of Us, for example). We already have visibility structures baked into every game, just not paired with probes and tracing (like the huge AO probe cache in The Division).

    Basic RTGI with RTX doesn't win at all: it's not widespread, we haven't solved the sample-count (spp) optimization problem, and we still fall back heavily on old techniques. Only a few devs have developed the specialized knowledge, and we are still on the lookout for every talk of theirs where they share their insights.

    It's not hypothetical ;)

    IMHO, the real reason we don't have an RTGI solution in Unity is that they care too much about accurate light transport (see the video above).
     
    Alverik likes this.
  8. OCASM

    OCASM

    Joined:
    Jan 12, 2011
    Posts:
    239
    1. Fortunately RTGI isn't intended for the current market but for the high-end of 2021 and beyond :p

    2. Sure, but probes in current games are pretty low res compared to RTGI even when fully baked and the best results are reserved for static geometry.

    3. It's not widespread yet. Metro Exodus's solution already handles diffuse and specular GI for all geometry (static and dynamic), along with emissive lighting and area shadows, at 1080p@60fps. For a title released just 6 months after the first ray tracing cards became available, it's an amazing achievement. By 2021, when all major players support hardware ray tracing, adoption will be much more widespread.

    4. Without even a proof of concept it kinda is.

    5. Also, the base HDRP isn't even ready yet.
     
  9. elbows

    elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,197
    Oh look, a new Unity vacancy:

    https://careers.unity.com/position/lighting-developer/1649861
     
    Waterlane, Rich_A, OCASM and 2 others like this.
  10. SamOld

    SamOld

    Joined:
    Aug 17, 2018
    Posts:
    86
    The problem with rays is not that they can't be long; it's that when they get far away, there's a lot of space between them, where important sources of light can be missed. Using probes doesn't fix this, but because there are many fewer of them (and they can update asynchronously), they can have a far greater ray density, which reduces the problem.

    Edit: The probes can fix this if you allow a time delay. After a set distance, they can stop tracing and read out of the previous frame's probe field at the end point of the ray.

    This can also introduce artefacts like this, resulting in a stripy surface. Here I'm showing per-pixel GI with sparse rays along a 1D surface. This can cause problems in scenes with small or far-away indirect light sources. Excuse the thrown-together gif.
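    The missed-source problem described above can be made concrete with a bit of solid-angle arithmetic. A rough sketch (all numbers illustrative, not from any engine): the chance that uniformly distributed hemisphere rays hit a small light source falls off with the square of the distance, and a probe-style budget of hundreds of rays recovers much of it.

```python
import math

def hit_probability(num_rays, source_area_m2, distance_m):
    """Chance that at least one of `num_rays` uniform hemisphere rays hits a
    light source of the given area at the given distance. Sketch only: the
    source is treated as a small patch facing the shading point."""
    solid_angle = source_area_m2 / distance_m ** 2        # steradians subtended
    p_single = min(1.0, solid_angle / (2.0 * math.pi))    # one uniform hemisphere ray
    return 1.0 - (1.0 - p_single) ** num_rays

# A 1 m^2 emissive panel 20 m away:
print(hit_probability(1, 1.0, 20.0))     # per-pixel budget of 1 ray: roughly 0.04%
print(hit_probability(1024, 1.0, 20.0))  # probe budget of 1024 rays: roughly a third
```

    Stochastic per-frame sampling and temporal accumulation (discussed below in the thread) trade this spatial miss rate for noise and latency rather than eliminating it.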

     
    Last edited: Jul 14, 2019 at 10:31 PM
    hippocoder and neoshaman like this.
  11. OCASM

    OCASM

    Joined:
    Jan 12, 2011
    Posts:
    239
    I see what you mean now. But no need to worry. RTGI traces rays stochastically so every frame the ray directions change a little, increasing the chances of selecting a good sample. Also results are accumulated temporally so flickering is mitigated. Add some importance sampling strategies to the mix and you're set :D
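    The combination described here, per-frame jitter plus temporal accumulation, is commonly implemented as an exponential moving average over frames. A minimal sketch (the names and the 0.1 blend factor are illustrative assumptions, not from any particular engine):

```python
import random

def jittered_direction(base_dir, jitter, rng):
    """Stochastic sampling: nudge the ray direction a little each frame so
    that, over time, the rays cover the gaps between the base directions."""
    return tuple(c + rng.uniform(-jitter, jitter) for c in base_dir)

def accumulate(history, new_sample, alpha=0.1):
    """Temporal accumulation: exponential moving average of per-frame results.
    A small alpha smooths flicker, at the cost of some ghosting on change."""
    return tuple(h * (1.0 - alpha) + s * alpha for h, s in zip(history, new_sample))

rng = random.Random(0)
radiance = (0.0, 0.0, 0.0)
for _ in range(60):                                        # 60 frames of a constant signal
    ray = jittered_direction((0.0, 0.0, 1.0), 0.05, rng)   # this ray would be traced here
    sample = (1.0, 0.5, 0.25)                              # pretend the trace returned this
    radiance = accumulate(radiance, sample)
print(radiance)                                            # converges toward (1.0, 0.5, 0.25)
```

    The ghosting mentioned later in the thread is visible in this model too: after a sudden change in the true signal, the average needs tens of frames to catch up.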
     
  12. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    4,156
    It's also notable that Metro uses a simplified color source for GI to reduce sampling noise, i.e. they don't sample the full detail textures; they basically use flat-colored surfaces!

    So, ghosting, right? ;)

    By the time RTX GI becomes widespread and mature, so will other techniques anyway... "flat tracing" will only remain a tech demo showcase of early RTGI :p

    /playing the game
     
  13. SamOld

    SamOld

    Joined:
    Aug 17, 2018
    Posts:
    86
    We've gone very far off topic for this thread. We should probably move this discussion elsewhere if we're continuing it.
     
  14. keeponshading

    keeponshading

    Joined:
    Sep 6, 2018
    Posts:
    257
    A good, complete read-up with lots of realities:

    https://www.slideshare.net/mobile/c...re-challenges-of-global-illumination-in-games

    In particular the (< n) ms threshold and the power cost (€).

    The most off-topic thing in all of this is to focus on a diffuse-based lightmapper powered by OIDN and Radeon Rays, and to announce a deprecation without delivering a replacement before 2021+.

    As described before, professional full-featured GI baking is available open source (around 5 or more years ahead of PLM) and is even the basis for fancy light field baking, among all the other realtime GI solutions.

    It is also the benchmark for the (< n) ms threshold and the power cost (€) for more than the next few years.
     
    Last edited: Jul 12, 2019 at 11:06 AM
  15. keeponshading

    keeponshading

    Joined:
    Sep 6, 2018
    Posts:
    257
    .....so here is what you could have today, out of preview (not "2021+, in preview"), after one day of calculation, by making lightmap and probe streaming available via virtual texture sets like Granite:

    a photorealistic GI day/night cycle with moon and sun lighting, pulled out of 48 calculations.

    - Photoreal
    - Running on mobile (Android) and VR (with slight downscaling of textures) at the same quality


    https://forum.unity.com/threads/bakery-gpu-lightmapper-v1-6-released.536008/page-64#post-4739741


    ...by simply supporting the right stuff.
    If Mr. F were given back the time he has to spend reverse-engineering the lighting asset, there would be much more advanced stuff for the probe rendering.

    So: 140 texels per unit baked down to 16x 4K lightmaps plus probes in 00h34m18s at this quality.

    In 24h a complete day (48 bakes) is calculated on one GPU.
    You can mostly spend those calculations on the daylight hours if you don't need the full-moon lighting for your scene.

    For outside environments you can scale the 140 texels per unit down a lot to cover a big area.

    Unity has the scene as well.
    Would it be possible to show PLM output at comparable quality and speed?
    I tried for 6 months. The best result was 7 hours at a quarter of the quality, with more than 50 crashes.
     
    Last edited: Jul 12, 2019 at 3:23 PM
  16. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    4,156
    For the curious, I found the DDGI paper, which is super readable and goes into more depth than the video and blog:
    http://www.cim.mcgill.ca/~derek/files/DDGI-highres.pdf

    - Apparently they also propose an accelerated ray-tracing structure that basically traces through the light probe structure itself (not RTX, not a BVH; basically cubemap hopping).
    - They don't use SH as in a typical LPPV; they use full cubemaps, laid out with octahedral encoding, in an atlas.
    - They use a G-buffer at the cubemap level to compute the lighting and accumulate it over time.
    - You don't need to use a grid; any linked probe structure will do (tetrahedral, box-projected cubemaps, etc...).
    - Results with 1m spacing are very close to ground truth.
    - It looks trivial to implement a simple version.
    - It's kind of close to my own hypothetical and (as yet) untested solution; both use a cubemap atlas as the visibility structure (though differently) plus a texture G-buffer. The main difference is that mine uses the cubemaps as sample addresses into a G-buffer stored in a lightmap, defines shadows by sampling the analytical skybox through the visibility structure, and replaces tracing with box projection. Both light objects into an async structure that is then simply sampled by geometry at runtime.
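    The octahedral encoding mentioned above is a standard mapping from a unit direction to a square, which is what lets a whole sphere of probe data live in a flat 2D atlas. A minimal sketch of the usual formulation (not code from the paper):

```python
def octahedral_encode(x, y, z):
    """Map a unit direction onto the [0,1]^2 square (octahedral layout).
    The lower hemisphere is folded outward so the whole sphere fits one tile."""
    s = abs(x) + abs(y) + abs(z)
    u, v = x / s, y / s                        # project onto the octahedron
    if z < 0.0:                                # fold the bottom half outward
        u, v = (1.0 - abs(v)) * (1.0 if u >= 0.0 else -1.0), \
               (1.0 - abs(u)) * (1.0 if v >= 0.0 else -1.0)
    return (u * 0.5 + 0.5, v * 0.5 + 0.5)      # remap [-1,1] -> [0,1]

print(octahedral_encode(0.0, 0.0, 1.0))   # straight up lands at the tile center (0.5, 0.5)
```

    The inverse mapping plus bilinear filtering across tile borders is what a runtime shader would need; this just shows why one square per probe is enough.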
     
    hippocoder and keeponshading like this.
  17. OCASM

    OCASM

    Joined:
    Jan 12, 2011
    Posts:
    239
    1. Great optimization.
    2. Minimal compared to accumulation of multiple bounces ;)
    3. And probes will be a thing of the past :D

    /playing the game

    Requires UVs and precalculation. Not a good fit based on the requirements @Jesper-Mortensen posted.
     
  18. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    4,156
    Except it also accumulates all bounces (i.e. the previous frame's bounces) :D that's all RTGI by now. The sampling is spread over multiple frames; that's the accumulation and temporal part. I.e. the data is obsolete the next frame.

    Except not, because it will always be faster to compute the entire level field async and frame-rate independent, for cheaper, with a single lookup at runtime. That scales to weak hardware, and will be supplemented by vanilla screen-space tracing AFTER screen-space per-pixel solutions are used to jump-start the rays, so you can have more RTX spp to help the temporal denoising pass (see Frostbite). :cool:

    We reach the same conclusion anyway: the future is glorious for RTGI at all scales.
     
  19. keeponshading

    keeponshading

    Joined:
    Sep 6, 2018
    Posts:
    257
    UVs are so '80s, and the result of a GI or full-GI equation that we could pull out of an artificial black hole opened on the GPU. Saves precalculation.
    (All computer graphics science aimed at physically based realism is wrong.) We don't need to calculate it anymore, now that we finally could; just use any approximation, because we must get rid of precalculation.

    The requirement list you mentioned fits perfectly with

    minecraft-ray-tracing.jpg

    No UVs (only triplanar). Big open world. Decoupled. Newest RT tech. Here it is.

    It's not that hard to rearrange already-authored UVs today with current packing algorithms; Bakery does it well.
    To get nice, consistent probes you can work with, like Bakery's, those are needed too, and are generated after the lightmap calculation.

    But since PLM cannot do it, we declare UVs deprecated.

    Since we can't do the simplest multi-scene lighting editing today, we must decouple lighting...

    Mhmm?
    Sorry, it's hard to imagine, and it reads more like an excuse for today's situation, after more than 3 years of technical idleness in Unity's lighting, with future targets like reaching GPU parity with PLM's CPU feature set.
     
    Last edited: Jul 14, 2019 at 9:22 PM
  20. keeponshading

    keeponshading

    Joined:
    Sep 6, 2018
    Posts:
    257
    Here is probably the most complete read-up, with a full list of sources and production uses.

    Part 3: DDGI Overview
    https://morgan3d.github.io/articles/2019-04-01-ddgi/overview.html

    see also
    Part 2: Global Illumination
    and
    Part1:
    Dynamic Diffuse Global Illumination

    It shows the actual state of the art and its uses, and why a full-featured path-traced solver for lightmapping and every potential further realtime XXGI method (Enlighten has the most advanced and most used base tech) should not be handled as separate things.
    Likewise, all the upcoming DXR RT tech should be seen as additional features that replace precalculated ones when you can afford it, not as a general replacement.

    As a blog post, something in this direction is what you would expect to show off the future of Unity.

    Not the deprecation of a base tech (Enlighten), bundled with PLM GPU-to-CPU feature parity and statements about going "UV free" and "precalculation free".
     
    Last edited: Jul 14, 2019 at 1:26 PM
  21. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    25,276
    I don't care. I just need it sooner than the frankly frown-worthy deadline of 2021.1 when clearly Unity knew what was happening in 2017.

    It feels bad to be stuck. But maybe it means Unity gets better tech than they would've (DDGI, for example).

    Still sucks to be me.
     
  22. SamOld

    SamOld

    Joined:
    Aug 17, 2018
    Posts:
    86
    This isn't very useful for the average person, but there's always the option of building your own. I was planning on doing that for my current project anyway, as I need true dynamic GI. Building a general-purpose solution like Unity needs to is hard, but building something for your own needs is a lot easier than you might expect. You do need to be a comfortable programmer and to be happy dealing with shaders and compute, but I've built a GI prototype before and it came in under 2,000 lines of code.
     
    OCASM likes this.
  23. OCASM

    OCASM

    Joined:
    Jan 12, 2011
    Posts:
    239
    1) Not really; ghosting in Metro's GI is pretty much imperceptible.
    2) That will be good enough for mobile games, but for high-end titles that quality is unacceptably low. Also, Metro already does a screen-space ray tracing pre-pass.

    Sadly (actually fortunately), most games aren't made out of cubes. Triplanar mapping is relegated to only a few use cases, so you'd still need UVs, which for organic models continue to be a pain in the ass to set up.
     
  24. sacb0y

    sacb0y

    Joined:
    May 9, 2016
    Posts:
    105
    So what are we supposed to do if we want a day and night system?
     
  25. keeponshading

    keeponshading

    Joined:
    Sep 6, 2018
    Posts:
    257

    This was more of a post with some irony; a reaction to post #2 of this thread.

    ....
    • Fast iteration: Time-to-first-pixel needs to be fast, cannot have a lengthy pre-compute step.
    • Easy authoring: We need to remove the dependency on authoring suitable UVs and other surface based authoring.
    • Dynamic worlds: In addition to dynamic materials and lighting setup, we have to support dynamic geometry (eg. for procedural games).
    • Unified lighting: The lighting container needs to be decoupled from surfaces. This allows all scene elements to use the same lighting including volumetrics and participating media.
    • Large worlds: Due to the sheer size of levels today we need an easy way to do localized light transport where what is lit and what is affecting that lighting is decoupled.
    • Source access: We need to have full access to all source in-house. So that we can independently drive development forward, fix bugs and support future platforms. This is arguably the most important point.
    ...
     
  26. keeponshading

    keeponshading

    Joined:
    Sep 6, 2018
    Posts:
    257
    Let's do it.

    I can calculate the data for a dynamic GI day cycle.

    Bakery: around 1 day for GI with 48 calculated (sun/moon) positions.
    or
    Cycles: around 3 days, but full-featured GI with 48 calculated (sun/moon) positions.

    Then calculate the correct 48 probe sets out of those baked lightmap sets.
    1 GPU, 1 day, mid-size scenario.
    With PLM you would need a minimum of 20 days, for an incorrect calculation that no one can explain.

    I was also nearly finished streaming and interpolating the lightmaps via 64K virtual texture sets with Granite, after 4 months of fiddling around.
    Then Unity acquired Granite some months ago and I had to stop, because it is no longer available for new versions (deprecated in the Asset Store) and I have no idea or information about what Unity plans to do with it.

    So if you have an idea how to interpolate all the probe data multithreaded, and how to update lightmaps and probes at runtime with some dynamic input (what Enlighten did):

    let's talk. :)

    All I need is a photorealistic, physically correct day and night cycle with some additional interactivity.
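    The interpolation step described here, blending between the two baked snapshots that bracket the current time of day, can be sketched as follows (the function names and the 48-snapshot layout are assumptions taken from the post, not from any shipping tool):

```python
def blend_baked_sets(snapshots, hour):
    """Pick the two baked lighting sets bracketing `hour` and lerp them.
    `snapshots` is a list of 48 probe/lightmap sets, one per half hour,
    each flattened here to a plain list of floats for the sketch."""
    n = len(snapshots)
    t = (hour % 24.0) / 24.0 * n              # fractional index into the cycle
    i0, frac = int(t) % n, t - int(t)
    i1 = (i0 + 1) % n                         # wrap midnight back to the start
    a, b = snapshots[i0], snapshots[i1]
    return [x * (1.0 - frac) + y * frac for x, y in zip(a, b)]

# 48 fake single-channel "probe sets" whose brightness equals their index.
sets = [[float(i)] for i in range(48)]
print(blend_baked_sets(sets, 12.25))   # halfway between set 24 and set 25
```

    The same lerp applies per lightmap texel when the snapshots are streamed as virtual texture tiles; only the two bracketing sets need to be resident at once.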
     
    Last edited: Jul 15, 2019 at 4:15 AM
  27. SamOld

    SamOld

    Joined:
    Aug 17, 2018
    Posts:
    86
    I'm sorry but I don't really understand what you're trying to say. You appear to be talking about baking but I was talking about a realtime dynamic solution.
     
  28. keeponshading

    keeponshading

    Joined:
    Sep 6, 2018
    Posts:
    257
    Sorry, I was thinking of a dynamic solution based on precalculated GI data, with a photorealistic and mostly physically correct outcome.

    So is it more like realtime denoising of a realtime path-traced scene?
    This is the fastest I have come across:
    https://benedikt-bitterli.me/mmbj/
    It needs around 30ms for an HD frame.
    Diffuse direct and indirect is crazy nice, but the glossy blur depends heavily on camera position.
    Still, for non-photorealistic scenes it is very nice.
     
    Last edited: Jul 15, 2019 at 4:37 PM
  29. keeponshading

    keeponshading

    Joined:
    Sep 6, 2018
    Posts:
    257
    Anyway: I have tried in several posts to show a logical and fast way to get on top regarding baked lighting, realtime preview, open source, free, scalable, a full GI feature set..., years ahead of PLM, free for platform-independent use, and, more than that, a solid base for all further realtime GI development.

    like

    https://forum.unity.com/threads/enlighten-deprecation-and-replacement-solution.697778/#post-4681067

    https://forum.unity.com/threads/enl...placement-solution.697778/page-2#post-4721408

    The funny thing is there has been almost no reaction, mostly soft smiles.
    Meanwhile, here is some indication (and it is more than that) that some others have done their homework and benchmarks and are preparing, among other things, fully sponsored access to this free tech, in addition to a paid professional V-Ray integration.

    https://80.lv/articles/epic-games-supports-the-blender-team-with-1-2-million-epic-megagrant/
    ...
    “Open tools, libraries and platforms are critical to the future of the digital content ecosystem,” said Tim Sweeney, founder and CEO of Epic Games. “Blender is an enduring resource within the artistic community, and we aim to ensure its advancement to the benefit of all creators.”
    ....
    Reading between the lines, this is the most intelligent "investment" in a long time, with a really high payback factor and without any risk.
     
    Last edited: Jul 15, 2019 at 11:40 PM
  30. Mauri

    Mauri

    Joined:
    Dec 9, 2010
    Posts:
    1,457
    Ever wonder why? This thread isn't about finding a replacement for the current lightmap baking tools inside Unity. It's about finding a replacement for Enlighten by (hopefully) going fully dynamic realtime GI, e.g. something like SEGI or HXGI, only better. Most of your posts sadly don't help...
     
    GameDevCouple_I and SamOld like this.
  31. SamOld

    SamOld

    Joined:
    Aug 17, 2018
    Posts:
    86
    Yes. @keeponshading I don't mean to be rude, but your posts here have been very hard to read and haven't made much sense, and they don't appear to be on topic.
     
    GameDevCouple_I likes this.
  32. keeponshading

    keeponshading

    Joined:
    Sep 6, 2018
    Posts:
    257
    I agree that some points are hard to read and that I should step back a little. The last two Unity blog posts reference the complete lighting pipeline, as does the first post of this thread.


    I'm probably not very diplomatic because I still feel shot in both knees, and from my experience this current development will become really dangerous for Unity.

    There are lots of use cases for Unity.
    My needs are more in the high-end sector, where physically plausible, photoreal visualization quality and maximum performance are needed. That requires access to cutting-edge stuff, paid as middleware or built in. Hard competition is daily business there. The competition does everything to show off that you have all the possibilities, versus a warm "we have something we cannot talk about. Come back in 2021. Until then, use the deprecated one, as it is (not working)." Sorry, but this communication is hard to accept and, to me, rude towards your customers.

    So I like SEGI and HXGI, but with all the limitations they have, they are not an option for this target.

    However, there are lots of realities.
    For high-quality lighting there is a big vacuum now, and only one external asset option, Bakery, that can deliver a productive, professional workflow towards these targets.
    There is also a big loss of trust in Unity's ability to handle this, but I think this is more my way of saying goodbye while trying to show what happens out there in my reality.
     
    Last edited: Jul 16, 2019 at 3:54 AM
  33. Mauri

    Mauri

    Joined:
    Dec 9, 2010
    Posts:
    1,457
    I'm not sure why you keep bringing up Bakery all the time. Bakery is not a realtime GI solution and will never turn into one, let alone a replacement for Enlighten. It's a GPU lightmapper (a great one, I admit). Plus, its baking is currently limited to NVIDIA cards and Windows only. All of this you can take from the first post of the Bakery thread.

    But then again: Unity is not looking for a new lightmapper.

    I mentioned SEGI and HXGI (I could've also mentioned SVOGI or NVIDIA's VXGI) because that's what I assume (and hope :p) Unity is planning to come up with, but who knows? It's a long time until 2021(.1) hits, so maybe we should just wait and see?
     
  34. keeponshading

    keeponshading

    Joined:
    Sep 6, 2018
    Posts:
    257
    Some bigger journey...
    Probably a nice, simplified example from Avalanche, who have the know-how to set up the 4 best available full-GI path tracers for comparison.

    - All 4 scenes were set up with a 50% gray diffuse material and a floor material of the same RGB value,
    0.5 roughness/glossiness,
    IOR 1.52 and Fresnel based on IOR.
    - All 4 scenes use environment light from the identically oriented HDRI map.

    - All 4 renderers were set up to use full brute-force GI (no caching),
    using 10 bounces for all the light path types (diffuse, reflection, refraction, etc...), and all 4 renderers were set up with ray clamping/max ray intensity at 10.

    By applying normal PBR materials you reach absolute photorealism.
    But here is another showcase with the simplest scene setup.

    Here they are.

    Corona

    19f07a5db77dc49e1418b2f46defe9e506365554.jpeg


    IRAY

    bd57094ee2f50d604fb071ed8f9c977ccb11add4.jpeg

    VRAY

    6aea5593131fe0b8a85c872eca2739e659ed8afe.jpeg

    Cycles

    3f2bdab9f5986d0a503fe90c471a1b7fd72d974e.jpeg


    Corona, iRay and Vray render linear output by default. Cycles defaults to Filmic so it was switched back to linear.

    So this is the absolute minimum quality you need as a base for photorealistic, physically plausible work. One of them is available for free.

    Every renderer can bake this down.

    PLM is far away, in terms of speed, feature set and quality. More: the speed will probably never get there, because of some complexity.

    This is the base, and the start for everything, realtime GI included. Why?
    Extrapolate this to every scenario. The room is only an example here. It could be a complete city, a forest, a car...

    So for me it is not possible to wait and hope for 2 years.
    It is also not possible to keep using Enlighten, as-is, as a deprecated system.

    Sorry, Bakery again, because it is the only production-ready, working system we have in Unity for now, and it is better to work, optimize and plan with stuff that is available and accessible... now.

    Outcome of the current tests: Bakery is really close.

    The Archviz 6 scene needs 32 min on one Titan RTX.
    By using 4x Titan RTX with separate CUDA 0 to CUDA 3 allocation you would need 8 min (not working in Bakery now, but realistic, because it works in Cycles/CUDA).
    Then you calculate the probes out of this high-quality lightmapped time-of-day shot.

    Then do this for an HDR timelapse, 50 times, for a complete day/night cycle.
    400 min is not bad.
    The current PLM would need more than a month, at much lower quality.

    Then interpolate all the data via virtual textures, with some multithreaded probe interpolation on the mouse wheel.
    Voilà: a dynamic day at photorealistic GI quality.

    Possible now. Some kind of light field GI, but not really.

    To do:
    alter this data, textures and probes, through additional realtime lights to get some additional dynamic stuff.

    The 50 calculations for a day/night cycle can mostly be spent on the daytime hours, when you do not need the full-moon lighting. So you calculate complete GI for roughly every 15 min of the day, which should be enough.

    These calculations should be done with a clear-sky IBL timelapse: around 38 of them.
    The overcast variant is only needed every hour, because the frames are very similar: around 12.

    Then you have a photorealistic dynamic day on your mouse wheel. Hit Shift to interpolate towards overcast, available for every minute.

    The quality is higher than in every game at the moment.
    Look at the VR, Android and desktop builds with a constant high framerate: 60 or 90+ fps.
    The methods you described, like SEGI or HXGI, give you a quarter of this quality at around a quarter of the fps.
    Those methods also need a lot of artist effort to fake it right, without ever getting it right.

    So all you need is 400 min of baking and, for sure... UVs :)
    and access to Granite :)

    And coming back to Cycles: the same method runs there, but you additionally have full-featured GI access, meaning 128 diffuse rays, transmission rays... plus a realtime ray-traced preview, with the meshes, light source positions, UV authoring, mesh authoring and more streamed in realtime from Blender to Unity. :)
    Finished soon.
    It is not done in 400 min, more like 4000 or more, but you get caustics on the table if you want them, and perfectly lit detail in the darkest corner. Artist effort can then go fully into post-processing and visual storytelling, because you already have a photorealistic base.

    If there is any other solution on the horizon for getting similar quality and performance, it would be nice to know about it.

    The math here is really easy.
    The GI (or full-GI) formula has been written down in the Global Illumination Compendium for many years. This approach solves it through precalculation, in 400 or 4000 min, mostly correctly for static content, using the most high-tech parallel processing available today.

    So there can simply be no miracle realtime variant of the math delivering the same quality and speed in a few ms in the same year.

    Also, HDRP has come a long way towards being truly physically plausible.
    To feed it with some gambled, one-sample, denoised "GI" approximation would not be fair, except for additional DXR RT reflections when your frame budget allows it.
     
    Last edited: Jul 17, 2019 at 9:58 AM
  35. keeponshading

    keeponshading

    Joined:
    Sep 6, 2018
    Posts:
    257
    This is mostly (visually 90%+) everything that makes the difference
    to get from
    PLM baked,

    DGKoKhnVYAAamTD.jpg

    to here: Cycles, Corona, V-Ray, iRay

    Detox-a-Fontainebleau (2).jpg

    A physically correct "full" GI precalculation first.
    The rest makes a dynamic day cycle out of it.
    Everything needed to reach this is in the Fontainebleau HDRP dataset and IBL.
    Simply add correctly precalculated "full" GI.
     
    Last edited: Jul 17, 2019 at 2:01 AM
  36. GameDevCouple_I

    GameDevCouple_I

    Joined:
    Oct 5, 2013
    Posts:
    1,607
    That looks like two different scenes? It would be better to provide a side-by-side comparison of the same scene from the same camera angle, in both Unity and Cycles or Eevee.

    Also why are you even showing shots of prerendered stuff? You really should only be comparing realtime stuff as we are talking about real-time GI, so bringing up stuff like cycles, corona, V-ray, Iray is not really helpful to the discussion and only confuses things more. Those are not real-time renderers, the only one that would come close is Eevee, which has a noticeable hiccup when making big changes in a scene and therefore is also not really comparable to the speed required to run real-time GI alongside game logic, etc etc
     
    neoshaman and SamOld like this.
  37. keeponshading

    keeponshading

    Joined:
    Sep 6, 2018
    Posts:
    257
    Now, coming back to the main blog post,
    Enlighten deprecation and replacement solution,
    and the real outcome of this dangerous communication style and behaviour: shooting your core customers down from behind.

    Sorry for again taking the direct competition so heavily into account.
    Some sentences may be overstated, but they mirror what is happening.

    I recognize that some people have a problem with that.
    But you should do it. Daily.

    I am concentrating on lighting solutions here, but something similar applies to networking.

    In my last posts I described the candidates with the most potential, and the need for access to free, open-source, or best-in-class solutions.

    Unreal

    Enlighten (Silicon Studio) as middleware, with further development of heavy scene optimization and combination with RTX, ...
    Yebis as middleware
    V-Ray integration
    Lightmass
    ....
    ....
    ....

    Funded Blender with $10,000 in 2014,
    and
    yesterday $1.2 million to the Blender Foundation,
    to prepare and improve, among lots of other things, access to the best free solution out there: Cycles.

    "
    So Epic is making the case that, if you go with Unreal, you don't have to worry about who Epic has made deals with. Those partnerships are for you to make. We also massively invest in fully open-source solutions, to give you the best possible access for free and skyrocket your revenue.
    "


    Unity

    Geomerics does not exist anymore; it was shut down. We have problems with Silicon Studio.
    Enlighten does not exist anymore? We remove it and give you some replacement in 2021+.
    But you can use the deprecated, non-working integration up to xxxx.

    You will get PLM GPU feature parity with the CPU soon.

    Oh, and some Octane chaos, with high per-GPU pricing and no benefits over the free solution shown.

    Simply take some months to reverse-engineer our black-boxed lighting asset when you build your own.
    We make a simple change, and then you can start from the beginning.

    When you need something productive, you should rely on a, meanwhile world-class, Asset Store solution for what it does. It's based on OptiX. Windows only. We cannot do this for our users. In the next sentence: we integrate OptiX and OIDN.)

    So I think it's finished now. Take it as the community input that it is.
    It helps no one to hide these realities.
     
    Last edited: Jul 17, 2019 at 1:43 PM
  38. GameDevCouple_I

    GameDevCouple_I

    Joined:
    Oct 5, 2013
    Posts:
    1,607
    I am sorry, I don't mean to be rude, but your post and many previous ones don't make sense, are poorly written, and seem to be mixing points from all sorts of topics, thereby confusing the information.

    This is about real-time GI. Everything else you are talking about, such as Cycles, Unreal funding Blender, and all sorts of other stuff, is just not relevant in any way. You're literally derailing this entire thread. This is not a messenger application; what you write stays here and confuses users who come later to navigate this already messy thread.

    Please keep things on topic, and open other threads to discuss the many, many topics you are mixing together. Also, please try to format your posts in a more readable fashion; it's difficult to read such long walls of text already, let alone when you do not format them (seemingly by choice, given that you have used completely different formatting in each post...?)
     
  39. keeponshading

    keeponshading

    Joined:
    Sep 6, 2018
    Posts:
    257
    Thanks for the valuable input. It's OK; that's not rude. The blog post is the most rude thing I have read in years, if you specialize in lighting. As an animator I would say: Great. No UVs. Go on.
    Probably I don't know what rude is.
    However, English is not my native language, nor am I able to explain the content of several books in one post.
    I am also not an illustrator. I think this was the last post.
    Only a community contribution, mostly formed by my personal experience.
    And it is all about real-time GI, starting from the base.

    If you wrote similar things about networking, I would not understand anything, so I can understand your view.
     
    Last edited: Jul 17, 2019 at 2:15 PM
  40. runner78

    runner78

    Joined:
    Mar 14, 2015
    Posts:
    43
    Don't forget: Enlighten is not licensed by Unreal; developers/studios must license it themselves (as far as I know).
     
    Last edited: Jul 17, 2019 at 2:47 PM
  41. keeponshading

    keeponshading

    Joined:
    Sep 6, 2018
    Posts:
    257
    As middleware, sure. I don't think Silicon Studio is interested in providing access after this, so for now it feels like a cutoff.
    Let's see whether Yebis remains available for Unity after this. In the built-in RP it is still the reference for post-processing; PPv3 in HDRP is on a good way.
    So the communication style of the blog post could turn into a double cutoff by painting things black. I hope not.

    And thanks to Yebis, in our case we could position Unity alongside Unreal in our pipeline by showing similar quality some years earlier.
    So everything is connected.

    The Granite texture streaming asset was also deeply integrated into our solution: half a year of hard work.
    Then Silicon Studio announced in January that it would take it into its middleware portfolio too. OK. Then Unity announced the purchase of Granite some months later. Sounds good.
    But for several months since then there has been no announcement of what happens next or how it will be available. The asset got deprecated.
    And texture streaming is the base technology for the coming years.
    Amplify Texture never got released.

    So sometimes it would be nice if Unity would take the developer perspective more often.

    All in all, really hard hits because of the Unity / Silicon Studio games. And now the Enlighten story, which is worse than the Game of Thrones finale.

    Is it not possible to find solutions and a communication style that do not destroy hundreds of projects, relationships, and trust?

    And sometimes it is necessary to comment on business decisions.
    I would prefer to create nice stuff with Unity to skyrocket the business. That has actually become impossible through business decisions made without delivering any replacement for key technologies.

    From my point of view, Unity should provide, by definition, supported, up-to-date access to all key technologies.
    Paid or not.
    When you have something better in place for your customers, for free, they will be happy.
    Not earlier.
     
    Last edited: Jul 17, 2019 at 10:59 PM
  42. sacb0y

    sacb0y

    Joined:
    May 9, 2016
    Posts:
    105
    Apparently Silicon Studio is interested in further supporting Enlighten in Unity; they claim Unity doesn't want to bother.

    Which is annoying, because I prefer Enlighten to the lightmapper for now, simply because it doesn't require me to fix all my assets (and assets from the Asset Store). I'm not even sure why overlapping UVs are an issue in the first place, but in my projects I can't even get the Progressive Lightmapper to work.

    But the main issue is that there are apparently fixes we could have, because Enlighten is still being updated; we just can't get them because Unity doesn't consider it worth the effort. It'd be nice if they at least explained why that's the case, but we're left to just complain... ¯\_(ツ)_/¯
     
    keeponshading likes this.
  43. Mauri

    Mauri

    Joined:
    Dec 9, 2010
    Posts:
    1,457
    Unity simply doesn't want to rely on third-party tools any longer.

    Enlighten isn't going away anytime soon, anyway (see the last sentence).
     
  44. keeponshading

    keeponshading

    Joined:
    Sep 6, 2018
    Posts:
    257
    Relying on something is completely different from supporting access to a middleware.
    See Unreal's strategy.

    Unity decided around 2017 not to rely on it. What do we have 2.5 years later? A full commitment for 2021.

    You should really remove Rewired, Photon, and Bakery too.
    Better yet, close the complete Asset Store if the new core strategy is not to rely on anything.

    Also, Enlighten is a widely used, nice, precalculated real-time GI system that will not go away just because Unity decided to deprecate a version from the old license holder.

    So if you do not have the dev power of a AAA studio, to invest, say, some years in developing your own and to hire some really smart people, there are not many alternatives.
    And you should really filter out the RTX marketing around current games, as if they would sell five times better by deleting every piece of precalculated GI, UVs, and light maps.
     
    Last edited: Jul 17, 2019 at 10:40 PM
  45. alexandre-fiset

    alexandre-fiset

    Joined:
    Mar 19, 2012
    Posts:
    321
    "Unity will continue support for Enlighten in the built-in renderer as it currently exists today (as-is, with no new platform support)"...

    So is Stadia considered a current or a new platform?

    More specifically, will Enlighten precomputed GI work on Stadia as it works on Linux?
     
    SamOld likes this.
  46. sacb0y

    sacb0y

    Joined:
    May 9, 2016
    Posts:
    105
    That's perfectly fine, but there's a process to these things.

    I don't think anyone really minds that Unity wants to be more independent. But when they want to do new stuff, the process is always so weird and inconsistent. It's good stuff, but useless when the existing features don't seem properly thought out. If you already have a game you're working on, the new stuff often isn't usable or compatible. It's like Unity is constantly making parts of itself obsolete; it's just weird.

    And it always seems like they don't quite get why this stuff bugs people so much. Or maybe they think people won't care eventually.

    Like, "Hey, we're getting rid of this"...

    "OK, where's the replacement? Why are you announcing this with no real replacement? How long has this been the plan?"

    "Oh, don't worry, we'll have one in two years and it's gonna be awesome. Meanwhile, deal with this outdated thing while we put resources into this new thing. Oh, and there's the other new thing that isn't what you need lol"

    "That does not help me at all. Because if it's anything like the Progressive Lightmapper, it's not even going to be compatible with my assets for some reason. The other thing at least works. And I'm not exactly eager to jump into another [preview] just to feel like I'm keeping up with the times."
     
    SamOld and AcidArrow like this.
  47. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    4,156
    IMHO Unity is stuck between:
    1 - Legacy bad decisions, which make changing the code hard while keeping compatibility across the fragmented ecosystem that made them famous,
    2 - Ambitious shiny new stuff, which is hard to implement without a clean slate while working on the fragmented ecosystem AND keeping compatibility, but which is not proven yet and might actually fail to cover the proper use cases.

    Here it's a case of trying to move away from 1 but falling into the trap of 2 while still being pinned down by 1.

    The technical term for this is GROWING PAIN.

    They need to make proper games with proper scope to experiment with 2, too.

    IMHO they should do an open-source Fortnite-clone tech demo; it has all the pains of a modern game:

    - a dynamic and destructible environment with fast vehicles
    - in a shared multiplayer world with hundreds of players (and potentially NPCs, as in the demon invasion events)
    - that is also an open world where the player can build everywhere
    - with a dynamic time of day and a weather system.
    It's like the best test case for GI and all their systems combined.

    You can see Epic learning a lot from that game, about streaming, lighting, and rendering: not impractical state of the art, but actually working solutions with the right production trade-offs, delivered across all platforms and devices. It would also help Unity to nurture potential competition, as the CEO is unhappy with Epic's bullish behavior.

    Showcasing the state of the art on small targets would help too. A first step was made with Fontainebleau on PS4, which is a fixed medium target. But building demos on Switch-level hardware would be impressive as well, to demonstrate scalability, instead of pie-in-the-sky hardware like a 64 GB "off the shelf" laptop with an expensive CPU, which doesn't exactly match the customers we need to sell games to. I'm playing Fortnite on Switch, and it's basically a live demonstration of Unreal optimization on all fronts: you sometimes see streaming and rendering issues and solutions evolve in real time while playing, and how that impacts and improves the play experience while still being production-friendly (they update at a very fast rate).

    Unity clearly lacks the know-how demonstrated in that game.
     
    Last edited: Jul 18, 2019 at 4:53 PM
    keeponshading likes this.