
Graphics H-Trace: Global Illumination and Occlusion [ RELEASED ]

Discussion in 'Tools In Progress' started by Passeridae, Mar 11, 2022.

  1. sqallpl

    sqallpl

    Joined:
    Oct 22, 2013
    Posts:
    384
    Hi,

    Great job. Good luck with finishing the first public version!

    How does it work when an object is 'excluded'? Is it excluded both from receiving SSGI and from being 'visible' to SSGI? Are you planning to use layers to control what's excluded?

    Thanks.
     
  2. Meceka

    Meceka

    Joined:
    Dec 23, 2013
    Posts:
    423
    Hi, is there a chance that this asset will be ported to URP / BIRP? Or is it not possible?
     
  3. one_one

    one_one

    Joined:
    May 20, 2013
    Posts:
    621
    Thanks a lot for the in-depth reply!

    Wouldn't that mean though that transparent objects would appear overly bright/dark in occluded/indirectly lit areas? I understand that this isn't exactly a trivial problem, just trying to understand the implications (i.e. switching to deferred + alpha testing instead).

    That's perfectly understandable and sounds like a very reasonable path ahead! Would it be possible for h-trace to utilize baked (large-scale) AO and bent normal occlusion data (e.g. for a building) and reflect that data on smaller dynamic objects? Or is that not feasible in principle with the way h-trace works?
     
  4. Passeridae

    Passeridae

    Joined:
    Jun 16, 2019
    Posts:
    395
    Update
    About the “thickness” feature I’ve been talking about so much recently.

    What is thickness in screen space?
    Generally speaking, thickness tries to determine where the backside of an object (the side that can't be seen from our viewpoint) is, so that the tracing algorithm can decide whether to trace rays behind the object or not. Unity calls this parameter “Depth Tolerance” in SSGI and “Object Thickness” in SSR.
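    In rough HLSL, a minimal sketch of such a test during a ray march could look like this (illustrative names, not the actual H-Trace or Unity code; both depths are assumed to be linear eye-space depths):

    bool RayHitsSurface(float rayDepth, float sceneDepth, float depthTolerance)
    {
        // The ray has dipped behind the visible front face...
        bool behindSurface = rayDepth > sceneDepth;

        // ...but only counts as a hit while it's within the assumed thickness.
        // Past the tolerance the object is considered "passed through", so the
        // ray may continue behind it instead of being occluded forever.
        bool withinThickness = rayDepth < sceneDepth + depthTolerance;

        return behindSurface && withinThickness;
    }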

    Why is it so important?
    Because without this parameter all objects will be treated as infinitely thick, which is incorrect in, like, 100% of cases. You can check it out by setting this parameter to the maximum value in either SSR or SSGI.

    Why is it so important for H-Trace specifically?
    Because H-Trace is a horizon-based tracing algorithm. It treats the scene (in the form of the depth buffer) as a heightfield and collects only the highest elevation points of this heightfield. This concept allows it to gather a lot of samples very efficiently, but it’s not really compatible with accurate thickness detection. That’s why you don’t see any “Thickness” slider in Unity’s AO (which is GTAO and is horizon-based). Although there are some thickness heuristics developed for GTAO, they work well only for moderate radius values. Since H-Trace uses a full-frame gathering radius for GI, these heuristics are not really applicable and all objects default to infinite thickness.
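    To make the problem concrete, here's a simplified sketch of a horizon-based gathering loop (assumed helper names, not the actual implementation):

    float3 ReconstructViewPos(float2 uv); // assumed helper: depth buffer -> view-space position

    float GatherHorizon(float2 uv, float3 pixelPosVS, float3 viewDirVS,
                        float2 marchDir, float stepSize, int sampleCount)
    {
        float maxHorizonCos = -1.0; // cosine of the highest elevation found so far

        for (int s = 1; s <= sampleCount; s++)
        {
            float2 sampleUV  = uv + marchDir * stepSize * s;
            float3 samplePos = ReconstructViewPos(sampleUV);
            float  sampleCos = dot(normalize(samplePos - pixelPosVS), viewDirVS);

            // Only samples that raise the horizon are kept; the lighting between
            // the old and new horizon would be integrated here (omitted). Samples
            // below the horizon are skipped entirely, which is exactly why a
            // per-sample thickness test doesn't fit this scheme.
            maxHorizonCos = max(maxHorizonCos, sampleCos);
        }
        return maxHorizonCos;
    }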

    This isn't always a problem, or at least not a huge one; it depends on the scene. Here’s the most extreme scenario:

    upload_2022-4-22_20-40-53.png

    As you can see, the small pole in the center of the Cornell Box is treated like an infinite wall by GI. It casts a very thick shadow and doesn’t allow for the bounced light to travel behind it. Such issues are likely to be observed when most of the scene is lit by indirect lighting.

    To improve the visual look in such cases, I’ve added the “Thickness” rendering feature. Let's see how it works with more complex objects, like this chair:
    upload_2022-4-22_20-41-59.png

    Thickness works in two modes: Simple and Advanced. Simple mode is an approximation. It lets you control the thickness parameter with the “Depth Tolerance” slider, just as you do in Unity’s SSGI, and it forces a uniform thickness value for all objects in the frame.

    upload_2022-4-22_20-44-31.png
    Advanced mode renders ground-truth thickness on a per-object basis. It can detect the actual backfaces of objects and use this data to correctly cast rays behind them. In order to filter objects correctly, the Advanced mode relies on a special ID buffer that it generates once activated. This buffer makes it possible to distinguish different objects, but it can’t detect separate parts within a single object; in that case the mode falls back to Simple.

    upload_2022-4-22_20-46-20.png

    Pay attention to the middle screenshot: there's incorrect overshadowing on the chair, where the per-object detection has failed (because it's a single object) and the Depth Tolerance was set to a very high value. This is fixed in the third screenshot, where the fallback to a small Depth Tolerance value has successfully resolved the issue.
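    To make the Advanced mode idea concrete, here's a rough sketch of the per-object test, assuming separately rendered back-face depth and ID buffers (all names illustrative, not H-Trace's actual resources):

    Texture2D<float> _BackfaceDepthLinear; // back faces, rendered with front-face culling
    Texture2D<uint>  _ObjectIDFront;       // per-object IDs for the front faces
    Texture2D<uint>  _ObjectIDBack;        // per-object IDs for the back faces

    float EstimateThickness(uint2 pixel, float frontDepthLinear, float simpleTolerance)
    {
        float backDepthLinear = _BackfaceDepthLinear.Load(int3(pixel, 0));
        uint  frontID         = _ObjectIDFront.Load(int3(pixel, 0));
        uint  backID          = _ObjectIDBack.Load(int3(pixel, 0));

        // Same object front and back: the distance between the faces is the
        // ground-truth thickness, so rays can correctly pass behind the object.
        if (frontID == backID && backDepthLinear > frontDepthLinear)
            return backDepthLinear - frontDepthLinear;

        // Separate parts within one object (or no valid back face): the ID
        // buffer can't resolve this, so fall back to the Simple-mode value.
        return simpleTolerance;
    }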

    Activating either of these modes has a performance cost; the thickness option is one of the most demanding features in H-Trace. Advanced mode is more resource-intensive, because it re-renders all objects in the frame into two separate buffers, so you may see an increase in draw calls. H-Trace uses an extremely light custom shader for this purpose, so the main cost of the effect comes not from drawing objects into buffers but from doing calculations inside the GI pass. Nevertheless, I can add a layer mask (if there are such requests), so users can specify the list of objects for the Advanced mode themselves and avoid re-rendering everything. This will help in scenes with very high-poly objects.

    Note that this feature may change and evolve in the future.
     
  5. Passeridae

    Passeridae

    Joined:
    Jun 16, 2019
    Posts:
    395
    Started testing on Sponza.

    Here's only directional lighting:
    upload_2022-4-23_18-3-52.png

    Here's H-Trace Enabled:
    upload_2022-4-23_18-4-13.png

    Here's H-Trace + Unity's SSR & Fog:
    upload_2022-4-23_18-4-41.png

    I'm still not finished with the denoisers, so I'm using very high sample counts here.
     


  6. jjejj87

    jjejj87

    Joined:
    Feb 2, 2013
    Posts:
    1,117
    I am buying this on day one.
     
    Shodan0101 and Passeridae like this.
  7. Passeridae

    Passeridae

    Joined:
    Jun 16, 2019
    Posts:
    395
    I'm trying to make it work in three modes:
    • Excluded objects are completely invisible for H-Trace.

    • Excluded objects can contribute to H-Trace, but can't receive any GI from H-Trace. This means that H-Trace can pick up the GI that you baked (or provided in any other way) for the excluded group of objects.

    • Excluded objects can contribute to H-Trace and can receive GI, but only from H-Traced objects. This means full interaction between excluded and non-excluded objects, but no screen-space light transfer from one already-baked object to another, to avoid doubling their GI. We assume that baked objects have already received and exchanged all the necessary lighting between themselves during baking.
    Then it will be up to the user which mode to use. Moreover, I expect some performance boost from every mode, because excluding objects makes it possible to skip some (or all) GI calculations on the pixels those objects cover in the frame.
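    To illustrate, here's a rough sketch of how per-pixel exclusion flags could gate the GI pass (mode names and logic are illustrative, not the final implementation):

    #define MODE_INVISIBLE        0 // excluded objects neither contribute nor receive
    #define MODE_CONTRIBUTE_ONLY  1 // contribute (e.g. their baked GI) but don't receive
    #define MODE_INTERACT_ONE_WAY 2 // receive GI, but only from non-excluded objects

    // Should this pixel run the GI gather at all? Skipping excluded pixels in
    // the first two modes is where the expected performance savings come from.
    bool ReceivesGI(bool receiverExcluded, int mode)
    {
        return !receiverExcluded || mode == MODE_INTERACT_ONE_WAY;
    }

    // Should a ray sample landing on excluded geometry add light to this receiver?
    bool SampleContributes(bool sampleExcluded, bool receiverExcluded, int mode)
    {
        if (mode == MODE_INVISIBLE)
            return !sampleExcluded;              // excluded geometry is invisible to rays
        if (mode == MODE_CONTRIBUTE_ONLY)
            return true;                         // baked GI is picked up freely
        // One-way mode: no baked-to-baked transfer, to avoid doubling their GI.
        return !(sampleExcluded && receiverExcluded);
    }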

    These are the plans. I can't promise that I'll implement everything exactly as described above, because there may be some as-yet-unknown barriers and issues, but I will do my best. As for the layers: yes, you will be able to specify the layer(s) for the excluded (or maybe the opposite: non-excluded) objects.

    I will provide some screenshots once I'm done to illustrate this system.

    Hi! Yes, I think it's possible. I don't have a lot of experience with non-HDRP pipelines, but I think that it's doable. I will try to port it after the release. Which pipeline support do you need more? URP or Built-in?

    Yes, indeed, it means that if you have a pitch-black directional shadow that is brightened only by GI, a transparent object placed in that shadow will remain pitch black, because GI doesn't affect it and can't make it brighter. I wonder what Unity offers to solve this, because that's how their SSGI works as well (at least according to my tests). However, I've mentioned that I'm looking into a workaround for forward-rendered objects. This won't be a very user-friendly workaround, I'm afraid, because it will involve some manual work, but it can potentially allow including any object in H-Trace, even a transparent one. I'm not sure how GI made for opaque objects will look on transparency (it doesn't take refraction or reflection, let alone caustics, into account), but we'll see. I'll post some screenshots once I make any progress on this to illustrate the effect.

    This is possible and I'm trying to support it. Take a look at the first reply in this post for more info! :)
     
    GuitarBro and one_one like this.
  8. Meceka

    Meceka

    Joined:
    Dec 23, 2013
    Posts:
    423
    For me it's Built-in. But after an uncertain amount of time (possibly still more than a year) we will switch to URP.
    We're currently using "Sunfall Global Illumination (SSGI)" with Amplify Occlusion; both of these assets work well on Built-in and use the depth buffer. They are abandoned, but BIRP isn't changing, so they keep working. I think in deferred rendering there shouldn't be many limitations from the render pipeline (BIRP). I would also purchase this on day one, no matter the price.

    By the way, I suggest you do what Jason Booth and some other asset developers do and separate the asset per render pipeline: one asset for BIRP, one for HDRP and one for URP. That way a user buys twice if, for some reason, they use or test two render pipelines. And if Unity releases a breaking update to HDRP, for example in Unity 2023, release a new asset with a discounted upgrade option from the old one instead of a free update. This may motivate you financially to not abandon the asset.

    Both you and the buyers would be happier paying 100 dollars once and then 50 dollars for each render pipeline upgrade than buying an asset only to see it discontinued after a year.
     
    Passeridae, one_one and blueivy like this.
  9. GuitarBro

    GuitarBro

    Joined:
    Oct 9, 2014
    Posts:
    180
    +1, as this is likely the best compromise given the render pipeline situation right now. As a nice benefit, it also gives the asset developer some insight into which RP their asset is most in demand for.
     
    Passeridae and blueivy like this.
  10. MaxWitsch

    MaxWitsch

    Joined:
    Jul 6, 2015
    Posts:
    114
    Damn! This seems to be a hell of a Lighting Solution.
     
    Passeridae likes this.
  11. Passeridae

    Passeridae

    Joined:
    Jun 16, 2019
    Posts:
    395
    Hi everyone!
    Here's a new update:

    I've spent the last two weeks fixing bugs, testing on Sponza and, most importantly, integrating H-Trace into the HDRP pipeline. At first I thought there would be no need for any deep integration, which is why I never included it in the plan. However, my tests proved me wrong and I started investigating. It turned out that it may not be too hard to overwrite Unity's SSGI with my own and let it propagate properly throughout the whole rendering pipeline. In practical terms this means things like:
    • Forward support. Fabric, Hair and all other forward shaders are now fully supported.
    • Reflections support. H-Trace is now visible in Unity's SSR reflections.
    • Correct interaction with materials and their properties. For example, H-Trace is now affected by material smoothness.
    And so on. All in all, it works as if Unity were natively using H-Trace instead of its SSGI. Now to the technical part. There are two ways to integrate something like this into the pipeline:

    1. Via Shader Graph. GI can be written to an off-screen buffer, which is sampled in SG and connected to the "Baked GI" input. This approach has certain advantages: zero pipeline customisation (only official methods are used) and more control over GI on a per-material basis (you can do whatever you want with it before you plug it into the "Baked GI" input). And disadvantages: SG must be used for all materials. But the main problem, sadly, is this bug, which makes the approach barely usable. So, if anyone has encountered this issue, please let me know, especially if you found a way to solve it. For now we’ll have to leave this as plan “B”.

    2. Via HDRP package customisation. This is a tricky one. I spent some time trying to find the most minimalistic way to do this and ended up with only 4 additional lines of code (I may add a few more later) in a single file inside the HDRP package. I still do all the calculations inside the custom pass system (as I did before), but instead of writing to the screen, I copy my GI output to the buffer that Unity uses for its own SSGI, effectively overwriting it. This approach is supposed to withstand those notorious breaking changes Unity makes with every new version. So, for now it’s plan “A” and I’ll stick to it until the SG issue is resolved and/or Unity adds a new injection point specifically for GI effects.
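    For the curious, the shader side of such an overwrite boils down to little more than a fullscreen copy; here's a minimal sketch with illustrative names (the actual binding to Unity's internal SSGI buffer happens on the C# side):

    Texture2D<float4> _HTraceGIOutput; // illustrative name for the custom pass result
    SamplerState      s_linear_clamp;

    float4 FragCopyGI(float4 positionCS : SV_Position,
                      float2 uv         : TEXCOORD0) : SV_Target
    {
        // This lands in the buffer every SSGI consumer samples, so forward
        // shaders, SSR and material response all pick up H-Trace automatically.
        return _HTraceGIOutput.SampleLevel(s_linear_clamp, uv, 0);
    }

    Now let me show you the difference.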

    No H-Trace (direct lighting only):
    upload_2022-5-7_2-27-42.png

    H-Trace with no pipeline integration:
    upload_2022-5-7_2-28-6.png
    The sphere in the background uses the Fabric shader, which is forward-only. Since H-Trace doesn't interact with forward shaders, it remains black, as if it were still in the directional shadow. The statue uses a very glossy metallic material and SSR is enabled. However, since H-Trace doesn't show up in SSR, the part of the statue in the directional shadow also remains pitch-black. The yellow wall on the left isn't visible in the reflections either, let alone the sphere.

    H-Trace integrated into the pipeline:
    upload_2022-5-7_2-31-14.png
    Now the forward-rendered sphere is correctly lit by GI (it doesn't cast a lot of blue color because of the fabric-specific properties). The statue also looks correct and reflects both the inside of the Cornell Box lit by GI and the blue sphere.

    Note that all this will be optional. If you’re okay with the basic H-Trace injection and/or don’t want to overwrite any HDRP package files - it’s totally fine. It will work.

    Also note that I haven’t yet tested this with every Unity and HDRP version, so this is not final and is subject to change. I’ll also look into automating the process. For now it only requires a single file replacement, so maybe it's simpler to just drag and drop it by hand. But maybe there will be some patcher that will do all the file-movement-and-replacement work for you, we’ll see :)
     
    Last edited: May 7, 2022
    saskenergy, Ryiah, jjejj87 and 15 others like this.
  12. Qleenie

    Qleenie

    Joined:
    Jan 27, 2019
    Posts:
    867
    This sounds good and looks great!
    The patching is not optimal; maybe the HDRP team can build in an additional PP injection point next to the existing one if this works well?

    I have some questions regarding current state:
    - Will it already support VR / stereo rendering?
    - Does it support skinned meshes / motion vectors? How bad are the artifacts on fast movement?
    - What's the performance compared to HDRP's SSGI? I know it's hard to compare, but roughly at the same quality level
    - Will it work with APV? Or does it replace it completely?

    Thanks
     
  13. florianalexandru05

    florianalexandru05

    Joined:
    Mar 31, 2014
    Posts:
    1,806
    OK take my money!
     
    Passeridae likes this.
  14. Marseol

    Marseol

    Joined:
    Jan 16, 2022
    Posts:
    10
    The result is surreal! How does it behave in motion in the current state? Are there artifacts such as ghosting?
     
  15. MaxWitsch

    MaxWitsch

    Joined:
    Jul 6, 2015
    Posts:
    114
    It's a shame that HDRP has no real open injection point for indirect lighting.
    If there were, I think we'd see a lot of awesome 3rd-party GI solutions such as yours.
     
    one_one likes this.
  16. JudahMantell

    JudahMantell

    Joined:
    Feb 28, 2017
    Posts:
    476
    This looks incredible! Just posting here so I can easily find this thread.
    And please +1 For URP support. There is literally no option for real-time GI in URP, and as someone working on a project with all geometry either procedural or imported at runtime, this is a killer feature that I would definitely pay for!
     
  17. jjejj87

    jjejj87

    Joined:
    Feb 2, 2013
    Posts:
    1,117
    Hey mate, thanks for keeping the updates coming. Love the progress so far.

    But I think it's time for us to see some videos!!!! I would love to see it in motion.
     
  18. Passeridae

    Passeridae

    Joined:
    Jun 16, 2019
    Posts:
    395
    Thanks! Noted :)

    Thank you!
    I think we'll see some official injection points for screen-space effects in the future. But I'm not sure when it will happen. I don't see anything related to this in the roadmap.

    Honestly, I haven't tried it with VR yet. My plan is to finish the regular (non-VR) rendering first, submit it to the Asset Store and then move to URP support, VR support, additional features and so on.

    Since it's a screen-space algorithm, it doesn't really matter whether the mesh is skinned or not; it will work fine with any content as long as Unity provides valid motion vectors for it. The artifacts on fast movement will depend on two factors:

    1) The number of temporal denoisers and their aggressiveness.
    I'm going to provide one temporal and one spatial denoiser for H-Trace. There will also likely be one temporal upscaler (TAAU-like) in case you're running GI in half-res. And there's also Unity's native TAA, of course. So you can have 2-3 temporal denoisers running simultaneously, which may indeed result in some trailing / ghosting artifacts, especially if Unity's TAA is not set up properly. However, I'm going to provide a speed rejection option which will allow you to find a balance between temporal and spatial denoising. The plan is to decrease the temporal impact in areas of fast motion, while increasing the spatial denoiser contribution.
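    A minimal sketch of what such a speed rejection could look like (the weights and parameter names are illustrative):

    float3 ResolveTemporal(float3 current, float3 history,
                           float2 motionVectorUV, float speedRejection)
    {
        float speed     = length(motionVectorUV);     // UV-space motion per frame
        float rejection = saturate(speed * speedRejection);

        // Static pixels keep up to 90% history; fast-moving pixels lean on the
        // current frame and ghost less, at the cost of more noise for the
        // spatial denoiser to clean up.
        float historyWeight = 0.9 * (1.0 - rejection);
        return lerp(current, history, historyWeight);
    }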

    2) The lighting conditions.
    If the directly lit area of the frame is relatively big, more rays are likely to find it and bounce light around the scene. This leads to a low noise level, which allows you to use fewer temporal denoisers, and as a result you get fewer motion artifacts. If there are just a few directly lit pixels, most rays will probably fail and you'll end up with a lot of noise. You can counter this by using more samples (sacrificing some performance) or by relying on temporal accumulation.

    I've tried to capture some gifs for you, but the compression makes it very hard to estimate motion artifacts. I'm going to record a high-res video instead. Will post it here in a couple of days.

    I can compare them only in terms of visual quality at the same performance level. Here's an example of such a comparison. Note that it was captured a few days ago and performance has improved since then; H-Trace is also not using the Thickness feature in this comparison. The ultimate goal is not to make it, like, 2-4x faster than Unity's SSGI (though you could tune it down to make it really fast at the cost of quality), but to make it more accurate and less noisy. To be more specific: I found three main downsides of SSGI in HDRP: limited sampling radius, low sample count (1 sample per pixel) and, as a result, very heavy denoising that totally washes out indirect shadows, normal map shadows (pay attention to the brick texture in the comparison I linked above) and other important details. So the first goal of H-Trace is to be better in these areas and then, if possible, to be faster as well. This is the baseline; the rest will be up to the user, who can tweak it to be more performant or more accurate depending on their specific needs.

    It will work with APV, using it as a fallback, like Unity's SSGI does. I'll probably add this option after the release. However, I think the reflection probe fallback option will be included in the release version.

    Thank you!
    Depends on the temporal denoiser configuration and lighting conditions. For more info on it take a look at my answer above in this post :)

    Got it :)
    I've talked to a friend of mine about the URP port. It's possible that he'll be able to help me with it, so things could move a bit faster.

    Sure! I'll try to post a video with Sponza in a couple of days.

    Thank you everyone for your interest and support! It keeps me motivated, I really do appreciate it :)
     
    Last edited: May 15, 2022
  19. LapidistCubed

    LapidistCubed

    Joined:
    Dec 22, 2015
    Posts:
    10
    I have been building a fully procedurally generated game (literally everything in the environment is added during runtime) for 3 years now. I've tried SEGI, Nigiri, and a handful of other GI solutions, and now I'm wrestling with UPGEN to get the quality I want for the project.

    I would pay hundreds for an asset like this. And if it has URP support?? Oh dear lord, I really hope this one is not abandoned like all the others. This is game-changing.
     
  20. Win3xploder

    Win3xploder

    Joined:
    Dec 5, 2014
    Posts:
    161
    Would you mind showing the specular occlusion feature?
     
    blueivy likes this.
  21. Blade_Master_2020

    Blade_Master_2020

    Joined:
    Aug 3, 2019
    Posts:
    38
  22. Passeridae

    Passeridae

    Joined:
    Jun 16, 2019
    Posts:
    395
    Sure, once it's ready :)
    I haven't finished it yet - just laid the groundwork by adding Bent Normals. But it's still on my list, of course.

    Hi!
    In a nutshell, my goal is to post a decent video, and I realized that I need to have everything in an *almost* production-ready state for that, because a video requires working denoisers (people asked about temporal artifacts), good performance, etc. So I've been working towards this goal all this time. Here's a list of what I've done so far:

    - added Reflection Probe Fallback (like in the native SSGI)
    - added Reflection Probe Fallback for a single real-time probe
    - added Stochastic Depth Buffer (Experimental. Allows the algorithm to kinda "see" objects behind other objects. More info here)
    - added Checkerboard Rendering (0.75 of the original resolution with almost no visual impact and ~25% perf. boost)
    - improved Thickness Mode (faster, more accurate)
    - improved overall performance (small and big optimisations here and there)

    Right now I'm working on spatial and temporal denoisers, relying on this, this and Unity's own implementations. I will post a video once the denoisers are finished.

    I've also sent the current version of H-Trace to my friend who will try to make it work in URP while I'm finishing the general development.

    Sorry for the lack of visual content. I'll try to post more in the near future. For now, here's a nice screenshot of sky GI:
    upload_2022-6-11_14-55-29.png
     
    Ryiah, jjejj87, Win3xploder and 6 others like this.
  23. blueivy

    blueivy

    Joined:
    Mar 4, 2013
    Posts:
    632
    Looks amazing! Just wanna say I appreciate all your hard work.
     
    Shodan0101 and Passeridae like this.
  24. Blade_Master_2020

    Blade_Master_2020

    Joined:
    Aug 3, 2019
    Posts:
    38
    Looks amazing! My money is yours. Hope you publish it as soon as possible.
     
    Shodan0101 likes this.
  25. jjejj87

    jjejj87

    Joined:
    Feb 2, 2013
    Posts:
    1,117
    I am rooting for ya.
     
  26. blueivy

    blueivy

    Joined:
    Mar 4, 2013
    Posts:
    632
    I'm not a programmer, so I don't know how feasible this type of feature is, but is there a way to get some type of directional occlusion from the sky even if it's offscreen? Maybe just by assuming there is always some type of hemispherical light coming from the top. Something like this effect?
    https://mobile.twitter.com/icelaglace/status/803916172051611648

    I find a lot of scenes instantly look better even with a basic form of sky occlusion. I know you were working on directional GTAO before this, so maybe there's a crude way to inject approximate sky lighting into the GTAO effect.

    edit: I actually found a pretty cool article on what I'm talking about: directional GTAO using the bent normal to sample the environment map. This all sounds like things you already implemented in your GTAO thread, so I'd love it as an optional feature of the system.
    https://interplayoflight.wordpress....n-and-directionality-in-image-based-lighting/

    implementation details: https://interplayoflight.wordpress....-image-based-lighting-implementation-details/
     
    Last edited: Jun 23, 2022
  27. Blade_Master_2020

    Blade_Master_2020

    Joined:
    Aug 3, 2019
    Posts:
    38
    Any news recently?
     
    Shodan0101 likes this.
  28. Blade_Master_2020

    Blade_Master_2020

    Joined:
    Aug 3, 2019
    Posts:
    38
  29. Passeridae

    Passeridae

    Joined:
    Jun 16, 2019
    Posts:
    395
    Hi! Sorry for the late reply :)

    The work continues. I'm struggling a bit to finish the reflection probe fallback support, so it's taking more time than expected. But I'm working on the asset almost every day, so it's not abandoned, no worries here.

    On the bright side, while I'm finishing the HDRP version, the URP one is also being worked on by my friend, who is making steady progress so far.
     
    saskenergy, Autarkis, Ziq-Zaq and 2 others like this.
  30. Blade_Master_2020

    Blade_Master_2020

    Joined:
    Aug 3, 2019
    Posts:
    38
    Thanks for the reply.

    Just to confirm it’s not abandoned, because all GI solutions on this forum were eventually abandoned.

    I personally refresh this thread twice a day. I really hope you post something maybe every week to let people know it’s alive.
     
  31. TyGameHead

    TyGameHead

    Joined:
    Apr 29, 2017
    Posts:
    25
    Is your friend working on the GTAO as well, or just the H-Trace GI for URP?
     
  32. Passeridae

    Passeridae

    Joined:
    Jun 16, 2019
    Posts:
    395
    Hi! I've looked into this effect. I wonder what exactly is being occluded in this post; my guess is that it's the ambient probe that represents the sky. If so, it would be suitable only for outdoor scenes and only when most occluders are in the frame. Technically speaking, ambient occlusion is trying to do exactly that: it assumes that you have a white dome over your scene (like an overcast sky) and occludes everything accordingly within the specified radius.

    But yes, AO could be extended like this. I've seen this article too and consulted its author regarding this feature. I'll think about what I can do, but I see a number of issues with it. For example, if we assume that there's some lighting coming from the top, but we don't have all occluders in the frame (we don't usually look at the ceiling when playing from a first-person POV), then we'll get huge light-leaking artifacts in interiors. This will work only when we are looking at the scene from the top and all major sky-occluders are visible, like in the linked twitter post. So it's mostly suitable for specific use cases. But thanks for the idea! :)
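    For reference, the core of the article's bent-normal trick boils down to something like this (a minimal sketch with an assumed sky-sampling helper):

    float3 SampleSky(float3 directionWS); // assumed helper: ambient probe / cubemap lookup

    float3 DirectionalSkyLighting(float3 bentNormalWS, float visibility)
    {
        // Sampling along the bent normal (the average unoccluded direction)
        // makes the result directional: occluded pixels both darken and shift
        // toward whatever part of the sky is still visible to them.
        return SampleSky(bentNormalWS) * visibility;
    }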

    Got it! So, weekly update: it's alive, the work is going well :)
    For those who are waiting for the URP port:
    upload_2022-7-8_22-16-10.png

    This is the first screenshot of GI working in URP. It doesn't have multiple bounces (that's why it's so dark) and the denoisers aren't there yet. This is just a test to confirm that all critical buffers are working. They do, so it seems it's just a matter of time now. We'll try to do our best to achieve feature parity with the HDRP version, but I can't promise anything regarding the reflection probe fallback for now. In HDRP I'm using a bunch of native functions for that; we'll need to see if they're portable to URP.

    He's working on H-Trace only. But we are thinking about releasing GTAO as a separate asset (or assets) for both HDRP and URP, for those who don't need GI but want high-quality AO.
     
    saskenergy, Autarkis, blueivy and 6 others like this.
  33. TyGameHead

    TyGameHead

    Joined:
    Apr 29, 2017
    Posts:
    25
    Sir I will wait as long as I need for this project to shine :) can't wait
     
  34. mikeCarma

    mikeCarma

    Joined:
    Mar 26, 2020
    Posts:
    4
    Hi,

    Does the illumination stay persistent if the camera looks away from the light sources? Because I recently tried the new asset Radiant GI and I found that screen-space solutions tend to make light flicker, not appear normally, or disappear completely, so they're not useful in most scenarios.
     
  35. Blade_Master_2020

    Blade_Master_2020

    Joined:
    Aug 3, 2019
    Posts:
    38
    I'm afraid that's the limitation of all screen-space effects, including this SSGI solution. What you want is fully dynamic real-time global illumination like UE5's Lumen system. Unfortunately, there is no way to achieve such a perfect effect in Unity.

    The worst thing is that Unity officially said they will never make a fully dynamic GI like “Lumen”, because they think baked GI is enough for almost 99% of scenarios.

    Therefore, the options you have are:

    1. Go to UE5 or Godot and forget Unity forever (both of them have fully dynamic real-time GI).

    2. Upgrade your project (HDRP only) from DX11 to DX12 (which will lose 30% FPS, and Unity said they can't solve this problem), then enable Nvidia's ray-tracing tech (which requires your players to have an RTX card and will lose another 50% FPS). Finally you get a "fully dynamic real-time GI" with FPS dropped 80% and full of noise artifacts. Even if you enable DLSS, you still lose 50% FPS and get worse visual quality, which makes no sense.

    3. Use Unity's Enlighten system or the 3rd-party asset "Magic Lightmap Switcher" to fake real-time GI, though it's not dynamic. (This way, after you bake, you can change the position or rotation of a light source and the real-time GI will follow. But you can't move any scene object; that's why it's a "real-time" but not a "dynamic" GI solution.)

    4. Use the 3rd-party asset "UPGEN Lighting". It casts thousands of physics rays to detect and sample the surrounding color, and generates dozens of high-performance "point lights" around the camera to illuminate the environment. But it's based on screen-space tech, so light-leaking artifacts are a very big problem, which makes it hard to use in a serious project. And it doesn't support directional lights, which means you can't use it in outdoor scenes.

    5. Use screen-space GI and bear the light flickering when the light source is off-screen.

    6. If your project is Built-in RP, I suggest you use "SEGI"; it's a voxel-based fully dynamic real-time GI with a reasonable performance cost, but it still has slight light-leaking artifacts.

    7. Learn graphics programming and make one yourself.

    In my opinion, in these hard times, screen-space GI is the only choice we have.
     
    Last edited: Jul 10, 2022
    nirvanajie, mikeCarma and Meceka like this.
  36. Ookoohko

    Ookoohko

    Joined:
    Sep 29, 2009
    Posts:
    103
    I just threw some of my humble pennies at this. The author is working on both Nanite-style micropolygon rendering AND a Lumen-style real-time GI.

    https://www.indiegogo.com/projects/the-unity-improver-nano-tech#/

    br,
    Sami
     
  37. Blade_Master_2020

    Blade_Master_2020

    Joined:
    Aug 3, 2019
    Posts:
    38
    Thanks for sharing this. But personally I'm counting on this "H-Trace" SSGI solution.
     
    Last edited: Jul 10, 2022
  38. Ookoohko

    Ookoohko

    Joined:
    Sep 29, 2009
    Posts:
    103
    Yeah, I'm waiting for this too... But unless it handles the stuff outside of the screen somehow, it has the same stability issues as any other SSGI solution: your lighting changes when you look away.

    There are several ways to do this; hopefully some of them get implemented into H-Trace.

    br,
    Sami
     
  39. Passeridae

    Passeridae

    Joined:
    Jun 16, 2019
    Posts:
    395
    What you encountered in Radiant GI is a natural limitation of any screen-space tech. For any global illumination traced in screen space, you need some area of the frame to be directly lit. So yes, without a fallback you would see something like this with H-Trace too.

    But that's exactly why we DO have a fallback! :)
    For now H-Trace can use Reflection Probes (or Probe) as its primary fallback option. I'll add APV Volumes as well.
    At first, I was a bit skeptical about the reflection probe idea. But after I spent some time digging into it and studying how Unity's own SSGI handles this, I came to the conclusion that it's actually a really good option. Come to think of it, reflection probes store a lot of data about the scene. Yes, you have to place them first and then bake them, and probably add some proxy volumes, but that's about it.

    Here's an old screenshot that demonstrates this fallback in its decoupled state. The indirect color here is produced only by probes; camera color buffer tracing is completely disabled, which means you're seeing only the fallback part of the whole GI effect. The depth buffer is still traced to detect occlusions and produce these nice indirect shadows:
    upload_2022-7-10_15-34-29.jpeg
    (Sorry for the quality, I found only the compressed version)

    And here's a single probe attached to the camera and updated in real-time. Again, no color buffer tracing. The GI is produced by the probe data and depth buffer tracing:
    upload_2022-7-10_15-3-15.gif

    Using a single probe like this will lead to somewhat less accurate results in comparison to using all probes in the scene, of course, but it's still a decent fallback option and can produce fully dynamic GI.

    And, of course, normally you first trace the color buffer and then, if the tracing fails, you fall back to the reflection probes. Right now I'm trying to figure out the best way to handle this transition. There may be some slight lighting changes when it happens, but no massive light fluctuations or complete black-outs. I'll capture a video with the camera turning away from the directly lit area to show this fallback in action a bit later.
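    Conceptually, the trace-then-fallback flow is something like this (all helper names are assumed, not the actual H-Trace functions):

    struct ScreenHit
    {
        bool   valid; // did the ray land on valid on-screen geometry?
        float2 uv;    // where it landed
    };

    ScreenHit TraceScreenSpaceRay(float3 positionWS, float3 rayDirWS);    // assumed
    float3    SampleColorBuffer(float2 uv);                               // assumed
    float3    SampleReflectionProbes(float3 positionWS, float3 rayDirWS); // assumed

    float3 TraceGI(float3 positionWS, float3 rayDirWS)
    {
        ScreenHit hit = TraceScreenSpaceRay(positionWS, rayDirWS);

        if (hit.valid)
            return SampleColorBuffer(hit.uv); // on-screen radiance, fully detailed

        // Off-screen or failed ray: the probes still know the scene's radiance,
        // so GI dims gracefully instead of blacking out when the camera turns away.
        return SampleReflectionProbes(positionWS, rayDirWS);
    }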

    The ultimate goal, however, is to implement scene voxelization or some other acceleration structure that will serve even better as a fallback option.
     
    Last edited: Jul 10, 2022
  40. Blade_Master_2020

    Blade_Master_2020

    Joined:
    Aug 3, 2019
    Posts:
    38
    Great job! I think Unity should hire you to improve their SSGI. We know SSGI is not perfect compared to a fully dynamic GI solution, but hey, that's the best we can have for now. Just take it and keep working on your game project!
     
    Passeridae likes this.
  41. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Will probably give this a whirl myself. Also... how do these assets combine their shadowing with Unity's shadows without double-darkening effects? I've been scratching my head on that one for a while with HDRP.
     
  42. Blade_Master_2020

    Blade_Master_2020

    Joined:
    Aug 3, 2019
    Posts:
    38
    Hi, I found 2 GI assets on the store, both for URP, and I contacted their authors; they said they will port to HDRP soon. One is SSGI and the other is voxel-based GI. That SSGI already has a reflection fallback and it doesn't look bad.

    What I want to say is: your competitors are already there, so you have to hurry up a little bit...
    Anyway, I really like your work, because yours looks better.
     
    Passeridae likes this.
  43. florianalexandru05

    florianalexandru05

    Joined:
    Mar 31, 2014
    Posts:
    1,806
    Can you post the links? I haven't seen anything on the store yet, but I'll take a look now.
    Edit: OK, I saw them, but I'm expecting HDRP and I think HTGI is already looking pretty good, so I'm still waiting for this one; his technique is more of what I'm looking for. The ones from the store don't look that impressive compared to what Unity already has, but that must be because they are made for URP and Built-in.
     
    Last edited: Jul 17, 2022
    Passeridae likes this.
  44. Passeridae

    Passeridae

    Joined:
    Jun 16, 2019
    Posts:
    395
    Hi! You're probably referring to Radiant Global Illumination and Lumina (these are two most recent ones). Yep, I've seen them and I'm constantly monitoring the state of the GI market out there.

    The thing is, HDRP's native SSGI + Reflection Probe / APV Fallback will be superior to the majority of these solutions in most cases. It may look like I'm constantly complaining about this native SSGI, but that's because I'm trying to do something better and as a result I'm looking for weak spots. But Unity's SSGI is as good as it gets for a standard ray-marching algorithm. The reflection probe fallback is also extremely well-designed. I haven't seen anything like that in other assets. Now, Unity applies some really heavy denoising that doesn't preserve details that much, that's for sure.

    Many GI solutions out there do something like this: get a sorta-kinda color-bleeding effect and then blur the hell out of it. That usually ends up as a bunch of bright spots filling the shadows with a somewhat appropriate color. The overall look can be described as "Yep, there's some GI going on here", but that's about it. While such GI is arguably better than no GI, I think we're well past that in 2022. So I'm trying to do something more complex here. For example, the most obvious difference between a lot of real-time GI solutions and a path-traced image (or Lumen) is the indirect shadows and small details. They add a ton of realism and physical correctness to the image, but are often completely missed by some GI algorithms or just murdered by massive over-blurring. Here's an example of GI with really good indirect shadows, just to illustrate what I'm talking about.

    There's also a misconception that screen-space GI is not a "real" GI solution and should be replaced by some world-space method if possible. That's not really true. The reason is that you can't get that granular and detailed GI with other methods; there's nothing more detailed than tracing individual pixels (pure ray tracing is another topic, so I'm not considering it here). Algorithms that are not traced in screen space need some kind of simplified scene representation to work with. It can be done with voxels, SDFs and so on. But such a representation is by definition a downgrade of the original scene data, with massive detail loss. Moreover, you can't voxelize everything: there's a limit (usually 256 x 256 x 256 voxels), so you either create a relatively detailed voxelization of a small area or you trade quality for coverage. The same goes for SDFs; an example is Lumen, which traces only the first 2 meters in individual SDFs and then switches to a coarse combined global SDF. Screen-space methods, on the other hand, can trace as far as you can see for free and pick up per-pixel details. So a good GI solution needs to combine screen tracing with some other world-space technique; Lumen, which traces the screen even twice, is again a good example here. But in any case, you really have to get your screen tracing and denoising right in the first place, because that's where most of the details will come from. That's why I'm putting so much work into it.
     
    Last edited: Jul 17, 2022
  45. PutridEx

    PutridEx

    Joined:
    Feb 3, 2021
    Posts:
    1,136
    I wonder how CryEngine's GI solution (SVOGI) works. I believe it's voxel-based? You can see it in action in the game Kingdom Come: Deliverance, a massive open-world game with no baked lighting. Extremely fast, impressive visuals and the most beautiful forests I have ever seen. The GI occludes sky light, so trees and forests end up looking amazing.

    It works with meshes (houses), terrain, grass and trees. I've always wondered why people aren't just making that rather than innovating (e.g. Lumen). I do know it has some limitations, like light leaking and limited bounces, but it's still good enough to light a world with zero baked lighting. Very impressive.

    You can see a more recent, updated example in the game Hunt: Showdown, since Kingdom Come: Deliverance is pretty old, running on an ancient CryEngine version.
     
  46. GuitarBro

    GuitarBro

    Joined:
    Oct 9, 2014
    Posts:
    180
    The V in SVOGI is for Voxel, yes. ;)
     
  47. Passeridae

    Passeridae

    Joined:
    Jun 16, 2019
    Posts:
    395
    Yeah, I can agree on that. I'm looking through a lot of Voxel GI solutions right now, to find the one that suits my needs. Btw, this one is really good. It shows the best looking Sponza VXGI I've ever seen.

    In the case of Lumen, I guess it was because of its author. He had implemented the original DFAO in UE4 before that, and if you look through his paper on it, you'll notice how he talks about SDFs being superior to voxels in many ways. He clearly likes SDF stuff and regrets that you can't do GI with them (we're talking about the UE4 era). That's why I think it was easier for them to upgrade and rewrite the existing battle-tested SDF pipeline rather than invent something new or switch to voxels. But that's just my opinion :)
     
    Hubster and PutridEx like this.
  48. Blade_Master_2020

    Blade_Master_2020

    Joined:
    Aug 3, 2019
    Posts:
    38
    Yeah, I couldn't agree more. Beautiful indirect soft shadows are even more important than the GI effect itself. I haven't seen such a GI solution on the store, so you are the only one who can make it.
     
  49. florianalexandru05

    florianalexandru05

    Joined:
    Mar 31, 2014
    Posts:
    1,806
    That's one thing I don't like about Unity's SSGI: it's good, but the noise is just extreme in some cases and it seems to be different in every scene. I hate using temporal AA since it turns all the animated foliage into mush. I'd say your GI solution has looked the most impressive so far. Please keep going, and finish and publish it, or at least release a functioning version if you decide to discontinue it; but I'm sure you won't stop till it's finished.
     
  50. Blade_Master_2020

    Blade_Master_2020

    Joined:
    Aug 3, 2019
    Posts:
    38
    Hello, have you encountered any difficulties? May I know the latest progress? Our project is almost finished; if we can't get an SSGI solution before the deadline, we'll have to switch our HDRP project to URP to use the only SSGI on the Asset Store... That's so frustrating.