
Graphics H-Trace: Global Illumination and Occlusion [ RELEASED ]

Discussion in 'Tools In Progress' started by Passeridae, Mar 11, 2022.

  1. Passeridae

    Passeridae

    Joined:
    Jun 16, 2019
    Posts:
    395
    H-Trace: Global Illumination and Occlusion
    Asset Store Link
    Here's our Discord and Twitter

    H-Trace is a fully dynamic screen-space Global Illumination and Occlusion system that aims for accurate indirect lighting with detailed secondary shadows.

    upload_2022-10-11_16-31-35.png
    The H-Trace system consists of three main parts:

    - Global Illumination
    - Ground Truth Ambient Occlusion
    - Ground Truth Specular Occlusion


    All three are rendered in real-time and computed in a single screen-tracing pass with no baking required.

    MAIN FEATURES:

    - Full integration into the rendering pipeline, with both Forward and Deferred support and correct interaction with materials and other rendering features. GI can be picked up by Unity’s SSR, so bounced lighting is visible inside screen-space reflections. *

    - Reflection Probe Fallback gathers data from all probes in the scene and reconstructs global illumination even when the primary source of lighting is obscured or outside the frame. An alternative fallback mode lets you specify a single custom reflection probe. Real-time reflection probes are supported as well.

    - Real-Time Specular Occlusion correctly attenuates both SSR and cubemap reflections using Bent Normals & Cones generated in real-time. It can provide occlusion between separate and / or dynamic objects which is impossible with the traditional workflow that requires offline bent normal map baking. *

    - Real-Time Ambient Occlusion uses the most advanced GTAO algorithm with multibounce approximation support and correct blending with the main GI effect. It brings up small details but avoids unrealistic over-occlusion.

    - Emissive Lighting support makes it possible to illuminate parts of your scene using emissive materials and objects of any shape that can act as actual light sources and cast soft shadows. **

    - Infinite light bounces are simulated through a feedback loop. Every frame gathers one more bounce of light.

    - Accurate Thickness mode reconstructs true thickness of all visible objects and renders more accurate indirect shadows in comparison to using a single value as a common thickness denominator for the whole scene.

    - Advanced Denoising algorithm includes temporal and spatial denoisers equipped with many features for noise reduction and detail preservation. It also supports self-stabilizing recurrent blur.

    - Layer Mask lets you exclude objects from processing on a per-layer basis.

    - Flexible performance control with multiple parameters helps to find the right balance between speed and quality. Resolution downscale using either checkerboard rendering or half-res output is also available.
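    The "infinite light bounces" feedback loop above can be illustrated with a toy model (my own sketch in Python with illustrative numbers, not H-Trace code): each frame traces against the previous frame's final lighting, so frame N carries N bounces of indirect light.

```python
# Toy model of a per-frame feedback loop: every frame gathers one more
# bounce of light by re-injecting the previous frame's output.

def gather(radiance, albedo):
    """One screen-space gather: surfaces re-emit a fraction of incoming light."""
    return albedo * radiance

def simulate(direct, albedo, frames):
    indirect = 0.0
    for _ in range(frames):
        # Gathering from last frame's (direct + indirect) adds one more bounce.
        indirect = gather(direct + indirect, albedo)
    return direct + indirect
```

    After one frame this yields a single bounce; after enough frames it converges to the closed-form infinite-bounce sum direct / (1 - albedo).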

    SHORT FAQ:
    - Why does GI look too dark (weird shadows / occlusion) in the Scene window, but everything is okay in the Game window?
    Change Scene Camera (not your main Camera) clipping settings. Especially try disabling / enabling the “Dynamic Clipping” checkbox.

    - Why does GI look very bleak / dull and doesn’t produce a good color-bleeding effect?
    Check scene reflections. If you’re seeing a strong blue / gray tint covering most of the objects making them look “wet” - it’s likely that this is the sky (environment) reflection leaking everywhere. Use reflection probes to avoid it. It’s recommended to take advantage of Proxy Volumes while setting up the reflection probes.

    - How to see the actual GI impact on the scene in full strength?

    First, make sure that all lights in the scene have shadowmaps enabled. Then make sure that the shadows are completely black. To achieve this, turn down all reflections (the easiest way to do this is to add an override called “Indirect Lighting Controller” and set the reflection multiplier to 0). Then enable GI.

    - How to exclude an object from receiving GI?
    Use the Layer Mask in H-Trace settings.

    - How to exclude an object from contributing to GI?
    Switch the shader of this object to "Transparent" type. Transparent objects are not processed by GI.

    The FAQ will be updated.

    Additional Screenshots:
    upload_2022-10-11_16-36-6.png

    upload_2022-10-11_16-36-39.png

    upload_2022-10-11_16-37-12.png

    upload_2022-10-11_16-37-21.png
     
    Last edited: Oct 31, 2022
  2. jjejj87

    jjejj87

    Joined:
    Feb 2, 2013
    Posts:
    1,117
    Consider me your customer. Let me know if there is anything I can help you with. Just keep posting screenshots and vids.

    Amazing!
     
    newguy123 and Passeridae like this.
  3. Passeridae

    Passeridae

    Joined:
    Jun 16, 2019
    Posts:
    395
    Thank you for your kind words! :)
    I'll probably need some help with testing. And yep, new screenshots and videos are definitely coming! I think I'll start with Sponza.
     
    imDanOush, newguy123, one_one and 2 others like this.
  4. LeFx_Tom

    LeFx_Tom

    Joined:
    Jan 18, 2013
    Posts:
    88
    Do you think it can handle VR?
    If you need a tester for that - let me know. We use unity for VR/Realtime-Training applications for enterprise. This could be really interesting for us
     
    imDanOush likes this.
  5. S4UC1SS0N

    S4UC1SS0N

    Joined:
    May 29, 2014
    Posts:
    49
    Always interesting to see work being done around GI for Unity; it's a major part of the engine's lighting that needs some love right now.

    Keep up the good work.
     
    Passeridae likes this.
  6. Passeridae

    Passeridae

    Joined:
    Jun 16, 2019
    Posts:
    395
    Hi! Thank you for your interest!

    From the testing point of view: I don't have a VR headset myself, but I have a friend who owns an Oculus Quest 2. I can ask him to lend it to me for testing. After that I can send the asset to you for further testing on your hardware.

    From the technical point of view: I haven't worked with VR yet and I've heard that VR is not the strongest side of HDRP (but maybe my info on that is outdated, so correct me if I'm wrong :)). But in theory it should work okay. And I'm implementing an upscaler, so the effect could be rendered in 1/2 or 1/4 resolution to handle high-res cases typical for VR hardware.
     
    Last edited: Mar 11, 2022
    imDanOush likes this.
  7. esgnn

    esgnn

    Joined:
    Mar 3, 2020
    Posts:
    38
    Looking great.
    Just saw your other related post as well, and learned new things.
    Great job.
     
    imDanOush, AlejMC and Passeridae like this.
  8. THE2FUN

    THE2FUN

    Joined:
    Aug 25, 2015
    Posts:
    63
    So amazing! Unity needs to hire you.
     
  9. nuFF3

    nuFF3

    Joined:
    Dec 7, 2013
    Posts:
    6
    Looks good, I'd buy it :)
     
  10. Seith

    Seith

    Joined:
    Nov 3, 2012
    Posts:
    755
    Looks amazing!

    Yeah it seems Unity is not really focused on real-time interactive environments running at 60 fps (games) nowadays. As a company they might be more interested in looking for new market shares in offline rendering.

    Anyway, it's really neat to see a user take up the mantle to try and improve the engine! :)
     
    AcidArrow, Meceka, one_one and 2 others like this.
  11. GuitarBro

    GuitarBro

    Joined:
    Oct 9, 2014
    Posts:
    180
    Looks good. Consider this a +1 for built-in RP support down the line.
     
    atomicjoe and AlejMC like this.
  12. Pourya-MDP

    Pourya-MDP

    Joined:
    May 18, 2017
    Posts:
    145
    If you consider porting it to the built-in RP,
    I'll buy 20 units from the store just to help you.
    I have to say, this is amazing.
     
    SammmZ and AlejMC like this.
  13. Tabularas

    Tabularas

    Joined:
    Jul 7, 2020
    Posts:
    4
    This looks awesome. We replaced our Maya V-Ray pipeline with Unity (HDRP + URP) to create high quality renderings and are also producing VR trainings. We would love to test this out, especially HDRP + URP if it is someday available (URP is best for VR; HDRP is too heavy for rendering per eye).

    Looking forward to your updates
     
    Last edited: Mar 14, 2022
  14. jjejj87

    jjejj87

    Joined:
    Feb 2, 2013
    Posts:
    1,117
    I find myself checking this thread twice a day...

    I am dying here!!!!! Gimme a screenshot! :) :) :)
     
    newguy123 and one_one like this.
  15. Passeridae

    Passeridae

    Joined:
    Jun 16, 2019
    Posts:
    395
    Hi! Sure :)

    So, I've been working on the normal rejection filter all this time. Finished it yesterday.
    upload_2022-3-18_15-32-17.png

    It may seem like a minor thing, but it's very important to correctly filter out indirect lighting based on the normal data, otherwise wrong samples may be taken into account and there will be light transfer between surfaces that clearly can't see each other.
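    A minimal sketch of such a normal-rejection test (my own illustration in Python; the names are hypothetical and this is not the H-Trace shader): a sample is accepted only if the direction towards it lies in the hemisphere of the receiver's normal.

```python
# Hypothetical normal-rejection check: surfaces facing away from each
# other can't exchange light, so anything at or below the receiver's
# horizon is rejected.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def accepts(receiver_normal, dir_to_sample, eps=1e-4):
    # Reject samples at or below the receiver's tangent plane.
    return dot(receiver_normal, dir_to_sample) > eps
```

    The small epsilon also rejects grazing directions, which would otherwise contribute noisy, near-invisible transfer.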

    Also, here's a quick comparison with the DX12 Path-Tracer:
    upload_2022-3-18_15-49-59.png


    Btw, you can use any object of any shape as a light source. Even if you want to use a statue with an emissive material - it's totally fine, it will cast proper GI:
    upload_2022-3-18_15-58-52.png

    There's nothing unusual about it, because it's just a byproduct of any SSGI algorithm, but I've noticed that people were kinda hyped about this thing in UE5, so it's worth mentioning, I guess. However, it's strongly recommended not to use emissive surfaces / meshes as primary light sources. It will lead to a noisier image and, obviously, you won't get any direct lighting or direct shadows from them, since Unity doesn't support that.

    At the moment there are still some thickness artifacts; you can see them in the screenshots above. Objects in the foreground cast too much occlusion onto the background. It's natural for a screen-space effect, but I'm working on it.

    Thank you everyone for your interest! It keeps me motivated :)
     
  16. jjejj87

    jjejj87

    Joined:
    Feb 2, 2013
    Posts:
    1,117
    Awesome! Already better than Unity's SSGI!
     
  17. Onat-H

    Onat-H

    Joined:
    Mar 11, 2015
    Posts:
    195
    This looks amazing! Can't wait to test it. (And your AO as well btw.)
     
  18. greene_tea92

    greene_tea92

    Joined:
    Jun 26, 2017
    Posts:
    22
    Yep, this will definitely be a day one buy for me. Keep it up, we're all rooting for you. :)
     
    one_one likes this.
  19. Passeridae

    Passeridae

    Joined:
    Jun 16, 2019
    Posts:
    395
    A small update:
    I've been working on the thickness parameter. That's not really the strongest part of horizon-based effects, but nevertheless, here's some progress:

    This is without thickness parameter enabled:
    upload_2022-3-23_18-44-8.png

    This is with thickness parameter enabled:
    upload_2022-3-23_18-45-45.png

    It's not perfect (and never will be in screen space), but it's better than without it. The screenshot demonstrates the best-case scenario so far (objects and their shadows are not heavily overlapping each other)

    The drawbacks are:
    - You have to manually select (on a per-layer basis) the objects that you want to participate in this effect, because there are cases when it's virtually impossible to distinguish between a thin pole and an infinite wall in screen space, due to the nature of perspective rendering.

    - Since HDRP doesn't allow using stencil buffers in shaders at all, I have to re-render all selected objects in forward mode to make a mask. So, this thickness mode has some performance cost. If stencil buffers become available, I will rework this so there's no second rendering pass and the performance cost is minimal.

    The advantages are:
    - Obviously, way more correct thickness appearance.

    - Due to the manual selection, this is supposed to be absolutely leak-proof (according to my tests it is, so far)

    - Since I have to re-render the selected objects anyway, I can render them to a separate depth buffer with frontface culling to find out their backface positions. This improves the thickness appearance even further.

    - The separate depth buffer can later be used for other improvements. For example, in theory, it's possible to get GI from objects that are fully occluded by other objects (that are not selected for the thickness effect), which typically wouldn't be possible in screen space. But no promises here, we'll see :)
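    The thickness logic described above can be sketched roughly like this (an assumed model in Python, not the actual shader): with only a front-face depth buffer, everything behind a surface occludes up to a single global thickness value; with a back-face depth buffer, a sample occludes only while it lies inside the object's real extent.

```python
# Assumed thickness test for a screen-space ray sample. Depths grow
# away from the camera; the names and fallback value are illustrative.

def occludes(sample_depth, front_depth, back_depth=None, fallback_thickness=0.5):
    if sample_depth < front_depth:       # sample is in front of the surface
        return False
    if back_depth is not None:           # accurate mode: true object extent
        return sample_depth <= back_depth
    # Fallback: one common thickness value for the whole scene.
    return sample_depth <= front_depth + fallback_thickness
```

    For example, a sample 0.2 units behind a 0.1-unit-thick object occludes in fallback mode but not in accurate mode, which is exactly the over-occlusion the extra depth buffer removes.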
     
    Ryiah, valarnur, Ziq-Zaq and 4 others like this.
  20. Passeridae

    Passeridae

    Joined:
    Jun 16, 2019
    Posts:
    395
    Improving Thickness Mode:

    Thickness OFF
    upload_2022-3-25_2-30-18.png

    Thickness ON
    upload_2022-3-25_2-30-42.png

    Here's one more:
    upload_2022-3-25_2-31-2.png
     
  21. JudahMantell

    JudahMantell

    Joined:
    Feb 28, 2017
    Posts:
    476
    This looks great! I'm just waiting on some Realtime GI solution for URP!
     
  22. florianalexandru05

    florianalexandru05

    Joined:
    Mar 31, 2014
    Posts:
    1,811
    This is looking good, the non-ray traced SSGI in HDRP is pretty decent but this looks slightly better.
     
  23. UnityLighting

    UnityLighting

    Joined:
    Mar 31, 2015
    Posts:
    3,874
    Please share a demo file to check its quality and performance during development
     
  24. newguy123

    newguy123

    Joined:
    Aug 22, 2018
    Posts:
    1,248
    This looks awesome!

    What would make it even better, if it can also do reflections! Is that a possibility?
     
  25. Blade_Master_2020

    Blade_Master_2020

    Joined:
    Aug 3, 2019
    Posts:
    38
    Is there a way to render GI with a larger camera FOV and then overlay the GI effect on the MainCamera, so the screen-space GI can pick up illumination from lights that are not visible on screen?

    If such an effect can be achieved, it could be an alternative to a full dynamic GI solution but with much better performance. That means more people would be willing to pay for it.
     
    Last edited: Apr 4, 2022
  26. Passeridae

    Passeridae

    Joined:
    Jun 16, 2019
    Posts:
    395
    Hi everyone!
    I'm working on H-Trace almost every day, but all tests are still done in the Cornell Box scene, so I'm not posting new screenshots because they won't be really different from those that I've already posted here. I'm still working on the thickness rendering, nearly finished. It's very important to get it right if we want to avoid over-occlusion and get nice and accurate shadows. I'm also trying to make it as automatic and as fast as possible, so it takes time.

    Now, onto the questions :)
    Thanks! I'll see what I can do after I release the HDRP version. I heard that URP is close to getting a custom pass system similar to the one that HDRP has now. If it's true, the porting process may be easier than I expect.

    I was thinking about making a demo for the Asset Store. Not 100% sure about it yet, though. Anyway, at the current moment I'm trying to invest as much resources as possible into the actual development process. So, I'm afraid, the demo will have to wait until I'm nearly finished or the asset is submitted to the store. The good news is that it will happen quite soon (I hope).

    Thank you! :)
    I was also thinking about the reflections, but I can't say anything for now, because: 1) I haven't tried yet 2) I don't have a lot of experience with SSR. The author of one of the papers I use mentions that, indeed, it may be possible to make reflections (at least in theory). So, no promises, but I will definitely look into this after the release.

    I've seen this approach (sometimes called "guard band") in different papers regarding AO. First, a double-camera setup is a no go. It is too costly in HDRP and it may be an overkill for such a scenario. There are techniques (Multi-View AO) that make use of two and more cameras, but that's a different story and it's certainly not something you want to have in HDRP. Second, it won't be a silver bullet anyway, because you'll need something like 360 FOV to cover everything and even then there's no way to retrieve any data from behind the objects.

    But there are other ways to achieve the desired result. For example, I'll try to support the fallback on the reflection probes and, if possible, the new adaptive probe volume. Both are not really dynamic, but better than nothing. Moreover, my friend and I are also working on a VXGI tracing that is planned as a more robust fallback when the screen-tracing fails. Again, no promises here, but there are plans to make it a full-scale GI solution. But first I have to get as much as possible from the screen-space part.
     
    Radivarig, PutridEx, Ziq-Zaq and 5 others like this.
  27. UnityLighting

    UnityLighting

    Joined:
    Mar 31, 2015
    Posts:
    3,874
    Currently there is no out-of-the-box fully real-time GI solution for Unity. I hope your GI solution will be the first and only one, and that you find maximum popularity among Unity users.
    I'm waiting to see your GI solution on a real scene (HDRP sample scene is a good one)
     
    Gooren likes this.
  28. florianalexandru05

    florianalexandru05

    Joined:
    Mar 31, 2014
    Posts:
    1,811
    Can't wait to try this, I'll buy it for sure. I only need it for pretty screenshots and videos so moving to HDRP was the most logical conclusion for me.
     
  29. Passeridae

    Passeridae

    Joined:
    Jun 16, 2019
    Posts:
    395
    Thanks! But remember that, at least for now, it's a screen-space effect, so all (or most) screen-space limitations are present. I'll try to test on Sponza and the HDRP sample scene soon.

    Thank you! Btw, have you tried path-tracing in HDRP? If you don't care for real-time performance, there's nothing that can beat a path-traced result.
     
  30. florianalexandru05

    florianalexandru05

    Joined:
    Mar 31, 2014
    Posts:
    1,811
    I don't have an RTX card atm, otherwise I would have used it by now.
     
  31. jjejj87

    jjejj87

    Joined:
    Feb 2, 2013
    Posts:
    1,117
    Just wondering, how is the performance compared to Unity's SSGI?
     
  32. Passeridae

    Passeridae

    Joined:
    Jun 16, 2019
    Posts:
    395
    Hi!
    Depends on the comparison method, because both H-Trace and Unity's SSGI have a number of parameters that can be tweaked. For example you can set any number of ray-steps for Unity's SSGI. The same goes for the sample count for H-Trace. But ray steps are not samples, so we can't just type in the same number into both and compare. Next we have denoisers. Enabling denoisers also impacts performance. Furthermore, we have Unity's native TAA which also acts as a denoiser in this case. By setting a high sample count in H-Trace you can get rid of the noise using only this native TAA (no additional denoisers required). But do we count the performance impact of the TAA itself in this case? All in all it's a complicated question.

    I have an initial comparison in the first post (under the spoiler) where I tweaked settings of H-Trace and Unity's SSGI to achieve roughly the same performance. This way you can understand what visual output each of them yields under the same performance cost.

    From the technical point of view, SSGI seems to trace 1 sample per pixel (maybe I'm wrong, but it's definitely not a big number). The radius of sampling is controlled by the user with the "Max Ray Steps" parameter. And in order to sample across the whole frame, you have to use some ridiculous number, like 1000, which will absolutely murder performance. And even if you do that, you still get 1 sample (or a couple) per pixel. So you'll need to heavily denoise the result. Thus, two denoisers. This denoiser combination leads to a visible temporal accumulation lag and some artifacts.

    H-Trace, on the other hand, samples across the whole frame by default. And it can trace hundreds of samples per pixel while maintaining real-time performance. This results in a naturally low noise level, and there's no need to use multiple denoisers (or use them so heavily, at least). The downside is less accurate thickness detection (which is called "Depth Tolerance" in SSGI). But that has been dealt with, and now there's a mode that can potentially render even more accurate thickness than Unity's SSGI. It's not free (from the performance point of view), but it's optional.

    P.S. I'm in the middle of the actual testing and it's not fair to make the final comparison yet. Things may change, new issues may arise. Plus, H-Trace is not battle tested yet, and it will mature and improve in the future (if someone finds it useful). I will post some screenshots, numbers and visual comparisons soon.
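    For illustration, here's a naive fixed-step depth-buffer ray march of the kind described in the post above, reduced to a toy 1-D heightfield (my own sketch, not Unity's implementation). It shows why each ray's reach is capped by the step count:

```python
# Toy 1-D fixed-step screen-space ray march. The depth buffer acts as
# a heightfield; a ray descends until it dips below a surface or runs
# out of steps, so its reach is at most step * max_steps pixels.

def march(depth_buffer, start_x, step, max_steps, ray_height, fall_per_step):
    x, h = start_x, ray_height
    for _ in range(max_steps):
        x += step
        h -= fall_per_step
        if not (0 <= x < len(depth_buffer)):
            return None                  # ray left the screen: a miss
        if depth_buffer[x] >= h:         # ray dipped below the heightfield
            return x                     # hit: this pixel can light the receiver
    return None                          # out of steps before reaching anything
```

    With a buffer like [0, 0, 0, 5, 0], the occluder at index 3 is found with 10 steps but missed with only 2, which is why covering the whole frame with a fixed-step tracer requires a very large step count.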
     
    Last edited: Apr 10, 2022
  33. blueivy

    blueivy

    Joined:
    Mar 4, 2013
    Posts:
    633
    Will you allow us to control the intensity of the screen space effect? I find something like this could be useful blending dynamic objects with baked lighting, I wonder if controlling the intensity would allow the GI to not over contribute to an already baked scene
     
  34. Passeridae

    Passeridae

    Joined:
    Jun 16, 2019
    Posts:
    395
    Sure! If you find this useful, I will make a slider to control the intensity. I will also try to make use of the "Receive SSR/SSGI" button in the material inspector tab. An ideal scenario would allow completely disabling H-Trace on a per-object or per-material basis. It may be useful if a portion of your scene is baked (or uses some other type of GI), so you could enable H-Trace only for non-baked (dynamic) objects, thereby correctly mixing H-Trace with your baked lighting.
     
    blueivy likes this.
  35. blueivy

    blueivy

    Joined:
    Mar 4, 2013
    Posts:
    633
    Thanks so much! I will definitely be getting this when it releases. My final question is: does this play nicely with Unity's SSR and/or reflection probes? I'm not a graphics programmer by any means, so I don't know if reflections are something that would bug out your system.

    Also, I saw in your GTAO thread that you would also be able to do specular occlusion thanks to bent normals; is that something that will also make it into H-Trace?
     
    Last edited: Apr 11, 2022
  36. GoldFireStudios

    GoldFireStudios

    Joined:
    Nov 21, 2018
    Posts:
    160
    This looks fantastic! Curious, is this only supported in deferred rendering or would it work in forward as well?
     
  37. TyGameHead

    TyGameHead

    Joined:
    Apr 29, 2017
    Posts:
    25
    Hey, I've tried 3 different types of GI for Unity built-in, like SSIL, SSGI and SSRT, but none seem to be working for me. If not built-in, could this possibly be converted to URP? And how soon will the project be out for release? Looks very promising, can't wait. :)
     
  38. Ookoohko

    Ookoohko

    Joined:
    Sep 29, 2009
    Posts:
    103
    Just came across this - looks definitely interesting, good job!

    One quick question: I've been experimenting with Screen Space GI stuff (such as Pascal Gilcher's SSRTGI implementation, which looks quite similar to your approach?) but depending on the case screen space just simply does not provide stable enough information (for example in my use case, which is really high end LED volume rendering).

    So, I've thought of the possibility of adding a DXR raytracing pass for those rays that don't hit anything on the screen space. If the screen space part works well, you don't have to shoot gazillions of them, so the performance should be pretty bearable. I already made a simple test that shoots DXR rays from G-buffer.

    Have you considered this?

    br,
    Sami
     
  39. Passeridae

    Passeridae

    Joined:
    Jun 16, 2019
    Posts:
    395
    You can enable both H-Trace and SSR (or any other screen-space effect) at the same time, no problems with that. One limitation is that H-Trace GI won't be visible in the reflections provided by Unity's SSR. That's because HDRP renders all its native screen-space effects before H-Trace can be injected into the pipeline. I hope this limitation will be lifted, once Unity adds an option to directly override screen-space effects. One of the devs mentioned that it's being worked on. Or maybe I'll just do some custom SSR myself ;)

    Reflection probes - sure. They are a completely separate system. If you had H-Trace enabled at the time of reflection probe baking, it will be visible in the reflection.

    Yep, GTAO and Screen-Space Bent Normals are also available in H-Trace. I'm planning to implement GTSO (Ground Truth Specular Occlusion) based on them. And probably Micro-Shadows as well.

    Thank you!:)
    H-Trace needs _GBufferTexture0 (Diffuse Buffer) to work properly. This buffer is generated only for the deferred-compatible shaders (Lit shader), as far as I know. So, deferred only for now. However, all other required buffers are there for both deferred and forward, so I can try to generate the diffuse buffer myself to support forward, if it's possible to do this fast enough. HDRP is not super flexible when it comes to this stuff. But I'll look into this as soon as all other major parts are ready.

    Thanks!
    Can you elaborate on which assets you used and what exactly didn't work out in your case? It's always interesting to hear feedback on different GI methods.
    I'll try to port it to URP after the release.

    Thank you!;)
    I can't tell which approach is used in Pascal Gilcher's SSRTGI for sure, but judging from the "Ray Step Amount" parameter it has, it's probably closer to a regular SSGI tracer like the one you can find in HDRP.

    As for screen-space instability, well, it's a general limitation of these effects. It's impossible to make it go away completely, but there are ways to make it less noticeable and distracting. I can't claim that H-Trace is better or worse than other screen-space effects in this regard, because I haven't tested it in all possible scenarios. However, I did take some measures. For example, there's an advanced thickness mode which uses a separate backface depth buffer to render object thickness accurately in cases where it's impossible to derive correct thickness data from the regular depth buffer. I'll write about it in the next update soon.

    Yep, I thought about something along these lines. I want to try VXGI (or maybe SDFGI) as a fallback when the screen tracing fails. I'm not sure about DXR, because you need compatible hardware for that. But it probably makes sense to add it at least as an option for those who have such hardware.
     
    blueivy likes this.
  40. blueivy

    blueivy

    Joined:
    Mar 4, 2013
    Posts:
    633
    Thanks for the reply! I'll be patiently waiting for this amazing asset! I really admire all the technical work and skill that goes into these techniques.
    Have you thought about releasing the project on the asset store as is and updating the features as you go along? I know the Unity community would chomp at the bit to buy a screen space GI with just the results you have in the OP even without all of the features listed in your short term plan.
     
  41. Passeridae

    Passeridae

    Joined:
    Jun 16, 2019
    Posts:
    395
    Update:
    I've finished the AO part. Now H-Trace can render Ground Truth Ambient Occlusion alongside GI in a single pass. Enabling AO will add 5-7% cost to the whole effect.

    Without AO (GI only):
    upload_2022-4-14_2-4-29.png

    With AO (GI + AO):
    upload_2022-4-14_2-4-54.png

    AO Debug Mode:
    upload_2022-4-14_2-5-11.png

    AO has its own adjustable radius and intensity:
    upload_2022-4-14_2-6-34.gif

    Main Benefits:
    • Unified GI + AO pass costs less than rendering these effects separately as it's typically done (for example in HDRP).

    • AO radius can be as large as you want (up to the whole frame) with no additional cost.

    • AO can have insane quality in terms of sample count because it shares samples with GI. These captures were made with 200 samples (prior to temporal accumulation) per pixel at 1920×1080 resolution.

    • Bent Normals are also generated.

    I should mention that AO must be used with caution. Ambient Occlusion effects are generally not 100% physically correct and provide just an approximation of GI occlusion. So most of the time you won't need it, because all the necessary, physically correct occlusion is already produced by the GI part of H-Trace.

    What's more important is that it helps to generate Bent Normals:
    upload_2022-4-14_2-28-40.png

    Bent Normals (I hope to upgrade them to Bent Cones one day) are used to produce such effects as:

    - GTSO (Ground Truth Specular Occlusion)
    - Far Field Global Illumination approximation
    - Micro Shadows (fake shadows from normal maps)

    I hope to implement these effects in the near future to make AO even more useful:)
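    As a rough illustration of how bent normals fall out of the AO pass (my own sketch, not H-Trace code): the bent normal is the normalized average of the unoccluded sample directions, reusing the visibility samples the GI/AO pass already traced.

```python
import math

# Bent normal = normalized average of the directions that reached
# open space; occluded samples are simply left out of the sum.

def bent_normal(sample_dirs, occluded_flags):
    sx = sy = sz = 0.0
    for (x, y, z), occluded in zip(sample_dirs, occluded_flags):
        if not occluded:
            sx, sy, sz = sx + x, sy + y, sz + z
    length = math.sqrt(sx * sx + sy * sy + sz * sz) or 1.0
    return (sx / length, sy / length, sz / length)
```

    With samples to the left blocked, the bent normal leans right - which is what lets specular occlusion attenuate reflections coming from occluded directions.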
     
    Last edited: Apr 14, 2022
  42. Blade_Master_2020

    Blade_Master_2020

    Joined:
    Aug 3, 2019
    Posts:
    38
    Hi, all the GI solutions you mentioned are not for Unity HDRP, which makes them beside the point:

    1. Unity built-in RP has "SEGI", which has better visuals than the "Screen Space GI".

    2. Unity built-in RP has no future for PC games; if you make a PC 3D game, you never want it to look like a game from 20 years ago. So the better choice is HDRP. Unfortunately, there is no dynamic GI solution for HDRP.

    3. Mobile/VR platforms can't afford to run a fully dynamic GI solution; baking GI is the only option.

    Therefore, HDRP/URP projects need dynamic GI the most, not the built-in RP.
     
    Gooren likes this.
  43. TyGameHead

    TyGameHead

    Joined:
    Apr 29, 2017
    Posts:
    25
    Yeah, I know. I wasn't saying these will work with HDRP; I was telling him what assets I'm using in the built-in RP, and I hope his project can be converted to the URP pipeline. I never said these assets work with HDRP.
     
    blueivy likes this.
  44. Ookoohko

    Ookoohko

    Joined:
    Sep 29, 2009
    Posts:
    103
    The nice thing about DXR is that you don't really need to do much to get it up and running, and it just works (tm) unlike SDFGI, where you need a solution to conveniently generate the SDF's for the objects + at least in UE5's case you need to support two different types of SDF's ("global" ones, lower res, used for far distance traces, and "local" ones for closer per-object traces). I do understand the need for dedicated hardware though, but at least in my case it's not a problem :p

    For VXGI, as far as I know, you need a good way to rasterize the whole scene to voxels, which I suspect can be a real performance killer unless you manage to somehow apply it to static objects only + update the dynamic ones on the fly and merge the result somehow.

    Anyway, let me know if you need testing or dev help, been messing with a lot of GI related stuff lately.

    br,
    Sami
     
  45. one_one

    one_one

    Joined:
    May 20, 2013
    Posts:
    621
    Those new updates look great! Was wondering about two things specifically though:
    1. You mention that forward doesn't work at the moment - is that only the case for GI 'casters' or also for receivers? The latter would be problematic in deferred pipelines where a forward pass is used for transparent materials, right?
    2. Large objects such as terrains or large buildings are basically guaranteed to cause visually unacceptable issues with screen-space effects. Would it be possible to pre-calculate AO and/or bent normals for these objects and somehow incorporate that into the effect? Perhaps by having them write into a separate pass? Or do you think approaches like SDF and voxelization, as you mentioned above, would be a more feasible solution?
     
  46. blueivy

    blueivy

    Joined:
    Mar 4, 2013
    Posts:
    633
    I presume the easiest route would be a reflection probe fallback, like native SSGI does and like the OP plans for the future. That already provides great results and works on a wide range of systems; combined with the increased quality of this technique, it would look amazing.
    Examples of native SSGI with reflection probe fallback on terrains can be seen here.
    https://forum.unity.com/threads/ssg...enoiser-and-performance.1211346/#post-7744245
     
  47. one_one

    one_one

    Joined:
    May 20, 2013
    Posts:
    621
    Yeah, I've seen that and I think for outdoor scenarios where reflection probes can capture the 'general ambience' of a region it will work very well. However, it's going to be much more difficult if you have very dominant occluders, like mountains in a valley or if your view is right next to a wall that's just barely out of screen space.
     
  48. florianalexandru05

    florianalexandru05

    Joined:
    Mar 31, 2014
    Posts:
    1,811
    Very excited for this, I can't wait anymore!! :eek:
     
  49. Blade_Master_2020

    Blade_Master_2020

    Joined:
    Aug 3, 2019
    Posts:
    38
    Hi, may I ask when you will release it? Maybe in about a month?

    I can't wait too long, because we know that as soon as a GI solution gets famous, it tends to be abandoned suddenly, like all the other beautiful GI solutions on this forum (we all know the reason ;)):

    1.<SEGI> https://forum.unity.com/threads/segi-fully-dynamic-global-illumination.410310/
    2.<HXGI> https://forum.unity.com/threads/hxgi-realtime-dynamic-gi.472486/
    3.<VXGI> https://github.com/Looooong/Unity-SRP-VXGI
    4.<CRTGI> https://forum.unity.com/threads/com...illumination-using-light-propagations.275808/
    5.<NigiriGI> https://github.com/ninlilizi/Nigiri
    6.<MadGoat SSGI> https://forum.unity.com/threads/wip-madgoat-ssgi-screenspace-global-illumination.849214/
    etc.

    Anyway, we hope you can release it soon and then keep updating it on the Asset Store. Otherwise, waiting is a painful thing for us, since all the other GI solutions died during a long wait.

    Once you release it, I'll buy it right away, and I think many people would agree with me; we don't want to wait anymore. We don't care about the thickness feature, because your GI solution is good enough for now (with the reflection probe fallback feature it can create great visuals with high performance, and that's totally enough for a commercial project).

    Best regards.
     
    Last edited: Apr 17, 2022
  50. Passeridae

    Passeridae

    Joined:
    Jun 16, 2019
    Posts:
    395
    Thank you for this offer!:)

    Objects with forward shaders (Hair and Fabric, mostly) don't interact with GI at all right now. There's a possible workaround for this that I'm investigating, but actual out-of-the-box support will arrive once Unity makes it possible to inject custom screen-space effects. However, this is not a problem for forward-rendered transparency. Transparent objects don't write to the regular depth and normal buffers, so they're essentially invisible to H-Trace and Unity's SSGI, and therefore they don't affect each other in any way (for better or worse).
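    The point about transparents being invisible to screen-space effects can be shown with a tiny toy model (all numbers and names here are illustrative assumptions): the tracer intersects rays only against the stored opaque depth buffer, so a glass pane that skipped the depth write neither occludes the trace nor receives its result.

    ```python
    # Toy 1-D depth-buffer intersection test; values are made up.
    opaque_depth = {0: 10.0, 1: 10.0, 2: 10.0}   # a wall at depth 10

    # A glass pane at depth 5 covers pixel 1 but never wrote depth,
    # so opaque_depth[1] is still 10.0.

    def first_hit(pixel, ray_depths):
        """March a ray's sampled depths; report a hit once the ray
        passes behind the stored opaque depth at that pixel."""
        for d in ray_depths:
            if d >= opaque_depth[pixel]:
                return opaque_depth[pixel]
        return None

    hit = first_hit(1, [2.0, 6.0, 11.0])   # marches right through the glass
    ```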

    Well, I'd like to have something like SDFs or voxels as a fallback, sure. But those are plans for the future. At the moment I'm adding an option to exclude specific objects from H-Trace. It will allow the use of other types of GI for such objects, combining them with H-Trace. So, yes, I think it will be possible to bake GI, AO and bent normals for one group of objects and use H-Trace for another, mixing them together.

    Yep, I will try to add the reflection probe fallback in the future. But I'm not sure how good this solution is. Reflection probes are static and captured in fixed positions. The effect they provide is better than nothing, sure, but it's a rough approximation, from what I can tell. Unity's SSGI has a small sampling radius which results in a lot of potentially uncovered areas in the first place. Reflection probe fallback patches these areas with some "kinda-sorta" correct data and it starts to look brighter and better, so it may sound appealing, but it's not a very reliable method. The new adaptive probe volume, on the other hand, looks very promising. But it's still in preview, so it's too early to plan anything.
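    The probe fallback being discussed can be sketched in one dimension. This is a deliberately simplified illustration (the function name, the 1-D UV march, and the constant probe radiance are all assumptions for the sketch, not H-Trace or Unity code): march a ray through screen UV space, and if it exits the screen before hitting anything, return probe radiance instead of black.

    ```python
    # Toy 1-D screen-space trace with reflection-probe fallback.
    def trace_ssgi(uv, step, depth_hit, probe_radiance, max_steps=32):
        """depth_hit(uv) -> radiance if the ray hits geometry, else None.
        uv is advanced by `step` per iteration."""
        for _ in range(max_steps):
            if not (0.0 <= uv <= 1.0):
                return probe_radiance      # left the screen: probe fallback
            hit = depth_hit(uv)
            if hit is not None:
                return hit                 # found on-screen geometry
            uv += step
        return probe_radiance              # ran out of steps: fall back too

    # A scene where the only lit surface is off-screen to the left:
    color = trace_ssgi(0.1, -0.05, lambda uv: None, probe_radiance=0.3)
    ```

    This also illustrates the limitation mentioned above: whatever the probe stored at its fixed capture position is returned regardless of where the ray actually left the screen, which is why the result is only "kinda-sorta" correct.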

    Hi! Honestly, I don't know the reason :) From what I know, people have abandoned those for different reasons. And I'm not sure that the author of HXGI ever even promised to release or sell it publicly (maybe I'm wrong). So, I don't see a correlation between my chances of abandoning the project and the amount of time I've spent working on it. However, I understand your concern. I stopped working on any new features a week ago. I will not add anything new until I submit the asset. Now I'm working on the following things:

    - Fixing bugs and cleaning code;
    - Making sure denoisers work fine;
    - Making sure forward rendering is not breaking anything;
    - Making sure you can disable H-Trace for certain objects;
    - Testing.

    These are basic things that most people will need, so I need to make sure they work properly. I understand that waiting can be tedious and unpleasant, but I will not give an exact release date, because I've already missed a couple of self-imposed deadlines and I've learned from my mistakes. I'll do everything I can to finish it as fast as possible :)

    Thank you, but I think you might care about this feature even if you don't know it yet ;) I'll release a post about it soon, so it may shed some light on why it is so important. It's also crucial to understand that I post only good stuff. You just don't see how many bugs and simply wrong results I get behind the scene. So it may look like everything is fine and production-ready, but it's not the case. There are some thing that need to be finished before the initial release and that's why the asset is still in the development phase.