
Graphics HXGI Realtime Dynamic GI

Discussion in 'Tools In Progress' started by Lexie, May 24, 2017.

  1. chiapet1021

    chiapet1021

    Joined:
    Jun 5, 2013
    Posts:
    605
    Confidence in it happening at all, or that it'll happen in their projected timeframe? I've just been following threads about it on these forums, but from what I've seen, developers are generally positive on the SRP development, even if there's still quite a bit left to do.
     
  2. Lexie

    Lexie

    Joined:
    Dec 7, 2012
    Posts:
    646
    Both. Although the team working on it is passionate, I don't think it's close to Unity's top priority at all. Like I said, if it took them two years to finish compute buffer support after its first appearance in a public release, why would a complex, decoupled custom rendering pipeline be made stable in a shorter amount of time, let alone the two custom pipelines to go with it?
     
  3. hopeful

    hopeful

    Joined:
    Nov 20, 2013
    Posts:
    5,696
    Yeah, aside from the promotional materials, what I've seen from Unity spokespeople is that the HD pipeline is to be considered a "toy" for now, and maybe for the next year. I think it's entirely experimental.

    The lightweight pipeline seems to be real, though.
     
  4. Lexie

    Lexie

    Joined:
    Dec 7, 2012
    Posts:
    646
    I've tried a lot of different methods to generate the GI data for the Sparse Light Volume.

    I've tried a bunch of screen-space raytracing/pathtracing techniques, and although they work well on high-end cards, they are too slow on the hardware I'm trying to target. The effect is also screen space and doesn't really support volumetric lighting, so it's not something I want to spend any more time looking into. It generates really good indirect and direct shadows though.

    I've also tried a few methods for generating the light probe data by ray tracing the octree. If I fire out a thousand rays per light probe it can create some nice results. The downside is that there are a lot of light probes to update, and it's not really possible to update them fast enough for realtime changes using this method.

    Using standard light propagation works OK for realtime lighting, but requires the octree to be dilated a lot. This ends up taking a lot more space in the octree than I would like. If I stick with light propagation I'll probably switch back to the 4x4x4 node size for the data structure, then just dilate one level and be done with it. This is my backup plan if I can't find another way to generate the lighting data.

    Although light propagation volumes generally produce acceptable indirect lighting, they don't produce very good direct lighting from emissive surfaces. Skybox lighting is pretty important for my game, as it mostly takes place indoors with lots of windows to the outside world. I need that skybox lighting to fill the rooms correctly rather than just spilling in around the windows.

    I'm trying out one more method for generating the light probe data, using patches. If you can generate a list of all the nodes each light probe can see, you can quickly generate a light bounce by summing up all the incoming light from those nodes.

    After the patch data is generated for a chunk, the light is bounced around by summing up all the patches for each light probe and recalculating the SH. Each time the results are summed up, it generates one light bounce! The idea would be to generate the patches for a chunk, calculate a few light bounces, then throw away the patch data and move on to another chunk. The patch data will be generated using global lines, but instead of rendering the high-poly world geo like most GI techniques that use global lines, I'll be rendering the sparse node data instead. This should speed up the patch generation enough for realtime use.
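
    To sketch that patch pass (a minimal CPU toy of the summing described above, with made-up names, not the actual HXGI implementation):

    Code (CSharp):
    using System.Collections.Generic;

    struct Patch
    {
        public float R, G, B; // radiance currently stored on this node/patch
    }

    static class PatchBounce
    {
        // One call = one light bounce: each probe sums the incoming light of
        // every patch it can see. A real version would weight each patch by
        // form factor / solid angle and project the result into SH instead
        // of keeping a flat RGB average.
        public static void Bounce(Patch[] patches, List<int>[] visibleFromProbe,
                                  float[] probeR, float[] probeG, float[] probeB)
        {
            for (int p = 0; p < visibleFromProbe.Length; p++)
            {
                float r = 0f, g = 0f, b = 0f;
                foreach (int i in visibleFromProbe[p])
                {
                    r += patches[i].R;
                    g += patches[i].G;
                    b += patches[i].B;
                }
                int n = visibleFromProbe[p].Count;
                if (n > 0) { r /= n; g /= n; b /= n; }
                probeR[p] = r; probeG[p] = g; probeB[p] = b;
            }
        }
    }

    Calling it a few times per chunk (feeding the probe results back into the patches between calls) gives you the extra bounces, then the patch data gets thrown away as described.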
     
    Last edited: Mar 16, 2018
    Zuntatos, arnoob, Arganth and 7 others like this.
  5. Shinyclef

    Shinyclef

    Joined:
    Nov 20, 2013
    Posts:
    505
    Wow sounds like you've been busy! Thanks for the update!
     
    arnoob, Arganth and MarkusGod like this.
  6. arnoob

    arnoob

    Joined:
    May 16, 2014
    Posts:
    155
    That's really interesting to read!
    Also, just out of curiosity, how do you do the light propagation? I don't mean the code; more like, could you give us an ELI5 of what is happening? :)

    Anyway your posts are always great to read, keep up the good work!
     
    MaximKom likes this.
  7. Lexie

    Lexie

    Joined:
    Dec 7, 2012
    Posts:
    646
    A 1D example is probably the easiest to explain.

    Say you have a line of cells from left to right.
    00000000000
    For one update you inject some light into the middle one and then run the propagation loop a few times; the outcome will look like this:
    00000100000
    00001010000
    00010001000
    00100000100
    01000000010
    10000000001

    The lighting spreads out like a sound wave. You would actually store how much light is going to the left and how much is going to the right. If one of those cells was a wall and the wave hit it, it would reverse the direction of the wave, giving you bounces.

    Light propagation volumes are pretty similar to this, but they work in 3D. As the wave spreads out it loses intensity, since it's spreading in more than one direction; this gives you the light falloff.

    Emissive surfaces and directly lit surfaces inject light into the cells every update, so there is always new light being added at those positions. If you think about it more like sound waves, it all becomes easier to understand.
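
    If you want to play with it, here's a toy version of that 1D loop (just the idea above in code, not my actual implementation):

    Code (CSharp):
    using System;

    static class Propagate1D
    {
        // Each cell stores the light moving left and the light moving right
        // separately. One step advances each channel by one cell; light
        // arriving at a wall cell is reflected back the way it came, which
        // gives you the bounce.
        public static void Step(float[] left, float[] right, bool[] wall)
        {
            int n = left.Length;
            var newLeft = new float[n];
            var newRight = new float[n];

            for (int i = 0; i < n; i++)
            {
                if (i > 0 && right[i - 1] > 0f) // right-going light arriving from the left
                {
                    if (wall[i]) newLeft[i - 1] += right[i - 1]; // reflect
                    else newRight[i] += right[i - 1];
                }
                if (i < n - 1 && left[i + 1] > 0f) // left-going light arriving from the right
                {
                    if (wall[i]) newRight[i + 1] += left[i + 1]; // reflect
                    else newLeft[i] += left[i + 1];
                }
            }
            Array.Copy(newLeft, left, n);
            Array.Copy(newRight, right, n);
        }
    }

    Inject by setting left and right to 1 at the middle cell, then print left[i] + right[i] after each Step and you get exactly the rows above.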
     
    arnoob likes this.
  8. elbows

    elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,502
    I totally understand where you are coming from and the historical experiences that have led you there. I have no reason to disagree with most of it either, except that it is clearly one of their priorities and one of the main things (along with Jobs/ECS/Burst etc.) they will use to market Unity 2018. Just see the very recent Unity blog posts for evidence of this, along with the info in the blog post about what is missing from the HD pipeline so far.

    Don't get me wrong: my believing that it's one of their top priorities does not guarantee all that much. However, I am certainly wary of using historical examples as a timescale guide, because the reasons it took them two years to finish compute buffer support are likely down to arbitrary decisions and issues that don't necessarily apply to everything Unity develops. I firmly believe that Unity have placed a large chunk of Unity's future in the pipeline stuff, and I really think they are committed to it. That doesn't mean I'm going to get carried away with assumptions about how quickly they will add the missing stuff to the HD pipeline, such as volumetric lighting, shadows for the area lights, etc., or stability/bug-related stuff. And they clearly have ongoing work to do to convince those who are understandably skeptical at this stage. But I'm reasonably confident that when it comes to the rendering pipelines, a lot of the 'lack of obvious momentum', slipping timescales, and pipeline features being pushed back on the roadmap are things I'm going to associate far, far more with Unity 2017 than Unity 2018. I'm sure there will still, on occasion, be further slippages and limitations, along with the various teething problems and growing pains that seem somewhat predictable for this sort of pipeline work (and associated shaders). But I'm no longer feeling the sense of inertia that was present for a while after the initial pipeline hype, and I'm presently fairly optimistic, especially considering the sort of performance gains that have been seen with the pipelines in some areas.
     
    Last edited: Mar 16, 2018
    yohami and chiapet1021 like this.
  9. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    Also, making sure everything works on 2^64 hardware configurations, some of them exotic, certainly takes a toll, especially when it must work across the equally exotic game requirements pesky devs will come up with (like planetary terrain).
     
  10. arnoob

    arnoob

    Joined:
    May 16, 2014
    Posts:
    155
    That is exactly the kind of explanation I was hoping for! Thank you a lot! :)
    While you are showing the propagation in the two directions at the same time (from left to right and from right to left), I guess that in code you actually need to handle both separately in order to know the direction of the light propagation, don't you? Otherwise 00100000100 would become 01010001010 at the next step.
     
  11. Lexie

    Lexie

    Joined:
    Dec 7, 2012
    Posts:
    646
    Correct, you would store the amount of intensity going to the right and to the left separately, so they can be propagated in each direction individually. To add color, you would store the amount of red, green, and blue in each direction.
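
    In code the cell storage would end up looking something like this (a hypothetical layout, just to show the idea):

    Code (CSharp):
    // One cell of a colored 1D propagation volume: a full RGB intensity per
    // travel direction, with each direction propagated independently.
    struct Cell1D
    {
        public float LeftR, LeftG, LeftB;    // light traveling toward -x
        public float RightR, RightG, RightB; // light traveling toward +x
    }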
     
    arnoob likes this.
  12. arnoob

    arnoob

    Joined:
    May 16, 2014
    Posts:
    155
    That was what I thought, good! :)
    The part where I am struggling now is going to a 2D process.

    If we have this grid, like the example you set up before:

    00000
    00000
    10000
    00000
    00000

    Then we can propagate it in a single direction, like this:

    00000 00000 00000 00000
    00000 00000 00000 00000
    10000 01000 00100 00010
    00000 00000 00000 00000
    00000 00000 00000 00000

    But that would result in rays. So a better option would be to make it also propagate diagonally:

    00000 00000 00100 00010
    00000 01000 00100 00010
    10000 01000 00100 00010
    00000 01000 00100 00010
    00000 00000 00100 00010

    The lighting would then propagate more realistically. But if this grid represents, for example, the light along the X+ axis, it would only light the X axis, and not the Y axis at all! As an illustration, let's imagine a wall with a lit window a meter above the ground. If we use this propagation technique for the lighting, any wall or standing object in front of the window (perpendicular to it) will be lit, but the floor itself won't be at all, as it is parallel to the light's direction.

    Maybe also sending the propagation information to the Y axis could work for one iteration, but it would quickly feed back on itself over multiple steps, as there is no record of where the light is actually coming from...

    Could you explain how you managed to avoid this issue in your old GI technique? (If you have the time, of course.)
     
  13. zenGarden

    zenGarden

    Joined:
    Mar 30, 2013
    Posts:
    4,538
    Meanwhile a scene without materials

    It looks like real-time GI could come soon.
     
  14. buttmatrix

    buttmatrix

    Joined:
    Mar 23, 2015
    Posts:
    609
    Is this Godot?
     
  15. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    It is.
     
    buttmatrix likes this.
  16. Lexie

    Lexie

    Joined:
    Dec 7, 2012
    Posts:
    646
    That looks like light propagation. After looking through images of Godot's GI, it really looks like LPV. It says something about cone traces for low settings, but those artifacts in the GI are the same as LPV's.
    When you switch from 1D to 2D you actually project 3 faces from each neighbor into 2-3 faces of the current cell.
    When you step up to 3D you project 5 faces of each neighboring cell into 4-5 faces. This gives you diagonal lighting.

    Here is a 2D image of how the light is transferred:

    The lighting has no idea where it originated. Each cell looks at each neighbor and projects all of the neighbor's faces into all of its own faces. There are a lot of papers that go over how this works if you want to read up on it.
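
    A rough 2D toy of that gather step (again, an illustration of the idea rather than my implementation): each cell keeps one intensity per face, pulls the flux heading toward it from each of its four neighbors, and spreads each arrival mostly straight ahead with a little bleeding into the two perpendicular faces.

    Code (CSharp):
    static class Lpv2D
    {
        // Face directions: 0 = +x, 1 = -x, 2 = +y, 3 = -y.
        static readonly int[] Dx = { 1, -1, 0, 0 };
        static readonly int[] Dy = { 0, 0, 1, -1 };

        // One gather step over a grid of per-face intensities flux[x, y, dir].
        // The 0.6/0.2/0.2 split is arbitrary here; a real LPV derives these
        // weights from solid angles and an SH projection.
        public static float[,,] Step(float[,,] flux)
        {
            int w = flux.GetLength(0), h = flux.GetLength(1);
            var next = new float[w, h, 4];

            for (int x = 0; x < w; x++)
            for (int y = 0; y < h; y++)
            for (int d = 0; d < 4; d++)
            {
                int nx = x - Dx[d], ny = y - Dy[d]; // neighbor behind face d
                if (nx < 0 || nx >= w || ny < 0 || ny >= h) continue;

                float incoming = flux[nx, ny, d];  // light heading our way
                next[x, y, d] += 0.6f * incoming;  // keep going straight
                int p0 = d < 2 ? 2 : 0, p1 = d < 2 ? 3 : 1; // perpendicular faces
                next[x, y, p0] += 0.2f * incoming;
                next[x, y, p1] += 0.2f * incoming;
            }
            return next;
        }
    }

    That sideways share is what fills in the diagonal spread in the second set of grids above, and it's also why the result has no memory of where the light originally came from.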
     
    Last edited: Mar 18, 2018
    arnoob likes this.
  17. buttmatrix

    buttmatrix

    Joined:
    Mar 23, 2015
    Posts:
    609
    What exactly is going on here?
     
  18. zenGarden

    zenGarden

    Joined:
    Mar 30, 2013
    Posts:
    4,538
    It's not real-time, and in fact there are lights in it, so it's a bad example; a CryEngine SVOGI screenshot would be a better showcase.
    It looks like you have been working on a solution for a very long time. I hope you finally find a working one some day, and aren't disappointed if Unity comes out with real-time GI like CryEngine first.
     
  19. Mauri

    Mauri

    Joined:
    Dec 9, 2010
    Posts:
    2,666
    Won't happen. Guess why Unity uses Enlighten. It's all about compatibility between the different platforms. I don't think CryEngine's GI would run on mobile at an acceptable framerate (if at all).
     
  20. zenGarden

    zenGarden

    Joined:
    Mar 30, 2013
    Posts:
    4,538
    It could happen with the HD pipeline for desktop.
     
  21. Shinyclef

    Shinyclef

    Joined:
    Nov 20, 2013
    Posts:
    505
    It's going to be a while before Unity has realtime dynamic GI built in. It takes a long time to go from announcement to feature, and much longer to a stable feature, and nothing of the sort has been announced yet. It will eventually happen, but not any time soon.
     
  22. zenGarden

    zenGarden

    Joined:
    Mar 30, 2013
    Posts:
    4,538
    In the time you'd spend waiting for an alternative plugin solution, Unity should come out with full dynamic GI.
     
    Last edited: Mar 19, 2018
  23. olavrv

    olavrv

    Joined:
    May 26, 2015
    Posts:
    515
  24. Lexie

    Lexie

    Joined:
    Dec 7, 2012
    Posts:
    646
    Zuntatos, neoshaman and chiapet1021 like this.
  25. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    Especially at current cryptocurrency rates.
     
    hopeful and Lexie like this.
  26. zenGarden

    zenGarden

    Joined:
    Mar 30, 2013
    Posts:
    4,538
  27. AcidArrow

    AcidArrow

    Joined:
    May 20, 2010
    Posts:
    11,941
    Well "need" is a strong word. No game "needs" them.

    But I kinda like the trend I see in some more recent games, where the art style is very simple, cartoony, and stylised, but the lighting and shading techniques are very modern (PBR/SSR) and realistic.
     
    zenGarden likes this.
  28. zenGarden

    zenGarden

    Joined:
    Mar 30, 2013
    Posts:
    4,538
    Yep, you can do cartoony and modern using PBR and SSR approximations and already get good results without needing ray tracing. Sure, some people might want to make a cartoony or flat-poly game using RTX, why not.
     
  29. elbows

    elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,502
    And even when they do, the stuff on display at this year's GDC is very much a mixed system where various things are still rasterised and the ray-tracing is only used for certain things.

    I've just been watching a detailed talk about this stuff, which revealed much of the detail of it, as it pertained to the Star Wars scene using raytracing in UE4. They had to use multiple GPUs to mostly hit their 1080p 30fps target. The nvidia denoising stuff is a big part of getting results at interactive framerates. Other techniques were used to get the required performance too; for example, to use raytracing for AO they couldn't use the ideal number of rays per pixel per frame, so they used TAA techniques to get more data over time, and used a temporary blur trick to hide the lack of data in that buffer on scene cuts. For the raytraced reflections, some simplified material shaders were used. They also used the raytracing to enable nice shadows for area lights.

    And, crucially as far as this thread goes, they investigated using this stuff for GI and could not get anywhere close to the required performance budget. They went over this GI experiment briefly anyway, because they got pretty results and are interested in it for the future, but they were not trying to pretend that this stuff is within easy reach of the forthcoming generation of nvidia raytracing-enabled GPUs.
     
    Last edited: Mar 21, 2018
    Lexie and buttmatrix like this.
  30. chiapet1021

    chiapet1021

    Joined:
    Jun 5, 2013
    Posts:
    605
    Ah well, back to dreaming then... :)
     
  31. buttmatrix

    buttmatrix

    Joined:
    Mar 23, 2015
    Posts:
    609
    ^ GDC 2018 "Cinematic Lighting in Unreal Engine" if anyone is interested
     
    elbows likes this.
  32. elbows

    elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,502
    Yeah, that's the one; sorry I didn't have the name to hand when I posted my thoughts.

    Also, I forgot to say that the GI-type stuff they didn't get performant results with wasn't even trying to do everything from scratch; they were making use of the volumetric lightmap baking in UE4.
     
  33. Adam-Bailey

    Adam-Bailey

    Joined:
    Feb 17, 2015
    Posts:
    232
    It's not going to be used in games for a few years yet, but this type of technology will probably be in Unity in the medium term as well, as both engines are making big pushes into the VFX industry.
     
  34. chiapet1021

    chiapet1021

    Joined:
    Jun 5, 2013
    Posts:
    605
    That's a really good point. It could potentially be used for cinematic experiences much more quickly than for games.
     
  35. buttmatrix

    buttmatrix

    Joined:
    Mar 23, 2015
    Posts:
    609
    Just to elaborate on what @elbows said, the demo ran on the new NVIDIA DGX stations, which can be found here, if you're in the market.
     
    Last edited: Mar 22, 2018
  36. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    That's still good for small VFX shops and ambitious indie filmmakers, which means affordable quality cinematics too.
     
  37. scheichs

    scheichs

    Joined:
    Sep 7, 2013
    Posts:
    77
    Why always pollute the project-specific realtime GI threads with stuff that is completely off-topic? We had it in the SEGI thread with the OTOY stuff, now here with DXR. Can we focus on HXGI here?
     
    pcg likes this.
  38. N00MKRAD

    N00MKRAD

    Joined:
    Dec 31, 2013
    Posts:
    210
    Well, there is no general realtime GI thread and DXR is kinda related.
     
  39. Lexie

    Lexie

    Joined:
    Dec 7, 2012
    Posts:
    646
    Thanks for posting this breakdown. I'm glad they actually released more information about this. I feel like everyone I talk to was super excited about this tech coming out, and having to tell them it isn't going to be usable on consumer-level hardware was not fun.

    It's really cool to see GPU vendors pushing new methods for realtime rendering. I could see SFX studios or some art/museum installations using it, but a price tag of $69,000-$148,000 is a little out of budget for most consumers. I'm sure these prices will come down over time, and GPUs will eventually be powerful enough to handle 1080p 60fps. But if you're looking to make a consumer video game in the next 5 (probably 10) years, I wouldn't bank on using that tech.

    Normally I try to post stuff to steer the conversation back, but I haven't made much visual progress for a while.
    Trying to scale up the GI so it can handle larger areas is proving to be pretty hard.

    I'm kinda at GDC right now, so there won't be much progress posted for a bit. In the meantime it would be cool if we didn't get too far off topic.
     
    ftejada, hopeful and elbows like this.
  40. zenGarden

    zenGarden

    Joined:
    Mar 30, 2013
    Posts:
    4,538
    Did you do some research on how Crytek does SVOGI? Because it just works.
     
  41. elbows

    elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,502
    Yeah, that's the reason I've sounded negative in some of my posts, despite being excited about what this stuff will mean one day.

    Hopefully I didn't misrepresent anything from the demo talk; I spouted what I could remember from it before it even ended.

    Sure, though I will finish up this detour by saying that I caught most of the Unity OTOY talk on a stream a short while ago. I think at one point he showed a pathtraced scene that they had running at somewhere around 1fps on 2 GPUs. Then he said some stuff I wasn't very convinced by about getting it up to 30fps by the end of the year. They need to have their eyes on the future, but while these companies are getting great results developing things like AI denoising, I feel I may need an AI dehype filter to cover expectations in the intervening period ;)
     
    wetcircuit, Lexie and chiapet1021 like this.
  42. Lexie

    Lexie

    Joined:
    Dec 7, 2012
    Posts:
    646
    Every method of GI has its downsides. What works for one game might not work for another.

    For me, SVOGI is too slow, doesn't support volumetric tracing, and has major light leaking from distant surfaces due to mipmapping the voxel data. SVOGI does not work for my game, which requires really dark areas next to lit rooms. The light bleeding is not acceptable for me. I'd also prefer the lighting data to be stored volumetrically so I can do volumetric lighting on the GI.
     
    hopeful, Zuntatos and elbows like this.
  43. zenGarden

    zenGarden

    Joined:
    Mar 30, 2013
    Posts:
    4,538
    It's always a trade-off, and there are choices you must make; CryEngine is more about open worlds than super-precise archviz interior lighting.
    Anyway, CryEngine games like Hunt: Showdown or Kingdom Come: Deliverance look amazing with SVOGI. I'm sure a lot of Unity users would buy a plugin able to get the same outdoor lighting as CryEngine with the same performance :rolleyes:
     
    Last edited: Mar 22, 2018
  44. elbows

    elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,502
    Yeah, if SVOGI were the ultimate solution, more people would have copied its methods by now, and if it had amazing performance they wouldn't have just added an offline SVOGI mode to the latest CryEngine.

    My final thought on this year's GDC raytracing hype: if nvidia thought this stuff was close to being ready for consumer realtime GI solutions, they might have started losing interest in their own GameWorks VXGI stuff, but I believe they were still plugging away at that at GDC this year. Since I'm not there, I can't honestly say what updates to it, if any, they've been talking about this year, but I presume there may be some.

    Nothing has changed my GI expectations so far this year. What nvidia has changed for me is that I now expect they would like at least one game to do something shiny (i.e. glossy reflections) with this stuff in time for the nvidia Volta consumer card launches.
     
  45. chiapet1021

    chiapet1021

    Joined:
    Jun 5, 2013
    Posts:
    605
    Metro Exodus is that game (or at least one of them). How extensively they take advantage of RTX--and how horribly taxing it is on the GPU--remains to be seen.
     
  46. zenGarden

    zenGarden

    Joined:
    Mar 30, 2013
    Posts:
    4,538
    You have 3D engines from big companies, like the Decima engine or Square Enix's Luminous engine, doing real-time GI even on PS4 without baking any lightmaps :rolleyes:
    Meanwhile, CryEngine (with Lumberyard) is the only 3D engine indies can use that offers out-of-the-box real-time GI without baking, with triple-A games using it. Considering that, its real-time GI performance is awesome, and the offline voxelization is mainly there for consoles.

    Meanwhile in Unity you don't have any GI solution without baking.
    Or you must create your own GI solution tailored to your game's needs and hope it looks good enough (a lot of people would be ready to buy such a solution even if it's not perfect).
     
  47. Lexie

    Lexie

    Joined:
    Dec 7, 2012
    Posts:
    646
    The method linked in this video only supports skybox approximation. Unity is about to release a volumetric skybox approximation volume with 2018.1, so if that's what you're looking for, Unity might have this functionality shortly.

    I don't think you're going to find anyone in this thread who will disagree with you that it sucks that Unity has no runtime realtime/baked GI solution. But it's not really constructive to point out that other engines have it and Unity doesn't.

    I understand that a good SVOGI implementation in Unity might work well for many people's projects.
    I understand that my LPV GI might also work for many people's projects.
    But I'm trying to solve GI for my game; I don't have time to work on GI methods that won't work for our needs.
     
    Last edited: Mar 22, 2018
    arnoob, one_one, zenGarden and 3 others like this.
  48. buttmatrix

    buttmatrix

    Joined:
    Mar 23, 2015
    Posts:
    609
    Is it unthinkable to combine baked offline methods with real-time for others, e.g. AO, GI, etc.?

    Seems OTOY GPU accelerated lightmapping just got closer.
     
    Last edited: Mar 23, 2018
  49. zenGarden

    zenGarden

    Joined:
    Mar 30, 2013
    Posts:
    4,538
    It's not real-time; it's still lightmap baking.
    And there are subscriptions if you want to use it in good conditions :rolleyes:
    • OctaneRender Studio is $20 per month and allows access to 2 GPUs and a selection of 1 additional plugin.
    • OctaneRender Creator is $60 per month and allows access to up to 20 GPUs and a selection of 3 additional plugins.
     
  50. N00MKRAD

    N00MKRAD

    Joined:
    Dec 31, 2013
    Posts:
    210
    Single GPU is free and still 10x faster than Enlighten or PLM.

    And, I mean, how many Unity devs have more than one GPU? 5%?
     
    zenGarden likes this.