
Feedback: I'd like this GI solution in Unity, thanks a lot :)

Discussion in 'General Discussion' started by hippocoder, Apr 6, 2019.

Would you like this?

  1. Yes

    93.8%
  2. Yes

    67.9%
  3. Yes

    66.7%
Multiple votes are allowed.
  1. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
  2. iamthwee

    iamthwee

    Joined:
    Nov 27, 2015
    Posts:
    2,149
    Yeah, shame the project is crowdfunded and runs in Blender.
     
  3. keeponshading

    keeponshading

    Joined:
    Sep 6, 2018
    Posts:
    937
    Raytracing against SDF spheres and cubes with physics has been fast since 2012.
    http://madebyevan.com/webgl-water/
     
  4. Darthlatte

    Darthlatte

    Joined:
    Jan 28, 2017
    Posts:
    27
    Hi, how exactly does this work? Do you just set the sunlight to pure red and use a gradient sky with all blue colors and bake the lighting or what? The results look good, but I fail to understand how it works... If someone could write a mini guide or give some hints on this I would be very happy :)
     
  5. Adam-Bailey

    Adam-Bailey

    Joined:
    Feb 17, 2015
    Posts:
    232
    I've been meaning to expand that little test project to release as an example but have had absolutely no time.

    Exactly that. Sunlight set to RGB[255,0,0], ambient light to RGB[0,255,0], and then any other lights (controlled as one) to RGB[0,0,255].

    Bake lighting as normal if just baking direct lighting. If baking indirect lighting then all static geometry will need a plain white texture.

    That gives you a lightmap where the three lighting types are baked to R, G, and B respectively. You can then use those channels as masks in a shadergraph shader.
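    A toy sketch (mine, not Adam-Bailey's actual shader) of the recombine step that the Shader Graph performs per pixel: the baked lightmap's R/G/B channels act as intensity masks for three runtime-chosen light colors.

```python
def combine(albedo, mask, sun_color, ambient_color, lights_color):
    """Recombine a baked RGB light mask with runtime light colors.

    mask: (r, g, b) baked intensities for sun / ambient / other lights.
    The three *_color arguments are whatever the time-of-day system
    currently wants each group of lights to look like.
    """
    r, g, b = mask
    return tuple(
        a * (r * s + g * amb + b * l)
        for a, s, amb, l in zip(albedo, sun_color, ambient_color, lights_color)
    )

# A texel baked only into the R (sun) channel takes on the runtime sun color:
print(combine((1.0, 1.0, 1.0), (1.0, 0.0, 0.0),
              (0.9, 0.4, 0.1), (0.2, 0.3, 0.5), (0.0, 0.0, 1.0)))
# → (0.9, 0.4, 0.1)
```

    In the real shader this is just a few multiply-adds on the sampled lightmap texel; the point is that re-tinting the three channels at runtime is what makes the baked result dynamic.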
     
  6. Darthlatte

    Darthlatte

    Joined:
    Jan 28, 2017
    Posts:
    27
    I understand now, and thanks for the explanation :) So in the shadergraph the values for direct/indirect lighting are lerped based on the RGB mask? I would love to see an example if/when you have the time ;-)
     
  7. DMeville

    DMeville

    Joined:
    May 5, 2013
    Posts:
    418
    +1, would like this GI solution. I need something for a time-of-day system with dark areas (caves, houses) across large areas, and baking stuff is dumb. I know it's been a few months since this was posted, but I'm sad that no one from Unity has popped in.

    I've been loosely following the different GI systems for Unity for a long time (SEGI, HXGI), and I even got hyped years ago about SpectraGI, with its impressive video but no actual product. I've been popping in every few months hoping there had been a breakthrough in performance and that systems had actually been released to the public. I'm going through the preproduction phase of a new project that could really use a dynamic realtime GI solution now, so I stumbled on this thread.

    I've tried SEGI in the past but it was slow and artifact-y (maybe it's better now; this was years ago). HXGI hasn't released anything (and sadly I'm doubtful they will), and there are a few other half-baked systems that probably can't support anything more than demo scenes. Pretty desperate at this point.

    What other options are out there currently? Short of trying to do it yourself, or hiring someone smart enough to do it for you?
     
    Last edited: Jun 10, 2019
    iamthwee and joshcamas like this.
  8. chingwa

    chingwa

    Joined:
    Dec 4, 2009
    Posts:
    3,790
    Does this type of post-injection work in Unity games?

     
  9. Murgilod

    Murgilod

    Joined:
    Nov 12, 2013
    Posts:
    10,132
    I mean, that's all ReShade does, really, and there are ReShade presets for Hearthstone even.
     
  10. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
  11. chingwa

    chingwa

    Joined:
    Dec 4, 2009
    Posts:
    3,790
    Adios, Enlighten (and good riddance)! Nice to see Unity finally being forced to develop an actual realtime solution!
     
  12. xVergilx

    xVergilx

    Joined:
    Dec 22, 2014
    Posts:
    3,296
    Maybe it's a good thing that I was never able to bake anything with Enlighten.

    I don't need to rebake everything nao. (⌐ ͡■ ͜ʖ ͡■)

    Too bad it'll be an eternity before an actual realtime GI solution gets out of preview.
     
    hippocoder likes this.
  13. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Steady on, give Unity some positive reinforcement! There are people there who invested a lot of work and effort. Sometimes things don't go to plan, sometimes they do.

    Such mob :D
     
  14. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    I was skeptical when @hippocoder said GI was solved for RT (at a good-enough level, I mean), but damn, things have greatly evolved between the start of this thread and now. RTX + existing solutions + DDGI = @hippocoder was right. Will he ever pardon my sin? I shall never doubt a moderator again.
     
    Martin_H and hippocoder like this.
  15. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    My child... sins cannot be absolved but should you go right ahead and make sure Unity does a good job of it, I'll forgive you :p
     
    xVergilx and neoshaman like this.
  16. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    BTW, I think I'm going to start implementing the idea floating around in my head: the realization that any texture is actually a 3D one thanks to its mipmap chain, and that, because diffuse light spreads smoothly, it can actually be a good-enough representation.

    Also because I delved into light field courses and discussions, and they state that the 5D plenoptic equation (i.e. x, y, z plus horizontal and azimuthal angles; understand a light field volume, generally a light probe array in games) is overkill, and we only need 4D (x, y plus the two angles). Another way to put it: you don't (always) need a light probe volume; all you need is a cubemap, because that's essentially an empty 3D texture, and the plenoptic equation will fill the inside just fine. PS: I assume you store SH in the texture, not mere pixel colors.
    https://en.wikipedia.org/wiki/Light_field#The_4D_light_field


    I dunno about you, but that opens a lot of opportunities for weak and ultra-cheap hardware to have decent (diffuse) lighting (and probably GI compute) at small cost with no extra hassle (no 3D texture sampling; one sample).

    Which means the main headache now is just knowing how to update those structures to get RTGI.

    EDIT:
    When you can't stop thinking, you realize that the mipmap solution makes so much sense for heightfield terrain. I thought you would need to raymarch the heightfield... but no! The light field solution works cheaply because it encodes the spherical lighting environment at one point; raymarching is only needed when you store pixel colors instead of SH. So you just need one sample per point, and now you have a flat light field associated with a heightfield: all you need is to reconstruct the probe position from the heightmap and the 2D position, then sample that from the shader. DONE! There is no light inside the terrain anyway, and since probes interpolate just fine, you can sample the mipmaps too to get basically a column of lighting. You probably need a distribution policy to find the sampling height of mipmap probes, though, depending on the range of height variation, since each mip level covers a bigger and bigger area; that could probably work in tandem with the heightfield's own mipmap data (which would store the max height of the covered area, but then you need to "march" the mipmap "column" to find the relevant probe since the data is arbitrary in height; maybe use a pointer to the next mipmap height in another channel?).
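    A minimal sketch of the heightfield idea as I read it (entirely hypothetical names and placement policy): one probe per heightmap texel, lifted just above the surface, so a shaded point finds its probe from its 2D position alone.

```python
def probe_position(heightmap, x, z, lift=0.5):
    """Probe for heightmap cell (x, z), placed just above the terrain.

    heightmap: row-major 2D list of terrain heights (one probe per texel).
    lift: how far above the surface the probe sits (made-up policy).
    """
    return (float(x), heightmap[z][x] + lift, float(z))

def probe_for_point(heightmap, world_x, world_z, cell_size=1.0):
    """Find the nearest probe for a shaded point from its 2D position alone."""
    x = min(int(round(world_x / cell_size)), len(heightmap[0]) - 1)
    z = min(int(round(world_z / cell_size)), len(heightmap) - 1)
    return probe_position(heightmap, x, z)

# 2x2 terrain: a point over cell (1, 0) gets the probe lifted above height 3.0
terrain = [[0.0, 3.0],
           [1.0, 2.0]]
print(probe_for_point(terrain, 0.9, 0.1))  # → (1.0, 3.5, 0.0)
```

    A real version would interpolate between neighboring probes (and between mip levels for the "column of lighting"), but the lookup itself really is this cheap.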
     
    Last edited: Jun 21, 2019
  17. iamthwee

    iamthwee

    Joined:
    Nov 27, 2015
    Posts:
    2,149
    @UT I want these solutions now, production ready and available for my mac mini. Plz hurrie.



     
  18. DMeville

    DMeville

    Joined:
    May 5, 2013
    Posts:
    418
    I'm sure many of you have seen this talk and blog posts about DDGI from nvidia:

    https://www.gdcvault.com/play/1026182/
    https://devblogs.nvidia.com/dynamic-diffuse-global-illumination-part-i/

    Apparently they've been working on it for 5 years, and it looks pretty good imo. This talk was given at GDC at the same time as the talk in the original post of this thread, and while it's not quite as fast as 0.6ms at 4k on XBO, maybe they've seen the other talk and made some hefty improvements since then. Although, since it's Nvidia tech, I wonder how nicely it plays with other graphics cards.

    I messaged the speaker, Morgan McGuire, on twitter asking about availability and unity integrations or betas or anything to get my hands on code and start playing, as I'm getting to the point in my project where this is something I would like solved and couldn't find any information about this other than these two links. He said they'll have updates at Siggraph in a few weeks. I have my fingers crossed that means actually releasing some code before 2021.

    (https://twitter.com/CasualEffects/status/1148397983177826305)
     
    Last edited: Jul 9, 2019
    elbows, Total3D and OCASM like this.
  19. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    Yep, I have seen this. It's basically using RTX to update a light probe volume, WITH a nifty idea to control light leaking, which is the main contribution of the technique (it makes updating light probes viable).

    Now the big thing to optimize is the update pass; that's the main thing if you want to port it elsewhere. DDGI uses RTX, but the idea has been implemented with other techniques (see the HxGI thread). That's where you can pillage any other technique depending on your hardware budget and quality target. I think even old-school light propagation volumes could do (low-quality lighting), which could (probably) be further optimized using light field theory to skip empty space.

    edit: Optimizing the structure that controls light leaking (the visibility texture) might be another improvement. Also, for low end, trying a non-grid light probe structure (tetrahedral), which allows for sparser updates.
     
    Last edited: Jul 10, 2019
    DMeville likes this.
  20. OCASM

    OCASM

    Joined:
    Jan 12, 2011
    Posts:
    328
     
    DMeville likes this.
  21. chingwa

    chingwa

    Joined:
    Dec 4, 2009
    Posts:
    3,790
    @OCASM Thanks for the video, but wow that is a lot of technical mumbo jumbo to me. I hope it means something to someone in a position to make stuff happen. :D
     
    OCASM and DMeville like this.
  22. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    OCASM likes this.
  23. DMeville

    DMeville

    Joined:
    May 5, 2013
    Posts:
    418
    'RTX' means hardware-accelerated raytracing, right? Which only works if you have an RTX-level Nvidia card (or similar)? Personally, I'd really like a GI solution that can scale down and work on older cards too, as the majority of players don't have that kind of hardware yet, and probably won't for years still. I could be misunderstanding, though.
     
    Last edited: Jul 11, 2019
  24. OCASM

    OCASM

    Joined:
    Jan 12, 2011
    Posts:
    328
    HDRP is intended for high-end PCs and Unity's real-time GI is intended for 2021 and beyond. By then NVIDIA, AMD and consoles (maybe even mobile) will support hardware ray tracing.
     
  25. DMeville

    DMeville

    Joined:
    May 5, 2013
    Posts:
    418
    Sure, but what about projects targeting LWRP or wanting to release sooner than 2021? No GI for them? Clearly from the OP it can be done acceptably without hardware accelerated raytracing (0.6ms at 4k on an xbox one x!), and it can be done yesterday. The dream would be to have it running on modern hardware, and let those with RTX cards accelerate it making it run even faster or at higher quality, that way everyone wins.
     
    Last edited: Jul 11, 2019
    angrypenguin, one_one and Metron like this.
  26. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    To be more precise, we should probably say "Nvidia's hardware-accelerated BVH traversal for raytracing".

    So while in the video they use RTX, the broad lessons they draw from RTX should be applicable to other implementations of acceleration structures, since they all have the same problems.

    AMD hasn't been in a hurry because, apparently, you can get good performance using compute shaders and implementing the acceleration structure there; that's more work, though. On Nvidia it's just there, which makes it conducive to experiments that focus on optimizing the tracing part (scene sampling) rather than the (ray) acceleration part. The core of that video is agnostic to the tracing method; it also focuses on efficient and accurate light transport.

    Basically, RTX is (kind of) a "scene sampling" implementation that deals with visibility over each point's hemisphere. As long as you solve that problem efficiently (for your target hardware and target quality), you should probably be cool.

    For example:
    - Enlighten solved it on weak hardware, on the CPU, by prebaking (offline raytracing) the visibility of coarse surfaces and storing that visibility per surface. Then at runtime it computes the lighting at each surface and resolves the final GI by querying surface lighting through each surface's visibility structure.
    - Voxel solutions deal with visibility by storing it (coarsely) in 3D textures and marching through the result at runtime.
    - Light probes just store the resolved visibility result of offline tracing in an angular structure (SH), which allows querying in one sample per pixel.
    - DDGI uses raytracing to update light probes at runtime (decoupled from resolution and framerate), but the update method can be anything (voxels, prebaking like Enlighten, compute, or RTX).
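    To illustrate the "one sample per pixel" probe query in the third bullet (an editor's sketch, not any engine's code): an L1 spherical-harmonics probe is just 4 coefficients per color channel, and evaluating it for a surface normal is a single dot product.

```python
import math

# L1 spherical harmonics basis constants
C0 = 0.5 * math.sqrt(1.0 / math.pi)  # band 0 (constant term)
C1 = 0.5 * math.sqrt(3.0 / math.pi)  # band 1 (linear terms)

def project_dir_l1(direction, value):
    """Project light of the given scalar intensity arriving from a unit
    direction into 4 L1 SH coefficients (the offline 'tracing' result)."""
    dx, dy, dz = direction
    return [value * C0, value * C1 * dy, value * C1 * dz, value * C1 * dx]

def eval_sh_l1(coeffs, n):
    """Evaluate the stored probe for unit surface normal n: one 'sample'."""
    nx, ny, nz = n
    c0, c1, c2, c3 = coeffs
    return c0 * C0 + c1 * C1 * ny + c2 * C1 * nz + c3 * C1 * nx
```

    A probe fed by light from +Y evaluates brighter for a normal facing +Y than one facing -Y, which is exactly the directional information a raw irradiance value would lose.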

    So to answer your question: it would scale if they kept what they have and JUST changed the update method (i.e. replaced RTX with another solution), provided you find one (or many that scale across all hardware, or use specific solutions for specific machines, or handle the trade-off with quality).

    The real problem is efficiency, and right now RTX is the proven most efficient method that lets raytracing reach real time at a good-enough cost.


    In fact, I'm exploring that by (ruthlessly) approximating the visibility using a box-projected cubemap that stores the addresses of points to sample, with ruthlessly approximated light transport (so not accurate). Stay tuned for when I get results back.
     
    keeponshading and DMeville like this.
  27. OCASM

    OCASM

    Joined:
    Jan 12, 2011
    Posts:
    328
    To quote @Jesper-Mortensen :

    "To address HDRP and LWRP support we are going to integrate the features that make sense for the pipeline in question. Some features require hardware capabilities not available in LWRP so those will have to remain HDRP only. Also, the pipelines are quite different in nature so to integrate efficiently we need to do it separately for each pipeline in order to achieve optimum performance."

    https://forum.unity.com/threads/enlighten-deprecation-and-replacement-solution.697778/#post-4701119

    And to quote myself:

    "It should be exclusive to the HDRP so it's well optimized and not held back by the limitations of the LDRP. That's the point of having different pipelines in the first place. For the LDRP they could have a different, cheaper technique like LPVs."
     
    pcg, one_one and DMeville like this.
  28. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    For the curious, I found the DDGI paper, which is super readable and goes into more depth than the video and blog:
    http://www.cim.mcgill.ca/~derek/files/DDGI-highres.pdf

    - Apparently they also propose an accelerated raytracing structure that basically traces using the light probe structure itself (not RTX, not a BVH; basically cubemap hopping)
    - They don't use SH like a typical LPPV; they use full cubemaps, laid out with octahedral encoding in an atlas.
    - They use a G-buffer at the cubemap level to compute the lighting and accumulate it over time
    - You don't need to use a grid; any linked probe structure will do (tetrahedral, box-projected cubemaps, etc.)
    - Results with 1m spacing are very close to ground truth
    - It does look trivial to implement a simple version.
    - It's kinda close to my own hypothetical and untested (yet) solution; both use a cubemap atlas as the visibility structure (but differently) and a texture G-buffer. The main difference is that mine treats the cubemap as sample addresses into a G-buffer stored in a lightmap, shadows are defined by sampling the analytical skybox through the visibility structure, and box projection replaces tracing. Both light objects into an async structure that is then simply sampled by geometry at runtime.
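    The octahedral encoding mentioned in the second bullet maps a unit direction to a 2D square (and back), which is what lets each probe's cubemap-like spherical data sit in a flat texture atlas. A standalone sketch of the standard mapping (my transcription, not the paper's code):

```python
import math

def oct_encode(d):
    """Map a unit direction to the [-1, 1]^2 octahedral square."""
    x, y, z = d
    s = abs(x) + abs(y) + abs(z)       # project onto the octahedron |x|+|y|+|z|=1
    x, y, z = x / s, y / s, z / s
    if z < 0.0:                        # fold the lower hemisphere outward
        x, y = ((1.0 - abs(y)) * (1.0 if x >= 0 else -1.0),
                (1.0 - abs(x)) * (1.0 if y >= 0 else -1.0))
    return x, y

def oct_decode(u, v):
    """Inverse mapping: 2D square coordinate back to a unit direction."""
    z = 1.0 - abs(u) - abs(v)
    if z < 0.0:                        # unfold the lower hemisphere
        u, v = ((1.0 - abs(v)) * (1.0 if u >= 0 else -1.0),
                (1.0 - abs(u)) * (1.0 if v >= 0 else -1.0))
    n = math.sqrt(u * u + v * v + z * z)
    return u / n, v / n, z / n
```

    The mapping round-trips exactly (up to float precision), covers the full sphere with no seams at the equator, and unlike a cubemap needs only one 2D rectangle per probe, which is why it atlases so well.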
     
    DMeville likes this.
  29. elbows

    elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,502
    Sounds like some DDGI action was on display at Unite Europe, but I don't know what was shown or in which part of the conference.
     
  30. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Bah I couldn't afford it this year, would've pressed my face to the glass. Anyone got vids or materials on it?
     
  31. OCASM

    OCASM

    Joined:
    Jan 12, 2011
    Posts:
    328
    Here's the keynote:



    Didn't see any GI demos.
     
    Lex4art and iamthwee like this.
  32. elbows

    elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,502
    I know it wasn't in the keynote, but there are other parts of Unite Europe.

    To be honest, I don't expect various bleeding-edge new stuff to be a fixture of keynotes any more; some announcements will be on that stage, but other stuff will emerge on GitHub, in roadmap sessions, or in other Unite sessions first. I've started to see the keynote as mostly announcements for things that involve partners, or stuff that has reached a certain level of ripeness. There are exceptions to this, but I didn't take the lack of DDGI at the keynote to be anything I should read into.
     
  33. elbows

    elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,502
    Apparently it was shown at the session "Optimizing and deploying real-time raytraced global illumination with RTXGI".

    But I am just passing this info on, I wasnt there.
     
  34. xVergilx

    xVergilx

    Joined:
    Dec 22, 2014
    Posts:
    3,296
    So that means no proper software solution (in the near future)?
     
  35. elbows

    elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,502
    I do not know what their full plan is. I was not at Unite or that session, and I do not know if, for example they have both a DDGI plan and a completely different realtime GI plan for the long-term future of Unity.

    I did hear that at the Unite talk they may have been talking about offering a DDGI-type solution that allows for baking, i.e. so that developers running the Unity editor on a dev machine with an RTX card can bake data that can then be used on machines that don't have an RTX GPU. But again, I was not there, there could be details wrong in what I said, and it might not be their full GI plan.

    I suppose in summary my own knowledge right now is limited to:

    Enlighten is being deprecated. This affects HDRP users first, where it is removed in 2020 releases.
    They intend to improve the light probe system in various ways.
    They have been doing some DDGI work, and they spoke about it at that session.
    They were/are hiring for the engineering team that will build the new solution(s).
    Unity has committed to delivering a realtime GI replacement solution in 2021.1.

    (the non-DDGI-specific detail in above list is from https://forum.unity.com/threads/enlighten-deprecation-and-replacement-solution.697778/ )
     
    xVergilx likes this.
  36. xVergilx

    xVergilx

    Joined:
    Dec 22, 2014
    Posts:
    3,296
    I hope it's not a baked solution. Those never worked well enough for me in any project. Ugh.

    Guess, time will tell.
     
  37. elbows

    elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,502
    Yeah, I'm not going to make any assumptions, and maybe baking is the wrong word. Maybe pre-computed is a better term, or maybe it's better just to expect that solutions will probably involve probes in some way at least.

    I don't think 'perfect' fully realtime (e.g. works with procedurally generated scenes), performant GI with wide platform support has really been solved by anyone yet, so I don't expect total miracles from Unity. Possibly some nice hybrid options and a much cleverer set of compromises than we had in the past are possible. At least there shouldn't be any black boxes involved this time.
     
  38. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    I'm not even looking for pure realtime, just something that can handle a change at runtime without impacting the framerate; any solution that chops its work up over time is fine by me.
     
    DMeville and xVergilx like this.
  39. xVergilx

    xVergilx

    Joined:
    Dec 22, 2014
    Posts:
    3,296
    I'm all in for anything that doesn't require waiting a week just to crash at the end.
    It doesn't have to be perfect; it just has to work at large scale. Right now Progressive seems more like regressive, and the GPU one isn't even capable of baking large scenes due to VRAM limitations.

    CPU baking just takes billions of years to finish on the lowest possible settings. Sooooo no, I'd rather have that -2.5 ms and forget about it forever.

    Edit: I've looked for a solution on the Asset Store but there seems to be no good one. And there probably won't be, because of the HDRP shift away from the built-in renderer.

    Edit2: To be honest, I don't even need realtime GI. Just properly working baked shadows. Hmmmm.
     
  40. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    Bakery is actually QUITE good; go to their thread.

    Also I think there is a bit of confusion.

    Enlighten isn't just about baked GI; it's also about PRECOMPUTED GI for realtime resolution of light.
    They are removing the precompute solution, and there is no alternative.
     
  41. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Yeah, this is about realtime indirect, basically. Everything else is solved very well by PLM (GPU), IMHO :)

    (And DDGI handles vast distances fine, though obviously that would be at lower detail and a lower refresh rate or under some other scheme; the point is it's compatible.)

    The last public noise I heard about Unity's GI is that they're trying a lot of things. I would love an update from the lighting team here, if only to tease a screenshot or some talk.
     
  42. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    I'm going to start trials to test my own crappy solution and see how crappy it really is; in fact I'll probably start a thread today, after I finish installing a few things on the main computer. I keep saying crappy to not let expectations get too high. I use a very rough geometry approximation, which might be just enough for weaker hardware to get "artistic GI", if the whole thing works!
     
    keeponshading likes this.
  43. Peter77

    Peter77

    QA Jesus

    Joined:
    Jun 12, 2013
    Posts:
    6,609
    protopop, one_one and AndersMalmgren like this.
  44. Murgilod

    Murgilod

    Joined:
    Nov 12, 2013
    Posts:
    10,132
    Judging by their screenshots, they should be working on an AA solution first.
     
    OCASM likes this.
  45. protopop

    protopop

    Joined:
    May 19, 2009
    Posts:
    1,560
    Cool - Godot has been making a lot of improvements, and I'm very impressed. I wish there was more compatibility between Unity and Godot.
     
    keeponshading likes this.
  46. elbows

    elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,502
    I finally got round to watching the session video from Unite Copenhagen.

    So now I need to remember to call it RTXGI, not DDGI, and I note that on Nvidia's RTXGI page availability is said to be 2020 (not Unity-specific).
     
  47. konsic

    konsic

    Joined:
    Oct 19, 2015
    Posts:
    995
    Hardware and API Agnostic Ray Tracing

     
    neoshaman and iamthwee like this.
  48. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    I was about to share that; 30fps on a 970 is nothing to sneeze at.
     
  49. Acissathar

    Acissathar

    Joined:
    Jun 24, 2011
    Posts:
    677
    Although not as well known as Godot or CryEngine, the C# engine Xenko (formerly Paradox3d, soon to be Stride?) has someone working on a Voxel GI implementation as well:

     
    konsic likes this.
  50. konsic

    konsic

    Joined:
    Oct 19, 2015
    Posts:
    995
    Glory of SSGI
     
    florianalexandru05 and Lex4art like this.