
Graphics Erebus - Real-Time Ray Tracing SDFs; UE4/5 Style DFAO and Soft Shadows [HDRP/URP/BUILT-IN]

Discussion in 'Tools In Progress' started by SuperDrZee, Aug 12, 2021.

  1. jjejj87

    jjejj87

    Joined:
    Feb 2, 2013
    Posts:
    1,117
    And then another post-processed version. I would most likely use this one for my game.
    adjusted2.png
     
  2. SuperDrZee

    SuperDrZee

    Joined:
    Aug 12, 2021
    Posts:
    41

    Hey!

    Yah, that about sums it up. There was actually a bad error hiding in the atomics of the AO shader that was affecting this as well. The last post I made with the lil tease shows what the AO should look like now;



    So while it's still a little dark due to my cranking the AO toning waaay up, it shifts a lot more smoothly from lit to unlit, and should never look as bad or dark as the old stuff.

    There actually is spatial filtering already; it's essentially taking neighboring samples and converging/blurring them. It's a big part of the denoising system. Not at my computer atm, but here's some rando AO shot I have handy to give an idea of what the actual AO looks like. (Black spots on the far right are from the global DF; there's a setting to fix that.)




    After denoising/filtering this is what the AO looks like now. There's a setting I need to expose that controls how strongly samples close to the ray origin affect the final output; this helps with very noisy stuff like the trees so it's not as dark there. But yah, hope this helps give some clarity. I also do have contrast settings and whatnot in the same vein as Unreal when modulating their AO.

    I'll try and get around to it today and go over the AO a little more and all the changes made there. I'll also try and take some brighter shots. These ones are sooo bad... :(
     
    knxrb, ftejada and one_one like this.
  3. jjejj87

    jjejj87

    Joined:
    Feb 2, 2013
    Posts:
    1,117
    Nice!

    Just one more thing, I am guessing this spatial filtering is screen space, right? I think it would look a lot better if the strength and filter radius were more aggressive depending on the distance (using depth maybe?)

    In the pic you posted, top right corner, right below the text "AO should look", the AO is super dense, but it should be much smoother/weaker. Obviously we won't be able to achieve 100% physically accurate results, but I think dimming the AO down would be a feasible option. Maybe multiply it by depth with some clamps. Right now that particular bush looks like a sticker stuck on a good looking scenery. It breaks the distance feel of it. The close-ups, I think you nailed it.
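    To illustrate the "multiply it by depth with some clamps" idea, here's a minimal sketch (names and values are made up for illustration, not the asset's actual code):

```python
def attenuate_ao(ao, depth, fade_start=20.0, fade_end=80.0, min_strength=0.15):
    """Dim AO with distance: full strength up close, clamped minimum far away.

    ao:    occlusion in [0, 1]
    depth: eye-space distance to the shaded pixel, in metres
    """
    # clamped lerp factor: 0 at fade_start, 1 at fade_end
    t = min(max((depth - fade_start) / (fade_end - fade_start), 0.0), 1.0)
    # blend the AO multiplier from 1.0 down to min_strength
    return ao * ((1.0 - t) + t * min_strength)

print(attenuate_ao(0.8, 5.0))    # near bush: AO untouched
print(attenuate_ao(0.8, 100.0))  # distant bush: AO heavily dimmed
```

    In a shader this would be a couple of lines in the final composite pass, after the AO buffer is resolved.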

    Looking good mate. Looking good.
     
  4. jjejj87

    jjejj87

    Joined:
    Feb 2, 2013
    Posts:
    1,117
    Alright, to explain what I meant:

    and then another mock up with volumetric fog.
    adjusted5.png
     
  5. SuperDrZee

    SuperDrZee

    Joined:
    Aug 12, 2021
    Posts:
    41
    Hey, yah, in the top right corner, where it's just a black blob: that is because of how the heightmap is injected into the global distance field. There's a setting inside the clip map object where you can shift the thickness of the terrain around. I ended up hiding it while migrating code around. It's on the last minute list of crap to take care of.



    I had the heightmap AO off apparently. (I trace against the height map, and the global distance field with the height map injected, the height map acts as a high accuracy DF for the terrain)

    I circled the errors just to make sure we're on the same page. I ended up hard coding the height map injection thickness a lil thinner, and voila! No more bads, lol. They tend to crop up past the 3rd clip map. But yah, you do have settings to deal with this.

    And yah, there's a lot of settings to mess with, I'm not a technical artist by any means. I've also been running the scene absent of post processing mostly to show what the system is doing by itself.

    I'm not sure how much it would help increasing the blur factor further away; I can send shots with the filter kernels increased. Their main purpose is to denoise the random ray sampling; every frame it fires rays in completely different directions, temporally accumulates those results, and then applies a spatial filter to smooth out the grain from the summed up noise. You have to be careful or you lose the roughness on the terrain surface you see below if you go too crazy. I think the smoothness and radius adjustments you're looking for, you can get through messing with the ray tracing properties themselves. There is a control for the radius as well as the falloff, though I tried to keep the falloff as close to Unreal as possible, mostly for the sake of familiarity. I will actually expose that vs using a preset value like Unreal, and I think that would give you the level of control you are looking for.
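    For a rough idea of what a radius and falloff control typically do inside such a trace loop (the exact curve here is a guess for illustration, not the asset's or Unreal's actual formula):

```python
def occlusion_weight(sample_dist, radius, falloff=1.0):
    """Weight a sample's occlusion by how close it sits to the trace radius.

    radius:  maximum trace distance; samples beyond it contribute nothing
    falloff: curve shape; higher values drop the contribution off faster
    """
    if sample_dist >= radius:
        return 0.0
    return (1.0 - sample_dist / radius) ** falloff
```

    With `falloff=1.0` the contribution decays linearly to zero at the radius; raising it concentrates the effect near the ray origin, which is the kind of knob being discussed here.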

    You should join the project discord, it would be nice to have someone like you around for tweaking. :)

    I should also mention the heightfield AO was off on the top screen shot, sorry it gets confusing.

    In the first AO buffer shot I sent I wasn't doing a direct trace against the heightfield, but I was injecting into the global distance field/clip map around the camera and tracing that. Basically the results should look more like the bottom half I posted with this message, albeit with differing settings for radius and falloff ;)
     
    ftejada, Vincent454 and PutridEx like this.
  6. SuperDrZee

    SuperDrZee

    Joined:
    Aug 12, 2021
    Posts:
    41
    Just to elaborate on what I mean by the radius and strength settings in place now;

    Below, the top half image has a tracing range of 10 metres, and on the bottom the tracing range is 30 metres.






    The last image here, I toned the 30 metre trace distance strength to around 0.9. There's a couple of other things you can play with, along with the overall contrast.
     
    Last edited: Jul 20, 2022
  7. jjejj87

    jjejj87

    Joined:
    Feb 2, 2013
    Posts:
    1,117
    Nice! Really love the work mate. So, from your response, I think the filter approach is not the way to go, but rather scaling the AO intensity based on depth. So, based on trace distance, apply a lerped value maybe? I think to add icing to this lovely cake you baked, this is a must. I am simply asking for AO to fade over distance; not completely off, but with a min value and a scale value. So, here is a screenshot. I just blended the two pics, top half/bottom half, and it already feels stable. I did exaggerate to show the diff :)
    us.png


    Regardless, take your time, no need to hurry and have a negative start. You can only begin once :)
     
  8. SuperDrZee

    SuperDrZee

    Joined:
    Aug 12, 2021
    Posts:
    41
    I think the first image was a little too big to load properly on the forum; I adjusted that to show what I mean. But to elaborate a little further on the filtering: the AO starts off looking like the image on the left, a lot more stable, but still very noisy. The goal of the filtering is to make up for the lack of samples we can take during a single frame. You would need 100+ samples per frame to get a stable result without denoising; I'm only using 9 to make the effects run fast enough to work in real time.



    So basically rays launch out from the pixel we're looking to color; it then ray traces against all SDFs that could affect that pixel along the random ray path for that frame. The temporal accumulation is a fancy way of saying it takes the results from previous frames and adds them to the current one, if the history is stable enough.

    When the history isn't stable enough, we use a spatial filter, the idea of a spatial filter isn't to blur per se, it's more so for finding stable samples around an unstable pixel to correct any vibrating or visual distortion due to the noise.

    There's basically no way around this part, as all the filtering goes towards the effort of turning a really messy, noisy image like the before half into a stable, nice image like the one on the left. Without it, the whole thing would look like the noisy before image.
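    The accumulate-then-filter idea can be sketched numerically (a hypothetical toy version; the real denoiser works on full buffers with history validation, motion vectors, etc.):

```python
import random

def noisy_frame_estimate(rng, true_ao=0.4, rays=9):
    """One frame's AO estimate from a handful of random rays: very noisy."""
    hits = sum(1 for _ in range(rays) if rng.random() < true_ao)
    return hits / rays

def temporally_accumulate(frames=120, blend=0.9, seed=7):
    """Exponentially blend each new noisy frame into the running history."""
    rng = random.Random(seed)
    history = noisy_frame_estimate(rng)
    for _ in range(frames - 1):
        history = blend * history + (1.0 - blend) * noisy_frame_estimate(rng)
    return history

def spatial_filter(pixels, x):
    """Average an unstable pixel with its immediate neighbors."""
    lo, hi = max(0, x - 1), min(len(pixels), x + 2)
    return sum(pixels[lo:hi]) / (hi - lo)
```

    After enough frames the accumulated value sits close to the true occlusion even though any single 9-ray frame is far off; the spatial pass covers pixels whose history was rejected.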

    The AO strength, fall off, radius, ray cone parameters and other minor settings come from inside the ray trace loop itself.

    But yah, I think I'm following you a little better now, sorry haha. Further post processing is an option if you wanted to modulate the output pixel visibility by depth or something. I tried to follow Unreal's approach; I'm pretty intimate with the graphics part of the source there, and aside from how they use bent normals, the modulation is pretty much the same as how I'm doing it.

    It pretty well goes,

    Ray Trace/Find Min Visibility Per Ray
    Denoise
    Apply

    And repeat.

    The way Unreal and I handle the AO is we calculate the total occlusion of the cone at the sample point, then keep the min visibility (or max occlusion) from a surface for the ray. Once all the rays have figured out how much they're occluded, we sum their visibility and divide by the total ray count.
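    A toy sphere-march version of that min-visibility loop (the sphere SDF and constants are stand-ins for illustration, not the asset's code):

```python
import math

def sphere_sdf(p, center=(0.0, 0.0, 5.0), radius=1.0):
    """Signed distance to a sphere: the only occluder in this toy scene."""
    return math.dist(p, center) - radius

def ray_visibility(sdf, origin, direction, max_dist=20.0, cone_slope=0.25):
    """March one ray, keeping the MIN visibility seen along it.

    Visibility at a step compares the surface distance to the cone's
    radius at that march distance: 0 = blocked, 1 = fully clear.
    """
    t, vis = 1e-2, 1.0
    while t < max_dist:
        p = tuple(o + t * d for o, d in zip(origin, direction))
        d = sdf(p)
        if d < 1e-4:
            return 0.0                        # hard hit: fully occluded
        vis = min(vis, d / (cone_slope * t))  # cone widens with distance
        t += d                                # sphere-march step
    return min(vis, 1.0)

def ao_visibility(sdf, origin, directions):
    """Sum each ray's visibility and divide by the total ray count."""
    return sum(ray_visibility(sdf, origin, d) for d in directions) / len(directions)
```

    A ray pointed at the sphere returns 0 (occluded), a ray pointed away returns 1 (clear), and averaging the two gives the 0.5 visibility the sum-and-divide step produces.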

    There's a lot to get into there for getting the settings to your liking; the way a mesh is converted is even a factor. I know the tree mesh planes were set too thick, leading to a lot of self occlusion, in like everything I posted lol...

    Hopefully I'll have some time to clean things up and try and make a little video showing off all the different settings and stuff related to how you can adjust the AO. It gets pretty overwhelming with mixing the global distance field and controlling the falloffs and start distances and whatnot, and I've kind of just left the scene alone for now, lol.

    Anyways, I hope this clarifies what the filter does, and hopefully the image that didn't load before in previous responses serves to communicate something now, haha.
     
  9. SuperDrZee

    SuperDrZee

    Joined:
    Aug 12, 2021
    Posts:
    41

    Holy S***, pardon my w.e. I'm so tired. Yah, there's a setting for that. So sorry, I was thinking about something really exotic. Of course you can fade it off at a specified radius. :) I just leave it cranked to say, 'ITS RAY TRACING EVERYTHING', lol

    I'll add some settings to control the fade start distance and the length for sure. :)
     
  10. jjejj87

    jjejj87

    Joined:
    Feb 2, 2013
    Posts:
    1,117
  11. SuperDrZee

    SuperDrZee

    Joined:
    Aug 12, 2021
    Posts:
    41
    Quick update, been trucking through the clean up and found a bunch of other little errors. Turns out I made a boo boo two years ago setting up some stuff with the convertor. Every SDF position was off by 0.5 or so; I just couldn't see it because it was always on Y and the objects were big... This was causing a lot of self occlusion with the trees.

    I also got the bent normals back in after finishing up the filtering. And I can say they are applied correctly this time.. ooops.







    I plugged the bent normals directly into the directional light calcs exclusively. There's some blending with the gbuffer normals you won't see here, but yah, I figured it would be worth bringing back after figuring out on a whim where I was messing up there..

    It adds a lot to the image.

    Could have probably framed a better shot, but I dunno, figure it's good enough to show everything is looking smooth and pretty.

    I think I'm finally done after this weekend, like done done, just manuals and testing. Never thought I would say that....
     
    Last edited: Aug 12, 2022
    blueivy, one_one, hkalterkait and 5 others like this.
  12. jjejj87

    jjejj87

    Joined:
    Feb 2, 2013
    Posts:
    1,117
    Hey mate, so quick question: is this for HDRP? Or all render pipelines? Kindly clarify.
     
  13. Nexusmaster

    Nexusmaster

    Joined:
    Jun 13, 2015
    Posts:
    365
    Hi, I think I can answer this: All render pipelines are planned to be supported! This currently runs with the Built-In RP.
     
    pwka, MostHated and knxrb like this.
  14. jjejj87

    jjejj87

    Joined:
    Feb 2, 2013
    Posts:
    1,117
    Thanks!
    Although it kind of seems counterintuitive not to do HDRP first, then URP, then maybe built-in.
    I guess I will have to wait.
     
  15. SuperDrZee

    SuperDrZee

    Joined:
    Aug 12, 2021
    Posts:
    41
    Yah I see what you're saying.

    I started on this 3 to 4 years ago though, around the time HDRP and URP were just being established. At the time it made little sense to push to support either, as this was just a hobby project and both were plagued with issues.

    (I'm looking at you, no motion vectors in HDRP back in the day, among a lot of other things.) I couldn't even make a denoiser back then.

    This is a huge project; I felt it would be more stable to get it finished in the version I had started it in than try to conform it to three differing pipelines, making the entire task of getting this done far more complex.

    If Unity HDRP and URP weren't such a mess when I started this, odds are I probably would have used them. Just kind of one of those crappy things when something is built over half a decade. Also doesn't help that while I was employed we were stuck in built-in for the past 5 years lol. Not many graphics programmers that I know have much confidence in either in a stable product setting even to this day. That's starting to change though; I see less people cringing when you bring up working with either. So it's probably time to start translating.

    But ya, it definitely was a circumstantial choice rather than a personal one at the time.

    From what I gather though, this could be a week long task with automated tools and a lil TLC, so I'm thinking it won't be too long of a wait.

    But ya, in any case, sorry for the inconvenience. :(
     
  16. jjejj87

    jjejj87

    Joined:
    Feb 2, 2013
    Posts:
    1,117
    No mate, no need to apologize. My bad for assuming in the first place. I will just wait for HDRP :)
     
    QGGameDesign likes this.
  17. SuperDrZee

    SuperDrZee

    Joined:
    Aug 12, 2021
    Posts:
    41
    Haha, no, that's my bad. I get asked this so much. I thought you guys deserved some sort of explanation there. :)

    I want HDRP too, I could migrate it to one of those nice Nature Manufacture Scenes.. :( Com'n dev hurry your ass up! ;D
     
    MostHated and ftejada like this.
  18. jjejj87

    jjejj87

    Joined:
    Feb 2, 2013
    Posts:
    1,117
  19. SuperDrZee

    SuperDrZee

    Joined:
    Aug 12, 2021
    Posts:
    41
    Got some pretty big updates;

    Me and Nexusmaster, as he goes by on the forums, have decided to work on this project as a joint venture.

    He'll be helping me out with responding on the forums as well as the discord, so look to him or myself when looking to have inquiries answered or when looking for general support. :)

    We're also planning to change the name of the asset to Erebus, given that it goes beyond SDFs now and this kit will be focusing exclusively on light blocking. We'll most likely be branching our lumen tech, i.e. the indirect diffuse and reflections, into a separate package entirely that will complement this one.

    The past week has been beyond fruitful;

    We've got HDRP support in the works/mostly done.



    Nexus is taking on the conversion. Shadows are now working within HDRP, and AO is pretty much finished after talking to him just now as well! :) :)

    So HDRP/URP will now be officially supported alongside built-in, and will be making it into the initial submission to the asset store! That should hopefully make a bunch of people happy.

    I also went over all the shaders and tweaked everything to get a lot better results. Most of the time working on this went into the object management and the parts that figure out what instance affects what pixel, so now that I'm focusing on clean up and bug fixes, I've really been able to push the tracing quality a lot further.

    So just a little gallery of the AO now:





    There were a few more bugs among other things that I fixed over the week, and I find now the AO looks a lot closer to reality rather than being too dark, and the instance tracing is now bringing out micro details I wasn't pulling out before. There were also some issues that were leading to every pixel being toned; now open areas remain lit like they should lol...



    Also brought out a lot more detail with the shadows and got the softening effect working a lil better. Stuff far from the ray origin fades, stuff close stays sharp. Going to add some simple filtering after upscaling to handle the aliasing when using half res tracing, but aside from that, def a lot better than before as well.

    So, now starting with the Unity light;



    plus the ray traced AO (unity shadows off to show just AO)


    plus the SDF shadows


    Probably could have lit the scene a bit better, but it definitely has come a long way, haha.

    So now a little list of changes:
    -SDF Convertor Is Finalized, New UI, Better Tracing Functions; Won't Crash Now!!
    -Global Distance Field is now super optimal, syncs to transform changes automatically, supports wrap addressing for fast updates.
    -All Render Pipelines Supported
    -AO Effect Produces Much Nicer Results
    -Shadows Produce Nicer Results
    -Tonnes of little optimizations, speed increases across the board.
    -HDRP/URP Support added!!

    Just to be a tease ;D

    Lumen uses very similar SDF management pipelines to this, and requires the shadow SDF tracing how we have it set up already; with a little love, the instance tracing and the clip map tracing can be used as a template for the indirect passes. I'm starting to actively prototype, on weekends, indirect diffuse SDF tracing tech that would work very similarly to Lumen. Without the surface cache, I believe single bounce diffuse building off the AO effects can be achieved and submitted for a release by year's end.

    Infinite bounce could potentially follow suit and be in 2-3 months after. In either case, keep your eyes open for our lumen set up over the next month or two as well!

    I'll be integrating colors/diffuse indirection into the atlas, as well as setting up secondary textures for the colors sometime over the next couple of days, hoping by next week to have some indirect diffuse stuff to show off as well.
     
    Last edited: Aug 2, 2022
  20. SuperDrZee

    SuperDrZee

    Joined:
    Aug 12, 2021
    Posts:
    41
    Just a quick update, finally got around to making a video to sort of explain things better as well as show off everything in action. It's a lil longer than I wanted, but thought this could help clarify a bunch of stuff.



    I posted this on the main post of this thread as well. I'll be updating a lot of the material throughout the coming weeks too, as in the time since my last post we've managed to get even better tracing times for shadows, and HDRP/URP support is mostly wrapped up. I also took some time to really fine tune the tracing maths, and the whole experience is just a whole lot better. There were some bugs with the clip map that became apparent going over it; fixing them removed basically every artifact I was dealing with before.

    I figure I'll let the video do the talking. Also some surprise stuff; we added the indirection/color data to both the instance tracing side of things and the clip map. Everything down to terrains is ready to get started with when we move into indirect lighting, initially on our off time, then full time after we get this thing submitted to Unity.

    So we mainly need a surface cache if we want to follow how Unreal does things, or supplement that with surfels. I'm thinking for the time being I'm going to offer essentially SEGI on 'roids, then get into probes, mesh cards, and all the fun stuff Lumen utilizes on top of the SDF data and tracing pipelines we have already, potentially surfels as well.

    Also, don't mind some of my commentary; had a moment near the end where I wasn't sure how to end it and wasn't giving up the whole 15 minute take for it, lol.
     
    Last edited: Aug 12, 2022
  21. SuperDrZee

    SuperDrZee

    Joined:
    Aug 12, 2021
    Posts:
    41
    Thought I'd elaborate on the quality a bit further now.

    There was really bad distance injection in later clips that I had initially chalked up to simply being a consequence of down-sampling the DFs into the clip map. But I found out this week I had actually introduced a varying offset into the clip map space by mistake that shouldn't have been there.

    Removing it fixed a lot of stuff. I was then able to adjust the ray so it does the mesh trace for a short distance, then falls back to the global clip-map tracing, and now I'm getting seamless results as expected regardless of where the instance trace ends and the clip map trace starts, as well as pretty much seamless transitions when moving instances between clip maps.
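    The near/far handoff can be sketched like this (sphere SDFs stand in for the real mesh distance field and clip map; here both toy fields describe the same sphere just to show the switch-over):

```python
import math

def make_sphere_sdf(center, radius):
    """Build a sphere SDF; stands in for both the 'mesh DF' and 'clip map'."""
    return lambda p: math.dist(p, center) - radius

def trace_hybrid(mesh_sdf, clipmap_sdf, origin, direction,
                 mesh_range=5.0, max_dist=50.0):
    """Sphere-march the high-accuracy mesh DF near the ray origin,
    then hand off to the coarse global clip map past mesh_range."""
    t = 1e-2
    while t < max_dist:
        p = tuple(o + t * d for o, d in zip(origin, direction))
        # accurate per-instance field up close, global field beyond
        field = mesh_sdf if t < mesh_range else clipmap_sdf
        d = field(p)
        if d < 1e-3:
            return t            # hit distance along the ray
        t += d
    return None                 # no hit within range
```

    The seamlessness being described depends on the two fields agreeing near the handoff distance; the offset bug above was exactly the kind of disagreement that makes the seam visible.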










    Everyone seems to be doing angels, figured I'd give it a shot. ;P

    High Poly Angel contributing and receiving AO


    Double the angel action, lower AO intensity.

    Pretty much no over darkening now, I'm finding. I'm bringing the ambient light factor down to normal levels again, and the light intensity is pretty close to 1 in these shots. Overall, aside from maybe some more minor tweaks to bring out more detail with the instance trace, I think the AO is actually done this time, lol.
     
    Last edited: Aug 13, 2022
    knxrb, ftejada, one_one and 4 others like this.
  22. blueivy

    blueivy

    Joined:
    Mar 4, 2013
    Posts:
    633
    Wow, the AO in those angel shots looks awesome! Just wanted to say amazing work!
     
  23. SuperDrZee

    SuperDrZee

    Joined:
    Aug 12, 2021
    Posts:
    41
    Thanks! :D

    I had spent so much time working on the data management I didn't notice all the micro mistakes I had built up with the AO specifically, just whipping things together. (This thing is mostly super efficient data management.) It's been a lot less stressful since teaming up with Christian/Nexusmaster though, the Nanotech developer. It's just been going really well, and it's really given me time to comb everything while he was translating things. (Yay, all the render pipelines work now! Thanks Chris ;D)

    But yah, did some more tweaking to the AO and I got the mesh tracing working really well now. Also found an offset that shouldn't have been there with the clip-map.. again... :confused:


    So roughly the same image as above, albeit the mesh distance field is really doing its job now; the clip map also captures more details too. I'd say before it was mostly clip map. (I think the darkening at the border of the mesh might be a bit off; baked the SDF pretty half-assed.)

    Could probably do something like GTAO as well (last note), where the cones sort of align more to the directional light angle, which would bring this out further and get you a really slick looking effect versus spreading things out as much in certain cases. Could add that in later pretty fast anyways.



    Finding it's coming together really nicely. Also some really cool features I should have in by release that will really make this thing special. Unreal fades the effect off past the last clip map border; I used to run the mesh tracing at 15 metres no problem on the crap-top 1050 GTX special. It's also more costly up close than it is, coincidentally, past the clip map range as I use it. So what I'm thinking I'm going to do is add an adaptive radius increase for when no clip map is present to trace through, to instead brute force against the mesh distance field. Basically that would mean AO at infinite distance lol. I think the overdraw handling would munch through it.

    This is kind of a side note, but this scheme I came up with is really amazing. I could replace the SDF with an SVDAG (Sparse Voxel DAG), which I have a lot of experience with, alooot. Mesh tracing could act on 256^3 volumes and fall back to tracing through a 1024 clip. I also want to experiment with distance field accelerated BVH path tracing.

    Anyways, no one cares! I know.. lol. The future looks really bright though; all the big pieces are finally there and stable, HDRP! Even got the mesh tracing and clip map stuff all working with full material support. (Encodes meshes' albedo, emission, etc. as well as distance; all sampling functions are in place and work in the debugging view.) So yah.. once this is out it's gonna start stacking fast with options. Gonna put that material data to good use and get some reflections and indirect bounce going.

    It's gonna be nice moving out of AO and shadows for a bit...
     
    Last edited: Aug 16, 2022
    one_one, florianBrn, blueivy and 3 others like this.
  24. blueivy

    blueivy

    Joined:
    Mar 4, 2013
    Posts:
    633
  25. SuperDrZee

    SuperDrZee

    Joined:
    Aug 12, 2021
    Posts:
    41
    I've actually already been following his threads, haha. I would be open to integration for sure. :) That would probably depend on a few factors. Lumen and Unreal's DFAO jump between data sets much less gracefully than how they make it sound. By that I mean it doesn't sample from other data sets in the screen space pass; it caches the ray trace results, and if something better is found as it moves across the available data sets, it takes that result instead; i.e. a ray goes off screen sort of thing.

    So the SSGI passes I was planning to add later would basically trace as far as they can (reliably), and if more valid samples come up via mesh DF tracing or from any other data set, we use those instead. This sounds simple, but it's a little more tricky. Basically the screen space effects should use the same ray sample directions and counts as all the effects I'm running currently do, in order for each to override the previous correctly.

    The reason this is important is that while one might assume the screen space GI is doing most of the work, later passes tend to invalidate it, as the screen space information tends to degrade quickly for a number of reasons, and it's important for the sake of continuity that each tracing pass accounts for the previous correctly and conforms to a few very important parameters in order for everything to be cache compatible and to converge correctly.
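    A tiny sketch of that per-ray override (hypothetical shape; the real passes carry hit distance, radiance, and so on rather than a single value):

```python
def resolve_rays(screen_hits, df_hits):
    """Per ray, keep the screen-space result only when it is valid (not None);
    otherwise take the distance-field result cached for the SAME ray.

    Both lists must be indexed by identical ray directions and counts,
    or the per-ray override would mix incompatible samples.
    """
    assert len(screen_hits) == len(df_hits)
    return [s if s is not None else d for s, d in zip(screen_hits, df_hits)]

# two of three screen-space rays degraded (went off screen / hit a back face),
# so the distance-field cache supplies those rays instead
merged = resolve_rays([0.2, None, None], [0.5, 0.7, 0.9])
```

    This is why the ray counts and directions have to match across passes: the override is positional, ray by ray.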



    Here's an image from Lumen that shows what I mean. Basically in this interior, Lumen fills maybe 10-15% of the screen using the screen space GI tracing pass, whereas the distance field tracing ends up yielding most of the accurate GI in the end.

    The reason why the screen space GI gets invalidated so often is that there's a lot of stuff on screen that gets occluded by closer fragments, or that would be part of something that is currently back facing the camera.

    Really the big one would be how compatible transitioning between data sets is between everything, so you don't end up with a mess due to the camera rotating back faces into view from slight translations and stuff, among other things.

    Like, if GTAO used 20 rays per pixel, this would never work with it very nicely. I don't know, maybe for short screen space trace distances, but nothing like he's been showing off, haha.

    Me and Chris are trying to conform to Lumen in that regard, with potentially some influence from EA SEED, as this will all be going into a more ambitious 'ray lights' project realization. So it would really depend on how he would plan to take advantage of the global data sets we have in place now, and what's going on under the hood on his end. But ya, anything is possible for sure. :)
     
    Last edited: Aug 16, 2022
  26. nehvaleem

    nehvaleem

    Joined:
    Dec 13, 2012
    Posts:
    436
    This is looking pretty awesome! Is there any roadmap available? Can't wait to get my hands on this thing!
     
  27. SuperDrZee

    SuperDrZee

    Joined:
    Aug 12, 2021
    Posts:
    41
    Hey man! Thanks!

    I'm aiming to get everything out sometime in September, as soon as it's all ready sort of deal. I'll have to talk to Chris/Nexusmaster though to be certain; he ended up doing the HDRP/URP migration mind-bogglingly fast, and I ended up doing a bunch of work to the clip map parts at his discretion. It's added a little more work to my plate than I anticipated, but I'm pretty much through the worst of it now.

    I'm thinking in a couple of weeks I can lay out a proper road map; right now though, it's looking like sometime in September is highly likely.

    Just a quick update. I'll need to make a video or something to really show this off, but I had some time to clean up the sky occlusion a little more and fix a few bugs, and it's now working correctly. This also handles specular occlusion, though it's sort of hard to see in the images I'm about to post.


    So first off, unlit.

    There is some light baked into this mesh's texture, so everything below will be rendered without the materials;




    DFAO

    So first comes the AO. (Fixed some scaling errors that were blowing up the clip map stuff a lil too much; it was over darkening everything.)



    Sky Occlusion

    Next we apply the bent normals to the directional light acting as the sun. (Normals get a little weird where two SDFs intersect; this should be gone, hopefully, when I go over the convertor again.) The bent normals for debugging are left un-normalized; the length indicates how visible the sky is from that point, while the direction indicates the direction that would receive the most light.
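    In that debug view, a bent normal could be computed roughly like this (a sketch assuming per-ray visibilities from the AO trace, not the asset's actual code):

```python
def bent_normal(ray_dirs, ray_vis):
    """Visibility-weighted average of sample directions, left UN-normalized.

    The vector's length approximates how much of the sampled sky is
    visible; its direction points toward the least occluded opening.
    """
    n = len(ray_dirs)
    return tuple(sum(v * d[i] for d, v in zip(ray_dirs, ray_vis)) / n
                 for i in range(3))

# two rays: straight up is clear, sideways is fully blocked,
# so the result leans up and its length says half the sky is visible
bn = bent_normal([(0.0, 1.0, 0.0), (1.0, 0.0, 0.0)], [1.0, 0.0])
```

    Shading the sun with this vector is what makes the occlusion respond to light rotation, as described below.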



    Light Intensity 1.2 (All images above captured at this value)


    Light Intensity 1

    And finally, with everything on; Looking a lot sharper now... :)


    The really cool thing about sky occlusion is that it's responsive to directional light rotation changes.

    I'll make a gif tomorrow, but it does a really nice job of brightening and darkening the AO in response to directional light changes.

    You can see the difference in toning especially behind the right angel if you compare the AO images with the SkyOcclusion images.

    Edit;

    Just to show the bent normals off a bit better with a bunch of stuff interacting;


    It's a lil low res, and I need to account for luminance when handling the normals with the AO, which I didn't save, but yah, getting sparklies on the tree branches in the full screen view. Thought this would work for now.



    And with everything applied.
     
    Last edited: Aug 17, 2022
    blueivy, one_one, ftejada and 7 others like this.
  28. SuperDrZee

    SuperDrZee

    Joined:
    Aug 12, 2021
    Posts:
    41
    Just a minor update;

    Spent the past few days combing over the mesh convertor, found a few issues and managed to speed up the BVH aloooot. Converting 300k poly meshes in less than a minute.



    This blobby thing is 50^3; gonna probably update this post with SDFs baked at various resolutions. I started adding 3D bin packing support so SDFs aren't locked to a single resolution in the atlas as they are at the moment, and I think I might manage to get sparse SDF support in throughout the weekend, which should keep things very cheap to trace even when SDFs are at 128^3 voxels or something much larger.


    Just for comparison. The small details, like the text, will be more visible when I finish up over the weekend. The additional stuff I'm doing should make it possible to use something close to this while using relatively the same memory. Though that isn't a license to go nuts. I should be able to push this further as well by using 'sparse SDF' data, but that comes later.



    So now that SDFs are more solid thanks to getting the convertor finalized, the shadows are pretty close to what the mesh would deliver being shadow mapped. Being able to control the SDF resolution when baking into the atlas would push this further as well.


    I also got around to looking at the math for the shadows and set in a proper light angle variable for adjusting the cone sampling radius. I've got it set up now with a light angle representing the size of the sun relative to the Earth; you can mess with this to match whatever it is you're looking for though.
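    For reference, tying the cone sampling radius to a light's angular size could look like this (0.53 degrees is roughly the sun's angular diameter seen from Earth; the function name is made up):

```python
import math

def cone_radius(dist_along_ray, light_angle_deg=0.53):
    """Radius of the shadow sampling cone at a given march distance.

    light_angle_deg is the light's angular diameter; a wider angle
    means a wider cone and therefore softer, wider penumbras.
    """
    half_angle = math.radians(light_angle_deg) / 2.0
    return dist_along_ray * math.tan(half_angle)
```

    During the march, a step's soft-shadow term is then something like `min(1, sdf(p) / cone_radius(t))`, so occluders far from the shading point blur the shadow while nearby ones keep it sharp.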



    Still gotta fix that banding, but bent normals, AO and shadows are now looking a lot cleaner.




    And for reference, here's everything on vs. off.

    If you compare the 'Off' shot's lighting baked into the texture below with the AO we're half-ass'n off the SDF, they line up remarkably well. I can still bring out more contact shadowing, mainly where the arm touches the slab, but yeah, I was pretty surprised by that personally.





    The sky occlusion brightens non-occluded areas hit by direct light quite a bit, so if it looks a little brighter, that's mostly why.
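    The AO and sky-occlusion terms discussed in this thread come from sampling the distance field around each surface point. Here is a rough illustrative sketch only, assuming a `scene_sdf` callable; the step counts, falloff, and names are hypothetical, not the shipped shader:

```python
def distance_field_ao(p, normal, scene_sdf, steps=5, step_size=0.2, falloff=0.7):
    """Classic distance-field AO: sample the SDF at a few points marching out
    along the normal; wherever the field value is smaller than the distance
    marched, nearby geometry is closing in, so darken. Near samples are
    weighted more heavily (the per-sample falloff the posts mention exposing
    for noisy geometry like trees)."""
    occlusion, weight = 0.0, 1.0
    for i in range(1, steps + 1):
        t = i * step_size
        sample = [p[j] + normal[j] * t for j in range(3)]
        occlusion += weight * max(0.0, t - scene_sdf(sample))  # shortfall vs. free space
        weight *= falloff                                      # near samples dominate
    return max(0.0, min(1.0, 1.0 - occlusion))
```

    An unoccluded point returns 1.0 (fully lit ambient); geometry crowding the sample ray pushes the result toward 0.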

    Everything seems to be wrapping up nicely,

    The converter seems to have no issue with double-sided stuff or non-watertight meshes. There were some major typos in the converter code that made getting nice, clean, watertight SDFs impossible; it seems to be working alright now.


    Now it's getting silly. With the next parts of the converter in place we should be able to feed it more resolution on a per-instance basis as well; I imagine doubling the res would put quality almost on par with tracing a BVH at that point.

    But yeah, the converter is definitely done now. It should handle anything you throw at it (holes, planes, big mesh-y messes, whatever) and produce a usable SDF with no headaches and no major wait times.
     
    Last edited: Aug 20, 2022
  29. Jack_Martison

    Jack_Martison

    Joined:
    Jun 24, 2018
    Posts:
    143
    Hello, I came from the Nano-tech video and stumbled across this thread. Even though I watched the video explanation posted above, I still have quite a bunch of questions. I'm pretty much a newbie in graphics, so all I can say is I understand almost nothing of what I read most of the time lol.

    First of all, I'm an HDRP user, and the biggest problem there is keeping distant shadows while keeping the closest shadows sharp. Does this asset solve that in a more performant way (or not)? AO is nice for the scenarios in the screenshots above, but IMO that's not the biggest demand among HDRP users, since HDRP is much closer to Unreal than, say, Built-in.

    Okay, if this is a shadow/AO solution, what is the SDF converter? Do we need to convert all models to SDFs so the renderer can use them?
    Oh, and does it replace the cascaded shadows workflow?
     
  30. SuperDrZee

    SuperDrZee

    Joined:
    Aug 12, 2021
    Posts:
    41

    Hey!

    This is going to be a full-blown GI solution; a Lumen-style system is extremely demanding to build, though. AO and shadows were a good testing ground for all the data and culling pipelines we had to set up.

    We're still missing some pieces, like surface caching, probes, and virtual shadow mapping (Unreal's new shadow mapping technology).

    Basically there is zero point in doing GI if you don't have super-optimal light-blocking data. AO is easy because you don't have to read shadow maps when you sample pixels; for GI at the scale we're looking at, you'd need a massive shadow map.

    Unreal also uses this shadow tracing. I didn't skip straight past the AO because all of this is needed for the GI, haha; there is no Lumen without the DFAO and shadows. Going straight to GI would also take me another six months to have something releasable, and I need to support myself haha.

    I personally can't work with screen-space effects due to the compromises: if there is no lit pixel on screen, you're screwed, no GI. If you're in a room and the only light comes in through a window lighting the floor behind you, you'll get no GI in that room until a pixel from that lit spot enters the viewport.

    And RTX is too expensive, so this was my answer to those problems. My plan is to cover the UE4 stuff in a release in a few weeks, and then dive into full-blown GI full time.

    I dunno, it's a big system: 40-plus shader files and probably around 20,000-plus lines of code. The reflections and bounce will easily add another 40 shaders and probably another 20k lines, baby steps lol.

    But yeah, it solves the shadow stuff in a more performant way; this works exactly as Unreal's pipeline was intended to. I do 150 meters of cascaded shadow maps, then 2,000 meters of DF shadows. They stay sharp, and it's always cheaper to trace them than to draw into the cascaded shadow maps.

    So you end up using the default shadows and distance field shadows together in most cases: default shadows up close to capture animation and such, distance field shadows at range to ensure you don't have to compromise on distance.

    As for meshes, everything needs to be converted. Same as Unreal again: this is a ray tracer, it just traces SDFs instead of a BVH because it's a lot faster.

    But yeah, I figure I can do an update on the GI situation; wasn't going to, but yeah, it's in the works.

    I'm already through most of it on the SDF side; I have all the color data stored and interfaces built to access it. (The height map is expensive to trace; that's why it's not in the high-res image below.)

    One more small step for man; I dunno, a little underwhelming, but it's all ready to trace some indirect lighting.


    And it's also in the clip map.

    This alone added five shaders, especially accounting for the terrain. The clip map is a little holey due to the debugger skipping a little too much space. The 'slices' you can see in the clip map image are indicative of the current minimum step size. (This lets me use a small number of ray steps without worrying about them getting snagged on small features while debugging.)

    But yeah, I should be starting on this stuff soon, and the quality should be very comparable to the AO, just bleeding color instead of darkness.

    We have our screen-space GI effect already done; I just want to take my time here. I think over a weekend I could reproduce SEGI with no light bleeding, and it would be a lot faster.

    I might go that route just to give people something, but Lumen does not screw around; it is a very robust solution that can tackle most scene types: interiors, exteriors, huge ones, small ones. Personally, I would like to offer the latter. I did a lot of work extending SEGI for a client, stuff with RSM / screen space, and none of it was good enough to call a general solution. So we'll see where this goes anyway.

    Quite honestly, SEGI-style GI could be done pretty fast now that all the color data is in place and traceable, so expect something there soon.
     
    Last edited: Aug 24, 2022
    tspk91, Stardog, blueivy and 5 others like this.
  31. properse7en

    properse7en

    Joined:
    Aug 26, 2022
    Posts:
    1
    Is there an updated Discord link? The one there right now doesn't work!
     
    MaxKMadiath likes this.
  32. SuperDrZee

    SuperDrZee

    Joined:
    Aug 12, 2021
    Posts:
    41
    Hey, no idea what happened there. People still seem to be trickling onto the server from somewhere... lol, that's weird.

    Here's a new link:

    https://discord.gg/c74977m2aJ

    I'll also be posting a pretty big update. Version 1.0 is essentially finished; I'm doing a few last-minute things while I work through the docs. Looks like we'll be submitting this baby in around a week's time!

    It's finally over... yay!
     
  33. Rensoburro_Taki

    Rensoburro_Taki

    Joined:
    Sep 2, 2011
    Posts:
    274
    Woahhhh! I want this! When is the planned release? Will it be free? A store asset? GitHub? New life? New chances? xD
     
  34. DragonmoN

    DragonmoN

    Joined:
    Nov 27, 2016
    Posts:
    26
    Woahhhh, reading may help
     
    QGGameDesign likes this.
  35. Rensoburro_Taki

    Rensoburro_Taki

    Joined:
    Sep 2, 2011
    Posts:
    274
    I didn't take it that seriously though, because it was intended more as a motivational push. I definitely read. Much, even. Apparently just not about the planned Asset Store release for 60 bucks, or exactly the last sentence of the post above me. xD

    but thanks! ^^
     
    DragonmoN likes this.
  36. Pode

    Pode

    Joined:
    Nov 13, 2013
    Posts:
    145
    Usually raymarching an SDF is done in a fragment shader, but the need to trace against a BVH is more compatible with compute shaders, and compute shaders rule out the browser. Is there any possibility of dumping the BVH into a 2D texture (like when a Texture3D is sliced and laid out in a Texture2D) to keep it compatible with WebGL, even if it implies a 'strong' machine?
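    For reference, the Texture3D-sliced-into-Texture2D layout mentioned above reduces to simple index arithmetic, so the data-storage side is WebGL-friendly even if the tracing loop is the hard part. A sketch of the mapping; the function and parameter names are made up for illustration:

```python
def flatten_3d_index(x, y, z, size, slices_per_row):
    """Map a 3D voxel coordinate to a 2D texel in a sliced atlas: each
    z-slice of a size x size x size volume becomes a size x size tile,
    laid out slices_per_row tiles per atlas row."""
    tile_x = (z % slices_per_row) * size   # which column of tiles
    tile_y = (z // slices_per_row) * size  # which row of tiles
    return tile_x + x, tile_y + y
```

    A shader-side sampler would do the same arithmetic in reverse to fetch a voxel, interpolating manually across slice boundaries where needed.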
     
    QGGameDesign likes this.
  37. Rensoburro_Taki

    Rensoburro_Taki

    Joined:
    Sep 2, 2011
    Posts:
    274
    How are the finalization processes going?
     
    jjejj87 and DragonmoN like this.
  38. SuperDrZee

    SuperDrZee

    Joined:
    Aug 12, 2021
    Posts:
    41
    Hey!

    Sorry, for some reason I never got any notifications. :confused:

    We'll be submitting to Unity at the end of next week, after Chris comes back to go over a few things. As of now I'm well into the docs, just adding extra UI features as I think them up, but yeah, we're done. :)

    I've been posting more regularly on the Discord server, trying to avoid spamming up the WIP threads. We added a few last-minute things I wanted to share first before getting more into release stuff.

    So we got our converter working with large objects consisting of tons of prefabs, pretty efficiently in terms of user input.

    This gave us a chance to try out some large objects to give an idea of how well SDF tracing supports them.




    And a couple AO views of the interiors;





    We only converted the one tower. It uses the same SDF resolution as the deer, but even for this four-story building, 50^3 seems to be more than sufficient.

    We also added options where you can now use any size of SDF, with some fancy bin packing for composing the atlas. So essentially there isn't really a size restriction on the objects you work with anymore. We've tried meshes at scales of 400 meters and ran into no issues modifying transforms or with the tracing effects themselves.



    Just to elaborate: we now have a pretty dynamic atlas, just like Unreal. Right now all our fields have a max dimension of 50 voxels, with varying counts on the other axes. You can bake an SDF volume of any resolution into it, and it will efficiently pack them into the allotted space.
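    The atlas composition described here is a bin-packing problem (in 3D for a voxel atlas). As a simplified 2D analogue of the idea, here is a greedy shelf packer; this is not Erebus's actual packing code, and all names are hypothetical:

```python
def shelf_pack(sizes, atlas_width):
    """Greedy shelf packing: place each (w, h) rectangle left to right on
    the current shelf, opening a new shelf below when the row is full.
    Returns the (x, y) offset of each rectangle plus the total height used."""
    placements, x, y, shelf_h = [], 0, 0, 0
    for w, h in sizes:
        if x + w > atlas_width:            # row full: start a new shelf
            x, y, shelf_h = 0, y + shelf_h, 0
        placements.append((x, y))
        x += w
        shelf_h = max(shelf_h, h)          # shelf is as tall as its tallest item
    return placements, y + shelf_h
```

    The 3D version adds a depth axis and sorts volumes before placement, but the principle of packing variable-size blocks into one shared allocation is the same.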

    Lastly, we now support all directional lights and all point lights added to the scene.


    This is pretty new and is really what pushed us back: we wanted support for adding our lighting stuff to any lighting pass inside Unity, and we also wanted to extend the shadows to support point lights. So here we have it! Super-fast point light shadows with softening.


    So now you just add these Titan Light components to your Unity lights and our manager does the rest. (Titan is the name we gave the underlying manager we'll be using for Erebus and our GI tech, Aruna.)

    We made sure that all of this was added to Unity's shadow factor lookups, so all shaders, regardless of pipeline, will work automatically.

    There's also no limit to the number of lights, directional or point, and everything supports drawing outside Unity's defined shadow distance/radius.

    We also support working in the editor: Scene view cameras are completely compatible, as is running any number of cameras. Changes to lights or renderers on objects are picked up automatically, updating both shadows and AO. All editor callbacks work with our objects, so artists and level designers should be able to work as they normally would with the built-in Unity lighting setup.

    We did a lot more under the hood: ensured our math handles scaling and rotations (the AO and shadow softening all account for it), dynamic SDF sizes instead of a single constant size, a bunch of last-minute optimizations, and a heck of a lot of work on Chris's part getting this to run on all the RPs and rendering paths. We also added some custom stuff allowing infinite-range AO (instance-only tracing outside the clip map range, with increased AO tracing distance to compensate for the lack of cascades) at a very low performance cost, and a bunch of other really nice stuff we hope you'll like.

    In terms of features, we can now truly say we support everything Unreal does, for both the AO and now the shadows. :)

    Eventually I'll add support for spotlights. (Unreal doesn't even have this.)

    But we need to get this out, for both my and Chris's sakes... So, really happy to say, I'm just cleaning code, tidying up non-essential UI stuff, and writing the manual now. :)

    I'm so glad it's done...
     
  39. minhdaubu2

    minhdaubu2

    Joined:
    Jun 10, 2014
    Posts:
    76
    Congrats, you guys make me so hyped. If you're interested, I hope you can use our scene to demo your tech.
    The game is called The Scourge (Unity 2021.1f4, HDRP).

     
  40. KYL3R

    KYL3R

    Joined:
    Nov 16, 2012
    Posts:
    135
    I just watched
    and now I'm really happy someone is doing SDF Shadows for Unity :)
     
    blueivy likes this.
  41. SuperDrZee

    SuperDrZee

    Joined:
    Aug 12, 2021
    Posts:
    41
    Wow! :D

    First, gotta say this is super impressive. Amazing work! :) I'd be more than interested! I think your project and our shaders would be a match made in heaven. <3 We just added spotlight support on top of the point lights; I think your project could really put these new light types to great use. :)

    I'll send you a DM later today. :)
     
    QGGameDesign and minhdaubu2 like this.
  42. jjejj87

    jjejj87

    Joined:
    Feb 2, 2013
    Posts:
    1,117
  43. sirleto

    sirleto

    Joined:
    Sep 9, 2019
    Posts:
    146
    looks pretty nice, looking forward to your asset store entry!
     
    QGGameDesign likes this.
  44. Rensoburro_Taki

    Rensoburro_Taki

    Joined:
    Sep 2, 2011
    Posts:
    274
    There is no showcase with a regular architectural room o_O
    Hard to imagine how this would really look.
    The forest-deer scene might be good for programming the thing, but as a customer I'd rather see something architectural, comparable, familiar, where you can see the nuances in shading and so on.

    Otherwise, looking forward to seeing this one in the store.
     
    Last edited: Nov 19, 2022
    matthewhxq, sirleto and olavrv like this.
  45. nehvaleem

    nehvaleem

    Joined:
    Dec 13, 2012
    Posts:
    436
    Any progress? Release date ETA?
     
    sirleto likes this.
  46. Nexusmaster

    Nexusmaster

    Joined:
    Jun 13, 2015
    Posts:
    365
    Hi,
    we got permission to use ArchVizPRO assets for promotion, so here is a quick setup. I could set the AO up a bit nicer, since with Erebus you can adjust the SDF bias, but it already works quite well:
    ArchVizProAndErebus.jpg ArchVizProAndErebusAO.jpg

    This is now all realtime shadow casting and AO, nothing is baked!
     
  47. SuperDrZee

    SuperDrZee

    Joined:
    Aug 12, 2021
    Posts:
    41
    Just a small update.

    We ended up finding some issues when initially setting up the arch viz scene, and after a little messing around we found some things related to working with different scale units that the tracer's settings needed to account for a bit better. There were also some modifications made to the prefabs in the scene that weren't applied to the source, which messed up some of the objects being converted.

    Now that we have a firm grasp on how to tackle scenes that follow Unity's units (all metric, vs. the terrain scene, which used a lot of imperial-scaled meshes meant for Unreal), everything is working pretty well. So I thought I'd offer a tease to end my day of what the living room looks like with most of its objects set up correctly.

    Note: I didn't convert the entire scene; a lot of the small clutter in the living room specifically still needs to be managed, but there are some nice improvements coming along.


    Fine-tuned AO. (The light bleeding along the wall can be corrected really simply; it's due to the arch viz mesh optimizations for the house's hull, and me not noticing until the image was already posted lol.)



    Top: Erebus on. Bottom: Erebus off. All shadows in the images are drawn by Erebus; Unity shadows are disabled in the scene.

    So again, no baking. ;)

    I should also mention Erebus is done; the Asset Store page is ready, and we just need to hit the big ol' publish button and it's live for you guys.

    We're taking a day or two to really make sure we've got our sh*t together, as well as to cover in the manual some things we figured out working with the arch viz scenes.

    So yeahhh, expect Erebus to be available next week. :)
     
    Last edited: Dec 4, 2022
  48. orville_redenbacher

    orville_redenbacher

    Joined:
    Sep 9, 2018
    Posts:
    34
    This is probably a stupid question, but... does this run on mobile at all?
     
  49. SuperDrZee

    SuperDrZee

    Joined:
    Aug 12, 2021
    Posts:
    41
    While it's not officially supported atm, I recall Chris saying he made a mobile build and it ran, but his phone was too slow to handle it without low FPS. The test scene we've been using is very unoptimized though; it takes a lot of resources to draw all the Unity Terrain stuff, especially the trees with no instancing, so it's really hard to say.

    I'd imagine for simple scenes you'd fare okay, but I personally can't make any guarantees atm. It's a little complicated when looking at the whole package. I imagine the shadows would always run a little better than Unity's shadow mapping passes would. The AO is a little hard to make comparisons with, as it's more akin to RTX ray tracing than to screen-space effects, just a lot faster in DFAO's favor vs. RTAO.

    In theory, a new Snapdragon Gen 1 should run all of this faster than my GTX 1050 laptop. This is something I'll actually test a little later. I think flagships from the last two years could handle it, given that I see a lot of newer devices emulating Switch and performing well.

    In relation to mobile support, I'm also planning to add more resolution options, like a 1/4-res mode with upscaling, like Unreal used back when they first released their stuff. Once that's in and we've experimented with it a bit more, I think we'll be able to lean toward saying 'mobile is officially supported'.

    For now, though, we make zero guarantees and you're on your own for the moment. I'm planning to add the quarter-res AO mode pretty soon via one of the first patches, so hopefully it won't be this way for too long.
     
    orville_redenbacher likes this.
  50. Rensoburro_Taki

    Rensoburro_Taki

    Joined:
    Sep 2, 2011
    Posts:
    274
    I can't help myself, but I can't see any light features that make this a "lighting" solution. Shadows & AO are something that already exists in 1,000 versions all over the internet; on GitHub there are AO solutions that even Unreal would have been jealous of back when it needed them. Covering AO and shadows isn't something I can get excited about in any way anymore. So, where is the real light/bounce demo that shows what the light part of your tool does? SEGI already looks and works well enough, it's free, and your screenshots aren't even close to what SEGI or SSRT3 offer. I even wonder how you can write something like that if you have no idea about lighting at all? Sorry, you guys can't be taken seriously. Not from a professional point of view. And suddenly you talk about "baking" O_O

    Once again - where is the light? where are the bouncing light effects, that really make things better?
    I can achieve a better result than your two poor screenshots in Built-in, in real time, not baked, without any plugin!

    Thank you, I am out from here
     
    sirleto likes this.