
Official Adaptive Probe Volumes (APVs) experimental release for HDRP in 2021.2

Discussion in 'Graphics Experimental Previews' started by Matjio, Feb 11, 2022.

  1. AcidArrow

    AcidArrow

    Joined:
    May 20, 2010
    Posts:
    11,750
    Eh, I think if APV lerping becomes a thing a lot of us use, we need a way to easily lerp sets of reflection probes as well, or a system to easily low-priority async rebake all reflection probes in the vicinity.

    But we're probably not going to get that. Normalizing the reflection probes by the light probes will be deemed good enough and they'll call it a day, and it will be yet another half-assed feature in Unity's toolset.
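
    (For what it's worth, the built-in ReflectionProbe scripting API already exposes time-sliced, script-driven refresh, which gets part of the way to a low-priority rebake. A minimal sketch follows; the nearby-probe selection logic is purely illustrative, and HDRP's own probe component behaves differently.)

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Rendering;

    // Rebake at most one nearby realtime reflection probe per frame,
    // time-sliced over individual faces to avoid a single-frame spike.
    public class NearbyProbeRebaker : MonoBehaviour
    {
        public float radius = 50f;
        ReflectionProbe[] probes;
        int next;

        void Start()
        {
            probes = FindObjectsOfType<ReflectionProbe>();
            foreach (var p in probes)
            {
                // Only meaningful for probes whose Type is set to Realtime.
                p.refreshMode = ReflectionProbeRefreshMode.ViaScripting;
                p.timeSlicingMode = ReflectionProbeTimeSlicingMode.IndividualFaces;
            }
        }

        void Update()
        {
            for (int i = 0; i < probes.Length; i++)
            {
                var p = probes[(next + i) % probes.Length];
                if ((p.transform.position - transform.position).sqrMagnitude < radius * radius)
                {
                    p.RenderProbe(); // spread across ~9 frames by the time slicing
                    next = (next + i + 1) % probes.Length;
                    break;
                }
            }
        }
    }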
     
  2. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Kinda leaning toward a mix of relighting probes (possibly an RGB cube, with the alpha containing depth from which you can reconstruct a normal) and static (local) indoor ones.

    Probe relighting is a staple of open world games out there because it's practically free, requires no camera nonsense or setup or cost. It would be nice if the option was built into the existing Unity probes. Naturally these probes do not contain shadow information, but they may well not require it for their use cases. Any extra thoughts on this from anyone would be cool.

    Not sure how it would look API-wise, but you need the cubemap RGB and possibly depth in the alpha. Then you can calculate how a probe is supposed to look from any given position with any number of basic in-shader lights, like maybe a campfire or a primary directional light. It's mainly for outdoor usage, or places where it's too expensive to update 6 camera directions for a cube. The memory cost would be a bit higher: the probe needs RGB+RGB+A, so 7 floats to do with what you will.

    Perhaps there are better data arrangements that people know about.
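
    (As a CPU-side illustration of the relighting math: given a probe's RGB cubemap plus per-texel depth, you can place each texel in world space and re-light it with simple analytic lights. The normal guess below is deliberately crude; a real implementation would derive it from depth differences between neighbouring texels.)

    Code (CSharp):
    using UnityEngine;

    // Sketch of relighting a single probe texel from RGB + depth.
    public static class RelitProbeSketch
    {
        // World position of the surface the probe "sees" in a given direction.
        public static Vector3 ReconstructPosition(Vector3 probePos, Vector3 dir, float depth)
        {
            return probePos + dir.normalized * depth;
        }

        // Re-light one texel with a single point light.
        public static Color Relight(Color albedo, Vector3 texelPos, Vector3 probePos,
                                    Vector3 lightPos, Color lightColor, float lightRange)
        {
            Vector3 normal = (probePos - texelPos).normalized; // crude stand-in normal
            Vector3 toLight = lightPos - texelPos;
            float atten = Mathf.Clamp01(1f - toLight.magnitude / lightRange);
            float ndotl = Mathf.Max(0f, Vector3.Dot(normal, toLight.normalized));
            return albedo * lightColor * (ndotl * atten);
        }
    }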

    In any case, for a large open world game, I think people need 3 types of probe:
    • classic render camera for 6 faces / sliced (slow, but you can use proxy geo)
    • static probes for indoors etc
    • relit probes: store RGB+depth and use it to render a new probe, relit to loosely match any time of day or lighting circumstances. A custom shader can be used.
     
    one_one and HIBIKI_entertainment like this.
  3. HIBIKI_entertainment

    HIBIKI_entertainment

    Joined:
    Dec 4, 2018
    Posts:
    595
    This is largely where we've been heading too, with open air being the most difficult to balance right.
    In our early tests we had a few open-air probes refreshing; both code and Timeline tests proved to work, however the lack of time slicing in HDRP below 2022 puts a heavy spike into the mix. Interestingly, it was more effective driven by Timeline than hardcoding the update on change.

    Even with custom frame setups, ignoring shadows for example, there were sadly still large spikes causing a noticeable jitter, and as you mentioned, the 6-face updates were just way too much.


    In general I feel you're certainly on a strong path for your project, especially with your DOTS framework as well; that must have been very rewarding for you, getting into that early on.

    It also seems like many open world projects are hitting the same pain point sadly.
     
    hippocoder likes this.
  4. HIBIKI_entertainment

    HIBIKI_entertainment

    Joined:
    Dec 4, 2018
    Posts:
    595
    To be honest (from an Nvidia perspective) you could use most 1000/2000/3000-series cards just fine.

    We tested on a 1070 Ti, a 2070S and a 3070, and all were pretty adequate on our test exterior/interior scene, though the testing wasn't extensive.
    But I think your personal workflows wouldn't vary too much based on hardware; potentially just more volumes for lower-end cards.

    Overall it's a much, much, MUCH better experience than manual probe placement, on top of the additional processing APV does for you.

    The APV system can be sliced up quite well with volumes so that your GPU isn't chugging too much at once.

    I will, however, note that it's still early days in the tool's life, so using it at full production stage is dicey.

    It does tend to freeze and hang the GI cache plenty, and of course full support for end hardware may not be present yet.
     
    Last edited: Jun 9, 2022
  5. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    I'm switching to HDRP after the prototype, and HDRP has a heavy up-front CPU cost. I hope staff will optimise it as best they can. I really do need the fidelity it brings, and I won't wait for APVs.
     
    HIBIKI_entertainment likes this.
  6. HIBIKI_entertainment

    HIBIKI_entertainment

    Joined:
    Dec 4, 2018
    Posts:
    595
    I feel like I'm generally more accustomed to HDRP workflows; our team took a much longer time to adapt, trying to shift their solo-work Built-in mindsets to HDRP specifically.

    I'd say focus on your project outline with the hardware targets and the RP assets set up first; you'll probably have a much better time working with it, and can do profiling and benchmarking rounds then.
     
    hippocoder likes this.
  7. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Well, I have to say it's come a long way since I was last here a few years ago. This blows Unreal away in the right hands, because it has way less temporal dirt and the signal seems way cleaner. Some really clever heads have worked on this and it shows as soon as you engage. Wow! Loving this!

    Sorry, off-topic. Now to figure out how APVs work in an open-world lerpy context. Advice anyone? Basically the plan is to emulate Enlighten by lerping time of day. What is best practice for the size/resolution/amount of APVs needed, etc.?

    Advice please! :D
     
  8. Reanimate_L

    Reanimate_L

    Joined:
    Oct 10, 2009
    Posts:
    2,788
    It currently doesn't work well with terrain. Also, don't lerp every frame.
     
  9. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Ouch - so there is a big hit to performance? I don't mind if something can be sliced to only update in parts over time, but what I CANNOT endure are spikes, even if it's once every 5 frames, because that's precisely the sort of thing that becomes visible on a wide range of machines.

    I do use terrain, sadly. I'll try meshes while I learn so I see the correct results.

    @francescoc_unity - does it have a heavy cost (spike)? Can we make it a constant cost spread over frames?
     
  10. francescoc_unity

    francescoc_unity

    Unity Technologies

    Joined:
    Sep 19, 2018
    Posts:
    193
    Support for terrain has very recently landed, so it will soon be in a public release.

    As for the cost: if you are not blending, there is no spike except when a streaming event is triggered, if any (and even then it shouldn't be chugging; you just need to account for the extra streaming workload, and more improvements on that are coming soon).
    The baseline GPU cost is also fairly low.

    If blending, we added ways to blend only a customizable number of cells over time, so that you can time-slice how the blend happens. Hopefully we'll have good documentation on this soon, and I can expand in a bit more detail on Monday when I am at my work PC. The blending will of course have an extra cost (not super high; I will provide numbers when I have access to them), but you can make it "constant" so that it can easily be accounted for, instead of causing random spikes.
     
  11. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Thanks that would be really great. Happy to test what you have planned!
     
  12. francescoc_unity

    francescoc_unity

    Unity Technologies

    Joined:
    Sep 19, 2018
    Posts:
    193
    We don't have the captures on our PC anymore, but for a scene with several very dense cells (probably a bit more than 10 cells; we don't have precise values now and can get them later), we seem to remember around 0.4 ms for a blend on a base PS4. This is a very rough recollection, but it gives a ballpark before I get my hands on the data again.

    You can, however, set how many cells you want blended per frame via numberOfCellsBlendedPerFrame on the ProbeReferenceVolume. This should allow you to keep the perf in check.

    Unfortunately we don't have any nice UX exposed for this, as with the rest of the blending (which is a fairly experimental feature, for now confined only to APV data), so it needs to be driven by code.
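
    (A minimal sketch of what that code could look like, assuming the ProbeReferenceVolume singleton is reachable from game code; only numberOfCellsBlendedPerFrame is confirmed above, and the exact namespace has moved between experimental and released packages.)

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Rendering; // ProbeReferenceVolume lived under an experimental namespace in some versions

    // Clamp how much APV blending work happens per frame.
    public class ApvBlendBudget : MonoBehaviour
    {
        [Min(1)] public int cellsPerFrame = 2; // tune to your frame budget

        void OnEnable()
        {
            ProbeReferenceVolume.instance.numberOfCellsBlendedPerFrame = cellsPerFrame;
        }
    }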
     
    Deleted User and hippocoder like this.
  13. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Great, thank you.

    Regarding the APV volume, you said it is tied to the scene. Is there support for a simple offset for origin shifting? Perhaps for the Hybrid Renderer / Entities in the future (if you needed context).
     
    LumaPxxx and AydinDeveloper like this.
  14. Qleenie

    Qleenie

    Joined:
    Jan 27, 2019
    Posts:
    868
    Found an issue with APV on 2021.3.5 and earlier:
    - Planar reflection probes don't work anymore (probably a wrong normal calculation) if "Normalize Reflection Probes" is enabled. The screen goes all black if a Planar Reflection Probe is visible.
     
  15. francescoc_unity

    francescoc_unity

    Unity Technologies

    Joined:
    Sep 19, 2018
    Posts:
    193
    The problem is that the data itself is slotted into a global structure for your whole "baking set" (i.e. the independent set of scenes that will ever be loaded together); the scene holds the data just for the sake of triggering the loading/unloading properly.

    Hey! Thanks for the report, do you mind filing a bug for ease of tracking on our end? Thank you!
     
  16. newguy123

    newguy123

    Joined:
    Aug 22, 2018
    Posts:
    1,248
    If APV (which bakes very quickly) aids in a sort of almost-realtime GI, since the probes are all over the place, I can understand how it makes lighting better.

    At the same time though, can't the same system be used, in a way, to aid reflections, without the need to place reflection probes? Alternatively, can't a similar system be created for automatic reflections everywhere in the scene?
     
    NotaNaN, PutridEx and valarnur like this.
  17. Qleenie

    Qleenie

    Joined:
    Jan 27, 2019
    Posts:
    868
    If you are not able to reproduce it, I will, but I would need to create a new project with an APV setup, as the current project is way too big to submit.
     
  18. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    For large worlds that go further than 5-10K units, what do you recommend?
     
  19. valarnur

    valarnur

    Joined:
    Apr 7, 2019
    Posts:
    440
    I'm also interested in this. Maybe in Unity 2023? It would be extremely helpful.
     
  20. Qleenie

    Qleenie

    Joined:
    Jan 27, 2019
    Posts:
    868
    Reported as IN-8235 without a repro project, but with steps to reproduce.
     
  21. francescoc_unity

    francescoc_unity

    Unity Technologies

    Joined:
    Sep 19, 2018
    Posts:
    193
    Thank you! Being able to track it is good enough for us so we don't forget :) We will try to recreate a repro project on our end.

    Reflection probes are significantly more demanding to store and sample than light probes, so we can't really use the same system, unfortunately. On the plus side, with the option to normalize reflection probes with APV, you could get away with placing fewer reflection probes.

    For authoring, I assume not all of your 10k units will need the same lighting precision. For the sake of discussion, let's assume you have a terrain with villages throughout.
    I suggest having a volume spanning the world that overrides subdivision, so that by default you skip the finer resolution.
    Then in your areas of interest (e.g. the villages), add a volume that restores the level of detail to the highest quality, so that you can have more definition there.

    For iteration, bake only the loaded scenes (with the full world bake done less often).

    At runtime, rely on streaming to avoid issues with a large amount of memory being loaded. Current streaming still requires all the data assets for the loaded scenes to be in CPU memory, with streaming to the GPU from there. However, we are currently working on streaming directly from disk to GPU memory; that way you can keep the amount of memory used under control and let the streaming load the most relevant data. This is going to be especially important for platforms like consoles where memory is unified.


    Not sure I answered your question, but I am happy to answer specific questions! :)
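
    (A hedged sketch of that two-volume setup done from script; the ProbeVolume fields used here - size, overridesSubdivLevels, highestSubdivLevelOverride - are named from memory of the experimental HDRP component and may differ between versions. Placing the same volumes in the editor works just as well.)

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Rendering.HighDefinition; // experimental APV authoring component

    // One coarse world-spanning volume, plus a fine volume per area of interest.
    public static class ApvOpenWorldSetup
    {
        public static void Build(Vector3 worldSize, Bounds village)
        {
            // World volume: cap subdivision so open terrain skips the finest bricks.
            var world = new GameObject("APV World").AddComponent<ProbeVolume>();
            world.size = worldSize;
            world.overridesSubdivLevels = true;
            world.highestSubdivLevelOverride = 2; // coarse only; level semantics vary by version

            // Village volume: no override here, so it restores the full level of detail.
            var fine = new GameObject("APV Village").AddComponent<ProbeVolume>();
            fine.transform.position = village.center;
            fine.size = village.size;
        }
    }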
     
    one_one and hippocoder like this.
  22. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Many thanks! You are right, this is what I was looking for: a way to do the classic open-world scenario. The remaining question is how to deal with loss of precision far away from the origin.

    At 10k units things start to move strangely, so we shift the world every so often. I have not tried shifting with APV yet, but I assume it will be streamed in at the wrong position. Will test next week.
     
  23. jiraphatK

    jiraphatK

    Joined:
    Sep 29, 2018
    Posts:
    300
    How do I fix this blackness on the floor?
    I'm using the CPU lightmapper.
    Probe Volume settings are at the default values.
    Unity 2021.3.4
    HDRP 12.1.7
     

    Attached Files:

  24. francescoc_unity

    francescoc_unity

    Unity Technologies

    Joined:
    Sep 19, 2018
    Posts:
    193
    It looks to me like you are not using dilation and/or virtual offset?

    Can you check via the debug view (Window -> Analysis -> Rendering Debugger -> Probe Volume -> Display Probes and add shading mode to Validity) if those probes are indeed invalid?
     
  25. Qleenie

    Qleenie

    Joined:
    Jan 27, 2019
    Posts:
    868
    One comment/question on this: as I reported earlier in this thread, in 2021.2.x and 2021.3.x "Virtual Offset" is not usable at all if a project uses non-kinematic physics, so I am running into the same issues with blackness. What will be the first version where the physics issue is fixed?
     
  26. jiraphatK

    jiraphatK

    Joined:
    Sep 29, 2018
    Posts:
    300
    In the picture, I had both dilation distance and iteration count at 1.
    Changing both to 3 seems to fix the problem. Thanks.
    All these settings seem like black magic to me though; I don't understand what they do from the short tooltips shown in the editor.
    Questions:
    1. Is there an advantage to splitting the probe volume into multiple volumes instead of one big volume that covers everything?
    2. How do you deal with large and complex scenes that need a very dense probe volume (aside from getting better hardware)? I can't bake a volume with a min distance lower than 1 in a large scene because I run out of RAM or VRAM: 'Cannot allocate more brick chunks, probevolume brick pool is full.'
    3. Is it possible to rebake only the probes and not the entire lightmap?
    4. Does the light probe sample multiplier in the lightmap settings do anything for APV?
     
    Last edited: Jun 30, 2022
    hippocoder likes this.
  27. francescoc_unity

    francescoc_unity

    Unity Technologies

    Joined:
    Sep 19, 2018
    Posts:
    193
    I am not entirely sure, I will check soon, but the latest 22.2 version has it fixed for sure.

    EDIT: The first tag I can find is 2022.2.0a15.


    Yes, apologies for the lack of documentation. It will come soon, hopefully.

    Note that the iteration count shouldn't need to be raised to 3.

    1. No advantage beyond more control over the subdivision (you can override parameters per volume). Other than that, no change.

    2. For the issue you are encountering: enable streaming (in the HDRP asset) and also increase the size of the pool (the memory budget in the HDRP asset). More work on streaming is happening as we speak.
    Note that an issue with baking extremely large scenes can also come from the lightmapper; that issue was reported internally to the relevant team, but I don't have immediate visibility on it, sorry.

    3. Not yet, but work to make that happen is scheduled.

    4. It does affect the number of samples shot to bake the probes, yes.
     
    jiraphatK and Qleenie like this.
  28. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    Hi!
    Looks like some good work has been done.

    However, I struggle to evaluate the cost as a low-spec game dev. I mean, how does it work under the hood?

    Given that it's per-pixel and seems sparse except at surfaces, how on earth does the probe selection happen?

    - First I thought of a system like a sparse voxel octree, with texture bricks as leaves holding the SH, but the traversal seems a bit too expensive for most low-end targets if this is a replacement for the old-style light probes in Unity.
    - Then I thought it uses hierarchical volumes of various sizes: we collect the volumes on object overlap, then pass them to a light loop to sample the corresponding SH texture brick, which makes the sampling O(1).
    - Another idea was that it might work by having texture bricks like the tetrahedral SH, but interpolating the bricks instead of a single SH, which probably wouldn't work as well, so I guess not.

    Is there a paper or article about the underlying behaviour? That would help a lot in understanding how to deal with it.
     
  29. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    Hey! Never mind! I had to google for videos instead; site search only returns Unity forums on most pages! lol
     
    Reanimate_L likes this.
  30. Reanimate_L

    Reanimate_L

    Joined:
    Oct 10, 2009
    Posts:
    2,788
    The video showed that the scene also has some kind of global sky occlusion, which helps improve the indirect shadowing. We really need that kind of global sky occlusion, even if it's static.
     
  31. francescoc_unity

    francescoc_unity

    Unity Technologies

    Joined:
    Sep 19, 2018
    Posts:
    193
    A talk at SIGGRAPH in August will have a section on how it works. I will put the slides here when they are publicly available.

    Hopefully the documentation will also have more info.
     
  32. HIBIKI_entertainment

    HIBIKI_entertainment

    Joined:
    Dec 4, 2018
    Posts:
    595
    Where is the ♥️ love button!

    Really grateful to hear this.
     
  33. valarnur

    valarnur

    Joined:
    Apr 7, 2019
    Posts:
    440
    What sort of APV features and workflow will HDRP 15 get?
     
    Ruchir likes this.
  34. PutridEx

    PutridEx

    Joined:
    Feb 3, 2021
    Posts:
    1,136
    Been trying it out in 2022.2.0a18 and I'm surprised by the results! Better than the last time I tried it a while ago.
    One question though: where's the option to smooth noise when TAA is enabled?
    I'm referencing this change:
    • HDRP: Added: A new option to animate APV sample noise to smooth it out when TAA is enabled.
    It's part of the release notes for 2022.2.0a18, so it should be there, but I can't find it anywhere (APV window, probe volume script, camera, etc.).

    One issue I'm facing is noise. It gets worse the higher the bounces are, so you have to compensate with an even higher value for light probe samples; the higher the samples, the less noise there is. As you can imagine, it gets crazy pretty fast :D
     
    Last edited: Jul 16, 2022
  35. PutridEx

    PutridEx

    Joined:
    Feb 3, 2021
    Posts:
    1,136
    Some interior APV / lightmap comparisons:

    - mixed lights
    APV settings: 0.55m (very dense), 1024 indirect samples, 17 probe sample multiplier, 5 max bounces
    lightmap settings: 45 texels, 1024 indirect samples, 5 max bounces
    No directional/sky light

    Right click on each picture and click open image in new tab, then jump between tabs quickly to notice all the differences:

    APV:
    Unity_eXC0RPWRbX.png


    Lightmap:
    Unity_tTtfpOgxGG.png


    APV with SSGI active:
    Unity_7VQzmW0uOu.png

    It seems to struggle more with emissive lighting.
     
    Last edited: Jul 16, 2022
    florianBrn and valarnur like this.
  36. AydinDeveloper

    AydinDeveloper

    Joined:
    Sep 8, 2017
    Posts:
    92


    unknown-14.png
     
    PutridEx likes this.
  37. jiraphatK

    jiraphatK

    Joined:
    Sep 29, 2018
    Posts:
    300
    What's your SSGI setting?
     
  38. PaulMDev

    PaulMDev

    Joined:
    Feb 19, 2019
    Posts:
    72
    I can't find all of the options in 2022.2.0b1.
    Did they move somewhere else?

    upload_2022-7-18_17-54-30.png
     
  39. PutridEx

    PutridEx

    Joined:
    Feb 3, 2021
    Posts:
    1,136
    Default medium with reflection probe + sky fallback
     
    jiraphatK likes this.
  40. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Seems to me this is a bit of an abuse of the APV, setting it so dense as to replace local lighting on small details. But if that is possible, then colour me impressed.

    Could staff spell out the ideal use cases for the system a bit more clearly? One can then be impressed and work with the original design intent (where it no doubt shines).
     
  41. francescoc_unity

    francescoc_unity

    Unity Technologies

    Joined:
    Sep 19, 2018
    Posts:
    193
    No, it's still there, but under the additional settings (you need to enable them from the three-dots menu on the top left of the component foldout).

    Well, it is possible (Enemies doesn't have any lightmaps); however, it should be clear that it can be used as an alternative, but in some cases it won't work as well (surviving light leaking, for example), so we don't want to push it as a straight-up replacement.

    The official stance is that:

    - The main intent is to replace the old light probe system, i.e. to light dynamic objects.
    - We observed with internal content that using it for everything, including static objects and the environment, works a lot of the time; however, some scenarios are still tricky, and lightmaps might still be needed there. We are not aiming to solve all of the issues so that lightmaps become completely replaceable in all cases. As I said, success using just APV for indirect diffuse (+SSGI for small details) has definitely been achieved internally, but it is not a guarantee we can provide.


    Mostly, this is meant to be used for indirect lighting; whether it works with baked direct lighting really depends on the content :) It often does; it sometimes doesn't work all that well.
    In the advanced settings in the probe volume options, can you try decreasing the sampling noise value and maybe also playing a bit with the biases?

    That said, that kind of emissive light represents a bit of a challenging case :)
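
    (For the sampling-noise suggestion above, a hedged sketch of driving the override from code. "ProbeVolumesOptions", "samplingNoise" and "animateSamplingNoise" are named from memory of the experimental package - the animate toggle is the TAA option discussed earlier in the thread - and may differ per version.)

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Rendering; // ProbeVolumesOptions may live here or in the HDRP namespace

    // Lower the APV sampling noise and let TAA animate/average what remains.
    public static class ApvNoiseTweak
    {
        public static void Apply(Volume volume)
        {
            var profile = volume.sharedProfile;
            if (!profile.TryGet(out ProbeVolumesOptions options))
                options = profile.Add<ProbeVolumesOptions>();

            options.samplingNoise.overrideState = true;
            options.samplingNoise.value = 0.1f;        // lower = less visible noise

            options.animateSamplingNoise.overrideState = true;
            options.animateSamplingNoise.value = true; // lets TAA smooth the noise out
        }
    }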
     
    NotaNaN, PaulMDev and hippocoder like this.
  42. davidjfranco

    davidjfranco

    Joined:
    Oct 9, 2014
    Posts:
    23
    Hi there, is it possible to use APV to approximate lighting on a skinned mesh renderer? I'm not having any luck with that.
     
  43. francescoc_unity

    francescoc_unity

    Unity Technologies

    Joined:
    Sep 19, 2018
    Posts:
    193
    What do you mean exactly? Yes, skinned meshes can receive lighting from APV.
     
  44. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    What does your skinned mesh inspector look like?
     
  45. davidjfranco

    davidjfranco

    Joined:
    Oct 9, 2014
    Posts:
    23
    Hi there. Can anyone possibly test with Better Lit Shader? I'm trying to figure out why it doesn't work. Could someone from Unity perhaps test this to help support the feature? Thanks!
     
  46. davidjfranco

    davidjfranco

    Joined:
    Oct 9, 2014
    Posts:
    23
    To the above :) it was because the mesh was using Better Lit Shader, not HDRP Lit. I've been on to the developer, but he is unsure what would cause the issue (it works with standard probes). Perhaps you could point out the difference in what APV is doing and point him in the right direction to support APV.
     
  47. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    I'm wondering if APV would work as a simplification of the plenoptic function. In light-field rendering research it has been demonstrated that the 5D function (x, y, z, phi, theta) can be reduced to a 4D function (x, y, phi, theta). A typical light probe array is basically a 5D representation of a light field, which means it can be compressed from the volume down to (at least) the boundaries of the volume (assuming it's convex, for simplicity). APV seems to aggregate data at the boundaries of meshes rather than of volumes, so how is the empty space derived: through interpolation of hierarchical data, or by adapting the plenoptic function?
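
    (For reference, the standard light-field reduction being alluded to - a property of radiance in free space, not a statement about APV internals: radiance is constant along unoccluded rays, so one spatial dimension is redundant and rays can be indexed by a 4D parameterization such as two planes.)

    \[
    L(\mathbf{x},\,\theta,\,\phi) \;=\; L\big(\mathbf{x} + t\,\omega(\theta,\phi),\,\theta,\,\phi\big) \quad \forall t \;\;\text{(free space)}
    \qquad\Longrightarrow\qquad
    L \;=\; L(u, v, s, t)
    \]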
     
  48. davidjfranco

    davidjfranco

    Joined:
    Oct 9, 2014
    Posts:
    23
    Anyone? :p
     
    neoshaman likes this.
  49. francescoc_unity

    francescoc_unity

    Unity Technologies

    Joined:
    Sep 19, 2018
    Posts:
    193
    Better Lit Shader is not owned by us, nor do I personally have much visibility into what it is doing. I am open to discussing it with the asset author to help if they reach out.
     
  50. francescoc_unity

    francescoc_unity

    Unity Technologies

    Joined:
    Sep 19, 2018
    Posts:
    193