
Question Denoiser for HDRP pathtracing

Discussion in 'High Definition Render Pipeline' started by molestium, Jan 20, 2021.

  1. PutridEx

    PutridEx

    Joined:
    Feb 3, 2021
    Posts:
    1,136
    Scene view doesn't update by default; tick "Update" in one of those menus:

    4T4vdIRBrf.png

    the one with a "+": click the arrow next to it and enable the update/refresh mode (something along those lines).

    With the Game view, you need to hit Play for it to refresh.
     
    chap-unity likes this.
  2. rmuk

    rmuk

    Joined:
    Feb 18, 2015
    Posts:
    65
    Ah, thank you! I had assumed it was something basic. I remembered seeing something similar with animated shaders a few years ago.
     
  3. rmuk

    rmuk

    Joined:
    Feb 18, 2015
    Posts:
    65
    Question for those in this thread using the new denoiser/path tracing to do offline rendering. I assume most of us are using Timeline to handle camera movements (along with Recorder to get the images or video out). I'm running into an issue where Timeline continues to update the camera position in between "frames" as convergence is happening. Since the camera is moving, it creates a sort of motion blur, because the camera is at a slightly different position as it converges. Is there a way to get a timeline to just snap the camera to its new location at the start of each frame instead of interpolating?
     
  4. chap-unity

    chap-unity

    Unity Technologies

    Joined:
    Nov 4, 2019
    Posts:
    760
    This is more of a Timeline question than a path-tracing thing, but you should be able to do what you want by using curves in Timeline between your keyframes.


    upload_2022-8-12_10-27-45.png

    By default it's going to use an "S-curve" interpolation, but you can fiddle with the Bezier curve handles to make it a squared shape so it snaps between locations as you want.
     
    hippocoder likes this.
  5. rmuk

    rmuk

    Joined:
    Feb 18, 2015
    Posts:
    65
    Yeah, sorry, I realize this wasn't totally on topic, but I figured that no one else using Unity would need/want this besides people using the path tracer (or other accumulation-based offline rendering). I had thought about the square shape, but wouldn't I then have to have a key in between every single frame? I need it to snap to the new location at the start of each frame.
     
  6. chap-unity

    chap-unity

    Unity Technologies

    Joined:
    Nov 4, 2019
    Posts:
    760
    No, you can make a curve like this (the blue one) if you push the handle far enough. :)

    upload_2022-8-12_18-45-35.png
     
  7. rmuk

    rmuk

    Joined:
    Feb 18, 2015
    Posts:
    65
    Right, but I would need to do that for every single frame of the animation. A smooth sweeping camera move over a 10-second span, for example, would require me to set a key and the squared curve for each and every frame, since otherwise it tries to move the camera while the convergence is happening.
     
  8. chap-unity

    chap-unity

    Unity Technologies

    Joined:
    Nov 4, 2019
    Posts:
    760
    You can select all the concerned keys in your curves and set the tangents to constant all at once.
    That way, they won't interpolate and will just snap between positions.
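    If you'd rather do that from a script, here's a minimal editor-only sketch of the same idea (assuming you're operating on an AnimationClip; the class and method names are just placeholders):

    Code (CSharp):
    // Editor-only: put this in an Editor folder.
    using UnityEditor;
    using UnityEngine;

    public static class ConstantTangents
    {
        // Sets every key on every curve of the clip to constant tangents,
        // so values snap at each key instead of interpolating.
        public static void Apply(AnimationClip clip)
        {
            foreach (var binding in AnimationUtility.GetCurveBindings(clip))
            {
                AnimationCurve curve = AnimationUtility.GetEditorCurve(clip, binding);
                for (int i = 0; i < curve.length; i++)
                {
                    AnimationUtility.SetKeyLeftTangentMode(curve, i, AnimationUtility.TangentMode.Constant);
                    AnimationUtility.SetKeyRightTangentMode(curve, i, AnimationUtility.TangentMode.Constant);
                }
                AnimationUtility.SetEditorCurve(clip, binding, curve);
            }
        }
    }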
     
  9. rmuk

    rmuk

    Joined:
    Feb 18, 2015
    Posts:
    65
    Right, I understand that, but the use case here is a camera move. Let's say over 10 seconds (at 60 FPS) it needs to go from (0,0,0) to (10,0,0). The camera will be at a different position each frame. If I did what is in your screenshot, it would snap from (0,0,0) to (10,0,0). Using that method, I would need to create a keyframe at every in-between frame and set its tangents, which would be 600 different keyframes for just a 10-second period.
     
  10. stefanob

    stefanob

    Joined:
    Nov 26, 2012
    Posts:
    67
    I would like to get access to the denoise progress of the path tracing volume. I know when the path tracing is done but not the denoiser. I want to take a screenshot when it is complete. Is that possible somehow?
     
  11. eaglemo

    eaglemo

    Joined:
    Feb 1, 2023
    Posts:
    8
    Hi, how can I measure the denoise time in HDRP? Does anyone know how to do this?
     
  12. chap-unity

    chap-unity

    Unity Technologies

    Joined:
    Nov 4, 2019
    Posts:
    760
    Hey, sadly, currently there's no API to get the progress and the timing of pathtracer accumulation and denoising process.

    It's not the first time it has been requested, so it's tracked in our backlog, but we can't say when this will be taken care of.
     
  13. OlavAketun

    OlavAketun

    Joined:
    Aug 4, 2021
    Posts:
    44
    IIRC, the groundwork has already been done. I've been poking around in the HDRP and denoiser packages and found all sorts of stuff.
    The accumulation progress bar compute shader gets its info from somewhere (I don't remember where, but it's there). And somewhere in the denoiser package someone has already set up a few bools for the denoiser's state, e.g. "failed", "successful", etc.
     
  14. OlavAketun

    OlavAketun

    Joined:
    Aug 4, 2021
    Posts:
    44
    Scratch that, it's an enum not a bool.
    DenoiserBase-state.PNG
     
  15. eaglemo

    eaglemo

    Joined:
    Feb 1, 2023
    Posts:
    8
    Did you figure out the whole process? How do you get the resulting texture?
     
  16. eaglemo

    eaglemo

    Joined:
    Feb 1, 2023
    Posts:
    8
    upload_2023-3-1_11-2-20.png
    What is this code doing? Executing the RenderDenoisePass logic?
     
  17. cecarlsen

    cecarlsen

    Joined:
    Jun 30, 2006
    Posts:
    858
    Any ETA on this? Path tracing is only useful using Recorder right now, which is Editor only. It would be wonderful to do path tracing in builds as well.
     
    Last edited: May 2, 2023
    newguy123 likes this.
  18. cecarlsen

    cecarlsen

    Joined:
    Jun 30, 2006
    Posts:
    858
  19. chap-unity

    chap-unity

    Unity Technologies

    Joined:
    Nov 4, 2019
    Posts:
    760
    No ETA, but it's still tracked on our side, just not a big priority, as you might expect. I'll poke the right people again and keep you posted when that moves. It's not that complicated, it's just a matter of priority. Thanks for your patience.
     
  20. cecarlsen

    cecarlsen

    Joined:
    Jun 30, 2006
    Posts:
    858
    I do understand there is plenty on the to-do list, and that it doesn't get easier with the layoffs, but I still don't understand this specific prioritization – that is, if rendering path-traced video in Unity is supposed to be a thing at all. My machine (RTX 3070) crashes after rendering about six seconds with denoising enabled, and it seems Keijiro experienced the same six months ago. I imagine a lot of resources have already been poured into implementing the feature, so if there is a fix that is "not that complicated" and takes path tracing from useless to useful, then I'd put that at the top of the list.

    Thanks @chap-unity, it's much appreciated. I know everyone is doing their best. It's just very frustrating.
     
    Last edited: May 9, 2023
    chap-unity and newguy123 like this.
  21. rmuk

    rmuk

    Joined:
    Feb 18, 2015
    Posts:
    65
    I reported this back in October; it was accepted and has been listed as "in progress" for quite some time. I'm still seeing the issue on 2023.1.

    https://issuetracker.unity3d.com/is...he-play-mode-when-using-path-tracing-denoiser
     
  22. chap-unity

    chap-unity

    Unity Technologies

    Joined:
    Nov 4, 2019
    Posts:
    760
    Hey, I've asked around, and apparently this issue has been fixed in the 1.0.3 update of the denoiser package.
    For now, this update is only available in the 2023.2 alphas, but backports to 2023.1 and 2022 LTS are currently being evaluated!
     
    cecarlsen and newguy123 like this.
  23. rmuk

    rmuk

    Joined:
    Feb 18, 2015
    Posts:
    65
    Wonderful! Glad to hear it! Thanks for checking.

    Do you know the status of VFX Graph and the path tracer? Even with the new ray-tracing options in VFX Graph, I can't get it to show up when I enable path tracing.

    EDIT: I tried my project in the 2023.2 alpha and I'm only seeing the v1.0.0 denoising package. Is there a way to override that with 1.0.3? I tried enabling pre-release packages, but that didn't do anything.
     
    Last edited: Jun 21, 2023
  24. chap-unity

    chap-unity

    Unity Technologies

    Joined:
    Nov 4, 2019
    Posts:
    760
    It should have been updated automatically with the right Unity engine version (2023.2.0a21 on my side, but maybe that's not public yet; if not, it should be soon).

    As for VFX Graph, the current status is that it's supported in ray-traced effects (reflections, AO, GI, etc.) but not by the path tracer (so it's expected that the effects don't appear).
    Can't promise anything, so let's say it's under consideration for the future :fingers_crossed:
     
  25. rmuk

    rmuk

    Joined:
    Feb 18, 2015
    Posts:
    65
    OK, thanks for clarifying that! 2023.2.0a19 is the latest one that is publicly available, and DX12 is completely broken in it, so I'm anxiously awaiting a20's release, which supposedly has a fix for that.

    EDIT: a20 does fix the DX12 issue, but the 1.0.3 version of the denoiser is still not present. The memory bug still occurs.
     
    Last edited: Jun 29, 2023
    chap-unity likes this.
  26. cecarlsen

    cecarlsen

    Joined:
    Jun 30, 2006
    Posts:
    858
    I have just tested Unity Denoising 1.0.3 using 2023.2.0a22 and the GPU memory leak remains for both the Intel and NVIDIA options =(

    EDIT: And unfortunately the same is true for 2023.2.0b1

    It really puzzles me why this is not higher priority (the denoiser is using AI, right? ;)). According to the Denoiser changelog, the only thing that changed in 1.0.3 was "Removed Radeon denoiser backend.". Are we even sure the bug was worked on?
     
    Last edited: Jul 21, 2023
  27. rmuk

    rmuk

    Joined:
    Feb 18, 2015
    Posts:
    65
    Odd, I just tried it in the new alpha and beta (with denoiser 1.0.3) and the issue seems to be resolved for me. HOWEVER, path tracing no longer works in the Scene view! I can only get it to show up in the Game view. I looked for a toggle or a setting and couldn't find one anywhere.

    @cecarlsen This was 1.0.3; however, the version we had access to prior to this was 1.0.0. So possibly one of the internal intermediate versions fixed the issue?
     
    Last edited: Jul 21, 2023
  28. cecarlsen

    cecarlsen

    Joined:
    Jun 30, 2006
    Posts:
    858
    You are right. The leak is fixed in 1.0.3. I was using my own recorder instead of the Unity Recorder, and it was calling the multi-frame rendering API incorrectly.

    Now I am back to the original issue: that there is "no API to get the progress and the timing of pathtracer accumulation and denoising process". I wonder how the Unity Recorder does it. Unfortunately I can't use it, because I need this to work in builds.
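    For reference, the shape of HDRP's multi-frame accumulation API (BeginRecording / PrepareNewSubFrame / EndRecording, per the "Multiframe rendering and accumulation" manual page) is roughly the sketch below; exact parameter names and defaults may differ between HDRP versions, so treat it as an assumption rather than a recipe:

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Rendering;
    using UnityEngine.Rendering.HighDefinition;

    // Rough sketch of driving HDRP accumulation manually. Field values are placeholders.
    public class AccumulationDriver : MonoBehaviour
    {
        [SerializeField] int samplesPerFrame = 128;   // sub-frames accumulated per output frame
        [SerializeField] float shutterInterval = 1f;  // fraction of the frame the shutter stays open

        HDRenderPipeline Hdrp => RenderPipelineManager.currentPipeline as HDRenderPipeline;

        void OnEnable()
        {
            RenderPipelineManager.beginFrameRendering += OnBeginFrameRendering;
            // Start multi-frame recording; the last two arguments shape the shutter profile.
            Hdrp?.BeginRecording(samplesPerFrame, shutterInterval, 0.25f, 0.75f);
        }

        void OnDisable()
        {
            RenderPipelineManager.beginFrameRendering -= OnBeginFrameRendering;
            Hdrp?.EndRecording();
        }

        void OnBeginFrameRendering(ScriptableRenderContext context, Camera[] cameras)
        {
            // Advance the accumulation by one sub-frame before each render.
            Hdrp?.PrepareNewSubFrame();
        }
    }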
     
    newguy123 likes this.
  29. cecarlsen

    cecarlsen

    Joined:
    Jun 30, 2006
    Posts:
    858
    OK, I am wary of firing the confetti cannons just yet... but this does seem to work (only tested with the NVIDIA denoiser):

    Code (CSharp):
    void PrepareSubFrameCallBack( ScriptableRenderContext cntx, Camera[] cams ){
        _subFramesRendered++;
        if( _subFramesRendered % subFrameCount == 0 ){
            StartCoroutine( WaitForDenoiseAndSave() );
        }
        _hdrp.PrepareNewSubFrame();
    }

    IEnumerator WaitForDenoiseAndSave(){
        yield return new WaitForPresentSyncPoint(); // COMPLETELY undocumented =(
        // Save frame here.
    }
    EDIT: It continues to work so far.
     
    Last edited: Jul 28, 2023
    newguy123 likes this.
  30. rmuk

    rmuk

    Joined:
    Feb 18, 2015
    Posts:
    65
    Man, I've just given up on using Unity for 3D rendering work. I wrapped up the project I was working on with the new denoiser update, but I'm so burnt out by the issues I encountered that I think I'm just done trying to make this work in Unity. Here's a brief summary of the things I ran into, off the top of my head:

    1) Timeline scrubbing inconsistencies. Sometimes timeline elements just didn't reset themselves when scrubbing a timeline that had multiple sub-timelines on it.
    2) Sometimes the recorder would just skip frames. In my scene I have multiple shots, each with its own timeline, and each of those timelines had a recorder track on it. Since there's no real way to set up a render queue, I had to create one parent timeline and bring several of the individual scene timelines onto it. Relatively straightforward, but sometimes the renderer would just skip to every ~10th frame. For example, one time I had 3 segments I wanted to render overnight (ETA was 12 hours total), each with 1000-2000 frames. The MIDDLE segment only rendered about 120 frames: it would basically render one frame, skip 10, render another one, etc. The first and third segments on that same timeline rendered fine, and they all had the same frame rates, render settings, etc. After noticing this and cursing a bit, I disabled the other two segments and rendered just the second one again with NO changes to its settings, and it rendered fine. This brings me to my next issue...
    3) Inconsistency. Most of the things I'm mentioning here were completely inconsistent. You press "Play" to start a render and it's a random chance whether one, none, or several of these issues will pop up. If there were at least some consistency, I could try to work around these problems. Sadly that wasn't the case.
    4) The denoiser sometimes would just not work. One particular render had been going for over an hour when I opened the folder containing the JPG frames to take a look and noticed the denoiser hadn't been applied to any of the frames: another wasted hour. This was a pretty common one, happening every third or fourth render, it felt like, and it applied to both OptiX and OpenImageDenoise. Each time I hit render I would have to go and confirm that the first frame was in fact denoised. This also made queueing up multiple renders overnight a fun adventure.
    5) Material properties on the timeline sometimes decided not to work. This is an old one and not really related to the denoiser specifically, but it's part of this whole workflow. I needed to update a color property on a material via the timeline; this would sometimes break and sometimes work, and of course it was in the middle of a shot, so I would often have to go back and try to re-render just the few specific frames where it broke. Which reminds me...
    6) How is there not a way to set a filename variable equal to the frame number of the current timeline? Each time a timeline recorder starts, the "frame" variable begins at zero instead of at the frame number of the parent timeline, which made going back and re-rendering a smaller subset of a sequence a complete pain. Speaking of...
    7) When I did have to move a recorder so that it only rendered out a portion of a timeline, the playhead would often skip 3-6 frames INTO the recorder segment before it actually recorded anything. So if I set the recorder track on my timeline to run from frames 1440 to 1490, it would actually start recording from ~1445. Again, this was one of those fun "doesn't do the same thing each time" issues, so I got into the habit of adding 10-15 frames of extra padding when I needed to do that. However, you know where you can't do that? At the start of a timeline. A lot of my shots ended up being a few frames off from where they were supposed to be, which made syncing up the audio later a bit of a pain. (There was a speaking character/person in the shot, so it had to match very closely.)
    8) Unity's render times seem to skyrocket if focus is lost. One particular overnight render ended up taking almost 50% longer because the main Unity window lost focus at some point. I had expected it to be done by the time I woke up in the morning, and it was only halfway through.
    9) Crashes... yeah, not sure what to say about this one. That's just part of rendering, whether with Unity or any other application. Yes, I realize this isn't something I should complain about while using a beta build, but without that new build you can't use the denoiser due to the memory leak, which basically makes the path tracer useless unless you want to render still images or only do about 50 frames at a time before resetting Play mode.

    All in all, this whole week has been nothing but frustration, and I can't see myself attempting this again anytime soon. Just for comparison's sake, I fired up that other game engine that starts with a U to see how it compared, and from what I can tell so far its tools appear to be much more mature and suited to this task. I haven't really done any work in it before and would really rather keep working in Unity, but trying to get this to work has just burnt me out.
     
    cecarlsen likes this.
  31. newguy123

    newguy123

    Joined:
    Aug 22, 2018
    Posts:
    1,248
    Yeah, I'm a Unity fan too, but if your main aim is rendering work, you'll get better results faster and with fewer headaches by just using Unreal. Sad but true.
     
  32. Yanus3D

    Yanus3D

    Joined:
    May 6, 2019
    Posts:
    490
    rmuk
    Instead of using path tracing, use DXR.
    Forget PT as the pipeline for rendering. Yes, it's great, but it is still path tracing: good for one or two images. Render time is bad even on my 4090 (2-5 minutes for a single image in QHD or 4K is not an option in my pipeline, and some Unity features are also not supported in PT).

    Here is my pipeline with Unity:
    - I use LightWave 3D for modeling (for 25 years now)
    - I use Rhino for modeling (more complex objects that require T-splines, technical drawings, CAD features, etc.)
    - I use a special LightWave-Unity bridge plugin which offers a seamless connection
    - only DXR, and ONLY the H-Trace engine (soon HT2!). Yes, I have to make a billion image renders per hour and a lot of panoramas
    - only Shader Graph (my custom surfaces); forget the basic Lit shader
    - special custom scripts for animation/panoramas
    - the only advantage of Unreal is Local Exposure (it's bad that Unity does not have it); otherwise the engine is really bad compared to Unity, and that's not even comparing it to H-Trace... (DXR)
    - if you go to Unreal you will simply take a step back, especially with architectural offline rendering. Yes, on the web you will see a lot of great arch renders from Unreal, but 99.9% of those are baked setups with DXR reflections or without DXR at all, which is totally useless for a professional job. Unreal is great, but the realism of its renders is closer to games than to renderings (DXR).
    Lumion is overrated.

    Here is the hardware required for offline rendering in DXR (I use only DXR):
    you need a 4090, DDR5, and the newest 6 GHz Intel CPU... otherwise, forget DXR.
    If you do not have that, you simply have to forget DXR.
     
  33. cecarlsen

    cecarlsen

    Joined:
    Jun 30, 2006
    Posts:
    858
    Sounds like an awful week, sorry to hear it. I've experienced the same issues with Unity Recorder + Timeline + path tracing... and I've also spent days trying to figure out why some frames were denoised while others were not. Draining indeed. It really shouldn't be necessary to write one's own recorder, like I ended up doing.
     
    Last edited: Jul 28, 2023
  34. rmuk

    rmuk

    Joined:
    Feb 18, 2015
    Posts:
    65
    I appreciate the detailed post, Yanus. Just a few thoughts on it:

    1) I'm not doing archviz, and a lot of the scenes I'm rendering are quite dynamic, so baking isn't really an option for much of it.
    2) The per-frame render times I was getting at 1440p were around 10-12 seconds with PT, not 2-5 minutes. This was achieved after running a lot of tests with different accumulation settings to find a good mix of quality and speed.
    3) A lot of the lighting in my scenes comes from emissive surfaces, not from placed light sources, and there is no sun/global light. In my experience DXR did not handle this well at all; same with the reflections.
    4) You mentioned custom scripts for animation; does that mean you don't use Timeline at all? We do a lot of animation in our scenes, and not being able to lay that out with Timeline is a non-starter for us.
    5) I use VFX Graph in my scenes as well. VFX Graph isn't supported by PT at all right now, so I did my renders in two passes and composited them using Unity's built-in compositor. I was able to get by with this, but occlusion and depth are disregarded by going that route, so I had to be careful with my shots. Unreal's VFX system does work in PT. A few others and I submitted this as a feature request to Unity in a different forum post, but it looks like it's not on the roadmap and isn't a high priority.
    6) Thanks for mentioning H-Trace; it does look quite interesting. However, I'm really not keen on using a third-party tool for something as important as GI. I've been working in Unity for quite a long time and have gotten burned by various assets that lacked support/updates over time. That's not to say I don't use any, but I'm wary of relying on one for something as critical as GI.
    7) After spending a bit more time exploring Unreal, I would definitely argue that it has a lot more going for it than Local Exposure. The ability to queue renders and to set different presets (like a quick preset for test renders, a "final" preset with higher accumulation for final renders, etc.) is pretty huge. It also gives proper estimates of render times while rendering, which is something Unity lacks.
    8) I'm not testing with Lumen, just comparing the PT offerings of the two engines. Render times and visual quality seem similar between the two at the moment.

    I'm not completely sold on jumping ship; I'm just trying to find a workflow that fits my situation. Unity's collection of tools for this seems like a great match, but as I started using them I just hit so many roadblocks.
     
  35. Yanus3D

    Yanus3D

    Joined:
    May 6, 2019
    Posts:
    490
    rmuk
    1. Understood.

    2. I sometimes need 1-20 seconds in DXR for a QHD image (on a 4090 GPU). Somebody will tell me: that's a nightmare! But in arch I do not need performance. My scenes are horribly complex, with huge amounts of physical lights and ultra-heavy objects. Some scenes need 20-24 GB of VRAM to render in DXR...
    In those cases, path tracing gets even worse, with huge noise (no way to use it).
    Compared to CPU rendering, DXR is heavenly fast: 10,000, 30,000, even 50,000 times faster.

    4. Right now you cannot really do animation in Unity with path tracing because it has bugs, is horribly slow, and you must use the denoiser, which is bad for animation.
    DXR is much better, but due to bugs in Unity it is still buggy and badly coded (the offline mode is useless).
    For that you need a special script. Same for panoramas.

    5. I do not use this.

    6. This is the main course in Unity: H-Trace. The developer is very active and very good. The upcoming HT2 will be a game changer in architecture rendering. Unity's own DXR is so bad that even comparing it to H-Trace is meaningless.

    7. Local Exposure would be the best thing that could happen to Unity... Still waiting (maybe it will appear as a third-party asset?). I know it is planned for Unity, but that will happen over the next 1-2 years.

    8. PT is exactly the same everywhere. Even in our own render engine, KrayTracing for LightWave, it is the same. Of course, that is a CPU engine, but the result is the same. This is a brute-force system; IT MUST BE THE SAME EVERYWHERE. The differences are in speed: GPU is much faster at PT than CPU.

    PS
    I have used Unity forever because I am a long-time user of LightWave 3D (over 30 years) and we have a perfect LW-Unity pipeline through the LWImporter plugin (a perfect two-way bridge between them).
    Also, I love Unity DXR with the H-Trace plugin; it has amazing quality. As a developer of a CPU render engine for LightWave, I appreciate the quality of H-Trace. But sadly I have to tell you that stock Unity DXR is very bad. It is good compared to Unreal, but still very bad compared to H-Trace.
     
  36. OlavAketun

    OlavAketun

    Joined:
    Aug 4, 2021
    Posts:
    44
    Does the denoiser take HDR images as well? For OIDN, I believe you have to tell it you're giving it HDR input.
    Looking at the OIDN documentation, it has a lot of inputs that I'm not sure we're actually setting before we denoise.
    https://www.openimagedenoise.org/documentation.html
     
  37. OlavAketun

    OlavAketun

    Joined:
    Aug 4, 2021
    Posts:
    44
    On another note, it seems whoever set up the parameters for the denoiser didn't actually look into the documentation properly. I'm not pointing fingers... mostly because I don't know who did this. I'm specifically talking about Intel's denoiser here, OIDN.
    After some quick testing with different parameters, it seems we're not actually setting anything more than just the input color, albedo, and normal.
    In fact, we're not even giving the denoiser the input it would need to denoise properly.

    The path tracer renders all of its AOVs as first hit, meaning anything beyond the first hit is ignored. Glass is treated as a solid surface, and anything behind it or reflected in it will be smudged beyond repair.
    There's also a handy "yes, I'm giving you clean AOVs" flag that isn't set, so it defaults to off. The cleanAux parameter is super handy: it means you can have high-frequency detail in your albedo and normal without the denoiser smudging it unnecessarily.

    Here are a few examples to show the difference made by giving OIDN the "correct" AOVs and setting that one magical parameter.
    I'm using the images provided in the OIDN documentation as examples.
    First up is just giving the denoiser first-hit albedo and normal, with cleanAux off.
    denoised-firsthit cleanaux0.jpg
    You can see it's a bit smudgy, but considering it's only 4 spp you can forgive it a little... except we know it can look better.

    The second image is with cleanAux turned on.
    denoised-firsthit.jpg
    It cleans up the image even better now, and the glaring problems from the first image are gone. It's still not the best it can be, however, since we're still presenting glass to the denoiser as a solid surface.

    The third image is with cleanAux turned off again just for comparison, but here the albedo and normal actually continue bouncing after the first hit on transparent surfaces, i.e. glossiness is taken into account. This means they are actually semi-transparent and semi-reflective.
    denoised-nondelta cleanaux0.jpg
    Some of the smudginess is back since cleanAux is off, but at least we get some detail in the interior of the car.
    Again, we can do better simply by giving the denoiser the best AOVs possible and by setting the correct parameters.

    The last one is with albedo and normal including glossiness, and cleanAux turned on.
    denoised-nondelta.jpg
    In terms of clarity and how clean this result is compared to the first image, I think it's a no-brainer to do the denoiser justice.

    I always thought the denoiser was a bit "meh" until I realized we're not actually using it to its fullest.
    It's sad, considering it doesn't seem like a huge task to set it up correctly (apart from the AOVs).
    Getting a better denoised image than we have now would be as easy as passing the correct parameter to the denoiser.
    Going all the way would involve rewriting how albedo and normals are calculated for the AOVs. Currently it's simplistic: it just takes "surfaceData.color" and puts that into "aovData.albedo", which is in turn put into "payload.aovAlbedo". That means it's only really taking the unlit color of a surface, and as a result we're denoising at the level of quality of the first image I've provided here.

    TL;DR:
    We can get better results from the Intel denoiser by reading the documentation properly and by providing it with better AOVs.

    Sorry for the long post.
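    For what it's worth, the standalone Unity Denoising package does let you pass the albedo and normal guide images yourself, even though cleanAux isn't exposed. Here's a rough sketch along the lines of the package sample quoted later in this thread; the "albedo"/"normal" request names and the namespace are my assumption from that sample and the package docs:

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Rendering;
    using UnityEngine.Rendering.Denoising; // com.unity.rendering.denoising (namespace may differ by version)

    public static class GuidedDenoise
    {
        // Assumes colorImage / albedoImage / normalImage are RenderTextures you already filled,
        // and that cmd / renderContext come from wherever you drive rendering.
        public static void Run(CommandBuffer cmd, ScriptableRenderContext renderContext,
                               RenderTexture colorImage, RenderTexture albedoImage,
                               RenderTexture normalImage, RenderTexture destination)
        {
            var denoiser = new CommandBufferDenoiser();
            denoiser.Init(DenoiserType.OpenImageDenoise, colorImage.width, colorImage.height);

            // Noisy beauty plus the two guide AOVs; better guides generally mean less smudging.
            denoiser.DenoiseRequest(cmd, "color", colorImage);
            denoiser.DenoiseRequest(cmd, "albedo", albedoImage);
            denoiser.DenoiseRequest(cmd, "normal", normalImage);

            denoiser.WaitForCompletion(renderContext, cmd);
            denoiser.GetResults(cmd, destination);
        }
    }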
     
    nuFF3 and newguy123 like this.
  38. chap-unity

    chap-unity

    Unity Technologies

    Joined:
    Nov 4, 2019
    Posts:
    760
    Hey, thanks for the feedback. :)
    The long version is that on our side the AOVs are not always "clean" (rendering with DoF is one example), so they can't always be used as-is. We could also denoise them, and that shouldn't be too complicated, but it has a high cost.
    This is why it has been done that way.

    However, it has been raised internally, and we will re-evaluate whether it makes sense to expose it as an option for users, since it does make sense in some scenarios.

    Have a nice day :)
     
  39. OlavAketun

    OlavAketun

    Joined:
    Aug 4, 2021
    Posts:
    44
    Obviously if the AOVs are noisy then they need to be cleaned, but that's why it's an exposed option in OIDN. I'm not sure about OptiX; I haven't looked into the documentation for that.
    Not carrying it along when integrating it into the package is a bit of a misplay if you ask me, as the denoised results look much better with it turned on.
    And now that you mention DoF as an example, I'm going to counter that with a screenshot of the albedo AOV with spp set to 35 (and it hasn't even finished rendering yet). DoF is set to the extreme: 200mm x 200mm sensor, f/0.7, focus distance 0.7m.
    upload_2023-10-31_16-48-27.png
    You can be the judge of whether that's clean enough when it's not even done rendering at 35 spp; now think how it would look at 256 spp.

    I'd almost say the AOVs can safely be assumed to be clean when rendering above 8 spp, but yes, having an option would be the best approach.
     
    chap-unity likes this.
  40. EugenioA

    EugenioA

    Joined:
    Jul 9, 2020
    Posts:
    9
    Hey, can I ask how you take panorama images with DXR? I use cam.RenderToCubemap to render 360 pictures, and that doesn't work well with screen-space effects and accumulation. Do you render each cubemap face and then combine them?
     
  41. OlavAketun

    OlavAketun

    Joined:
    Aug 4, 2021
    Posts:
    44
    I would recommend rendering each cube face and combining them at the end.
    It's more work, but it's not broken like the built-in RenderToCubemap when rendering with path tracing.
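    A bare-bones sketch of that per-face approach; per-face accumulation/denoising and the exact cubemap face orientation convention (you may need to flip or reorder faces) are left out, so treat it as a starting point only:

    Code (CSharp):
    using UnityEngine;

    public static class CubeFaceCapture
    {
        // Six 90-degree views copied into a cube RenderTexture (dimension must be Cube, size = faceSize).
        static readonly Vector3[] FaceDirections =
        {
            Vector3.right, Vector3.left, Vector3.up, Vector3.down, Vector3.forward, Vector3.back
        };

        public static void Capture(Camera cam, RenderTexture cubeTarget, int faceSize)
        {
            var faceRT = new RenderTexture(faceSize, faceSize, 24, RenderTextureFormat.ARGBHalf);
            faceRT.Create();

            cam.fieldOfView = 90f;
            cam.aspect = 1f;
            cam.targetTexture = faceRT;

            for (int face = 0; face < 6; face++)
            {
                cam.transform.rotation = Quaternion.LookRotation(FaceDirections[face]);
                cam.Render(); // with path tracing, wait for convergence before copying the face
                Graphics.CopyTexture(faceRT, 0, 0, cubeTarget, face, 0);
            }

            cam.targetTexture = null;
            faceRT.Release();
        }
    }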
     
  42. EugenioA

    EugenioA

    Joined:
    Jul 9, 2020
    Posts:
    9
    I will keep this thread alive instead of creating a new one!
    I have a question about how to use the denoiser on a render texture. I found an example in the Unity Denoiser documentation:
    Code (CSharp):
    // Create a new denoiser object
    var denoiser = new CommandBufferDenoiser();

    // Initialize the denoising state
    Denoiser.State result = denoiser.Init(DenoiserType.OpenImageDenoise, width, height);
    Assert.AreEqual(Denoiser.State.Success, result);

    // Create a new denoise request for a color image stored in a Render Texture
    denoiser.DenoiseRequest(cmd, "color", colorImage);

    // Wait until the denoising request is done executing
    result = denoiser.WaitForCompletion(renderContext, cmd);
    Assert.AreEqual(Denoiser.State.Success, result);

    // Get the results
    var dst = new RenderTexture(colorImage.descriptor);
    result = denoiser.GetResults(cmd, dst);
    Assert.AreEqual(Denoiser.State.Success, result);
    So my question is: what exactly is cmd in denoiser.DenoiseRequest(cmd, "color", colorImage)? I think it's a CommandBuffer; if so, where do I get it (I know nothing about command buffers, as you might have guessed)? Do I just create a new one like
    CommandBuffer cmd = new CommandBuffer();
    and pass it straight to the function, or should I do something with it before passing it, or maybe get it from somewhere?
    Same question about the renderContext.
    Thanks in advance for any help.
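    For reference, one untested way to wire this up is to create the CommandBuffer yourself and take the ScriptableRenderContext from a render pipeline callback such as RenderPipelineManager.endContextRendering; everything here beyond the documented denoiser calls above is an assumption, and WaitForCompletion is blocking, so expect a hitch on the frame where you denoise:

    Code (CSharp):
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.Rendering;
    using UnityEngine.Rendering.Denoising; // com.unity.rendering.denoising (namespace may differ by version)

    public class DenoiseOnDemand : MonoBehaviour
    {
        public RenderTexture colorImage;   // noisy input
        public RenderTexture destination;  // denoised output
        bool m_Requested;

        public void RequestDenoise() => m_Requested = true;

        void OnEnable()  => RenderPipelineManager.endContextRendering += OnEndContextRendering;
        void OnDisable() => RenderPipelineManager.endContextRendering -= OnEndContextRendering;

        void OnEndContextRendering(ScriptableRenderContext context, List<Camera> cameras)
        {
            if (!m_Requested) return;
            m_Requested = false;

            // A plain CommandBuffer is enough; the denoiser records its work into it.
            var cmd = new CommandBuffer { name = "Denoise" };
            var denoiser = new CommandBufferDenoiser();
            denoiser.Init(DenoiserType.OpenImageDenoise, colorImage.width, colorImage.height);
            denoiser.DenoiseRequest(cmd, "color", colorImage);
            denoiser.WaitForCompletion(context, cmd);
            denoiser.GetResults(cmd, destination);

            // Execute the recorded commands through the context we were handed.
            context.ExecuteCommandBuffer(cmd);
            context.Submit();
            cmd.Release();
        }
    }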
     
    Last edited: Nov 29, 2023