Discussion in 'Universal Render Pipeline' started by Chman, Dec 17, 2019.
Also, I use Unity 2019.3.0f3, do you need to be on 2019.3.0f6?
Hi! I got it to work: you need to change Post Processing under the URP pipeline asset to Post Processing V2 (see image). The majority of the SC Post Effects from Staggart Creations work. However, you can't really have the integrated URP post-processing and PPv2 on at the same time. This is a big problem for us, as we already have all our other post-processing set up in URP, so it's rather pointless not to be able to have both on at once. The whole point of adding PPv2 was to get custom post-processing integrated with URP, but this needs to run in parallel with the integrated URP post-processing; otherwise everybody needs to redo all their existing post-processing in PPv2, which contradicts the long-term strategy of moving to URP PP. @phil_lira @ElvarOrnUnnthorsson could you add an option to run both the integrated URP post-processing and PPv2 at the same time, and include it in the 7.2.0 package, please? Thanks
Is there someplace I can follow the development of this? So I can see the status of Ambient occlusion and other features?
That's exactly the point I was trying to make originally. The correct solution for boilerplate is a better API, rather than helping people with the boilerplate. I know to some extent your hands are tied by the language itself (C# isn't especially amenable to writing the kind of mini-DSLs that produce truly pleasant APIs), but it's still something to aspire to. The need for code templates, code generation, etc. should be a cause for hand-wringing and introspection rather than celebration. ;-)
Hi. Sorry, this is by design. You choose the PP stack you want to run. The major reason is that supporting both running together would cause a noticeable performance cost (an extra update + final blit pass). Also, tonemapping and color grading would not work properly with two stacks.
If the reason is that you want custom post-processing in the new stack, you can do that now by injecting a custom render pass. When we say we are supporting custom post-processing in the new stack, we mean we will make it easier for you by providing a custom post-processing template.
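For readers wondering what "injecting a custom render pass" looks like in practice, here is a minimal sketch of a renderer feature that blits the camera color through a full-screen material, as was possible in URP 7.x. The names here (CustomPostFeature, _TempPostTexture) are made up for illustration; this is not an official template.

```csharp
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

// Illustrative only: add this feature to the Forward Renderer asset's
// Renderer Features list and assign a full-screen effect material.
public class CustomPostFeature : ScriptableRendererFeature
{
    class CustomPostPass : ScriptableRenderPass
    {
        readonly Material material;
        RenderTargetIdentifier source;
        RenderTargetHandle temp;

        public CustomPostPass(Material material)
        {
            this.material = material;
            renderPassEvent = RenderPassEvent.AfterRenderingTransparents;
            temp.Init("_TempPostTexture");
        }

        public void Setup(RenderTargetIdentifier cameraColorTarget) => source = cameraColorTarget;

        public override void Execute(ScriptableRenderContext context, ref RenderingData renderingData)
        {
            var cmd = CommandBufferPool.Get("CustomPost");
            var desc = renderingData.cameraData.cameraTargetDescriptor;
            desc.depthBufferBits = 0; // color-only temporary target
            cmd.GetTemporaryRT(temp.id, desc);

            // Apply the effect, then copy the result back to the camera color target.
            Blit(cmd, source, temp.Identifier(), material);
            Blit(cmd, temp.Identifier(), source);

            cmd.ReleaseTemporaryRT(temp.id);
            context.ExecuteCommandBuffer(cmd);
            CommandBufferPool.Release(cmd);
        }
    }

    public Material material;
    CustomPostPass pass;

    public override void Create() => pass = new CustomPostPass(material);

    public override void AddRenderPasses(ScriptableRenderer renderer, ref RenderingData renderingData)
    {
        if (material == null) return;
        pass.Setup(renderer.cameraColorTarget);
        renderer.EnqueuePass(pass);
    }
}
```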
Gawd... THIS. Please. If nothing else, THIS.
OK, I understand. When can we expect official support for URP custom post-processing for Asset Store developers (e.g. SC Post Effects from Staggart Creations and LWRP Volumetric Lighting from Bad Fat Dog)? They are hesitant to port their packs until official support is there. Please indicate some dependencies and timelines for 2019.4 LTS. Thanks
So it's basically a documentation and communication problem? Because it seems that most people are under the impression you can't currently do custom PP at all in URP.
Could you consider prioritising updating the docs, maybe release a simple example project and streamline your messaging to make it clearer that this isn't the case? This misunderstanding has lingered for weeks if not months and led to some fairly long discussions that might have been rendered moot.
Have to agree with @andybak
The way it was communicated is that custom post processing is something that is still coming to URP.
And as an observation, I really think it's not great to hide the custom render passes tutorial behind a paywall:
Interestingly enough this is basically a written version of this free video here
So, to clear up the confusion, as there is a lot of it and I fully understand where it is coming from:
If we look at it in its simplest form: yes, right now, today, you can do custom post-processing in URP. You can even do it in LWRP. Example:
Now, the reason we officially say we don't support it comes down to three main reasons:
Documentation is lacking on creating a Custom RenderFeature/Pass and how to create a VolumeOverride to perform a volume based post-processing effect
We do not have a simple out-of-the-box template for people to get started with, so entry to this requires a much higher level of SRP knowledge
Efficient and tight integration with the embedded post-processing is currently non-existent, meaning that if you want to use a mix of our post-processing and custom post-processing, you will be paying more than you need to performance-wise
So we need to do more before we can say we support custom post-processing, as we did with PPv2, which, by the way, will be available to use in 7.2.0 on 19.3 later this week.
When you look at what a post-processing effect is, it really is just a render pass that might blit a shader to the screen; there is also nothing stopping someone from using the volume system and extending it. When we do create a better system, the back end will simply be a RenderFeature with a Pass that is paired with a VolumeOverride, and this will just be a template, along with a good doc page similar to the one we made for PPv2, which was easy to follow and got you going quickly.
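As a sketch of that RenderFeature + VolumeOverride pairing: a VolumeComponent holds the settings, and the render pass reads them from the volume stack. The names here (MyEffectVolume, intensity) are hypothetical, not a shipped API.

```csharp
using System;
using UnityEngine.Rendering;

// Hypothetical volume override: appears under the Volume component's
// Add Override menu and blends between volumes like the built-in overrides.
[Serializable, VolumeComponentMenu("Custom/My Effect")]
public class MyEffectVolume : VolumeComponent
{
    // 0 = effect off, 1 = full strength; blended by the volume system.
    public ClampedFloatParameter intensity = new ClampedFloatParameter(0f, 0f, 1f);
}
```

Inside the pass's Execute you would then call `VolumeManager.instance.stack.GetComponent<MyEffectVolume>()` and skip the blit when `intensity.value` is zero, which is roughly the wiring the planned template would wrap up for you.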
So, to sum up: you can do custom post-processing in UniversalRP now, but we want to make it more accessible and faster. Remember, this is also why we moved to SRPs in the first place, so you have all the code in your hands and nothing is hidden behind our compiled code; tweak to your heart's content if you cannot wait for us to tweak it for you.
I just wanted to point out how helpful and needed that post was. Like others, up until right now I believed that URP could not do custom PP at all. I really appreciate this post but your team really needs to streamline its messaging and make things clearer, I bet there are users all over the place who believe as we did and still have no idea.
Yes thanks for the explanation !
What are we looking at in terms of pixel effects on cameras with URP -- particularly where upscaling issues are concerned?
See the following issues for an example:
As you can see, this pixel-perfect 2D project hasn't been updated in a LONG time, yet looking here, you can clearly see that people are interested in non-pixelated camera stuff being hybridized with standard Unity cameras:
I am wondering if this pixelized/non-pixelized camera scaling is planned as being part of the URP built-in solution?
If so, how far along are we?
Thanks for the demo project! I tried it out; visually it runs without any problems and is very easy to use and flexible.
From a performance perspective, however, it is not perfect: no matter what settings I use, there is always an extra render-target switch back to _CameraColorTexture that is not needed. I marked this extra action in red below.
Here is the result from the Frame Debugger, using your project without any changes:
1) Start rendering into _CameraColorTexture normally: regular clear, then render all opaque objects and the skybox
2) Render target switches to _tmpBlurRT; sample _CameraColorTexture and run the blur chain
3) Render target switches back to _CameraColorTexture (tile memory load), but without rendering anything useful!
4) Render target switches to _CameraOpaqueTexture, triggering Unity's Opaque Texture copy
5) Render target switches back to _CameraColorTexture (tile memory load); continue rendering the rest
Basically, I just want:
1) Start rendering into _CameraColorTexture normally: regular clear, then render all opaque objects and the skybox
2) Render target switches to _CameraOpaqueTexture, triggering Unity's Opaque Texture copy
3) Render target switches to _tmpBlurRT; sample _CameraColorTexture or _CameraOpaqueTexture and run my own blur chain
4) Render target switches back to _CameraColorTexture (tile memory load); continue rendering the rest
Is it possible to implement the above render logic, which results in the minimum number of render-target switches? I tried injecting kawaseBlur both at After Rendering Skybox and Before Rendering Transparents, but neither helped performance; there is always an extra render-target switch back to _CameraColorTexture that is not needed (red line).
Maybe we need something like an After Rendering Opaque Texture event?
This might be a shortcoming of the ScriptableRenderPass system rather than the Kawase Blur example, will ask about this.
There are plans to move the OpaqueTexture copy into a renderer feature to let you choose where you want it and how many; a more distant plan is to add the ability to request it easily inside a render pass/feature, so you can make a renderer feature that relies on the opaque texture being available at that point. Otherwise, for now, Before Rendering Transparents should be roughly in the right place.
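For anyone following along, the structure under discussion looks roughly like this. This is a sketch, not the actual demo project's code: `_tmpBlurRT` and the injection points come from the frame-debugger listing above; everything else is illustrative.

```csharp
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

// Sketch of a blur pass that ping-pongs between two temporary targets.
// The final Blit back to the camera color target is the extra
// _CameraColorTexture switch discussed above; where URP schedules the
// opaque-texture copy around it is decided by the renderer, not this pass.
class KawaseBlurPass : ScriptableRenderPass
{
    readonly Material blurMaterial;
    readonly int iterations;
    RenderTargetIdentifier source;
    static readonly int id1 = Shader.PropertyToID("_tmpBlurRT");
    static readonly int id2 = Shader.PropertyToID("_tmpBlurRT2");

    public KawaseBlurPass(Material blurMaterial, int iterations)
    {
        this.blurMaterial = blurMaterial;
        this.iterations = iterations;
        renderPassEvent = RenderPassEvent.AfterRenderingSkybox;
    }

    public void Setup(RenderTargetIdentifier cameraColorTarget) => source = cameraColorTarget;

    public override void Execute(ScriptableRenderContext context, ref RenderingData renderingData)
    {
        var cmd = CommandBufferPool.Get("KawaseBlur");
        var desc = renderingData.cameraData.cameraTargetDescriptor;
        desc.width /= 2; desc.height /= 2; // blur at half resolution
        desc.depthBufferBits = 0;
        cmd.GetTemporaryRT(id1, desc, FilterMode.Bilinear);
        cmd.GetTemporaryRT(id2, desc, FilterMode.Bilinear);

        var src = new RenderTargetIdentifier(id1);
        var dst = new RenderTargetIdentifier(id2);
        cmd.Blit(source, src);
        for (int i = 0; i < iterations; i++)
        {
            cmd.Blit(src, dst, blurMaterial);    // one blur iteration
            var tmp = src; src = dst; dst = tmp; // ping-pong the targets
        }
        cmd.Blit(src, source); // the switch back to _CameraColorTexture

        cmd.ReleaseTemporaryRT(id1);
        cmd.ReleaseTemporaryRT(id2);
        context.ExecuteCommandBuffer(cmd);
        CommandBufferPool.Release(cmd);
    }
}
```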
Is it possible to get a depth pass on a render objects pass? I've made an outline post process shader that I'd like to have affected by my first person objects but I need access to the depth pass.
The only other way I can have them affected is to use a second camera which I'd like to avoid.
Kinda sucks my question was skipped over.
So I'll ask again --
What are the plans for multi-res cameras (as it relates to pixelated effects described above)?
Honestly, there are times when I would like to take a 3d camera's output and shrink its resolution to a particular pixel-scale (per-camera) so that I can use pixel-style graphics on top. -- Is this feature being worked on?
I'm interested in a variant of that too!
I would still like some more precision about how much I can push the resolution thing in URP:
1. Can I do high-res proximity vs. a low-res rendered background (as done in KLM jets)?
2. Render particles or foliage at low res, then composite them with the high-res scene (as done in Grid, the racing game)?
3. Render the scene in low res, but have just a few focus objects in high res composited on top (like a character's eyes, a fence, wire, etc.)?
4. Simulate fixed foveated rendering that adapts to performance scaling (the center stays high res, but the background and periphery go low res, as done in Mario Odyssey)?
Resolution tricks, with proper art direction, help with managing fill rates and performance when done cleverly. I think the official name is Extended Low Resolution Rendering.
What about anamorphic pixel rendering?
Not currently; there are also quite a few issues here when dealing with depth properly. Someone has made a cool low-res post effect in URP though, here: https://github.com/Dener1111/LowRes
With this you can see how you might create some cool effects with a Custom RendererFeature.
Definitely, we do want to get there, but right now we are focusing more on what is missing from built-in rather than adding too much new stuff. I mean, we still don't have point light shadows, so we need to make sure things like that are top priority.
HDRP has the option to down-res transparency, which, as you said, is a common trick with particles to avoid some fill-rate cost. This stuff is on our minds, but we have not committed to adding it to the roadmap yet.
So, as long as you are not performing a depth prepass and the pass is enqueued after opaques, the Render Objects meshes will be rendered into both the color and depth buffers, so if you have a depth copy (depth texture), they will appear in it.
Also, as a note: I had some time today to do a quick pass at exposing both max iterations and downsampling options on the Bloom PP effect. Currently it is only on a branch (no PR yet), but if anyone is interested in giving it a spin, let me know what you think. I've yet to do any performance testing to see what kind of speed-up one can get, but it would be interesting to know how much it helps.
This is pretty good-looking actually.
In response to your issues with depth: depth is actually something I don't think anyone who downsamples the camera's output cares much about, aside from when you're trying to do the things @neoshaman mentioned. A simple grayscale texture is all you need, and with some planning, a special render pass to grab this immediately before downsampling (assuming you do need this information) would be all anyone could reasonably expect when doing such a thing. It seems like offering the option for a depth pass before and after the downsample in the shader would be all you need (i.e. just a quick setting in the project's preferences, where you can set options for cascading shadows and the like). A special "trick" like this would clearly have limitations, but if you NEED these downsample passes in a shader, you NEED the ability to do this at some point in the pipeline. If these options are not enabled, simply throw an error and don't let the user save the shader if it includes these kinds of nodes (or force the setting to be enabled if a shader uses these passes for downsample textures).
Typically these are project-wide settings and not a one-off shader pass, but if users clearly understand that compromises must be made on their part for this kind of support, I think most of us would gladly accept that over a straight-up impossibility to downsample, ever (or at least within the next 10 years!!). Again, as @neoshaman pointed out, this isn't just a one-off request: downsampling techniques are widely used in the VFX industry for some really cool stuff, especially for performance-critical shaders.
Remember the whole "Performance by Default" thing?
It seems we've forgotten all about the forest because of all the trees...
For your bloom inspector, I highly recommend that the "Downsample" option include both "fixed-height RT" and "fixed-width RT" options; one of these is very important if the user wants to make sure the bloom visual is 100% identical across all device resolutions.
We found that keeping the bloom and DoF results visually identical across all resolutions is impossible in URP; just changing the resolution in the editor's Game window changes the bloom and DoF results a lot (for example, when we check the render result in both 1080p and 4K to make sure they look the same). This is a show-stopper for our cinematic shot and emissive spell VFX development.
If that can't be solved, we will need to edit the source code of URP's post-processing, which we don't want to do because it is hard to maintain.
Thanks for giving me an answer!
Is there any status on this? They recently released an asset that does URP AO, but I'd rather wait for the Unity version if it's on its way.
I'm still working on this. Making it work in VR became a bit of a bottleneck and progress slowed down a bit because of the "corona virus + working from home with the missus and two toddlers" situation that's going on. But I believe I'm ready to send the first part for review today/tomorrow.
The first part will include a depth-only SSAO, where we reconstruct the normals from the depth texture.
The second part will include an SSAO with DepthNormals, where we will also add a new pass with the ability to get normals from the prepass. We will also add the ability for Render Features to specify their requirements to the renderer (for example, SSAO in DepthNormals mode will tell the renderer that it requires a prepass that produces depth + normals).
So, if all goes well, we'll release both parts in 7.4. In the worst case it will only contain part 1, but I think I'll be able to implement both before that release.
Any chance to get the option to bind the depth pre-pass for depth-testing as well?
Just tried the upcoming version of HBAO and was excited to see it shows up as a PP override in URP... custom effects are go!
Hello URP team, can you give us an update on the status of custom PP effects in URP? Is it still on-track to be delivered when 2020.1 is released (7.4) or is it going to happen later? I'm hoping that I can use it with the 2D renderer to support a conditional blur PP (AKA not a RenderFeature) that I can enable/blend via a volume. Thanks!
I am getting artifacts on Quest using Vulkan and post-processing; it looks very similar to a fix I saw you did for a NaN issue on other platforms. Are you aware of this one?
I am also having reflection problems. Are there any known bugs with URP and Vulkan regarding reflections? Any plans for fixes?
When do you plan to release a new version of URP? It's been months without an update and we are looking forward to it.
Cool! Has this been released yet?
Doesn't look like it: https://github.com/Unity-Technologies/Graphics/tree/universal/bloom-quality-settings
Well, that's sad.
Have to take care of it myself then!
Is this possible in URP?
I tried the following but it doesn't work.
Bloom is applied to all screens.
Is it possible to turn on Bloom only for the UI?