Discussion in 'Graphics Experimental Previews' started by id0, Jun 14, 2018.
For one, you have an extra full screen rendertexture, which can eat up a lot of memory.
Some guesses (Unity's engineers will clear this up no doubt):
Not requiring the entire pipeline to reboot for a second camera, including culling and so on, is a good start. Performance-wise it's basically always been a horror story in Unity, but with something like LWRP or HDRP it adds even more overhead, since these pipelines do much more up-front work prior to the render than before, to absorb the cost of greater demands further on. And when you have multiple cameras being layered, I'm pretty sure you need to retain some kind of alpha information and the GPU has to keep extra buffers around. I'm not totally clear on the details, but I'm 100% clear that historically doing this was slow as hell, at least for me.
I guess you can still render into a texture, if the texture doesn't need a copy to retain what was there previously; that would be fast if you turn a lot of stuff off.
In their new "Deep dive into graphics of FPS Sample - Unite LA" video, they talk a bit on how they achieved the FPS render mode (at 39:05)
Apparently, they render a very tiny gun, with the FoV being adjusted in the shader.
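That FoV trick can be sketched roughly like this in HLSL (the property names `_CameraFovY`/`_WeaponFovY` are my own illustration, not the FPS Sample's actual code):

```hlsl
// Sketch: render the weapon with its own FOV by rescaling clip-space XY.
// _CameraFovY / _WeaponFovY are hypothetical material properties, in degrees.
float _CameraFovY;
float _WeaponFovY;

float4 WeaponVert(float3 positionOS)
{
    // Standard SRP helper: object space -> homogeneous clip space.
    float4 positionCS = TransformObjectToHClip(positionOS);

    // Remap from the main camera's vertical FOV to the weapon's FOV.
    float scale = tan(radians(_CameraFovY) * 0.5) / tan(radians(_WeaponFovY) * 0.5);
    positionCS.xy *= scale;
    return positionCS;
}
```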
Yeah, they also explain this in the FPS Sample forum, with an example of getting the same effect with Shader Graph.
I made github diffs now for FPS Sample's SRP and PP (in comparison to unaltered SRP and PP), relevant post here: https://forum.unity.com/threads/fps-sample-srp-and-pp-change-diffs-in-github.587863/
Their SRP changes include "FpsModeFov", which could be related to what's been discussed here (I didn't check the changes in detail).
I'm not sure if someone has mentioned this already, but if you don't care about PBR you can use the GUI/Text Shader (works in 2018.3.5).
@hippocoder I think what you are looking for is possible, but I don't think it is yet exposed in Shader Graph, so you may need to (very lightly) modify the HDRP Lit shader.
The two things you will need to change are the render queue tag and the depth testing property.
First, the render queue tag. In the tags at the start of the shader, change "Queue"="Geometry" to "Queue"="Geometry+1". This makes the shader's sort order larger, so it is drawn after other opaque objects with the default sort order.
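In the shader source that's just the SubShader tag block, something like this (a minimal sketch, with the Lit passes omitted):

```shaderlab
SubShader
{
    // Geometry = 2000, so Geometry+1 = 2001: drawn after default opaque objects
    Tags { "RenderType" = "Opaque" "Queue" = "Geometry+1" }
    // ... Lit shader passes unchanged ...
}
```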
This is now documented at:
Note that the article above is mainly discussing transparent sorting, though the same logic applies to opaque sorting.
Also note that, either by design or by a bug (in 2019.1 with HDRP v5.13, anyway), negative values in the Geometry queue seem to cause the object not to be rendered at all rather than rendered first.
Secondly, you will need to disable depth testing: just change ZTest LEqual to ZTest Always. This makes sure the pixels of your overlaid object don't get discarded when they clip through another object and end up behind it.
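Inside the relevant pass, that change looks like this (sketch):

```shaderlab
Pass
{
    // Always pass the depth test so the overlay is never clipped by world geometry
    ZTest Always
    ZWrite On
    // ... rest of the pass unchanged ...
}
```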
As an example, the red cube is physically behind the blue one, though rendered on top.
From the frame debugger, you can see that the red cube was indeed rendered last.
As mentioned, I don't think this is doable without modifying the Lit shader. If you search the changelog below for RenderQueue, it appears that in v6.0 they added RenderQueue to the Lit material, which is presumably this, and hopefully it will end up in Shader Graph too. Controlling render order is pretty useful, so I can't imagine it not.
Hope this helps!
Changing ZTest to Always will cause issues when using more complex meshes or multiple objects in 'fps mode': objects/triangles get drawn on top of each other. Try this method with one or more complex meshes and you'll see the issue.
Normal objects should always be rendered with LEqual.
Another thing I tried with ZTest was to first render the fps objects with ZTest Always and 'Geometry+1' but with a big depth offset, so they would cut a hole in anything in front of where the object would exactly be, and then render the normal weapon at 'Geometry+2', which would be guaranteed to fit in that hole and render normally.
From my tests I wasn't able to get that to work properly, although it sounds like it should. Maybe I did something wrong, or the z-buffer doesn't like it or doesn't get modified properly that way.
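For reference, that two-pass idea would look roughly like this in ShaderLab (untested; as described above it didn't work reliably in practice, and the Offset values are purely illustrative):

```shaderlab
// Material 1, "Queue" = "Geometry+1": depth-only pass that punches the hole
Pass
{
    ZTest Always
    ZWrite On
    ColorMask 0          // write depth only, no color
    Offset -1, -100000   // pull the written depth far toward the camera
}

// Material 2, "Queue" = "Geometry+2": the weapon itself, rendered normally
Pass
{
    ZTest LEqual
    ZWrite On
}
```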
Would've given captain stencil a visit but he's said that he's too busy moving around to help
Yeah, I should have mentioned that there will be z-fighting issues if your weapon is made up of multiple objects.
This will always be the case if you are trying to render objects in an overlayed fashion and trying to circumvent the natural order of things.
As hippo mentioned, you could use the stencil buffer. Note, though, that in HDRP the stencil buffer is used extensively for lighting, so you will have to be careful. As of v5.13, only slot 4 is free for custom usage. From the HDRP source:
// Stencil usage in HDRenderPipeline.
// Currently we use only 2 bits to identify the kind of lighting that is expected from the render pipeline
// Usage is defined in LightDefinitions.cs
public enum StencilBitMask
{
    Clear                           = 0,   // 0x0
    LightingMask                    = 3,   // 0x3  - 2 bits - Lifetime: GBuffer/Forward - SSSSS
    // Free slot 4
    // Note: If required, the usage Decals / DecalsForwardOutputNormalBuffer could fit at the same location as LightingMask as they have non-overlapping lifetimes
    Decals                          = 8,   // 0x8  - 1 bit - Lifetime: DBuffer - Patch normal buffer (this bit is cleared to 0 after Patch normal buffer)
    DecalsForwardOutputNormalBuffer = 16,  // 0x10 - 1 bit - Lifetime: DBuffer - Patch normal buffer (this bit is cleared to 0 after Patch normal buffer)
    DoesntReceiveSSR                = 32,  // 0x20 - 1 bit - Lifetime: Depth prepass - SSR
    DistortionVectors               = 64,  // 0x40 - 1 bit - Lifetime: Accumulate distortion - Apply distortion (this bit is cleared to 0 after the Apply distortion pass)
    SMAA                            = 64,  // 0x40 - 1 bit - Lifetime: SMAA EdgeDetection - SMAA BlendWeight
    ObjectMotionVectors             = 128, // 0x80 - 1 bit - Lifetime: Object motion vector pass - Camera motion vector (this bit is cleared to 0 after the Camera motion vector pass)
    All                             = 255  // 0xFF - 8 bits
}
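Assuming slot 4 (value 0x4) really is free, marking pixels with it and testing against it from ShaderLab might look like this (a sketch only; I haven't verified it survives HDRP's own stencil passes):

```shaderlab
// On the weapon material: tag every covered pixel with the free bit (value 4)
Stencil
{
    Ref 4
    WriteMask 4      // only touch our bit, leave HDRP's lighting/decal bits alone
    Comp Always
    Pass Replace
}

// On a later pass that should skip those pixels: test the same bit
Stencil
{
    Ref 4
    ReadMask 4
    Comp NotEqual
}
```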
Hopefully that doesn't change. I would think Unity would make sure that, going into production, we have at least one stencil bit for custom usage, though at this point HDRP is still the wild west, so who knows?
Hiya, I am curious: where do I find this in HDRP? I would like to get my weapons rendered on top of everything. I may be being dumb, but how did you do this in HDRP?