HDRP Weapon shader?

Discussion in 'Graphics Experimental Previews' started by id0, Jun 14, 2018.

  1. jRocket

    jRocket

    Joined:
    Jul 12, 2012
    Posts:
    700
For one, you have an extra full-screen render texture, which can eat up a lot of memory.
     
  2. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Some guesses (Unity's engineers will clear this up no doubt):

    Not requiring the entire pipeline to reboot for a second camera, including culling and so on, is a good start. Layered cameras have basically always been a perf horror story in Unity, and with something like LWRP or HDRP the overhead is even bigger, because these pipelines do much more up-front work before the render than the old pipeline did, to absorb the cost of greater demands later on. And when you have multiple cameras being layered, I'm pretty sure you need to retain some kind of alpha information, so the GPU has to keep extra stuff around. I'm not totally clear on the details, but I'm 100% clear that historically doing this was slow as hell, for me :)

    I guess you can still render into a texture, as long as the texture doesn't need a copy to retain its previous contents, and this would be fast if you turn a lot of stuff off.
     
  3. eskovas

    eskovas

    Joined:
    Dec 2, 2009
    Posts:
    1,373
    In their new "Deep dive into graphics of FPS Sample - Unite LA" video, they talk a bit about how they achieved the FPS render mode (at 39:05).

    Apparently, they render a very tiny gun, with the FoV adjusted in the shader.
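
    For reference, here's a rough C# sketch of how I imagine the non-shader half of that trick working (weapon parented to the camera and physically shrunk, with the scale handed to the material so a vertex-stage Shader Graph node can compensate). This is just my guess at the setup, not the FPS Sample's actual code, and the "_FPSScale" property is a made-up placeholder name:

    Code (CSharp):

    using UnityEngine;

    // Guess at the "tiny gun" setup: keep the weapon glued to the camera and
    // scaled way down so it rarely clips level geometry, then hand the shrink
    // factor to the material so the shader can undo it on screen.
    public class FirstPersonWeaponScale : MonoBehaviour
    {
        public Camera playerCamera;
        public Renderer weaponRenderer;
        [Range(0.001f, 1f)] public float shrinkFactor = 0.01f;

        static readonly int FpsScaleId = Shader.PropertyToID("_FPSScale"); // placeholder name
        MaterialPropertyBlock _block;

        void LateUpdate()
        {
            // Follow the camera and stay physically tiny.
            transform.SetPositionAndRotation(playerCamera.transform.position,
                                             playerCamera.transform.rotation);
            transform.localScale = Vector3.one * shrinkFactor;

            // Let the vertex stage know how much to compensate.
            if (_block == null) _block = new MaterialPropertyBlock();
            weaponRenderer.GetPropertyBlock(_block);
            _block.SetFloat(FpsScaleId, shrinkFactor);
            weaponRenderer.SetPropertyBlock(_block);
        }
    }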
     
    Lahcene and giraffe1 like this.
  4. Reanimate_L

    Reanimate_L

    Joined:
    Oct 10, 2009
    Posts:
    2,788
    Yeah, they also explain this in the FPS Sample forum, with an example of getting the same effect with Shader Graph.
     
  5. rz_0lento

    rz_0lento

    Joined:
    Oct 8, 2013
    Posts:
    2,361
  6. andyRogerKats

    andyRogerKats

    Joined:
    Oct 3, 2016
    Posts:
    13
    I'm not sure if someone has mentioned this already, but if you don't care about PBR you can use the GUI/Text Shader (works in 2018.3.5).
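
    If you want to try it quickly from script, something like this should work. The built-in shader is unlit and uses ZTest Always, which is why it draws on top, but you lose all PBR shading and it expects the texture's alpha channel:

    Code (CSharp):

    using UnityEngine;

    // Quick test: swap every material on the weapon over to the built-in
    // GUI/Text Shader, which ignores depth testing and lighting.
    public class UseGuiTextShader : MonoBehaviour
    {
        void Start()
        {
            var shader = Shader.Find("GUI/Text Shader");
            if (shader == null)
            {
                Debug.LogWarning("GUI/Text Shader not found (is it in Always Included Shaders?)");
                return;
            }

            foreach (var r in GetComponentsInChildren<Renderer>())
                foreach (var m in r.materials)
                    m.shader = shader;
        }
    }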
     
  7. derp_bot

    derp_bot

    Joined:
    Nov 21, 2012
    Posts:
    8
    @hippocoder I think what you are looking for is possible, but I don't think it is yet exposed in Shader Graph, so you may need to (very lightly) modify the HDRP Lit shader.

    The two things you will need to change are the render queue tag and the depth testing property.

    First, the render queue tag. In the tags at the start of the shader, change "Queue"="Geometry" to "Queue"="Geometry+1". This gives the shader a larger sort order, so objects using it are drawn after other opaque objects that use the default order.

    This is now documented at:

    https://docs.unity3d.com/Packages/c....7/manual/Renderer-And-Material-Priority.html

    Note that the article above is mainly discussing transparent sorting, though the same logic applies to opaque sorting.

    Also note that, either by design or due to a bug (in 2019.1 with HDRP v5.13, anyway), negative values in the Geometry queue seem to cause the object not to be rendered at all rather than rendering first.

    Secondly, you will need to disable depth testing. Just change ZTest LEqual to ZTest Always. This will make sure that the pixels of your overlaid object don't get discarded if they clip through another object and end up behind it.
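
    If you'd rather not touch the shader just for the queue change, the same "Geometry+1" offset can also be applied from script via material.renderQueue, which overrides the queue tag in the shader (the ZTest change still has to be made in the shader source as described above). A minimal sketch, which I haven't verified against every HDRP version:

    Code (CSharp):

    using UnityEngine;
    using UnityEngine.Rendering;

    // Push every material on this object (e.g. the weapon) to Geometry+1 so it
    // sorts after all opaques that use the default queue.
    public class DrawWeaponLast : MonoBehaviour
    {
        void Start()
        {
            foreach (var r in GetComponentsInChildren<Renderer>())
                foreach (var mat in r.materials)
                    mat.renderQueue = (int)RenderQueue.Geometry + 1; // same as "Queue"="Geometry+1"
        }
    }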

    cubes.PNG
    As an example, the red cube is physically behind the blue one, though rendered on top.

    frame-debug.PNG
    From the frame debugger, you can see that the red cube was indeed rendered last.

    As mentioned, I don't think this is doable without modifying the Lit shader. If you search the changelog below for RenderQueue, it appears that in v6.0 they added RenderQueue to the Lit material, which presumably is this, and hopefully it will end up in Shader Graph as well. Controlling render order is pretty useful, so I can't imagine it not.

    https://github.com/Unity-Technologi...render-pipelines.high-definition/CHANGELOG.md

    Hope this helps!

    Cheers
     
    hippocoder likes this.
  8. eskovas

    eskovas

    Joined:
    Dec 2, 2009
    Posts:
    1,373
    Changing the ZTest to Always will cause issues when using more complex meshes or multiple objects in 'fps mode': objects and triangles get drawn on top of each other regardless of depth. Try this method with one or more complex meshes and you'll see the issue.
    Normal objects should always be rendered with LEqual.

    Edit:
    Another thing I tried with ZTest was to first render the fps objects with ZTest Always and 'geometry+1', but with a big depth offset, so they would cut a hole in anything in front of where the object would be, and then render the normal weapon with 'geometry+2', which would be guaranteed to fit in that hole and render normally.
    From my tests I wasn't able to get that to work properly, although it sounds like it should. Maybe I did something wrong, or the z-buffer doesn't like it, or it doesn't get properly modified that way.
     
    Last edited: May 28, 2019
    hippocoder likes this.
  9. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Would've given captain stencil a visit but he's said that he's too busy moving around to help :(
     
  10. derp_bot

    derp_bot

    Joined:
    Nov 21, 2012
    Posts:
    8
    Yeah, I should have mentioned that there will be z-fighting issues if your weapon is made up of multiple objects.
    That will always be the case when you render objects in an overlaid fashion and try to circumvent the natural order of things.

    As hippo mentioned, you could use the stencil buffer. Note, though, that in HDRP the stencil buffer is used extensively for lighting, so you will have to be careful. As of v5.13, only bit 4 is free for custom usage. From the HDRP source:

    Code (CSharp):

    // Stencil usage in HDRenderPipeline.
    // Currently we use only 2 bits to identify the kind of lighting that is expected from the render pipeline
    // Usage is define in LightDefinitions.cs
    [Flags]
    public enum StencilBitMask
    {
        Clear                           = 0,   // 0x0
        LightingMask                    = 3,   // 0x7  - 2 bit - Lifetime: GBuffer/Forward - SSSSS
        // Free slot 4
        // Note: If required, the usage Decals / DecalsForwardOutputNormalBuffer could be fit at same location as LightingMask as they have a non overlapped lifetime
        Decals                          = 8,   // 0x8  - 1 bit - Lifetime: DBuffer - Patch normal buffer (This bit is cleared to 0 after Patch normal buffer)
        DecalsForwardOutputNormalBuffer = 16,  // 0x10 - 1 bit - Lifetime: DBuffer - Patch normal buffer (This bit is cleared to 0 after Patch normal buffer)
        DoesntReceiveSSR                = 32,  // 0x20 - 1 bit - Lifetime: DethPrepass - SSR
        DistortionVectors               = 64,  // 0x40 - 1 bit - Lifetime: Accumulate distortion - Apply distortion (This bit is cleared to 0 after Apply distortion pass)
        SMAA                            = 64,  // 0x40 - 1 bit - Lifetime: SMAA EdgeDetection - SMAA BlendWeight.
        ObjectMotionVectors             = 128, // 0x80 - 1 bit - Lifetime: Object motion vector pass - Camera motion vector (This bit is cleared to 0 after Camera motion vector pass)
        All                             = 255  // 0xFF - 8 bit
    }
    Hopefully that doesn't change. I would think Unity will make sure that, going into production, we have at least one stencil bit free for custom usage, though at this point HDRP is still the wild west, so who knows?
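
    If you do go down the stencil route, a tiny sanity check like this (just restating the reserved bits from the enum above, as of HDRP 5.13) makes it harder to accidentally stomp on a bit HDRP owns:

    Code (CSharp):

    // Mirror of the bits HDRP 5.13 reserves for itself (see the enum above).
    // If HDRP changes its stencil layout, this constant needs updating too.
    public static class CustomStencil
    {
        const int ReservedByHDRP = 0x3 | 0x8 | 0x10 | 0x20 | 0x40 | 0x80;

        public const int FreeBit = 0x04; // the "Free slot 4" in the enum above

        // True if a custom stencil write mask only touches bits HDRP leaves free.
        public static bool IsSafe(int customWriteMask)
        {
            return (customWriteMask & ReservedByHDRP) == 0;
        }
    }

    // CustomStencil.IsSafe(CustomStencil.FreeBit) -> true
    // CustomStencil.IsSafe(0x08)                  -> false (HDRP uses it for decals)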
     
  11. Cookieg82

    Cookieg82

    Joined:
    Mar 28, 2017
    Posts:
    73
    Hiya, I'm curious - where do I find this in HDRP? I would like to get my weapons rendered on top of everything. I may be being dumb, but how did you do this in HDRP?

    Thanks!