
How do I write a normal decal shader using a newly added (Unity 5.2) "finalgbuffer" modifier?

Discussion in 'Shaders' started by bac9-flcl, Sep 22, 2015.

  1. bac9-flcl

    bac9-flcl

    Joined:
    Dec 5, 2012
    Posts:
    786
    What I'm trying to do should be pretty simple: a decal shader that overlays its own normals on top of the normals already in the deferred GBuffer, instead of overwriting everything on the pixels covered by decal faces. I was told that what I want to accomplish can be done through the finalgbuffer modifier.

    Full disclosure: I'm pretty bad at writing shaders, and I'm only attempting to use finalgbuffer because a senior shader programmer with Unity 5.2 beta access recommended it to me as the perfect tool for the shader I describe. :)

    ____________________________________________

    First, some visual examples, for context.

    Let's say you have two models like these:



    The one on the right is textured with this normal map:



    The end result, with the decal adding its normals on top, looks like this:



    ____________________________________________

    Now, from what I understand, the shader should work like this:

    • Add the finalgbuffer modifier to the surface shader and create a function for it
    • Declare a sampler2D for _CameraGBufferTexture2
    • Calculate screenUV in the finalgbuffer function
    • Sample _CameraGBufferTexture2 with tex2D at screenUV in the finalgbuffer function
    • Transform your tangent space normals to world space (as the GBuffer stores world space normals)
    • Overlay your transformed normal onto the sampled _CameraGBufferTexture2 value
    • Output the result to the inout half4 normal of the finalgbuffer function
    Except I have no idea how to make it work, because the surface part of the shader ruins the result by overwriting everything, and I can't discard the result of a surface shader using e.g. Blend Zero One because that will also discard anything I will do in the finalgbuffer function.
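    For reference, here is the rough (non-working) skeleton I have so far. The finalgbuffer signature is copied from TerrainSplatmapCommon.cginc; everything else is guesswork, so treat it as a sketch of the steps above rather than working code:

    Code (csharp):
        Shader "Custom/NormalDecal" {
            Properties {
                _BumpMap ("Normalmap", 2D) = "bump" {}
            }
            SubShader {
                Tags { "RenderType"="Opaque" "Queue"="Geometry+1" }
                Offset -1, -1

                CGPROGRAM
                #pragma surface surf Standard finalgbuffer:DecalGBuffer exclude_path:forward
                #pragma target 3.0

                sampler2D _BumpMap;
                sampler2D _CameraGBufferTexture2; // world space normals from the opaque pass

                struct Input {
                    float2 uv_BumpMap;
                    float4 screenPos;
                };

                void surf (Input IN, inout SurfaceOutputStandard o) {
                    o.Normal = UnpackNormal(tex2D(_BumpMap, IN.uv_BumpMap));
                }

                // signature copied from the terrain splat shaders
                void DecalGBuffer (Input IN, SurfaceOutputStandard o,
                    inout half4 diffuse, inout half4 specSmoothness,
                    inout half4 normal, inout half4 emission) {
                    float2 screenUV = IN.screenPos.xy / IN.screenPos.w;
                    // GBuffer normals are stored remapped to 0..1
                    half3 sceneNormal = tex2D(_CameraGBufferTexture2, screenUV).rgb * 2 - 1;
                    // TODO: transform o.Normal to world space and overlay it onto sceneNormal
                    normal.rgb = sceneNormal * 0.5 + 0.5;
                }
                ENDCG
            }
        }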

    One of the ideas I had was to use screenUV to sample all four GBuffer textures in the surface function and output that to the SurfaceOutputStandard struct, but that is not working either, unfortunately. I am pretty sure I am completely misunderstanding the point of finalcolor/finalgbuffer if I do that, too, because if it were possible to read from the GBuffer and write to it at the same time, separate functions wouldn't have been necessary at all.

    The only reference I have, and pretty much the only instance of finalgbuffer use in existence so far, is the terrain splat shaders in Unity. If someone could take a look at them, that would be nice, because I'm not sure I'm understanding how finalgbuffer is used there.

    Maybe one missing piece of the puzzle is use of decal:add or decal:blend modifiers, but I'm not sure.

    Can someone help? If you have no experience with finalgbuffer, that's alright - some of you are probably familiar with the finalcolor modifier, which, according to Unity, works almost exactly the same way. Maybe you can at least show me how to make a shader with a similar idea using finalcolor (sans the normal operations - just, e.g., an overlay of a red fill).

    ____________________________________________

    P.S.: I have encountered these questions a dozen times now, so just in case:

    Q: Those models look awful, why not just use a single unwrapped object with baked normals?
    A: A few reasons:

    • It makes it possible to detail objects with a huge surface area. Not even insanely high-res unwraps will save you if you want to put crisp detail on a wall or a vehicle with a surface area of, say, 30 square meters.
    • Absolutely, perfectly consistent texel density - every single detail on every single object in every single environment has exactly the same texel density. Great for the look.
    • Ease of authoring art - slapping quads onto models is the definition of a fast workflow. An artist can detail ten objects with this approach in the time it takes to set up a highpoly and bake maps for one traditional object.
    • Ease of updating art - modify the normal map decal to, say, alter the shape of a rivet, and every single rivet in existence gets updated across thousands of objects. Good luck doing that with the traditional floater baking workflow.
    • Free memory - huge per-object normal maps are no longer needed if you use med-poly base geometry with normal decals on top. Just one small decal atlas for all objects and you're done. Hell, you might not even need other per-object atlases - just jump to tiled materials and add all your detail with decals.
    This workflow is not some obscure fantasy: it was used for most of the modular objects in Alien Isolation and is being used in most ship and level models for Star Citizen. There are existing shaders accomplishing what I'm asking about in UE4 and CE.

    Q: What's wrong with alpha blended shaders? Just make a texture where normal mapped parts are opaque.
    A: That won't give the desired result. I want to overlay my normals over existing normals.



    If a surface has grain or bumps in its normal map, the decal has to preserve them instead of stamping the decal face normal (with tangent space perturbations applied) on top. More than that, even if the underlying surface lacks any normal mapping, overlaying onto the GBuffer is still desirable, because the geometry and normals of the decal faces might not precisely follow the geometry and normals of the underlying surface (this is especially true in the med-poly modeling workflow with custom object normals that's becoming popular among artists now). Same deal with the other components - I don't want to overwrite albedo, smoothness and the rest.
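    To be concrete about "overlay": what I mean is something like the "whiteout" combine, one of the common ways to blend two unpacked normals (just an illustration of the math, not a Unity API):

    Code (csharp):
        // overlay a detail normal onto a base normal, both unpacked to -1..1
        // and expressed in the same space (tangent or world)
        half3 OverlayNormals (half3 baseN, half3 detailN) {
            return normalize(half3(baseN.xy + detailN.xy, baseN.z * detailN.z));
        }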

    Q: Why aren't you using the Command Buffers showcase from Unity 5.1, it has an example doing exactly this?
    A: Nope, it has an example of box projected decals, which is not at all what I want to do - I need a surface shader for pre-authored decal geometry and UVs. In addition, command buffers seemed to be quite slow performance-wise and required an inconvenient setup (additional components etc.) that I'd like to avoid if possible.

    Q: Why not use GrabPass?
    A: GrabPass will give a lit frame, which makes its contents unusable for "invisible" surface output.
     
    Last edited: Sep 22, 2015
  2. Marco-Sperling

    Marco-Sperling

    Joined:
    Mar 5, 2012
    Posts:
    570
    I cannot contribute a solution to your problem, but I already had difficulties seeing the decals in action in your first example picture - same as in your Shader Forge posting.
    From what I see and know (Polycount forums), what you have there is simply floating geometry that was baked down into a normal map for the whole door.
    No decals at all.
    I honestly don't understand where the normal map that you posted right after the door comes into play, since I cannot make out any of its information in either the door or the floating geometry.

    I simply recommend posting a stronger set of pictures. The last ones are much better at showing off your problem. Hope that helps somehow. Good luck. :)
     
  3. bac9-flcl

    bac9-flcl

    Joined:
    Dec 5, 2012
    Posts:
    786
    @Marco Sperling, there are no baked normals anywhere in the third picture. I got this workflow from Polycount in the first place, and I'm 100% sure it involves geometry that stays separate from the final surface and uses a separate shader. The whole point of this workflow is not to use floaters baked into unique per-object normal maps (I listed numerous reasons why that's very beneficial above). Here are some examples that illustrate the approach better, I hope (all clickable for high-res originals).

    A simple display with all rivets, protrusions, seams and buttons done through use of normal decals:




    Every single seam, rivet and panel you see on the exterior surface of this ship is NOT part of its texture, but separate decal geometry:




    Additionally, here is a set of mockup illustrations I just made, maybe they will help too. First, the example topology:



    Next, textures I use in this example:



    Next, why a traditional alpha blended shader won't give the desired result:



    Next, why tinting is not a solution to matching areas with decals to underlying surfaces.



    And finally, why overlaying our decal onto gbuffer contents is the best (to reiterate, this is a crude mockup only meant to illustrate that all detail from both materials is preserved):



    And to drive the point of overlay home again, another example, hopefully more clear than the illustration I made for the previous post:



    Furthermore, here is an explanation of the texture mapping in the door example. Every single face in the mesh on the right is indeed mapped using that texture. Maybe it's difficult to wrap your head around the mapping of those long seam faces, but just imagine them all unwrapped into a long horizontal rectangular UV island that is then squished on the U axis to fit the seam area of the texture. Pretty simple. Here is another shot of the door.



    _______________________________

    I hope this makes everything clear to you.
     
    Last edited: Sep 22, 2015
  4. bac9-flcl

    bac9-flcl

    Joined:
    Dec 5, 2012
    Posts:
    786
    To return to the subject. Can someone explain to me the following?

    • Why do _CameraGBufferTexture0 and the other GBuffer textures contain a completely random surface texture from the scene, or a black fill, or some distorted, jumbled mess unless I use the decal:add or decal:blend optional parameters? At first I thought it was an issue with queue/type, but moving the shader to, say, transparent has absolutely no effect on the issue. Those textures contain nothing resembling the GBuffer contents unless I use decal:add or decal:blend, and from the limited documentation ("hey, it's for decals!") on those parameters, the reason is not at all clear.
    • As _CameraGBufferTexture2 stores world space normals, can someone tell me how to bring the decal normal map into that space? I'd guess that simply wrapping tex2D in UnpackNormal won't cut it here.
    • If _CameraGBufferTexture1 is RGB specular + A smoothness, what exactly do I write to its RGB when using a metalness input? Lerp between half3(0,0,0) and the albedo input based on the metalness input?
    • As far as I understand the four inout half4 arguments used in the finalgbuffer function in TerrainSplatmapCommon.cginc, half4 diffuse refers to the value going into _CameraGBufferTexture0, half4 specSmoothness to the value going into _CameraGBufferTexture1, half4 normal to the value going into _CameraGBufferTexture2, and half4 emission to the value going into _CameraGBufferTexture3, correct?
    • Adding exclude_path:forward to the shader at the same time as decal:add renders the shader output black and makes it ignore all parameters (like Offset -1, -1). Does that reveal that the shader was not actually being compiled for deferred at all, and is actually rendered in forward when using the decal:add optional attribute? Sorry, I don't know any other way to check which path a shader uses. Maybe that's also why the GBuffers were only read properly in that configuration? The 5.2 release notes mention that decal:add should actually generate a deferred shader, though, so I don't understand what's going on there.
    • Maybe I'm misunderstanding the intended use of finalgbuffer completely: what's the difference between writing albedo/metalness/smoothness/occlusion/emission/normal to the SurfaceOutputStandard struct vs. directly modifying the four half4 inouts in the finalgbuffer function that represent what goes into the buffers? The one example of the finalgbuffer attribute so far (the new TerrainSplatmapCommon.cginc from Unity 5.2) doesn't exactly illustrate the possibilities well - it simply multiplies the GBuffer output by o.Alpha there, for some reason. My current understanding of the difference is that outputting a normal to SurfaceOutputStandard relieves you of the need to transform it to world space, but that's the only thing I can guess.
    And finally, if my idea of using the _CameraGBufferTexture* textures to get a base for overlay blending is entirely wrong, can anyone suggest other ways of using finalcolor/finalgbuffer to achieve what I want? I'm fairly sure they are the key to solving this shader, but unfortunately I have no way to reach the shader dev who recommended them at the moment, so I'm on my own.
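    For the metalness question above, here is my current guess, based on how metallic inputs are usually converted to specular color (the dielectric constant is approximate - I haven't confirmed Unity's exact value, so treat this as a sketch):

    Code (csharp):
        // sketch: converting a metalness input to the spec/smoothness GBuffer layout
        half3 specColor = lerp(half3(0.04, 0.04, 0.04), albedo, metallic); // not (0,0,0) for dielectrics
        half3 diffColor = albedo * (1.0 - metallic);
        // specSmoothness = half4(specColor, smoothness)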
     
    Last edited: Sep 23, 2015
  5. Marco-Sperling

    Marco-Sperling

    Joined:
    Mar 5, 2012
    Posts:
    570
    Sorry, I googled it up and saw that Obscura remade this door with the decals technique. Originally he made this door for a Doom fanart project where this technique was not used. So, ignore my post. Your last set of pictures is very clear though and should bring you the right answers, I hope.
     
  6. Dolkar

    Dolkar

    Joined:
    Jun 8, 2013
    Posts:
    576
    I'm not too familiar with how decals are supposed to work in Unity so I might be wrong here, but I noticed you are trying to both sample from the _CameraGBufferTextures and write to them by using a deferred shader. That either won't work at all, or if it does, Unity needs to make a copy of the entire GBuffer, which would make this technique prohibitively expensive.
    If simple alpha blending is not acceptable, I suggest figuring out a way to apply the decals directly in the surface function of the main pass, maybe using a second UV set. Normal map decals applied that way should also look much better, because they will affect ambient lighting as well.
     
  7. bac9-flcl

    bac9-flcl

    Joined:
    Dec 5, 2012
    Posts:
    786
    @Dolkar

    You are right, the use of the _CameraGBufferTextures might be misguided. I just don't see any other way to make "invisible" opaque shaders that can both leave some pixels untouched and do custom blending on top of them. I was repeatedly pointed to finalcolor and finalgbuffer, but I have no idea how to use them to achieve this. Does anyone have any ideas on how to use those functions?

    As for a shared model with two UV sets, I've thought about that, and that approach indeed makes blending dead simple, but it removes the ease of authoring. You can no longer have a surface with an arbitrary number of materials correctly overlaid by arbitrary decal materials, like in the orange/blue example I showed there, and you have to recombine the decal and surface objects every time you want to update one of them. If nothing else works, I'll be forced to do that, but I'm hoping it's possible to achieve parity with what other artists get in Unreal Engine 4 and CryENGINE and get those independent normal decals working.
     
  8. bac9-flcl

    bac9-flcl

    Joined:
    Dec 5, 2012
    Posts:
    786
    And another question: theoretically, if I were to drop the idea of overlay blending, how can I make an alpha tested or alpha blended shader that completely discards its own albedo/spec/smoothness while writing its normals at 100% intensity on pixels where alpha is 100%? My major gripe is the inability to avoid wiping the preexisting albedo/spec/smoothness info in areas where I add my decals. If I can avoid that, killing only the preexisting normals to write new ones in the alpha masked areas, it will still be a pretty huge leap forward.

    How can I use finalcolor and/or finalgbuffer to achieve that?
     
  9. Dolkar

    Dolkar

    Joined:
    Jun 8, 2013
    Posts:
    576
    To my knowledge, finalcolor and finalgbuffer are supposed to be used in the main pass as well. Their purpose is just to provide a way to mess around with the final values that are about to be written into the targets, after all the lighting calculations, encoding and packing.

    To answer your second question, I think you could actually use the above functions for that. If you use alpha blending with MRT, the output to each target is affected by its own alpha channel. So if only the normal map output has a non-zero alpha, it should leave all the other targets alone, allowing you to modify just the normals.
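    So in the finalgbuffer function, something along these lines might work (untested sketch, argument names taken from the terrain splat shaders):

    Code (csharp):
        void DecalGBuffer (Input IN, SurfaceOutputStandard o,
            inout half4 diffuse, inout half4 specSmoothness,
            inout half4 normal, inout half4 emission) {
            // with Blend SrcAlpha OneMinusSrcAlpha on all targets,
            // zero alpha means "keep whatever is already in this buffer"
            diffuse.a = 0;
            specSmoothness.a = 0;
            emission.a = 0;
            // and full alpha overwrites the target where the decal is opaque
            normal.a = o.Alpha;
        }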
     
  10. bac9-flcl

    bac9-flcl

    Joined:
    Dec 5, 2012
    Posts:
    786
    That's great to hear! One thing that worries me, though, is how finalgbuffer would work for an alpha shader. Wouldn't a transparent shader automatically become a forward shader, which would then mean that finalgbuffer won't work at all? As far as I can see, among transparent ones only decal:add shaders stay in deferred - and as far as I understand, it's impossible to do a normal mapped decal:add shader, because it just adds the normal output to the normal GBuffer and breaks the normals in there.
     
  11. Dolkar

    Dolkar

    Joined:
    Jun 8, 2013
    Posts:
    576
    Hmm, if that's true, you'd have to use a classic vert/frag shader for that, then, with tags and names that convince Unity to draw it as a regular deferred object. A good way to do that would be to first make a surface shader that does what you want, including changing the alphas in finalgbuffer, but actually opaque, and then open the generated vert/frag shader. Once there, you can remove all the passes except the deferred one and change its blend mode to alpha blending.

    Alternatively, you could use a command buffer to write just into the normal map buffer as a single target after the g-buffer pass.
     
  12. bac9-flcl

    bac9-flcl

    Joined:
    Dec 5, 2012
    Posts:
    786
    To be honest, I had a very hard time understanding the released command buffer examples: they do far, far more than what I'm trying to do here, with complex projections in the shaders, a big set of custom components doing some obscure object juggling, strange blit operations, and so on, so it was hard to figure out how to actually use command buffers in a more specific and restrained way (e.g. just giving a normal to a shader). So yeah, if it's possible to do what I want with just one shader, I'll happily go that route.
     
  13. bac9-flcl

    bac9-flcl

    Joined:
    Dec 5, 2012
    Posts:
    786
    @Dolkar
    Can you take a look at this source and tell me whether I'm doing it right?

    Alpha blended attempt (obviously gets rendered in forward and ignores finalgbuffer, making it impossible to hide the albedo/spec/smoothness output; uses ForwardAdd for all lighting, so it performs pretty poorly):
    http://hastebin.com/raw/zeciguvini

    Alpha tested (outputs pure black for some reason, or outputs absolutely nothing below a certain cutoff value, no reaction to finalgbuffer changes):
    http://hastebin.com/raw/anenozuweh

    Using the decal:add option (outputs absolutely nothing, just full transparency - and outputs faded white around point lights, with nothing resembling the actual texture, if I test it by removing the multiplication of diffuse in finalgbuffer - but hey, at least that tells me it renders in deferred):
    http://hastebin.com/raw/jabacicoqe

    Using decal:blend option (seems to be absolutely identical to normal alpha blend, goes forward again):
    http://hastebin.com/raw/efatohazur

    What do I have to modify there to generate proper vert/frag you are talking about?
     
  14. Dolkar

    Dolkar

    Joined:
    Jun 8, 2013
    Posts:
    576
    The alpha test one looks like a good basis. Change the RenderType to Opaque, get rid of alphatest:_Cutoff and remove the finalcolor and finalprepass functions... This unfortunately won't work in light prepass, because the normals don't have their own buffer there. After you've made those changes, see what shader Unity generates out of that.

    Oh, and normal *= o.Alpha should probably be normal.a = o.Alpha.
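    In other words, in the finalgbuffer function:

    Code (csharp):
        // instead of scaling the encoded normal (which skews it toward black):
        // normal *= o.Alpha;
        // blend through the alpha channel and leave the encoded vector intact:
        normal.a = o.Alpha;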
     
  15. bac9-flcl

    bac9-flcl

    Joined:
    Dec 5, 2012
    Posts:
    786
    Alright, thanks! And can you also elaborate on the "tags and names to convince Unity to draw it as regular deferred object" you have mentioned?
     
  16. Dolkar

    Dolkar

    Joined:
    Jun 8, 2013
    Posts:
    576
    I don't know them off the top of my head... Unity generates them automatically in the resulting vert/frag shader. You just need to remove all passes but the deferred one and change its blend mode.
     
  17. Reanimate_L

    Reanimate_L

    Joined:
    Oct 10, 2009
    Posts:
    2,383
    Are you trying to replicate the same technique that's been used by Obscura, or do you want different normal blending?
     
  18. bac9-flcl

    bac9-flcl

    Joined:
    Dec 5, 2012
    Posts:
    786
    Ideally, yes, I want the overlay normal blending I outlined in the first posts. But I'm told it's impossible to implement unless both your decal and your surface are rendered from two UVs in a single shader (since that idea requires knowing the contents of the GBuffer at exactly the same time you are writing to it, which just won't work). So I asked another artist who has actually tried CE decals and the UE4 decal shader by Obscura, and toned down my expectations. As it turns out, what I want to accomplish now (just traditional normal blending, but independent of albedo blending and the rest) is exactly how the decal shaders in those engines work. So I guess post #8 describes what the shader from the door sample by Obscura is doing, with no significant differences - well, beyond the fact that UE4 has fancy per-output blending out of the box.
     
  19. bac9-flcl

    bac9-flcl

    Joined:
    Dec 5, 2012
    Posts:
    786
    Okay, I used exclude_path attributes (e.g. exclude_path:forward) and some other attributes (noshadow and noforwardadd) to prevent the decal from generating stuff I won't need. I then checked the generated vert/frag code and found only two passes there - deferred and Meta (that's for Enlighten baking, with no runtime effect as far as I know). So I assume there is nothing to clean up in that generated code beyond changing the RenderType to Transparent. Well, unfortunately, I still get exactly the same issue I had with the alpha test surface shader: there is absolutely nothing in the output - it's opaque and pitch black no matter the textures, everywhere from albedo to normals. Here is a gif showing how it looks in the shaded and all deferred debug modes of the scene view vs. a Standard shader:



    And here is the full generated code (I cleaned up the tabs and spacing there a bit to make it readable):

    http://hastebin.com/udiyuvipuw.avrasm

    Do you have any ideas about a potential issue causing this black output?
     
  20. Reanimate_L

    Reanimate_L

    Joined:
    Oct 10, 2009
    Posts:
    2,383
    Something like this??
     

    Attached Files:

  21. bac9-flcl

    bac9-flcl

    Joined:
    Dec 5, 2012
    Posts:
    786
    It's hard to make out from this example how exactly the normals blend together, but I assume they overlay like in the descriptions from the first posts - if so, then hell yeah, that's what I wanted. Except without the albedo/spec/smoothness contribution (if that's where the black fill is coming from), of course. How did you achieve that?
     
  22. Reanimate_L

    Reanimate_L

    Joined:
    Oct 10, 2009
    Posts:
    2,383
    Yeah, I added the color on purpose, since it's kinda hard to see the effect :p.
    Not sure if this is what you're looking for, but here's the code.
    Code (csharp):
        Shader "Custom/DecalProject" {
            Properties {
                _Color ("Color", Color) = (1,1,1,1)
                _MainTex ("Albedo (RGB)", 2D) = "white" {}
                _BumpMap ("Normalmap", 2D) = "bump" {}
                _BumpScale ("BumpScale", Float) = 1
                _Glossiness ("Smoothness", Range(0,1)) = 0.5
                _Metallic ("Metallic", Range(0,1)) = 0.0
                _Cutoff ("cutout", Float) = 0
            }
            SubShader {
                Tags { "RenderType"="Opaque" }
                LOD 200

                CGPROGRAM
                // Physically based Standard lighting model, and enable shadows on all light types
                #pragma surface surf StandardSpecular fullforwardshadows alpha
                #include "UnityCG.cginc"
                // Use shader model 3.0 target, to get nicer looking lighting
                #pragma target 3.0

                sampler2D _MainTex;
                sampler2D _BumpMap;
                sampler2D _CameraGBufferTexture2;

                struct Input {
                    float2 uv_MainTex;
                    float4 screenPos;
                };

                half _Glossiness;
                half _Metallic;
                fixed4 _Color;
                half _BumpScale, _Cutoff;

                void surf (Input IN, inout SurfaceOutputStandardSpecular o) {
                    // Albedo comes from a texture tinted by color
                    fixed4 c = tex2D (_MainTex, IN.uv_MainTex) * _Color;
                    fixed3 n = UnpackScaleNormal(tex2D(_BumpMap, IN.uv_MainTex), _BumpScale);
                    fixed3 GN = tex2D(_CameraGBufferTexture2, IN.screenPos.xy / IN.screenPos.w).rgb;
                    //clip(c.a - _Cutoff);
                    o.Albedo = _Color;
                    // Metallic and smoothness come from slider variables
                    o.Specular = _Metallic;
                    o.Smoothness = _Glossiness;
                    //o.Normal = BlendNormals(n, GN);
                    o.Normal = (GN + n);
                    o.Alpha = c.a;
                }
                ENDCG
            }
            FallBack "Diffuse"
        }
    Umm, sorry, I think I messed something up with the latest code; I'll try to get the correct normal buffer for the normals.

    EDIT: Okay, fixed :p
     
    Last edited: Sep 23, 2015
  23. Reanimate_L

    Reanimate_L

    Joined:
    Oct 10, 2009
    Posts:
    2,383
    BTW, just sample the existing GBuffer if you want to use the existing albedo/spec.
    Edit:
    On second thought, I don't think there's any easy way to do this except modifying the buffer directly :/
     
    Last edited: Sep 23, 2015
  24. bac9-flcl

    bac9-flcl

    Joined:
    Dec 5, 2012
    Posts:
    786
    @rea
    Okay, I'm not sure I understand some points fully:

    • Why is this shader not suffering from the GBuffer fetching issues? Every time I sampled the GBuffer in a similar shader, half the time it returned black, or mangled garbage, or a random texture. Is it because this shader is forward, not deferred?
    • Why is output of this shader not present in the debug modes of the Scene view? Is it because they are rendered before forward shaders are rendered?
    • What is the purpose of the alpha attribute and clipping variable if you are not using them at all?
    • I suspect that the normals are transformed incorrectly and/or added to the normal GBuffer incorrectly, because lighting on them is vastly different from lighting on the underlying surface, and a flat tangent space normal is not actually lit like a flat surface with this shader. I also suspect that any blending operation you use over the normal GBuffer has to be wrapped in a normalize. The issue is probably this: this is a forward shader, so deferred rendering is not used at all, so the output struct expects normals in tangent space format, so you actually need to transform the contents of the second GBuffer into the tangent space of our surface before you can mix them with the unpacked normal and output the result to the standard struct.
    • Even if I remove both textures and normal output and only output albedo from gbuffer0, spec from gbuffer1.xyz and smoothness from gbuffer1.w, there is still a slight discrepancy in the result: see how decal areas are slightly darker despite duplicating the gbuffer contents exactly. No idea how to fix this:


    ________________________

    @Dolkar
    As the example above is a forward shader with all associated performance and limitations, my question about generated code is still relevant - if you have the time to answer it, please do. :)
     
  25. Dolkar

    Dolkar

    Joined:
    Jun 8, 2013
    Posts:
    576
    The RenderType should stay Opaque. What that tag controls is not the blend mode, but when the material is rendered in the pipeline and what replacement shaders are used (for shadows, for example). What you need to do instead is change the blend mode directly by adding Blend SrcAlpha OneMinusSrcAlpha inside the pass. As long as the alpha output is set correctly, that should work.
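    So the deferred pass of the generated shader would start roughly like this (tag names from memory - double-check them against what Unity actually generates):

    Code (csharp):
        Pass {
            Name "DEFERRED"
            Tags { "LightMode" = "Deferred" }
            Blend SrcAlpha OneMinusSrcAlpha

            CGPROGRAM
            // ... the generated deferred vertex/fragment code stays as-is
            ENDCG
        }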
     
  26. bac9-flcl

    bac9-flcl

    Joined:
    Dec 5, 2012
    Posts:
    786
    Oh damn! Looks like vert/frag conversion is not required at all, as everything seems to work with a surface shader! Thanks a lot :D

    I applied your advice, keeping the RenderType Opaque while adding the Blend, but it still did not work, keeping all output completely blank for some bizarre reason. And then, by pure accident, I decided to remove the line multiplying the output to the emissive GBuffer (#3) by 0. And boom, everything started working!



    I have absolutely no idea why, but tampering with output to emissive GBuffer seems to completely kill any output to normals, albedo and other buffers.

    I have only one issue left, though. Some of the UV islands never receive any normal mapping at all, as you can see from the screenshot above. I'm pretty dumbfounded by this. I double-checked the UVs and other stuff - it's all grabbed correctly, otherwise the perfect islands, like the circular seam, wouldn't work. What can cause the normal output to stretch or become completely flat on some faces while, at the same time, the albedo output using the very same uv_MainTex for its tex2D stays perfectly mapped?

    Here is the current version of the shader:
    http://hastebin.com/raw/kacakiremi
     
    Last edited: Sep 24, 2015
  27. bac9-flcl

    bac9-flcl

    Joined:
    Dec 5, 2012
    Posts:
    786
    Okay, it seems to be fully working! I modified it to allow separately configured albedo, spec/roughness, normal and emission contribution intensity.



    Here is full source:
    http://hastebin.com/raw/itopeyakuq

    I have a few questions left, though, so if @Dolkar, @rea or anyone else has info on this, please share it:
    • Where is ambient occlusion in the GBuffer?
    • What exactly does GBuffer3 contain, and what should I do with it in finalgbuffer? I mean, I see that the documentation lists it as an "ARGB32 (non-HDR) or ARGBHalf (HDR) format: Emission + lighting + lightmaps + reflection probes buffer" that is "logarithmically encoded to provide greater dynamic range", but I have no idea how to use it in the context of modulating surface shader output. Multiplying it by values ranging from 0 to 1 has some extremely weird effects, for example drastically changing the visibility of the albedo contribution, or leaving some of the albedo texture visible even when albedo is completely killed in the finalgbuffer method.
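    One thing that might be related: my reading of the docs is that in non-HDR mode the emission buffer is logarithmically encoded, roughly like this (an unverified guess on my part):

    Code (csharp):
        // my understanding of the non-HDR GBuffer3 encoding (unverified):
        half4 encoded = exp2(-emission);   // what gets written to the buffer
        half4 decoded = -log2(encoded);    // how it's presumably read back
        // so multiplying the *encoded* value by 0..1 does not scale emission linearly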
    Actually, I'm a bit confused about the relationship between the surface output struct components and the four components the finalgbuffer function can modify. Can someone explain how the albedo, metalness and smoothness outputs interact, how the albedo texture somehow ends up visible when nothing but the emission variable is left visible in finalgbuffer, and so on? Here are a few screenshots with different configurations of that shader.

    Only the emission output is left:



    Only the normal output is left:



    Only the normal and emission outputs are left:



    Only the specSmoothness output is left:



    Only the specSmoothness and emission outputs are left:



    All outputs are active (diffuse, specSmoothness, normal, emission):

     
  28. Reanimate_L

    Reanimate_L

    Joined:
    Oct 10, 2009
    Posts:
    2,383
    The alpha and clipping attributes in my shader are unused; I just forgot to remove them :p
    sampler2D _CameraGBufferTexture0; // Diffuse RGB and Occlusion A
    sampler2D _CameraGBufferTexture1; // Specular RGB and Roughness/Smoothness A
    sampler2D _CameraGBufferTexture2; // World Normal RGB
    uniform sampler2D _CameraReflectionsTexture; // Deferred reflection buffer

    And it seems you already find the correct buffer blending
     
  29. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    7,683
    http://docs.unity3d.com/Manual/RenderTech-DeferredShading.html

    The emission value is where any self illumination or ambient lighting is stored. It's also reused after the creation of the gbuffers as the render destination for lights. Basically the reason you don't see it in the shaders that have _CameraGBufferTexture# listed is because what would be #3 is what they're rendering to.
     
  30. bac9-flcl

    bac9-flcl

    Joined:
    Dec 5, 2012
    Posts:
    786
    Can someone recommend a solution for blending the normals correctly? As far as I can see, Blend SrcAlpha OneMinusSrcAlpha results in some weird non-normalized vectors appearing in the normal RT at alpha values between 0 and 1. Unfortunately, it's not possible to get rid of them by authoring an alpha without gradients, as those vectors will still appear on some pixels due to texture filtering. Here are a few examples:



    And here is an example with a completely flat tangent space normal, illustrating how the issue happens even when there is no difference between decal normal and underlying RT normal:

     
  31. GoGoGadget

    GoGoGadget

    Joined:
    Sep 23, 2013
    Posts:
    668
    Just in relation to CameraGBuffer3: it's borked in HDR mode at the moment. Referencing it in HDR seems to just give the diffuse G-Buffer texture (or something? From the documentation it seems like that's how it's meant to work in HDR, but I don't see the logic behind that; it seems completely pointless, with no use cases), and Unity will spit an error at you if you try to use it with BuiltInRenderTextureType.xxx
     
  32. Undertaker-Infinity

    Undertaker-Infinity

    Joined:
    May 2, 2014
    Posts:
    62
    Hi

    For the normal blend artifact, I was using DirectX's slerp (spherical lerp) to blend, but that doesn't seem to work anymore
     
  33. bac9-flcl

    bac9-flcl

    Joined:
    Dec 5, 2012
    Posts:
    786
    @Undertaker Infinity
    Err, both lerp and slerp would have worked perfectly well with my flat normal example (second image), because a lerp between two identical vectors yields a correct vector across the whole 0-1 factor range. But the whole point is that I don't have any information about the normal previously occupying that pixel in the GBuffer, so I have no control over the blending beyond setting the alpha and the Blend mode. Am I missing something, and is there a way to supply a blending function explicitly (ideally per GBuffer), which would allow me to use lerp?
     
  34. Dolkar

    Dolkar

    Joined:
    Jun 8, 2013
    Posts:
    576
    Well, regular alpha blending is basically a lerp based on the alpha. So you're using it already.
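    Spelled out: Blend SrcAlpha OneMinusSrcAlpha computes src * a + dst * (1 - a), which is exactly lerp(dst, src, a). A quick sanity check in Python (purely illustrative, not shader code):

```python
def lerp(a, b, t):
    return a * (1 - t) + b * t

def alpha_blend(src, dst, alpha):
    # Blend SrcAlpha OneMinusSrcAlpha: result = src * a + dst * (1 - a)
    return src * alpha + dst * (1 - alpha)

src, dst, a = 0.8, 0.2, 0.25
print(alpha_blend(src, dst, a))  # 0.35
print(lerp(dst, src, a))         # 0.35 -- identical
```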
     
  35. bac9-flcl

    bac9-flcl

    Joined:
    Dec 5, 2012
    Posts:
    786
    That's not what I'm seeing in the second screenshot here. I'm outputting a flat normal there; shouldn't the result of a straight lerp between the existing GBuffer normal and the new normal be valid at any alpha value? If I output e.g. (0, 1, 0) and the existing pixel contains (0, 1, 0), then the result of the blending should be (0, 1, 0) even at intermediate alpha values. That's not what I get at all: as you can see in the GBuffer view on the first screenshot, every pixel with an alpha value between 0 and 1 gets a very weird, completely invalid "gray" normal.
     
  36. Dolkar

    Dolkar

    Joined:
    Jun 8, 2013
    Posts:
    576
    That definitely shouldn't be happening. It's a weighted sum... if both values are the same, then the result should be as well, regardless of the alpha value. If they are different, though, then yes, the result won't be normalized. (0, 1, 0) * 0.5 + (1, 0, 0) * 0.5 = (0.5, 0.5, 0). But that shouldn't be a problem because I'm pretty sure the deferred lighting shader normalizes the normal map input anyway.

    Could you make a quick image effect that displays the contents of the normal buffer to see what's actually happening to them?
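    The weighted sum described above, sketched out numerically (again in Python, just as an illustration; the renormalization at the end is what the deferred lighting pass is presumably doing anyway):

```python
import math

def blend(n1, n2, a):
    # Per-channel lerp: the same thing fixed-function alpha blending does.
    return tuple(x * a + y * (1 - a) for x, y in zip(n1, n2))

def normalize(v):
    length = math.sqrt(sum(x * x for x in v))
    return tuple(x / length for x in v)

# Identical normals: the lerp leaves them untouched at any alpha.
print(blend((0, 1, 0), (0, 1, 0), 0.25))   # (0.0, 1.0, 0.0)

# Different normals: the result is no longer unit length...
mixed = blend((0, 1, 0), (1, 0, 0), 0.5)
print(mixed)                               # (0.5, 0.5, 0.0)

# ...until it is renormalized.
print(normalize(mixed))                    # ~(0.707, 0.707, 0.0)
```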
     
  37. bac9-flcl

    bac9-flcl

    Joined:
    Dec 5, 2012
    Posts:
    786
    @Dolkar
    Not sure if I need an image effect; there are scene view modes that show the contents of all deferred RTs save for emission.

    Hmm, I'm just noticing this now in a new environment lit with GI, but I have the same issue with very weird halos at alpha values between 0.01 and 0.99 in albedo, spec and all other outputs too. It's not just normals. It seems to be linked to emission, or is at least at its worst with emission, because emission turns completely wrong the moment you attempt to touch it in a finalgbuffer function.

    Here is the finalgbuffer code for context for the sliders in the next gifs:
    Here is how emission blends:


    Here is how normals blend (again, as a reminder: the same issue happens with a completely flat normal map, and in the normal map used in this example the alpha gradient starts only in flat areas, so the edge artifact cannot be coming from a difference in normals between source and destination):


    Same deal with specSmoothness, although it's harder to show in a low-resolution GIF.

    Another strange quirk that might point someone to an answer: when the smoothness value I try to output is lower than the smoothness of the underlying pixel in the GBuffer, I'm unable to overwrite it at all; my output just fades toward the background smoothness. Pretty weird. Maybe someone will recognize that as an issue specific to an incorrectly set up blend mode, or something.
     
  38. Marco-Sperling

    Marco-Sperling

    Joined:
    Mar 5, 2012
    Posts:
    570
    Have you tried other blend modes? One OneMinusSrcAlpha for example?
     
  39. bac9-flcl

    bac9-flcl

    Joined:
    Dec 5, 2012
    Posts:
    786
    That's exactly the mode I'm using, Blend SrcAlpha OneMinusSrcAlpha.
     
  40. Marco-Sperling

    Marco-Sperling

    Joined:
    Mar 5, 2012
    Posts:
    570
    Blend SrcAlpha OneMinusSrcAlpha is common alpha blending.
    Blend One OneMinusSrcAlpha is often used for premultiplied alpha blending. I suggested that mode since you are multiplying a lot of values inside the finalgbuffer function.

    edit: but then again, I don't fully understand that finalgbuffer function. I wouldn't multiply normals; I would add them together and renormalize afterwards. Why it seems to work eludes me.
    When our project upgrades to 5.2 I would love to have another look at this decals stuff. Until then I can only lurk and learn from your work :)
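    For what it's worth, the add-and-renormalize idea might look something like this; a minimal sketch in Python rather than shader code, with the alpha weighting being my own assumption rather than anything from the shader in this thread:

```python
import math

def normalize(v):
    length = math.sqrt(sum(x * x for x in v))
    return tuple(x / length for x in v)

# Rough sketch of "add the normals, then renormalize": offset the base
# normal by the detail normal's tangent-plane components, weighted by alpha.
def add_normals(base, detail, alpha):
    blended = (base[0] + detail[0] * alpha,
               base[1] + detail[1] * alpha,
               base[2])
    return normalize(blended)

base = (0.0, 0.0, 1.0)             # flat surface normal
detail = (0.6, 0.0, 0.8)           # decal normal leaning along +x
print(add_normals(base, detail, 0.0))  # (0.0, 0.0, 1.0) -- base untouched
print(add_normals(base, detail, 1.0))  # ~(0.514, 0.0, 0.857)
```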
     
    Last edited: Sep 28, 2015
  41. Dolkar

    Dolkar

    Joined:
    Jun 8, 2013
    Posts:
    576
    That's actually the issue. I already suggested earlier not to use premultiplied alpha together with regular alpha blending. You can have one of the two, but not both:
    Code (CSharp):
    1. Blend One OneMinusSrcAlpha
    2. ...
    3. normal *= o.Alpha * _ContributionNormal;
    Code (CSharp):
    1. Blend SrcAlpha OneMinusSrcAlpha
    2.  
    3. normal.a = o.Alpha * _ContributionNormal;
    When you're mixing them together, it's clear why you're getting incorrect results.
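    To spell out why mixing the two goes wrong: multiplying the output by alpha in the finalgbuffer function and then letting SrcAlpha multiply it again weights the decal contribution by alpha squared, while the background keeps its full (1 - alpha) weight, so the two no longer sum to 1 at intermediate alphas. That is where the gray halos come from. A quick numeric check in Python:

```python
def mixed(src, dst, a):
    # Bug: premultiplied output *and* Blend SrcAlpha OneMinusSrcAlpha,
    # so the source gets multiplied by alpha twice.
    return (src * a) * a + dst * (1 - a)

def correct(src, dst, a):
    # One of the two valid options: straight alpha blending
    # (Blend SrcAlpha OneMinusSrcAlpha, no premultiplication).
    return src * a + dst * (1 - a)

# Identical source and destination, so any sane blend should return 1.0:
print(correct(1.0, 1.0, 0.5))  # 1.0
print(mixed(1.0, 1.0, 0.5))    # 0.75 -- the value dips along the alpha gradient
```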
     
  42. bac9-flcl

    bac9-flcl

    Joined:
    Dec 5, 2012
    Posts:
    786
    Oh damn, that makes so much sense! That's what you get when pesky artists try to make the shaders. :D
    Thank you, you saved me!



    Couple of remaining questions, though. Let's modify the finalgbuffer and surf functions to allow a bit more flexible blending:

    Code (csharp):
    1.  
    2. void surf (Input IN, inout SurfaceOutputStandard o)
    3. {
    4.     fixed4 main = tex2D(_MainTex, IN.uv_MainTex);
    5.     fixed3 normal = UnpackNormal(tex2D(_BumpMap, IN.uv_BumpMap));
    6.  
    7.     o.Albedo = main.x * lerp (1, main.z, _ContributionCavity);
    8.     o.Normal = normal;
    9.     o.Metallic = _Metalness;
    10.     o.Smoothness = saturate (lerp (_SmoothnessLow, _SmoothnessHigh, main.y));
    11.     o.Occlusion = lerp (1, main.z, _ContributionOcclusion);
    12.     o.Alpha = lerp (main.w, 1, _DisableAlpha);
    13. }
    14.  
    15. void DecalFinalGBuffer (Input IN, SurfaceOutputStandard o, inout half4 diffuse, inout half4 specSmoothness, inout half4 normal, inout half4 emission)
    16. {
    17.     diffuse *= o.Alpha * _ContributionDiffuse;
    18.     specSmoothness.xyz *= o.Alpha * _ContributionSpecular;
    19.     specSmoothness.w *= o.Alpha * _ContributionSmoothness;
    20.     normal *= o.Alpha * _ContributionNormal;
    21.     emission *= o.Alpha * _ContributionEmission;
    22. }
    23.  
    Here is the MainTex R channel, which is used in o.Albedo (alternatively, I sometimes just use a solid color property):



    Here is the MainTex G channel, which is used in o.Smoothness (to use whole range of black to white values, I don't store smoothness explicitly and instead simply author a mask between low and high values, which are then separately configured in a material, as you can see here):



    Here is the MainTex B channel, which is used in o.Albedo and o.Occlusion (depending on the size of the decal or the visual style of a game, it might be appropriate to output the ambient occlusion map either as true occlusion - only in o.Occlusion - or as cavity - by multiplying it over albedo):



    Here is the MainTex A channel, which is used in o.Alpha:



    And finally, here is the normal map:



    And just in case this is relevant, I'm testing the shader in a scene without baked GI, with realtime GI, with the Directional Specular GI mode, and with a point light source covering some of the decals. With this setup, I'm wondering about a couple of questions:

    • Why is the Occlusion contribution only visible when the Diffuse contribution is set to 1, and only under direct light, never under GI? Shouldn't it be completely the opposite? Occlusion should disappear under direct light, after all. Not sure if GI bounced lighting counts as direct light in Unity's interpretation, but I assume it doesn't.
    • When underlying surface smoothness is 0, why is Normal contribution alone (when all other contribution factors are set to 0) becoming visible under direct light and not under GI?
    • When underlying surface smoothness is 0, why is Normal contribution becoming invisible when Diffuse contribution is set to 1 and Metalness is set to 1? I assume because Specular contribution must always be identical to Diffuse contribution, because that's where o.Albedo goes for metals.
    • When underlying surface smoothness is 0, why is Normal contribution actually becoming visible under GI when Smoothness contribution is also set to 1?
    • When underlying surface smoothness is 0, why is Normal contribution actually becoming visible under GI when Emission contribution is also set to 1?
    • When underlying surface smoothness is NOT 0, why is lone Normal contribution set to 1 suddenly becoming enough to show normal mapping? I suspect that there are two ways to notice normal mapping in action: through its influence on reflections, which are applied after finalgbuffer, and through its influence on lighting, which is applied after finalgbuffer for direct lighting and before finalgbuffer for GI. Rough materials don't get any reflections, so I'm unable to see normal influence on them under GI without Emissive contribution. Smooth materials get reflections, so normals create visible differences even without Emissive contribution. Is that correct?
    • Why is Emission contribution alone (when all other contribution factors are set to 0) giving an impression of applied normals when no actual altered normals are being written to GBuffer in that case?
    • Should I actually be giving a user an option to modify Smoothness contribution separately from Specular contribution (specSmoothness.xyz vs. specSmoothness.w), or are they intricately linked in PBR and modifying only one is physically incorrect?
    • Should I actually be giving a user an option to modify Albedo separately from Specular contribution, or are they intricately linked in PBR and modifying only one is physically incorrect? I assume this is the case, because metals vs. dielectrics require both to contribute to render correctly.
    • Why is content of Specular part of RT2 (.xyz) changing in intensity when I modify the Smoothness in RT2 (.w)?
    • When an underlying surface has a Smoothness of, for example, 0.5, why am I unable to write less than 0.5 into Smoothness channel of GBuffer? For example, setting _SmoothnessLow to 0 and Smoothness contribution to 1 still gives me Smoothness of 0.5 according to scene view GBuffer mode.
    • Why are pixels with Smoothness value of 1 turning metallic decal specular completely black? This is somehow linked to Specular/Smoothness levels of underlying surface, giving an impression that beyond a certain point relative to underlying surface values, smoothness values of a decal cause specular values of a decal to go weird.
    • When an underlying surface is Metallic with colored specular, why am I unable to overwrite the color to grayscale when my decal is dielectric, even when every single channel is used and all contributions are set to 1? Also, when my decal is metallic too, I am only able to get a multiplied color (e.g. reddish blue from red background and blue decal specular) output into specular buffer - why is that? Smoothness intensity, specular brightness and specular color issues might all be linked - I guess something strange is happening with blending of specSmoothness RT.
     
    Last edited: Sep 28, 2015
  43. bac9-flcl

    bac9-flcl

    Joined:
    Dec 5, 2012
    Posts:
    786
    Questions above still stand, but in the meantime, here is a beauty shot! Huge thanks to Dolkar and Marco for helping me get so far!



    P.S.: The underlying surface is actually auto-unwrapped and is using another neat shader. All the wear, dust etc. isn't painted manually; it comes from one UV2-packed map containing baked info like curvature, driving some neat masking. Fancy gif.

    Oh, and there are no baked normals on that surface, it's a lowpoly with face area weighted vertex normals. That's pretty neat, I guess (one less map to store, perfect shading gradients).
     
    Last edited: Sep 29, 2015
  44. Marco-Sperling

    Marco-Sperling

    Joined:
    Mar 5, 2012
    Posts:
    570
    Looking really nice. Would you mind sharing that asset to further study this technique? I wonder about the face weighted vertex normals - is there a small bevel at the edges or does the face weighting do all the edge highlighting?
     
  45. Reanimate_L

    Reanimate_L

    Joined:
    Oct 10, 2009
    Posts:
    2,383
    Ooh great work, will you share the latest version or you gonna put it on asset store?
     
  46. bac9-flcl

    bac9-flcl

    Joined:
    Dec 5, 2012
    Posts:
    786
    Sure, I'll upload it soon I think. Here is how edges work - no normal maps, same vertex count as with the hard edges, free silhouette and shading!


    Doesn't feel right to me to sell something that wouldn't have existed without the Polycount threads on the subject of Star Citizen and Alien Isolation workflows, and without the answers from people here. I put some hours into this, of course, but in the end the shader is pretty simple, and most of the reason this was never done before was the lack of a finalgbuffer function in Unity prior to 5.2.
     
    Last edited: Sep 29, 2015
  47. Reanimate_L

    Reanimate_L

    Joined:
    Oct 10, 2009
    Posts:
    2,383
    No worries man.
     
  48. Martin_H

    Martin_H

    Joined:
    Jul 11, 2015
    Posts:
    3,954
    I haven't read the whole thread yet but I am thrilled that you are digging through all this and want to share your solution with the world. I've seen the thread on polycount and thought "this would be great to have in unity!" and here we are :D. Awesome work! Thanks a lot!
     
  49. Reanimate_L

    Reanimate_L

    Joined:
    Oct 10, 2009
    Posts:
    2,383
    @bac9-flcl btw have you tried adding vertex color for the decal albedo? Might be useful if you want a different color per decal while still keeping it one material
     
    Martin_H likes this.
  50. bac9-flcl

    bac9-flcl

    Joined:
    Dec 5, 2012
    Posts:
    786
    It can easily be done, but tbh I don't see the use for it. Uniform vertex-based tinting generally looks awful unless your decal depicts something like uniformly white paint, road markings, captions, etc. And if that's the sort of stuff you need to output in a decal, then you definitely need to override the underlying albedo and specular, and maybe even the normals - and if you need to do that, then you have no use for a specialized finalgbuffer shader anyway; you can just use a traditional alpha-blended shader with Offset -1, -1 to do the job.

    Unless I'm misunderstanding the idea. :)