Deferred Rendering Pipeline: I can't access the emission gbuffer (gbuffer3)!

Discussion in 'General Graphics' started by DryerLint, Dec 9, 2020.

  1. DryerLint

    Joined:
    Feb 7, 2016
    Posts:
    68
    Hey all,

    This is a really frustrating one for me.

    I've made a command buffer that is supposed to combine gbuffer3 (the emission channel) with some custom lighting effects. The only problem is gbuffer3 is always black, regardless of whether I write to it or not.

    All Materials in the scene use the Standard shader, and many of those materials have emission channels. I assume the Standard shader draws the emission channel to gbuffer3 during the first pass of the deferred pipeline, which is the opaque rendering pass.

    I've tried using Blit(source, dest) to copy gbuffer3 over to the camera target. I've tried writing a custom shader that accesses _CameraGBufferTexture3 and copies it to the screen. I've tried inserting the command buffer at AfterForwardOpaque, AfterGBuffer, BeforeReflections, BeforeLighting, AfterLighting, and even AfterEverything. Gbuffer3 is always empty.
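
    For reference, here is a stripped-down sketch of the kind of thing I'm doing (the component name is just a placeholder, and the real buffer does more than a straight copy):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Rendering;

    [RequireComponent(typeof(Camera))]
    public class EmissionDebug : MonoBehaviour
    {
        CommandBuffer _cmd;

        void OnEnable()
        {
            var cam = GetComponent<Camera>();
            _cmd = new CommandBuffer { name = "Copy GBuffer3 to screen" };

            // gbuffer3 should hold the emission written during the G-buffer pass,
            // but whatever I do it comes out black.
            _cmd.Blit(BuiltinRenderTextureType.GBuffer3, BuiltinRenderTextureType.CameraTarget);

            // I've also tried AfterForwardOpaque, AfterGBuffer, BeforeReflections,
            // BeforeLighting and AfterEverything here.
            cam.AddCommandBuffer(CameraEvent.AfterLighting, _cmd);
        }

        void OnDisable()
        {
            GetComponent<Camera>().RemoveCommandBuffer(CameraEvent.AfterLighting, _cmd);
            _cmd.Release();
        }
    }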

    Further details:
    1. I've tried it with both HDR on and off.
    2. I found a bgolus post where he suggested decoding gbuffer3 with exp(), to account for its logarithmic encoding (used when HDR is off), but no luck either way.
    3. The rendering path is set to Deferred in all quality levels.

    Quite simply all I want is a way to separate the albedo buffer from the emission buffer, whether this uses the deferred pipeline, the Standard shader, or whatever.

    I also discovered a weird quirk: my command buffer refuses to execute if I switch the main camera's 'HDR' field from 'Use Graphics Settings' to 'Off'. This happens whether or not HDR is enabled in Graphics Settings.

    Any help would be greatly appreciated.

    Thanks,
    Nick
     
  2. whitexroft

    Joined:
    Oct 22, 2012
    Posts:
    48
    Funny enough, GBuffer3 is not a separate render texture. It is merely a binding to the color buffer of the camera's render target. So if the camera has an explicitly assigned render texture, that render texture is the "emission" texture bound to SV_Target3.
    If there is no assigned render texture, it is a temporary render texture that the camera creates and eventually blits into the back buffer.
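
    In code terms, it is as if the G-buffer pass bound its color targets roughly like this (just a sketch to show the aliasing, not Unity's actual internal code):

    Code (CSharp):
    using UnityEngine.Rendering;

    static class GBufferSketch
    {
        // Sketch only: how the four G-buffer color targets line up when the camera is HDR.
        // Note that SV_Target3 is not its own texture; it is the camera's render target,
        // which later also accumulates the lighting.
        public static void BindGBuffer(CommandBuffer cmd)
        {
            var colors = new RenderTargetIdentifier[]
            {
                BuiltinRenderTextureType.GBuffer0,     // diffuse (RGB), occlusion (A)
                BuiltinRenderTextureType.GBuffer1,     // specular (RGB), smoothness (A)
                BuiltinRenderTextureType.GBuffer2,     // world-space normal
                BuiltinRenderTextureType.CameraTarget  // "gbuffer3": emission, then lighting
            };
            cmd.SetRenderTarget(colors, BuiltinRenderTextureType.CameraTarget);
        }
    }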
     
    DryerLint likes this.
  3. DryerLint

    Joined:
    Feb 7, 2016
    Posts:
    68
    I appreciate the reply and the interesting advice. It's a channel of the render target, right?

    Is there any convenient way to visualize the emission channel of the gbuffer without using the Frame Debugger? I noticed that the emission channel is conspicuously missing from the Draw Mode dropdown at the top left of the Scene View window. All other gbuffer channels are there.

    You mentioned the "eventual blitting to the back buffer". Is there any way to change the logic of how the emission channel is mixed with the albedo channel to produce the final rendered image? Or is this hardcoded into Unity's deferred pipeline code?

    Thanks!
     
  4. whitexroft

    Joined:
    Oct 22, 2012
    Posts:
    48
    As for visualizing it: it makes sense that you don't have that option in the Scene view, because the emission channel is the actual final picture. For example, when Unity renders transparent objects, that is the channel it renders into. You could adjust your command buffer so that it copies the currently active render target (BuiltinRenderTextureType.CameraTarget) instead of GBuffer3, using the BeforeForwardAlpha event, and see the "emission". Just don't blit from CameraTarget to CameraTarget, because that is undefined behaviour. That might actually be your issue, since you may have been copying from _CameraGBufferTexture3 to CameraTarget, and those share the same native binding.
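
    Something along these lines (a rough sketch; the names are made up):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Rendering;

    [RequireComponent(typeof(Camera))]
    public class GrabLightingBuffer : MonoBehaviour
    {
        CommandBuffer _cmd;

        void OnEnable()
        {
            var cam = GetComponent<Camera>();
            _cmd = new CommandBuffer { name = "Grab emission/lighting" };

            // Temporary RT sized to the camera, so we never blit CameraTarget onto itself.
            int grabId = Shader.PropertyToID("_GrabbedLighting");
            _cmd.GetTemporaryRT(grabId, -1, -1, 0, FilterMode.Bilinear, RenderTextureFormat.DefaultHDR);

            // Copy whatever the camera has rendered so far (emission + lighting).
            _cmd.Blit(BuiltinRenderTextureType.CameraTarget, grabId);

            // Expose it to shaders for whatever compositing you want to do afterwards.
            _cmd.SetGlobalTexture("_GrabbedLighting", grabId);

            cam.AddCommandBuffer(CameraEvent.BeforeForwardAlpha, _cmd);
        }

        void OnDisable()
        {
            GetComponent<Camera>().RemoveCommandBuffer(CameraEvent.BeforeForwardAlpha, _cmd);
            _cmd.Release();
        }
    }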


    The entire deferred renderer is open source; in Graphics Settings you can replace the built-in deferred shader with your own, based on the shader source you can download here: https://unity3d.com/get-unity/download/archive
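
    If you would rather do it from script, I believe the same override can be set through GraphicsSettings, something like:

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Rendering;

    public static class DeferredOverride
    {
        // Point the built-in deferred shading pass at your own copy of
        // Internal-DeferredShading.shader (taken from the downloaded shader source).
        public static void Apply(Shader customDeferred)
        {
            GraphicsSettings.SetShaderMode(BuiltinShaderType.DeferredShading, BuiltinShaderMode.UseCustom);
            GraphicsSettings.SetCustomShader(BuiltinShaderType.DeferredShading, customDeferred);
        }
    }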

    For example, here is what the built-in logic looks like:
    [attached screenshot: upload_2020-12-13_1-57-15.png]

    I don't know why your materials are not writing to emission, but I found where the Standard shader does it, in UnityStandardCore.cginc:
    [attached screenshot: upload_2020-12-13_2-5-22.png]

    There is also an _EmissionMap property, which is probably what ends up in GBuffer3, in UnityStandardInput.cginc:
    [attached screenshot: upload_2020-12-13_2-16-44.png]

    Anyway, you can try writing a simple surface shader, write to o.Emission there, and see how it behaves in the Frame Debugger.
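
    Something like the default surface shader template with an emission colour added would do (the shader name and properties are just placeholders):

    Code (ShaderLab):
    Shader "Custom/EmissionProbe"
    {
        Properties
        {
            _Color ("Color", Color) = (1,1,1,1)
            _MainTex ("Albedo (RGB)", 2D) = "white" {}
            _EmissionColor ("Emission", Color) = (1,0,0,1)
        }
        SubShader
        {
            Tags { "RenderType"="Opaque" }

            CGPROGRAM
            // Standard lighting model, so it goes down the same deferred path as the Standard shader.
            #pragma surface surf Standard fullforwardshadows
            #pragma target 3.0

            sampler2D _MainTex;
            fixed4 _Color;
            fixed4 _EmissionColor;

            struct Input
            {
                float2 uv_MainTex;
            };

            void surf (Input IN, inout SurfaceOutputStandard o)
            {
                fixed4 c = tex2D(_MainTex, IN.uv_MainTex) * _Color;
                o.Albedo = c.rgb;
                // A constant, obvious emission that should be easy to spot in the Frame Debugger.
                o.Emission = _EmissionColor.rgb;
                o.Alpha = c.a;
            }
            ENDCG
        }
        FallBack "Diffuse"
    }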
     
    DryerLint likes this.
  5. DryerLint

    Joined:
    Feb 7, 2016
    Posts:
    68
    Wow, thank you so much for that incredibly detailed reply.

    I am going to try your suggestion to capture the emission gbuffer by copying the CameraTarget at BeforeForwardAlpha.

    Another source of confusion that I neglected to account for is that the Scene camera uses a different event sequence; because of this, the Game view and the Scene view occasionally look different.
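
    While debugging I'll probably just make sure the buffer only gets attached to the Game camera, something along these lines (untested, and the name is made up):

    Code (CSharp):
    using UnityEngine;

    // Only attach the command buffer to actual game cameras, so the Scene view camera
    // (with its different event sequence) is left alone.
    static class CameraFilter
    {
        public static bool ShouldAttach(Camera cam)
        {
            return cam != null && cam.cameraType == CameraType.Game;
        }
    }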

    Thanks a lot for looking through the source of the default deferred renderer. You found some pretty interesting stuff in there. I'll try writing that surface shader you suggested and see what happens in the Frame Debugger.

    You've given me some good experiments to try out. Thanks!
     
    whitexroft likes this.