Question: What are GBuffers?

Discussion in 'General Graphics' started by MatheusMarkies, May 31, 2021.

  1. MatheusMarkies

    Joined:
    Apr 16, 2017
    Posts:
    67
    Good morning! I need help understanding GBuffers. What exactly are they, and how can I access them through Unity? Is it something I need to create, or does the engine create them automatically?

    I had seen something like this on a forum. This is in the camera rendering code of a customizable render pipeline:
    Code (CSharp):

        void DrawVisibleGeometry(bool useDynamicBatching, bool useGPUInstancing)
        {
            //RenderTexture rt0 = RenderTexture.GetTemporary(camera.pixelWidth, camera.pixelHeight, 0, RenderTextureFormat.ARGB32);
            //RenderTexture rt1 = RenderTexture.GetTemporary(camera.pixelWidth, camera.pixelHeight, 0, RenderTextureFormat.ARGB32);
            //RenderTexture rt2 = RenderTexture.GetTemporary(camera.pixelWidth, camera.pixelHeight, 0, RenderTextureFormat.ARGB2101010);
            //RenderTexture rt3 = RenderTexture.GetTemporary(camera.pixelWidth, camera.pixelHeight, 24, RenderTextureFormat.DefaultHDR);

            //RenderBuffer[] colorBuffers = new RenderBuffer[4];
            //colorBuffers[0] = rt0.colorBuffer;
            //colorBuffers[1] = rt1.colorBuffer;
            //colorBuffers[2] = rt2.colorBuffer;
            //colorBuffers[3] = rt3.colorBuffer;
            //camera.SetTargetBuffers(colorBuffers, rt3.depthBuffer);

            var sortingSettings = new SortingSettings(camera)
            {
                criteria = SortingCriteria.CommonOpaque
            };
            var drawingSettings = new DrawingSettings(unlitShaderTagId, sortingSettings)
            {
                enableDynamicBatching = useDynamicBatching,
                enableInstancing = useGPUInstancing,
                perObjectData = PerObjectData.ReflectionProbes |
                    PerObjectData.Lightmaps | PerObjectData.ShadowMask |
                    PerObjectData.LightProbe | PerObjectData.OcclusionProbe |
                    PerObjectData.LightProbeProxyVolume |
                    PerObjectData.OcclusionProbeProxyVolume
            };

            //Shader.SetGlobalTexture("_CameraGBufferTexture0", rt0);
            //Shader.SetGlobalTexture("_CameraGBufferTexture1", rt1);
            //Shader.SetGlobalTexture("_CameraGBufferTexture2", rt2);
            //Shader.SetGlobalTexture("_CameraGBufferTexture3", rt3);

            drawingSettings.SetShaderPassName(1, litShaderTagId);

            var filteringSettings = new FilteringSettings(RenderQueueRange.opaque);

            context.DrawRenderers(cullingResults, ref drawingSettings, ref filteringSettings);

            context.DrawSkybox(camera);

            sortingSettings.criteria = SortingCriteria.CommonTransparent;
            drawingSettings.sortingSettings = sortingSettings;
            filteringSettings.renderQueueRange = RenderQueueRange.transparent;

            context.DrawRenderers(cullingResults, ref drawingSettings, ref filteringSettings);
        }
    When I tried this, performance dropped a lot.

    Full code (CameraRenderer):

    Code (CSharp):

        using System;
        using UnityEngine;
        using UnityEngine.Rendering;

        namespace MagicByte
        {
            public partial class CameraRenderer
            {
                const string bufferName = "RenderCamera";

                CommandBuffer buffer = new CommandBuffer { name = bufferName };

                //static int deepBufferId = Shader.PropertyToID("_CameraDeepBuffer");
                static int frameBufferId = Shader.PropertyToID("_CameraFrameBuffer");

                Decal decals;

                static ShaderTagId
                    unlitShaderTagId = new ShaderTagId("MBUnlit"),
                    litShaderTagId = new ShaderTagId("MBLit");
                static int cameraColorTextureId;

                ScriptableRenderContext context;
                Camera camera;

                CullingResults cullingResults;
                Lighting lighting = new Lighting();
                PostProcessingStack postProcessingStack = new PostProcessingStack();
                ComputeShaderStackBlit csBlit = new ComputeShaderStackBlit();

                public void Render(ScriptableRenderContext context, Camera camera, bool useDynamicBatching, bool useGPUInstancing, ShadowSettings shadowSettings)
                {
                    this.context = context;
                    this.camera = camera;

                    camera.renderingPath = RenderingPath.DeferredShading;
                    camera.allowMSAA = false;
                    camera.allowHDR = true;

                    context.SetupCameraProperties(camera);

                    PrepareBuffer();
                    PrepareForSceneWindow();
                    if (!Cull(shadowSettings.maxDistance))
                    {
                        return;
                    }

                    buffer.BeginSample(bufferName);
                    ExecuteBuffer();
                    lighting.Setup(context, cullingResults, shadowSettings);
                    buffer.EndSample(bufferName);

                    Setup();
                    DrawVisibleGeometry(useDynamicBatching, useGPUInstancing);
                    DrawUnsupportedShaders();

                    //DecalLoad decalLoad = camera.GetComponent<DecalLoad>();
                    //if (decalLoad)
                    //{
                    //  foreach (Decal decal in decalLoad.getDecalList())
                    //  {
                    //      decal.PreLoadDecal(this.camera);
                    //      decal.OnRederDecal(context);
                    //  }
                    //}

                    if (camera.TryGetComponent<PostProcessingLayer>(out PostProcessingLayer postProcessingLayer))
                    {
                        postProcessingStack.setRenderContext(context);

                        postProcessingLayer.OnRenderCamera();
                        postProcessingStack.postProcessingDrawing(postProcessingLayer.getEffects(), frameBufferId, camera);
                    }

                    ComputeShaderStack CCS = camera.GetComponent<ComputeShaderStack>();
                    if (CCS)
                    {
                        CCS.OnRederCamera();
                        csBlit.Setup(context, camera);
                        foreach (ExecuteComputeShader exe in CCS.getComputeShaderList())
                        {
                            exe.setCulling(cullingResults);
                            csBlit.RenderComputeShader(exe.OnRenderComputeShader(), exe.getAccumulation(), exe.getFilterMaterial());
                            exe.currentSample++;
                        }
                    }

                    DrawGizmos();
                    Cleanup();
                    Submit();
                }

                void Cleanup()
                {
                    lighting.Cleanup();
                    //if(PostProcessActive)
                    buffer.ReleaseTemporaryRT(frameBufferId);
                }

                bool Cull(float maxShadowDistance)
                {
                    if (camera.TryGetCullingParameters(out ScriptableCullingParameters p))
                    {
                        p.shadowDistance = Mathf.Min(maxShadowDistance, camera.farClipPlane);
                        cullingResults = context.Cull(ref p);
                        return true;
                    }
                    return false;
                }

                void Setup()
                {
                    context.SetupCameraProperties(camera);
                    CameraClearFlags flags = camera.clearFlags;

                    if (camera.TryGetComponent<PostProcessingLayer>(out PostProcessingLayer postProcessingLayer))
                    {
                        if (flags > CameraClearFlags.Color)
                        {
                            flags = CameraClearFlags.Color;
                        }

                        buffer.GetTemporaryRT(frameBufferId, camera.pixelWidth, camera.pixelHeight, 32, FilterMode.Bilinear, RenderTextureFormat.DefaultHDR); // Grab a render texture
                        buffer.SetRenderTarget(frameBufferId, RenderBufferLoadAction.DontCare, RenderBufferStoreAction.Store); // Render effects into it
                    }

                    buffer.ClearRenderTarget(flags <= CameraClearFlags.Depth, flags == CameraClearFlags.Color, flags == CameraClearFlags.Color ? camera.backgroundColor.linear : Color.clear);

                    buffer.BeginSample(SampleName);
                    ExecuteBuffer();
                }

                void Submit()
                {
                    buffer.EndSample(SampleName);
                    ExecuteBuffer();
                    try
                    {
                        context.Submit();
                    }
                    catch (Exception e) { }
                }

                void ExecuteBuffer()
                {
                    context.ExecuteCommandBuffer(buffer);
                    buffer.Clear();
                }

                void DrawVisibleGeometry(bool useDynamicBatching, bool useGPUInstancing)
                {
                    //RenderTexture rt0 = RenderTexture.GetTemporary(camera.pixelWidth, camera.pixelHeight, 0, RenderTextureFormat.ARGB32);
                    //RenderTexture rt1 = RenderTexture.GetTemporary(camera.pixelWidth, camera.pixelHeight, 0, RenderTextureFormat.ARGB32);
                    //RenderTexture rt2 = RenderTexture.GetTemporary(camera.pixelWidth, camera.pixelHeight, 0, RenderTextureFormat.ARGB2101010);
                    //RenderTexture rt3 = RenderTexture.GetTemporary(camera.pixelWidth, camera.pixelHeight, 24, RenderTextureFormat.DefaultHDR);

                    //RenderBuffer[] colorBuffers = new RenderBuffer[4];
                    //colorBuffers[0] = rt0.colorBuffer;
                    //colorBuffers[1] = rt1.colorBuffer;
                    //colorBuffers[2] = rt2.colorBuffer;
                    //colorBuffers[3] = rt3.colorBuffer;
                    //camera.SetTargetBuffers(colorBuffers, rt3.depthBuffer);

                    var sortingSettings = new SortingSettings(camera)
                    {
                        criteria = SortingCriteria.CommonOpaque
                    };
                    var drawingSettings = new DrawingSettings(unlitShaderTagId, sortingSettings)
                    {
                        enableDynamicBatching = useDynamicBatching,
                        enableInstancing = useGPUInstancing,
                        perObjectData = PerObjectData.ReflectionProbes |
                            PerObjectData.Lightmaps | PerObjectData.ShadowMask |
                            PerObjectData.LightProbe | PerObjectData.OcclusionProbe |
                            PerObjectData.LightProbeProxyVolume |
                            PerObjectData.OcclusionProbeProxyVolume
                    };

                    //Shader.SetGlobalTexture("_CameraGBufferTexture0", rt0);
                    //Shader.SetGlobalTexture("_CameraGBufferTexture1", rt1);
                    //Shader.SetGlobalTexture("_CameraGBufferTexture2", rt2);
                    //Shader.SetGlobalTexture("_CameraGBufferTexture3", rt3);

                    drawingSettings.SetShaderPassName(1, litShaderTagId);

                    var filteringSettings = new FilteringSettings(RenderQueueRange.opaque);

                    context.DrawRenderers(cullingResults, ref drawingSettings, ref filteringSettings);

                    context.DrawSkybox(camera);

                    sortingSettings.criteria = SortingCriteria.CommonTransparent;
                    drawingSettings.sortingSettings = sortingSettings;
                    filteringSettings.renderQueueRange = RenderQueueRange.transparent;

                    context.DrawRenderers(cullingResults, ref drawingSettings, ref filteringSettings);
                }
            }
        }
    I need to access them to create a Screen Space Reflections shader.
     
  2. bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,329
    They're something you use for deferred shading.

    Standard real-time rendering is called forward rendering. The short version is: when you render an object to the screen, you calculate the lighting for that object at the time you render it. Basically, what the object renders is the final result you see for that object (ignoring any post processing).

    Deferred rendering comes in a couple of different forms, but a common feature is the use of gbuffers and doing lighting separately from rendering the object. The very short explanation: when rendering an object, instead of rendering the final result, the object writes out its material properties to a set of screen space textures: albedo (the object's color), normals (either in world or view space), specular color, smoothness, position, etc. In Unity's case it also outputs ambient or baked lighting during the gbuffer pass. Lighting from real time lights is then done separately by rendering the lights as their own geometry (spheres for point lights, cones for spot lights, or the entire screen for directional lights) that reads from the gbuffers to get the necessary material and surface values.
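    To make that concrete, here is a minimal sketch of what the fragment shader of a gbuffer-writing pass can look like in HLSL, using the same four render target layout as Unity's built-in deferred shading (and as the four RenderTexture formats in the code above). The struct and parameter names are illustrative, not from a real Unity include:

    Code (HLSL):

        // Illustrative gbuffer-writing fragment shader using multiple render targets (MRT).
        // Layout mirrors Unity's built-in deferred shading; all names here are made up.
        struct GBufferOutput
        {
            float4 gBuffer0 : SV_Target0; // RGB: albedo,         A: occlusion
            float4 gBuffer1 : SV_Target1; // RGB: specular color, A: smoothness
            float4 gBuffer2 : SV_Target2; // RGB: world space normal packed into 0..1
            float4 gBuffer3 : SV_Target3; // RGB: emission plus ambient/baked lighting
        };

        GBufferOutput Fragment(float3 albedo, float occlusion, float3 specular,
                               float smoothness, float3 normalWS, float3 emission)
        {
            GBufferOutput output;
            output.gBuffer0 = float4(albedo, occlusion);
            output.gBuffer1 = float4(specular, smoothness);
            output.gBuffer2 = float4(normalWS * 0.5 + 0.5, 1.0); // remap -1..1 to 0..1
            output.gBuffer3 = float4(emission, 1.0);
            return output;
        }

    A later lighting (or SSR) pass then reads these four textures instead of re-rendering the objects.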

    The main advantage of deferred over forward is that deferred can (generally) support significantly more lights per object, as the lights and the objects are no longer tied together. The other benefit is that it makes modifying material properties with decals, or doing post processing that requires knowledge about the surface, much easier, since that information is already in easy-to-access textures.


    If you're using a forward rendering SRP and trying to add screen space reflections, you can, but you're going to have a bad time as it means you basically have to render both the forward rendering pass and the gbuffer pass. You'll probably want to look at moving to a fully deferred renderer at that point.



    Or look at the recent Doom games, which are forward renderers with SSR that works by reflecting the previously rendered frame.
     
  3. MatheusMarkies

    Joined:
    Apr 16, 2017
    Posts:
    67
    I'm looking to work with deferred rendering in my SRP. But how can I access "Gbuffers"?
     
  4. bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,329
    That's what those Shader.SetGlobalTexture lines in your code are for.

    First you need to render the gbuffers, then you need to assign the textures to the material, or set them as globals (as those lines do). Then you use the screen space position as the UVs to sample the gbuffers. A big part of implementing deferred rendering is the lighting passes, which need to access the gbuffers, so it'll become obvious how to do so once you've gotten that far.
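    As a sketch of that sampling step, a shader in a later pass (a deferred light or an SSR pass) might read the globals like this, assuming the _CameraGBufferTexture* textures were bound with Shader.SetGlobalTexture as in the commented-out code above. The interpolator name and vertex-side setup are illustrative:

    Code (HLSL):

        // Illustrative gbuffer sampling in a later full-screen or light-geometry pass.
        // Assumes _CameraGBufferTexture0..3 were set as globals on the C# side.
        sampler2D _CameraGBufferTexture0; // albedo + occlusion
        sampler2D _CameraGBufferTexture2; // packed world space normal

        float4 Fragment(float4 screenPos : TEXCOORD0) : SV_Target
        {
            // screenPos is assumed to come from ComputeScreenPos(clipPos) in the
            // vertex shader; the perspective divide yields 0..1 screen space UVs.
            float2 uv = screenPos.xy / screenPos.w;

            float3 albedo   = tex2D(_CameraGBufferTexture0, uv).rgb;
            float3 normalWS = tex2D(_CameraGBufferTexture2, uv).rgb * 2.0 - 1.0;

            // An SSR shader would reflect the view ray around normalWS and march it
            // in screen space; as a placeholder, just output the albedo.
            return float4(albedo, 1.0);
        }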
     
  5. MatheusMarkies

    Joined:
    Apr 16, 2017
    Posts:
    67
    When I was using this code, performance dropped a lot.
    Is this normal?
    Am I putting it in the right place? (Before rendering the meshes.)