Blitting while not writing to depth buffer.

Discussion in 'Shaders' started by Incendiary-Games, Apr 27, 2015.

  1. Incendiary-Games

    Incendiary-Games

    Joined:
    Mar 24, 2015
    Posts:
    18
    Unity Version: 5.0.1f1 Personal
    Forward Rendering Path

    I have a full-screen image effect stored in a RenderTexture that I'm trying to blit to the screen using Graphics.Blit. One issue I'm running into is that the full-screen quad drawn by Graphics.Blit is writing to the depth buffer and messing up the effect. I want to disable depth buffer writes for this blit. I've tried making a custom shader with ZWrite Off and using it in a material passed to Graphics.Blit, but it doesn't appear to work: internally Graphics.Blit draws a quad, and the entire RenderTexture ends up with the depth value of that quad. Does anyone have any suggestions?

    Here is the shader (I also tried Hidden/BlitCopy):
    Code (CSharp):
    Shader "Custom/DepthCopy" {
        Properties {
            _MainTex ("Base (RGB)", 2D) = "" {}
        }
        SubShader {
            ZTest Always Cull Off ZWrite Off Fog { Mode Off }
            Pass {
                ZWrite Off
                CGPROGRAM
                #pragma vertex vert_img
                #pragma fragment frag
                #pragma fragmentoption ARB_precision_hint_fastest
                #include "UnityCG.cginc"

                uniform sampler2D _MainTex;

                fixed4 frag(v2f_img i) : COLOR {
                    fixed4 renderTex = tex2D(_MainTex, i.uv);
                    return renderTex;
                }

                ENDCG
            }
        }
    }
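
    For reference, a minimal sketch of how a material using this shader might be created and fed to Graphics.Blit; the class, the field names, and blitting from OnPostRender are illustrative assumptions, not part of the original post:
    Code (CSharp):
    using UnityEngine;

    // Illustrative sketch only: blit an already-rendered effect texture with the
    // ZWrite Off material above. effectRT and target are placeholder fields.
    [RequireComponent(typeof(Camera))]
    public class BlitEffectNoDepthWrite : MonoBehaviour
    {
        public RenderTexture effectRT;   // the stored full-screen effect
        public RenderTexture target;     // destination render texture
        private Material depthCopyMat;

        void Start()
        {
            // Assumes the shader above is included in the build.
            depthCopyMat = new Material(Shader.Find("Custom/DepthCopy"));
        }

        void OnPostRender()
        {
            // The pass has ZWrite Off, which is intended to keep the full-screen
            // quad from writing into the destination's depth buffer.
            Graphics.Blit(effectRT, target, depthCopyMat);
        }
    }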
     
    Last edited: Apr 28, 2015
  2. Incendiary-Games

    Incendiary-Games

    Joined:
    Mar 24, 2015
    Posts:
    18
  3. Zicandar

    Zicandar

    Joined:
    Feb 10, 2014
    Posts:
    388
    Don't have that depth buffer set when rendering?
     
  4. Incendiary-Games

    Incendiary-Games

    Joined:
    Mar 24, 2015
    Posts:
    18
    How would I accomplish that? Without getting too detailed, I have a camera that renders the scene once with an attached depth buffer. I'm then blitting that camera's render texture into another camera whose clear flags are set to Don't Clear. My goal is to have camera 2's render buffers contain the pixel and depth values of camera 1, and then, once they contain those values, render more objects with depth testing. Both Graphics.Blit and the following custom code overwrite the depth buffer with the quad. I need a cross-platform solution for Windows (DX11) and Android (OpenGL). I'm at the point where I'm pulling my hair out and will either move this to C++ code, where I have full control, or do something else.

    It needs to be compatible with Android's OpenGL ES 2.0. I had another thought: render to a second texture that encodes the depth buffer information, then read it out in a custom shader and set the fragment's depth values. Unfortunately, multiple render targets are only supported in OpenGL ES 3.0, and if I could use OpenGL ES 3.0 I'd just make a couple of extern calls to glBlitFramebuffer for the color and depth buffers...

    Code (CSharp):
    private void CopyDepth(RenderTexture source, RenderTexture destination, Material material)
    {
        var oldRT = RenderTexture.active;

        source.SetGlobalShaderProperty("_RenderTex");
        RenderTexture.active = destination;
        //Graphics.SetRenderTarget(destination.colorBuffer, throwaway.depthBuffer);

        GL.PushMatrix();
        GL.LoadOrtho();

        material.SetPass(0);

        // Render the full screen quad manually.
        GL.Begin(GL.QUADS);
        GL.TexCoord2(0.0f, 0.0f); GL.Vertex3(0.0f, 0.0f, 0.1f);
        GL.TexCoord2(1.0f, 0.0f); GL.Vertex3(1.0f, 0.0f, 0.1f);
        GL.TexCoord2(1.0f, 1.0f); GL.Vertex3(1.0f, 1.0f, 0.1f);
        GL.TexCoord2(0.0f, 1.0f); GL.Vertex3(0.0f, 1.0f, 0.1f);
        GL.End();

        GL.PopMatrix();

        RenderTexture.active = oldRT;
    }
     
    Last edited: Apr 28, 2015
  5. IndreamsStudios

    IndreamsStudios

    Joined:
    Jun 4, 2013
    Posts:
    9
    Is your image effect implementing OnRenderImage? If it does, you should be able to do this with a call to:

    Code (CSharp):
    Graphics.Blit(source, destination, material, 0);
    where the material uses the shader you provided earlier. If your problem is that you don't have a camera that draws directly to screen, I think you can use null as the destination RenderTexture to draw to screen (might need to update some camera settings to get that to work though).
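
    For context, a minimal OnRenderImage sketch along those lines; the class and field names are illustrative and not from the thread:
    Code (CSharp):
    using UnityEngine;

    // Illustrative sketch only: apply the material as a regular image effect on
    // the camera this script is attached to.
    [RequireComponent(typeof(Camera))]
    public class DepthCopyImageEffect : MonoBehaviour
    {
        public Material depthCopyMat; // material using the Custom/DepthCopy shader

        void OnRenderImage(RenderTexture source, RenderTexture destination)
        {
            // Pass 0 is the shader's only pass, which has ZWrite Off.
            Graphics.Blit(source, destination, depthCopyMat, 0);
        }
    }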
     
  6. Zicandar

    Zicandar

    Joined:
    Feb 10, 2014
    Posts:
    388
    Hmm, to draw to a target without setting a depth buffer I think you might want to use:
    http://docs.unity3d.com/ScriptReference/Graphics.SetRenderTarget.html
    Graphics.SetRenderTarget(destination, 0); //Maybe it should be null not 0
    But my main guess at this point is that you're forgetting that blits don't seem to copy depth+stencil buffers? (Or at least that's the impression I've been getting.)
    In that case I'd suggest not trying to blit between the cameras (I'm assuming these are normal in-scene cameras with sequential depths set, not Camera.Render calls?). Instead, leave the second camera on Don't Clear and use a command buffer to clear the color but not the depth of the render texture before normal rendering continues, or clear the color before passing it on to the second camera.
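
    A rough sketch of the command buffer idea: clearing only the color of the second camera's target before it renders. The CameraEvent and names here are assumptions, not from the post:
    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Rendering;

    // Illustrative sketch only: attach a command buffer to the second camera
    // that clears color but leaves depth untouched before it starts drawing.
    public class ClearColorKeepDepth : MonoBehaviour
    {
        public Camera secondCamera;

        void Start()
        {
            var cb = new CommandBuffer { name = "Clear color, keep depth" };
            // clearDepth = false, clearColor = true
            cb.ClearRenderTarget(false, true, Color.clear);
            secondCamera.AddCommandBuffer(CameraEvent.BeforeForwardOpaque, cb);
        }
    }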
     
  7. Incendiary-Games

    Incendiary-Games

    Joined:
    Mar 24, 2015
    Posts:
    18
    No. I have a callback registered with Camera.onPostRender that checks whether it's the camera attached to my image effect and, if so, blits to the other cameras. I also have the camera depth values properly set and have ensured this is the first camera that gets rendered. My image effect needs to process essentially half the scene (the objects on a specific layer), post-process them, then blit them to the other camera that picks up the remaining objects.

    I have three cameras. Objects in camera 1 and in the other two cameras have special depths that can mix and overlap: objects in camera 1 can be in front of or behind objects in camera 2 (Don't Clear) and camera 3 (Don't Clear). For specific reasons I can't do this effect in one pass at the end of rendering. I can set camera 1's render target to camera 2's and get it to work. However, if I blit from camera 2 to camera 3, it overwrites the depth buffer. I want to preserve camera 2's depth buffer for camera 3, and camera 2 cannot render into camera 3. In my setup, objects in camera 2 and camera 3 don't overlap in depth, so it's fine to carry the depth buffer from camera 2 over to camera 3. (I cannot render camera 1 twice, once into each camera, as that totally kills performance and the point of this image effect.)

    Code (CSharp):
    Graphics.Blit(source, destination, material, 0);
    I've tried Graphics.Blit with my material and it still writes to the depth buffer.

    I've tried Graphics.SetRenderTarget. It doesn't let me pass null for the depth buffer, since it takes a RenderBuffer struct and structs can't be null. I tried creating a throwaway RenderTexture with an attached depth buffer, and that didn't work either (I think it might have crashed the editor).

    Yup, they are normal scene cameras; nothing is calling Camera.Render in my project. Right now I'm blitting from a Camera.onPostRender callback; that is the approach I'm taking. I'm using command buffers to clear color but not depth. However, despite rendering with a material that has ZWrite Off, it still writes the quad's depth.
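
    For illustration, the callback-and-blit setup described above might look roughly like this; the camera and texture fields are placeholders:
    Code (CSharp):
    using UnityEngine;

    // Illustrative sketch only: after camera 1 renders, blit its processed
    // result into the target the later cameras draw into. Names are placeholders.
    public class PostRenderBlit : MonoBehaviour
    {
        public Camera effectCamera;        // camera 1, the image effect camera
        public RenderTexture effectRT;     // processed result of camera 1
        public RenderTexture sharedTarget; // target the later cameras render into
        public Material depthCopyMat;      // material with ZWrite Off

        void OnEnable()  { Camera.onPostRender += OnCameraPostRender; }
        void OnDisable() { Camera.onPostRender -= OnCameraPostRender; }

        void OnCameraPostRender(Camera cam)
        {
            if (cam != effectCamera)
                return;

            // The intent is for ZWrite Off to leave sharedTarget's depth alone.
            Graphics.Blit(effectRT, sharedTarget, depthCopyMat);
        }
    }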
     
    Last edited: Apr 29, 2015
  8. Zicandar

    Zicandar

    Joined:
    Feb 10, 2014
    Posts:
    388
    I'd suggest using RenderDoc to find your issue. How to use it with Unity:
    Start the editor from within RenderDoc, then enter play mode, and maximize the Game view, as that reduces the amount of clutter in the log.
    Then, at the bottom of the generated log, the Camera.Render for the Game view will show up :D (assuming you made sure it was the active swapchain to capture).
     
  9. Incendiary-Games

    Incendiary-Games

    Joined:
    Mar 24, 2015
    Posts:
    18
    Thank you. I'm trying out RenderDoc right now. I'm pretty confused: the final image it shows isn't what I see on screen at all; it actually looks closer to what I was intending with my image effect. It is capturing my game. The only thing is that the objects in there aren't being depth tested against the objects from the previous camera. I'm guessing Camera.onPostRender isn't the correct place for this effect. I'll keep playing with it.
     
    Last edited: Apr 30, 2015
  10. Incendiary-Games

    Incendiary-Games

    Joined:
    Mar 24, 2015
    Posts:
    18
    I moved it to the MonoBehaviour OnPostRender and OnPreRender of the other camera; neither worked. I'm also not seeing the quad draw at all in RenderDoc, which is suspicious.
     
  11. aubergine

    aubergine

    Joined:
    Sep 12, 2009
    Posts:
    2,864
    Move your depth rendering to OnPreCull.
     
  12. Incendiary-Games

    Incendiary-Games

    Joined:
    Mar 24, 2015
    Posts:
    18
    I tried OnPreCull too. Forgot to mention that, sorry.
     
  13. Incendiary-Games

    Incendiary-Games

    Joined:
    Mar 24, 2015
    Posts:
    18
    After struggling more, I've decided to post the full code of what I'm doing. It's an Oculus Rift virtual reality specific optimization I'm trying to accomplish. If anyone here is familiar with it, my code and setup are outlined here:
    https://forums.oculus.com/viewtopic.php?f=37&t=22762
     
  14. Incendiary-Games

    Incendiary-Games

    Joined:
    Mar 24, 2015
    Posts:
    18
    Bump. Still an issue.
     
  15. Alexey

    Alexey

    Unity Technologies

    Joined:
    May 10, 2010
    Posts:
    1,602
    That all sounds a bit weird (a blit tweaking the z buffer), so first of all: file a bug report with a small repro; we might just fix the bug, you know ;-)
    Second,
    if I understood you correctly, that's EXACTLY why you have SetRenderTarget with RenderBuffers and Camera.SetTargetBuffers. Essentially, just create three RTs (I'm giving you the easiest way; you can combine some buffers, but that's unrelated to the idea itself): two with just color and one with just depth.
    Then:
    cam1.SetTargetBuffers(color1, depth)
    cam2.SetTargetBuffers(color2, depth)
    and do the blitting "manually" (in OnPreRender or in any of the gazillion other possible places).
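
    A minimal sketch of that setup; the sizes and formats here are assumptions:
    Code (CSharp):
    using UnityEngine;

    // Illustrative sketch of the suggestion above: two color-only targets
    // sharing a single depth buffer, so depth testing carries across cameras.
    public class SharedDepthSetup : MonoBehaviour
    {
        public Camera cam1;
        public Camera cam2;

        void Start()
        {
            int w = Screen.width, h = Screen.height;

            var color1 = new RenderTexture(w, h, 0, RenderTextureFormat.ARGB32);
            var color2 = new RenderTexture(w, h, 0, RenderTextureFormat.ARGB32);
            var depth  = new RenderTexture(w, h, 24, RenderTextureFormat.Depth);
            color1.Create(); color2.Create(); depth.Create();

            // Both cameras write color to their own buffer but share one depth
            // buffer, so depth values carry over between the two renders.
            cam1.SetTargetBuffers(color1.colorBuffer, depth.depthBuffer);
            cam2.SetTargetBuffers(color2.colorBuffer, depth.depthBuffer);
        }
    }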
     
  16. Incendiary-Games

    Incendiary-Games

    Joined:
    Mar 24, 2015
    Posts:
    18
    Thanks Alexey,

    For this specific optimization SetTargetBuffers doesn't work for me, as it's for the Rift: doing that, I'll end up with double vision in the right eye from the left eye's stereo render. I need to be able to blit in this case to clear the render from the left-eye stereo objects. :(

    After the Oculus game jam ends on Monday I'll submit a small test project with a reproduction: one with the OVR case (in case it's something going on with the Oculus plugin) and one that uses ZWrite Off with no OVR plugin.