In the default render pipelines provided by Unity, there is a shader global named _CameraDepthTexture. I'm trying to mimic this feature in my own custom pipeline, but I'm not familiar enough with the inner workings of Unity's rendering system to know what exact steps need to be taken.

I've ensured that the camera being passed to the pipeline's Render method has its depthTextureMode set to Depth. I'm not entirely sure what this even does, but I'm assuming it tells Unity to create a render texture that will hold the contents of the depth buffer at some point, and that it should actually copy that info from the depth buffer into the texture. However, I'm not sure where, or even whether, that is happening at all. A little clarification on this would help a lot.

I know that somehow I have to pass the contents of the depth buffer to shaders using something like:

Code (CSharp):
cmdDepthOpaque.SetGlobalTexture("_CameraDepthTexture", theDepthTextureINeed);

But I have no idea how I'm actually supposed to get that depth texture. The camera does not appear to have one by default when I enable its depthTextureMode. I've also tried creating a render texture on the fly, doing a depth prepass, and then passing *that* render texture to the shader globals, but it seems to give me some very weird and glitchy results.

Does anyone have experience with depth buffers and depth textures in their own custom render pipelines, or enough general knowledge of Unity's internals to explain some basics, so that I can know what direction to take to get access to the depth info in shaders using my own pipeline?
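For context, here is the rough shape of the depth prepass I've been attempting. This is only a sketch, not working code: the method name, the assumption that opaque shaders have a "DepthOnly" LightMode pass, and the RT setup are all my own guesses at what the pipeline should be doing.

Code (CSharp):
using UnityEngine;
using UnityEngine.Rendering;

public static class DepthPrepassSketch
{
    static readonly int depthTexId = Shader.PropertyToID("_CameraDepthTexture");

    // Assumes this is called from a custom RenderPipeline.Render, after culling.
    public static void RenderCameraDepth(ScriptableRenderContext context, Camera camera, CullingResults cull)
    {
        var cmd = new CommandBuffer { name = "Depth Prepass" };

        // Allocate a depth-format render texture matching the camera's dimensions.
        cmd.GetTemporaryRT(depthTexId, camera.pixelWidth, camera.pixelHeight,
                           24, FilterMode.Point, RenderTextureFormat.Depth);
        cmd.SetRenderTarget(depthTexId);
        cmd.ClearRenderTarget(true, true, Color.clear);
        context.ExecuteCommandBuffer(cmd);
        cmd.Clear();

        // Draw opaque renderers using a depth-only pass. This assumes the
        // shaders in the scene actually have a pass tagged "DepthOnly".
        var sorting = new SortingSettings(camera) { criteria = SortingCriteria.CommonOpaque };
        var drawing = new DrawingSettings(new ShaderTagId("DepthOnly"), sorting);
        var filtering = new FilteringSettings(RenderQueueRange.opaque);
        context.DrawRenderers(cull, ref drawing, ref filtering);

        // Expose the result to all shaders under the usual global name.
        cmd.SetGlobalTexture(depthTexId, depthTexId);
        context.ExecuteCommandBuffer(cmd);
        cmd.Release();
    }
}

This is the approach that gives me the glitchy results mentioned above, so I may well be missing a step (restoring the camera target afterward, matching MSAA settings, releasing the temporary RT at the right time, etc.).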