Hi, I'm trying to make a viewshed effect using multiple cameras and shaders. I use a second camera to render a depth buffer with this script attached:

Code (CSharp):
using UnityEngine;

public class DepthCamera : MonoBehaviour
{
    private Camera cameraComp; // second camera

    public static readonly string targetTextureName = "_TransparencyFocusMask";
    public static readonly string targetDepthName = "_DepthFocusMask";

    private RenderTexture target;
    private RenderTexture targetDepth;

    private void OnEnable()
    {
        cameraComp = GetComponent<Camera>();
        cameraComp.depthTextureMode = DepthTextureMode.None;

        target = new RenderTexture(512, 512, 24, RenderTextureFormat.ARGBFloat);
        targetDepth = new RenderTexture(512, 512, 24, RenderTextureFormat.Depth);

        Shader.SetGlobalTexture("_ColorBuffer", target);
        Shader.SetGlobalTexture("_DepthBuffer", targetDepth);
        Shader.SetGlobalFloat("_FarClipPlane", cameraComp.farClipPlane);
        Shader.SetGlobalFloat("_NearClipPlane", cameraComp.nearClipPlane);

        cameraComp.SetTargetBuffers(target.colorBuffer, targetDepth.depthBuffer);
    }
}

Then I sample the buffer from my shader:

Code (CSharp):
uniform sampler2D _DepthBuffer;
uniform sampler2D _ColorBuffer;
uniform float _FarClipPlane;
uniform float _NearClipPlane;

fixed4 frag(v2f o) : COLOR
{
    float depth = tex2D(_DepthBuffer, o.uv).r;

    // depth as distance from the camera in units
    depth = Linear01Depth(depth);
    depth = depth * _ProjectionParams.z;

    // I also tried replacing the computation from UnityCG.cginc with the
    // near/far values from the other camera, but that does not work:
    //depth = 1.0 / ((1 - _FarClipPlane / _NearClipPlane) * depth + (_FarClipPlane / _NearClipPlane));
    //depth = depth * _FarClipPlane;

    return depth;
}

For now I just display the buffer, but using the built-in methods it does not work well. I assume that's because the built-in variables and functions (like _ZBufferParams and _ProjectionParams) refer to the main camera, not to my second camera. I've tried substituting the built-in variables with the far/near values from the other camera, but the result is even worse, so I think I'm going wrong somewhere.
Any ideas would be welcome. Thanks for reading!