Hi everybody! I'm trying to set up simple frame postprocessing, and unfortunately the built-in mechanism doesn't seem to handle multiple cameras well. I have the following camera setup in my 2D game:

- The first camera renders the sky and clouds; lowest depth, clear flags set to Solid Color.
- The second camera renders the game scene itself; clear flags set to Depth Only.
- The third one is the UI camera; clear flags set to Depth Only.

Issues arise when I try to apply any postprocessing effect to the second camera. In the Unity editor it works just fine, affecting the game world below it and keeping the UI untouched. However, when I deploy a build to an Android device (Samsung Galaxy Note 2, Android 4.1.2), I see only the UI on top of a black background. It only starts working when I switch the second camera's clear flags to Solid Color or Skybox. That's hardly acceptable for my game, though, because I have a dynamic sky with clouds, stars and mountains, and it can't be rendered with a single camera: camera zooming would then yield strange results and break the illusion of distant objects.

I also tried overriding OnRenderImage on the first camera, just as an experiment. That also behaved very strangely: it worked fine on the device, but in the Unity editor it ignored all camera depth values, rendering only the sky and UI and skipping the second camera entirely.

This makes me think that postprocessing has some fundamental limitation on cameras with depth-only clear flags, leading to some sort of undefined behaviour, but unfortunately the Unity guides and documentation on that matter are really scarce. Have you ever come across a similar issue? I can't figure out a proper workaround short of rendering all cameras to RenderTextures and then combining them on one final camera, and I'm afraid that would be quite heavy from a performance perspective.

Any ideas? My Unity version is 4.6.2.

Thanks, Alex.
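For context, this is roughly how I'm applying the effect (a minimal version of what's attached to the second camera; the material/shader names are placeholders, not my actual effect):

```csharp
using UnityEngine;

// Attached to the game-world camera. Unity calls OnRenderImage after the
// camera finishes rendering, passing its output in `src`; we blit it
// through a material to apply the post-effect shader.
[RequireComponent(typeof(Camera))]
public class SimplePostEffect : MonoBehaviour
{
    public Material effectMaterial; // material using the post-effect shader

    void OnRenderImage(RenderTexture src, RenderTexture dest)
    {
        if (effectMaterial != null)
            Graphics.Blit(src, dest, effectMaterial);
        else
            Graphics.Blit(src, dest); // pass through unchanged
    }
}
```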
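And here is a rough sketch of the RenderTexture workaround I'm hesitant about, just so it's clear what I mean (all names are mine, and the composite shader with `_SkyTex`/`_GameTex` properties is assumed, not shown):

```csharp
using UnityEngine;

// Sketch of the workaround: render the sky and game cameras into
// RenderTextures, then composite them on a final camera via a blend shader.
public class CameraCompositor : MonoBehaviour
{
    public Camera skyCamera;
    public Camera gameCamera;
    public Material compositeMaterial; // shader blending the two textures

    RenderTexture skyRT, gameRT;

    void Start()
    {
        // Screen-sized targets with a 16-bit depth buffer.
        skyRT = new RenderTexture(Screen.width, Screen.height, 16);
        gameRT = new RenderTexture(Screen.width, Screen.height, 16);
        skyCamera.targetTexture = skyRT;
        gameCamera.targetTexture = gameRT;
    }

    void OnRenderImage(RenderTexture src, RenderTexture dest)
    {
        // Feed both layers to the composite shader and blit to screen.
        compositeMaterial.SetTexture("_SkyTex", skyRT);
        compositeMaterial.SetTexture("_GameTex", gameRT);
        Graphics.Blit(src, dest, compositeMaterial);
    }
}
```

My worry is that this means two extra full-screen render targets plus an extra blit per frame, which seems expensive on a phone.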