I'm making a drawing/coloring app and use uGUI for stamp/sticker functionality. I'm trying to use a RenderTexture to take a screenshot, but the GUI elements don't show up in the render texture. The GUI elements are on a Screen Space - Camera canvas that is rendered by my Main Camera. When taking a screenshot I set the target texture of my camera to a RenderTexture, take the screenshot, and reset. Everything shows up in the PNG except the GUI elements. This is my code:

Code (CSharp):

    mainCamera.targetTexture = renderTextureFull;
    RenderTexture.active = renderTextureFull;
    mainCamera.Render();

    Rect portion = new Rect(0f, canvasMinY, canvasSizeX, canvasSizeY);
    readingTextureFull.ReadPixels(portion, 0, 0);
    System.IO.File.WriteAllBytes(directoryPath + "/" + fileName + "_total.png", readingTextureFull.EncodeToPNG());

    mainCamera.targetTexture = null;
    RenderTexture.active = null;

The weird thing is that if I don't reset the main camera back to rendering to the screen (targetTexture = null), I do get the GUI elements if I save a second time. So I guess the GUI elements are rendered sometime later in the render pipeline. But if I postpone everything from mainCamera.Render() onwards to the OnPostRender() event, they still don't show up, so UI rendering probably happens even later than that.

A solution I thought of is using two cameras and not switching the render target, but I can't assign a canvas to more than one camera. Does anybody know another solution for this issue, or is it a Unity bug? I found a forum post stating that an issue was logged for a similar problem (number 631091), but I can't find it in the Issue Tracker.
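For reference, this is roughly how I moved everything from mainCamera.Render() onwards into OnPostRender() (a sketch using the same field names as the snippet above; the script is attached to the Main Camera, and the UI still doesn't appear in the PNG):

```csharp
using UnityEngine;

// Sketch of the deferred-capture attempt: a flag set elsewhere requests the
// capture, and everything after Render() runs in OnPostRender(), i.e. after
// this camera has finished rendering the frame. The GUI elements are still
// missing from the saved PNG, suggesting uGUI draws even later than this.
public class ScreenshotOnPostRender : MonoBehaviour
{
    public RenderTexture renderTextureFull;
    public Texture2D readingTextureFull;
    public string directoryPath;
    public string fileName;
    public float canvasMinY, canvasSizeX, canvasSizeY;

    private bool captureRequested;

    // Called from elsewhere (e.g. a save button) to trigger the capture.
    public void RequestCapture()
    {
        GetComponent<Camera>().targetTexture = renderTextureFull;
        captureRequested = true;
    }

    void OnPostRender()
    {
        if (!captureRequested) return;
        captureRequested = false;

        // Read back the portion covering the canvas and write it out as PNG.
        RenderTexture.active = renderTextureFull;
        Rect portion = new Rect(0f, canvasMinY, canvasSizeX, canvasSizeY);
        readingTextureFull.ReadPixels(portion, 0, 0);
        System.IO.File.WriteAllBytes(
            directoryPath + "/" + fileName + "_total.png",
            readingTextureFull.EncodeToPNG());

        // Reset so the camera renders to the screen again.
        GetComponent<Camera>().targetTexture = null;
        RenderTexture.active = null;
    }
}
```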