Dear Unity-Crew,

I have tracked down some odd behavior in my application. When I click a button, I render a texture using the following function (this is a stripped-down version that does not save the rendered screenshot; I have only posted the relevant part here):

Code (CSharp):

IEnumerator takeScreenShot(ScreenShotEncodingType screenshotType, string postfix = "", bool automatic = false)
{
    // Determine the crop area and target render size from the UI.
    Rect cropArea = CanvasUIManager.Instance.GetCropRect();
    Vector2 renderSize = CanvasUIManager.Instance.GetRenderSize();
    float scaleFactor = renderSize.x / cropArea.width;

    // Allocate a temporary render texture scaled to the desired output size.
    RenderTexture rt = RenderTexture.GetTemporary(
        (int)Mathf.Ceil(Screen.width * scaleFactor),
        (int)Mathf.Ceil(Screen.height * scaleFactor),
        24);

    // Render the main camera into the temporary texture.
    Camera.main.targetTexture = rt;
    Camera.main.Render();
    Camera.main.targetTexture = null;

    // Attempted cleanup -- none of these freed the memory in question.
    RenderTexture.ReleaseTemporary(rt);
    RenderTexture.Destroy(rt);
    System.GC.Collect();

    this.CorrScreenshot = null;
    yield return new WaitForEndOfFrame();
}

The problem is that the RenderTexture does not seem to be removed from memory completely. The application starts, I click the button, a huge chunk of memory is allocated and then released after the rendering, but roughly a third of the newly allocated memory is never freed again. When I comment out the line that assigns the render texture to the camera's targetTexture, nothing is allocated, as I expected, so the leak must occur when the render texture is assigned to the camera and the camera renders into it. I can't understand this behavior, especially because every example I could find uses a RenderTexture in exactly this way. I even added the GC.Collect() call, without success.
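For comparison, here is a minimal sketch of the pattern as I understand it from the documentation (the class and method names are placeholders, not my real code): textures obtained via RenderTexture.GetTemporary come from an internal pool, should only be returned with ReleaseTemporary rather than destroyed, and RenderTexture.active should be cleared before releasing. Is this understanding correct, and if so, why does memory still grow in my version above?

Code (CSharp):

using System.Collections;
using UnityEngine;

// Minimal sketch of temporary-render-texture usage as I understand it;
// class and method names are hypothetical placeholders.
public class TemporaryRenderSketch : MonoBehaviour
{
    IEnumerator RenderToTemporaryTexture(int width, int height)
    {
        // Wait until the current frame has finished rendering before
        // touching the camera.
        yield return new WaitForEndOfFrame();

        // Borrow a texture from Unity's internal pool.
        RenderTexture rt = RenderTexture.GetTemporary(width, height, 24);

        // Render the main camera into it, then detach it again.
        Camera.main.targetTexture = rt;
        Camera.main.Render();
        Camera.main.targetTexture = null;

        // Clear any lingering reference, then return the texture to the
        // pool. As I read the docs, ReleaseTemporary alone is the correct
        // cleanup; Destroy should not be called on a pooled texture.
        RenderTexture.active = null;
        RenderTexture.ReleaseTemporary(rt);
    }
}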