
ReadPixels from RenderTexture with multi-camera setup fails on iOS

Discussion in 'iOS and tvOS' started by Biggix, Apr 19, 2019.

Biggix
Joined: Dec 8, 2014
Posts: 44
    Hey guys,

    I'm taking a screenshot of a game level that uses a multi-camera setup, while the game view is obscured by one more camera (so the user doesn't even know what is happening behind the visible game menu).

    The following code works perfectly in the Editor; on iOS, however, I get a gray texture and a weird error message:
    Reading pixels out of bounds of the current active render texture

    Any ideas?

    Code (CSharp):
    /*
    rt = new RenderTexture(Screen.width, Screen.height, 24, RenderTextureFormat.ARGB32);
    rt.useMipMap = false;
    rt.antiAliasing = 1;
    RenderTexture.active = rt;
    CameraHolder.one.OverlayCam.targetTexture = rt;
    */

    screenShot = new Texture2D((int)rect.height, (int)rect.height, TextureFormat.RGB24, false);

    yield return frameEnd;

    RenderTexture.active = rt;
    CameraHolder.one.OverlayCam.targetTexture = rt;

    CameraHolder.one.BackgroundCam.Render();
    CameraHolder.one.MainCam.Render();
    CameraHolder.one.ScrollCam.Render();
    CameraHolder.one.OverlayCam.Render();

    int diff = ((int)rect.height - (int)rect.width) / 2;

    screenShot.ReadPixels(rect, diff, 0);
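
    For context, here is a minimal, self-contained sketch of the capture flow described above. The class name, the CaptureSketch method, and the direct Camera fields are placeholders standing in for the CameraHolder.one setup, and clamping the ReadPixels rect to the RenderTexture's own dimensions is just one way to keep the read inside the bounds of the active render target that the error message refers to:

    Code (CSharp):
    using UnityEngine;

    public class ScreenshotSketch : MonoBehaviour
    {
        // Placeholder references for the camera stack described in the post above.
        public Camera BackgroundCam, MainCam, ScrollCam, OverlayCam;

        public Texture2D CaptureSketch()
        {
            // Render target sized to the screen, matching the commented-out setup above.
            var rt = new RenderTexture(Screen.width, Screen.height, 24, RenderTextureFormat.ARGB32);
            rt.useMipMap = false;
            rt.antiAliasing = 1;

            // Render each camera in the stack into the same RenderTexture.
            RenderTexture.active = rt;
            foreach (var cam in new[] { BackgroundCam, MainCam, ScrollCam, OverlayCam })
            {
                cam.targetTexture = rt;
                cam.Render();
            }

            // Read back only what the active RenderTexture actually contains,
            // so the source rect can never run outside its bounds.
            int w = Mathf.Min(rt.width, Screen.width);
            int h = Mathf.Min(rt.height, Screen.height);
            var shot = new Texture2D(w, h, TextureFormat.RGB24, false);
            shot.ReadPixels(new Rect(0, 0, w, h), 0, 0);
            shot.Apply();

            RenderTexture.active = null;
            return shot;
        }
    }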