Texture2D.ReadPixels returns a black texture in iOS app after upgrading project to Unity 2019.4.2

Discussion in 'AR' started by thesanketkale, Jul 13, 2020.

  1. thesanketkale

    thesanketkale

    Joined:
    Dec 14, 2016
    Posts:
    65
    Hello, fellow Unity Developers.

    I am facing a weird issue after upgrading my project from Unity 2019.2.21 to Unity 2019.4.2. Texture2D.ReadPixels in the iOS build is returning a black texture on my iPad 6th generation running iOS 13.5.1. It worked fine in Unity 2019.2.21, but in almost every editor version after 2019.3.0 I am not able to capture a screenshot in my Unity iOS app.

    In this Unity ARKit-based project, I am only upgrading the Unity version to be able to use the latest Depth API features available in AR Foundation & ARKit 4.0.2. Surprisingly, the same upgraded project works fine in the Unity Editor as well as on an Android device with ARCore, but I don't understand why the ReadPixels texture always comes back black in the iOS app on my iPad.

    In my project there is a Take Screenshot button which simply invokes a coroutine that reads the screen pixels into a Texture2D and then shows it on a UI.RawImage. The coroutine is as follows:

    Code (CSharp):
    private IEnumerator CaptureScreenShot()
    {
        // Wait until rendering for this frame has finished before reading the screen.
        yield return new WaitForEndOfFrame();

        // Copy the current screen contents into a new Texture2D.
        var screenShot = new Texture2D(Screen.width, Screen.height, TextureFormat.RGB24, false);
        screenShot.ReadPixels(new Rect(0, 0, Screen.width, Screen.height), 0, 0, false);
        screenShot.Apply();

        OnScreenShotCapture(screenShot);
    }
    This works fine in the Unity Editor, but the RawImage always comes out black on the iPad. I tried replicating this behavior in a new project made with Unity 2019.4.2, with a simple button and the same screenshot code, and surprisingly it worked there. So I was able to capture the screenshot in a fresh Unity 2019.4.2 project, but somehow not in my upgraded project.
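
    For completeness, the OnScreenShotCapture callback referenced above only hands the captured texture to the RawImage. The actual handler isn't shown here, but a hypothetical minimal version would look like this:

    Code (CSharp):
    // Hypothetical handler (the real implementation is not shown in this post):
    // it just displays the captured texture on a RawImage.
    [SerializeField] private UnityEngine.UI.RawImage screenshotPreview;

    private void OnScreenShotCapture(Texture2D screenShot)
    {
        screenshotPreview.texture = screenShot;
        screenshotPreview.SetNativeSize();
    }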

    I have tried upgrading the same project to Unity 2019.3.0, Unity 2019.3.15, Unity 2019.4.2, and Unity 2019.4.3, but I get the same result in all the iOS builds: a black texture instead of the screenshot. The weird part is that if I downgrade back to Unity 2019.2.21 from any of these versions, everything works normally again and ReadPixels returns the screenshot as expected.

    I dug through the internet for possible solutions, but none of them seem to work for me. I tried both Application.CaptureScreenshot() and ScreenCapture.CaptureScreenshot() instead of ReadPixels, but in my case they save an error image with a red question mark instead of the actual screenshot.
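
    For reference, another capture path that avoids the file round-trip is ScreenCapture.CaptureScreenshotAsTexture(). A minimal sketch, reusing the OnScreenShotCapture callback from above (it may well hit the same issue, since it also reads the rendered frame):

    Code (CSharp):
    private IEnumerator CaptureWithScreenCapture()
    {
        // CaptureScreenshotAsTexture should only be called once rendering has finished.
        yield return new WaitForEndOfFrame();

        // Returns the screenshot as a Texture2D instead of writing a PNG to disk.
        Texture2D screenShot = ScreenCapture.CaptureScreenshotAsTexture();
        OnScreenShotCapture(screenShot);
    }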

    Is anybody else facing such an issue, maybe with an old iOS project when upgraded to Unity 2019.4.2?

    What should I do to resolve this? Kindly suggest anything I could try to work around it.

    Please help!
     
  2. thesanketkale

    thesanketkale

    Joined:
    Dec 14, 2016
    Posts:
    65
    Guys, I have tried a multitude of things for over two weeks but couldn't solve this. Before I pull my hair out: please, any help or suggestion would be highly appreciated.

    Thanks in advance.
     
    AlejMC likes this.
  3. thesanketkale

    thesanketkale

    Joined:
    Dec 14, 2016
    Posts:
    65
    **Phew**

    I found the issue. The culprit was quietly sitting in the Player Settings: in my case I had to disable "Metal Write-Only Backbuffer", but it took quite a journey to get there.

    I am not sure why, but enabling "Metal Write-Only Backbuffer" in Unity 2019.4.2 (or even 2019.4.3) causes ReadPixels to return a black texture in the iOS app. I had enabled it to improve performance, and it worked quite well up until Unity 2019.2.21, but with recent changes in Unity this may have turned into a bug. So I have disabled it for now.
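
    If you need to keep "Metal Write-Only Backbuffer" enabled, a possible workaround is to avoid reading the backbuffer at all and capture the camera into a temporary RenderTexture instead. A rough, untested sketch (cam is whatever camera you want to capture; with AR Foundation the camera background may need extra handling):

    Code (CSharp):
    private IEnumerator CaptureViaRenderTexture(Camera cam)
    {
        yield return new WaitForEndOfFrame();

        // Render the camera into a temporary RenderTexture instead of reading the backbuffer.
        var rt = RenderTexture.GetTemporary(Screen.width, Screen.height, 24);
        var previousTarget = cam.targetTexture;
        var previousActive = RenderTexture.active;

        cam.targetTexture = rt;
        cam.Render();

        // ReadPixels reads from RenderTexture.active, so point it at our RenderTexture.
        RenderTexture.active = rt;
        var screenShot = new Texture2D(rt.width, rt.height, TextureFormat.RGB24, false);
        screenShot.ReadPixels(new Rect(0, 0, rt.width, rt.height), 0, 0, false);
        screenShot.Apply();

        // Restore state and release the temporary RenderTexture.
        cam.targetTexture = previousTarget;
        RenderTexture.active = previousActive;
        RenderTexture.ReleaseTemporary(rt);

        OnScreenShotCapture(screenShot);
    }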
     
    Last edited: Jul 21, 2020
    AlejMC likes this.
  4. AlejMC

    AlejMC

    Joined:
    Oct 15, 2013
    Posts:
    149
    UPDATE [FIXED]:
    The issue was the way a single-channel texture was being expanded and read as a Color32.
    On desktop the value returned would be (value, value, value, value); on iOS it would be read as (value, value, value, 1.0f).
    I don't believe this is a Unity-specific thing; it's more about how Substance handles the generation per platform.
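
    In code terms, the fix amounts to not relying on the value being replicated into every channel. A minimal sketch (maskTexture is a placeholder for the Substance-generated single-channel texture):

    Code (CSharp):
    // Platform-safe read of a single-channel (e.g. grayscale / R8) texture:
    // on desktop the value may be replicated into every channel, but on iOS the
    // alpha can simply be 1, which breaks any code that reads .a.
    private static float[] ReadSingleChannel(Texture2D maskTexture)
    {
        Color32[] pixels = maskTexture.GetPixels32();
        var values = new float[pixels.Length];
        for (int i = 0; i < pixels.Length; i++)
        {
            values[i] = pixels[i].r / 255f;   // use .r, never .a, for single-channel data
        }
        return values;
    }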


    --- Old comment ---
    I actually had this setting off and it wasn't working; enabling it doesn't work either.
    I'm reading a texture that is generated by Substance. It works in the Editor and in a standalone macOS build, but on iOS it just doesn't want to work: all values are read back as (0, 0, 0, 1).
     
    Last edited: Feb 8, 2023
  5. unity_bu1zWp-Jk79O2g

    unity_bu1zWp-Jk79O2g

    Joined:
    Feb 3, 2023
    Posts:
    5
    Could you explain in more detail how you solved this issue? Metal write-only backbuffer was already disabled for me. I don't understand how the alpha always being 1 makes the result completely black. Any ideas would be greatly appreciated; I have been stuck on this for too long.
     
  6. AlejMC

    AlejMC

    Joined:
    Oct 15, 2013
    Posts:
    149
    Sorry for the late reply, but off the top of my head: I was just trying to read a single-channel texture where, for some reason, the values weren't uniform across all channels.

    My use case is definitely not as complex, and probably not the same as yours.

    I would look for an asset or example project that takes screenshot captures, or first experiment a bit with different texture formats. Maybe even try a new project from scratch that only does that.

    Also, try to Debug.Log what RenderTexture.active holds and which texture format it has (ReadPixels gets its values from whatever is there). Check whether your camera is using render targets (HDR might give some of them a different texture format), and also check when during rendering the read happens… some Unity samples use one of the PostRender events.
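
    Something like this, dropped into the capture code right before ReadPixels, can help narrow it down (a sketch; cam is whichever camera you are capturing from):

    Code (CSharp):
    private static void LogCaptureDiagnostics(Camera cam)
    {
        // Quick diagnostics: what is ReadPixels actually going to read from?
        var active = RenderTexture.active;
        Debug.Log(active == null
            ? "RenderTexture.active is null (reading the backbuffer)"
            : $"RenderTexture.active: {active.width}x{active.height}, format {active.format}");
        Debug.Log($"Camera targetTexture: {cam.targetTexture}, allowHDR: {cam.allowHDR}");
    }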

    Hope that helps a bit, at least.
     
    unity_bu1zWp-Jk79O2g likes this.