
AsyncGPUReadbackRequest and EncodeToPNG

Discussion in 'Graphics Experimental Previews' started by brianchasalow, Sep 5, 2018.

  1. brianchasalow

    brianchasalow

    Joined:
    Jun 3, 2010
    Posts:
    208
    AsyncGPUReadbackRequest is now supported on more platforms in 2018.2, which is great. But we really need an async EncodeToPNG built into Unity, so we can read back from a RenderTexture, encode to PNG, and write the file to disk asynchronously, to fully complete the pipeline!

    ImageConversion has utility methods for EncodeToPNG, but it only takes a Texture2D as a parameter. We need a threaded, async version that takes a byte[] array and returns a PNG-encoded byte[] array once the request is done.

    Consider this my official feature request. :)
    Brian
     
  2. bradweiers

    bradweiers

    Joined:
    Nov 3, 2015
    Posts:
    59
    Interesting. Forgive my ignorance, but can you provide more information on the use case, and what sorts of things having this would unlock?
     
  3. brianchasalow

    brianchasalow

    Joined:
    Jun 3, 2010
    Posts:
    208
    Let's say I want to write the contents of a RenderTexture - something rendered on-screen - to a PNG file on disk.
    We used to (and still kinda) have to do this:
    Code (CSharp):
    public static void DumpRenderTexture (RenderTexture rt, string pngOutPath)
    {
        var oldRT = RenderTexture.active;

        var tex = new Texture2D (rt.width, rt.height, TextureFormat.ARGB32, false, false);
        RenderTexture.active = rt;
        // this is expensive, although there are maybe better ways to copy from a RT to a Texture2D now?
        tex.ReadPixels (new Rect (0, 0, rt.width, rt.height), 0, 0);
        tex.Apply ();
        // this EncodeToPNG() is also expensive, and only exists on a Texture2D
        System.IO.File.WriteAllBytes (pngOutPath, tex.EncodeToPNG ());
        RenderTexture.active = oldRT;
        GameObject.Destroy (tex);
    }

    Now that we can run an async readback from the GPU (typically an expensive part of the operation), we can get a raw byte[] array. But there's no easy Unity way to encode that raw byte array into a PNG-encoded byte array. Here's what we'd want to do:

    Code (CSharp):
    // this part exists now in 2018.2 on many platforms:
    UnityEngine.Rendering.AsyncGPUReadbackRequest request = UnityEngine.Rendering.AsyncGPUReadback.Request (renderTexture, 0);
    while (!request.done)
    {
        yield return new WaitForEndOfFrame ();
    }
    byte[] rawByteArray = request.GetData<byte> ().ToArray ();

    // this doesn't exist:
    UnityEngine.ImageConversion.EncodeToPNGRequest encoderRequest = UnityEngine.ImageConversion.EncodeToPNG.Request(rawByteArray);

    while (!encoderRequest.isDone)
    {
        yield return new WaitForEndOfFrame ();
    }
    byte[] encodedByteArray = encoderRequest.GetData<byte> ().ToArray ();

    // while we're at it, can we get a file writer that's jobified also? ;-)
    System.IO.File.WriteAllBytes (pngOutPath, encodedByteArray);
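
    For anyone finding this later: a rough sketch of how the above can be approximated in newer Unity versions that ship ImageConversion.EncodeArrayToPNG (documented as thread safe), by moving the encode and the file write onto a worker thread. The PngDumper class name and the RGBA32 / R8G8B8A8_UNorm formats here are just illustrative assumptions, not an official API:

    Code (CSharp):
    using System.Threading.Tasks;
    using UnityEngine;
    using UnityEngine.Experimental.Rendering;
    using UnityEngine.Rendering;

    public static class PngDumper
    {
        // Requests an async readback of 'rt', then encodes and writes the PNG off the main thread.
        public static void DumpAsync(RenderTexture rt, string pngOutPath)
        {
            AsyncGPUReadback.Request(rt, 0, TextureFormat.RGBA32, request =>
            {
                if (request.hasError) { Debug.LogError("GPU readback failed"); return; }

                // Copy out of the NativeArray while the request data is still valid.
                byte[] raw = request.GetData<byte>().ToArray();
                uint width = (uint)rt.width, height = (uint)rt.height;

                Task.Run(() =>
                {
                    // EncodeArrayToPNG is documented as thread safe, so it can run on a worker thread.
                    byte[] png = ImageConversion.EncodeArrayToPNG(raw, GraphicsFormat.R8G8B8A8_UNorm, width, height);
                    System.IO.File.WriteAllBytes(pngOutPath, png);
                });
            });
        }
    }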
     
    Last edited: Sep 7, 2018
  4. bhermer

    bhermer

    Joined:
    Jun 24, 2013
    Posts:
    28
    I need this functionality too. My use case: we have a VR camera which takes pictures in the scene, and we want those images saved out to disk. Currently I have to wait until the user finishes the level and then save them all, since at that point a bit of jitter doesn't matter. I would like, though, to save the PNG in real time when they take the picture.
     
  5. dyox

    dyox

    Joined:
    Aug 19, 2011
    Posts:
    619
    brianchasalow likes this.
  6. Jbouzillard

    Jbouzillard

    Joined:
    Oct 10, 2016
    Posts:
    3
    We've needed this feature for ages but never asked for it, as we expected it to be released sooner or later...
    System.Drawing's Bitmap is not available in Unity (and neither is the System.Windows.Media.Imaging approach suggested by dyox, AFAIK...).
    What we've had in Unity from the start is creating a texture, setting its pixels, calling Apply() (which is awfully slow and not async), and then EncodeToPNG.

    UnityWebRequest is nice for decoding PNG data into a texture and is pretty fast, but the other edge of the pipeline is missing.

    There are many use cases for an async PNG encoder that works from bytes, without creating a texture and calling texture.Apply(), but ours is linked to PhotoCapture.
    This Unity feature returns a PhotoCaptureFrame containing IMFMediaBuffer byte data (a simple color array). There is no fast async method for converting this byte array to PNG before sending it over the network.
    And even if we didn't need to send it over the network, it's actually faster to send a PNG over the network and redownload it with UnityWebRequest to load it into a texture than to set the pixels and call texture.Apply(), which literally freezes the application.
    So having something in Unity similar to System.Drawing, to encode a color byte array to PNG asynchronously, would be very, very nice to see.
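
    For reference, the decode edge mentioned above is already covered by UnityWebRequestTexture / DownloadHandlerTexture; a minimal sketch, with a placeholder URL parameter and class name:

    Code (CSharp):
    using System.Collections;
    using UnityEngine;
    using UnityEngine.Networking;

    public class PngDownloader : MonoBehaviour
    {
        // Downloads a PNG and lets the download handler do the decode (no SetPixels/Apply needed).
        IEnumerator LoadPng(string url)
        {
            using (UnityWebRequest req = UnityWebRequestTexture.GetTexture(url))
            {
                yield return req.SendWebRequest();
                if (req.isNetworkError || req.isHttpError)
                {
                    Debug.LogError(req.error);
                    yield break;
                }
                Texture2D tex = DownloadHandlerTexture.GetContent(req);
                // ... use tex ...
            }
        }
    }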
     
    Psyco92, brianchasalow and a3dline like this.
  7. pixeltrix

    pixeltrix

    Joined:
    Jan 19, 2015
    Posts:
    1
    I'm extremely interested in this as well.

    Using AsyncGPUReadbackRequest has solved half of the slowdown. It's just the conversion and saving process that makes my game freeze momentarily. And since I'm building a VR application, this unfortunately makes the feature unusable.
     
  8. Jbouzillard

    Jbouzillard

    Joined:
    Oct 10, 2016
    Posts:
    3
    Maybe try the BitmapEncoder from the Windows.Graphics.Imaging namespace, which is available in Unity when targeting UWP.
    The functions below take an array of colors (as bytes) and asynchronously encode them into a PNG byte array, handing the result back to Unity's main thread for use (or for writing it asynchronously to a file, etc.).
    Let me know if that works for your case.

    Code (CSharp):
    using System;
    using System.Runtime.InteropServices.WindowsRuntime;
    using System.Threading.Tasks;
    using Windows.Storage;
    using Windows.Storage.Streams;
    using Windows.Graphics.Imaging;

    // colorBytes, width and height are fields of the surrounding class (BGRA8 pixel data and its dimensions).
    public void EncodeAsync(Action<byte[]> _encoded)
    {
        Task.Run(async () =>
        {
            byte[] pngData = await EncodeColorBytesToPNG(colorBytes);

            // Marshal the result back onto Unity's main thread.
            ExecuteInMainThread.Push(() =>
            {
                _encoded.Invoke(pngData);
            });
        });
    }

    private async Task<byte[]> EncodeColorBytesToPNG(byte[] _data)
    {
        using (IRandomAccessStream stream = new InMemoryRandomAccessStream())
        {
            // Create an encoder with the desired format
            BitmapEncoder encoder = await BitmapEncoder.CreateAsync(BitmapEncoder.PngEncoderId, stream);

            encoder.SetPixelData(BitmapPixelFormat.Bgra8, BitmapAlphaMode.Straight, (uint)width, (uint)height, 300, 300, _data);

            try
            {
                await encoder.FlushAsync();
            }
            catch (Exception err)
            {
                HFLogger.Error(err.Message);
            }

            // Rewind before reading the encoded PNG back out of the stream.
            stream.Seek(0);
            byte[] result = new byte[stream.Size];
            await stream.ReadAsync(result.AsBuffer(), (uint)stream.Size, InputStreamOptions.None);

            return result;
        }
    }
     
    futurlab_peterh likes this.
  9. andrew-lukasik

    andrew-lukasik

    Joined:
    Jan 31, 2013
    Posts:
    249
    Hi. The original pngcs is awesome, but it takes some time to understand how to move data to/from Unity. That's totally OK for bigger projects, but not for something needed quickly or just once. In my case I was in urgent need of 16-bit grayscale PNG encoding, so I started a fork to bring more PNG encoding options to Unity behind a really simple, Unity-like API, i.e.:
    using Pngcs.Unity;
    Texture2D texture = PNG.Read( path );

    It's a work in progress, so if you want to test it, break it, fork it and send angry feedback my way, check out
    github.com/andrew-raphael-lukasik/pngcs (Unity.cs is an entry point).
     
    Last edited: Sep 30, 2019
  10. mansiva2000

    mansiva2000

    Joined:
    Jul 2, 2013
    Posts:
    9
  11. coldwarrior5

    coldwarrior5

    Joined:
    Nov 9, 2016
    Posts:
    12
    +1. I could also use this feature, since we are trying to capture several sensors and store the data in shared memory, but the conversion to PNG is awfully slow.

    Update: I decided not to convert the data to PNG, as I am just going to use the raw data in another process, so I will simply store the byte array as is. But there should be a speedier converter available.
     
    Last edited: Mar 22, 2019
  12. qfettes-rr

    qfettes-rr

    Joined:
    May 17, 2019
    Posts:
    1
    +1. I'm using Unity to generate synthetic training data for a deep learning model. Converting high-resolution images to PNG is so slow (~1 s per image) that it's a deal breaker right now.
     
  13. stijnvdbmgsp

    stijnvdbmgsp

    Joined:
    Feb 6, 2017
    Posts:
    6
    +1! I'm looking for any way to quickly encode a Texture2D to whichever format is quickest, for sending to a (non-Unity) peer over the network. FFmpeg is not an option, as I need to support mobile platforms. EncodeToPNG and EncodeToJPG are too slow (they work, but the in-game performance hit is not worth it). I'm currently looking at https://github.com/libjpeg-turbo/libjpeg-turbo, but it might still not be fast enough... Another alternative is just sending the raw data and figuring out a way to convert it on the other end. Any suggestions?
     
    Last edited: Sep 29, 2019
  14. stijnvdbmgsp

    stijnvdbmgsp

    Joined:
    Feb 6, 2017
    Posts:
    6
    Thanks, I had a quick look, but that definitely doesn't look like it would work on mobile: it uses NVEncode for encoding, which rules out mobile. The solution I have is mostly working - I'm sending image data over a socket every couple of frames, which the peer is able to draw. There are very noticeable hiccups each time the game does this, though, caused in large part by the EncodeToJPG call - I'm looking to reduce that hiccup as much as possible. The options I'm looking at right now are either finding a quicker encode (any format will do) or getting rid of the encode altogether.
     
  15. AndreasBroager

    AndreasBroager

    Joined:
    Jun 7, 2013
    Posts:
    5
  16. Ryunis

    Ryunis

    Joined:
    Dec 23, 2014
    Posts:
    24
    Hi, I just wanted to add another solution to this thread. It requires no third-party tools, and while it's not the fastest in terms of raw speed, it doesn't block the main thread. First you use AsyncGPUReadback to get the image data as a NativeArray, and then pass it to a job similar to this:
    Code (CSharp):
    private struct EncodeImageJob : IJob
    {
        [ReadOnly] [DeallocateOnJobCompletion]
        public NativeArray<uint> Input;

        public uint Width;
        public uint Height;
        public int Quality;

        public NativeList<byte> Output;

        public unsafe void Execute()
        {
            NativeArray<byte> temp = ImageConversion.EncodeNativeArrayToJPG(
                Input, GraphicsFormat.R8G8B8_UNorm, Width, Height, 0, Quality);

            Output.Resize(temp.Length, NativeArrayOptions.UninitializedMemory);

            void* internalPtr = NativeArrayUnsafeUtility.GetUnsafeBufferPointerWithoutChecks(temp);
            void* outputPtr = NativeArrayUnsafeUtility.GetUnsafeBufferPointerWithoutChecks<byte>(Output);
            UnsafeUtility.MemCpy(outputPtr, internalPtr, temp.Length * UnsafeUtility.SizeOf<byte>());

            temp.Dispose();
        }
    }
    Keep in mind that this might look a bit different depending on the image format you're using. The reason I'm using UnsafeUtility.MemCpy is that the safety system would throw exceptions when I tried to use NativeArray.CopyTo and similar functions. Also, you might need to copy the NativeArray you get from AsyncGPUReadback before passing it to the job; the safety system was complaining there as well. The Output can simply be allocated as an empty NativeList before passing it to the job.
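
    For completeness, a minimal usage sketch of the job above, assuming a RenderTexture source, a MonoBehaviour coroutine, and that the EncodeImageJob struct is accessible; the AsyncJpgCapture class name and the copy into a persistent NativeArray are my own additions, and the readback/graphics formats should be adjusted to match your setup:

    Code (CSharp):
    using System.Collections;
    using Unity.Collections;
    using Unity.Jobs;
    using UnityEngine;
    using UnityEngine.Rendering;

    public class AsyncJpgCapture : MonoBehaviour
    {
        public RenderTexture source;

        IEnumerator CaptureToFile(string path)
        {
            AsyncGPUReadbackRequest readback = AsyncGPUReadback.Request(source, 0);
            yield return new WaitUntil(() => readback.done);
            if (readback.hasError) yield break;

            // Copy into a persistent array so the job can own (and deallocate) the input.
            var input = new NativeArray<uint>(readback.GetData<uint>(), Allocator.Persistent);
            var output = new NativeList<byte>(Allocator.Persistent);

            var job = new EncodeImageJob
            {
                Input = input,
                Width = (uint)source.width,
                Height = (uint)source.height,
                Quality = 75,
                Output = output
            };
            JobHandle handle = job.Schedule();

            // Poll without blocking the main thread, then complete and write the result.
            yield return new WaitUntil(() => handle.IsCompleted);
            handle.Complete();
            System.IO.File.WriteAllBytes(path, output.AsArray().ToArray());
            output.Dispose();
        }
    }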
     
    Last edited: Oct 10, 2020
  17. asa989

    asa989

    Joined:
    Dec 18, 2015
    Posts:
    52
    I need this on iOS. What you can do is copy the RT to a Texture2D in a compute shader. However, writing to a Texture2D in a compute shader is not supported on iOS, which is exactly what I need! Any solution for iOS?
     
  18. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    737
    We also spent months in 2018 building a cross-platform live streaming solution for exhibitions and art gallery projects.
    After another few months of optimisation, we finally released our C# live streaming solution, FMETP STREAM, a year ago.

    It's compatible even with the iPhone 5, and it includes async encoding methods.
    Store link: https://assetstore.unity.com/packages/slug/143080

    You can find more demos on our YouTube channel.

     
  19. H4ppyTurtle

    H4ppyTurtle

    Joined:
    Aug 20, 2019
    Posts:
    18
    It's a nice workaround, but I really don't get why it requires so much work. I don't see the purpose of EncodeNativeArrayTo... if it's not really usable with the job system out of the box, even though the documentation states "This method is thread safe". I eventually implemented a much cleaner approach with System.Threading that receives the data from the AsyncGPUReadback. I would have preferred to use the job system, though. Is this behaviour somewhat of a bug, or a restriction of the job system?

    Code (CSharp):
    encodeImgThread = new Thread(EncodeImg);
    encodeImgThread.IsBackground = true;
    encodeImgThread.Start(new ImageData(request.GetData<Color32>(), (uint)shareableTexture.width, (uint)shareableTexture.height));

    private void EncodeImg(object obj)
    {
        ImageData data = (ImageData)obj;
        try
        {
            FinishedImgEncodingEvent(ImageConversion.EncodeArrayToJPG(data.imgBytes, GraphicsFormat.R8G8B8_SRGB,
                    data.width, data.height, 0, 75));
        }
        catch (Exception e)
        {
            Debug.Log(e.Message);
        }
    }

    public class ImageData
    {
        public Color32[] imgBytes;
        public uint width;
        public uint height;

        public ImageData(NativeArray<Color32> imgBytes, uint width, uint height)
        {
            this.imgBytes = imgBytes.ToArray();
            this.width = width;
            this.height = height;
        }
    }
    Last edited: May 1, 2021
    MikleBlack and NemanjaPavlovic like this.
  20. rosssssss

    rosssssss

    Joined:
    Jan 27, 2017
    Posts:
    70
    Hello Andreas - I'm trying to use your solution - thanks - but I'm hitting "Index outside bounds of array" errors at line 669. I imagine this is because the images we are trying to encode are big (8k x 4k) - do you happen to have a simple fix for this?