
Texture.allowThreadedTextureCreation

Discussion in 'General Graphics' started by G33RT, Nov 12, 2019.

  1. G33RT

    G33RT

    Joined:
    Dec 27, 2013
    Posts:
    52
    Hi,

    I've been trying to create Textures on a non-main thread without success, even though Texture.allowThreadedTextureCreation is set to true. I can't find much info on this property either. Is this supposed to work yet, and how should we use/interpret it?

    Tnx,

    Geert
     
  2. lyndon_unity

    lyndon_unity

    Unity Technologies

    Joined:
    Nov 2, 2017
    Posts:
    66
    Hi, I'll try and help you. Can you give me some more information:
    • Which platform are you working with? (and which Unity version?)
    • Which texture type are you using (2d/3d)?
    • Which texture format are you using (crunch compressed, RGB 24 bit, etc)?
    • Is Texture.allowThreadedTextureCreation set to true on start up, or are you trying to set it to true?
    If it's false on start-up, then Unity does not support threaded texture creation on that platform.
    If you try to set it to true, it will revert to false if it's unsupported.

    If this is true on start up then textures will be created on a thread where possible.

    How are you confirming whether this is working?
    • You can look in the Unity Profiler to check. Look for Gfx.CreateTextureThreaded rather than Gfx.CreateTexture.
    • It's easier to load the data into the Profile Analyzer tool to search for markers with specific names over a time range across a range of threads (make sure to search the main thread, render thread, and job threads).
    Not all texture formats are supported, hence the second and third questions. For example, 3D textures are not currently supported.

    Can I also ask what problem you are looking to solve, so I can potentially offer other solutions?
     
    richardkettlewell and LeonhardP like this.
  3. G33RT

    G33RT

    Joined:
    Dec 27, 2013
    Posts:
    52
    Hi!

    Tnx for replying!

    I'm building for Windows, and allowThreadedTextureCreation is set to true (I've checked that at runtime). I was kind of hoping I could now call "new Texture2D(width, height)" on a different thread, but I suppose that assumption is wrong?

    The problem I'm trying to solve: I'm streaming 3D data over HTTP, and I've largely succeeded in getting as much work as possible done on threads other than the main thread, so rendering stays very fluent while streaming. The only remaining problem is large textures, as I must allocate them on the main thread ...

    Geert
     
  4. athenspire

    athenspire

    Joined:
    Dec 4, 2019
    Posts:
    1
    I am experiencing the same issue on Mac. After setting Texture2D.allowThreadedTextureCreation and confirming that it returns true, calling "new Texture2D(width, height)" from a separate thread results in runtime errors stating that texture creation can only be done on the main thread.

    The documentation for Texture.allowThreadedTextureCreation is terse. Is there a prescribed technique to create textures from another thread?

    jeff
     
    Last edited: Jan 28, 2020
  5. Zapan15

    Zapan15

    Joined:
    Apr 11, 2011
    Posts:
    186
    @Unity Can you please show an example of how to create a texture on another thread? We are also unable to create a texture even though allowThreadedTextureCreation is true:

    Code (CSharp):
    private void CreateTextureAsync()
    {
        // UnityException: SupportsTextureFormatNative can only be called from the main thread.
        // Constructors and field initializers will be executed from the loading thread when loading a scene.
        // Don't use this function in the constructor or field initializers, instead move initialization code to the Awake or Start function.
        Texture2D tex = new Texture2D(256, 256);
    }

    private void StartCreateTexture()
    {
        Thread loadingThread = new Thread(CreateTextureAsync);
        loadingThread.Start();
    }
     
  6. G33RT

    G33RT

    Joined:
    Dec 27, 2013
    Posts:
    52
    I still can't get it to work either; I've tried several combinations/variations on each new (minor) Unity version. I really need this, and I am quite amazed that more people aren't waiting for it.

    Texture.allowThreadedTextureCreation has been in the API for several months now. Does anyone have any clue how to use it, or whether it is possible in any way to create Textures or call ImageConversion.LoadImage on another thread?
     
  7. lyndon_unity

    lyndon_unity

    Unity Technologies

    Joined:
    Nov 2, 2017
    Posts:
    66
    Hi, I'm afraid the documentation has probably created some confusion on this issue.
    The Texture.allowThreadedTextureCreation flag controls Unity internals for texture creation.

    When it's set to true, the internal graphics APIs will create the texture on a worker thread rather than the render thread. This frees up the render thread for other work and reduces hitches, since some slower activity moves to a worker thread.

    Unfortunately you can't call the C# texture creation APIs on a separate thread.
    You could create the texture on the main thread and then generate the data for the texture on a separate thread.
    Use Texture2D.GetRawTextureData to get a pointer to memory to fill in and pass this to your worker thread.
    The Texture2D.Apply function will still need to be called on the main thread though.
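    Something along these lines is the pattern (a rough sketch, assuming an RGBA32 texture and that the await resumes on the Unity main thread via the synchronization context; note the caveats about the raw data pointer discussed further down the thread):

    Code (CSharp):
    using System.Threading.Tasks;
    using Unity.Collections;
    using UnityEngine;

    public class ThreadedTextureFill : MonoBehaviour
    {
        async void Start()
        {
            // The Texture2D itself must be created on the main thread.
            var tex = new Texture2D(1024, 1024, TextureFormat.RGBA32, false);

            // View of the texture's CPU-side buffer. Don't call Apply (or otherwise
            // touch the texture) while the worker is still writing into it.
            NativeArray<Color32> pixels = tex.GetRawTextureData<Color32>();

            // Fill the pixel data on a worker thread.
            await Task.Run(() =>
            {
                for (int i = 0; i < pixels.Length; i++)
                    pixels[i] = new Color32((byte)(i & 0xFF), 0, 0, 255);
            });

            // Back on the main thread: upload to the GPU.
            tex.Apply(false, false);
            GetComponent<Renderer>().material.mainTexture = tex;
        }
    }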
     
    Mic9 and richardkettlewell like this.
  8. G33RT

    G33RT

    Joined:
    Dec 27, 2013
    Posts:
    52
    Ok.

    Do you see any way of converting "raw PNG bytes" to "RawTextureData" on a non-main thread? (I currently simply use LoadImage for that.) I suppose I'd need some kind of custom PNG decoder to start with?

    Geert
     
  9. richardkettlewell

    richardkettlewell

    Unity Technologies

    Joined:
    Sep 9, 2015
    Posts:
    2,285
    I think you could use a third-party PNG decoder (Google shows a few) to populate the raw texture data yourself.
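    Roughly like this (a sketch; DecodePngToRgba32 is just a placeholder for whichever decoder you pick, it is not a Unity API):

    Code (CSharp):
    using System.Threading.Tasks;
    using UnityEngine;

    public class PngStreamingExample : MonoBehaviour
    {
        // Placeholder for a third-party decoder: takes raw PNG file bytes and
        // returns uncompressed RGBA32 pixels plus the image dimensions.
        static byte[] DecodePngToRgba32(byte[] pngBytes, out int width, out int height)
        {
            throw new System.NotImplementedException("Use your chosen PNG library here.");
        }

        public async Task<Texture2D> LoadPngAsync(byte[] pngBytes)
        {
            int width = 0, height = 0;
            byte[] pixels = null;

            // Decode on a worker thread - no Unity APIs are touched here.
            await Task.Run(() =>
            {
                pixels = DecodePngToRgba32(pngBytes, out width, out height);
            });

            // Back on the main thread: create the texture and upload the decoded data.
            var tex = new Texture2D(width, height, TextureFormat.RGBA32, false);
            tex.LoadRawTextureData(pixels);
            tex.Apply(false, false);
            return tex;
        }
    }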
     
  10. Glader

    Glader

    Joined:
    Aug 19, 2013
    Posts:
    456
    Is it safe to use this pointer on another thread though or do we need to create a buffer copy of it before queueing it onto another thread?

    Edit: From further googling, it seems like using a NativeArray created on the main thread on another thread probably isn't safe.
     
  11. lyndon_unity

    lyndon_unity

    Unity Technologies

    Joined:
    Nov 2, 2017
    Posts:
    66
    Note the caveat listed in the documentation for GetRawTextureData:
    https://docs.unity3d.com/ScriptReference/Texture2D.GetRawTextureData.html

    Note: The returned array can become invalid (i.e. it no longer points to valid memory) if modifications or uploads happen to the texture after you call this method. Therefore the recommended way to use this method is to get the data, and use or modify it immediately. You should not store the returned array for later use.
    Potentially you could use the pointer on another thread but there are definitely risks.
    A safer option is to create your texture data in some working memory and then perform the final copy to the texture on the main thread. (Or, if you need the original texture data, make a copy of it into memory you own, modify it on a thread, and then perform the final copy to the texture on the main thread.)
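    A rough sketch of that second pattern (assuming a readable, uncompressed texture; GetPixels32/SetPixels32 copy the data, which is slower but keeps the worker thread away from Unity-owned memory):

    Code (CSharp):
    using System.Threading.Tasks;
    using UnityEngine;

    public static class SafeThreadedTextureEdit
    {
        // Call from the main thread. The worker only ever touches the managed copy.
        public static async Task ModifyAsync(Texture2D tex)
        {
            // Main thread: copy the current texture data into memory we own.
            Color32[] working = tex.GetPixels32();

            // Worker thread: modify the copy (no Unity objects touched here).
            await Task.Run(() =>
            {
                for (int i = 0; i < working.Length; i++)
                    working[i].a = 255;
            });

            // Main thread again: copy back and upload to the GPU.
            tex.SetPixels32(working);
            tex.Apply();
        }
    }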
     
  12. LaireonGames

    LaireonGames

    Joined:
    Nov 16, 2013
    Posts:
    705
    Having to call .Apply makes everything here redundant, since I find that is the most expensive method by far. For example, with a 1k by 1k texture I can get setting pixels down to 0.61 ms, but calling .Apply takes a massive 49.49 ms.

    Are there any future plans to allow us to create textures on the fly on a different thread/job?

    My end use case is that I am trying to build a Texture2DArray containing the current list of textures that need rendering, which I will alter depending on what is in view. Without this, batching is impossible :(
     
    mgear likes this.
  13. joshuacwilde

    joshuacwilde

    Joined:
    Feb 4, 2018
    Posts:
    731
    If you are using SetPixels, that is your problem. Any modern game should never use SetPixels. You should be using CopyTexture for your custom texture streaming solution.

    To be clear, using Graphics.CopyTexture (or command buffer equivalent) will prevent you from having to call Apply().
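    For instance, once the source texture is already GPU-resident (e.g. an uploaded Texture2D or a RenderTexture), a slice of a Texture2DArray can be filled with a GPU-side copy and no Apply on the array (a sketch; CopyTexture requires matching formats and sizes):

    Code (CSharp):
    using UnityEngine;

    public static class TextureArrayStreaming
    {
        // Copies mip 0 of a GPU-resident source texture into one slice of the array.
        // CopyTexture is a GPU-side copy, so no Apply() is needed on the array.
        public static void CopyIntoSlice(Texture src, Texture2DArray dstArray, int slice)
        {
            // (src, srcElement, srcMip, dst, dstElement, dstMip)
            Graphics.CopyTexture(src, 0, 0, dstArray, slice, 0);
        }
    }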
     
  14. LaireonGames

    LaireonGames

    Joined:
    Nov 16, 2013
    Posts:
    705
    It's not as simple as that. I'm downloading data from the network that I use to build a texture. I need to call Apply to get the data to the GPU before I can use things like Graphics.CopyTexture.
     
  15. joshuacwilde

    joshuacwilde

    Joined:
    Feb 4, 2018
    Posts:
    731
    What I do is call GetRawTextureData<byte>() on the texture, then write the data to that array, then call Apply() on the texture later. I get pretty good performance with that.
     
  16. LaireonGames

    LaireonGames

    Joined:
    Nov 16, 2013
    Posts:
    705
    I'm encoding to JPG and storing the byte[].

    Then I'm reading the byte[] and setting it into the texture array (works nice and fast).

    But Apply was taking 50 ms. This was only with a 1024 by 1024 array with 10 entries! The texture array is also RGB24.

    Edit: Edited to clarify that I'm editing an array, not just a Texture2D. My original wording wasn't clear on this.
     
    Last edited: Nov 13, 2022
  17. joshuacwilde

    joshuacwilde

    Joined:
    Feb 4, 2018
    Posts:
    731
    What I'm saying is that maybe Apply will be faster if you write into the array from GetRawTextureData(). You could also try writing only a bit per frame, if that is allowable.
     
  18. LaireonGames

    LaireonGames

    Joined:
    Nov 16, 2013
    Posts:
    705
    You have to write the whole mip at once with a Texture2DArray, and you can't use GetRawTextureData() with them either. You can write pixel data from custom data, which is fast, but like I say, Apply is the problem.

    The speed of Apply doesn't change depending on whether you use SetPixels or SetPixelData. I doubted it would, because it's still just going from CPU to GPU, and those methods only operate on CPU data.
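    For reference, the SetPixelData path being discussed looks roughly like this (a sketch, assuming tightly packed RGB24 bytes for the full mip; Texture2DArray.SetPixelData only exists in newer Unity versions):

    Code (CSharp):
    using UnityEngine;

    public static class TextureArrayUpload
    {
        // rgb24 must contain width * height * 3 bytes - a full mip 0 for one slice.
        public static void UploadSlice(Texture2DArray array, int slice, byte[] rgb24)
        {
            // (data, mipLevel, element)
            array.SetPixelData(rgb24, 0, slice);

            // The CPU -> GPU upload still happens in Apply, which is where the cost is.
            array.Apply(false, false);
        }
    }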