
GetPixels32() Returning Uninitialized Data on a Valid ARGB32 Texture2D

Discussion in 'Scripting' started by knchaffin, Jun 8, 2014.

  1. knchaffin

    knchaffin

    Joined:
    May 18, 2013
    Posts:
    58
    Does anyone have any idea why GetPixels32() on a valid ARGB32 Texture2D would return a Color32 array of the correct Length, but with every pixel component set to 0xCD? 0xCD is the fill pattern the Visual Studio debug runtime uses for uninitialized heap memory. But I know the Texture2D is good, because it displays correctly on a quad primitive in Unity.
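
    For concreteness, here is a minimal sketch of the failing read-back (the class and field names are mine, not the real script's):

    Code (CSharp):
    using UnityEngine;

    public class PixelReadbackCheck : MonoBehaviour
    {
        public Texture2D tex; // the ARGB32 texture that renders fine on the quad

        void Start()
        {
            // GetPixels32() reads the CPU-side copy of the texture's data,
            // not whatever the GPU is currently displaying.
            Color32[] pixels = tex.GetPixels32();
            Debug.Log(pixels.Length + " pixels, first = " + pixels[0]);
            // Every component coming back as 205 (0xCD) suggests the CPU-side
            // buffer was never written, even though the GPU copy renders fine.
        }
    }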

    There is a lot going on in this Unity 4.5 C# script, including an NVIDIA CUDA DLL plugin and a DirectCompute HLSL compute shader. The Texture2D in question has had its DX11 native texture pointer passed to the plugin, which sets the texture content procedurally, and that content is what displays correctly in Unity. The problem appears when I call GetPixels32() on this Texture2D to copy its pixels to a second Texture2D, which is bound to the compute shader via foocomputeshader.SetTexture(): the returned pixels apparently contain uninitialized data. Since the texture displays properly in Unity, I know it holds correct pixel values on the GPU, so all I can think of is that something I am doing sets the Texture2D to not-readable or locks it in some way.
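
    In case it helps, the flow is roughly this (simplified; the plugin entry point, kernel name, and shader property name are placeholders for my real ones):

    Code (CSharp):
    using System;
    using System.Runtime.InteropServices;
    using UnityEngine;

    public class CudaTextureBridge : MonoBehaviour
    {
        // Placeholder for the real CUDA plugin entry point.
        [DllImport("MyCudaPlugin")]
        private static extern void FillTextureFromCuda(IntPtr texPtr, int w, int h);

        public Texture2D texA;              // written by the CUDA plugin on the GPU
        public Texture2D texB;              // read by the compute shader
        public ComputeShader computeShader;

        void Start()
        {
            int kernel = computeShader.FindKernel("CSMain");

            // Hand texA's DX11 native pointer to the CUDA plugin, which fills it.
            FillTextureFromCuda(texA.GetNativeTexturePtr(), texA.width, texA.height);

            // Copy texA -> texB through the CPU so the compute shader can read texB.
            Color32[] px = texA.GetPixels32();   // this is where the 0xCD data shows up
            texB.SetPixels32(px);
            texB.Apply();

            computeShader.SetTexture(kernel, "_InputTex", texB);
        }
    }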

    Thanks to anyone who has any ideas here.

    K.N. Chaffin - Texas Tech University
     
  2. knchaffin

    knchaffin

    Joined:
    May 18, 2013
    Posts:
    58
    I never figured out exactly what was causing this problem, but it appeared to be an issue with writing to the Texture2D in the CUDA plugin, copying that texture to another Texture2D via GetPixels() and SetPixels() in the C# script, and then letting the compute shader read that second Texture2D.

    My workaround: I changed the first texture to a RenderTexture with enableRandomWrite set to true, passed its native texture pointer to the CUDA DLL and let the DLL populate it, then used ReadPixels() in the C# script to copy the RenderTexture into the Texture2D, and finally let the compute shader read this texture and set the TC-Particles particle values via the TC-Particles extension compute shader methodology. Everything is working great.

    So, if you have ever wondered whether you can mix a compute shader and a CUDA GP-GPU parallel-processing DLL, sharing textures between them and the Unity C# script, the answer is yes.
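
    For anyone who hits the same wall, the working flow looks roughly like this (the plugin entry point, kernel name, sizes, and shader property name are placeholders; the TC-Particles hookup is omitted):

    Code (CSharp):
    using System;
    using System.Runtime.InteropServices;
    using UnityEngine;

    public class CudaRenderTextureBridge : MonoBehaviour
    {
        // Placeholder for the real CUDA plugin entry point.
        [DllImport("MyCudaPlugin")]
        private static extern void FillTextureFromCuda(IntPtr texPtr, int w, int h);

        public ComputeShader computeShader;

        void Start()
        {
            int kernel = computeShader.FindKernel("CSMain");

            // Random-write RenderTexture that the CUDA plugin fills directly.
            RenderTexture rt = new RenderTexture(512, 512, 0, RenderTextureFormat.ARGB32);
            rt.enableRandomWrite = true;
            rt.Create();
            FillTextureFromCuda(rt.GetNativeTexturePtr(), rt.width, rt.height);

            // ReadPixels() copies from the currently active RenderTexture into
            // the Texture2D's CPU-side data, avoiding GetPixels() entirely.
            Texture2D readable = new Texture2D(rt.width, rt.height, TextureFormat.ARGB32, false);
            RenderTexture.active = rt;
            readable.ReadPixels(new Rect(0, 0, rt.width, rt.height), 0, 0);
            readable.Apply();
            RenderTexture.active = null;

            computeShader.SetTexture(kernel, "_InputTex", readable);
        }
    }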

    K.N. Chaffin - Texas Tech University
     
  3. LightStriker

    LightStriker

    Joined:
    Aug 3, 2013
    Posts:
    2,716
    Just to be sure: after SetPixels(), you do call Apply(), right?
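
    Something like this (texB and pixels standing in for your own variables):

    Code (CSharp):
    // Without Apply(), SetPixels/SetPixels32 only updates the CPU-side data;
    // the GPU texture the shaders actually sample never changes.
    texB.SetPixels32(pixels);
    texB.Apply();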
     
  4. knchaffin

    knchaffin

    Joined:
    May 18, 2013
    Posts:
    58
    Absolutely.
     
  5. knchaffin

    knchaffin

    Joined:
    May 18, 2013
    Posts:
    58
    I'm in the process of converting all of my textures to write-enabled ARGBFloat RenderTextures and using Graphics.Blit() to copy from RenderTexture to RenderTexture. This is working very well.
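
    In sketch form (the sizes here are just examples):

    Code (CSharp):
    using UnityEngine;

    public class FloatTextureCopy : MonoBehaviour
    {
        static RenderTexture MakeFloatRT(int w, int h)
        {
            // Float-format, random-write RenderTexture.
            RenderTexture rt = new RenderTexture(w, h, 0, RenderTextureFormat.ARGBFloat);
            rt.enableRandomWrite = true;
            rt.Create();
            return rt;
        }

        void Start()
        {
            RenderTexture src = MakeFloatRT(512, 512);
            RenderTexture dst = MakeFloatRT(512, 512);

            // GPU-side copy: no CPU read-back, so no GetPixels()/SetPixels() round trip.
            Graphics.Blit(src, dst);
        }
    }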

    K.N. Chaffin