
GetNativeTexturePtr() call behavior differs from RenderTexture/Texture2D, how come?

Discussion in 'Editor & General Support' started by Nikolaj, Aug 22, 2013.

  1. Nikolaj

    Nikolaj

    Joined:
    Aug 21, 2013
    Posts:
    3
    Hi,
    I'm trying to write a DLL plugin that uses CUDA and DirectX 11 interoperability, and I need a pointer to a RenderTexture created in Unity. However, calling GetNativeTexturePtr() on a RenderTexture does not behave the same as calling GetNativeTexturePtr() on a Texture2D.

    Here's a snippet of the C# code that creates the texture and render texture in a Unity script:

    Code (csharp):

    // Create a Texture2D and a RenderTexture
    Texture2D tex = new Texture2D(256, 256, TextureFormat.ARGB32, false);
    RenderTexture renderTexture = new RenderTexture(256, 256, 0, RenderTextureFormat.ARGB32);
    renderTexture.isPowerOfTwo = true;
    renderTexture.Create();
    tex.Apply();

    // Link the RenderTexture to a camera and display it on a plane surface;
    // display the Texture2D on this GameObject
    mirrorCamera.targetTexture = renderTexture;
    mirror.renderer.material.mainTexture = mirrorCamera.targetTexture;
    renderer.material.mainTexture = tex;

    // Pass the native texture pointers to the plugin
    SetTextureDestinationFromUnity(renderer.material.mainTexture.GetNativeTexturePtr());
    SetRenderTextureSourceFromUnity(mirror.renderer.material.mainTexture.GetNativeTexturePtr());
    In my DLL plugin I attempt to register the texture pointers with CUDA with the following code:

    Code (cpp):

    ID3D11Texture2D* d3dtex = (ID3D11Texture2D*)g_TexturePointerDestination;
    ID3D11Texture2D* d3drendertargetsource = (ID3D11Texture2D*)g_RenderTexturePointerSource;

    cudaGraphicsD3D11RegisterResource(&g_texture_2d.cudaResource, d3dtex, cudaGraphicsRegisterFlagsNone);
    getLastCudaError("cudaGraphicsD3D11RegisterResource texture failed");

    cudaGraphicsD3D11RegisterResource(&g_texture_2d.cudaRenderTarget_source, d3drendertargetsource, cudaGraphicsRegisterFlagsNone);
    getLastCudaError("cudaGraphicsD3D11RegisterResource render texture failed");
    In the above code, registering the Texture2D with CUDA works fine, and I can do any further processing I need. However, the call to register d3drendertargetsource (the RenderTexture pointer sent from Unity) promptly crashes the application. My guess is that I need to do some extra DirectX setup in my DLL. This is my first time working with DirectX; I originally wrote the code I want to port for OpenGL, and I'm having trouble figuring out what exactly I am doing wrong. I hope someone sees this and has the answer :)
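    [Editor's note] One way to narrow a crash like this down (a sketch, not from the thread) is to inspect the DXGI format of the texture behind the native pointer before handing it to CUDA, since cudaGraphicsD3D11RegisterResource rejects formats it cannot map. The helper below inlines the two dxgiformat.h enum values it needs so it compiles without the Windows SDK; in the actual plugin you would include <d3d11.h> and call GetDesc on the real pointer, as shown in the comment.

    ```cpp
    #include <cassert>
    #include <cstdio>

    // Values copied from dxgiformat.h so this sketch builds anywhere;
    // real plugin code would include <d3d11.h> and use the DXGI_FORMAT enum.
    constexpr unsigned kDXGI_FORMAT_R8G8B8A8_TYPELESS = 27;
    constexpr unsigned kDXGI_FORMAT_R8G8B8A8_UNORM    = 28;
    constexpr unsigned kDXGI_FORMAT_B8G8R8A8_UNORM    = 87;
    constexpr unsigned kDXGI_FORMAT_B8G8R8A8_TYPELESS = 90;

    // True for the typeless 8-bit RGBA formats a D3D11 render target commonly
    // uses; CUDA's register call cannot work with typeless resources.
    bool IsTypelessFormat(unsigned fmt) {
        return fmt == kDXGI_FORMAT_R8G8B8A8_TYPELESS ||
               fmt == kDXGI_FORMAT_B8G8R8A8_TYPELESS;
    }

    // In the plugin itself the check would look like this (Windows only):
    //   ID3D11Texture2D* tex = (ID3D11Texture2D*)g_RenderTexturePointerSource;
    //   D3D11_TEXTURE2D_DESC desc;
    //   tex->GetDesc(&desc);
    //   if (IsTypelessFormat(desc.Format)) { /* CUDA cannot register this */ }

    int main() {
        assert(IsTypelessFormat(kDXGI_FORMAT_R8G8B8A8_TYPELESS));
        assert(!IsTypelessFormat(kDXGI_FORMAT_R8G8B8A8_UNORM));
        assert(!IsTypelessFormat(kDXGI_FORMAT_B8G8R8A8_UNORM));
        std::printf("format check ok\n");
        return 0;
    }
    ```
    
    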

    -Nikolaj
     
  2. Nikolaj

    Nikolaj

    Joined:
    Aug 21, 2013
    Posts:
    3
    If I am missing any relevant information, please don't hesitate to tell me.

    -Nikolaj
     
  3. Nikolaj

    Nikolaj

    Joined:
    Aug 21, 2013
    Posts:
    3
    The call to cudaGraphicsD3D11RegisterResource returns the following error when given the pointer from a RenderTexture's GetNativeTexturePtr(): "CUDA Runtime API error 11: invalid argument."

    According to the CUDA documentation, the only render target that cannot be registered with CUDA is the primary render target.

    As mentioned, cudaGraphicsD3D11RegisterResource works as intended with the GetNativeTexturePtr() of a Texture2D.
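    [Editor's note] Checking the return value of the register call directly, rather than querying the last error afterwards, makes a failure like this surface immediately with context. A minimal sketch of that pattern follows; the integer stand-in mimics cudaError_t so the example compiles without the CUDA toolkit, and 11 is simply the code this thread reports.

    ```cpp
    #include <cassert>
    #include <stdexcept>
    #include <string>

    // Stand-in for cudaError_t so this sketch builds without the CUDA toolkit;
    // real plugin code would use cudaError_t and cudaGetErrorString instead.
    using CudaErr = int;
    constexpr CudaErr kCudaSuccess = 0;

    // Throws with a readable message when a CUDA-style call fails, so a bad
    // texture pointer surfaces as an exception instead of a later crash.
    void RequireCudaSuccess(CudaErr err, const std::string& what) {
        if (err != kCudaSuccess) {
            throw std::runtime_error(what + " failed with CUDA error " +
                                     std::to_string(err));
        }
    }

    // Real usage in the plugin would be:
    //   RequireCudaSuccess(
    //       cudaGraphicsD3D11RegisterResource(&res, d3dtex,
    //                                         cudaGraphicsRegisterFlagsNone),
    //       "cudaGraphicsD3D11RegisterResource");

    int main() {
        RequireCudaSuccess(kCudaSuccess, "register");  // succeeds, no throw
        bool threw = false;
        try {
            RequireCudaSuccess(11, "register");        // the error in this thread
        } catch (const std::runtime_error&) {
            threw = true;
        }
        assert(threw);
        return 0;
    }
    ```
    
    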

    -Nikolaj
     
  4. CKahler

    CKahler

    Joined:
    May 6, 2013
    Posts:
    149
    Hi Nikolaj,

    did you find a solution to this problem? I'm trying the same thing with DX11 & OpenCL, and I'm also having trouble getting the RenderTexture working with it.
     
  5. korzen303

    korzen303

    Joined:
    Oct 2, 2012
    Posts:
    223
    Hi Nikolaj,

    any updates on this? Would it be possible to have a look at your interop code? I need to do something similar with compute buffers and CUDA.

    Thanks
     
  6. ReJ

    ReJ

    Unity Technologies

    Joined:
    Nov 1, 2008
    Posts:
    378
  7. Check_2015

    Check_2015

    Joined:
    Oct 4, 2015
    Posts:
    2
  8. luyangliu123

    luyangliu123

    Joined:
    Mar 10, 2017
    Posts:
    1
    Did anyone fix this problem? I adopted the suggestion from ReJ and get the correct native texture pointer, but I still get "CUDA Runtime API error 11: invalid argument" when calling cudaGraphicsD3D11RegisterResource.
     
  9. XinYueStudio

    XinYueStudio

    Joined:
    Oct 27, 2015
    Posts:
    2
    Hi Nikolaj, did you find a solution to this problem? I want to study this project. Would you be willing to share a demo with me?
     
  10. seb_krueger

    seb_krueger

    Joined:
    Jan 4, 2019
    Posts:
    17
    Hi,

    cudaGraphicsD3D11RegisterResource does not work with typeless texture formats. As far as I know, there is no solution for that other than making a copy of the texture into a strongly typed one. This should also explain the issue CUDA/DX interop has with the main render target texture.
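    [Editor's note] A sketch of that workaround (illustrative names, not Sebastian's code): create a second texture whose description matches the RenderTexture's but with a strongly typed DXGI format, CopyResource the render target into it each frame (D3D11 allows copies between typeless and typed textures of the same format family), and register the typed copy with CUDA instead. The simplified struct below stands in for D3D11_TEXTURE2D_DESC so the format-fixing step compiles without the Windows SDK.

    ```cpp
    #include <cassert>

    // Simplified stand-in for D3D11_TEXTURE2D_DESC (real code uses <d3d11.h>).
    struct TexDesc {
        unsigned Width  = 0;
        unsigned Height = 0;
        unsigned Format = 0;
    };

    // dxgiformat.h values for the two formats this sketch cares about.
    constexpr unsigned kR8G8B8A8_TYPELESS = 27;
    constexpr unsigned kR8G8B8A8_UNORM    = 28;

    // Build the description of the strongly typed copy: identical to the
    // source, except a typeless format is replaced by a typed equivalent.
    TexDesc TypedCopyDesc(TexDesc src) {
        if (src.Format == kR8G8B8A8_TYPELESS) {
            src.Format = kR8G8B8A8_UNORM;
        }
        return src;
    }

    // The D3D11/CUDA side would then be (Windows only):
    //   device->CreateTexture2D(&typedDesc, nullptr, &typedTex);
    //   cudaGraphicsD3D11RegisterResource(&res, typedTex,
    //                                     cudaGraphicsRegisterFlagsNone);
    //   // each frame, before mapping the CUDA resource:
    //   context->CopyResource(typedTex, renderTextureNativePtr);

    int main() {
        TexDesc src{256, 256, kR8G8B8A8_TYPELESS};
        TexDesc dst = TypedCopyDesc(src);
        assert(dst.Format == kR8G8B8A8_UNORM);
        assert(dst.Width == 256 && dst.Height == 256);
        return 0;
    }
    ```
    
    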

    Best
    Sebastian