
[Solved] Texture pointer with DirectX

Discussion in 'General Graphics' started by lacucaracha, May 30, 2016.

  1. lacucaracha

    Joined:
    May 11, 2016
    Posts:
    8
    Hello!


    I'm here to ask for some help because I'm a bit lost at the moment :)

    Here is my situation:
    I have developed a native plugin in C/C++ that decodes video using FFmpeg and SDL.
    For each frame, I have an AVFrame (a struct that contains the data of one video frame), and I use OpenGL to bind that frame data to the texture pointer obtained from Unity with GetNativeTexturePtr.

    This is my current setup, and it works really well!

    In this case I have to select the OpenGL graphics API in the Player Settings in Unity.

    I need to change my code to support DirectX 11, so that I can integrate the Oculus SDK and won't have obsolete OpenGL code in my native plugin.


    To be clear: the function that binds the frame data I get from FFmpeg to the texture pointer from Unity needs to be changed so that it works when Unity runs with the DirectX graphics API.

    How do I write data into a texture, given its pointer, using DirectX code?

    Could you please guide me through this modification?


    I'm using Unity 5.2.4 (32-bit) and Visual Studio.
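
    For reference, here is roughly how the texture pointer arrives in my plugin (a simplified sketch, modeled on SetTextureFromUnity in the Unity low-level plugin example; on the C# side I pass texture.GetNativeTexturePtr() to it):

    Code (CSharp):
        static void* g_TexturePointer = NULL;

        //called from C# with myTexture.GetNativeTexturePtr() as the argument
        extern "C" void UNITY_INTERFACE_EXPORT UNITY_INTERFACE_API SetTexturePtr(void* textureHandle)
        {
            g_TexturePointer = textureHandle;
        }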

    Thank you for your help!
     
  2. lacucaracha

    Joined:
    May 11, 2016
    Posts:
    8
    Here is the code I use for the rendering function:
    (in my C/C++ code, using OpenGL functions)

    Code (CSharp):
    1.     void DoRendering() {
    2.  
    3.             // Opengl case
    4.             if (s_DeviceType == kUnityGfxRendererOpenGL) {
    5.  
    6.                 //init
    7.                 //g_TexturePointer is a void * that i get from unity , this is the pointer of the texture
    8.                 texture = (GLuint)(size_t)(g_TexturePointer);
    9.                 glBindTexture(GL_TEXTURE_2D, texture);
    10.  
    11.                 //parameters
    12.                 glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    13.                 glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    14.                 glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    15.                 glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    16.  
    17.  
    18.                 if (global_video_state && global_video_state->playerState != STOPPED) {
    19.                     if (savedFrame && savedFrame->data[0] && isPictAvailable) {
    20.                         GLsizei texWidth = videoW;
    21.                         GLsizei texHeight = videoH;
    22.  
    23.                         if (isFirstUseOfGLText) {
    24.                             //savedFrame->data is type of uint8_t *
    25.                             glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, texWidth, texHeight, 0, GL_RGB, GL_UNSIGNED_BYTE, savedFrame->data[0]);
    26.                             isFirstUseOfGLText = false;
    27.                         }
    28.                         else {
    29.                             glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, texWidth, texHeight, GL_RGB, GL_UNSIGNED_BYTE, savedFrame->data[0]);
    30.                         }
    31.  
    32.                         isPictAvailable = false;
    33.                     }
    34.                 }
    35.                 else {
    36.                 //display black image of 2x2 pixels
    37.                     float pixels[] = {
    38.                         0.0f,0.0f,0.0f, 0.0f,0.0f,0.0f,
    39.                         0.0f,0.0f,0.0f, 0.0f,0.0f,0.0f,
    40.                     };
    41.                     glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 2, 2, 0, GL_RGB, GL_FLOAT, pixels);
    42.                 }
    43.                 glFlush();
    44.                 glFinish();
    45.             }
    46.  
    47.  
    48.             // D3D11 case
    49.             if (s_DeviceType == kUnityGfxRendererD3D11)
    50.             {
    51.                 // update native texture from code
    52.                 if (g_TexturePointer)
    53.                 {
    54.                     ID3D11Texture2D* d3dtex = (ID3D11Texture2D*)g_TexturePointer;
    55.                     D3D11_TEXTURE2D_DESC desc;
    56.                     d3dtex->GetDesc(&desc);
    57.  
    58.                
    59.                     //here i'm lost
    60.                 }
    61.             }
    62.     }

    As you can see at line 59 ("//here i'm lost"), that is where I need help :)

    Thank you!
     
  3. lacucaracha

    Joined:
    May 11, 2016
    Posts:
    8
    Hello again,


    I also have the feeling that this function is never called, so it's impossible for me to know which graphics context Unity is using when my plugin runs:

    In my C/C++ code:
    Code (CSharp):
    void UNITY_INTERFACE_EXPORT UNITY_INTERFACE_API UnityPluginLoad(IUnityInterfaces* unityInterfaces)
    //exact code from the example...
    This code comes from the example project available here:
    http://docs.unity3d.com/Manual/NativePluginInterface.html

    In my Unity C# code, an example of a function I use:
    Code (CSharp):
    #if UNITY_5 && !UNITY_5_0 && !UNITY_5_1
        [DllImport("FFMPEGPlayerWIN")]
        public static extern void SetTimeFromUnity(float value);

    ...
    //example
    SetTimeFromUnity(1);

    #endif
    Which leaves me with only this:

    Code (CSharp):
    UnityGfxRenderer s_DeviceType = kUnityGfxRendererNull;
    from IUnityGraphics.h
    And this variable stays at kUnityGfxRendererNull the whole time, even if I change the graphics API in the Player Settings to DirectX or OpenGL.


    Thx for reading


    EDIT: I found the solution to this problem.

    I used DLL Export Viewer (http://www.nirsoft.net/utils/dll_export_viewer.html) to see every exported function of my lib, and I found out that there was an '_' prepended to my function names, caused by the "__stdcall" (hidden inside UNITY_INTERFACE_API) added in front of the function.
    My bad then; I had to remove it, leaving only this:
    Code (CSharp):
    void UNITY_INTERFACE_EXPORT UnityPluginLoad(IUnityInterfaces* unityInterfaces)
    So now I'm able to switch contexts between OpenGL and DirectX normally, and my variable s_DeviceType is correct (its value is 0 for Desktop OpenGL and 2 for Direct3D 11).
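
    For completeness, the body is unchanged from the example project; it registers the device event callback that keeps s_DeviceType up to date:

    Code (CSharp):
        void UNITY_INTERFACE_EXPORT UnityPluginLoad(IUnityInterfaces* unityInterfaces)
        {
            s_UnityInterfaces = unityInterfaces;
            s_Graphics = s_UnityInterfaces->Get<IUnityGraphics>();
            s_Graphics->RegisterDeviceEventCallback(OnGraphicsDeviceEvent);

            //run OnGraphicsDeviceEvent(initialize) manually on plugin load
            OnGraphicsDeviceEvent(kUnityGfxDeviceEventInitialize);
        }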

    Now I can continue my research on the DirectX 11 texture pointer question (first message).
     
    Last edited: Jun 1, 2016
  4. lacucaracha

    Joined:
    May 11, 2016
    Posts:
    8
    Hello guys!

    Now that my UnityPluginLoad works, I have a context!
    This gives me access to an "IUnityGraphicsD3D11" and an "ID3D11Device" :)

    So I have found a solution to my problem.

    I will explain my work here; may this post help someone in the future!
    And thanks to the community for your help... oh wait?


    Let's begin:

    1 - You need a UnityPluginLoad that works correctly, so you can get the D3D11 context:
    Code (CSharp):
    IUnityGraphicsD3D11* d3d11 = s_UnityInterfaces->Get<IUnityGraphicsD3D11>();
    g_D3D11Device = d3d11->GetDevice();
    2 - I use the format "BGRA32" (TextureFormat.BGRA32) for my texture in Unity and, correspondingly, "AV_PIX_FMT_BGRA" in my FFmpeg code for the AVFrame (see the conversion sketch after the code below).

    3 - I now have updated code that takes D3D11 into account:

    Code (CSharp):
    void DoRendering() {

    #if SUPPORT_D3D11
            // D3D11 case
            if (s_DeviceType == kUnityGfxRendererD3D11)
            {
                //get the ID3D11DeviceContext
                //g_D3D11Device is of type ID3D11Device*, obtained in UnityPluginLoad
                ID3D11DeviceContext* ctx = NULL;
                g_D3D11Device->GetImmediateContext(&ctx);

                // update native texture from code
                if (g_TexturePointer)
                {
                    //g_TexturePointer is received from unity via SetTexturePtr (void *)
                    //cast it to an ID3D11Texture2D*
                    ID3D11Texture2D* d3dtex = (ID3D11Texture2D*)g_TexturePointer;

                    //D3D11_TEXTURE2D_DESC desc;
                    //d3dtex->GetDesc(&desc);

                    //use ID3D11DeviceContext::UpdateSubresource to fill the texture with
                    //the frame data; savedFrame->linesize[0] is the row pitch in bytes
                    ctx->UpdateSubresource(d3dtex, 0, NULL, savedFrame->data[0], savedFrame->linesize[0], 0);
                }

                //GetImmediateContext adds a reference, so release it each frame
                ctx->Release();
            }//end of dx11 case
    #endif

    }
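
    About step 2: in my FFmpeg code I convert each decoded frame to BGRA with swscale, so the bytes match the Unity texture. A sketch (codecCtx, decodedFrame and the helper name ConvertToBGRA are illustrative, not from the plugin above):

    Code (CSharp):
        extern "C" {
        #include <libavcodec/avcodec.h>
        #include <libswscale/swscale.h>
        #include <libavutil/imgutils.h>
        }

        static struct SwsContext* g_SwsCtx = NULL;

        void ConvertToBGRA(AVCodecContext* codecCtx, AVFrame* decodedFrame, AVFrame* savedFrame)
        {
            //create (or reuse) a converter: decoder pixel format -> BGRA, same size
            g_SwsCtx = sws_getCachedContext(g_SwsCtx,
                codecCtx->width, codecCtx->height, codecCtx->pix_fmt,
                codecCtx->width, codecCtx->height, AV_PIX_FMT_BGRA,
                SWS_BILINEAR, NULL, NULL, NULL);

            //savedFrame->data/linesize must already be allocated for BGRA, e.g. with
            //av_image_alloc(savedFrame->data, savedFrame->linesize,
            //               codecCtx->width, codecCtx->height, AV_PIX_FMT_BGRA, 1);
            sws_scale(g_SwsCtx,
                (const uint8_t* const*)decodedFrame->data, decodedFrame->linesize,
                0, codecCtx->height,
                savedFrame->data, savedFrame->linesize);
        }

    savedFrame->linesize[0] is then exactly the row pitch I pass to UpdateSubresource above.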

    But with those changes my OpenGL code is not working anymore; I would have to change some color formats. For the moment I don't care, because I only need DirectX 11!


    I'll test a bit more, and if it's all OK on my side, I will change the title of this topic to solved.
    EDIT: Set as solved!

    Thanks for reading anyway; too bad no one responded :(
     
    Last edited: Jun 2, 2016
    kyle_unity424 and glenrhodes like this.
  5. silverduck

    Joined:
    May 16, 2010
    Posts:
    27
    Three and a half years later, this thread saved my buns. Thank you so much.
     
  6. asmbaty

    Joined:
    Jul 20, 2014
    Posts:
    1
    I'm trying to render the output of a Unity camera into a native window using DirectX. To achieve this, in Unity I first copy the camera's render texture to a new texture in the OnRenderImage event:

    Code (CSharp):
    RenderTexture.active = source as RenderTexture;
    _wrappedTexture.ReadPixels (new Rect (0, 0, source.width, source.height), 0, 0);
    _wrappedTexture.Apply ();
    RenderTexture.active = null;
    Then I send _wrappedTexture.GetNativeTexturePtr() to the plugin to render the frame in the native window. I created a new DirectX device & context for the native window. I am new to DirectX and don't have much knowledge there.

    Starting from the next update, I use Graphics.CopyTexture to make the operation faster:

    Code (CSharp):
    Graphics.CopyTexture(source, _wrappedTexture);
    I would like to achieve the best possible performance, where I don't have to copy the render texture at all. Sending RenderTexture.GetNativeTexturePtr() directly to the plugin doesn't work.
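
    From what I understand so far, the direct pointer doesn't work because a D3D11 resource can only be used on the device that created it, and my native window uses its own device. A minimal check sketch (nativePtr is an illustrative name for the value I pass from C#):

    Code (CSharp):
        //nativePtr comes from GetNativeTexturePtr() on the C# side
        ID3D11Texture2D* tex = (ID3D11Texture2D*)nativePtr;

        ID3D11Device* owner = NULL;
        tex->GetDevice(&owner); //this returns Unity's device, not the one I created for the window

        //so the texture cannot be bound on my own device directly;
        //it would have to be shared between the two devices instead
        owner->Release();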