
RenderTexture returns format not supported for ID3D11RenderTargetView

Discussion in 'Windows' started by mashman, Sep 18, 2013.

  1. mashman

    Joined: Aug 23, 2013
    Posts: 8
    I'm using a RenderTexture to get a native texture for rendering by an external native plugin, but have hit a snag.

    I can successfully static_cast<ID3D11Texture2D*> the pointer returned by RenderTexture.GetNativeTexturePtr() in my rendering plugin, but when I inspect the texture descriptor in my rendering setup code, the texture format is DXGI_FORMAT_R8G8B8A8_TYPELESS. According to ID3D11Device::CheckFormatSupport(), a texture with this format cannot be used as a render target, and when attempting CreateRenderTargetView() I end up getting E_INVALIDARG.

    Am I missing something basic here? Is there a way to get an ID3D11Texture2D from Unity which can be rendered to by a plugin?

    Thanks in advance...
     
  2. Tautvydas-Zilys

    Unity Technologies

    Joined: Jul 25, 2013
    Posts: 10,504
    Hello,

    when calling CreateRenderTargetView, you should set the Format field of the D3D11_RENDER_TARGET_VIEW_DESC structure to either DXGI_FORMAT_R8G8B8A8_UNORM or DXGI_FORMAT_R8G8B8A8_UNORM_SRGB.
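
    For example, a minimal sketch (here device stands for the ID3D11Device your plugin obtained from Unity, and texPtr for the pointer returned by RenderTexture.GetNativeTexturePtr() - both names are placeholders):

    #include <d3d11.h>

    // The resource is DXGI_FORMAT_R8G8B8A8_TYPELESS; the view must name a
    // fully typed format from the same family.
    ID3D11Texture2D* texture = static_cast<ID3D11Texture2D*>(texPtr);

    D3D11_RENDER_TARGET_VIEW_DESC rtvDesc = {};
    rtvDesc.Format = DXGI_FORMAT_R8G8B8A8_UNORM; // or DXGI_FORMAT_R8G8B8A8_UNORM_SRGB
    rtvDesc.ViewDimension = D3D11_RTV_DIMENSION_TEXTURE2D;
    rtvDesc.Texture2D.MipSlice = 0;

    ID3D11RenderTargetView* rtv = nullptr;
    HRESULT hr = device->CreateRenderTargetView(texture, &rtvDesc, &rtv);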
     
  3. mashman

    Joined: Aug 23, 2013
    Posts: 8
    Excellent. Anything else I should watch out for when rendering from my native plugin to the texture returned by RenderTexture.GetNativeTexturePtr()?
     
  4. Tautvydas-Zilys

    Unity Technologies

    Joined: Jul 25, 2013
    Posts: 10,504
  5. DevDuFF

    Joined: Jan 3, 2013
    Posts: 17
  6. Tautvydas-Zilys

    Unity Technologies

    Joined: Jul 25, 2013
    Posts: 10,504
    You don't get to set D3D11_TEXTURE2D_DESC - it is fixed when the texture is created by Unity. You can only read it by calling ID3D11Texture2D::GetDesc().

    Regarding D3D11_RENDER_TARGET_VIEW_DESC (see the sketch below):
    * You must set ViewDimension to D3D11_RTV_DIMENSION_TEXTURE2D;
    * If you enabled mipmaps for the render texture in Unity, Texture2D.MipSlice can be any mip level. If you don't use mipmaps, it must be 0;
    * Format depends on the format you specified on the Unity side when creating the render texture.
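
    A rough sketch putting those rules together (assuming no mipmaps and a default ARGB32 render texture - adjust MipSlice and Format to match what you created on the Unity side):

    #include <d3d11.h>

    ID3D11RenderTargetView* CreateUnityRTV(ID3D11Device* device, ID3D11Texture2D* texture)
    {
        // Read-only: Unity filled this in when it created the texture.
        D3D11_TEXTURE2D_DESC texDesc = {};
        texture->GetDesc(&texDesc);

        D3D11_RENDER_TARGET_VIEW_DESC rtvDesc = {};
        rtvDesc.ViewDimension = D3D11_RTV_DIMENSION_TEXTURE2D; // required
        rtvDesc.Texture2D.MipSlice = 0;                        // must be 0 without mipmaps
        rtvDesc.Format = DXGI_FORMAT_R8G8B8A8_UNORM;           // typed variant of texDesc.Format

        ID3D11RenderTargetView* rtv = nullptr;
        HRESULT hr = device->CreateRenderTargetView(texture, &rtvDesc, &rtv);
        return SUCCEEDED(hr) ? rtv : nullptr;
    }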
     
  7. Yashiz

    Joined: Jul 9, 2015
    Posts: 11
    Hi,

    In Unity 2017.1, ID3D11Texture2D::GetDesc() always returns the format as DXGI_FORMAT_R8G8B8A8_TYPELESS, no matter whether the RenderTexture format in Unity is ARGBFloat, ARGBHalf, or RFloat. How is the conversion made?

    I would like to use Unity to render some G-Buffers and pass the render textures to my native plugin, which renders in DX11. Do you have suggestions about how to do that? As you know, DXGI_FORMAT_R8G8B8A8_TYPELESS is not precise enough to store positions and normals.

    Any tutorials or docs about inter-operation between Unity textures and DX11 native textures would be helpful.

    Thanks,

    Yashiz
     
  8. Tautvydas-Zilys

    Unity Technologies

    Joined: Jul 25, 2013
    Posts: 10,504
    That doesn't sound right - do you have a small sample that demonstrates this behaviour? From what I can tell, it should create DXGI_FORMAT_R16G16B16A16_TYPELESS for ARGBHalf, DXGI_FORMAT_R32G32B32A32_TYPELESS for ARGBFloat and DXGI_FORMAT_R32_TYPELESS for RFloat.
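
    For reference, a small helper sketch mapping those typeless formats to fully typed view formats (the specific typed choices are assumptions - pick whichever variant your shaders expect):

    #include <d3d11.h>

    // Assumed mapping from the typeless formats Unity allocates to fully
    // typed formats usable in a render target or shader resource view.
    DXGI_FORMAT TypedFormatFor(DXGI_FORMAT typeless)
    {
        switch (typeless)
        {
        case DXGI_FORMAT_R8G8B8A8_TYPELESS:     return DXGI_FORMAT_R8G8B8A8_UNORM;     // ARGB32
        case DXGI_FORMAT_R16G16B16A16_TYPELESS: return DXGI_FORMAT_R16G16B16A16_FLOAT; // ARGBHalf
        case DXGI_FORMAT_R32G32B32A32_TYPELESS: return DXGI_FORMAT_R32G32B32A32_FLOAT; // ARGBFloat
        case DXGI_FORMAT_R32_TYPELESS:          return DXGI_FORMAT_R32_FLOAT;          // RFloat
        default:                                return typeless; // already a typed format
        }
    }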
     
  9. Yashiz

    Joined: Jul 9, 2015
    Posts: 11
    Thank you for the confirmation, Tautvydas-Zilys. It was my own stupid mistake - I wasn't passing the correct render texture.