Update your bookmarks! This blog is now hosted on http://xoofx.com/blog

Wednesday, November 3, 2010

Hacking Direct2D to use Direct3D 11 directly instead of the Direct3D 10.1 API

Disclaimer about this hack: This hack was nothing more than a proof of concept and I *really* don't have time to dig into any kind of bugs related to it.

[Edit] 13 Jan 2011: After Windows Update KB2454826, this hack stopped working. I have patched the sample to make it work again. Of course, you shouldn't consider this hack for any kind of production use. Use the standard DXGI shared sync keyed mutex instead. This hack is just for fun! [/Edit]


If you know Direct3D 11 and Direct2D - they were released at almost the same time - you already know that there is a huge drawback to using Direct2D: it only works on top of the Direct3D 10.1 API (although it runs on older hardware thanks to the new feature-level capability of the API).

From a developer's point of view, it is really disappointing that such a good API doesn't rely on the latest Direct3D API... all the more when you know that the Direct3D 11 API is really close to the Direct3D 10.1 API... In the end, more work is required from a developer who wants to work with Direct3D 11, as it no longer has any text API, for example. In D3D11, you have to do it yourself, which isn't a huge task in itself if you go for the easy precalculated texture-of-fonts generated by some GDI+ calls or whatever, but still... this is annoying, especially when you just need to display some information/FPS on the screen and can't wait to build a nice font-texture-based system...

I'm not being completely fair about Direct2D interoperability with Direct3D 11: there is in fact a well-known solution, proposed by a member of the DirectX team, that involves using a DXGI keyed mutex to synchronize a surface shared between D3D10.1 and D3D11. I was expecting this issue to be solved in some DirectX SDK release this year, but it seems there is no plan to release an update for Direct2D in the near future (see my question in the comments and the answer...)... WP7 and XNA are probably getting much more attention here...

So last week, I took some time to look at the Direct2D API and found that it's in fact fairly easy to hack Direct2D and redirect all the D3D10.1 API calls to a real Direct3D 11 instance... and this is pretty cool news! Here is the story of this little hack...


How does Direct2D access your already instantiated D3D10.1 device?


In order to use Direct2D with a renderable D3D10 texture2D, you need to query the IDXGISurface from your ID3D10Texture2D object, something like this:
// Create a Texture2D (or use the SwapChain backbuffer)
ID3D10Texture2D* texture2D;
d3d10Device->CreateTexture2D(&texture2DDesc, 0, &texture2D);

// Query the DXGI surface associated with the D3D10.1 Texture2D
IDXGISurface* surface;
texture2D->QueryInterface(__uuidof(IDXGISurface), (void**)&surface);

// Create a D2D render target from the D3D10 Texture2D through the associated DXGI surface
d2dFactory->CreateDxgiSurfaceRenderTarget(
        surface,
        &props,
        &d2dRenderTarget
        );
So, starting from this CreateDxgiSurfaceRenderTarget call, Direct2D is somehow able to get back your D3D10.1 instance and use it to submit draw calls, create textures, etc. To find out how Direct2D gets an instance of ID3D10Device1, I first implemented a proxy IDXGISurface that embedded the real DXGI surface and delegated all calls to it... while tracking down how Direct2D gets back this ID3D10Device1:

  • After the surface enters CreateDxgiSurfaceRenderTarget, Direct2D queries the IDXGIDevice through the GetDevice method on the IDXGISurface
  • From the IDXGIDevice, Direct2D calls QueryInterface with the IID of the ID3D10Device interface (surprisingly, not ID3D10Device1)
And bingo! Hand Direct2D your own implementation of ID3D10Device... and you can redirect all the D3D10 calls to a Direct3D 11 device/context with a simple proxy implementing the ID3D10Device1 methods!

Interoperability between D3D10.1 and D3D11 API


Migrating from the D3D10/D3D10.1 API to the D3D11 API is quite straightforward and even has a dedicated paper on MSDN. For the purpose of this quick hack, I didn't implement proxies for the whole D3D10 API... I have instead focused my work on how D2D actually uses the D3D10 API, and on which methods/structures are really not binary compatible between D3D10 and D3D11.

In the end, I developed 5 proxies:
  • a Proxy for IDXGISurface interface, in order to hack the GetDevice method and return my own proxy for IDXGIDevice
  • a Proxy for IDXGIDevice interface in order to hack the QueryInterface method and return my own proxy for ID3D10Device1
  • a Proxy for the ID3D10Device1 interface
  • a Proxy for the ID3D10Texture2D interface
  • a Proxy for the ID3D10Buffer interface
For the ID3D10Device1 interface, most of the methods redirect the calls directly to the device (ID3D11Device) or context (ID3D11DeviceContext). I didn't bother implementing proxies for most of the parameters, because even if they are not always binary compatible, the returned objects are only used as references and are never called directly. Take for example the proxy implementation of VSGetShader (which Direct2D uses to save the D3D10 pipeline state):
virtual void STDMETHODCALLTYPE VSGetShader( 
    /* [annotation] */ 
    __out  ID3D10VertexShader **ppVertexShader) { 
    context->VSGetShader((ID3D11VertexShader**)ppVertexShader, 0, 0);
}

A real proxy would have to wrap the ID3D11VertexShader inside an ID3D10VertexShader proxy... but because Direct2D (and this is no surprise) only uses VSGetShader so it can later call VSSetShader (to restore the saved state, or to set its own vertex/pixel shaders), it never calls any method on the ID3D10VertexShader instance... meaning we can give it back an ID3D11VertexShader directly without performing any - costly - conversion.

In fact, most of the ID3D10Device1 proxy methods are like the previous one, a simple redirection to the D3D11 device or device context... easy!

I was only forced to implement custom proxies for some incompatible structures... and for returned object instances that are effectively used by Direct2D (like ID3D10Buffer and ID3D10Texture2D).

For example, the ID3D10Device::CreateBuffer proxy method is implemented like this:

virtual HRESULT STDMETHODCALLTYPE CreateBuffer( 
 /* [annotation] */ 
 __in  const D3D10_BUFFER_DESC *pDesc,
 /* [annotation] */ 
 __in_opt  const D3D10_SUBRESOURCE_DATA *pInitialData,
 /* [annotation] */ 
 __out_opt  ID3D10Buffer **ppBuffer) {  
  D3D11_BUFFER_DESC desc11;

  // The leading fields of the two descriptors are binary compatible
  *((D3D10_BUFFER_DESC*)&desc11) = *pDesc;
  // The StructureByteStride field is new in D3D11
  desc11.StructureByteStride = 0;

  // Return our ID3D10Buffer proxy instead of the real one
  ProxyID3D10Buffer* buffer = new ProxyID3D10Buffer();
  buffer->device = this;
  *ppBuffer = buffer;
  HRESULT result = device()->CreateBuffer(&desc11, (D3D11_SUBRESOURCE_DATA*)pInitialData, (ID3D11Buffer**)&buffer->backend);

  CHECK_RETURN(result);

  return S_OK; 
}

There were also just two incompatible structures: D3D10_VIEWPORT/D3D11_VIEWPORT (D3D11 uses floats instead of ints!) and D3D10_BLEND_DESC/D3D11_BLEND_DESC... but the proxy methods were easy to implement:

virtual void STDMETHODCALLTYPE RSSetViewports( 
 /* [annotation] */ 
 __in_range(0, D3D10_VIEWPORT_AND_SCISSORRECT_OBJECT_COUNT_PER_PIPELINE)  UINT NumViewports,
 /* [annotation] */ 
 __in_ecount_opt(NumViewports)  const D3D10_VIEWPORT *pViewports) {

  // Perform the conversion between D3D10_VIEWPORT (ints) and D3D11_VIEWPORT (floats)
  D3D11_VIEWPORT viewports[16];
  for(UINT i = 0; i < NumViewports; i++) {
   viewports[i].TopLeftX = (FLOAT)pViewports[i].TopLeftX;
   viewports[i].TopLeftY = (FLOAT)pViewports[i].TopLeftY;
   viewports[i].Width = (FLOAT)pViewports[i].Width;
   viewports[i].Height = (FLOAT)pViewports[i].Height;
   viewports[i].MinDepth = pViewports[i].MinDepth;
   viewports[i].MaxDepth = pViewports[i].MaxDepth;
  }
  context->RSSetViewports(NumViewports, viewports);
}

Even though I haven't done any performance measurements, the cost of those proxy methods should be almost unnoticeable... and probably much more lightweight than using mutex synchronization between D3D10 and D3D11 devices!

Plug-in the proxies


In the end, I have managed to put those proxies in a single .h/.cpp with an easy API to plug the proxy in. The call sequence before passing the DXGISurface to Direct2D then looks like this:

d3d11Device->CreateTexture2D(&offlineTextureDesc, 0, &texture2D);

// Create a Proxy DXGISurface from Texture2D compatible with Direct2D
IDXGISurface* surface = Code4kCreateD3D10CompatibleSurface(d3d11Device, d3d11DeviceContext, texture2D);

d2dFactory->CreateDxgiSurfaceRenderTarget(
    surface,
    &props,
    &d2dRenderTarget
    );

And that's all! You will find attached a project with the sources. Feel free to test it and let me know if you encounter any issues with it. Also, the code is far from being 100% safe/robust... it's a quick hack. For example, I have not checked carefully that my proxies behave well with AddRef/Release... but that should be fine.

So far, it seems to work well across the whole Direct2D API... I have even been able to use DirectWrite with Direct2D... over Direct3D 11, without any problem. There is only one issue: PIX won't be able to debug Direct2D over Direct3D 11... because it seems that Direct2D then performs some additional calls (D3D10CreateStateBlock) that are incompatible with the lightweight proxies I have developed... To be fully supported, it would be necessary to implement proxies for all the interfaces returned by ID3D10Device1... but this is such a laborious task that by the time it was done, we could expect to have Direct2D fully working with Direct3D 11 from the DirectX team itself!

Also, from this little experiment, I can safely say that it shouldn't take more than one day for someone on the Direct2D team to patch the existing Direct2D code to use Direct3D 11... as it is much easier to do on the original code than to go down the proxy road as I did! ;)



You can grab the VC++ 2010 project from here : D2D1ToD3D11.7z

This sample simply saves a "test.png" image using the Direct2D API over Direct3D 11.

18 comments:

  1. I will never stop thanking you for this :D
    i absolutely think you are the best around ;)

    By Patrizio Tamorri

  2. ah and i will mention you in my 3d engine ;)

  3. Glad to hear that someone is going to use it. Let me know if you encounter any problem with it.

  4. hi im japanese from japan.
    im writing a book about direct3d10/11.
    i have a question, can i use your code to my sample program of book?
    not all code,i will modify a bit to fit my sample.

  5. This comment has been removed by the author.

  6. Hey,

    I'd like to mention here that the code has some serious memory leak issues. I've been using self-releasing pointers to guarantee the destruction of all my resources, including the proxy IDXGISurface, and am seeing many objects being left alive at the end of execution.

    Although I must say, very grateful for this code. It's been very helpful.

  7. >> I'd like to mention here that the code has some serious memory leak issues. I've been using self-releasing pointers to guarantee the destruction of all my resources, including the proxy IDXGISurface, and am seeing many objects being left alive at the end of execution.

Probably; I didn't check AddRef/Release on objects. This code is more to demonstrate that there is no limitation in Direct2D preventing its use with Direct3D11 (this is mainly directed at Microsoft Direct2D developers, to say to them "come on, give us Direct2D over Direct3D11, there is no limitation preventing it").

    Thus, this code cannot be taken as a real solution to address Direct2D/Direct3D11 interop... but more a proof of concept. But if you are working to make things cleaner on the addref/release area, I would be glad to integrate your changes!

  8. After update KB2454826
    http://support.microsoft.com/kb/2454826
    this code stopped working.

  9. Thanks Sergej for the report! I have updated the sample to make it work again. As I said, this hack should not be considered as failsafe. It was primarily done for fun, and to "motivate" Direct2D team to provide a Direct3D11 version!

  10. This comment has been removed by the author.

  11. Thanks! Now it works again, but main.cpp is missing.

  12. oops, sorry my bad, link is updated with main.cpp.

  13. Thanks Alexandre. Much appreciated.

    Steve Williams

  14. Hi Alexandre,
    I know this post is very old, but I was playing around with your code a little bit. Messing with Direct2D and DirectWrite within D3D11 is a lot of fun. I've been having a strange issue though. It seems even with the sample code you posted, setting a font size less than _exactly_ 100 with either CreateTextFormat(), or SetFontSize() seems to result in an E_NOINTERFACE error originating from around line 158 of D2D1ToD3D11.cpp. It could just be my machine, or maybe some portion of this code doesn't work anymore. Who knows.

    It could be that I'm just using it wrong (I have no idea what I'm doing :) ). Do you have this issue too? Am I using DirectWrite incorrectly?

  15. @Schmidget, difficult to say, as I'm not using this hack, so I can't tell! The real way to use Direct2D with Direct3D11 is unfortunately to go through DXGI shared surface and a D3D10 device.

    But upcoming Direct2D1.1 from DirectX11.1 is now working with Direct3D11 so It should make this hack completely useless.

  16. I found the same problem as Schmidget.. does anyone know a solution?

  17. Its a good Hack. But one big Problem i have.

    I want to pass the texture with the result to a shader.
    But something like:

    static D3D11_TEXTURE2D_DESC offlineTextureDesc =
    {
    width, // width
    height, // height
    1, // MipLevels;
    1, // ArraySize;
    DXGI_FORMAT_R8G8B8A8_UNORM, // Format;
    {1, 0}, // SampleDesc
    D3D11_USAGE_DYNAMIC, // Usage
    D3D11_BIND_SHADER_RESOURCE | D3D11_BIND_RENDER_TARGET, // BindFlags;
    D3D11_CPU_ACCESS_WRITE, //CPUAccessFlags;
    (D3D11_RESOURCE_MISC_FLAG)0 //MiscFlags;
    };

    not work.

    The device->CreateTexture2D(...)-Method crashs with D3D11_BIND_RENDER_TARGET on.

    Is there a solution to get a valid texture to pass it to a shader for mapping it on a 3D-Object, for instance?

  18. Disclaimer about this hack: This hack was nothing more than a proof of concept and I *really* don't have time to dig into any kind of bugs related to it.

    The official way to use Direct2D with Direct3D11 is to go through DXGI shared surface and a D3D10 device.

