We're designing an HDMI capture device with a 1080i stream.
The idea is that the capture hardware writes raw NV12 frames by DMA over PCIe directly into the MSDK encoder's frame buffer (a DXVA surface).
I would like to ask: does the MSDK hardware implementation support that kind of input? (It seems more efficient, since no colorspace conversion is needed.)
Any advice appreciated.
Nina Kurina (Intel) wrote: Your reply helps a lot, thank you!
Your solution is good as long as "DXVA buffer" is a Direct3D9 or a Direct3D11 (DX11.1 on Windows 8) surface - that's what Media SDK Encoder supports at input.
Please also note that you have to provide Media SDK with the D3D device used to create the D3D buffers, plus an external frame allocator, so that HW-accelerated encoding can be performed on those buffers. I recommend checking out the sections "Working with Microsoft* DirectX* Applications" and "Memory Allocation and External Allocators" in mediasdk_man.pdf for more details.
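The wiring described above can be sketched roughly as follows. This is an assumption-laden outline, not a complete program: it presumes the Media SDK headers (`mfxvideo.h`), a D3D9 device manager you have already created, and an `mfxFrameAllocator` whose callbacks hand out your capture surfaces; the function name `setup_d3d_for_encode` is mine, and error handling is trimmed.

```cpp
// Sketch only: attach the D3D device and an external allocator to a Media SDK
// session, per "Working with Microsoft* DirectX* Applications" in mediasdk_man.pdf.
// Requires the Intel Media SDK headers/libs and the DXVA2 headers to build.
#include <dxva2api.h>   // IDirect3DDeviceManager9
#include <mfxvideo.h>   // Media SDK core API

mfxStatus setup_d3d_for_encode(mfxSession session,
                               IDirect3DDeviceManager9 *d3d_mgr,  // manager of the device owning your capture surfaces
                               mfxFrameAllocator *allocator)      // your callbacks mapping mfxMemId -> surface
{
    // 1. Hand Media SDK the same D3D device the capture surfaces live on.
    mfxStatus sts = MFXVideoCORE_SetHandle(session,
                                           MFX_HANDLE_D3D9_DEVICE_MANAGER,
                                           d3d_mgr);
    if (sts != MFX_ERR_NONE)
        return sts;

    // 2. Register the external allocator so ENCODE resolves surfaces through
    //    your GetHDL/Lock/Unlock callbacks instead of allocating its own.
    return MFXVideoCORE_SetFrameAllocator(session, allocator);
}
```

With this in place the encoder would also need `IOPattern = MFX_IOPATTERN_IN_VIDEO_MEMORY` in its `mfxVideoParam` so it expects video-memory input.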
camkego wrote: Thanks for your advice!
You can also use system memory for this, and the programming model is simpler to get going than Direct3D,
but you will pay a performance penalty.
In my experience, encoding 1080p from system memory was about 78% as fast as using Direct3D surfaces.
So, you have a couple options.
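For the simpler system-memory option mentioned above, you point the input surface's plane pointers into the captured NV12 buffer. A minimal sketch of that arithmetic (the struct is a hypothetical stand-in echoing the `Y`, `UV`, and `Pitch` fields of Media SDK's `mfxFrameData`, so the example builds without the SDK headers):

```cpp
#include <cstddef>
#include <cstdint>

// Plane pointers for an NV12 frame captured into system memory.
// In Media SDK terms these would populate mfxFrameData::Y, ::UV and ::Pitch.
struct Nv12Planes {
    std::uint8_t *y;     // W x H luma samples
    std::uint8_t *uv;    // W x H/2 interleaved CbCr samples
    std::uint16_t pitch; // bytes per row (may exceed width due to alignment)
};

Nv12Planes nv12_planes(std::uint8_t *base, std::uint16_t pitch, std::uint16_t height) {
    // The UV plane starts immediately after the Y plane: pitch * height bytes in.
    return { base, base + static_cast<std::size_t>(pitch) * height, pitch };
}
```

The encoder would then be initialized with `IOPattern = MFX_IOPATTERN_IN_SYSTEM_MEMORY`, and Media SDK copies each frame to video memory internally, which is where the performance penalty comes from.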
Hello, Joyah and Nina Kurina (Intel). I'm also designing an HDMI capture card, and right now I can capture raw NV12 to system memory. To increase performance, I'd like to capture directly to video memory. I'm working on an Intel Core i5, and my minidriver is based on AVStream.
To do this, the WDK documentation says: "Obtain the adapter GUID from the vendor-supplied graphics miniport driver. The DXGK_INTERFACESPECIFICDATA structure contains the adapter GUID to return in the property request. This structure is generated by the DirectX graphics kernel (DXGK) subsystem and is passed to the miniport driver when the adapter is initialized."
But I'm still puzzled about the specific steps to get that display adapter's GUID. Could you explain them to me?
Thanks very much!