Media (Intel® Video Processing Library, Intel Media SDK)
Access community support for transcoding, decoding, and encoding in applications using media tools like the Intel® oneAPI Video Processing Library and the Intel® Media SDK.
Announcements
The Intel Media SDK project is no longer active. For continued support and access to new features, Intel Media SDK users are encouraged to read the transition guide on upgrading from Intel® Media SDK to Intel® Video Processing Library (VPL), and to move to VPL as soon as possible.
For more information, see the VPL website.

Asynchronous MFT is not sending METransformHaveOutput event (Intel Hardware MJPEG Decoder MFT)

abi_k_
Beginner

Hi,

I'm developing a USB camera streaming desktop application using the Media Foundation SourceReader technique. The camera supports USB 3.0 and delivers 60 fps at 1080p in the MJPG video format.

I used the software MJPEG Decoder MFT to convert MJPG into YUY2 frames, then converted those into RGB32 frames to draw on the window. With this software decoder I can render only 30 fps on the window instead of 60 fps. I posted a question on this site earlier and got a suggestion to use the Intel Hardware MJPEG Decoder MFT to solve the frame-drop issue.

To use this hardware MJPEG decoder, I followed the asynchronous MFT processing model and configured an asynchronous callback for IMFMediaEventGenerator, obtained through the IMFTransform interface.
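My callback class is shaped roughly like this (a trimmed sketch for context; only the members that appear in the snippets below are shown, and the IUnknown plumbing is omitted):

	class CMFSourceReader : public IMFSourceReaderCallback, public IMFAsyncCallback
	{
	public:
		// IMFSourceReaderCallback: camera frames arrive here from the Source Reader
		STDMETHODIMP OnReadSample(HRESULT hrStatus, DWORD dwStreamIndex, DWORD dwStreamFlags, LONGLONG llTimestamp, IMFSample *pSample);
		STDMETHODIMP OnEvent(DWORD dwStreamIndex, IMFMediaEvent *pEvent);
		STDMETHODIMP OnFlush(DWORD dwStreamIndex);

		// IMFAsyncCallback: MFT events arrive here from the event generator
		STDMETHODIMP GetParameters(DWORD *pdwFlags, DWORD *pdwQueue);
		STDMETHODIMP Invoke(IMFAsyncResult *pResult);

	private:
		IMFSourceReader        *m_pReader;          // Source Reader for the camera
		IMFTransform           *m_pTransform;       // Intel Hardware MJPEG Decoder MFT
		IMFMediaEventGenerator *m_pEventGenerator;  // event interface of the async MFT
		HANDLE                  m_hNeedInputEvent;  // set on METransformNeedInput
		HANDLE                  m_hHaveOutputEvent; // set on METransformHaveOutput
		DWORD                   m_dwWaitObj;
		CRITICAL_SECTION        m_critsec;
	};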

After calling ProcessMessage with MFT_MESSAGE_NOTIFY_START_OF_STREAM, I received the METransformNeedInput event twice, but I never received the METransformHaveOutput event from the MFT.

I have shared my code here for your reference:

	IMFTransform* m_pTransform = NULL;

	HRESULT EnumDecoderMFT ()
	{
		HRESULT hr;
		IMFActivate** ppActivate = NULL;
		IMFAttributes* pAttributes = NULL;
		UINT32 numDecodersMJPG = 0;
		LPWSTR lpMFTName = NULL;
		
		MFT_REGISTER_TYPE_INFO inputFilter = {MFMediaType_Video,MFVideoFormat_MJPG};
		MFT_REGISTER_TYPE_INFO outputFilter = {MFMediaType_Video,MFVideoFormat_YUY2};

		UINT32 unFlags = MFT_ENUM_FLAG_SYNCMFT | MFT_ENUM_FLAG_ASYNCMFT | MFT_ENUM_FLAG_LOCALMFT | MFT_ENUM_FLAG_HARDWARE | MFT_ENUM_FLAG_SORTANDFILTER;

		hr = MFTEnumEx(MFT_CATEGORY_VIDEO_DECODER, unFlags, &inputFilter, &outputFilter, &ppActivate, &numDecodersMJPG);
		if (FAILED(hr)) return hr; 
		
		hr = ppActivate[0]->GetAllocatedString(MFT_FRIENDLY_NAME_Attribute, &lpMFTName, 0);
		if (FAILED(hr)) return hr;
		CoTaskMemFree(lpMFTName);	// friendly name is not used further; free the allocated string

		// Activate transform
		hr = ppActivate[0]->ActivateObject(__uuidof(IMFTransform), (void**)&m_pTransform);
		if (FAILED(hr)) return hr;
		
		hr = m_pTransform->GetAttributes(&pAttributes);
		if (SUCCEEDED(hr))
		{
			hr = pAttributes->SetUINT32(MF_TRANSFORM_ASYNC_UNLOCK, TRUE);			
			if(FAILED(hr)) return hr;

			hr = pAttributes->SetUINT32(MFT_SUPPORT_DYNAMIC_FORMAT_CHANGE,TRUE);		
			if(FAILED(hr)) return hr;

			hr = pAttributes->SetUINT32(MF_READWRITE_ENABLE_HARDWARE_TRANSFORMS, TRUE);
			if(FAILED(hr)) return hr;

			hr = m_pTransform->QueryInterface(IID_IMFMediaEventGenerator,(void**)&m_pEventGenerator);			
			if(FAILED(hr)) return hr;

			hr = m_pEventGenerator->BeginGetEvent((IMFAsyncCallback*)this,NULL);
			if(FAILED(hr)) return hr;

			pAttributes->Release();
		}

		SafeRelease(&ppActivate[0]);

		CoTaskMemFree(ppActivate);

		return hr;		
	}

	HRESULT Invoke(IMFAsyncResult *pResult)
	{
		HRESULT hr = S_OK,hrStatus;
		MediaEventType meType = MEUnknown;  // Event type
		IMFMediaEvent *pEvent = NULL;

		// Get the event from the event queue.
		hr = m_pEventGenerator->EndGetEvent(pResult, &pEvent);		//Completes an asynchronous request for the next event in the queue.
		if(FAILED(hr)) return hr;

		// Get the event type. 
		hr = pEvent->GetType(&meType);
		if(FAILED(hr)) return hr;

		hr = pEvent->GetStatus(&hrStatus);
		if(FAILED(hr)) return hr;
	   
		if(SUCCEEDED(hrStatus))
		{
			if(meType == METransformNeedInput)
			{		
				SetEvent(m_hNeedInputEvent);
			}
			else if(meType == METransformHaveOutput)
			{			
				SetEvent(m_hHaveOutputEvent);
			}
			else if(meType == METransformDrainComplete)
			{
				hr = m_pTransform->ProcessMessage(MFT_MESSAGE_COMMAND_FLUSH,0);
				if(FAILED(hr)) return hr;
			}
			else if(meType == MEError)
			{
				PROPVARIANT pValue;
				hr = pEvent->GetValue(&pValue);			
				if(FAILED(hr)) return hr;	
			}

			hr = m_pEventGenerator->BeginGetEvent((IMFAsyncCallback*)this,NULL);	
			if(FAILED(hr)) return hr;
		}

		SafeRelease(&pEvent);
		return S_OK;
	}

	HRESULT CMFSourceReader::OnReadSample(
		HRESULT hrStatus,
		DWORD  dwStreamIndex ,
		DWORD  dwStreamFlags ,
		LONGLONG  llTimestamp ,
		IMFSample *pSample      // Can be NULL
		)
	{
		HRESULT hr = S_OK;
		IMFMediaBuffer *pBuffer = NULL;
		DWORD dwcbTotLen = 0;			
		IMFSample *mftOutSample = NULL;

		EnterCriticalSection(&m_critsec);

		if (FAILED(hrStatus))
		{
			hr = hrStatus;
		}

		if (SUCCEEDED(hr))
		{
			if (pSample != NULL)
			{
				if(dwStreamIndex == 0)		//VideoStream
				{					
					if(m_pTransform)
					{	
						hr = m_pTransform->ProcessMessage(MFT_MESSAGE_NOTIFY_START_OF_STREAM, 0);
						if(FAILED(hr))	return hr;

						m_dwWaitObj = WaitForSingleObject(m_hNeedInputEvent,INFINITE);
						if(m_dwWaitObj == WAIT_OBJECT_0)
						{							
							hr = ProcessInputSample(pSample);
							if(FAILED(hr))	return hr;
						}

						m_dwWaitObj = WaitForSingleObject(m_hHaveOutputEvent,INFINITE);
						if(m_dwWaitObj == WAIT_OBJECT_0)
						{
							hr = ProcessOutputSample(&mftOutSample);
							if(FAILED(hr))	return hr;
						}
					}
				}
			}
		}

		if(SUCCEEDED(hr))
		{
			if(m_pReader != NULL)
			{
				hr = m_pReader->ReadSample(
					(DWORD)MF_SOURCE_READER_FIRST_VIDEO_STREAM,
					0,
					NULL,   // actual
					NULL,   // flags
					NULL,   // timestamp
					NULL    // sample
					);
				if(FAILED(hr)) return hr;
			}
		}
				
		SafeRelease(&mftOutSample);

		LeaveCriticalSection(&m_critsec);
		return hr; 
	}

	HRESULT ProcessOutputSample(IMFSample **pOutSample)
	{
		HRESULT hr = S_OK;
		MFT_OUTPUT_DATA_BUFFER outputDataBuffer;
		DWORD processOutputStatus = 0,mftOutFlags = 0;
		MFT_OUTPUT_STREAM_INFO StreamInfo;
		IMFSample *mftOutSample = NULL;
		IMFMediaBuffer *pOutBuffer = NULL;

		if(m_pTransform != NULL)
		{	
			hr = m_pTransform->GetOutputStreamInfo(0, &StreamInfo);
			if(FAILED(hr)) return hr;

			DWORD status = 0;
			hr = m_pTransform->GetOutputStatus(&status);
			if (FAILED(hr)) return hr;

			hr = MFCreateSample(&mftOutSample);
			if(FAILED(hr)) return hr;

			hr = MFCreateMemoryBuffer(StreamInfo.cbSize, &pOutBuffer);
			if(FAILED(hr)) return hr;

			hr = mftOutSample->AddBuffer(pOutBuffer);
			if(FAILED(hr)) return hr;

			outputDataBuffer.dwStreamID = 0;
			outputDataBuffer.dwStatus = 0;
			outputDataBuffer.pEvents = NULL;
			outputDataBuffer.pSample = mftOutSample;

			hr = m_pTransform->ProcessOutput(0, 1, &outputDataBuffer, &processOutputStatus);			
			if(FAILED(hr)) return hr;

			hr = m_pTransform->ProcessMessage(MFT_MESSAGE_NOTIFY_END_OF_STREAM, 0);
			if (FAILED(hr))  return hr;

			hr = m_pTransform->ProcessMessage(MFT_MESSAGE_COMMAND_DRAIN, 0);
			if (FAILED(hr))  return hr;

			if(mftOutSample)
			{
				*pOutSample = mftOutSample;
				(*pOutSample)->AddRef();
			}

			ResetEvent(m_hHaveOutputEvent);
		}

		SafeRelease(&mftOutSample);
		SafeRelease(&pOutBuffer);

		return hr;
	}

	HRESULT ProcessInputSample(IMFSample *pInputSample)
	{
		HRESULT hr = S_OK;

		if(m_pTransform != NULL)
		{				
			hr = m_pTransform->ProcessInput(0, pInputSample, 0);
			if(FAILED(hr)) return hr;

			hr = m_pTransform->ProcessMessage(MFT_MESSAGE_NOTIFY_END_OF_STREAM,0);
			if(FAILED(hr)) return hr;

			ResetEvent(m_hNeedInputEvent);
		}

		return hr;
	}

I also commented out the ProcessOutputSample() call in my code and checked: the MFT then sends the METransformNeedInput event continuously. When I call ProcessOutput right after ProcessInput, it returns an E_UNEXPECTED error. I read about this error on MSDN; it says I should not call IMFTransform::ProcessOutput without first receiving the METransformHaveOutput event.
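For clarity, this is my understanding of the event-driven flow the MSDN asynchronous MFT documentation describes (a rough sketch only, not my actual code; pInputSample, outputDataBuffer and dwStatus stand in for the real handler's variables, and error handling is omitted):

	// Inside IMFAsyncCallback::Invoke, after EndGetEvent/GetType:
	switch (meType)
	{
	case METransformNeedInput:
		// Feed exactly one input sample per METransformNeedInput event.
		hr = m_pTransform->ProcessInput(0, pInputSample, 0);
		// MFT_MESSAGE_NOTIFY_END_OF_STREAM is sent only once, when the source
		// has no more samples, not after every input sample.
		break;

	case METransformHaveOutput:
		// Call ProcessOutput only in response to this event; calling it at any
		// other time is what makes an async MFT return E_UNEXPECTED.
		hr = m_pTransform->ProcessOutput(0, 1, &outputDataBuffer, &dwStatus);
		break;

	case METransformDrainComplete:
		// Sent after MFT_MESSAGE_COMMAND_DRAIN once all queued input has been emitted.
		break;
	}

	// Re-arm the callback so the next event is delivered.
	hr = m_pEventGenerator->BeginGetEvent(this, NULL);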

Am I missing anything? Can I use the Intel Hardware MJPEG Decoder MFT this way inside Media Foundation? Please guide me on using this decoder. I have been struggling with this issue for the past four days.

Thanks in advance.

Mark_L_Intel1
Moderator

Hi abi,

The MFT is a framework that integrates directly with the Intel video driver layer. The calls you make to the MFT do not go through the Media SDK framework.

Sorry to say, but you will have to refer to Microsoft's MFT support for the full solution.

Mark

abi_k_
Beginner

Hi Liu,

Thank you for your reply. I posted the same query on the Media Foundation MSDN forum last week but didn't get any reply. I referred to this page, https://software.intel.com/en-us/node/700738, to access the MFT; there you suggest that your sample_decode app uses it. I downloaded the Intel Media SDK samples from GitHub and tried to build the source code, but I'm getting some compiler errors. The Intel Media SDK is already installed on my machine, yet I couldn't find the dependency header/lib files in the installation folder.

I have a few queries to clear my doubts:

1) Are there any Intel APIs available to use the Intel Hardware MJPEG Decoder MFT? If yes, please point me to the source and guidance for using it in my app.

2) Which Intel versions support the Hardware MJPEG Decoder MFT?

3) Should I follow the Microsoft MFT procedure to use your decoder? Is that the only way to use it?

Please suggest some ideas. Thank you once again.

K__RAMESH
Beginner

Hi,

I'm also facing the same issue with Intel encoders, whereas my program works perfectly fine with NVIDIA graphics.

Mark_L_Intel1
Moderator

Hi Ramesh,

What is your application?

There are basically two paths to the Intel HW accelerator: MFT or Media SDK.

For codec-level support, you can check our documentation in the Media SDK release notes; it is for Linux, but it should apply to Windows too.
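As a rough illustration of the Media SDK path, a hardware MJPEG decode session is set up along these lines (a minimal sketch only, with surface allocation and the per-frame decode loop omitted; the complete flow is in the sample_decode sample):

	#include "mfxvideo++.h"

	mfxStatus DecodeMjpegSketch(mfxBitstream *pBitstream)
	{
		mfxVersion ver = { {0, 1} };
		MFXVideoSession session;
		mfxStatus sts = session.Init(MFX_IMPL_HARDWARE_ANY, &ver);   // open a hardware session
		if (sts != MFX_ERR_NONE) return sts;

		MFXVideoDECODE decoder(session);

		mfxVideoParam par = {};
		par.mfx.CodecId = MFX_CODEC_JPEG;                            // MJPEG elementary stream
		par.IOPattern   = MFX_IOPATTERN_OUT_SYSTEM_MEMORY;

		// Parse the JPEG headers from the bitstream to fill in frame parameters.
		sts = decoder.DecodeHeader(pBitstream, &par);
		if (sts != MFX_ERR_NONE) return sts;

		sts = decoder.Init(&par);
		// ... allocate frame surfaces, then call DecodeFrameAsync/SyncOperation per frame ...
		return sts;
	}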

Let me know if you have more questions.

Mark Liu
