Media (Intel® oneAPI Video Processing Library, Intel Media SDK)
Get community support for transcoding, decoding, and encoding in applications using media tools from Intel, including the Intel® oneAPI Video Processing Library and the Intel® Media SDK.

Intel® Quick Sync Video H.264 Encoder MFT MFT_OUTPUT_DATA_BUFFER_NO_SAMPLE

bogdan_d_
Beginner

Hi,

I'm having problems running the Intel® Quick Sync Video H.264 Encoder MFT. The setup, ProcessInput, and ProcessOutput calls all return S_OK, but the sample from ProcessOutput is NULL and the status is MFT_OUTPUT_DATA_BUFFER_NO_SAMPLE. Any idea what could be wrong?

This is the output:

Intel® Quick Sync Video H.264 Encoder MFT
Start encoding
ProcessInput frame 1
ProcessInput frame 2
ProcessInput frame 3
ProcessInput frame 4
ProcessOutput MFT_OUTPUT_DATA_BUFFER_NO_SAMPLE


from the following code (ignore any leaks that are unrelated to my issue): 

void main()
{
    HRESULT hr = S_OK;

    hr = CoInitializeEx(NULL, COINIT_APARTMENTTHREADED);
    if (FAILED(hr)) {
        OutputDebugString(L"CoInitializeEx failed\n");
        return;
    }

    hr = MFStartup(MF_VERSION);
    if (FAILED(hr)) {
        OutputDebugString(L"MFStartup failed\n");
        return;
    }

    IMFActivate **devices = NULL;
    IMFTransform *encoder = NULL;
    IMFMediaEventGenerator *eventGenerator = NULL;
    GUID inFormat = MFVideoFormat_NV12;
    GUID outFormat = MFVideoFormat_H264;
    MFT_REGISTER_TYPE_INFO inputInfo = { MFMediaType_Video, inFormat };
    MFT_REGISTER_TYPE_INFO outputInfo = { MFMediaType_Video, outFormat };
    uint32_t flags = MFT_ENUM_FLAG_HARDWARE;
    uint32_t count;
    uint32_t fps = 25;
    uint32_t width = 1280;
    uint32_t height = 720;

    IMFAttributes *attributes;
    uint32_t asynch;
    uint32_t len = 0;
    MFCalculateImageSize(inFormat, width, height, &len);

    hr = MFTEnumEx(MFT_CATEGORY_VIDEO_ENCODER, flags, &inputInfo, &outputInfo, &devices, &count);
    if (FAILED(hr)) {
        OutputDebugString(L"MFTEnumEx failed\n");
        return;
    }

    if (count == 0) {
        OutputDebugString(L"No devices\n");
        return;
    }

    LPWSTR szFriendlyName = NULL;
    hr = devices[0]->GetAllocatedString(MFT_FRIENDLY_NAME_Attribute, &szFriendlyName, NULL);
    OutputDebugString(szFriendlyName);
    OutputDebugString(L"\n");

    hr = devices[0]->ActivateObject(IID_PPV_ARGS(&encoder));
    if (FAILED(hr)) {
        OutputDebugString(L"ActivateObject failed\n");
        return;
    }

    for (size_t idx = 0; idx < count; ++idx) {
        devices[idx]->Release();
    }
    CoTaskMemFree(devices);

    encoder->QueryInterface(&eventGenerator);

    hr = encoder->GetAttributes(&attributes);
    if (FAILED(hr)) {
        return;
    }

    hr = attributes->GetUINT32(MF_TRANSFORM_ASYNC, &asynch);
    hr = attributes->SetUINT32(MF_TRANSFORM_ASYNC_UNLOCK, TRUE);
    attributes->Release();
    if (FAILED(hr)) {
        return;
    }

    DWORD inputCount = 0, outputCount = 0, inputId = 0, outputId = 0;
    hr = encoder->GetStreamCount(&inputCount, &outputCount);

    if (FAILED(hr)) {
        return;
    }

    if (inputCount == 0 || outputCount == 0) {
        return;
    }

    DWORD *inputIds = new DWORD[inputCount];
    DWORD *outputIds = new DWORD[outputCount];
    hr = encoder->GetStreamIDs(inputCount, inputIds, outputCount, outputIds);

    auto cleanStreamIds = [&inputIds, &outputIds] {
        delete[] inputIds;
        inputIds = NULL;

        delete[] outputIds;
        outputIds = NULL;
    };

    if (FAILED(hr)) {
        if (hr != E_NOTIMPL) {
            OutputDebugString(L"GetStreamIDs failed\n");
            cleanStreamIds();
            return;
        }
    }
    else {
        inputId = inputIds[0];
        outputId = outputIds[0];
    }

    cleanStreamIds();

    IMFMediaType *outputType = NULL;
    hr = MFCreateMediaType(&outputType);
    if (FAILED(hr)) {
        OutputDebugString(L"MFCreateMediaType outputType failed\n");
        return;
    }

    hr = outputType->SetGUID(MF_MT_MAJOR_TYPE, MFMediaType_Video);
    hr = outputType->SetGUID(MF_MT_SUBTYPE, outFormat);
    hr = outputType->SetUINT32(MF_MT_AVG_BITRATE, 10000000);
    hr = MFSetAttributeSize(outputType, MF_MT_FRAME_SIZE, width, height);
    hr = MFSetAttributeRatio(outputType, MF_MT_FRAME_RATE, fps, 1);
    hr = outputType->SetUINT32(MF_MT_INTERLACE_MODE, MFVideoInterlace_Progressive);
    hr = outputType->SetUINT32(MF_MT_MPEG2_PROFILE, eAVEncH264VProfile_High);
    hr = outputType->SetUINT32(MF_MT_MPEG2_LEVEL, eAVEncH264VLevel3_1);
    hr = outputType->SetUINT32(CODECAPI_AVEncCommonRateControlMode, eAVEncCommonRateControlMode_CBR);
    hr = encoder->SetOutputType(outputId, outputType, 0);
    if (FAILED(hr)) {
        OutputDebugString(L"SetOutputType failed\n");
        return;
    }

    IMFMediaType *inputType = NULL;
    hr = MFCreateMediaType(&inputType);
    if (FAILED(hr)) {
        OutputDebugString(L"MFCreateMediaType inputType failed\n");
        return;
    }

    hr = inputType->SetGUID(MF_MT_MAJOR_TYPE, MFMediaType_Video);
    hr = inputType->SetGUID(MF_MT_SUBTYPE, inFormat);
    hr = inputType->SetUINT32(MF_MT_INTERLACE_MODE, MFVideoInterlace_Progressive);
    hr = MFSetAttributeSize(inputType, MF_MT_FRAME_SIZE, width, height);
    hr = MFSetAttributeRatio(inputType, MF_MT_FRAME_RATE, fps, 1);
    hr = MFSetAttributeRatio(inputType, MF_MT_PIXEL_ASPECT_RATIO, 1, 1);
    hr = encoder->SetInputType(inputId, inputType, 0);
    if (FAILED(hr)) {
        OutputDebugString(L"SetInputType failed\n");
        return;
    }

    hr = encoder->ProcessMessage(MFT_MESSAGE_COMMAND_FLUSH, 0);
    hr = encoder->ProcessMessage(MFT_MESSAGE_NOTIFY_BEGIN_STREAMING, 0);
    hr = encoder->ProcessMessage(MFT_MESSAGE_NOTIFY_START_OF_STREAM, 0);

    OutputDebugString(L"Start encoding\n");

    uint32_t frame = 1;
    IMFMediaEvent *event = NULL;
    MediaEventType type;
    do
    {
        hr = eventGenerator->GetEvent(0, &event);
        hr = event->GetType(&type);

        switch (type)
        {
        case METransformNeedInput: {
            IMFSample *newSample = NULL;
            IMFMediaBuffer *buffer;
            hr = MFCreateMemoryBuffer(len, &buffer);

            byte *bufferData;
            hr = buffer->Lock(&bufferData, NULL, NULL);

            memset(bufferData, 0, len);

            hr = buffer->Unlock();
            hr = buffer->SetCurrentLength(len);

            hr = MFCreateSample(&newSample);
            hr = newSample->AddBuffer(buffer);

            uint64_t sampleDuration = 0;
            MFFrameRateToAverageTimePerFrame(fps, 1, &sampleDuration);
            newSample->SetSampleDuration(sampleDuration);

            hr = encoder->ProcessInput(0, newSample, 0);
            if (FAILED(hr)) {
                OutputDebugString(L"ProcessInput failed\n");
                return;
            }
            else
            {
                WCHAR outMessage[128];
                memset(outMessage, 0, 128 * sizeof(WCHAR));
                wsprintf(outMessage, L"ProcessInput frame %u\n", frame);
                OutputDebugString(outMessage);
                ++frame;
            }
            break;
        }

        case METransformHaveOutput: {
            MFT_OUTPUT_DATA_BUFFER data = { outputId, NULL, 0, NULL };

            MFT_OUTPUT_STREAM_INFO outInfo;
            hr = encoder->GetOutputStreamInfo(outputId, &outInfo);
            if (!(outInfo.dwFlags & MFT_OUTPUT_STREAM_PROVIDES_SAMPLES)) {
                OutputDebugString(L"!MFT_OUTPUT_STREAM_PROVIDES_SAMPLES not handled\n");
                return;
            }

            DWORD status = 0;
            hr = encoder->ProcessOutput(MFT_OUTPUT_STREAM_PROVIDES_SAMPLES, 1, &data, &status);
            if (FAILED(hr)) {
                OutputDebugString(L"ProcessOutput failed\n");
                return;
            }

            if (data.dwStatus == MFT_OUTPUT_DATA_BUFFER_NO_SAMPLE) {
                OutputDebugString(L"ProcessOutput MFT_OUTPUT_DATA_BUFFER_NO_SAMPLE\n");
                return;
            }

            break;
        }

        default:
            break;
        }

    } while (true);
}

Also, here is the output from mediasdk_ss_analyzer:

The following versions of Media SDK API are supported by platform/driver
[opportunistic detection of MSDK API > 1.16]:

        Version Target  Supported       Dec     Enc
        1.0     HW      Yes             X       X
        1.0     SW      Yes             X       X
        1.1     HW      Yes             X       X
        1.1     SW      Yes             X       X
        1.2     HW      Yes             X       X
        1.2     SW      Yes             X       X
        1.3     HW      Yes             X       X
        1.3     SW      Yes             X       X
        1.4     HW      Yes             X       X
        1.4     SW      Yes             X       X
        1.5     HW      Yes             X       X
        1.5     SW      Yes             X       X
        1.6     HW      Yes             X       X
        1.6     SW      Yes             X       X
        1.7     HW      Yes             X       X
        1.7     SW      Yes             X       X
        1.8     HW      Yes             X       X
        1.8     SW      Yes             X       X
        1.9     HW      Yes             X       X
        1.9     SW      Yes             X       X
        1.10    HW      Yes             X       X
        1.10    SW      Yes             X       X
        1.11    HW      Yes             X       X
        1.11    SW      Yes             X       X
        1.12    HW      Yes             X       X
        1.12    SW      Yes             X       X
        1.13    HW      Yes             X       X
        1.13    SW      Yes             X       X
        1.14    HW      Yes             X       X
        1.14    SW      Yes             X       X
        1.15    HW      Yes             X       X
        1.15    SW      Yes             X       X
        1.16    HW      Yes             X       X
        1.16    SW      Yes             X       X
        1.17    HW      Yes             X       X
        1.17    SW      Yes             X       X

Graphics Devices:
        Name                                         Version             State
        Intel(R) HD Graphics                         20.19.15.4320       Active

System info:
        CPU:    Intel(R) Pentium(R) CPU  N3700  @ 1.60GHz
        OS:     Microsoft Windows 10 Home Single Language
        Arch:   64-bit

Installed Media SDK packages (be patient...processing takes some time):
        Intel(R) Media Samples 6.0.0.68
        Intel(R) Media SDK 2016

Analysis complete... [press ENTER]

Thanks

6 Replies
Harshdeep_B_Intel

Hi Bogdan,

Thank you for sharing the detailed log of your system. Based on our investigation, the "MFT_OUTPUT_DATA_BUFFER_NO_SAMPLE" status is not specific to the hardware MFT (Intel® Quick Sync Video H.264 Encoder MFT); it simply means the MFT has no output sample ready for that stream.

Per the Microsoft documentation (https://technet.microsoft.com/es-es/ms704014): "If the MFT has multiple output streams, the streams might produce output at different rates, so some streams might have output while other streams do not. If a stream did not produce any output, the MFT sets the MFT_OUTPUT_DATA_BUFFER_NO_SAMPLE flag in the dwStatus member of the MFT_OUTPUT_DATA_BUFFER structure for that stream." I suggest checking with the Microsoft support team on this issue. Hope this information helps.
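In code, that means checking dwStatus on each MFT_OUTPUT_DATA_BUFFER after ProcessOutput rather than treating S_OK alone as a guarantee of output. A minimal, untested sketch against the `encoder` and `outputId` names from the code above, assuming (as the original code does) that the MFT allocates its own output samples:

```cpp
// Sketch: after ProcessOutput succeeds, each output stream's buffer must be
// checked individually, because streams can produce output at different rates.
// If the MFT does NOT set MFT_OUTPUT_STREAM_PROVIDES_SAMPLES, pSample must be
// pre-allocated by the caller instead of passed as NULL here.
MFT_OUTPUT_DATA_BUFFER data = { outputId, NULL, 0, NULL };
DWORD status = 0;
HRESULT hr = encoder->ProcessOutput(0, 1, &data, &status);
if (SUCCEEDED(hr)) {
    if (data.dwStatus & MFT_OUTPUT_DATA_BUFFER_NO_SAMPLE) {
        // This stream produced nothing on this call; try again later.
    } else if (data.pSample) {
        // Consume the encoded sample here, then release it.
        data.pSample->Release();
    }
    if (data.pEvents) {
        data.pEvents->Release();
    }
}
```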

Thanks, 


bogdan_d_
Beginner

Hi Harsh,

Thank you for your answer.

The MFT has only one output stream (and only one input stream).

Also, running the same code with an NVIDIA hardware MFT, I have no issues getting encoded frames (below is its output):

NVIDIA H.264 Encoder MFT
Start encoding
ProcessInput frame 1
ProcessOutput Got output
ProcessInput frame 2
ProcessOutput Got output
ProcessInput frame 3

.....


Thanks,

Bogdan


Harshdeep_B_Intel

Hi Bogdan,

Thank you for sharing this information. If the MFT has only one output stream, then further investigation is needed. Can you please share a log from mediasdk_tracer (https://software.intel.com/en-us/articles/media-sdk-tools)? I have also shared a tool via private message. Please share the logs from both tools, as they will give us more visibility into this issue.

Thanks, 

bogdan_d_
Beginner

Hi Harsh,

Attached are the logs from mediasdk_tracer.

Please note that I didn't get any private messages from you, so I cannot provide logs from the tool you mentioned.


Thanks,

Bogdan

Harshdeep_B_Intel

Hi Bogdan,

Thank you for the log. Please check your inbox for Intel forum messages; I have re-sent the message. Also, after a brief look at the code above, I suspect the issue is caused by the MFT_OUTPUT_STREAM_PROVIDES_SAMPLES flag, as it is not supported by the Intel H.264 Encoder HMFT. Please try removing this flag to isolate the problem, and let me know how it goes.
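For reference, here is a hedged, untested sketch of the caller-allocated output path (no stream-info flag passed to ProcessOutput, and the app provides the sample, sized from GetOutputStreamInfo). It reuses the `encoder`, `outputId`, and `len` names from Bogdan's code above:

```cpp
// Caller-allocated output path: the app provides the IMFSample because the
// encoder does not set MFT_OUTPUT_STREAM_PROVIDES_SAMPLES.
MFT_OUTPUT_STREAM_INFO outInfo = {};
HRESULT hr = encoder->GetOutputStreamInfo(outputId, &outInfo);

IMFSample *outSample = NULL;
IMFMediaBuffer *outBuffer = NULL;
hr = MFCreateSample(&outSample);
// Use the MFT's reported minimum buffer size; fall back to the raw frame size.
hr = MFCreateMemoryBuffer(outInfo.cbSize ? outInfo.cbSize : len, &outBuffer);
hr = outSample->AddBuffer(outBuffer);

MFT_OUTPUT_DATA_BUFFER data = { outputId, outSample, 0, NULL };
DWORD status = 0;
hr = encoder->ProcessOutput(0, 1, &data, &status); // dwFlags = 0; no stream-info flag here
if (SUCCEEDED(hr) && !(data.dwStatus & MFT_OUTPUT_DATA_BUFFER_NO_SAMPLE)) {
    // Encoded bytes are now in outSample's buffer.
}
outBuffer->Release();
outSample->Release();
```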

Thanks,

Tammo_H_
Beginner

EDIT: Never mind. In my case it was a problem with the encoder wanting to renegotiate the output format and my code miscounting the METransformHaveOutput callbacks.

Hey,

I have the exact same problem, and it's clearly not related to the MFT_OUTPUT_STREAM_PROVIDES_SAMPLES flag: providing a sample in the MFT_OUTPUT_DATA_BUFFER structure doesn't help either.

My code is completely independent of Bogdan's, and it works flawlessly on NVIDIA/AMD GPUs and with Microsoft's software encoder MFT, so there's definitely something fishy going on here.
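For anyone hitting the renegotiation case mentioned in the EDIT above, a minimal, untested sketch of handling an output-format change request from ProcessOutput, assuming `encoder` and `outputId` as in the code earlier in the thread:

```cpp
// Sketch: handle the encoder asking to renegotiate its output media type.
MFT_OUTPUT_DATA_BUFFER data = { outputId, NULL, 0, NULL };
DWORD status = 0;
HRESULT hr = encoder->ProcessOutput(0, 1, &data, &status);
if (hr == MF_E_TRANSFORM_STREAM_CHANGE) {
    // Accept the first output type the MFT now proposes and set it again.
    IMFMediaType *newType = NULL;
    if (SUCCEEDED(encoder->GetOutputAvailableType(outputId, 0, &newType))) {
        encoder->SetOutputType(outputId, newType, 0);
        newType->Release();
    }
    // Then retry ProcessOutput.
}
```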
