Media (Intel® Video Processing Library, Intel Media SDK)

MFXVideoENCODE_QueryIOSurf parameters, what is the minimum frame number?

salsailor
Beginner
According to the API documentation, one of the parameters is the mfxFrameAllocRequest structure:

typedef struct _mfxFrameAllocRequest {
    mfxU32       reserved[4];
    mfxFrameInfo Info;
    mfxU16       Type;
    mfxU16       NumFrameMin;
    mfxU16       NumFrameSuggested;
    mfxU16       reserved2;
} mfxFrameAllocRequest;


My question is: what determines NumFrameMin? Can it be set to 1? What happens if the stream cannot feed enough frames to the encoder?
I am using the DirectShow filters, and the upstream filters do not always allocate the number of frames the encoder wants. In my tests, this call returns a NumFrameMin of 5 or 6, while many live-feed filters only buffer 1 or 2 frames. Is that OK? Or is this call optional when deciding how many frame buffers to allocate?

Lee
IDZ_A_Intel
Employee

Hi Lee,

NumFrameMin and NumFrameSuggested in the mfxFrameAllocRequest structure are retrieved from the encoder, decoder, or VPP component using the QueryIOSurf call. For more information on usage, check out the console encoder or decoder sample. For instance, depending on the requested encoder configuration, the QueryIOSurf call will return the minimum and the suggested (better performance) number of surfaces required by the encoder. The minimum value represents the least number of surfaces that must be allocated; for better performance, more surfaces can be allocated.
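
For illustration, here is a minimal sketch of such a QueryIOSurf call. The session setup and the parameter values below (AVC, CBR, 1080p, 30 fps) are assumptions for the example, not recommended settings:

// Minimal sketch, assuming the Media SDK dispatcher is available; the codec,
// bitrate, and resolution values are illustrative only.
#include <mfxvideo.h>
#include <stdio.h>
#include <string.h>

int main()
{
    mfxSession session = NULL;
    mfxVersion ver = { {0, 1} };                  // request API 1.0 or later
    if (MFXInit(MFX_IMPL_AUTO_ANY, &ver, &session) != MFX_ERR_NONE)
        return 1;

    mfxVideoParam par;
    memset(&par, 0, sizeof(par));
    par.mfx.CodecId                 = MFX_CODEC_AVC;
    par.mfx.TargetUsage             = MFX_TARGETUSAGE_BALANCED;
    par.mfx.RateControlMethod       = MFX_RATECONTROL_CBR;
    par.mfx.TargetKbps              = 4000;
    par.mfx.FrameInfo.FourCC        = MFX_FOURCC_NV12;
    par.mfx.FrameInfo.ChromaFormat  = MFX_CHROMAFORMAT_YUV420;
    par.mfx.FrameInfo.PicStruct     = MFX_PICSTRUCT_PROGRESSIVE;
    par.mfx.FrameInfo.FrameRateExtN = 30;
    par.mfx.FrameInfo.FrameRateExtD = 1;
    par.mfx.FrameInfo.Width         = 1920;       // aligned to 16
    par.mfx.FrameInfo.Height        = 1088;       // progressive: aligned to 16
    par.mfx.FrameInfo.CropW         = 1920;
    par.mfx.FrameInfo.CropH         = 1080;
    par.IOPattern                   = MFX_IOPATTERN_IN_SYSTEM_MEMORY;

    // Ask the encoder how many input surfaces this configuration requires.
    mfxFrameAllocRequest request;
    memset(&request, 0, sizeof(request));
    if (MFXVideoENCODE_QueryIOSurf(session, &par, &request) == MFX_ERR_NONE)
        printf("NumFrameMin = %d, NumFrameSuggested = %d\n",
               (int)request.NumFrameMin, (int)request.NumFrameSuggested);

    MFXClose(session);
    return 0;
}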

Upstream filters are supposed to ask the downstream filter (such as the Media SDK encoder filter) how many surface buffers to allocate. The upstream filter then uses that response when calling the downstream filter to request allocation of the surfaces. Some upstream filters do not really follow these rules; in that case the upstream filter needs to be changed. Media SDK filters cannot function with fewer than the minimum required buffers/surfaces.
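
As a rough sketch of that negotiation (plain DirectShow COM interfaces, error handling trimmed; the function name NegotiateBufferCount and the frameSizeInBytes fallback are placeholders, not part of any filter's actual API), a well-behaved upstream filter would do something like:

#include <dshow.h>

HRESULT NegotiateBufferCount(IMemInputPin *pDownstreamPin,
                             IMemAllocator *pAllocator,
                             long frameSizeInBytes)
{
    ALLOCATOR_PROPERTIES requested = {};
    // A Media SDK-based encoder filter is expected to report at least
    // NumFrameMin (ideally NumFrameSuggested) in cBuffers here.
    if (FAILED(pDownstreamPin->GetAllocatorRequirements(&requested)))
        requested.cBuffers = 1;                  // downstream stated no preference

    if (requested.cbBuffer <= 0) requested.cbBuffer = frameSizeInBytes;
    if (requested.cbAlign  <= 0) requested.cbAlign  = 1;

    ALLOCATOR_PROPERTIES actual = {};
    HRESULT hr = pAllocator->SetProperties(&requested, &actual);
    if (SUCCEEDED(hr) && actual.cBuffers < requested.cBuffers)
        hr = E_FAIL;                             // cannot honor the downstream minimum
    return hr;
}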

Thanks,
Petter

salsailor
Beginner
Petter,

Thanks for the response. I can get the upstream filter to work with it now. My further question is this: the mfxVideoParam settings will presumably change the frame/buffer numbers. How can I configure mfxVideoParam so that the minimum number is as low as possible? Since mfxVideoParam has a lot of parameters, I cannot guess which ones have the most influence on the frame/buffer numbers.

Lee
IDZ_A_Intel
Employee
Hi Lee,

You can limit the number of buffers by configuring the encoder for "low latency" behavior.

First of all, to minimize internal buffering, set mfxVideoParam::AsyncDepth = 1.

If low-latency encode is also of interest, make sure to also set the following (this may further reduce the number of buffers needed; see the sketch after this list):
mfxInfoMFX::GopRefDist = 1
mfxInfoMFX::NumRefFrame = 1
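
As a small sketch of applying these settings and re-checking the surface requirement (assuming an initialized session and an mfxVideoParam already populated as in the earlier QueryIOSurf example; the helper name below is just illustrative):

#include <mfxvideo.h>
#include <string.h>

// Re-query the encoder surface requirement after applying the low-latency
// settings described above. Returns 0 if the query fails.
static mfxU16 QueryLowLatencyMinSurfaces(mfxSession session, mfxVideoParam par)
{
    par.AsyncDepth      = 1;   // minimize internal buffering
    par.mfx.GopRefDist  = 1;   // distance between I/P frames; 1 disables B-frames
    par.mfx.NumRefFrame = 1;   // single reference frame

    mfxFrameAllocRequest request;
    memset(&request, 0, sizeof(request));
    if (MFXVideoENCODE_QueryIOSurf(session, &par, &request) != MFX_ERR_NONE)
        return 0;
    return request.NumFrameMin;
}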

Be aware that by limiting the number of buffers you are also affecting encoder performance.

Also, if you are willing to explore the Media SDK 3.0 beta, you will find new features and samples that detail how to configure the encoder for low latency (fewer buffers). The Media SDK 3.0 beta has several improvements addressing low latency and streaming specifically. Media SDK 2.0 was not explicitly developed for low-latency use cases; however, it may still fulfill some developer requirements.

Regards,
Petter
