Media (Intel® oneAPI Video Processing Library, Intel Media SDK)

MFXInit - CPU usage

Ashim_Prasad
Beginner

When MFXInit is called from an application, the application's CPU usage goes up by 3-4% as seen in the top command. I would have expected CPU utilization to go up only during frame processing. I guess some threads are created by MFXInit (as inferred from MFXInitEx). Are these threads running some kind of busy loop that makes them take up CPU?

OS: Ubuntu 16.04 (kernel patches applied using the Generic Media SDK directory)

Media SDK version: 2017 R1

Dmitry_E_Intel
Employee

Hi Ashim Prasad,

Your observation is correct. MediaSDK has a scheduler and internal threads. The current MediaSDK implementation is free from polling between the MediaSDK decoder/VPP/encoder components and the video driver. However, MediaSDK still has 'soft' polling when a MediaSDK-level task routine needs to be executed several times. In the absence of new tasks (calls to DecodeFrameAsync, EncodeFrameAsync, RunFrameVPPAsync), the scheduler wakes up thread #0 every millisecond and the remaining threads every second. While this overhead can be neglected in a maximum-performance transcoding model (because the threads are always busy), it is relevant for real-time processing. I can't give you any commitment to eliminate this 'soft' polling, but it is on the dev team's radar, so you may expect it to be fixed in the future.

 

Regards,

Dmitry

Ashim_Prasad
Beginner

Thanks for the input. In that case, is it better to have fewer sessions, so as to minimize this CPU usage? Are there any limitations on the kinds of tasks that can be handled in one session? For example, can YUV-to-NV12 conversion, YUV-to-BGRA conversion, and H.264 encoding all be done in one session? Currently I have two sessions: the YUV->NV12 conversion and the H.264 encoder in one session, and the YUV->BGRA conversion in another.

Dmitry_E_Intel
Employee

Right. Besides that, to mitigate the issue the application can join MediaSDK sessions so that all the sessions share one scheduler object.

One MediaSDK session can have only one instance each of a decoder, VPP, and encoder. So in your case at least two sessions are needed.

Regards,

Dmitry

Ashim_Prasad
Beginner

If I understand correctly, the rule that one session can have only one instance each of VPP, decoder, and encoder applies to joined sessions as well.

Dmitry_E_Intel
Employee

Right.

Artem_S_Intel
Employee

Joining sessions won't limit the number of components. Even when sessions are joined they remain independent, so each session can still have its own DECODE, VPP, and ENCODE; only the scheduler is shared among all of them, which reduces the number of threads allocated.
