
Linux: Performance issue when rendering a decoded video

Frédéric_K_

Hello everybody,

I did some tests with the latest MSDK release (2015 R3 for Linux) on an Intel NUC D34010. More precisely, I used the MSDK samples to decode and display an H.264 stream ("Tears of Steel" at a resolution of 3840x1714 pixels and 24 frames per second - https://mango.blender.org/). The system is set up as described in the Getting Started Guide, i.e. CentOS 7 with a patched kernel.

To decode and render the video, I used this command:

./sample_decode_x11 h264 -hw -vaapi -r -i ~/tearsofsteel.h264

Here comes my problem: when rendering the decoded video (using the "-r" switch), the video is rendered at only 3-4 frames per second, and one of the 4 CPU cores runs at 100%! Without rendering, I get between 250-300 fps, and none of the CPU cores is stressed at 100%.

 

To further investigate this issue, I set up a second NUC, this time with a custom Linux system built with Buildroot (http://buildroot.uclibc.org/). It runs a very recent Linux kernel (3.18.1), the latest libva + libva-intel-driver (1.5.0) and, on top of that, GStreamer (1.4.3) + gstreamer-vaapi (0.5.9). I built this system in order to reproduce the steps described in this Intel paper: https://www-ssl.intel.com/content/www/us/en/intelligent-systems/intel-embedded-media-and-graphics-driver/video-encode-atom-e38xx-emgd-gstreamer-paper.html.

Using a GStreamer demo application, the command to decode and render the video stream is:

./simple-decoder ~/tearsofsteel.h264

And voila! The video is decoded and rendered at the full 24 fps. I can even run up to three instances in parallel without any performance issues! And the CPU cores are basically bored at less than 10% load. (A rough equivalent of the pipeline the demo builds is sketched below.)
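
For reference, here is a minimal sketch of what the demo roughly does (this is not the actual simple-decoder source; I am assuming the vaapidecode/vaapisink element names from gstreamer-vaapi 0.5.x, and the file path is just a placeholder):

// Minimal sketch: build the same kind of VA-API decode + render path
// with gst_parse_launch. Element names (h264parse, vaapidecode, vaapisink)
// and the input path are assumptions, not taken from the demo source.
#include <gst/gst.h>

int main(int argc, char* argv[]) {
    gst_init(&argc, &argv);

    GError* error = nullptr;
    GstElement* pipeline = gst_parse_launch(
        "filesrc location=/home/user/tearsofsteel.h264 ! "
        "h264parse ! vaapidecode ! vaapisink",
        &error);
    if (!pipeline) {
        g_printerr("Failed to build pipeline: %s\n", error->message);
        g_error_free(error);
        return 1;
    }

    gst_element_set_state(pipeline, GST_STATE_PLAYING);

    // Block until end-of-stream or an error is posted on the bus.
    GstBus* bus = gst_element_get_bus(pipeline);
    GstMessage* msg = gst_bus_timed_pop_filtered(
        bus, GST_CLOCK_TIME_NONE,
        static_cast<GstMessageType>(GST_MESSAGE_ERROR | GST_MESSAGE_EOS));
    if (msg)
        gst_message_unref(msg);
    gst_object_unref(bus);

    gst_element_set_state(pipeline, GST_STATE_NULL);
    gst_object_unref(pipeline);
    return 0;
}

With this kind of pipeline the decoded surfaces apparently never leave the GPU: vaapidecode hands VA surfaces directly to vaapisink, which is presumably why the CPU load stays so low.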

 

So, does anyone know what's going on here? The GStreamer demo proves that it is possible to decode and render a 4K video on Linux smoothly without stressing the CPU. But unfortunately, I can't get the Intel Media SDK to perform the same way.

My only guess is that, when using the Media SDK, there is some heavy copying going on between GPU and CPU buffers to render the images to the screen, while GStreamer manages to do the decoding and rendering on the GPU without copying frames between GPU and CPU buffers. But that's only a guess... (a sketch of what I mean follows below).
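
Just to make the guess concrete, here is a rough sketch (based on my reading of the Media SDK reference manual, not on the actual sample_decode source) of how I would expect the decoder to be set up so that decoded frames stay in video memory. vaDisplay and vaapiAllocator are placeholders for an already-initialized VA-API display and the mfxFrameAllocator implementation shipped with the samples:

// Hypothetical sketch of a decode setup that keeps output frames in GPU memory.
// The key point is MFX_IOPATTERN_OUT_VIDEO_MEMORY plus a VA-API frame allocator;
// with MFX_IOPATTERN_OUT_SYSTEM_MEMORY every decoded 4K frame would have to be
// copied from GPU to CPU memory, which is the overhead I suspect.
#include <mfxvideo.h>
#include <va/va.h>

mfxStatus InitHwDecoder(mfxSession& session, mfxVideoParam& par,
                        VADisplay vaDisplay, mfxFrameAllocator& vaapiAllocator) {
    mfxVersion ver = {{0, 1}};
    mfxStatus sts = MFXInit(MFX_IMPL_HARDWARE_ANY, &ver, &session);
    if (sts != MFX_ERR_NONE) return sts;

    // Hand the VA display to the session so the HW decoder can use it.
    sts = MFXVideoCORE_SetHandle(session, MFX_HANDLE_VA_DISPLAY, vaDisplay);
    if (sts != MFX_ERR_NONE) return sts;

    // Use a VA-API surface allocator so decoded surfaces live in video memory...
    sts = MFXVideoCORE_SetFrameAllocator(session, &vaapiAllocator);
    if (sts != MFX_ERR_NONE) return sts;

    // ...and ask the decoder for video-memory output instead of system memory.
    par.IOPattern = MFX_IOPATTERN_OUT_VIDEO_MEMORY;

    return MFXVideoDECODE_Init(session, &par);
}

If sample_decode with "-hw -vaapi" already sets things up like this, then the extra work presumably happens somewhere in the X11 presentation path instead - but that is exactly what I would like to understand.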

Any help on how to get the Media SDK to render without a performance issue would be highly appreciated!
