Media (Intel® Video Processing Library, Intel Media SDK)
Access community support for transcoding, decoding, and encoding in applications that use media tools such as the Intel® oneAPI Video Processing Library and the Intel® Media SDK.
Announcements
The Intel Media SDK project is no longer active. For continued support and access to new features, Intel Media SDK users are encouraged to read the transition guide on upgrading from Intel® Media SDK to Intel® Video Processing Library (VPL), and to move to VPL as soon as possible.
For more information, see the VPL website.

oneVPL evaluation on DevCloud

Jitendra_S_Intel
Employee
1,973 Views

I'm trying to use the oneAPI DevCloud to test some oneVPL samples. I have downloaded a sample video from https://github.com/intel-iot-devkit/sample-videos/blob/master/people-detection.mp4.
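(For reference, a download step along the following lines would fetch the clip onto DevCloud; the raw-download URL and the ~/test.mp4 target name are assumptions based on the ffmpeg command at the end of this post.)

# Assumed download step; GitHub serves the file contents via the /raw/ path.
wget https://github.com/intel-iot-devkit/sample-videos/raw/master/people-detection.mp4 -O ~/test.mp4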

I also built the oneVPL samples on the login node and tried to run them on a compute node through an interactive shell, using the following command to log in to the compute node:

qsub -I -l nodes=1:quad_gpu:ppn=2 -d .
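(For reference, the node properties available on DevCloud can be listed from the login node before picking one such as quad_gpu; this sketch assumes the Torque pbsnodes utility that the DevCloud job scheduler provides.)

# List the property strings advertised by DevCloud compute nodes (assumed Torque/PBS setup).
pbsnodes | grep properties | sort | uniq -c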

The decode and VPP samples don't seem to run properly. Please check.

SW-based attempt with hello-vpp below. The video is 49 seconds long at 12 fps, yet the output reports only 9 frames processed.

uxxxxx@s012-n001:~/oneAPI-samples/Libraries/oneVPL/hello-vpp/build$ time ./hello-vpp -sw -i ./test.h264 -w 768 -h 432
Implementation details:
  ApiVersion:           2.4
  Implementation type:  SW
  AccelerationMode via: NA
  Path: /glob/development-tools/versions/oneapi/2021.3/inteloneapi/vpl/2021.4.0/lib/libvplswref64.so.1

Processing ./test.h264 -> out.raw
Processed 9 frames

real    0m4.939s
user    0m0.019s
sys     0m0.052s
uxxxxx@s012-n001:~/oneAPI-samples/Libraries/oneVPL/hello-vpp/build$

HW-based attempt with hello-vpp, followed by an SW attempt with hello-decode:

uxxxxx@s012-n001:~/oneAPI-samples/Libraries/oneVPL/hello-vpp/build$ time ./hello-vpp -hw -i ./test.h264 -w 768 -h 432
Cannot create session -- no implementations meet selection criteria
Processed 0 frames

real    0m0.027s
user    0m0.009s
sys     0m0.024s
uxxxxx@s012-n001:~/oneAPI-samples/Libraries/oneVPL/hello-vpp/build$ cd ..

uxxxxx@s012-n001:~/oneAPI-samples/Libraries/oneVPL/hello-decode/build$ ./hello-decode -sw -i ../../hello-vpp/build/test.h264
Implementation details:
  ApiVersion:           2.4
  Implementation type:  SW
  AccelerationMode via: NA
  Path: /glob/development-tools/versions/oneapi/2021.3/inteloneapi/vpl/2021.4.0/lib/libvplswref64.so.1

Error initializing decode

Decoded 0 frames
uxxxxx@s012-n001:~/oneAPI-samples/Libraries/oneVPL/hello-decode/build$ 

Input file information:

uxxxxx@s012-n001:~/oneAPI-samples/Libraries/oneVPL/hello-vpp/build$ ffmpeg -i ./test.h264 -f null -
ffmpeg version 4.2.4-1ubuntu0.1 Copyright (c) 2000-2020 the FFmpeg developers
  built with gcc 9 (Ubuntu 9.3.0-10ubuntu2)
  configuration: --prefix=/usr --extra-version=1ubuntu0.1 --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --arch=amd64 --enable-gpl --disable-stripping --enable-avresample --disable-filter=resample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libaom --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libcodec2 --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libjack --enable-libmp3lame --enable-libmysofa --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librsvg --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxml2 --enable-libxvid --enable-libzmq --enable-libzvbi --enable-lv2 --enable-omx --enable-openal --enable-opencl --enable-opengl --enable-sdl2 --enable-libdc1394 --enable-libdrm --enable-libiec61883 --enable-nvenc --enable-chromaprint --enable-frei0r --enable-libx264 --enable-shared
  libavutil      56. 31.100 / 56. 31.100
  libavcodec     58. 54.100 / 58. 54.100
  libavformat    58. 29.100 / 58. 29.100
  libavdevice    58.  8.100 / 58.  8.100
  libavfilter     7. 57.100 /  7. 57.100
  libavresample   4.  0.  0 /  4.  0.  0
  libswscale      5.  5.100 /  5.  5.100
  libswresample   3.  5.100 /  3.  5.100
  libpostproc    55.  5.100 / 55.  5.100
Input #0, h264, from './test.h264':
  Duration: N/A, bitrate: N/A
   Stream #0:0: Video: h264 (Baseline), yuv420p(tv, smpte170m, progressive), 768x432, 12 fps, 12 tbr, 1200k tbn, 24 tbc
Stream mapping:
  Stream #0:0 -> #0:0 (h264 (native) -> wrapped_avframe (native))
Press [q] to stop, [?] for help
Output #0, null, to 'pipe:':
  Metadata:
    encoder         : Lavf58.29.100
    Stream #0:0: Video: wrapped_avframe, yuv420p, 768x432, q=2-31, 200 kb/s, 12 fps, 12 tbn, 12 tbc
    Metadata:
      encoder         : Lavc58.54.100 wrapped_avframe
frame=  596 fps=0.0 q=-0.0 Lsize=N/A time=00:00:49.58 bitrate=N/A speed= 179x
video:312kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown
uxxxxx@s012-n001:~/oneAPI-samples/Libraries/oneVPL/hello-vpp/build$

The input file was extracted from the MP4 file as follows:

ffmpeg -i ~/test.mp4 -c:v copy -bsf h264_mp4toannexb -f h264 test.h264
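(The frame count of the extracted stream can be confirmed independently with ffprobe, assuming it is installed alongside the ffmpeg build shown above.)

# Decode and count the frames in the elementary stream; the value should match the 596 frames ffmpeg reports above.
ffprobe -v error -count_frames -select_streams v:0 -show_entries stream=nb_read_frames -of default=nokey=1:noprint_wrappers=1 ./test.h264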

9 Replies
JaideepK_Intel
Moderator
1,942 Views

Hi,

Thank you for posting in the Intel Forums.

We assume that you are working on this sample (https://github.com/oneapi-src/oneAPI-samples/tree/master/Libraries/oneVPL/hello-vpp). After going through the README file, we concluded that the sample you're trying to run doesn't support GPU. We've provided a screenshot for your reference. We will reproduce this soon and get back to you with updates.

[Attachment: MicrosoftTeams-image.png]

Thanks,

Jaideep

Jitendra_S_Intel
Employee
1,917 Views

Aside from using a compute node with a GPU, I didn't explicitly try to run the demo on the GPU, unless the demo defaulted to the GPU when given the '-hw' option.

JaideepK_Intel
Moderator
1,858 Views

Hi,


We could reproduce your issue on our end and observed that the hello-vpp and hello-decode samples are indeed reporting the wrong number of frames and a different frame rate. We'll investigate this issue further and get back to you.

Regarding the issue with the "-hw" option (hardware implementation): there are two types of implementations in Intel Media SDK and Intel oneVPL, a software implementation and a hardware implementation. The software implementation runs a demo sample on the CPU and is selected by adding the "-sw" option to the command, whereas the hardware implementation runs a demo sample on the GPU and is selected with the "-hw" option. Since you added the -hw option to a sample that only provides a software implementation, you got the "no implementations" error; using a GPU compute node doesn't affect this.
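(As a quick sanity check on any compute node, something like the following shows whether the pieces a hardware implementation needs are even visible; the vpl-inspect tool and the presence of a /dev/dri render node are assumptions about the DevCloud environment rather than anything guaranteed by these samples.)

# Check for a GPU render node that a hardware (-hw) oneVPL implementation would use.
ls -l /dev/dri/
# If the oneVPL tools are installed in the environment, list the implementations the dispatcher can find.
vpl-inspect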


Thanks,

Jaideep

Jitendra_S_Intel
Employee
1,850 Views

Where can I see the supported target hardware for the '-hw' option? In the application or the documentation?

JaideepK_Intel
Moderator
1,827 Views

Hi,


You can check the supported hardware in the README.md file for each sample.

We've provided the link to the README file (https://github.com/oneapi-src/oneAPI-samples/blob/master/Libraries/oneVPL/hello-vpp/README.md) and a screenshot of it below, where the target device is listed as CPU.

That means the sample does not support GPU and does not work with the "-hw" parameter.
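(For a quick local check, the relevant lines of any sample's README can also be pulled up directly; this assumes the oneAPI-samples repository is already cloned under ~/oneAPI-samples, as in the transcripts above.)

# Show the README lines that mention the supported hardware / target device.
grep -in "hardware\|cpu\|gpu" ~/oneAPI-samples/Libraries/oneVPL/hello-vpp/README.md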


Thanks,

Jaideep

JaideepK_Intel
Moderator
1,747 Views

Hi,


Sorry for the delayed response. It took us some time to figure out this issue.


Sample decode: please go through this README file (https://github.com/oneapi-src/oneAPI-samples/tree/master/Libraries/oneVPL/hello-decode).

Currently, hello-decode supports only H.265 input; since you gave it an H.264 file, you got that error.
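(If the only source at hand is the H.264 clip from this thread, one way to produce an H.265 test file is to transcode it with ffmpeg; libx265 is enabled in the ffmpeg build shown earlier in the thread, and the output name test.h265 is just an assumed example.)

# Re-encode the existing H.264 elementary stream into a raw HEVC (H.265) stream.
ffmpeg -i ./test.h264 -an -c:v libx265 -f hevc test.h265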


If you give an H.265 file as input and run the command below:

./hello-decode -sw -i <input file>   (e.g., test.h265)


The output will be stored as out.raw in the same directory (hello-decode outputs I420). To view this file, run the command below:

ffplay -f rawvideo -pixel_format yuv420p -video_size [width]x[height] out.raw


where [width] and [height] are the width and height of out.raw.


Sample VPP: please go through this README file (https://github.com/oneapi-src/oneAPI-samples/tree/master/Libraries/oneVPL/hello-vpp).

hello-vpp supports only I420/YUV420 as the input color format. Since we already have an I420 file from hello-decode, we can now use that file as input to hello-vpp:

./hello-vpp -sw -i ../../hello-decode/build/out.raw -w <width of out.raw> -h <height of out.raw>


Note: the output resolution of hello-vpp is always 640x480.


The output will be stored as out.raw in the same directory (output format: BGRA raw video elementary stream). To view this file, run the command below:

ffplay -f rawvideo -pixel_format bgra -video_size 640x480 out.raw
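(Putting the two samples together, an end-to-end run could look like the sketch below. The 768x432 dimensions come from the ffmpeg probe earlier in the thread, and ~/test.h265 refers to a converted file such as the one assumed in the note above.)

# Decode the H.265 stream to raw I420 frames (writes out.raw in the hello-decode build directory).
cd ~/oneAPI-samples/Libraries/oneVPL/hello-decode/build
./hello-decode -sw -i ~/test.h265
# Feed the decoded frames to hello-vpp; the -w/-h values describe the input (768x432 for this clip).
cd ../../hello-vpp/build
./hello-vpp -sw -i ../../hello-decode/build/out.raw -w 768 -h 432
# Play back the 640x480 BGRA output produced by hello-vpp.
ffplay -f rawvideo -pixel_format bgra -video_size 640x480 out.raw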

Thanks,

Jaideep

JaideepK_Intel
Moderator
1,718 Views

Hi,

If this resolves your issue, please accept it as a solution. This will help others with a similar issue.

Thank you,

Jaideep

JaideepK_Intel
Moderator
1,619 Views

Hi,


I assume that your issue is resolved. If you need any additional information, please post a new question, as this thread will no longer be monitored by Intel.


Thanks,

Jaideep
