When I call MFXVideoDECODE_Init() it fails with MFX_ERR_UNSUPPORTED.
I'm using an Annex B .h264 file that I generated with FFmpeg. I get the same error when I use the .264 file from the samples; this is expected, of course, because VideoDECODEH264::Init() has an #ifdef that triggers the return of MFX_ERR_UNSUPPORTED, as you can see in the linked code.
VideoDECODEH264::Init() returns MFX_ERR_UNSUPPORTED because MFX_VA_LINUX is defined in mfx_config.h.
However, I wasn't able to find where MFX_VA is defined.
How can I fix this?
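For context, this is roughly the call sequence that hits the error. It's a trimmed-down sketch rather than my exact code: file reading and most error checks are left out and the buffer size is arbitrary.

#include <cstring>
#include <mfxvideo.h>

mfxStatus init_decoder()
{
    mfxVersion ver = { {0, 1} };
    mfxSession session = nullptr;
    mfxStatus sts = MFXInit(MFX_IMPL_AUTO_ANY, &ver, &session);
    if (sts != MFX_ERR_NONE)
        return sts;

    // Bitstream holding (part of) the Annex B stream.
    mfxBitstream bs;
    memset(&bs, 0, sizeof(bs));
    bs.MaxLength = 4 * 1024 * 1024;              // arbitrary size for this sketch
    bs.Data = new mfxU8[bs.MaxLength];
    // ... read the .h264 file into bs.Data and set bs.DataLength ...

    mfxVideoParam par;
    memset(&par, 0, sizeof(par));
    par.mfx.CodecId = MFX_CODEC_AVC;
    par.IOPattern = MFX_IOPATTERN_OUT_SYSTEM_MEMORY;

    sts = MFXVideoDECODE_DecodeHeader(session, &bs, &par);
    if (sts != MFX_ERR_NONE)
        return sts;

    return MFXVideoDECODE_Init(session, &par);   // <-- this is what returns MFX_ERR_UNSUPPORTED
}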
Hi Diederick,
Could you provide more information?
Which OS are you using?
Which application were you using when the error happened?
Do you have any log files?
Mark.
Hi Mark,
Thanks for replying.
For the OS, please see the linked CMake output; it shows:
Target:
  CMAKE_SYSTEM_NAME      : Linux
  CMAKE_SYSTEM_VERSION   : 5.2.1-arch1-1-ARCH
  CMAKE_SYSTEM_PROCESSOR : x86_64
I tested with the sample_decode application and my own code.
How / where can I get a log? (The app that I ran didn't create one.)
I'm wondering if this is due to missing drivers. I've got an MSI Nightblade X2 PC with an i7-6700K, but in the UEFI settings I cannot explicitly enable/disable the integrated GPU. I know that on some BIOS/UEFI systems you can do this; I've contacted MSI support about it.
roxlu
Hi Diederick,
You might be right about the driver. This is why I wanted to know which Linux distro you are using: the driver is in the kernel, so unless I know the kernel version, I can't tell whether you are missing the driver or not.
Reading through your message, it seems the system has a discrete graphics card in it. This could be an issue; you have to make sure the integrated graphics is enabled.
You can also refer to Appendix D of the SDK developer reference for clues.
Mark
Thanks Mark!
I'm using Arch Linux, kernel version 5.2.4 (I upgraded yesterday):
$ uname -a
Linux arch680 5.2.4-arch1-1-ARCH #1 SMP PREEMPT Sun Jul 28 10:52:46 UTC 2019 x86_64 GNU/Linux
Thanks for the link to Appendix D. I still have to read up on this, but could it be that my UEFI disables the integrated GPU because my monitors are connected to the discrete GPU? Maybe I'm oversimplifying things, but I would assume it would be a perfect workflow to use the discrete GPU for rendering and the integrated GPU for other tasks (encoding/decoding); am I wrong?
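This is the kind of setup I have in mind, as a rough sketch only: I'm assuming the iGPU shows up as /dev/dri/renderD128 (the node number differs per system) and I've left out all error checking.

#include <fcntl.h>
#include <unistd.h>
#include <va/va_drm.h>
#include <mfxvideo.h>

// Open the iGPU's render node directly; no monitor needs to be connected to it.
// (/dev/dri/renderD128 is an assumption -- check which node belongs to the iGPU.)
mfxSession create_igpu_session()
{
    int fd = open("/dev/dri/renderD128", O_RDWR);

    int major = 0, minor = 0;
    VADisplay va_dpy = vaGetDisplayDRM(fd);
    vaInitialize(va_dpy, &major, &minor);

    mfxVersion ver = { {0, 1} };
    mfxSession session = nullptr;
    MFXInit(MFX_IMPL_HARDWARE_ANY, &ver, &session);

    // Bind the session to this VA display (the iGPU), independent of which
    // GPU is driving the monitors.
    MFXVideoCORE_SetHandle(session, MFX_HANDLE_VA_DISPLAY, (mfxHDL)va_dpy);
    return session;
}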
roxlu
Hi roxlu,
Sorry for the late response.
You have 2 problems here:
- Driver issue: if you use a kernel older than 4.14.15, you have to apply the kernel patch. The method is described in the installation guide. We only validate our release on CentOS and briefly check Ubuntu, so I am really not sure how the installation goes on Arch Linux. In theory, it should work.
- Discrete graphics issue: depending on the OS, you should only use one driver at a time, so if you enabled the discrete graphics card, the integrated graphics may not be active and you would get a failing result.
I would suggest you try your hardware platform by disabling the discrete graphics on CentOS or Ubuntu, just to confirm the hardware. If this works, then we can move on to Arch Linux and enable the discrete graphics step by step. Does this make sense to you?
Mark
Hi Mark,
Thanks for getting back to me, and no worries about the late response. I appreciate you taking the time to help me dive into this.
1. You mention that it could be a driver issue if I use a kernel < 4.14.15. My current kernel is 5.2.5-arch1-1-ARCH, so I don't think that's the case. I've installed libva and intel-media-driver. When I run vainfo I get the following output:
$ vainfo
vainfo: VA-API version: 1.5 (libva 2.5.0)
vainfo: Driver version: Intel iHD driver - 1.0.0
vainfo: Supported profile and entrypoints
VAProfileNone : VAEntrypointVideoProc
VAProfileNone : VAEntrypointStats
VAProfileMPEG2Simple : VAEntrypointVLD
VAProfileMPEG2Simple : VAEntrypointEncSlice
VAProfileMPEG2Main : VAEntrypointVLD
VAProfileMPEG2Main : VAEntrypointEncSlice
VAProfileH264Main : VAEntrypointVLD
VAProfileH264Main : VAEntrypointEncSlice
VAProfileH264Main : VAEntrypointFEI
VAProfileH264Main : VAEntrypointEncSliceLP
VAProfileH264High : VAEntrypointVLD
VAProfileH264High : VAEntrypointEncSlice
VAProfileH264High : VAEntrypointFEI
VAProfileH264High : VAEntrypointEncSliceLP
VAProfileVC1Simple : VAEntrypointVLD
VAProfileVC1Main : VAEntrypointVLD
VAProfileVC1Advanced : VAEntrypointVLD
VAProfileJPEGBaseline : VAEntrypointVLD
VAProfileJPEGBaseline : VAEntrypointEncPicture
VAProfileH264ConstrainedBaseline: VAEntrypointVLD
VAProfileH264ConstrainedBaseline: VAEntrypointEncSlice
VAProfileH264ConstrainedBaseline: VAEntrypointFEI
VAProfileH264ConstrainedBaseline: VAEntrypointEncSliceLP
VAProfileVP8Version0_3 : VAEntrypointVLD
VAProfileVP8Version0_3 : VAEntrypointEncSlice
VAProfileHEVCMain : VAEntrypointVLD
VAProfileHEVCMain : VAEntrypointEncSlice
VAProfileHEVCMain : VAEntrypointFEI
VAProfileHEVCMain10 : VAEntrypointVLD
VAProfileHEVCMain10 : VAEntrypointEncSlice
VAProfileVP9Profile0 : VAEntrypointVLD
VAProfileVP9Profile2 : VAEntrypointVLD
2. I'm currently testing on a NUC8i7BEH2, which only has the integrated Intel Iris Plus Graphics 655.
When I look at the source of mfx_h264_decode.cpp, in the Init(mfxVideoParam* par) function, I see this:
if (MFX_PLATFORM_SOFTWARE == m_platform)
{
#if defined (MFX_VA_LINUX)
    return MFX_ERR_UNSUPPORTED;
#else
    m_pH264VideoDecoder.reset(new UMC::MFX_SW_TaskSupplier());
    m_FrameAllocator.reset(new mfx_UMC_FrameAllocator());
#endif
}
It's that #if defined (MFX_VA_LINUX) part that returns the MFX_ERR_UNSUPPORTED.
I'm compiling the Intel Media SDK as an ExternalProject_Add() with CMake, and because it's an #ifdef I'm wondering if this is an issue related to the way the Media SDK is built.
Any advice on how to debug / proceed?
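In case it helps, here is a small check I can run (a sketch, not my actual code) to see which implementation MFXInit actually selects; if the session does not report a hardware implementation, I assume the decoder ends up on the MFX_PLATFORM_SOFTWARE path above.

#include <cstdio>
#include <mfxvideo.h>

int main()
{
    mfxVersion ver = { {0, 1} };             // request API 1.0 or newer
    mfxSession session = nullptr;

    // Ask explicitly for a hardware implementation via VA-API so that
    // the dispatcher cannot silently fall back to a software library.
    mfxStatus sts = MFXInit(MFX_IMPL_HARDWARE_ANY | MFX_IMPL_VIA_VAAPI, &ver, &session);
    printf("MFXInit: %d\n", sts);
    if (sts != MFX_ERR_NONE)
        return 1;

    mfxIMPL impl = 0;
    mfxVersion actual = {};
    MFXQueryIMPL(session, &impl);            // which implementation was loaded
    MFXQueryVersion(session, &actual);       // which API version it reports

    printf("impl: 0x%x (hardware: %s), API %d.%d\n",
           (unsigned)impl,
           (MFX_IMPL_BASETYPE(impl) != MFX_IMPL_SOFTWARE) ? "yes" : "no",
           actual.Major, actual.Minor);

    MFXClose(session);
    return 0;
}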
Hi roxlu,
Apologies again for the long time with no response; this is my fault. I was busy with other projects.
Basically, this platform is not an officially supported Linux version, so I am trying my best.
It looks like your driver is configured correctly, but the debug info goes very deep into the library. I am sceptical about MFX_PLATFORM_SOFTWARE, since it implies a software-only platform.
In the case of a software codec, we have to use the software library to complete the codec operation; but the open source project is hardware only, so the software codec path should not work.
Mark