Hi,
I'm trying interlaced encoding based on sample_encode.exe (x86) from Media SDK 2.0.
I set mfxExtCodingOption.FramePicture to MFX_CODINGOPTION_ON and attached it to mfxVideoParam.ExtParam inside the function CEncodingPipeline::InitMfxEncParams.
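In short, the added code looks like this (member names are from my modified CEncodingPipeline):
[cpp]
// attach the coding-option ext buffer to the encoder parameters
memset(&m_mfxCopt, 0, sizeof(mfxExtCodingOption));
m_mfxCopt.Header.BufferId = MFX_EXTBUFF_CODING_OPTION;
m_mfxCopt.Header.BufferSz = sizeof(mfxExtCodingOption);
m_mfxCopt.FramePicture    = MFX_CODINGOPTION_ON; // code interlaced content as frame pictures, not field pairs

m_EncExtParams[0] = (mfxExtBuffer *)&m_mfxCopt;
m_mfxEncParams.ExtParam    = m_EncExtParams;
m_mfxEncParams.NumExtParam = 1;
[/cpp]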
This works with software encoding, but doesn't seem to work with hardware encoding.
As a result, when the stream is muxed into MP4 or MKV (by MP4Box or mkvmerge), the number of frames in the output is double the number of input frames.
How can I enable the mfxExtCodingOption.FramePicture flag for hardware encoding?
Win7(x64)
Core i5 2500
Graphics Driver Version 8.15.10.2509
Thanks.
This may be a bug. We will let you know as we find more information.
Regards,
Jeff
Sorry for the delayed response.
mfxExtCodingOption.FramePicture is not supported for H.264 in Media SDK 2.0 or the current Media SDK 3 betas. We will update the documentation for future releases to reflect this.
We are still looking into MPEG2.
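As a side note, you can ask an implementation whether it honors an option before calling Init. A rough, hypothetical sketch (assuming mfxvideo++.h, an encoder object like the sample's m_pmfxENC, and an mfxExtCodingOption buffer attached to the parameters as described):
[cpp]
// Hypothetical check: Query corrects or clears parameters the
// implementation cannot honor before Init is ever called.
mfxStatus CheckFramePicture(MFXVideoENCODE *pEnc, mfxVideoParam *pParams,
                            mfxExtCodingOption *pCopt)
{
    mfxVideoParam queried = *pParams;   // shares the ExtParam array
    mfxStatus sts = pEnc->Query(pParams, &queried);
    // corrected ext-buffer values land in *pCopt because the
    // ExtParam pointers are shared between in and out
    if (MFX_CODINGOPTION_ON != pCopt->FramePicture)
        sts = MFX_WRN_INCOMPATIBLE_VIDEO_PARAM;
    return sts;
}
[/cpp]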
In the meantime, some more information will help us find an answer as quickly as possible.
Could you make two short elementary streams available (one from hardware encoding, one from software)? Many things could be happening in the muxer, but if possible we would like to learn more about the elementary stream outputs themselves.
Please let us know more about the changes you made to the sample. The best option would be to attach your updated pipeline_encode.cpp and/or a log from mediasdk_tracer. The tracer tool ships with the MSDK 3 betas, but it also works with MSDK 2 executables. Even just verifying that the only changes are those required to add the FramePicture extended buffer parameter would be helpful.
Also, what command line parameters did you use?
Thanks for your patience,
Jeff
Thanks Jeff, I understand mfxExtCodingOption.FramePicture doesn't work with H.264 encoding.
The command lines were:
for software:
sample_encode.exe h264 -tff -f 29.97 -i raw.yuv -o video.264 -w 1280 -h 720
for hardware:
sample_encode.exe h264 -hw -d3d -tff -f 29.97 -i raw.yuv -o video.264 -w 1280 -h 720
The change I made to the sample is very simple.
I added
mfxExtBuffer* m_EncExtParams[1];
mfxExtCodingOption m_mfxCopt;
to class CEncodingPipeline, and then changed CEncodingPipeline::InitMfxEncParams as shown below. I enabled CQP encoding mode and attached the ext buffer to m_mfxEncParams.ExtParam.
[cpp]mfxStatus CEncodingPipeline::InitMfxEncParams(sInputParams *pInParams)
{
    m_mfxEncParams.mfx.CodecId     = pInParams->CodecId;
    m_mfxEncParams.mfx.TargetUsage = pInParams->nTargetUsage; // trade-off between quality and speed
    //Disable
    //m_mfxEncParams.mfx.TargetKbps = pInParams->nBitRate; // in Kbps
    /* Add */
    m_mfxEncParams.mfx.RateControlMethod = MFX_RATECONTROL_CQP;
    m_mfxEncParams.mfx.QPI = 24;
    m_mfxEncParams.mfx.QPP = 25;
    m_mfxEncParams.mfx.QPB = 26;
    /* Add End */
    ConvertFrameRate(pInParams->dFrameRate,
                     &m_mfxEncParams.mfx.FrameInfo.FrameRateExtN,
                     &m_mfxEncParams.mfx.FrameInfo.FrameRateExtD);
    m_mfxEncParams.mfx.NumThread    = pInParams->nThreads; // if 0 then encoder decides
    m_mfxEncParams.mfx.EncodedOrder = 0; // binary flag, 0 signals encoder to take frames in display order

    // specify memory type
    if (pInParams->bd3dAlloc)
    {
        m_mfxEncParams.IOPattern = MFX_IOPATTERN_IN_VIDEO_MEMORY;
    }
    else
    {
        m_mfxEncParams.IOPattern = MFX_IOPATTERN_IN_SYSTEM_MEMORY;
    }

    // frame info parameters
    m_mfxEncParams.mfx.FrameInfo.FourCC       = MFX_FOURCC_NV12;
    m_mfxEncParams.mfx.FrameInfo.ChromaFormat = MFX_CHROMAFORMAT_YUV420;
    m_mfxEncParams.mfx.FrameInfo.PicStruct    = pInParams->nPicStruct;

    // set frame size and crops
    // width must be a multiple of 16
    // height must be a multiple of 16 in case of frame picture and a multiple of 32 in case of field picture
    m_mfxEncParams.mfx.FrameInfo.Width  = ALIGN16(pInParams->nDstWidth);
    m_mfxEncParams.mfx.FrameInfo.Height = (MFX_PICSTRUCT_PROGRESSIVE == m_mfxEncParams.mfx.FrameInfo.PicStruct) ?
        ALIGN16(pInParams->nDstHeight) : ALIGN32(pInParams->nDstHeight);
    m_mfxEncParams.mfx.FrameInfo.CropX = 0;
    m_mfxEncParams.mfx.FrameInfo.CropY = 0;
    m_mfxEncParams.mfx.FrameInfo.CropW = pInParams->nDstWidth;
    m_mfxEncParams.mfx.FrameInfo.CropH = pInParams->nDstHeight;

    // we don't specify profile and level and let the encoder choose those basing on parameters

    /* Add */
    if (m_mfxEncParams.mfx.FrameInfo.PicStruct & (MFX_PICSTRUCT_FIELD_TFF | MFX_PICSTRUCT_FIELD_BFF))
    {
        memset(&m_mfxCopt, 0, sizeof(mfxExtCodingOption));
        m_mfxCopt.Header.BufferId = MFX_EXTBUFF_CODING_OPTION;
        m_mfxCopt.Header.BufferSz = sizeof(mfxExtCodingOption);
        m_mfxCopt.FramePicture    = MFX_CODINGOPTION_ON;

        m_EncExtParams[0] = (mfxExtBuffer *)&m_mfxCopt;
        m_mfxEncParams.ExtParam    = m_EncExtParams;
        m_mfxEncParams.NumExtParam = 1;
    }
    /* Add End */

    return MFX_ERR_NONE;
}[/cpp]
That is the entire change I made for the test.
Encoding with hardware and software produces elementary stream outputs of similar size, but the number of samples (frames) differs.
Thanks.
rigaya
Rigaya,
Thank you for posting your command line and code changes. This helps to make sure we're looking at the same thing.
When I run with these changes and the command lines given, I see the same number of frames in the elementary streams. The main differences are after the elementary streams are muxed.
A quick look at Media SDK 3.0 beta 4 showed better duration matches after muxing. However, a better solution might be to add muxing to your own pipeline. Petter Larsson has written a paper on this, available at
http://software.intel.com/en-us/articles/muxing-with-intel-media-software-development-kit/
This makes it possible to supply appropriate timestamps to the container.
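For example, a 90 kHz container timestamp can be derived from the frame rate pair already set in InitMfxEncParams. A minimal sketch (a hypothetical helper, not code from the paper):
[cpp]
// 90 kHz timestamp for display frame n, from the mfxFrameInfo frame rate
// pair (29.97 fps => 30000/1001 => 3003 ticks per frame)
mfxU64 TimeStamp90kHz(mfxU32 n, mfxU32 frameRateExtN, mfxU32 frameRateExtD)
{
    // multiply before dividing so rounding error does not accumulate
    return (mfxU64)n * 90000 * frameRateExtD / frameRateExtN;
}
[/cpp]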
We will keep looking at the duration differences between the hardware and software implementations, and the doubling behavior with interlaced inputs. In the short term, verifying the video elementary stream output can be done via elementary stream compatible players. Media SDK provides two of them -- sample_dshow and sample_decode with -r in Media SDK 3. The Media SDK DirectShow filters can also be used to set up elementary stream players.
Let me know if this matches what you are seeing -- if your H.264 elementary streams actually have double the number of frames, then of course the problem is quite different. Please feel free to post them.
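If it helps in the meantime, the coded-picture count can be taken from the elementary streams directly. The standalone sketch below is hypothetical (not a Media SDK tool): it scans Annex B start codes and counts slice NAL units that begin a new picture. With field coding each field is its own coded picture, so a doubled count in the hardware stream would point at field output rather than a muxer problem.
[cpp]
// Hypothetical tool: count coded pictures in an Annex B H.264 stream.
#include <cstdio>
#include <vector>

int main(int argc, char *argv[])
{
    if (argc < 2) { std::printf("usage: %s stream.264\n", argv[0]); return 1; }
    std::FILE *f = std::fopen(argv[1], "rb");
    if (!f) { std::printf("cannot open %s\n", argv[1]); return 1; }

    std::fseek(f, 0, SEEK_END);
    long sz = std::ftell(f);
    std::fseek(f, 0, SEEK_SET);
    std::vector<unsigned char> buf(sz > 0 ? (size_t)sz : 0);
    if (!buf.empty() && std::fread(&buf[0], 1, buf.size(), f) != buf.size()) {
        std::printf("read error\n"); std::fclose(f); return 1;
    }
    std::fclose(f);

    size_t pictures = 0;
    for (size_t i = 0; i + 4 < buf.size(); i++) {
        // 00 00 01 start code (also matches the tail of a 4-byte code)
        if (buf[i] == 0x00 && buf[i + 1] == 0x00 && buf[i + 2] == 0x01) {
            int nalType = buf[i + 3] & 0x1F; // 1 = non-IDR slice, 5 = IDR slice
            // first_mb_in_slice is the first ue(v) field of the slice header;
            // a leading '1' bit encodes value 0, i.e. the first slice of a picture
            if ((nalType == 1 || nalType == 5) && (buf[i + 4] & 0x80))
                pictures++;
        }
    }
    std::printf("coded pictures: %u\n", (unsigned)pictures);
    return 0;
}
[/cpp]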
Regards,
Jeff