I noticed a problem while doing some H.264 encoding: the output bitrate fluctuates over a smaller range than I expected.
For example, for a 1280x720 output I set the rate control mode to VBR with TargetKbps 2000 and MaxKbps 8000, but the output stream's bitrate only peaks at about 4000 kbps. I believe this hurts video quality in certain demanding scenes.
I compared the Media SDK output stream with x264, both at their best quality level. In scenes with heavy motion, x264 has an obvious quality advantage over Media SDK due to its wider bitrate range.
Hello there - Thanks for the question.
The VBR rate control method tries to achieve the best quality while also keeping the file size low. As you know, the bitrate controls depend (among many other things) on the buffer size you specify for the encoder to operate with. That is one reason why x264 and our rate control methods can differ in the final output size.
When you say x264 has an obvious quality advantage, did you inspect it visually?
When you say "best quality", can you explain which presets you used for x264 and for our SDK?
Can you provide more details on the experiment you ran (system, command lines used, versions, input and output, etc.)?
In short, we need more details on your experiment to understand what you're observing, and we also need to ensure both encoders are being compared with similar parameters.
FYI - here is a good article on our rate control methods: https://software.intel.com/en-us/articles/common-bitrate-control-methods-in-intel-media-sdk
Also, please mention which application you ran with Media SDK. You can find code examples to start from here (tutorials or samples): https://software.intel.com/en-us/intel-media-server-studio-support/code-samples