sample_encode frame by frame

Stefan_S_2
Beginner

Hi,

we are currently testing H.264 hardware encoding on Windows using the sample_encode project. The project assumes that all frames are read from a file and processed in one complete run, which works fine. However, if I feed frames one by one and call the CEncodePipeline::Run() method for each frame, the resulting bitstream contains only I- and P-frames and no B-frames, no matter how GopRefDist is configured.
After commenting out the "drain" part at the end of this method, B-frames are created, but the video does not play back smoothly.

Can anyone shed some light on how to change the CEncodePipeline::Run() method so that it can process frames one at a time?

Thank you,
Stefan.

 

Jeffrey_M_Intel1
Employee

Media SDK is designed to be run asynchronously -- this is important for performance.  By simplifying GOP structure (no B frames), using async=1, etc. you can move closer to the goal of getting 1 frame out for each surface you put in.  However, you should assume that several frames may need to go in before the first encode completes.  If you would like to write a "process next frame" operation you could look at the sink/source implementation in sample_multi_transcode.  This shows one possible way for your application to maintain a queue of frames to feed the encoder.  
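
To make the split between per-frame submission and the end-of-stream drain concrete, here is a rough sketch at the raw API level rather than a drop-in patch for sample_encode; the WriteBitstream() helper is hypothetical (it stands in for whatever writes the bitstream out and resets its DataLength), and error handling is trimmed to the essentials:

```cpp
#include <mfxvideo.h>
#include <chrono>
#include <thread>

void WriteBitstream(mfxBitstream* bs); // hypothetical: your output/mux path

// Feed exactly one surface per call. MFX_ERR_MORE_DATA is not a failure here:
// the encoder is buffering input (e.g. to build B-frames) and will emit the
// compressed frames later, so the caller simply submits the next frame.
mfxStatus SubmitOneFrame(mfxSession session, mfxFrameSurface1* surface, mfxBitstream* bs)
{
    mfxSyncPoint syncp = nullptr;
    mfxStatus sts;
    do {
        sts = MFXVideoENCODE_EncodeFrameAsync(session, nullptr, surface, bs, &syncp);
        if (sts == MFX_WRN_DEVICE_BUSY)
            std::this_thread::sleep_for(std::chrono::milliseconds(1)); // HW busy, retry
    } while (sts == MFX_WRN_DEVICE_BUSY);

    if (sts == MFX_ERR_NONE && syncp) {
        // A compressed frame is ready; wait for it, then hand it to the muxer.
        sts = MFXVideoCORE_SyncOperation(session, syncp, 60000);
        if (sts == MFX_ERR_NONE)
            WriteBitstream(bs);
    }
    return sts; // MFX_ERR_MORE_DATA simply means "feed more frames"
}

// Run this only once, at end of stream: passing a NULL surface flushes the
// frames the encoder buffered for reordering. Calling it after every frame is
// what forces the I/P-only pattern described in the question.
void DrainAtEndOfStream(mfxSession session, mfxBitstream* bs)
{
    for (;;) {
        mfxSyncPoint syncp = nullptr;
        mfxStatus sts = MFXVideoENCODE_EncodeFrameAsync(session, nullptr, nullptr, bs, &syncp);
        if (sts == MFX_WRN_DEVICE_BUSY) {
            std::this_thread::sleep_for(std::chrono::milliseconds(1));
            continue;
        }
        if (sts == MFX_ERR_MORE_DATA)
            break;          // nothing left to flush
        if (sts != MFX_ERR_NONE)
            break;          // real error: report it in production code
        MFXVideoCORE_SyncOperation(session, syncp, 60000);
        WriteBitstream(bs);
    }
}
```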

To work with this queue you will probably need to extend the Media SDK locking mechanism.  The Locked field in mfxFrameData is intended for use only by Media SDK, not by your application.  However, especially if your application is multi-threaded, you will want to indicate if a surface is locked from your application's perspective too. 
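
As a minimal sketch of that application-side lock, assuming the surfaces sit in a simple pool; the AppSurface wrapper and FindFreeSurface() are illustrative names, not part of the Media SDK API:

```cpp
#include <mfxvideo.h>
#include <atomic>
#include <vector>

struct AppSurface {
    mfxFrameSurface1  surface{};          // Data.Locked is owned by Media SDK only
    std::atomic<bool> inUseByApp{false};  // set while your own code still needs the surface
};

// A surface may be reused only when neither the SDK (Data.Locked) nor the
// application (inUseByApp) is still holding on to it.
AppSurface* FindFreeSurface(std::vector<AppSurface>& pool)
{
    for (auto& s : pool) {
        if (s.surface.Data.Locked == 0 && !s.inUseByApp.load())
            return &s;
    }
    return nullptr; // nothing free yet: an earlier encode must complete first
}
```

The feeding queue would set inUseByApp while it fills or holds a surface and clear it after submission, so a slot is only recycled once both that flag and Data.Locked are clear.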

Stefan_S_2
Beginner

Thanks Jeffrey, I've now figured out what the problem was.
Even though mfxVideoParam.mfx.EncodedOrder is set to 0 (display order), the encoder emits frames in transmission (encoded) order, e.g. IPBBPBB. So I had to adjust the output timestamps accordingly in my DirectShow filter so that the bitstream is muxed properly.
Additionally, I implemented a ring buffer of three frames, because the SDK manual states: "The application should not alter the frame until the encoder unlocks the frame."
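
For reference, one way to let the timestamps travel through the reorder instead of recomputing them in the filter is to stamp them on the input surfaces and read them back from the output bitstream; this sketch assumes API 1.6 or newer (for mfxBitstream::DecodeTimeStamp), the conventional 90 kHz timestamp units, and illustrative frameIndex/fps variables:

```cpp
#include <mfxvideo.h>

// Stamp the presentation time on the input surface before EncodeFrameAsync.
void StampInput(mfxFrameSurface1* in, mfxU64 frameIndex, double fps)
{
    in->Data.TimeStamp = static_cast<mfxU64>(frameIndex * 90000.0 / fps); // 90 kHz units
}

// After SyncOperation, the bitstream carries the timestamps of the frame that
// actually came out, which is in encoded (IPBB...) order rather than display order.
void OnEncodedFrame(const mfxBitstream* bs)
{
    mfxU64 pts = bs->TimeStamp;        // presentation timestamp of this access unit
    mfxI64 dts = bs->DecodeTimeStamp;  // decoding timestamp (API 1.6 and later)
    // Hand pts/dts to the DirectShow sample / muxer.
    (void)pts;
    (void)dts;
}
```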

I would appreciate it if these input/output behaviour aspects were addressed a little more clearly in the SDK manual.

Anyway, thanks for your support,
Stefan.