
question on simple_encode_vmem_lowlat

MyMother
Beginner

hi Intel friends,

OS: Ubuntu 12.04
Software: mediasdk-tutorials-0.0.3, MediaServerStudioEssentials2015R6
Platform: i5-4570S

I have several questions ^^

Q1. I found the description below and referred to https://software.intel.com/en-us/forums/intel-media-sdk/topic/603158. Is this AsyncDepth setting best for low latency but at the cost of performance? If not, what am I missing?

    // Configuration for low latency
    mfxEncParams.AsyncDepth = 1;    // 1 is best for low latency

Q2. How do you know whether the following value is enough for buffering? Are there any limits?

     extendedCodingOptions.MaxDecFrameBuffering = 1;

Q3. Why is the following variable only 1? When would I need to change it, and how do I decide what value to use?

      mfxEncParams.NumExtParam = 1;

Q4. When the following happens, how do I know how much buffer space is sufficient?

            } else if (MFX_ERR_NOT_ENOUGH_BUFFER == sts) {
                // Allocate more bitstream buffer memory here if needed...
                break;

Q5. Why is the number always 60000? How was it chosen?

        sts = session.SyncOperation(syncp, 60000);

Thanks for taking the time to read this ~

Jeffrey_M_Intel1
Employee

Q1: The effects of AsyncDepth are easier to understand if you consider the hardware architecture. There are several HW blocks that can work simultaneously. If you are only processing one stream, the effect of AsyncDepth is larger because it is harder to keep all the components busy; the improvement from AsyncDepth > 1 shrinks as you process more streams. As mentioned in the other thread, you may want to measure latency and FPS with your own content and pipeline to optimize for your scenario. For a single stream, AsyncDepth = 4 is often ideal. With more than one concurrent stream, AsyncDepth = 2 or 1 may provide the right combination of performance and latency.
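
As a minimal sketch, the two configurations discussed above might look like this (session and codec setup omitted; the values are starting points, not hard rules):

    #include "mfxvideo++.h"

    // Sketch: latency- vs. throughput-oriented encoder configuration.
    // The rest of the setup (codec, resolution, bitrate) is omitted.
    void ConfigureAsyncDepth(mfxVideoParam& par, bool lowLatency)
    {
        // 1 = one in-flight task: minimal latency, lower HW utilization.
        // 4 = several in-flight tasks: keeps the parallel HW blocks busy,
        //     often ideal for a single stream.
        par.AsyncDepth = lowLatency ? 1 : 4;
    }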

Q2: The "low latency encoding and decoding" section of the manual describes how to set up MaxDecFrameBuffering for encode. The suggested setting is to make the frame buffering equal to the number of reference frames. Low-latency GOP structures often don't include B frames. This article has more info: https://software.intel.com/en-us/articles/video-conferencing-features-of-intel-media-software-development-kit/
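
For illustration, a sketch of attaching mfxExtCodingOption with MaxDecFrameBuffering matched to the number of reference frames (the values are examples, not requirements):

    #include "mfxvideo++.h"

    // Sketch: low-latency encode settings. MaxDecFrameBuffering matches
    // NumRefFrame and the GOP contains no B frames. The caller is assumed
    // to have zero-initialized par and co.
    void ConfigureLowLatency(mfxVideoParam& par,
                             mfxExtCodingOption& co,
                             mfxExtBuffer* extBuffers[1])
    {
        par.mfx.NumRefFrame = 1;   // one reference frame
        par.mfx.GopRefDist  = 1;   // distance 1 => no B frames

        co.Header.BufferId = MFX_EXTBUFF_CODING_OPTION;
        co.Header.BufferSz = sizeof(mfxExtCodingOption);
        co.MaxDecFrameBuffering = par.mfx.NumRefFrame;

        extBuffers[0]   = reinterpret_cast<mfxExtBuffer*>(&co);
        par.ExtParam    = extBuffers;
        par.NumExtParam = 1;       // one attached extended buffer
    }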

Q3: For backward compatibility, additional parameters are added by "attaching" extended parameter buffers rather than by changing the core parameter structs. If you are only attaching one set of additional parameters, this value can remain 1 (the sketch above shows that single-buffer case). If you attach buffers from more parameter sets, set NumExtParam to the number of extended buffers Media SDK should read when initializing.

Q4: For encode, the output buffer must have enough space to hold the largest compressed frame. If you can spare a few extra kilobytes, specifying a buffer much larger than an individual frame should keep you from ever reaching this state. As far as I know there isn't a way to query exactly how many bytes are waiting to be written to the output buffer, but memory is usually plentiful enough that a heuristic estimate is sufficient.
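
A sketch of one such heuristic; the sizing formula (uncompressed frame size as an upper bound) is an assumption, not something the tutorial prescribes:

    #include <cstring>
    #include <vector>
    #include "mfxvideo++.h"

    // Sketch: size the output bitstream buffer well above the largest
    // expected compressed frame so MFX_ERR_NOT_ENOUGH_BUFFER is never hit.
    void AllocateBitstream(mfxBitstream& bs, std::vector<mfxU8>& storage,
                           const mfxVideoParam& par)
    {
        std::memset(&bs, 0, sizeof(bs));
        // Several bytes per pixel: far larger than any compressed frame.
        bs.MaxLength = par.mfx.FrameInfo.Width * par.mfx.FrameInfo.Height * 4;
        storage.resize(bs.MaxLength);
        bs.Data = storage.data();
    }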

Q5: The last parameter of SyncOperation is the wait time in milliseconds. There is nothing special about 60000; it is just an arbitrary, very long timeout.
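
In other words, any sufficiently large value works. A sketch of using a shorter timeout and handling the not-done-yet case explicitly:

    #include "mfxvideo++.h"

    // Sketch: the second argument is an upper bound on the wait, in
    // milliseconds. A shorter timeout also works as long as
    // MFX_WRN_IN_EXECUTION (still running when the timeout expires)
    // is handled, e.g. by waiting again.
    mfxStatus WaitForTask(MFXVideoSession& session, mfxSyncPoint syncp)
    {
        mfxStatus sts;
        do {
            sts = session.SyncOperation(syncp, 1000);  // wait up to 1 s
        } while (MFX_WRN_IN_EXECUTION == sts);         // not finished: retry
        return sts;
    }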

MyMother
Beginner

hi Jeffrey,

    Many thanks for your patience. I have a few new questions below.

Q1. The following description from https://software.intel.com/en-us/articles/video-conferencing-features-of-intel-media-software-development-kit/ appears to cover single-channel decoding. Does it also apply to multi-channel decoding?

with a specific setting for decoded picture buffering (DPB), to ensure that a decoded frame gets displayed immediately after decoding:

mfxExtCodingOption::MaxDecFrameBuffering = 1

Q2. According to the following description from https://software.intel.com/en-us/articles/video-conferencing-features-of-intel-media-software-development-kit/, does it mean I need to include SPS/PPS with the IDR frame, or should I pass the IDR and discard the SPS/PPS?

mfxBitstream::DataFlag = MFX_BITSTREAM_COMPLETE_FRAME

It is also suggested that the decoder bitstream buffer be provided only one frame at a time.

Q3. Regarding the explanation from your earlier reply, quoted below, could you give me an illustration? I can't find any information about the case where mfxEncParams.NumExtParam != 1 (I have referred to https://software.intel.com/sites/default/files/mediasdk-man.pdf).

"For backward compatibility, additional parameters .... when initializing."

Thanks again for your patience.

Jeffrey_M_Intel1
Employee

Sorry for the delayed reply.

Q1: MaxDecFrameBuffering specifies the maximum number of frames in the decoded picture buffer for the encoder. Decode is independent: multiple channels can be decoded simultaneously whether or not this value is set to 1 for encode.

Q2: Media SDK handles inserting SPS/PPS for IDR frames. If your question is about decode optimization, the main idea is to make sure the bitstream buffer never ends with an incomplete frame. You can do this by parsing the bitstream for frame boundaries; for an IDR frame, the complete unit would include the SPS/PPS as well.
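
As a sketch, the decode-side advice might look like this; the caller is assumed to have already parsed out exactly one complete frame (including SPS/PPS for an IDR):

    #include "mfxvideo++.h"

    // Sketch: submit exactly one complete frame per call. frameData and
    // frameSize are assumed to cover one whole frame, found by parsing
    // the stream for frame boundaries (for an IDR, including SPS/PPS).
    void PrepareOneFrame(mfxBitstream& bs, mfxU8* frameData, mfxU32 frameSize)
    {
        bs.Data       = frameData;
        bs.DataOffset = 0;
        bs.DataLength = frameSize;
        bs.MaxLength  = frameSize;
        // The buffer ends exactly on a frame boundary, so the decoder
        // never waits for more data before decoding.
        bs.DataFlag = MFX_BITSTREAM_COMPLETE_FRAME;
    }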

Q3: The list of parameters has grown a lot. Additional parameter sets could come from mfxExtCodingOption (for example, to select CAVLC instead of CABAC), mfxExtCodingOption2 (to set LookAheadDepth), and so on. What you attach depends on the parameters you need. For examples of how to set up additional parameter buffers, the tutorials may be easier to read than the samples.
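
For completeness, a sketch of the NumExtParam != 1 case, attaching two extended buffers at once. CAVLC and LookAheadDepth are just example fields (LookAheadDepth additionally requires a lookahead rate-control mode):

    #include "mfxvideo++.h"

    // Sketch: attaching two extended parameter buffers (NumExtParam != 1).
    // The caller is assumed to have zero-initialized co and co2.
    void AttachTwoExtBuffers(mfxVideoParam& par,
                             mfxExtCodingOption& co,
                             mfxExtCodingOption2& co2,
                             mfxExtBuffer* extBuffers[2])
    {
        co.Header.BufferId = MFX_EXTBUFF_CODING_OPTION;
        co.Header.BufferSz = sizeof(mfxExtCodingOption);
        co.CAVLC = MFX_CODINGOPTION_ON;       // CAVLC instead of CABAC

        co2.Header.BufferId = MFX_EXTBUFF_CODING_OPTION2;
        co2.Header.BufferSz = sizeof(mfxExtCodingOption2);
        co2.LookAheadDepth = 40;              // frames the BRC looks ahead

        extBuffers[0] = reinterpret_cast<mfxExtBuffer*>(&co);
        extBuffers[1] = reinterpret_cast<mfxExtBuffer*>(&co2);
        par.ExtParam    = extBuffers;
        par.NumExtParam = 2;                  // two attached buffers
    }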
