Intel® Integrated Performance Primitives
Deliberate problems developing high-performance vision, signal, security, and storage applications.

Relation between encoding bitrate and speed

I have a general question about the relation between bitrate and speed of encoding operation.
When I increase the bitrate, the operation takes more time to finish and, of course, the quality is better. This is counterintuitive to me.

Isn't a higher bitrate equivalent to less compression? For example, no compression at all should take zero time.

Thanks for any information related to this, and maybe other parameters that can speed up MPEG2 encoding.

Hello Dashesy,

I don't have a definitive answer to your question, but I suspect your observation comes from a combination of the amount of data being handled and the computational effort required. With lower-quality compression, the encoder takes more "shortcuts" (less computational effort) and also produces less data on the output side, resulting in a faster rate of compression. At a higher bitrate, compression is still taking place (requiring computational effort), and there is more data to handle on the output. If there were no compression at all, it would simply be a "passthrough" operation, but one handling the maximum amount of data.

I imagine there is a peak point where the combination of computational effort and data to be managed gives the longest processing time, but that point will also vary with the memory and I/O bandwidth of the system and the computational capabilities (such as SIMD instruction sets) of the processor -- so there probably isn't a simple answer to your question...
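Just to make the idea concrete, here is a toy model (not IPP code -- the function and all constants are made up purely for illustration) of the reasoning above: total time is modeled as encoding effort, which rises and then falls off toward passthrough, plus data-handling cost, which grows with output size. Under these assumed curves, the total peaks somewhere between the two extremes.

```python
# Toy model of encoding time vs. bitrate -- all constants are hypothetical.
# effort: shortcut-heavy at low bitrates, near zero at passthrough.
# io_cost: grows with the amount of output data to move.

def encode_time(bitrate, max_bitrate=10.0):
    """Estimated time units per frame at a given bitrate (toy model)."""
    x = bitrate / max_bitrate  # normalized bitrate, 0.0 .. 1.0
    # Compression effort: little work at very low quality, little work
    # near passthrough, most work somewhere in between.
    effort = 4.0 * x * (1.0 - x)
    # Data-handling cost: always grows with the amount of output.
    io_cost = 1.0 * x
    return effort + io_cost

# Sweep the bitrate range and locate the slowest point.
samples = [(b * 0.5, encode_time(b * 0.5)) for b in range(21)]
peak_bitrate, peak_time = max(samples, key=lambda p: p[1])
```

With these made-up curves, the slowest bitrate falls strictly between zero and the maximum, matching the "peak point" intuition; real encoders would shift that peak around with memory bandwidth, SIMD capabilities, and rate-control settings.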
