Hi,
I am using the Intel IPP H.264 encoder for our application. We use it at various resolutions and bitrates.
For 320x240 at 100 kbps and 30 fps, CPU usage is around 20 to 35%,
whereas for 640x480 at 100 kbps and 30 fps, CPU usage is around 80 to 100%.
Since 320x240 takes 20 to 35%, I expected that doubling the resolution at the same fps and bitrate would also double the CPU usage, but it goes well beyond double, closer to triple.
Isn't it the case that when the resolution doubles, the CPU usage doubles? Is that assumption valid, or is the higher usage expected?
Please provide a proper explanation.
I don't think any deep investigation is required; an encoder specialist at Intel should be able to answer easily whether CPU usage doubles when the resolution doubles.
Here is a test case you can try.
file: YUV data
resolution: 320x240, fps: 10
CPU usage: 20%
whereas for 640x480:
resolution: 640x480, fps: 10
CPU usage: 80%, which is 4 times higher than at 320x240
My question is: since 320x240 took 20%, shouldn't 640x480 take 40%? If the resolution is doubled, the CPU usage should double, right? But it is taking 80%.
Perhaps it is not true that CPU usage exactly doubles when the resolution is doubled; if not, what is the valid reason?
That is what we need to know.
For your convenience we are attaching videoSender.jpg; please rename the file to videoSender.yuv after downloading, encode it with the latest Intel IPP encoder using the test cases above, and check the CPU performance.
I do not know whether you can always expect a simple linear relationship where doubling the resolution also doubles the CPU load.
By the way, just counting pixels, a 640x480 frame is 4x larger than a 320x240 frame, because both the width and the height are doubled. So if you treat CPU load as a simple linear function of pixel count, the encoder has 4x more pixels to process per frame, and roughly 4x the load is what you would expect, not 2x.
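A quick back-of-the-envelope check of that pixel-count argument (plain C, no IPP calls; the resolutions and frame rate are simply the ones from the test case above, and real encoder load will also depend on motion estimation range, bitrate control, and so on):

#include <stdio.h>

/* Sketch: compare per-second pixel throughput of the two test cases.
 * This only illustrates the pixel-count scaling, not actual encoder cost. */
int main(void)
{
    const int w1 = 320, h1 = 240;   /* first test case  */
    const int w2 = 640, h2 = 480;   /* second test case */
    const int fps = 10;             /* same fps in both runs */

    long pixels1 = (long)w1 * h1;   /*  76,800 pixels per frame */
    long pixels2 = (long)w2 * h2;   /* 307,200 pixels per frame */

    printf("320x240: %ld pixels/frame, %ld pixels/s\n", pixels1, pixels1 * fps);
    printf("640x480: %ld pixels/frame, %ld pixels/s\n", pixels2, pixels2 * fps);
    printf("ratio  : %.1fx\n", (double)pixels2 / pixels1);   /* prints 4.0x */

    return 0;
}

So 20% CPU at 320x240 scaling to roughly 80% at 640x480 is consistent with a load that grows linearly in the number of pixels, which matches your measurement.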
