Intel® Integrated Performance Primitives
Deliberate problems developing high-performance vision, signal, security, and storage applications.

New idea about JPEG encode and decode

qianqianzhutianfang
To reduce CPU usage, can I use the CPU and GPU together to perform JPEG encode and decode? Is this idea feasible? Can the time spent on JPEG encoding and decoding be reduced?
Vladimir_Dudnik
Employee
When you consider a heterogeneous model (where part of your algorithm runs on the CPU and the other part runs on the GPU), you need to take into account the overhead of exchanging data between those two parts. With a GPU you have to send the data across the PCIe bus, which may cost too much in overall application performance.
For an algorithm as simple as JPEG it may give no benefit, while for other, more computationally intensive applications it may pay off.

Regards,
Vladimir
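To put a rough number on that PCIe overhead, one option is to time the host-to-device and device-to-host copies for a frame-sized buffer before committing to the design. Below is a minimal timing sketch, assuming CUDA; the 1920x1080 4:2:0 buffer size is only an illustration, not a measurement from this thread:

#include <cstdio>
#include <cuda_runtime.h>

int main() {
    // Illustrative size: one 1920x1080 frame of 4:2:0 samples (~3.1 MB).
    const size_t bytes = 1920 * 1080 * 3 / 2;

    unsigned char *h_buf = NULL, *d_buf = NULL;
    cudaMallocHost((void**)&h_buf, bytes);   // pinned host memory for fast DMA
    cudaMalloc((void**)&d_buf, bytes);

    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);
    float ms = 0.0f;

    // Time the host -> device copy (the data fed to the GPU stage).
    cudaEventRecord(start);
    cudaMemcpy(d_buf, h_buf, bytes, cudaMemcpyHostToDevice);
    cudaEventRecord(stop);
    cudaEventSynchronize(stop);
    cudaEventElapsedTime(&ms, start, stop);
    printf("host->device: %.3f ms for %zu bytes\n", ms, bytes);

    // Time the device -> host copy (results coming back for entropy coding).
    cudaEventRecord(start);
    cudaMemcpy(h_buf, d_buf, bytes, cudaMemcpyDeviceToHost);
    cudaEventRecord(stop);
    cudaEventSynchronize(stop);
    cudaEventElapsedTime(&ms, start, stop);
    printf("device->host: %.3f ms for %zu bytes\n", ms, bytes);

    cudaEventDestroy(start);
    cudaEventDestroy(stop);
    cudaFree(d_buf);
    cudaFreeHost(h_buf);
    return 0;
}

If the round trip alone is comparable to the time a tuned CPU-only JPEG codec needs for the same frame, offloading part of the pipeline is unlikely to pay off.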
qianqianzhutianfang
Do you mean this method can't save time for JPEG encode or decode? I have written a sample that does this, but it spends too much time sending data between the GPU and CPU. I don't know whether to continue this research or give up on the idea.
Vladimir_Dudnik
Employee
Well, it is up to you to decide whether to continue with this or not. My guess is that the only split that might make sense with an existing GPU (which can't implement the whole JPEG algorithm) is something like doing the Huffman decoding on the CPU and sending the decoded coefficients to the GPU, where you may try to do the IDCT, color conversion, and display.

Vladimir
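As an illustration of that split, the GPU stage could take the coefficients produced by a CPU-side Huffman decoder and run the inverse DCT one 8x8 block at a time. The sketch below shows only the GPU kernel, assuming the coefficients are already dequantized and stored as 64 floats per block; the layout and names are assumptions for illustration, not part of any library API:

#include <cuda_runtime.h>
#include <math.h>

// Naive 8x8 inverse DCT: one CUDA thread block per JPEG block, one thread per pixel.
// `coeffs` holds dequantized coefficients, 64 floats per 8x8 block; `pixels`
// receives the reconstructed samples, level-shifted back to the 0..255 range.
__global__ void idct8x8(const float* coeffs, unsigned char* pixels, int numBlocks)
{
    int b = blockIdx.x;
    if (b >= numBlocks) return;
    int x = threadIdx.x;                  // 0..7, horizontal pixel position
    int y = threadIdx.y;                  // 0..7, vertical pixel position
    const float* F = coeffs + b * 64;

    float sum = 0.0f;
    for (int v = 0; v < 8; ++v) {
        for (int u = 0; u < 8; ++u) {
            float cu = (u == 0) ? 0.70710678f : 1.0f;   // 1/sqrt(2) for DC terms
            float cv = (v == 0) ? 0.70710678f : 1.0f;
            sum += cu * cv * F[v * 8 + u]
                 * cosf((2 * x + 1) * u * 3.14159265f / 16.0f)
                 * cosf((2 * y + 1) * v * 3.14159265f / 16.0f);
        }
    }
    float val = 0.25f * sum + 128.0f;            // undo the JPEG level shift
    val = fminf(fmaxf(val, 0.0f), 255.0f);       // clamp to 8-bit range
    pixels[b * 64 + y * 8 + x] = (unsigned char)(val + 0.5f);
}

// Host side: the CPU performs the Huffman decode, uploads the coefficient
// blocks, and launches one 8x8 thread block per JPEG block, e.g.
//   idct8x8<<<numBlocks, dim3(8, 8)>>>(d_coeffs, d_pixels, numBlocks);
// Color conversion could then run in a second kernel before the copy back
// (or direct display).

Even with this split the coefficient planes still have to cross the bus, so the transfer-cost caveat above still applies.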
qianqianzhutianfang
Yes, you are right, but that is only part of my work. I also need to encode JPEG, which means doing the color conversion, DCT, and quantization on the GPU and then sending the data back to the CPU for Huffman encoding. That requires exchanging data twice, which costs too much time.

Also, I can't make the GPU and CPU work concurrently, so it doesn't save any time.
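One way to get the CPU and GPU working at the same time is to pipeline the work in chunks: while the GPU runs color conversion, DCT, and quantization on chunk c, the CPU Huffman-encodes chunk c-1. Below is a minimal sketch of that overlap using CUDA streams and asynchronous copies; encodeStage and huffmanEncode are hypothetical placeholders, and the host buffers are assumed to be pinned (allocated with cudaMallocHost) so the asynchronous copies can actually overlap:

#include <cuda_runtime.h>

// Hypothetical GPU stage: color conversion + DCT + quantization of one chunk.
__global__ void encodeStage(const unsigned char* in, short* coeffs, int n)
{
    // ... kernel body omitted in this sketch ...
}

// Hypothetical CPU stage: Huffman / entropy coding of one chunk of coefficients.
void huffmanEncode(const short* coeffs, int n)
{
    // ... CPU implementation omitted in this sketch ...
}

// Pipeline the chunks so the CPU encodes chunk c-1 while the GPU works on chunk c.
// h_in and h_coeffs are assumed to be pinned host buffers (cudaMallocHost).
void encodePipelined(const unsigned char* h_in, short* h_coeffs,
                     int numChunks, int chunkPixels)
{
    cudaStream_t stream;
    cudaStreamCreate(&stream);

    unsigned char* d_in = NULL;
    short* d_coeffs = NULL;
    cudaMalloc((void**)&d_in, (size_t)chunkPixels);
    cudaMalloc((void**)&d_coeffs, (size_t)chunkPixels * sizeof(short));

    for (int c = 0; c < numChunks; ++c) {
        // Queue the GPU work for chunk c; these calls return immediately.
        cudaMemcpyAsync(d_in, h_in + (size_t)c * chunkPixels, chunkPixels,
                        cudaMemcpyHostToDevice, stream);
        encodeStage<<<(chunkPixels + 255) / 256, 256, 0, stream>>>(d_in, d_coeffs, chunkPixels);
        cudaMemcpyAsync(h_coeffs + (size_t)c * chunkPixels, d_coeffs,
                        (size_t)chunkPixels * sizeof(short),
                        cudaMemcpyDeviceToHost, stream);

        // While the GPU runs, Huffman-encode the previous chunk on the CPU.
        if (c > 0)
            huffmanEncode(h_coeffs + (size_t)(c - 1) * chunkPixels, chunkPixels);

        cudaStreamSynchronize(stream);   // chunk c's coefficients are now on the host
    }
    // Entropy-code the last chunk after the loop.
    huffmanEncode(h_coeffs + (size_t)(numChunks - 1) * chunkPixels, chunkPixels);

    cudaFree(d_in);
    cudaFree(d_coeffs);
    cudaStreamDestroy(stream);
}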
Vladimir_Dudnik
Employee
I think that with the future Intel Larrabee architecture you will be able to implement Huffman encoding natively on the GPU. In that case you may put the whole JPEG algorithm on the GPU. Stay tuned.

Regards,
Vladimir
qianqianzhutianfang
Thanks a lot. If Huffman encoding can be implemented on the GPU, the time spent on JPEG encoding and decoding can indeed be reduced.