New idea about JPEG encode and decode

To reduce CPU usage, can I use the CPU and GPU together to perform JPEG encoding and decoding? Is this idea feasible? Can the time spent on JPEG encoding and decoding be reduced?
1 Solution
Vladimir_Dudnik
Employee
When you consider a heterogeneous model (where part of your algorithm runs on the CPU and the other part runs on the GPU), you need to take into account the overhead of exchanging data between those two parts. In the GPU case you have to send data through the PCIe bus, which may cost too much in overall application performance.
For an algorithm as simple as JPEG this may give no benefit, while for other, more computationally intensive applications it may.
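To put rough numbers on that PCIe cost, here is a back-of-envelope sketch; the 4 GB/s sustained bandwidth is an assumed, illustrative figure, not a measurement of any particular system.

```python
# Back-of-envelope model of the CPU <-> GPU transfer overhead.
# The 4 GB/s sustained PCIe bandwidth is an assumed figure.

def transfer_time_ms(num_bytes, bandwidth_gb_s=4.0):
    """One-way time to move a buffer over PCIe at an assumed bandwidth."""
    return num_bytes / (bandwidth_gb_s * 1e9) * 1e3

# Decoded pixels of a 1920x1080 RGB image are ~6 MB; a pipeline that
# round-trips them between CPU and GPU pays this cost in each direction.
frame_bytes = 1920 * 1080 * 3
print(f"one-way transfer: {transfer_time_ms(frame_bytes):.2f} ms")
```

A couple of milliseconds per direction is easily the same order as a fast JPEG decode itself, which is why the transfer can erase any GPU speedup.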

Regards,
Vladimir


6 Replies

Do you mean this method can't save time for JPEG encode or decode? I built a sample to do this, but it costs too much time to send data between the GPU and CPU. I don't know whether to continue this research or give up on the idea.
Vladimir_Dudnik
Employee
Well, it's up to you to decide whether to continue with this or not. My guess is that the only split which might make sense with existing GPUs (which can't implement the whole JPEG algorithm) is something like doing Huffman decoding on the CPU and sending the decoded coefficients to the GPU, where you may try to do the IDCT, color conversion, and display.

Vladimir
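The GPU-side stage of that split (rebuilding pixels from dequantized DCT coefficients) can be sketched with a minimal 8x8 IDCT. This is written with NumPy on the CPU for clarity; on a GPU each 8x8 block would map to one work group.

```python
import numpy as np

def dct_matrix(n=8):
    """Orthonormal DCT-II basis matrix (rows are basis vectors)."""
    m = np.zeros((n, n))
    for k in range(n):
        scale = np.sqrt(1.0 / n) if k == 0 else np.sqrt(2.0 / n)
        for x in range(n):
            m[k, x] = scale * np.cos((2 * x + 1) * k * np.pi / (2 * n))
    return m

D = dct_matrix()

def dct2(block):
    """Forward 2-D DCT of one 8x8 pixel block."""
    return D @ block @ D.T

def idct2(coeffs):
    """Inverse 2-D DCT of one 8x8 coefficient block."""
    return D.T @ coeffs @ D

# Round trip: a block of pixels survives level shift, DCT, and IDCT.
pixels = np.arange(64, dtype=np.float64).reshape(8, 8)
restored = idct2(dct2(pixels - 128.0)) + 128.0
print(np.allclose(restored, pixels))  # True
```

Because the DCT matrix is orthogonal, the inverse is just the transposed multiply, which maps naturally onto small per-block GPU matrix products.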

Yes, you are right, but that is only part of my work. I also need to encode JPEG, which means doing color conversion, DCT, and quantization on the GPU and then sending the data back to the CPU for Huffman encoding. That requires exchanging data twice, which costs too much time.

Also, I can't make the GPU and CPU work concurrently, so it can't save time.
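The quantization stage mentioned here (sitting between the GPU-side DCT and the CPU-side Huffman encode) can be sketched as follows; the table is the example luminance quantization table from Annex K of the JPEG standard (ITU-T T.81).

```python
import numpy as np

# Example luminance quantization table from Annex K of ITU-T T.81.
LUMA_QT = np.array([
    [16, 11, 10, 16,  24,  40,  51,  61],
    [12, 12, 14, 19,  26,  58,  60,  55],
    [14, 13, 16, 24,  40,  57,  69,  56],
    [14, 17, 22, 29,  51,  87,  80,  62],
    [18, 22, 37, 56,  68, 109, 103,  77],
    [24, 35, 55, 64,  81, 104, 113,  92],
    [49, 64, 78, 87, 103, 121, 120, 101],
    [72, 92, 95, 98, 112, 100, 103,  99],
], dtype=np.float64)

def quantize(coeffs, table=LUMA_QT):
    """Divide DCT coefficients by the table and round; this is the lossy step."""
    return np.rint(coeffs / table).astype(np.int32)

def dequantize(levels, table=LUMA_QT):
    """Multiply quantized levels back up before the IDCT on the decode side."""
    return levels * table

# High-frequency coefficients smaller than half their table entry vanish,
# which is what makes the subsequent Huffman encode effective.
coeffs = np.full((8, 8), 20.0)
levels = quantize(coeffs)
print(levels[0, 0], levels[7, 7])  # 1 0
```

The quantized levels (mostly zeros) are what would be copied back over PCIe to the CPU for Huffman encoding, so the transfer is smaller than raw pixels, though still a second round trip.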
Vladimir_Dudnik
Employee
I think that with the future Intel Larrabee architecture you will be able to implement Huffman encoding natively on the GPU. In that case you could put the whole JPEG algorithm on the GPU. Stay tuned.

Regards,
Vladimir

Thanks a lot. If Huffman encoding can be implemented on the GPU, the time spent on JPEG encoding and decoding can be reduced.