To reduce CPU usage, can I use the CPU and GPU together for JPEG encoding and decoding? Is this idea feasible? Can the time spent on JPEG encode and decode be reduced?
1 Solution
When you consider a heterogeneous model (where part of your algorithm runs on the CPU and another part runs on the GPU), you need to take into account the overhead of exchanging data between those two parts. In the GPU case you have to send the data across the PCIe bus, which may cost too much in terms of overall application performance.
For a relatively simple algorithm like JPEG this may give no benefit, while for other, more computationally intensive applications it may.
Regards,
Vladimir
6 Replies
When you consider a heterogeneous model (where part of your algorithm runs on the CPU and another part runs on the GPU), you need to take into account the overhead of exchanging data between those two parts. In the GPU case you have to send the data across the PCIe bus, which may cost too much in terms of overall application performance.
For a relatively simple algorithm like JPEG this may give no benefit, while for other, more computationally intensive applications it may.
Regards,
Vladimir
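As a rough way to check whether the PCIe transfer dominates, here is a minimal sketch (assuming a CUDA-capable GPU and the CUDA runtime API, neither of which is named in this thread) that times the upload of one RGB frame against a trivial kernel using CUDA events:

```cuda
// Minimal sketch: compare PCIe upload time for one RGB frame against a
// trivial GPU kernel, using CUDA events. If the copy dwarfs the kernel,
// offloading that stage is unlikely to pay off.
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

__global__ void touch(unsigned char *p, size_t n) {
    size_t i = (size_t)blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) p[i] = (unsigned char)(p[i] + 1);   // stand-in for real per-pixel work
}

int main() {
    const size_t n = 1920 * 1080 * 3;              // one RGB frame
    unsigned char *h = (unsigned char *)malloc(n);
    unsigned char *d = NULL;
    cudaMalloc((void **)&d, n);

    cudaEvent_t t0, t1, t2;
    cudaEventCreate(&t0); cudaEventCreate(&t1); cudaEventCreate(&t2);

    cudaEventRecord(t0);
    cudaMemcpy(d, h, n, cudaMemcpyHostToDevice);   // the transfer across PCIe
    cudaEventRecord(t1);
    touch<<<(unsigned)((n + 255) / 256), 256>>>(d, n);  // the "compute" on the GPU
    cudaEventRecord(t2);
    cudaEventSynchronize(t2);

    float copyMs = 0.0f, kernMs = 0.0f;
    cudaEventElapsedTime(&copyMs, t0, t1);
    cudaEventElapsedTime(&kernMs, t1, t2);
    printf("upload: %.3f ms, kernel: %.3f ms\n", copyMs, kernMs);

    cudaFree(d);
    free(h);
    return 0;
}
```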
Quoting - Vladimir Dudnik (Intel)
When you consider a heterogeneous model (where part of your algorithm runs on the CPU and another part runs on the GPU), you need to take into account the overhead of exchanging data between those two parts. In the GPU case you have to send the data across the PCIe bus, which may cost too much in terms of overall application performance.
For a relatively simple algorithm like JPEG this may give no benefit, while for other, more computationally intensive applications it may.
Regards,
Vladimir
Do you mean this method can't save time for JPEG encode or decode? I have made a sample that does this, but it costs too much time to send data between the GPU and CPU. I don't know whether to continue this research or give up on the idea.
Well, it is up to you to decide whether to continue with this or not. I would guess the only approach that might make sense with an existing GPU (which can't implement the whole JPEG algorithm) is something like doing the Huffman decoding on the CPU and sending the decoded coefficients to the GPU, where you may try to do the IDCT, color conversion and display.
Vladimir
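To make the suggested split concrete, here is a rough outline, again assuming CUDA (an assumption, since no GPU API is named in the thread). The Huffman decoder is a do-nothing placeholder and the kernel only dequantizes and level-shifts; a real decoder and a real 8x8 IDCT plus color conversion would go where the comments indicate:

```cuda
// Sketch of the suggested split: entropy (Huffman) decoding on the CPU,
// then ship the quantized DCT coefficients to the GPU for the pixel work.
#include <cstdio>
#include <cuda_runtime.h>

// Placeholder for the CPU-side Huffman decoder: a real one would fill one
// 8x8 block of quantized DCT coefficients per MCU from the bitstream.
// Here it just zeroes the buffer so the sketch compiles and runs.
static void huffman_decode_blocks(short *coeffs, int nblocks) {
    for (int i = 0; i < nblocks * 64; ++i) coeffs[i] = 0;
}

// One CUDA block per 8x8 coefficient block, one thread per coefficient.
// Only dequantization and the level shift are shown; a real kernel would
// also perform the 8x8 IDCT and the color conversion.
__global__ void dequant_and_shift(const short *coeffs, const float *qtable,
                                  unsigned char *pixels, int nblocks) {
    int b = blockIdx.x, i = threadIdx.x;
    if (b >= nblocks) return;
    float v = coeffs[b * 64 + i] * qtable[i] + 128.0f;
    pixels[b * 64 + i] = (unsigned char)fminf(fmaxf(v, 0.0f), 255.0f);
}

int main() {
    const int nblocks = 1024;                 // e.g. a strip of MCUs
    short *h_coeffs = new short[nblocks * 64];
    float h_qtable[64];
    for (int i = 0; i < 64; ++i) h_qtable[i] = 16.0f;   // dummy quant table

    huffman_decode_blocks(h_coeffs, nblocks); // CPU part of the pipeline

    short *d_coeffs; float *d_qtable; unsigned char *d_pixels;
    cudaMalloc((void **)&d_coeffs, nblocks * 64 * sizeof(short));
    cudaMalloc((void **)&d_qtable, 64 * sizeof(float));
    cudaMalloc((void **)&d_pixels, nblocks * 64);

    // The single host-to-device transfer of the decoded coefficients.
    cudaMemcpy(d_coeffs, h_coeffs, nblocks * 64 * sizeof(short), cudaMemcpyHostToDevice);
    cudaMemcpy(d_qtable, h_qtable, 64 * sizeof(float), cudaMemcpyHostToDevice);

    dequant_and_shift<<<nblocks, 64>>>(d_coeffs, d_qtable, d_pixels, nblocks);
    cudaDeviceSynchronize();                  // pixels can stay on the GPU for display

    printf("last CUDA status: %s\n", cudaGetErrorString(cudaGetLastError()));
    cudaFree(d_coeffs); cudaFree(d_qtable); cudaFree(d_pixels);
    delete[] h_coeffs;
    return 0;
}
```

In this scheme the coefficients cross the PCIe bus only once, and the reconstructed pixels can be handed to the display path without coming back to the host.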
Quoting - Vladimir Dudnik (Intel)
Well, it is up to you to decide whether to continue with this or not. I would guess the only approach that might make sense with an existing GPU (which can't implement the whole JPEG algorithm) is something like doing the Huffman decoding on the CPU and sending the decoded coefficients to the GPU, where you may try to do the IDCT, color conversion and display.
Vladimir
Yes, you are right, but that is only part of my work. I also need to encode JPEG, which means doing the color conversion, DCT and quantization on the GPU and then sending the resulting data back to the CPU for Huffman encoding. That requires exchanging data twice, which costs too much time.
Also, I can't make the GPU and CPU work concurrently, so ... it can't save time.
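For comparison, the encode direction described above looks roughly like this (same CUDA assumption as before, with the forward-DCT/quantization kernel and the Huffman encoder left as placeholders). The comments mark the two transfers that were found to dominate:

```cuda
// Sketch of the encode split described above: color conversion, forward DCT
// and quantization on the GPU, then the coefficients go back to the CPU for
// Huffman encoding. Both PCIe transfers are marked.
#include <cuda_runtime.h>

// Placeholder kernel: a real one would do RGB->YCbCr, the 8x8 forward DCT
// and quantization. Here each thread just copies one sample as a stand-in.
__global__ void fdct_quantize(const unsigned char *pixels, short *coeffs, int nsamples) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < nsamples) coeffs[i] = (short)pixels[i];
}

// Placeholder for the CPU-side Huffman encoder.
static void huffman_encode(const short * /*coeffs*/, int /*nsamples*/) {}

void encode_frame(const unsigned char *h_pixels, int nsamples) {
    unsigned char *d_pixels; short *d_coeffs;
    cudaMalloc((void **)&d_pixels, nsamples);
    cudaMalloc((void **)&d_coeffs, nsamples * sizeof(short));

    // Transfer #1: raw pixels, host -> device.
    cudaMemcpy(d_pixels, h_pixels, nsamples, cudaMemcpyHostToDevice);

    fdct_quantize<<<(nsamples + 255) / 256, 256>>>(d_pixels, d_coeffs, nsamples);

    // Transfer #2: quantized coefficients, device -> host.
    short *h_coeffs = new short[nsamples];
    cudaMemcpy(h_coeffs, d_coeffs, nsamples * sizeof(short), cudaMemcpyDeviceToHost);

    huffman_encode(h_coeffs, nsamples);       // entropy coding stays on the CPU

    delete[] h_coeffs;
    cudaFree(d_pixels); cudaFree(d_coeffs);
}

int main() {
    const int nsamples = 1920 * 1080 * 3;     // one RGB frame
    unsigned char *h_pixels = new unsigned char[nsamples]();
    encode_frame(h_pixels, nsamples);
    delete[] h_pixels;
    return 0;
}
```

Timing the two cudaMemcpy calls against the kernel, as in the earlier event-based sketch, is the quickest way to see whether the transfers rather than the compute are what makes this split unprofitable.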
I think that with the future Intel Larrabee architecture you will be able to implement Huffman encoding natively on the GPU. In that case you may put the whole JPEG algorithm on the GPU. Stay tuned.
Regards,
Vladimir
Quoting - Vladimir Dudnik (Intel)
I think that with the future Intel Larrabee architecture you will be able to implement Huffman encoding natively on the GPU. In that case you may put the whole JPEG algorithm on the GPU. Stay tuned.
Regards,
Vladimir
Thanks a lot. If Huffman encoding can be implemented on the GPU, the time spent on JPEG encode and decode can be reduced.