I found that JPEG lossless loading is about 200% slower on a DELL machine of the same brand but with a different CPU. The test was done using the IPP sample program jpegview.exe. From the About menu, I can see that the dispatched DLL is different between the two machines: one uses the v8 version, the other the t7 version. Both machines have Xeon CPUs, but the CPUs are detected as different types.
Here is the test data. The slow machine actually has more cores than the fast one. How can the JPEG performance be so different?
Fast machine: DELL 490, 2 CPUs, 4 cores
Intel Xeon CPU 5140 @ 2.33 GHz
EM64T Family 6 Model 15 Stepping 6, GenuineIntel
752x753x3 8-bit JPEG lossless load timing: 18561.96 us

Slow machine: DELL 490, 2 CPUs, 8 cores
Intel Xeon CPU 3.20 GHz
EM64T Family 15 Model 6 Stepping 4, GenuineIntel
752x753x3 8-bit JPEG lossless load timing: 48474.34 us
This means we do have highly optimized, hand-tuned code in the v8 library but not in the t7 library; in t7, only compiler optimization is used for some of the functions involved in lossless JPEG.
Thanks for the information.
I found another strange thing about lossless compression. On the fast machine, I can run 4 threads at a time to speed up compression about 4 times. However, on the slow machine, even though it has 8 cores, when I use 4 threads the speedup is only about 2 times.
I cut the image into 4 pieces and run 4 encoders on 4 threads. My program is a medical image server, and I need lossless compression to reduce network traffic.
From the Windows performance monitor, I do see four CPUs with high usage, but the compression speed does not scale up 4 times, only about 2 times with the t7 library.
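The 4-piece / 4-encoder scheme above can be sketched as follows. This is a minimal illustration, not the actual server code: `compressTile` is a hypothetical stand-in (a trivial run-length pass, not the IPP/UIC encoder call), used only so the sketch is self-contained.

```cpp
#include <cstdint>
#include <thread>
#include <vector>

// Hypothetical stand-in for the real IPP/UIC lossless JPEG encode call;
// a trivial run-length pass so the sketch compiles on its own.
static std::vector<uint8_t> compressTile(const uint8_t* data, size_t size) {
    std::vector<uint8_t> out;
    for (size_t i = 0; i < size;) {
        size_t run = 1;
        while (i + run < size && data[i + run] == data[i] && run < 255) ++run;
        out.push_back(static_cast<uint8_t>(run));  // run length
        out.push_back(data[i]);                    // repeated byte
        i += run;
    }
    return out;
}

// Cut the image into numThreads horizontal strips and encode each strip
// on its own thread, mirroring the 4-piece / 4-encoder scheme.
std::vector<std::vector<uint8_t>> compressParallel(const uint8_t* image,
                                                   int width, int height,
                                                   int channels, int numThreads) {
    std::vector<std::vector<uint8_t>> results(numThreads);
    std::vector<std::thread> workers;
    const int rowsPerStrip = height / numThreads;
    for (int t = 0; t < numThreads; ++t) {
        const int y0 = t * rowsPerStrip;
        const int rows = (t == numThreads - 1) ? height - y0 : rowsPerStrip;
        const uint8_t* strip =
            image + static_cast<size_t>(y0) * width * channels;
        workers.emplace_back([&results, t, strip, rows, width, channels] {
            results[t] =
                compressTile(strip, static_cast<size_t>(rows) * width * channels);
        });
    }
    for (auto& w : workers) w.join();  // wait for all strips before sending
    return results;
}
```

Note that even with this structure, the observed speedup depends on whether the library's hot loops are hand-optimized for the dispatched CPU target, which may explain the t7 scaling.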
I have been busy with JPEG 2000 and have learned that JpegView is now legacy (outdated) and that I should use UIC.
I'm now busy with JPEG 2000 using the UIC sample code, and I must say it is much better: bugs have been fixed and performance is improved. I use a custom DLL linked with the T libraries for multithreading.
How did you conclude that UIC JPEG does not support the lossy 12-bit compression mode?
You can check this by opening any 12-bit image in the UIC picnic application; the option to save in JPEG Ext mode then becomes available in the Save As dialog.
So, to simplify things, we do not provide 16-bit to 12-bit conversion in the application. That means you can save only 12-bit images in the 12-bit JPEG Extended Baseline lossy mode.
I finally got 12-bit JPEG lossy compression to work. The trick is to set param.huffman_opt = 1 and use JPEG_EXTENDED mode. Why do we have to set param.huffman_opt = 1? In baseline mode, however, I have to set param.huffman_opt = 0 to make 8-bit lossy compression work.
For JPEG baseline mode you can use either huffman_opt = 0 or huffman_opt = 1. This parameter instructs the encoder to use the 'default' JPEG Huffman tables (if set to 0) or to generate Huffman tables from the entropy statistics of the particular image (if set to 1), which results in a slightly better compression ratio but costs performance because of the additional steps the encoder has to do.
Because the 'default' Huffman tables assume 8-bit input data, they can't be used in JPEG Extended Baseline mode with 12-bit data. Thus, the encoder always has to generate Huffman tables in this case.
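The rule above can be captured in a small helper. This is a hypothetical function (not part of the IPP/UIC API) that simply encodes the constraint: baseline 8-bit may use either setting, while 12-bit JPEG Extended must always generate optimized tables.

```cpp
#include <stdexcept>

// Hypothetical helper encoding the huffman_opt rule described above;
// the mode names are illustrative, not actual IPP/UIC identifiers.
enum class JpegMode { Baseline8Bit, Extended12Bit };

int requiredHuffmanOpt(JpegMode mode, bool preferOptimized) {
    switch (mode) {
    case JpegMode::Baseline8Bit:
        // Default tables assume 8-bit data, so either setting is valid;
        // huffman_opt = 1 trades encode time for a slightly better ratio.
        return preferOptimized ? 1 : 0;
    case JpegMode::Extended12Bit:
        // No default tables exist for 12-bit samples, so the encoder
        // must generate them: huffman_opt = 1 is mandatory.
        return 1;
    }
    throw std::invalid_argument("unknown JPEG mode");
}
```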