I was actually playing with pngcrush and similar tools to make
web pics smaller, and thought about an equivalent program for JPEGs. So,
since progressive JPEGs are most often smaller, I tuned a few
scan files for use with jpegtran, and came to an astounding result:
optimize Collab___The_Wanderers_by_alyn.jpg ...740703 > 707011 ...
optimize Haven.jpg ...768592 > 723167 ...
optimize LocusAmoenus2.jpg ...686826 > 624442 ...
optimize Sway_by_lasoysauce.jpg ...1232456 > 1173645 ...
optimize TheShroud.jpg ...697195 > 670113 ...
optimize Waytoheavenfull2.jpg ...693137 > 635510 ...
optimize atf11x8.jpg ...699115 > 660586 ...
optimize chute2_11x8.jpg ...760355 > 716104 ...
optimize dibu7.jpg ...770964 > 730699 ...
optimize portofino.jpg ...745001 > 706159 ...
optimize vagues11x8.jpg ...829419 > 808262 ...
...
optimize DC3_03.jpg ...24653 > 12541 ...
and some really big ones:
optimize panorama33021.JPG ...6813756 > 6236247 ...
optimize panorama33036.JPG ...6338422 > 5700025 ...
optimize panorama33048.JPG ...7363277 > 6462039 ...
Those pictures are all natural (photographic) ones.
I put the scan files and the script that calculates the optimal version here:
http://paradice-insight.us/cdb/scripts/jpeg/
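The loop behind that script is simple: re-encode the JPEG once per scan file and keep whichever output is smallest. Here is a minimal Python sketch of that idea (not the actual script from the URL above; it assumes jpegtran is on the PATH, and the scan-file names are placeholder names, not the tuned ones):

```python
"""Try a few jpegtran encodings of one JPEG and keep the smallest.
The scan-file names below are hypothetical placeholders."""
import os
import subprocess
import sys

SCAN_FILES = ["scans1.txt", "scans2.txt", "scans3.txt"]  # placeholder names

def candidates(src):
    """Yield (label, jpeg_bytes) for plain Huffman optimization and for
    each progressive scan script; jpegtran writes its result to stdout."""
    yield "huffman-only", subprocess.run(
        ["jpegtran", "-optimize", src],
        check=True, capture_output=True).stdout
    for scans in SCAN_FILES:
        yield scans, subprocess.run(
            ["jpegtran", "-optimize", "-scans", scans, src],
            check=True, capture_output=True).stdout

def smallest(variants):
    """Pick the (label, data) pair with the fewest bytes."""
    return min(variants, key=lambda v: len(v[1]))

if len(sys.argv) > 1:  # usage: python optimize_jpeg.py picture.jpg
    src = sys.argv[1]
    label, data = smallest(candidates(src))
    with open(src + ".opt", "wb") as out:
        out.write(data)
    print("optimize %s ...%d > %d (%s)"
          % (src, os.path.getsize(src), len(data), label))
```

Since jpegtran is lossless, the winner is bit-for-bit the same image, just stored more compactly.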
So if JPEGs that were saved at maybe 65-95% /quality/ get 10-25%,
and sometimes up to 50%, space savings (and that with only 3
different progressive scan scripts plus Huffman optimization):
why does everybody still wonder how StuffIt and paq can compress JPEGs?
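For anyone who wants to tune their own scans: a scan script for jpegtran's -scans option is just plain text, one scan per line in the form "components: Ss-Se, Ah, Al;". As a starting point, this is libjpeg's default progressive sequence for a YCbCr image, as documented in its wizard.txt (components 0/1/2 are Y/Cb/Cr):

```
# Initial DC scan for Y,Cb,Cr (lowest bit not sent)
0,1,2: 0-0,   0, 1 ;
# First AC scan: send first 5 Y AC coefficients, minus 2 lowest bits:
0:     1-5,   0, 2 ;
# Send all Cr,Cb AC coefficients, minus lowest bit:
2:     1-63,  0, 1 ;
1:     1-63,  0, 1 ;
# Send remaining Y AC coefficients, minus 2 lowest bits:
0:     6-63,  0, 2 ;
# Send next-to-lowest bit of all Y AC coefficients:
0:     1-63,  2, 1 ;
# At this point we've sent all but the lowest bit of all coefficients.
# Send lowest bit of DC coefficients:
0,1,2: 0-0,   1, 0 ;
# Send lowest bit of AC coefficients:
2:     1-63,  1, 0 ;
1:     1-63,  1, 0 ;
0:     1-63,  1, 0 ;
```

Reordering the AC scans and changing the successive-approximation splits is exactly the kind of tuning that produced the numbers above.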
Ciao and have fun with the script
Niels