|Studying Lossy Image Compression Efficiency, July 2014||Josh Aas||7/15/14 7:34 AM|
This is the discussion thread for Mozilla's July 2014 Lossy Compressed Image Formats Study and the Mozilla Research blog post entitled "Mozilla Advances JPEG Encoding with mozjpeg 2.0".
|Re: Studying Lossy Image Compression Efficiency, July 2014||Josh Aas||7/15/14 8:25 AM|
|Re: Studying Lossy Image Compression Efficiency, July 2014||lange....@gmail.com||7/15/14 9:46 AM|
Thank you, and everyone involved, for your efforts to make the web faster.
Are there any plans to integrate into other tools, specifically imagemagick?
Or would you leave that up to others?
With all the options available for image processing, one can end up building quite a complex chain of tools and commands to produce the best output.
While you state that you now also accept JPEG as input for re-compression, that usually involves a loss of quality in the process.
Does mozjpeg have a preferred input format (for best quality/performance)?
|Re: Studying Lossy Image Compression Efficiency, July 2014||stone...@gmail.com||7/15/14 12:38 PM|
On Tuesday, July 15, 2014 7:34:35 AM UTC-7, Josh Aas wrote:
Would be nice if you guys just implemented JPEG2000. It's 2014.
Not only would you get a lot more than a 5% encoding boost, but you'd get much higher quality images to boot.
"But nobody supports JPEG2000 and we want to target something everyone can see!"
If you had implemented it years ago, everyone would support it today. If you don't implement it now, we'll spend another 15 years tuning a 25-year-old image algorithm while better things are available.
Similarly, there's a reason that people are still hacking video into JPEGs and using animated GIFs.
|Re: Studying Lossy Image Compression Efficiency, July 2014||perez....@gmail.com||7/15/14 1:15 PM|
On Tuesday, July 15, 2014 10:34:35 AM UTC-4, Josh Aas wrote:
#1 Would it be possible to apply the same algorithm used for WebP to JPEG?
#2 There are some JPEG services that perceptually change the image without any noticeable artifacts. Have you tried something like that?
|Re: Studying Lossy Image Compression Efficiency, July 2014||Chris Peterson||7/15/14 1:43 PM|
Do Chrome and IE support JPEG2000? I can't find a clear answer online.
The WONTFIX'd Firefox bug says IE and WebKit/Blink browsers support JPEG2000 (but WebKit's support is only on OS X).
|Re: Studying Lossy Image Compression Efficiency, July 2014||Masatoshi Kimura||7/15/14 2:34 PM|
On 7/15/14 12:38 PM, stone...@gmail.com wrote:
> Similarly, there's a reason that people are still hacking video into
> JPEGs and using animated GIFs.
People are using animated GIFs, but the "animated GIFs" people are using may not actually be animated GIFs.
No, IE does not support JPEG2000, but IE9+ supports JPEG XR. Chrome supports neither, but it does support WebP.
|Re: Studying Lossy Image Compression Efficiency, July 2014||ren...@gmail.com||7/16/14 2:06 AM|
I'm replying to this note:
"1. We're fans of libjpeg-turbo - it powers JPEG decoding in Firefox because its focus is on being fast, and that isn't going to change any time soon. The mozjpeg project focuses solely on encoding, and we trade some CPU cycles for smaller file sizes. We recommend using libjpeg-turbo for a standard JPEG library and any decoding tasks. Use mozjpeg when creating JPEGs for the Web."
Why not use hardware for JPEG decoding? It uses less memory and less battery, as well as being quicker, and it's available on many devices these days. Why use the CPU to first convert a small amount of data into a large amount of data when most hardware doesn't need that? Not only that, but you probably store the original JPEG data in the cache as well! The fastest decoder is the one that does nothing. Just let the dedicated JPEG decoding hardware, or the GPU, do it.
All this talk about decoding performance seems a bit silly when JPEG decoding performance could be improved so massively.
|Re: Studying Lossy Image Compression Efficiency, July 2014||j...@cloudflare.com||7/18/14 8:05 AM|
On Tuesday, July 15, 2014 3:34:35 PM UTC+1, Josh Aas wrote:
Josh,
I work for CloudFlare on many things but recently on image compression. We have a product called "Polish" that recompresses images for our customers automatically. As we are in the process of rolling out a new version I looked at mozjpeg 2.0.
I selected 10,000 random JPEGs that we were caching for customers and ran them through mozjpeg 2.0 via jpegtran. Some interesting facts:
1. 691 files were not compressed further. This compares with 3,471 that libjpeg-turbo did not compress further.
2. Of the files that did compress further, the average additional compression was about 3%.
3. Run time was about 1.7x the libjpeg-turbo time.
4. I've put together a small chart showing the distribution of compression that we saw. It's here: https://twitter.com/jgrahamc/status/490114514667327488/photo/1
We will continue to work with mozjpeg 2.0 experimentally with the hope that run time can be brought closer to what we had before as the compression looks good.
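For anyone who wants to reproduce a rough version of this test, a minimal sketch would look something like the following (the directory names are made up, and these aren't necessarily the exact flags we used):

  mkdir -p recompressed
  for f in samples/*.jpg; do
    jpegtran -optimize -progressive -copy none -outfile "recompressed/$(basename "$f")" "$f"
  done
  du -sh samples recompressed   # compare total sizes before and after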
|Re: Studying Lossy Image Compression Efficiency, July 2014||Caspy7||7/19/14 1:14 PM|
Would this code be a candidate for use in Firefox OS or does most of that happen in the hardware?
|Re: Studying Lossy Image Compression Efficiency, July 2014||Ralph Giles||7/19/14 1:40 PM|
On 2014-07-19 1:14 PM, Caspy7 wrote:
Probably not for Firefox OS, if you mean mozjpeg. Not necessarily because it uses hardware, but because mozjpeg is about spending more CPU power to compress images. It's more something you'd use server-side or when creating apps. The phone uses libjpeg-turbo for image decoding, which is fast, just not as good at compression.
|Re: Studying Lossy Image Compression Efficiency, July 2014||Gabriele Svelto||7/21/14 7:46 AM|
On 19/07/2014 22:40, Ralph Giles wrote:
It might be useful in Firefox OS development: we routinely re-compress PNG assets in FxOS, but we've never tried re-compressing our JPEG assets (which are mostly wallpapers IIRC).
|Re: Studying Lossy Image Compression Efficiency, July 2014||Bryan Stillwell||7/21/14 3:57 PM|
One option that I haven't seen compared is the combination of JPEG w/ packJPG (http://packjpg.encode.ru/?page_id=17). packJPG can further compress JPEG images another 20%+ and still reproduce the original bit-for-bit.
More details on how this is done can be found here:
To me it seems that JPEG+packJPG could be competitive with, or even exceed, HEVC-MSP on bits/pixel.
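If the packJPG command-line tool behaves the way its project page describes (I haven't verified the exact invocation, so treat this as a sketch), a round trip would be roughly:

  packJPG photo.jpg   # should produce photo.pjg, typically 20%+ smaller
  packJPG photo.pjg   # should restore the original JPEG bit-for-bit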
|Re: Studying Lossy Image Compression Efficiency, July 2014||Josh Aas||7/24/14 2:51 PM|
> Are there any plans to integrate into other tools, specifically imagemagick?
For now we're going to stay focused on improving compression in mozjpeg's library. I think a larger improved toolchain for optimizing JPEGs would be great, but it's probably outside the scope of the mozjpeg project.
> While you state that you now also accept JPEG as input for re-compression, that usually involves a loss of quality in the process.
Options for improving re-compression are very limited if you're not willing to accept any quality loss. That said, our 'jpgcrush' feature does reduce size significantly for progressive JPEGs without harming quality.
> Does mozjpeg have a preferred input format (for best quality/performance)?
Not really. It's probably best to input JPEG if your source image is JPEG, otherwise I'd probably recommend converting to BMP for use with cjpeg.
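For example, starting from a PNG, something along these lines should work (the ImageMagick step and the quality value here are just for illustration):

  convert source.png source.bmp                 # any lossless conversion to BMP works
  cjpeg -quality 80 -outfile output.jpg source.bmp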
|Re: Studying Lossy Image Compression Efficiency, July 2014||Josh Aas||7/24/14 2:54 PM|
On Tuesday, July 15, 2014 3:15:13 PM UTC-5, perez....@gmail.com wrote:
I'm not sure. WebP was created much later than JPEG, so I'd think/hope they're already using some equivalent of trellis quantization.
I'm not really sure what this means, but you can experiment with re-encoding with mozjpeg and find a level that saves on file size but at which you can't tell the difference between the source and the re-encoded image.
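A simple way to run that experiment, assuming mozjpeg's cjpeg (which accepts JPEG input) and a source file called original.jpg, is a quality sweep along these lines:

  for q in 90 85 80 75 70; do
    cjpeg -quality $q -outfile recompressed-q$q.jpg original.jpg
  done

Then compare the file sizes and look at the images side by side, keeping the lowest quality setting that's still visually indistinguishable from the source.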
|Re: Studying Lossy Image Compression Efficiency, July 2014||Josh Aas||7/24/14 2:59 PM|
On Friday, July 18, 2014 10:05:19 AM UTC-5, j...@cloudflare.com wrote:
With mozjpeg you probably want to re-encode with cjpeg rather than jpegtran. We added support for JPEG input to cjpeg in mozjpeg to make this possible. I'm not sure, but I don't think jpegtran takes advantage of much of the work we've done to improve compression.
We haven't spent as much time as we'd like on run-time optimization; we've really been focused on compression wins. We hope to spend more time on run-time performance in the future.
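To make the distinction concrete (the file names and quality value here are illustrative, not a recommendation): jpegtran rewrites the existing DCT coefficients losslessly, while cjpeg with JPEG input decodes and re-encodes the image, which is where mozjpeg's extra compression work can kick in:

  jpegtran -optimize -progressive -copy none -outfile out-lossless.jpg in.jpg
  cjpeg -quality 85 -outfile out-reencoded.jpg in.jpg

The second command is lossy (there's generation loss from the decode/re-encode), which is the trade-off to weigh for something like Polish.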
|Re: Studying Lossy Image Compression Efficiency, July 2014||ja...@removethebackground.com||7/31/14 5:06 AM|
On Thursday, July 24, 2014 11:59:58 PM UTC+2, Josh Aas wrote:
Hi Josh
You write that we should re-encode with cjpeg rather than just optimize with jpegtran, but what settings would you use for this if the purpose is just to optimize, without changing the format, quality, and so on in any way?
I tried with "cjpeg -quality 100 -optimize -progressive" but this seems to give me much bigger files.
I am hoping to optimize images uploaded for websites, which have already had their quality setting adjusted to fit their purpose, so I am only interested in optimizing the images losslessly, which seems like a case similar to John's.
And one other thing:
I've been testing an early version of jpegtran from mozjpeg, but after upgrading to 2.0 my test files seem to grow by a few KB after being optimized.
Was there an error in the older versions that deleted a bit too much data, or did the algorithm change for the worse in 2.0?
I am using "jpegtran -optimize -progressive -copy none" with both versions.
|Re: Studying Lossy Image Compression Efficiency, July 2014||jno...@hirevue.com||9/16/14 9:02 AM|
On Tuesday, July 15, 2014 8:34:35 AM UTC-6, Josh Aas wrote:
Could you post the command lines used for the various encoders? Also, for mozjpeg, what is the effect of using arithmetic coding instead of Huffman coding?
I know arithmetic coding isn't supported by a lot of browsers, but neither are most of the formats being tested in the study, so it seems appropriate to consider.
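For what it's worth, if the build has arithmetic coding enabled, both cjpeg and jpegtran accept an -arithmetic switch, so a quick size comparison is easy to script (file names and quality here are made up):

  cjpeg -quality 85 -outfile huffman.jpg input.ppm
  cjpeg -quality 85 -arithmetic -outfile arith.jpg input.ppm
  ls -l huffman.jpg arith.jpg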
|Re: Studying Lossy Image Compression Efficiency, July 2014||jno...@hirevue.com||9/16/14 9:06 AM|
On Tuesday, July 15, 2014 1:38:00 PM UTC-6, stone...@gmail.com wrote:
> Not only would you get a lot more than a 5% encoding boost,
Based on what data?
> but you'd get much higher quality images to boot.
Based on what data?
Just because something is new doesn't automatically imply it's better. I've seen conflicting data on whether or not JPEG2000 outperforms JPEG. And on some basic level, that last statement is also pretty fickle since encoder maturity is a huge factor in quality.
|Re: Studying Lossy Image Compression Efficiency, July 2014||songof...@gmail.com||11/28/14 1:53 PM|
On Tuesday, July 15, 2014 7:34:35 AM UTC-7, Josh Aas wrote:
> This is the discussion thread for Mozilla's July 2014 Lossy Compressed Image Formats Study and the Mozilla Research blog post entitled "Mozilla Advances JPEG Encoding with mozjpeg 2.0".
It would help if you used much more distinct colors in your graphs of the results. It can be very hard to keep track of which is which. You used two shades of red/purple and three shades of blue/green/teal. That's a bizarre decision for graphs meant to be easily understood.