Settings for Lossless

24lev...@gmail.com

Nov 6, 2013, 4:17:16 AM
to webp-d...@webmproject.org
What settings should I use for lossless and best compression?

I am currently using -q 100 -lossless

But I am skeptical about keeping the alpha channel since, from what I grasp, it's never used unless I have made use of it myself.
And I think it takes some space?

Also, what about -m?

The README says that -q controls the time spent minimizing size when lossless is active.
Isn't that the same thing -m controls?

So I am a bit confused.


So in other words, what settings should I use to get a lossless image with as small a file size as possible?

oX Triangle

Nov 6, 2013, 9:48:23 AM
to webp-d...@webmproject.org, 24lev...@gmail.com
If you use -q 100 then you use lossy mode!!
Erase the -q 100... use only -lossless:

-mt -m 6 -lossless

If you don't batch-process, or you know the type of every image,
you can optimize some of them with the -hint [photo|picture|graph] parameter
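
For reference, the flags mentioned above combine like this on the cwebp command line (filenames here are placeholders; a sketch, not a definitive recipe):

```shell
# Lossless encode; -m 6 spends the most effort searching for a small file.
cwebp -lossless -m 6 input.png -o output.webp

# Same, with multi-threading enabled and a content hint for the encoder's
# heuristics (photo, picture, or graph, depending on the source image):
cwebp -lossless -m 6 -mt -hint photo input.png -o output.webp
```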

24lev...@gmail.com

Nov 6, 2013, 9:53:30 AM
to webp-d...@webmproject.org, 24lev...@gmail.com
Wait wait...
Are you sure?

The README says this:

You might want to try the -lossless flag too, which will compress the source
(in RGBA format) without any loss. The -q quality parameter will in this case
control the amount of processing time spent trying to make the output file as
small as possible.

From my understanding, with -lossless, -q becomes the time spent trying to compress the size further.

And from my tests, the difference between -q 100 -lossless and non-lossless is big, so it must be lossless.

And I tried -m 6 now; it did improve things a tiny bit, but every bit counts :)

-mt, however, never seems to be used, and I know that many video codecs lose quality with -mt (though very marginally), so in this case I prefer not to use it, as I don't gain anything from it.
I can easily just encode 4 images at a time (4 cores), which is much faster anyway :)

Does the -hint optimization matter for lossless?

I only want lossless, and as compressed as possible.

Though if I want to use lossy, I will test different settings to find something to my liking.

oX Triangle

Nov 6, 2013, 6:40:46 PM
to webp-d...@webmproject.org, 24lev...@gmail.com
I have made some huge tests with different -q values. If you are looking for the smallest result,
you should, for example, encode with -q 50, 51, 52, ..., 59 to make 10 pics and take the smallest.
But in most cases I don't get more than a 1% gain in file size,
and on every image it's a different quality number...

If it's OK for you to take 10 times longer to calculate..? OK.

The -mt is only for multiprocessing... no effect on single CPUs.

If you want smaller sizes.. if it's allowed.. clean the image first (remove noise...).
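
The per-image -q sweep described above can be scripted; a minimal sketch, assuming cwebp is on the PATH and using placeholder filenames:

```shell
# Encode input.png at -q 50..59 in lossless mode and keep the smallest output.
best_size=""
for q in 50 51 52 53 54 55 56 57 58 59; do
  cwebp -lossless -q "$q" input.png -o "try-$q.webp"
  size=$(wc -c < "try-$q.webp")
  if [ -z "$best_size" ] || [ "$size" -lt "$best_size" ]; then
    best_size=$size
    cp "try-$q.webp" best.webp
  fi
done
echo "smallest result: $best_size bytes (saved as best.webp)"
```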

24lev...@gmail.com

Nov 6, 2013, 6:52:52 PM
to webp-d...@webmproject.org, 24lev...@gmail.com
Not entirely sure what you mean.

Do you mean that changing -q mostly results in increased time, but pretty much no size difference?
If so, well, that's to be expected, and for archiving it's worth it. It doesn't take that long, and I can do other stuff while encoding, so it could take 10 times longer and I wouldn't care.

Yeah, I know. Or well, single-CPU; I am guessing you mean 1 core, as having several CPUs is rare.

And well, from my tests, I never get above 25% CPU when encoding WebP, and 25% for me is one full core (4 cores, 100/4 = 25).
So I think it can only be used in certain scenarios; not sure when.

Or something is wrong and it simply doesn't want to use my other cores, but as said, I run 4 at a time anyway.

But well, I can't say it's fast; it certainly is slow, but for lossless it's worth it. Lossy, however, not so much, especially with the color problem I am having.

James Zern

Nov 7, 2013, 2:18:38 AM
to webp-d...@webmproject.org, 24lev...@gmail.com
Hi,


On Wednesday, November 6, 2013 3:52:52 PM UTC-8, 24lev...@gmail.com wrote:
> Not entirely sure what you mean.
>
> Do you mean that changing -q mostly results in increased time, but pretty much no size difference?
> If so, well, that's to be expected, and for archiving it's worth it. It doesn't take that long, and I can do other stuff while encoding, so it could take 10 times longer and I wouldn't care.

This is the right way to think about it, analogous to gzip -1..-9.
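
The analogy can be demonstrated with gzip itself: higher levels spend more time for (usually) smaller output, while the decompressed data is identical either way. A quick sketch using synthetic data (assumes gzip and seq are available):

```shell
# Make a compressible test file (the numbers 1..20000, one per line).
seq 1 20000 > sample.txt

# Level 1 = least effort, fastest; level 9 = most effort, smallest output.
size_fast=$(gzip -1 -c sample.txt | wc -c)
size_best=$(gzip -9 -c sample.txt | wc -c)

echo "gzip -1: $size_fast bytes, gzip -9: $size_best bytes"
```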
 

> Yeah, I know. Or well, single-CPU; I am guessing you mean 1 core, as having several CPUs is rare.
>
> And well, from my tests, I never get above 25% CPU when encoding WebP, and 25% for me is one full core (4 cores, 100/4 = 25).
> So I think it can only be used in certain scenarios; not sure when.

Multi-threading on the encode side is currently only used in 2 scenarios when doing lossy encoding:
1) on the lossy analysis pass
2) to split the lossy encode from the lossless alpha

-mt is accepted in all cases for compatibility, allowing other combinations to be added without the need to update command lines.

Pascal Massimino

Nov 7, 2013, 4:58:19 AM
to WebP Discussion, 24lev...@gmail.com
Hi,


And actually, there might be some possibilities in the future to speed up lossless compression using several threads.
That's on my TODO list.

To summarize the lossless options:

-q controls the effort and time spent on compressing further.
-m controls the number of extra algorithms and compression tools used, and varies the combination of these tools.

Note that the trade-off between final size and compression time is harder to control for lossless than for lossy.
As a result, even though -q and -m are roughly monotonic (more time spent compressing -> smaller files), there might
be some inversions for particular sources (that is: you raise the -q value and suddenly the file gets a
little bigger). It's never a dramatic increase in size, but just don't be surprised if this happens.
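
Putting the thread's advice together, a reasonable starting point for smallest-possible lossless output might look like this (a sketch; remember that with -lossless, -q 100 maximizes effort, not quality loss):

```shell
cwebp -lossless -q 100 -m 6 input.png -o output.webp
```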

skal
