
New lossless image compression method


Nils Haeck

unread,
Feb 2, 2004, 5:37:02 AM2/2/04
to
Hi everyone,

It seems that in my research for better ways to store images in a lossless
way, I have managed to create a very efficient lossless compression method.

Just before making "outrageous" claims, I want to be sure that I have
compared it against every lossless compression method available to see if it
performs better. So my first question is:

* what is currently the best lossless compression method for photographic
images?

Just as an indication, I tried compressing a photograph, 24bit RGB,
1800x1200 pixels, with these methods:

1) JPG - Lossy of course: 865 Kb 13%

2) BMP - 24Bit 6.329 Kb 100%
3) TIFF - with LZW 3.680 Kb 58% <- requires patent license
4) PNG - maximum compr (9) 3.819 Kb 60%

Compared to my new method and file format, called "ISA"

5) ISA 2.784 Kb 44%

Perhaps with more tweaking I can even squeeze some more bytes out.

It should also be noted that this is for complex photographic images; for
many artificial images like screenshots, it compresses much better.

In general I found that it compresses photographs to around 3 to 4 times the
size of JPG (but lossless!).

ISA can also use any kind of bits per channel (1bit, 2bit, 4bit, 8bit and
e.g. for digital cameras 10 and 12 bit), any number of channels (Grayscale,
RGB, RGBA, CMYK, etc), and even palette-based images. It is specifically
geared towards 2D bitmap information, not suitable for 1D compression
processes.

Compression and decompression times are now comparable to JPG, but can be
made much faster by using MMX in the inner loops, so I guess they will be
faster than JPG in the end. This is also as would be expected because no
"expensive" transforms like DCT are involved.

If it turns out to be a valuable compression method, what should I do?

Any one of you having experience with a situation like this (in general)?

Thanks for any pointers.

Kind regards,

Nils Haeck
www.simdesign.nl


Soós Árpád

unread,
Feb 2, 2004, 5:45:33 AM2/2/04
to
> 3) TIFF - with LZW 3.680 Kb 58% <- requires patent license
> 4) PNG - maximum compr (9) 3.819 Kb 60%

Strange. I've always found that PNG compresses better (in many cases much
better) than TIFF.


Nils Haeck

unread,
Feb 2, 2004, 6:08:26 AM2/2/04
to
The LZW option is almost never available in shareware apps, just in some big
packages that have paid the Unisys license, like Photoshop. With LZW, TIFF
compresses quite reasonably.

Nils

"Soós Árpád" <soos....@iterion.hu> wrote in message
news:401e...@newsgroups.borland.com...

Soós Árpád

unread,
Feb 2, 2004, 6:25:53 AM2/2/04
to
"Nils Haeck" <n.ha...@spamchello.nl> az alábbiakat írta a következo
üzenetben news:401e2de9$1...@newsgroups.borland.com...

> The LZW option is almost never available in shareware apps, just in some
> big packages that have paid the Unisys license, like Photoshop. With LZW,
> TIFF compresses quite reasonably.
>
> Nils

I use PaintShop and selected LZW compression (which is BTW not a highly
sophisticated algorithm).
I achieved the following results using the flower sample picture in Windows
XP:

BMP (uncompressed) 1440054
TIFF 920100 (63.9%)
PNG 729533 (50.7%)
RAR multimedia compression 646291 (44.9%)

The latter is achieved by compressing the uncompressed 24-bit file with WinRar
3.0+ using multimedia compression. Similar results can be achieved using
WinAce 3+ as well.
These compressors have always produced the best non-lossy compression for me
so far. They are MUCH better at compressing screenshots (pictures that have
large areas of exactly the same color).

Jens Gruschel

unread,
Feb 2, 2004, 6:34:40 AM2/2/04
to
> It seems that in my research for better ways to store images in a lossless
> way, I have managed to create a very efficient lossless compression
> method.

I have developed something like that, too. But yours seems to be better. My
method was very close to PNG, so I did not continue working on it.

> ISA can also use any kind of bits per channel (1bit, 2bit, 4bit, 8bit and
> e.g. for digital cameras 10 and 12 bit), any number of channels (Grayscale,
> RGB, RGBA, CMYK, etc), and even palette-based images. It is specifically
> geared towards 2D bitmap information, not suitable for 1D compression
> processes.

Sounds very interesting. Of course you probably want to keep your secrets
until you know which direction to go...

> Any one of you having experience with a situation like this (in general)?

I guess there are two things you can do:

1) Apply for a software patent. First you have to find out whether you
have re-invented something and how new your algorithm really is. Everybody
will hate you (like everybody hated Unisys for LZW), but all big companies
do it, so why shouldn't you? I think you also have to promote your
algorithm, because a patent nobody uses is worthless.

2) Publish your algorithm somewhere, maybe in magazines, on web sites etc.
The more people read about it, the better; that's a good way to prove that
you did invent it. Then everybody can use it for free, and no other person or
company can apply for a patent on something which is already known.

Jens

Nils Haeck

unread,
Feb 2, 2004, 6:43:29 AM2/2/04
to
Hi Soos,

Can you mail me the uncompressed BMP image?

n.haeck at simdesign dot nl

I will then put it through my ISA compressor and see what it produces.

Thanks,

Nils

"Soós Árpád" <soos....@iterion.hu> wrote in message
news:401e...@newsgroups.borland.com...

Jens Gruschel

unread,
Feb 2, 2004, 6:36:19 AM2/2/04
to

It also depends on the image of course. Try a bitmap with random pixels :-)

Jens

P.S. Sorry, I sent the answer to you, first (I just hit the wrong button).

Nils Haeck

unread,
Feb 2, 2004, 7:01:01 AM2/2/04
to
> Sounds very interesting. Of course you probably want to keep your secrets
> until you know which direction to go...


Indeed, I can't :) Must be good to know that it is written in Delphi
though..

The only thing I can say about it is that it fully profits from virtually
every aspect of a photographic image. Things like

- Use the fact that images are 2D data, not just 1D, so
* Pixels close to each other are often similar
* Describe the data in a spatial way, not as a linear array

- Use the fact that images consist roughly of 80% gradient planes and 20%
edges.

- Use the fact that there's a lot of correlation between the color planes

- Certain color groups are more dominant than others.
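
To give a flavour of the first point without giving away anything of ISA
itself: below is a rough Python sketch of the well-known MED predictor from
JPEG-LS. It is NOT my method, just a textbook example of 2D prediction, where
each pixel is predicted from its left/top/top-left neighbours and only the
(hopefully small) residual is handed to the entropy coder.

# Generic illustration only (NOT the ISA predictor): the median edge
# detector (MED) predictor from JPEG-LS. Each pixel is predicted from its
# left, top and top-left neighbours; a lossless codec then entropy-codes
# the prediction residuals.

def med_predict(left, above, above_left):
    if above_left >= max(left, above):
        return min(left, above)
    if above_left <= min(left, above):
        return max(left, above)
    return left + above - above_left

def residuals(img):
    """img: 2D list of ints (one channel); returns a same-size residual grid."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            left = img[y][x - 1] if x > 0 else 0
            above = img[y - 1][x] if y > 0 else 0
            above_left = img[y - 1][x - 1] if x > 0 and y > 0 else 0
            out[y][x] = img[y][x] - med_predict(left, above, above_left)
    return out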

> 1) Apply for a software patent. At first you have to find out whether you
> didn't re-invent something and how new your algorithm really is. Everybody
> will hate you (like everybody hated UniSys for LZW), but all big companies
> do it, so why shouldn't you? I think you also have to promote your
> algorithm, because a patent nobody uses is worthless.

Actually, perhaps I should patent it; on the other hand, I would not want to
have to police my patent all the time. It would also seriously hinder the
widespread use of the compression algorithm. However, if a big company comes
up to me with a *very* good offer, well...

> 2) Publish your algorithm somewhere, maybe in magazines, on web sites etc.
> The more people read about it, the better, that's a good way to prove that
> you did invent it. Now everybody can use it for free, and no other person /
> company can apply for a patent about something which is already known.

Planning on that.. Do you have any suggestions as to which magazines?

Thanks for your input!

Nils

"Jens Gruschel" <nos...@pegtop.net> wrote in message
news:401e359b$1...@newsgroups.borland.com...

Eric Grange

unread,
Feb 2, 2004, 7:08:15 AM2/2/04
to
> If it turns out to be a valuable compression method, what should I do?

Good question :)

You can try to patent it and ask for fees, which is a sure way to make
sure your method won't be used on any large scale for the next 15 years
or so, but you may make some bucks if you hit the right customers at the
right time. Or you can make it more open and fight/collaborate with
metaformat groups (like PNG's) to see if they could be interested
in integrating it.
Of course, there is also the issue that your compressor may use some
already patented technique, in which case neither option will work out,
but you may only be made aware of it through a lawyer coming out
of nowhere...

Anyway, I guess you'll need to "prove" the efficiency on larger collections
of images (there are some around the web), along with the lossless nature
(ie. compress-decompress-compare cycle on those large collections).
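
Something along these lines would do for the lossless check - a rough Python
sketch, where "isa_compress" / "isa_decompress" are just placeholder names for
whatever command-line tools you end up shipping:

# Rough sketch of a compress-decompress-compare cycle over a folder of
# images; "isa_compress" / "isa_decompress" are hypothetical command-line
# tools standing in for the (unreleased) ISA codec.
import filecmp
import subprocess
from pathlib import Path

def verify_lossless(folder):
    failures = []
    for bmp in sorted(Path(folder).glob("*.bmp")):
        isa = bmp.with_suffix(".isa")
        back = bmp.with_name(bmp.stem + "_roundtrip.bmp")
        subprocess.run(["isa_compress", str(bmp), str(isa)], check=True)
        subprocess.run(["isa_decompress", str(isa), str(back)], check=True)
        ratio = isa.stat().st_size / bmp.stat().st_size
        ok = filecmp.cmp(bmp, back, shallow=False)
        print(f"{bmp.name}: {ratio:.1%} {'lossless' if ok else 'MISMATCH!'}")
        if not ok:
            failures.append(bmp.name)
    return failures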

Eric

Nils Haeck

unread,
Feb 2, 2004, 7:43:41 AM2/2/04
to
Hi Soos,

Thanks for the picture. You sent me a JPG file, which I converted to a BMP
file, of size 1440054 bytes like yours, so I assume it is the same file.

But that does not change the fact that the file was once a JPG. JPG
introduces a lot of 8x8 artifacts (small edges) which are harder to compress
than an original, smoother BMP.

Nevertheless, I compressed the BMP file with "ISA" and got it to 614.363
bytes (42.7%)! I tried to RAR the resulting file and it became bigger
(that's good): 614.441 bytes.

I used WinRAR 3.20 with option "best".

By the way, I can still squeeze out some more bytes, which I'm going to try
right now :)

Kind regards,

Nils Haeck
www.simdesign.nl


"Soós Árpád" <soos....@iterion.hu> wrote in message
news:401e...@newsgroups.borland.com...

Soós Árpád

unread,
Feb 2, 2004, 7:43:00 AM2/2/04
to
Yes, I know it is not 100% correct to use an image decompressed from JPG, but
I couldn't quickly find a good-quality uncompressed 24-bit picture on my
HDD.

When will some (limited) version be available of your compressor?


Nils Haeck

unread,
Feb 2, 2004, 7:52:09 AM2/2/04
to
> Anyway, I guess you'll need to "prove" the efficiency on larger collections
> of images (there are some around the web), along with the lossless nature
> (ie. compress-decompress-compare cycle on those large collections).

I can give results in tabular form. But how can I actually prove that
without giving at least some executable or DLL away (which can be
decompiled)..

Do you have any URLs of these collections?

Thanks for your feedback!

Nils

"Eric Grange" <egr...@glscene.org> wrote in message
news:401e3c8a$1...@newsgroups.borland.com...

Nils Haeck

unread,
Feb 2, 2004, 8:00:19 AM2/2/04
to
> When will some (limited) version be available of your compressor?

Not until any patent issues are fixed :) Or if you pay me US$1M in 30
minutes, it is available in one hour :) Which could be a bargain if you can
resell it to MS for US$10M.

No seriously, as soon as I have published an article about the method and it
is properly credited to me, I will probably make it available as a component.

Kind regards,

Nils
www.simdesign.nl


"Soós Árpád" <soos....@iterion.hu> wrote in message
news:401e...@newsgroups.borland.com...

eLion

unread,
Feb 2, 2004, 11:25:13 AM2/2/04
to
Nils Haeck wrote:
>
> 1) JPG - Lossy of course: 865 Kb 13%
>
> 2) BMP - 24Bit 6.329 Kb 100%
> 3) TIFF - with LZW 3.680 Kb 58% <- requires patent license
> 4) PNG - maximum compr (9) 3.819 Kb 60%
>
> Compared to my new method and file format, called "ISA"
>
> 5) ISA 2.784 Kb 44%
>

Hello Nils,

1) You are surely aware of the JPEG 2000 project?
They are working on a new compression method for images, with an algorithm for
lossless compression (as well as lossy compression). (royalty free)
How does it compare with yours?

2) Just try to compare with a BMP compressed with zip, rar, 7-zip, ace...
If the size improvement with those easy, free methods is not great, your new
format would need some other significant improvement over the others.
The size of the resulting file is one (important) parameter, but not the only one:
- can your new format provide "interlaced" streaming?
- can your new format be edited "raw"? (for example, 90/180 rotation
directly on the data, without decompressing)
- ...

I think, unless you have connections with those big guys in the digital image
world ;-), the only thing you can do is publish your algorithm, method and format
as royalty free (provided it does not use any other patented algorithm...).

If it's good, it might benefit the community, and in return, you may have some
developers willing to enhance this and help you.


Regards,

Emmanuel

Alan Garny

unread,
Feb 2, 2004, 11:33:43 AM2/2/04
to
"Nils Haeck" <n.ha...@spamchello.nl> wrote in message
news:401e25f6$1...@newsgroups.borland.com...

> Just as an indication, I tried compressing a photograph, 24bit RGB,
> 1800x1200 pixels, with these methods:
>
> 1) JPG - Lossy of course: 865 Kb 13%
>
> 2) BMP - 24Bit 6.329 Kb 100%
> 3) TIFF - with LZW 3.680 Kb 58% <- requires patent license
> 4) PNG - maximum compr (9) 3.819 Kb 60%
>
> Compared to my new method and file format, called "ISA"
>
> 5) ISA 2.784 Kb 44%

Sounds very encouraging. Having said that, allow me to play devil's advocate
here... Why would people use yet another format when all they gain is "only"
15-25% in the end? That percentage range is based on the above data (PNG &
ISA) and that given in reply to Soós's message.

Also, I haven't seen any information regarding the time your
method takes to compress/decompress an image. What types of pictures have you
tried your method on (e.g. number of colours actually used? Complexity of
the picture?)? What is the memory requirement?

Don't get me wrong, your work seems encouraging, but if you really want to
get people interested, it should really add a lot to what already
exists.

Alan.


Uffe Kousgaard

unread,
Feb 2, 2004, 11:46:18 AM2/2/04
to
"Nils Haeck" <n.ha...@spamchello.nl> wrote in message
news:401e45a1$1...@newsgroups.borland.com...

>
> I can give results in tabular form. But how can I actually prove that
> without giving at least some executable or DLL away (which can be
> decompiled)..

Set up a webservice of some kind - of course at your own server.

Ignacio Vazquez

unread,
Feb 2, 2004, 12:03:39 PM2/2/04
to
Nils Haeck wrote:

> > 2) Publish your algorithm somewhere, maybe in magazines, on web
> > sites etc. The more people read about it, the better, that's a
> > good way to prove that you did invent it. Now everybody can use it
> > for free, and no other person / company can apply for a patent about
> > something which is already known.
>
> Planning on that.. Do you have any suggestions as to which magazines?

DDJ is always good. I'm not certain which month their graphics issue is;
I can check tonight if you want.

--
Cheers,
Ignacio

Help improve Delphi: http://qc.borland.com/

Eric Grange

unread,
Feb 2, 2004, 11:46:12 AM2/2/04
to
> I can give results in tabular form. But how can I actually prove that
> without giving at least some executable or DLL away (which can be
> decompiled)..

Best approach would probably be to scout the net for papers on other
compression algorithms, identify the most commonly used image databases,
references and result charts, and then provide figures that can
be compared to these.

> Do you have any URL's of these collections?

http://sipi.usc.edu/services/database/Database.html
http://www.ece.ualberta.ca/~mandal/index-info/img+vid.html

Courtesy of google ;)

Eric

RandomAccess

unread,
Feb 2, 2004, 12:22:29 PM2/2/04
to
Hi Nils,

Your system sounds great. Do you have any comparison info concerning the
speed of the algorithm?

Best Regards


Jan Derk

unread,
Feb 2, 2004, 12:14:09 PM2/2/04
to
> The only thing I can say about it is that it fully profits from virtually
> every aspect of a photographic image. Things like
> [Snip: Hints about how you did it]

Nils,

If you are up to something, you should not post any details or even the
slightest hints about how you did it. This is exactly the stuff that
should go into any patent, if your algorithm is patentable.

Geeks like you and me are always enthusiastic and want to share new
inventions. If you want to make money: don't. I would either bring it
into the public and open-source it, or be totally silent about the details.

If you really think you have something important, cancel your post.

Jan Derk

Jens Gruschel

unread,
Feb 2, 2004, 1:37:37 PM2/2/04
to
> Why would people use yet another format when all they gain is "only"
> 15-25% in the end?

A good point. I don't think it makes sense to introduce yet another format
either, unless you have good reasons. One reason could be that you need some
features other formats don't provide (for example, PSD files are not as small
as PNG files, but they do support more special things: layers / text / effects
etc.). So if you are planning some special application which needs a new
format because other formats don't provide what you need, why not implement
your own algorithm if it's a good one?

Jens

Jens Gruschel

unread,
Feb 2, 2004, 1:38:33 PM2/2/04
to
> Geeks like you and me are always enthousiastic and want to share new
> inventions. If you want to make money: don't. I would either bring it
> into the public and open source it or be totally silent about the details.

Right. But I don't think he told too much. Other algorithms make use of that,
too.

Jens

Jens Gruschel

unread,
Feb 2, 2004, 1:49:10 PM2/2/04
to
> * Pixels close to each other are often similar

That's what my algorithm utilizes, too. There's a very weird idea behind my
algorithm, but on some types of images it works perfectly. Maybe I should
publish my algorithm (when I invented it some years ago I thought about a
patent, but I don't think that's a good idea any more).

> * Describe the data in a spatial way, not as a linear array

I thought about this, too, but couldn't find a good way to implement it
properly.

I'd really love to see your algorithm, and compare it with mine. But that's
probably impossible until at least one of us has published it. Well, since
I'm very busy at the moment, that's ok, but don't forget to tell me (here)
as soon as we can read something about it.

Jens

DaveH

unread,
Feb 2, 2004, 2:00:12 PM2/2/04
to
My 2 cents Nils...

Having read some of these posts and having started a patent once, my
suggestion to you is to simply write articles and get published. A patent
will cost you a few thousand dollars, so unless you have deep pockets and
lots of time, it's not as easy to do as it is to say.

If you can get published, then anyone trying to patent your ideas will be
violating copyright laws. If your publication is good and seen by the Big
Boys (like digital camera companies), then they will contact you for
permission to use the algorithm. BTW: don't forget to have a copyright
notice at the end of your article stating that the algorithm and code cannot
be used for commercial purposes without the prior written consent of the
author.

DaveH

"Nils Haeck" <n.ha...@spamchello.nl> wrote in message

news:401e25f6$1...@newsgroups.borland.com...

Jens Weiermann

unread,
Feb 2, 2004, 2:11:17 PM2/2/04
to
Hi Nils,

On Mon, 2 Feb 2004 11:37:02 +0100, Nils Haeck wrote:
> Compared to my new method and file format, called "ISA"

sounds very promising, but the name sucks <g>! Reminds me of those old
16bit cards...

Anyway, good job!

Regards,
Jens

Peter Haas

unread,
Feb 2, 2004, 2:11:53 PM2/2/04
to
Hi Nils,

Nils Haeck wrote in <401e25f6$1...@newsgroups.borland.com>:


> It seems that in my research for better ways to store images in a lossless
> way, I have managed to create a very efficient lossless compression method.
>
> Just before making "outrageous" claims, I want to be sure that I have
> compared it to any lossless compression method available to see if it
> performs better. So my first question is:
>
> * what is currently the best lossless compression method for photographic
> images?

This depends on the image contents.
Presumably JPEG 2000 lossless is a good choice.

> Just as an indication, I tried compressing a photograph, 24bit RGB,
> 1800x1200 pixels, with these methods:
>
> 1) JPG - Lossy of course: 865 Kb 13%
>
> 2) BMP - 24Bit 6.329 Kb 100%
> 3) TIFF - with LZW 3.680 Kb 58% <- requires patent license
> 4) PNG - maximum compr (9) 3.819 Kb 60%

You can increase the compression rate of a PNG image by detecting the
optimal filter separately for every line. Unfortunately there are good
PNG encoders, which do this, and bad PNG encoders, which don't use the
filter feature at all, like Photoshop in its default setting.
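
The usual heuristic, suggested in the PNG specification, is to try all five
filter types on each scanline and keep the one whose filtered bytes have the
smallest sum of absolute values. A rough Python sketch of that idea (not the
code of any particular encoder):

# Sketch of per-scanline PNG filter selection using the "minimum sum of
# absolute differences" heuristic. cur/prev are raw scanlines as bytes,
# bpp is the number of bytes per pixel (e.g. 3 for 24-bit RGB).

def paeth(a, b, c):
    p = a + b - c
    pa, pb, pc = abs(p - a), abs(p - b), abs(p - c)
    if pa <= pb and pa <= pc:
        return a
    return b if pb <= pc else c

def filter_line(ftype, cur, prev, bpp):
    out = bytearray()
    for i, x in enumerate(cur):
        a = cur[i - bpp] if i >= bpp else 0                # left
        b = prev[i] if prev else 0                         # above
        c = prev[i - bpp] if prev and i >= bpp else 0      # above-left
        pred = (0, a, b, (a + b) // 2, paeth(a, b, c))[ftype]
        out.append((x - pred) & 0xFF)
    return out

def best_filter(cur, prev, bpp):
    def cost(line):  # treat filtered bytes as signed and sum their magnitudes
        return sum(v if v < 128 else 256 - v for v in line)
    return min((cost(filter_line(f, cur, prev, bpp)), f) for f in range(5))[1]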

A holiday photo of mine, 2058 x 1372:
BMP 8275 kByte 100%
TIFF 6295 kByte 76%
PNG 4960 kByte 60% (PaintShop Pro)
JPEG 2000 lossless 3443 kByte 42%

The image was originally saved as JPG with high quality. Maybe this
influences the results.

On the other hand, I have images where JPEG 2000 lossless creates bigger
files than TIFF and PNG.


> Compared to my new method and file format, called "ISA"
>
> 5) ISA 2.784 Kb 44%

Can you upload your test image in a well known format (e.g. PNG) and
post the link?

Bye Peter.
--
JEDI+ API, the active JEDI Header Conversions Project:
http://jediplus.pjh2.de

Nils Haeck

unread,
Feb 2, 2004, 2:43:10 PM2/2/04
to
Wohooohwwww :) Relax :))

It's the name of my daughter (grin)

I'm still trying to come up with a good acronym though, but up till now I
only thought of a French one: "I"mages "S"ans "A"meloiration. Considering
I'm a Dutch man I found that quite an achievement.

Nils

"Jens Weiermann" <wexm...@solidsoftwareDOT.de> wrote in message
news:3tgpqjq4qzte$.15301rbqrze8d$.dlg@40tude.net...

Alan Garny

unread,
Feb 2, 2004, 2:46:33 PM2/2/04
to
"Nils Haeck" <n.ha...@spamchello.nl> wrote in message
news:401ea6a6$1...@newsgroups.borland.com...

> Wohooohwwww :) Relax :))
>
> It's the name of my daughter (grin)
>
> I'm still trying to come up with a good acronym though, but up till now I
> only thought of a French one: "I"mages "S"ans "A"meloiration. Considering
> I'm a Dutch man I found that quite an achievement.

You nearly got it right... :)

Images Sans Amélioration

Having said that, it means: pictures without improvement... Is it really
what you meant?? :)

Alan.


Nils Haeck

unread,
Feb 2, 2004, 3:00:23 PM2/2/04
to
Hello Peter,

> Can you upload your test image in a well known format (e.g. PNG) and
> post the link?

Yes, I will upload a few images later today or tomorrow, including - yes -
the compressed counterpart (.ISA files).

That'll give some people some incentive to break the code, not that I think
that is humanly possible :) It would be fun to see though, and hereby I even
put a prize on it: a bottle of 15-year-old Dalwhinnie Scottish single malt.

I think my biggest competition is JPEG 2000. Do you know which utility or
program produces lossless J2K files? I once worked with the JasPer lib,
but didn't know about / have time to investigate the lossless feature. I also
hear that JBIG is lossless. ImageMagick also supports JPEG 2000 but I haven't
spotted the lossless option in it.

And even though it is reported as lossless, is it "truly" lossless? Or does
it perhaps have some conversions in it that make it "pseudo-lossless"? For
instance, a DCT followed by an IDCT with realistic calculation methods often
produces roundoff errors.

Thanks a lot for your feedback.

Nils

> I holiday photo from me, 2058 x 1372:
> BMP 8275 kByte 100%
> TIFF 6295 kByte 76%
> PNG 4960 kByte 60% (PaintShop Pro)
> JPEG 2000 lossless 3443 kByte 42%
>

Can you perhaps post the BMP file somewhere? I found that Gustavo Daud's PNG
image component compresses quite well at compression level 9 (much better than
Photoshop).


Nils Haeck

unread,
Feb 2, 2004, 2:53:04 PM2/2/04
to
People with digital photo cameras would love to get 15-25% extra uncompressed
photos on their memory sticks :)

And satellite bandwidth users would love to send through 15-25% more HD
uncompressed images.

By the way, I have already raised the bar since this morning; I'm now
fighting the magical limit of 40% compression. There are some tweaks here
and there that might squeeze out the last percent.

Kind regards,

Nils Haeck
www.simdesign.nl

"Alan Garny" <som...@somewhere.com> wrote in message
news:401e7bf9$1...@newsgroups.borland.com...

Jens Weiermann

unread,
Feb 2, 2004, 4:05:15 PM2/2/04
to
Hi Niels,

On Mon, 2 Feb 2004 20:43:10 +0100, Nils Haeck wrote:

> Wohooohwwww :) Relax :))
>
> It's the name of my daughter (grin)

oops! Of course I meant "it sucks" as a name for a file format - it's sure
nice for a little lady! How old is she? If I'm ever gonna invent a new file
format, I'm gonna name it after my son (Robin, 22 months). Kind of like the
.ROB extension ;-)

Cheers!
Jens

Rene Tschaggelar

unread,
Feb 2, 2004, 4:11:25 PM2/2/04
to
Nils Haeck wrote:

> Hi everyone,


>
> It seems that in my research for better ways to store images in a
> lossless way, I have managed to create a very efficient lossless
> compression method.
>
> Just before making "outrageous" claims, I want to be sure that I have
> compared it to any lossless compression method available to see if it
> performs better. So my first question is:
>
> * what is currently the best lossless compression method for
> photographic images?
>

> Just as an indication, I tried compressing a photograph, 24bit RGB,
> 1800x1200 pixels, with these methods:
>
> 1) JPG - Lossy of course: 865 Kb 13%
>
> 2) BMP - 24Bit 6.329 Kb 100%
> 3) TIFF - with LZW 3.680 Kb 58% <- requires patent
> license 4) PNG - maximum compr (9) 3.819 Kb 60%
>
> Compared to my new method and file format, called "ISA"
>
> 5) ISA 2.784 Kb 44%
>

>[snip]


>
> If it turns out to be a valuable compression method, what should I do?
>
> Any one of you having experience with a situation like this (in
> general)?

There once was a method called fractal compression; it produced
*.fif files and achieved very good results on self-similar images,
or images where some pattern repeated at different scales.
It somehow vanished as the developers wanted to make big bucks
and had rather rigid license requirements.

As to the lossless compression of data streams, there once was
a method called entropy compression. I have also lost sight of it.

Another worthy competitor for lossless compression: zip.

As to your question: take out a worldwide patent on it, hang that
patent in your toilet, and know that you have blocked some technology
for 20 years until the patent runs out.
;-)

Rene
--
Ing.Buro R.Tschaggelar http://www.ibrtses.com
Your newsgroups @ http://www.talkto.net

Fred Edberg

unread,
Feb 2, 2004, 4:50:03 PM2/2/04
to

> There once was a method called fractal compression, it produced
> *.fif files and achieved very good results on selfsimilar images,
> or image where some patter repeated also on different scales.
> It somehow vanished as the developpers wanted to make big bucks
> and had rather rigid license requirements.
>
> As to the lossless compression of datastreams, there once was
> the method called entropy compression. I also lost sight of it.
>
> Another worthy competitor for lossless compression : zip.
>
> As to your question : make a worldwide patent on it, hang that
> patent into your toilet and know to have blocked some technology
> for 20 years until the patent runs out.
> ;-)
>
> Rene

[Hi Nils]

I'm glad Rene brought this one up. I haven't seen any mention of LizardTech
(the MrSID format) yet.

Although I've been out of the geoscience realm for a little while, I do
know that MrSID is commonly used there, and I seem to remember claims of
compression ratios of 5% of the original and am fairly certain this is
lossless - or its [largest] user base wouldn't use it <g> [see link below].

...my recall of all the details could be skewed though. While I am not
familiar with the intimate details of the algorithm(s) involved, I seem to
recall that it does rely upon either a quadtree and/or fractal-based
approach.

The obligatory link is here:
http://www.lizardtech.com/solutions/geo/index.php

The MrSID example definitely shows how a good compression algorithm can be
patented/licensed and resold for fun and profit - so, either way, good luck
Nils.

Regards,

Fred Edberg
http://home.teleport.com/~fedberg/imagepage.htm
http://home.teleport.com/~fedberg/


Jens Gruschel

unread,
Feb 2, 2004, 5:10:12 PM2/2/04
to
> BMP (uncompressed) 1440054
> TIFF 920100 (63,9%)
> PNG 729533 (50.7%)
> RAR multimedia compression 646291 (44.9%)

I have 689172 for PNG, probably because of other PNG settings (I hope it is
the same file) and 732340 for my own algorithm (at least it is better than
TIFF).

Jens

Michael Hansen

unread,
Feb 2, 2004, 5:34:20 PM2/2/04
to
XnView can read and write JPEG 2000 files and much more... www.xnview.com ...
It supports a lot of formats; maybe there are some lossless formats you
haven't compared with...


Regards,

Michael Hansen


"Nils Haeck" <n.ha...@spamchello.nl> wrote in message

news:401e...@newsgroups.borland.com...

Nils Haeck

unread,
Feb 2, 2004, 6:40:58 PM2/2/04
to
Hello Rene,

> There once was a method called fractal compression, it produced
> *.fif files and achieved very good results on selfsimilar images,
> or image where some patter repeated also on different scales.
> It somehow vanished as the developpers wanted to make big bucks
> and had rather rigid license requirements.

I think indeed some people here remember the parrot's eye of FIF and how it
would supposedly stay a circle instead of going blocky when zooming in,
because it was a fractal :)

Indeed an interesting file format, but IIRC it was not lossless.

Nils

Nils Haeck

unread,
Feb 2, 2004, 6:36:49 PM2/2/04
to
Thanks all for the very helpful reactions.

I came to the conclusion that JP2 with the "lossless" option (the LuraWave
plugin for IrfanView) compresses better than my compression method. This
effectively sends me back to the drawing board, because I don't like second
place :) But .jp2's wavelet transform is certainly a strong competitor.

Here is one reference image from Kodak, in 6 different formats, stored
inside a ZIP archive. All of them are lossless.
http://www.simdesign.nl/images/compr/kodim.zip

Here's how they rank:

1) kodim01.jp2 510,320 bytes 43.3%
2) kodim01.isa 548,591 bytes 46.5%
3) kodim01.png 736,501 bytes 62.4%
4) kodim01.zip 798,142 bytes 67.7%
5) kodim01.tif 1,078,006 bytes 91.4% (using LZW)
6) kodim01.bmp 1,179,702 bytes 100.0%

I hope I can make .isa compress better and then I'll report back.

By the way, the compression/decompression speed of .isa seems to be faster
than .jp2, and that is without any low-level optimization.

I did not verify whether or not .jp2 is truly lossless.

Paul Nicholls

unread,
Feb 2, 2004, 6:55:32 PM2/2/04
to
"Nils Haeck" <n.ha...@spamchello.nl> wrote in message
news:401e...@newsgroups.borland.com...
> Thanks all for the very helpful reactions.
>
> I came to the conclusion that JP2 with the "lossless" option (lurawave
> plugin for Irfanview) compresses better than my compression method. This
> effectively sends me back to the drawing board, because I don't like
> second place :) But .jp2's wavelet transform is certainly a strong competitor.

Bummer!

> Here is one reference image from Kodak, in 6 different formats, stored
> inside a ZIP archive. All of them are lossless.
> http://www.simdesign.nl/images/compr/kodim.zip
>
> Here's how they rank:
>
> 1) kodim01.jp2 510,320 bytes 43.3%
> 2) kodim01.isa 548,591 bytes 46.5%
> 3) kodim01.png 736,501 bytes 62.4%
> 4) kodim01.zip 798,142 bytes 67.7%
> 5) kodim01.tif 1,078,006 bytes 91.4% (using LZW)
> 6) kodim01.bmp 1,179,702 bytes 100.0%
>
> I hope I can make .isa compress better and then I'll report back.

There isn't TOO much difference between yours and jp2...you may be able to
beat it :-)

Good luck, I hope you become first! :)

> By the way, compression/decompression speed of .isa seems to be faster
> than .jp2, and that is without any lowlevel optimization.

That is nice, keep up the good work dude :-)

<SNIP>


Pete

unread,
Feb 2, 2004, 8:38:53 PM2/2/04
to
On Tue, 3 Feb 2004 00:36:49 +0100, Nils Haeck wrote:

> 5) kodim01.tif 1,078,006 bytes 91.4% (using LZW)
> 6) kodim01.bmp 1,179,702 bytes 100.0%

If LZW TIFF only compresses 8.6%, that suggests to me there's something
atypical about the Kodak image. I suggest you don't make big decisions
without testing on a wide variety of photos.

Good luck!

Jens Gruschel

unread,
Feb 2, 2004, 9:26:32 PM2/2/04
to
> I came to the conclusion that JP2 with the "lossless" option (lurawave
> plugin for Irfanview) compresses better than my compression method. This
> effectively sends me back to the drawing board, because I don't like
> second place :) But .jp2's wavelet transform is certainly a strong competitor.

My compression algorithm is near PNG, so I've lost the race. But if you are
also compressing bitplanes separately (instead of single pixels), it might
be interesting to try my ideas in your algorithm. If not... well... at
least it was fun to develop it.
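
For anyone wondering what I mean by bitplanes: peel each 8-bit channel into
eight binary images, one per bit. Gray-coding the values first usually helps,
because then a small change in pixel value does not flip many planes at once.
A rough Python sketch of just the splitting step (nothing more of my
algorithm than that):

# Rough sketch of splitting an 8-bit channel into bitplanes. Gray-coding is
# optional, but it keeps a +1 change in pixel value from toggling many planes.

def to_gray(v):
    return v ^ (v >> 1)

def bitplanes(channel, gray=True):
    """channel: 2D list of 0..255 ints -> list of 8 binary 2D lists (MSB first)."""
    planes = []
    for bit in range(7, -1, -1):
        planes.append([[((to_gray(v) if gray else v) >> bit) & 1 for v in row]
                       for row in channel])
    return planes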

Jens

Peter Haas

unread,
Feb 2, 2004, 6:03:54 PM2/2/04
to
Hi Nils,

Nils Haeck wrote in <401e...@newsgroups.borland.com>:


> That'll give some people some incentive to break the code, not that I think
> that is humanly possible :) It would be fun to see though, and hereby I even
> put a prize on it: one bottle of Dalwhinnie Scottish Single Malt of 15 years
> old.

No interest, I prefer Irish whiskey. ;-)


> I think my biggest competition is Jpeg2000. Do you know which utility or
> program produces lossless J2K files?

LEAD Technologies offers a Photoshop / PaintShop Pro plugin:
http://www.leadtools.com/Utilities/PSPlugIn/PhotoShop_plug-in.htm

Algovision and Luratech offer a freeware application for non-commercial
use (SmartCompress Lite), free browser plugins (IE, Netscape) and a
commercial Photoshop / PaintShop Pro plugin:
http://www.algovision-luratech.com/

I have installed the PS/PSP plugin from LEAD Technologies and
SmartCompress Lite; both have a lossless option.


> I also hear that JBIG is lossless.

Yes and no.

The ISO/IEC JTC1 SC29 committee has two groups:
JPEG -> Joint Photographic Experts Group
JBIG -> Joint Bi-level Image experts Group

JBIG has developed ISO/IEC IS 11544 / ITU-T T.82 for the lossless
compression of bi-level images. It can also be used for coding
greyscale and colour images with limited numbers of bits per pixel.

Currently JBIG is developing JBIG2 (IS 14492), which supports both lossy and
lossless compression.

JPEG has developed ISO/IEC IS 10918 / ITU-T T.81; this is the
compression standard behind JFIF (the file format), which the world knows as
JPEG (.jpeg, .jpg).

JPEG has also developed ISO/IEC IS 14495 / ITU-T T.87, also called JPEG-LS,
a lossless / near-lossless compression intended for e.g. medical images. It
uses the LOCO algorithm from HP Labs.


> And even though it is reported as lossless, is it "truly" lossless? Or does
> it perhaps have some conversions in it that make it "pseudo lossless"? For
> instance, a DCT followed by a IDCT with realistic calculation methods often
> produces roundoff errors.

I don't really know, but I think it is lossless. JPEG 2000 uses a DWT
(Discrete Wavelet Transform) in the lossless mode with an integer filter
(the lossy mode uses a float filter).

I have compared the decompressed image with the bitmap image in binary:
no difference.
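
The colour transform on the lossless path is integer-only as well: JPEG 2000
uses the reversible colour transform (RCT) there, so the round trip is exact.
A small Python sketch of that transform, just as an illustration:

# Sketch of the JPEG 2000 reversible colour transform (RCT): integer-only,
# so forward followed by inverse returns the original RGB triple exactly
# (unlike the floating-point YCbCr used on the lossy path).

def rct_forward(r, g, b):
    y = (r + 2 * g + b) // 4
    cb = b - g
    cr = r - g
    return y, cb, cr

def rct_inverse(y, cb, cr):
    g = y - (cb + cr) // 4   # // is floor division, also for negative values
    r = cr + g
    b = cb + g
    return r, g, b

# quick check: the round trip is exact for a grid of 8-bit RGB values
assert all(rct_inverse(*rct_forward(r, g, b)) == (r, g, b)
           for r in range(0, 256, 17)
           for g in range(0, 256, 17)
           for b in range(0, 256, 17))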


> > I holiday photo from me, 2058 x 1372:
> > BMP 8275 kByte 100%

> Can you perhaps post the BMP file somewhere?

As I wrote, I don't know whether the prior JPEG compression influences
the results. The photo was taken with an analog camera and digitized to a
PhotoCD. I will scan the paper photo at an equivalent resolution and upload
that scan tomorrow, instead of the JPEG-based bitmap, which I could
upload today.

> I found that Gustavo Daud's PNG image component works pretty stiff at
> compression 9 (much better than Photoshop).

It is similar to PaintShop Pro, sometimes better.

Epis

unread,
Feb 3, 2004, 4:31:46 AM2/3/04
to
A few thoughts after reading through this thread:

Forget about desktop publishing, the internet and other PC-related
applications. There are so many formats already out there, and I can't see
why people would switch to a new, unknown format just to get some extra
compression along with the inevitable compatibility problems. Unless all the
big players decide to make it happen somehow for some reason. And even then:
think about TIFF and JPEG, for instance; they are not technically even near
the best solutions but are still widely used.

Instead I'd concentrate on niche applications where losslessness, bandwidth,
memory consumption and computing power are real issues. High-end digital
cameras, frame grabbers, satellites, security, medical, geographical
applications... See what the strengths and possibilities of your algorithm
are, research some applications to find where there is need for improvement,
and then see how your solution would do the job compared with the
competition.

So if you find an application where your algorithm brings true benefits
while being reasonable to implement, you'll surely be able to sell it. What
I mean is you probably should aim for something like a million-dollar
medical gadget manufacturer, rather than trying to write another great image
format (like PNG) for the web enthusiast and Photoshop hobbyist.


Epis


Nils Haeck

unread,
Feb 3, 2004, 6:01:32 AM2/3/04
to
Hi Epis,

I fully agree with you. I foresee a future for my file format especially in
"proprietary" lossless formats for digital camera vendors, for one. And of
course the medical and satellite imaging applications.

Our company is already active in these areas; we have a lot of interesting
stuff for image alignment, shape recognition, character recognition, etc.

But since my format is reasonably simple to implement, it would also do
nicely in many smallish devices with less-than-powerful chips, like
handhelds.

Kind regards,

Nils

"Epis" <thisisnot...@nospam.nop> wrote in message
news:401f...@newsgroups.borland.com...

Nils Haeck

unread,
Feb 3, 2004, 5:56:35 AM2/3/04
to
Hi Pete,

I'm not sure if IrfanView implements LZW correctly. Indeed, I would also
expect it to give slightly better results.

Nevertheless, these are uncompressed photographs, which are very hard to
compress for an algorithm that expects similar series of bytes. This is an
entirely different problem from compressing database or textual data.
Nils

"Pete" <nob...@nowhere.com> wrote in message
news:1pb10skvjd3no$.8j4ou5ov1cgc.dlg@40tude.net...

eLion

unread,
Feb 3, 2004, 6:37:42 AM2/3/04
to
Nils Haeck wrote:

> People with digital photocameras would love to get 15-25% extra uncompressed
> photos on their memory sticks :)
>

People with digital cameras, like me, do not use any uncompressed format.
Only pros do. And on its own, a gain of 20% is not enough. Your new method
needs a lot more.


> And sattelite bandwidth users would love to send through 15-25% more HD
> uncompressed images.

same answer.

>
> By the way, I have already raised the bar since this morning, I'm now
> fighting the magical limit of 40% compression. There are some tweaks here
> and there that might squeese out the last percent.

You are working hard, but I am afraid that will not get you anywhere, except
for the pride it gives you to achieve it.

Regards,

Emmanuel

Jacques Oberto

unread,
Feb 3, 2004, 6:45:42 AM2/3/04
to

"Nils Haeck" <n.ha...@spamchello.nl> a écrit dans le message de
news:401ea6a6$1...@newsgroups.borland.com...

> Wohooohwwww :) Relax :))
>
> It's the name of my daughter (grin)
>


I thought ISA meant: Image Squeeze Algorithm :)

Jacques


Theodor Kleynhans

unread,
Feb 3, 2004, 7:42:22 AM2/3/04
to
Hi Nils,

> I foresee a future for my file format specially in "proprietary"
> lossless formats for digital camera vendors, for one.

---
I agree. If my digicam was able to store RAW files in compressed form that were only slightly larger
than the JPEG equivalent, I'd only use that. I suspect many other digicam users would agree.

Many new digicams offer RAW functionality, not only the so-called "prosumer" models. However, the
large filesizes of the RAW files are prohibitive (e.g. 6.6MB RAW vs 2.2MB JPEG-Fine on the Canon
300D).

Regards,
Theodor

---------------------
Sulako Developments
http://www.sulako.com


Nils Haeck

unread,
Feb 3, 2004, 9:56:03 AM2/3/04
to
> I agree. If my digicam was able to store RAW files in compressed form that
> were only slightly larger than the JPEG equivalent, I'd only use that. I
> suspect many other digicam users would agree.

Unfortunately, "slightly larger" is not realistic :) I think the best to
expect is something like 2 or 3 times as large (depending on the JPG
compression). It is simply hard to compete with a format that throws away
information.

Nils


Lord Crc

unread,
Feb 3, 2004, 9:47:32 AM2/3/04
to
On Tue, 3 Feb 2004 12:42:22 -0000, "Theodor Kleynhans"
<borlan...@sulako.com> wrote:

> If my digicam was able to store RAW files in compressed form that were only slightly larger
>than the JPEG equivalent, I'd only use that.

If your digicam uses uncompressed (if one disregards the lossless
Huffman step) JPEG, then there's only the quality lost in the color
space/DCT conversions, which should be very little...

Since JPEG can be read by almost anyone on anything, I fail to see why
you'd want that.

- Asbjørn

Nils Haeck

unread,
Feb 3, 2004, 10:16:18 AM2/3/04
to
Hi Asbjorn,

Uncompressed JPEG (it exists, but is rare) produces bigger files than my
compression method. The only format that produces smaller files is JP2
(JPEG 2000), which does not use a DCT but uses wavelet transforms with
arithmetic coding.

Nils

"Lord Crc" <lor...@hotmail.com> wrote in message
news:l0dv105ef1529lp0o...@4ax.com...

David J Taylor

unread,
Feb 3, 2004, 9:53:06 AM2/3/04
to
"Epis" <thisisnot...@nospam.nop> wrote in message
news:401f...@newsgroups.borland.com...
[]

> Instead I'd concentrate on niche applications where losslessness, bandwidth,
> memory consumption and computing power are real issues. High-end digital
> cameras, frame grabbers, satellites, security, medical, geographical
> applications... See what the strengths and possibilities of your algorithm
> are, research some applications to find where there is need for improvement,
> and then see how your solution would do the job compared with the
> competition.
[]

There is a new data transmission standard for weather satellites in the
process of being implemented right now. It is called HRIT/LRIT (high/low
rate information transmission). The new European Meteosat-8 satellite
uses lossless wavelet transform for its HRIT data, lossy 8-bit JPEG for
its own LRIT data, and it has used both lossless 8- and 10-bit JPEG and
lossless wavelet transform for data from other satellites which it relays.

I have produced software in Delphi which decodes all these formats, hence
my interest.

Cheers,
David


Nils Haeck

unread,
Feb 3, 2004, 9:54:12 AM2/3/04
to
> People with digital cameras, like me, do not use any uncompressed format.
> Only pros do. And on its own, a gain of 20% is not enough. Your new method
> needs a lot more.

I don't agree.

1) First of all, the fact that the compression is lossless is very important.
People who edit pictures a lot will love to store their images losslessly in
a way which does better than the bulky TIFF used nowadays. Especially people
whose hard disk is 99% full. These 20% are very welcome then.

2) If I could make the claim that my method compresses better than (and
faster than) the new standard JP2, then this is big news. Currently this is
not the case, but where I am now it certainly merits investing a bit more
energy in it, because I think it can be done.

3) Since this new format will be known to be lossless by definition (unlike
JP2, which can be lossless or lossy), people will hopefully feel safer
using it.

> > And sattelite bandwidth users would love to send through 15-25% more HD
> > uncompressed images.
>
> same answer.

Have you seen the bills?

Kind regards,

Nils


Michael Hansen

unread,
Feb 3, 2004, 10:17:58 AM2/3/04
to
If you decide to make it available for free for non-commercial software
products (of course not open source..), I believe it could get quite popular.
There are many great free image viewers with plugin support, and if ISA can
compete closely with JPEG 2000 there is absolutely no reason not to do so
(the file size is one thing, another is processing time, and if ISA is really
fast, I'd find the few extra KB in file size worth the use). Consider e.g. a
multilayered format for Graphics32 with ISA.. that would be mmm... GREAT!

If ISA were the fastest and close to the best available lossless compression
algorithms, available free for non-commercial use, I'd choose it! And
further, if popular image viewers could show the format (which I don't think
is very hard to accomplish, because the goal for the image viewer developers
is to support as much as possible), there's a good reason to keep working
on ISA.

If you choose a strictly commercial path for ISA, you also say no to a large
developer community, and thereby limit the popularity of ISA (which is a
good reason for e.g. digital camera vendors to choose another, more popular
format).
Now, I don't know much about patent rules and such, but I believe the best
path would be something like:

- Publish the algorithm, eventually take out a patent on it.
- Make decoding free for all
- Make encoding free for non-commercial products (as long as you get the
proper credits in the product)


And wide support is a must, so you need to:
- Distribute headers, components and demos for various languages
- Write plugins for as many popular products as possible - decoding plugins
available for free


It sounds really interesting.. one thing I came to think about: have you
considered whether any slightly lossy steps are worth introducing?

If it's possible without revealing any secrets, you could roughly explain the
compression steps - maybe someone can point out further steps or
optimizations..

/Michael Hansen

"Nils Haeck" <n.ha...@spamchello.nl> wrote in message

Jason F. Kowalski

unread,
Feb 3, 2004, 11:03:15 AM2/3/04
to
"Jens Gruschel" <nos...@pegtop.net> wrote

> > > 3) TIFF - with LZW 3.680 Kb 58% <- requires patent license
> > > 4) PNG - maximum compr (9) 3.819 Kb 60%

> > Strange. I've always found that PNG compresses better (in many cases much
> > better) than TIFF.

> It also depends on the image of course. Try a bitmap with random pixels :-)

Well, averaged over all possible inputs, any compression algorithm will actually increase the size.


eLion

unread,
Feb 3, 2004, 11:06:47 AM2/3/04
to
Nils Haeck wrote:
>>People with digital cameras, like me, do not use any uncompressed format.
>>Only pros do. And on its own, a gain of 20% is not enough. Your new method needs a lot more.
>
> I don't agree.

Then, we do not agree.

>
> 1) First of all, the fact that compression is lossless is very important.
> People who edit pictures a lot will love to store their images losslessly in
> a way which does better than the bulky TIFF used nowadays.

I still think those people are pros in digital imaging.
For the common user, high-quality JPG (low compression) is more than good enough.

> Especially people whose hard disk is 99% full. These 20% are very welcome
> then.

Those pros just go and buy another 200 GB HD ;-)

> 2) If I could make the claim that my method compresses better than (and
> faster than) the new standard JP2, then this is big news. Currently this is
> not the case but where I am now it certainly merits to invest a bit more
> energy in it, because I think it can be done.

If you achieve better compression than lossless JPEG 2000, that would
already be admirable.
But would that be enough for you to win the market?
Those people are specialists in their field. They have already invested a lot
of man-years of work and money.
They have already thought ahead about what a new format should contain, like
progressive streaming: you want the image at 200x300, fine, just a part of the
file is streamed.
Then you want a 480x720 enlargement? Just go ahead with the same stream and
you get that... Can your format provide that?


>
> 3) Since this new format will be known to be lossless by definition (unlike
> JP2, which can be lossless or lossy), people will feel safer using it
> hopefully.

Well, JP2 is royalty free.
It's hard to compete against something good and free. ;-)

>
>>>And sattelite bandwidth users would love to send through 15-25% more HD
>>>uncompressed images.
>>
>>same answer.
>
>
> Have you seen the bills?

???

As you asked for advice on this forum, I just gave you mine.
I don't mean that your work is not serious or not good.
As Rene Tschaggelar said earlier in the same thread,


"make a worldwide patent on it, hang that
patent into your toilet and know to have blocked some technology
for 20 years until the patent runs out."

I just hope I am wrong, and that you make a lot of money with your compression method.

Regards,

Emmanuel

Nils Haeck

unread,
Feb 3, 2004, 11:10:45 AM2/3/04
to
Let's keep that as the "official" name then :)

Nils

"Jacques Oberto" <j...@nospam.com> wrote in message
news:401f89f7$1...@newsgroups.borland.com...

Nils Haeck

unread,
Feb 3, 2004, 10:56:49 AM2/3/04
to
> Consider e.g. a
> multilayered format for Graphics32 with ISA.. that would be mmm... GREAT!

I work every day with Graphics32. Actually, the current ISA implementation
by default also stores the Alpha channel (it stores TBitmap32 as native
format). The format is by the way not limited to 8bits/channel, but can be
used for any bits per channel. It will also be especially useful for RAW
images from digicams, which nowadays often have 10..12 bits.

But if you're referring to the layers as used in some of the Graphics32
examples: no, I don't use that scheme, I have created my own layered scheme
in DtpDocuments:
http://www.simdesign.nl/dtpdocuments.html

DtpDocuments can store multipage documents with many shapes, which can be
any popular format. The storage format is simply an archive file a bit like
a ZIP file.

By the way, ISA is also "multi-page". The pages could be used for animation
sequences, movies, or layers. But it just stores them, without any
additional info. It does not try to compress based on inter-image
correlation, like MPEG does.

> It sounds really interesting.. one thing I came to think about: have you
> considered if any slightly lossy steps is worth introducing?

In theory it would be possible to discard information in some steps to make
it lossy. However, I think from a "marketing" point of view it is best to
brand this format as purely lossless. A "sister format" could be introduced
that is lossy (JSA or something).

> If it's possible without revealing any secrets, you could roughly explain
> the compression steps - maybe someone can point out further steps or
> optimizations..

Not at this point; I'm still investigating whether I should patent it and
where. Maybe I will have to share and get ideas at a later stage; that
depends on whether or not I get stuck on making it compress enough to beat
JP2.

I agree that it would be nice to make this into an open format used by the
developer community. I'm part of that community too and like to share stuff.
But on the other hand, sometimes it is also necessary to get paid for work
:) So I'm investigating all commercial ventures first.

Kind regards,

Nils
www.simdesign.nl


Nils Haeck

unread,
Feb 3, 2004, 11:28:15 AM2/3/04
to
> Well, averaged over all possible inputs, any compression algorithm will
> actually increase the size.

That really depends on the random generator. Some random data might look
random (looking at the histogram) but may actually contain some redundancy.
A good compression algorithm will find that and use it.

Also, since the compression algorithm can add some expert knowledge through
its definition, it may be able to even compress the most random data, simply
by luck. Here's my theory on how that works. Suppose you have compression
methods 1 through 255: you can try them all. Most of them will produce
longer files, but due to sheer luck, perhaps one of them actually compresses
the data a bit. You then simply add one byte to the file at the start with
the number of that compression method and voila! Worst case is that you end
up choosing method 0 (no compression) and your file is 1 byte longer.

I'm sure some scientist can prove this theory wrong, but it sounds
reasonable to me :)
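
Just to make the idea concrete, here is a toy Python sketch (nothing to do
with ISA itself): try a few general-purpose compressors, keep whichever output
is smallest, and spend one byte recording which method was used. Method 0
means "stored as-is", so the worst case is the original size plus one byte.

# Toy sketch of the "try several methods, prefix one byte" idea; method 0
# means "stored uncompressed", so the worst case is len(data) + 1 bytes.
import bz2
import lzma
import zlib

METHODS = {
    0: (lambda d: d,   lambda d: d),        # stored as-is
    1: (zlib.compress, zlib.decompress),
    2: (bz2.compress,  bz2.decompress),
    3: (lzma.compress, lzma.decompress),
}

def pack(data: bytes) -> bytes:
    best = bytes([0]) + data                 # fall back to "stored"
    for method_id, (compress, _) in METHODS.items():
        candidate = bytes([method_id]) + compress(data)
        if len(candidate) < len(best):
            best = candidate
    return best

def unpack(blob: bytes) -> bytes:
    decompress = METHODS[blob[0]][1]
    return decompress(blob[1:])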

Nils


Jason F. Kowalski

unread,
Feb 3, 2004, 11:20:44 AM2/3/04
to
"Nils Haeck" <n.ha...@spamchello.nl> wrote

> If it turns out to be a valuable compression method, what should I do?
>
> Any one of you having experience with a situation like this (in general)?

Can't help with legalities, but you should probably do some double-blind
tests by having someone without knowledge of your algorithm provide a
variety of images, maybe grouped into categories (*), and measure compression
times and ratios, as well as sensitivity to corruption (how many pixels does,
for instance, a single bit error in the file affect). As a less rigorous
method, obtain a (random) commercial photo clipart CD (or have someone take
lots of raw images of varying quality and subject with a digital camera) and
apply the major compression algorithms as well as yours. Once you have such
data, you can draw up a better course of action (see the rough sketch at the
end of this post).

(*) Photographic (subcategories such as low light, noisy, low detail, high
detail... etc), rendered (raytraced, OpenGL, radiosity), vector (i.e. Corel,
Xara,... etc), artificial bitmap (i.e. Photoshop-generated only), scanned
(from various media with different dither patterns), TV frame... etc. It's
likely that your method will do better in some and worse in others than other
methods. The danger is, since you know which types will work best, you might
have a tendency to unconsciously test those only.
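
Here is the rough measuring sketch I mentioned above (Python; "bmp2isa" and
the category folder names are just placeholders for whatever converter and
grouping you end up with):

# Rough benchmarking sketch: run a (hypothetical) "bmp2isa" converter over
# category folders of test images and report average compression ratio and time.
import subprocess
import time
from pathlib import Path
from statistics import mean

def benchmark(category_dir):
    ratios, times = [], []
    for bmp in sorted(Path(category_dir).glob("*.bmp")):
        out = bmp.with_suffix(".isa")
        start = time.perf_counter()
        subprocess.run(["bmp2isa", str(bmp), str(out)], check=True)
        times.append(time.perf_counter() - start)
        ratios.append(out.stat().st_size / bmp.stat().st_size)
    return mean(ratios), mean(times)

for category in ["photo_lowlight", "photo_detail", "rendered", "scanned"]:
    ratio, seconds = benchmark(category)
    print(f"{category:16s} avg ratio {ratio:6.1%}  avg time {seconds:6.2f}s")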


Nils Haeck

unread,
Feb 3, 2004, 11:44:31 AM2/3/04
to
Hi Emmanuel,

Thanks for your advice. I certainly appreciate it, don't worry :)

> I still think those people are pros in digital imaging.

There are *lots* of pros in the world :)

> They already thought ahead of what a new format should contain like the
> progressive streaming: you want the image at 200x300, fine, just a part of
> the file is streamed.
> Then, you want an 480x720 enlargement ? just go ahead with the same stream
> and you get that... Can your format provide that ?

I can't go into tech details but I can just say that my format is also well
thought out. On the other hand I sometimes wonder who would use these
advanced options. The old JPG for instance also has the "progressive" option
to allow it to show up quickly in a web browser. But let's face it: who is
currently actually using that?

> Well, JP2 is royalty free.

JP2 is currently royalty-free but not patent-free. At any moment any one of
the current patent-holding companies or new owners involved (which are many)
could stand up and start collecting license fees. This happened to GIF when
Unisys started enforcing its LZW patent. At that moment, my format would
possibly be a cheaper alternative.

> Its hard to compete against something good and free. ;-)

Agreed. When I manage to beat the size though, I have something better :)
Definitely worth the additional effort..

Kind regards,

Nils


Nils Haeck

unread,
Feb 3, 2004, 11:53:14 AM2/3/04
to
Hi Jason,

The ISA format would be specifically geared towards photographic images,
which it compresses among the best. Of course it can also compress
artificial images quite well, but it does not claim to be any better there
than, for instance, ZIP.

Of course you're right that this should be tested on a large scale, but all
in due time :) When the legalities have cleared I will probably provide some
kind of utility, BMP2ISA.EXE or PNG2ISA.EXE, that people can use to do all
kinds of tests.

Thanks and kind regards, Nils

"Jason F. Kowalski" wrote in message news:401f...@newsgroups.borland.com...

eLion

unread,
Feb 3, 2004, 12:46:06 PM2/3/04
to
Nils Haeck wrote:

>
> I can't go into tech details but I can just say that my format is also well
> thought out. On the other hand I sometimes wonder who would use these
> advanced options. The old JPG for instance also has the "progressive" option
> to allow it to show up quickly in a web browser. But let's face it: who is
> currently actually using that?
>

Progressive/interlaced is not the same as the new JP2 streaming.
Now, with JPEG & co, when there is an image thumb with a link on it to enlarge
it, the web site actually stores 2 pictures: a small one and a big one.
What JP2 wants to achieve is that the image is stored only once. The first
request, for displaying the thumb, would load a part of the file, corresponding
to the resolution you need.
Clicking on the thumb for an enlargement would load the rest of the _same_
file, to increase the resolution.
This would simplify web administration and reduce data transfer when loading
the enlargement: a part of the data is already there.
Though, browsers need support for that feature...
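
To picture the idea with a toy example (nothing like the real JP2 codestream):
store a half-resolution image first and, after it, the residuals needed to
rebuild the full image exactly. A prefix of the stream then gives the thumb,
and the remainder upgrades it without re-sending what the client already has.

# Toy illustration of progressive storage (not how JPEG 2000 really does it):
# section 1 of the stream is a half-resolution image (the "thumb"), section 2
# holds the residuals needed to rebuild the full-resolution image exactly.

def split(img):
    h, w = len(img), len(img[0])
    coarse = [[img[y][x] for x in range(0, w, 2)] for y in range(0, h, 2)]
    detail = [(y, x, img[y][x] - coarse[y // 2][x // 2])
              for y in range(h) for x in range(w) if y % 2 or x % 2]
    return coarse, detail

def merge(coarse, detail, h, w):
    # start from a nearest-neighbour enlargement of the coarse image ...
    img = [[coarse[y // 2][x // 2] for x in range(w)] for y in range(h)]
    # ... then patch in the residuals to recover the original exactly
    for y, x, residual in detail:
        img[y][x] += residual
    return img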

>
>
> Agreed. When I manage to beat the size though, I have something better :)
> Definitely worth the additional effort..
>

As someone said in the thread, you could patent it and make money in
specialized fields (like medical imaging).
But as for large-scale distribution in hi-tech toys for the public, like
digital cameras, I have serious doubts that your format has any chance unless
it brings a _big_ improvement.

Again, good luck.

Emmanuel

Nils Haeck

unread,
Feb 3, 2004, 1:33:11 PM2/3/04
to
Thanks Wayne,

Wow, I did not know that RAR was able to achieve such compression.. quite
remarkable, knowing that it is a general-purpose compression method.

Nils

"Wayne Sherman" <body1233 at yahoo dot com> wrote in message
news:401fe4db$1...@newsgroups.borland.com...
> RAR using the BEST compression setting
> 7-ZIP using the ULTRA compression setting


>
> > Here's how they rank:
> >
> > 1) kodim01.jp2 510,320 bytes 43.3%
> > 2) kodim01.isa 548,591 bytes 46.5%
>

> kodim01.rar 596,999 bytes 50.6%
> kodim01.7z 622,626 bytes 52.8%


>
> > 3) kodim01.png 736,501 bytes 62.4%
> > 4) kodim01.zip 798,142 bytes 67.7%

> > 5) kodim01.tif 1,078,006 bytes 91.4% (using LZW)
> > 6) kodim01.bmp 1,179,702 bytes 100.0%
>

> Regards,
>
> Wayne Sherman
> Las Vegas
>
>


Patrick Veenstra

unread,
Feb 3, 2004, 1:23:18 PM2/3/04
to
Hi Nils,

I think your format can be better than JP2 if you use some ASM optimization.
Anyway, the first thing you must do is register your source. It costs about
10 euros and will help prove that you wrote this code, in case of any legal
problems you may have in the future. It is possible to register code that
uses patents you don't own.

You register code to prove that it is yours; you patent it to make sure
nobody else can use it without permission.

If you have any problems, contact me. I'm Dutch but live in Spain (still the
European Union, so same patents, same registration).

Regards,
Patrick


"Nils Haeck" <n.ha...@spamchello.nl> escribió en el mensaje
news:401e25f6$1...@newsgroups.borland.com...
<snip>


Johann Larve

unread,
Feb 3, 2004, 1:39:00 PM2/3/04
to
Hi,

> 1) First of all, the fact that compression is lossless is very
> important. People who edit pictures a lot will love to store their
> images losslessly in a way which does better than the bulky TIFF used
> nowadays. Especially people whose harddisk is 99% full. These 20% are
> very welcome then.

I love the concept behind TIFF: various compression schemes in one
file. Why not store your compressed data in a TIFF file? I would love
that.

Just my $0.02

Wayne Sherman

unread,
Feb 3, 2004, 1:35:07 PM2/3/04
to
RAR 3.30 with FORCE TRUE COLOR Compression Setting

> > Here's how they rank:
> >
> > 1) kodim01.jp2 510,320 bytes 43.3%

kodim01-2.rar 540,924 bytes 45.9% //RAR 3.30 TRUE COLOR

> > 2) kodim01.isa 548,591 bytes 46.5%

kodim01.rar 596,999 bytes 50.6% //RAR BEST Settting
kodim01.7z 622,626 bytes 52.8% //7-Zip ULTRA

Wayne Sherman

unread,
Feb 3, 2004, 2:09:38 PM2/3/04
to
> Wow, I did not know that RAR was able to achieve such compression.. quite
> remarkable, knowing that it is a general-purpose compression method.

Is your algorithm optimized enough to do compression/decompression in real
time, for say a video codec? If yes, then you have an advantage over the
ZIP/RAR/7-ZIP compressors, since these are not fast enough. How about a
motion-ISA codec?

Can your algorithm be modified to compress more but be slightly lossy? A
slightly lossy video codec with good compression that retains high image
quality would be very interesting.

Nils Haeck

unread,
Feb 3, 2004, 2:10:00 PM2/3/04
to
What do you mean by registering the source? How do you do that?

Thanks, Nils

"Patrick Veenstra" <patrick...@yahoo.es> wrote in message
news:401f...@newsgroups.borland.com...

Nils Haeck

unread,
Feb 3, 2004, 2:26:45 PM2/3/04
to
Hello Wayne,

> Is your algorithm optimized enough to do compression/decompression in
> real time, for say a video codec? If yes, then you have an advantage
> over the ZIP/RAR/7-ZIP compressors, since these are not fast enough.
> How about a motion-ISA codec?

I think it is possible to optimize the implementation so that it becomes
fast enough, especially the decompressor (which is the most important part
in this case).

> Can your algorithm be modified to compress more but be slightly lossy? A
> slightly lossy video codec with good compression that retains high image
> quality would be very interesting.

Yes, it can. And that thought has crossed my mind. That will perhaps be my
next move. But first I want to break the lossless record of JP2 :)

Kind regards, Nils

"Wayne Sherman" wrote in message news:401ff1f2$1...@newsgroups.borland.com...


Rene Tschaggelar

unread,
Feb 3, 2004, 2:20:30 PM2/3/04
to
Nils Haeck wrote:

> Thanks all for the very helpful reactions.
>
> I came to the conclusion that JP2 with the "lossless" option (lurawave
> plugin for Irfanview) compresses better than my compression method.
> This effectively sends me back to the drawing board, because I don't
> like second place :) But .jp2's wavelet transform is certainly a
> strong competitor.
>
> Here is one reference image from Kodak, in 6 different formats, stored
> inside a ZIP archive. All of them are lossless.
> http://www.simdesign.nl/images/compr/kodim.zip


>
> Here's how they rank:
>
> 1) kodim01.jp2 510,320 bytes 43.3%

> 2) kodim01.isa 548,591 bytes 46.5%

> 3) kodim01.png 736,501 bytes 62.4%
> 4) kodim01.zip 798,142 bytes 67.7%
> 5) kodim01.tif 1,078,006 bytes 91.4% (using LZW)
> 6) kodim01.bmp 1,179,702 bytes 100.0%
>

> I hope I can make .isa compress better and then I'll report back.
>
> By the way, compression/decompression speed of .isa seems to be
> faster than .jp2, and that is without any lowlevel optimization.
>
> I did not verify whether or not .jp2 is truly lossless.

Nils,
one image cannot be enough to test a set of algorithms.
You should go through at least 100 different images,
with different sizes, contents and colors,
before making any statement.
There may be a set of images where your algorithm is
better than the others.

Rene
--
Ing.Buro R.Tschaggelar http://www.ibrtses.com
Your newsgroups @ http://www.talkto.net

Nils Haeck

unread,
Feb 3, 2004, 3:10:04 PM2/3/04
to
Hello Rene,

Of course you're right. I haven't had time yet to publish results for more
than one image, but I have tested many more. This one was the first, and it
is representative of the others. The general rule of thumb is that .jp2
compresses slightly better. That's what I'm working on first :) If I have
better news, I'll probably build a script, run it over a whole collection
and show some statistics.

Nils

"Rene Tschaggelar" wrote in message

Jens Gruschel

unread,
Feb 3, 2004, 4:02:38 PM2/3/04
to
> I'm sure some scientist can prove this theory wrong, but it sounds
> reasonable to me :)

There are so-called "perfect codes", which in theory provide the best
compression possible (if the single words are long enough). BUT this assumes
a source without memory, which means no word depends on other words, a very
theoretical assumption that cannot be applied to things like images. In our
real world everything depends on something else. Maybe you can compress the
whole world to one bit if you find the right formula for (de)compression (if
you find this formula you win the Nobel prize, but you cannot do anything
with it, because decompression would take as much time and capacity as the
universe provides; well, "I'm sure some scientist can prove this theory
wrong, but it sounds reasonable to me" :-).

This is very theoretical too, but there is a practical side: many things can
be compressed to a minimum if you do not store the result of your work but
the instructions to generate that result. Unfortunately this cannot be
undone easily (if you have a bitmap file it's hard to make a metafile from
it; the other way round it's easy). An amazing example of this principle is
the "Farbrausch" demo (fr-08). It's a file of 64 KB, just like a small JPEG
file, but you get 3D worlds with many textures, more than 10 minutes of
music and much more. The idea: (nearly) no data is stored, only instructions
to generate data are stored. Link:
http://www.theproduct.de/

Jens

Lord Crc

unread,
Feb 3, 2004, 5:03:49 PM2/3/04
to
On Tue, 3 Feb 2004 16:16:18 +0100, "Nils Haeck"
<n.ha...@spamchello.nl> wrote:

>Uncompressed JPEG (exists, but rare) produces bigger files than my
>compression method.

Ah yes, I misread your post slightly. Yes, it would be bigger, as the
only real compression is provided by the Huffman step.

- Asbjørn

Theodor Kleynhans

unread,
Feb 3, 2004, 8:19:13 PM2/3/04
to
Hi Lord Crc,

"Lord Crc" wrote...


> On Tue, 3 Feb 2004 12:42:22 -0000, "Theodor Kleynhans"
> <borlan...@sulako.com> wrote:
>
> > If my digicam was able to store RAW files in compressed form that were only slightly larger
> >than the JPEG equivalent, I'd only use that.
>
> If your digicam uses uncompressed (if one disregards the lossless
> huffman step) jpeg, then there's only the quality lost in the color
> space/dct conversions, which should be very little...
>
> Since jpeg can be read by almost anyone on anything, i fail to see why
> you'd want that.

---
RAW stores the bits exactly as they are read from the digicam's CCD or CMOS sensor. This has the
following added benefits:

1) >8 bits per pixel colour (usually 10-12, sometimes even 14).
2) You get a true "digital negative", untouched by the camera's
processing algorithms (so you can play with these settings
during postprocessing):
* No sharpening applied
* No gamma or level correction applied
* No white balance applied
* No colour correction applied
(More info at http://www.dpreview.com/learn/?/Glossary/Digital_Imaging/RAW_Image_Format_01.htm)

Regards,
Theodor


---------------------
Sulako Developments
http://www.sulako.com


Nils Haeck

unread,
Feb 3, 2004, 9:23:43 PM2/3/04
to
Hi Theodor,

Although you're right that RAW is a "print" of the CCD at the moment of
capture, this does not prevent the format from being compressed.

For instance, looking at Canon's RAW format CRW, for which I once wrote a
decoder, it stores the CCD values using Huffman codes. This compresses the
format somewhat, but the job could surely be done better using a spatial
algorithm.

So, without losing any of the meaning or quality of the RAW format, I think
a lossless compression method that can handle odd bit depths and numbers of
channels would be very valuable for camera manufacturers.

Kind regards,

Nils

"Theodor Kleynhans" wrote in message news:4020...@newsgroups.borland.com...

jonjon

unread,
Feb 4, 2004, 5:12:22 AM2/4/04
to
A first step would be to post yourself a copy of the source code. You never
open that package, and if needed one day, you can then prove that the code
was written by the time the package was stamped... the person claiming you
stole his code would have to find an older package in his mailbox :)
Seriously, I read this in a book for shareware authors and thought it was a
good idea. I'm not sure how it holds up legally though; maybe a lawyer in
here can tell us more?

Best Regards,

John.

"Nils Haeck" <n.ha...@spamchello.nl> a écrit dans le message de

news:401f...@newsgroups.borland.com...

Glenn Randers-Pehrson

unread,
Feb 4, 2004, 9:50:41 AM2/4/04
to
"Nils Haeck" <n.ha...@spamchello.nl> wrote in message news:<401e...@newsgroups.borland.com>...


> Here is one reference image from Kodak, in 6 different formats, stored
> inside a ZIP archive. All of them are lossless.
> http://www.simdesign.nl/images/compr/kodim.zip
>
> Here's how they rank:
>
> 1) kodim01.jp2 510,320 bytes 43.3%
> 2) kodim01.isa 548,591 bytes 46.5%
> 3) kodim01.png 736,501 bytes 62.4%
> 4) kodim01.zip 798,142 bytes 67.7%
> 5) kodim01.tif 1,078,006 bytes 91.4% (using LZW)
> 6) kodim01.bmp 1,179,702 bytes 100.0%

Also

2a) kodim01.mng 588,139 bytes 49.9% (using pngcrush -loco)

This is using a lossless compression method particularly suitable for
photos.

Glenn

Gary Williams

unread,
Feb 4, 2004, 10:38:42 AM2/4/04
to

jonjon wrote:
> One first step would be to send by post to yourself a copy of the source
> code. You never open that package and if needed one time, you can then
prove
> that the code was written by the time the package has been stamped... the
> person claiming you have stollen his code would have to find an older
> package in his mail box :) Seriously I read this in a shareware author
book
> and thought it was a good idea. I'm not sure how this can be legally
> interpreted though, maybe a lawyer in here can tell us more ?


What's to prevent someone from mailing an unsealed envelope to themselves,
then stuffing the contents in at a later date?

-Gary


Alan Garny

unread,
Feb 4, 2004, 11:28:34 AM2/4/04
to
"Gary Williams" <2FC5...@garywilliams.us> wrote in message
news:4021...@newsgroups.borland.com...

I am sure there are ways of sending yourself something that must remain
sealed. You could even have that recorded in some way or another. I believe
there is something like that in the UK: recorded delivery (?). In France it's
called "lettre avec accusé de réception" (a letter with acknowledgement of
receipt).

Alan.


Uwe

unread,
Feb 4, 2004, 3:06:18 PM2/4/04
to
Hi Jens

Thanks for that interesting link. So those guys are using more or less the
same technique as in SVG, is that right? The web page was a little
confusing.

Cheers
Uwe


Jens Gruschel

unread,
Feb 4, 2004, 4:24:34 PM2/4/04
to
> Thanks for that interesting link. So those guys are using more or less the
> same technique as in SVG, is that right? The web page was a little
> confusing.

SVG? You mean Scalable Vector Graphics? Yes, there are similarities. For
textures, instead of storing each pixel in a file, they store instructions
for how to generate the pixels (not single vertices, lines, splines etc., but
I think something like: start with a red background, add vertical lines, blur
the image, add some noise, and so on; comparable to Photoshop filters with
different parameters). But no, it's not limited to vector graphics (one
interesting way to generate textures is Perlin noise, see
http://freespace.virgin.net/hugo.elias/models/m_perlin.htm for more details,
or take a look at my texture generator at http://www.pegtop.de/xfader/). For
generating sound (maybe this gets off-topic now, so if you are interested in
more details we should go somewhere else) you often add and/or modulate
different sine waves (or other basic wave types), so to store them you only
need a few bytes for the different frequencies and amplitudes (of course at
the cost of some time to generate them; that's why it takes a while before
the demo starts).
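A toy illustration of that "store the recipe, not the pixels" idea (a Python
sketch with made-up parameters; nothing like the actual Farbrausch code):

import math, random

def generate_texture(width, height, base, stripe_period, noise_amount, seed):
    """Build a grayscale texture (list of rows) from a handful of parameters."""
    rng = random.Random(seed)     # fixed seed, so the "recipe" is reproducible
    rows = []
    for y in range(height):
        row = []
        for x in range(width):
            value = base
            value += 40 * math.sin(2 * math.pi * x / stripe_period)  # stripes
            value += rng.uniform(-noise_amount, noise_amount)        # a little noise
            row.append(max(0, min(255, int(value))))
        rows.append(row)
    return rows

if __name__ == "__main__":
    # The whole 256x256 texture is described by these five numbers only.
    tex = generate_texture(256, 256, base=128, stripe_period=16,
                           noise_amount=10, seed=42)
    print(len(tex), "rows of", len(tex[0]), "pixels from a 5-number recipe")

Five parameters stand in for 65,536 pixel values; the price is the time
needed to run the recipe.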

Jens

Nils Haeck

unread,
Feb 4, 2004, 4:40:04 PM2/4/04
to
Interesting demo.. Indeed, making everything procedural (textures, sounds,
3D geometries) can compress enormously.

This is indeed a good idea for image compression too. More than just planes
and edges, one could define a whole set of primitives, and try to build the
(bitmap) image from that in an optimal way.

Of course the hard part is to find a good set of descriptive primitives that
can detail a whole world of different images, and second, to actually write
an algorithm that finds out where to put which primitive.

I'm pretty sure that, spending a few hundred man-years on this with the
right people, it would be possible to compress each image to a few KB :)

Decompression times could be reasonable, but let's not talk about
compression times...

Thanks,

Nils

Btw, I installed your roller-coaster screensaver, and it is very nice in
demo mode. But somehow it does not react well when in screensaver mode
(cannot turn it off, or it gives an error when coming up). So I'm back to
the good ol' lavalamp :)

"Jens Gruschel" wrote in message news:4020...@newsgroups.borland.com...

Jens Gruschel

unread,
Feb 4, 2004, 4:58:51 PM2/4/04
to
> Of course the hard part is to find a good set of descriptive primitives
> that can detail a whole world of different images, and second, to
> actually write an algorithm that finds out where to put which primitive.

Almost impossible, I think. My theory: for lossless compression you need too
many primitives, and once you assign a number to each primitive you end up
needing as much space to store these numbers as the pixel data itself. Of
course it works in special cases (like the Farbrausch textures), but a good
image compression algorithm should be able to compress every image :-)

I told you about my idea for compression. Maybe it's time to add some more
details. I split the image into single bitplanes and try to compress each
bitplane. This works well if many pixels in a row (or maybe rectangle?) have
the same value. So my aim was to make the areas with equal bits larger. I
remembered the good old Gray code and transformed each channel of each pixel
(each byte) to Gray code. As you know, with Gray code, if you increase a
value by one, only one bit changes, which makes sure that areas with
gradients or similar colors have many equal bits. Now back to what you said
about primitives: if you find a way to outline these areas and store them
with as few bytes as possible, you are a lucky winner. With RLE / Huffman or
something similar (I found some other way, but that's only a detail) you get
quite good results, but I noticed that PNG is better. I tried to separate
the areas into rectangles with the same value, which might enhance
compression a lot, but I found no good algorithm for it (and it would
probably be slow). Another alternative might be some kind of quad-tree, but
I haven't tried it yet. So all I have is an algorithm that's nearly as good
as PNG (better in some special cases, but not as good in most cases I
tested). It's nice to know you have invented a new compression algorithm,
but... well... it doesn't matter :-)
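A minimal sketch of that Gray-code / bitplane step (a Python illustration
with made-up helper names, not the actual implementation described above):

def to_gray(value):
    """8-bit binary to Gray code: neighbouring values differ in only one bit."""
    return value ^ (value >> 1)

def bitplane(values, bit):
    """Extract one bitplane (a list of 0/1) from a sequence of 8-bit values."""
    return [(v >> bit) & 1 for v in values]

def count_runs(bits):
    """Number of runs in a bitplane; fewer runs means cheaper RLE/Huffman coding."""
    return 1 + sum(1 for a, b in zip(bits, bits[1:]) if a != b)

if __name__ == "__main__":
    gradient = list(range(64, 192))        # a smooth ramp, like a gradient area
    gray = [to_gray(v) for v in gradient]
    for bit in (5, 4, 1, 0):
        print("bit", bit, ":",
              count_runs(bitplane(gradient, bit)), "runs in plain binary,",
              count_runs(bitplane(gray, bit)), "runs in Gray code")

On a smooth ramp the Gray-coded bitplanes show fewer runs than the plain
binary ones, which is exactly what makes the later coding step cheaper.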

> Btw, I installed your roller-coaster screensaver, and it is very nice in
> demo mode. But somehow it does not react well when in screensaver mode
> (cannot turn it off, or it gives an error when coming up). So I'm back to
> the good ol' lavalamp :)

As I said, it's only a beta (some texture coordinates are set improperly; I
noticed that the version I uploaded shows too many of those screens, because
I was testing that in the last version). Pressing Esc should close the
screensaver, but of course... well... it's beta.

Jens

P.S. Of course I don't know whether there are other algorithms around that
make use of Gray code, so maybe it wasn't even my invention. But if someone
ever finds a way to do good Gray-code / bitplane compression, it would be
nice to mention my name somewhere (just in case she / he reads this *smile*)

Nils Haeck

unread,
Feb 4, 2004, 5:53:28 PM2/4/04
to
I recognise a lot of what you say.

Some things:

- I don't use Gray codes, but a similar process that causes the same
behaviour.

- PNG uses prediction, and does so for each scanline, choosing the best
predictor per scanline. This is the power of PNG: it tries to create as many
equal bytes per scanline as possible (which compress better later on).
However, as shown in this thread, PNG compresses poorly compared to JP2 and
to RAR with the "True Color" option. The reason for this can only be that it
does not treat the information as spatial (2D).

- The quadtree alternative you mentioned is not a bad idea to try :)

> Almost impossible, I think. My theory: for lossless compression you need
> too many primitives, and once you assign a number to each primitive you
> end up needing as much space to store these numbers as the pixel data
> itself. Of course it works in special cases (like the Farbrausch
> textures), but a good image compression algorithm should be able to
> compress every image :-)

I think it is possible:

Suppose you could quickly create an initial artificial image from a set of
primitives. The artificial image is close to the original, but not exactly
equal to it. However, so far you have only spent very few bytes to achieve
this.

The next step is to take the difference between the artificial image and the
original. You can expect this difference to have much smaller color values,
and many zero values. Such an image compresses far better than the original.
Now compress it with a smart method, and the total compression will be much
better than any conventional method.

Just an idea :)
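A minimal sketch of that predict-and-store-the-difference idea (Python, with
a deliberately crude predictor; it is not the ISA method and not PNG's
filter set):

def predict(img, x, y):
    """Predict pixel (x, y) as the average of its left and upper neighbours."""
    left = img[y][x - 1] if x > 0 else 0
    up = img[y - 1][x] if y > 0 else 0
    return (left + up) // 2

def to_residuals(img):
    """Replace each pixel by (pixel - prediction); smooth areas give small values."""
    return [[img[y][x] - predict(img, x, y) for x in range(len(img[0]))]
            for y in range(len(img))]

def from_residuals(res):
    """Exact inverse: rebuild pixel by pixel in raster order, so it is lossless."""
    h, w = len(res), len(res[0])
    img = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            img[y][x] = res[y][x] + predict(img, x, y)
    return img

if __name__ == "__main__":
    # A small synthetic "photo": a diagonal gradient.
    image = [[(x + y) * 4 for x in range(16)] for y in range(16)]
    residuals = to_residuals(image)
    assert from_residuals(residuals) == image          # lossless round trip
    flat = [abs(v) for row in residuals for v in row]
    print("max residual:", max(flat), "vs max pixel:", max(max(r) for r in image))

The round trip is exact, so the scheme stays lossless; the win is that the
residuals are much smaller than the raw pixel values, which any entropy
coder handles far better.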

Nils

"Jens Gruschel" wrote in message news:4021...@newsgroups.borland.com...

Jens Gruschel

unread,
Feb 4, 2004, 7:53:52 PM2/4/04
to
> P.S. Of course I don't know whether there are other algorithms around,
which
> make use of graycode, so maybe it wasn't even my invention.

It's not my invention :-(

http://www.faqs.org/faqs/compression-faq/ (part 1/3)

Jens

sculptex

unread,
Feb 6, 2004, 4:01:22 AM2/6/04
to
I looked into this briefly once; I am sure that you can have a package
timestamped and deposited in a bank. Ask your local bank about it.

You might also consider publishing (here! or elsewhere) hashes (using
various algorithms) and checksums etc. of a zip or similar of your code
and/or algorithms. I am sure you can see where I am going with this.
SUGGESTION: do this now!!
I also thought about publishing an encrypted file containing the algorithm
which you could then decrypt if necessary, but I thought better of it.

BTW, congrats on single-handedly producing a competitive algorithm. I hope
you can find a niche for it. There may be environments (where security is
paramount) that would welcome a lossless proprietary file format.

Regards,

Sculptex

"Alan Garny" <som...@somewhere.com> wrote in message
news:40211db8$1...@newsgroups.borland.com...

Nils Haeck

unread,
Feb 6, 2004, 10:58:27 AM2/6/04
to
Hi Sculptex,

Unfortunately, JPEG 2000 still compresses better. I've tried quite a few of
the tricks up my sleeve to beat it, but have not yet succeeded. I'm now
trying a few others; not giving up :)

You see, I don't want to create a competitive algorithm, I want to create a
better one :)

Nils


"sculptex" wrote in message news:4023584b$1...@newsgroups.borland.com...

Henry A. Eckstein

unread,
Feb 7, 2004, 7:43:20 PM2/7/04
to

Nils Haeck wrote:
>>Sounds very interesting. Of course you probably want to keep your secrets
>>until you know which direction to go...
>
>
>
> Indeed, I can't :) Must be good to know that it is written in Delphi
> though..
>
> The only thing I can say about it is that it fully profits from virtually
> every aspect of a photographic image. Things like
>
> - Use the fact that images are 2D data, not just 1D, so
> * Pixels close to each other are often similar
> * Describe the data in a spatial way, not as a linear array
>
> - Use the fact that images consist mostly of 80% gradient planes and 20%
> edges.
>
> - Use the fact that there's a lot of correlation between the color planes
>
> - Certain color groups are more dominant than others.
>
>

The above sounds like searching for self-similarity in the luminance and
chromaticity channels, which in effect is fractal compression.

Sorry to spoil the fun, but here are the patents for fractal compression
using iterative numeric functions to find self-similar pixels.
The ones marked with *** (3 asterisks) relate fully to the fractal
algorithm itself, while the others relate to the edge enhancement and edge
detection used by fractal encoders.

6,356,973: Memory device having a cyclically configured data memory
and having plural data portals for outputting/inputting data
***6,330,367: Image encoding and decoding using separate hierarchical
encoding and decoding of low frequency images and high frequency edge images
***6,327,304: Apparatus and method to digitally compress video signals
6,292,582: Method and system for identifying defects in a semiconductor
***RE37,342: Dual format digital video production system
***6,275,615: Method and apparatus for image representation and/or
reorientation
6,269,174: Apparatus and method for fast motion estimation
6,246,787: System and method for knowledgebase generation and management
***6,226,414: Image encoding and decoding method and apparatus using
edge synthesis and inverse wavelet transform
6,226,412: Secure digital interactive system for unique product
identification and sales
6,222,532: Method and device for navigating through video matter by
means of displaying a plurality of key-frames in parallel
6,205,239: System and method for circuit repair
***6,195,472: Image processing apparatus and method
6,185,625: Scaling proxy server sending to the client a graphical
user interface for establishing object encoding preferences after
receiving the client's request for the object
***6,181,822: Data compression apparatus and method
***6,178,265: Method and apparatus for motion vector compression
***6,167,155: Method of isomorphic singular manifold projection and
still/video imagery compression
6,141,034: Immersive imaging method and apparatus
6,134,631: Non-volatile memory with embedded programmable controller
6,134,547: Computerized method and system for user-interactive,
multimedia cataloguing, navigation and previewing of film and films on video
6,122,391: Spectrally coordinated pattern search-imaging system and
method
***6,111,988: Fractal representation of data
6,111,568: Personal computer system, compact disk, and method for
interactively viewing the earth
6,091,846: Method and system for anomaly detection
6,081,750: Ergonomic man-machine interface incorporating adaptive
pattern recognition based control system
6,078,664: Z-transform implementation of digital watermarks
6,070,140: Speech recognizer
***6,064,771: System and method for asynchronous, adaptive moving
picture compression, and decompression
***6,055,335: Method and apparatus for image representation and/or
reorientation
***6,054,943: Multilevel digital information compression based on
lawrence algorithm
6,044,172: Method and apparatus for reversible color conversion
***6,002,794: Encoding and decoding of color digital image using wavelet
and fractal encoding
5,974,521: Apparatus and method for signal processing
5,973,731: Secure identification system
***5,946,417: System and method for a multiresolution transform of
digital image information
***5,924,053: Fractal representation of data
5,923,376: Method and system for the fractal compression of data
using an integrated circuit for discrete cosine transform
compression/decompression
5,920,477: Human factored interface incorporating adaptive pattern
recognition based controller apparatus
***5,917,948: Image compression with serial tree networks
5,905,800: Method and system for digital watermarking
5,903,454: Human-factored interface corporating adaptive pattern
recognition based controller apparatus
5,901,246: Ergonomic man-machine interface incorporating adaptive
pattern recognition based control system
5,889,868: Optimization methods for the insertion, protection, and
detection of digital watermarks in digitized data
5,875,108: Ergonomic man-machine interface incorporating adaptive
pattern recognition based control system
***5,870,502: System and method for a multiresolution transform of
digital image information
5,867,386: Morphological pattern recognition based controller system
5,867,221: Method and system for the fractal compression of data
using an integrated circuit for discrete cosine transform
compression/decompression
***5,862,263: Fractal image compression device and method using
perceptual distortion measurement
***5,852,565: Temporal and resolution layering in advanced television


>>1) Apply for a software patent. At first you have to find out whether you
>>didn't re-invent something and how new your algorithm really is. Everybody
>>will hate you (like everybody hated UniSys for LZW), but all big companies
>>do it, so why shouldn't you? I think you also have to promote your
>>algorithm, because a patent nobody uses is worthless.
>
>
> Actually perhaps I have to patent it, on the other hand I would not want to
> run after my patent all the time. It would also seriously hinder the
> widespread use of the compression algorithm. However, if a big company comes
> up to me with a *very* good offer, well...
>
>
>>2) Publish your algorithm somewhere, maybe in magazines, on web sites etc.
>>The more people read about it, the better, that's a good way to prove that
>>you did invent it. Now everybody can use it for free, and no other person
>
> /
>
>>company can apply for a patent about something which is already known.
>
>
> Planning on that.. Do you have any suggestions as to which magazines?
>
> Thanks for your input!
>
> Nils
>
> "Jens Gruschel" <nos...@pegtop.net> wrote in message
> news:401e359b$1...@newsgroups.borland.com...


>
>>>It seems that in my research for better ways to store images in a
>
> lossless
>
>>>way, I have managed to create a very efficient lossless compression
>>
>>method.
>>

>>I have developed something like that, too. But yours seems to be better.
>
> My
>
>>method was very close to png, so I did not continue working with it.


>>
>>
>>>ISA can also use any kind of bits per channel (1bit, 2bit, 4bit, 8bit
>
> and
>
>>>e.g. for digital cameras 10 and 12 bit), any number of channels
>>
>>(Grayscale,
>>
>>>RGB, RGBA, CMYK, etc), and even palette-based images. It is specifically
>>>geared towards 2D bitmap information, not suitable for 1D compression
>>>processes.
>>

>>Sounds very interesting. Of course you probably want to keep your secrets
>>until you know which direction to go...


>>
>>
>>>Any one of you having experience with a situation like this (in
>
> general)?
>

>>I guess there are two things you can do:
>>
>>1) Apply for a software patent. At first you have to find out whether you
>>didn't re-invent something and how new your algorithm really is. Everybody
>>will hate you (like everybody hated UniSys for LZW), but all big companies
>>do it, so why shouldn't you? I think you also have to promote your
>>algorithm, because a patent nobody uses is worthless.
>>
>>2) Publish your algorithm somewhere, maybe in magazines, on web sites etc.
>>The more people read about it, the better, that's a good way to prove that
>>you did invent it. Now everybody can use it for free, and no other person
>
> /
>
>>company can apply for a patent about something which is already known.
>>
>>Jens
>>
>>
>>
>
>
>

--

==============================================================
Henry A. Eckstein
Triad Communications Ltd.
2751 Oxford Street
Vancouver, British Columbia, Canada
V5K 1N5

Telephone: 604-253-3990
Fax: 604-253-0770

Toll Free: 1-800-600-9762 (1-800-OZZY-ROC)

Website: www.triadcommunications.ca

Email: he...@comwave.com

General Inquiries: tri...@comwave.com


Nils Haeck

unread,
Feb 7, 2004, 10:41:39 PM2/7/04
to
Hello Henry,

I understand that there are *lots* of patented methods lurking out there
that perhaps overlap with my compression method. Nevertheless, it is enough
fun to continue my work anyway :)

I scanned through your list, and at least the most important compression
step that I use does not seem to be described. However, it is hard to judge
from the titles alone.

Then again, whether or not they are applicable and enforceable also depends
on where you are located and when the patents were issued. And a lot of
"prior art" may cancel out the rest. As I mentioned somewhere before, the
whole process is built up more or less from "maths book" methods, nothing
overly complicated, and AFAIK it is not possible to patent those (at least
not in Europe).

Thanks for the interest. Can I hire you as a patent attorney? (grin)

Nils
www.simdesign.nl


"Henry A. Eckstein" wrote in message
news:40258627$2...@newsgroups.borland.com...


> The above sound like searching for self-similarity on luminance and
> chromacity channels which in effect is Fractal Compression:
>
> Sorry to spoil the fun, but here are the patents for fractal compression
> using iterative numeric functions to find self-similar pixels:
> The ones marked with *** (3 asterisks) relate full to the fractal
> algorithm itself while others relate to the edge enhancement and edge
> detection using by fractal encoders.
>
> 6,356,973: Memory device having a cyclically configured data memory
> and having plural data portals for outputting/inputting data
> ***6,330,367: Image encoding and decoding using separate hierarchical
> encoding and decoding of low frequency images and high frequency edge
images

<snip>


John

unread,
Feb 14, 2004, 9:29:12 AM2/14/04
to
"Nils Haeck" <n.ha...@spamchello.nl> wrote

> > Well, on average any compression algorithm will actually increase the
> > size.

> That really depends on the random generator. Some random data might look
> random (looking at the histogram) but may actually contain some
> redundancy. A good compression algorithm will find that and use it.
>
> Also, since the compression algorithm can add some expert knowledge
> through its definition, it may be able to even compress the most random
> data, simply by luck.

Sorry, Murphy rules in this universe. Not only can you not break even, you
can only lose in the long run (remember, you are adding information). Let's
try 3 bits of data for starters. Find me an algorithm that will on average
express 3 bits of information (8 values) in less than three bits. Once you
realize the impossibility, it's trivial to extend it to any bit size,
specified or not.

Why do we employ compression then? Because not all values are equally
important to us personally. Compression works by prioritizing data, or by
picking a narrower (optimum or close to optimum) domain. In theory, it has
nothing to do with internal representation. The best compression algorithm
is actually a dumb algorithm that merely knows the frequency of values. Take
Google image search, for instance. Pick the most frequent image. You can
compress it into 1 bit (1) trivially. That has nothing to do with the
content. Now pick the next most frequent. That's 2 bits (01), and so on and
so forth.
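As a side note, here is a minimal sketch of that frequency-ranking idea
(Python, toy data and made-up names; a real coder would use Huffman or
arithmetic codes, but the principle is the same):

from collections import Counter

def rank_codes(samples):
    """Assign '1', '01', '001', ... to items in order of decreasing frequency."""
    ranked = [item for item, _ in Counter(samples).most_common()]
    return {item: "0" * rank + "1" for rank, item in enumerate(ranked)}

if __name__ == "__main__":
    # Toy "image archive": one picture is requested far more often than the rest.
    requests = ["sunset"] * 80 + ["cat"] * 15 + ["invoice_scan"] * 5
    codes = rank_codes(requests)
    total_bits = sum(len(codes[r]) for r in requests)
    print(codes)                    # {'sunset': '1', 'cat': '01', 'invoice_scan': '001'}
    print("average bits per request:", total_bits / len(requests))  # 1.25

The frequent item costs one bit, rarer items cost more, and the average
stays low as long as the frequency guess is right.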

However, if you don't know the frequencies, you have to guess, which is
where intelligence comes in. If most images are outdoors, compress the sky
separately from the rest... etc. But you have to make certain assumptions,
which will fail miserably in more situations than they work. If you are
lucky, people will use it in the situations where it actually works.


Nils Haeck

unread,
Feb 14, 2004, 6:46:48 PM2/14/04
to
Hi John,

Indeed, you cannot compress all "random" data, that's why I mentioned that
some scientist would prove me wrong.

Actually the proof that you mentioned is indeed simple, and called the
Counting Theorem. Think of a file you want to compress as a string of bits;
for instance, a 10-byte file has 80 bits, each of which is either 0 or 1.

The total number of different combinations is thus 2^80, so, at least in
theory, there are 2^80 different possible files. Now, a compression
algorithm that was guaranteed to always compress to a smaller number of bits
would have to map these 2^80 files onto the bit strings shorter than 80
bits, and there are only 2^80 - 1 of those. So at least two files would end
up with the same compressed output, and the algorithm could not reproduce
all 2^80 files with certainty.
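Here is the same argument as a tiny check, using your 3-bit example (a
Python sketch):

from itertools import product

def all_bit_strings(length):
    """All bit strings of exactly the given length."""
    return ["".join(bits) for bits in product("01", repeat=length)]

if __name__ == "__main__":
    n = 3
    inputs = all_bit_strings(n)                                  # 2^3 = 8 inputs
    shorter = [s for k in range(n) for s in all_bit_strings(k)]  # 2^3 - 1 = 7 outputs
    print(len(inputs), "inputs but only", len(shorter), "shorter outputs")
    # The same count works for any n: 1 + 2 + ... + 2^(n-1) = 2^n - 1 < 2^n.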

But I don't see the connection with Murphy: "random" to a human often isn't
that random to a machine. For instance if you manage to detect the actual
random generator that created the random data, you can compress that file to
one bit saying "Use generator X". You make "not important for us" sound as
if all compression is lossy, which is mostly not so, except for some image
formats like JPG and sound formats like MP3.

> However, if you don't know the frequencies, you have to guess, which is
> where intelligence comes in.

I don't agree that it is "guessing". As I found with images, even though
uncompressed images do tend to contain quite a bit of noise (randomness) in
the channels, they also contain structures that we humans can see quite
easily, but that conventional compression algorithms fail to see (and use),
because they usually do not treat the image data as spatial. Structural
information is, by the way, always present in images, except for artificial
"random noise" ones, but those will not make mankind very happy anyway.

My image compression algorithm makes optimal use of this structural
information and thus is able to compress images very efficiently. Currently
it is zeroing in on J2K (lossless Jpeg 2000). My format ISA is within 2% of
J2K size usually, but *without* use of (adaptive) arithmetic coding which
J2K uses. Arithmetic coding, esp. the adaptive variant, is a very
memory-expensive and time-consuming operation, so if it can be avoided that
is a great plus. I'm still working on it and I hope I can make some
announcements soon.

Thanks for the feedback, and kind regards,

Nils Haeck
www.simdesign.nl


"John" wrote in message news:402e218b$1...@newsgroups.borland.com...
> "Nils Haeck" wrote

Jens Gruschel

unread,
Feb 14, 2004, 7:51:45 PM2/14/04
to
Sorry, again this might not be relevant, but it just comes to my mind...

> But I don't see the connection with Murphy: "random" to a human often
> isn't that random to a machine.

What is "random"? Don't we call something "random" if we don't know what's
behind it? I mean something that's too complex to calculate (at least for
our brains). Like a dice falling onto the table. If you know its starting
point, its angle, speed, material of the table, wind direction and
strength... well, of course a scientist now says: "you cannot know
everything" (chaos theory), but if it drop it from one inch without drill...
(I still don't know what happens, but it's easier to guess). What I want to
say: some things aren't that random as they look like - what makes them
easier to compress.

> I don't agree that it is "guessing". As I found with images, even though
> uncompressed images do tend to contain quite a bit of noise (randomness)
> in the channels, they also contain structures that we humans can see
> quite easily, but that conventional compression algorithms fail to see
> (and use), because they usually do not treat the image data as spatial.
> Structural information is, by the way, always present in images, except
> for artificial "random noise" ones, but those will not make mankind very
> happy anyway.

Another thought: 16 x 16 pixel icons with, let's say, 16 colors. Everyone
hates to create such icons, because you are so limited it never looks the
way you want it to. But there are hundreds of different icons on your PC,
and they are distinguishable for us. In fact there are 2 ^ 1024 different
icons possible, or more than 10 ^ 300 (a number with over 300 digits), quite
a lot (it's even more impressive if you take larger images with more
colors). So we can assume (can we?) that many of them are rubbish, just
random pixels, not needed, and most likely they will never be drawn (and
cannot all be drawn, because to draw all of them you would need a bit more
time than you have in your life). So all we have to do is to separate the
ones that are more likely (that show something other than rubbish) from the
other ones. And now we can compress them. Of course finding such an
algorithm is very difficult, but our brain has a very good (lossy) built-in
compression algorithm (instead of storing 256 pixels it stores "green
flower" for the ICQ icon, "W" for the MS Word icon, etc).

Jens

P.S. This could be extended to the whole storage capacity of your computer.
Can you tell me how many different numbers (or bit patterns) can be stored
in 4096 Mbit (the number of bits my computer's RAM has)? It's a number with
more than 1,200,000,000 digits. If you wrote a simple program that fills
your memory with every possible bit combination, even on a super-high-speed
computer it would need more than 10^1000000 years to finish the job.
Unfortunately the universe is only about 10^10 years old. So most
combinations will never be stored inside your RAM (and "most" means "nearly
all"), just because they don't make any sense, are nothing more than random
data, or are not very likely. Theoretically this means you can compress your
RAM (and hard drive etc.) enormously.
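For what it's worth, those digit counts are easy to verify (a small Python
check; the RAM figure is estimated with log10 rather than built in full):

from math import log10

icon_states = 2 ** 1024                   # 16x16 pixels, 16 colours = 1024 bits
print("2^1024 has", len(str(icon_states)), "digits")           # 309 digits

ram_bits = 4096 * 2 ** 20                 # 4096 Mbit
ram_digits = int(ram_bits * log10(2)) + 1
print("2^(4096 Mbit) has about", ram_digits, "digits")         # ~1.29 billion digits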

Nils Haeck

unread,
Feb 14, 2004, 11:14:28 PM2/14/04
to
> So all we have to do is to separate the ones that are more likely (that
> show something other than rubbish) from the other ones. And now we can
> compress them.

What you say is very true. Suppose that of these 2^1024 icons only 1% are
realistic ones (that's still 10^298 according to your computation); then
with the right algorithm you could compress that 1% to, say, 10% of their
size, at the expense of roughly (100 - 10) x 1% / 99 = 0.9% extra for all
the other icons. Since those will most probably never be used, because they
look like noise, everybody would love the compression technique.
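A quick sanity check of that back-of-envelope figure (a Python sketch with
the percentages above as assumptions):

# Shrinking 1% of all icons to 10% of their size while the other 99% grow by
# about 0.9% leaves the average size unchanged, as the counting argument demands.
realistic_share, realistic_ratio = 0.01, 0.10
noise_share = 1 - realistic_share
noise_ratio = 1 + (1 - realistic_ratio) * realistic_share / noise_share
average = realistic_share * realistic_ratio + noise_share * noise_ratio
print(f"noise icons grow by {(noise_ratio - 1) * 100:.2f}%")   # ~0.91%
print(f"average size ratio: {average:.4f}")                    # 1.0000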

> Of course finding such an algorithm is very difficult, but our brain has
> a very good (lossy) built-in compression algorithm (instead of storing
> 256 pixels it stores "green flower" for the ICQ icon, "W" for the MS Word
> icon, etc).

I think the brain is not really a compression algorithm, but merely a
collection of non-linear detectors interconnected through a neural network.
Images are not really stored (not even lossily), just concepts. Concepts are
always related to other concepts we've seen in the past (e.g. if a caveman
in 1000 BC had seen an airplane, he would not have filed it under "just
another noisy airplane flying by at low altitude" but as a "magical ball of
brightness with the sound of thunder", perhaps noting the movement, perhaps
not).

Mimicking the brain to some extent with artificial neural networks is not
that hard, and is already done a lot, but building a lossless image
compressor on top of them would be quite a job. If only because neural
networks need non-linearity in their synapse behaviour in order to work, it
would be virtually impossible to *prove* that such a network is lossless.
Lossy compression would be another story, but since I'm after lossless
compression I do not use neural networks :)

Another disadvantage of a "dictionary" kind of approach, storing objects for
recognition such as "flower" or "letter W", would be the sheer size of the
dictionary (= compression + decompression program size).

By the way, I should try my compression method on an icon.. I guess, since
artificial images generally do not contain much noise, that it should
achieve impressive compression indeed.

Kind regards,

Nils

Jens Gruschel

unread,
Feb 15, 2004, 3:49:05 AM2/15/04
to
> I think the brain is not really a compression algorithm [...]

I agree with what you say. But isn't it some kind of compression if you
store (or remember) things based on other things you have already stored?

> By the way, I should try my compression method on an icon.. I guess, since
> artificial images generally do not contain much noise, that it should
> achieve impressive compression indeed.

Most compression formats don't work very well with such small files (have
you tried your format on a 1x1 image?). But why don't you try another
artificial image, maybe a rendered one without textures?

Jens

Camiel Wijffels

unread,
Feb 17, 2004, 5:24:43 PM2/17/04
to
"Nils Haeck" <n.ha...@spamchello.nl> schreef in bericht

news:401f...@newsgroups.borland.com...
> What do you mean by registering the source? How do you do that?
>
> Thanks, Nils

Hi Nils,
In Holland you register the code simply by going to your local tax office
and saying that you want to register it. They'll take the piece of paper
(or bound copy), register it for only EUR 5, and put it away in some dark
corner of the building. If you have trouble afterwards, you can always show
that you registered it earlier.
BTW, I have read all the posts with great interest. Congratulations.
--
Best regards, Camiel Wijffels


Epis

unread,
Feb 18, 2004, 3:54:42 AM2/18/04
to
Hi Nils,

Would your compression method help in comparing/sorting/recognizing images?

Epis


Nils Haeck

unread,
Feb 18, 2004, 4:56:56 AM2/18/04
to
Hi Epis,

> Would your compression method help in comparing/sorting/recognizing
> images?

No, the compression method would not really help there, at least, not with
what I would consider a very fast method.

By the way, I have already built an image comparison system in a shareware
application called "ABC-View Manager", which you can download here:
http://www.abc-view.com/abcview.html

It allows one to sort images by similarity, and can easily do this for up to
10,000 images at a time (or more, if you have some patience).

Here's a dedicated page about it:
http://www.abc-view.com/articles/article3.html

That tool is also written in Delphi. The "similar images" recognition can
also be sold as a separate engine.

Finally, in another project, we have created an image alignment tool. It
works perfectly for big scans (up to 600 DPI), and aligns a sample image
(possibly with markups etc, like filled-in forms) to a master image with
subpixel accuracy. If you're interested I can send a demo. Ideal for forms
recognition (ICR/OCR).

Kind regards,

Nils Haeck
www.simdesign.nl


"Epis" <thisisnot...@nospam.nop> wrote in message
news:4033...@newsgroups.borland.com...

Nils Haeck

unread,
Feb 18, 2004, 5:03:09 AM2/18/04
to
> BTW, I have read all post with great interest. Congratulations.

Thanks. I'm still working on it.

Right now, "ISA" is only 2% bigger than lossless "JP2", and that without
entropy encoding or adaptive arithmetic coding. It is usually about 75% of
PNG's size.

I'm still figuring out ways to wringle the last drops of water out of the
towel :)

If I succeed in getting it to smaller sizes than JP2 I will probably publish
more about it. Currently it makes no sense I think, nobody is waiting for
yet another format that is not smaller than existing ones.

Kind regards, Nils


Andrew Rybenkov

unread,
Feb 22, 2004, 12:37:45 AM2/22/04
to
> Right now it makes little sense, I think; nobody is waiting for
> yet another format that is not smaller than the existing ones.

That depends on encoding/decoding speed, of course.

--
Andrew Rybenkov.
