
New improvements to ultimate compression


Tim Bernard

Apr 25, 2003, 2:37:33 AM
OK, I've been working on this for the last 3 days straight and have improved
compression.

A 100-byte file now compresses to 1 byte instead of 7 bytes. My program takes
89 seconds to do this, but I can make it faster with assembly.

I need to talk to a lawyer to get patents. I heard IBM has a patent on all
compression; how do I get them to pay me?


Giovanni Motta

Apr 25, 2003, 2:54:20 AM

> 100 byte file now compresses to 1 byte instead of 7 bytes. my program takes
> 89 seconds to do this but i can make this faster with assembly.

If I send you 256 compressed files, can you send me back the 100-byte
originals?

G.
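Giovanni's 256-files challenge is the pigeonhole argument in miniature: a one-byte output has only 256 possible values, so any decompressor can recover at most 256 distinct 100-byte files. A quick sketch of the counting in plain Python (nothing here is from Tim's program):

```python
# A 1-byte "compressed" file can take only 256 distinct values,
# while there are 256**100 distinct 100-byte files.
outputs = 256 ** 1
inputs = 256 ** 100

# A decompressor maps each of the 256 outputs back to at most one
# input, so all but 256 of the 100-byte files are unrecoverable.
assert outputs < inputs
recoverable_fraction = outputs / inputs
print(f"recoverable fraction of 100-byte files: {recoverable_fraction:.3e}")
```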

Fredrik Melin

Apr 25, 2003, 3:25:36 AM
Why is it always about money?

If you find the *magic* number, the method to take down all mathematics as
we know it, why not make it public?

Wouldn't you rather see what people could evolve into with the new
information?
At least for me, there isn't enough money in the world to compare to that...
but that's me.

Regards
Fredrik

"Tim Bernard" <notmy...@server.com> wrote in message
news:NI4qa.75614$Si4....@rwcrnsc51.ops.asp.att.net...

Chris

Apr 25, 2003, 3:27:16 AM
Giovanni Motta <g...@ieee.org> wrote in news:3EA8DB95...@ieee.org:

send him 257 ;)

Tim Bernard

Apr 25, 2003, 3:31:35 AM
My compression works with 8-bit ASCII; post your files here. For a quicker
response, post some 7-byte compressed files; it will take a little longer for
me to run with 1-byte files.


"Giovanni Motta" <g...@ieee.org> wrote in message
news:3EA8DB95...@ieee.org...

Salvador Fandiño García

Apr 25, 2003, 5:25:04 AM
Tim Bernard wrote:
> My compression works with 8 bit ascii post your files here. for quicker
> response post some 7 byte compressed it will take a little longer for me to
> run with 1 byte files
>


We can do a very simple test to see if your compressor really works:

1 - we send you some 100-byte files to compress (around 300 files)

2 - you send us back the files compressed

3 - we send you one or two of the compressed files

4 - you send us back the original files for those (from step 3)


Compressing 300 x 100-byte files at 90 s each will take you 450 min
(7h30m), not so long.

Well, attached is a file with 300 different 100-byte lines.

Bye,

- Salva

compme.txt.gz

Giovanni Motta

Apr 25, 2003, 4:26:06 AM

Tim Bernard wrote:
> My compression works with 8 bit ascii post your files here. for quicker
> response post some 7 byte compressed it will take a little longer for me to
> run with 1 byte files

I am not in a rush; I'll wait for the time-consuming response. Getting
the answer is also important to you, since you can use it to speed up your
decompressor (see hint).
I would like to see the decompressed results of the 256 files containing
1 byte only, with the byte going from 0x00 to 0xFF.
If there is any byte that doesn't decompress correctly, just mark the
corresponding entry "Illegal".
I can use a table like:

Content Decompressed

00 0234123134.... 100 bytes in hex notation here.... F54D
01 Illegal
02 FD0ABC1233.... 100 bytes in hex notation here.... F000
.. ..
.. ..
FF DFAB010234.... 100 bytes in hex notation here.... D0AB

Giovanni

P.S. Hint to speed up your decompressor: build the table once and for all
and use table lookup to decompress. It rarely gets faster than that.


Willem

Apr 25, 2003, 4:28:46 AM
Tim wrote:
) Ok ive been working on this for the last 3 days straight and have improved
) compression.
)
) 100 byte file now compresses to 1 byte instead of 7 bytes. my program takes
) 89 seconds to do this but i can make this faster with assembly.

I can make it lots and lots faster for you easily.

First, the decompressor: Simply make a lookup table, indexed by the 1
compressed byte. This only takes 256*100 bytes of space and will be
blindingly fast.

Then, the compressor: Search through the lookup table until you find your
100-byte file and return the index. As you only have to search through
25600 bytes, this will also be quite fast, on the order of milliseconds.


SaSW, Willem (at stack dot nl)
--
Disclaimer: I am in no way responsible for any of the statements
made in the above text. For all I know I might be
drugged or something..
No I'm not paranoid. You all think I'm paranoid, don't you !
#EOT
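Willem's scheme is concrete enough to sketch. The table below is filled with made-up random entries, since any fixed 256-entry table can only ever cover 256 of the 256^100 possible 100-byte files, which is of course the joke:

```python
import os

# The fixed decompression table: 256 entries of 100 bytes each,
# 25600 bytes total, built once up front.
TABLE = [os.urandom(100) for _ in range(256)]

def decompress(compressed: bytes) -> bytes:
    # Blindingly fast: a single lookup indexed by the compressed byte.
    return TABLE[compressed[0]]

def compress(data: bytes) -> bytes:
    # Linear search through all 25600 bytes of the table; raises
    # ValueError for the other 256**100 - 256 possible files.
    return bytes([TABLE.index(data)])

original = TABLE[42]
assert compress(original) == bytes([42])
assert decompress(compress(original)) == original
```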

Salvador Fandiño García

Apr 25, 2003, 5:34:03 AM
Salvador Fandiño García wrote:

> why can do a very simple test to see if your compressor really works:
>
> 1 - we send you some 100byte files to compress (around 300 files)
>
> 2 - you send us back the files compressed

oops, I forgot to say that obviously you have to use the 100 to 1
compressor!


Bye,

- Salva


Tom St Denis

Apr 25, 2003, 6:14:55 AM
Giovanni Motta <g...@ieee.org> wrote in message news:<3EA8DB95...@ieee.org>...

STOP FEEDING THE IDIOT TROLL!

Janne Johansson

Apr 25, 2003, 7:47:08 AM
Willem <n...@sp.am.invalid> writes:

> ) Ok ive been working on this for the last 3 days straight and have improved
> ) compression.
> )
> ) 100 byte file now compresses to 1 byte instead of 7 bytes. my program takes
> ) 89 seconds to do this but i can make this faster with assembly.
>
> I can make it lots and lots faster for you easily.
>
> First, the decompressor: Simply make a lookup table, indexed by the 1
> compressed byte. This only takes 256*100 bytes of space and will be
> blindingly fast.
>
> Then, the compressor: Search through the lookup table until you find your
> 100-byte file and return the index. As you only have to search through
> 25600 bytes, this will also be quite fast, in the order of milliseconds.

But, but... But then it is easy for a child to see that it doesn't
work for all inputs. This will utterly shatter his perception of the
undoable...

Alex Sisson

Apr 25, 2003, 8:34:02 AM
"Tim Bernard" <notmy...@server.com> wrote in message news:<NI4qa.75614$Si4....@rwcrnsc51.ops.asp.att.net>...

But surely if you run the compressor on it again you could make it 1 bit.
That would be much better.
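Alex's joke has teeth: if a lossless compressor really shrank every input, applying it repeatedly would drive any file down to a single byte, which is one way to see that no such compressor can exist. A sketch with a stand-in `bogus_compress`, invented here purely for illustration:

```python
def bogus_compress(data: bytes) -> bytes:
    # Stand-in for the claimed always-shrinking compressor; no such
    # lossless function can exist, which is exactly the point.
    return data[:-1] if len(data) > 1 else data

data = bytes(100)  # a 100-byte file of zeros
rounds = 0
while len(data) > 1:
    data = bogus_compress(data)
    rounds += 1
print(rounds, len(data))  # 99 rounds later: a single byte left
```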

mmaniscalco

Apr 25, 2003, 11:02:34 AM
There is a very simple way to put this nonsense to bed, and no data needs
to exchange hands at all.

Tim, I have a stream which is one byte (8 bits) in size. The value of that
byte is zero. That is, 8 bits all set to zero.

What was the original 100-byte stream?

-Michael A Maniscalco

"Tim Bernard" <notmy...@server.com> wrote in message
news:NI4qa.75614$Si4....@rwcrnsc51.ops.asp.att.net...

Gib Bogle

Apr 25, 2003, 9:01:10 PM
Tim Bernard wrote:

> Ok ive been working on this for the last 3 days straight and have improved
> compression.
>
> 100 byte file now compresses to 1 byte instead of 7 bytes. my program takes
> 89 seconds to do this but i can make this faster with assembly.

I'm sure if you tried a bit harder you could get 100 bytes to compress
to 0 bytes. THAT would be ultimate compression.

Gib

Kelsey Bjarnason

Apr 26, 2003, 12:16:49 AM

Bah; I do that regularly. Only problem is, it's lossy. I think the
program to do this is called 'rm'...

Frances Nostrodamus

Apr 26, 2003, 2:59:02 AM
$BM5....@news.primus.ca...

> There is a very simple way to put this nonsense to bed, and no data
> needs to exchange hands at all.
>
> Tim, I have a stream which is one byte (8 bits) in size. The value of
> that byte is zero.
> That is, 8 bits all set to zero.
>
> What was the original 100 byte stream?
>


Here is the decompression of your sample file. This should prove all
naysayers wrong and encourage discussion of potential uses of my program:

"True! nervous, very, very dreadfully nervous I had been and am; but why
will you say that I am mad? "

Mikael Lundqvist

Apr 26, 2003, 4:22:08 AM

And it's possible to get back the original, without any loss!
Just copy it to a floppy or a CD first! ;-)

Well, seriously, I think it's the *definition* of compression which is
the problem here. I've not read anywhere that he can recreate the original.

/Mikael

Dale King

Apr 28, 2003, 1:37:08 PM
"Mikael Lundqvist" <mlqn...@telia.com> wrote in message
news:b8dfje$8r47o$1...@ID-87439.news.dfncis.de...


He said he could do it losslessly, which by definition is supposed to mean


he can recreate the original.

--
Dale King

