
Harrington's Compression Method Revisited *EXTREMELY LONG*


Einstein

Jun 12, 2006, 6:57:26 AM
Ok, here is a new white paper taking the core of my theory and applying
it alone. In here basically it could be possible to get something down
to, say, 50 bits in size, but that's so highly unlikely... instead
information in compression is like an elastic band... it stretches and
pulls... and finds a medium specific for it. In this manner it is
possible to compress random binary data with no 2 types of data ending
up with the same results. In fact we have to use this variation on files
greater than 8 megabytes for it to have a very reasonable chance to
compress data. However anyone following this can and will achieve
compression of random binary data. This is a White Paper by Michael
Harrington, all rights reserved, copyright © June 2006.

8 megabytes is of course not the smallest; in fact we could probably do
6 megabytes with ease, given natural variation in all sequences, and
cover 90% of the code existing at or above 6 megabytes, but the
absolute worst case is 8 megabytes. Some terms you might be familiar
with don't appear here: regressive compression, statistical modeling,
etc... I avoid those since I created this without those terms being
part of my thought process. However, it does use regressive compression
first, then compresses. Without regressive compression we could not,
and would not, get compression.

I will also note this variation of the main invention SUCKS for
compression. 0.3% is NOT a high number, nor probably one that will make
people sit up and go eureka... but it's on random binary data, and it
works.


NOTE: Due to length, I am forced to cut this into about 12 parts, I
apologize... but my dialup just won't handle this! Hahaha... jeez I
can't wait for universal data compression in transmissions :P

The basis of the main invention is a means to create, from a 4 bit
group, one to two outputs, each output going to a different section.
Think of these two outputs as temp files. They are worked separately
from each other until near the end. Then we merge the changes. The first
file has information appended normally, left to right. The 2nd file has
it appended right to left, in utter reverse. In this manner, when we
merge the two files, they become decompressible.

Einstein

Jun 12, 2006, 6:59:02 AM

0000 = 00000
0001 = 00001
0010 = 0001 & 0
0100 = 001 & 00
1000 = 01 & 000
0011 = 1 & 0000
0101 = 0001 & 1
1001 = 001 & 01
0110 = 01 & 001
1010 = 1 & 0001
1100 = 001 & 1
1110 = 01 & 01
1101 = 1 & 001
1011 = 01 & 1
0111 = 1 & 01
1111 = 1 & 1

In this manner, using both sections, we can track each outcome backwards
to the appropriate original source. The way this works is best imagined
as 2 temp files; each section goes to one temp file. Only the 2nd group
goes in reverse order (so, if chronological from above, top to bottom,
we have 1 01 1 001 01 1 0001 001 01 1 0000 000 00 0). By reading it in
this manner, we can easily determine the proper order of things.
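For readers who want to experiment with the mechanics just described, here is a minimal Python sketch that does nothing more than transcribe the 16-entry table above and build the two sections, the second one in reverse. It is only an illustration of the bookkeeping in this post; it makes no claim that the mapping is uniquely decodable or that it compresses anything, and the function name is mine, not the white paper's.

# Literal transcription of the table above: 4 bit group -> (1st section, 2nd section).
# None marks the two single-output rows (0000 and 0001).
TABLE = {
    "0000": ("00000", None), "0001": ("00001", None),
    "0010": ("0001", "0"),   "0100": ("001", "00"),
    "1000": ("01", "000"),   "0011": ("1", "0000"),
    "0101": ("0001", "1"),   "1001": ("001", "01"),
    "0110": ("01", "001"),   "1010": ("1", "0001"),
    "1100": ("001", "1"),    "1110": ("01", "01"),
    "1101": ("1", "001"),    "1011": ("01", "1"),
    "0111": ("1", "01"),     "1111": ("1", "1"),
}

def split_into_sections(bits):
    """Process 4 bits at a time; append 1st parts left to right, and build
    the 2nd temp file in reverse (newest part goes on the left)."""
    first, second = "", ""
    for i in range(0, len(bits), 4):
        a, b = TABLE[bits[i:i + 4]]
        first += a                  # appended normally, left to right
        if b is not None:
            second = b + second     # appended "right to left, in utter reverse"
    return first, second

print(split_into_sections("111100110000"))   # -> ('1100000', '00001')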

Now we don't always do this... in fact, we only do it 50% of the
time. 50% of the outcomes actually don't compress well at all, and we
instead just use a section called count to handle identifying which we
will compress, and which we won't, as well as list the new results for
those we won't. In this manner we now have a section devoted to the
entire amount. We also need a section to tell us exactly how many
chunks of 16 bits we are processing. So we introduce the command
section.

The command section makes a count of all 16 bit groups, and also adds a
value giving the extra 0's if the whole is not divisible by 16. This
comes out to a binary figure. We now increase the size of this figure
until its length is divisible by 4 (or we could do 8, or whatever value
we want; 4 is my easy-to-track method at this time). This count of
additional 0's is kept separate from the count of the 16 bit groups, and
takes up 4 bits of total space.

Now we split this value up into chunks of 4. Each time there is still
another chunk of the command section's count of the 16 bit groups to
come, we place a 0, indicating continue. When we reach the last one, we
follow it with a 1, indicating that we are done with the count. This
means that for every, say, 4096 bits, we have 15 bits to identify the
whole. This is a size increase factor of 1.003662109375 on the whole. We
will not be worried, though this is a significant portion of our efforts.
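As a sanity check on the 15-bit figure, here is a small Python sketch of one reading of that chunked count: write the group count in binary, left-pad with 0's until its length is divisible by 4, then emit each 4-bit chunk followed by a 0 (more chunks follow) or a 1 (done). The function name is mine, and this is only my interpretation of the scheme, not code from the white paper.

def encode_group_count(num_16bit_groups):
    # Binary count, left-padded until its length is divisible by 4.
    bits = format(num_16bit_groups, "b")
    while len(bits) % 4:
        bits = "0" + bits
    chunks = [bits[i:i + 4] for i in range(0, len(bits), 4)]
    out = ""
    for i, chunk in enumerate(chunks):
        out += chunk + ("1" if i == len(chunks) - 1 else "0")   # continue/done flag
    return out

code = encode_group_count(4096 // 16)    # 4096 bits of input = 256 groups of 16 bits
print(code, len(code))                   # 15 bits for the count
print(15 / 4096)                         # 0.003662109375, the overhead quoted above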

Now when we run the following, we get 0.005692046 less in data.
Therefore our actual savings is going to be a 1 to 0.997949219 ratio,
meaning a VERY SMALL trade-off. Therefore, to see actual compression
using a library method (next paragraph), we need 63,913,200 bits, or
7,989,150 bytes, which we will round to basically 8 megabytes. This
means that statistically this is the smallest we can compress. In fact,
though, we could conceivably get some files down to as small as, say,
50 bits. This is highly unlikely and would take a serious cyclic motion
in the code to arrange. Further research is needed to create a proper
cyclic motion, but it's elementary in nature, requiring we swap values
around from time to time with a simple 4 bit switch, to give said
values opportunities to land elsewhere.

Now we need a dictionary method to handle the outcomes actually
occurring the most, and make them the actual compressing items, in
proper order. So what we do is assign 32768 groups of 4 bits, in proper
order, to whichever outcomes, in proper order, are actually occurring
the most. We then replace them with the first column (in order) down
below. The remaining 32768 outcomes follow a simplistic default order
where they are replaced, according to their number value and in proper
order, with the remainder (so if 0000000000000000 is the first unused
value, it is replaced with 1101111100100100, since this is the first
replacement section). It does not matter if 0000000011111111 is first,
0000000000001111 is first, or whatever was first in the sequence of
uncompressible results; they are defaulted in proper order (not posted,
to keep this page as short as possible)

Einstein

Jun 12, 2006, 7:03:57 AM
Well crap... I can't get the code to fit :(

Ok, here is another attempt... I will post it to my website, and
have the URL placed here... it will be RARed.

Einstein

Jun 12, 2006, 7:30:52 AM

Joshua Shinavier

Jun 12, 2006, 1:51:01 PM
Einstein wrote:
> http://www.security1.free2host.net/whitepaper.rar
>
> This is the Url to the Rar file

Your paper reads like a continuation of a newsgroup thread. I think
you should give your reader a better idea of what they're reading and
why they're reading it.

Barb Knox

Jun 12, 2006, 5:18:17 PM
In article <1150134661....@u72g2000cwu.googlegroups.com>,
"Joshua Shinavier" <par...@gmail.com> wrote:

> Einstein wrote:
> > http://www.security1.free2host.net/whitepaper.rar
> >
> > This is the Url to the Rar file
>
> Your paper reads like a continuation of a newsgroup thread. I think
> you should give your reader a better idea of what they're reading

They're reading a document which claims to describe an algorithm that
(among other things) can compress purely RANDOM binary data by 0.3%:

"0.3% is NOT a high number, nor probably one that will make
people sit up and go eureka... but it's on random binary data, and it

works". This is of course provably impossible (a well-known result from
Shannon in the 1950's).

> and why they're reading it.

For entertainment value only.

Matthijs Hebly

Jun 12, 2006, 5:29:04 PM
Einstein wrote:
> Ok, here is a new white paper taking the core of my theory and applying
> it alone. In here basically it could be possible to get something down
> to, say, 50 bits in size, but that's so highly unlikely... instead
> information in compression is like an elastic band... it stretches and
> pulls... and finds a medium specific for it. In this manner it is
> possible to compress random binary data...

Einstein, my dear chap, it is not, and will never be possible to
compress all random data. This has been pointed out to you on several
occasions. Let me make a last attempt:

How many different compressed texts of N bits can you have? Let's see:

0 bits -> can decompress to at most 1 plain text.

1 bit -> can decompress to at most 2 plain texts (compressed text
(binary) "0" stands for one plain text, "1" for the other).

2 bits -> can decompress to at most 4 plain texts ("00" stands for
one plain text, "01" for one, "10" for one, and finally "11" stands for
exactly one plain text).

See the pattern here? Here it comes:

N bits -> can decompress to at most 2^N plain texts.

If you want to calculate how many plain texts can correspond to N *or*
*less* compressed bits, just add the numbers:

0 bits -> can decompress to at most 1 plain text.

1 bit or less -> can decompress to at most 1 + 2 = 3 plain texts.

2 bits or less -> can decompress to at most 1 + 2 + 4 = 7 plain texts.

See the pattern here?

N bits or less -> can decompress to at most 2^0 + 2^1 + ... + 2^(N-1)
+ 2^N = 2^(N+1) - 1 plain texts.

Now suppose I have all different 2^(N+1) plain texts of size (N+1) bits
that I want to compress. Compression means that I want to decrease their
size by at least 1 bit, right? So I want to compress them all to become
N bits at the most (since they are all (N+1) bits, this is what we want).

How many compressed texts should this result in? Of course, 2^(N+1)
different compressed texts, since I started with 2^(N+1) different plain
texts.

But as we saw, there can only be 2^(N+1) - 1 compressed texts of N bits
or less!

So we have 1 plain text *too* *many* here, which we can never compress!!!

Practical example (since I'm in a good mood):

Let N be 2 (but remember: this goes for *any* N):

How many compressed texts of 2 bits can you have? 4! (00, 01, 10, and 11).

How many compressed texts of 2 bits or less can you have? 7! (00, 01,
10, 11, 0, 1, and the empty string).

Now I construct all possible plain texts of N+1 = 3 bits:

000, 001, 010, 011, 100, 101, 110, and 111.

That's 8 plain texts, right?

You claim you can compress them *all*.

But there are only *7* (= 8-1) possible shorter (compressed)
representations for these *8* plain texts:

empty string, 0, 1, 00, 01, 10, 11.

How, on earth, do you map *8* plain texts to *7* compressed versions of
them?

And remember: this goes for *any* N!!!

You say you need plain texts of at least 8 MB. That's 8 * 1024 * 1024 *
8 = *67108864* bits each, so there are 2^67108864 possible plain texts
of 8 MB, right?

But you want to map them to at most 2^67108864 - 1 possible
compressed texts.

How do you do this? How do you account for the one that you could not
compress?

See also http://en.wikipedia.org/wiki/Lossless , under "Lossless data
compression must always make some files longer".

I rest my case.
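
(For anyone who would rather see the counting done by machine, here is a tiny Python check of the N = 2 case above. It simply enumerates the 8 plain texts of 3 bits and the 7 candidate compressed texts of 2 bits or fewer; the pigeonhole does the rest. It is only an illustration of the argument already given.)

from itertools import product

# All plain texts of exactly N+1 = 3 bits.
plain = ["".join(p) for p in product("01", repeat=3)]

# All candidate compressed texts of N = 2 bits or fewer (empty string, 1 bit, 2 bits).
compressed = [""] + ["".join(p) for n in (1, 2) for p in product("01", repeat=n)]

print(len(plain), "plain texts")              # 8
print(len(compressed), "compressed texts")    # 7 = 2^3 - 1

# 8 inputs cannot be mapped one-to-one into 7 outputs,
# so at least one 3-bit string cannot be made shorter.
assert len(plain) > len(compressed)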

> ...with no 2 types of data ending

Matt Mahoney

Jun 12, 2006, 5:41:28 PM
Einstein wrote:
> http://www.security1.free2host.net/whitepaper.rar
>
> This is the Url to the Rar file

There is a mistake in your second table (the real long one).

-- Matt Mahoney

Einstein

Jun 13, 2006, 8:10:11 AM

Question, Matt: where specifically, and what is it? The method I used to
write the table allows for no errors...

Matt Mahoney

Jun 13, 2006, 9:05:09 AM

Instead of listing the table, describe the procedure for generating it.
Then the answer will be obvious.

-- Matt Mahoney

maud

Jun 15, 2006, 9:13:15 AM

Einstein wrote in part

> ...


> anyone following this can and will achieve compression of
> random binary data.

> ...

I see you are still hammering away at this. Go hard d00d :)

Your work is OK as far as it goes. I can actually see what you are
trying to do. You'll get the coding of your method worked out sooner or
later. IMV your method is overly-complicated for something that can be
alternatively coded quite trivially in a couple of dozen lines for
similar results, but thats neither here nor there.

What's not OK is your claim stated above. It is simply not true. Please
stop claiming this. It's giving a lot of people a sore head, including
me. Please modify your claim to something along the lines of:

"anyone following this can and will achieve compression of some sets of
random binary data."

Please note the important difference between the two claims. This
latter claim is true of any method of de/compression when "random" is
properly defined as any value falling within known lower and upper
bounds. "Random" must be defined thus if the values are to be contained
within the bounds of a physical device.

Taking this as read, the only thing that matters to anyone is the
answer to the following question:

Given all sets of possible values contained in a stream or file of n
length, how many of these sets can be reduced so that they occupy less
space than the original and then fully restored without error?

You have suggested n = 8MB as a likely candidate for your method. My
suggestion is this.

a) Create all sets of possible values that occupy 8MB space.
b) Compress and decompress them all without error using your method.
c) Count up how many sets when compressed result in a smaller space
than 8MB.

Now, it may take some time but hey! thats combinatorics for you. Should
you be able to show numbers less than the trivial norm, for both space
savings and count, at the end of this exercise then all reasonable
people will give you credit for your work.

I don't want to rain on your parade, as I do like folk who are as keen
as mustard. But when you do actually go through the steps above using
your method, you will find that the numbers won't be as good as you
would have hoped. Nevertheless, as I said at the start, go hard d00d.
After all, the proof of the pudding is in the eating :)

maud.

David

Jun 15, 2006, 9:54:29 AM
"maud" <maudli...@hotmail.com> writes:

> Einstein wrote in part

> > ...
> > anyone following this can and will achieve compression of
> > random binary data.
> > ...

> I see you are still hammering away at this. Go hard d00d :)

> Your work is OK as far as it goes. [...]

> You have suggested n = 8MB as a likely candidate for your method. My
> suggestion is this.

> a) Create all sets of possible values that occupy 8MB space.

Bzzzt! You just qualified as a lifetime member of the Innumerati.

tc...@lsa.umich.edu

Jun 15, 2006, 9:59:30 AM
In article <1150377194....@i40g2000cwc.googlegroups.com>,

maud <maudli...@hotmail.com> wrote:
>a) Create all sets of possible values that occupy 8MB space.
>b) Compress and decompress them all without error using your method.
>c) Count up how many sets when compressed result in a smaller space
>than 8MB.
>
>Now, it may take some time but hey! thats combinatorics for you.

No kidding. 8MB is 64 million bits, so there are 2^(64 million) possible
values that occupy 8MB space. Even if you handle a googol cases per
femtosecond, you would handle a negligible fraction of all cases in a
googol universe lifetimes.
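
(A quick back-of-the-envelope check of that in Python, working entirely in base-10 exponents and taking roughly 4e17 seconds as a stand-in for a universe lifetime; the conclusion survives any reasonable choice of constants.)

from math import log10

# Everything below is a base-10 exponent (log10 of the quantity).
total   = 64_000_000 * log10(2)    # number of 8MB bit patterns: 2^(64 million)
handled = 100 + 15 + 17.6 + 100    # googol/fs * 10^15 fs/s * ~4e17 s * googol lifetimes

print(f"patterns to test : 10^{total:,.0f}")     # about 10^19,265,920
print(f"patterns handled : 10^{handled:.1f}")    # about 10^232.6
print(f"shortfall        : 10^{total - handled:,.0f} times too few")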
--
Tim Chow tchow-at-alum-dot-mit-dot-edu
The range of our projectiles---even ... the artillery---however great, will
never exceed four of those miles of which as many thousand separate us from
the center of the earth. ---Galileo, Dialogues Concerning Two New Sciences

maud

Jun 15, 2006, 11:09:23 AM

Thankyou Mr President. I accept your nomination :)

maud.

Einstein

Jun 15, 2006, 11:19:08 AM
All Possibilities? You're evil, evil I say! Evilllllll! Now you make the
software to generate all those, and the software to run my method vs
them!

Frank Ch. Eigler

Jun 15, 2006, 12:21:59 PM

"Einstein" <mich...@gmail.com> writes:

How about a trade. I'll write you the former (generating all 8MB
input files) - if you write up the latter (running your method against
any single one).

- FChE

maud

Jun 15, 2006, 12:38:08 PM

tc...@lsa.umich.edu wrote:
> In article <1150377194....@i40g2000cwc.googlegroups.com>,
> maud <maudli...@hotmail.com> wrote:
> >a) Create all sets of possible values that occupy 8MB space.
> >b) Compress and decompress them all without error using your method.
> >c) Count up how many sets when compressed result in a smaller space
> >than 8MB.
> >
> >Now, it may take some time but hey! thats combinatorics for you.
>
> No kidding. 8MB is 64 million bits, so there are 2^(64 million) possible
> values that occupy 8MB space. Even if you handle a googol cases per
> femtosecond, you would handle a negligible fraction of all cases in a
> googol universe lifetimes.

Probably be a good idea if we went for lunch first then :)

maud.

Matt Mahoney

Jun 15, 2006, 5:19:33 PM

Here is an easier problem. I generate one 8 MB file and post it. You
compress it and post an archive containing the compressed file and a
decompression program, such that the archive is at least one byte
smaller than the file I post. Follow the rules of
http://mailcom.com/challenge/ for creating the archive and decompressor
(substituting the test file for the Calgary corpus).

Your decompression program should not be very large, so you don't need
much compression. But just in case, if you prefer a different size
test file, let me know how much data you would like, between 1 byte and
100 MB, and I will create it for you. Your archive only has to be 1
byte smaller than the test file. Since you can compress any file over
8 MB, then all you have to do is recursively compress the test file
until it is 8 MB, and that leaves you 92 MB for the decompression
program.

To make it easier still, I will tell you how I will create the test
file. I will use the program sharnd.exe to create it. You can find
the program here: http://cs.fit.edu/~mmahoney/compression/#sharnd

Let me know when you are ready to accept the challenge. If you can
pass this test, I think everyone will agree that you were right and
that thousands of mathematicians and data compression experts who
stubbornly believed that recursive or random compression is impossible
were wrong.

-- Matt Mahoney

Phil Carmody

Jun 16, 2006, 2:31:51 AM
"Matt Mahoney" <matma...@yahoo.com> writes:
> Your decompression program should not be very large, so you don't need
> much compression. But just in case, if you prefer a different size
> test file, let me know how much data you would like, between 1 byte and
> 100 MB, and I will create it for you. Your archive only has to be 1
> byte smaller than the test file.

I'd like 1 byte please! I've already sent you the archive.

I'm afraid I can't give technical support on how to run the decompressor.

:-)

Phil
--
The man who is always worrying about whether or not his soul would be
damned generally has a soul that isn't worth a damn.
-- Oliver Wendell Holmes, Sr. (1809-1894), American physician and writer

maud

Jun 16, 2006, 8:22:51 AM

Einstein wrote:
> All Possibilities? You're evil, evil I say! Evilllllll!

maudlin: wow! nxt lvl, evil innumerati. woohoo!

> Now you make the software to generate all those,
> and the software to run my method vs them!

dealer: Dealing new hand.
dealer: einstein opens.
dealer: frankyrich calls.
dealer: mattdamoney raises.
maudlin: arg!
dealer: maudlin folds.
dealer: einstein, the action is on you.
dealer: einstein, you have 10 seconds to act.
dealer: einstein, you have 5 seconds to act.
carmo [observer]: c'mon
dealer: einstein failed to act in time and has been folded.
railbird [observer]: damn

:)

maud.

Matt Mahoney

Jun 16, 2006, 11:05:20 AM
Phil Carmody wrote:
> "Matt Mahoney" <matma...@yahoo.com> writes:
> > Your decompression program should not be very large, so you don't need
> > much compression. But just in case, if you prefer a different size
> > test file, let me know how much data you would like, between 1 byte and
> > 100 MB, and I will create it for you. Your archive only has to be 1
> > byte smaller than the test file.
>
> I'd like 1 byte please! I've already sent you the archive.
>
> I'm afraid I can't give technical support on how to run the decompressor.
>
> :-)
>
> Phil

That's OK. I figured it out. Here is your byte: x

I named the empty archive you sent me "d.bat.x3x.x3v.x4a.x4h.x1g.x4q"
and decompressed it with 6 recursive cycles of barf (from
http://www.cs.fit.edu/~mmahoney/compression/barf.html )

That gave me a decompressor named d.bat. When I ran it, the output
was: x

I guess einstein was right all along :-)

-- Matt Mahoney

maud

Jun 16, 2006, 11:13:13 AM

Phil Carmody wrote:

> I'd like 1 byte please! I've already sent you the archive.

dealer: Dealing new hand.
dealer: carmo opens.
dealer: maudlin calls.
maudlin: hey carmo, bet i can tell u wut u got?
dealer: carmo, you have 10 seconds to act.

maud.

maud

Jun 16, 2006, 11:42:56 AM

Matt Mahoney wrote in part:

> Here is an easier problem. I generate one 8 MB file and post it. You
> compress it and post an archive containing the compressed file and a
> decompression program, such that the archive is at least one byte
> smaller than the file I post. Follow the rules of
> http://mailcom.com/challenge/ for creating the archive and decompressor
> (substituting the test file for the Calgary corpus).

> ...


> To make it easier still, I will tell you how I will create the test
> file. I will use the program sharnd.exe to create it.

> ...

dealer: Dealing new hand.
dealer: mattdamoney opens.
maudlin: hey matt, i'll call if u take side bet as well
dealer: mattdamoney, you have 10 seconds to act.

maud.

Phil Carmody

Jun 16, 2006, 4:40:02 PM
"Matt Mahoney" <matma...@yahoo.com> writes:
> Phil Carmody wrote:
> > "Matt Mahoney" <matma...@yahoo.com> writes:
> > > Your decompression program should not be very large, so you don't need
> > > much compression. But just in case, if you prefer a different size
> > > test file, let me know how much data you would like, between 1 byte and
> > > 100 MB, and I will create it for you. Your archive only has to be 1
> > > byte smaller than the test file.
> >
> > I'd like 1 byte please! I've already sent you the archive.
> >
> > I'm afraid I can't give technical support on how to run the decompressor.
> >
> > :-)
> >
> > Phil
>
> That's OK. I figured it out. Here is your byte: x

Haha! I was expecting x - your random byte generator just isn't
random enough.



> I named the empty archive you sent me "d.bat.x3x.x3v.x4a.x4h.x1g.x4q"
> and decompressed it with 6 recursive cycles of barf (from
> http://www.cs.fit.edu/~mmahoney/compression/barf.html )

Not liking the recursive scheme any more, I prefer a Vectorised
Orthogonal Matrix Inversion Transform, which, being vectorised,
can achieve the same thing in one pass.

> That gave me a decompressor named d.bat. When I ran it, the output
> was: x
>
> I guess einstein was right all along :-)

Evidently so. I guess now he's been proved right, he has no need
to continue pushing his theora upon us, as we're all converted now.

Matt Mahoney

Jun 16, 2006, 5:51:19 PM

Phil Carmody wrote:
> "Matt Mahoney" <matma...@yahoo.com> writes:
> > Phil Carmody wrote:
> > > "Matt Mahoney" <matma...@yahoo.com> writes:
> > > > Your decompression program should not be very large, so you don't need
> > > > much compression. But just in case, if you prefer a different size
> > > > test file, let me know how much data you would like, between 1 byte and
> > > > 100 MB, and I will create it for you. Your archive only has to be 1
> > > > byte smaller than the test file.
> > >
> > > I'd like 1 byte please! I've already sent you the archive.
> > >
> > > I'm afraid I can't give technical support on how to run the decompressor.
> > >
> > > :-)
> > >
> > > Phil
> >
> > That's OK. I figured it out. Here is your byte: x
>
> Haha! I was expecting x - your random byte generator just isn't
> random enough.
>
> > I named the empty archive you sent me "d.bat.x3x.x3v.x4a.x4h.x1g.x4q"
> > and decompressed it with 6 recursive cycles of barf (from
> > http://www.cs.fit.edu/~mmahoney/compression/barf.html )
>
> Not liking the recursive scheme any more, I prefer a Vectorised
> Orthogonal Matrix Inversion Transform, which, being vectorised,
> can achieve the same thing in one pass.

Yeah, I'm working on a faster version using Heuristic Unicyclic
Reciprocal Lattices based on the theories of Primitive Uniform
Kolmogorov Entropy and Generalized Anisotropic Groups. :-)

> > That gave me a decompressor named d.bat. When I ran it, the output
> > was: x
> >
> > I guess einstein was right all along :-)
>
> Evidently so. I guess now he's been proved right, he has no need
> to continue pushing his theora upon us, as we're all converted now.

Yep, somebody ought to tell him he can stop working 16 hour days and
fire his programmer who just can't get those bugs out. Yeah, we're
convinced now...

-- Matt Mahoney

maud

Jun 16, 2006, 8:06:03 PM

maud wrote:

maudlin: still in the tank?
maudlin: i'll restate
maudlin: i know you have empty archive, thats 1 hole card
maudlin: but you have byte holecard also
maudlin: dealer_matt gave you x in your byte
maudlin: i can tell you what the value of x is
maudlin: odds are 255 to 1
maudlin: i've called at even money
maudlin: you have the best of it
maudlin: you can't get better pot odds than that
maudlin: check, bet or fold d00d :)
dealer: carmo, its your turn.
maudlin: any railbirds/observers want a piece?
maudlin: i will take anyone's action at even money
maudlin: can't say fairer than that now can I?
maudlin: i'll play for real dollars if anyone is up for it
maudlin: i'm either the freshest <%)}}})>< ever
maudlin: or a great big salt-water crocodile
maudlin: take your pick ;)
dealer: carmo, the action is on you.
dealer: carmo, you have 5 seconds to act.

maud.

Phil Carmody

Jun 16, 2006, 8:13:43 PM
"maud" <maudli...@hotmail.com> writes:
> dealer: carmo, you have 5 seconds to act.

Hmmm, yawn, or plonk, I can't decide.

Phil Carmody

Jun 16, 2006, 8:17:22 PM

I find that hard to swallow.

However, I will enjoy scrutinising your progress - my own
bleugh watch project.

maud

Jun 16, 2006, 8:49:22 PM

maudlin: i'll make it even easier. i call.
dealer: flop comes Ad-Ac-5s.
maudlin: hmm! i put you on pocket Ah-As
maudlin: got to be with that monster challenge
maudlin: nevermind, i got my favourite pockets 6s-2s
maudlin: true, you probably got a made hand.
maudlin: quad AAAA is a real nice hand
maudlin: but my favourite pockets have never let me down
maudlin: i got 1 out. runner runner 3s-4s. whats the odds
maudlin: not that it matters, i'm going allin anyway
maudlin: for any amount of real money you care to nominate
maudlin: check, bet or fold, d00d :)
dealer: mattdamoney, its your turn.
maudlin: any railbirds/observers want a piece of this?
maudlin: i will cover any amount of real money anyone wants to put up
maudlin: <%)}}}})>< or crocodile? ;)
dealer: mattdamoney, the action is on you.
dealer: mattdamoney, you have 5 seconds to act.

maud.

Brian Raiter

Jun 17, 2006, 2:51:12 AM
>> Not liking the recursive scheme any more, I prefer a Vectorised
>> Orthogonal Matrix Inversion Transform, which, being vectorised,
>> can achieve the same thing in one pass.
>
> Yeah, I'm working on a faster version using Heuristic Unicyclic
> Reciprocal Lattices based on the theories of Primitive Uniform
> Kolmogorov Entropy and Generalized Anisotropic Groups. :-)

Interesting. I tried such an approach once, but I found it hard to get
started until I had first worked out an Introspective Procedural
Extraction Cataloguer And Compressor. Talk about hard to swallow.

b

Brian Raiter

Jun 17, 2006, 2:55:45 AM
Random data compression fanfic. Now I've seen everything.

b

maud

Jun 17, 2006, 3:05:45 AM

Phil Carmody wrote:
> "maud" <maudli...@hotmail.com> writes:
> > dealer: carmo, you have 5 seconds to act.
>
> Hmmm, yawn, or plonk, I can't decide.

Suggest you plonk. This game was over before it started :)

*

For anyone else not plonking :)

In the game with carmo there can only be one winner and it is not
carmo. It is also an example of how to present a proposition so that
you will always win, no matter what the oppo does.

If anyone has worked it out, good for you :) I will close out the game
thread tomorrow.

On another matter but still on-topic.

I have responded to another proposition, a challenge issued by someone
else, in this thread. I have unconditionally accepted this challenge as
posed, in my own inimitable way of course, but accepted it is. Hope my
acceptance doesn't get plonked, lol :)

maud.

John McCarten

Jun 18, 2006, 3:05:01 AM

Brian Raiter wrote:
> Random data compression fanfic.

I'll take the bet. I bet that I'm not. Loser gets to eat their words :)


> Now I've seen everything.

Not quite everything ;)

john. (maud.)

Einstein

Jun 18, 2006, 3:47:50 AM
Heh

Well, this is going to be fun....

I am working on a php version of the stripped variation... this is a
barebones version; it proves the theorem.

Now here is the question I ask of you all.


No current compression program can handle all 65536 variations of 16
bits in a randomly arranged pattern (1 of each result, randomly placed)
can they?

If the answer here is no... then you will be horrified that I can
handle that "Soon".


This is the most stripped down I can make my own tool/method.
Unfortunately there is around 15k results that would increase the size,
so it cant handle all random data, but this should be sufficient to
prove the theory to you skeptics when done.

http://www.security1.free2host.net/phpstuff/Finalstart.php

bookmark that link and consider it permament. If it is not the tool
itself, it will link to the tool.


I am currently learning some loop-type applications required to make
this tool actually work. Once I master the loops, I am going to be
able to code this software with ease. My main worry is the pure size of
the application in php, as I think it will exceed 30 MB :o... but that
is my problem, not yours. Worst case, I release the code that 'should
be' to those who could potentially host it for me (free atm, no cash; I
would also retain all rights, they would just be demoing a program that
beats Shannon down to the ground is all)...

http://www.security1.free2host.net/phpstuff/ has examples of the
efforts I am currently going through to test my skills, to learn the
required skills, and so on.

Matthijs Hebly

Jun 18, 2006, 6:10:48 AM
Einstein wrote:
> Unfortunately there are around 15k results that would increase the size,
> so it can't handle all random data,

Ah, so you finally see that your initial claim (being able to compress
*all* data above a certain size) isn't going to happen?

> but this should be sufficient to
> prove the theory to you skeptics when done.

No, not sufficient, not sufficient at all. You claimed to be able to
compress *all* possible files, at least above a certain size (6-8 MB).
Now you suddenly say that you can't compress *all* of those. Which is
basically true for all compression algorithms: they can compress certain
files, namely the ones with contents that are easy for them to compress,
but other files, they will either increase in size or the size will
remain the same at best.

So your algorithm does nothing special so far.

Regards,

M.

John McCarten

Jun 18, 2006, 8:19:10 AM
maud wrote:

> In the game with carmo there can only be one winner and it is not
> carmo. It is also an example of how to present a proposition so that
> you will always win, no matter what the oppo does.

If anyone wants to plonk now, be my guest :)

That said, on with the game without the smoke this time;

dealer: Dealing new hand.
dealer: carmo opens with bet of $100.
dealer: maudlin calls $100.
maudlin: hey carmo, I bet I can tell what your holecards are.
carmo: ok, you're on.
dealer: maudlin places side bet of $100.
dealer: carmo calls side bet $100.
dealer: flop comes [whatever]
dealer: turn comes [whatever]
dealer: river comes [whatever]
dealer: carmo, its your turn to act.

carmo has 3 options: check, bet or fold.

if carmo checks or bets mauldin calls.

dealer: carmo checks | bets.
dealer: maudlin checks | calls
dealer: carmo shows holecards.
carmo: arg! good bet
dealer: maudlin shows holecards
dealer: maudlin wins main pot of $199.
dealer: maudlin wins side pot of $199.
maudlin: thankyou.
carmo: good game, bye.
maudin: good game, bye.

if carmo folds (mucks his cards)

dealer: carmo folds.
dealer: maudlin wins main pot of $198.
dealer: carmo wins side pot of $198.
carmo: arg! i'm out $1.
maudlin: the dealer has to be paid ;)
dealer: true.
maudlin: good fold.
carmo: thankyou.
carmo: oh well, at least I didn't lose my shirt.
carmo: good game, bye.
maudlin: good game, bye.

Perhaps only a poker player will fully appreciate this game :)

Be that as it may.

The first important thing here is the order in which events occur.
Order is important in many fields, poker for one, compression for
another. When the order in which events occur is known, one can take
advantage of this and squeeze it for all it's worth.

A couple of other points.

Why does maudlin always win the pot of $198? There is a logical
answer. maudlin has the same hand that Matt Mahoney had when he played.
When a solution is known to work, one can take advantage of it. What
this has to do with compression, I'm sure you can work out for
yourselves..

Aha! some might say, you said that maudlin never loses? mauldin is out
$2 when carmo mucks.

Yes it's true, I did say this. But the game's not quite over.

pokerroomowner: its a bit quiet in here, so pack it up and go home,
dealer.
dealer: ok, boss.
pokerroomowner: here's your wages for tonight, 50cents.
dealer: thankyou.
pokerroomowner: goodnight dealer.
dealer: goodnight maudlin.

Two things here. If you don't know all the rules before you start then
don't play; best to plonk. If you do know the rules and choose to play
anyway, then that itty-bitty little rake will murder you. Again, I'm
sure you can all draw your own conclusions from this.

I also said it was an example of how to present a proposition so that
you will always win. There are two ways to do this.

The first way is to simply hold something back, to not disclose
everything at the outset, which is the case in this example.

The second way is when the person putting up the proposition holds
nothing back and discloses everything. However, when doing this one can
unintentionally give away more than one intended. In the other game
underway with Matt Mahoney, he has unfortunately done this. In so doing
he can only lose his own proposition.

If you have already worked out why this is so, then good for you. I
will close that game thread out tomorrow, but will not be doing it
pokerstyle.

Firstly, I do believe we can all agree thats been done to death.

And secondly, Matt Mahoney's proposition does highlight some important
considerations, not only for this particular proposition but also for
similar challenges of this kind - something that is worthy of a more
serious discussion.

john. (maud.)

John McCarten

Jun 18, 2006, 8:25:28 AM

John McCarten wrote:
> dealer: maudlin wins main pot of $198.
> dealer: carmo wins side pot of $198.

LOL!. where's that ol' counting whatsit when you need it :)

john.

Einstein

Jun 18, 2006, 8:31:00 AM


M. I am saying, I can do certain items. I call them tiers. Tier #1 is
the absolute weakest I can do... it proves the theory, and only that.
It can compress only in a less than full pattern. In fact it could
probably handle 50% (This is a guess, I would have to run some quick
numbers to validate any plausible level) of all programs, but I don't
ever think that 50% is enough, nor 90%. I want them ALL, and only ALL
will impress people around here as well.

This Tier #1 will prove the concept. Tier #2 will handle any file you
toss at it. Tier #3 (The beta my programmer is making) will handle any
file and get much better compression results, Tier #4 will be full on
beta release, and Tier #5 will be a marketable product.


I will handle Tier #1 and Tier #2 myself, and professional programmers
(Including the one I have at this moment working (with difficulty) on
Tier #3) will finish Tiers #3, #4, and #5.

I will have a functional code sequence as soon as I lick this issue I
have with loops in PHP... that will prove my capabilities.

John McCarten

Jun 18, 2006, 10:53:35 AM
maud wrote:
> Matt Mahoney wrote in part:
>
> > Here is an easier problem. I generate one 8 MB file and post it. You
> > compress it and post an archive containing the compressed file and a
> > decompression program, such that the archive is at least one byte
> > smaller than the file I post. Follow the rules of
> > http://mailcom.com/challenge/ for creating the archive and decompressor
> > (substituting the test file for the Calgary corpus).
> > ...
> > To make it easier still, I will tell you how I will create the test
> > file. I will use the program sharnd.exe to create it.
> > ...

If you look at the rules for the challenge you will see an important
element missing for challenges such as the one proposed. There is no
time limit.

The second thing is that the challenge file is to be created by sharnd.
The source code of which has been made available.

Now some readers will already have clicked where this is going, but I
will continue.

loop through all possible values of secret key
compare the sharnd output to the challenge file
stop when match found else continue

The decompressor is a modified version of sharnd itself with the
matching key and the 8MB length value embedded.

The length that meets the challenge is length(modifiedsharnd).

Now, it might take a while to find the key but hey! thats combinatorics
for you.
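
In rough Python, the search John describes might look like the sketch below. Here sharnd_bytes(key, n) is a hypothetical stand-in for "run sharnd with this key and collect n bytes of output"; it is not a reproduction of sharnd's actual internals, just the shape of the loop.

def sharnd_bytes(key, n):
    """Hypothetical stand-in: regenerate n bytes of sharnd output for key."""
    raise NotImplementedError("plug a reimplementation of sharnd in here")

def find_key(challenge, key_space):
    for key in key_space:                       # loop through all possible keys
        if sharnd_bytes(key, len(challenge)) == challenge:
            return key                          # stop when a match is found
    return None                                 # else keep going... in practice, forever

# The "decompressor" is then sharnd itself with the winning key and the 8MB
# length embedded, so the archive size is essentially length(modified sharnd).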


john.

Matt Mahoney

Jun 18, 2006, 11:52:38 AM
John McCarten wrote:
> maud wrote:
> > Matt Mahoney wrote in part:
> >
> > > Here is an easier problem. I generate one 8 MB file and post it. You
> > > compress it and post an archive containing the compressed file and a
> > > decompression program, such that the archive is at least one byte
> > > smaller than the file I post. Follow the rules of
> > > http://mailcom.com/challenge/ for creating the archive and decompressor
> > > (substituting the test file for the Calgary corpus).
> > > ...
> > > To make it easier still, I will tell you how I will create the test
> > > file. I will use the program sharnd.exe to create it.
> > > ...
>
> If you look at the rules for the challenge you will see an important
> element missing for challenges such as the one proposed. There is no
> time limit.

I am patient.

> The second thing is that the challenge file is to be created by sharnd.
> The source code of which has been made available.

So the problem ought to be easy. After einstein gives up I'll post a
decompressor under 30 KB just to prove it can be done.

> Now some readers will already have clicked where this is going, but I
> will continue.
>
> loop through all possible values of secret key
> compare the sharnd output to the challenge file
> stop when match found else continue
>
> The decompressor is a modified version of sharnd itself with the
> matching key and the 8MB length value embedded.
>
> The length that meets the challenge is length(modifiedsharnd).
>
> Now, it might take a while to find the key but hey! thats combinatorics
> for you.
>
>
> john.

That's the obvious solution. Also, there are known weaknesses in SHA-1
(2^69 resistance to collisions), but I'm not too concerned.

-- Matt Mahoney

Matt Mahoney

Jun 18, 2006, 1:58:45 PM

Einstein, after you learn to program and prove that entropy doesn't
exist, you might be interested in claiming the 7 Clay prizes.
http://www.claymath.org/millennium/
If you search the comp.compression archives you'll find my proof that
random compression implies P = NP, which is worth $1 million. I can
also provide similar proofs for the other 6 problems. If you're
interested, let me know and I will split the prize money with you.
Your half of the proof is to create a working compressor/decompressor
pair that will compress every file or string of size N, where N can be
any number you choose.

-- Matt Mahoney

Keith Thompson

Jun 18, 2006, 5:02:31 PM
"Einstein" <mich...@gmail.com> writes:
> Well, this is going to be fun....

When?

> I am working on a php version of the stripped variation... this is a
> barebones version; it proves the theorem.
>
> Now here is the question I ask of you all.
>
>
> No current compression program can handle all 65536 variations of 16
> bits in a randomly arranged pattern (1 of each result, randomly placed)
> can they?

Correct, no current compression program can reduce each of the 65536
possible patterns of 16 bits to less than 16 bits, or even reduce some
while leaving others the same size. Any reversible transformation
that reduces the size of any pattern will increase the size of another
one.

> If the answer here is no... then you will be horrified that I can
> handle that "Soon".

I'm merely horrified that you think so.

"Soon" is not interesting. "Now" would be interesting. (It would be
interesting because it's mathematically impossible.)

> This is the most stripped down I can make my own tool/method.
> Unfortunately there are around 15k results that would increase the size,
> so it can't handle all random data, but this should be sufficient to
> prove the theory to you skeptics when done.

So you can compress some patterns while expanding others. That's a
perfectly ordinary compression scheme. It's nothing new, and it's
boring.

--
Keith Thompson (The_Other_Keith) ks...@mib.org <http://www.ghoti.net/~kst>
San Diego Supercomputer Center <*> <http://users.sdsc.edu/~kst>
We must do something. This is something. Therefore, we must do this.

Willem

Jun 18, 2006, 5:29:57 PM
Einstein wrote:
) I am working on a php version of the stripped variation... this is a
) barebones version; it proves the theorem.

What, exactly, is the claim of the theorem ?
Can you put it into one or maybe a few lines of text ?


SaSW, Willem
--
Disclaimer: I am in no way responsible for any of the statements
made in the above text. For all I know I might be
drugged or something..
No I'm not paranoid. You all think I'm paranoid, don't you !
#EOT

silly...@gmail.com

Jun 18, 2006, 5:55:52 PM
In comp.compression Einstein <mich...@gmail.com> wrote:

> Now here is the question I ask of you all.
>
> No current compression program can handle all 65536 variations of 16
> bits in a randomly arranged pattern (1 of each result, randomly placed)
> can they?
>
> If the answer here is no... then you will be horrified that I can
> handle that "Soon".

Hard to tell exactly what you mean by that, but assuming you mean you
take all possible 16 bit values and randomly permute them, then yes I
can compress that. The raw data would be 16*65536 = 1048576 bits. I
can encode that in a little over 954037 bits (or about 14.56 bits per
item).

This isn't rocket science. But if you understood why I can do that,
and you understand what entropy is, then you'd also understand that
you can't really beat that value. Maybe you could get lucky and get
954000 bits or less on a few permutations, but you won't be able to
beat my encoding scheme on average.
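
(For the curious, that figure is just the information content of a permutation of 65536 distinct items, i.e. log2(65536!); a couple of lines of Python reproduce it.)

import math

n = 65536
bits = math.factorial(n).bit_length()   # floor(log2(65536!)) + 1
print(bits)                             # 954037
print(bits / n)                         # about 14.56 bits per item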

--

Steve Stringer
silly...@gmail.com

Einstein

Jun 18, 2006, 7:18:17 PM
I am betting we will exceed 50% with ease... which will make you want to
reconsider your points of math... Entropy is dead, beaten, and I will
soon enough prove it.

Now I am going back to my php

Keith Thompson

Jun 19, 2006, 1:04:31 AM
"Einstein" <mich...@gmail.com> writes:
> I am betting we will exceed 50% with ease... which will make you want to
> reconsider your points of math... Entropy is dead, beaten, and I will
> soon enough prove it.

50% of what?

You've said you're going to convince us of the truth of some theorem.
Exactly what theorem are you referring to? Whether you can prove it
or not, can you clearly *state* it?

Einstein

Jun 19, 2006, 6:55:41 AM
I am going to convince you with real software.

As for the Millenium Challenges, those are quite interesting, though I
may have accidently solved one of them while doing personal research in
compression.

That would be the P vs NP Problem. It is on my other pc, which is put
away in storage at my families. Would be interesting if I did derive an
answer to such an esteemed challenge on accident while trying to defeat
a different math mans theory. However thats neither here nor there...

This software is being made. My skills will make it 'rather large' but
I re-directed my programmer to make it also.. combined, we will make
this a tool that will work in short order I expect.

Matthijs Hebly

Jun 19, 2006, 4:16:08 PM
Einstein wrote:
> I am going to convince you with real software.

Bring it on!

> As for the Millenium Challenges, those are quite interesting, though I
> may have accidently solved one of them while doing personal research in
> compression.
> That would be the P vs NP Problem. It is on my other pc, which is put
> away in storage at my families. Would be interesting if I did derive an
> answer to such an esteemed challenge on accident while trying to defeat
> a different math mans theory. However thats neither here nor there...

You start to sound remarkably like JSH...

Since you refuse to see and acknowledge the obvious and clear facts
which prove that what you claim can never be realized, how on earth
could you expect us to believe that you will come up with any proof for
any math problem soon?

But OK, bring it on.

About your compression algo: let's try to compress strings of length 2
or less. In other words, let N be 2:

Plain text bit strings you want to compress of length 2 or less:

'11'
'10'
'01'
'00'
'1'
'0'
'' (empty string)

That's 7 (SEVEN) strings, right? (In other words: you have 2^(N+1) - 1
plain text strings here)

You want to compress those to *smaller* bits strings, thus of length 1
or less:

'1'
'0'
'' (empty string)

That's 3 (THREE) strings, right? (In other words: you have 2^N - 1
strings here)

Now, the problem is: how on earth do you ever expect to be able to map
SEVEN (or 2^(N+1) - 1) plain text strings to THREE (or 2^N - 1)
compressed strings?

E.g.:

'11' -> (compression) -> '1'
'10' -> (compression) -> '0'
'01' -> (compression) -> ''
'00' -> (compression) -> ??????? // no possibilities left...
'0' -> (compression) -> ??????? // no possibilities left...
'1' -> (compression) -> ??????? // no possibilities left...
'' -> (compression) -> ??????? // no possibilities left...

This goes for any N!!! Claiming that you can compress *any* and *all*
strings of length N or less, is saying that you can map all possible
2^(N+1) - 1 plain text strings to 2^N - 1 compressed strings. I'm afraid
you want to fill pigeon holes with on average about 2 pigeons per hole.
Maybe quite cozy for the pigeons, but not very helpful for what you're
trying to achieve.

It is not, and will never be, possible.

Give it up.

Or, alternatively, entertain us some more.

> This software is being made.

What a waste of time, since it can't be done. But OK, bring it on.

> My skills will make it 'rather large' but
> I re-directed my programmer to make it also.. combined, we will make
> this a tool that will work in short order I expect.

I guess the (pigeon) whole of comp.compression and comp.theory is
looking forward to it...

M.

guenther vonKnakspott

Jun 19, 2006, 5:02:41 PM

Einstein wrote:
> I am going to convince you with real software.
>
> As for the Millenium Challenges, those are quite interesting, though I
> may have accidently solved one of them while doing personal research in
> compression.
>
> That would be the P vs NP Problem. It is on my other pc, which is put
> away in storage at my families. Would be interesting if I did derive an
Wouldn't that happen to be your brain which was put away into your
family's refuse bin?

> answer to such an esteemed challenge on accident while trying to defeat
> a different math mans theory. However thats neither here nor there...
>
> This software is being made. My skills will make it 'rather large' but
> I re-directed my programmer to make it also.. combined, we will make
> this a tool that will work in short order I expect.

Why don't you just shut up until that happens?

Matt Mahoney

Jun 19, 2006, 8:13:34 PM
Einstein wrote:
> I am going to convince you with real software.
>
> As for the Millenium Challenges, those are quite interesting, though I
> may have accidently solved one of them while doing personal research in
> compression.
>
> That would be the P vs NP Problem. It is on my other pc, which is put
> away in storage at my families. Would be interesting if I did derive an
> answer to such an esteemed challenge on accident while trying to defeat
> a different math mans theory. However thats neither here nor there...

Just curious, which did you prove, P = NP or P != NP?

-- Matt Mahoney

Keith Thompson

Jun 20, 2006, 12:52:03 AM
"Einstein" <mich...@gmail.com> writes:
> I am going to convince you with real software.

You're going to convince us *of what*? You've mentioned a theorem.
Please state the theorem.

John McCarten

Jun 20, 2006, 1:59:07 AM

Matt Mahoney wrote:
> John McCarten wrote:
> > maud wrote:
> > > Matt Mahoney wrote in part:
> > >
> > > > Here is an easier problem. I generate one 8 MB file and post it. You
> > > > compress it and post an archive containing the compressed file and a
> > > > decompression program, such that the archive is at least one byte
> > > > smaller than the file I post. Follow the rules of
> > > > http://mailcom.com/challenge/ for creating the archive and decompressor
> > > > (substituting the test file for the Calgary corpus).
> > > > ...
> > > > To make it easier still, I will tell you how I will create the test
> > > > file. I will use the program sharnd.exe to create it.
> > > > ...
> >
> > If you look at the rules for the challenge you will see an important
> > element missing for challenges such as the one proposed. There is no
> > time limit.
>
> I am patient.

Yes. Me too. One often has to wait patiently for a bite. I only got one
bite, from BR, but that's the way it sometimes goes when fishing :)

>
> > The second thing is that the challenge file is to be created by sharnd.
> > The source code of which has been made available.
>
> So the problem ought to be easy.

True. Lucky for me the source was available. I would have been in
serious trouble otherwise.

> After einstein gives up

You may be in for a long wait, but as you've said :)

> I'll post a
> decompressor under 30 KB just to prove it can be done.

This will definitely be worth seeing. I won't be calling tho'. I
will be safely out of harm's way, sitting on the rail, when you play
this hand.

>
> > Now some readers will already have clicked where this is going, but I
> > will continue.
> >
> > loop through all possible values of secret key
> > compare the sharnd output to the challenge file
> > stop when match found else continue
> >
> > The decompressor is a modified version of sharnd itself with the
> > matching key and the 8MB length value embedded.
> >
> > The length that meets the challenge is length(modifiedsharnd).
> >
> > Now, it might take a while to find the key but hey! thats combinatorics
> > for you.
> >
> >
> > john.
>
> That's the obvious solution.

Yes.

> Also, there are known weaknesses in SHA-1
> (2^69 resistance to collisions), but I'm not too concerned.

My only concern is that I probably wouldn't be around when the key was
found :)

>
> -- Matt Mahoney

john.

Robby Goetschalckx

Jun 20, 2006, 4:17:30 AM
Matt Mahoney wrote:

> Just curious, which did you prove, P = NP or P != NP?

Given his counterproofs of Shannon's theory, probably he proves that NP is a
strict subset of P.

robby

giorgi...@email.it

Jun 20, 2006, 5:44:54 AM
> Now, the problem is: how on earth do you ever expect to be able to map
> SEVEN (or 2^(N+1) - 1) plain text strings to THREE (or 2^N - 1)
> compressed strings?
> E.g.:
> '11' -> (compression) -> '1'
> '10' -> (compression) -> '0'
> '01' -> (compression) -> ''
> '00' -> (compression) -> ??????? // no possibilities left...
> '0' -> (compression) -> ??????? // no possibilities left...
> '1' -> (compression) -> ??????? // no possibilities left...
> '' -> (compression) -> ??????? // no possibilities left...
> This goes for any N!!! Claiming that you can compress *any* and *all*
> strings of length N or less, is saying that you can map all possible
> 2^(N+1) - 1 plain text strings to 2^N - 1 compressed strings. I'm afraid
> you want to fill pigeon holes with on average about 2 pigeons per hole.

It's a very good and clear example.
And: if the program compresses all strings of n bytes (which is what is
needed to compress a random file, using a rigorous definition of
random), then nothing prevents you from using it recursively (the output
file is itself just *any file*). So if every input file is compressed by
at least 1 bit each time, then any file of n bits must disappear
(becoming 0 bits in size) after being recursively compressed by that
program n times, which amounts to a proof that the compressor is not
lossless, since all the n bits of information are now gone.

As said in other posts, anyone could use a PRNG and so provide
a very efficient compressor that maps each random-looking output to its
seed and size; however, it is really compressing a subgroup of files,
not at all compressing all files.
In other words, a program that generates a random-looking output, uses
ent or the Diehard suite to say that "it's random" (really, the tests can
only say that it is probable that the file doesn't contain any
recognizable structure, and many good PRNGs will pass that kind of
test), and compresses it to an amazingly small file is not enough to say
that any arbitrary input can be compressed.
Trying to recompress the output recursively is a way to disprove the
claim: if the compressor can compress an arbitrary input, it must
compress every output of the recursion until the output size is 0,
proving itself not to be a lossless compressor.
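
An easy way to watch that recursion stall in practice (with an ordinary real compressor, not the hypothetical compress-everything one) is to feed a compressor its own output a few times. With zlib and random input the size stops shrinking immediately:

import os
import zlib

data = os.urandom(1 << 20)               # 1 MB of random binary data
for i in range(5):
    out = zlib.compress(data, 9)
    print(f"pass {i + 1}: {len(data)} -> {len(out)} bytes")
    data = out                           # recurse on the compressor's own output

# Typical result: the very first pass already grows the data slightly (random
# input has no structure to exploit), and the later passes grow it further.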

snork...@gmail.com

Jun 20, 2006, 9:28:06 AM

Matt Mahoney wrote:
>
> Just curious, which did you prove, P = NP or P != NP?
>
> -- Matt Mahoney

Yes, do tell, I've been dying to see how this one turns out.

I'm not calling for a straw poll or anything, but we're all betting on
P != NP, right?

But if P = NP, what a world!

|
| Mark Nelson
|

Matt Mahoney

Jun 20, 2006, 11:46:52 AM


Yep, and it's a simple proof too. All you need is a compressor that
will compress any string longer than m, and you can perform any
computation in O(n) time, which is in P:

1. Recursively compress the input from size n to size m in O(n) time.
(I have omitted some details, such as encoding the number of
recursions, dealing with compression by more than one bit, and showing
that each cycle does not depend on n so is O(1), but the basic result
is the same).
2. Compute the compressed output in O(1) time using a 2^m by m lookup
table.
3. Recursively decompress the output in O(n) time. Total computation
is O(n).

As a corollary, the halting problem is in P. This fact can be used to
solve the 6 remaining Clay millennium prize problems.

Einstein, if you are still interested in the Clay prizes, let me know.
Here is the deal.
1. You pay me a fee of $1000 by July 1, 2006.
2. You write and publish (with source code) a compressor/decompressor
pair that will compress every file or string larger than some size m by
at least one bit, where m is any positive integer you choose, and the
decompressor restores the original input to the compressor.
3. After step 2, I will write a paper with you as co-author solving the
7 millennium problems and get it published in an appropriate mathematics
journal within 6 months.
4. The Clay Institute requires a 2 year waiting period after
publication, acceptance within the mathematics community, and approval
by their scientific advisory board before awarding the prizes ($7
million). We split the prize money, regardless of the actual amount.
5. If after step 2 I fail to complete steps 3 and 4 (for the full $7
million) then I refund my fee.
6. There is no refund if you fail to complete step 2.
7. After July 1, 2006 my fee increases to $2000, then increases $1000
per week after that. The amount of the fee is determined by the day
that I receive payment.
8. There is no time limit for you to complete step 2.
9. You are free to file for patents worldwide on your algorithm for
step 2 with you as sole inventor.

If we have a deal, contact me by email to arrange payment.

Of course you can go it alone and claim the whole $7 million for
yourself (or at least $1 million for P = NP using either the proof I
posted or the one sitting on your other computer). I don't know if you
have tried to publish any papers before, but it can be a humbling
experience. It really helps to get outside assistance in this matter.

-- Matt Mahoney

snork...@gmail.com

Jun 20, 2006, 5:15:07 PM
So as long as guys like Harrington are out there, I need to make sure
that we keep the million random digit challenge on the table. In the
past it's been a little hard to find, you have to search through
comp.compression and there are various false hits.

Now, it has a permanent home on my site:

http://marknelson.us/2006/06/20/million-digit-challenge/

Einstein, the point of the million random digit challenge is basically
to encourage people with magical compressor claims to put up or shut
up. If you can beat the challenge within the spirit of the rules, you
win.

It's been out there for four or five years, and nobody has taken a
serious crack at it. One or two people have noted that there is a tiny
bit of redundancy in the million random digit file, but not enough to
exploit with a conventional decompressor - nibbling away a dozen bytes
doesn't leave room for much of a decompression program.

Nope, to compress this file will take a magic compressor. So, why not
code up what you've been talking about and earn undying fame by meeting
the challenge? It's no Clay Prize, but then, I don't have a retail
empire or a family fortune to fund a prize of this nature.

But I'm still in for $100.

|
|

Fibonanci Code

Jun 21, 2006, 11:04:09 AM

maud wrote:
> tc...@lsa.umich.edu wrote:
> > In article <1150377194....@i40g2000cwc.googlegroups.com>,
> > maud <maudli...@hotmail.com> wrote:
> > >a) Create all sets of possible values that occupy 8MB space.
> > >b) Compress and decompress them all without error using your method.
> > >c) Count up how many sets when compressed result in a smaller space
> > >than 8MB.
> > >
> > >Now, it may take some time but hey! thats combinatorics for you.
> >
> > No kidding. 8MB is 64 million bits, so there are 2^(64 million) possible
> > values that occupy 8MB space. Even if you handle a googol cases per
> > femtosecond, you would handle a negligible fraction of all cases in a
> > googol universe lifetimes.
>
> Probably be a good idea if we went for lunch first then :)
>
> maud.

Hi all,

I found a way to create recursive compressor software
right away, if someone could transform any random file into this format
(for example) with minimal additional bytes:
111110110001101111000

each section in the binary string of the output must have runs of 1
with length at least 2 and above, and runs of 0 with length at least 1
and above...

x >= 2, y >= 1, where x = run length of 1, and y = run
length of 0

so the minimal group will be like this: 110

if anyone has found any variable-length code that has this
characteristic, or the output of
some codec generates something like this, please contact me
immediately...
we will create the miracle and solve this problem once and for all.

I am hinting at Fibonacci coding for the 11, but still it is not suitable...

Einstein, this is a challenge for you... I only need a data transform to
complete
the quest. So before I complete it with the help of experts here, I do
hope you do it first.


Thanks.


Regards,
Ray

Geevarghese Philip

Jun 24, 2006, 1:41:49 PM
> I will have a functional code sequence as soon as I lick this issue I
> have with loops in PHP

Any guesses on what this "issue with loops" might be? I am curious since
the Halting Problem reduces to the problem Mr.Einstein claims to have
solved.

Philip

jacko

Jun 27, 2006, 10:22:52 PM
Back online again for a short while.

Seems like another one of the futile. MUST not map directly, must
modulate representation, must sleep.

nice quadrapak parabolic antenna me thinks he, he

0 new messages