
Help license my code and get a % !


Tim Bernard

Apr 21, 2003, 9:06:20 AM
I have written a data compression engine that will compress any group of
100 bytes down to 7 bytes. It takes about 30 seconds. That's the downside.
The upside is that nobody else can do this.
As well, the software engine can compress any 64 bits down to 35 bits in less
than a second.
The decompression is 10 times faster.
So if you know of someone who would be interested email me back at
timokm...@hotmail.com
The software will actually compress any data down to nearly nothing once
one allows for recompressing the compressed data recursively. But
current CPU speeds do not yet make this a practical application.


P.S. Spare me the details on why it's not possible; I've heard it all before.


--
"What worries me is not the violence of the few, but the indifference of the
many"

Kevin Easton

Apr 21, 2003, 9:59:46 AM
Tim Bernard <notmy...@server.com> wrote:
> I have written a data compression engine that will compress any group of
> 100 bytes down to 7 bytes. It takes about 30 seconds. That's the downside.
> The upside is that nobody else can do this.
> As well, the software engine can compress any 64 bits down to 35 bits in less
> than a second.

Actually, plenty of people can do it - just discard 29 of the bits.

We'll assume you mean reversible compression - that is, you claim that
you can compress any 64 bit pattern down to 35 bits, and then run the
inverse operation on those 35 bits to produce the original 64 bit pattern.

This is clearly wasted on the OP, who either hasn't tested the algorithm
in question or is deliberately setting out to deceive. But for the
benefit of anyone who might be wondering about it:

There are 2^64 combinations of 64 bits, or 18446744073709551616 unique
bit patterns. There are only 2^35 combinations of 35 bits: 34359738368
unique bit patterns. That means a set of 35 bits can only
represent 34359738368 of the 18446744073709551616 combinations of 64
bits, or 0.000000186264514923 percent. For each 64 bit pattern that can
be represented, there are 536870911 more that can't.
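The counting above can be verified directly; a minimal sketch in Python, using exact integer arithmetic and nothing beyond the figures already stated:

```python
# Count how many 64-bit patterns a 35-bit code space could possibly cover.
patterns_64 = 2 ** 64   # 18446744073709551616 distinct 64-bit inputs
patterns_35 = 2 ** 35   # 34359738368 distinct 35-bit outputs

# Fraction (as a percentage) of 64-bit inputs that could get a unique
# 35-bit code under any reversible scheme.
coverage_percent = patterns_35 / patterns_64 * 100

# For each 64-bit pattern that gets a code, how many others must share it?
collisions_per_code = patterns_64 // patterns_35 - 1

print(patterns_64, patterns_35)
print(coverage_percent)      # ~1.86e-07 percent
print(collisions_per_code)   # 536870911
```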

"But how does any compression scheme work then?" - the answer is the
name "compression" is a little misleading. All compression schemes
actually *increase* the size of the input data for the vast majority of
possible input data - it's just that they decrease the size for the
infintesimal minority of cases that correspond to the data we commonly
deal with.

This is why if you take the output of an unbiased random source and
attempt to compress it, you can't - the chance of the random source
generating one of the special cases that results in a size reduction is
infinitesimal.
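Anyone can observe this with an off-the-shelf compressor; a small sketch using Python's standard zlib module on random bytes (the exact lengths vary run to run, but the output is reliably larger than the input):

```python
import os
import zlib

# Random bytes have no exploitable structure, so a general-purpose
# compressor almost always makes them slightly bigger (header and
# block overhead with no savings to offset it).
data = os.urandom(100_000)
compressed = zlib.compress(data, level=9)

print(len(data), len(compressed))  # expect len(compressed) > len(data)
```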

It is conceivable that the OP could have an algorithm that reduces
34359738367 different 64 bit patterns to 35 bits, but then it must
also increase the length of the remaining 99.999999813735485077 percent
of 64 bit patterns. It is the claim to compress "any" 64 bit pattern
down to 35 bits, in a reversible manner, that is certainly false.

There Is No Such Thing As A Free Lunch.

- Kevin.

Matt Mahoney

Apr 21, 2003, 12:33:58 PM
Tim Bernard wrote:

> I have written a data compression engine that will compress any group of
> 100 bytes down to 7 bytes. It takes about 30 seconds. That's the downside.
> The upside is that nobody else can do this.
> As well, the software engine can compress any 64 bits down to 35 bits in less
> than a second.
> The decompression is 10 times faster.
> So if you know of someone who would be interested email me back at
> timokm...@hotmail.com
> The software will actually compress any data down to nearly nothing once
> one allows for recompressing the compressed data recursively. But
> current CPU speeds do not yet make this a practical application.
>
> P.S. Spare me the details on why it's not possible; I've heard it all before.

But of course it's possible. Unfortunately, ZeoSync came up with the idea first
and patented it. Like your algorithm, they can repeat the compression over and
over until the file size is down to 0. They announced their amazing discovery
in Jan. 2002. Perhaps if you had read some books on compression (or the FAQ),
you would have come up with the idea before they did. See
http://www.backseatdriver.com/clients/zeosync/docs/index.htm and click on
"technology" for an explanation of how their system works. Fortunately it is
not too late to invest in their company and make a bundle :-)

-- Matt Mahoney


Phil Carmody

Apr 21, 2003, 12:30:09 PM
On Mon, 21 Apr 2003 13:06:20 +0000, Tim Bernard wrote:

> I have written a data compression engine that will compress any group of
> 100 bytes down to 7 bytes. It takes about 30 seconds. That's the downside.
> The upside is that nobody else can do this.
> As well, the software engine can compress any 64 bits down to 35 bits in less
> than a second.
> The decompression is 10 times faster.
> So if you know of someone who would be interested email me back at
> timokm...@hotmail.com
> The software will actually compress any data down to nearly nothing once
> one allows for recompressing the compressed data recursively. But
> current CPU speeds do not yet make this a practical application.
>
>
> P.S. Spare me the details on why it's not possible; I've heard it all before.

Save us the effort - go find somewhere quiet and kill yourself.

*_PLONK_*

Phil

Eric Bodden

Apr 21, 2003, 1:16:44 PM

ROFL Yeah that was great! I still have a copy of their fancy presentation!
See...
http://www.bodden.de/studies/ac/ (bottom-most link there)

Eric


Dale King

Apr 21, 2003, 1:41:51 PM
"Kevin Easton" <kevin@-nospam-pcug.org.au> wrote in message
news:newscache$ig5pdh$624$1...@tomato.pcug.org.au...

> Tim Bernard <notmy...@server.com> wrote:
> > I have written a data compression engine that will compress any group of
> > 100 bytes down to 7 bytes. It takes about 30 seconds. That's the downside.
> > The upside is that nobody else can do this.
> > As well, the software engine can compress any 64 bits down to 35 bits in
> > less than a second.
>
> Actually, plenty of people can do it - just discard 29 of the bits.
>
> We'll assume you mean reversible compression - that is, you claim that
> you can compress any 64 bit pattern down to 35 bits, and then run the
> inverse operation on those 35 bits to produce the original 64 bit pattern.
>
> This is clearly wasted on the OP, who either hasn't tested the algorithm
> in question or is deliberately setting out to deceive. But for the
> benefit of anyone who might be wondering about it:
>
> There are 2^64 combinations of 64 bits, or 18446744073709551616 unique
> bit patterns. There are only 2^35 combinations of 35 bits: 34359738368
> unique bit patterns. That means a set of 35 bits can only
> represent 34359738368 of the 18446744073709551616 combinations of 64
> bits, or 0.000000186264514923 percent. For each 64 bit pattern that can
> be represented, there are 536870911 more that can't.

Here is another angle that I think may be easier to grasp. I like to explain
it in terms of telephone numbers. Suppose you were implementing a
replacement for the telephone system in a city of 1,000,000 people.
Is there any mapping we can come up with that assigns each of those
people a unique 4-digit telephone number? The answer is of course no. There
are only 10,000 different 4-digit telephone numbers, so if we assign a
4-digit number to each person we will assign the same number to more than
one person - on average, 100 people per telephone number. If I dial
John Smith's number, how does the system know to connect me with John
Smith and not Mary Jones or one of the other 99 people sharing that
same number?

It is easy to see that any system with more people than telephone numbers
cannot work: some number must be assigned to more than one person, and
then there is no way to know which of those people a call should be
connected to.

The same is true with compression. The compressor is just a way of assigning
numbers to inputs. The decompressor is like entering the telephone number
and getting connected with the original data. If the compressor assigns the
same number to more than one input, there is no way for the decompressor to
reverse the process.

So, as Kevin said, you have a city of 18446744073709551616 people and only
34359738368 telephone numbers. There is no way to make such a system work.
It doesn't matter how long you take to do the compression; you still have
more people than telephone numbers.

> "But how does any compression scheme work then?" - the answer is the
> name "compression" is a little misleading. All compression schemes
> actually *increase* the size of the input data for the vast majority of
> possible input data - it's just that they decrease the size for the
> infintesimal minority of cases that correspond to the data we commonly
> deal with.

And the analogy with phone numbers here is that in the US I can call
anyone in my town using only 7 digits. To call someone in another state
requires dialing a 1 followed by 10 digits, and more digits still to call
someone in another country. This works because we most commonly call
people who are near us and rarely if ever call people in another country.
It actually lengthens the average phone number, since, for instance, I
cannot have a 7-digit telephone number that starts with 1.

--
Dale King


Tim Bernard

Apr 22, 2003, 8:56:59 AM
You are using conventional thinking towards the problem.
If I tried solving the compression problem the way you do, then I wouldn't
be able to do it either.

"Dale King" <Dale...@Thomson.net> wrote in message
news:3ea4...@news.tce.com...

Kelsey Bjarnason

Apr 22, 2003, 12:32:26 PM
On Tue, 22 Apr 2003 12:56:59 +0000, Tim Bernard wrote:

> You are using conventional thinking towards the problem.
> If I tried solving the compression problem the way you do, then I wouldn't
> be able to do it either.

Indeed; if we actually try to do something as breathtakingly obvious as
simply _counting_ what can actually be done, well, hell, that says you're
full of fertilizer.

Since counting is obviously not allowed to be used, what magic method do
_you_ propose for determining the veracity of claims which are, on the
surface, complete and utter twaddle?


Glen Smith

Apr 23, 2003, 11:04:06 PM
I am demonstrating the software this summer.

"Kelsey Bjarnason" <kel...@xxnospamyy.telus.net> wrote in message
news:pan.2003.04.22....@xxnospamyy.telus.net...
