
Anyone here work for a compression company?


Tim Bernard
Apr 21, 2003, 7:48:16 PM
I can compress any 100 bytes down to 7 bytes, lossless. It takes 30 seconds
max to compress and about 5 seconds to decompress. Or 40 bytes down to 7
bytes in one second.
I'm not looking for money, because I have all that I need, so I'm not trying
to bilk anyone out of money with such a "crazy idea".
I merely wish to demonstrate to someone in the industry that the application
does indeed work, so they take it seriously. That's it.

--
"What worries me is not the violence of the few, but the indifference of
the many"

Eric Bodden
Apr 21, 2003, 7:52:02 PM
The more interesting thing is that "Tim Bernard" seems to be something like
a new, sophisticated, high-level, subject- and text-alternating spam bot ;-)

Eric

--
Arithmetic Coding at a glance
http://ac.bodden.de


Mathew Hendry
Apr 21, 2003, 7:55:32 PM
On Mon, 21 Apr 2003 23:48:16 GMT, "Tim Bernard" <notmy...@server.com>
wrote:

>What worries me is the indifference of the many

Oh well, better get used to it.

-- Mat.

Eric Bodden
Apr 21, 2003, 7:54:05 PM
> "What worries me is not the violence of the few, but the indifference of
> the many"

But he still needs to work on the signature ;-)

s...@nospam.unt.edu
Apr 21, 2003, 10:28:16 PM
Tim Bernard <notmy...@server.com> wrote:

> I can compress any 100 bytes down to 7 bytes, lossless. It takes 30
> seconds max to compress and about 5 seconds to decompress. Or 40 bytes
> down to 7 bytes in one second.
> I'm not looking for money, because I have all that I need, so I'm not
> trying to bilk anyone out of money with such a "crazy idea".
> I merely wish to demonstrate to someone in the industry that the
> application does indeed work, so they take it seriously. That's it.

There are several well-established "challenges" on this newsgroup if
you should choose to try to prove your system. Mark Nelson has a file
of random digits that you would need to compress, but you need to add
in the size of the decompression program when counting how much
compression you achieved. There's another challenge along those lines
with a cash prize. Here's my own challenge, which has been listed in
the comp.compression FAQ for many years, and yet oddly enough no one
has won:

1) You send me the decompression program (I don't need the
compression program).

2) I send you the data to compress. I'll send you 100 bytes because
of your description above.

3) You send me the compressed version -- 7 bytes from your
description.

4) I will run your decompression program with this compressed
data, and test to see if the original 100 bytes are restored.

If you manage to do this, I send you $100. Fair enough?

--
Steve Tate - srt[At]cs.unt.edu | "A computer lets you make more mistakes faster
Dept. of Computer Sciences | than any invention in human history with the
University of North Texas | possible exceptions of handguns and tequila."
Denton, TX 76201 | -- Mitch Ratliffe, April 1992
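
(An aside on the mechanics: the verification step in this challenge is a few
lines of code. A minimal sketch in Python, assuming a hypothetical standalone
decompressor that writes the restored bytes to stdout; the command-line
interface is an illustration, not anything Steve specified.)

import subprocess
import sys

def verify(decompressor, compressed_path, original_path):
    # Run the claimant's decompressor on the compressed file; it is
    # assumed (hypothetically) to write the restored bytes to stdout.
    result = subprocess.run([decompressor, compressed_path],
                            capture_output=True, check=True)
    with open(original_path, "rb") as f:
        return result.stdout == f.read()

if __name__ == "__main__":
    ok = verify(sys.argv[1], sys.argv[2], sys.argv[3])
    print("original restored" if ok else "FAILED: output differs")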

Steven Pigeon
Apr 21, 2003, 10:59:44 PM
"Tim Bernard" <notmy...@server.com> wrote in message
news:4r%oa.96696$BQi....@news04.bloor.is.net.cable.rogers.com...

> I can compress any 100 bytes down to 7 bytes, lossless. It takes 30
> seconds max to compress and about 5 seconds to decompress. Or 40 bytes
> down to 7 bytes in one second.
> I'm not looking for money, because I have all that I need, so I'm not
> trying to bilk anyone out of money with such a "crazy idea".
> I merely wish to demonstrate to someone in the industry that the
> application does indeed work, so they take it seriously. That's it.


This is like the cold virus. Every new season brings us a mutant strain of
the bug...


Best,

S.

--
Steven Pigeon, Ph. D.
pig...@iro.umontreal.ca


Tim Bernard
Apr 21, 2003, 11:08:04 PM
Hey, that made me laugh.
Thanks, that was a funny AI joke.

"Eric Bodden" <e...@ukc.ac.uk> wrote in message
news:b82073$eip$1...@athena.ukc.ac.uk...

Tim Bernard
Apr 21, 2003, 11:14:54 PM
I want to take you up on your offer, but I would prefer to do it in person,
since I'm leery about sending any actual engine code out.
If you're anywhere near Toronto, Ontario, I can make it happen in person.
Or if you know anyone you trust near Toronto, Ontario, I can do that as well.
You're at the University of North Texas, I think, so how about at least
contacting some other university in Toronto with the same department and
getting them to look at it for you and relay the fact that it works.
By the way, I don't accept checks. Get a money order ready.
I hope you do see that I am willing to verify my claim and am not making
excuses, because I am trying to find a solution.

<s...@nospam.unt.edu> wrote in message news:b829c0$bg$1...@hermes.acs.unt.edu...

s...@nospam.unt.edu
Apr 21, 2003, 11:13:16 PM
Tim Bernard <notmy...@server.com> wrote:
> I want to take you up on your offer, but I would prefer to do it in person,
> since I'm leery about sending any actual engine code out.
> If you're anywhere near Toronto, Ontario, I can make it happen in person.
> Or if you know anyone you trust near Toronto, Ontario, I can do that as well.

This is not the kind of thing I would get anyone else involved with,
but if you want to send me a plane ticket to Toronto.....

> You're at the University of North Texas, I think, so how about at least
> contacting some other university in Toronto with the same department and
> getting them to look at it for you and relay the fact that it works.
> By the way, I don't accept checks. Get a money order ready.
> I hope you do see that I am willing to verify my claim and am not making
> excuses, because I am trying to find a solution.

I think we're stuck with the long-distance thing. As I've offered to
others, I'd happily sign a non-disclosure agreement as far as your
decompression code goes. I'd store it safely, run it when you gave me
the compressed data, delete it, and never look at it. That much you
can guarantee legally. Of course, the results of the test will be
announced to all...

Tim Bernard
Apr 21, 2003, 11:49:37 PM
Well, I'm not about to pay for your airfare unless I know you have lots of
good contacts in the compression industry.
Get back to me if you ever find anyone in or near Toronto who wants to see
it, preferably someone who would be beneficial to me in spreading the word.

<s...@nospam.unt.edu> wrote in message news:b82c0c$fl$1...@hermes.acs.unt.edu...

Eric Bodden
Apr 22, 2003, 5:07:57 AM
> Hey, that made me laugh.
> Thanks, that was a funny AI joke.

You're welcome! Why should it just be us who are laughing?

Eric


s...@nospam.unt.edu
Apr 22, 2003, 8:47:36 AM
Tim Bernard <notmy...@server.com> wrote:

> Well, I'm not about to pay for your airfare unless I know you have lots of
> good contacts in the compression industry.
> Get back to me if you ever find anyone in or near Toronto who wants to see
> it, preferably someone who would be beneficial to me in spreading the word.

I do have "lots of good contacts in the compression industry", for
what it's worth. In fact, I know quite a few people up at the
University of Toronto too. What you're missing is that I wouldn't
involve any of them with this because it's really a waste of time
unless you're in it for the entertainment value, like me.

Your technique doesn't work, it can't work, and that could be
demonstrated in a matter of minutes if you'd just try....

Glen Smith
Apr 22, 2003, 10:44:34 AM
It's OK, I have an appointment with a local computer paper, and I'm
demonstrating the technology to them next week.
They will feed the story to other publications afterward.
All is good.

<s...@nospam.unt.edu> wrote in message
news:b83dl8$2no$1...@hermes.acs.unt.edu...

Kelsey Bjarnason
Apr 22, 2003, 12:19:36 PM
On Mon, 21 Apr 2003 23:48:16 +0000, Tim Bernard wrote:

> I can compress any 100 bytes down to 7 bytes, lossless.

No, you can't, as anyone with the ability to master complex subjects such
as counting can figure out.


Randall R Schulz
Apr 22, 2003, 12:50:00 PM


Oh, come now.

One needs exponentiation, too, even if only for base 2.

Perhaps we should suffer fools a bit more gladly? ... Nah.

RRS

Kelsey Bjarnason
Apr 22, 2003, 3:58:38 PM
On Tue, 22 Apr 2003 16:50:00 +0000, Randall R Schulz wrote:

>
> Kelsey Bjarnason wrote:
>> On Mon, 21 Apr 2003 23:48:16 +0000, Tim Bernard wrote:
>>
>>> I can compress any 100 bytes down to 7 bytes, lossless.
>>
>> No, you can't, as anyone with the ability to master
>> complex subjects such as counting can figure out.
>
> Oh, come now.
>
> One needs exponentiation, too, even if only for base 2.

Not even:

00
01
10
11

I count 4. :)
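
(The same count, automated; a throwaway Python sketch, not from the thread,
that enumerates every 2-bit string without a single exponentiation.)

count = 0
for a in "01":
    for b in "01":
        print(a + b)      # 00, 01, 10, 11
        count += 1
print("I count", count)   # 4, by counting alone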


Randall R Schulz
Apr 22, 2003, 4:15:28 PM


My hat is off to you, sir!

But wait a minute... How do I know you didn't use a computer to do that?
Surely no mere human mind could pull that off?

RRS

Eric Bodden
Apr 22, 2003, 4:42:51 PM
> But wait a minute... How do I know you didn't use a computer to do that?
> Surely no mere human mind could pull that off?
He certainly used a clustering farm ;-)


Kelsey Bjarnason
Apr 22, 2003, 5:44:40 PM
[snips]

On Tue, 22 Apr 2003 20:15:28 +0000, Randall R Schulz wrote:

>>>One needs exponentiation, too, even if only for base 2.
>>
>>
>> Not even:
>>
>> 00
>> 01
>> 10
>> 11
>>
>> I count 4. :)
>
>
> My hat is off to you, sir!
>
> But wait a minute... How do I know you didn't use a computer to do that?

A Galactic Stellar Cluster 197, to be exact. 193 trillion gigaflops, 512
petabytes core RAM, 120 exabytes storage. The first computer designed to
be sufficiently powerful as to be able to track the U.S. national deficit
in real time - to the nearest megabuck.

But it counted them things. No exponentiation needed. I'd show you the
source code to prove it, but Earth isn't advanced enough to cope with
quintelinear algorithms yet.

> Surely no mere human mind could pull that off?

Human? Who said anything about... oh, crap, I blew my cover, didn't I?

Dale King
Apr 22, 2003, 6:18:57 PM
"Randall R Schulz" <rrsc...@cris.com> wrote in message
news:Yoepa.7739$JX2.4...@typhoon.sonic.net...


Actually, they don't have to understand exponentiation completely. All they
need to understand is that:

k^m > k^n if m > n and k > 0

In this case we have k = 256, m = 100, and n = 7:

256^100 > 256^7

so there cannot be a one-to-one function from the set of 100-byte inputs to
the set of 7-byte outputs.
--
Dale King


Eric Bodden
Apr 23, 2003, 4:52:28 AM
> k^m > k^n if m > n and k > 0

That should be k > 1; sorry to be picky :-P


Benny Baumann
May 4, 2003, 8:44:01 AM

"Dale King" <Dale...@Thomson.net> wrote in news:3ea5...@news.tce.com...
> k^m > k^n if m > n and k > 0
>
> in this case we have k = 256, m = 100, and n = 7
>
> 256^100 > 256^7
>
> so there cannot be a one-to-one function from the set of 100-byte inputs to
> the set of 7-byte outputs.
> --
> Dale King

But when using this in this context, you might add that it MUST be

k^m * entropy(k^m) <= k^n, with k > 1 and entropy(k^m) the data block's
initial entropy,

in order to decompress it. But I think some of us in this newsgroup don't
really want to know that the entropy of a data block gets near 1 as it gets
compressed, so that n has to be

n >= m * entropy(k^m)

for a given starting size. And because some can't use their brains for such
complex things (like counting and basic math), they can easily claim that
they can fly because they don't use Newton's laws. ;-)

I think this should be compressed short enough...

@Tim: Why don't you want money for repairing your brain, or even buying a
new one? ;-)
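
(For concreteness, a small Python sketch, an illustration rather than
anything Benny posted: estimate the order-0 entropy of a byte string and the
lower bound it implies on compressed size. For 100 random bytes the bound
lands around 80 bytes, while the 100-to-7 claim would need about 0.56 bits
per byte.)

import math
import os
from collections import Counter

def entropy_bits_per_byte(data):
    # Empirical order-0 (symbol-frequency) entropy estimate.
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in Counter(data).values())

data = os.urandom(100)  # stand-in for "any 100 bytes"
h = entropy_bits_per_byte(data)
print("%.2f bits/byte; order-0 bound ~ %.1f bytes (claim needs %.2f bits/byte)"
      % (h, len(data) * h / 8, 7 * 8 / len(data)))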

Dale King
May 6, 2003, 11:18:16 AM
"Benny Baumann" <Be...@firemail.de> wrote in message
news:b96535$uga$02$1...@news.t-online.com...


No, you do not have to include entropy in it at all. All you have to do is
look at a compressor as a function (more properly a binary relation, since
the question is whether it actually fits the definition of a function). You
put an input in and you get an output out. The decompressor is the reverse
process. For it to be lossless, it cannot be that for two different inputs I
get the same output (i.e. it must be one-to-one, not many-to-one). If the
number of inputs is greater than the number of outputs, then you cannot
generate unique outputs for each input.

--
Dale King


D.A.Kopf
May 6, 2003, 4:10:25 PM

Dale King wrote:
>
> No, you do not have to include entropy in it at all. All you have to do is
> look at a compressor as a function (more properly a binary relation, since
> the question is whether it actually fits the definition of a function).
> You put an input in and you get an output out. The decompressor is the
> reverse process. For it to be lossless, it cannot be that for two
> different inputs I get the same output (i.e. it must be one-to-one, not
> many-to-one). If the number of inputs is greater than the number of
> outputs, then you cannot generate unique outputs for each input.

I agree entropy doesn't enter into it, but an "input" is not so well
defined. The input can be a message composed of symbols, in which case
the order of the symbols can be important, and different symbols can
produce the same output as when using adaptive tables. Or the input
can be a set of messages, and again different messages can produce the
same output depending on their order, e.g. the output 1,1,1,1 for a
series of four messages can mean each is the expected one within the
current context. Ultimately it seems to me that a lossless compressor
is a reversible mapping from a set of possible concatenated messages
to a set of concatenated outputs. Each output indeed must uniquely
decompress, but when context is allowed the input set is not simply
the set of all possible binary messages. I am reminded of the SF story
in which superintelligent beings sent a message back to Earth
containing the theory of everything, which said something like
"concatenate the prime factors of 2^233985-42 and read the resulting
file as ASCII" :).

Dale King
May 6, 2003, 6:18:57 PM
"D.A.Kopf" <d...@dakx.com> wrote in message
news:3EB815C8...@dakx.com...

>
> Dale King wrote:
> >
> > No, you do not have to include entropy in it at all. All you have to
> > do is look at a compressor as a function (more properly a binary
> > relation, since the question is whether it actually fits the
> > definition of a function). You put an input in and you get an output
> > out. The decompressor is the reverse process. For it to be lossless,
> > it cannot be that for two different inputs I get the same output
> > (i.e. it must be one-to-one, not many-to-one). If the number of
> > inputs is greater than the number of outputs, then you cannot
> > generate unique outputs for each input.
>
> I agree entropy doesn't enter into it, but an "input" is not so well
> defined.

In the case in question it was very well defined. All sequences of exactly
100 bytes.

> The input can be a message composed of symbols, in which case
> the order of the symbols can be important, and different symbols can
> produce the same output as when using adaptive tables. Or the input
> can be a set of messages, and again different messages can produce the
> same output depending on their order, e.g. the output 1,1,1,1 for a
> series of four messages can mean each is the expected one within the
> current context. Ultimately it seems to me that a lossless compressor
> is a reversible mapping from a set of possible concatenated messages
> to a set of concatenated outputs. Each output indeed must uniquely
> decompress, but when context is allowed the input set is not simply
> the set of all possible binary messages. I am reminded of the SF story
> in which superintelligent beings sent a message back to Earth
> containing the theory of everything, which said something like
> "concatenate the prime factors of 2^233985-42 and read the resulting
> file as ASCII" :).

The definition of the input has nothing to do with what I was saying. All
you have to do is be able to count how many inputs you have and how many
outputs you have. If you have more inputs to the compressor than outputs,
the compressor cannot be lossless. I agree that when the number of inputs
and the number of outputs are both infinite, or when it is difficult to
determine an exact count, it takes a little more to show you can't compress
everything. But in a finite case like the one we have here, all you have to
do is count.

--
Dale King
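
(A last back-of-the-envelope sketch, not from the thread: the count settles
it even if the compressor may emit any length up to 7 bytes, since all
outputs of length 0 through 7 together are still astronomically fewer than
the 100-byte inputs.)

inputs = 256 ** 100
outputs = sum(256 ** i for i in range(8))  # every string of length 0..7
print(outputs < inputs)                    # True
print("inputs outnumber outputs by ~10^%d" % (len(str(inputs // outputs)) - 1))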

