
Any 100 bytes down to 7 Bytes Losslessly !


Tim Bernard

Apr 21, 2003, 8:37:58 AM
One must understand the limitations of math and its strengths, and change
the approach and view it differently to make it happen.
It does work; however, it is CPU intensive and currently takes too long for
traditional compression applications, other than for small text files, small
transmission-security encryption applications, or archive applications where
the end user is willing to wait a minute or so to bring up a small section
of data from a physical 80 gig drive that holds 800 gigs compressed on it.


DSCOTT

Apr 21, 2003, 8:59:10 AM
notmy...@server.com (Tim Bernard) wrote in
<GCRoa.119479$jVh....@news01.bloor.is.net.cable.rogers.com>:

Too bad you don't understand the "limitations of math." Even if you
let every possible bit pattern count as a single number in seven
bytes, it does not even come close to the number of possibilities in
a hundred bytes. Either you're insane or a con man. But you're not the
first to claim such nonsense; we get new ones here every few months.
The good news is that several fools with money they wish to piss
away still exist, so you do have a chance to fleece them of their
money, which they would otherwise spend foolishly anyway.

David A. Scott
--
My Crypto code
http://cryptography.org/cgi-bin/crypto.cgi/Misc/scott19u.zip
http://cryptography.org/cgi-bin/crypto.cgi/Misc/scott16u.zip
http://www.jim.com/jamesd/Kong/scott19u.zip old version
My Compression code http://bijective.dogma.net/
**TO EMAIL ME drop the roman "five" **
Disclaimer: I am in no way responsible for any of the statements
made in the above text. For all I know I might be drugged.
As a famous person once said, "any cryptographic
system is only as strong as its weakest link"

Mikael

Apr 21, 2003, 12:42:25 PM
In article
<GCRoa.119479$jVh....@news01.bloor.is.net.cable.rogers.com>,
"Tim Bernard" <notmy...@server.com> wrote:

Say you want to compress some data down to 2 bits.
But 2 bits can only be decoded to 2^2 = 4 unique results.
This means that only 4 different inputs can be compressed to those two bits.

In this case you have 7 bytes = 56 bits.
So you have 2^56 unique codes, which can only be decoded to 2^56 unique
results.

You say you can decode them to 2^800 unique results (100 bytes = 800 bits).
See how this can't be done?
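
A quick Python sketch of the same counting argument at toy sizes (the
mapping below is an arbitrary example, not anyone's real codec):

# Any "compressor" that maps every 3-bit input to a 2-bit output must
# send at least two different inputs to the same output, so a decoder
# cannot recover both of them.
from collections import defaultdict

def fake_compress(bits):
    # an arbitrary fixed mapping from 3 bits to 2 bits: drop the low bit
    return bits >> 1

outputs = defaultdict(list)
for value in range(2 ** 3):                  # all 8 possible 3-bit inputs
    outputs[fake_compress(value)].append(value)

for out, ins in outputs.items():
    if len(ins) > 1:
        print("output %d is shared by inputs %s -- not invertible" % (out, ins))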

/Mikael

Teo van der Vlies

Apr 21, 2003, 2:54:16 PM
Tim,

This is very interesting.
Is it possible to repeat this process?

Example:
14 groups of 100 bytes results in 14x7 bytes = 98 bytes;
then repeat the process once and get 7 bytes again.

Please let me know if this is possible.
How much time does the last compression take?

Teo.


"Tim Bernard" <notmy...@server.com> wrote in message
news:GCRoa.119479$jVh....@news01.bloor.is.net.cable.rogers.com...

Kelsey Bjarnason

Apr 21, 2003, 5:17:16 PM
On Mon, 21 Apr 2003 12:37:58 +0000, Tim Bernard wrote:

> One must understand the limitations of math

Most folks who use math regularly are familiar with its limits. However,
those limits are not sufficient to allow in magic at the drop of a hat.


Eric Bodden

Apr 21, 2003, 7:41:16 PM
Another very good application for that would be our fridge!
We are almost always running out of space with 5 guys... ;-)

Eric


Tim Bernard

Apr 21, 2003, 7:58:16 PM
Yes, it is indeed possible to repeat this process. With a few extra bits of
overhead one can compress anything down to nearly nothing at all.
CPU processing is a large issue, unfortunately.
The time to compress any 100 bytes down to 7 bytes is 30 seconds maximum.
Currently the system in use is a 1200 Duron chip. Doubling this can easily
reduce the time to 15 seconds. Rewriting the software in assembly
language could bring the time down to 7 seconds or so.


"Teo van der Vlies" <teo_...@hotmail.com> wrote in message
news:s7Xoa.650640$sj7.26136066@Flipper...

Eric Bodden

Apr 21, 2003, 8:02:16 PM
> CPU processing is a large issue, unfortunately.
Hmmm... So, just trying to understand you: the more time you invest, the
better your compression gets, yeah?
What happens if you decompress? Do you _win_ time then? Just thinking,
because I have my final exams starting next week, and just in case I am
running out of time I should probably tweak my MP3 player beforehand to
decompress the files. What do you think about it?

Eric


Tim Bernard

Apr 21, 2003, 11:06:11 PM
Correct, the more processing time you invest, the higher the compression ratio.
If one invests only one second to compress data, you would be able to
compress 40 bytes into 4 bytes.
Decompression is roughly 10 times faster than compression.
So if you compress for 30 seconds, you decompress it in 3 seconds or so.
To increase productivity one could implement the software engine
distributively via a computer network, where each computer gets 1/64th of the
data to compress (if there are 64 computers in the network), making it faster.
Parallel processing works as well if you have 32 CPUs inside one computer,
etc.

The app is currently best suited for instant-messaging compression / encryption.

"Eric Bodden" <e...@ukc.ac.uk> wrote in message
news:b820q9$ekj$1...@athena.ukc.ac.uk...

Tim Bernard

Apr 21, 2003, 11:07:16 PM
Following your approach to the solution, I do see the problem you would face.
I don't have that problem. That's why you think it's not possible.

"Mikael" <mlqn...@telia.com> wrote in message
news:mlqnospam-81588...@news.fu-berlin.de...

Peter Ballard

Apr 22, 2003, 1:51:17 AM
"Tim Bernard" <notmy...@server.com> wrote in message news:<sA%oa.96714$BQi....@news04.bloor.is.net.cable.rogers.com>...

> Yes, it is indeed possible to repeat this process. With a few extra bits of
> overhead one can compress anything down to nearly nothing at all.
> CPU processing is a large issue, unfortunately.
> The time to compress any 100 bytes down to 7 bytes is 30 seconds maximum.
> Currently the system in use is a 1200 Duron chip. Doubling this can easily
> reduce the time to 15 seconds. Rewriting the software in assembly
> language could bring the time down to 7 seconds or so.

Can it compress any 2 bytes into 1 byte?

Regards,

Peter Ballard
Adelaide, AUSTRALIA
pbal...@ozemail.com.au
http://www.ozemail.com.au/~pballard/

"Love your enemies, and pray for those who persecute you" - Jesus
(Matthew 5:44, NIV)

Tim Bernard

Apr 22, 2003, 8:54:55 AM
No, it cannot compress any two bytes into one byte.

"Peter Ballard" <pbal...@ozemail.com.au> wrote in message
news:9d5509fa.03042...@posting.google.com...

Mikael Lundqvist

Apr 22, 2003, 9:21:32 AM
Tim Bernard wrote:
> Following your approach to the solution, I do see the problem you would face.
> I don't have that problem. That's why you think it's not possible.
>
You have that "problem" too of course.
The thing is that compression has to be one to one to be useful.

ONE string of bits can only be encoded to ONE result.
This ONE result can only be decoded to ONE string of bits.
How else can you get back what you encoded?

This is why 7 bytes = 56 bits or 2^56 unique data can only be
decoded/transformed/changed to 2^56 unique results.

What's interesting about compression is not these obvious and
indisputable limitations, but the fact that we can use statistics to
encode the more popular data with fewer bits and the less popular with
more bits.

But it's ALWAYS one to one, or an injection if you like.
http://wombat.doc.ic.ac.uk/foldoc/foldoc.cgi?query=injection
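
A small Python illustration of that idea (the alphabet and code below are
made up for the example): a prefix code gives the most frequent symbol the
shortest codeword, yet the mapping stays strictly one to one.

# Toy prefix code: 'a' (assumed most frequent) gets 1 bit, 'b' and 'c'
# get 2 bits.  No codeword is a prefix of another, so decoding is
# unambiguous and encode/decode form an injection.
CODE = {"a": "0", "b": "10", "c": "11"}

def encode(message):
    return "".join(CODE[ch] for ch in message)

def decode(bits):
    inverse = dict((v, k) for k, v in CODE.items())
    out, buf = [], ""
    for bit in bits:
        buf += bit
        if buf in inverse:
            out.append(inverse[buf])
            buf = ""
    return "".join(out)

msg = "aababca"
assert decode(encode(msg)) == msg            # we always get back what we put in
print(msg, "->", encode(msg))                # 10 bits instead of 14 fixed-length bits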

/Mikael

Ray

Apr 22, 2003, 10:04:45 AM
"Tim Bernard" <notmy...@server.com> wrote in message news:<El2pa.97421$BQi....@news04.bloor.is.net.cable.rogers.com>...

Is it possible to encode a message WITHOUT considering all
possible combinations? That is, could we encode the characteristics of only
the current message (making the message itself its universe), without
considering the surrounding possible values?

Raymond

Ken Savage

Apr 22, 2003, 6:35:29 PM
Tim Bernard wrote:

>> Can it compress any 2 bytes into 1 byte?

> No, it cannot compress any two bytes into one byte.

Why not??

Could it compress any combination of 3 bytes into 2?

Could it compress any combination of 4 bytes into 3 (or less)

How about 5 into 4 (or less)? 6 into 5?

Could it compress 1000 bytes into 999? (Nope, I don't want
to do any better than 999 -- no reason to tax one's CPU!!)
I (and everyone else here) would be perfectly happy with an
algorithm that could guarantee 1000-->999 every time.

What is the minimum value of 'n' at which point the algorithm
(or a variant thereof) is able to do the following:

*** Completely offline, without reading or writing to a database,
*** self-modifying executables, or other metadata repository

a) accept *any* sequence of 'n' bytes
b) produce, for every sequence accepted in (a), a secondary sequence
of fewer than 'n' bytes (ie: 'n-1' bytes is perfectly acceptable!)
c) recreate, from every sequence produced in (b) and no other meta
input (ie: filename used as part of input is illegal), the original
sequence of 'n' bytes from (a) -- AND NOTHING ELSE.
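
For small n the challenge can even be checked exhaustively. A rough Python
sketch (compress/decompress here are hypothetical stand-ins for whatever
codec is being claimed, not real functions):

from itertools import product

def passes_challenge(compress, decompress, n):
    for data in product(range(256), repeat=n):   # (a) every n-byte input
        original = bytes(data)
        out = compress(original)
        if len(out) >= n:                         # (b) output must be shorter
            return False
        if decompress(out) != original:           # (c) must round-trip exactly
            return False
    return True

# There are 256**n inputs but only (256**n - 1)/255 byte strings shorter
# than n bytes, so (b) and (c) can't both hold for every input; the loop
# is guaranteed to hit a failure no matter what compress() does.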

Ken

Kelsey Bjarnason

Apr 22, 2003, 7:00:07 PM
[snips]

On Tue, 22 Apr 2003 22:35:29 +0000, Ken Savage wrote:

> What is the minimum value of 'n' at which point the algorithm
> (or a variant thereof) is able to do the following:
>
> *** Completely offline, without reading or writing to a database,
> *** self-modifying executables, or other metadata repository
>
> a) accept *any* sequence of 'n' bytes
> b) produce, for every sequence accepted in (a), a secondary sequence
> of fewer than 'n' bytes (ie: 'n-1' bytes is perfectly acceptable!)
> c) recreate, from every sequence produced in (b) and no other meta
> input (ie: filename used as part of input is illegal), the original
> sequence of 'n' bytes from (a) -- AND NOTHING ELSE.


2^40960-1 bytes. For proof, send me a complete set of all files of that
size, and I'll send you back the entire set, each compressed by one byte.

:)


Gib Bogle

Apr 23, 2003, 12:41:00 AM
This is really excellent. It means that a file of any size can be
compressed down to about 7 bytes, by iterative compression. Each
iteration stage reduces the size by a factor of 100/7 = 14.3, and it'll
take, for example, just 5 iterations to reduce a 4 MB file to 7 bytes.
Good news for the computer world!
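
Taking the claimed ratio at face value, the arithmetic is easy to redo in
Python (treating "4 MB" as 4,000,000 bytes):

# How many 100-to-7 passes would it take to get 4 MB down to 7 bytes?
size = 4_000_000.0
ratio = 100 / 7
passes = 0
while size > 7:
    size /= ratio
    passes += 1
print(passes, round(size, 1))    # -> 5 passes, about 6.7 bytes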

Gib

Randall R Schulz

Apr 23, 2003, 1:08:02 AM


"Good News for Modern Man"
