
some (should-be) ground-rules f


Bradford L Knowles

Jun 10, 1988, 9:54:18 PM
In article <76000213@uiucdcsp> gil...@uiucdcsp.cs.uiuc.edu writes:
>
>But when I benchmarked compress on some digital pictures, I found that
>stuffit's LZW algorithm is much more efficient than compress's
>algorithm.

> [stuff deleted about too many programmers involved with compress].

>So I'm not voting for making compress a standard. I believe stuffit
>is better.
>
>Don Gillies {ihnp4!uiucdcs!gillies} U of Illinois
> {gil...@cs.uiuc.edu}

I think the only thing compress has going for it is that almost all
Unix sites have it (something that is rather useful when you consider
that a majority of Usenet sites run some flavor of Unix). If we had
a compatible program for the Macintosh, we could compress stuff at work
(using Unix, of course) and download those files at home (using the Mac,
of course). Without such a program, this would be more problematic.

Essentially, like many things, compress can be/should be a standard
*NOT* because it is the best program for the job, but because it
is a good program that has the best distribution (it beats Unix
compact any day, and has equally good distribution).

-Brad Knowles

UUCP: ...!ihnp4!occrsh!uokmax!blknowle ARPA: blkn...@uokmax.ecn.uoknor.edu
SNAIL: 1013 Mobile Circle
Norman, OK 73071-2522
-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-
Philosophy of Boris Yeltsin: "If one wants to be unemployed, one will
criticize one's boss. If one wants to be sent to Siberia, one
will criticize the wife of one's boss."
-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-
Disclaimer: (The above opinions are my own. They have nothing to do with the
University of Oklahoma or intelligence of any sort. :-)

Richard Michael Todd

Jun 10, 1988, 11:27:43 PM
In article <76000213@uiucdcsp> gil...@uiucdcsp.cs.uiuc.edu writes:
>But when I benchmarked compress on some digital pictures, I found that
>stuffit's LZW algorithm is much more efficient than compress's
>algorithm. I'm not sure compress is all that great a program. If you

Interesting, since compress *is* using the LZW algorithm. I believe the
only differences in compress are hacks to make the hash table lookups
faster, which shouldn't affect its output. The interesting question is,
just what is stuffit doing that compress isn't?
Another interesting question--just what level of LZW code is being used
by the two programs (stuffit and compress). Since the Mac doesn't have
evil segment registers from hell, I'd expect stuffit and the Mac compress
port to use 16-bit LZW, since the higher the number of bits the better, and
on the Mac you wouldn't be forced to limit the number because of CPU
braindamage, but I don't know.
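
For concreteness: the LZW string table maps a (prefix code, next byte) pair
to a new code, and the speed hacks live in how that pair is looked up. Here
is a rough sketch in C of such a lookup -- illustrative names and a simple
linear reprobe, *not* the actual compress source:

long  htab[69001];                    /* packed (prefix, byte) keys; 69001 is
                                         one of the table sizes compress uses */
short codetab[69001];                 /* code assigned to each pair; 0 = empty */

/* Return the code for (prefix, byte), or -1 if the pair is not yet
   in the table and the caller should insert it. */
int lookup(int prefix, int byte)
{
    long key = ((long)prefix << 8) | (byte & 0xff);
    int  i   = (int)(key % 69001);    /* initial probe */

    while (codetab[i] != 0) {
        if (htab[i] == key)
            return codetab[i];        /* pair already in the table */
        if (++i == 69001)             /* simple linear reprobe     */
            i = 0;
    }
    return -1;
}

However the lookup is implemented, the table holds the same strings, so the
hashing strategy changes speed, not output.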

>So I'm not voting for making compress a standard. I believe stuffit
>is better.

Could be. I haven't noticed much difference between compress and the
various PC-based LZW-compression programs (ARC, PKXARC, ZOO) in terms of
output size. I don't have a Mac, so I can't comment directly on the
Mac implementations. Differences between # of bits used in LZW compression
do make some difference on really large files (e.g. Gnu CC source went from
1.4M to 1.7M in going from 16-bit LZW to 13-bit.) Could you provide us with
some actual test figures?
--
Richard Todd Dubious Domain: rmt...@uokmax.ecn.uoknor.edu
USSnail:820 Annie Court,Norman OK 73069 Fido:1:147/1
UUCP: {many AT&T sites}!occrsh!uokmax!rmtodd
"Indulgent is a word that anyone who doesn't like progressive rock can call
you" - Eddie Jobson

Gail Zacharias

Jun 11, 1988, 1:06:24 PM
In article <14...@uokmax.UUCP> rmt...@uokmax.UUCP (Richard Michael Todd) writes:
> Another interesting question--just what level of LZW code is being used
>by the two programs (stuffit and compress).

Stuffit uses 14-bit compression.

Freek Wiedijk

Jun 11, 1988, 4:46:19 PM
In article <76000213@uiucdcsp> gil...@uiucdcsp.cs.uiuc.edu writes:
>But when I benchmarked compress on some digital pictures, I found that
>stuffit's LZW algorithm is much more efficient than compress's
>algorithm. I'm not sure compress is all that great a program. If you

I thought that unsit used compress to decompress StuffIt files under Unix.
That implies that StuffIt and compress use the same LZW compression, doesn't
it?

--
Freek Wiedijk <the pistol major> UUCP: uunet!mcvax!uva!freek
#P:+/ = #+/P?*+/ = i<<*+/P?*+/ = +/i<<**P?*+/ = +/(i<<*P?)*+/ = +/+/(i<<*P?)**

gil...@uiucdcsp.cs.uiuc.edu

Jun 12, 1988, 1:29:00 AM

I have been reading the compress source code extensively, and the
original paper by Terry Welch. Here are some of the many non-standard
modifications to "vanilla" LZW:

(1) adaptive reset to restart compression at a certain threshold
(suggested, but not standard). I've read the code, and I suspect
the reset may always be arriving 100% too late, but I'm not positive.
(2) variable-length codes, minimum = 9 bits, max = 16, increasing
as the compression goes on. "Vanilla" LZW is 12 bits (constant).
(3) some kind of special processing for low-density bit images.
This may be a significant performance hit -- it may significantly
overuse (abuse?) the adaptive reset [HOG_CHECK code].
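
To make (2) concrete, here is a sketch in C of the widening rule -- the
names are illustrative, not taken from the compress source. The encoder
widens its output codes by one bit whenever the next free table slot no
longer fits in the current width, up to the 16-bit ceiling:

#define INIT_BITS 9
#define MAX_BITS  16

int  n_bits    = INIT_BITS;           /* current output code width     */
long next_code = 257;                 /* 0-255 literals, 256 = CLEAR   */
long maxcode   = (1L << INIT_BITS) - 1;

/* Called once per new string-table entry during compression. */
void table_grew(void)
{
    if (++next_code > maxcode && n_bits < MAX_BITS) {
        n_bits++;                     /* widen: 9 -> 10 -> ... -> 16   */
        maxcode = (1L << n_bits) - 1;
    }
}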

Someone said we should standardize compress because it's decent and
nearly every UNIX machine has it. Well, I believe that IBM PCs are
decent and nearly every business has one. So while we're at it, why
don't we just give up on the Macintosh altogether? This would save
even more time!

Bradford L Knowles

Jun 13, 1988, 2:09:15 PM
In article <76000218@uiucdcsp> gil...@uiucdcsp.cs.uiuc.edu writes:
>
>Someone said we should standardize compress because it's decent and
>nearly every UNIX machine has it. Well, I believe that IBM PCs are
^^^^^^^^^^^

>decent and nearly every business has one. So while we're at it, why
^^^^^^

Let's get realistic here: IBM PCs are decent?

>don't we just give up on the Macintosh altogether? This would save
>even more time!
>
>Don Gillies {ihnp4!uiucdcs!gillies} U of Illinois
> {gil...@cs.uiuc.edu}

I *HOPE* you posted this as a joke.

Bradford L Knowles

Jun 16, 1988, 10:21:20 AM
In article <3...@uva.UUCP> fr...@uva.UUCP (Freek Wiedijk) writes:
>In article <76000213@uiucdcsp> gil...@uiucdcsp.cs.uiuc.edu writes:
>>But when I benchmarked compress on some digital pictures, I found that
>>stuffit's LZW algorithm is much more efficient than compress's
>>algorithm. I'm not sure compress is all that great a program. If you
>
>I thought that unsit used compress to decompress StuffIt files under Unix.
>That implies that StuffIt and compress use the same LZW compression, doesn't
>it?
>
>--
>Freek Wiedijk <the pistol major> UUCP: uunet!mcvax!uva!freek

From what I understand, Raymond Lau looked at compress (which uses LZW
only) and saw that it was a very good compression algorithm. But when
he actually implemented it, he discovered that his compression ratios
on MacPaint pictures were much lower than those of PackIt III (which uses
Huffman encoding). He then decided to implement the compression
algorithms of both programs in his. That is why StuffIt is backward
compatible with PackIt III, and also how it achieves such good
compression ratios. BTW, StuffIt has an LZW-only mode also, so it is
basically compress for the Mac if you use it that way. As for unsit
using compress, don't ask me; until five days ago, I had never even
heard of unsit! In reality, I don't think it would use compress, even
though they both use some form of LZW encoding.

gil...@uiucdcsp.cs.uiuc.edu

Jun 16, 1988, 2:46:00 PM

When you learn a little bit about compression algorithms, you find that
almost ANY algorithm can be misimplemented to get poor performance.
And the decompressor will still work!

I can implement anti-Huffman coding to EXPAND the data AS MUCH as
possible. The Huffman decompressor will still work.

I can implement LZW to do A RESET after every byte code. The data
will always expand to 300% of its original size (assuming 12-bit
codes). The LZW decompressor will still work.

Just because the data decompresses doesn't mean it was compressed
well. The LZW paper does not define a unique compression output.
Many different outputs are possible, and some are VERY bad.
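
To see how cheap such sabotage is, here is a sketch in C of that
pathological LZW encoder -- made-up helper names, not any real program's
source. It emits one 12-bit literal code plus one 12-bit reset code per
input byte; a fixed-12-bit LZW decompressor with the conventional clear
code at 256 decodes it correctly, at 24 output bits per 8 input bits,
which is the 300% figure above:

#include <stdio.h>

#define CLEAR 256                     /* conventional LZW clear code    */

static unsigned long bitbuf = 0;      /* MSB-first bit accumulator      */
static int bitcnt = 0;

static void put12(unsigned code)      /* pack one 12-bit code to stdout */
{
    bitbuf = (bitbuf << 12) | (code & 0xfff);
    bitcnt += 12;
    while (bitcnt >= 8) {
        bitcnt -= 8;
        putchar((int)((bitbuf >> bitcnt) & 0xff));
    }
}

int main(void)
{
    int c;

    while ((c = getchar()) != EOF) {
        put12((unsigned)c);           /* literal code for this byte     */
        put12(CLEAR);                 /* ...then throw the table away   */
    }
    if (bitcnt > 0)                   /* flush the final partial byte   */
        putchar((int)((bitbuf << (8 - bitcnt)) & 0xff));
    return 0;
}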

Michael J. Farren

Jun 18, 1988, 9:00:27 AM
In article <27...@utastro.UUCP> wer...@utastro.UUCP (Werner Uhrig) writes:
>
>Will this baloney argument never stop? Are you telling me that when I reduce
>the size of a text file (or whatever) by 40% by compressing it, here comes
>the transmission program trying to compress it, only to double it back in size?

The compress program used in Usenet transmission will not result in a
larger transmitted file than that stored. But - what happens when your
pre-compressed file gets sent through a mailer which doesn't understand
eight-bit data? To guarantee transmission throughout the entire net,
you need to convert a compressed file into pure ASCII by using uuencode
or some such, and this DOES increase file size significantly.
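
The arithmetic behind that increase is visible in a sketch of the uuencode
transformation (illustrative, not the actual uuencode source, and omitting
the begin/end lines): every 3 raw bytes become 4 printable characters, and
every 45-byte line adds a length character and a newline, for roughly
35-40% growth before the mailer ever sees the file.

#include <stdio.h>
#include <string.h>

#define ENC(c) (((c) & 077) + ' ')    /* 6 bits -> printable ASCII      */

/* One output line: a length character, then 4 characters per 3 bytes. */
static void encode_line(const unsigned char *buf, int n)
{
    int i;

    putchar(ENC(n));
    for (i = 0; i < n; i += 3) {
        putchar(ENC(buf[i] >> 2));
        putchar(ENC((buf[i] << 4) | (buf[i + 1] >> 4)));
        putchar(ENC((buf[i + 1] << 2) | (buf[i + 2] >> 6)));
        putchar(ENC(buf[i + 2]));
    }
    putchar('\n');
}

int main(void)
{
    unsigned char buf[45];            /* 45 raw bytes -> a 62-char line */
    size_t n;

    while ((n = fread(buf, 1, sizeof buf, stdin)) > 0) {
        memset(buf + n, 0, sizeof buf - n);   /* pad the final group   */
        encode_line(buf, (int)n);
    }
    encode_line(buf, 0);              /* zero-length line ends the data */
    return 0;
}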

Remember, too, that compression is quite ineffective on smaller files
(such as individual articles), and fairly CPU-intensive as
well. The standard news transmission scheme batches many articles together
before compressing, resulting in much greater efficiency.

--
Michael J. Farren | "INVESTIGATE your point of view, don't just
{ucbvax, uunet, hoptoad}! | dogmatize it! Reflect on it and re-evaluate
unisoft!gethen!farren | it. You may want to change your mind someday."
gethen!far...@lll-winken.llnl.gov ----- Tom Reingold, from alt.flame

Werner Uhrig

May 21, 1988, 8:21:22 PM

[ this article was crossposted to news.admin because I feel that the ]
[ basic premise applies to non-Macintosh groups also ]
Folks,
I usually pipe hexed articles through xbin to take advantage of the
reduced download time of the resulting unhexed file(s).

I was rather bothered that xbin (on my favorite SUN) choked on GuardDog,
enough so that I decided to figure out why. I downloaded the hex sources
and unhexed them with BinHex4, resulting in a file called "Guard Dog TM"
(where TM represents the trademark sign). Suspecting that the non-ASCII
character might be the cause of xbin choking, I changed the name to "Guard Dog"
(given that 'Guard Dog' is of type SYSTEM, you have to 'bend some bits'
before you can rename the file), hexed and uploaded that file, and, presto,
xbin stopped dumping core.

While I was at it, I figured I might as well check how much smaller a
compressed file would have been (I used StuffIt 1.40 for that), with the
following result:

32 -rw-r--r-- 1 werner other 32134 May 21 17:36 Guard_Dog.Hqx
18 -rw-r--r-- 1 werner other 18336 May 21 17:55 Guard_Dog.sit.hqx

All this leads me to the following suggestion:

ALL files distributed on COMP.BINARIES.MAC and COMP.SOURCES.MAC should be
COMPRESSED before being HEXED. The (intermediate) compressed file should
have a name consisting of ASCII-characters only so that xbin can be used to
unhexify the article on the mainframe in order to reduce the download time.

It would be best if the person submitting the article would do "the work";
however, the moderator should at least verify that the submitted article
conforms to the standard (I assume that he already verifies, by inspecting
the submitted material, that no pirated material gets distributed).

If the moderator is unable (or unwilling) for technical or personal reasons
to take on that (additional?) work, I propose that a call go out for
volunteers to help with such tasks - or for a new moderator who is able and
willing to take on or coordinate such an effort.

Cheers, ---Werner

"Let's take the best there is and improve it"
--- someone famous is bound to have come to that conclusion
before me ....

PS: if someone has a version of xbin that can handle ALL non-ASCII chars in
file-names, I'd appreciate .....

Jim Macak

May 23, 1988, 1:44:26 PM
In article <26...@utastro.UUCP> wer...@utastro.UUCP (Werner Uhrig) writes:
> (much deleted...)

>
>ALL files distributed on COMP.BINARIES.MAC and COMP.SOURCES.MAC should be
>COMPRESSED before being HEXED. The (intermediate) compressed file should
>have a name consisting of ASCII-characters only so that xbin can be used to
>unhexify the article on the mainframe in order to reduce the download time.
>
> (etc.)

Yes, an excellent suggestion. It's crazy to have 10-part files posted in
comp.binaries.mac when file compression can reduce them to 50-60% of
that size.

And while we're at it, perhaps a standard program for doing the
packing/compression ought to be adopted. Ray Lau's StuffIt is the obvious
choice. Though still with an occasional bug, StuffIt is certainly stable
enough to use for mere stuffing/unstuffing of files. StuffIt has certainly
become the standard on GEnie, CIS, and other Mac sections, so why not here
too? Besides, Ray Lau only requests a shareware fee if one uses StuffIt for
encoding/archiving purposes. Otherwise, it is considered a freebie: all the
more reason to adopt it as a standard on this system.

Jim

--

>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>><<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
Jim --> ma...@lakesys.UUCP (Jim Macak) {Standard disclaimer, nothin' fancy!}
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>><<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<

Gail Zacharias

May 23, 1988, 5:24:31 PM
In article <6...@lakesys.UUCP> ma...@lakesys.UUCP (Jim Macak) writes:
>And while we're at it, perhaps a standard program for doing the
>packing/compression ought to be adopted. Ray Lau's StuffIt is the obvious
>choice. <....> Besides, Ray Lau only requests a shareware fee if one uses
>StuffIt for encoding/archiving purposes. Otherwise, it is considered a
>freebie: all the more reason to adopt it as a standard on this system.

In order for you to use this "freebie", the contributor has to use StuffIt
for encoding. This pretty much excludes people like me, who oppose shareware
on principle, from contributing.

I suggest tar and compress instead. There are (several) freeware versions of
each available for the Macintosh, and I'm sure it'd be easy to write versions
for other micros. And besides, every unix system has them, so you're not
locked into one individual's good will for continued support. This also gives
archive sites (which are pretty much all Unix) a lot of flexibility on how to
handle storage without having to transfer files to a micro for repackaging.
For example, the software could be posted as uuencoded tar files, then
automatically uudecoded and compressed on receipt at the archival host,
thereby getting around the compressing-compressed-files problem of news
transmission.
--
g...@entity.com ...!mit-eddie!gz
Unix: when you can't afford the very best.

Howard Chu

May 24, 1988, 2:57:03 PM
In article <6...@lakesys.UUCP> ma...@lakesys.UUCP (Jim Macak) writes:
%And while we're at it, perhaps a standard program for doing the
%packing/compression ought to be adopted. Ray Lau's StuffIt is the obvious
%choice. Though still with an occasional bug, StuffIt is certainly stable
%enough to use for mere stuffing/unstuffing of files. StuffIt has certainly
%become the standard on GEnie, CIS, and other Mac sections, so why not here
%too? Besides, Ray Lau only requests a shareware fee if one uses StuffIt for
%encoding/archiving purposes. Otherwise, it is considered a freebie: all the
%more reason to adopt it as a standard on this system.

Just out of curiosity (I'm no Mac maven...), does the Stuffit program handle
ARC format compressed files? It would be nice if it did... There's been many
a time when I've wanted to get text files to or from a Mac and some other
system, and, while ARC has become the standard everywhere else, I have to
wade thru these weird programs like Packit, Packit II, Stuffit, etc... on
the Mac. I think Stuffit is a very nice program, but it'd be a useful new
feature if it could work with ARC files.
--
/
/_ , ,_. Howard Chu
/ /(_/(__ University of Michigan
/ Computing Center College of LS&A
' Unix Project Information Systems

Roger L. Long

May 25, 1988, 3:01:00 AM
In article <6...@lakesys.UUCP> ma...@lakesys.UUCP (Jim Macak) suggests
standardizing on StuffIt for packing/compression.

In article <3...@spt.entity.com> g...@eddie.mit.edu (Gail Zacharias) objects:


>In order for you to use this "freebie", the contributor has to use StuffIt
>for encoding. This pretty much excludes people like me, who oppose shareware
>on principle, from contributing.

Actually not. That's the job of the moderator. People could contribute
stuff in just about any reasonable form that I could figure out how to deal
with, and I'd repackage things into the "standard" form here. And there is
a public domain (at least for UNIX) un-StuffIt package, so there's no reason to
even USE the shareware package if you don't care to.

What matters most (to me) for submitting things to comp.{binaries,sources}.mac
is that

- the submission gets here intact
- it is real obvious who the submitter is (i.e. name, email address,
organization, all the stuff that makes up an article header).
- if there is a version number, it'd be nice if you'd include that in
the header, since one complaint a number of people have is not knowing
what version something is; with it in the header, they don't have to download
something to see if it's an updated version of what they already have.
- Documentation, if provided, is TEXT or MacWrite, since these seem
to be the most standard. Besides, I don't personally have every
Word Processing package available, so I can't check out documentation
that comes in other flavors.
- You write a paragraph or so explaining what you are posting (yes, I
take a look at what's posted, but have better things to do than
write a description for something that YOU felt was worth posting).
Remember, in this paragraph, you want to convey enough information
to the people who receive the article to give them something by which
they can decide if they want to spend their time downloading it.
Let people know if documentation is included. If something will only
work on a Mac II, let people know that in the text description, so
that people with Mac SE's don't waste their time downloading something
that has no hope of working on their machine. If it requires the
latest version of the system software, let people know that so they
will understand why things don't work on their machine if they're using
old software.

Thanks.

--
Roger L. Long
dhw68k!bytebug

Earle R. Horton

May 25, 1988, 10:54:42 PM
In article <26...@utastro.UUCP> wer...@utastro.UUCP (Werner Uhrig) writes:
>
> (He says xbin doesn't like file name characters > 0x7F)

>
>PS: if someone has a version of xbin that can handle ALL non-ASCII chars in
> file-names, I'd appreciate .....

Werner requests that posters of Macintosh programs not use non-ASCII
characters in file names, since these cause trouble in UNIX file names.
It is better to fix xbin, I think, since then you don't place artificial
restrictions on Macintosh users, who may not especially care about UNIX
peculiarities. The fix is easy: just AND every character in the UNIX
file names used by xbin with 0x7F. Here is a C code fragment from my
copy of xbin. Note that I have added translation for some (but not all)
characters that sh and csh do not like. This is, frankly, getting
cumbersome, and I am thinking of converting to a translation
table someday...

namebuf[n] = '\0';

/* Strip the high bit first, so that a Macintosh character such as
   the trademark sign (0xAA) is folded into 7-bit ASCII before the
   test below; masking after the test would let 0xAA slip through
   as '*'.  Then get rid of characters that sh and csh do not like. */
for (np = namebuf; *np; np++) {
    *np &= 0x7f;
    if (*np == ' ' || *np == '/' || *np == '!' ||
        *np == '(' || *np == ')' || *np == '[' || *np == ']' ||
        *np == '*' || *np == '<' || *np == '>' ||
        *np == '?' || *np == '\'' || *np == '"' || *np == '$')
        *np = '_';
}
sprintf(files.f_data, "%s.data", namebuf);
sprintf(files.f_rsrc, "%s.rsrc", namebuf);
sprintf(files.f_info, "%s.info", namebuf);

My 2 cents worth:

I don't think StuffIt is a good choice as a standard for compressing
files, since then every poster has to shell out $20.00 to be a legal
user. I am partial to the use of UNIX-compatible tar and compress. There
exist both tar and compress for the Macintosh, both freeware, but
only the MPW Tool version of tar handles the resource fork. At this
time, there is no suitable method of archiving/compressing
Macintosh files which:

a) Is free.
b) Runs under the Finder.

The MPW version of Tar can archive Macintosh files using the MacBinary
standard, and the resulting file can be unTar'ed on any UNIX machine
(and possibly VMS and other systems), as well as on any Macintosh which has
MPW installed. Portions of the archive can be downloaded to a
Macintosh from any system which has an xmodem program, and many free
terminal programs will recognize the MacBinary format of the
downloaded file(s). Tar files can be compressed using MacCompress2.1,
which is compatible with UNIX compress. If the MPW Tool version of
Tar were compiled into a Macintosh application, then I think this
would be an ideal choice for an all-purpose archiving program.
Volunteers?
*********************************************************************
*Earle R. Horton, H.B. 8000, Dartmouth College, Hanover, NH 03755 *
*********************************************************************

dor...@uxg.cso.uiuc.edu

May 26, 1988, 12:55:00 PM

>locked into one individual's good will for continued support. This also gives
>archive sites (which are pretty much all Unix) a lot of flexibility on how to
^^^^^^^^
Tell that to sumex.

>g...@entity.com ...!mit-eddie!gz
----
Steve Dorner, U of Illinois Computing Services Office
Internet: dor...@uxc.cso.uiuc.edu UUCP: ihnp4!uiucuxc!dorner
IfUMust: (217) 333-3339

Straka

May 27, 1988, 8:47:15 AM
In article <86...@dartvax.Dartmouth.EDU> ear...@eleazar.dartmouth.edu (Earle R. Horton) writes:
>My 2 cents worth:
>
>I don't think StuffIt is a good choice as a standard for compressing
>files, since then every poster has to shell out $20.00 to be a legal
>user. I am partial to the use of UNIX-compatible tar and compress. There

There are two issues here (at least as far as the net is concerned). One is
getting the binary to the moderator. Compression is not crucial here. Any
means of getting the binary to the moderator would not stress the bandwidth
of the network too much.

The moderator unpacks, unstuffits, or unhexifies the binaries, reviews them,
and only THEN stuffits them together for distribution on the net. This way,
only the moderator is obliged to contribute the shareware fee.

I don't think there is much of an argument against Stuffit. Of course, if
you want to help out the revenues of AT&T, Sprint, MCI, et al., then go right
ahead... I would prefer to use the same resources to process MORE binaries
through the net for the same cost using Stuffit.
--
Rich Straka ihnp4!ihlpf!straka

Advice for the day: "MSDOS - just say no."

Larry E. Kollar

May 27, 1988, 2:41:02 PM
[I'm cross-posting to comp.sys.amiga since this applies to amiga binaries
as well.]

To reduce the amount of binaries on the net, and to enhance the usefulness of
what's posted, I propose the following scoring system:

All submissions have a base score of 0.

Sources accompany submission: +10
Public domain (as opposed to free copyrighted): +5
Binary < 20K: +5
Binary < 50K: +3 (a 10K binary has a score of +5, not +8)

Binary > 100K: -3
Binary > 150K: -5
Shareware: -5
Demo of commercial program: -10

Each submission is given a score, and placed in the posting queue based on that
score. The higher the score, the sooner it gets posted. Small things aren't
clobbered nearly as often as larger programs, and programs with sources are
much more useful, so we should encourage these submissions. Big shareware or
commercial demos would most likely never get posted, due to things jumping
ahead of them in line. The Mac and Amiga are hard enough to learn to program;
the until-recent dearth of example Mac sources didn't help.
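
A sketch in C of how a posting-queue script might apply those numbers
(illustrative only; the size brackets are exclusive, per the 10K example):

/* Score a submission per the proposal above; higher = posted sooner. */
int submission_score(long kbytes, int has_source, int is_pd,
                     int is_shareware, int is_demo)
{
    int score = 0;

    if (has_source)        score += 10;
    if (is_pd)             score += 5;

    if (kbytes < 20)       score += 5;   /* only the best bracket counts */
    else if (kbytes < 50)  score += 3;

    if (kbytes > 150)      score -= 5;   /* likewise for the penalties   */
    else if (kbytes > 100) score -= 3;

    if (is_shareware)      score -= 5;
    if (is_demo)           score -= 10;

    return score;  /* e.g. a 10K PD binary with sources: 10 + 5 + 5 = 20 */
}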

These, of course, are rough guidelines; since moderators are intelligent humans,
they can modify scores to their own tastes or for special considerations (if
everyone is asking for it, it should be posted to reduce request clutter).

On the other hand, people could port GNU CC to the Amiga & Mac; then we could
do away with binaries entirely. :-)

Well, what do y'all think?

Larry Kollar ...!gatech!dcatla!mclek

Gail Zacharias

May 27, 1988, 8:01:16 PM
In article <1520...@uxg.cso.uiuc.edu> Steve Dorner writes:
>
>>locked into one individual's good will for continued support. This also
>>gives archive sites (which are pretty much all Unix) a lot of flexibility
> ^^^^^^^^
>Tell that to sumex.

Ok, I will: sumex, babes, would you convert your archives to compressed tar
format if I gave you twenex versions of tar and compress?

--
g...@entity.com ...!mit-eddie!spt!gz

Sean Casey

May 28, 1988, 10:05:13 PM
In article <51...@dcatla.UUCP> mc...@dcatla.UUCP (Larry E. Kollar) writes:
>To reduce the amount of binaries on the net, and to enhance the usefulness of
>what's posted, I propose the following scoring system:

I hope no one takes this seriously.

Sean
--
*** Sean Casey se...@ms.uky.edu, se...@ukma.bitnet
*** The Empire Definistrator {rutgers,uunet,cbosgd}!ukma!sean
*** ``I'm not gonna mail it, YOU mail it. I'M not gonna mail it... Hey! Let's
*** sent it to Rutgers! Yeah! They won't mail it! They return everything...''

Joseph M. Piazza

May 30, 1988, 1:23:37 AM
In article <51...@dcatla.UUCP> Larry Kollar writes:
>To reduce the amount of binaries on the net,

Whatever for? ...

> ... and to enhance the usefulness of
>what's posted, I propose the following scoring system:
>
> All submissions have a base score of 0.
>
> Sources accompany submission: +10

... You seem to consider sources intrinsically more valuable than
binaries. While I'm sure you have reasons, some true and good, it can't
hold true for many situations. For one thing, this subject has been thrashed
about many times in many newsgroups, and yet binary newsgroups still exist.

The fact remains that sources are (usually) larger than binaries --
no savings here.

Not everybody has the appropriate compiler/interpreter/assembler.
Some people may have them but may not know enough to make everything work. A
good example is if the user doesn't own the right brand of compiler. This
sometimes includes me. Do you know what you would be saying to me?
"Tough shit."

>... programs with sources are
>much more useful, so we should encourage these submissions.

I do agree that sources c a n be more useful ... (hold this thought)

>... Big shareware or
>commercial demos would most likely never get posted, due to things jumping
>ahead of them in line.

There's something slippery about this idea. An unpredictable delay
of a posting could diminish its usefulness. It could also cause a
sufficiently delayed posting to prove virtually useless and therefore a
detriment -- independent of the posting's own merit. There's no good
in that.

>... The Mac and Amiga are hard enough to learn to program;

... And some people don't program at all. (Remember that thought)
Should we ignore non-programmers?

>...
>These, of course, are rough guidelines; since moderators are intelligent
>humans, they can modify scores to their own tastes or for special
>considerations ...

I think we would do better to leave it in the hands of intelligent humans
than in those of a non-thinking scoring scheme -- we're not playing Bridge.

Flip side,

joe piazza

---
In capitalism, man exploits man.
In communism, it's the other way around.

CS Dept. SUNY at Buffalo 14260
UUCP: ..!{ames,boulder,decvax,rutgers}!sunybcs!jmpiazza GEnie:jmpiazza
BITNET: jmpi...@sunybcs.BITNET Internet: jmpi...@cs.Buffalo.edu

> Larry Kollar ...!gatech!dcatla!mclek

Chip Rosenthal

May 30, 1988, 12:16:05 PM
In article <11...@sunybcs.UUCP> jmpi...@sunybcs.UUCP (Joseph M. Piazza) writes:
>In article <51...@dcatla.UUCP> Larry Kollar writes:
>>... The Mac and Amiga are hard enough to learn to program;
> ... And some people don't program at all. (Remember that thought)
>Should we ignore non-programmers?

Yes.
--
Chip Rosenthal /// ch...@vector.UUCP /// Dallas Semiconductor /// 214-450-0400
{uunet!warble,sun!texsun!rpp386,killer}!vector!chip
I won't sing for politicians. Ain't singing for Spuds. This note's for you.

Lee Atchison

May 31, 1988, 10:54:00 AM
>/ hpindda:comp.sys.mac / mc...@dcatla.UUCP (Larry E. Kollar) / 11:41 am May 27, 1988 /

>[I'm cross-posting to comp.sys.amiga since this applies to amiga binaries
>as well.]

Stupid reason to cross-post. History has shown that cross-posting ANYWHERE
is a dumb idea. If you are going to cross-post, why not cross-post to the
ibm-pc binary group too? What's so special about the Amigas????

>To reduce the amount of binaries on the net, and to enhance the usefulness of
>what's posted, I propose the following scoring system:

BULL.

This obviously reflects your individual bias, as would any such system.
Deciding which postings are valuable should be up to the moderator.

> Sources accompany submission: +10

Why? Many people don't care if source is included or not.

> Public domain (as opposed to free copyrighted): +5

An unimportant distinction. Many very good programs could NEVER be
distributed as sheer public domain. Take anything written in LSC (using
the UNIX libraries, at least). They would, at the very least, require
the LSC copyright notice, even if the rest is PD. Should these be considered
less important than programs written in another language??????

> Binary < 20K: +5
> Binary < 50K: +3 (a 10K binary has a score of +5, not +8)
> Binary > 100K: -3
> Binary > 150K: -5

A good decision-making point. Although the usefulness of the program should
be the first concern, size is also a VERY important point.

> Shareware: -5

Some strange religious nuts would agree with this; I happen to think
shareware is a good idea. The very concept of being able to look at software
before you buy is a good idea, even if most shareware authors haven't made
much money on it.

> Demo of commercial program: -10

I personally think that demos of commercial programs are a great idea. They
benefit everyone. The person who is thinking of buying the program gets a
free look before shelling out any money. The person who likes the program
but can't afford to buy it gets a pseudo-working version for free (I'm
thinking of demos like the WriteNOW demo, for example), and the publisher
gets free advertising.

The only person possibly hurt is the person paying the phone bills for the
net. They are hurt with all the trash on this net anyway. If they are
worried about cutting costs, cut some of the garbage groups (I'll leave the
list of these to your imaginations to create, but one of them spells 'groups'
with an 'f').

>The Mac and Amiga are hard enough to learn to program;
>the until-recent dearth of example Mac sources didn't help.

There are a lot of non-programmers on this net who couldn't care less about
source code. Leave the source code for the source groups. I would rather
see shorter examples of source code segments that do certain important things
(that are clearly commented and explained) than the unreadable source code
for a large project that I probably will never look at anyway.

>
> Larry Kollar ...!gatech!dcatla!mclek
>----------


----
Lee Atchison
Hewlett Packard, Business Networks Division
Cupertino, CA 95014
atchison%hpi...@hplabs.hp.com

These opinions are my own, not of my employers.

Barbara Dyker

May 31, 1988, 8:31:18 PM
In article <82...@dhw68k.cts.com> byt...@dhw68k.cts.com (Roger L. Long) writes:
>In article <6...@lakesys.UUCP> ma...@lakesys.UUCP (Jim Macak) suggests
>standardizing on StuffIt for packing/compression.
>
>What matters most (to me) for submitting things to comp.{binaries,sources}.mac
>is that
> [long list of good guidelines deleted]


What would also take some load off the net would be a REGULAR posting in
comp.binaries.* of a list of submittal guidelines AND the procedures AND
software required to unpack/unhex the postings!!!

The net is all too often cluttered with requests for BinHex, StuffIt, PackIt...
and hasty articles about not being able to unbinhex 4.0 with 5.0, and "what's
a .pit file?"

I did my struggling as a newbie - let's get some info out for those who
are new to the net so that it works for all of us.

Or shall we ignore newbies, as someone suggested we ignore non-programmers??


Barb Dyker CSNET: dy...@boulder.Colorado.EDU
UUNET: ...rutgers!ncar!dinl!tosgcla!dyker
"Do what you will with my employer, but leave me alone."

Larry Blair

Jun 8, 1988, 7:41:10 PM
Sorry to join this discussion at such a late date. I had been ignoring it, but
while flipping through I saw the word "compress" and that disturbed me.

The vast majority of news is transported site to site compressed. Applying
compression to postings will most likely result in an increase in the data
transferred.

gil...@uiucdcsp.cs.uiuc.edu

Jun 9, 1988, 12:08:00 PM

I recently became interested in writing a compression utility.

But when I benchmarked compress on some digital pictures, I found that
stuffit's LZW algorithm is much more efficient than compress's
algorithm. I'm not sure compress is all that great a program. If you
look at the source code, it's been hacked dozens of times by different
authors at different sites. Data compression methods are heuristics,
and sometimes too many cooks can spoil the pot.

So I'm not voting for making compress a standard. I believe stuffit
is better.

Don Gillies {ihnp4!uiucdcs!gillies} U of Illinois
{gil...@cs.uiuc.edu}

Larry Blair

Jun 9, 1988, 12:39:54 PM
In article <88...@dartvax.Dartmouth.EDU> ear...@eleazar.dartmouth.edu (Earle R. Horton) writes:

|In article <6...@vsi1.UUCP> l...@vsi1.UUCP (Larry Blair) writes:
|>The vast majority of news is transported site to site compressed.
|>Applying compression to postings will most likely result in an
|>increase data transfered.
|
|Example, please:
|
| No StuffIt Used | Stuffit Used
| |
| Stage Size (bytes) | Stage Size
| ----- ----------- | ----- ----
| |
|Application 33756 | Application 33756
|Application.Hqx 45082 | Application.sit 24070
|Application.Hqx.Z 31395 | Application.sit.Hqx 32896
| | Application.sit.Hqx.Z 29683

I stand corrected. These results seem to contradict the notion that
compression randomizes a file such that further compression is useless.
I guess that the ASCII-tizing of the compressed data adds enough non-
randomness to allow more compression.
--
* * O Larry Blair
* * O VICOM Systems Inc. sun!pyramid----\
* * O 2520 Junction Ave. uunet!ubvax----->!vsi1!lmb
* * O San Jose, CA 95134 ucbvax!tolerant/
* * O +1-408-432-8660

Bradford L Knowles

Jun 9, 1988, 10:21:27 PM
Personally, I think we shouldn't really worry about this; let the
moderators, who appear to be intelligent, do their jobs.

Second, I would like to see kudos bounced around to anyone who posts a
binary to one newsgroup and a matching source code file to another.
This way, you get the binary if that's all you want, and you can go find
the source (if it was posted) if you desire.

Werner Uhrig

Jun 10, 1988, 7:40:49 AM
In article <6...@vsi1.UUCP>, l...@vsi1.UUCP (Larry Blair) writes:
> while flipping through I saw the word "compress" and that disturbed me.

> The vast majority of news is transported site to site compressed. Applying
> compression to postings will most likely result in an increase in the data transferred.

Will this baloney argument never stop? Are you telling me that when I reduce
the size of a text file (or whatever) by 40% by compressing it, here comes
the transmission program trying to compress it, only to double it back in size?

I'd be surprised if the worst-case negative pay-off of having integrated
compression into the transmission protocols is more than a few percent;
otherwise, there must be something terribly wrong with it. If it can't
be smart about supporting people who want to reduce disk usage for all these
articles queueing up on thousands of USEnet machines (well, give it a little
help with an additional header "Compressed-Article: <algorithm used>" if
you must), then I can't understand why we bother at all ....

If it could be shown that, overall, so many articles are now precompressed
that transmission-integrated compression no longer buys anything, then
we have reached the ideal state where on-the-fly compression during
transmission is no longer needed ... but I don't buy that "proof-unseen"!

If someone has studied actual data and can present ideas/suggestions based
on the analysis, I'm sure that's worth discussing. Otherwise, ....
