
Why horrible code?


Bill Wendling

Jan 17, 1998

Hi all,

Does anyone know why people design and create horrible-to-maintain, buggy
code? I have seen and heard about some real horror stories. Is it just
ignorance/incompetence on the coder's part? Lack of time, maybe?

--
|| Bill Wendling wend...@ncsa.uiuc.edu

Robert Billing

Jan 18, 1998

In article <69prv3$mm3$1...@vixen.cso.uiuc.edu>
wend...@news.cso.uiuc.edu "Bill Wendling" writes:

> Does anyone know why people design and create horrible-to-maintain, buggy
> code? I have seen and heard about some real horror stories. Is it just
> ignorance/incompetence on the coder's part? Lack of time, maybe?

There are three causes of bad code.

1) Confusion
2) Bill Gates
3) There's never time to do it right, there's always time to do it
again.

--
I am Robert Billing, Christian, inventor, traveller, cook and animal
lover, I live near 0:46W 51:22N. http://www.tnglwood.demon.co.uk/
"Bother," said Pooh, "Eeyore, ready two photon torpedoes and lock
phasers on the Heffalump, Piglet, meet me in transporter room three"

smy...@popmail.voicenet.com

Jan 18, 1998

I think there are many reasons why code is so often so bad.

1) Most people develop a style of coding that is often intelligible
to them, but not to others. For example, when I write in S/360 Assembler I
use BXLE and BXH to form loops, but most people find these instructions
difficult to understand.

2) Often the original code is not quite right, and many times it is
easier to break the planned approach for the fix, which can be
confusing to say the least. This is a particular problem for languages
that do not have a block structure.

3) Documentation is a major problem. This seems to be particularly
true in block-structured languages.

4) In a sense, programming is like poetry. Even a superbly well
documented program contains elements that are undocumentable, that
are different from the documentation, or that were never documented.
Skills that literature freaks learn in order to understand literature should,
perhaps, be extended to programming, but usually are not.

5) It is a truism that the program itself is its own document. Generally
real programs solve issues beyond the problem statement, because the
problem statement is nothing more than a model of the real problem. The
problem statement, though, is never updated to match the problem solved.

5a) The other reason the problem statement is inadequate is that it rarely
mentions non-problem details, like the mechanics of obtaining file names,
opening and closing these files, and what is supposed to be done if a
file is not there. Or the mechanics of receiving and (especially)
validating the input data processed by a program.

6) It is a truism that programming may well be great art, but it does
not last. A ten year old version of TurboTax, for example, is useless
today. For this reason, people do not make a great effort to understand
code, even their own code, after the fact.

In <69prv3$mm3$1...@vixen.cso.uiuc.edu>, wend...@news.cso.uiuc.edu (Bill Wendling) writes:
>Hi all,


>
>Does anyone know why people design and create horrible-to-maintain, buggy
>code? I have seen and heard about some real horror stories. Is it just
>ignorance/incompetence on the coder's part? Lack of time, maybe?
>

>--
>|| Bill Wendling wend...@ncsa.uiuc.edu


-- Steve Myers

The E-mail addresses in this message are private property. Any use of them
to send unsolicited E-mail messages of a commercial nature will be
considered trespassing, and the originator of the message will be sued in
small claims court in Camden County, New Jersey, for the maximum penalty
allowed by law.

Chris Hedley

Jan 18, 1998

In article <69prv3$mm3$1...@vixen.cso.uiuc.edu>,

wend...@news.cso.uiuc.edu (Bill Wendling) writes:
> Does anyone know why people design and create horrible-to-maintain, buggy
> code? I have seen and heard about some real horror stories. Is it just
> ignorance/incompetence on the coder's part? Lack of time, maybe?

Sometimes lack of time, sometimes laziness, sometimes lack of discipline,
sometimes incompetence, sometimes inexperience; in my experience, it
generally depends on the circumstances and the individual, although in
the latter case I have encountered some people who qualify for most of
the things I've listed!

Chris.

Sean Ennis

Jan 18, 1998

In article <69prv3$mm3$1...@vixen.cso.uiuc.edu>, wend...@news.cso.uiuc.edu (Bill
Wendling) wrote:
] Does anyone know why people design and create horrible-to-maintain, buggy

] code? I have seen and heard about some real horror stories. Is it just
] ignorance/incompetence on the coder's part? Lack of time, maybe?

In 'my' experience:

1) The project was started without a clear plan. So while it may all 'work',
it's just not as good as it could have been.

2) The project was started WITH a clear plan, but over the years little
features, and upgrades were added which screwed up the original plan.
If they had made all the changes all at once, it would probably be
cheaper in the long run, and it would be much easier to add to the project
at a later date. I've found with this sort of horror that you can
often tell which function (or code block) was coded before another just
by saying "Why did they do it this way in A? Oh, over in B they did blah;
they must have coded B before A." Kind of like forensic coding :-)

3) The project started off with a screwy design, and everything was built
to get around that original problem - instead of going back and fixing
it.

4) Old tools. I've come across blocks of code that worked for the wrong
reason. Function A passes a pointer to function B (who sees it as an
integer) and then function B passes it onto function C (who sees it as
a pointer again) - and because on the machine & compiler they were
using an integer and the size of a pointer were the same it worked fine.
Many of the newer compilers even recognise common coding mistakes and
warn you. The older ones didn't. The number of times that I've seen
if (function_name(a) = b) { }
when they really meant
if (function_name(a) == b) { }
it amazes me that some of the stuff still works.

5) Lack of experience. I'm not talking about a green 20-something
programmer. I'm talking about the fact that a lot of the coding ideas
we have now weren't around 5, 10, 20+ years ago. Coding was done
by an electrical engineer (who did his thesis in high-voltage
transmission theory) because he was the only one around who
knew anything about a computer. Often they were solving problems
of a style that no one had ever solved before. We can only see so far
today because we are standing on the shoulders of giants.

Sean

-+-
Sean Ennis <sen...@escape.ca>

My email address has been munged to prevent spam. Take off the first
'S' and it should reach me.

Kelsey Bjarnason

Jan 18, 1998

In article <69prv3$mm3$1...@vixen.cso.uiuc.edu>, wend...@news.cso.uiuc.edu
says...
> Hi all,

>
> Does anyone know why people design and create horrible-to-maintain, buggy
> code? I have seen and heard about some real horror stories. Is it just
> ignorance/incompetence on the coder's part? Lack of time, maybe?
>
> --
> || Bill Wendling wend...@ncsa.uiuc.edu
>
Lots of reasons. Lack of time, lack of knowledge. Or, in some cases,
because it's "throw away" code. I.e. you write a program knowing that
the code is going to have to be completely rewritten in three weeks
because you're moving to a new product/platform/library/whatever, so it
doesn't make a lot of sense to spend endless time worrying about the
existing codebase being terribly wonderful. :)

--
Remove .no.spam from my address to respond.

Raoul Golan

Jan 20, 1998

Kelsey Bjarnason wrote:

> In article <69prv3$mm3$1...@vixen.cso.uiuc.edu>, wend...@news.cso.uiuc.edu
> says...
> > Hi all,
> >
> > Does anyone know why people design and create horrible-to-maintain, buggy
> > code? I have seen and heard about some real horror stories. Is it just
> > ignorance/incompetence on the coder's part? Lack of time, maybe?
> >
> > --
> > || Bill Wendling wend...@ncsa.uiuc.edu

Uh, job security?

--
In a consumer society there are inevitably two kinds of slaves:
the prisoners of addiction and the prisoners of envy. - Ivan Illich


J. Otto Tennant

Jan 20, 1998

Raoul Golan <ra...@ind.tansu.com.au> writes:

>Kelsey Bjarnason wrote:

>> In article <69prv3$mm3$1...@vixen.cso.uiuc.edu>, wend...@news.cso.uiuc.edu
>> says...
>> > Hi all,
>> >
>> > Does anyone know why people design and create horrible-to-maintain, buggy
>> > code? I have seen and heard about some real horror stories. Is it just
>> > ignorance/incompetence on the coder's part? Lack of time, maybe?
>> >
>> > --
>> > || Bill Wendling wend...@ncsa.uiuc.edu

>Uh, job security?

In my opinion, not usually.

Ignorance and incompetence are certainly the cause, but this is
tautological. There are varieties and degrees of ignorance and
incompetence.

Assuming that the coder has a general understanding of the language and
the operating system, even if vague, wretched code results from an
inadequate understanding of the problem to be solved (which is also
almost tautological). The horrible code is caused by inappropriate
choice of data structures to use in resolving the problem: define the
data first, before the first line of code is written.

(If the data structures aren't right, the hope that there is just one
more bug is futile, and the whole program will have to be thrown away;
this is a difficult decision for management to approve.)
--
J.Otto Tennant jo...@pobox.com
Forsan et haec olim meminisse juvabit.

D. Peschel

Jan 20, 1998

In article <6a168h$ijk$1...@darla.visi.com>, J. Otto Tennant <j...@visi.com> wrote:

> The horrible code is caused by inappropriate
>choice of data structures to use in resolving the problem: define the
>data first, before the first line of code is written.

This is an interesting viewpoint. Maurice Wilkes said something similar
(essentially, "Choose the right data structure before you write the program
and the problem will be much easier to solve.") And he helped design and
build EDSAC (the first working, large-scale, stored-program computer) and
wrote a very interesting programming textbook at the time. So he should know.

-- Derek

Kevin Gilhooly

Jan 20, 1998

Bill Wendling <wend...@news.cso.uiuc.edu> wrote:

> Hi all,
>
> Does anyone know why people design and create horrible-to-maintain, buggy
> code? I have seen and heard about some real horror stories. Is it just
> ignorance/incompetence on the coder's part? Lack of time, maybe?
>

There are three versions of any program:
1) What the user wants
2) What the user needs
3) What is delivered

How to write a program (in pseudocode):

While (budget.available)
{
    User.provide(requirements);
    Developer.design(program);
    Developer.code(program);
    User.redefine(requirements);
}

If requirements change during development, you can end up miles from the
original "solution", and most people code around the previous attempt,
rather than starting over.

Of course, my usual defense to code appearance is, "Hey, if it's hard to
write, it should be hard to read!"


kjg

"You can have it fast, you can have it cheap, you can have it right -
pick any two"

John D. Burleson

Jan 20, 1998
Kevin Gilhooly wrote:
>
> Bill Wendling <wend...@news.cso.uiuc.edu> wrote:
>
> > Hi all,
> >
> > Does anyone know why people design and create horrible-to-maintain, buggy
> > code? I have seen and heard about some real horror stories. Is it just
> > ignorance/incompetence on the coder's part? Lack of time, maybe?
> >
>
> While (budget.available)
> {
> User.provide(requirements);
> Developer.design(program);
> Developer.code(program);
> User.redefine(requirements);
> }
>

Unfortunately (all too often) it's:

While (budget.available) do in parallel {
...
}

--
John Burleson (mailto:john.d....@boeing.com)
Principal Software Engineer
The Boeing Company/McDonnell Douglas Aerospace/Huntsville
(205)922-7589 FAX:(205)922-4890


Eric Werme

Jan 20, 1998

wend...@news.cso.uiuc.edu (Bill Wendling) writes:

>Does anyone know why people design and create horrible-to-maintain, buggy
>code? I have seen and heard about some real horror stories. Is it just
>ignorance/incompetence on the coder's part? Lack of time, maybe?

I'm a little surprised at the replies to date. When I went to college
at Carnegie-Mellon in 1968, few people had grown up with computers
around home. While my father designed process control computers and
taught me how to program part of one in 1963, he thought programming
would be a fairly uninteresting task, so I didn't pursue tempting
things like the Packard Bell 250 at work. However, one of the reasons
I chose C-MU was for its CS dept. As a freshman I took S600, the CS
intro to computers course, which taught Algol on a Univac 1108.
I quickly discovered Dad was wrong!

I also discovered that while most freshmen qualified for Mensa, very few
did well at programming. One person in particular produced incredibly
difficult-to-read code that was 3X bigger than necessary. (He
went on to become an operator at a nuclear plant down the river.)

On the other hand, I wrote an orbit simulation program that term. The
next year Gotos were considered harmful and structured programming was
all the rage. Looked at my old program - beautifully structured. As a
senior, I took a simulation course. Looked at my old program and found
more efficient techniques than were needed to do well in that course!

So I concluded that programming talent is a rare trait. Clearly I had it,
clearly other people do too, clearly many don't.

Last Christmas my sister gave me a book by Thomas Sowell on "Late
Talking Children", which he wrote about his son and others he found
through his newspaper column. I didn't start talking until four or so,
but impressed my 2nd grade teacher by keeping Webster's Collegiate
dictionary at school. (I had it mainly for its Roman numerals and number
names, e.g. dodecillion.) These kids are nearly always male, tend
to get along fine without talking, catch up quickly, often need some
speech therapy, are great at math and wind up in the computer field.
(Or nuclear physicists, before computers were common - I think Einstein
and Feynman were also late talkers.)

Now, people who grew up with Apples and Commodores were exposed to programming
and late talkers probably found their way into the field. I'm not so sure
about kids today - my Win 95 system needs no programming to be useful.
(Of course, a case can readily be made that it needs Linux.)

So, what differentiates good and bad programmers these days?

If anyone else has late talker stories, please change the subject to
"Late talking programmers".
--
<> ROT-13 addresses: <> The above is unlikely to contain official <>
<> <jr...@mx3.qrp.pbz> <> claims or policies of Digital Equipment Corp. <>
<> <jr...@plorecbegny.arg> <> Eric (Ric) Werme <>

Joseph M. Newcomer

Jan 20, 1998

I certainly agree with (1) and (3), but I fail to see how (2) has
anything to do with bad code, except that it is now possible for
thousands of people who could never before program to write bad code,
but why is this a problem? Everybody starts out writing bad code, and
matures.
joe

On Sun, 18 Jan 98 09:16:08 GMT, Robert Billing
<uncl...@tnglwood.demon.co.uk> wrote:

>In article <69prv3$mm3$1...@vixen.cso.uiuc.edu>


> wend...@news.cso.uiuc.edu "Bill Wendling" writes:
>
>> Does anyone know why people design and create horrible-to-maintain, buggy
>> code? I have seen and heard about some real horror stories. Is it just
>> ignorance/incompetence on the coder's part? Lack of time, maybe?
>

> There are three causes of bad code.
>
> 1) Confusion
> 2) Bill Gates
> 3) There's never time to do it right, there's always time to do it
> again.

Joseph M. Newcomer
newc...@flounder.com
http://www3.pgh.net/~newcomer

Joseph M. Newcomer

Jan 20, 1998

It turns out that there are many definitions of "bad" code.
Horrible-to-maintain is high on my list of personal gripes; I teach
that the most important thing you can do is build code that is "robust
under maintenance", that is, resilient to all sorts of changes and
doesn't always trust what it is passed in. Buggy code can be caused
by anything from poor specs to incompetent coding, but often bugginess
increases with a program's lifetime rather than decreases (new
features introduce new bugs, sometimes breaking old things that work).

In general, I've found that the less-experienced programmers produce
the hardest-to-maintain code; it is rigid and fragile. Experienced
programmers usually tend to build in defenses against idiosyncratic
changes. Here are some examples:
Amateur: the integer value for flow control options is
encoded by the position in a dropdown list in which
the flow control description appears, no matter how
disorganized the resulting GUI presentation of
the table might be to maintain this correspondence.

Professional: the dropdown list encodes both the
text description and the integer that describes the
value, and is organized in a way that makes
sense at the GUI level

Amateur: uses clever mappings to minimize the
total amount of code written. Changes or extensions
to the value set render the code useless.

Professional: uses simple, straightforward techniques
to isolate value dependencies.

Amateur: Code Size Is All

Professional: Code Size is the least of the concerns.
ALWAYS compromise code size in the interest of
maintainability, extensibility, and understandability.

Amateur: code is small, fast, compact, subtle, and optimized,
whether it makes sense to make it so or not.

Professional: only the code whose performance matters
is optimized, and that only after performance data has
been obtained. All other code is gratuitously bulky, slow,
and easy to understand and maintain.

Amateur: codes
DWORD mask = (DWORD)pow(2.0, (float)n);

Professional: codes
DWORD mask = 1 << n;

(This last example is from the inner loop of a realtime system,
written to run on a 386 without floating point accelerator. It was
the least of the sins I found in that code)

Amateur: delivers a system with four .c files, 2 .h files,
and one of the .c files contains both the menu handler
for the GUI and the interrupt handler for the device. The
smallest .c file is 23,000 lines. The average rebuild
cycle after any change is 45 minutes.

Professional: delivers a system containing 189 .c files,
and 245 .h files. The largest .c file is 1800 lines of
source code (it took me three weeks to transform
the first system into the second). The average rebuild
cycle after most changes is 2 minutes.

On 17 Jan 1998 09:00:19 GMT, wend...@news.cso.uiuc.edu (Bill
Wendling) wrote:

>Hi all,


>
>Does anyone know why people design and create horrible-to-maintain, buggy
>code? I have seen and heard about some real horror stories. Is it just
>ignorance/incompetence on the coder's part? Lack of time, maybe?

Joseph M. Newcomer
newc...@flounder.com
http://www3.pgh.net/~newcomer

Charlie Gibbs

Jan 20, 1998

In article <34d018f1...@206.210.64.12> newc...@flounder.com
(Joseph M. Newcomer) writes:

>On Sun, 18 Jan 98 09:16:08 GMT, Robert Billing
><uncl...@tnglwood.demon.co.uk> wrote:
>
>> There are three causes of bad code.
>>
>> 1) Confusion
>> 2) Bill Gates
>> 3) There's never time to do it right, there's always time to do it
>> again.
>

>I certainly agree with (1) and (3), but I fail to see how (2) has
>anything to do with bad code, except that it is now possible for
>thousands of people who could never before program to write bad code,
>but why is this a problem? Everybody starts out writing bad code, and
>matures.

Bill Gates proved that you can make billions (literally!) with
mediocre code - all you need is enough marketing skill. This
was the death knell of carefully-crafted code in the mass market.
Why waste time and effort making truly good code if you can get
the peepul to settle for less? Even worse, if you spend too much
time on quality, some half-assed piece of crap will beat you to
market and close your window of opportunity.

"Eat shit - 50,000,000 flies can't be wrong."

--
cgi...@sky.bus.com (Charlie Gibbs)
Remove the first period after the "at" sign to reply.


Bill Wendling

Jan 21, 1998

Charlie Gibbs wasted electrons by posting:

} Bill Gates proved that you can make billions (literally!) with
} mediocre code - all you need is enough marketing skill. This
} was the death knell of carefully-crafted code in the mass market.
} Why waste time and effort making truly good code if you can get
} the peepul to settle for less? Even worse, if you spend too much
} time on quality, some half-assed piece of crap will beat you to
} market and close your window of opportunity.

} "Eat shit - 50,000,000 flies can't be wrong."

There is a very good article in the current New Yorker magazine about
why MS is so powerful. The fact is that the "free market" assumption
that letting consumers decide will select the best product doesn't always hold.

For instance, one product has a slight advantage over its competitors.
(MS-DOS, for one). It gets put on more machines, and then a feedback loop
starts. The more machines which the product is on, the more people who
need software for it, the more people will need to create software
for it in order to survive, the more machines the product gets put on
to use this software...

It feeds on itself. MS is a perfect example of this. In this way,
inferior products become way too popular and choice goes out the
window. There is only an illusion of choice in this scenario.

--
|| Bill Wendling wend...@ncsa.uiuc.edu

John West McKenna

Jan 21, 1998

newc...@flounder.com (Joseph M. Newcomer) writes:

> Amateur: Code Size Is All

> Professional: Code Size is the least of the concerns.
> ALWAYS compromise code size in the interest of
> maintainability, extensibility, and understandability.

I wish I could. I really do. But 32K of EPROM is 32K of EPROM.

So I wrote an interpreter for a memory-efficient byte-code, compressed all
my strings, and now write code as if memory doesn't matter.

It is really, really slow. About 12500 instructions/second. But I don't
care. When the routine that is called twice each second takes 2 seconds to
run, you do some optimizing. When the user interface starts to feel
sluggish, you do some optimizing. Until then, it doesn't really matter.

> Amateur: codes
> DWORD mask = (DWORD)pow(2.0, (float)n);

> Professional: codes
> DWORD mask = 1 << n;

>(This last example is from the inner loop of a realtime system,
>written to run on a 386 without floating point accelerator. It was
>the least of the sins I found in that code)

char buf[10];
int i;

i=function_that_always_returns_an_integer_between_1000_and_9999();
sprintf(buf,"%d",i);
buf[0]=buf[2];
buf[1]=buf[3];
buf[2]=0;
i=atoi(buf);

Took me a while to realise that it is simply doing
i=i%100;
(My C is rather rusty, so forgive me if the code is not quite right)

John West

Charlie Gibbs

Jan 21, 1998

In article <6a4q7p$84h$1...@enyo.uwa.edu.au> jo...@ucc.gu.uwa.edu.au
(John "West" McKenna) writes:

> char buf[10];
> int i;
>
> i=function_that_always_returns_an_integer_between_1000_and_9999();
> sprintf(buf,"%d",i);
> buf[0]=buf[2];
> buf[1]=buf[3]
> buf[2]=0;
> i=atoi(buf);

Yes, that is horrible. Of course, the proper way to do it is:

sprintf (buf, "%04d", i); /* Now it works for 0-999 as well. */
sscanf (buf+2, "%d", &i); /* :-) */

>Took me a while to realise that it is simply doing
> i=i%100;

Spoilsport.

Joseph M. Newcomer

Jan 21, 1998

OK, what is "mediocre code"? While I don't like the bugs in Word or
PowerPoint any more than anyone else does, have you ever really *used*
mediocre code? FrameMaker, for example, makes the concept of a
mediocre GUI look like an improvement, and it was written for the
legendary Unix-that-can-do-no-wrong. And it is full of bugs, and its
design is incredibly poor in most directions. And anyone who wants to
see mediocre code only has to look at nearly *any* Unix source code
(for example, one of the major Unix editors once had the property that
if the system crashed while you were editing a file, both the changes
and the actual file were lost, and this was years before Bill Gates
ever wrote a line of 8008 code. I was there. It was my file that was
lost, costing me a day of editing).

What is "carefully crafted" code and how can I tell if the code I'm
running is "carefully crafted" or "crap"? If it does the job well,
does it matter overly much? If it doesn't do the job, it is probably
a Unix app (look up BQS, or Berkeley Quality Software, in The New
Hacker's Dictionary). Carefully crafted code apparently is bug-free,
and presumably a multi-megaline system that is carefully crafted has
no bugs, and pigs fly, and I've got a great bargain on a bridge, just
for you.

Let's see: OS/360 predated Bill Gates, and I doubt that anyone would
argue that it represented the pinnacle of good coding style. TSS/360
took the attitude that function calls could use their own register
conventions, but which weren't always documented (I used to maintain
parts of TSS/360, tell me about it). TOPS-10 and TOPS-20 were clearly
the epitome of carefully-crafted code; just ask anyone who maintained
them. And EMACS-20! Who could forget what $$A$&!*P$$ meant! ("What
could be more mnemonic than J137" - Allen Newell on IPL-V).

I am deeply suspicious of anyone who claims that Microsoft is alone in
writing "mediocre" code unless they've worked on at least a half-dozen
other major systems (>500K source lines); I have, and NOT ONE of the
several dozen major systems I've worked on in the last 35 years
would have been in the running for "best code ever written". I also
worked on some really nice small systems, that were quite elegantly
coded, but were essentially toys; all real systems had huge amounts of
crud and cruft, bugs, design flaws, etc.

joe

On 20 Jan 98 15:30:02 -0800, "Charlie Gibbs" <cgi...@sky.bus.com>
wrote:

>In article <34d018f1...@206.210.64.12> newc...@flounder.com


>(Joseph M. Newcomer) writes:
>
>>On Sun, 18 Jan 98 09:16:08 GMT, Robert Billing
>><uncl...@tnglwood.demon.co.uk> wrote:
>>
>>> There are three causes of bad code.
>>>
>>> 1) Confusion
>>> 2) Bill Gates
>>> 3) There's never time to do it right, there's always time to do it
>>> again.
>>
>>I certainly agree with (1) and (3), but I fail to see how (2) has
>>anything to do with bad code, except that it is now possible for
>>thousands of people who could never before program to write bad code,
>>but why is this a problem? Everybody starts out writing bad code, and
>>matures.
>

>Bill Gates proved that you can make billions (literally!) with
>mediocre code - all you need is enough marketing skill. This
>was the death knell of carefully-crafted code in the mass market.
>Why waste time and effort making truly good code if you can get
>the peepul to settle for less? Even worse, if you spend too much
>time on quality, some half-assed piece of crap will beat you to
>market and close your window of opportunity.
>
>"Eat shit - 50,000,000 flies can't be wrong."

Joseph M. Newcomer
newc...@flounder.com
http://www3.pgh.net/~newcomer

Joseph M. Newcomer

Jan 21, 1998

OK, the "illusion of choice" goes out the window. Or Windows.

How has this hurt me?

Most of my career I used a variety of proprietary operating systems,
compilers, and languages. Every couple years I had to relearn
everything. Then along came Unix. And everything was now uniform.
Except that every Unix vendor had to have their own cool features that
made it incompatible (only the simplest applications would port
without major effort, and "real" apps wouldn't port at all). Compilers
were a side issue; one compiler, for one vendor, had the problem that
if it constant-folded an integer comparison in an if-statement, it
inverted the sense of the branch. Their solution: "Don't do that".
Their fix "We'll fix it in the next release, next year". Seven years
ago, I was given an IBM RS/6000 machine. Other than the fact that its
compiler didn't work, its debugger didn't work, its linker didn't
work, and there was no document production system on it, no one knew
how to magnify the fonts so I could read them, or even get the fonts
loaded, and it compiled slower than my 386/33, it was all right.
Microsoft had *already* produced a development environment that in
every dimension exceeded anything I had ever seen or used in any other
environment in history, and it only cost $350/seat on a $2,500 machine
(as opposed to something like $1500/seat on a $30,000 machine).

Given the quality of the environment I have, why do I care? If there
is a better environment, it will displace Microsoft, and Microsoft
knows it. I never had a choice with Unix anyway, I had to take
whatever crappy dialect of Unix that ran on whatever hardware I had,
so in what way has Microsoft "limited" my choices over what I had a
decade ago? I can't run on a VAX? Why do I care? When has hardware
ever been a deciding factor, anyway? And don't blame Microsoft; they
supported both MIPS and PowerPC, until the people selling those
platforms decided that they didn't want NT on them because nobody was
buying them. So the MARKET, not Microsoft, elected to go with Intel.
Today I can insert pictures, by reference, in spreadsheets that I can
insert in Word documents, and print them in full color on any printer I
want. (Check an article I did in SIGPlan Notices on IDL about 15
years ago: after trying for THREE WEEKS to print it at the SEI on the
laser printer, I gave up and printed it on my dot-matrix printer on my
DOS box, because THAT WORKED.)
two days it took me to insert a bitmap image in a document under Unix
in 1989. And after I inserted it, I couldn't print it! Today, it is a
few mouse clicks!

Frankly, I don't care how rich Bill Gates gets; what I care about is
that the environment that I work in gets better and better. Microsoft
accomplished this when the oh-so-wonderful Unix was still trying to
figure out what "portability" meant. I can guarantee that if
Microsoft did not exist, we would NOT have 200MHz
Pentium-equivalent-power laptops for $2,500 today (one sits beside me
right now, running Win95).

If I'm going to have a "choice", I want a choice that gives me
something BETTER than what I already have. I'm not seeing enough
difference between most products to feel that I need a lot of choices.
joe

On 21 Jan 1998 06:22:16 GMT, wend...@news.cso.uiuc.edu (Bill
Wendling) wrote:

>Charlie Gibbs wasted electrons by posting:
>

>} Bill Gates proved that you can make billions (literally!) with
>} mediocre code - all you need is enough marketing skill. This
>} was the death knell of carefully-crafted code in the mass market.
>} Why waste time and effort making truly good code if you can get
>} the peepul to settle for less? Even worse, if you spend too much
>} time on quality, some half-assed piece of crap will beat you to
>} market and close your window of opportunity.
>
>} "Eat shit - 50,000,000 flies can't be wrong."
>

>There is a very good article in the current New Yorker magazine about
>why MS is so powerful. The fact is that the "free market" strategy of
>letting consumers decide what is the best product isn't always true.
>
>For instance, one product has a slight advantage over its competitors.
>(MS-DOS for one). It gets put on more machines, then a feed-back loop
>starts. The more machines which the product is on, the more people who
>need software for it, the more people will need to create software
>for it in order to survive, the more machines the product gets put on
>to use this software...
>
>It feeds on itself. MS is a perfect example of this. In this way,
>inferior products become way too popular and choice goes out the
>window. There is only an illusion of choice in this scenario.

Joseph M. Newcomer
newc...@flounder.com
http://www3.pgh.net/~newcomer

John D. Burleson

unread,
Jan 21, 1998, 3:00:00 AM1/21/98
to
Joseph M. Newcomer wrote:
>
> OK, what is "mediocre code"? While I don't like the bugs in Word or
> PowerPoint better than anyone else, have you every really *used*
> mediocre code? FrameMaker, for example, makes the concept of a
> mediocre GUI look like an improvement, and it was written for the
> legendary Unix-that-can-do-no-wrong. And it is full of bugs, and its
> design is incredibly poor in most directions.

Huh? What was the last version of FrameMaker you used, 2.0? FrameMaker
has an interface that is remarkably consistent across multiple platforms
and creates files that are compatible across ALL these platforms as well
from version to version. I have never, in 5+ years of using FrameMaker,
had a single lock up or crash.

I don't have any comment on the rest of your post. Once I got this far
I _knew_ you didn't have a clue, so I didn't read anymore of it.


Charlie Gibbs

unread,
Jan 21, 1998, 3:00:00 AM1/21/98
to

In article <34e067db...@206.210.64.12> newc...@flounder.com
(Joseph M. Newcomer) writes:

>On 21 Jan 1998 06:22:16 GMT, wend...@news.cso.uiuc.edu (Bill
>Wendling) wrote:
>
>Charlie Gibbs wasted electrons by posting:

Hey, they're my electrons...

>Frankly, I don't care how rich Bill Gates gets; what I care about is
>that the environment that I work in gets better and better. Microsoft
>accomplished this

I'm afraid we'll have to agree to disagree on this one.

> when the oh-so-wonderful Unix was still trying to
>figure out what "portability" meant.

That's easy - an application that runs on both Win95 and NT. :-)

> I can guarantee that if
>Microsoft did not exist, we would NOT have 200MHz
>Pentium-equivalent-power laptops for $2,500 today (one sits beside me
>right now, running Win95).

We probably wouldn't miss them quite so badly if we didn't
have them, either. It reminds me of something I saw on
rec.humor.funny a while back:

>From: Vincent Frisina (Grad) <fri...@POLYGON.MATH.CORNELL.EDU>
>
>Two of my housemates recently picked up an old 8088 for free. The
>first, Joe, was overjoyed because, well, it was free and free stuff
>is good, and it's a computer and computers are good. Chris wasn't
>so sure.
>
>J: Once we get a new video card and a new hard drive in this, it'll
> run like new.
>
>C: So? It's an 8088. What the hell can you do with an 8088?
>
>J: Come on. Think of all the stuff they did with computers built
> thirty years ago.
>
>C: Okay, you can get to the moon. What else?

BTW nowhere did I say that Microsoft was the first to distribute
crappy code. I acknowledge all of your examples, and can add a
few of my own. Nor do I praise Unix as a pinnacle of perfection;
it's got many good things, but a few real warts too. But IMHO
M$ was the company that managed to get the general public to
accept crashing software as a way of life. A lot of people
didn't realize how flaky the mouse buttons are under Windows
until I pointed it out to them, at which point it was obvious;
they were just so used to clicking the button until something
happened. (I can quote Petzold to back this up.)

For what it's worth, I just spent the past day in a futile attempt
to add a serial port to a Win95 box. Right now my attitude toward
Microsoft is even less charitable than usual. Hell, I could have
done the corresponding software configuration in that old 90/30
faster than this. So from my point of view we haven't advanced
much at all.

Tom Watson

unread,
Jan 21, 1998, 3:00:00 AM1/21/98
to

<<<deletia with an example of horrible code, with commentary>>>

Are there any other examples of horrible code (in whatever language).
Maybe we could all laugh at them. Please post!!

--
t...@cagent.com (Home: t...@johana.com)
Please forward spam to: anna...@hr.house.gov (my Congressman), I do.

sp...@lisardrock.demon.co.uk

unread,
Jan 22, 1998, 3:00:00 AM1/22/98
to


On 1998-01-21 jo...@ucc.gu.uwa.edu.au(John"West"McKenna) said:
-char buf[10];
-int i;
-i=function_that_always_returns_an_integer_between_1000_and_9999();
-sprintf(buf,"%d",i);
-buf[0]=buf[2];
-buf[1]=buf[3];
-buf[2]=0;
-i=atoi(buf);
-Took me a while to realise that it is simply doing
-i=i%100;

eurgh!!!!!!

however... is there a "quick" way to divide by small odd prime
constants, in the same way that there is to multiply? we can't work any
out, but we're willing to admit we may be missing something.

(eg. : 3* dup dup + + ; but : 3/ ??? ; )


-- Communa - all at lisardrock.demon.net
to be sure we read your reply, send to username `communa'

Net-Tamer V 1.09.2 - Test Drive

Bill Wendling

unread,
Jan 22, 1998, 3:00:00 AM1/22/98
to

Charlie Gibbs wasted electrons by posting: :P

} > when the oh-so-wonderful Unix was still trying to
} >figure out what "portability" meant.

} That's easy - an application that runs on both Win95 and NT. :-)

Unless it's a game...which won't run on NT cause it's only for 95...

--
|| Bill Wendling wend...@ncsa.uiuc.edu

Bill Wendling

unread,
Jan 22, 1998, 3:00:00 AM1/22/98
to

Joseph M. Newcomer wasted electrons by posting:

} OK, the "illusion of choice" goes out the window. Or Windows.

Illusion of choice is the only thing that stays. All real choice flies
out the Window.

} How has this hurt me?

Please tell us how products you've used in the distant past have hurt
you (way before the GUI, even MS Windows, was "perfected")...oh...I
see you have.

} Most of my career I used a variety of proprietary operating systems,
} compilers, and languages. Every couple years I had to relearn
} everything. Then along came Unix. And everything was now uniform.

[snip]

Well, the number one reason for the ports is that UNIX runs on many
different architectures and hardwares. However, there are two main
ports: BSD and SVR4. Porting between the two isn't too difficult (esp.
with the advent of Posix). However, porting between MS and anything
else is a mind-numbingly arduous task.

} Except that every Unix vendor had to have their own cool features that
} made it incompatible (only the simplest applications would port
} without major effort, and "real" apps wouldn't port at all). Compilers

You mean like NCSA Mosaic and NCSA Telnet, Emacs, GCC, Perl, VIM, Netscape,
Mathematica, etc. etc...All of these are definitely NOT real apps...

} Microsoft had *already* produced a development environment that in
} every dimension exceeded anything I had ever seen or used in any other
} environment in history, and it only cost $350/seat on a $2,500 machine
} (as opposed to something like $1500/seat on a $30,000 machine).

Except for MS's memory management, threading, editing, compiling,
telnetting (or lack thereof), etc., I'd have to agree with you.

} Given the quality of the environment I have, why do I care? If there
} is a better environment, it will displace Microsoft, and Microsoft
} knows it. I never had a choice with Unix anyway, I had to take

No, MS is very much aware of their positive feed back loop. All they have
to do is compete in a field (browsers, for instance) and they can
be almost assured of some success.

} whatever crappy dialect of Unix that ran on whatever hardware I had,
} so in what way has Microsoft "limited" my choices over what I had a
} decade ago? I can't run on a VAX? Why do I care? When has hardware
} ever been a deciding factor, anyway? And don't blame Microsoft; they

Actually, hardware is a deciding factor among real users of computers.
People in graphics are vitally concerned with what types of video cards
they have. Some people need really fast access to data on their hard
drive. Some need memory up the wazoo. One of the disappointing things
about the PC industry is that there really isn't good hardware out there
that will optimize your desktop computer with your OS.

} supported both MIPS and PowerPC, until the people selling those
} platforms decided that they didn't want NT on them because nobody was
} buying them. So the MARKET, not Microsoft, elected to go with Intel.

Er...MS had already gone with Intel long before the PowerPC came out.
They had (and still have) a contract with Intel.

} When I can insert pictures, by-reference, in spreadsheets that I can
} insert in Word documents, print them in full color on any printer I
} want (check on an article I did in SIGPlan Notices on IDL about 15
} years ago; after trying for THREE WEEKS to print it at the SEI on the
} laser printer, I gave up and printed it on my dot-matrix printer on my
} DOS box, because THAT WORKED). I don't even want to THINK about the

Hm...lpr works for me...Sounds like you are kinda inept at
running/understanding your apps. Perhaps you should read that thing called
a "Manual" that comes with the software.

} two days it took me to insert a bitmap image in a document under Unix
} in 1989. And after I inserted it, I couldn't print it! Today, it is a
} few mouse clicks!

Please do compare software from the same time period.

} Frankly, I don't care how rich Bill Gates gets; what I care about is
} that the environment that I work in gets better and better. Microsoft

I care about that too, however, MS is -way- behind in the OS game. OSes are
now moving to 64-bit processors (those damned hardware requirements
again), while MS has just gone to 32-bit (barely) within the last 2 years.
They are also way behind in multi-processing. They can't handle the
load that UNIX machines can (there are several articles on this, one in
the current Linux Journal magazine about the company who did effects for
_Titanic_ who couldn't use NT machines cause it couldn't do what a Linux
box could). NTs aren't as scalable as UNIXes. The list goes on.

} figure out what "portability" meant. I can guarantee that if
} Microsoft did not exist, we would NOT have 200MHz
} Pentium-equivalent-power laptops for $2,500 today (one sits beside me
} right now, running Win95).

What a waste...

} If I'm going to have a "choice", I want a choice that gives me
} something BETTER than what I already have. I'm not seeing enough
} difference between most products to feel that I need a lot of choices.
}

Then please get your eyes checked.

--
|| Bill Wendling wend...@ncsa.uiuc.edu

J. Otto Tennant

unread,
Jan 22, 1998, 3:00:00 AM1/22/98
to

t...@cagent.com (Tom Watson) writes:

><<<deletia with an example of horrible code, with commentary>>>

>Are there any other examples of horrible code (in whatever language).
>Maybe we could all laugh at them. Please post!!

Well, I have many examples. At least dozens.

On the other hand, I wrote them. Technically, some of them are (Thank
God) still covered by proprietary agreements. (If the Soviet Union were
still in business, revealing my code to them might well be considered to
be in the national interest. These codes were, at best, "value
subtracted" rather than "value added.")

Heinz W. Wiggeshoff

unread,
Jan 22, 1998, 3:00:00 AM1/22/98
to

Tom Watson (t...@cagent.com) writes:
> <<<deletia with an example of horrible code, with commentary>>>
>
> Are there any other examples of horrible code (in whatever language).
> Maybe we could all laugh at them. Please post!!

"Don't get me started!" Ralph Kramden.

Production PL/1 at the defunct grocery warehouser Loeb, (Ont. Can.):

ON ENDFILE ( IN_FILE ) BEGIN;

/* Allocate the following CONTROLLED structure to avoid a S0C7 abend: */

{Details not important.}
END;

This code fragment demonstrated to me that at Loeb, one could be a
data processing super-star without having a clue about the proper
usage of the CONTROLLED storage facility, never mind about how to
handle end-of-file.

Dan Strychalski

unread,
Jan 22, 1998, 3:00:00 AM1/22/98
to

Joseph M. Newcomer (newc...@flounder.com) wrote --

> Frankly, I don't care how rich Bill Gates gets; what I care about is

> that the environment that I work in gets better and better.[...]

Greetings from Userland, Mr. Programmer Man. The environment I work in
has gotten worse and worse as your darling Billy has pulled the wool
over more and more people's eyes and forced nearly all the world's
computing population to work like six-year-olds.

> I can guarantee that if Microsoft did not exist, we would NOT have
> 200MHz Pentium-equivalent-power laptops for $2,500 today (one sits
> beside me right now, running Win95).

To paraphrase you, how has this benefited me? I can guarantee you that
we would have such machines in good time, and we would not be forced to
use them for tasks that could be done just as well on a 286.

Windows sucks like a black hole, fellah, and so does everyone who
advocates it.

Dan Strychalski
dski at cameonet, cameo, com, tw.
Apologies for the non-threading newsreader and anti-spam devices.

Robert Merkel

unread,
Jan 22, 1998, 3:00:00 AM1/22/98
to

wend...@news.cso.uiuc.edu (Bill Wendling) wrote:
] Does anyone know why people design and create horrible-to-maintain, buggy
] code? I have seen and heard about some real horror stories. Is it just
] ignorance/incompetence on the coder's part? Lack of time, maybe?

Lack of time, quite commonly, but also the requirements - not all software
has to be maintained. At the moment I'm involved in a small research
project that involves adapting somebody else's hacked together testing
scripts. I'm in a hurry to get results, so I'm just doing a cut and paste
job on his already hastily written code which is laden with magic numbers
and the like. I'm fully aware that the code is bad - but, frankly, there
is no point in doing a proper design - it does what it is intended to do.

Of course, the result is almost unreadable and unmaintainable - but
nobody's ever going to modify it (just like nobody was ever going to read
or modify the code I started with . . . )

You can see how bad code can happen . . .

jmfb...@ma.ultranet.com

unread,
Jan 22, 1998, 3:00:00 AM1/22/98
to

In article <6a6c4b$nl7$1...@vixen.cso.uiuc.edu>,
wend...@news.cso.uiuc.edu (Bill Wendling) wrote:
<snip>

>
>Except for MS's memory management, threading, editing, compiling,
>telnetting (or lack thereof), etc., I'd have to agree with you.
<snip>
I agree with your list; I have an addition: the lack of a buffered
mode I/O concept so that files with a size greater than available
memory can be opened for in/output. I suppose that this could be
classified under [lack of] memory management.

/BAH

Harry Dodsworth

unread,
Jan 22, 1998, 3:00:00 AM1/22/98
to

I wasn't sure whether to post under the horrible code, or the self
modifying code thread.
However, the worst bit of COBOL code I ran across had three ALTER
statements in a group of four statements.

--
Harry Dodsworth Ottawa Ontario Canada af...@freenet.carleton.ca
----------------------------------------------------------------

Marco S Hyman

unread,
Jan 22, 1998, 3:00:00 AM1/22/98
to

Neil.Frankli...@ccw.ch writes:

> Just as good science results from an open process of public criticism
> (testing) and improvement (feedback), good software happens when the
> social structure making it can tap all of its knowledge and testing
> power to drive out the inevitable bugs that will creep in.
>
> The only OS made on such principles at the moment is GNU/Linux, which

I'm curious, why do you believe FreeBSD/NetBSD/OpenBSD are closed?
Full source (kernel as well as all of user land) is available
for a few dollars (CD) or the time it takes it you to download it
off the net.

// marc

Simon Slavin

unread,
Jan 23, 1998, 3:00:00 AM1/23/98
to

In article <werme.8...@alingo.zk3.dec.com>,
we...@alingo.zk3.dec.com (Eric Werme) wrote:

> Last Christmas my sister gave me a book by Thomas Sowell on "Late
> Talking Children", which he wrote about his son and others he found

> through his newspaper column. [snip]


> These kids are nearly always male, tend
> to get along fine without talking, catch up quickly, often need some
> speech therapy, are great at math and wind up in the computer field.
> (Or nuclear physicists before computers were common - I think Einstein
> and Feynman were also late talkers.)

Standard event when late talkers eventually start talking:

Child: Milk ?
Mother: He spoke ! Darling, why did you never say anything before ?
Child: Never wanted anything before.

Simon.
--
Simon Slavin -- Computer Contractor. | If I ran usenet, the timestamp for
http://www.hearsay.demon.co.uk | anything posted between 2am and 5am
Check email address for UBE-guard. | would *blink*. -- Nancy Lebovitz
My s/ware deletes unread >3 UBEs/day.| Junk email not welcome at this site.

Richard Shetron

unread,
Jan 23, 1998, 3:00:00 AM1/23/98
to

In article <34df6431...@206.210.64.12>,
Joseph M. Newcomer <newc...@flounder.com> wrote:
[snip]

>Let's see: OS/360 predated Bill Gates, and I doubt that anyone would
>argue that it represented the pinnacle of good coding style. TSS/360
>took the attitude that function calls could use their own register
>conventions, but which weren't always documented (I used to maintain
>parts of TSS/360, tell me about it). TOPS-10 and TOPS-20 were clearly
>the epitome of carefully-crafted code; just ask anyone who maintained
>them. And EMACS-20! Who could forget what $$A$&!*P$$ meant! ("What
>could be more mnemonic than J137" - Alan Newell on IPL-V).

I'd vote for Multics, but then I'm definitely biased ;)
[snip]


Neil.Frankli...@ccw.ch

unread,
Jan 23, 1998, 3:00:00 AM1/23/98
to

newc...@flounder.com (Joseph M. Newcomer) asked:

> OK, what is "mediocre code"?

Code where decisions are made on the basis of short-term avoidance
of development costs (budget-effective) and not on the basis of long-term
accumulating crash costs (the customer puts up with them).

Actually the real issue is not Microsoft vs Unix. It is commercial
compromises vs non commercial striving for good.


Just as good science results from an open process of public criticism
(testing) and improvement (feedback), good software happens when the
social structure making it can tap all of its knowledge and testing
power to drive out the inevitable bugs that will creep in.

The only OS made on such principles at the moment is GNU/Linux, which
happens to be a variant of Unix. But that was just a convenience for
easily modularising the project for parallel independent development
without requiring a mentally limiting central structure.


> What is "carefully crafted" code and how can I tell if the code I'm
> running is "carefully crafted" or "crap"?

By taking what comes from a credible (public) process and not from an
"we are the oracle of the world" nose-up-the-ass company like MS.

No, that doesn't guarantee quality, but it improves the likelihood.

A lot like the "what is secure" discussion, IMHO.


> Berkley Quality Software

Another closed group. Avoid BSD the same as MS.


> OS/360 predated Bill Gates, and I doubt that anyone would
> argue that it represented the pinnacle of good coding style. TSS/360

Another closed group. Avoid IBM.


> TOPS-10 and TOPS-20 were clearly

The same. And so on...


> >500K source lines)

Linux kernel source. It is actually nice to look at. And you can do so.

--
Neil.Frankli...@ccw.ch, http://www.ccw.ch/Neil.Franklin/
for Geek Code, Papernet, Voicenet, PGP public key see http:
Mac, 95 and NT users are CLUEless (Command Line User Environment)
If I go missing, it's once again my newsfeed that has crapped

D. Peschel

unread,
Jan 23, 1998, 3:00:00 AM1/23/98
to

Well, don't just taunt us inexperienced people! What does $$A$&!*P$$ mean?
(I assume the $'s are escapes (altmodes?).)

-- Derek

Warren Young

unread,
Jan 23, 1998, 3:00:00 AM1/23/98
to

wend...@news.cso.uiuc.edu (Bill Wendling) wrote:

>Well, the number one reason for the ports is that UNIX runs on many
>different architectures and hardwares. However, there are two main
>ports: BSD and SVR4. Porting between the two isn't too difficult (esp.
>with the advent of Posix). However, porting between MS and anything
>else is a mind-numbingly arduous task.

...And porting from UNIX (or VMS, or Amiga, or...) to Windows is easy?

>} Except that every Unix vendor had to have their own cool features that
>} made it incompatible (only the simplest applications would port
>} without major effort, and "real" apps wouldn't port at all). Compilers
>
>You mean like NCSA Mosaic and NCSA Telnet, Emacs, GCC, Perl, VIM, Netscape,
>Mathematica, etc. etc...All of these are definitely NOT real apps...

Granted, his original statement was overkill, but don't kid yourself:
all of these programs have conditional code out the wazoo. Even today
in these enlightened POSIX days, it still takes a powerful tool like
Autoconf to give the illusion of easy portability. NT programs, on
the other hand, pretty much just recompiled. I guess you could say
the same thing about Solaris SPARC and x86, too, FWIW.

>} Microsoft had *already* produced a development environment that in
>} every dimension exceeded anything I had ever seen or used in any other
>} environment in history, and it only cost $350/seat on a $2,500 machine
>} (as opposed to something like $1500/seat on a $30,000 machine).
>
>Except for MS's memory management, threading, editing, compiling,
>telnetting (or lack thereof), etc., I'd have to agree with you.

Memory management: NT does just fine protecting processes from each
other. (Don't kid yourself: no one is advocating Windows 95 in this
thread -- that'd be like comparing apples to worms.)

Threading: NT does that well, too (it even does some things pthreads
doesn't do, like thread-local storage).

Editing: the Visual C++ and Borland C++ editors do just fine, and if
you don't like them, you can use that $1200 per-seat difference to put
towards whatever editor you like. Or, get the Win32 port of emacs or
vim -- they work just fine.

Compiling: PC compilers are better than most UNIX compilers, probably
just because there are more people using them. Borland C++ and Visual
C++, for example, both support the newest C++ features better than
g++, and definitely better than any stock compilers I've seen. (Sure,
spend big money on a third-party toolset and that might change, but PC
compilers are still pretty darn good these days.)

Telnetting: No one in their right mind sticks with the default telnet
client. Come on, splurge on a $30 shareware one. Heck, there are now
freeware ones that are good enough for most purposes.

>} Given the quality of the environment I have, why do I care? If there
>} is a better environment, it will displace Microsoft, and Microsoft
>} knows it. I never had a choice with Unix anyway, I had to take
>
>No, MS is very much aware of their positive feed back loop. All they have
>to do is compete in a field (browsers, for instance) and they can
>be almost assured of some success.

Don't fool yourself: Microsoft has some good stuff these days. As far
as browsers go, Netscape and Microsoft are about equal as far as
perceived quality goes, and I have no doubts that Microsoft will
eventually be the default browser on Windows. Personally, I'm a
Netscape guy, but I still think MSIE has its points. For example, it
starts up _much_ faster, is feature-competitive, and doesn't seem to
crash any more than Navigator. I just don't like the
newbie-handholding aspect it purveys.

>} whatever crappy dialect of Unix that ran on whatever hardware I had,
>} so in what way has Microsoft "limited" my choices over what I had a
>} decade ago? I can't run on a VAX? Why do I care? When has hardware
>} ever been a deciding factor, anyway? And don't blame Microsoft; they
>
>Actually, hardware is a deciding factor among real users of computers.

I hate it when people use "real" this way -- as if all N million of us
PC people are figments of the universe's imagination or something.

>People in graphics are vitally concerned with what types of video cards
>they have. Some people need really fast access to data on their hard
>drive. Some need memory up the wazoo. One of the disappointing things
>about the PC industry is that there really isn't good hardware out there
>that will optimize your desktop computer with your OS.

Take a look at Intergraph and Compaq. They both build SGI killers,
and I'm sure there are others. The reason that machines like these
are not common in the PC world is that most PC users don't need
machines like that -- why pay for what you don't need?

>} Frankly, I don't care how rich Bill Gates gets; what I care about is
>} that the environment that I work in gets better and better. Microsoft
>
>I care about that too, however, MS is -way- behind in the OS game. OSes are
>now moving to 64-bit processors (those damned hardware requirements
>again), while MS has just gone to 32-bit (barely) within the last 2 years.
>They are also way behind in multi-processing. They can't handle the

I can't speak for anyone else here, but I'm not saying that UNIX is
worthless. I have two machines here at home, one of which is a Linux
box. At work, we've built our head-end system on a UNIX machine, some
of which I'm involved with. If you're just trying to argue that NT
isn't the most capable system on the planet, then fine, I agree, and
we can go find another thread to haunt. If instead you're trying to
get us to believe that UNIX is the across-the-board best OS, you're
gonna have to do a lot better than this.

>load that UNIX machines can (there are several articles on this, one in
>the current Linux Journal magazine about the company who did effects for
>_Titanic_ who couldn't use NT machines cause it couldn't do what a Linux
>box could). NTs aren't as scalable as UNIXes. The list goes on.

Yeah, the article essentially said "We couldn't use NT because it
isn't UNIX." Big surprise. (The company had made a big prior
investment in UNIX software and was unwilling to port or replace it.)

= Warren -- http://www.cyberport.com/~tangent
=
= Remove the SPAMCATCHER to email. -- Finger me!

Warren Young

unread,
Jan 23, 1998, 3:00:00 AM1/23/98
to

newc...@flounder.com (Joseph M. Newcomer) wrote:

> Amateur: the integer value for flow control options is

Amateur: Uses lots of global variables, with names like 'x' and 'i1'.
(I did not just make this up -- real code inspired this.)

Professional: Likes to test the 31-character identifier limit of C.

Amateur: Changes the text messages in the program until it stops
crashing.

Professional: Runs a bleedin' memory debugger on the sucker.

Amateur: Turns off the warnings so he can see the compiler's error
reports easier.

Professional: Turns on _all_ the warnings and fixes everything
reported even if there really is no problem...so he can see the
compiler's error reports easier.

Amateur: Trustful.

Professional: Positively paranoid: checks return values, assert()s
passed parameters, liberally uses try {} blocks (C++) and auto_ptr,
installs signal handlers, and writes elaborate error handling
libraries to handle all of these errors gracefully.

Robert Billing

unread,
Jan 23, 1998, 3:00:00 AM1/23/98
to

In article <34c934ee...@news.cyberport.com>
tan...@SPAMCATCHER.cyberport.com "Warren Young" writes:

> ...And porting from UNIX (or VMS, or Amiga, or...) to Windows is easy?

Not really, but porting between Un*x, VMS, Linux, Amiga, OS/2, RSX and
all other *real* operating systems is. I've done it. Real in this
context means handles multiple tasks with a reasonable form of task
interaction.

--
I am Robert Billing, Christian, inventor, traveller, cook and animal
lover, I live near 0:46W 51:22N. http://www.tnglwood.demon.co.uk/
"Bother," said Pooh, "Eeyore, ready two photon torpedoes and lock
phasers on the Heffalump, Piglet, meet me in transporter room three"

jmfb...@ma.ultranet.com

unread,
Jan 23, 1998, 3:00:00 AM1/23/98
to

In article <6a168h$ijk$1...@darla.visi.com>,
j...@visi.com (J. Otto Tennant) wrote:
>Raoul Golan <ra...@ind.tansu.com.au> writes:

>
>>Kelsey Bjarnason wrote:
>
>>> In article <69prv3$mm3$1...@vixen.cso.uiuc.edu>,
>>> wend...@news.cso.uiuc.edu says...
>>> > Hi all,
>>> >
>>> > Does anyone know why people design and create horrible-to-maintain, buggy
>>> > code? I have seen and heard about some real horror stories. Is it just
>>> > ignorance/incompetence on the coder's part? Lack of time, maybe?
>>> >
>>> > --
>>> > || Bill Wendling wend...@ncsa.uiuc.edu
>
>>Uh, job security?
>
>In my opinion, not usually.
>
<snip some reality>

It was others' job security. Consider having to engage the time
of a systems programmer to maintain a COBOL program.

/BAH

Rob Hafernik

unread,
Jan 23, 1998, 3:00:00 AM1/23/98
to

You know, it's a weird little secret, but the Metrowerks development
environment (which started out on the Mac, but now is available for the PC
as well) is really good, but nearly unknown outside of the Mac community.

They call it an "Integrated Development Environment". The editors,
compilers and debuggers all work together under a program called the IDE.
You organize your project into "project" files. These keep track of the
location of your source and libraries and other files. They also keep
track of your "targets", which are the things you want to build. Each
project can have multiple targets.

So, you can have one project that can build PowerMac, 68K Mac and Windows
versions of your program. The project also takes the place of makefiles
(thankfully), but has a scripting ability so that you can add any external
processing you like to the build process (even to running someone else's
compiler and importing the result).

The code editor does everything you expect: syntax coloring, function
pop-ups, parsing for missing parens or braces, auto-indent, block-indent,
any font and spacing, pop-ups to take you to include files, etc, etc.

The compilers have, in the past, produced some fairly poor code, but
they're pretty good these days. In particular, their PowerPC code is
quite good. Since the compilers just plug into the environment, they've
added support for additional targets (such as the Pilot) and additional
languages (such as Java).

The source-level debugging is as good as any I've seen anywhere (and works
the same for all targets and languages that support the debugger). You
can set breakpoints and watchpoints. You can evaluate expressions on the
fly. You can change the values of variables as the program runs. You can
view variables as their own type or any other type (very helpful with
pointers that might point to lots of different things). You can view
memory directly, in hex or as any variable. The debugger can be
automatically invoked when there's an exception.

There's also a fairly handy profiling tool that makes optimization a lot
easier. I just finished converting a PC video game to the Mac and it was
obvious that the PC programmers hadn't used a profiler on the code (I
presume because they didn't have one they liked), because there were some
obvious bottlenecks in the code that were pretty easy to work around. One
block of ten lines of code was taking up nearly 12% of the processor
time. With a little tweaking I brought it down to 2%. This is the sort
of thing that you can only find with a profiler.

Oh, and it compiles and links fast as hell, always an important point.
I've rebuilt a 120,000 line project in three or four minutes. A 10,000
line program will compile and link in a few seconds.

With a tool like Metrowerks, there's just no excuse for poor code.

PS. I don't work for Metrowerks, I just like their products <g>

Peter Seebach

unread,
Jan 23, 1998, 3:00:00 AM1/23/98
to

In article <shokwave-230...@as1-dialup-17.wc-aus.io.com>,

Rob Hafernik <shok...@well.com> wrote:
>You know, it's a weird little secret, but the Metrowerks development
>environment (which started out on the Mac, but now is available for the PC
>as well) is really good, but nearly unknown outside of the Mac community.

Hmph. I wrote them with a couple of bug reports, and they tried to
correct me.

Specifically, as memory serves, they tried to convince me that
int main(void) {
return 0;
};
is not a syntax error. The rationale? Well, yes, function defns don't
take ;'s, but you can just view it as a null statement!

Of course, C can't take null statements outside of blocks.

>The code editor does everything you expect: syntax coloring,

AaAAaAAAAaagggghhh!

Would you be happier reading English if prepositions were green,
nouns were blue, verbs were red, and adjectives purple, adverbs
orange, and everything else yellow?

Hint: Syntax coloring is not always a good thing.

>function pop-ups,
>parsing for missing parens or braces, auto-indent, block-indent,
>any font and spacing, pop-ups to take you to include files, etc, etc.

Uhm. So far, you've offered a couple of features I want and look for
in an editor, but none of the good ones:
* Regular expression support
* Piping blocks through external commands
* Full-featured command set, i.e.,
"go to every line that has a '.call' at the beginning;
there, replace anything that looks like a bunch of
letters, followed by the sequence '_foo' followed
by a group of numbers, with the same letters, followed
by '_bar', followed by the same group of numbers."
g/^\.call/s/\([a-z]*\)_foo\([0-9]*\)/\1_bar\2/g

[snip]

None of the other features you mention are unusual; every Unix system
has had them all for decades.

>With a tool like Metrowerks, there's just no excuse for poor code.

Sure there is! For instance, let's say the compiler fails to warn you
about a syntax error, or the manual gives you bad advice, or the example
program for "ANSI C" isn't ANSI C, because it declares main wrong.

I actually think they're pretty cool, but not as cool as you think they
are.

-s
--
se...@plethora.net -- I am not speaking for my employer. Copyright '97
All rights reserved. Boycott Spamazon! End Spam. C and Unix wizard -
send mail for help, or send money for a consultation. Visit my new ISP
<URL:http://www.plethora.net/> --- More Net, Less Spam! Plethora . Net

Peter Seebach

Jan 23, 1998

In article <34c934ee...@news.cyberport.com>,

Warren Young <tan...@SPAMCATCHER.cyberport.com> wrote:
>...And porting from UNIX (or VMS, or Amiga, or...) to Windows is easy?

Except when Windows has a crappy API, for instance, Winsock, yes, it is.
I took an implementation of RPC that someone had mostly-ported to Win '95,
and I got it running on both kinds of windows in a day or two.

>Granted, his original statement was overkill, but don't kid yourself:
>all of these programs have conditional code out the wazoo. Even today
>in these enlightened POSIX days, it still takes a powerful tool like
>Autoconf to give the illusion of easy portability. NT programs, on
>the other hand, pretty much just recompiled. I guess you could say
>the same thing about Solaris SPARC and x86, too, FWIW.

99% of the things people use autoconf for are just developer cluelessness.

I've had very little trouble porting between Unixes, and what I have had
is mostly a result of systems 8-10 years apart in feature set.

>Threading: NT does that well, too (it even does some things pthreads
>doesn't do, like thread-local storage).

Well? You call a system that needs 4x the memory, at least, of a Unix
system to run threads "well"?

>Editing: the Visual C++ and Borland C++ editors do just fine, and if
>you don't like them, you can use that $1200 per-seat difference to put
>towards whatever editor you like. Or, get the Win32 port of emacs or
>vim -- they work just fine.

Uhm. What $1200 per-seat difference? Real Unix (BSDI, NetBSD, Linux)
all come with real editors and toolsets. Free. So...

>Compiling: PC compilers are better than most UNIX compilers, probably
>just because there are more people using them.

Oh, nonsense! They're utter crap! I'm sorry, but I collect compilers,
and the absolute worst, without doubt, have been the PC ones.

gcc is one of the top five compilers I've ever used. Clear, informative
diagnostics. When there's a conflict between two headers, it tells you
which ones - MSVC 4.x was still just telling you that something was redefined,
not telling you where the previous definition is.

That, right there, can make a twenty minute to an hour difference in how
long it takes to get something running.

>Borland C++ and Visual
>C++, for example, both support the newest C++ features better than
>g++, and definitely better than any stock compilers I've seen.

You're joking. This is the Visual C++ that was still using the ARM scope
for variables in for loops?

When did they get support for 'mutable'? gcc had it around '91 or '92.

>(Sure,
>spend big money on a third-party toolset and that might change, but PC
>compilers are still pretty darn good these days.)

No, they're crap. They're horrible. I would rather use a Macintosh compiler
than any PC compiler I've had the misfortune to be stuck on. My experience
was limited to various VC 4.x's, VC 1.52c, and Borland 4.52 or so... But they
were all horrible.

>Telnetting: No one in their right mind sticks with the default telnet
>client. Come on, splurge on a $30 shareware one. Heck, there are now
>freeware ones that are good enough for most purposes.

Oh, I see - telnet isn't basic functionality, so it's okay that the default
telnet client is completely broken. Nice try, but I'll stick with an OS
that has reasonable tools.

>Don't fool yourself: Microsoft has some good stuff these days.

Really?

>As far
>as browsers go, Netscape and Microsoft are about equal as far as
>perceived quality goes, and I have no doubts that Microsoft will
>eventually be the default browser on Windows. Personally, I'm a
>Netscape guy, but I still think MSIE has its points. For example, it
>starts up _much_ faster, is feature-competitive, and doesn't seem to
>crash any more than Navigator. I just don't like the
>newbie-handholding aspect it purveys.

Well, it starts up much faster for the same reason that M$ apps are
always faster - they can cheat, they can put stuff into the OS, and
so on.

Try starting MSIE on a Unix system some time.

>I hate it when people use "real" this way -- as if all N million of us
>PC people are figments of the universe's imagination or something.

No, but many of you aren't really *using* computers, you're just toying
with them. (Hell, I know I am a lot of the time - and for that, I have
a PC.)

>Take a look at Intergraph and Compaq. They both build SGI killers,
>and I'm sure there are others. The reason that machines like these
>are not common in the PC world is that most PC users don't need
>machines like that -- why pay for what you don't need?

I have no idea what you consider an "SGI killer", but I haven't been
able to find any PC's comparable to SGI's graphics workstations...

>If instead you're trying to
>get us to believe that UNIX is the across-the-board best OS, you're
>gonna have to do a lot better than this.

True - if it weren't for the lack of memory protection, I would have
to say AmigaDOS is competitive.

>Yeah, the article essentially said "We couldn't use NT because it
>isn't UNIX." Big surprise. (The company had made a big prior
>investment in UNIX software and was unwilling to port or replace it.)

How about MSNBC, who couldn't use NT because it simply couldn't handle
real loads?

How about the guy I was working with who had an NT server with nine
virtual web pages on it... It was overloaded, so they had to up the
memory from 128MB to 256MB.

The Unix box, with 128MB, was handling ninety. It wasn't heavily loaded
yet.

How about ftp.microsoft.com, over ten dedicated ftp servers, serving less
information than a single unix server at one of the other places...

sp...@lisardrock.demon.co.uk

Jan 23, 1998


On 1998-01-22 ds...@cameonet.cameo.com.twx(DanStrychalski) said:
-> I can guarantee that if Microsoft did not exist, we would NOT have
-> 200MHz Pentium-equivalent-power laptops for $2,500 today (one sits
-> beside me right now, running Win95).

-To paraphrase you, how has this benefited me? I can guarantee you
-that we would have such machines in good time, and we would not be
-forced to use them for tasks that could be done just as well on a
-286.

we *knew* there was some reason why our home 386sx20 seemed faster than
our work 486dx2/66...

sp...@lisardrock.demon.co.uk

Jan 23, 1998


On 1998-01-21 newc...@flounder.com(JosephM.Newcomer) said:
-OK, the "illusion of choice" goes out the window. Or Windows.

-How has this hurt me?

see below. if you had a choice (or rather, let yourself believe you had
a choice; modern unices are pretty damn good, especially a linux system
running on one computer at work which is generally underspecified for
win95.

but we know, it's not comparing like with like. a better comparison for
linux would be nt/server. which just goes to show...)

-Most of my career I used a variety of proprietary operating systems,
-compilers, and languages. Every couple years I had to relearn
-everything. Then along came Unix. And everything was now uniform.

why is having to relearn everything a *bad* thing...? and why couldn't
you have used the common principles that you must surely have picked up?
a text editor is a text editor is a text editor. an algorithmic language
ditto. sure the implementations would have differed, but LDA is the same
whichever machine you're doing it on (with the possible exception of the
cdc6600).

-Except that every Unix vendor had to have their own cool features
-that made it incompatible (only the simplest applications would port
-without major effort, and "real" apps wouldn't port at all).

whatever became of the k&r c library...? or, for that matter, the
standard facilities of f77?

now look. porting a dos app to a mac is going to be a bit of a bastard
too. so what? whenever you port an app from one computer to another (or
from one language to another, or whatever) it's a sod. that's just the
way of things. it's damned near impossible to do anything useful without
doing things specific to the system you're using (sorry, java).

these days, porting isn't an issue because your beloved windows has
locked people into *one* architecture, running on *one* platform, using
*one* company's technology. no options for improvement. but just try
porting a windows app to the mac, and then come back and tell us that ms
has done away with the compatibility problems of modern computers.

better still, try porting it to x/windows.

-Compilers were a side issue; one compiler, for one vendor, had the
-problem that if it constant-folded an integer comparison in an
-if-statement, it inverted the sense of the branch. Their solution:
-"Don't do that".

well, *don't* do it then, it's not hard to avoid. (what is the extra
effort involved in working out the constant beforehand, giving it a
meaningful name, and not worrying about whether the compiler can do
something you should have done yourself in the first place...? in a
perfect world, compilers wouldn't optimise anything.)

-Their fix "We'll fix it in the next release, next
-year". Seven years ago, I was given a RISC-6000 machine. Other
-than the fact that its compiler didn't work, its debugger didn't
-work, its linker didn't work, and there was no document production
-system on it, no one knew how to magnify the fonts so I could read
-them, or even get the fonts loaded, and it compiled slower than my
-386/33, it was all right. Microsoft had *already* produced a
-development environment that in every dimension exceeded anything I
-had ever seen or used in any other environment in history, and it
-only cost $350/seat on a $2,500 machine (as opposed to something
-like $1500/seat on a $30,000 machine).

hmm. which environment was that, then? can't have been visual basic,
next got there first several years beforehand (and in much better taste
and style) and they even took the cue from smalltalk (remember mvc,
anyone?) also can't have been anything they've ever done with c, we
haven't seen anything remotely functional coming from that direction.

or maybe you meant mbasic on the trs-80...? :>

-Given the quality of the environment I have, why do I care? If
-there is a better environment, it will displace Microsoft, and
-Microsoft knows it.

*WRONG*! look at nextstep. look at the mac. look at any number of great,
beautiful, powerful failures. look at os/2, ferchrissakes!!!

ms has done an extremely effective job of locking any possible
competitor out of the market almost completely. once a company
eliminates competition, it doesn't *have* to be good. or cheap. or
effective. all it has to do is sit there and give the orders.

and so many machines use it, and so much software relies on it (and
can't be redeveloped), that there *is* no competition effectively.
otherwise, why is cobol still a thriving opportunity of employment?

-I never had a choice with Unix anyway, I had
-to take whatever crappy dialect of Unix that ran on whatever
-hardware I had, so in what way has Microsoft "limited" my choices
-over what I had a decade ago? I can't run on a VAX? Why do I

well, actually, that almost certainly isn't the case. there were and are
plenty of choices of unix for most machines. even in freeware, you can
have linux, minix or one of half a dozen different BSD ports. and
manufacturers have never been terribly good at giving their customers
operating systems that made the best of the underlying irons.

-care? When has hardware ever been a deciding factor, anyway? And
-don't blame Microsoft; they supported both MIPS and PowerPC, until
-the people selling those platforms decided that they didn't want NT
-on them because nobody was buying them. So the MARKET, not

umm, microsoft wasn't forced into giving up on support for those
platforms. the last thing we need from anyone is a "poor microsoft,
aren't they victims?" speech.

-Microsoft, elected to go with Intel. When I can insert pictures,
-by-reference, in spreadsheets that I can insert in Word documents,
-print them in full color on any printer I want (check on an article
-I did in SIGPlan Notices on IDL about 15 years ago; after trying
-for THREE WEEKS to print it at the SEI on the laser printer, I gave
-up and printed it on my dot-matrix printer on my DOS box, because
-THAT WORKED). I don't even want to THINK about the two days it
-took me to insert a bitmap image in a document under Unix in 1989.
-And after I inserted it, I couldn't print it! Today, it is a few
-mouse clicks!

yep. 1989. sure. in 1989 we can imagine that inserting a bitmap image
into a document under windows would have been something of a hassle
too. after all, no ole, no dde, none of the technologies (all of which
are available from others, better implemented, with more power and scope
and fewer stupid blind points) that actually make all of this stuff
possible.

but please, point to which of these technologies microsoft have
developed, pioneered, and used long before they were credible elsewhere.

take your time, please.

and then take a few minutes to consider whether doing what you find so
wonderful now would even have been *possible* had microsoft achieved
their current level of market domination a few minutes earlier.

nowadays, ms has to rip off public domain solutions. it's driven all
its commercial competitors to the wall. it's even started buying into
universities (cambridge, of all places). what happens when ms own all of
them?

you don't want choice? fine. we'll assume you don't want any of the
benefits you raved about above either - or certainly, you won't want the
equivalents that should have been along in five years' time.

-Frankly, I don't care how rich Bill Gates gets; what I care about is
-that the environment that I work in gets better and better.

same here. shame it won't. (until we get linux at work, that is.)

-Microsoft accomplished this when the oh-so-wonderful Unix was still
-trying to figure out what "portability" meant. I can guarantee

ms didn't have to worry about portability. so what?

-that if Microsoft did not exist, we would NOT have 200MHz
-Pentium-equivalent-power laptops for $2,500 today (one sits beside
-me right now, running Win95).

bollocks.

intel have done their own thing in the micro race (except perhaps for
mmx). they've had to, because people like digital weren't going to slow
down for anyone - even intel has to stay competitive. and the pda
revolution would've happened anyway (and it seems like a little company
in the uk will win that particular battle).

so no, we wouldn't have had 200MHz pentia, perhaps, but we would
probably still have had 500MHz strongarms, 250MHz powerpcs, etc. for the
same price. and probably with better overall performance. and hey, we
might have had 200MHz 80860-based systems instead, which would have been
much nicer.

-If I'm going to have a "choice", I want a choice that gives me
-something BETTER than what I already have. I'm not seeing enough
-difference between most products to feel that I need a lot of
-choices.

so in other words, ms - who are so wonderful - aren't actually all that
different to anyone else...?

John Cochran

Jan 23, 1998

In article <tsw-210198...@cypher.cagent.com>,

Tom Watson <t...@cagent.com> wrote:
><<<deletia with an example of horrible code, with commentary>>>
>
>Are there any other examples of horrible code (in whatever language).

At a previous assignment, I was asked to evaluate some 'C' code for
maintainability. Some of the things I found.

1. emsg[0] = 'E';
emsg[1] = 'r';
emsg[2] = 'r';
....
emsg[68] = '\0';
report_error(emsg);

2.
char *strncpy(int n, char *dest, char *source)
{
/* Code to copy NUL terminated string from source to dest */
/* for at most n characters. Yes, the exact same functionality */
/* as the standard strncpy(). Just that the parameters don't match */
}

After finding many more problems (just the kind of thing you would expect
when encountering a group of 10 people who just developed a project without
any overall leadership or plan, and it was their FIRST encounter with Unix,
and it was their FIRST encounter with C, and it was their FIRST encounter
with a certain graphics package...),

I finally asked them to provide me with a directory subtree that had
everything required to compile the project and NOTHING else. After
they claimed to do that, I typed:
grep main *.c
and about a dozen definitions of main() popped up.

I sincerely pity who ever is having to maintain that monster.

John Cochran


Dr. Peter Kittel

Jan 23, 1998

In article <34C67D98...@boeing.com> "John D. Burleson" <john.d....@boeing.com> writes:
>Joseph M. Newcomer wrote:
>>
>> OK, what is "mediocre code"? While I don't like the bugs in Word or
>> PowerPoint better than anyone else, have you every really *used*
>> mediocre code? FrameMaker, for example, makes the concept of a
>> mediocre GUI look like an improvement, and it was written for the
>> legendary Unix-that-can-do-no-wrong. And it is full of bugs, and its
>> design is incredibly poor in most directions.
>
>Huh? What was the last version of FrameMaker you used, 2.0? FrameMaker
>has an interface that is remarkably consistent across multiple platforms
>and creates files that are compatible across ALL these platforms as well
>from version to version. I have never, in 5+ years of using FrameMaker,
>had a single lock up or crash.

Ugh, and that FrameMaker 5.0 here for Mac crashes every quarter of an
hour. Weird. On different Mac clones, I should add.

--
Best Regards, Dr. Peter Kittel // http://www.pios.de of PIOS
Private Site in Frankfurt, Germany \X/ office: peterk @ pios.de


Jonathan Feinberg

Jan 23, 1998

shok...@well.com said...

> Oh, and it complies and links fast as hell, always an important point.
             ^^^^^^^^
No matter how fast you throw new standards at the Metroworks Complier, it'll
keep up, with new Self-Modifying Compliance Modules.

--
Jonathan Feinberg j...@pobox.com Sunny Brooklyn, NY

Warren Young

Jan 24, 1998

se...@plethora.net (Peter Seebach) wrote:

>In article <34c934ee...@news.cyberport.com>,
>Warren Young <tan...@SPAMCATCHER.cyberport.com> wrote:
>>...And porting from UNIX (or VMS, or Amiga, or...) to Windows is easy?
>
>Except when Windows has a crappy API, for instance, Winsock, yes, it is.
>I took an implementation of RPC that someone had mostly-ported to Win '95,
>and I got it running on both kinds of windows in a day or two.

Well, I may be biased, since I'm a Winsock evangelist, but it seems to
me that it works just fine. Sure, it has a few limitations and bugs,
especially on Windows 95, but we're not going to talk about _that_ OS.
On NT, the only limitation I can find, compared to common UNIX
systems, is that it doesn't allow most kinds of raw sockets. Since
this latter is a feature (Windows doesn't have a reputation as a
cracker OS, and I'm sure Microsoft wants to keep it that way), I'm not
inclined to be too hard on MS for doing that.

What was hard about your RPC port? And incidentally, why couldn't you
just use Microsoft RPC? I know that there are differences, but I
think there's at least _some_ interoperability.

>>Threading: NT does that well, too (it even does some things pthreads
>>doesn't do, like thread-local storage).
>
>Well? You call a system that needs 4x the memory, at least, of a Unix
>system to run threads "well"?

NT wants 32MB to run well, just like all of the commercial UNIXes I'm
familiar with. Of course, they all want more if they can get it, but
that's just computers in general. Make sure you're comparing apples
to apples here: both running window systems, for example.

No, NT is not a slim OS -- Linux is a slim OS -- but when you can get
64MB of memory for $100, who's counting?

>>Editing: the Visual C++ and Borland C++ editors do just fine, and if
>>you don't like them, you can use that $1200 per-seat difference to put
>>towards whatever editor you like. Or, get the Win32 port of emacs or
>>vim -- they work just fine.
>
>Uhm. What $1200 per-seat difference? Real Unix (BSDI, NetBSD, Linux)
>all come with real editors and toolsets. Free. So...

Just using the example given by the original post.

>>Compiling: PC compilers are better than most UNIX compilers, probably
>>just because there are more people using them.
>
>Oh, nonsense! They're utter crap! I'm sorry, but I collect compilers,
>and the absolute worst, without doubt, have been the PC ones.
>
>gcc is one of the top five compilers I've ever used. Clear, informative
>diagnostics. When there's a conflict between two headers, it tells you
>which ones - MSVC 4.x was still just telling you that something was redefined,
>not telling you where the previous definition is.

No doubt, gcc diagnostics are nice. I wasn't saying that _everything_
is better about the compilers I've chosen. I was mainly referring to
language support, about which more below. In any case, Borland C++
_does_ tell you where the earlier definition was, and in many
respects, I prefer Borland C++ over Visual C++. (If _Borland_ cared
more about BC++, I'd still be using it.)

>>Borland C++ and Visual
>>C++, for example, both support the newest C++ features better than
>>g++, and definitely better than any stock compilers I've seen.
>
>You're joking. This is the Visual C++ that was still using the ARM scope
>for variables in for loops?

Granted, VC++ doesn't do everything all that well, but it does tend to
do templates better than g++. For example, g++ barfs on templates as
default template parameters and on nested templates. For those
keeping score, BC++ does do templates as default template parameters,
but doesn't handle nested templates, while VC++ does both correctly.

And hey, if you wanna be snippy, how 'bout those great g++
diagnostics:

x.cpp:5: warning: namespaces are mostly broken in this version of g++

>When did they get support for 'mutable'? gcc had it around '91 or '92.

It wasn't added to the Draft Standard earlier than a year or so ago.
(April 1995 DWP, IIRC.) So, if what you're saying is true, it was
only as a vendor-specific extension. Again, Borland C++ had this
feature soon after it was released from committee.

>>starts up _much_ faster, is feature-competitive, and doesn't seem to
>

>Well, it starts up much faster for the same reason that M$ apps are
>always faster - they can cheat, they can put stuff into the OS, and
>so on.
>
>Try starting MSIE on a Unix system some time.

It probably starts up faster because it uses a lot of DLLs and things
that are already loaded. On a UNIX system, they aren't there, so it
has to load them itself.

>>I hate it when people use "real" this way -- as if all N million of us
>>PC people are figments of the universe's imagination or something.
>
>No, but many of you aren't really *using* computers, you're just toying
>with them. (Hell, I know I am a lot of the time - and for that, I have
>a PC.)

Hmmmm...that's interesting -- I have a Linux box that _I_ toy with,
and I had an SGI once for the same reason. I get my actual work done
on my NT box.

My point is that a computer is what you make of it -- you can toy with
a Cray, if you want, and you can get real work done on an Apple ][,
too.

>>Take a look at Intergraph and Compaq. They both build SGI killers,
>>and I'm sure there are others. The reason that machines like these
>>are not common in the PC world is that most PC users don't need
>>machines like that -- why pay for what you don't need?
>
>I have no idea what you consider an "SGI killer", but I haven't been
>able to find any PC's comparable to SGI's graphics workstations...

I suspect that at the high end (maybe even what SGI calls its
"midrange") you're right, but there are many many people using
high-end PCs that would have been using an SGI not too long ago.
Again, I'm not saying that UNIX is junk or that Windows NT does
everything. I'm just trying to bring a little reason into this
thread: NT isn't junk, either.

>>Yeah, the article essentially said "We couldn't use NT because it
>>isn't UNIX." Big surprise. (The company had made a big prior
>>investment in UNIX software and was unwilling to port or replace it.)
>
>How about MSNBC, who couldn't use NT because it simply couldn't handle
>real loads?

How about Sun, which uses mainframes?

Kevin Handy

Jan 24, 1998

In article <tsw-210198...@cypher.cagent.com>, t...@cagent.com says...

>
><<<deletia with an example of horrible code, with commentary>>>
>
>Are there any other examples of horrible code (in whatever language).
>Maybe we could all laugh at them. Please post!!
>
In the book 'Programming Pearls', the author gave an example of
code he had seen written in COBOL. It went something like this
(not really knowing COBOL, I'm writing it more generically, and
am placing ellipses to shorten the code somewhat, but you'll get the idea)

TOTAL0 = 0
TOTAL1 = 0
...
TOTAL999 = 0

WHILE(GET DATA)

IF DATA = 0 THEN TOTAL0 = TOTAL0 + 1
IF DATA = 1 THEN TOTAL1 = TOTAL1 + 1
...
IF DATA = 999 THEN TOTAL999 = TOTAL999 + 1

NEXT

IF TOTAL0 != 0 THEN PRINT "TOTAL0 = " TOTAL0
IF TOTAL1 != 0 THEN PRINT "TOTAL1 = " TOTAL1
...
IF TOTAL999 != 0 THEN PRINT "TOTAL999 = " TOTAL999

Joseph M. Newcomer

Jan 24, 1998

Userland? Mr. Programmer Man? Hmmm, I've missed something here. *I
AM A USER*. Admittedly, a significant percentage of my time goes into
programming, but I write books, handle email, and when I have time,
participate in these discussions. I cannot think of a single activity
that I do now that was better under Unix, TOPS-10, OS/360, TSS/360,
TOPS-20, RSX-11M, MVS, or VMS. When I say "The environment gets
better" I mean the *entire* environment, not just the compiler. The
last time I tried to do email under Unix, about six years ago, I had
my choice of six undocumented mailers, each of which almost worked. I
could almost read newsgroups without significant agony. The drawing
tools could be most politely referred to as a joke (when I could draw
instantly using PowerPoint on Windows, I had to use PIC on Unix! When
I had Word on Windows, I had *roff on Unix.) What convinced me that
Windows was a winner was that I could go down to my local software
store and have my choice of a half-dozen packages for whatever I
wanted to do, from $29.95-just-barely-above-shareware to $600
professional packages. On Unix I had my choice of two freeware,
barely-functional packages, providing I could recompile them with my C
compiler (not guaranteed), and one professional product that was crap
and cost $5,000/seat. Not counting the $1500/year support package
without which I couldn't get tech support at all.

My choices are greater, the tools more powerful, and the cost is
negligible. In 20 years, Unix never succeeded in having a
"shrink-wrapped" market (albeit there were a couple products, this
does not a new wave make. For example, how many computer stores
existed selling Unix products ten years into Unix's reign? How many
stores existing selling Microsoft-compatible products ten years into
Microsoft's reign). Out here in the Real World, we want cost-effect
solutions to problems, not solutions that are "Microsoft-free". I
don't care *who* makes the solution; I care that it makes my life,
which includes email, accounting, invoicing, letter-writing,
book-writing, backup, printing, Web page design, etc. easier. Unix
never managed to do that, not in 20 years. In fact, I cannot name a
single operating system that was easier to use and gave me more power
than Windows. Yes, it can be miserable to program, but I've
programmed so many operating systems with so many problems that I
don't see the problems of programming Windows as any different (I
could regularly crash the X-server by sending it perfectly legitimate
requests, so had to limit what I could do to what that particular
X-server on that particular Unix box could handle without crashing,
independent of the X-spec. Of course, the first time we tried the app
on another Unix flavor, we discovered that *its* X-server had a
different set of bugs! THIS does not make my life easier).

I do not have a blind love of Windows; in fact, there are parts of it
that are *really* miserable to program. And Word crashes
occasionally. Eudora was written by people who never used dial-up
ISPs or heard of multithreading. ActiveVirus is a terrible technology
for the Internet, albeit a wonderful technology for abstraction when
used as originally intended. TOPS-20 required reading the tape status
register three times and voting on the results to see if the status
was valid. I remember CMU spending six months getting the high-density
Storage Technology tape drivers to work with our KL-10. The Unix site
at the SEI would successfully print only one out of ten tries on a
good day (mostly it just lost the request somewhere between my machine
and the printer). We spent years trying to get IBM's TSS/360 to work.
Unix self-destructed 30 workstations in one night (after having failed
to back up 80 of them for something like a year!). Configuring a
Macintosh in a real-world network was a nightmare (in spite of what
Apple liked to say, it wasn't that easy). If you want perfection in
this world, you need to be in some other profession than computing.

Last week I was in the hospital. The admissions people had IBM
displays with light pens, running half-duplex, and dot-matrix
high-speed line printers. They hope by next year to have laser
printers installed so they don't have to put up with the infernal
racket. BY NEXT YEAR! I had a personal laser printer in 1986! (The
first laser printer I used was in 1970, somewhat before they had
lasers). The progress in mainframes is astounding in its rapidity...

When I went to the ITC at CMU in 1990 for a year, the BIG NEWS was
that they could now support color displays in color! I had been using
a color display for four years. The only downside to this Great
Advance was that it took over 30 seconds to drop down a menu on a
$50,000 RISC-6000 (when I switched from 16-color to 256-color mode, I
found that it took 10 seconds to drop down a Windows menu, so I went
out and spent $150 for a new SVGA card and at 256-colors could drop
one down instantly, on a 386/33). Color Comes To Unix was big news in
1990. Ho hum.

Everyone seems down on Microsoft. Anyone who was around during the
60s and 70s should remember that IBM was treated the same way. Why is
it that software from Microsoft, one of America's foremost
free-enterprise companies, is evil, when software from AT&T, one of
America's foremost government-protected monopolies, was Good? Did it
become more moral after it was sold to Novell, perpetrator of the
worst example of networking I've ever seen (even beats NFS for
bogosity. At least Sun publishes their protocols. I know of very few
sites that can run a Novell network without constant fondling of the
server by the sysadmin, and very few sites that require other than
casual administration of a Microsoft network).

The Wintel platform has produced more millionaires than any
hardware/software technology in history. This is bad?

Oh yes, the issue with a Black Hole is that nothing comes out of it.
Poor analogy.

btw, you obviously know a better operating system and software
packages than most of us. How much did it cost, how much does the
software cost, how much effort does it take to install these wonderful
packages, and are they utterly bug-free and never crash? Can they do
everything my software packages do? I'm surprised I haven't heard of
such a wonderful system. How many millions has it made for its
creators, and how many millions of happy users does it have? Can my
brother afford four instances of it for his four kids? Complete with
game software, network support, CD-ROM support, education software,
accounting software, etc.? Does it fully support all standard
peripherals, including scanners, tape drives, modems, FAX modems,
color printers, still and video cameras, 24-bit 3-D color cards,
16-bit 3-D sound, high-performance CD-ROMs, writeable CDs, etc.? Does
it support a file server system capable of backing up both the server
and individual workstations? Does it have a full GUI? How many
different drawing, desktop publication, and image processing systems
can I get for it at my local computer store? Can I buy a 6-lb laptop
that runs it? Can I simply install it and expect it to run within
minutes of completing the installation? Do all the products have free
customer support? Premium customer support? Is its GUI documented
effectively with less than 3ft of manuals (this rules out anything
based on X, the Mac, or Windows). Clearly, this is the Market Winner,
if only I'd ever heard of it...

The example of my brother is a good one. He is a purchasing agent for
a small company. He has no interest in computers as computers; only
as devices that can help him get work done, and help his kids. His
four kids fought over who got the machine with the color printer until
we installed a network. (One afternoon, most of the time spent
threading the wire through the basement). With one machine having a
tape drive he can back up the entire network. He is missing very
little from the above list (no CD-R, and his 24-bit color cards don't
support 3D, and no digital cameras [yet]). His youngest was using it
at age 2 (try THAT with Unix!) and by age 26 months had figured out
how to turn it on and double-click the icon for her favorite game. His
wife uses the computer to design needlepoint, when she can get in
(fortunately, the youngest is now in 1st grade so she actually can
play with the computers in the afternoons).

This, folks, is what I mean by progress. Any other opinion had better
be able to come up with at least as good justifications.

On 22 Jan 1998 12:13:30 GMT, ds...@cameonet.cameo.com.twx (Dan
Strychalski) wrote:

>Joseph M. Newcomer (newc...@flounder.com) wrote --
>
>> Frankly, I don't care how rich Bill Gates gets; what I care about is
>> that the environment that I work in gets better and better.[...]
>
>Greetings from Userland, Mr. Programmer Man. The environment I work in
>has gotten worse and worse as your darling Billy has pulled the wool
>over more and more people's eyes and forced nearly all the world's
>computing population to work like six-year-olds.
>

>> I can guarantee that if Microsoft did not exist, we would NOT have
>> 200MHz Pentium-equivalent-power laptops for $2,500 today (one sits
>> beside me right now, running Win95).
>

>To paraphrase you, how has this benefited me? I can guarantee you that
>we would have such machines in good time, and we would not be forced to
>use them for tasks that could be done just as well on a 286.
>
>Windows sucks like a black hole, fellah, and so does everyone who
>advocates it.
>
>Dan Strychalski
>dski at cameonet, cameo, com, tw.
>Apologies for the non-threading newsreader and anti-spam devices.

Joseph M. Newcomer
newc...@flounder.com
http://www3.pgh.net/~newcomer

Marco S Hyman

unread,
Jan 24, 1998, 3:00:00 AM1/24/98
to

tan...@SPAMCATCHER.cyberport.com (Warren Young) writes:

> NT wants 32MB to run well, just like all of the commercial UNIXes I'm
> familiar with. Of course, they all want more if they can get it, but
> that's just computers in general. Make sure you're comparing apples
> to apples here: both running window systems, for example.
>
> No, NT is not a slim OS -- Linux is a slim OS -- but when you can get
> 64MB of memory for $100, who's counting?


ARGGGGHHHHHHHH! Not everyone (especially those who may be reading
a folklore group) wants to throw away that old hardware. I can run
the latest/greatest *BSD on my 8 year old 20 MHz 386. What I can't
do is get memory for it. They don't make the proprietary memory
expansion boards any more and the motherboard is maxed out at 8
Meg.

Now I don't run X on that particular machine, but I did at one time.
It's nice being able to run the same OS on my 8 year old 386 (8 meg
ram), my 3 year old portable (24 meg ram), and my 2 year old sun
(although the sun is still running SunOS for the next month or so).

// marc

Marco S Hyman

unread,
Jan 24, 1998, 3:00:00 AM1/24/98
to

newc...@flounder.com (Joseph M. Newcomer) writes:

> I had Word on Windows, I had *roff on Unix.) What convinced me that

Word vs *roff is an interesting comparison. I've learned (the hard
way) that I write MUCH better using *roff than using a wysiwyg
editor. Maybe it's lack of self control on my part, but I found
that given the opportunity to easily play with a document format I
spent all too much time doing just that -- to the detriment of
what was being written.

// marc

Kelsey Bjarnason

unread,
Jan 24, 1998, 3:00:00 AM1/24/98
to

[snips]

In article <6aah7v$r5g$6...@darla.visi.com>, se...@plethora.net says...

> Well? You call a system that needs 4x the memory, at least, of a Unix
> system to run threads "well"?
>
> >Editing: the Visual C++ and Borland C++ editors do just fine, and if
> >you don't like them, you can use that $1200 per-seat difference to put
> >towards whatever editor you like. Or, get the Win32 port of emacs or
> >vim -- they work just fine.

You know, it's funny. People talk about unix vs NT all the time, usually
about "efficiency" and "response" and "memory/resource requirements".
Meanwhile the old HP-1000 is still merrily serving 40-odd people and
running 60-80 processes, on average, with good response times, and has
just recently been upgraded - it now has 2Mb of RAM. Oh, and let's note
that the HP users generally run dumb terminals, not workstations; the
real work is all done on the server.

Oh, yeah, but unix is so much more efficient, right? :) Lemme know when
I can run Linux serving even 20 users with good response times in 2Mb.
With the server doing all the work. Even in plain old text mode.


--
Remove .no.spam from my address to respond.

Kelsey Bjarnason

unread,
Jan 24, 1998, 3:00:00 AM1/24/98
to

In article <6a9bdv$d9f$1...@nntp4.u.washington.edu>,
dpes...@u.washington.edu says...

It meant then the same thing it means now: "shoot the person who did the
command syntax and interface for this thing." :)

Pete Fenelon

unread,
Jan 24, 1998, 3:00:00 AM1/24/98
to

Rob Hafernik <shok...@well.com> wrote:
> With a tool like Metrowerks, there's just no excuse for poor code.

Nothing you mentioned in that article hasn't been available on other
environments before. Metrowerks is just another IDE; not a bad one, but
there's nothing I've not seen in MPW, or Visual C++, or Oberon, or Lisp
environments, or Mesa, or... or... you get the idea.

pete
--
Pete Fenelon ("There's no room for enigmas in built-up areas")
pe...@fenelon.zetnet.co.uk http://www.users.zetnet.co.uk/petef/
3 Beckside Gardens, Melrosegate, York, Y01 3TX +44 1904 438472

Joseph M. Newcomer

unread,
Jan 24, 1998, 3:00:00 AM1/24/98
to

There are several interesting classes of programming:
- Space matters (embedded systems with limited
ROM capability)
- Speed matters (real-time systems or systems with
serious performance requirements)
- Everything else

It is important not to confuse these categories. For example, when
coding the inner loop of a performance-demanding application, I wrote
relatively simple and straightforward code. It verified that
everything else worked, but the performance sucked. I then tweaked
the inner loop a bit and tripled its performance, and then turned on
"The Full Monty" of the compiler optimization and got another factor
of 4. At that point it was fast enough that it met the required
performance goals. But I didn't waste time optimizing, say, the
dispatch of mouse clicks, which went through some rather heavy
layering to produce clear, easily maintained code.

I've also done space-constrained programming, and again, I wrote code
as simple and straightforward as possible, then began trimming it when
we hit the space limit. Most of the trimming was in trading off code
size and performance for preloaded table size (I could compress a
table by a factor of 4 by writing hairier--and larger, slower--code,
and performance wasn't the bottleneck; 64K ROM was). By adding 400
bytes of code I could save 16K of table space (a combination of
compression and some nonlinear interpolation to halve the number of
points required in the table).

I've also done real-time programming, where the overhead of taking an
interrupt made the difference between meeting the realtime window and
missing it. All the interrupt routine had to do was check to see if a
new interrupt was pending, and if so, loop instead of taking the
interrupt. The saving in interrupt latency tripled the data rate we
could handle. Since the data came in bursts, we got out of the
interrupt handler often enough to service the GUI.

I've learned what often makes code expensive to develop is to confuse
either of the first two cases with the last case. Good application of
experience can suggest where some effort should be put, but until
you've built the system, you don't know for sure.

OK, here's a question: what's the difference between a hacker, a
computer scientist, and a software engineer? [stay tuned for the
answer...I'll give a few days for other inputs]
joe


On 21 Jan 1998 12:38:17 GMT, jo...@ucc.gu.uwa.edu.au (John "West"
McKenna) wrote:

>newc...@flounder.com (Joseph M. Newcomer) writes:
>

>> Amateur: Code Size Is All
>
>> Professional: Code Size is the least of the concerns.
>> ALWAYS compromise code size in the interest of
>> maintainability, extensibility, and understandability.
>
>I wish I could. I really do. But 32K of EPROM is 32K of EPROM.
>
>So I wrote an interpreter for a memory-efficient byte-code, compressed all
>my strings, and write code like memory doesn't matter.
>
>It is really, really slow. About 12500 instructions/second. But I don't
>care. When the routine that is called twice each second takes 2 seconds to
>run, you do some optimizing. When the user interface starts to feel
>sluggish, you do some optimizing. Until then, it doesn't really matter.
>
>> Amateur: codes
>> DWORD mask = (DWORD)pow(2.0, (float)n);
>
>> Professional: codes
>> DWORD mask = 1 << n;
>
>>(This last example is from the inner loop of a realtime system,
>>written to run on a 386 without floating point accelerator. It was
>>the least of the sins I found in that code)
>
>	char buf[10];
>	int i;
>
>	i=function_that_always_returns_an_integer_between_1000_and_9999();
>	sprintf(buf,"%d",i);
>	buf[0]=buf[2];
>	buf[1]=buf[3];
>	buf[2]=0;
>	i=atoi(buf);
>
>Took me a while to realise that it is simply doing
> i=i%100;
>(My C is rather rusty, so forgive me if the code is not quite right)
>
>John West

John West McKenna

unread,
Jan 25, 1998, 3:00:00 AM1/25/98
to

sp...@lisardrock.demon.co.uk writes:

>however... is there a "quick" way to divide by small odd prime
>constants, in the same way that there is to multiply? we can't work any
>out, but we're willing to admit we may be missing something.

>(eg. : 3* dup dup + + ; but : 3/ ??? ; )

Isn't : 3* dup 2* + ; better? (I assume Forth has a way of shifting a number
one bit to the left).

If you've got a fast multiply, x/n = ((65536/n)*x)>>16. More or less.
There might be some clever rounding trick that will make it always correct,
but it's far too early in the morning to think about it.

John West

sp...@lisardrock.demon.co.uk

unread,
Jan 25, 1998, 3:00:00 AM1/25/98
to


On 1998-01-24 tan...@SPAMCATCHER.cyberport.com(WarrenYoung) said:
-No, NT is not a slim OS -- Linux is a slim OS -- but when you can
-get 64MB of memory for $100, who's counting?

that one again. only someone who really doesn't care about computers
could come out with this line.

look. NT wants 32mb to stand up straight. but it really doesn't offer
very much more functionality than something like AmigaDOS, which took up
512k to stand up, kick a ball around, and score a few goals.

what has been gained???

shouldn't having to put 32Mb in a system get you a *hell* of a lot more
than going out and picking up a second hand amiga, rather than just a
few trimmings around the edges?

the only conceivable reason for a system needing 32Mb is code-bloat gone
utterly insane. there is no justification, no reason, no sense in it.
and it certainly isn't good value for money, even if 32mb only costs you
fifty quid.

-And hey, if you wanna be snippy, how `bout those great g++
-diagnostics:
-x.cpp:5: warning: namespaces are mostly broken in this version of g++

we'd call that refreshing honesty, personally. you won't ever find that
in a microsoft compiler. but then, if you did, you'd be buggered,
whereas in g++ you can at least (in fact, we suspect you're somewhat
expected to) dig around the source and put it right.

it's not so much the difference in ability that appeals, more the
difference in philosophy.

-It wasn't added to the Draft Standard earlier than a year or so ago.
-(April 1995 DWP, IIRC.) So, if what you're saying is true, it was
-only as a vendor-specific extension. Again, Borland C++ had this
-feature soon after it was released from committee.

oh, right, so g++ gets an extension 4 years before it's a standard, and
it's a proprietary extension. bc++ gets it a short while after, and
it's a fast implementation of the standard and a major selling point.

never mind that it was probably g++ which *defined* the standard in the
first place...

but then, you must like standards. you seem to be applying two at once.
:> :> :>

->Try starting MSIE on a Unix system some time.

-It probably starts up faster because it uses a lot of DLLs and
-things that are already loaded. On a UNIX system, they aren't
-there, so it has to load them itself.

hmm... it's not unreasonable to expect unix to have a sockets package
loaded on startup... and it's not reasonable to expect it to load
activex to do its desktop, when it has far superior technologies
available to it...

-Hmmmm...that's interesting -- I have a Linux box that _I_ toy with,
-and I had an SGI once for the same reason. I get my actual work
-done on my NT box.

we get our work done (kinda) on a win95 box. what of it? let's face it,
in the world of work, we all do what we're told by our customers,
because if we tell them they're daft, they'll find someone else who
will (and who probably won't do it half so well either).

unix isn't crap. nt isn't crap. but unix was small once, and could
become so again. it's highly unlikely that fate will ever befall nt...

sp...@lisardrock.demon.co.uk

unread,
Jan 25, 1998, 3:00:00 AM1/25/98
to


On 1998-01-24 kel...@no.spam.usa.net(KelseyBjarnason) said:
-You know, it's funny. People talk about unix vs NT all the time,
-usually about "efficiency" and "response" and "memory/resource
-requirements". Meanwhile the old HP-1000 is still merrily serving
-40-odd people and running 60-80 processes, on average, with good
-response times, and has just recently been upgraded - it now has
-2Mb of RAM. Oh, and let's note that the HP users generally run
-dumb terminals, not workstations; the real work is all done on the
-server.

-Oh, yeah, but unix is so much more efficient, right? :) Lemme know
-when I can run Linux serving even 20 users with good response times
-in 2Mb. With the server doing all the work. Even in plain old text
-mode.

which just goes to show, getting the hardware architecture right for the
use in the first place probably has a hell of a lot more impact on
eventual performance than any amount of flashy programming afterwards.
:>

sp...@lisardrock.demon.co.uk

unread,
Jan 25, 1998, 3:00:00 AM1/25/98
to

you know, on a folklore group, this kind of post is really asking for it
- maybe you'd be better posting these particular diatribes in
alt.windows.advocacy - but you enlivened our evening, so we return the
compliment...

On 1998-01-24 newc...@flounder.com(JosephM.Newcomer) said:
-Userland? Mr. Programmer Man? Hmmm, I've missed something here. *I
-AM A USER*. Admittedly, a signficant percentage of my time goes
-into programming, but I write books, handle email, and when I have
-time, participate in these discussions. I cannot think of a single
-activity that I do now that was better under Unix, TOPS-10, OS/360,
-TSS/360, TOPS-20, RSX-11M, MVS, or VMS. When I say "The environment
-gets better" I mean the *entire* environment, not just the compiler.

unfortunately, empirical data around us demonstrates that the
environment is deteriorating rapidly. and we don't suppose it's helped
by all those dioxins churned out by pcb manufacturers who have to race
to build ever bigger, better, faster computers just to justify billy
gates's refusal to employ any programmer who might just possibly show
him up.

if we all started using computers to their full potential, we'd be able
to recycle about the last ten years' worth of technological advances -
but that would require the sacking of at least 75% of programmers and a
general moratorium on hardware development. we'd love to see it, but we
doubt we're going to... maybe when the physical limits of silicon are
hit, we can all sit back and take a bit of a breather.

please...?

-The last time I tried to do email under Unix, about six years ago,

*six*years*ago*.

meanwhile one of our university professors was automarking our work
using email only a couple of years later. he wouldn't have done that if
it hadn't been reliable enough to use.

-I had my choice of six undocumented mailers, each of which almost
-worked. I could almost read newsgroups without significant agony.

funny; we've *never* yet found a newsreader as good as tin...

-The drawing tools could be most politely referred to as a joke
-(when I could draw instantly using PowerPoint on Windows, I had to
-use PIC on Unix! When I had Word on Windows, I had *roff on Unix.)

yes, where you have to use that long-neglected capacity, your
visuo-spatial imagination. it's called not being lazy. besides, the unix
philosophy has always been about power before prettiness (where the hell
else would awk come from???) so to find such tools isn't actually so
surprising.

of course, there are probably fast, easy x/windows front ends for them
these days. certainly there is for tex. which give you the best of both
worlds; the raw power of a command language (both tex and ?roff are
turing-equiv) combined with the convenience of a graphical editor.

show us *that* in windows.

-What convinced me that Windows was a winner was that I could go
-down to my local software store and have my choice of a half-dozen
-packages for whatever I wanted to do, from $29.
-95-just-barely-above-shareware to $600 professional packages. On
-Unix I had my choice of two freeware, barely-functional packages,
-providing I could recompile them with my C compiler (not
-guaranteed), and one professional product that was crap and cost $5,
-000/seat. Not counting the $1500/year support package without
-which I couldn't get tech support at all.

but then, unix wasn't really written for end-users anyway; it was
written for people to write software on. that it's possible now to use
it for the former just demonstrates the strength of the system.

-My choices are greater, the tools more powerful, and the cost is
-negligible. In 20 years, Unix never succeeded in having a
-"shrink-wrapped" market (albeit there were a couple products, this
-does not a new wave make. For example, how many computer stores
-existed selling Unix products ten years into Unix's reign? How many

you know, in some areas that could be classed as a point in unix's
favour... shrink-wrapped software is all very well so long as it can be
guaranteed bug-free.

if it can't, then for god's sake give us the source!

-stores existing selling Microsoft-compatible products ten years into
-Microsoft's reign). Out here in the Real World, we want cost-effect
-solutions to problems, not solutions that are "Microsoft-free". I
-don't care *who* makes the solution; I care that it makes my life,
-which includes email, accounting, invoicing, letter-writing,
-book-writing, backup, printing, Web page design, etc. easier. Unix
-never managed to do that, not in 20 years. In fact, I cannot name a
-single operating system that was easier to use and gave me more
-power than Windows. Yes, it can be miserable to program, but I've
-programmed so many operating systems with so many problems that I
-don't see the problems of programming Windows as any different (I
-could regularly crash the X-server by sending it perfectly
-legitimate requests, so had to limit what I could do to what that
-particular X-server on that particular Unix box could handle
-without crashing, independent of the X-spec. Of course, the first
-time we tried the app on another Unix flavor, we discovered that
-*its* X-server had a different set of bugs! THIS does not make my
-life easier).

microsoft the empowerer! whee! *snort*

or to put it another way: if we apply your arguments to government,
democracy is really inefficient, because of all those messy
incompatibilities; besides which, you never get the perfect set of tools
anyway, and a lot of the tools you do find just aren't worth using. so
wouldn't life be so much easier with a really benign dictatorship?

the trouble is that you forget that dictatorships don't stay benign any
longer than they have to. we don't like the way microsoft is forcing
competition aside through no other ability than marketing clout and some
highly dubious selling practices, for precisely that reason. and whilst
the unix world is probably even more flawed than you would suggest, at
least it was always democratic.

(if not anarchistic, which is even better ;> )

-I do not have a blind love of Windows; in fact, there are parts of
-it that are *really* miserable to program. And Word crashes
-occasionally. ^^^^^^^

you spelled "works" wrongly. hth.

-Everyone seems down on Microsoft. Anyone who was around during the
-60s and 70s should remember that IBM was treated the same way. Why
-is it that software from Microsoft, one of America's foremost
-free-enterprise companies
^^^^^^^^^^^^^^^
the last thing microsoft wants is a free market, and well you know it.

-is evil, when software from AT&T, one of
-America's foremost government-protected monopolies, was Good? Did it

AT&T weren't particularly pleasant either, but the great thing about
government protected monopolies (and at&t were never a monopoly, btw) is
that they tend to be so hopelessly inefficient that they drop diamonds
through their fingers (unix v6, anyone...?)

-The Wintel platform has produced more millionaires than any
-hardware/software technology in history. This is bad?

yes, and gerald ratner became a millionaire in an astonishingly short
time selling crap too. your point...? crap systems are much more likely
to produce millionaires than good ones, and closed systems far more
likely to produce people with power than open ones. does that make
either intrinsically worth having?

-btw, you obviously know a better operating system and software
-packages than most of us. How much did it cost, how much does the
-software cost, how much effort does it take to install these
-wonderful packages, and are they utterly bug-free and never crash?

(etc.etc... 8< )

linux. free. free. a reasonable chunk of effort, but no more than
justified - you get back what you put in. no, but what the hell is???

(and as regards your vast list of devices to support: against all the
odds, linux supports zip drives. because someone took the time and care
to reverse-engineer the proprietary interface protocol when confronted
with a manufacturer who didn't want to give out such information.
doesn't that strike you as anti-competitive practice...?)

-The example of my brother is a good one. He is a purchasing agent
-for a small company. He has no interest in computers as computers;
-only as devices that can help him get work done, and help his kids.
-His four kids fought over who got the machine with the color
-printer until we installed a network. (One afternoon, most of the
-time spent threading the wire through the basement). With one
-machine having a tape drive he can back up the entire network. He
-is missing very little from the above list (no CD-R, and his 24-bit
-color cards don't support 3D, and no digital cameras [yet]). His
-youngest was using it at age 2 (try THAT with Unix!) and by age 26
-months had figured out how to turn it on and double-click the icon
-for her favorite game. His wife uses the computer to design
-needlepoint, when she can get in (fortunately, the youngest is now
-in 1st grade so she actually can play with the computers in the
-afternoons).

yeah, whatever. at age 10 we were designing operating systems.

-This, folks, is what I mean by progress. Any other opinion had
-better be able to come up with at least as good justifications.

sorry, but when your justifications are any good, and have less to do
with "well, it makes my life easier", they may be worth trying to beat.
on the other hand, if it all turns to shit in ten years, we'll be there
saying "i told you so"...

Peter van Hooft

unread,
Jan 25, 1998, 3:00:00 AM1/25/98
to

In <34ea6229....@206.210.64.12> newc...@flounder.com (Joseph M. Newcomer) writes:

>Unix self-destructed 30 workstations in one night (after having failed
>to back up 80 of them for something like a year!).

This convinced me not to read any further. I don't know of _any_
serious environment where this situation would have been allowed to
continue for a _week_, let alone a year.
Furthermore, I cannot think of any reason why Unix would "self-destruct"
systems other than being instructed to do so, which by any standards would
be considered pilot error, and which is obviously not limited to Unix systems.

Therefore, Mr Newcomer is
- trolling
- a luser
- both

peter


Mike Swaim

unread,
Jan 25, 1998, 3:00:00 AM1/25/98
to

Peter van Hooft <ho...@natlab.research.philips.com> wrote:
: In <34ea6229....@206.210.64.12> newc...@flounder.com (Joseph M. Newcomer) writes:
:
: >Unix self-destructed 30 workstations in one night (after having failed
: >to back up 80 of them for something like a year!).
:
: This convinced me not to read any further. I don't know of _any_
: serious environment where this situtation would have allowed to
: continue after a _week_, let alone a year.

I know of 2, and I haven't been around that much. At the first, one of
the programmers took to deleting a file that he didn't need once a month
and requesting that it be restored "just to be sure." At the second, I
found out by accident, and had one of the netware sysadmins give me rights
to the accounting data so I could back it up via an NT machine I
controlled. I also told the head of IS, who didn't know about it, either.

--
Mike Swaim, Avatar of Chaos: Disclaimer:I sometimes lie.
Home:sw...@phoenix.net or sw...@c-com.net, I'm just not sure.
Work:mps...@gdseng.com Silly:whats...@gdseng.com
Alum: sw...@rice.edu Quote: "Boingie"^4 Y,W&D

sp...@lisardrock.demon.co.uk

unread,
Jan 25, 1998, 3:00:00 AM1/25/98
to


On 1998-01-25 jo...@ucc.gu.uwa.edu.au(John"West"McKenna) said:
->(eg. : 3* dup dup + + ; but : 3/ ??? ; )

-Isn't : 3* dup 2* + ; better? (I assume Forth has a way of shifting
-a number one bit to the left).

it does, but if you're using a register based machine it makes little
difference whether you code

SHL AX, 1

or

ADD AX, AX

and if you have an inlining forth compiler, you can get the effect of 2*
without having to write the optimiser for it. hence our preference. :>

-If you've got a fast multiply, x/n = ((65536/n)*x)>>16. More or
-less. There might be some clever rounding trick that will make it
-always correct, but it's far too early in the morning to think
-about it.

we went down that route at one point - even to the extent of working out
the trick to make it round properly (otherwise exact multiples tend to
be one off) - but it still assumes the presence of a fast multiplier,
and if you don't have one of those you do have a problem - if you have
to code up a multiply routine, you may as well code up a divide routine
too...

Robert Billing

unread,
Jan 25, 1998, 3:00:00 AM1/25/98
to

In article <34ca035b...@news.cyberport.com>
tan...@SPAMCATCHER.cyberport.com "Warren Young" writes:

> Hmmmm...that's interesting -- I have a Linux box that _I_ toy with,
> and I had an SGI once for the same reason. I get my actual work done
> on my NT box.

I started by doing that, now I'm *paid* for using Linux...

--
I am Robert Billing, Christian, inventor, traveller, cook and animal
lover, I live near 0:46W 51:22N. http://www.tnglwood.demon.co.uk/
"Bother," said Pooh, "Eeyore, ready two photon torpedoes and lock
phasers on the Heffalump, Piglet, meet me in transporter room three"

Rob Hafernik

unread,
Jan 25, 1998, 3:00:00 AM1/25/98
to

In article <6adlqu$5q2$2...@irk.zetnet.co.uk>, Pete Fenelon
<pe...@fenelon.zetnet.co.uk> wrote:

> Rob Hafernik <shok...@well.com> wrote:
> > With a tool like Metrowerks, there's just no excuse for poor code.
>
> Nothing you mentioned in that article hasn't been available on other
> environments before. Metrowerks is just another IDE; not a bad one, but
> there's nothing I've not seen in MPW, or Visual C++, or Oberon, or Lisp
> environments, or Mesa, or... or... you get the idea.

Well, for one thing, it runs a HELL of a lot faster than MPW in a fraction
of the memory, that would be enough for me. The editor is also a LOT
nicer than MPW (unless you have a command-line fetish, in which case you
should stick with MPW). The debugger is also far ahead of the debugger in
MPW.

Also, I'm not sure about all of the environments you mention, but I don't
recall them all being able to compile to different platforms in different
languages. Perhaps I didn't make myself clear: the Metrowerks compiler
will let you run on a Mac or PC and compile projects that target the Mac,
PC, pilot and a couple of other hardware platforms in multiple languages.

Someone earlier didn't like the idea of syntax coloring. Fine, turn it off.

Also, there was a mention of regular expressions across multiple files.
The Metrowerks compiler does this, I just didn't mention it. The Find
dialog also lets you search just the project source, just the project
headers, just the system headers or a custom set of files that you
define.

There's a LOT of stuff I didn't mention (such as a very nice
implementation of "diff" for files).

Neil.Frankli...@ccw.ch

unread,
Jan 25, 1998, 3:00:00 AM1/25/98
to

newc...@flounder.com (Joseph M. Newcomer) wrote:
> I cannot think of a single activity
> that I do now that was better under Unix, TOPS-10, OS/360, TSS/360,
> TOPS-20, RSX-11M, MVS, or VMS. When I say "The environment gets
> better" I mean the *entire* environment, not just the compiler.

A better shell with a better command language (I've just been analysing
Web server logs). Multiple apps running without others freezing or the
entire system crashing (I got the Win3.1 shock treatment). A real Web
server to test my pages off-line before a quick and easy transfer to
the on-line server (via the wonder of rsync). Effective proxy-server
based filtering of .gif junk from Web pages (by Junkbuster). A real
local cache for once-downloaded Web pages (Squid). A Netscape that
has crashed about once every 3 months. A real news server for off-line
news reading/posting without expiry of groups I want (INN). Oh, not to
forget a system that stays alive and automatically recovers from an X
server crash (thanks to xdm): just re-login and click forward to the
Web page I was fetching (which was still downloading during the X server
crash/recovery, thanks to Squid); a Windows (95 or NT) with a crashed
GDI.EXE would be fully dead.


> The last time I tried to do email under Unix, about six years ago, I had
> my choice of six undocumented mailers, each of which almost worked. I
> could almost read newsgroups without significant agony.

Interesting. The first time I met email was on a NeXT. Their Mail.app still
beats everything I have seen on any platform. The GNUemacs GNUS (I am
writing this on it) comes a close second. MS Mail (I had that thrust
upon me at work) was sufficiently bad for me to give it up and prefer
an X login to a VAX and, on that, the VT terminal MAIL program. CCmail
was not much better either.


> What convinced me that
> Windows was a winner was that I could go down to my local software
> store and have my choice of a half-dozen packages for whatever I
> wanted to do, from $29.95-just-barely-above-shareware to $600
> professional packages. On Unix I had my choice of two freeware,
> barely-functional packages, providing I could recompile them with my C
> compiler (not guaranteed), and one professional product that was crap
> and cost $5,000/seat. Not counting the $1500/year support package
> without which I couldn't get tech support at all.

Am I just imagining it, or would the $60 10-CD set with everything I am
running on this Linux PC at the moment (a few Internet downloads excepted)
have been a dream then? :-)


> the tools more powerful,

If you had said "easier" I would agree, but "more powerful", no.


> I care that it makes my life,
> which includes email, accounting, invoicing, letter-writing,
> book-writing, backup, printing, Web page design, etc. easier.

I am a member of a computer club including many Win 95 and NT users,
some of them PC dealers and system supporters. I have lost count of how
often I hear the sentence "I've got to reinstall, the Registry is
broken". Sure, that makes life easier: just click on the Install button
and pray that the drivers are there. I will use the saved time to
learn a bit more about the power of my system.


> Unix self-destructed 30 workstations in one night (after having failed
> to back up 80 of them for something like a year!).

I have seen Windows PCs take down 3 years of work because the user in
question had never heard of backup and there is no operator doing it.
In the end the problem disappeared after we got all users to install
Ethernet and save all important stuff to the VAX.


> Why is it that software from Microsoft, one of America's foremost
> free-enterprise companies, is evil, when software from AT&T, one of
> America's foremost government-protected monopolies, was Good?

No one insists that AT&T is good. Have you never heard the "Live AT&T
free or die" slogan? Unix became good through Berkeley, GNU and Linus.
A democratic world where users count more than bureaucracies (private
profit driven or government incompetence driven ones).


> The Wintel platform has produced more millionaires than any
> hardware/software technology in history. This is bad?

Not bad. But also no guarantee of good, as you seem to be implying.


> btw, you obviously know a better operating system and software
> packages than most of us. How much did it cost, how much does the
> software cost, how much effort does it take to install these wonderful
> packages, and are they utterly bug-free and never crash? Can they do
> everything my software packages do? I'm surprised I haven't heard of
> such a wonderful system. How many millions has it made for its
> creators, and how many millions of happy users does it have?

Example Linux: $60 for the CDs (used for 4 installs, so $15 per
machine), uses standard PC hardware, installs in 30..60 min, I have
found ca. 5 bugs (none serious) in 300MB of code, ca. 5 crashes in 3
years (all hardware related), no millions for its inventors (but that
is not an important measure, see above), ca. 5 million happy users.


> Can my brother afford four instances of it for his four kids? Complete with
> game software, network support, CD-ROM support, education software,
> accounting software, etc.?

$15 per seat. He can surely afford that.


> Does it fully support all standard
> peripherals, including scanners, tape drives, modems, FAX modems,
> color printers, still and video cameras, 24-bit 3-D color cards,
> 16-bit 3-D sound, high-performance CD-ROMs, writeable CDs, etc.? Does
> it support a file server system capable of backing up both the server
> and individual workstations? Does it have a full GUI?

Yes, all of these.


> How many different drawing, desktop publication, and image processing systems
> can I get for it at my local computer store?

Many CDs full.


> Can I buy a 6-lb laptop that runs it?

I even run it on a 1.5-lb palmtop (http://www.ccw.ch/NeilFranklin/Palmtops/).


> Can I simply install it and expect it to run within
> minutes of completing the installation? Do all the products have free
> customer support? Premium customer support?

It ran directly from install for me. Free support via Internet/Usenet/Web.
Premium support from people who get paid for nothing else (and if you
don't like one of them, go to another; no manufacturer's support
department has a monopoly here).


> Is its GUI documented
> effectively with less than 3ft of manuals (this rules out anything
> based on X, the Mac, or Windows). Clearly, this is the Market Winner,
> if only I'd ever heard of it..

OK, one bad point for Linux: you need to RTFM. I accept that one for
all the good I get. Ever heard the quip "Make it so a fool can use it
and only a fool will"? Replace fool with beginner (not the same thing)
and you still have a valid sentence.


> This, folks, is what I mean by progress. Any other opinion had better
> be able to come up with at least as good justifications.

Or simply be as good as, or better than, those you stated.


--
Neil.Frankli...@ccw.ch, http://www.ccw.ch/Neil.Franklin/
for Geek Code, Papernet, Voicenet, PGP public key see http:
Mac, 95 and NT users are CLUEless (Command Line User Environment)
If I go missing, it's once again my newsfeed that has crapped out

Neil.Frankli...@ccw.ch

unread,
Jan 25, 1998, 3:00:00 AM1/25/98
to

Neil.Frankli...@ccw.ch writes:
> The same as good science result from an open process of public critic
> (testing) and improvement (feedback), good software happens when the
> social structure making it can tap all of its knowledge and testing
> power to drive out the inevitable bugs that will creep in.
> The only OS made on such principles at the moment is GNU/Linux, which
Marco S Hyman <ma...@snafu.org> asked:
> I'm curious, why do you believe FreeBSD/NetBSD/OpenBSD are closed?
> Full source (kernel as well as all of user land) is available
> for a few dollars (CD) or the time it takes it you to download it
> off the net.

Hmmm, good question.

Source certainly has been available since the deal with UOotM (Unix Owner
of the Month). But what about modifications done by users? Has that
changed relative to the UCB days (up to 4.4BSD), when, with the exception
of a few handpicked universities, all code came from the group at UCB?

Thinking about it, it must have changed. After all, UCB isn't involved
any more. And with 3 different versions (Free/Net/Open*) there must be 3
different groups working on them, most likely with different policies.
I suppose my claim isn't valid any more these days.

* Could you give a concise description of what differentiates the 3 groups
and their versions of the system?

Neil "confused by xxxBSD" Franklin

Marco S Hyman

unread,
Jan 25, 1998, 3:00:00 AM1/25/98
to

Neil.Frankli...@ccw.ch writes:

> * could you give a concise description of what differs the 3 groups
> and their versions of the system.

Oh boy! This is sure to get people mad at me as I get the details
wrong. Oh well...

I believe FreeBSD started out as BSD specifically for the i386 family.
NetBSD was BSD 4.4 (more or less) for all architectures, and OpenBSD
started out due to personality differences between the maintainers
and concentrated on "security" issues. But (at least from my perspective)
they all seem to track each other these days, with nice things from the
BSD/OS (BSDI) crowd thrown in, too.

I'm certain you can find diehards who will point to some specific
piece of code and say xBSD does it better than yBSD. And they are
probably correct. But when it came down to deciding which one
to run, I picked the first one that booted on my portable with
its specific mix of hardware. In honesty, once I figured out
what the real problem was (I didn't know my portable had a 'sound
card' that lived at the IRQ my network card was trying to use),
any of the three would probably have booted.

If I have a preference at all it is probably OpenBSD because they
support anon CVS access to the source.

I've used all of them (plus BSDI up to 2.0). They all work fine.

// marc

Warren Young

unread,
Jan 26, 1998, 3:00:00 AM1/26/98
to

sp...@lisardrock.demon.co.uk wrote:

> -It wasn't added to the Draft Standard earlier than a year or so ago.
> -(April 1995 DWP, IIRC.) So, if what you're saying is true, it was
> -only as a vendor-specific extension. Again, Borland C++ had this
> -feature soon after it was released from committee.
>
>oh, right, so g++ gets an extension 4 years before it's a standard, and
>it's a proprietary extension. bc++ gets it a short while after, and
>it's a fast implementation of the standard and a major selling point.
>
>never mind that it was probably g++ which *defined* the standard in the
>first place...
>
>but then, you must like standards. you seem to be applying two at once.
>:> :> :>

My point is that it's all well and fine for one C++ compiler to have a
given feature, but if you can't count on it, it's only worthwhile if
you're willing to strap yourself to that compiler.

And yes, I do like standards, which is why I wouldn't have found the
G++ extension useful. I'm sure that if this sort of thing came out in
a PC compiler, you'd call it a proprietary extension, too. In fact,
I've got such an example: STL, which was created under Borland C++
because none of the UNIX compilers did templates well enough to
support its development. Before 1994ish, STL could be thought of as a
proprietary Borland extension. But now that it's been added to the
Standard and it's becoming ubiquitous, it's a wonderful feature and
everything's nice and spiffy. `Course, Visual C++ and Borland C++
_still_ do STL better than g++....

>unix isn't crap. nt isn't crap. but unix was small once, and could
>become so again. it's highly unlikely that fate will ever befall nt...

And it depends on your perspective whether this is a good thing or
not. Actually, you could argue that Windows CE is "NT become small":
portable Win32 workalike, etc. The real issue is that Microsoft is
not interested in porting WinCE to low-end, commodity hardware,
whereas the Linux community has no such problem.

Joseph M. Newcomer

unread,
Jan 26, 1998, 3:00:00 AM1/26/98
to

>Peter van Hooft <ho...@natlab.research.philips.com> wrote:
>: In <34ea6229....@206.210.64.12> newc...@flounder.com (Joseph M. Newcomer) writes:
>
>: >Unix self-destructed 30 workstations in one night (after having failed
>: >to back up 80 of them for something like a year!).
>: This convinced me not to read any further. I don't know of _any_
>: serious environment where this situation would have been allowed to
>: continue after a _week_, let alone a year.


The environment in which this was allowed to persist for over a year
was the Software Engineering Institute of Carnegie Mellon University.
Here's the story:

We bought a lot of MicroVax IIs. After several months of trying to
get them running at all (it turns out that the chip company from Japan
had delivered umpteen hundreds of thousands of chips with no physical
connection between the refresh line on the chip and the external pin
on the package; VMS ran just fine because it touched so many locations
that it automatically forced a refresh, while Ultrix didn't), we found
that due to a complete lack of imagination on the part of DEC they
decided that a 70MB drive was sufficient for a workstation [famous
quote by Ken Olsen: "Why would anyone want to put as much as 70MB on a
*workstation*?"; he thought disks that big were frivolous, and smaller
disks would have been more than adequate]. Anyway, we ended up
violating the DEC hardware assumptions and putting a *second* 70MB
drive on the machines. These were configured with soft links so that
the user files were all on the second drive while the system files
were on the first drive, all transparent to the users. The software
was largely crap, and eventually I even refused to read email on the
machines; I had my secretary print out my email each morning and I had
her respond to it. There is something about an eight-minute
turnaround on the "send" command that makes using email frustrating.

Why eight minutes? Well, some people complained that having the
hosts.txt list on their machine took up too much of their disk, so
instead it was soft-linked to the server. With the typical brilliance
exhibited by the then-sysadmins, we were not permitted to actually
*have* the copy on our workstations, *even if we wanted it*, so
instead of 30 seconds to send a message (still too long), it took
between 3 and 8 minutes depending on the load on the server!

So anyway, the documentation group, responsible for the 1-year and
5-year plans, hit a disk space limit on the tiny 70MB drives, but by
that time someone had located a larger drive. So they put a larger
drive on the machine, and went to restore the files from tape.
Whoops! No files! It turns out that the backup program which ran on
the server cleverly didn't follow soft-links, so for a year it had
been backing up 80 copies of the Unix kernel, the Unix man pages, the
Unix utilities, *and not a single user file*. By this time, due to
circumstances that are murky, the original disk had been clobbered.
Less than a week before the 1-year and 5-year plans were due at DoD,
they had been lost!

OK, the first thought is to use the new Kurzweil OCR system that
someone had bought but never tested. So they tried to OCR the
existing pages. This didn't work (OCR was pretty primitive in the
mid-1980s). So instead every secretary in the building was co-opted
into retyping pieces of the proposals, and they were re-assembled
(including the formatting commands) in a marathon session by the
documentation folks.

Why the then-head-of-operations was not fired on the spot at that time
is completely unknown.

But this isn't the worst. They changed the option that followed
soft-links so the backup software would now back up all the user
files. The problem is that the backup system had a couple different
aspects, one of which was to resynch all the workstations with the
file server. If a file existed on the server and did not exist on the
workstation, it was downloaded to the workstation. If the FS version
was newer than the WS version, it overwrote the version. And if a
file had been deleted from the FS, it was deleted from the WS.

Do you see it coming?

That night. The next night, one of the junior system programmers was
working late, when suddenly all his files disappeared. Some quick
checking indicated that they had all been deleted within a small
window of time. He checked, and a couple other machines were equally
bare of files. He quickly shut down the backup system. It turns out
that since it now followed soft-links, it found all the user files.
But there were no copies of these files on the corresponding
non-existent directories on the file server, so it decided that they
had to be deleted from the workstations. 30 machines were destroyed.
Of course, none of these files had been backed up. My machine was one
of the 30 destroyed. Fortunately, I had decided that Unix was crap,
and had insisted on having an IBM PC installed in my office. All my
real work, except for email, was done on my PC. I had been out of
town on an extended vacation, and lost a month's worth of email. I
made the most progress of anyone during that month; I lost no real
work. Everyone else lost all their work so made negative progress.
In some cases, a whole year's worth of work was lost.
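
The resync rules above are worth spelling out, because the disaster is baked into them. A toy sketch of the per-file decision (my illustration, not the SEI's actual software): once the backup followed soft-links, it found user files the server had never stored, and rule 3 condemned every one of them.

```c
#include <string.h>

/* Toy model of the one-way "resync" rules described above:
 *   1. on server, missing on workstation -> download to workstation
 *   2. server copy newer                 -> overwrite workstation copy
 *   3. absent from server                -> DELETE from workstation
 * Any file the server has never seen falls into rule 3. */
static int should_delete(const char *ws_file,
                         const char *server_files[], int n_server)
{
    for (int i = 0; i < n_server; i++)
        if (strcmp(ws_file, server_files[i]) == 0)
            return 0;   /* server knows it: keep (download/overwrite) */
    return 1;           /* server never saw it: rule 3 deletes it */
}
```

With a server that had only ever backed up system files, every user file on every workstation returns 1.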

I agree. No professionally-run installation could have this happen.
However, in spite of the protestations of all of the senior staff
(such as myself, who was an experienced sysadmin of a TOPS-20 site),
the SEI would not spring for the salary of an experienced Unix
sysadmin; instead choosing to "bring up" someone from inside and hire
fresh CMU graduates as cheap system labor.

In 1986, the SEI director responsible for this and many other fiascos
resigned to pursue other opportunities. He was given a faculty
appointment at the Graduate School of Industrial Administration. Much
to the dismay of his considerable ego, he was given a shared office
with a graduate student. This is someone who put in the SEI budget a
line item for his chauffeur, and designed the new SEI building with a
helipad so he could be brought into work via helicopter if necessary
(it turns out the area he lives in has a specific ordinance against
landing helicopters, so he had to back down on this one). He was
personally affronted when the Air Force explained that such
frivolities were *not* going to be covered by any contract *they*
administered.

Just to give you an idea of the quality of management at that time,
the head sysadmin STILL KEPT HIS JOB!

There were many problems with this environment; and then we discovered
that we were not permitted to know the root password on our machines.
Something about NFS and if I became root on my machine, I was
implicitly root on every machine...wow! This is what is known as
Robust Security. Of course, none of these people understood that you
could boot standalone, change the root password to whatever you
wanted, then reboot back to the network. When it was finally
explained to them, they decided to make it impossible to boot a
machine standalone. As far as we could tell, this was impossible,
besides being incredibly stupid.

Then one day they decided to build a line concentrator that would
allow the sysadmin types to reboot any machine and administer it from
a single physical location. The junior types came in with a design
for the hardware for the multiplexor box and a design for the packets
down to the bit level that would be sent between the multiplexor box
and the administrative host. Someone sitting around the table did the
arithmetic of 9600x150 (the projected number of machines was 150) and
pointed out that the single 9600-baud link between the multiplexor box
and the central host would be, shall we say, a little overburdened.
And the designed multiplexor box did not have enough RAM to handle the
buffering. Of course, nobody thought of something as obvious as a PC
with a multiport serial card (real programmers don't consider PCs as
computers), instead this was a design from the chip level up using at
best MSI components, home-built, with its own embedded operating
system. Like the staff had nothing else to do but build and debug
this...

The guy responsible for this idea was the same person responsible for
the backup fiasco. Shall we say he had no true appreciation of any
form of reality?

Professionally-administered is not a term I would have applied to that
site at that time. I left in 1987 (My letter of resignation started
off "Next week I shall be 40. I have decided to give myself a
birthday present: I'm resigning"). Fortunately, the new director,
Larry Druffel, eventually got the situation under control, but it was
a struggle because of the years of incompetence and neglect that had
preceded his tenure.

Michael Kircher

unread,
Jan 26, 1998, 3:00:00 AM1/26/98
to

k...@srv.net (Kevin Handy) writes:

> [snip some COBOL code, whose author didn't know, that 'varying' exists.]

ROTFL :-)

Cheers,

Michael

Warren Young

unread,
Jan 26, 1998, 3:00:00 AM1/26/98
to

Robert Billing <uncl...@tnglwood.demon.co.uk> wrote:

>> Hmmmm...that's interesting -- I have a Linux box that _I_ toy with,
>> and I had an SGI once for the same reason. I get my actual work done
>> on my NT box.
>

> I started by doing that, now I'm *paid* for using Linux...

I tried that, but my employer would have none of it. B-) Still, I've
managed to sneak in some fairly vital programs that only run on Linux.
If nothing else, it keeps it in his face.

Warren Young

unread,
Jan 26, 1998, 3:00:00 AM1/26/98
to

Neil.Frankli...@ccw.ch wrote:

>Neil.Frankli...@ccw.ch writes:
>> The only OS made on such principles at the moment is GNU/Linux, which
>Marco S Hyman <ma...@snafu.org> asked:
>> I'm curious, why do you believe FreeBSD/NetBSD/OpenBSD are closed?
>

>* could you give a concise description of what differs the 3 groups
>and their versions of the system.

I don't know how they differ, but I believe they're all "cathedral"
systems (see http://www.ccil.org/~esr/writings/cathedral-paper.html),
like Emacs: you can change your copy, but few people outside the
"inner circle" actually change the main source. `Course, better
perceived stability is why so many ISPs choose FreeBSD over Linux...

Do read that paper -- it's kinda long, but very good. A landmark,
really.

Bill Wendling

unread,
Jan 26, 1998, 3:00:00 AM1/26/98
to

Warren Young wasted electrons by posting:
} wend...@news.cso.uiuc.edu (Bill Wendling) wrote:

} >Well, the number one reason for the ports is that UNIX runs on many
} >different architectures and hardwares. However, there are two main
} >ports: BSD and SVR4. Porting between the two isn't too difficult (esp.
} >with the advent of Posix). However, porting between MS and anything
} >else is a mind-numbingly arduous task.

} ...And porting from UNIX (or VMS, or Amiga, or...) to Windows is easy?

Of course not.

} >You mean like NCSA Mosaic and NCSA Telnet, Emacs, GCC, Perl, VIM, Netscape,
} >Mathematica, etc. etc...All of these are definitely NOT real apps...

} Granted, his original statement was overkill, but don't kid yourself:
} all of these programs have conditional code out the wazoo. Even today
} in these enlightened POSIX days, it still takes a powerful tool like
} Autoconf to give the illusion of easy portability. NT programs, on
} the other hand, pretty much just recompile. I guess you could say
} the same thing about Solaris SPARC and x86, too, FWIW.

The code that is conditionalized is not much and it's certainly much less
than what would have to be #ifdef'ed if porting to/from MS.

} >Except for MS's memory management, threading, editing, compiling,
} >telnetting (or lack thereof), etc., I'd have to agree with you.

} Memory management: NT does just fine protecting processes from each
} other. (Don't kid yourself: no one is advocating Windows 95 in this
} thread -- that'd be like comparing apples to worms.)

Up until very recently (and maybe even now), memory management was a separate
program (EMM386, right?). It probably is still that way.

} Editing: the Visual C++ and Borland C++ editors do just fine, and if
} you don't like them, you can use that $1200 per-seat difference to put
} towards whatever editor you like. Or, get the Win32 port of emacs or
} vim -- they work just fine.

Putting EMACS or VIM on NT would need the X port to NT... The editing in VC++
leaves me cold.

} Compiling: PC compilers are better than most UNIX compilers, probably
} just because there are more people using them. Borland C++ and Visual
} C++, for example, both support the newest C++ features better than
} g++, and definitely better than any stock compilers I've seen. (Sure,
} spend big money on a third-party toolset and that might change, but PC
} compilers are still pretty darn good these days.)

However, they aren't portable to other OS's and aren't cross-compilers, and
aren't free...

} Telnetting: No one in their right mind sticks with the default telnet
} client. Come on, splurge on a $30 shareware one. Heck, there are now
} freeware ones that are good enough for most purposes.

I did mean that you can't telnet into the box, unless you put an X server on
it.

} >No, MS is very much aware of their positive feed back loop. All they have
} >to do is compete in a field (browsers, for instance) and they can
} >be almost assured of some success.

} Don't fool yourself: Microsoft has some good stuff these days. As far
} as browsers go, Netscape and Microsoft are about equal as far as
} perceived quality goes, and I have no doubts that Microsoft will
} eventually be the default browser on Windows. Personally, I'm a
} Netscape guy, but I still think MSIE has its points. For example, it
} starts up _much_ faster, is feature-competitive, and doesn't seem to
} crash any more than Navigator. I just don't like the
} newbie-handholding aspect it purveys.

MS's and Netscape's browsers are bloated and slow. If you get Opera, there's
a dramatic change. They also don't have some of the nice standard features
that Mosaic had, i.e. annotations, full-screen display for presentations, the
ability to link your browser to another browser, etc., etc.

} >People in graphics are vitally concerned with what types of video cards
} >they have. Some people need really fast access to data on their hard
} >drive. Some need memory up the wazoo. One of the disappointing things
} >about the PC industry is that there really isn't good hardware out there
} >that will optimize your desktop computer with your OS.

} Take a look at Intergraph and Compaq. They both build SGI killers,
} and I'm sure there are others. The reason that machines like these
} are not common in the PC world is that most PC users don't need
} machines like that -- why pay for what you don't need?

Take a look at the big graphics houses which are designing graphics for
movies, they are probably running Alphas or SGIs.

} >I care about that too, however, MS is -way- behind in the OS game. OSes are
} >now moving to 64-bit processors (those damned hardware requirements
} >again), while MS has just gone to 32-bit (barely) within the last 2 years.
} >They are also way behind in multi-processing. They can't handle the

} I can't speak for anyone else here, but I'm not saying that UNIX is
} worthless. I have two machines here at home, one of which is a Linux
} box. At work, we've built our head-end system on a UNIX machine, some
} of which I'm involved with. If you're just trying to argue that NT
} isn't the most capable system on the planet, then fine, I agree, and
} we can go find another thread to haunt. If instead you're trying to
} get us to believe that UNIX is the across-the-board best OS, you're
} gonna have to do a lot better than this.

There are a lot of reasons I hate MS's stuff...My argument was with the other
guy, you are unknown to me.

} >load that UNIX machines can (there are several articles on this, one in
} >the current Linux Journal magazine about the company who did effects for
} >_Titanic_ who couldn't use NT machines cause it couldn't do what a Linux
} >box could). NTs aren't as scalable as UNIXes. The list goes on.

} Yeah, the article essentially said "We couldn't use NT because it
} isn't UNIX." Big surprise. (The company had made a big prior
} investment in UNIX software and was unwilling to port or replace it.)

Not really. They had bought new machines (Alphas) and could have put NT, VAX,
DEC UNIX, or Linux on it...

--
|| Bill Wendling wend...@ncsa.uiuc.edu

Bill Wendling

unread,
Jan 26, 1998, 3:00:00 AM1/26/98
to

Joseph M. Newcomer wasted electrons by posting:
[long story about sysop disasters snipped]

I see now your hatred of UNIX stems from the fact that you had inept people
running the system...No wonder! Duh!

--
|| Bill Wendling wend...@ncsa.uiuc.edu

Heinz W. Wiggeshoff

unread,
Jan 26, 1998, 3:00:00 AM1/26/98
to

Joseph M. Newcomer (newc...@flounder.com) writes:
>
> OK, here's a question: what's the difference between a hacker, a
> computer scientist, and a software engineer?

The first is an insult, the second is scarce, and there's no such
profession as software engineering!

R!ch

unread,
Jan 26, 1998, 3:00:00 AM1/26/98
to Heinz W. Wiggeshoff

On 26 Jan 1998, Heinz W. Wiggeshoff wrote:

> Joseph M. Newcomer (newc...@flounder.com) writes:
> >
> > OK, here's a question: what's the difference between a hacker, a
> > computer scientist, and a software engineer?
>
> The first is an insult, the second is scarce, and there's no such

~~~~~~

You seem to be confusing "hacker" with "cracker"; to me, being called
a hacker is a sign of respect, not an insult!

> profession as software engineering!

From what I've seen - true!

--
R!ch (Email is flakey at present: use rich...@keaton.uk.sun.com)

If it ain't analogue, it ain't music.
#include <disclaimer.h>
Richard Teer richar...@uk.sun.com
WWW: www.rkdltd.demon.co.uk


Rob Hafernik

unread,
Jan 26, 1998, 3:00:00 AM1/26/98
to

Here's my vote for the worst code I've ever seen:

A couple of years ago, some of my folks were re-writing a terminal
emulator for a Company That Shall Not Be Named. They ran across a
function that, at first, made no sense at all.

It seemed that this function called the C "pow" function (to raise a
number to a power) inside a for loop to raise 2 to an integer power, then
cast the result back to an integer. Later in the for loop, it did it
again with a call to pow. Right away, we knew we were in trouble.

It turns out that the original programmer had evidently never heard of the
shift operators. The purpose of the function was to reverse the bits in a
word (which was required for some dumb reason anyway). To do this, the
bozo had written a for loop that used POW to get 2^^i and AND it with the
word in question. If the AND operation detected a bit, he set the
corresponding reverse bit by AGAIN calling POW to get an integer with the
right bit set and ORing it with his result.

In other words, to reverse the bits in a 16-bit word, this function called
POW between 16 and 32 times. All this on a PC that wasn't all that good
at floating point anyway.

I wouldn't have believed it if they hadn't showed me the code...
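For the record, the shift-operator version the original programmer apparently never found is only a handful of lines. A sketch (not the code from the story, just the standard technique):

```c
/* Reverse the bits of a 16-bit word using only shifts and masks.
 * No floating point, no pow() -- just 16 cheap integer operations. */
unsigned reverse16(unsigned w)
{
    unsigned r = 0;
    int i;

    for (i = 0; i < 16; i++) {
        r = (r << 1) | (w & 1);  /* shift result left, bring in low bit */
        w >>= 1;                 /* consume the next input bit */
    }
    return r & 0xFFFF;
}
```

So reverse16(0x0001) gives 0x8000, with no library calls at all.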

Rob Hafernik

Jan 26, 1998, 3:00:00 AM
to

In article <34e95db9....@206.210.64.12>, newc...@flounder.com
(Joseph M. Newcomer) wrote:

<snip>


> OK, here's a question: what's the difference between a hacker, a

> computer scientist, and a software engineer? [stay tuned for the
> answer...I'll give a few days for other inputs]

Here's my answers:

hacker: cares if code is cool enough.

computer scientist: cares if code is designed and written
according to latest and best principles.

software engineer: cares if code WORKS.

Of course, I may be showing a little bias here <g>. I'm an engineer FIRST
and a programmer SECOND...

Seriously, though, I've managed all three types at various times and they
all have their strong points. If you've got a tricky little problem that
requires a counter-intuitive approach, the hacker is the one to give it
to. If you've got a complicated program that will require multiple
processes with multiple threads and semaphore access to shared memory or a
common database and you may have to "prove" some of your algorithms, then
give it to the CS type (but stand back and give them some time). If
you've got a big, straightforward application that has to be maintained
for a while, then give it to a software engineer.

Most projects, though, have aspects of all three and you need a team that
has aspects of all three.

George R. Gonzalez

Jan 26, 1998, 3:00:00 AM
to

Rob Hafernik wrote in message ...


>Here's my vote for the worst code I've ever seen:

>In other words, to reverse the bits in a 16-bit word, this function


called
>POW between 16 and 32 times. All this on a PC that wasn't all that good
>at floating point anyway.
>
>I wouldn't have believed it if they hadn't showed me the code...

I can do better than that...

Years ago one programmer was given the task of writing a routine to read
a random line of text from a file.

Now the way any rational person would do this is to build a little index
array giving the starting offset of each line, do a seek(), a readln(),
and you're done..
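That index-then-seek approach is tiny. A sketch in C (the original was Mac Pascal; `build_line_index` and `get_line` are made-up names for illustration):

```c
#include <stdio.h>
#include <stdlib.h>

/* Scan the file once, recording the byte offset where each line starts.
 * Returns a malloc'd array of offsets; *nlines gets the line count. */
long *build_line_index(FILE *f, size_t *nlines)
{
    long *idx = NULL;
    size_t n = 0, cap = 0;
    long pos = 0;
    int c, at_line_start = 1;

    rewind(f);
    while ((c = getc(f)) != EOF) {
        if (at_line_start) {
            if (n == cap) {              /* grow the index as needed */
                cap = cap ? cap * 2 : 64;
                idx = realloc(idx, cap * sizeof *idx);
            }
            idx[n++] = pos;
            at_line_start = 0;
        }
        pos++;
        if (c == '\n')
            at_line_start = 1;
    }
    *nlines = n;
    return idx;
}

/* Fetch line i (0-based): one seek, one read -- constant work per call,
 * instead of re-reading the whole file from the top every time. */
char *get_line(FILE *f, const long *idx, size_t i, char *buf, size_t len)
{
    if (fseek(f, idx[i], SEEK_SET) != 0)
        return NULL;
    return fgets(buf, (int)len, f);
}
```

One pass to build the index, then each lookup is a single fseek() and fgets() -- which is why the in-house version took an hour, not several weeks.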

It took them several weeks to write this routine. In the meantime we
wrote our own one in about an hour.

But eventually they delivered the code, and first glance it looked
impressive, with lots of comments and lots of code. Must be a lot of
optimized in-line code and error checking in there we thought....


We didn't look closely at the code, but just accepted it and wrote a test
program:

for i := 1 to 1000 do writeln( GetLine(f,i) );

(This was in Mac Pascal on a 8MHz 68000).

Lines 1,2,3,4,5,6 showed up fine. Line 7 took a sec to show up. Line 8
took 15 secs to show up. By line 10 we gave up waiting....

Guess what algorithm they used?

To show line X:
  for L := 1 to X do
    rewind the file.
    for L2 := 1 to L
      make a head node
      repeat
        read a char.
        insert char into linked list of chars
      until ch = CR;
    end ( for L2 )
  end (for L);
  convert linked list of chars to string
  return value := string;

An O(n²)-per-call algorithm, mixed in with a lot of node allocation
and linking, and a bad memory leak. I couldn't think of a worse
algorithm if I tried....

But it *did* work as per the spec..... Makes you wonder how much code
out there was written to the spec and not to any rational
considerations....

Ever since then I try to at least *mention* that it would be nice if the
algorithm would deliver its results in xxx.xxx seconds at worst, and not
chew up more than yyy bytes.


Regards,


George

T.W. Seddon

Jan 26, 1998, 3:00:00 AM
to

Bill Wendling (wend...@news.cso.uiuc.edu) wrote:

> Up until very recently (and maybe even yet), mem management was a separate
> program (EMM386, right). It probably is still that way.

Not quite.

Windows manages its own memory. You don't need EMM386 to run Windows, and
generally you're best off doing without it, if you can, because
then Windows can have each DOS window individually configured for EMS. This
may not be true for Windows '95, but was for 3.11.

EMM386 is to provide UMB support for DOS, so that TSRs can be loaded out of
the 640K conventional part, and to provide EMS support for DOS applications.
It also has a VCPI interface built in, I seem to remember, for those programs
that use VCPI. So I suppose EMM386 counts as memory management for DOS, but
you have to stretch the definition of "management" somewhat...

--
--Tom

David Thompson

Jan 26, 1998, 3:00:00 AM
to

sp...@lisardrock.demon.co.uk wrote in message
<885708495.19868.1...@news.demon.co.uk>...

>look. NT wants 32mb to stand up straight. but it really doesn't offer
>very much more functionality than something like AmigaDOS, which took up
>512k to stand up, kick a ball around, and score a few goals.
>

This is just flat out wrong. You may not use all (or apparently even know
about many) of the services provided by all the DLLs and services loaded
into the first 32 megs of NT, but to say it does not provide much more
functionality is the stupidest thing I have read this year.


Simon Slavin

Jan 26, 1998, 3:00:00 AM
to

In article <MPG.f328e868...@news.concentric.net>,
j...@pobox.com (Jonathan Feinberg) wrote:

> shok...@well.com said...
> > Oh, and it complies and links fast as hell, always an important point.
> ^^^^^^^^
> No matter how fast you throw new standards at the Metroworks Complier, it'll
> keep up, with new Self-Modifying Compliance Modules.

Great. Does that mean I can write 'psychic newsreader' with it ?
You know, the one which knows what threads you'll want to read and
which new groups you'll be interested in ?

Simon.
--
Simon Slavin -- Computer Contractor. | If I ran usenet, the timestamp for
http://www.hearsay.demon.co.uk | anything posted between 2am and 5am
Check email address for UBE-guard. | would *blink*. -- Nancy Lebovitz
My s/ware deletes unread >3 UBEs/day.| Junk email not welcome at this site.

Neil.Frankli...@ccw.ch

Jan 26, 1998, 3:00:00 AM
to

Neil.Frankli...@ccw.ch writes:
> The only OS made on such principles at the moment is GNU/Linux, which
Marco S Hyman <ma...@snafu.org> asked:
> I'm curious, why do you believe FreeBSD/NetBSD/OpenBSD are closed?
tan...@SPAMCATCHER.cyberport.com (Warren Young)

>I don't know how they differ, but I believe they're all "cathedral"
>systems (see http://www.ccil.org/~esr/writings/cathedral-paper.html),
>like Emacs: you can change your copy, but few people outside the
>"inner circle" actually change the main source. `Course, better
>perceived stability is why so many ISPs choose FreeBSD over Linux...
>
>Do read that paper -- it's kinda long, but very good. A landmark,
>really.

The famous ESR cathedral vs bazaar paper. I have actually already read
it (printed in the Linux Journal or Linux Magazine I borrowed). Thanks
for the URL to it.

Actually thinking about it, it may be that that paper (and weak
memory) fooled me into still thinking that BSD is a closed system.

Actually I should get a BSD, just to compare it with Linux.

John Ahlstrom

Jan 26, 1998, 3:00:00 AM
to bme...@bruce.cs.monash.edu.au

bme...@bruce.cs.monash.edu.au wrote:

--snip snip



>
> >software engineer: cares if code WORKS.
>

> Bernie

I disagree. Except in the case of life-critical software
(in which case the most important thing is It must not Kill anyone)
the most important characteristic of code is that it be
MODIFIABLE.

Modifiability is more important because we know that
it doesn't work. It must be modified and modified and
modified to approach an acceptable state of workingness
and then must be modified and modified and modified again
to get new features and functions and environments into
an acceptable state of workingness. I think a software
engineer knows this. Or I would like not to characterize
anyone who does not know this as a software engineer.

(There may be other situations, ROMmed code in an appliance?,
that can be more easily rewritten completely than modified.)

John Ahlstrom
jahl...@cisco.com

Reporting a sufficiently advanced form of technology
is indistinguishable from mental illness.

Apologies to Arthur C Clarke

bme...@bruce.cs.monash.edu.au

Jan 27, 1998, 3:00:00 AM
to

shok...@well.com (Rob Hafernik) writes:

>hacker: cares if code is cool enough.

hacker: Programs intuitively; Churns out incredible code
at incredible rates, and most of it almost works
most of the time.

>computer scientist: cares if code is designed and written
> according to latest and best principles.

computer scientist: Knows that you shouldn't start coding before you have
the algorithm and data structures worked out, and
sometimes is able to follow that rule. In fact,
sometimes is not able to code at all (or at least not
in any fashion that should be published), but that's
what programmers/postgrad students are for ;-)

>software engineer: cares if code WORKS.

software engineer: Knows that you have to do a design first. Even knows
that you should do rapid prototyping to find out what
the customer really wants. Also knows how to write
maintainable code.
Due to being employed by companies in which pointy haired
bastards decide over project schedules, all this
knowledge usually gets sacrificed to the gods of "make
it work well enough to meet the deadline". Whether it
really _works_ doesn't often matter.

Can you tell I am not too impressed by the current state of programming?
Just about the only times you get really good software is if someone with
the knowledge of all three groups works on his/her own time, without any PHBs
messing things up. TeX is a good example for this --- and Word is a good
example for the opposite.

Bernie



--
============================================================================
"It's a magical world, Hobbes ol' buddy...
...let's go exploring"
Calvin's final words, on December 31st, 1995

Mike Williams

Jan 27, 1998, 3:00:00 AM
to

On Tue, 20 Jan 1998 09:58:28 -0600, Kevin Gilhooly <k...@dfw.net> wrote:
>Bill Wendling <wend...@news.cso.uiuc.edu> wrote:
>
>> Hi all,
>>
>> Does anyone know why people design and create horrible-to-maintain, buggy
>> code? I have seen and heard about some real horror stories. Is it just
>> ignorance/incompetence on the coder's part? Lack of time, maybe?
>>

The biggest problems I have seen are

- Programs Designed By Committee
- Programs Designed With Poor Specifications
- Programs Written Under Unrealistically Short Deadlines

--
-Mike Williams
http://www.mnsinc.com/daolath/index.html
I can also be reached at work: mike.w...@swift.com

David Wragg

Jan 27, 1998, 3:00:00 AM
to

dao...@news1.mnsinc.com (Mike Williams) writes:
> The biggest problems I have seen are
>
> - Programs Designed By Committee
> - Programs Designed With Poor Specifications
> - Programs Written Under Unrealistically Short Deadlines
>

Of course, the worst problems of all result from programs designed by
committee who produce a poor (though remarkably long-winded)
specification, which has to be implemented under unrealistically short
deadlines.

And when it becomes clear that it will be very late, buggy and
incomplete the managers make a decision: "Let's hire a load of cheap (=
inexperienced) programmers in a desperate attempt to get it finished".

Why horrible code? As the morale of the programmers plummets, it
becomes more a case of "Why not?"

Dave Wragg

gla...@glass2.cv.lexington.ibm.com

Jan 27, 1998, 3:00:00 AM
to

In <6aft1r$oo2$1...@uhura1.phoenix.net>, Mike Swaim <sw...@shell.c-com.net> writes:
>Peter van Hooft <ho...@natlab.research.philips.com> wrote:
>: In <34ea6229....@206.210.64.12> newc...@flounder.com (Joseph M. Newcomer) writes:
>
>: >Unix self-destructed 30 workstations in one night (after having failed
>: >to back up 80 of them for something like a year!).
>: This convinced me not to read any further. I don't know of _any_
>: serious environment where this situation would have been allowed to
>: continue after a _week_, let alone a year.
>
> I know of 2, and I haven't been around that much. At the first, one of
>the programmers took to deleting a file that he didn't need once a month
>and requesting that it be restored "just to be sure." At the second, I
>found out by accident, and had one of the netware sysadmins give me rights
>to the accounting data so I could back it up via an NT machine I
>controlled. I also told the head of IS, who didn't know about it, either.
>
>--
>Mike Swaim, Avatar of Chaos: Disclaimer:I sometimes lie.
>Home:sw...@phoenix.net or sw...@c-com.net, I'm just not sure.
>Work:mps...@gdseng.com Silly:whats...@gdseng.com
>Alum: sw...@rice.edu Quote: "Boingie"^4 Y,W&D

At a place I once worked, the semi-automated mainframe backup system
would backup user files to tape at night. However, it required an
operator to mount the tapes from the backup pool onto the tape
drives. It seems that, due to increased numbers and sizes of user
files, the backup tape pool ran short of tapes. So, the
operations person merely mounted a tape from the scratch tape pool,
allowed the backup job to finish, and then returned the tape to the
scratch pool! And, to make matters worse, this had been going on
for over six months before I accidentally deleted a file that I
needed and tried to recover it from the backup system. It caused a
bit of a panic when word got out that about half of the user files
on the system weren't backed up (Well, ok, so they had been backed
up, it's just that the backup tapes had been reused as scratch
tapes!).

That's probably even worse than the time that I had a tape assigned
to me, dumped some stuff on it, and then came back a week later to
dump some more stuff to it. After the tape mount was pending for a
couple of hours, operations finally called me up and asked 'Why are
you trying to mount a non-existent tape?'. They were a little
surprised when I told them that I had just used that tape last week,
and that it had been assigned to me by the automated tape system.
They claimed "Well, we've never had a tape with that volume serial
number here.". Ok, scratch that data that I had archived to tape.
Write only storage anyone? :*)

Dave

P.S. Standard disclaimer: I work for them, but I don't speak for them.


Benz

Jan 27, 1998, 3:00:00 AM
to

Bad code also results from the business aspects of 'software business' -
remember there are two words to that phrase. In business, he who gets to
market first usually wins bigger than the also-rans. Every business
package I've ever seen has been awful code - but it works and it makes
money. You can spend years of effort and tons of money writing perfect
code, but if someone else gets to market first and nails down the market
share, all that perfection still leaves you bankrupt. I've also seen
plenty of packages that were written to good, academic standards, where the
coders did everything by the book, where deadlines were given lower
priority than perfection, where full Software Engineering methodologies
were religiously applied - and they're now out of business, because they
concentrated more on form than on taking care of business. Bad code also
results from many cycles of enhancement and customization after the first
release - many programmers hacking changes into a package will degrade
quality more than a single sloppy coder during the initial development.


David Wragg <d...@outoften.doc.ic.ac.uk> wrote in article
<y7r1zxu...@outoften.doc.ic.ac.uk>...

Ben Harris

Jan 27, 1998, 3:00:00 AM
to

In article <34d71ac8....@news.cyberport.com>,
Warren Young <tan...@SPAMCATCHER.cyberport.com> wrote:

>Neil.Frankli...@ccw.ch wrote:
>>Marco S Hyman <ma...@snafu.org> asked:
>>> I'm curious, why do you believe FreeBSD/NetBSD/OpenBSD are closed?
>>
>>* could you give a concise description of what differs the 3 groups
>>and their versions of the system.

I think the traditional claim is that FreeBSD is tall and thin, NetBSD is
low and wide and OpenBSD is somewhere in-between.

>I don't know how they differ, but I believe they're all "cathedral"
>systems (see http://www.ccil.org/~esr/writings/cathedral-paper.html),
>like Emacs: you can change your copy, but few people outside the
>"inner circle" actually change the main source. `Course, better
>perceived stability is why so many ISPs choose FreeBSD over Linux...

*BSD aren't pure 'cathedral' systems. For instance, the current
development (read: probably functional) sources for NetBSD are made
available daily through the NetBSD-current tree.

Does this mean the Linux kernel source tree is effectively world-writeable,
or does it just have a very large 'inner circle'?

--
Ben Harris
Computer Officer, Corpus Christi College, Cambridge.

Rob Hafernik

Jan 27, 1998, 3:00:00 AM
to

In article <34CD31DF...@cisco.com>, John Ahlstrom
<jahl...@cisco.com> wrote:

<snip>


> Except in the case of life-critical software
> (in which case the most important thing is It must not Kill anyone)
> the most important characteristic of code is that it be
> MODIFIABLE.
>
> Modifiability is more important because we know that
> it doesn't work. It must be modified and modified and
> modified to approach an acceptable state of workingness
> and then must be modified and modified and modified again
> to get new features and functions and environments into
> an acceptable state of workingness. I think a software
> engineer knows this. Or I would like not to characterize
> anyone who does not know this as a software engineer.
>
> (There may be other situations, ROMmed code in an appliance?,
> that can be more easily rewritten completely than modified.)

I agree to a point, but lots and lots of programs are one-shot deals.
Maybe my perception is skewed, working as a contractor, but I see a LOT of
code that goes out the door and is never modified, as long as it works
fairly well at the time of release. Computer games are a good example.
You don't see version 2.0 of a video game. You may see a sequel, but
those are often re-writes.

In the arena of shrink-wrapped consumer software, I've been involved in
many "version 2.0" projects that started out by throwing away the version
1.0 code. This can happen for lots of reasons: to take advantage of a new
language or new class library, to update the software to comply with
changes to the operating system, to architect the software better in light
of the experience gained from version 1, etc.

Also, speaking as a contract software developer, the most important
consideration is that the code WORKS WHEN DELIVERED. If it doesn't, you
DON'T GET PAID!! Even if you ask, clients rarely rate maintainability as
highly as on-delivery fitness. This may be (is) shortsighted, but it's
the way the world works.

Rick Hawkins

Jan 27, 1998, 3:00:00 AM
to

In article <shokwave-250...@as2-dialup-28.wc-aus.io.com>,
Rob Hafernik <shok...@well.com> wrote:

>Someone earlier didn't like the idea of syntax coloring. Fine, turn it off.

Don't like it? It's the only reason to have a color display . . .

:)

[ok, the kids like color for their games, too. But last fall marks the
first time out of my dozen or so machines starting in the late 70's that
I've had color. And if this display hadn't originally been the kids, it
would be 19" or 20" mono rather than 17" color . . .]

--
R E HAWKINS
rhaw...@iastate.edu

These opinions will not be those of ISU until they pay my retainer.

Robert Billing

Jan 27, 1998, 3:00:00 AM
to

In article <shokwave-270...@as5-dialup-08.wc-aus.io.com>
shok...@well.com "Rob Hafernik" writes:

> In the arena of shrink-wrapped consumer software, I've been involved in
> many "version 2.0" projects that started out by throwing away the version

OTOH there is something which I wrote years ago, which is now on
version 50-something. In general what happens is that it becomes
broadly stable, then is tweaked to add some features, then goes through
half a dozen releases in a fortnight, and then becomes stable again.
Version 46 for example was in series production for nearly two years.

--
I am Robert Billing, Christian, inventor, traveller, cook and animal
lover, I live near 0:46W 51:22N. http://www.tnglwood.demon.co.uk/
"Bother," said Pooh, "Eeyore, ready two photon torpedoes and lock
phasers on the Heffalump, Piglet, meet me in transporter room three"

Charlie Gibbs

Jan 27, 1998, 3:00:00 AM
to

In article <y7r1zxu...@outoften.doc.ic.ac.uk>
d...@outoften.doc.ic.ac.uk (David Wragg) writes:

>dao...@news1.mnsinc.com (Mike Williams) writes:
>
>> The biggest problems I have seen are
>>
>> - Programs Designed By Committee
>> - Programs Designed With Poor Specifications
>> - Programs Written Under Unrealisticly Short Deadlines
>
>Of course, the worst problems of all result from programs designed
>by committee who produce a poor (though remerkably long winded)
>specification, which has to be implemented under unrealistically
>short deadlines.

Don't forget specifications that are about as stable as a house
of cards. (I suppose this could be classified under "poor
specifications", but I like to distinguish volatile specs from
ones that are just poorly defined.)

"Frozen specs and the Abominable Snowman have a lot in common:
both are myths, and both melt when enough heat is applied."
I forget who said this, but it sure sounds like Fred Brooks.

My worst case was one when I pretty much wrote the programs
first, then got the specs. Well, I went to the department
and got enough of an idea what they wanted so I could make an
educated guess, but getting solid information was like pulling
teeth. So I went ahead and wrote something, then went through
several iterations of the "That's not what I wanted" cycle.
Amazingly, the code still had some sort of integrity by the
time I was done.

But once you have a working application, it'll have to be
changed. In an in-house environment, new needs will arise,
while in the marketplace you'll need new bells and whistles
to match your competitors'. Keeping a program from turning
into a pile of loose ends requires more than a quick-and-dirty
approach.

>And when it becomes clear that it will be very late, buggy and
>incomplete the managers make a decision: "Lets hire a load of

>cheap (=inexperienced) programmers in a desperate attempt to
>get it finished".

"Adding programmers to a late project makes it later."
Fred Brooks definitely did write that one, and he
backs it up with mathematical proof.

>Why horrible code? As the morale of the programmers plummets,
>it becomes more a case of "Why not?"

But pointy-haired managers will forever fail to realize how
things got that way in the first place.

--
cgi...@sky.bus.com (Charlie Gibbs)
Remove the first period after the "at" sign to reply.


Brett Leach

Jan 27, 1998, 3:00:00 AM
to

In article <tsw-210198...@cypher.cagent.com>, t...@cagent.com
says...
> <<<deletia with an example of horrible code, with commentary>>>
>
> Are there any other examples of horrible code (in whatever language).
> Maybe we could all laugh at them. Please post!!
>
> --
> t...@cagent.com (Home: t...@johana.com)
> Please forward spam to: anna...@hr.house.gov (my Congressman), I do.
>
6502 Assembler. I can't remember the pseudo-ops, so I'm making them up
as I go along. I'm also a little fuzzy on the byte order on the
stack.

This was a little routine I created early in my programming life to
avoid creating string tables; it also got me around string length
limits too. I did similar things later to deal with floating
point arithmetic.

Looks O.K. in the original source but if you disassemble it "yech!"

But it is ROMable. In fact that's just where it went, replacing the
cassette routines on my Apple ][.

: :
JSR PRINT
DS "Any string"
DB $00
: :

PRINT: POP
STA SCRATCH
LDA #$00
STA SCRATCH+1
POP
TAY
P_LOOP: LDA (SCRATCH),Y
CMP #00
BEQ P_DONE
JSR PRINTCHAR
INY
BNE P_LOOP
INC SCRATCH
BNE P_LOOP
BRK
P_DONE: TYA
PHA
LDA SCRATCH
PHA
RTS


____
/ \ Remove ".pepperpot" to reply
\____| via E-Mail to this post.
/ |
( | |
\____ \__/ Leach. My brain hurts.

Alexandre Pechtchanski

Jan 28, 1998, 3:00:00 AM
to

On Mon, 26 Jan 1998 02:46:42 GMT, newc...@flounder.com (Joseph M. Newcomer)
wrote:

>>Peter van Hooft <ho...@natlab.research.philips.com> wrote:


>>: In <34ea6229....@206.210.64.12> newc...@flounder.com (Joseph M. Newcomer) writes:
>>
>>: >Unix self-destructed 30 workstations in one night (after having failed
>>: >to back up 80 of them for something like a year!).
>>: This convinced me not to read any further. I don't know of _any_
>>: serious environment where this situation would have been allowed to
>>: continue after a _week_, let alone a year.
>
>

>The environment in which this was allowed to persist for over a year
>was the Software Engineering Institute of Carnegie Mellon University.
>Here's the story:
[ true horror story snipped ]

So you had a bad fortune to work in the place with worse-than-none sysadmin.
How does it make Unix bad?

[ When replying, remove *'s from address ]
Alexandre Pechtchanski, Systems Manager, RUH, NY

Warren Young

Jan 28, 1998, 3:00:00 AM
to

wend...@news.cso.uiuc.edu (Bill Wendling) wrote:

>} all of these programs have conditional code out the wazoo. Even today
>} in these enlightened POSIX days, it still takes a powerful tool like
>} Autoconf to give the illusion of easy portability. NT programs, on
>} the other hand, pretty much just recomipiled. I guess you could say
>} the same thing about Solaris SPARC and x86, too, FWIW.
>
>The code that is conditionalized is not much and it's certainly much less
>than what would have to be #ifdef'ed if porting to/from MS.

That's a fine and dandy statement, except that code isn't _supposed_
to be ported from UNIX to NT or vice versa. Code _is_ supposed to be
portable between UNIXes.

>} Memory management: NT does just fine protecting processes from each
>} other. (Don't kid yourself: no one is advocating Windows 95 in this
>} thread -- that'd be like comparing apples to worms.)
>
>Up until very recently (and maybe even yet), mem management was a separate
>program (EMM386, right). It probably is still that way.

Hellloooo, welcome to the 90's! NT's had good memory protection since
its inception in 1993. Granted, much later than UNIX, but we're not
arguing about who was first -- just who does what better _today_.

>Putting EMACS or VIM on NT would need the X port to NT...The editing in VC++
>leaves me cold.

Guess again. There's a specific port of Emacs to NT that runs in a
normal window. Looks just like Xemacs.

`Course, if you really wanted to, I suppose you could port it your
way. Cygnus has ported all the major GNU stuff to Win32, including a
UNIX-like translation layer that seems to do an all right job of
things. Among the things ported is X11R6.3. Also available
is a free X server called MicroImages/X that appears to do a good job,
at least when coupled with an external rsh program. (It can't start X
clients on its own, but once you call the program with the appropriate
-display flag, it works just fine.)

>However, they aren't portable to other OS's and aren't cross-compilers, and
>aren't free...

<pedantic>Visual C++ comes in cross-compilation versions for all
flavors of NT, Win95, WinCE and Macintosh.</pedantic> And no, they
aren't free, but then, you get what you pay for. If you want a
_really good_ compiler under UNIX, you pay for that, too, and for a
pretty penny, too. Also, as mentioned above, you can have g++ for
Win32, if you like. I can't remember if it will create GUI
executables yet, though.

>} Telnetting: No one in their right mind sticks with the default telnet
>} client. Come on, splurge on a $30 shareware one. Heck, there are now
>} freeware ones that are good enough for most purposes.
>
>I did mean that you can't telnet into the box, unless you put an X server on
>it.

Definitely a problem sometimes, though there are telnet servers
available. A bigger concern is that there are not all that many
command-line utilities for NT.

>MS's and Netscape's browsers are bloated and slow. If you get Opera, there's
>a dramatic change. They also don't have some of the nice standard features
>that Mosaic had...Ie. annotations, full-screen display for presentations, the

Navigator does this. (I can't remember if the "kiosk" stuff is a
version or a mode in the regular version.)

>} >_Titanic_ who couldn't use NT machines cause it couldn't do what a Linux
>} >box could). NTs aren't as scalable as UNIXes. The list goes on.
>
>} Yeah, the article essentially said "We couldn't use NT because it
>} isn't UNIX." Big surprise. (The company had made a big prior
>} investment in UNIX software and was unwilling to port or replace it.)
>
>Not really. They had bought new machines (Alphas) and could have put NT, VAX,
>DEC UNIX, or Linux on it...

Of course they could, but their criterion for dismissing NT was
because it wouldn't run their existing software, which they were
unwilling to port or replace. That's not really an NT problem, any
more than UNIX is "unsuitable" because it won't run Word 97.
