Implications of recent virus (Trojan Horse) attack


Sean McLinden

Nov 5, 1988, 11:39:44 AM
Now that the crime of the century has been solved and all of the
bows have been taken it is, perhaps, time to reflect a little more
on the implications of what has happened.

First of all, to the nature of the problem. It has been suggested that
this was little more than a prank let loose without sufficient restraint.
I have not seen the latest in press releases but there seems to be
a hint of "I didn't want anything like this to happen!" Perhaps not.
In fact, if the thing had not run wild and had not bogged down a number
of systems it might have gone undetected for a long time and might
have done much worse damage than our estimates suggest was done. I can
accept that the author did not anticipate the virulence of his creation,
but not that his regret springs from some benevolent concern for the
users of other systems; rather, it springs from the fact that the
virulence allowed him to be caught.

In fact, with function names such as "des", "checkother", and
"cracksome", I am less likely to believe that the intent of this
program was one of simple impishness.

Let's look, for a moment, at the effects of this system (whether
intended or otherwise). First, it satisfied a public desire for news
and, one might argue, served as a reassurance to the many technophobes
out there that our systems are as vulnerable and error-prone as they,
all along, have been arguing. If you don't think that this might have
social consequences you need only look at how community bans
on genetic research have resulted from social policy implemented as
a result of public distrust. When I was interviewed by a local news
agency the questions asked were on the order of "Does this mean that
someone could fix a Presidential Election?" (sure, Daley did it in
Chicago but he didn't use computers!), and "What implications does
this have for the nation's defense?" (In spite of reassurances from
here and CMU, the local media still insisted on the headline "Defense
Computers invaded by virus.")

Second, there is an economic consequence. Since we were unable to
determine the extent of the program's activities we were forced to
commit programmers' time to installing kernel fixes, rebuilding systems,
checking user data files, and checking for other damage. That was
the direct cost. The indirect cost comes from the delay in other
tasks that was incurred by the diversion of people's time to solving
this one. If you multiply by the effort that is going on at a number
of other sites I suspect that in salary time, alone, you are looking
at costs into the hundreds of thousands of dollars.

Perhaps most importantly, there is the academic cost. I would argue
that the popularity of Unix, today, is due in great part to the
development of the Berkeley Software Distribution which was made available
in source form to thousands of research and academic organizations starting
in the '70s. In a sense, it is a community designed system and although
Berkeley deserves the lion's share of the credit, it was the contribution
of hundreds of users with access to source codes that allowed the system
to evolve in the way that it did.

There is a cost to providing an academic environment and there are
responsibilities that are imposed by it. One advantage of academia is
access to information which would not be tolerated in an industrial
domain. This access requires our users to observe some code of behavior
in order to guarantee that everyone will have the same access to the
same information. The person who rips out the pages of an article from
a library journal is abusing this privilege of free access to information
and depriving others of the same. By convention, we agree not to do
that, and so we protect the system that has benefited us so that others
may derive the same benefit.

A great part of the Internet was funded by DARPA because some forward
thinking individuals recognized the tremendous technological and academic
benefits that would be derived from this open network. This has resulted,
I believe, in significant economic benefits to American industry and
continues to support our leadership role in software development. It is
an infrastructure that supports a gigantic technological community,
and there are very few, if any, computer interests in this country that
were not influenced by DARPA's experiment.

Within a week or two, members of the organizations responsible for this
network are going to be meeting to discuss the implications of the recent
virus(es), and mechanisms with which they can be dealt. One possible outcome
would be increased restrictions on access to the network (the Defense
Research Network is already moving along these lines). It would not
be unreasonable to consider whether a venture such as this should be
supported, at all. To restrict access to a network such as this, or
to remove the network, altogether, would be the economic equivalent
to tearing up the Interstate highway system. The effect on academic
and technological advancement would be quite serious.

The bottom line is that to suggest that a program such as the
"virus" (which is really more of a Trojan Horse) was little more
than a harmless prank is to overlook the long-term effects that
both the technology, and the PUBLICATION of that technology, will
have on continued academic freedom and technological growth.

But what of the nature of the act? Is there something to be said of
that? First, there is the personal tragedy, here. There is public
humiliation for the (supposed) perpetrator's father who is, himself,
a computer security expert (his employers must be questioning whether
the son had access to specialized information though most of us realize
that the holes that were exploited were well known). There is the
jeopardy to the programmer's academic career. But there is more
than that.

There seems to be a real lack of consideration for the ethical
implications of this action. Consider, for a moment, that you are
walking down the street and the person in front of you drops a 10 dollar
bill. You have three options: 1) You can pick it up and hand it to them;
2) You can pick it up and keep it; 3) You can leave it and continue walking.
It should be obvious that these choices are not morally equivalent. To
have known about the holes in the system which allowed the virus in
(and even to have known how to exploit these), is NOT the same as actually
doing it (any more than leaving the bill on the sidewalk is the same
as pocketing it). Somewhere along the line, we fail ourselves and our
students if we don't impress upon them the need to regard the network
as a society with rights, responsibilities, and a code of professional
ethics which must be observed in order to preserve that society. There
are probably a few hundred people who could have written the code to
do what this virus did; most of those people didn't do it. Most, if
not all, of us have had the opportunity to pocket a candy bar from
the local convenience store, but most of us don't. We don't, not
because we will be punished or because there are laws against it,
but because we have a social consciousness which tells us that
such an action would, in the end, substantially degrade the
society in which we live.

What happened in this situation reflects not only a moderately
high level of programming sophistication but also a disturbingly
low level of ethical maturity.

If we tolerate those who view the network as a playground where
anything goes, we are going to be faced with serious consequences. But
the answer is not to change the character of the network (by increasing
restrictions and decreasing freedom of access), but to promote a sense
of character among the members of the community who work and experiment
in this network. This puts the burden on us to encourage, teach, and
provide examples of the kinds of behavior needed to preserve the
network.

Sean McLinden
Decision Systems Laboratory
University of Pittsburgh

Peter da Silva

Nov 6, 1988, 1:00:10 AM
One side effect that I don't like is that UNIX is taking the blame for
a combination of (1) a security hole in an application (sendmail), and
(2) deliberate loosening of security to trusted sites (rhosts, etc...).

Non-academic UNIX in general is a lot less open to techniques like this.
--
Peter da Silva `-_-' Ferranti International Controls Corporation
"Have you hugged U your wolf today?" uunet.uu.net!ficc!peter
Disclaimer: My typos are my own damn business. pe...@ficc.uu.net

Steve Elias

Nov 6, 1988, 11:51:15 AM
"Wormer" Morris has quite a career ahead of him, i'll bet.
he has done us all a favor by benevolently bashing bsd 'security'.

the smtp/sendmail security hole that he exploited was big enough to
drive the Whirlwind computer through -- never mind a few
thousand Suns & bsd vaxes.

the hole was so obvious that i surmise that Morris
was not the only one to discover it. perhaps other less
reproductively minded arpanetters have been having a field
'day' ever since this bsd release happened.

some of the more security minded folk out there might have
archived ps records which could indicate the presence of
spurious shells spawned from smtp. depending on how long
Mr. Morris used the security hole, he may be very well qualified
to tell all whether he saw signs of other creative use of the
sendmail security gift.

in at least one sense, Morris has done a service for the internet.
nobody will be able to continue to "benefit" from the bsd/sysV
sendmail -- which was the true trojan horse.

--


harvard!spdcc!eli

Paul Vixie

Nov 6, 1988, 2:36:10 PM
# the hole [in sendmail] was so obvious that i surmise that Morris
# was not the only one to discover it. perhaps other less
# reproductively minded arpanetters have been having a field
# 'day' ever since this bsd release happened.

I've known about it for a long time. I thought it was common knowledge
and that the Internet was just a darned polite place. (I think it _was_
common knowledge among the people who like to diddle the sendmail source.)

The bug in fingerd was a big surprise, though. Overwriting a stack frame
on a remote machine with executable code is One Very Neat Trick.
--
Paul Vixie
Work: vi...@decwrl.dec.com decwrl!vixie +1 415 853 6600
Play: pa...@vixie.sf.ca.us vixie!paul +1 415 864 7013

Donald Chen - Microbiology

Nov 6, 1988, 9:02:16 PM
In article <16...@cadre.dsl.PITTSBURGH.EDU> se...@cadre.dsl.PITTSBURGH.EDU (Sean McLinden) writes:
>Now that the crime of the century has been solved and all of the
>bows have been taken it is, perhaps, time to reflect a little more
>on the implications of what has happened.
>
text deleted

>
>Let's look, for a moment, at the effects of this system (whether
>intended or otherwise). First, it satisfied a public desire for news
>and, one might argue, served as a reassurance to the many technophobes
>out there that our systems are as vulnerable and error-prone as they,
>all along, have been arguing. If you don't think that this might have
>social consequences you need only look at how community bans
>on genetic research have resulted from social policy implemented as
>a result of public distrust. When I was interviewed by a local news

Are you suggesting that the "public" does not have an interest and
responsibility to ask for suitable safeguards from what "they"
consider to be either dangerous or incompletely thought out?
Although people like Jeremy Rifkin have been nuisances to the
practical application of bio-engineered tools, they have also
caused investigators to more completely think out their studies,
AND have forced scientists to explain and defend their approaches
and tools to the people who ultimately fund their research.

>Second, there is an economic consequence. Since we were unable to
>determine the extent of the program's activities we were forced to
>commit programmers' time to installing kernel fixes, rebuilding systems,
>checking user data files, and checking for other damage. That was
>the direct cost. The indirect cost comes from the delay in other

Perhaps I am foolish, but I feel some of the responsibility goes to
whoever left the debug option in sendmail, and to those who allow
promiscuous permissions in their systems.

>
>If we tolerate those who view the network as a playground where
>anything goes, we are going to be faced with serious consequences. But
>the answer is not to change the character of the network (by increasing
>restrictions and decreasing freedom of access), but to promote a sense
>of character among the members of the community who work and experiment
>in this network. This puts the burden on us to encourage, teach, and
>provide examples of the kinds of behavior needed to preserve the
>network.
>

You talk of personal responsibility -to oneself, to one's colleagues,
to one's community - and I heartily agree; however, you also talk of the
burden we all have to somehow teach and instill in others that sense of
rightness which makes the net possible. This does not ensure that those
whom we teach will listen, and even if they do, that they will do it
right away. Perhaps there is an analogy to children who, though they
have been told to "do right", test the limits of their freedom, test the
extent of their personal strengths. We hope that through time and
experience these children will grow to become an integral part of their
communities - but it takes time.
I do not wish to condone the actions of anyone who disrupts the net or
rips out pages from library books or trashes the environment in which we
all live. Although our site has not seen evidence of this particular
virus, it will, no doubt, be the victim of others. In that vein, we need
to protect our site from the thrashings of either childish behaviour
or cynical attacks. This means we treat our sites more protectively -
viz. the family heirloom - yet not so much that growth and evolution
of the system is stifled.

I suspect that part of the openness and collegiality which we would like
comes at a price in these attacks. We can only mute their number and
intensity.

Don Chen
Dept of Microbiology
Oregon State University

John B. Nagle

Nov 6, 1988, 9:27:19 PM
In article <2...@jove.dec.com> vi...@decwrl.dec.com (Paul Vixie) writes:
>The bug in fingerd was a big surprise, though. Overwriting a stack frame
>on a remote machine with executable code is One Very Neat Trick.

Yes. But not all that uncommon, given classical C's rather casual
approach to array sizing. "login" in V6 UNIX could be broken by submitting
very long, suitably constructed passwords.

John Nagle

Jim Hutchison

Nov 7, 1988, 1:01:10 AM
Unix is not a "secure" system. No system attached to a network is
entirely secure. Valid and illicit network transactions can be
identical. A casual shell expansion here, a little flexibility in
input for a mailer there, ... the system not designed to stop intruders
lets them in. For security, put the machine in a red Tempest can and
seal it up tight. Or looked at in another light, more damage could
have been done with a modem and 10 popular women's names!

The type of hole through which a recent Deutschlander climbed still
exists. The casual hole. A broken piece of software that did not
get updated, or came back from a backup when the controller scrawled
wild accusations across the system partition. Human error is real,
it cannot be ignored. Most importantly, it will happen to you.

Locks are for children and honest people. It is nice to know that
there are "locks" on the doors of the system. I don't go out cracking
security, I'm simply not interested. Almost anyone *can* crack
security. BSD security is not particularly more ventilated than SysVr*,
or VMS. Software has bugs. Get it. If it fails to deliver a letter,
or lets in "the man with no name", it's still just a bug.

Hopefully this article has not fed any hysteria.

/* Jim Hutchison UUCP: {dcdwest,ucbvax}!cs!net1!hutch
ARPA: JHutc...@ucsd.edu
These are my opinions, and now you have your perceptions of them. */

Don Ferencz

Nov 7, 1988, 10:55:05 AM
In article <2...@jove.dec.com> vi...@decwrl.dec.com (Paul Vixie) writes:
>
>I've known about it for a long time. I thought it was common knowledge
>and that the Internet was just a darned polite place. (I think it _was_
>common knowledge among the people who like to diddle the sendmail source.)
>
>The bug in fingerd was a big surprise, though. Overwriting a stack frame
>on a remote machine with executable code is One Very Neat Trick.

I wasn't aware of these tricks, but I find them interesting now, knowing
what security hazards they pose. Is there some place interested
[sick, twisted] individuals like me could get more information on
Morris' handiwork? It would be a benefit from a security aspect. I also
realize that presenting such information could be considered another
risk, perhaps "inviting" someone else to subject us to the same
peril (although most of the net is now "immunized" against this
particular virus).


===========================================================================
| Don Ferencz | "And in the end/ |
| fer...@cwsys3.cwru.EDU | The love you take/ |
| Department of Systems Engineering | Is equal to the love you make." |
| Case Western Reserve University | -- The Beatles |
===========================================================================

David Emberson

Nov 7, 1988, 3:06:23 PM
In article <20...@spdcc.COM>, e...@spdcc.COM (Steve Elias) writes:
> "Wormer" Morris has quite a career ahead of him, i'll bet.
> he has done us all a favor by benevolently bashing bsd 'security'.
>

I knew about this sendmail bug at least four years ago, courtesy of Matt
Bishop (now at Dartmouth). He wrote a paper detailing at least a half dozen
holes in the Unix system and methods for constructing trojan horses which was
so dangerous that he responsibly decided not to publish it, but instead to
give selected copies to people who could fix some of the problems. He also
wrote an article for the Usenix newsletter, ;login, which explained how to
write secure setuid shell scripts--a major source of security holes. Matt did
not "benevolently bash" anyone's machines. His behaviour, while unsung by
the press and the Usenet community, is an example of the highest in
professional and academic standards. This is the kind of behaviour that we
should be extolling.

It is a pity that the perpetrator of this hack, allegedly Mr. Morris, is now
hailed as a famous "expert" in computer security. No doubt he will make a
fortune after the noise dies down as a security consultant. In fact, I saw
someone quoted in this morning's Wall Street Journal as saying that the
perpetrator was someone he would love to hire! Not I! I would think that
prison would be a better place for a person who cost the government, several
universities, and many companies untold thousands of man-hours and millions of
dollars in downtime and effort spent tracking this piece of garbage down. And
it is almost certain that all the copies of the virus haven't been found.

Unfortunately, the press seems to grab hold of every stupid jerk like this and
hail him as some sort of genius. Somehow the issue of computer security evokes
images of high school kids firing off MX missiles or some other vision which
terrifies the public, and the press loves sensation more than substance. A few
years ago there was pandemonium in the press when someone told them that
terminals with programmable function keys could be trojan-horsed. Big deal!
But the media broadcast repeatedly the "revelation" that most terminals in the
world had this "bug." Now they are jumping up and down because the recent
virus made its way into Lawrence Livermore and NASA Ames--even though it didn't
make it into any classified machines. The news people are more interested in
irresponsibly stirring people into a frenzy than they are in responsible
reporting of facts.

I call upon my fellow computing professionals to promote ethical behaviour
amongst their students and colleagues and to denounce destructive misuse of
computing knowledge. I also call upon them to refuse to participate in the
glorification of people in the profession who engage in this kind of behaviour.
We must police ourselves and censure those amongst us who engage in this type
of computer crime. Much is at risk if hysterical reporters cause hysterical
law makers to place restrictions on networks, on the capability of hardware,
on access to computing facilities, or on software. Computer security costs a
great deal of money, like defense spending. I for one would rather see this
money go for better things.


Dave Emberson (d...@sun.com)

John Moore

Nov 7, 1988, 8:59:32 AM
In article <2...@jove.dec.com> vi...@decwrl.dec.com (Paul Vixie) writes:
># the hole [in sendmail] was so obvious that i surmise that Morris

According to press reports, RM spent his summers working at AT&T
on "Unix Communications Software Security". Anyone with a source
license check to see if he slipped a trojan horse into uucico
or uuxqt or something?
--
John Moore (NJ7E) {decvax, ncar, ihnp4}!noao!nud!anasaz!john
(602) 861-7607 (day or eve) {gatech, ames, rutgers}!ncar!...
The opinions expressed here are obviously not mine, so they must be
someone else's. :-)

Steve Elias

Nov 8, 1988, 7:56:32 AM
In article <76...@sun.uucp> dre%em...@Sun.COM (David Emberson) writes:
>In article <20...@spdcc.COM>, e...@spdcc.COM (Steve Elias) writes:
>> "Wormer" Morris has quite a career ahead of him, i'll bet.
>> he has done us all a favor by benevolently bashing bsd 'security'.
>
>prison would be a better place for a person who cost the government, several
>universities, and many companies untold thousands of man-hours and millions of
>dollars in downtime and effort spent tracking this piece of garbage down. And
>it is almost certain that all the copies of the virus haven't been found.

my opinion is that it is almost certain that others have
been using such security holes in unix for quite some time.

i'm glad to see such gaping security holes closed; it is too
bad that it took a USA Today Worm Program to do this. ca va.

>Unfortunately, the press seems to grab hold of every stupid jerk like this and
>hail him as some sort of genius.

Morris is apparently quite intelligent.
but he's also apparently a jerk enough to let his worm get away.

we may be lucky that it escaped (or was released) before he
had a chance to slow its propagation speed such that it would
be more difficult to notice... or worse.


--


harvard!spdcc!eli

peter honeyman

Nov 8, 1988, 8:54:14 AM
John Moore asks:

>Anyone with a source
>license check to see if he slipped a trojan horse into uucico
>or uuxqt or something?

there's not a line of code in honey danber or 4.3uucp that was written
by rtm.

however, rtm's (independent) work on adding protection to uucp served
as the inspiration for honey danber's tight-assed protection scheme.
(e.g., by default, don't send files unless you placed the call; e.g.,
by default don't allow hosts to request files). his contribution here
was a valuable one.
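For readers unfamiliar with Honey DanBer uucp, the defaults described here live in its Permissions file. A sketch of a conservative entry (the site and login names are hypothetical, and the exact option spellings should be checked against your own manual):

```
# Hypothetical remote site "ficc", logging in as uucppublic:
# REQUEST=no     - the remote may not request files from us
# SENDFILES=call - we send queued work only when we place the call
LOGNAME=uucppublic MACHINE=ficc \
        REQUEST=no SENDFILES=call \
        READ=/usr/spool/uucppublic WRITE=/usr/spool/uucppublic
```

The point of the scheme is that everything is denied unless the administrator grants it, the reverse of the old uucp posture.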

peter

Jim Matthews

Nov 8, 1988, 10:09:21 AM
In article <14...@anasaz.UUCP> jo...@anasaz.UUCP (John Moore) writes:
>
>According to press reports, RM spent his summers working at AT&T
>on "Unix Communications Software Security". Anyone with a source
>license check to see if he slipped a trojan horse into uucico
>or uuxqt or something?
>--

As a matter of fact, one of the things Robert did at Bell Labs (while
still a high school student, I believe) was fix some of the glaring
security holes in uucp (AT&T Bell Laboratories Technical Journal,
10/84).

It is very easy in the aftermath of something like this to indulge in
the devil theory of crime -- that all bad things must come from evil
minds. The more you find out about rtm, I believe, the more you will find
he has in common with the people criticizing his behavior. He has done
significant work in computer security, including warning people for
years about the security holes that made the worm possible. He has
worked as a sysadmin for an arpanet host. He is a serious student of
computer science and was making contributions to the field at an age
when most of us were trying to learn Pascal. He's also one hell of a
great guy, and no one seems more appalled by the effects of his actions
than he is.

We can argue about the advisability of what he did, but I urge you to
resist the temptation to pigeon-hole someone you don't know on the basis
of fragmentary information.

Jim Matthews
Dartmouth Software Development

John B. Nagle

Nov 8, 1988, 10:56:40 AM
>According to press reports, RM spent his summers working at AT&T
>on "Unix Communications Software Security". Anyone with a source
>license check to see if he slipped a trojan horse into uucico
>or uuxqt or something?

This is serious. The knowledge that this person had the opportunity to
tamper with the master source code for UNIX is very worrisome. A major
examination of all AT&T-provided security related code is in order.

We may not be at the end of this yet.


John Nagle

a.v.reed

Nov 8, 1988, 1:53:31 PM
In article <76...@sun.uucp>, dre%em...@Sun.COM (David Emberson) writes:
< In article <20...@spdcc.COM>, e...@spdcc.COM (Steve Elias) writes:
< > "Wormer" Morris has quite a career ahead of him, i'll bet.
< > he has done us all a favor by benevolently bashing bsd 'security'.
<
< I knew about this sendmail bug at least four years ago, courtesy of Matt
< Bishop (now at Dartmouth). He wrote a paper detailing at least a half dozen
< holes in the Unix system and methods for constructing trojan horses which was
< so dangerous that he responsibly decided not to publish it, but instead to
< give selected copies to people who could fix some of the problems. He also
< wrote an article for the Usenix newsletter, ;login, which explained how to
< write secure setuid shell scripts--a major source of security holes. Matt did
< not "benevolently bash" anyone's machines. His behaviour, while unsung by
< the press and the Usenet community, is an example of the highest in
< professional and academic standards. This is the kind of behaviour that we
< should be extolling.

Really? In my book, a key component of professionalism is "owning
the problem". That means you work it until it gets fixed. "Giving
selected copies to people who could fix some of the problems"
(they didn't) is not enough. Morris did what was necessary to get
the problems fixed. For that, many of us are grateful. And yes,
some of us LIKE people who "own the problem" until it is solved.

Adam Reed (a...@mtgzz.ATT.COM)

Robert L. Morgan

Nov 8, 1988, 2:19:34 PM

I could only sigh as I telnet'ed to the various machines that I use
here on campus to change my passwords last Friday morning (along with
most other users, no doubt), hoping that some "bored graduate student"
wasn't sucking up the cleartext passwords as they passed across our
various broadcast LANs.

The recent viral event makes it very clear that those of us who
promote the use of network-attached computers in their current
insecure state are on the same moral ground with, say, the automotive
engineers and management who manufactured and sold the exploding
Pintos of a few years back. There is a conspiracy of silence
(acknowledged by those posters who "knew about the bug four years
ago") that we all participate in whenever we design, produce,
purchase, or install such systems without raising the issue of
security.

Project Athena (among others) has shown that order-of-magnitude
improvements in security are possible without terrible penalties in
performance or usability, but is anyone listening? I hope people will
keep the implications of the virus attack in mind as they go about
their daily technological work. A patch to sendmail, putting Mr.
Morris in jail, or saying the Pledge of Allegiance each morning, are
not the answer.

- RL "Bob" Morgan
Networking Systems
Stanford

Glen Dudek

Nov 8, 1988, 2:35:10 PM
In article <14...@anasaz.UUCP> jo...@anasaz.UUCP (John Moore) writes:
>In article <2...@jove.dec.com> vi...@decwrl.dec.com (Paul Vixie) writes:
>># the hole [in sendmail] was so obvious that i surmise that Morris
>
>According to press reports, RM spent his summers working at AT&T
>on "Unix Communications Software Security". Anyone with a source
>license check to see if he slipped a trojan horse into uucico
>or uuxqt or something?

I was system administrator at Harvard's computer science computing
facility while Robert Morris was an undergraduate there. I found him
to be an intelligent and responsible person. He volunteered his
assistance in solving difficult problems in network configuration and
routing, and helped to make Harvard a major Northeast news and mail
gateway. He did not exploit his knowledge of UNIX security
deficiencies to break into systems or install trojan horses, though he
well could have.

I do think that if he did indeed release this worm, he showed
extraordinarily poor judgement. However, I would not consider it
justice to punish him as a criminal. I am convinced he had no
malicious intent (please, no arguing about intent and breaking the law -
I am talking about justice, not the law).

I do not think the world need worry about holes that Robert Morris
could have created - I think we need to worry about the ones he didn't find.

Glen Dudek
ex-pos...@harvard.harvard.edu

Matt Crawford

Nov 8, 1988, 3:01:59 PM
In article <76...@sun.uucp>, dre%ember.sun.com (David Emberson) writes:
) I knew about this sendmail bug at least four years ago, courtesy of Matt
) Bishop (now at Dartmouth). ... His behaviour, while unsung by
) the press and the Usenet community, is an example of the highest in
) professional and academic standards.

How long have you been at sun? Or how long has anyone at sun known of
the debug hole? And yet they kept shipping binaries with the hole open.
This is an example of the lowest in conscientious responsibility to the
customer.
Matt Crawford

Scott MacQuarrie

Nov 8, 1988, 5:40:09 PM

There is a product available from AT&T's Federal Systems group called
MLS (Multi-Level Security) which provides B1-level security in a System V
Release 3.1 environment. I have seen the product on a 3B2; its availability
from other vendors would probably require work by those vendors. (Yes Henry,
we might even help them do that :-) ).

Scott MacQuarrie
AT&T Canada Inc.
uunet!attcan!scott

p.s. Opinions are my own.

Adam L. Buchsbaum

Nov 8, 1988, 6:08:00 PM
In article <17...@glacier.STANFORD.EDU> j...@glacier.UUCP (John B. Nagle) writes:
> This is serious. The knowledge that this person had the opportunity to
>tamper with the master source code for UNIX is very worrisome. A major
>examination of all AT&T-provided security related code is in order.
>
> We may not be at the end of this yet.
>
> John Nagle

Personally, I'd be much more concerned with software that was
written by people who have been clever enough to have not yet
been caught...

John F. Haugh II

Nov 8, 1988, 7:28:25 PM
In article <17...@glacier.STANFORD.EDU> j...@glacier.UUCP (John B. Nagle) writes:
> This is serious. The knowledge that this person had the opportunity to
>tamper with the master source code for UNIX is very worrisome. A major
>examination of all AT&T-provided security related code is in order.

Not just security related code - but ALL code.

A trojan horse in awk or sed would be just as deadly. I'm casting my vote
with some other poster who suggested taking a fine-toothed comb to all the
UUCP code.

Meanwhile, I'm working on a replacement login so I can have a shadow password
file on this machine.
--
John F. Haugh II +----Make believe quote of the week----
VoiceNet: (214) 250-3311 Data: -6272 | Nancy Reagan on Artifical Trish:
InterNet: j...@rpp386.Dallas.TX.US | "Just say `No, Honey'"
UucpNet : <backbone>!killer!rpp386!jfh +--------------------------------------

George Seibel

Nov 8, 1988, 10:25:16 PM
In article <76...@sun.uucp> dre%em...@Sun.COM (David Emberson) writes:
>In article <20...@spdcc.COM>, e...@spdcc.COM (Steve Elias) writes:
>> "Wormer" Morris has quite a career ahead of him, i'll bet.
>> he has done us all a favor by benevolently bashing bsd 'security'.

>I knew about this sendmail bug at least four years ago, courtesy of Matt
>Bishop (now at Dartmouth). He wrote a paper detailing at least a half dozen
>holes in the Unix system and methods for constructing trojan horses which was
>so dangerous that he responsibly decided not to publish it, but instead to
>give selected copies to people who could fix some of the problems. He also
>wrote an article for the Usenix newsletter, ;login, which explained how to
>write secure setuid shell scripts--a major source of security holes. Matt did
>not "benevolently bash" anyone's machines. His behaviour, while unsung by
>the press and the Usenet community, is an example of the highest in profession-
>al and academic standards. This is the kind of behaviour that we should be
>extolling.

With all due respect, why? It didn't seem to be very effective in closing
the hole in sendmail. Now that everyone is coming out of the woodwork
exclaiming that they've known about this bug for years, I can't help but
wonder why it wasn't fixed. There were a lot of people running around
a couple of weeks ago under the blissful assumption that their computers
were reasonably secure - they had done all the "right" things, vis-à-vis
file protections, setuid scripts and the like, and all the while, *anyone*
with the appropriate knowledge (and apparently a lot of people had it)
could have done *anything* they wanted to your machine! Perhaps that
was no great surprise to many readers of this newsgroup. Fine. If that's
the way people want it, then let's be up front and print a warning on
each copy of system software that ships: "Congratulations! You just
bought a fine copy of Unix. Don't keep any files you care about on it."
If we have security holes on our machines that are well known, and we
do nothing to patch those holes, we are asking for trouble.

George Seibel

peter honeyman

Nov 8, 1988, 10:50:33 PM
John B. Nagle observes:
> ... The knowledge that this person had the opportunity to

>tamper with the master source code for UNIX is very worrisome. A major
>examination of all AT&T-provided security related code is in order.
>
> We may not be at the end of this yet.

rtm worked in research. e.g., he ported bsd tcp/udp/ip to streams,
back when only eighth edition unix had streams. i believe this was the
first example of a multiplexing stream handler. it's a pretty neat hack.

rtm had access to the master source code as in master race, but did not
develop in system v at murray hill.

peter

George Seibel%Kollman

Nov 8, 1988, 11:48:43 PM
In article <11...@cgl.ucsf.EDU> I write:

>file protections, setuid scripts and the like, and all the while, *anyone*
>with the appropriate knowledge (and apparently a lot of people had it)
>could have done *anything* they wanted to your machine!

Oops.. not *anything*, perhaps *some* things... the sendmail bug doesn't
provide root access; more likely 'daemon' or something of that sort.
One of our local hosts did have the root password cracked in the recent
worm attack, but that was due to poor choice of root password rather
than any of the myriad *other* security holes we learned about courtesy
of Mr. Morris. My apologies for the misinformation.

George Seibel

d...@alice.uucp

Nov 9, 1988, 3:52:32 AM
References: <14...@anasaz.UUCP> <7...@mailrus.cc.umich.edu>

Pursuant to the responses of Honeyman and Mitchell to the worries
of Moore and Nagle:

Robert Morris (rtm, Morris Minor, the little enchilada) spent two
summers, several years ago, in our group at Bell Labs. During
the first, his major accomplishment was a complete rewrite of
the uucp and accompanying software. As Peter noted, his version
was considerably more secure than previous versions, and some
of his insights influenced HoneyDanBer uucp. We ran it on our machines
for nearly a year thereafter, but dropped it in favor of HDB,
mainly because HDB was rapidly gaining favor within AT&T, and Robert's
version had no superiority sufficient for us to push it or keep
it going in the absence of its author. I believe it was
free of intentional trapdoors, unlike sendmail.
In any event, the code is long gone except from backup tapes.

The second summer, his major product was a streams implementation
of TCP/IP that is still the basis of the Eighth/Ninth edition
version of that module. It has since been reworked considerably,
mainly to remove the vestiges of the socket mechanisms (he started
from the Berkeley code), but again, we have never found any evidence
of funny business that wasn't in what he started with.

None of the work he did is in any product, and he didn't have
any opportunity to tamper with the master source code--
that is really quite far away from Research.

Dennis Ritchie

Wayne Folta

Nov 9, 1988, 9:06:35 AM
In article <3...@ksr.UUCP> du...@ksr.com (Glen Dudek) writes:
>
>I was system administrator at Harvard's computer science computing
>facility while Robert Morris was an undergraduate there. I found him
>to be an intelligent and responsible person. He volunteered his
>assistance in solving difficult problems in network configuration and
>routing, and helped to make Harvard a major Northeast news and mail
>gateway. He did not exploit his knowledge of UNIX security
>deficiencies to break into systems or install trojan horses, though he
>well could have.
>

Is anyone sure that Morris didn't plant any trojan horses at Harvard?
From the popular press accounts (admittedly the popular press is naive
and sensationalist) Morris had passwords recorded in his account for
machines at MIT and Harvard. Is this so? If so, why did he have them?
If so, did his buddies at Harvard give them to him, or did he steal them?

I am sure that Glen Dudek speaks with authority about Morris' helpfulness,
intelligence, and general good nature. But how can he authoritatively
state that Morris did not compromise Harvard's systems?

Wayne Folta (fo...@tove.umd.edu 128.8.128.42)

Wayne Folta

Nov 9, 1988, 9:08:11 AM

This is a very difficult case. If Good Morris is let off because he is
sincere and meant no harm, what about the 2000 Evil Morrises that lurk
in every high school and university in the land? The next guy could
claim that his destructive program was not meant to be destructive, that
he (or she) only meant to overwrite the systems' message of the day, and
a bug resulted in destroying a filesystem. (Morris is like a crime buff who,
to prove that it can be done, smuggles a gun onto a plane and hijacks it
to Canada. He has shown a problem in the system, but he has created the
defense for every would-be hijacker in America.)

And remember, at least two of the well-known Macintosh viruses were not
meant to harm anyone's system, but unexpected side-effects caused crashes.
Morris' program wasn't meant to get loose, but it did. It wasn't meant to
destroy data but...

Wayne Folta (fo...@tove.umd.edu 128.8.128.42)

Eduardo Krell

Nov 9, 1988, 11:21:36 AM
In article <17...@glacier.STANFORD.EDU> j...@glacier.UUCP (John B. Nagle) writes:

> This is serious. The knowledge that this person had the opportunity to
>tamper with the master source code for UNIX is very worrisome. A major
>examination of all AT&T-provided security related code is in order.

This is nonsense. He worked at the research center in Murray Hill, which
has nothing to do with the organization in charge of the official System V
distribution in Summit, NJ.

Eduardo Krell AT&T Bell Laboratories, Murray Hill, NJ

UUCP: {att,decvax,ucbvax}!ulysses!ekrell Internet: ekr...@ulysses.att.com

Jim Hutchison

Nov 9, 1988, 7:00:37 PM
In <11...@cgl.ucsf.EDU> sei...@hegel.mmwb.ucsf.edu.UUCP (George Seibel) writes:
> [...] If that's

>the way people want it, then let's be up front and print a warning on
>each copy of system software that ships: "Congratulations! You just
>bought a fine copy of Unix. Don't keep any files you care about on it."

You would prefer VMS where you can read the documentation to find out how
to break security? Or how about a system with no features?

If you broadcast a bug, and its fix/patch, you take responsibility for that
patch. You also risk letting loose all sorts of mayhem on systems where
the system manager is lazy or on vacation. Binary sites are particularly
limited in the number of fixes they can apply. So out go the fixes quietly,
and perhaps only locally. Here we are.

Do you have a good answer, or are you just going to indulge yourself in
a good screaming fit?

>If we have security holes on our machines that are well known, and we
>do nothing to patch those holes, we are asking for trouble.

True. But not real. Many people spend a great part of their waking
hours monitoring and fixing the system, locally and for others. Don't
be vicious and ignore their hard work.

>George Seibel

Doug Gwyn

Nov 9, 1988, 9:07:54 PM
>(In spite of reassurances from here and CMU, the local media still
>insisted on the headline "Defense Computers invaded by virus.")

The media was right. For example, VGR.BRL.MIL was inoperative for
days while we studied the virus here (since that system had already
been infected). VGR.BRL.MIL plays a key role in several projects
that are important to the national defense. Other military sites
are known to have been affected. Fortunately we have been able to
characterize the behavior of the virus and now know that it did not
alter critical databases (for example).

>Second, there is an economic conseqence. Since we were unable to
>determine the extent of the programs activities we were forced to
>commit programmers time to installing kernel fixes, rebuilding systems,
>checking user data files, and checking for other damage.

We spent our time instead determining the exact extent of the virus's
abilities. As a result we found that we did not need to worry about
the effects of Trojan horses, etc. (which could well have been part
of what the virus/worm did, although we were lucky this time).

>"virus" (which is really more of a Trojan Horse), ...

It could be considered a "worm" but not meaningfully a Trojan horse.
It had the opportunity to install Trojan horses but didn't do so.

>low level of ethical maturity.

So where is the student to learn better? The current culture is
founded more on the philosophy of pragmatism than anything else,
and accordingly the student is encouraged in his belief that
nearly anything is okay so long as he doesn't get caught.

If you want to establish rational values as the norm, you have your
work cut out for you. It's a worthwhile goal, but won't be
accomplished quickly.

Doug Gwyn

Nov 9, 1988, 9:20:02 PM
In article <21...@ficc.uu.net> pe...@ficc.uu.net (Peter da Silva) writes:
>One side effect that I don't like is that UNIX is taking the blame for
>a combination of (1) a security hole in an application (sendmail), and
>(2) deliberate loosening of security to trusted sites (rhosts, etc...).
>Non-academic UNIX in general is a lot less open to techniques like this.

The virus exploited two security holes in Berkeley-supplied servers.
We found that several commercial offerings that included this software
had done little more than stick their own label on it; they did not go
over the code and fix its problems before releasing it. In fact, in
the case of sendmail, they didn't even turn off the DEBUG flag in the
Makefile.

The technical problems that were exploited were mostly sloppiness that
nobody had reviewed and corrected in time. We know of a few other
similar security holes that the virus didn't try to exploit.

One could also challenge the design that provides privileged access
via sockets and their servers without adequate authentication.

The lessons to be learned are not overly simple, and until they have
been thoroughly assimilated by the right people, you can be assured
that there are more security holes of the same general nature.

Try the following on your favorite remote 4BSD-based system:
rlogin host -l ''
This attack works a surprising percentage of the time. The problem
that provides the hole has been known for many years and was fixed
at least as long ago as 1984 in the AT&T-supplied UNIX variants.
But it persists in the Berkeley variants. Perhaps this note will
prompt the various vendors to finally fix this problem!

The REAL problem is that too many people just do not care about
security, probably because they don't understand how it affects
them.

ko...@husc4.harvard.edu

Nov 9, 1988, 9:43:49 PM
In article <10...@dartvax.Dartmouth.EDU> matt...@eleazar.dartmouth.edu (Jim Matthews) writes:
>It is very easy in the aftermath of something like this to indulge in
>the devil theory of crime -- that all bad things must come from evil
>minds. The more you find out about rtm I believe the more you will find
>he has in common with the people criticizing his behavior. He has done
>significant work in computer security, including warning people for
>years about the security holes that made the worm possible. He has
>worked as a sysadmin for an arpanet host. He is a serious student of
>computer science and was making contributions to the field at an age
>when most of us were trying to learn Pascal. He's also one hell of a
>great guy, and no one seems more appalled by the effects of his actions
>than he is.

>We can argue about the advisability of what he did, but I urge you to
>resist the temptation to pigeon-hole someone you don't know on the basis
>of fragmentary information.

>Jim Matthews

I may be a really nice guy but if I, by accident, kill someone by driving
recklessly, the state of MA is going to toss me in jail for manslaughter.
And I'd expect as much. Nice people are just as responsible for their
actions as "evil" people. If we fail to prosecute someone just because
they appear to be nice, brilliant, et al, then what's to stop many others
from doing similar things and claiming "I'm just as nice as RTM! Let me
go."

With the press holding RTM up on high many a hacker is going to say,
"This is how I get recognition! This is how I get a job!" And, surprise!,
it'll work. Set an example and set it before things get out of hand.
If at all possible, punish RTM to the fullest extent of the law. It may
be more than he deserves but unfortunately (?) someone must set the
example and show that such anti-social activities are not acceptable.

Perhaps a suitable punishment, at least in this case, is just denying
RTM access to any systems that connect to any other systems. You pollute
our nest and we're going to toss you out of it.

-David Kovar
Technical Consultant
Harvard University

Steven M. Bellovin

Nov 9, 1988, 11:18:15 PM
> According to press reports, RM spent his summers working at AT&T
> on "Unix Communications Software Security". Anyone with a source
> license check to see if he slipped a trojan horse into uucico
> or uuxqt or something?

Morris wrote an entirely new version of uucp, one that a higher degree
of inherent security than any of its predecessors. It was in fact
installed as the production uucp on a number of research machines for
several years. Ultimately, it was supplanted by Honey DanBer uucp
because it wasn't hardened enough against real-world failures. At
Morris's request, I went over the code in great detail; there were
no holes visible -- and I repeat, I studied his code thoroughly.
In any event, to the best of my knowledge that version of uucp was
never released.


--Steve Bellovin

Chris Torek

Nov 10, 1988, 7:11:46 AM
In article <88...@smoke.BRL.MIL> gw...@smoke.BRL.MIL (Doug Gwyn ) writes:
>The technical problems that were exploited were mostly sloppiness that
>nobody had reviewed and corrected in time. We know of a few other
>similar security holes that the virus didn't try to exploit.

Well, good grief, SEND THEM TO US. WE *WILL* FIX THEM. This is a
large part of what comp.bugs.4bsd.ucb-fixes is about. (Or do you mean
that they are fixed in 4.3tahoe but not other 4BSD-derived systems?)

>Try the following on your favorite remote 4BSD-based system:
> rlogin host -l ''

I get:

`Password:'

Obviously this one has been fixed in 4.3tahoe.
--
In-Real-Life: Chris Torek, Univ of MD Comp Sci Dept (+1 301 454 7163)
Domain: ch...@mimsy.umd.edu Path: uunet!mimsy!chris

Ron Natalie

Nov 10, 1988, 8:33:47 AM
...and I have worked on IBM's B2 product, but I fail to see what that
has to do with the discussion. A bug in either product can cause it
to fail to do what it is supposed to do. In the development group the
Trusted System Programmer frequently has backdoor functions to bypass
the Mandatory Access Control on the test system that one hopes are never
installed in the field (this is much akin to the exploited DEBUG bug
in the BSD systems). And any secure workstation that's plugged into
a network is very suspect. I believe NCSC won't even talk to you if
you put an ethernet card in the workstation.

-Ron

Peter da Silva

Nov 10, 1988, 10:38:40 AM
In article <11...@cgl.ucsf.EDU>, sei...@cgl.ucsf.edu (George Seibel) writes:
> In article <76...@sun.uucp> dre%em...@Sun.COM (David Emberson) writes:
> >[Matt Bishop (now at Dartmouth)] wrote a paper detailing at least a half
> >dozen holes in the Unix system...

UNIX in general or things like sendmail that are specific to BSD?

> "Congratulations! You just
> bought a fine copy of Unix. Don't keep any files you care about on it."

[ because we're not going to fix any security holes ]

You see, this is the sort of response we're going to get from this sort
of confusion. UNIX is not just BSD. The latest release of System V has
many security improvements... some of which were apparently made at the
urging of Robert Morris himself.

Another consideration.

Most of the individuals who buy computers run MS-DOS, a file manager and
program loader that doesn't begin to try to keep viruses out. It can't,
for obvious reasons. For all its faults, UNIX is a lot more secure than
the alternatives available to the general public.

The typical virus in an unprotected DOS or OS is at most a few K of
code... many only a few hundred bytes. This worm required a leader that
was nearly a hundred lines of 'C' code by itself.

Let's keep a sense of proportion.
--
Peter da Silva `-_-' Ferranti International Controls Corporation
"Have you hugged U your wolf today?" uunet.uu.net!ficc!peter
Disclaimer: My typos are my own damn business. pe...@ficc.uu.net

Doug Gwyn

Nov 10, 1988, 1:03:13 PM
In article <14...@mimsy.UUCP> ch...@mimsy.UUCP (Chris Torek) writes:
>In article <88...@smoke.BRL.MIL> gw...@smoke.BRL.MIL (Doug Gwyn ) writes:
>>The technical problems that were exploited were mostly sloppiness that
>>nobody had reviewed and corrected in time. We know of a few other
>>similar security holes that the virus didn't try to exploit.
>Well, good grief, SEND THEM TO US. WE *WILL* FIX THEM. This is a
>large part of what comp.bugs.4bsd.ucb-fixes is about. (Or do you mean
>that they are fixed in 4.3tahoe but not other 4BSD-derived systems?)

Last time I tried, there was a distinct lack of interest!

>>Try the following on your favorite remote 4BSD-based system:
>> rlogin host -l ''

>Obviously this one has been fixed in 4.3tahoe.

Not necessarily. Try the following:
# vi /etc/passwd
<insert an extra blank line, say at the end>
$ passwd
<change your password, say to the same thing it already is>
$ su ''
# surprise!
If this hole exists, it can be traced to getpwent() not being careful
enough when it parses /etc/passwd records. See UNIX System V for the
simplest fix.

Andrew Hume

Nov 10, 1988, 1:04:30 PM

come on. this is so preposterous that i feel obliged to respond.
morris has never worked on System V code, which is probably what you mean
by the master source. he has worked on Research Unix, but given that Ken Thompson
used his Turing Award lecture to advertise a trojan horse he put into
research unix, you would have to be very naive to trust research unix.
(although there are currently no known trojan horses or viruses.)

more importantly, morris has been doing this in an open way; penetrating systems
from the outside, not via trojan horses. in a peculiar (but obvious to me) way,
he is doing the honourable thing; attacking systems via their own foibles,
and not ones he has added. and we have heard peter honeyman acknowledge
morris's contribution towards the current uucp.

so think a little before raising panics and denigrating people's character.

g...@hcx3.ssd.harris.com

Nov 10, 1988, 2:57:00 PM

Written 5:40 pm Nov 8, 1988 by sc...@attcan.UUCP (Scott MacQuarrie)

> There is a product available from AT&T's Federal Systems group called
> MLS (Multi-Level Security) which provides B1-level security in a System V
> Release 3.1 environment. I have seen the product on a 3B2; its availability
> from other vendors would probably require work by those vendors.

It did. It's done. It's called CX/SX.

Gil Pilz -=|*|=- Harris Computer Systems -=|*|=- g...@ssd.harris.com

John F. Haugh II

Nov 11, 1988, 2:15:26 AM
In article <88...@smoke.BRL.MIL> gw...@brl.arpa (Doug Gwyn (VLD/VMB) <gwyn>) writes:
>The REAL problem is that too many people just do not care about
>security, probably because they don't understand how it affects
>them.

I don't recall whether it was Doug Gwyn or Guy Harris who last complained
when I attacked the CSRG and BSD.

Do you *really* trust college students to write real software? If so, you
must have never attended a university similar to the one I graduated from.

Sean McLinden

Nov 11, 1988, 2:48:03 PM
>In article <88...@smoke.BRL.MIL> gw...@smoke.BRL.MIL (Doug Gwyn ) writes:
>Try the following on your favorite remote 4BSD-based system:
> rlogin host -l ''

Or, to keep someone else from doing this, remove lines like:

::0:0::

from your password file. Most Sun systems have this as a default (stupid!).
The alternative is to fix your login programs, which you might not be able
to do with a binary license. Or, you could run MACH (Version 3 will have
NFS).

Sean McLinden
Decision Systems Laboratory

Paul Raulerson

Nov 11, 1988, 4:22:58 PM
>In article <14...@anasaz.UUCP> jo...@anasaz.UUCP (John Moore) writes:
>>
>>According to press reports, RM spent his summers working at AT&T
>>on "Unix Communications Software Security". Anyone with a source
>>license check to see if he slipped a trojan horse into uucico
>>or uuxqt or something?
[deleted text]

>It is very easy in the aftermath of something like this to indulge in
>the devil theory of crime -- that all bad things must come from evil
>minds. The more you find out about rtm I believe the more you will find
>he has in common with the people criticizing his behavior. He has done
>significant work in computer security, including warning people for
>years about the security holes that made the worm possible. He has
>worked as a sysadmin for an arpanet host. He is a serious student of
>computer science and was making contributions to the field at an age
>when most of us were trying to learn Pascal. He's also one hell of a
>great guy, and no one seems more appalled by the effects of his actions
>than he is.
>
>We can argue about the advisability of what he did, but I urge you to
>resist the temptation to pigeon-hole someone you don't know on the basis
>of fragmentary information.
>
>Jim Matthews

Gee, What a *HELL* of an attitude to take about someone who has just cost a
lot of people and organizations a terrifically large amount of resources.
To a great extent, this wonderful wacky and extremely open net of ours is
self policing. People who abuse their privs most often lose them. Once,
when I was a tad younger, I might have agreed with you about showing more
compassion and understanding, but since I have been running this system at
some considerable expense, and dealing professionally with the government for
about 10 years, I feel that this self policing action should be encouraged.

After all, there is nothing in the world stopping Mr. Morris from going
off and starting his own network, as secure as he wishes now is there? But
participation in a group environment means you have to be responsible enough
to realize that other peoples' resources are NOT your personal private toys
to play with. I think it is far more humane to have Mr. Morris recognized
by System Administrators everywhere as a security risk, and be denied access,
with threat of legal action if his illegal activities continue, than it is
to slap him on the wrist and tell those same System Administrators that he
CANNOT be denied access because he really didn't mean it and is sorry for
what he did.

People have to be responsible for themselves, and yes, they have to
realize everyone makes mistakes and be willing to "forget" them. However,
there is *always* a price associated with such forgetfulness, and
Mr. Morris, or whoever the guilty critter was, has yet to pay for
his play.

This isn't really a personal attack on anyone, it is just more of a
defense of the openness we all share here, and what it may take to
keep it open. Anyone wishing to hash the matter over some more, you're
welcome to mail me and if it seems reasonable, I'll summarize the
opinions and post 'em back as a single message.

--
Paul Raulerson & Paul Raulerson & Associates +---------------------------+
Data/Voice: 1+215-275-2429 / 1+215-275-5983 | OS/who? Why bother? Isn't |
Cis: 71560,2016 Bix: paulr | Mess-Dos bad enough? |
UUCP: ...!rutgers!lgnp1!prapc2!paulr +---------------------------+

Guy Harris

Nov 11, 1988, 5:43:55 PM
>If this hole exists, it can be traced to getpwent() not being careful
>enough when it parses /etc/passwd records. See UNIX System V for the
>simplest fix.

If that fix is "have 'getpwent()' return NULL if the entry it looks at
is syntactically incorrect," the fix is simple but rather rude; the net
result is that any program scanning the password file linearly - e.g.,
"passwd" - will think it's at the end of the file if it sees such a
syntactically incorrect line. Having "passwd" cut off the password file
as soon as it sees a blank line isn't very nice; ignoring the
syntactically-invalid lines, or passing them through unchanged, is
probably a better idea. The former could be done by having "getpwent"
skip over those entries, rather than return NULL on them; the latter
requires that "passwd" not just naively use "(f)getpwent" and "putpwent"
to update the password file.

Mike Haertel

Nov 12, 1988, 3:37:31 PM
In article <85...@rpp386.Dallas.TX.US> j...@rpp386.Dallas.TX.US (John F. Haugh II) writes:
>Do you *really* trust college students to write real software? If so, you
>must have never attended a university similar to the one I graduated from.

I am a college student. Also the author of GNU grep, coauthor of GNU diff,
and working on GNU sort . . . all of my programs are faster and (I hope)
more robust than the Unix programs they replace. I am glad to hear that
you don't trust me to write real software, and that you will not be
using my programs.

Do you really trust your vendor to write real software? Most of them
won't distribute source, so you can't check for trojan horses et al.
You can't fix bugs that arise, unless you are good at reading and
patching binaries. Most of them have license agreements that prevent
you from doing this, if you are a person who keeps your word.

I have heard that the reason some vendors don't distribute source is
that they don't want their customers to see how badly written it is.

---
Mike Haertel
Really mi...@stolaf.UUCP, but I read mail at mi...@wheaties.ai.mit.edu.

Guy Harris

Nov 12, 1988, 4:25:20 PM
>Or, to keep someone else from doing this, remove lines like:
>
>::0:0::
>
>from your password file. Most Sun systems have this as a default
>(stupid!).

Excuse me, but to what are you referring? Most Sun systems have a line
like

+::0:0:::

as a default, but this is INequivalent to

::0:0::

Lines of the latter sort are generated by the scenario Doug Gwyn
described; the problem is that "getpwent" doesn't, in some systems,
check that the login name field is non-null before returning a value.
(S5R3's version checks, but unfortunately returns NULL rather than
skipping the invalid entry, which causes programs to think a blank line
in "/etc/passwd" is really the end of the file.)

Don Libes

Nov 12, 1988, 6:14:45 PM
In article <11...@cgl.ucsf.EDU>, sei...@cgl.ucsf.edu (George Seibel) writes:
> "Congratulations! You just
> bought a fine copy of Unix. Don't keep any files you care about on it."

I already see messages like that. Whenever I run gnuemacs, I get:

GNU Emacs comes with ABSOLUTELY NO WARRANTY; type C-h C-w for full details.

Sounds similar if not as overt. In any case, I'm not the least
surprised when gnuemacs trashes my files/buffers. (Same for UNIX.)

Don Libes li...@cme.nbs.gov ...!uunet!cme-durer!libes

John F. Haugh II

Nov 13, 1988, 8:34:52 AM
In article <84...@alice.UUCP> d...@alice.UUCP writes:
>None of the work he did is in any product, and he didn't have
>any opportunity to tamper with the master source code--
>that is really quite far away from Research.

It would be so nice if someone would undertake a security audit to
ensure that work other college students did, which *is* currently
in production, doesn't contain any surprises.

Our friendly enchilada may not be the only prankster out there ...

Sean McLinden

Nov 13, 1988, 10:03:50 AM
>In article <4...@auspex.UUCP> g...@auspex.UUCP (Guy Harris) writes:
>>Or, to keep someone else from doing this, remove lines like:
>>
>>::0:0::
>>
>>from your password file. Most Sun systems have this as a default
>>(stupid!).
>
>Excuse me, but to what are you referring? Most Sun systems have a line
>like
>
> +::0:0:::
>
>as a default, but this is INequivalent to
>
> ::0:0::

Excuse ME, but the last four lines of my SunOS 4.0 distribution tape
password file are:

+::0:0:::
::0:0:::
::0:0:::
::0:0:::

NeXT?

Michael DeCorte

Nov 13, 1988, 12:42:58 PM

Think of the fun everyone is going to have when the politicians and
lawyers start chewing on this. Especially any of them who read CACM.
Can you say regulation?

--

Michael DeCorte // (315)265-2439 // P.O. Box 652, Potsdam, NY 13676
Internet: m...@sun.soe.clarkson.edu // Bitnet: m...@clutx.bitnet
---------------------------------------------------------------------------
Clarkson Archive Server // commands = help, index, send, path
archive...@sun.soe.clarkson.edu
archive-server%sun.soe.cl...@omnigate.bitnet
dumb1!dumb2!dumb3!smart!sun.soe.clarkson.edu!archive-server
---------------------------------------------------------------------------

Jim Frost

Nov 13, 1988, 12:52:20 PM
In article <7...@stolaf.UUCP> mi...@wheaties.ai.mit.edu writes:
|In article <85...@rpp386.Dallas.TX.US> j...@rpp386.Dallas.TX.US (John F. Haugh II) writes:
|>Do you *really* trust college students to write real software? If so, you
|>must have never attended a university similar to the one I graduated from.

Actually, those students who produce code often do a better job than
'professionals', mostly because they have the time to do it right.
Professionally written software is most often pushed out the door,
which isn't likely to help its quality. I could cite examples of
this, but you have probably seen it as often as I have anyway.

Another thing that happens with professionally produced software is
the author deliberately making it hard to follow (read: modify and
debug) in order to ensure his (her) job security (kind of reminds me
of Bush picking Quayle, come to think of it :-). Again, not
something a student, writing on his own, is likely to do.

Would I trust student-written code? You bet your life I would, but
only after giving it a little personal attention, something that
should always be done anyway.

BTW, I would be interested in knowing what exactly constitutes a
"student". A good many people I know, myself included, write things
professionally as well as go to school. Should you only trust those
things I write while I'm at work? The questions could go on and
on....

jim frost
ma...@bu-it.bu.edu

Adam L. Buchsbaum

Nov 13, 1988, 4:01:54 PM
to
In article <85...@rpp386.Dallas.TX.US> j...@rpp386.Dallas.TX.US (John F. Haugh II) writes:
>It would be so nice if someone would undertake a security audit to
>ensure that work other college students did, which *is* currently
>in production, doesn't contain any surprises.

Being just an ignorant graduate student myself, I can't figure out
whether this implies that all college students are suspect, that anyone
who is not in college is not suspect, or both. Perhaps John F. Haugh II
could clarify this for me?

car...@s.cs.uiuc.edu

Nov 13, 1988, 4:19:00 PM
to

/* Written 2:37 pm Nov 12, 1988 by mi...@stolaf.UUCP in s.cs.uiuc.edu:comp.unix.wizards */


I have heard that the reason some vendors don't distribute source is
that they don't want their customers to see how badly written it is.
---

Having seen some proprietary source code that made me gag, I wouldn't
be at all surprised at that.

Alan M. Carroll "How many danger signs did you ignore?
car...@s.cs.uiuc.edu How many times had you heard it all before?" - AP&EW
CS Grad / U of Ill @ Urbana ...{ucbvax,pur-ee,convex}!s.cs.uiuc.edu!carroll

Glen Overby

Nov 13, 1988, 5:39:22 PM
to

In article <17...@cadre.dsl.PITTSBURGH.EDU> se...@cadre.dsl.pittsburgh.edu (Sean McLinden) writes:
>It is clear from Rick Adams' comments that 'not wanting to tip anyone off'
>is no excuse. Even binary-only sites can be protected fairly rapidly if
>the appropriate channels are used.

This sort of thing has been a pretty big issue lately, so I thought I'd chip
in a few comments. If information about bugs (or, should I say,
"misfeatures") in Unix (or really any OS) should not be publicly disclosed to
protect those who either do not or can not repair them, then HOW should
such "classified" information be distributed to those who want/need it, and
can and will fix the holes?

Not but a few weeks ago there was a "discussion" on one of the news.* groups
about the Security mailing list (there are two of them, but that's irrelevant
here) which is restricted to "trusted" people (those who are "root" on a
"major machine" -- whatever that means). Now, if information about security
bugs is too risky for distribution among that elite group of "system gods",
then should that information be exchanged over network mail systems at all?
(e.g. to 4bsd-bugs@ucbvax).

I think all of this sort of information should be distributed at least over
the private security forum; vendor releases just aren't frequent enough to
fix these problems in a timely manner.

Glen Overby
ncov...@plains.nodak.edu uunet!ndsuvax!ncoverby
ncoverby@ndsuvax (Bitnet)

Glen Overby

Nov 13, 1988, 6:02:46 PM
to

In article <85...@rpp386.Dallas.TX.US> j...@rpp386.Dallas.TX.US
(John F. Haugh II) writes:
>It would be so nice if someone would undertake a security audit to
>ensure that work other college students did, which *is* currently
>in production, doesn't contain any surprises.

Why are you worried only about college students? We're not the only ones
in this world to commit crimes.

This security audit should go for any software posted to the net or
otherwise available (anon uucp, anon FTP, etc), as well as on a per-vendor
basis (who's to say that ABC computer maker didn't botch something in their
port?).

What you're prescribing is a pretty major task. I'm sure that if anybody
with Unix sources is sufficiently worried about contamination they will
perform some sort of "audit" and report the bugs back to the Keeper of the
Sources.

MFHorn

Nov 13, 1988, 11:50:12 PM
to
Written 5:40 pm Nov 8, 1988 by sc...@attcan.UUCP (Scott MacQuarrie)
> There is a product available from AT&T's Federal Systems group called
> MLS (Multi-Level Security) which provides B1-level security in a System V
> Release 3.1 environment.

> I have seen the product on a 3B2; its availability
> from other vendors would probably require work by those vendors.

What does this product do to get this rating?

I had heard that ATT, Sun and probably others had been working on
a B-level Unix. I didn't know anyone had gotten past C2.

Andy Rosen | aro...@hawk.ulowell.edu | "I got this guitar and I
ULowell, Box #3031 | ulowell!arosen | learned how to make it
Lowell, Ma 01854 | | talk" -Thunder Road
RD in '88 - The way it should've been

Clifford C. Skolnick

Nov 14, 1988, 1:16:16 AM
to
In article <85...@rpp386.Dallas.TX.US> j...@rpp386.Dallas.TX.US (John F. Haugh II) writes:
>
>It would be so nice if someone would undertake a security audit to
>ensure that work other college students did, which *is* currently
>in production, doesn't contain any surprises.

What evidence do you have that college students are evil programmers
whose code should be verified? It does not take a college student to place
a section of unauthorized code into a program. I'm sure many programs out
in the real world have similar features added by one programmer and abused
by another (as this case was).

I would much rather you had requested an audit of *all* code written
by *any* programmer. No one should ever be trusted so much that the
code they have written goes unvalidated. This is especially true for any
program that runs set-uid to root.

Would you install a set-uid root program off the net without taking a
really careful look at the code? So why did all those source sites not
pick up on this problem long ago? If they did notice it, they kept their
mouths shut. That is just as wrong as the author of sendmail who
supposedly added that code to avoid restrictive management policies.

>Our friendly enchilada may not be the only prankster out there ...

I take offence at your attack on college students. I am a college student
and have never deliberately compromised the security of any code I have
written or worked on.
--
Clifford C. Skolnick | "You told me time makes it easy, then you never told
Phone: (716) 427-8046 | me time stands still" - Gary Neuman
TCP/IP: 44.68.0.195 | ...!rutgers!rochester!ritcv!ritcsh!sabin! lazlo!ccs
c...@lazlo.n1dph.ampr.org| \!kodak!pcid!gizzmo!/

Brian Beattie

Nov 13, 1988, 12:36:24 PM
to
In article <7...@muffin.cme-durer.ARPA> li...@cme.nbs.gov (Don Libes) writes:
/ I'm not the least
/surprised when gnuemacs trashes my files/buffers. (Same for UNIX.)
/
/Don Libes li...@cme.nbs.gov ...!uunet!cme-durer!libes

I'm not the least surprised when the hammer smashes my thumb.

Think about it.
--
_ANYONE_ | Brian Beattie (703)471-7552
can sell software| 11525 Hickory Cluster, Reston, VA. 22090
that has already | bea...@visenix.UU.NET
been written | ...uunet!visenix!beattie

Doug Gwyn

Nov 14, 1988, 3:31:26 AM
to
In article <4...@auspex.UUCP> g...@auspex.UUCP (Guy Harris) writes:
>If that fix is "have 'getpwent()' return NULL if the entry it looks at
>is syntactically incorrect," the fix is simple but rather rude;

That would be rude but it wasn't what I was talking about.
All you really need to do is to skip over a bogus entry and
resynchronize for the next one.

John B. Nagle

Nov 14, 1988, 12:03:43 PM
to

I suggest that the security mailing list be posted to a newsgroup,
but with a 60-day delay. Sites and vendors serious about security will either
have fixed any problem by that time, or they probably aren't going to fix it
at all. This ensures that a false sense of security is not engendered among
system administrators, yet allows a reasonable time for closing newly discovered
problems.
General knowledge of that 60-day timer will tend to accelerate efforts
by vendors to fix problems, I would suspect.

Why 60 days? A monthly update service would be enough to keep systems
operating with the latest security fixes. 30 days would require biweekly
updates to stay current, which is a bit frequent. Much longer than 60 days,
and the pressure would be off on fixing holes.

John Nagle

Mike McNally

Nov 14, 1988, 12:13:21 PM
to
In article <85...@rpp386.Dallas.TX.US> j...@rpp386.Dallas.TX.US (John F. Haugh II) writes:
>It would be so nice if someone would undertake a security audit to
>ensure that work other college students did, which *is* currently
>in production, doesn't contain any surprises.

Doesn't seem to me that a diploma forms some sort of delineation between
wickedness and honesty. Any company that cares about security but only
with respect to those parts of its software that were written by ``college
students'' doesn't deserve serious consideration. Surely, the majority of
electronic crimes are committed by employees of the victims.

--
Mike McNally Lynx Real-Time Systems
uucp: {voder,athsys}!lynx!m5 phone: 408 370 2233

Where equal mind and contest equal, go.

Brad Turner

Nov 14, 1988, 12:26:29 PM
to
In article <17...@ndsuvax.UUCP> ncov...@ndsuvax.UUCP (Glen Overby) writes:
>
>In article <85...@rpp386.Dallas.TX.US> j...@rpp386.Dallas.TX.US
> (John F. Haugh II) writes:
>>It would be so nice if someone would undertake a security audit to
>>ensure that work other college students did, which *is* currently
>>in production, doesn't contain any surprises.
>
>This security audit should go for any software posted to the net or
>otherwise available (anon uucp, anon FTP, etc), as well as on a per-vendor
>basis (who's to say that ABC computer maker didn't botch something in their
>port?).
>
>Glen Overby
>ncov...@plains.nodak.edu uunet!ndsuvax!ncoverby
>ncoverby@ndsuvax (Bitnet)

(out of context of course and maybe not 100% exact)
Frank Burns: I wouldn't be so paranoid if everybody wasn't watching me

Let's all put on our paranoia pants and do the little "somebody is out
to get me" dance!

I'm not suggesting that security should be ignored, or that code should
never be looked at after the first successful compile. It's just that I
hate to see everybody join a posse/lynch mob because of ONE (not several,
ONE) incident. So....

Face it: unless you are willing to personally inspect every piece of source
for every executable that's on your machine, you're potentially compromising
the security of your system. It's no good to "audit" the code, because how
do you know the auditors can be trusted? Couldn't one dishonest auditor do
more harm than anybody else? Think about it: one central group in
charge declaring what is and is not fit. A single point of failure!

What it comes down to is the fact that systems these days are far too
complicated for a single person to deal with. You have to trust your
fellow human being at some point in time, otherwise everybody will be
doomed to re-inventing the wheel. Do you personally have the time and expertise
to code a boot load PROM? Then go from there to a monitor program to an
assembler to a compiler to....vmunix...>rest-of-unix<....ad nauseam. Then
if you really want to get paranoid, how about the hardware? You're going
to have to design your own CPU, mask it yourself, produce it yourself.
Don't forget the glue logic; make your own 74xxx chips, resistors, caps
etc... Where does it stop???? I give up; let's disband society and all go
live in the woods where only the wildlife can get ya'.

While I'm on my soapbox (and guilty)...Is it possible that we (the computing
community) have wasted more time discussing/arguing about the worm than
we spent discovering/dissecting/eradicating/patching? My personal view
is that the gossip fence has gotten overcrowded and we need to let the
issue die and quit wasting net bandwidth rehashing every different
flavor of the same argument/issue.

Thanks for your time, have an OK day, and DON'T post a followup.

-brad-
--
v^v^v^v^v^v^v^v^v^v^v^v^v^v^v^v^v^v^v^v^v^v^v^v^v^v^v^v^v^v^v^v^v^v^v^v^v^v^v
Brad Turner 1330 Ashleybrook Ln. (919) 768-2097 | I speak for myself
3Com Corp. Winston-Salem, NC 27103 mbt@bridge2 | NOT for my employer.

Steven M. Bellovin

Nov 14, 1988, 2:15:21 PM
to
In article <10...@swan.ulowell.edu>, aro...@hawk.ulowell..edu (MFHorn) writes:
>
> What does this product do to get this rating?

I know about AT&T's System V/MLS; let me describe it a bit. For those
who want more details, see the May/June 1988 issue of the AT&T
Technical Journal. I'll start by quoting from the introduction:

``System V/MLS adds several security enhancements to the
standard UNIX system, including mandatory access controls based
on labels consistent with the DoD classification scheme,
improved protection of passwords, extensive auditing, boot-time
assurance measures to detect the introduction of malicious
code, and restriction of certain capabilities that historically
have been responsible for security failures.''

The most interesting change is the way mandatory labels are
implemented. What's done is to reinterpret the GID. Rather than being
used for a simple equality check, the System V/MLS GID is used as a
pointer to a label table; this table gives the security level,
compartment information, etc.

Root Boy Jim

Nov 14, 1988, 3:54:08 PM
to
Doug,
Sometime awhile back (this spring, summer?), I remember someone's
comment regarding which sources contained the routine `gets', the routine
used to subvert fingerd. I recall you thanking the poster and stating
your intention to eradicate it from your System V emulation code.

I applaud you for your foresight, sharing your distaste for this beast.
You may very well have saved yourself from one prong of the fork.

I can imagine you crusading against gets() in both the C and POSIX
standards and I hope you have had success in that area. I would go
so far as to suggest that everyone remove this routine from libc.a
and place it in a separate library available only upon special request
for binary applications only, after filling out numerous forms.

I can see it now, a paper entitled `Local Variables Considered Harmful'.

(Root Boy) Jim Cottrell (301) 975-5688
<r...@nav.icst.nbs.gov> or <r...@icst-cmr.arpa>
Careful with that VAX Eugene!

Rahul Dhesi

Nov 14, 1988, 3:57:14 PM
to
In article <17...@glacier.STANFORD.EDU> j...@glacier.UUCP (John B. Nagle) writes:
>I suggest that the security mailing list be posted to a newsgroup,
>but with a 60-day delay.

This is a good idea. In the case of the oft-quoted ftpd bug, the above
procedure was roughly followed, and it worked.
--
Rahul Dhesi UUCP: <backbones>!{iuvax,pur-ee}!bsu-cs!dhesi

Doug Gwyn

Nov 14, 1988, 4:54:51 PM
to
In article <17...@adm.BRL.MIL> r...@nav.icst.nbs.gov (Root Boy Jim) writes:
>I can imagine you crusading against gets() in both the C and POSIX
>standards and I hope you have had success in that area. I would go
>so far as to suggest that everyone remove this routine from libc.a
>and place it in a separate library available only upon special request
>for binary applications only, after filling out numerous forms.

Although I probably voted to remove gets() from the proposed C standard,
I will stand by X3J11's decision to leave it in. As explained in
discussions raging in comp.lang.c (INFO-C), there are safe uses for
gets(), its "problem" is well known, there are several other standard
library routines with similar characteristics, and a lot of existing
code uses it (sometimes safely, sometimes not).

People are focusing on the wrong problem. The Internet virus also
attacked through a hole unrelated to gets(), and I know of at least
three other such holes. The general problem is lack of sufficient
attention to detail in security-related code. You're not going to
solve this by outlawing a sometimes useful tool.

Jon A. Tankersley

Nov 14, 1988, 6:43:23 PM
to
I've read too much not to comment.....


The question is not actually 'can you trust any university student', but
'can you trust any person'. The answer is yes and no, short of getting
some crack programmers together and brainwashing them. But even then it
would be difficult; they could turn on you.

Anybody is culpable. Anyone can be 'broken'. Maturity by itself does not
make anyone more reliable.

There are/were some University students that I can/would trust to write clean
code. This is because of the 'more than cursory' knowledge of the people
in question. After working with them for 4 years, I knew what their morals
and ideals were. I also knew the other type, that you couldn't trust to
give you the correct time. But, even these people I could trust could/can
be broken and subverted. And that is not a crime. That is human nature.
Given the correct type of hard choices, anyone can be subverted.

But this doesn't deal with the issue. Ethics is something learned from
day 1. Education on ethics points out some of the problems when dealing
with ethics, but it doesn't teach ethics. Scruples are learned also.
Beyond the ancient form of measure, there is no education for scruples.
But it also takes discipline. Discipline to document what is really going
on. Discipline to get it done the right/correct/best way. Discipline to
not be seduced by 'creeping featurism' (a seduction/subversion listed above).

There will always be bugs and loopholes. Security is not a passive function,
but it is often treated that way: fix it when something slips.
Even I am 'guilty' of letting security lapse, partially due to ignorance and
partially due to lack of time to devote to security auditing. Even with all
of the C1-B2 auditing going on, it is still an active job. If nobody ever
looks at the logs..... then there is no security.

The biggest result of the 'Attack of the Hungry Worm' will be a clamping down
on the ease of use of networking. New 'conveniences' will be developed with
new 'features' that will present new 'loopholes' in the never ending seesaw
battle between 'good and evil' (convenience and security).

Sigh... Back to work. Standard disclaimers, etc, etc, etc. and to be
redundant etc.

-tank-
#include <disclaimer.h> /* nobody knows the trouble I .... */

Jon A. Tankersley

Nov 14, 1988, 6:53:42 PM
to
I just checked numerous 4.0 systems that I have. Nary a one had a missing
'+'. Are you sure you checked the installation log?

I am pretty confident that there never were any stray ::0:0.... entries in
any of mine, I use scripts to update my passwd files, and they are pretty
dumb....

Guy Harris

Nov 14, 1988, 9:25:43 PM
to

>Excuse ME, but the last four lines of my SunOS 4.0 distribution tape
>password file are:
>
> +::0:0:::
> ::0:0:::
> ::0:0:::
> ::0:0:::

Ex*cuse* me, but I just looked at the password file on the Sun-3 and Sun-4
1/2" distribution tapes, both on the "root file system" tar file and in
the "Install" optional software component (because it contains the
"prototype" used to install diskless clients). All of them had

+:

as the last line in the password file (in fact, I'll bet the password
files were identical). No blank lines, and certainly no

::0:0:::

I tried "passwd" with a last line like the one above, and it merely
turned it into

+::0:0:::

filling in the missing fields; it didn't insert a

::0:0:::

line.

Now, I can't speak for:

1) the 1/4" distribution tapes, as we don't have them handy,
although I would be *EXTREMELY* surprised if they were any
different.

2) the Sun-2 distribution tapes; see 1)

3) the Sun386i

but I don't see any indication that the password file, as shipped or set
up by Sun, has any

::0:0:::

lines in it.

Guy Harris

Nov 15, 1988, 1:05:43 PM
to

It may not be what you were talking about, but it's what AT&T did, at
least in S5R3 and S5R3.1. As I think I stated, skipping bogus entries
and proceeding to the next valid entry is, indeed, the right fix.

Henry Spencer

Nov 15, 1988, 1:08:21 PM
to
In article <9...@lazlo.UUCP> c...@lazlo.UUCP (Clifford C. Skolnick) writes:
>What evidence do you have that college students are evil programmers
>whose code should be verified? It does not take a college student to place
>a section of unauthorized code into a program...

The problem with college students is not that they are evil crackers, but
that college software quality control is not the best, to put it mildly.
Colleges are organized to produce ideas and degrees, not high-quality
software. It shows. The popular software distribution from a certain
university in southern California is a good example of interesting ideas
often marred by first-cut [i.e. poorly thought out, messy, sometimes
incomplete] designs and implementations.

This is not to say that any random commercial organization, like, say,
one whose name has three initials and an "&" in it, will *necessarily*
do better. But those people can, in theory, afford to spend some money
on quality assurance. Universities generally can't.
--
Sendmail is a bug, | Henry Spencer at U of Toronto Zoology
not a feature. | uunet!attcan!utzoo!henry he...@zoo.toronto.edu

D.Rorke

Nov 15, 1988, 1:38:50 PM
to
> >According to press reports, RM spent his summers working at AT&T
> >on "Unix Communications Software Security". Anyone with a source
> >license check to see if he slipped a trojan horse into uucico
> >or uuxqt or something?
> >--
>
> As a matter of fact, one of the things Robert did at Bell Labs (while
> still a high school student, I believe) was fix some of the glaring
> security holes in uucp (AT&T Bell Laboratories Technical Journal,
> 10/84).

The author of the article you reference was not the Robert Morris
under suspicion (although it may be his father). The biographical
notes at the end of the paper indicate that the Robert H. Morris
who co-authored the paper had been employed at Bell Labs since 1960.

> It is very easy in the aftermath of something like this to indulge in
> the devil theory of crime -- that all bad things must come from evil
> minds. The more you find out about rtm I believe the more you will find
> he has in common with the people criticizing his behavior. He has done
> significant work in computer security, including warning people for
> years about the security holes that made the worm possible. He has
> worked as a sysadmin for an arpanet host. He is a serious student of
> computer science and was making contributions to the field at an age
> when most of us were trying to learn Pascal. He's also one hell of a
> great guy, and no one seems more appalled by the effects of his actions
> than he is.

Being a "great guy" is not sufficient. As members of society we are
also expected to exhibit a reasonable degree of responsible judgement.
Perfectly nice people get roaring drunk, get into their cars, and
unintentionally run over little children. Although this analogy is lacking
in some ways it is meant to dramatically make the point that nice, well
intentioned people can do irresponsible things that cost the rest of society
a great deal. Such people must be held accountable for the results of
their irresponsibility.

The person responsible for this virus may in fact be a "great guy" in many
ways and may not have thought there was anything wrong with what he was doing.
If so, he had a very poor understanding of the ethics involved. Although we
may feel sorry for him we cannot afford to easily excuse such poor judgement.


> We can argue about the advisability of what he did, but I urge you to
> resist the temptation to pigeon-hole someone you don't know on the basis
> of fragmentary information.
>
> Jim Matthews
> Dartmouth Software Development


Dave Rorke
attunix!der

Root Boy Jim

Nov 15, 1988, 1:44:43 PM
to
? Well, now, gets() is of course unsafe, but then there are
? read(), sprintf(), and no telling how many others. For that
? matter, *p++ = *q++

Not quite in the same way. read() takes an argument which specifies the
maximum size of the buffer, so no problem there. Copying a string (*p++ = *q++),
while a frequent source of bugs, is possible to control, since strlen()
will tell you the length. Likewise sprintf(); with a little care one
can precompute the size and reserve a large enough area. One problem the
latter two have is with segmentation violations.

However, with gets(), one is totally at the mercy of data that is
outside the program, and thus, not under control.

? hay...@ucscc.ucsc.edu ? hay...@ucscc.bitnet ? ...ucbvax!ucscc!haynes

? "Any clod can have the facts, but having opinions is an Art."
? Charles McCabe, San Francisco Chronicle

(Root Boy) Jim Cottrell (301) 975-5688
<r...@nav.icst.nbs.gov> or <r...@icst-cmr.arpa>
Careful with that VAX Eugene!

I can't think about that. It doesn't go with HEDGES in the shape of
LITTLE LULU -- or ROBOTS making BRICKS...

Dennis G. Rears (FSAC)

Nov 15, 1988, 1:56:46 PM
to

John F. Haugh II writes

>Do you *really* trust college students to write real software? If so, you
>must have never attended a university similar to the one I graduated from.

What do you have against students? I guess this says something
negative about you (as a previous student), current students, and
your university. The only people I distrust from the start are
felons, known crackers, and politicians :-). Trust is something
that builds over time; not because one isn't a student.

Dennis

Mohamed Ellozy

Nov 15, 1988, 3:46:46 PM
to
In article <88...@smoke.BRL.MIL> gw...@brl.arpa (Doug Gwyn (VLD/VMB) <gwyn>) writes:
>
>People are focusing on the wrong problem. The Internet virus also
>attacked through a hole unrelated to gets(), and I know of at least
^^^^^^^^^^^^^^^^^^^^^^

>three other such holes. The general problem is lack of sufficient
^^^^^^^^^^^^^^^^^^^^^^

This is what irritates the living daylights out of so many of us.
He "knows" of at least three other such holes. He is thus more
learned, perhaps even wiser, than we are.

BUT WHAT THE HELL ARE YOU DOING TO GET THEM CLOSED???

Wizards who "know" about problems and pride themselves on it, but
do nothing, are little better than those who maliciously exploit them.

This wormy episode will only prove useful if it leads to a serious effort
to eradicate existing holes. I suspect that vendors will now be very
sensitive (for a short period of time) to reports of security problems.
Not too sure, though. What have various vendors done for sites which
run anonymous ftp? Expecting customers to learn of problems from the
net is not acceptable user support.

Dennis L. Mumaugh

Nov 15, 1988, 9:06:00 PM
to
In article <88...@smoke.BRL.MIL> gw...@brl.arpa (Doug Gwyn (VLD/VMB) <gwyn>) writes:
>So where is the student to learn better? The current culture is
>founded more on the philosophy of pragmatism than anything else,
>and accordingly the student is encouraged in his belief that
>nearly anything is okay so long as he doesn't get caught.
>
>If you want to establish rational values as the norm, you have your
>work cut out for you. It's a worthwhile goal, but won't be
>accomplished quickly.

This is a short essay on the mores and morals of the computer
culture. It is caused by the controversy over whether the
originator of the recent internet worm should be hailed as a hero
or hauled off to jail and his life and career ruined by his
actions.

There are two attitudes towards life that are exemplified by
various social systems. In an authoritarian/totalitarian society,
that which is not permitted is forbidden. In a "free society",
that which is not prohibited is permitted.

In the computer culture we have similar attitudes. Some people
feel that since UNIX has file permissions, if you don't protect
your files they should be able to browse them (and if your
terminal is not locked they can use it and browse). Others feel
that personal directories and files are out of bounds.

Part of this culture clash comes from the differences between the
"academic community" and the "business community".

I remember back in 1967 when a Freshman student of physics was
making a nuisance of himself with the University of Maryland
Computer Center by breaking the operating system and stealing
time. He led the systems people a merry chase. They finally
stopped the activities by hiring him as a systems programmer.
Today that person is famous as the inventor of <product deleted>
and was a professor of a well known academic institution. [His
name is deleted because he is now a well known person, but I knew
him way back when.]

Today, the same actions would result in disciplinary action and
since the advent of the new federal law on computer security
would be cause for criminal action. What was once considered a
harmless prank is now a "serious" offense.

What has changed? Computers have changed. They used to be toys
of the privileged few researchers and now they are the work
horses of the world. The analogy is that between horses and the
current automobile. In the old days borrowing a horse for a bit
wasn't that serious, nowadays joy riding in a car is a major
offense. [We did hang horse thieves though didn't we?]

Our academic community encourages browsing and "snooping" as long
as we don't destroy things or conceal the origination of ideas
[plagiarism]. The ideal of co-operation between people and the
spread of knowledge is generally taught as the highest goal.

Our business community is just the opposite. We have found that
information is power is money. The FSF to the contrary, computer
data is now valuable [I remember trying to get a mag tape
through Canadian Customs: those who said "Computer Data" paid
duty; those who said "Software" got by for free]. As more and
more people commit their fortunes and lives [figuratively] onto
computer media, the more we will become intolerant of people who
disrupt those computers or idly browse through files.

In another newsgroup [news.sysadmin or some such] the question
was raised: "What authority does a systems administrator have to
browse files." I can remember some times when I happened upon a
torrid love affair being conducted by two married people via
EMail, and .... today I would almost be required to inform
authority of this abuse of computer resources.

Essentially what Doug is raising and I am seconding is that times
have changed. This worm incident has rattled some cages and
aroused some sleeping dragons. Hopefully, the Professional
Societies will provide a code of ethics about computer use in
reference to these areas. If they don't, the US Government will.
Already the new law could be used to charge rogue players with a
crime [unauthorized use of facilities]. Then of course those who
read netnews without official sanction ..... I suspect that one
could even make a case for routing personal mail over the
Internet as being a crime.

--
=Dennis L. Mumaugh
Lisle, IL ...!{att,lll-crg}!cuuxb!dlm OR cuuxb!d...@arpa.att.com

Harvey R Moran

Nov 16, 1988, 6:04:52 AM
to

I wonder how many more people out there believe that sites without
access to the security mailing list (or possibly even USENET) should
have their risks increased pretty significantly? How about us binary
license sites?

If you consider the UNIX community to include both binary license
sites and sites with no access to USENET, the *most* such a newsgroup
would accomplish is to make a larger group of privileged characters --
i.e. anyone with access to USENET. It would *not* get the information
to all concerned SA's.

Please don't take the 60 day suggestion. I wouldn't want to be
forced to abandon UNIX and use VMS. Please note that I do not claim
VMS is any more inherently secure than UNIX, just that DEC doesn't
publish break-in methods around the world. It wouldn't take many
successful break-ins to convince my management to abandon UNIX, or at
least UNIX with *any* communication with the outside world.

Harvey Moran mo...@tron.UUCP@umbc3.UMD.EDU
{wb3ffv,netsys}!hrmhpc!harvey

Roger Collins

Nov 16, 1988, 10:34:19 AM
In article <1988Nov15.1...@utzoo.uucp> he...@utzoo.uucp (Henry Spencer) writes:
> The problem with college students is not that they are evil crackers, but
> that college software quality control is not the best, to put it mildly.

The single most shocking realization I had upon entering the professional
world after student life (Univ. of Florida, 1987) was that programmers
credited with writing the BEST operating system in the world wrote
some of the worst code I had ever seen.

1. Almost every utility can be made to dump core with a little
negative testing. (You don't have to use gets(), fgets() works
on stdin, too.)

2. Almost no comments, anywhere.

3. Readability absolutely the poorest I have ever seen (mostly
one-to-three-letter variable names, poor style, etc.).

I realize that much of the code was written decades ago. And I
have seen some improvement in later releases. But still nothing that
wouldn't get some red marks from my former professors.

--
Roger Collins
NCR - E&M Columbia
rog...@ncrcae.Columbia.NCR.COM

Jim Meritt

Nov 16, 1988, 12:42:55 PM
In article <9...@lazlo.UUCP> c...@lazlo.UUCP (Clifford C. Skolnick) writes:
}In article <85...@rpp386.Dallas.TX.US> j...@rpp386.Dallas.TX.US (John F. Haugh II) writes:
}>
}>It would be so nice if someone would undertake a security audit to
}>insure that work other college students did, which *is* currently
}>in production, doesn't contain any surprises.
}
}What evidence do you have that college students are evil programmers
}whose code should be verified? It does not take a college student to place
}a section of unauthorized code into a program. I'm sure many programs out
}in the real world have similar features added by a programmer and abused
}by another (as this case was).

OK net folk, where am I wrong? (go to it, weemba!)

I do not see:
1. The original post did not say ALL college students are "evil programmers"
(it implied to me that most were not, though)
2. The original post said ONE college student was (rtm)
3. The original post did not say ONLY college students are "evil programmers".


So why the flail, unless the old "protesteth too much" syndrome?


Disclaimer: "It's mine! All mine!!!"
- D. Duck

ter...@tekcrl.crl.tek.com

Nov 16, 1988, 1:01:35 PM
In article <1988Nov15.1...@utzoo.uucp> he...@utzoo.uucp (Henry Spencer) writes:
>In article <9...@lazlo.UUCP> c...@lazlo.UUCP (Clifford C. Skolnick) writes:
>>What evidence do you have that college students are evil programmers
>>whose code should be verified? It does not take a college student to place
>>a section of unauthorized code into a program...
>
>The problem with college students is not that they are evil crackers, but
>that college software quality control is not the best, to put it mildly.
>Colleges are organized to produce ideas and degrees, not high-quality
>software. It shows. The popular software distribution from a certain
>university in southern California is a good example of interesting ideas
>often marred by first-cut [i.e. poorly thought out, messy, sometimes
>incomplete] designs and implementations.


Careful, Henry. I know which college you're talking about, and believe
me, it's not in southern California; in fact, you'll probably incur the wrath
of MANY people by inadvertently moving it from northern CA to southern CA.

You see, there's this great disdain between the people of northern CA
and southern CA, and they like to mention that fact as much as possible!!!

(Lest anyone get the wrong idea (and for you people who couldn't spot
sarcasm if it bit you on the nose and said "This is sarcasm"), insert MANY
(-: here!!!!)

Boy
Do
I
Hate
Inews
!!!!
!!!!

John F. Haugh II

Nov 16, 1988, 1:44:47 PM

This thread is working its way out of relevance in this newsgroup. However,
I feel the need to interject a few closing remarks, etc.

Professional discipline and ethics are not something which can be taught in
a single course. Indeed, even programming skills themselves are difficult
to teach in a classroom environment. There simply are too few hours in a
semester to learn that craft.

For a 132 hour undergraduate degree, which is just about what yours truly
has, the student will attend approximately 1848 hours of class [ 132 credit
hours times 14 weeks per semester ]. This is less than one year's work in the
so-called ``Real World''. Once you consider the number of non-degree
related courses, the actual amount of degree-related experience drops even
lower.

In `The Mythical Man Month' the author discusses various problems which
are encountered when inexperienced programming staffs attempt to write
large software systems. Several revisions later, the author points out,
the entire kitchen sink has been added and the product is impossible to
use or maintain.

There are those of us who feel that BSD Unix has arrived at the ``kitchen
sink'' state of its existence, and I for one also feel USG Unix is at the
same stage. The system, by its very complexity, is becoming useless.

The maintenance of such a system becomes more and more difficult until
we arrive at the point where bugs aren't being fixed as fast as they are
being found or created.

And this is where we return to a question of discipline and ethics. It
is only after years of experience that you realize the futility of adding
new features on top of old systems, without first repairing the underlying
system itself.
--
John F. Haugh II +----Make believe quote of the week----
VoiceNet: (214) 250-3311 Data: -6272 | Nancy Reagan on Artifical Trish:
InterNet: j...@rpp386.Dallas.TX.US | "Just say `No, Honey'"
UucpNet : <backbone>!killer!rpp386!jfh +--------------------------------------

Henry Spencer

Nov 16, 1988, 2:51:49 PM
In article <1988Nov15.1...@utzoo.uucp> I wrote:
>... The popular software distribution from a certain
>university in southern California...

Okay, so maybe Berkeley is in northern California. I never did pay much
attention to foreign geography... :-)

Doug Gwyn

Nov 16, 1988, 4:40:46 PM
In article <2...@popvax.harvard.edu> moh...@popvax.UUCP (R06400@Mohamed Ellozy) writes:
-This is what irritates the living daylights out of so many of us.
-He "knows" of at least three other such holes. He is thus more
-learned, perhaps even wiser, than we are.
- BUT WHAT THE HELL ARE YOU DOING TO GET THEM CLOSED???
-Wizards who "know" about problems and pride themselves about it, but
-do nothing, are little better than those who maliciously exploit them.

The BSD developers know of all three holes and have published fixes for
two of them. BRL's network host tester will probe for them and inform
system administrators if they have these holes.

I'm waiting for an apology.

Doug Gwyn

Nov 16, 1988, 4:52:47 PM
The problem is, ethics and legality have little logical connection
with each other. One does not solve an ethical problem by passing
crime laws. To take Mumaugh's example of playing "rogue", it IS
technically a crime to do so with Federal facilities. However I
am sure that this has not much deterred people from doing it. And
one might wonder whether it is even unethical, much less something
criminal. If the Federal bureaucracy were properly based on
hierarchical authority/responsibility/accountability, then when a
supervisor decided that such "abuse" was benign or even beneficial,
it should be allowed. Unfortunately that is not the way the Civil
Service operates, especially in peacetime.

No, ethics and morality need to be self-motivated.

John B. Nagle

Nov 17, 1988, 12:00:13 AM

The legal issues will be interesting. It's not at all clear that
Morris ever "accessed" a Government computer. He may have induced computers
owned by others to do so, but the legal implications of such an act are
not clear. The case, assuming it ever comes to trial, will break
significant new legal ground.

John Nagle

j...@cs.brown.edu

Nov 18, 1988, 7:16:02 AM
There are a couple other points where problems similar to gets()
overflowing its buffer might arise. Normal usage of scanf() and
fscanf() can lead to the same problem if trying to read a string
in from someplace. It is easy to specify the buffer size in the
format, but I have rarely seen this done.

For setuid programs, curses leaves this same type of hole open
with several of its input routines. There are routines like both
gets() and scanf().

The issue of strcpy() and sprintf() can be worked around, but read
code that uses them and you will find that most programmers do not
put in the careful checks to make sure that the buffer is not
overrun. Maybe a good reminder of this problem is needed to get people
to clean up.

Jim Bloom
Brown University
j...@cs.brown.edu
uunet!brunix!jb

Sean McLinden

Nov 18, 1988, 7:57:07 AM
In article <89...@smoke.BRL.MIL> gw...@brl.arpa (Doug Gwyn (VLD/VMB) <gwyn>) writes:
:The problem is, ethics and legality have little logical connection

:with each other. One does not solve an ethical problem by passing
:crime laws.
:[deleted]
:No, ethics and morality need to be self-motivated.

Possibly, but social consciousness is learned. Children aren't born with
a sense of what is right and wrong; they are educated in that area. Most
of the education comes from personal experience: you touch a hot stove
only once. Insofar as what harms other people, we start out with a system
of rules which are replaced by reason when the child has enough experience
to make sense of it. An example is respect for personal property. Have you
ever known a three year old who DIDN'T think that everything was his/hers
to play with? Until they can appreciate the concept of individuality and
stop defining the world in terms only of their own existence, children
cannot understand that some things in their world are other people's
personal property and should be treated accordingly. This is learned,
it is not divined by the soul.

One problem (sic) with an open academic computing environment is the
fact that real world experience does not contain enough parallels to
allow people to reason about appropriate behavior. At least one can
say that if they do exist they are not obvious to everyone. There is
a perception that whatever a (computer) system allows you to do is
acceptable ("If I'm not allowed to run 32 processes simultaneously
why is MAXPROC defined to be 32?"; "If it isn't 'fair' for me to fire
up 12 LISP jobs in the background why does the shell support '&' ?").

There are also less obvious consequences of behavior that need to be
taught. The solitary programmer often has no knowledge of the administrative
issues surrounding the operation of a facility and the allocation of
resources in that community. How many people who have access to ARPANET
have read the ARPANET policy manual (how many copies of it are there
at YOUR institution)? Many rules of conduct in a programming environment
develop from the experience of people who functioned, for a time, in
a society without such rules. Before British colonialism, much of the
U.S. wilderness was lawless. Social rules and laws evolved from that
pioneer spirit because someone determined that these rules would be
needed in order to support a society. In many cases, generations of
experience were needed before an appropriate formalism existed.

I would agree with the claim that you don't make a person behave
ethically by exposing them to ethics. But you can, at least, provide
a background which will allow them to understand why certain social
conventions exist. Many of these would not be obvious to everyone,
which is the justification for doing it in the first place.

Sean McLinden
Decision Systems Laboratory

Larry Campbell

Nov 18, 1988, 1:10:56 PM
In article <1988Nov15.1...@utzoo.uucp> he...@utzoo.uucp (Henry Spencer) writes:
}
}This is not to say that any random commercial organization, like, say,
}one whose name has three initials and an "&" in it, will *necessarily*
}do better. But those people can, in theory, afford to spend some money
}on quality assurance...

Put another way, companies whose business is the sale and support of
software can't afford NOT to spend money on quality assurance.
--
Larry Campbell The Boston Software Works, Inc.
camp...@bsw.com 120 Fulton Street
wjh12!redsox!campbell Boston, MA 02146

Larry Campbell

Nov 18, 1988, 1:17:22 PM
In article <39...@ncrcae.Columbia.NCR.COM> rog...@ncrcae.Columbia.NCR.COM (Roger Collins) writes:
}
}The single most shocking realization I had upon entering the professional
}world after student life (Univ. of Florida, 1987) was that programmers
}credited with writing the BEST operating system in the world wrote
}some of the worst code I had every seen.

The BEST operating system in the world? Which one is that? Got your
asbestos suit on? "Go ahead, punk, make my day..."

Seriously, if I'm correct in guessing that you're talking about UNIX, I don't
think anyone -- even anyone from AT&T -- would claim with a straight face
that UNIX is the most reliable, maintainable, and well-coded
operating system in the world. It is probably the most portable, probably
the most flexible, and probably the most fun for programmers to play with,
but far from the most elegantly written.

Tony Nardo

Nov 18, 1988, 4:17:01 PM
In article <89...@smoke.BRL.MIL> gw...@brl.arpa (Doug Gwyn (VLD/VMB) <gwyn>) writes:
>In article <2...@popvax.harvard.edu> moh...@popvax.UUCP (R06400@Mohamed Ellozy) writes:
>-This is what irritates the living daylights out of so many of us.
>-He "knows" of at least three other such holes. He is thus more
>-learned, perhaps even wiser, than we are.
>- BUT WHAT THE HELL ARE YOU DOING TO GET THEM CLOSED???
>
>The BSD developers know of all three holes and have published fixes for
>two of them. BRL's network host tester will probe for them and inform
>system administrators if they have these holes.

I don't mean to sound facetious, but I seem to recall some news article
mentioning that there were 60,000+ nodes on the Internet. Let's assume that
only 5% of these systems use some flavor of 4.* BSD. Let's also assume that
only 40% of those systems have administrators who wish to have those holes
identified and (possibly) plugged. Does BRL have the facilities to test 1200+
nodes before some other clever person develops a copycat "infection"? Or even
distribute a "hole test kit" to that many sites?

There *must* be a better way to distribute information on how to check for
these holes than to have every Internet site queue up for BRL's test...

Tony

P.S. To Mohamed: if you discovered one of these holes, and realized that
a second worm could very easily be written to exploit it, what would
*you* do?

Actually, anyone may feel free to answer this. Please reply to me
by E-mail. I'll attempt to summarize.

==============================================================================
ARPA, BITNET: t...@aplcomm.jhuapl.edu
UUCP: {backbone!}mimsy!aplcomm!trn

"Always remember that those who can, do, and that those who can't, teach. And
those who can't teach become critics. That's why there're so many of them."
PORTRAIT OF THE ARTIST AS A YOUNG GOD (Stephen Goldin)
==============================================================================

Joseph E Poplawski

Nov 18, 1988, 9:38:38 PM
In article <48300017@hcx3> g...@hcx3.SSD.HARRIS.COM writes:
>Written 5:40 pm Nov 8, 1988 by sc...@attcan.UUCP (Scott MacQuarrie)
>> There is a product available from AT&T's Federal Systems group called
>> MLS (Multi-Level Security) which provides B1-level security in a System V
> Release 3.1 environment. I have seen the product on a 3B2; its availability
>> from other vendors would probably require work by those vendors.
>
>It did. It's done. It's called CX/SX.

Can anyone post more information on what these products do to increase system
security? Do these require source licenses?

-Jo

-------------------------------------------------------------------------------
| Joseph E Poplawski (Jo) US Mail: 1621 Jackson Street |
| Cinnaminson NJ 08077 |
| UUCP:..!rutgers!rochester!moscom!telesci!fantasci!jep |
| ..!princeton!telesci!fantasci!jep |
| ..!pyrnj!telesci!fantasci!jep Phone: +1 609 786-8099 home |
-------------------------------------------------------------------------------
| He who dies with the most toys wins! |
-------------------------------------------------------------------------------
| Copyright (C) 1988 Joseph E Poplawski All rights reserved. |
-------------------------------------------------------------------------------

David Collier-Brown

Nov 18, 1988, 10:02:08 PM

>>In article <17...@glacier.STANFORD.EDU> j...@glacier.UUCP (John B. Nagle) writes:
>>>I suggest that the security mailing list be posted to a newsgroup,
>>>but with a 60-day delay.
>
From article <3...@tron.UUCP>, by mo...@tron.UUCP (Harvey R Moran):>
> I wonder how many more people out there believe that sites without
> access to the security mailing list (or possibly even USENET) should
> have their risks increased pretty significantly? How about us binary
> liscense sites?
>

Well, consider two points:

1) If you're not on the net, and preferably don't support
async communications, your exposure to communications-related
attacks is not significantly affected.
2) Binary sites get patches too: my Sun comes with patches
printed on paper, for me to apply the hard way.

The suggestion of a 60-day timeout is by no means a cure-all. It
is a heuristic to improve the general case while minimizing impact
upon other cases.

--dave

Brandon S. Allbery

Nov 20, 1988, 1:05:57 PM
As quoted from <13...@princeton.Princeton.EDU> by a...@olden.uucp (Adam L. Buchsbaum):
+---------------

| In article <85...@rpp386.Dallas.TX.US> j...@rpp386.Dallas.TX.US (John F. Haugh II) writes:
| >It would be so nice if someone would undertake a security audit to
| >insure that work other college students did, which *is* currently
| >in production, doesn't contain any surprises.
|
| Being just an ignorant graduate student myself, I can't figure out
| whether this implies that all college students are suspect, anyone who
| is not in college is not suspect, or both? Perhaps John F. Haugh II
| could clarify this for me?
+---------------

You misunderstand; he's not talking about RTMorris, he's talking about the
kind of people who wrote sendmail, and fingerd, and other programs that
might have inadvertent security holes in them. And we've *all* done it at
one time or another. An independent audit of "important" code is a good
idea.

++Brandon
--
Brandon S. Allbery, comp.sources.misc moderator and one admin of ncoast PA UN*X
uunet!hal.cwru.edu!ncoast!allbery <PREFERRED!> ncoast!all...@hal.cwru.edu
allb...@skybridge.sdi.cwru.edu <ALSO> all...@uunet.uu.net
comp.sources.misc is moving off ncoast -- please do NOT send submissions direct
Send comp.sources.misc submissions to comp-sources-misc@<backbone>.

John Chambers

Nov 20, 1988, 11:58:29 AM
In article <89...@smoke.BRL.MIL>, gw...@smoke.BRL.MIL (Doug Gwyn ) writes:
> The problem is, ethics and legality have little logical connection
> with each other. One does not solve an ethical problem by passing
> crime laws. To take Mumaugh's example of playing "rogue", it IS
> technically a crime to do so with Federal facilities.

And it's also a violation of almost all employers' rules (including
the government) for an employee to have a picture of his/her family
members on their desk. This is, after all, a use of the employer's
property for purposes of personal entertainment.

Ask any lawyer about the meaning of the phrase "De minimis non curat
lex". (Also, while you're at it, ask for the well-known limerick that
ends with that line.)

--
John Chambers <{adelie,ima,maynard,mit-eddie}!minya!{jc,root}> (617/484-6393)

[Any errors in the above are due to failures in the logic of the keyboard,
not in the fingers that did the typing.]

Joseph S. D. Yao

Nov 22, 1988, 10:41:41 AM
In article <4...@auspex.UUCP> g...@auspex.UUCP (Guy Harris) writes:
>>Excuse ME, but the last four lines of my SunOS 4.0 distribution tape
>>password file are:
>> +::0:0:::
>> ::0:0:::
>> ::0:0:::
>> ::0:0:::
> ... All of them had
> +:
>as the last line in the password file ...
>I tried "passwd" ...
>turned it into
> +::0:0:::

A lot of people - and some editor programs - have the terrible habit
of leaving a blank line at the end of a file after editing it. The
'passwd' program, at least all versions that I know of, tends to turn
this into the offending line. This happens because getpwent() returns
a blank pwd entry, and putpwent() or the printf() used insert the
colons. I'd suggest that all getpwent()'s skip over blank lines
completely.

Joe Yao js...@hadron.COM (not yet domainised)
hadron!jsdy@{uunet.UU.NET,dtix.ARPA,decuac.DEC.COM}
arinc,att,avatar,blkcat,cos,decuac,dtix,\
ecogong,empire,gong,grebyn,inco,insight, \!hadron!jsdy
kcwc,lepton,netex,netxcom,phw5,rlgvax, /
seismo,sms,smsdpg,sundc,uunet /

Joseph S. D. Yao

Nov 22, 1988, 10:47:48 AM
In article <17...@adm.BRL.MIL> r...@nav.icst.nbs.gov (Root Boy Jim) writes:
>I can see it now, a paper entitled `Local Variables Considered Harmful'.

To mis-quote Tom Lehrer,
When properly mis-used, everything will lose.

Joe Yao

Snoopy T. Beagle

Nov 29, 1988, 6:48:09 PM
In article <1988Nov15.1...@utzoo.uucp> he...@utzoo.uucp (Henry Spencer) writes:

| The popular software distribution from a certain

| university in southern California is a good example of interesting ideas
| often marred by first-cut [i.e. poorly thought out, messy, sometimes
| incomplete] designs and implementations.

| This is not to say that any random commercial organization, like, say,


| one whose name has three initials and an "&" in it, will *necessarily*
| do better. But those people can, in theory, afford to spend some money

| on quality assurance. Universities generally can't.

Does this mean I should "rm -rf cnews" rather than trying to get it to
build? :-) Can I trust software from a certain university in eastern
Canada? :-)

These days, a vendor is likely to be pushing both hardware and software
out the door as soon as possible so that they can rake in the bucks for
whizzy new feature foobar before their competitor beats them to it. They
may very well argue that they can't spend any more time/money on quality.

If you want better quality, you need to get customers to demand it.
Customers with large budgets.

It isn't who you work for, it is your state-of-mind that counts. Tools like
code inspections can help, but they may not buy you much if you're just going
through the motions.
_____
/_____\ Snoopy
/_______\
|___| tektronix!tekecs!sopwith!snoopy
|___| sun!nosun!illian!sopwith!snoopy

Henry Spencer

Dec 1, 1988, 4:36:51 PM
In article <7...@sopwith.UUCP> sno...@sopwith.UUCP (Snoopy T. Beagle) writes:
>| This is not to say that any random commercial organization, like, say,
>| one whose name has three initials and an "&" in it, will *necessarily*
>| do better. But those people can, in theory, afford to spend some money
>| on quality assurance. Universities generally can't.
>
>Does this mean I should "rm -rf cnews" rather than trying to get it to
>build? :-) Can I trust software from a certain university in eastern
>Canada? :-)

You pays your money and you takes your chances! :-)

Some people can write good software without a QA group standing over them
with a club. Some can't. If there *is* a club-equipped QA group, the odds
of getting consistently good software are better. If there isn't, as in
universities, much depends on who wrote the stuff, and on whether they got
out on the right side of bed that morning. (Even I, normally the absolute
pinnacle of programming perfection, have been known to produce code with
occasional trivial, unimportant flaws on a bad day. :-) :-) :-) :-))

>These days, a vendor is likely to be pushing both hardware and software
>out the door as soon as possible so that they can rake in the bucks for
>whizzy new feature foobar before their competitor beats them to it. They
>may very well argue that they can't spend any more time/money on quality.

Yes, unfortunately, some QA departments come with a leash rather than a
club as standard equipment...
--
SunOSish, adj: requiring | Henry Spencer at U of Toronto Zoology
32-bit bug numbers. | uunet!attcan!utzoo!henry he...@zoo.toronto.edu
