0. A robot must not harm humanity or through inaction allow humanity to
come to harm.
1. A robot must not harm a human or through inaction allow a human to come
to harm except in situations which conflict with the zeroth law.
2. A robot must obey the commands given by a human unless it conflicts with
the zeroth or first law.
3. A robot must protect its own existence unless it conflicts with the
zeroth, first or second laws.
The original poster also asked when they first appeared. I was waiting for
someone else to answer before I stuck my head out. I'll go ahead and say
that Laws One through Three first appeared in 1941 in Astounding Science
Fiction. "Liar!" and "Reason" were the first of the short robot stories
to use the Laws, and these appear later in _I, Robot_.
The Zeroth Law appears in _Robots_of_Dawn_ in 1983.
Please correct me if I am wrong on these dates or titles. I don't doubt that
I am wrong on these.
Carl
--
Leslie Carl Seiler | Texas A&M Geography
lcs...@tamsun.tamu.edu | This sig for identification purposes only.
-------------------------------------------------------------------------
Before everyone comments "But, there are only *3* laws!", I will add
some commentary. Minuscule spoilers?
> 0. A robot must not harm humanity or through inaction allow humanity to
> come to harm.
This law was added after thousands of years of robot thought on how to
better accomplish Law #1, and was actually implemented BY robots
(namely, R. Daneel Olivaw).
> 1. A robot must not harm a human or through inaction allow a human to come
> to harm except in situations which conflict with the zeroth law.
> 2. A robot must obey the commands given by a human unless it conflicts with
> the zeroth or first law.
> 3. A robot must protect its own existence unless it conflicts with the
> zeroth, first or second laws.
These are the original three laws that were embedded in robots from the
very beginning. Of course, the original three did not mention the zeroth
(it is spelled "zeroth" and not "zeroeth") law.
Hopefully, that clarifies the situation a bit.
// Dan
--
Daniel Quinlan <qui...@spectrum.cs.bucknell.edu>
As mentioned before, possible minor spoiler...
In article <QUINLAN.93...@scarlet.cs.bucknell.edu>,
Daniel Quinlan <qui...@spectrum.cs.bucknell.edu> wrote:
>> 0. A robot must not harm humanity or through inaction allow humanity to
>> come to harm.
>
>This law was added after thousands of years of robot thought on how to
>better accomplish law #1 and was actually implemented BY robots
>(namely, R. Daneel Olivaw)
If I recall correctly, the zeroth law was devised jointly by R. Giskard
Reventlov and R. Daneel Olivaw, though I may be misremembering.
>As mentioned before, possible minor spoiler...
>In article <QUINLAN.93...@scarlet.cs.bucknell.edu>,
>Daniel Quinlan <qui...@spectrum.cs.bucknell.edu> wrote:
>>> 0. A robot must not harm humanity or through inaction allow humanity to
>>> come to harm.
>>
>>This law was added after thousands of years of robot thought on how to
>>better accomplish law #1 and was actually implemented BY robots
>>(namely, R. Daneel Olivaw)
>If I recall correctly, the zeroth law was devised jointly by R. Giskard
>Reventlov and R. Daneel Olivaw, though I may be misremembering.
It was mainly Daneel who invented it, but Giskard was with him at the time.
Giskard, however, couldn't "live" with it.
--
-Christian Almgren Internet: d93...@nada.kth.se
Who am I? Why am I here? Forget the questions! Someone gimme another beer!
"Meatloaf"
|> The Zeroth Law appears in _Robots_of_Dawn_ in 1983.
|>
|> Please correct me if I am wrong on these dates or titles. I don't doubt that
|> I am wrong on these.
|>
|> Carl
|>
I think the zeroth law appeared in _Robots and
Empire_. If I remember correctly, _The Robots of Dawn_
involved Baley and Giskard (with Daneel playing a
peripheral role) and set up Amadiro's hatred of Earth.
In _Robots and Empire_, Elijah Baley is long dead (and
appears in a flashback) and now Daneel and Giskard
hold long conversations on the three laws, psychohistory,
mob statistics, etc. Giskard is really the one who paves
the way for Seldon AND Gaia.
Anand Rangarajan
rangaraj...@cs.yale.edu
Was it "Runaround" in which two scientists stranded on Mercury(?) have a
malfunctioning robot, so they need to ride a couple of larger, older models
out onto the surface of the planet to see what the problem is? And they
find that the fumes from the material that the original robot was sent to
gather are screwing up its positronic brain? It ends up in an infinite
loop - it walks away from the fumes until it regains control of its brain,
and then its original order kicks back in, so it approaches again....
I think the introductions to my story collections have Asimov claiming that
this story was the first in which he used the laws.
>Carl
Duane
"Reason" and "Liar!" are copyright 1941, and "Runaround" is copyright 1942,
and that is why I said those two were the first stories. It is possible
that Asimov wrote "Runaround" first.
I read this story over the summer and can't remember the new laws well
enough to post them now; maybe later if there is interest.
(Maybe I'll have to reread it. If I ever get caught up on my homework...)
Lance Shaw
I have read _Isaac Asimov's Caliban_ by Roger MacBride Allen [author of
_The Modular Man_].
> Anyway, the main story is about a new kind of robot brain that is based
>on the positronic brain so it doesn't have the three laws. It is a mystery
The positronic brain is replaced with the gravitonic brain. Here are
the New Laws of Robotics (pp. 214-215):
1) A robot may not injure a human being. [Human protection *from*
robots, not *by* robots.]
2) A robot must cooperate with human beings except where such
cooperation would conflict with the First Law. [Robot cooperation,
not obedience.]
3) A robot must protect its own existence, as long as such protection
does not conflict with the First Law. (Second Law not mentioned.)
[Robotic self-preservation.]
4) A robot may do anything it likes except where such action would
violate the First, Second, or Third Law. [Robotic freedom and
creativity.]
> Lance Shaw
The second (forthcoming) book in this series is entitled/subtitled _Inferno_.
Archie
--
Archie Medrano (amed...@euclid.ucsd.edu)
"The most exciting phrase in science, the one that heralds
new discoveries, is not 'Eureka' (I found it!) but
'That's funny...'" - Isaac Asimov
I just happen to have my copy within arm's reach, so allow me:
New First Law: A robot may not injure a human being.
Intent: People won't have to worry about being injured BY robots, but should
not expect robots to protect them constantly.
New Second Law: A robot must cooperate with human beings except where such
cooperation conflicts with the First Law.
Intent: "Cooperate", not "obey".
New Third Law: A robot must protect its own existence, as long as such
protection does not conflict with the First Law.
Intent: Note lack of reference to the second law. A robot will not
cooperate with a human to destroy itself.
New Fourth Law: A robot may do anything it likes except where such action
would violate the First, Second or Third Laws.
Intent: Play time.
>
> Lance Shaw
I wasn't too crazy about this book. At first, it struck me as a typical
"Oh-good-Asimov's-dead-now-I-can-use-all-of-his-ideas-about-robots" story
(what are some others... _Robots in Time_...). But it also just wasn't a
good story, IMHO.
Duane, who just realized he hasn't read _Robots and Empire_ yet
>0. A robot must not harm humanity or through inaction allow humanity to
> come to harm.
>1. A robot must not harm a human or through inaction allow a human to come
> to harm except in situations which conflict with the zeroth law.
>2. A robot must obey the commands given by a human unless it conflicts with
> the zeroth or first law.
>3. A robot must protect its own existence unless it conflicts with the
> zeroth, first or second laws.
And one little-used and non-canon law:
4. A robot must reproduce, unless said reproduction conflicts with the
zeroth, first, second, or third laws.
The fourth law is from a story in _Foundation's Friends_ called "The Fourth
Law of Robotics." I don't think it really works with the zeroth, because it
was devised by a smoking hippie robot. But it needed to be added to any
COMPLETE list of robot laws.
--
Eat, drink, and have sex, for tomorrow we may all have hypothalamic lesions.
>The Zeroth Law appears in _Robots_of_Dawn_ in 1983.
>
>Please correct me if I am wrong on these dates or titles. I don't doubt that
>I am wrong on these.
>
>Carl
>
The Zeroth Law first appears in _Robots and Empire_, Part V, Chapter 18.
regards, Andy
--
>> 0. A robot must not harm humanity or through inaction allow humanity to
>> come to harm.
>
>This law was added after thousands of years of robot thought on how to
>better accomplish law #1 and was actually implemented BY robots
>(Namely, Daneel R. Olivaw (sp?))
The Zeroth Law was actually conceived by R. Giskard Reventlov. It was
discussed with R. Daneel Olivaw and eventually passed to him near the time
of Giskard's (death?). It was the internal conflict over the implementation
of this Law that ultimately killed Giskard. You must remember that Giskard
was not designed to be as sophisticated as Daneel. Giskard was able to
comprehend this and decided that passing his ability, programming, and the
Law concept to Daneel would stand a better chance of success.
regards, Andy
--