Believable Stupidity

Phil Goetz
Mar 23, 1998

How to Be Stupid

by Phil Goetz, March 1998

I. We Are Eliza

Every year the Loebner competition pits different natural-language
understanding strategies against each other, as teams of researchers
compete to see whose computers can hold conversations the most like humans.
Every year the sophisticated programs with complex knowledge representations
and reasoning and planning strategies lose out to some variant of Eliza,
a program Joseph Weizenbaum created in 1966.

Eliza works basically like this: The computer has a list of templates,
which are generalized patterns that sentences can match. Each template
has a transformation which specifies a way to rearrange and add things to
or subtract things from any sentence that matches that template.
Every time you give it a sentence, it finds the first match in the
list, does the transformation, and prints the result. No memory,
no understanding, nothing but stimulus-response. For example,
you might have these templates and transformations:

Template                    Transform

I think $1.                 Why do you think $1?
I $1 don't know $2.         You sound uncertain.
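As a rough sketch, templates like the two above can be implemented as regular expressions whose capture groups stand in for the $n slots, paired with fill-in strings. The rule list, regexes, and default reply below are my own illustration of the idea, not Weizenbaum's actual code:

```python
import re

# Each template is a regex whose groups play the role of the $n slots;
# each transform is a format string that reuses those groups.
# The first matching rule wins, exactly as in the table above.
RULES = [
    (re.compile(r"I think (.+?)\.?$", re.I), "Why do you think {0}?"),
    (re.compile(r"I (.*?)don't know (.+)", re.I), "You sound uncertain."),
]

def respond(sentence):
    for pattern, transform in RULES:
        match = pattern.search(sentence)
        if match:
            return transform.format(*match.groups())
    return "Tell me more."   # fallback when no template matches

print(respond("I think templates are enough."))
# -> Why do you think templates are enough?
```

No memory, no parsing beyond the pattern match: the whole "conversation" is one table lookup per sentence.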

People have puzzled for years over why it is that Eliza still simulates
human conversation better than sophisticated knowledge representation
and logical reasoning. Three decades of research in computational
linguistics have not given us an answer. A few hours of conversation
at your local bar, however, suggests an answer: People are more
like Eliza than they are like sophisticated knowledge representation
and reasoning systems.

A lot of the people I've met can be modelled pretty well with Eliza.
Consider the following scripts:

Template                    Transform

You could $1.               Don't tell me what to do!

Yes, but $1.                Why are you always criticizing
                            everything I say?

$1 is so cool.              Like, you are just sad. $1 is lame.

$1 lower taxes $2 |         Hey! I'll vote for him!
$1 our children $2 |
$1 this great nation $2

$1 Alaska $2. Did I ever tell you 'bout the time I was
up in Alaska hunting moose with Clem? Think it was
four, five years ago, that year we had the frost
that killed all Molly's orange trees she'd worked so
hard on. I told her the damned things wouldn't grow
this far north. So anyway, there we were in
Alaska...

$1 Clinton $2. Bill Clinton is the biggest goddamn
hypocrite in the U. S. of A. Supposed to be out
leading the country and here he's having floozies
go down on him in the Oval Office. God damn.
It's no surprise them damn Japanese are whupping
our asses. Hey, Mel, pour me another.


II. Stupidity Makes the World Go Round

I was thinking about how to write Eliza scripts or something else for
an interactive drama system that could generate responses that would
convince the user that the system understood. Then it occurred to me:
Most of the drama in my life doesn't come from the intelligence and wisdom
of the people around me, but from their stupidity. Person A says one
thing. Person B filters it through their prejudices and preconceptions
about person A, adds in a few assumptions to make it all gel better,
then gives person C a summary. Eventually word gets back to person A,
who concludes person B is maliciously making up lies about A to further
certain evil intentions that person A has always suspected person B of
harboring. Instant drama.

But -- as we know from interactions with unsuccessful computer programs --
just any old stupidity won't work. Simple ignorance or failure to
understand results in programs that say "Huh?" and "I'm not in the mood
to talk right now" a lot. To create drama, we need to push the stupidity
envelope, to create agents that are aggressively stupid. Whenever
they are confronted with a user action which makes no sense, they
should make assumptions until it makes sense.

The trick is to produce /believable stupidity/ by making our agents
stupid in the same ways that people are stupid. To do this, think about
the misunderstandings you've had with other people, and what caused them.
From my experience, I suggest the following

Principles of Believable Stupidity:

1. Have preconceived notions about every other character: a label,
a category, an expected way they will act, a model of their goals and
motivations. If character A does something too far out of line with
character B's categorization of A, B must have a set of other categories
and expectations ready, and must quickly squeeze A into one of them.
Specifically, for each character, have ready to hand a list of suspicions
about that character which can be used to explain any action.

2. Assume that everything the other character does is directly related
to you.

3. Never attribute to forgetfulness, incompetence, coincidence, or a lack
of information what can be explained by hatred, dishonesty, or conspiracy.

4. When confronted with conflicting information, never perform inference
to find a belief whose alteration could resolve the conflict. Only
question one of your own beliefs if it is directly contradicted by
physical evidence. To do otherwise would not be believable.


Here is an imaginary interaction with an intelligent agent:

Player: Moe, what is the weather like?

Moe: I don't know what the weather is like.

Player: Moe, where is Curly?

Moe: Curly is in the closet.


Here is the same scenario played out with an aggressively stupid agent:

Player: Moe, what is the weather like?

Moe: Why do you want to know?

Player: I want to take a walk.

Moe [matching on "I want $1 walk."]: Hah! You want to go see Larry and
tell him what you learned from listening behind my door. But I'm on to you!

Player: Moe, where is Curly?

Moe [matching on "Curly" in the context that the "tell Larry" suspicion
is active]: So Curly's in on it too, is he?


The second interaction is clearly more dramatic.

The nice thing is that these kinds of responses can probably be
implemented partly with Eliza-like scripts, which just match a few words
in a sentence and assume the rest does what it is supposed to.
There will be some knowledge representation requirements, but it shouldn't
amount to more than remembering which labels an agent has put on each
character, providing enough reasoning capabilities (or perhaps simply a
list of cues) to confirm each suspicion, and a list of which suspicions
are currently active and which have been confirmed. The ability to
retract confirmed suspicions should not be necessary.
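As a sketch of that bookkeeping (keyword cues that raise suspicions, which, once confirmed, are never retracted), here is a toy Python version. The cue words, accusations, and class design are all hypothetical, just to show how little machinery the scheme needs:

```python
import re

class StupidAgent:
    """Aggressively stupid agent: Eliza-style keyword matching plus a
    record of which suspicions are active and which are confirmed."""

    def __init__(self, suspicions):
        # suspicions: cue word -> accusation to utter when the cue appears
        self.suspicions = suspicions
        self.active = set()       # suspicions raised so far
        self.confirmed = set()    # never retracted (principle 4)

    def hear(self, sentence):
        words = re.findall(r"\w+", sentence.lower())
        for cue, accusation in self.suspicions.items():
            if cue in words:
                if cue in self.active:       # a second mention "confirms" it
                    self.confirmed.add(cue)
                self.active.add(cue)
                return accusation
        return "Why do you want to know?"    # always assume a motive

moe = StupidAgent({
    "walk": "Hah! You want to go see Larry. But I'm on to you!",
    "curly": "So Curly's in on it too, is he?",
})
print(moe.hear("I want to take a walk."))   # raises the "walk" suspicion
print(moe.hear("Moe, where is Curly?"))     # raises the "curly" suspicion
```

The whole "theory of mind" here is one dictionary and two sets, which is roughly the point: believable paranoia is cheap.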


Thoughts? Template matches?

Phil Go...@zoesis.com

"Give a man a fire and he's warm for a day --
set fire to him, and he's warm for the rest of his life."

Daryl McCullough
Mar 23, 1998

go...@cs.buffalo.edu says...

>II. Stupidity Makes the World Go Round
>
>I was thinking about how to write Eliza scripts or something else for
>an interactive drama system that could generate responses that would
>convince the user that the system understood. Then it occurred to me:
>Most of the drama in my life doesn't come from the intelligence and wisdom
>of the people around me, but from their stupidity...

[stuff deleted]

Interesting discussion. Did you know that there was a conversation
program called "PARRY" (for "paranoid") that was an attempt to
simulate the reasoning of a person suffering from paranoia? That sounds
a little like your artificial stupidity idea.

Daryl McCullough
CoGenTex, Inc.
Ithaca, NY

Jeff Hatch
Mar 23, 1998

Phil Goetz wrote:
>
> How to Be Stupid
> by Phil Goetz, March 1998

I LOLed so hard that a brother descended the stairs and demanded a
printout. Even the sig is superb. This is the best thing I've read in
this newsgroup in months. ;-)

-Rúmil

Alan Conroy
Mar 24, 1998

On 23 Mar 1998 22:17:43 GMT, go...@cs.buffalo.edu (Phil Goetz) wrote:

> How to Be Stupid
>
> by Phil Goetz, March 1998

Excellent! It's not often that a post is both applicable to IF and to
real life. Have you considered writing a book? Maybe, "Stupidity for
Dummies"? ;)

A couple of issues.

As much as I like it, wouldn't being surrounded entirely by such
neurotic individuals be a bit wearing after a while? While we all run
into such situations, that is not the sum total of our interactions
(at least, I hope not). Let me therefore suggest another possible
attribute: the tendency to hang out with people of the same opinion. This
could have an entirely different effect - one of commiseration.

You:
I think Clinton is a fine, upstanding man.

Response:
Yes, other people are much too hard on him!


or

You:
Clinton is a low-life with the morals of an alley cat.

Response:
I agree. He should be taken out and shot.


Perhaps there could be a degree of agreement/disagreement so that it's
not always only one way or the other.

And then let's not forget the human tendency to project our own
personality onto others and then completely warp what they are, in
fact, saying.

You:
I'm going out to buy some coffee. Do you want anything?

Response:
Just what do you mean by that?

You:
I meant exactly what I said.

Response:
Only a cruel, uncaring person could say something like that.

You:
What are you talking about? What did you think I said?

Response:
You know very well what you said.

and on and on...

- Alan Conroy

The views expressed in this post are mine and not necessarily those of my spouse, employer, government, or God.
But then again...

Adam J. Thornton
Mar 24, 1998

"Aggressive Stupidity" is going into my next project.

Adam
--
ad...@princeton.edu Cynical and drunk and boring someone in some dark cafe

Bill
Mar 24, 1998

Daryl McCullough wrote in message <6f6tc9$l...@drn.newsguy.com>...

>Interesting discussion. Do you know that there was a conversation
>program called "PARRY" (for "paranoid") that was an attempt to
>simulate the reasoning a person suffering from paranoia. That sounds
>a little like your artificial stupidity idea.

If I remember from my AI classes ... PARRY was just an Eliza with a
different set of patterns.

By the way, I think this is a GREAT approach to conversations in adventures.
Also it fits in extremely well with the capabilities of the speech
recognition systems (Dragon et al.) out there.

Bill

Michael Straight
Mar 25, 1998

On 23 Mar 1998, Phil Goetz wrote:

> II. Stupidity Makes the World Go Round

[...]


> The trick is to produce /believable stupidity/, by making our agents be
> stupid in the same ways that people are stupid. To do this, think about
> the misunderstandings you've had with other people, and what caused them.
> From my experience, I suggest the following
>
> Principles of Believable Stupidity:

[snipped]

This post isn't really about computers, is it?

SMTIRCAHIAGEHLT


Michael Straight
Mar 25, 1998

On Tue, 24 Mar 1998, Bill wrote:

> If I remember from my AI classes ... PARRY was just an Eliza with a
> different set of patterns.

Speaking of Eliza, are there any works of IF where one of the NPC's *is*
Eliza? Meaning the game uses Eliza-like code to talk to the player just
like Eliza?

I guess the answer might be a spoiler.

SMTIRCAHIAGEHLT


Allen Garvin
Mar 25, 1998

In article <Pine.A41.3.95L.980325...@login1.isis.unc.edu>,

Michael Straight <stra...@email.unc.edu> wrote:
>
>On Tue, 24 Mar 1998, Bill wrote:
>
>> If I remember from my AI classes ... PARRY was just an Eliza with a
>> different set of patterns.
>
>Speaking of Eliza, are there any works of IF where one of the NPC's *is*
>Eliza? Meaning the game uses Eliza-like code to talk to the player just
>like Eliza?

There is an Eliza on a lot of LPmuds. In fact, you get healed by talking
to her. It's considered slightly amusing to get another talk-response
monster to wander in with Eliza and watch them go wild with back-and-forth
responses (like the zippy-eliza feature of emacs).


--
Allen Garvin kisses are a better fate
--------------------------------------------- than wisdom
eare...@faeryland.tamu-commerce.edu
http://faeryland.tamu-commerce.edu/~earendil e e cummings

Edan Harel
Mar 26, 1998

Michael Straight <stra...@email.unc.edu> writes:

>Speaking of Eliza, are there any works of IF where one of the NPC's *is*
>Eliza? Meaning the game uses Eliza-like code to talk to the player just
>like Eliza?

I had once considered making a game like that, where you can call a small
group of people from a telephone. You could talk to them normally and
they could have some limited memory to know what the subject was. Talking
to them could get them to trust you (or get suspicious), to tell you
certain things (i.e., clues if you were trying to solve a crime) and to
give you other people's phone numbers.

Also, while not strictly Eliza-like, the old Angelsoft games allow you
to say what you want without directing it at anyone in particular. For
example, instead of:

>Charlie, where were you this morning?

or

>ask Charlie for alibi

You could simply do the following

>Look

Charlie and Bob are here.

>Examine Charlie

Charlie looks back at you.

>Where were you this morning?

Charlie: None of your business.


Also, are there any games which, after you ask a question, actually
state your question?

ie:

>ask Charlie for alibi

"So, Charlie, where were you this morning at 11:37?" you ask. "We know you
weren't at your apartment or at work."

Charlie looks at you briefly before responding...

Edan
--
Edan Harel edh...@remus.rutgers.edu McCormick 6201
Research Assistant Math and Comp Sci Major Computer Consultant
USACS Member Math Club Secretary

Kenneth Fair
Mar 26, 1998

>Speaking of Eliza, are there any works of IF where one of the NPC's *is*
>Eliza? Meaning the game uses Eliza-like code to talk to the player just
>like Eliza?

Yes. Or there will be . . .

(I will not Avalon. I will not Avalon.)
--
KEN FAIR - U. Chicago Law | <http://student-www.uchicago.edu/users/kjfair>
Of Counsel, U. of Ediacara | Power Mac! | CABAL(tm) | I'm w/in McQ - R U?
"My vulva is wide opened, a cave of darkness where the people . . . hang out
at night . . . . There is music playing in my Womb." - Doctress Neutopia

Edan Harel
Mar 27, 1998

Michael Straight <stra...@email.unc.edu> writes:

>Speaking of Eliza, are there any works of IF where one of the NPC's *is*
>Eliza? Meaning the game uses Eliza-like code to talk to the player just
>like Eliza?

Forgot to answer this in my other post. Guardians of Infinity,
Star Trek: The Kobayashi Alternative and Star Trek: First Contact (the
text adventure) all had Eliza-type parsers (in a way). For GoI, you
command five people, and can tell them to do certain things. There's
actually a limited number of things you can say. You can say "Name,
go to <somebody> at <home/work> in <city> at <time>". Then they might
ask you what you want them to do there. You could say "stop them" or
"kill them" or "talk to them" or whatever. They might ask you some
yes or no questions too. I think at one point, you would have to tell
them something a little more complicated, so the parser might be more
advanced, but it seemed to me it would ignore any word not in its
dictionary and just use those remaining to "understand" it. If a certain
word was missing, they might ask something like "Where do you want
me to meet John F. Kennedy?" or "Who do you want me to meet in Dallas?"

The two Star Trek games (the second of which is more powerful and
allows using a menu to say certain things rather than typing) are
similar, though they allow a larger range of commands. For example,
you might say to Scotty "pick up phaser" or something. You could even
ask Spock "Tell me about McCoy" and he would respond with something
like "McCoy is the ship's alleged doctor." :)

John Elliott
Mar 27, 1998
: Speaking of Eliza, are there any works of IF where one of the NPC's *is*
: Eliza? Meaning the game uses Eliza-like code to talk to the player just
: like Eliza?

I can't test it right now, but I believe the palace guards in _The_Pawn_
were Eliza.

------------- http://www.seasip.demon.co.uk/index.html --------------------
John Elliott |BLOODNOK: "But why have you got such a long face?"
|SEAGOON: "Heavy dentures, Sir!" - The Goon Show
:-------------------------------------------------------------------------)

Edan Harel
Mar 27, 1998

j...@seasip.demon.co.uk (John Elliott) writes:

<snip>

>: Eliza? Meaning the game uses Eliza-like code to talk to the player just
>: like Eliza?

> I can't test it right now, but I believe the palace guards in _The_Pawn_
>were Eliza.

You mean they were freelance psychologists on the side? :-)

Edan
