
Humans hard-wired for morality?


ta

Aug 9, 2007, 4:42:52 PM
Research suggests morality is built in the brain

By SHANKAR VEDANTAM | Washington Post
May 29, 2007

Science opens new window on what it means to be good

WASHINGTON - The e-mail came from the next room.

"You gotta see this!" Jorge Moll had written. Moll and Jordan Grafman,
neuroscientists at the National Institutes of Health in Bethesda, Md.,
had been scanning the brains of volunteers as they were asked to think
about a scenario involving either donating a sum of money to charity
or keeping it for themselves.

As Grafman read the e-mail, Moll came bursting in. The scientists
stared at each other.

The results were showing that when the volunteers placed the interests
of others before their own, the generosity activated a primitive part
of the brain that usually lights up in response to food or sex.
Altruism, the experiment suggested, was not a superior moral faculty
that suppresses basic selfish urges but rather was basic to the brain,
hard-wired and pleasurable.

Their 2006 finding that unselfishness can feel good lends scientific
support to the admonitions of spiritual leaders such as St. Francis of
Assisi, who said, "For it is in giving that we receive." But it also
is a dramatic example of the way neuroscience has begun to elbow its
way into discussions about morality and has opened a new window on
what it means to be good.

Grafman and others are using brain imaging and psychological
experiments to study whether the brain has a built-in moral compass.
The results - many of them published in recent months - are showing,
unexpectedly, that many aspects of morality appear to be hard-wired in
the brain, most likely the result of evolutionary processes that began
in other species.

No one can say whether giraffes and lions experience moral qualms in
the same way people do because no one has been inside a giraffe's
head, but it is known that animals can sacrifice their own interests:
One experiment found that if each time a rat is given food, its
neighbor receives an electric shock, the first rat will eventually
forgo eating.

What the new research is showing is that morality has biological roots
- such as the reward center in the brain that lit up in Grafman's
experiment - that have been around for a very long time.

The more researchers learn, the more it appears the foundation of
morality is empathy. Being able to recognize - even experience
vicariously - what another creature is going through was an important
leap in the evolution of social behavior. And it is only a short step
from this awareness to many human notions of right and wrong, says
Jean Decety, a neuroscientist at the University of Chicago.

The research enterprise has been viewed with interest by philosophers
and theologians, but already some worry that it raises troubling
questions. Reducing morality and immorality to brain chemistry -
rather than free will - might diminish the importance of personal
responsibility. Even more important, some wonder whether the very idea
of morality is somehow degraded if it turns out to be just another
evolutionary tool that nature uses to help species survive and
propagate.

Moral decisions can often feel like abstract intellectual challenges,
but a number of experiments such as the one by Grafman have shown that
emotions are central to moral thinking. In another experiment
published in March, University of Southern California neuroscientist
Antonio Damasio and his colleagues showed that patients with damage to
an area of the brain known as the ventromedial prefrontal cortex lack
the ability to feel their way to moral answers.

When confronted with moral dilemmas, the brain-damaged patients coldly
came up with "end-justifies-the-means" answers. Damasio said the point
was not that they reached immoral conclusions, but when confronted by
a difficult issue - such as whether to shoot down a passenger plane
hijacked by terrorists before it hits a major city - these patients
appear to reach decisions without the anguish that afflicts those with
normally functioning brains.

Such experiments have two important implications. One is that morality
is not merely about the decisions people reach but also about the
process by which they get there. Another implication, said Adrian
Raine, a clinical neuroscientist at the University of Southern
California, is that society might have to rethink how it judges
immoral people.

Psychopaths often feel no empathy or remorse. Without that awareness,
people relying exclusively on reasoning seem to find it harder to sort
their way through moral thickets. Does that mean they should be held
to different standards of accountability?

"Eventually, you are bound to get into areas that for thousands of
years we have preferred to keep mystical," said Grafman, the chief
cognitive neuroscientist at the National Institute of Neurological
Disorders and Stroke. "Some of the questions that are important are
not just of intellectual interest, but challenging and frightening to
the ways we ground our lives. We need to step very carefully."

Joshua Greene, a Harvard neuroscientist and philosopher, said multiple
experiments suggest morality arises from basic brain activities.
Morality, he said, is not a brain function elevated above our baser
impulses. Greene said it is not "handed down" by philosophers and
clergy, but "handed up," an outgrowth of the brain's basic
propensities.

Moral decision-making often involves competing brain networks vying
for supremacy, he said. Simple moral decisions - is killing a child
right or wrong? - are simple because they activate a straightforward
brain response. Difficult moral decisions, by contrast, activate
multiple brain regions that conflict with one another, he said.

In one 2004 brain-imaging experiment, Greene asked volunteers to
imagine they were hiding in a cellar of a village as enemy soldiers
came looking to kill all the inhabitants. If a baby was crying in the
cellar, Greene asked, was it right to smother the child to keep the
soldiers from discovering the cellar and killing everyone?

The reason people are slow to answer such an awful question, the study
indicated, is that emotion-linked circuits automatically signaling
that killing a baby is wrong clash with areas of the brain that
involve cooler aspects of cognition. One brain region activated when
people process such difficult choices is the inferior parietal lobe,
which has been shown to be active in more impersonal decision-making.
This part of the brain, in essence, was "arguing" with brain networks
that reacted with visceral horror.

Such studies point to a pattern, Greene said, showing "competing
forces that might have come online at different points in our
evolutionary history. A basic emotional response is probably much
older than the ability to evaluate costs and benefits."

While one implication of such findings is that people with certain
kinds of brain damage might do bad things they cannot be held
responsible for, the new research could also expand the boundaries of
moral responsibility. Neuroscience research, Greene said, is finally
explaining a problem that has long troubled philosophers and moral
teachers: Why is it that people who are willing to help someone in
front of them will ignore abstract pleas for help from those who are
distant, such as a request for a charitable contribution that could
save the life of a child overseas?

"We evolved in a world where people in trouble right in front of you
existed, so our emotions were tuned to them, whereas we didn't face
the other kind of situation," Greene said. "It is comforting to think
your moral intuitions are reliable and you can trust them. But if my
analysis is right, your intuitions are not trustworthy. Once you
realize why you have the intuitions you have, it puts a burden on you"
to think about morality differently.

http://www.freenewmexican.com/news/62163.html

tg

Aug 10, 2007, 6:35:25 AM


What a mish-mash. Start with a definition of morality, then we can
talk. (I'm directing that to the various interviewees and the
reporter.)

Morality is a set of rules.
Empathy is a behavioral phenomenon.

But people keep conflating the two, since it is politically expedient.

-tg

> http://www.freenewmexican.com/news/62163.html


Art

Aug 10, 2007, 9:50:46 AM
On Thu, 09 Aug 2007 13:42:52 -0700, ta <pad...@nc.rr.com> wrote:

>The results were showing that when the volunteers placed the interests
>of others before their own, the generosity activated a primitive part
>of the brain that usually lights up in response to food or sex.

Tests of which areas of the brain "light up" under various
circumstances are interesting. However, the assumption
of causality made by the investigators is philosophically
suspect. For example, if the immaterial soul is in the driver's
seat and using the brain as an instrument, then _it_ may well
be the cause of the brain lighting up in certain areas. The
assumption that the brain is in the driver's seat may well be a
case of assuming the tail wags the dog.

Art
http://home.epix.net/~artnpeg

pico

Aug 10, 2007, 10:46:31 AM
Art wrote:

> Tests of which areas of the brain "light up" under various
> circumstances are interesting. However, the assumption
> of causality made by the investigators is philosophically
> suspect. For example, if the immaterial soul is in the driver's
> seat and using the brain as an instrument, then _it_ may well
> be the cause of the brain lighting up in certain areas.

Perhaps some tendencies are hardwired to help assure survival of the
species until it becomes enlightened enough to help itself, to see the
larger picture. Hard-wiring fits well within a teleological framework.

ta

Aug 10, 2007, 11:39:15 AM

I took the article to mean that "morality" is not a reason-based set
of rules and principles that we prescribe after much analysis and
contemplation (as religions claim), but rather is dictated by our
genetics. The "rules" are that we will do whatever makes us feel
good (that which we call "right") and avoid that which makes us feel
bad (that which we call "wrong").

I thought the article took a step toward taking the religion out of
morality, not the other way around.

> >http://www.freenewmexican.com/news/62163.html


GatherNoMoss

Aug 10, 2007, 12:07:04 PM
On Aug 10, 11:39 am, ta <padl...@nc.rr.com> wrote:

> I thought the article took a step toward taking the religion out of
> morality, not the other way around.


Yes, that's the whole point.

I note the near hysteria now of "science" in its quest to destroy
religion.

All truth seekers should be wearing a black band around their arms,
denoting this crisis in science. The crisis of the subordination of
truth in science to the interest of Political Correctness.

In the end these totalitarian rats will simply drop the charade and
outlaw religion as subversive and "damaging to the psychic development
of children".

zinnic

Aug 10, 2007, 12:32:03 PM

Science offers explanation, not "truth"!
I guess you are feeling panicky as your mysterious "truths" are
washed away by the on-rushing tide of neurological science.
Don't worry so much. After all it is only chemistry!
Zinnic.

tg

Aug 10, 2007, 2:09:26 PM
On Aug 10, 11:39 am, ta <padl...@nc.rr.com> wrote:

I thought religions claimed that morality was received wisdom from the
almighty. You are probably thinking of theologians who are
*translating* the Word Of God for the rest of us. (sarcasm alert ;-)

That aside, I don't get what you are trying to say. It is often the
case that what we call "wrong" is what makes us feel good.

-tg

> > >http://www.freenewmexican.com/news/62163.html


Day Brown

Aug 10, 2007, 4:27:30 PM
On Aug 10, 2:09 pm, tg <tgdenn...@earthlink.net> wrote:
> I thought religions claimed that morality was received wisdom from the
> almighty. You are probably thinking of theologians who are
> *translating* the Word Of God for the rest of us. (sarcasm alert ;-)
You need to expand your definition of 'religion' beyond the strawman
of Levantine faiths.

ta

Aug 10, 2007, 9:56:30 PM

True, true . . . I guess when I said religion I was thinking more
along the lines of objectivism. ;-)

> That aside, I don't get what you are trying to say. It is often the
> case that what we call "wrong" is what makes us feel good.

What we call "wrong" and what we know to be "wrong" are not always the
same. But still, underneath the rationalizations and moralizing, what
is "right" still exists; it's just subverted or twisted.

We know deep down that killing is wrong, as is stealing and
manipulating and cheating. But we often rationalize them. So the hard
wiring is still there, but it gets subverted.

So I was just saying that there appear to be certain behaviours that
we pretty much universally deem "right" or "wrong" (despite our
attempts sometimes to mask them in the name of "self-interest" or some
other political agenda).

> -tg
> > >http://www.freenewmexican.com/news/62163.html


pico

Aug 10, 2007, 10:15:55 PM
ta wrote:

> We know deep down that killing is wrong, as is stealing

I am skeptical of that. Murder has been socially acceptable for a long
time, especially in domestic affairs; acts of murderous passion have
been acceptable for hundreds, possibly millions of years. Stealing?
Well, as they say, stealing with the pen or court is acceptable to so many.

And let's not forget what humans do because it feels good even though
it is socially and personally destructive - compulsive or addictive
behavior, where they "know" it is bad but behave that way regardless.

Methinks the thing the scientists found has more to do with the set and
setting of the experiment than real life.
