Trolley Problem


Cary Cook

Nov 19, 2019, 7:28:18 PM
to bys-...@googlegroups.com

The “trolley problem” and its variants exist because some people are smart enough to see that morality is subject to arithmetic, and some aren’t.  Those who aren’t smart enough to see it are justified only if they are young or mentally defective.  Otherwise such people are refusing to acknowledge obvious truth for emotional reasons, and deserve to be offended for it.

 

Cary

 


 

Email

Nov 19, 2019, 10:57:18 PM
to bys-...@googlegroups.com
Hi Cary,

As usual, I perceive morality as being more nuanced than you appear to. Even ignoring the different weighting between "action" and "inaction" (varying degrees of personal responsibility), different life experiences weight the choices in this problem. A medical doctor who has worked under wartime triage conditions, for example, may find it easier to make rapid life-and-death decisions (based on mere numbers) than a Hollywood stuntman – who might need to see convincing proof that the five aren’t just stunt dummies before he actively kills the one real person.
 
The whole point of the trolley problem is to try to balance the math against the other relevant conditions in such a way that it divides individuals into camps, presumably with the intent of generating discussion. In real life, the "solution" is to use sufficient forethought to avoid situations where the only remaining choices are all potentially disastrous.

- Don



Cary Cook

Nov 20, 2019, 1:04:13 AM
to bys-...@googlegroups.com

Don,

 

This can be discussed on Facebook if you like:

https://www.facebook.com/cary.e.cook/posts/10217555573615888?notif_id=1574189037616211&notif_t=feedback_reaction_generic

 

I agree that different life experiences may weight the choices of the decision maker, and hence the morality of the intent of the act, but not the morality of the results of the act. 

There may also be any number of other possibilities to consider in addition to the math.

e.g.

You know the single person is of high moral quality, and the opposing 4 are criminals.

You know the single person is age 10, and the opposing 4 are age 90.

You don't know if any of the apparent people are stunt dummies.

Etc.

 

But assuming that the decision maker knows only the fact of one person vs. 4 persons, then morality is based only on math. Action and inaction are in principle morally equal decisions.

 

The point of the trolley problem may be to try to balance the math against other relevant conditions - until the problem is stated as: all other things being equal, what should the decision maker do?  Once clarified that way, the point of the problem is to get people to figure out objective morality.

 

Cary

 

 


 

Shane Fletcher

Dec 7, 2019, 5:58:03 PM
to bys-...@googlegroups.com
Cary, if morality is subject to arithmetic, and action and inaction are equally moral in principle, is it moral for a doctor to kill 1 healthy person, to save 4 people who need organ transplants?

Or, kill 1 person with a brain tumor prematurely, to save 4 people who need organ transplants now?

Shane

Cary Cook

Dec 7, 2019, 8:00:20 PM
to bys-...@googlegroups.com

Shane,

 

Thanks for this!

 

Moral RESULTS:

All other factors being equal, it is moral to kill 1 healthy person, to save 4 people who need organ transplants, if the operations themselves do not cause more unhappiness than would have occurred otherwise.

But the question of rights comes into play.  The doctor does not have the right to kill the 1 person if that person does not volunteer to be killed.  Violation of rights tips the happiness/unhappiness balance toward unhappiness.  The total happiness/unhappiness balance caused by the act determines the moral results of the act.
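
As a toy tally of that balance (a sketch only - the unit weights, and especially the rights_violation_cost term, are invented for illustration):

    # Toy tally of the transplant case under the happiness calculus.
    # Assumes one "unit" of happiness per life; all weights are invented.
    lives_saved = 4              # transplant recipients who live
    lives_lost = 1               # the healthy person killed
    rights_violation_cost = 5    # hypothetical unhappiness from violating the victim's rights
    net = lives_saved - lives_lost - rights_violation_cost
    print(net)                   # -2: on this weighting, the act tips toward unhappiness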

 

Moral INTENT:

If the doctor thinks the happiness/unhappiness balance will tip toward happiness by the operations, then the doctor who does the operations has moral intent.

 

Cary

 


Email

Dec 7, 2019, 8:18:24 PM
to bys-...@googlegroups.com
Why would the doctor not have the (moral) "right" to perform the operation if morality is based only on total happiness -- given that total happiness ought to be increased more by saving four people and losing one, than by saving one and losing four? 

- Don


Cary Cook

Dec 7, 2019, 11:26:32 PM
to bys-...@googlegroups.com

Don,

 

Good point.  Ultimately, the doctor does have that moral right (again assuming equality of all possible mitigating factors).  But it's not apparent until one figures it out, as you have.

Still, apparent violation of rights causes unhappiness to anyone who has not figured out the big picture.  And that unhappiness is real, and affects the total ratio.

 

Cary

 


 

Email

Dec 8, 2019, 12:15:43 AM
to bys-...@googlegroups.com
The question was intended to be rhetorical.

It is simply morally wrong to kill and harvest another person's
organs, against his will -- even to save many others.
If a doctor sees this as being morally just, he should offer his
own organs, rather than stealing life from another.

- Don


Email

Dec 8, 2019, 12:33:18 AM
to bys-...@googlegroups.com
Part 2:
A:

It is simply morally wrong to kill and harvest another person's
organs, against his will -- even to save many others.

B:

If a doctor sees this as being morally just, he should offer his
own organs, rather than stealing life from another.

Why should A cause more unhappiness than B? -- unless the
additional unhappiness is because A is independently immoral,
apart from the numerically identical consequential deaths.

- Don


Cary Cook

Dec 8, 2019, 7:26:16 PM
to bys-...@googlegroups.com

Don,

 

More good points - but they deserve follow-through.

 

All mitigating factors being equal or neutral, it is simply morally wrong to kill and harvest another person's organs against his will.

But there are hypothetical cases in which it becomes moral: e.g. if the act of doing so prevents a nuclear holocaust.  You can see where this goes.  At some point between preventing a nuclear holocaust and saving 2 humans by killing one, there is a tipping point where ANY generally immoral act becomes moral.

 

In the case of human life, there are presuppositions to all of this. e.g.

1. It must be assumed that human life is better off existing than not existing.  If human life itself is more unhappy than happy, then the whole set of human beings would be better off not existing - despite the exceptional cases.

 

2. If human life itself tips the total balance of happiness/unhappiness in the universe toward unhappiness, then even if the set of human beings is mostly happy, the rest of the emotional universe would be better off without humans.

---------------------------

 

If the doctor could achieve the desired end by offering his own organs, then that would be more moral than stealing life from another.

------------------------------------

 

Understanding morality is not difficult.

It is made difficult because it is frightening.

Email

Dec 8, 2019, 9:02:04 PM
to bys-...@googlegroups.com
C:

If the doctor could achieve the desired end by offering his own organs,

then that would be more moral than stealing life from another.


This sounds to me like you are suggesting a difference between two levels of evil.

A:

It is simply morally wrong to kill and harvest another person's

organs, against his will -- even to save many others.

 

B:

If a doctor sees this as being morally just, he should offer his

own organs, rather than stealing life from another.


While, "A" feels to me like Nazi-class infamy; but "B" feels to me like hero-class virtue.

- Don


Cary Cook

Dec 8, 2019, 10:09:38 PM
to bys-...@googlegroups.com

Don,

 

It feels the same to me. 

But after one figures out morality (or any part of reality), how one feels about it becomes irrelevant.

Shane Fletcher

Dec 21, 2019, 4:41:18 PM
to bys-...@googlegroups.com
"All other factors being equal, it is moral to kill 1 healthy person, to save 4 people who need organ transplants, if the operations themselves do not cause more unhappiness than would have occurred otherwise."

I would be unhappy to be saved, because someone was killed against their will. It seems to me that Don would be the same. It's easy to scale up the numbers, and suggest that unhappiness would be increased by killing one person against their will to save 4.

"But the question of rights comes into play.  The doctor does not have the right to kill the 1 person if that person does not volunteer to be killed.  Violation of rights tips the happiness/unhappiness balance toward unhappiness.  The total happiness/unhappiness balance caused by the act determines the moral results of the act."

I would likewise be unhappy to be saved by someone who volunteered to be killed. It's less about a person's right to be killed than the unnatural ending of someone's life. You can see the natural extension as being, "Someone who volunteers to be killed is unhappy with living, thus killing them will decrease the unhappiness in the world," but I believe there are better ways to decrease a person's unhappiness than killing them.

Cary Cook

Dec 21, 2019, 6:33:44 PM
to bys-...@googlegroups.com

Shane,

 

I would be unhappy to be saved, because someone was killed against their will.

Me too.  But if I'm one of 1,000 people saved by killing one person against their will, that unhappiness is diminished 1,000%.  (Again, all of this is assuming that living things are happy.)  And if I see that the quantity of happiness is increased among the 1,000, then my happiness will also be increased.

 

I believe there are better ways to decrease a person's unhappiness than killing them.

Right.  But the question here is not "What is the better way to decrease a person's unhappiness?".  The permeators of the problem are clearly outlined. 

Do you have any criterion other than quantity of happiness/unhappiness to determine which is the more moral/immoral of 2 acts?

 

Cary

 


Email

Dec 22, 2019, 12:53:01 AM
to bys-...@googlegroups.com
Hi Shane,


"But the question of rights comes into play.  The doctor does not have the right to kill the 1 person if that person does not volunteer to be killed.  Violation of rights tips the happiness/unhappiness balance toward unhappiness.  The total happiness/unhappiness balance caused by the act determines the moral results of the act."

That sounds circular to me:

1) That it is the immoral act which causes the unhappiness,
and
2) That it is the resulting unhappiness which made the act immoral in the first place.

- Don


Cary Cook

Dec 22, 2019, 2:08:44 AM
to bys-...@googlegroups.com

Don, That was a quote from me.  Did you intend this for me?

Cary

 


Cary Cook

Dec 22, 2019, 4:09:10 AM
to bys-...@googlegroups.com

Don,

 

Correct.  The way I said it is circular in the sense you stated.

It is circular if it is seen in terms of causation.

"A causes B; B causes A": problem.

 

Does immorality cause unhappiness?

Or does unhappiness cause immorality?

I dunno; ask Euthyphro.

 

The statement can be reworded in terms of what is, rather than what causes what, and therefore without causal circularity.

If an act tips the happiness/unhappiness ratio toward unhappiness, that act is immoral.

If an act is immoral, it tips the happiness/unhappiness ratio toward unhappiness.

i.e. "A = B; B = A": no problem.

 

Cary

 


Email

Dec 22, 2019, 10:51:51 AM
to bys-...@googlegroups.com
Sorry Cary, my attention is mostly elsewhere. My bad.
If that was your quote, then the circularity was also yours.

- Don


Email

Dec 22, 2019, 11:00:34 AM
to bys-...@googlegroups.com
Cary,

That isn't really an accurate description of either happiness or morality. Happiness is caused by numerous things which have nothing to do with morality; and there are also immoral acts which cause perverted (immoral) forms of happiness. The two ideas don't really track.
 
- Don


Cary Cook

Dec 22, 2019, 6:07:14 PM
to bys-...@googlegroups.com

Don,

 

When talking about cause, you're correct.

That's why I admitted a mistake in bringing cause into the issue.

 

When talking about what is, do you have any rational problem (emotional problems are to be expected) with my 2 statements?

 

If an act tips the happiness/unhappiness ratio toward unhappiness, that act is immoral.

If an act is immoral, it tips the happiness/unhappiness ratio toward unhappiness.

 

(given that happiness is defined as that emotion, or factor affecting emotion, that causes (in this case it’s OK) an organism to like existing in that emotional condition)

 

I may have to tighten it up if you look for exceptional cases.  e.g. I may have to specify voluntary acts that affect the happiness/unhappiness of organisms other than the actor, and I’m talking about total happiness/unhappiness in the universe.  I didn’t try for bulletproof precision, because it would get unnecessarily convoluted.

Email

Dec 22, 2019, 11:47:58 PM
to bys-...@googlegroups.com
Hi Cary,

Review:
"Morality" distinguishes between "right" and "wrong."
2+3=5 is "right," while 2+3=6 is "wrong," although math appears detached from "morality."
Classical nature/physics/chemistry does not and cannot ever produce "wrong" results.
When a rock "falls" (action) in the "wrong" place, it has done nothing morally "wrong."
Whether a rock's landing site is "right" or "wrong" is a human perception.
Sentient beings (IMO involving QM) can produce moral results which are either "right" or "wrong."
Human "acts" can be: unavoidable, unintentional, careless, spontaneous, premeditated, etc.
The more intended (vs. unintended) an act is, the more it becomes a "moral" issue.
The amount of happiness/unhappiness felt varies with the act's actual effects and perceived intent.
The amount of moral right or wrong varies with the act's anticipated effects and actual intent.

Therefore:
Happiness/unhappiness does not consistently track the relevant moral variables (intended effects) because produced happiness and morality are actually two different things.
However, "right" does correlate heavily with "happiness," and "wrong" with "unhappiness," to the extent that people correctly anticipate the effects and perceive the intents. In the real world, this difference is significant. In a theoretically perfect world, this whole issue would be moot.

- Don


Cary Cook

Dec 23, 2019, 1:35:32 AM
to bys-...@googlegroups.com

Don,

 

If that does it for you, you're welcome to it.

The fact remains, I have offered a logical and practical explanation of what constitutes moral and immoral acts.

 

You haven't, and you can't offer a better one.

Email

Dec 23, 2019, 11:59:28 AM
to bys-...@googlegroups.com
Hi Cary,

... Likewise, you are certainly welcome to your position.
I have no problem with disagreeing agreeably.

Some other remaining facts include :
1) Your theory's predictions don't track quantitatively with its expected targets (ergo falsification).
and
2) You have not falsified my theory that morality, like logic, is primordial and inexplicable in other terms.

This amounts to falsifying your theory, while presenting a counter theory which could, potentially, be as unfalsifiable as the basis for logic is. Whether or not an "unfalsifiable theory" constitutes a "better theory" than a "falsified theory" might be as philosophically debatable as just about anything else is. Scientifically: unfalsifiable theories are moot, while falsified theories are discarded.

- Don


Cary Cook

Dec 23, 2019, 6:57:45 PM
to bys-...@googlegroups.com

Don,

 

These are new subjects - possibly worth discussing.

 

1) Your theory's predictions don't track quantitatively with its expected targets (ergo falsification).

What predictions?

 

2) You have not falsified my theory that morality, like logic, is primordial and inexplicable in other terms.

Correct.  You have presented an unfalsifiable theory - much like your inerrancy theory.

X is primordial and inexplicable in other terms.

Therefore if I explain X in other terms, my explanation must be false.

Cool!

Email

Dec 24, 2019, 4:44:09 PM
to bys-...@googlegroups.com
Cary,

Regarding your comment:

Correct.  You have presented an unfalsifiable theory - much like your inerrancy theory.

X is primordial and inexplicable in other terms.

Therefore if I explain X in other terms, my explanation must be false.

Cool!


No.
We need to go over this again: 

I said your theory is wrong because it was falsified (in this case because its predictions were different than what really happens), to which you replied:

What predictions?


As I had said (on 12/22):

 Happiness is caused by numerous things which have nothing to do with morality; and there are also immoral acts which cause perverted (immoral) forms of happiness. The two ideas don't really track.


If morality is the maximizing of happiness, then there should never be situations where happiness increases or decreases in disproportionate amounts or in the opposite direction. These examples falsify the present form of your theory.

As you replied (also 12/22):

I may have to tighten it up if you look for exceptional cases.  e.g. I may have to specify voluntary acts that affect the happiness/unhappiness of organisms other than the actor, and I’m talking about total happiness/unhappiness in the universe.  I didn’t try for bulletproof precision, because it would get unnecessarily convoluted.


There is no philosophical or scientific principle which obligates you to go to a lot of bother if you just plain don't want to. If you're OK without tightening up the glitches in your theory, then fine. However, that doesn't make your theory immune to falsification; if you want me to recognize it as a "valid" and "un-falsified" theory, then you still need to make those repairs.

There is also no reason why I should be obligated to try to make positive sense out of that same mess.

My counter-theory may not be subject to falsification. (I suspect it isn't, but I'm not really certain.) This disqualifies it as a "scientific" theory in the same manner that my theory that "math and logic are valid" (neither provable nor falsifiable) is not a proper "scientific" theory. I think the latter is likely to be the "truth" (especially if we are to believe that "truth" actually exists), but I can't claim to have any solid justification for my opinion. The former (primordial morality) theory is more of a suspicion than an actual belief.

Until you choose to repair the defects in your present theory (however trivial you might believe those defects are), then I have no obligation to treat it as more than a "falsified" theory. In my mind, my unproven suspicion carries more weight than your "theory" -- as long as it still contains demonstrable errors.

You can either accept this as it stands, or you can try to fix those errors. It's your choice.

- Don


Cary Cook

Dec 24, 2019, 5:56:00 PM
to bys-...@googlegroups.com

Don,

 

Point out anything you've said that falsifies anything I've said.

 

What predictions?

You still haven’t answered this.  My theory has no predictions that I'm aware of.

 

there are also immoral acts which cause perverted (immoral) forms of happiness.

Correct.  But the happiness they cause is always outweighed by the unhappiness they cause.  That's why morality is based on the total ratio of happiness/unhappiness in the universe. 

Point out an exception, and you have falsified my theory.

 

You can either accept this as it stands, or you can try to fix those errors. It's your choice.

You have not shown errors.  I've admitted imprecisions, which I will make more precise whenever you point out each one as causing an apparent error.

Email

Dec 24, 2019, 9:02:35 PM
to bys-...@googlegroups.com
Cary,

You still haven’t answered this.  My theory has no predictions that I'm aware of.


Actually, your theory is "predictive" in the scientific sense:

https://en.m.wikipedia.org/wiki/Scientific_theory :
The defining characteristic of all scientific knowledge, including theories, is the ability to make falsifiable or testable predictions. The relevance and specificity of those predictions determine how potentially useful the theory is. A would-be theory that makes no observable predictions is not a scientific theory at all. Predictions not sufficiently specific to be tested are similarly not useful. In both cases, the term "theory" is not applicable.     (Emphasized italics mine)

If, as your theory states, "morality" and "maximized happiness" are the same thing, then, any variation in the amount of morality being present will invariably be accompanied by a corresponding variation in the total amount of happiness being present. (The two things will "track.")

This is what is meant by useful, testable, predictions.
 

there are also immoral acts which cause perverted (immoral) forms of happiness.

Correct.  But the happiness they cause is always outweighed by the unhappiness they cause.  That's why morality is based on the total ratio of happiness/unhappiness in the universe. 


This "supporting" claim of yours is actually a restatement of what your theory "predicts":

IF an immoral act produces happiness, THEN it will always produce a greater amount of unhappiness. 

Stated precisely: Morality = Happiness - Unhappiness

As I had said (on 12/22):

 

 Happiness is caused by numerous things which have nothing to do with morality; and there are also immoral acts which cause perverted (immoral) forms of happiness. The two ideas don't really track.


Part 1:

Happiness is caused by numerous things which have nothing to do with morality;


In this case:  Morality < Happiness - Unhappiness

This difference, all by itself, falsifies the precise (mathematical) expression of your theory.
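
A toy case of the kind Part 1 describes (numbers invented for illustration; assumes happiness is measured in additive units):

    # Happiness produced by something with no moral content at all,
    # e.g. pleasant weather. The identity Morality = Happiness - Unhappiness
    # then predicts moral content where there is none.
    happiness, unhappiness = 3, 0   # invented units from a morally neutral event
    moral_content = 0               # no willful act involved
    assert moral_content < happiness - unhappiness   # Morality < Happiness - Unhappiness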

Part 2:

There are also immoral acts which cause perverted (immoral) forms of happiness. 


This referred back to my statement (also Dec. 22):

The more intended (vs. unintended) an act is, the more it becomes a "moral" issue.

The amount of happiness/unhappiness felt varies with the act's actual effects and perceived intent.

The amount of moral right or wrong varies with the act's anticipated effects and actual intent.

 

Therefore:

Happiness/unhappiness does not consistently track the relevant moral variables (intended effects) because produced happiness and morality are actually two different things.


Again:   Morality > or < Happiness - Unhappiness      (not necessarily equal)

... Because morality is really something different than mere total net happiness.

As I summed it up:

However, "right" does correlate heavily with "happiness," and "wrong" with "unhappiness," to the extent that people correctly anticipate the effects and perceive the intents. In the real world, this difference is significant. In a theoretically perfect world, this whole issue would be moot.


I have answered your question.
Why isn't this obvious?
What am I missing?

- Don


Email

Dec 24, 2019, 9:07:17 PM
to bys-...@googlegroups.com
If you had said, "Morality influences happiness, but is, itself, a different thing," then I would have to agree with you.

- Don


Cary Cook

Dec 25, 2019, 2:02:02 AM
to bys-...@googlegroups.com

Don,

 

No problem!  Morality and happiness are different:

 

I've stipulatively defined happiness as that emotion, or factor affecting emotion, that causes an organism to like existing in that emotional state.

 

I define morality as an objective standard for judging willful acts that should or should not happen.

 

Does that do it?

Email

Dec 25, 2019, 2:51:49 AM
to bys-...@googlegroups.com
Cary,

That much sounds reasonably correct to me. I'll even go as far as saying that moral acts will tend to promote "happiness" under unexceptional conditions. However, I haven't put much thought into defining "happiness," other than thinking of it as a general category of qualia which roughly includes the responses to positive achievements. (One theory is that the biochemical mechanism which "causes" it may actually involve neurally-produced opium.) A different person might define it as where along the needs-hierarchy an individual is operating: (Struggling for air = fairly unhappy, seeking higher social status = fairly happy.) I don't particularly care how you define it, as long as we both understand what we mean when we use the term. I believe your definition does that well enough.

- Don


Cary Cook

Dec 25, 2019, 6:24:34 PM
to bys-...@googlegroups.com

Don,

 

I don't particularly care how you define it, as long as we both understand what we mean when we use the term.

Exactly!  That part gets me flak every time I present my theory.  I know happiness can be defined many different ways, but I'm only talking about one concept.  I call it happiness because I can't think of a better term.  Some people prefer well-being, but that's even more ambiguous.

 

Please tell me if you find any other ambiguities or other problems that need clarifying.

Email

Dec 25, 2019, 7:10:49 PM
to bys-...@googlegroups.com
Cary,

Please tell me if you find any other ambiguities or other problems that need clarifying.


The only "problem" I can think of is that you don't seem to have anything better to do on Christmas day than discuss philosophy. Merry Christmas. My family says "Hi." May you find the happiness you seek.

- Don


Cary Cook

Dec 25, 2019, 9:46:29 PM
to bys-...@googlegroups.com

Bah! Humbug!  Happiness makes me sloppy.


Shane Fletcher

Jan 4, 2020, 8:37:35 PM
to bys-...@googlegroups.com
"

I would be unhappy to be saved, because someone was killed against their will.

Me too.  But if I'm one of 1,000 people saved by killing one person against their will, that unhappiness is diminished 1,000%."


Can you show me the math on that, please? I am not following how that works.


"Right.  But the question here is not "What is the better way to decrease a person's unhappiness?".  The permeators of the problem are clearly outlined."


I'm not sure of the meaning of "permeator" in this context. But I think the whole point of these thought experiments is the grey area.


"Do you have any criterion other than quantity of happiness/unhappiness to determine which is the more moral/immoral of 2 acts?"


I might need to wait on your explanation of the math above, but it seems to me that if you kill a person who is happy to be killed, his happiness is decreased to zero at the point of his death. And if all the people who are saved are unhappy with his sacrifice, then you have a net increase of unhappiness in the world.


Hope you had a Christmas spent with people you love, and I wish you well for 2020.

Shane

Cary Cook

Jan 4, 2020, 10:20:19 PM
to bys-...@googlegroups.com

Don,

 

Can you show me the math on that, please?

If I'm saved by killing one person against their will, that gives me X amount of unhappiness.

If I'm 1 of 1,000 people saved by killing one person against their will, that gives me 1/1,000 X amount of unhappiness.

 

OK, I stupided out.  Permeators should be parameters.

 

I think the whole point of these thought experiments is the grey area.

The point is grey areas for people who want morality to be a grey area.

The point is figuring out what morality is for people who want to figure out what morality is. 

 

if you kill a person who is happy to be killed, his happiness is decreased to zero at the point of his death.

If a person is happy to be killed, then he is unhappy living.  So it's his unhappiness that is decreased to zero at the point of his death.  But I don't see a point either way in this statement.

 

And if all the people who are saved are unhappy with his sacrifice, then you have a Net increase of unhappiness in the world.

Only if they are more unhappy with his sacrifice than they are happy to be living.  And it must be assumed that living things are happy, in order to say killing is immoral.

 

I quit pretending to know what love means long ago.  I wish you and everyone else exactly what you all deserve.

 

Cary

 



Cary Cook

Jan 5, 2020, 2:31:06 AM
to bys-...@googlegroups.com

Sorry, that was to Shane, not Don.

Cary

 


Shane Fletcher

Jan 7, 2020, 7:39:18 AM
to bys-...@googlegroups.com
Hi Cary,

"Can you show me the math on that, please?

If I'm saved by killing one person against their will, that gives me X amount of unhappiness

If I'm 1 of 1,000 people saved by killing one person against their will. that gives me 1/1,000 X amount of unhappiness."


How is your unhappiness modified because there are 999 more people saved? Surely if your unhappiness is shared by 1,000 other people, it is 1,000X amount of unhappiness?

"If a person is happy to be killed, then he is unhappy living."

I don't think that follows. Mainly because I'm sure "If a person is unhappy to be killed, then he is happy living." is not a true statement. I am sure there are miserable people that don't want to die.

"Only if they are more unhappy with his sacrifice than they are happy to be living.  And it must be assumed that living things are happy, in order to say killing is immoral."

I see where you're coming from. But your second sentence seems to contradict your point; that sometimes killing is moral.

"I quit pretending to know what love means long ago. "

Love is sacrificing yourself for another.

Shane

Cary Cook

Jan 7, 2020, 9:32:18 PM
to bys-...@googlegroups.com

Shane,

 

How is your unhappiness modified because there are 999 more people saved? Surely if your unhappiness is shared by 1,000 other people, it is 1,000X amount of unhappiness?

OK, I'll try it a different way.

Assuming that living things are happy, the person who died lost X amount of happiness.

The people who are saved, including me, gain X amount of happiness each.  That's a trade of X for 1,000X.
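
The same trade as a toy computation (assuming, per the premise above, X amount of happiness per living person):

    # Sketch of the stated trade: one person's happiness lost, 1,000 gained.
    X = 1.0              # happiness per living person (assumed unit)
    lost = 1 * X         # the person killed
    gained = 1000 * X    # the people saved, including me
    print(gained - lost) # 999.0: the balance tips toward happiness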

 

"If a person is unhappy to be killed, then he is happy living." is not a true statement. I am sure there are miserable [unhappy] people that don't want to die.

That's only because they expect an amount of happiness in the future great enough to outweigh the unhappiness they are presently experiencing.  OR they haven’t figured that out, but are just reacting on survival instinct.

Remember that I'm defining happiness as that emotion, or factor affecting emotion, that causes an organism to want to continue living in that condition.  The person who is happy to be killed has already overcome survival instinct.

 

your second sentence seems to contradict your point; that sometimes killing is moral.

If happiness is quantified, and morality is based on happiness, then killing is moral if it results in a greater ratio of happiness to unhappiness in the universe.

 

Love is sacrificing yourself for another.

Then I want love only if it results in a greater ratio of happiness to unhappiness in the universe.

 

Cary

 

