The “trolley problem” and its variants exist because some people are smart enough to see that morality is subject to arithmetic, and some aren’t. Those who aren’t smart enough to see it are justified only if they are young or mentally defective. Otherwise such people are refusing to acknowledge obvious truth for emotional reasons, and deserve to be offended for it.
Cary
Don,
This can be discussed on Facebook if you like:
I agree that different life experiences may weight the choices of the decision maker, and hence the morality of the intent of the act, but not the morality of the results of the act.
There may also be any number of other possibilities to consider in addition to the math.
e.g.
You know the single person is high moral quality, and the opposing 4 are criminals.
You know the single person is age 10, and the opposing 4 are age 90.
You don't know if any of the apparent people are stunt dummies.
Etc.
But assuming that the decision maker knows only the fact of one person vs. 4 persons, then morality is based only on math. Action and inaction are in principle morally equal decisions.
The point of the trolley problem may be to try to balance the math against other relevant conditions, until the problem is stated as: all other things being equal, what should the decision maker do? Once clarified, the point of the problem is to get people to figure out objective morality.
Shane,
Thanks for this!
Moral RESULTS:
All other factors being equal, it is moral to kill 1 healthy person, to save 4 people who need organ transplants, if the operations themselves do not cause more unhappiness than would have occurred otherwise.
But the question of rights comes into play. The doctor does not have the right to kill the 1 person if that person does not volunteer to be killed. Violation of rights tips the happiness/unhappiness balance toward unhappiness. The total happiness/unhappiness balance caused by the act determines the moral results of the act.
Moral INTENT:
If the doctor thinks the happiness/unhappiness balance will tip toward happiness by the operations, then the doctor who does the operations has moral intent.
From: Shane Fletcher
Sent: Saturday, December 7, 2019 2:58 PM
To: bys-...@googlegroups.com
Subject: Re: Trolley Problem
Cary, if morality is subject to arithmetic, and action and inaction are equally moral in principle, is it moral for a doctor to kill 1 healthy person to save 4 people who need organ transplants?
Or, kill 1 person with a brain tumor prematurely, to save 4 people who need organ transplants now?
Shane
Don,
Good point. Ultimately, the doctor does have that moral right (again assuming equality of all possible mitigating factors). But it's not apparent until one figures it out, as you have.
Still, apparent violation of rights causes unhappiness to anyone who has not figured out the big picture. And that unhappiness is real, and affects the total ratio.
It is simply morally wrong to kill and harvest another person's organs, against his will -- even to save many others.
If a doctor sees this as being morally just, he should offer his own organs, rather than stealing life from another.
Don,
More good points - but they deserve follow-through.
All mitigating factors being equal or neutral, then it is simply morally wrong to kill and harvest another person's organs, against his will.
But there are hypothetical cases in which there is a point where it becomes moral: e.g. if the act of doing so prevents a nuclear holocaust. You can see where this goes. Somewhere between preventing a nuclear holocaust and killing one person to save two, there is a tipping point where ANY generally immoral act becomes moral.
In the case of human life, there are presuppositions to all of this. e.g.
1. It must be assumed that human life is better off existing than not existing. If human life itself is more unhappy than happy, then the whole set of human beings would be better off not existing - despite the exceptional cases.
2. If human life itself tips the total balance of happiness/unhappiness in the universe toward unhappiness, then even if the set of human beings is mostly happy, the rest of the emotional universe would be better off without humans.
---------------------------
If the doctor could achieve the desired end by offering his own organs, then that would be more moral than stealing life from another.
------------------------------------
Understanding morality is not difficult.
It is made difficult because it is frightening.
If the doctor could achieve the desired end by offering his own organs, then that would be more moral than stealing life from another.
A: It is simply morally wrong to kill and harvest another person's organs, against his will -- even to save many others.
B: If a doctor sees this as being morally just, he should offer his own organs, rather than stealing life from another.
Don,
It feels the same to me.
But after one figures out morality (or any part of reality), how one feels about it becomes irrelevant.
Shane,
I would be unhappy to be saved, because someone was killed against their will.
Me too. But if I'm one of 1,000 people saved by killing one person against their will, that unhappiness is diminished to 1/1,000 of what it would otherwise be. (Again, all of this is assuming that living things are happy.) And if I see that the quantity of happiness is increased among the 1,000, then my happiness will also be increased.
I believe there are better ways to decrease a person's unhappiness than killing them.
Right. But the question here is not "What is the better way to decrease a person's unhappiness?". The permeators of the problem are clearly outlined.
Do you have any criterion other than quantity of happiness/unhappiness to determine which is the more moral/immoral of 2 acts?
Cary
From: Shane Fletcher
Sent: Saturday, December 21, 2019 1:41 PM
To: bys-...@googlegroups.com
Subject: Re: Trolley Problem
"All other factors being equal, it is moral to kill 1 healthy person, to save 4 people who need organ transplants, if the operations themselves do not cause more unhappiness than would have occurred otherwise."
"But the question of rights comes into play. The doctor does not have the right to kill the 1 person if that person does not volunteer to be killed. Violation of rights tips the happiness/unhappiness balance toward unhappiness. The total happiness/unhappiness balance caused by the act determines the moral results of the act."
Don, That was a quote from me. Did you intend this for me?
From: 'Email' via BYS vs MH
Sent: Saturday, December 21, 2019 9:53 PM
To: bys-...@googlegroups.com
Subject: Re: Trolley Problem
Hi Shane,
Don,
Correct. The way I said it is circular in the sense you stated.
It is circular if it is seen in terms of causation.
"A causes B; B causes A": problem.
Does immorality cause unhappiness?
Or does unhappiness cause immorality?
I dunno; ask Euthyphro.
The statement can be reworded in terms of what is, rather than what causes what, and therefore without causal circularity.
If an act tips the happiness/unhappiness ratio toward unhappiness, that act is immoral.
If an act is immoral, it tips the happiness/unhappiness ratio toward unhappiness.
i.e. "A = B; B = A": no problem.
Don,
When talking about cause, you're correct.
That's why I admitted a mistake in bringing cause into the issue.
When talking about what is, do you have any rational problem (emotional problems are to be expected) with my 2 statements?
If an act tips the happiness/unhappiness ratio toward unhappiness, that act is immoral.
If an act is immoral, it tips the happiness/unhappiness ratio toward unhappiness.
(given that happiness is defined as that emotion, or factor affecting emotion, that causes (in this case it’s OK) an organism to like existing in that emotional condition)
I may have to tighten it up if you look for exceptional cases. e.g. I may have to specify voluntary acts that affect the happiness/unhappiness of organisms other than the actor, and I’m talking about total happiness/unhappiness in the universe. I didn’t try for bulletproof precision, because it would get unnecessarily convoluted.
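Cary's criterion can be read as a simple tally. The sketch below is only an illustration of that reading, not anything proposed in the thread: it treats each affected organism's change in happiness as a signed number (the figures are hypothetical), sums them, and calls the act immoral exactly when the total is negative.

def tips_toward_unhappiness(happiness_deltas):
    """True if the act's summed effect on total happiness is negative."""
    return sum(happiness_deltas) < 0

def is_immoral(happiness_deltas):
    # Under the stated biconditional, "immoral" and "tips the ratio toward
    # unhappiness" pick out the same acts, so one predicate defines the other.
    return tips_toward_unhappiness(happiness_deltas)

# Hypothetical trolley-style tally: one person loses X, four people gain X each.
X = 1.0
deltas = [-X] + [X] * 4
print(is_immoral(deltas))   # False: the balance tips toward happiness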
Don,
If that does it for you, you're welcome to it.
The fact remains, I have offered a logical and practical explanation of what constitutes moral and immoral acts.
You haven't, and you can't offer a better one.
Don,
These are new subjects - possibly worth discussing.
1) Your theory's predictions don't track quantitatively with its expected targets (ergo falsification).
What predictions?
2) You have not falsified my theory that morality, like logic, is primordial and inexplicable in other terms.
Correct. You have presented an unfalsifiable theory - much like your inerrancy theory.
X is primordial and inexplicable in other terms.
Therefore if I explain X in other terms, my explanation must be false.
Cool!
Correct. You have presented an unfalsifiable theory - much like your inerrancy theory.
X is primordial and inexplicable in other terms.
Therefore if I explain X in other terms, my explanation must be false.
Cool!
What predictions?
Happiness is caused by numerous things which have nothing to do with morality; and there are also immoral acts which cause perverted (immoral) forms of happiness. The two ideas don't really track.
I may have to tighten it up if you look for exceptional cases. e.g. I may have to specify voluntary acts that affect the happiness/unhappiness of organisms other than the actor, and I’m talking about total happiness/unhappiness in the universe. I didn’t try for bulletproof precision, because it would get unnecessarily convoluted.
Don,
Point out anything you've said that falsifies anything I've said.
What predictions?
You still haven’t answered this. My theory has no predictions that I'm aware of.
there are also immoral acts which cause perverted (immoral) forms of happiness.
Correct. But the happiness they cause is always outweighed by the unhappiness they cause. That's why morality is based on the total ratio of happiness/unhappiness in the universe.
Point out an exception, and you have falsified my theory.
You can either accept this as it stands, or you can try to fix those errors. It's your choice.
You have not shown errors. I've admitted imprecisions, which I will make more precise whenever you point out each one as causing an apparent error.
You still haven’t answered this. My theory has no predictions that I'm aware of.
there are also immoral acts which cause perverted (immoral) forms of happiness.
Correct. But the happiness they cause is always outweighed by the unhappiness they cause. That's why morality is based on the total ratio of happiness/unhappiness in the universe.
As I had said (on 12/22):
Happiness is caused by numerous things which have nothing to do with morality; and there are also immoral acts which cause perverted (immoral) forms of happiness. The two ideas don't really track.
Happiness is caused by numerous things which have nothing to do with morality;
There are also immoral acts which cause perverted (immoral) forms of happiness.
The more intended (vs. unintended) an act is, the more it becomes a "moral" issue.
The amount of happiness/unhappiness felt varies with the act's actual effects and perceived intent.
The amount of moral right or wrong varies with the act's anticipated effects and actual intent.
Therefore:
Happiness/unhappiness does not consistently track the relevant moral variables (intended effects) because produced happiness and morality are actually two different things.
However, "right" does correlate heavily with "happiness," and "wrong" with "unhappiness," to the extent that people correctly anticipate the effects and perceive the intents. In the real world, this difference is significant. In a theoretically perfect world, this whole issue would be moot.
Don,
No problem! Morality and happiness are different:
I've stipulatively defined happiness as that emotion, or factor affecting emotion, that causes an organism to like existing in that emotional state.
I define morality as an objective standard for judging willful acts that should or should not happen.
Does that do it?
Don,
I don't particularly care how you define it, as long as we both understand what we mean when we use the term.
Exactly! That part gets me flak every time I present my theory. I know happiness can be defined many different ways, but I'm only talking about one concept. I call it happiness because I can't think of a better term. Some people prefer well-being, but that's even more ambiguous.
Please tell me if you find any other ambiguities or other problems that need clarifying.
Please tell me if you find any other ambiguities or other problems that need clarifying.
Bah! Humbug! Happiness makes me sloppy.
I would be unhappy to be saved, because someone was killed against their will.
Me too. But if I'm one of 1,000 people saved by killing one person against their will, that unhappiness is diminished to 1/1,000 of what it would otherwise be.
Can you show me the math on that, please? I am not following how that works.
"Right.
But the question here is not "What is the better way to decrease a
person's unhappiness?". The permeators of the problem are clearly
outlined."
I'm not sure of the meaning of "permeator" in this context. But I think the whole point of these thought experiments is the grey area.
"Do you have any criterion other than quantity of happiness/unhappiness to determine which is the more moral/immoral of 2 acts?"
I might need to wait on your explanation of the math above, but it seems to me that if you kill a person who is happy to be killed, his happiness is decreased to zero at the point of his death. And if all the people who are saved are unhappy with his sacrifice, then you have a net increase of unhappiness in the world.
Don,
Can you show me the math on that, please?
If I'm saved by killing one person against their will, that gives me X amount of unhappiness.
If I'm 1 of 1,000 people saved by killing one person against their will, that gives me 1/1,000 X amount of unhappiness.
OK, I stupided out. Permeators should be parameters.
I think the whole point of these thought experiments is the grey area.
The point is grey areas for people who want morality to be a grey area.
The point is figuring out what morality is for people who want to figure out what morality is.
if you kill a person who is happy to be killed, his happiness is decreased to zero at the point of his death.
If a person is happy to be killed, then he is unhappy living. So it's his unhappiness that is decreased to zero at the point of his death. But I don't see a point either way in this statement.
And if all the people who are saved are unhappy with his sacrifice, then you have a Net increase of unhappiness in the world.
Only if they are more unhappy with his sacrifice than they are happy to be living. And it must be assumed that living things are happy, in order to say killing is immoral.
I quit pretending to know what love means long ago. I wish you and everyone else exactly what you all deserve.
Cary
From: Shane Fletcher
Sent: Saturday, January 4, 2020 5:37 PM
To: bys-...@googlegroups.com
Subject: Re: Trolley Problem
"
I would be unhappy to be saved, because someone was killed against their will.
Don,
Sorry, that was to Shane, not Don.
"Can you show me the math on that, please?
If I'm saved by killing one person against their will, that gives me X amount of unhappiness
If I'm 1 of 1,000 people saved by killing one person against their will. that gives me 1/1,000 X amount of unhappiness."
Shane,
How is your unhappiness modified because there are 999 more people saved? Surely if your unhappiness is shared by 1,000 other people it is 1,000X amount of unhappiness?
OK, I'll try it a different way.
Assuming that living things are happy, then the person who died lost X amount of happiness.
The people who are saved, including me, gain X amount of happiness each. That's a trade of X for 1,000X.
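Put as bare arithmetic under the same stated assumption (that each living person carries the same notional amount X of happiness, a placeholder figure rather than anything measurable):
happiness lost by the one person killed: X
happiness gained (kept) by the 1,000 people saved: 1,000 × X
net change: 1,000X - X = 999X, so the balance tips toward happiness.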
"If a person is unhappy to be killed, then he is happy living." is not a true statement. I am sure there are miserable [unhappy] people that don't want to die.
That's only because they expect an amount of happiness in the future great enough to outweigh the unhappiness they are presently experiencing. OR they haven't figured that out, and are just acting on survival instinct.
Remember that I'm defining happiness as that emotion, or factor affecting emotion, that causes an organism to want to continue living in that condition. The person who is happy to be killed has already overcome survival instinct.
your second sentence seems to contradict your point; that sometimes killing is moral.
If happiness is quantified, and morality is based on happiness, then killing is moral if it results in a greater ratio of happiness to unhappiness in the universe.
Love is sacrificing of yourself for another.
Then I want love only if it results in a greater ratio of happiness to unhappiness in the universe.
Cary
From: Shane Fletcher
Sent: Tuesday, January 7, 2020 4:39 AM
To: bys-...@googlegroups.com
Subject: Re: Trolley Problem
Hi Cary,