crime and duplication machines


Telmo Menezes

Apr 23, 2015, 6:15:40 PM
to everyth...@googlegroups.com
My suspicion is that "personal identity" is a human concept that is evolved mainly to enforce social norms, and that it only works until technologies like duplication machines or mind uploading are created. To illustrate this, I propose a dilemma:

Let's assume I murder someone and then get scanned in Brussels, and reconstructed in Washington. Who should go to jail?

What if I am destroyed in Brussels and reconstructed in Washington and Moscow?

LizR

Apr 23, 2015, 8:19:33 PM
to everyth...@googlegroups.com
You should both go to jail, on the basis that both copies of you had the same consciousness as the person who committed the murder, and therefore you are both equally responsible (leaving aside considerations of free will etc)

And (this is the clincher) you are both equally a danger to society, having had your psychopathic tendencies duplicated means you're twice as much of a danger as you were when there was only one of you.

QED, "You're nicked, sunshine."

Telmo Menezes

Apr 23, 2015, 8:30:36 PM
to everyth...@googlegroups.com
On Fri, Apr 24, 2015 at 1:19 AM, LizR <liz...@gmail.com> wrote:
You should both go to jail, on the basis that both copies of you had the same consciousness as the person who committed the murder, and therefore you are both equally responsible (leaving aside considerations of free will etc)

I agree. I would be curious to know if anyone disagrees with this, and why.
 


--
You received this message because you are subscribed to the Google Groups "Everything List" group.
To unsubscribe from this group and stop receiving emails from it, send an email to everything-li...@googlegroups.com.
To post to this group, send email to everyth...@googlegroups.com.
Visit this group at http://groups.google.com/group/everything-list.
For more options, visit https://groups.google.com/d/optout.

LizR

Apr 23, 2015, 8:44:02 PM
to everyth...@googlegroups.com
Oops, that should have read "You're both nicked, sunshines!" :-)

However, when it comes to inheriting from your parent's will, things might get ... interesting.

PS I assume everyone's seen the film "The Prestige" (or read the novel - written by an ex-teacher of mine!)

spudb...@aol.com

Apr 24, 2015, 7:09:14 AM
to everyth...@googlegroups.com
Brussels, easy: you violated the law against murder there, only to flee to NYC. Like boarding a plane to NYC, except quicker. Let us say a child molester attacks a young child in Brussels and then teleports to NYC to escape. Whilst in NYC, the molester from Brussels molests two other young children before he is captured. Should we then hear the NYC molester saying, "That was not me! The Brussels molester died whilst teleporting (by necessity)"? At that point, should a jury even care?

It does get fun if 300 versions of Stalin are produced, and only 298 of the clones commit mass murder post-teleporting.


Stathis Papaioannou

Apr 24, 2015, 7:47:04 AM
to everyth...@googlegroups.com
On 24 April 2015 at 21:09, spudboy100 via Everything List
<everyth...@googlegroups.com> wrote:
> Brussels-easy. You violated the law against murder there, only to flee to
> NYC. Like boarding a plane to NYC except quicker. Let us say a child
> molester, attacks a young child in Brussel's and then teleports to NYC to
> escape. Whilst in NYC the molester from Brussels molests two other young
> children before he is captured. At this point should we hear the NYC
> Molester saying. "That was not me! The Brussels molester died whilst in
> teleporting (by necessity). At this point should a jury even care?
>
> It does get fun if 300 versions of Stalin are produced, and only 298 of the
> clones commit mass murder post teleporting.
>
>
>
> -----Original Message-----
> From: Telmo Menezes <te...@telmomenezes.com>
> To: everything-list <everyth...@googlegroups.com>
> Sent: Thu, Apr 23, 2015 6:15 pm
> Subject: crime and duplication machines
>
> My suspicion is that "personal identity" is a human concept that is evolved
> mainly to enforce social norms, and that it only works until technologies
> like duplication machines or mind uploading are created. To illustrate this,
> I propose a dilemma:
>
> Let's assume I murder someone and then get scanned in Brussels, and
> reconstructed in Washington. Who should go to jail?
>
> What if I am destroyed in Brussels and reconstructed in Washington and
> Moscow?

Copies should be punished only for things they did in their subjective past.


--
Stathis Papaioannou

Bruno Marchal

Apr 24, 2015, 11:55:46 AM
to everyth...@googlegroups.com
On 24 Apr 2015, at 02:30, Telmo Menezes wrote:



On Fri, Apr 24, 2015 at 1:19 AM, LizR <liz...@gmail.com> wrote:
You should both go to jail, on the basis that both copies of you had the same consciousness as the person who committed the murder, and therefore you are both equally responsible (leaving aside considerations of free will etc)

I agree. I would be curious to know if anyone disagrees with this, and why.


Now, I agree. And Liz gave two good arguments: one purely 3p, the other in terms of moral punishment. The first one is enough, but the second one makes sense too.

Another "terrible question": do people have the right to torture copies, when those copies accepted the protocol, that is, with consent given before the duplication?

Should that be made illegal?  (assuming the technology, comp, etc.)

Bruno


 


Bruno Marchal

Apr 24, 2015, 12:08:23 PM
to everyth...@googlegroups.com
Yes, and only if there is evidence that they committed it, and that it was not a dream. I have heard of people claiming to have killed someone, but left free because they did not succeed in providing evidence.

Your remark raises the question: can we condemn a person who has killed someone, but is completely amnesic about the act (according to the appointed experts)?

Bruno


>
>
> --
> Stathis Papaioannou
>

http://iridia.ulb.ac.be/~marchal/



Dennis Ochei

Apr 24, 2015, 1:52:33 PM
to everyth...@googlegroups.com
Here's the clincher.

1. Suppose I erase my body's memories after. Do I go to jail?

2. Suppose I erase the memories of this body. I find another body (say a laboratory synthesized one with no memories) and download my memories onto it. Does the new body go to jail?

3. I commit a crime and then a buddy of mine, who had no knowledge of the crime decides he wants to experience my memories. He downloads the entirety of my memories while retaining his own. Does he go to jail?

4.  I commit a crime, then I kidnap someone and forcibly download their memories onto my brain, retaining my own. I then delete their memories. Memory transfer technology is at such a stage that it is not possible to transfer or delete selected memories. So it is impossible to remove my memories without removing my victim's. Do I go to jail?

5. I commit a crime, then I kidnap someone and forcibly download my memories onto their brain, without erasing theirs. I then delete my memories. Memory transfer technology is at such a stage that it is not possible to transfer or delete selected memories. So it is impossible to remove my memories without removing my victim's. Does my kidnapped victim go to jail?


At first glance, you want to say no to 1, but then someone could just backup their memories, leave themselves a note on where to restore them, and then waltz out of the country. Reminds me a bit of the anime Death Note.

You want to say yes to 2, but that seems to entail saying yes to 3-5, and you really don't wanna say yes to 5. Even if you evade that entailment, it seems your answers to 3-5 have to be the same.

spudb...@aol.com

Apr 24, 2015, 2:51:23 PM
to everyth...@googlegroups.com
Bruno, if it's a genuine amnesia we do have a quandary (not us, but future people). But what if, like the fellow in Matrix 1 said, "I want to remember nothing, Noth-thing!", in which case this is part of a conspiracy to obstruct justice. It reminds me of the American comic Steve Martin, who in past skits advised people to rob banks and then inform the judge, "I'm sorry, your honor, I didn't know it was illegal to rob a bank." If the Charlie Hebdo murderers had been able to teleport from Paris to Baghdad, the secondaries (a term) would need to stand trial. How about 5000 mass murderers produced from a single killer? "Go ahead and pick one of us to execute, ha hah!" Having 500 replicas running around might make life more difficult. If the replicas are uploaded to a virtual environment, then it's a different story than teleportation: you can isolate unfriendly persons in virch space.

PGC

Apr 24, 2015, 3:48:57 PM
to everyth...@googlegroups.com


On Friday, April 24, 2015 at 5:55:46 PM UTC+2, Bruno Marchal wrote:

On 24 Apr 2015, at 02:30, Telmo Menezes wrote:



On Fri, Apr 24, 2015 at 1:19 AM, LizR <liz...@gmail.com> wrote:
You should both go to jail, on the basis that both copies of you had the same consciousness as the person who committed the murder, and therefore you are both equally responsible (leaving aside considerations of free will etc)

I agree. I would be curious to know if anyone disagrees with this, and why.


Now, I agree. And Liz gave two good arguments: one purely 3p, the other in terms of moral punishment. The first one is enough, but the second one makes sense too.

Another "terrible question": do people have the right to torture copies, when those copies accepted the protocol, that is, with consent given before the duplication?

Should that be made illegal?  (assuming the technology, comp, etc.)

Depends on what we mean by the term "illegal" or "jail". If "jail" or "legality" turns out to be just some unreflected form of confinement or isolation, then we only replicate our tendency towards another form of vengeance justice. This seems medieval/savage, which is plausible; but what if we assumed they are less savage than us because they've grown bored?

Because I'm not sure we need "forms of punishment" a priori in all scenarios of justice. In such a sufficiently advanced setting, where we can e.g. copy Telmo, we can define crime as something like a "form of amnesia relative to theological aspects/questions of personhood"; then justice is restored when that amnesia is either lifted, or the person moves to a geography where said amnesia can be lived/dreamed by people who choose it theologically, where it can theologically kick back. Unfortunately, this opens up territoriality of geography, which I'd rather not do. Ideally, we'd lift that amnesia, perhaps. This may be fuzzy, but it is at least more precise than faith in weirdly justified spans of time of "confinement for the security of society".

I could see it as the job of scientists, mystics, and artists to grapple with this huge problem: how to make the theological question of personhood, lost to amnesia, accessible again to the persons who committed the "crime". I'm not sure the terms "illegal" or "crime" would still apply in such a setting; it's closer to "they forgot stuff/questions". So justice would be closer to restoration of memory, bearing on "how did we get here in local history?", which would give clues to undo the imbalance; it appears more as a memory problem than a problem with "Telmo" (sorry for using you like this, man ;-))

Not that I would assume a clear solution (we're attempting good/evil here...); just assuming we can be less naive and hand-wavy with theology and the question of dream/reality than we are today, which is a high price tag. But we could reasonably assume a lot more histories with programming of virtual worlds, altered states of mind, theological practice and nuance, technological tools, engineering and management of trance/ecstasy, maybe some advance on the problem of evil, etc.

I try to exercise setting up such scenarios fictionally, but it is difficult to find ones that are fun, where Goedel does not bite back too much, lol. Thanks for posting/sharing, Telmo. This is more fun than all the usual and yet understandable preaching about the physical universe, politics, environment etc. Closer to some of Wei Dai's thoughts and writings as well. PGC

LizR

Apr 24, 2015, 5:25:49 PM
to everyth...@googlegroups.com
On 25 April 2015 at 05:52, Dennis Ochei <do.inf...@gmail.com> wrote:
Here's the clincher.

1. Suppose I erase my body's memories after. Do I go to jail?

Unless there is some magic stuff involved in identity, you just committed suicide. How about you commit a crime yesterday and then erase your memory of the last year (say)? Only the 'psychopathic tendencies' argument holds water then, but it would have held equally well a year earlier, and wasn't acted on (we assume).

I may comment on the other points later (if I have anything sensible to say....maybe even if not :-)

But for now I have a Jumbo Winter Crossword to set!

Stathis Papaioannou

Apr 24, 2015, 5:26:57 PM
to everyth...@googlegroups.com
The question does arise practically, since people sometimes do things when intoxicated that they can't remember. Usually they are still punished, because they chose to become intoxicated. People who are dementing (or have other serious illnesses) may be punished less on compassionate grounds. 


--
Stathis Papaioannou

Dennis Ochei

Apr 24, 2015, 5:30:43 PM
to everyth...@googlegroups.com
Only temporarily. I leave myself instructions on where to restore my memories.
--
You received this message because you are subscribed to a topic in the Google Groups "Everything List" group.
To unsubscribe from this topic, visit https://groups.google.com/d/topic/everything-list/xrPfkrIWCWw/unsubscribe.
To unsubscribe from this group and all its topics, send an email to everything-li...@googlegroups.com.

To post to this group, send email to everyth...@googlegroups.com.
Visit this group at http://groups.google.com/group/everything-list.
For more options, visit https://groups.google.com/d/optout.


--
Sent from Gmail Mobile

meekerdb

Apr 24, 2015, 8:18:47 PM
to everyth...@googlegroups.com
On 4/24/2015 2:25 PM, LizR wrote:
On 25 April 2015 at 05:52, Dennis Ochei <do.inf...@gmail.com> wrote:
Here's the clincher.

1. Suppose I erase my body's memories after. Do I go to jail?

Unless there is some magic stuff involved in identity, you just committed suicide. How about you commit a crime yesterday then erase your memory of the last year (say) ? Only the 'psychopathic tendencies' argument holds water then, but it would have held equally well a year earlier and wasn't acted on (we assume).

And quite aside from "psychopathic tendencies", society imposes punishments as deterrents as well as to satisfy sentiments for retribution. Having all your memories erased might be deterrent enough, but just having last year's or yesterday's erased probably wouldn't be.

Brent

LizR

Apr 24, 2015, 9:32:49 PM
to everyth...@googlegroups.com
On 25 April 2015 at 05:52, Dennis Ochei <do.inf...@gmail.com> wrote:
Here's the clincher.

1. Suppose I erase my body's memories after. Do I go to jail?

This one counts as suicide - presumably you are left as a vegetable if you erase your memory. However, if you only erase the memory of committing the crime (plus, say, the events that led up to it) then it's more problematic (!) - not to mention fun for the SF writer. This one still falls into my "catch number 2", as mentioned above - you are still a danger to society, and we can prove it!

2. Suppose I erase the memories of this body. I find another body (say a laboratory synthesized one with no memories) and download my memories onto it. Does the new body go to jail?

I'd say yes. I'm assuming we aren't worrying about whether jail is the right punishment, whether you have free will, and so on - just judging the situation on its merits within existing laws and assumptions.

3. I commit a crime and then a buddy of mine, who had no knowledge of the crime decides he wants to experience my memories. He downloads the entirety of my memories while retaining his own. Does he go to jail?

Not unless having your memories somehow makes him more likely to commit the crime himself, in which case he could in theory fall foul of catch-2 (above).

4.  I commit a crime, then I kidnap someone and forcibly download their memories onto my brain, retaining my own. I then delete their memories. Memory transfer technology is at such a stage that it is not possible to transfer or delete selected memories. So it is impossible to remove my memories without removing my victim's. Do I go to jail?

For kidnapping and forcibly copying memories and for murdering the original owner of the memories (by deleting them from his original body) - yes.

5. I commit a crime, then I kidnap someone and forcibly download my memories onto their brain, without erasing theirs. I then delete my memories. Memory transfer technology is at such a stage that it is not possible to transfer or delete selected memories. So it is impossible to remove my memories without removing my victim's. Does my kidnapped victim go to jail?

No, they aren't responsible for what you did to them. Unless your memories somehow turn them into you (more fun for the SF writer!)

At first glance, you want to say no to 1, but then someone could just backup their memories, leave themselves a note on where to restore them, and then waltz out of the country. Reminds me a bit of the anime Death Note.

(Not to mention "Memento". PKD has a lot to answer for.)

Stathis Papaioannou

Apr 24, 2015, 10:46:31 PM
to everyth...@googlegroups.com


On Saturday, April 25, 2015, Dennis Ochei <do.inf...@gmail.com> wrote:
Here's the clincher.

1. Suppose I erase my body's memories after. Do I go to jail?

2. Suppose I erase the memories of this body. I find another body (say a laboratory synthesized one with no memories) and download my memories onto it. Does the new body go to jail?

3. I commit a crime and then a buddy of mine, who had no knowledge of the crime decides he wants to experience my memories. He downloads the entirety of my memories while retaining his own. Does he go to jail?

4.  I commit a crime, then I kidnap someone and forcibly download their memories onto my brain, retaining my own. I then delete their memories. Memory transfer technology is at such a stage that it is not possible to transfer or delete selected memories. So it is impossible to remove my memories without removing my victim's. Do I go to jail?

5. I commit a crime, then I kidnap someone and forcibly download my memories onto their brain, without erasing theirs. I then delete my memories. Memory transfer technology is at such a stage that it is not possible to transfer or delete selected memories. So it is impossible to remove my memories without removing my victim's. Does my kidnapped victim go to jail?


At first glance, you want to say no to 1, but then someone could just backup their memories, leave themselves a note on where to restore them, and then waltz out of the country. Reminds me a bit of the anime Death Note.

You want to say yes to 2, but that seems to entail saying yes to 3-5, and you really don't wanna say yes to 5. Even if you evade that entailment, it seems your answers to 3-5 have to be the same.

 
Well, not only is the concept of personal identity problematic, so are the concepts of guilt and free will. If I kill someone and I did it because of the way I was born and the way my environment was, it's not my fault; and if I did it due to randomness, it's not my fault. So the practical solution to questions of crime and punishment is to do what will deter crime. In particular, people should be deterred from using copying and memory transfer to commit crimes and avoid punishment.


--
Stathis Papaioannou

meekerdb

Apr 24, 2015, 10:58:02 PM
to everyth...@googlegroups.com
On 4/24/2015 7:46 PM, Stathis Papaioannou wrote:
Well, not only is the concept of personal identity problematic, so are the concepts of guilt and free will. If I kill someone and I did it because of the way I was born and the way my environment was, it's not my fault; and if I did it due to randomness, it's not my fault. So the practical solution to questions of crime and punishment is to do what will deter crime. In particular, people should be deterred from using copying and memory transfer to commit crimes and avoid punishment.

Right. All the talk about guilt and free will and what God commands and who's responsible concerns human inventions designed to control and order societies that grew beyond extended families.

Brent

Bruno Marchal

Apr 25, 2015, 1:52:31 PM
to everyth...@googlegroups.com
Guilt and free will can exist, like God, without being (mis)used by powers. But when science is left in the hands of the Church, progress is slowed down, discoveries are hidden, etc.

Theology is the science with this defect: the argument from authority is at its most grave there, and the easiest to fall into, when communicating about it. All serious theologians are aware of the difficulty, and usually criticize the "authorities" from inside.

You must not confuse the domain of investigation with the human theories about it. In theology we are still, even just compared to the Greeks, in the age of fairy tales.

As long as scientists hide the problem under the rug, fake religion will keep its fake, but locally real, power.

Theoretical theology, like all theoretical sciences, can be done without any ontological commitment.

Bruno




Dennis Ochei

Apr 26, 2015, 4:50:48 PM
to everyth...@googlegroups.com
Indeed. The memory criterion reveals itself to be problematic the moment you consider partial transfers. If you transfer all my memories, we've decided, per the criterion, that I would wake up at the destination. But what if you transferred all but one memory? 75%? 50%? Via the sorites paradox, you'd have to conclude that a null transfer still allows you to wake up in the new body. Or you could conclude there is some critical percentage where you go from not arriving to arriving in the new body, which is absurd. Or you could conclude that only a 100% complete transfer allows you to wake in the new body. But that's even worse if we don't consider *gaining* memories equally destructive to identity. Imagine we have a mind M at t0 with a certain set of memories. At t1 it gains a new memory. At t2 it loses that memory. It would mean that M0 = M1, M0 = M2 and M1 != M2.
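The transitivity failure can be made concrete in a few lines of Python. This is a toy formalization (the `same_person` predicate and the memory labels are mine, purely illustrative), not anything proposed in the thread:

```python
# Toy formalization of the strict memory criterion: gaining memories
# preserves identity, losing any memory destroys it. Under that rule,
# "same person" holds iff the earlier memory set is included in the later.
def same_person(earlier, later):
    return earlier <= later  # set inclusion: no memory was lost

M0 = {"childhood", "school"}      # mind at t0
M1 = M0 | {"new_memory"}          # t1: gains one memory
M2 = M1 - {"new_memory"}          # t2: loses it again, so M2 == M0

assert same_person(M0, M1)        # gaining is harmless: M0 = M1
assert same_person(M0, M2)        # identical memory sets: M0 = M2
assert not same_person(M1, M2)    # a memory was lost:    M1 != M2
# So "same person" fails to be transitive, exactly as argued above.
```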

Instead we have to consider the subjective *illusion* of identity, independent of the question of actual identity. Then the answer is clear. The more memories I transfer, the more the new body will believe it is me, the veracity of that belief being an empty question. In the case of a complete transfer the illusion will be total and complete. A partial transfer will create a weaker illusion. If I transfer just a few memories, it will seem to the destination person that they had a dream where they were me, but Zhuangzi will realize he is not the butterfly.

Along another line of thought, the social construct of my identity is deeply dependent on my mind being tied to a body that looks very much like the body it had yesterday. The moment that assumption doesn't hold, punishment breaks down. You can no longer tell who you're dealing with by looking. Obvious solution 1 is to tightly regulate memory transfers. If the government can make them effectively impossible to perform then we can stay in dreamland, retaining the social construct of identity.

Barring that, if memory transfers are possible, then there is no way to deter *someone* from using them to escape punishment. This is a tenuous point, but I think it follows from throwing out the fact of identity while retaining the illusion. Call it a conjecture. We'll come back to this.

Now suppose the government did regular memory scans to track who's who. Memory fingerprinting. Just overlook how this is the most total breach of privacy possible... They would get some sort of similarity measure and use that to track closest-continuer subjective threads. The problem is that it's possible to simply make your subjective thread disappear for some time and reappear later. I will use letters to represent bodies and numbers for minds. The dash indicates their association:

A - 1
B - 2
C - 3

Then 2 splits into 4 and 5. 4 is added to A, 5 is added to C, and a copy of 3 (written 3') overwrites the contents of B.

A - 1,4
B - 3'
C - 3, 5

The operation can be reversed at a later date, reconstituting 2.

The point I'm trying to make is that any person can just cease to exist, only to reappear later.

This is even simpler if we can just write the memories to a hard drive. Then there is no need to hide parts of 2 in other bodies.
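The bookkeeping above can be sketched as a toy data structure: each body hosts a set of mind-thread labels. The helper names (`split`, `overwrite`, `merge`) and the primed-copy convention are mine, purely illustrative:

```python
# Each body hosts a set of mind-thread labels.
bodies = {"A": {"1"}, "B": {"2"}, "C": {"3"}}

def split(mind, parts, destinations):
    """Split `mind` into new labels, storing each part in a destination body."""
    for hosted in bodies.values():
        hosted.discard(mind)
    for part, dest in zip(parts, destinations):
        bodies[dest].add(part)

def overwrite(src_mind, dest_body):
    """A copy of src_mind (primed) erases and replaces dest_body's contents."""
    bodies[dest_body] = {src_mind + "'"}

def merge(parts, new_mind, dest):
    """Recombine the parts into `new_mind` in some destination body."""
    for hosted in bodies.values():
        hosted -= set(parts)
    bodies[dest].add(new_mind)

# Thread 2 splits into 4 and 5 (hidden in A and C); a copy of 3 takes over B.
split("2", ["4", "5"], ["A", "C"])
overwrite("3", "B")
assert bodies == {"A": {"1", "4"}, "B": {"3'"}, "C": {"3", "5"}}

# Later, 4 and 5 recombine and thread 2 "reappears" (destination arbitrary).
merge(["4", "5"], "2", "B")
assert bodies == {"A": {"1"}, "B": {"3'", "2"}, "C": {"3"}}
```

Thread 2 is simply absent from the similarity tracker between the two operations, which is the disappearing act described above.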

The second point is that there is always reasonable doubt that you were in control of your body when you committed a crime. A1 kidnaps B2 and stores 2 on a drive, placing a copy of 1 (call it 1') in body B. B1' commits a crime (say, a kidnapping!). Then 1 and 1' merge in body A, and A1 returns 2 to B.

You could say that 2 has no memories of committing the crime, so he'll get off. But if that's all it takes for innocence, then a criminal can just erase his memories of committing a crime.

I mean we could play with this more but I'd rather get to where I'm going with this. I want to say that punishing *people* for what they did (for deterrence or retributive reasons) is simply intractable in this situation.

Instead, one has to lower the level of abstraction to memes. A memeplex caused a body to act in a certain way. At the mind

Dennis Ochei

Apr 26, 2015, 5:04:01 PM
to everyth...@googlegroups.com
Whoops, accidentally hit send. As I was saying, at the mind-fingerprinting stage the government has to look for criminal memeplexes and render them inert. For instance, let's say a criminal memeplex is composed of two major subunits: the desire to commit the crime and the know-how to commit it. If the government detects both in the same body, then one has to be deleted or modified. Crimes would be attributed to "mind-viruses".

Now, at first glance directly modifying minds seems very 1984-ish. But that's what our criminal justice system is supposed to do *now*: render the desire component of the criminal memeplex inert. This is just a more effective version.

Basically we move from a model where we punish people for what they did to a model where we disassemble memeplexes for what they might cause people to do. This effectively means it will be illegal to have certain ideas in your head.

Of course there is no way the American legal system will be able to keep up with this, so it's gonna be a field day if and when memory transfers are possible.

meekerdb

unread,
Apr 26, 2015, 6:48:08 PM4/26/15
to everyth...@googlegroups.com
On 4/26/2015 1:50 PM, Dennis Ochei wrote:
Along another line of thought, the social construct of my identity is deeply dependent on my mind being tied to a body that looks very much like the body it had yesterday. The moment that assumption doesn't hold, punishment breaks down. You can no longer tell who you're dealing with by looking. Obvious solution 1 is to tightly regulate memory transfers. If the government can make them effectively impossible to perform then we can stay in dreamland, retaining the social construct of identity.

Ah, so that's why Yahweh, Allah, and those other Mesopotamian gods stuck souls in bodies: so they could punish them for breaking commandments. :-)

Brent

Russell Standish

unread,
Apr 26, 2015, 7:42:53 PM4/26/15
to everyth...@googlegroups.com
On Sun, Apr 26, 2015 at 01:50:47PM -0700, Dennis Ochei wrote:
> indeed. The memory criterion reveals itself to be problematic the moment
> you consider partial transfers. If you transfer all my memories, we've
> decided, per the criterion, that I would wake up at the destination. But
> what if you transferred all but one memory? 75%? 50%? Via the sorites
> paradox, you'd have to conclude that a null transfer still allows you to
> wake up in the new body. Or you could conclude there is some critical
> percentage where you go from not arriving to arriving in the new body,
> which is absurd.

Why is this absurd? What if all your memories are interlinked into
some sort of network, and if you leave out enough memories, a
percolation threshold is crossed, and your identity falls apart?
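The percolation suggestion can be checked numerically. A minimal sketch in pure Python, under two assumptions of mine (memories form an Erdos-Renyi random graph with a given mean degree, and identity "holds together" while a giant connected cluster of memories survives the partial transfer):

```python
import random

def largest_cluster_fraction(n, avg_degree, keep_fraction, seed=0):
    """Delete a fraction of memory nodes; return largest surviving cluster / n."""
    rng = random.Random(seed)
    kept = [i for i in range(n) if rng.random() < keep_fraction]
    p = avg_degree / (n - 1)  # edge probability giving the target mean degree
    adj = {i: [] for i in kept}
    for idx, i in enumerate(kept):          # edges among surviving memories
        for j in kept[idx + 1:]:
            if rng.random() < p:
                adj[i].append(j)
                adj[j].append(i)
    seen, best = set(), 0                   # DFS for the largest component
    for start in kept:
        if start in seen:
            continue
        seen.add(start)
        stack, size = [start], 0
        while stack:
            node = stack.pop()
            size += 1
            for nb in adj[node]:
                if nb not in seen:
                    seen.add(nb)
                    stack.append(nb)
        best = max(best, size)
    return best / n

for keep in (1.0, 0.75, 0.5, 0.25, 0.1):
    frac = largest_cluster_fraction(n=2000, avg_degree=3, keep_fraction=keep)
    print(f"keep {keep:4.2f} of memories -> largest cluster: {frac:.2f} of total")
```

With mean degree 3 the giant cluster persists until roughly a third of the memories remain, then collapses abruptly: a sharp threshold rather than a smooth sorites slide, which is the point of the question above.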

--

----------------------------------------------------------------------------
Prof Russell Standish Phone 0425 253119 (mobile)
Principal, High Performance Coders
Visiting Professor of Mathematics hpc...@hpcoders.com.au
University of New South Wales http://www.hpcoders.com.au
----------------------------------------------------------------------------

Telmo Menezes

Apr 27, 2015, 2:21:21 AM
to everyth...@googlegroups.com
On Fri, Apr 24, 2015 at 5:55 PM, Bruno Marchal <mar...@ulb.ac.be> wrote:

On 24 Apr 2015, at 02:30, Telmo Menezes wrote:



On Fri, Apr 24, 2015 at 1:19 AM, LizR <liz...@gmail.com> wrote:
You should both go to jail, on the basis that both copies of you had the same consciousness as the person who committed the murder, and therefore you are both equally responsible (leaving aside considerations of free will etc)

I agree. I would be curious to know if anyone disagrees with this, and why.


Now, I agree. And Liz gave two good arguments, one purely 3p, and the other in terms of moral punishment. The first one is enough, but the second one makes sense too.

Another "terrible question": do people have the right to torture copies, when they accepted the protocols, that is, with consent given before the duplication?

Should that be made illegal?  (assuming the technology, comp, etc.)

If you assume comp, I don't think this is different from the dilemma of whether a person has the right to torture another if the other consents.

Some people are masochistic and desire torture, even to be placed in a situation where they know they can't withdraw consent later. Our current legal systems tend to solve this problem using Monty Python logic: if someone wants to be tortured they have a mental problem, if they have a mental problem they cannot give consent.

I think mainstream western ethics are influenced by the golden rule. We could do worse than the golden rule (see ISIS) but perhaps we could also do better: do unto others as they would have done unto them. This requires a level of tolerance for individual preferences that I don't think human civilization has attained yet.

Telmo.

Telmo Menezes

unread,
Apr 27, 2015, 2:32:20 AM4/27/15
to everyth...@googlegroups.com
Thanks! Although it's a fun scenario to discuss, my main motivation here was to show that appeals to common sense on "personal identity" are a charade. When we mobilize some of our ancient instincts (like the instinct to punish those who disrespect important social norms), those instincts seem to tell us that both copies are valid continuations of the original.

Or, saying it another way, when something important is at stake, it seems that we suddenly know the answer. This proves nothing, of course, but at least gives a counter-example to claims that the comp notion of personal identity is not aligned with our common sense perception of personal identity.

Telmo.
 

Dennis Ochei

unread,
Apr 27, 2015, 3:05:32 AM4/27/15
to everyth...@googlegroups.com
The argument weak point detector is quite strong with this one :). Well, I was leaning on Parfit's reasoning that on a Reductionist view of identity, such distinctions would be arbitrary. But we could for instance divide memories into essential and superfluous categories and pretend we could divine the difference between the two. Addition or loss of an essential memory changes identity, while the same for a superfluous memory does not. It does seem that without sophisticated brain scanning equipment you could not know the facts of your identity--a body might lose or gain an essential memory without the resulting person *realizing* it. The facts of identity might not follow the phenomenology of identity. Of course, the fact that a simple change of one memory could alter identity makes all these law enforcement evasion strategies by memory transfer that much easier.
--
You received this message because you are subscribed to a topic in the Google Groups "Everything List" group.
To unsubscribe from this topic, visit https://groups.google.com/d/topic/everything-list/xrPfkrIWCWw/unsubscribe.
To unsubscribe from this group and all its topics, send an email to everything-li...@googlegroups.com.
To post to this group, send email to everyth...@googlegroups.com.
Visit this group at http://groups.google.com/group/everything-list.
For more options, visit https://groups.google.com/d/optout.

Bruno Marchal

unread,
Apr 27, 2015, 1:03:19 PM4/27/15
to everyth...@googlegroups.com
On 27 Apr 2015, at 08:21, Telmo Menezes wrote:



On Fri, Apr 24, 2015 at 5:55 PM, Bruno Marchal <mar...@ulb.ac.be> wrote:

On 24 Apr 2015, at 02:30, Telmo Menezes wrote:



On Fri, Apr 24, 2015 at 1:19 AM, LizR <liz...@gmail.com> wrote:
You should both go to jail, on the basis that both copies of you had the same consciousness as the person who committed the murder, and therefore you are both equally responsible (leaving aside considerations of free will etc)

I agree. I would be curious to know if anyone disagrees with this, and why.


Now, I agree. And Liz gave two good arguments, one purely 3p, and the other in terms of moral punishment. The first one is enough, but the second one makes sense too.

Another "terrible question": do people have the right to torture copies, when they accepted the protocols, that is, with consent given before the duplication?

Should that be made illegal?  (assuming the technology, comp, etc.)

If you assume comp, I don't think this is different from the dilemma of whether a person has the right to torture another if the other consents.

I can conceive that some find this not acceptable, as long as we have only one exemplar of oneself.



Some people are masochistic and desire torture, even to be placed in a situation where they know they can't withdraw consent later. Our current legal systems tend to solve this problem using Monty Python logic: if someone wants to be tortured they have a mental problem, if they have a mental problem they cannot give consent.

I think mainstream western ethics are influenced by the golden rule. We could do worse than the golden rule (see ISIS) but perhaps we could also do better: do unto others as they would have done unto them. This requires a level of tolerance for individual preferences that I don't think human civilization has attained yet.

Yes. It will take time.

Bruno

Jason Resch

unread,
Apr 27, 2015, 4:58:13 PM4/27/15
to Everything List
What if you step into a delayed duplication machine, and the first one out goes and commits murder at a later time, and then commits suicide; later, the delayed duplicate of you emerges. Do we imprison them, or would that be punishing them for a "pre-crime"?

Jason

On Thu, Apr 23, 2015 at 7:19 PM, LizR <liz...@gmail.com> wrote:
You should both go to jail, on the basis that both copies of you had the same consciousness as the person who committed the murder, and therefore you are both equally responsible (leaving aside considerations of free will etc)

And (this is the clincher) you are both equally a danger to society, having had your psychopathic tendencies duplicated means you're twice as much of a danger as you were when there was only one of you.

QED, "You're nicked, sunshine."

meekerdb

unread,
Apr 27, 2015, 6:33:35 PM4/27/15
to everyth...@googlegroups.com
You make a rule about punishing people that will deter them from committing crimes in a way that maximizes satisfaction in the community.  I'm not sure what rule that is, but it doesn't necessarily have to solve some philosophical problem of personal identity.

In your example, suppose society said, "No we won't punish him."  Then people might be tempted to use this as a way of killing someone they hate.  So society would probably say, "Yes, we'll punish him...and any additional copies of him too."

Brent


On 4/27/2015 1:58 PM, Jason Resch wrote:

LizR

unread,
Apr 27, 2015, 7:24:45 PM4/27/15
to everyth...@googlegroups.com
On 28 April 2015 at 08:58, Jason Resch <jason...@gmail.com> wrote:
What if you step into a delayed duplication machine, and the first one out goes and commits murder at a later time, and then commits suicide; later, the delayed duplicate of you emerges. Do we imprison them, or would that be punishing them for a "pre-crime"?

I think the "public safety" argument comes in here. We have very good evidence that you are both dangerous and mentally unstable. I think we should at least consider offering psychiatric help, and perhaps threaten imprisonment if it's refused.

But of course I don't know how the rest of this hypothetical SF society functions. Maybe we keep you from being a threat by uploading you into a computerised utopia in which your every wish is granted.

LizR

unread,
Apr 27, 2015, 7:26:10 PM4/27/15
to everyth...@googlegroups.com
On 28 April 2015 at 10:33, meekerdb <meek...@verizon.net> wrote:
You make a rule about punishing people that will deter them from committing crimes in a way that maximizes satisfaction in the community.  I'm not sure what rule that is, but it doesn't necessarily have to solve some philosophical problem of personal identity.

In your example, suppose society said, "No we won't punish him."  Then people might be tempted to use this as a way of killing someone they hate.  So society would probably say, "Yes, we'll punish him...and any additional copies of him too."

I agree in principle, although ISTM you'd have to hate someone an awful LOT to kill them and then commit suicide so your duplicate could escape being punished for the crime.

Dennis Ochei

unread,
Apr 27, 2015, 7:28:44 PM4/27/15
to everyth...@googlegroups.com
That only holds if you were planning the murder before you dupped. 

Dennis Ochei

unread,
Apr 27, 2015, 7:42:36 PM4/27/15
to everyth...@googlegroups.com
Right, but *who* to punish in order to deter depends on these questions of identity. Suppose there are three actors who are willing to carry out this delayed-duplication murder-suicide scheme. Furthermore, they don't care what happens to their duplicates (perhaps they think of them as someone else). However, each has a family member whom they care about deeply. You tell the first that his duplicate will be punished if he commits his crime. He doesn't care. You then say you will transfer the punishment onto his family member. This would deter him, but he doesn't believe you are actually so utilitarian, and so he carries out his plans. Now there are still two would-be murderers. Do you punish the first man's family member in order to prove you mean business, deterring the remaining actors?

On purely utilitarian grounds, there is just as much disutility generated when you punish the first actor's duplicate as when you punish the first actor's family member. Furthermore, unless we resolve this question of identity, it doesn't matter to whom this disutility is doled out, as long as it serves its deterrent purpose.

meekerdb

unread,
Apr 27, 2015, 8:04:32 PM4/27/15
to everyth...@googlegroups.com
To really make a good decision we'd have to know a lot more - which is why we have trials.  Just from the above outline we don't even really know that the killer is dangerous or mentally unstable.  Maybe he murdered the guy who bullied his gay son online and caused his son to commit suicide.  Or maybe he murdered his duplicate because his duplicate stole his identity?

Brent

meekerdb

unread,
Apr 27, 2015, 8:14:57 PM4/27/15
to everyth...@googlegroups.com
You don't want to punish the first actor's family because that is disutility to them as well as, or instead of, the actor, and their happiness counts in the society's utility as well as that of the murder victim.  You could just execute/imprison the two remaining actors on the assumption that (a) that will deter similar schemes and (b) they are would-be murderers.

Brent

Russell Standish

unread,
Apr 27, 2015, 8:20:06 PM4/27/15
to everyth...@googlegroups.com
On Mon, Apr 27, 2015 at 12:05:31AM -0700, Dennis Ochei wrote:
> The argument weak point detector is quite strong with this one :). Well, I
> was leaning on Parfit's reasoning that on a Reductionist view of identity,
> such distinctions would be arbitrary. But we could for instance divide
> memories into essential and superfluous categories and pretend we could
> divine the difference between the two. Addition or loss of an essential
> memory changes identity, while the same for a superfluous memory does not.
> It does seem that without sophisticated brain scanning equipment you could
> not know the facts of your identity--a body might lose or gain an essential
> memory without the resulting person *realizing* it. The facts of identity
> might not follow the phenomenology of identity. Of course, the fact that a
> simple change of one memory could alter identity makes all these law
> enforcement evasion strategies by memory transfer that much easier.
>

Yes - I find myself in quite a bit of disagreement with Parfit on
this. He seems to assume that the components of a person interact in
simple additive ways, whereas from what we know of real biological
organisms, the interactions tend to be complex and close to
criticality. A single change _can_ cause an avalanche that causes the
whole system to unravel. Not every change of course. For example, one
can usually remove quite a number of species from an ecosystem without
it changing much. But remove a "keystone" species, and the ecosystem
collapses.

I am rather taken by Marvin Minsky's "The Society of Mind" idea, which
also fits into this notion that one's person (or identity) may be
robust to the removal of some elements, but then catastrophically
collapse when the wrong bit is removed.

Cheers

Dennis Ochei

unread,
Apr 27, 2015, 8:29:23 PM4/27/15
to everyth...@googlegroups.com
> You could just execute/imprison the two remaining actors

Well, I thought it was obvious that you can't just walk down the street and stop them... I didn't realize I had to spell that out.


You don't want to punish the first actor's family because that is disutility to them as well as, or instead of, the actor

Right! We seem to care about punishing the person who committed the crime, not mere deterrence. Deterrence is a necessary but not sufficient reason to inflict disutility on a given person. It must also be the case that the punishee committed the act that we want to deter.

So you cannot punish the delayed duplicate without weighing in on the personal identity question.

So you cannot punish the delayed duplicate unless you are saying the delayed duplicate is the same person.

LizR

unread,
Apr 27, 2015, 8:34:59 PM4/27/15
to everyth...@googlegroups.com
On 28 April 2015 at 11:28, Dennis Ochei <do.inf...@gmail.com> wrote:
That only holds if you were planning the murder before you dupped. 

I assumed that was implied by Brent's comment that "people might be tempted to use this as a way of killing someone they hate".

LizR

unread,
Apr 27, 2015, 8:38:57 PM4/27/15
to everyth...@googlegroups.com
On 28 April 2015 at 12:04, meekerdb <meek...@verizon.net> wrote:
On 4/27/2015 4:24 PM, LizR wrote:
On 28 April 2015 at 08:58, Jason Resch <jason...@gmail.com> wrote:
What if you step into a delayed duplication machine, and the first one out goes and commits murder at a later time, and then commits suicide; later, the delayed duplicate of you emerges. Do we imprison them, or would that be punishing them for a "pre-crime"?

I think the "public safety" argument comes in here. We have very good evidence that you are both dangerous and mentally unstable. I think we should at least consider offering psychiatric help, and perhaps threaten imprisonment if it's refused.

But of course I don't know how the rest of this hypothetical SF society functions. Maybe we keep you from being a threat by uploading you into a computerised utopia in which your every wish is granted.
To really make a good decision we'd have to know a lot more - which is why we have trials. 

Of course. Having been on a jury, I do actually appreciate that.
 
Just from the above outline we don't even really know that the killer is dangerous or mentally unstable. 

It's a reasonable reading given the main facts - he committed a murder and then killed himself. Obviously there may be extenuating circumstances...
 
Maybe he murdered the guy who bullied his gay son online and caused his son to commit suicide. 

...but that isn't actually a justification for murder. (See your first comment about why we have trials.)
 
Or maybe he murdered his duplicate because his duplicate stole his identity?

Ditto, although it might make a fun plot for an SF story.
 

meekerdb

unread,
Apr 27, 2015, 9:56:36 PM4/27/15
to everyth...@googlegroups.com
On 4/27/2015 5:29 PM, Dennis Ochei wrote:
> You could just execute/imprison the two remaining actors

Well, i thought it was obvious that you can't just walk down the street and stop them... I didn't realize I had to spell that out.

You don't want to punish the first actor's family because that is disutility to them as well as, or instead of, the actor

Right! We seem to care about punishing the person who committed the crime, not mere deterrence. Deterrence is a necessary but not sufficient reason to inflict disutility on a given person. It must also be the case that the punishee committed the act that we want to deter.

So you cannot punish the delayed duplicate without weighing in on the personal identity question.

No.  It's not that we care about punishing the person who committed the crime.  We care about deterrence as it contributes to the overall well-being of the society.  So we weigh the disutility of punishing the family against the utility it would provide in deterrence. I think it would come out highly negative, but it depends somewhat on how much prospective murderers are influenced by their family's welfare. You usually get the most deterrent utility by punishing the person who committed the crime, but that's a consequence of how psychology works - not a basic principle.

Brent

Dennis Ochei

unread,
Apr 27, 2015, 10:42:29 PM4/27/15
to everyth...@googlegroups.com
If you are punishing the family member the same way you would punish the perpetrator, then how is the disutility any different?

And correct me if I'm wrong: you're fine with punishing family members or friends if the deterrence value is sufficient?


On Monday, April 27, 2015, meekerdb <meek...@verizon.net> wrote:

meekerdb

unread,
Apr 27, 2015, 11:31:22 PM4/27/15
to everyth...@googlegroups.com
I'm not saying it's justification.  In fact, from a utilitarian analysis of murder laws, "justification" is a kind of derivative attribute of laws.  I'm saying that he may not be a danger to other people at all.  That's not a good basis for judging the utility of punishment or laws.  People who kill their wife or husband in anger are very unlikely to kill anyone else.

Brent

meekerdb

unread,
Apr 28, 2015, 12:16:03 AM4/28/15
to everyth...@googlegroups.com
On 4/27/2015 7:42 PM, Dennis Ochei wrote:
If you are punishing the family member the same way you would punish the perpetrator, then how is the disutility any different?

Its effect on the rest of the community is different.  Someone else in the community may think, "I can't really control what my son does and he's kinda mean and high-strung.  I'd better get out of this place before my son does something that gets me punished."

Brent
