Forget Zombies, Let's Talk Torture


Craig Weinberg

Sep 27, 2012, 10:55:55 AM
to everyth...@googlegroups.com
Say that you have been captured by the [totalitarian fiend of your choice], and are tied up in a basement somewhere. The torture has begun, and it has become clear that it will continue to get worse until you 'become one of them'.

Fortunately you have been supplied by your team with a 'Chalmers' device, which allows you to know exactly what to say and do to convince your captors that you have turned and become 'one of them' in earnest. Using real-time EM field sensing and quantum computing, the computational states of everyone in the room are not only analyzed but predicted, so that you are furnished with the best lines and gestures, sobbing, explaining, etc.

The Chalmers device allows you to be a flawless actor. Is there any reason that this wouldn't work in theory? What law says that acting can only be so good, and beyond that you actually have to 'love Big Brother' in order to seem like you do? If we had a device that would allow us to control our bodies, emotions, and minds precisely and absolutely, why couldn't we use that device as a mask?

Part II

Instead of replacing parts of the brain with perfect functional replicas, what if we used a hot wire to ablate or burn parts of the brain? If I burn one region, you lose the power of speech. If I burn another, you lose all understanding of physics and math. If I burn another, you go into a coma. I can do different combinations of ablation on different subjects, but would there be any case in which someone who was dead could be induced to speak or solve math problems? Why not? I could replace the motherboard of a burned-out computer with any other compatible motherboard and expect to pick up right where I left off. If I toasted a critical part of any computer, there would be no loss of potential functionality in any of the other parts, whether that part is implicated in the boot-up process or not. Just because a computer won't boot doesn't mean that it can't be easily repaired. Not so with a living organism. If you blow out a simple power supply in a biological system, it will never run again - not even a little bit.

What say ye?

Craig

Stathis Papaioannou

Sep 27, 2012, 8:39:42 PM
to everyth...@googlegroups.com
On Fri, Sep 28, 2012 at 12:55 AM, Craig Weinberg <whats...@gmail.com> wrote:
> Say that you have been captured by the [totalitarian fiend of your choice],
> and are tied up in a basement somewhere. The torture has begun, and it has
> become clear that it will continue to get worse until you 'become one of
> them'.
>
> Fortunately you have been supplied by your team with a 'Chalmers' device,
> which allows you to know exactly what to say and do to convince your captors
> that you have turned and become 'one of them' in earnest. Using real-time em
> field sensitivity and quantum computing, the computational states are not
> only analyzed, but predicted for everyone in the room so that you are
> furnished with the best lines and gestures, sobbing, explaining, etc.
>
> The Chalmers device allows you to be a flawless actor. Is there any reason
> that this wouldn't work in theory? What law says that acting can only be so
> good, and beyond that you actually have to 'love Big Brother' in order to
> seem like you do? If we had a device that would allow us to control our
> bodies, emotions, and minds precisely and absolutely, why couldn't we use
> that device as a mask?

The perfect actor might believe it or he might just be acting. Acting
is top-down replacement, not bottom-up replacement. Bottom-up
replacement would involve replacing a part of your brain so that you
didn't notice any difference and no-one else noticed any difference.

> Part II
>
> Instead of replacing parts of the brain with perfect functional replicas,
> what if we used a hot wire to ablate or burn parts of the brain. If I burn
> one region, you lose the power of speech. If I burn another, you lose all
> understanding of physics and math. If I burn another, you go into a coma. I
> can do different combinations of ablation on different subjects, but would
> there be any case in which someone who was dead could be induced to speak or
> solve math problems? Why not? I could replace the motherboard of a burned
> out computer with any other compatible motherboard and expect to pick up
> right where I left off. If I toasted a critical part of any computer, there
> is no loss of potential functionality to any of the other parts, whether
> that part is implicated in the boot up process or not. Just because a
> computer won't boot doesn't mean that it can't be easily repaired. Not so
> with a living organism. If you blow out a simple power supply in a
> biological system, it will never run again - not even a little bit.
>
> What say ye?

Replacing body parts that break down with artificial ones is
well-established in the medical industry, and will become increasingly
so in future as the devices become more sophisticated.


--
Stathis Papaioannou

Craig Weinberg

Sep 27, 2012, 10:40:00 PM
to everyth...@googlegroups.com


On Thursday, September 27, 2012 8:40:14 PM UTC-4, stathisp wrote:
On Fri, Sep 28, 2012 at 12:55 AM, Craig Weinberg <whats...@gmail.com> wrote:
> Say that you have been captured by the [totalitarian fiend of your choice],
> and are tied up in a basement somewhere. The torture has begun, and it has
> become clear that it will continue to get worse until you 'become one of
> them'.
>
> Fortunately you have been supplied by your team with a 'Chalmers' device,
> which allows you to know exactly what to say and do to convince your captors
> that you have turned and become 'one of them' in earnest. Using real-time em
> field sensitivity and quantum computing, the computational states are not
> only analyzed, but predicted for everyone in the room so that you are
> furnished with the best lines and gestures, sobbing, explaining, etc.
>
> The Chalmers device allows you to be a flawless actor. Is there any reason
> that this wouldn't work in theory? What law says that acting can only be so
> good, and beyond that you actually have to 'love Big Brother' in order to
> seem like you do? If we had a device that would allow us to control our
> bodies, emotions, and minds precisely and absolutely, why couldn't we use
> that device as a mask?

The perfect actor might believe it or he might just be acting. Acting
is top-down replacement, not bottom-up replacement. Bottom-up
replacement would involve replacing a part of your brain so that you
didn't notice any difference and no-one else noticed any difference.

Acting is an augmentation, not a replacement. It's a skill set. It involves a capacity to embody social expectations so that one's audience doesn't notice any difference. It's the same exact result from the third person view. An actor is a zombie being operated by a person.


> Part II
>
> Instead of replacing parts of the brain with perfect functional replicas,
> what if we used a hot wire to ablate or burn parts of the brain. If I burn
> one region, you lose the power of speech. If I burn another, you lose all
> understanding of physics and math. If I burn another, you go into a coma. I
> can do different combinations of ablation on different subjects, but would
> there be any case in which someone who was dead could be induced to speak or
> solve math problems? Why not? I could replace the motherboard of a burned
> out computer with any other compatible motherboard and expect to pick up
> right where I left off. If I toasted a critical part of any computer, there
> is no loss of potential functionality to any of the other parts, whether
> that part is implicated in the boot up process or not. Just because a
> computer won't boot doesn't mean that it can't be easily repaired. Not so
> with a living organism. If you blow out a simple power supply in a
> biological system, it will never run again - not even a little bit.
>
> What say ye?

Replacing body parts that break down with artificial ones is
well-established in the medical industry, and will become increasingly
so in future as the devices become more sophisticated.

Are you saying that you expect replacing someone's brain would be no more problematic than replacing any other body part?

Craig
 


--
Stathis Papaioannou

meekerdb

Sep 27, 2012, 11:05:11 PM
to everyth...@googlegroups.com
On 9/27/2012 7:40 PM, Craig Weinberg wrote:
The perfect actor might believe it or he might just be acting. Acting
is top-down replacement, not bottom-up replacement. Bottom-up
replacement would involve replacing a part of your brain so that you
didn't notice any difference and no-one else noticed any difference.

Acting is an augmentation, not a replacement. It's a skill set. It involves a capacity to embody social expectations so that one's audience doesn't notice any difference. It's the same exact result from the third person view. An actor is a zombie being operated by a person.

The idea is to replace parts so that there is no behavior difference *under any circumstance* - acting, as you've conceived it, is limited to a particular situation.

Brent

Stathis Papaioannou

Sep 27, 2012, 11:15:41 PM
to everyth...@googlegroups.com
On Fri, Sep 28, 2012 at 12:40 PM, Craig Weinberg <whats...@gmail.com> wrote:

>> Replacing body parts that break down with artificial ones is
>> well-established in the medical industry, and will become increasingly
>> so in future as the devices become more sophisticated.
>
>
> Are you saying that you expect replacing someone's brain would be no more
> problematic than replacing any other body part?

It will be more difficult to make an adequate replacement the more
complicated the part is, but the principle is the same: put the device
in, ask the patient how he feels, observe the patient to see how he
behaves including tests and investigations. If he says he feels
normal, he behaves normally and test results are normal you have
succeeded.


--
Stathis Papaioannou

Craig Weinberg

Sep 27, 2012, 11:28:27 PM
to everyth...@googlegroups.com

If you understand my thought experiment then you would realize that this is the same thing. Just as a zombie arbitrarily asserts "no behavior difference *under any circumstance*", my acting service does exactly the same thing. It is a high-technology simulation-prediction which augments rather than replaces the existing nervous system. My concept of acting is *specifically unlimited* and applies to all possible situations forever. That's what makes it a thought experiment. You have to accept the premise of the thought experiment or else explain why the premise is unworkable (as I have done repeatedly with the zombie assumption).

Craig


Brent

Craig Weinberg

Sep 27, 2012, 11:30:24 PM
to everyth...@googlegroups.com

The principle is not the same. You cannot get a head transplant and assume that the 'you'-ness is going to magically follow the scalpel into your head from your body. You cannot get a prosthetic head, because without a head, there is no 'you' there anymore.

Craig
 


--
Stathis Papaioannou

Stephen P. King

Sep 27, 2012, 11:30:38 PM
to everyth...@googlegroups.com
Hi Craig,

    I kinda have to side with Stathis a bit here. The problem that you are hinging an argument on is merely technical, it is not principled. My opinion is that a neuron is vastly more complex in its structure than a transistor; heck, it's got its own power supply and repair system and more built in! Nature, if anything, is frugal; there would not be redundant stuff in a neuron such that we would only need to replace some aspect of it in order to achieve functional equivalence.

    The point is that the brain is a specialized biological computer that has achieved computational universality because it learned how to process language. It is because it can figure with symbols and representations that it can do what it does. This does not make it "special" in any miraculous way; it just shows us how Nature and its evolutionary ways are vastly more "intelligent" than we can possibly imagine ourselves to be.

-- 
Onward!

Stephen

http://webpages.charter.net/stephenk1/Outlaw/Outlaw.html

meekerdb

Sep 27, 2012, 11:56:48 PM
to everyth...@googlegroups.com
On 9/27/2012 8:28 PM, Craig Weinberg wrote:


On Thursday, September 27, 2012 11:05:20 PM UTC-4, Brent wrote:
On 9/27/2012 7:40 PM, Craig Weinberg wrote:
The perfect actor might believe it or he might just be acting. Acting
is top-down replacement, not bottom-up replacement. Bottom-up
replacement would involve replacing a part of your brain so that you
didn't notice any difference and no-one else noticed any difference.

Acting is an augmentation, not a replacement. It's a skill set. It involves a capacity to embody social expectations so that one's audience doesn't notice any difference. It's the same exact result from the third person view. An actor is a zombie being operated by a person.

The idea is to replace parts so that there is no behavior difference *under any circumstance* - acting, as you've conceived it, is limited to a particular situation.

If you understand my thought experiment then you would realize that this is the same thing. Just as a zombie arbitrarily asserts "no behavior difference *under any circumstance*", my acting service does exactly the same thing. It is a high-technology simulation-prediction which augments rather than replaces the existing nervous system. My concept of acting is *specifically unlimited* and applies to all possible situations forever. That's what makes it a thought experiment.

Then I would say it's not distinct from 'being'.  It is no longer a choice, "I'm going to act." motivated by some particular situation.

Brent

You have to accept the premise of the thought experiment or else explain why the premise is unworkable (as I have done repeatedly with the zombie assumption).

Craig


Brent

Craig Weinberg

Sep 27, 2012, 11:57:54 PM
to everyth...@googlegroups.com

Yes and no. It is biological and one of the things that it does is compute, but computation is not sufficient to describe the brain (or any organic cell, tissue, or system).
 
that has achieved computational universality because it learned how to process language.

The role of language is controversial. It's important, no doubt, but it isn't clear that human language is the killer app that enabled the rise of Homo sapiens. We don't really know which organisms have language, nor can we say for sure that any species has no language as far as I can tell. Quorum sensing is bacterial language. Prairie dogs have language, birds, crickets, trees. It depends how we define it.

It is because it can figure with symbols and representations that it can do what it does. This does not make it "special" in any miraculous way, it just shows us how Nature and its evolutionary ways is vastly more "intelligent" than we can possibly imagine ourselves to be.

I agree it's not special in any miraculous way. I have never advocated human exceptionalism. What does that have to do with acting being a perfectly appropriate counterfactual for the zombie assumption?

Craig

Craig Weinberg

Sep 28, 2012, 12:01:04 AM
to everyth...@googlegroups.com


On Thursday, September 27, 2012 11:56:58 PM UTC-4, Brent wrote:
On 9/27/2012 8:28 PM, Craig Weinberg wrote:


On Thursday, September 27, 2012 11:05:20 PM UTC-4, Brent wrote:
On 9/27/2012 7:40 PM, Craig Weinberg wrote:
The perfect actor might believe it or he might just be acting. Acting
is top-down replacement, not bottom-up replacement. Bottom-up
replacement would involve replacing a part of your brain so that you
didn't notice any difference and no-one else noticed any difference.

Acting is an augmentation, not a replacement. It's a skill set. It involves a capacity to embody social expectations so that one's audience doesn't notice any difference. It's the same exact result from the third person view. An actor is a zombie being operated by a person.

The idea is to replace parts so that there is no behavior difference *under any circumstance* - acting, as you've conceived it, is limited to a particular situation.

If you understand my thought experiment then you would realize that this is the same thing. Just as a zombie arbitrarily asserts "no behavior difference *under any circumstance*", my acting service does exactly the same thing. It is a high-technology simulation-prediction which augments rather than replaces the existing nervous system. My concept of acting is *specifically unlimited* and applies to all possible situations forever. That's what makes it a thought experiment.

Then I would say it's not distinct from 'being'.  It is no longer a choice, "I'm going to act." motivated by some particular situation.

You would be wrong. Acting is like any other capacity or skill. You can always choose not to act, but in this example, if you choose to, then nobody can tell the difference. You can exhibit the behaviors of a zombie at your discretion.

Craig

meekerdb

Sep 28, 2012, 12:02:54 AM
to everyth...@googlegroups.com
On 9/27/2012 9:01 PM, Craig Weinberg wrote:


On Thursday, September 27, 2012 11:56:58 PM UTC-4, Brent wrote:
On 9/27/2012 8:28 PM, Craig Weinberg wrote:


On Thursday, September 27, 2012 11:05:20 PM UTC-4, Brent wrote:
On 9/27/2012 7:40 PM, Craig Weinberg wrote:
The perfect actor might believe it or he might just be acting. Acting
is top-down replacement, not bottom-up replacement. Bottom-up
replacement would involve replacing a part of your brain so that you
didn't notice any difference and no-one else noticed any difference.

Acting is an augmentation, not a replacement. It's a skill set. It involves a capacity to embody social expectations so that one's audience doesn't notice any difference. It's the same exact result from the third person view. An actor is a zombie being operated by a person.

The idea is to replace parts so that there is no behavior difference *under any circumstance* - acting, as you've conceived it, is limited to a particular situation.

If you understand my thought experiment then you would realize that this is the same thing. Just as a zombie arbitrarily asserts "no behavior difference *under any circumstance*", my acting service does exactly the same thing. It is a high-technology simulation-prediction which augments rather than replaces the existing nervous system. My concept of acting is *specifically unlimited* and applies to all possible situations forever. That's what makes it a thought experiment.

Then I would say it's not distinct from 'being'.  It is no longer a choice, "I'm going to act." motivated by some particular situation.

You would be wrong. Acting is like any other capacity or skill. You can always choose not to act, but in this example, if you choose to, then nobody can tell the difference. You can exhibit the behaviors of a zombie at your discretion.

If you can choose then ex hypothesi there is a circumstance in which you would choose not to act.

Brent

Craig Weinberg

Sep 28, 2012, 12:07:50 AM
to everyth...@googlegroups.com


On Thursday, September 27, 2012 11:56:58 PM UTC-4, Brent wrote:

Then I would say it's not distinct from 'being'.  It is no longer a choice, "I'm going to act." motivated by some particular situation.

Brent


Think of it as an 'auto-pilot' functionality. Instead of getting a brain replacement, you use the same exact technology to augment your brain, even making a conjoined twin brain if you like, over which you have hot-swappable manual override capability. Just as you can choose to breathe voluntarily or let your body continue to breathe at its own pace, your zombie twin would work the same way. You turn it off and on at will and nobody is the wiser - just like your lungs.

Craig

Craig Weinberg

Sep 28, 2012, 12:10:08 AM
to everyth...@googlegroups.com

That's up to the actor. They may choose to stay on auto-pilot forever. It's up to them.

Craig
 

Brent

Roger Clough

Sep 28, 2012, 6:30:25 AM
to everything-list
Hi Craig Weinberg

I never picked up on this Zombie debate
and acting, but the world according to
Leibniz is a theater in which everything
happens "as if" there are actual forces and masses,
but which is actually controlled by a different
form of logic than Newtonian mechanics etc.



Roger Clough, rcl...@verizon.net
9/28/2012
"Forever is a long time, especially near the end." -Woody Allen


----- Receiving the following content -----
From: Craig Weinberg
Receiver: everything-list
Time: 2012-09-27, 23:28:27
Subject: Re: Forget Zombies, Let's Talk Torture

Roger Clough

Sep 28, 2012, 6:39:17 AM
to everything-list
Hi Craig Weinberg and Stathisp,

Your identity is only partly in your brain; it
is in your DNA, your body and your history.
The sum totality of your identity is called your soul,
which suggests a serious shortcoming of materialism.
So far materialists deny that there is such a thing as a soul.
And even if there were one, nobody has proposed how a soul
transplant could be done.




Roger Clough, rcl...@verizon.net
9/28/2012
"Forever is a long time, especially near the end." -Woody Allen


----- Receiving the following content -----
From: Craig Weinberg
Receiver: everything-list
Time: 2012-09-27, 23:30:24
Subject: Re: Forget Zombies, Let's Talk Torture





Stephen P. King

Sep 28, 2012, 2:44:32 PM
to everyth...@googlegroups.com
On 9/27/2012 11:57 PM, Craig Weinberg wrote:
Are you saying that you expect replacing someone's brain would be no more problematic than replacing any other body part?

Craig
Hi Craig,

    I kinda have to side with Stathis a bit here. The problem that you are hinging an argument on is merely technical, it is not principled. My opinion is that a neuron is vastly more complex in its structure than a transistor; heck, it's got its own power supply and repair system and more built in! Nature, if anything, is frugal; there would not be redundant stuff in a neuron such that we would only need to replace some aspect of it in order to achieve functional equivalence.

    The point is that the brain is a specialized biological computer

Yes and no. It is biological and one of the things that it does is compute, but computation is not sufficient to describe the brain (or any organic cell, tissue, or system).

Hi Craig,

    I agree. It does not "just compute".


 
that has achieved computational universality because it learned how to process language.

The role of language is controversial. It's important, no doubt, but it isn't clear that human language is the killer app that enabled the rise of Homo sapiens. We don't really know which organisms have language, nor can we say for sure that any species has no language as far as I can tell. Quorum sensing is bacterial language. Prairie dogs have language, birds, crickets, trees. It depends how we define it.

    Any representational and (at least potentially) sharable form of interaction is language, in my thinking.



It is because it can figure with symbols and representations that it can do what it does. This does not make it "special" in any miraculous way, it just shows us how Nature and its evolutionary ways is vastly more "intelligent" than we can possibly imagine ourselves to be.

I agree it's not special in any miraculous way. I have never advocated human exceptionalism.

    I do advocate it. Humans are exceptional if only because we can make the claim and make attempts to demonstrate the possibility! The fact that we can question whether we are or not, and seek answers to the question of consciousness, is exceptional!


What does that have to do with acting being a perfectly appropriate counterfactual for the zombie assumption?

    My point about zombies is that if we are going to stipulate their existence as being exactly like humans except that they have no qualia (first person percepts and all that), then we have to be consistent with that definition in our discussions of them.

Craig Weinberg

Sep 28, 2012, 3:00:44 PM
to everyth...@googlegroups.com


On Friday, September 28, 2012 2:44:32 PM UTC-4, Stephen Paul King wrote:
On 9/27/2012 11:57 PM, Craig Weinberg wrote:
Are you saying that you expect replacing someone's brain would be no more problematic than replacing any other body part?

Craig
Hi Craig,

    I kinda have to side with Stathis a bit here. The problem that you are hinging an argument on is merely technical, it is not principled. My opinion is that a neuron is vastly more complex in its structure than a transistor; heck, it's got its own power supply and repair system and more built in! Nature, if anything, is frugal; there would not be redundant stuff in a neuron such that we would only need to replace some aspect of it in order to achieve functional equivalence.

    The point is that the brain is a specialized biological computer

Yes and no. It is biological and one of the things that it does is compute, but computation is not sufficient to describe the brain (or any organic cell, tissue, or system).

Hi Craig,

    I agree. It does not "just compute".

 
that has achieved computational universality because it learned how to process language.

The role of language is controversial. It's important, no doubt, but it isn't clear that human language is the killer app that enabled the rise of Homo sapiens. We don't really know which organisms have language, nor can we say for sure that any species has no language as far as I can tell. Quorum sensing is bacterial language. Prairie dogs have language, birds, crickets, trees. It depends how we define it.

    Any representational and (at least potentially) sharable form of interaction is language, in my thinking.

That's what I think too. The entire universe can be considered language really. Texts.
 


It is because it can figure with symbols and representations that it can do what it does. This does not make it "special" in any miraculous way, it just shows us how Nature and its evolutionary ways is vastly more "intelligent" than we can possibly imagine ourselves to be.

I agree it's not special in any miraculous way. I have never advocated human exceptionalism.

    I do advocate it. Humans are exceptional if only because we can make the claim and make attempts to demonstrate the possibility! The fact that we can question whether we are or not, and seek answers to the question of consciousness, is exceptional!

I agree. I mean exceptional in the sense that some people consider humans as being not really animals but special beings that happen to have an animal body. I see human beings as a clear product of the animal kingdom.
 

What does that have to do with acting being a perfectly appropriate counterfactual for the zombie assumption?

    My point about zombies is that if we are going to stipulate their existence as being exactly like humans except that they have no qualia (first person percepts and all that), then we have to be consistent with that definition in our discussions of them.


If you had the technology to augment your acting skills in the way I described, then that is exactly what it would be. You would have a zombie mask that functions entirely by comparing detected brain data.

Craig
 

John Mikes

Sep 28, 2012, 3:45:37 PM
to everyth...@googlegroups.com
Brent, I 'experienced' such a situation in 1944, when the Nazi Gendarmerie's political police arrested me on suspicion of being part of the anti-Nazi underground (which was true). I made them 'believe' I was at most a 'neutral' grad student, so they asked questions before the torture started. I was 'believable', so they gave me 3 days to "think about it", after which I was released with an assignment to report on my friends (which I never did). Later, however, the Communists accused me of having been a 'secret agent' for the Nazis in 1944, and I had to 'make believe' (again) that none of it was true. So they, too, let me go. (I was in Communist confinement 3 times.)
I was NEVER an actor. It was the skill of surviving under pressure.
The rest may be the skill of Stathis' profession.
John M


 


Stathis Papaioannou

Sep 29, 2012, 2:42:24 PM
to everyth...@googlegroups.com
On Fri, Sep 28, 2012 at 1:30 PM, Craig Weinberg <whats...@gmail.com> wrote:

> The principle is not the same. You cannot get a head transplant and assume
> that the 'you'-ness is going to magically follow the scalpel into your head
> from your body. You cannot get a prosthetic head, because without a head,
> there is no 'you' there anymore.

What test do you use to determine if it is still you after a certain
procedure? For example, someone could claim that photographing a
person destroys their soul, and they are no longer the same person. I
would point out that they behave like the same person and seem to
honestly believe that they are the same person, but the
counterargument is that this counts for nothing, since this would be
true of an identical copy and the original person would be dead.


--
Stathis Papaioannou

Craig Weinberg

Sep 29, 2012, 9:18:22 PM
to everyth...@googlegroups.com


On Saturday, September 29, 2012 2:42:56 PM UTC-4, stathisp wrote:
On Fri, Sep 28, 2012 at 1:30 PM, Craig Weinberg <whats...@gmail.com> wrote:

> The principle is not the same. You cannot get a head transplant and assume
> that the 'you'-ness is going to magically follow the scalpel into your head
> from your body. You cannot get a prosthetic head, because without a head,
> there is no 'you' there anymore.

What test do you use to determine if it is still you after a certain
procedure?

You do half of the procedure, then walk them back off, then the other half and walk them back off, then you do the whole procedure and walk them back off. This would give the subject at least an idea of any obvious problems with the procedure.
 
For example, someone could claim that photographing a
person destroys their soul, and they are no longer the same person. I
would point out that they behave like the same person and seem to
honestly believe that they are the same person, but the
counterargument is that this counts for nothing, since this would be
true of an identical copy and the original person would be dead.

Yes, there would be no way to undo taking a picture, so there isn't really anything better than the subjective interpretation to tell the difference.

Craig
 


--
Stathis Papaioannou

Stathis Papaioannou

Sep 29, 2012, 9:42:20 PM
to everyth...@googlegroups.com
On Sun, Sep 30, 2012 at 11:18 AM, Craig Weinberg <whats...@gmail.com> wrote:

>> What test do you use to determine if it is still you after a certain
>> procedure?
>
>
> You do half of the procedure, then walk them back off, then the other half
> and walk them back off, then you do the whole procedure and walk them back
> off. This would give the subject at least an idea of any obvious problems
> with the procedure.

OK, so you put in the brain implant, switch it in and out of circuit
without telling the subject which is which, and ask them how they
feel. They can't tell any difference and you can't tell any difference
in behaviour. To make the experiment better there would be two
researchers, one doing the switching and another analysing the
subject's behaviour. With this double blind procedure the implant is
pronounced successful. Is that good enough?
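
[For concreteness, here is a toy sketch of the double-blind switching protocol described above, as it might be simulated in Python. The function names, the scoring model, and the 0.05 'indistinguishability' threshold are all illustrative assumptions added by the editor, not anything specified in the thread.]

    import random

    def run_double_blind_trial(subject_response, n_sessions=20):
        """Toy double-blind switching test for a neural implant.

        subject_response(implant_active) should return a behaviour score.
        One researcher (this function) fixes a hidden on/off schedule;
        the analyst only ever sees the scores, never the schedule.
        """
        schedule = [True] * (n_sessions // 2) + [False] * (n_sessions // 2)
        random.shuffle(schedule)  # hidden from both the analyst and the subject
        scores = [subject_response(active) for active in schedule]

        # Analyst's step: compare mean behaviour with and without the implant,
        # using only the scores (the schedule stays sealed until unblinding).
        on = [s for s, a in zip(scores, schedule) if a]
        off = [s for s, a in zip(scores, schedule) if not a]
        difference = abs(sum(on) / len(on) - sum(off) / len(off))

        # Pronounce the implant successful if behaviour is indistinguishable.
        return difference < 0.05

    if __name__ == "__main__":
        # Hypothetical subject whose behaviour does not depend on the implant.
        subject = lambda active: 1.0 + random.gauss(0.0, 0.01)
        print(run_double_blind_trial(subject))  # expected: True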


--
Stathis Papaioannou

Craig Weinberg

Sep 30, 2012, 1:15:55 AM
to everyth...@googlegroups.com

No, I think that you have to have each hemisphere of the brain offloaded completely to the device one at a time, then both, and then back, and have the subject live that way at each stage for several months before finally being restored back to their original brain. This would be repeated several times with double blind placebo offloadings. The subject would then decide for themselves if it was safe for them to say yes to the doctor.

I entertain this only theoretically though, as I think in reality it would fail completely, with every case resulting right away in unconsciousness, amnesia, coma, death, trauma, and psychosis and the whole project ultimately being abandoned for good.

Craig



--
Stathis Papaioannou

Stathis Papaioannou

Sep 30, 2012, 6:18:44 AM
to everyth...@googlegroups.com
On Sun, Sep 30, 2012 at 3:15 PM, Craig Weinberg <whats...@gmail.com> wrote:

>> OK, so you put in the brain implant, switch it in and out of circuit
>> without telling the subject which is which, and ask them how they
>> feel. They can't tell any difference and you can't tell any difference
>> in behaviour. To make the experiment better there would be two
>> researchers, one doing the switching and another analysing the
>> subject's behaviour. With this double blind procedure the implant is
>> pronounced successful. Is that good enough?
>
>
> No, I think that you have to have each hemisphere of the brain offloaded
> completely to the device one at a time, then both, and then back, and have
> the subject live that way at each stage for several months before finally
> being restored back to their original brain. This would be repeated several
> times with double blind placebo offloadings. The subject would then decide
> for themselves if it was safe for them to say yes to the doctor.

One would hope the scientists try it with a more limited part of the
brain before moving to an entire hemisphere.

> I entertain this only theoretically though, as I think in reality it would
> fail completely, with every case resulting right away in unconsciousness,
> amnesia, coma, death, trauma, and psychosis and the whole project ultimately
> being abandoned for good.

I don't doubt that initial experiments would not yield ideal results.
Neural prostheses would initially be used for people with
disabilities. Cochlear implants are better than being deaf, but not as
good as normal hearing. But technology keeps getting better while the
human body stays more or less static, so at some point technology will
match and then exceed it. At the very least, there is no theoretical
reason why it should not.


--
Stathis Papaioannou

Craig Weinberg

Sep 30, 2012, 11:45:33 AM
to everyth...@googlegroups.com

I'm all for neural mods and implants. Augmenting and repairing the brain = great; replacing the brain = theoretically viable only in theories rooted in blind physicalism, in which consciousness is inconceivable to begin with.

Craig
--
Stathis Papaioannou

meekerdb

Sep 30, 2012, 2:03:03 PM
to everyth...@googlegroups.com
On 9/30/2012 3:18 AM, Stathis Papaioannou wrote:
I don't doubt that initial experiments would not yield ideal results.
Neural prostheses would initially be used for people with
disabilities. Cochlear implants are better than being deaf, but not as
good as normal hearing. But technology keeps getting better while the
human body stays more or less static, so at some point technology will
match and then exceed it. At the very least, there is no theoretical
reason why it should not.

Indeed.  And cochlear implants could have a much wider frequency range (like I did when I was younger :-) ), and they could even be designed to 'hear' RF.  So then Nagel will be able to ask "What is it like to be a human?"

Brent


Stephen P. King

Sep 30, 2012, 3:45:55 PM
to everyth...@googlegroups.com
Hi Brent,

    The actual real-world cochlear implants that have been installed have a very feeble range of frequencies, as interfacing the device with the brain is not a well-understood area. In principle it should be possible to create an entire full-spectrum detection system. The hard problem is "how do you interface with a brain?"
   
-- 
Onward!

Stephen

Craig Weinberg

Sep 30, 2012, 7:05:46 PM
to everyth...@googlegroups.com

Exactly. It's one thing for a person to use an artificial hand, but what is it that learns to use an artificial 'you'? It's hard for me to understand how this obvious Grand Canyon is repeatedly glossed over in these conversations. Head amputation? No big deal... Ehhh, not so fast, I say, and saying not so fast doesn't make someone a Luddite. It just doesn't make sense that, without understanding anything about how or why subjectivity comes to be, we should presume to reproduce it through imitation of the very body parts which seem to show no trace of consciousness without us.

Craig
   
-- 
Onward!

Stephen

Stathis Papaioannou

Oct 1, 2012, 1:35:53 AM
to everyth...@googlegroups.com
On Mon, Oct 1, 2012 at 1:45 AM, Craig Weinberg <whats...@gmail.com> wrote:

>> I don't doubt that initial experiments would not yield ideal results.
>> Neural prostheses would initially be used for people with
>> disabilities. Cochlear implants are better than being deaf, but not as
>> good as normal hearing. But technology keeps getting better while the
>> human body stays more or less static, so at some point technology will
>> match and then exceed it. At the very least, there is no theoretical
>> reason why it should not.
>>
>>
>
> I'm all for neural mods and implants. Augmenting and repairing brain =
> great, replacing the brain = theoretically viable only in theories rooted in
> blind physicalism, in which consciousness is inconceivable to begin with.

You're suggesting that even if one implant works as well as the
original, multiple implants would not. Is there a critical replacement
limit, 20% you feel normal but 21% you don't? How have you arrived at
this insight?


--
Stathis Papaioannou

Craig Weinberg

Oct 1, 2012, 9:45:18 AM
to everyth...@googlegroups.com

If you have one brain tumor, you may still function. With multiple tumors, you might not fare as well. Tumors function fine on some levels (they are living cells successfully dividing) but not on others (they fail to stop dividing, perhaps because there is a diminished identification with the sense of the organ as a whole).

Because we are 100% ignorant of any objective ontology of consciousness, there is no reason to assume that an implant could possibly function well enough to act as a replacement on all levels, except possibly if the implant were made of one's own stem cells (probably the best avenue to pursue).


PS Someone posted a good AI related quote today that sort of applies:

"I think the point at which a computer program can be considered intelligent is

the point at which — given an error — you, as the programmer, can say it made a mistake."

If an implanted device doesn't make mistakes, it isn't human intelligence. If it does make mistakes, it has to make the kinds of mistakes that humans can tolerate...the mistakes have to be sourced in the same personal agendas of living beings.

Craig


 


--
Stathis Papaioannou

Stathis Papaioannou

Oct 1, 2012, 11:08:13 AM
to everyth...@googlegroups.com
On Mon, Oct 1, 2012 at 11:45 PM, Craig Weinberg <whats...@gmail.com> wrote:

>> You're suggesting that even if one implant works as well as the
>> original, multiple implants would not. Is there a critical replacement
>> limit, 20% you feel normal but 21% you don't? How have you arrived at
>> this insight?
>
>
> If you have one brain tumor, you may still function. With multiple tumors,
> you might not fare as well. Tumors function fine on some levels (they are
> living cells successfully dividing) but not on others (they fail to stop
> dividing, perhaps because there is a diminished identification with the
> sense of the organ as a whole).
>
> Because we are 100% ignorant of any objective ontology of consciousness,
> there is no reason to assume that an implant can possibly function well
> enough to act as a replacement on all levels, unless possibly if the implant
> was made of one's own stem cells (probably the best avenue to pursue).

You're not really answering the question. The neural implants are
refined to the point where thousands of people are walking around with
them with no problem. Any objective or subjective test thrown at them
they pass. There are implants available for every part of the brain.
You're saying that if someone has 12 implants of the best possible
design they will be fine, but when they get 13 they will start to act
strangely. How can you know that this will happen? You're not just
saying here that it would be technically difficult, you're saying that
it would be *impossible* for the implants to work properly. So what
physical law that you know about and no-one else does would be broken?


--
Stathis Papaioannou

Stathis Papaioannou

Oct 1, 2012, 11:21:56 AM
to everyth...@googlegroups.com
On Mon, Oct 1, 2012 at 9:05 AM, Craig Weinberg <whats...@gmail.com> wrote:

> Exactly. It's one thing for a person to use an artificial hand, but what is
> it that learns to use an artificial 'you'? It's hard for me to understand
> how this obvious Grand Canyon is repeatedly glossed over in these
> conversations. Head amputation? No big deal... Ehhh, not so fast I say, and
> saying not so fast doesn't make someone a Luddite, it just doesn't make
> sense that without understanding anything about how or why subjectivity
> comes to be that we should presume to reproduce it through imitation of the
> very body parts which seem to show now trace of consciousness without us.

"You" are not located in a special spot in the brain. "You" are an
ensemble of parts working together. If you determine the rules a
component in the ensemble follows in response to its neighbours you
can replace that part and the ensemble will behave the same. You don't
need to know EXACTLY how the part behaves, only APPROXIMATELY, since
in ordinary life neurons change from moment to moment and the brain
continues to function. So if you replace a part, the behaviour of the
organism will be unchanged. But if consciousness changes despite
behaviour remaining the same you have a really weird situation: a
person who feels he has changed but is powerless to prevent his vocal
cords from speaking and saying that he has not changed.
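
[The functional-replacement claim above can be put as a toy sketch, added here by the editor; the class names and the threshold rule are invented for illustration. The point it shows: if a replacement part implements the same input/output rule toward its neighbours as the original, an ensemble that only interacts with parts through that rule cannot behave differently.]

    class BiologicalNeuron:
        """The original part: responds to its neighbours by some fixed rule."""
        def respond(self, inputs):
            # Fires if the summed input from neighbours crosses a threshold.
            return 1.0 if sum(inputs) > 0.5 else 0.0

    class ArtificialNeuron:
        """A functional replacement implementing (approximately) the same rule."""
        def respond(self, inputs):
            return 1.0 if sum(inputs) > 0.5 else 0.0

    def ensemble_behaviour(parts, inputs):
        # The ensemble only ever 'sees' each part through respond(), so swapping
        # a BiologicalNeuron for an ArtificialNeuron leaves the output unchanged.
        return [part.respond(inputs) for part in parts]

    original = [BiologicalNeuron() for _ in range(3)]
    partly_replaced = [BiologicalNeuron(), ArtificialNeuron(), BiologicalNeuron()]
    stimulus = [0.3, 0.4]
    assert ensemble_behaviour(original, stimulus) == ensemble_behaviour(partly_replaced, stimulus)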


--
Stathis Papaioannou

Craig Weinberg

Oct 1, 2012, 11:46:05 AM
to everyth...@googlegroups.com


On Monday, October 1, 2012 11:08:44 AM UTC-4, stathisp wrote:
On Mon, Oct 1, 2012 at 11:45 PM, Craig Weinberg <whats...@gmail.com> wrote:

>> You're suggesting that even if one implant works as well as the
>> original, multiple implants would not. Is there a critical replacement
>> limit, 20% you feel normal but 21% you don't? How have you arrived at
>> this insight?
>
>
> If you have one brain tumor, you may still function. With multiple tumors,
> you might not fare as well. Tumors function fine on some levels (they are
> living cells successfully dividing) but not on others (they fail to stop
> dividing, perhaps because there is a diminished identification with the
> sense of the organ as a whole).
>
> Because we are 100% ignorant of any objective ontology of consciousness,
> there is no reason to assume that an implant can possibly function well
> enough to act as a replacement on all levels, unless possibly if the implant
> was made of one's own stem cells (probably the best avenue to pursue).

You're not really answering the question. The neural implants are
refined to the point where thousands of people are walking around with
them with no problem. Any objective or subjective test thrown at them
they pass. There are implants available for every part of the brain.
You're saying that if someone has 12 implants of the best possible
design they will be fine, but when they get 13 they will start to act
strangely.

They may or may not act strangely, depending on who is defining what strange is. Think of how Alzheimer's progresses. It's not like dementia can be detected from the first appearance of an amyloid plaque overgrowth.

It would really be surprising if any brain change didn't follow this pattern. If you ingest n micrograms of LSD you are fine. If you ingest n+x micrograms, then you have a psychedelic experience lasting several hours. The model of the brain that you seem to assume is based on pure mechanistic assumption. It has no grounding in the physiological realities of what the brain actually is as a living organ.

How can you know that this will happen?

Because I understand what makes consciousness different from a machine. 

You're not just
saying here that it would be technically difficult, you're saying that
it would be *impossible* for the implants to work properly. So what
physical law that you know about and no-one else does would be broken?

The implants would work like proper implants, not like proper sub-persons. Implants have no experiences; therefore a collection of interconnected implants also has no experiences. If you have enough of a living person's brain left to be able to still be a person, then that person can learn to use prosthetic additions and implants to augment functionality or repair damage, but not to replace the person themselves.

There is no physical law that is broken, there is an assumption of equivalence which I am exposing as fallacious.

Craig
 


--
Stathis Papaioannou

Stathis Papaioannou

Oct 1, 2012, 12:03:06 PM
to everyth...@googlegroups.com
On Tue, Oct 2, 2012 at 1:46 AM, Craig Weinberg <whats...@gmail.com> wrote:

>> You're not really answering the question. The neural implants are
>> refined to the point where thousands of people are walking around with
>> them with no problem. Any objective or subjective test thrown at them
>> they pass. There are implants available for every part of the brain.
>> You're saying that if someone has 12 implants of the best possible
>> design they will be fine, but when they get 13 they will start to act
>> strangely.
>
>
> They may or may not act strangely depending on who is defining what strange
> is. Think of how Alzheimers progresses. It's not like dementia can be
> detected from the first appearance of an amyloid plaque overgrowth.
>
> It would really be surprising if any brain change didn't follow this
> pattern. If you ingest n micrograms of LSD you are fine. If you ingest n+x
> micrograms, then you have a psychedelic experience lasting several hours.
> The model of the brain that you seem to assume is based on pure mechanistic
> assumption. It has no grounding in the physiological realities of what the
> brain actually is as a living organ.

Physiological realities are mechanistic. Biologists and doctors are
mechanists. Even if you claim that "the whole is greater than the sum
of its parts" that does not mean that if yoyu replace the parts the
whole will stop working.

>> How can you know that this will happen?
>
>
> Because I understand what makes consciousness different from a machine.

No, you don't. You claim without any coherent explanation that even an
engineer with godlike abilities could not make a replacement brain
part that would leave the person functioning normally, and that even
if one such part could be made to work surely *two* of them would not!

>> You're not just
>> saying here that it would be technically difficult, you're saying that
>> it would be *impossible* for the implants to work properly. So what
>> physical law that you know about and no-one else does would be broken?
>
>
> The implants would work like proper implants, not like proper sub-persons.
> Implants have no experiences, therefore a collection of interconnected
> implants also have no experiences. If you have enough of a living person's
> brain left to be able to still be a person, then that person can learn to
> use prosthetic additions and implants to augment functionality or repair
> damage, but not replace the person themselves.
>
> There is no physical law that is broken, there is an assumption of
> equivalence which I am exposing as fallacious.

But if the implants worked as implants without experiences the person
would behave as if everything were fine while internally and
impotently noticing that his experiences were disappearing or
changing. Do you understand what this means?


--
Stathis Papaioannou

Bruno Marchal

Oct 1, 2012, 1:00:48 PM
to everyth...@googlegroups.com
Yes. Anti-mechanists often refer to "the whole is bigger than the
parts", but nowhere is this more true than in computing and
engineering, if only because the whole puts some specific structure
on the relations between the parts.
We might simplify this by saying that the whole's *structural
complexity* grows like an exponential (or more) while the whole's
cardinality grows only linearly.
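
[A minimal way to make the exponential-versus-linear contrast concrete, added by the editor rather than taken from Bruno: count the possible relational structures on a set S of n parts.]

\[
|S| = n \ \ \text{(cardinality: grows linearly)}, \qquad
\bigl|\{\, R \mid R \subseteq S \times S \,\}\bigr| = 2^{\,n^{2}} \ \ \text{(possible binary relations: grows exponentially)}.
\]

Adding a single part multiplies the number of candidate relational structures by \(2^{\,2n+1}\), since \(2^{(n+1)^{2}} / 2^{n^{2}} = 2^{2n+1}\).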

Bruno

http://iridia.ulb.ac.be/~marchal/



Craig Weinberg

Oct 1, 2012, 3:21:42 PM
to everyth...@googlegroups.com

I understand exactly what you think it means, but you don't understand why the theoretical assumption doesn't apply to reality. Where consciousness is concerned, the whole is not merely the sum of its parts, or even greater than the sum of its parts, it is *other than* the sum of its parts. If the parts are not genetically identical to the whole, then they cannot be expected to even create a sum, let alone produce an experience which is greater or other than that sum. You assume that personal experience is a sum of impersonal mechanisms, whereas I understand that is not the case at all. Impersonal mechanisms are the public back end of sub-personal experiences.

If you try to build a person from mechanisms, you will *always fail*, because the sub-personal experiences are not accessible without the personal experiences to begin with. A baby has to learn to think like an adult through years of personal experience. It is the actual subjective participation in the experiences which drives how the neurology develops. We see this with how people blind from birth use their visual cortex for tactile experience. If you gave the blind person a drug which will make their visual cortex function just like a sighted person's, they still won't get any colors. The colors aren't in 'there', they're in 'here'.

Craig


--
Stathis Papaioannou

Stathis Papaioannou

Oct 1, 2012, 8:09:17 PM
to everyth...@googlegroups.com
On Tue, Oct 2, 2012 at 5:21 AM, Craig Weinberg <whats...@gmail.com> wrote:

>> But if the implants worked as implants without experiences the person
>> would behave as if everything were fine while internally and
>> impotently noticing that his experiences were disappearing or
>> changing. Do you understand what this means?
>
>
> I understand exactly what you think it means, but you don't understand why
> the theoretical assumption doesn't apply to reality. Where consciousness is
> concerned, the whole is not merely the sum of its parts, or even greater
> than the sum of it's parts, it is *other than* the sum of it's parts. If the
> parts are not genetically identical to the whole, then they cannot be
> expected to even create a sum, let alone produce an experience which is
> greater or other than that sum. You assume that personal experience is a sum
> of impersonal mechanisms, whereas I understand that is not the case at all.
> Impersonal mechanisms are the public back end of sub-personal experiences.

And if that were the case you would get a person who would behave as
if everything were fine while internally and impotently noticing that
his experiences were disappearing or changing. Do you understand why
he would behave as if everything were fine? Do you understand why he
would internally and impotently notice that his experiences were
disappearing or changing?

> If you try to build a person from mechanisms, you will *always fail*,
> because the sub-personal experiences are not accessible without the personal
> experiences to begin with. A baby has to learn to think like an adult
> through years of personal experience. It is the actual subjective
> participation in the experiences which drives how the neurology develops. We
> see this with how people blind from birth use their visual cortex for
> tactile experience. If you gave the blind person a drug with will make their
> visual cortex function just like a sighted person's, they still won't get
> any colors. The colors aren't in 'there', there in 'here'.

The problem with someone blind from birth who later has an optical
problem corrected in adulthood is that the visual cortex has not
developed properly, since this normally happens in infancy. Thus to
make them see you would need not only to correct the optical problem
but to rewire their brain. If you could do that then they would have
all the required apparatus for visual perception and they would be
able to see. This is trivially obvious to me and most people: you
can't see because your brain doesn't work, and if you fixed the brain
you would be able to see. But I'm guessing that you might say that
even if the blind person's eyes and brain were fixed, so that
everything seemed to work perfectly well, they would still be blind,
because the non-mechanistic non-reducible spirit of visual essence
would be missing.


--
Stathis Papaioannou

Stephen P. King

Oct 1, 2012, 11:57:51 PM
to everyth...@googlegroups.com
On 10/1/2012 1:00 PM, Bruno Marchal wrote:
>> Physiological realities are mechanistic. Biologists and doctors are
>> mechanists. Even if you claim that "the whole is greater than the sum
>> of its parts" that does not mean that if yoyu replace the parts the
>> whole will stop working.
>
> Yes. Anti-mechanist often refer to "the whole is bigger than the
> parts", but nowhere else than in computer and engineering is it more
> true that the whole is bigger than the part, if only because the whole
> put some specific structure on the relation between parts.
> We might simplify this by saying that the whole *structural
> complexity* grows like an exponential (or more) when the whole
> cardinality grows linearly.

Hi Bruno,

Could you source some further discussions of this idea? From my own
study of Cantor's tower of infinities, I have found the opposite:
complexity goes to zero as the cardinals lose the ability to be named.



--
Onward!

Stephen


Craig Weinberg

Oct 2, 2012, 12:33:28 AM
to everyth...@googlegroups.com


On Monday, October 1, 2012 8:09:53 PM UTC-4, stathisp wrote:
On Tue, Oct 2, 2012 at 5:21 AM, Craig Weinberg <whats...@gmail.com> wrote:

>> But if the implants worked as implants without experiences the person
>> would behave as if everything were fine while internally and
>> impotently noticing that his experiences were disappearing or
>> changing. Do you understand what this means?
>
>
> I understand exactly what you think it means, but you don't understand why
> the theoretical assumption doesn't apply to reality. Where consciousness is
> concerned, the whole is not merely the sum of its parts, or even greater
> than the sum of its parts, it is *other than* the sum of its parts. If the
> parts are not genetically identical to the whole, then they cannot be
> expected to even create a sum, let alone produce an experience which is
> greater or other than that sum. You assume that personal experience is a sum
> of impersonal mechanisms, whereas I understand that is not the case at all.
> Impersonal mechanisms are the public back end of sub-personal experiences.

And if that were the case you would get a person who would behave as
if everything were fine while internally and impotently noticing that
his experiences were disappearing or changing. Do you understand why
he would behave as if everything were fine?

I understand why you think he would behave as if everything were fine, but you don't understand that I know why this view is a reductionist fantasy. Everything that we do and are is an expression of every experience we have ever had. The way we reason is the result of the experiences which we have lived through. Although these experiences play a role in circumscribing the behavior of the neurology, you can't reverse-engineer the experiences from the behavior. It's like assuming that you can excise a block out of New York City and replace it with something that you assume to be functionally identical.

There is nothing that is functionally identical to a specific block of New York City. You can replace a building here or there and in time the population will embrace it, but the more of the city is replaced, the less able the population is to assimilate the loss, and the whole thing fails. He would *not* behave as if everything were fine because he would not be able to integrate new experiences as a human being would. As with all computing devices, there would be glitches that would be dead giveaways, and they would only become more prominent over time.
 
Do you understand why he
would internally and impotently notice that his experiences were
disappearing or changing?

I understand why you might believe that, but it's again based on a reductionist fantasy of consciousness. Do you think that someone with dementia is fully aware of their condition at all times, like some kind of detached voyeur? If someone is falling down drunk, are they internally and impotently noticing that their experiences are disappearing or changing? Consciousness isn't like that. It is a vast library of intertwined modalities of apocatastatic-gestalt frames of awareness access. It's not like you can stand aloof from your own psyche being dismantled... there isn't going to be enough of 'you' left to do that. Every part of your brain being replaced has to be compensated for by a live part of your brain picking up the slack, using the devices as prosthetic limbs and antennae. There is only so much that can be amputated and beyond that is irrevocable loss.


> If you try to build a person from mechanisms, you will *always fail*,
> because the sub-personal experiences are not accessible without the personal
> experiences to begin with. A baby has to learn to think like an adult
> through years of personal experience. It is the actual subjective
> participation in the experiences which drives how the neurology develops. We
> see this with how people blind from birth use their visual cortex for
> tactile experience. If you gave the blind person a drug which will make their
> visual cortex function just like a sighted person's, they still won't get
> any colors. The colors aren't in 'there', they're in 'here'.

The problem with someone blind from birth who later has an optical
problem corrected in adulthood is that the visual cortex has not
developed properly, since this normally happens in infancy. 
Thus to make them see you would need not only to correct the optical problem
but to rewire their brain. If you could do that then they would have
all the required apparatus for visual perception and they would be
able to see.

If you can't correct the optical problem though, they are not going to be able to see even if you rewire the visual cortex. This is what you are not understanding. Without the optical experience, there are no images in the brain. Nothing that you can do to the brain will generate a visual experience.
 
This is trivially obvious to me and most people: you
can't see because your brain doesn't work, and if you fixed the brain
you would be able to see.

It seems trivially obvious to you because you don't understand the reality of perception. The brain, the eye, and illuminated conditions in the environment are all absolute requirements for the experience of visual sense to occur. None of those things can be missing, even with a perfect replica of a sighted person's brain. You assume that the brain contains experiences, but I don't. The brain only gives us access to perception of, and participation in, experiences as a human being. It can't generate experiences that it hasn't had.
 
But I'm guessing that you might say that
even if the blind person's eyes and brain were fixed, so that
everything seemed to work perfectly well, they would still be blind,

If their eyes worked, then the neuroplasticity of the brain would adapt and they would hopefully begin to see eventually, depending on how long they had lived without sight. If the brain were fixed too, then they would learn to see very quickly, but it would still take months to get used to, I would imagine. Pure speculation, of course. But do you think that you could pop the eyes and visual cortex of a 40-year-old Chinese man onto a Hungarian baby and he would be able to read Chinese?

because the non-mechanistic non-reducible spirit of visual essence
would be missing.

Whenever anyone accuses me of talking about spirit or magic, I know that they don't understand anything that I am talking about. Here is my post from yesterday, which specifically explains why terms like spirit and soul are inappropriate: http://s33light.org/post/32628731508

Essence I do use, but not in the way that you accuse me of. I use essence only in juxtaposition to existence, as a way of discerning between primordially direct experience (essence) and indirect representations of experience (existence). I would never say that there was a visual essence or any such vague allusion. I am about describing the ordinary conditions of experience as they really are, even if it often requires pretentious words. I propose no supernatural or metaphysical properties, especially none that make mechanistic mathematical functions suddenly start feeling like they want to sing and dance.

Craig



--
Stathis Papaioannou

Bruno Marchal

unread,
Oct 2, 2012, 6:10:23 AM10/2/12
to everyth...@googlegroups.com

On 02 Oct 2012, at 05:57, Stephen P. King wrote:

> On 10/1/2012 1:00 PM, Bruno Marchal wrote:
>>> Physiological realities are mechanistic. Biologists and doctors are
>>> mechanists. Even if you claim that "the whole is greater than the
>>> sum
>>> of its parts" that does not mean that if you replace the parts the
>>> whole will stop working.
>>
>> Yes. Anti-mechanists often refer to "the whole is bigger than the
>> parts", but nowhere else than in computer science and engineering is it
>> more true that the whole is bigger than the parts, if only because
>> the whole puts some specific structure on the relation between parts.
>> We might simplify this by saying that the whole *structural
>> complexity* grows like an exponential (or more) when the whole
>> cardinality grows linearly.
>
> Hi Bruno,
>
> Could you source some further discussions of this idea?

I thought it was common sense. With a coffee machine, you can make
coffee. But you can't make coffee with any single part of that machine.
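
To make the combinatorial point concrete, here is a minimal Python sketch (an illustration only, not part of the original exchange): if a "whole" is modelled as a pattern of relations imposed on its parts, the number of possible patterns grows exponentially (or faster) while the number of parts grows only linearly. The function names, and the choice of subsets and binary relations as stand-ins for "structure", are assumptions made purely for illustration.

# Illustrative sketch only: counting how many "wholes" can in principle
# be built from n parts, as n grows linearly.

def subset_count(n: int) -> int:
    """Number of possible subsets of n parts: 2**n."""
    return 2 ** n

def relation_count(n: int) -> int:
    """Number of possible directed connection patterns (binary
    relations) on n parts: 2**(n*n)."""
    return 2 ** (n * n)

if __name__ == "__main__":
    for n in range(1, 9):
        print(f"parts={n:2d}  subsets={subset_count(n):5d}  "
              f"relations={relation_count(n)}")
    # The part count grows 1, 2, 3, ... while the count of possible
    # structures explodes: subsets go 2, 4, 8, ... and relation
    # patterns go 2, 16, 512, 65536, ...
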


> From my own study of Cantor's tower of infinities, I have found the
> opposite: complexity goes to zero as the cardinals lose the ability
> to be named.

?

Bruno


http://iridia.ulb.ac.be/~marchal/



John Mikes

unread,
Oct 2, 2012, 5:57:50 PM10/2/12
to everyth...@googlegroups.com
Stephen (and Bruno?)
What I called The Aris - Total - meaning Aristotle's maxim that the 'whole' is bigger than the sum of its parts - means something else in MY agnosticism. Originally I included only what Bruno pointed out now: that the PARTS (as accounted for) develop relations (qualia) adding to the totality they participate in. Lately, however, I added to my view that beyond the accountable parts (forget now the relations) there are participant 'inconnu'-s from outside our (inventoried) model knowable as of yesterday. So whatever we take inventory of is an (accountable) partial only.
Beyond that - of course - Aristotle's 'total' ("material parts only") of his inventory was truly smaller than the above TOTAL in its entire complexity.
 
The fact that complexity-parts, extracted or replaced, may not discontinue the function of the 'total' is my problem with death: how to identify THOSE important components which are indispensable for maintaining the function as it was?
(Comes back to my negative attitude towards transport-hype (to Moscow, or another planet/universe) - complexity has uncountable connections in the infinite relations. How much could we possibly include (in our wildest fantasy) in the tele-transporting of a "person" (or whatever) so that the original functionality would still be detectable?)
 
Heavenly afterlife anybody?
 
John Mikes
 









meekerdb

unread,
Oct 2, 2012, 6:13:19 PM10/2/12
to everyth...@googlegroups.com
On 10/2/2012 2:57 PM, John Mikes wrote:
Stephen (and Bruno?)
What I called The Aris - Total - meaning Aristotle's maxim that the 'whole' is bigger than the sum of its parts - means something else in MY agnosticism. Originally I included only what Bruno pointed out now: that the PARTS (as accounted for) develop relations (qualia) adding to the totality they participate in. Lately, however, I added to my view that beyond the accountable parts (forget now the relations) there are participant 'inconnu'-s from outside our (inventoried) model knowable as of yesterday. So whatever we take inventory of is an (accountable) partial only.
Beyond that - of course - Aristotle's 'total' ("material parts only") of his inventory was truly smaller than the above TOTAL in its entire complexity.
 
The fact that complexity-parts, extracted or replaced, may not discontinue the function of the 'total' is my problem with death: how to identify THOSE important components which are indispensable for maintaining the function as it was?
(Comes back to my negative attitude towards transport-hype (to Moscow, or another planet/universe) - complexity has uncountable connections in the infinite relations. How much could we possibly include (in our wildest fantasy) in the tele-transporting of a "person" (or whatever) so that the original functionality would still be detectable?)

Yet we get time-transported from last year to this year - with most of our atoms being replaced by others. We are not exactly the same - but "the same enough".

Brent

Stephen P. King

unread,
Oct 2, 2012, 8:50:34 PM10/2/12
to everyth...@googlegroups.com
On 10/2/2012 5:57 PM, John Mikes wrote:
Stephen (and Bruno?)
What I called The Aris - Total - meaning Aristotle's maxim that the 'whole' is bigger than the sum of its parts - means something else in MY agnosticism. Originally I included only what Bruno pointed out now: that the PARTS (as accounted for) develop relations (qualia) adding to the totality they participate in. Lately, however, I added to my view that beyond the accountable parts (forget now the relations) there are participant 'inconnu'-s from outside our (inventoried) model knowable as of yesterday. So whatever we take inventory of is an (accountable) partial only.
Beyond that - of course - Aristotle's 'total' ("material parts only") of his inventory was truly smaller than the above TOTAL in its entire complexity.
 
The fact that complexity-parts, extracted or replaced, may not discontinue the function of the 'total' is my problem with death: how to identify THOSE important components which are indispensable for maintaining the function as it was?
(Comes back to my negative attitude towards transport-hype (to Moscow, or another planet/universe) - complexity has uncountable connections in the infinite relations. How much could we possibly include (in our wildest fantasy) in the tele-transporting of a "person" (or whatever) so that the original functionality would still be detectable?)
 
Heavenly afterlife anybody?
 
John Mikes

Hi John,

    "Aris", I like it! One question is how much of one's sense of self and memories can be carried across. Function does not seem to do this alone as it is completely independent of the physical "body".


 




-- 
Onward!

Stephen

John Mikes

unread,
Oct 3, 2012, 4:20:02 PM10/3/12
to everyth...@googlegroups.com
Stephen:
are you comfortable imagining yourSELF and the warehouse of your MEMORIES - all excluding any relations to your body?
Then what? I think the complexity "WE" includes the part thought of as body and bodily feelings, so an abstract transport would not result in 'ourselves'.
Even the (oriental) 'experts' in reincarnation deny memories of the previous format. The 'ant' does not remember what kind of 'man' he was, nor does a 'man' remember his former life-form.
Why should the 'expert' teleportation differ?
JM


Stephen P. King

unread,
Oct 3, 2012, 8:45:05 PM10/3/12
to everyth...@googlegroups.com
On 10/3/2012 4:20 PM, John Mikes wrote:
Stephen:
are you comfortable imagining yourSELF and the warehouse of your MEMORIES - all excluding any relations to your body?
Then what? I think the complexity "WE" includes the part thought of as body and bodily feelings, so an abstract transport would not result in 'ourselves'.
Even the (oriental) 'experts' in reincarnation deny memories of the previous format. The 'ant' does not remember what kind of 'man' he was, nor does a 'man' remember his former life-form.
Why should the 'expert' teleportation differ?
JM

    The following is purely speculative. In reincarnation, we lose the information body completely (and all the classically encoded memories) and plunge back into Aris; in teleportation, we keep the record of the body and reconstruct its states somewhere else, thus preserving the connection. All motion that is occurring now is teleportation, just over very small distances. The environment is both measuring us and reconstructing us in the next location.



-- 
Onward!

Stephen