> Now, this seems VERY cool:
https://theinnermostloop.substack.com/p/the-first-multi-behavior-brain-upload
I think the following quotation is especially interesting:
"If a fly brain can now close the sensorimotor loop in simulation, the question for the mouse becomes one of scale, not of kind."
--
You received this message because you are subscribed to the Google Groups "extropolis" group.
To unsubscribe from this group and stop receiving emails from it, send an email to extropolis+...@googlegroups.com.
To view this discussion visit https://groups.google.com/d/msgid/extropolis/CAKTCJyfTFxXnGzKyE%3DDtj_Jp6NJTxreduPRa9zwViv-i2E_oUA%40mail.gmail.com.
> It sounds interesting, but I fail to see how this does not amount to uploading a copy of a brain that has the consciousness of a rusty nail. If your brain is demolished in the process you are dead and there is a digital zombie copy of you.
LC

On Sun, Mar 8, 2026 at 4:09 AM Giulio Prisco <giu...@gmail.com> wrote:
> Now, this seems VERY cool:
> https://theinnermostloop.substack.com/p/the-first-multi-behavior-brain-upload
On Mon, Mar 9, 2026 at 8:17 AM John Clark <johnk...@gmail.com> wrote:
> On Mon, Mar 9, 2026 at 6:27 AM Lawrence Crowell <goldenfield...@gmail.com> wrote:
>> It sounds interesting, but I fail to see how this does not amount to uploading a copy of a brain that has the consciousness of a rusty nail. If your brain is demolished in the process you are dead and there is a digital zombie copy of you.
>
> I am surprised to hear a physicist say that; you must know there's no difference between one hydrogen atom and another, and the atoms in your body are constantly being replaced. Your brain is literally made of last year's mashed potatoes. And if you're thinking of the quantum state, that can't be the key to personal identity, because if it were you would become a different person every time you took a sip of coffee; in fact you'd become a different person a trillion times a second due to interactions with the environment.
>
> As for consciousness, I have empirical evidence that it must be an inevitable byproduct of intelligence. I know for a fact that I am conscious, so evolution has managed to produce consciousness at least once, and probably many billions of times. But natural selection can't directly see the consciousness of animals any better than we can see it in others, and natural selection can't select for something it can't see. Natural selection CAN, however, detect intelligence. Therefore it must be a brute fact that consciousness is the way data feels when it is being processed intelligently.
>
> I think the fundamental problem is that both you and I were misinformed when our third grade teachers told us that the word "you" is a pronoun. It is not; it is an adjective. You are the way that matter behaves when it is organized in a Lawrencecrowellian way. Right now there is only one chunk of matter in the observable universe that is organized in that way, but that need not always be the case.
>
> John K Clark

The problem with your first argument is that elementary particles or atoms are identical because they share identical quantum properties; as a result they are indistinguishable. To extend this to brains and minds is a leap that is most likely a category error. Your argument is a sort of counterfactual: to say consciousness is the way data "feels" amounts to putting a different condition on what is meant by information.

Consciousness is a way in which behavior is self-guided. A brain exists not to be intelligent; it exists to generate behavior that enhances the survival of an animal and allows it to pass its genes to the next generation. A mouse is not particularly intelligent, but I do think mice have sentience, some small mental landscape. Consciousness is then a property of biological systems. There is no particular reason to think that if you take the brain of any animal and by some means duplicate each neuron in a digital format (a process that could easily be destructive to the brain), the mental landscape is transferred to the machine.

I am certainly not going to upload myself into a computer.
There is reason to reject the idea that consciousness depends on the biological substrate of the brain. Suppose a part of the brain is replaced with a non-biological implant that perfectly replicates its input–output behaviour. The rest of the brain interacting with it will behave exactly as before, and the subject as a whole will continue to behave normally.
A problem arises if we assume that only the behaviour and not the consciousness associated with the replaced part is replicated. In that case the subject could lose a significant aspect of their experience, such as visual perception or the ability to understand language, while still behaving exactly as before and insisting that everything is normal.
This would amount to a kind of partial zombie: a person missing an entire domain of conscious experience but unable to notice the loss. Imagine completely losing visual experience while still believing that everything looks as vivid as before, accurately describing your surroundings, navigating the world normally, and never detecting that anything has changed.
This scenario seems absurd. The natural conclusion is that consciousness follows functional organization. If the functional behaviour of the brain were perfectly replicated, consciousness would necessarily be replicated as well.
> Stepping back, ... , I see nothing to indicate that if a brain is copied, or all its states at one time are copied into a computer, consciousness is transferred.
> If this were done by some hyper-tech magic, where a brain is scanned and almost instantly that information is put in a computer, does consciousness bifurcate? Where do I go?
> This would then suggest that if the scan demolishes the brain, you are dead and a robotic zombie acts in your stead.
> Faster-than-light travel, I think, is virtually impossible.
> Dyson spheres are of a scale such that they may be practically impossible.
> A small subregion of the brain can conceivably be replaced by a computer. The functional recovery should be nearly complete, though the charge or patient may detect some differences.
> The system has been slightly perturbed or tweaked, or what in physics we would call an adiabatic perturbation. A complete replacement is a violent perturbation.
> Generally, the adiabatic case cannot be extended in this way to a hard perturbation.
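For readers unfamiliar with the physics term used above: the adiabatic theorem of quantum mechanics says that a system in an eigenstate of a slowly varying Hamiltonian tracks the corresponding instantaneous eigenstate, provided the change is slow compared with the relevant energy gaps. A rough sketch of the standard condition (my gloss, not part of the original message):

```latex
% Adiabatic theorem (sketch). For a time-dependent Hamiltonian H(t) with
% instantaneous eigenstates H(t)|n(t)> = E_n(t)|n(t)>, a system prepared
% in |n(0)> remains close to |n(t)> as long as the drive is slow:
\[
  \frac{\hbar\,\bigl|\langle m(t)\,|\,\partial_t H(t)\,|\,n(t)\rangle\bigr|}
       {\bigl(E_n(t)-E_m(t)\bigr)^{2}} \ll 1
  \qquad \text{for all } m \neq n .
\]
% A sudden ("violent") change violates this bound, and the final state need
% not resemble the initial one -- which is the analogy being drawn between
% replacing a small brain subregion gradually and replacing the whole brain
% at once.
```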