VERY cool. Toward mind uploading?


Giulio Prisco

Mar 8, 2026, 5:09:45 AM
to ExI chat list, extro...@googlegroups.com

Now, this seems VERY cool:

https://theinnermostloop.substack.com/p/the-first-multi-behavior-brain-upload

John Clark

Mar 8, 2026, 8:08:20 AM
to extro...@googlegroups.com, ExI chat list
On Sun, Mar 8, 2026 at 5:09 AM Giulio Prisco <giu...@gmail.com> wrote:

>Now, this seems VERY cool:
>
>https://theinnermostloop.substack.com/p/the-first-multi-behavior-brain-upload

Yes, that is cool, very cool! Thanks Giulio.

John K Clark

 

John Clark

Mar 8, 2026, 8:16:09 AM
to extro...@googlegroups.com, ExI chat list

I think the following quotation is especially interesting: 

"If a fly brain can now close the sensorimotor loop in simulation, the question for the mouse becomes one of scale, not of kind.

Watch the video closely. What you are seeing is not an animation. It is not a reinforcement learning policy mimicking biology. It is a copy of a biological brain, wired neuron-to-neuron from electron microscopy data, running in simulation, making a body move. The ghost is no longer in the machine. The machine is becoming the ghost."

John K Clark



Giulio Prisco

Mar 9, 2026, 3:21:17 AM
to ExI chat list, extro...@googlegroups.com
Eon Systems Founder and CEO Michael Andregg has posted an X thread
commenting on this breakthrough and its significance. He explains that
"we do know what the brain does when it wants to move in certain ways
and that's what we connected to the NeuroMechFly."

"This is, in our view, a real uploaded animal," he says. "We don't
know what its experience is - nobody does. But we take the possibility
seriously."

https://x.com/michaelandregg/status/2030764512488677736

Lawrence Crowell

Mar 9, 2026, 6:27:52 AM
to extro...@googlegroups.com
It sounds interesting, but I fail to see how this does not amount to uploading a copy of a brain that has the consciousness of a rusty nail. If your brain is demolished in the process you are dead and there is a digital zombie copy of you. 

LC

On Sun, Mar 8, 2026 at 4:09 AM Giulio Prisco <giu...@gmail.com> wrote:
--
You received this message because you are subscribed to the Google Groups "extropolis" group.
To unsubscribe from this group and stop receiving emails from it, send an email to extropolis+...@googlegroups.com.
To view this discussion visit https://groups.google.com/d/msgid/extropolis/CAKTCJyfTFxXnGzKyE%3DDtj_Jp6NJTxreduPRa9zwViv-i2E_oUA%40mail.gmail.com.

John Clark

Mar 9, 2026, 9:17:20 AM
to extro...@googlegroups.com
On Mon, Mar 9, 2026 at 6:27 AM Lawrence Crowell <goldenfield...@gmail.com> wrote:

>It sounds interesting, but I fail to see how this does not amount to uploading a copy of a brain that has the consciousness of a rusty nail. If your brain is demolished in the process you are dead and there is a digital zombie copy of you. 

I am surprised to hear a physicist say that; you must know there's no difference between one hydrogen atom and another, and the atoms in your body are constantly being replaced: your brain is literally made of last year's mashed potatoes. And if you're thinking of the quantum state, that can't be the key to personal identity, because if it were then you would become a different person every time you took a sip of coffee; in fact you'd become a different person a trillion times a second due to interactions with the environment.

As for consciousness, I have empirical evidence that it must be an inevitable byproduct of intelligence. I know for a fact that I am conscious, therefore Evolution has managed to produce it at least once and probably many billions of times. But natural selection can't directly see the consciousness of animals any better than we can directly see it in others, and natural selection can't select for something it can't see. However natural selection CAN detect intelligence. Therefore it must be a brute fact that consciousness is the way data feels when it is being processed intelligently.

I think the fundamental problem is that both you and I were misinformed when our third grade teachers told us that the word "you" was a pronoun, it is not, it is an adjective. You are the way that matter behaves when it is organized in a Lawrencecrowellian way; right now there is only one chunk of matter in the observable universe that is organized in that way, but that need not always be the case. 

John K Clark

Henrik Ohrstrom

Mar 9, 2026, 1:33:16 PM
to extro...@googlegroups.com
Unless you insist on the presence of something akin to a "soul", a digital copy with enough fidelity and suitable randomness is "you" in enough ways that I wouldn't give a shit. A digital copy would perhaps require a VR environment to shoot the breeze and drink beer, but that's the end. If you have multiple simultaneous copies activated at the same time, that is a new interesting complication, best solved by activating some extra copies of myself.
If the copy is as active as a rusty nail, then it is not a copy, just an image of some sort.

/Henrik 




Lawrence Crowell

Mar 9, 2026, 8:05:31 PM
to extro...@googlegroups.com
The problem with your first argument is that an elementary particle or atom is identical because they share identical quantum properties. As a result they are indistinguishable. To extend this into problems of brains and mind is a leap that is most likely a category error. Your argument is a sort of counterfactual, where to say consciousness is a way that data "feels" amounts to putting a different condition on what is meant by information. 

Consciousness is a way that behavior is in some way self-guided. A brain exists not to be intelligent; it exists to generate behavior that enhances the survival of an animal and allows it to pass its genes to the next generation. A mouse is not particularly intelligent, but I do think they have sentience as some small mental landscape. Consciousness is then a property of biological systems. There is no particular reason to think that if you take the brain of any animal and by some means duplicate each neuron in a digital format, and this process could easily be destructive to the brain, that the mental landscape is transferred to the machine.

I am certainly not going to upload myself into a computer.

LC

 



 


John Clark

Mar 10, 2026, 9:31:38 AM
to extro...@googlegroups.com
On Mon, Mar 9, 2026 at 8:05 PM Lawrence Crowell <goldenfield...@gmail.com> wrote:

>The problem with your first argument is that an elementary particle or atom is identical because they share identical quantum properties. As a result they are indistinguishable.

Right, that's what "identical" means and that's why atoms don’t have our initials scratched on them. But I don't see how that's a problem with my argument as you claim. 

>To extend this into problems of brains and mind is a leap that is most likely a category error.

Saying you are a brain would be a category error; saying you are a mind would not be. But mind is what a brain does. And what a brain does depends on how those identical atoms are arranged. And that arrangement can be described by information. So information is as close as you can get to the religious concept of the soul and still remain within the scientific method. Consider the similarities:

The soul is non material and so is information. It's difficult to pin down a unique physical location for the soul, and the same is true for information. The soul is the essential, must have, part of consciousness, exactly the same situation is true for information. The soul is immortal and so, potentially, is information.

 But there are also important differences:

A soul is unique but information can be duplicated. The soul is and will always remain unfathomable, but information is understandable; in fact, information is the ONLY thing that is understandable. Information unambiguously exists, I don't think anyone would deny that, but if the soul exists (which I doubt) it will never be proven scientifically.

>Your argument is a sort of counterfactual, where to say consciousness is a way that data "feels" amounts to putting a different condition on what is meant by information.

The carbon atoms in your brain and mine are absolutely identical and yet the very large arrangement of those identical carbon atoms in our brains results in them behaving differently. And that's the only reason why we are different people. So the difference must not be in the atoms themselves but in how those identical atoms are arranged. And that can be described by information. And that's why it's as close as you can get to the traditional concept of the soul without leaving the scientific method. 

>Consciousness is a way that behavior is in some way self-guided.

I’m not sure what you mean by "self-guided", but I do know that all human actions, and all actions of any sort for that matter, either DO happen because of cause-and-effect or DO NOT happen because of cause-and-effect and are thus random.

>A brain exists not to be intelligent, it exists to generate behavior that enhances the survival of an animal and allows them to pass their genes to the next generation.

And there is a word for exactly that type of behavior: "intelligent". A less intelligent mouse would be less likely to get its genes passed into the next generation than a more intelligent one.

>A mouse is not particularly intelligent, but I do think they have sentience as some small mental landscape.

Perhaps, perhaps not; we’ll never know for sure. I can't prove that a rock is not conscious and I can't prove that a mouse is. You say you think a mouse does have some small amount of consciousness, and I too think that's probably true, but I don't think a mouse is always conscious and I'll bet that you don't either, not if the mouse is sleeping or under anesthesia or dead. And the reason is that when a mouse is in one of those conditions it is not behaving very intelligently. And it's not just a mouse, I would say precisely the same thing about a member of my own species.

>Consciousness is then a property of biological systems.

Some biological systems can produce consciousness, those that can process information intelligently, but as is becoming increasingly obvious on an almost daily basis, biological systems are not the only things that can process information intelligently.  
 
>There is no particular reason to think that if you take the brain of any animal and by some means duplicate each neuron in a digital format, and this process could easily be destructive to the brain, that the mental landscape is transferred to the machine.

No particular reason to think that?! Your upload says it's you, it behaves like you, and it vividly remembers being "you" yesterday. What more is required to be "you"? Does the element carbon have some strange mystical ability that the element silicon lacks? Is it really of paramount importance that your present brain is wet and squishy but your uploaded brain will be dry and hard?  
 
>I am certainly not going to upload myself into a computer.

I wouldn't want to be the first person uploaded; I want to wait until all the bugs have been ironed out. My confidence in the ultimate technical viability of uploading is the only reason I was willing to spend $80,000 to have my brain frozen after my death. About two years ago on this list I gave a more detailed explanation of why I was willing to do that:

"The important thing is that during the freezing process things and structures in the brain stay put, or at least if they must move the flow should not be turbulent so you can figure out where the parts were before they moved. If things are turbulent then a small change in initial conditions will lead to a huge change in outcome and you'll never figure out where things are supposed to go. I don't see why turbulence would happen during the freezing of a brain and I'm not interested in what happens during unfreezing, that's a problem for advanced nanotechnology. I just want to be sure the information inside that frozen lump of tissue remains stable. 

That's why I think Cryonics has a pretty good chance of working at least from a technical viewpoint, whether my brain will actually remain at liquid nitrogen temperatures until the age of nanotechnology, and whether anybody will think I'm worth the bother of reviving are entirely different questions which I have no control over. All I can do is hope for the best.  

Fluid flow stops being smoothly laminar and starts to become chaotically turbulent when a system has a Reynolds number between 2300 and 4000, although you might get some non-chaotic vortices if it is bigger than 30. We can find the approximate Reynolds number by using the formula LDV/N. L is the characteristic size we're interested in; we're interested in cells, so L is about 10^-6 meter. D is the density of water, 10^3 kilograms/cubic meter. V is the velocity of the flow; during freezing it's probably less than 10^-3 meters per second, but let's be conservative, I'll give you 3 orders of magnitude and call V 1 meter per second. N is the viscosity of water; at room temperature N is 0.001 newton-second/meter^2. It would be less than that when things get cold, and even less when water is mixed with glycerol as it is in Cryonics, but let's be conservative again and ignore those factors.

If you plug these numbers into the formula you get a Reynolds number of about 1. 1 is a lot less than 2300 so it looks like any mixing caused by freezing would probably be laminar not turbulent, so you can still deduce the position where things are supposed to be."
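The back-of-the-envelope estimate in the quoted passage can be reproduced in a few lines of Python. This is only a sketch of the same arithmetic, using the deliberately conservative figures John quotes (cell-scale length, water density and viscosity, an overestimated flow velocity), not measured cryopreservation data:

```python
# Reynolds number Re = L * D * V / N, using the conservative figures quoted above.
# Flow is comfortably laminar when Re is far below the ~2300 turbulence threshold.

def reynolds_number(length_m: float, density_kg_m3: float,
                    velocity_m_s: float, viscosity_pa_s: float) -> float:
    """Dimensionless Reynolds number for a flow at the given scale."""
    return length_m * density_kg_m3 * velocity_m_s / viscosity_pa_s

L = 1e-6    # characteristic length: cell scale, meters
D = 1e3     # density of water, kg/m^3
V = 1.0     # flow velocity, m/s (overestimated by ~3 orders of magnitude)
N = 1e-3    # viscosity of water at room temperature, N*s/m^2

Re = reynolds_number(L, D, V, N)
print(round(Re, 6))   # 1.0
print(Re < 2300)      # True: laminar regime, not turbulent
```

With these inputs Re comes out to about 1, matching the figure in the text and sitting far below the 2300 onset of turbulence.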

John K Clark 




Stathis Papaioannou

Mar 10, 2026, 3:48:26 PM
to extro...@googlegroups.com


On Tue, 10 Mar 2026 at 11:05, Lawrence Crowell <goldenfield...@gmail.com> wrote:

>Consciousness is then a property of biological systems. There is no particular reason to think that if you take the brain of any animal and by some means duplicate each neuron in a digital format, and this process could easily be destructive to the brain, that the mental landscape is transferred to the machine.

>I am certainly not going to upload myself into a computer.

There is reason to reject the idea that consciousness depends on the biological substrate of the brain. Suppose a part of the brain is replaced with a non-biological implant that perfectly replicates its input–output behaviour. The rest of the brain interacting with it will behave exactly as before, and the subject as a whole will continue to behave normally.


A problem arises if we assume that only the behaviour and not the consciousness associated with the replaced part is replicated. In that case the subject could lose a significant aspect of their experience, such as visual perception or the ability to understand language, while still behaving exactly as before and insisting that everything is normal.

This would amount to a kind of partial zombie: a person missing an entire domain of conscious experience but unable to notice the loss. Imagine completely losing visual experience while still believing that everything looks as vivid as before, accurately describing your surroundings, navigating the world normally, and never detecting that anything has changed.

This scenario seems absurd. The natural conclusion is that consciousness follows functional organization. If the functional behaviour of the brain were perfectly replicated, consciousness would necessarily be replicated as well.


Lawrence Crowell

Mar 10, 2026, 7:40:35 PM
to extro...@googlegroups.com
Stepping back, ... , I see nothing to indicate that if a brain is copied, or all its states at one time are copied into a computer, consciousness is transferred. If this were done by some hyper-tech magic, where a brain is scanned and almost instantly that information is put in a computer, does consciousness bifurcate? Where do I go? I give you a hefty bet you stay in your body. This would then suggest that if the scan demolishes the brain, you are dead and a robotic zombie acts in your stead.

Of late there has been a lot of hyper-speculations about faster than light travel, Dyson spheres, immortality and so forth. These things are most likely not going to happen. Faster than light travel I think is virtually impossible, Dyson spheres are of a scale such that they may be practically impossible, and ideas of minds in a computer matrix similarly are unlikely. These things are more science fiction than anything really about science.

LC


Lawrence Crowell

Mar 10, 2026, 7:48:37 PM
to extro...@googlegroups.com
It is similar to perturbation theory. A small subregion of the brain can conceivably be replaced by a computer. The functional recovery should be nearly complete, though the charge or patient may detect some differences. The system has been slightly perturbed or tweaked, or what in physics we call an adiabatic perturbation. A complete replacement is a violent perturbation. Generally, the adiabatic case cannot be extended this way to a hard perturbation.

LC

Keith Henson

Mar 10, 2026, 8:24:14 PM
to extro...@googlegroups.com
20 years ago, I gave considerable thought to this topic. Part of that
was revulsion at Hans Moravec's destructive brain-scan proposal.

A technology able to extract information from a brain should be able
to add information. In the story I wrote around this, you can move to
the uploaded state and back to the physical body state without loss of
consciousness and with continuous memory. One-way uploading would be
worse than buying a car without a test drive.

Keith


John Clark

Mar 11, 2026, 8:01:54 AM
to extro...@googlegroups.com
On Tue, Mar 10, 2026 at 7:40 PM Lawrence Crowell <goldenfield...@gmail.com> wrote:

>Stepping back, ... , I see nothing to indicate that if a brain is copied, or all its states at one time are copied into a computer, consciousness is transferred.

What indication would you need to see to change your mind about this? What indication do you see in your fellow human beings that makes you conclude they are conscious that you have failed to see in today's most modern AIs?

>If this were done by some hyper-tech magic,

Drexler-style nanotechnology is not magic, and it does not even require more scientific knowledge; all you need to achieve it is improved engineering.

>where a brain is scanned and almost instantly that information is put in a computer, does consciousness bifurcate?

Yes, bifurcation would occur if the brain scanning were done non-destructively, although a destructive brain scan would be easier to do. As I said before, today there is only one chunk of matter in the entire observable universe that behaves in a Lawrencecrowellian way, but that need not always be the case.

>Where do I go?

You would go to the same place that "fast" goes when a race car slows down because "I" and "you" and "fast" are all adjectives. 
 
>This would then suggest that if the scan demolishes the brain, you are dead and a robotic zombie acts in your stead.

Do you have any evidence that would suggest you are NOT the only conscious entity in the universe? Do you have any proof that all your fellow human beings are not zombies?  

> Faster than light travel I think is virtually impossible,

I agree. I would even be willing to remove the "virtually". And I would say the same thing about traveling to or communicating with the past. Perpetual motion machines too.

>Dyson spheres are of a scale such that they may be practically impossible,

That's another thing that would not require more scientific knowledge, it just needs improved engineering. 

>A small subregion of the brain can conceivably be replaced by a computer. The functional recovery should be nearly complete, though the charge or patient may detect some differences.

If the patient reported he had detected no difference would you admit that the replacement part had been successfully installed?  

>The system has been slightly perturbed or tweaked, or what in physics we call an adiabatic perturbation. A complete replacement is a violent perturbation.

What physicists call something does not change the underlying reality.  

>Generally, the adiabatic case cannot be extended this way to a hard perturbation.

But we're not talking about a generality, we're talking about a specific special case. After a patient's entire biological brain had, one step at a time, been replaced by an electronic brain and after the last step he still reported that he had detected no differences and vigorously insisted he was still the same person, would you admit that the replacement had been successful?

John K Clark

Stathis Papaioannou

Mar 11, 2026, 3:26:54 PM
to extro...@googlegroups.com


The argument is not for technical feasibility; it is an argument showing that consciousness cannot be substrate-dependent.