A.I. Is Already Intelligent. This Is How It Becomes Conscious.


John Clark

Nov 9, 2025, 9:07:25 AM
to ExI Chat, extro...@googlegroups.com, 'Brent Meeker' via Everything List
Explore this gift article from The New York Times. You can read it for free without a subscription.

A.I. Is Already Intelligent. This Is How It Becomes Conscious.

Skeptics overlook how our concepts change.

https://www.nytimes.com/2025/11/08/opinion/ai-conscious-technology.html?unlocked_article_code=1.z08.X2fy.YlX39OExK1hi&smid=em-share

Brent Meeker

Nov 9, 2025, 4:27:40 PM
to everyth...@googlegroups.com
There are different kinds of consciousness. For some reason these discussions of AI always assume the "I'm thinking that I feel sad (or happy or whatever)" kind. My dog feels happy or sad; he just doesn't think about it. AIs think about it because that's something you do in words, but do they feel it? I know my dog feels it because when he's happy he's energetic, wants to play, wags his tail,... AI needs to be embodied to connect feeling to anything beyond words. Not that I think embodying AI would be a good thing now.

Brent

John Clark

Nov 9, 2025, 4:58:23 PM
to everyth...@googlegroups.com
On Sun, Nov 9, 2025 at 4:27 PM Brent Meeker <meeke...@gmail.com> wrote:


 
My dog feels happy or sad; he just doesn't think about it. AIs think about it because that's something you do in words, but do they feel it?

The short answer is I don't know. I know for a fact that I am capable of feeling happy and sad, but I'll never know for sure if an AI can actually feel happy or sad, and I could say exactly the same thing about you. But there are few things in life we can be absolutely certain about, so we must do the best we can in the absence of certainty, and I think it would be a pretty good assumption that you, your dog, and the AI are all capable of feeling happy and sad.

So should we take morality into account in the way we treat AIs? I don't think that is an interesting question, because in the long run it makes no difference whether the answer is yes or no. A much more interesting question is: should an AI take morality into account in the way it treats humans?

I know my dog feels it because when he's happy he's energetic, wants to play, wags his tail,

What makes you think those actions have anything to do with an internal emotional state?  

John K Clark


Brent Meeker

Nov 9, 2025, 5:10:59 PM
to everyth...@googlegroups.com
What makes you think your wife is mad when she slams the bedroom door?

Brent


John Clark

Nov 10, 2025, 6:18:11 AM
to everyth...@googlegroups.com
On Sun, Nov 9, 2025 at 5:10 PM Brent Meeker <meeke...@gmail.com> wrote:

What makes you think your wife is mad when she slams the bedroom door?

I think she is probably angry for the same reason that I think an AI is probably conscious: I deduced it from behavioral actions. It's not a proof, but I think it's a pretty good hypothesis. The only alternative is solipsism, and I simply could not function if I really believed in that.

John K Clark 


Brent Meeker

Nov 10, 2025, 2:28:06 PM
to everyth...@googlegroups.com
So you infer your wife has internal emotional states from her behavior. I'd add that she is also much more like you than an LLM is. Even a dog or an octopus is more like you than an LLM is; they are both embodied and adapted to living in a 4D world.

Brent

John Clark

unread,
Nov 10, 2025, 3:40:43 PM (11 hours ago) Nov 10
to everyth...@googlegroups.com
On Mon, Nov 10, 2025 at 2:28 PM Brent Meeker <meeke...@gmail.com> wrote:

Even a dog or an octopus is more like you than an LLM is, 

These days I can have a stimulating philosophical conversation with an LLM, but when I discuss philosophy with my dog the conversation tends to be somewhat less interesting.

they are both embodied

 Was Stephen Hawking "embodied"? Even today's clumsy robots can directly manipulate things in the physical world better than he could.

 John K Clark    See what's on my new list at  Extropolis

Brent Meeker

Nov 10, 2025, 5:56:57 PM
to everyth...@googlegroups.com


On 11/10/2025 12:40 PM, John Clark wrote:

 Was Stephen Hawking "embodied"? Even today's clumsy robots can directly manipulate things in the physical world better than he could.
Yes, he was embodied. He was always in some particular place; I just used that as an example. He was like you in many other ways too.

Brent