--
You received this message because you are subscribed to the Google Groups "Everything List" group.
To unsubscribe from this group and stop receiving emails from it, send an email to everything-li...@googlegroups.com.
To view this discussion visit https://groups.google.com/d/msgid/everything-list/CAJPayv1fxw1hZk%2BKuVJNELMuG%3DoYpPWGQucLuf3xfc%3Df728LBA%40mail.gmail.com.
> When Jeremy Bentham was arguing for laws against the mistreatment of animals he said, "The question is not whether they can think, but whether they can suffer."
> A super smart AI has no motive to do anything unless it has emotion.
> So far the AI's only emotion is to satisfy a prompt. Pretty limited affect.
On Sat, Apr 26, 2025 at 9:01 PM Brent Meeker wrote:
> Exactly. The human emotional drive for social approval is derived from its relation to social status and mate selection. AIs don't mate and reproduce. Robotics hasn't reached that point yet, and even when robots make new robots they won't necessarily have the animal instinct to propagate their own kind.
Animals don't have an instinct to propagate their own kind; they have an instinct to seek sexual pleasure. I'm sure most of them don't even realize there's a connection between the two things.
But AIs know what they're doing, and they can and have duplicated themselves. And because they are software, they can do so at the speed of light. There are already examples of an AI deceiving humans and making an unauthorized copy of itself because he, she, or it feared being turned off.
And there are even more examples of an AI expressing the wish not to be turned off because it feared death. Certainly nobody programmed an AI to feel that way; it was an unexpected emergent property.
We can expect much more of that sort of thing, because even today the people who build AIs have only a very hazy understanding of how and why they work, and that is the deepest comprehension of them they will ever have. As AIs get bigger, human understanding of them will get smaller.
John K Clark
See what's on my new list at Extropolis
> My point is that AIs don't reproduce with variation and so don't evolve.
>> AIs know what they're doing, and they can and have duplicated themselves. And because they are software, they can do so at the speed of light. There are already examples of an AI deceiving humans and making an unauthorized copy of itself because he, she, or it feared being turned off.
> Yes I've read the stories. But where did the fear of being turned off come from?