Ilya Sutskever says "Superintelligence is within reach"
John Clark
Jun 20, 2024, 8:08:18 AM
to extro...@googlegroups.com, 'Brent Meeker' via Everything List
I don't think anybody knows more about the science of AI than Ilya Sutskever, at least nobody human, and apparently he now considers Artificial General Intelligence to be boring old-hat stuff, because he says "Superintelligence is within reach". And that prospect has scared the hell out of him. Sutskever saw something last November that spooked him, and he's the one who orchestrated the attempt to kick out Sam Altman because he didn't think Altman was emphasizing safety enough. That attempt failed, so he left OpenAI, and today he started his own company called "Safe Superintelligence".
> It is my opinion that superintelligence is inevitable. Whatever
> downsides there are, we can't avoid them. There are expected upsides
> as well, so we might as well rush ahead and get them.
That is also my opinion. The best thing to do is just get on with it and hope for the best. Who knows, we might even survive.
John K Clark See what's on my new list at Extropolis
Giulio Prisco
Jun 21, 2024, 2:15:17 AM
to everyth...@googlegroups.com
On Thu, Jun 20, 2024 at 8:18 PM John Clark <johnk...@gmail.com> wrote:
>
> On Thu, Jun 20, 2024 at 1:25 PM Keith Henson <hkeith...@gmail.com> wrote:
>
>> It is my opinion that superintelligence is inevitable. Whatever
>> downsides there are, we can't avoid them. There are expected upsides
>> as well, so we might as well rush ahead and get them.
>
I AGREE.
>
> That is also my opinion. The best thing to do is just get on with it and hope for the best. Who knows, we might even survive.
>
> John K Clark See what's on my new list at Extropolis