It's too late to stop GPT4 now


John Clark

Apr 2, 2023, 3:35:40 PM4/2/23
to 'Brent Meeker' via Everything List
This video is a summary of several technical papers that have come out in the last 72 hours; apparently GPT-4 can now improve itself without human help by self-reflecting on its errors, and can even design better hardware for itself. 


John K Clark    See what's on my new list at  Extropolis

Jason Resch

Apr 2, 2023, 4:25:27 PM4/2/23
to Everything List
"Let an ultraintelligent machine be defined as a machine that can far surpass all the intellectual activities of any man however clever. Since the design of machines is one of these intellectual activities, an ultraintelligent machine could design even better machines; there would then unquestionably be an 'intelligence explosion,' and the intelligence of man would be left far behind... Thus the first ultraintelligent machine is the last invention that man need ever make, provided that the machine is docile enough to tell us how to keep it under control. It is curious that this point is made so seldom outside of science fiction. It is sometimes worthwhile to take science fiction seriously."
-- I.J. Good

--
You received this message because you are subscribed to the Google Groups "Everything List" group.
To unsubscribe from this group and stop receiving emails from it, send an email to everything-li...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/everything-list/CAJPayv1Zdxw4fhV7Vq%3DWHDL5SOUNbnyCNgFmjuEo8%3DqOAC%2Busg%40mail.gmail.com.

spudb...@aol.com

Apr 2, 2023, 5:05:59 PM4/2/23
to johnk...@gmail.com, everyth...@googlegroups.com
Maybe someday we humans can do the same? 


Russell Standish

Apr 8, 2023, 2:39:53 AM4/8/23
to 'Brent Meeker' via Everything List
What struck me when watching this video is the uncanny similarity of
this mechanism to Steven Pinker's proposed "mind's big bang",
which took place in human minds about 40,000 years ago.

It all came down to using language for the disparate modules of the
human brain to talk to each other, likened to individual chapels
uniting to form a cathedral.

I would predict that human-level intelligence may be matched in 2025
with GPT-5, only five years later than Ray Kurzweil's prediction, which
might mean the singularity is on course for some time in the 2050s...

Cheers

--

----------------------------------------------------------------------------
Dr Russell Standish Phone 0425 253119 (mobile)
Principal, High Performance Coders hpc...@hpcoders.com.au
http://www.hpcoders.com.au
----------------------------------------------------------------------------

Stathis Papaioannou

Apr 8, 2023, 7:31:01 AM4/8/23
to everyth...@googlegroups.com
On Sat, 8 Apr 2023 at 16:39, Russell Standish <li...@hpcoders.com.au> wrote:
> What struck me when watching this video is the uncanny similarity of
> this mechanism to Steven Pinker's proposed "mind's big bang",
> which took place in human minds about 40,000 years ago.
>
> It all came down to using language for the disparate modules of the
> human brain to talk to each other, likened to individual chapels
> uniting to form a cathedral.
>
> I would predict that human level intelligence may be matched in 2025
> with GPT-5, only 5 years later than Ray Kurzweil's prediction, which
> might mean the singularity is on course for some time in the 2050s...

Why such a long gap between gaining human level intelligence and the singularity?
--
Stathis Papaioannou

John Clark

Apr 8, 2023, 7:40:59 AM4/8/23
to everyth...@googlegroups.com
On Sat, Apr 8, 2023 at 2:39 AM Russell Standish <li...@hpcoders.com.au> wrote:

> I would predict that human level intelligence may be matched in 2025
> with GPT-5, only 5 years later than Ray Kurzweil's prediction,
 
Actually, Kurzweil predicted that "computers will be routinely passing the Turing test by 2029", so his prediction was too conservative: although it's not "routine" quite yet, I would argue that one computer program passed the Turing Test one month ago, and that by 2025 human-level AI will be ubiquitous. As for GPT-5, some say it's already operational, but OpenAI is checking it for safety and will not release it to the world until late this year or early next. OpenAI has only 375 employees, a number small enough to keep a secret for a few months; after all, we now know that GPT-4 first became operational last August, although the world didn't find out about it until March. 

John K Clark    See what's on my new list at  Extropolis

John Clark

Apr 8, 2023, 7:46:42 AM4/8/23
to everyth...@googlegroups.com
On Sat, Apr 8, 2023 at 7:31 AM Stathis Papaioannou <stat...@gmail.com> wrote:

> Why such a long gap between gaining human level intelligence and the singularity?

That is a very good question but I don't have a very good answer so I don't think there will be a long gap. Fasten your seatbelts, we're in for a bumpy ride. 

John K Clark    See what's on my new list at  Extropolis


Russell Standish

Apr 8, 2023, 8:19:44 AM4/8/23
to everyth...@googlegroups.com
Don't forget it requires a society of hundreds of millions of
human-level intelligences to make a GPT-4.

And it takes a human-level intelligence some 20 years to be able to
make meaningful contributions to something like GPT-4.

Progress will therefore continue to be exponential for some time to
come. Only when superhuman intelligence is able to design itself will
hyperbolic progress begin. It will also need to better the energy
efficiency of human brains, and it is still orders of magnitude away
from that.

In saying 25 years to singularity, I was simply taking Kurzweil's
timeline, and adding the 5 years he was out by.

John Clark

Apr 8, 2023, 3:12:25 PM4/8/23
to everyth...@googlegroups.com
On Sat, Apr 8, 2023 at 8:19 AM Russell Standish <li...@hpcoders.com.au> wrote:

> Don't forget it requires a society of hundreds of millions of human
> level intelligences to make a GPT-4. And it take a human level intelligence
> some 20 years in order to make meaningful contributions to something like
> GPT-4. Progress will therefore continue to be be exponential for some time
> to come. Only when super human intelligence is able to design itself will
> hyperbolic progress begin. 


Although a brilliant theoretician is certainly extremely helpful, most areas of science require more than that; they need experimental evidence, and so new knowledge in those fields will not grow at the same explosive rate as computer intelligence does. However, two fields do not require experimental evidence and so should grow as rapidly as intelligence does: mathematics and software development, including smart software that can write even smarter software. And there are mountains of existing data on physics and biology, with almost certainly some unknown gems hiding in there that nobody has spotted, but that could be found with new mathematical techniques and better software.

> It will also need to better the energy efficiency of human brains, and it is still orders of magnitude away from that.

Take a look at this video; it talks about Nvidia's new chip. With a data center using it, an AI system that had required 35 MW to run will need only 5 MW to do the same thing. 


By the way, I think mathematicians and software developers will be the first to lose their jobs, perhaps they could be retrained as coal miners.   

John K Clark    See what's on my new list at  Extropolis


spudb...@aol.com

Apr 8, 2023, 5:05:00 PM4/8/23
to johnk...@gmail.com, everyth...@googlegroups.com
Follow-up from Fox, of all sources: "Moral judgements"






Russell Standish

Apr 8, 2023, 8:08:37 PM4/8/23
to everyth...@googlegroups.com
On Sat, Apr 08, 2023 at 03:11:47PM -0400, John Clark wrote:
> On Sat, Apr 8, 2023 at 8:19 AM Russell Standish <li...@hpcoders.com.au> wrote:
>
>
> > Don't forget it requires a society of hundreds of millions of human
> level intelligences to make a GPT-4. And it take a human level intelligence
> some 20 years in order to make
> meaningful contributions to something like GPT-4.
> Progress will therefore continue to be be exponential for some time to
> come. Only when super human intelligence is able to design itself will
> hyperbolic progress begin. 
>
>
> Although certainly extremely helpful most areas of science require more than
> just a brilliant theoretician, they need experimental evidence, and so new
> knowledge in those fields will not grow at the same explosive rate as computer
> intelligence does; however there are two fields that do not require experiment
> evidence and so should grow as rapidly as intelligence does, mathematics and
> software development, including smart software they can write even smarter
> software. And there are mountains of data on physics and biology that already
> exist and they're almost certainly unknown gems hiding in there that nobody has
> spotted, but with new mathematical techniques and better software they could be
> found.
>

Sure - I was trying to proffer some suggestions as to why Ray Kurzweil
suggested 25 years between attaining human-level computational ability
and the singularity. I haven't read his book, just summaries - maybe
someone who has could enlighten us.

BTW - I still think we haven't cracked the problem of open-ended
creativity, which is essential for something like the singularity to
occur, but recent developments have led me to believe it might be
achieved sooner rather than later. Ten years ago I'd have said the
singularity wouldn't appear before 2070 (I probably did say so, though
not publicly). Now I've brought that forward to the 2050s.


>
> > It will also need to better the energy efficiency of human brains, and it
> is still orders of magnitude away from that.
>
>
> Take a look at this video, it talks about Nvidia's new chip, with a data center
> using it an AI system that had required 35 MW to run will only need 5 MW to do
> the same thing. 
>
> Nvidia's HUGE AI Breakthrough is Bigger Than ChatGPT

That is a sevenfold improvement, not quite one order of magnitude. My
understanding is that about 4-5 orders of magnitude are required
before machines can really take over the world. It will happen, but
at the present rate of exponential progress (classic Moore's law) that
will take 2-3 decades.
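As a back-of-envelope check of that timeline (a sketch only; the classic two-year doubling period is an assumption, not a figure from the thread):

```python
import math

DOUBLING_PERIOD_YEARS = 2  # assumed classic Moore's-law doubling period

def years_to_close(orders_of_magnitude: float) -> float:
    """Years to close an efficiency gap of the given orders of magnitude."""
    doublings_needed = orders_of_magnitude * math.log2(10)
    return doublings_needed * DOUBLING_PERIOD_YEARS

# A 7x gain is log10(7) orders of magnitude -- "not quite one".
print(f"7x improvement = {math.log10(7):.2f} orders of magnitude")

# Closing a 4-5 order-of-magnitude gap at two-year doublings:
print(f"4 orders of magnitude: {years_to_close(4):.0f} years")
print(f"5 orders of magnitude: {years_to_close(5):.0f} years")
```

At that pace, 4-5 orders of magnitude works out to roughly 27-33 years, consistent with the "2-3 decades" estimate above.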

Current AI systems like GPT-4 require the resources of a small town of
several thousand people for training.

GPT-4 is about 800 billion parameters, IIUC. A human brain has over a
trillion synapses, so it's certainly getting close.


>
> By the way, I think mathematicians and software developers will be the first to
> lose their jobs, perhaps they could be retrained as coal miners.   
>

I don't think they'll be the first :). ATM, GPT systems seem to have
an enormous propensity to make shit up, but less skill in making shit
up that is correct. ISTM the creative arts might be the first to lose
their jobs.


> John K Clark    See what's on my new list at  Extropolis