
CNET "The return of artificial intelligence"


Reini Urban

Jun 30, 2002, 6:31:53 AM

CNET covered yesterday a McKinsey Quarterly report:
"The return of artificial intelligence"

http://news.com.com/2009-1001-940611.html?legacy=cnet&tag=pt.rss..feed.ne_20101046

"Artificial intelligence has come in and out of vogue more times than
Madonna in the past 20 years: It has been hyped and then, having
failed to live up to the hype, been discredited only to be revived
again. (...)"

"Nonetheless, the AI development community has generated techniques
that are beginning to show promise for real business applications.
Like any information system, AI systems become interesting to business
only when they can perform necessary tasks more efficiently or more
accurately or exploit hitherto untapped opportunities. What makes AI
much more likely to succeed now is the fact that the underlying
Web-enabled infrastructure creates unprecedented scope for collecting
massive amounts of information and for using it to automate business
functions. "

original source: http://www.mckinseyquarterly.com/article_abstract.asp?ar=1201&L2=13&L3=13
(needs registration)

Fred Gilham

Jul 3, 2002, 7:24:13 PM

rur...@x-ray.at (Reini Urban) writes:
> CNET covered yesterday a McKinsey Quarterly report:
> "The return of artificial intelligence"

Interesting! I recently saw an announcement about a DARPA initiative
that seemed to be pretty heavily AI oriented as well. My response
upon seeing it was `AI winter is over!' This seems to confirm that
impression.

--
Fred Gilham gil...@csl.sri.com
Perhaps the greatest damage the American system of education has done
to its children is to teach them that their opinions are relevant
simply because they are their opinions.

Paul Wallich

Jul 4, 2002, 12:28:57 AM

In article <u7ofdoz...@snapdragon.csl.sri.com>,
Fred Gilham <gil...@snapdragon.csl.sri.com> wrote:

>rur...@x-ray.at (Reini Urban) writes:
>> CNET covered yesterday a McKinsey Quarterly report:
>> "The return of artificial intelligence"
>
>Interesting! I recently saw an announcement about a DARPA initiative
>that seemed to be pretty heavily AI oriented as well. My response
>upon seeing it was `AI winter is over!' This seems to confirm that
>impression.

It's been a couple of years since the boxes on most people's desks got
enough cycles and memory to execute the algorithms developed 15 years
ago on reasonable problems in reasonable amounts of time. So now the
question is how long it will take to start filling that application
space and develop algorithms to bring the next generation of boxes to
their knees...

paul

Frank A. Adrian

Jul 4, 2002, 12:48:41 AM

Fred Gilham wrote:

> Interesting!  I recently saw an announcement about a DARPA initiative
> that seemed to be pretty heavily AI oriented as well.  My response
> upon seeing it was `AI winter is over!'  This seems to confirm that
> impression.

If you think about it, this makes a lot of sense. What we call AI, as I've
said in other posts, is the set of "things we don't really know how to do
yet". The last twenty or so years have been spent commercializing and
refining the knowledge from the last go-round of "AI" and other DARPA
advanced computer technology. WIMP interfaces, the commercial Internet,
multiprocessing, OO programming, speech recognition, and automatic
translation are some of the many fruits of the last push in "pure"
computer research.

With the decay of government and private research labs and everyone
focusing on technology transfer and incremental improvements, we've pretty
much eaten all of our seed corn. So there needs to be a real push toward
pure computer research if we're to advance beyond the sorry state of the
field today. And when you look at the next set of problems out there -
protein folding, large-scale system simulation, reliable large-scale and
autonomously managed computing networks, better interfaces to our
ubiquitous computing grid, et cetera - it's clear that a simple
quantitative increase in computing power won't do the trick. Small
incremental improvements will not blast us out of the local energy minimum
of the space of possible computing futures we've settled into. Entirely
novel approaches will be needed to move on.

I guess if I had to put it bluntly, we've been focusing on improving what
we know how to do rather than trying to figure out how to do what we don't
know how to do; we've been screwing around with type systems and wizards to
"help" fallible humans program these machines more easily rather than giving
the machines the common sense needed to program themselves in ways that
benefit humans; we've been building mechanistic network monitors and
protection systems to defend our fragile machines rather than making the
machines and their systems smart enough and robust enough to defend
themselves; and we've been drawing prettier windowing systems without asking
whether there's a better way to interact with our machines.

I won't say that none of this kind of work is being done, but it has become
a research backwater. All you need to do is look at the latest OOPSLAs,
which have become the "Conferences on Doing Stuff with/to Java" (until next
year, when they will start turning into the "Conferences on Doing Stuff
with/to C# and .NET"), or the International Conferences on Functional
Programming, which have become the "International Conferences on Doing
Things with Type Systems so your Functional Program will Run 5% Faster".
Sorry if I sound bitter here, but the research mainstream has really turned
to crap. The number of truly original ideas over the last 20 years has been
close to nil.

In any case, the computing industry is starting to be seriously impacted.
Our computing economy is hitting the wall. Mo bigga processors and more of
the same "research" are not needed. What is needed now are new ideas.

I hope that the computing industry and/or government have started to
realize this and I think that there are people smart enough to make changes
in this area. This is ultimately why I think that we will start moving
back to what has traditionally been called "Artificial Intelligence", the
art of trying to do things that no one understands how to do yet. With any
luck, we will not stop short again this time. Lisp, as always, is
well-positioned to help. And this is why I am also hopeful about Lisp.

faa

Roger Corman

Jul 4, 2002, 2:52:52 AM

On Wed, 03 Jul 2002 21:48:41 -0700, "Frank A. Adrian" <fad...@ancar.org> wrote:

>Fred Gilham wrote:
>
>> Interesting!  I recently saw an announcement about a DARPA initiative
>> that seemed to be pretty heavily AI oriented as well.  My response
>> upon seeing it was `AI winter is over!'  This seems to confirm that
>> impression.
>
>If you think about it, this makes a lot of sense. What we call AI, as I've
>said in other posts, is the set of "things we don't really know how to do
>yet". The last twenty or so years have been spent commercializing and
>refining the knowledge from the last go-round of "AI" and other DARPA
>advanced computer technology. WIMP interfaces, the commercial Internet,
>multiprocessing, OO programming, speech recognition, and automatic
>translation are some of the many fruits of the last push in "pure"
>computer research.

(rest deleted)

Thanks very much, Frank. That is a great message, and I agree with your
points down the line. I too feel the state of software is quite mediocre,
with few real advances in the 18 years I have been working in the industry.
The hardware today is certainly far more capable of realizing the visions
that AI researchers have pursued in the past. Hardware has gotten more and
more efficient, while the opposite has basically happened in software: it
is generally bloated and mediocre, and just rides the hardware improvements.
If some new technologies, languages, or techniques could be brought to bear
that took significant advantage of existing hardware capabilities, we could
see amazing improvements in software capability and quality.

I also think Lisp is very well positioned to be a part of the next wave of
innovation, partly because of its value in the past, and partly because
nothing has come along that is any better. If a new language appeared that
was truly innovative and went beyond current Lisp systems in capability, I
would be thrilled to jump on it and start writing compilers for it. After
this many years, I don't think that is very likely to happen. I expect Lisp
will play a big role in the next real innovations in computer science.

Nearly all the knocks that used to be made against Lisp no longer hold.
Lisp was slow? Now it's one of the fastest languages--certainly much faster
than Java, Perl, Python, etc.
Too memory-intensive? Hardly anyone cares anymore. I have never heard that
used as a knock against Java, which is in fact very memory-intensive. I
believe current Lisp systems have gone so far in eliminating memory
inefficiency that Lisp tends to use memory more conservatively than many of
the newer languages.

Non-standard? Well, of course that's history.
Can't do X? Now there are multiple implementations that can do X (whatever
it is).
Drags along the compiler, interpreter, etc. with the runtime executable?
That's one of its big benefits now. Memory is cheap enough that this is the
kind of thing we need more of. I'd like to see other languages start adding
the compiler as a feature of the language library. Most Windows users today
have no idea how big a program is, nor do they care. Most Windows programs
involve dozens of DLLs and other components, and install to hundreds of
megabytes. Nobody cares. Why not include full, rich core libraries as a
standard feature? No need to strip anything out.
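
To make that concrete, here is a rough sketch in standard Common Lisp of
what having the compiler around at run time buys you (the names here are
just illustrative, and DISASSEMBLE's output varies by implementation): code
constructed on the fly can be compiled to native code on the spot.

  ;; Build a lambda form at run time and hand it to COMPILE, which
  ;; returns a compiled function object.
  (defun make-adder (n)
    (compile nil `(lambda (x) (+ x ,n))))

  (defparameter *add-5* (make-adder 5))  ; compiled function with 5 baked in

  (funcall *add-5* 10)   ; => 15
  (disassemble *add-5*)  ; prints machine code, not an interpreted form

In a language that strips the compiler out of its deliverables, the same
trick means shipping a separate toolchain alongside the program.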

Of course, on the other side of the ledger, most of us can name quite a few
things that Lisp does well and that hardly any other languages, certainly
not the mainstream ones, can do at all--that is, without implementing half
of Common Lisp (ack to Philip Greenspun).

Roger
