It does look pretty clear that McCarthy was referring to his 1960 paper, by both date and phrase reference.
Here is a provocative quote I just found while googling on these topics:
"A language should be designed in terms of an abstract syntax and it should have perhaps, several forms of concrete syntax: one which is easy to write and maybe quite abbreviated; another which is good to look at and maybe quite fancy... and another, which is easy to make computers manipulate... all should be based on the same abstract syntax... the abstract syntax is what the theoreticians will use and one or more of the concrete syntaxes is what the practitioners will use. John McCarthy, creator of Lisp
http://www.dwheeler.com/readable/
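To make McCarthy's point concrete, here is a minimal sketch (in Python, purely for illustration) of one abstract syntax rendered into two concrete syntaxes: the familiar S-expression form and the M-expression form from his 1960 paper. The tree representation and rendering rules are simplified assumptions, not anyone's actual implementation.

```python
# One abstract syntax (nested tuples), two concrete syntaxes rendered from it.

def to_sexp(node):
    """Render the abstract tree as an S-expression: (f a b)."""
    if isinstance(node, tuple):
        op, *args = node
        return "(" + " ".join([op] + [to_sexp(a) for a in args]) + ")"
    return node

def to_mexp(node):
    """Render the same tree as an M-expression: f[a; b]."""
    if isinstance(node, tuple):
        op, *args = node
        return op + "[" + "; ".join(to_mexp(a) for a in args) + "]"
    return node

# Abstract syntax for car[cons[A; B]]
tree = ("car", ("cons", "A", "B"))

print(to_sexp(tree))  # (car (cons A B))
print(to_mexp(tree))  # car[cons[A; B]]
```

Both printers walk the same tree; only the surface notation differs, which is exactly the separation McCarthy describes.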
Joe
--- On Sat, 3/30/13, Anton van Straaten <an...@appsolutions.com> wrote:
https://news.ycombinator.com/newest
--- On Sat, 3/30/13, Geoffrey S. Knauth <ge...@knauth.org> wrote:
I can't help but think that Lisp is preferable for at least non-financial, non-engineering type problems, in other words most programming problems, because most such problems are not well defined. The client for most projects says "I want X," and X changes daily as the project proceeds. On the other hand, learning Haskell and its coding paradigm now seems like a rightfully required part of compsci.
Are most larger Haskell projects now done by piecing together library components, as mentioned? The 'libraries and teams' approach was a huge factor in the popularity of both Linux (in general) and Java's development platform, and brings up 'the curse of the Lisp programmer':
https://news.ycombinator.com/item?id=2450973
Although I don't see that as a curse, just a natural tendency arising from where Lisp's strengths lie.
Concerning the second rise and fall of Lisp AI in the 80's, what seems to have happened is that expert systems, the most important 80's AI technology, were new enough that the flexible power of Lisp was needed to create and work with them. But after that, three things seem to have happened:
A. Expert systems didn't ultimately provide an automated expert on demand, and the 'public' wrongly perceived that as a failure, though previous AI PR claims didn't help the situation.
B. There was a lack of understanding in general that effective long term use of expert systems required a certain level and type of training in how to use them.
C. Expert systems and other AI techniques developed by Lisp coders matured to a 'crystallization' point, and were ported as libraries to other languages like C, C++, MS C++, Perl, and ultimately Java. These other languages were targeted specifically at Wintel, Sun, and later Linux machines, which out-commoditized Lisp machines in the marketplace, due to 'worse is better.'
I think another reason Lisp isn't as popular as it once was is that in the 80's, a larger percentage of programmers came from elite institutions like MIT, where the cutting edge of programming was constantly, aggressively pursued. When that type of culture predominates, languages such as Lisp are pushed to the front. I think the reason Lisp didn't catch on with the masses once they started picking up computers and programming during the 80's is that to make the most powerful use of Lisp, to even understand why it's desirable to code in a language that looks like math symbolism, one has to understand and even have experience with some fairly deep compsci topics: namely, the importance and nature of abstraction, the nature of compilation, the effect various language constructs have on compilation, and the what, why, and how of those constructs. That is a fairly tall order, unlikely to be met outside a community of compsci-pursuing programmers.
I have always been surprised that it took until the 90's for a book like On Lisp to come out, and that book is still pretty much one of a kind. The power of Lisp was only starting to be well communicated outside of MIT and a few other places by the 80's, and after that it got swamped by the tremendous noise of the commoditization of computing.
I do think there is a huge future for Lisp, because it seems to be the ideal platform not only for ill-defined programming problems, but as a base on which to build every other language. If every language were built on, say, Scheme, then the programmer could mix and match approaches and libraries from all the different language communities. This would be done by decomposing down to Scheme and back up to the other language when needed. Such a system would also be ideal for the AI-assisted style of programming that Geoffrey alluded to. PG has a great article on this general area:
http://paulgraham.com/popular.html
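As a toy illustration of that "decompose down to a common core" idea, here is a sketch (in Python, for illustration only) that translates a small infix arithmetic expression into a Scheme-style s-expression, the hypothetical shared substrate. Real cross-language interop would need far more than surface translation (shared semantics, runtimes, data models); this only shows the syntactic half of the idea.

```python
import ast

def py_expr_to_scheme(src):
    """Translate a Python arithmetic expression into an s-expression string."""
    ops = {ast.Add: "+", ast.Sub: "-", ast.Mult: "*", ast.Div: "/"}

    def walk(node):
        if isinstance(node, ast.BinOp):
            return f"({ops[type(node.op)]} {walk(node.left)} {walk(node.right)})"
        if isinstance(node, ast.Constant):
            return str(node.value)
        if isinstance(node, ast.Name):
            return node.id
        raise ValueError(f"unsupported node: {node!r}")

    return walk(ast.parse(src, mode="eval").body)

print(py_expr_to_scheme("1 + 2 * x"))  # (+ 1 (* 2 x))
```

Going "back up" would mean walking the s-expression tree and emitting the target language's concrete syntax, the mirror image of this function.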
Joe
--- On Sun, 3/31/13, Geoffrey S. Knauth <ge...@knauth.org> wrote:
> From: Geoffrey S. Knauth <ge...@knauth.org>
> Subject: Re: A toy implementation of M-expressions
40's: Lambda Calculus, early functional papers
50's: pre-Lisp AI material, early functional papers
early 60's: cons cell machine code, primitives machine code, eval
later 60's: garbage collection, macros, pre-Planner AI material
70's: Planner, SHRDLU, the MacLisp collection, the Lambda Papers
80's: the what, why, and how of Common Lisp, expert systems, Emacs, AutoCAD, Genera
90's: Common Lisp logistical systems, Viaweb
00's: Reddit, ITA, Clojure
Joe
PS: Fowler recently wrote a book on DSLs, which shows the need and market for a classic Lisp code book:
http://martinfowler.com/books/dsl.html
--- On Sun, 3/31/13, Joe O'Donnell <gal...@yahoo.com> wrote:
Yes. He says as much in the introductory chapter.