A toy implementation of M-expressions


Anton van Straaten

Mar 30, 2013, 1:53:52 AM
to li...@lispnyc.org
Revisiting McCarthy's 1960 paper tonight, "Recursive Functions of
Symbolic Expressions and Their Computation by Machine, Part I":

http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.50.9776&rep=rep1&type=pdf


...I couldn't resist converting his M-expression version of a symbolic
differentiator into Scheme, which was simple enough. But then I
wondered what it would look like in Haskell, given that Haskell has
something of a resemblance to M-expressions.

The end result, after a bit of tweaking, was remarkably close to
McCarthy's M-expression version. Here's McCarthy's version from the paper:

diff [y; x] =
  [atom [y] → [eq [y; x] → ONE; T → ZERO];

   eq [car [y]; PLUS] →
     cons [PLUS; maplist [cdr [y]; λ[[z]; diff [car [z]; x]]]];

   eq [car [y]; TIMES] →
     cons [PLUS; maplist [cdr [y];
       λ[[z];
         cons [TIMES;
           maplist [cdr [y];
             λ[[w];
               [¬ eq [z; w] → car [w];
                T → diff [car [w]; x]]]]]]]]]

Below is an M-expression-like Haskell version. Forget about trolling
Java One as Ray suggested: Lisp fans can now claim that McCarthy
invented Haskell, and wrote the first Haskell programs, in 1960. Which
is just as well, since Alonzo Church wrote the first Lisp programs in
1936.

The rest of this message is a working Haskell program:

diff [y, x] =
  atom [y] → (eq [y, x] → ONE $ ZERO) $

  eq [car [y], PLUS] →
    cons [PLUS, maplist [cdr [y], lambda$ \z -> diff [car [z], x]]] $

  eq [car [y], TIMES] →
    cons [PLUS, maplist [cdr [y],
      lambda$ \z ->
        cons [TIMES,
          maplist [cdr [y],
            lambda$ \w ->
              z /= w → car [w] $
              diff [car [w], x]]]]] $

  error ("Unexpected term: " ++ show y)

-- Test:
mctest1 = diff [List [TIMES, X, List [PLUS, X, A], Y], X]

-- Output of test:
-- List [PLUS,
--       List [TIMES,ONE,List [PLUS,X,A],Y],
--       List [TIMES,X,List [PLUS,ONE,ZERO],Y],
--       List [TIMES,X,List [PLUS,X,A],ZERO]]

--
-- Some code is needed to support M-expression idioms, as follows:
--

-- Toy S-expression data type, including symbols needed for example
data SExpr = ZERO | ONE | PLUS | TIMES | X | A | Y
           | List [SExpr]
           | Lambda Fn
           deriving (Eq, Show)

-- constructor for lambdas in SExprs
lambda = Lambda . Fn

-- Support functions that take Haskell lists of SExprs as arguments,
-- to emulate M-expression function call syntax

car [List t] = head t
car _ = error "car of non-pair"

cdr [List t] = List (tail t)
cdr _ = error "cdr of non-pair"

cons [h, List t] = List (h : t)
cons [_, _] = error "cons with non-list" -- our lists aren't pairs

atom [List _] = False
atom _ = True

eq [x, y] = x == y

maplist [List x, Lambda (Fn f)] = List (ml x)
  where ml [] = []
        ml x@(_:t) = f (List x) : ml t

-- McCarthy conditional arrows
infixl 1 →
(→) :: Bool -> a -> a -> a
(→) True x _ = x
(→) False _ y = y

-- Wrapper for Haskell functions in SExprs
data Fn = Fn (SExpr -> SExpr)
instance Eq Fn where _ == _ = error "Cannot compare functions"
instance Show Fn where show _ = "Function"

-- ()

Joe O'Donnell

Mar 30, 2013, 2:02:45 PM
to li...@lispnyc.org, Anton van Straaten
Fantastic posts. Before, I was asking what the similarities between Lisp and Haskell are; after reading this, I'm asking what the difference is. It would seem the difference is that Lisp is an extended Untyped Lambda Calculus, and Haskell is an extended Typed Lambda Calculus. The Haskell code post is, at minimum, Hacker News worthy. http://news.ycombinator.com

It does look pretty clear that McCarthy was referring to his 1960 paper, by both date and phrase reference.

Here is a provocative quote I just found while googling on these topics:

"A language should be designed in terms of an abstract syntax and it should have perhaps, several forms of concrete syntax: one which is easy to write and maybe quite abbreviated; another which is good to look at and maybe quite fancy... and another, which is easy to make computers manipulate... all should be based on the same abstract syntax... the abstract syntax is what the theoreticians will use and one or more of the concrete syntaxes is what the practitioners will use." -- John McCarthy, creator of Lisp
http://www.dwheeler.com/readable/

Joe


Geoffrey S. Knauth

Mar 30, 2013, 2:34:18 PM
to Anton van Straaten, li...@lispnyc.org
On Mar 30, 2013, at 01:53, Anton van Straaten <an...@appsolutions.com> wrote:

> Lisp fans can now claim that McCarthy invented Haskell, and wrote the first Haskell programs, in 1960. Which is just as well, since Alonzo Church wrote the first Lisp programs in 1936.

Anton, your code is really neat. I think you should share it with William R. Cook, who is writing a new textbook, Anatomy of Programming Languages.

http://www.cs.utexas.edu/~wcook/

You can be a collaborator!

Your mention of Church reminded me, a friend of mine (Hartley Rogers, Jr.) had Alonzo Church as a thesis advisor, and went to grad school with John Nash [1]. It is good to have heroes in a mixed-up world.

Geoff

[1] http://www.flickr.com/photos/gknauth/4655152606/in/set-72157624046322781

Joe O'Donnell

Mar 30, 2013, 2:45:08 PM
to li...@lispnyc.org
I submitted the post to Hacker News; if we mod it up to at least 20 points it may get on the front page:

https://news.ycombinator.com/newest



Anton van Straaten

Mar 30, 2013, 10:29:51 PM
to Joe O'Donnell, li...@lispnyc.org
On 03/30/2013 02:02 PM, Joe O'Donnell wrote:
> It would seem the difference is Lisp is an extended Untyped Lambda
>Calculus, and Haskell is an extended Typed Lambda Calculus.

Yes, although untyped/typed is just one dimension in which Haskell and
Lisp were extended in different and even opposite directions.

Some of the other dimensions are (Lisp choice listed first): mutability
vs. immutability for variables and data structures; strict vs.
non-strict evaluation; and function arguments as lists vs.
single-argument functions that support partial application.
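Two of those dimensions are easy to see side by side in a small sketch (all names here are mine, for illustration):

```haskell
-- Curried by default: every Haskell function takes one argument,
-- so partial application comes for free.
add :: Int -> Int -> Int
add x y = x + y

addTen :: Int -> Int
addTen = add 10          -- supply only the first argument

-- Lisp-style "arguments as a list", as emulated in the diff program above.
addL :: [Int] -> Int
addL [x, y] = x + y
addL _      = error "addL expects exactly two arguments"

-- Non-strict evaluation: an infinite list is fine as long as we
-- only demand a finite prefix of it.
firstSquares :: Int -> [Int]
firstSquares n = take n [k * k | k <- [1 ..]]
```

Here `addTen 5` is 15, `addL [3, 4]` is 7, and `firstSquares 3` is [1,4,9], the last one despite the comprehension ranging over an infinite list.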

All together, these make for significantly different languages, which
share a similar core based on the lambda calculus.

Finally, Lisp is homoiconic and Haskell is not, although the existence of
GHC's Template Haskell, basically a macro language for Haskell, relates
directly to your McCarthy quote about abstract syntax:

> "A language should be designed in terms of an abstract syntax and it
> should have perhaps, several forms of concrete syntax: one which is
> easy to write and maybe quite abbreviated; another which is good to
> look at and maybe quite fancy... and another, which is easy to make
> computers manipulate... all should be based on the same abstract
> syntax... the abstract syntax is what the theoreticians will use and
> one or more of the concrete syntaxes is what the practitioners will
> use." John McCarthy, creator of Lisp
> http://www.dwheeler.com/readable/

Haskell has two concrete syntaxes for writing ordinary code: the
standard indentation-sensitive one, which minimizes punctuation, and a
braces-and-semicolons one, which is indentation-insensitive.
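For instance, a throwaway definition of mine in both syntaxes, which GHC treats as the same program:

```haskell
-- Layout (indentation-sensitive) syntax:
halfPerimeter :: Double -> Double -> Double
halfPerimeter w h =
  let total = 2 * (w + h)
      half  = total / 2
  in half

-- The same definition in braces-and-semicolons syntax, where
-- indentation no longer matters:
halfPerimeter' :: Double -> Double -> Double
halfPerimeter' w h = let { total = 2 * (w + h) ; half = total / 2 } in half
```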

On top of that, Template Haskell provides AST data types for
programmatically manipulating Haskell code at compile time, complete
with quasi-quoting ability etc.
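A minimal sketch of what that looks like in practice (assuming GHC with the TemplateHaskell extension enabled):

```haskell
{-# LANGUAGE TemplateHaskell #-}

-- [| ... |] quotes an expression into its abstract syntax tree;
-- $( ... ) splices an AST back into the program at compile time.
answer :: Int
answer = $( [| 6 * 7 |] )
```

The multiplication here is performed by ordinary compiled code, but the expression itself was built and spliced in during compilation.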

This loses Lisp's ability to make macros look like the code they're
going to generate, but aside from the reduced notational convenience it's
equally powerful.

McCarthy is right: most serious languages should provide something like
this, but most don't - or alternatively, most languages are not serious...

Anton

Geoffrey S. Knauth

Mar 31, 2013, 5:43:45 AM
to li...@lispnyc.org
On Sat, Mar 30, 2013, at 22:29, Anton van Straaten wrote:
> untyped/typed is just one dimension in which Haskell and
> Lisp were extended in different and even opposite directions.

Maybe there is another axis: discipline. Haskell imposes some strict
rules and offers certain guarantees. Lisp is freer about letting you
create your own rules.

Lisp coders have created great works of art and adaptable functionality
only limited by the intellect of the authors. The greatest burst was in
the 1980s. Did less Lisp get written afterward (per average developer)
just because of AI Winter? Or did the freedom of Lisp outpace the
average programmer's ability? Plenty of people still write Lisp or
Scheme or Racket, thank goodness, but the explosion of interest seems a
past event. Maybe it will happen again. Clojure shows people still
want to have their cake and eat it too. Macros in Scala. AST
manipulation in Haskell, as Anton notes.

Current bursts of interest in other languages like Haskell and Scala
seem to offer a promise suited to busy people with less and less time:
do more succinctly and correctly, keep pieces quite small, assemble as
needed. Scalability is de rigueur. People and programs as widgets.

There are so many attributes of languages we like that are shared by
varying degree I think we'll end up with very fine-grained DNA maps of
language features and strengths along with similar maps of programmer
preferences and problem types. Then the God computer (maybe written in
Lisp) will suggest: "Use X for this problem. You'll get it done in
your lifetime, and we'll all be happy."

Samium Gromoff

Mar 31, 2013, 7:10:00 AM
to li...@lispnyc.org
Let's not forget about Lisk and Liskell...

Joe O'Donnell

Mar 31, 2013, 10:58:16 AM
to li...@lispnyc.org
Anton's post was #1 on Hacker News for much of the day yesterday. Lisp and Haskell, right down to their logic-theory natures, do seem suited to two different areas: ill-defined problems and well-defined problems. Lisp is building a statue out of clay as it comes to mind; Haskell is building a statue with an ultra-precise, fast CNC machine. One is plan-as-you-go, the other is plan-then-done.

I can't help but think that Lisp is preferable at least for non-financial, non-engineering problems, in other words most programming problems, because most such problems are not well defined. The client for most projects says "I want X," and X changes daily as the project proceeds. On the other hand, learning Haskell and its coding paradigm now seems like a rightfully required part of compsci.

Are most larger Haskell projects now done by piecing together library components, as mentioned? The 'libraries and teams' approach was a huge factor in the popularity of both Linux (in general) and Java's development platform, and it brings up 'the curse of the Lisp programmer':
https://news.ycombinator.com/item?id=2450973

Although I don't see that as a curse, just a natural tendency that follows from where Lisp's strengths are focused.

Concerning the second rise and fall of Lisp AI in the 80's, what seems to have happened is that expert systems, the most important 80's AI technology, were new enough at the time that the flexible power of Lisp was needed to create and work with them. But after that, three things seem to have happened:

A. Expert systems didn't ultimately provide an automated expert on demand, and the 'public' wrongly perceived that as a failure, though earlier AI PR claims didn't help that situation.

B. There was a general lack of understanding that effective long-term use of expert systems required a certain level and type of training in how to use them.

C. Expert systems and other AI techniques developed by Lisp coders matured to a 'crystallization' point and were ported as libraries to other languages like C, C++, MS C++, Perl, and ultimately Java. These other languages were targeted specifically at Wintel, Sun, and later Linux machines, which out-commoditized Lisp machines in the marketplace, due to 'worse is better.'

I think another reason Lisp isn't as popular as it once was is that in the 80's, a larger percentage of programmers came from elite institutions like MIT, where the cutting edge of programming was constantly and aggressively pursued. When that type of culture predominates, languages such as Lisp are pushed to the front. I think the reason Lisp didn't catch on with the masses once they started picking up computers and programming during the 80's is that to make the most powerful use of Lisp, to even understand why it's desirable to code in a language that looks like math symbolism, one has to understand, and even have experience with, some fairly deep compsci topics: namely the importance and nature of abstraction, the nature of compilation, the effect various language constructs have on compilation, and the what, why, and how of those constructs. That's a fairly tall order, unlikely to be met outside a community of compsci-pursuing programmers.

I have always been surprised that it took until the 90's for a book like On Lisp to come out, and that book is still pretty much one of a kind. The power of Lisp was only starting to be well communicated outside of MIT and a few other places by the 80's, and after that it got swamped by the tremendous noise of the commoditization of computing.

I do think there is a huge future for Lisp, because it seems to be the ideal platform not only for ill-defined programming problems, but as a base to build every other language on. If every language were built on, say, Scheme, then the programmer could mix and match approaches and libraries from all the different language communities. This would be done by decomposing down to Scheme and back up to the other language when needed. Such a system would also be ideal for the AI-assisted style of programming that Geoffrey alluded to. PG has a great article on this general area:
http://paulgraham.com/popular.html

Joe


Joe O'Donnell

Mar 31, 2013, 11:30:03 AM
to li...@lispnyc.org
Just wanted to add: I think the biggest educational thing needed for Lisp is a book of annotated classic Lisp code. On Lisp explained most of the secrets of AST manipulation, PCL explained how to get started working with Lisp in the modern computing environment, SICP explains compsci fundamentals, PAIP explains many good coding practices, and Lisp in Small Pieces explains Lisp compilation, but as far as I know there is no book that explains in detail the breakthrough, classic, and above all real-world working Lisp code of the 40's through the 00's. Call it 'Beautiful, Mostly Real-World Lisp Code.' Even making a list of such code to include would be a huge step forward. Here is a first draft:

40's: lambda calculus, early functional papers
50's: pre-Lisp AI material, early functional papers
early 60's: cons cell machine code, primitives machine code, eval
later 60's: garbage collection, macros, pre-Planner AI material
70's: Planner, SHRDLU, MacLisp collection, the Lambda Papers
80's: what, why, and how of Common Lisp, expert systems, Emacs, AutoCAD, Genera
90's: Common Lisp logistical systems, Viaweb
00's: Reddit, ITA, Clojure

Joe

ps, Fowler recently wrote a book on DSLs, which shows the need and market for a classic Lisp code book:

http://martinfowler.com/books/dsl.html



Anton van Straaten

Mar 31, 2013, 1:00:42 PM
to li...@lispnyc.org
On 03/31/2013 05:43 AM, Geoffrey S. Knauth wrote:
> On Sat, Mar 30, 2013, at 22:29, Anton van Straaten wrote:
>> untyped/typed is just one dimension in which Haskell and
>> Lisp were extended in different and even opposite directions.
>
> Maybe there is another axis: discipline. Haskell imposes some strict
> rules and offers certain guarantees. Lisp is freer about letting you
> create your own rules.

True, I was coming at it from a reductionist perspective: given strong
static types and immutability, discipline is an unavoidable consequence.

Which brings up a version of the Euthyphro dilemma for programming
languages: does Haskell have types because it wants discipline, or does
it have discipline because it wants types? Of course in this case it's
not such a dilemma, and the answer is "yes".

Although it's worth noting that if you want to "create your own rules",
Haskell itself doesn't stop you. For example, people have embedded
Javascript in Haskell in various ways - JMacro and HJScript are two
libraries that do this - and here's BASIC in Haskell:

http://augustss.blogspot.com/2009/02/more-basic-not-that-anybody-should-care.html

I think the case for Lisp is not so much that it's "freer about letting
you" do any specific thing, including creating your own rules - rather,
it's freer in general from language-imposed discipline, and all that
really matters are the things that are absolutely essential to the
functioning of your program.

As Danvy put it for (R4RS) Scheme:

http://www.daimi.au.dk/~danvy/scheme.html

Anton

Anton van Straaten

Mar 31, 2013, 1:27:50 PM
to li...@lispnyc.org
On 03/31/2013 10:58 AM, Joe O'Donnell wrote:
> Anton's post was #1 on Hacker News for much of the day yesterday.

Thanks!

> Are most larger Haskell projects now done by piecing together
> library components as mentioned?

I can't speak to "most larger Haskell projects", but the list of
libraries available in the Hackage repository is definitely a big factor
in making Haskell useful:

http://hackage.haskell.org/packages/archive/pkg-list.html

Without that, it would be a struggle to do anything useful with the
language, since by comparison to any commercial language it is not
"batteries included".

This is also aided by the fact that practically speaking, there's only
one implementation of Haskell, i.e. GHC. There are others, but most of
them are experimental and none of them are nearly as widely used.

> If every language was built on say Scheme, than the programmer could
> mix and match approaches and libraries from all the different
> language communities. This would be done be decomposing down to
> Scheme and back up to the other language when needed.

Various academic efforts have been made along these lines, such as C-- :

http://www.cminusminus.org/

That page points out some of the problems with using a language like
Scheme for this purpose: "C-- has a machine-level type system, so you
don't have to shoehorn your favorite high-level language into a
high-level data model that doesn't fit; C-- provides a run-time
interface, so you can implement garbage collection and exception
handling using the techniques that are best suited to your language."

In general, anything higher-level than a low-level abstract machine
involves semantic and implementation choices which tend to introduce
mismatches with languages built on top of it. So languages that don't
resemble the underlying core tend to end up as second-class citizens in
some sense.

Of course this isn't always a huge problem - Clojure and Scala manage,
with clever engineering and various compromises, to run on the JVM,
which wasn't even designed as a cross-language machine. But still, the
issues involved are a factor in industry agreement on choice of runtime.

In support of the Scheme point, another increasingly popular
cross-language target is Javascript, but that's because it's the only
choice when compiling non-native code to run in a browser.

One of the most successful cross-language virtual machines at the
moment, aside from .NET and the JVM, is LLVM - and the way it succeeds
is by being lower-level than something like Scheme.

Anton

John Cowan

Mar 31, 2013, 1:39:48 PM
to Anton van Straaten, li...@lispnyc.org
Anton van Straaten scripsit:

> Which brings up a version of the Euthyphro dilemma for programming
> languages: does Haskell have types because it wants discipline, or
> does it have discipline because it wants types? Of course in this
> case it's not such a dilemma, and the answer is "yes".

I am convinced that Haskell wants discipline above all. It has
laziness, for example, not because laziness is required for
immutability, but in order to make 100% sure of immutability.

> Although it's worth noting that if you want to "create your own
> rules", Haskell itself doesn't stop you.

Well, sure. It's Turing-complete, it can implement a compiler or
interpreter for anything. But that's not the same as truly *embedding*
languages into Haskell; to do that, they have to play by Haskell's
rules.

--
You annoy me, Rattray! You disgust me! John Cowan
You irritate me unspeakably! Thank Heaven, co...@ccil.org
I am a man of equable temper, or I should http://www.ccil.org/~cowan
scarcely be able to contain myself before
your mocking visage. --Stalky imitating Macrea

Anton van Straaten

Mar 31, 2013, 2:17:08 PM
to li...@lispnyc.org
On 03/31/2013 01:39 PM, John Cowan wrote:

> Anton van Straaten scripsit:
>
>> Although it's worth noting that if you want to "create your own
>> rules", Haskell itself doesn't stop you.
>
> Well, sure. It's Turing-complete, it can implement a compiler or
> interpreter for anything.

That's not what I was referring to. I referred to examples of
Javascript and BASIC embedded in Haskell. BTW, I already gave the BASIC
link; here's a link for the excellent Javascript embedding that was
implemented by a LispNYC member:
http://www.haskell.org/haskellwiki/Jmacro

These are not standalone compilers for those languages, they're
embeddings that can be intermixed with Haskell source, and compiled by GHC.

> But that's not the same as truly *embedding*
> languages into Haskell; to do that, they have to play by Haskell's
> rules.

What difference do you see in Haskell compared to Lisp or Scheme?

In both cases, to embed a language without relying on macros, you have
to play by the language's rules. This limits the degree to which an
embedding can depart from the semantics of the host language.
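A toy illustration of the no-macros case: a deep embedding of a tiny arithmetic language that piggybacks on the host's Num class so that embedded terms can be written with ordinary Haskell syntax (all names here are mine, for illustration):

```haskell
-- AST of the tiny embedded language
data Expr = Lit Integer
          | Add Expr Expr
          | Mul Expr Expr
  deriving Show

-- The Num instance makes 2 + 3 * 4 build an Expr, not an Integer.
instance Num Expr where
  fromInteger = Lit
  (+) = Add
  (*) = Mul
  abs    = error "abs not supported in this toy language"
  signum = error "signum not supported in this toy language"
  negate = error "negate not supported in this toy language"

-- An interpreter for the embedded language
eval :: Expr -> Integer
eval (Lit n)   = n
eval (Add a b) = eval a + eval b
eval (Mul a b) = eval a * eval b
```

`eval (2 + 3 * 4 :: Expr)` gives 14, but only because Haskell's own precedence and typeclass machinery are doing the parsing, which is exactly the sense in which such an embedding plays by the host's rules.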

Introducing macros into the picture (in the Haskell case, via Template
Haskell), allows transformations from some source language into the host
language. This certainly qualifies as "truly embedding" some other
language into the host language.

Anton

John Cowan

Mar 31, 2013, 3:02:58 PM
to Anton van Straaten, li...@lispnyc.org
Anton van Straaten scripsit:

> Introducing macros into the picture (in the Haskell case, via
> Template Haskell), allows transformations from some source language
> into the host language. This certainly qualifies as "truly
> embedding" some other language into the host language.

I agree, provided that Template Haskell counts as Haskell, but I don't
think it does. It is a more powerful language on top of Haskell.

--
Her he asked if O'Hare Doctor tidings sent from far John Cowan
coast and she with grameful sigh him answered that http://ccil.org/~cowan
O'Hare Doctor in heaven was. Sad was the man that word co...@ccil.org
to hear that him so heavied in bowels ruthful. All
she there told him, ruing death for friend so young, James Joyce, Ulysses
algate sore unwilling God's rightwiseness to withsay. "Oxen of the Sun"

Geoffrey S. Knauth

Mar 31, 2013, 3:41:46 PM
to li...@lispnyc.org
On Mar 31, 2013, at 15:02, John Cowan <co...@mercury.ccil.org> wrote in answer to Anton van Straaten:

> I agree, provided that Template Haskell counts as Haskell, but I don't
> think it does. It is a more powerful language on top of Haskell.

Template Haskell was implemented in GHC. Was the GHC extension written in Haskell? If yes, wouldn't Haskell be just as powerful? If not, say if it was written in C, it would be ironic for Haskell to gain superpowers by dropping to C.

I just downloaded and quickly examined the source to GHC, and it looks to me as though Template Haskell was implemented in Haskell.

Anton van Straaten

Mar 31, 2013, 3:43:30 PM
to John Cowan, li...@lispnyc.org
On 03/31/2013 03:02 PM, John Cowan wrote:
> Anton van Straaten scripsit:
>
>> Introducing macros into the picture (in the Haskell case, via
>> Template Haskell), allows transformations from some source language
>> into the host language. This certainly qualifies as "truly
>> embedding" some other language into the host language.
>
> I agree, provided that Template Haskell counts as Haskell, but I don't
> think it does. It is a more powerful language on top of Haskell.

That modular separation could be considered an advantage. You could
just as well argue that Scheme macros don't count as Scheme - they are
distinct languages on top of Scheme (or underneath, depending.)

If we're restricting the discussion to traditional Lisp defmacros, then
one can make a better case that it's all the same language, but even
then, what is significant about this distinction - what are the
consequences that matter?

To me, the point is that you can, and people do, take the primary
Haskell implementation, implement embedded languages with it, and
release those embeddings as libraries. The capability is there,
built-in, and people actually use it. What is unsatisfactory about this?

The main advantage Lisp/Scheme seems to me to have here has to do with
homoiconicity - you don't have to learn an independent AST
representation of the target language in order to write macros.
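To make that cost concrete, here's the same expression in Template Haskell, first via a quotation and then built directly against the AST data types the quotation stands for (a sketch; assumes GHC with TemplateHaskell and the bundled template-haskell library):

```haskell
{-# LANGUAGE TemplateHaskell #-}
import Language.Haskell.TH

-- Written as a quotation, the spliced code looks like ordinary Haskell:
viaQuote :: Int
viaQuote = $( [| 1 + 2 |] )

-- Written against the AST directly, you have to know the independent
-- representation (InfixE, LitE, VarE, ...) for the same expression:
viaAST :: Int
viaAST = $( return (InfixE (Just (LitE (IntegerL 1)))
                           (VarE '(+))
                           (Just (LitE (IntegerL 2)))) )
```

In Lisp the second form would just be a quoted list that looks like the first, which is the notational advantage homoiconicity buys.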

Anton

Anton van Straaten

Mar 31, 2013, 3:59:46 PM
to li...@lispnyc.org
Yes, it's implemented in Haskell. It is essentially a GHC compiler
extension, but the compiler is mostly implemented in Haskell, too.

The summary at the Haskell wiki is pretty good:

http://www.haskell.org/haskellwiki/Template_Haskell

"Template Haskell is an extension to Haskell 98 that allows you to do
type-safe compile-time meta-programming, with Haskell both as the
manipulating language and the language being manipulated.

"Intuitively Template Haskell provides new language features that allow
us to convert back and forth between concrete syntax, i.e. what you
would type when you write normal Haskell code, and abstract syntax trees.

"These abstract syntax trees are represented using Haskell datatypes
and, at compile time, they can be manipulated by Haskell code. This
allows you to reify (convert from concrete syntax to an abstract syntax
tree) some code, transform it and splice it back in (convert back
again), or even to produce completely new code and splice that in, while
the compiler is compiling your module."

Anton

Vassil Nikolov

Apr 1, 2013, 1:35:11 AM
to li...@lispnyc.org

On Sat, 30 Mar 2013 01:53:52 -0400, Anton van Straaten <an...@appsolutions.com> said:
> ...
> Below is an M-expression-like Haskell version. Forget about trolling
> Java One as Ray suggested, Lisp fans can now claim that McCarthy
> invented Haskell, and wrote the first Haskell programs, in 1960.

Or, I would say that the M-language, the language of M-expressions,
was the first algorithmic language [*]. Furthermore, I would say
that among the languages extant today, Haskell is closest, or
certainly one of the top closest, to an algorithmic language---and
this excellent example illustrates that.

_________
[*] I cannot claim I am the first one to point this out, but I would
appreciate a reference to an earlier statement in this regard.

---Vassil.

2013-03-31.


--
Vassil Nikolov | Васил Николов | <vnik...@pobox.com>

"Be careful how you fix what you don't understand." (Brooks 2010, 185)

Vassil Nikolov

Apr 1, 2013, 1:38:49 AM
to li...@lispnyc.org

On Sat, 30 Mar 2013 11:02:45 -0700 (PDT), Joe O'Donnell <gal...@yahoo.com> said:

> ...
> "A language should be designed in terms of an abstract syntax and it
> should have perhaps, several forms of concrete syntax: one which is
> easy to write and maybe quite abbreviated; another which is good to
> look at and maybe quite fancy... and another, which is easy to make
> computers manipulate... all should be based on the same abstract
> syntax... the abstract syntax is what the theoreticians will use and
> one or more of the concrete syntaxes is what the practitioners will
> use. John McCarthy, creator of Lisp http://www.dwheeler.com/readable/

Algol had that. Fortress has something like that. (Neither is too
surprising, of course.)

Vassil Nikolov

Apr 1, 2013, 1:53:01 AM
to li...@lispnyc.org

On Sat, 30 Mar 2013 14:34:18 -0400, "Geoffrey S. Knauth" <ge...@knauth.org> said:

> ...
> William R. Cook, who is writing a new textbook, Anatomy of Programming Languages.

The title evokes Allen's (great) book; would that be intentional?

Dan Cross

Apr 1, 2013, 6:44:20 AM
to li...@lispnyc.org

Yes. He says as much in the introductory chapter.

Akhil Wali

Apr 1, 2013, 10:17:58 AM
to li...@lispnyc.org
Great post about M-exprs!

> PG has a great article on this general area: http://paulgraham.com/popular.html

I actually re-read it, and so much of the dream language that PG talks about sounds like Clojure:
a powerful and simple language core, easily moldable, excellent multi-threading/multi-core support, and great libraries (mostly Clojure's own, and to a lesser extent Java's).

Thoughts?