At the 1982 LISP and Functional Programming Conference I asked
Alonzo Church about the origin of the lambda symbol. What I found out
is briefly summarized in a footnote on p. 357 of my book
"Functional Programming: Practice and Theory" (Addison-Wesley,
1990). Since Church never confirmed the story in writing, I
thought it was inappropriate to attribute it to him in my book.
Nevertheless, here is the history of the lambda symbol, based
on the notes I wrote down after my conversation.
Church said that the starting point was Russell and Whitehead's
abstraction operator (in Principia Mathematica), which they wrote
with a caret over the bound variable: $\hat{x}(x^2+1)$. To facilitate
mechanical manipulation of the bound variables, he began to write
the caret in front of the bound variable (because, I presume, this
made the abstraction a string rather than a two-dimensional structure):
$\hat{}x(x^2+1)$. From there the caret symbol evolved into an
uppercase lambda, $\Lambda x(x^2+1)$, and finally a lowercase lambda,
$\lambda x(x^2+1)$. I presume the latter stages of this evolution
were driven by convenience of writing and by the need to avoid
confusion with other symbols (such as the and-sign).
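As an illustrative aside (not part of the original correspondence): the
abstraction notation described above survives almost unchanged in modern
programming languages. A minimal sketch in Python, where the keyword is
literally named after Church's symbol, writing $\lambda x(x^2+1)$ as:

```python
# Church's lambda x (x^2 + 1), written as a Python anonymous function.
# The bound variable x plays exactly the role of the careted variable
# in Russell and Whitehead's \hat{x}(x^2 + 1).
f = lambda x: x**2 + 1

print(f(3))  # applying the abstraction to 3 gives 3^2 + 1 = 10
```

The same abstraction appears as `\x -> x^2 + 1` in Haskell and as
`(lambda (x) (+ (* x x) 1))` in LISP, the language whose conference
occasioned the conversation above.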
Bruce MacLennan
Department of Computer Science
107 Ayres Hall
The University of Tennessee
Knoxville, TN 37996-1301
> Date: 27 Feb 91 16:53:22 GMT
> Followup-To: why lambda ? (Daniel de Rauglaudre)
..........
> At the 1982 LISP and Functional Programming Conference I asked
> Alonzo Church about the origin of the lambda symbol.
..........
> Church said that the starting point was Russell and Whitehead's
> abstraction operator (in Principia Mathematica), which they wrote
> with a caret over the bound variable: $\hat{x}(x^2+1)$.
..........
It is perhaps worth noting that the first person (as far as I know)
to use an operator that binds a variable as an abstraction operator
was Gottlob Frege, who also invented the existential and universal
quantifiers, though he used a cumbersome 2-D notation for
implication.
Russell learnt about Frege's notation as a result of reviewing his
work (I think it was the first volume of Frege's "The Basic Laws of
Arithmetic" (Grundgesetze der Arithmetik)), the first full-blown
attempt to show
(a) that all concepts of arithmetic can be defined solely in terms
of purely logical concepts, and
(b) that all truths of arithmetic can be proved solely on the
basis of truths of logic.
Although Russell found that Frege's system was inconsistent (because
it allowed the formulation of Russell's paradox, concerning the set
of all sets that are not members of themselves), he continued to use
many of the ideas, though with a rather different notation.
I believe computer science owes a great deal to Frege's pioneering
work, including the generalisation of the notion of a function to
include predicates and higher order functions, and the first proper
analysis of variables.
Aaron Sloman,
School of Cognitive and Computing Sciences,
Univ of Sussex, Brighton, BN1 9QH, England
EMAIL aar...@cogs.sussex.ac.uk