Who were the opponents of Lisp?


Kazimir Majorinc

Mar 25, 2010, 3:15:42 AM
Informally, many people criticize Lisp. Some of the best
programmers I know dismissed it easily because
"they do not like parentheses." But which programming
celebrities or theoreticians were enemies of Lisp?
Is there any influential anti-Lisp book or article,
scientific or not?

I know of Dijkstra - he harshly criticized Lisp in a
few manuscripts he distributed, and partly - but
only partly - rehabilitated it in one article someone
recently posted somewhere (on Reddit?). Did he
write any formal or more detailed text about that?
Anyone else?

(I am asking about Lisp generally, not about particular Lisp dialects.)

--
Kazimir Majorinc
http://kazimirmajorinc.blogspot.com

Tamas K Papp

Mar 25, 2010, 4:28:32 AM

It is well known that speech therapists are the greatest enemies of lisp.

Cheers,

Tamas

Waldek Hebisch

Mar 25, 2010, 6:24:30 AM
Kazimir Majorinc <em...@false.false> wrote:
> Informally, many people criticize Lisp. Some of the best
> programmers I know dismissed it easily because
> "they do not like parentheses." But which programming
> celebrities or theoreticians were enemies of Lisp?
> Is there any influential anti-Lisp book or article,
> scientific or not?
>

Look up "Typeful programming" by Luca Cardelli. At the
beginning (in a few sentences) he criticizes dynamically
typed languages, with Lisp being the prominent example.
Note, however, that the critique is not Cardelli's goal
and is only a tiny part of the article. His main goal
is to promote advanced type systems.

--
Waldek Hebisch
heb...@math.uni.wroc.pl

Mark Tarver

Mar 25, 2010, 5:39:02 PM

David Turner for one. There is a famous quote from him.

'It needs to be said very firmly that LISP, at least as represented by
the dialects in common use, is not a functional language at
all. ....... My suspicion is that the success of Lisp set back the
development of a properly functional style of programming by at least
ten years.'

This was from 1984, but the general feeling amongst the British
functional community was strongly anti-Lisp at that time. During my
spell at the LFCS from 1988-1990, Lisp was barely regarded. It
certainly influenced the teaching of FP at university which emphasised
ML (now more Haskell) with inevitable effects on the Lisp job
market. Some of the consequences of that educational judgement and
why it was made are laid out in http://www.lambdassociates.org/blog/l21.htm
(Windows only). Qi was a way of bridging the gulf.

Mark

Mark Tarver

Mar 26, 2010, 7:51:45 AM
Some comments about Turner et al.

1. A lot of the assertions made for and against Lisp masqueraded, and
still masquerade, as purely scientific assertions, delivered from the
top of the mountain. However if you examine these assertions they
actually contain strong value judgements. It's generally important to
see them for what they are and isolate them.

2. One of these was the attack on the procedural elements of Lisp -
this was Turner's complaint. From an educational point of view -
teaching functional programming - having a language that allows escape
into C-like idioms is not a help. The other weakness is that formal
verification of programs becomes more difficult.

The British theoreticians were much influenced by maths and logic and
felt these aspects were very important. It is also true that these
procedural elements are very useful on occasion to *mature
programmers*. Hence there was a value gap between the experienced
Lispers and the theoreticians about what was important and why.

3. Generally the Lisp community did not answer these challenges at
the time they were aired. I think partly because people like Milner/
Turner etc. are very clever and therefore intimidating. Also because
the best Lisp people were focussed elsewhere (Symbolics). This meant
that they ceded intellectual territory at university level, and the
effects took over a decade to come through.

4. Part of the reason for Turner's attack was that Lisp style prior
to SICP could be pretty awful. Few people knew how to write decent
abstract code and that, and the poor performance of machines, meant
that destructive operations abounded in Lisp code. A sample from
Chang & Lee (early 70s) of a resolution ATP is quite horrendous code.
Turner's Miranda was beautiful, but impractically slow.
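The contrast in point 2 - procedural escape hatches versus a functional style - can be made concrete. A hypothetical Common Lisp sketch (mine, not Turner's or Tarver's) of the destructive idiom against the functional one:

```lisp
;; Hypothetical illustration of the two styles.
;; NREVERSE is the destructive, "C-like" idiom: it may reuse the cons
;; cells of its argument, clobbering the original list.
(defparameter *xs* (list 1 2 3))
(defparameter *ys* (nreverse *xs*))   ; *xs* is now unreliable

;; REVERSE is the functional idiom: it allocates a fresh list, so the
;; original value still holds - which is what makes such code easier
;; to reason about and to verify formally.
(defparameter *as* (list 1 2 3))
(defparameter *bs* (reverse *as*))    ; *as* is still (1 2 3)
```

The trade-off is exactly the one discussed above: on the slow machines of the day the destructive version's avoided allocation mattered, at the price of reasoning about aliasing.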

Mark

Alex Mizrahi

Mar 26, 2010, 8:07:45 AM
KM> Informally, many people criticize Lisp. Some of the best
KM> programmers I know dismissed it easily because
KM> "they do not like parentheses." But which programming
KM> celebrities or theoreticians were enemies of Lisp?

Does Stephen Wolfram count?

http://groups.google.com.ua/group/comp.lang.lisp/msg/f3b93140c2f2e922?hl=en&dmode=source&output=gplain

From: Kent M Pitman <pit...@world.std.com>

----
I'm not sure this is precisely the forum in which to log this fact,
but since Fateman is telling historical stories I wanted to add one.
I was in Pasadena at one point, visiting a friend at Caltech, and
popped in to see Wolfram around the time he was gearing up to write
SMP, I think. If I recall, he was 19 at the time. People around me
informed me that though he was very young, or maybe because of it, he
was on track to win a nobel prize of some sort. I myself worked for
the MIT Macsyma group at the time as an undergrad, perhaps my first
senior year, so I think I must have been a year or two older than him.

He told me that Lisp was "inherently" (I'm pretty sure even after all
this time that this was his exact word) 100 times slower than C and
therefore an unsuitable vehicle. I tried to explain to him that this
was implausible, that he could probably construct an argument for 2-5
that he could at least defend in some prima facie way, but that 100
was ridiculous. (This was in the heyday of Maclisp, when it had been
shown to trump Fortran's speed, so probably even 2-5 could be refuted,
but at least taking a position in that range would have left him with
some defenses in a debate.) He didn't cite anything credible that I
recall to back up this factor-of-100 problem.

I tried to explain why, and it was not clear to me why a person smart enough to
"maybe win a nobel prize" couldn't entertain a discussion on the
simple set of concepts involved, whether or not schooled in
computation. It was quite frustrating and he seemed impatient.
----

Tim Bradshaw

Mar 26, 2010, 12:32:49 PM
On 2010-03-26 12:07:45 +0000, Alex Mizrahi said:

> Stephen Wolfram counts?

He may have had some mileage a long time ago, but I can't imagine
anyone who's used Mathematica would pay much attention now: it's not
really an example of good language design.

(In fact, based on the documentation, and although I find it a fine
system for my current domestic use, I probably would not consider it
for serious use as the possibility that I might have to deal with
someone whose head is that big would put me off.)

His kennyness

Mar 26, 2010, 3:44:37 PM
Kazimir Majorinc wrote:
> Informally, many people criticize Lisp. Some of the best
> programmers I know dismissed it easily because
> "they do not like parentheses." But which programming
> celebrities or theoreticians were enemies of Lisp?
> Is there any influential anti-Lisp book or article,
> scientific or not?
>
> I know of Dijkstra - he harshly criticized Lisp in a
> few manuscripts he distributed, and partly - but
> only partly - rehabilitated it in one article someone
> recently posted somewhere (on Reddit?). Did he
> write any formal or more detailed text about that?
> Anyone else?

Knuth.

kt

His kennyness

Mar 26, 2010, 4:20:02 PM
His kennyness wrote:
> Kazimir Majorinc wrote:
>> Informally, many people criticize Lisp. Some of the best
>> programmers I know dismissed it easily because
>> "they do not like parentheses." But which programming
>> celebrities or theoreticians were enemies of Lisp?
>> Is there any influential anti-Lisp book or article,
>> scientific or not?
>>
>> I know of Dijkstra - he harshly criticized Lisp in a
>> few manuscripts he distributed, and partly - but
>> only partly - rehabilitated it in one article someone
>> recently posted somewhere (on Reddit?). Did he
>> write any formal or more detailed text about that?
>> Anyone else?
>
> Knuth.

More recently, Steele and Norvig.

kt

Francogrex

Mar 26, 2010, 4:30:28 PM
On Mar 26, 9:20 pm, His kennyness <kentil...@gmail.com> wrote:
>
> More recently, Steele and Norvig.

Show good evidence to back up your claim.

David Formosa (aka ? the Platypus)

Mar 26, 2010, 9:50:02 PM
On Thu, 25 Mar 2010 08:15:42 +0100, Kazimir Majorinc
<em...@false.false> wrote:
> Informally, many people criticize Lisp. Some of the best
> programmers I know dismissed it easily because
> "they do not like parentheses." But which programming
> celebrities or theoreticians were enemies of Lisp?
> Is there any influential anti-Lisp book or article,
> scientific or not?

The most prominent anti-Lisp article would be "Worse is Better", but
that is not so much an anti-Lisp article as an explanation of why Unix
and C became dominant in the marketplace over Lisp running on Lisp
machines.

His kennyness

Mar 27, 2010, 7:22:02 AM

How about wild and crazy evidence I don't give a rat's ass what you
think about?

Steele contributed hugely to an anti-Lisp called Java, and contributed
mostly the anti-Lispiness. He should have contributed Groovy/Scala.

Norvig rather famously pronounced Python to be as good as Lisp.

Two men who should be advocates for Lisp but are not, because their
masters use something else.

Knuth said any idiot can implement a linked list, so there is no need
for Lisp. Look it up.

D Herring

Mar 27, 2010, 12:16:27 PM
On 03/27/2010 07:22 AM, His kennyness wrote:

> Knuth said any idiot can implement a linked list so there is no need for
> Lisp. Look it up.

When it comes to algorithms and other low-level details, Knuth knows
where it's at. After looking at TeX and other programs he wrote, does
anyone care what he says about style or other large-scale issues?

- Daniel

His kennyness

Mar 27, 2010, 1:06:13 PM
D Herring wrote:
> On 03/27/2010 07:22 AM, His kennyness wrote:
>
>> Knuth said any idiot can implement a linked list so there is no need for
>> Lisp. Look it up.
>
> When it comes to algorithms and other low-level details, Knuth knows
> where it's at.

Which suggests those skills have nothing to do with understanding what
makes languages more or less powerful.

> After looking at TeX and other programs he wrote, does
> anyone care what he says about style or other large-scale issues?

It took him ten years just to lay out type, TeX is notoriously hard to
mark up, and there is no wysiwyg GUI. Hard to be impressed.

His other contribution was literate programming, one of the worst and
least successful programming ideas ever, but a natural result of his
rejection of powerful languages. The guy writes in assembler; no wonder
he needs something he can read.

kt

Tim Bradshaw

Mar 27, 2010, 1:41:46 PM
On 2010-03-27 17:06:13 +0000, His kennyness said:

> It took him ten years just to lay out type

To be fair, I think he basically had to discover how typography works,
including mathematical typography, invent algorithms that could do a
decent job of it, implement them and (not least) design a typeface (OK,
a horrible typeface, but I think he was constrained by the typeface his
publisher had already used) and write a program in which to implement
that design.

That's actually pretty good going for ten years.

> TeX is notoriously hard to mark up,

TeX is indeed a horrible language

> and there is no wysiwyg GUI.

People who can type generally don't want such. Though TeX is horrible,
it is still a pretty good way of creating printed maths - certainly far
better than anything else I've seen.

(None of this should be taken as implying I think Knuth has anything
interesting to say about programming languages: I don't.)

His kennyness

Mar 27, 2010, 11:04:40 PM
Tim Bradshaw wrote:
> On 2010-03-27 17:06:13 +0000, His kennyness said:
>
>> It took him ten years just to lay out type
>
> To be fair, I think he basically had to discover how typography works,

Right, and I am supposed to pay for his self-education? He set out on a
programming task and got distracted into a typesetting task. Clear sign
of a weak mind. He is a computer science professor and his contribution
to computer science was to explain that Lisp was a mistake. QED/RIP.

> including mathematical typography, invent algorithms that could do a
> decent job of it, implement them and (not least) design a typeface (OK,

> a horrible typeface,..

He giveth with one hand and taketh with the other.

> but I think he was constrained by the typeface his
> publisher had already used) and write a program in which to implement
> that design.
>
> That's actually pretty good going for ten years.

Sure, if one is fundamentally a useless academic.

>
>> TeX is notoriously hard to mark up,
>
> TeX is indeed a horrible language
>
>> and there is no wysiwyg GUI.
>
> People who can type generally don't want such.

Never ever having had one, they are not in a position not to want one.
Given a horrible language, wysiwyg is a necessary language tutor. I am
sorry I had to explain that. But wysiwyg was beyond Knuth; he's...an
academic. They know nothing about teaching (see "Costanza, P.", ca. 2007).

> Though TeX is horrible,

You could not stop there?

> ..it is still a pretty good way of creating printed maths - certainly far

> better than anything else I've seen.

And in ten years I could implement cold fusion. Not seeing the win here.

kt

Tim Bradshaw

Mar 28, 2010, 6:06:54 PM
On 2010-03-28 04:04:40 +0100, His kennyness said:
>
> Right, and I am supposed to pay for his self-education? He set out on a
> programming task and got distracted into a typesetting task.

And in the process invented a tool which has enormously improved
the ability of people who use maths in their work to communicate. Yes,
you are meant to pay for that.

> Clear sign of a weak mind. He is a computer science professor and his
> contribution to computer science was to explain that Lisp was a mistake.

No.

>>
>> People who can type generally don't want such. [GUIs]


>
> Never ever having had one, they are not in a position not to want one.

They have many, of course, and yet somehow they still persist in
typing. I have at least two good ones to hand, and I still type stuff
in TeX when I need to type maths.

>
> And in ten years I could implement cold fusion. Not seeing the win here.

Good, go and do so! You will both make yourself extremely rich, and
solve the world's energy problems at a stroke. Hot fusion would do, in
fact.

His kennyness

Mar 28, 2010, 9:14:15 PM
Tim Bradshaw wrote:
> On 2010-03-28 04:04:40 +0100, His kennyness said:
>>
>> Right, and I am supposed to pay for his self-education? He set out on
>> a programming task and got distracted into a typesetting task.
>
> And in the process invented a tool which has enormously improved
> the ability of people who use maths in their work to communicate. Yes,
> you are meant to pay for that.

Hey, at least it is an honest application - something I always whine
about academics needing to do more of.

>
>> Clear sign of a weak mind. He is a computer science professor and his
>> contribution to computer science was to explain that Lisp was a mistake.
>
> No.
>
>>>
>>> People who can type generally don't want such. [GUIs]
>>
>> Never ever having had one, they are not in a position not to want one.
>
> They have many, of course, and yet somehow they still persist in
> typing. I have at least two good ones to hand, and I still type stuff
> in TeX when I need to type maths.

Actually, text is good. I certainly would not want a visual programming
language. But I doubt you have experienced a serious math editor unless
you have played with this:

http://www.stuckonalgebra.com/index.html

>
>>
>> And in ten years I could implement cold fusion. Not seeing the win here.
>
> Good, go and do so! You will both make yourself extremely rich, and
> solve the world's energy problems at a stroke. Hot fusion would do, in
> fact.
>

Funny you should mention that. I googled the Princeton reactor recently
to see what they were up to. Plug pulled, 1997. I wonder how warm
superconductivity is doing.

As for solving the world's energy problem, I think having limitless
cheap environmentally clean energy would be the worst thing that could
happen to the planet: imagine what we would do to it then.

hk

Tim Bradshaw

Mar 29, 2010, 6:09:18 AM
On 2010-03-29 02:14:15 +0100, His kennyness said:

>
> Actually, text is good. I certainly would not want a visual programming
> language. But I doubt you have experienced a serious math editor unless
> you have played with this:
>
> http://www.stuckonalgebra.com/index.html

I'd try it, but there seems not to be a mac one yet. Don't get me
wrong - I don't think TeX is the ultimate way of typing in maths, I
just think it can be fairly good. A lot of other systems seem to do
this pull-things-from-a-palette thing which is just abysmal as it means
you can't type things in. MathML also seems to have catastrophically
missed the point as it's so verbose: you can type it but who would want
to?

Of course TeX fails to capture the semantics of the expression at all,
but I think that if you try and do that you likely end up with
something you can't type quickly. And that does matter to people who
are fluent with maths and typing - you want something which is not
enormously slower than handwriting stuff.

(Note none of this means I think TeX is a good programming language: I
don't, it's a shit programming language.)

Tamas K Papp

Mar 29, 2010, 8:02:54 AM
On Mon, 29 Mar 2010 11:09:18 +0100, Tim Bradshaw wrote:

> Of course TeX fails to capture the semantics of the expression at all,
> but I think that if you try and do that you likely end up with something
> you can't type quickly. And that does matter to people who are fluent
> with maths and typing - you want something which is not enormously
> slower than handwriting stuff.

I am not sure that I would want to capture semantics when I am typing
math; it would usually add a lot of overhead. A lot of symbols/conventions
are used for various different things, depending on the context.

> (Note none of this means I think TeX is a good programming language: I
> don't, it's a shit programming language.)

Although you can program in TeX, very few people think of TeX as a
programming language. Its main focus is not programming but typesetting
mathematical text, and that it does rather well (there are no serious
competitors in its market). The "programming" one can do in TeX is
just there for extending the language, and most TeX users don't have to
do any of that.

Even though I am sometimes frustrated with TeX, I admire it as an
achievement. Created decades ago, it is still the most used
typesetting environment for texts that use math. People who dislike
TeX have a hard time convincing others, because TeX gets the job
done, so it is good enough.

Tamas

e...@infometrics.nl

Mar 29, 2010, 10:04:02 AM
From a discussion I had years ago with one of Dijkstra's pupils, I was
clearly told that, although Dijkstra was not a Lisper himself,
he had a high opinion of Lisp. A well-known quote by Dijkstra
about Lisp (CACM, 15:10) makes this clear:

"Lisp has jokingly been called 'the most
intelligent way to misuse a computer'.
I think that description is a great
compliment because it transmits the full
flavor of liberation: it has assisted a
number of our most gifted fellow humans
in thinking previously impossible
thoughts."

Having known Dijkstra from seminars, yes, he could be very critical.
Even though his criticisms could shock some, he usually had a point,
IMHO. Same for Lisp: it may not be ideal, but do we have anything
better? Look again at the quote above...

Ernst

Tim Bradshaw

Mar 29, 2010, 10:41:32 AM
On 2010-03-29 13:02:54 +0100, Tamas K Papp said:

> I am not sure that I would want to capture semantics when I am typing
> math; it would usually add a lot of overhead. A lot of symbols/conventions
> are used for various different things, depending on the context.

I agree with this. I also suspect that for most people who actually do
a lot of maths in their work, the semantics is often pretty vague. I
think people often imagine everything being enormously precise, but my
experience was that there is a fair amount of hand-waving even among
proper mathematicians (I was a physicist, so obviously I waved my hands
really a lot).

>
> Although you can program in TeX, very few people think of TeX as a
> programming language. Its main focus is not programming but typesetting
> mathematical text, and that it does rather well (there are no serious
> competitors in its market). The "programming" one can do in TeX is
> just there for extending the language, and most TeX users don't have to
> do any of that.

What I specifically meant is that TeX is, I think, unduly abominable if
you need to write non-trivial macros. You can do it, of course, and
people do write very complex packages (witness LaTeX and even plain
TeX), but it just should not need to be that difficult and unpleasant.


>
> Even though I am sometimes frustrated with TeX, I admire it as an
> achievement. Created decades ago, it is still the most used
> typesetting environment for texts that use math. People who dislike
> TeX have a hard time convincing others, because TeX gets the job
> done, so it is good enough.

Yes. I would still use it if I needed to type maths.


Scott Burson

Mar 31, 2010, 1:19:43 AM
On Mar 25, 2:39 pm, Mark Tarver <dr.mtar...@ukonline.co.uk> wrote:
>
> David Turner for one.  There is a famous quote from him.
>
> 'It needs to be said very firmly that LISP, at least as represented by
> the dialects in common use, is not a functional language at
> all.  ....... My suspicion is that the success of Lisp set back the
> development of a properly functional style of programming by at least
> ten years.'

I can't speak to that directly, but I do recall that in 1980, when I
was trying to adopt a more functional style in ZetaLisp, I had to
invent some tools to make it easier. I wasn't aware of the work going
on in functional languages, so what I did was somewhat idiosyncratic,
but the point is that I don't recall anyone else around the MIT AI lab
trying to write as much functional code as I was (though I never tried
to write only functionally). I think Lisp always held the potential
for writing at least a fair amount of functional code, but it required
a certain determination and willingness to go against the prevailing
culture to realize that potential.

-- Scott

Nicolas Neuss

Mar 31, 2010, 3:55:38 AM
Tim Bradshaw <t...@tfeb.org> writes:

> Of course TeX fails to capture the semantics of the expression at all, but
> I think that if you try and do that you likely end up with something you
> can't type quickly. And that does matter to people who are fluent with
> maths and typing - you want something which is not enormously slower than
> handwriting stuff.

I think this is the most important point. It would probably be very
amusing to see a competition in math typing speed between someone
proficient in TeX (who is additionally using a reasonable environment
like Emacs/AUCTeX) and someone proficient in typing math in Word or
similar (are there such people at all?).

OTOH, if you use TeX/LaTeX only rarely, the overhead is large. And
learning LaTeX becomes masochism if you are not really trained in
structured thinking and/or programming and you have no one at hand who
sets you straight if you maltreat it as a WYSIWYG system.

Nicolas

Tamas K Papp

Mar 31, 2010, 4:20:32 AM
On Wed, 31 Mar 2010 09:55:38 +0200, Nicolas Neuss wrote:

> Tim Bradshaw <t...@tfeb.org> writes:
>
>> Of course TeX fails to capture the semantics of the expression at all,
>> but I think that if you try and do that you likely end up with
>> something you can't type quickly. And that does matter to people who
>> are fluent with maths and typing - you want something which is not
>> enormously slower than handwriting stuff.
>
> I think this is the most important point. It would probably be very
> amusing to see a competition in math typing speed between someone
> proficient in TeX (who is additionally using a reasonable environment
> like Emacs/AucTeX) and someone proficient in typing math in Word or
> similar (are there such people at all?).

About a year ago, I started using a Wacom Bamboo tablet (the
cheapest one, for about $50) with Xournal on Linux for "writing" math
derivations that only have a small audience (i.e. coauthors, students in
a class). I like it very much, but the funny thing is that even
though using it feels perfectly natural, it is only about 2x as fast
as typing in LaTeX! So I agree with your point completely.

> OTOH, if you use TeX/LaTeX only rarely, the overhead is large. And
> learning LaTeX becomes masochism if you are not really trained in
> structured thinking and/or programming and you have no one at hand who
> sets you straight if you maltreat it as a WYSIWYG system.

I think that the Lamport book is a gentle introduction for such
people.

Tamas

Tim Bradshaw

Mar 31, 2010, 5:53:09 AM
On 2010-03-31 09:20:32 +0100, Tamas K Papp said:

> Since about a year ago, I started using a Wacom Bamboo tablet (the
> cheapest one, for about $50) with Xournal on Linux for "writing" math
> derivations that only have a small audience (ie coauthors, students in
> a class). I like it very much, but the funny thing is that even
> though using it feels perfectly natural, it is only about 2x as fast
> as typing in LaTeX! So I agree with your point completely.

I use paper for that... I think I can probably type text faster than I
can coherently write it, but I can handwrite maths faster than I can
type at present. However, at one time I typed maths books for a
living, and I'm fairly sure I could type TeX faster than by hand at
that point. Of course that's a weird case, because you are just
transcribing existing stuff. I would always prefer to write maths
initially (I mean with a pen & paper), because it makes thinking easier
somehow.

>
> I think that the Lamport book is a gentle introduction for such
> people.

I would guess that the youth of today would just reject TeX out of hand
because, you know, it's not pretty. I can remember (somewhere around
the eternal September) when students started appearing who would no
longer accept emacs & TeX for writing stuff, but demanded Word, despite
it being much slower to get anything technical down with Word.

Nicolas Neuss

Mar 31, 2010, 8:21:44 AM
Tim Bradshaw <t...@tfeb.org> writes:

> On 2010-03-31 09:20:32 +0100, Tamas K Papp said:
>
>> Since about a year ago, I started using a Wacom Bamboo tablet (the
>> cheapest one, for about $50) with Xournal on Linux for "writing" math
>> derivations that only have a small audience (ie coauthors, students
>> in a class). I like it very much, but the funny thing is that even
>> though using it feels perfectly natural, it is only about 2x as fast
>> as typing in LaTeX! So I agree with your point completely.
>
> I use paper for that... I think I can probably type text faster than I
> can coherently write it, but I can handwrite maths faster than I can
> type at present. However, at one time I typed maths books for a
> living, and I'm fairly sure I could type TeX faster than by hand at
> that point. Of course that's a weird case, because you are just
> transcribing existing stuff. I would always prefer to write maths
> initially (I mean with a pen & paper), because it makes thinking
> easier somehow.

Yes, but as soon as one arrives at the point where the content is
relatively clear, the possibility to edit and copy things around becomes
more valuable than the mere speed of sketching. For example, when
writing scripts for lectures, I observe that I rarely use paper any
more.

>> I think that the Lamport book is a gentle introduction for such
>> people.
>
> I would guess that the youth of today would just reject TeX out of
> hand because, you know, it's not pretty. I can remember (somewhere
> around the eternal September) when students started appearing who
> would no longer accept emacs & TeX for writing stuff, but demanded
> Word, despite it being much slower to get anything technical down with
> Word.

I think LaTeX appeals to the same minds who also like programming.
Unfortunately, I observe that there are even many math students who do
not fit this description.

Nicolas

Tim Bradshaw

Mar 31, 2010, 9:33:13 AM
On 2010-03-31 13:21:44 +0100, Nicolas Neuss said:

> Yes, but as soon as one arrives at the point where the content is
> relatively clear, the possibility to edit and copy things around becomes
> more valuable than the mere speed of sketching. For example, when
> writing scripts for lectures, I observe that I rarely use paper any
> more.

Yes, I agree with this completely. In the trivial playing which is all
I do nowadays, I tend to hand-write things until I understand what I am
doing, then type up a fair copy (which I actually do using the
wonderful jsMath plugin for TiddlyWiki rather than TeX itself, but only
because I essentially keep my brain in TW now).

Kazimir Majorinc

Mar 31, 2010, 12:23:13 PM
These are all of his non-trivial quotes on Lisp that I was able to find
in the EWD archives; they are also available on my blog:

(
http://kazimirmajorinc.blogspot.com/2010/03/what-dijkstra-blogged-about-lisp.html
)

LISP has been jokingly described as "the most intelligent way to misuse
a computer". I think that description a great compliment because it
transmits the full flavor of liberation: it has assisted a number of our
most gifted fellow humans in thinking previously impossible thoughts.

(1972 Turing Award Lecture[1], Communications of the ACM 15 (10),
October 1972)


"In Departments of Computing Science, one of the most common confusions
is the one between a program and its execution, between a programming
language and its implementation. I always find this very amazing: the
whole vocabulary to make the distinction is generally available, and
also, the very similar confusion between a computer and its order
code, remarkably enough, is quite rare. But it is a deep confusion of
long standing. One of the oldest examples is presented by the LISP 1.5
Manual: halfway their description of the programming language LISP, its
authors give up and from then onwards try to complement their incomplete
language definition by an equally incomplete sketch of a specific
implementation. Needless to say, I have not been able to learn LISP from
that booklet!"

(EWD 447, On the role of scientific thought, 1974)


LISP is a good example. In the early sixties we have suffered from a
piece of computer science folklore, viz. that the crucial test whether
one had designed "a good language" was, whether one could write its own
interpreter in itself. I have never understood the reason for that
criterion --it was suggested that such a language definition would solve
the problem of the definition of (particularly) the semantics of the
language--, the origin of the criterion was the example of LISP, which
was "defined" by a ten-line interpreter written in LISP, and that,
somehow, set the standard. (I found this hard to accept because when I
started to study LISP from the LISP 1.5 Manual, I could not understand
it: after an incomplete definition of the language, its authors try to
supplement these deficiencies by an equally incomplete sketch of a
specific implementation, and that was a confusion if there ever was
one!) This unfortunate LISP-interpreter formulated in LISP has somehow
given mechanistic (or "operational") language definitions an undeserved
aura of respectability, and its unwholesome influence can be traced
right through the Vienna Definition Language and ALGOL 68 --what I
knew-- and even --what I learned this week to my surprise and
disappointment-- to the work of Dana S. Scott. It is enlightening to
know --what I also learned later that week-- that LISP's most awful
property, viz. "shallow binding" --now religiously defended by its
addicts!-- is described by McCarthy as "a mistake" in that ten-line
interpreter (which, of course, it is)!

(EWD448, Trip report E. W. Dijkstra, Edinburgh and Newcastle, 1 - 6
September 1974)


It was the complete entanglement of language definition and language
implementation that characterized the discussion after Mary Shaw's
presentation, and it was the entanglement that left many of the
Europeans flabbergasted. It was also this entanglement that made it
possible for me to read the LISP 1.5 Manual: after an incomplete
language definition, that text tries to fill the gaps with an equally
incomplete sketch of an —of the?— implementation. Yet LISP 1.5 conquered
in the decade after the publication of that manual a major portion of
the American academic computing community. This, too, must have had a
traceable influence. Why did LISP never get to that position in Europe?
Perhaps because in the beginning its implementation made demands beyond
our facilities, while later many had learned to live without it. (I
myself was completely put off by the Manual.)

(EWD 611, On the fact that the Atlantic Ocean has two sides, 1977)


Four slots were filled by G.J. Sussman (Artificial Intelligence
Laboratory, MIT) and a similar number by D. S. Wise (Department of
Computer Science, Indiana University). Both belonged very much to the
LISP subculture, neither of the two proved a single theorem, both showed
too much and made such heavy use of anthropomorphic terminology that
they were painful to listen to. Sussman's transparencies were printed,
but overloaded; Wise's transparencies were handwritten, but very messy.
Not used to Sussman's lecturing style - is it called "teaching by
example"? - I found him very tiring to listen to; he spoke very fast but
told very little, since he used most of his words to go in detail
through a number of almost identical examples. Wise struck me as rather
old-fashioned: he seemed to talk about the implementation of LISP in
very much the same way as people did in the early sixties. LISP's syntax
is so atrocious that I never understood its popularity. LISP's
possibility to introduce higher-order functions was mentioned several
times in its defence, but now I come to think of it, that could be done
in ALGOL60 as well. My current guess is that LISP's popularity in the
USA is related to FORTRAN's shortcomings.

(EWD798, Trip report, Newcastle, 19-25 July 1981)


Q We have seen Lisp emerge in documenting constructs?
A Lisp was great at the time of its inception if only for a radically
new way of using machines. It has suffered the fate of having been
elevated to a status of de facto standard with all its defects. Despite
its conceptual novelty a lot of the design of Lisp is terribly
amateurish even by the standards of the time. When I read the first
manual, the Lisp 1.5 manual, published in 1961, I could not believe my
eyes. It was an extremely poor language. Now it has become the de facto
standard of the AI community, which now suffers from Lisp, the way the
rest of the world suffered from Fortran.

Interview with Prof. Dr. Edsger W. Dijkstra, Austin, 04–03–1985
Rogier F. van Vlissingen


"Then LISP emerged as programming vehicle for symbol manipulation. LISP
can be viewed as a highly successful exercise in the mechanization of
applied logic. For computing science, its cultural significance is that
it liberated computing from the narrow connotations of number crunching
by including all formal processes --from symbolic differentiation and
integration to formal theorem proving-- as computational activities."

(EWD924, On a Cultural Gap, The Mathematical Intelligencer 8 (1986), 1:
48–52)


As said, FORTRAN and LISP were the two great achievements of the 50s.
Compared with FORTRAN, LISP embodied a much greater leap of imagination.
Conceptually FORTRAN remained on familiar grounds in the sense that its
purpose was to aid the mechanization of computational processes we used
to do with pen and paper (and mechanical desk calculators if you could
afford them). This was in strong contrast to LISP whose purpose was to
enable the execution of processes that no one would dream of performing
with pen and paper.

At the time LISP's radical novelties were for instance recognized by its
characterization as "the most intelligent way of misusing a computer",
in retrospect we see its radical novelty because it was what is now
known as a language for "functional programming", while now, 40 years
later, functional programming is still considered in many CS departments
as something much too fancy, too sophisticated to be taught to
undergraduates. LISP had its serious shortcomings: what became known as
"shallow binding" (and created a hacker's paradise) was an ordinary
design mistake; also its promotion of the idea that a programming
language should be able to formulate its own interpreter (which then
could be used as the language's definition) has caused a lot of
confusion because the incestuous idea of self-definition was
fundamentally flawed. But in spite of these shortcomings, LISP was a
fantastic and in the long run highly influential achievement that has
enabled some of the most sophisticated computer applications.

I must confess that I was very slow on appreciating LISP's merits. My
first introduction was via a paper that defined the semantics of LISP in
terms of LISP, I did not see how that could make sense, I rejected the
paper and LISP with it. My second effort was a study of the LISP 1.5
Manual from MIT which I could not read because it was an incomplete
language definition supplemented by equally incomplete description of a
possible implementation; that manual was so poorly written that I could
not take the authors seriously, and rejected the manual and LISP with
it. (I am afraid that tolerance of inadequate texts was not my leading
characteristic).

(Computing Science: Achievements and Challenges, EWD1284, 1999)

Things changed in 1960, when a transatlantic cooperation created the
programming language ALGOL 60, an event that my colleague F. E. J.
Kruseman Aretz for good reasons has chosen to mark the birth of
Computing Science. The reasons were good because ALGOL 60 introduced a
handful of deep novelties, all of which have withstood the test of time.
Let me review them briefly.

Firstly there is the introduction and use of Backus-Naur-Form (or BNF
for short), a formal grammar to define the syntax of the programming
language. More than anything else, ALGOL 60's formal definition has made
language implementation a topic worthy of academic attention, and this
was a crucial factor in turning automatic computing into a viable
scientific discipline.

Secondly there was the block structure and the concept of the local
variable, with the beautiful synthesis of lexicographic scope with
dynamically nested lifetimes. The lexicographic scope led to a unified
notation for mathematical quantification, while the nested lifetimes
opened the door to the next novelty.

This was recursion. Logicians knew it, LISP had incorporated it, but
ALGOL 60 gave it a firm place in imperative programming.

(Under the spell of Leibniz's Dream, EWD1298, 1999?)

Andrew Reilly

unread,
Apr 2, 2010, 2:48:48 PM4/2/10
to
On Wed, 31 Mar 2010 09:55:38 +0200, Nicolas Neuss wrote:

> I think this is the most important point. It would probably be very
> amusing to see a competition in math typing speed between someone
> proficient in TeX (who is additionally using a reasonable environment
> like Emacs/AucTeX) and someone proficient in typing math in Word or
> similar (are there such people at all?).

I was delighted, yesterday, to discover how far OpenOffice has come in
the math/equation entry sphere. Not only is it possible to enter
equations without pointing and clicking on menus, it's possible to do so
without even having to open up a different window/environment: just type
the equation using the TeX-like math language, then select it (mouse or
keyboard selection gestures) and then select Insert>Object>Formula. Yes,
I'd be much happier if that last step was a key sequence. Maybe it is: I
don't know. Very slick. Not as pretty as TeX, but good enough for my
current purposes. Much, much faster than I've seen in other WYSIWYG
editors.

Maybe the ODF (Open Document Format) might even come to have the same
sort of longevity as TeX. I really like that I can still recreate and
print the TeX documents that I wrote twenty years ago. Can't do that
with Word (although at least OpenOffice can open most of those old Word
documents that Word can't...)

Cheers,

--
Andrew

Tim Bradshaw

unread,
Apr 6, 2010, 6:29:32 AM4/6/10
to
On 2010-04-02 19:48:48 +0100, Andrew Reilly said:

> Maybe the ODF (Open Document Format) might even come to have the same
> sort of longevity as TeX. I really like that I can still recreate and
> print the TeX documents that I wrote twenty years ago. Can't do that
> with Word (although at least OpenOffice can open most of those old Word
> documents that Word can't...)

It seems to me that this is a case where there may be limited purpose
in re-fighting a battle which has been won (in this case, by TeX). If
I was designing a language for embedding maths, I think I would just
grit my teeth and nick great chunks of TeX. Interestingly, that's what
jsMath does, and it is really very effective indeed (when I type maths
now, I almost always do it using jsMath embedded in a TiddlyWiki,
rather than actually TeX).

Vassil Nikolov

unread,
Apr 9, 2010, 1:16:49 AM4/9/10
to

On Wed, 31 Mar 2010 10:53:09 +0100, Tim Bradshaw <t...@tfeb.org> said:
> ...

> I would always prefer to write maths initially (I mean with a pen &
> paper), because it makes thinking easier somehow.

And because (traditional) mathematical notation is optimized
(rather, has evolved towards being optimized) for that, e.g. by
being more than one-dimensional.

I think it is a rather interesting question whether and how
mathematical notation will evolve in a different direction now
"under the pressure" of keyboard use.

> I would guess that the youth of today would just reject TeX out of
> hand because, you know, it's not pretty.

Not _all_ youth of today, though...

---Vassil.


--
"Be careful how you fix what you don't understand." (Brooks 2010, 185)

Tim Bradshaw

unread,
Apr 9, 2010, 2:54:12 AM4/9/10
to
On 2010-04-09 06:16:49 +0100, Vassil Nikolov said:

> I think it is a rather interesting question whether and how
> mathematical notation will evolve in a different direction now
> "under the pressure" of keyboard use.

So do I. I suspect it won't because there are other reasons why people
handwrite maths, but I may well be wrong.

Mario S. Mommer

unread,
Apr 9, 2010, 3:32:59 AM4/9/10
to

In my experience, writing directly to TeX or anything similar is asking
for trouble unless you are doing simple things. The source is not all
that readable, and the hardcopy looks too good. The result is that you
do not see the mistakes and the holes in the arguments. Taking a draft
on paper and cleaning it up by copying the non-strike-out to a new paper
draft is the only really good way to make sure you are really really
really going over every detail again.

When I write maths, I do it on paper. When it is ready, I write it in
LaTeX, and that draft is already pretty close to the final product.

Restructuring a decently sized LaTeX document through copy and paste is
problematic, because there is no compiler (nor will there ever be) to
tell you that the semantics do not fit any longer.

Nicolas Neuss

unread,
Apr 9, 2010, 5:00:06 AM4/9/10
to
m_mo...@yahoo.com (Mario S. Mommer) writes:

> In my experience, writing directly to TeX or anything similar is
> asking for trouble unless you are doing simple things.

I guess everyone uses paper&pencil for working out new things. But
after this initial step...

> The source is not all that readable,

To make sure: Do you use Emacs/AucTeX and do you use structuring
commands like

\newcommand{\abs}[1]{\lvert#1\rvert}
\newcommand{\set}[1]{\{#1\}}
\newcommand{\norm}[1]{\lVert#1\rVert}
?

With that and a little discipline (using begin/end{displaymath} for
example instead of $$) I can get quite readable source.
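
For illustration, the structuring macros above might be used like this (a minimal sketch; the mathematical content is made up, and amsmath is assumed for `\lvert`/`\rVert`):

```latex
% Minimal example using the \abs, \set, \norm macros from above.
\documentclass{article}
\usepackage{amsmath}% provides \lvert, \rvert, \lVert, \rVert
\newcommand{\abs}[1]{\lvert#1\rvert}
\newcommand{\set}[1]{\{#1\}}
\newcommand{\norm}[1]{\lVert#1\rVert}
\begin{document}
For all $x \in \set{y : \abs{y} < 1}$ we have
\begin{displaymath}
  \norm{Tx} \leq C \, \norm{x}.
\end{displaymath}
\end{document}
```

The point of the macros is that the source says what the notation *means* (a norm, a set) rather than how it is drawn, which also makes a later notation change a one-line edit.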

> and the hardcopy looks too good. The result is that you do not see the
> mistakes and the holes in the arguments. Taking a draft on paper and
> cleaning it up by copying the non-strike-out to new a paper draft is
> the only really good way to make sure you are really really really
> going over every detail again.
>
> When I write maths, I do it on paper. When it is ready, I write it in
> LaTeX, and that draft is already pretty close to the final product.
>
> Restructuring a decently sized latex document through copy and paste
> is problematic, because there is no compiler (nor will there ever be)
> to tell you that the semantics do not fit any longer.

I did not encounter this problem so far.

Nicolas

His kennyness

unread,
Apr 9, 2010, 5:17:58 AM4/9/10
to
Nicolas Neuss wrote:
> m_mo...@yahoo.com (Mario S. Mommer) writes:
>
>> In my experience, writing directly to TeX or anything similar is
>> asking for trouble unless you are doing simple things.
>
> I guess everyone uses paper&pencil for working out new things. But
> after this initial step...
>
>> The source is not all that readable,
>
> To make sure: Do you use Emacs/AucTeX and do you use structuring
> commands like
>
> \newcommand{\abs}[1]{\lvert#1\rvert}
> \newcommand{\set}[1]{\{#1\}}
> \newcommand{\norm}[1]{\lVert#1\rVert}

Yikes.

Listen, this is silly, this is like people who say Emacs/Slime is a good
way to develop software. It's not, it's just what you know.

All youse guys is saying is that you lack the imagination needed to
conceive anything other than what you know, including any other wysiwyg
math editor than the ones you already know (such as retarded editors
that have one forever pulling things out of palettes).

Ironically, emacs is indeed a good example of how much one can get out
of keychords once those are mastered. I suppose a fair question is
whether one writes enough maths to master those keychords, but if one
writes enough to get comfortable with TeX methinks that covers that. And
with a decent editor one can use it for scratch work as well as final
proof and get even better.

Go ahead, dinosaurs, you hang on to the past, that's what yer good at!
The rest of us will improve things.

>
> With that and a little discipline (using begin/end{displaymath} for
> example instead of $$) I can get quite readable source.

Oh, my. The denial!

:)

kt


ps. What do the dinosaurs think of MathML? The typesetting, I mean.

Tamas K Papp

unread,
Apr 9, 2010, 5:32:30 AM4/9/10
to
On Fri, 09 Apr 2010 05:17:58 -0400, His kennyness wrote:

> Nicolas Neuss wrote:
>> m_mo...@yahoo.com (Mario S. Mommer) writes:
>>
>>> In my experience, writing directly to TeX or anything similar is
>>> asking for trouble unless you are doing simple things.
>>
>> I guess everyone uses paper&pencil for working out new things. But
>> after this initial step...
>>
>>> The source is not all that readable,
>>
>> To make sure: Do you use Emacs/AucTeX and do you use structuring
>> commands like
>>
>> \newcommand{\abs}[1]{\lvert#1\rvert}
>> \newcommand{\set}[1]{\{#1\}}
>> \newcommand{\norm}[1]{\lVert#1\rVert}
>
> Yikes.
>
> Listen, this is silly, this is like people who say Emacs/Slime is a good
> way to develop sotware. It's not, it's just what you know.
>
> All youse guys is saying is that you lack the imagination needed to
> conceive anything other than what you know, including any other wysiwyg
> math editor than the ones you already know (such as retarded editors
> that have one forever pulling things out of palettes).

So where is the software developed by His Kennyness, Prince of
Imagination, that dominates LaTeX+AUCTex+Emacs (or LaTeX+one's
favorite environment) for typesetting math?

> Go ahead, dinosaurs, you hang on to the past, that's what yer good at!
> The rest of us will improve things.

Sure. Let me know when it is ready. I am not holding my breath,
though :-)

Tamas

Tim Bradshaw

unread,
Apr 9, 2010, 5:43:43 AM4/9/10
to
On 2010-04-09 10:17:58 +0100, His kennyness said:

> ps. What do the dinosaurs think of MathML? The typesetting, I mean.

Are there good systems that set type from it? The only thing I've
played with is getting Mathematica to generate MathML (because I'm not
about to type it in) and pointing Firefox at it, which produces
terrible-looking results on screen (and I would expect on paper). But
this is two levels of cruft - I bet Mathematica produces peculiar
MathML and I don't expect Firefox is anything like a decent typesetting
system. It looks much worse than jsMath though (but jsMath is pretty
heroic really).

Tim Bradshaw

unread,
Apr 9, 2010, 6:11:50 AM4/9/10
to
On 2010-04-09 08:32:59 +0100, Mario S. Mommer said:

> In my experience, writing directly to TeX or anything similar is asking
> for trouble unless you are doing simple things. The source is not all
> that readable, and the hardcopy looks too good. The result is that you
> do not see the mistakes and the holes in the arguments. Taking a draft
> on paper and cleaning it up by copying the non-strike-out to new a paper
> draft is the only really good way to make sure you are really really
> really going over every detail again.

I agree with this. I hand-write any maths I do (which is not much,
though more than a few years ago) until I have a final (+/-) version.
People obviously vary, but I find it hard to imagine a computing system
which would allow me to do what I can with paper - my notes are
typically full of scribbly little diagrams and written in some weird
order with arrows connecting things. People have said I should use
some kind of tablet thing, but as far as I can see this is just very
expensive paper, since it adds no value (actually, I am thinking of
playing with something to let me capture vector versions of the
scribbly little pictures so I can embed them in later versions of
things - trying to use some kind of proper diagamming tool is terribly
laborious).

Tim Bradshaw

unread,
Apr 9, 2010, 6:13:05 AM4/9/10
to
On 2010-04-09 11:11:50 +0100, Tim Bradshaw said:

> I hand-write any maths I do (which is not much, though more than a few
> years ago) until I have a final (+/-) version.

I meant to add: significantly I *don't* hand-write programs, not even
tiny fragments of them.

His kennyness

unread,
Apr 9, 2010, 7:22:29 AM4/9/10
to

Aha! jsMath!! You just changed my life.

Thx, kt

ps. kennyfans can start tracking 'kennytilton' on github, where I am
ramping up an o/s qxells project (+ cells qooxdoo). just utils-kt so
far, cells later.

Harald Hanche-Olsen

unread,
Apr 9, 2010, 11:46:01 AM4/9/10
to
+ His kennyness <kent...@gmail.com>:

> Aha! jsMath!! You just changed my life.

Then take a look at its successor-to-be: http://www.mathjax.org/
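
For anyone curious what the embedding amounts to, a page using MathJax needs only a script tag plus TeX delimiters in the body; this is a sketch (the script path is a placeholder, check mathjax.org for the currently recommended loader URL and configuration name):

```html
<!-- Minimal MathJax embedding sketch; "MathJax.js" path and config
     name are placeholders to be replaced per the MathJax docs. -->
<html>
  <head>
    <script type="text/javascript"
            src="MathJax.js?config=TeX-AMS_HTML"></script>
  </head>
  <body>
    <p>Inline math like \(e^{i\pi} + 1 = 0\) and display math:</p>
    \[ \int_0^\infty e^{-x^2}\,dx = \frac{\sqrt{\pi}}{2} \]
  </body>
</html>
```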

--
* Harald Hanche-Olsen <URL:http://www.math.ntnu.no/~hanche/>
- It is undesirable to believe a proposition
when there is no ground whatsoever for supposing it is true.
-- Bertrand Russell

Mario S. Mommer

unread,
Apr 9, 2010, 12:19:01 PM4/9/10
to

Nicolas Neuss <last...@kit.edu> writes:
> m_mo...@yahoo.com (Mario S. Mommer) writes:
> To make sure: Do you use Emacs/AucTeX and do you use structuring
> commands like
>
> \newcommand{\abs}[1]{\lvert#1\rvert}
> \newcommand{\set}[1]{\{#1\}}
> \newcommand{\norm}[1]{\lVert#1\rVert}
> ?
>
> With that and a little discipline (using begin/end{displaymath} for
> example instead of $$) I can get quite readable source.

I use almost the whole stack, Emacs + AucTex + x-symbol, although not
that preview thing that comes with auctex. x-symbol is really great, btw.

I still don't think the source is all that readable. Pen on paper just
beats it.

His kennyness

unread,
Apr 9, 2010, 1:52:22 PM4/9/10
to
Harald Hanche-Olsen wrote:
> + His kennyness <kent...@gmail.com>:
>
>> Aha! jsMath!! You just changed my life.
>
> Then take a look at its successor-to-be: http://www.mathjax.org/
>

Damn. There goes my weekend!

(thx!)

kt

Xah Lee

unread,
Apr 9, 2010, 2:30:18 PM4/9/10
to
On Mar 27, 4:22 am, His kennyness <kentil...@gmail.com> wrote:
> Francogrex wrote:
> > On Mar 26, 9:20 pm, His kennyness <kentil...@gmail.com> wrote:
> >> More recently, Steele and Norvig.
>
> > Show good evidence to back up your claim.
>
> How about wild and crazy evidence I don't give a rat's ass what you
> think about?
>
> Steele contributed hugely to an anti-Lisp called Java, and contributed
> mostly the anti-Lispiness. He should have contributed groovy/scala.
>
> Norvig rather famously pronounced Python to be as good as Lisp.
>
> Two men who should be advocates for lisp but are not because their
> masters use something else.
>
> Knuth said any idiot can implement a linked list so there is no need for
> Lisp. Look it up.

if one adds Steele and Norvig as anti lispers, one should also include
Paul Graham. Paul with his Arc, and specifically, claims that Common
Lisp is not good enough, and by his Arc, also by implication disgraces
the bunch of Scheme Lisps.

Didn't someone named William something post a collection of anti-lisp
quotes from famous lispers within the past year?

Xah
http://xahlee.org/


Xah Lee

unread,
Apr 9, 2010, 2:37:16 PM4/9/10
to
On Mar 27, 10:41 am, Tim Bradshaw <t...@tfeb.org> wrote:
> On 2010-03-27 17:06:13 +0000, His kennyness said:
>
> > It took him ten years just to lay out type
>
> To be fair, I think he basically had to discover how typography works,
> including mathematical typography, invent algorithms that could do a
> decent job of it, implement them and (not least) design a typeface (OK,
> a horrible typeface, but I think he was constrained by the typeface his
> publisher had already used) and write a program in which to implement
> that design.
>
> That's actually pretty good going for ten years.
>
> > TeX is notoriously hard to mark up,
>
> TeX is indeed a horrible language
>
> > and there is no wysiwyg GUI.
>
> People who can type generally don't want such.  Though TeX is horrible,
> it is still a pretty good way of creating printed maths - certainly far
> better than anything else I've seen.
>
> (None of this should be taken as implying I think Knuth has anything
> interesting to say about programming languages: I don't.)

certainly i disagree. See:

• The TeX Pestilence
http://xahlee.org/cmaci/notation/TeX_pestilence.html

as to better tools for showing math formulas... there are quite a few
today that are not based on TeX.

MathML, TeXmacs, Lout ...

you can find a lot more here http://en.wikipedia.org/wiki/Formula_editor

Xah
http://xahlee.org/


Xah Lee

unread,
Apr 9, 2010, 2:50:34 PM4/9/10
to

LOL. If you use Mathematica, for any complex math formula, it
trivially beats TeX/LaTeX. In fact, for a complex formula, you won't even
know how to do it in TeX unless you are one of the world's top
TeX experts.

When most tech geekers speak of “i use TeX...”, their formulas amount
to the likes of the quadratic formula, matrices, derivatives, integrals,
square roots, powers... the type of math typesetting you see in common
math books. When the thing that needs to be typeset gets a bit exotic,
such as continued fractions, or a bunch of arrows in category theory...,
it gets very hard to know how to do it in TeX.

TeX and LaTeX survived this long for mainly one reason: $Free$. (just
like lots of garbage from the open source thing, and beore this, X11,
unix. First they spam by $free$, then they become the recognized
industry “standard”, by this time, they killed most quality things out
there, then they try to fix the problems... e.g. Apache, MySQL, ... )

Xah
http://xahlee.org/


Xah Lee

unread,
Apr 9, 2010, 2:54:26 PM4/9/10
to
On Apr 9, 2:17 am, His kennyness <kentil...@gmail.com> wrote:

> ps. What do the dinosaurs think of MathML? The typesetting, I mean.

LOL.

As far as i know, many mathematicians don't like it... cause it is too
verbose to the degree that it is not possible to write manually, while
many of these old mathematicians know TeX already.

Xah
http://xahlee.org/


Xah Lee

unread,
Apr 9, 2010, 4:13:00 PM4/9/10
to
On Mar 31, 9:23 am, Kazimir Majorinc <em...@false.false> wrote:
> These are all his non-trivial quotes on Lisp I was able to find
> on EWD archives, also available on my blog:
> http://kazimirmajorinc.blogspot.com/2010/03/what-dijkstra-blogged-about-lisp.html

Great post Kazimir!

I enjoyed your blog, the few times i ran across it in the past.

Also, i recall a month or two back, when you posted a very sensible
and valid question about number of lispers, the thread was quickly
twisted into idiotic blathering, by many regular posters here. These
motherfucking idiotic scumbags.

I also recall reading some of your post that is very technical about
NewLisp... don't remember the detail but it's one about some
expressiveness that can be done with NewLisp but hard with CL, again
attacked by defensive idiot's ramblings ...

I like Dijkstra, even though i don't know much about his work. When i
was reading this thread in the beginning, i thought Dijkstra is rather
pro-lisp. But after reading your compilation of quotes (great
research! thanks), things isn't that rosy. I guess i could blame this
partly to the random blatherings and propaganda by the lispers
(including Scheme Lispers) in the past 15 years... who washed my brain
even when i'm quite careful.

I really admire Dijkstra. I guess partly because he's not afraid to
criticize, and partly i guess i truly agree with his math formalism
approach to computation. Sometimes, one feels that his attacks are
tainted by colorful words. Still, i highly respect him.

speaking of celebrities, this thread mentioned Knuth. I really have
little admiration of Knuth. Of course he's one of the most prominent
computer scientist, but not at all my type. Not his philosophies, his
work, nor his style. In colorful words, the archetype of tech geeking
idiot (nerd ne plus ultra).

Xah
http://xahlee.org/


Tim Bradshaw

unread,
Apr 9, 2010, 5:06:23 PM4/9/10
to
On 2010-04-09 16:46:01 +0100, Harald Hanche-Olsen said:

> + His kennyness <kent...@gmail.com>:
>
>> Aha! jsMath!! You just changed my life.
>
> Then take a look at its successor-to-be: http://www.mathjax.org/

That looks interesting. I will have to work out how to embed it into
TiddlyWiki.

His kennyness

unread,
Apr 9, 2010, 5:14:25 PM4/9/10
to
Xah Lee wrote:
> On Mar 27, 4:22 am, His kennyness <kentil...@gmail.com> wrote:
>> Francogrex wrote:
>>> On Mar 26, 9:20 pm, His kennyness <kentil...@gmail.com> wrote:
>>>> More recently, Steele and Norvig.
>>> Show good evidence to back up your claim.
>> How about wild and crazy evidence I don't give a rat's ass what you
>> think about?
>>
>> Steele contributed hugely to an anti-Lisp called Java, and contributed
>> mostly the anti-Lispiness. He should have contributed groovy/scala.
>>
>> Norvig rather famously pronounced Python to be as good as Lisp.
>>
>> Two men who should be advocates for lisp but are not because their
>> masters use something else.
>>
>> Knuth said any idiot can implement a linked list so there is no need for
>> Lisp. Look it up.
>
> if one adds Steele and Norvig as anti lispers, one should also include
> Paul Graham.

Perhaps under the category of "giving aid and comfort to the enemy" and
to the extent that is fun having all the gods of Lisp as its biggest
opponents, yes.

But he wrote two great books about Lisp and did a startup incubator that
was pro-Lisp so we might have to make him an honorary opponent.

> Paul with is Arc, and specifically, claims that Common
> Lisp is not good enough,

I accept the Lisp/Common Lisp distinction and while Paul and other
minilisp proponents are Simply Wrong, I would say that makes them
anti-CL not anti-Lisp.

> and by his Arc, also by implication disgrace
> the bunch of Scheme Lisps.

I had not thought of that. Did he cover "why not Scheme?" in his "why
Arc" essay?

kt

Peter Keller

unread,
Apr 9, 2010, 5:33:16 PM4/9/10
to
Xah Lee <xah...@gmail.com> wrote:
> These motherfucking idiotic scumbags.

It is really tragic that the knowledge you are able to contribute to
humanity is diluted by your vitriol.

People are not going to remember you for the things you did.

They are going to remember you for the things you said.

Later,
-pete


His kennyness

unread,
Apr 9, 2010, 7:07:24 PM4/9/10
to

Yep, him and John Lennon.

And I do not think you have seen how long is Xah's hair.

kt

Kazimir Majorinc

unread,
Apr 10, 2010, 4:25:54 AM4/10/10
to

Thanx.

Yes, I believe that Dijkstra's opinion on Lisp is accurately described
by this comment:

"Dijkstra did not like Lisp, his compliments about Lisp are always
insults hidden tongue in cheek. If you read his articles you will find
mostly he bashes Lisp for its poor specifications, poor manuals, lack of
clear direction, and its ridiculous artificial intelligence aims which
he was heavily against."


<http://tekkie.wordpress.com/2006/05/31/rediscovering-dr-dijkstra-and-giving-lisp-a-second-chance/#comment-8348>

--
http://kazimirmajorinc.blogspot.com

Pascal J. Bourguignon

unread,
Apr 10, 2010, 6:10:41 AM4/10/10
to
m_mo...@yahoo.com (Mario S. Mommer) writes:

What about if you merged LaTeX with an automatic proof checker?

--
__Pascal Bourguignon__

Pascal J. Bourguignon

unread,
Apr 10, 2010, 6:37:53 AM4/10/10
to
Harald Hanche-Olsen <han...@math.ntnu.no> writes:

> + His kennyness <kent...@gmail.com>:
>
>> Aha! jsMath!! You just changed my life.
>
> Then take a look at its successor-to-be: http://www.mathjax.org/

Yes, that's the problem with all these "nice" "intuitive" GUI tools:
they always have a successor, so you are always switching and learning a
new one.

Those who are not left with much time prefer to stick with efficient
stable tools, such as emacs or lisp which have been basically the same
for 30 years, so that we can concentrate on _our_ work, not on the job
of making another software entrepreneur rich or flattered.

--
__Pascal Bourguignon__

His kennyness

unread,
Apr 10, 2010, 7:07:56 AM4/10/10
to
Kazimir Majorinc wrote:
>
> Thanx.
>
> Yes, I believe that Dijkstra's opinion on Lisp is accurately described
> by this comment:
>
> "Dijkstra did not like Lisp, his compliments about Lisp are always
> insults hidden tongue in cheek. If you read his articles you will find
> mostly he bashes Lisp for its poor specifications

heh-heh, nowadays languages do not even have specifications.

>, poor manuals

heh-heh, that's not an objection to a /language/.

>, lack of
> clear direction

heh-heh, Mr. Graham said Lisp was the perfect language when you do not
know what code to write. Is that because Lisp is the language that did
not decide up front what it wanted to be?

>, and its ridiculous artificial intelligence aims which
> he was heavily against."

heh-heh, that's not an objection to a /language/.

kt

His kennyness

unread,
Apr 10, 2010, 7:18:57 AM4/10/10
to
Pascal J. Bourguignon wrote:
> Harald Hanche-Olsen <han...@math.ntnu.no> writes:
>
>> + His kennyness <kent...@gmail.com>:
>>
>>> Aha! jsMath!! You just changed my life.
>> Then take a look at its successor-to-be: http://www.mathjax.org/
>
> Yes, that's the problem with all these "nice" "intuitive" GUI tools:
> they always have a successor, so you are always switching and learning a
> new one.

Go back to bed. We'll wake you when the world is perfect.

>
> Those who are not left with much time prefer to stick with efficient
> stable tools, such as emacs or lisp which have been basically the same
> for 30 years, so that we can concentrate on _our_ work, not on the job
> of making another software entrepreneur rich or flattered.
>

I gather you are not concentrating on doing math on the Web.

kt

Mario S. Mommer

unread,
Apr 10, 2010, 8:53:02 AM4/10/10
to

That would have to be one hell of a proof checker! It would have to
understand natural language, and know what can be "safely assumed" as
being implicit, given the audience. It has to be that way because a
maths paper written like a formal spec of something is not going to get
far, because of its intrinsic wetware incompatibility. The referees will
absolutely hate you for it. And then you get the "way too technical"
rejection. Which is a good thing, because if such a paper were accepted,
nobody would read it.

In short, that automatic proof checker has to be a really damn good AI,
of the type Lisp was originally conceived to make happen.

I mean, I'm all for such a program, but I'm a tiny bit skeptical :-)

Harald Hanche-Olsen

unread,
Apr 10, 2010, 9:22:39 AM4/10/10
to
+ p...@informatimago.com (Pascal J. Bourguignon):

> Harald Hanche-Olsen <han...@math.ntnu.no> writes:
>
>> + His kennyness <kent...@gmail.com>:
>>
>>> Aha! jsMath!! You just changed my life.
>>
>> Then take a look at its successor-to-be: http://www.mathjax.org/
>
> Yes, that's the problem with all these "nice" "intuitive" GUI tools:
> they always have a successor, so you are always switching and learning a
> new one.

Eh? Did mathjax suddenly morph into an intuitive GUI tool while I wasn't
looking? Strange.

> Those who are not left with much time prefer to stick with efficient
> stable tools, such as emacs or lisp which have been basically the same
> for 30 years, so that we can concentrate on _our_ work, not on the job
> of making another software entrepreneur rich or flattered.

Or TeX. You hardly get any more stable than that. Come to think of it,
that is what jsmath and mathjax do: They let you type TeX code and have
it transformed into readable math on a web page. Mathjax is just a
better implementation of what jsmath does, and both try to do for the
web what TeX did for paper.
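For instance (a generic illustration, not taken from either project's
documentation), the TeX input an author types into the page might be:

```latex
% Display math as an author would type it into a web page served
% with jsMath or MathJax; the \[ ... \] delimiters mark
% display-style math, exactly as in plain LaTeX.
\[
  \int_0^\infty e^{-x^2}\,dx = \frac{\sqrt{\pi}}{2}
\]
```

The tool then renders this directly in the browser, instead of the
author having to pre-generate an image of each formula.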

I take it you don't approve of Lisp implementations that haven't been
around for 20 years or more either?

Xah Lee

unread,
Apr 10, 2010, 10:57:53 AM4/10/10
to

it is impractical for TeX to be merged into a proof checking language.
I mean, if it happened, the result would really have nothing to do with
TeX.

However, the idea of combining a system that can display math formulas
with proof checking, or with a so-called computer algebra system, and
also with a general computer language, is a common need, and has in
fact been done to various degrees. Mathematica, for example, is one.
Meanwhile, most computer algebra systems (such as Maple), and most math
tools used in the physical sciences, have a formula display system
built in to some degree (MatLab, Sage, MathCAD...)

for more detail of this, see:

• Math Notations, Computer Languages, and the “Form” in Formalism
http://xahlee.org/cmaci/notation/index.html

Xah
http://xahlee.org/


Xah Lee

unread,
Apr 10, 2010, 11:25:28 AM4/10/10
to
2010-04-10

it's been brewing in my mind for a while to write a criticism of Paul
Graham's Arc and his essay about the ideal language.

of the various essays of his i have read in the past years about Arc or
the design of a lang, his essential idea rests on the concept of the
“hacker”. He keeps saying a lang needs to be this or that because
the “hacker” is this or that way.

That elusive thing, “hacker”. Whence, one can't really get a precise
idea of what he considers ideal in a language design. I think that is
the main problem, and consequently, whatever he comes up with, i can't
deem it good. (and from seeing what he has actually done with Arc, you
know my opinion of it is shit, and it generally isn't well received
either... e.g. far less fanfare than say Haskell, Clojure, Scala,
erlang, OCaml/F#... and far fewer users than say NewLisp, OCaml, Scheme
Lisps...)

In 2008 i wrote an essay listing tens of new langs that are popularly
talked about:

• Proliferation of Computing Languages
http://xahlee.org/UnixResource_dir/writ/new_langs.html

But in the past 2 years, more have come onto the mass radar!!!
Google's Go, Sun Micro's Fortress. All these langs come with a bunch of
philosophies and defenses of how or why they should exist, and claims
about the hole they fill. Some are dubious of course, but some do seem
sensible. If you consider the problem space of computing, and the
possibilities of imagination, there really is room for new langs.

Though, as far as i can tell, i don't see any concrete or theoretical
merit in Arc beyond his “hackers need it!” concept.

Clojure, for example, i can justify easily: for one thing, it is a lisp
on the JVM, a modern lisp without baggage, with an easy install,
independent, or with the concurrency argument. NewLisp, i see merit in:
for example, it fills the scripting niche, and suits hobbyist coders.

I really hate the word “hacker”. Imagine what Dijkstra would have to
say about “a lang designed for the ‘hacker’?” LOL.

... what society overwhelmingly asks for is snake oil. Of course, the
snake oil has the most impressive names —otherwise you would be
selling nothing— like “Structured Analysis and Design”, “Software
Engineering”, “Maturity Models”, “Ma