Xah Lee, 2007-05-03
There is a common complaint by programmers about lisp's notation of
nested parentheses being unnatural or difficult to read. Long-time
lisp programmers often counter that it is a matter of conditioning,
and/or blame the use of “inferior” text editors that are not
designed to display nested notations. In the following, I describe how
lisp notation is actually a problem, on several levels.
(1) Some 99% of programmers are not used to the nested parenthesis
syntax. This is a practical problem. On this aspect alone, lisp's
syntax can be considered a problem.
(2) Arguably, the pure nested syntax is not natural for humans to read.
Long-time lispers may disagree on this point.
(3) Most importantly, a pure nested syntax discourages frequent or
advanced use of function sequencing or compositions. This aspect is
the most devastating.
The first issue, that most programmers are not comfortable with nested
notation, is well known. It is not a technical issue. Whether it is
considered a problem of the lisp language is a matter of philosophical
disposition.
The second issue, about nested parentheses not being natural for humans
to read, may be debatable. I do think that deep nesting is a problem
for the programmer. Here's an example of two blocks of code that are
syntactically equivalent in the Mathematica language:
vectorAngle[{a1_, a2_}] := Module[{x, y},
{x, y} = {a1, a2}/Sqrt[a1^2 + a2^2] // N;
If[x == 0, If[Sign@y === 1, π/2, -π/2],
If[y == 0, If[Sign@x === 1, 0, π],
If[Sign@y === 1, ArcCos@x, 2 π - ArcCos@x]
]
]
]
SetDelayed[vectorAngle[List[Pattern[a1,Blank[]],Pattern[a2,Blank[]]]],
Module[List[x,y],
CompoundExpression[
Set[List[x,y],
N[Times[List[a1,a2],
Power[Sqrt[Plus[Power[a1,2],Power[a2,2]]],-1]]]],
If[Equal[x,0],
If[SameQ[Sign[y],1],Times[π,Power[2,-1]],
Times[Times[-1,π],Power[2,-1]]],
If[Equal[y,0],If[SameQ[Sign[x],1],0,π],
If[SameQ[Sign[y],1],ArcCos[x],
Plus[Times[2,π],Times[-1,ArcCos[x]]]]]]]]]
The latter uses the full nested form (called FullForm in
Mathematica). This form is isomorphic to lisp's nested parenthesis
syntax, token for token (i.e. lisp's “(f a b)” is Mathematica's
“f[a,b]”). As you can see, this form, by the sheer number of nested
brackets, is in practice problematic to read and type. In Mathematica,
nobody really programs using this syntax. (The FullForm syntax is
there for the same language design principle shared with lisp,
“consistency and simplicity”, or the commonly touted lisp
advantage of “data is program; program is data”.)
The third issue, about how nested syntax seriously discourages
frequent or advanced use of inline function sequencing on the fly, is
the most important, and I'll give further explanation below.
One practical way to see how this is so is by considering unix's
shell syntax. You all know how convenient and powerful unix pipes
are. Here are some practical examples: “ls -al | grep xyz”, or “cat a
b c | grep xyz | sort | uniq”.
Now suppose we get rid of the unix pipe notation and instead replace
it with a pure functional notation, e.g. (uniq (sort (grep xyz (cat a
b c)))), or enrich it with a composition function and a pure function
construct (λ), so that this example can be written as: ((compose (lambda
(x) (grep xyz x)) sort uniq) (cat a b c)).
You see how this change, although syntactically equivalent to the
pipe “|” (or semantically equivalent in the example using function
composition), will, due to the cumbersome nested syntax, force a
change in the nature of the code programmers produce. Namely, the
frequency of inline sequencing of functions on the fly will probably
be reduced; instead, there will be more code that defines functions
with temp variables and applies them just once, as in
traditional languages.
A language's syntax or notation system has a major impact on what kind
of code, style, or thinking patterns the language's users develop. This
is a well-known fact to those acquainted with the history of math
notations.
The sequential notations “f@g@h@x”, “x//h//g//f”, and the unixy
“x|h|g|f” are far more convenient and easier to decipher than “(f (g (h
x)))” or “((compose h g f) x)”. In actual code, any of the f, g, h
might be a complex pure function (aka a lambda construct, itself full
of parentheses).
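To make the comparison concrete in lisp terms: a left-to-right COMPOSE
matching the reading order of “x//h//g//f” can be sketched in Common
Lisp. COMPOSE is not a standard CL function; this definition is an
illustration only.

```lisp
;; Sketch only: COMPOSE applies its arguments left to right,
;; so ((compose h g f) x) computes (f (g (h x))).
(defun compose (&rest fns)
  (lambda (x)
    (reduce (lambda (acc fn) (funcall fn acc))
            fns :initial-value x)))

;; Example: 3 -> (1+ 3) = 4 -> (* 2 4) = 8
(funcall (compose #'1+ (lambda (n) (* 2 n))) 3)  ; => 8
```

Note that even with COMPOSE available, the call site is still a nested
form the reader must decipher.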
Lisp, by sticking with an almost uniform nested parenthesis notation,
immediately reduces the pattern of sequencing functions, simply
because the syntax does not readily lend itself to it as unix's
“x|h|g|f” does. Programmers who are aware of the coding pattern
of sequencing functions now either need to think in terms of a
separate “composition” construct, or are subject to the much more
problematic typing and deciphering of nested parentheses.
(Note: Lisp's sexp is actually not that pure. It has ad hoc syntax
equivalents such as the “quote” construct “ '(a b c) ”, and also the
“`”, “#”, and “,@” constructs, precisely for the purpose of reducing
parentheses and increasing readability. Scheme's coming standard,
R6RS, even proposes the introduction of [] and {} and a few other
bits of syntax sugar to break the uniformity of nested parentheses for
legibility. Mathematica's FullForm is as pure a nested notation
as can be.)
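The reader shorthands mentioned in the note can be checked at any
Common Lisp REPL; for example:

```lisp
;; Reader shorthands and the nested forms they abbreviate:
''(a b c)   ; 'x reads as (quote x), so this is (quote (quote (a b c)))
'#'car      ; #'x reads as (function x)

;; Both evaluate to the unabbreviated nested lists:
(equal ''(a b c) '(quote (a b c)))    ; => T
(equal '#'car    '(function car))     ; => T
```

(Backquote “`” and “,@” similarly expand into ordinary list-building
forms, though their exact expansion is implementation-defined.)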
-------
The above is part of a 3-part exposition:
“The Concepts and Confusions of Prefix, Infix, Postfix and Fully
Functional Notations”,
“Prefix, Infix, Postfix notations in Mathematica”,
“How Lisp's Nested Notation Limits The Language's Utility”,
archived at:
http://xahlee.org/UnixResource_dir/writ/notations.html
That is false. The complaint does not frequently occur among all of
the complaints voiced by the entire population of complaintive
programmers.
> (1) Some 99% of programmers are not used to the nested parenthesis
> syntax. This is a practical problem.
Since 99% of programmers don't use Lisp, it's not a practical problem.
> (2) Arguably, the pure nested syntax is not natural for humans to read.
> Long-time lispers may disagree on this point.
Programming language syntax shouldn't be natural for humans to read.
Or, rather, this shouldn't be a requirement which creates technical
compromises.
> (3) Most importantly, a pure nested syntax discourages frequent or
> advanced use of function sequencing or compositions.
That is an artifact of the way in which function composition is
rendered in nested syntax, and not an artifact of that syntax itself.
There exist macros that provide alternate syntax for function
composition, such as a left-to-right pipeline.
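For instance, a minimal left-to-right pipeline macro (hypothetical
name ->; not part of standard Common Lisp) can be written in a few
lines:

```lisp
;; Sketch: thread an initial value through a chain of calls,
;; inserting each intermediate result as the first argument
;; of the next form.
(defmacro -> (init &rest forms)
  (reduce (lambda (acc form)
            (list* (first form) acc (rest form)))
          forms :initial-value init))

;; (-> (list 3 1 2 1) (remove-duplicates) (sort #'<))
;; macroexpands to (sort (remove-duplicates (list 3 1 2 1)) #'<)
;; and reads left to right, like a shell pipeline.
```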
> This aspect is the most devastating.
Compared to issues like global warming and problems in the Middle
East, hardly.
> The first issue, that most programmers are not comfortable with nested
> notation, is well known.
Many programmers are also not comfortable in social situations. So
what?
> shell syntax. You all know how convenient and powerful unix pipes
> are. Here are some practical examples: “ls -al | grep xyz”, or “cat a
> b c | grep xyz | sort | uniq”.
>
> Now suppose we get rid of the unix pipe notation and instead replace
> it with a pure functional notation, e.g. (uniq (sort (grep xyz (cat a
> b c)))),
When you think about this more deeply (i.e. at all) you run into nasty
details. These programs pass their output to each other, but they also
have a return value (termination status) which isn't passed through
the pipeline. And they take arguments, but the arguments of a pipeline
element are not derived from the previous pipeline element.
Your call (grep xyz (cat ...)) means to pass the output of cat as the
third argument of grep.
In the POSIX shell, this would be coded using, guess what, Lisp-like
syntax:
grep xyz $(cat ...)
Taking it further:
uniq $(sort $(grep xyz $(cat a b c)))
> or enrich it with a composition function and a pure function
> construct (λ), so this example can be written as: ((compose (lambda
> (x) (grep xyz x)) sort uniq) (cat a b c)).
I posted a filter macro in comp.lang.lisp which expresses function
chaining in a left-to-right notation. Look for it.
In this notation, you might write:
(filter 3 (expt _ 2) (* 4))
which means: start with 3, raise it to the power of 2 to obtain 9, and
then multiply by 4 to get 36.
The underscore indicates the argument position where the output of the
previous pipeline element is to be inserted when calling the next
pipeline element. The default is to add it as a rightmost argument.
The macro has features for dealing with splitting and recombining
lists, and handling multiple values.
This is still the same ``pure nested'' syntax; it merely expresses
function chaining differently.
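A toy macro in the same spirit (hypothetical name FILTER*; a
simplification for illustration, not the posted filter macro, and
without its list-splitting or multiple-value features) shows the idea:

```lisp
;; Each form receives the previous result at the _ position,
;; or as the rightmost argument when no _ appears.
(defmacro filter* (init &rest forms)
  (reduce (lambda (acc form)
            (if (member '_ form)
                (substitute acc '_ form)   ; put result where _ sits
                (append form (list acc)))) ; default: rightmost argument
          forms :initial-value init))

;; (filter* 3 (expt _ 2) (* 4))
;; macroexpands to (* 4 (expt 3 2)), i.e. 36, as described above.
```

SUBSTITUTE here only replaces a top-level _, which is enough for the
sketch.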
What was that about notation limiting language utility?
Yes. The Mathematica language really brings home the fact that non-trivial
syntax is good. In particular, it does an excellent job of mimicking
conventional mathematical notation. Arguing in favor of Lisp syntax is like
advocating the use of cave painting...
Also, note that Mathematica provides strictly more in the way of macros.
--
Dr Jon D Harrop, Flying Frog Consultancy
The F#.NET Journal
http://www.ffconsultancy.com/products/fsharp_journal/?usenet
If you're being pedantic, you may mean "it is an uncommon practical
problem". However, the problem extends beyond Lisp.
Recent discussions have covered the use of pattern matching. In SML, OCaml,
Haskell (I believe) and F# you must write pattern matches over the expr
type in prefix notation. To borrow from Alan's example, the Lisp code:
(destructuring-bind (op1 (op2 n x) y) form
`(* ,n (* ,(simplify x) ,(simplify y)))))
((cons (eql *) *)
(destructuring-bind (op left right) form
(list op
(simplify left)
(simplify right))))
((cons (eql +)
(cons (eql 0)
(cons * null)))
could be written:
| n*x*y -> n*(x*y)
but in ML this must be written as a pattern match over a sum type where the
type constructors must use prefix notation:
| Mul(Mul(n, x), y) -> Mul(n, Mul(x, y))
While this is clearly much better than the Lisp, it would be preferable to
use the mathematical syntax in this case. You can address this in OCaml
using macros and there is a chance that F# will support overloaded
operators in patterns in the future.
Mathematica lets you do:
n x y -> n (x y)
but I don't know of any other languages that do, without forcing you to
reinvent the wheel via macros.
> How Lisp's Nested Notation Limits The Language's Utility
You know, I am all about keeping one's eyes open so as to see the
limitations of the tools one chooses or is forced to use. If more
people were open to the flaws in their OS, religion, editor, and a
myriad of other things the world would surely be a better and more
peaceful place. However, I have to say that all I ever see from you,
Mr. Lee, is complaints and how you want emacs and lisp to change. For
the love of whatever god you choose to pray to, find a frigging tool
that makes you happy. If it is emacs that makes you happy, then change
the things you don't like and be happy but I fail to see how your
whining diatribes serve any useful purpose. If you don't like lisp then
by all means use something else.
No offense but it does get old after a while.
--
Robert D. Crawford rd...@comcast.net
Marriage is learning about women the hard way.
If I wanted to exhibit the benefits of syntax, Mathematica would be
pretty low on my list of languages to do it with. The combination of
C-ish abbreviations (i++, etc.), tricky rules for implicit
multiplication, and a maze of twisty little infix operators makes it
very confusing. It's not Perl, but I'd still pick Lisp's syntax any
day of the week and twice on Sundays.
Cheers,
Pillsy
Xahlee likes to complain. That's the main factor here.
-Miles
--
Run away! Run away!
Xah Lee wrote:
> How Lisp's Nested Notation Limits The Language's Utility
>
> Xah Lee, 2007-05-03
>
> There is a common complaint by programmers about lisp's notation of
> nested parentheses being unnatural or difficult to read. Long-time
> lisp programmers often counter that it is a matter of conditioning,
> and/or blame the use of “inferior” text editors that are not
> designed to display nested notations. In the following, I describe how
> lisp notation is actually a problem, on several levels.
>
> (1) Some 99% of programmers are not used to the nested parenthesis
> syntax. This is a practical problem. On this aspect alone, lisp's
> syntax can be considered a problem.
Simply wrong. What would be a problem would be if they tried it and
could not quickly get used to it.
>
> (2) Arguably, the pure nested syntax is not natural for humans to read.
> Long-time lispers may disagree on this point.
No code is natural to read, and yes, I have written a ton of COBOL.
Neither are legal or scholarly natural language texts natural to read.
>
> (3) Most importantly, a pure nested syntax discourages frequent or
> advanced use of function sequencing or compositions. This aspect is
> the most devastating.
The example you chose was a command line, where conciseness rules and
where one is not likely to encounter refactoring or macros. I.e., the
syntax for a command line has nothing to do with the syntax for serious
programming.
sexpr notation provides a hierarchical lexical grouping that maps
isomorphically onto the functional semantics, and the latter is what I
need to rearrange when refactoring, so it kinda helps that the code I
must edit to achieve said rearrangement maps so directly to the semantics.
I think the biggest argument in favor of parens is all the people that
want them to go away. Bad features just die out (see C++ and Java).
kzo
--
http://www.theoryyalgebra.com/
"Algebra is the metaphysics of arithmetic." - John Ray
"As long as algebra is taught in school,
there will be prayer in school." - Cokie Roberts
"Stand firm in your refusal to remain conscious during algebra."
- Fran Lebowitz
"I'm an algebra liar. I figure two good lies make a positive."
- Tim Allen
The first time I saw graph-theory notation, I thought it was
gibberish. I still think most mathematical notation is gibberish, but
I deal with it --- I don't turn in proofs written in prose.
> (2) Arguably, the pure nested syntax is not natural for humans to read.
> Long-time lispers may disagree on this point.
And mathematical notation is not natural for people to read, at least
not anybody who grew up learning a natural language. But people deal
with it. They're remarkably adaptable this way.
> vectorAngle[{a1_, a2_}] := Module[{x, y},
> {x, y} = {a1, a2}/Sqrt[a1^2 + a2^2] // N;
> If[x == 0, If[Sign@y === 1, π/2, -π/2],
> If[y == 0, If[Sign@x === 1, 0, π],
> If[Sign@y === 1, ArcCos@x, 2 π - ArcCos@x]
> ]
> ]
> ]
Speaking of gibberish. I can't believe anybody would hold up
Mathematica as an example of good programming. Next you'll be telling
me that 99% of Matlab code isn't really crap!
> The third issue, about how nested syntax seriously discourages
> frequent or advanced use of inline function sequencing on the fly, is
> the most important and I'll give further explanation below.
I consider this to be a Good Thing (TM). Unix shell commands read like
line noise.
> How Lisp's Nested Notation Limits The Language's Utility
>
> Xah Lee, 2007-05-03
>
> There is a common complaint by programmers about lisp's notation of
> nested parentheses being unnatural or difficult to read. Long-time
> lisp programmers often counter that it is a matter of conditioning,
> and/or blame the use of “inferior” text editors that are not
> designed to display nested notations. In the following, I describe how
> lisp notation is actually a problem, on several levels.
As a practical matter, most LISPers don't even see the parens, but
rather interpret the code according to the indentation.
> (1) Some 99% of programmers are not used to the nested parenthesis
> syntax. This is a practical problem. On this aspect alone, lisp's
> syntax can be considered a problem.
It's certainly different. Whether it is a problem or not depends on
personal or subjective criteria.
> (2) Arguably, the pure nested syntax is not natural for humans to read.
> Long-time lispers may disagree on this point.
Harder to read for some things, easier to read for others.
E.g.
1 + 2 + 3 + 4 + 6 - 3 + 32
or
(- (+ 1 2 3 4 6 32) 3)
> (3) Most importantly, a pure nested syntax discourages frequent or
> advanced use of function sequencing or compositions. This aspect is
> the most devastating.
This is one reason LISP has macros. Don't like the syntax? Make your
own. How many languages make this as easy as LISP does?
> The third issue, about how nested syntax seriously discourages
> frequent or advanced use of inline function sequencing on the fly,
> is the most important and I'll give further explanation below.
>
> One practical way to see how this is so is by considering unix's
> shell syntax. You all know how convenient and powerful unix
> pipes are. Here are some practical examples: “ls -al | grep xyz”,
> or “cat a b c | grep xyz | sort | uniq”.
>
> Now suppose we get rid of the unix pipe notation and instead
> replace it with a pure functional notation, e.g. (uniq (sort (grep
> xyz (cat a b c)))), or enrich it with a composition function and a
> pure function construct (λ), so that this example can be written as:
> ((compose (lambda (x) (grep xyz x)) sort uniq) (cat a b c)).
(pipe (cat a b c) (grep xyz) (sort) (uniq))
'Nuff said.
UNIX pipe `|' is not just a throw-away syntactic separator; anyone
with LISP experience would see it for what it is: a function that
operates on functions.
> Lisp, by sticking with an almost uniform nested parenthesis notation,
> immediately reduces the pattern of sequencing functions, simply
> because the syntax does not readily...<snip>
I would contend that LISP did not do this. You did.
<snip>
> (Note: Lisp's sexp is actually not that pure. It has ad hoc syntax
> equivalents such as the “quote” construct “ '(a b c) ”, and also
> the “`”, “#”, and “,@” constructs, precisely for the purpose of reducing
> parentheses and increasing readability. Scheme's coming standard,
> R6RS, even proposes the introduction of [] and {} and a few other
> bits of syntax sugar to break the uniformity of nested parentheses for
> legibility. Mathematica's FullForm is as pure a nested notation
> as can be.)
Some functions are used so often, they have shorthand equivalents.
This is a feature in many languages. But if for some reason one
wanted the sexp to be "pure," nothing's stopping him from using the
fully-parenthesized versions.
> -------
> The above is part of a 3-part exposition:
> “The Concepts and Confusions of Prefix, Infix, Postfix and Fully
> Functional Notations”,
> “Prefix, Infix, Postfix notations in Mathematica”,
> “How Lisp's Nested Notation Limits The Language's Utility”,
> archived at:
> http://xahlee.org/UnixResource_dir/writ/notations.html
>
> Xah
> x...@xahlee.org
> ∑ http://xahlee.org/
>
I suggest you start to acquaint yourself with LISP before getting too
far into the exposition:
--
Edward
I didn't read your thesis, but the general premise is correct.
Caution: late-night ramblings follow. Proceed with caution.
The solution is simple: do something in Lisp that is nearly impossible
in other languages.
Do something that is anathema to the programmer machismo.
Write an interactive, point-and-click interface that lets users edit
Lisp code as block diagrams, seamlessly transforming back and forth
between s-expressions and widgets.
In another project, write macros that convert Basic code into idiomatic
Lisp.
Provide the user with "UI/syntax macros" that can present code in any
number of forms.
With such frameworks in place, the truth will be exposed: All languages
are "preprocessed Lisp"; their only essential difference is the
preferred syntax and the libraries provided.
Definition: Turing-complete, n., an expression of Lisp.
(I should really get some sleep now.)
Seriously, though. The lisp environment is overwhelming to the new
user. Everything is foreign; they should offer a course in the history
department just so people can understand all the thoughts that resulted
in what we have today.
I "learned programming" by typing hex codes from magazines into a VIC20;
when these didn't work, I finally entered the BASIC checksum program to
help identify where things went wrong. This was foreign, but Commodore
BASIC provided a simple interface for the types of interactive
programming that Lisp excels at. I learned "for loops" by bouncing *'s
across the screen... I can only imagine what would happen if I were
at that age trying the same experiments today... construct a widget, add
a text panel, oops, can't position text, try a blank panel, figure out
which command draws text... 2 years later?
What will future generations learn to program on? The abomination known
as Visual Basic? Perl, Python, and other "scripting languages"? Almost
certainly not Lisp. DrScheme (or whatever had the simple built-in
Windows IDE) was my first introduction to Lisp, and I must say that
Commodore and GRA (? AtariST) basic were much easier to grasp -- largely
due to the linguistic nature of their commands.
Lisp has a full suite of macros for morphing code, but the main efforts
to fix the syntax always seem to result in new compilers being written
(Dylan, anyone?). Why can't we have syntax macros?
We need something that new coders can grasp easily. If it runs on a
thin layer over Lisp, then they will learn to see through the facade --
and hopefully avoid the brain damage encoded in restrictive programming
paradigms enforced by inferior language systems. Plus code written in
the "simple" syntax will be translated by these syntax macros into
idiomatic Lisp code, thus providing an interactive lesson in Lisp.
Treat Lisp as the bytecode of a universal "virtual machine".
I've cobbled together a simple framework inspired by Terence Parr's
Antlr and StringTemplate libraries. These (Mr. Parr's libraries) are
excellent libraries that implement a fairly , ignoring their linguistic
implementation. Something like this in Lisp should be just the key to
providing a standardized syntax macro framework...
/end transmission
___________________________
/| /| | |
||__|| | Please don't |
/ O O\__ feed |
/ \ the trolls |
/ \ \ |
/ _ \ \ ----------------------
/ |\____\ \ ||
/ | | | |\____/ ||
/ \|_|_|/ | __||
/ / \ |____| ||
/ | | /| | --|
| | |// |____ --|
* _ | |_|_|_| | \-/
*-- _--\ _ \ // |
/ _ \\ _ // | /
* / \_ /- | - | |
* ___ c_c_c_C/ \C_c_c_c____________
Sorry for double posting, but I'm getting sick of Xah Lee and Jon
Harrop.
We all get their message: COMMON LISP SUCKS. So please let us live in
our misery
while you make all the great programs in F#, Mathematica, or
whatever ...
From now on I'll ignore both of them for good.
cheers
bobi
Ken Tilton wrote:
«Simply wrong. What would be a problem would be if they tried it and
could not quickly get used to it.»
I think what you said is part of what i said. But let me elaborate.
Lisp's nested parenthesis syntax certainly is a problem, if you want to
attract programmers to lisp and programmers are not attracted by sexp in
the first place.
What you expressed is to consider the question of whether people can
get used to sexp. If programmers cannot adapt to sexp and find it
comfortable as the syntax of a programming language, then you consider
THAT to be a problem of the lisp syntax. Otherwise, no problemo.
I think your vantage point is not so goody. But if we consider the
question itself, i think yes, people can adapt and use sexp reasonably
well for programming. Actually, this has already been done, i guess, by
lispers for 40+ years, right? But if we really focus on this point of
view, namely whether sexp can be suitable as a syntax for a computing
language, then that is rather not an interesting question, i thinky.
Because, in general, people can adapt to all kinds of weird shits,
really depending on how pressing the matter is. If it is life
threatening, or they have no other choice, human animals can adapt.
Imagine, compare: the technology today is far more advanced than 40
years ago. Computers are what, a thousand, ten thousand times faster?
And there was no Perl, Python, Java, or Internet 40 years ago. But
people went to the Moon! Imagine what kind of torturous pain these
people had to go through in software and hardware, or the computation
of mathematics, to do this. BASIC with GOTO and FORTRAN? What? No
structured constructs? No code blocks? No libraries? No symbols? No
macros? Not even Eval()? No USB and DVDs, but punchcards?
You know about Chinese characters, right? To a European, Chinese
characters look torturous. And in fact they are. Just consider the
number of strokes. But i think the Chinese people, 1.3 billion of them,
are surviving.
Sure. I certainly agree human animals can get used to sexp. Under
some conditions, human animals can get used to a lot of things, and
love it too.
See:
• “Body and Mind Modification”
http://xahlee.org/Periodic_dosage_dir/20031129_body_mod.html
• “Difficulty Perceptions in Human Feats”
http://xahlee.org/vofli_bolci/juggling.html
The interesting question would be: to what degree of botchedness can a
computer language syntax go, before programmers will not be able to
adapt to it and code fruitfully? Suppose, for each paren in lisp, we
double it. So “(+ 3 4)” will be written as “((+ 3 4))”. And suppose
we force this to be the new syntax for lisp. I will bet you one
hundred bucks that all lispers will find it exactly as easy to read
as before.
How many levels of parens do you think we can add till all lispers
abandon ship?
How many?
----------------------------
Xah Lee wrote:
«(2) Arguably, the pure nested syntax is not natural for humans to
read. Long-time lispers may disagree on this point.»
Kenny wrote:
«No code is natural to read, and yes, I have written a ton of COBOL.
Neither are legal or scholarly natural language texts natural to
read.»
Huh? Surely you agree that there is at least some qualitative sense in
which some written language is more natural or easier to read?
For example, compare languages A and B, where A is just lisp's sexp,
and B is sexp with every paren replaced by double parens. Now, which
is more “natural” or easy to read?
So, likewise, compare Perl and Python. Which is easier to read in
general? You can't wax philosophy on this, can you?
Imagine a philosophical scenario where a devil killed off all
pythoners on earth. Then perl code would be easier to read, because
nobody would understand a fuck when shown python code. So, the
question of whether Perl or Python code is easier to read depends.
Also, i mean, it depends on the programmer!! Because, if i write Perl
code versus a moron writing in Python, then my Perl code will surely be
easier to read. So, it also depends on me.
What a great tough undecidability!
----------------------------
Xah Wrote:
«(3) Most importantly, a pure nested syntax discourages frequent or
advanced use of function sequencing or compositions. This aspect is
the most devastating.»
Kenny wrote:
«The example you chose was a command line, where conciseness rules and
where one is not likely to encounter refactoring or macros. ie, the
syntax for a command-line has nothing to do with the syntax for
serious programming.»
Yes, the example i gave was unix's shell. It was chosen because it is
simple to illustrate and widely known.
I could give Mathematica examples with explanations... but they are
not as suitable as the simple unix pipe example, because then i'd have
to spend great space explaining Mathematica.
(if anyone is interested in Mathematica syntax, i explained some here
http://xahlee.org/UnixResource_dir/writ/notations.html
)
But anyway, here are a few Mathematica examples from my Plane Tiling
package.
Here's one that's the most simple to understand:
Reflect2DGraphics[{a1_, a2_}, {n1_, n2_}][gra_] :=
(Translate2DGraphics[{n1, n2}])@
(Reflect2DGraphics[{a1,a2}])@
(Translate2DGraphics[-{n1, n2}][gra]);
Basically, you'll see on the right side a sequence of 3 functions.
The left side is a function applied to a function, i.e.
Reflect2DGraphics takes 2 vectors and “returns” a function that takes
one graphics argument. (It doesn't actually return a function, since
it's written as pattern matching... but you can think of it as a
function generator.)
Here's the lispy form some lispers believe is easier to read:
CompoundExpression[
SetDelayed[
Reflect2DGraphics[List[Pattern[a1, Blank[]], Pattern[a2,
Blank[]]],
List[Pattern[n1, Blank[]], Pattern[n2, Blank[]]]][
Pattern[gra, Blank[]]],
Translate2DGraphics[List[n1, n2]][
Reflect2DGraphics[List[a1, a2]][
Translate2DGraphics[Times[-1, List[n1, n2]]][gra]]]], Null]
The following is more code. It involves pure functions, function
application, function mapping, function generators applied to
functions... and basically all done with flat sequencing of functions
when possible (aka function chaining sans nesting). Whenever you see
the ampersand sign &, it's a pure function (aka lambda). Some pure
functions have pure functions as their innards.
These were written in 1997, by yours truly. I would say, if you have
not been coding lisp or Mathematica for, say, 40 hours a week for 4
years, you wouldn't understand this code even when explained. (Not
considering the math it is doing.)
Okie, here's a good one. Lots of sequencing of pure functions on the
fly:
cycledMatrix[{m_Integer, n_Integer}, cycleLists : {_List ..}] :=
Module[{}, (Flatten[({Table[#1, {Quotient[#2, #3]}],
Take[#1, Mod[#2, #3]]} &) @@ ({#, m,
Length@#} &)@#] &) /@
Flatten[(({(Flatten[#, 1] &)@Table[#1, {Quotient[#2, #3]}],
Take[#1, Mod[#2, #3]]} &) @@ ({#, n, Length@#} &)@
cycleLists), 1]];
Btw, the “f@@g” you see is shortcut syntax for “Apply[f,g]”.
Here's the equivalent lispy form, which lispers think is easier to read:
CompoundExpression[
SetDelayed[
cycledMatrix[
List[Pattern[m, Blank[Integer]], Pattern[n, Blank[Integer]]],
Pattern[cycleLists, List[Repeated[Blank[List]]]]],
Module[List[],
Map[Function[
Flatten[Apply[
Function[
List[Table[Slot[1], List[Quotient[Slot[2],
Slot[3]]]],
Take[Slot[1], Mod[Slot[2], Slot[3]]]]],
Function[List[Slot[1], m, Length[Slot[1]]]]
[Slot[1]]]]],
Flatten[Apply[
Function[
List[Function[Flatten[Slot[1], 1]][
Table[Slot[1], List[Quotient[Slot[2],
Slot[3]]]]],
Take[Slot[1], Mod[Slot[2], Slot[3]]]]],
Function[List[Slot[1], n, Length[Slot[1]]]]
[cycleLists]], 1]]]],
Null]
----------------------------------------
Okie, another one, wee!
ColoredLatticeNetwork[n:(4 | 3), m_Integer,
colors:{_List..}, s_, a_, {x_, y_}] :=
Module[{lines, gp}, lines = (Map[Line, #1, {2}] & )[
(Partition[#1, 2, 1] & ) /@
N[Table[{1, 0}*s*i + ({Cos[#1], Sin[#1]} & )[
(2*Pi)/n]*s*j, {j, -m, m}, {i, -m, m}]]];
gp = (Transpose[#1, {3, 1, 2}] & )[
{cycledMatrix[{m*2, m*2 + 1}, (RotateRight[#1, m] & )[
(RotateRight[#1, m] & ) /@ colors]], lines}];
Translate2DGraphics[{x, y}][
Table[Rotate2DGraphics[{0, 0}, ((2*Pi)*i)/n + a][
gp], {i, 0, n - 1}]]];
Here's the equivalent lispy form, which lisp morons insist is easier
to read:
CompoundExpression[
SetDelayed[
ColoredLatticeNetwork[Pattern[n, Alternatives[4, 3]],
Pattern[m, Blank[Integer]],
Pattern[colors, List[Repeated[Blank[List]]]], Pattern[s,
Blank[]],
Pattern[a, Blank[]], List[Pattern[x, Blank[]], Pattern[y,
Blank[]]]],
Module[List[lines, gp],
CompoundExpression[
Set[lines,
Function[Map[Line, Slot[1], List[2]]][
Map[Function[Partition[Slot[1], 2, 1]],
N[Table[
Plus[Times[List[1, 0], s, i],
Times[Function[List[Cos[Slot[1]], Sin[Slot[1]]]]
[
Times[Times[2, Pi], Power[n, -1]]], s, j]],
List[j, Times[-1, m], m], List[i, Times[-1, m],
m]]]]]],
Set[gp, Function[Transpose[Slot[1], List[3, 1, 2]]][
List[cycledMatrix[List[Times[m, 2], Plus[Times[m, 2],
1]],
Function[RotateRight[Slot[1], m]][
Map[Function[RotateRight[Slot[1], m]], colors]]],
lines]]],
Translate2DGraphics[List[x, y]][
Table[Rotate2DGraphics[List[0, 0],
Plus[Times[Times[Times[2, Pi], i], Power[n, -1]], a]]
[gp],
List[i, 0, Plus[n, -1]]]]]]], Null]
--------------------------------
Okie, this one is a simple one.
StarMotif[n_Integer, coords_?MatrixQ, s_, α_, {x_, y_}] :=
Polygon@(Flatten[#, 1] &)@
Transpose@
Map[Table[{Cos[t + Last@#], Sin[t + Last@#]}*First[#]*s +
{x,
y}, {t, α, 2*Pi + α - 2*Pi/n/2, 2*Pi/n}] &,
coords];
Now, the easier-to-read lisp form follows:
CompoundExpression[
SetDelayed[
StarMotif[Pattern[n, Blank[Integer]],
PatternTest[Pattern[coords, Blank[]], MatrixQ], Pattern[s,
Blank[]],
Pattern[α, Blank[]],
List[Pattern[x, Blank[]], Pattern[y, Blank[]]]],
Polygon[Function[Flatten[Slot[1], 1]][
Transpose[
Map[Function[
Table[Plus[
Times[List[Cos[Plus[t, Last[Slot[1]]]],
Sin[Plus[t, Last[Slot[1]]]]], First[Slot[1]],
s],
List[x, y]],
List[t, α,
Plus[Times[2, Pi], α,
Times[-1,
Times[2,
Times[Times[Pi, Power[n, -1]], Power[2,
-1]]]]],
Times[2, Times[Pi, Power[n, -1]]]]]], coords]]]]],
Null]
--------------
This one is semantically and geometrically complex!
SlitLine[Line[pts_List /; (Length@pts > 2)], cuttingLine : {{_, _},
{_, _}}] :=
Module[{temp},
If[SameQ @@ (temp = PointOnLeftSideQ[#, cuttingLine] & /@ pts),
Return@List@Line@pts];
temp = MapThread[
And[#1, #2] &, {(UnsameQ @@ # &) /@
Partition[temp, 2,
1], (PointOnLeftSideQ[First@cuttingLine, #] =!=
PointOnLeftSideQ[Last@cuttingLine, #] &) /@
Partition[pts, 2, 1]}];
temp = Append[#, Null] &@
MapThread[
If[#2, 0@LineIntersectionPoint[#1, cuttingLine],
Null] &, {Partition[pts, 2, 1], temp}];
temp = (DeleteCases[#, Null, {1}] &)@Flatten[Transpose@{pts, temp},
1];
(Line /@
ReplaceRepeated[
List@temp, {a__, 0[pt_], b__} :> Sequence[{a, pt}, {pt,
b}]])]
Now, the spectacular lisp form:
SetDelayed[
SlitLine[Line[
Condition[Pattern[pts, Blank[List]], Greater[Length[pts],
2]]],
Pattern[cuttingLine,
List[List[Blank[], Blank[]], List[Blank[], Blank[]]]]],
Module[List[temp],
CompoundExpression[
If[Apply[SameQ,
Set[temp,
Map[Function[PointOnLeftSideQ[Slot[1], cuttingLine]],
pts]]],
Return[List[Line[pts]]]],
Set[temp,
MapThread[Function[And[Slot[1], Slot[2]]],
List[Map[Function[Apply[UnsameQ, Slot[1]]],
Partition[temp, 2, 1]],
Map[Function[
UnsameQ[PointOnLeftSideQ[First[cuttingLine],
Slot[1]],
PointOnLeftSideQ[Last[cuttingLine], Slot[1]]]],
Partition[pts, 2, 1]]]]],
Set[temp,
Function[Append[Slot[1], Null]][
MapThread[
Function[
If[Slot[2], 0[LineIntersectionPoint[Slot[1],
cuttingLine]],
Null]], List[Partition[pts, 2, 1], temp]]]],
Set[temp,
Function[DeleteCases[Slot[1], Null, List[1]]][
Flatten[Transpose[List[pts, temp]], 1]]],
Map[Line,
ReplaceRepeated[List[temp],
RuleDelayed[
List[Pattern[a, BlankSequence[]], 0[Pattern[pt,
Blank[]]],
Pattern[b, BlankSequence[]]],
Sequence[List[a, pt], List[pt, b]]]]]]]]
If you want to see more, you can download my package at
http://xahlee.org/MathGraphicsGallery_dir/PlaneTilingPackageDemo_dir/planeTilingPackageDemo.html
Free of charge.
--
Free games and programming goodies.
http://www.personal.leeds.ac.uk/~bgy1mm
You might be interested in the page 10 Rules of Computer Programming on my
website which deals with psychological issues in programming. Lisp breaks
the rule of three in spades.
I fully agree. Neither of them has any (!) clue about Lisp and programming in Lisp.
Neither of them has written any significant or even non-significant Lisp program.
Their postings are full of (wrong) assumptions and are mostly based on
no experience and no knowledge of Lisp programming. Both are only trolling
in comp.lang.lisp. Best to post a warning sign and for others to ignore them...
Lisp doesn't have a problem here. Lisp is a programming language, an
inanimate, dispassionate, virtual concept which doesn't (cannot!) "care"
whether it is used at all.
Only people can have problems with something. For example, programmers
who want some of the cool features of Lisp but don't want to put up with
its syntax.
However, for decades, there have been programmers who have learned to
prefer Lisp's syntax over anything else. As long as they continue to
exist, Lisp dialects will continue to exist. Those people don't have a
problem - to the contrary.
Leave them alone - they are happy with their choice.
If you don't like Lisp, there are numerous attempts to copy its features
using other surface syntaxes. Just pick one.
Pascal
--
My website: http://p-cos.net
Common Lisp Document Repository: http://cdr.eurolisp.org
Closer to MOP & ContextL: http://common-lisp.net/project/closer/
> "Miles Bader" <mi...@gnu.org> wrote in message
> news:873b2cx...@catnip.gol.com...
>> "Robert D. Crawford" <rd...@comcast.net> writes:
>>> If it is emacs that makes you happy, then change the things you don't
>>> like and be happy but I fail to see how your whining diatribes serve
>>> any useful purpose. If you don't like lisp then by all means use
>>> something else.
>>
>> Xahlee likes to complain. That's the main factor here.
>>
> There is thought and intelligence in his posts. However they are
> provocative in both the good and the bad sense of the term, and rather
Where is the good sense in there? We have (at c.l.f) people
complaining about parentheses in lisp every 3 months and the arguments
never get any better or new. Rehashing all that once again doesn't
attest to "thought and intelligence" but rather to a trollish streak.
Regards -- Markus
>> (1) Some 99% of programers are not used to the nested parenthesis
>> syntax. This is a practical problem. On this aspect along, lisp's
>> syntax can be considered a problem.
>
> The first time I saw graph-theory notation, I thought it was
> gibberish. I still think most mathematical notation is gibberish, but
> I deal with it --- I don't turn in proofs written in prose.
Right.
>
>> (2) Arguably, the pure nested syntax is not natural for human to read.
And BTW, it's not natural for humans to read at all.
:-)
>> Long time lispers may disagree on this point.
>
> And mathematical notation is not natural for people to read, at least
> not anybody who grew up learning a natural language. But people deal
> with it. They're remarkably adaptable this way.
Regards -- Markus
Indeed, and people have been saying what Xahlee's saying since the
_1950s_! Lisp still has the parens, and there are good reasons for that.
-Miles
--
[|nurgle|] ddt- demonic? so quake will have an evil kinda setting? one that
will make every christian in the world foamm at the mouth?
[iddt] nurg, that's the goal
That suggests that the "rule of three" is wrong...
-Miles
--
o The existentialist, not having a pillow, goes everywhere with the book by
Sullivan, _I am going to spit on your graves_.
> You might be interested in the page 10 Rules of Computer Programming
> on my website which deals with psychological issues in
> programming. Lisp breaks the rule of three in spades.
Rule three is about indirection, not about nesting. The other rules
suck also. If your "Refutation of Common Atheist Arguments" has a
similar quality, the atheists will have a field day.
Regards -- Markus
Xah Lee wrote:
> Xah Lee wrote:
> «(1) Some 99% of programers are not used to the nested parenthesis
> syntax. This is a practical problem. On this aspect along, lisp's
> syntax can be considered a problem.»
>
> Ken Tilton wrote:
> «Simply wrong. What would be a problem would be if they tried it and
> could not quickly get used to it.»
>
> I think what you said is part of what i said. But let me elaborate.
>
> Lisp's nested parenthesis certainly is a problem, if you want to
> attract programers to lisp and programers are not attracted by sexp in
> the first place.
Ah, but now you are talking about marketing, and I happen to believe
that one does not win at marketing by throwing out one of the best
features of the language.
Do things well and people will find you. Why else has Lisp recovered
from the AI Winter? When has a computer language ever made a comeback?
Below you suggest I may be blinded by perspective. My question to you
is, do /you/ appreciate the importance of parens? If not, your
perspective is the problem.
>
> What you expressed, is to consider the question of whether people can
> get used to sexp. ...snip... Then, that is rather not an interesting question, i think.
> Because, in general, people can adapt to all kinds of weird shits,
I understand how you went wrong, but you have read something into what I
wrote and are now off on a tangent by yourself, complete with
bibliography <g>. Nowhere did I say people would have to struggle to
appreciate parens, what I said was that it is not interesting if people
who have never tried Lisp complain about the parens. The only
interesting respondent is one who has tried Lisp for some reason, and no
such person has ever had a problem with the parens. You are reacting to
word of mouth from people who have never tried Lisp, and that means
nothing. Give someone a microphone, they will say something.
But this is all Usenet hair-splitting. The bottom line is this: all
Lispers learned parens at some point, and know from experience that it
was easy for them, that it would be easy for anyone, and that parens are
an absolute advantage. If you disagree on any of those points we likely
will not find common ground.
[a bunch more on the mistaken idea that parens are an acquired taste]
> The interesting question would be, to what degree of botchedness of
> computer language syntax can be, until programers will not be able to
> adapt it and code it fruitfully? Suppose, for each paren in lisp, we
> double it. So, “(+ 3 4)” will be written as “((+ 3 4))”. And suppose
> we force this to be the new syntax for lisp. I will bet you one
> hundred bucks, that all lispers will find it exactly as easy to read
> as before.
You seem deeply convinced that parens /are/ a problem? Perhaps the story
here is that they /do/ confuse you. Anyway, the above is thoroughly
wrong. Lispers hate line noise, which is why they love parens and why
they use macros such as my BIF, which changes:
(let ((x (look-up y)))
(when x (print (list x 42))))
to:
(bwhen (x (look-up y))
(print (list x 42)))
Come to think of it, one reason I like loop is that it lets me express
iteration without so many parens. I guess I am making a tradeoff in
which I accept locally a burden to think about syntax in return for
being able to dash off commonly written code (which commonness means the
syntax will be second-nature). And a big point you have missed: I still
get decent auto indentation of my code even with the whacky loop syntax.
Note, btw, that your second set of parens adds nothing. The first
brilliantly captures lexically that the operator along with its operands
forms a meaningful semantic chunk....hang on.... eureka! So do the
arguments, so (+ 2 3) should really be (+ (2 3)). Ah, but then (+ 40 (-
4 2)) becomes (+ (40 (- (4 2))))... well, I guess Lispers /do/ worry
about readability. :)
Meanwhile, you owe me $100. Paul Graham, at ILC '2003 (? In NYC), said
one problem he wanted to fix with Arc was that Lisp had too many
parentheses. He picked on COND as an example.
But only because you think the only thing that makes a computer language
readable is experience with it. And you are stuck on the idea of
readability being important, another big mistake. We /write/ code, we do
not read it. The important thing is to make the writing easy. The parens
make writing code (guessing) twice as easy, certainly refactoring.
Another mistake: code ever being as readable as Stephen King. Nope, with
code one /always/ has to slow down to figure out what is going on.
I think you are also missing that the parens also mean there is a ton of
syntax and precedence I do not need to learn or make mistakes on. One
simple rule and I am good to go.
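The "one simple rule" point can be made concrete: the entire surface grammar of sexp is "an expression is an atom, or a parenthesized list of expressions". Here is a minimal reader sketch in Python (illustrative only, not code from any post in this thread; the function names are my own):

```python
# A minimal s-expression reader. The whole grammar is one rule:
# an expression is an atom, or a parenthesized list of expressions.
# Illustrative sketch only.

def tokenize(source):
    """Split an s-expression string into parens and atoms."""
    return source.replace("(", " ( ").replace(")", " ) ").split()

def read(tokens):
    """Read one expression, consuming tokens from the front."""
    token = tokens.pop(0)
    if token == "(":
        expr = []
        while tokens[0] != ")":
            expr.append(read(tokens))
        tokens.pop(0)  # drop the closing ")"
        return expr
    return token  # an atom

print(read(tokenize("(+ 40 (- 4 2))")))
# → ['+', '40', ['-', '4', '2']]
```

Every Lisp form, from (+ 3 4) to a page-long defun, is read by this same recursion; there is no precedence table to consult.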
You need to remember, Xah, that any Lisper is master also of other
computer languages, because we likely pay the bills with something else.
You seem to have a mental model of us in which the only thing we know is
Lisp.
>
> ----------------------------
> Xah Wrote:
> «(3) Most importantly, a pure nested syntax discourages frequent or
> advanced use of function sequencing or compositions. This aspect is
> the most devastating.»
>
> Kenny wrote:
>
> «The example you chose was a command line, where conciseness rules and
> where one is not likely to encounter refactoring or macros. ie, the
> syntax for a command-line has nothing to do with the syntax for
> serious programming.»
>
> Yes the example i gave was unix's shell. It is chosen because it is
> simple to illustrate and widely known.
Well, the problem with examples is the problem with analogies: even if
we agree with the proffered case, what does that add to the discussion?
How well does it apply? What does it say? It just opens up a whole new
thread. The second set of parens is different from the first, a
command-line is not a 3gl with pages of text (and to head you off, if I
was writing a ten-line shell script, yes, I would want sexprs).
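The function-sequencing point under discussion (Xah's unix-pipe example) contrasts nested application, which reads inside-out, with left-to-right chaining, which reads like a shell pipeline. A hypothetical Python sketch of the two spellings of the same composition (the `pipe` helper is my own, not from any post here):

```python
# Nested application reads inside-out; a pipe helper reads left to
# right, like "echo 3 | double | inc | double" in a shell.
# Illustrative sketch only.

def pipe(value, *functions):
    """Thread value through functions, left to right."""
    for fn in functions:
        value = fn(value)
    return value

double = lambda n: n * 2
inc = lambda n: n + 1

nested = double(inc(double(3)))         # inside-out: 14
chained = pipe(3, double, inc, double)  # left to right: 14
```

Both spellings denote the same tree; the disagreement in the thread is only about which surface order is easier on the reader.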
How on Earth is that Lispy??! Bring the operator inside, then lose the
[]s and commas. Puzzled.
kenny
> Xah Lee wrote:
[...]
> programers are not attracted by sexp in the first place.
No sex-p ?
Why wonder then why so many women see us geeks as wussies ... :(
Perhaps it's not a coincidence that these two seemingly unrelated
arguments are offered by the same person. Both religious belief and a
belief in the essential importance of syntactic sugar require abandoning
Occam's Razor.
Anton
Xah Lee wrote:
« (1) Some 99% of programers are not used to the nested parenthesis
syntax. This is a practical problem. On this aspect along, lisp's
syntax can be considered a problem.»
Pascal Costanza wrote:
«Lisp doesn't have a problem here. Lisp is a programming language, an
inanimate, dispassionate, virtual concept which doesn't (cannot!)
"care" whether it is used at all. Only people can have problems with
something. For example, programmers who want some of the cool features
of Lisp but don't want to put up with its syntax. »
You are actually wrong here, because people can't have a problem.
People are animals, made of meat. They can have a problem when they
get sick, and their body will malfunction or function abnormally. So,
for example, even if sexp has all the problems, lisp morons are still
fine.
Pascal fuckhead wrote:
«However, for decades, there have been programmers who have learned to
prefer Lisp's syntax over anything else. As long as they continue to
exist, Lisp dialects will continue to exist. Those people don't have a
problem - to the contrary.»
existence?
my god... you are quite desperate.
In this thread, a few people already started to post completely off
topic, brainless, pure personal attacks. I think you are still good.
Sometimes i have a problem telling, of those who post brainless things
to my articles, whether they are joking with wanton carelessness, or
if their critical thinking abilities are really that low.
For example, your reply is a good example. I wasn't sure, if you are
just fucking with me, or really, really meant your writing as a valid
argument to my thesis. In this thread, Ken Tilton, also wrote
something that's incredibly stupid. In fact, more or less repeating
what i said, just with antagonism and explicit denial. I take it, that
he acted that way probably because my essay is so cogent and powerful
that it hit a nerve and paralyzed his reasoning faculties and numbed
his nonchalant quip Muse.
There are some others in this thread, who are so hateful, fucking
full of the ugliness of humanity. The replies from Markus E Leypold is
a prime example. I wonder, outside of newsgroups, if he is just a
sophomoron at college.
Other posts in this thread, some sincere ones, are full of utterly
fucking stupid drivel. (or i shall put the polite version: logical
fallacies) The level is so low that i'm really ashamed that they
really came out from the mouths of a class of people in society whose
jobs are to program computers.
(in this regard, see:
The Condition of Industrial Programers
http://xahlee.org/UnixResource_dir/writ/it_programers.html
)
For example, there's this gem: «Programming language syntax shouldn't
be natural for humans to read. Or, rather, this shouldn't be a
requirement which creates technical compromises.»
I think this quote or the concept of it has been around for a while.
If you take it seriously, it's full of MOTHERFUCKING nonsensical,
meaningless, shit. It amounts to a sound bite. Behind it, it wants to
make programers, in particular the elite programers like the lisp
fuckers, to think something along the line: “my syntax is so bizarre
because i'm superior”.
I guess it's lisper's version of “The three chief virtues of a
programmer are: Laziness, Impatience and Hubris”.
Pascal Costanza moron wrote:
«Leave them alone - they are happy with their choice. »
«If you don't like Lisp, there are numerous attempts to copy its
features
using other surface syntaxes. Just pick one. »
Try, perhaps, to calm yourself down. You are not under attack.
Lisp the language is not under attack by my essay. My article,
although it cogently details why lisp syntax is a serious problem at
several levels, does not mean that it is “no good” in some
absolute way. Also, it is far better than essentially all imperative
languages' syntaxes. One can even logically agree with my article
and remain positive, because although lisp perhaps reduces the use of
function sequencing, that by itself is not necessarily a very bad
thing. Function sequencing (aka chaining), although a very
important paradigm, is not necessarily the only way to program, nor
the only way for functional programing. Consider, even today, in the
light of OOP. And Common Lisp, in particular, isn't all that much
concerned about purity of functional programing anyway. So logically,
from this perspective or argument, my essay isn't a damning
certificate.
I'm really sorry i have to say the above paragraph, as if giving a pat
on the back to some crying pussies. But the juvenility, sensitivity,
ignorance, and stupidity, as shown in the responses in this thread,
made me do it.
I was hoping, perhaps due to my clear exposition of the sexp problem
for perhaps the first time, that we could move on to other fruitful
discussions, not necessarily agreeing that it is a problem. For example,
LOGO, Dylan, and Arc all abandoned the nested fuck. I mean, lispers may
disagree, but the people behind these LISP languages are not brainless
fuckheads, are they? Alternatively, i was hoping, perhaps there will
be fruitful discussions in the direction of technical issues of
transforming syntax, i.e. of Mathematica's f[a,b] and lisp's (f a b).
Or, perhaps some technically knowledgeable people here (there are many)
can by way of chat impart some insight about technical
connections between syntaxes and semantics (apart from myself). For
example, Dylan, i believe, was once believed to have sexp
underneath. If so, that would be interesting because it gives us an
actual, existing example of a lang with sexp underneath, an alternative
to Mathematica's way. But also, lisp can actually relatively easily
create a layer of syntax on top of sexp...
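The point about f[a,b] versus (f a b) is mechanical: both notations serialize the same tree, so translating between them is a small recursive function. A toy sketch in Python (a hypothetical helper of my own, assuming the expression has already been read into nested lists):

```python
# Render a nested-list sexp tree in Mathematica-style f[a, b]
# notation, to show the two syntaxes carry the same structure.
# Illustrative sketch only; a real translator must also handle
# strings, quoting, operators, reader macros, etc.

def to_fullform(expr):
    """Render ['f', args...] nested lists as f[a, b, ...]."""
    if isinstance(expr, (list, tuple)):
        head, *args = expr
        return head + "[" + ", ".join(to_fullform(a) for a in args) + "]"
    return str(expr)

# (f a (g b c))  ->  f[a, g[b, c]]
print(to_fullform(["f", "a", ["g", "b", "c"]]))
```

The inverse direction is the same recursion run the other way, which is presumably why languages like Mathematica and (reportedly) early Dylan could keep a sexp-like tree underneath a different surface.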
Really, you people need to learn. Be calm. Keep your mind open. I'm
not attacking you or attacking lisp. If there's any attacking, it is
your immature and unhealthy nerve.
PS By the way, i also responded to an article recently about why Perl and
Python programers today should learn lisp.
(
“Why learn lisp when there are python and perl?”, Xah Lee, 2007-05-03.
http://xahlee.org/UnixResource_dir/writ/wl_lisp.html
)
Can a lisper write a open letter thanking me for that please?
Very good. Now, off you go and design and implement an alternative
syntax. It's easy, it's been done hundreds of times. I think I have
two or three limited examples on my web site and several more I never
bothered publishing. There must, as I say, be hundreds of examples
around and probably thousands when you count all the non-published
ones. So pick the best and use them, or, since you're by far the
smartest person here, why not come up with your own? Jon Harrop can
probably help you.
I am just a Lisp newbie, quietly lurking in this newsgroup,
occasionally asking questions. Some of these questions are
elementary, doubtless asked a zillion times before (even though I
always search the archives), but kind people in this newsgroup always
give me a lot of helpful answers and hints.
Even though I cannot prevent it, I hate to see them being called names
by a disgruntled teenager (or someone with an equivalent level of
social skills).
So far, I have been using Google Groups, but that has no filtering.
Could somebody please recommend a reliable and cheap (or a convex
combination of the two ;-) news server? I have seen them mentioned in
the past before, but could not find them. Then I could use mutt or
something else to read news, and filter certain messages.
Thanks,
Tamas
PS: Sorry for the OT post, but this thread finally made it clear that
Google Groups (a great free service) is not the perfect solution for
increasing the signal/noise ratio.
On May 5, 11:52 am, Xah Lee <x...@xahlee.org> wrote:
> Dear Pascal moron,
>
[snip]
Be cool, and if you do think someone is using ad hominem arguments,
tersely point it out and return to the main points. Don't get
abusive.
Mark
> The Condition of Industrial Programers
> http://xahlee.org/UnixResource_dir/writ/it_programers.html
> Could somebody please recommend a reliable and cheap (or a convex
> combination of the two ;-) news server? I have seen them mentioned in
> the past before, but could not find them.
I use http://news.individual.net/ and am pretty happy with their service.
Tamas wrote:
> Ouch.
>
> I am just a Lisp newbie, quietly lurking in this newsgroup,
> occasionally asking questions. Some of these questions are
> elementary, doubtless asked a zillion times before (even though I
> always search the archives), but kind people in this newsgroup always
> give me a lot of helpful answers and hints.
>
> Even though I cannot prevent it, I hate to see them being called names
> by a disgruntled teenager (or someone with an equivalent level of
> social skills).
We don't need your stinking concern. <kidding!>
Don't worry, Xah is OK. Google applications, agreed, are not. I use
Mozilla Thunderbird on win32 for mail and news (I have the mail from my
google account forwarded to my ISP mail address so I still get the
portable address and junk filtering).
kenneth
The rule of three was written before I was familiar with Lisp, and so you
might say that Lisp is a counter-example. I don't think it is, because the
first thing that everyone says about Lisp is "I can't do with all those
parentheses". In fact Lisp programmers use indentation to try to impose
another hierarchy of structure on the program, something they wouldn't need
to do if humans could glance at a long expression and match up parentheses.
>I think this quote or the concept of it has been around for a while.
>If you take it seriously, it's full of --------- nonsensical,
>meaningless, ----. It amounts to a sound bite. Behind it, it wants to
>make programers, in particular the elite programers like the lisp
>-----, to think something along the line: “my syntax is so bizarre
>because i'm superior”.
>
Now I also disagreed with Kaz on that one. Compare my post with yours, and
consider why I had to delete so many words in quoting you. Who do you
think is more likely to convince Mr Kylheku that he is in error?
[...]
>
> But only because you think the only thing that makes a computer language
> readable is experience with it. And you are stuck on the idea of
> readability being important, another big mistake. We /write/ code, we do
> not read it. The important thing is to make the writing easy. The parens
> make writing code code (guessing) twice as easy, certainly refactoring.
We write /and/ read code. Sometimes, especially on large multi-team
projects, some of us spend more time reading it than writing it. One of
the things I have to keep pounding into the heads of our programmers is
that you aren't writing code for yourself and you certainly aren't
writing it for the computer -- you're writing it for other programmers.
That's why it's so important to remember rules such as:
1) "Write clearly -- don't be too clever"
2) "Say what you mean, simply and directly"
3) "Parenthesize to avoid ambiguity"
4) "Use the telephone test for readability"
And so on, from "The Elements of Programming Style" by Kernighan and
Plauger. [1]
We have engineers who, I kid you not, have used 9 lines of C code where
1 would do. Think that double and triple negatives are a reasonable way
of writing conditionals. Think that liberal use of temporary variables
is an excellent way of expressing oneself. And so on.
This, of course, is true of all programming languages, not just Lisp.
So, while experience in the language contributes to readability, it's
just as important (if not more so) that the code is written well. Joyce
certainly was experienced in English but I still find "Ulysses" a
dreadful chore.
-----
[1] I'd gladly buy a version of this book targeted to Lisp.
>
> Another mistake: code ever being as readable as Stephen King. Nope, with
> code one /always/ has to slow down to figure out what is going on.
>
> I think you are also missing that the parens also mean there is a ton of
> syntax and precedence I do not need to learn or make mistakes on. One
> simple rule and I am good to go.
>
> You need to remember, Xah, that any Lisper is master also of other
> computer languages, because we likely pay the bills with something else.
> You seem to have a mental model of us in which the only thing we know is
> Lisp.
>
Lee is suffering from the notion that there is a "natural" way of
expressing oneself; the "natural" way, of course, being what he's used
to. The only "natural" language is the one spoken by God and I have the
feeling that He's an omniglot. Lisp, with its parentheses, is no more
unnatural than Greek. And I like Greek. I can say things in it that I
can't say in English. Lee might as well complain that Greek uses an
"unnatural" alphabet, "funny" punctuation (like two kinds of "breathing
marks"), and an incredibly complex system of verb stems.
Lisp is the most elegant computer language I know, because it allows me
to express thoughts that are difficult to express in other languages.
Part of the reason for this is its syntax, just like one of the reasons
I can say things in Greek that I can't say in English are the different
verb tenses.
I also like Lisp because I can express more with less. Our company
decided to try PSP (Personal Software Process) using two teams as guinea
pigs. Lots of fanfare, classes, etc... Of course it turned out to be,
as Scott Adams might say, "a dead woodchuck". But as part of their
training, engineers had to code a set of eight exercises and collect
various metrics about the code. I think everyone used C; I certainly
did because I wanted to compare my time/size/defect metrics with the
other engineers. In order to improve my Lisp-fu, I later re-did the
exercises in Lisp. I was blown away by how much faster I could work.
The first exercise, for example, took 189 lines of C code but 51 lines
of Lisp. And the Lisp code did more; it had a built-in test suite that
the C code did not.
Criticising a language based upon one small point of punctuation shows
that the critic doesn't know anything about languages. Best not to pay
any attention to them.
> >
> > ----------------------------
> > Xah Wrote:
> > «(3) Most importantly, a pure nested syntax discourages frequent or
> > advanced use of function sequencing or compositions. This aspect is
> > the most devastating.»
> >
> > Kenny wrote:
> >
> > «The example you chose was a command line, where conciseness rules and
> > where one is not likely to encounter refactoring or macros. ie, the
> > syntax for a command-line has nothing to do with the syntax for
> > serious programming.»
> >
> > Yes the example i gave was unix's shell. It is chosen because it is
> > simple to illustrate and widely known.
>
> Well, the problem with examples is the problem with analogies: even if
> we agree with the proffered case, what does that add to the discussion?
> How well does it apply? What does it say? It just opens up a whole new
> thread. The second set of parens is different from the first, a
> command-line is not a 3gl with pages of text (and to head you off, if I
> was writing a ten-line shell script, yes, I would want sexprs).
>
> >
> > I could give Mathematica examples with explanations... but its not
> > suitable as the simple unix pipe example because then i'll have to
> > spend great space explaining Mathematica.
> >
[...]
So what? Given _any_ language, human or machine, one can find an
example that is easily expressible in one but not in another. That's
one reason why we keep inventing languages. This will never stop unless
we can no longer find new things to express.
There ya go. Natural language is wonderfully ambiguous. We can lump a
lot of things into readability, including typeface and justification. In
this thread we of course must narrow it down to things language
specific. You have failed, and Mr. Bradshaw will be around later to pick
you up.
> Criticising a language based upon one small point of punctuation shows
> that the critic doesn't know anything about languages. Best not to pay
> any attention to them.
We need something to drown out the incessant threads about getting free
software to work.
:)
kxo
«I think *you* need to cool down Xah Lee. I agree it can be annoying
if you make a critical analysis of Lisp and people attack *you*. This
is called an ad hominem argument, and, sadly, they are all too common
on usenet. ...»
«But calling people names like 'moron' is not the right way to win
hearts and minds....»
Exactly. I've been using online forums since 1990. I've been using
newsgroups since maybe 1994. If i wanted to play politics, given my IQ
and my knowledge of computing, and great skill of composition, i might
be a fucking George Bush by now, with a massive army of morons at my
command, not like, the likes of some computing industry leaders.
Understand, politeness is an intrinsic element of politics. (and
politics means power-struggle) Politeness arose from the existence of
hostility and aggression. Without aggression, there is no politeness.
One easy way to see this is to notice how the degree of politeness is
inversely proportional to how close the parties involved are. There
will be little politeness between your sisters, wives, or parents.
There will be huge politeness when big stakes are involved between
unbound human animals.
The computing newsgroups, made of males, have been brawling just about
since they began. Your few words of politeness and lecturing are one
additional off-topic drivel in their history. (though, i appreciate your
kindness and your knowledge)
In this thread, i did not start four-letter words before NUMEROUS
fuckheads began spouting purely unreasonable, totally off-topic,
personal attacks on me in a way more serious than name calling.
(because, it smacks of witch-hunting) (they naturally did this,
because they perceived that i “attacked” their computer language. This
is because the computing language newsgroup tech geekers are utterly
ignorant of sociology, and, combined with male power-struggling
instinct, have harbored the thought about “trolling”, and some ethical
ideas about what one can or cannot say about others' computer
languages. Utterly bizarre.)
This doesn't happen to just me. It's just a normal day in newsgroups.
Whoever, may it be a professor, or eminent mathematician, or professional
senior programer (as many are here), as long as he is a long-time
newsgroup user, may have been involved in verbal abuse given or taken.
(and each party is fully aware of the other's social standing, but it
doesn't matter. (e.g. in this very thread, a few started to gang up on
Jon Harrop, who is an author of a few functional language books)) It
is, the culture of newsgroups, so to speak. In a way, it is how people
WANT newsgroups to be the way they are. You knew it, i knew it, and we
should not deny it. If you are concerned about this state of affairs
of newsgroups, you need to rethink it. Not the same old
instinctive good intentions. George motherfucking Bush has good
intentions and 50k to 300k Islamic civilians are dead. I'm sure you
are a kind man. Thousands of kind men existed in newsgroups before
you. They all have, at one time or another or more, given kind
advice, and contributed to what newsgroups are today.
I recommend a few articles. These articles, if all tech geekers read
them, will help improve the state of affairs of newsgroups.
• What Desires are Politically Important? by Bertrand Russell
http://xahlee.org/Periodic_dosage_dir/_p2/russell-lecture.html
• Demonic Males. Apes and the Origins of Human Violence.
By Richard Wrangham and Dale Peterson
http://xahlee.org/p/demonic_males.html
Nice meeting you.
I use http://teranews.com/. $3.95 for a lifetime account with a 50mb
daily limit, which is more than enough for the comp.lang.* reading i do.
Cheers,
drewc
>
> Thanks,
>
> Tamas
>
> PS: Sorry for the OT post, but this thread finally made it clear that
> Google Groups (a great free service) is not the perfect solution for
> increasing the signal/noise ratio.
>
> On May 5, 11:52 am, Xah Lee <x...@xahlee.org> wrote:
>> Dear Pascal moron,
>>
> [snip]
>
--
Posted via a free Usenet account from http://www.teranews.com
No. No, they aren't. The Lisp community is actually more interested in
other languages than almost any other programming language community.
This is why I continue to post in c.l.l, because it garners a lot of
interest in my work from the many Lisp users who are still looking for
something better.
Lisp didn't recover from the "AI Winter", it remains very unpopular.
> Below you suggest I may be blinded by perspective. My question to you
> is, do /you/ appreciate the importance of parens? If not, your
> perspective is the problem.
Most people disagree.
>> Here's the equivalent lispy form; lispers think it's easier to read:
>>
>> CompoundExpression[
>> SetDelayed[
>> cycledMatrix[
>> List[Pattern[m, Blank[Integer]], Pattern[n, Blank[Integer]]],
>> Pattern[cycleLists, List[Repeated[Blank[List]]]]],
>> Module[List[],
>
> How on Earth is that Lispy??! Bring the operator inside, then lose the
> []s and commas. Puzzled.
It is "lispy" because it is prefix notation. Mathematica is probably the
best counterexample to the claim that parens are important, because it
provides all of the symbolic rewriting (macro) capabilities of Lisp with
the brevity of a decent programming language. Contrary to the claims of
Lispers, Mathematica clearly demonstrates that you can combine macros
with syntax.
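The point above, that Mathematica's FullForm and Lisp's s-expressions are the same prefix-notation tree in different surface syntax, can be sketched in a few lines. This is a minimal illustration (not from the thread): the renderers below are hypothetical helpers, with expressions represented as nested Python tuples.

```python
# Sketch: one prefix tree, two surface syntaxes.
# An expression like Plus[1, Times[2, 3]] / (+ 1 (* 2 3)) is just
# the nested tuple ("Plus", 1, ("Times", 2, 3)).

def to_fullform(expr):
    """Render a nested tuple as Mathematica FullForm: Head[arg, ...]."""
    if not isinstance(expr, tuple):
        return str(expr)
    head, *args = expr
    return f"{head}[{', '.join(to_fullform(a) for a in args)}]"

def to_sexpr(expr):
    """Render the same tree as a Lisp s-expression: (Head arg ...)."""
    if not isinstance(expr, tuple):
        return str(expr)
    return "(" + " ".join(to_sexpr(e) for e in expr) + ")"

tree = ("Plus", 1, ("Times", 2, 3))
print(to_fullform(tree))  # Plus[1, Times[2, 3]]
print(to_sexpr(tree))     # (Plus 1 (Times 2 3))
```

The only difference between the two outputs is bracketing style, which is the substance of the "prefix notation" claim.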
The basic problem is that people coming to Lisp have all kinds
of interesting experiences and beliefs. But Lisp tends to
do things differently. It questions those prior experiences
and beliefs. But we are unwilling to question what we have learned
is true. It is not that the Lisp way is better, but it is
different. Accepting that a different approach to software
development may also make sense seems to be hard.
Once these people get converted, they are the most
active evangelists. ;-)
So the basic first thing when you come to Lisp from other
programming languages and development tools is UNLEARNING.
That's also a reason Lisp once was (is?) popular in computer science
courses. Students come to the university full of
prior knowledge (BASIC, C, ... whatever programming).
Then they have to use Lisp to solve problems. Shock. Horror.
All are suddenly beginners and have to learn something
completely new.
* You work bottom-up, talking to a partially ready program?
No batch compiler? No half-hour turnaround times with
enough time for a long coffee break?
* No fixed language? No half-year waiting time before
you get a new operator into your language?
Language building built-in?
* S-expressions? Easy manipulation of code by integrated tools?
* Multiple paradigms in one language?
And so on.
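The "easy manipulation of code by integrated tools" point in the list above is worth a concrete sketch. When code is plain nested lists (s-expressions), a refactoring tool is just a tree walk. The example below is a hypothetical illustration in Python, with a Lisp form represented as nested lists; the `rename` helper is my own, not from any Lisp tool.

```python
# Sketch: code as data. Renaming a function everywhere in a program
# is a three-line recursion when the program is a nested list.

def rename(expr, old, new):
    """Replace the symbol `old` with `new` throughout an s-expression
    represented as nested Python lists."""
    if isinstance(expr, list):
        return [rename(e, old, new) for e in expr]
    return new if expr == old else expr

# (defun area (r) (* pi (square r)))  as nested lists:
code = ["defun", "area", ["r"], ["*", "pi", ["square", "r"]]]
print(rename(code, "square", "sq"))
# ['defun', 'area', ['r'], ['*', 'pi', ['sq', 'r']]]
```

Doing the same reliably for a language with richer surface syntax requires a full parser and pretty-printer, which is why Lisp environments could offer such tools so early.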
Quite a lot has been written about the Lisp syntax; I'm
not repeating it. Only a few words.
For me Lisp code is different from Pascal/C++/... or other code
because it has a completely different look & feel. The feel
is also an interesting part.
You see direct-manipulation user interfaces quite a lot.
We drag icons around. We get instant visual feedback
from graphical user interfaces. Yet programming
often is far from interactive.
When I press keys and write code, is there a direct connection
from the brain to what I see on the screen and back?
In less than a second?
When I write code for a batch system, I type to an
editor and the objects