
one small function


ab talebi

Feb 3, 2002, 12:22:22 PM
consider these functions:

(setf my-list '((titi tata) (toto) (tada)))

(defun my-eql (list)
  (eql 'toto list))

(mapcar #'my-eql my-list)
; probably we can use a lambda function but I'm not sure how

this will give us (NIL T NIL), whereas I just want 2 or NIL:

NIL if we cannot find any 'toto in my-list, or
2 because it's element number 2 of my-list which is 'toto
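For reference, the lambda version alluded to above is just the same test written inline; it is only a sketch and still yields a list of booleans, not a position:

```lisp
;; The same test as MY-EQL, written inline as a lambda.
;; 'toto is a symbol and (toto) is a list, so every EQL comparison fails.
(mapcar (lambda (sublist) (eql 'toto sublist))
        '((titi tata) (toto) (tada)))
;; => (NIL NIL NIL)
```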

Nils Goesche

Feb 3, 2002, 12:59:03 PM

How about

(1+ (position 'toto my-list :key #'car))

Regards,
--
Nils Goesche
Ask not for whom the <CONTROL-G> tolls.

PGP key ID 0xC66D6E6F

Raymond Wiker

Feb 3, 2002, 1:09:32 PM
"ab talebi" <ab_tal...@yahoo.com> writes:

I get (NIL NIL NIL), which is quite understandable because
'toto is not the same as '(toto).

Try something like

(position '(toto) *my-list* :test #'equal)

or

(position 'toto *my-list* :key #'car :test #'eql)

or even

(position t (mapcar #'my-eql *my-list*))

--- but the last is somewhat inelegant :-)

All of these "solutions" require you to sort out what you're
actually trying to do - are you looking for '(toto), or 'toto as the
first element of a sublist?

--
Raymond Wiker Mail: Raymon...@fast.no
Senior Software Engineer Web: http://www.fast.no/
Fast Search & Transfer ASA Phone: +47 23 01 11 60
P.O. Box 1677 Vika Fax: +47 35 54 87 99
NO-0120 Oslo, NORWAY Mob: +47 48 01 11 60

Try FAST Search: http://alltheweb.com/

Markus Kliegl

Feb 3, 2002, 3:39:48 PM
Nils Goesche <car...@t-online.de> writes:

> In article <ile78.7459$pd5.1...@news2.ulv.nextra.no>, ab talebi wrote:
> > consider these functions:
> >
> > (setf my-list '((titi tata) (toto) (tada)))
> > (defun my-eql (list)
> > (eql 'toto list))
> >
> > (mapcar #'my-eql my-list)
> > ; probobly we can use a lambda-function but I'm not sure how
> >
> > this will give us (NIL T NIL) whereas I just want a 2 or NIL
> >
> > NIL if we can not find any 'toto in my-list or
> > 2 because it's the element number 2 of my-list which is 'toto
>
> How about
>
> (1+ (position 'toto my-list :key #'car))

That doesn't take into account that position might return nil. Use something
like this instead:
(let ((pos (position 'toto my-list :key #'car)))
  (and pos (1+ pos)))
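Wrapped into a function, that nil-safe version behaves like this (a minimal sketch; the function name FIND-1-BASED-POSITION is illustrative only):

```lisp
;; Nil-safe 1-based position of WORD among the cars of LIST.
;; Returns NIL when no sublist starts with WORD.
(defun find-1-based-position (word list)
  (let ((pos (position word list :key #'car)))
    (and pos (1+ pos))))

(find-1-based-position 'toto '((titi tata) (toto) (tada)))  ; => 2
(find-1-based-position 'zzz  '((titi tata) (toto) (tada)))  ; => NIL
```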

>
> Regards,
> --
> Nils Goesche
> Ask not for whom the <CONTROL-G> tolls.
>
> PGP key ID 0xC66D6E6F

--
Markus Kliegl

P.C.

Feb 3, 2002, 6:54:00 PM
Hi.

----- Original Message -----
From: "ab talebi" <ab_tal...@yahoo.com>
Newsgroups: comp.lang.lisp
Sent: 3 February 2002 18:22
Subject: one small function

----- Guess that's all right. And then I changed setf into setq and pasted it into a
Lisp evaluator, like this:


(setq my-list '((titi tata) (toto) (tada)))


(defun my-eql (list)
  (eql 'toto list))

(mapcar #'my-eql my-list)

This yields the error "function undefined" after pressing enter on the line (mapcar
#'my-eql my-list).
I simply don't understand why you write (list) in parentheses in:


(defun my-eql (list)
  (eql 'toto list))

Then, besides the function being undefined, the problem must be the "eql" in the last line.

Have a nice day.
P.C.
http://d1o115.dk.telia.net/~u139600113/td/ans1.jpg


> ; probobly we can use a lambda-function but I'm not sure how
>
> this will give us (NIL T NIL) whereas I just want a 2 or NIL

Eh, ------ most probably it would be T, then NIL, then nothing more ;))

P.C.

Feb 4, 2002, 3:29:16 AM
Hi.

> (setq my-list '((titi tata) (toto) (tada)))
> (defun my-eql (list)
> (eql 'toto list))
> (mapcar #'my-eql my-list)

I don't understand the last line; you call mapcar with a function you define:


(defun my-eql (list)
  (eql 'toto list))

Now this is not a function call, so what I don't understand is how you have two
parameters to mapcar, and what happens if you pass on a function you defined
yourself. Maybe the trouble is that the function you define for this needs a
parameter list with one member: the element the function will work on down the
list.
Please clarify, as I'm very confused about how you first define the function that
will work with mapcar.

ab talebi

Feb 4, 2002, 9:41:07 AM
On 03 Feb 2002 21:39:48 +0100, Markus Kliegl
<markus...@t-online.de> wrote:

>Nils Goesche <car...@t-online.de> writes:
>
>> In article <ile78.7459$pd5.1...@news2.ulv.nextra.no>, ab talebi wrote:
>> > consider these functions:
>> >
>> > (setf my-list '((titi tata) (toto) (tada)))
>> > (defun my-eql (list)
>> > (eql 'toto list))
>> >
>> > (mapcar #'my-eql my-list)
>> > ; probobly we can use a lambda-function but I'm not sure how
>> >
>> > this will give us (NIL T NIL) whereas I just want a 2 or NIL
>> >
>> > NIL if we can not find any 'toto in my-list or
>> > 2 because it's the element number 2 of my-list which is 'toto
>>
>> How about
>>
>> (1+ (position 'toto my-list :key #'car))
>
>That doesn't take into account that position might return nil. Use something
>like this instead:
>(let ((pos (position 'toto my-list :key #'car)))
> (and pos (1+ pos)))
>

OK, let's say that we implement that in a function like this:

(defun find-position (word list)
  (let ((pos (position word list :key #'car)))
    (and pos (1+ pos))))

and we modify our list:

(setf my-list
      '((titi tata) (toto) (titi) (toto) (tada)))

to contain several '(toto)s, it will still return only the position of
the first one:
2
whereas it should give us:
2 4

any idea?

tnx

ab talebi

Alexey Dejneka

Feb 4, 2002, 10:11:05 AM
ab_tal...@yahoo.com (ab talebi) writes:

(defun find-positions (item list &key (test #'eql) (key #'identity))
  (loop for position from 1
        and element in list
        when (funcall test item (funcall key element))
          collect position))

* (find-positions '(toto) '((titi tata) (toto) (titi) (toto) (tada)) :test #'equal)
(2 4)
* (find-positions '(toto) '((titi tata) (toto) (titi) (toto) (tada)) :test #'equal :key #'car)
NIL
* (find-positions 'toto '((titi tata) (toto) (titi) (toto) (tada)) :test #'equal :key #'car)
(2 4)
* (find-positions 'toto '((titi tata) (toto) (titi) (toto) (tada)) :key #'car)
(2 4)

--
Regards,
Alexey Dejneka

Nils Goesche

Feb 4, 2002, 10:04:51 AM
In article <3c5e9cf3...@news.uio.no>, ab talebi wrote:
> ok, lets say that we impliment that into a function like this:
>
> (defun find-position (word list)
> (let ((pos (position word list :key #'car)))
> (and pos (1+ pos))))
>
> and we modify our list:
>
> (setf my-list
> '((titi tata) (toto) (titi) (toto) (tada)))
>
> to contain several '(toto)s it will still return only the position of
> the first one
> 2
> whereas it should give us
> 2 4

Ok, how about

(defun find-position (word list)
  (loop for sub in list
        for index from 1
        when (eq word (car sub))
          collect index))
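A quick check of this definition against the extended list from earlier in the thread (a sketch; it repeats the definition above so it stands alone):

```lisp
;; FIND-POSITION, as above: collects every 1-based index of a
;; sublist whose car is WORD.
(defun find-position (word list)
  (loop for sub in list
        for index from 1
        when (eq word (car sub))
          collect index))

(find-position 'toto '((titi tata) (toto) (titi) (toto) (tada)))
;; => (2 4)
```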

Regards,
--
Nils Goesche
"Don't ask for whom the <CTRL-G> tolls."

PGP key ID 0x42B32FC9

Alexey Dejneka

Feb 4, 2002, 10:47:45 AM
ab_tal...@yahoo.com (ab talebi) writes:

> ok, lets say that we impliment that into a function like this:
>
> (defun find-position (word list)
> (let ((pos (position word list :key #'car)))
> (and pos (1+ pos))))
>
> and we modify our list:
>
> (setf my-list
> '((titi tata) (toto) (titi) (toto) (tada)))
>
> to contain several '(toto)s it will still return only the position of
> the first one
> 2
> whereas it should give us
> 2 4
>
> any idea?

How could I forget series...

---
(require :series)

(defun find-positions (item list &key (test #'eql) (key #'identity))
  (let* ((positions (scan-range :from 1))
         (words (scan list))
         (goodp (#M(lambda (word)
                     (funcall test item (funcall key word)))
                   words)))
    (collect (choose goodp positions))))
---

--
Regards,
Alexey Dejneka

Thomas A. Russ

Feb 4, 2002, 2:09:16 PM

In Common Lisp, DEFUN is used to define functions, and the syntax of the
definition form is rather different from that of DEFINE in Scheme.

--
Thomas A. Russ, USC/Information Sciences Institute t...@isi.edu

P.C.

Feb 6, 2002, 5:18:51 AM
Hi.

"Thomas A. Russ" <t...@sevak.isi.edu> wrote in message
news:ymivgdd...@sevak.isi.edu...


>
> In Common Lisp, DEFUN is used to define functions and the syntax of the
> definition form is rather a bit different than for DEFINE in Scheme.

Now I agree that I don't know Scheme; so in Scheme you can have a list
representing the symbol.
What I mean is that this differs from Lisp, where an evaluator automatically will
look at the car of the list; if that's a symbol, there will be a defined function to be
evaluated, and the remaining part of the list will then be the parameters
described in the function description, ---- guess you see why I simply didn't
understand this;

>(setf my-list '((titi tata) (toto) (tada))) ;
;------ isn't there a missing right parenthesis here? ------- so the defun
my-list continues with


> (defun my-eql (list)
> (eql 'toto list))
>
> (mapcar #'my-eql my-list)

While using "(list)", a predefined function, as a parameter, and then mapcar-ing a
function "#" onto a defined function, and then making mapcar work a function "#"
on two lists: first a function definition list and then an unbalanced list with
3 lists, each with one symbol; hope you understand how I find this very
confusing. Doesn't Scheme have the same rules about balancing parentheses as Lisp?
P.C.


P.C.

Feb 6, 2002, 10:08:19 AM
Guess the differences between Scheme and Lisp are very small but important. ------- Still, the
doubt I had made me think that you were mapping within a function already
defined, changing the function definition. I can now see, after you explained that
#'foo is _one_ symbol, that mapcar is not working on two lists but one, and
that makes sense, as I'm used to defining my own one-parameter functions working
recursively with mapcar, mapping down a list.
P.C.


Geoff Summerhayes

Feb 6, 2002, 10:16:04 AM

"P.C." <per.c...@privat.dk> wrote in message
news:3c610338$0$89100$edfa...@dspool01.news.tele.dk...

>
> Now I agrea that I don't know Sceme ; so in Sceme you can have a list
> representing the symbol .
> What I mean is, that this differ from Lisp, where an evaluator automaticly
will
> look at car list, as if that's a symbol there will be a defined function to be
> evaluatet, and the remaining part of the list, will then be the parameters
> described in the function description , ---- guess you se why I simply didn't
> understand this ;
>
> >(setf my-list '((titi tata) (toto) (tada))) ;
> ;------ isn't there a missing right parentets here ?------- so the defun

No, look again.
    A              BC         C D    D E    EBA
1 > (setq my-list '((titi tata) (toto) (tada)))
((TITI TATA) (TOTO) (TADA))

> my-list continue with
> > (defun my-eql (list)
> > (eql 'toto list))

No, the function is separate from my-list.

2 > (defun my-eql (list)
      (eql 'toto list))
MY-EQL

3 > (my-eql 'toto)
T

4 > (my-eql '(toto))
NIL

> >
> > (mapcar #'my-eql my-list)
>
> While using "(list)" ; a predefined function, as parameter, and then Mapcar
a
> function "#" onto a function defined, and then making Mapcar work a function
"#"
> on two lists ,first a function defination list and then an unballanced list
with
> 3 lists with each one symbol ; hope you understand how I find this very
> confusing ; don't sceem have same rules about ballancing parentets as Lisp ?
> P.C.
>

In Lisp, a symbol may be a function name and still be a variable
AT THE SAME TIME, which gets used depends on context.

5 > (setf list 4)
4

6 > list ;; context: "list" treated as variable
4

7 > (function list) ;; function says: No, treat "list" as a function name
#<function LIST 200DCA7A>

8 > (function not-a-function)
Error: Undefined function NOT-A-FUNCTION...

#' is shorthand for (function ...)
just like ' is shorthand for (quote ...)

9 > #'list
#<function LIST 200DCA7A>

10 > (list 1 2 3 4) ;; context: "list" treated as function name
(1 2 3 4)

MAPCAR's entry in the hyperspec looks like this:
mapcar function &rest lists+ => result-list
In other words, MAPCAR takes as arguments a function and 0 or more lists
and returns a list.

11 > (mapcar #'my-eql my-list)
(NIL NIL NIL)

-----------
Geoff

Kent M Pitman

Feb 6, 2002, 11:13:21 AM
"Geoff Summerhayes" <sNuOmS...@hNoOtSmPaAiMl.com> writes:

> MAPCAR's entry in the hyperspec looks like this:
> mapcar function &rest lists+ => result-list
> In other words, MAPCAR takes as arguments a function and 0 or more lists
> and returns a list.
>
> 11 > (mapcar #'my-eql my-list)
> (NIL NIL NIL)

Actually, it's a small point, but the use of "lists+" in the ANSI CL
(and hence CLHS) notation indicates that at least one of these lists
is required. For better or worse, you aren't allowed to do
(mapcar #'(lambda ())).
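A short illustration of that arity rule (the exact condition signaled for the invalid call, and its message, are implementation-dependent):

```lisp
;; MAPCAR needs the function plus at least one list argument.
(mapcar #'1+ '(1 2 3))      ; => (2 3 4)
;; (mapcar #'(lambda ()))   ; invalid: no list argument at all
```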

Geoff Summerhayes

Feb 6, 2002, 12:04:24 PM

"Kent M Pitman" <pit...@world.std.com> wrote in message
news:sfwg04e...@shell01.TheWorld.com...

Oops, my bad. Nice to see you back.

Geoff

Erik Naggum

Feb 7, 2002, 3:01:23 AM
* "P.C." <per.c...@privat.dk>

| Guess the difference Sceme/Lisp are very small but important

They are so large that someone who has been trained in Scheme has no hope
of ever learning Common Lisp, because they _think_ the differences are
very small despite strong evidence to the contrary, and because it is
possible to write Scheme in Common Lisp, they will never be "offered" the
negative feedback necessary to "get it".

I think the problem is that most people who have not been made aware of
this fallacy at a young age have an unnerving tendency to think that they
know something if it looks like something they know, and the less they
_actually_ know what they think they know, the more they think something
that looks the same to them, also _is_ the same. In a world of physical
objects, this is probably not such a bad idea. The number of things you
need to examine to determine that something is a duck is quite limited.
This changes with man-made objects, and even more with purely man-made
constructs like the infrastructure of a society or programming languages,
where you have to be able to distinguish differences between _versions_
of what is actually _marketed_ as the same. For instance, if you plan
according to the tax law or the Java version of a year ago, you are in
trouble. This leads to a reversal of an important relationship between
identity and details: In the physical world, if you find large similar
properties between something new and something you know, you are quite
justified in concluding that you know what it is, but in the man-made
world, especially that of ideas, it is more important to look carefully
after you have discovered large-scale similarities than before.

For man-made things and ideas in particular, the belief that what looks
the same is the same is an anti-intellectual attitude towards one's
knowledge and actually betrays a lack of understanding of that which is
believed known. Also, since this belief is most probably not restricted
to one particular area of knowledge, but is a result of methodological
errors in reasoning and learning, there is no chance at all to make such
a believer realize that he does not actually know what he believes he
knows until he runs into a very serious problem that he cannot resolve
well enough for comfort. Only when programming is made into a "team"
effort which exposes newbies to the critical review that only gurus can
provide, and gurus themselves acknowledge that new gurus may arise worth
listening to, can the necessary feedback loops work well enough that good
knowledge drives out bad. Today, we have a lot of people who are quite
convinced of their own competence mainly because they have never been
challenged beyond what they already know.

The tragedy is that "looks the same, is the same" is not commutative. If
you learned Common Lisp well and then found Scheme, it would be nothing
more than some old home-made toy discovered in your parents' basement --
cute and all that, but not worth playing with for long, because all of
the stuff you think a real programming language should have are simply
not there. If you learned Scheme first and then found Common Lisp, it
would seem just like Scheme, because what you have come to think of as a
"programming language" is basically so little that you would not even
comprehend all the stuff in Common Lisp that is not like Scheme at all.

///
--
In a fight against something, the fight has value, victory has none.
In a fight for something, the fight is a loss, victory merely relief.

Kent M Pitman

Feb 7, 2002, 5:16:48 AM
Erik Naggum <er...@naggum.net> writes:

> * "P.C." <per.c...@privat.dk>
> | Guess the difference Sceme/Lisp are very small but important
>

...


> The tragedy is that "looks the same, is the same" is not commutative. If
> you learned Common Lisp well and then found Scheme, it would be nothing
> more than some old home-made toy discovered in your parent's basement --
> cute and all that, but not worth playing with for long, because all of
> the stuff you think a real programming language should have are simply
> not there. If you learned Scheme first and then found Common Lisp, it
> would seem just like Scheme, because what you have come to think of as a
> "programming language" is basically so little that you would not even
> comprehend all the stuff in Common Lisp that is not like Scheme at all.

This kind of reminds me of when I was in high school and had taken
high school algebra and a little analysis and someone handed me a
calculus textbook to give me an idea of what I would learn the next
year. I flipped through it quickly, taking an inventory of the
notation--lots of x's and y's and exponents and sigmas and stuff. I
shrugged and handed it back. "Yeah, I can do all this. Looks pretty
much like what I already know except maybe for that little squiggle
[integral sign]," I remarked to the person showing me the book. "I
guess I won't learn anything much new next year," I continued with a
sigh.

Superficial similarity and overlap of notation can be quite deceiving
sometimes.

Christian Nybø

Feb 7, 2002, 12:23:41 PM
Erik Naggum <er...@naggum.net> writes:

> * "P.C." <per.c...@privat.dk>
> | Guess the difference Sceme/Lisp are very small but important
>
> They are so large that someone who has been trained in Scheme has no hope
> of ever learning Common Lisp, because they _think_ the differences are
> very small despite strong evidence to the contrary, and because it is
> possible to write Scheme in Common Lisp, they will never be "offered" the
> necessary negative feedback necessary to "get it".

At my uni, taking the SICP course is a recommended preparation for
those who wish to take the CL course. Therefore I choose, despite
your warnings, to assimilate what Scheme has to offer as a teaching
language. Any advice on how to get through unharmed? (OVS phrasing
optional...)
--
chr

Xah Lee

Feb 7, 2002, 10:36:03 PM
Here we have the Nagging Naggum, detailing his fantasy of Scheme
hatred, through cunning theories and fastidious remarks, on the
philosophies of same and not the same, the tragedies of commutative or
non-commutative phrase inventions.

"P.C." <per.c...@privat.dk> wrote:
> Guess the difference Sceme/Lisp are very small but important

The difference is trivial, the distinction a trifle. In the history of
computer languages, they are identical.

What's different languages? Logical languages and functional languages
are different. Pure OOP languages and traditional imperative languages
are different. Lisp family languages and purely functional languages
makes a huge difference. Purely functional languages and denotational
languages have importantly distinct concepts.

My readers, set your eyes afar, your views afield. Don't fret about
the trifling particularities of Scheme and common lisps that so bugs
the Nag'm & Pit-man scheming haters. If you are concerned about
learning something, master a *different* language as i outlined above.
Broaden your mind.

Btw, i think i'm the first person who coined Nagging Naggum. (is that
right, Erik?) If so, i should get some credit.

Xah
x...@xahlee.org
http://xahlee.org/PageTwo_dir/more.html


--

Kent M Pitman

Feb 7, 2002, 11:23:36 PM
x...@xahlee.org (Xah Lee) writes:

> My readers, set your eyes afar, your views afield. Don't fret about
> the trifling particularities of Scheme and common lisps that so bugs
> the Nag'm & Pit-man scheming haters. If you are concerned about
> learning something, master a *different* language as i outlined above.
> Broaden your mind.

I'll pass on responding to the first sentence here, but
(notwithstanding the fact that Xah wants to position himself as
disagreeing with me) I'll agree with the second sentence here.

To the extent that there are things presented in a Scheme course which
introduce new ideas and broaden the mind, take them in, learn, and enjoy.

The place where we differ is in the belief that every Scheme course is,
in point of fact, an exercise in mind-broadening. I have no problem with
Scheme per se, and I have used it myself for various work-related tasks.
However, I do have considerable problem with Scheme qua religion, and I
have routinely seen people come out of a Scheme course not thinking they
have learned a new thing, but that they have UNlearned an old thing, and,
more to the point, have often specifically learned to hate CL. It is
routine to find that people taking Scheme courses have learned that
"iteration is Bad and tail recursion is Good", which I believe is itself
limiting and therefore bad; I have no problem if they have gotten instead
the lesson that "iteration can take on multiple syntactic forms", "there
are equivalences between conventional loops and tail recursions", "sometimes
seeing things in a different syntactic form, whether conventional or
recursive, reveals through that form truths that are not as perspicuous
in the other form". To the extent that people instead believe that grand
truths only leap out of tail recursive form and do not leap out of the other
form, they have been misled.

Broadening the mind, that is, learning new ways to see and use things
is good. Revamping the mind according to propaganda is not, however,
broadening the mind.

> Btw, i think i'm the first person who coined Nagging Naggum. (is that
> right, Erik?) If so, i should get some credit.

If you are able to make your own coinage, I'm not sure why you need to
operate on credit.

Rahul Jain

Feb 7, 2002, 11:49:46 PM
c...@sli.uio.no (Christian Nybø) writes:

> At my uni, taking the SICP course is a recommended preparation for
> those who wish to take the CL course. Therefore I choose, despite
> your warnings, to assimilate what Scheme has to offer as a teaching
> language. Any advice on how to get through unharmed? (OVS phrasing
> optional...)

You should forget that Scheme exists when you learn Lisp. Essentially,
approach Lisp with no preconceived notions, and beware of specific
biases towards specific programming styles that your professor may
have. Lisp is designed to allow for all styles of programming, which
also means that it allows for exclusively one style if the programmer
so desires. I find it much more freeing to allow oneself to mix all
styles of programming in the same code. Object oriented, applicative,
logic, symbolic; whatever combination fits the needs of the algorithm.

--
-> -/- - Rahul Jain - -\- <-
-> -\- http://linux.rice.edu/~rahul -=- mailto:rj...@techie.com -/- <-
-> -/- "I never could get the hang of Thursdays." - HHGTTG by DNA -\- <-
|--|--------|--------------|----|-------------|------|---------|-----|-|
Version 11.423.999.221020101.23.50110101.042
(c)1996-2002, All rights reserved. Disclaimer available upon request.

Craig Brozefsky

Feb 8, 2002, 12:44:30 AM
Kent M Pitman <pit...@world.std.com> writes:

> > Btw, i think i'm the first person who coined Nagging Naggum. (is that
> > right, Erik?) If so, i should get some credit.
>
> If you are able to make your own coinage, I'm not sure why you need to
> operate on credit.

Fatality, Kent wins.

--
Craig Brozefsky <cr...@red-bean.com>
http://www.red-bean.com/~craig
Ask me about Common Lisp Enterprise Eggplants at Red Bean!

Dorai Sitaram

Feb 8, 2002, 10:26:48 AM
In article <sikheot...@ulrik.uio.no>,

Christian Nybø <c...@sli.uio.no> wrote:
>
>At my uni, taking the SICP course is a recommended preparation for
>those who wish to take the CL course. Therefore I choose, despite
>your warnings, to assimilate what Scheme has to offer as a teaching
>language. Any advice on how to get through unharmed? (OVS phrasing
>optional...)

I would say that (in general) it is fairly difficult
for Scheme learners to learn Common Lisp. You can kind
of get by, but you have to be resigned to that fact.
Common Lisp is the archetypal jealous-god type of
programming language. Scheme tends to be a lot more
freewheeling, and its programs suffer less crosstalk
from their writer's acquaintance with other languages
(including CL). From the CL point of view,
crosstalk from Scheme can be immobilizing.

What's "OVS phrasing"?

--d

Christian Nybø

Feb 8, 2002, 1:23:32 PM
ds...@goldshoe.gte.com (Dorai Sitaram) writes:

> What's "OVS phrasing"?

Object Verb Subject, Yoda's manner of sentence phrasing.
--
chr

Duane Rettig

Feb 8, 2002, 3:00:11 PM
c...@sli.uio.no (Christian Nybø) writes:

Nah, Yoda speaks with OSV phrasing, at least for declarative
statements...

"A Jedi knight you will be"

--
Duane Rettig Franz Inc. http://www.franz.com/ (www)
1995 University Ave Suite 275 Berkeley, CA 94704
Phone: (510) 548-3600; FAX: (510) 548-8253 du...@Franz.COM (internet)

Thomas F. Burdick

Feb 8, 2002, 3:44:47 PM
ds...@goldshoe.gte.com (Dorai Sitaram) writes:

> In article <sikheot...@ulrik.uio.no>,
> Christian Nybø <c...@sli.uio.no> wrote:
> >
> >At my uni, taking the SICP course is a recommended preparation for
> >those who wish to take the CL course. Therefore I choose, despite
> >your warnings, to assimilate what Scheme has to offer as a teaching
> >language. Any advice on how to get through unharmed? (OVS phrasing
> >optional...)
>
> I would say that (in general) it is fairly difficult
> for Scheme learners to learn Common Lisp.

If they think they're the same language, it certainly will be. But if
someone is specifically trying not to think of CL as a funny-looking
Scheme, they will probably have a very good chance.

(with-phrasing (:object :subject :verb)
Paul Graham's books very good are. Over-rated sometimes they can
be, but shine for schemers they do. A good path from Scheme-think
to Lisp-think the book "ANSI Common Lisp" provides. Both
tail-recursion but even more, iteration, loves him he does.)

--
/|_ .-----------------------.
,' .\ / | No to Imperialist war |
,--' _,' | Wage class war! |
/ / `-----------------------'
( -. |
| ) |
(`-. '--.)
`. )----'

Dorai Sitaram

Feb 8, 2002, 5:09:21 PM
In article <xcv6657...@famine.OCF.Berkeley.EDU>,

Thomas F. Burdick <t...@famine.OCF.Berkeley.EDU> wrote:
>ds...@goldshoe.gte.com (Dorai Sitaram) writes:
>
>> In article <sikheot...@ulrik.uio.no>,
>> Christian Nybø <c...@sli.uio.no> wrote:
>> >
>> >At my uni, taking the SICP course is a recommended preparation for
>> >those who wish to take the CL course. Therefore I choose, despite
>> >your warnings, to assimilate what Scheme has to offer as a teaching
>> >language. Any advice on how to get through unharmed? (OVS phrasing
>> >optional...)
>>
>> I would say that (in general) it is fairly difficult
>> for Scheme learners to learn Common Lisp.
>
>If they think they're the same language, it certainly will be. I
>think if someone is specifically trying to not think that CL is a
>funny-looking Scheme, I think they will probably have a very good
>chance.

I do think it is risky to learn Scheme if the goal is
CL. The learner is essentially being made to
unnecessarily bet on his exceptional capacity for
unlearning.

I speak as someone who is fairly comfortable with
Scheme but cannot realistically hope to achieve that
comfort with CL. I know the two languages are
different (and _how_ they are different). Maybe my
experience is not statistically significant (whatever
that means); I hope so.

It isn't exactly a new hypothesis around here that
Scheme ruins one for CL. I find it a plausible
hypothesis. Exceptions for exceptional talent of
course, as always.

--d

Christopher C. Stacy

Feb 9, 2002, 12:30:34 AM
MGHMnhhmm...Always two there are...a master and a learner language.
But which one was the Sithcp? The master, or the learner?

Xah Lee

Feb 9, 2002, 1:59:37 AM
> iteration is Bad and tail recursion is Good

Let's talk about this popular subject.

First of all, i like to voice my opinion that lots of people in the
computing field likes to spur the use of spurious jargons. The less
educated they are, the more they like extraneous jargons, such as in
the Unix & Perl community. Unlike mathematicians, where in mathematics
there are no fewer jargons but each and every one are absolutely
necessary. For example, polytope, manifold,
injection/bijection/surjection, group/ring/field.., homological,
projective, pencil, bundle, lattice, affine, topology, isomorphism,
isometry, homeomorphism, aleph-0, fractal, supremum/infimum, simplex,
matrix, quaternions, derivative/integral, ...and so on. Each and every
one of these captures a concept, for which practical and theoretical
considerations made the terms a necessity. Often there are synonyms
for them because of historical developments, but never "jargons for
jargon's sake" because mathematicians hate bloats and irrelevance.

The jargon-soaked stupidity in computing field can be grouped into
classes. First of all, there are jargons for marketing purposes. Thus
you have Mac OS "X", Windows "XP", Sun OS -> Solaris and the
versioning confusion of 4.x to 7 to 8 and also the so called
"Platform" instead of OS. One flagrant example is Sun Microsystem's
Java stuff. Oak, Java, JDK, JSDK, J2EE, J2SE enterprise edition or no,
from java 1.x to 1.2 == Java 2 now 1.3, JavaOne, JFC, Jini, JavaBeans,
entity Beans, Awk, Swing... fucking stupid Java and fuck Sun
Microsystems. This is just one example of Jargon hodgepodge of one
single commercial entity. Marketing jargons cannot be avoided in
modern society. They abound outside computing fields too. The Jargons
of marketing came from business practice, and they can be excusable
because they are kinda a necessity or can be considered as a naturally
evolved strategy for attracting attention in a free-market social
system.

The other class of jargon stupidity is from computing practitioners,
of which the Unix/Perl community is exemplary. For example, the names
Unix & Perl themselves are good examples of buzzing jargons. Unix is
supposed to be opposed of Multics and hints on the offensive and
tasteless term eunuchs. PERL is cooked up to be "Practical Extraction
& Reporting Language" and for the precise marketing drama of being
also "Pathologically Eclectic Rubbish Lister". These types of jargons
exude juvenile humor. Cheesiness and low-taste is their hall-mark. If
you are familiar with unixism and perl programing, you'll find tons
and tons of such jargons embraced and verbalized by unix & perl
lovers. e.g. grep, glob, shell, pipe, man, regex, tarball, shebang,
Schwartzian Transform, croak, bless, interpolation, TIMTOWTDI, DWIM,
RFC, RTFM, I-ANAL, YMMV and so on.

There is another class of jargon moronicity, which i find most
damaging to society: jargons or spurious and vague terms used and
brandished about by programers that we see and hear daily among design
meetings, comp.lang.* newsgroup postings, or even in lots of computing
textbooks or tutorials. I think the reason for these is that this
massive body of average programers usually don't have much knowledge
of significant mathematics, yet they are capable of technical thinking
that is not too abstract, thus you end up with these people defining
or hatching terms a-dime-a-dozen that are vague, context dependent,
vacuous, and their commonality is often a result of sopho-morons
trying to sound big.

Here are some examples of the terms in question:

 * anonymous functions or lambda or lambda function
 * closure
 * exceptions (as in Java)
 * list, array, vector, aggregate
 * hash (or hash table) <-- fantastically stupid
 * rehash (as in csh or tcsh)
 * regular expression (as in regex, grep, egrep, fgrep)
 * name space (as in Scheme vs Common Lisp debates)
 * depth first/breadth first (as in tree traversing.)
 * operator/function
 * operator overloading, polymorphism
 * inheritance
 * first class objects/citizen
 * pointers, references
 * ...

My time is limited, so i'll just give a brief explanation of my thesis
on a selective few of these examples among the umpteen.

A branch of math called lambda calculus, on which much of the theory
of computation is based, is the origin of the jargon _lambda function_
so commonly reciprocated by lisp and other programering donkeys. In
practice, a subroutine without side-effects is supposed to be what
"lambda function" means. Functional languages often can define them
without assigning them to some variable (name), therefore the
"function without side-effects" is also called an "anonymous
function". One can see that these are two distinct concepts. If
mathematicians were designing computer languages, they would probably
just call such a thing a _pure function_. The term conveys the
meaning, without the "lambda" abstruseness. (in fact, the mathematics
oriented language Mathematica refers to a lambda function as a pure
function, with the keyword Function.) Because most programers are
sopho-morons who are
less capable of clear thinking butnevertheless possess human vanity,
we can see that they have notadopted the clear and fitting term, but
instead you see lambdafunction this and that obfuscations dropping
from their mouthsconstantly.Now the term "closure" can and indeed have
meant several thingsin the computing field. The most common i believe
today, is forit to mean a subroutine that holds some memory but
without somedisadvantages of modifying a global variable. Usually such
is afeature of a programing language. When taken to extreme, we
havethe what's called Object Oriented Programing methodology
andlanguages. The other meaning of "closure" i have seen in textbook,
is for it to indicate that the things in the language is"closed" under
the operations of the language. For example, forsome languages you can
apply operations or subroutines to anything in the language. (These
languages are often what's called"dynamic typing" or "typeless").
However, in other languages,things has types and cannot be passed
around subroutines oroperators arbitrarily. One can see that the term
"closure" isquite vague in conveying its meaning. The term
nevertheless isvery popular among talkative programers and dense
tutorials,precisely because it is vague and mysterious. These
pseudo-witliving zombies, never thought for a moment that they are
using amoronic term, mostly because they never clearly understand
theconcepts behind the term among the contexts. One can particularsee
this exhibition among Perl programers. (for an example of
thefantastically stupid write-up on closure by the Perl folks,
see"perldoc perlfaq7" and "perldoc perlref".)in the so-called
"high-level" computing languages, there areoften data types that's
some kind of a collection. The mostillustrative is LISt Processing
language's lists. Essentially,the essential concept is that the
language can treat a collectionof things as if it's a single entity.
As computer languagesevolves, such collection entity feature also
diversified, fromsyntax to semantics to implementation. Thus, besides
lists, thereare also terms like vector, array, matrix, tree,
hash/"hashtable"/dictionary. Often each particular term is to convey
aparticular implementation of collection so that it has
certainproperties to facilitate specialized uses of such groupy.
TheJava language has such groupy that can illustrate the point well.In
Java, there's these hierarchy of collection-type of things: Collection
Set (AbstractSet, HashSet) SortedSet (TreeSet) List
(AbstractList, LinkedList, Vector, ArrayList) Map (AbstractMap,
HashMap, Hashtable) SortedMap (TreeMap)The words without parenthesis
are java Interfaces, and ones inare implementations. The interface
hold a concept. The deeper thelevel, the more specific or specialized.
The implementation carryout concepts. Different implementation gives
differentalgorithmic properties. Essentially, these hierarchies of
Javashows the potential complexity and confusion around groupyentities
in computer languages. Now, among the programers we seedaily, who
never really thought out of these things, will attachtheir own
specific meaning to list/array/vector/matrix/etc typeof jargons in
driveling and arguments, oblivious to any thoughtof formalizing what
the fuck they are really talking about. (onemay think from the above
tree-diagram that Java the language hasat least put clear distinction
to interface and implementation,whereas in my opinion they are one
fantastic fuck up too, in manyrespects. (I will detail it some
day.))... i'm getting tired of writing this. There are just too
manyspurious stupid jargons that's meaningless and endlesslyconfusing,
yet embraced and spoke by all monkey coders. What i amcoming into, is
the jargon "tail recursion". What a fantasticallyfucking fantastic
stupid meaningless term that is so loved by theLISP programing geeks.
In every trickle of chance they'll flashit out to debate. (hopefully
i'll continue explaining the stupidjargons and background of common
computing terms that i listedabove and much more in the future.)Now,
let's talk about tail recursion.I read the Structure and
Interpretation of Computer Programs ofHarold Abelson et al about 4
years ago. I recall reading thesection on tail recursion. In
particular, i recall how exactingand clear it explains the myths of
looping away. To this day, istill hold their writing illuminating. (it
is in reading theirbook, chapter i think 3 on functional dispatch,
that it dawned onme what is OOP really about, far beyond any
fantastically fuckingstupid OOP tutorials and books.) My understanding
of Scheme isvery minimal. I'm not sure if i can even write "hello
world"correctly in one shot. I don't know Common Lisp at all. So,
youwill pardon and correct me if i make some stupid remarks
here.However, my feelings is that Abelson et al is quite lucid
inexplaining that the time/space algorithmic behaviors of
thosefor/while/until loops of imperative languages can be equivalentto
recursion. And, i think that the term "tail recursion"
meansimplementation of recursion syntax in a language so that when
itcan be linear, it is so. (we should simply call itwell-implemented
recursion or something, instead of thefantastically stupid "tail
recursion".)Kent Pitman wrote:> "iteration can take on multiple

syntactic forms",> "there> are equivalences between conventional loops
and> tail recursions", "sometimes > seeing things in a different>
syntactic form, whether conventional or> recursive, reveals>through
that form truths that are not as> perspicuous in the other form"and >

To the extent that people instead believe that grand > truths > only
leap out of tail recursive form and do not leap out > of the other
form, they have been misled.Kent, please make clear and precise
statements. What are yousaying exactly? When you speak to me, try to
formalize yourthoughts, and express them as much close to symbolic
logic aspossible. The former is for your benefit, the latter is for
mybenefit. (what i hate is the amorphous humbling blob you wont
towrite.)(don't be smug that i didn't sting you for your devious
maneuverof my post this time. It did not went undetected. Consider it
mykindness to let you off this time. Frankly, my energy are spendon
writing the jargons passage, and now i don't have time tochase you
off.) Xah x...@xahlee.org http://xahlee.org/PageTwo_dir/more.html-- >
From: Kent M Pitman (pit...@world.std.com) > Subject: Re: one small
function > Newsgroups: comp.lang.lisp > Date: 2002-02-07 20:24:57 PST

Xah Lee

unread,
Feb 9, 2002, 2:02:31 AM2/9/02
to
> iteration is Bad and tail recursion is Good

Let's talk about this popular subject.

First of all, i like to voice my opinion that lots of people in
the computing field likes to spur the use of spurious jargons.
The less educated they are, the more they like extraneous
jargons, such as in the Unix & Perl community. Unlike
mathematicians, where in mathematics there are no fewer jargons
but each and every one are absolutely necessary. For example,
polytope, manifold, injection/bijection/surjection,
group/ring/field.., homological, projective, pencil, bundle,
lattice, affine, topology, isomorphism, isometry, homeomorphism,
aleph-0, fractal, supremum/infimum, simplex, matrix, quaternions,
derivative/integral, ...and so on. Each and every one of these
captures a concept, for which practical and theoretical
considerations made the terms a necessity. Often there are
synonyms for them because of historical developments, but never
"jargons for jargon's sake" because mathematicians hate bloats
and irrelevance.

The jargon-soaked stupidity in computing field can be grouped
into classes. First of all, there are jargons for marketing
purposes. Thus you have Mac OS "X", Windows "XP", Sun OS ->
Solaris and the versioning confusion of 4.x to 7 to 8 and also
the so called "Platform" instead of OS. One flagrant example is
Sun Microsystem's Java stuff. Oak, Java, JDK, JSDK, J2EE, J2SE
enterprise edition or no, from java 1.x to 1.2 == Java 2 now 1.3,
JavaOne, JFC, Jini, JavaBeans, entity Beans, Awk, Swing...
fucking stupid Java and fuck Sun Microsystems. This is just one
example of Jargon hodgepodge of one single commercial entity.
Marketing jargons cannot be avoided in modern society. They
abound outside computing fields too. The Jargons of marketing
came from business practice, and they can be excusable because
they are kinda a necessity or can be considered as a naturally
evolved strategy for attracting attention in a free-market social
system.

The other class of jargon stupidity is from computing
practitioners, of which the Unix/Perl community is exemplary. For
example, the name Unix & Perl themselves are good examples of
buzzing jargons. Unix is supposed to be opposed of Multics and
hints on the offensive and tasteless term eunuchs. PERL is cooked
up to be "Practical Extraction & Reporting Language" and for the
precise marketing drama of being also "Pathologically Eclectic
Rubbish Lister". These types of jargons exudes juvenile humor.
Cheesiness and low-taste is their hall-mark. If you are familiar
with unixism and perl programing, you'll find tons and tons of
such jargons embraced and verbalized by unix & perl lovers. e.g.
grep, glob, shell, pipe, man, regex, tarball, shebang,
Schwartzian Transform, croak, bless, interpolation, TIMTOWTDI,
DWIM, RFC, RTFM, I-ANAL, YMMV and so on.

There is another class of jargon moronicity, which i find them
most damaging to society, are jargons or spurious and vague terms
used and brandished about by programers that we see and hear
daily among design meetings, comp.lang.* newsgroup postings, or
even in lots of computing textbooks or tutorials. I think the
reason for these, is that these massive body of average
programers usually don't have much knowledge of significant
mathematics, yet they are capable of technical thinking that is
not too abstract, thus you ends up with these people defining or
hatching terms a-dime-a-dozen that's vague, context dependent,
vacuous, and their commonality are often a result of sopho-morons
trying to sound big.

Here are some examples of the terms in question:

* anonymous functions or lambda or lamba function
* closure
* exceptions (as in Java)
* list, array, vector, aggregate
* hash (or hash table) <-- fantastically stupid
* rehash (as in csh or tcsh)
* regular expression (as in regex, grep, egrep, fgrep)
* name space (as in Scheme vs Common Lisp debates)
* depth first/breadth first (as in tree traversing.)
* operator/function
* operator overloading, polymorphism
* inheritance
* first class objects/citizen
* pointers, references
* ...

My time is limited, so i'll just give a brief explanation of my
thesis on selective few of these examples among the umpteen.

In a branch of math called lambda calculus, in which much
theories of computation are based on, is the origin of the jargon
_lambda function_ that so commonly reciprocated by lisp and other
programering donkeys. In practice, a subroutine without
side-effects is supposed to be what "lambda function" means.
Functional languages often can defined them without assigning
them to some variable (name), therefore the "function without
side-effects" are also called "anonymous functions". One can see
that these are two distinct concepts. If mathematicians are
designing computer languages, they would probably just called
such thing _pure functions_. The term conveys the meaning,
without the "lamba" abstruseness. (in Fact, the mathematics
oriented language Mathematica refers to lambda function as pure
function, with the keyword Function.) Because most programers are
sopho-morons who are less capable of clear thinking but
nevertheless possess human vanity, we can see that they have not
adopted the clear and fitting term, but instead you see lambda
function this and that obfuscations dropping from their mouths
constantly.
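
The two notions distinguished above can be sketched in Common Lisp (the names here are illustrative, not from the post):

```lisp
;; An anonymous function: defined and used without being given a name.
(mapcar (lambda (x) (* x x)) '(1 2 3))   ; => (1 4 9)

;; A pure function: no side effects, but it has a name -- so
;; "anonymous" and "pure" are independent properties.
(defun square (x) (* x x))

;; An anonymous function that is NOT pure: it mutates a binding.
(let ((count 0))
  (funcall (lambda () (incf count))))
```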

Now the term "closure" can and indeed have meant several things
in the computing field. The most common i believe today, is for
it to mean a subroutine that holds some memory but without some
disadvantages of modifying a global variable. Usually such is a
feature of a programing language. When taken to extreme, we have
the what's called Object Oriented Programing methodology and
languages. The other meaning of "closure" i have seen in textbook,
is for it to indicate that the things in the language is
"closed" under the operations of the language. For example, for
some languages you can apply operations or subroutines to any
thing in the language. (These languages are often what's called
"dynamic typing" or "typeless"). However, in other languages,
things has types and cannot be passed around subroutines or
operators arbitrarily. One can see that the term "closure" is
quite vague in conveying its meaning. The term nevertheless is
very popular among talkative programers and dense tutorials,
precisely because it is vague and mysterious. These pseudo-wit
living zombies, never thought for a moment that they are using a
moronic term, mostly because they never clearly understand the
concepts behind the term among the contexts. One can particular
see this exhibition among Perl programers. (for an example of the
fantastically stupid write-up on closure by the Perl folks, see
"perldoc perlfaq7" and "perldoc perlref".)
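
The first sense of "closure" above, a subroutine that holds memory without a global variable, can be sketched in Common Lisp (MAKE-COUNTER is an illustrative name, not from the thread):

```lisp
;; A closure: a function that captures a private binding, giving it
;; memory between calls without any global variable.
(defun make-counter ()
  (let ((n 0))
    (lambda () (incf n))))

(defparameter *counter* (make-counter))
(funcall *counter*)   ; => 1
(funcall *counter*)   ; => 2  -- N persists, yet is not global
```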

in the so-called "high-level" computing languages, there are
often data types that's some kind of a collection. The most
illustrative is LISt Processing language's lists. Essentially,
the essential concept is that the language can treat a collection
of things as if it's a single entity. As computer languages
evolves, such collection entity feature also diversified, from
syntax to semantics to implementation. Thus, besides lists, there
are also terms like vector, array, matrix, tree, hash/"hash
table"/dictionary. Often each particular term is to convey a
particular implementation of collection so that it has certain
properties to facilitate specialized uses of such groupy. The
Java language has such groupy that can illustrate the point well.

In Java, there's these hierarchy of collection-type of things:

Collection
Set (AbstractSet, HashSet)
SortedSet (TreeSet)
List (AbstractList, LinkedList, Vector, ArrayList)
Map (AbstractMap, HashMap, Hashtable)
SortedMap (TreeMap)

The words without parentheses are java Interfaces, and ones in
parentheses are implementations. The interface holds a concept. The deeper the
level, the more specific or specialized. The implementation carry
out concepts. Different implementation gives different
algorithmic properties. Essentially, these hierarchies of Java
shows the potential complexity and confusion around groupy
entities in computer languages. Now, among the programers we see
daily, who never really thought out of these things, will attach
their own specific meaning to list/array/vector/matrix/etc type
of jargons in driveling and arguments, oblivious to any thought
of formalizing what the fuck they are really talking about. (one
may think from the above tree-diagram that Java the language has
at least put clear distinction to interface and implementation,
whereas in my opinion they are one fantastic fuck up too, in many
respects. (I will detail it some day.))
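
The same point about collection terms can be made in Common Lisp, where distinct types carry out one "collection" concept with different algorithmic properties (a sketch; the variable names are illustrative):

```lisp
(defparameter *l* (list 1 2 3))        ; linked list: O(n) indexed access
(defparameter *v* (vector 1 2 3))      ; array: O(1) indexed access
(defparameter *h* (make-hash-table))   ; hash table: O(1) keyed lookup
(setf (gethash :a *h*) 1)
(elt *l* 2)          ; => 3, walks the list
(aref *v* 2)         ; => 3, direct access
(gethash :a *h*)     ; => 1
```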

... i'm getting tired of writing this. There are just too many
spurious stupid jargons that's meaningless and endlessly
confusing, yet embraced and spoke by all monkey coders. What i am
coming into, is the jargon "tail recursion". What a fantastically
fucking fantastic stupid meaningless term that is so loved by the
LISP programing geeks. In every trickle of chance they'll flash
it out to debate. (hopefully i'll continue explaining the stupid
jargons and background of common computing terms that i listed
above and much more in the future.)

Now, let's talk about tail recursion.

I read the Structure and Interpretation of Computer Programs of
Harold Abelson et al about 4 years ago. I recall reading the
section on tail recursion. In particular, i recall how exacting
and clear it explains the myths of looping away. To this day, i
still hold their writing illuminating. (it is in reading their
book, chapter i think 3 on functional dispatch, that it dawned on
me what is OOP really about, far beyond any fantastically fucking
stupid OOP tutorials and books.) My understanding of Scheme is
very minimal. I'm not sure if i can even write "hello world"
correctly in one shot. I don't know Common Lisp at all. So, you
will pardon and correct me if i make some stupid remarks here.
However, my feelings is that Abelson et al is quite lucid in
explaining that the time/space algorithmic behaviors of those
for/while/until loops of imperative languages can be equivalent
to recursion. And, i think that the term "tail recursion" means
implementation of recursion syntax in a language so that when it
can be linear, it is so. (we should simply call it
well-implemented recursion or something, instead of the
fantastically stupid "tail recursion".)
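
The space distinction behind the term can be shown with two list-length functions (a sketch; note that Common Lisp implementations are not required to eliminate tail calls, though many do):

```lisp
;; Not tail recursive: the (+ 1 ...) happens AFTER the recursive call
;; returns, so each element costs a stack frame -- O(n) space.
(defun len (list)
  (if (null list)
      0
      (+ 1 (len (cdr list)))))

;; Tail recursive: the recursive call is the last act, so a compiler
;; may reuse the frame -- constant space, exactly like a loop.
(defun len2 (list &optional (acc 0))
  (if (null list)
      acc
      (len2 (cdr list) (+ acc 1))))
```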

Kent Pitman wrote:
> "iteration can take on multiple syntactic forms",
> "there
> are equivalences between conventional loops and
> tail recursions", "sometimes
> seeing things in a different
> syntactic form, whether conventional or
> recursive, reveals
> through that form truths that are not as
> perspicuous in the other form"

and

> To the extent that people instead believe that grand
> truths
> only leap out of tail recursive form and do not leap out
> of the other form, they have been misled.

Kent, please make clear and precise statements. What are you
saying exactly? When you speak to me, try to formalize your
thoughts, and express them as much close to symbolic logic as
possible. The former is for your benefit, the latter is for my
benefit. (what i hate is the amorphous humbling blob you wont to
write.)

(don't be smug that i didn't sting you for your devious maneuver
of my post this time. It did not went undetected. Consider it my
kindness to let you off this time. Frankly, my energy are spend
on writing the jargons passage, and now i don't have time to
chase you off.)

Xah
x...@xahlee.org
http://xahlee.org/PageTwo_dir/more.html

--

> From: Kent M Pitman (pit...@world.std.com)
> Subject: Re: one small function
> Newsgroups: comp.lang.lisp
> Date: 2002-02-07 20:24:57 PST

Kaz Kylheku

unread,
Feb 9, 2002, 2:26:29 AM2/9/02
to
In article <7fe97cc4.02020...@posting.google.com>, Xah Lee wrote:
>Unlike mathematicians, where in mathematics there are no fewer
>jargons but each and every one are absolutely necessary. For
>example, polytope, manifold,
>injection/bijection/surjection, group/ring/field.., homological,

How about interjection, that hopefully raises no popular objection,
though that instills in you a sense of dejection, by expressing rejection
of your attempt at this fine establishment's abjection.

Kaz Kylheku

unread,
Feb 9, 2002, 2:32:11 AM2/9/02
to
In article <7fe97cc4.02020...@posting.google.com>, Xah Lee wrote:
>Unlike mathematicians, where in mathematics there are no fewer
>jargons but each and every one are absolutely necessary. For
>example, polytope, manifold,
>injection/bijection/surjection, group/ring/field.., homological,

How about an interjection, that hopefully raises no popular objection,
though that instills in you a sense of dejection, by expressing rejection
of your attempt at this fine establishment's abjection.

>projective, pencil, bundle, lattice, affine, topology, isomorphism,

Paolo Amoroso

unread,
Feb 9, 2002, 7:41:52 AM2/9/02
to
On Fri, 08 Feb 2002 20:00:11 GMT, Duane Rettig <du...@franz.com> wrote:

> c...@sli.uio.no (Christian Nybø) writes:
>
> > ds...@goldshoe.gte.com (Dorai Sitaram) writes:
> >
> > > What's "OVS phrasing"?
> >
> > Object Verb Subject, Yoda's manner of sentence phrasing.
>
> Nah, Yoda speaks with OSV phrasing, at least for declarative
> statements...

An Apollo astronaut would speak with Verb Noun phrasing :)


Paolo
--
EncyCMUCLopedia * Extensive collection of CMU Common Lisp documentation
http://www.paoloamoroso.it/ency/README
[http://cvs2.cons.org:8000/cmucl/doc/EncyCMUCLopedia/]

Kent M Pitman

unread,
Feb 9, 2002, 8:35:38 AM2/9/02
to
x...@xahlee.org (Xah Lee) writes:

> ... My understanding of Scheme is
> very minimal. I'm not sure if i can even write "hello world"
> correctly in one shot. I don't know Common Lisp at all. ...

(Not that this disqualifies you from posting here, but this
certainly leads me to wonder why you do both read and post here.
Please don't take this as a challenge; I am just profoundly curious.)

> However, my feelings is that Abelson et al is quite lucid in
> explaining that the time/space algorithmic behaviors of those
> for/while/until loops of imperative languages can be equivalent
> to recursion. And, i think that the term "tail recursion" means
> implementation of recursion syntax in a language so that when it
> can be linear, it is so. (we should simply call it
> well-implemented recursion or something, instead of the
> fantastically stupid "tail recursion".)

Yes, but I see little evidence that this is what is picked up by
most students.

I asked Sussman and Abelson about this once, long ago. Their answer may
have changed. But at the time they told me that they agreed with me that
"iteration" is an abstract concept and that a "loop" construct is one way
to achieve it notationally and that "tail recursion" is another. To put
it another way, the lisp program:

(defun my-length (list)
  (do ((length 0 (+ length 1))
       (sublist list (cdr sublist)))
      ((null sublist) length)))

is conceptually isomorphic to

(defun my-length (list)
  (labels ((do-it (length sublist)
             (if (null sublist)
                 length
                 (do-it (+ length 1) (cdr sublist)))))
    (do-it 0 list)))

In effect, they are just syntactic rewrites of each other. To me, and I
think to both of them, these are just syntactic sugar for one another.
But they were trying to make some teaching points so they focused mainly
on getting people to write the latter format. I don't think they
have a deathwish for alternate syntaxes, but they feel those syntaxes are
covered to death in other courses and they didn't see wasting their own
precious course space on stuff that could be learned elsewhere.

But the studious omission of such an obvious alternate notation does not
go unnoticed by students. And, moreover, students get their problems marked
wrong if they use a perfectly valid notation that is not part of the course
material. And, as a consequence, the Skinnerian effect of taking this course
is often, in practice, regardless of original intent of Sussman and Abelson,
to teach people the "false truths" that (a) Iteration and Tail Recursion are
different [since it's often the case that the former is informally called
recursion, and the latter for contrast purposes only, is not, and since
one will get them credit and one won't] and that (b) Iteration is Bad and
Tail Recursion is Good.

>
> Kent Pitman wrote:
> > "iteration can take on multiple syntactic forms",
> > "there
> > are equivalences between conventional loops and
> > tail recursions", "sometimes
> > seeing things in a different
> > syntactic form, whether conventional or
> > recursive, reveals
> >through that form truths that are not as
> > perspicuous in the other form"
>
> and
>
> > To the extent that people instead believe that grand
> > truths
> > only leap out of tail recursive form and do not leap out
> > of the other form, they have been misled.
>
> Kent, please make clear and precise statements. What are you
> saying exactly? When you speak to me, try to formalize your
> thoughts, and express them as much close to symbolic logic as
> possible. The former is for your benefit, the latter is for my
> benefit. (what i hate is the amorphous humbling blob you wont to
> write.)

My point was that iteration means exactly what you said: to move down
an object of unbounded length using finite space to perform each
iteration. Tail recursion (but not real recursion) shares this
property because tail recursion implements/expresses/is iteration.

But this is not what a lot of people are observed to come out of Scheme
courses having learned.

Nils Goesche

unread,
Feb 9, 2002, 12:03:08 PM2/9/02
to
In article <sfwvgd6...@shell01.TheWorld.com>, Kent M Pitman wrote:

> x...@xahlee.org (Xah Lee) writes:
>
>> However, my feelings is that Abelson et al is quite lucid in
>> explaining that the time/space algorithmic behaviors of those
>> for/while/until loops of imperative languages can be equivalent
>> to recursion. And, i think that the term "tail recursion" means
>> implementation of recursion syntax in a language so that when it
>> can be linear, it is so. (we should simply call it
>> well-implemented recursion or something, instead of the
>> fantastically stupid "tail recursion".)
>
> Yes, but I see little evidence that this is what is picked up by
> most students.

Did you meet many students who've taken an SICP course?

> I asked Sussman and Abelson about this once, long ago. Their answer may
> have changed. But at the time they told me that they agreed with me that
> "iteration" is an abstract concept and that a "loop" construct is one way
> to achieve it notationally and that "tail recursion" is another. To put
> it another way, the lisp program:
>
> (defun my-length (list)
>   (do ((length 0 (+ length 1))
>        (sublist list (cdr sublist)))
>       ((null sublist) length)))
>
> is conceptually isomorphic to
>
> (defun my-length (list)
>   (labels ((do-it (length sublist)
>              (if (null sublist)
>                  length
>                  (do-it (+ length 1) (cdr sublist)))))
>     (do-it 0 list)))
>
> In effect, they are just syntactic rewrites of each other. To me, and I
> think to both of them, these are just syntactic sugar for one another.
> But they were trying to make some teaching points so they focused mainly
> on getting people to write the latter format. I don't think they
> have a deathwish for alternate syntaxes, but they feel those syntaxes are
> covered to death in other courses and they didn't see wasting their own
> precious course space on stuff that could be learned elsewhere.

That was also the impression I got from reading the book.

> But the studious omission of such an obvious alternate notation does not
> go unnoticed by students. And, moreover, students get their problems marked
> wrong if they use a perfectly valid notation that is not part of the course
> material. And, as a consequence, the Skinnerian effect of taking this course
> is often, in practice, regardless of original intent of Sussman and Abelson,
> to teach people the "false truths" that (a) Iteration and Tail Recursion are
> different [since it's often the case that the former is informally called
> recursion, and the latter for contrast purposes only, is not, and since
> one will get them credit and one won't] and that (b) Iteration is Bad and
> Tail Recursion is Good.

Hm. That disappoints me a little. It's been a while since I read the
book; I grew up with languages like Assembler, BASIC, Pascal and C,
and I remember that when I ran into SICP and Scheme, I found it hard
at first to write recursive functions; so I forced myself, too, to
write everything in a tail recursive manner instead of using loop
constructs; it didn't take long and I ``got it'' and had no problems
anymore with recursion. But never along the way, I had the impression
that tail recursion was better than iteration, and the equivalence
of them was obvious to me, too (and I stopped immediately writing
loops tail-recursively except for rare cases where the recursive
solution looks indeed more elegant). So, maybe it is not wrong
to ``force'' students to write recursively for a while, only to make
them comfortable with recursion (not only tail recursion), which is
a good thing.

And it was always my impression that SICP (the book at least) was
free of the usual Scheme propaganda, I wouldn't even call it a Scheme
book; they just use Scheme as a language, for whatever reason. The
impression I got, and the reason I value this book very high, is that
it teaches people how to THINK, not only how to DO, as most other
books. Naturally, that makes this book somewhat harder to understand
for students, and I had hoped that at least those who are intelligent
enough to ``survive'' the course until the end, would be smart
enough to figure out for themselves that tail recursion and iteration
are simply isomorphic concepts and it doesn't matter at all which
you use. Pure speculation on my side, of course; I am very interested
whether that is actually true, so: Did you (or anyone else) ever
meet people who took (and finished :-) the course and turned out
to be silly Scheme fanatics? And was such a course held by
Abelson or Sussman themselves? Of course, a Scheme fanatic could
use the book for a course and add a whole lot of Scheme propaganda
here and there :-)

Regards,
--
Nils Goesche
Ask not for whom the <CONTROL-G> tolls.

PGP key ID 0xC66D6E6F

Thomas F. Burdick

unread,
Feb 9, 2002, 2:49:41 PM2/9/02
to
Nils Goesche <car...@t-online.de> writes:

> Did you (or anyone else) ever meet people who took (and finished :-)
> the course and turned out to be silly Scheme fanatics? And was such
> a course held by Abelson or Sussman themselves? Of course, a Scheme
> fanatic could use the book for a course and add a whole lot of
> Scheme propaganda here and there :-)

One of the first CS courses here uses SICP, so I've seen a ton of
people who've gone through a SICP course (successfully and not). AFAIK,
none of the people who teach it are Scheme fanatics (one of them,
Fateman, insists on CL as the implementation language when he teaches
the compilers course). Some people come out as Scheme fanatics. Even
more depressing, some come out as Scheme fanatics in the region of
languages that look &/or are like Scheme (Lisp and functional
languages), and insist on the purity and Goodness of tail-recursion,
the Evilness of iteration, Lisp-2, the Goodness of continuations, etc;
but they don't use these languages -- they use Java or C or C++ or
... And with the style they insist on in Lisp, Scheme, functional
languages, etc, it's not hard to see why. People are really
frustrating sometimes.

Brian P Templeton

Feb 9, 2002, 3:30:51 PM
Erik Naggum <er...@naggum.net> writes:

> * "P.C." <per.c...@privat.dk>
> | Guess the difference Sceme/Lisp are very small but important
>
> They are so large that someone who has been trained in Scheme has no hope
> of ever learning Common Lisp, because they _think_ the differences are
> very small despite strong evidence to the contrary, and because it is
> possible to write Scheme in Common Lisp, they will never be "offered" the
> necessary negative feedback necessary to "get it".
>

[snip]

Although I also dislike Scheme strongly, I don't agree with your
assertion that someone who knows Scheme has no hope of learning CL - I
know at least one person who is a counterexample to this.

>
> ///
> --
> In a fight against something, the fight has value, victory has none.
> In a fight for something, the fight is a loss, victory merely relief.

--
BPT <b...@tunes.org> /"\ ASCII Ribbon Campaign
backronym for Linux: \ / No HTML or RTF in mail
Linux Is Not Unix X No MS-Word in mail
Meme plague ;) ---------> / \ Respect Open Standards

Nils Goesche

Feb 9, 2002, 6:45:53 PM
In article <xcvpu3e...@conquest.OCF.Berkeley.EDU>, Thomas F. Burdick wrote:
> Nils Goesche <car...@t-online.de> writes:
>
>> Did you (or anyone else) ever meet people who took (and finished :-)
>> the course and turned out to be silly Scheme fanatics? And was such
>> a course held by Abelson or Sussman themselves? Of course, a Scheme
>> fanatic could use the book for a course and add a whole lot of
>> Scheme propaganda here and there :-)
>
> One of the first CS courses here uses SICP, so I've seen a ton of
> people who've gone through a SICP course (successfully and not). AFAIK,
> none of the people who teach it are Scheme fanatics (one of them,
> Fateman, insists on CL as the implementation language when he teaches
> the compilers course). Some people come out as Scheme fanatics. Even
> more depressing, some come out as Scheme fanatics in the region of
> languages that look &/or are like Scheme (Lisp and functional
> languages), and insist on the purity and Goodness of tail-recursion,
> the Evilness of iteration, Lisp-2, the Goodness of continuations, etc;
> but they don't use these languages -- they use Java or C or C++ or
> ... And with the style they insist on in Lisp, Scheme, functional
> languages, etc, it's not hard to see why. People are really
> frustrating sometimes.

Hm; sad. But, do you think this is the fault of the book? Maybe
some people just /are/ stupid, or maybe it is possible to pass the
course without understanding the book, I don't know. Sorry for
insisting like this, but I really think it's a good book and keep
recommending it to people, but I wouldn't want to create new Scheme
fanatics that way, of course :-)

Thomas F. Burdick

Feb 9, 2002, 7:10:09 PM
Nils Goesche <car...@t-online.de> writes:

I think concluding that people are stupid is a cop-out. UC-Berkeley
is a very good university; no one here is stupid. They may be
immature, behave illogically, think they know everything, etc., but
it's a pretty easy guarantee that they're not stupid. And I don't
think that most people pass that course (a lot don't pass, btw)
without understanding the book. A more reasonable conclusion would be
that SICP is a good book for some people, but its extremely one-sided
approach often fails miserably.

Nils Goesche

Feb 9, 2002, 9:03:11 PM
In article <xcvpu3e...@whirlwind.OCF.Berkeley.EDU>, Thomas F. Burdick wrote:
> Nils Goesche <car...@t-online.de> writes:

>> Hm; sad. But, do you think this is the fault of the book? Maybe
>> some people just /are/ stupid, or maybe it is possible to pass the
>> course without understanding the book, I don't know. Sorry for
>> insisting like this, but I really think it's a good book and keep
>> recommending it to people, but I wouldn't want to create new Scheme
>> fanatics that way, of course :-)
>
> I think concluding that people are stupid is a cop-out. UC-Berkeley
> is a very good university; no one here is stupid. They may be
> immature, behave illogically, think they know everything, etc., but

Yes, yes; that's all I meant by `stupid' :-)

> it's a pretty easy guarantee that they're not stupid. And I don't
> think that most people pass that course (a lot don't pass, btw)
> without understanding the book. A more reasonable conclusion would be
> that SICP is a good book for some people, but its extremely one-sided
> approach often fails miserably.

Another conclusion might be that it is only wrong to give it to
beginners. I got very much out of it, but when I read it I was already
programming for about 15 years and working as a full time programmer
for about three years. I always thought ``Oh my god, why didn't
I discover this wonderful book earlier?'', but maybe that's wrong.
It is very hard to tell which books are good for beginners. I was an
assistant teacher of mathematics at the University for four years,
and I think I was a pretty good teacher (people who were officially
assigned to other teachers came to my courses instead), but I never
found out how to tell which books are good for students.

For instance, I told everyone that there is one and only One True
Book that everybody should read as a first text on complex analysis:
``Functions of One Complex Variable'' by John Conway; some people
bought it, but I don't think many of them read much of it. Instead,
most of them preferred a certain other book, which I always thought was
totally stupid, full of errors, much too wordy and infinitely boring:
The authors always close every topic just when it becomes interesting.
Something like a hundred and fifty pages that contain almost nothing,
a total waste of time. But they preferred it, even the good ones.
I never found out why. I just hope the fact that the other book
was in German (and cheaper) wasn't the only reason ;-)

Erik Naggum

Feb 9, 2002, 9:31:15 PM
* Nils Goesche

| Hm; sad. But, do you think this is the fault of the book?

Ideas influence people in sundry ways. Books that are written with the
purpose of expanding on a particular idea are always responsible for the
readers and their reactions. The ultimate example is what religions call
their holy scriptures -- they are credited with giving people a sense of
meaning in their lives. Various other books have had the same effect on
people, such as Atlas Shrugged. When something bad happens to people who
"believe" in these books, a lot of people claim that the religion, book,
etc, had nothing to do with it.

| Maybe some people just /are/ stupid, or maybe it is possible to pass the
| course without understanding the book, I don't know.

Gee, I would like to visit your planet. Here on earth, stupidity is the
most common mental illness and it has no cure. Intellectual laziness is
the result of the dramatic absence of any need to think for the common
citizen, so most people get by unexercised, just like they do with their
physical well-being. I think "fathead" is very descriptive.

| Sorry for insisting like this, but I really think it's a good book and
| keep recommending it to people, but I wouldn't want to create new Scheme
| fanatics that way, of course :-)

The book is actually a very good read _after_ you understand much more
than it tries to teach and have the background knowledge to keep it in
context. It puts things into a context of its own that differs greatly
from other contexts that you cannot expect people to have. Many books
that have "influential" ideas work this way because they are somehow
"detached" from the world people normally experience and live in, yet
"explain" things to them with stunning clarity -- which is not hard if
you do not have to be bothered by the real world. In this way, it is far
easier to tell a wonderfully elegant lie about something that is not than
it is to find elegance in what is. I mean, Hollywood would not _be_ if
it had not always been easier to tell a wonderful story than to live a
wonderful life, but believing that one lives in a Hollywood world does
people a lot of harm. Likewise, not understanding that SICP tells a
wonderful story can seriously hurt the immature mind.

/// 2002-02-09


--
In a fight against something, the fight has value, victory has none.

In a fight for something, the fight is a loss, victory merely relief.

Erik Naggum

Feb 9, 2002, 9:34:39 PM
* Thomas F. Burdick

| I think concluding that people are stupid is a cop-out. UC-Berkeley
| is a very good university; no one here is stupid. They may be
| immature, behave illogically, think they know everything, etc., but
| it's a pretty easy guarantee that they're not stupid.

Intelligent people can be far more stupid than unintelligent people,
because the latter do not have the intellectual capacity of the former to
create a "better" world of their own in which they can pretend to live,
and get away with it. Intelligence is merely higher ability; stupidity is
the lack of skill in using whatever abilities you have.

/// 2002-02-09


--
In a fight against something, the fight has value, victory has none.

In a fight for something, the fight is a loss, victory merely relief.

Erik Naggum

Feb 9, 2002, 9:38:40 PM
* Brian P Templeton

| Although I also dislike Scheme strongly, I don't agree with your
| assertion that someone who knows Scheme has no hope of learning CL - I
| know at least one person who is a counterexample to this.

Oh, but anomalies do not disprove a general assertion, nor do they make
good material for generalization to begin with. If a person has been
trained in some other language before being trained in Scheme, they have
much better chances of not being brainwashed by Scheme, because there is
something there to begin with. I should perhaps clarify that I mostly
consider the problem of "Scheme as the first programming language".

/// 2002-02-09

Nils Goesche

Feb 9, 2002, 10:16:03 PM
In article <32222970...@naggum.net>, Erik Naggum wrote:
> * Nils Goesche
>| Hm; sad. But, do you think this is the fault of the book?
>
> Ideas influence people in sundry ways. Books that are written with the
> purpose of expanding on a particular idea are always responsible for the
> readers and their reactions. The ultimate example is what religions call
> their holy scriptures -- they are credited with giving people a sense of
> meaning in their lives. Various other books have had the same effect on
> people, such as Atlas Shrugged. When something bad happens to people who
> "believe" in these books, a lot of people claim that the religion, book,
> etc, had nothing to do with it.

Sure enough, but I won't do that. The very purpose of my question was
something different: We were talking about people who took a certain
course that was based on SICP. Some or many of them got something
bad out of it. Now, my question was: is this because of the
/book/ or only because of the /course/; I don't really know what it
/means/ that a course is `based' on a book -- at the university I
attended, there wasn't a /single/ course `based' on a book. The one
who held the lecture presented the material in whatever way he liked,
proving everything he wanted by himself, selecting every topic he
wanted by himself. When I hear that a course is `based' on SICP,
I imagine a professor who tells his students on the first day
``There is this fine book, SICP, which you really should read'', and
then starts presenting whatever stuff he cares about (I am exaggerating
a little, but I hope you know what I mean). If this is the case in
such courses, I think it is really not justified to blame the book
if something bad comes out of them (of course I don't know what such
courses look like at American universities, sorry about that).

>| Maybe some people just /are/ stupid, or maybe it is possible to pass the
>| course without understanding the book, I don't know.
>
> Gee, I would like to visit your planet. Here on earth, stupidity is the
> most common mental illness and it has no cure. Intellectual laziness is
> the result of the dramatic absence of any need to think for the common
> citizen, so most people get by unexercised, just like they do with their
> physical well-being. I think "fathead" is very descriptive.

I know what you mean, and you are right. But don't forget that there
are always a few, very few, among these idiots who are only `sleeping'
and might, at some time, `awake'; much of the teaching effort is all
about making those `sleepers' awake.

>| Sorry for insisting like this, but I really think it's a good book and
>| keep recommending it to people, but I wouldn't want to create new Scheme
>| fanatics that way, of course :-)
>
> The book is actually a very good read _after_ you understand much more
> than it tries to teach and have the background knowledge to keep it in
> context. It puts things into a context of its own that differs greatly
> from other contexts that you cannot expect people to have.

Maybe. As I said in another post, it might be better to recommend
the book to more advanced people, rather than to beginners.

Nils Goesche

Feb 9, 2002, 10:38:25 PM
In article <32222970...@naggum.net>, Erik Naggum wrote:
> Various other books have had the same effect on people, such as
> Atlas Shrugged.

Sorry, I forgot to ask something: What about this book? I found it at
Amazon but have never heard of it. Should I read it?

Erik Naggum

Feb 9, 2002, 11:05:06 PM
* Nils Goesche

| Now, my question was, is this because of the /book/ or only because of
| the /course/;

I only argue that books that expand on ideas affect people in many
strange ways. The book clearly has partial "blame" because it erects its
own context without sufficient ties and references to the world around it
to make it clear that it is a wonderful story, not a description of the
real world. Since most people are not trained in integrating what they
hear with what they know, the creation of a context _outside_ of their
normal frame of reference is _dangerous_. Bridging between the context
of a book/idea and the real world is very necessary. The course and the
professor would both have to be _exceptional_ to bridge the context of
SICP with the real world of programming computers. (I would argue that
the same is true for K&R's C book, because it also creates a very simple
world/context that simply is not real and which has deluded C programmers
that they live in the "virtual world" that C is good at programming in.)

| If this is the case in such courses, I think it is really not justified
| to blame the book if something bad comes out of them (of course I don't
| know how such courses look like at American universities, sorry about
| that).

When _would_ it be justified to blame the book?

| I know what you mean, and you are right. But don't forget that there are
| always a few, very few, among these idiots who are only `sleeping' and
| might, at some time, `awake'; much of the teaching effort is all about
| making those `sleepers' awake.

That was very poetic and beautifully said.

///

Erik Naggum

Feb 9, 2002, 11:42:21 PM
* Nils Goesche

| Sorry, I forgot to ask something: What about this book? I found it at
| Amazon but have never heard of it. Should I read it?

Yes, you should read it. It adds an important perspective on many things
that are difficult to understand in our complex society, but do not
mistake the perspective for the whole story -- the same advice that
applies to SICP. It is especially important if you are inclined to think that
only the "working masses" are worth fighting for and believe that the
owners exploit them -- since there are billions of words written on how
bad business people are, it is quite interesting to see things more from
their "side". It is wisely written as a novel, in which you can immerse
yourself and suspend disbelief and really enjoy it, but some people never
unsuspended their disbelief and have become rather "nutty" as a result,
arguing and living as if the world described therein _is_ the real world.
(The author once said the proof that the world she described was real was
that she could write the book. Whichever way you try to understand this,
it is a warning sign.) From a philosophical perspective, it provides an
opportunity to understand a view that implicitly underlies much of modern
society but which is fought and misrepresented by those who would rather
return to the tribal societies of, e.g., the Taliban, or the socialist
hell that the author barely escaped (but which never escaped her, like
many very traumatizing events in people's lives, and which must be kept in
mind to understand what she is actually rebelling against). My signature
is my summary of a lot of hardship both witnessed and experienced when I
found to my surprise that it was actually hard to let go of the desire to
fight against the Norwegian tax authorities when they finally released
their death grip after 15 years of hell and I felt more empty than happy.
(I think Ayn Rand would have been a very different person had she been
able to pick herself up and go on, rather than dwell on the evils she had
endured and latch onto a "live to tell" mode that some victims of evil
have a tendency never to get out of.) From a literary perspective, she
has really mastered the "romantic" school of description of people and
landscapes alike, but it is not naturalistic, and therefore appears to
_be_ only what she describes. This is another commonality with SICP that
it takes some literary exposure to be comfortable with. Naturalism is
the school that argues that you should tell the whole story, while the
romantic school argues that you should say only what is important to
understand something, discarding the inessential. I suspect that you
will have no problem with this since you could absorb the good ideas from
SICP without believing that what was omitted does not exist.

I think further discussion should go in mail; this is way off-topic.

///

Alain Picard

Feb 9, 2002, 11:49:41 PM
Erik Naggum <er...@naggum.net> writes:

> The book clearly has partial "blame" because it erects its
> own context without sufficient ties and references to the world around it
> to make it clear that it is a wonderful story, not a description of the
> real world.

[SNIP]
and

> (I would argue that the same is true for K&R's C book, because it
> also creates a very simple world/context that simply is not real
> and which has deluded C programmers that they live in the "virtual
> world" that C is good at programming in.)

Well, it seems a bit harsh to blame those books (or, I suppose, more appropriately,
their authors) for the (admittedly vast) shortcomings of their readers.

I guess K&R probably _ARE_ responsible for an entire generation of C programmers
who don't check the return codes of their system calls. K&R would probably
argue that they were trying to teach C, not proper program design and defensive
programming.

Abelson & Sussman probably _ARE_ responsible for an entire generation of Scheme
programmers thinking that iteration is "impure and degraded", _completely_ missing
SICP's point that an iterative _PROCESS_ can be written in recursive form (given
a guarantee of tail recursion). SICP was trying to teach about the underlying
algorithmic structure.

In both cases, that countless morons didn't get it isn't really the
authors' fault.

These arguments would apply even more strongly to religious texts, IMO.
(When was the last time you saw a "christian" giving away all their worldly
possessions and devoting themselves to their fellow man!?)

> When _would_ it be justified to blame the book?

Never. Although I guess books like these _could_ come with a disclaimer:
"Warning! Adult material inside! Read only if you can be critical and
form your own judgments!"

--
It would be difficult to construe this as a feature.
        -- Larry Wall, in article <1995May29....@netlabs.com>

Bulent Murtezaoglu

Feb 10, 2002, 12:12:33 AM
>>>>> "EN" == Erik Naggum <er...@naggum.net> writes:
[...]
EN> I only argue that books that expand on ideas affect people
EN> in many strange ways. The book clearly has partial "blame"
EN> because it erects its own context without sufficient ties and
EN> references to the world around it to make it clear that it is
EN> a wonderful story, not a description of the real world.

I think I understand this and I agree with what I understand. The scare
quotes around 'blame' are appropriate, I think.

EN> Since
EN> most people are not trained in integrating what they hear with
EN> what they know, the creation of a context _outside_ of their
EN> normal frame of reference is _dangerous_. [...]

But how is this integration, or rather its necessity and utility, to
be learned? One way, it seems to me, is to read a wonderful story
(say SICP), and then another (say K & R's C), and then another and so
on. With the net gain being not just the accumulation of what those
books teach you but also the realization that all these wonderful
stories are just aspects of something more wondrous, which is solving
problems in the "real world." Unless _great_ teaching _and_ mentoring
are available, it seems listening to more than one of these great
stories and maybe believing a few for a period is a workable way to
gain the reflex of integration you point to. I think the danger you
describe is brief in effect, and, at any rate, worth taking.

cheers,

BM

Erik Naggum

Feb 10, 2002, 12:50:43 AM
* Alain Picard <api...@optushome.com.au>

| In both cases, that countless morons didn't get it isn't really the
| author's fault.

I think a responsible author should build in _correctors_ for easily
predictable false beliefs that might arise in their readers as
unintended side-effects. (After all, there is a second
edition out, now.) Most of my excellent math books are rife with
correctors and clues that keep the reader off the false paths to
understanding, and I really admire these authors (John M. Apostol,
Richard Courant, Fritz John). If you read Richard Feynman's papers and
lectures on physics, you will find that he lets you see several open
paths ahead from a given experiement or theory or argument, then
experctly closes each of the wrong ones in a way that lets you grasp
exactly how things work, but not from his words, from your understanding
of the world around you, leading forward to several new open paths, etc.
Most of the computer science books I have read are of the "this is the
truth because I say so" style, which I think betrays the problems of the
entire field: There is a "bottom" to computer science since it is all
made-up stuff. (Mathematics may also be largely made up, but it has no
bottom, and there is no bottom to the natural sciences, medicine, etc.)
If you can get to the bottom of it, it is too shallow.

| These arguments would apply even more strongly to religious texts, IMO.
| (When was the last time you saw a "christian" giving away all their
| worldly possessions and devoting themselves to their fellow man!?)

Yesterday. Because of my fight with our evil tax authorities, I get to
hear a lot of stories about how people have more or less willingly given
up all their worldly possessions for the benefit of their fellow
men, at least if you believe that that is what taxation is for. :)

| Although I guess books like these _could_ come with a disclaimer:
| "Warning! Adult material inside! Read only if you can be critical and
| form your own judgments!"

Responsible authors sort of bake that into their tone and language. For
instance, Guy L. Steele jr did it with a lot of very clever humor in both
editions of Common Lisp the Language -- you had to get the point he made
to really enjoy it, and people who failed to get it, at least seem to
realize they failed to get it. If the reader can walk away with "I think
I get it" when he does not, the book is no good. If the reader can
accurately determine whether he gets it or not, the book is great. If
the reader thinks "I don't get it" even when he does, it is the worst
kind of book. (I think Immanuel Kant is the worst author in the history
of mankind for this particular reason.)

Erik Naggum

Feb 10, 2002, 1:18:47 AM
* Bulent Murtezaoglu

| But how is this integration, or rather its necessity and utility, to be
| learned?

I have no idea. Some people do, some people don't.

| Unless _great_ teaching _and_ mentoring are available, it seems listening
| to more than one of these great stories and maybe believing a few for a
| period is a workable way to gain the reflex of integration you point to.
| I think the danger you describe is brief in effect, and, at any rate,
| worth taking.

I think, from observing children, that this desire to understand how new
things they hear match what they previously heard or understood and
coming up with new understanding on their own and being able to deal
_productively_ with cognitive dissonance instead of the apparent default
to reject the new out of hand and just keep believing the old, starts at
a very early age and is not actually _learned_, but is more acquired as a
general attitude towards the world and their interaction with it.

Some seem to grasp that cognitive dissonance is not a threat to their
beliefs and do not generally believe all kinds of crap to begin with,
from the get go, but others may learn how to deal with new information if
they are provided with a very safe and secure setting where being upset
(some seem to think they have been "lied" to if they have to revise their
view of something) is handled appropriately by their parents. I have
seen how making it more painful to stick to an old belief than to adopt a
new one can jump-start this process in adults who don't generally get it,
but a few people go postal when they are presented with anything that
even hints at their being _wrong_ about something.

Craig Brozefsky

Feb 10, 2002, 1:02:31 PM
Erik Naggum <er...@naggum.net> writes:

> They are so large that someone who has been trained in Scheme has no hope
> of ever learning Common Lisp, because they _think_ the differences are
> very small despite strong evidence to the contrary, and because it is
> possible to write Scheme in Common Lisp, they will never be "offered" the
> necessary negative feedback necessary to "get it".

I looked ahead in the thread to see your qualification about Scheme
being the first language taught, but I would like to add another data
point from someone who learned Scheme before CL, but had learned other
languages first.

I came across Scheme after a few years of C/Perl/ObjC/Java with some
minor readings in ML and its ilk. I became quite fascinated with it
and started writing sizeable chunks of code with it[1]. I quickly ran
into the limitations of Scheme's size and ability to model the
complexity of the worlds little niggling points. I tried to write
code to R5RS and was basically stuck compromising and including SLIB
or trucking along my own little collection of code.

During this process there was a window of a month or two where the
benefits of working in Scheme (the new ideas, the early introduction
to macros thru the portable "defmacro" utilities most implementations
had, refined understanding of scoping and the basics of symbolic
computation) outweighed the pain of dealing with it's pragmatic
shortcomings for many tasks. But the scale soon tipped and I found
myself spending most of my time overcoming Schemes simplicity and
purity fetish.

It was about that time that my co-worker and friend Jesse Bouwman
started pushing CL at the office for various spare-time hacks,
including of course the proverbial Emacs replacement. He wrote some
buffers while I whipped up a fledgling curses interface. I had a real
object system, defmacro, a great free implementation, cl-http, ILISP,
the HyperSpec, and all the other things that make CL useful.

That was the end of Scheme for me. Shortly after, we convinced our
management to let us develop our next project in CL; it became our
major product and revenue stream, and the driver/funder of IMHO and
USQL development[2].

[1] http://www.red-bean.com/~craig/software.html

[2] http://alpha.onshored.com/lisp-software

--
Craig Brozefsky <cr...@red-bean.com>
http://www.red-bean.com/~craig
Ask me about Common Lisp Enterprise Eggplants at Red Bean!

Thomas F. Burdick

Feb 10, 2002, 2:23:23 PM
Nils Goesche <car...@t-online.de> writes:

> In article <32222970...@naggum.net>, Erik Naggum wrote:
> > * Nils Goesche
> >| Hm; sad. But, do you think this is the fault of the book?
> >
> > Ideas influence people in sundry ways. Books that are written with the
> > purpose of expanding on a particular idea is always responsible for the
> > readers and their reactions. The ultimate example is what religions call
> > their holy scriptures -- they are credited with giving people a sense of
> > meaning in their lives. Various other books have had the same effect on
> > people, such as Atlas Shrugged. When something bad happens to people who
> > "believe" in these books, a lot of people claim that the religion, book,
> > etc, had nothing to do with it.
>
> Sure enough, but I won't do that. The very purpose of my question
> was something different: We were talking about people who took a
> certain course that was based on SICP. Some or many of them got
> something bad out of it . Now, my question was, is this because of
> the /book/ or only because of the /course/; I don't really know what
> it /means/ that a course is `based' on a book -- at the university I
> attended, there wasn't a /single/ course `based' on a book. The one
> who held the lecture presented the material in whatever way he
> liked, proving everything he wanted by himself, selecting every
> topic he wanted by himself.

This is how things work in advanced courses (sometimes with a
pathological result in book costs if the professor doesn't care about
the cost to his/her students, and uses 100 pages from 5 different
books), but introductory courses are really difficult to teach.
University professors are surrounded by people with a sturdy knowledge
in the basics of their field, and so never have to worry about what
exactly those are. Introductory courses are supposed to teach
students the things that experts in the field never even notice
they are thinking about.

So, people who have supposedly designed successful courses write up
books like SICP. Other professors at other institutions will trust
their more introductory-teaching-minded colleagues and use the
organization of the book as the basis for their syllabus. That's not
to say they just read the book, but the syllabus generally tracks the
book, and every day they expect the students to have read the portion
of the book up to where they are then; and they talk about that
subject, with the assumption of knowledge of the book.

> When I hear that a course is `based' on SICP, I imagine a professor
> who tells his students on the first day ``There is this fine book,
> SICP, which you really should read'', and then starts presenting
> whatever stuff he cares about (I am exaggerating a little, but I
> hope you know what I mean). If this is the case in such courses, I
> think it is really not justified to blame the book if something bad
> comes out of them (of course I don't know what such courses look like
> at American universities, sorry about that).

Professors still do that sometimes, but there have been so many
useless introductory courses taught that way that, at least at public
universities, the approach is greatly discouraged. Well, except for
the small percentage of professors who are good at teaching
introductory material, in which case, they teach whatever they want,
and everyone else teaches more-or-less whatever that professor wanted :)

> > The book is actually a very good read _after_ you understand much more
> > than it tries to teach and have the background knowledge to keep it in
> > context. It puts things into a context of its own that differs greatly
> > from other contexts that you cannot expect people to have.
>
> Maybe. As I said in another post, it might be better to recommend
> the book to more advanced people, rather than to beginners.

Which should make people wonder if the book isn't an utter failure in
its intended goal of being an introduction to CS. I think it could be
used in conjunction with another intro book as the basis for a good
introductory course, but I think that it forms an entirely rotten
foundation alone. The practical problem, of course, is that you'd
need to find a book to balance it out, that was also based on Scheme.

Kent M Pitman

Feb 10, 2002, 3:37:32 PM
Nils Goesche <car...@t-online.de> writes:

> When I hear that a course is `based' on SICP,
> I imagine a professor who tells his students on the first day
> ``There is this fine book, SICP, which you really should read'', and
> then starts presenting whatever stuff he cares about (I am exaggerating
> a little, but I hope you know what I mean). If this is the case in
> such courses, I think it is really not justified to blame the book
> if something bad comes out of them (of course I don't know what such
> courses look like at American universities, sorry about that).

I don't know. A course that doesn't define both what to do and what
not to do seems potentially incomplete, especially in the face of
years of data that suggests a certain outcome. I _know_ I spoke to
Abelson and Sussman about this outcome and they did not fix their
course. As a new course, I think you could say this was an accidental
result. As a robust course, I think you have to say they weren't
unhappy with the outcome.

As a related issue, the reason that it tells you how to pronounce SETQ
in the ANSI CL standard isn't that I thought someone would want to know
how to pronounce it, since it seems obvious to me. It's because someone
told me they worked somewhere where people pronounced it "set-kuh" (like
the "uh" in "cut") and I wanted to give people some leverage in saying
"no, this isn't right".

I think textbook authors can be an enormous help to students learning from
Rip Van Winkle professors (who learned a subject 20 years ago, then went to
sleep, then woke up and thought they could still teach it) by anticipating
wrong things that might be taught and saying "Actually, that's not so."
This may cause some determined folks not to use your book so they don't have
to answer to it, but in some cases it will wake some people up. And I think
it's definitely worth doing.

Erann Gat

Feb 10, 2002, 3:45:50 PM
In article <32222972...@naggum.net>, Erik Naggum <er...@naggum.net> wrote:

> Intelligent people can be far more stupid than unintelligent people,
> because the latter do not have the intellectual capacity of the former to
> create a "better" world of their own in which they can pretend to live,
> and get away with it. Intelligence is merely higher ability, stupidity is
> the lack of skill in using whatever abilities you have.

stu·pid - adj. stu·pid·er, stu·pid·est

1.Slow to learn or understand; obtuse.
2.Tending to make poor decisions or careless mistakes.
3.Marked by a lack of intelligence or care;
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
foolish or careless: a stupid mistake.
4.Dazed, stunned, or stupefied.
5.Pointless; worthless: a stupid job.

E.

Erik Naggum

Feb 10, 2002, 5:29:39 PM
* Erik Naggum

> Intelligent people can be far more stupid than unintelligent people,
> because the latter do not have the intellectual capacity of the former to
> create a "better" world of their own in which they can pretend to live,
> and get away with it. Intelligence is merely higher ability, stupidity is
> the lack of skill in using whatever abilities you have.

* g...@jpl.nasa.gov (Erann Gat)


| stu·pid - adj. stu·pid·er, stu·pid·est
|
| 1.Slow to learn or understand; obtuse.
| 2.Tending to make poor decisions or careless mistakes.
| 3.Marked by a lack of intelligence or care;
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
| foolish or careless: a stupid mistake.
| 4.Dazed, stunned, or stupefied.
| 5.Pointless; worthless: a stupid job.

I do not think you are supporting any actual disagreement with me.

Both actions and people may be intelligent, but "intelligent" has a
different meaning in each of those cases. Only meaning 1 above is
therefore _actually_ related to intelligence, the rest of them are
clearly supporting my take on this distinction, and _particularly_
meaning 3.

///

Dorai Sitaram

Feb 10, 2002, 7:17:59 PM
In article <u8za3c...@theworld.com>,
Christopher C. Stacy <cst...@theworld.com> wrote:
>MGHMnhhmm...Always two there are...a master and a learner language.
>But which one was the Sithcp? The master, or the learner?

Three months already gone since Jan 22? My, how
time flies! :-)

--d

Kenny Tilton

Feb 10, 2002, 8:53:02 PM
Erann Gat wrote:
>
>
> stu·pid - adj. stu·pid·er, stu·pid·est
>
> 1.Slow to learn or understand; obtuse.
> 2.Tending to make poor decisions or careless mistakes.
> 3.Marked by a lack of intelligence or care;
> ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

Now that peace has broken out on c.l.l., funny how the needlessly
disputatious stuff really stands out.

--

kenny tilton
clinisys, inc
---------------------------------------------------------------
"We have a pond and a pool. The pond would be good for you."
- Ty to Carl, Caddy Shack

Brad Knotwell

Feb 10, 2002, 11:39:07 PM
Alain Picard <api...@optushome.com.au> writes:
> Abelson & Sussman probably _ARE_ responsible for an entire generation of
> Scheme programmers thinking that iteration is "impure and degraded",
> _completely_ missing SICP's point that an iterative _PROCESS_ can be
> written in recursive form, (given a guarantee of tail recursion). SICP
> was trying to teach about the underlying algorithmic structure.

I suppose I'm being naive, but I don't see how someone reading SICP could
miss this key point. Either the first or second chapter goes over this in
detail (my copy's at work), having the reader convert recursive procedures
into iterative ones.

It makes me wonder if there's a significant difference between people who
read SICP for a university class and those of us who read it on our own.

--Brad


Marc Spitzer

Feb 11, 2002, 12:31:44 AM

I think there is. You can see it in every field: there are people who
want to get by and people who want to do well. The people who want to
do well can turn into professionals in their field. The people who
want to get by are just laborers: do your week, go home, and forget
about it when you are not working. Many of the programmers out there
are labor and it shows. Many of the people in programming courses,
in college and elsewhere, are labor and I guess it shows.

One of the nice things about this group is almost everyone here wants
to do well.

marc

Dr. Edmund Weitz

Feb 11, 2002, 1:07:50 AM
ma...@oscar.eng.cv.net (Marc Spitzer) writes:

> I think there is. You can see it in every field: there are people
> who want to get by and people who want to do well. The people who
> want to do well can turn into professionals in their field. The
> people who want to get by are just laborers: do your week, go home,
> and forget about it when you are not working. Many of the
> programmers out there are labor and it shows. Many of the people in
> programming courses, in college and elsewhere, are labor and I guess
> it shows.

Once again an opportunity to point to Robert Strandh's fine article
about this topic:

<http://dept-info.labri.u-bordeaux.fr/~strandh/Teaching/MTP/Common/Strandh-Tutorial/psychology.html>

Edi.

--

Dr. Edmund Weitz
Hamburg
Germany

The Common Lisp Cookbook
<http://agharta.de/cookbook/>

Erann Gat

Feb 11, 2002, 12:59:30 AM
In article <3C6724B0...@nyc.rr.com>, Kenny Tilton
<kti...@nyc.rr.com> wrote:

> Erann Gat wrote:
> >
> >
> > stu·pid - adj. stu·pid·er, stu·pid·est
> >
> > 1.Slow to learn or understand; obtuse.
> > 2.Tending to make poor decisions or careless mistakes.
> > 3.Marked by a lack of intelligence or care;
> > ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
>
> Now that peace has broken out on c.l.l., funny how the needlessly
> disputatious stuff really stands out.

Indeed.

E.

Xah Lee

Feb 11, 2002, 3:22:10 AM
Dear Kent Pitman,

Thank you for taking the time to answer my thoughts about tail
recursion.

x...@xahlee.org (Xah Lee) writes:
> ... My understanding of Scheme is
> very minimal. I'm not sure if i can even write "hello world"
> correctly in one shot. I don't know Common Lisp at all. ...

Kent Pitman wrote:
> (Not that this disqualifies you from posting here, but this
> certainly leads me to wonder why you do both read and post here.
> Please don't take this as a challenge; I am just profoundly curious.)

Dear Kent, like you, i also am curious about lots of things. For
example, i'm curious about why courtesans sell their pussy, and why
pussies have profound attraction to me. I would and still do spend
hours upon hours gazing them, finding it a great form of relaxation,
more so than correcting droppings from comp.lang.lisp bigwigs, or
enjoying Naggum's personal fantasies. Nevertheless, i'm aware your
ramification and topic swinging abilities do not disqualify you from
posting here.

The plain and longer answer, would be the following.

I come here to read Nagging Naggum's posts. I enjoy reading his
posts. I learn English vocabularies. I would not say that i knew less
words than Erik, but nevertheless i learn words from his writings,
from totally new to usage example. I also cull off his fantastic
opinions for my consideration. Lastly, i find his rants and spats with
fellow human beings entertaining.

The second important reason for me reading comp.lang.lisp, is that i
use it as on outlet of my meticulous crafted rants when opportune. I
could pick other newsgroups, but in general it is not fun if the
community does not welcome it. Trolling per se is not something that
interest me. As it so happens, that i started to read lisp newsgroups
around 1998 because i was learning Scheme, and i find the lispers in
general much more educated than, say, comp.lang.perl.* in which i use
to discuss on-topic perl related stuff now and then. So have i used
comp.emacs or xemacs often, and some ohter mailing lists as well. In
any case, my lavish rants tends to go to comp.lang.lisp.

The harmony of authorship and readership takes a matching. Imagine
Einstein ranting his physics theories to 17th-century physicists; he'd
probably be kill-filed to death. Likewise, Larry Wall's drivel fits
the unix moron's mind to a tee; and that i find comp.lang.lisp has the
best readership for my rants among the few online discussions groups i
use.

My vague motivation is for me in the future to collect my rants and
form a book. May it be a coherent account of unix & perl's damage to
society, with technical criticisms, or similar attacks on the slew of
fantastic fucking stupid imperative languages or the SQL language and
other software "technologies", or cogent commentaries on the idiocies
of the software industry such as the Design Patterns... i don't know. In
the last few years as i write more and more, i find myself enjoying
writing. I consider my act of writing soothing to my anger caused by
unthinkers in society. When my rants are offensive to unthinkers, i
consider it a form of sweet revenge.

The other motivation is to educate people, but let's not talk about
that. Once you tell people that, all sorts of things come flying
against you, from hats of hypocrisy to spontaneous resistance. I find
that the best way to educate, is by means of covert brain washing. I
try to go in unassuming and rant my rants and get myself attacked and
kill-file announced, but behind the scene i stab people's brain with
sharp objects, jam their wires and screw their programs, totally
shattering the world of their minds. And when they try to recuperate,
to think ways of counter-attack, bang! I have succeeded in my goal.

--

Kent wrote:
> My point was that iteration means exactly what you said: to move
> down an object of unbounded length using finite space to perform
> each iteration. Tail recursion (but not real recursion) shares this
> property because tail recursion implements/expresses/is iteration.

Thanks Kent for this concise recapitulation.

As you described, some Scheme fanatics are misled by Abelson's
debunking of the loop/recursion myth into thinking that loop syntax
should be avoided at all costs.

I believe this is plausible, or even fact, and i think the culprit
is stupid terminology and jargon.

A quick look at how the authors defined "tail recursion" in the text turned up:

http://mitpress.mit.edu/sicp/full-text/book/book-Z-H-11.html#%_sec_1.2.1

<quote>
One reason that the distinction between process and procedure may be
confusing is that most implementations of common languages (including
Ada, Pascal, and C) are designed in such a way that the interpretation
of any recursive procedure consumes an amount of memory that grows
with the number of procedure calls, even when the process described
is, in principle, iterative. As a consequence, these languages can
describe iterative processes only by resorting to special-purpose
``looping constructs'' such as do, repeat, until, for, and while. The
implementation of Scheme we shall consider in chapter 5 does not
share this defect. It will execute an iterative process in constant
space, even if the iterative process is described by a recursive
procedure. An implementation with this property is called
tail-recursive. With a tail-recursive implementation, iteration can be
expressed using the ordinary procedure call mechanism, so that special
iteration constructs are useful only as syntactic sugar.31
</quote>

From here we see the authors clearly indicated that the term "tail
recursion" refers to an _implementation_.

I think few people when they discuss "tail recursion" do have a clear
idea that the phrase refers to an implementation. "Tail recursion"
being a _stupid_ jargon itself, thus confusion begets confusion, and
there we have endless meaningless arguments and throwings of opinions
with far-reaching consequences, not unlike the many jargons i have
discussed in my previous post.

Imagine that people are info processors. In a newsgroup or in general
communication, people take muddy inputs from other people, process
them with their imperfect processors, then express their output in a
variety of muddy ways with these jargons galore. The process of
_garbage in and garbage out_ is nevertheless useful, since people are
not exactly machines. But imagine if we got rid of these fucking
stupid jargons and mis-representations and instead communicated what
we mean: what percentage of stupidity could we rid society of?

I propose that the existence of those Scheme fanatics, whom the Common
Lispers accuse of being stupid tail-recursive purists, is in very
large part due to the fantastic use and propagation of the term "tail
recursion". I propose that we should ban the use of the term "tail
recursion". When referring to the concept of a loop expressed in
recursive form, simply say linear recursion. When referring to a good
implementation of recursion that recognizes linear recursion, we
should simply call it a "good implementation" or something more
descriptive. If this were achieved, i would say that the tail-recursive
Scheme fanatics so accused would shrink in number significantly.

Next time you hear your colleague utter "tail recursion", quickly
grab a nearby book and whack his mouth. Look in his eyes, and say
quizzically "do you mean linear recursion and good implementation of
recursion"? (if he says "what??", make a mental note that this guy
didn't read Xah's opuses. Help him out. Resolve his mystery.)

Acute old-time Xah readers will detect that linguistics & logic is a
theme in my writings, that i employ them and advocate them. The jargon
debunking in this episode is part of the grand scheme that English
the language is the cause of fantastic stupidity in society. I wait for
the day when human beings communicate by symbolic logic
protocols. Just imagine what kind of unrestricted rich humor and
nuance that entails, where humor can be perceived more clearly, and
muddiness can be muddier by intention.

PS currently i don't have access to newsgroup other than deja.com, so
my readings of other's posts will be severely limited. If anyone can
give me a news account, that'd be much appreciated.

Xah
x...@xahlee.org
http://xahlee.org/PageTwo_dir/more.html

Xah Lee

Feb 11, 2002, 6:41:05 AM
Nils Goesche wrote:
> I know what you mean, and you are right. But don't forget that there are
> always a few, very few, among these idiots who are only `sleeping' and
> might, at some time, `awake'; much of the teaching effort is all about
> making those `sleepers' awake.

Erik Naggum wrote:
> That was very poetic and beautifully said.

what makes poetry is that it be quite fucked up when taken
literally. If it is beautiful, it is with this context in
mind. Great minds know that; "artists" don't know that.

Schooling, when poetically explained, is about waking up smart
sleepers. But we now can see here how fantastically stupid
that description is. Schooling is about mass systematic education. Not
for a few; it benefits just about every student. The benefit of
systematic education cannot be seen or measured in the fantastically
fucking stupid grades or exams, sleeper or no. If we want to see it,
see it in the community, the town, the nation.

--

Those well-to-do schoolers, such as those Ph.D tag wearers churned out
yearly by Stanford & Ivy League schooling institutions, are the
names of my pain. I want to ram my head with their head, and see their
head crushed bloody.

--

By the way, people are not idiots, ok? People are just people. They
are not morons or idiots. Let's all not develop a fad of calling
everybody else stupid in our refined discourses. When i do, consider
it a manner of speaking. When Nagging Naggum does not, consider it News.

Xah
x...@xahlee.org
http://xahlee.org/PageTwo_dir/more.html

Xah Lee

Feb 11, 2002, 7:17:12 AM
There is an entire set of websites that seems to have been developed
by followers of Ayn Rand, the author of Atlas Shrugged and founder of
"Objectivism".

The sites in question are at:
http://www.capitalismmagazine.com/

I have not read any of Ayn Rand's books. From the websites, i detect a
cult forming. The essays on their sites tend to be strong and
suggestive, their arguments insubstantial, and some totally fucked in my
opinion. (For example, after 9-11, a few essays directed people to get
the US to go to war, with very powerful and emotional manipulations.) I
would advise some of you to keep a watchful eye on them.

Even though i haven't read anything by Ayn Rand, i'd advise not reading
it, simply because now there's a huge series of sites that advocates
it. By a twist of turn, it could be the next Nazism.

I'd, however, advise people to read Basic Economics by Thomas
Sowell. That book is an eye opener for me. You can surf amazon.com and
find out Thomas Sowell's other works and people's comments.

(Thomas Sowell, as i found out, is a controversial fellow, and a
regular contributor to the above-mentioned site. His essays on the
site, like the others, are inciting and lack substance. (Read his
book to know him.))

When you have read most of the works of the great philosophers or
writers, then you can read other books, such as those written in the
last 50 years that compete for your attention and interest.

Xah
x...@xahlee.org
http://xahlee.org/PageTwo_dir/more.html

> From: Nils Goesche <car...@t-online.de>
> Newsgroups: comp.lang.lisp
> Subject: Re: one small function
> Date: Sun, 10 Feb 2002 04:38:25 +0100

Marco Antoniotti

Feb 11, 2002, 9:59:03 AM

ds...@goldshoe.gte.com (Dorai Sitaram) writes:

> In article <xcv6657...@famine.OCF.Berkeley.EDU>,
> Thomas F. Burdick <t...@famine.OCF.Berkeley.EDU> wrote:
> >ds...@goldshoe.gte.com (Dorai Sitaram) writes:
> >
> >> In article <sikheot...@ulrik.uio.no>,
> >> Christian Nybø <c...@sli.uio.no> wrote:
> >> >
> >> >At my uni, taking the SICP course is a recommended preparation for
> >> >those who wish to take the CL course. Therefore I choose, despite
> >> >your warnings, to assimilate what Scheme has to offer as a teaching
> >> >language. Any advice on how to get through unharmed? (OVS phrasing
> >> >optional...)
> >>
> >> I would say that (in general) it is fairly difficult
> >> for Scheme learners to learn Common Lisp.
> >
> >If they think they're the same language, it certainly will be. I
> >think if someone is specifically trying to not think that CL is a
> >funny-looking Scheme, I think they will probably have a very good
> >chance.
>
> I do think it is risky to learn Scheme if the goal is
> CL.

<<I wrote something here that I decided to erase :) >>

> I speak as someone who is fairly comfortable with
> Scheme but cannot realistically hope to achieve that
> comfort with CL. I know the two languages are
> different (and _how_ they are different). Maybe my
> experience is not statistically significant (whatever
> that means), I hope so.
>
> It isn't exactly a new hypothesis around here that
> Scheme ruins one for CL. I find it a plausible
> hypothesis. Exceptions for exceptional talent of
> course, as always.

I believe one of the things that is missing here (and maybe could go
in the Cookbook) is some list of things you need to know about when
moving from Scheme to CL.

E.g. one of the things that many Schemers do is the `let loop' trick.

If you do not know much about CL, you are going to be quite at a loss
when you try to program this with a LOOP or similar.

Well, just providing a NAMED-LET macro (or a LET-LOOP) macro, would be
enough to help a Schemer lost in CL.
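For concreteness, one minimal way such a macro might look (a sketch only, not a standard CL facility; LABELS supplies the locally named, recursively callable function, and the argument shape mirrors Scheme's (let loop ((var init) ...) body) idiom):

```lisp
(defmacro named-let (name bindings &body body)
  "Scheme-style named LET: BINDINGS is a list of (var init) pairs,
and NAME is callable within BODY to restart the loop."
  `(labels ((,name ,(mapcar #'first bindings)
              ,@body))
     (,name ,@(mapcar #'second bindings))))

;; Usage, mirroring Scheme's (let loop ((i 0) (acc '())) ...):
(named-let rec ((i 0) (acc '()))
  (if (= i 5)
      (nreverse acc)
      (rec (1+ i) (cons i acc))))
;; => (0 1 2 3 4)
```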

It is my gut feeling that most of the other issues are simpler than
this one (maybe apart from the multiple inner `define's that may appear
in a Scheme function). Even call/cc is IMHO a non-issue.

Cheers

--
Marco Antoniotti ========================================================
NYU Courant Bioinformatics Group tel. +1 - 212 - 998 3488
719 Broadway 12th Floor fax +1 - 212 - 995 4122
New York, NY 10003, USA http://bioinformatics.cat.nyu.edu
"Hello New York! We'll do what we can!"
Bill Murray in `Ghostbusters'.

Dorai Sitaram

Feb 11, 2002, 10:20:00 AM
In article <86bsewm...@localhost.my.domain>,

Add to that the possibility that SICP is a classic in
the Mark Twain sense. As a case in point, I had messed
with Scheme for a long time before I got a chance to
read the book (I remember being surprised that there
was no call/cc in it). If anyone does, I fit the profile
evoked in this NG of the individual exposed to Scheme
and (therefore??) unable to use CL equally effectively
-- but the good news is it wasn't because of
SICP. I think people who have read SICP properly at a
time when their minds are still flexible don't exhibit
the syndromes that SICP is being described here as
promoting.

Still, there is a certain antique charm to the notion
that people in this day 'n age can with passion "blame"
a contemporary book for being dangerously influential.

--d

Dr. Edmund Weitz

Feb 11, 2002, 11:14:21 AM
Marco Antoniotti <mar...@cs.nyu.edu> writes:

> I believe one of the things that is missing here (and maybe could go
> in the Cookbook) is some list of things you need to know about when
> moving from Scheme to CL.

A good idea. Unfortunately (hehehe... :) I've never been exposed to
Scheme, so I can't provide this chapter. Anyone out there willing to
volunteer?

Thanks,

Mr Usenet Himself

Feb 11, 2002, 12:24:27 PM
From: Kent M Pitman <pit...@world.std.com>
Date: 11 Feb 2002 12:24:24 -0500
Message-ID: <sfw4rko...@shell01.TheWorld.com>
Organization: My ISP can pay me if they want an ad here.

x...@xahlee.org (Xah Lee) writes:

Well, in the passage you quoted, there is also the confusion that the words
"share this defect" are ambiguously arranged, such that it compounds the
other misreading you cite, and might well be interpreted to mean the defect
of having special-purpose looping constructs. You have to read carefully
and hop back over one potential reading to a better one in order to get
to the interpretation the authors almost surely intended.

> I propose that a part of those Scheme fanatics, where the Common
> Lispers accuse of being stupid tail-recursive purists, is in very
> large part due to the fantastic use and propergation of the term "tail
> recursion".

Well, by requiring a particular implementation of tail recursion, they
promote tail recursion issues to the linguistic level.

Even so, that's an interesting theory. But it doesn't explain why
those people eschew the construction of standard iteration constructs.
It's easy enough in Scheme's macro system to define DOTIMES or DOLIST
or various other constructs. [LOOP is `perhaps a bit tricky', but
that's more a criticism of their macro system. Scheme macro systems
were proposed, even ones that implemented hygiene, which could have
implemented LOOP straightforwardly (well, as straightforwardly as CL
does; it's no small system) and the Scheme community chose not to use
those macro systems.]

But even given the ability to build these abstractions, a great many
people prefer not to have them, and I suspect it's because the
underlying message they have learned is this one:

Overgeneralization is good.

The problem, I suspect, is that through seeing things, HARD things,
written out longhand, people who go through this book are just learning
the same bad habits that people going through other more trivial Lisp
texts are learning when they are caused to write to a newsgroup saying
"how do I reverse a list using only CONS and NULL and ...". The answer
ought to be: "You don't, you use REVERSE." But instead people end up
"learning" of Lisp that it doesn't have REVERSE because the only thing
they are ever taught about Lisp is how to implement the things it
provides you to start with.

Sometimes, syntactically, a procedure is just going to count. And that's
when DOTIMES is nice. It doesn't lose any more generality to use DOTIMES
than it loses to use a (tail) recursive description. The fact is that
without changing the code, it's not going to become something else either
way, so the code might as well be a DOTIMES. But users never see an example
of this, so they are left with the feeling they are doing something bad by
doing DOTIMES and taught the macho theory that "I can tell that recursion
is counting, can't you?" It's hard for an experienced programmer to deny
that they can recognize a recursion as being a simple counter, but that's
not the same as that programmer feeling that the recognition process is
optimally found.
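A trivial pair of definitions illustrates the point (my own sketch, not from the original post): both compute the same thing, but only the first announces syntactically that it merely counts.

```lisp
;; The DOTIMES form says "this just counts" at a glance; SUM is the
;; result form returned when the loop finishes.
(defun sum-squares (n)
  (let ((sum 0))
    (dotimes (i n sum)
      (incf sum (* i i)))))

;; The tail-recursive form computes the same sum of squares below N,
;; but the reader must first verify that the recursion is nothing
;; more than a simple counter.
(defun sum-squares-rec (n &optional (i 0) (sum 0))
  (if (= i n)
      sum
      (sum-squares-rec n (1+ i) (+ sum (* i i)))))
```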

Generality is often the enemy of efficiency. One wants a system to be
as general as will be required, but not more general than that. And I
think that extends to syntax. Syntactic generality is the enemy of
syntactic efficiency. Not saying the more concise form is, to many of us,
saying "this might require more generality, so read carefully", and that
gets tedious in cases where really it's just not going to.

Yet, the S&ICP course requires people to express in longhand what
would not be accepted as acceptable in longhand in a production
situation--certainly not in _my_ software house. And when I see
someone with S&ICP on their resume, I immediately flag it as "this
person is going to need a little talk" because my practical experience
is that the person is going to be oververbose, overgeneral, and unable
to grasp the essential concept of perspicuity in programming, a
concept that is given short shrift in this course.

The technique of expressing all loops as recursions is a little like
the technique of expressing all code in "Continuation Passing Style".
Lots of interesting things can be learned by doing it, but that doesn't
mean CPS-converted code should be used in all, or even most, cases.

I don't think there is or can be a single, best notation. Some notations
make one thing easier, some make other things easier. This is the reason
mathematicians resort to "transform spaces" to work certain problems.
But if a given transform space were always right, we would live all the
time in it and not call it a transform space. What makes transforming
something cool or smart or necessary is not the fact of the end result,
but the knowledge of the choice to be in it or not. What makes behavior
intelligent, in the most general sense, is the ability to exploit knowledge
of multiple possible tools for a given problem, and to know when to shift
among them for best results. This is not a course that seems to teach
that skill. Rather, it teaches the dogma (useful dogma, but still dogma)
of a particular transform space. It does not teach the intelligent
application of this transform space against others possible. It doesn't
say when it would be better to collapse down a notation. It utterly
devalues the notion that there are other possible spaces by insinuating
subtly that the cost of doing a trivial transformation from
(dotimes (i 10) ...) to the equivalent recursive formulation is always
unbearably high, and so suggesting implicitly that any time you write this
notation you have forever boxed yourself into a way of thinking that will
never support later generalization. When in fact it should teach lessons
about how to "shrinkwrap" your statements into more common idioms when the
generalized notation is overkill, and it should, by contrast, teach you
when and how to unshrinkwrap them into a more general notation _when_
such generalization is useful or necessary.

Gareth McCaughan

Feb 14, 2002, 4:46:36 PM
Nils Goesche wrote:

> For instance, I told everyone that there is one and only One True
> Book that everybody should read as a first text on complex analysis:
> ``Functions of One Complex Variable'' by John Conway; some people
> bought it, but I don't think many of them read much of it. Instead,
> most of them preferred a certain other book, which I always thought was
> totally stupid, full of errors, much too wordy and infinitely boring:
> The authors always close every topic just when it becomes interesting.
> Something like a hundred and fifty pages that contain almost nothing,
> a total waste of time. But they preferred it, even the good ones.
> I never found out, why. I just hope the fact that the other book
> was in German (and cheaper) wasn't the only reason ;-)

Is Ahlfors too hard?

--
Gareth McCaughan Gareth.M...@pobox.com
.sig under construc

Nils Goesche

Feb 15, 2002, 2:24:37 PM
In article <slrna6oc1s.1d97....@g.local>, Gareth McCaughan wrote:
> Nils Goesche wrote:

>> For instance, I told everyone that there is one and only One True
>> Book that everybody should read as a first text on complex analysis:
>> ``Functions of One Complex Variable'' by John Conway;

> Is Ahlfors too hard?

Yes and no. The problem, or my problem, with Ahlfors is that there
are sometimes gaps in his proofs; gaps that are actually easy to fill
(and thus not really gaps) by someone with experience, but I seem to
remember that I found the book hard the first time I looked into it.
The best would be, probably, to start with both Ahlfors and Conway
at the same time :-)

Regards,
--
Nils Goesche
"Don't ask for whom the <CTRL-G> tolls."

PGP key ID 0x42B32FC9

Lieven Marchand
Feb 15, 2002, 4:52:48 PM
Nils Goesche <car...@cartan.de> writes:

> In article <slrna6oc1s.1d97....@g.local>, Gareth McCaughan wrote:
> > Nils Goesche wrote:
>
> >> For instance, I told everyone that there is one and only One True
> >> Book that everybody should read as a first text on complex analysis:
> >> ``Functions of One Complex Variable'' by John Conway;
>
> > Is Ahlfors too hard?
>
> Yes and no. The problem, or my problem, with Ahlfors is that there
> are sometimes gaps in his proofs; gaps that are actually easy to fill
> (and thus not really gaps) by someone with experience, but I seem to
> remember that I found the book hard the first time I looked into it.
> The best would be, probably, to start with both Ahlfors and Conway
> at the same time :-)

I've always been partial to Rudin's way of doing things, so I'd
probably go with Real and Complex Analysis. Some people really seem to
dislike his economical style of proof though.

--
Lieven Marchand <m...@wyrd.be>
She says, "Honey, you're a Bastard of great proportion."
He says, "Darling, I plead guilty to that sin."
Cowboy Junkies -- A few simple words

Xah Lee
Feb 16, 2002, 11:24:19 PM
> Is Ahlfors too hard?


> Yes and no. The problem, or my problem, with
> Ahlfors is that there
> are sometimes gaps in his proofs

> I've always been partial to Rudin's way of doing things,
> so I'd probably go with Real and Complex Analysis


Gentlemen!

observe the donkeys, one more esoteric than the other!

The spontaneous passion to partake in an elite airing, leaving no one mystified.

Xah
x...@xahlee.org
http://xahlee.org/PageTwo_dir/more.html

Xah Lee
Feb 18, 2002, 2:54:00 AM
Thanks to Mr Usenet Himself (pit...@world.std.com) and the Kaz Kylheku
guy (k...@accton.shaw.ca) for supporting a laughingstock like myself.
Thanks also go to the many taciturn supporters and a slew of smalltime
gibers, in particular the bigtime Nagging Naggum.

In my last message, i mentioned mathematical terms.

> For example,
> polytope, manifold, injection/bijection/surjection,
> group/ring/field.., homological, projective, pencil, bundle,
> lattice, affine, topology, isomorphism, isometry, homeomorphism,
> aleph-0, fractal, supremum/infimum, simplex, matrix, quaternions,
> derivative/integral
> ...

I'm sure many of you are mystified by these terms. In front of a
contributing mathematician, i dare not banter these words, but here i
think i can give a few explanations to lay people like yourselves.
(Throughout this essay are littered relevant URLs for your
reference convenience.)

In the plane, you have regular _polygons_: regular shapes in which
every angle is equal and every side is equal. For example, the
equilateral triangle, square, regular pentagon, hexagon, heptagon,
octagon, nonagon, decagon and so on are all regular polygons. The
analogous shapes in 3 dimensions are called regular _polyhedrons_.
They are solids in which each face is a regular polygon. As it turns
out, only 5 such solids can exist under this requirement, and each one
has a special name. For example, the octahedron is one of the 5
regular solids. Take two pyramids and glue the bottoms together and
you get the picture. It's called the octahedron because it has 8
faces. The cube is also a regular solid, where each face is a regular
4-gon, the square. The cube is also called the hexahedron because of
its 6 faces. There's also the tetrahedron (picture a three-sided
pyramid; counting the bottom, it has four faces). The dodecahedron is
a regular solid with 12 regular pentagons as faces. The icosahedron is
another one, made of 20 equilateral triangles.

In summary, the regular solids of 3 dimensions are:

name          Faces  Edges  Vertices
------------------------------------
tetrahedron     4      6      4
cube            6     12      8
octahedron      8     12      6
dodecahedron   12     30     20
icosahedron    20     30     12

and these are the only ones. The Greeks proved that there can be no
other solids with all regular faces. The general name for these is
regular polyhedron.
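As a quick sanity check (my own addition, not part of the original post), the counts in the table satisfy Euler's polyhedron formula V - E + F = 2, which holds for every convex polyhedron:

```python
# Euler's polyhedron formula: for any convex polyhedron,
# vertices - edges + faces = 2. Verify it for the five Platonic solids.
solids = {
    #                (faces, edges, vertices)
    "tetrahedron":   (4, 6, 4),
    "cube":          (6, 12, 8),
    "octahedron":    (8, 12, 6),
    "dodecahedron":  (12, 30, 20),
    "icosahedron":   (20, 30, 12),
}

for name, (F, E, V) in solids.items():
    print(f"{name:12}  V - E + F = {V} - {E} + {F} = {V - E + F}")
```

Note also the duality visible in the table: the cube and octahedron swap their face and vertex counts, as do the dodecahedron and icosahedron.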

Now, if you have taken multi-variable calculus, you might know that
the dimensions can be abstractly extended beyond 3. When, for example,
science fiction talks about n dimensions, the idea originates, and is
made precise, here in pure mathematics. A 4-dimensional space, from an
abstract point of view, is just multi-variable calculus with 4
variables instead of 3; all other things remain the same. From that
point on, we can derive theorems of all sorts for the n-th dimension
without knowing what the n-th dimension really is, or looks, or feels
like. Nevertheless, the logic is firm, and the mathematicians'
deductions are not delusional.

The details of dimensions higher than 3 take more space to explain
than i have here, but you may take my word that in higher dimensions,
mathematicians have also made systematic imaginings about regular
solids. The simplest case, which many of you must have heard of, is
the hypercube. A hypercube is a 4-dimensional solid that is made of 8
cubes, having 24 squares, 32 edges, and 16 vertices. A hypercube is
also known as a _tesseract_.

http://mathworld.wolfram.com/Tesseract.html

In 4 or more dimensions, the number of regular shapes is finite, just
as in 3 dimensions (in 4 dimensions there are exactly 6; in 5
dimensions and up, only 3). The general name for a regular shape of
any dimension is regular _POLYTOPE_. Thus, a cube is a regular
polytope in 3 dimensions.
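The tesseract's numbers quoted earlier (16 vertices, 32 edges, 24 squares, 8 cubes) follow a simple counting rule: an n-cube has C(n,k) * 2^(n-k) faces of dimension k. A small sketch of my own, not from the post:

```python
from math import comb

def cube_faces(n, k):
    """Number of k-dimensional faces of the n-dimensional cube:
    choose the k coordinates that vary (comb(n, k)), then fix each
    of the remaining n - k coordinates at 0 or 1 (2 ** (n - k))."""
    return comb(n, k) * 2 ** (n - k)

# The tesseract (4-cube): vertices, edges, squares, cubes.
print([cube_faces(4, k) for k in range(4)])  # → [16, 32, 24, 8]
```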

+ Wolfram Research's Mathworld page on polytope
http://mathworld.wolfram.com/Polytope.html

+ wonderful java applet
http://dogfeathers.com/java/hyperstar.html

+ George Hart's comprehensive online Polyhedra Encyclopedia
http://www.georgehart.com/virtual-polyhedra/vp.html

One can view higher-dimensional figures by a process called projection
-- pretty much like how a shadow works. That's also how we can draw a
3-dimensional object, the cube, on a 2-dimensional object, paper.
Another method is the cross-section. Here's a page about projecting or
slicing higher-dimensional polytopes:

http://www.graphics.cornell.edu/~gordon/peek/peek.html

According to the book Flatland
(http://www.geom.umn.edu/~banchoff/Flatland/), at midnight of the last
day of every millennium, a being from a higher dimension comes down
and discloses the gospel of higher dimensions to a chosen one. The
unfortunate person (or fortunate, depending on your view) will likely
go nuts afterwards and be put into an asylum for life.

... i digress...

Now, in each dimension there is a minimally configured regular
polytope. For example, in 2-D you have the equilateral triangle. In
3-D the simplest is the tetrahedron, made of 4 triangles. In 4-D
there's the pentatope. The generic name for this simplest polytope in
n-D is the _SIMPLEX_. To me, that term stands for "(math is) simple,
but complex!", like a birch cane licking some ass while explaining the
depth of mathematics. Remember that, folks!

http://mathworld.wolfram.com/Simplex.html
http://mathworld.wolfram.com/Pentatope.html
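The same kind of counting works for the simplex family (a sketch of my own, not from the post): the n-simplex has n + 1 vertices, and every subset of k + 1 of them spans exactly one k-dimensional face, so the count is C(n+1, k+1).

```python
from math import comb

def simplex_faces(n, k):
    """Number of k-dimensional faces of the n-simplex: any k + 1 of
    its n + 1 vertices span exactly one k-face."""
    return comb(n + 1, k + 1)

print([simplex_faces(3, k) for k in range(3)])  # tetrahedron → [4, 6, 4]
print([simplex_faces(4, k) for k in range(4)])  # pentatope   → [5, 10, 10, 5]
```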

--

Now, with polytope and simplex done with, let me embark on manifold.
When i first heard that word, my little mind wondered at the depth of
mathematics. I knew mathematicians don't like vacuous jargon the way
the Unix & Perl programing morons do, yet they had to invent such a
mysterious word to signify thoughts that i could not possibly
understand without years of study. What thinking! These
mathematicians, thinking they do! Unlike philosophers, who think a
lot, oftentimes without a governing body -- a learned man who thinks a
lot can more or less be a philosopher -- in mathematics, lots of
thinking alone won't do. The tremendous thinking and creativity has to
fit some kind of nature's rule. Nature, with a huge sharp ax, lops off
without mercy any thought that won't fit her temperament and
exactitude. Therefore, in a field like philosophy one can have revered
crackpots, but not so in math. Now, on to manifold.

Manifold, like polytope, is a generic name for a certain concept in
arbitrary dimensions. In 2-D, we have curves. For example, most of you
know that the graph of y=x^2 is a parabola, and y=x is a line. These
things are called plane curves. The curves themselves are
1-dimensional, but they sit in a 2-dimensional plane. Now, in 3
dimensions, we can also plot functions to form space curves or
surfaces. For example, z=x+y is a plane, and z=x*y is a saddle-shaped
surface. Restrict the values of y to a constant and we have curves in
space. Here, the idea is that in 3-D we can have 1-D objects like
curves, or 2-D objects like surfaces. As with regular solids of higher
dimensions, mathematicians' imaginative faculties thought about curves
and surfaces in higher dimensions. Basically, in n dimensions you can
have "curves/surfaces" of any dimension less than n. For example, in
3-D we can have the 2-D surfaces or the 1-D curves, and in the 2-D
plane we can have 1-D curves or 0-D dots. Therefore, in 4-dimensional
space, we can have not only 0-D dots, 1-D curves, and 2-D surfaces,
but also "curve/surface-like" objects that are 3-D, sitting in 4-D
space. The general name for these curve/surface-like objects in
arbitrary dimensions is _MANIFOLD_. Thus, a curve is just a
1-dimensional manifold. A surface is a 2-dimensional manifold. How
fantastically fascinating.

From normal math expositors, you can never, ever read such a lucid
explanation. Instead, you'll bump into esoteric explanations with
which the mathematicians pleased themselves. Only the gruesome Xah is
willing and able to bring such beautiful concepts to morons' eyes.

http://mathworld.wolfram.com/Manifold.html

--

Now, let's talk about injection/bijection/surjection a bit. Except for
bijection, i don't like these terms. Injection always reminds me of
fuel injection, which reminds me of cum. I find these terms a bit
pompous, perhaps because i learned the alternative terms first.

Injection just means a one-to-one mapping. E.g. if we put a spoon in
each coffee cup, that's a one-to-one mapping, because each spoon
corresponds to one cup. No spoon, for instance, sits in two cups and
no cup holds two spoons. It is "one-to-one", so to speak. Think of the
mapping of penis to pussy, and the one-to-one of injection will not be
easily forgotten.

A bijection differs slightly in context from an injection. Basically,
it is an injection with the added requirement that for every pussy
there is a penis. Alternatively: there is no pussy that is without a
cock. This implies that there are equal numbers in either party. A
bijection is also known as a one-to-one AND onto mapping. It is easy
to remember because of the symmetry. (Think of fairness.)

A surjection is just an onto mapping, i.e. every destination is
filled, possibly by more than one entity. For space reasons i'm
unwilling to explain further for those who don't have a good grasp of
functions and sets.

As you can see, these names are not as clear as simply saying the
mapping is one-to-one and/or onto. Only bijection is easy to recall.

for your convenience the mathworld url on these is here:
http://mathworld.wolfram.com/One-to-One.html
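The spoon-and-cup picture can be made concrete. A small sketch of my own (not from the post), modelling a mapping as a Python dict:

```python
def is_injective(f, domain):
    """One-to-one: no two domain elements share an image."""
    images = [f[x] for x in domain]
    return len(set(images)) == len(images)

def is_surjective(f, domain, codomain):
    """Onto: every codomain element is hit at least once."""
    return {f[x] for x in domain} == set(codomain)

def is_bijective(f, domain, codomain):
    """One-to-one AND onto."""
    return is_injective(f, domain) and is_surjective(f, domain, codomain)

spoons = ["s1", "s2", "s3"]
cups = ["c1", "c2", "c3"]

placement = {"s1": "c2", "s2": "c3", "s3": "c1"}  # each spoon in its own cup
print(is_bijective(placement, spoons, cups))  # → True

crowded = {"s1": "c1", "s2": "c1", "s3": "c2"}  # two spoons in one cup
print(is_injective(crowded, spoons))  # → False
```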

--

The group/ring/field words are fascinating. Normally, math jargon
reuses everyday words; i.e. mathematicians try not to invent new
words. It is unfortunate that it would take quite some space to
explain these well to mere mortals who have never heard of them,
therefore i won't do it here. But let me tell you a personal tale
about the math term "group".

Like other math jargons, i was infinitely fascinated by these esoteric
terms. I wanted to know what they mean. The mere existence of these
terms is sufficient to make me cry, to think that there are gods of
sorts who have written the rules of the universe in stone, and we
mortals are left struggling to find them. Just open some graduate math
text -- all sorts of gibberish talk present themselves. It is this
gibberish that made computers and cellphones and cars and planes and
TVs and whatnot possible. It is this gibberish that made those lesser
beings, the physicists, possible. We could say that it is this
gibberish that made the universe possible. Suppose in an instant all
math writings disappeared from the face of this earth. Then please
imagine the consequent catastrophe for a moment. I would suppose that
all of today's sciences would immediately cease, even those with
experimental methodologies. All engineering would forthwith stop
progressing. Moore's law would cease to be true. mm... this would be
an interesting topic to put thought on... but i digressed again...

I was talking about the math term group. A "group" is the name for an
abstract concept: a set together with an operation on it that
satisfies a few requirements. The study of such things is called Group
Theory, and group theory is a branch of modern algebra, algebra being
one of the main trunks of modern mathematics. (The other two are
analysis and geometry. (Such categorization is very vague.)) The core
of group theory, let me tell you, is the study of symmetry. The
previous sentence i just wrote is one of the fantastic journeys and
revelations that i personally experienced.

Think about symmetry for a moment. Observe that many animals of earth
have bilateral symmetry. In other words, their left and right parts
are mirror images, or symmetric. There are many other types of
symmetry. The patterns on a Persian carpet are often symmetric. A
helical staircase is symmetric. Stairs are also symmetric, in that
they repeat on and on. The wire mesh in fences is also symmetric.
Honeycombs are symmetric -- look at those regularly arranged hexagons.
An equilateral triangle is symmetric too: it has bilateral symmetry
along any of the three middle cuts. The peaceful swastika symbol is
also symmetric. Its symmetry is what is technically called rotational
symmetry: it has 4-fold rotational symmetry, in that one can rotate it
1/4th of a circle and the image becomes itself again. The yin-yang
tai-chi symbol has 2-fold rotational symmetry. Many English letters
have symmetries. For example, the idealized O is all round,
rotationally symmetric of infinite order. X has mirror symmetries
about the vertical and horizontal axes. H has bilateral, or just
vertical, mirror symmetry. P has none. Here is a digression i wrote in
1999 about symmetries in letters, modified slightly:

<start>
Btw, the name "XAH" is by design. It is not a Chinese transliteration.
I chose XAH as my English legal name. I wanted a unique name. Out of
myriad letter combinations, XAH came to be, more or less, because I
have been fascinated by symmetric alphabets since young. There are
still a lot of possibilities for names with symmetric letters, but XAH
seemed good and was chosen.

It would be fun to write a program to generate all plausible names
that are composed of symmetric letters. Anyone rise to the challenge?

By the way, a letter may have vertical mirror symmetry (A, M, U...) or
horizontal (B, C, E...) or 2-fold rotational symmetry (N, Z, S), or
4-fold rotational symmetry with mirror (X, I), or diagonal symmetry
(idealized L, Q), or a combination of the above (H, I, O). We might
also consider lower cases. For example C, O, I... have the property
that the symmetry is invariant between lower and upper case. Symmetry
can be applied to whole words in different ways, i.e. AHA, WOW, DO,
DID, MOW; CIVIC, EVE, EYE. When applied to sentences in the second
sense, it's called a palindrome. That is, sentences that read the same
forward or backward, e.g. "A man, a plan, a canal, Panama!". There are
palindromes that are pages long, and I know there are books collecting
them.

By the way, speaking of symmetry... The mathematical study of symmetry
is called group theory. Group theory is the abstraction of
abstractions. Its beauty surpasses that of the visual. Just a few
years ago, I was very puzzled by such an idea. How could math, with
all its formulas and precision, codify such a non-numeric concept as
symmetry? What is there to study about symmetry? Isn't it just all
'looks'? Indeed, knowing what it's about is an extreme satisfaction
for me.
<end>
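The programming challenge in the quoted digression is only a few lines in any language. Here is a sketch of my own in Python; the choice of which capitals count as vertically mirror-symmetric follows the idealized glyphs the digression describes:

```python
from itertools import product

# Capitals with a vertical mirror axis (idealized glyphs).
SYMMETRIC = "AHIMOTUVWXY"

def symmetric_names(length):
    """Generate every string of the given length built only from
    mirror-symmetric capital letters."""
    return ("".join(p) for p in product(SYMMETRIC, repeat=length))

names = set(symmetric_names(3))
print(len(names))      # 11 ** 3 = 1331 candidate three-letter names
print("XAH" in names)  # → True

# The digression's other notion: whole words symmetric under reversal.
def is_palindrome(word):
    return word == word[::-1]

print(is_palindrome("AHA"), is_palindrome("XAH"))  # → True False
```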

Can you imagine for a second how the mundane, everyday concept of
symmetry could be codified by mathematics into a complete and
far-reaching study called group theory? Groups and group theory are a
fundamental concept and an extremely important branch of math, and one
can get a Ph.D. specializing in group theory.
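To make the abstract definition concrete: the four rotational symmetries of the square form a group under composition. A brute-force check of the group axioms (my own sketch, not from the post):

```python
# The rotations of a square by 0, 90, 180, 270 degrees, composed by
# performing one after the other: angles simply add modulo 360.
elements = [0, 90, 180, 270]

def compose(a, b):
    return (a + b) % 360

# Closure: composing two rotations gives another rotation in the set.
assert all(compose(a, b) in elements for a in elements for b in elements)
# Identity: rotating by 0 changes nothing, on either side.
assert all(compose(0, a) == a == compose(a, 0) for a in elements)
# Inverses: every rotation can be undone by some rotation in the set.
assert all(any(compose(a, b) == 0 for b in elements) for a in elements)
# Associativity.
assert all(compose(a, compose(b, c)) == compose(compose(a, b), c)
           for a in elements for b in elements for c in elements)

print("the 4 rotations of the square form a group (the cyclic group C4)")
```

The swastika and yin-yang examples above are the same idea with 4-fold and 2-fold rotation respectively; swapping `elements` for `[0, 180]` checks the 2-fold case.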

As it happens, almost all the math i know i learned on my own, without
a teacher. It was around 1997 that i studied the basics of group
theory, and got my extreme awe of its beauty. Also, it is a pity of
society that a genius like myself literally took several years to
finally understand what group theory is, or is about, whereas today i
can explain the concept to anyone willing to learn in about an hour,
and the basics in a few weeks. (You have now already learned some
aspects of what it is about.) Pity the fantastically fucking
bureaucratic schooling system. Fuck the system. Fuck the stupid
teachers. Fuck them and fuck their wives.

(The internet, the info hotbed, to some extent remedies the situation
and soothes my anger. I'd like to give an important side-advice here:
for society to progress, there is nothing more important than the free
flow of information. I repeat, nothing more important than that. Not
the prevention of AIDS, not the prevention of war, not the prevention
of death or human suffering. Nothing is more important than the
massive unfiltered flow of information, of any kind, accessible to
anyone. A controlled form of such info flow is called education, and a
more extreme form of education is called brain-washing. We want
free-flowing info. We do not want controlled info. Next time your
government speaks of protecting the children or preventing terrorism
with censorship, please furiously kick their face bloody, and vote
yourself for governor.)

--

Ok, here are a few more terms i alluded to:

> homological, projective, pencil, bundle,
> lattice, affine, topology, isomorphism, isometry, homeomorphism,
> aleph-0, fractal, supremum/infimum, matrix, quaternions,
> derivative/integral

i think this article is already quite long. I also wanted to continue
my thesis on why many of the computing-field jargons i mentioned but
did not explain:

* regular expression (as in regex, grep, egrep, fgrep)
* name space (as in Scheme vs Common Lisp debates)
* depth first/breadth first (as in tree traversing.)
* subroutine, function, operator
* operator overloading
* polymorphism
* inheritance
* first class objects
* pointers, references

are asinine. (The OOP jargons are grossly asinine.)

Until next time.

Xah
x...@xahlee.org
http://xahlee.org/PageTwo_dir/more.html
