
The Importance of Terminology's Quality


xah...@gmail.com

May 7, 2008, 7:13:36 PM
I'd like to introduce a blog post by Stephen Wolfram on the design
process of Mathematica. In particular, he touches on the importance of
the naming of functions.

• Ten Thousand Hours of Design Reviews (2008 Jan 10) by Stephen
Wolfram
http://blog.wolfram.com/2008/01/10/ten-thousand-hours-of-design-reviews/

The issue is fitting here today, given our recent discussion of
“closure” terminology, as well as the jargon “lisp 1 vs lisp 2”
(multi-meaning space vs single-meaning space), “tail recursion”,
“currying”, and “lambda”, terms that perennially crop up here and
elsewhere in computer language forums amid wild misunderstanding and
brouhaha.

The functions in Mathematica are usually very well named, in contrast
to those of most other computing languages. In particular, the naming
in Mathematica, as Stephen Wolfram implied in his blog post above,
takes the approach of capturing the essence, or mathematical essence,
of the concept in question (as opposed to naming according to
convention, which often comes from historical happenstance). When a
thing is well named from the perspective of what it actually
“mathematically” is, as opposed to its historical development, a vast
amount of potential confusion is avoided.

Let me give a few examples.

• “lambda”, widely used as a keyword in functional languages, is named
just “Function” in Mathematica. The name “lambda” in the field of
symbolic logic is due to the happenstance use of the Greek letter
lambda “λ”. The word does not convey what it means, while the name
“Function” stands for the mathematical concept of “function” as is.

• Module and Block in Mathematica correspond to Lisp's various “let*”
forms. Lisp's keyword “let” is based on the English word “let”. That
word is one of the English words with a multitude of meanings. If you
look up its definition in a dictionary, you'll see that it means many
disparate things. One of them, as in “let's go”, has the meaning of
“permit; cause to; allow”. This meaning is rather vague in a
mathematical sense. Mathematica's choice of Module and Block is based
on the idea that each builds a self-contained segment of code.
(However, the choice of Block as a keyword isn't perfect, since the
word also has meanings like “obstruct; jam”.)

• Functions that take elements out of a list are variously named
First, Rest, Last, Extract, Part, Take, Select, Cases, DeleteCases...
as opposed to “car”, “cdr”, “filter”, “pop”, “shift”, “unshift” in
Lisps, Perl, and other languages.
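As a side-by-side illustration of the two naming traditions above, here is a rough Python sketch (Python chosen as neutral ground; the function names mirror Mathematica's and are not any real API):

```python
# Illustrative only: Mathematica-style names (First, Rest, Select)
# defined over plain Python lists, next to their Lisp/Perl relatives.

def first(lst):
    """Head of the list: Mathematica's First, Lisp's car."""
    return lst[0]

def rest(lst):
    """Everything but the head: Mathematica's Rest, Lisp's cdr."""
    return lst[1:]

def select(lst, pred):
    """Elements satisfying pred: Mathematica's Select, others' filter."""
    return [x for x in lst if pred(x)]

print(first([1, 2, 3]))                              # 1
print(rest([1, 2, 3]))                               # [2, 3]
print(select([1, 2, 3, 4], lambda x: x % 2 == 0))    # [2, 4]
```

A reader who has never seen the language can guess what `first`, `rest`, and `select` do; guessing what `car` and `cdr` do requires knowing IBM 704 history.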

The above are some examples. The thing to note is that Mathematica's
choices are often such that the word stands for the meaning itself in
some logical and self-contained way, as much as possible, without
depending on a particular computer science field's context or history.
One easy way to confirm this is to take a keyword and ask a wide
audience, who don't know the language or are even unfamiliar with
computer programming, to guess what it means. The audience can be made
up of mathematicians, scientists, engineers, programmers, and laymen.
This general audience is more likely to guess correctly what
Mathematica's keyword means in the language than the name used in
other computer languages, whose naming choices go by convention or
context.

(For example, Perl's naming relies heavily on Unix culture (grep,
pipe, hash...), while functional languages' naming is typically based
heavily on the field of mathematical logic (e.g. lambda, currying,
closure, monad, ...). Lisp's cons, car, cdr are based on computer
hardware (this particular naming has caused major damage to the Lisp
language to this day). Other examples: pop and shift are based on the
computer science jargon of the “stack”. Grep is from Global Regular
Expression Print, while Regular Expression is from the theoretical
computer science of automata... The name regex has done major hidden
damage to the computing industry, in the sense that if it had just
been called “string patterns”, a lot of explanation, literature, and
confusion would have been avoided.)

(Note: keywords or functions in Mathematica are not necessarily always
best named. Nor is there always one absolute best choice, as there are
many other considerations, such as the force of wide existing
convention, the context where the function is used, brevity,
limitations of the English language, different scientific contexts
(e.g. math, physics, engineering), or even human preferences.)

----------------------------

I've written about many of the issues regarding the importance and
effects of terminology's quality since about 2000. Here are the
relevant essays:

• Jargons of Info Tech Industry
http://xahlee.org/UnixResource_dir/writ/jargons.html

• The Jargon “Lisp1” vs “Lisp2”
http://xahlee.org/emacs/lisp1_vs_lisp2.html

• The Term Currying In Computer Science
http://xahlee.org/UnixResource_dir/writ/currying.html

• What Is Closure In A Programing Language
http://xahlee.org/UnixResource_dir/writ/closure.html

• What are OOP's Jargons and Complexities
http://xahlee.org/Periodic_dosage_dir/t2/oop.html

• Sun Microsystem's abuse of term “API” and “Interface”
http://xahlee.org/java-a-day/interface.html

• Math Terminology and Naming of Things
http://xahlee.org/cmaci/notation/math_namings.html

Xah
x...@xahlee.org
http://xahlee.org/


Bruce C. Baker

May 7, 2008, 7:53:42 PM

<xah...@gmail.com> wrote in message
news:f4abdb41-be28-4628...@u12g2000prd.googlegroups.com...

[...]

(for example, Perl's naming heavily relies on unix culture (grep,

pipe, hash...), ...

"hash" + "pipe"? Ahhhhh, /no wonder/ Perl is the syntactic mishmash it is!
;-)

Kyle McGivney

May 7, 2008, 10:14:35 PM
> • Module, Block, in Mathematica is in lisp's various “let*”. The
> lisp's keywords “let”, is based on the English word “let”. That word
> is one of the English word with multitudes of meanings. If you look up
> its definition in a dictionary, you'll see that it means many
> disparate things. One of them, as in “let's go”, has the meaning of
> “permit; to cause to; allow”. This meaning is rather vague from a
> mathematical sense. Mathematica's choice of Module, Block, is based on
> the idea that it builds a self-contained segment of code. (however,
> the choice of Block as keyword here isn't perfect, since the word also
> has meanings like “obstruct; jam”)

If the purpose of let is to introduce one or more variable bindings,
then I don't see how changing to block or module would improve
anything. I've always found it fairly intuitive to parse (let ((x 5)) ...)
as "let x be five". Additionally, replacing let with the
synonyms you provided would approximately yield "permit x to be five"
or "allow x to be five". In my mind you have constructed an argument
in favor of let here (obviously it's better than block, because
nobody's going to come along and be confused about whether let will
"obstruct" or "jam" them :)

There are many easy targets to poke at in the CL spec. let isn't one
of those.

George Neuner

May 8, 2008, 1:05:34 AM

>lambda “λ” by happenstance. The word does not convey what it means.
>While, the name “Function”, stands for the mathematical concept of
>“function” as is.

Lambda is not a function - it is a function constructor. A better
name for it might be MAKE-FUNCTION.

I (and probably anyone else you might ask) will agree that the term
"lambda" is not indicative of its meaning, but its meaning is not
synonymous with "function" as you suggest.

I suspect Mathematica of just following historical convention itself.
Mathematica uses the term inappropriately just as it was (ab)used in
Pascal (circa 1970). I'm not aware of earlier (ab)uses but there
probably were some.


>• Module, Block, in Mathematica is in lisp's various “let*”. The
>lisp's keywords “let”, is based on the English word “let”. That word
>is one of the English word with multitudes of meanings. If you look up
>its definition in a dictionary, you'll see that it means many
>disparate things. One of them, as in “let's go”, has the meaning of
>“permit; to cause to; allow”. This meaning is rather vague from a
>mathematical sense. Mathematica's choice of Module, Block, is based on
>the idea that it builds a self-contained segment of code. (however,
>the choice of Block as keyword here isn't perfect, since the word also
>has meanings like “obstruct; jam”)

"Let" is the preferred mathematical term for introducing a variable.
Lisp uses it in that meaning. What a word means or doesn't in English
is not particularly relevant to its use in another language. There
are many instances of two human languages using identical looking
words with very different meanings. Why should computer languages be
immune?


>• Functions that takes elements out of list are variously named First,
>Rest, Last, Extract, Part, Take, Select, Cases, DeleteCases... as
>opposed to “car”, “cdr”, “filter”, “filter”, “pop”, “shift”,
>“unshift”, in lisps and perl and other langs.

Lisp has "first" and "rest" - which are just synonyms for "car" and
"cdr". Older programmers typically prefer car and cdr for historical
reasons, but few object to the use of first and rest except for
semantic reasons - Lisp does not have a list data type, lists are
aggregates constructed from a primitive pair data type. Pairs can be
used to construct trees as well as lists and "rest" has little meaning
for a tree. When used with lists, first and rest are meaningful terms
and no one will object to them.

Besides which, you can easily create synonyms for car and cdr (and
virtually any other Lisp function) with no more burden on the reader
of your code than using a C macro. You can call them "first and
rest", or "first and second", or "left and right", or "red and black"
or whatever else makes sense for your data.

People coming to Lisp from other languages often complain of macros
that they have to learn "a new language" every time they read a
program. But in fact, the same is true in all languages - the reader
always has to learn the user-defined functions and how they are used
to make sense of the code. In that sense Lisp is no different from
any other language.

Common Lisp doesn't have "filter". Even so, with respect to the
merits of calling a function "extract" or "select" versus "filter", I
think that's just a matter of familiarity. The term "filter" conveys
a more general idea than the others and can, by parameterization,
perform either function.


George
--
for email reply remove "/" from address

Jürgen Exner

May 8, 2008, 2:05:20 AM
George Neuner <gneuner2/@/comcast.net> wrote:
>On Wed, 7 May 2008 16:13:36 -0700 (PDT), "xah...@gmail.com"
><xah...@gmail.com> wrote:

+-------------------+ .:\:\:/:/:.
| PLEASE DO NOT | :.:\:\:/:/:.:
| FEED THE TROLLS | :=.' - - '.=:
| | '=(\ 9 9 /)='
| Thank you, | ( (_) )
| Management | /`-vvv-'\
+-------------------+ / \
| | @@@ / /|,,,,,|\ \
| | @@@ /_// /^\ \\_\
@x@@x@ | | |/ WW( ( ) )WW
\||||/ | | \| __\,,\ /,,/__
\||/ | | | jgs (______Y______)
/\/\/\/\/\/\/\/\//\/\\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\
==============================================================

jue

John Thingstad

May 8, 2008, 4:05:36 AM
On Thu, 08 May 2008 04:14:35 +0200, Kyle McGivney
<Kyl...@gmail.com> wrote:

How about bind?
(bind ((v f (mod i)) ((a b) list) (t (rem q))) ...)

1. is a multiple-value-bind
2. is a destructuring-bind
3. is a let

http://common-lisp.net/project/metabang-bind/

To me this is an example of where the ANSI group could have spent more
time on naming.

--------------
John Thingstad

kod...@eurogaran.com

May 8, 2008, 4:12:27 AM

>          |   PLEASE DO NOT   |            :.:\:\:/:/:.:
>          |  FEED THE TROLLS  |           :=.' -   - '.=:

I don't think Xah is trolling here (contrary to his/her habit)
but posing an interesting matter for discussion.

I don't know to what point it fits, but I would like to make a rather
novel comment on operator naming:
As a non-native English speaker, the first time I ever encountered the
word "if" was when learning to program. The same can be said of the
other words (for, then, else...). This caused my brain to ascribe them
meanings completely outside the context of everyday language. My point
is that perhaps this is advantageous. So, contrary to tradition (which
considers it a desirable goal to write programs as close as possible to
everyday English), I found it convenient that programming languages use
words different from the words of my native tongue. I suspect that is
why car and cdr have caught on vs. first and rest.

Robert Maas, http://tinyurl.com/uh3t

May 8, 2008, 6:25:54 AM
> From: "xah...@gmail.com" <xah...@gmail.com>

> the importance of naming of functions.

I agree, that's a useful consideration in the design of any system
based on keywords, such as names of functions or operators.
(I'm not using "keyword" in the sense of a symbol in the KEYWORD package.)

> the naming in Mathematica, as Stephen Wolfram implied in his blog
> above, takes the perspective of naming by capturing the essense, or
> mathematical essence, of the keyword in question.

For concepts adapted from mathematics, that naming meta-convention makes sense.
For concepts adapted from data-processing, it's not so clearly good.

> lambda, widely used as a keyword in functional languages, is
> named just Function in Mathematica.

That makes sense, but then what does Mathematica call the special
operator which Common Lisp calls Function? Or is that not provided,
and the programmer must case-by-case handcode what they really mean?
(function foo) == (symbol-function 'foo) ;so FUNCTION not really needed there
(function (lambda (x) (+ x y))) ;y is a lexical variable to be encapsulated
                                ; into the resultant closure
How would you do something like that in mathematica?

> Module, Block, in Mathematica is in lisp's various let*.
> The lisp's keywords let, is based on the English word let.

No, that's not correct. It's based on the mathematical use of "let":
Let R be the complete ordered field of real numbers.
Let S be the subset of R consisting of numbers which satisfy
equations expressed in transcendental functions with rational
parameters.
Prove that S is dense in R.
(Yeah, the question is trivial, but I'm just showing how "let" might be used.)

I see nothing wrong with LET used in that way in Lisp.
Bind would be good too.
I don't like Module or Block at all, because those words convey
nothing about binding some temporary local name to represent some
globally-definable concept, and they actually mis-lead because
module can mean either something like a vector space over a ring,
or a set of functions/methods that serve some small problem domain
within a larger library of functions/methods, or a set of lectures
on a single-topic within a curriculum, while Block sounds more like
just some random consecutive interval of statements rather having
nothing specific to do with bindings. TAGBODY or PROGN would be
more properly called Block perhaps.

> One easy way to confirm this, is taking a keyword and ask a wide
> audience, who doesn't know about the language or even unfamiliar of
> computer programing, to guess what it means.

That is a biased question because the only options are the words
you choose in advance. Better would be to reverse the question: Ask
random people on the street what they would like to call these
concepts:
1 You set up a temporary relation between a name and some datum,
such that all references to that name give you that same datum.
For example you might set up the name 'n' to temporarily mean 5.
Fill in the missing word: (word (n 5) <say "ouch" n times>)
2 You set up a rule by which one step of data-processing is performed,
taking input to produce output. For example you set up a rule by
which the input value is divided by two if it's even but
multiplied by three and then 1 added if it was odd.
Fill in the missing word: (word (n) <if n even n/2 else 3*n+1>)
3 You set up a rule as above, but also give it a permanent name.
Fill in the missing word: (word collatz (n) <if n even n/2 else 3*n+1>)
4 You already have a rule that takes two input data and produces one output.
(oldword foo (x y) <absolute value of difference between x and y>)
You now know one of the two inputs, and that info won't change for a while,
and you hate to keep repeating it. So you want to define a new rule
that has that one known parameter built-in so that you only have
to say the *other* parameter each time you use the new rule.
Fill in the missing newword: (newword foo <x is fixed as 3>)
So I'd choose words: 1=let 2=lambda 3=defun 4=curry.
What words would a typical man on the street choose instead?
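Concept 4 above (deriving a new rule by fixing one argument of an old one) can be sketched in Python; "foo" is the quiz's hypothetical name, and, for what it's worth, Python's standard library calls the operation functools.partial rather than curry:

```python
# Sketch of concept 4: fix one input of a two-argument rule to get a
# one-argument rule. "foo" is the hypothetical name from the quiz;
# functools.partial is the stdlib spelling of the "newword" step.
from functools import partial

def foo(x, y):
    """Absolute value of the difference between x and y (the oldword)."""
    return abs(x - y)

foo3 = partial(foo, 3)   # x is fixed as 3; foo3 takes only y

print(foo3(10))  # 7
print(foo3(1))   # 2
```

Whether the man on the street would name this "curry", "partial", or "fix" is exactly the survey question being proposed.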

> The name regex has done major hidden damage to the computing industry

I have my own gripe against regular expressions ("regex" for short).
I hate the extremely terse format, with horrible concoctions of
escape and anti-escape magic characters, to fit within Unix's
255-character limit on command lines, compared to a nicely
s-expression nested notation that would be possible if regex hadn't
entrenched itself so solidly.
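The nested notation wished for here is buildable today on top of an ordinary regex engine. As a rough sketch (all combinator names invented for illustration), one can assemble the terse syntax from composable functions and never write the escapes by hand:

```python
import re

# Tiny combinator sketch: generate terse regex syntax from a nested
# structure of calls. The names seq/alt/many/lit are made up here.

def seq(*parts):   return "".join(parts)                      # concatenation
def alt(*parts):   return "(?:" + "|".join(parts) + ")"       # alternation
def many(part):    return "(?:" + part + ")*"                 # zero or more
def lit(text):     return re.escape(text)                     # literal text, escaped

# "one or more digits, then either '.txt' or '.log'"
pattern = seq(r"\d+", alt(lit(".txt"), lit(".log")))
print(pattern)                                 # \d+(?:\.txt|\.log)
print(bool(re.fullmatch(pattern, "42.txt")))   # True
```

The nested call tree is the s-expression; the "horrible concoction of escape characters" becomes an output format rather than a source format.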

Lew

May 8, 2008, 6:52:22 AM
kod...@eurogaran.com wrote:
>> | PLEASE DO NOT | :.:\:\:/:/:.:
>> | FEED THE TROLLS | :=.' - - '.=:
>
> I don't think Xah is trolling here (contrary to his/her habit)
> but posing an interesting matter of discussion.

Interesting is in the eye of the beholder. After you've read the same recycled
crud from certain posters again and again, it becomes trollish spam.

--
Lew

Lew

May 8, 2008, 6:53:36 AM
Robert Maas, http://tinyurl.com/uh3t wrote:
> I have my own gripe against regular expressions ("regex" for short).
> I hate the extremely terse format, with horrible concoctions of
> escape and anti-escape magic characters, to fit within Unix's
> 255-character limit on command lines, compared to a nicely
> s-expression nested notation that would be possible if regex hadn't
> entrenched itself so solidly.

This is all very interesting, but not Java.

--
Lew

Sherman Pendley

May 8, 2008, 11:08:35 AM
kod...@eurogaran.com writes:

>>          |   PLEASE DO NOT   |            :.:\:\:/:/:.:
>>          |  FEED THE TROLLS  |           :=.' -   - '.=:
>
> I don't think Xah is trolling here (contrary to his/her habit)
> but posing an interesting matter of discussion.

It might be interesting in the abstract, but any such discussion, when
cross-posted to multiple language groups on usenet, will inevitably
devolve into a flamewar as proponents of the various languages argue
about which language better expresses the ideas being talked about.
It's like a law of usenet or something.

If Xah wanted an interesting discussion, he could have posted this to
one language-neutral group such as comp.programming. He doesn't want
that - he wants the multi-group flamefest.

sherm--

--
My blog: http://shermspace.blogspot.com
Cocoa programming in Perl: http://camelbones.sourceforge.net


Jon Harrop

May 8, 2008, 2:02:18 PM
xah...@gmail.com wrote:
> I'd like to introduce a blog post by Stephen Wolfram, on the design
> process of Mathematica. In particular, he touches on the importance of
> naming of functions.
>
> • Ten Thousand Hours of Design Reviews (2008 Jan 10) by Stephen
> Wolfram
> http://blog.wolfram.com/2008/01/10/ten-thousand-hours-of-design-reviews/
>
> The issue is fitting here today, in our discussion of “closure”
> terminology recently, as well the jargons “lisp 1 vs lisp2” (multi-
> meaning space vs single-meaning space), “tail recursion”, “currying”,
> “lambda”, that perennially crop up here and elsewhere in computer
> language forums in wild misunderstanding and brouhaha.
>
> The functions in Mathematica, are usually very well-name, in contrast
> to most other computing languages.

The mathematical functions are well named but many of the general
programming constructs are not even well defined, let alone well named.

For example, overloaded "Gamma" in Mathematica vs "gammainc" in MATLAB for
the incomplete gamma functions.

> In particular, the naming in
> Mathematica, as Stephen Wolfram implied in his blog above, takes the
> perspective of naming by capturing the essense, or mathematical
> essence, of the keyword in question. (as opposed to, naming it
> according to convention, which often came from historical happenings)
> When a thing is well-named from the perspective of what it actually
> “mathematically” is, as opposed to historical developments, it avoids
> vast amount of potential confusion.
>
> Let me give a few example.
>
> • “lambda”, widely used as a keyword in functional languages, is named
> just “Function” in Mathematica. The “lambda” happend to be called so
> in the field of symbolic logic, is due to use of the greek letter
> lambda “λ” by happenstance. The word does not convey what it means.
> While, the name “Function”, stands for the mathematical concept of
> “function” as is.

Look at the "function" keyword in OCaml and F#. They also pattern match over
their input whereas Mathematica does not allow this in "Function".

> • Module, Block, in Mathematica is in lisp's various “let*”. The
> lisp's keywords “let”, is based on the English word “let”. That word
> is one of the English word with multitudes of meanings. If you look up
> its definition in a dictionary, you'll see that it means many
> disparate things. One of them, as in “let's go”, has the meaning of
> “permit; to cause to; allow”. This meaning is rather vague from a
> mathematical sense. Mathematica's choice of Module, Block, is based on
> the idea that it builds a self-contained segment of code. (however,
> the choice of Block as keyword here isn't perfect, since the word also
> has meanings like “obstruct; jam”)

Bad example. Mathematica's "Block" implements what we all know as "Scope".
Mathematica's "Module" implements something most people have never needed
to learn (rewriting a subexpression with uniquely tagged identifiers to
disambiguate them from externals):

In[1] := Module[{a}, a]
Out[1] = a$617

> • Functions that takes elements out of list are variously named First,
> Rest, Last, Extract, Part, Take, Select, Cases, DeleteCases... as
> opposed to “car”, “cdr”, “filter”, “filter”, “pop”, “shift”,
> “unshift”, in lisps and perl and other langs.

You are comparing arrays in Mathematica to lists in other functional
languages. Mathematica is often asymptotically slower as a consequence,
with "Rest" being O(n):

In[1] := AbsoluteTiming[Rest[Range[1, 1000000]];]
Out[1] = {0.219, Null}

The equivalent over lists is O(1) in SML, OCaml, F#, Haskell, Scala,
Lisp, and Scheme, and is called taking the “tail” of a list. For example, in F#:

> time List.tl [1 .. 1000000];;
Took 0ms
val it : int list = ...
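The cost difference Harrop describes can be reproduced in Python, whose lists are array-backed like Mathematica's: dropping the head copies the whole array, while a Lisp-style chain of pairs shares its tail. A minimal sketch of the two representations, not any library's API:

```python
# Array-backed "Rest" copies; a cons-pair "rest" just follows a pointer.

def from_list(items):
    """Build a Lisp-style chain of (head, tail) pairs, ending in None."""
    chain = None
    for x in reversed(items):
        chain = (x, chain)
    return chain

arr = list(range(5))
rest_copy = arr[1:]       # O(n): allocates and copies the other elements

chain = from_list(arr)    # (0, (1, (2, (3, (4, None)))))
rest_shared = chain[1]    # O(1): the existing tail, shared, no copy

print(rest_copy)          # [1, 2, 3, 4]
print(rest_shared[0])     # 1
```

So the asymmetry is a consequence of data representation, not of the names chosen for the operations.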

Perhaps the best example I can think of to substantiate your original point
is simple comparison because Mathematica allows:

a < b < c

I wish other languages did.
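For what it's worth, at least one mainstream language does: Python chains comparison operators with the mathematical reading (each middle operand evaluated once, the chain meaning the conjunction of pairwise comparisons), as a quick check shows:

```python
# Python supports Mathematica-style chained comparisons natively.
a, b, c = 1, 2, 3

print(a < b < c)        # True: means (a < b) and (b < c)
print(a < b <= c > a)   # True: mixed operators chain too
print(3 < b < c)        # False: 3 < 2 fails, so c is never compared
```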

--
Dr Jon D Harrop, Flying Frog Consultancy
http://www.ffconsultancy.com/products/?u

George Neuner

May 8, 2008, 9:32:58 PM
On Thu, 08 May 2008 03:25:54 -0700,
usenet1.3...@SpamGourmet.Com (Robert Maas,
http://tinyurl.com/uh3t) wrote:

>> From: "xah...@gmail.com" <xah...@gmail.com>
>> the importance of naming of functions.
>

>> ... [take] a keyword and ask a wide
>> audience, who doesn't know about the language or even unfamiliar of
>> computer programing, to guess what it means.

This is a dumb idea ...

>Better would be to reverse the question: Ask
>random people on the street what they would like to call these
>concepts:

... and this one is even dumber.

Terms don't exist in a vacuum - they exist to facilitate communication
within a particular knowledge or skill domain. For example, English
is only meaningful to those who speak English. The opinions of random
people who have no relevant domain knowledge are worthless.

Such a survey could only be meaningful if the survey population
already possessed some knowledge of programming, but were not already
aware of the particular terminology being surveyed.

Waylen Gumbal

May 9, 2008, 1:38:44 AM
Sherman Pendley wrote:
> kod...@eurogaran.com writes:
> >
> > > PLEASE DO NOT | :.:\:\:/:/:.:
> > > FEED THE TROLLS | :=.' - - '.=:
> >
> > I don't think Xah is trolling here (contrary to his/her habit)
> > but posing an interesting matter of discussion.
>
> It might be interesting in the abstract, but any such discussion, when
> cross-posted to multiple language groups on usenet, will inevitably
> devolve into a flamewar as proponents of the various languages argue
> about which language better expresses the ideas being talked about.
> It's like a law of usenet or something.
>
> If Xah wanted an interesting discussion, he could have posted this to
> one language-neutral group such as comp.programming. He doesn't want
> that - he wants the multi-group flamefest.

Not everyone follows language-neutral groups (such as comp.programming,
as you pointed out), so you actually reach more people by cross-posting.
This is what I don't understand - everyone seems to assume that by
cross-posting, one intends to start a "flamefest", when in fact most
such "flamefests" are started by those who cannot bring themselves to
skip over a topic that they so dislike.

--
wg


Lew

May 9, 2008, 1:45:04 AM
Waylen Gumbal wrote:
> Not everyone follows language-neutral groups (such as comp,programming
> as you pointed out), so you actually reach more people by cross posting.
> This is what I don't understand - everyone seems to assume that by cross
> posting, one intends on start a "flamefest", when in fact most such
> "flamefests" are started by those who cannot bring themselves to
> skipping over the topic that they so dislike.

It's not an assumption in Xah Lee's case. He spams newsgroups irregularly
with rehashed essays from years ago, and a number of people are just tired of
him. Don't blame the victims for the perpetrator's actions, OK?

--
Lew

Jürgen Exner

May 9, 2008, 1:45:21 AM
"Waylen Gumbal" <wgu...@gmail.com> wrote:
>Sherman Pendley wrote:
>> kod...@eurogaran.com writes:
>> >
>> > > PLEASE DO NOT | :.:\:\:/:/:.:
>> > > FEED THE TROLLS | :=.' - - '.=:
>Not everyone follows language-neutral groups (such as comp,programming
>as you pointed out), so you actually reach more people by cross posting.

You seem to have failed to grasp the concept of why Usenet is divided
into separate groups in the first place.

>This is what I don't understand - everyone seems to assume that by cross
>posting, one intends on start a "flamefest", when in fact most such
>"flamefests" are started by those who cannot bring themselves to
>skipping over the topic that they so dislike.

By your argument there is no need for individual groups in the first
place. We could just as well use a single "HereGoesEverything" and just
skip over those topics that we so dislike.

jue

Robert Maas, http://tinyurl.com/uh3t

May 9, 2008, 2:34:43 AM
> From: Jon Harrop <j...@ffconsultancy.com>

> Perhaps the best example I can think of to substantiate your
> original point is simple comparison because Mathematica allows:
> a < b < c
> I wish other languages did.

When the comparison operators are all the same, Lisp has that:
(< a b c)
For my personal comparison between C C++ Java and Lisp, see:
<http://www.rawbw.com/~rem/HelloPlus/CookBook/Matrix.html#NumBool>

Now for mixed comparisons, such as a < b <= c, lisp as delivered is
no better than other languages. But in lisp it's possible/easy to
write a macro that would deal with such expressions. Exercise for
the reader (especially newbies wishing to make a good impression):
(defmacro altcompare ...)
(altcompare a < b <= c ...) ;a b c are evaluated
;< <= are *not* evaluated, instead wrapped with (function ...)
;Any function that can take two arguments is allowed there, not just
; the six comparison operators. Such function can be given in any form
; convertable by FUNCTION at compile time, i.e. symbol name or lambda expr.
Note for newbie: Be sure to evaluate each alternating sub-form just once,
in correct sequence, binding each result to a GENSYMmed LET variable
which can then be referenced twice each (except first and last once each).
Let the compiler optimize out any LET bindings that weren't really needed.

Waylen Gumbal

May 9, 2008, 2:43:03 AM
Lew wrote:
> Waylen Gumbal wrote:
>> Not everyone follows language-neutral groups (such as
>> comp,programming as you pointed out), so you actually reach more
>> people by cross posting. This is what I don't understand - everyone
>> seems to assume that by cross posting, one intends on start a
>> "flamefest", when in fact most such "flamefests" are started by
>> those who cannot bring themselves to skipping over the topic that
>> they so dislike.
>
> It's not an assumption in Xah Lee's case. He spams newsgroups
> irregularly with rehashed essays from years ago, and a number of
> people are just tired of him.

I did not know this. One should obviously not do that.

> Don't blame the victims for the perpetrator's actions, OK?

I'm not blaming any "victims", but I don't see anyone saying "read this
or else", so why not just skip the thread or toss the OP in your
killfile so you don't see his postings. If others want to discuss his
topics, who are you or I to tell them not to?

--
wg


Waylen Gumbal

May 9, 2008, 2:47:46 AM
Jürgen Exner wrote:
> "Waylen Gumbal" <wgu...@gmail.com> wrote:
> > Sherman Pendley wrote:
> > > kod...@eurogaran.com writes:
> > > >
> > > > > PLEASE DO NOT | :.:\:\:/:/:.:
> > > > > FEED THE TROLLS | :=.' - - '.=:
> > Not everyone follows language-neutral groups (such as
> > comp,programming as you pointed out), so you actually reach more
> > people by cross posting.
>
> You seem so have failed to grasp the concept of why Usenet is divided
> into separate groups in the first place.

No, not really. You keep group-specific content in the applicable group
or groups only. But are there not times when content overlaps the
topics of multiple groups, and to get maximum feedback one posts to all
applicable groups (the key word being "applicable")?

--
wg


Jürgen Exner

May 9, 2008, 3:01:02 AM
"Waylen Gumbal" <wgu...@gmail.com> wrote:
> so why not just skip the thread or toss the OP in your
>killfile so you don't see his postings.

Done years ago.

>If others want to discuss his
>topics, who are you or I to tell them not to?

They are very welcome to do so in an appropriate NG for those topics.

jue

George Neuner

May 9, 2008, 5:56:48 PM
On Thu, 8 May 2008 22:38:44 -0700, "Waylen Gumbal" <wgu...@gmail.com>
wrote:

The problem is that many initial posts have topics that are misleading
or simplistic. Often an interesting discussion can start on some
point the initial poster never considered or meant to raise.

Rob Warnock

May 9, 2008, 11:45:26 PM
George Neuner <gneuner2/@/comcast.net> wrote:
+---------------

| Common Lisp doesn't have "filter".
+---------------

Of course it does! It just spells it REMOVE-IF-NOT!! ;-} ;-}

> (remove-if-not #'oddp (iota 10))

(1 3 5 7 9)
> (remove-if-not (lambda (x) (> x 4)) (iota 10))

(5 6 7 8 9)
>
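For comparison, the same two calls in Python, whose built-in filter keeps exactly the elements satisfying the predicate (range(10) standing in for (iota 10)):

```python
# Python's filter corresponds to Common Lisp's REMOVE-IF-NOT:
# keep the elements for which the predicate is true.

odds = list(filter(lambda x: x % 2 == 1, range(10)))
print(odds)                                    # [1, 3, 5, 7, 9]

big = list(filter(lambda x: x > 4, range(10)))
print(big)                                     # [5, 6, 7, 8, 9]
```

The spelling REMOVE-IF-NOT names the same operation negatively, which is arguably the thread's whole point about naming in miniature.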


-Rob

-----
Rob Warnock <rp...@rpw3.org>
627 26th Avenue <URL:http://rpw3.org/>
San Mateo, CA 94403 (650)572-2607

Waylen Gumbal

May 10, 2008, 3:56:33 AM

Is this not a possibility for any topic, whether it's cross-posted or
not?


--
wg


Lew

May 10, 2008, 8:26:55 AM

You guys are off topic. None of the million groups to which this message was
posted are about netiquette.

--
Lew

Sherman Pendley

unread,
May 10, 2008, 12:26:45 PM5/10/08
to
Lew <l...@lewscanon.com> writes:

> You guys are off topic. None of the million groups to which this
> message was posted are about netiquette.

Netiquette has come up at one point or another in pretty much every
group I've ever read. It's pretty much a universal meta-topic.

Lew

unread,
May 10, 2008, 5:38:29 PM5/10/08
to
Sherman Pendley wrote:
> Lew <l...@lewscanon.com> writes:
>
>> You guys are off topic. None of the million groups to which this
>> message was posted are about netiquette.
>
> Netiquette has come up at one point or another in pretty much every
> group I've ever read. It's pretty much a universal meta-topic.

Good. Then please have the courtesy not to include comp.lang.java.programmer
in this thread's distribution any longer.

--
Lew

George Neuner

unread,
May 10, 2008, 9:01:21 PM5/10/08
to
On Fri, 09 May 2008 22:45:26 -0500, rp...@rpw3.org (Rob Warnock) wrote:

>George Neuner <gneuner2/@/comcast.net> wrote:
>
>>On Wed, 7 May 2008 16:13:36 -0700 (PDT), "xah...@gmail.com"
>><xah...@gmail.com> wrote:
>
>>>• Functions [in Mathematica] that takes elements out of list
>>>are variously named First, Rest, Last, Extract, Part, Take,
>>>Select, Cases, DeleteCases... as opposed to “car”, “cdr”,
>>>“filter”, “filter”, “pop”, “shift”, “unshift”, in lisps and
>>>perl and other langs.
>
>>| Common Lisp doesn't have "filter".
>
>Of course it does! It just spells it REMOVE-IF-NOT!! ;-} ;-}

I know. You snipped the text I replied to.

Xah carelessly conflated functions snatched from various languages in
an attempt to make some point about intuitive naming. If he objects
to naming a function "filter", you can just imagine what he'd have to
say about remove-if[-not].

David Combs

unread,
May 29, 2008, 8:22:36 PM5/29/08
to
In article <68i6b5F...@mid.individual.net>,

Not one person on the planet agrees with me, I believe, but
it's always seemed to me that an *advantage* to posting to
multiple groups (especially ones generally "interested" in similar
subject matter but NOT subject to huge poster/lurker/answerer overlap,
er, without too many *people* getting multiple copies of the *same*
post) is that it would provide an opportunity for a widely-dispersed
bunch of people to have a *joint* discussion, with comments hopefully
coming in from a *variety* of viewpoints.

David

David Combs

unread,
May 29, 2008, 8:56:37 PM5/29/08
to
In article <rem-2008...@yahoo.com>,

Robert Maas, http://tinyurl.com/uh3t <usenet1.3...@SpamGourmet.Com> wrote:
>> From: "xah...@gmail.com" <xah...@gmail.com>
>> the importance of naming of functions.
>

Lisp is *so* early a language (1960?), preceded mainly only by Fortran
(1957?), and for sure far and away the first as a platform for *so many*
concepts of computer science, e.g. lexical vs dynamic ("special")
variables, passing *unnamed* functions as args (could Algol 60 also do
something like that, via something it maybe termed a "thunk"?), and maybe
still the only one in which program and data have the same
representation -- that it'd seem logical to use its terminology in all
languages.

From C comes the very nice distinction between "formal" and "actual" args.

And from algol-60, own and local -- own sure beats "static"!

And so on.


To me, it's too bad that that hacker-supreme (and certified genius)
Larry W. likes to make up his own terminology for Perl. Sure makes
for a lot of otherwise-unnecessary pages in the various Perl texts,
as well as posts here.

Of course, a whole lot better his terminology than no language at all!


David


David Combs

unread,
May 29, 2008, 9:08:47 PM5/29/08
to
(This one is also cross-posted, to apologize to one and all
about my just-prior followup.)

I stupidly didn't remember that whatever followup I made
would also get crossposted until *after* I had knee-jerk
hit "s" (send), before I noticed the warning (Pnews?) about
just how many groups it would be posted to.

A suggestion for Pnews: that as soon as you give the
F (followup for trn), ie as soon as Pnews starts-up
on this followup, before you've typed in anything
or given it a filename to include, that AT THAT TIME
it remind you that it'll be crossposted to the
following 25 newsgroups:
1: foo
2: comp.lang.perl.misc
3: other-group
4: ...


-- so, way before you've said anything, you can
abort it if you want to.


SORRY!


David


John Thingstad

unread,
May 30, 2008, 4:03:37 PM5/30/08
to

Perl is solidly based in the UNIX world on awk, sed, bash and C.
I don't like the style, but many do.

--------------
John Thingstad

Lew

unread,
May 30, 2008, 7:11:01 PM5/30/08
to
John Thingstad wrote:
> Perl is solidly based in the UNIX world on awk, sed, bash and C.
> I don't like the style, but many do.

Please exclude the Java newsgroups from this discussion.

--
Lew

Gordon Etly

unread,
May 30, 2008, 7:23:05 PM5/30/08
to

Why? Do you speak for everyone in that, this, or other groups?


--
G.Etly


Lew

unread,
May 30, 2008, 7:33:22 PM5/30/08
to

I don't know why you'd even want to impose your Perl conversation on the Java
group in the first place, troll.

Plonk.

--
Lew

Stephan Bour

unread,
May 30, 2008, 7:40:39 PM5/30/08
to

Did it ever occur to you that you don't speak for entire news groups?


Stephan.


Arne Vajhøj

unread,
May 30, 2008, 9:35:47 PM5/30/08
to

Did it occur to you that there is nothing about Java in the above?

Arne

szr

unread,
May 31, 2008, 1:40:03 AM5/31/08
to

Looking at the original post, it doesn't appear to be about any specific
language.

--
szr


Peter Duniho

unread,
May 31, 2008, 3:36:27 AM5/31/08
to

Indeed. That suggests it's probably off-topic in most, if not all, of the
newsgroups to which it was posted, inasmuch as they exist for topics
specific to a given programming language.

Regardless, unless you are actually reading this thread from the c.l.j.p
newsgroup, I'm not sure I see the point in questioning someone who _is_
about whether the thread belongs there or not. Someone who is actually
following the thread from c.l.j.p can speak up if they feel that Lew is
overstepping his bounds. Anyone else has even less justification for
"speaking for the entire newsgroup" than Lew does, and yet that's what
you're doing when you question his request.

And if it's a vote you want, mark me down as the third person reading
c.l.j.p that doesn't feel this thread belongs. I don't know whether Lew
speaks for the entire newsgroup, but based on comments so far, it's pretty
clear that there is unanimous agreement among those who have expressed an
opinion.

If you all in the other newsgroups are happy having the thread there,
that's great. Please feel free to continue with your discussion. But
please, drop comp.lang.java.programmer from the cross-posting.

Pete

szr

unread,
May 31, 2008, 12:27:11 PM5/31/08
to
Peter Duniho wrote:
> On Fri, 30 May 2008 22:40:03 -0700, szr <sz...@szromanMO.comVE> wrote:
>
>> Arne Vajhøj wrote:
>>> Stephan Bour wrote:
>>>> Lew wrote:
>>>> } John Thingstad wrote:
>>>> } > Perl is solidly based in the UNIX world on awk, sed,
>>>> } > bash and C. I don't like the style, but many do.
>>>> }
>>>> } Please exclude the Java newsgroups from this discussion.
>>>>
>>>> Did it ever occur to you that you don't speak for entire news
>>>> groups?
>>>
>>> Did it occur to you that there are nothing about Java in the above ?
>>
>> Looking at the original post, it doesn't appear to be about any
>> specific language.
>
> Indeed. That suggests it's probably off-topic in most, if not all,
> of the newsgroups to which it was posted, inasmuch as they exist for
> topics specific to a given programming language.

Perhaps - comp.programming might have been a better place, but not all
people who follow groups for specific languages follow a general group
like that - but let me ask you something. What is it you really have
against discussing topics with people of neighboring groups? Keep in
mind you don't have to read anything you do not want to read. [1]

> Regardless, unless you are actually reading this thread from the
> c.l.j.p newsgroup, I'm not sure I see the point in questioning
> someone who _is_ about whether the thread belongs there or not.

I would rather have the OP comment about that, as he started the thread.
But what gets me is why you are against that specific group being
included but not others? What is so special about the Java group and why
are you so sure people there don't want to read this thread? [1] What
right do you or I or anyone have to make decisions for everyone in a
news group? Isn't this why most news readers allow one to block a
thread?

> And if it's a vote you want, mark me down as the third person reading
> c.l.j.p that doesn't feel this thread belongs. I don't know whether
> Lew speaks for the entire newsgroup, but based on comments so far,
> it's pretty clear that there unanimous agreement among those who have
> expressed an opinion.

Ok, so, perhaps 3 people out of what might be several hundred, if not
several thousand (there is no way to really know, but there are certainly
a lot of people who read that group, and as with any group, there are far
more readers than there are people posting). So, again, just because you
or two other people don't want to read a topic or dislike it, you feel
you can decide for EVERYONE that they mustn't read it? Again, this is
why readers allow you to ignore threads. Please don't force your views
on others; let them decide for themselves. [1]


[1] I do not mean this topic specifically, but in general, if one
dislikes a thread, they are free to ignore it. I find it rather
inconsiderate to attempt to force a decision for everyone, when
one has the ability to simply ignore the thread entirely.


--
szr


Jürgen Exner

unread,
May 31, 2008, 12:48:11 PM5/31/08
to
"szr" <sz...@szromanMO.comVE> wrote:
>I would rather have the OP comment about that, as he started the thread.

The OP is a very well-known troll who has the habit of spitting out a
borderline OT article to a bunch of loosely related NGs every so often and
then sits back and enjoys the complaints and counter-complaints of the
regulars. He doesn't provide anything useful in any of the groups he
targets (at least AFAIK) and he doesn't participate in the resulting
mayhem himself, either.
He will only go away if everyone just ignores him.

With this in mind another reminder:

[ASCII art (signed "jgs"): an owl perched atop a sign reading "PLEASE DO NOT FEED THE TROLLS -- Thank you, Management"]

Follow-up adjusted.

jue

szr

unread,
May 31, 2008, 2:36:54 PM5/31/08
to
Jürgen Exner wrote:
> "szr" <sz...@szromanMO.comVE> wrote:
>> I would rather have the OP comment about that, as he started the
>> thread.
>
> The OP is a very well-known troll who has the habit of spitting out a
> borderline OT article to a bunch of loosly related NGs ever so often
> and then sits back and enjoys the complaints and counter-complaints
> of the regulars.

While I agree cross-posting should be chosen more carefully, it seemed
like a harmless article to me. I did not get the impression he was just
trolling. There are people who like to post articles they come across
and maybe want to start a discussion on.

Like I said, this may have been better to put in a more general
programming group, and that yes, it is kind of OT for specific language
groups, but I really saw no harm in it (and I saw no one try to redirect
the discussion to a more general group), and you and anyone else are
free to ignore the thread. All I ask is you allow people to make up
their own minds.

--
szr


Arne Vajhøj

unread,
May 31, 2008, 8:30:45 PM5/31/08
to
szr wrote:

That does not make it on topic in the Java group.

And the subthread Lew commented on most certainly is not.

Arne

Arne Vajhøj

unread,
May 31, 2008, 8:43:22 PM5/31/08
to
szr wrote:
> Peter Duniho wrote:
>> On Fri, 30 May 2008 22:40:03 -0700, szr <sz...@szromanMO.comVE> wrote:
>>> Arne Vajhøj wrote:
>>>> Stephan Bour wrote:
>>>>> Lew wrote:
>>>>> } John Thingstad wrote:
>>>>> } > Perl is solidly based in the UNIX world on awk, sed,
>>>>> } > bash and C. I don't like the style, but many do.
>>>>> }
>>>>> } Please exclude the Java newsgroups from this discussion.
>>>>>
>>>>> Did it ever occur to you that you don't speak for entire news
>>>>> groups?
>>>> Did it occur to you that there are nothing about Java in the above ?
>>> Looking at the original post, it doesn't appear to be about any
>>> specific language.
>> Indeed. That suggests it's probably off-topic in most, if not all,
>> of the newsgroups to which it was posted, inasmuch as they exist for
>> topics specific to a given programming language.
>
> Perhaps - comp.programming might of been a better place, but not all
> people who follow groups for specific languages follow a general group
> like that - but let me ask you something. What is it you really have
> against discussing topics with people of neighboring groups? Keep in
> mind you don't have to read anything you do not want to read. [1]

I very much doubt that the original thread is relevant for the Java
group.

But the subthread Lew commented on was about Perl and Unix. That is
clearly off topic.

Personally I am rather tolerant about topics. But I cannot blame Lew
for requesting that a Perl-Unix discussion not get cross-posted
to a Java group.

>> Regardless, unless you are actually reading this thread from the
>> c.l.j.p newsgroup, I'm not sure I see the point in questioning
>> someone who _is_ about whether the thread belongs there or not.
>
> I would rather have the OP comment about that, as he started the thread.
> But what gets me is why you are against that specific group being
> included but not others? What is so special about the Java group and why
> are you so sure people there don't want to read this thread? [1] What
> right do you or I or anyone have to make decisions for everyone in a
> news group? Isn't this why most news readers allow one to block a
> thread?

I doubt Lew read any of the other groups, so it seems quite
natural that he did not comment on the on/off topic characteristics
in those.

>> And if it's a vote you want, mark me down as the third person reading
>> c.l.j.p that doesn't feel this thread belongs. I don't know whether
>> Lew speaks for the entire newsgroup, but based on comments so far,
>> it's pretty clear that there unanimous agreement among those who have
>> expressed an opinion.
>
> Ok, so, perhaps 3 people out of what might be several hundred, if not
> thousand (there is no way to really know, but there are certainly a lot
> of people who read that group, and as with any group, there are far more
> readers than there are people posting, so, again, just because you or
> two other people or so don't want to read a topic or dislike it, you
> feel you can decide for EVERYONE they mustn't read it? Again, this is
> why readers allow you to ignore threads. Please don't force your views
> on others; let them decide for themselves. [1]

And I am sure that Lew did not intend to pretend to speak for
the entire group. He spoke for himself.

I believe there have been several posts that agreed with him and none
that disagreed, so it seems very plausible that the group indeed agrees
with him.

Arguing that a huge silent majority has a different opinion
from those speaking up is a very questionable argument. Everybody
could try to count them for their view. The only reasonable
thing is not to count them.

Arne

szr

unread,
Jun 1, 2008, 2:27:35 AM6/1/08
to
Arne Vajhøj wrote:
> szr wrote:
>> Peter Duniho wrote:
>>> On Fri, 30 May 2008 22:40:03 -0700, szr <sz...@szromanMO.comVE>
>>> wrote:

I agree with and understand what you are saying in general, but still,
isn't it possible that there are people in the Java group (and others)
who might have been following the thread, only to discover (probably not
right away) that someone decided to remove the group they were reading
the thread from? I know I would not like that, even if it wasn't on
topic at that branch.

Personally, I find it very annoying to have to switch news groups in
order to resume a thread and weed my way down the thread to where it
left off before it was cut off from the previous group.
--
szr


Peter Duniho

unread,
Jun 1, 2008, 2:43:21 AM6/1/08
to
On Sat, 31 May 2008 23:27:35 -0700, szr <sz...@szromanMO.comVE> wrote:

> [...]


>> But the subthread Lew commente don was about Perl and Unix. That is
>> clearly off topic.
>
> I agree with and understand what you are saying in general, but still,
> isn't it possible that were are people in the java group (and others)
> who might of been following the thread, only to discover (probably not
> right away) that someone decided to remove the group they were reading
> the thread from? I know I would not like that, even if it wasn't on
> topic at the branch.

All due respect, I don't really care if those people find the thread
gone. And no one should.

Each individual person has a wide variety of interests. A thread that is
off-topic in a newsgroup may in fact concern a topic of interest for
someone who just happened to be reading that newsgroup. The fact that
that person might have been interested in it isn't justification for
continuing the thread in that newsgroup. The most important question
isn't who might have been reading the thread, but rather whether the
thread is on-topic.

What if someone cross-posted a thread about motorcycle racing in the Perl
newsgroup as well as an actual motorcycle racing newsgroup? No doubt, at
least some people reading the Perl newsgroup have an interest in
motorcycle racing. They may in fact be racers themselves. Those people
may have found the thread about motorcycle racing interesting.

Does that justify the thread continuing to be cross-posted to the Perl
newsgroup? No, of course not.

So please. Quit trying to justify a thread being cross-posted to a
newsgroup that you aren't even reading just on the sole basis of the
remote possibility that someone in that newsgroup was interested in the
thread. It's not a legitimate justification, and even if it were, there's
been sufficient opportunity for someone here in the Java newsgroup to
speak up and say "hey, wait! I was reading that!"

But no one's said anything of the sort. Those people who don't exist have
no need for you to provide an irrelevant defense for them.

> Personally, I find it very annoying to have to switch news groups in
> order to resume a thread and weed my way down the thread to where it
> left off before it was cut off from the previous group.

If people use the newsgroups responsibly, that never happens.

A thread should never be cut-off midstream like that unless it was
inappropriately cross-posted in the first place, and if the thread was
inappropriately cross-posted in the first place, no one has any business
expecting to be able to continue reading it in any newsgroup where it's
off-topic.

If you're interested in discussions on Perl and Unix, go read a newsgroup
about Perl and/or Unix. Don't look for those discussions in the Java
newsgroup, and don't get comfy reading the thread in the Java newsgroup
should you happen across it. They don't belong, and they should be
terminated within the Java newsgroup ASAP. Go follow the thread where
it's on-topic.

Pete

szr

unread,
Jun 1, 2008, 4:24:18 AM6/1/08
to
Peter Duniho wrote:
> On Sat, 31 May 2008 23:27:35 -0700, szr <sz...@szromanMO.comVE> wrote:
>
>> [...]
>>> But the subthread Lew commente don was about Perl and Unix. That is
>>> clearly off topic.
>>
>> I agree with and understand what you are saying in general, but
>> still, isn't it possible that were are people in the java group (and
>> others) who might of been following the thread, only to discover
>> (probably not right away) that someone decided to remove the group
>> they were reading the thread from? I know I would not like that,
>> even if it wasn't on topic at the branch.
>
> All due respect, I don't really care if those people find the thread
> gone. And no one should.

I prefer to be considerate of others.

> Each individual person has a wide variety of interests. A thread
> that is off-topic in a newsgroup may in fact be concerning a topic of
> interest for someone who just happened to be reading that newsgroup.

Well if a thread has absolutely no relation to a group, then yes,
cross-posting to said group is inappropriate, and setting follow-ups may
well be warranted. But when there is some relation, sometimes it may be
better to mark it as [OT] in the subject line, a practice that is
sometimes seen, and seems to suffice.

> What if someone cross-posted a thread about motorcycle racing in the
> Perl newsgroup as well as an actual motorcycle racing newsgroup?

You are comparing apples and oranges now; sure, if you post about
motorcycles (to use your example) it would be wildly off topic, but the
thread in question was relating to programming (the naming of functions
and such) in general.

> Does that justify the thread continuing to be cross-posted to the Perl
> newsgroup? No, of course not.

But who decides this? And why does said individual get to decide for
everyone?

> So please. Quit trying to justify a thread being cross-posted to a
> newsgroup that you aren't even reading

You do not know what groups I read. And I am not attempting to justify
cross posting at all. Rather I am arguing against deciding for a whole
news group when a thread should be discontinued.

--
szr


Arne Vajhøj

unread,
Jun 1, 2008, 7:04:35 PM6/1/08
to

I am relatively tolerant towards threads that are a bit off topic, if
the S/N ratio overall is good.

But I accept and respect that other people have a stricter
attitude against off-topic posts.

And I have very little tolerance for people who think they
can attack those who want only on-topic posts.

One thing is to ask for a bit of slack regarding the rules;
something else is attacking those who want the rules
kept.

Arne

szr

unread,
Jun 2, 2008, 1:45:25 AM6/2/08
to

Agreed.

[...]

If a cross-posted thread branches off on a tangent that has
nothing to do with one or more groups whatsoever, then yes, it makes
sense to prune the 'Newsgroups:' list / set follow-ups. But in this case,
someone made one mention or so of 'Perl', which was being used as an
example, and someone (Lew) moved to have the Java group removed.

There was little reason to cut off the thread, when people very well may
have been following it, over the utterance of one word, which was being
used as an example. The bulk of the thread had to do with general
programming, and suddenly writing the name of a language doesn't mean
it's way off on a tangent.

I hope this clears the waters a bit.

Regards.

--
szr


Robert Maas, http://tinyurl.com/uh3t

unread,
Jun 5, 2008, 1:39:22 AM6/5/08
to
> From: dkco...@panix.com (David Combs)

> Lisp is *so* early a language (1960?), preceeded mainly only by
> Fortran (1957?)?, and for sure the far-and-away the first as a
> platform for *so many* concepts of computer-science, eg lexical vs
> dynamic ("special") variables, passing *unnamed* functions as
> args ... maybe is still the only one in which program and data

> have the same representation -- that it'd seem logical to use it's
> terminology in all languages.

Yeah, but why did you cross-post to so many newsgroups? Are you
trying to start a flame war between advocates of the various
languages? (Same accusation to the OP, even more so!)

> From C is the very nice distinction between "formal" and "actual" args.

I think Lisp already had that nearly 50 years ago. Function
definition (lambda expression) has formal args, EVAL recursively
calls EVAL on sub-forms to create actual args and calls APPLY on
them and whatever function is named in the CAR position of the form.
Whether anybody bothered to use that specific jargon, or it was
just so obvious it didn't need jargon, I don't know.
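That division of labor -- EVAL evaluating sub-forms into actual args, APPLY binding them to the formal args of the lambda expression -- can be sketched in miniature. (A toy Python evaluator, purely illustrative; the tuple-based form encoding and the names `eval_form`/`apply_fn` are invented here, not anyone's real implementation.)

```python
# Toy sketch of the EVAL/APPLY split: EVAL turns the sub-forms of a
# call into actual arguments; APPLY binds formal parameters to those
# actuals and evaluates the body in the extended environment.

def eval_form(form, env):
    if isinstance(form, (int, float)):            # self-evaluating datum
        return form
    if isinstance(form, str):                     # variable reference
        return env[form]
    op, *args = form
    if op == "lambda":                            # ("lambda", formals, body)
        return ("closure", args[0], args[1], env)
    if op == "+":                                 # one builtin, for the demo
        return sum(eval_form(a, env) for a in args)
    fn = eval_form(op, env)                       # the CAR position
    actuals = [eval_form(a, env) for a in args]   # EVAL makes actual args
    return apply_fn(fn, actuals)

def apply_fn(fn, actuals):
    _tag, formals, body, env = fn
    frame = dict(env)
    frame.update(zip(formals, actuals))           # bind formal -> actual
    return eval_form(body, frame)

# ((lambda (x y) (+ x y)) (+ 1 2) 4): formals x, y get actuals 3, 4
call = (("lambda", ["x", "y"], ("+", "x", "y")), ("+", 1, 2), 4)
print(eval_form(call, {}))   # 7
```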

> And from algol-60, own and local -- own sure beats "static"!

Yeah. But now that you mention it and I think about it, what's
really meant is "private persistent". Global variables are public
persistent. Local variables and formal args to functions are
private transient (they go away as soon as the function returns).
but OWN variables are private to the function but stay around
"forever" just like globals do, so that side effects on the OWN
variables that occurred during one call can persist to affect the
next call. Lexical closures in Common Lisp go one step further,
allowing private persistent variables to be shared between several
functions. All those functions share access to the private variable
which they co-OWN. Another way in which OWN or lexical-closure
variables aren't like what the word "own" means in ordinary
language is that it's possible to transfer ownership by selling or
giving something to somebody else, but not with OWN variables or
lexical-closure variables. So even though I like the word OWN
better than the word STATIC for this meaning, I'm not totally
comfortable with that jargon. But "persistent private" is a
mouthful compared to "OWN", and I doubt anyone can find a word of
appx. 3 characters that conveys the intended meaning so we're
probably stuck with "OWN" as the best short term.
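The "persistent private" reading sketches naturally as closures. (Python rather than ALGOL or Common Lisp, purely for illustration; `make_counter` and `make_cell` are invented names.)

```python
# An ALGOL-style OWN variable: private to one function but persistent
# across calls -- modeled here by a variable closed over by one closure.
def make_counter():
    count = 0                  # "own": private + persistent
    def bump():
        nonlocal count
        count += 1
        return count
    return bump

# The lexical-closure refinement described above: several functions
# co-OWN one private variable by closing over the same binding.
def make_cell():
    value = None               # shared by both closures below
    def put(v):
        nonlocal value
        value = v
    def get():
        return value
    return put, get

bump = make_counter()
print(bump(), bump(), bump())  # 1 2 3

put, get = make_cell()
put("hello")
print(get())                   # hello
```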

Jon Harrop

unread,
Jun 5, 2008, 6:37:48 AM6/5/08
to
Robert Maas, http://tinyurl.com/uh3t wrote:
>> From: dkco...@panix.com (David Combs)
>> Lisp is *so* early a language (1960?), preceeded mainly only by
>> Fortran (1957?)?, and for sure the far-and-away the first as a
>> platform for *so many* concepts of computer-science, eg lexical vs
>> dynamic ("special") variables, passing *unnamed* functions as
>> args ... maybe is still the only one in which program and data
>> have the same representation -- that it'd seem logical to use it's
>> terminology in all languages.
>
> Yeah, but why did you cross-post to so many newsgroups? Are you
> trying to run a flame war between advocates of the various
> languages?

What would be the point? We all know that Java, Perl, Python and Lisp suck.
They don't even have pattern matching over algebraic sum types if you can
imagine that. How rudimentary...

--
Dr Jon D Harrop, Flying Frog Consultancy
http://www.ffconsultancy.com/products/?u

jon.harro...@gmail.com

unread,
Jun 5, 2008, 9:17:01 AM6/5/08
to
On 5 Giu, 12:37, Jon Harrop <j...@ffconsultancy.com> wrote:
> [...]

P.S. Please don't look at my profile (at google groups), thanks!

Jon Harrop

John W Kennedy

unread,
Jun 24, 2008, 6:42:15 PM6/24/08
to
David Combs wrote:
> passing
> *unnamed* functions as args (could Algol 60 also do something like that,
> via something it maybe termed a "thunk")

No, the "thunks" were necessary at the machine-language level to
/implement/ ALGOL 60, but they could not be expressed /in/ ALGOL.

--
John W. Kennedy
"The first effect of not believing in God is to believe in anything...."
-- Emile Cammaerts, "The Laughing Prophet"

Robert Maas, http://tinyurl.com/uh3t

unread,
Jun 30, 2008, 2:44:02 AM6/30/08
to
Why this response is so belated:
<http://groups.google.com/group/misc.misc/msg/cea714440e591dd2>
= <news:rem-2008...@yahoo.com>
> Date: Thu, 05 Jun 2008 11:37:48 +0100
> From: Jon Harrop <j...@ffconsultancy.com>

> We all know that Java, Perl, Python and Lisp suck.

Well at least you're three-quarters correct there.
But Lisp doesn't suck. That's where you're one quarter wrong.

> They don't even have pattern matching over algebraic sum types if
> you can imagine that.

I'm still waiting for you to precisely define what you mean by
"pattern matching" in this context. All I've heard from you so far
are crickets. If you've set up a Web page with your personal
definition of "pattern matching" as you've been using that term
here, and you've posted its URL in a newsgroup article I didn't
happen to see, please post just the URL again here so that I might
finally see it. Or e-mail me the URL.

-
Nobody in their right mind likes spammers, nor their automated assistants.
To open an account here, you must demonstrate you're not one of them.
Please spend a few seconds to try to read the text-picture in this box:

/----------------------------------------------------------------------------\
| ,-.-. | |
| | | |, . ,---.,---.,---. ,---.,---.,---.,---|,---. |
| | | || | `---.| || | | ||---'|---'| |`---. |
| ` ' '`---| `---'`---'` ' ` '`---'`---'`---'`---' |
| `---' |
| | | o | | |
| |---.,---.| ,---. .,---. ,---.| ,---.,---. |---.,---.,---. |
| | ||---'| | | || | ,---|| | ||---'---| || ,---| |
| ` '`---'`---'|---' `` ' `---^`---'`---|`---' `---'` `---^ | |
| | `---' ' |
| | | | |
| ,---.,---.|--- . . .,---.,---.,---|,---.,---. |---.,---.,---. |
| | || || | | || || || ||---'| ---| || ,---| |
| ` '`---'`---' `-'-'`---'` '`---'`---'` `---'` `---^o |
\--------(Rendered by means of <http://www.schnoggo.com/figlet.html>)--------/
(You don't need JavaScript or images to see that ASCII-text image!!
You just need to view this in a fixed-pitch font such as Monaco.)

Then enter your best guess of the text (40-50 chars) into this TextField:
+--------------------------------------------------+
| |
+--------------------------------------------------+

Robert Maas, http://tinyurl.com/uh3t

unread,
Jun 30, 2008, 3:03:30 AM6/30/08
to
> Date: Thu, 5 Jun 2008 06:17:01 -0700 (PDT)
> From: jon.harrop.ms.sh...@gmail.com

> P.S. Please don't look at my profile (at google groups), thanks!

Please don't look at the orange and green checkered elephant
playing a harp off-key while sitting on a toilet and passing wind.

Robert Maas, http://tinyurl.com/uh3t

unread,
Jun 30, 2008, 3:07:03 AM6/30/08
to
> Date: Tue, 24 Jun 2008 18:42:15 -0400
> From: John W Kennedy <jwke...@attglobal.net>
> ... the "thunks" were necessary at the machine-language level to

> /implement/ ALGOL 60, but they could not be expressed /in/ ALGOL.

Ah, thanks for the clarification. Is that info in the appropriate
WikiPedia page? If not, maybe you would edit it in?

Lew

unread,
Jun 30, 2008, 7:51:21 AM6/30/08
to
Robert Maas wrote:
> /----------------------------------------------------------------------------\
> | ,-.-. | |
> | | | |, . ,---.,---.,---. ,---.,---.,---.,---|,---. |
> | | | || | `---.| || | | ||---'|---'| |`---. |
> | ` ' '`---| `---'`---'` ' ` '`---'`---'`---'`---' |
> | `---' |
> | | | o | | |
> | |---.,---.| ,---. .,---. ,---.| ,---.,---. |---.,---.,---. |
> | | ||---'| | | || | ,---|| | ||---'---| || ,---| |
> | ` '`---'`---'|---' `` ' `---^`---'`---|`---' `---'` `---^ | |
> | | `---' ' |
> | | | | |
> | ,---.,---.|--- . . .,---.,---.,---|,---.,---. |---.,---.,---. |
> | | || || | | || || || ||---'| ---| || ,---| |
> | ` '`---'`---' `-'-'`---'` '`---'`---'` `---'` `---^o |
> \--------(Rendered by means of <http://www.sc*****************.html>)--------/

> (You don't need JavaScript or images to see that ASCII-text image!!
> You just need to view this in a fixed-pitch font such as Monaco.)
>
> Then enter your best guess of the text (40-50 chars) into this TextField:
> +--------------------------------------------------+
> | Your son totally needs a Wonder-Bra(r), double-D |
> +--------------------------------------------------+

--
Lew

John W Kennedy

Jul 1, 2008, 9:56:58 PM
Robert Maas, http://tinyurl.com/uh3t wrote:

It is explained s.v. "thunk", which is referenced from "ALGOL 60". The
ALGOL "pass-by-name" argument/parameter matching was perhaps the most
extreme example ever of a language feature that was "elegant" but
insane. What it meant, in effect, was that, unless otherwise marked,
every argument was passed as two closures, one that returned a fresh
evaluation of the expression given as the argument, which was called
every time the parameter was read, and one that set the argument to a
new value, which was called every time the parameter was set.

See <URL:http://www.cs.sfu.ca/~cameron/Teaching/383/PassByName.html>.

ALGOL 60 could not create generalized user-written closures, but could
create one no more complex than a single expression with no arguments of
its own simply by passing the expression as an argument. But it was not
thought of as a closure; that was just how ALGOL 60 did arguments.
--
John W. Kennedy
"Give up vows and dogmas, and fixed things, and you may grow like
That. ...you may come to think a blow bad, because it hurts, and not
because it humiliates. You may come to think murder wrong, because it
is violent, and not because it is unjust."
-- G. K. Chesterton. "The Ball and the Cross"

Robert Maas, http://tinyurl.com/uh3t

Jul 20, 2008, 1:36:21 PM
> >> ... the "thunks" were necessary at the machine-language level to
> >> /implement/ ALGOL 60, but they could not be expressed /in/ ALGOL.
> > Ah, thanks for the clarification. Is that info in the appropriate
> > WikiPedia page? If not, maybe you would edit it in?
> From: John W Kennedy <jwke...@attglobal.net>
> It is explained s.v. "thunk", which is referenced from "ALGOL
> 60". The ALGOL "pass-by-name" argument/parameter matching was
> perhaps the most extreme example ever of a language feature that
> was "elegant" but insane. What it meant, in effect, was that,
> unless otherwise marked, every argument was passed as two closures,
> one that returned a fresh evaluation of the expression given as the
> argument, which was called every time the parameter was read, and
> one that set the argument to a new value, which was called every
> time the parameter was set.

Wow! All these years when I occasionally heard of a "thunk" I never
was told, until now, what it really meant. Thanks for the info!!

Followup question #1: I assume these are lexical closures in the
environment of the point of the call, right?

Followup question #2: For simple arithmetic expressions, I can
possibly understand how the UPDATE closure might be implemented
(expressed in Lisp to make the intent clear):
Call form: MyFunction(X+2);
GET closure: (+ closedX 2)
UPDATE closure: (lambda (newval) (setf closedX (- newval 2)))
Thus from inside MyFunction where formal parameter Arg1 is bound
to actual parameter X+2, after doing Arg1 := 7; X will have the
value 5 so that evaluating Arg1 will return 7 as expected, right?
But if the actual argument is something complicated, especially if
it makes a nested function call, how can that possibly be
implemented? Given an arbitrary expression that calls some external
function, how can assigning a value to that expression make
sufficient changes in the runtime environment such that
subsequently evaluating that expression will yield the expected
value i.e. the value that had been assigned?
Or is the default of passing two closures (GET and UPDATE) *only*
if the actual-argument expression is simple enough that it's
invertible, and in complicated cases only a GET closure is passed
(or the UPDATE closure is simply a signal of a runtime error
that you're not allowed to assign a value to a complicated expression)?

IMO the "right" way to pass parameters that can be modified is to
use "locatives" as on the Lisp Machine. That converts the idea of
a "place" (as used by SETF in Common Lisp) into a "first-class
citizen" which can be passed around and stored etc., compared to a
SETF place, which is merely a compile-time macro trick to convert
place-references in source code into direct calls to the
appropriate accessor just above the place, followed by a specialized
SETter call to do the act. A hack to emulate a locative in CL would
be to pass a closure that bundles the code to find the object directly
containing the place, any parameters needed to find that place,
and the function needed to perform the act. Then the called
function would need to know it's going to get such a thunk-like
closure, but since it's expecting to modify one of its parameters
anyway, that's reasonable. Sketch of implementation (two special cases):
(defun make-thunk-cadr (topptr)
  (let* ((midptr (cdr topptr))
         (getclo (make-getter-closure :PARENT midptr :GETTERFN #'car
                                      :PARMS nil))
         (setclo (make-setter-closure :PARENT midptr :SETTERFN #'rplaca
                                      :PARMS nil)))
    (make-thunk getclo setclo)))
(defun make-thunk-aref1 (topptr arrindex1)
  (let ((getclo (make-getter-closure :PARENT topptr :GETTERFN #'aref1
                                     :PARMS (list arrindex1)))
        (setclo (make-setter-closure :PARENT topptr :SETTERFN #'setaref1
                                     :PARMS (list arrindex1))))
    (make-thunk getclo setclo)))
(defun swap (thunk1 thunk2)
  (prog (tmp)
    (setq tmp (thunk-get thunk1))
    (thunk-set thunk1 (thunk-get thunk2))
    (thunk-set thunk2 tmp)))
;Definitions of make-getter-closure make-setter-closure make-thunk
; thunk-get thunk-set not shown because they depend on whether
; closures and thunks are implemented via tagged assoc lists or
; DEFSTRUCT structures or CLOS objects or whatever. But I made the
; call to the constructors explicit enough that it should be obvious
; what components are inside each type of object. Note that with
; CLOS objects, this could all be condensed to have a single CLOS
; object which is the thunk which has two methods GET and SET, no
; need to make closures for get and set separately, templates for
; those closures are made automatically when the CLOS class is
; defined, and closures are generated from those templates whenever
; a new CLOS thunk-object is made. Thus:
; ... (make-CLOS-thunk :PARENT topptr :GETTERFN #'aref1 :SETTERFN #'setaref1
; :PARMS (list arrindex1)) ...

;Example that should actually work:
(setq arr #(3 5 7 11 13))
(setq ixs (list 2 4))
(format t " arr: ~S~% ixs: ~S~%" arr ixs)
 arr: #(3 5 7 11 13)
 ixs: (2 4)
(setq thunkcadr (make-thunk-cadr ixs))
;Locative to the 4 in ixs
(setq thunkaref (make-thunk-aref1 arr (thunk-get thunkcadr)))
;Locative to the 11 in the array
(swap thunkcadr thunkaref)
(format t " arr: ~S~% ixs: ~S~%" arr ixs)
 arr: #(3 5 7 4 13)
 ixs: (2 11)
I haven't implemented this. I'm just specifying what the behaviour should be
and giving a sketch of how it ought to be easily doable in Common Lisp.

And I'm not going to implement it because I have no use for this way
of coding, at least not currently or in the foreseeable future.
<tmi> Generally my abstract data type is at a higher level where the
caller doesn't know that a single place is going to need to be
SETFed, so there's no point in getting a locative to work with.
Instead there's some *kind* of update to do, and parameters to
that *kind* of update; what really happens internally (one or more
SETFs, or alternately re-build anything that changed and share what
didn't change) doesn't need to be known by the caller. All the
caller needs to know is generically whether the update is in-place
or non-destructive. (If it's non-destructive, then the new edition
of the data structure is one of the return values. If it's
in-place, then there's no need to bother with setq of the new
value, because my structures always have a header cell that has a
tag for the intentional datatype, and that header cell always
points to either the in-place-modified object or the
latest-edition-of-object.) </tmi>
<mtmi> I'd rather program in "paranoid" mode than in "risk shoot foot" mode.
The CAR of each ADT object is a keyword identifying the
intentional type of that object, and every function that
operates on that intentional type first checks if the parameter
really does have the expected CAR, just to make sure I didn't
copy&paste some inappropriate function name in my code. Yeah, it
takes extra CPU cycles to do that checking on every call, but it
sure saves me from shooting myself in the foot and having to spend
an hour to find out how I did it before I can fix it. </mtmi>
<emtmi> TMI = Too Much Information (actually YMMV, some readers might like it)
MTMI = More of Too Much Information
EMTMI = Even More of Too Much Information (only newbies need read)
Credits to Babylon 5 for the "I spy" game in the cargo hold.
I spy something that starts with the letter B. Boxes!
I spy something that starts with the letter M. More boxes! <emtmi>

John W Kennedy

Jul 20, 2008, 6:29:25 PM
Robert Maas, http://tinyurl.com/uh3t wrote:
>>>> ... the "thunks" were necessary at the machine-language level to
>>>> /implement/ ALGOL 60, but they could not be expressed /in/ ALGOL.
>>> Ah, thanks for the clarification. Is that info in the appropriate
>>> WikiPedia page? If not, maybe you would edit it in?
>> From: John W Kennedy <jwke...@attglobal.net>
>> It is explained s.v. "thunk", which is referenced from "ALGOL
>> 60". The ALGOL "pass-by-name" argument/parameter matching was
>> perhaps the most extreme example ever of a language feature that
>> was "elegant" but insane. What it meant, in effect, was that,
>> unless otherwise marked, every argument was passed as two closures,
>> one that returned a fresh evaluation of the expression given as the
>> argument, which was called every time the parameter was read, and
>> one that set the argument to a new value, which was called every
>> time the parameter was set.
>
> Wow! All these years when I occasionally heard of a "thunk" I never
> was told, until now, what it really meant. Thanks for the info!!
>
> Followup question #1: I assume these are lexical closures in the
> environment of the point of the call, right?

Yes. (Actually, subprogram calls are first described as working like
macro expansions, but then the specification adds that there must be
magic fixups so that variable names are resolved in point-of-call
context anyway.)

At this point in the history of computing, the conceptual distinction
between subprograms and macros was not at all clear. It was quite
possible to have "macros" that generated an out-of-line subprogram once
and then generated calling code the first time and every time thereafter
(it was a way of life on systems without linkage editors or linking
loaders), and it was also quite possible to refer to in-line macro
expansions as "subprograms". I suspect that the triumph of FORTRAN may
have had something to do with cleaning up the terminological mess by the
mid-60s.

Into the 60s, indeed, there were still machines being made that had no
instruction comparable to the mainframe BASx/BALx family, or to Intel's
CALL. You had to do a subprogram call by first overwriting the last
instruction of what you were calling with a branch instruction that
would return back to you.

> Followup question #2: For simple arithmetic expressions, I can
> possibly understand how the UPDATE closure might be implemeted
> (expressed in Lisp to make the intent clear):
> Call form: MyFunction(X+2);
> GET closure: (+ closedX 2)
> UPDATE closure: (lambda (newval) (setf closedX (- newval 2))
> Thus from inside MyFunction where formal parameter Arg1 is bound
> to actual parameter X+2, after doing Arg1 := 7; X will have the
> value 5 so that calling Arg1 will return 7 as expected, right?
> But if the actual argument is something complicated, especially if
> it makes a nested function call, how can that possibly be
> implemented?

It was forbidden. In the formal definition of ALGOL 60, there was no
such thing as an external subprogram (except for non-ALGOL code, which
was treated as magic), so the whole program was supposed to be one
compile. Therefore, a theoretical compiler could verify that any
parameter used as an LHS was always matched with an argument that was a
variable. (However, a "thunk" was still required to evaluate any array
index and, possibly, to convert between REAL and INTEGER variables.)
Many actual compilers /did/ support separate compilation as a language
extension -- I suppose they had to use run-time checking.

--
John W. Kennedy
"Compact is becoming contract,
Man only earns and pays."
-- Charles Williams. "Bors to Elayne: On the King's Coins"

Martin Gregorie

Jul 22, 2008, 5:21:50 AM
On Tue, 24 Jun 2008 18:42:15 -0400, John W Kennedy wrote:

> David Combs wrote:
>> passing
>> *unnamed* functions as args (could Algol 60 also do something like that,
>> via something it maybe termed a "thunk")
>
> No, the "thunks" were necessary at the machine-language level to
> /implement/ ALGOL 60, but they could not be expressed /in/ ALGOL.
>

Are you sure about that?

The first time I ran across the term "thunking" was when Windows 3
introduced the Win32S shim and hence the need to switch addressing between
16 bit and 32 bit modes across call interfaces. That was called "thunking"
by Microsoft and even they would surely admit it was a kludge.

I used Algol 60 on an Elliott 503 and the ICL 1900 series back when it was
a current language. The term "thunking" did not appear in either compiler
manual nor in any Algol 60 language definition I've seen. A60 could pass
values by name or value and procedures by name. That was it. Call by name
is what is now referred to as reference passing.

I should also point out that Algol 60 was initially written as a means for
communicating algorithms between people. Compiler implementations came
later. In consequence the language did not define links to libraries or
i/o methods. Both features were compiler specific - for instance the
Elliott introduced 'input' and 'print' reserved words and syntax while the
1900 compilers used function calls. The Elliott approach was more readable.

Algol 60 did not have 'functions'. It had procedures which could be
declared to return values or not. A procedure that returned a value was
equivalent to a function but the term 'function' was not used. Similarly
it did not have a mechanism for declaring anonymous procedures. That, like
the incorporation of machine code inserts, would have been a
compiler-specific extension, so it is a terminological mistake to refer to
it without specifying the implementing compiler.


--
martin@ | Martin Gregorie
gregorie. |
org | Zappa fan & glider pilot


Josef Moellers

Jul 22, 2008, 5:56:41 AM
Martin Gregorie wrote:

> Are you sure about that?

> I used Algol 60 on an Elliott 503 and the ICL 1900 series back when it was


> a current language. The term "thunking" did not appear in either compiler
> manual nor in any Algol 60 language definition I've seen. A60 could pass
> values by name or value and procedures by name. That was it. Call by name
> is what is now referred to as reference passing.

Are you sure about that? ;-)

AFAIK "Call by name" is *not* the same as passing an argument by
reference. With "call by name" you can implement this wonderful thing
called "Jensen's Device", which you cannot do when you pass parameters
by reference!

Josef
--
These are my personal views and not those of Fujitsu Siemens Computers!
Josef Möllers (Pinguinpfleger bei FSC)
If failure had no penalty success would not be a prize (T. Pratchett)
Company Details: http://www.fujitsu-siemens.com/imprint.html

Rob Warnock

Jul 22, 2008, 8:23:22 AM
Martin Gregorie <martin@see_sig_for_address.invalid> wrote:
+---------------

| John W Kennedy wrote:
| > No, the "thunks" were necessary at the machine-language level to
| > /implement/ ALGOL 60, but they could not be expressed /in/ ALGOL.
|
| Are you sure about that?
+---------------

I don't know if John is, but *I* am! ;-}

+---------------


| I used Algol 60 on an Elliott 503 and the ICL 1900 series back when
| it was a current language. The term "thunking" did not appear in either
| compiler manual nor in any Algol 60 language definition I've seen.

+---------------

It wouldn't have been. Thunks were something used by Algol 60
*compiler writers* in the code generated by their compilers to
implement the semantics of Algol 60 call-by-name, but were not
visible to users at all [except that they allowed call-by-name
to "work right"].

+---------------


| A60 could pass values by name or value and procedures by name. That
| was it. Call by name is what is now referred to as reference passing.

+---------------

(*sigh*) NO, IT IS NOT!!! Please go read the following:

http://en.wikipedia.org/wiki/Thunk
http://en.wikipedia.org/wiki/Evaluation_strategy#Call_by_name
http://en.wikipedia.org/wiki/Jensen%27s_Device

+---------------
| [Algol 60] did not have a mechanism for declaring anonymous procedures.
+---------------

Quite correct, but completely off the mark. While an Algol 60 *user*
could not declare an anonymous procedure, the *implementation* of an
Algol 60 compiler required the ability for the compiler itself to
generate/emit internal anonymous procedures, to wit, the above-mentioned
thunks, sometimes creating them dynamically during the procedure call.
[Actually, a pair of them for each actual argument in a procedure call.]

+---------------


| That, like the incorporation of machine code inserts, would have been
| a compiler-specific extension, so it is a terminological mistake to
| refer to it without specifying the implementing compiler.

+---------------

Again, "incompetent, irrelevant, and immaterial" [as Perry Mason used
to so frequently object during trials]. Thunks were not "extensions" to
Algol 60 compilers; they were part of the basic implementation strategy
*within* Algol 60 compilers, necessary because of the semantics required
by call-by-name.

Basically, in Algol 60, each parameter must be passed [in general,
that is, one can optimize away many special cases] as *two* closures --
conventionally called "thunks" by Algol 60 compiler writers -- one
for "getting" (evaluating) and the other for "setting" the parameter
[if the parameter was a "place" in Common Lisp terms, else an error
was signalled]... IN THE CALLER'S LEXICAL ENVIRONMENT!!

The big deal was two-fold: (1) each time a formal parameter was
*referenced* in a callee, the expression for the actual parameter
in the caller had to be *(re)evaluated* in the *caller's* lexical
environment, and the value of that (re)evaluation used as the
value of the referenced formal parameter in the callee; and
(2) if a variable appeared twice (or more) in a parameter list,
say, once as a naked variable [which is a "place", note!] and again
as a sub-expression of a more complicated parameter, then setting
the formal parameter in the *callee* would *change* the value of
the actual parameter in the caller(!!), which in turn would change
the value of the *other* actual parameter in the caller the next time
it was referenced in the callee. The above-referenced "Jensen's Device"
shows how this can be used to do "very tricky stuff". A simpler and
shorter example is here:

http://www.cs.rit.edu/~afb/20013/plc/slides/procedures-07.html
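Jensen's Device, as described above, can be roughly imitated in Python by passing each by-name argument as a getter/setter pair. This is a toy model, not ALGOL semantics; the names `jensen_sum` and `Caller` are invented for illustration.

```python
def jensen_sum(get_term, set_i, lo, hi):
    """Sum of 'term' for i = lo..hi, where 'term' is re-evaluated by name."""
    total = 0
    for i in range(lo, hi + 1):
        set_i(i)             # assign through the by-name loop variable
        total += get_term()  # re-evaluate the actual-argument expression
    return total

class Caller:
    """Stands in for the caller's lexical environment holding i."""
    def __init__(self):
        self.i = 0

c = Caller()
# Actual arguments: the expression i*i (getter) and the variable i (setter).
result = jensen_sum(lambda: c.i * c.i, lambda v: setattr(c, "i", v), 1, 5)
assert result == 55   # 1 + 4 + 9 + 16 + 25
```

The key point matches the thread: assigning to the formal parameter in the callee changes the caller's variable, which changes what the *other* actual-argument expression evaluates to on its next read.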

Because the actual parameters in the callee had to be evaluated
in the *caller's* lexical environment -- and because Algol 60 was
fully recursive, allowed nested procedure definitions, and could
pass "local" procedures as arguments -- efficient implementation of
Algol 60 procedure calls almost necessitated placing the bodies
of the compiler-generated actual parameter thunks on the caller's
dynamic stack frame [or at least call instructions *to* the thunks
which could pass the current lexical contours as sub-arguments].
Knuth's nasty "man or boy test" stressed this to the limit:

http://en.wikipedia.org/wiki/Man_or_boy_test


-Rob

p.s. IIRC [but it's been a *long* time!], the ALGOL-10 compiler
for the DEC PDP-10 passed each actual parameter as the address of
a triple of words, of which the first two were executable and the
third could be used to store a variables value (simple case) or
to pass the lexical contour (more complicated case). When the
callee needed to reference (evaluate) an argument, it used the
PDP-10 XCT ("execute") instruction to execute the first word of
the block, which was required to deliver the value to a standard
register [let's say "T0", just for concreteness], and if the callee
wanted to *set* an argument, it executed the *second* word pointed
to by the passed address, with the new value also in a defined place
[again, let's use T0]. So to implement "X := X + 1;" in the callee,
the compiler would emit code like this:

MOVE T1,{the arg (address) corresponding to "X"}
XCT 0(T1) ; fetch the current value of X into T0.
ADDI T0, 1 ; increment it
XCT 1(T1) ; execute the "setter" for X.

Now in the case where the actual parameter in the caller was a
simple global variable, call it "Y", then the address passed as
the arg could be the following "YTHNK" in static data space:

YTHNK: MOVE T0,.+2 ; one-instruction "getter"
MOVEM T0,.+1 ; one-instruction "setter"
Y: BLOCK 1 ; the actual place where the value "Y" lives

Whereas if the argument being passed were some more complicated
expression, such as an array reference or a reference to a local
procedure in the caller, then the 3-word arg block would be passed
on the stack and the passed address would point to this [possibly
dynamically-constructed] triple, where PUSHJ is the PDP-10 stack-
oriented subroutine call instruction:

PUSHJ P,{lambda-lifted getter code}
PUSHJ P,{lambda-lifted setter code}
EXP {lexical contour info needed for getter/setter to work}

Efficient for the simple case; slow-but-correct for the messy case.

-----
Rob Warnock <rp...@rpw3.org>
627 26th Avenue <URL:http://rpw3.org/>
San Mateo, CA 94403 (650)572-2607

Lew

Jul 22, 2008, 9:54:06 AM
Rob Warnock wrote:
> Martin Gregorie <martin@see_sig_for_address.invalid> wrote:
> +---------------
> | John W Kennedy wrote:
> | > No, the "thunks" were necessary at the machine-language level to
> | > /implement/ ALGOL 60, but they could not be expressed /in/ ALGOL.
> |
> | Are you sure about that?
> +---------------
>
> I don't know if John is, but *I* am! ;-}

At this point we are so far off topic for clj.programmer, but still impinging
on functional programming issues with the discussion of closures, et al., that
I respectfully request that you all exclude clj.programmer from followups to
this thread.

(f-u set to comp.lang.functional)

--
Lew

Steve Schafer

Jul 22, 2008, 12:29:40 PM
On Tue, 22 Jul 2008 10:21:50 +0100, Martin Gregorie
<martin@see_sig_for_address.invalid> wrote:

>The first time I ran across the term "thunking" was when Windows 3
>introduced the Win32S shim and hence the need to switch addressing between
>16 bit and 32 bit modes across call interfaces. That was called "thunking"
>by Microsoft and even they would surely admit it was a kludge.

Win32s thunks are a completely different beast from the original Algol
60 thunks. As far as I know, the first published description of thunks
was:

Ingerman PZ (1961) Thunks: A way of compiling procedure statements with
some comments on procedure declarations, CACM 4:55-58.

Steve Schafer
Fenestra Technologies Corp.
http://www.fenestra.com/

Grant Edwards

Jul 22, 2008, 2:17:10 PM
On 2008-07-22, Steve Schafer <st...@fenestra.com> wrote:
> On Tue, 22 Jul 2008 10:21:50 +0100, Martin Gregorie
><martin@see_sig_for_address.invalid> wrote:
>
>>The first time I ran across the term "thunking" was when Windows 3
>>introduced the Win32S shim and hence the need to switch addressing between
>>16 bit and 32 bit modes across call interfaces. That was called "thunking"
>>by Microsoft and even they would surely admit it was a kludge.

What?! Microsoft took a technical term and used it to mean
something completely different than the widely used meaning?
Never.

> Win32s thunks are a completely different beast from the
> original Algol 60 thunks. As far as I know, the first
> published description of thunks was:
>
> Ingerman PZ (1961) Thunks: A way of compiling procedure statements with
> some comments on procedure declarations, CACM 4:55-58.

The Algol usage is certainly what we were taught back in the
late 70's. I wasn't even aware that Microsoft had hijacked it
to mean something else.

--
Grant Edwards grante Yow! My polyvinyl cowboy
at wallet was made in Hong
visi.com Kong by Montgomery Clift!

John W Kennedy

Jul 22, 2008, 3:54:05 PM
Martin Gregorie wrote:
> I used Algol 60 on an Elliott 503 and the ICL 1900 series back when it was
> a current language. The term "thunking" did not appear in either compiler
> manual nor in any Algol 60 language definition I've seen.

It doesn't have to; Algol 60 thunks are not part of the language.
However, practical implementation of Algol 60 call by name means that
thunks are created by every Algol 60 compiler, and the word "thunk" was
coined in 1961 to designate them.

> A60 could pass
> values by name or value and procedures by name. That was it. Call by name
> is what is now referred to as reference passing.

Either you misunderstood (because in many simple cases the semantics of
call-by-reference and call-by-name cannot be distinguished) or the
compiler you used implemented non-standard Algol (which was fairly
common in compilers meant for day-to-day practical work). Algol
call-by-name was a unique form that subsequent language designers have
recoiled from in horror.

(Historically, "call-by-name" has sometimes been used in non-Algol
contexts to mean "call-by-reference".)

> Algol 60 did not have 'functions'. It had procedures which could be
> declared to return values or not. A procedure that returned a value was
> equivalent to a function but the term 'function' was not used.

This is simply wrong. You are accurately describing the language syntax,
which used (as PL/I does) the keyword "procedure" for both functions
and subroutines, but Algol documentation nevertheless referred to
"functions".

> Similarly
> it did not have a mechanism for declaring anonymous procedures. That, like
> the incorporation of machine code inserts, would have been a
> compiler-specific extension, so it is a terminological mistake to refer to
> it without specifying the implementing compiler.

Standards-conforming Algol compilers had a limited ability to create
de-facto anonymous functions in the call-by-name implementation.

--
John W. Kennedy
"Information is light. Information, in itself, about anything, is light."
-- Tom Stoppard. "Night and Day"

John W Kennedy

Jul 22, 2008, 3:57:21 PM
Rob Warnock wrote:
> Thunks were something used by Algol 60
> *compiler writers* in the code generated by their compilers to
> implement the semantics of Algol 60 call-by-name, but were not
> visible to users at all [except that they allowed call-by-name
> to "work right"].

...unless you were a system programmer and had to write Algol-friendly
assembler.

Robert Maas, http://tinyurl.com/uh3t

Aug 12, 2008, 3:28:33 PM
John W Kennedy <jwke...@attglobal.net> wrote:
JWK> Into the 60s, indeed, there were still machines being made
JWK> that had no instruction comparable to the mainframe BASx/BALx
JWK> family, or to Intel's CALL. You had to do a subprogram call by
JWK> first overwriting the last instruction of what you were
JWK> calling with a branch instruction that would return back to
JWK> you.

That's not true, that you needed to do that, that there was no
other way available. The subroutine linkage I invented for S.P.S.
(Symbolic Programming System, i.e. IBM 1620 assembly language) was
to reserve a 5-digit space immediately before the subroutine entry
point for storing the return address. So the caller needed to know
only one address, the entry point, and do both store-return-address
and jump relative to that address, rather than needing to know both
the entry point and the last-instruction-JUMP-needs-patch address
as independent items of information. So calling a subroutine was
two instructions (pseudo-code here):
literal[nextAdrOfSelf] -> memory[SubrEntryPoint-1]
jump to SubrEntryPoint
and returning from a subroutine was two instructions:
copy memory[SubrEntryPoint-1] -> memory[here + 11]
jump to 00000 ;These zeroes replaced by return address just above
Of course if you needed to pass parameters and/or return value,
that was handled separately, perhaps by reserving additional
storage just before the return address. Of course this methodology
didn't support recursion.

So my method required one extra instruction per return point, but
allowed multiple return points from a single subroutine, and
allowed "encapsulation" of the relation between entry point and
return point.
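That linkage can be sketched in Python as a toy model. The `memory` dict and the address values are hypothetical stand-ins for 1620 storage; the point shown is that the caller needs only the entry point, and the return address lives in the word just before it.

```python
memory = {}     # address -> value; stands in for 1620 storage
SUBR = 100      # subroutine entry point (arbitrary address)

def call(return_addr):
    """Store the return address just before the entry point, then 'jump'."""
    memory[SUBR - 1] = return_addr
    return SUBR                 # new program counter: the entry point

def subroutine_return():
    """Return by 'jumping' via the saved address before the entry point."""
    return memory[SUBR - 1]

pc = call(return_addr=42)
assert pc == SUBR
assert subroutine_return() == 42
```

As the message notes, this scheme is not recursion-friendly: a nested call to the same subroutine overwrites the single saved return address.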

Note: On IBM 1620, instructions and forward-sweeping data records
were addressed by their *first* digit, whereas arithmetic fields
were addressed by their *last* digit, the low-order position, to
support natural add-and-carry operations. Storage was decimal
digits, with two extra bits, flag to indicate negative value (if in
low-order position) or high-order-end (if in any other position),
and parity. Values larger than nine were reserved for special
purposes, such as RECORD MARK used to terminate right-sweep data
records. Because of that, the low-order position of the return
address and the first digit of the machine instruction at the
subroutine entry point differed by only one machine address, hence the
SubrEntryPoint-1 instead of SubrEntryPoint-5 you would otherwise
expect.

Hmm, I suppose if I had thought it out more at the time, I might have
done it slightly differently:

Entry point like this:
jump 00000 ;Patched by caller to contain return address
Entry: ...(regular code)...
...

Each return point like this:
jump Entry-12


I wonder if anybody ever implemented a stack on the IBM 1620?
Probably not, because it would take a lot more machine instructions
to push and pop, and if you weren't writing anything recursive it was
extra work for no extra benefit, except saving a few digits of
memory if your maximum stack depth is less than the total number of
subroutines you have loaded -- and even then the extra instructions more
than kill off the storage savings.

Hmm, I suppose you could have an auxiliary function that serves as
a trampoline for stack-based call and return. To call, you move your
own return address and the address of the subroutine to fixed locations in
low memory, then jump to the call trampoline, which pushes the
return address onto the stack and jumps to the entry address. To
return, you just jump to the return trampoline, which pops the
return address off the stack and jumps to it. The trampoline,
occupying memory only *once*, could afford to have code to safely
check for stack over/under flow.
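The trampoline idea sketched above can likewise be modeled in Python. Again a toy: `STACK_LIMIT` and the address values are arbitrary, and the "jumps" are just returned program-counter values. Unlike the fixed-slot linkage, this version handles nested (and would handle recursive) calls.

```python
STACK_LIMIT = 8   # arbitrary depth limit, checked in one place only
stack = []        # the explicit return-address stack

def call_trampoline(return_addr, entry_addr):
    """Push the caller's return address, then 'jump' to the entry point."""
    if len(stack) >= STACK_LIMIT:
        raise OverflowError("return-address stack overflow")
    stack.append(return_addr)
    return entry_addr

def return_trampoline():
    """Pop the saved return address and 'jump' to it."""
    if not stack:
        raise RuntimeError("return-address stack underflow")
    return stack.pop()

pc = call_trampoline(return_addr=42, entry_addr=100)
assert pc == 100
pc = call_trampoline(return_addr=101, entry_addr=200)  # nested call works
assert pc == 200
assert return_trampoline() == 101
assert return_trampoline() == 42
```

Because the over/underflow checks live only in the trampoline, each call site stays at the two-instruction cost the message describes.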

Roedy Green

Aug 12, 2008, 4:26:16 PM
On Tue, 12 Aug 2008 12:28:33 -0700,
jaycx2.3....@spamgourmet.com.remove (Robert Maas,
http://tinyurl.com/uh3t) wrote, quoted or indirectly quoted someone
who said :

>Note: On IBM 1620, instructions and forward-sweeping data records
>were addressed by their *first* digit, whereas arithmetic fields
>were addressed by their *last* digit, the low-order position, to
>support natural add-and-carry operations. Storage was decimal
>digits, with two extra bits, flag to indicate negative value (if in
>low-order position) or high-order-end (if in any other position),
>and parity.

What a memory you have to recall the precise details of work you did
45 odd years ago.
--

Roedy Green Canadian Mind Products
The Java Glossary
http://mindprod.com

Arne Vajhøj

Aug 12, 2008, 8:40:07 PM
Robert Maas, http://tinyurl.com/uh3t wrote:
> John W Kennedy <jwke...@attglobal.net> wrote:
> JWK> Into the 60s, indeed, there were still machines being made
> JWK> that had no instruction comparable to the mainframe BASx/BALx
> JWK> family, or to Intel's CALL. You had to do a subprogram call by
> JWK> first overwriting the last instruction of what you were
> JWK> calling with a branch instruction that would return back to
> JWK> you.
>
> That's not true, that you needed to do that, that there was no
> other way available. The subroutine linkage I invented for S.P.S.
> (Symbolic Programming System, i.e. IBM 1620 assembly language) was
> to reserve a 5-digit space immediately before the subroutine entry
> point for storing the return address. So the caller needed to know
> only one address, the entry point, and do both store-return-address
> and jump relative to that address, rather than needing to know both
> the entry point and the last-instruction-JUMP-needs-patch address
> as independent items of information.

CDC Cyber did something very similar.

Not very recursion friendly.

Arne

John W Kennedy

unread,
Aug 14, 2008, 6:33:30 PM8/14/08
to
Robert Maas, http://tinyurl.com/uh3t wrote:

Actually, I was thinking of the 1401. But both the 1620 and the 1401
(without the optional Advanced Programming Feature) share the basic
omission of any instruction that could do call-and-return without
hard-coding an adcon with the address of the point to be returned to.
(The Advanced Programming Feature added a 1401 instruction, Store
B-address Register, that, executed as the first instruction of a
subroutine, could store the return-to address.)

The 1620, oddly enough, /did/ have call instructions (Branch and
Transfer, and Branch and Transfer Immediate) and a return instruction
(Branch Back), but with a hard-wired stack depth of 1.

--
John W. Kennedy
"When a man contemplates forcing his own convictions down another
man's throat, he is contemplating both an unchristian act and an act of
treason to the United States."
-- Joy Davidman, "Smoke on the Mountain"

Martijn Lievaart

unread,
Aug 16, 2008, 6:10:59 PM8/16/08
to
On Thu, 14 Aug 2008 18:33:30 -0400, John W Kennedy wrote:

> Actually, I was thinking of the 1401. But both the 1620 and the 1401
> (without the optional Advanced Programming Feature) share the basic
> omission of any instruction that could do call-and-return without
> hard-coding an adcon with the address of the point to be returned to.
> (The Advanced Programming Feature added a 1401 instruction, Store
> B-address Register, that, executed as the first instruction of a
> subroutine, could store the return-to address.)

Raaaagh!!!!

Don't. Bring. Back. Those. Nightmares. Please.

The 1401 was a decent enough processor for many industrial tasks -- at
that time -- but for general programming it was sheer horror.

M4

John W Kennedy

unread,
Aug 16, 2008, 9:46:18 PM8/16/08
to

But the easiest machine language /ever/.

--
John W. Kennedy
"The grand art mastered the thudding hammer of Thor
And the heart of our lord Taliessin determined the war."
-- Charles Williams. "Mount Badon"

Martijn Lievaart

unread,
Aug 17, 2008, 5:31:11 AM8/17/08
to
On Sat, 16 Aug 2008 21:46:18 -0400, John W Kennedy wrote:

>> The 1401 was a decent enough processor for many industrial tasks -- at
>> that time -- but for general programming it was sheer horror.
>
> But the easiest machine language /ever/.

True, very true.

M4

Martin Gregorie

unread,
Aug 17, 2008, 6:38:43 AM8/17/08
to
On Sat, 16 Aug 2008 21:46:18 -0400, John W Kennedy wrote:

> Martijn Lievaart wrote:
>> On Thu, 14 Aug 2008 18:33:30 -0400, John W Kennedy wrote:
>>
>>> Actually, I was thinking of the 1401. But both the 1620 and the 1401
>>> (without the optional Advanced Programming Feature) share the basic
>>> omission of any instruction that could do call-and-return without
>>> hard-coding an adcon with the address of the point to be returned to.
>>> (The Advanced Programming Feature added a 1401 instruction, Store
>>> B-address Register, that, executed as the first instruction of a
>>> subroutine, could store the return-to address.)
>>
>> Raaaagh!!!!
>>
>> Don't. Bring. Back. Those. Nightmares. Please.
>>
>> The 1401 was a decent enough processor for many industrial tasks -- at
>> that time -- but for general programming it was sheer horror.
>
> But the easiest machine language /ever/.

What? Even easier than ICL 1900 PLAN or MC68000 assembler? That would be
difficult to achieve.


--
martin@ | Martin Gregorie
gregorie. | Essex, UK
org |

John W Kennedy

unread,
Aug 17, 2008, 10:30:35 PM8/17/08
to

I said "machine language" and I meant it. I haven't touched a 1401 since
1966, and haven't dealt with a 1401 emulator since 1968, but I can
/still/ write a self-booting program. In 1960, some people still looked
on assemblers (to say nothing of compilers) as a useless waste of
resources that could be better applied to end-user applications, and the
1401 was designed to be programmable in raw machine language. Even shops
that used assembler nevertheless frequently did bug fixes as
machine-language patches, rather than take the time to run the assembler
again. (SPS, the non-macro basic assembler, ran at about 70 lines a
minute, tops.)

--
John W. Kennedy
"The bright critics assembled in this volume will doubtless show, in
their sophisticated and ingenious new ways, that, just as /Pooh/ is
suffused with humanism, our humanism itself, at this late date, has
become full of /Pooh./"
-- Frederick Crews. "Postmodern Pooh", Preface

Martin Gregorie

unread,
Aug 18, 2008, 5:33:44 AM8/18/08
to
On Sun, 17 Aug 2008 22:30:35 -0400, John W Kennedy wrote:

> I said "machine language" and I meant it.
>

OK - I haven't touched that since typing ALTER commands into the console
of a 1903 running the UDAS executive or, even better, patching the
executive on the hand switches.

I was fascinated, though by the designs of early assemblers: I first
learnt Elliott assembler, which required the op codes to be typed in
octal but used symbolic labels and variable names. Meanwhile a colleague
had started on a KDF6 which was the opposite - op codes were mnemonics
but all addresses were absolute and entered in octal. I always wondered
about the rationale of the KDF6 assembler writers in tackling only the
easy part of the job.

> Even shops that used assembler nevertheless frequently did bug fixes as
> machine-language patches, rather than take the time to run the assembler
> again. (SPS, the non-macro basic assembler, ran at about 70 lines a
> minute, tops.)
>

Even a steam powered 1901 (3.6 uS for a half-word add IIRC) running a
tape based assembler was faster than that. It could just about keep up
with a 300 cpm card reader.

Rob Warnock

unread,
Aug 20, 2008, 9:47:50 PM8/20/08
to
John W Kennedy <jwk...@attglobal.net> wrote:
+---------------

| I said "machine language" and I meant it. I haven't touched a 1401 since
| 1966, and haven't dealt with a 1401 emulator since 1968, but I can
| /still/ write a self-booting program.
+---------------

Heh! I never dealt with a 1401 per se [except when running a 1410
in 1401 emulation mode to run the Autoplotter program, which wasn't
available for the 1410], but I still remember the IBM 1410 bootstrap
instructions you had to type in on the console to boot from magtape.

v v
L%B000012$N

where the "v" accent is the "wordmark" indicator.

That says to read in a whole tape record in "load" mode (meaning
that wordmarks & groupmarks in memory are overwritten), synchronously
(stop & wait), from tape drive 0, starting at memory location
decimal 12, which, since the 1410 used *1*-based addressing,
was the location just after the no-op at location 11 above.

[Note that these are actual *machine* instructions, *not* "assembler"!!
Like the 1401, the 1410 was a *character* machine, not an 8-bit-byte
binary machine. The bits in a character were named 1, 2, 4, 8, A, B,
and W (wordmark). Oh, and C, but that was character parity -- the
programmer couldn't set that separately.]

What was the corresponding 1401 boot sequence?

Oh, for the record, IMHO the DEC PDP-8 had a *much* simpler machine
language and assembler than the IBM 1401/1410. ;-}


-Rob

Rob Warnock

unread,
Aug 20, 2008, 10:18:22 PM8/20/08
to
Martin Gregorie <mar...@see.sig.for.address.invalid> wrote:
+---------------

| I was fascinated, though by the designs of early assemblers: I first
| learnt Elliott assembler, which required the op codes to be typed in
| octal but used symbolic labels and variable names. Meanwhile a colleague
| had started on a KDF6 which was the opposite - op codes were mnemonics
| but all addresses were absolute and entered in octal. I always wondered
| about the rationale of the KDF6 assembler writers in tackling only the
| easy part of the job.
+---------------

In the LGP-30, they used hex addresses, sort of[1], but the opcodes
(all 16 of them) had single-letter mnemonics chosen so that the
low 4 bits of the character codes *were* the correct nibble for
the opcode! ;-}

[Or you could type in the actual hex digits, since the low 4 bits
of *their* character codes were also their corresponding binary
nibble values... "but that would have been wrong".]


-Rob

[1] The LGP-30 character code was defined before the industry had
yet standardized on a common "hex" character set, so instead of
"0123456789abcdef" they used "0123456789fgjkqw". [The "fgjkqw"
were some random characters on the Flexowriter keyboard whose low
4 bits just happened to be what we now call 0xa-0xf]. Even worse,
the sector addresses of instructions were *not* right-justified
in the machine word (off by one bit), plus because of the shift-
register nature of the accumulator you lost the low bit of each
machine word when you typed in instructions (or read them from
tape), so the address values you used in coding went up by *4*!
That is, machine locations were counted [*and* coded, in both
absolute machine code & assembler] as "0", "4", "8", "j", "10",
"14", "18", "1j" (pronounced "J-teen"!!), etc.

s...@netherlands.com

unread,
Aug 20, 2008, 10:30:27 PM8/20/08
to


What's interesting about all this hullabaloo is that nobody here has
coded machine code, and nobody knows squat about it.

I'm not talking assembly language. Don't you know that there are routines
that program machine code? Yes, burned in, bitwise encodings that enable
machine instructions? Nothing below that.

There is nobody here, who ever visited/replied, with any thought relevance
that can be brought forward to any degree, meaning anything, nobody....

sln

s...@netherlands.com

unread,
Aug 20, 2008, 10:36:39 PM8/20/08
to

At most, you're trying to validate your understanding. But you don't pose questions,
you pose terse inflammatory declarations.

You make me sick!

sln

Andrew Reilly

unread,
Aug 21, 2008, 2:02:38 AM8/21/08
to
On Thu, 21 Aug 2008 02:36:39 +0000, sln wrote:

>>What's interesting about all this hullabaloo is that nobody here has
>>coded machine code, and nobody knows squat about it.
>>
>>I'm not talking assembly language. Don't you know that there are
>>routines that program machine code? Yes, burned in, bitwise encodings
>>that enable machine instructions? Nothing below that.
>>
>>There is nobody here, who ever visited/replied, with any thought
>>relevance that can be brought forward to any degree, meaning anything,
>>nobody....
>>
>>sln
>
> At most, you're trying to validate your understanding. But you don't pose
> questions, you pose terse inflammatory declarations.
>
> You make me sick!

Could you elaborate a little on what it is that you're upset about? I
suspect that there are probably quite a few readers of these posts that
have designed and built their own processors, and coded them in their own
machine language. I have, and that was before FPGAs started to make that
exercise quite commonplace. But I don't see how that's at all relevant
to the debate about the power or other characteristics of programming
languages. Certainly anyone who's programmed a machine in assembly
language has a pretty fair understanding of what the machine and the
machine language is doing, even though they don't choose to bang the bits
together manually.

Hope you get better.

--
Andrew

Piet van Oostrum

unread,
Aug 21, 2008, 4:56:50 AM8/21/08
to
>>>>> Arne Vajhøj <ar...@vajhoej.dk> (AV) wrote:

>AV> CDC Cyber did something very similar.

>AV> Not very recursion friendly.

Actually, the CYBER way wasn't too bad. IIRC the CYBER had a subroutine
instruction that stored the return address in the location that the
instruction referenced and then jumped to the address following that
location. To implement a recursive procedure you started the code of the
procedure with saving the return address to a stack.
--
Piet van Oostrum <pi...@cs.uu.nl>
URL: http://pietvanoostrum.com [PGP 8DAE142BE17999C4]
Private email: pi...@vanoostrum.org
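A toy C model of that linkage (invented names; the integer "addresses" just stand in for core locations) shows why the save-to-a-stack prologue makes the routine re-entrant:

```c
#include <assert.h>

/* Model of the store-return-address-in-memory linkage: the "call"
 * instruction writes the caller's return address into the word before
 * the entry point, then enters the routine. */

static int link_word;       /* the word just before the entry point */
static int ret_stack[64];   /* software stack kept by the routine   */
static int sp;

static int fact(int n);

/* The hardware-style subroutine jump. */
static int call_fact(int return_addr, int n)
{
    link_word = return_addr;
    return fact(n);
}

static int fact(int n)
{
    /* Prologue from the post: immediately save the linkage word to a
     * stack, so a recursive call overwriting link_word does no harm. */
    ret_stack[sp++] = link_word;

    int r = (n <= 1) ? 1 : n * call_fact(200, n - 1); /* 200 = token for
                                                         this call site */

    /* Epilogue: restore the caller's address before "branching back". */
    link_word = ret_stack[--sp];
    return r;
}
```

call_fact(100, 5) yields 120, and afterwards link_word holds 100 again; without the push/pop around the body, each recursive call would clobber the single linkage word and every level would "return" to the innermost call site.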

Rob Warnock

unread,
Aug 21, 2008, 10:11:48 AM8/21/08
to
<s...@netherlands.com> wrote:
+---------------

| rp...@rpw3.org (Rob Warnock) wrote:
| >In the LGP-30, they used hex addresses, sort of[1], but the opcodes
| >(all 16 of them) had single-letter mnemonics chosen so that the
| >low 4 bits of the character codes *were* the correct nibble for
| >the opcode! ;-}
...

| >[1] The LGP-30 character code was defined before the industry had
| > yet standardized on a common "hex" character set, so instead of
| > "0123456789abcdef" they used "0123456789fgjkqw". [The "fgjkqw"
| > were some random characters on the Flexowriter keyboard whose low
| > 4 bits just happened to be what we now call 0xa-0xf]. Even worse,
| > the sector addresses of instructions were *not* right-justified
| > in the machine word (off by one bit), plus because of the shift-
| > register nature of the accumulator you lost the low bit of each
| > machine word when you typed in instructions (or read them from
| > tape), so the address values you used in coding went up by *4*!
| > That is, machine locations were counted [*and* coded, in both
| > absolute machine code & assembler] as "0", "4", "8", "j", "10",
| > "14", "18", "1j" (pronounced "J-teen"!!), etc.
|
| What's interesting about all this hullabaloo is that nobody here has
| coded machine code, and nobody knows squat about it.
+---------------

Think again! *BOTH* of the two examples I gave -- for the LGP-30 & the
IBM 1410 -- *WERE* raw machine code, *NOT* assembler!!! Please read again
what I wrote about the character codes for the instruction mnemonics
*BEING* the machine instruction codes. For the IBM 1410, the bootstrap
code that one types in:

v v
L%B000012$N

*IS* raw machine code, *NOT* assembler!! [See my previous post for
the meaning of the fields of that instruction.] The IBM 1410 is a
*character* machine, not a binary machine, so each memory location is
a (6+1 bit) character. As a result, one can [and *must*, during bootup!]
type absolute machine code directly into memory using the console
typewriter [an only-slightly-modified IBM Selectric, as it happens].
The fact that -- for the sake of user convenience -- the hardware
supports manual entry of machine code into memory by the operator
in a form that closely *resembles* assembler (but *ISN'T*!) was a
deliberate design feature.

Similarly, the characters you type for the LGP-30's opcodes *ARE* the
opcodes, at least by the time they get into the machine, since when
typing in code (or reading it from paper tape) one sets the I/O system
to "4-bit input mode", in which the upper bits of the characters are
stripped off by the I/O hardware during input. Thus when you type a
"Bring" (load) instruction such as this ["Bring" (load) location 0xd7c
(track 0xd, sector 0x31)]:

b0q7j

you're typing *RAW* machine code, *NOT* assembler!! You see, the
lower 4 bits of character "b" -- the "mnemonic" for "Bring" -- were
binary 0001, the *same* as the lower 4 bits of the digit "1" (and two
other characters as well). So when you typed a "b" in that position
in "4-bit input mode" you were typing the same thing as the character "1"
which was the same thing as *binary* 0001 which was the absolute machine
opcode for the ""Bring" instruction!! "No assembler required" (pardon
the pun).

+---------------


| I'm not talking assembly language.

+---------------

Neither was I. The fact that for the two machines I mentioned
absolute machine code was somewhat readable to humans seems to
have confused you, but that's the way some people liked to design
their hardware back in those days -- with clever punning of character
codes with absolute machine opcodes (for the convenience of the user).

+---------------


| Don't you know that there are routines that program machine code?

+---------------

What do you mean "routines"?!? For the above two machines, you can
enter machine code with *no* programs ("routines?") running; that is,
no boot ROM is required (or even available, as it happened).

+---------------


| Yes, burned in, bitwise encodings that enable machine instructions?

+---------------

The two machines mentioned above did not *HAVE* a "boot ROM"!!
You had to *manually* type raw, absolute machine code into
them by hand every time you wanted to boot them up [at least
a little bit, just enough to load a larger bootstrap program].

+---------------
| Nothing below that.
+---------------

You're assuming that all machines *have* some sort of "boot ROM".
Before the microprocessor days, that was certainly not always
the case. The "boot ROM", or other methods of booting a machine
without manually entering at least a small amount of "shoelace"
code [enough to *load* the real bootstrap], was a fairly late
invention.


-Rob

p.s. Similarly, the DEC PDP-8 & PDP-11 were also originally booted
by manually toggling the console switches in order to deposit a few
instructions into memory, and then the starting address was toggled
in and "Start" was pushed. It was only later that a boot ROM became
available for the PDP-11 (as an expensive option!) - and only much
later still for the PDP-8 series (e.g., the MI8E for the PDP-8/E).

Martin Gregorie

unread,
Aug 21, 2008, 5:22:09 PM8/21/08
to
On Thu, 21 Aug 2008 09:11:48 -0500, Rob Warnock wrote:

> You're assuming that all machines *have* some sort of "boot ROM". Before
> the microprocessor days, that was certainly not always the case. The
> "boot ROM", or other methods of booting a machine without manually
> entering at least a small amount of "shoelace" code [enough to *load*
> the real bootstrap], was a fairly late invention.
>

Quite. I never knew how to boot the Elliott 503 (never got closer to the
console than the other side of a plate glass window). However, I dealt
with that aspect of ICL 1900s. They had ferrite core memory and NO ROM.
When you hit Start this cleared the memory and then pulsed a wire that
wrote the bootstrap into memory and executed it. The wire wove through
the cores and wrote 1 bits to the the right places to:
- set word 8 (the PC) to 20
- set 25 words from 20 as bootstrap instructions to boot from disk.

Then it started the CPU running.

On the 1902 this sequence often didn't work, so a good operator knew the
25 words by heart and would toggle them in on hand switches, set PC to 20
and hit the GO switch.


>
> -Rob
>
> p.s. Similarly, the DEC PDP-8 & PDP-11 were also originally booted by
> manually toggling the console switches in order to deposit a few
> instructions into memory, and then the starting address was toggled in
> and "Start" was pushed. It was only later that a boot ROM became
> available for the PDP-11 (as an expensive option!) - and only much later
> still for the PDP-8 series (e.g., the MI8E for the PDP-8/E).
>
> -----
> Rob Warnock <rp...@rpw3.org>
> 627 26th Avenue <URL:http://rpw3.org/>
> San Mateo, CA 94403
> (650)572-2607

--

Arne Vajhøj

unread,
Aug 21, 2008, 7:27:24 PM8/21/08
to

It was of course doable.

Else Pascal would have been hard to implement.

Arne

George Neuner

unread,
Aug 22, 2008, 11:11:09 AM8/22/08
to
On Thu, 21 Aug 2008 02:30:27 GMT, s...@netherlands.com wrote:

A friend of mine had an early 8080 micro that was programmed through
the front panel using knife switches ... toggle in the binary address,
latch it, toggle in the binary data, latch it, repeat ad nauseam. It
had no storage device initially ... to use it you had to input your
program by hand every time you turned it on.

I did a little bit of programming on it, but I tired of it quickly.
As did my friend - once he got the tape storage working (a new prom)
and got hold of a shareware assembler, we hardly ever touched the
switch panel again. Then came CP/M and all the initial pain was
forgotten (replaced by CP/M pain 8-).


>I'm not talking assembly language. Don't you know that there are routines
>that program machine code? Yes, burned in, bitwise encodings that enable
>machine instructions? Nothing below that.
>
>There is nobody here, who ever visited/replied, with any thought relevance
>that can be brought forward to any degree, meaning anything, nobody....

What are you looking for? An emulator you can play with?

Machine coding is not relevant anymore - it's completely infeasible to
input all but the smallest program. My friend had a BASIC interpreter
for his 8080 - about 2KB which took hours to input by hand and heaven
help you if you screwed up or the computer crashed.

>sln

George

s...@netherlands.com

unread,
Aug 22, 2008, 6:56:09 PM8/22/08
to

[snip]

I don't see the distinction.
Just disassemble it and find out.

>
>you're typing *RAW* machine code, *NOT* assembler!! You see, the
>lower 4 bits of character "b" -- the "mnemonic" for "Bring" -- were
>binary 0001, the *same* as the lower 4 bits of the digit "1" (and two
>other characters as well). So when you typed a "b" in that position
>in "4-bit input mode" you were typing the same thing as the character "1"
>which was the same thing as *binary* 0001 which was the absolute machine
>opcode for the ""Bring" instruction!! "No assembler required" (pardon
>the pun).
>
>+---------------
>| I'm not talking assembly language.
>+---------------
>
>Neither was I. The fact that for the two machines I mentioned
>absolute machine code was somewhat readable to humans seems to
>have confused you, but that's the way some people liked to design
>their hardware back in those days -- with clever punning of character
>codes with absolute machine opcodes (for the convenience of the user).
>
>+---------------
>| Don't you know that there are routines that program machine code?
>+---------------
>
>What do you mean "routines"?!? For the above two machines, you can
>enter machine code with *no* programs ("routines?") running; that is,
>no boot ROM is required (or even available, as it happened).
>

[snip all the bullshit]

Each op is a routine in microcode.
That is machine code. Those op routines use machine cycles.

You hit the wall of understanding a long time ago, a wall you
never looked past.

sln

s...@netherlands.com

unread,
Aug 22, 2008, 7:14:10 PM8/22/08
to

I'm not looking for anything, didn't you know?

I guess I wasn't talking assembly language, I was talking machine language.
The language behind the op codes, the one that takes cycles per 'op'eration.

It is entirely disgusting to listen to a litany of old-time has-beens who
did or did not work on old-time CPUs that did or did not have an accumulator.

I worked with many CPUs at the assembly level; hell, my first endeavour was
writing disassemblers, like Zilog's, and pass-throughs to gain a whole
different instruction set.

I moved on, but before I did, I fully understood everything I touched.
I've programmed over 14 million lines of code in my lifetime, but who gives
a rat's ass.

CPUs aren't complicated, far from it. It's having to eat the dubious
constructs of higher-level languages built on those platforms.

Get your head out of the ground!


sln

Martin Gregorie

unread,
Aug 22, 2008, 7:23:57 PM8/22/08
to
On Fri, 22 Aug 2008 22:56:09 +0000, sln wrote:

> On Thu, 21 Aug 2008 09:11:48 -0500, rp...@rpw3.org (Rob Warnock) wrote:
>
>>s...@netherlands.com> wrote:
>>*IS* raw machine code, *NOT* assembler!!
> [snip]
>
> I don't see the distinction.
> Just dissasemble it and find out.
>

There's a 1:1 relationship between machine code and assembler.
Unless it's a macro-assembler, of course!



>
> Each op is a routine in microcode.
> That is machine code. Those op routines use machine cycles.
>

Not necessarily. An awful lot of CPU cycles were used before microcode
was introduced. Mainframes and minis designed before about 1970 didn't
use or need it and I'm pretty sure that there was no microcode in the
original 8/16 bit microprocessors either (6800, 6809, 6502, 8080, 8086,
Z80 and friends).

The number of clock cycles per instruction isn't a guide either. The only
processors I know that got close to 1 cycle/instruction were all RISC,
all used large lumps of microcode and were heavily pipelined.

By contrast the ICL 1900 series (3rd generation mainframe, no microcode,
no pipeline, 24 bit word) averaged 3 clock cycles per instruction.
Motorola 6800 and 6809 (no microcode or pipelines either, 1 byte fetch)
average 4 - 5 cycles/instruction.

s...@netherlands.com

unread,
Aug 22, 2008, 7:36:07 PM8/22/08
to

Surely you have caved to intelligence. And there is nothing beyond op.
What has the friggin world come to!!!


sln

Paul Wallich

unread,
Aug 22, 2008, 9:52:27 PM8/22/08
to

One problem with this discussion is that the term "microcode" isn't
really well-defined. There's the vertical kind, the horizontal kind,
with and without internal control-flow constructs, and then there are
various levels of visibility to the user -- see e.g. the pdp-8 manual,
where "microcoding" is used to mean piling the bits for a bunch of
instructions together in the same memory location, which works fine as
long as the instructions in question don't use conflicting sets of bits.

paul

Arne Vajhøj

unread,
Aug 22, 2008, 10:15:28 PM8/22/08
to

I thought microcode was relatively well defined as being the software
used to implement instructions that were not fully implemented in
hardware.

http://en.wikipedia.org/wiki/Microcode does not make me think otherwise.

Arne

John W Kennedy

unread,
Aug 23, 2008, 12:06:28 AM8/23/08
to
Martin Gregorie wrote:
> Not necessarily. An awful lot of CPU cycles were used before microcode
> was introduced. Mainframes and minis designed before about 1970 didn't
> use or need it

No, most S/360s used microcode.

--
John W. Kennedy
"There are those who argue that everything breaks even in this old
dump of a world of ours. I suppose these ginks who argue that way hold
that because the rich man gets ice in the summer and the poor man gets
it in the winter things are breaking even for both. Maybe so, but I'll
swear I can't see it that way."
-- The last words of Bat Masterson

John W Kennedy

unread,
Aug 23, 2008, 12:18:29 AM8/23/08
to
Rob Warnock wrote:
> What was the corresponding 1401 boot sequence?

The 1401 had a boot-from-tape-1 button on the console, and a
boot-from-card button on the card reader. You couldn't truly boot from a
disk; you loaded a little starter deck of about 20 cards on the card reader.

On the 1401, the typewriter was an optional luxury, mainly used in long
batch jobs to do ad-hoc on-line queries. On the compatible 1460, the
typewriter was somewhat more common, because the console the typewriter
mounted on was a standard part of the system, so only the typewriter had
to be added.

--
John W. Kennedy
"You can, if you wish, class all science-fiction together; but it is
about as perceptive as classing the works of Ballantyne, Conrad and W.
W. Jacobs together as the 'sea-story' and then criticizing _that_."
-- C. S. Lewis. "An Experiment in Criticism"

Martin Gregorie

unread,
Aug 23, 2008, 6:54:25 AM8/23/08
to
On Sat, 23 Aug 2008 00:06:28 -0400, John W Kennedy wrote:

> Martin Gregorie wrote:
>> Not necessarily. An awful lot of CPU cycles were used before microcode
>> was introduced. Mainframes and minis designed before about 1970 didn't
>> use or need it
>
> No, most S/360s used microcode.

I never used an S/360.

I thought microcode came into the IBM world with S/370 and Future Series
(which later reappeared as the AS/400, which I did use). Didn't the S/370
load its microcode off an 8 inch floppy?

John W Kennedy

unread,
Aug 23, 2008, 9:22:05 PM8/23/08
to
Martin Gregorie wrote:
> On Sat, 23 Aug 2008 00:06:28 -0400, John W Kennedy wrote:
>
>> Martin Gregorie wrote:
>>> Not necessarily. An awful lot of CPU cycles were used before microcode
>>> was introduced. Mainframes and minis designed before about 1970 didn't
>>> use or need it
>> No, most S/360s used microcode.
>
> I never used an S/360.
>
> I thought microcode came into the IBM world with S/370 and Future Series
> (which later reappeared as the AS/400, which I did use). Didn't the S/370
> load its microcode off an 8 inch floppy?

Some did, but not all. The 370/145 was the first, and made a big splash
thereby.

As to the 360s:

20 (Incompatible subset) I don't know
22 (Recycled end-of-life 30) CROS
25 Loaded from punched cards
30 CROS
40 TROS
44 (Subset) None
50 CROS
60, 62, 65 ROS
64, 66, 67 ROS
70, 75 None
85 I don't know
91, 95 I don't know -- probably none
195 I don't know

CROS used plastic-coated foil punched cards as the dielectrics of 960
capacitors each.

TROS used little transformer coils that might or might not be severed.

ROS means it was there, but I don't know the technology.
--
John W. Kennedy
"Those in the seat of power oft forget their failings and seek only
the obeisance of others! Thus is bad government born! Hold in your
heart that you and the people are one, human beings all, and good
government shall arise of its own accord! Such is the path of virtue!"
-- Kazuo Koike. "Lone Wolf and Cub: Thirteen Strings" (tr. Dana Lewis)

Martin Gregorie

unread,
Aug 24, 2008, 8:28:01 AM8/24/08
to
On Sat, 23 Aug 2008 21:22:05 -0400, John W Kennedy wrote:

> Martin Gregorie wrote:
>> On Sat, 23 Aug 2008 00:06:28 -0400, John W Kennedy wrote:
>>
>>> Martin Gregorie wrote:
>>>> Not necessarily. An awful lot of CPU cycles were used before
>>>> microcode was introduced. Mainframes and minis designed before about
>>>> 1970 didn't use or need it
>>> No, most S/360s used microcode.
>>
>> I never used an S/360.
>>
>> I thought microcode came into the IBM world with S/370 and Future
>> Series (which later reappeared as the AS/400, which I did use). Didn't
>> the S/370 load its microcode off an 8 inch floppy?
>
> Some did, but not all. The 370/145 was the first, and made a big splash
> thereby.
>

..snip...

Thanks for that. As I said, during most of that era I was using ICL kit.
Microcode was never mentioned in the 1900 context. However, they had a
very rough approximation called extracodes, though they were closer to
software traps than microcode: if the hardware didn't implement an op code
the OS intercepted it and ran equivalent code. This was used for i/o
operations and for FP instructions on boxes that didn't have FP hardware.
As a result all boxes executed the same instruction set. Some opcodes
might be very slow on some hardware but they would execute.

The 2900 series had huge amounts of microcode - it even defined both
memory mapping and opcodes. You could run 1900 code (24 bit words, fixed
length instructions, ISO character codes) simultaneously with 'native'
code (8 bit bytes, v/l instructions, EBCDIC), with each program running
under its usual OS (George 3 for 1900 code, VME/B for native code).

The only other systems I'm aware of that could do this were the big
Burroughs boxes (6700?), which used a byte-based VM for COBOL and a
word-based VM for FORTRAN and Algol 60, and the IBM AS/400 (OS/400 could
run S/34 code alongside S/38 and AS/400 code). AFAICT Intel virtualisation
doesn't do this - all code running under VMware or any of the other VMs is
still running in a standard Intel environment.
