
ISO/IEC CD 13816 -- ISLisp


Erik Naggum

Nov 26, 1995
I haven't seen this mentioned earlier, so I thought it might be of interest
to the Lisp community.

the second CD (committee draft) ballot for CD 13816, full title
"Programming Languages, their environments and system software interfaces
-- Programming Language Lisp", is now out for vote to ISO member bodies,
ending 1996-02-12.

some time ago, I found an FTP site with some earlier drafts, but I have
misplaced the address. anybody know?

from the Introduction:

"The programming language ISLisp is a member of the Lisp family. It is the
result of standardization efforts by the committee ISO/IEC JTC 1/SC 22/WG
16.

"The following factors influenced the establishment of design goals for
ISLisp:

"1. A desire of the international Lisp community to standardize on those
features of Lisp upon which there is widespread agreement.

"2. The existence of the incompatible dialects Common Lisp, EuLisp, Le-Lisp
and Scheme (mentioned in alphabetical order).

"3. A desire to affirm Lisp as an industrial language.

"This led to the following design goals for ISLisp:

"1. ISLisp shall be compatible with existing Lisp dialects where feasible.

"2. ISLisp shall have as a primary goal to provide basic functionality.

"3. ISLisp shall be object-oriented.

"4. ISLisp shall be designed with extensibility in mind.

"5. ISLisp shall give priority to industrial needs over academic needs.

"6. ISLisp shall promote efficient implementations and applications."


on the first item 2: there is no mention of any other standardization
activity in this document, despite the existence of ANSI Common Lisp and
IEEE Scheme. treating published standards as "incompatible dialects" is a
rather heavy-handed propaganda move in my eyes. on item 5, I get a feeling
this language is the victim of yet another silly "industry vs academia"
war, and that it is yet another propaganda move.

I already know how politicized ISO can get -- I have been working with ISO
and standardization for five years, now -- but this seems unnecessary on
the face of it. I haven't read the entire draft carefully, but some of the
things here are clearly designed to break every existing Lisp program in
existence, sort of spiting the Lisp community. the second point 1 is
merely paying lip service to the standards community.

the following are special operators (however, many more are labeled
"special operators" in the document)

assure -- `the' with error semantics
catch
class
convert -- `coerce'
dynamic -- reference to dynamic variables
dynamic-let -- binding of dynamic variables
flet
function
go
if
labels
lambda
let
progn
quote
return-from
setf
tagbody
the
throw
unwind-protect
while

these are the defining operators:

defclass
defconstant
defdynamic -- `defvar', establishment of dynamic variable
defgeneric
defglobal -- `defvar', establishment of non-dynamic variable
defmacro
defmethod
defun

among the more important similarities to Common Lisp:

- lexical scope
- variable and function namespaces
- first element of list not evaluated (unless lambda expression)
- nil and () identical
- arbitrary integer precision

among the more important differences from Common Lisp:

- specialized access to dynamic variables
- apparently, dynamic variables are different from lexical variables
- lambda-lists have only &rest
- no keywords for user functions
- no packages
- no list destructuring in macros
- no documentation
- no reader macros
- <, =, >, etc, are binary, not n-ary
- numbers are only floating point or integer
- no concept of fixnums
- only one floating point precision
- / is not defined, but `quotient' and `reciprocal' are
- lobotomized file I/O and handling
- no `open', but specialized functions for output, input, io
- no equivalent to `read-sequence', `write-sequence'
- lobotomized `format', no `print' or `write'

#<Erik 3026390951>
--
"We have built a lot of security directly into Java to make it virus-proof.
And people's concerns about security on the Net tend to be based on age. You
talk to people in their twenties and they are much less concerned about it than
older generations. Pretty soon it won't worry them at all." -- Scott McNealy

Scott Fahlman

Nov 26, 1995

In article <19951126...@naggum.no> Erik Naggum <er...@naggum.no> writes:

> "2. The existence of the incompatible dialects Common Lisp, EuLisp, Le-Lisp
> and Scheme (mentioned in alphabetical order).
>
> on the first item 2: there is no mention of any other standardization
> activity in this document, despite the existence of ANSI Common Lisp and
> IEEE Scheme. treating published standards as "incompatible dialects" is a
> rather heavy-handed propaganda move in my eyes. on item 5, I get a feeling
> this language is the victim of yet another silly "industry vs academia"
> war, and that it is yet another propaganda move.

No, I think that this is a silly Europe vs. US/Japan war. Le-Lisp and
EuLisp each have some admirable features, but to mention them as
standards on a par with CL and Scheme is just silly. Maybe it seems
less silly if you sit in France.

In terms of language design, it looks like ISO Lisp is taking some
minor (but not really important) clean-up steps in the right
direction, but in so doing it is creating yet another incompatible
dialect in a market that can barely sustain CL and Scheme. This is
not what a standards organization should be doing.

This ISO standard will be ignored, of course, but it may cause some
trouble for European organizations that are required to adhere to ISO
standards, however silly.

Well, this is no longer my problem. I'm off trying to establish the
next great language (Dylan), and I hope we can avoid official
standardization of Dylan for at least 20 or 30 years. We should have
it right by then. :-)

-- Scott


Barry Margolin

Nov 26, 1995
In article <19951126...@naggum.no>, Erik Naggum <er...@naggum.no> wrote:
>"6. ISLisp shall promote efficient implementations and applications."
>
>
>on the first item 2: there is no mention of any other standardization
>activity in this document, despite the existence of ANSI Common Lisp and
>IEEE Scheme. treating published standards as "incompatible dialects" is a
>rather heavy-handed propaganda move in my eyes. on item 5, I get a feeling
>this language is the victim of yet another silly "industry vs academia"
>war, and that it is yet another propaganda move.

Basically, criterion 6 (quoted above) was a major reason for the
differences between ISLisp and Common Lisp. The feeling among many of the
ISO Lisp committee members was that CL was too big and complex to be able
to be implemented efficiently (several existing implementations
notwithstanding). In particular, they were concerned about implementations
that could fit easily on PC's. What they wanted to do was define a kernel
Lisp that provided all the basic functionality needed, and upon which the
full functionality of something like CL could be implemented using
libraries.

It's interesting to note in retrospect that some of the same criteria
resulted in the development of Dylan. However, Dylan explicitly severed
the relationship to the Lisp community by leaving "Lisp" out of the
language's name, and later adopting a non-Lisp-like syntax.
--
Barry Margolin
BBN PlaNET Corporation, Cambridge, MA
bar...@bbnplanet.com
Phone (617) 873-3126 - Fax (617) 873-6351

Pierpaolo Bernardi

Nov 28, 1995
Erik Naggum (er...@naggum.no) wrote:
...
: the second CD (committee draft) ballot for CD 13816, full title

: "Programming Languages, their environments and system software interfaces
: -- Programming Language Lisp", is now out for vote to ISO member bodies,
: ending 1996-02-12.

: some time ago, I found an FTP site with some earlier drafts, but I have
: misplaced the address. anybody know?

ma2s2.mathematik.uni-karlsruhe.de

Bruno Haible

Nov 28, 1995
Scott Fahlman <s...@CS.CMU.EDU> wrote:
>
> No, I think that this is a silly Europe vs. US/Japan war.

It's more like US vs. Europe/Japan.

> This ISO standard will be ignored, of course,

It is not being ignored. At least two commercial Lisp implementations
I know of (ILOG Talk being one of them) implement most of ISLisp.

> Well, this is no longer my problem. I'm off trying to establish the

> next great language (Dylan),

Dylan requires a much more intelligent compiler than ISLisp does. Who
in the world (except Harlequin) has the human and financial resources
to develop a Dylan compiler and development environment? ISLisp is
simpler, so you don't have to invest that much.


----------------------------------------------------------------------------
Bruno Haible net: <hai...@ilog.fr>
ILOG S.A. tel: +33 1 4908 3585
9, rue de Verdun - BP 85 fax: +33 1 4908 3510
94253 Gentilly Cedex url: http://www.ilog.fr/
France url: http://www.ilog.com/

Bruno Haible

Nov 28, 1995
Erik Naggum <er...@naggum.no> wrote:
> the second CD (committee draft) ballot for CD 13816, full title
> "Programming Languages, their environments and system software interfaces
> -- Programming Language Lisp", is now out for vote to ISO member bodies,
> ending 1996-02-12.

Thank you, Erik, for drawing attention to the nearly finished ISO Lisp
standard.

> some time ago, I found an FTP site with some earlier drafts, but I have
> misplaced the address. anybody know?

Kent Pitman's FTP server is ftp.harlequin.co.uk:/pub/kmp/iso/,
and mine is ma2s2.mathematik.uni-karlsruhe.de:/pub/lisp/islisp/.

> there is no mention of any other standardization
> activity in this document, despite the existence of ANSI Common Lisp and
> IEEE Scheme.

At the time the design goals were formulated (1991), Common Lisp was still
miles away from being standardized.

> treating published standards as "incompatible dialects" is a
> rather heavy-handed propaganda move in my eyes.

The ISLisp standardization did not ignore Common Lisp in any way. In fact,
minimizing the incompatibilities with CL was a major (although unwritten)
design goal of ISLisp. The object system, for example, is a slimmed-down
version of CLOS.
IEEE Scheme was not ignored either, for example the explicit distinction
between lexical and dynamic (fluid) variables comes from Scheme.
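
Roughly, the dynamic variable forms of the draft look like this (a made-up
example, using the operators defdynamic, dynamic and dynamic-let from
Erik's summary):

  (defdynamic *depth* 0)                ; establish a dynamic variable

  (defun current-depth ()
    (dynamic *depth*))                  ; explicit access to its dynamic value

  (dynamic-let ((*depth* 4))            ; dynamic rebinding, visible in callees
    (current-depth))                    ; => 4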

> on item 5, I get a feeling
> this language is the victim of yet another silly "industry vs academia"
> war, and that it is yet another propaganda move.

Not at all. On the contrary, industry and universities have closely
cooperated to produce the ISLisp standard draft: Japanese universities
took part as well as French companies, the university-based EuLisp
project as well as German companies.


You have made a very valuable comparison between Common Lisp and ISLisp.
Let me just explain the reasons for some of the differences you noticed.

> the following are special operators (however, many more are labeled
> "special operators" in the document)

In ISLisp, there is no distinction between predefined special forms
and predefined macros. They all behave the same way: they have special
evaluation rules, and you can't redefine them. So where's the
difference?

> among the more important differences from Common Lisp:
>
> - specialized access to dynamic variables
> - apparently, dynamic variables are different from lexical variables

This cleans up one of the major pitfalls in Common Lisp: In CL, executing
some '(declaim (special x))' will completely screw up the files you
compile afterwards.
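
To spell the pitfall out (a small made-up Common Lisp example):

  (declaim (special x))        ; someone, somewhere, proclaims X special

  ;; every file compiled afterwards silently changes meaning: X below is
  ;; now bound dynamically, so the closure no longer captures it lexically
  (defun make-adder (x)
    (lambda (y) (+ x y)))

  (funcall (make-adder 5) 1)   ; unbound-variable error (or whatever the
                               ; global X happens to be), not 6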

> - lambda-lists have only &rest
> - no keywords for user functions

For efficiency. C and C++ don't have them either, but in ISLisp you can
define a macro `defun-with-keywords' yourself.
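
A rough sketch of what I mean, written in Common Lisp syntax for
concreteness (nothing below is in the draft; the keyword arguments simply
arrive in the &rest list and are looked up at run time):

  (defmacro defun-with-keywords (name required keys &body body)
    (let ((rest (gensym "REST")))
      `(defun ,name (,@required &rest ,rest)
         (let ,(mapcar (lambda (key)
                         `(,key (getf ,rest ,(intern (symbol-name key) :keyword))))
                       keys)
           ,@body))))

  ;; (defun-with-keywords make-point (x y) (color)
  ;;   (list x y color))
  ;; (make-point 1 2 :color 'red)  =>  (1 2 RED)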

> - no list destructuring in macros

Once you have written `destructuring-let' (or `destructuring-bind', as
CL calls it) in a library, you don't need this in `defmacro' itself.
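
For instance, a sketch in CL syntax (my-dolist is a made-up macro;
destructuring-bind stands for the library version):

  (defmacro my-dolist (header &rest body)
    (destructuring-bind (var list-form) header    ; done by the library macro
      `(mapc (lambda (,var) ,@body) ,list-form)))

  ;; (my-dolist (x '(1 2 3)) (print x))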

In Common Lisp, you have four different syntaxes of lambda lists:
one for `defun', one for `defmethod', one for `defmacro' with recursive
destructuring, one for `deftype' with `*' instead of `nil' as the
default, not to mention `defsetf', `define-setf-method' and
`destructuring-bind'. This was crying for simplification.

> - no documentation

ISLisp doesn't confuse the program, whose final form is an executable
or a shared library, with the development environment. You don't want
to have documentation strings filling up the memory of the computer
which runs your application.

ISLisp doesn't standardize the development environment. The development
environment might store the comments of your source files in a database.
This doesn't affect the programming language at all.

> - no reader macros

How do you solve the collision problems when two modules define, say,
#{...} in two different ways? - Features which carry unsolved problems
are not ready for standardization.

> - <, =, >, etc, are binary, not n-ary

n-ary comparison functions are confusing for people who come from other
programming languages.

> - numbers are only floating point or integer

Rational, complex, fixed-point numbers etc. can be done in extensions
provided by the ISLisp implementation.

> - no concept of fixnums

Common Lisp doesn't have a "concept" of fixnums either: There is no
class FIXNUM. Anyway, implementation details like the storage of an integer
don't need to be specified in a high-level language like Lisp.

> - only one floating point precision

Current hardware handles "double-float" reasonably well, so there is no
need for smaller floating point types. And longer floating-point
types haven't been implemented in CL, except for VAX Lisp and CLISP.

> - / is not defined, but `quotient' and `reciprocal' are

Because in C, `/' on integers has the meaning of `floor' or `truncate'.
And in CL, `/' on integers generates a rational number. You can't be
compatible with both.

> - lobotomized file I/O and handling

Common Lisp's I/O was not ready for standardization. The developers
of Dylan have taken the better route, IMO.

> - no equivalent to `read-sequence', `write-sequence'

Right, this is missing in ISLisp.

> - lobotomized `format', no `print' or `write'

You don't agree that CL's `format' facility is a monster?


Bruno

Erik Naggum

Nov 29, 1995
[Bruno Haible]

| Kent Pitman's FTP server is ftp.harlequin.co.uk:/pub/kmp/iso/,
| and mine is ma2s2.mathematik.uni-karlsruhe.de:/pub/lisp/islisp/.

thanks.

| At the time the design goals were formulated (1991), Common Lisp was
| still miles away from being standardized.

hmmm. I think it should have been updated. it was fairly obvious for a
long time that CL would reach status as published standard.

| In ISLisp, there is not distinction between predefined special forms
| and predefined macros. They all behave the same way: they have special
| evaluation rules, and you can't redefine them. So where's the
| difference?

when writing code-walkers, you need to expand macros and stop at special
operators. (not that you can expand macros in ISLisp, anyway.) having to
handle all predefined macros specially may make working with ISLisp apart
from the compiler harder. since we all work with languages apart from the
compiler, and few languages were designed to help that process, I don't
think ISLisp should learn the wrong lesson here.

| > among the more important differences from Common Lisp:
| >
| > - specialized access to dynamic variables
| > - apparently, dynamic variables are different from lexical variables
|
| This cleans up one of the major pitfalls in Common Lisp: In CL,
| executing some '(declaim (special x))' will completely screw up the
| files you compile afterwards.

I'm aware of the pitfall, but my impression is still (I have read the draft
a _little_ more carefully, now) that a symbol may have both a lexical and a
dynamic value. is this so? I don't think pitfalls are a good reason to
remove an extremely useful facility. life has its own share of pitfalls,
but we don't kill people to avoid them.

| > - lambda-lists have only &rest
| > - no keywords for user functions
|
| For efficiency. C and C++ don't have them either, but in ISLisp you
| can define a macro `defun-with-keywords' yourself.

C and C++ are not exactly model languages. keywords, to be efficient,
require compiler support. Norway will require user-code support for
keywords, since it is already supported in some of the predefined
functions. I will miss &optional. even C++ has it. (gack, I can't
believe I said that.) maybe I should require &optional arguments. their
semantics is well-defined, and the compiler can handle it.
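
the hand expansion via &rest is trivial, which is rather the point (a
made-up sketch in CL syntax):

  (defun substring (string start &rest maybe-end)   ; &optional done by hand
    (let ((end (if maybe-end (car maybe-end) (length string))))
      (subseq string start end)))

  ;; (substring "islisp" 2)    => "lisp"
  ;; (substring "islisp" 2 4)  => "li"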

| > - no documentation
|
| ISLisp doesn't confuse the program, whose final form is an executable
| or a shared library, with the development environment. You don't want
| to have documentation strings filling up the memory of the computer
| which runs your application.

both counts are highly irrelevant. documentation strings are not primarily
for a development environment, but for documenting code in a standardized
way. that you can access this information in a development environment
just goes to show that it cares for its programmers. (I also note that
CLISP lacks documentation, and this is _the_ major reason I don't use
CLISP. I can't afford to use it as a development platform when I have to
look things up in manuals that aren't even about the implementation, and
the many special features of CLISP are useless without proper and easily
accessible documentation.) Norway will require documentation in all
defining forms.

| ISLisp doesn't standardize the development environment. The
| development environment might store the comments of your source files
| in a database. This doesn't affect the programming language at all.

it does. an integrated documentation facility is what marks the difference
between a good and a bad language, to put it in strong terms. comments are
just that, and people invent all sorts of weird conventions to obtain what
documentation strings could have afforded, were they present. the first
language to realize that code was written primarily for people was COBOL,
and although it was for the wrong people (sorry), the lesson from COBOL is
that formalized documentation is very important in a program. not just
development, but the whole lifecycle. external documentation is never up
to date.

also, note that Common Lisp has `declare', ISLisp doesn't. I missed that
entirely, but it is important also in this regard. `declare' can be used
to introduce formal fields of documentation for functions.

| Features which carry unsolved problems are not ready for
| standardization.

IMNSHO, all features carry unsolved problems. it's the easiest thing in
the world to manufacture problems with any feature. there are pointers in
C and C++, for crying out loud.

| > - <, =, >, etc, are binary, not n-ary
|
| n-ary comparison functions are confusing for people which come from
| other programming languages.

why is this confusing? they will never use it. they will go on writing
(and (< foo bar) (< bar zot)) instead of (< foo bar zot), and they won't
ever be confused. but when they see it the first time, they will go
"halleluja" and never return to the stupid binary comparison functions.
this is not confusion. this is revelation! I _especially_ like the n-ary
/=. this explodes if done in user code, but may easily be implemented by
sorting the values first if there are more than, say, 3.
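
a sketch of the sorting idea, in plain Common Lisp (the cutoff of 3 is
arbitrary):

  (defun n-ary/= (&rest numbers)
    ;; pairwise for few arguments, sort-and-scan for many
    (if (<= (length numbers) 3)
        (loop for (x . more) on numbers
              never (member x more :test #'=))
        (loop for (x y) on (sort (copy-list numbers) #'<)
              while y
              never (= x y))))

  ;; (n-ary/= 1 2 3)    => t
  ;; (n-ary/= 1 2 1 4)  => nil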

I don't understand the asymmetry in the argument, either. +, *, and - are
n-ary in the Common Lisp way. isn't that confusing? is it because x < y <
z in C is asymmetric with x + y + z? I happen to think C is seriously
broken because of this asymmetry. it appears that ISLisp is learning the
wrong lesson from a number of issues.

| > - numbers are only floating point or integer
|
| Rational, complex, fixed-point numbers etc. can be done in extensions
| provided by the ISLisp implementation.

well, OK. lacking reader macros at this point makes for an interesting
syntax challenge, however.

| > - no concept of fixnums
|
| Common Lisp doesn't have a "concept" of fixnums either: There is no
| class FIXNUM. Anyway, implementation details like the storage of an
| integer don't need to be specified in a high-level language like Lisp.

obviously, you can ask a CL system whether a number is a fixnum, for
efficiency. it appears that this "efficiency" argument is being used
rather selectively.
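
for example (the exact limit is of course implementation-dependent):

  (typep (* 1000 1000) 'fixnum)   ; => t in typical implementations
  most-positive-fixnum            ; e.g. 536870911 in CMU CL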

| > - only one floating point precision
|
| Current hardware handles "double-float" reasonably well, so there is no
| need for smaller floating point types. And longer floating-point types
| haven't been implemented in CL, except for VAX Lisp and CLISP.

again, efficiency.

| > - / is not defined, but `quotient' and `reciprocal' are
|
| Because in C, `/' on integers has the meaning of `floor' or `truncate'.
| And in CL, `/' on integers generates a rational number. You can't be
| compatible to both.

you actually want to be "compatible" with _C_? I didn't see this in the
design goals. it's depressing how that language has warped people's minds.
that's where the "while" and "for" come from, too, isn't it? gack, barf.

besides, the C / may be both floor _and_ truncate, depending on the sign of
its operands. that doesn't mean that "quotient" isn't silly.

| > - lobotomized `format', no `print' or `write'
|
| You don't agree that CL's `format' facility is a monster?

no. I think it is the most powerful formatting function around. its only
problem is that it is tied to English in some of its more useful
constructs. (that things are unreadable does not appear to throw people,
anymore.) the biggest problem with localizing code is handling plurals in
languages that force the programmer to write code to handle it. with
format strings, one may do wondrous things for international code. (that's
ISO bait for you.)

#<Erik 3026639680>
--
suppose we actually were immortal. what is the opposite of living your
life as if every day were your last?

Bill Dubuque

Nov 30, 1995
From: hai...@ilog.fr (Bruno Haible)
Date: 28 Nov 1995 21:27:16 GMT

> Erik Naggum <er...@naggum.no> wrote:
> > - no reader macros
>
> How do you solve the collision problems when two modules define, say,
> #{...} in two different ways? - Features which carry unsolved problems
> are not ready for standardization.

Has anyone done work on solving this problem?

It would be nice if there were a namespace mechanism, perhaps
integrated with packages, that allowed for hierarchical
structuring of readtables, thus providing inheritance,
and other OOP mechanisms.

Further, I think that it should be possible to dynamically
bind the entries in a readtable (in fact, why not create
a syntax for denoting them, e.g. say foo/read:# denotes
the # entry (method) in the readtable in the foo package, and
you can use flet to bind it). Then the full power of CLOS
could be used for building hairy readers.

Similar considerations apply to the reading of (potential)
numbers, a capability which is sorely needed if you
do any algebraic style programming (e.g. symbolic algebra
systems like Macsyma, Maple, Mathematica).

The existing CL reader specification is very weak in power
and extensibility.

-Bill

Barry Margolin

Dec 1, 1995
In article <WGD.95No...@martigny.ai.mit.edu>,

Bill Dubuque <w...@zurich.ai.mit.edu> wrote:
>It would be nice if there were a namespace mechanism, perhaps
>integrated with packages, that allowed for hierarchical
>structuring of readtables, thus providing inheritance,
>and other OOP mechanisms.

Since readtables are first-class objects in CL, it should be possible to
implement this in user-level code.

For instance, to make a readtable that inherits from another readtable,
simply bind all its entries to a function that looks up the corresponding
entry in the parent readtable and invokes it. Multiple inheritance is
trickier, but should be doable.
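
A rough sketch of that delegation, using only the standard readtable
functions (the list of characters to delegate is a made-up parameter; a
complete version would cover every macro character):

(defun make-inheriting-readtable (parent chars-to-delegate)
  (let ((child (copy-readtable nil)))       ; start from the standard readtable
    (dolist (char chars-to-delegate child)
      (set-macro-character
       char
       (lambda (stream c)
         ;; look the entry up in the parent *at read time*, so later
         ;; changes to the parent are inherited
         (let ((fn (get-macro-character c parent)))
           (if fn
               (funcall fn stream c)
               (error "~C has no reader macro in the parent readtable" c))))
       t child))))                          ; non-terminating, like #\#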

>Further, I think that it should be possible to dynamically
>bind the entries in a readtable (in fact, why not create
>a syntax for denoting them, e.g. say foo/read:# denotes
>the # entry (method) in the readtable in the foo package, and
>you can use flet to bind it). Then the full power of CLOS
>could be used for building hairy readers.

Actually, if I were to CLOSify readtables, I would probably do it with
something like:

(defclass my-readtable (standard-readtable) ())

(defmethod reader-handle-character ((self my-readtable)
                                    (char (eql #\#))
                                    stream)
  ...)

(let ((*readtable* (make-instance 'my-readtable)))
  ...)

Functions like SET-MACRO-CHARACTER would use the Meta-Object Protocol to
modify classes on the fly.

Scott Fahlman

Dec 2, 1995

In article <49fvok$p...@nz12.rz.uni-karlsruhe.de> hai...@ilog.fr (Bruno Haible) writes:

> Scott Fahlman <s...@CS.CMU.EDU> wrote:
> >
> > No, I think that this is a silly Europe vs. US/Japan war.
>
> It's more like US vs. Europe/Japan.

Whatever...

> > This ISO standard will be ignored, of course,
>
> It is not being ignored. At least two commercial Lisp implementations
> I know of (ILOG Talk being one of them) implement most of ISLisp.

I meant that it would be ignored by those not deeply involved in
creating the ISO standard. Clearly ILOG has a major interest in
seeing this standard "take". Who is doing the other implementation,
if you're allowed to say?

> > Well, this is no longer my problem. I'm off trying to establish the
> > next great language (Dylan),
>
> Dylan requires a much more intelligent compiler than ISLisp does.

Not really. In some ways it is easier, once you get past the parser
(which is pretty standard technology). Implementing method dispatch
efficiently adds some complication, but you get that problem with CLOS
as well.

> Who
> in the world (except Harlequin) has the human and financial resources
> to develop a Dylan compiler and development environment? ISLisp is
> simpler, so you don't have to invest that much.

Our group at CMU whipped up the Mindy implementation of Dylan in a
couple of wizard-months, though more work has gone into extending it
since then.

If you're talking about a super-optimizing compiler completely
integrated with a hypercode environment, that is indeed a difficult
and open-ended problem, though not (we hope!) beyond the capabilities
of a strong university project.

Cheers,
Scott

===========================================================================
Scott E. Fahlman Internet: se...@cs.cmu.edu
Principal Research Scientist Phone: 412 268-2575
School of Computer Science Fax: 412 268-5576
Carnegie Mellon University Latitude: 40:26:46 N
5000 Forbes Avenue Longitude: 79:56:55 W
Pittsburgh, PA 15213 Mood: :-)
===========================================================================

Jeff Dalton

Dec 2, 1995
Erik Naggum <er...@naggum.no> writes:

>[Bruno Haible]

I hope we can avoid a language war here...

>| > - specialized access to dynamic variables
>| > - apparently, dynamic variables are different from lexical variables
>|
>| This cleans up one of the major pitfalls in Common Lisp: In CL,
>| executing some '(declaim (special x))' will completely screw up the
>| files you compile afterwards.

>I'm aware of the pitfall, but my impression is still (I have read the draft
>a _little_ more carefully, now) that a symbol may have both a lexical and a
>dynamic value. is this so?

Yes, but what Bruno said was right as well (technically).

If you do not proclaim/declaim a name as special, then it can have
both lexical and dynamic values; but after you've proclaimed/declaimed
it special, that ceases to be the case. (Note that defvar does a
special proclamation.)
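
A small illustration (the function names are made up):

  (defun get-dynamic-x ()
    (declare (special x))       ; refer to the dynamic variable X
    x)

  (defun demo ()
    (let ((x 'dynamic))
      (declare (special x))     ; dynamic binding of X
      (let ((x 'lexical))       ; lexical binding of the same name
        (list x (get-dynamic-x)))))

  ;; (demo) => (LEXICAL DYNAMIC)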

(Code that's already been processed (especially if it's been compiled)
may not notice, of course. I'm not sure exactly what the standard
specifies there.)

The Common Lisp convention of using a "*" at the beginning
and end of the names of variables that are proclaimed special
means that "files you compile afterwards" are seldom screwed
up by special proclamations/declamations.

But many people are not satisfied by mere conventions. Hence
demands for "information hiding", etc.

-- jeff

Jeff Dalton

Dec 2, 1995
bar...@tools.bbnplanet.com (Barry Margolin) writes:

>In article <19951126...@naggum.no>, Erik Naggum <er...@naggum.no> wrote:
>>"6. ISLisp shall promote efficient implementations and applications."

>>[...]

>Basically, criterion 6 (quoted above) was a major reason for the
>differences between ISLisp and Common Lisp. The feeling among many of the
>ISO Lisp committee members was that CL was too big and complex to be able
>to be implemented efficiently (several existing implementations
>notwithstanding).

Why the "notwithstanding"?

They (ISO's WG-16) knew about the implementations and their properties.

Now, (6) above is a design goal. It is not directly an expression
of what were thought to be problems with Common Lisp and should not
be read as if it were.

-- jd

Marcus Daniels

Dec 2, 1995
>>>>> "Joe" == Joe User <g...@mci.newscorp.com> writes:
In article <49o5vq$4...@merlin.delphi.com> Joe User <g...@mci.newscorp.com> writes:

Joe> The differences between compiling CL, Scheme, Dylan, EULISP, or
Joe> whatever are rather trivial compared with the difficulty of
Joe> generating good code for RISC architectures, properly interfacing
Joe> with the operating system and competing with the debugging and
Joe> general code development environments available in products such
Joe> as Microsoft Visual C++.

Joe User, are you a representative of alt.syntax.tactical?


Jeff Dalton

Dec 2, 1995
s...@CS.CMU.EDU (Scott Fahlman) writes:

>In article <19951126...@naggum.no> Erik Naggum <er...@naggum.no> writes:

> "2. The existence of the incompatible dialects Common Lisp, EuLisp, Le-Lisp
> and Scheme (mentioned in alphabetical order).
>
> on the first item 2: there is no mention of any other standardization
> activity in this document, despite the existence of ANSI Common Lisp and
> IEEE Scheme. treating published standards as "incompatible dialects" is a
> rather heavy-handed propaganda move in my eyes. on item 5, I get a feeling

I think you are (EN) reading way too much into this.

Note that you are quoting a list of factors that influenced the
establishment of design goals for ISLisp, and the ISLisp effort
started before even Scheme was a standard. This sentence about
incompatible dialects has been around pretty much from the
beginning.

> this language is the victim of yet another silly "industry vs academia"
> war, and that it is yet another propaganda move.

>No, I think that this is a silly Europe vs. US/Japan war. Le-Lisp and
>EuLisp each have some admirable features, but to mention them as
>standards on a par with CL and Scheme is just silly. Maybe it seems
>less silly if you sit in France.

Does it mention them "as standards"? It doesn't in the quote above.

Note that Kent Pitman -- the editor of the CL standard -- was the
editor of the ISLisp draft as well, so he is presumably aware of
its contents. I forget who, these days, is the US representative
to the ISO committee that produced this draft, but he is presumably
reasonably familiar with the contents as well. I'm not saying
either of these people approves of the language in the draft,
but so far as I know no one has thought it was a serious
problem.

-- jeff


Jeff Dalton

Dec 2, 1995
hai...@ilog.fr (Bruno Haible) writes:

>> on item 5, I get a feeling
>> this language is the victim of yet another silly "industry vs academia"
>> war, and that it is yet another propaganda move.

>Not at all. On the contrary, industry and universities have closely
>cooperated to produce the ISLisp standard draft: Japanese universities
>took part as well as French companies, the university-based EuLisp
>project as well as German companies.

Why do you say EuLisp was university-based? There was a fair
amount of commercial representation, especially from France and
Germany.

>> - no concept of fixnums

>Common Lisp doesn't have a "concept" of fixnums either: There is no
>class FIXNUM. Anyway, implementation details like the storage of an integer
>don't need to be specified in a high-level language like Lisp.

There is a fixnum _type_ in Common Lisp.

"Implementation details" often do need to be specified if you
want sufficiently efficient code.

-- jd

Joe User

Dec 2, 1995
When people propose that they need a language which is easier to
compile than CL because it reduces the implementation resources
I really begin to wonder.

The differences between compiling CL, Scheme, Dylan, EULISP, or whatever
are rather trivial compared with the difficulty of generating
good code for RISC architectures, properly interfacing with
the operating system and competing with the debugging and general
code development environments available in products such as
Microsoft Visual C++.


Richard A. O'Keefe

Dec 4, 1995
As soon as I heard about ISlisp, I downloaded the current (15.6) draft,
and spent the weekend studying it in some detail.

Who precisely is helped by renaming RPLACA to SET-CAR?
If you don't like the name RPLACA, just leave it out; (SETF (CAR ..) ..)
will do fine.

Renaming DO to FOR looks like petty spite.

Renaming &REST to :REST was rather petty too. (Yes, there is one sentence
that says you can use either of them, but all the examples use :REST.)

The standard is supposed to be more oriented to the needs of industry,
but in marked contrast to Common Lisp gives you no information about
floating point properties, less floating-point support than C, not least
no choice of floating-point output formats (only unparametrised ~G, the
result of which is not spelled out).

There are several kinds of simplicity.
1. How easy is it to _specify_ something?

The ISlisp 15.6 draft is about 120 pages. No argument that it is
easy to specify. (Well, maybe. The text needs a *lot* of work.)

2. How easy is it to _implement_ something?

keyword parameters have been left out.
multiple inheritance is provided for in the same way as in Object Pascal;
it's not clear to me that this simplifies the implementation much.
The specification of (format --) has been limited (to the point where
it is less capable at numeric formatting than C, which is pretty limited)
Hash tables are missing.
Packages are not there.
Apart from that, pretty much all the hard stuff is there.
I see no evidence that type inference, for example, is easier than in
Scheme or Common Lisp.

I am not up to speed on current Lisp compilation techniques, but I
really do get a strong impression that writing a really good compiler
for ISlisp is not _significantly_ simpler than writing a really good
compiler for Common Lisp.

There is no call-with-current-continuation (so much for learning from
Scheme). Nor are there multiple values. (Which _are_ in R5RS Scheme.)

3. How easy is it to _teach_ and _learn_?

A good small language is a real winner there. Roughly speaking, the
main advantage of ISlisp over Scheme is
- format (but that's in most Scheme libraries including SLIB)
- the condition system
The main disadvantage of ISlisp over Scheme for teaching is
- separate name spaces (variables, functions -- in variable name
space in Scheme), block labels (block exit done using call/cc
in Scheme), goto labels (no gotos in Scheme)
- which necessitates the somewhat confusing 'funcall' and #'
For example, I _can_ say in ISlisp
(defvar foo (lambda (x) (car (cdr x)) ))
but I then have to call it with 'funcall'. Note the absence
of #' on the lambda expression.
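
(The call then reads (funcall foo '(a b c)), which returns b.)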

Providing hash tables instead of property lists would be a good move
for teaching and learning. (So it proved in Pop.)

4. How easy is it to _use_?

Much of the reason why ISlisp 15.6 is smaller than CLtL2 is the
omission of a good deal of the library. For example, the ISlisp
designers seem to think that industrial programmers have no need
for case conversion or case-ignoring string comparison. Leaving
string-equal out simplifies the standard, but does little to
impress an industrial programmer who wants to know the Lisp
equivalent of strcasecmp(). Putting in property lists (properties
can only be attached to symbols, and property names can only be
symbols) does nothing to help industrial programmers, who _would_
be well served by hash tables.

I know that in my own Scheme programming I spent a lot of time
reproducing things from Common Lisp, because they were useful.
ISlisp programmers will have to do the same. This is silly.

The only mapping function available for vectors (or any non-list
sequence) is map-into. Heck, even the C++ draft standard provides
operators mapped over arrays (see section 26.3 in the draft standard
and the vector<> template).

Industrial programmers are benefitted by
- the code they _don't_ have to write
(that is, well structured libraries are a good thing, even if
large)
- bindings to the environment (POSIX.1/1a bindings, GUI bindings,
network bindings)
- training materials
(splitting the market helps *nobody* here).

Given the existence of MCL, CLISP, GCL, CMU Common Lisp, &c, I can see
no advantage to potential industrial users in having yet another dialect.

Consider this. For C programming, a currently popular debugging kit is
ddd on top of gdb. This gives C programmers _some_ of the inspection
and debugging capabilities I was used to in Interlisp-D. On the machine
I'm posting from, gdb is 1.5M and ddd is 2.2M. That's 3.7 *megabytes*
for a C debugging environment. On the same machine, gcl is 2.5 megabytes.

Given those sizes, I don't see any need for a smaller Lisp standard.
I *certainly* don't see any advantage to *users*.
--
"conventional orthography is ... a near optimal system for the
lexical representation of English words." Chomsky & Halle, S.P.E.
Richard A. O'Keefe; http://www.cs.rmit.edu.au/~ok; RMIT Comp.Sci.

Paul Walsh

Dec 4, 1995
Hello,
Could anybody direct me to a good Fortran to Lisp translator, if such a thing exists.
Thanks,
Paul Walsh
pa...@csvax1.ucc.ie


Jeff Dalton

Dec 4, 1995
w...@zurich.ai.mit.edu (Bill Dubuque) writes:

> From: hai...@ilog.fr (Bruno Haible)
> Date: 28 Nov 1995 21:27:16 GMT

> Erik Naggum <er...@naggum.no> wrote:
> > - no reader macros

> How do you solve the collision problems when two modules define, say,
> #{...} in two different ways? - Features which carry unsolved problems
> are not ready for standardization.

>Has anyone done work on solving this problem?

What's supposed to be so difficult about it? In any case, it
has no more serious "unsolved" problems than many of the things
that went into Common Lisp.

-- jd

T. Kurt Bond

Dec 6, 1995
o...@goanna.cs.rmit.EDU.AU (Richard A. O'Keefe) wrote:
> Nor are there multiple values. (Which _are_ in R5RS Scheme.)

Uh, what R5RS? Was this ever published?

--
T. Kurt Bond, t...@wvlink.mpl.com


Marco Antoniotti

Dec 6, 1995
In article <49o5vq$4...@merlin.delphi.com> Joe User <g...@mci.newscorp.com> writes:

From: Joe User <g...@mci.newscorp.com>
Newsgroups: comp.lang.lisp
Date: 2 Dec 1995 00:18:02 GMT
Organization: MCI/News Corp.
Lines: 17
References: <19951126...@naggum.no> <49furk$o...@nz12.rz.uni-karlsruhe.de> <WGD.95No...@martigny.ai.mit.edu>
NNTP-Posting-Host: slip162-178.bb.delphi.com
Mime-Version: 1.0
Content-Type: text/plain; charset=us-ascii
Content-Transfer-Encoding: 7bit
X-Mailer: Mozilla 1.22D (Windows; U; 16bit)

Yes! It took MS and Borland 10 years to get close (not quite there) to
the Xerox and Symbolics environments of the early 80's. :)

Cheers
--
Marco Antoniotti - Resistente Umano
===============================================================================
International Computer Science Institute | mar...@icsi.berkeley.edu
1947 Center STR, Suite 600 | tel. +1 (510) 643 9153
Berkeley, CA, 94704-1198, USA | +1 (510) 642 4274 x149
===============================================================================
...it is simplicity that is difficult to make.
...e` la semplicita` che e` difficile a farsi.
Bertholdt Brecht

k p c

Dec 7, 1995
Quoth je...@cogsci.ed.ac.uk (Jeff Dalton):

> I hope we can avoid a language war here...

I thought that's what c.l.l is for :-).

I wish there were good demographics data. Maybe that would focus our
"Lisp is dead" threads and maybe even defragment the landscape.

For example:

o Are there very many people who learned CL or CLOS recently
not for legacy code or programming language research but to
write new applications? Or is it mostly historical users?
o How many non-academics use Lisp heavily?
o Will a new standard increase Lisp market share or fragment
existing market share?
o How many Schemers think source code incompatibility is a
problem?
o How many elispers would be convinced to use Guile?
o How many people use xlisp and what are the most important
features to them?
o How many people would use Scheme for Java's VM?
o How many people would use a standard (i.e. all code runs
portably on more than one implementation) dialect of Lisp
for everyday scripting purposes? (Fast regexps, easy pipes,
OS interface.)
o How many people think standards are important? lreP does
not have one, but who worries about it dying? On the other
hand Gina is dead and who knows which Scheme will become the
most popular.
o What percentage of Lisp users would "defect" to non-Lisp
syntax (Dylan, Java) if they had their favorite semantics
and library functions?

I'm not asking for opinions on these questions, just saying that maybe
we could focus our "squabbles" better if we knew the answers somehow.

I sometimes run du on /var/spool/news/comp/lang and related groups to
count the number of articles posted about each language. I sometimes
even analyze the newsgroups for things like relative percentages of
.edu sites. But that doesn't answer all of the questions. Market
data from commercial concerns might be interesting if they were
interested in telling us.

If you post a followup to this article, I would appreciate a courtesy
verbatim copy by email to help work around potentially unreliable feeds.

---
k...@ptolemy.arc.nasa.gov. AI, multidisciplinary neuroethology, info filtering.
Live and let live. Do as ye will, an ye harm none. Let it be. Dirty laundry.
Informed consent. Heal thyself. First things first. Specks in eyes, removal.
Homes and castles. Trees and forests. Pots and kettles. High ground, taking.

Jeff Dalton

Dec 7, 1995
M'Isr <10323...@CompuServe.COM> writes:

>If they would use Call/CC continuations instead
>of
> TAG
> CATCH
> THROW

>Call/CC is more powerful: it can be used to
>write nondeterministic and multitasking code
>and has inherent continuations--
>this is Scheme's strong point. It eliminates three
>commands and does more, so why do we use Call/CC instead of

> THROW
> CATCH
> TAG,
> ETC.

Call/cc is more controversial than you might suppose. Indeed,
some knowledgeable people feel it is a significant defect in
Scheme.

By the way, what is your "etc"? What other cases do you have in
mind?

-- jd

M'Isr

Dec 7, 1995
If they would use
Call/CC continuations instead
of
TAG
CATCH
THROW

Call/CC is more powerful: it can be used to
write nondeterministic and multitasking code
and has inherent continuations--
this is Scheme's strong point. It eliminates three
commands and does more, so why do we use Call/CC instead of

THROW
CATCH
TAG,
ETC.

--
Khos AI
VIRX

Joe User

Dec 9, 1995
mar...@sysc.pdx.edu (Marcus Daniels) wrote:
>>>>>> "Joe" == Joe User <g...@mci.newscorp.com> writes:
>In article <49o5vq$4...@merlin.delphi.com> Joe User <g...@mci.newscorp.com> writes:
>
>Joe> The differences between compiling CL, Scheme, Dylan, EULISP, or
>Joe> whatever are rather trivial compared with the difficulty of

>
>Joe User, are you a representative of alt.syntax.tactical?
>

No. I am an old lisp implementor, having worked on the internals
of multiple university-based and commercial lisp systems,
and having shipped commercial products that use lisp.
Even now I am using lisp in commercial products, hidden in
the guts of the system.

Now CL may be ugly, but any differences in difficulty
of doing a quality implementation, compared with Scheme, Dylan
or EULISP are commercially insignificant.

And any company that says otherwise is only documenting
the restricted capabilities of its own implementation team.


MAEDA Atusi

Dec 14, 1995
Any serious programming language should provide a way to develop
applications in a modular manner. That is, an application should be written
as a set of separate files and merged together to form a complete program.

ISLisp Working Draft 15.6 states that:

"An ISLisp text consists of a sequence of toplevel forms." (1.7)
"An object is prepared for execution... The method of preparation for
execution and its result are not defined in this document ..." (1.3)

Now suppose there is an application program whose text is divided into
two files `module1' and `module2'. Assume a global variable, say G,
is used from within both of these files.

Now where to place the definition of G? May I write:
(defglobal G ...)
in both files? If not, how to tell compiler that G is global when
compiling files?

The Draft says "For each namespace, defining forms can occur at most once
for the same name ..." (4.4), (in (DEFGLOBAL name form)) "The scope
of NAME is the entire current toplevel scope except the body FORM."
(4.8), "Each complete ISLisp text unit is processed in a scope called
the TOPLEVEL SCOPE".

Here, the meaning of "complete ISLisp text unit" is not clearly
specified. Which of the following is true?

(A) Each of the files forms a distinct toplevel scope.
(B) All files belong to the same toplevel scope.

It is obvious that Interpretation (A) makes no sense. We must share G
between two files. Following interpretation (B), we must define G in
either of the files, but not both. If we define G in the file
`module1', then how to tell the compiler that G is a global variable
when compiling `module2'? We can't include an implementation-specific
directive such as (REQUIRE "MODULE1"), since "A complying ISLisp text
shall not rely on implementation-dependent features."(1.9). ISLisp
doesn't provide portable #ifdef-like constructs (such as #+ and #- in
Common Lisp), either.

So we must tell compiler that `module2' depends on objects defined in
`module1' by ways other than program text.

A possible solution would be to provide interface file for each
module, describing what definition is exported from the module and
what is imported from other modules. The syntax of such an interface file is
of course totally implementation-dependent in the current Draft. But
since virtually all implementations will have this kind of feature,
and most practical ISLisp programs will have to rely on them, some kind
of module facility should be included in the standard. Otherwise,
application writers are forced to provide interface definitions for
every possible ISLisp implementation.

--mad

MAEDA Atusi

Dec 14, 1995
>>>>> "ok" == Richard A O'Keefe <o...@goanna.cs.rmit.EDU.AU> writes:
In article <49u965$9...@goanna.cs.rmit.EDU.AU> o...@goanna.cs.rmit.EDU.AU (Richard A. O'Keefe) writes:

ok> Who precisely is helped by renaming RPLACA to SET-CAR

New users would find SET-CAR easier to remember than some cryptic
six-char-limited name.

ok> If you don't like the name RPLACA, just leave it out, (SETF (CAR ..) ..)
ok> will do fine.

Maybe. But many implementations would have something like that
anyway. CMU CL has COMMON-LISP::%RPLACA, CLISP has SYSTEM::%RPLACA,
Allegro has EXCL::.INV-CAR, and so on. Note that SET-CAR is
incompatible with RPLACA in its argument order and return value. It
shares convention with other setter functions such as SET-PROPERTY,
SET-AREF, etc. And it gives user a concrete idea of what SETF is
doing. (CL's DEFINE-SETF-METHOD is a mess).

[deleted...]
ok> 2. How easy is it to _implement_ something?
[deleted..]

ok> I am not up to speed on current Lisp compilation techniques, but I
ok> really do get a strong impression that writing a really good compiler
ok> for ISlisp is not _significantly_ simpler than writing a really good
ok> compiler for Common Lisp.

I believe immutable binding for functions has significant impact on
the implementation techniques. (We can't interactively redefine
functions anymore. Shocking news, eh?) The ISLisp compiler can
always inline functions safely, or make a simple call instruction (as
in C).

Compare this with Scheme. When executing procedure call (CAR
something), compiled Scheme code (without dangerous optimization) must
first fetch the value of the global variable CAR, check that it is a
procedure, and then invoke it.

This check can be omitted in Common Lisp (because of its separate
namespace for functions), but the call must be done indirectly to
retain redefinability. Inlining functions loses redefinability in
Common Lisp. Compiled code with (DECLARE (INLINE FOO)) can't reflect
later changes on FOO, for example. This also defeats possibility of
interprocedural side-effect analysis. Assuming function NULL is
side-effect free is as dangerous as inlining.

Local functions are easier to handle in Common Lisp, because they
cannot be redefined within their scope. In Scheme, however,
LETREC-bound variables can be modified with SET!. So compiler must
first prove that the value is a procedure at the time of invocation
before emitting direct call instruction.

ok> There is no call-with-current-continuation (so much for learning from
ok> Scheme). Nor are there multiple values. (Which _are_ in R5RS Scheme.)

CALL-WITH-CURRENT-CONTINUATION, aside from its implementation
difficulty per se, has some drawbacks.

* UNWIND-PROTECT no longer works with it. The following code:
  (UNWIND-PROTECT
      <some file processing>
    (CLOSE FILE))
looks okay. But what will happen if someone captured a continuation
inside <some file processing> and tries to resume processing later?

Note that UNWIND-PROTECT is crucial in writing real-world
applications.

* CALL/CC defeats some compiler optimizations.
Without CALL/CC,
    (LET ((A ..) (B ..))
      ..
      (LET ((X (CONS A B)))
        (FOO)
        X) ..)
can be safely transformed into:
    (LET ((A ..) (B ..))
      ..
      (PROGN
        (FOO)
        (CONS A B)) ..)
Here, A and B are lexical variables and FOO can't affect the values
thereof. With CALL/CC, the transformation above is not safe (if a
continuation is captured in FOO and invoked many times, the former
always returns the identical cons cell but the latter doesn't).

Similarly,
    (LET ((A ..))
      ..
      (LET ((X A))
        (FOO)
        (SETQ X (+ X 1)))
      ..)
cannot be transformed into:
    (LET ((A ..))
      ..
      (PROGN
        (FOO)
        (+ A 1))
      ..)
in the presence of CALL/CC.

In summary, CALL/CC is way too general. It's actually strictly more
general than GOTOs. We need to restrict the usage of CALL/CC to
some structured form (i.e. only for non-local exit) until we find a
better way to tame it.

--mad

Erik Naggum

Dec 15, 1995
[Atusi Maeda]

| New users would find SET-CAR is easier to remember than some cryptic
| six-char-limited name.

new users find FIRST and REST easier to remember and use than CAR and CDR.
if they could use SETF universally, that would also help them remember and
use Lisp correctly. since RPLACA and SETCAR are already part of the Lisp
tradition, and FIRST and REST were adopted in Common Lisp, I find retention
of CAR and CDR while pretending to cater to "new users" somewhat specious.

| (CL's DEFINE-SETF-METHOD is a mess).

could you elaborate on that?

| I believe immutable binding for functions has significant impact on the
| implementation techniques. (We can't interactively redefine functions
| anymore. Shocking news, eh?) The ISLisp compiler can always inline
| functions safely, or make a simple call instruction (as in C).

a Common Lisp compiler can do that with functions in COMMON-LISP, as well.
rather than throwing the baby (redefinition) out with the bathwater (lack
of static analysis), couldn't ISLisp have provided a facility to freeze
packages? (oh, right, it doesn't have packages.)

| Assuming function NULL is side-effect free is as dangerous as inlining.

assuming that function COMMON-LISP:NULL is side-effect free is perfectly
legitimate, as is inlining it. Common Lisp has its good sides, although I
recognize the need for inventors of new dialects, especially political
creations such as International standards, to slam it. (i.e., a new
dialect loses respect according as it is disrespectful of other dialects,
from which it should learn, not differ from in bitterness.)

| In summary, CALL/CC is way too general. It's actually strictly more
| general than GOTOs. We need to restrict the usage of CALL/CC into some
| structured way (i.e. only for non-local exit) until we find better way
| to tame.

is it meaningful to restrict a continuation to be the continuation of a
currently active activation? i.e., not allow re-entry to a continuation
and no separate life of a continuation apart from the activation records?
this would once again make co-routines harder to implement, but that could
perhaps be another (related) concept?

I find ISLisp a depressing development. it appears unnecessary, and it is
gratuitously different from Common Lisp. have you failed to realize that
Lispers are facing people who want nothing stronger than to ridicule Lisp
because they don't understand it and so don't want to use it? what better
weapon to give them than to point out that even Lispers don't want to talk
each others' languages?

#<Erik 3027980519>
--
suppose we actually were immortal...

William D Clinger

Dec 15, 1995
While making several correct observations in article
<MAD.95De...@tanzanite.math.keio.ac.jp>, m...@math.keio.ac.jp
(MAEDA Atusi) made several remarks that deserve clarification.

>Compare this with Scheme. When executing procedure call (CAR
>something), compiled Scheme code (without dangerous optimization) must
>first fetch the value of the global variable CAR, check that it is a
>procedure, and then invoke it.

This would be true if, like Ada, it were illegal to extend Scheme.
In reality, the Scheme standards state that extensions are legal
so long as they don't conflict with the standards. In particular,
it is legal to extend Scheme to allow the programmer to specify that
certain procedures should be inlined. This is, of course, a very
common extension. Since inlining is done only by permission of the
programmer, it is not a dangerous optimization.

Although Common Lisp differs from Scheme by forbidding redefinition
of CAR, it does allow the programmer to choose between inlining and
redefinition for programmer-defined procedures.

Not all languages give the programmer this choice. MAEDA observed
that ISLISP does not.

>interprocedural side-effect analysis. Assuming function NULL is
>side-effect free is as dangerous as inlining.

Since it is illegal to redefine NULL in Common Lisp, there is nothing
dangerous about assuming it has no side effects.

>cannot be redefined within their scope. In Scheme, however,
>LETREC-bound variables can be modified with SET!. So compiler must
>first prove that the value is a procedure at the time of invocation
>before emitting direct call instruction.

This is true. It is also a very easy optimization to implement:
If a LETREC binds a variable to a procedure, and that variable
does not appear on the left hand side of a SET! within its scope,
then the compiler knows not only that that variable is a procedure
throughout its scope, but it also knows the code for the procedure.

This optimization has been in MacScheme since 1986.

>CALL-WITH-CURRENT-CONTINUATION, aside from its implementation
>difficulty per se, has some drawbacks.
>
>* UNWIND-PROTECT no longer works with it....
>
> Note that UNWIND-PROTECT is crucial in writing real-world
> applications.

UNWIND-PROTECT is a special case of DYNAMIC-WIND. It is true that
UNWIND-PROTECT is not powerful enough to protect against all uses
of CALL-WITH-CURRENT-CONTINUATION. That's why Scheme programmers
use DYNAMIC-WIND instead.

DYNAMIC-WIND is expressible in R4RS Scheme. Portable code for
DYNAMIC-WIND is available from several sources, e.g. SLIB.

>* CALL/CC defeats some compiler optimizations.

>     (LET ((A ..))
>       ..
>       (LET ((X A))
>         (FOO)
>         (SETQ X (+ X 1)))
>       ..)
>   cannot be transformed into:
>     (LET ((A ..))
>       ..
>       (PROGN
>         (FOO)
>         (+ A 1))
>       ..)
>   in the presence of CALL/CC.

This is true, except that it isn't really CALL/CC that's to blame for
this. It's the assignment. To see why, consider the fact that

  (let ((a ...))
    ...
    (let ((x a))
      (foo #'(lambda (v)
               (setq x (+ x 1)))))
    ...)

cannot be transformed into

  (let ((a ...))
    ...
    (let ((x a))
      (foo #'(lambda (v)
               (+ a 1))))
    ...)

either, even in a language without CALL/CC. My example is, of
course, equivalent to MAEDA's.

William D Clinger

Jeffrey Mark Siskind

unread,
Dec 18, 1995, 3:00:00 AM12/18/95
to

I believe immutable binding for functions has significant impact on
the implementation techniques. (We can't interactively redefine
functions anymore. Shocking news, eh?). Compiler of ISLisp can
always inline functions safely, or make a simple call instruction (as
in C).

Compare this with Scheme. When executing procedure call (CAR
something), compiled Scheme code (without dangerous optimization) must
first fetch the value of the global variable CAR, check that it is a
procedure, and then invoke it.

This check can be omitted in Common Lisp (because of its separate
namespace for functions), but the call must be done indirectly to
retain redefinability. Inlining functions loses redefinability in
Common Lisp. Compiled code with (DECLARE (INLINE FOO)) can't reflect
later changes on FOO, for example. This also defeats possibility of
interprocedural side-effect analysis. Assuming function NULL is
side-effect free is as dangerous as inlining.

Local functions are easier to handle in Common Lisp, because they
cannot be redefined within their scope. In Scheme, however,
LETREC-bound variables can be modified with SET!. So compiler must
first prove that the value is a procedure at the time of invocation
before emitting direct call instruction.

The Stalin compiler for Scheme does sound automatic inlining without
declarations. It safely compiles all Scheme procedure calls into direct C
function calls, even when the call is higher-order. It safely eliminates the
runtime check that the callee is a procedure when it can determine by compile
time type analysis that the callee must be a procedure of the correct arity.
In practice, it can do this in almost all cases, even for higher-order
procedures.

* CALL/CC defeats some compiler optimizations.

In summary, CALL/CC is way too general. It's actually strictly more
general than GOTOs. We need to restrict the usage of CALL/CC into
some structured way (i.e. only for non-local exit) until we find
better way to tame.

Stalin can determine when and where in a program CALL/CC is used and
continuations called. It can determine on an expression by expression basis
whether or not that expression can be reentered by calling a continuation.
The fact that the language supports CALL/CC causes no penalty whatsoever for
programs that do not use it. And even for programs that do use it, there is no
penalty whatsoever for those portions of the program that do not use it.
Stalin automatically determines when it is sound to compile calls to CALL/CC
and continuations as simple goto statements, when it needs to use
setjmp/longjmp, and when it needs to resort to CPS conversion. It does this on an
expression by expression basis, independently for each call to CALL/CC and
each continuation call. And this is all soundly integrated with procedure call
inlining and type analysis.

In my experience, there is nothing in the R4RS Scheme language that cannot be
compiled as efficiently as C given sufficient resources to write a good
compiler and sufficient resources to run such a compiler.

Jeff (home page http://www.cs.toronto.edu/~qobi)
--

Jeff (home page http://www.cs.toronto.edu/~qobi)

Richard A. O'Keefe

unread,
Dec 18, 1995, 3:00:00 AM12/18/95
to
m...@math.keio.ac.jp (MAEDA Atusi) writes:

>>>>>> "ok" == Richard A O'Keefe <o...@goanna.cs.rmit.EDU.AU> writes:
>In article <49u965$9...@goanna.cs.rmit.EDU.AU> o...@goanna.cs.rmit.EDU.AU (Richard A. O'Keefe) writes:

> ok> Who precisely is helped by renaming RPLACA to SET-CAR

>New users would find SET-CAR is easier to remember than some cryptic
>six-char-limited name.

New users shouldn't be using it no matter _what_ it is called.
They should be writing
(setf (car x) y)
instead.


> ok> If you don't like the name RPLACA, just leave it out, (SETF (CAR ..) ..)
> ok> will do fine.

>Maybe. But many implementations would have something like that
>anyway.

Yes. So what? Using it shouldn't buy you anything that (setf (car -) -)
won't buy you.

I do appreciate the line of argument here, but it is precisely the
same kind of "nyaa nyaa ni nyaa nyaa i'm not gunna play with YOUR
language I'm gunna play with MY language" approach that nearly wrecked
the Prolog standard and quite certainly delayed it for more years than
is creditable. We're talking about a situation where there was
- one "Lisp" standard already official (Scheme)
(and lots of free implementations including some great ones)
- one "Lisp" standard in use by a large fraction of the community
undergoing revision with the ANSI standard very close (CL)
(and a KCL to name but one)
 - a rival "Lisp" standard under development with a freely available
implementation and a lot of really neat ideas (EULisp)
and ISlisp is compatible with *none* of them, in broad or in detail.

>I believe immutable binding for functions has significant impact on
>the implementation techniques. (We can't interactively redefine
>functions anymore. Shocking news, eh?). Compiler of ISLisp can
>always inline functions safely, or make a simple call instruction (as
>in C).

It would have been very easy to add something to one of the existing
standards in order to express this, without kicking compatibility (and
existing Lisp programmers) in the teeth. Most of the Scheme implementations
I have manuals for have DEFINE-INTEGRABLE (or allow it to be defined; nothing
in any language standard I've seen _forces_ a compiler to take advantage of
things like this).

What's more, if you are talking about a batch compilation environment,
you are talking about a system where the compiler can find out that a
particular function _isn't_ redefined (even if it could have been).
I've recently been playing with Siskind's "Stalin" compiler which does
extensive global analysis and generates really winning code.

>Compare this with Scheme. When executing procedure call (CAR
>something), compiled Scheme code (without dangerous optimization) must
>first fetch the value of the global variable CAR, check that it is a
>procedure, and then invoke it.

The claim that this _must_ be done is simply false. What you _do_ need is
a good module system (which could certainly be added to Scheme without
breaking anything except a few reserved words; there are several proposals
around) so that the compiler can know when it is compiling a particular
module that every other module _can't_ assign to this variable and that this
module _doesn't_. It's not even hard.

One thing I have learned to dislike is environments where the debugging
environment has different semantics from the production environment. For
example, if functions _can_ be redefined somehow in the debugging
environment but not in the production environment, then you are debugging
a different language from the one your shipped application is written in.

Oh yes, the idea that function calls can _always_ be mapped to a simple
call instruction in C is quite a few years behind the times. Windows,
OS/2, VMS, and modern UNIX systems, all have some form of dynamic linking.
It's even possible in OS/2, if I've read one of the manuals correctly, to
load a file, binding a function, then unload the file and load a different
one. Certainly it's OS/2 PL/I, but if your language _can't_ do that, then
you can't take advantage of modern operating systems (or indeed the better
old ones).

>Local functions are easier to handle in Common Lisp, because they
>cannot be redefined within their scope. In Scheme, however,
>LETREC-bound variables can be modified with SET!. So compiler must
>first prove that the value is a procedure at the time of invocation
>before emitting direct call instruction.

This is, in practice, trivial. You have
(LET (... (id (lambda .. ...)) ...
or (LET* (... (id (lambda . ...)) ...
or (LETREC (... (id (lambda .. ...)) ...
or (DEFINE id (lambda .. ...))
or (DEFINE (id ..) ...)
You do a very cheap pass over the body governed by the declaration in
question, find that there are no instances of (SET! id ...), and in
O(n) time you know what you need to know.
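
A minimal sketch of that pass (ASSIGNED? is a made-up name, and a real
compiler would also handle quotation inside macros, shadowing, and so on):

    ;; Does ID ever appear as the target of a SET! anywhere in EXP?
    (define (assigned? id exp)
      (cond ((not (pair? exp)) #f)
            ((eq? (car exp) 'quote) #f)
            ((and (eq? (car exp) 'set!)
                  (pair? (cdr exp))
                  (eq? (cadr exp) id)) #t)
            (else (or (assigned? id (car exp))
                      (assigned? id (cdr exp))))))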

>CALL-WITH-CURRENT-CONTINUATION, aside from its implementation
>difficulty per se, has some drawbacks.

>* UNWIND-PROTECT no longer works with it. The following code:


> (UNWIND-PROTECT
> <some file processing>
> (CLOSE FILE))
> looks okay. But what will happen if someone captured continuation
> inside <some file processing> and try to resume processing later?

> Note that UNWIND-PROTECT is crucial in writing real-world
> applications.

Funny, I thought that problem had been solved by the introduction of
(DYNAMIC-WIND before-thunk result-thunk after-thunk)
It is widely available, and does deal with the issue of CALL/CC exits
(after-thunk is called) and reentrances (before-thunk is recalled).

>* CALL/CC defeats some compiler optimizations.

Yes. IF it is used. Most escape handling methods in most programming
languages defeat some compiler optimisations, when they are used.

>In summary, CALL/CC is way too general. It's actually strictly more
>general than GOTOs. We need to restrict the usage of CALL/CC into
>some structured way (i.e. only for non-local exit) until we find
>better way to tame.

Actually, my argument against ISlisp was not based only on CALL/CC.
That was just one of several illustrations that the "spirit of
compatibility" claim in the ISlisp draft is purest flannel.

However, it is easily possible to have a dialect of Lisp which _restricts
the usage_ of call/cc to the "structured" case WITHOUT LOSING SCHEME
COMPATIBILITY. It's called Elk. (Whether you get full call/cc or cheap
call/cc depends on how much work the person did who ported Elk to your
machine.) The result is that you can write code using call/cc in a
"structured" way in Elk, and run the _same_ code in another Scheme.

The major point remains: there are differences between ISlisp and existing
Lisp standards or proposed standards which do nothing to address real-world
problems but merely suit someone's taste better.

Back in the days before the Prolog standard, our biggest barriers to
portability were not syntax, but system interface (file name, FFI) ones.
That the ISlisp current draft does not address these issues makes its
pretensions to consider the needs of "industry" sound insincere.

Fernando D. Mato Mira

unread,
Dec 20, 1995, 3:00:00 AM12/20/95
to
What hurdles, if any, exist to implement ISO Lisp as a thin layer
of macrology on top of Common Lisp?

--
Fernando D. Mato Mira http://ligwww.epfl.ch/matomira.html
Computer Graphics Lab
Swiss Federal Institute of Technology (EPFL) Phone : +41 (21) 693 - 5248
CH-1015 Lausanne FAX : +41 (21) 693 - 5328
Switzerland E-mail : mato...@di.epfl.ch

MAEDA Atusi

unread,
Dec 21, 1995, 3:00:00 AM12/21/95
to
This is not very relevant to ISLisp.

>>>>>> "mad" == MAEDA Atusi Km...@math.keio.ac.jp> writes:
mad> Local functions are easier to handle in Common Lisp, because they
mad> cannot be redefined within their scope. In Scheme, however,
mad> LETREC-bound variables can be modified with SET!. So compiler must
mad> first prove that the value is a procedure at the time of invocation
mad> before emitting direct call instruction.

>>>>> "will" == William D Clinger <wi...@ccs.neu.edu> writes:

will> This is true. It is also a very easy optimization to implement:
will> If a LETREC binds a variable to a procedure, and that variable
will> does not appear on the left hand side of a SET! within its scope,
will> then the compiler knows not only that that variable is a procedure
will> throughout its scope, but it also knows the code for the procedure.

>>>>> "ok" == Richard A O'Keefe <o...@goanna.cs.rmit.EDU.AU> writes:

ok> You do a very cheap pass over the body governed by the declaration in
ok> question, find that there are no instances of (SET! id ...), and in
ok> O(n) time you know what you need to know.

So, here's yet another SMOP. But is it worth doing? How many
programmers modify variables bound by LETREC? I read in Lisp Pointers
about a proposal to prohibit assignment to such variables.

;;; Keio University
;;; Faculty of Science and Technology
;;; Department of Math
;;; MAEDA Atusi (MAEDA is my family name)
;;; m...@math.keio.ac.jp

MAEDA Atusi

unread,
Dec 21, 1995, 3:00:00 AM12/21/95
to
>>>>>> "mad" == MAEDA Atusi Km...@math.keio.ac.jp> writes:
mad> * UNWIND-PROTECT no longer works with it. The following code:
mad> (UNWIND-PROTECT
mad> <some file processing>
mad> (CLOSE FILE))
mad> looks okay. But what will happen if someone captured continuation
mad> inside <some file processing> and try to resume processing later?

>>>>> "will" == William D Clinger <wi...@ccs.neu.edu> writes:

will> UNWIND-PROTECT is a special case of DYNAMIC-WIND. It is true that
will> UNWIND-PROTECT is not powerful enough to protect against all uses
will> of CALL-WITH-CURRENT-CONTINUATION. That's why Scheme programmers
will> use DYNAMIC-WIND instead.

>>>>> "ok" == Richard A O'Keefe <o...@goanna.cs.rmit.EDU.AU> writes:

ok> Funny, I thought that problem had been solved by the introduction of
ok> (DYNAMIC-WIND before-thunk result-thunk after-thunk)
ok> It is widely available, and does deal with the issue of CALL/CC exits
ok> (after-thunk is called) and reentrances (before-thunk is recalled).

I think DYNAMIC-WIND won't help in this particular case, unless the
language provides a way to reopen and reset the state of a closed file.
One possible solution is to make the before-thunk signal an error when
invoked twice, preventing re-entry.
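
A sketch of that second idea, assuming DYNAMIC-WIND is available
(CALL-WITH-OPEN-FILE is a made-up name, and ERROR is whatever
error-signalling facility the implementation provides):

    ;; The before thunk refuses to run twice, so the protected region
    ;; cannot be re-entered through a captured continuation once the
    ;; file has been closed.
    (define (call-with-open-file name proc)
      (let ((port #f)
            (entered? #f))
        (dynamic-wind
          (lambda ()
            (if entered?
                (error "re-entry into closed file processing" name)
                (begin (set! entered? #t)
                       (set! port (open-input-file name)))))
          (lambda () (proc port))
          (lambda () (close-input-port port)))))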

Now, should ISLisp include CALL/CC and define CATCH, THROW, and
UNWIND-PROTECT in terms of CALL/CC? I'd like to hear from people who
are using CALL/CC heavily. Is CATCH/THROW insufficient in many cases?
(I write programs mainly in Common Lisp.)

mad> * CALL/CC defeats some compiler optimizations.
mad>    (LET ((A ..))
mad>      ..
mad>      (LET ((X A))
mad>        (FOO)
mad>        (SETQ X (+ X 1)))
mad>      ..)
mad> cannot be transformed into:
mad>    (LET ((A ..))
mad>      ..
mad>      (PROGN
mad>        (FOO)
mad>        (+ A 1))
mad>      ..)
mad> in the presence of CALL/CC.

ok> Yes. IF it is used. Most escape handling methods in most programming
ok> languages defeat some compiler optimisations, when they are used.

Actually, my example illustrates that CALL/CC disables some
optimizations even if it isn't really used. The problem is the
*possibility* of a captured continuation.

will>   (let ((a ...))
will>     ...
will>     (let ((x a))
will>       (foo #'(lambda (v)
will>                (setq x (+ x 1)))))
will>     ...)

will> cannot be transformed into

will>   (let ((a ...))
will>     ...
will>     (let ((x a))
will>       (foo #'(lambda (v)
will>                (+ a 1))))
will>     ...)

will> either, even in a language without CALL/CC. My example is, of
will> course, equivalent to MAEDA's.

This CPS version is equivalent to mine only when the language provides
first-class continuations.

My point is that, in the presence of CALL/CC, the compiler can't prove
that a form after an (unknown) function call is evaluated just once.
Hence, the compiler can't move any side-effecting forms across an
(unknown) function call, and can't remove some side effects.

>>>>> "erik" == Erik Naggum <er...@naggum.no> writes:

erik> is it meaningful to restrict a continuation to be the continuation of a
erik> currently active activation? i.e., not allow re-entry to a continuation
erik> and no separate life of a continuation apart from the activation records?
erik> this would once again make co-routines harder to implement, but that could
erik> perhaps be another (related) concept?

ok> However, it is easily possible to have a dialect of Lisp which _restricts
ok> the usage_ of call/cc to the "structured" case WITHOUT LOSING SCHEME
ok> COMPATIBILITY. It's called Elk. (Whether you get full call/cc or cheap
ok> call/cc depends on how much work the person did who ported Elk to your
ok> machine.) The result is that you can write code using call/cc in a
ok> "structured" way in Elk, and run the _same_ code in another Scheme.

EuLisp also has continuations with dynamic extent.

MAEDA Atusi

unread,
Dec 21, 1995, 3:00:00 AM12/21/95
to
>>>>> "ok" == Richard A O'Keefe <o...@goanna.cs.rmit.EDU.AU> writes:

ok> I do appreciate the line of argument here, but it is precisely the
ok> same kind of "nyaa nyaa ni nyaa nyaa i'm not gunna play with YOUR
ok> language I'm gunna play with MY language" approach that nearly wrecked
ok> the Prolog standard and quite certainly delayed it for more years than
ok> is creditable. We're talking about a situation where there was
ok> - one "Lisp" standard already official (Scheme)
ok> (and lots of free implementations including some great ones)
ok> - one "Lisp" standard in use by a large fraction of the community
ok> undergoing revision with the ANSI standard very close (CL)
ok> (and a KCL to name but one)
ok> - a rival "Lisp" standard under development with a freely available
ok> implementation and a lot of really neat ideas (EULisp)
ok> and ISlisp is compatible with *none* of them, in broad or in detail.

...`MY language'? I've never said I particularly prefer ISLisp over
Common Lisp or Scheme. I think much (but not all) of the criticism
made against ISLisp in this group is just right.

>>>>> "erik" == Erik Naggum <er...@naggum.no> writes:

erik> I find ISLisp a depressing development. it appears unnecessary, and it is
erik> gratuitously different from Common Lisp. have you failed to realize that
erik> Lispers are facing people who want nothing stronger than to ridicule Lisp
erik> because they don't understand it and so don't want to use it? what better
erik> weapon to give them than to point out that even Lispers don't want to talk
erik> each others' languages?

Do you mean you want single standard, instead of several parallel
standards (as we have now)? Then that's what ISLisp is intended to be.

Or are you asking for accepting Common Lisp (or one of other existing
standards) as international standard? If the standard, as a result of
deep arguments on individual features, eventually becomes exactly the
same as Common Lisp, then that's fine. I'm willing to accept it. But
I don't think modification is automatically a bad thing.

MAEDA Atusi

unread,
Dec 21, 1995, 3:00:00 AM12/21/95
to
Ok, I withdraw my "new users" silliness. Erik Naggum is right. If
ISLisp retains the CAR and CDR tradition, and *if* it provides functions
that are equivalent to (SETF (CAR ...) ...) and (SETF (CDR ...) ...)
forms, then SET-CAR and SET-CDR would be good names. Should the update
functions be dropped altogether? I don't know. This affects, at least:

+ Meaning of :accessor and :writer slot options of DEFCLASS.
+ We can't do something like (APPLY #'SET-AREF V ARRAY INDEX-LIST)
  unless we have (SETF (APPLY ...) ...). (See the sketch after this list.)
+ Renaming GET to PROPERTY becomes unnecessary.
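
For reference, ANSI CL already handles the APPLY case for AREF without a
SET-AREF function; a small illustration:

    ;; (SETF (APPLY #'AREF ...) ...) is required to work when the function
    ;; is AREF (also BIT and SBIT), which covers the list-of-indices case.
    (let ((a (make-array '(2 3) :initial-element 0))
          (indices '(1 2)))
      (setf (apply #'aref a indices) 99)
      (apply #'aref a indices))          ; => 99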

--------------------------------------------------------


>>>>> "erik" == Erik Naggum <er...@naggum.no> writes:

(MAEDA Atusi wrote)
erik> | (CL's DEFINE-SETF-METHOD is a mess).

erik> could you elaborate on that?

It is much more complicated than DEFSETF, but for what reason?
DEFSETF is just fine for almost all purposes. I know that
DEFINE-SETF-METHOD is necessary when we want to define a setf method for
some place forms such as (LDB ...) and (MASK-FIELD ...), but they are
all special in that they do not destructively modify the specified data
structure. That is, (SETF (LDB BYTE INT) V) does not update bits in
the integer bound to INT, but the variable INT itself. I find this confusing.

If such special cases are the only purpose for DEFINE-SETF-METHOD,
then I think it isn't worth having it. Or is there a more obvious
reason to have it? If so, enlighten me, please.
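
For concreteness, the contrast in plain Common Lisp (MIDDLE and SET-MIDDLE
are made-up names):

    ;; DEFSETF covers the common case: a reader/writer pair on a real place.
    (defun middle (v) (aref v 1))
    (defun set-middle (v new) (setf (aref v 1) new))
    (defsetf middle set-middle)

    ;; The LDB case is the special one: SETF assigns to the *variable*
    ;; holding the integer, not to bits inside an integer object.
    (let ((flags #b0000))
      (setf (ldb (byte 2 1) flags) #b11)
      flags)                             ; => 6, i.e. #b0110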

MAEDA Atusi

unread,
Dec 21, 1995, 3:00:00 AM12/21/95
to
>>>>>> "mad" == MAEDA Atusi Km...@math.keio.ac.jp> writes:
mad> Compare this with Scheme. When executing procedure call (CAR
mad> something), compiled Scheme code (without dangerous optimization) must
mad> first fetch the value of the global variable CAR, check that it is a
mad> procedure, and then invoke it.

>>>>> "will" == William D Clinger <wi...@ccs.neu.edu> writes:

In article <4as2be$c...@camelot.ccs.neu.edu> wi...@ccs.neu.edu (William D Clinger) writes:

will> This would be true if, like Ada, it were illegal to extend Scheme.
will> In reality, the Scheme standards state that extensions are legal
will> so long as they don't conflict with the standards. In particular,
will> it is legal to extend Scheme to allow the programmer to specify that
will> certain procedures should be inlined. This is, of course, a very
will> common extension. Since inlining is done only by permission of the
will> programmer, it is not a dangerous optimization.

I failed to find out where in R4RS the above statements were made.
Would you please point me to the section? Or is that in the IEEE standard?

About functions in the COMMON-LISP package, yes, I was out of date, by
around ten years. I should have read CLtL2 more carefully.

About user-defined functions, there are three approaches.

1. Include (DECLARE (INLINE ...)) in the standard, or allow an equivalent
   extension in implementations. Functions can be redefined.
   Redefinition affects other functions that call the redefined one,
   unless it was declared to be inlined.
2. No (DECLARE (INLINE ...)), and functions can be redefined.
3. No (DECLARE (INLINE ...)), and functions can't be redefined.

Common Lisp and Scheme take the first approach and ISLisp the third.
If both approaches can achieve almost the same level of efficiency, then
what is the difference? Programmers have to add declarations to Common Lisp
or Scheme programs to gain maximum efficiency (this may be trivial).
Compilers for Common Lisp or Scheme will be more complicated than ISLisp
compilers, to support multiple linkage methods (it may also be a Small
Matter Of Programming, but a difference is a difference).
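
The Common Lisp side of approach 1, for reference (ADD1 and ADD2 are
made-up names):

    ;; Inlining happens only because the programmer asked for it.
    (declaim (inline add1))
    (defun add1 (x) (+ x 1))

    (defun add2 (x)
      (+ (add1 x) 1))   ; ADD1 may be open-coded here; a later redefinition
                        ; of ADD1 need not be reflected in ADD2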

ISLisp's approach has a big disadvantage. Although the draft states
that it does not specify how to prepare an ISLisp text for execution
and how to execute it, the fact that it forbids redefinition of
functions effectively kills the usefulness of READ-EVAL-PRINT style
interpreters. A toplevel loop for ISLisp, if one exists, will look more
like a GDB command line than an ordinary Lisp environment.

How about ease of distribution and installation of applications?
Now I must admit that I don't know how commercial Lisp programs are
distributed. Are applications distributed as stand-alone programs?
Or must users have an entire Lisp system installed in advance?

>>>>> "ok" == Richard A O'Keefe <o...@goanna.cs.rmit.EDU.AU> writes:

(On MAEDA's statements saying "calling redefinable functions requires
run-time checking".)
ok> The claim that this _must_ be done is simply false. What you _do_ need is
ok> a good module system (which could certainly be added to Scheme without
ok> breaking anything except a few reserved words; there are several proposals
ok> around) so that the compiler can know when it is compiling a particular
ok> module that every other module _can't_ assign to this variable and that this
ok> module _doesn't_. It's not even hard.

>>>>> "erik" == Erik Naggum <er...@naggum.no> writes:

(In reply to MAEDA's statements saying "ISLisp compilers can always
inline function calls".)
erik> rather than throwing the baby (redefinition) out with the bathwater (lack
erik> of static analysis), couldn't ISLisp have provided a facility to freeze
erik> packages? (oh, right, it doesn't have packages.)

ISLisp does need a module facility, as I wrote in my article
<MAD.95De...@tanzanite.math.keio.ac.jp>.

It should differ from Common Lisp packages, though. (CL packages
solve name clashes, but they don't provide encapsulation. Internal
symbols can be freely accessed via qualified names.)
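
A small illustration of that point (the package and function names are
made up):

    (defpackage :demo (:use :common-lisp) (:export #:public-fn))
    (in-package :demo)
    (defun secret () 42)            ; internal, not exported
    (defun public-fn () (secret))

    (in-package :common-lisp-user)
    (demo:public-fn)                ; => 42
    (demo::secret)                  ; => 42 -- the double colon reaches
                                    ;    inside anyway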

ok> One thing I have learned to dislike is environments where the debugging
ok> environment has different semantics from the production environment. For
ok> example, if functions _can_ be redefined somehow in the debugging
ok> environment but not in the production environment, then you are debugging
ok> a different language from the one your shipped application is written in.

Well, I thought having both an interpreter and a compiler makes debugging
easier. Following your argument, switching compiler settings or adding
declarations to programs also changes the language. So can this
be counted as an advantage of ISLisp?

ok> Oh yes, the idea that function calls can _always_ be mapped to a simple
ok> call instruction in C is quite a few years behind the times. Windows,
ok> OS/2, VMS, and modern UNIX systems, all have some form of dynamic linking.

My point is that calls to redefinable Scheme functions may add an extra
indirection on top of the one required by the OS's dynamic linking. Of
course, you can avoid it by throwing away the assembler and linker
provided by the OS and reimplementing them yourself. Another SMOP.

Jeffrey Mark Siskind

unread,
Dec 21, 1995, 3:00:00 AM12/21/95
to

So, here's yet another SMOP. But is it worth doing? How many
programmers modify variables bound by LETREC? I read in Lisp Pointers
about a proposal to prohibit assignment to such variables.

There is no reason to since a compiler can very efficiently determine when a
program doesn't modify a bound variable.

Jeffrey Mark Siskind

unread,
Dec 21, 1995, 3:00:00 AM12/21/95
to

Actually, my example illustrates that CALL/CC disables some
optimizations even if it isn't really used. The problem is the
*possibility* of a captured continuation.

My point is that, in the presence of CALL/CC, the compiler can't prove
that a form after an (unknown) function call is evaluated just once.
Hence, the compiler can't move any side-effecting forms across an
(unknown) function call, and can't remove some side effects.

This is incorrect. If you make a closed-world assumption and batch compile an
entire application (yes this is compatible with a strict interpretation of
R4RS) then it is possible to detect that a program doesn't use CALL/CC and
perform all optimizations as if the language didn't contain CALL/CC. But you
can do better than that. You can determine on an expression by expression
basis which expressions might be reentered due to a continuation call and
control your optimizations accordingly. See my previous posting for more
details.

Erik Naggum

unread,
Dec 22, 1995, 3:00:00 AM12/22/95
to
[Erik Naggum]

| I find ISLisp a depressing development. it appears unnecessary, and it

| is gratuitously different from Common Lisp. have you failed to realize
| that Lispers are facing people who want nothing stronger than to
| ridicule Lisp because they don't understand it and so don't want to use
| it? what better weapon to give them than to point out that even
| Lispers don't want to talk each others' languages?

[MAEDA Atusi] (supercite undone)

| Do you mean you want single standard, instead of several parallel
| standards (as we have now)?

I'm curious which "several parallel standards" you mean. is it IEEE Scheme
and ANSI Common Lisp? as far as I'm concerned, that's one, single standard
for _each_ of those languages. in this sense, I already have what I want.

| Then that's what ISLisp is intended to be.

do you mean that ISLisp will cause IEEE Scheme and ANSI Common Lisp to go
away? that's an amazing attitude. unless it takes important steps to the
contrary, ISLisp will clutter up the Lisp world even _more_ than the
current set of standard and non-standard Lisps do. (see my followup to
Fernando D. Mato Mira for a suggestion.)

when Unicode was marketed at the heaviest, they also clamored on about how
Unicode would be the new "single standard, instead of several parallel
standards" for character representation. what happened? we got _another_
parallel standard for character representation, and then Unicode went into
a phase of random cell division and now we have numerous kinds of Unicode.

| Or are you asking for accepting Common Lisp (or one of other existing
| standards) as international standard?

of course I'm asking that instead of going ahead to create yet another Lisp
standard, we use the one that successfully became a standard.

| If the standard, as a result of deep arguments on individual features,
| eventually becomes exactly the same as Common Lisp, then that's fine.
| I'm willing to accept it. But I don't think modification is
| automatically a bad thing.

"modification" is neutral. the _reason_ for making modifications may be
good or bad. a change may be an improvement with a strong consensus behind
it, one that users have essentially already adopted and are just waiting
for their standard to reflect. a change may also be a gratuitous departure
from the past and the consensus among users. ISLisp represents the latter
in the areas where it does not do useful invention (like the way it treats
dynamic binding), but invention in committees is not building a consensus,
it's an attempt to force something down someone's throat, however immature.
(this latter point is unfortunately true of many standards in information
technology published in the last few years.)

#<Erik 3028579343>

Erik Naggum

unread,
Dec 22, 1995, 3:00:00 AM12/22/95
to
[Fernando D. Mato Mira]

| What hurdles, if any, exist to implement ISO Lisp as a thin layer
| of macrology on top of Common Lisp?

I see this as implying that it would be desirable to have a Common Lisp
package that implemented ISLisp.

if I may stretch the idea, I think one should _define_ International
Standard Lisp as a package that can be referenced in existing Lisp systems
(probably several, not just the _standard_ ones). this is not a trivial
exercise, but it keeps with the spirit of the Norwegian suggestion (mine,
actually, although it was not as new in the real world as it was to ISO <--
understatement) that ISO standards should have a free reference
implementation before they are fully adopted. I made that suggestion during
discussions about the procedures for ISO recognition of Publicly Accessible
Specifications (PAS), where it was adopted with much less enthusiasm than I
had hoped (essentially "yeah, great idea. what's the next item?").

C++ got jump-started by free-loading on C compilers, a devilishly clever
move in retrospect, but probably one of necessity at the time. similarly,
I don't think ISLisp has _any_ chance of becoming a winning standard Lisp
unless it is piggy-backing on previous work and available compilers, just
as I don't think C++ would have a snowball's chance in hell of winning if
the current ISO draft was thrown in implementors' faces with an "implement
_this_" attitude.

probably forging a bad pun, I'd like to think of this as the incrementality
of standards editors, with stress on "mentality".

#<Erik 3028583669>

Barry Margolin

unread,
Dec 22, 1995, 3:00:00 AM12/22/95
to
In article <4b9o38$6...@info.epfl.ch>,
Fernando D. Mato Mira <mato...@lig.di.epfl.ch> wrote:
>What hurdles, if any, exist to implement ISO Lisp as a thin layer
>of macrology on top of Common Lisp?

I believe that Kent Pitman was acting as X3J13's representative to the ISO
Lisp committee during the time that ISLisp was being finalized. One of his
goals in this role was to make sure that ISLisp would be compatible enough
with Common Lisp to allow such a package to be written. We also made some
small, last-minute changes to Common Lisp to enable this (sorry, I can't
think of any specific ones right now).

The layer of macrology would probably be relatively "thick", not thin, but
it should be doable.
--
Barry Margolin
BBN PlaNET Corporation, Cambridge, MA
bar...@bbnplanet.com
Phone (617) 873-3126 - Fax (617) 873-6351

Bruno Haible

unread,
Dec 22, 1995, 3:00:00 AM12/22/95
to
MAEDA Atusi <m...@math.keio.ac.jp> wrote:
>
> ISLisp's approach has a big disadvantage. Although the draft states
> that it does not specify how to prepare an ISLisp text for execution
> and how to execute it, the fact that it forbids redefinition of
> functions effectively kills the usefulness of READ-EVAL-PRINT style
> interpreters. A toplevel loop for ISLisp, if one exists, will look more
> like a GDB command line than an ordinary Lisp environment.

Your interpretation of the forbidden redefinition of functions is
unfortunate. The ISLisp text specifies what a correct ISLisp program
is, but it does not intend to specify the interpreter.
It is of course expected that function redefinition is possible in a
debugging environment.

An interpreter and debugging environment always extends the language
spec: by providing debugging commands such as `inspect-object',
environmental commands such as `exit', documentation such as `help',
and by not enforcing some language restrictions.

For example, no CL implementation forbids you to execute
(setf (car '(a . b)) 'x)
in the interpreter, although this is illegal according to CLtL2.

The major mistake in the design of CL was to consider debugging commands
and features (such as redefinability of functions) as being part of the
language, hence meaningful in files to be compiled, and this is what
ISLisp tries to address.


----------------------------------------------------------------------------
Bruno Haible net: <hai...@ilog.fr>
ILOG S.A. tel: +33 1 4908 3585
9, rue de Verdun - BP 85 fax: +33 1 4908 3510
94253 Gentilly Cedex url: http://www.ilog.fr/
France url: http://www.ilog.com/

LMayka

unread,
Dec 24, 1995, 3:00:00 AM12/24/95
to
One ISO-inspired addition that I remember was DEFINE-SYMBOL-MACRO. This
makes it possible, for example, to implement an ISLisp global variable as
a Common Lisp reader-writer function pair.
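
A sketch of that technique (*COUNTER-CELL*, READ-COUNTER, WRITE-COUNTER,
and COUNTER are all made-up names):

    ;; An "ISLisp-style" global presented through a reader/writer pair,
    ;; but still usable with ordinary variable syntax and SETF.
    (defvar *counter-cell* 0)
    (defun read-counter () *counter-cell*)
    (defun write-counter (new) (setf *counter-cell* new))
    (defsetf read-counter write-counter)

    (define-symbol-macro counter (read-counter))

    ;; counter            => 0
    ;; (setf counter 42)  => 42, via WRITE-COUNTER
    ;; counter            => 42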

- Larry Mayka

Pekka P. Pirinen

unread,
Dec 27, 1995, 3:00:00 AM12/27/95
to
(CL's DEFINE-SETF-METHOD is a mess).
It is much more complicated than DEFSETF, but for what reason?
DEFSETF is just fine for almost all purposes. I know that
DEFINE-SETF-METHOD is necessary when we want to define a setf method for
some place forms such as (LDB ...) and (MASK-FIELD ...), but they are
all special in that they do not destructively modify the specified data
structure. [...]

If such special cases are the only purpose for DEFINE-SETF-METHOD,
then I think it isn't worth having it. Or is there a more obvious
reason to have it? If so, enlighten me, please.

I'd say the real reason is that Lisp wizards want access to the
internals so they can leverage off those mechanisms, when they're
extending the language. DEFINE-SETF-METHOD is only half of it; the
other half is that GET-SETF-METHOD returns the same five values.

In practical terms, this allows writing macros that use _generalized
variables_ for operations more sophisticated than just setting, such
as SHIFTF, ROTATEF, PUSH, POP, and GETF. It also allows extending the
idea to places such as APPLY, THE, VALUES -- all now part of ANSI.
We've also found ASSOC a useful place, and there was an article in Lisp
Pointers some years ago that suggested a use for QUOTE as a place! I
notice that we've even used it in the LispWorks sources to implement a
few places that could have been written using DEFSETF, for efficiency
or clarity of code.
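
A toy example of the kind of leverage meant here, in the CLtL2 style
(DOUBLEF is a made-up macro; ANSI renamed GET-SETF-METHOD to
GET-SETF-EXPANSION):

    ;; Double whatever is stored in a generalized variable, evaluating
    ;; the place's subforms only once.
    (defmacro doublef (place)
      (multiple-value-bind (temps vals stores store-form access-form)
          (get-setf-method place)
        `(let* (,@(mapcar #'list temps vals)
                (,(car stores) (* 2 ,access-form)))
           ,store-form)))

    ;; (let ((v (vector 1 2 3))) (doublef (aref v 1)) v)   => #(1 4 3)
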
--
Pekka P. Pirinen
Harlequin Limited, Cambridge, UK

HStearns

unread,
Jan 2, 1996, 3:00:00 AM1/2/96
to
Subject: Re: ISO on Common (was: ISO/IEC CD 13816 -- ISLisp)
From: bar...@tools.bbnplanet.com (Barry Margolin)
Date: 22 Dec 1995 20:06:53 -0500
Message-ID: <4bfknd$b...@tools.bbnplanet.com>

In article <4b9o38$6...@info.epfl.ch>,
Fernando D. Mato Mira <mato...@lig.di.epfl.ch> wrote:
>What hurdles, if any, exist to implement ISO Lisp as a thin layer
>of macrology on top of Common Lisp?

I believe that Kent Pitman was acting as X3J13's representative to the ISO
Lisp committee during the time that ISLisp was being finalized. One of his
goals in this role was to make sure that ISLisp would be compatible enough
with Common Lisp to allow such a package to be written. We also made some
small, last-minute changes to Common Lisp to enable this (sorry, I can't
think of any specific ones right now).

I seem to remember Kent mentioning on the last day of LUV-95 that he and
some others have played around with a partial implementation of ISLisp
within CL. Is anybody listening that can comment? Kent?

Jeff Dalton

unread,
Jan 4, 1996, 3:00:00 AM1/4/96
to
In article <4bfatn$9...@nz12.rz.uni-karlsruhe.de> hai...@ilog.fr (Bruno Haible) writes:
>
>The major mistake in the design of CL was to consider debugging commands
>and features (such as redefinability of functions) as being part of the
>language, hence meaningful in files to be compiled, and this is what
>ISLisp tries to address.

That is at least very misleading. You, and the others who
take this line, need to bear in mind that things are not that
simple and clear-cut.

It's true that Common Lisp has some debugging features in the
"language". TRACE is a good example. It's questionable whether
this was a mistake rather than just one of several reasonably
good ways of doing things. It's a bit extreme, to say the least,
to say it's "the major mistake in the design of Common Lisp".

In any case, function redefinition is in another category.

No one says assignment to global variables is a debugging command, or
meaningless in files to be compiled. Redefining a function is not all
that different. In a "one namespace" Lisp (such as Scheme), it's the
same. Moreover, in some programs redefining functions at run-time
makes sense. I've written a number of programs that redefine
functions in ways that are not debugging or development. And I
compile the code as well.

ISLisp was supposed to do something that EuLisp was also supposed
to do, namely to leave function redefinition out of the language
and leave it up to implementations. The idea was to avoid saying
that conforming implementations had to support redefinition, and
hence to allow (as conforming) implementations that performed
certain kinds of static analysis.

(I say "supposed", because it's possible people have messed it up
and ended up going further than they should have.)

-- jd

Jeff Dalton

unread,
Jan 4, 1996, 3:00:00 AM1/4/96
to

>ISLisp's approach has a big disadvantage. Although the draft states
>that it does not specify how to prepare an ISLisp text for execution
>and how to execute it, the fact that it forbids redefinition of
>functions effectively kills the usefulness of READ-EVAL-PRINT style
>interpreters. A toplevel loop for ISLisp, if one exists, will look more
>like a GDB command line than an ordinary Lisp environment.

I am not convinced that ISLisp forbids implementations from allowing
function redefinition. What is your textual evidence from the ISLisp
definition?

-- jd

Jeff Dalton

unread,
Jan 4, 1996, 3:00:00 AM1/4/96
to
In article <19951222T...@arcana.naggum.no> Erik Naggum <er...@naggum.no> writes:
>
>| Do you mean you want single standard, instead of several parallel
>| standards (as we have now)?
>
>I'm curious which "several parallel standards" you mean. is it IEEE Scheme
>and ANSI Common Lisp? as far as I'm concerned, that's one, single standard
>for _each_ of those languages. in this sense, I already have what I want.
>
>| Then that's what ISLisp is intended to be.
>
>do you mean that ISLisp will cause IEEE Scheme and ANSI Common Lisp to go
>away? that's an amazing attitude.

That is not what ISLisp is for. It's another language in the Lisp
family (along with CL and Scheme), not a standard for all of Lisp.

>when Unicode was marketed at the heaviest, they also clamored on about how
>Unicode would be the new "single standard, instead of several parallel
>standards" for character representation.

Also? There's no clamoring on about ISLisp being a new, single
standard. There's just one news article from one person who seems
to be under a mistaken impression.

-- jd

MAEDA Atusi

unread,
Jan 8, 1996, 3:00:00 AM1/8/96
to
In article <DKo7rq.Lw4...@cogsci.ed.ac.uk> je...@cogsci.ed.ac.uk (Jeff Dalton) writes:

jeff> I am not convinced that ISLisp forbids implementations from allowing
jeff> function redefinition. What is your textual evidence from the ISLisp
jeff> definition?

"For each namespace, defining forms can occur at most once for the

same name ..." (Working Draft 15.6; 4.4)

"The binding between function-name and the function object is
immutable." (4.8)

"It is a violation if there is attempt to change an immutable binding
(error-id. immutable-binding)." (1.7)

------------------------------------
Re possibility of ISLisp implementation that allows redefinition:

If an implementation provides a read-eval-print loop as its primary user
interface and if it allows mutation of immutable bindings, such a
feature cannot be regarded as an extension to the language. Rather, it
looks more like an omitted check. I think such an implementation is
non-conforming because it fails to detect the violation.

"An ISLisp processor complying with the requirements of this document
shall ... reject any text that contains any textual usage which this
document explicitly defines to be a violation ..." (1.9)

Special debugger or switches to control the toplevel loop behavior
may be convenient, however. (Like some C development environment which
allows incremental function redefinition).

--mad

MAEDA Atusi

unread,
Jan 8, 1996, 3:00:00 AM1/8/96
to
(I failed to post this article and retried several times. Sorry if
you see this many times).

> [Erik Naggum]
> | I find ISLisp a depressing development. it appears unnecessary, and it
> | is gratuitously different from Common Lisp. have you failed to realize
> | that Lispers are facing people who want nothing stronger than to
> | ridicule Lisp because they don't understand it and so don't want to use
> | it? what better weapon to give them than to point out that even
> | Lispers don't want to talk each others' languages?

> [MAEDA Atusi] (supercite undone)

> | Do you mean you want single standard, instead of several parallel
> | standards (as we have now)?

> and precisely which several parallel standards do you mean? IEEE Scheme
> and ANSI Common Lisp, right? as far as I'm concerned, that's one, single
> standard for each of those languages. in this sense, I already have what I
> want.

> | Then that's what ISLisp is intended to be.

> do you mean that ISLisp will cause IEEE Scheme and ANSI Common Lisp to go
> away? that's an amazing attitude, to put it mildly. ISLisp will clutter
> up the Lisp world even _more_ than the current set of standard and
> non-standard Lisps do.

Please don't extract a sentence out of its context. My sentence above
should be read "(if you want single standard) then that's what ISLisp
is intended to be."

[Unicode stuff deleted]


> | Or are you asking for accepting Common Lisp (or one of other existing
> | standards) as international standard?

> again, _which_ "other existing standards"? of course I'm asking that
> instead of going ahead to create yet another Lisp standard, we use the one
> that successfully became a standard.

Why do you believe two is the best number for standards? I agree with
you that Common Lisp and Scheme each has its own importance. And
ISLisp emphasizes another aspect. It tries to be a language which is
small and easy to implement efficiently.

> | If the standard, as a result of deep arguments on individual features,
> | eventually becomes exactly the same as Common Lisp, then that's fine.
> | I'm willing to accept it. But I don't think modification is
> | automatically a bad thing.

> "modification" is neutral. it's the _reason_ for making modifications that


> may be good or bad. a change may be an improvement with a strong consensus

> behind it, one that users have essentially already adopted and are just for
> the standard to reflect. a change may also be a gratuitous destruction of
> the past and of the consensus among users. ISLisp represents the latter in


> the areas where it does not do useful invention (like the way it treats

> dynamic binding), but invention in committees is not building a consensus.

Yes, a change can be good or bad. And "the past" alone doesn't make
it good or bad. And consensus can be built after a proposal, through
argument. How else can changes be made, other than by proposing them
first?

BTW, I recall there were many arguments against Common Lisp. Some
people said: "Lisp doesn't need a standard at all", "Common Lisp is
largely incompatible with existing implementations (e.g. in lexical
variable bindings)", "Some features of Common Lisp (e.g. multiple
values, dynamically adjustable arrays, etc.) have significant
overhead", etc.

And now Common Lisp seems to be successful and it is widely accepted.

Some inventions in committee may be discarded if they turn out to be
bad and some may survive. If *all* changes proposed by the committee
are discarded, then we have a consensus rejecting the whole standard.
So why not talk about technical and/or practical aspects of the
language, instead of talking about politics?

Jeff Dalton

unread,
Jan 10, 1996, 3:00:00 AM1/10/96
to
In article <MAD.96Ja...@tanzanite.math.keio.ac.jp> m...@math.keio.ac.jp writes:
>In article <DKo7rq.Lw4...@cogsci.ed.ac.uk> je...@cogsci.ed.ac.uk (Jeff Dalton) writes:
>
> jeff> I am not convinced that ISLisp forbids implementations from allowing
> jeff> function redefinition. What is your textual evidence from the ISLisp
> jeff> definition?
>
>"For each namespace, defining forms can occur at most once for the
>same name ..." (Working Draft 15.6; 4.4)
>
>"The binding between function-name and the function object is
>immutable." (4.8)
>
>"It is a violation if there is attempt to change an immutable binding
>(error-id. immutable-binding)." (1.7)

Ok, thanks.

>Re possibility of ISLisp implementation that allows redefinition:
>
>If an implementation provides a read-eval-print loop as its primary user
>interface and if it allows mutation of immutable bindings, such a
>feature cannot be regarded as an extension to the language.

How do you know? And why does it matter? Does the standard
even provide for read-eval-print loops? Does it even contain
eval?

>Rather, it
>looks more like an omitted check. I think such an implementation is
>non-conforming because it fails to detect violation.
>
>"An ISLisp processor complying with the requirements of this document
>shall ... reject any text that contains any textual usage which this
>document explicitly defines to be a violation ..." (1.9)

There's something weird going on here. If it's supposed to reject
the text, why is there an error id?

>Special debugger or switches to control the toplevel loop behavior
>may be convenient, however. (Like some C development environment which
>allows incremental function redefinition).

Just so.

-- jeff

Jeff Dalton

unread,
Jan 10, 1996, 3:00:00 AM1/10/96
to

> > | Then that's what ISLisp is intended to be.
>
> > do you mean that ISLisp will cause IEEE Scheme and ANSI Common Lisp to go
> > away? that's an amazing attitude, to put it mildly. ISLisp will clutter
> > up the Lisp world even _more_ than the current set of standard and
> > non-standard Lisps do.
>
>Please don't extract a sentence out of its context. My sentence above
>should be read "(if you want single standard) then that's what ISLisp
>is intended to be."

What do you mean by "a single standard"? ISLisp is *not* supposed
to be a standard for all of Lisp; it's supposed to be a standard for
one language in the Lisp family. That's one of the reasons it's
called "ISLisp" instead of "Lisp" (which would then be called "ISO
Lisp" and appear to be a standard for all of Lisp).

This was discussed explicitly at the first meeting of WG-16 (the
ISO committee that produced the ISLisp definition).

-- jeff


MAEDA Atusi

unread,
Jan 19, 1996, 3:00:00 AM1/19/96
to
>>>>> "jeff" == Jeff Dalton <je...@cogsci.ed.ac.uk> writes:
In article <DKz4zA.4Ex...@cogsci.ed.ac.uk> je...@cogsci.ed.ac.uk (Jeff Dalton) writes:

mad> Re possibility of ISLisp implementation that allows redefinition:
mad>
mad> If an implementation provides a read-eval-print loop as its primary user
mad> interface and if it allows mutation of immutable bindings, such a
mad> feature cannot be regarded as an extension to the language.

jeff> How do you know? And why does it matter? Does the standard
jeff> even provide for read-eval-print loops. Does it even contain
jeff> eval?

I said "If". I know that draft standard says nothing about how ISLisp
programs are executed.

Bruno Haible replied to my previous posting and said that (roughly) an
implementation can be conforming to the draft standard even if it
provides a read-eval-print loop that allows function redefinition.
(Sorry, I didn't save his article.)

mad> Rather, it
mad> looks more like an omitted check. I think such an implementation is
mad> non-conforming because it fails to detect violation.
mad>
mad> "An ISLisp processor complying with the requirements of this document
mad> shall ... reject any text that contains any textual usage which this
mad> document explicitly defines to be a violation ..." (1.9)

jeff> There's something weird going on here. If it's supposed to reject
jeff> the text, why is there an error id?

My guess is that the phrase `reject the text' in the draft means
`detect violation', and error-ids are used to identify both errors and
violations.

--mad
