GNU Extension Language Plans

Aaron Watters

Oct 20, 1994, 8:59:03 AM
In article <EBOUCK.94O...@dsdbqvarsa.er.usgs.gov> ebo...@usgs.gov writes:
> GNU Extension Language Plans
> Richard Stallman, GNU Project
>....
>* Step 1. The base language should be modified Scheme, with these features:...

Fine. Please extend Scheme (as I knew it) with a flexible
object-encapsulation/inheritance system and convenient, well
designed, portable interfaces to common OS functionality
and libraries -- for a good example of how to do this you might
look at the Python code. If you want people to USE the language
without fear of illegality, maybe you could use a copyright which
protects your rights without restricting the USE of the language
-- like the one that applies to Python.

In the meantime, since I want a good scripting/extension language
without scary copyright restrictions and with good interfaces to
just about everything I could possibly want NOW, I'll burrow on
ahead using Python.
Aaron Watters
Department of Computer and Information Sciences
New Jersey Institute of Technology
University Heights
Newark, NJ 07102
phone (201)596-2666
fax (201)596-5777
home phone (908)545-3367
email: aa...@vienna.njit.edu
PS:
Personally, I've always found Scheme a little irritating --
ever since I read the standard text on the subject which mentions
arrays somewhere around page 400, in a footnote, without telling
you how to use one. (Do I detect MIT/NIH? Naw.)

Peter da Silva

Oct 20, 1994, 2:21:46 PM
Normally I would applaud an announcement that someone with some credibility
(I can safely say that the FSF have proven themselves credible in the field
of language implementation) was going to do something to resolve the mish-
mash of lisp and scheme and lispoid language implementations out there. But
this announcement doesn't exactly fill my heart with warm fuzzies.

First of all, it starts off with more of the FSF's recent attacks on Tcl, a
fairly innocuous little language that is really quite good at what it does
even if it doesn't do everything Stallman wants. This tends to make me think
that the real reason isn't to fix a problem, but to promote some subtle
political agenda.

This is reinforced when he starts attacking Sun. Attacking Sun is a great
way to get the iconoclasts on your side, but occasionally Sun does come up
with some good things (NeWS, for example, which people stayed away from in
droves, largely due to FUD) and the folks bashing Sun miss the boat (DEC
has done some good things with OSF/1, but nobody else has managed to pull
a rabbit out of that hat).

So, is the Gnu Extension Language going to be another Motif, another RFS,
another OSF/1? Some great smelly monster that even if it succeeds makes nobody
happy?

That brings me to my second problem... the list of extensions. It seems like
about half the announcement is extensions to scheme. Come on, the big advantage
to lisp-like languages is the way their simple semantics and syntax can be
used to bootstrap very powerful concepts.

What's wrong with taking some existing implementation, like STk's interpreter,
and adding a modicum of string and O/S functions? If Tcl is an unholy cross
of Lisp and Awk, this is sounding like some similarly sanctified marriage of
Lisp and Perl.

Oh well, at least it looks like he's got a clue about licensing...
--
Peter da Silva `-_-'
Network Management Technology Incorporated 'U`
1601 Industrial Blvd. Sugar Land, TX 77478 USA
+1 713 274 5180 "Hast Du heute schon Deinen Wolf umarmt?"

Barry Margolin

Oct 20, 1994, 5:22:42 PM
In article <1994Oct20.1...@njitgw.njit.edu> aa...@vienna.njit.edu (Aaron Watters) writes:
>Personally, I've always found Scheme a little irritating --
>ever since I read the standard text on the subject which mentions
>arrays somewhere around page 400, in a footnote, without telling
>you how to use one. (Do I detect MIT/NIH? Naw.)

The standard reference for Scheme is the "Revised**4 Report on the
Algorithmic Language Scheme", which is only 55 pages long. I suspect
you're referring to the "Structure and Interpretation of Computer Programs".
As the name implies, this is a text on programming, not on any particular
programming language. This book is oriented towards teaching about
advanced programming structures such as closures, streams, and
continuations, which are unique to Scheme and similar languages; mundane
features like arrays do admittedly get little coverage. The course assumes
that the student has some prior, basic programming skills, so already knows
how to use arrays.

--

Barry Margolin
BBN Internet Services Corp.
bar...@near.net

Wayne A. Christopher

Oct 20, 1994, 5:54:20 PM
I have a few concerns about the way GNUScript is supposed to operate.

1. It appears that there are to be two modules -- a compiler for a given
extension language, and a Scheme-based runtime interpreter. It sounds
like the compiler will be GPL'ed and the runtime won't be. But what if
I want to call "eval some-extension-language-command" from a running program?
Will it get compiled into Scheme on the fly (and then into C and dynamically
loaded, even)? I think this will be a big performance hit, and furthermore
it will require every application to include the GPL'ed compiler, which
defeats the purpose of unencumbering the runtime.

2. Will the Scheme runtime need something like Boehm's garbage collector?
I'm sure there are applications that can't use this sort of system -- for
example, ones that maintain pointers to in-core objects in external
storage but not internally (for whatever reason).

3. Tcl is very popular in embedded applications where code size is critical.
It seems that the Scheme interpreter plus the garbage collector plus the
compiler would be a lot larger.

It's definitely a cute idea, but I'm not sure it's very practical...

Wayne

Christophe Muller

Oct 20, 1994, 8:47:59 AM
In article <941019042...@mole.gnu.ai.mit.edu> Tom Lord <lo...@gnu.ai.mit.edu> writes:

> * Distribution conditions.

> We will permit use of the modified Scheme interpreter in proprietary
> programs, so as to compete effectively with alternative extensibility
> packages.

Good.

> Translators from other languages to modified Scheme will not be part
> of any application; each individual user will decide when to use one
> of these. Therefore, there is no special reason not to use the GPL as
> the distribution terms for translators. So we will encourage
> developers of translators to use the GPL as distribution terms.

This simply means that I _won't_ use the modified scheme in my application..
full stop. Because I consider that Tcl syntax (and even "set x 12" more than
the C-like "x=12") is much more readable than any Elisp or Scheme. I'd like to
use scheme as a more powerful language as a developer, but I will not impose
this choice on my customers who only know mouse manipulations or, in the best
case, Fortran! Proposing scheme, Lisp, or a C-like language as an extension language
for customers is missing the point IMHO..

And BTW, I agree that it should be possible to write large programs as scripts
or extensions (and it will be more and more the case as CPU power gets
cheaper) *but* neither Elisp nor Tcl can serve this purpose today (still IMHO).
The reason? They are not modular, they are not object oriented. You have to be
a very good and cautious programmer in order to write 100,000 lines of
*maintainable* Lisp or Tcl code.. So the conclusion (Tcl is bad) seems very
strange to me, especially given the fact that [Incr Tcl] exists! I'm
surprised nobody has raised this issue.. If you really want the "very best"
extension language, you should try interpreted Eiffel rather than Elisp!!!
(which does exist, in the "melting ice" development technology).

Anyway. If Tcl2Scheme is copylefted, Sun people do not have to worry any
more.. :-)

Cheers,
Christophe.

= Are you the police? -- No ma'am, we're musicians. =

Bryan O'Sullivan

Oct 21, 1994, 8:54:57 AM
fau...@remarque.berkeley.edu (Wayne A. Christopher) writes:

>2. Will the Scheme runtime need something like Boehm's garbage collector?

I can't imagine why it would. It's fairly straightforward to have your
Scheme (or whatever) system maintain pointers into C land and vice
versa, with rather less magical support from the RTS than the Parc GC
gives (the system I've seen uses structures called "malloc" and
"stable" pointers, respectively, to point in each direction).

What GNUscript's RTS does absolutely need is a decent generational
garbage collector, so that it will provide reasonably sane interactive
performance. One of the things that regularly makes me want to kick my
workstation through a window is the GC and buffer relocation burps in
Emacs.

<b

--
Bryan O'Sullivan email: b...@cyclic.com, bosu...@maths.tcd.ie
Department of Poverty wuh wuh wuh: http://www.scrg.cs.tcd.ie/~bos
Trinity College, Dublin nous n'avons qu'un peu de mouton aujourd'hui.

Tom Lord

Oct 19, 1994, 12:20:18 AM
GNU Extension Language Plans
Richard Stallman, GNU Project

[Please redistribute widely]

Many software packages need an extension language to make it easier
for users to add to and modify the package.

In a previous message I explained why Tcl is inadequate as an
extension language, and stated some design goals for the extension
language for the GNU system, but I did not choose a specific
alternative.

At the time, I had not come to a conclusion about what to do. I knew
what sort of place I wanted to go, but not precisely where or how to
get there.

Since then, I've learned a lot more about the topic. I've read about
scsh, Rush and Python, and talked with people working on using Scheme
as an extension and scripting language. Now I have formulated a
specific plan for providing extensibility in the GNU system.


Who chooses which language?

Ousterhout, the author of Tcl, responded to my previous message by
citing a "Law" that users choose the language they prefer, and
suggested that we each implement our favorite languages, then sit back
and watch as the users make them succeed or fail.

Unfortunately, extension languages are the one case where users
*cannot* choose the language they use. They have to use the language
supported by the application or tool they want to extend. For
example, if you wanted to extend PDP-10 Emacs, you had to use TECO.
If you want to extend GNU Emacs, you have to use Lisp.

When users simply want "to write a program to do X or Y," they can use
any language the system supports. There's no reason for system
designers to try to decide which language is best. We can instead
provide as many languages as possible, to give each user the widest
possible choice. In the GNU system, I would like to support every
language people want to use--provided someone will implement them.

With the methods generally used today, we cannot easily provide many
languages for extending any particular utility or application package.
Supporting an extension language means a lot of work for the developer
of the package. Supporting two languages is twice as much work,
supposing the two fit together at all. In practice, the developer has
to choose a language--and then all users of the package are stuck with
that one. For example, when I wrote GNU Emacs, I had to decide which
language to support. I had no way to let the users decide.

When a developer chooses Tcl, that has two consequences for the
users of the package:

* They can use Tcl if they wish. That's fine with me.

* They can't use any other language. That I consider a problem.

Sometimes developers choose a language because they like it. But not
always. Sun recently announced a campaign to "make Tcl the universal
scripting language." This is a campaign to convince all the
developers who *don't* prefer Tcl that they really have no choice.
The idea is that each one of us will believe that Sun will inevitably
convince everyone else to use Tcl, and each of us will feel compelled
to follow where we believe the rest are going.

That campaign is what led me to decide that I needed to speak to the
community about the issue. By announcing on the net that GNU software
packages won't use Tcl, I hope to show programmers that not everyone
is going to jump on the Tcl bandwagon--so they don't have to feel
compelled to do so. If developers choose to support Tcl, it should be
because they want to, not because Sun convinces them they have no
choice.


Design goals for GNU

When you write a program, or when you modify a GNU program, I think
you should be the one who decides what to implement. I can't tell you
what language to support, and I wouldn't want to try.

But I am the leader of one particular project, the GNU project. So I
make the decision about which packages to include in the GNU operating
system, and which design goals to aim for in developing the GNU
system.

These are the design goals I've decided on concerning extension
languages in the GNU system:

* As far as possible, all GNU packages should support the same
extension languages, so that a user can learn one language (any one of
those we support) and use it in any package--including Emacs.

* The languages we support should not be limited to special, weak
"scripting languages". They should be designed to be good for writing
large programs as well as small ones.

My judgement is that Tcl can't satisfy this goal. (Ousterhout seems
to agree that Tcl doesn't serve this goal. He thinks that doesn't
constitute a problem--I think it does.) That's why I've decided not
to use Tcl as the main system-wide extension language of the GNU
system.

* It is important to support a Lisp-like language, because such
languages provide certain special kinds of power, such as representing
programs as data in a structured way that can be decoded without parsing
(see the short sketch at the end of this list).

** It is desirable to support Scheme, because it is simple and clean.

** It is desirable to support Emacs Lisp, for compatibility with Emacs
and the code already written for Emacs.

* It is important to support a more usual programming language syntax
for users who find Lisp syntax too strange.

* It would be good to support Tcl as well, if that is easy to do.
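
As a small illustration of the "programs as data" point above -- plain
Scheme, nothing GNU-specific assumed, purely for flavor:

    ;; Once read, a fragment of program text is just a list that can be
    ;; taken apart with ordinary list operations; no separate parser needed.
    (define x 3)
    (define expr '(if (> x 0) 'positive 'non-positive))

    (car expr)     ; => if        (the operator is the first element)
    (cadr expr)    ; => (> x 0)   (the condition is an ordinary sublist)
    (eval expr)    ; => positive  (eval's exact calling convention varies by Scheme)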


The GNU extension language plan

Here is the plan for achieving the design goals stated above.

* Step 1. The base language should be modified Scheme, with these features:

** Case-sensitive symbol names.
** No distinction between #f and (), for the sake of supporting Lisp
as well as Scheme.

** Convenient fast exception handling, and catch and throw (see the
sketch after this list).
** Extra slots in a symbol, to better support
translating other Lisp dialects into Scheme.
** Multiple obarrays.
** Flexible string manipulation functions.
** Access to all or most of the Unix system calls.
** Convenient facilities for forking pipelines,
making redirections, and so on.
** Two interfaces for call-outs to C code.
One allows the C code to work on arbitrary Scheme data.
The other passes strings only, and is compatible with Tcl
C callouts provided the C function does not try to call
the Tcl interpreter.
** Cheap built-in dynamic variables (as well as Scheme's lexical variables).
** Support for forwarding a dynamic variable's value
into a C variable.
** A way for applications to define additional Scheme data types
for application-specific purposes.
** A place in a function to record an interactive argument reading spec.
** An optional reader feature to convert nil to #f and t to #t,
for the sake of supporting Lisp as well as Scheme.
** An interface to the library version of expect.
** Backtrace and debugging facilities.

All of these things are either straightforward or have already been
done in Scheme systems; the task is to put them together. We are
going to start with SCM, add some of these features to it, and write
the rest in Scheme, using existing implementations where possible.
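
To give a flavor of how little new machinery some of these items need,
here is a rough sketch of catch and throw built on plain
call-with-current-continuation (illustrative only -- not the actual SCM
code, and it ignores unwind-protect-style cleanups; `error' stands for
whatever error-reporting procedure the Scheme provides):

    (define *catch-stack* '())          ; list of (tag . escape-continuation)

    (define (catch tag thunk)
      (call-with-current-continuation
        (lambda (k)
          (set! *catch-stack* (cons (cons tag k) *catch-stack*))
          (let ((result (thunk)))
            (set! *catch-stack* (cdr *catch-stack*))
            result))))

    (define (throw tag value)
      (let loop ((stack *catch-stack*))
        (cond ((null? stack) (error "no catch for tag" tag))
              ((eq? (caar stack) tag)
               (set! *catch-stack* (cdr stack))  ; drop handlers inside the catch
               ((cdar stack) value))             ; jump out to the matching catch
              (else (loop (cdr stack))))))

    ;; (catch 'done (lambda () (throw 'done 42) 'never-reached))   => 42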

* Step 2. Other languages should be implemented on top of Scheme.

** Rush is a cleaned-up version of the Tcl language, which runs far
faster than Tcl itself, by means of translation into Scheme. Some
kludgy but necessary Tcl constructs don't work in Rush, and Tcl
aficionados may be unhappy about this; but Rush provides cleaner ways
to get the same results, so users who write extensions should like it
better. Developers looking for an extension language are likely to
prefer Rush to Tcl if they are not already attached to Tcl.
Here are a couple of examples supplied by Adam Sah:

*** To pass an array argument without copying it, in Tcl you must use
upvar or make the array a global variable. In Rush, you can simply
declare the argument "pass by reference".

*** To extract values from a list and pass them as separate arguments
to a function, in Tcl you must construct a function call expression
using that list, and then evaluate it. This can cause trouble if the
other arguments contain text that includes any special Tcl syntax. In
Rush, the apply function handles this simply and reliably (see the
short sketch after these examples).

*** Rush eliminates the need for the "expr" command by allowing infix
mathematical expressions and statements. For example, the Tcl
computation `set a [expr $b*$c]' can be written as `a = b*c' in
Rush. (The Tcl syntax works also.)
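
Taking the apply example above: once the code is translated into Scheme,
the problem disappears, because apply spreads a list out as individual
arguments without ever re-parsing text (illustrative Scheme, not Rush
syntax; move and args are made-up names):

    (define (move x y label) (list 'moved-to x y label))
    (define args (list 10 20 "a string with [brackets] and $dollars"))

    (apply move args)
    ;; => (moved-to 10 20 "a string with [brackets] and $dollars")
    ;; The Tcl-special characters in the string are never re-interpreted.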

Some references:

[SBD94] Adam Sah, Jon Blow and Brian Dennis. "An Introduction to the Rush
Language." Proc. Tcl'94 Workshop. June, 1994.
ftp://ginsberg.cs.berkeley.edu:pub/papers/asah/rush-tcl94.*

[SB94] Adam Sah and Jon Blow. "A New Architecture for the Implementation of
Scripting Languages." Proc. USENIX Symp. on Very High Level Languages.
October, 1994. to appear.
ftp://ginsberg.cs.berkeley.edu:pub/papers/asah/rush-vhll94.*

** It appears that Emacs Lisp can be implemented efficiently by
translation into modified Scheme (the modifications are important).

** Python appears suitable for such an implementation, as far as I can
tell from a quick look. By "suitable" I mean that mostly the same
language could be implemented--minor changes in semantics would be ok.
(It would be useful for someone to check this carefully.)

** A C-like language syntax can certainly be implemented this way.

* Distribution conditions.

We will permit use of the modified Scheme interpreter in proprietary
programs, so as to compete effectively with alternative extensibility
packages.

Translators from other languages to modified Scheme will not be part
of any application; each individual user will decide when to use one
of these. Therefore, there is no special reason not to use the GPL as
the distribution terms for translators. So we will encourage
developers of translators to use the GPL as distribution terms.


Conclusion

Until today, users have not been able to choose which extension
language to use. They have always been compelled to use whichever
language is supported by the tool they wish to extend. And that has
meant many different languages for different tools.

Adopting Tcl as the universal scripting language offers the
possibility of eliminating the incompatibility--users would be able to
extend everything with just one language. But they wouldn't be able
to choose which language. They would be compelled to use Tcl and
nothing else.

By making modified Scheme the universal extension language, we can
give users a choice of which language to write extensions in. We can
implement other languages, including modified Tcl (Rush), a Python
variant, and a C-like language, through translation into Scheme, so
that each user can choose the language to use. Even users who choose
modified Tcl will benefit from this decision--they will be happy with
the speedup they get from an implementation that translates into
Scheme.

Only Scheme, or something close to Scheme, can serve this purpose.
Tcl won't do the job. You can't implement Scheme or Python or Emacs
Lisp with reasonable performance on top of Tcl. But modified Scheme
can support them all, and many others.

The universal extension language should be modified Scheme.


Request for Volunteers

If you understand Scheme implementation well, and you want to
contribute a substantial amount of time to this project, please send
mail to Tom Lord, lo...@gnu.ai.mit.edu.

If you expect to have time later but don't have time now, please send
mail when you do have time to work. Participation in a small way is
probably not useful until after the package is released.

Aaron Watters

Oct 21, 1994, 8:30:18 PM
In article <386n33$1...@tools.near.net> bar...@nic.near.net (Barry Margolin) writes:
>In article <1994Oct20.1...@njitgw.njit.edu> aa...@vienna.njit.edu (Aaron Watters) writes:
>>....Personally, I've always found Scheme a little irritating --
>>ever since I read the standard text on the subject which mentions
>>arrays somewhere around page 400, in a footnote, without telling
>>you how to use one....

>
>The standard reference for Scheme is the "Revised**4 Report on the
>Algorithmic Language Scheme", which is only 55 pages long. I suspect
>you're referring the "Structure and Interpretation of Computer Programs".

You got it!

>As the name implies, this is a text on programming, not on any particular
>programming language. This book is oriented towards teaching about
>advanced programming structures such as closures, streams, and
>continuations, which are unique to Scheme and similar languages; mundane
>features like arrays do admittedly get little coverage....

Quite right. As we all know, all uses of arrays are mundane and
trivial. Advanced programmers never use arrays. There are no
interesting algorithms that use arrays in interesting ways. Arrays
never come up in good classes about "advanced" programming.

In fact: didn't Turing show that all we *really* need is two stacks
of bits? Hmmm....

This gives me a really good idea for my own extension language!
[which I'll make freely copiable, but I'll restrict it from use in
any activity which makes any money in any way using a really complex
copyright: I'll take the gpl as a starting point (this'll be the
really fun part -- hey, maybe I'll start a really important movement
or revolution by forcing other people to not make money using my
program!!!).]
-a.

Ps: the real reason this "text on programming" doesn't
talk about arrays is there is no
good way to "do" arrays in functional programming, even though
arrays are the single most useful structures in real programming;
hence my irritation.

Peter da Silva

Oct 21, 1994, 8:00:34 PM
In article <388u56$1...@agate.berkeley.edu>,
Josef Dalcolmo <jos...@albert.ssl.berkeley.edu> wrote:
> You are missing the point. You and your customers are free to write
> extensions to your application in perhaps Python (very readable). Since you
> may distribute the Python to Scheme translator for free (under the GPL) You
> won't have to rewrite it nor will it restrict you from selling your product,
> that has a Scheme interface. You may even write your own extensions to your
> product in modified Tcl if you wish. So what's the problem ?

OK, here's the scenario: I want to maintain a config file (~/browse.cf, say)
that is generated by the application but the user should have the ability
to edit. It needs to be in a language easy to automatically generate, easy
to reload, easy for external programs to maintain, and easy for the naive
user to modify.

What language would you recommend I choose? How do I provide the tools so
that the user can *also* maintain it in their language of choice?

Under any such scheme as this, the language that it all ends up being in
is going to be Scheme. The translators are just not going to be used, long
term.

This is not necessarily a bad thing. Just something to keep in mind.

Peter da Silva

Oct 21, 1994, 8:05:01 PM
In article <388pc2$5...@csnews.cs.colorado.edu>,
Tom Christiansen <tch...@mox.perl.com> wrote:
> Thanks, Peter. See Larry's release notice posted elsewhere in these
> language areas for details, and/or glance at my implementation of a
> Patricia trie in perl, recently posted elsewhere in this thread.

Yes, I had a look at it. You didn't use postfix if at all, that I could see.

> Do you want big languages and little programs or vice versa?

I want little languages and little programs. I don't believe you can't
get there (watch out, he's got something under his coat! Oh no! He's got
a Forth interpreter! Run!) from here...

> I guess all we need is a elisp-to-perl (or is that scheme-to-perl)
> translator now and even rms will be happy. :-) (Someone else reports
> working on a tcl-to-perl translator already, but progress is slow.)

I don't think that you're going to get a good translator from any of these
data-driven languages to a procedural language any time soon. Run-time
manipulation of code is too much a part of what makes them interesting.
And it's also too much a part of what makes them useful extension
languages.

Peter da Silva

Oct 21, 1994, 8:15:09 PM
In article <y53Juc...@sytex.com>, Scott McLoughlin <sm...@sytex.com> wrote:
> Anyway, none of these seems "big" in terms of code size and/or
> runtime space requirements. As for "flexible string manipulation
> functions" and "access to all or most of the Unix system calls", these
> will add to the "footprint" of the language.

You're right, they will. But for the application domain I'm interested in
(extension languages for programs running on UNIX) it's more important.

I mean in this domain I find Tcl more than adequate, and nobody is going
to tell me it has anywhere near the functionality of scheme even without
any of these extensions.

> I don't, OTOH, see why not we couldn't just have a simple
> #ifdef compilation option to exclude various features from the
> language,

That turns out not to work very well. You end up with everything compiled
in anyway. We've already been down that road in the Tcl world. The real
solution to application size is dynamically loadable extensions. Things like
the UNIX I/O package and the strings package would be well suited to that.
Extensions to the language syntax are less so.

And the real "size" metric I'm using is more like the one Tom was using
to measure the size of Perl 5 versus Perl 4: complexity. Adding new primitives
doesn't add much to the complexity (the mental size, if you like) of the
language. Adding new control structures, or changing the basic syntax
(making it more lispy) does.

> For real tightness and power in a scripting language, I (and many others)
> would recommend using "Forth" where these things matter

OK, OK, how about Postscript? In a lot of ways it's got most of the tightness
of Forth with a lot cleaner syntax. And people are used to dealing with it.
Yes, traditional PS implementations are pretty big but that's mostly the
rendering engine...

> Anyway, when can we get it ;-) Also, I'll repeat my oft'
> expressed desire to see DOS/Windoze/Win32 versions.

You already got that with Tcl.

Raul Deluth Miller

Oct 22, 1994, 12:37:27 PM
Tom Christiansen:
: Ousterhout is more right about the Law than most in their Ivory
: Towers here seem to care to admit: the people will use gleefully
: only that which comes easy to them, and these languages will drive
: out the others.

Which reminds me of this little gem:

#!/usr/local/bin/wish -f
label .lab -bd 2 -relief raised -text "So, What is wrong with using a utility"
label .lab2 -bd 2 -relief raised -text "that kills babies...I happen to like"
label .lab3 -bd 2 -relief raised -text "tcl."
pack append . .lab {top fill} .lab2 {top fill} .lab3 {top fill}


button .b1 -text "End this madness" -command exit
pack append . .b1 {top}

--
Raul D. Miller n =: p*q NB. 9<##:##:n [.large prime p, q
<rock...@nova.umd.edu> y =: n&|&(*&x)^:e 1 NB. -.1 e.e e.&factors<:p,q [.e<n
NB. public e, n, y
x -: n&|&(*&y)^:d 1 NB. 1=(d*e)+.p*&<:q

Stefan Monnier

Oct 22, 1994, 1:35:50 PM
In article <id.YD1...@nmti.com>, Peter da Silva <pe...@nmti.com> wrote:
> What language would you recommend I choose? How do I provide the tools so
> that the user can *also* maintain it in their language of choice?
>
> Under any such scheme as this, the language that it all ends up being in
> is going to be Scheme. The translators are just not going to be used, long
> term.

That's why I think using scheme as an intermediate language is not
a good idea. I think a lower level language would be better, forcing
a compile no matter what language you write the thing in.


Stefan

Stefan Monnier

Oct 22, 1994, 1:58:26 PM
In article <1994Oct22.0...@njitgw.njit.edu>,

Aaron Watters <aa...@vienna.njit.edu> wrote:
> Quite right. As we all know, all uses of arrays are mundane and
> trivial. Advanced programmers never use arrays. There are no

Arrays are simple in concept. SICP is supposed to teach other, more
"advanced" concepts. That doesn't mean that arrays are less used; they are
just assumed to be known!

> In fact: didn't Turing show that all we *really* need is two stacks
> of bits? Hmmm....

Yup! Unbounded stacks!
And those stacks are easier to implement with Lisp lists than with
arrays (and their fixed size).
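
(The whole of such a stack in Scheme, as a toy sketch:

    (define stack '())
    (define (push! x) (set! stack (cons x stack)))
    (define (pop!)    (let ((top (car stack))) (set! stack (cdr stack)) top))

No size to declare, nothing to overflow but memory itself.)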

> Ps: the real reason this "text on programming" doesn't
> talk about arrays is there is no
> good way to "do" arrays in functional programming, even though

There is, but it's still recent technology.

> arrays are the single most useful structures in real programming;
> hence my irritation.

And arrays are the single most obvious reason why most programs either
crash on big data sets or (if you're lucky) complain because the data is
bigger than some arbitrary internal limit!


Stefan

Peter da Silva

Oct 22, 1994, 1:36:03 PM
In article <38a3mk$l...@csnews.cs.colorado.edu>,
Tom Christiansen <tch...@mox.perl.com> wrote:
> Peter, be aware that the common man has neither the time for nor any
> appreciation of your spartan minimalism.

Spartan minimalism? Tcl is hardly spartan... it's just designed for a
specific job and does it very well. Perl is designed for a different
job and does THAT very well. I don't think it could do Tcl's even as
well as Tcl does Perl's.

[preaching to the choir omitted]

Hey, I'm responsible for some of the features in Tcl that *are* there, like
the way strings work. Karl and I worked out the semantics of Tcl arrays on
his whiteboard when he worked here. We did Tcl Extended, because the original
language was too minimal, and a lot of that has been picked up. For that
matter we picked some ideas up from Perl... some of them didn't make the
cut and still aren't in the core language (like the filescan stuff).

But there's always been this basic assumption: that you don't add a feature
just because it sounds good. You add it because you need it. If there's two
ways of doing something you use the one that avoids complicating the language.
The classic example in Perl is the postfix if statement. It doesn't add
any capability to the language, and it confuses new users. In an extension
language that's a bad thing... because most of the time most users are
new users, because they're not using the language to do a job, they're
using it to configure the tool that does the job.

> The problem is, you see, is that quite simply, you're designing the wrong
> languages for the wrong crowd.

Who, me? I'm not designing a language at all. Or redesigning one. I'm trying
to keep a bunch of people from inventing yet another camel when the specs
don't even call for a horse.

[a bunch of stuff that doesn't seem to have anything to do with me at all,
skipped]

> When it comes to lisp or tcl, while the extensive run-time nature of
> those languages make machine language generation (at least of certain
> constructs) difficult, compiling them into native perl (probably
> with a run-time emulator library) should in theory present no insurmountable
> hurdles.

Certainly with a runtime emulator library... especially when you're running
around loading stuff on an ongoing basis at runtime and using code fragments
as your communication channel between components. And since you have to keep
doing that, what's the point to putting Perl in the loop at all? It's not
technically infeasible, it's just not very useful. And that's why I think
it's unlikely.

Tom Christiansen

Oct 22, 1994, 3:35:52 PM
:-> In comp.lang.perl, pe...@nmti.com (Peter da Silva) writes:
:But there's always been this basic assumption: that you don't add a feature
:just because it sounds good. You add it because you need it. If there's two
:ways of doing something you use the one that avoids complicating the language.
:The classic example in Perl is the postfix if statement. It doesn't add
:any capability to the language, and it confuses new users.

Don't make the user go through more work than necessary. If it bothers you
that we in English sometimes naturally express ourselves with the
conditional afterwards, use something else. It's more restrictive and
stilted and unnatural to enforce a particular style on the user. Ask your
mother if you don't believe me.

    if (annoy $peter reversed("conditional")) {
        use Something_Else;
    }

    value("flexibility") > value("restriction");

    ask $mom if disbelieve $tom;

Remember, in C, you can say for(;c;) wherever you can say while(c), and
no one seems to mind that. It's the same issue. One is more readable.
You're asking for decreased legibility for no good reason. Likewise,

    do {
        foo();
    } until $a || $b;

is some better than either of these:

    do {
        foo();
    } while !$a && !$b;

    do {
        foo();
    } while !($a || $b);

because they make you go through more work than needed. Likewise

    foreach $a (@list) {
        foo($a);
    }

is superior to the far busier:

    for ($i = 0; $i <= $#list; $i++) {
        foo( $list[$i] );
    }

But so what? It's not like we should can one or the other and
force you to choose between C and shell.

[yes, much of the previous was more addressed to the thread than
to just Peter]


--tom
--
Tom Christiansen Perl Consultant, Gamer, Hiker tch...@mox.perl.com

Malt does more than Milton can
To justify God's ways to Man.

Suresh Srinivas

Oct 22, 1994, 2:55:55 PM
ous...@tcl.eng.sun.com (John Ousterhout) writes:

>(sigh... here we go again)

>I'd like to respond to an error in Richard Stallman's latest posting.
>Stallman said:

> Sun recently announced a campaign to "make Tcl the universal
> scripting language." This is a campaign to convince all the
> developers who *don't* prefer Tcl that they really have no choice.
> The idea is that each one of us will believe that Sun will inevitably
> convince everyone else to use Tcl, and each of us will feel compelled
> to follow where we believe the rest are going.

>Please understand that this "campaign" exists only in Stallman's mind.
>As far as I know there has never been *any* official Sun announcement
>about Tcl. There is no campaign at Sun to stamp out Tcl competitors;
>Tcl and Tk aren't even official Sun products right now, nor has Sun
>made a commitment to turn them into products (yet). If anyone has
>concrete evidence to back up Stallman's accusations, please post it
>so we can all see it.

Here's a job posting that I came across while searching on Career
Mosaic's home page. It does talk about Tcl as the universal scripting
language. Maybe Prof Ousterhout could clarify this.

Prof Ousterhout is right in saying that negative campaigning is not
good. I'd certainly say I've heard more negative things said about
Tcl, C++, etc. in the Scheme newsgroup than vice versa. There are really
neat things about scheme like high level macros but also not so neat
things like poor support for reuse (unable to use neat libraries
developed in C++ for example). The foreign function support in scheme
is far from good. The thing to remember is that scheme is not a
panacea for everything, it is one paradigm and providing interfaces
to other paradigms is only going to make it more acceptable.

--Suresh Srinivas


-----Job posting about Tcl from Sun -------------

Sun Microsystems Laboratories, Inc. is embarking on a new project
directed by Dr. John Ousterhout. Our goal is to make Tcl/Tk the
universal scripting language.

To accomplish this we are in the process of building a new group
which is well funded and fully dedicated to this project. This group
is under SMLI (Sun Microsystems Laboratories, Inc.) which is the
advanced technology and research arm of Sun Microsystems, Inc.

We are searching for several more individuals to join us in this
effort and play a key role in making this goal a reality.

You will help us on the development of the Tcl scripting language,
the Tk toolkit, and their extensions and applications.

The two most important projects will be a port of Tk to Windows and
Macintosh platforms, and the creation of a graphical designer for
Tk user interfaces. This will allow people to create interfaces
graphically on the screen, rather than writing scripts.

The individuals we are looking for will have solid experience
with C and C++, and experience with Tcl/Tk. We would also like to
have some expertise with MS/Windows and/or Macs.
The qualified candidate will also have a BSCS/MSCS and 5 plus
years work experience.

If you are interested in exploring this new opportunity please
follow up to:

Scott Knowles
SMLI
2550 Garcia Ave. MTV19-02
Mt.View, CA 94043

--
Suresh Srinivas Department of Computer Science
Grad Student Indiana University, Bloomington,
email: ssri...@cs.indiana.edu IN 47406.

lvi...@cas.org

Oct 22, 1994, 7:44:37 PM

In article <389tj9$f...@Mercury.mcs.com>, Thomas H. Moog <tm...@MCS.COM> wrote:
:A choice of scripting languages ?
:
:share software between users. One party that wasn't adversely affected
:by the inability of the Unix vendors to agree on a "standard" was
:Microsoft - that's for sure.
:

On the other hand, Microsoft has things like visual basic, visual c, visual
c--, etc. and so they too see the need to provide various programming languages
to the user community. Or at least they see the financial benefits. What
is interesting is that friends who have these software packages indicate that
their 'widgets' (extensions, whatever you want to call them) can in many cases
be used across languages - a useful concept which doesn't seem to be making
it into the Unix arena.
--
:s Great net resources sought...
:s Larry W. Virden INET: lvi...@cas.org
:s <URL:http://www.mps.ohio-state.edu/cgi-bin/hpp?lvirden_sig.html>
The task of an educator should be to irrigate the desert not clear the forest.

lvi...@cas.org

Oct 22, 1994, 7:41:15 PM

In article <y53Juc...@sytex.com>, Scott McLoughlin <sm...@sytex.com> wrote:
:pe...@nmti.com (Peter da Silva) writes:
:
:> I still think that all other things being equal a tighter, smaller language
:> is better than a larger and more complicated one, and all the enhancements
:> to Scheme suggested by RMS in <941019042...@mole.gnu.ai.mit.edu> are
:> a bit worrisome. If this comes down to a fight between Sun and the FSF my
:

: I don't, OTOH, see why not we couldn't just have a simple
:#ifdef compilation option to exclude various features from the

Having to compile up multiple copies of interpreters has been tried in the
perl and tcl communities - to the frustrations of many. What I and many
others have called for are ways to dynamically load independently developed
sets of enriched command sets into a very small base interpreter. This
would allow me to tailor an application to a required set of objects
and appropriate operations/methods, while passing on pieces for which I have
no need. Why should my applications be saddled with hundreds of k of X
overhead if the app I want to develop just wants to send messages to an
existing X app - but needs to do no window instantiation at all? Equally,
if all I need to do is small integer manipulation, I would just as soon
not be saddled with bignum floats. On the other hand, if a user wishes
to write their own extended commands for my app, and in doing so determines
that _they_ need bignum floats, X, or whatever, I would like for the language
to be able to support _them_ requesting said objects be loaded, along with
appropriate operation/method library code, etc.

:language, e.g. #undef GXL_UNIXCALLS or whatever. Same for "expect"
:interface (what is expect anyway?).

Expect is a nifty concept (available at least in a Tcl and Perl form -
perhaps in other languages now as well) where one defines a set of
interactions that need to take place with one or more processes. Think of
telecomm software in the micro world which allows you to capture login
scripts and then replay them to log into services, etc. Expect is
a language where one can write 'scripts' to invoke ftp, telnet, etc.
and then generate requests, watch for responses, etc. The latest Expect,
with an extended environment known as Expectk, allows one to wrap a GUI
around a traditional text based interaction such as ftp, password changes,
whatever, in a rather nifty way. There is also a neat paper done about
a feature of Expect called Kibitz - where one links two separate programs
together with expect/kibitz glue between them - so that one program feeds input
to another and then receives the second's output as its input (think
of playing two chess programs against one another - not that this is
the only use, but a simple-to-grasp one).

: I would guess in the general case, the lion's share of apps
:using the language will have oodles of giant GUI extensions, user
:i/o validation, etc. and one average size 8bit+ color image will have
:a footprint bigger than the whole language implementation anyway! For
:real tightness and power in a scripting language, I (and many others)
:would recommend using "Forth" where these things matter - but I

It is true that many apps will be extended using many pieces of
extensions. If they are all loaded only when needed, and able to be
unloaded when not needed, this would allow an app to consume only the
resources needed at any one time. And if folk take into consideration
the 'hypertool' or applet approach, where entire mini-applications grow
up and communicate with one another, then one will find that more use
will be made of distributed compute resources, threading, etc.

Tom Christiansen

Oct 22, 1994, 8:30:39 PM
:-> In comp.lang.tcl, lvi...@cas.org writes:
:Having to compile up multiple copies of interpreters has been tried in the
:perl and tcl communities - to the frustrations of many. What I and many
:others have called for are ways to dynamically load independantly developed
:sets of enriched command sets into a very small base interpreter.

Dynamic loading of extensions works just fine in Perl. Why do you think
the /usr/bin/perl binary can be just 50k? There's no longer any need for
fooperl, barperl, and flotzperl. Tcl users can use this feature if they
start their tcl programs with

#!/usr/bin/repl
use Tcl;

and go from there. No, I'm not entirely kidding.

--tom
--
Tom Christiansen Perl Consultant, Gamer, Hiker tch...@mox.perl.com

Documentation is the castor oil of programming. Managers know it must
be good because the programmers hate it so much.

Aaron Watters

Oct 23, 1994, 4:29:53 PM
I apologize. I thought my posts on this thread were kinda funny and had
some points of interest, but in retrospect they were mean-spirited.
[I'm trying to put into practice various pieces of advice I received
privately, like "don't be such a moron".]

Scheme is a beautiful little language and could be a great extension/
scripting language if it had standard portable interfaces to a large
number of libraries, and if it had native object support with
inheritance -- I understand some mutant strains do...
[Python certainly does.]
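
[For what it's worth, even vanilla Scheme gets you a good part of the way
towards objects using nothing but closures -- a toy sketch, not any
particular Scheme object system, and no inheritance shown:

    ;; a "counter object": state captured in a closure, messages dispatched by symbol
    (define (make-counter start)
      (let ((n start))
        (lambda (msg . args)
          (case msg
            ((value) n)
            ((add!)  (set! n (+ n (car args))) n)
            (else    (error "unknown message" msg))))))

    ;; (define c (make-counter 10))
    ;; (c 'add! 5)   => 15
    ;; (c 'value)    => 15
]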

I think the GPL and its variants should be changed to something
less restrictive and simpler -- in one case I know of, a developer did some
work using GNU stuff and ended up ripping a lot of it out in order
to avoid the bother of complying with the terms. He says next
time he'll license proprietary source with binary distribution
rights (such things exist, and can be very nice). Clearly, in this
case, the GPL didn't encourage the use of freely copiable software.

Next time I see him I'll recommend Python and its associated tools,
since he can use them however he pleases, as long as he credits the
source. A GNU Scheme would fare better in the world if it had
a copyright like the ones on Tcl and Python.

No more comments on books. Sorry,sorry,sorry.

Peter da Silva

Oct 23, 1994, 7:52:53 PM
In article <38bihm$4...@info.epfl.ch>,

Stefan Monnier <mon...@di.epfl.ch> wrote:
> That's why I think using scheme as an intermediate language is not
> a good idea. I think a lower level language would be better, forcing
> a compile no matter what language you write the thing in.

I can't agree. A compile step automatically makes for a lousy extension
language, unless all the compilers are built into the binary. For a lot of
uses the extra fork/exec overhead is by itself too high. And if all the
compilers are built into the binary, then they're the extension languages
and the "low level" one is an implementation detail.

Peter da Silva

Oct 23, 1994, 8:04:32 PM
In article <38bpio$a...@csnews.cs.colorado.edu>,

Tom Christiansen <tch...@mox.perl.com> wrote:
> Don't make the user through more work than necessary. If it bothers you
> that that we in English sometimes naturally express outself with the
> conditional afterwards, use somthing else. It's more restrictive and
> stilted and unnatural to enforce a particular style on the user. Ask your
> mother if you don't believe me.

[snippet]

I'm sorry, but putting a feature in because it's English-like is just
plain silly. Programming languages are not human languages. If you don't
think so, there's always COBOL.

The syntactic distance between

> if (disbelieve $tom) {
ask $mom;
}

and:

> ask $mom if disbelieve $tom;

is pretty high. The former is clearly a control structure. The latter is
hard to pick out of code.

As for C, I don't recall arguing that C is either easy to learn or that it
would make a good extension language.

On the gripping hand:

for(;read_news;)
flame();

and:

while(read_news)
flame();

retain the same basic form. They don't add to the conceptual cost of learning
and using the language.

Some extensions are useful. Foreach is like C's "+=", it takes a common
idiom and removes a lot of duplication from it. Postfix if doesn't reduce
the complexity of the statement any (there's still as many elements to
evaluate) but does add to the complexity of the language.

This is where I'm coming from: adding a feature to a language because it's
neat (and postfix if is certainly neat... it's downright cute) is a bad idea.
That way lies COBOL.

(no, I don't think Perl's COBOL. I will note that a lot of the improvements
to Perl have involved removing complexity, which defends it from that
charge quite well *and* supports my argument against un-necessary frills)

Mikael Djurfeldt

Oct 23, 1994, 10:10:40 PM
Sorry for using bandwidth for this kind of message...

>>>>> "ozan" == ozan s yigit <o...@nexus.yorku.ca> writes:
In article <OZ.94Oct...@nexus.yorku.ca> o...@nexus.yorku.ca (ozan s. yigit) writes:

ozan> Mikael Djurfeldt is concerned about the suggested
ozan> modifications to the Scheme language in order to fullfill
ozan> GNU extension language goals. I do not think there is any
ozan> cause for concern. It is clear that the changes are not to
ozan> The Scheme language, which is well defined in an IEEE
ozan> standard and Revised^4 Report [...]

It was my intention only to talk about the GNU extension language.
I would like it to be a good language/implementation.

/mdj

Scott McLoughlin

Oct 23, 1994, 9:05:27 PM
lvi...@cas.org writes:

> On the other hand, Microsoft has things like visual basic, visual c, visual
> c--, etc. and so they too see the need to provide various programming languag

> into the user community. Or at least they see the financial benefits. What
> is interesting is that friends who have these software packages indicate that
> their 'widgets' (extensions, whatever you want to call them) can in many case

> be used across languages - a useful concept which doesn't seem to be making
> it into the Unix arena.
> --

Howdy,
The X-language widgets currently in use are typically
termed "components" in the DOS world and the dominant component
format is called a "VBX", originally a Visual Basic only library
format that became very popular.
VBX's are _very_ popular. While I don't use them, most
of my colleagues do. They report that the quality is _very_
_LOW_. Crash city (GPF, core dump, whatever). Many will
say "that's an implementation issue" - which is true. But it's
also an _economic issue_ - the heavy "consumerization" of
software, including programming tools, in the DOS/Windows
world. I wouldn't recommend chasing after the MSoft/Intel/
DOS/Windows model of computing, esp. as it's shaped up in
the last year or so. I (like many others) am there - it's not
pretty.
If you're interested in VBX's, though, you can go
out and snag VB for ~100 and a large collection of VBX's
for another ~ 100 dollars or so and go to town. Heck, get
a big C++ compiler for another ~100 or so and you can
write your own VBX's. Have fun.

=============================================
Scott McLoughlin
Conscious Computing
=============================================

Tim Smith

Oct 24, 1994, 5:15:35 AM
Tom Christiansen <tch...@mox.perl.com> wrote:
>You're suffering from delusions of clinically committable magnitude if you
>begin to think for one moment that anything as counterintuitive to the kid
>off the street as Forth -- or Lisp for that matter -- has a chance of
>actually gaining sufficient acceptance and widespread use to become a the
>cyber lingua franca you're looking for. Even RMS in his guise as the

Everyone seems to be assuming that the extension language built into the
program must be the same as the extension language that the user sees.
Yet this is not so. What would be wrong with building in something that
is small, simple, and fast (e.g., Forth), and then providing tools to
compile something else to that (e.g., a gcc backend that generates Forth
instead of assembly)?

--Tim Smith

Joseph H Allen

Oct 20, 1994, 3:17:48 PM
In article <1994Oct20.1...@njitgw.njit.edu>,
Aaron Watters <aa...@vienna.njit.edu> wrote:

>In article <EBOUCK.94O...@dsdbqvarsa.er.usgs.gov> ebo...@usgs.gov writes:
> > GNU Extension Language Plans
>> Richard Stallman, GNU Project
>>....
>>* Step 1. The base language should be modified Scheme, with these features:...
>
>Fine. Please extend Scheme (as I knew it) with a flexible
>object-encapsulation/inheritence system and convenient, well
>designed, portable interfaces to common os functionalities
>and libraries -- for a good example of how to do this you might do this
>look at the Python code.

Actually I would say that an object system is almost essential for a good
extension language. You want to be able to add new types to the language
which correspond to the elements of whatever system the language is embedded
in. Without this, you have to add tons of keywords to manage data objects
which are not really part of the language.

Python can do this, although its operator overloading syntax is a little
awkward and it might be too slow. Some of the modern functional languages
might be fast enough. Haskell comes to mind, but it has a horrendous syntax.

Another concern is that you probably want the source-code inputs to these
languages to be event driven -- for X and multi-user applications. With
Scheme you could do this with a preprocessor (just collect input until you
have balanced parentheses). Python's parser could probably be made event
driven as well, since it's table driven.
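
A toy sketch of that "collect until the parentheses balance" preprocessor
(it ignores strings, comments and #-syntax, which a real reader would have
to track):

    (define (read-balanced port)
      ;; accumulate characters until the first parenthesized form closes
      (let loop ((chars '()) (depth 0))
        (let ((c (read-char port)))
          (if (eof-object? c)
              (list->string (reverse chars))
              (let ((depth (cond ((char=? c #\() (+ depth 1))
                                 ((char=? c #\)) (- depth 1))
                                 (else depth))))
                (if (and (char=? c #\)) (= depth 0))
                    (list->string (reverse (cons c chars)))
                    (loop (cons c chars) depth)))))))

    ;; Given a port delivering "(define (f x) (* x x)) more...",
    ;; read-balanced returns the string "(define (f x) (* x x))".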
--
/* jha...@world.std.com (192.74.137.5) */ /* Joseph H. Allen */
int a[1817];main(z,p,q,r){for(p=80;q+p-80;p-=2*a[p])for(z=9;z--;)q=3&(r=time(0)
+r*57)/7,q=q?q-1?q-2?1-p%79?-1:0:p%79-77?1:0:p<1659?79:0:p>158?-79:0,q?!a[p+q*2
]?a[p+=a[p+=q]=q]=q:0:0;for(;q++-1817;)printf(q%79?"%c":"%c\n"," #"[!a[q-1]]);}

Simon Leinen

Oct 24, 1994, 7:56:19 AM
Mikael> In fact, most people intuitively think about symbols as having
Mikael> only one binding.

This is always taken for granted by you proponents of single-namespace
Lisps. I argue that most people "intuitively" have no problems at all
distinguishing the several meanings of a symbol. I have never seen
anybody who was confused about the predefined meanings of the word
CONS as a type specifier and an operator (function) in Common Lisp,
for example. Personally I would rather expect people to be surprised
when they use a variable with a generic name like LIST, and find that
they have redefined a system function in the scope.
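
In a single-namespace Lisp that surprise looks like this (plain Scheme;
the error is the point of the example):

    (define (count-and-label lst)
      (let ((list (length lst)))   ; a local variable innocently named `list'
        (list 'count list)))       ; oops: `list' here is a number, not the
                                   ; system procedure, so this call fails

    ;; (count-and-label '(a b c))  => error: attempt to apply a non-procedure
    ;; In a LISP-2 such as Common Lisp the function binding of LIST is untouched.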

I admit that from a formalism-aesthetic point of view, multiple
namespaces are not as nice ("violating the 0-1-infinity principle"),
but this does not mean that a LISP-1 is more "intuitive" than a LISP-2.

Mikael> Because of this most lisp code doesn't use both value and
Mikael> function bindings simultaneously.

It is true that most symbols in a given program don't have a variable
and a function binding at the same time. But the reason might also be
that "intuitive" names for functions and variables are often (but not
always!) different.
--
Simon.

lvi...@cas.org

Oct 24, 1994, 9:15:07 AM

It is great that Perl 5 now has this feature - I am still expecting to see
such a feature in Tcl 4 or 5 as well. My point was that folks planning
a new GNU Scheme should plan on providing such a feature as well.
With the ability to auto load both object libraries AND externally written
{plug in your favorite language interpreter here} files, one gets quite
a productivity boost.


Of course, the real problem is how to do this across all platforms - how
does Perl handle all the different Unix, MS-DOS, etc. limitations on
dynamically loaded objects? (I have the Perl5 tech doc printed - it's
just that my speed reading skills keep getting cancelled out by kids
wanting my attention...)

Raul Deluth Miller

Oct 24, 1994, 11:20:06 AM
It's "trivial" to make an elisp function/data slot distinction with
scheme: for example, give all variables a v prefix, and all functions
an f prefix, and run a syntax analyzer over the code to insert v or f
in the front of each symbol.

Similarly, it's "trivial" to add (e.g.) property lists to scheme --
just introduce a global variable that's a hash table indexed by
symbols.
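
A toy version of the property-list idea, using an association list where a
real system would use that hash table (put!/get are made-up names, not an
existing API):

    (define *plists* '())        ; ((symbol . ((prop . value) ...)) ...)

    (define (put! sym prop value)
      (let ((entry (assq sym *plists*)))
        (if entry
            (set-cdr! entry (cons (cons prop value) (cdr entry)))
            (set! *plists* (cons (list sym (cons prop value)) *plists*)))))

    (define (get sym prop)
      (let ((entry (assq sym *plists*)))
        (and entry
             (let ((p (assq prop (cdr entry))))
               (and p (cdr p))))))

    ;; (put! 'fred 'species 'elephant)
    ;; (get 'fred 'species)   => elephant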

These are just examples -- they are meant to show that adding
features to Scheme needn't destroy the underlying language.

Marco Antoniotti

Oct 24, 1994, 9:47:10 AM
In all the discussion I have seen about Tcl, I have yet to see the
argument that would actually explain the wide acceptance that Tcl/Tk
has gained.

The main reason for the acceptance of Tcl is the '/Tk' suffix.

I cannot see anybody who would like to learn a language like Tcl if it
were not for all the "Value Added" by Tk.

I am usually amazed by the ease with which you can build interfaces
with Tk. I also think that is what makes the whole thing so
appealing.

Which brings up the following point. Any language which wants to be a
replacement for Tcl/Tk (and I believe any serious Scheme or CL
attempt would just breeze through) should provide an equivalent of Tk.

Thanks for the attention.

--
Marco Antoniotti - Resistente Umano
-------------------------------------------------------------------------------
Robotics Lab | room: 1220 - tel. #: (212) 998 3370
Courant Institute NYU | e-mail: mar...@cs.nyu.edu

...e` la semplicita` che e` difficile a farsi.
...it is simplicity that is difficult to make.
Bertholdt Brecht

Peter da Silva

Oct 24, 1994, 10:53:56 AM
In article <38ftvn$d...@nntp1.u.washington.edu>,

Tim Smith <t...@u.washington.edu> wrote:
> Everyone seems to be assuming that the extension language built into the
> program must be the same as the extension language that the user sees.
> Yet this is not so. What would be wrong with building in something that
> is small, simple, and fast (e.g., Forth), and then providing tools to
> compile something else to that (e.g., a gcc backend that generates Forth
> instead of assembly)?

Using an arbitrary syntax to avoid favoritism:

display "Enter keystroke: "
read keystroke
display "Enter macro: "
read macro
define_key_macro %keystroke %macro

Now, if you're using an external compiler you need to run that compiler
from "define_key_macro".

Now suppose you're reading these from an X resource at startup. You're going
to have to call the compiler for *each* resource in turn.

Sorry, it just don't work. The underlying language *is* going to be exposed
to the user.

That doesn't mean that Scheme isn't a fine language for this, just that it's
nonsense to pretend the underlying language doesn't matter.

Mike McDonald

unread,
Oct 24, 1994, 11:50:07 AM10/24/94
to
In article <38ftvn$d...@nntp1.u.washington.edu>, t...@u.washington.edu (Tim Smith) writes:
|>
|> Everyone seems to be assuming that the extension language built into the
|> program must be the same as the extension language that the user sees.
|> Yet this is not so. What would be wrong with building in something that
|> is small, simple, and fast (e.g., Forth), and then providing tools to
|> compile something else to that (e.g., a gcc backend that generates Forth
|> instead of assembly)?
|>
|> --Tim Smith

Because you can't maintain the system then. If your users get to go off and
pick any translator they want, you'll have to learn every available language
in order to debug your users' scripts. I don't believe that it is practical for
you to expect to be able to debug a script written in some unknown language that
was machine-translated into scheme, or forth, or whatever. (I've argued this same
point with the Dylan people, to no avail.)

If you're not going to let them pick, then you might as well force them to use
the same language you picked.

Mike McDonald m...@trantor.ess.harris.com

Zvi Lamm

unread,
Oct 24, 1994, 4:42:39 PM10/24/94
to
Tom Christiansen <tch...@mox.perl.com> writes

> Why is it, do you suppose,
>that languages like BASIC, sh, awk, perl, tcl, REXX, and Visual BASIC have
>enjoyed the popularity they have? It's really very simple: they're easy
>to use for someone who hasn't had the questionable good fortune of
>having done graduate work in programming languages.

As I see it, doing grad work in programming languages includes coming to
understand the appeal (and, shame on me, the good features) of REXX,
VB, etc.

Except perhaps for BASIC, all the languages you mentioned have specific
features that make them both interesting and easy == useful.

Do you think INTERPRET (that's from REXX :-)) is a good feature for a
language to have? Do you see how good it can be for short scripts?
Does SIGNAL ON [NO]VALUE help or not? Default value of foo is FOO! Good
or BAD? etc. etc.

Sure, bus riders don't care how a bus works, but it is still up to the bus
designer to make the bus safe, comfortable and efficient. The same applies
to designing computer languages!!

Ehud

ozan s. yigit

unread,
Oct 23, 1994, 9:27:28 PM10/23/94
to
Mikael Djurfeldt is concerned about the suggested modifications to the
Scheme language in order to fulfill GNU extension language goals. I do
not think there is any cause for concern. It is clear that the changes are
not to The Scheme language, which is well defined in an IEEE standard and
the Revised^4 Report. The language described in the RMS note may at best be
characterized as a derivative of Scheme, and I doubt it would be called
Scheme, any more than Dylan is called Scheme.

oz
---
a technology is indistinguishable from | electric: o...@nexus.yorku.ca
its implementation. -- Marshall Rose | or o...@sni.ca [416 449 9449]


Mikael Djurfeldt

unread,
Oct 23, 1994, 5:41:08 PM10/23/94
to
I was happy when I read your plans regarding a GNU extension language.
Scheme seems to be the right thing to choose. Also, the idea of
translating other languages into scheme, thereby allowing the user to
use his/her favourite extension language, is a very nice one.

But, I feel concerned about three of the suggested modifications to
the language. According to RMS "[it] is desirable to support Scheme,
because it is simple and clean." This is indeed one of the great
strengths of Scheme, and it's extremely important not to compromise
it.

In "The GNU extension language plan" we find:

** Extra slots in a symbol, to better support
translating other Lisp dialects into Scheme.

I suspect that you intend to add function and property-list bindings
to symbols. Having symbols associated with multiple slots makes the
language complicated, regardless of their use or implementation.
But, let's take a closer look:

I take it for granted that you're going to use the value-binding for
symbols occurring in the first position of a form. To have another
scheme of evaluation for operators would simply make it a different
language. It would make "Modified Scheme" incompatible with Scheme
and would make the code clumsier.

Apart from the conceptual burden, extra symbol bindings would
complicate the implementation and make symbols "heavier", thus slowing
down code which creates new symbols.

The property list could easily be implemented in Scheme as a separate
table.

Assuming that symbols have a function binding, how would you translate
elisp into "Modified Scheme"?

One way to do it is to use a selector `function-binding' to access the
function binding and a mutator `set-function!' for modification.
(Both are special forms which don't evaluate their first argument.)
Then some translations could be:

(* 2 3) --> (* 2 3)

(g 3) --> ((function-binding g) 3)

(Of course one could have the same value in both value and function
bindings of primitive operators, which would make the translation
simpler.)

(defun g (x) (* 5 x)) --> (if (not (defined? g))
                              (define g #f))
                          (set-function! g (lambda (x) (* 5 x)))

(funcall 'g 3) --> ((function-binding g) 3)

(funcall func 3) --> ((eval (list 'function-binding func)
                            (interaction-environment))
                      3)

(1. I don't know how you are going to implement dynamic variables,
    therefore I ignore the dynamic vs. lexical binding issues in my
    examples.
 2. I used the eval from the proposal in Jonathan Rees's "The Scheme
    of Things: The June 1992 Meeting", which can be found in the
    Scheme Repository.
 3. As far as I understand, implementing `function-binding' as a
    procedure would put the same kind of restrictions on the
    implementation as first-class environments do.)
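
As an aside, the sketch below (mine, not anything from the plan) shows how
`function-binding' and `set-function!' themselves could be emulated in
unmodified Scheme with a separate global table instead of an extra slot,
assuming `define-syntax' and a non-standard `error' procedure:

(define *function-table* '())     ; ((symbol . procedure) ...)

(define (function-table-ref sym)
  (let ((entry (assq sym *function-table*)))
    (if entry (cdr entry) (error "no function binding for" sym))))

(define (function-table-set! sym proc)
  (set! *function-table* (cons (cons sym proc) *function-table*)))

;; The "special forms" simply quote their first argument.
(define-syntax function-binding
  (syntax-rules ()
    ((_ sym) (function-table-ref 'sym))))

(define-syntax set-function!
  (syntax-rules ()
    ((_ sym proc) (function-table-set! 'sym proc))))

;; (set-function! g (lambda (x) (* 5 x)))
;; ((function-binding g) 3)               ; => 15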

I can't think of any better way to use the extra binding. How are you
going to use it?

In fact, most people intuitively think about symbols as having only
one binding. Because of this most lisp code doesn't use both value
and function bindings simultaneously.

Wouldn't it be a much simpler solution, then, just to drop the
distinction and manually correct code which uses both bindings?

(* 2 3) --> (* 2 3)

(g 3) --> (g 3)

(defun g (x) (* 5 x)) --> (define g (lambda (x) (* 5 x)))

(funcall 'g 3) --> (g 3)

(funcall func 3) --> ((eval func (interaction-environment)) 3)

This obviates the need for the extra function binding.

From the "multiple extension languages" perspective this would of
course lead to an odd dialect of lisp. If this is a big issue there
are still other solutions. E.g., having multiple obarrays opens the
possibility of using a different name space for functions in
translated code...


** A place in a function to record an interactive argument reading
spec.

Such interactive argument specs are particular to certain
applications, like emacs. Why make a change in the Scheme language
when it could be implemented as a separate table? The elisp ->
M. Scheme translator could generate code for maintenance of this
table. Or, `defun' could be implemented as a macro which augments the
table.
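
For instance (a sketch only -- the table and accessor names are invented):

(define *interactive-specs* '())   ; ((name . spec) ...)

(define (set-interactive-spec! name spec)
  (set! *interactive-specs*
        (cons (cons name spec) *interactive-specs*)))

(define (interactive-spec name)
  (let ((entry (assq name *interactive-specs*)))
    (and entry (cdr entry))))

;; For  (defun find-file (name) (interactive "FFind file: ") ...)
;; the elisp translator would emit:
;;
;;   (define (find-file name) ...)
;;   (set-interactive-spec! 'find-file "FFind file: ")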


** An optional reader feature to convert nil to #f and t to #t,
for the sake of supporting Lisp as well as Scheme.

Alternatively, one could define the symbol nil as #f and t as #t,
thereby not changing the characteristics of the language at all.
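
That is, simply:

(define nil #f)
(define t   #t)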


Mikael Djurfeldt

Matt Kennel

unread,
Oct 23, 1994, 5:40:57 PM10/23/94
to
Thomas H. Moog (tm...@MCS.COM) wrote:
: A choice of scripting languages ?

: This sounds a lot like OpenWindows vs. Motif vs. Athena/MIT for
: widgets and window managers. Did the user community benefit from all
: this choice ? It made it difficult to have a program with a GUI run
: on all platforms and it created a new industry of companies writing
: GUI independent libraries thereby making it even more difficult to
: share software between users. One party that wasn't adversely affected
: by the inability of the Unix vendors to agree on a "standard" was
: Microsoft - that's for sure.

The image that comes to mind is a tcl rat preoccupied and in mortal combat
with a scheme opossum all the while a Tyrannosaurus Basicus bears down:

"Yummm......both white meat and dark meat for lunch today....."

: Tom Moog

--
-Matt Kennel m...@inls1.ucsd.edu
-Institute for Nonlinear Science, University of California, San Diego
-*** AD: Archive for nonlinear dynamics papers & programs: FTP to
-*** lyapunov.ucsd.edu, username "anonymous".

ozan s. yigit

unread,
Oct 25, 1994, 2:43:48 AM10/25/94
to
Tom Lord [on variant syntaxes and gnu extension language]:

Syntaxes other than Scheme are just ordinary extensions.

but so is scheme syntax itself. adam shah's architecture does not mention
the fact that most of the scheme syntax is an "extension" on top of a simpler
core language containing not much more than symbols, constants, quote,
define, lambda, begin, if, set!, prim, proc etc.

oz

Assar Westerlund

unread,
Oct 23, 1994, 11:46:22 PM10/23/94
to
In article <OZ.94Oct...@nexus.yorku.ca> o...@nexus.yorku.ca (ozan s. yigit) writes:
Mikael Djurfeldt is concerned about the suggested modifications to the
Scheme language in order to fullfill GNU extension language goals. I do
not think there is any cause for concern. It is clear that the changes are
not to The Scheme language, which is well defined in an IEEE standard and
Revised^4 Report, The language described in the RMS note may at best be
characterized as a derivative of Scheme, and i doubt it would be called
Scheme, any more than Dylan is called Scheme.

It's not Scheme, but wouldn't it be nice if it could be as close to
Scheme as possible?

lvi...@cas.org

unread,
Oct 25, 1994, 7:46:11 AM10/25/94
to

There are over 320 programs which currently use Tcl and/or Tk as
their extension language. This includes DejaGNU, a GNU program.

lvi...@cas.org

unread,
Oct 25, 1994, 7:49:20 AM10/25/94
to

I do think, however, that it need not, and perhaps should not, try to
take Tk as it is and reproduce it. Instead of trying to shoehorn in a
toolkit designed around the Tcl programming philosophy, I would rather
see folks design a conceptually similar extension designed with the
appropriate language in mind. For instance, I would rather see Dylan's
X not be an Xt or Tk knock-off, but something that fits into Dylan's
gestalt. The same goes for GNUel - make it something designed intentionally
for the language, taking advantage of the strengths and limiting the
weaknesses of the language.

Jeffrey Friedl

unread,
Oct 24, 1994, 9:18:11 PM10/24/94
to
pe...@nmti.com (Peter da Silva) writes:
|> I'm sorry, but putting a feature in because it's english like is just
|> plain silly. Programming languages are not human languages. If you don't
|> think so, there's always COBOL.
|>
|> The syntactic distance between
|>
|> > if (disbelieve $tom) {
|> ask $mom;
|> }
|> and:
|> > ask $mom if disbelieve $tom;
|>
|> is pretty high. The former is clearly a control structure. The latter is
|> hard to pick out of code.

Again, sorry to jump into what's obviously an ongoing and heated
discussion, but, although I completely agree with most of what Peter says
(particularly the first paragraph above), the ending "the latter is hard to
pick out of code" is a rather silly thing to assert. If you don't know the
language, of course it's hard. If you don't know the language (and it
doesn't resemble something you *do* know, such as COBOL vs. English :-),
anything is hard.

If you _Know_ the language, it's as natural as can be. At least that's my
opinion. That it's not natural for you is yours.

The problem with perl is that it resembles C in many ways. This is a
double-edged sword. It's good in that what is similar is, uh, similar.
It's bad in that what's not, isn't, and it's not always apparent what is
and isn't similar (does that make any sense?). Anyone familiar with Japanese
will see the same double-edged sword in romaji, the expression of Japanese
using "English" letters.

The biggest trap of perl resembling C/sed/awk/COBOL/English/whatever is
that it can seduce a beginner. If you program in perl while Thinking in C,
your perl will suck bigtime.

Let me say that again: If you program in perl while Thinking in C, your
perl will suck.

Over the years I've done real, large, non-academic projects in some wild
languages (including FORTH, a nostalgic favorite), so I had a pretty wide
range of experience when I first encountered perl in the late 2.x stage.
It took me a *long* time to get to really _Know_ perl (i.e. in the biblical
sense :-). But once I was able to Think in perl, it was magical, just as
when I was finally able to think in Japanese.

*jeffrey*
-------------------------------------------------------------------------
Jeffrey E.F. Friedl <jfr...@omron.co.jp> Omron Corporation, Kyoto Japan
See my Jap/Eng dictionary at http://www.omron.co.jp/cgi-bin/j-e
or http://www.cs.cmu.edu:8001/cgi-bin/j-e

Tom Lord

unread,
Oct 24, 1994, 8:51:20 PM10/24/94
to

Peter da Silva writes:

I'm sorry, you've lost me. Either the extension language the user's
interested in is built into the executable, in which case they all
have to be, or it's got to be execed to convert the user's key macro
string into the implementation language, which gives you too much
of a performance hit, or you expose the underlying mechanics of the
implementation language to the user, which is what I thought you were
trying to avoid. What's the fourth alternative?

The ``fourth alternative'' is this: the parser and translator for a
user's favorite syntax is loaded into the running program on demand.
Thus, it is as easy to use as if built-in, but without the associated
costs of building in all languages.

Syntaxes other than Scheme are just ordinary extensions.
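
A minimal sketch of that idea, with invented file-naming and
procedure-naming conventions, and assuming the optional R4RS `load':

(define *loaded-syntaxes* '())

(define (use-syntax! name)
  ;; Load "<name>-translator.scm" the first time NAME is requested.
  ;; Such a file would define, say, TRANSLATE-<NAME>: a procedure
  ;; taking a string of source text and returning a Scheme expression.
  (if (not (memq name *loaded-syntaxes*))
      (begin
        (load (string-append (symbol->string name) "-translator.scm"))
        (set! *loaded-syntaxes* (cons name *loaded-syntaxes*)))))

;; (use-syntax! 'rush)
;; ... after which the hypothetical TRANSLATE-RUSH can be applied to
;; the user's source string and the result handed to eval.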

-t
--
----

If you would like to volunteer to help with the GNU extension language
project, please write to lo...@gnu.ai.mit.edu.

Kevin K. Lewis

unread,
Oct 25, 1994, 10:30:43 AM10/25/94
to
In article <MIKEW.94O...@gopher.dosli.govt.nz> mi...@gopher.dosli.govt.nz (Mike Williams) writes:

>>> "Paul" == egg...@twinsun.com (Paul Eggert) wrote:

Paul> Isn't the choice of name obvious?
Paul> ``GNU Extension Scheme''.
Paul> We can call it ``escheme'' for short.

No no no ... it should called `gel', particularly if it's going to be
used to "glue" all the diverse GNU applications together. :-)

- Mike W.

Hey, I like that! gel == GNU Extension Language? (or Glue Extension
Language? Or perhaps GlNUe Extension Language?) Pronounced `jel'? I
like that a lot!
--
Kevin K. Lewis | My opinions may be unreasonable
lew...@aud.alcatel.com | but such is the voice of inspiration

Peter da Silva

unread,
Oct 25, 1994, 11:53:34 AM10/25/94
to
In article <38gjlr$4...@spot.twinsun.com>,

Paul Eggert <egg...@twinsun.com> wrote:
> Isn't the choice of name obvious?
> ``GNU Extension Scheme''.

> We can call it ``escheme'' for short.

gescheme

Has a sort of middle-european feel to it. A nice cuddly sort of name.

Lee Iverson

unread,
Oct 25, 1994, 1:26:20 PM10/25/94
to

>>> "Paul" == egg...@twinsun.com (Paul Eggert) wrote:

Paul> Isn't the choice of name obvious?
Paul> ``GNU Extension Scheme''.
Paul> We can call it ``escheme'' for short.

No no no ... it should called `gel', particularly if it's going to be
used to "glue" all the diverse GNU applications together. :-)

- Mike W.

Hey, I like that! gel == GNU Extension Language? (or Glue Extension
Languge? Or perhaps GlNUe Extension Language?) Pronounced `jel'? I
like that a lot!

How about GLU, the ``GNU Language Utility'' (given that it is really
intended to be a sort of assembly language underneath a possible variety
of faces) or the ``GNU Lisp Utility''?

We must appreciate the importance (perhaps) of retaining a hard G ;-)

--
-------------------------------------------------------------------------------
Lee Iverson SRI International
le...@ai.sri.com 333 Ravenswood Ave., Menlo Park CA 94025
(415) 859-3307

Juergen Wagner

unread,
Oct 24, 1994, 1:46:19 PM10/24/94
to
In article <id.LH3...@nmti.com>, Peter da Silva <pe...@nmti.com> wrote:
>In article <38bihm$4...@info.epfl.ch>,
>Stefan Monnier <mon...@di.epfl.ch> wrote:
>> That's why I think using scheme as an intermediate language is not
>> a good idea. I think a lower level language would be better, forcing
>> a compile no matter what language you write the thing in.
>
>I can't agree. A compile step automatically makes for a lousy extension
>language, unless all the compilers are built into the binary. For a lot of
>uses the extra fork/exec overhead is by itself too high. And if all the
>compilers are built into the binary, then they're the extension languages
>and the "low level" one is an implementation detail.

Interpreted languages allow you to write programs generating pieces of
other programs. This is something widely used in LISP-like languages.
Also, Tcl has that feature (you can write your own programming
environment on top of Tcl, including debugger and interpreter).
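
For example, a small sketch -- the two-argument `eval' here is the one
from the Rees proposal mentioned elsewhere in this thread, not standard
R4RS:

;; Build a piece of code as a list, then evaluate it.
(define (make-adder n)
  (eval (list 'lambda '(x) (list '+ 'x n))
        (interaction-environment)))

;; ((make-adder 5) 3)   ; => 8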

As for the suggestion that a compilation would be performed no matter
what language a function or program is written in, the utility of this
depends on what you mean by "compilation". In the classical sense
(C/C++ to assembly language), this will be just another language like
C or others. Therefore, it is difficult to find a rationale for that
new language (why not just take C or C++ and have the compiler
translate your scripting language to that?).

However, if we view compilation as a process similar to
byte-compilation in Smalltalk, Xerox LISP, or Emacs LISP, this makes
sense. In fact, it would support the idea of platform independence as
well as syntax independence. This is along the lines of my postulate
from an earlier message in that thread, that syntax independence must
be one of the features of this extension language. What we have to
agree upon are the primitives and the data type model of the language
..ummm... environment.

--Juergen Wagner

Juergen...@iao.fhg.de
gan...@csli.stanford.edu

Fraunhofer-Institut fuer Arbeitswirtschaft und Organisation (FhG-IAO)
Nobelstr. 12c Voice: +49-711-970-2013
D-70569 Stuttgart, Germany Fax: +49-711-970-2299
<a href = http://www.iao.fhg.de/Public/leute/jrw-en.html>J. Wagner</a>

Thomas M. Breuel

unread,
Oct 25, 1994, 2:02:41 PM10/25/94
to
In article <MIKEW.94O...@gopher.dosli.govt.nz> mi...@gopher.dosli.govt.nz (Mike Williams) writes:
|Paul> Isn't the choice of name obvious?
|Paul> ``GNU Extension Scheme''.
|Paul> We can call it ``escheme'' for short.
|
| No no no ... it should called `gel', particularly if it's going to be
| used to "glue" all the diverse GNU applications together. :-)

The GNU extension language should, of course be called "TINT":

TINT Is Not TCL

Thomas.

Jurgen Botz

unread,
Oct 25, 1994, 2:23:04 PM10/25/94
to
In article <38gjlr$4...@spot.twinsun.com>,
Paul Eggert <egg...@twinsun.com> wrote:
>lo...@x1.cygnus.com (Tom Lord) writes:
>> Since these violations of the standard are minor, if the string
>> "Scheme" shows up in the name, I think that will not be too great an
>> abuse of terminology.

>
>Isn't the choice of name obvious?
>``GNU Extension Scheme''.

I don't think it should have "Scheme" in the name unless it can run
R4RS programs, which it can't if (for example) (eqv? '() #f) holds. I'm not
opposed to a language with the proposed 'extensions' to Scheme, but
that language is not Scheme.

Some people have been writing 'GNUel' to refer to the proposed
language in this thread... I propose 'gnudel'---short 'e', rhymes with
"noodle"---the 'd' is for 'dynamic' which is nice because this is a
salient feature of all the extension languages discussed and makes the
acronym pronounceable.

Peter da Silva

unread,
Oct 25, 1994, 11:18:43 AM10/25/94
to
In article <ROCKWELL.94...@nova.umd.edu>,
Raul Deluth Miller <rock...@nova.umd.edu> wrote:
> Peter da Silva:

> . What's the fourth alternative?

> Dynamically loaded libraries. Possibly configured on a per-user
> basis.

That'd work. Would be a bummer if you saved your configuration when you had
your environment in "scheme mode" and then tried to reload your config file
in "Rush mode" though. Make sure your API supports having multiple DLLs
loaded at once.

(Hrm. GCC is a bit big for a DLL)

Peter da Silva

unread,
Oct 25, 1994, 11:27:12 AM10/25/94
to
In article <DAVIDM.94O...@halfdome.prism.kla.com>,
David S. Masterson <dav...@prism.kla.com> wrote:
> Doesn't this argue for not embedding *any* extension language in a program but
> instead defining a request/reply API that programs must conform to? Isn't
> this the model that C uses (ie. think of functions as a request to something
> for service). Then, any language that handles the API can be an extension
> language.

That's a good one. AREXX works that way. You need to be very careful about
the interface, though... AREXX requires you implement your own baby
extension language to parse AREXX's input, and the interface is butt
ugly. Karl wrote a Tcl interface for the Amiga that just passed argvs
around, and it worked pretty well. A more general interface would want to
use type-tagged elements.
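
For instance, a purely hypothetical sketch of such a request as an
S-expression (the field layout and the `request-field' accessor are
invented for illustration):

;; A bare argv:   ("define-key" "C-x C-s" "save-buffer")
;; A type-tagged request:
(define example-request
  '(request (command   symbol define-key)
            (keystroke string "C-x C-s")
            (action    symbol save-buffer)))

(define (request-field req name)
  (let ((entry (assq name (cdr req))))
    (and entry (caddr entry))))

;; (request-field example-request 'keystroke)   ; => "C-x C-s"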

Also, if you're using dynamic libraries to implement the high-level extension
language, why have a low-level one at all (to pull in another thread)? Just
define the extension-language API and dynamically load the one you want.

That requires dynamic libs, you say? Well, it seems to me that any practical
multiple extension language arrangement is going to. It's too expensive to
fork-and-exec all the time, and you don't want to link everything in, so why
not just depend on dynamic loading and make scheme merely one option of
many?

Peter da Silva

unread,
Oct 24, 1994, 6:34:59 PM10/24/94
to
In article <38grtb$g...@csli.stanford.edu>,

Juergen Wagner <gan...@Csli.Stanford.EDU> wrote:
> In article <id.LH3...@nmti.com>, Peter da Silva <pe...@nmti.com> wrote:
> >In article <38bihm$4...@info.epfl.ch>,
> >Stefan Monnier <mon...@di.epfl.ch> wrote:
> >> That's why I think using scheme as an intermediate language is not
> >> a good idea. I think a lower level language would be better, forcing
> >> a compile no matter what language you write the thing in.

> >I can't agree. A compile step automatically makes for a lousy extension
> >language, unless all the compilers are built into the binary.

> However, if we view compilation as a process similar to
> byte-compilation in Smalltalk, Xerox LISP, or Emacs LISP, this makes
> sense. In fact, it would support the idea of platform independence as

Which is what I mean by "all compilers are built into the binary". Now
your bytecode is, as I've said, an implementation detail. Do you really
mean that every program that uses GNUel should include Scheme and Rush
and Perl and Elisp?

Darin Johnson

unread,
Oct 25, 1994, 2:36:14 PM10/25/94
to
> No no no ... it should called `gel', particularly if it's going to be
> used to "glue" all the diverse GNU applications together. :-)

No no. GSP. GNU Spackling Putty.

(although I think the "GNU Extension Scheme" has a nice subtlety to it)
--
Darin Johnson
djoh...@ucsd.edu
Ensign, activate the Wesley Crusher!

Darin Johnson

unread,
Oct 25, 1994, 2:40:42 PM10/25/94
to
> For an
> English language speaker "do this if that" is very natural.

Odd, I'm a native English speaker, and I think "if this, do that"
is very natural as well.

Of course, in other languages, "this if, that do" is natural;
should we also support that sort of thing?
--
Darin Johnson
djoh...@ucsd.edu
Where am I? In the village... What do you want? Information...

David S. Masterson

unread,
Oct 24, 1994, 10:32:29 PM10/24/94
to
>>>>> "Bruce" == bruce <br...@liverpool.ac.uk> writes:

>>>>> "Peter" == Peter da Silva <pe...@nmti.com> writes:

>> Under any such scheme as this, the language that it all ends up
>> being in is going to be Scheme. The translators are just not going
>> to be used, long term.

> I don't think this is necessarily true, but you may be right.

Doesn't this argue for not embedding *any* extension language in a program, but
instead defining a request/reply API that programs must conform to? Isn't
this the model that C uses (i.e., think of functions as a request to something
for a service)? Then any language that handles the API can be an extension
language.

Think performance might be a problem? Short-circuit the request code to look
in the local services list to see if the request could be handled locally.
Afraid of programs that *might* build in a specific language? Who cares! As
long as it supports the API, I'll start up my language processor that makes
requests according to the API and bypass the in-built language.

Possibilities?
--
==================================================================
David Masterson KLA Instruments
408-456-6836 160 Rio Robles
dav...@prism.kla.com San Jose, CA 95161
==================================================================
I only speak for myself -- it keeps me out of trouble

Juergen Wagner

unread,
Oct 25, 1994, 2:47:36 PM10/25/94
to
In article <id.EH4...@nmti.com>, Peter da Silva <pe...@nmti.com> wrote:
>In article <38grtb$g...@csli.stanford.edu>,
>Juergen Wagner <gan...@Csli.Stanford.EDU> wrote:
[..deleted..]

>> >I can't agree. A compile step automatically makes for a lousy extension
>> >language, unless all the compilers are built into the binary.
>
>> However, if we view compilation as a process similar to
>> byte-compilation in Smalltalk, Xerox LISP, or Emacs LISP, this makes
>> sense. In fact, it would support the idea of platform independence as
>
>Which is what I mean by "all compilers are built into the binary". Now
>your bytecode is, as I've said, an implementation detail. Do you really
>mean that every program that uses GNUel should include Scheme and Rush
>and Perl and Elisp?

An interpreter is supposed to interpret textual representations and
map them to some primitive operations of the language. None of the
languages I mentioned actually compile every statement to an
intermediate code. My point is: define a set of primitive operations
and put an appropriate interpreter or compiler for the language of
your choice on top. This way, "the language" software would consist of
the bytecode interpreter, plus some "canonical" syntax for the
extension language, not necessarily for all languages.

However, it probably doesn't make sense arguing too much about
architectural or syntactic issues at this time. The question is more
what kind of semantics (i.e., primitives, data types, and control
structures) we need. Next, we can talk about how to represent that in
different syntactic forms, and how this can be mapped to a system
architecture.

In principle, *any* interpreted (or interpretable), universal
programming language would qualify as a candidate for *the* extension
language. What we should do now is to define some requirements for the
semantics of that language. If there is a 1:1 translator (maybe
through an intermediate code), it wouldn't matter much if we wrote

proc fact (n) {
    if (n == 0) { return 1; }
    else { return fact(n-1)*n; }
}

instead of

(defun fact (n)
  (if (= n 0) 1
      (* (fact (- n 1)) n)))

or anything else.

One thing which should be avoided (because it endangers the ease of
use of such a language) is mandatory type declarations. Typing should
be dynamic, as in LISP. Multiple types (including user-defined ones)
should be part of the language concept, unlike in Tcl.

The second important point is what is supposed to make that language
so special as an extension language, i.e., which extra features make
the selected language better qualified than others? Here,
two issues are important: data exchange between code written in the
extension language and code written in other languages, and control
flow between such pieces of code from different languages.

We should keep in mind that this extension language business doesn't
aim at developing a new language, but may take an existing one and
make it fit the requirements with as few modifications as possible.

>Peter da Silva `-_-'
>Network Management Technology Incorporated 'U`
>1601 Industrial Blvd. Sugar Land, TX 77478 USA
>+1 713 274 5180 "Hast Du heute schon Deinen Wolf umarmt?"

^^ No, not yet. But I have two
cats instead...

Cheers,

Juergen Wagner

unread,
Oct 25, 1994, 3:00:23 PM10/25/94
to
In article <id.WH3...@nmti.com>, Peter da Silva <pe...@nmti.com> wrote:
[deleted]

>I'm sorry, but putting a feature in because it's english like is just
>plain silly. Programming languages are not human languages. If you don't
>think so, there's always COBOL.
>
>The syntactic distance between
>
>> if (disbelieve $tom) {
> ask $mom;
> }
>
>and:
>
>> ask $mom if disbelieve $tom;
>
>is pretty high. The former is clearly a control structure. The latter is
>hard to pick out of code.

I agree that the syntactic differences are pretty clear.

However, you're mixing syntax and semantics. Depending on the semantics
attached to the two statements, you can have *both*, *one*, or *neither*
representing a control structure. In a rule-based system, the first
form is quite common and only declaratively formulates a rule. The
control structure comes in through the rule interpreter/inference
mechanism. Therefore, this is just a bad example.

All languages (including COBOL :-) just happen to be similar to a
context-free language which might be mistaken for a subset of English
with some mathematical notation (except for LISP, which is natural
language with parentheses :-)). It is the nature of programming
languages that they cannot be and "are not human languages". In fact,
they don't have to be.

Paul Eggert

unread,
Oct 24, 1994, 11:25:47 AM10/24/94
to
lo...@x1.cygnus.com (Tom Lord) writes:

> Since these violations of the standard are minor, if the string
> "Scheme" shows up in the name, I think that will not be too great an
> abuse of terminology.

Isn't the choice of name obvious?
``GNU Extension Scheme''.

Darin Johnson

unread,
Oct 24, 1994, 5:48:01 PM10/24/94
to
> Which brings up the following point. Any language which wants to be a
> replacement for Tcl/Tk (and I believe that any Scheme or CL serious
> attempt would just breeze through) should provide an equivalent of Tk.

This has already been done - Scheme/TK exists. I think there was a
Perl and Python port as well.
--
Darin Johnson
djoh...@ucsd.edu
"I wonder what's keeping Guybrush?"

Matthias Blume

unread,
Oct 24, 1994, 12:14:28 PM10/24/94
to

No. This only leads to confusion and subtle bugs. If it isn't
Scheme, then it should be different enough for every real Scheme to
cough up and complain about a program in that new language on first
sight. Otherwise we just add one more to the list of slightly
incompatible Scheme implementations...

--- Scheme talk --- only for the comp.lang.scheme audience ----

Many of RMS' suggestions of what needs to be changed actually
constitute inverse progress. It took a lot of time and arguing to
finally get over the non-distinction of #f and (). Now RMS walks in
and tries to tell us: ``Well folks, nice work -- but let's go back to
square one.'' I don't think so!

What are ``multiple slots'' in a symbol? I'm not even aware of one
single slot in a symbol! Multiple obarrays? Oh -- I see -- he wants
a module system. Good point!

Fluid-let? What for? (I'm aware of the fact that there are many
people who seem to think that fluid-let is indispensable. I disagree
with this point of view. Establishing error-handlers,
interrupt-handlers and so on can be done with procedures similar to
WITH-INPUT-FROM-FILE. I don't think we should sacrifice strict lexical
scoping for a few special cases.)
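
A minimal sketch of that style, assuming `dynamic-wind' (provided by most
implementations, though not required by R4RS) and an invented handler
variable:

(define *error-handler*
  (lambda (msg) (display msg) (newline)))

(define (with-error-handler handler thunk)
  (let ((outer *error-handler*))
    (dynamic-wind
      (lambda () (set! *error-handler* handler))
      thunk
      (lambda () (set! *error-handler* outer)))))

;; (with-error-handler
;;   (lambda (msg) (display "caught: ") (display msg) (newline))
;;   (lambda () ...))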

Distinguishing between upper-case and lower-case in symbols. Yes,
please! But we have been through that argument, haven't we?

Powerful catch and throw? It's there, and it's called
CALL-WITH-CURRENT-CONTINUATION. If you don't want to confuse your
``average programmer'' with the word ``continuation'' then just wrap
call/cc into some macros:

(define-syntax catch
  (syntax-rules ()
    ((_ c exp ...)
     (call-with-current-continuation (lambda (c) exp ...)))))

(define-syntax throw
  (syntax-rules ()
    ((_ c val) (c val))))
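
For instance, with these macros:

(catch k
  (display "before ")
  (throw k 'done)
  (display "never printed"))     ; returns done; only "before " is printed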

Ok, let me step down from this soapbox...

--
-Matthias

Logan Ratner

unread,
Oct 24, 1994, 5:57:47 PM10/24/94
to
In article <id.H52...@nmti.com>, Peter da Silva <pe...@nmti.com> wrote:
>But there's always been this basic assumption: that you don't add a feature
>just because it sounds good. You add it because you need it. If there's two
>ways of doing something you use the one that avoids complicating the language.
>The classic example in Perl is the postfix if statement. It doesn't add
>any capability to the language, and it confuses new users. In an extension
>language that's a bad thing... because most of the time most users are
>new users, because they're not using the language to do a job, they're
>using it to configure the tool that does the job.

Well, when I was a very new user of perl (I still consider myself a 'new' user
a year later), one of the most attractive, clear, and anti-confusing features
was the ability to write conditionals the way I would speak them. For an
English-language speaker, "do this if that" is very natural.

Now, I have an E.E. background. I can do conditionals on a gate-logic level,
and I've programmed C and C++ for years, so I knew prefix syntax, and I still
wasn't confused. In fact I'd argue that perl is less confusing because you
can write "a unless b" and C more confusing because you can't, and the
condition always gets emphasized over the action.

--
Logan Ratner | rat...@rice.edu | It is not funny that a man should die, but
CRPC/CITI | tinker | it is funny that he should be killed for so
Rice Univ. | tailor | little, and the coin of his death should be
Houston TX | cybernaut | what we call civilization - R. Chandler

Peter da Silva

unread,
Oct 24, 1994, 6:38:35 PM10/24/94
to
In article <ROCKWELL.94...@nova.umd.edu>,
Raul Deluth Miller <rock...@nova.umd.edu> wrote:
> Peter da Silva:
> . Now, if you're using an external compiler you need to run that compiler
> . from "define_key_macro".

> Depending on the definition of define_key_macro. [Some definitions
> of define_key_macro wouldn't expose any of the details of the
> extension language.]

I'm sorry, you've lost me. Either the extension language the user's
interested in is built into the executable, in which case they all
have to be, or it's got to be execed to convert the user's key macro
string into the implementation language, which gives you too much
of a performance hit, or you expose the underlying mechanics of the
implementation language to the user, which is what I thought you were
trying to avoid. What's the fourth alternative?
--