How about standardizing on an object system? I recommend Chris Haines'
Scheme Object System, which features multiple inheritance and a
meta-object protocol.
Kaveh
--
Kaveh Kardan
kar...@ERE.UMontreal.CA
Fine. Please extend Scheme (as I knew it) with a flexible
object-encapsulation/inheritance system and convenient, well
designed, portable interfaces to common OS functionalities
and libraries -- for a good example of how to do this you might
look at the Python code. If you want people to USE the language
without fear of illegality, maybe you could use a copyright which
protects your rights without restricting the USE of the language
-- like the one that applies to Python.
In the meantime, since I want a good scripting/extension language
without scary copyright restrictions and with good interfaces to
just about everything I could possibly want NOW, I'll burrow on
ahead using Python.
Aaron Watters
Department of Computer and Information Sciences
New Jersey Institute of Technology
University Heights
Newark, NJ 07102
phone (201)596-2666
fax (201)596-5777
home phone (908)545-3367
email: aa...@vienna.njit.edu
PS:
Personally, I've always found Scheme a little irritating --
ever since I read the standard text on the subject which mentions
arrays somewhere around page 400, in a footnote, without telling
you how to use one. (Do I detect MIT/NIH? Naw.)
First of all, it starts off with more of the FSF's recent attacks on Tcl, a
fairly innocuous little language that is really quite good at what it does
even if it doesn't do everything Stallman wants. This tends to make me think
that the real reason isn't to fix a problem, but to promote some subtle
political agenda.
This is reinforced when he starts attacking Sun. Attacking Sun is a great
way to get the iconoclasts on your side, but occasionally Sun does come up
with some good things (NeWS, for example, which people stayed away from in
droves, largely due to FUD) and the folks bashing Sun miss the boat (DEC
has done some good things with OSF/1, but nobody else has managed to pull
a rabbit out of that hat).
So, is the Gnu Extension Language going to be another Motif, another RFS,
another OSF/1? Some great smelly monster that even if it succeeds makes nobody
happy?
That brings me to my second problem... the list of extensions. It seems like
about half the announcement is extensions to scheme. Come on, the big advantage
to lisp-like languages is the way their simple semantics and syntax can be
used to bootstrap very powerful concepts.
What's wrong with taking some existing implementation, like STk's interpreter,
and adding a modicum of string and O/S functions? If Tcl is an unholy cross
of Lisp and Awk, this is sounding like some similarly sanctified marriage of
Lisp and Perl.
Oh well, at least it looks like he's got a clue about licensing...
--
Peter da Silva `-_-'
Network Management Technology Incorporated 'U`
1601 Industrial Blvd. Sugar Land, TX 77478 USA
+1 713 274 5180 "Hast Du heute schon Deinen Wolf umarmt?"
The standard reference for Scheme is the "Revised**4 Report on the
Algorithmic Language Scheme", which is only 55 pages long. I suspect
you're referring to "Structure and Interpretation of Computer Programs".
As the name implies, this is a text on programming, not on any particular
programming language. This book is oriented towards teaching about
advanced programming structures such as closures, streams, and
continuations, which are unique to Scheme and similar languages; mundane
features like arrays do admittedly get little coverage. The course assumes
that the student has some prior, basic programming skills, so already knows
how to use arrays.
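[For readers who haven't met closures, the idea translates into a few
lines of Python -- an editorial aside, not an example from the book: a
function can capture and retain the environment it was defined in.]

```python
# A closure captures its defining environment, so each counter
# returned by make_counter() keeps its own private state.
def make_counter():
    count = 0
    def bump():
        nonlocal count      # refer to the captured variable
        count += 1
        return count
    return bump

c1 = make_counter()
c2 = make_counter()
c1(); c1()                  # c1 has now counted twice; c2 not at all
```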
--
Barry Margolin
BBN Internet Services Corp.
bar...@near.net
1. It appears that there are to be two modules -- a compiler for a given
extension language, and a Scheme-based runtime interpreter. It sounds
like the compiler will be GPL'ed and the runtime won't be. But what if
I want to call "eval some-extension-language-command" from a running program?
Will it get compiled into Scheme on the fly (and then into C and dynamically
loaded, even)? I think this will be a big performance hit, and furthermore
it will require every application to include the GPL'ed compiler, which
defeats the purpose of unencumbering the runtime.
2. Will the Scheme runtime need something like Boehm's garbage collector?
I'm sure there are applications that can't use this sort of system -- for
example, ones that maintain pointers to objects in core in external
storage but not internally (for whatever reason).
3. Tcl is very popular in embedded applications where code size is critical.
It seems that the Scheme interpreter plus the garbage collector plus the
compiler would be a lot larger.
It's definitely a cute idea, but I'm not sure it's very practical...
Wayne
> * Distribution conditions.
> We will permit use of the modified Scheme interpreter in proprietary
> programs, so as to compete effectively with alternative extensibility
> packages.
Good.
> Translators from other languages to modified Scheme will not be part
> of any application; each individual user will decide when to use one
> of these. Therefore, there is no special reason not to use the GPL as
> the distribution terms for translators. So we will encourage
> developers of translators to use the GPL as distribution terms.
This simply means that I _won't_ use the modified Scheme in my application..
full stop. Because I consider that Tcl syntax (and even "set x 12" more than
the C-like "x=12") is much more readable than any Elisp or Scheme. I'd like to
use Scheme as a more powerful language as a developer, but I will not impose
this choice on my customers, who only know mouse manipulations or, in the best
case, Fortran! Proposing Scheme, Lisp, or a C-like syntax as an extension
language for customers is missing the issue IMHO..
And BTW, I agree that it should be possible to write large programs as scripts
or extensions (and it will be more and more the case as CPU power gets
cheaper) *but* neither Elisp nor Tcl can serve this purpose today (still IMHO).
The reason? They are not modular, they are not object oriented. You have to be
a very good and cautious programmer in order to write 100,000 lines of
*maintainable* Lisp or Tcl code.. So the conclusion (Tcl is bad) seems very
strange to me, especially given the fact that [incr Tcl] exists! I'm
surprised nobody has raised this issue.. If you really want the "very best"
extension language, you should try interpreted Eiffel rather than Elisp!!!
(which does exist, in the melting-ice development technology).
Anyway. If Tcl2Scheme is copylefted, Sun people do not have to worry any
more.. :-)
Cheers,
Christophe.
= Are you the police? -- No ma'am, we're musicians. =
>2. Will the Scheme runtime need something like Boehm's garbage collector?
I can't imagine why it would. It's fairly straightforward to have your
Scheme (or whatever) system maintain pointers into C land and vice
versa, with rather less magical support from the RTS than the Parc GC
gives (the system I've seen uses structures called "malloc" and
"stable" pointers, respectively, to point in each direction).
What GNUscript's RTS does absolutely need is a decent generational
garbage collector, so that it will provide reasonably sane interactive
performance. One of the things that regularly makes me want to kick my
workstation through a window is the GC and buffer relocation burps in
Emacs.
<b
--
Bryan O'Sullivan email: b...@cyclic.com, bosu...@maths.tcd.ie
Department of Poverty wuh wuh wuh: http://www.scrg.cs.tcd.ie/~bos
Trinity College, Dublin nous n'avons qu'un peu de mouton aujourd'hui.
What would be better is a treadmill collector (Henry Baker's algorithm)
so you could have real time performance and work with device drivers
and animation. The treadmill collector is more speed overhead, less
space overhead than generational collection. However the treadmill
collector will generally do its work while the program is waiting for
an interaction. Also it preserves the destructor semantics of C++
and only manages the objects you tell it to manage.
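[A much-simplified editorial sketch of the treadmill idea, in Python --
names are invented, and details such as the four ring segments and color
flipping are omitted: managed objects live on a doubly-linked ring (two
extra pointers per object, the space cost), and the collector advances a
few links at a time instead of stopping the world.]

```python
# Minimal treadmill-flavored sketch: a cyclic doubly-linked ring of
# cells, scanned a bounded number of links per step so collector work
# interleaves with the running program.
class Cell:
    __slots__ = ("value", "prev", "next", "scanned")
    def __init__(self, value):
        self.value = value
        self.prev = self.next = self   # two extra pointers per object
        self.scanned = False

class Ring:
    def __init__(self):
        self.head = None
        self.scan = None

    def insert(self, value):
        cell = Cell(value)
        if self.head is None:
            self.head = self.scan = cell
        else:
            tail = self.head.prev      # splice just before head
            tail.next = cell
            cell.prev = tail
            cell.next = self.head
            self.head.prev = cell
        return cell

    def step(self, k=2):
        # do a small, bounded amount of collector work: visit k cells
        for _ in range(k):
            if self.scan is None:
                return
            self.scan.scanned = True
            self.scan = self.scan.next
```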
[ comments about using a generational gc removed ]
Charles> What would be better is a treadmill collector (Henry Baker's
Charles> algorithm) so you could have real time performance and work
Charles> with device drivers and animation. The treadmill collector is
Charles> more speed overhead, less space overhead than generational
Charles> collection. However the treadmill collector will generally do
Charles> its work while the program is waiting for an
Charles> interaction. Also it preserves the destructor semantics of
Charles> C++ and only manages the objects you tell it to manage.
I'm pretty new to the GC arena, and the above confuses me. Perhaps
someone more knowledgeable can explain?
My understanding of the treadmill scheme was that it takes more space
(and less speed? I don't know) overhead than other GC schemes,
because each object requires a forward and backward link.
Also, it seems to me that pretty much any sort of GC could be written
to be incremental, which I assume is what is meant by "will do its
work while the program is waiting". I don't see why a treadmill
collector has a lock on this property.
Heck, in Emacs 19, the GC does its work while waiting for an
interaction, and Emacs has a non-incremental mark-sweep collector. It
just notices when it's been idle for a period of time, and prefers GC
then.
Last, I think the C++ thing is a red herring. At least, I don't
understand its relation to the treadmill scheme at all.
Tom
--
tro...@cns.caltech.edu Member, League for Programming Freedom
"Sadism and farce are always inexplicably linked"
-- Alexander Theroux
[Please redistribute widely]
Many software packages need an extension language to make it easier
for users to add to and modify the package.
In a previous message I explained why Tcl is inadequate as an
extension language, and stated some design goals for the extension
language for the GNU system, but I did not choose a specific
alternative.
At the time, I had not come to a conclusion about what to do. I knew
what sort of place I wanted to go, but not precisely where or how to
get there.
Since then, I've learned a lot more about the topic. I've read about
scsh, Rush and Python, and talked with people working on using Scheme
as an extension and scripting language. Now I have formulated a
specific plan for providing extensibility in the GNU system.
Who chooses which language?
Ousterhout, the author of Tcl, responded to my previous message by
citing a "Law" that users choose the language they prefer, and
suggested that we each implement our favorite languages, then sit back
and watch as the users make them succeed or fail.
Unfortunately, extension languages are the one case where users
*cannot* choose the language they use. They have to use the language
supported by the application or tool they want to extend. For
example, if you wanted to extend PDP-10 Emacs, you had to use TECO.
If you want to extend GNU Emacs, you have to use Lisp.
When users simply want "to write a program to do X or Y," they can use
any language the system supports. There's no reason for system
designers to try to decide which language is best. We can instead
provide as many languages as possible, to give each user the widest
possible choice. In the GNU system, I would like to support every
language people want to use--provided someone will implement them.
With the methods generally used today, we cannot easily provide many
languages for extending any particular utility or application package.
Supporting an extension language means a lot of work for the developer
of the package. Supporting two languages is twice as much work,
supposing the two fit together at all. In practice, the developer has
to choose a language--and then all users of the package are stuck with
that one. For example, when I wrote GNU Emacs, I had to decide which
language to support. I had no way to let the users decide.
When a developer chooses Tcl, that has two consequences for the
users of the package:
* They can use Tcl if they wish. That's fine with me.
* They can't use any other language. That I consider a problem.
Sometimes developers choose a language because they like it. But not
always. Sun recently announced a campaign to "make Tcl the universal
scripting language." This is a campaign to convince all the
developers who *don't* prefer Tcl that they really have no choice.
The idea is that each one of us will believe that Sun will inevitably
convince everyone else to use Tcl, and each of us will feel compelled
to follow where we believe the rest are going.
That campaign is what led me to decide that I needed to speak to the
community about the issue. By announcing on the net that GNU software
packages won't use Tcl, I hope to show programmers that not everyone
is going to jump on the Tcl bandwagon--so they don't have to feel
compelled to do so. If developers choose to support Tcl, it should be
because they want to, not because Sun convinces them they have no
choice.
Design goals for GNU
When you write a program, or when you modify a GNU program, I think
you should be the one who decides what to implement. I can't tell you
what language to support, and I wouldn't want to try.
But I am the leader of one particular project, the GNU project. So I
make the decision about which packages to include in the GNU operating
system, and which design goals to aim for in developing the GNU
system.
These are the design goals I've decided on concerning extension
languages in the GNU system:
* As far as possible, all GNU packages should support the same
extension languages, so that a user can learn one language (any one of
those we support) and use it in any package--including Emacs.
* The languages we support should not be limited to special, weak
"scripting languages". They should be designed to be good for writing
large programs as well as small ones.
My judgement is that Tcl can't satisfy this goal. (Ousterhout seems
to agree that Tcl doesn't serve this goal. He thinks that doesn't
constitute a problem--I think it does.) That's why I've decided not
to use Tcl as the main system-wide extension language of the GNU
system.
* It is important to support a Lisp-like language, because they
provide certain special kinds of power, such as representing programs
as data in a structured way that can be decoded without parsing.
** It is desirable to support Scheme, because it is simple and clean.
** It is desirable to support Emacs Lisp, for compatibility with Emacs
and the code already written for Emacs.
* It is important to support a more usual programming language syntax
for users who find Lisp syntax too strange.
* It would be good to support Tcl as well, if that is easy to do.
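[The "programs as data" point above can be made concrete with a hedged
Python sketch -- function names here are invented for illustration, not
from any GNU plan: an expression stored as an ordinary nested list can
be inspected, transformed, or evaluated without any parsing step, which
is exactly what Lisp s-expressions give you.]

```python
# Evaluate an expression held as plain nested lists -- the Python
# analogue of a Lisp s-expression such as (+ 1 (* 2 3)).
def evaluate(expr):
    if isinstance(expr, (int, float)):
        return expr
    op, *args = expr                      # structure, not text
    vals = [evaluate(a) for a in args]
    if op == "+":
        return sum(vals)
    if op == "*":
        result = 1
        for v in vals:
            result *= v
        return result
    raise ValueError("unknown operator: %r" % op)

program = ["+", 1, ["*", 2, 3]]           # (+ 1 (* 2 3))
```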
The GNU extension language plan
Here is the plan for achieving the design goals stated above.
* Step 1. The base language should be modified Scheme, with these features:
** Case-sensitive symbol names.
** No distinction between #f and (), for the sake of supporting Lisp
as well as Scheme.
** Convenient fast exception handling, and catch and throw.
** Extra slots in a symbol, to better support
translating other Lisp dialects into Scheme.
** Multiple obarrays.
** Flexible string manipulation functions.
** Access to all or most of the Unix system calls.
** Convenient facilities for forking pipelines,
making redirections, and so on.
** Two interfaces for call-outs to C code.
One allows the C code to work on arbitrary Scheme data.
The other passes strings only, and is compatible with Tcl
C callouts provided the C function does not try to call
the Tcl interpreter.
** Cheap built-in dynamic variables (as well as Scheme's lexical variables).
** Support for forwarding a dynamic variable's value
into a C variable.
** A way for applications to define additional Scheme data types
for application-specific purposes.
** A place in a function to record an interactive argument reading spec.
** An optional reader feature to convert nil to #f and t to #t,
for the sake of supporting Lisp as well as Scheme.
** An interface to the library version of expect.
** Backtrace and debugging facilities.
All of these things are either straightforward or have already been
done in Scheme systems; the task is to put them together. We are
going to start with SCM, add some of these features to it, and write
the rest in Scheme, using existing implementations where possible.
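[As one data point, the Lisp-style catch and throw in the list above -- a
non-local exit by tag -- can be modeled on top of ordinary exceptions.
This is an illustrative Python sketch with invented names, not the
planned implementation.]

```python
# Model Lisp catch/throw with a private exception class: throw unwinds
# to the nearest catch whose tag matches, carrying a value with it.
class _Throw(Exception):
    def __init__(self, tag, value):
        self.tag, self.value = tag, value

def throw(tag, value):
    raise _Throw(tag, value)

def catch(tag, thunk):
    try:
        return thunk()
    except _Throw as t:
        if t.tag == tag:
            return t.value
        raise                       # not ours; keep unwinding

# usage: search a nested structure and bail out on the first hit
def find_first_even(nested):
    def walk(x):
        if isinstance(x, list):
            for item in x:
                walk(item)
        elif x % 2 == 0:
            throw("found", x)
    return catch("found", lambda: walk(nested) or None)
```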
* Step 2. Other languages should be implemented on top of Scheme.
** Rush is a cleaned-up version of the Tcl language, which runs far
faster than Tcl itself, by means of translation into Scheme. Some
kludgy but necessary Tcl constructs don't work in Rush, and Tcl
aficionados may be unhappy about this; but Rush provides cleaner ways
to get the same results, so users who write extensions should like it
better. Developers looking for an extension language are likely to
prefer Rush to Tcl if they are not already attached to Tcl.
Here are a couple of examples supplied by Adam Sah:
*** To pass an array argument without copying it, in Tcl you must use
upvar or make the array a global variable. In Rush, you can simply
declare the argument "pass by reference".
*** To extract values from a list and pass them as separate arguments
to a function, in Tcl you must construct a function call expression
using that list, and then evaluate it. This can cause trouble if the
other arguments contain text that includes any special Tcl syntax. In
Rush, the apply function handles this simply and reliably.
*** Rush eliminates the need for the "expr" command by allowing infix
mathematical expressions and statements. For example, the Tcl
computation `set a [expr $b*$c]' can be written as `a = b*c' in
Rush. (The Tcl syntax works also.)
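[The apply and expr points have close analogues elsewhere; a hedged
Python rendering, with names invented for illustration: unpacking passes
list elements as separate arguments with no textual re-evaluation, so
special characters in the data are harmless, and arithmetic is plain
infix.]

```python
# Splice a list into separate arguments without re-parsing any text:
# nothing in the data can collide with command syntax, which is the
# hazard of Tcl's string-based eval approach described above.
def move(x, y, label):
    return "%s -> (%d, %d)" % (label, x, y)

args = [3, 4, "odd [text] with $pecial chars"]
result = move(*args)        # like Rush's apply: no quoting hazards

b, c = 6, 7
a = b * c                   # infix arithmetic, no "expr" command needed
```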
Some references:
[SBD94] Adam Sah, Jon Blow and Brian Dennis. "An Introduction to the Rush
Language." Proc. Tcl'94 Workshop. June, 1994.
ftp://ginsberg.cs.berkeley.edu:pub/papers/asah/rush-tcl94.*
[SB94] Adam Sah and Jon Blow. "A New Architecture for the Implementation of
Scripting Languages." Proc. USENIX Symp. on Very High Level Languages.
October, 1994. to appear.
ftp://ginsberg.cs.berkeley.edu:pub/papers/asah/rush-vhll94.*
** It appears that Emacs Lisp can be implemented efficiently by
translation into modified Scheme (the modifications are important).
** Python appears suitable for such an implementation, as far as I can
tell from a quick look. By "suitable" I mean that mostly the same
language could be implemented--minor changes in semantics would be ok.
(It would be useful for someone to check this carefully.)
** A C-like language syntax can certainly be implemented this way.
* Distribution conditions.
We will permit use of the modified Scheme interpreter in proprietary
programs, so as to compete effectively with alternative extensibility
packages.
Translators from other languages to modified Scheme will not be part
of any application; each individual user will decide when to use one
of these. Therefore, there is no special reason not to use the GPL as
the distribution terms for translators. So we will encourage
developers of translators to use the GPL as distribution terms.
Conclusion
Until today, users have not been able to choose which extension
language to use. They have always been compelled to use whichever
language is supposed by the tool they wish to extend. And that has
meant many different languages for different tools.
Adopting Tcl as the universal scripting language offers the
possibility of eliminating the incompatibility--users would be able to
extend everything with just one language. But they wouldn't be able
to choose which language. They would be compelled to use Tcl and
nothing else.
By making modified Scheme the universal extension language, we can
give users a choice of which language to write extensions in. We can
implement other languages, including modified Tcl (Rush), a Python
variant, and a C-like language, through translation into Scheme, so
that each user can choose the language to use. Even users who choose
modified Tcl will benefit from this decision--they will be happy with
the speedup they get from an implementation that translates into
Scheme.
Only Scheme, or something close to Scheme, can serve this purpose.
Tcl won't do the job. You can't implement Scheme or Python or Emacs
Lisp with reasonable performance on top of Tcl. But modified Scheme
can support them all, and many others.
The universal extension language should be modified Scheme.
Request for Volunteers
If you understand Scheme implementation well, and you want to
contribute a substantial amount of time to this project, please send
mail to Tom Lord, lo...@gnu.ai.mit.edu.
If you expect to have time later but don't have time now, please send
mail when you do have time to work. Participation in a small way is
probably not useful until after the package is released.
You got it!
>As the name implies, this is a text on programming, not on any particular
>programming language. This book is oriented towards teaching about
>advanced programming structures such as closures, streams, and
>continuations, which are unique to Scheme and similar languages; mundane
>features like arrays do admittedly get little coverage....
Quite right. As we all know, all uses of arrays are mundane and
trivial. Advanced programmers never use arrays. There are no
interesting algorithms that use arrays in interesting ways. Arrays
never come up in good classes about "advanced" programming.
In fact: didn't Turing show that all we *really* need is two stacks
of bits? Hmmm....
This gives me a really good idea for my own extension language!
[which I'll make freely copiable, but I'll restrict it from use in
any activity which makes any money in any way using a really complex
copyright: I'll take the gpl as a starting point (this'll be the
really fun part -- hey, maybe I'll start a really important movement
or revolution by forcing other people to not make money using my
program!!!).]
-a.
Ps: the real reason this "text on programming" doesn't
talk about arrays is there is no
good way to "do" arrays in functional programming, even though
arrays are the single most useful structures in real programming;
hence my irritation.
OK, here's the scenario: I want to maintain a config file (~/browse.cf, say)
that is generated by the application but that the user should have the ability
to edit. It needs to be in a language easy to automatically generate, easy
to reload, easy for external programs to maintain, and easy for the naive
user to modify.
What language would you recommend I choose? How do I provide the tools so
that the user can *also* maintain it in their language of choice?
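[For what it's worth, the requirements in this scenario -- machine
generated, reloadable, hand-editable -- can be met by something as plain
as INI-style key/value text. A hedged Python sketch using the standard
configparser module, with file I/O replaced by strings for brevity:]

```python
# Round-trip a config section that both the application and a naive
# user can edit: trivial to regenerate, trivial to reload.
import configparser
import io

def save_config(settings):
    cp = configparser.ConfigParser()
    cp["browse"] = settings
    buf = io.StringIO()
    cp.write(buf)
    return buf.getvalue()

def load_config(text):
    cp = configparser.ConfigParser()
    cp.read_string(text)
    return dict(cp["browse"])
```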
Under any such scheme as this, the language that it all ends up being in
is going to be Scheme. The translators are just not going to be used, long
term.
This is not necessarily a bad thing. Just something to keep in mind.
Yes, I had a look at it. You didn't use postfix if at all, that I could see.
> Do you want big languages and little programs or vice versa?
I want little languages and little programs. I don't believe you can't
get there (watch out, he's got something under his coat! Oh no! He's got
a Forth interpreter! Run!) from here...
> I guess all we need is a elisp-to-perl (or is that scheme-to-perl)
> translator now and even rms will be happy. :-) (Someone else reports
> working on a tcl-to-perl translator already, but progress is slow.)
I don't think that you're going to get a good translator from any of these
data-driven languages to a procedural language any time soon. Run-time
manipulation of code is too much a part of what makes them interesting.
And it's also too much a part of what makes them useful extension
languages.
You're right, they will. But for the application domain I'm interested in
(extension languages for programs running on UNIX) it's more important.
I mean in this domain I find Tcl more than adequate, and nobody is going
to tell me it has anywhere near the functionality of scheme even without
any of these extensions.
> I don't, OTOH, see why we couldn't just have a simple
> #ifdef compilation option to exclude various features from the
> language,
That turns out not to work very well. You end up with everything compiled
in anyway. We've already been down that road in the Tcl world. The real
solution to application size is dynamically loadable extensions. Things like
the UNIX I/O package and the strings package would be well suited to that.
Extensions to the language syntax are less so.
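[The load-on-demand idea above can be sketched in a few lines of Python
-- json here is just a stand-in for an application extension, and the
cache is invented for illustration: the base application stays small,
and an extension is imported only the first time someone asks for it.]

```python
# Load an extension package lazily and cache it, so unused extensions
# never cost the application any size or startup time.
import importlib

_loaded = {}

def require(name):
    if name not in _loaded:
        _loaded[name] = importlib.import_module(name)
    return _loaded[name]
```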
And the real "size" metric I'm using is more like the one Tom was using
to measure the size of Perl 5 versus Perl 4: complexity. Adding new primitives
doesn't add much to the complexity (the mental size, if you like) of the
language. Adding new control structures, or changing the basic syntax
(making it more lispy) does.
> For real tightness and power in a scripting language, I (and many others)
> would recommend using "Forth" where these things matter
OK, OK, how about Postscript? In a lot of ways it's got most of the tightness
of Forth with a lot cleaner syntax. And people are used to dealing with it.
Yes, traditional PS implementations are pretty big but that's mostly the
rendering engine...
> Anyway, when can we get it ;-) Also, I'll repeat my oft'
> expressed desire to see DOS/Windoze/Win32 versions.
You already got that with Tcl.
But Scheme is not a scripting language; it is a full, general-purpose
programming language. If the GNU extension language will be Scheme,
then why not call it Scheme? Given that we already have Emacs-Lisp,
something like GNU-Scheme would be a logical choice.
Which reminds me of this little gem:
#!/usr/local/bin/wish -f
label .lab -bd 2 -relief raised -text "So, What is wrong with using a utility"
label .lab2 -bd 2 -relief raised -text "that kills babies...I happen to like"
label .lab3 -bd 2 -relief raised -text "tcl."
pack append . .lab {top fill} .lab2 {top fill} .lab3 {top fill}
button .b1 -text "End this madness" -command exit
pack append . .b1 {top}
--
Raul D. Miller n =: p*q NB. 9<##:##:n [.large prime p, q
<rock...@nova.umd.edu> y =: n&|&(*&x)^:e 1 NB. -.1 e.e e.&factors<:p,q [.e<n
NB. public e, n, y
x -: n&|&(*&y)^:d 1 NB. 1=(d*e)+.p*&<:q
That's why I think using scheme as an intermediate language is not
a good idea. I think a lower level language would be better, forcing
a compile no matter what language you write the thing in.
Stefan
Arrays are simple in concept. SICP is supposed to teach other, more
"advanced" concepts. That doesn't mean that arrays are less used. They are
just assumed to be known!
> In fact: didn't Turing show that all we *really* need is two stacks
> of bits? Hmmm....
Yup! Unbounded stacks!
And those stacks are easier to implement with Lisp lists than with
arrays (and their fixed size).
> Ps: the real reason this "text on programming" doesn't
> talk about arrays is there is no
> good way to "do" arrays in functional programming, even though
There is, but it's still recent technology.
> arrays are the single most useful structures in real programming;
> hence my irritation.
And arrays are the single most obvious reason why most programs either
crash on big data sets or (if you're lucky) complain because it's
bigger than some arbitrary internal limit!
Stefan
Spartan minimalism? Tcl is hardly spartan... it's just designed for a
specific job and does it very well. Perl is designed for a different
job and does THAT very well. I don't think it could do Tcl's even as
well as Tcl does Perl's.
[preaching to the choir omitted]
Hey, I'm responsible for some of the features in Tcl that *are* there, like
the way strings work. Karl and I worked out the semantics of Tcl arrays on
his whiteboard when he worked here. We did Tcl Extended, because the original
language was too minimal, and a lot of that has been picked up. For that
matter we picked some ideas up from Perl... some of them didn't make the
cut and still aren't in the core language (like the filescan stuff).
But there's always been this basic assumption: that you don't add a feature
just because it sounds good. You add it because you need it. If there's two
ways of doing something you use the one that avoids complicating the language.
The classic example in Perl is the postfix if statement. It doesn't add
any capability to the language, and it confuses new users. In an extension
language that's a bad thing... because most of the time most users are
new users, because they're not using the language to do a job, they're
using it to configure the tool that does the job.
> The problem is, you see, is that quite simply, you're designing the wrong
> languages for the wrong crowd.
Who, me? I'm not designing a language at all. Or redesigning one. I'm trying
to keep a bunch of people from inventing yet another camel when the specs
don't even call for a horse.
[a bunch of stuff that doesn't seem to have anything to do with me at all,
skipped]
> When it comes to lisp or tcl, while the extensive run-time nature of
> those languages make machine language generation (at least of certain
> constructs) difficult, compiling them into native perl (probably
> with a run-time emulator library) should in theory present no insurmountable
> hurdles.
Certainly with a runtime emulator library... especially when you're running
around loading stuff on an ongoing basis at runtime and using code fragments
as your communication channel between components. And since you have to keep
doing that, what's the point to putting Perl in the loop at all? It's not
technically infeasible, it's just not very useful. And that's why I think
it's unlikely.
Don't make the user go through more work than necessary. If it bothers you
that we in English sometimes naturally express ourselves with the
conditional afterwards, use something else. It's more restrictive and
stilted and unnatural to enforce a particular style on the user. Ask your
mother if you don't believe me.
if (annoy $peter reversed("conditional")) {
use Something_Else;
}
value("flexibility") > value("restriction");
ask $mom if disbelieve $tom;
Remember, in C, you can say for(;c;) wherever you can say while(c), and
no one seems to mind that. It's the same issue. One is more readable.
You're asking for decreased legibility for no good reason. Likewise,
do {
foo();
} until $a || $b;
is somewhat better than either of these:
do {
foo();
} while !$a && !$b;
do {
foo();
} while !($a || $b);
because they make you go through more work than needed. Likewise
foreach $a (@list) {
foo($a);
}
is superior to the far busier:
for ($i = 0; $i <= $#list; $i++) {
foo( $list[$i] );
}
But so what? It's not like we should can one or the other and
force you to choose between C and shell.
[yes, much of the previous was addressed more to the thread than
to just Peter]
--tom
--
Tom Christiansen Perl Consultant, Gamer, Hiker tch...@mox.perl.com
Malt does more than Milton can
To justify God's ways to Man.
>(sigh... here we go again)
>I'd like to respond to an error in Richard Stallman's latest posting.
>Stallman said:
> Sun recently announced a campaign to "make Tcl the universal
> scripting language." This is a campaign to convince all the
> developers who *don't* prefer Tcl that they really have no choice.
> The idea is that each one of us will believe that Sun will inevitably
> convince everyone else to use Tcl, and each of us will feel compelled
> to follow where we believe the rest are going.
>Please understand that this "campaign" exists only in Stallman's mind.
>As far as I know there has never been *any* official Sun announcement
>about Tcl. There is no campaign at Sun to stamp out Tcl competitors;
>Tcl and Tk aren't even official Sun products right now, nor has Sun
>made a commitment to turn them into products (yet). If anyone has
>concrete evidence to back up Stallman's accusations, please post it
>so we can all see it.
Here's a job posting that I came across while searching on Career
Mosaic's home page. It does talk about Tcl as the universal scripting
language. Maybe Prof Ousterhout could clarify this.
Prof. Ousterhout is right in saying that negative campaigning is not
good. I'd say certainly I've heard more negative things said about
TCL, C++ etc in the scheme newsgroup than vice versa. There are really
neat things about scheme like high level macros but also not so neat
things like poor support for reuse (unable to use neat libraries
developed in C++ for example). The foreign function support in scheme
is far from good. The thing to remember is that scheme is not a
panacea for everything, it is one paradigm and providing interfaces
to other paradigms is only going to make it more acceptable.
--Suresh Srinivas
-----Job posting about Tcl from Sun -------------
Sun Microsystems Laboratories, Inc. is embarking on a new project
directed by Dr. John Ousterhout. Our goal is to make Tcl/Tk the
universal scripting language.
To accomplish this we are in the process of building a new group
which is well funded and fully dedicated to this project. This group
is under SMLI (Sun Microsystems Laboratories, Inc.) which is the
advanced technology and research arm of Sun Microsystems, Inc.
We are searching for several more individuals to join us in this
effort and play a key role in making this goal a reality.
You will help us on the development of the Tcl scripting language,
the Tk toolkit, and their extensions and applications.
The two most important projects will be a port of Tk to Windows and
Macintosh platforms, and the creation of a graphical designer for
Tk user interfaces. This will allow people to create interfaces
graphically on the screen, rather than writing scripts.
The individuals we are looking for will have solid experience
with C, C++, and experience with Tcl/Tk. We would also like to
have some expertise with MS/Windows and/or Macs.
The qualified candidate will also have a BSCS/MSCS and 5 plus
years work experience.
If you are interested in exploring this new opportunity please
follow up to:
Scott Knowles
SMLI
2550 Garcia Ave. MTV19-02
Mt.View, CA 94043
--
Suresh Srinivas Department of Computer Science
Grad Student Indiana University, Bloomington,
email: ssri...@cs.indiana.edu IN 47406.
On the other hand, Microsoft has things like visual basic, visual c, visual
c--, etc., and so they too see the need to provide various programming languages
to the user community. Or at least they see the financial benefits. What
is interesting is that friends who have these software packages indicate that
their 'widgets' (extensions, whatever you want to call them) can in many cases
be used across languages - a useful concept which doesn't seem to be making
it into the Unix arena.
--
:s Great net resources sought...
:s Larry W. Virden INET: lvi...@cas.org
:s <URL:http://www.mps.ohio-state.edu/cgi-bin/hpp?lvirden_sig.html>
The task of an educator should be to irrigate the desert not clear the forest.
: I don't, OTOH, see why we couldn't just have a simple
:#ifdef compilation option to exclude various features from the
Having to compile up multiple copies of interpreters has been tried in the
perl and tcl communities - to the frustrations of many. What I and many
others have called for are ways to dynamically load independently developed
sets of enriched command sets into a very small base interpreter. This
would allow me to tailor an application to a required set of objects
and appropriate operations/methods, while passing on pieces for which I have
no need. Why should my applications be saddled with hundreds of k of X
overhead if the app I want to develop just wants to send messages to an
existing X app - but needs to do no window instantiation at all? Equally,
if all I need to do is small integer manipulation, I would just as soon
not be saddled with bignum floats. On the other hand, if a user wishes
to write their own extended commands for my app, and in doing so determines
that _they_ need bignum floats, X, or whatever, I would like for the language
to be able to support _them_ requesting said objects be loaded, along with
appropriate operation/method library code, etc.
:language, e.g. #undef GXL_UNIXCALLS or whatever. Same for "expect"
:interface (what is expect anyway?).
Expect is a nifty concept (available at least in a Tcl and Perl form -
perhaps in other languages now as well) where one defines a set of
interactions that need to take place with one or more processes. Think of
telecomm software in the micro world which allow you to capture login
scripts and then replay them to log into services, etc. Expect is
a language where one can write 'scripts' to invoke ftp, telnet, etc.
and then generate requests, watch for responses, etc. The latest Expect,
with an extended environment known as Expectk, allows one to wrap a GUI
around a traditional text based interaction such as ftp, password changes,
whatever, in a rather nifty way. There is also a neat paper about
a feature of Expect called Kibitz - where one links two separate programs
together with expect/kibitz glue between - so that one program feeds input
to another and then receives the second's output as its input (think
of playing two chess programs against one another - not that this is
the only use, but a simple to grasp one).
: I would guess in the general case, the lion's share of apps
:using the language will have oodles of giant GUI extensions, user
:i/o validation, etc. and one average size 8bit+ color image will have
:a footprint bigger than the whole language implementation anyway! For
:real tightness and power in a scripting language, I (and many others)
:would recommend using "Forth" where these things matter - but I
It is true that many apps will be extended using many pieces of
extensions. If they are all loaded only when needed, and able to be
unloaded when not needed, this would allow an app to consume only the
resources needed at any one time. And if folk take into consideration
the 'hypertool' or applet approach, where entire mini-applications grow
up and communicate with one another, then one will find that distributed
compute resources, threading, etc. will be more heavily utilized.
Dynamic loading of extensions works just fine in Perl. Why do you think
the /usr/bin/perl binary can be just 50k? There's no longer any need for
fooperl, barperl, and flotzperl. Tcl users can use this feature if they
start their tcl programs with
#!/usr/bin/perl
use Tcl;
and go from there. No, I'm not entirely kidding.
--tom
--
Tom Christiansen Perl Consultant, Gamer, Hiker tch...@mox.perl.com
Documentation is the castor oil of programming. Managers know it must
be good because the programmers hate it so much.
Period.
To a large extent, most people design computer languages the "wrong" way.
They make them beautiful gems, with subtle properties, like deep binding,
first classness of arbitrary objects, dynamic vs lexical variables,
multiple inheritance of potentially overloaded methods, etc. These tiny,
balanced, sparkling works are some computer scientists' pride and joy, but
their subtleties escape the junior high school graduate trying to put
together a quick application.
Truth and beauty may make you feel good, but realize that your truths,
your beauties, mean nothing to the vast majority of the computer *user*
out there....
Let me correct what seems to be a common misconception. "Ivory tower"
types don't advocate these things just because they think they are
beautiful, they actually believe that these properties make programming
languages better tools for software engineering. You may disagree with
that, or you may think that software engineering isn't relevant for an
extension language, or you may think that modern beliefs about software
engineering are bunk, but you might at least have the courtesy to address
the argument.
Why is it, ... that languages like [BASIC, perl, tcl, ...] have
enjoyed the popularity they have? [] they're easy to use for
someone who hasn't had [specific education].
I don't believe, sir, that you have addressed that point, which is the
central idea upon which the rest of my posting expounds. Why not?
Because I wasn't trying to refute your entire post or even directly address
any of the points you made in it. I'm not really interested in getting
involved in this stupid argument about extension languages. I only
responded because many paragraphs of the text you wrote (I could have
quoted more) seemed to be a rant against a position your opponents don't
actually believe in. Having recently seen several other people make
essentially the same mischaracterization, I thought I might attempt to
clarify the point and perhaps improve the quality of the discussion.
Furthermore, you seem to have omitted in your citation above
a passage I find entirely relevant:
Mind you, it is not my intent to suggest that one blithely foist off
some shoddy language on the world just so that the whole world might
find itself sufficiently technically astute (that is, barely at all)
to make use it. I don't believe that having a language sufficiently
straightforward that anyone can program in it after nothing but a few
simple hours of intruction necessarily precludes its potential as a
fundamentally solid work lending itself to the necessary goals of
reliability, extensibility, and eventual optimization.
Well, I have to confess, I don't understand what either of these two
sentences are trying to say. The first one doesn't seem to parse as
English, and the second one, while seemingly grammatical, loses me
somewhere around the words "precludes its potential".
To restate and summarize, I believe that too many perfectionists (perhaps
you're a scheme programmer?) create or espouse languages which they
themselves have, through unbridled verve for cyberlinguistic beauty,
rendered relatively inaccessible to most of their potential users.
Um, well, my point was that nobody adds features to a language just because
they believe it makes the language more beautiful. They usually believe
they are increasing the language's -utility-. You seem to be
characterizing the designers of Scheme as willing to sacrifice utility for
aesthetics, but they don't see themselves that way at all, and so they
aren't likely to take your complaints seriously.
Scheme is a beautiful little language and could be a great extension/
scripting language if it had standard portable interfaces to a large
number of libraries, and if it had native object support with
inheritance -- I understand some mutant strains do...
[Python certainly does.]
I think the GPL and its variants should be changed to something
less restrictive and simpler -- in one case I know of a developer did some
work using GNU stuff and ended up ripping a lot of it out in order
to avoid the bother of complying with the terms. He says next
time he'll license proprietary source with binary distribution
rights (such things exist, and can be very nice). Clearly, in this
case, the GPL didn't encourage the use of freely copiable software.
Next time I see him I'll recommend python and its associated tools,
since he can use them however he pleases, as long as he credits the
source. A gnu-scheme would fare better in the world if it had
a copyright like the ones on TCL and python.
No more comments on books. Sorry,sorry,sorry.
I can't agree. A compile step automatically makes for a lousy extension
language, unless all the compilers are built into the binary. For a lot of
uses the extra fork/exec overhead is by itself too high. And if all the
compilers are built into the binary, then they're the extension languages
and the "low level" one is an implementation detail.
[snippet]
I'm sorry, but putting a feature in because it's english like is just
plain silly. Programming languages are not human languages. If you don't
think so, there's always COBOL.
The syntactic distance between
> if (disbelieve $tom) {
ask $mom;
}
and:
> ask $mom if disbelieve $tom;
is pretty high. The former is clearly a control structure. The latter is
hard to pick out of code.
As for C, I don't recall arguing that C is either easy to learn or that it
would make a good extension language.
On the gripping hand:
for(;read_news;)
flame();
and:
while(read_news)
flame();
retain the same basic form. They don't add to the conceptual cost of learning
and using the language.
Some extensions are useful. Foreach is like C's "+=", it takes a common
idiom and removes a lot of duplication from it. Postfix if doesn't reduce
the complexity of the statement any (there's still as many elements to
evaluate) but does add to the complexity of the language.
This is where I'm coming from: adding a feature to a language because it's
neat (and postfix if is certainly neat... it's downright cute) is a bad idea.
That way lies COBOL.
(no, I don't think Perl's COBOL. I will note that a lot of the improvements
to Perl have involved removing complexity, which defends it from that
charge quite well *and* supports my argument against un-necessary frills)
> good way to "do" arrays in functional programming, even though
> arrays are the single most useful structures in real programming;
> hence my irritation.
Howdy,
Exactly why the universal scripting language should be
an embedded APL ;-)
=============================================
Scott McLoughlin
Conscious Computing
=============================================
>>>>> "ozan" == ozan s yigit <o...@nexus.yorku.ca> writes:
In article <OZ.94Oct...@nexus.yorku.ca> o...@nexus.yorku.ca (ozan s. yigit) writes:
ozan> Mikael Djurfeldt is concerned about the suggested
ozan> modifications to the Scheme language in order to fullfill
ozan> GNU extension language goals. I do not think there is any
ozan> cause for concern. It is clear that the changes are not to
ozan> The Scheme language, which is well defined in an IEEE
ozan> standard and Revised^4 Report [...]
It was my intention only to talk about the GNU extension language.
I would like it to be a good language/implementation.
/mdj
> On the other hand, Microsoft has things like visual basic, visual c, visual
> c--, etc. and so they too see the need to provide various programming languag
> into the user community. Or at least they see the financial benefits. What
> is interesting is that friends who have these software packages indicate that
> their 'widgets' (extensions, whatever you want to call them) can in many case
> be used across languages - a useful concept which doesn't seem to be making
> it into the Unix arena.
> --
Howdy,
The X-language widgets currently in use are typically
termed "components" in the DOS world and the dominant component
format is called a "VBX", originally a Visual Basic only library
format that became very popular.
VBX's are _very_ popular. While I don't use them, most
of my colleagues do. They report that the quality is _very_
_LOW_. Crash city (GPF, core dump, whatever). Many will
say "that's an implementation issue" - which is true. But it's
also an _economic issue_ - the heavy "consumerization" of
software, including programming tools, in the DOS/Windows
world. I wouldn't recommend chasing after the MSoft/Intel/
DOS/Windows model of computing, esp. as it's shaped up in
the last year or so. I (like many others) am there - it's not
pretty.
If you're interested in VBX's, though, you can go
out and snag VB for ~100 and a large collection of VBX's
for another ~ 100 dollars or so and go to town. Heck, get
a big C++ compiler for another ~100 or so and you can
write your own VBX's. Have fun.
Gareth
>How about standardizing on an object system? I recommend Chris Haines'
>Scheme Object System, which features multiple inheritance and a
>meta-object protocol.
My $.02: unless you have a pressing need, and a compiler, a MOP is a
pain. The overhead involved (at least in the MOPpy implementations
I've seen) is just far too great.
I like Preston Briggs' attitude to this kind of thing: does all this
extra flexibility make up for the cost of all this extra flexibility?
A simple base interface such as that provided by Tiny CLOS or Dylan
should be just fine for about 99.999% of cases, I'd imagine.
<b
--
Bryan O'Sullivan email: b...@cyclic.com, bosu...@maths.tcd.ie
Department of Poverty wuh wuh wuh: http://www.scrg.cs.tcd.ie/~bos
Trinity College, Dublin nous n'avons qu'un peu de mouton aujourd'hui.
Everyone seems to be assuming that the extension language built into the
program must be the same as the extension language that the user sees.
Yet this is not so. What would be wrong with building in something that
is small, simple, and fast (e.g., Forth), and then providing tools to
compile something else to that (e.g., a gcc backend that generates Forth
instead of assembly)?
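Tim's compile-to-a-small-core idea can be sketched in a few lines; the following Python toy (all names invented, and vastly simpler than a real gcc backend) compiles arithmetic expressions into a Forth-like word list and runs it on a minimal stack machine:

```python
import ast

def compile_to_stack(expr):
    """Compile a Python arithmetic expression into a postfix word list."""
    ops = {ast.Add: "+", ast.Sub: "-", ast.Mult: "*"}
    words = []
    def walk(node):
        if isinstance(node, ast.BinOp):
            walk(node.left)
            walk(node.right)
            words.append(ops[type(node.op)])   # operator after operands
        elif isinstance(node, ast.Constant):
            words.append(node.value)
        else:
            raise ValueError("unsupported construct")
    walk(ast.parse(expr, mode="eval").body)
    return words

def run(words):
    """A toy stack machine standing in for the embedded Forth."""
    stack = []
    for w in words:
        if w in ("+", "-", "*"):
            b, a = stack.pop(), stack.pop()
            stack.append(a + b if w == "+" else a - b if w == "-" else a * b)
        else:
            stack.append(w)
    return stack.pop()

print(run(compile_to_stack("2 * (3 + 4)")))  # -> 14
```

The user-visible syntax (infix here) is arbitrary; only the tiny stack language needs to be built into the binary.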
--Tim Smith
Actually I would say that an object system is almost essential for a good
extension language. You want to be able to add new types to the language
which correspond to the elements of whatever system the language is embedded
in. Without this, you have to add tons of keywords to manage data objects
which are not really part of the language.
Python can do this, although its operator overloading syntax is a little
awkward and it might be too slow. Some of the modern functional languages
might be fast enough. Haskell comes to mind, but it has a horrendous syntax.
Another concern is that you probably want the source-code inputs to these
languages to be event driven- for X and multi-user applications. With
scheme you could do this with a preprocessor (just collect input until you
have balanced parentheses). Python's parser could probably be made event
driven as well, since it's table driven.
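The balanced-parenthesis preprocessor just described is only a few lines; here is a hedged Python sketch (it ignores strings and comments for brevity) that accepts input a line at a time and emits each form as soon as the count balances:

```python
def collect_form(lines):
    """Accumulate input lines until parentheses balance, yielding forms.

    A toy event-driven front end: each incoming line is buffered, and a
    complete form is emitted whenever the open/close count returns to
    zero. String literals and comments are deliberately not handled.
    """
    buf, depth = [], 0
    for line in lines:
        buf.append(line)
        depth += line.count("(") - line.count(")")
        if depth == 0 and buf:
            yield " ".join(buf)
            buf = []

forms = list(collect_form(["(define (f x)", "  (* x x))", "(f 3)"]))
# Two complete forms come out: the definition and then the call.
```

A real version would track quoting, but the structure (buffer, count, emit) is the whole trick.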
--
/* jha...@world.std.com (192.74.137.5) */ /* Joseph H. Allen */
int a[1817];main(z,p,q,r){for(p=80;q+p-80;p-=2*a[p])for(z=9;z--;)q=3&(r=time(0)
+r*57)/7,q=q?q-1?q-2?1-p%79?-1:0:p%79-77?1:0:p<1659?79:0:p>158?-79:0,q?!a[p+q*2
]?a[p+=a[p+=q]=q]=q:0:0;for(;q++-1817;)printf(q%79?"%c":"%c\n"," #"[!a[q-1]]);}
This is always taken for granted by you proponents of single-namespace
Lisps. I argue that most people "intuitively" have no problems at all
distinguishing the several meanings of a symbol. I have never seen
anybody who was confused about the predefined meanings of the word
CONS as a type specifier and an operator (function) in Common Lisp,
for example. Personally I would rather expect people to be surprised
when they use a variable with a generic name like LIST, and find that
they have redefined a system function in the scope.
I admit that from a formalism-aesthetic point of view, multiple
namespaces are not as nice ("violating the 0-1-infinity principle"),
but this does not mean that a LISP-1 is more "intuitive" than a LISP-2.
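Python, itself effectively a single-namespace language, exhibits exactly the LIST surprise mentioned above; a small illustrative sketch (the function name is arbitrary):

```python
def demo():
    # Binding a variable to a generic name shadows the function of
    # the same name in this scope -- the single-namespace hazard.
    list = [1, 2, 3]        # shadows the built-in constructor
    try:
        list("abc")         # no longer callable: it's our variable now
    except TypeError:
        return "shadowed"
    return "still a function"

print(demo())  # -> shadowed
```

In a Lisp-2 the variable binding and the function binding of the same symbol would coexist, and the call would still work.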
Mikael> Because of this most lisp code doesn't use both value and
Mikael> function bindings simultaneously.
It is true that most symbols in a given program don't have a variable
and a function binding at the same time. But the reason might also be
that "intuitive" names for functions and variables are often (but not
always!) different.
--
Simon.
Of course, the real problem is how to do this across all platforms - how
does Perl handle all the different Unix, MS-DOS, etc. limitations on
dynamically loaded objects? (I have the Perl5 tech doc printed - it's
just that my speed reading skills keep getting cancelled out by kids
wanting my attention...)
Similarly, it's "trivial" to add (e.g.) property lists to scheme --
just introduce a global variable that's a hash table indexed by
symbols.
These are just examples -- their point is to show that adding
features to scheme needn't destroy the underlying language.
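The separate-table approach to property lists is easy to make concrete; here, purely as illustration, a Python sketch (a dict standing in for the global hash table indexed by symbols, with strings standing in for symbols):

```python
# One global table mapping a symbol to its property table --
# no change to the core language is required.
properties = {}

def put_prop(symbol, key, value):
    """Attach a property to a symbol."""
    properties.setdefault(symbol, {})[key] = value

def get_prop(symbol, key, default=None):
    """Look up a property, returning a default if absent."""
    return properties.get(symbol, {}).get(key, default)

put_prop("my-command", "interactive-spec", "p")
print(get_prop("my-command", "interactive-spec"))  # -> p
```

The same trick covers other per-symbol annotations (documentation strings, interactive argument specs, and so on) without touching the language itself.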
The main reason for the acceptance of Tcl is the '/Tk' suffix.
I cannot see anybody who would like to learn a language like Tcl if it
were not for all the "Value Added" by Tk.
I am usually amazed by the ease with which you can build interfaces
with Tk. I also think that that is what makes the whole thing so
appealing.
Which brings up the following point. Any language which wants to be a
replacement for Tcl/Tk (and I believe that any serious Scheme or CL
attempt would just breeze through) should provide an equivalent of Tk.
Thanks for the attention.
--
Marco Antoniotti - Resistente Umano
-------------------------------------------------------------------------------
Robotics Lab | room: 1220 - tel. #: (212) 998 3370
Courant Institute NYU | e-mail: mar...@cs.nyu.edu
...e` la semplicita` che e` difficile a farsi.
...it is simplicity that is difficult to make.
Bertholdt Brecht
Using an arbitrary syntax to avoid favoritism:
display "Enter keystroke: "
read keystroke
display "Enter macro: "
read macro
define_key_macro %keystroke %macro
Now, if you're using an external compiler you need to run that compiler
from "define_key_macro".
Now suppose you're reading these from an X resource at startup. You're going
to have to call the compiler for *each* resource in turn.
Sorry, it just don't work. The underlying language *is* going to be exposed
to the user.
That doesn't mean that Scheme isn't a fine language for this, just that it's
nonsense to pretend the underlying language doesn't matter.
Because you can't maintain the system then. If your users get to go off and
pick any translator they want, you'll have to learn every available language
in order to debug your users' scripts. I don't believe that it is practical for
you to expect to be able to debug a script written in some unknown language that
was machine translated into scheme, or forth, or whatever. (I've argued this same
point with the Dylan people to no avail.)
If you're not going to let them pick, then you might just as well force them
to use the same language as you picked.
Mike McDonald m...@trantor.ess.harris.com
No question about it.
But because most of them haven't written a program that had to
compete in any real marketplace for years (maybe ever) they are
usually wrong. [I don't want to start any more flame wars than I have
to, so I omit an example here.]
>You may disagree with
>that, or you may think that software engineering isn't relevant for an
>extension language, or you may think that modern beliefs about software
>engineering are bunk....
Not at all. When I talk with people who have real experience with
large projects I generally think their insights are very valuable.
I rarely hear anything about the "Von Neumann bottleneck", however;
this kind of bunk only comes from academics (like myself).
When I was struggling with some of the design decisions of Python,
Guido Van Rossum (python creator and god) once flamed me that "Your
intuitions are generally wrong" (I think he was using the universally
quantified "you", but I can't be sure). This pissed me off at the time.
As it turned out he was right on the particular point of discussion.
He may be right about academic intuitions and "research" (that is,
guesswork) about programming languages more generally.
One of the things I like about python is that it clearly has not
been overly influenced by some of the goofier and less generally
useful programming language ideas that academics have failed to
impose on the world outside of the fishbowl.
Aaron Watters
Department of Computer and Information Sciences
New Jersey Institute of Technology
University Heights
Newark, NJ 07102
phone (201)596-2666
fax (201)596-5777
home phone (908)545-3367
email: aa...@vienna.njit.edu
ps: check out http://www.cwi.nl/~guido/Python.html
oz
---
a technology is indistinguishable from | electric: o...@nexus.yorku.ca
its implementation. -- Marshall Rose | or o...@sni.ca [416 449 9449]
But, I feel concerned about three of the suggested modifications to
the language. According to RMS "[it] is desirable to support Scheme,
because it is simple and clean." This is indeed one of the great
strengths of Scheme, and it's extremely important not to compromise
it.
In "The GNU extension language plan" we find:
** Extra slots in a symbol, to better support
translating other Lisp dialects into Scheme.
I suspect that you intend to add function and property-list bindings
to symbols. Having symbols associated with multiple slots makes the
language complicated, regardless of their use or implementation.
But, let's take a closer look:
I take it for granted that you're going to use the value-binding for
symbols occurring in the first position of a form. To have another
scheme of evaluation for operators would simply make it a different
language. It would make "Modified Scheme" incompatible with Scheme
and would make the code clumsier.
Apart from the conceptual burden, extra symbol bindings would
complicate the implementation and make symbols "heavier", thus slowing
down code which creates new symbols.
The property list could easily be implemented in scheme as a separate
table.
Assuming that symbols have a function binding, how would you translate
elisp into "Modified Scheme"?
One way to do it is to use a selector `function-binding' to access the
function binding and a mutator `set-function!' for modification.
(Both are special forms which don't evaluate their first argument.)
Then some translations could be:
(* 2 3) --> (* 2 3)
(g 3) --> ((function-binding g) 3)
(Of course one could have the same value in both value and function
bindings of primitive operators, which would make the translation
simpler.)
(defun g (x) (* 5 x)) --> (if (not (defined? g))
(define g #f))
(set-function! g (lambda (x) (* 5 x)))
(funcall 'g 3) --> ((function-binding g) 3)
(funcall func 3) --> ((eval (list 'function-binding func)
(interaction-environment))
3)
(1. I don't know how you are going to implement dynamic variables,
therefore I ignore the dynamic - lexical binding issues in my
examples.
2. I used the eval from the proposal in Jonathan Rees "The Scheme
of Things: The June 1992 Meeting" which can be found in the
Scheme Repository.
3. As far as I understand, implementing `function-binding' as a
procedure would put the same kind of restrictions on the
implementation as first class environments do.)
I can't think of any better way to use the extra binding. How are you
going to use it?
In fact, most people intuitively think about symbols as having only
one binding. Because of this most lisp code doesn't use both value
and function bindings simultaneously.
Wouldn't it be a much simpler solution, then, just to drop the
distinction and manually correct code which uses both bindings?
(* 2 3) --> (* 2 3)
(g 3) --> (g 3)
(defun g (x) (* 5 x)) --> (define x (lambda (x) (* 5 x)))
(funcall 'g 3) --> (g 3)
(funcall func 3) --> ((eval func (interaction-environment)) 3)
This obviates the need of the extra function binding.
From the "multiple extension languages" perspective this would of
course lead to an odd dialect of lisp. If this is a big issue there
are still other solutions. E.g., having multiple obarrays opens the
possibility of using a different name space for functions in
translated code...
** A place in a function to record an interactive argument reading
spec.
Such interactive argument specs are particular to certain
applications, like emacs. Why make a change in the Scheme language
when it could be implemented as a separate table? The elisp ->
M. Scheme translator could generate code for maintenance of this
table. Or, `defun' could be implemented as a macro which augments the
table.
** An optional reader feature to convert nil to #f and t to #t,
for the sake of supporting Lisp as well as Scheme.
Alternatively one could define the symbol nil as #f and t as #t,
thereby not changing characteristics of the language.
Mikael Djurfeldt
: This sounds a lot like OpenWindows vs. Motif vs. Athena/MIT for
: widgets and window managers. Did the user community benefit from all
: this choice ? It made it difficult to have a program with a GUI run
: on all platforms and it created a new industry of companies writing
: GUI independent libraries thereby making it even more difficult to
: share software between users. One party that wasn't adversely affected
: by the inability of the Unix vendors to agree on a "standard" was
: Microsoft - that's for sure.
The image that comes to mind is a tcl rat preoccupied and in mortal combat
with a scheme opossum all the while a Tyrannosaurus Basicus bears down:
"Yummm......both white meat and dark meat for lunch today....."
: Tom Moog
--
-Matt Kennel m...@inls1.ucsd.edu
-Institute for Nonlinear Science, University of California, San Diego
-*** AD: Archive for nonlinear dynamics papers & programs: FTP to
-*** lyapunov.ucsd.edu, username "anonymous".
>>A simple base interface such as that provided by Tiny CLOS or Dylan
>>should be just fine for about 99.999% of cases, I'd imagine.
>But Tiny CLOS *does* have a mop.
The base interface (which just provides generics, classes, and objects)
can be used without any explicit reference to the MOP on the part of
the programmer. That was my point.
Ah, MIT. Should have known. Impractical in the extreme.
Well, the day there are more scheme users than perl
users I'll eat my words. Until then, you're unrealistic about
what "commoners" need.
--tom
I get the feeling you're not reading what I'm writing. I haven't said a
word yet that would reveal my own opinion about what makes a good extension
language and whether it should be Scheme or something else. (You might be
surprised -- especially since you seem to think that the substring "MIT" in
my electronic mail address immediately renders my opinions suspect.) I've
been quite careful in my previous two messages to leave open the
possibility that perl and tcl really -are- better extension languages than
Scheme.
The Pro-Life movement calls their opponents baby killers. The Pro-Choice
movement says their opponents want to enslave women. In neither case does
the other side describe -themselves- in those terms. No Pro-Choicer says
"yes, I am in favor of killing babies". No Pro-Lifer says "yes, I think
women should be slaves". The descriptions are wonderful debating tactics,
helpful for converting the undecided, good for improving the resolve of
those who already agree, but truly terrible as a basis for having any kind
of meaningful discussion.
The Anti-Scheme movement often says their opponents value beauty above
utility. Of course no Pro-Schemer actually thinks that. It's a good
debating tactic, helpful for converting the undecided, good for improving
the resolve of those who already agree, but not very good for getting the
opposition to take you seriously. And since you are posting in
comp.lang.scheme (or the Scheme Digest), I assume you are hoping that the
people who read that newsgroup will seriously consider your arguments.
And let me emphasize that it wasn't just -your- post that contained this
characterization of Scheme proponents. If it was, I wouldn't be wasting my
breath on this. It seemed to me to be a quite common tack that many other
people were taking as well.
Syntaxes other than Scheme are just ordinary extensions.
but so is scheme syntax itself. adam shah's architecture does not mention
the fact that most of the scheme syntax is an "extension" on top of a simpler
core language containing not much more than symbols, constants, quote,
define, lambda, begin, if, set!, prim, proc etc.
oz
It's not Scheme, but wouldn't it be nice if it could be as close to
Scheme as possible?
Again, sorry to jump into what's obviously an ongoing and heated
discussion, but, although I completely agree with most of what Peter says
(particularly the first paragraph above), the ending "the latter is hard to
pick out of code" is a rather silly thing to assert. If you don't know the
language, of course it's hard. If you don't know the language (and it
doesn't resemble something you *do* know, such as COBOL vs. English :-),
anything is hard.
If you _Know_ the language, it's as natural as can be. At least that's my
opinion. That it's not natural for you is yours.
The problem with perl is that it resembles C in many ways. This is a
double-edged sword. It's good in that what is similar is, uh, similar.
It's bad in that what's not, isn't, and it's not always apparent what is
and isn't similar (that make any sense?). Anyone familiar with Japanese
will see the same double-edged sword in romaji, the expression of Japanese
using "English" letters.
The biggest trap of perl resembling C/sed/awk/COBOL/English/whatever is
that it can seduce a beginner. If you program in perl while Thinking in C,
your perl will suck bigtime.
Let me say that again: If you program in perl while Thinking in C, your
perl will suck.
Over the years I've done real, large, non-academic projects in some wild
languages (including FORTH, a nostalgic favorite), so had a pretty wide
range of experience when I first encountered perl in the late 2.x stage.
It took me a *long* time to get to really _Know_ perl (i.e. in the biblical
sense :-). But once I was able to Think in perl, it was magical, just as
when I was finally able to think in Japanese.
*jeffrey*
-------------------------------------------------------------------------
Jeffrey E.F. Friedl <jfr...@omron.co.jp> Omron Corporation, Kyoto Japan
See my Jap/Eng dictionary at http://www.omron.co.jp/cgi-bin/j-e
or http://www.cs.cmu.edu:8001/cgi-bin/j-e
Peter da Silva writes:
I'm sorry, you've lost me. Either the extension language the user's
interested in is built into the executable, in which case they all
have to be, or it's got to be execed to convert the user's key macro
string into the implementation language, which gives you too much
of a performance hit, or you expose the underlying mechanics of the
implementation language to the user, which is what I thought you were
trying to avoid. What's the fourth alternative?
The ``fourth alternative'' is this: the parser and translator for a
user's favorite syntax is loaded into the running program on demand.
Thus, it is as easy to use as if built-in, but without the associated
costs of building in all languages.
Syntaxes other than Scheme are just ordinary extensions.
-t
--
----
If you would like to volunteer to help with the GNU extension language
project, please write to lo...@gnu.ai.mit.edu.
>>> "Paul" == egg...@twinsun.com (Paul Eggert) wrote:
Paul> Isn't the choice of name obvious?
Paul> ``GNU Extension Scheme''.
Paul> We can call it ``escheme'' for short.
No no no ... it should be called `gel', particularly if it's going to be
used to "glue" all the diverse GNU applications together. :-)
- Mike W.
Hey, I like that! gel == GNU Extension Language? (or Glue Extension
Language? Or perhaps GlNUe Extension Language?) Pronounced `jel'? I
like that a lot!
--
Kevin K. Lewis | My opinions may be unreasonable
lew...@aud.alcatel.com | but such is the voice of inspiration
gescheme
Has a sort of middle-european feel to it. A nice cuddly sort of name.
> In article <22Oct1994....@LCS.MIT.EDU> Al...@lcs.mit.EDU (Alan Bawden) writes:
> ...
> >Let me correct what seems to be a common misconception. "Ivory tower"
> >types don't advocate these things just because they think they are
> >beautiful, they actually believe that these properties make programming
> >languages better tools for software engineering.
> No question about it. But because most of them haven't written a
> program that had to compete in any real marketplace for years (maybe
> ever) they are usually wrong.
Aren't you the guy who came out to play the blazing bozo
w.r.t. SICP? Maybe if you'd actually read the book, and there are
undoubtedly other fine ones such as it, you'd know what "academics"
actually mean when they speak of software engineering tools.
Competition in the marketplace is not, and should not be, the
principal method of judging the quality of a paradigm. It just
happens to be a method we find convenient and that is widespread.
> He may be right about academic intuitions and "research" (that is,
> guesswork) about programming languages more generally.
Ahem. Have you actually seen what programming languages people do? I
daresay some of the work I've studied is rather far from "guesswork".
(A good example: control operators.)
> One of the things I like about python is that it clearly has not
> been overly influenced by some of the goofier and less generally
> useful programming language ideas that academics have failed to
> impose on the world outside of the fishbowl.
Charming.
> Let me correct what seems to be a common misconception. "Ivory tower"
> types don't advocate these things just because they think they are
> beautiful, they actually believe that these properties make programming
> languages better tools for software engineering.
In article <1994Oct24.1...@njitgw.njit.edu>,
aa...@funcity.njit.edu writes:
> No question about it.
> But because most of them haven't written a program that had to
> compete in any real marketplace for years (maybe ever) they are
> usually wrong. [...]
The problem is not that these beautiful concepts aren't practical,
it's that practical implementations of these beautiful concepts are
relatively rare.
I am a `real-world' programmer (in fact, I've never taken a single
computer science class in my life), and I am constantly trying to
simulate the kind of functionality available in higher-level languages
in the pain-in-the-ass language of C++ (in the name of compatibility).
Even the existence of C++ itself is testimony to fact that
higher-level abstractions are useful in `real-world' programming.
It's also testimony to how reluctant people are to give up what they
know in pursuit of these higher-level abstractions.
(None of this is to deny that there is esoteric, impractical, or just
plain bad research and development going on in academia. After all, a
university's purpose is to teach people (I assume). And one great way
to learn things is to make mistakes.)
thant
How about 'Gleem' - rhymes and has a "G" and it shines ;-)
Gleem is good! Someone else suggested `gel' (GNU Extension
Language?); it makes you think of sticky things like glue.
I think we're on track!
In article <MIKEW.94O...@gopher.dosli.govt.nz> mi...@gopher.dosli.govt.nz (Mike Williams) writes:
>>> "Paul" == egg...@twinsun.com (Paul Eggert) wrote:
Paul> Isn't the choice of name obvious?
Paul> ``GNU Extension Scheme''.
Paul> We can call it ``escheme'' for short.
No no no ... it should be called `gel', particularly if it's going to be
used to "glue" all the diverse GNU applications together. :-)
- Mike W.
Hey, I like that! gel == GNU Extension Language? (or Glue Extension
Language? Or perhaps GlNUe Extension Language?) Pronounced `jel'? I
like that a lot!
How about GLU, the ``GNU Language Utility'' (given that it is really
intended to be a sort of assembly language underneath a possible variety
of faces) or the ``GNU Lisp Utility''.
We must appreciate the importance (perhaps) of retaining a hard G ;-)
--
-------------------------------------------------------------------------------
Lee Iverson SRI International
le...@ai.sri.com 333 Ravenswood Ave., Menlo Park CA 94025
(415) 859-3307
Interpreted languages allow you to write programs generating pieces of
other programs. This is something widely used in LISP-like languages.
Also, Tcl has that feature (you can write your own programming
environment on top of Tcl, including debugger and interpreter).
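This program-building trick can be sketched in a few lines of Scheme (a minimal sketch; `eval' and `interaction-environment' are not in R4RS proper, but most implementations provide something equivalent):

```scheme
;; Code is just list data: build a lambda-expression at run time,
;; then hand it to the evaluator to turn it into a real procedure.
(define (make-adder-code n)
  `(lambda (x) (+ x ,n)))

(define add5 (eval (make-adder-code 5) (interaction-environment)))

(add5 10)   ; => 15
```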
As for the suggestion that a compilation would be performed no matter
what language a function or program is written in, the utility of this
depends on what you mean by "compilation". In the classical sense
(C/C++ to assembly language), this will be just another language like
C or others. Therefore, it is difficult to find a rationale for that
new language (why not just take C or C++ and have the compiler
translate your scripting language to that?).
However, if we view compilation as a process similar to
byte-compilation in Smalltalk, Xerox LISP, or Emacs LISP, this makes
sense. In fact, it would support the idea of platform independence as
well as syntax independence. This is along the lines of my postulate
from an earlier message in that thread, that syntax independence must
be one of the features of this extension language. What we have to
agree upon are the primitives and the data type model of the language
..ummm... environment.
--Juergen Wagner
Juergen...@iao.fhg.de
gan...@csli.stanford.edu
Fraunhofer-Institut fuer Arbeitswirtschaft und Organisation (FhG-IAO)
Nobelstr. 12c Voice: +49-711-970-2013
D-70569 Stuttgart, Germany Fax: +49-711-970-2299
<a href = http://www.iao.fhg.de/Public/leute/jrw-en.html>J. Wagner</a>
The GNU extension language should, of course be called "TINT":
TINT Is Not TCL
Thomas.
I don't think it should have "Scheme" in the name unless it can run
R4RS programs, which it can't if (for example) (eqv? '() #f). I'm not
opposed to a language with the proposed 'extensions' to Scheme, but
that language is not Scheme.
Some people have been writing 'GNUel' to refer to the proposed
language in this thread... I propose 'gnudel'---short 'e', rhymes with
"noodle"---the 'd' is for 'dynamic' which is nice because this is a
salient feature of all the extension languages discussed and makes the
acronym pronounceable.
> Dynamically loaded libraries. Possibly configured on a per-user
> basis.
That'd work. Would be a bummer if you saved your configuration when you had
your environment in "scheme mode" and then tried to reload your config file
in "Rush mode" though. Make sure your API supports having multiple DLLs
loaded at once.
(Hrm. GCC is a bit big for a DLL)
That's a good one. AREXX works that way. You need to be very careful about
the interface, though... AREXX requires you implement your own baby
extension language to parse AREXX's input, and the interface is butt
ugly. Karl wrote a Tcl interface for the Amiga that just passed argvs
around, and it worked pretty well. A more general interface would want to
use type-tagged elements.
Also, if you're using dynamic libraries to implement the high level extension
language, why have a low level one at all (to pull in another thread), just
define the extension language API and dynamic load the one you want.
That requires dynamic libs, you say? Well, it seems to me that any practical
multiple extension language arrangement is going to. It's too expensive to
fork-and-exec all the time, and you don't want to link everything in, so why
not just depend on dynamic loading and make scheme merely one option of
many?
> >I can't agree. A compile step automatically makes for a lousy extension
> >language, unless all the compilers are built into the binary.
> However, if we view compilation as a process similar to
> byte-compilation in Smalltalk, Xerox LISP, or Emacs LISP, this makes
> sense. In fact, it would support the idea of platform independence as
Which is what I mean by "all compilers are built into the binary". Now
your bytecode is, as I've said, an implementation detail. Do you really
mean that every program that uses GNUel should include Scheme and Rush
and Perl and Elisp?
No no. GSP. GNU Spackling Putty.
(although I think the "GNU Extension Scheme" has a nice subtlety to it)
--
Darin Johnson
djoh...@ucsd.edu
Ensign, activate the Wesley Crusher!
Odd, I'm a native English speaker, and I think "if this, do that"
is very natural as well.
Of course, in other languages, "this if, that do" is natural,
should we also support that sort of thing?
--
Darin Johnson
djoh...@ucsd.edu
Where am I? In the village... What do you want? Information...
> GNU Script and GNU EL (with minor variations) are both
> potentially confusing, which is a pity. GNU Scheme is simply
> inaccurate. While on the topic, though, anyone else notice that
> "Scheme" backwards....
How about Gingel? (Gingel Is Not the GNU Exte....NO CARRIER)
--Mitch
Mitchell Wand Internet: wa...@ccs.neu.edu
College of Computer Science, Northeastern University
360 Huntington Avenue #161CN, Boston, MA 02115 Phone: (617) 373 2072
World Wide Web: http://www.ccs.neu.edu/home/wand Fax: (617) 373 5121
>>>>> "Peter" == Peter da Silva <pe...@nmti.com> writes:
>> Under any such scheme as this, the language that it all ends up
>> being in is going to be Scheme. The translators are just not going
>> to be used, long term.
> I don't think this is necessarily true, but you may be right.
Doesn't this argue for not embedding *any* extension language in a program but
instead defining a request/reply API that programs must conform to? Isn't
this the model that C uses (ie. think of functions as a request to something
for service). Then, any language that handles the API can be an extension
language.
Think performance might be a problem? Short-circuit the request code to look
in the local services list to see if the request could be handled locally.
Afraid of programs that *might* build in a specific language? Who cares! As
long as it supports the API, I'll start up my language processor that makes
requests according to the API and bypass the in-built language.
Possibilities?
--
==================================================================
David Masterson KLA Instruments
408-456-6836 160 Rio Robles
dav...@prism.kla.com San Jose, CA 95161
==================================================================
I only speak for myself -- it keeps me out of trouble
An interpreter is supposed to interpret textual representations and
map them to some primitive operations of the language. None of the
languages I mentioned actually compile every statement to an
intermediate code. My point is: define a set of primitive operations
and put an appropriate interpreter or compiler for the language of
your choice on top. This way, "the language" software would consist of
the bytecode interpreter, plus some "canonical" syntax for the
extension language, not necessarily for all languages.
However, it probably doesn't make sense arguing too much about
architectural or syntactic issues at this time. The question is more
what kind of semantics (i.e., primitives, data types, and control
structures) we need. Next, we can talk about how to represent that in
different syntactic forms, and how this can be mapped to a system
architecture.
In principle, *any* interpreted (or interpretable), universal
programming language would qualify as a candidate for *the* extension
language. What we should do now is to define some requirements for the
semantics of that language. If there is a 1:1 translator (maybe
through an intermediate code), it wouldn't matter much if we wrote
proc fact (n) {
  if (n == 0) { return 1; }
  else { return fact(n-1)*n; }
}
instead of
(defun fact (n)
(if (= n 0) 1
(* (fact (- n 1)) n)))
or anything else. One thing which should be avoided (because it
endangers the ease of use of such a language) is mandatory type
declarations. Typing should be dynamic as in LISP. Multiple (also
user-defined) types should be part of the language concept, unlike in
Tcl.
The second important point is what is supposed to make that language
so special as an extension language, i.e., which extra features
particularly qualify the selected language better than others? Here,
two issues are important: data exchange between code written in the
extension language and code written in other languages, and control
flow between such pieces of code from different languages.
We should keep in mind that this extension language business doesn't
aim at developing a new language, but may take an existing one and
make it fit the requirements with as few modifications as possible.
>Peter da Silva `-_-'
>Network Management Technology Incorporated 'U`
>1601 Industrial Blvd. Sugar Land, TX 77478 USA
>+1 713 274 5180 "Hast Du heute schon Deinen Wolf umarmt?"
                                               ^^ No, not yet. I have two
                                                  cats instead...
Cheers,
I agree in that the syntactic differences are pretty clear.
However, you're mixing syntax and semantics. Depending on the semantics
attached to the two statements, you can have *both*, *one*, or *none*
representing a control structure. In a rule-based system, the first
form is quite common and only declaratively formulates a rule. The
control structure comes in through the rule interpreter/inference
mechanism. Therefore, this is just a bad example.
All languages (including COBOL :-) just happen to be similar to a
context-free language which might be mistaken as a subset of English
with some mathematical notation (except for LISP which is natural
language with parentheses :-)). It is the nature of programming
languages that they cannot be, and are not, human languages. In fact,
they don't have to be.
In article <38jdk4...@ford.is.wdi.disney.com> th...@disney.com (Thant Tessman) writes:
>The problem is not that these beautiful concepts aren't practical,
>it's that practical implementations of these beautiful concepts are
>relatively rare.
>...I am constantly trying to
>simulate the kind of functionality available in higher-level languages
>in the pain-in-the-ass language of C++ (in the name of compatibility).
Don't get me started on C++.
>Even the existence of C++ itself is testimony to fact that
>higher-level abstractions are useful in `real-world' programming.
>It's also testimony to how reluctant people are to give up what they
>know in pursuit of these higher-level abstractions.
Actually, C++ is mostly syntactic sugar on C. It's not all that
high level -- it aims close to the right height, but misses the barn.
>(None of this is to deny that there is esoteric, impractical, or just
>plain bad research and development going on in academia. After all, a
>university's purpose is to teach people (I assume).
Tell that to the tenure committee. And there is no such thing as bad
research, especially by a well known or powerful person (first rule of
academia: never criticize anyone to their face).
>And one great way
>to learn things is to make mistakes.)
If anyone ever realized they'd made a mistake this might be true.
Academics too often take Stalin's approach: if something doesn't
work DO IT HARDER (and make sure all your grad students and the
junior faculty are doing it too). Popper once said that paradigms
change when the old guys retire, and in academia at least I think
this is true. -a.
> Since these violations of the standard are minor, if the string
> "Scheme" shows up in the name, I think that will not be too great an
> abuse of terminology.
Isn't the choice of name obvious?
``GNU Extension Scheme''.
This has already been done - Scheme/TK exists. I think there was a
Perl and Python port as well.
--
Darin Johnson
djoh...@ucsd.edu
"I wonder what's keeping Guybrush?"
In article <OZ.94Oct...@nexus.yorku.ca> o...@nexus.yorku.ca (ozan s. yigit) writes:
Mikael Djurfeldt is concerned about the suggested modifications to the
Scheme language in order to fulfill GNU extension language goals. I do
not think there is any cause for concern. It is clear that the changes are
not to The Scheme language, which is well defined in an IEEE standard and
Revised^4 Report. The language described in the RMS note may at best be
characterized as a derivative of Scheme, and I doubt it would be called
Scheme, any more than Dylan is called Scheme.
It's not Scheme, but wouldn't it be nice if it could be as close to
Scheme as possible?
No. This only leads to confusion and subtle bugs. If it isn't
Scheme, then it should be different enough for every real Scheme to
cough up and complain about a program in that new language on first
sight. Otherwise we just add one more to the list of slightly
incompatible Scheme implementations...
--- Scheme talk --- only for the comp.lang.scheme audience ----
Many of RMS' suggestions of what needs to be changed actually
constitute inverse progress. It took a lot of time and arguing to
finally get over the non-distinction of #f and (). Now RMS walks in
and tries to tell us: ``Well folks, nice work -- but let's go back to
square one.'' I don't think so!
What are ``multiple slots'' in a symbol? I'm not even aware of one
single slot in a symbol! Multiple obarrays? Oh -- I see -- he wants
a module system. Good point!
Fluid-let? What for? (I'm aware of the fact that there are many
people who seem to think that fluid-let is indispensable. I disagree
with this point of view. Establishing error-handlers,
interrupt-handlers and so on can be done with procedures similar to
WITH-INPUT-FROM-FILE. I don't think we should sacrifice strict lexical
scoping for a few special cases.)
Distinguishing between upper-case and lower-case in symbols. Yes,
please! But we have been through that argument, haven't we?
Powerful catch and throw? It's there, and it's called
CALL-WITH-CURRENT-CONTINUATION. If you don't want to confuse your
``average programmer'' with the word ``continuation'' then just wrap
call/cc into some macros:
(define-syntax catch
(syntax-rules ()
((_ c exp ...)
(call-with-current-continuation (lambda (c) exp ...)))))
(define-syntax throw
(syntax-rules ()
((_ c val) (c val))))
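For instance, using the macros above (in any Scheme with R4RS high-level macros), a non-local exit looks like:

```scheme
;; (catch k body ...) binds k to an escape procedure for the whole form;
;; (throw k val) jumps out immediately, making val the form's value.
(catch k
  (+ 1 (throw k 42)))   ; => 42, the (+ 1 ...) is abandoned
```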
Ok, let me step down from this soapbox...
--
-Matthias
Well, when I was a very new user to perl (I still consider myself a 'new' user
a year later) one of the most attractive, clear, and anti-confusing features
was the ability to write conditionals the way I would speak them. For an
English language speaker "do this if that" is very natural.
Now, I have an E.E background. I can do conditionals on a gate-logic level,
and I've programmed C and C++ for years, so I knew prefix syntax, and I still
wasn't confused. In fact I'd argue that perl is less confusing because you
can write "a unless b", and C more confusing because you can't: the
condition always gets emphasized over the action.
--
Logan Ratner | rat...@rice.edu | It is not funny that a man should die, but
CRPC/CITI | tinker | it is funny that he should be killed for so
Rice Univ. | tailor | little, and the coin of his death should be
Houston TX | cybernaut | what we call civilization - R. Chandler
In article <LEWIKK.94O...@grasshopper.aud.alcatel.com> lew...@grasshopper.aud.alcatel.com (Kevin K. Lewis) writes:
From: lew...@grasshopper.aud.alcatel.com (Kevin K. Lewis)
Newsgroups: comp.org.lisp-users,comp.lang.scheme
Date: 24 Oct 1994 18:06:06 GMT
Organization: Alcatel Network Systems
In article <38av7u$o...@news.cs.tu-berlin.de> n...@cs.tu-berlin.de (Oliver Laumann) writes:
But Scheme is not a scripting language; it is a full, general-purpose
programming language. If the GNU extension language will be Scheme,
then why not call it Scheme? Given that we already have Emacs-Lisp,
something like GNU-Scheme would be a logical choice.
But that's neither catchy, nor funny, and, therefore, impractical. ;-)
I think you're confusing three different issues. The treadmill arrangement
is just a way of allocating storage that lets you do a copy-collector-like
trick without actually copying. That's a separate issue from whether
it's incremental, which is in turn a separate issue from whether it's
opportunistic.
Incremental means it interleaves small units of work with small units
of program operation, so that the program appears to operate continuously.
Opportunistic means that it preferentially schedules collection work
when it's less awkward, e.g., during user pauses, or when it's cheaper.
Oh yeah, and generational means that it garbage collects younger objects
more frequently than older objects.
If you're interested in a survey that explains these things have a look
at the GC paper repository on cs.utexas.edu in the directory pub/garbage.
The README file explains what's there, including a short version and a
long version of my survey on garbage collection, and a paper on our
incremental treadmill(ish) collector. (We have a new generational version,
but that's not in the paper.) There's also an old paper on my old
opportunistic generational GC.
There are also some notes on Scheme and Scheme compilers and interpreters
(roughly half a book's worth of class notes) in pub/garbage/schnotes.
Constructive comments welcome.
--
| Paul R. Wilson, Computer Sciences Dept., University of Texas at Austin |
| Taylor Hall 2.124, Austin, TX 78712-1188 wil...@cs.utexas.edu |
| (Recent papers on garbage collection, memory hierarchies, and persistence |
| are available via ftp from cs.utexas.edu (128.83.139.9), in pub/garbage.) |
> Depending on the definition of define_key_macro. [Some definitions
> of define_key_macro wouldn't expose any of the details of the
> extension language.]
I'm sorry, you've lost me. Either the extension language the user's
interested in is built into the executable, in which case they all
have to be, or it's got to be execed to convert the user's key macro
string into the implementation language, which gives you too much
of a performance hit, or you expose the underlying mechanics of the
implementation language to the user, which is what I thought you were
trying to avoid. What's the fourth alternative?
--
Dynamically loaded libraries. Possibly configured on a per-user
basis.
--
Raul D. Miller n =: p*q NB. 9<##:##:n [.large prime p, q
<rock...@nova.umd.edu> y =: n&|&(*&x)^:e 1 NB. -.1 e.e e.&factors<:p,q [.e<n
NB. public e, n, y
x -: n&|&(*&y)^:d 1 NB. 1=(d*e)+.p*&<:q
Which leads me to wonder about how to contribute to the current
effort. What is considered to be a contribution? Are these sorts of
implementation design issues still being considered?
I'm interested in the effort, but not sure how I would fit in with the
existing effort. I'm not even sure how to ask about the existing
effort.
Where do things stand?
> For an
> English language speaker "do this if that" is very natural.
Odd, I'm a native English speaker, and I think "if this, do that"
is very natural as well.
what is so odd about it? i'm a native `english' speaker, and *both*
are natural to me.
Of course, in other languages, "this if, that do" is natural,
should we also support that sort of thing?
depends if larry speaks those languages or not.. ;-)
.mrg.
I have to sort-of-agree with Oz here. I think rms has accidentally
broken the GNU language w.r.t. Scheme compatibility. (But presumably
it's not too late to avoid that mistake.)
>The GNU language (whatever it is called) will be very nearly a
>superset of standard Scheme. Only two kinds of standard Scheme
>program won't work with the GNU language without modification:
>
> 1. Those that depend on the distinction between '() and #f.
> 2. Those that depend on the case insensitivity of Scheme identifiers.
>
>It would not be hard to provide a reader option to control case
>sensitivity. In that case, standard programs from class 2 would work
>perfectly well with the GNU language although they might have trouble
>communicating with other packages that take advantage of case
>sensitivity.
I don't yet know if this is right, but I'm willing to entertain the idea.
However...
>R4RS itself gives warning that some implementations conflate '() and
>#f. So, there is precedent for discouraging people from writing
>programs that depend on the distinction.
I think you've misread the R4RS. The passage in question should say
that there are obsolete, broken old implementations that do not conform
to the R4RS in this regard. Please have a look at section 6.1 (Booleans).
It says (all-caps added for emphasis):
Of all the standard Scheme values, only #f counts as false in conditional
expressions. Except for #f, all standard Scheme values, including #t,
pairs, THE EMPTY LIST, symbols, numbers, strings, vectors, and procedures,
count as true.
It goes on to give examples of things that evaluate to #t and #f:
(not #t) ==> #f
(not 3) ==> #f
(not (list 3)) ==> #f
(not #f) ==> #t
(not '()) ==> #f
(not (list)) ==> #f
(not 'nil) ==> #f
I think this is important enough to reconsider the decision to be incompatible.
My impression is that the R4RS authors intentionally made #f and '() REQUIRED
to be distinct objects in part to support a programming style where an
empty list means a list of zero objects, but #f means the absence of any list
(or whatever) at all. E.g., you can compute a set of zero things, or you
can fail to compute a usable answer. I used this idiom regularly in
Scheme programs, and suspect other people do too. An implementation that
reverts to pre-R4RS days may break a significant fraction of conforming
R4RS or IEEE Scheme programs.
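The idiom can be sketched with a hypothetical `children' lookup; note that the test in the caller only works because '() and #f are distinct:

```scheme
;; Hypothetical example: look up the children of a node in a tree.
;; Returning '() means "the node exists but has no children";
;; returning #f means "there is no such node at all".
(define (children node tree)
  (cond ((assq node tree) => cdr)
        (else #f)))

(define tree '((a b c) (b) (c)))

(children 'b tree)   ; => ()   -- b exists, with zero children
(children 'x tree)   ; => #f   -- no such node

;; A caller can distinguish the two cases with a plain conditional:
(if (children 'x tree) 'found 'missing)   ; => missing
```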
Another significant consideration is that the separation of boolean types
from list types is important if you want to be upward-compatible with
extended Schemes that support type declarations, e.g., for interoperability
with Dylan.
Language ideology-wise, I think the conflation of false and nil is
generally viewed as a category mistake, even if it's one that's useful
sometimes.
Practically speaking, it has important implications for compatibility
with existing standards and code, and with future extended languages.
Personally, I'm very interested because we're building a highly
portable extended Scheme that might end up being the big brother of
a GNU extension language, for larger and more performance-critical
applications. (It's got objects, threads, sockets, and a real-time
garbage collector, and compiles to C without losing proper tail
recursion or the ability to have full continuations. We're also working
on a portable GUI which is based on the VIBRANT package, so it should
work under UNIX, Windows and the Mac OS.) It may eventually end up being
a full Dylan implementation as well, supporting interoperability between
Dylan and Scheme code. This is possible because Dylan is semantically
almost a superset of Scheme, despite major syntactic differences.
I think a GNU extension language based on Scheme could be just the
boost that Scheme needs right now, and a nice encouragement to Dylan
implementors to support Scheme.
As for the other changes, please let us know the reasoning behind them.
I agree with an earlier posting that you don't need a property list
slot to efficiently support property lists. (A separate table can be
plenty efficient for an interpreted system, or even a high-performance
system. Our symbols have a field that holds the hash value of the
name string, so you don't have to compute that value each time you
hash into a table.)
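The separate-table approach can be sketched like this (a toy sketch using an association list; `put!' and `get' are hypothetical names, and a real system would hash on the symbol's cached name hash rather than scan a list):

```scheme
;; Property lists kept outside the symbol, in one global table
;; keyed on (symbol . property) pairs.
(define *properties* '())

(define (put! sym key val)
  (set! *properties*
        (cons (cons (cons sym key) val) *properties*)))

(define (get sym key)
  ;; assoc compares keys with equal?, so the (sym . key) pairs match.
  (let ((entry (assoc (cons sym key) *properties*)))
    (if entry (cdr entry) #f)))

(put! 'fact 'arity 1)
(get 'fact 'arity)   ; => 1
(get 'fact 'doc)     ; => #f, no such property
```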
I assume that you're not going to implement a separate function
namespace for Scheme code. (Scheme programs freely use first-class
procedures, and that is much more awkward with separate namespaces;
it would probably break the large majority of nontrivial Scheme programs,
and half the trivial ones too.) What is your plan for using the extra
binding slots?
Paul> Isn't the choice of name obvious?
Paul> ``GNU Extension Scheme''.
Paul> We can call it ``escheme'' for short.
No no no ... it should be called `gel', particularly if it's going to be
used to "glue" all the diverse GNU applications together. :-)
- Mike W.
In article <LORD.94Oc...@x1.cygnus.com>,
Tom Lord <lo...@x1.cygnus.com> wrote:
>R4RS itself gives warning that some implementations conflate '() and
>#f. So, there is precedent for discouraging people from writing
>programs that depend on the distinction.
I think you've misread the R4RS. The passage in question should say
that there are obsolete, broken old implementations that do not conform
to the R4RS in this regard. Please have a look at section 6.1 (Booleans).
It says (all-caps added for emphasis):
Of all the standard Scheme values, only #f counts as false in conditional
expressions. Except for #f, all standard Scheme values, including #t,
pairs, THE EMPTY LIST, symbols, numbers, strings, vectors, and procedures,
count as true.
I think this is important enough to reconsider the decision to be incompatible.
My impression is that the R4RS authors intentionally made #f and '() REQUIRED
to be distinct objects in part to support a programming style where an
empty list means a list of zero objects, but #f means the absence of any list
(or whatever) at all. E.g., you can compute a set of zero things, or you
can fail to compute a usable answer. I used this idiom regularly in
Scheme programs, and suspect other people do too. An implementation that
reverts to pre-R4RS days may break a significant fraction of conforming
R4RS or IEEE Scheme programs.
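The idiom described above is easy to make concrete. Here is a minimal sketch (the function name `select-matching` is hypothetical, not from any poster's code): an empty list means "zero results found", while #f means "no usable answer at all".

```scheme
;; '() = success with zero results; #f = failure, no list at all.
;; Only under R4RS, where '() and #f are distinct, can callers tell
;; these two outcomes apart.
(define (select-matching pred lst)
  (cond ((not (list? lst)) #f)          ; failure: no list to search
        ((null? lst) '())               ; success: an empty result
        ((pred (car lst))
         (cons (car lst) (select-matching pred (cdr lst))))
        (else (select-matching pred (cdr lst)))))
```

An implementation that conflates '() and #f destroys exactly this distinction: the "zero results" answer and the "no answer" answer become the same object.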
It really is no big deal to write Scheme code which works either way.
By the way, here is a little example from the mother of all Schemes,
MIT Scheme, the implementation which Profs. Sussman and Abelson use:
================================================================
Scheme Microcode Version 11.148
MIT Scheme running under HP-UX
Type `^C' (control-C) followed by `H' to obtain information about interrupts.
^L
Scheme saved on Monday October 24, 1994 at 9:47:41 PM
Release 7.4.0 (alpha)
Microcode 11.148
Runtime 14.166
;Loading "/zu/jaffer/.scheme.init"
;Loading "/zu/jaffer/slib/require.scm" -- done -- done
1 ]=> (eq? '() '#f)
;Value: #t
================================================================
So is it an "obsolete, broken old implementation"?
Raul D. Miller asks:
Which leads me to wonder about how to contribute to the current
effort. What is considered to be a contribution? Are these sorts of
implementation design issues still being considered?
I'm interested in the effort, but not sure how I would fit in with the
existing effort. I'm not even sure how to ask about the existing
effort.
Where do things stand?
Here are some answers:
* How can I contribute to the project?
To volunteer your time either for hacking or writing, you can write
to lo...@gnu.ai.mit.edu. A very promising list of volunteers is
already forming. I think there is a chance the project can go quite
quickly if this keeps up.
* Where do things stand now?
I am currently putting together a task list. During the week of
31-Oct-94 I'll be sending the task list to all the volunteers and
organizing the initial division of labor.
The first hacking step of the project, which must precede all
others, will be to rename some of the identifiers in the SCM source,
to combine the SCM and SCM autoconf distributions, and to make some
slight amendments to the Makefile.
The most immediate goal of this is to make a standard GNU
distribution of SCM that compiles to a namespace-friendly library.
Included with this distribution will be a header file of CPP defines
for the old identifier names; its purpose will be to enable existing
SCM extensions to continue to work.
* Can we convince you to change the design?
Compelling, pragmatically oriented arguments may be convincing. They
should probably be made to lo...@gnu.ai.mit.edu rather than carried
on endlessly in netnews.
For practical reasons, we won't be engaging in prolonged debate on
any point. Informed critiques are very welcome, but please don't
count on an engaging reply; there isn't enough time to reply to
every critique that comes in.
Perhaps the best way to influence the design is with code. If you
don't like the way we plan to implement some feature, and you can
provide that feature in a better way, you'll find that we have
trouble arguing against code that works. If your way really is
better, we'll gratefully accept the code, and voila -- the design is
changed. (But before going off the deep end, it is probably a good
idea to understand all the reasons the design is as it is; if you
accidentally ignore some of our requirements, your code may not be
useful to the project.)
* Where can I ask about the project?
You can send inquiries to lo...@gnu.ai.mit.edu, but please understand
that not all inquiries receive immediate or even eventual replies.
There is simply not enough time.
As new versions of the language are released, announcements will be
posted to at least the newsgroups comp.lang.scheme and
gnu.misc.discuss, and to the mailing list info...@prep.ai.mit.edu.
Eventually, I hope a volunteer can start a FAQ for the project.
-t
--
----
If you would like to volunteer to help with the GNU extension language
project, please write to lo...@gnu.ai.mit.edu.
For information on a good implementation of tk with perl4 see
<A HREF="http://www.ira.uka.de/IRA/SMILE/tkperl/">
http://www.ira.uka.de/IRA/SMILE/tkperl/</A>
Note: this is not the same tkperl that gets the lion's share of
attention in this group. I sure wish one of those two camps would
change the name of their product... though come to think of it, one
is tkperl and the other is tkperl5...
>
> If you're interested in a survey that explains these things have a look
> at the GC paper repository on cs.utexas.edu in the directory pub/garbage.
> The README file explains what's there, including a short version and a
> long version of my survey on garbage collection, and a paper on our
> incremental treadmill(ish) collector. (We have a new generational version,
> but that's not in the paper.) There's also an old paper on my old
> opportunistic generational GC.
>
> There are also some notes on Scheme and Scheme compilers and interpreters
> (roughly half a book's worth of class notes) in pub/garbage/schnotes.
> Constructive comments welcome.
Howdy,
For what it's worth, I found the survey (long version) to be
a wonderful resource. I found that a second reading, after reading
H. Baker's articles on "realtime" copying/non-copying GC (available
on ftp.netcom.com::pub/hbaker), was even better.
Here's a constructive comment: While it's hard to include
sniggly details in a paper, sometimes a lack of detail makes a
paper more confusing. Case in point: only now that I've read Baker's
1978 paper describing a realtime copying collector can I start to
"imagine" how one might _code up_ one of these fancier incremental
collectors. In that paper he addresses issues of arrays and the
program stack that are frequently not discussed. I also like
the provided "pseudo-code".
Anyway, for this reader, dumping a bit of advice gained
from experience implementing these things into the theoretical
discussion contributes to an understanding of the theoretical
concepts.
Another issue that I'd like to see addressed is ease
of coding/maintenance/code generation for various schemes. For
many projects these issues will be just as important as speed
(which is, of course, the _fun_ part) since these things do
have to be shoehorned into language compilers/interpreters
where many design decisions have to be juggled simultaneously.
Keep up the good work.
=============================================
Scott McLoughlin
Conscious Computing
=============================================
>> Then again, GnuScript isn't too bad a name (although people might confuse
>> it with GhostScript).
>If the GNU extension language will be Scheme, then why not call
>it Scheme? Given that we already have Emacs-Lisp, something
>like GNU-Scheme would be a logical choice.
Except that the GNU extension language will not be Scheme, which
has an international standard, but an incompatible subset with
extensions. (The third is not a problem, but the first two are.)
GNU Script and GNU EL (with minor variations) are both
potentially confusing, which is a pity. GNU Scheme is simply
inaccurate. While on the topic, though, anyone else notice that
"Scheme" backwards....
John L
Yeah.
So, would -you- agree that Tcl's data structuring capabilities are somewhat
impoverished? (Compared with, say, Pascal, Ada, PL/I, C, Scheme, or ML.)
But Scheme is not a scripting language; it is a full, general-purpose
programming language. If the GNU extension language will be Scheme,
then why not call it Scheme? Given that we already have Emacs-Lisp,
something like GNU-Scheme would be a logical choice.
But that's neither catchy, nor funny, and, therefore, impractical. ;-)
Thanks, Peter. See Larry's release notice posted elsewhere in these
language areas for details, and/or glance at my implementation of a
Patricia trie in perl, recently posted elsewhere in this thread.
:I still think that all other things being equal a tighter, smaller language
:is better than a larger and more complicated one,
[]
:...I'd rather the winner be something more minimalist...
I'd rather it were more usable. :-)
Do you want big languages and little programs or vice versa?
Nonetheless, it turns out that perl is growing *SMALLER*, which is a truly
remarkable thing for a programming language, if you think about it. The
v5 release has a grammar that's 50% smaller than v4.0, and v5 has only 33%
of the reserved words that v4 had. Functions are migrating out of the core
language into libraries, particularly now that this can be done cleanly,
portably, and transparently. At the same time, it has gained a great deal
of new capability to provide for safe, flexible, portable, and extensible
programming of more serious programs than were hitherto reasonably attempted.
I guess all we need is an elisp-to-perl (or is that scheme-to-perl)
translator now and even rms will be happy. :-) (Someone else reports
working on a tcl-to-perl translator already, but progress is slow.)
--tom
--
Tom Christiansen Perl Consultant, Gamer, Hiker tch...@mox.perl.com
As Zeus said to Narcissus, "Watch yourself."
kar...@ERE.UMontreal.CA (Kardan Kaveh) writes:
>How about standardizing on an object system? I recommend Chris Haines'
>Scheme Object System, which features multiple inheritance and a
>meta-object protocol.
My $.02: unless you have a pressing need, and a compiler, a MOP is a
pain. The overhead involved (at least in the MOPpy implementations
I've seen) is just far too great.
I like Preston Briggs' attitude to this kind of thing: does all this
extra flexibility make up for the cost of all this extra flexibility?
A simple base interface such as that provided by Tiny CLOS or Dylan
should be just fine for about 99.999% of cases, I'd imagine.
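For a sense of how little machinery a MOP-free "simple base interface" needs, here is a closure-based sketch. This is illustrative only — it is not Tiny CLOS or Dylan code, and `make-point` is a made-up example:

```scheme
;; A message-passing "object" built from a plain closure: state is
;; captured in the environment, dispatch is a case on the message
;; symbol.  No classes, no generic functions, no MOP -- and therefore
;; none of the MOP's dispatch overhead.
(define (make-point x y)
  (lambda (msg . args)
    (case msg
      ((x) x)
      ((y) y)
      ((move)                               ; functional update
       (make-point (+ x (car args)) (+ y (cadr args))))
      (else (error "point: unknown message" msg)))))
```

A real object system adds inheritance and shared method tables on top of this, but the dispatch cost can stay this low when no metaobject layer intervenes.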
EuLisp's leveled approach to an object system is designed as it is for
just this reason. In Level-0 EuLisp, you have access to a simple
object system without a MOP. In Level-1, a MOP is available.
Conforming implementations can implement just Level-0 to maintain
high efficiency. Additionally, the MOP is designed such that Level-0
programs can remain efficient even in the presence of a MOP. The
paper "Balancing the EuLisp Metaobject Protocol", by Bretthauer et al,
published in Lisp Pointers vol 6 nos 1/2, page 119, explains the
system in detail.
I would suggest that the GNU project look at EuLisp for its
extensions, as it is a clean, well-designed language in which
considerations similar to those of GNU's stated goals have been part
of the language design goals since its inception. More information is
available from, I believe, http://www.maths.bath.ac.uk/.
-- Harley
--
------------------------------------------------------------------------------
motto: Use an integrated object-oriented dynamic language today.
Write to in...@ilog.com and ask about Ilog Talk.
------------------------------------------------------------------------------
Harley Davis net: da...@ilog.fr
ILOG S.A. tel: +33 1 46 63 66 66
2 Avenue Galliéni, BP 85 fax: +33 1 46 63 15 82
94253 Gentilly Cedex, France url: http://www.ilog.fr/
> I still think that all other things being equal a tighter, smaller language
> is better than a larger and more complicated one, and all the enhancements
> to Scheme suggested by RMS in <941019042...@mole.gnu.ai.mit.edu> are
> a bit worrisome. If this comes down to a fight between Sun and the FSF my
Howdy,
Ummm, RMS's proposed extensions don't seem to "bloat" Scheme
too much IMHO. Scheme itself is a very small language. Adding fluid
variables was done in Oaklisp, so there's precedent. Exception
handling and catch/throw are simply first class exits with dynamic
extent, which has also been researched, implemented, etc. Multiple
obarrays are simply "packages" - not rocket science.
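The claim that catch/throw reduces to first-class exits with dynamic extent can be made concrete in standard Scheme with call-with-current-continuation (a sketch; the function name `find-first` is made up):

```scheme
;; An "early exit" in R4RS Scheme: `return` is an exit procedure valid
;; for the dynamic extent of the search.  catch/throw is essentially
;; this pattern with a named tag instead of a lexically bound exit.
(define (find-first pred lst)
  (call-with-current-continuation
    (lambda (return)
      (for-each (lambda (x) (if (pred x) (return x))) lst)
      #f)))
```

So the proposed features need little more than what the language already provides; the extension is mostly a matter of packaging.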
Anyway, none of these seems "big" in terms of code size and/or
runtime space requirements. As for "flexible string manipulation
functions" and "access to all or most of the Unix system calls", these
will add to the "footprint" of the language. I guess it's a tradeoff -
multiple incompatible "user level" extensions vs. built in convenience
(and perhaps some economy achievable only by "built-ins"). It's a
"judgement call".
I don't, OTOH, see why we couldn't just have a simple
#ifdef compilation option to exclude various features from the
language, e.g. #undef GXL_UNIXCALLS or whatever. Same for "expect"
interface (what is expect anyway?).
I would guess in the general case, the lion's share of apps
using the language will have oodles of giant GUI extensions, user
i/o validation, etc. and one average size 8bit+ color image will have
a footprint bigger than the whole language implementation anyway! For
real tightness and power in a scripting language, I (and many others)
would recommend using "Forth" where these things matter - but I
wouldn't sic Forth on users of a "general purpose" scripting language
(I'm not religious).
Anyway, when can we get it? ;-) Also, I'll repeat my oft-expressed
desire to see DOS/Windoze/Win32 versions.
It's been pointed out to me that Perl 5 has apparently considerably tighter
syntax and semantics than the rather ad-hoc Perls of yesteryear, so I'll see
if I can come up with a better analogy. It's a sad day when one's favorite
bad examples pass by the wayside. (HHOS)
I still think that all other things being equal a tighter, smaller language
is better than a larger and more complicated one, and all the enhancements
to Scheme suggested by RMS in <941019042...@mole.gnu.ai.mit.edu> are
a bit worrisome. If this comes down to a fight between Sun and the FSF my
money wouldn't be on Sun (not after they dropped NeWS and OpenLook), and
I'd rather the winner be something more minimalist...
The threat that TCL will take over the world is improbable. Sun has not
been completely successful in imposing standards. (I certainly don't feel
compelled. Just look at all the sites that haven't adopted NIS+ for
example.) Why should anyone think that this attempt will be any better?
Sun certainly can not prohibit other scripting languages from running on
their systems. In addition, they are not the only game in town. Far from
it; DEC, HP, IBM are also in the OS and workstation business. If they
don't choose TCL as the "universal scripting language" then Sun's
"universe" will be very small indeed.
I think healthy competition is good, so I welcome GNUel (rhymes with
"jewel" -- gosh it even looks like *.el !) into this arena. The best thing
that could happen is that the wizards build a better mouse trap.
--
Ken Mayer
MRJ, Inc. (703) 385-0722
10455 White Granite Drive (703) 385-4637 fax
Oakton, Virginia 22124 kma...@mrj.com
USA "Not now, I'm spanking my inner child."
Ahem!, some of us will want to get programs in the (pseudo) realtime arena
done. I don't think I like the idea of a garbage collector in my programs
that takes the whole thing for an extended walk once in a while. If this can
be done in small chunks, it's certainly a major plus.
- Josef
My $.02: unless you have a pressing need, and a compiler, a MOP is a
pain. The overhead involved (at least in the MOPpy implementations
I've seen) is just far too great.
I like Preston Briggs' attitude to this kind of thing: does all this
extra flexibility make up for the cost of all this extra flexibility?
A simple base interface such as that provided by Tiny CLOS or Dylan
should be just fine for about 99.999% of cases, I'd imagine.
But Tiny CLOS *does* have a mop. That's the whole point of it -
to embed a CLOS-like language in scheme.
- Eric