: [Please redistribute widely]
: In practice, the developer has
: to choose a language--and then all users of the package are stuck with
: that one. For example, when I wrote GNU Emacs, I had to decide which
: language to support. I had no way to let the users decide.
Ever hear of the idea of a "focus group", Mr. Stallman? That's
where you get users or potential users together and help them
decide between multiple choices.
Todd.
--
Todd Bradley--Supreme Ruler of The Galaxy | Visual Numerics, Boulder
| 303-581-3293
"Welcome to Hell. Here's your shotgun." | to...@boulder.vni.com
True. My point is that RMS makes it sound like it's impossible
to find out what users want, and so he had to make the decision
to use Lisp as the Emacs language all on his own.
Fine. Please extend Scheme (as I knew it) with a flexible
object-encapsulation/inheritance system and convenient, well
designed, portable interfaces to common os functionalities
and libraries -- for a good example of how to do this you might
look at the Python code. If you want people to USE the language
without fear of illegality, maybe you could use a copyright which
protects your rights without restricting the USE of the language
-- like the one that applies to Python.
In the meantime, since I want a good scripting/extension language
without scary copyright restrictions and with good interfaces to
just about everything I could possibly want NOW, I'll burrow on
ahead using Python.
Aaron Watters
Department of Computer and Information Sciences
New Jersey Institute of Technology
University Heights
Newark, NJ 07102
phone (201)596-2666
fax (201)596-5777
home phone (908)545-3367
email: aa...@vienna.njit.edu
PS:
Personally, I've always found Scheme a little irritating --
ever since I read the standard text on the subject which mentions
arrays somewhere around page 400, in a footnote, without telling
you how to use one. (Do I detect MIT/NIH? Naw.)
TB> My point is that RMS makes it sound like it's impossible
TB> to find out what users want, and so he had to make the decision
TB> to use Lisp as the Emacs language all on his own.
You give the impression that you have not read Stallman's article all the
way through. I think he makes it pretty clear that he considers GNU Emacs's
reliance on a single extension language to be a weakness. Even so, I am
not sure that he necessarily envisioned the current wide set of users for
Emacs when he set about writing it, and in any case, I rather doubt that he
would have gotten much code written if he had spent all his time trying to
make up focus groups of programmers who we all know would never agree on a
suitable extension language. It hardly seems fair to slam RMS on choosing
lisp as an extension language when someone could criticize him no matter
what language he chose. Hell, at least there *is* an extension language.
I think the approach he outlined in this document is about as reasonable as
you could expect from anyone, and since it in-theory supports any arbitrary
extension language that can be implemented on top of an extended-Scheme,
this should eliminate the usual language-war issues.
- Chris
--
Christopher Barber
(cba...@bbn.com)
First of all, it starts off with more of the FSF's recent attacks on Tcl, a
fairly innocuous little language that is really quite good at what it does
even if it doesn't do everything Stallman wants. This tends to make me think
that the real reason isn't to fix a problem, but to promote some subtle
political agenda.
This is reinforced when he starts attacking Sun. Attacking Sun is a great
way to get the iconoclasts on your side, but occasionally Sun does come up
with some good things (NeWS, for example, which people stayed away from in
droves, largely due to FUD) and the folks bashing Sun miss the boat (DEC
has done some good things with OSF/1, but nobody else has managed to pull
a rabbit out of that hat).
So, is the Gnu Extension Language going to be another Motif, another RFS,
another OSF/1? Some great smelly monster that even if it succeeds makes nobody
happy?
That brings me to my second problem... the list of extensions. It seems like
about half the announcement is extensions to scheme. Come on, the big advantage
to lisp-like languages is the way their simple semantics and syntax can be
used to bootstrap very powerful concepts.
What's wrong with taking some existing implementation, like STk's interpreter,
and adding a modicum of string and O/S functions? If Tcl is an unholy cross
of Lisp and Awk, this is sounding like some similarly sanctified marriage of
Lisp and Perl.
Oh well, at least it looks like he's got a clue about licensing...
--
Peter da Silva `-_-'
Network Management Technology Incorporated 'U`
1601 Industrial Blvd. Sugar Land, TX 77478 USA
+1 713 274 5180 "Hast Du heute schon Deinen Wolf umarmt?"
Even so, Lisp is a pretty decent extension language. One of my favorites. If
I was in his "focus group" I'd have been cheering for that. Now the default
key bindings, and funky screen updating, those I'm not so happy about.
I'm still leery of the whole multiple-extension-language model. Pick one
language sized to do the job right and stick to it. Converting emacs lisp,
Python, and Tcl into Scheme seems a bit of a waste...
(BTW, how about REXX? I hate REXX, myself, but it's enormously popular as
an extension language in certain circles...)
Religious wars usually start when people are *very* convinced of their
favourite language/system/hardware or whatever gadget. I'd like to
avoid this by starting from the application requirement's point of
view on extension languages.
[1] Interpreted Language
An extension language should support the rapid prototyping of
applications, i.e., it needs to be flexible in various respects. The
success of Tcl/Tk shows that a language which is easy to use (cf. the
good ol' BASIC days), and which provides a full set of operations from
very low-level basics to higher-level graphics will find user
acceptance because it meets users' needs.
From this, I infer that we call for an interpreted, not primarily
compiled, language. A compiler may come in handy for performance boosts
once the built system has reached some maturity, but a mandatory
compilation step should be avoided (long edit-compile-run cycles inhibit
prototyping).
[2] Dynamic Type System
The data types supported by the extension language should be the
standard data types (integer, floats, strings, characters), plus
sequential lists and hash-indexed arrays of data objects. This is a
direct lesson from several languages.
Hash-indexed arrays in particular are crucial when it comes to
implementing associative structures. Lists are a natural form of
variable-length collections. As Tcl/Tk shows, the syntactic typing of
data objects (which is due to the fact that strings are the only real
data type in Tcl) is insufficient and awkward for many applications,
e.g., when an object-oriented database is interfaced. Explicit type
markers will be necessary, making the handling of such marked data
objects somewhat obscure.
In LISP, every object "knows" its type. This certainly is an approach
which is superior to that of classical compiled languages where it is
impossible to tell what kind of object a pointer actually points to.
In LISP, type casts are meaningless: either you convert the object to
a different type (possibly also changing the semantics), or you have
to stick with the object as is.
A facility similar to that of defstruct/Flavors in LISP would allow
users to introduce their own first-class data types, including object
systems resembling those of any application. This can be implemented
in a few pages of LISP code, so it shouldn't be a problem.
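As a rough sketch of how little is needed, here is a minimal record
facility in plain Scheme; make-record-type, make-point and friends are
invented names used for illustration only, not part of any standard or
existing system:

    ;; Minimal sketch of a defstruct-like facility in plain Scheme.
    ;; A record is a vector whose first slot holds the type name; the
    ;; field list is kept only for documentation in this sketch.
    (define (make-record-type name fields)
      (list
        ;; constructor
        (lambda args (apply vector name args))
        ;; accessor generator: field i -> procedure extracting that field
        (lambda (i) (lambda (record) (vector-ref record (+ i 1))))))

    ;; Hypothetical usage: a "point" type with fields x and y.
    (define point-type (make-record-type 'point '(x y)))
    (define make-point (car point-type))
    (define point-x   ((cadr point-type) 0))
    (define point-y   ((cadr point-type) 1))

    (point-y (make-point 3 4))     ; => 4

Inheritance and type checks would of course take more than this, but it
shows why "a few pages of LISP code" is a fair estimate.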
[3] Syntax
Syntax is probably what keeps so many people from using basically
comfortable language environments such as Smalltalk and LISP. I
remember a package called CGOL which basically introduced ALGOL-like
infix notation for LISP. The resulting programs very much looked like
ALGOL or PASCAL programs.
I guess, if we manage to separate syntax from the extension language
implementation, this would be a big win. Of course, some kind of
standard syntax would have to be defined, but an extension language
should be able to significantly change its syntax if needed.
Granted that feature, there isn't really a question of whether the GNU
extension language is Scheme or LISP or Tcl or Python. There would be
a semantics for certain data types, function primitives, and control
structures. That's all. Map that onto your favourite syntax. RMS may
prefer to provide a Scheme-like base system. Others may want to
implement a C-like syntax.
[4] Built-In Features
Tcl by itself is nice but not perfect. I suppose much of the success
of Tcl is due to the availability of Tk. The large user
community facing the problem of having to prototype user interfaces
has been calling for such a system for a long time. Smalltalk had
comparable features but was too difficult to handle and too exotic for
most people.
In my opinion, an extension language should define built-in operations
in the following areas (a small sketch follows the list):
- string and pattern manipulation (ever done programming in SNOBOL?);
- file manipulation (this is essential for replacing many shell
and awk/perl scripts);
- invocation of programs and capture of results (this requires some
transparent use of pipe mechanisms);
- inter-process communication (not just bits and bytes, but structured
objects), including an event mechanism.
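As a rough illustration only -- run-and-capture below is an invented name
standing in for the program-invocation built-in called for above, while
the rest is standard Scheme -- a small script using such built-ins might
read:

    ;; Hypothetical sketch: log the disk usage of a directory.
    ;; Only call-with-output-file, display and newline are standard Scheme;
    ;; run-and-capture is a made-up name for the "invoke a program and
    ;; capture its output" facility asked for above.
    (define (log-disk-usage dir logfile)
      (let ((report (run-and-capture (list "du" "-s" dir))))
        (call-with-output-file logfile
          (lambda (port)
            (display report port)
            (newline port)))))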
With the de-facto standard X11/Motif, there is a common user interface
programming API on many platforms. Even PC windowing systems aren't
that different from Motif in their functionality anymore.
This added functionality should be provided as a separate GNU X
toolkit on top of the extension language. Graphical user interfaces
have become as essential for programs as arithmetical operations. In
the case of Tcl/Tk, one isn't really sure if Tcl and Tk are designed
as separate systems or an integrated environment. The separation is
there in some points, at others it isn't.
A tight integration of a visual toolkit will help prototyping
significantly. The integration should be tight because of the need of
a seamless integration with the object system of the extension
language.
[5] Extensions
So how about extending the language? Extensions can be built in one of
three ways:
- programs written in the extension language itself provide new
functionality through a well-defined API (something like a LISP
package system would be useful to keep name spaces separate);
- the dynamic loading of modules provides a way of dynamically
including implementations of defined APIs which are provided by some
compiled-in extension (this may be the preferred way of handling the
GUI extension).
- adding new commands in *some* other language requires the definition
of an interface protocol regarding control flow and data passing (this
is the Tcl model).
The first alternative certainly is the LISP way of doing it. In order
to be of use even for complex extensions, an optimizing compiler must
be available (looking at the KCL/AKCL LISP, you can see how this may
be done by going over a conventional language like C).
The second and third alternative require the rebuilding of the
language interpreter (aka "wish inflation") unless a generic protocol
for the automatic integration of dynamically loaded modules is defined
(e.g., loading a dynamic library could put the respective symbols into
a new name space/package, and a special segment in a shared library
could be used to specify the parameter passing conventions of the
exported functions).
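For the first alternative, even without a full package system an
extension written in the language itself can keep its internal names
private behind a closure. A minimal sketch in plain Scheme (the names
are invented for illustration):

    ;; An "extension" whose internal helpers stay private; only the
    ;; dispatching procedure string-utils is visible from outside.
    (define string-utils
      (let ()
        (define (repeat-string s n)          ; internal, not exported
          (if (<= n 0)
              ""
              (string-append s (repeat-string s (- n 1)))))
        (lambda (op . args)                  ; exported entry point
          (case op
            ((repeat) (apply repeat-string args))
            (else #f)))))

    (string-utils 'repeat "ab" 3)            ; => "ababab"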
-----
To summarize: in my opinion, an extension language should have rapid
application prototyping as its foremost goal. This will facilitate the
prototyping of entire applications as well as the quick writing of
small scripts. Consequently, an interpreted language with dynamic
typing and a user-definable syntax is needed. Some basic non-standard
operations (when compared, e.g., to C or C++) will be required for
string and file manipulation, and for inter-process communication (as
distributed environments become more and more common). The language
should encourage users to write extensions in the extension language
itself, but allow the dynamic linking of code at the same time.
If you check these requirements against Tcl/Tk, you'll notice that
with the exception of the syntax issue, the typing system, and the
extension mechanism, pretty much everything else is there. That's why
Tcl/Tk is successful. However, when it comes to writing serious
applications requiring complex typing (e.g., a project management
system for concurrent engineering, or a product documentation and
information system for mechanical engineering), Tcl/Tk is not well
suited, primarily because it lacks a proper type system. In the
realm of multimedia applications and computer-supported cooperative
work (that's what I'm interested in), communication and information
exchange are important. The simple Tk "send" command certainly doesn't
satisfy those needs.
-----
Well, I hope the above provides some useful ideas for the ongoing
discussion, as well as a rationale for some features a scripting or
extension language should have.
Greetings,
--Juergen Wagner
Juergen...@iao.fhg.de
gan...@csli.stanford.edu
Fraunhofer-Institut fuer Arbeitswirtschaft und Organisation (FhG-IAO)
Nobelstr. 12c Voice: +49-711-970-2013
D-70569 Stuttgart, Germany Fax: +49-711-970-2299
<a href = http://www.iao.fhg.de/Public/leute/jrw-en.html>J. Wagner</a>
The standard reference for Scheme is the "Revised**4 Report on the
Algorithmic Language Scheme", which is only 55 pages long. I suspect
you're referring to "Structure and Interpretation of Computer Programs".
As the name implies, this is a text on programming, not on any particular
programming language. This book is oriented towards teaching about
advanced programming structures such as closures, streams, and
continuations, which are unique to Scheme and similar languages; mundane
features like arrays do admittedly get little coverage. The course assumes
that the student has some prior, basic programming skills, so already knows
how to use arrays.
--
Barry Margolin
BBN Internet Services Corp.
bar...@near.net
1. It appears that there are to be two modules -- a compiler for a given
extension language, and a Scheme-based runtime interpreter. It sounds
like the compiler will be GPL'ed and the runtime won't be. But what if
I want to call "eval some-extension-language-command" from a running program?
Will it get compiled into Scheme on the fly (and then into C and dynamically
loaded, even)? I think this will be a big performance hit, and furthermore
it will require every application to include the GPL'ed compiler, which
defeats the purpose of unencumbering the runtime.
2. Will the Scheme runtime need something like Boehm's garbage collector?
I'm sure there are applications that can't use this sort of system -- for
example, ones that maintain pointers to objects in core in external
storage but not internally (for whatever reason).
3. Tcl is very popular in embedded applications where code size is critical.
It seems that the Scheme interpreter plus the garbage collector plus the
compiler would be a lot larger.
It's definitely a cute idea, but I'm not sure it's very practical...
Wayne
> * Distribution conditions.
> We will permit use of the modified Scheme interpreter in proprietary
> programs, so as to compete effectively with alternative extensibility
> packages.
Good.
> Translators from other languages to modified Scheme will not be part
> of any application; each individual user will decide when to use one
> of these. Therefore, there is no special reason not to use the GPL as
> the distribution terms for translators. So we will encourage
> developers of translators to use the GPL as distribution terms.
This simply means that I _won't_ use the modified scheme in my application..
full stop. Because I consider that Tcl syntax (and even "set x 12" more than
the C-like "x=12") is much more readable than any Elisp or Scheme. I'd like to
use scheme as a more powerful language as a developer, but I will not impose
this choice on my customers who only know mouse manipulations or, in the best
case, Fortran! Proposing scheme, Lisp, or C-like, as an extension language for
customers is missing the point IMHO..
And BTW, I agree that it should be possible to write large programs as scripts
or extensions (and it will be more and more the case as CPU power gets
cheaper) *but* neither Elisp nor Tcl can serve this purpose today (still IMHO).
The reason? They are not modular, they are not object oriented. You have to be
a very good and cautious programmer in order to write 100,000 lines of
*maintainable* Lisp or Tcl code.. So the conclusion (Tcl is bad) seems very
strange for me, especially given the fact that [Incr Tcl] exists! I'm
surprised nobody has raised this issue.. If you really want the "very best"
extension language, you should try interpreted Eiffel rather than Elisp!!!
(which does exist in the melted-ice development technology).
Anyway. If Tcl2Scheme is copylefted, Sun people do not have to worry any
more.. :-)
Cheers,
Christophe.
= Are you the police? -- No ma'am, we're musicians. =
>2. Will the Scheme runtime need something like Boehm's garbage collector?
I can't imagine why it would. It's fairly straightforward to have your
Scheme (or whatever) system maintain pointers into C land and vice
versa, with rather less magical support from the RTS than the Parc GC
gives (the system I've seen uses structures called "malloc" and
"stable" pointers, respectively, to point in each direction).
What GNUscript's RTS does absolutely need is a decent generational
garbage collector, so that it will provide reasonably sane interactive
performance. One of the things that regularly makes me want to kick my
workstation through a window is the GC and buffer relocation burps in
Emacs.
<b
--
Bryan O'Sullivan email: b...@cyclic.com, bosu...@maths.tcd.ie
Department of Poverty wuh wuh wuh: http://www.scrg.cs.tcd.ie/~bos
Trinity College, Dublin nous n'avons qu'un peu de mouton aujourd'hui.
[Please redistribute widely]
Many software packages need an extension language to make it easier
for users to add to and modify the package.
In a previous message I explained why Tcl is inadequate as an
extension language, and stated some design goals for the extension
language for the GNU system, but I did not choose a specific
alternative.
At the time, I had not come to a conclusion about what to do. I knew
what sort of place I wanted to go, but not precisely where or how to
get there.
Since then, I've learned a lot more about the topic. I've read about
scsh, Rush and Python, and talked with people working on using Scheme
as an extension and scripting language. Now I have formulated a
specific plan for providing extensibility in the GNU system.
Who chooses which language?
Ousterhout, the author of Tcl, responded to my previous message by
citing a "Law" that users choose the language they prefer, and
suggested that we each implement our favorite languages, then sit back
and watch as the users make them succeed or fail.
Unfortunately, extension languages are the one case where users
*cannot* choose the language they use. They have to use the language
supported by the application or tool they want to extend. For
example, if you wanted to extend PDP-10 Emacs, you had to use TECO.
If you want to extend GNU Emacs, you have to use Lisp.
When users simply want "to write a program to do X or Y," they can use
any language the system supports. There's no reason for system
designers to try to decide which language is best. We can instead
provide as many languages as possible, to give each user the widest
possible choice. In the GNU system, I would like to support every
language people want to use--provided someone will implement them.
With the methods generally used today, we cannot easily provide many
languages for extending any particular utility or application package.
Supporting an extension language means a lot of work for the developer
of the package. Supporting two languages is twice as much work,
supposing the two fit together at all. In practice, the developer has
to choose a language--and then all users of the package are stuck with
that one. For example, when I wrote GNU Emacs, I had to decide which
language to support. I had no way to let the users decide.
When a developer chooses Tcl, that has two consequences for the
users of the package:
* They can use Tcl if they wish. That's fine with me.
* They can't use any other language. That I consider a problem.
Sometimes developers choose a language because they like it. But not
always. Sun recently announced a campaign to "make Tcl the universal
scripting language." This is a campaign to convince all the
developers who *don't* prefer Tcl that they really have no choice.
The idea is that each one of us will believe that Sun will inevitably
convince everyone else to use Tcl, and each of us will feel compelled
to follow where we believe the rest are going.
That campaign is what led me to decide that I needed to speak to the
community about the issue. By announcing on the net that GNU software
packages won't use Tcl, I hope to show programmers that not everyone
is going to jump on the Tcl bandwagon--so they don't have to feel
compelled to do so. If developers choose to support Tcl, it should be
because they want to, not because Sun convinces them they have no
choice.
Design goals for GNU
When you write a program, or when you modify a GNU program, I think
you should be the one who decides what to implement. I can't tell you
what language to support, and I wouldn't want to try.
But I am the leader of one particular project, the GNU project. So I
make the decision about which packages to include in the GNU operating
system, and which design goals to aim for in developing the GNU
system.
These are the design goals I've decided on concerning extension
languages in the GNU system:
* As far as possible, all GNU packages should support the same
extension languages, so that a user can learn one language (any one of
those we support) and use it in any package--including Emacs.
* The languages we support should not be limited to special, weak
"scripting languages". They should be designed to be good for writing
large programs as well as small ones.
My judgement is that Tcl can't satisfy this goal. (Ousterhout seems
to agree that Tcl doesn't serve this goal. He thinks that doesn't
constitute a problem--I think it does.) That's why I've decided not
to use Tcl as the main system-wide extension language of the GNU
system.
* It is important to support a Lisp-like language, because they
provide certain special kinds of power, such as representing programs
as data in a structured way that can be decoded without parsing.
** It is desirable to support Scheme, because it is simple and clean.
** It is desirable to support Emacs Lisp, for compatibility with Emacs
and the code already written for Emacs.
* It is important to support a more usual programming language syntax
for users who find Lisp syntax too strange.
* It would be good to support Tcl as well, if that is easy to do.
The GNU extension language plan
Here is the plan for achieving the design goals stated above.
* Step 1. The base language should be modified Scheme, with these features:
** Case-sensitive symbol names.
** No distinction between #f and (), for the sake of supporting Lisp
as well as Scheme.
** Convenient fast exception handling, and catch and throw.
** Extra slots in a symbol, to better support
translating other Lisp dialects into Scheme.
** Multiple obarrays.
** Flexible string manipulation functions.
** Access to all or most of the Unix system calls.
** Convenient facilities for forking pipelines,
making redirections, and so on.
** Two interfaces for call-outs to C code.
One allows the C code to work on arbitrary Scheme data.
The other passes strings only, and is compatible with Tcl
C callouts provided the C function does not try to call
the Tcl interpreter.
** Cheap built-in dynamic variables (as well as Scheme's lexical variables).
** Support for forwarding a dynamic variable's value
into a C variable.
** A way for applications to define additional Scheme data types
for application-specific purposes.
** A place in a function to record an interactive argument reading spec.
** An optional reader feature to convert nil to #f and t to #t,
for the sake of supporting Lisp as well as Scheme.
** An interface to the library version of expect.
** Backtrace and debugging facilities.
All of these things are either straightforward or have already been
done in Scheme systems; the task is to put them together. We are
going to start with SCM, add some of these features to it, and write
the rest in Scheme, using existing implementations where possible.
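As a small illustration that such features layer naturally on Scheme,
dynamic variables of the kind listed above can be sketched in a few
lines. This is an illustrative sketch only, assuming the dynamic-wind
found in SCM, and not the planned implementation (a built-in version
would mainly be faster):

    ;; Sketch: a "dynamic" variable with save/restore around a computation.
    (define *output-width* 80)

    (define (with-output-width width thunk)
      (let ((saved *output-width*))
        (dynamic-wind
          (lambda () (set! *output-width* width))   ; enter: install new value
          thunk
          (lambda () (set! *output-width* saved))))) ; leave: restore old value

    (with-output-width 132 (lambda () *output-width*))   ; => 132
    *output-width*                                        ; => 80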
* Step 2. Other languages should be implemented on top of Scheme.
** Rush is a cleaned-up version of the Tcl language, which runs far
faster than Tcl itself, by means of translation into Scheme. Some
kludgy but necessary Tcl constructs don't work in Rush, and Tcl
aficionados may be unhappy about this; but Rush provides cleaner ways
to get the same results, so users who write extensions should like it
better. Developers looking for an extension language are likely to
prefer Rush to Tcl if they are not already attached to Tcl.
Here are a couple of examples supplied by Adam Sah:
*** To pass an array argument without copying it, in Tcl you must use
upvar or make the array a global variable. In Rush, you can simply
declare the argument "pass by reference".
*** To extract values from a list and pass them as separate arguments
to a function, in Tcl you must construct a function call expression
using that list, and then evaluate it. This can cause trouble if the
other arguments contain text that includes any special Tcl syntax. In
Rush, the apply function handles this simply and reliably.
*** Rush eliminates the need for the "expr" command by allowing infix
mathematical expressions and statements. For example, the Tcl
computation `"set a [expr $b*$c]' can be written as `a = b*c' in
Rush. (The Tcl syntax works also.)
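As a rough illustration of the translation idea only (not the actual
output of any Rush compiler), the Tcl fragment above might be rendered
in Scheme along these lines:

    ;; Tcl/Rush source:   set a [expr $b*$c]     (or, in Rush:  a = b*c)
    ;; One plausible Scheme form a translator could emit:
    (set! a (* b c))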
Some references:
[SBD94] Adam Sah, Jon Blow and Brian Dennis. "An Introduction to the Rush
Language." Proc. Tcl'94 Workshop. June, 1994.
ftp://ginsberg.cs.berkeley.edu:pub/papers/asah/rush-tcl94.*
[SB94] Adam Sah and Jon Blow. "A New Architecture for the Implementation of
Scripting Languages." Proc. USENIX Symp. on Very High Level Languages.
October, 1994. to appear.
ftp://ginsberg.cs.berkeley.edu:pub/papers/asah/rush-vhll94.*
** It appears that Emacs Lisp can be implemented efficiently by
translation into modified Scheme (the modifications are important).
** Python appears suitable for such an implementation, as far as I can
tell from a quick look. By "suitable" I mean that mostly the same
language could be implemented--minor changes in semantics would be ok.
(It would be useful for someone to check this carefully.)
** A C-like language syntax can certainly be implemented this way.
* Distribution conditions.
We will permit use of the modified Scheme interpreter in proprietary
programs, so as to compete effectively with alternative extensibility
packages.
Translators from other languages to modified Scheme will not be part
of any application; each individual user will decide when to use one
of these. Therefore, there is no special reason not to use the GPL as
the distribution terms for translators. So we will encourage
developers of translators to use the GPL as distribution terms.
Conclusion
Until today, users have not been able to choose which extension
language to use. They have always been compelled to use whichever
language is supported by the tool they wish to extend. And that has
meant many different languages for different tools.
Adopting Tcl as the universal scripting language offers the
possibility of eliminating the incompatibility--users would be able to
extend everything with just one language. But they wouldn't be able
to choose which language. They would be compelled to use Tcl and
nothing else.
By making modified Scheme the universal extension language, we can
give users a choice of which language to write extensions in. We can
implement other languages, including modified Tcl (Rush), a Python
variant, and a C-like language, through translation into Scheme, so
that each user can choose the language to use. Even users who choose
modified Tcl will benefit from this decision--they will be happy with
the speedup they get from an implementation that translates into
Scheme.
Only Scheme, or something close to Scheme, can serve this purpose.
Tcl won't do the job. You can't implement Scheme or Python or Emacs
Lisp with reasonable performance on top of Tcl. But modified Scheme
can support them all, and many others.
The universal extension language should be modified Scheme.
Request for Volunteers
If you understand Scheme implementation well, and you want to
contribute a substantial amount of time to this project, please send
mail to Tom Lord, lo...@gnu.ai.mit.edu.
If you expect to have time later but don't have time now, please send
mail when you do have time to work. Participation in a small way is
probably not useful until after the package is released.
You got it!
>As the name implies, this is a text on programming, not on any particular
>programming language. This book is oriented towards teaching about
>advanced programming structures such as closures, streams, and
>continuations, which are unique to Scheme and similar languages; mundane
>features like arrays do admittedly get little coverage....
Quite right. As we all know, all uses of arrays are mundane and
trivial. Advanced programmers never use arrays. There are no
interesting algorithms that use arrays in interesting ways. Arrays
never come up in good classes about "advanced" programming.
In fact: didn't Turing show that all we *really* need is two stacks
of bits? Hmmm....
This gives me a really good idea for my own extension language!
[which I'll make freely copiable, but I'll restrict it from use in
any activity which makes any money in any way using a really complex
copyright: I'll take the gpl as a starting point (this'll be the
really fun part -- hey, maybe I'll start a really important movement
or revolution by forcing other people to not make money using my
program!!!).]
-a.
Ps: the real reason this "text on programming" doesn't
talk about arrays is there is no
good way to "do" arrays in functional programming, even though
arrays are the single most useful structures in real programming;
hence my irritation.
OK, here's the scenario: I want to maintain a config file (~/browse.cf, say)
that is generated by the application but the user should have the ability
to edit. It needs to be in a language easy to automatically generate, easy
to reload, easy for external programs to maintain, and easy for the naive
user to modify.
What language would you recommend I choose? How do I provide the tools so
that the user can *also* maintain it in their language of choice?
Under any such scheme as this, the language that it all ends up being in
is going to be Scheme. The translators are just not going to be used, long
term.
This is not necessarily a bad thing. Just something to keep in mind.
Yes, I had a look at it. You didn't use postfix if at all, that I could see.
> Do you want big languages and little programs or vice versa?
I want little languages and little programs. I don't believe you can't
get there (watch out, he's got something under his coat! Oh no! He's got
a Forth interpreter! Run!) from here...
> I guess all we need is a elisp-to-perl (or is that scheme-to-perl)
> translator now and even rms will be happy. :-) (Someone else reports
> working on a tcl-to-perl translator already, but progress is slow.)
I don't think that you're going to get a good translator from any of these
data-driven languages to a procedural language any time soon. Run-time
manipulation of code is too much a part of what makes them interesting.
And it's also too much a part of what makes them useful extension
languages.
You're right, they will. But for the application domain I'm interested in
(extension languages for programs running on UNIX) it's more important.
I mean in this domain I find Tcl more than adequate, and nobody is going
to tell me it has anywhere near the functionality of scheme even without
any of these extensions.
> I don't, OTOH, see why we couldn't just have a simple
> #ifdef compilation option to exclude various features from the
> language,
That turns out not to work very well. You end up with everything compiled
in anyway. We've already been down that road in the Tcl world. The real
solution to application size is dynamically loadable extensions. Things like
the UNIX I/O package and the strings package would be well suited to that.
Extensions to the language syntax are less so.
And the real "size" metric I'm using is more like the one Tom was using
to measure the size of Perl 5 versus Perl 4: complexity. Adding new primitives
doesn't add much to the complexity (the mental size, if you like) of the
language. Adding new control structures, or changing the basic syntax
(making it more lispy) does.
> For real tightness and power in a scripting language, I (and many others)
> would recommend using "Forth" where these things matter
OK, OK, how about Postscript? In a lot of ways it's got most of the tightness
of Forth with a lot cleaner syntax. And people are used to dealing with it.
Yes, traditional PS implementations are pretty big but that's mostly the
rendering engine...
> Anyway, when can we get it ;-) Also, I'll repeat my oft'
> expressed desire to see DOS/Windoze/Win32 versions.
You already got that with Tcl.
Todd-
A focus group might be helpful in designing the application.
However, the choice ends there. Once the application is in use
the flexibility is gone. While I do not agree with everything RMS
has said, there has been little strategic thinking in the field
of freely available software - applications, operating systems
and 'total systems' like GNU.
--
Quentin Fennessy
Which reminds me of this little gem:
#!/usr/local/bin/wish -f
label .lab -bd 2 -relief raised -text "So, What is wrong with using a utility"
label .lab2 -bd 2 -relief raised -text "that kills babies...I happen to like"
label .lab3 -bd 2 -relief raised -text "tcl."
pack append . .lab {top fill} .lab2 {top fill} .lab3 {top fill}
button .b1 -text "End this madness" -command exit
pack append . .b1 {top}
--
Raul D. Miller n =: p*q NB. 9<##:##:n [.large prime p, q
<rock...@nova.umd.edu> y =: n&|&(*&x)^:e 1 NB. -.1 e.e e.&factors<:p,q [.e<n
NB. public e, n, y
x -: n&|&(*&y)^:d 1 NB. 1=(d*e)+.p*&<:q
That's why I think using scheme as an intermediate language is not
a good idea. I think a lower level language would be better, forcing
a compile no matter what language you write the thing in.
Stefan
Arrays are simple in concept. SICP is supposed to teach other more
"advanced" concepts. Doesn't mean that arrays are less used. They are
just assumed to be known !
> In fact: didn't Turing show that all we *really* need is two stacks
> of bits? Hmmm....
Yup ! Unbounded stacks !
And those stacks are easier to implement with lisp lists than with
arrays (and their fixed size)
> Ps: the real reason this "text on programming" doesn't
> talk about arrays is there is no
> good way to "do" arrays in functional programming, even though
There is, but it's still recent technology.
> arrays are the single most useful structures in real programming;
> hence my irritation.
And arrays are the single most obvious reason why most programs either
crash on big data sets or (if you're lucky) complain because it's
bigger than some arbitrary internal limit !
Stefan
Spartan minimalism? Tcl is hardly spartan... it's just designed for a
specific job and does it very well. Perl is designed for a different
job and does THAT very well. I don't think it could do Tcl's even as
well as Tcl does Perl's.
[preaching to the choir omitted]
Hey, I'm responsible for some of the features in Tcl that *are* there, like
the way strings work. Karl and I worked out the semantics of Tcl arrays on
his whiteboard when he worked here. We did Extended Tcl, because the original
language was too minimal, and a lot of that has been picked up. For that
matter we picked some ideas up from Perl... some of them didn't make the
cut and still aren't in the core language (like the filescan stuff).
But there's always been this basic assumption: that you don't add a feature
just because it sounds good. You add it because you need it. If there's two
ways of doing something you use the one that avoids complicating the language.
The classic example in Perl is the postfix if statement. It doesn't add
any capability to the language, and it confuses new users. In an extension
language that's a bad thing... because most of the time most users are
new users, because they're not using the language to do a job, they're
using it to configure the tool that does the job.
> The problem is, you see, is that quite simply, you're designing the wrong
> languages for the wrong crowd.
Who, me? I'm not designing a language at all. Or redesigning one. I'm trying
to keep a bunch of people from inventing yet another camel when the specs
don't even call for a horse.
[a bunch of stuff that doesn't seem to have anything to do with me at all,
skipped]
> When it comes to lisp or tcl, while the extensive run-time nature of
> those languages make machine language generation (at least of certain
> constructs) difficult, compiling them into native perl (probably
> with a run-time emulator library) should in theory present no insurmountable
> hurdles.
Certainly with a runtime emulator library... especially when you're running
around loading stuff on an ongoing basis at runtime and using code fragments
as your communication channel between components. And since you have to keep
doing that, what's the point to putting Perl in the loop at all? It's not
technically infeasible, it's just not very useful. And that's why I think
it's unlikely.
Don't make the user go through more work than necessary. If it bothers you
that we in English sometimes naturally express ourselves with the
conditional afterwards, use something else. It's more restrictive and
stilted and unnatural to enforce a particular style on the user. Ask your
mother if you don't believe me.
if (annoy $peter reversed("conditional")) {
use Something_Else;
}
value("flexibility") > value("restriction");
ask $mom if disbelieve $tom;
Remember, in C, you can say for(;c;) wherever you can say while(c), and
no one seems to mind that. It's the same issue. One is more readable.
You're asking for decreased legibility for no good reason. Likewise,
do {
foo();
} until $a || $b;
is some better than either of these:
do {
foo();
} while !$a && !$b;
do {
foo();
} while !($a || $b);
because they make you go through more work than needed. Likewise
foreach $a (@list) {
foo($a);
}
is superior to the far busier:
for ($i = 0; $i <= $#list; $i++) {
foo( $list[$i] );
}
But so what? It's not like we should can one or the other and
force you to choose between C and shell.
[yes, much of the previous was more addressed to the thread than
to just Peter]
--tom
--
Tom Christiansen Perl Consultant, Gamer, Hiker tch...@mox.perl.com
Malt does more than Milton can
To justify God's ways to Man.
>(sigh... here we go again)
>I'd like to respond to an error in Richard Stallman's latest posting.
>Stallman said:
> Sun recently announced a campaign to "make Tcl the universal
> scripting language." This is a campaign to convince all the
> developers who *don't* prefer Tcl that they really have no choice.
> The idea is that each one of us will believe that Sun will inevitably
> convince everyone else to use Tcl, and each of us will feel compelled
> to follow where we believe the rest are going.
>Please understand that this "campaign" exists only in Stallman's mind.
>As far as I know there has never been *any* official Sun announcement
>about Tcl. There is no campaign at Sun to stamp out Tcl competitors;
>Tcl and Tk aren't even official Sun products right now, nor has Sun
>made a commitment to turn them into products (yet). If anyone has
>concrete evidence to back up Stallman's accusations, please post it
>so we can all see it.
Here's a job posting that I came across while searching on Career
Mosaic's home page. It does talk about Tcl as the universal scripting
language. Maybe Prof Ousterhout could clarify this.
Prof Ousterhout is right in saying that negative campaigning is not
good. I'd say certainly I've heard more negative things said about
TCL, C++ etc in the scheme newsgroup than vice versa. There are really
neat things about scheme like high level macros but also not so neat
things like poor support for reuse (unable to use neat libraries
developed in C++ for example). The foreign function support in scheme
is far from good. The thing to remember is that scheme is not a
panacea for everything, it is one paradigm and providing interfaces
to other paradigms is only going to make it more acceptable.
--Suresh Srinivas
-----Job posting about Tcl from Sun -------------
Sun Microsystems Laboratories, Inc. is embarking on a new project
directed by Dr. John Ousterhout. Our goal is to make Tcl/Tk the
universal scripting language.
To accomplish this we are in the process of building a new group
which is well funded and fully dedicated to this project. This group
is under SMLI (Sun Microsystems Laboratories, Inc.) which is the
advanced technology and research arm of Sun Microsystems, Inc.
We are searching for several more individuals to join us in this
effort and play a key role in making this goal a reality.
You will help us on the development of the Tcl scripting language,
the Tk toolkit, and their extensions and applications.
The two most important projects will be a port of Tk to Windows and
Macintosh platforms, and the creation of a graphical designer for
Tk user interfaces. This will allow people to create interfaces
graphically on the screen, rather than writing scripts.
The individuals we are looking for will have solid experience
with C, C++, and experience with Tcl/Tk. We would also like to
have some expertise with MS/Windows and/or Macs.
The qualified candidate will also have a BSCS/MSCS and 5 plus
years work experience.
If you are interested in exploring this new opportunity please
follow up to:
Scott Knowles
SMLI
2550 Garcia Ave. MTV19-02
Mt.View, CA 94043
--
Suresh Srinivas Department of Computer Science
Grad Student Indiana University, Bloomington,
email: ssri...@cs.indiana.edu IN 47406.
On the other hand, Microsoft has things like visual basic, visual c, visual
c--, etc. and so they too see the need to provide various programming languages
into the user community. Or at least they see the financial benefits. What
is interesting is that friends who have these software packages indicate that
their 'widgets' (extensions, whatever you want to call them) can in many cases
be used across languages - a useful concept which doesn't seem to be making
it into the Unix arena.
--
:s Great net resources sought...
:s Larry W. Virden INET: lvi...@cas.org
:s <URL:http://www.mps.ohio-state.edu/cgi-bin/hpp?lvirden_sig.html>
The task of an educator should be to irrigate the desert not clear the forest.
: I don't, OTOH, see why we couldn't just have a simple
:#ifdef compilation option to exclude various features from the
Having to compile up multiple copies of interpreters has been tried in the
perl and tcl communities - to the frustration of many. What I and many
others have called for are ways to dynamically load independently developed
sets of enriched command sets into a very small base interpreter. This
would allow me to tailor an application to a required set of objects
and appropriate operations/methods, while passing on pieces for which I have
no need. Why should my applications be saddled with hundreds of k of X
overhead if the app I want to develop just wants to send messages to an
existing X app - but needs to do no window instantiation at all? Equally,
if all I need to do is small integer manipulation, I would just as soon
not be saddled with bignum floats. On the other hand, if a user wishes
to write their own extended commands for my app, and in doing so determines
that _they_ need bignum floats, X, or whatever, I would like for the language
to be able to support _them_ requesting said objects be loaded, along with
appropriate operation/method library code, etc.
:language, e.g. #undef GXL_UNIXCALLS or whatever. Same for "expect"
:interface (what is expect anyway?).
Expect is a nifty concept (available at least in a Tcl and Perl form -
perhaps in other languages now as well) where one defines a set of
interactions that need to take place with one or more processes. Think of
telecomm software in the micro world which allow you to capture login
scripts and then replay them to log into services, etc. Expect is
a language where one can write 'scripts' to invoke ftp, telnet, etc.
and then generate requests, watch for responses, etc. The latest Expect,
with an extended environment known as Expectk, allows one to wrap a GUI
around a traditional text based interaction such as ftp, password changes,
whatever, in a rather nifty way. There is also a neat paper done about
a feature of Expect called Kibitz - where one links two separate programs
together with expect/kibitz glue between - so that one program feeds input
to another and then receives the second's output as its input (think
of playing two chess programs against one another - not that this is
the only use, but a simple to grasp one).
: I would guess in the general case, the lion's share of apps
:using the language will have oodles of giant GUI extensions, user
:i/o validation, etc. and one average size 8bit+ color image will have
:a footprint bigger than the whole language implementation anyway! For
:real tightness and power in a scripting language, I (and many others)
:would recommend using "Forth" where these things matter - but I
It is true that many apps will be extended using many pieces of
extensions. If they are all loaded only when needed, and able to be
unloaded when not needed, this would allow an app to consume only the
resources needed at any one time. And if folk take into consideration
the 'hypertool' or applet approach, where entire mini-applications grow
up and communicate between one another, then one will find that more
use of distributed compute resources, threading, etc. will be utilized.
Stallman did recommend that separate translators from other languages to
the GNU standard language be GPL'd.
--
-F. Sullivan Segal
_______________________________________________________________
_
/V\ E-Credibility: (n -- ME) The unguaranteed likelyhood that
' the electronic mail you are reading is genuine rather than
someone's made up crap.
_______________________________________________________________
GCS d-- p--(---) @c++ u e-(*) m+(-) s/+ @n++ h--- f+ g+(--)
w+(+++) t++(-)@ b5++ yij++ r(dm)+ y+(*)
Mail to: flet...@netcom.com
Dynamic loading of extensions works just fine in Perl. Why do you think
the /usr/bin/perl binary can be just 50k? There's no longer any need for
fooperl, barperl, and flotzperl. Tcl users can use this feature if they
start their tcl programs with
#!/usr/bin/repl
use Tcl;
and go from there. No, I'm not entirely kidding.
--tom
--
Tom Christiansen Perl Consultant, Gamer, Hiker tch...@mox.perl.com
Documentation is the castor oil of programming. Managers know it must
be good because the programmers hate it so much.
Scheme is a beautiful little language and could be a great extension/
scripting language if it had standard portable interfaces to a large
number of libraries, and if it had native object support with
inheritance -- I understand some mutant strains do...
[Python certainly does.]
I think the GPL and its variants should be changed to something
less restrictive and simpler -- in one case I know of a developer did some
work using GNU stuff and ended up ripping a lot of it out in order
to avoid the bother of complying with the terms. He says next
time he'll license proprietary source with binary distribution
rights (such things exist, and can be very nice). Clearly, in this
case, the GPL didn't encourage the use of freely copiable software.
Next time I see him I'll recommend python and its associated tools,
since he can use them however he pleases, as long as he credits the
source. A gnu-scheme would fare better in the world if it had
a copyright like the ones on TCL and python.
No more comments on books. Sorry, sorry, sorry.
I can't agree. A compile step automatically makes for a lousy extension
language, unless all the compilers are built into the binary. For a lot of
uses the extra fork/exec overhead is by itself too high. And if all the
compilers are built into the binary, then they're the extension languages
and the "low level" one is an implementation detail.
[snippet]
I'm sorry, but putting a feature in because it's english like is just
plain silly. Programming languages are not human languages. If you don't
think so, there's always COBOL.
The syntactic distance between
> if (disbelieve $tom) {
ask $mom;
}
and:
> ask $mom if disbelieve $tom;
is pretty high. The former is clearly a control structure. The latter is
hard to pick out of code.
As for C, I don't recall arguing that C is either easy to learn or that it
would make a good extension language.
On the gripping hand:
for(;read_news;)
flame();
and:
while(read_news)
flame();
retain the same basic form. They don't add to the conceptual cost of learning
and using the language.
Some extensions are useful. Foreach is like C's "+=", it takes a common
idiom and removes a lot of duplication from it. Postfix if doesn't reduce
the complexity of the statement any (there's still as many elements to
evaluate) but does add to the complexity of the language.
This is where I'm coming from: adding a feature to a language because it's
neat (and postfix if is certainly neat... it's downright cute) is a bad idea.
That way lies COBOL.
(no, I don't think Perl's COBOL. I will note that a lot of the improvements
to Perl have involved removing complexity, which defends it from that
charge quite well *and* supports my argument against un-necessary frills)
> good way to "do" arrays in functional programming, even though
> arrays are the single most useful structures in real programming;
> hence my irritation.
Howdy,
Exactly why the universal scripting language should be
an embedded APL ;-)
=============================================
Scott McLoughlin
Conscious Computing
=============================================
> On the other hand, Microsoft has things like visual basic, visual c, visual
> c--, etc. and so they too see the need to provide various programming languages
> into the user community. Or at least they see the financial benefits. What
> is interesting is that friends who have these software packages indicate that
> their 'widgets' (extensions, whatever you want to call them) can in many cases
> be used across languages - a useful concept which doesn't seem to be making
> it into the Unix arena.
> --
Howdy,
The X-language widgets currently in use are typically
termed "components" in the DOS world and the dominant component
format is called a "VBX", originally a Visual Basic only library
format that became very popular.
VBX's are _very_ popular. While I don't use them, most
of my colleagues do. They report that the quality is _very_
_LOW_. Crash city (GPF, core dump, whatever). Many will
say "that's an implementation issue" - which is true. But it's
also an _economic issue_ - the heavy "consumerization" of
software, including programming tools, in the DOS/Windows
world. I wouldn't recommend chasing after the MSoft/Intel/
DOS/Windows model of computing, esp. as it's shaped up in
the last year or so. I (like many others) am there - it's not
pretty.
If you're interested in VBX's, though, you can go
out and snag VB for ~100 and a large collection of VBX's
for another ~ 100 dollars or so and go to town. Heck, get
a big C++ compiler for another ~100 or so and you can
write your own VBX's. Have fun.
Gareth
Everyone seems to be assuming that the extension language built into the
program must be the same as the extension language that the user sees.
Yet this is not so. What would be wrong with building in something that
is small, simple, and fast (e.g., Forth), and then providing tools to
compile something else to that (e.g., a gcc backend that generates Forth
instead of assembly)?
--Tim Smith
Anyway, that's one of the undertones *I* saw in the posting. And I'd
also appreciate it if future posts of this nature don't get posted
here. If I'm interested in an alternative to TCL, I'll look
on my own, thank you.
Now, when they get this language working and have Tk widgets for it, then
they can come and talk to me :-). Of course, I'd need it to not be GPLed,
so I guess it will never be of use to me.
--
|+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++|
| Michael J. Suzio msu...@umich.edu |
| Marketing Director - Friday Knight Games |
| aka "That F*K*G company!" |
|+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++|
Actually I would say that an object system is almost essential for a good
extension language. You want to be able to add new types to the language
which correspond to the elements of whatever system the language is embedded
in. Without this, you have to add tons of keywords to manage data objects
which are not really part of the language.
Python can do this, although its operator overloading syntax is a little
awkward and it might be too slow. Some of the modern functional languages
might be fast enough. Haskell comes to mind, but it has a horrendous syntax.
Another concern is that you probably want the source-code inputs to these
languages to be event driven- for X and multi-user applications. With
scheme you could do this with a preprocessor (just collect input until you
have balanced parentheses). Python's parser could probably be made event
driven as well, since it's table driven.
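The Scheme half of that preprocessor is easy to sketch (strings and
comments are ignored for brevity, and read-balanced-form is an invented
name, not a procedure of any existing system):

    ;; Collect characters from a port until the parentheses balance,
    ;; returning the complete form as a string ready to hand to the reader.
    (define (read-balanced-form port)
      (let loop ((chars '()) (depth 0) (started? #f))
        (let ((c (read-char port)))
          (cond ((eof-object? c)
                 (list->string (reverse chars)))
                ((char=? c #\()
                 (loop (cons c chars) (+ depth 1) #t))
                ((and (char=? c #\)) started? (= depth 1))
                 (list->string (reverse (cons c chars))))
                ((char=? c #\))
                 (loop (cons c chars) (- depth 1) started?))
                (else
                 (loop (cons c chars) depth started?))))))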
--
/* jha...@world.std.com (192.74.137.5) */ /* Joseph H. Allen */
int a[1817];main(z,p,q,r){for(p=80;q+p-80;p-=2*a[p])for(z=9;z--;)q=3&(r=time(0)
+r*57)/7,q=q?q-1?q-2?1-p%79?-1:0:p%79-77?1:0:p<1659?79:0:p>158?-79:0,q?!a[p+q*2
]?a[p+=a[p+=q]=q]=q:0:0;for(;q++-1817;)printf(q%79?"%c":"%c\n"," #"[!a[q-1]]);}
Of course, the real problem is how to do this across all platforms - how
does Perl handle all the different Unix, MS-DOS, etc. limitations on
dynamically loaded objects? (I have the Perl5 tech doc printed - it's
just that my speed reading skills keep getting cancelled out by kids
wanting my attention...)
The main reason for the acceptance of Tcl is the '/Tk' suffix.
I cannot see anybody who would like to learn a language like Tcl if it
were not for all the "Value Added" by Tk.
I am usually amazed by the ease with which you can build interfaces
with Tk. I also think that that is what makes the whole thing so
appealing.
Which brings up the following point. Any language which wants to be a
replacement for Tcl/Tk (and I believe that any serious Scheme or CL
attempt would just breeze through) should provide an equivalent of Tk.
Thanks for the attention.
--
Marco Antoniotti - Resistente Umano
-------------------------------------------------------------------------------
Robotics Lab | room: 1220 - tel. #: (212) 998 3370
Courant Institute NYU | e-mail: mar...@cs.nyu.edu
...e` la semplicita` che e` difficile a farsi.
...it is simplicity that is difficult to make.
Bertholdt Brecht
Using an arbitrary syntax to avoid favoritism:
display "Enter keystroke: "
read keystroke
display "Enter macro: "
read macro
define_key_macro %keystroke %macro
Now, if you're using an external compiler you need to run that compiler
from "define_key_macro".
Now suppose you're reading these from an X resource at startup. You're going
to have to call the compiler for *each* resource in turn.
Sorry, it just don't work. The underlying language *is* going to be exposed
to the user.
That doesn't mean that Scheme isn't a fine language for this, just that it's
nonsense to pretend the underlying language doesn't matter.
Because you can't maintain the system then. If your users get to go off and
pick any translator they want, you'll have to learn every available language
in order to debug your users' scripts. I don't believe that it is practical for
you to expect to be able to debug a script written in some unknown language that
was machine translated into scheme, or forth, or whatever. (I've argued this same
point with the Dylan people to no avail.)
If you're not going to let them pick, then you might just as well force them to use
the same language as you picked.
Mike McDonald m...@trantor.ess.harris.com
: This sounds a lot like OpenWindows vs. Motif vs. Athena/MIT for
: widgets and window managers. Did the user community benefit from all
: this choice ? It made it difficult to have a program with a GUI run
: on all platforms and it created a new industry of companies writing
: GUI independent libraries thereby making it even more difficult to
: share software between users. One party that wasn't adversely affected
: by the inability of the Unix vendors to agree on a "standard" was
: Microsoft - that's for sure.
The image that comes to mind is a tcl rat preoccupied and in mortal combat
with a scheme opossum all the while a Tyrannosaurus Basicus bears down:
"Yummm......both white meat and dark meat for lunch today....."
: Tom Moog
--
-Matt Kennel m...@inls1.ucsd.edu
-Institute for Nonlinear Science, University of California, San Diego
-*** AD: Archive for nonlinear dynamics papers & programs: FTP to
-*** lyapunov.ucsd.edu, username "anonymous".
Syntaxes other than Scheme are just ordinary extensions.
but so is scheme syntax itself. adam shah's architecture does not mention
the fact that most of the scheme syntax is an "extension" on top of a simpler
core language containing not much more than symbols, constants, quote,
define, lambda, begin, if, set!, prim, proc etc.
oz
Again, sorry to jump into what's obviously an ongoing and heated
discussion, but, although I completely agree with most of what Peter says
(particularly the first paragraph above), the ending "the latter is hard to
pick out of code" is a rather silly thing to assert. If you don't know the
language, of course it's hard. If you don't know the language (and it
doesn't resemble something you *do* know, such as COBOL vs. English :-),
anything is hard.
If you _Know_ the language, it's as natural as can be. At least that's my
opinion. That it's not natural for you is yours.
The problem with perl is that it resembles C in many ways. This is a
double-edged sword. It's good in that what is similar is, uh, similar.
It's bad in that what's not, isn't, and it's not always apparent what is
and isn't similar (that make any sense?). Anyone familiar with Japanese
will see the same double-edged sword in romaji, the expression of Japanese
using "English" letters.
The biggest trap of perl resembling C/sed/awk/COBOL/English/whatever is
that it can seduce a beginner. If you program in perl while Thinking in C,
your perl will suck bigtime.
Let me say that again: If you program in perl while Thinking in C, your
perl will suck.
Over the years I've done real, large, non-academic projects in some wild
languages (including FORTH, a nostalgic favorite), so I had a pretty wide
range of experience when I first encountered perl in the late 2.x stage.
It took me a *long* time to get to really _Know_ perl (i.e. in the biblical
sense :-). But once I was able to Think in perl, it was magical, just as
when I was finally able to think in Japanese.
*jeffrey*
-------------------------------------------------------------------------
Jeffrey E.F. Friedl <jfr...@omron.co.jp> Omron Corporation, Kyoto Japan
See my Jap/Eng dictionary at http://www.omron.co.jp/cgi-bin/j-e
or http://www.cs.cmu.edu:8001/cgi-bin/j-e
Peter da Silva writes:
I'm sorry, you've lost me. Either the extension language the user's
interested in is built into the executable, in which case they all
have to be, or it's got to be execed to convert the user's key macro
string into the implementation language, which gives you too much
of a performance hit, or you expose the underlying mechanics of the
implementation language to the user, which is what I thought you were
trying to avoid. What's the fourth alternative?
The ``fourth alternative'' is this: the parser and translator for a
user's favorite syntax is loaded into the running program on demand.
Thus, it is as easy to use as if built-in, but without the associated
costs of building in all languages.
Syntaxes other than Scheme are just ordinary extensions.
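To make that fourth alternative concrete, here is a hedged Python sketch
of on-demand loading (module names like "rush_translator" are purely
hypothetical, not anything GNU has announced):

# Look up a translator for the user's preferred syntax and load it only
# when it is first needed; the core evaluator only ever sees Scheme.
import importlib

_translators = {}                      # cache of already-loaded translators

def translate(source, syntax="scheme"):
    if syntax == "scheme":
        return source                  # the core language needs no translation
    if syntax not in _translators:
        # e.g. syntax "rush" -> module "rush_translator" (hypothetical name)
        _translators[syntax] = importlib.import_module(syntax + "_translator")
    return _translators[syntax].to_scheme(source)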
-t
--
----
If you would like to volunteer to help with the GNU extension language
project, please write to lo...@gnu.ai.mit.edu.
> Nonetheless, it turns out that perl is growing *SMALLER*, which is a truly
> remarkable thing for a programming language, if you think about it. The
> v5 release has a grammar that's 50% smaller than v4.0, and v5 has only a third
> of the reserved words that v4 had.
Depends on how you define smaller -- the language might be shrinking,
but the interpreter is getting bigger, at least on BSD/386:
  text    data    bss     dec    hex
405504   45056  28960  479520  75120  /usr/obj/export/src/usr.local/bin/perl5.000/miniperl
335872   49152  40876  425900  67fac  /usr/contrib/bin/perl
I think you're inventing AppleScript here. Applescript runs as a
separate stream from the programs that it's controlling and sends
messages to the programs requesting services (launching them if
necessary). The mechanism that AppleScript uses may be (and is) used
by other languages as well. (Yes, even tcl.) All the applications
have to do in order to be controlled by AppleScript is to provide
appropriate Apple Events and to document them through resources in the
executable. If these Apple Events are selected from a documented (and
growing) list of standard event suites, then the applications can even
be swapped out for other programs that provide the same functionality.
(Receive a script that calls Microsloth Word, but don't have a copy?
Use Word Perfect instead.) You can send Apple Events over the network
too. If the support is in the application, you can record an Apple
Script interactively.
Unfortunately, support for Apple Events in applications isn't
universal yet. Apple should have done this ten years ago, but that's
a separate flame which has nothing to do with the current thread.
Of course Apple had to do something like this because shell based
scripting languages just don't make much sense on a Mac.
Chris
----
Chris Garrigues
At work: (MIME capable) c...@mcc.com
Microelectronics and Computer Technology Corporation +1 512 338 3328
3500 West Balcones Center Fax +1 512 338 3838
Austin, TX 78759-5398 USA
At home: (also MIME capable) c...@DeepEddy.Com
609 Deep Eddy Avenue +1 512 499 0483
Austin, TX 78703-4513 USA
<a href="http://DeepEddy.Com/~cwg/">My homepage</a>
Please use this address for non-MCC related messages.
Interpreted languages allow you to write programs generating pieces of
other programs. This is something widely used in LISP-like languages.
Also, Tcl has that feature (you can write your own programming
environment on top of Tcl, including debugger and interpreter).
As for the suggestion that a compilation would be performed no matter
what language a function or program is written in, the utility of this
depends on what you mean by "compilation". In the classical sense
(C/C++ to assembly language), this will be just another language like
C or others. Therefore, it is difficult to find a rationale for that
new language (why not just take C or C++ and have the compiler
translate your scripting language to that?).
However, if we view compilation as a process similar to
byte-compilation in Smalltalk, Xerox LISP, or Emacs LISP, this makes
sense. In fact, it would support the idea of platform independence as
well as syntax independence. This is along the lines of my postulate
from an earlier message in that thread, that syntax independence must
be one of the features of this extension language. What we have to
agree upon are the primitives and the data type model of the language
..ummm... environment.
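As a point of comparison only (not GNU's design), Python already treats
compilation this way: source text becomes a platform-independent code
object, and a bytecode interpreter runs it. A tiny illustration:

import dis

# Compile a one-line "extension" to a code object, run it, then look at
# the intermediate code the paragraph above is talking about.
code = compile("fact = lambda n: 1 if n == 0 else n * fact(n - 1)",
               "<extension>", "exec")
exec(code, globals())     # the bytecode interpreter executes the code object
print(fact(5))            # -> 120
dis.dis(code)             # dump the platform-independent bytecode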
--Juergen Wagner
Juergen...@iao.fhg.de
gan...@csli.stanford.edu
Fraunhofer-Institut fuer Arbeitswirtschaft und Organisation (FhG-IAO)
Nobelstr. 12c Voice: +49-711-970-2013
D-70569 Stuttgart, Germany Fax: +49-711-970-2299
<a href = http://www.iao.fhg.de/Public/leute/jrw-en.html>J. Wagner</a>
> Dynamically loaded libraries. Possibly configured on a per-user
> basis.
That'd work. Would be a bummer if you saved your configuration when you had
your environment in "scheme mode" and then tried to reload your config file
in "Rush mode" though. Make sure your API supports having multiple DLLs
loaded at once.
(Hrm. GCC is a bit big for a DLL)
That's a good one. AREXX works that way. You need to be very careful about
the interface, though... AREXX requires you to implement your own baby
extension language to parse AREXX's input, and the interface is butt
ugly. Karl wrote a Tcl interface for the Amiga that just passed argvs
around, and it worked pretty well. A more general interface would want to
use type-tagged elements.
Also, if you're using dynamic libraries to implement the high-level extension
language, why have a low-level one at all (to pull in another thread)? Just
define the extension-language API and dynamically load the one you want.
That requires dynamic libs, you say? Well, it seems to me that any practical
multiple extension language arrangement is going to. It's too expensive to
fork-and-exec all the time, and you don't want to link everything in, so why
not just depend on dynamic loading and make scheme merely one option of
many?
> >I can't agree. A compile step automatically makes for a lousy extension
> >language, unless all the compilers are built into the binary.
> However, if we view compilation as a process similar to
> byte-compilation in Smalltalk, Xerox LISP, or Emacs LISP, this makes
> sense. In fact, it would support the idea of platform independence as
Which is what I mean by "all compilers are built into the binary". Now
your bytecode is, as I've said, an implementation detail. Do you really
mean that every program that uses GNUel should include Scheme and Rush
and Perl and Elisp?
Odd, I'm a native English speaker, and I think "if this, do that"
is very natural as well.
Of course, in other languages, "this if, that do" is natural,
should we also support that sort of thing?
--
Darin Johnson
djoh...@ucsd.edu
Where am I? In the village... What do you want? Information...
>>>>> "Peter" == Peter da Silva <pe...@nmti.com> writes:
>> Under any such scheme as this, the language that it all ends up
>> being in is going to be Scheme. The translators are just not going
>> to be used, long term.
> I don't think this is necessarily true, but you may be right.
Doesn't this argue for not embedding *any* extension language in a program but
instead defining a request/reply API that programs must conform to? Isn't
this the model that C uses (ie. think of functions as a request to something
for service). Then, any language that handles the API can be an extension
language.
Think performance might be a problem? Short-circuit the request code to look
in the local services list to see if the request could be handled locally.
Afraid of programs that *might* build in a specific language? Who cares! As
long as it supports the API, I'll start up my language processor that makes
requests according to the API and bypass the in-built language.
Possibilities?
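As a hedged sketch of that request/reply idea (every name below is
invented for illustration), the short-circuit might look something like
this in Python:

# One generic entry point: requests the application can satisfy locally
# are handled from a local services table; anything else is forwarded to
# an external language processor.
local_services = {
    "buffer-length": lambda args: 42,                  # stand-in for a real service
    "insert-text":   lambda args: "inserted %r" % (args[0],),
}

def handle_request(name, args, forward=None):
    handler = local_services.get(name)
    if handler is not None:            # short-circuit: handle it locally
        return handler(args)
    if forward is not None:            # otherwise hand it to the external processor
        return forward(name, args)
    raise KeyError("no service named %s" % name)

print(handle_request("insert-text", ["hello"]))        # -> inserted 'hello'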
--
==================================================================
David Masterson KLA Instruments
408-456-6836 160 Rio Robles
dav...@prism.kla.com San Jose, CA 95161
==================================================================
I only speak for myself -- it keeps me out of trouble
An interpreter is supposed to interpret textual representations and
map them to some primitive operations of the language. None of the
languages I mentioned actually compile every statement to an
intermediate code. My point is: define a set of primitive operations
and put an appropriate interpreter or compiler for the language of
your choice on top. This way, "the language" software would consist of
the bytecode interpreter, plus some "canonical" syntax for the
extension language, not necessarily for all languages.
However, it probably doesn't make sense arguing too much about
architectural or syntactic issues at this time. The question is more
what kind of semantics (i.e., primitives, data types, and control
structures) we need. Next, we can talk about how to represent that in
different syntactic forms, and how this can be mapped to a system
architecture.
In principle, *any* interpreted (or interpretable), universal
programming language would qualify as a candidate for *the* extension
language. What we should do now is to define some requirements for the
semantics of that language. If there is a 1:1 translator (maybe
through an intermediate code), it wouldn't matter much if we wrote
proc fact (n) {
    if (n == 0) { return 1; }
    else return fact(n-1)*n;
}
instead of
(defun fact (n)
  (if (= n 0) 1
      (* (fact (- n 1)) n)))
or anything else. One thing which should be avoided (because it
endangers the ease of use of such a language) is mandatory type
declarations. Typing should be dynamic as in LISP. Multiple (also
user-defined) types should be part of the language concept, unlike in
Tcl.
The second important point is what is supposed to make that language
so special as an extension language, i.e., which extra features
particularly qualify the selected language over the others? Here,
two issues are important: data exchange between code written in the
extension language and code written in other languages, and control
flow between such pieces of code from different languages.
We should keep in mind that this extension language business doesn't
aim at developing a new language, but may take an existing one and
make it fit the requirements with as few modifications as possible.
>Peter da Silva `-_-'
>Network Management Technology Incorporated 'U`
>1601 Industrial Blvd. Sugar Land, TX 77478 USA
>+1 713 274 5180       "Have you hugged your wolf today?"
                         ^^ Not yet, no. But I have two
                            cats instead...
Cheers,
I agree in that the syntactic differences are pretty clear.
However, you're mixing syntax and semantics. Depending on the semantics
attached to the two statements, you can have *both*, *one*, or *none*
representing a control structure. In a rule-based system, the first
form is quite common and only declaratively formulates a rule. The
control structure comes in through the rule interpreter/inference
mechanism. Therefore, this is just a bad example.
All languages (including COBOL :-) just happen to be similar to a
context-free language which might be mistaken for a subset of English
with some mathematical notation (except for LISP, which is natural
language with parentheses :-)). It is in the nature of programming
languages that they cannot be, and are not, human languages. In fact,
they don't have to be.
This has already been done - Scheme/TK exists. I think there was a
Perl and Python port as well.
--
Darin Johnson
djoh...@ucsd.edu
"I wonder what's keeping Guybrush?"
Well, when I was a very new user to perl (I still consider myself a 'new' user
a year later) one of the most attractive, clear, and anti-confusing features
was the ability to write conditionals the way I would speak them. For an
English language speaker "do this if that" is very natural.
Now, I have an E.E. background. I can do conditionals on a gate-logic level,
and I've programmed C and C++ for years, so I knew prefix syntax, and I still
wasn't confused. In fact I'd argue that perl is less confusing because you
can write "a unless b" and C more confusing because you can't, and the
condition always gets emphasized over the action.
--
Logan Ratner | rat...@rice.edu | It is not funny that a man should die, but
CRPC/CITI | tinker | it is funny that he should be killed for so
Rice Univ. | tailor | little, and the coin of his death should be
Houston TX | cybernaut | what we call civilization - R. Chandler
:My only problem with the whole issue is that it seems that this push to
:develop GNUScript was motivated solely by RMS seeing Sun hire John and
:concluding that the evil corporate types had subverted the language. I
:mean, fine if he doesn't like TCL - there are plenty of improvements that
:could be made. But to shy away from it just because some stupid Sun
:marketing type wants to tout it as "the universal scripting language" is a
:bad idea.
Exactly. This is also the tone I gather from the article.
Frankly, Lisp/Scheme-like syntax confuses the heck out of me.
I don't have a problem if they (GNU) develop GNUscript, but don't
develop it for the wrong reason(s). It's just ... yet another language.
Let the people pick or kick whatever language as they see fit ...
I wish the GNU people good luck. In the mean time, I'll stick with
perl, Tcl, and Tk. Thank you.
-- budi
--
Budi Rahardjo <Budi_R...@UManitoba.Ca>
#include <std-disclaimer.h>
Unix Support - Computer Services - University of Manitoba
You are right.
Of course, the proper way to do it is not Tk compatibility (as Tk
is just X Windows). One really simply wants the scripting
language to contain display as well as procedural commands;
within a given environment (X, Windows, Mac) these display commands
would map onto suitable graphical operations.
I believe folks in the python community are worrying about this;
they may be the first to solve it properly. Python, as I understand,
already has a feature like this, but it is not yet as
powerful as Tk.
But: since Stallman is now designing the GNU language, he should
bear this in mind from the start---more important in practice
than whether it is lisp based is what sort of graphical
features it will have as part of the standard language.
--
Barry Merriman
UCLA Dept. of Math
UCLA Inst. for Fusion and Plasma Research
ba...@math.ucla.edu (Internet; NeXTMail is welcome)
The GNU vision seems to be Scheme + interpreters for
all other scripting languages, thereby achieving
universality through multi-linguality; also, RMS
presented a specific set of desirable features.
Can DR. O provide us with an idea of the directions
they are thinking in at Tcl central?
Note: The Tcl direction need not address the same
concerns as RMS---I'm just wondering what their
concerns are. For example, questions like
Will Tk be extended to be platform independent
(run on Windows and mac as well as X)?
Will Tcl get faster?
Will Tcl be extended to be a full-featured
scripting language, including support for
more data structures and object-oriented programming
(as part of the base language, not just an add-on)?
Or, whatever other Tcl visions they have.
Now that we see where GNU is heading, if we
had an idea of the general direction Tcl is
moving, that could help users decide things...
> Depending on the definition of define_key_macro. [Some definitions
> of define_key_macro wouldn't expose any of the details of the
> extension language.]
I'm sorry, you've lost me. Either the extension language the user's
interested in is built into the executable, in which case they all
have to be, or it's got to be execed to convert the user's key macro
string into the implementation language, which gives you too much
of a performance hit, or you expose the underlying mechanics of the
implementation language to the user, which is what I thought you were
trying to avoid. What's the fourth alternative?
Dynamically loaded libraries. Possibly configured on a per-user
basis.
--
Raul D. Miller n =: p*q NB. 9<##:##:n [.large prime p, q
<rock...@nova.umd.edu> y =: n&|&(*&x)^:e 1 NB. -.1 e.e e.&factors<:p,q [.e<n
NB. public e, n, y
x -: n&|&(*&y)^:d 1 NB. 1=(d*e)+.p*&<:q
> For an
> English language speaker "do this if that" is very natural.
Odd, I'm a native English speaker, and I think "if this, do that"
is very natural as well.
what is so odd about it? i'm a native `english' speaker, and *both*
are natural to me.
Of course, in other languages, "this if, that do" is natural,
should we also support that sort of thing?
depends if larry speaks those languages or not.. ;-)
.mrg.
For information on a good implementation of tk with perl4 see
<A HREF="http://www.ira.uka.de/IRA/SMILE/tkperl/">
http://www.ira.uka.de/IRA/SMILE/tkperl/</A>
Note: this is not the same tkperl that gets the lion's share of
attention in this group. I sure wish one of those two camps would
change the name of their product... though come to think of it, one
is tkperl and the other is tkperl5...
ARexx ARexx ARexx... :p ;)
Personally, I think the general method that ARexx [and apparently
AppleScript] uses in its implementation would be the way to go for
multiple languages. The important thing is that you need to be -very-
careful about designing the application<->interpreter interface. But
theoretically, it would truly let you support whatever languages you
wanted to.
----------------------------------------------------------------------------
Dianne Kyra Hackborn "Information is not knowledge; Knowledge is not
hac...@mail.cs.orst.edu wisdom; Wisdom is not truth; Truth is not beauty;
BIX: dhack / IRC: Dianne Beauty is not love; Love is not music;
Oregon State University Music is THE BEST. . ." -- Frank Zappa
> Now that GNU has announced its fundamental vision, it put the
> pressure on the Tcl designers to put forth their vision of the
> future. [...]
> Now that we see where GNU is heading, if we had an idea of the
> general direction Tcl is moving, that could help users decide
> things...
One hopes it will be in the direction of something like rush (in
addition to platform independence), in which case users don't need to
decide: rush is likely to be one of the languages available from GNU.
--
Bruce Institute of Advanced Scientific Computation
br...@liverpool.ac.uk University of Liverpool
Thanks, Peter. See Larry's release notice posted elsewhere in these
language areas for details, and/or glance at my implementation of a
Patricia trie in perl, recently posted elsewhere in this thread.
:I still think that all other things being equal a tighter, smaller language
:is better than a larger and more complicated one,
[]
:...I'd rather the winner be something more minimalist...
I'd rather it were more usable. :-)
Do you want big languages and little programs or vice versa?
Nonetheless, it turns out that perl is growing *SMALLER*, which is a truly
remarkable thing for a programming language, if you think about it. The
v5 release has a grammar that's 50% smaller than v4.0, and v5 has only a third
of the reserved words that v4 had. Functions are migrating out of the core
language into libraries, particularly now that this can be done cleanly,
portably, and transparently. At the same time, it has gained a great deal
to provide for safe, flexible, portable, and extensible programming of
more serious programs than were hitherto reasonably attempted.
I guess all we need is an elisp-to-perl (or is that scheme-to-perl)
translator now and even rms will be happy. :-) (Someone else reports
working on a tcl-to-perl translator already, but progress is slow.)
--tom
--
Tom Christiansen Perl Consultant, Gamer, Hiker tch...@mox.perl.com
As Zeus said to Narcissus, "Watch yourself."
> I still think that all other things being equal a tighter, smaller language
> is better than a larger and more complicated one, and all the enhancements
> to Scheme suggested by RMS in <941019042...@mole.gnu.ai.mit.edu> are
> a bit worrisome. If this comes down to a fight between Sun and the FSF my
Howdy,
Ummm, RMS's proposed extensions don't seem to "bloat" Scheme
too much IMHO. Scheme itself is a very small language. Adding fluid
variables was done in Oaklisp, so there's precedent. Exception
handling and catch/throw are simply first class exits with dynamic
extent, which has also been researched, implemented, etc. Multiple
obarrays are simply "packages" - not rocket science.
Anyway, none of these seems "big" in terms of code size and/or
runtime space requirements. As for "flexible string manipulation
functions" and "access to all or most of the Unix system calls", these
will add to the "footprint" of the language. I guess it's a tradeoff -
multiple incompatible "user level" extensions vs. built in convenience
(and perhaps some economy achievable only by "built-ins"). It's a
"judgement call".
I don't, OTOH, see why we couldn't just have a simple
#ifdef compilation option to exclude various features from the
language, e.g. #undef GXL_UNIXCALLS or whatever. Same for "expect"
interface (what is expect anyway?).
I would guess in the general case, the lion's share of apps
using the language will have oodles of giant GUI extensions, user
i/o validation, etc. and one average size 8bit+ color image will have
a footprint bigger than the whole language implementation anyway! For
real tightness and power in a scripting language, I (and many others)
would recommend using "Forth" where these things matter - but I
wouldn't sic Forth on users of a "general purpose" scripting language
(I'm not religious).
Anyway, when can we get it ;-) Also, I'll repeat my oft'
expressed desire to see DOS/Windoze/Win32 versions.
=============================================
Scott McLoughlin
Conscious Computing
=============================================
It's been pointed out to me that Perl 5 has apparently considerably tighter
syntax and semantics than the rather ad-hoc Perls of yesteryear, so I'll see
if I can come up with a better analogy. It's a sad day when one's favorite
bad examples fall by the wayside. (HHOS)
I still think that all other things being equal a tighter, smaller language
is better than a larger and more complicated one, and all the enhancements
to Scheme suggested by RMS in <941019042...@mole.gnu.ai.mit.edu> are
a bit worrisome. If this comes down to a fight between Sun and the FSF my
money wouldn't be on Sun (not after they dropped NeWS and OpenLook), and
I'd rather the winner be something more minimalist...
The threat that TCL will take over the world is improbable. Sun has not
been completely successful in imposing standards. (I certainly don't feel
compelled. Just look at all the sites that haven't adopted NIS+ for
example.) Why should anyone think that this attempt will be any better?
Sun certainly can not prohibit other scripting languages from running on
their systems. In addition, they are not the only game in town. Far from
it; DEC, HP, IBM are also in the OS and workstation business. If they
don't choose TCL as the "universal scripting language" then Sun's
"universe" will be very small indeed.
I think healthy competition is good, so I welcome GNUel (rhymes with
"jewel" -- gosh it even looks like *.el !) into this arena. The best thing
that could happen is that the wizards build a better mouse trap.
--
Ken Mayer
MRJ, Inc. (703) 385-0722
10455 White Granite Drive (703) 385-4637 fax
Oakton, Virginia 22124 kma...@mrj.com
USA "Not now, I'm spanking my inner child."
Isn't OLE the Microsloth technology for embedding applications in one
another?
Conceptually, I think ToolTalk/DCE/etc are at the same level as Apple
Events in that they provide a way to do interprocess communication.
Applescript is a layer above that which uses Apple Events to control
applications. It includes registries of event suites which are
required in order to control applications, an actual language
(modelled on Hypertalk *ugh!*), and techniques for collecting events
in real time and building a script from them, which can then be edited
with the Script Editor.
And that said . . . I've now exhausted my knowledge and in fact, I
probably went a little beyond my knowledge into mythology. However,
this being usenet, I'm sure any misstatements will be corrected in due
time.
Yes. Even though scheme didn't have it for a long time.
It's useful sometimes, but not too often.
> 2. Eval must take statements in the language the user writes.
Wrong assumption !
Eval is pretty rare. And building source code on the fly is a pain.
A library to build compiled code will have to be written for the
compilers anyway, so why not make it pretty and usable by users to build
up code on the fly and call eval on it afterwards?
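Roughly what such a code-building library could look like, sketched with
Python's own ast module (an analogy only; whatever GNU ships would have
its own interface):

# Build the expression  func(arg1, arg2)  programmatically, compile it
# once, and eval the resulting code object -- no source strings pasted
# together.
import ast

def build_call(func_name, *args):
    call = ast.Call(func=ast.Name(id=func_name, ctx=ast.Load()),
                    args=[ast.Constant(a) for a in args],
                    keywords=[])
    tree = ast.Expression(body=call)
    ast.fix_missing_locations(tree)
    return compile(tree, "<generated>", "eval")

print(eval(build_call("max", 3, 7)))     # -> 7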
the following points are of course not valid any more :-)
> 3. The extension language runtime must understand (either to interpret
> or to compile) this language.
> 4. Therefore it is impossible to completely separate the user-visible extension
> language from the internal extension language.
Stefan
Why ?
Is elisp visible from its keyboard macro facility ?
> Now suppose you're reading these from an X resource at startup. You're going
> to have to call the compiler for *each* resource in turn.
First, I don't see why use X resources for that purpose (but then, why
not). Second, these resources might be generated by a compiler !
Stefan
No! You can just announce that if they don't use language X, then
you don't provide any support: no problem for you and more flexibility
for the experienced user!
Your reaction is a bit the same as our sysadmin's: he put the whole init
files in /etc/csh.{login,cshrc}, claiming that he doesn't support
users who change their init files. But then, in order not to use his
files I had to recompile my own tcsh version which doesn't source
/etc/csh.*.
Stefan
It's very simple:
1. A decent extension language will probably need "eval".
2. Eval must take statements in the language the user writes.
3. The extension language runtime must understand (either to interpret
or to compile) this language.
4. Therefore it is impossible to completely separate the user-visible extension
language from the internal extension language.
Of course, one could provide the user-visible language as a dynamically-
loadable interpreter or compiler, but this would preclude anything as
heavy-weight as running gcc for every eval statement.
Wayne
Well, I started using Tcl before Tk even existed outside Sprite. Why?
Because at 40k Tcl was the only extension language out there that was
practical to use on Xenix-286, and because it's still the easiest language
to suck existing programs in as extensions (because they're already
written to parse argv). It also let me write significant scripts on a
machine for which Perl was too large to compile, and the shell was too
slow. It turned a tortoise of a development environment into something
useful.
Tcl is still about the smallest useful extension language. I know it's
politically incorrect to worry about code size these days, but I come from
simpler times when megs were megs and ... oh, never mind ...
pe...@nmti.com (Peter da Silva) writes:
>(no, I don't think Perl's COBOL.
Perl _is_ COBOL, not because of its verbosity, but because of the
large number of builtin commands *each with its own peculiar syntax*.
if you look at perl 5, you'll see the number of reserved words
dropped by 2/3rds, and the "rules" for pretty much all of them
have been standardised.
perl 5 can *look* like cobol when you do
use English;
and use the *long* names (like $INPUT_FIELD_SEPARATOR)
(Saying "Perl is COBOL" is a bit too strong, but you know what I mean.)
bit? perl is the combination of a myriad of languages. it
is just wrong to say "perl is blah" in any sense.
.mrg.
I used REXX a lot back when I was a VM/CMS guy, and I loved it (in
spite of its flaws). But I think that most of the appeal lies in the
way it interacts with its environment: REXX was designed for CMS, and
many standard parts of CMS were designed for REXX. The combination of
REXX, the CMS program stack, and CMS Pipelines can be *great* fun.
But under UNIX, REXX's flaws overshadow the benefits, and I definitely
prefer other alternatives. Tcl has most of REXX's good qualities,
with few of the bad ones. And Perl has the same sort of synergy with
UNIX that REXX has with CMS.
(IMHO, of course.)
---glv
Do you really think it's frequent in elisp to generate code on the
fly, for example ?
What I suggest is to separate the elisp compiler from the rest and
have only the byte code interpreter embedded. Scripts would be
distributed in byte compiled form.
For on the fly code generation, there could be a library to
conveniently generate the byte-code (library necessary anyway for the
compilers).
When I say byte-compiled, I don't really care what it looks like,
just that it would be designed for being convenient to use for a
program (interpreter or code generator) rather than for a programmer !
The fork/exec cost would be a one-time cost: no big deal indeed!
Stefan
PS: and it would avoid the tendency of forcing everybody to use scheme
because it's the only language really supported by GNUel.
What is the point of all the discussions? Is the goal to give 'users'
(whoever that might be) the ability to add new commands to a program? If so,
then we need to figure out which of the users we are trying to
cater to. The users most used to Windows are not going to want a selection
of dozens of languages - they are going to want Visual Basic - or something
so close that the differences are unimportant. This means not Tcl,
not Perl, not Scheme, or anything else.
Or are the users the system administrators of large Unix installations?
Then in all likelihood the language should look like either ksh (or one
of the derivative/supersets) or perl.
Or are the users college students? By capturing the attention of folk
before they move out to industry, one has a better chance of moving one's
product on out to the marketplace - that's the approach Unix took as you
all know. Then Modula-3, Scheme, Dylan or whatever is the language
du jour is preferred as a base.
:>
:> If you're not going to let them pick, then you might just as well force them to
:> use
:> the same language as you picked.
:
:No! You can just announce that if they don't use language X, then
:you don't provide any support: no problem for you and more flexibility
:for the experienced user!
There are two results to this approach. One is that indeed you limit the
problems of dealing with all those languages out there. That is because
you limit the number of customers to a very small subset. If however
you are depending on said customers for income - you have just lost bucks.
I'd like to respond to an error in Richard Stallman's latest posting.
Stallman said:
Sun recently announced a campaign to "make Tcl the universal
scripting language." This is a campaign to convince all the
developers who *don't* prefer Tcl that they really have no choice.
The idea is that each one of us will believe that Sun will inevitably
convince everyone else to use Tcl, and each of us will feel compelled
to follow where we believe the rest are going.
Please understand that this "campaign" exists only in Stallman's mind.
As far as I know there has never been *any* official Sun announcement
about Tcl. There is no campaign at Sun to stamp out Tcl competitors;
Tcl and Tk aren't even official Sun products right now, nor has Sun
made a commitment to turn them into products (yet). If anyone has
concrete evidence to back up Stallman's accusations, please post it
so we can all see it.
The only information I recall from Sun about Tcl is a couple of job
listings and one or two personal messages from me, which were intended
to keep the Tcl community informed about what my group is doing and
to solicit input. I believe that these messages were posted on
comp.lang.tcl only; if the goal had been to browbeat people who dislike
Tcl, the messages would have been posted where Tcl haters would see them.
Of course, I hope that someday Tcl and Tk will become Sun products (and
also products at many other companies), but I think that this will
*increase* the alternatives available to people, not decrease them.
I also hope that Tcl and Tk will become a universal scripting language
for the Internet, but I hope to do this by making Tcl and Tk so attractive
that people *want* to use them, not by somehow preventing people from
using alternatives. I don't see how such a negative campaign could work
anyhow.
If there has been a negative campaign, I daresay it has come from
Stallman. After all, who posted a message that used half-truths to
try to convince people not to use a particular system, without even
providing a viable alternative?
Fortunately, I found the rest of Stallman's message, where he began
the process of designing a new scripting language, more encouraging.
I just hope that he and those who work with him can focus on the
positive process of developing what they think is a better scripting
language, rather than a negative process of accusation and
misinformation.
The saddest thing about all of this is that the UNIX community
continues to bicker about silly details such as whether a programming
system can be considered to have linked lists if it doesn't have a
garbage collector too, while Microsoft steadily increases its market
share and makes us irrelevant. If you're looking for a company that
really knows how to squash its competition, you should look farther
north.
Excessive exegesis is the greatest vice of internet discourse. I
would much rather use my posting minutes to work on the project than
to argue about the sources of its inspiration. However, I feel I have
a responsibility to clear up one point for the record, and to extend
an offer.
John Ousterhout quoted RMS and replied:
Sun recently announced a campaign to "make Tcl the
universal scripting language." This is a campaign to
convince all the developers who *don't* prefer Tcl that
they really have no choice. The idea is that each one of
us will believe that Sun will inevitably convince everyone
else to use Tcl, and each of us will feel compelled to
follow where we believe the rest are going.
Please understand that this "campaign" exists only in Stallman's mind.
[...]
If anyone has concrete evidence to back up Stallman's accusations,
please post it so we can all see it.
[...]
The only information I recall from Sun about Tcl is a couple of job
listings and one or two personal messages from me, which were intended
to keep the Tcl community informed about what my group is doing and
to solicit input.
Was the campaign only in RMS' mind? In one ``personal message'' to
the Tcl community, Dr. Ousterhout wrote:
I'm enclosing below my "official blurb" on what is happening in my new
group at Sun. [....]
Here's the blurb:
The Tcl/Tk project that I'm heading at Sun has the long-term goal
of making Tcl and Tk into a universal scripting language for the
Internet.
One of the job postings was written this way:
_________________________
SUN LABS Tcl/Tk Project
_________________________
Sun Micorsystems Laboratories, Inc. is embarking on a new project
directed by Dr. John Ousterhout. Our goal is to make Tcl/Tk the
universal scripting language.
By all means, Dr. Ousterhout is forever free to clarify his meaning or
even simply change the stated goals of his project -- but RMS' posts
are very much responsive to a reasonable interpretation of what
Dr. Ousterhout wrote in the past.
But the news isn't all contentious. Dr. Ousterhout explains the
positive emphasis of his campaign this way:
I also hope that Tcl and Tk will become a universal scripting
language for the Internet, but I hope to do this by making Tcl
and Tk so attractive that people *want* to use them, not by
somehow preventing people from using alternatives.
In that case, Dr. Ousterhout, you should find the GNU extension
language plans very exciting, and I hope we can find ways to cooperate
more directly.
For example, a program that supports the GNU extension language will
be programmable using Rush or a Rush-like language, provided only that
a suitable translator has been written. (Rush is, in fact, already
written as a translator to Scheme).
Rush is semantically and syntactically very close to Tcl; the differences
are not likely to be noticed by many programmers. The performance of
Rush is generally superior to Tcl. It is quite plausible to view Rush
(or a Rush-like language) as a direction in which Tcl can evolve
smoothly. Upward compatibility is a delicate matter, but I am sure
that, working together, we could handle it with the greatest
gentleness towards existing users.
I am at your disposal to discuss the possibility of cooperation off
line. (Our offices are within minutes of each other).
-t
>> Translators from other languages to modified Scheme will not be part
>> of any application; each individual user will decide when to use one
>> of these. Therefore, there is no special reason not to use the GPL as
>> the distribution terms for translators. So we will encourage
>> developers of translators to use the GPL as distribution terms.
>
>This simply means that I _won't_ use the modified scheme in my application..
>full point. Because I consider that Tcl syntax (and even "set x 12" more than
>the C-like "x=12") is much more readable than any Elisp or Scheme. I'd like to
You are missing the point. You and your customers are free to write
extensions to your application in, say, Python (very readable). Since you
may distribute the Python-to-Scheme translator for free (under the GPL), you
won't have to rewrite it, nor will it restrict you from selling your product
that has a Scheme interface. You may even write your own extensions to your
product in modified Tcl if you wish. So what's the problem?
- Josef
watserv1.waterloo.edu: languages/apl/j
>One hopes it will be in the direction of something like rush (in
>addition to platform independence), in which case users don't need to
>decide: rush is likely to be one of the languages available from GNU.
It is more likely that the layered extension language will
be 'gush', which will be 99% the same as rush, but won't run
any existing scripts.
Tom
--
Tom Moore tmo...@pnfi.forestry.ca
Petawawa National Forestry Institute
Canadian Forest Service, Box 2000, Chalk River +1 (613) 589-3048
ONT K0J 1J0 CANADA +1 (613) 589-2275 telefax
Another advantage I like in Scheme is that parenthesized Lisp seems to have
a special feature, missing in any other language I can think of: while it's
pretty repulsive to read, it isn't any _more_ repulsive to read
machine-generated code than hand-crafted human-edited code.
The downside I see to this GNU Extension Language plan is illustrated by
RMS's plan to pile language uglifications (which he calls "extensions") onto
Scheme: bloat and inconsistency. RMS opened this discussion (in his "Why you
shouldn't use TCL" flamebait) by suggesting we learn a lesson from GNU
Emacs. Well, the important lesson I see in GNU Emacs is that it takes
restraint to avoid having a successful program bloat up until it eats 10s of
megs of VM and disk, all in overhead.
I don't care to have two different false values in scheme, and lose the
distinction between a false result and an empty list. But then, I've never
particularly wanted to run dissociated-press over the output of
psychoanalyze-pinhead, either. I think psychoanalyze-pinhead is the
important lesson of GNU Emacs.
-Bennett
b...@mordor.com
> Do you really think it's frequent in elisp to generate code on the
> fly, for example ?
I don't know. I don't use elisp. I *do* frequently generate code on the fly
in my own scripting, even in the shell:
while read blah
do
....
done | sh
An extension language that didn't provide this cheaply would not be useful to
me.
> > display "Enter keystroke: "
> > read keystroke
> > display "Enter macro: "
> > read macro
> > define_key_macro %keystroke %macro
> > Now, if you're using an external compiler you need to run that compiler
> > from "define_key_macro".
> Why ?
Because the macro is in the extension language.
> Is elisp visible from its keyboard macro facility ?
I don't much care what elisp does. I'm talking about what I do.
> > Now suppose you're reading these from an X resource at startup. You're going
> > to have to call the compiler for *each* resource in turn.
> First, I don't see why use X resources for that purpose (but then, why
> not).
Because if you use X resources then it'll work the way you want on your
display, even if you're running a program from someone else's account.
> Second, these resources might be generated by a compiler !
I prefer to have my .Xdefaults file at least marginally readable.
> Wrong assumption !
Huh?
> Eval is pretty rare.
Eval is pretty common.
> And building source code on the fly is a pain.
That depends on the language.
I also find myself using eval (or derivatives) sometimes. (By the way,
isn't it cheaper to call eval than to pipe something to sh?)
But (even though, I have no evidence for that) it seems it depends on
the language: my scheme scripts never use eval, whereas sh, csh and tcl
use it fairly frequently.
I have the impression that whenever I use eval, it's because I don't
have enough control of what gets evaluated when. The substitution
paradigm doesn't seem flexible enough.
And it so happens that scheme is powerful enough to let me specify
things conveniently without eval. Note that I never used eval in elisp
either.
Stefan
> In article <38loi6$3...@info.epfl.ch>, Stefan Monnier <mon...@di.epfl.ch> wro
> : Do you really think it's frequent in elisp to generate code on the
> : fly, for example ?
>
> I do it quite often in JED. For example, almost everyday I will type Ctrl-X
> ESC and enter some code at the `S-Lang>' prompt. Usually though, I am
> testing an debugging packages for JED. For example, suppose that I decide
> to write a function that numbers lines in a buffer. I move to the scratch
> buffer and type:
[ Example of top level definition deleted. ]
Howdy,
I do not see a big problem compiling/running top-level
definitions and bits of code. One can easily compile to bytecode
so fast that the user will notice _no difference_ between
eval and compile/run. Existence proofs abound.
I think the original poster was concerned with
examples like this:
(defun weird-lisp (op-key arg1 arg2)
  (eval (list (find-op op-key) arg1 arg2)))
Here we are building Lisp expressions INSIDE OF LISP and
running them. Here an interpreter will probably be
faster than a compile/run sequence, especially for
small, non-iterating expressions.
Many folks (including me) consider this bad Lisp
in general. In most of the cases, LAMBDA will do a better
job. Nevertheless, the practice exists. A few months ago,
a poster on BIX was using EVAL + ASSOC in lieu of a
CASE construct! When I pointed him in the right direction,
he replied that he thought CASE was "fancy". Go figure.
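For concreteness, here is the same contrast in Python (just an
illustration of the point about preferring a table of functions over
eval; the operation names are invented):

# The weird-lisp approach builds an expression and evals it; the plain
# approach keeps the operations as ordinary functions and just calls them.
ops = {
    "add": lambda a, b: a + b,
    "mul": lambda a, b: a * b,
}

def apply_op(op_key, arg1, arg2):
    return ops[op_key](arg1, arg2)    # direct call -- nothing re-read or eval'd

print(apply_op("mul", 6, 7))          # -> 42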
In general, I think the read/compile/run/print loop
is better than read/eval/print, although debugging/stepping/
etc. might suffer or be lots more work. In these cases,
an EVAL can look for the trivially simple cases (literals,
variables) and compile/run the rest.
Anyway, I think SCM does clever modifications of
S-Expression representations of programs to achieve some
of the speed-ups one gets from a byte code compiler, so
it looks like a good middle ground given its other
advantages. In general, though, no Lisp I know does
anything so brain damaged as performing the "read"
portion of the interpretation cycle more than once,
like many other language interpreters do.
=============================================
Scott McLoughlin
Conscious Computing
=============================================
Well, I don't consider this as on the fly code generation.
Since you type the code by hand at the time, a compiler is likely to
be fast enough so that you hardly notice it's not directly
interpreted. Just tell the editor which compiler to use (so that you
can choose your syntax) and it will run it for you and eval the
resulting code.
On-the-fly code generation is when a program generates code and then
evals it. Since the code is generated by a program, the generation itself
is fast, so the compile time is no longer negligible!
Stefan
Agreed.
Just thought I'd point out that standard scheme doesn't really have
user defined types, nor object encapsulation, like python does.
You have to emulate them using marked structures (or more preposterously
named "manifest types") and differentiate them using some sort
of a control structure like a look up table or a case statement.
This makes scheme very unsuitable for interfacing with multiple API's
IMHO, since such programming involves the manipulation of tons of
different types of objects with different representations. Maybe
they'll extend scheme with some of the conveniences of Python,
but I doubt they'll be happy about it since it will really screw
up their beautiful denotational semantics (though that won't bother
anyone else).
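Roughly what that tagged-structure emulation looks like, transliterated
into Python for concreteness (in Scheme the record would be a tagged
vector and the dispatch a cond or case expression):

# A "manifest type": a tag plus fields, with dispatch done by hand.
def make_point(x, y):
    return ("point", x, y)

def make_circle(center, r):
    return ("circle", center, r)

def area(obj):
    tag = obj[0]                       # look at the tag and branch
    if tag == "circle":
        return 3.14159 * obj[2] ** 2
    elif tag == "point":
        return 0.0
    raise TypeError("unknown tag: %r" % (tag,))

print(area(make_circle(make_point(0, 0), 2)))   # -> 12.56636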
Aaron Watters
Department of Computer and Information Sciences
New Jersey Institute of Technology
University Heights
Newark, NJ 07102
phone (201)596-2666
fax (201)596-5777
home phone (908)545-3367
email: aa...@vienna.njit.edu
PS: check out http://www.cwi.nl/~guido/Python.html
I don't think anyone is worried about such bytecode interpreters. They're
really just an implementation detail... it's way too soon to start worrying
about whether gescheme is bytecompiled, threaded, or interpreted by a
listwalker.
Either all compilers have to be built into the binary, or they have
to be compiled into the bytecode supported by the virtual machine.
While a bytecode GCC doesn't seem feasible, a bytecode TCL interpreter
doesn't seem to present any hideous problems.
Largish compilers can compile down to bytecode as an alternative to
the native instruction set of the machine. Smallish compilers, and
most interpreters can be implemented on top of the virtual machine
directly. (People don't tend to bootstrap interpreters. They seem
to mostly be written in languages other than themselves.)
--
-F. Sullivan Segal
_______________________________________________________________
_
/V\ E-Credibility: (n -- ME) The unguaranteed likelyhood that
' the electronic mail you are reading is genuine rather than
someone's made up crap.
_______________________________________________________________
GCS d-- p--(---) @c++ u e-(*) m+(-) s/+ @n++ h--- f+ g+(--)
w+(+++) t++(-)@ b5++ yij++ r(dm)+ y+(*)
Mail to: flet...@netcom.com