I'm passing this by way of comp.software-eng,
in hopes that one of us will re-open the question
of the desirability of referential transparency.
Please adjust follow-ups appropriately.
--
Cameron Laird
cla...@Neosoft.com (claird%Neoso...@uunet.uu.net) +1 713 267 7966
cla...@litwin.com (claird%litwi...@uunet.uu.net) +1 713 996 8546
Like TCL/Tk, WINTERP is free, portable, extensible, embeddable,
etc. WINTERP uses Motif which, unlike Tk, is an industrial-strength,
supported, **STANDARD** UI toolkit. Unfortunately, Motif is not free;
however it is bundled with all the major Unix players' software developer
kits, including the Solaris 2.3 SDK. IMHO, if you are developing products or
applications for use on major-vendor workstations, you should carefully
consider the costs of future incompatibility with COSE/CDE standards when
choosing Tk simply because it is "free".
WINTERP is available via anonymous ftp from ftp.x.org:/contrib/devel_tools,
file winterp-2.XX.tar.gz (XX==02 currently).
You may also retrieve WINTERP software and information via the
World Wide Web -- http://www.eit.com/software/winterp/winterp.html
For more info, read on ...
------- Forwarded Message
WINTERP -- The OSF/Motif *W*idget *INTERP*reter
Niels Mayer
Enterprise Integration Technologies
800 El Camino Real, Fourth Floor
Menlo Park, CA 94025
e-mail: ma...@eit.com
URL: http://www.eit.com/people/mayer.html
WINTERP is the OSF/Motif *W*idget *INTERP*reter, an application development
environment enabling rapid prototyping of graphical user-interfaces (GUI)
through the interactive programmatic manipulation of user interface objects
and their attached actions. WINTERP is also an excellent platform for
delivering extensible or customizable applications. By embedding a small,
efficient Lisp interpreter with UI primitives within the delivered
application, users and system integrators can tailor the static and dynamic
layout of the UI, UI/application dialogue, and application functionality.
WINTERP is a good tool for learning and experimenting with the capabilities
of the OSF/Motif UI toolkit, allowing UI designers to more easily play
"what if" games with different interface styles. WINTERP's implementation
provides a compromise between the prototyping and extensibility advantages
of Lisp environments, and the inefficiency and expense of delivering Unix
applications under environments such as Common Lisp. Typically, prototyping
and customization are done entirely in interpreted Lisp; for delivery,
efficiency-critical low-level code may be written in C and is easily
exported to the interpreter as a new primitive.
WINTERP was first made publicly available on the X11r4 "contrib"
distribution and new releases have appeared on the X11r5 and X11r6
distributions. The recent X11r6 release of WINTERP 2.0 significantly
improves on previous releases by providing a variety of developer tools and
libraries for increased productivity. Improved functionality is delivered
via object-oriented graphics and 2.5D animation, asynchronous subprocesses,
the XmGraph widget (for creating directed acyclic graphs, trees, and
direct-manipulation displays), the Table widget (GUI layout using
tbl(1)-style specifications), GIF support, etc.
WINTERP's interpreter is based on David Betz, Tom Almy, Luke Tierney et
al's XLISP-PLUS. The interpreter's Smalltalk-inspired object system enables
a truly object oriented interface to the X11 toolkit Intrinsics (Xt) and
the OSF/Motif widget set. WINTERP's use of a real programming language for
customization/prototyping allows WINTERP-based applications to be much more
flexible than applications using lower-level and less-general languages
provided by the X resource database, Brunecky&Smythe's Widget Creation
Library (WCL), OSF/Motif's UIL (user interface language), and Ousterhout's
TCL/Tk. Furthermore, the use of object-orientation at a fundamental level
within the application UI code allows WINTERP-based applications to scale
more effectively than applications written in those other languages.
WINTERP 2.0 features an object-oriented 2.5D graphics and animation
"widget" based on John Stasko's Xtango path transition paradigm. Both for
static and dynamic graphics, this high-level interface simplifies and
abstracts away much of the low-level drudgery required to create 2.5D
graphics interfaces -- smooth, flicker-free display updates occur as
complex nonrectangular graphical objects move around and obscure and
uncover each other. Animation composition operations allow multiple
individual shapes to all move "simultaneously" through sequences of
animation frames. The graphics are pixel-independent and easily resizeable,
scalable and zoomable. Each primitive graphics image class supports its own
set of class specific animation and movement methods, while some operations
(e.g. movement, fill, etc) are polymorphic. The following primitive objects
are supported:
* Line (w/ color, forward-arrow, backward-arrow, bidirectional-arrow,
thickness, and style options);
* Rectangle (w/ color, fill options);
* Circle (w/ color, fill options);
* Ellipse (w/ color, fill options);
* Polygon (w/ color, fill options);
* Polyline (w/ color, forward-arrow, backward-arrow, bidirectional-
arrow, line style, line-thickness options);
* Spline (w/ color, line-style and line-thickness options)
* Text (w/ font, color, and centering options)
* Bitmaps and Bitmap movies
* GIF images.
The primitive graphics classes may also be contained in a composite image
class, which provides a grouping and layering principle for new classes
presenting multiple images. Composite images allow the construction of
independent layers of animation objects which may be operated on in groups.
WINTERP's graphics capabilities enable simple game-style animation,
employing multiple layers of arbitrarily shaped objects. Furthermore,
application-specific interactive-graphics capabilities may be encapsulated
into a new Widget-Class. This significantly simplifies the creation and
integration of new graphics widgets into the system -- these special
widgets look like normal Motif widgets to the rest of the system.
To enable GUI-applications based on existing Unix facilities, WINTERP
provides primitives for collecting data from Unix processes, and facilities
for interacting with other Unix processes. These facilities make it
possible to glue together existing Unix functionality into a GUI based
application with a relatively small amount of WINTERP-Lisp "glue". WINTERP
2.0 features the ability to run multiple interactive, asynchronous Unix
subprocesses without blocking GUI interactivity. This feature is useful for
creating GUI interfaces to existing terminal-based programs, and can also
be used for connecting to interactive network services and databases.
An environment similar to WINTERP's already exists in the Gnu-Emacs text
editor -- WINTERP was strongly influenced by Gnu-Emacs's successful
design. In Gnu-Emacs, a mini-Lisp interpreter is used to extend the editor
to provide text-browser style interfaces to a number of Unix applications
(e.g. e-mail user agents, directory browsers, debuggers, etc.). Whereas
Emacs-Lisp enables the creation of new applications by tying together
C-implemented primitives operating on text-buffer UI objects, WINTERP-Lisp
ties together operations on graphical UI objects implemented by the Motif
widgets. Both achieve a high degree of customizability that is common for
systems implemented in Lisp, while still attaining the speed of execution
and (relatively) small size associated with C-implemented applications.
WINTERP features:
*** Free with non-restrictive copyright -- available via anonymous
ftp from ftp.x.org, directory contrib/devel_tools, file
winterp-2.XX.tar.gz.
*** Portable -- entirely implemented via machine independent C
source and X11/Xt/Motif libraries.
*** OSF/Motif widgets are real XLISP objects; widgets can be specialized
via subclassing, methods added or altered, etc.
*** Automatic storage management (via garbage collection) of Motif/Xt/X
data, animation and graphics data, and application resources.
*** Contains facilities for simple "direct manipulation" of UI
components.
*** Interface to Gnu Emacs's lisp-mode allows code to be developed and
tested without leaving the editor.
*** Interactive programming also available in the "WINTERP Control Panel",
with editing taking place in a Motif text widget controlled by
WINTERP.
*** Built-in RPC mechanism for inter-application communications,
implemented via serverized, event-driven Lisp interpreter.
*** XmGraph widget for creating directed acyclic graphs, trees, and
direct-manipulation displays.
*** Table widget allows constraint-based GUI static layout
using tbl(1)-style specifications.
--------------------
An old paper on WINTERP version 1.X may be obtained via
World-Wide-Web/Mosaic:
WINTERP paper from Motif '91, First Annual Intl. Motif Users
Meeting (postscript) (226756 bytes):
ftp://www.eit.com/pub/winterp/doc/papers/winterp.PS
Screen Dump, page 3 (postscript) (157936 bytes):
ftp://www.eit.com/pub/winterp/doc/papers/page3.PS
Hybrid Application Architecture Diagram, page 3 (postscript)
(16505 bytes):
ftp://www.eit.com/pub/winterp/doc/papers/arch.PS
Diagram of RPC Architecture, page 10 (postscript) (16686 bytes):
ftp://www.eit.com/pub/winterp/doc/papers/RPC-Arch.PS
Screen Dump, page 25 (postscript) (145444 bytes):
ftp://www.eit.com/pub/winterp/doc/papers/page25.PS
Screen Dump, page 26 (postscript) (135663 bytes):
ftp://www.eit.com/pub/winterp/doc/papers/page26.PS
--------------------
Further information on WINTERP may be obtained via World-Wide-Web/Mosaic:
The WINTERP Home Page:
http://www.eit.com/software/winterp/winterp.html
WINTERP 2.0 documentation (plain-text) (652268 bytes):
ftp://www.eit.com/pub/winterp/doc/winterp.doc
XLISP-PLUS documentation (plain-text) (211733 bytes):
ftp://www.eit.com/pub/winterp/doc/xlisp.doc
Xtango Path Transition Animation (postscript) (588746 bytes):
ftp://www.eit.com/pub/winterp/doc/xtangodoc.ps
EIT's WINTERP-based World-Wide-Web Multimedia Authoring Environment:
http://www.eit.com/papers/gpware94/paper.html
--------------------
References on WINTERP, and its components:
David Michael Betz. "XLISP: An Object-oriented Lisp (version 2.1)"
Unpublished documentation accompanying the public release of Xlisp
software. David Michael Betz, P.O. Box 144, Peterborough, NH 03458,
April, 1989.
Olaf Heimburger. "Elche Im Winter -- Interaktive X-Applicationbuilder
unter Lisp -- Elk und WINTERP." iX, July 1991, pp 64-68.
Niels P. Mayer, Allan W. Shepherd and Allan J. Kuchinsky. "Winterp:
An object-oriented, rapid prototyping, development and delivery
environment for building extensible applications with the OSF/Motif
UI Toolkit." In Proceedings Xhibition '90, X Window System and Open
Systems Technical Conference, San Jose, CA, May 1990, pp 49-64.
Niels P. Mayer, Allan W. Shepherd and Allan J. Kuchinsky. The
WINTERP Widget INTERPreter -- An Application Prototyping and
Extension Environment for OSF/Motif. In Proceedings X Into The Future,
The European X Users Group Autumn Conference 1990, Surrey, UK,
September 1990, pp. 33-55.
Niels P. Mayer. The WINTERP Widget INTERPreter -- A Lisp Prototyping
and Extension Environment for OSF/Motif-based Applications and
User-Interfaces. Lisp Pointers, ACM SIGPLAN, Volume IV, Number 1,
pp 45-60.
Niels P. Mayer. The WINTERP OSF/Motif Widget INTERPreter -- A
graphical user-interface language for rapid prototyping and
delivering extensible applications. In Proceedings Motif '91,
First Annual International Motif Users Meeting, Washington DC,
December 1991, pp. 248-269.
John T. Stasko. The Path-Transition Paradigm: A Practical Methodology
for Adding Animation to Program Interfaces. Journal of Visual
Languages and Computing. (date, volume, page, publisher info unknown).
Don Libes. Expect: Curing Those Uncontrollable Fits of Interaction,
Proceedings of the Summer 1990 USENIX Conference, Anaheim, CA, June
11-15, 1990.
Papers describing applications written using WINTERP:
Allan Shepherd, Niels Mayer, and Allan Kuchinsky. STRUDEL: An
Extensible Electronic Conversation Toolkit. In David Marca and
Geoffrey Bock, editors, GROUPWARE: Software for Computer-Supported
Cooperative Work, IEEE Computer Society Press, 1992, pp. 505-518.
(originally, in proceedings Conference on Computer-Supported
Cooperative Work, Los Angeles, October 1990, pp. 93-104.)
Jay Glicksman, Glenn Kramer, and Niels Mayer. "Internet Publishing via
the World Wide Web". In proceedings Groupware '94, August 1994. San
Jose, CA.
=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
= Niels Mayer ..... ma...@eit.com .... http://www.eit.com/people/mayer.html =
= Multimedia Engineering Collaboration Environment (MM authoring for WWW) =
= Enterprise Integration Technologies, 459 Hamilton Ave, Palo Alto CA 94301 =
=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
------- End of Forwarded Message
That's a strange slant on the truth. I've seen several instances of
tcl/TK using Motif. I've a copy on cdrom -- and that's merely a copy
of an ftp archive, try "ftp://sunsite.unc.edu/pub/linux/devel/".
Raul D. Miller n =: p*q NB. prime p, q, e
<rock...@nova.umd.edu> NB. public e, n, y
y =: n&|&(*&x)^:e 1
x -: n&|&(*&y)^:d 1 NB. 1 < (d*e) +.&<: (p,q)
not to mention that while Motif is a self-declared "standard", it is
far less usable and extensible (because of its Xt heritage) than tk,
which is based on simpler, more expressive notions.
> Another direction we could take in looking for a "real language" that
> permits prototyping of GUIfied applications is Concurrent Clean, a
> pure, lazy functional language with awareness of concurrency,
> distribution, and polymorphism, which, along with such now-conventional
> typings as abstract, algebraic, and synonyms, introduces "polymorphic
> uniqueness types". In any case, Sun, Mac, and OS/2 versions are
> available at ftp://ftp.cs.kun.nl/pub/Clean/...
> All I've done so far is look at documentation, so I can't testify as
> to its industrial viability, but it sure has me interested.
> I'm passing this by way of comp.software-eng,
> in hopes that one of us will re-open the question
> of the desirability of referential transparency.
Could someone please elaborate on the usual answers to this question?
The usual mention of "nice mathematical properties" just doesn't cut
it for me. Proving the correctness of programs using techniques that
cannot be applied to "referentially opaque" languages like C/C++ is a
noted feature, but is that all pure functional languages have to
offer? What are the benefits to software engineering? Where exactly
are the inherent costs of inefficiency lurking? Or are there any at
all?
Is this just a pretty way to program, good for teachers and their
students, or does it also have a place in the world which cares about
performance?
rodrigo vanegas
r...@cs.brown.edu
>Could someone please elaborate on the usual answers to this question?
>
>The usual mention of "nice mathematical properties" just doesn't cut
>it for me. Proving the correctness of programs using techniques that
>cannot be applied to "referentially opaque" languages like C/C++ is a
>noted feature, but is that all pure functional languages have to
>offer? What are the benefits to software engineering? Where exactly
>are the inherent costs of inefficiency lurking? Or are there any at
>all?
>
>Is this just a pretty way to program, good for teachers and their
>students, or does it also have a place in the world which cares about
>performance?
>
My own work is more concerned with Logic rather than Functional
programming, but the Logic language with which I am working is
referentially transparent, so hopefully most of the same arguments
hold.
In a referentially transparent language it is possible to perform
various source-to-source transformations which can dramatically
improve the performance of programs (we're talking big-O improvements).
For example, the propagation of constraints in programs can be used to
transform the N-queens problem from O(N!) to O(N^2), or something like
that anyway.
In general, source level transformation can move consumers closer
to the producers so that "failure" branches of a search space can
be cut off as soon as possible. While this certainly holds for
logic programs, I can see no reason that this same principle should
not apply to functional programs.
It is quite true that some source level transformations can be applied
to non-referentially-transparent languages, but it is much more difficult
to detect transformation opportunities.
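For concreteness, here is a tiny sketch in Tcl (a referentially opaque
language; the procedure is hypothetical) of why such rewrites are unsafe
without purity information -- a transformer may not rewrite [f]+[f] as
2*[f] unless it can prove f has no side effects:

    set n 0
    proc f {} {
        global n
        incr n               ;# side effect: advances hidden state
        return $n
    }
    puts [expr {[f] + [f]}]  ;# prints 3 (1 + 2)
    set n 0
    puts [expr {2 * [f]}]    ;# prints 2 -- the "equivalent" form differs

In an R.T. language the two forms are interchangeable by definition, which
is what makes such transformations cheap to detect and safe to apply.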
From a S.E. point of view, perhaps the most important feature of referential
transparency is the effect it has on the maintainability of programs.
In any referentially transparent (R.T.) language, you can always be
certain that the result of a function is dependent on its _visible_
inputs (currying aside). I am currently developing a compiler (written
in an R.T. language). At the moment it is about 35,000 lines of code
(and growing). There are parts which I haven't touched for 6 months
which I need to go and alter, and the semantics of the code are still
clear, even though I'd completely forgotten the details.
It is true that a well-commented C/C++ program can retain its clarity,
but I have yet to see one that is sufficiently well commented.
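A minimal Tcl sketch of the visibility point (all names hypothetical);
the second procedure can be understood from its call site alone, the
first cannot:

    # Result depends on hidden state: reading a call to this tells
    # a maintainer nothing about what it computes.
    proc scaled {x} {
        global scale_factor          ;# invisible input
        expr {$x * $scale_factor}
    }

    # Result depends only on visible inputs.
    proc scaled2 {x factor} {
        expr {$x * $factor}
    }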
Just my 2c...
Thomas
--
Thomas Conway con...@cs.mu.oz.au
Decidability is like a balloon: one little hole and it vanishes.
While it is true that today, Tcl/Tk does not use Motif, on the other hand
COSE/CDE's toolkit is, in my understanding, going to be a modified derivative
of Motif - so Motif-based applications will not be COSE/CDE compliant
automatically anyway. Also note that the father of Tcl/Tk is now employed
by Sun, one of the proponents of COSE/CDE. I would expect to see Tcl/Tk
COSE/CDE compliant sooner than OSF/Motif...
--
:s Great net resources sought...
:s Larry W. Virden INET: lvi...@cas.org
:s Personal: 674 Falls Place, Reynoldsburg, OH 43068-1614
The task of an educator should be to irrigate the desert not clear the forest.
: >Could someone please elaborate on the usual answers to this question?
: >
: >The usual mention of "nice mathematical properties" just doesn't cut
: >it for me. Proving the correctness of programs using techniques that
: >cannot be applied to "referentially opaque" languages like C/C++ is a
: >noted feature, but is that all pure functional languages have to
: >offer? What are the benefits to software engineering? Where exactly
: >are the inherent costs of inefficiency lurking? Or are there any at
: >all?
: >
: >Is this just a pretty way to program, good for teachers and their
: >students, or does it also have a place in the world which cares about
: >performance?
As is often the case, R.T. is frequently a very useful and desirable
property which has been promoted as an absolute good. I will discuss
data modeling with the concept of a reference, and then implementation,
where I will distinguish pointers and object identifiers (OIDs).
The time when complete R.T. is *not* appropriate in data modeling is
when an object has identity and a changing state. There are many real
world examples, e.g., my cup of coffee cooling is modeled in physics
as a function from time and space to temperature. A computer model of
this should use a reference to distinguish my cup from the person's
who dropped by to chat. However, properties of my cup (e.g., the heat
transfer characteristics of the material it is made of) are actually
attributes of the cup and should not be modeled with references to a
common set of properties since changing the properties in fact changes
the model (assuming that the material my coffee cup is made of is not
being replaced with something else during this period---an assumption
which would be much less true of me.) Finally, note that in a
computer model the identity of an object usually should be relative to
some context, e.g., a file to a directory, the directory to a disk of
a host, the host to the internet site, etc.
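For concreteness, a rough sketch of the distinction in Tcl (all names
hypothetical): each cup keeps its own changing state, while the constant
material properties are shared through a reference to a common table:

    # Per-cup state: identity matters, and the values change over time.
    set temp(cup1) 80.0
    set temp(cup2) 60.0

    # Shared, constant properties: many cups may safely reference one
    # table, provided the managing code never mutates it.
    array set ceramic {conductivity 1.5 heat_capacity 0.8}
    set material(cup1) ceramic
    set material(cup2) ceramic

    # Follow the reference to look up a property:
    upvar 0 $material(cup1) props
    puts $props(conductivity)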
Implementation raises a different set of questions. Referentially
transparent structures have a price which may not show up in the O()
analysis...at least not until you figure in the cost of garbage collection.
It may be highly
desirable if I have 10^4 coffee cups to share the material property
information. Using a reference allows me to do this at the risk of
corrupting the correctness of my program. Ideally, I would like to
code the problem originally with R.T. material properties and then
introduce the optimization. In the process, the code which uses
the material properties should still assume R.T., and the code
which manages the material properties should be able to guarantee
that they are constant (something C and C++ do not provide).
Further, I would like to escape from 'pointers' altogether and have
simply the concept of a reference relative to some container object,
so that saving my coffee cup model to disk (requiring offsets or
OIDs---not pointers) would have minimal impact on my code.
In summary, IMHO, current "referentially opaque" languages have the
following failings:
o Usually, one cannot get R.T. when one wants it, i.e., the
programmer *must* use pointers even when pointers violate
the data model---this is very bad---R.T. is correct much
(even most) of the time, and not having it available can be
infuriating.
o The process address space is not the only valid base for
references (as people who work with distributed and persistent
programs are aware), and the languages fail to allow code to be
easily migrated to these other situations.
o The ability to transform an R.T. program into a non-R.T.
program when performance (often a factor of 2 or 3 even
without considering the GC) is a concern is missing---granted,
this is a harder problem; I tend to think `linear types' are
the best approach, but...
(Note: `*must* use pointers' is a bit strong...the issue is dynamic
memory allocation where C++ returns pointers to newly created
objects).
Having gone on about 3 paragraphs longer than I meant I suppose this
is my 3c...
--
George Beshers
bes...@hks.com
Hibbitt, Karlsson, & Sorensen, inc.
1080 Main St. Pawtucket, RI 02860
Tel (401) 727-4200; Fax: (401) 727-4208
> In article <30e9cq$1...@Starbase.NeoSoft.COM>, cla...@Starbase.NeoSoft.COM (Cameron Laird) writes:
>>...
>> I'm passing this by way of comp.software-eng,
>> in hopes that one of us will re-open the question
>> of the desirability of referential transparency.
> Could someone please elaborate on the usual answers to this question?
>
> The usual mention of "nice mathematical properties" just doesn't cut
> it for me. Proving the correctness of programs using techniques that
> cannot be applied to "referentially opaque" languages like C/C++ is a
> noted feature, but is that all pure functional languages have to
> offer? What are the benefits to software engineering? Where exactly
> are the inherent costs of inefficiency lurking? Or are there any at
> all?
>
> Is this just a pretty way to program, good for teachers and their
> students, or does it also have a place in the world which cares about
> performance?
Referential transparency, or lack of it, is an important consideration
in the definitions of large program configurations. I refer to those
structures which are evaluated (or otherwise processed) at *build* time,
e.g. references from source files to header files, to libraries, etc.
Use of 'make' would be greatly simplified if its actions were
i) referentially transparent
ii) understood by 'make' rather than just blindly executed.
Minimal recompilation could be more effectively and more soundly achieved
if dependencies were explicit.
Also, clean well-behaved code is more amenable to reuse than dirty code,
due to the lower complexity (and greater traceability) of its interaction
with its environment. Of all the code I have tried to reuse, pure Prolog
procedures come out tops: they are stateless, re-entrant, referentially
transparent, all dependencies are of the same type (references to other
procedures). They can be bunged together without fear of unwanted
interaction; they do not need any initialisation; their use of dynamically
allocated memory is watertight (no dangling pointers or leaks), and
(because of Prolog's type system) they tend to be very type-generic (e.g.
they manipulate lists, sets, maps, bags, graphs of *anything*).
----
__ __ Paul Singleton (Dr) email: pa...@cs.keele.ac.uk
|__) (__ Computer Science Dept. tel: +44 (0)782 583477
| . __). Keele University, Newcastle, fax: +44 (0)782 713082
Staffs ST5 5BG, ENGLAND road: M6 J15 follow signs
Above, I said "unlike Tk" not "unlike TCL". Yes, there is a TCL interface
to Motif (ftp.x.org:contrib/devel_tools/tclMotif.1.2.tar.gz); however, it is
Xt/Motif based, not Tk based.
Another possible point of confusion is that TCL/Tk is often advertised as
having a "motif look and feel". This is often a sore point for me because
being Motif compatible is more than just having 3D drop shadows on your
pushbuttons and frames. There are a number of features in Motif going on
"under the hood" which, if you get rid of them, allow you to come up with a
significantly simplified toolkit such as Tk. However, these features are
precisely those which have allowed Motif to become widely adopted in the
international community of professional Unix software users. Here are a few
areas where Tk is not Motif compliant:
(1) Internationalization, both in terms of strings, and also
layout (left to right, right to left, etc).
(2) Keyboard traversal, a consistent model for keyboard
accelerators (<escape>/<return> bindings in dialog boxes,
default buttons, etc), a consistent model for mnemonic
accelerators.
(3) Consistent resource set for system wide application customization.
(4) Resolution Independence facilities.
(5) Motif offers (IMHO) better constraint-based layout capabilities,
which means that you can, with a small amount of effort, lay out
screens that do reasonable things when stretched or shrunk by
the window manager (either that, or the TCL/Tk apps that I've
used were programmed by people who didn't care about layout
dynamics). Yes, I know about the Tk "packer" -- it doesn't do it
for me.
(6) Xt's selection model, and more importantly drag-and-drop.
(Not that I think the programmer's API to these is particularly
easy to use -- though Motif 2.0 has significantly simplified
and unified Xt selections with drag-and-drop protocols.)
I personally find #3 most bothersome. I occasionally use Tk-based
applications on an otherwise all-Motif desktop (e.g. SGI Irix/IndigoMagic
and HPUX VUE). Since most of the applications provided by those
manufacturers are Motif based, and all the desktop tools are Motif based, I
can very easily customize the overall appearance/functionality of all the
Motif applications that I ever bring up by setting some global X
resources. Perhaps by now current versions of Tcl/Tk even use X resources;
however, even if they do, they will not have the same bindings as Motif.
So let's say I want all my scrollbars on the left-hand side of their
controlling areas; and let's say I want a certain set of keyboard bindings
on all my text editing widgets; and let's say I want certain kinds of
widgets to display in certain sets of colors. I can achieve this effect
with Motif by setting the following resources:
! all XmText widgets get the following keybindings...
*XmText.translations: #override ....
! all scrollable XmText widgets have their vertical scrollbar on the left
*XmText.scrollLeftSide: true
*XmText.scrollBottomSide: false
! all other scrolled windows have a scrollbar on left and bottom side
*XmScrolledWindow.scrollBarPlacement: bottom_left
! all toggle buttons should indicate they are selected with color
*XmToggleButton*fillOnSelect: true
*XmToggleButton*selectColor: red
*XmToggleButtonGadget*fillOnSelect: true
*XmToggleButtonGadget*selectColor: red
Meanwhile, a TCL/Tk application will happily ignore all these global
settings and behave/look inconsistently with the rest of my desktop
applications. So in addition to driving me batty because they don't behave
anything like a Motif application (in terms of keyboard accelerators and
traversal), these applications also look and work slightly different from
the customized Motif apps that I use everywhere else.
While I'm sure that I can customize a TCL/Tk application to do things like
that, why should I have to, especially given the usual TCL/Tk claims of
"working like Motif in a fraction of the size and complexity"?
Like most Macintosh or MS-Windows users, I too *LIKE* and *WANT*
consistency across the board in all the applications that I use. For the
most part, this works fine as long as I'm using Motif apps. But every time
I get around a TCL/Tk application, it beckons "I'm just like Motif" with
its attempted emulation of Motif's looks. But then you try to do something
that you'd do with any other Motif application and you find out real soon
what's under the hood. At least with an Andrew or Athena Toolkit
application (Xaw), they look different enough from Motif that you'd never
think to use a Motif keyboard acceleration trick...
--------------------
From: yu...@shimari.cmf.nrl.navy.mil (Yuri Trifanov)
> not to mention that while Motif is a self-declared "standard", it is
> far less usable and extensible (because of its Xt heritage) than tk,
> which is based on simpler, more expressive notions.
Motif is not a self-declared standard. A de facto standard, perhaps, given
its acceptance by all the major Unix players (HP, Sun, IBM, DEC, SGI, SCO
and many others). A "real" standard given IEEE P1295, which was derived from
Motif's Application Environment Specification and style. Beyond that, there
is the X/Open Common Desktop Environment (CDE) specification, which is
founded on Motif.
In terms of extensibility I would claim that WINTERP+Motif+Xtango is just
as extensible as Tcl+Tk. We need to be comparing apples and apples, not
apples and oranges -- Motif+C by itself doesn't have any notion of a
built-in higher-level extension language, and this makes constructing Motif
applications very tedious.
Given a fairer comparison between WINTERP 2.0 and TCL/Tk (not TCL/Tk and
Motif/C), however, I contend that WINTERP is more expressive than Tcl/Tk,
and, more importantly, it scales better due to its usage of object
orientation at a fundamental level within the UI code. I personally find
Tcl/Tk UI code extremely hard to read/follow unless you are doing
a relatively trivial example program (e.g. "hello world" or a simple browser).
TCL's problems of language quoting inconsistency make it difficult to write
higher level abstractions for UI construction -- macros, iterators,
data-driven-UI-generation, etc. Not that higher level abstractions can't be
done -- it's just that creating them in TCL is linguistically painful
because the language uses the wrong fundamental data-type -- newline-
delimited STRINGs separated by spaces. In contrast, LISP chose the right
fundamental data-type -- LISTs of SYMBOLs -- and consistently applies this
datatype throughout the language.
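To illustrate the quoting-layer bookkeeping (widget names hypothetical),
consider building a Tk callback that should capture the current value of
a variable:

    set name "world"
    # Double quotes substitute now, but the command string is re-parsed
    # later; it breaks if $name ever contains spaces, brackets or braces.
    button .b1 -text Greet -command "puts {hello $name}"
    # The robust idiom builds the callback as a list -- one more quoting
    # layer the programmer must keep track of.
    button .b2 -text Greet -command [list puts "hello $name"]
    pack .b1 .b2

Both buttons behave the same on this friendly input; the difference only
bites once the data stops being friendly, which is exactly when
abstractions built by string-pasting fall apart.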
Yes, TCL is a simple language -- I contend that it is too simple and too
simplistic for constructing well-engineered applications:
* type-overloading of strings, and lack of any real type system.
* lack of proper automatic memory and resource management (freeing
up data when a widget is destroyed doesn't quite cut it).
* lack of lexical scopes for functions/procedures.
* lack of closures (forcing programmers to use too many global
variables -- see the sketch below).
* Lack of object orientation in the standard TCL/Tk release.
The fact that people *can* program applications in the bourne shell, or
tcsh, or the korn shell does not imply that a language having the above
limitations is a good or expressive language for building applications.
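Regarding the closures point above, a minimal sketch of the usual
workaround (all names hypothetical): per-widget callback state lands in
a global array keyed by widget name, since a procedure cannot capture
its creation environment.

    proc make_counter {w} {
        global count
        set count($w) 0
        button $w -text 0 -command [list bump $w]
        pack $w
    }
    proc bump {w} {
        global count
        $w configure -text [incr count($w)]
    }
    make_counter .c1
    make_counter .c2     ;# each button counts independently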
--------------------
From: lw...@chemabs.uucp (Larry W. Virden)
> While it is true that today, Tcl/Tk does not use Motif, on the other hand
> COSE/CDE's toolkit is, in my understanding, going to be a modified derivative
> of Motif - so Motif-based applications will not be COSE/CDE compliant
> automatically anyway.
The differences between COSE/CDE and Motif are based on the fact (as far as
I can tell) that there are two different release streams in
progress. COSE/CDE had to be based on something that was real and stable --
Motif 1.2. Meanwhile OSF was off doing Motif 2.0, so naturally there will
be differences. But since very few people have ported or moved to Motif 2.0
yet (its first release was quite recent), people who have existing Motif
1.2-based apps will be able to run them on COSE/CDE without porting.
Anybody who cared/cares about Motif portability in the first place will
have based their usage of the toolkit on the "OSF/AES API" for Motif 1.2 and
will find their apps both CDE and non-CDE compliant as a result of sticking
to that API.
In either case, I think your argument is questionable since it will take
far less work to iron out any differences between even a non-compliant
usage of the OSF/AES API than it will to make a Tcl/Tk-based application
interoperate with COSE/CDE.
For more info on COSE/CDE versus Motif, I've included some information from
a FAQ on "OSF/MOTIF AND THE COMMON DESKTOP ENVIRONMENT" from
http://www.osf.org:8001/motif/MotifFAQ.html
> Also note that the father of Tcl/Tk now is employed by Sun, one
> of the proponents of COSE/CDE. I would expect to see Tcl/Tk COSE/CDE compliant
> sooner than OSF/Motif...
I'm talking about the present. There's no point in talking about or
speculating about TCL/Tk's future until real working code lands on some ftp
site. Meanwhile COSE/CDE is available, and there are groups at Sun
using/developing real working COSE/CDE desktops and desktop-compliant tools
(in Motif) for release on an upcoming version of Solaris.
Beyond that, Sun has also adopted NeXT-STEP, but I don't think that
has much to do with COSE/CDE. If I put on my conspiracy theorist hat,
I'd say that Sun's promotion of TCL/Tk is their last-gasp attempt at
getting in one last punch against the OSF/Motif crowd that forced them to
drop OpenLook. (Too bad Sun couldn't support a free publicly-available-source
"standard" back when the Unix windowing community really needed it -- by
releasing a free, unencumbered version of NeWS.)
--------------------
(from http://www.osf.org:8001/motif/MotifFAQ.html)
OSF/MOTIF AND THE COMMON DESKTOP ENVIRONMENT
*Q:* What is CDE, the Common Desktop Environment?
*A:* CDE is a specification for a desktop environment to be published by
X/Open, which a number of major system and software vendors have publicly
committed to implement.
*Q:* Is Motif included in CDE?
*A:* Yes. The specification for the Motif portion of X/Open's CDE is
taken directly from OSF's documentation of OSF/Motif 1.2. The draft
specification recently completed review in X/Open's FastTrack process, and
will be published by X/Open shortly.
*Q:* Are the CDE implementors' extensions to Motif incorporated into OSF/Motif
2.0?
*A:* Yes. The User Environment Special Interest Group of the OSF
Membership recommended that OSF explore the possibility of incorporating the
small set of CDE additions to the Motif 1.2 API and new MWM visuals into
OSF/Motif 2.0 release. OSF incorporated these into Motif 2.0 to help
eliminate any minor differences. (Note, however, that OSF is releasing Motif
2.0 prior to the release of CDE 1.0; we cannot guarantee that there will not
be late changes to CDE implementations that introduce incompatibilities.)
*Q:* How consistent is the CDE specification with the new IEEE 1295 standard
specification for the user interface?
*A:* The Motif portion of the CDE specification is a superset of the IEEE
1295 specification, and both documents were based directly on the OSF/Motif
documentation from OSF. The format and some of the vocabulary of the
documents may differ, but the intersecting content is the same.
*Q:* What is included in the Motif part of the X/Open CDE specification that
isn't included in IEEE 1295?
*A:* Specifications for the Motif window manager (mwm), UIL, gadgets, and
drag/drop are all included in the CDE specification. OSF will also submit the
OSF/Motif style documentation to X/Open for inclusion. (IEEE 1295 also
includes minimal style specifications.)
*Q:* Will OSF/Motif 2.0 comply with the Motif portion of the X/Open CDE API
and IEEE 1295 specifications?
*A:* Yes.
*Q:* Are existing implementations of OSF/Motif 1.2 expected to comply with the
Motif portion of the new X/Open CDE specification and the IEEE 1295
specification?
*A:* Yes, OSF certified implementations of Motif 1.2 are expected to
comply with both the Motif portion of the X/Open CDE and the IEEE P1295 specs.
*Q:* Does this mean I can expect to run an existing application developed with
Motif 1.2 in a CDE environment?
*A:* Yes, as long as the vendor of your CDE environment has followed the
X/Open specification.
*Q:* I heard that the vendors' CDE implementations will already have some of
the features OSF/Motif 2.0 will have, plus a couple of other extensions to the
user interface that the current OSF/Motif 1.2 release doesn't have yet. How
do I choose whose Motif to use?
*A:* A good way to ensure portability and interoperability of applications
in both CDE and non-CDE environments is to stick to the OSF/AES Rev D API
plus the CDE extensions that will be submitted to X/Open and subsequently
become part of the X/Open CDE specification. All parties are interested in
100% consistency across Motif-supporting platforms. The minor differences
that exist now between OSF/Motif 1.2 and the CDE 1.0 implementations of Motif
are due to release timing issues. None require application developers to
change or recompile their Motif 1.2 applications to run in CDE; however, there
will be some optional behaviors when using a CDE shared library which users
may turn on or off.
*Q:* What is OSF doing to prevent divergence between the CDE implementors'
versions of Motif and the original OSF/Motif, today and in the future?
*A:* OSF has been, and will continue to be, both a primary supplier of the
OSF/Motif technology and a vocal contributor to the X/Open specification in
the CDE working group. OSF has agreed to contribute any enhancements to the
OSF/Motif user interface specification to X/Open. Furthermore, under the new
OSF PST model, OSF is working with the CDE suppliers and constructing a
proposal for the next release which will include CDE and Motif. Motif will be
offered as part of CDE and separately. As for changes X/Open may make in the
future to CDE, it has always been OSF's position to adopt relevant de facto
standard specifications.
*Q:* Will OSF/Motif 2.0 be included in CDE implementations?
*A:* Due to release timing constraints, CDE 1.0 is based on OSF/Motif 1.2.
OSF has and will continue to proactively work with the known implementors of
CDE (all of whom are also OSF Members) to ensure that future CDE requirements
are met by future releases of OSF/Motif. The CDE implementors will decide
when to adopt OSF/Motif 2.0 once it is generally available. Motif 2.0
includes many enhancements which are currently not included in CDE, but which
developers may require (see OSF/Motif 2.0, above).
*Q:* Will I be able to use OSF/Motif 2.0 for my applications even if I'm
running in a CDE 1.0 environment that uses OSF/Motif 1.2?
*A:* Yes, everything is based on the R5 release of the X Window System, which
makes them highly compatible. We expect that your application will be able to
run with the OSF/Motif 2.0 toolkit within the CDE 1.0 environment, with or
without the optional CDE behavior.
*Q:* Who is going to validate implementations of Motif against the X/Open CDE
specification? And from whom do I license the Motif trademark?
*A:* For the time being, OSF will manage the certification of Motif
software implementations and license the Motif trademark. OSF is also
committed to supplying materials and technology X/Open needs to implement a
successful branding program of their own.
--------------------
> That's a strange slant on the truth. I've seen several instances of
> tcl/TK using Motif. I've a copy on cdrom -- and that's merely a copy
> of an ftp archive, try "ftp://sunsite.unc.edu/pub/linux/devel/".
Not to mention that Motif is an immense bloated hog forced on the industry
by the OSF during its brief reign of power when everyone was so scared of
AT&T/Sun that they'd accept *anything* if it wasn't "blue". Motif's user
interface has real problems (for example, you have to move the pointer into
an entry field to use it... that sort of finicky stuff is a waste of the
user's time), and to top it off you have to pay extra for the right to use
it! That's adding insult to injury.
Give me OLIT any day.
Ah well, at least Motif is better than the truly awful Athena widgets that
used to be the norm...
--
Peter da Silva `-_-'
Network Management Technology Incorporated 'U`
1601 Industrial Blvd. Sugar Land, TX 77478 USA
+1 713 274 5180 "Hast Du heute schon Deinen Wolf umarmt?"
|> TCL's problems of language quoting inconsistency make it difficult to write
|> higher level abstractions for UI construction -- macros, iterators,
|> data-driven-UI-generation, etc. Not that higher level abstractions can't be
|> done -- it's just that creating them in TCL is linguistically painful
|> because the language uses the wrong fundamental data-type -- newline-
|> delimited STRINGs separated by spaces. In contrast, LISP chose the right
|> fundamental data-type -- LISTs of SYMBOLs -- and consistently applies this
|> datatype throughout the language.
|>
This paragraph suggests that you don't understand Tcl. In fact the quoting
rules are very simple and very consistent; I claim that it's quite
straightforward to build higher-level constructs for UIs and I think there
are many many examples in the Tcl community that illustrate this. One
can make religious arguments ad nauseum, but I know of no technically
sound evidence that lists of symbols are fundamentally better than strings.
|> Yes, TCL is a simple language -- I contend that it is too simple and too
|> simplistic for constructing well engineered applications:
|> ...
Again, I think the large number of Tcl applications and the growing size of
the Tcl community speak for themselves. What better evidence could there
be that Tcl and Tk are practical for building interesting and useful
applications?
Since we're slinging arrows, allow me to sling mine. It's time to accept the
fact that Lisp will never be a mainstream language. There's no question that
Lisp is an intellectually interesting language, and it is very important
in certain communities, such as AI and formal language design. But it's
simply never going to be widely used. Lisp has been trying for more than
30 years; every few years we hear another round of grandiose claims, and
a few years later they have evaporated. We could argue about why Lisp
hasn't entered the mainstream, but none of this changes the basic fact
that it hasn't.
I believe that it is reasonable to give new ideas a chance and to accept
plausible arguments that they might succeed, but when a system has been
around for 30 years I don't think "plausible arguments" are enough anymore.
If someone is going to argue today why Lisp should be a mainstream
language, they must somehow explain why it hasn't succeeded in the past
and what will happen this time to make things turn out differently.
Of course, this is a double-edged sword. If Tcl loses its momentum and
stops gaining additional users, you're welcome to come back to me in a
few years and claim that Tcl has passed its prime too.
Agree.
>Give me OLIT any day.
Desperate, aren't we? HPWM (the HP window manager) was nice. The VUE crap
that HP puts out nowadays isn't worth a cent.
>Ah well, at least Motif is better than the truly awful Athena widgets that
>used to be the norm...
Maybe, but Athena is free; Motif is not. Plus, Motif is beginning to
sink under its own weight. What's needed is something that looks better
than Athena and doesn't sink. TK is it!
>--
>Peter da Silva `-_-'
>Network Management Technology Incorporated 'U`
>1601 Industrial Blvd. Sugar Land, TX 77478 USA
>+1 713 274 5180 "Hast Du heute schon Deinen Wolf umarmt?"
Regards,
__ __/ / / __ / | / Tuan T. Doan
/ / / / / / | / IEC Layer Testing and Advance Technology
/ / / __ / / | / 2201 Lakeside Blvd. P.O. Box 833871
__/ ______/ __/ __/ __/ __/ Richardson, TX 75083-3871
"It's a kind of magic" -Highlander Phone: 6-444-4575/214-684-4575
Internet: td...@bnr.ca Fax: 6-444-3716/214-684-3716
How about, lists have O(1) insertion cost, while strings have O(n) cost.
Real tcl applications end up having almost unavoidable superquadratic
behavior because of this.
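A minimal sketch of the effect (data hypothetical): each iteration of
the first loop copies the whole accumulated string, so n appends cost
O(n^2) in total, where consing onto a Lisp list is O(1) per element:

    set words {alpha beta gamma delta}
    # String building: every append re-copies the accumulated result.
    set s ""
    foreach w $words {
        set s "$s $w"
    }
    # The list idiom avoids the re-parse, though in the Tcl of this
    # era the value is still kept as one flat string underneath.
    set l {}
    foreach w $words {
        lappend l $w
    }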
Again, I think the large number of Tcl applications and the growing size of
the Tcl community speak for themselves. What better evidence could there
be that Tcl and Tk are practical for building interesting and useful
applications?
Speaking as a die-hard Tcl user, let me say that the main reason is your
superb engineering. I like Scheme a lot, and it might be better in
theory, but that means little when compared with the practical
superiority of your implementation. Similarly, despite the popularity
of perl, it remains to be seen if an analog of Tk will appear for it.
Insert obligatory plug for Richard Gabriel's paper "The good news, the
bad news, and how to win big", where he explains why wrong is right and
Lisp is unpopular.
Actually, I have found that people who are used to the structured
programming paradigm will not accept TCL syntax as quickly as those who
started programming in TCL before, say, a language like PASCAL. They don't
realize or pay attention to the simple rules of TCL.
>|> Yes, TCL is a simple language -- I contend that it is too simple and too
>|> simplistic for constructing well engineered applications:
>|> ...
>
>Again, I think the large number of Tcl applications and the growing size of
>the Tcl community speak for themselves. What better evidence could there
>be that Tcl and Tk are practical for building interesting and useful
>applications?
Agree.
>Since we're slinging arrows, allow me to sling mine. It's time to accept the
>fact that Lisp will never be a mainstream language. There's no question that
>Lisp is an intellectually interesting language, and it is very important
>in certain communities, such as AI and formal language design. But it's
>simply never going to be widely used. Lisp has been trying for more than
>30 years; every few years we hear another round of grandiose claims, and
>a few years later they have evaporated. We could argue about why Lisp
>hasn't entered the mainstream, but none of this changes the basic fact
>that it hasn't.
LISP may not be accepted, but some of its concepts are. TCL's list command
and string data structure are prime examples.
>I believe that it is reasonable to give new ideas a chance and to accept
>plausible arguments that they might succeed, but when a system has been
>around for 30 years I don't think "plausible arguments" are enough anymore.
>If someone is going to argue today why Lisp should be a mainstream
>language, they must somehow explain why it hasn't succeeded in the past
>and what will happen this time to make things turn out differently.
I've often wondered about this. If everyone were taught only a LISP-like
language, would it be in the mainstream? I think people often take the
lazy approach to doing things. Why formally prove an algorithm when
regression and testing are done? Why worry about grammar and syntax in
programming when the compiler can catch them?
>Of course, this is a double-edged sword. If Tcl loses its momentum and
>stops gaining additional users, you're welcome to come back to me in a
>few years and claim that Tcl has passed its prime too.
I think TCL/TK will continue to survive as long as there is a free license
to use it and there is valuable input from people worldwide. I feel that
TCL/TK is about halfway through its "evolution". There are areas that
definitely need to be addressed and improved if TCL/TK is to compete with
other platforms.
In article <SCHWARTZ.94...@roke.cse.psu.edu> schw...@roke.cse.psu.edu (Scott Schwartz) writes:
> Insert obligatory plug for Richard Gabriel's paper "The good news, the
> bad news, and how to win big", where he explains why wrong is right and
> Lisp is unpopular.
Could you please post this article? (or a pointer) I happened to find a PS
copy somewhere and found it one of the most interesting articles I ever
read. Tcl definitely falls in the "worse is better" category, which might
explain its huge success..
Cheers,
Christophe.
= How many Bell Labs Vice Presidents does it take to change a light bulb? =
= That's proprietary information. Answer available from AT&T on payment =
= of license fee (binary only). Of course, this license applies only if =
= you have AT&T light bulbs; if you have someone elses light bulbs then =
= you have to pay both their license fee and ours. =
I wonder how you dare to judge Motif, when the above sentence shows
you don't know anything about it - except maybe breaking its default
behavior.
>Give me OLIT any day.
You are free to stick to OLIT. But you should give yourself some
time to learn Motif, and _then_ you can start comparing.
Of course, Motif is far from perfect. However, there are very few "perfect"
software packages of comparable functionality. Small pieces of
software can be real jewels, but when the package demands a team effort,
the warts start piling up.
--
>>> Opinions presented above are solely of my own, not those of my employer <<<
Martin Brunecky mar...@xvt.com (303)443-5130 ext 229 or 443-4223
>ous...@sprite.Berkeley.EDU (John Ousterhout) writes:
> One can make religious arguments ad nauseam, but I know of no technically
> sound evidence that lists of symbols are fundamentally better than strings.
>How about, lists have O(1) insertion cost, while strings have O(n) cost.
True.
>Real tcl applications end up having almost unavoidable superquadratic
>behavior because of this.
This doesn't really present a problem, because most of the time,
if you are going to be working with a large enough data set that
it would take a lot of processing time to deal with the strings,
you can write your algorithm in such a way as to use Tcl's
associative arrays (or else it's time to do it in C/C++ and just
pass around a handle to the data in Tcl - that's how Tcl is
intended to be used, after all).
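For example, a sketch of the array idiom (data hypothetical): lookups
cost roughly constant time, where scanning a list with lsearch would
cost O(n) per item:

    set items {apple pear apple plum pear apple}
    foreach item $items {
        if {[info exists count($item)]} {
            incr count($item)
        } else {
            set count($item) 1
        }
    }
    foreach name [lsort [array names count]] {
        puts "$name: $count($name)"
    }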
--
Joe V. Moss | Hm: j...@italia.rain.com Wk: j...@morton.rain.com
Morton & Associates |--------------------------------------------------
7478 S.W. Coho Ct. | label .l -text "Insert cute quote here"
Tualatin, OR 97062 | button .b -text "OK" -command exit; pack .l .b
Since we're slinging arrows, allow me to sling mine. It's
time to accept the fact that Lisp will never be a mainstream
language. There's no question that Lisp is an intellectually
interesting language, and it is very important in certain
communities, such as AI and formal language design. But it's
simply never going to be widely used.
Right. No general-use application written in lisp will ever fly.
Only in a few exotic situations do people ever use lisp.
Tom Lord
part-time emacs hacker
;; Right. No general-use application written in lisp will ever fly.
;; Only in a few exotic situations do people ever use lisp.
;;
;; Tom Lord
;; part-time emacs hacker
Now hang on before you make such a wide-sweeping statement (and try
saying it in a lisp newsgroup and watch the flames fly!). I agree
with your first statement. Deserved or not (not, imho), Lisp has a
rather bad reputation. Some people just can't deal with parens. But
then there is the valid argument of the application's footprint.
Common Lisp is a bloated everything-plus-the-kitchen-sink-and-the-garage-too
monster, but other lisps, such as EuLisp, were designed around a small
language kernel with optional libraries.
Apple's Dylan promises to have the incremental development environment
of most good lisps with the option of static compilation for delivery.
While it's not a parenthesized language like common lisp (the syntax is
Algol-like), semantically it shares a great deal with lisp, is
sufficiently high level, and will probably make a good language for
"general-use applications".
I use lisp for a lot more than a few exotic situations. I use it for file
management, backups, and many things usually relegated to UNIX shell scripts
(which, obviously, don't exist on a Mac), such as searching my filesystem
with regular expressions. Now, I know that that certainly doesn't mean
this would be viable for everyone, I just have a lisp image running almost
all the time when I'm at my computer, so I can afford the overhead of having
the entire development environment around just to copy a file.
Just a few thoughts.
Adam
--
Adam Alpern. HCC Group, University of Colorado at Boulder
a...@cs.colorado.edu
a...@neural.hampshire.edu
Er... perhaps you missed his signature? ;-)
Yep, I guess having been blinded by the "Unix Tower of Babble"
thread, that little bit was too much for my brain to handle ;-).
Emacs *is* a perfect example of a general-use application written
mostly in lisp. Actually, Emacs almost qualifies as an operating
system. Then again, I personally know a few part-time emacs hackers
who curse LISP and everyone who uses it while they grudgingly program
the editor they're addicted to.
Actually, they DO exist on a Mac---in MPW, the Macintosh Programmer's
Workshop. This programming environment is really a complete Finder
replacement with a UNIX-like interface and tools, including scripts,
searches with regular expressions, etc. By itself, it costs about $150,
but it also comes with some third-party products. For example, I bought
an Ada compiler a few years ago for a mere $149 and it came with MPW.
John Doner do...@math.ucsb.edu or do...@aero.org
> Right. No general-use application written in lisp will ever fly.
> Only in a few exotic situations do people ever use lisp.
>
> Tom Lord
> part-time emacs hacker
And perhaps you should add: Only in a few exotic situations do people
ever use an application written in Lisp.
--
Juergen Nickelsen
part-time emacs hacker, too
Could you please post this article? (or a pointer) I happened to find a PS
copy somewhere and found it one of the most interesting articles I ever
read. Tcl definitely falls in the "worse is better" category, which might
explain its huge success..
--------------------
Here is the LaTeX source which I picked up somewhere (I don't know where)
some time ago. Note that it is written in LaTeX as opposed to that
worse-is-better solution from unix -- troff. :-)
--------------------
\documentstyle[12pt]{article}
%\twocolumn
\pagestyle{plain}
\pagenumbering{arabic}
\title{Lisp: \\
Good News \\Bad News \\How to Win Big}
\author{Richard P. Gabriel\\
Lucid, Inc.}
\begin{document}
\maketitle
\begin{abstract}
Lisp has done quite well over the last ten years: becoming nearly
standardized, forming the basis of a commercial sector, achieving
excellent performance, having good environments, able to deliver
applications. Yet the Lisp community has failed to do as well as
it could have. In this paper I look at the successes, the failures,
and what to do next.
\end{abstract}
The Lisp world is in great shape: Ten years ago there was no standard
Lisp; the most standard Lisp was InterLisp, which ran on PDP-10's and
Xerox Lisp machines (some said it ran on Vaxes, but I think they
exaggerated); the second most standard Lisp was MacLisp, which ran
only on PDP-10's, but under the three most popular operating systems
for that machine; the third most standard Lisp was Portable Standard
Lisp, which ran on many machines, but very few people wanted to use
it; the fourth most standard Lisp was Zetalisp, which ran on two
varieties of Lisp machine; and the fifth most standard Lisp was
Scheme, which ran on a few different kinds of machine, but very few
people wanted to use it. By today's standards, each of these had poor
or just barely acceptable performance, nonexistent or just barely
satisfactory environments, nonexistent or poor integration with other
languages and software, poor portability, poor acceptance, and poor
commercial prospects.
Today there is Common Lisp (CL), which runs on all major machines, all
major operating systems, and virtually in every country. Common Lisp
is about to be standardized by ANSI, has good performance, is
surrounded with good environments, and has good integration with other
languages and software.
But, as a business, Lisp is considered to be in ill health. There are
persistent---and sometimes true---rumors about the abandonment of Lisp
as a vehicle for delivery of practical applications.
To some extent the problem is one of perception---there are simply
better Lisp delivery solutions than are generally believed to
exist---and to a disturbing extent the problem is one of unplaced or
misplaced resources, of projects not undertaken, and of implementation
strategies not activated.
Part of the problem stems from our very dear friends in the artificial
intelligence (AI) business. AI has a number of good approaches to
formalizing human knowledge and problem solving behavior. However, AI
does not provide a panacea in any area of its applicability. Some
early promoters of AI to the commercial world raised expectation
levels too high. These expectations had to do with the effectiveness
and deliverability of expert-system-based applications.
When these expectations were not met, some looked for scapegoats,
which frequently were the Lisp companies, particularly when it came to
deliverability. Of course, if the AI companies had any notion about
what the market would eventually expect from delivered AI software,
they never shared it with any Lisp companies I know about. I believe
the attitude of the AI companies was that the Lisp companies would do
what they needed to survive, so why share customer lists and information
with them?
Another part of the problem is the relatively bad press Lisp got,
sometimes from very respectable publications. I saw an article in
Forbes (October 16, 1989) entitled ``Where Lisp Slipped'' by Julie
Pitta. However, the article was about Symbolics and its fortunes. The
largest criticisms of Symbolics in the article are that Symbolics
believed AI would take off and that Symbolics mistakenly pushed its
view that proprietary hardware was the way to go for AI. There was
nothing about Lisp in the article except the statement that it is a
``somewhat obscure programming language used extensively in artificial
intelligence.''
It seems a pity for the Lisp business to take a bump partly because
Julie thought she could make a cute title for her article out of the
name ``Lisp''.
But, there are some real successes for Lisp, some problems, and some
ways out of those problems.
\section{Lisp's Successes}
As I mentioned, Lisp is in better shape today than it ever has been. I want
to review some Lisp success stories.
\subsection{Standardization}
A major success is that there is a standard Lisp---Common Lisp. Many
observers today wish there were a simpler, smaller, cleaner Lisp that
could be standardized, but the Lisp that we have today that is ready
for standardization is Common Lisp. This isn't to say that a better
Lisp could not be standardized later, and certainly one should be.
Furthermore, like any language, Common Lisp should be improved and
changed as needs change.
Common Lisp started as a grassroots effort in 1981 after an
ARPA-sponsored meeting held at SRI to determine the future of Lisp. At
that time there were a number of Lisps in the US being defined and
implemented by former MIT folks: Greenblatt (LMI), Moon and Weinreb
(Symbolics), Fahlman and Steele (CMU), White (MIT), and Gabriel and
Steele (LLNL). The core of the Common Lisp committee came from this
group. That core was Fahlman, Gabriel, Moon, Steele, and Weinreb, and
Common Lisp was a coalescence of the Lisps these people cared about.
There were other Lisps that could have blended into Common Lisp, but
they were not so clearly in the MacLisp tradition, and their
proponents declined to actively participate in the effort because they
predicted success for their own dialects over any {\it common lisp\/}
that was defined by the grassroots effort. Among these Lisps were
Scheme, Interlisp, Franz Lisp, Portable Standard Lisp, and Lisp370.
And outside the US there were major Lisp efforts, including Cambridge
Lisp and Le-Lisp. The humble US grassroots effort did not seek
membership from outside the US, and one can safely regard that as a
mistake. Frankly, it never occurred to the Common Lisp group that this
purely American effort would be of interest outside the US, because
very few of the group saw a future in AI that would extend the needs
for a standard Lisp beyond North America.
Common Lisp was defined and a book published in 1984 called ``Common
Lisp: the Language'' (CLtL). And several companies sprang up to put
Common Lisp on stock hardware to compete against the Lisp machine
companies. Within four years, virtually every major computer company
had a Common Lisp that it had either implemented itself or
private-labeled from a Common Lisp company.
In 1986, X3J13 was formed to produce an ANSI version of Common Lisp.
By then it was apparent that there were significant changes required to
Common Lisp to clean up ambiguities and omissions, to add a condition
system, and to define object-oriented extensions.
After several years it became clear that the process of
standardization was not simple, even given a mature language with a
good definition. The specification of the Common Lisp Object System
(CLOS) alone took nearly two years and seven of the most talented
members of X3J13.
It also became apparent that the interest in international Lisp
standardization was growing. But there was no heir apparent to Common
Lisp. Critics of Common Lisp, especially those outside the US,
focused on Common Lisp's failures as a practical delivery vehicle.
In 1988, an international working group for the standardization of
Lisp was formed. That group is called WG16. Two things are absolutely
clear: The near-term standard Lisp is Common Lisp; a longer-term
standard that goes beyond Common Lisp is desirable.
In 1988, the IEEE Scheme working group was formed to produce an IEEE
and possibly an ANSI standard for Scheme. This group completed its
work in 1990, and the relatively small and clean Scheme is a standard.
Currently, X3J13 is less than a year away from a draft standard for
ANSI Common Lisp; WG16 is stalled because of international bickering;
Scheme has been standardized by IEEE, but it is of limited commercial
interest.
Common Lisp is in use internationally, and serves at least as
a de facto standard until the always contentious Lisp community
agrees to work together.
\subsection{Good Performance}
Common Lisp performs well. Most current implementations use modern
compiler technology, in contrast to older Lisps, which used very
primitive compiler techniques, even for the time. In terms of
performance, anyone using a Common Lisp today on almost any computer
can expect better performance than could be obtained on single-user
PDP-10's or on single-user Lisp machines of mid-1980's vintage. Many
Common Lisp implementations have multitasking and non-intrusive
garbage collection---both regarded as impossible features on stock
hardware ten years ago.
In fact, Common Lisp performs well on benchmarks compared to~C.
The following table shows the ratio of Lisp time and code size to
C~time and code size for three benchmarks.
\begin{center}
\begin{tabular}{|l|c|c|}
\hline
& CPU Time& Code Size \\
\hline
Tak& 0.90& 1.21 \\
\hline
Traverse& 0.98& 1.35 \\
\hline
Lexer& 1.07& 1.48 \\
\hline
\end{tabular}
\end{center}
{\bf Tak} is a Gabriel benchmark that measures function calling and
fixnum arithmetic. {\bf Traverse} is a Gabriel benchmark that
measures structure creation and access. {\bf Lexer} is the tokenizer
of a C compiler and measures dispatching and character manipulation.
These benchmarks were run on a Sun~3 in 1987 using the standard Sun C
compiler using full optimization. The Lisp was not running a
non-intrusive garbage collector.
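\noindent
For reference, {\tt Tak} is the tiny, deeply recursive Takeuchi
function; a sketch of the standard Gabriel-suite definition:
\begin{verbatim}
(defun tak (x y z)
  (if (not (< y x))
      z
      (tak (tak (1- x) y z)
           (tak (1- y) z x)
           (tak (1- z) x y))))
\end{verbatim}
\noindent
The benchmark is typically invoked as {\tt (tak 18 12 6)}; essentially
all of its time goes into function calls and fixnum comparison and
decrement, which is why it isolates those costs so well.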
\subsection{Good Environments}
It is arguable that modern programming environments come from the Lisp
and AI tradition. The first bitmapped terminals (Stanford/MIT), the
mouse pointing device (SRI), full-screen text editors (Stanford/MIT),
and windowed environments (Xerox PARC) all came from laboratories
engaged in AI research. Even today one can argue that the Symbolics
programming environment represents the state of the art.
It is also arguable that the following development environment
features originated in the Lisp world:
\begin{itemize}
\item Incremental compilation and loading
\item Symbolic debuggers
\item Data inspectors
\item Source code level single stepping
\item Help on builtin operators
\item Window-based debugging
\item Symbolic stack backtraces
\item Structure editors
\end{itemize}
Today's Lisp environments are equal to the very best Lisp machine
environments in the 1970's. Windowing, fancy editing, and good
debugging are all commonplace. In some Lisp systems, significant
attention has been paid to the software lifecycle through the use of
source control facilities, automatic cross-referencing, and automatic
testing.
\subsection{Good Integration}
Today Lisp code can coexist with C, Pascal, Fortran, etc. These
languages can be invoked from Lisp and in general, these languages can
then re-invoke Lisp. Such interfaces allow the programmer to pass Lisp
data to foreign code, to pass foreign data to Lisp code, to manipulate
foreign data from Lisp code, to manipulate Lisp data from foreign
code, to dynamically load foreign programs, and to freely mix foreign
and Lisp functions.
The facilities for this functionality are quite extensive and provide a
means for mixing several different languages at once.
\subsection{Object-oriented Programming}
Lisp has the most powerful, comprehensive, and pervasively object-oriented
extensions of any language. CLOS embodies features not found in any other
object-oriented language. These include the following:
\begin{itemize}
\item Multiple inheritance
\item Generic functions including multi-methods
\item First-class classes
\item First-class generic functions
\item Metaclasses
\item Method combination
\item Initialization protocols
\item Metaobject protocol
\item Integration with Lisp types
\end{itemize}
It is likely that Common Lisp (with CLOS) will be the first
standardized object-oriented programming language.
\subsection{Delivery}
It is possible to deliver applications written in Lisp. The currently
available tools are good but are not yet ideal. These solutions range
from removing unused code and data from the application, to building
up applications using only the code and data needed, to producing
{\tt .o} files from Lisp code.
Delivery tools are commercially provided by Lucid, Franz, and Ibuki.
\section{Lisp's Apparent Failures}
\begin{verse}
\smallskip\it
\hfill Too many teardrops for one heart to be crying.\\*
\hfill Too many teardrops for one heart to carry on.\\*
\hfill You're way on top now, since you left me,\\*
\hfill Always laughing, way down at me.\\*
\hfill{\em ? \& The Mysterians}
\end{verse}
This happy story, though, has a sad interlude, an interlude that might
be attributed to the failure of AI to soar, but which probably has
some other grains of truth that we must heed. The key problem with
Lisp today stems from the tension between two opposing software
philosophies. The two philosophies are called ``The Right Thing'' and
``Worse is Better.''
\subsection{The Rise of ``Worse is Better''}
I and just about every designer of Common Lisp and CLOS have had
extreme exposure to the MIT/Stanford style of design. The essence of
this style can be captured by the phrase ``the right thing.'' To such
a designer it is important to get all of the following characteristics
right:
\begin{itemize}
\item Simplicity---the design must be simple, both in
implementation and interface. It is more important for the interface
to be simple than the implementation.
\item Correctness---the design must be correct in all observable
aspects. Incorrectness is simply not allowed.
\item Consistency---the design must not be inconsistent. A design is
allowed to be slightly less simple and less complete to avoid
inconsistency. Consistency is as important as correctness.
\item Completeness---the design must cover as many important situations
as is practical. All reasonably expected cases must be covered.
Simplicity is not allowed to overly reduce completeness.
\end{itemize}
I believe most people would agree that these are good characteristics.
I will call the use of this philosophy of design the ``MIT approach.''
Common Lisp (with CLOS) and Scheme represent the MIT approach to
design and implementation.
The worse-is-better philosophy is only slightly different:
\begin{itemize}
\item Simplicity---the design must be simple, both in implementation
and interface. It is more important for the implementation to be
simple than the interface. Simplicity is the most important
consideration in a design.
\item Correctness---the design must be correct in all observable
aspects. It is slightly better to be simple than correct.
\item Consistency---the design must not be overly inconsistent.
Consistency can be sacrificed for simplicity in some cases, but it is
better to drop those parts of the design that deal with less common
circumstances than to introduce either implementational complexity or
inconsistency.
\item Completeness---the design must cover as many important
situations as is practical. All reasonably expected cases should be
covered. Completeness can be sacrificed in favor of any other
quality. In fact, completeness must be sacrificed whenever implementation
simplicity is jeopardized. Consistency can be sacrificed to achieve
completeness if simplicity is retained; especially worthless is
consistency of interface.
\end{itemize}
Early Unix and C are examples of the use of this school of design, and
I will call the use of this design strategy the ``New Jersey
approach.'' I have intentionally caricatured the worse-is-better
philosophy to convince you that it is obviously a bad philosophy and
that the New Jersey approach is a bad approach.
However, I believe that worse-is-better, even in its strawman form,
has better survival characteristics than the-right-thing, and that the
New Jersey approach when used for software is a better approach than
the MIT approach.
Let me start out by retelling a story that shows that the
MIT/New-Jersey distinction is valid and that proponents of each
philosophy actually believe their philosophy is better.
Two famous people, one from MIT and another from Berkeley (but working
on Unix) once met to discuss operating system issues. The person from
MIT was knowledgeable about ITS (the MIT AI Lab operating system) and
had been reading the Unix sources. He was interested in how Unix
solved the PC loser-ing problem. The PC loser-ing problem occurs when
a user program invokes a system routine to perform a lengthy operation
that might have significant state, such as IO buffers. If an interrupt
occurs during the operation, the state of the user program must be
saved. Because the invocation of the system routine is usually a
single instruction, the PC of the user program does not adequately
capture the state of the process. The system routine must either back
out or press forward. The right thing is to back out and restore the
user program PC to the instruction that invoked the system routine so
that resumption of the user program after the interrupt, for example,
re-enters the system routine. It is called ``PC loser-ing'' because
the PC is being coerced into ``loser mode,'' where ``loser'' is the
affectionate name for ``user'' at MIT.
The MIT guy did not see any code that handled this case and asked the
New Jersey guy how the problem was handled. The New Jersey guy said
that the Unix folks were aware of the problem, but the solution was
for the system routine to always finish, but sometimes an error code
would be returned that signaled that the system routine had failed to
complete its action. A correct user program, then, had to check the
error code to determine whether to simply try the system routine
again. The MIT guy did not like this solution because it was not the
right thing.
The New Jersey guy said that the Unix solution was right because the
design philosophy of Unix was simplicity and that the right thing was
too complex. Besides, programmers could easily insert this extra test
and loop. The MIT guy pointed out that the implementation was simple
but the interface to the functionality was complex. The New Jersey guy
said that the right tradeoff has been selected in Unix---namely,
implementation simplicity was more important than interface
simplicity.
The MIT guy then muttered that sometimes it takes a tough man to make a
tender chicken, but the New Jersey guy didn't understand (I'm not sure
I do either).
Now I want to argue that worse-is-better is better. C~is a programming
language designed for writing Unix, and it was designed using the New
Jersey approach. C~is therefore a language for which it is easy to
write a decent compiler, and it requires the programmer to write text
that is easy for the compiler to interpret. Some have called~C a fancy
assembly language. Both early Unix and C compilers had simple
structures, were easy to port, required few machine resources to run,
and provided about 50\%--80\% of what you want from an operating system
and programming language.
Half the computers that exist at any point are worse than median
(smaller or slower). Unix and C work fine on them. The
worse-is-better philosophy means that implementation simplicity has
highest priority, which means Unix and~C are easy to port on such
machines. Therefore, one expects that if the 50\% functionality Unix
and~C support is satisfactory, they will start to appear everywhere.
And they have, haven't they?
Unix and~C are the ultimate computer viruses.
A further benefit of the worse-is-better philosophy is that the
programmer is conditioned to sacrifice some safety and convenience, and
to put up with some hassle, to get good performance and modest resource
use. Programs written using the New Jersey approach will work well both
on small machines and on large ones, and the code will be portable because it is
written on top of a virus.
It is important to remember that the initial virus has to be basically
good. If so, the viral spread is assured as long as it is portable.
Once the virus has spread, there will be pressure to improve it,
possibly by increasing its functionality closer to 90\%, but users
have already been conditioned to accept worse than the right thing.
Therefore, the worse-is-better software first will gain acceptance,
second will condition its users to expect less, and third will be
improved to a point that is almost the right thing. In concrete
terms, even though Lisp compilers in 1987 were about as good as C
compilers, there are many more compiler experts who want to make C
compilers better than want to make Lisp compilers better.
The good news is that in 1995 we will have a good operating system and
programming language; the bad news is that they will be Unix and C++.
There is a final benefit to worse-is-better. Because a New Jersey
language and system are not really powerful enough to build complex
monolithic software, large systems must be designed to reuse
components. Therefore, a tradition of integration springs up.
How does the right thing stack up? There are two basic scenarios: the
``big complex system scenario'' and the ``diamond-like jewel''
scenario.
The ``big complex system'' scenario goes like this:
First, the right thing needs to be designed. Then its implementation
needs to be designed. Finally it is implemented. Because it is the
right thing, it has nearly 100\% of desired functionality, and
implementation simplicity was never a concern so it takes a long time
to implement. It is large and complex. It requires complex tools to
use properly. The last 20\% takes 80\% of the effort, and so the right
thing takes a long time to get out, and it only runs satisfactorily on
the most sophisticated hardware.
The ``diamond-like jewel'' scenario goes like this:
The right thing takes forever to design, but it is quite small at
every point along the way. To implement it to run fast is either
impossible or beyond the capabilities of most implementors.
The two scenarios correspond to Common Lisp and Scheme.
The first scenario is also the scenario for classic artificial
intelligence software.
The right thing is frequently a monolithic piece of software, but for
no reason other than that the right thing is often designed
monolithically. That is, this characteristic is a happenstance.
The lesson to be learned from this is that it is often undesirable
to go for the right thing first. It is better to get half of the right
thing available so that it spreads like a virus. Once people are hooked on
it, take the time to improve it to 90\% of the right thing.
A wrong lesson is to take the parable literally and to conclude that C
is the right vehicle for AI software. The 50\% solution has to be
basically right, and in this case it isn't.
But, one can conclude only that the Lisp community needs to seriously
rethink its position on Lisp design. I will say more about this
later.
\subsection{Good Lisp Programming is Hard}
Many Lisp enthusiasts believe that Lisp programming is easy. This is
true up to a point. When real applications need to be delivered, the
code needs to perform well. With~C, programming is always difficult
because the compiler requires so much description and there are so few
data types. In Lisp it is very easy to write programs that perform
very poorly; in~C it is almost impossible to do that. The following
examples of badly performing Lisp programs were all written by
competent Lisp programmers while writing real applications that were
intended for deployment. I find these quite sad.
\subsubsection{Bad Declarations}
This example is a mistake that is easy to make. The programmer here
did not declare his arrays as fully as he could have. Therefore, each
array access was about as slow as a function call when it should have
been a few instructions. The original declaration was as follows:
\begin{verbatim}
(proclaim '(type (array fixnum *) *ar1* *ar2* *ar3*))
\end{verbatim}
\noindent
The three arrays happen to be of fixed size, which is reflected in the
following correct declaration:
\begin{verbatim}
(proclaim '(type (simple-array fixnum (4)) *ar1*))
(proclaim '(type (simple-array fixnum (4 4)) *ar2*))
(proclaim '(type (simple-array fixnum (4 4 4)) *ar3*))
\end{verbatim}
Altering the faulty declaration improved the performance of the entire
system by 20\%.
\subsubsection{Poor Knowledge of the Implementation}
The next example is where the implementation has not optimized a
particular case of a general facility, and the programmer has used the
general facility thinking it will be fast. Here five values are being
returned in a situation where the order of side effects is critical:
\begin{verbatim}
(multiple-value-prog1
    (values (f1 x)
            (f2 y)
            (f3 y)
            (f4 y)
            (f5 y))
  (setf (aref ar1 i1) (f6 y))
  (f7 x y))
\end{verbatim}
\noindent
The implementation happens to optimize {\tt multiple-value-prog1} for
up to three return values, but the case of five values CONSes. The
correct code follows:
\begin{verbatim}
(let ((x1 (f1 x))
      (x2 (f2 y))
      (x3 (f3 y))
      (x4 (f4 y))
      (x5 (f5 y)))
  (setf (aref ar1 i1) (f6 y))
  (f7 x y)
  (values x1 x2 x3 x4 x5))
\end{verbatim}
\noindent
There is no reason that a programmer should know that this rewrite is
needed. On the other hand, finding that performance was not as
expected should not have led the manager of the programmer in question
to conclude, as he did, that Lisp was the wrong language.
\subsubsection{Use of FORTRAN Idioms}
Some Common Lisp compilers do not optimize the same way as others. The
following expression is sometimes used:
\begin{verbatim}
(* -1 <form>)
\end{verbatim}
\noindent
when compilers often produce better code for this variant:
\begin{verbatim}
(- <form>)
\end{verbatim}
\noindent
Of course, the first is the Lisp analog of the FORTRAN idiom:
\begin{verbatim}
-1*<form>
\end{verbatim}
\subsubsection{Totally Inappropriate Data Structures}
Some might find this example hard to believe. This really occurred in
some code I've seen:
\begin{verbatim}
(defun make-matrix (n m)
  (let ((matrix ()))
    (dotimes (i n matrix)
      (push (make-list m) matrix))))

(defun add-matrix (m1 m2)
  (let ((l1 (length m1))
        (l2 (length m2)))
    (let ((matrix (make-matrix l1 l2)))
      (dotimes (i l1 matrix)
        (dotimes (j l2)
          (setf (nth i (nth j matrix))
                (+ (nth i (nth j m1))
                   (nth i (nth j m2)))))))))
\end{verbatim}
\noindent
What's worse is that in the particular application, the matrices were
all fixed size, and matrix arithmetic would have been just as fast in
Lisp as in FORTRAN.
This example is bitterly sad: The code is absolutely beautiful, but it
adds matrices slowly. Therefore it is excellent prototype code and
lousy production code. You know, you cannot write production code as
bad as this in~C.
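\noindent
For contrast, here is a sketch (not from the original application; the
$4\times4$ size is borrowed from the earlier declaration example for
illustration) of the same addition written with fixed-size arrays; with
these declarations each {\tt aref} compiles to a few instructions:
\begin{verbatim}
(deftype matrix () '(simple-array fixnum (4 4)))

(defun make-matrix ()
  (make-array '(4 4) :element-type 'fixnum :initial-element 0))

(defun add-matrix (m1 m2)
  (declare (type matrix m1 m2))
  (let ((result (make-matrix)))
    (dotimes (i 4 result)
      (dotimes (j 4)
        (setf (aref result i j)
              (+ (aref m1 i j) (aref m2 i j)))))))
\end{verbatim}
\noindent
This version is hardly less beautiful, and it is the kind of code a
FORTRAN compiler would be happy with.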
\subsection{Integration is God}
In the worse-is-better world, integration is linking your {\tt .o}
files together, freely intercalling functions, and using the same
basic data representations. You don't have a foreign loader, you don't
coerce types across function-call boundaries, you don't make one
language dominant, and you don't make the woes of your implementation
technology impact the entire system.
The very best Lisp foreign functionality is simply a joke when faced
with the above reality. Every item on the list can be addressed in a
Lisp implementation. This is just not the way Lisp implementations
have been done in the right thing world.
The virus lives while the complex organism is stillborn. Lisp must
adapt, not the other way around. The right thing and 2 shillings will
get you a cup of tea.
\subsection{Non-Lisp Environments are Catching Up}
This is hard to face up to. For example, most C
environments---initially imitative of Lisp environments---are now
pretty good. Current best C environments have the following:
\begin{itemize}
\item Symbolic debuggers
\item Data inspectors
\item Source code level single stepping
\item Help on builtin operators
\item Window-based debugging
\item Symbolic stack backtraces
\item Structure editors
\end{itemize}
\noindent
And soon they will have incremental compilation and loading. These
environments are easily extendible to other languages, with
multilingual environments not far behind.
Though still the best, current Lisp environments have several
prominent failures. First, they tend to be window-based but not well
integrated. That is, related information is not represented so as to
convey the relationship. A multitude of windows does not mean
integration, and neither does being implemented in the same language
and running in the same image. In fact, I believe no currently
available Lisp environment has any serious amount of integration.
Second, they are not persistent. They seem to be designed for a
single login session. Files are used to keep persistent data---how
1960's.
Third, they are not multilingual even when foreign interfaces are available.
Fourth, they do not address the software lifecycle in any extensive
way. Documentation, specifications, maintenance, testing, validation,
modification, and customer support are all ignored.
Fifth, information is not brought to bear at the right times. The compiler
is able to provide some information, but the environment should be able to
generally know what is fully defined and what is partially defined. Performance
monitoring should not be a chore.
Sixth, using the environment is difficult. There are too many
things to know. It's just too hard to manage the mechanics.
Seventh, environments are not multi-user when almost all interesting
software is now written in groups.
The real problem has been that almost no progress in Lisp environments
has been made in the last 10 years.
\section{How Lisp Can Win Big}
\begin{verse}
\smallskip\it
\hfill When the sun comes up, I'll be on top.\\*
\hfill You're right down there looking up.\\*
\hfill On my way to come up here,\\*
\hfill I'm gonna see you waiting there.\\*
\hfill I'm on my way to get next to you.\\*
\hfill I know now that I'm gonna get there.\\*
\hfill{\em ? \& The Mysterians}
\end{verse}
The gloomy interlude can have a happy ending.
\subsection{Continue Standardization Progress}
We need to bury our differences at the ISO level and realize that
there is a short term need, which must be Common Lisp, and a long term
need, which must address all the issues for practical applications.
We've seen that the right thing attitude has brought us a very large,
complex-to-understand, and complex-to-implement Lisp---Common
Lisp---that solves way too many problems. We need to move beyond
Common Lisp for the future, but that does not imply giving up on
Common Lisp now. We've seen it is possible to do delivery of
applications, and I think it is possible to provide tools that make it
easier to write applications for deployment. A lot of work has gone
into getting Common Lisp to the point of a ``right thing'' in many
ways, and there are viable commercial implementations. But we need to
solve the delivery and integration problems in spades.
Earlier I characterized the MIT approach as often yielding stillborn
results. To stop Common Lisp standardization now is equivalent to
abortion, and that is equivalent to the Lisp community giving up on
Lisp. If we want to adopt the New Jersey approach, it is wrong to give
up on Lisp, because C just isn't the right language for AI.
It also simply is not possible to dump Common Lisp now, work on a new
standard, and then standardize in a timely fashion. Common Lisp is
all we have at the moment. No other dialect is ready for
standardization.
Scheme is a smaller Lisp, but it also suffers from the MIT approach.
It is too tight and not appropriate for large-scale software. At least
Common Lisp has some facilities for that.
I think there should be an internationally recognized standard for
Common Lisp. I don't see what is to be gained by aborting the Common
Lisp effort today just because it happens to not be the best solution
to a commercial problem. For those who believe Lisp is dead or dying,
what does killing off Common Lisp achieve but to convince people that
the Lisp community kills its own kind? I wish less effort would go
into preventing Common Lisp from becoming a standard when it cannot
hurt to have several Lisp standards.
On the other hand, there should be a strong effort towards the next
generation of Lisp. The worst thing we can do is to stand still as a
community, and that is what is happening.
All interested parties must step forward for the longer-term effort.
\subsection{Retain the High Ground in Environments}
I think there is a mistake in following an environment path that
creates monolithic environments. It should be possible to use a
variety of tools in an environment, and it should be possible for
those who create new tools to be able to integrate them into the
environment.
I believe that it is possible to build a tightly integrated
environment that is built on an open architecture in which all tools,
including language processors, are protocol-driven. I believe it is
possible to create an environment that is multilingual and addresses
the software lifecycle problem without imposing a particular software
methodology on its users.
Our environments should not discriminate against non-Lisp programmers
the way existing environments do. Lisp is not the center of the world.
\subsection{Implement Correctly}
Even though Common Lisp is not structured as a kernel plus libraries,
it can be implemented that way. The kernel and library routines can be
in the form of {\tt .o} files for easy linking with other, possibly
non-Lisp, modules; the implementation must make it possible to write,
for example, small utility programs. It is also possible to piggyback
on existing compilers, especially those that use common back ends.
It is also possible to implement Lisp so that standard debuggers,
possibly with extensions, can be made to work on Lisp code.
It might take time for developers of standard tools to agree to extend
their tools to Lisp, but it certainly won't happen until our
(exceptional) language is implemented more like ordinary ones.
\subsection{Achieve Total Integration}
I believe it is possible to implement a Lisp and surrounding
environment which has no discrimination for or against any other
language. It is possible using multilingual environments, clever
representations of Lisp data, conservative garbage collection, and
conventional calling protocols to make a completely integrated Lisp
that has no demerits.
\subsection{Make Lisp the Premier Prototyping Language}
Lisp is still the best prototyping language. We need to push this
forward. A multilingual environment could form the basis or
infrastructure for a multilingual prototyping system. This means doing
more research to find new ways to exploit Lisp's strengths and to
introduce new ones.
Prototyping is the act of producing an initial implementation of a
complex system. A prototype can be easily instrumented, monitored, and
altered. Prototypes are often built from disparate parts that have
been adapted to a new purpose. Descriptions of the construction of a
prototype often involve statements about modifying the behavioral
characteristics of an existing program. For example, suppose there
exists a tree traversal program. The description of a prototype using
this program might start out by saying something like ``let $S_1$ be
the sequence of leaf nodes visited by $P$ on tree $T_1$ and $S_2$ the
leaf nodes visited by $P$ on tree $T_2$. Let $C$ be a correspondence
between $S_1$ and $S_2$ where $f\colon S_1 \to S_2$ maps elements to
corresponding elements.'' Subsequent statements might manipulate the
correspondence and use $f$. Once the definition of a leaf node is made
explicit, this is a precise enough statement for a system to be able
to modify the traversal routine to support the correspondence and
$f$.
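A minimal sketch of this statement (assuming, for illustration only,
that trees are represented as nested proper lists, that leaves are
distinct, and that $P$ visits leaves depth-first):
\begin{verbatim}
(defun leaf-nodes (tree)
  ;; S = the leaves of TREE in depth-first order.
  (if (atom tree)
      (list tree)
      (mapcan #'leaf-nodes tree)))

(defun make-correspondence (t1 t2)
  ;; f: S1 -> S2, pairing corresponding leaves positionally.
  ;; Assumes the leaves of T1 are distinct.
  (let ((pairs (mapcar #'cons (leaf-nodes t1) (leaf-nodes t2))))
    #'(lambda (leaf) (cdr (assoc leaf pairs)))))
\end{verbatim}
\noindent
For example, {\tt (funcall (make-correspondence '(a (b c)) '(x (y z)))
'b)} returns {\tt y}. Given such an $f$, the system's remaining job is
exactly the one described above: modifying the traversal routine to
support the correspondence.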
A language that describes the modification and control of an existing
program can be termed a {\it program\/} language. Program languages can
be built on one or several underlying programming languages, and in
fact can be implemented as part of the functionality of the
prototyping environment. This view is built on the insight that an
environment is a mechanism to assist a programmer in creating a
working program, including preparing the source text. There is no
necessary requirement that an environment be limited to working only
with raw source text. As another example, some systems comprise
several processes communicating through channels. The creation of this
part of the system can be visual, with the final result produced by
the environment being a set of source code in several languages, build
scripts, link directives, and operating system calls. Because no
single programming language encompasses the program language, one
could call such a language an {\it epilanguage}.
\subsection{The Next Lisp}
I think there will be a next Lisp. This Lisp must be carefully
designed, using the principles for success we saw in worse-is-better.
There should be a simple, easily implementable kernel to the Lisp.
That kernel should be both more than Scheme---modules and macros---and
less than Scheme---continuations remain an ugly stain on the
otherwise clean manuscript of Scheme.
The kernel should emphasize implementational simplicity, but not at the
expense of interface simplicity. Where one conflicts with the other,
the capability should be left out of the kernel. One reason is so that
the kernel can serve as an extension language for other systems, much
as GnuEmacs uses a version of Lisp for defining Emacs macros.
Some aspects of the extreme dynamism of Common Lisp should be
re-examined, or at least the tradeoffs reconsidered. For example, how
often does a real program do this?
\begin{verbatim}
(defun f ...)

(dotimes (...)
  ...
  (setf (symbol-function 'f) #'(lambda ...))
  ...)
\end{verbatim}
\noindent
Implementations of the next Lisp should not be influenced by previous
implementations to make this operation fast, especially at the expense
of poor performance of all other function calls.
The language should be segmented into at least four layers:
\begin{itemize}
\item[1.] The kernel language, which is small and simple to
implement. In all cases, the need for dynamic redefinition should be
re-examined to determine that support at this level is necessary. I
believe nothing in the kernel need be dynamically redefinable.
\item[2.] A linguistic layer for fleshing out the language. This
layer may have some implementational difficulties, and it will
probably have dynamic aspects that are too expensive for the kernel
but too important to leave out.
\item[3.] A library. Most of what is in Common Lisp would be in this
layer.
\item[4.] Environmentally provided epilinguistic features.
\end{itemize}
In the first layer I include conditionals, function calling, all
primitive data structures, macros, single values, and very basic
object-oriented support.
In the second layer I include multiple values and more elaborate
object-oriented support. The second layer is for difficult
programming constructs that are too important to leave to environments
to provide, but which have sufficient semantic consequences to warrant
precise definition. Some forms of redefinition capabilities might
reside here.
In the third layer I include sequence functions, the elaborate IO
functions, and anything else that is simply implemented in the first
and possibly the second layers. These functions should be linkable.
In the fourth layer I include those capabilities that an environment
can and should provide, but which must be standardized. A typical
example is {\tt defmethod} from CLOS. In CLOS, generic functions are
made of methods, each method applicable to certain classes. The first
layer has a definition form for a complete generic function---that is,
for a generic function along with all of its methods, defined in one
place (which is how the layer~1 compiler wants to see it). There will
also be means of associating a name with the generic function.
However, while developing a system, classes will be defined in various
places, and it makes sense to be able to see relevant (applicable)
methods adjacent to these classes. {\tt defmethod} is the construct
to define methods, and {\tt defmethod} forms can be placed anywhere
amongst other definitional forms.
But methods are relevant to each class on which the method is
specialized, and also to each subclass of those classes. So, where
should the unique {\tt defmethod} form be placed? The environment
should allow the programmer to see the method definition in any or all
of these places, while the real definition should be in some
particular place. That place might as well be in the single generic
function definition form, and it is up to the environment to show the
{\tt defmethod} equivalent near relevant classes when required, and to
accept as input the source in the form of a {\tt defmethod} (which it
then places in the generic function definition).
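Standard CLOS already shows how the two views can coexist (a small
sketch; the {\tt circle} and {\tt rectangle} classes are invented for
illustration):
\begin{verbatim}
(defclass circle () ())
(defclass rectangle () ())

;; All methods in one place, the way the layer-1
;; compiler wants to see the generic function:
(defgeneric describe-shape (shape)
  (:method ((s circle)) "a circle")
  (:method ((s rectangle)) "a rectangle"))

;; The same methods written separately, so each can
;; appear next to the class it specializes on:
(defmethod describe-shape ((s circle)) "a circle")
(defmethod describe-shape ((s rectangle)) "a rectangle")
\end{verbatim}
\noindent
The proposal is that the second form become an environmental
presentation of the first, rather than a separate linguistic construct.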
We want to standardize the {\tt defmethod} form, but it is a
linguistic feature provided by the environment. Similarly, many uses
of elaborate lambda-list syntax, such as keyword arguments, are
examples of linguistic support that the environment can provide,
possibly by using color or other adjuncts to the text.
In fact, the area of function-function interfaces should be
re-examined to see what sorts of argument naming schemes are needed and
in which layer they need to be placed.
Finally, note that it might be that every layer~2 capability could
be provided in a layer~1 implementation by an environment.
\subsection{Help Application Writers Win}
The Lisp community has too few application writers. The Lisp
vendors need to make sure these application writers win. To do this
requires that the parties involved be open about their problems and
not adversarial. For example, when an expert system shell company
finds problems, it should open up its source code to the Lisp vendor
so that both can work towards the common goal of making a faster,
smaller, more deliverable product. And the Lisp vendors should do the
same.
The business leadership of the AI community seems to have adopted the
worst caricature-like traits of business practice: secrecy, mistrust,
run-up-the-score competitiveness. We are an industry that has enough
common competitors without searching for them among our own ranks.
Sometimes the sun also rises.
\begin{thebibliography}{2}
\bibitem{?} ? \& the Mysterians, ``96 Tears,'' Pa-go-go Records 1966,
re-released on Cameo Records, September 1966.
\end{thebibliography}
\end{document}
%%%
%%% Local Variables: %%%
%%% TeX-command: "/usr/lib/tex/bin/clatex" %%%
%%% TeX-dvi-print-command: "lp -dljnpm -odvi" %%%
%%% TeX-show-queue-command: "lpstat -c ljnpm" %%%
%%% End: %%%
In article <LORD.94Ju...@x1.cygnus.com> lo...@x1.cygnus.com (Tom Lord) writes:
Perhaps you mean no general-use application written entirely in Common
Lisp will ever fly? I can think of lots of counterexamples:
* GNU Emacs is partially written in Lisp. It is a hybrid
implementation using C at its core and Lisp as a configuration
and extension language.
* AutoCAD uses AutoLisp as an extension language. It is very, very
popular given that it is a CAD package, and not, say, a spreadsheet.
I can't remember the figures exactly, but I've heard there are 500,000
to 1,000,000 copies of it floating around world-wide.
* Various other CAD packages use Lisp extension languages as well,
for example Cadence Design Systems.
* Interleaf uses Lisp as an extension language.
* SoftQuad has an SGML editor which uses Scheme (a Lisp dialect) as an
extension language.
* All applications written in WINTERP (I know of one that had
an estimated 1500 users.) All applications using Elk. All
applications using CLM/GINA. All applications using GNU Emacs
and written in Emacs-Lisp (such as the newsreader I'm using --
GNUS).
* The statistics community has found Luke Tierney's XLISP-STAT
to be a very cool package as well. This is a hybrid C/Lisp
implementation of Lisp, extended to do graphics and plotting
(Mac, Unix/X11) and with statistics primitives.
--------------------
From: ous...@sprite.Berkeley.EDU (John Ousterhout)
> Since we're slinging arrows, allow me to sling mine. It's time to accept the
> fact that Lisp will never be a mainstream language. There's no question that
> Lisp is an intellectually interesting language, and it is very important
> in certain communities, such as AI and formal language design. But it's
> simply never going to be widely used. Lisp has been trying for more than
> 30 years; every few years we hear another round of grandiose claims, and
> a few years later they have evaporated. We could argue about why Lisp
> hasn't entered the mainstream, but none of this changes the basic fact
> that it hasn't.
>
> I believe that it is reasonable to give new ideas a chance and to accept
> plausible arguments that they might succeed, but when a system has been
> around for 30 years I don't think "plausible arguments" are enough anymore.
> If someone is going to argue today why Lisp should be a mainstream
> language, they must somehow explain why it hasn't succeeded in the past
> and what will happen this time to make things turn out differently.
Repeating "Lisp is dead" over and over and over again will not make it
true, just like repeating "Tk has a Motif Look and Feel" will not make it
true to someone expecting a Tk application to "feel" like Motif.
All you have to do is go down to any technical bookstore and you will find
many many books on Lisp programming, Lisp application programming and
Scheme programming. And I'm not talking about 30 year old books on Lisp,
I'm talking about books that are in their third or fourth printing and have
been published in the last few years. There is a market for these books,
otherwise the publishers would never have put them out, nor put out
new editions of older books (e.g. Winston & Horn).
In addition to Lisp books, there are also a number of recently published
books on the Scheme dialect of Lisp. One of the best introductory books on
computer science, Abelson&Sussman's "Structure and Interpretation of
Computer Programs", uses SCHEME and not Pascal or C to introduce the
student to a variety of interesting CS problems. The fortunate few that are
able to make it into MIT end up taking 6.001 (which uses the
Abelson&Sussman book) as their introductory computer science and
programming course, which tends to put these folks on the right track right
from the start.
Finally, I know of at least three books on GNU Emacs which discuss
Emacs-Lisp programming, and there are whole rows of books on AutoLisp
programming for AutoCAD.
(Since WINTERP uses XLISP-PLUS, I'll also mention that I know of two books
on XLISP, and one on Luke Tierney's XLISP-STAT).
Meanwhile, I know of exactly one book on TCL/Tk (Ousterhout's) that is now
available in bookstores. More importantly, I haven't come across any books
on applications using TCL/Tk as an extension language (like the GNU Emacs
or AutoCAD books).
> |> Yes, TCL is a simple language -- I contend that it is too simple and too
> |> simplistic for constructing well engineered applications:
> |> ...
>
> Again, I think the large number of Tcl applications and the growing size of
> the Tcl community speak for themselves. What better evidence could there
> be that Tcl and Tk are practical for building interesting and useful
> applications?
I already answered this in my original post when I said:
| The fact that people *can* program applications in the bourne shell, or
| tcsh, or the korn shell does not imply that a language having the above
| limitations is a good or expressive language for building applications.
Some of the problems I addressed:
| * type-overloading of strings, and lack of any real type system.
| * lack of proper automatic memory and resource management (freeing
| up data when a widget is destroyed doesn't quite cut it).
| * lack of lexical scopes for functions/procedures.
| * lack of closures (forcing programmers to use too many global
| variables).
| * Lack of object orientation in the standard TCL/Tk release.
These are clearly problems with TCL in terms of building larger applications
that scale well and aren't mired in a sea of global references. I can't
imagine that people would have spent the time to develop [incr tcl] and
[incr tk] unless they were attacking a known problem.
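To make the closure point above concrete, here's a toy Lisp sketch
(the names are invented): each counter closes over its own state, so
nothing leaks into the global namespace the way global-variable-based
Tcl callbacks do:

    (defun make-counter ()
      ;; Each call returns a fresh closure over its own N.
      (let ((n 0))
        #'(lambda () (incf n))))

    (setq c1 (make-counter))    ; (funcall c1) => 1, then 2, ...
    (setq c2 (make-counter))    ; independent: (funcall c2) => 1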
I think the whole "popularity" line of reasoning is questionable: MS-DOS
and MS-WINDOWS are more popular than Unix/X, but many of us have good
reasons to keep working on Unix and X. Beyond that, if you're worried about
popularity, both the Lisp and TCL/Tk communities are no doubt insignificant
when compared to, say, the Visual Basic community, or Actor, or other
similar packages available for MS-WINDOWS.
In other words, the popularity argument is meaningless at best, and
specious at worst. Some people are willing to pay the costs of Lisp in
return for its benefits. Others may be lured to TCL/Tk because it is small,
simple and free and includes a free toolkit. None of these have anything
to do with the fact that the language may not be good for building
GUI-based applications. People write applications in assembly too.
When discussing language features, one shouldn't confuse a trend with a
movement.
I personally couldn't give a deleted expletive about whether Lisp is more
or less popular than TCL/Tk or Perl, just as long as no employer of mine
ever demands that I write, maintain, or extend a TCL/Tk or Safe-TCL
program. In the mean time, I'm happy to see that we all have choices and
are not locked in to any particular form of lowest-common-denominator
groupthink (e.g. MS-DOS) -- use the best tool for the job, whether it be
C, C++, Lisp, TCL, Perl, Awk, Sed, ASN1, APL, Prolog, or whatever.
> I wonder how you dare to judge Motif, when the above sentence shows
> you don't know anything about it - except maybe breaking its default
> behavior.
The whole point of a GUI is that you *shouldn't* have to know how to use
and configure it to have it work. I haven't added a single Motif resource
to my .Xdefaults on this or any other system, though dxsession has put a
few in on this box.
This is how Motif applications have always behaved.
The only resource that might be applicable is:
Mwm*keyboardFocusPolicy: pointer
This was probably added by dxconsole. If that forces Motif programs to
use pointer focus for *widgets* then Motif is just plain broken.
Yes, it's hard to make a system as complex as Motif without warts.
You seem to think that implies we should just accept the warts.
I think a better solution is to ask if we need such a complex system.
[lots of stuff deleted]
|>
|> I personally couldn't give a deleted expletive about whether Lisp is more
|> or less popuplar than TCL/Tk or Perl, just as long as no employer of mine
|> ever demands that I write, maintain, or extend a TCL/Tk or Safe-TCL
|> program. In the mean time, I'm happy to see that we all have choices and
|> are not locked in to any particular form of lowest-common-denominator
|> groupthink (e.g. MS-DOS) -- use the best tool for the job, whether it be
|> C, C++, Lisp, TCL, Perl, Awk, Sed, ASN1, APL, Prolog, or whatever.
Am I the only one who finds this paragraph self-contradictory?
"...as long as no employer of mine ever demands that I write...TCL/Tk"
followed in virtually the same breath by
"...use the best tool for the job..."
Come on.
Mike
--
"If I cannot bring it up on my monitor and it does not interfere
with a major aspect of my physiology, like eating, sleeping, or breathing,
then it probably does not exist"
1. Build one to throw away.
The "worse is better" approach allows you to implement things, get a LOT
of people playing with them, and then throw them away if they don't work
out. System V IPC, for example, or the old stty/gtty interface to terminal
I/O. On a smaller scale this is called "Rapid Prototyping" and is considered
a good thing.
Where this falls down is where you get people dependent on an interface.
It's *far* more important to get the interface right. The problem Lisp
has here is that it's *all* interface.
2. The Second System Effect.
C++ and Common Lisp. 'nuff said.
You need to compromise everything to get a system out there people can use.
The best system in the world does you no good if it only exists on paper, or
on hardware people can't afford:
Smalltalk, Common Lisp, ...
Also, surprisingly:
UNIX
For a long time UNIX wouldn't run on reasonable hardware, or if it
did it was crippled by AT&T's "Mainframe" licensing.
It's not "worse is better" enough. DOS won *that* race.
PLAN 9
'nuff said.
Am I the only one who finds this paragraph self-contradictory?
"...as long as no employer of mine ever demands that I write...TCL/Tk"
followed in virtually the same breath by
"...use the best tool for the job..."
With all due respect for the wonderful implementation of Tcl, I find
it difficult to think of a job for which Tcl is the best tool.
Perhaps someone could make a list?
Bill
--
Bill Janssen <jan...@parc.xerox.com> (415) 812-4763 FAX: (415) 812-4777
Xerox Palo Alto Research Center, 3333 Coyote Hill Rd, Palo Alto, CA 94304
URL: ftp://parcftp.parc.xerox.com/pub/ilu/misc/janssen.html
From: ous...@sprite.Berkeley.EDU (John Ousterhout)
| There are many things in Niels Mayer's article that deserve a rebuttal, I'll
| just pick a couple:
|
| |> TCL's problems of language quoting inconsistency make it difficult to write
| |> higher level abstractions for UI construction -- macros, iterators,
| |> data-driven-UI-generation, etc. Not that higher level abstractions can't be
| |> done -- it's just that creating them in TCL is linguistically painful
| |> because the language uses the wrong fundamental data-type -- newline
| |> delimited STRINGs separated by spaces. In contrast, LISP chose the right
| |> fundamental data-type, LISTs of SYMBOLs, and consistently applies this
| |> datatype throughout the language.
| |>
|
| This paragraph suggests that you don't understand Tcl. In fact the quoting
| rules are very simple and very consistent; I claim that it's quite
| straightforward to build higher-level constructs for UIs and I think there
| are many many examples in the Tcl community that illustrate this. One
| can make religious arguments ad nauseum, but I know of no technically
| sound evidence that lists of symbols are fundamentally better than strings.
It looks like a number of your users are also having trouble understanding
Tcl. In my opinion, this is because the quoting and parenthesization rules
are neither straightforward, simple, nor consistent. About once every week
or two, I see questions come up on comp.lang.tcl dealing with one or more
of the issues raised above. Examples of these problems have been
appended to the end of this article...
When users are having real problems, it becomes clear that the arguments
are no longer religious. In Lisp, you can get arbitrarily complex and
inscrutable once you go to extreme macro-ology and multiple levels of
backquoting. However, for the more typical cases handled by the neophyte,
the problems of quoting, time of binding, time of substitution, and scope
of bindings are more consistent and easier to understand in Lisp. And
when you need to, you know that there's a very nice macro capability
to back up any meta-level programming you need to accomplish.
By default, Lisp does the right thing for the case of binding at run-time,
and when you need to perform binding at evaluation time, you use
backquotes... The places where backquote substitutions happen are clearly
marked by backquoting 'commands' e.g. (,foo); all other cases look up the
value of a symbol based on the closure surrounding the code being
executed. By using closures, one may limit the extent of variables to
groups of functions or methods, an application module, or the entire
application -- closures are the basis enabling any lexically scoped Lisp
solution to scale better than TCL. And in Lisp, you don't have multitudes
of different ways of opening and closing expressions...
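To make that contrast concrete, here's a minimal sketch (the variable
name is invented) of the one-marker rule -- it is the same
early-versus-late binding choice that trips up Tcl users with
[list $w ...] versus {$w ...}:

    (setq widget 'my-button)

    ;; Substitution happens now, exactly where the comma says:
    `(,widget insert insert)    ; => (MY-BUTTON INSERT INSERT)

    ;; No substitution: WIDGET is looked up later, when the
    ;; form is eventually evaluated.
    '(widget insert insert)     ; => (WIDGET INSERT INSERT)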
Furthermore, there's 30 years of history and 30 years worth of books on
Lisp programming which have dealt with these problems from the pedagogical,
technical, and style/design perspectives. Most beginners find this helpful.
Reinventing the issues which are already solved problems in Lisp, just so
that you can have a language that doesn't parenthesize like Lisp -- that is
not helpful to the beginner. It does, however, open up a whole new market
for books.
Finally, I think that having language syntax depend on newlines is a bad
idea -- I think most unix shell programmers will grudgingly agree with me
on this. I'd much rather have a Lispish syntax where you delineate your
current top-level evaluations by [ ... ] and don't bother with newlines.
Unfortunately, given people's knee-jerk response to Lisp, having such a TCL
would highlight TCL's parentage as bastard-child-of-Lisp. Something almost
Lisp-like would set off or shatter people's preconceived panic
response. Can't do that! (Meanwhile, I eagerly await the chance/time to see
how the Dylan folks solved the problem of being politically right-thinking
with regards to syntax while retaining a Lisp core).
Well, here are the examples, gleaned from a random reading through
comp.lang.tcl articles on this site:
--------------------
From: hfv...@bcarh80a.bnr.ca (7Z31 Coop)
| In article <1994Jul21.1...@khis.com>, cha...@khis.com (David Chang) writes:
|
| [... Stuff Deleted ...]
| |> I'm trying to get the number of word "set" appeared in $file and set
| |> the scale value by using "set count [exec grep -c set $file]; .scale set $count".
| |> This doesn't work if no "set" appeared in file "$file" , so I change to use
| |> the following method (use catch) :-) Guess what ?
| |>
| [... Stuff Deleted ...]
| |>
| |> catch [exec grep -c "set" $file] retval
| |> if { $retval == 1 } {
| |> .scale set 0
| |> return 0
| |> }
| |>
| |> if { $retval != 1 } {
| |> set count [exec grep -c "set" $file]
| |> .scale set $count
| |> return 0
| |> }
| |>
| |> So my question is : How do I set the variable "count" if the return value of
| |> "grep -c" is 0 ?
|
| Try:
|
| if {[catch {set count [exec grep -c "set" $file]}]} {
|     .scale set 0
| } else {
|     .scale set $count
| }
| return 0
From: fri...@faw.uni-ulm.de (Davide Frisoni)
| Hello everyone,
| thanks to all the people who answered my question; the main error came from a wrong use of [].
|
| I wrote :
| set code [catch [my_cprogram] err] :-(
| and now with :
| set code [catch {my_cprogram} err] :-)
| it works.
From: Rob Earhart <earh...@CMU.EDU>
| wessel@stt_even.let.ruu.nl (Wessel Kraaij) writes:
| > proc 8bitbindings {widget} {
| > bind $widget \" {set nop 0}
| > bind $widget \"\" {$widget insert insert \"}
| > bind $widget \"e {$widget insert insert \353}
| > }
| >
| > Unfortunately, this does not work, because $widget in the binding
| > script is evaluated at event time, not at bind time!
|
| proc 8bitbindings {widget} {
| bind $widget \" {set nop 0}
| bind $widget \"\" [list $widget insert insert \"]
| bind $widget \"e [list $widget insert insert \353]
| }
|
| Yeah, I stumbled on this when learning TCL, too :-)
From: dav...@london.sbi.com (David Artus)
| Long Pham (lp...@tern.NoSubdomain.NoDomain) wrote:
| : Hello tcl experts,
|
| : I'm a newbie in this area, and have a small problem while playing w/
| : tcl. So, be gentle.
|
| : Here is my problem:
| : I'm trying to create an n number of buttons, each button will call
| : a same proc upon activated. And an argument passes to the proc is
| : an variable, i.e. different value for different button. I want to
| : display the value of the argument that I passed to the proc everytime
| : I click on a button. However, the program doesn't work, it always
| : display the last value of the variable no matter what button I
| : clicked on.
|
| : set names {John Anne Mary Jim}
| : foreach b $names {
| :
| : button .$b -text $b -command {displayName $b}
| : pack .$b -side left -padx 3
|
| : }
|
| : proc displayName bname {
| : puts $bname
| : }
|
| your problem is -command {displayName $b}
|
| The {} defer evaluation of $b till the event occurs. At that time
| b is, of course, set to Jim. You probably mean -command "displayName $b"
| for the purposes of this playing, or in real life maybe some
| cunning way of extracting the button name, contents or whatever.
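A caveat on that advice: -command "displayName $b" substitutes at
creation time, but the resulting string is re-parsed into words when
the button fires, so a value containing spaces would still break it.
Building the command with [list ...] avoids both problems. A minimal
sketch of the loop above rewritten that way (the string tolower step
is an addition here, since Tk window names must begin with a lowercase
letter):

    proc displayName bname {
        puts $bname
    }

    set names {John Anne Mary Jim}
    foreach b $names {
        # Derive the widget path from a lowercased copy of the name.
        set w .[string tolower $b]
        # [list ...] freezes the current value of $b into the command
        # as a single well-formed word; {} would defer substitution.
        button $w -text $b -command [list displayName $b]
        pack $w -side left -padx 3
    }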
From: hfv...@bcarh80a.bnr.ca (Dan MacDonald)
|In article <1994Jul22.005941.19824@wmichgw>, 31ch...@wmich.edu writes:
||> Hello Everybody,
||>
||> I am a new user of tk/tcl. I have a question.
||>
|
|Your entire problem:
| Very simply, the value being passed to your procedure s1 is
| always '0'. This stems from the fact that when you created the
|
| button .watch22.yes -text YES \
| -command "s1 $foo_value"
|
| you specified the command (bind command) in quotes which performs
| substitutions when created. If you enclose the command in curly
| braces the substitution will be performed later. Although the (all)
| bind command operates at global scope, this is acceptable since the
| variable foo_value is a global variable - it must be because it
| to is "bound" by the scale widget to store the value.
|
|Try: button .watch22.yes -text YES -command {s1 $foo_value}
|
|Dan MacDonald
From: f...@cam-orl.co.uk (Frank Stajano)
| In article <303o17$4...@hydrox.cs.umd.edu>, nle...@cs.umd.edu (Nathan Lewis) writes:
| |> Hi everyone,
| |>
| |> I'm still new to tcl and I'm having some trouble. I'm trying to write lines to
| |> a file. There is one line where I want to write a bunch of things to the file
| |> literally but the problem is that I want to write $variable-name to a file
| |> literally and the interpreter keeps trying to evaluate the variable and I can't
| |> seem to get it not to.
| |>
| |>
| |> Thanks alot for any help.
| |>
| |> This is the code:
| |>
| |> button $w.ok -text "EXIT" -command " ;\
| |> puts \$datafile \"set precedence_relO \$precedence_rel\" ;\
| |> puts \$datafile \"\n\n\n\" ;\
| |> puts \$datafile \"set precedence_rel {\[lindex \\$precedence_relO 0\]}\" ;\
| |> This is the line ^ This is the variable ^
| |> puts \$datafile \"\n\" ;\
| |> puts \$datafile \"set Threads [list $Threads]\" ;\
| |> puts \$datafile \"\n\" ;\
| |> puts \$datafile \"set exclusion_rel [list $exclusion_rel]\" ;\
| |> exit ;\
| |>
|
| I must admit I haven't got a very clear idea of what you are trying to do, but
| if all you want is putting out stuff literally you could use curly brackets
| instead of quotes. If instead you want to have some of the things substituted
| (like the []'s) and some not (like the $'s) then I'd say your best bet is to
| make the command a separate procedure and build up the complex string in the
| procedure with "append". Otherwise it becomes a bit too complicated to do in one
| single command line, and the code becomes difficult to read and maintain.
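As a sketch of that suggestion (the variable names are taken from the
post above, saveAndExit is an invented helper, and the listed
variables are assumed to exist as globals):

    proc saveAndExit {datafile} {
        global precedence_rel Threads exclusion_rel
        set s ""
        # Literal text and substituted values are built up step by
        # step; [list ...] quotes each value so it reads back intact.
        append s "set precedence_rel [list $precedence_rel]\n"
        append s "set Threads [list $Threads]\n"
        append s "set exclusion_rel [list $exclusion_rel]\n"
        puts $datafile $s
        exit
    }

    button $w.ok -text "EXIT" -command [list saveAndExit $datafile]

The button command itself then needs no backslash-quoting at all.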
From: f...@cam-orl.co.uk (Frank Stajano)
| In article <ASSON.94J...@chacmool.stsci.edu>, as...@chacmool.stsci.edu (Drew Justin Asson) writes:
| [...]
| |> >proc foo {x} {
| |> > global Global_Var
| |>
| |> > if {$x == ""} {
| |> > return $Global_Var
| |> > } else {
| |> > return $x
| |> > }
| |> >}
| |>
| |>
| |> The only problem with this is that this will raise an error
| |> if no argument is provided. To do what you want, you would
| |> have to declare the argument list as:
| |>
| |> { {x ""} }
| |>
| |> Then, the code would work.
|
| You're absolutely right, thanks for pointing it out. (Using args instead of x is
| another option.)
|
| |> But, this isn't really what I wanted. I was more interested in
| |> doing something that we often do with LISP code. E.g.
| |>
| |> proc foo { {x minus-infinity} {y plus-infinity} } {
| |> ...
| |> }
| |>
|
| I can't think of a Tcl way of doing this sort of thing in the argument list. If
| someone finds one, I'd be curious to know about it.
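For what it's worth, proc does accept default values directly in the
argument list (the { {x ""} } form above is exactly that), so
something very close to the Lisp idiom seems to work -- a sketch, with
arbitrary sentinel strings as the defaults:

    proc foo { {x minus-infinity} {y plus-infinity} } {
        # x and y default to the literal strings above when omitted.
        puts "x=$x y=$y"
    }
    foo         ;# prints: x=minus-infinity y=plus-infinity
    foo 1       ;# prints: x=1 y=plus-infinity
    foo 1 2     ;# prints: x=1 y=2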
From: s...@cs.hut.fi (Sami-Jaakko Tikka)
| In <30jdao$8...@access3.digex.net> cob...@access3.digex.net (Cary B. O'Brien) writes:
| >Why does expr { fred < sam } fail,
| >When expr { "fred" < "sam" } succeeds.
|
| When expr sees strings without quotes, it tries to convert them to
| numbers.
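In other words:

    # Unquoted operands must parse as numbers, so this is an error:
    #     expr { fred < sam }
    # Quoted operands fall back to string comparison:
    puts [expr { "fred" < "sam" }]    ;# prints 1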
=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
== Niels Mayer -- netcom!mayer -- ma...@netcom.com ==
=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
+ It looks like a number of your users are also having trouble understanding
+ Tcl. In my opinion, this is because the quoting and parenthesization rules
+ are neither straightforward, simple, nor consistent. About once every week
+ or two, I see questions come up on comp.lang.tcl dealing with one or more
+ of the issues raised above. Examples of these problems have been
+ appended to the end of this article...
+ When users are having real problems, it becomes clear that the arguments
+ are no longer religious. In Lisp, you can get arbitrarily complex and
+ inscrutable once you go to extreme macro-ology and multiple levels of
+ backquoting. However, for the more typical cases handled by the neophyte,
+ the problems of quoting, time of binding, time of substitution, and scope
+ of bindings are more consistent and easier to understand in Lisp. And
+ when you need to, you know that there's a very nice macro capability
+ to back up any meta-level programming you need to accomplish.
As a computer scientist, I found Lisp beautiful and easy to learn.
But that doesn't mean I like programming in it. The heart of Lisp,
the list, models only a fraction of the algorithms and data structures
that I deal with. Everything else is 2nd class and dilutes the beauty
of the language. As soon as I wrote my first do-loop in Lisp, started
declaring my data structures, and explaining all sorts of things to
the compiler, the courtship was over. As I remember, macros were the
final straw. As you admit, they make the language inscrutable, and
since everyone uses them...
I am not defending Tcl. I am simply saying that I find your arguments
about Lisp contrived. There are obvious difficulties in Lisp to
beginners that you have conveniently forgotten. If as many people
were learning Lisp as were learning Tcl, I'm sure the number of
beginner-type questions to comp.lang.lisp would be just as high. I
actually went and looked to see what's happening in the newsgroups and
c.l.l has the same beginner questions that you can find in any
language newsgroup. Here's a classic sample:
>From: [name changed to protect the innocent]
>Subject: Book Recommendation????
>Message-ID: <2vo19p$o...@jadzia.CSOS.ORST.EDU>
>Date: 10 Jul 94 05:34:49 GMT
>
>I am new to the LISP language (studying for alomst a year) and have read
>a few books on LISP. I am interested in hard core (excuse the expression)
>programming using the LISP language. Most of the books that I have read
>have gotten confusing about halfway through. I know this may have been
>asked in the past but I would like to know what book(s) would be
>recommended for a beginner on LISP?? Any responses would be greatly
>appreciated...Thanks
This poor guy has been studying Lisp for a year, has read several
books on it and still considers himself "new to the language"?!
My point is, citing messages from confused novices doesn't prove the
language is difficult any more than messages from rabid experts prove
that the language is great.
Don Libes <li...@nist.gov>
|> Finally, I think that having language syntax depend on newlines is a bad
|> idea -- I think most unix shell programmers will grudgingly agree with me
|> on this. I'd much rather have a Lispish syntax where you delineate your
|> current top-level evaluations by [ ... ] and don't bother with newlines.
But Tcl was intended, and used by many of us, as a shell-like command
interface to applications and ALSO as an extension language. I think the
vast majority of users would be confused by a "SHELL" in which newline did
not terminate input to a command. It is in this respect that Tcl's very
simple syntax is just the job for an embedded language.
Paul Alexander
Department of Physics, University of Cambridge, Cambridge, UK
Lisp 1.5 perhaps, but things have moved on since then. Functions are
the heart of Scheme (I don't think Common Lisp has a heart :-)
... models only a fraction of the algorithms and data structures
that I deal with. Everything else is 2nd class and dilutes the beauty
of the language. ...
Are you implying that structures, vectors, ... etc. are all "2nd
class" in Common Lisp? Either way, what is your definition of "2nd class"?
Both have advantages and disadvantages: interpreted languages usually
allow dynamic loading in a simple fashion, i.e., they are good for
prototyping of complex user interfaces you couldn't easily do in a
compiled environment due to the time needed for edit-compile-run
cycles. Also, compiler-based languages require an entire restart of
the application (because the patched program is a new program), while
interpreted systems can handle updates on the fly. Compiler-based
languages are more efficient for numerical calculations and other
low-level operations.
Combining both approaches is what has been done in LISP. One of the
problems of LISP (still my favourite programming language), however,
is that it is a fully-fledged programming language with constructs a
little bit unfamiliar to most programmers. The price you have to pay
for an elegant language like LISP: it is...umm... different.
Tcl also tries to combine the two approaches by providing an
interpreted language which is extensible by C- or C++-written
commands. However, sacrificing a real type system (LISP has a full
type system) causes problems for certain applications: take an
object-oriented database, for example. Object types and ids cannot be
simply distinguished from strings which happen to look the same. Users
have to artificially add type markers because there are only strings
in Tcl. Syntactic features must be used to differentiate between
types.
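A sketch of what such artificial type markers look like in practice
(the oid: prefix convention here is invented for illustration):

    # With only strings available, a prefix convention is the usual
    # way to mark a value as an object id rather than ordinary text.
    proc isOid {v} {
        return [string match oid:* $v]
    }
    set obj oid:1234
    set txt "the string oid:1234"
    puts [isOid $obj]    ;# 1 -- tagged as an object id
    puts [isOid $txt]    ;# 0 -- but nothing enforces the convention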
Other than that, Tcl very much resembles a LISP with BASIC elements,
and turns out to be useful for most user interface designs, although
the purely syntactic treatment of types can cause weird situations with character
or expression quoting.
As for Tk, there are also advantages and disadvantages: First of all,
it is nice to have a toolkit of relatively elaborate widgets which is
as simple to manage as Tk. Tools like XDesigner are nice, but once you
go beyond what they provide as pre-manufactured widgets, you have to
go down into the basement and dig up those Motif manuals. Also, you
may find your actual data items nested into a four-level struct. The
problems with Tk come in where compute-intensive portions of a GUI
need to be implemented. Passing data between Tcl/Tk and a C/C++
application in the form of strings simply won't be efficient enough
for such applications.
This is the point where I wished I had a more layered system at my
disposal:
Level 0   Xlib and friends
Level 1   Tk with a C/C++ interface, pretty much like SunView had
Level 2   Tcl-binding for Tk
This way, Tk would be 100% separated from Tcl, also allowing other
languages to bind more easily to the nice widgets provided. In
particular, C/C++ interfaces would be able to directly access
functions to do their high-volume data output which would be too
time-consuming when done through Tcl strings.
In a way, the purely syntactic character of data types in Tcl wouldn't
be that much of an issue any more because the critical parts, where
type semantics is lost in string conversion, could be done directly in
C or C++, or any other language replacing the level-2 binding (LISP?).
How much of Tk would be available at level 2, and how types are
handled exactly, would be left to level 2. Tcl would be a simple but
type-wise restricted language, LISP could carry through all
information.
The ultimate question always is: "Is it useful for my application? If
not, what is?" We had a project at FhG-IAO, where the GUI design was
done with a compiler-based GUI designer. At some point, there was a
requirement to do splines. Due to problems implementing this quickly
under the GUI tool, we had to bring in UNIRAS, a huge, overkill
scientific visualization package. It sort of worked, but was an
incredible kludge to use that monstrous package only for the splines.
A little later, the entire GUI was redone with Tcl/Tk. Not only was the
development time (including the time to learn Tcl/Tk) of the entire
GUI much shorter, but the configurability and overall flexibility was
greatly enhanced.
If I compare the results (and efforts!) for some GUI designs, I'm
happy to have a tool at my disposal which allows me to quickly
prototype GUIs without bothering much about type information and
declarations. This tool may be Tcl/Tk, it may be a LISP-based
environment. If I only have to position a few buttons, choose colors,
and plug a simple database interface program into the resulting GUI, a
tool like XDesigner would be most welcome. At WESCON'94, I'll describe
some of the experience gained in this project and others in more
detail.
IMHO, some of the previous discussion, where religious arguments came
up, lacked this view.
--Juergen Wagner
Juergen...@iao.fhg.de
gan...@csli.stanford.edu
Fraunhofer-Institut fuer Arbeitswirtschaft und Organisation (FhG-IAO)
Nobelstr. 12c Voice: +49-711-970-2013
D-70569 Stuttgart, Germany Fax: +49-711-970-2299
For more information/detail, please do "finger gan...@csli.stanford.edu".
> I think the whole "popularity" line of reasoning is questionable: MS-DOS
> and MS-WINDOWS are more popular than Unix/X, but many of us have good
> reasons to keep working on Unix and X. Beyond that, if you're worried about
> popularity, both the Lisp and TCL/Tk community are no doubt insignificant
> when compared to, say, the Visual Basic community, or Actor, or other
> similar packages available for MS-WINDOWS.
This is true. If you don't use VB or VC++, there are a lot of people
who won't take you too seriously. Even Actor and Smalltalk/V get heavy
criticism, coz reviewers (I'm told that one of them was Ray Duncan) use
the "Hello, World" type app (App? What a joke.) as a benchmark. Using
code like that can make even VC++ look bad, but that doesn't stop
reviewers wetting themselves over it, while all they can say about
Lisp, Smalltalk and Actor is that you _can_ use them if you want to.
They don't say you _have_ to, which is what they're saying about VB
and VC++.
That may explain why I never see any job offers for Lisp or Smalltalk,
but I see many for C++ and VB. Perhaps I'm looking in the wrong places?
After all, I'm only looking in the mainstream computing press.
Language issues are irrelevant when you look at them like this. It's
too easy to say, "Pay me to use your favourite language, and I'll be
happy to install it and program with it."
--
Martin Rodgers, WKBBG, London UK AKA "Cyber Surfer"
If "One likes to believe in the freedom of email", email
clipper....@cpsr.org and tell them you oppose Clipper.
This is a shareware .signature -- please pass it on!
Well, I use a variety of tools; some use TCL/Tk, some don't. The two
most recent uses I have found for TCL/Tk were...
1) My real work (research) deals with mathematics. As part of this
research I found it necessary to write a Finite Element analysis
package. The package is written in C++ and I was quite content
to write a C++ "program" for each FEA "problem" that I wanted to
run. However after a while this got pretty tedious. Basically,
what it comes down to is the need for an input file which is
read interactively. Now I played around with some parser tools
to try to write a grammar for my input files. However, I kept
changing what kinds of elements I had and such and it got pretty
tedious. Then I decided to write the front-end in TCL. It
works great. Among the added features of using TCL...
* Expressiveness. I can use for loops and such rather than
explicit lists
* I get some added interactive capabilities. For example, in
addition to specifying the problem, I get a certain amount of
control over the number of time steps and such. I can stop halfway
through a simulation to examine some results
2) I'm currently working as a contractor for Ford Motor Company until
August. They had several text-based applications and they wanted
me to write some GUI stuff for them. I banged these GUIs together
pretty damn fast with the help of TCL/Tk. This has sparked a fair
amount of interest in TCL/Tk here in my department.
Now, I could have used anything I wanted for both of these projects
and I chose TCL/Tk because I felt they were the best tools for the
job. So far, I'm very happy with the results.
|>
|> Bill
|> --
|> Bill Janssen <jan...@parc.xerox.com> (415) 812-4763 FAX: (415) 812-4777
|> Xerox Palo Alto Research Center, 3333 Coyote Hill Rd, Palo Alto, CA 94304
|> URL: ftp://parcftp.parc.xerox.com/pub/ilu/misc/janssen.html
Michael Tiller
University of Illinois
By the way, 'Derive' is a successful <general-use application>. It's
written in Lisp.
bye. P.
----
Pierpaolo Bernardi (ber...@cli.di.unipi.it)
> Niels Mayer's recent post detailing the applications written
> wholly or partially in Lisp IMHO should end all talk of Lisp
> is Dead.
Lisp is not dead.
I think Lisp _and_ Tcl, and Perl etc. will have more and more fields of
application; it's just a matter of increasing CPU power and of the ratio of
hardware cost to engineer cost (is this a fact that we could all agree on? :-).
It seems that Unix, X11, Windows, Emacs, Gcc will stay written in C (with a
small interpreter when needed) for a while, but that a lot of other
applications will be better generated in Cobol by a 4GL or in C by Eiffel,
with the rest, the "glue" for integration, being Tcl or Lisp code.
For this reason, a religious war between Lisp and Tcl is pointless, they
will certainly *both* have more and more adepts, and C/Fortran/Cobol less.. at
least I hope.. :-)
Cheers,
Christophe.
= % ping elvis =
= elvis is alive =
Do you really think so? On my machine, only csh terminates input
with a newline.
sh, ksh, and bash will prompt for additional input if you haven't
closed your quoted expressions.
The secondary prompt makes it clear (to me at least) that additional
input is expected.
--
Don Bennett (415)390-2145
d...@sgi.com
Silicon Graphics
Thanks in advance
Mat
> Do you really think so? On my machine, only csh terminates input
> with a newline.
> sh, ksh, and bash will prompt for additional input if you haven't
> closed your quoted expressions.
% tcl
tcl>echo "hi there
=>foo"
hi there
foo
What's your point?
You can find STk 2.1 in the Scheme Repository:
ftp.cs.indiana.edu:/pub/scheme-repository/imp/STk-2.1.tar.gz
I believe this is the latest version of STk. If you just want info
without actually downloading the package, you might try sending mail
to Erick Gallesio (e...@unice.fr).
David Eby
de...@cs.indiana.edu
From: ber...@cli.di.unipi.it (Pierpaolo Bernardi)
Newsgroups: comp.lang.functional,comp.lang.misc,comp.lang.lisp
Followup-To: comp.lang.functional,comp.lang.misc,comp.lang.lisp
Date: 25 Jul 1994 15:39:35 GMT
Organization: Dipartimento di Informatica, Universita' di Pisa
Lines: 23
Distribution: world
Thomas Lawrence (lawr...@cesn2.cen.uiuc.edu) wrote:
: In article <ala-2107941905530001@el_diente.cs.colorado.edu> a...@cs.colorado.edu (Adam Alpern) writes:
: >In article <LORD.94Ju...@x1.cygnus.com>, lo...@x1.cygnus.com (Tom Lord) wrote:
: >;; Right. No general-use application written in lisp will ever fly.
: Er... perhaps you missed his signature? ;-)
By the way, 'Derive' is a successful <general-use application>. It's
written in Lisp.
Not to speak of Macsyma. And... has anybody looked beneath the surface
of Mathematica?
Happy Lisping
--
Marco Antoniotti - Resistente Umano
-------------------------------------------------------------------------------
Robotics Lab | room: 1220 - tel. #: (212) 998 3370
Courant Institute NYU | e-mail: mar...@cs.nyu.edu
...e` la semplicita` che e` difficile a farsi.
...it is simplicity that is difficult to make.
Bertholdt Brecht
> As a computer scientist, I found Lisp beautiful and easy to learn.
> But that doesn't mean I like programming in it. The heart of Lisp,
> the list, models only a fraction of the algorithms and data structures
> that I deal with. Everything else is 2nd class and dilutes the beauty
> of the language. As soon as I wrote my first do-loop in Lisp, started
> declaring my data structures, and explaining all sorts of things to
> the compiler, the courtship was over. As I remember, macros were the
> final straw. As you admit, they make the language inscrutable, and
> since everyone uses them...
Which Lisp dialect was this? Is it one with structures? Arrays?
Not everything is a list. Some things are even symbols.
> This poor guy has been studying Lisp for a year, has read several
> books on it and still considers himself "new to the language"?!
I still consider myself new to C, even tho I first used it in the
early 80s. Now I'm learning C++, which looks even worse. Most of
what I consider "learning a language" to be is more than the syntax
and the semantics. It's more like style and experience.
> My point is, citing messages from confused novices doesn't prove the
> language is difficult any more than messages from rabid experts prove
> that the language is great.
Agreed. Languages are just different. I love an exchange of opinions,
but please, _please_ let's not try to "prove" anything?
|> I think Lisp _and_ Tcl, and Perl etc.. will have more and more fields of
|> applications, it's just a matter of increasing CPU power and of ratio
|> hardware cost vs engineer cost (is this a fact that we could all agree on? :-).
But Lisp could have better syntax. On the other hand, all these other
little languages could have better semantics. For some time it has been
a pet peeve of mine that people inventing little languages, with their
syntax appropriate-to-task/audience, seem to feel it is necessary to badly
re-invent execution semantics. It-would-be-nice if people would take a
little bit of trouble to layer their new syntax on top of (for instance)
Scheme, or an interpreter built in that (continuation-passing) style (it's
NOT HARD, DAMMIT!)
And, there are those little languages with inferior syntax, which could just
as well be replaced by Scheme with a few appropriate primitives. Here, I'm
thinking of the "languages" used to program sendmail and adb.
Then again, it seems to be the norm in Computer "Science" to step on toes
instead of stand on shoulders. Carry on.
David Chase, speaking for myself
Thinking Machines Corp.
>In article <310mbn$h...@serra.unipi.it> ber...@cli.di.unipi.it (Pierpaolo Bernardi) writes:
> : In article <ala-2107941905530001@el_diente.cs.colorado.edu> a...@cs.colorado.edu (Adam Alpern) writes:
> : >In article <LORD.94Ju...@x1.cygnus.com>, lo...@x1.cygnus.com (Tom Lord) wrote:
> : >;; Right. No general-use application written in lisp will ever fly.
> : Er... perhaps you missed his signature? ;-)
> By the way, 'Derive' is a successful <general-use application>. It's
> written in Lisp.
>Not to speak of Macsyma. And... has anybody looked beneath the surface
>of Mathematica?
I heard Mathematica is written in Objective-C, though I wouldn't be surprised
if much of it is written in Mathematica.
Amen. Preach it, brother!
In addition to "unix" languages, the Matlabs and Maples of the world
could use a reasonable -- or at least sane -- execution semantics.
--
John Baugh
j...@eos.ncsu.edu
I like TCL fine, but I strongly disagree with this. A lot of Lisp's strengths
are *due* to its unconventional syntax (as is the case with Forth, SmallTalk,
Logo, PostScript, and, yes, TCL).
I just wish Lisp didn't have all these *other* syntaxes for arrays and macros
and everything else layered on top of it. I want Lisp 1.5, damnit.
(Peter shows his age again)
Scheme looks better, these days.
> For this reason, a religious war between Lisp and Tcl is pointless, they
> will certainly *both* have more and more adepts, and C/Fortran/Cobol less.. at
> least I hope.. :-)
Agreed. Nobody will ever "prove" one language is "better" than another.
All they'll do is convince people (like me) that there are a few more
fanatics that support language X than language Y. That's hardly an
attractive feature.
Neither will I. I had plans to post a major treatise--
well, at least a hundred lines of NetNews prose--but I'm
facing up to the reality that I have other priorities.
Here are the highlights for me:
1. lots of what we get with RT we could get other
ways, and there's plenty of research left to do
on what this means; but
2. some of the things that are good to have from
wherever they come are
a. reduction of overspecification: this
is just good engineering;
b. parallelization, which I see as the
most interesting of the program
transformations that become possible; and
c. correctness proofs, which have much
different consequences when automated,
and when done by hand, but which can
become important in either case.
--
Cameron Laird
cla...@Neosoft.com (claird%Neoso...@uunet.uu.net) +1 713 267 7966
cla...@litwin.com (claird%litwi...@uunet.uu.net) +1 713 996 8546
A neat experimental language that is in some ways similar to lisp is
Icon. It is built on the concepts of success/failure (instead of
true/false boolean values), generators and control backtracking. It is
also a very functional language; about the only lisp-like features it
is missing are lambda expressions, quote, and a simple syntax
(although all expressions except for assignments can be expressed in a
form that is very similar to lisp's s-expressions). Its greatest
strength lies in text handling, though it is by no means a specialized
language. It also has most of the types that lisp has and there is a
[relatively] simple way to write variants of it (there's an
object-oriented variant for example).
Icon also has a graphics library that makes graphics programs written
in it fairly portable across platforms and OSes.
Icon can be obtained from ftp://cs.arizona.edu/icon/ and the newsgroup
for it is comp.lang.icon.
Nick
Actually, a religious war needs no point.
--
The more I get to know people, the more I like my dog.
Wolfgang
--
Wolfgang Lux
WZH Heidelberg, IBM Germany IBM IP-Net: l...@rio.heidelbg.ibm.com
+49-6221-59-4546 VNET: LUX at HEIDELBG
+49-6221-59-3300 (fax) EARN: LUX at DHDIBMIP
As Michael Tiller said, one thing TCL/Tk is excellent at is quickly
implementing small GUIs --- particularly GUI front-ends to non-GUI
applications. And that's an excellent model for general programming; don't
put any substantial, complex functionality into a GUI program if you can
help it; instead, write substantive command-line-driven programs, and GUI
front-ends. Improves code-reusability, modularity, regression testing, and
so on.
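A minimal Tcl/Tk sketch of that division of labor, with sort standing
in for any substantive command-line program:

    # The external program does the real work; the GUI only collects
    # input and displays the result.
    entry .in
    button .run -text "Run" -command {
        # exec invokes the command-line tool; catch keeps a failing
        # subprocess from taking the GUI down with it.
        if {[catch {exec sort << [.in get]} result]} {
            set result "error: $result"
        }
        .out configure -text $result
    }
    label .out
    pack .in .run .out -side top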
What's more, it helps set a precedent. As recently as a few years ago,
people were willing to stand up in public and claim that C (or C++) was the
only acceptable application development language. Within the last couple of
years Perl has forced its way in as another acceptable language, but somehow
most of those same people now want to try to say that C (or C++) and Perl
are the only two acceptable languages. So write your next application in
TCL/Tk. Then the one after that in Python. Then find a little Scheme, add
some dynamic loading, and create your own language for the next one. Use a
different language for every project until they get the point that choosing
the right language for the job is part of the design process, and can be
revisited _ab_initio_ for each new design.
-Bennett
b...@sbi.com
The whole point of Lisp syntax is that it is trivial, and doesn't require
a full-blown parser to interpret. Unfortunately, Common Lisp's parameter
syntax completely blew that particular advantage.
Improving the 'syntax' of a language is a little like climbing a tree
to make the first step towards the moon. It doesn't accomplish very
much, but it's easy and accessible and allows one to feel superior to
those still standing on the ground. In most cases, people would be
better off dispensing with character-based languages completely (e.g.,
for graphical languages), rather than wasting everyone's time creating
additional parsing and training nightmares.
Is the syntax of Lotus 1-2-3 macros really that brilliant, that it deserves
all of the brainpower wasted on it?
After tripping over all the minor syntactic incompatibilities of Fortran,
PL/I, Pascal, C, Ada, ..., I'm sick to death of the whole syntactic mess.
Proponents of syntactic sugar keep claiming their languages are more
'readable', but this presumes that users want to waste time reading
their silly 500-page manuals in the first place. I'm sure that
Persian and Finnish are 'readable' to their natives, too, but that
doesn't mean that they are 'readable' to anyone else.
>On the other hand, all these other
>little languages could have better semantics. For some time it has been
>a pet peeve of mine that people inventing little languages, with their
>syntax appropriate-to-task/audience, seem to feel it is necessary to badly
>re-invent execution semantics. It-would-be-nice if people would take a
>little bit of trouble to layer their new syntax on top of (for instance)
>Scheme, or an interpreter built in that (continuation-passing) style (it's
>NOT HARD, DAMMIT!)
Agreed, but I think that Lisp people have ignored for too long what the
users really want/need. Some of the things they want:
* the advantages of an integrated system with both compiled & interpreted code
* the ability to construct strict (file) and lazy (Unix stream)
compositions of existing functions
* the ability to explicitly control resource usage (GC hides too much)
without creating dangling references
* the ability to encapsulate nearly anything within a protocol-conversion
wrapper
* a powerful defaulting mechanism that isn't based on _state_ and
_state variables_. (This is probably one of the most important
problems facing the functional language community.)
>And, there are those little languages with inferior syntax, which could just
>as well be replaced by Scheme with a few appropriate primitives. Here, I'm
>thinking of the "languages" used to program sendmail and adb.
>
>Then again, it seems to be the norm in Computer "Science" to step on toes
>instead of stand on shoulders. Carry on.
Progress in most fields requires a throwing away of paradigms that
bind and constrict. Perhaps it is time to stop teaching students how
to parse, so they can focus on more worthwhile pursuits?
>>But Lisp could have better syntax.
>The whole point of Lisp syntax is that it is trivial, and doesn't require
>a full-blown parser to interpret. Unfortunately, Common Lisp's parameter
>syntax completely blew that particular advantage.
I agree. Common lisp is hopeless. Its standard manual is longer than that
of _every_ _other_ programming language and its syntax diverged from the
lisp ideal since the introduction of dotted-pairs and ' for quoting. Perhaps
ideal lisp is not powerful or convenient enough for a real language.
>Improving the 'syntax' of a language is a little like climbing a tree
>to make the first step towards the moon. It doesn't accomplish very
>much, but it's easy and accessible and allows one to feel superior to
>those still standing on the ground.
Language syntax is related to style, in the graphic arts sense. It is
important for readability and can guide programming methodology. It helps
programmers remember language syntax and provides historic continuity.
Ideal-Lisp syntax may be trivial but it is not very nice looking. Why do
you think that algorithms in every non-lisp computer journal are printed in
pseudo-Algol-60?
>Is the syntax of Lotus 1-2-3 macros really that brilliant, that it deserves
>all of the brainpower wasted on it?
I don't know what you mean by this. Like lisp, Lotus 1-2-3 macro syntax is
trivial, but also like lisp, it's terrible looking.
>After tripping over all the minor syntactic incompatibilities of Fortran,
>PL/I, Pascal, C, Ada, ..., I'm sick to death of the whole syntactic mess.
Lisp isn't so wonderful in this respect either. Although the syntax, in the
strictest sense of the word, is trivial, the language is big, redundant and
non-uniform.
For example, 'setf' takes a list of pairs which don't need to be enclosed:
(setf a 10 b 20 ... ), but then structures, which could also have this
syntax, require parentheses around their pairs: (defstruct foo (a 3) (b 4)...).
Then there's: set, setq and setf; eq, eql and equal; and let, let*, letrec
and declare. Not to mention: cond and if; do, do*, and dotimes; and
array-dimension, initialize-instance, multiple-value-bind and
with-input-from-string.
>Proponents of syntactic sugar keep claiming their languages are more
>'readable', but this presumes that users want to waste time reading
>their silly 500-page manuals in the first place.
Which programming language has a 500-page manual? Only common lisp. Even
the manuals for PL/I, Ada and Algol-68 are shorter.
>I'm sure that
>Persian and Finnish are 'readable' to their natives, too, but that
>doesn't mean that they are 'readable' to anyone else.
I suppose you'd like them to convert to Esperanto.
>>On the other hand, all these other
>>little languages could have better semantics. For some time it has been
>>a pet peeve of mine that people inventing little languages, with their
>>syntax appropriate-to-task/audience, seem to feel it is necessary to badly
>>re-invent execution semantics. It-would-be-nice if people would take a
>>little bit of trouble to layer their new syntax on top of (for instance)
>>Scheme, or an interpreter built in that (continuation-passing) style (it's
>>NOT HARD, DAMMIT!)
>Agreed, but I think that Lisp people have ignored for too long what the
>users really want/need. Some of the things they want:
I agree too, most languages could use better semantics. But they don't need
lisp's syntax.
>>And, there are those little languages with inferior syntax, which could just
>>as well be replaced by Scheme with a few appropriate primitives. Here, I'm
>>thinking of the "languages" used to program sendmail and adb.
I don't see what's so great about scheme. Constructs such as
(call-with-current-continuation name (body)) certainly don't encourage the
use of its powerful semantic capability.
The major semantic paradigms present in scheme/lisp are:
1 Closures
2 Loops through tail-recursion and lambda functions
3 Type inference
4 Self-representation through common list syntax
1 is the most important one, and is syntactically possible with other
languages. Most functional languages have closures. Other static languages
don't have them because they do not sit well with stack-based local
variables. For a non-lisp and non-functional language with closures, try my
language Ivy.
2 and 3 are not necessary, hurt readability and can be considered semantic
sugar.
4 is the only one which is really unique to lisp. It's made redundant by 1
though, so I don't miss it.
--
/* jha...@world.std.com (192.74.137.5) */ /* Joseph H. Allen */
int a[1817];main(z,p,q,r){for(p=80;q+p-80;p-=2*a[p])for(z=9;z--;)q=3&(r=time(0)
+r*57)/7,q=q?q-1?q-2?1-p%79?-1:0:p%79-77?1:0:p<1659?79:0:p>158?-79:0,q?!a[p+q*2
]?a[p+=a[p+=q]=q]=q:0:0;for(;q++-1817;)printf(q%79?"%c":"%c\n"," #"[!a[q-1]]);}
In article <Ctu43...@world.std.com>, jha...@world.std.com (Joseph H Allen) writes:
> The major semantic paradigms present in scheme/lisp are:
> 1 Closures
> 2 Loops through tail-recursion and lambda functions
> 3 Type inference
> 4 Self-representation through common list syntax
> 1 is the most important one, and is syntactically possible with other
> languages. Most functional languages have closures. Other static languages
> don't have them because they do not sit well with stack-based local
> variables. For a non-lisp and non-functional language with closures, try my
> language Ivy.
1) ML also meets this criterion. It is an impure functional language
(meaning it has assignment which is all you need to program as you
would in C), and has a pleasant non-lisp syntax.
> 2 and 3 are not necessary, hurt readability and can be considered semantic
> sugar.
2) On the contrary, using tail-recursion allows one to use the return
values of each iteration to accumulate final values. (Mail me if you
want an example.) This is not possible with WHILE or FOR loops in
C/Pascal. Although i admit that the paren-syntax is annoying, i
still prefer to program loops with tail recursion.
3) Type inference in Lisp? This is new to me. (Someone please
enlighten me here.)
> 4 is the only one which is really unique to lisp. It's made redundant by 1
> though, so I don't miss it.
4) But this is the ultimate reason why LISP is still indispensably
unique. Lisp's macros and ultra-flexible syntax have made it the
hot-bed of language research for many many years and continues to play
that role today. (Consider MOP, CLOS, CL Condition System, and
syntax-case for Scheme.) It's not made redundant by closures at all.
You still need the macros which can manipulate code as data
self-referentially.
Even so, i will concede without prompting, that if one is not
interested in either higher-order functions or language
experimentation, the lisp parentheses become only a nuisance and
Common Lisp's size just an obstacle to delivery.
rodrigo vanegas
r...@cs.brown.edu
I think a quick perusal would show that most languages are happy with syntax
_trees_. Lisp S-expressions are general trees. QED
>>Improving the 'syntax' of a language is a little like climbing a tree
>>to make the first step towards the moon. It doesn't accomplish very
>>much, but it's easy and accessible and allows one to feel superior to
>>those still standing on the ground.
>
>Language syntax is related to style, in the graphic arts sense. It is
>important for readability and can guide programming methodology. It helps
>programmers remember language syntax and provides historic continuity.
God save us from language designers who want to impose their own personal
styles on all their users. The problem is that programming styles change
rather more rapidly than languages do, leaving the programs and programmers
of the 'I know what's best for you' languages high and dry.
>Ideal-Lisp syntax may be trivial but it is not very nice looking. Why do
>you think that algorithms in every non-lisp computer journal are printed in
>pseudo-Algol-60?
And how nice looking is the numerical program in pseudo-Algol that has
to first symbolically differentiate its functional argument, before
compiling and executing it?? Lisp syntax is ideal for programs which
must analyze and/or compose other programs. Since this is one of the
most powerful software levers available, languages which make this
difficult are guaranteeing poor productivity.
As to the pseudo-Algol-60 algorithms in computer books and
journals--it's truly a shame that Knuth never really got into Lisp.
His books would be _more_ readable, and TeX might actually have been a
decent language. Sigh...
>Lisp isn't so wonderful in this respect either. Although the syntax, in the
>strictest sense of the word, is trivial, the language is big, redundant and
>non-uniform.
>
>For example, 'setf' takes list of pairs which don't need to be enclosed:
>(setf a 10 b 20 ... ), but then structures, which could also have this
>syntax, require parenthesis around its pairs: (defstruct foo (a 3) (b 4)...).
>
>Then there's: set, setq and setf; eq, eql and equal; and let, let*, letrec
>and declare. Not to mention: cond and if; do, do*, and dotimes; and
>array-dimension, initialize-instance, multiple-value-bind and
>with-input-from-string.
Touche' ! (I'll have to send you a copy of my article "If it ain't Baroque,
fix it!")
>>Proponents of syntactic sugar keep claiming their languages are more
>>'readable', but this presumes that users want to waste time reading
>>their silly 500-page manuals in the first place.
>
>Which programming language has a 500-page manual? Only common lisp. Even
>the manuals for PL/I, Ada and Algol-68 are shorter.
The PL/I manual that I used was many more than 500 pages. Algol-68 is
shorter, but only because it appears to have been LZ-compressed and
therefore incomprehensible. The Ada83 manual is shorter, but you then need
the Commentaries and the Congressional Record to figure out what the manual
says.
Look at the computer books in your nearby Bookstar. Of course, the chatty
style, the double spacing, and those silly small-format pages do a lot to
create 500++ page books.
>The major semantic paradigms present in scheme/lisp are:
>
>1 Closures
>2 Loops through tail-recursion and lambda functions
>3 Type inference
>4 Self-representation through common list syntax
>
>1 is the most important one, and is syntactically possible with other
>languages. Most functional languages have closures. Other static languages
>don't have them because they do not sit well with stack-based local
>variables. For a non-lisp and non-functional language with closures, try my
>language Ivy.
>
>2 and 3 are not necessary, hurt readability and can be considered semantic
>sugar.
If you think that anonymous lambda functions _hurt_ readability, I'd like to
sell you some continuation-passing code which is written in Pascal/Ada style.
Now _that_ really hurts the eyes.
??????????? 'Type inference' _hurt_ readability?? Perhaps you meant type
declarations?
I don't know Ada, but I agree, emulating continuations with Pascal is awful
(you have to pass a record via a global variable). Now if Pascal had
continuations, (and function pointers) it would be fairly nice, even without
lambda functions:
function foo():function;
var ...
   (* nested functions are declared before the enclosing body *)
   function bar(x:real):real;
   begin
   ...
   end;
begin
... (* code which sets vars *)
some_procedure(bar); (* call with continuation *)
foo:=bar; (* return with continuation *)
end;
>??????????? 'Type inference' _hurt_ readability?? Perhaps you meant type
>declarations?
IMHO, except for trivial programs, declarations improve readability because
you can easily figure out what is stored in variables and arguments and
don't have to trace constants or depend on operators not being overloaded.
You can jump into the middle of a program and start modifying it without
comprehending all of it. It makes your programs less general perhaps, but
definitly more readable. Likewise, structures (records) are better than
lists and tuples.
>Finally, this long thread has reached the heart of what i believe to
>be the principal reason why we can't do without lisp.
>In article <Ctu43...@world.std.com>, jha...@world.std.com (Joseph H Allen) writes:
>> The major semantic paradigms present in scheme/lisp are:
>
>> 1 Closures
>> 2 Loops through tail-recursion and lambda functions
>> 3 Type inference
>> 4 Self-representation through common list syntax
>> 1 is the most important one, and is syntactically possible with other
>> languages. Most functional languages have closures. Other static languages
>> don't have them because they do not sit well with stack-based local
>> variables. For a non-lisp and non-functional language with closures, try my
>> language Ivy.
>1) ML also meets this criterion. It is an impure functional language
>(meaning it has assignment which is all you need to program as you
>would in C), and has a pleasant non-lisp syntax.
Yeah, ML's pretty nice.
>> 2 and 3 are not necessary, hurt readability and can be considered semantic
>> sugar.
>2) On the contrary, using tail-recursion allows one to use the return
>values of each iteration to accumulate final values. (Mail me if you
>want an example.) This is not possible with WHILE or FOR loops in
>C/Pascal.
Yes it is:
while(!context.done) context=function(context);
or if you have continuations (or are using global variables):
declare context;
function()
{
operate on context;
}
while(!done) function();
Any function which is strictly tail recursive can be directly expanded this
way. More general recursive functions can't:
LIST slow_reverse(LIST a)
{
    LIST b=a->next;
    /* not a tail call: append() runs after the recursive call
       returns, so there is no direct loop rewrite */
    if(b) return a->next=0, append(slow_reverse(b),a);
    else return a;
}
IMHO, loop structures should be used in place of tail recursion for
readability. Recursion should, of course, still be possible.
>3) Type inference in Lisp? This is new to me. (Someone please
>enlighten me here.)
It's not guaranteed to infer the type of everything, but lisp optimizing
compilers sure try. The more that can be inferred, the more that can be
optimized.
>> 4 is the only one which is really unique to lisp. It's made redundant by 1
>> though, so I don't miss it.
>4) But this is the ultimate reason why LISP is still indispensably
>unique. Lisp's macros and ultra-flexible syntax have made it the
>hot-bed of language research for many many years and continues to play
>that role today. (Consider MOP, CLOS, CL Condition System, and
>syntax-case for Scheme.) It's not made redundant by closures at all.
>You still need the macros which can manipulate code as data
>self-referentially.
I should have said 'nearly redundant' :-)
Simple macros can be more powerful in lisp than in any other language and
rather easily lead to parameterized classes and other syntax augmentations.
You can also do neat things like easily take the derivatives of mathematical
functions expressed in lisp. But then I don't want to express my functions
in lisp :-) A simple precedence parser is only about a hundred lines long in
any other language, so it's not that big of a deal to use lists for
representation, and infix/prefix for data entry and display.
>Even so, i will concede without prompting, that if one is not
>interested in either higher-order functions or language
>experimentation, the lisp parentheses become only a nuisance and
>Common Lisp's size just an obstacle to delivery.
Lisp is a neat idea, but too gross to use in practice.
All expressions in Icon produce results (i.e. there is no such thing as
a void-typed function as in C or CL). It also has apply/funcall-like
expressions. Pointer semantics abound and there is no way to pass
pointers-to-pointers to functions (i.e. no quote function). All of which
makes Icon a heck of a lot more of a strict functional language than is
CL, though it does have variable assignment expressions, as well as
destructive functions and though it does not have lambda expressions. It
does have a feature that comes close to lambda expressions and which its
authors call "co-expressions;" these co-expressions are co-routines that
can be created more or less dynamically from regular Icon expressions.
Perhaps you should take a look at the introduction to Icon that can be
found in cs.arizona.edu:/icon/doc/tr-90.6.doc (also tr-90.6.ps.Z). It's
quite short...
>Wolfgang
>--
>Wolfgang Lux
>WZH Heidelberg, IBM Germany IBM IP-Net: l...@rio.heidelbg.ibm.com
>+49-6221-59-4546 VNET: LUX at HEIDELBG
>+49-6221-59-3300 (fax) EARN: LUX at DHDIBMIP
Nick
Aha! We've finally smoked you out! What you are complaining about is
the lack of a good code browser which would tell you what a compiler
knows about the various constructs that it sees. Rather than have to
specify all this redundancy every time I write a program, wouldn't it
be better to have the program tell me what it thought? This would be
a much better tradeoff in terms of my time vs. the computer's time.
For what it's worth, such a code browser is much easier to write, due
to (you guessed it) Lisp's trivial syntax.
CommonLisp is not particularly complicated by modern standards. Most
of CLtL/2 describes what in other languages is part of a separate
library standard. And CLtL/2 describes a language that is
substantially simpler than, say, C++ and several other successful
modern languages.
Still, CommonLisp probably should be simpler...
Thomas.
Type inference is definitely not a "semantic paradigm" of Scheme or
Lisp. Type inference is mainly used in statically typed functional
programming languages like SML and Haskell. Scheme and Lisp use
dynamic typing.
Thomas.
In article <RV.94Au...@tahoe.cs.brown.edu>,
rodrigo vanegas <r...@cs.brown.edu> wrote:
>Finally, this long thread has reached the heart of what i believe to
>be the principal reason why we can't do without lisp.
>In article <Ctu43...@world.std.com>, jha...@world.std.com
(Joseph H Allen) writes:
>> The major semantic paradigms present in scheme/lisp are:
>
>> 1 Closures
>> 2 Loops through tail-recursion and lambda functions
>> 3 Type inference
>> 4 Self-representation through common list syntax
>> 1 is the most important one, and is syntactically possible with other
>> languages. Most functional languages have closures. Other
>> static languages
>> don't have them because they do not sit well with stack-based local
>> variables. For a non-lisp and non-functional language with
>> closures, try my
>> language Ivy.
>1) ML also meets this criterion. It is an impure functional language
>(meaning it has assignment which is all you need to program as you
>would in C), and has a pleasant non-lisp syntax.
Yeah, ML's pretty nice.
Except when you FORGET parentheses! :-) The operator precedence rules
are so contrived that you end up stuffing the code with parentheses
just to play it safe. :-)
>> 2 and 3 are not necessary, hurt readability and can be
>> considered semantic
>> sugar.
...
IMHO, loop structures should be used in place of tail recursion for
readability. Recursion should, of course, still be possible.
It is the nature of data that makes recursion more suitable than loops
and vice versa.
...
I should have said 'nearly redundant' :-)
Simple macros can be more powerful in lisp than in any other language and
rather easily lead to parameterized classes and other syntax augmentations.
You can also do neat things like easily take the derivatives of mathematical
functions expressed in lisp. But then I don't want to express my functions
in lisp :-) A simple precedence parser is only about a hundred lines long in
any other language, so it's not that big of a deal to use lists for
representation, and infix/prefix for data entry and display.
Have you tried the 'infix.lisp' package from the AI-repository?
>Even so, i will concede without prompting, that if one is not
>interested in either higher-order functions or language
>experimentation, the lisp parentheses become only a nuisance and
>Common Lisp's size just an obstacle to delivery.
Lisp is a neat idea, but too gross to use in practice.
As is ML, Haskell, Miranda, Prolog and I would add FORTRAN, COBOL,
PL/I and, last but not least, C/C++.
So much for the daily religion wars :)
ESL itself supports automatic memory allocation for text objects,
arrays, and associative arrays. Assoc's can also be either in-core or
bound to ndbm files without any modification to their usage. Scripts
can also be "compiled" into an architecture neutral format and shiped
onto any box running the ESL interpreter (bit like forth in that regard
- or so I'm told).
There are currently bindings provided for SNMP and also Mini SQL. It
was originally written for use in network management.
I may be biased (probably am since I wrote it :-) but it's a nice
language if you're a C programmer. If you want to have a look at it, it's
available from ftp://Bond.edu.au/pub/Bond_Uni/Minerva/esl. Like I said,
the Motif support will appear in the next release.
__ David J. Hughes - Ba...@Bond.edu.au
/ \ / / / http://Bond.edu.au/People/bambi.html
/___/ __ _ ____/ / / _
/ \ / \ / \ / / / / / \ / Senior Network Programmer, Bond University
\___/ \__// / \__/ \__/ / / / Qld. 4229 AUSTRALIA (+61 75 951450)
Whenever a parenthesized phrase occurs at the end of another,
use a slash instead of parenthesizing it, thus:
(abf gut ( hnniuf slks))
becomes
(abf gut / hnniuf slks)
this gets rid of most of the parenthesis clusters in Lisp.
It makes it easy to use "if" instead of "cond",
or a "let" without a list, because many such constructs naturally
nest in their last argument.
P.S. Another candidate for a "/" is a semicolon, but that looks better at
the end of a line than at the beginning:
( if onions bagels
/ if cows milk
/ let horse pony
/ list onions cows horse
)
( if onions bagels ;
if cows milk ;
let horse pony ;
list onions cows horse
)
In my experience, with this feature, most Lisp read-macros
become unnecessary, and much syntactic sugar likewise.
hendrik.
I now believe that real users don't care about the language itself. They
are only concerned with the *ENVIRONMENT* available with it (libraries,
etc.). They don't want to be bothered with HOW to do it, but with WHAT
they can do with it.
Replace emacs lisp by TCL and people will continue to use it.
Replace Tcl (in Tk) by another language and people will use it (Stk...).
(I think Tk's success has a lot to do with the canvas widget,
allowing whole applications to be built entirely in tcl.)
But language implementors (myself included) have the dream of solving user
problems just by the elegant design of their core interpreter...
PS: the lessons gained with wool led me to make Wool2, which is the heart
of the biggest software product (millions of lines of code) of Bull, a
40,000-person computer vendor. So there *ARE* real applications using
lisp dialects. But because of the US's irrational lisp-fear, one now has
to never admit that the software you sell has parentheses inside :-)
--
Colas Nahaboo, Koala (Bull Research)
Mosaic Info: <A HREF="http://zenon.inria.fr:8003/koala/colas.html"></A>
Oh, I think you're wrong... Or at least not fully right.
> Replace emacs lisp by TCL and people will continue to use it
Because most people don't code things longer than ten lines in elisp.
> Replace Tcl (in Tk) by another language and people will use it (Stk...)
> (I think Tk's success has a lot to do with the canvas widget,
> allowing whole applications to be built entirely in tcl.)
Why would I use Stk instead of Tcl if I didn't care which language I
programmed in? (Tcl is an odd language...)
Mathematica can do all sorts of wonderful things, but I have to be
chained down to use it --- the language is barely structured, and horribly
complex. It can make a simple program seem like a monstrosity and
totally unreadable two days after you've coded it.
Have you ever had the misfortune to be forced to code an application
in some xbase language?
I recognize that these examples are extreme, but that's partially
because I have yet to find a language I'm even reasonably satisfied
with. Some languages bug me less than others.
-Randy
> IMHO, except for trivial programs, declarations improve readability because
> you can easily figure out what is stored in variables and arguments and
> don't have to trace constants or depend on operators not being overloaded.
> You can jump into the middle of a program and start modifying it without
> comprehending all of it. It makes your programs less general perhaps, but
> definitely more readable.
But I want this to work (yes, this is a trivial program, but it could
be part of a non-trivial one, and having to put declarations in at
that level is a serious pain):
(defun foo (n)
  (declare (fixnum n))
  (dotimes (i n ...)
    <use i, compiler knows i is a fixnum>
    ...))
> Likewise, structures (records) are better than
> lists and tuples.
Which is why people use them in Lisp of course.
--tim
Which is silly, since 90% of Tcl syntax is taken from Lisp, except it uses
{} instead of (). The semantics are quite different, though apart from one
feature Lisp's semantics are better. That one feature? String handling.
The ability to embed variables in unformatted strings is tremendously
popular. Look at the lengths people go through to force code into the
"$this $that" style instead of [list $this $that]. The format command is
also a great tool.
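That is:

    set this Hello
    set that world
    puts "$this, $that!"                   ;# variables embed in the string
    puts [format "%s, %s!" $this $that]    ;# the same result via format
    puts [list $this $that]                ;# a proper two-element list instead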
The actual commands in Tcl are slightly more regular than in most lisps:
quoting conventions are more regular, and the naming is less haphazard, but
that's mostly a "new broom" phenomenon. Lisp has a lot of baggage here.
The best extension language I ever used was, like Tcl, a cross between Lisp
and a string-oriented shell. It was closer to Lisp though... but abandoned
all the old Lisp command name spam. It was the original extension language
for Brief. I understand it's been replaced by something C-style. Sigh... C
is not a good design for an interactive interpreted language, 'cos it's got
no hooks for operating on code.
The semantics of TCL are truly mind-boggling. I was using it to build
a gui front-end to something (under the dictates of portability).
Suffice to say, I ended up using a couple of global variables to pass
things around.
In fact, I highly recommend trying to write a nontrivial program in
TCL, as a rather enlightening experience. You should particularly try
to use 'uplevel' and 'upvar'. You should also pass associative arrays
around a lot. Try passing closures as the 'action' field of a pushbutton.
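For contrast, here is a minimal sketch of that last exercise in a
Lisp-family extension language. MAKE-BUTTON is hypothetical (it is not any
particular toolkit's API); the point is that the closure captures its own
state, so no globals, uplevel, or upvar are needed.

;; MAKE-BUTTON is a hypothetical constructor taking an :action callback.
(let ((count 0))
  (make-button "Press me"
               :action (lambda ()
                         (incf count)
                         (format t "pressed ~d times~%" count))))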
On the plus side, TCL interfaces to C very easily. There's no garbage
collection to worry about. Data structures are simple. All in all,
it's a breeze to write tcl commands in C.
Maybe part of the reason for the success of TCL and C++ is this. Lisp
& Smalltalk & ML & your favourite language all provide a number of
simple primitives, which you connect together to form an
application. The primitives themselves are a bit abstract, so it takes
some creativity to apply them in order to reach your goal. With some
experience, though, cranking out common program structures becomes
automatic.
TCL & C++, on the other hand, have language features for many of the
things you might wish to do. So you get a nice warm feeling by looking
in the index for the manual and seeing the thing you want to do there.
You should look at Jeff Ullman's "Elements of ML Programming" as an
example of how to present a language as a large collection of
solutions, rather than as a simple and elegant set of primitives.
I'm not knocking the book - it just has a very different style from
the way I like to think about languages. (You don't learn until
chapter 21 of 26, in the "Additional Details and Features" section,
that you can curry functions.) In fact, I would highly recommend the
book to anyone who wants to learn to program in ML, and hasn't seen
much beyond Pascal or C.
It seems to be the case that language designers have a different way
of thinking about things than inexperienced language users.
"Elements...", like Ousterhout's TCL manual, presents them in a nice,
concrete way that's palatable to beginners. The manuals for any of the
research languages I've used are very different.
To summarize, TCL is succeeding because:
- hello-world-in-an-X-window is 2 lines, and easy to understand
- the manual is very concrete
- the environment is very portable
- there are no abstract concepts you need to learn to write basic programs.
--
Trevor Blackwell t...@das.harvard.edu (617) 495-8912
(info and words of wit in my .plan)
Disclaimer: My opinions.
I agree, but I think it is for another reason. It seems to me that what
mostly disturbs lisp beginners is the quoting/eval rule; I mean, it seems
that most users find it more natural to think that foo is "foo" and that you
have to go an extra step to get its contents, as in $foo -> "bar", and in
lisp they never know when to quote things.
Maybe it is because with scripts you just do "immediate programming",
stuffing data in quickly, and don't want to take the time to put it nicely
into boxes (variables)....
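A minimal sketch of the rule that trips people up, as typed at a Lisp
listener:

(setq foo "bar")
foo     ; => "bar" -- evaluation is the default
'foo    ; => foo   -- quote takes the extra step, yielding the symbol itself

Tcl runs the other way around: bare foo is the literal word, and $foo takes
the extra step of fetching its contents.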
Henry> In article <CtuG1...@world.std.com> jha...@world.std.com
Henry> (Joseph H Allen) writes:
hbaker> ??????????? 'Type inference' _hurt_ readability?? Perhaps you
hbaker> meant type declarations?
jhallen> IMHO, except for trivial programs, declarations improve
jhallen> readability because you can easily figure out what is stored in
jhallen> variables and arguments and don't have to trace constants or
jhallen> depend on operators not being overloaded. You can jump into
jhallen> the middle of a program and start modifying it without
jhallen> comprehending all of it.
A very dangerous attitude indeed, in the general case. And where it
works, it has nothing to do with the manifest/latent or static/runtime
typing issue; it can be done because the program is well modularized.
jhallen> It makes your programs less general perhaps, but definitely more
jhallen> readable. Likewise, structures (records) are better than lists
jhallen> and tuples.
Agreed, lists and tuples are too general.
Henry> Aha! We've finally smoked you out! What you are complaining
Henry> about is the lack of a good code browser which would tell you
Henry> what a compiler knows about the various constructs that it sees.
Aha! We've finally smoked you out! You don't believe that programs are
about _communication_, i.e. texts that describe what the author
thinks. If you did, you would welcome any extra annotations by the author,
because they enlighten the reader as to the assumptions of the author.
When I read a program I want to understand not what it does (using a
browser or a tracer), but what the author wanted to communicate with
it. _Then_ I shall check that the intent was correctly implemented.
When I see "const x" or "pure y", what I think is not "ah, a read-only
variable", I think "the author assumes that there will be no need to
change this variable's value". When I see "reducible proc p" I know that
in the author's intent the procedure's result only depends on its
arguments.
This is much more informative than just having a browser telling me
whether in fact a variable is or is not modified in a region of code
(and find a browser that will tell you whether a procedure is
reducible!). Of course I want the latter information too... But not
just that. I want to compare intent with actuality.
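In Lisp terms, DECLARE can carry the same sort of intent as those
hypothetical "const"/"pure"/"reducible" annotations. A minimal Common Lisp
sketch:

(defun circle-area (r)
  ;; the author's stated assumption: r is always a single-float; the
  ;; declaration documents that intent as much as it helps the compiler
  (declare (type single-float r))
  (* pi r r))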
Henry> Rather than have to specify all this redundancy every time I
Henry> write a program, wouldn't it be better to have the program tell
Henry> me what it thought?
This redundancy is actually descriptive statements of the author's
intent. Communication, useful communication, about design choices. A
browser will not tell me anything about design choices, will only tell
me about what's there, not in the mind of the author.
Henry> This would be a much better tradeoff in terms of my time vs. the
Henry> computer's time. For what it's worth, such a code browser is
Henry> much easier to write, due to (you guessed it) Lisp's trivial
Henry> syntax.
But I still want the author's annotations, perhaps compared to the
annotations synthesized by a browser.
All this assuming that the author agrees that the purpose of a program
text is to communicate a description to the reader, and puts in
descriptive annotations. A rare thing!
No argument there...
>2 and 3 are not necessary, hurt readability and can be considered semantic
>sugar.
I don't think any of these /hurt/ readability, ever, and I can't
imagine what "semantic sugar" might be (I've heard of "syntactic
sugar", i.e. "sweetening" cumbersome syntax by dressing it up, but
before we can talk about sweetening cumbersome semantics, we'd have to
get into an argument over what cumbersome semantics /are/...) Also,
isn't it a bit self-contradictory to claim that something is sugar,
but that it ain't sweet?
>4 is the only one which is really unique to lisp. It's made redundant by 1
>though, so I don't miss it.
I don't see what Lisp's ability to treat programs and data identically
(which is what I assume you mean by self-representation) has to do
with closures? Most Lisps don't have closures, but they all
represent programs as lists...
On the other hand, Tcl (I started reading this thread in comp.lang.tcl) is
also self-representing in the above sense, but using strings rather than
lists as the fundamental type from which programs are made.
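A minimal sketch of the Lisp side of that claim:

;; a Lisp program is literally a list you can take apart and evaluate
(let ((form '(+ 1 2)))
  (first form)    ; => +  (the operator, as a symbol)
  (eval form))    ; => 3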
(I removed comp.windows.x from the newsgroups line.)
--
Jurgen Botz, jb...@mtholyoke.edu | Communications security is too important to
Northampton, MA, USA | be left to secret processes and classified
| algorithms. -- USACM
Henry> Rather than have to specify all this redundancy every time I
Henry> write a program, wouldn't it be better to have the program
Henry> tell me what it thought?
This redundancy is actually descriptive statements of the author's
intent. Communication, useful communication, about design choices.
A browser will not tell me anything about design choices, will only
tell me about what's there, not in the mind of the author.
what if my intent is to write a generalized and polymorphic procedure?
don't all those required type declarations get in the way of my
intent?
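A minimal illustration of the tension:

;; fully generic, no declarations at all -- works on values of any type
(defun twice (x) (list x x))
(twice 3)       ; => (3 3)
(twice "hi")    ; => ("hi" "hi")

Any manifest declaration written here would have to say "any type", which
communicates no more than silence does.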
There's generally a common reason people like things that are
"crufty" and "complex": job security.
Perhaps that's the reason so many of these complex and crufty
systems are economically viable - and why so many academic systems
are not. Academicians should remember to add a healthy helping of
cruft and complexity to their designs.
Something to think about.
- Mark
I think that it's because natural languages are like this too (well,
they have complex syntax and semantics, perhaps not overly complex).
This however doesn't explain why only *some* people like these
programming languages.
--tim
It depends on the language. You could have a language which lets you
specify that a variable or function argument is any type, one of a set of
types (a union or ML-style polymorphic type), or a type of a certain class.
The class method is particularly useful because then you could have any type
which has the proper attributes inherit the class specified for the argument
or variable. The class name is then used for classifying which types are
allowed. You could conceive of a system where integers are in the
whole-number class, the integer-class, and the exact-class, while floats are
in both the whole-number and integer-classes and additionally the
real-class, but not in the exact class.
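A minimal CLOS-flavored sketch of the idea (EXACT-P and the class roles are
made up to mirror the exact-class above; Common Lisp's real numeric tower
differs in detail):

;; classes used as attribute specifiers: dispatch encodes "exactness"
(defgeneric exact-p (x))
(defmethod exact-p ((x integer)) t)     ; integers are exact
(defmethod exact-p ((x ratio))   t)     ; so are ratios
(defmethod exact-p ((x float))   nil)   ; floats are not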
A debate going on in other newsgroups is "sub-typing vs. sub-classing".
Some think that a value of a certain type should be usable in a variable if
the class specified for the variable has method names which are the same as
those of the class of the type. This eliminates the need for inheritance,
but I think it also prevents you from using class names as general attribute
specifiers. I would be happier with the sub-typing idea if you had method
names which did nothing but indicate the presence of certain properties that
the type has.
I don't lack the creativity. But I do lack the time. Trying to work
around a lack of numerical libraries, string libraries, and graphics
and GUI libraries requires a _lot_ of time and effort. Likewise,
trying to get some complicated algorithm to run fast and use little
space can be much harder in Lisp or Smalltalk than in C, even if a
simple implementation of the algorithm would be easier in those
languages.
Thomas.
>I don't lack the creativity. But I do lack the time. Trying to work
>around a lack of numerical libraries, string libraries, and graphics
>and GUI libraries requires a _lot_ of time and effort.
This is true. Haskell's standard prelude goes some very small way
towards addressing this, but you invariably end up having to use custom
libraries and compiler features to get half-decent performance (string
manipulation is horribly inefficient in standard Haskell, but a *lot*
faster and more space-efficient if you use the Glasgow Haskell packed
string extensions).
In the Scheme world, SLIB does a nice job of giving you lots of useful
library features, and does it quite portably too. Common Lisp
programmers get an even larger kitchen sink with their systems, so I
think the library problem is more or less solely confined to the more
``experimental'' functional languages (though I assume Yale Haskell
lets you get at a lot of CL baggage without too much pain).
>Likewise, trying to get some complicated algorithm to run fast and use
>little space can be much harder in Lisp or Smalltalk than in C, even if
>a simple implementation of the algorithm would be easier in those
>languages.
I don't know about Smalltalk, but it's not overly hard to get complex
algorithms to go quickly in Lisp or even, dare I say it, Haskell
(though space efficiency will bite you in Haskell). I'd say the amount
of effort required to code an algorithm efficiently in either of these
languages is about the same as that required to get an efficient
algorithm going in C.
--
Bryan O'Sullivan u nas est tolko odyeen yagnyonok seychas.
Computing Science Department email: b...@cyclic.com, b...@dcs.gla.ac.uk
University of Glasgow web.gunge: http://www.scrg.cs.tcd.ie/~bos
I think it is quite simple. People who have trouble with abstract
concepts (like me) prefer languages with the minimum number of
new concepts beyond those they know from natural language. I am quite
happy to read something that is a few characters longer if it looks
more like something that I already know (English). Consider what is
required to understand:
(cond (< (+ a b) 5)
(setq a 1)
(setq b 1)
)
versus:
if a + b < 5 then
a <- 1;
else
a <- 2;
endif;
This principle works for Cobol, but that language (IMHO) violates
the same principle in another way, by having far too many words
with differing meanings.
--
Chris Gray c...@myrias.ab.ca [C macros - a powerful way to make bugs!]
Myrias Computer Technologies Inc. (403) 463-1337
IMHO, Terry Winograd showed that the _syntax_ of English is trivial, and
that what NL types were trying to do with parsing and syntax is much better
left to semantic analysis.
The problem with English is analogous to the problem with certain
graphical languages: after you select certain entities to play a part
in the next operation, it is up to the operation to try to figure out
what role each entity will play, based on the semantics of the
operation and the characteristics of the entities.
I thought that adding cruft and complexity was the role of the ANSI/IEEE
standardization committee. :-)
There's a nice article in Byte Magazine about it. I think in the
June issue...
What I like about TAOS is that the compilation output is targeted
at a virtual machine. When objects are loaded, the system compiles
them into native machine code. Supposedly the assembly process
is very efficient.
Thanks,
Amancio
--
FREE unix, gcc, tcp/ip, X, open-look, netaudio, tcl/tk, MIME, midi,sound
at freebsd.cdrom.com:/pub/FreeBSD
Amancio Hasty, Consultant |
Home: (415) 495-3046 |
e-mail ha...@netcom.com | ftp-site depository of all my work:
| sunvis.rtpnc.epa.gov:/pub/386bsd/X
The length of a language manual is not a meaningful measure of complexity.
"Common Lisp, the Language" has more complete and detailed
coverage of its language than almost any other manual I've seen. Guy
Steele (the editor of CLtL) was also the coauthor of a reference
manual for C which was 2 or 3 times as long as K&R. Steele's C manual
was as detailed as the Lisp manual, with discussion and justifications
of various ANSI features, and comparison with non-ANSI compilers, etc.
A lot of explanation of WHY and WHY-NOT.
The functional equivalent of the same level of documentation
for C would be Harbison & Steele's C Reference Manual *PLUS* another
book on portability and the standard C libraries. This would probably
equal the page count of CLtL1. (The first version, which is the one
I have on my shelf. CLtL2, which adds documentation for CLOS, probably
regains the record again.)
I'm not arguing that Common Lisp isn't, in many ways, a complex
language. Just that the size of the manual is NOT usually a reasonable
measure of complexity. (I could also argue that there are different
TYPES of complexity, and that some types are better/worse than others,
but I'll save that argument for later!)
- Steve Majewski (804-982-0831) <sd...@Virginia.EDU>
- UVA Department of Molecular Physiology and Biological Physics
The functional equivalent of the same level of documentation
for C would be Harbison & Steele's C Reference Manual *PLUS* another
book on portability and the standard C libraries.
plus a complete reference on c++ and associated object libraries along
with specification of the hooks available to build a class browser
(i.e. a metaobject protocol).
This would probably
equal the page count of CLtL1. (The first version, which is the one
I have on my shelf. CLtL2, which adds documentation for CLOS, probably
regains the record again.)
not after you mention c++, it doesn't. and c alone is much more
like what cltl1 describes.
In article <31tq0n$o...@sophia.inria.fr> co...@crios.inria.fr (Colas Nahaboo) writes:
I agree, but I think it is for another reason. It seems to me that what
mostly disturbs lisp beginners is the quoting/eval rule,
Which I find puzzling, since I thought the quoting rule was
semantically the same as that of English.
I mean, it seems that most users find it more natural to think that
foo is "foo" ...
Isn't that -contrary- to the method of quoting used in English?
Yes -- just look at:
LISP
COBOL
The excuse given to me when I enquired as to just why I should learn
lisp when there are perfectly good, and genuine, functional languages
I could use was that
"in the world of AI research, lisp is everywhere.
If you can't speak lisp, you won't do very well
in AI research. It's what's out there.
So much AI software is written in lisp..."
etc.
Lame excuses for perpetuating a lame language.
IMHO of course :-)
;; I think it is quite simple. People who have trouble with abstract
;; concepts (like me) prefer languages with the minimum number of
;; new concepts beyond those they know from natural language. I am quite
;; happy to read something that is a few characters longer if it looks
;; more like something that I already know (English). Consider what is
;; required to understand:
;;
;; (cond (< (+ a b) 5)
;; (setq a 1)
;; (setq b 1)
;; )
;;
;; versus:
;;
;; if a + b < 5 then
;; a <- 1;
;; else
;; a <- 2;
;; endif;
FYI: The above attempt at Lisp code is incorrect. Also note how Emacs
re-indents the code automatically, making a suspicious-looking piece of
code:
(cond (< (+ a b) 5)
(setq a 1)
(setq b 1) )
It is syntactically incorrect (cond is a macro) and gives "void
variable: <". You probably intended something like:
(cond ((< (+ a b) 5)
       (setq a 1))
      (t
       (setq b 2)))
which is a lot more readable too, IMHO.
;; This principle works for Cobol, but that language (IMHO) violates
;; the same principle in another way, by having far too many words
;; with differing meanings.
What principle?
For what it's worth, I have a paper copy (double-sided A4) of the current
draft of the Ada 9X standard. It's pretty close to CLtL2 in bulk. About
half of that is "language", and half of it is "libraries". Since Ada9x
and Common Lisp have _comparable_ feature sets (OOP + packages) apart from
philosophical differences about types, this is not too much of a surprise.
If it comes to that, my C++ document set + NIHCL library doc are about the
same size as CLtL2.
--
30 million of Australia's 140 million sheep
suffer from some form of baldness. -- Weekly Times.
I am a little bemused reading all this discussion; Tcl syntax is simple.
All you need to know is "" vs {} quoting (including \ quoting),
and $ vs [] substitution,
and what a properly formed list is, which is simple to explain once the
first two concepts have been grasped.
All this can be explained in an hour or so, or in a very small section of a book.
The problem often is that people do not correctly grasp the
fundamentals before attacking other parts of the language; in the case
of Tcl the fundamentals are so simple that there is really no excuse.
I personally don't care whether other people like or dislike Tcl, and I
certainly don't wish to put down their preferences for other languages,
but it is *not* a complicated language.
My personal reasons for liking it over and above C++ and lisp et al is
that it doesn't force me to "buy in" to a whole new way of thinking, I
can mix and match my solutions to various problems in C or Tcl or any
other language that I can readily interface to C.
Sure, I would like to see it faster (there must be a way to get below 23
microseconds for a Tcl nop), but it is a good, *simple*, and powerful
system, and I am happy with that.
The debates that go on on Usenet often seem to me akin to the
political and cultural battles that have plagued human civilisation ever
since it began, namely:
We expend enormous energy *stressing* that our views are
different from others', rather than building on the commonality
that we all share.
For instance, look at modern computing as defined in popular use: we
are decades behind where we might have been because we fight amongst
ourselves, whether between companies for spurious financial or power
gain, or amongst the research community. Isn't it time we matured
beyond this stage in our evolution? Think not of differences, but of
what we share.
As always, these are my own views (and my wife's!).
;; (cond (< (+ a b) 5)
;; (setq a 1)
;; (setq b 1)
;; )
;;
;; versus:
;;
;; if a + b < 5 then
;; a <- 1;
;; else
;; a <- 2;
;; endif;
FYI: The above attempt at Lisp code is incorrect. Also note how Emacs
re-indents the code automatically, making a suspicious-looking piece of
code:
(cond (< (+ a b) 5)
(setq a 1)
(setq b 1) )
It is syntactically incorrect (cond is a macro) and gives "void
variable: <". You probably intended something like:
(cond ((< (+ a b) 5)
       (setq a 1))
      (t
       (setq b 2)))
which is a lot more readable too, IMHO.
Sorry, it's been 15 years since I programmed in LISP.
;; This principle works for Cobol, but that language (IMHO) violates
;; the same principle in another way, by having far too many words
;; with differing meanings.
What principle?
The principle of a programming language being mostly readable by
a reader not familiar with it. Programming is a difficult enough
job as it is - why make it harder by using languages that are hard
to read?
Note that I am not an academic - I program for a living and as a
hobby sometimes. I am much more concerned with code readability and
maintainability then I am with how quickly something can be hacked
together using some language's "neato keen" tools and features.
Here's kind of a twisted example, but heck, maybe someone out there
can help me with it. I've written a MUD system at home (if you don't
know what a MUD is, just ignore this), and one day I thought it would
be K00L to put a variant of "Eliza" or "Doctor" into it. I think all
the needed facilities are there. So, I grabbed the only version I
had around - that for GNU-Emacs ELisp. I have yet to be able to
figure out just what it does. This is precisely what I mean - if I
can ever convert it to my MUD system (a language along the lines of
the first one above), I would hope that it would be much more
readable. Anyone got a version that I can understand?
What cruft is there in Lisp, as opposed to specific varieties of
Lisp such as Common Lisp?
>The excuse given to me when I enquired as to just why I should learn
>lisp when there are perfectly good, and genuine, functional languages
>I could use was that
>"in the world of AI research, lisp is everywhere.
> If you can't speak lisp, you won't do very well
> in AI research. It's what's out there.
> So much AI software is written in lisp..."
>etc.
>
>Lame excuses for perpetuating a lame language.
>IMHO of course :-)
That's what you get for asking the wrong people. People who
don't have good reasons can't give you good reasons.
>Lame excuses for perpetuating a lame language.
>IMHO of course :-)
Golly, I can kind of see how C or TCL might get propagated this way,
seeing how they are comparatively featureless and straightforward to
learn, but LISP? Why would all these counterproductive academics
go to all the trouble to learn this complex language unless it gave
them unique flexibility? Could it be that LISP has been useful in
developing these `good, and genuine functional languages'? Hmm.
Apparently you weren't convinced, and didn't try to learn LISP.