I'm finding this difficult because I'm an old lisp enthusiast and it
irks me to distraction that these peculiar little languages do neat
little tricks that could be easily accomplished in lisp -- without
having to learn a different syntax for each functionality, and without
the weird hobbled type system limitations, and with polymorphism
and macros etcetera etcetera etcetera....
Is there any *real* reason that a combination of GCL+CLX+GLUE+POSIX
hooks with some neat add-on packages couldn't replace all this gunk?
Maybe you can never get around the C-syntax chauvinism problem
but if the only other problem is program size or speed, can't we
expect the difficulty to go away as hardware cheapens?
Comments? -aaron
PS: I don't mean to detract from the accomplishments of the developers
of tcl, perl, and company -- congratulations guys, quite impressive
-- but do we really need a new language paradigm every time?
LISP isn't ideal for everything, either; the reason all of these languages
proliferate is that people like them and use them (sorry, that was
probably too obvious :). I've seen LISP, and I'm planning to delve into
it soon, but it can be a fairly annoying language to work with (from
reports, at least).
Competition is good :). If we all stuck with LISP-like languages, then
we'd have a pretty non-diverse set of languages...
>Is there any *real* reason that a combination of GCL+CLX+GLUE+POSIX
>hooks with some neat add-on packages couldn't replace all this gunk?
>Maybe you can never get around the C-syntax chauvinism problem
>but if the only other problem is program size or speed, can't we
>expect the difficulty to go away as hardware cheapens?
This is one reason I chose Tk/Tcl for my application; quick prototyping speed,
fast runtime, easy integration into speedy C (OK, I just threw the last in :).
While in the future computers may become fast enough, I really would like
to get something done *now*. 'sides, we can expect things to grow ever more
hoggy and slower as hardware gets faster... It's been the nature so far.
Cheers,
--titus
--
C. Titus Brown <- http://www.krl.caltech.edu/~brown/plan.html -> br...@reed.edu
--> GCS/GSS:@d--,-p+,c++++,l,u(++),e+,m+,s+/,n+,h+,f+,g+,w+,t-,r-,y? <--
Sysadmin at Caltech KRL / Guest sysadmin at Reed College
Member of the Avida Artificial Life research group
There is (if I may say so (and I really hope I may)) a small disadvantage
(maybe not so small either (in fact rather large (not to say huge))) in the
usage of lisp (or scheme (or any other lookalikes)) that readability is
rather low (very low (if you want my opinion)).
Jacob Hallen
Well, another reason is that LISP isn't very popular as a general purpose
language any more. Not that there aren't cases where it is still useful, but
the typical '90s programmer is much more comfortable with C and languages
which try to provide a C-like syntax such as awk, perl, and tcl.
What does the "p" in "perl" stand for? "Practical". For the tasks it's
geared towards, perl is danged practical. It's in the Unix idiom, so those
familiar with Unix can learn to use it relatively easily. It lets you screw
around with text really easily. It's concise. It's efficient -- efficient
enough that you can generally use a decently written perl program without
frequently thinking "Maybe I should re-write this in C."
Perl is a tool. It's well suited for a particular task. Maybe it's
possible to do all that stuff in lisp, but all the convenient stuff like
formats, associative arrays, sockets, split(), join(), s/foo/bar/, `foo`,
dbmopen(), system(), and die() are already in perl. Is there a version of
lisp that has all the convenience features that perl already has? (Actually,
this is an honest question. If there is, it might be interesting.)
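For concreteness, here is what a few of the conveniences listed above look like in Python, one of the other scripting languages in this thread (a rough comparison sketch only; the sample strings are invented):

```python
import re

# perl's split() and join(), as string methods
words = "a:b:c".split(":")
joined = "-".join(words)

# perl's s/foo/bar/ substitution, via the re module
swapped = re.sub("foo", "bar", "foolish")   # -> "barlish"
```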
> -- without
> having to learn a different syntax for each functionality,
Agreed. This is annoying. It might be worth it, though.
> and without
> the weird hobbled type system limitations,
What limitations? Perl is really pretty good about not imposing arbitrary
limitations.
Adios,
Logan
--
The genius of France can be seen at a glance
And it's not in their fabled fashion scene
It's not that they're mean, or their wine, or cuisine
I refer of course to the guillotine
(the French knew how to lynch)
T-Bone Burnett, "I Can Explain Everything"
Good question. I find myself thinking this sometimes. Here is my
explanation...
1) Tcl is small
Common Lisp is *TOO HUGE*. Scheme might work and there have
been some people who have been fiddling with using it with Tk I
think.
2) I can auto-configure Tcl
I used to really enjoy hacking makefiles and such, but now I'm
too old or too busy to really enjoy it. Give me a cool package that
configures itself any day.
3) I can build Tcl from scratch in ~5 minutes on anything
I can configure and build the entire Tcl/Tk system from scratch
on most machines in ~10 min, requiring only libX11.a and some header
files. I've built it on Suns (4.x and 5.x), SGIs, HPs and Crays
without a hitch.
Also, I don't need to get three packages from three people and
try to get them to like each other.
4) Tcl (at least to my knowledge) isn't picky about how big my "int"s
are. Several lisps I've seen are picky.
5) As much as I like the unambiguous nature of prefix-notation expressions,
I still can't bring myself to prefer (+ 5 (* 3 4)) over 5+3*4. This
is a biggy for me.
6) I like Tcl. It's simple.
People complain about Tcl's syntax. I didn't like it when I
first started toying with it, but I like it now. The reason I
like it now is that I find that there are a few simple and general
principles that it follows. Everything is handled in a kinda
universal fashion (like scripts being first class) which makes it
very powerful for me. I like the fact that I can write my own
control structures in Tcl, for example.
I've looked at perl a few times and I don't find it very easy
to understand. No flames please; I'm sure it's great and maybe you
can make an argument that it is simple, but it seems to have an
awful lot of built-ins...more than I can keep track of. If I'm
wrong please enlighten me.
7) I'm familiar with Tcl.
I find this a loathsome reason to endorse something. I've
always prided myself on using the best tools available, no
excuses. However, when it's neck-and-neck, I go with what I know.
I've got some complaints about Tcl, but they aren't anything major.
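The "scripts are first class, so you can write your own control structures" idea in point 6 can be sketched outside Tcl too; here is a rough Python illustration, with callables standing in for the Tcl scripts a real home-made [do-until] command would take (the names are invented for the sketch):

```python
# A "do ... until" control structure built from first-class code:
# run the body, then stop once the condition holds.
def do_until(body, cond):
    while True:
        body()
        if cond():
            break

count = [0]
do_until(lambda: count.__setitem__(0, count[0] + 1),
         lambda: count[0] >= 3)
# count[0] is now 3
```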
Lastly, doesn't Elk (a schemish thingy as I recall) do lots of what
Tcl/Tk do? I seem to recall building Elk once and asking the author
how I could embed Elk in something (or maybe the other way around,
embed something in Elk) and he said he didn't intend it to be used
this way. Tcl works great both ways (although I find it a little
tedious executing Tcl from within a C program, but I'm probably just
doing it awkwardly).
Mike
> Perl is a tool. It's well suited for a particular task. Maybe it's
> possible to do all that stuff in lisp, but all the convenient stuff like
> formats, associative arrays, sockets, split(), join(), s/foo/bar/, `foo`,
> dbmopen(), system(), and die() are already in perl. Is there a version of
> lisp that has all the convenience features that perl already has? (Actually,
> this is an honest question. If there is, it might be interesting.)
ELK (scheme) comes close. It's mainly missing the regex string operators,
the last time I looked (and I don't remember the dbm library, either).
I'm using Python, which has all of the above, and a reasonable syntax to
boot. It also now has Tk, from the Tcl folk.
ELK: ftp://ftp.x.org/contrib/devel_tools/elk-2.2.README
Python: ftp://ftp.cwi.nl/pub/python/ (or see comp.lang.python)
Bill
--
Bill Janssen <jan...@parc.xerox.com> (415) 812-4763 FAX: (415) 812-4777
Xerox Palo Alto Research Center, 3333 Coyote Hill Rd, Palo Alto, CA 94304
URL: ftp://parcftp.parc.xerox.com/pub/ilu/misc/janssen.html
>
> I've been toying with the idea of coming up to speed with
> various hot trends in the unix world: tcl, perl, whatnot.
>
> I'm finding this difficult because I'm an old lisp enthusiast and it
> irks me to distraction that these peculiar little languages do neat
> little tricks that could be easily accomplished in lisp -- without
> having to learn a different syntax for each functionality, and without
> the weird hobbled type system limitations, and with polymorphism
> and macros etcetera etcetera etcetera....
>
> Is there any *real* reason that a combination of GCL+CLX+GLUE+POSIX
> hooks with some neat add-on packages couldn't replace all this gunk?
I've suggested a Lisp shell idea in this newsgroup in the past and received
very little enthusiasm and one responder mentioned that someone has already
done something similar, can't remember who (perhaps it's in the FAQ by now).
The basic idea was to organize a Lisp (say Common Lisp) into levels, level
0, level 1 etc. where each level is composed of more abstract Lisp functions
that use the lower levels. Functions in levels above 0 would autoload when
invoked. Level 0 would be very primitive but more powerful in its control
language than say "csh". A Lisp shell, call it "lsh", then would be composed
of Level 0 and an OS package per OS. An alias system would allow brief
commands in the shell interpreter so that instead of entering the OS package
function
(os:set-default-directory "/usr/local/bin")
the user could specify the aliasing in his ".lsh" file so that he could
enter
cd /usr/local/bin
and it would expand correctly. Otherwise the shell interpreter would be a
(print (eval (read))) loop. Shell scripts then would be ordinary Lisp load
files which could also be compiled for faster execution.
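The alias-expansion step of this "lsh" proposal could be sketched in a few lines; here is a toy Python illustration (the ALIASES table is hard-coded here, standing in for the user's ".lsh" file, and the helper names are invented):

```python
import shlex

# In the proposal these would be read from the user's ".lsh" file.
ALIASES = {"cd": "os:set-default-directory"}

def expand(line):
    """Rewrite an aliased shell command into the underlying Lisp call."""
    words = shlex.split(line)
    if words and words[0] in ALIASES:
        return '(%s "%s")' % (ALIASES[words[0]], words[1])
    return line

# expand("cd /usr/local/bin") -> '(os:set-default-directory "/usr/local/bin")'
```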
A major problem with this idea is that people would be tempted to write
their programs in Lisp! SMILE.
--
Bill Vrotney - vro...@netcom.com
A minor correction:
Elk has a (dynamically loadable) GNU-dbm extension; it has been
contributed by Martin Stut at TU Muenchen. See the initial comment in
src/lib/misc/gdbm.c for a list of functions and types.
(You have to set `gdbm=yes' in the `site file' to compile the extension.
Then try `(load 'gdbmtest.scm)' from the Scheme top-level.)
Thank you,
David Pollen
pol...@netcom.com
-------
In article <Ct09y...@taligent.com> lo...@taligent.com (Logan Shaw) writes:
Compared to _PERL_?!?!
-Miles
[tcl ain't so hot either, if you're doing more than simple sequences of
commands]
--
--
Miles Bader / mi...@eskimo.com / (206) 842-7219
I'd rather be consing.
Well perl was supposed to unify these little languages under one roof. But
you are right, it would be fun to have a Lisp dialect shell. I guess that
is why GNU emacs is very popular, because you can do quite a lot with
the elisp.
But seriously, the little languages are usually quite focussed tools,
often written many moons ago. Remember that in the past, UNIX programs
were much smaller than they are today. Some are very specialised for
handling data in particular formats and are not at all bad for what
they do.
Unfortunately in the UNIX world, there seems to be a mentality that
nothing can ever be thrown away. Which means we still get programs and
utilities which nobody wants to use, but everyone is too scared to
delete. It's about time vendors trashed everything above the kernel
level, and started again IMHO. (Yes that's right, no more "vi".)
The GWYDION project, based on Dylan, may be interesting for you.
Bill
--
bi...@zikzak.apana.org.au Bill Birch on ZikZak (Melbourne, Australia)
{:-) A bas la loi Toubon! {:-)
Hey, I use vi and like it ;-)
|> The GWYDION project, based on Dylan, this may be interesting for you.
--
Sincerely,
David G. Boney AF&AM
American Heart Association Medical Student Research Fellow
Texas Tech School of Medicine
dbo...@cs.ttu.edu Texas Tech University
Ph. 806-742-1191 Department of Computer Science
Fax 806-742-3519 Lubbock, Tx. 79409 USA
In article <Ct09y...@taligent.com> lo...@taligent.com (Logan Shaw) writes:
> Perl is a tool. It's well suited for a particular task. Maybe it's
> possible to do all that stuff in lisp, but all the convenient stuff like
> formats, associative arrays, sockets, split(), join(), s/foo/bar/, `foo`,
> dbmopen(), system(), and die() are already in perl.
--------------------
Check out WINTERP 2.0, which has properties similar to Elk, Tcl/Tk,
Python, but is particularly oriented to creating and running graphical UI
applications using an object interface to OSF/Motif and object-oriented
Xtango-based 2.5D graphics/animation. WINTERP 2.0 is based on XLISP-PLUS by
Betz/Almy/Tierney et al -- this is a small, fast, simple, and
easy-to-extend-in-C Lisp interpreter with a smalltalk-inspired object
system.
I too believe in using "the right tool for the job". So if I need the
functionality of Perl (or its efficiency in blasting through large amounts
of text), I just go ahead and use it as a subprocess from WINTERP. One
particularly useful feature is WINTERP's asynchronous subprocess
facility. This allows you to write parts of the program more suited to
languages like Tcl, Python, Perl or Awk in those languages. Or create
GUI-wrappers to existing interactive unix programs or network services...
WINTERP is available via anonymous ftp from ftp.x.org:/contrib/devel_tools,
file winterp-2.XX.tar.gz (XX==02 currently). You may also retrieve WINTERP
via the WINTERP home page -- http://www.eit.com/software/winterp/winterp.html
------- Forwarded Message
WINTERP -- The OSF/Motif *W*idget *INTERP*reter
Niels Mayer
Enterprise Integration Technologies
800 El Camino Real, Fourth Floor
Menlo Park, CA 94025
e-mail: ma...@eit.com
URL: http://www.eit.com/people/mayer.html
WINTERP is the OSF/Motif *W*idget *INTERP*reter, an application development
environment enabling rapid prototyping of graphical user-interfaces (GUI)
through the interactive programmatic manipulation of user interface objects
and their attached actions. WINTERP is also an excellent platform for
delivering extensible or customizable applications. By embedding a small,
efficient Lisp interpreter with UI primitives within the delivered
application, users and system integrators can tailor the static and dynamic
layout of the UI, UI/application dialogue, and application functionality.
WINTERP is a good tool for learning and experimenting with the capabilities
of the OSF/Motif UI toolkit, allowing UI designers to more easily play
"what if" games with different interface styles. WINTERP's implementation
provides a compromise between the prototyping and extensibility advantages
of Lisp environments, and the inefficiency and expenses of delivering Unix
applications under environments such as Common Lisp. Typically, prototyping
and customization are done entirely in interpreted Lisp; for delivery,
efficiency-critical low-level code may be written in C and is easily
exported to the interpreter as a new primitive.
WINTERP was first made publicly available on the X11r4 "contrib"
distribution and new releases have appeared on the X11r5 and X11r6
distribution. The recent X11r6 release of WINTERP 2.0 significantly
improves on previous releases by providing a variety of developer tools and
libraries for increased productivity. Improved functionality is delivered
via object-oriented graphics and 2.5D animation, asynchronous subprocesses,
the XmGraph widget (for creating directed acyclic graphs, trees, and
direct-manipulation displays), the Table widget (GUI layout using
tbl(1)-style specifications), GIF support, etc.
WINTERP's interpreter is based on David Betz, Tom Almy, Luke Tierney et
al's XLISP-PLUS. The interpreter's Smalltalk-inspired object system enables
a truly object oriented interface to the X11 toolkit Intrinsics (Xt) and
the OSF/Motif widget set. WINTERP's use of a real programming language for
customization/prototyping allows WINTERP based applications to be much more
flexible than applications using lower-level and less-general languages
provided by the X resource database, Brunecky&Smythe's Widget Creation
Library (WCL), OSF/Motif's UIL (user interface language), and Ousterhout's
TCL/Tk. Furthermore, the use of object-orientation at a fundamental level
within the application UI code allows WINTERP-based applications to scale
more effectively than other languages.
WINTERP 2.0 features an object-oriented 2.5D graphics and animation
"widget" based on John Stasko's Xtango path transition paradigm. Both for
static and dynamic graphics, this high-level interface simplifies and
abstracts away much of the low-level drudgery required to create 2.5D
graphics interfaces -- smooth, flicker free display updates occur as
complex nonrectangular graphical objects move around and obscure and
uncover each other. Animation composition operations allow multiple
individual shapes to all move "simultaneously" through sequences of
animation frames. The graphics are pixel-independent and easily resizeable,
scalable and zoomable. Each primitive graphics image class supports its own
set of class specific animation and movement methods, while some operations
(e.g. movement, fill, etc) are polymorphic. The following primitive objects
are supported:
* Line (w/ color, forward-arrow, backward-arrow, bidirectional-arrow,
thickness, and style options);
* Rectangle (w/ color, fill options);
* Circle (w/ color, fill options);
* Ellipse (w/ color, fill options);
* Polygon (w/ color, fill options);
* Polyline (w/ color, forward-arrow, backward-arrow, bidirectional-
arrow, line style, line-thickness options);
* Spline (w/ color, line-style and line-thickness options)
* Text (w/ font, color, and centering options)
* Bitmaps and Bitmap movies
* GIF images.
The primitive graphics classes may also be contained in a composite image
class, which provides a grouping and layering principle for new classes
presenting multiple images. Composite images allow the construction of
independent layers of animation objects which may be operated on in groups.
WINTERP's graphics capabilities enable simple game-style animation,
employing multiple layers of arbitrarily shaped objects. Furthermore,
application-specific interactive-graphics capabilities may be encapsulated
into a new Widget-Class. This significantly simplifies the creation and
integration of new graphics widgets into the system -- these special
widgets look like normal Motif widgets to the rest of the system.
To enable GUI-applications based on existing Unix facilities, WINTERP
provides primitives for collecting data from Unix processes, and facilities
for interacting with other Unix processes. These facilities make it
possible to glue together existing Unix functionality into a GUI based
application with a relatively small amount of WINTERP-Lisp "glue". WINTERP
2.0 features the ability to run multiple interactive, asynchronous Unix
subprocesses without blocking GUI interactivity. This feature is useful for
creating GUI interfaces to existing terminal-based programs, and can also
be used for connecting to interactive network services and databases.
An environment similar to WINTERP's already exists in the Gnu-Emacs text
editor -- WINTERP was strongly influenced by Gnu-Emacs's successful
design. In Gnu-Emacs, a mini-Lisp interpreter is used to extend the editor
to provide text-browser style interfaces to a number of Unix applications
(e.g. e-mail user agents, directory browsers, debuggers, etc.). Whereas
Emacs-Lisp enables the creation of new applications by tying together
C-implemented primitives operating on text-buffer UI objects, WINTERP-Lisp
ties together operations on graphical UI objects implemented by the Motif
widgets. Both achieve a high degree of customizability that is common for
systems implemented in Lisp, while still attaining the speed of execution
and (relatively) small size associated with C-implemented applications.
WINTERP features:
*** Free with non-restrictive copyright -- available via anonymous
ftp from ftp.x.org, directory contrib/devel_tools, file
winterp-2.XX.tar.gz.
*** Portable -- entirely implemented via machine independent C
source and X11/Xt/Motif libraries.
*** OSF/Motif widgets are real XLISP objects; widgets can be specialized
via subclassing, methods added or altered, etc.
*** Automatic storage management (via garbage collection) of Motif/Xt/X
data, animation and graphics data, and application resources.
*** Contains facilities for simple "direct manipulation" of UI
components.
*** Interface to Gnu Emacs's lisp-mode allows code to be developed and
tested without leaving the editor.
*** Interactive programming also available in the "WINTERP Control Panel",
with editing taking place in a Motif text widget controlled by
WINTERP.
*** Built-in RPC mechanism for inter-application communications,
implemented via serverized, event-driven Lisp interpreter.
*** XmGraph widget for creating directed acyclic graphs, trees, and
direct-manipulation displays.
*** Table widget allows constraint-based GUI static layout
using tbl(1)-style specifications.
--------------------
An old paper on WINTERP version 1.X may be obtained via
World-Wide-Web/Mosaic:
WINTERP paper from Motif '91, First Annual Intl. Motif Users
Meeting (postscript) (226756 bytes):
ftp://www.eit.com/pub/winterp/doc/papers/winterp.PS
Screen Dump, page 3 (postscript) (157936 bytes):
ftp://www.eit.com/pub/winterp/doc/papers/page3.PS
Hybrid Application Architecture Diagram, page 3 (postscript)
(16505 bytes):
ftp://www.eit.com/pub/winterp/doc/papers/arch.PS
Diagram of RPC Architecture, page 10 (postscript) (16686 bytes):
ftp://www.eit.com/pub/winterp/doc/papers/RPC-Arch.PS
Screen Dump, page 25 (postscript) (145444 bytes):
ftp://www.eit.com/pub/winterp/doc/papers/page25.PS
Screen Dump, page 26 (postscript) (135663 bytes):
ftp://www.eit.com/pub/winterp/doc/papers/page26.PS
--------------------
Further information on WINTERP may be obtained via World-Wide-Web/Mosaic:
The WINTERP Home Page:
http://www.eit.com/software/winterp/winterp.html
WINTERP 2.0 documentation (plain-text) (652268 bytes):
ftp://www.eit.com/pub/winterp/doc/winterp.doc
XLISP-PLUS documentation (plain-text) (211733 bytes):
ftp://www.eit.com/pub/winterp/doc/xlisp.doc
Xtango Path Transition Animation (postscript) (588746 bytes):
ftp://www.eit.com/pub/winterp/doc/xtangodoc.ps
EIT's WINTERP-based World-Wide-Web Multimedia Authoring Environment:
http://www.eit.com/papers/gpware94/paper.html
--------------------
References on WINTERP, and its components:
David Michael Betz. "XLISP: An Object-oriented Lisp (version 2.1)"
Unpublished documentation accompanying the public release of Xlisp
software. David Michael Betz, P.O. Box 144, Peterborough, NH 03458,
April, 1989.
Olaf Heimburger. "Elche Im Winter -- Interaktive X-Applicationbuilder
unter Lisp -- Elk und WINTERP." iX, July 1991, pp 64-68.
Niels P. Mayer, Allan W. Shepherd and Allan J. Kuchinsky. "Winterp:
An object-oriented, rapid prototyping, development and delivery
environment for building extensible applications with the OSF/Motif
UI Toolkit." In Proceedings Xhibition '90, X Window System and Open
Systems Technical Conference, San Jose, CA, May 1990, pp 49-64.
Niels P. Mayer, Allan W. Shepherd and Allan J. Kuchinsky. The
WINTERP Widget INTERPreter -- An Application Prototyping and
Extension Environment for OSF/Motif. In Proceedings X Into The Future,
The European X Users Group Autumn Conference 1990, Surrey, UK,
September 1990, pp. 33-55.
Niels P. Mayer. The WINTERP Widget INTERPreter -- A Lisp Prototyping
and Extension Environment for OSF/Motif-based Applications and
User-Interfaces. Lisp Pointers, ACM SIGPLAN, Volume IV, Number 1,
pp 45-60.
Niels P. Mayer. The WINTERP OSF/Motif Widget INTERPreter -- A
graphical user-interface language for rapid prototyping and
delivering extensible applications. In Proceedings Motif '91,
First Annual International Motif Users Meeting, Washington DC,
December 1991, pp. 248-269.
John T. Stasko. The Path-Transition Paradigm: A Practical Methodology
for Adding Animation to Program Interfaces. Journal of Visual
Languages and Computing. (date, volume, page, publisher info unknown).
Don Libes. Expect: Curing Those Uncontrollable Fits of Interaction,
Proceedings of the Summer 1990 USENIX Conference, Anaheim, CA, June
11-15, 1990.
Papers describing applications written using WINTERP:
Allan Shepherd, Niels Mayer, and Allan Kuchinsky. STRUDEL: An
Extensible Electronic Conversation Toolkit. In David Marca and
Geoffrey Bock, editors, GROUPWARE: Software for Computer-Supported
Cooperative Work, IEEE Computer Society Press, 1992, pp. 505-518.
(originally, in proceedings Conference on Computer-Supported
Cooperative Work, Los Angeles, October 1990, pp. 93-104.)
Jay Glicksman, Glenn Kramer, and Niels Mayer. "Internet Publishing via
the World Wide Web". In proceedings Groupware '94, August 1994. San
Jose, CA.
=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
= Niels Mayer ..... ma...@eit.com .... http://www.eit.com/people/mayer.html =
= Multimedia Engineering Collaboration Environment (MM authoring for WWW) =
= Enterprise Integration Technologies, 459 Hamilton Ave, Palo Alto CA 94301 =
=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
------- End of Forwarded Message
> 1) Tcl is small
>         Common Lisp is *TOO HUGE*. Scheme might work and there have
>         been some people who have been fiddling with using it with Tk I
>         think.
The delivery tools available for commercial Common Lisp
implementations can much improve the footprint situation. Among
free-of-charge implementations, CLISP's full runtime and image add up
to scarcely more than a megabyte. And as others have suggested, other
(smaller) choices in the Lisp family, such as WINTERP, may be
appropriate in some situations.
> 4) Tcl (at least to my knowledge) isn't picky about how big my "int"s
>    are. Several lisps I've seen are picky.
Of course, the standard Common Lisp INTEGER class has no practical
upper bound. Unless, perhaps, you're referring to the source code
itself (of a free-of-charge CL implementation).
> 5) As much as I like the unambiguous nature of prefix-notation expressions,
>    I still can't bring myself to prefer (+ 5 (* 3 4)) over 5+3*4. This
>    is a biggy for me.
Perhaps I don't do as much heavily numerical programming as other
people; but as I see it, C statements and expressions like 'if',
'for', 'switch', and 'putchar(c)' are using prefix notation just as
Lisp does, except that Lisp's syntax is much more regular and more
amenable to programmatic manipulation.
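The "amenable to programmatic manipulation" point can be made concrete: a prefix expression is just nested data, so an evaluator for it takes only a few lines. A rough Python sketch, with lists standing in for s-expressions (the function names are invented):

```python
# (+ 5 (* 3 4)) represented as nested lists; walking the structure
# programmatically is trivial, which is the regularity argument in brief.
def ev(e):
    if not isinstance(e, list):
        return e
    op, a, b = e[0], ev(e[1]), ev(e[2])
    return a + b if op == "+" else a * b

expr = ["+", 5, ["*", 3, 4]]
# ev(expr) computes the same value as 5+3*4
```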
> 6) I like Tcl. It's simple.
>         People complain about Tcl's syntax. I didn't like it when I
>         first started toying with it, but I like it now. The reason I
>         like it now is that I find that there are a few simple and general
>         principles that it follows. Everything is handled in a kinda
>         universal fashion (like scripts being first class) which makes it
>         very powerful for me. I like the fact that I can write my own
>         control structures in Tcl, for example.
If only everyone were as open-minded about syntax as you are.
--
Lawrence G. Mayka
AT&T Bell Laboratories
l...@ieain.att.com
Standard disclaimer.
> I've suggested a Lisp shell idea in this newsgroup in the past and received
> very little enthusiasm and one responder mentioned that someone has already
> done something similar, can't remember who (perhaps it's in the FAQ by now).
> The basic idea was to organize a Lisp (say Common Lisp) into levels, level
> 0, level 1 etc. where each level is composed of more abstract Lisp functions
> that use the lower levels. Functions in levels above 0 would autoload when
> invoked. Level 0 would be very primitive but more powerful in its control
> language than say "csh". A Lisp shell, call it "lsh", then would be composed
> of Level 0 and an OS package per OS. An alias system would allow brief
> commands in the shell interpreter so that instead of entering the OS package
> function
> (os:set-default-directory "/usr/local/bin")
> the user could specify the aliasing in his ".lsh" file so that he could
> enter
> cd /usr/local/bin
> and it would expand correctly. Otherwise the shell interpreter would be a
> (print (eval (read))) loop. Shell scripts then would be ordinary Lisp load
> files which could also be compiled for faster execution.
- Division of an industrial-strength Lisp into multiple levels is one
of the goals of Eulisp, I think.
- An alternative to your "Lisp shell" idea is a command processor such
as is offered by CLIM. CLIM's command processor will take
parenthesized Lisp expressions (if you want it to), but its ordinary
syntax is more like ordinary English commands. Typing can be very
terse due to automatic command completion, but without the need to
memorize a sea of cryptic and unpronounceable 2- and 3-letter
combinations. Helpful but unintrusive feedback, such as prompts and
menus for command arguments, is provided or available at every step.
The suggested response is typically the last one of that type
previously input. All in all, it has about everything you could want
from a command-line interface--plus automatic generation, if you wish,
of corresponding gadget-based dialogs for those who prefer that style.
> 6) I like Tcl. It's simple.
I find this a *major* reason explaining the success of Tcl, and not just the
Tk toolkit. Tcl programs are easy to read (unlike C), easy to change, easy to
reuse by somebody else. Furthermore, Tcl is now becoming almost as mainstream
as C (Perl also) and so more and more people will be able to "understand" Tcl.
It's not that I consider Lisp syntax to be unclear. Personally, I like it a
lot, and find it very elegant; but most of the people I work with hate it..
and this is what counts! I can get them to learn Tcl, or Shell, but not Lisp.
Somebody posted a message asking to "liberate" Tk so that it can be used in
other languages, (which is almost the case anyway: Scheme, Perl, Python, Wool,
etc..) but the only question is Why? If you program in Perl or Scheme, etc..
of course you should be able to access Tk's features, but I would certainly
not abandon Tcl. It has its own advantages compared to the others and the
simplicity/popularity that makes it -IMHO- more interesting (probably the
same reason for the great success of "Basic" in other places..).
Cheers,
Christophe.
= How many Bell Labs Vice Presidents does it take to change a light bulb? =
= That's proprietary information. Answer available from AT&T on payment =
= of license fee (binary only). Of course, this license applies only if =
= you have AT&T light bulbs; if you have someone elses light bulbs then =
= you have to pay both their license fee and ours. =
How "easily implemented" does it have to be? The only Lisp I'm familiar
with that has a regular expression facility built-in is GNU Emacs Lisp.
And Perl's regexp's are a little different from Emacs's. Perl also
optimizes multiple regular expression comparisons against the same string
(i.e. when you have an if/then/elsif/elsif/.../else sequence it compiles
this into a single match that classifies the string and determines which is
the first matching branch).
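The if/elsif optimization described above amounts to compiling one classifying pattern instead of testing several patterns in sequence. A rough Python analogue using named groups (the branch names and patterns are invented for the sketch):

```python
import re

# Merge several candidate patterns into a single alternation with named
# groups; one match call then tells you which branch applied.
branches = [("number", r"\d+$"), ("word", r"[a-z]+$")]
combined = re.compile("|".join("(?P<%s>%s)" % (n, p) for n, p in branches))

def classify(s):
    m = combined.match(s)
    return m.lastgroup if m else None
```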
I love Lisp, but I have to admit that for the kinds of things I use Perl
for, it's damned convenient. $str = `foo bar baz` is much more concise
than
(with-open-stream (str (run-unix-program "foo" :arguments '("bar" "baz")
                                         :output :stream))
  (let ((result "")
        (newline (make-array 1 :element-type 'character
                             :initial-element #\newline)))
    (loop
      (multiple-value-bind (line eof-p) (read-line str nil nil)
        (when line
          (setq result (concatenate 'string result line
                                    (if eof-p "" newline))))
        (when (or (null line) eof-p)
          (return result))))))
Of course, I could easily write a function that does this. The point of
Perl (and Tcl, Python, and even sed and awk) is that all these convenience
operators for scripting, regular expression handling and simple output
parsing are already provided.
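For comparison, the backtick idiom is similarly concise in Python, another of the scripting languages in this thread; a runnable sketch (using "echo" as the subprocess, since "foo bar baz" is just a placeholder):

```python
import subprocess

# Rough equivalent of perl's  $str = `foo bar baz`
out = subprocess.check_output(["echo", "bar", "baz"]).decode()
```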
--
Barry Margolin
System Manager, Thinking Machines Corp.
bar...@think.com {uunet,harvard}!think!barmar
Paul Alexander
Department of Physics, University of Cambridge, Cambridge, UK.
[ deletia regarding simplicity of Tcl ]
Christophe> I find this a *major* reason explaining the success of
Christophe> Tcl, and not just the Tk toolkit. Tcl programs are easy to
Christophe> read (unlike C), easy to change, easy to reuse by somebody
Christophe> else. Furthermore, Tcl is now becoming almost as
Christophe> mainstream as C (Perl also) and so more and more people
Christophe> will be able to "understand" Tcl.
I don't think readability is a property of the Tcl language. I've
generally found readability to be linked to the original programmer's
skill, and to my own familiarity with the language. You can write
unreadable code in any language, and code in a language you don't know
is always unreadable.
I also don't think that Tcl and Perl are as mainstream as C.
Christophe> It's not that I consider Lisp syntax to be
Christophe> unclear. Personally, I like it a lot, and find it very
Christophe> elegant; but most of the people I work with hate it.. and
Christophe> this is what counts! I can get them to learn Tcl, or
Christophe> Shell, but not Lisp.
There seems to be a fixation on syntax in this thread. I don't see
what the big deal with syntax is. Generally it is the model of the
language that is important. You can change the syntax of many
languages by just using some sort of macro preprocessor -- surely
we've all seen C programs that look like Pascal because of the (silly)
use of macros like:
#define BEGIN {
#define END }
etc.
I think one reason that Tcl is more popular as an extension language
than say Scheme is that Tcl is closer to the familiar C programming
model.
Tom
--
tro...@cns.caltech.edu
"In a riddle whose answer is chess, what is the only prohibited word?"
I thought a moment and replied, "The word chess".
-- Jorge Luis Borges
Bill> In article <1994Jul15....@njitgw.njit.edu>,
Bill> Aaron Watters <aa...@funcity.njit.edu> wrote:
Bill> Unfortunately in the UNIX world, there seems to be a mentality
Bill> that nothing can ever be thrown away. Which means we still get
Bill> programs and utilities which nobody wants to use, but everyone
Bill> is too scared to delete. It's about time vendors trashed
Bill> everything above the kernel level, and started again IMHO. (Yes
Bill> that's right, no more "vi".)
This seems to be the usual response to any conflict between existing
systems, "throw it all out we'll just write another one".
I have a few questions to pose on this point, though:
1. What's wrong with od? fsck? rm? cat? (well, cat has too
many dang arguments...)
2. Why wouldn't you throw out the kernel?
3. What do you propose to replace it all/why is it any better?
4. Will you throw out what you come up with when it conflicts
with someone else's idea?
5. What makes you think that anyone would use it?
6. Who's going to do all of the training?
I think that any UNIX vendor is going to have to ask 3-6. 1-2 are just
practicality questions. In reality, these are the kinds of things that
Microsoft is realizing in trying to convert UNIX people to NT.
-AJS
--
Aaron Sherman I-Kinetics, Inc.
Systems Engineer "Open Systems Stepstones"
Voice: (617)661-8181 (x230) 19 Bishop Allen Dr.
Fax: (617)661-8625 Cambridge, MA 02139
Pager: (508)545-0584 ashe...@i-kinetics.com
Key fingerprint = 62 6A 5E EB 6B 2A 46 48 3D 06 01 79 66 A2 87 0C
Come on, Perl is just about the only language that remains
as readable under ROT13 as before.
--
=========================================================
Mark Riggle | "Give me LAMBDA or
sas...@unx.sas.com | give me death"
SAS Institute Inc., |
SAS Campus Drive, Cary, NC, 27513 |
(919) 677-8000 |
Well, I went out and got the clisp binaries from gatekeeper.dec.com.
4.5 MB is not "scarcely a megabyte". Now, I don't know what's needed; you
might be able to trim it, but I wouldn't know what I could trim (and
shouldn't need to). In addition, the text says...
"CLISP is mostly CLtL1 compliant, with some CLtL2 additions, including a
CLOS subset. Many features of CLtL2 or dpANS CL are currently not supported."
My remarks were about Common Lisp. This is only a subset of Common
Lisp (and it is still pretty big).
|>
|> 4) Tcl (at least to my knowledge) isn't picky about how big my "int"s
|> are. Several lisps I've seen are picky.
|>
|> Of course, the standard Common Lisp INTEGER class has no practical
|> upper bound. Unless, perhaps, you're referring to the source code
|> itself (of a free-of-charge CL implementation).
I was talking about the source code itself.
|>
|> 5) As much as I like the unambiguous nature of prefix-notation expressions,
|> I still can't bring myself to prefer (+ 5 (* 3 4)) over 5+3*4. This
|> is a biggy for me.
|>
|> Perhaps I don't do as much heavily numerical programming as other
|> people; but as I see it, C statements and expressions like 'if',
|> 'for', 'switch', and 'putchar(c)' are using prefix notation just as
|> Lisp does, except that Lisp's syntax is much more regular and more
|> amenable to programmatic manipulation.
I was talking about *expressions*. In C, "if" is not part of an
expression (although I realize in Lisp it is). In C you still use
5+3*4 to calculate 17. Agreed, C syntax is ugly and irregular.
However, the discussion was Lisp vs. TCL.
|>
|> 6) I like Tcl. It's simple.
|>
|> People complain about Tcl's syntax. I didn't like it when I
|> first started toying with it, but I like it now. The reason I
|> like it now is that I find that there are a few simple and general
|> principles that it follows. Everything is handled in a kinda
|> universal fashion (like scripts being first class) which makes it
|> very powerful for me. I like the fact that I can write my own
|> control structures in TCL, for example.
|>
|> If only everyone were as open-minded about syntax as you are.
I'm not sure if this is meant as sarcasm :-)
|> --
|> Lawrence G. Mayka
|> AT&T Bell Laboratories
|> l...@ieain.att.com
|>
|> Standard disclaimer.
Remember, I was only responding to why some people prefer TCL to Lisp
or Perl or whatever. I'm not going to argue which language is
technically superior...lisp would probably win. I've used lisp and I
like it, but that isn't what we are discussing...We are discussing why
lots of people like TCL, which is a purely subjective topic.
Michael Tiller
University of Illinois
(til...@solace.me.uiuc.edu)
LISP will never serve all requirements. No language ever will.
But consider the question "Why do all these people invent all these
languages?" The answer is simple: it's the natural way to solve
descriptive problems. Everybody who describes something complex goes
on to invent a problem-oriented language. So you'd better ask why we
should damn that for computer languages.
LISP is very good for programming close to some theory. But it doesn't
check the syntax (except for the balanced "()").
Moreover, LISP is interpreted --> relatively slow. That will never go away
as hardware gets cheaper, because then someone will write something
more complex with a fully or half-compiled system. This spiral never ends.
So we had better select the best language for a particular problem.
The only important thing in this sense is:
All these languages reinvent the wheel.
They all share some constructs (like if) with the same semantics and
give them a slightly different syntax. So everybody new to a particular
language has to learn a lot of redundant stuff. (Moreover, the users of
the languages get involved in religious wars about the (non-existent)
one and only true syntax; refer to all the "python is not C" statements
in c.l.python for instance.)
But the only answer to that I have (and it's not supposed to be
the best you can find) is to invent more and more languages which are
as similar to each other as possible and program with them. The best
would be to invent a language which gives you both: the freedom to
invent a new syntax (like LISP, FORTH etc.) and the power to enforce
this new syntax by the interpreter/compiler (like the EBNF-described
languages).
--
-----------------------------------------------------------------------------
Joerg Wittenberger
Rietzstr. 32b
01139 Dresden
Germany
email: j...@ibch50.inf.tu-dresden.de
j...@mail.inf.tu-dresden.de
WWW: <a href=http://www.inf.tu-dresden.de:8002/people/jw6/top.html>jerry</a>
PGP PUBLIC KEY: available on request or by finger
: Unfortunately in the UNIX world, there seems to be a mentality that
: nothing can ever be thrown away. Which means we still get programs and
: utilities which nobody wants to use, but everyone is too scared to
: delete. It's about time vendors trashed everything above the kernel
: level, and started again IMHO. (Yes that's right, no more "vi".)
But then, sometimes those programs are simply the best way of doing
things. What exactly would you replace awk with? For what it is good
at, nothing else is better. I don't use it because I'm afraid to delete
it. I use it because it's often the #1 tool for the job at hand.
No more vi? Why? It and Emacs are two of the most powerful editors
ever written. I've yet to see anything new that offered much beyond
being easier to learn. In addition, none of the new ones have the power
of something like vi. I almost lose my mind trying to use some editor
like Borland's IDE stuff or some other so-called programmers editor.
Your idea of cutting off above the kernel is good in a way and I wouldn't
mind seeing it happen. However, some tools cannot be improved upon.
They are simply great just how they are.
Finally, what tools would replace all the functions necessary on a UNIX
system? What shell would you suggest replace Bourne for the kernel?
What editor to replace vi? How can we replace:
#!/bin/nawk -f
/^SECTION/ {
    print "#define", $2, section++ > "catalog.h"
}
/^ENTRY/ {
    print "#define", $2, entry++ > "catalog.h"
}
???????????????? IMO, nothing could replace the above and be better
in any way. It's just so simple and perfect. It does *EXACTLY* what
I want and nothing more.
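To make that concrete, here is the same two-rule logic run over a small invented input file (the `catalog.txt` name and its contents are hypothetical, purely for illustration). Each awk counter starts at 0 and increments after every matching line:

```shell
# Hypothetical input in the format the script above expects
cat > catalog.txt <<'EOF'
SECTION errors
ENTRY no_such_file
ENTRY permission_denied
SECTION warnings
ENTRY deprecated
EOF

# Same two rules as the nawk script above, run as a one-liner;
# awk's uninitialized variables count from 0 automatically
awk '/^SECTION/ { print "#define", $2, section++ > "catalog.h" }
     /^ENTRY/   { print "#define", $2, entry++   > "catalog.h" }' catalog.txt

cat catalog.h
# #define errors 0
# #define no_such_file 0
# #define permission_denied 1
# #define warnings 1
# #define deprecated 2
```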
Anyway, I sometimes get frustrated by all the extra baggage of UNIX but
at the same time I cringe when I see some new OS that just doesn't have
any decent tools available for it and those that do come out are
binary-only proprietary stuff where each system's tools are different
from any other. At least with this old-clunky UNIX stuff I can write
things that I know will run on all the other systems I use. If I get
a tool for OS/2 I have absolutely no idea whether or not I'll find the
same thing for Windows NT or, God forbid, the Macintosh. Unless of
course I port over some of those old UNIX programs! :-)
: The GWYDION project, based on Dylan, this may be interesting for you.
What's that?
: Bill
: --
: bi...@zikzak.apana.org.au Bill Birch on ZikZak (Melbourne, Australia)
: {:-) A bas la loi Toubon! {:-)
--
csh
---------------------------------------------------------------------------
shen...@escape.widomaker.com (UUCP) | Amd486/40 Linux system
shen...@pcs.cnu.edu (Internet) | Christopher Newport University
Well, I went out and got the clisp binaries from gatekeeper.dec.com.
4.5 MB is not "scarcely a megabyte". Now, I don't know what's needed; you
might be able to trim it, but I wouldn't know what I could trim (and
shouldn't need to). In addition, the text says...
The original source of CLISP, 'ma2s2.mathematik.uni-karlsruhe.de', has
images prebuilt for popular platforms. The two files necessary for
execution, "lisp.run" and "lispinit.mem", total about 1.2 MB on
Solaris 2.3. I think the rest of the distribution is only necessary
for those who want to rebuild CLISP themselves, examine the source,
etc.
"CLISP is mostly CLtL1 compliant, with some CLtL2 additions, including a
CLOS subset. Many features of CLtL2 or dpANS CL are currently not supported."
My remarks were about Common Lisp. This is only a subset of Common
Lisp (and it is still pretty big).
Yes, CLISP doesn't support all of ANSI Common Lisp yet. The most
obvious omission appears to be the extended LOOP macro (which,
fortunately, is available elsewhere as a free-of-charge add-on). But
as far as image size is concerned, the missing constructs shouldn't
add much bulk percentagewise.
I'm not sure if this is meant as sarcasm :-)
No, not at all. Your willingness to learn new syntax shows more
open-mindedness than so many others appear able to muster.
How "easily implemented" does it have to be? The only Lisp I'm familiar
with that has a regular expression facility built-in is GNU Emacs Lisp.
Lisp systems that contain an embedded Emacs editor certainly have this
capability, but it unfortunately is not typically made conveniently
available to programmers. And the publicly available 'match' facility
is too verbose for most people's taste.
I love Lisp, but I have to admit that for the kinds of things I use Perl
for, it's damned convenient. $str = `foo bar baz` is much more concise
than
(with-open-stream (str (run-unix-program "foo" :arguments '("bar" "baz")
                                         :output :stream))
  (let ((result "")
        (newline (make-array 1 :element-type 'character
                               :initial-element #\Newline)))
    (loop
      (multiple-value-bind (line eof-p) (read-line str nil nil)
        (when line
          (setq result (concatenate 'string result line
                                    (if eof-p "" newline))))
        (when (or (null line) eof-p)
          (return result))))))
I myself would write
(string-trim '(#\Newline #\Space #\Tab)
             (with-output-to-string (*standard-output*)
               (foreign:call-system-showing-output "foo bar baz"
                                                   :prefix ""
                                                   :show-cmd nil)))
on LispWorks.
But I agree that we could use a de-facto-standard library of
shell-like capabilities such as this one.
Of course, I could easily write a function that does this. The point of
Perl (and Tcl, Python, and even sed and awk) is that all these convenience
operators for scripting, regular expression handling and simple output
parsing are already provided.
I think the original question that started this thread was, "Why did
people go to the trouble of designing a whole new language for these
operations when it would have been so much easier to just graft them
into Lisp?" Perhaps part of the answer is simply, "Because they
didn't know about CLISP."
Is it really worth taking so much bandwidth for this thread?
We'll stop soon. :-)
But consider the question "Why do all these people invent all these
languages?" The answer is simple: it's the natural way to solve
descriptive problems. Everybody who describes something complex goes
on to invent a problem-oriented language. So you'd better ask why we
should damn that for computer languages.
The Lisp community's argument is that it's much easier to write a
problem-oriented language on top of Lisp--incrementally as you need it
and integrated smoothly with the base language and other
problem-oriented languages--rather than to design and implement it
from scratch, totally divorced from the base language and all the
other problem-oriented languages people have written.
Moreover, LISP is interpreted --> relatively slow. That will never go away
as hardware gets cheaper, because then someone will write something
more complex with a fully or half-compiled system. This spiral never ends.
Correction: Commercial Common Lisp systems compile to machine code.
One can optimize time-critical code legs by adding (optional) type
declarations.
Needless to say, a commercial Common Lisp compiler generally enforces
the language's compile-time rules on a program, otherwise the compiler
could not generate correct machine code for that program. (A run-time
rule such as "don't call a function that hasn't been defined" can only
be enforced at run time via the condition system, although most CL
compilers will print some kind of information message if a referenced
function has not yet been defined.)
> In article <30eil2$7...@ef2007.efhd.ford.com>
> mti...@cands1.srl.ford.com (Michael Tiller) writes: Well, I
> went out and got the clisp binaries from gatekeeper.dec.com.
> 4.5 MB is not "scarcely a megabyte".
CLISP uses about 1.5 MB.
-rwxr-xr-x 1 marcus source 939012 Jul 19 06:58 lisp.run
-rw-r--r-- 1 marcus source 692412 Jul 17 16:54 lispinit.mem
USER PID %CPU %MEM SIZE RSS TTY STAT START TIME COMMAND
marcus 2675 24.5 3.6 1286 1160 pp6 S 06:55 0:00 lisp.run -M compiled.mem
And, like it says:
"CLISP is mostly CLtL1 compliant, with some CLtL2
additions, including a CLOS subset. Many features of CLtL2
or dpANS CL are currently not supported."
A loop macro is on the way. Other dpANS improvements are
addressed as they come up.
Of course. There are few if any programming languages in which it is
impossible to write code that's difficult to understand. This is the
nature of the problem domain.
You're blaming the language for the practices of some of its practitioners.
Careless or rushed or ignorant or devious programmers can obfuscate any
language. Perl can be made utterly legible, or perfectly inscrutable --
but then, so can nearly anything else.
--tom
"Reasonable" is in the mind of the reasoner, and varies.
--tom
> Perhaps I don't do as much heavily numerical programming as other people;
> but as I see it, C statements and expressions like 'if', 'for', 'switch',
> and 'putchar(c)' are using prefix notation just as Lisp does, except that
> Lisp's syntax is much more regular and more amenable to programmatic
> manipulation.
Well, it has to do with how we think about the math, and about the control
structures. Most people, when envisioning an equation, -think- something
along the lines of
4 + ( n**6 ) + (3/6N) ... so it's easier to write it that way. But control
structures are different (and here's the PERL tie-in): we think of structure
in terms of
"Do this until condition"
and
"I want X to range over this range, and for each value, do this"
.. which are more naturally expressed in the "prefix notation" you describe.
--
< a...@cis.ufl.edu >| Please feel free to respond to anything I
< (904)373-0906 >| say: I have strong opinions, but a wide-open
< 1936 N.W. 2nd Avenue >| mind.
< Gainesville, FL 32603 >| BUT KEEP THE FLAMES TO E-MAIL! (geez!)
-"My marriage is on the rocks, but only because we can't find a blanket."
<a href=http://www.cis.ufl.edu/~asr/asr.html>My Home Page</a>
True, but no one in this thread has suggested using APL or TECO as a
scripting language. The closest thing in any of the languages under
discussion is Common Lisp's FORMAT control string (which is almost Turing
complete -- it can call arbitrary functions as subroutines, but I'm not
sure the order of calls is specified).
In article <1994Jul19....@escape.widomaker.com> shen...@escape.widomaker.com (Shannon Hendrix) writes:
>Bill Birch (bi...@zikzak.apana.org.au) wrote:
>Anyway, I sometimes get frustrated by all the extra baggage of UNIX but
>at the same time I cringe when I see some new OS that just doesn't have
>any decent tools available for it and those that do come out are
>binary-only proprietary stuff....
Sounds like a job for HIERARCHICAL STORAGE: put all those silly tools
written by MIT undergrads in the seventies off onto a robotic tape
farm where they belong, but be able to get them back transparently in
case you accidentally hire a guy/gal who wrote one as a consultant.
No. Let's not replace awk. But it seems to me that all this stuff
that is being crammed into perl is a bit beyond the scope of the
language: do we really want to be writing client/server db
applications over Oracle on an ad-hoc language that doesn't support complex
data types?
Aaron Watters
Department of Computer and Information Sciences
New Jersey Institute of Technology
University Heights
Newark, NJ 07102
phone (201)596-2666
fax (201)596-5777
home phone (908)545-3367
email: aa...@vienna.njit.edu
A LISP Shell
John R. Ellis
SIGPLAN Notices, Volume 15, Number 5, May 1980.
At Yale, Ellis built a UNIX shell on top of a LISP interpreter that
included i/o redirection, background processes, pipes, and other
goodies. The LISP shell took about 60K bytes: 20K for the
interpreter, 40K for the shell functions. The source code listing was
20 pages, including comments.
--Stephen Slade
vro...@netcom.com (William Paul Vrotney) writes:
>The basic idea was to organize a Lisp (say Common Lisp) into levels, level
>0, level 1 etc. where each level is composed of more abstract Lisp functions
>that use the lower levels. Functions in levels above 0 would autoload when
>invoked. Level 0 would be very primitive but more powerful in its control
>language than say "csh". A Lisp shell, call it "lsh", then would be composed
>of Level 0 and an OS package per OS. An alias system would allow brief
>commands in the shell interpreter so that instead of entering the OS package
>function
If someone /has/ implemented this system, I'd nearly kill for this
shell... Preferably with both SunOS and Linux versions, since I'll be
moving that way soon.
*drool* My gods, that's a lovely thought...
--
tha...@runic.via.mind.org (Alexander Williams) | PGP 2.x key avail
Email is the right of the masses. So do it. | DF 22 16 CE CA 7F
Do What Thou Wilt Shall Be the Whole of the | 98 47 13 EE 8E EC
Law. Love is the Law, Love Under Will. -oOo- | 9C 2D 9B 9B
=====================================================================
"Democracy isn't just the best form of government; its the only one
even remotely worth a damn. Only democracy guarantees people get
what they deserve."
-- Zeno Marley (Early 21st Century Mercenary-Philosopher)
[Dark Conspiracy RPG, pg 29]
I'm very glad that the vendors don't agree with you. The toolkit, which has
evolved over time, might have some tools that could be better-designed, but
it has a lot of useful stuff, which I'm glad doesn't get thrown out.
For example, I was recently writing a program to automate installing a lot
of software packages. Sometimes those packages have dependencies on each
other; in that case, you should install the one depended on before you
install the one that depends on it.
Performing the sort to implement this ordering isn't a horribly difficult
job, but neither is it trivial. The code to do that has been swallowed up in
modern ranlib+ld, but once upon a time object libraries were kept in
dependency order, using lorder(1) and tsort(1). Happily, tsort(1) does the
precise job I needed, and it was implemented as a separate, reusable
program, and that program has been kept around.
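tsort(1) reads pairs "A B" meaning "A must come before B" and prints one valid topological ordering. A quick sketch with invented package names (`deps.txt`, `libc`, `libfoo`, and `app` are all hypothetical):

```shell
# Hypothetical package dependencies: each "A B" line means
# A must be installed before B
cat > deps.txt <<'EOF'
libc libfoo
libfoo app
libc app
EOF

# Print a valid installation order; here the constraints force
# exactly one possible ordering
tsort deps.txt
# libc
# libfoo
# app
```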
It's possible that there's a standard Unix utility out there that is
actually useless, in the sense that it isn't currently being used to get
useful work done. But I doubt it.
-Bennett
b...@sbi.com
>
> Is it really worth taking so much bandwidth for this thread?
>
> LISP will never serve all requirements. No language will ever do.
>
> But consider the question "Why do all these people invent all these
> languages?" The answer is simple: it's the natural way to solve
> descriptive problems. Everybody who describes something complex goes
> on to invent a problem-oriented language. So you'd better ask why we
> should damn that for computer languages.
>
> LISP is very good for programming close to some theory. But it doesn't
> check the syntax (except for the balanced "()").
>
> Moreover, LISP is interpreted --> relatively slow. That will never go away
> as hardware gets cheaper, because then someone will write something
> more complex with a fully or half-compiled system. This spiral never ends.
>
> So we had better select the best language for a particular problem.
>
What do you call "Norm-Crosbyism" in German?
--
Bill Vrotney - vro...@netcom.com
Common myth. Repeating it over and over doesn't make it true.
Most Lisp implementations these days include a compiler.
--
Simon.
I agree with you, and have for quite some time. I suspect that Perl 5 goes
a long way towards addressing some of your concerns.
Larry
Wake up, boys and girls! What makes Unix so programmer-friendly is that it
is a Swiss Army knife - it comes with two screwdrivers and one of everything else.
Lots of little tools, all custom-made for a different niche.
To me it sounds like sour grapes. "I can't learn all this, there's too much!
Who needs all this crud anyway?" BTW, Unix doesn't have tool bloat. My
Linux machine has swap space, the kernel, utilities, a developer's environment
and a desktop publishing system sitting on a 120Meg disk. It DOES have
kernel bloat, which will only get worse as everybody clamors to get their
firm's bells and whistles into COSE. G-d I miss Version 7.
In article <1994Jul19....@njitgw.njit.edu> aa...@funcity.njit.edu (Aaron Watters) writes:
Sounds like a job for HIERARCHICAL STORAGE: put all those silly tools
written by MIT undergrads in the seventies off onto a robotic tape
farm where they belong, but be able to get them back transparently in
case you accidentally hire a guy/gal who wrote one as a consultant.
Or someone wha has used them for the past 15 years.
About changing `vi': I'm used to it. I'd like to see something more powerful,
perhaps with a scripting language (embeddable Perl? :-), and some X
power. Don't throw them away, improve them.
No. Let's not replace awk. But it seems to me that all this stuff
that is being crammed into perl is a bit beyond the scope of the
language: do we really want to be writing client/server db
applications over Oracle on an ad-hoc language that doesn't support complex
data types?
I do most of my scripting in tkSybPerl. I like being able to read a row into
an array and not worry about binding, converting, et al. I love having Perl's
"format" and "write" on a dB script. I also like being able to query the user
using GUI's.
About not supporting complex data types, two answers:
1- Oracle doesn't either.
2- I write s-expressions ("nested lists") in Perl 5 regularly. I think,
although I haven't tried it, that an associative array of references
to lists might be very useful in handling relational DBs.
--
Micha Berger Ron Arad, Zechariah Baumel, Zvi Feldman, Yehudah Katz:
mbe...@lehman.com May the Omnipresent have mercy on them and take them from
(212) 464-6565 restraint to openness, from dark to light, from slavery
(201) 916-0287 to salvation.
> There is (if I may say so (and I really hope I may)) a small disadvantage
> (maybe not so small either (in fact rather large (not to say huge))) in the
> usage of lisp (or scheme (or any other lookalikes)) that readability is
> rather low (very low (if you want my opinion)).
No one comparing lisp and perl can complain about lisp's readability.
SENDMAIL config files are more readable than perl. Considerably more
readable, in fact.
Anyway, just why do people think lisp is unreadable (complex
arithmetic expressions apart)?
--tim
>No one comparing lisp and perl can complain about lisp's readability.
>SENDMAIL config files are more readable than perl. Considerably more
>readable, in fact.
>Anyway, just why do people think lisp is unreadable (complex
>arithmetic expressions apart)?
I think the more relevant question is why you think Perl is unreadable. It's
a very straightforward structured language, much like C, and very readable
to C programmers. I can't imagine why you think Perl is unreadable, unless
you think C is also unreadable.
I think many people find LISP unreadable because its philosophy is so
different from other languages (I don't count Scheme as a language in its
own right), while awk and perl share the same structured philosophy as C,
Pascal, Algol, etc.
In my opinion, this "screwdriver" argument is silly; Lisp is not a
screwdriver, but rather more of a "screwdriver representation
language." In other words, you are comparing screwdrivers to a
language capable of representing (and therefore building) any kind of
screwdriver you can imagine.
>Wake up, boys and girls! What makes Unix so programmer-friendly is that it
>is a Swiss Army knife - it comes with two screwdrivers and one of everything else.
>Lots of little tools, all custom-made for a different niche.
Wake up, Micha! What makes Lisp so programmer friendly is that it is
capable of representing any Swiss Army knife you would like! And all
such Swiss Army knives can be "interoperable" and "open" and all those
other wonderful buzz words!
>To me it sounds like sour grapes. "I can't learn all this, there's too much!
>Who needs all this crud anyway?" BTW, Unix doesn't have tool bloat. My
>Linux machine has swap space, the kernel, utilities, a developer's environment
>and a desktop publishing system sitting on a 120Meg disk. It DOES have
>kernel bloat, which will only get worse as everybody clamors to get their
>firm's bells and whistles into COSE. G-d I miss Version 7.
Guess what? Lisp doesn't have kernel bloat (perhaps feature bloat for
Common Lisp, but certainly not kernel bloat). An important difference
between Lisp and Unix is that every tool in Unix has its own
quirkiness, its own special syntax for args, etc. This is not sour
grapes, it's a serious point: when using the more obscure Unix tools,
life is hell without the man pages. In contrast, I seldom refer to
CLtL, though (of course) I do use 'describe'. Lisp is simply more
"unified", more consistent. It is easier to "wire together" diverse
tools.
Case in point: a few months ago, I had a need for a desk calculator
capable of doing algebraic manipulation, and I really wanted a GUI
interface and PostScript output. Graphs would be a big plus, and I
also wanted to process a large quantity of tabular data coming from
Sybase. So, using CMU Lisp, Garnet, and a Macsyma-like package, I
wired one up with a little "glue" code and some nice window stuff.
The hardest part was the Lisp-Sybase interface, pain which could have
been avoided by actually (*horrors*) spending money.
Net development time: a week of my spare time. As they say on Saturday
Night Live, "No big whoop." I ended up with the world's most powerful
desk calculator for free.
The consistency issue is what made this all possible; I didn't have to
learn the insides of 3 tools to get them all hooked up. Look at Perl
as an example of inconsistency: while I agree that it is a neat tool,
the syntax is not entirely consistent, and the division between what
is built-in versus what is in external libraries seems rather
arbitrary. I would, in fact, say that Perl itself has kernel bloat.
Another major criticism I have of the Unix "many tools" approach is
that code written for the tools mentioned on the subject line is
typically reused via the "cut and paste" route -- not the most
effective software development strategy! While all of these tools
support load "modules", this feature is rarely used, in my experience.
I have seen numerous windows developed in Tk, only to be cut and
pasted to numerous other applications, in order to achieve common
"look and feel"; unfortunately, when the time comes to make a change,
all hell breaks loose.
One of the reasons for the failure to use the load module features
seems to be the desire for total portability across Unix (and even
sometimes DOS) systems. None of the named tools supports a
comprehensive module facility that makes up for the lack of
compatibility between the target environments. In all fairness, Lisp
also lacks such a comprehensive facility (so long old 'require'), but
the difference seems to be that Lisp developers generally include a
fairly general loader for their software. It is generally easy to
take the provided loader and make it functional on your platform.
Perhaps the Lisp situation is skewed towards using load modules (even
if it requires some development) because we Lisp'ers really abhor cut
and paste reuse. In contrast to the named tools, Lisp is the only
tool which provides a really comprehensive namespace conflict
avoidance and resolution facility (Perl is the only one which comes
close, and Perl's namespace is much less sophisticated).
>About changing `vi': I'm used to it. I'd like to see something more powerful,
>perhaps with a scripting language (embeddable Perl? :-), and some X
>power. Don't throw them away, improve them.
Gee, I guess you've never used Emacs (especially V19, or Lucid-Emacs).
>I do most of my scripting in tkSybPerl. I like being able to read a row into
>an array and not worry about binding, converting, et al. I love having Perl's
>"format" and "write" on a dB script. I also like being able to query the user
>using GUI's.
That's 3 programming languages right there! (TCL, SQL, and Perl) I
can't even fathom doing anything resembling serious development in
such an environment. And to think that with the three languages you
don't even achieve a diversity of programming paradigms (imperative vs
logical, for example)!
>About not supporting complex data types, two answers:
> 1- Oracle doesn't either.
> 2- I write s-expressions ("Nested Lists") in Perl 5 regularly. I think,
> although I haven't tried it, that an associative array of references
> to lists might be very useful in handling relational DBs.
Gee, what an answer: because Oracle doesn't support it, I don't need
it (implying that complex data types will not make the application any
easier to develop). As for #2, knowing Perl, I say Blechhh.
>--
>Micha Berger Ron Arad, Zechariah Baumel, Zvi Feldman, Yehudah Katz:
>mbe...@lehman.com May the Omnipresent have mercy on them and take them from
>(212) 464-6565 restraint to openness, from dark to light, from slavery
>(201) 916-0287 to salvation.
-frank
--
f...@panix.com | Just another bumper sticker on the
1 212 559 5534 | Information Superhighway.
1 917 992 2248 |
1 718 746 7061 |
Ok. Ok. Sorry for that.
But halfway compiled or not - it makes a difference, but not too
much.
But who tells me the following is not true:
Whenever you have something written in LISP, there will be a way to
write it in another language so that the other program is faster?
For a fixed program, the ratio
execution time of interpreted program : execution time of compiled program
is usually 5 : 1 for CLISP
and 10 : 1 or 20 : 1 for other
CL implementations.
> But who tells me the following is not true:
> Whenever you have something written in LISP, there will be a way to
> write it in another language so that the other program is faster?
Whenever you have something written in C, there will be the possibility
to rewrite it in assembly language, doing global register allocation by
hand. I have done this for some time, and the outcome is usually twice
as fast. You just need a bit more manpower than for the simple C version.
Whenever you have something written in assembly language, you can still
design a custom chip that will perform the same operations in hardware,
much faster.
So, what's your point?
Bruno Haible
hai...@ma2s2.mathematik.uni-karlsruhe.de
[deleted stuff about readability of lisp as compared to Perl]
>I think the more relevant question is why you think Perl is unreadable. It's
>a very straightforward structured language, much like C, and very readable
>to C programmers. I can't imagine why you think Perl is unreadable, unless
>you think C is also unreadable.
Stupid thread. Everyone finds some languages easier than others. I
find it extremely difficult to read C (among others), but very simple
to read both Perl and Lisp (and half a dozen other languages).
Who gives a toss?
--
Jack j...@biu.icnet.uk
If you only have a hammer, you tend to see every problem as a nail.
-- Maslow
I agree...
>In my opinion, this "screwdriver" argument is silly; Lisp is not a
>screwdriver, but rather more of a "screwdriver representation
>language." In other words, you are comparing screwdrivers to a
>language capable of representing (and therefore building) any kind of
>screwdriver you can imagine.
Oops, I thought I used languages to represent my thoughts to other people
or machines, and the language I choose to use depends on the situation.
The more languages you know the more freedom you have in that choice, I
*think* I can assert that anything I can do in Perl, lisp, basic, pascal,
or whatever could also be done in assembler, but that's no reason to dump
them.
>>About changing `vi'. I'm used to it. I'd like to see something more powerful,
>>perhaps with a scripting language (embeddable Perl? :-), and some X
>>power. Don't throw them away, improve them.
>
>Gee, I guess you've never used Emacs (especially V19, or Lucid-Emacs).
Please, not content with a language skirmish let's bring editors into it
as well to make it a "proper" war :-(
>>I do most of my scripting in tkSybPerl. I like being able to read a row into
>>an array and not worry about binding, converting, et al. I love having Perl's
>>"format" and "write" on a dB script. I also like being able to query the user
>>using GUI's.
>
>That's 3 programming languages right there! (TCL, SQL, and Perl) I
>can't even fathom doing anything resembling serious development in
>such an environment. And to think that with the three languages you
>don't even achieve a diversity of programming paradigms (imperative vs
>logical, for example)!
...but alas it probably does the job well enough and quickly enough to
keep the user happy :-) This is usually the point of the exercise, the
end is usually what the person paying me cares about, the means are
incidental.
I find that the language I use affects the way I see a task, and I am
grateful for the diversity of languages available, it would be a sad day
were someone to say X is the only language and Y is the only editor and
all others are hereby banned.
Mike
--
The "usual disclaimers" apply. | Meiko
Mike Stok | 130C Baker Ave. Ext
Mike...@meiko.concord.ma.us | Concord, MA 01742
Meiko tel: (508) 371 0088 x124 |
They did, and then discovered a 1.5 megabyte executable to search through
a file was a bit of a drag on the system.
--
Peter da Silva `-_-'
Network Management Technology Incorporated 'U`
1601 Industrial Blvd. Sugar Land, TX 77478 USA
+1 713 274 5180 "Hast Du heute schon Deinen Wolf umarmt?"
1) Regular expression matching a-la Perl and Awk.
2) A fairly decent general purpose programming language.
3) At least one free and reasonably portable implementation.
4) Modestly sized executables that can be
made to behave as filters in a UNIX pipe.
5) "One-liner" capability.
All of these features have to be available at the same time, in the
same implementation!
As far as I know, features 1) and 4) are the main problems as far as
Common Lisp is concerned. Both of these problems _could_ be solved, 1)
more easily than 4).
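To make feature 5 concrete: the "one-liner" style the list refers to is a small regex filter dropped into a UNIX pipe, as awk '/pattern/' or perl -ne provide. A rough Python sketch of such a filter (pattern and data invented for illustration):

```python
# A minimal sketch combining features 1, 4 and 5: a regex-matching
# line filter of the kind used as a stage in a UNIX pipe. The pattern
# and sample lines below are invented for this illustration.
import re

def regex_filter(pattern, lines):
    """Return only the lines matching the regular expression."""
    rx = re.compile(pattern)
    return [line for line in lines if rx.search(line)]

# Filtering a stream, as a pipe stage would:
print(regex_filter(r"err", ["ok\n", "error: disk\n", "done\n"]))
# -> ['error: disk\n']
```

The thread's complaint is that in 1994 no Common Lisp offered this in a small executable suitable for casual pipe use, not that the logic itself is hard.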
--
(Rmz)
Bjørn Remseth !Institutt for Informatikk !Net: r...@ifi.uio.no
Phone:+47 22855802!Universitetet i Oslo, Norway !ICBM: N595625E104337
Of course, when you do write it, you have to call it '16' (for 'xvi'),
unless you make a Motif version, in which case you have to call it
'1006' (for 'mvi'). Finally, object-oriented vi would be '7' ('vi++' ==>
'6++' ==> '7').
Adios,
Logan
--
The genius of France can be seen at a glance
And it's not in their fabled fashion scene
It's not that they're mean, or their wine, or cuisine
I refer of course to the guillotine
(the French knew how to lynch)
T-Bone Burnett, "I Can Explain Everything"
[ Just being the devil's advocate here... ]
The same is true for assembly. What's your point?
> One of the reasons for the failure to use the load module features
> seems to be the desire for total portability across Unix (and even
> sometimes DOS) systems. None of the named tools supports a
> comprehensive module facility that makes up for the lack of
> compatibility between the target environments.
What is it about perl's "require" that you don't like?
> >About changing `vi'. I'm used to it. I'd like to see something more powerful,
> >perhaps with a scripting language (embeddable Perl? :-), and some X
> >power. Don't throw them away, improve them.
>
> Gee, I guess you've never used Emacs (especially V19, or Lucid-Emacs).
Please - all we need is to add a vi-versus-emacs religious war to this
discussion! In case you haven't noticed, it has been done before. To
paraphrase Sting,
There's no such thing as a winnable editor flame war
It's a lie we don't believe anymore
Lisp remains just about as readable after being piped through 'fmt'.
As with any language, it's not the language that is the primary cause of
unreadable code, it's the person who wrote the code. Perl, like any other
language, lets you write unreadable code (regexes being the primary culprit
here), but with a little care and thought readable code is quite possible.
--
Brian Blackmore (b...@gryphon.demon.co.uk) Phone: +44 81 391 1116
They gave it me,-- for an un-birthday present.
> Well, I went out and got the clisp binaries from gatekeeper.dec.com.
> 4.5Mb is not "scarcely a megabyte". Now I don't know what's needed; you
> might be able to trim it, but I wouldn't know what I can trim (and
> shouldn't need to). In addition, the text says...
For which platform? I have a recent (but not the latest) version of
CLISP. The .EXE file is 638K, and the image file is 536K. I'll agree
that this is not a complete Common Lisp, but I'm told that some other
CL systems are far smaller than others.
It's easy to play games with the figures, like suggesting that a C
program takes up more than 50 MB, simply because you want the complete
development system included. That would be more powerful, but it's
quite unrealistic for most people.
One of the problems that I see with CL is that too many CL programmers
insist that the development system should be available to the user.
I've yet to see any figures for a stand alone app written in CL, or
a CL app that has been translated into C and then compiled.
Another issue that rarely gets mentioned is the use of shared libs
for stand alone CL code. Does anyone have figures for a 10,000+ line
CL app written in Allegro CL for Windows?
--
Martin Rodgers, WKBBG, London UK AKA "Cyber Surfer"
If "One likes to believe in the freedom of email", email
clipper....@cpsr.org and tell them you oppose Clipper.
This is a shareware .signature -- please pass it on!
> LISP is very good for programming close to some theorie. But doesn't
> check the syntax (except for the balanced "()").
Define what you mean by "syntax". There's a difference between a
Lisp expression and an expression read by the Lisp parser. Both
use the same "syntax", but there's another level of "syntax" used
by Lisp code.
It's similar in some ways to Forth, where the parser only knows about
one "level" of the syntax (the lexical level). The semantic level is
something else. The parser in Forth isn't usually implemented in the
same way as in most languages - tools like lex and yacc might not work
for many Forths.
> Moreover LISP is interpreted --> relatively slow. That will never go away
> when hardware gets cheaper, because then someone will write something
> more complex with a fully or half-compiled system. This screw never ends.
Which Lisp? XLISP? Sure, but many others are compiled, interpreted,
or whatever. There are many different ways of implementing a language,
and just referring to "Lisp" doesn't name a dialect, never mind a single
implementation.
There are also interpreted C systems. :-)
> So we better select the best language for a particular problem.
And then select the "best" implementation? That's not easy, when
there are so many choices. I could use a "Lisp" with the semantics
of C if I wanted. In fact, I might do that, as I've always wanted
an "ideal" language in which to re-write my Lisp interpreter. It
would be a "C" like "Lisp" and compile directly into C.
> The only important thing in this sense is:
>
> All these languages reinvent the wheel.
All languages perform a useful function: they allow programmers to
write code that will run on a computer. Some languages are no longer
with us, and many implementations no longer have machines to run on.
Most languages beat using hex switches on a front panel. :-)
Everything else is just religious war. I don't give a shit about
which languages other people use, which dialects, or which vendor
they get their implementation(s) from. The only time it might
be useful to compare different languages and language systems might
be when there are useful ideas to exchange.
Mind you, people can get killed for expressing the "wrong" ideas.
Anyone who wants to change my mind can do it very easily. Just pay
me to think something else, and use your favourite language. You
don't have to kill me.
> In my opinion, this whole thread is silly. The premise is (see subject line)
> that everything can be done in lisp, as if one screwdriver could fit every
> screw.
>
Did you ever use a Symbolics or a Xerox D-Machine? Everything IS done in
Lisp and it is WONDERFUL. Lisp is one of the few high level languages that
is elegant enough to serve as a basis for a machine language.
Wittenberger> Moreover LISP is interpreted --> relatively slow. That will
Wittenberger> never go away when hardware gets cheaper. Because then
Wittenberger> someone will write something more complex with a fully or
Wittenberger> half-compiled system. This screw never ends.
LISP can be interpreted or compiled. It is _not_ a purely interpreted
language family. For example, most commercial Common Lisp
implementations compile source to NATIVE MACHINE INSTRUCTIONS.
Jason
--
_____________________________________________________________________________
| Jason Trenouth, | EMAIL: ja...@uk.co.harlequin |
| Harlequin Ltd, Barrington Hall, | TEL: (0223) 872522 |
| Barrington, Cambridge CB2 5RG, UK | FAX: (0223) 872519 |
Michael> 4) Tcl (at least to my knowledge) isn't picky about how big
Michael> my "int"s are. Several lisps I've seen are picky.
Common Lisp isn't. It supports _arbitrary precision_ integer
arithmetic. Can Tcl do that?
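To illustrate what arbitrary-precision integer arithmetic buys you, here is a sketch in Python, whose integers also promote to bignums automatically; Common Lisp's integer type behaves the same way, with no declarations needed:

```python
# Bignum arithmetic, illustrated in Python as a stand-in for Common
# Lisp's arbitrary-precision integers: exact results regardless of
# machine word size, with no overflow and no special code.
def factorial(n):
    """Exact factorial; the result grows past 64 bits without overflow."""
    result = 1
    for i in range(2, n + 1):
        result *= i
    return result

print(factorial(25))  # -> 15511210043330985984000000, far beyond 2**64
```

In a language limited to machine-word "int"s, the same loop would silently overflow somewhere around 20!.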
First, I checked, and I am not permitted to pass on this code (it was
developed on my employer's time).
But, yes, it should run on a mac, with the new (alpha) version of
Garnet. The only problem on the mac would be the Sybase interface,
but that could be changed to some other DB.
The PC would be a problem, as Garnet has not been ported to windows.
However, the non-GUI parts could be used on the PC, using the
Lisp-interaction mode.
Other Lisps should not present much of a problem: the symbolic
package has been written with portability to many Lisps, as has
Garnet. I might need to rebuild the loader, but nothing major.
Note that with the alternative of Tk/Perl/etc this would 1) not be
possible at all (would you seriously consider writing a symbolic
algebra package in Perl, or even C for that matter), and 2) would not
be portable to PC/Mac either (is Tk available for PC/Mac -- and
moreover, is it compatible with X?). If we throw Mathematica into the
equation, we get portability, but is there a Sybase (or other RDB)
interface available for Mathematica? And can I use a logic
programming style in Mathematica (backtracking)?
>Trying to do *everything* in lisp is just plain silly! Use the tools
>that are right for the problem.
Well, I suppose that you could have saved the various Lisp machine
companies lots of grief had you only told them this 15 years ago...
:-) (Breaking my no-smileys vow, lest this ignite an all-out flame war...)
>Yusuf
>Ps: I do almost all my programming in lisp, but I would write a perl
>script, a shell script or anything else if I thought it could solve my
>problem quicker.
Me too, but generally it can't. I'm not above the occasional Perl
script, or even a Tk window. But those apps have to be very small to
make the Perl or Tk route pay off. None of this Perl/Tk/Sybase/etc
development -- it gets to be a mess, quickly.
>Yusuf Pisan
>y-p...@nwu.edu
>Qualitiative Reasoning Group
>The Institute for the Learning Sciences
>Northwestern University
-frank
--
f...@panix.com | You never step into the same river twice.
1 212 559 5534 | -- Heraclitus
I'll bet you have trouble with Greek and Arabic too.
: Anyway, just why do people think lisp is unreadable (complex
: arithmetic expressions apart)?
Lisp has all the visual appeal of oatmeal with fingernail clippings mixed in.
Larry Wall
lw...@netlabs.com
From: j...@ibch50.inf.tu-dresden.de (Wittenberger)
Newsgroups: comp.lang.lisp,comp.lang.perl,comp.lang.tcl,comp.lang.clos
Date: 20 Jul 1994 19:18:25 +0200
Organization: Dept. of Computer Science, TU Dresden, Germany
In article <SIMON.94J...@liasg5.epfl.ch> si...@lia.di.epfl.ch (Simon Leinen) writes:
...
Ok. Ok. Sorry for that.
But halfway compiled or not - it makes a difference, but not too
much.
But who tells me the following is not true:
Whenever you have something written in LISP, there will be a way to
write it in another language so that the other program is faster?
This is another myth. Lisp can be as fast as C, even faster. Well
written, comparable C and Lisp programs should have comparable performance.
I have a draft paper, Courage in Profiles, that makes a comparison of two
programs written in C and Lisp: ftp from wheaton.bb.ncom:
pub/profile/profile.ps.
k
--
Ken Anderson
Internet: kand...@bbn.com
BBN ST Work Phone: 617-873-3160
10 Moulton St. Home Phone: 617-643-0157
Mail Stop 6/4a FAX: 617-873-2794
Cambridge MA 02138
USA
Perl is in another league -- one in which it has become one of the most
powerful, though not as easy as awk. Hopefully, it becomes a part of the standard
distribution and becomes what BASIC was to all those microcomputers in the
early 80's!!
LONG LIVE PERL!!!
But who tells me the following is not true:
Whenever you have something written in LISP, there will be a way to
write it in another language so that the other program is faster?
First, this is not true if the Lisp program implements a highly
efficient but complex algorithm whose implementation in the other
language is simply not feasible and reasonable (typically, because
expressing the algorithm in that other language would exceed the
complexity that a human programmer can deal with sensibly). The other
language must then resort to a slower, simpler algorithm.
Second, high-quality Common Lisp compilers can often generate code
just as efficient as C compilers when they are given the same amount
of information about the program (e.g., type declarations) and are
allowed to make the same assumptions and impose the same restrictions
(e.g., no incremental update of a running program).
--
Lawrence G. Mayka
AT&T Bell Laboratories
l...@ieain.att.com
Standard disclaimer.
In article <LGM.94Ju...@polaris.ih.att.com>,
Lawrence G. Mayka <l...@polaris.ih.att.com> wrote:
> I think the original question that started this thread was, "Why did
> people go to the trouble of designing a whole new language for these
> operations when it would have been so much easier to just graft them
> into Lisp?"
They did, and then discovered a 1.5 megabyte executable to search through
a file was a bit of a drag on the system.
About 700KB of CLISP is object-file text and could perhaps go into a
shared library. A large portion of the remainder is, I think,
byte-compiled Lisp text and could perhaps go into shared memory.
Perhaps someone will look into these possibilities?
--Bennett
-b...@sbi.com
--
Marty Cohen, AMSAA-LAD mco...@arl.mil Custom House Rm 800,
Phila. PA 19106-2976 (215)597-8377 Fax (215)597-2240
> For a fixed program, the ratio
>
> execution time of interpreted program : execution time of compiled program
>
> is usually 5 : 1 for CLISP
> and 10 : 1 or 20 : 1 for other
> CL implementations.
That's far from my point.
> Whenever you have something written in C, there will be the possibility
> to rewrite it in assembly language, doing global register allocation by
> hand. I have done this for some time, and the outcome is usually twice
> as fast. You just need a bit more manpower than for the simple C version.
>
> Whenever you have something written in assembly language, you can still
> design a custom chip that will perform the same operations in hardware,
> much faster.
>
> So, what's your point?
You've hit it! My point is the absurd religious war this thread has become,
with its obvious hope of breaking that fact in LISP's favor.
> But who tells me the following is not true:
> Whenever you have something written in LISP, there will be a way to
> write it in another language so that the other program is faster?
>
> First, this is not true if the Lisp program implements a highly
> efficient but complex algorithm whose implementation in the other
> language is simply not feasible and reasonable (typically, because
> expressing the algorithm in that other language would exceed the
> complexity that a human programmer can deal with sensibly). The other
> language must then resort to a slower, simpler algorithm.
>
> Second, high-quality Common Lisp compilers can often generate code
> just as efficient as C compilers when they are given the same amount
> of information about the program (e.g., type declarations) and are
> allowed to make the same assumptions and impose the same restrictions
> (e.g., no incremental update of a running program).
> --
That's what I mean.
For your first: you need a specialized system to talk in. Just a
specialized language. (Where the language may be native LISP together with
some functions defined beforehand, or may be a completely different
language; no matter.)
For your second: that's what optimization is good for. (And what real
optimization is.) It's nothing more or less than dragging something from
the system you are in into the meta system you use.
And as long as these things can be called a fact, there will always be a
way to optimize something if you can afford the pain to drag it into
the meta system.
As long as the machine, as in the cases you cite, was designed to
use lisp as its machine language to start with....
------------------------------------------------------------
Kirk Rader ki...@triple-i.com
As long as the machine, as in the cases you cite, was designed to
use lisp as its machine language to start with....
Not really. You can turn your 486 into the equivalent of a D-machine
just by running Interlisp (er, Medley) on it. It's available for a
couple of hundred bucks, I believe.
Bill
--
Bill Janssen <jan...@parc.xerox.com> (415) 812-4763 FAX: (415) 812-4777
Xerox Palo Alto Research Center, 3333 Coyote Hill Rd, Palo Alto, CA 94304
URL: ftp://parcftp.parc.xerox.com/pub/ilu/misc/janssen.html
I suggest you read something other than Byte, the documentation in the
Concurrent Clean distribution would be a good start.
From: bar...@think.com (Barry Margolin)
Newsgroups: comp.lang.lisp,comp.lang.perl,comp.lang.tcl,comp.lang.clos
Date: 19 Jul 1994 16:42:26 GMT
Organization: Thinking Machines Corporation, Cambridge MA, USA
In article <Ct5E...@unx.sas.com> sas...@zinfande.unx.sas.com (Mark S. Riggle) writes:
>Come on, Perl is just about the only language that remains
>as readable under ROT_13 as before.
True, but no one in this thread has suggested using APL or TECO as a
scripting language. The closest thing in any of the languages under
discussion is Common Lisp's FORMAT control string (which is almost Turing
complete -- it can call arbitrary functions as subroutines, but I'm not
sure the order of calls is specified).
You are forgetting INTERCAL :)
Happy Lisping
--
Marco Antoniotti - Resistente Umano
-------------------------------------------------------------------------------
Robotics Lab | room: 1220 - tel. #: (212) 998 3370
Courant Institute NYU | e-mail: mar...@cs.nyu.edu
...e` la semplicita` che e` difficile a farsi.
...it is simplicity that is difficult to make.
Bertholdt Brecht
> Lisp has all the visual appeal of oatmeal with fingernail clippings mixed in.
Which Lisp do you mean? One of the first Lisps I remember reading about
was MuLisp. It included an Algol-like front end called MuSimp, and this
was used to write the MuMath package. It didn't look a bit like what
most people would call "Lisp", which is really just a collection of
dialects. These days we could talk about Dylan, which also uses an
Algol-like syntax _as_standard_. In fact, this isn't even an option.
From: umbr...@mcs.drexel.edu (/\/\_i_k_e Brzycki)
Organization: Drexel University
Date: Thu, 21 Jul 94 21:57:58 GMT
I think the LISP programmers have better things to worry about than Perl.
There's a new language out called Concurrent Clean, and from what I read
in the last issue of Byte - it would kick the crap out of LISP. Concurrent
Clean acts like both a functional language and a concurrent programming
language, and it compiles just about as small as C does.
I too read BYTE and find that recently:
a) they hyped way tooooo much Windows and similar stuff. They have no
balance toward UNIX systems or OS/2 and practically never mention PD
or GPL software. (They could at least run a service on LINUX).
b) the article on functional languages is just a nice way to say that
they are keeping track of "nice things"
From what I gathered from the glimpse of Clean from that article
ConcClean looks like a nifty thing, but:
- the "compile to C technology" is something that was out in 1985 with
the first release of (guess what?!?) Kyoto CL.
- ConcClean is claimed to leave out some of the "more functional"
aspects of functional programming.
- Type Inference technology *has* been incorporated in at least one
Common Lisp COMPILER (for those out there who have not used a Lisp
system in the last 15 years :-) )
On top of that, (and I might be wrong) the author does not seem to be
really familiar with Common Lisp at all.
Finally, the "goodness" of a concurrent language is entirely dependent
on the underlying operating system (or lack of one). The Ada story
tells us that. Any ConcClean implementation on a "standard" UN*X or
MS-Windows (not NT) system is going to simulate concurrent processes
(or a similar concept) in the run time environment. Again, this is
what most Common Lisp implementations do in order to manage
threads. What a novel idea.
Perl is in another league -- one in which it has become one of the most
powerful, though not as easy as awk. Hopefully, it becomes a part of the standard
distribution and becomes what BASIC was to all those microcomputers in the
early 80's!!
LONG LIVE PERL!!!
I never use PERL. I guess I do not need it. You never used Common Lisp;
you might need it instead. :)
In article <LGM.94Ju...@polaris.ih.att.com> l...@polaris.ih.att.com (Lawrence G. Mayka) writes:
Yes, CLISP doesn't support all of ANSI Common Lisp yet. The most
obvious omission appears to be the extended LOOP macro (which,
fortunately, is available elsewhere as a free-of-charge add-on). But
as far as image size is concerned, the missing constructs shouldn't
add much bulk percentagewise.
Actually the latest CLISP for Solaris 2.3 includes the extended LOOP
macro. The most important omissions from CLISP now appear to be class
redefinition, method combination other than standard, and logical
pathnames.
Newsgroups: comp.lang.lisp,comp.lang.perl,comp.lang.tcl,comp.lang.clos
From: pe...@nmti.com (Peter da Silva)
Organization: Network/development platform support, NMTI
Date: Wed, 20 Jul 1994 18:59:12 GMT
In article <LGM.94Ju...@polaris.ih.att.com>,
Lawrence G. Mayka <l...@polaris.ih.att.com> wrote:
> I think the original question that started this thread was, "Why did
> people go to the trouble of designing a whole new language for these
> operations when it would have been so much easier to just graft them
> into Lisp?"
They did, and then discovered a 1.5 megabyte executable to search through
a file was a bit of a drag on the system.
You're wedded to an image of individual executable files(!) rather than
individual subroutines. In support of your one executable file you have
megabytes and megabytes of other files as well as resident RAM programs,
such as the kernel. If your executable files need to pass data in a way
more sophisticated than Unix pipes, you lose. Of course, you could always
write things to files, or use shared memory, pipes, sockets, or whatever, to
get the job done. (However, discarding history and installed base, this is far
from natural.)
If you started your Lisp today and kept it running forever your 'searching a
file' program might have been only a few kbytes?
Imagine what Unix would be like if instead of a shell you entered
'commands' to a C Interpreter.
Conjecture: The only reason the Unix kernel exists is to provide a level of
protection for the system resources. The only reason protection is needed
is because the kernel was written in C and C has programmer-visible
pointers? (Sure it is not quite right... but if you've not thought about
it, do.)
-- Ed
Michael> 4) Tcl (at least to my knowledge) isn't picky about how big
Michael> my "int"s are. Several lisps I've seen are picky.
Common Lisp isn't. It supports _arbitrary precision_ integer
arithmetic. Can Tcl do that?
Tcl doesn't support a lot of things. But if you are willing to use a
Tcl extension (which is the point of the language), yes, you can do
arbitrary precision integer arithmetic. It's trivial.
Tcl doesn't know about X windows either, yet a lot of Tcl scripts do
it anyway. It's amazing what you can do with Tcl considering how
little it does.
Don Libes <li...@nist.gov>
You're right. If you want to enforce discretionary access controls you have
to have some minimum protection domain. In UNIX this unit is the file. In a
lisp machine this unit is the machine. It's a fundamental design decision
that invariably ends up with different environments.
Note that I wrote the above paragraph *before* reading the following:
> Conjecture: The only reason the Unix kernel exists is to provide a level of
> protection for the system resources. The only reason protection is needed
> is because the kernel was written in C and C has programmer-visible
> pointers? (Sure it is not quite right... but if you've not thought about
> it, do.)
Oh, I have. (see above... grin)
The reason protection is needed is because there is no requirement that
a single language be used for all application programs. What language that
is is irrelevant.
--
Peter da Silva `-_-'
Network Management Technology Incorporated 'U`
1601 Industrial Blvd. Sugar Land, TX 77478 USA
+1 713 274 5180 "Hast Du heute schon Deinen Wolf umarmt?"
I agree in theory, but the practice seems to be somewhat different,
i.e. what percentage of UNIX apps aren't written in C, a derivative of
C or an interpreter written in C?
: In my opinion, this "screwdriver" argument is silly; Lisp is not a
: screwdriver, but rather more of a "screwdriver representation
: language." In other words, you are comparing screwdrivers to a
: language capable of representing (and therefore building) any kind of
: screwdriver you can imagine.
Yeah, a 300-lb screwdriver that's about 10 feet long. Not real good
for tightening up my sunglasses is it?
: That's 3 programming languages right there! (TCL, SQL, and Perl) I
: can't even fathom doing anything resembling serious development in
: such an environment.
I get paid to do that every day and have little or no trouble doing it
because all of those tools work so incredibly well together. Doing my
job in Lisp would be insane. Today I wrote an awk program in about 1
minute, 45 seconds that parsed a 5 meg file perfectly the first time and
generated a C header file from it. It ate up around 125K of RAM and
completed in just over 10 minutes. The same thing in Lisp, besides
being near impossible, would have been HUGE, taken too long to write,
and would have been slow and a memory hog.
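The kind of throwaway transformation described, scanning a data file and emitting a C header, takes only a few lines in any scripting language. A sketch in Python; the NAME=VALUE input format and all names are invented for illustration:

```python
# Sketch of a one-off data-file-to-C-header transformation, the sort of
# job the poster did in awk. The input format (NAME = VALUE lines) and
# the guard-macro name are invented for this illustration.
import re

def make_header(lines):
    """Turn NAME = VALUE lines into a C header of #define directives."""
    out = ["#ifndef GENERATED_H", "#define GENERATED_H"]
    for line in lines:
        m = re.match(r"\s*(\w+)\s*=\s*(\S+)", line)
        if m:
            out.append("#define %s %s" % (m.group(1).upper(), m.group(2)))
    out.append("#endif")
    return "\n".join(out)

print(make_header(["max_users = 64", "timeout = 30"]))
```

The thread's real dispute is not whether this is writable in Lisp (it is), but how its development time, executable size, and memory footprint compare.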
Of course, I would like to learn Lisp because there are things it is
very good at. I just have a hard time imagining a Lisp machine as
your do-all general purpose computer system. It's probably possible,
but not really all that desirable.
: -frank
: --
: f...@panix.com | Just another bumper sticker on the
: 1 212 559 5534 | Information Superhighway.
: 1 917 992 2248 |
: 1 718 746 7061 |
--
csh
---------------------------------------------------------------------------
shen...@escape.widomaker.com (UUCP) | Amd486/40 Linux system
shen...@pcs.cnu.edu (Internet) | Christopher Newport University
: Common myth. Repeating it over and over doesn't make it true.
: Most Lisp implementations these days include a compiler.
Which produces slow code of course. Same basic result. Now, there
are Lisp compilers that produce pretty fast code, or so I've been
told. But nothing I've been able to get my hands on. All of it
is pretty hefty when the program is compiled.
: --
: Simon.
I've yet to see any figures for a stand alone app written in CL, or
a CL app that has been translated into C and then compiled.
Another issue that rarely gets mentioned is the use of shared libs
for stand alone CL code. Does anyone have figures for a 10,000+ line
CL app written in Allegro CL for Windows?
In WCL 2.2 for Solaris, a simple 5 line program that does some I/O and
calls EVAL takes 20k bytes. All WCL applications share code via the
standard shared library mechanism. If you'd like more information,
there's a paper about WCL in cdr.stanford.edu:/pub/wcl. -wade
In article <1994Jul23....@escape.widomaker.com> shen...@escape.widomaker.com (Shannon Hendrix) writes:
I get paid to do that every day and have little or no trouble doing it
because all of those tools work so incredibly well together. Doing my
job in Lisp would be insane. Today I wrote an awk program in about 1
minute, 45 seconds that parsed a 5 meg file perfectly the first time and
generated a C header file from it. It ate up around 125K of RAM and
completed in just over 10 minutes. The same thing in Lisp, besides
being near impossible, would have been HUGE, taken too long to write,
and would have been slow and a memory hog.
I daresay the Common Lisp program would have:
- Taken a bit longer to write, since the CL community hasn't
standardized as complete a text-manipulation and pattern-matching
library as one might like.
- Taken a bit more space, since the CL program would be compiled
rather than interpreted. I of course do not include the run-time
environment, since you are apparently not including it for awk either.
- Taken either more or less memory, depending on the CL
implementation's generational GC and awk's memory management style.
- Been either faster or slower, depending on the awk program's ratio
of interpreted computation (which CL would greatly improve upon) to
library computation and I/O (which awk may well do faster than CL--I
don't know).
I'm not arguing that CL would necessarily have been a better choice;
I'm simply maintaining that CL would have been a reasonable choice
too.
>Tcl doesn't know about X windows either, yet a lot of Tcl scripts do
>it anyway. It's amazing what you can do with Tcl considering how
>little it does.
Gosh, sounds a lot like UNIX in the early days. Commands had few if
any options, they were small and pretty pathetic when taken individually,
but boy, you could bolt the pieces together and make some pretty
amazing things....
>Don Libes <li...@nist.gov>
--
Steven C. Monroe VOICE:(703)406-2082 PC Three, Inc.
sc...@pcthree.com FAX:(703)406-2078 P.O. Box 1644
Herndon, VA 22070
Could you post the AWK script so we can judge for ourselves whether a
Lisp version would be "near impossible ... etc."
>Lisp has all the visual appeal of oatmeal with fingernail clippings mixed in.
Perl has all the visual appeal of camel dung with raisins on top :)
It all depends on who is doing the reading, as well as how the code
being read was written.
|> Anyway, just why do people think lisp is unreadable (complex
|> arithmetic expressions apart)?
Too many ()'s for my taste; however, one dialect I used had a ']' which
meant "close all open parens".
This thread will soon belong in alt.religion.perl_vs_lisp.
- Pat
-----------------------------------------------------------------------
| Patrick Martin | My opinions | World's Crummiest JAPH: |
| p...@advance.com | Are My Own. | print 'Just another perl hacker'; |
| Computer Engineer --------------| The First Attempt: |
| Advance Geophysical Corporation | print 'Just another perl hacker`; |
-----------------------------------------------------------------------
>Lisp has all the visual appeal of oatmeal with fingernail clippings mixed in.
Ken> Perl has all the visual appeal of camel dung with raisins on top :)
Let's not start a religious war here. One can implement most any
algorithm in most any programming language. It depends on how much
pain one is willing to endure.
Part of the problem that non-lisp programmers have on their first encounter
with the language is that it is a functional language. Perl, and
practically everything else (except for scheme, t, prolog, and a few others)
are procedural languages. The mind set for programming in one is very
different from the other.
How many world-class tennis players also play racquetball? Both use a ball
and a racquet, but the technique is radically different and one does not usually
find individuals that excel in both. Same thing for programming paradigms.
--
Dan Ehrlich - Systems Analyst - PSU Computer Science and Engineering
"Universities should be safe havens where ruthless examination of realities
will not be distorted by the aim to please or inhibited by the risk of
displeasure." - Kingman Brewster
2.6 <EE26E805> fingerprint = 5C 01 7F 57 B0 AB 68 72 04 23 B9 BD 27 AD 85 60
echo '[q]sa[ln0=aln256%Pln256/snlbx]sb3135071790101768542287578439snlbxq'|dc
Nah, more like a fruit salad with trail mix on top. The ants are optional.
Larry
Depends on the class of the app.
Apps intended to run on more than one UNIX system pretty much have to be
written in C, because otherwise you're blowing away most of your market.
Custom apps are written in whatever is appropriate. As are vertical market
apps. The product whose development I support is written largely in Fortran.
But this is a red herring.
My point is that if you use the language to provide protection domains,
then you are forcing everyone to use that language. I can run PL/I or
ADA code on UNIX. There's quite a lot of people in the military and
the government who do. I can run Cobol code on UNIX, and people do that
too... quite a lot of them, though they run custom apps...
UNIX, as a design goal, allows you to program in any language, including
assembler. This means you can *not* use language based protection mechanisms.
It's just plain impossible, without massive performance impacts once you
leave that language (see Burroughs and Algol). The fact that C includes
pointers is irrelevant.
> Too many ()'s for my taste, however one dialect I used had a ] which said
> close all open parens.
The biggest problem is that Lispers have this annoying paper-saving habit
that hides control structure:
( ( (( ) ( ))
(( ( ( ) ( )))) ( ( ) ( )))
(( ) ( ))))
Instead of:
(
(
(( ) ( ))
(( ( ( ) ( )) )
(
( )
( )
)
)
(( ) ( ))
)
)
After enough re-training you can see the structure of the first set of
parens at a glance, but to the novice it looks like gibberish. A true
Lisper will barf at my second example, but it *does* let you see the
nesting at a glance, and tell control constructs from expressions.
You can read the code without being in an editor that does paren matching.
[Code deleted]
>Perl, Bourne, C, and sed would not be as good for
>this particular job.
I don't know awk, so I can't give a definitive answer, but I'll bet
dollars to doughnuts that Perl would have been as good, if not better.
>Of course, I could have done it in Perl but Perl isn't
>everywhere and awk/nawk usually are (at least on all the systems we need
>to use).
Okay, that's a valid excuse for not using Perl, I guess; of course, it's
your fault for not DEMANDING Perl be available on all the systems. :)
--
Mike Gebis m-g...@uiuc.edu Mean people suck.
http://www.cen.uiuc.edu/~mg7932/mike.html
Nope, it's gone. But I have another one at work I can put up here later
tonight. It was also a quickie but it's going to be around for awhile
because it's used to generate NLS catalogs for some current software
project. I'm sure it can be done in Lisp but not as quick, small, and
easy. To give you an idea what it does, it's like:
BEGIN {
<print the header for a C file>
<print catalog header if any>
}
/^SECTION/ {
print "#define", $2, section++ > headerfile
print "$set", section > catalogfile
}
...and so on. There are about 5 keywords to lookup, 3 procedures to
handle a few odd things, and an END section that finishes up writing the
C source header and prints out some stats about what all it did. It
took me 30 minutes to do this one and it was the first awk script I've
ever written. Perfect tool for the job. 'sed' would be too ugly and
hard for others to maintain, perl is just too much for the job, Lisp is
too much for it (too big, we don't run it, inappropriate for the work
we do), Bourne scripts would be slightly harder to write and maintain,
same for a C program, etc. awk was a perfect match for the job.
Now, if you ran a Lisp machine I guess you could make the above a
subroutine and just jump to it whenever. But then, in that case you
wouldn't be running this script either.
OK... at work now. As I said in a previous post, I don't have that
script any more. But here is one I'm currently using in a project.
First one I ever wrote, took 30 minutes because I was learning as I
went.
--- begin ---
#!/bin/nawk -f
#
# nawk script to generate catalog files and C header files
# for error catalogs
#
# 19940718 csh
#
#
# create a define for this
#
function adddef(lineno,newdef,number)
{
if (defines[newdef] != 0)
{
print "***** ERROR *****"
print "Line: ",lineno," ",newdef,"was already defined in line",defines[newdef]
exit
}
defines[newdef] = lineno
print "#define",newdef,number > headerfile
return 0
}
#
# print out something to make for more human readable output
#
function anothersection(section)
{
print "" > headerfile
print "/*" > headerfile
print " *",section > headerfile
print " */" > headerfile
}
BEGIN {
print "Generating C header and nls catalog"
headerfile = "catalog.h"
catalogfile = "errors.english"
}
# set a line number counter
# counting lines with anything
/./ {
tlines++
}
# and lines with nothing
/^$/ {
tlines++
}
#
# generate the main comment for the C header file here
#
/^BEGIN/ {
print "/*" > headerfile
print " * AUTOMATICALLY GENERATED FILE, DO NOT EDIT" > headerfile
print " * " > headerfile
print " * Header for error message catalogs " > headerfile
print " * " > headerfile
print " * Generated: " > headerfile
print " * " > headerfile
print " */ " > headerfile
print "" > headerfile
print "#ifndef CATALOG_H" > headerfile
print "#define CATALOG_H" > headerfile
print "" > headerfile
}
#
# section command found
#
/^SECTION/ {
section++
entry = 0
anothersection($2)
adddef(tlines,$2,section)
print "" > headerfile
print "$set",section > catalogfile
}
#
# entry command found
#
/^ENTRY/ {
entry++
tentry++
adddef(tlines,$2,entry)
getline
tlines++
print entry,$0 > catalogfile
}
#
# file processing finished, do a few little knick-knack things
#
END {
print "Processed",section,"sections",tentry,"entries",tlines,"lines"
print "" > headerfile
print "#endif" > headerfile
}
--- end ---
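If anyone wants to kick the tires on the approach without the whole script, here's a stripped-down, self-contained sketch of the same SECTION/ENTRY idea. The input format and output file names are my guesses from reading the script above, and I'm using plain awk rather than nawk; treat it as an illustration, not the real tool.

```shell
# Minimal sketch of the same idea: SECTION/ENTRY keywords in, #defines out.
# File names and input format are inferred from the script above.
cat > catalog.awk <<'EOF'
/^SECTION/ { section++; entry = 0
             printf "#define %s %d\n", $2, section > "catalog.h" }
/^ENTRY/   { entry++
             printf "#define %s %d\n", $2, entry > "catalog.h"
             # the message text sits on the line after ENTRY, like the real script
             getline; print entry, $0 > "errors.english" }
EOF
printf 'SECTION ERR_IO\nENTRY ERR_OPEN\nCannot open file\n' > errors.src
awk -f catalog.awk errors.src
cat catalog.h
```

With that three-line sample input, catalog.h comes out as "#define ERR_IO 1" and "#define ERR_OPEN 1", and errors.english gets "1 Cannot open file".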
I'm sure this can be done in Lisp. However, why would I care to do that?
It would take forever to load and run. awk is on all of the systems I
need to run this on, Lisp is not. awk is small and fast and this script
is easy to maintain. Perl, Bourne, C, and sed would not be as good for
this particular job.
I guess if you have Lisp loaded all the time then you would disagree.
However, you wouldn't need this script in that case so the point is
moot.
Anyway, the above is a perfect example of something where "Gosh, I need to
do this real quick but it needs to be easy to come back to later... what
should I use." Of course, I could have done it in Perl but Perl isn't
everywhere and awk/nawk usually are (at least on all the systems we need
to use). No reason to write it in C, Bourne would be harder, sed is a
bitch to do something like this, etc.
Nope, don't want to see a one-tool system... I like having the different
utils.
;; The biggest problem is that Lispers have this annoying paper-saving habit
;; that hides control structure:
[nice, well-formed parens]
;; Instead of:
[ugly, C-style formatted parens]
;; After enough re-training you can see the structure of the first set of
;; parens at a glance, but to the novice it looks like gibberish. A true
;; Lisper will barf at my second example, but it *does* let you see the
;; nesting at a glance, and tell control constructs from expressions.
;;
;; You can read the code without being in an editor that does paren matching.
Are you saying you would rather see this:
(
defmethod draw-ports ((view rview) dir)
"in: view, dir {:input, :output}"
(
let (
(fn (
if (equal dir :input)
'n-in
'n-out)
)
)
(
unless (= (funcall fn view) 0)
(
with-focused-view view
(
dotimes (
n (funcall fn view)
)
(draw-port view n dir)
)
)
)
)
)
over this:
(defmethod draw-ports ((view rview) dir)
"in: view, dir {:input, :output}"
(let ((fn (if (equal dir :input) 'n-in 'n-out)))
(unless (= (funcall fn view) 0)
(with-focused-view view
(dotimes (n (funcall fn view))
(draw-port view n dir))))))
The second example "*does* let you see the nesting at a glance, and
tell control constructs from expressions.". The first example is a
horrid example of formatting which would only be written by a C
programmer being forced to use Lisp. Judging by the formatting and the
parens, your example is a function call which gets its first argument from
a cond statement. Simple, easy to read, and completely unambiguous.
The biggest problem is that C-ers have this annoying habit of spreading
their code over as much space as possible, in an attempt to make it look
like more code. ;-)
Cheers, Adam
--
Adam Alpern. HCC Group, University of Colorado at Boulder
a...@cs.colorado.edu
a...@neural.hampshire.edu
> [ Second example. ]
Barf.
I do agree with you to a point: some lispers don't hit the LF key
often enough. For instance:
(if (oddp n) (complex (expression (here))) (different (expression)))
vs.
(if (oddp n)
(complex (expression (here)))
(different (expression)))
The latter shows the structure more clearly.
However I don't believe that putting opening and closing parens on
separate lines increases readability at all -- in fact I think it does
the opposite. If nothing else it wastes valuable screen real estate.
-- Chris.
(c...@lks.csi.com)
You can lead a C++'er to water, but if you don't tie him up you can't
make him drown. (Apologies to the Psychodots.)
I'm sure this can be done in Lisp.
It can, I've appended an UNTESTED version in (R4RS) Scheme at the end of
this message.
However, why would I care to do that?
In most cases you probably wouldn't. However, what do you turn to
when AWK can't deal with the problem (see the use of a trie I
mentioned in another post)?
It would take forever to load and run.
The Scheme system I generally use (SCM) loads the script in some
small fraction of a second on a SPARCstation, which doesn't count as
"forever" in my book. I can't comment on the runtime since I don't
have any data to test it with, though I don't doubt that AWK would
probably be faster since all the mawk stuff is written in Scheme and
running in an interpreter. However, if the Scheme version was too
slow, then a reasonable performance boost can be had by using the
Scheme specific version of mawk (e.g. if READ-LINE is built in, it
uses that rather than the portable version of READ-LINE).
-------
; All the following libraries are available from the Scheme Repository,
; though I don't guarantee they are the same versions as I'm using :-)
(require 'mawk)
(require 'bawk:table)
(require 'format)
;
(define (echo str . rest)
  (apply display str rest)
  (apply newline rest))
; Global vars
(define header-file (open-output-file "catalog.h"))
(define catalog-file (open-output-file "errors.english"))
(define defines (bawk:table))
(define section 0)
(define entry 0)
(define tentry 0)
(define tlines 0)
;
(define (add-def line-no new-def number)
  (if (bawk:table:at defines new-def)
      (error "***** ERROR *****~%Line: ~a ~a was already defined in line ~a"
             line-no new-def (bawk:table:at defines new-def)))
  (bawk:table= new-def line-no defines)
  (format header-file "#define ~a ~a~%" new-def number))
(define (another-section section)
  (format header-file "~%/*~% * ~a~% */~%" section))
(define section-pattern (mawk:pattern:make (mawk:pattern:match "SECTION")))
(define (section-action line line-num next-line state)
  (lambda ()
    (let ((section-name (mawk:$2 (mawk:split line))))
      (set! section (+ section 1))
      (set! entry 0)
      (another-section section-name)
      (add-def tlines section-name section)
      (echo "" header-file)
      (format catalog-file "$set ~a~%" section))))
(define entry-pattern (mawk:pattern:make (mawk:pattern:match "ENTRY")))
(define (entry-action line line-num next-line state)
  (lambda ()
    (let ((entry-name (mawk:$2 (mawk:split line))))
      (set! entry (+ entry 1))
      (set! tentry (+ tentry 1))
      (add-def tlines entry-name entry)
      ; awk's "getline" consumes the message line; next-line plays that role here
      (set! tlines (+ tlines 1))
      (format catalog-file "~a ~a~%" entry next-line))))
(define process-line
  (mawk:pattern-match
   (mawk:pattern/action section-pattern section-action)
   (mawk:pattern/action entry-pattern entry-action)))
(define (prologue)
  (echo "/*" header-file)
  (echo " * AUTOMATICALLY GENERATED FILE, DO NOT EDIT" header-file)
  (echo " * " header-file)
  (echo " * Header for error message catalogs" header-file)
  ; ... etc.
  )
(define (epilogue)
  (format #t "Processed ~a sections ~a entries ~a lines~%" section tentry tlines)
  (echo "" header-file)
  (echo "#endif" header-file))
(prologue)
(mawk:awk process-line)
(epilogue)
Well, I have to write awk-like stuff in CL every now-and-then because
I have to grind through large text files and perform 3D geometrical
computations, which is just too painful in awk.
In my experience:
-- regular expression matching in CL is inefficient;
I don't know whether that's because CL regular expression
matchers have not been hacked on long enough (C
implementations tend to be old and mature),
or because they can't be optimized well by the compiler;
FFI-based implementations are a hassle to use...
-- text I/O is slow in all CL
implementations I have tried (compared to good awks)
-- built-in hash tables tend to be slow and memory hungry
compared to those found in good awks
-- a lot of garbage is generated that could probably
be reclaimed statically or using very simple local
tests ("compile-time garbage collection"); I suspect
that awk memory management strategies have the important
special cases for strings hard-wired in
Also, apart from efficiency, writing that kind of code could be a
whole lot less painful if there was a standard set of powerful and
useful string manipulation functions (like "split") and common
programming paradigms (like "loop-over-fields-of-file").
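For what it's worth, the two conveniences named above cost awk users nothing at all: fields are split implicitly, and split() is built in. A tiny illustration (the sample data is made up, it's just there to show the idiom):

```shell
# awk's free field-splitting: NF and $i come from the -F separator,
# no user code needed.
printf 'a:b:c\nx:y\n' | awk -F: '{ for (i = 1; i <= NF; i++) print NR, i, $i }'
# ...and an explicit split() into an array:
printf 'one,two,three\n' | awk '{ n = split($0, parts, ","); print n, parts[2] }'
```

The first command prints one "record field value" line per field; the second prints "3 two". That's the baseline any standard CL string library would have to match.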
|I'm not arguing that CL would necessarily have been a better choice;
|I'm simply maintaining that CL would have been a reasonable choice
|too.
Well, none of these are profoundly difficult things to implement or
standardize. But the CL community seems to be collectively lost in
MOP-space and ErrorSystem-land...
Thomas.
In article <ala-2807941333300001@el_diente.cs.colorado.edu>,
Adam Alpern <a...@cs.colorado.edu> wrote, as a supposed example of code I'd
like to see, "a horrid example of formatting that could be only written by
a Lisp programmer intent on parodying C".
I've already discussed the code in email, but lest anyone think I like his
example "c-like" code (why c-like? I learned lisp before I learned C), here's
his example reformatted as I'd like it:
> (defmethod draw-ports ((view rview) dir) "in: view, dir {:input, :output}"
>   (let
>     ((fn
>       (if (equal dir :input)
>         'n-in
>         'n-out
>       )
>     ))
>     (unless (= (funcall fn view) 0)
>       (with-focused-view view
>         (dotimes (n (funcall fn view))
>           (draw-port view n dir)
>         )
>       )
>     )
>   )
> )
Or at least:
> (defmethod draw-ports ((view rview) dir) "in: view, dir {:input, :output}"
>   (let
>     ((fn (if (equal dir :input) 'n-in 'n-out) ))
>     (unless (= (funcall fn view) 0)
>       (with-focused-view view
>         (dotimes (n (funcall fn view))
>           (draw-port view n dir) )
>       )
>     )
>   )
> )
I like lisp, but the standard formatting (designed for the days when screen
real-estate and paper were expensive) is pretty opaque for novices. There's a
lot of things people do in C that make it hard to read, but at least in C
people make fun of it... not hold it up as the ideal. Perl's somewhere in
between, around where Forth lives: people talk about using verbose and well
commented code, but in practice use lots of ghastly shortcuts.