So, maybe Lisp was great for the time of its creation, but it is
really outdated now. Some time will pass, and some other language
will appear and kill Lisp. Maybe if we packaged up the best Java-
related MP tools, we would see that Lisp's time to die has come. I
haven't yet studied Java (or Java's MP) seriously, but I was very
impressed to learn about the transactional capabilities of the Spring
framework and the power of this preprocessor: http://people.csail.mit.edu/jrb/jse/jse.pdf.
Also, Java IDEs have great refactoring capabilities that are
unavailable in Lisp.
Finally, I'd like to underline that I don't idealise Java; I believe
it is a crappy language created to make a profit from selling
hardware. I'm only trying to say that there is nothing unique in
Lisp's approach to MP today, and that this approach is not that
perfect.
> I consider the statement "macros are a success due to s-exp syntax"
> the greatest Lisper's illusion.
> After learning Lisp, I have tried to practise MP in different
> languages over the years.
> What is MP? It has several aspects.
>
> [... a lot of words to say "Universal Turing Machine" ...]
>
> So, maybe Lisp was great for the time of its creation, but it is
> really outdated now. Some time will pass, and some other language
> will appear and kill Lisp.
English is even older. Perhaps you should have written this post in Esperanto?
Oh, and nobody forces you to use lisp. You can write your meta-programs in C++.
> Maybe if we packaged up the best Java-related MP tools, we would
> see that Lisp's time to die has come. I haven't yet studied Java
> (or Java's MP) seriously, but I was very impressed to learn about
> the transactional capabilities of the Spring framework and the power
> of this preprocessor: http://people.csail.mit.edu/jrb/jse/jse.pdf.
> Also, Java IDEs have great refactoring capabilities that are
> unavailable in Lisp.
>
> Finally, I'd like to underline that I don't idealise Java; I
> believe it is a crappy language created to make a profit from
> selling hardware. I'm only trying to say that there is nothing
> unique in Lisp's approach to MP today, and that this approach is
> not that perfect.
What word don't you understand in "local optimum"?
--
__Pascal Bourguignon__
Nice ideologeme! Every ideology is a lie. It is often based on
stopping the reasoning process from proceeding in some direction.
Consider the "fail" or "cut" operators in Prolog. Humans have hooks
that let priests put fail operators anywhere in a knowledge base. And
humans even seem to like that: it helps them work out decisions
faster (as in Prolog).
A rather common human tendency is the striving for "better".
Having a local optimum means nothing, in fact.
It might also mean that another (better) local optimum is very close.
But the magic is in the word "optimum" here. The cut operator says:
don't look for "better", you've got an "optimum" and that's enough.
This is not for me. Sorry.
You don't seem to know anything well; you just demonstrate a quite
superficial knowledge of just about everything.
The internets are already full of people like you, no need to increase
their number.
The only real effect of word producers like you is confusing everybody
(including yourself).
To optimize your thinking process, I'd recommend writing some larger
CL applications/libraries.
Yes, CL can even be therapeutic, no joke (I know by experience!).
-JO
> This is not for me. Sorry.
We'll be sad to see you go. Don't let the door hit you in the backside
on the way out!
--
Raffael Cavallaro, Ph.D.
I can see how anyone gets trapped there; I've been there, and sometimes
I'm still doing it - guess I'm recovering now... But with people like Kaz, Ken,
Pascal^2, Tamas, George Neuner and many others it's not easy to spit out
lies right here at comp.lang.lisp - they'll catch you. I mean even Jon
Harrop, being stubborn sometimes, is still a smart guy... Lots to learn
from, and not even lisp, but other things too :)
Yes, of course. What did you expect?
> Some time will pass and some other language will occur which would kill
> lisp.
Programming languages don't die, they become "undead" and linger in
obscurity indefinitely. That has essentially been the situation with Lisp
for several decades. You might want to look for tell-tale signs like lack
of growth next time...
--
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
Here we go!
Lisp has Big Problem!!! More than one!
Etc.
I've been a baseball player for several time periods (days,
months, years, decades). I've noticed that interest in baseball is
dwindling, and baseball is becoming less and less relevant and will
soon become extinct with only baby boomers supporting it, and even
those are either going to die or switch to golf. In order to save our
favorite sport I propose we make drastic changes and adapt more modern
things like:
a. Playing on beach sand wearing swimwear, like in beach
volleyball, a very modern sport. Check TIOBE for growth rates.
b. Replacing bats with hockey sticks. Note that hockey is popular in
many countries around the world, and we should think internationally.
c. Adding a 24-second shot clock like in the NBA, which will make our
sport more lively and fast-paced.
d. The square playing field should be replaced with the more common
rectangular one found in many popular sports: soccer, football,
tennis, etc.
Implementing these changes will make baseball prosper.
very truly yours
Concerned Semi-Ex Baseball Player
Avenue of delusional weirdos Number 23
"Wait!! Go back! It's a trap!" Try this one instead:
http://smuglispweeny.blogspot.com/2009/01/spring-cannot-come-soon-enough-for-bobi.html
Yeah, that's the one. Un-tinkered with. The "Newer/Older" buttons
still work. The "Archive" still works. A *much* nicer color scheme...
[Tricky, Kenny, very tricky...]
-Rob
-----
Rob Warnock <rp...@rpw3.org>
627 26th Avenue <URL:http://rpw3.org/>
San Mateo, CA 94403 (650)572-2607
The funny thing is that the color scheme was the only thing that tipped
me off. Thx for the correction.
>
> [Tricky, Kenny, very tricky...]
>
I'll give you tricky:
http://smuglispweeny.blogspot.com/2009/01/tiltons-law-solve-failure-first.html
:)
kth
It's not your refusing the known optimum that's driving people mad,
it's the way that you do it. See, 99% of Common Lisp folks agree that
there is no way for drastic changes to get into Lisp. There is no
money for a new ANSI standard, and for various reasons, even if there
were one, the changes mustn't be so drastic. Common Lisp is a mature
technology, and it ages beautifully with time. And you are insisting
that drastic changes must be made to Common Lisp. Well, there is a
certain amount of change that something can accept; afterwards it
stops being itself. A car with an electric engine is still a car, but
a car that could fly to the moon is more like a spaceship. So if you
want drastic changes, go for it: build a language of your own. Don't
insist that the Common Lisp community should work or spend on
something that YOU think is beneficial. Common Lisp works well for
the people who use it; the burden of proof is on you to show that you
could do better. There are many Lispy languages with varying success
(Scheme, Qi, NewLisp, Dylan, Clojure, etc.) that refused the dominion
of Common Lisp and went their own way. The time you're wasting
whining would be better spent offering some solution: a working
solution, not a boil-the-ocean strategy.
bobi
In C++, scopes "stick" to the innermost curly-brace-delimited block.
If one were to accept this approach into Lisp, it would look like:
(proga
  (let foo nil)
  (with-open-file (*standard-input* "foo" :direction :input))
  (flet bar (x) (print x))
  (bar foo))
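For comparison, the conventional nested Common Lisp this stands for (a
sketch; I'm assuming proga expands each clause into the usual binding
form) would be:

```lisp
;; The usual nested style: each binding construct opens a new level,
;; and the form ends with four consecutive closing parens.
(let ((foo nil))
  (with-open-file (*standard-input* "foo" :direction :input)
    (flet ((bar (x) (print x)))
      (bar foo))))
```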
I wrote this macro about a month ago, and I found it extremely
intuitive to use and easily readable (you'll get it in a minute if
you can read Lisp), but I haven't advertised it actively yet, and I
haven't even used it heavily myself. So it is still not well tested.
This code looks much more readable than plain Lisp syntax: there are
fewer parens, and no more than two consecutive closing parens can be
found, while the equivalent Lisp code has four consecutive closing
parens.
It'd be interesting to write a "progify" program which converts
nested Lisp forms into this form. To be pragmatic, it should take a
source file and transform it into a less nested, more readable form.
But here we again face a difficulty (this is typical for Lisp): while
Lisp is advertised as an MP language, it is not that easy to write a
general program which reads, transforms and writes CL source to a
file. At least, I'm not sure I can use the standard Lisp reader for
such a program: it needs to read and keep comments and #+'s.
The problem that is solved by the parentheses is to give explicitly
the number of arguments of each operator 'call'.
This allows macros to work on each node of the syntax tree without
having to know the arity of the operators. In any case, you would
always have to do something for operators of variable arity.
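A minimal sketch of that point (count-leaves is a hypothetical helper,
not from the thread): because the parentheses delimit every call, a
tree walker can recurse over arbitrary code without consulting any
arity table:

```lisp
;; Count the atoms in any expression. The walker never needs to know
;; how many arguments +, *, or any user-defined operator takes: each
;; call is delimited by its own parentheses.
(defun count-leaves (form)
  (if (atom form)
      1
      (reduce #'+ (mapcar #'count-leaves form))))

(count-leaves '(+ 1 (* 2 3) (- 4 5 6))) ; => 9
```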
> Mathematica is a bright counter-example.
Mathematica still uses parentheses, AFAIK.
> I would never believe the statement that s-expr syntax is _really_
> convenient. It is relatively easy to write Perl, Pascal or C on
> paper without a text editor. It is extremely hard to write Lisp this
> way. Parens pile up and everything becomes a mess. So Lispers pay
> the real inconvenience of reading/writing every expression in an
> unnatural way for the imaginary, unique manageability of s-exprs. I
> don't mean Lispers can't solve their tasks, and I don't want to
> prove anything about Lisp's fitness for any programming task, but it
> is a fact that the syntax pushes away many potential users of Lisp.
It's not so clear that it is such a fact. You'd have to demonstrate it.
The failure of languages such as Dylan tends to make me think that they
would avoid Lisp even if it had no parentheses.
> And even the very advanced and unique design of Lisp can't help
> this: most people give up before getting to it.
Why? Is it really the parentheses?
> Well, just to be more constructive, I might suggest one idea for
> syntax. The nastiest thing is not s-exprs themselves, but
> unnecessary nesting. If you want to open a file, you can do it
> safely with with-open-file (and create a nesting level). If you want
> a local variable, you again create a nesting level. If you have a
> local function, you have one more nesting level. In most cases you
> do not need three nesting levels: one is enough. We could get rid of
> some unnecessary nesting.
OK, here you have another complaint, about the link that exists
between lexical scope and parentheses.
But again, you don't see that this is a fundamental feature of Lisp
for meta-programming. If the scope of things defined in a given form
goes beyond that form, then you can no longer process the form
independently: you cannot write a macro anymore.
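A sketch of why (with-logging is a hypothetical macro, not from the
thread): a macro can treat its body as an opaque unit precisely
because bindings introduced inside the body cannot leak past it:

```lisp
;; WITH-LOGGING wraps BODY without inspecting it. This only works
;; because a LET or FLET inside BODY cannot extend its scope beyond
;; the macro call.
(defmacro with-logging (&body body)
  `(progn
     (format t "~&entering~%")
     (multiple-value-prog1 (progn ,@body)
       (format t "~&leaving~%"))))

(with-logging
  (let ((x 2))
    (* x x))) ; => 4
```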
> [...] If one would accept this approach into lisp, it would look like
>
> (proga
> (let foo nil)
> (with-open-file (*standard-input* "foo" :direction :input))
> (flet bar (x) (print x))
> (bar foo))
So now, when I want to write a macro DEFINE-MY-THINGY, I cannot
process the "body" where my thingy is used anymore, since the scope of
my thingy may be anything, depending on where the call to
define-my-thingy is to be found in the PROGA subtree...
> I wrote this macro about a month ago, and I found it extremely
> intuitive to use and easily readable (you'll get it in a minute if
> you can read Lisp), but I haven't advertised it actively yet, and I
> haven't even used it heavily myself.
Why?
> So it is still not well tested.
> This code looks much more readable than plain Lisp syntax: there
> are fewer parens, and no more than two consecutive closing parens
> can be found, while the equivalent Lisp code has four consecutive
> closing parens.
>
> It'd be interesting to write a "progify" program which converts
> nested Lisp forms into this form. To be pragmatic, it should take a
> source file and transform it into a less nested, more readable form.
> But here we again face a difficulty (this is typical for Lisp):
> while Lisp is advertised as an MP language, it is not that easy to
> write a general program which reads, transforms and writes CL source
> to a file. At least, I'm not sure I can use the standard Lisp reader
> for such a program: it needs to read and keep comments and #+'s.
Have a look at source-text and reader at
http://darcs.informatimago.com/lisp/common-lisp/source-text.lisp
http://darcs.informatimago.com/lisp/common-lisp/reader.lisp
--
__Pascal Bourguignon__
i'm really only an occasional (amateur) programmer. when i program, i
usually do it in emacs lisp, sometimes i dabble a little in common lisp. so
who am i to say anything. to me, however, the "real" lisp code, the one you
think has too much nesting, feels much more natural than your flattened
alternative. the nesting tells me exactly what the scope of a command is,
and the parens and the indentation are the programmatic/visual aids that
indicate nesting.
when i type an opening paren in lisp, i know i'm starting a new unit, a new
thought, if you will, and i can type the closing paren when i've finished
expressing that thought. of the (admittedly few) other languages i have
some experience with, none gives me this feeling of naturalness, none gives
me the sense that the way a program or function is written textually
corresponds so directly to the way i'm thinking it. *that*, to me, is the
beauty of lisp.
> This code looks much more readable than plain Lisp syntax:
i disagree. a quick look at your code gives me *no* idea where the scope of
the let ends, for example. nor do i immediately see until where
with-open-file is active... the lisp code wears its structure on its
sleeve, so to speak, your code doesn't. in lisp code, it is *much* easier
to see the programmatic units, the parts that express a coherent thought.
even before reading the actual keywords...
--
Joost Kremers joostk...@yahoo.com
Selbst in die Unterwelt dringt durch Spalten Licht
EN:SiS(9)
Check this out:
1 + 2
Indeed. I *hate* NULLs in databases. :-{
Actually, your "Solve the Failure First" [and reminder of & pointer to
your older "Solve the First Problem"] post couldn't have come at a better
time for me. I've been debugging a garbage collector in a toy[1] CL I've
been working on, and I've already gotten bitten several times by *not*
Solving (or at least Analyzing) the First Problem/Failure First, but
I hadn't yet made the meta-heuristic "up-level" step of realizing that
that was what was happening. Thanks for the reminder to "never deal with
the unknown"!! Paraphrased recap: Debug the *first* unexpected thing
that happens[2], *not* the most "interesting" or even catastrophic.
Thanks!
-Rob
[1] "Toy" only in its size. It "purports to be a subset of ANSI
Common Lisp", to paraphrase CLHS "1.7 Language Subsets".
For me it's mainly a research tool for exploring some issues
I've wanted to look at for some time, though it might be
useful on its own for "scripting" or "embedding" or something.
Someday. Maybe.
[2] My current problem has to do with an internal "GC Botch" assertion
after several hundred collections while computing (ACKERMANN 3 7).
But [up until now] I've been ignoring a *very* early indication that
something is wrong: I have a debugging option set to trigger a GC
when 16384 bytes have been allocated, but the *very first* bit of
GC verbosity says "Commencing GC with 32736 bytes in use", which
by Tilton's Law(s) should tell me that something is already fubar.
[Note: This will probably eventually result in an embarrassing
war story, which I promise to share as soon as the smoke clears. ;-} ]
First, you can keep placing my-thingy inside the &body of any Lisp
form, as you've done. Second, you can extend proga to understand
my-thingy, so that you write:
(proga
  (my-thingy (foo bar))
  . body)
The source of proga is here, and it is in the public domain (I hope
I've included all the dependencies you need):
http://paste.lisp.org/display/74339 Extension examples are there too.
Code walkers expand macros, and proga would transform into ordinary CL.
> Have a look at source-text and reader at
Thanks
> when i type an opening paren in lisp, i know i'm starting a new unit, a new
> thought, if you will, and i can type the closing paren when i've finished
> expressing that thought. of the (admittedly few) other languages i have
> some experience with, none gives me this feeling of naturalness, none gives
> me the sense that the way a program or function is written textually
> corresponds so directly to the way i'm thinking it. *that*, to me, is the
> beauty of lisp.
The most common and natural logic is to read text from the start to
the end.
This means: define a function first, then use it. So scoping runs from
the definition to the end of the enclosing construct.
Proga follows this. Lisp does so, too:
if you put forms in a file, they are processed from the first to the
last (sometimes things are more complicated due to "compilation
units", but basically it is true). So you need to be able to manage
this logic anyway.
Not in lisp.
Have a look at: http://groups.google.com/group/comp.lang.lisp/msg/a827235ce7466a92
And I notice also that C programmers who didn't learn Pascal first
write main at the beginning, and the functions it uses later (after
the needed forward declarations).
--
__Pascal Bourguignon__
> And I notice also that C programmers who didn't learn Pascal first
> write main at the beginning, and the functions it uses later (after
> the needed forward declarations).
Really? Don't they instinctively recoil at the stupid redundancy of
forward declarations all by themselves?
I admit that I can't judge for myself: Pascal came before C for me, too...
Cheers,
--
Andrew
Cool. Throw in a Warnock Corollary if you can. :)
kth
> I don't mean Lispers can't solve their tasks, and I don't want to
> prove anything about Lisp's fitness for any programming task, but it
> is a fact that the syntax pushes away many potential users of Lisp.
> And even the very advanced and unique design of Lisp can't help
> this: most people give up before getting to it.
Just because everyone leaps off of the same cliff every year doesn't
make it a great idea.
A lot of the people who are turned away by the syntax of Lisp are just
following the herd mentality. They are used to braces and operator
precedence tables. If they think that's the only way to do things, let
them jump off the cliff with everyone else.
At the risk of sounding elitist, if all it takes is syntax to turn away
a potential Lisp programmer, I think Lisp is better for it.
> Have a look at:
> http://groups.google.com/group/comp.lang.lisp/msg/a827235ce7466a92
Cool! ;-)
> And I notice also that C programmers who didn't learn Pascal first
> write main at the beginning, and the functions it uses later (after
> the needed forward declarations).
I used to do that. After I saw some main-at-the-end code, I realised
that I could avoid having to write all the stupid forward declarations.
Nowadays, I've realised that I naturally think in a bottom-up way
anyway, so I'd rather see the nitty-gritty details before they all get
drawn together for the punchline.
-- [mdw]
Huh! What an insight! I learned Pascal before C, and my main()
function is always at the end :), though I know it could be anywhere!
bobi
The syntax "stopping people" is absurd. The only thing more absurd is
someone claiming that is the 'true reason' lisp is uncommon. Most
people cannot understand the advantages of lisp, and refuse to gamble
some time away from "Dancing with the Stars" to investigate.
--
Lisp : a multi-fetish language
And this is how it is done in Lisp:
3
Interesting. I learned Pascal first, but I always started C programs
with main() ... I guess just because the examples all did.
But I noticed that when Pascal added units/modules and the compilers
no longer cared where top-level functions were defined, I tended to
stay with the C convention of using forward declarations and putting
the main function first. Same with Modula-2 (though I never used M2
much) and with Modula-3.
I do it in Scheme too (don't write much Lisp). I guess I just like to
see the program top down ... I actually write both top down and bottom
up and hope it meets somewhere in the middle.
Oh well.
George
> Most
> people cannot understand the advantages of lisp, and refuse to gamble
> some time away from "Dancing with the Stars" to investigate.
Conversely, if we could guarantee that using lisp prevents one from
ever having to watch "Dancing with the Stars" it would draw a lot of
people to the language.
;^)
--
Raffael Cavallaro, Ph.D.
http://paste.lisp.org/display/74381
This is a comparison of generic slot access vs.
direct slot access.
Generic access was 10 times slower in SBCL, 64 times
slower in Allegro, and 4+ times slower in CCL.
Lispworks was slow either way. CLISP is always slow; I didn't test it.
So, we see the difference is in no way negligible.
But defstruct syntax is unscalable crap: either you use
(:conc-name nil) and then you're unable to have equally-named slots
in different structures, or you repeat the conc-name at every access.
Both cases are much worse than C.
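A minimal illustration of the complaint (pt and sz are hypothetical
structures): with (:conc-name nil) two structures cannot share a slot
name, because the plain accessors would collide; with the default
conc-name the prefix is repeated at every access:

```lisp
;; With (:conc-name nil), the accessor for slot X is just X, so a
;; second structure with its own slot X would try to redefine that
;; same function.
(defstruct (pt (:conc-name nil)) x y)

;; With the default conc-name, the SZ- prefix recurs at every access.
(defstruct sz x y)

(let ((p (make-pt :x 1 :y 2))
      (s (make-sz :x 10 :y 20)))
  (list (+ (x p) (y p))           ; short, but globally unique names
        (+ (sz-x s) (sz-y s))))   ; verbose, but namespaced
```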
So, I've done what was recommended in the FAQ. It is high time to go
to bed, so the code is not finished: I can't yet handle included
slots. Here it is:
http://paste.lisp.org/display/74382
Now it looks more like C:
(defstruct*mc foo bar baz) ; just as defstruct
(let ((x (make-foo :baz 5)))
(with-struct x foo ; bind accessors to C-like names
(setf .x.bar 4)
(print (+ .x.bar .x.baz)))) ; 9
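Presumably with-struct expands into something along the lines of
SYMBOL-MACROLET; the real macro is in the paste above, so this hand
expansion is only a guess at the technique:

```lisp
;; A hand-written equivalent of what WITH-STRUCT might expand to:
;; SYMBOL-MACROLET gives the slots C-like dotted names that are
;; SETF-able because they stand for accessor calls.
(defstruct foo bar baz)

(let ((x (make-foo :baz 5)))
  (symbol-macrolet ((.x.bar (foo-bar x))
                    (.x.baz (foo-baz x)))
    (setf .x.bar 4)
    (+ .x.bar .x.baz))) ; => 9
```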
The leading dot is required to keep the package less messy.
Maybe it is reasonable to add some syntax to name accessors without
the instance name, e.g.
(with-struct x (foo) ; (foo) means we don't want the instance prefix
  (print (+ .bar. .baz.)))
Hm, looks fine... Doing so... Will publish it later.
The code was tested under Lispworks, Allegro and CCL. It seems to
work. Comments (not a flood) are welcome. I can also gather more
metadata on defstruct (such as initforms or type declarations), but
this hasn't been done yet either.
Or, just for convenience obviously
+ 1 2
at the Lispworks REPL.
Cheers
--
Marco
http://www.n-a-n-o.com/lisp/save-object-10.2.lisp
It would allow one to get the slot list for structures defined
somewhere else. Maybe some macro of the kind
(guess-defstruct-info (type (:conc-name <real-conc-name>))) would be
useful for that. It looks like the
conc-name is not stored in the Lisp image (I might be wrong).