*Please don't take this as a flame or a troll*
Lisp seems OK for prototyping and implementing higher level
business rules and for general scripting, but it simply
will never be able to compete with languages such as
C and C++ (and to a lesser extent languages such as VB
and Delphi under Windows).
Have you noticed...
You don't see fast numerical libraries written in Lisp.
You don't see scientific libraries written in Lisp.
You don't see commercial games written in Lisp.
You don't see application suites written in Lisp.
In fact, you don't see any mainstream commercial applications
written in Lisp for the basic reason that any
competitor will simply go out and write their competing
application in C/C++ and make a faster, more responsive
application that makes more efficient use of machine
resources. Why do you think that, despite the massive
amount of hype, no mainstream apps have been written
in Java? Because it is too slow for the real world when
compared to equivalent code written in C or C++.
I would say that C++ has such a bad reputation amongst Lisp
programmers because it takes several years to become
a very good and efficient C++ programmer whilst you can
quickly become an efficient Lisp programmer. The massive
number of *bad* C++ programmers certainly doesn't help
its reputation. An experienced C++ programmer can write
truly elegant, safe, and efficient code that leaves
the equivalent Lisp code in the dust.
Of course, the ease at which you can write good code in
Lisp is a major point in its favour.
But to sum up; we always seem to push hardware to its
limits and because of this languages such as C and C++
will maintain their edge over languages such as Lisp.
I suppose one day we may have computers with more
processing power than we could ever make use of
(e.g. quantum computers) and then it will be commercially
feasible to throw away the low-level languages. But I
imagine by that time Artificial Intelligences will have
done away with programmers altogether.
OK now... go ahead and rip my argument to shreds :)
Cheers,
- PCM
I think that's more because most programmers don't
_like_ Lisp very much. Notice all the crap being made in/for
VB - and that's _interpreted_, at least Lisp can be compiled.
Look at the popularity of perl - a garbage collected interpreted
language. The upcoming popularity of Java, again garbage collected
and interpreted.
If the programmers wanted to use Lisp, and there was an abundance of
Lisp programmers, companies would make programs written in Lisp.
:But to sum up; we always seem to push hardware to its
:limits and because of this languages such as C and C++
:will maintain their edge over languages such as Lisp.
:I suppose one day we may have computers with more
:processing power than we could ever make use of
:(e.g. quantum computers) and then it will be commercially
:feasible to throw away the low-level languages. But I
:imagine by that time Artificial Intelligences will have
:done away with programmers altogether.
Whoa. I suspect quantum computers will come long before AI
makes any significant headway; unless you mean that the
indeterminism in quantum computers might allow something closer
to an actual brain, in which case I dunno.
Again we see the popularity of numerous "inefficient" languages,
Java, Tk/Tcl, and Perl.
:OK now... go ahead and rip my argument to shreds :)
I'm not a regular Lisp programmer, btw; I use mostly C/C++ and Delphi.
Personally, I prefer Scheme to Lisp; it strikes me as a cleaner,
more orthogonal language.
--
Shaun flis...@cs.wisc.edu
http://www.kagi.com/flisakow - Shareware Windows Games, and Unix Freeware.
"In your heart you know its flat."
-Flat Earth Society
Satan - The Evil 1 wrote:
> ... text omitted...
>
> I would say that C++ has such a bad reputation amongst Lisp
> programmers because it takes several years to become
> a very good and efficient C++ programmer whilst you can
> quickly become an efficient Lisp programmer.
In my opinion, no one can quickly become an efficient Lisp
programmer.
(No one can quickly become an efficient programmer in any
language, for that matter...)
And even if the above were true, I don't think that has any
contribution to the negative attitude towards C++.
I also don't think that an efficient Lisp programmer would
have any problems becoming an efficient C++ programmer
(apart from overcoming their distaste for C++, that is).
> ... text omitted...
My 2 centimes worth, I guess.
--
Vassil Nikolov, visitor at LRI:
Laboratoire de Recherche en Informatique, Universite Paris-Sud
Until 9 July 1997: Vassil....@lri.fr
Normally: vnik...@bgearn.acad.bg
> Have you noticed...
> You don't see fast numerical libraries written in Lisp.
> You don't see scientific libraries written in Lisp.
> You don't see commercial games written in Lisp.
> You don't see application suites written in Lisp.
>
> In fact, you don't see any mainstream commercial applications
> written in Lisp for the basic reason that any
> competitor will simply go out and write their competing
> application in C/C++ and make a faster, more responsive
> application that makes more efficient use of machine
> resources.
That might be part of it. I think more relevant facts are:
- There are way, way fewer Lisp programmers than C programmers
around.
- Lisp programmers are usually not very interested in writing
mainstream commercial applications.
> - Vendors haven't really targeted their Lisp systems for the
ability to produce mainstream commercial applications.
- The people producing the mainstream commercial applications
have a long history of writing stuff in C. They probably
haven't even considered considering higher-level languages.
It's a cultural thing.
> Why do you think that, despite the massive
> amount of hype, no mainstream apps have been written
> in Java? Because it is too slow for the real world when
> compared to equivalent code written in C or C++.
Actually, there has been at least one mainstream app written in
Java. Unfortunately.
There's no reason why you couldn't make a Java compiler that
produces code at least as good as you get from a C or C++ compiler.
However, for cultural reasons this hasn't been done yet: everyone
is more interested (rightly) in writing Java compilers that target
the JVM, so that the code they produce is portable.
> I would say that C++ has such a bad reputation amongst Lisp
> programmers because it takes several years to become
> a very good and efficient C++ programmer whilst you can
> quickly become an efficient Lisp programmer.
I don't agree. It's certainly much faster to become able to write
non-trivial programs in Lisp than to become able to write non-trivial
programs in C++, because Lisp is a much more elegant language. But
becoming *very good and efficient* in any language takes time, and
Lisp isn't an exception.
> The massive
> number of *bad* C++ programmers certainly doesn't help
> its reputation. An experienced C++ programmer can write
> truly elegant, safe, and efficient code that leaves
> the equivalent Lisp code in the dust.
If this is true, it's only because of shortcomings in the available
Lisp implementations. In other words, if your argument is valid then
it's not an argument against Lisp but an argument against Allegro
Common Lisp, LispWorks, Macintosh Common Lisp, and all the other
specific compilers out there.
(And I'm not sure it's true at all.)
> But to sum up; we always seem to push hardware to its
> limits and because of this languages such as C and C++
> will maintain their edge over languages such as Lisp.
This might be true if "languages such as Lisp" means "languages
that happen not to have any implementations with properties X,Y,Z".
But I don't think that's really a language issue.
And, of course, even if everything you've said is right, it
doesn't apply to every field of programming. You mentioned
artificial intelligence, for instance. If you're right in predicting
that programmers will be made obsolete by AI, then I bet you the
programs that make them obsolete won't be written in C.
--
Gareth McCaughan Dept. of Pure Mathematics & Mathematical Statistics,
gj...@dpmms.cam.ac.uk Cambridge University, England.
I suggest you take a look at the following reference:
R. Bradley Andrews (Brad_A...@rbacomm.com)
"Speeding Game Development and Debugging (Lisp
finds a niche in the game development arena)",
pp.28-31, Object Magazine, May 97,
SIGS Publications (http://www.sigs.com).
Here are some edited quotes which you may find revealing:
"Nichimen Graphics used LISP to devlop their powerful
N-World development environment... Nintendo used N-World
to produce all the characters in their flagship 64-bit
game, "Mario 64" (including Mario himself)."
"...Naughty Dog Software - makers of "Crash Bandicoot,"
the current mascot for the Sony Playstation - have proven
that it [LISP] can [hook together a modern game that gets
played]. Like their previous release, "Way of the Warrior,"
their current bestseller uses LISP code for significant
parts of the game, including character control and AI
functionality."
"Gavin [Naughty Dog co-founder] emphasizes that some ideas
about LISP are outdated... The speed issue is one where
reality is different than the perception. "It is easy to
construct a simple dialect which is just as efficient as
C, but retains the dynamic and consistent qualities that
make LISP a much more effective expression of one's
programming intentions," adds Gavin."
Copyright acknowledged.
Hope you find this informative.
--
alasdair mcintyre - thermoteknix
<initial>"dot"<surname>"at"<company>"dot"com
: Personally, I prefer Scheme to Lisp; it strikes me as a cleaner,
: more orthogonal language.
Do you mean Scheme is not Lisp? This is weird.
: --
Why not? It *IS* a troll. You can tell it's a troll because anyone with
half-a-brain would have looked in DejaNews and seen that this "brilliant
insight" is posted about once a WEEK, and has been for the last ZILLION
YEARS.
Does this reflect well on C programmers' intelligence or programming
style, if they can't even consider the idea that someone somewhere has
already talked about this topic? No wonder "reusable code" is such a
hot-button!
IF you want real answers to your questions, GET OFF YOUR ASS and go look.
Go ask a Lisp vendor what "real world" apps have been written. For crying
out loud, go read the back scroll of comp.lang.lisp
(<http://www.dejanews.com/> found almost 500 articles when I did a simple
query). Don't bother us Lisp weirdos. We're busy hacking dumb things with
lambda calculus.
IF you don't want answers, just go back to your C++ and write some code.
You have my OFFICIAL permission to do so. If you want to feel that C++ is
better, GO RIGHT AHEAD. I don't CARE.
After all, everyone knows that the BEST language in the ENTIRE WORLD is
COBOL, because there's more COBOL code than ALL OTHER LANGUAGES PUT
TOGETHER.
Have a really nice day,
HEINOUS
ps. If application speed is so damn important, explain damn near any
Microsoft product. "Slow, huge programs"? The most popular software in the
world -- and it's written in C.
> Someone calling himself "Satan - The Evil 1" (but I beg leave to doubt
> whether that's really who it was) wrote:
Well, it's not me. ;-)
> - There are way, way fewer Lisp programmers than C programmers
> around.
This is hard to dispute. I doubt that book publishers are stupid. If
there's money to be made by publishing books that sell, and books
about a particular language do in fact sell, then they'll go for it.
Either there's less interest in Lisp than C/C++, or there's a book
publishing conspiracy.
> - Lisp programmers are usually not very interested in writing
> mainstream commercial applications.
Not too long ago a fellow Lisp programmer tried to convince me that
there are more programmers who know Lisp than SQL. His evidence seemed
to be anecdotal, as his sample of programmers only included his
colleagues and the students that he teaches Lisp to. He later argued
that his claim was true of programmers in the city where he lived,
which I couldn't dispute.
Why were we discussing (by email) the relative popularity of SQL vs
Lisp? We began by discussing the use of CL-HTTP as an alternative to
the various - and popular - web server extensions that use database
features. I suggested that most people wanted a familiar query
language.
> - Vendors haven't really targeted their Lisp systems for the
> ability to produce mainstream commercial applications.
Coupling CL-HTTP with a Lisp package would be very tasty. If the Lisp
aspects could be hidden behind a less "Lisp-ish" syntax, not unlike
mu-simp, it might even look "friendly" to non-Lisp people (the kind of
people who're scared off by the simplicity of the Lisp non-syntax).
Some heavy marketing might also help.
> - The people producing the mainstream commercial applications
> have a long history of writing stuff in C. They probably
> haven't even considered considering higher-level languages.
This is why I believe that heavy marketing could help. It might even
be necessary, if we consider the myths about Lisp that need killing
before we can even begin to discuss Lisp with some people.
> It's a cultural thing.
Hmm. Lisp Machines are not what most people want, and yet this seems
to be what many Lisp people are still lusting after. We know how
bloody OS wars can be, but the culture that comes with an OS can also
create friction. Consider the hostility toward command lines, for
example. Any "negative" feature associated with an OS can do this.
Those of us who don't feel such hostility perhaps understand that
feature and appreciate it. Others may understand it and yet still
dislike it. The rest could be baffled and scared of it, because it's
something that they don't understand and it intimidates them.
To remove the hostility, perhaps we first have to create
understanding, then an appreciation. This could be as true for
programmers as it is for non-programmers. Being techie doesn't mean
that a person will understand and appreciate all things technical.
> Actually, there has been at least one mainstream app written in
> Java. Unfortunately.
There are also many that don't get much attention, because they just
quietly work for the people who wrote them and the people they work
for. The same is true for a lot of languages, including Lisp.
> There's no reason why you couldn't make a Java compiler that
> produces code at least as good as you get from a C or C++ compiler.
> However, for cultural reasons this hasn't been done yet: everyone
> is more interested (rightly) in writing Java compilers that target
> the JVM, so that the code they produce is portable.
There are also historical reasons. We should be comparing Java to
languages with an equally short history, or at least with languages at
a point where their history was equally short.
Unfortunately, there are a few people who appear to prefer to ignore a
language's history, and perhaps even distort it. If Java was 20 years
old, perhaps it would be fair to compare it with C. Perhaps if it were
15 years old, it might be fair to compare it with C++.
There's also the possibility that some people are using Java for
things they might not have considered doing in C++. I know that there
are things I won't hesitate to do in Lisp that I'd never have dreamed
of doing in C or C++. One language opens up my imagination by making
hard things easy, while the other makes simple things hard work.
However, once I've written something in Lisp, and found that it works
and is useful, I can then think about how to write it in C++.
This is not too different from writing a tool in a shell language,
awk, or perl, producing a solution quickly and easily. Later, after
you've used your tool enough to see that it's worth the trouble, you
can re-write it in C++. If you started in C++, you might never start
it, or possibly worse, spend a lot of time on it but never complete
it, making your time and effort a waste.
--
<URL:http://www.wildcard.demon.co.uk/> You can never browse enough
Please note: my email address is gubbish
Will write Lisp code for food
My opinion is that anyone who uses computers to develop
software and wants to have marketable skills needs to
learn the following:
C or C++
familiarity with unix and windowing systems
API libraries (MFC, STL) or those for the operating system you are using
User Interface design fundamentals
Event based programming
Embedded programming fundamentals
Familiarity with threads and multiprocess communication
Debugging
An editor style (Brief/EMACS)
Perl or Rexx
Other languages as needed (LISP could be added here in support of a
particular project or as scripting for EMACS)
Object methodologies (OOA/D, Booch or OMT, patterns)
Familiarity with makefile utilities
Familiarity with version control and configuration management
Some type of prototyping package (visual basic/rexx)
Good coding style that is readable and understood by others
Good communication skills and the ability to get along with *ANYONE*
Complete understanding of the problem at hand
Total humility to admit when you do not understand something completely
and are willing to ask questions.
An understanding of full time employment benefits as well as dealing with
contract engineering (you are either an ass kisser, an ass, or a
mercenary...)
The students/young professionals out there wondering what they need to
learn to be marketable need to become good engineers, knowledgeable in
tools and technologies, but must first be responsible human beings.
Just my rants.
I welcome any comments,
Larry
-----------------------------------------------------------------------
Laurence A. Gordon
Intelligent Medical Imaging, Inc "No generalization is worth a damn,
4360 Northlake Blvd including this one." -- Will Rogers
Palm Beach Gardens, FL 33410
(561) 627-0344 XT 3201 DISCLAIMER
The opinions of the aforenamed individual
799 Sanctuary Cove Dr. might not be the same as his employer's.
North Palm Beach, FL 33410 If you feel like telling his boss about how
(561) 776-6336 much of an insensitive, obnoxious jerk the
author is, don't bother. He already knows.
-----------------------------------------------------------------------
Scheme is a subset of lisp, like C is a subset of C++. There are C
programmers out there who prefer C to C++, citing the simplicity and
standardization present in C. Standards exist in C++, but they aren't
uniformly implemented by compilers such as Microsoft VC++.
--
Tyson Jensen
tje...@mosbych1.com
Actually I believe the success of C can be most directly attributed to
the successful proliferation of UN*X into mainstream computing,
regardless of what treatment it may or may not have had in schools. If
it were not for UN*X, and the need to port it to platforms other than
DEC PDPs (work initially funded by corporate dollars, not academic
ones), it is very likely there would've never been a K&R effort to begin
with. And since C libraries remain the dominant programming interface
to UN*X system call services, and most kernel source is implemented in
C, its presence in the marketplace is at least proportional to that of
UN*X. Nowhere else in computing will you find this kind of mutual
symbiotic success between OS and language. Early commercial acceptance
eventually made C "safe" to consider for the desktop, where it
flourished largely because of its low-level expressive power and small
footprint. Plus the fact that it's just a damn good little language.
I would say academia had more of an impact on the successful evolution
of UN*X (e.g. the work with BSD) than the popularity of C as a general
purpose programming language. Of course, granted, the latter did
benefit from the former. And I would agree that C++ had its roots in
academia, although its early explosive popularity growth can only be
attributed to market hysteria... ::shields up::
Where's data modeling?? Schema design, creation and optimization,
access method tuning, relationship integrity methods, etc. College
grad: "Yes sir, I can screw up your company's data in 7 different
languages..."
// David Hinson
// dhi...@rational.com
: Scheme is a subset of lisp, like C is a subset of C++.
This is false.
: There are C programmers out there who prefer C to C++,
Duh. If they are C programmers, this is comprehensible.
Sorry, I don't care about C and C++.
The point is that Scheme _is_ lisp, you are confused about what lisp is.
Scheme is lisp, so is Common Lisp, so is Lisp 1.5, so is T, so is NIL,
so is Maclisp, so is Interlisp, so is Franz Lisp, so is Mulisp, so is
Multilisp, so is Standard Lisp, so is Portable Standard Lisp, so is
Oaklisp, so is elisp, so is xlisp, so is ISO Lisp, so is Eulisp, so is
*lisp, so is 1100 Lisp.
Sorry for the ones that I forgot, hope you got the idea anyway.
: --
: Tyson Jensen
: tje...@mosbych1.com
> :Have you noticed...
> : You don't see fast numerical libraries written in Lisp.
> : You don't see scientific libraries written in Lisp.
> : You don't see commercial games written in Lisp.
> : You don't see application suites written in Lisp.
> :
None of the above are true, of course. BTW, the third domain is perhaps the
most in fashion of those listed.
__Jason
Therefore, Scheme is a subset of Lisp like C is a subset of C++
like fish are a subset of Amazonian life forms - all are false.
Having said that, I'd like to get back to my favorite C/C++/whatever
vs. Lisp rant.
Assume two programmers, both competent, one in C or C++ or something
similar, one in Lisp.
Both write a program to solve a given task. At the end of a certain
time period, they have initial versions. The C version is efficient,
but it doesn't work. The Lisp version works, but is inefficient.
Now, it's obvious to everybody that the program has to work (well,
everybody not involved with Microsoft), but it isn't obvious that
it has to work efficiently. There is a strong temptation to ship the
Lisp version and report to the customers that the C program is coming
along nicely.
Suppose that both programmers continue. Both will end up with a
working, efficient program. The Lisp programmer is likely to finish
sooner, and the Lisp version will be more maintainable (to good
Lisp programmers) than the C version (to good C programmers).
The Lisp version is likely to work better, and will be less susceptible
to certain classes of bugs (like slow memory leaks).
There's also the cultural thing. Most programming books I've seen
make reference to machine efficiency (sometimes mistakenly). There
are exceptions, notably Kernighan and Plauger's (is that correct?)
superb _Elements_of_Programming_Style_. Most Lisp books mention
efficiency only briefly. Until Norvig's excellent book came out,
I didn't have one that helped me write efficient Common Lisp programs.
Another thing that gave rise to the inefficient Lisp myth is that
Lisp runtimes are usually considerably larger than older C
runtimes, and therefore there was a strict limit to how small a
Lisp executable could be, as opposed to a C executable. This is
changing, as the standard applications are getting bigger and the
old, lean, C runtimes are replaced by the more modern and fatter
C++ runtimes, while Lisp runtimes aren't getting much bigger.
>: There are C programmers out there who prefer C to C++,
>
>Duh. If they are C programmers, this is comprehensible.
>
Matter of opinion. The thing to remember about C++ is that you
don't have to use all the features. Operator overloading can get
you into deep weeds if you use it inappropriately, templates can
bloat your executables, RTTI can be used to make some really ugly
programs, and so on. I prefer C++ to C because there are some
things you can do in C++ that you can't do anywhere near as well
in C, and I am old and scarred enough to use the neat powerful stuff
only when appropriate.
>The point is that Scheme _is_ lisp, you are confused about what lisp is.
>
>Scheme is lisp, so is Common Lisp, so is Lisp 1.5, so is T, so is NIL,
>so is Maclisp, so is Interlisp, so is Franz Lisp, so is Mulisp, so is
>Multilisp, so is Standard Lisp, so is Portable Standard Lisp, so is
>Oaklisp, so is elisp, so is xlisp, so is ISO Lisp, so is Eulisp, so is
>*lisp, so is 1100 Lisp.
>Sorry for the ones that I forgot, hope you got the idea anyway.
>
Um, Pearl Lisp, the 8080 and 6800 Lisps in the old Doctor Dobb's
Journals I threw out last year, that CP/M Lisp I used to use and
can't remember the name of any more.... Oh well, I'd rather use
any of them than COBOL or Pascal (there's ones I wouldn't prefer
over C, but that's another story).
David Thornley
When I said Lisp, I was referring to common lisp, which I thought had
been standardized.
I know that Scheme has been standardized, which seems rather
unusual for a mere "dialect".
Tyson Jensen wrote:
> > Do you mean Scheme is not Lisp? This is weird.
>
[snip]
>
> Another thing that gave rise to the inefficient Lisp myth is that
> Lisp runtimes are usually considerably larger than older C
> runtimes, and therefore there was a strict limit to how small a
> Lisp executable could be, as opposed to a C executable. This is
> changing, as the standard applications are getting bigger and the
> old, lean, C runtimes are replaced by the more modern and fatter
> C++ runtimes, while Lisp runtimes aren't getting much bigger.
I second this. SAP has been the world's most successful standard
application for half a decade, with a footprint of 20GB storage and
192MB - 2GB RAM (it just barely runs with 192MB, average is
256MB-1.5GB).
Even if you can write the same functionality in Lisp with a portion
of this storage footprint, the RAM requirements would not really change.
A few megs of Lisp (running or development) environment is just not
significant at all.
Just my 2 Fille'r.
Robert
> When I said Lisp, I was referring to common lisp, which I thought had
> been standardized.
Common Lisp and Scheme appear to me to be two dialects of the family
of languages known as Lisp. Their semantics overlap, but CL
doesn't entirely enclose those of Scheme, nor do Scheme's semantics
entirely enclose those of CL.
The size of the language is a red herring. Think of the semantics as
sets, and then draw a Venn diagram. The sets intersect.
> I know that Scheme has been standardized, which seems rather
> unusual for a mere "dialect".
Ohh, excellent flamebait. ;) Yep, Fortran is a subset of Basic, C++ is
a subset of Smalltalk, and _everything_ is a subset of Lisp. Hmm.
Well, that last one could be true (in the sense that _anything_ can be
added to Lisp), but you might not get away with it in a newsgroup like
comp.lang.c++. Would _you_ like to try it? Light the blue touch paper
and stand well back...
With Lisp, talking about sets of semantics is probably meaningless,
because we can extend the language in any way we like, so easily. If
you want to be pedantic, and only use the language as defined by the
language spec, then that's different. Does the Scheme spec specify a
standard macro system? Is that even necessary?
Meta-circular evaluators (SICP style) make _anything_ possible. All
you have to do is write your meta-circular evaluator.
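To make that concrete, here is a minimal sketch of what such an evaluator can
look like. It's my own toy version, not the one in SICP, and the names
EVALUATE, APPLY-FN and LOOKUP are simply what I picked. It handles only
numbers, variables, QUOTE, IF, LAMBDA (with a single body form) and
application, with environments represented as alists:

(defun lookup (var env)
  (let ((binding (assoc var env)))
    (if binding (cdr binding) (error "Unbound variable: ~S" var))))

(defun evaluate (form env)
  (cond ((numberp form) form)                        ; numbers are self-evaluating
        ((symbolp form) (lookup form env))           ; variables
        ((eq (first form) 'quote) (second form))     ; (quote x)
        ((eq (first form) 'if)                       ; (if test then else)
         (if (evaluate (second form) env)
             (evaluate (third form) env)
             (evaluate (fourth form) env)))
        ((eq (first form) 'lambda)                   ; (lambda (params) body)
         (list :closure (second form) (third form) env))
        (t (apply-fn (evaluate (first form) env)     ; application
                     (mapcar (lambda (arg) (evaluate arg env))
                             (rest form))))))

(defun apply-fn (fn args)
  (if (and (consp fn) (eq (first fn) :closure))
      (destructuring-bind (params body closure-env) (rest fn)
        (evaluate body (append (mapcar #'cons params args) closure-env)))
      (apply fn args)))                              ; fall back to a host function

;; (evaluate '((lambda (x) (if x 'yes 'no)) 1) (list (cons '+ #'+)))  => YES
;; (evaluate '(+ 1 2) (list (cons '+ #'+)))                           => 3

Everything beyond that (define, let, macros, ...) is a matter of adding
another clause or two, which is exactly the point.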
> >: Personally, I prefer Scheme to Lisp; it strikes me as a cleaner,
> >: more orthogonal language.
> >
> >Do you mean Scheme is not Lisp? This is weird.
>
> When I said Lisp, I was referring to common lisp, which I thought had
> been standardized.
Common Lisp has been standardized; there is ANSI Common Lisp. There
is also a dialect of Lisp called ISLisp that has been standardized by
ISO. However, "Lisp" is a generic term for a family of languages and
it is unlikely that any standard will ever be called "The Standard
Lisp". This was attempted with ISLisp and voted down; I doubt there
is any standards organization out there which will take up the banner
again.
-- Harley
-------------------------------------------------------------------
Harley Davis net: da...@ilog.com
Ilog, Inc. tel: (415) 944-7130
1901 Landings Dr. fax: (415) 390-0946
Mountain View, CA, 94043 url: http://www.ilog.com/
ja...@harlequin.co.uk (Jason Trenouth) writes:
That's interesting. Could you give a couple of examples of commercial
games running compiled (or even interpreted) Lisp code. How about an
application suite? Anything mainstream?
Thanks,
Strictly speaking though, there are enough differences between C and C++ to
make life interesting. :-)
So, realistically, there exists a set S = C intersect C++. S contains the
common language features and syntax. Note that S != C.
Dennis
J.D. Jordan wrote in article <33C4D182...@erols.com>...
>C a subset of C++??? C came first!
>
>Tyson Jensen wrote:
>
>> > Do you mean Scheme is not Lisp? This is weird.
>>
>> Scheme is a subset of lisp, like C is a subset of C++. There are C
>> programmers out there who prefer C to C++, citing the simplicity and
>> standardization present in C. Standards exist in C++, but they aren't
>>
>> uniformly implemented by compilers such as Microsoft VC++.
>>
>> --
>> Tyson Jensen
>> tje...@mosbych1.com
> C a subset of C++??? C came first!
And Scheme existed before Common Lisp. Not that that will stop anyone from
calling Scheme a subset of CL. Perhaps it would be better to say that
C++ is a superset of C, and that CL is a larger language than Scheme?
I vaguely remember a language, I think called something like Comal,
that was a superset of Basic. At least, it looked like at the time, as
every Basic available on micros was pathetically small, and most of
them didn't even have WHILE/WEND.
For what it's worth, every Pascal compiler that I've used has been a
superset of ISO Pascal. I've even witnessed a Pascal programmer
referring to Modula-2 as Pascal. Does that make M-2 a superset of
Pascal, or was he just wrong?
As he was arguing with some C programmers at the time, perhaps he
was excused. I found it very hard to follow the debate, as neither
side was talking about the same languages. So it could've been
C/C++/K&R vs ISO Pascal/Turbo Pascal/M-2, respectively. No wonder I
was confused!
>> C a subset of C++??? C came first!
>And Scheme existed before Common Lisp. Not that will stop anyone from
>calling Scheme a subset of CL. Perhaps it would be better to say that
>C++ is a superset of C, and that CL is a larger language than Scheme?
No, Scheme came after Common LISP. Scheme was a reaction to Common
LISP by a group of people at MIT who wanted a simple, clean LISP.
Common LISP had too much junk in it at the insistence of the Symbolics
people, who wanted to justify their custom hardware. The original
Scheme paper is a joy to read; in a few tightly-written pages it
defines the whole language.
>For what it's worth, every Pascal compiler that I've used has been a
>superset of ISO Pascal. I've even witnessed a Pascal programmer
>referring to Modula-2 as Pascal. Does that make M-2 a superset of
>Pascal, or was he just wrong?
Modula 2 is considered to belong to the Pascal/Modula/Ada family
of languages, but it is not a superset of Pascal. It isn't even a
superset of Modula 1. Pascal, Modula 1, and Modula 2 were designed
by Wirth; Modula 3 was designed at DEC, and Ada was designed through
a competition between four proposals.
John Nagle
> No, Scheme came after Common LISP. Scheme was a reaction to Common
> LISP by a group of people at MIT who wanted a simple, clean LISP.
> Common LISP had too much junk in it at the insistence of the Symbolics
> people, who wanted to justify their custom hardware. The original
> Scheme paper is a joy to read; in a few tightly-written pages it
> defines the whole language.
Hmm. I seem to recall finding references to Scheme: An interpreter for
the extended lambda calculus, Memo 349, MIT AI Laboratory, 1975. Does
Common Lisp predate this memo?
> Modula 2 is considered to belong to the Pascal/Modula/Ada family
> of languages, but it is not a superset of Pascal. It isn't even a
> superset of Modula 1. Pascal, Modula 1, and Modula 2 were designed
> by Wirth; Modula 3 was designed at DEC, and Ada was designed through
> a competition between four proposals.
Exactly. That's why I mentioned it. IMHO a comparison between Scheme
and Common Lisp that declares that one is the subset of the other is
not unlike declaring that Pascal is a subset of Modula-2. I prefer to
say only that they're different languages, with strong, possibly
superficial, similarities.
the real relationship between them is a historical one.
I'm happy to leave the exact nature of that relationship to the
historians, language lawyers, and other pedants. ;) I'm a programmer.
If I can use a language to write code that performs tasks useful to
me, then that's good enough for me.
Language wars are for those who either have too much time to spare, or
have too much to gain by spreading hostile memes. If there's an idea
used in some language not currently exploited by the language that I'm
using, there are a number of options. The one that I like best is to
take that idea and add it (perhaps as an option) to my chosen
language. This is how language speciation may occur, but it's not
always necessary to create a new language in order to accomplish this,
esp if your language is flexible enough to extend itself.
This may be why some people claim that Lisp semantics can include the
semantics of any other language. Symbolic expressions may express
anything we wish, making Lisp semantics (or meta semantics?) a
superset of everything else that we can imagine. In order to be of
practical use, we need only implement those semantics. This is true
for all practical languages anyway, hence the superset argument.
So, perhaps there's no point in asking which Lisp dialect is a superset
of another? The only real question is how much effort it would take to
add the semantics of one dialect (or language) to another, and what
would be the practical value of doing so? I expect the answer(s) to
vary according to the needs and abilities of each programmer.
Lisp is a deception. All lisp compilers and interpreters that
I've seen have been written in C, and run on top of a C program. I've
seen a lot of LISP and PROLOG programmers, especially at the postgraduate
level of computer science, think that lisp functions the same way as
mathematics. They think that a call to a recursive function instantaneously
returns a result. The fact is, these functions are broken down into
machine-level instructions and executed the same way as a for-next loop.
AI is a bunch of garbage as well. Do you know what the main goal
of AI is? It is to develop a program where a person cannot
distinguish the program from a human being. What does that have to do
with intelligence? It's just an emulator.
The bottom line is, all computer programs, including AI
programs, are just fast pocket calculators.
Peaceman
> Now, here is a man who has seen a real lot of Lisp compilers. :-)
There are a few _books_ that could refute his belief! A few years ago,
I wrote one myself. Of course, it was very primitive, but it did
compile a small subset of Scheme into C. It didn't take long to write,
either. After all, I wrote it in Scheme...
I sense a round of URLs about to fly, but they could be pre-empted by
a reference to the Lisp FAQ. Do any C++ programmers bother to read it?
I sometimes wonder, esp when someone tries to distort facts that are
available for anyone with Internet access to check for themselves.
There's that nice little "rtfm" ftp site at MIT, where they keep the
FAQs. Anyone who thinks that all Lisp compilers and interpreters are
written in C should take a look. Still, perhaps Sajid Ahmed hasn't
seen many Lisp compilers and interpreters? That might explain a spot
of ignorance. Again, the Lisp FAQ can help fix that.
It certainly beats the old "open mouth and insert foot" strategy that
all anti-Lisp attacks employ. Oops, perhaps I should qualify that
by adding that this applies to all the attacks that _I've_ seen. There
may be some out there that are better informed. If such things exist,
I've not seen one during the 5 years that I've been reading UseNet.
Not one.
> The bottom line is, all computer programs, including AI
> programs, are just fast pocket calculators.
Nice troll. Nothing original, alas.
You might like to read the FAQ for comp.lang.lisp, and check every one
of your "claims" against the reality. Section 4 should be esp
illuminating!
<URL:http://www.cs.cmu.edu/afs/cs.cmu.edu/project/ai-repository/ai/html/faqs/lang/lisp/top.html>
Alternately, try the "rtfm" site at MIT:
<URL:ftp://rtfm.mit.edu/pub/usenet-by-hierarchy/comp/lang/lisp/>
RTFM is rather appropriate, in your case. Read the FAQ and weep.
Then try something rather more constructive, please.
From: Sajid Ahmed the Peaceman <peac...@capital.net>
Newsgroups: comp.lang.lisp,comp.programming,comp.lang.c++
Date: Mon, 14 Jul 1997 14:04:48 -0400
Satan - The Evil 1 wrote:
>
> OK now... go ahead and rip my argument to shreds :)
>
> Cheers,
> - PCM
Lisp is a deception. All lisp compilers and interpreters that
I've seen have been written in C, and run on top of a C program. I've
seen a lot of LISP and PROLOG programmers, especially at the postgraduate
level of computer science, think that lisp functions the same way as
mathematics. They think that a call to a recursive function instantaneously
returns a result. The fact is, these functions are broken down into
machine-level instructions and executed the same way as a for-next loop.
Coming from a person who might seem to think that binary tree
traversal can be executed in "the same way as a for next loop", I
cannot help but dismiss the whole argument. Why have C (or Pascal or
Algol or Lisp) in the first place? Let's stick to FORTRAN. :)
Cheers
--
Marco Antoniotti
==============================================================================
California Path Program - UC Berkeley
Richmond Field Station
tel. +1 - 510 - 231 9472
> Lisp is a deception. All lisp compilers and interpreters that
> I've seen have been written in C, and run on top of a C program.
How sad.
Scheme 48 is written entirely in Scheme. (Its virtual machine
is compiled to C, but that's just for portability; it's *written*
in Scheme.)
CMU Common Lisp is almost entirely written in Lisp. Some bits are
written in C for bootstrapping purposes.
I have a vague feeling someone told me Harlequin's Lisp compiler
is written entirely in Lisp.
> I've seen a lot of LISP and PROLOG programmers, especially at the
> postgraduate level of computer science, think that lisp functions the
> same way as mathematics. They think that a call to a recursive function
> instantaneously returns a result. The fact is, these functions are broken
> down into machine-level instructions and executed the same way as a
> for-next loop.
Well, knock me down with a feather. Who would have thought it?
You'll be telling me that my hardware doesn't have a single run-emacs
operation next.
If you've seen a lot of postgraduate computer scientists who think
that recursive functions return instantaneously, then the only
conclusion I can draw is that you've been hanging around a university
where either they don't know how to teach computer science or they
get really really weak students. Too bad.
> AI is a bunch of garbage as well. Do you know what the main goal
> of AI is? It is to develop a program where a person cannot
> distinguish the program from a human being. What does that have to do
> with intelligence? It's just an emulator.
Thank you for sharing this profound insight with us. Are you an
emulator or a person, by the way? How are we supposed to tell?
> The bottom line is, all computer programs, including AI
> programs, are just fast pocket calculators.
So much the better for pocket calculators, say I.
--
Gareth McCaughan Dept. of Pure Mathematics & Mathematical Statistics,
gj...@dpmms.cam.ac.uk Cambridge University, England.
> All lisp compilers and interpreters that I've seen have been written
> in C, and run on top of a C program.
Now, here is a man who has seen a real lot of Lisp compilers. :-)
(rest of bait ignored.)
--
Hrvoje Niksic <hni...@srce.hr> | Student at FER Zagreb, Croatia
--------------------------------+--------------------------------
Then... his face does a complete change of expression. It goes from
a "Vengeance is mine" expression, to a "What the fuck" blank look.
> Lisp is a deception. All lisp compilers and interpreters that
>I've seen have been written in C, and run on top of a C program. I've
>seen a lot of LISP and PROLOG programmers, especially at the postgraduate
>level of computer science, think that lisp functions the same way as
>mathematics. They think that a call to a recursive function instantaneously
>returns a result. The fact is, these functions are broken down into
>machine-level instructions and executed the same way as a for-next loop.
<sarcasm>Gosh, who'd have thought it?</sarcasm> Don't be so utterly
stupid! Any decent LISP programmer, in fact any decent programmer, knows
that. Efficiency is part of programming. But it is not the be-all and
end-all. Yes, the lower-level parts of many LISP systems might well be
written in C/Assembly etc.
The real question is: What is the most efficient/elegant/best way to
express a particular program? It might, for some problems, be in C or C++.
For some, it might be LISP or Prolog. If I can write a program which
efficiently does a job in 60 lines of well-documented LISP that would take
300 lines or more of C, and they both run at similar speeds, it would
seem that LISP is the better choice.
> AI is a bunch of garbage as well. Do you know what the main goal
>of AI is? It is to develop a program where a person cannot
>distinguish the program from a human being. What does that have to do
>with intelligence? It's just an emulator.
Totally untrue. It would appear that you don't understand what you are
talking about.
> The bottom line is, all computer programs, including AI
>programs, are just fast pocket calculators.
This is not accurate either. All digital computers can be said to be
equivalent to a Universal Turing Machine. But not a calculator. A
calculator hasn't got several of the basic properties it needs.
--
Mark
Certified Waifboy And when they come to ethnically cleanse me
Will you speak out? Will you defend me?
http://www.st.nepean.uws.edu.au/~mgreenaw - Ich bin ein Auslander, PWEI
[...]
> Lisp is a deception. All lisp compilers and interpreters that
>I've seen have been written in C, and run on top of a C program.
Well then you are not looking hard enough. Lisp predates C by many years.
There are many variants of lisp that are written in lisp.
In addition you could say that C is a deception as all C compilers and
interpreters are written in machine code.
> I've seen a lot of LISP and PROLOG programmers, especially at the postgraduate
>level of computer science, think that lisp functions the same way as
>mathematics.
That's what makes lisp so easy to use.
> They think that a call to a recursive function instantaneously
>returns a result.
No good programmer would believe that.
> The fact is, these functions are broken down into machine-level
>instructions and executed the same way as a for-next loop.
You seem to be implying that every recursive call gets recompiled. This
is simply not true. In addition, there have been times where the recursive
version has been faster than the iterative form (I don't know why this is,
but timing says it's so.)
--
Please excuse my spelling as I suffer from agraphia see the url in my header.
Never trust a country with more peaple then sheep. Buy easter bilbies.
Save the ABC Is $0.08 per day too much to pay? ex-net.scum and proud
I'm sorry but I just don't consider 'because its yucky' a convincing argument
well, correct me if i'm wrong, but i have read in many places that *all
recursive functions can be expressed as iterations* yea, it's hard to
believe, but it's been proven, but i don't really know how to do such a
work of magic.
note that this is done automatically in some really good compilers!
(A copy of this message has also been posted to the following newsgroups:
comp.lang.lisp, comp.programming,comp.lang.c++)
You are right. I was kinda flame baiting :). You can always express a
recursive function with a loop. However, some functions (e.g. binary
tree traversal) are "inherently" recursive. I.e. to remove the
recursive calls you have to explicitly manage a stack. Old FORTRAN
is a good example of this burden posed on the programmer.
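For the record, here is a rough sketch of what "explicitly manage a stack"
means for a preorder traversal. It's my own illustration in Common Lisp,
assuming nodes are plain lists of the form (KEY LEFT RIGHT), with NIL for a
missing child; PREORDER-ITERATIVE is just the name I chose:

(defun preorder-iterative (root visit)
  ;; The stack is managed by hand: pop the next node to process,
  ;; then push its children.  Pushing RIGHT before LEFT makes LEFT
  ;; come out first, giving root, left subtree, right subtree.
  (let ((stack (list root)))
    (loop while stack
          do (let ((node (pop stack)))
               (when node
                 (funcall visit (first node))
                 (push (third node) stack)
                 (push (second node) stack))))))

;; (preorder-iterative '(1 (2 nil nil) (3 nil nil)) #'print) visits 1, 2, 3.

The recursion hasn't gone away; its bookkeeping has just moved from the call
stack into a data structure the programmer has to maintain.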
From: ? the platypus {aka David Formosa} <dfor...@st.nepean.uws.edu.au>
Newsgroups: comp.lang.lisp,comp.programming,comp.lang.c++
Date: 15 Jul 1997 03:40:39 GMT
In <33CA6A...@capital.net> Sajid Ahmed the Peaceman <peac...@capital.net> writes:
[...]
> Lisp is a deception. All lisp compilers and interpreters that
>I've seen have been written in C, and run on top of a C program.
Well then you are not looking hard enough. Lisp predates C by many years.
There are many variants of lisp that are written in lisp.
In addition you could say that C is a deception as all C compilers and
interpreters are written in machine code.
You can go even further than that. C compilers are written mostly in
C. However, GNAT (the Ada 95 front end to the gcc backend) is written
partly in Ada 95 and a large portion of the CMU Common Lisp compiler
is written in Common Lisp. This notion of "compiler bootstrapping" is
as old as computer programming itself.
It is true, and often not that hard. But i don't see why compilers want
to make recursion iterative, since that (if the recursion is justified)
means making the code bigger -> bigger/slower executable. Or am i wrong?
/Joe
Actually, I have been given to understand quite the opposite. Apparently
whenever (during runtime) a function calls itself, it has to make another
complete copy of itself to run. You can see how this would tend to take
up much more in the way of resources than an iterative function.
I'm not sure what the exact effect on speed would be, but I imagine it
would be slower. In my limited experience, iterative algorithms have not
been that much larger than their recursive counterparts, just more
difficult to divine.
I have written binary tree traversal algorithms which don't use a stack.
I would be interested to know if this were true for some other algorithms,
however.
Here's an iterative preorder binary tree traversal.
(Preorder means process root, then process the subtrees.)
void BT::Preorder2Screen()
{
  BTnode *temp = this->root;    // used to find a node with an untraversed right subtree
  BTnode *current = this->root; // start at the root
  while (temp != NULL)          // need to do this for each node
  {
    cout << current->key << " ";
    if (current->Lchild != NULL)
    {
      current = current->Lchild;
    }
    else if (current->Rchild != NULL)
    {
      current = current->Rchild;
    }
    else // go backwards
    {
      temp = current->parent;
      // test temp for NULL *before* dereferencing it
      while ((temp != NULL) &&
             ((temp->Rchild == NULL) || (temp->Rchild == current)))
      {
        current = temp;
        temp = temp->parent;
      }
      if (temp != NULL)
        current = temp->Rchild;
    }
  } // endwhile
  cout << "\n\n";
} // end Preorder2Screen()
I wonder what you mean by "a complete copy of itself". it appears that you
think a copy of the actual function's _code_ is copied, and this is of
course not true. however, a new stack frame is often created upon a
function call, with space for various registers, local variables, etc, etc.
this does consume resources. languages that are made for or encourage
recursive function calls often offer "tail call merging" to handle the case
where a function returns simply what the function it is about to call would
return. in such a case, the calling function's stack frame is undone
before the call (if necessary), and the called function returns to the
caller's caller, saving both time and memory. (Scheme is "properly tail
recursive" because it requires an implementation to do tail calls as jumps,
not calls. most Common Lisp implementations offer tail call merging.)
I previously made a serious mistake in believing that Lisp compilers were
so much faster than C compilers when the real difference was that the Lisp
compilers did tail call merging and the C compiler did not. on a SPARC,
this translates to a very heavy penalty because of the way register windows
are saved on the stack, so the Lisp compilers won big, disproportionately.
(I haven't been able to compute the actual cost of a call so I could
discount it and get a better comparison. for now, the GNU C compiler makes
recursion extremely costly on the SPARC.)
programmers who write recursive functions learn to use tail recursion soon
after they discover that each function call can take up hundreds of bytes
of memory. e.g., the former of these two functions will use memory (stack
space) proportional to n, while the latter will use constant space if the
compiler merges tail calls.
(defun factorial (n)
  (if (plusp n)
      (* n (factorial (1- n)))
      1))

(defun factorial (n)
  ;; LABELS (not FLET) so that TAIL-FACTORIAL can call itself
  (labels ((tail-factorial (n accumulator)
             (if (plusp n)
                 (tail-factorial (1- n) (* n accumulator))
                 accumulator)))
    (tail-factorial n 1)))
let's hope this puts some needless worries about recursion to rest.
#\Erik
--
Microsoft Pencil 4.0 -- the only virtual pencil whose tip breaks.
> I have written binary tree traversal algorithms which don't use a stack.
> I would be interested to know if this were true for some other
> algorithms,
> however.
> Here's an iterative preorder binary tree traversal.
[snip]
> temp = current->parent;
^ That's the catch.
Effectively you're using a stack. Not the function call stack, but a
stack embedded into your tree. It's fine as long as you need the parent
pointers anyway, but if you only use them for traversal they impose a
possibly significant space overhead and they take time to update.
Michael
--
Michael Schuerig The usual excuse for our most unspeakable
mailto:uzs...@uni-bonn.de public acts is that they are necessary.
http://www.uni-bonn.de/~uzs90z/ -Judith N. Shklar
Michael Schuerig <uzs...@uni-bonn.de> wrote in article
<1997071614...@rhrz-isdn3-p5.rhrz.uni-bonn.de>...
> W. Daniel Axline <wax...@cse.unl.edu> wrote:
>
> > I have written binary tree traversal algorithms which don't use a stack.
> > I would be interested to know if this were true for some other
> > algorithms,
There are plenty of recursive functions for which most of us don't use a
stack. Iterative factorial() or fibonacci() are examples.
int fact(unsigned int n)
{
  int result = 1;
  while(n) result *= n--;
  return result;
}
> > however.
> > Here's an iterative preorder binary tree traversal.
> [snip]
> > temp = current->parent;
> ^ That's the catch.
>
> Effectively you're using a stack. Not the function call stack, but a
> stack embedded into your tree. It's fine as long as you need the parent
> pointers anyway, but if you only use them for traversal they impose a
> possibly significant space overhead and they take time to update.
It's possible to do n-ary trees without child or parent pointers (a heap),
but some operations (such as swapping child trees or insertions at arbitrary
points) become much more expensive. However, depth-first or breadth-first
traversals of a heap are easy to do without recursion.
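A tiny sketch of that point (mine, in Common Lisp, using 0-based indices;
with 1-based indexing the children of node i sit at 2i and 2i+1 instead).
The vector itself encodes the tree, so a breadth-first traversal needs
neither recursion nor an explicit stack:

(defun heap-level-order (heap visit)
  ;; breadth-first order is simply left-to-right array order
  (dotimes (i (length heap))
    (funcall visit (aref heap i))))

(defun heap-children (heap i)
  ;; indices of the children of node I, if they exist
  (remove-if (lambda (j) (>= j (length heap)))
             (list (+ (* 2 i) 1) (+ (* 2 i) 2))))

;; (heap-level-order #(1 2 3 4 5) #'print) visits 1 2 3 4 5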
It's hard to argue that fact() above has significant time or space penalties
compared to a recursive version.
The obvious iterative fib() function requires a linear number of additions
and no stack. The obvious recursive version requires an exponential number
of additions.
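For concreteness, a quick sketch of the two versions being compared (my own
illustration, in Common Lisp since this is crossposted to comp.lang.lisp;
FIB-ITERATIVE and FIB-RECURSIVE are just my names). The iterative one does
about N additions in constant space; the naive recursive one recomputes the
same subproblems over and over:

(defun fib-iterative (n)
  ;; roughly N additions, constant space
  (let ((a 0) (b 1))
    (dotimes (i n a)
      (psetf a b
             b (+ a b)))))

(defun fib-recursive (n)
  ;; exponentially many additions: fib(n-1) and fib(n-2) each redo the work
  (if (< n 2)
      n
      (+ (fib-recursive (- n 1))
         (fib-recursive (- n 2)))))

;; (fib-iterative 10) => 55    (fib-recursive 10) => 55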
a friend of mine sent me some Lisp code that looked really odd some time
ago, and told me it was from a game called Abuse. I don't know much about
it, but the Lisp code appeared to be definitions of characters and such,
and Lisp is the extension language of this game, running interpreted or
compiled after loading source code. no compile-file exists. he described
Abuse as a "sidescroller" game. I don't know what that is; some sort of
shoot-em-up game. see http://games.3dreview.com/abuse/files/lispedit.txt
or http://games.3dreview.com/abuse/minigames/minigames.html.
the new Nintendo 64-bit games are produced with Common Lisp, but I don't
know how much of Lisp is running in the actual game. Nichimen Graphics and
their N-world is the place to go for Lisp game action. check out
www.franz.com -- they had a lot of pointers there some time ago. (I can't
check right now, 'cuz Netscape expresses their desire to have me to upgrade
to their next version in their own peculiar way.)
> Actually, I have been given to understand quite the opposite. Apparently
> whenever (during runtime) a function calls itself, it has to make another
> complete copy of itself to run.
You mean a complete copy of the function's *code*? No, definitely not.
Parameters and local variables are allocated on the stack and control
flow continues at the beginning of the function.
If the recursive call is the last one in the function ("tail-recursion")
this can even be done without growing the stack as the new parameters
and variables replace the old ones. You get a function that looks
recursive, but effectively is executed iteratively. Lisp compilers do
this optimization, I'm not sure about others.
Michael
--
Michael Schuerig P'rhaps he's hungry. Six volts make him smile.
mailto:uzs...@uni-bonn.de And twelve volts would probably kill.
http://www.uni-bonn.de/~uzs90z/ -Jethro Tull, "Batteries Not Included"
From: "Bill Wade" <bill...@stoner.com>
Newsgroups: comp.lang.lisp,comp.programming,comp.lang.c++
Date: 16 Jul 1997 15:12:58 GMT
> Michael Schuerig <uzs...@uni-bonn.de> wrote in article
> <1997071614...@rhrz-isdn3-p5.rhrz.uni-bonn.de>...
> > W. Daniel Axline <wax...@cse.unl.edu> wrote:
> >
> > > I have written binary tree traversal algorithms which don't use a stack.
> > > I would be interested to know if this were true for some other
> > > algorithms,
> There are plenty of recursive functions for which most of us don't use a
> stack. Iterative factorial() or fibonacci() are examples.
> int fact(unsigned int n)
> {
>   int result = 1;
>   while(n) result *= n--;
>   return result;
> }
You are not getting it. There are "inherently recursive" function
definitions for which an equivalent iterative algorithm *requires* the
use of a stack. No matter how disguised.
> > > however.
> > > Here's an iterative preorder binary tree traversal.
> > [snip]
> > > temp = current->parent;
> > ^ That's the catch.
> >
> > Effectively you're using a stack. Not the function call stack, but a
> > stack embedded into your tree. It's fine as long as you need the parent
> > pointers anyway, but if you only use them for traversal they impose a
> > possibly significant space overhead and they take time to update.
> It's possible to do n-ary trees without child or parent pointers (a heap),
> but some operations (such as swapping child trees or insertions at arbitrary
> points) become much more expensive. However, depth-first or breadth-first
> traversals of a heap are easy to do without recursion.
That, once again, is because - I surmise - you are assuming an array
implementation of the heap where you know exactly where "parent" and
"children" are (typically at (i/2), (2*i) and (2*i + 1) for a node at
'i' in a binary heap). You are always allocating memory in some way
or the other which eventually provides you with an "encoded stack".
You can't do that with purely malloc'ed data structures.
> It's hard to argue that fact() above has significant time or space penalties
> compared to a recursive version.
> The obvious iterative fib() function requires a linear number of additions
> and no stack. The obvious recursive version requires an exponential number
> of additions.
These are examples of non-inherently recursive functions and do not
change a bit the terms of the debate. If memory does not fail me,
there is even a closed form solution for the Fibonacci numbers that
can - hear hear - be computed in O(1) time.
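The closed form in question is presumably Binet's formula,
fib(n) = (phi^n - psi^n)/sqrt(5) with phi = (1 + sqrt 5)/2 and
psi = (1 - sqrt 5)/2. A rough sketch (mine); note that a floating-point
version like this is "O(1)" only in the sense of a fixed number of float
operations, and rounding makes it exact only for fairly small n:

(defun fib-closed-form (n)
  (let* ((sqrt5 (sqrt 5.0d0))
         (phi (/ (+ 1 sqrt5) 2))
         (psi (/ (- 1 sqrt5) 2)))
    (round (/ (- (expt phi n) (expt psi n)) sqrt5))))

;; (fib-closed-form 10) => 55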
Again, try to do the binary tree search without extra memory allocated
in the basic data structures.
As you might have noticed, the title of this thread is now totally
bogus :)
From: "W. Daniel Axline" <wax...@cse.unl.edu>
Newsgroups: comp.lang.lisp,comp.programming,comp.lang.c++
Date: Wed, 16 Jul 1997 02:11:31 -0500
Marco Antoniotti wrote:
>
> From: c.ha...@mail.utexas.edu (Thomas Hallock)
> Date: Tue, 15 Jul 1997 10:23:04 +0000
>
> > Coming from a person who might seem to think that binary tree
> > traversal can be executed in "the same way as a for next loop", I
> > cannot help to dismiss the whole argument. Why have C (or Pascal or
> > Algol or Lisp) in the first place? Let's stick to FORTRAN. :)
>
> well, correct me if i'm wrong, but i have read in many places that *all
> recursive functions can be expressed as iterations* yea, it's hard to
> believe, but it's been proven, but i don't really know how to do such a
> work of magic.
> note that this is done automatically in some really good compilers!
>
> You are right. I was kinda flame baiting :). You can always express a
> recursive function with a loop. However, some functions (e.g. binary
> tree traversal) are "inherently" recursive. I.e. to remove the
> recursive calls you have to explicitly manage a stack. Old FORTRAN
> is a good example of this burden posed on the programmer.
I have written binary tree traversal algorithms which don't use a stack.
I would be interested to know if this were true for some other
algorithms,
however.
Here's an iterative preorder binary tree traversal.
(Preorder means process root, then process the subtrees.)
void BT::Preorder2Screen()
{
    BTnode *temp = this->root;    // used to find a node with an untraversed right subtree
    BTnode *current = this->root; // start at the root
    while(temp != NULL)           // need to do this for each node
    {
        cout << current->key << " ";
        if(current->Lchild)
        {
            current = current->Lchild;
        }
        else
        {
            if(current->Rchild != NULL)
            {
                current = current->Rchild;
            }
            else // go backwards
            {
                temp = current->parent;
                while((temp != NULL) &&   // test for NULL before dereferencing temp
                      ((temp->Rchild == NULL) || (temp->Rchild == current)))
                {
                    current = temp;
                    temp = temp->parent;
                }
                if(temp != NULL)
                    current = temp->Rchild;
            }
        }
    } // endwhile
    cout << "\n\n";
} // end Preorder2Screen()
You are embedding a 'parent' field in your tree nodes. This is equivalent
to using up the memory needed for a call stack or for a node stack.
If you do not need the parent pointer for other purposes, this is a waste
of memory which makes the C++ program "inefficient" :)
The recursive solution is much more compact and readable.
void
BTNode::Preorder()
{
    cout << key << ' ';     // visit this node first,
    if (Lchild != NULL)
        Lchild->Preorder(); // then its left subtree,
    if (Rchild != NULL)
        Rchild->Preorder(); // then its right subtree
}
// end Preorder()
Of course, I could have written a standard function taking a BTNode
argument. The code would have been even more compact, but I decided
to stick to your style.
Not bad for a Lisp programmer, is it? Or maybe it is because I am
a Lisp programmer that I see the beauty and the ease with which you
can write recursive functions like these :)
BTW, I suggest looking at
http://www.delorie.com/gnu/docs/GNU/standards_toc.html
- reading inconsistently formatted C/C++ code is worse than reading lots
of parentheses.
Might one gain some advantage by first implementing some
reasonably complex functionality using a clear, if less
efficient, recursive style? Then, if necessary, one could
implement the same functionality using an iterative style.
This presupposes that it is easier to produce correct recursive
solutions. It can be useful to have two solutions to the
same problem at hand. Even if one is dramatically less
efficient than the other.
I would speculate that choosing to implement a more complex
solution first would run a greater risk of failure and might
be more expensive in the long run. Constructing a prototype
first would allow one to obtain experience solving the
problem (or pieces of it). If the main effort fails then
the prototype would exist as a solution.
--
Mike Rilee NASA/GSFC Mailstop 930.0 Greenbelt, MD 20771
mri...@hannibal.gsfc.nasa.gov Ph. (301)286-4743 Fx. (301)286-1777
--
Composed using Oberon. http://www-cs.inf.ethz.ch/Oberon.html
You are wrong, but it is not clear to me how wrong you are since
1. It is not clear to me what you mean by the phrase "if the recursion
is justified". The typical justification for the use of recursion over
iteration is that it more clearly represents the programmer's intent for
the properties of the code. However, clarity of intent need not have a
direct relationship with efficiency of implementation.
2. The most commonly quoted examples of replacement of recursion by
iteration, i.e., tail recursion elimination, do gain in efficiency by
the replacement, for a reasonably intelligent compiler. The procedure
call adds overhead and indirection not present in the equivalent
iterative construct. However it is not clear to me that typical
compilers can replace more sophisticated usages of recursion with
comparably efficient iterative implementations.
--
William B. Clodius Phone: (505)-665-9370
Los Alamos Nat. Lab., NIS-2 FAX: (505)-667-3815
PO Box 1663, MS-C323 Group office: (505)-667-5776
Los Alamos, NM 87545 Email: wclo...@lanl.gov
Is one of you flame-baiting, or are you both completely clueless?
Neither recursive nor iterative functions are inherently larger than one
another; in general, the amount of code will be very similar. Most of the
work is usually done in the body of the loop or recursive function, not in
the code that determines how to repeat itself, and this should be almost
identical in the two versions.
When a function recurses, it doesn't "make another complete copy of itself
to run." There's just a single copy of a function, but perhaps a new
activation record (AKA stack frame) to hold the context of the recursive
call. And if the function is tail-recursive, and the language
implementation supports tail-call optimization, the recursive call can use
the same activation record. Sometimes the tail-recursive version of a
function is a little less obvious than the recursive version, though.
Here's an example of factorial in iterative, recursive, and tail-recursive
versions:
(defun fact-iter (n)
(do ((result 1 (* result i))
(i n (1- i)))
((< i 2) result)))
(defun fact-recurs (n)
(if (< n 2)
1
(* n (fact-recurs (1- n)))))
(defun fact-tail-recurs (n)
(labels ((recurs (i result)
(if (< i 2)
result
(recurs (1- i) (* result i)))))
(recurs n 1)))
In an implementation that supports tail-recursion optimization, the
iterative and tail-recursive versions should generate almost identical
code. The recursive version, however, will use O(n) stack space for all
the recursive calls; once the recursion bottoms out, the multiplications
will be done as each call returns.
--
Barry Margolin, bar...@bbnplanet.com
BBN Corporation, Cambridge, MA
Support the anti-spam movement; see <http://www.cauce.org/>
For perfection, that's LABELS, not FLET.
--Vassili
[...]
>Actually, I have been given to understand quite the opposite. Apparently
>whenever (during runtime) a function calls itself, it has to make another
>complete copy of itself to run.
This is not true. All that has to be done is that a new bit of stack is
allocated.
[...]
> In my limited experience, iterative algorithms have not
>been that much larger than their recursive counterparts, just more
>difficult to divine.
OK, the extended Euclidean algorithm then. I have seen no way to do it
iteratively that doesn't involve stacks and a hell of a lot of hard to
understand code.
> OK, the extended Euclidean algorithm then. I have seen no way to do it
> iteratively that doesn't involve stacks and a hell of a lot of hard to
> understand code.
>
> --
> Please excuse my spelling as I suffer from agraphia see the url in my header.
I don't know whether you're a Lisp person or a C++ person (both groups
are in the headers), so I'll do this in pseudocode. It's not the
most efficient code you could write, but it works and it should be
pretty easy to understand.
function euclid(integer x, y) returning integers d,a,b:
// d is the gcd of x,y
// ax+by = d
// for simplicity we assume that x,y are both non-negative,
// and that x>=y.
integer xx=x, px=1, qx=0 // xx=px.x+qx.y always
integer yy=y, py=0, qy=1 // yy=py.x+qy.y always
while yy>0 do
integer q=floor(xx/yy), r=xx-q*yy
// now replace xx,yy with yy,r
(xx,px,qx, yy,py,qy) := (yy,py,qy, r,px-q*py,qx-q*qy)
return d=xx, a=px, b=qx
Not a stack in sight.
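A direct Common Lisp transcription of that pseudocode, for anyone who
wants to run it (a sketch; EXT-EUCLID is a made-up name, and the same
x >= y >= 0 assumption applies):

(defun ext-euclid (x y)
  ;; Returns (values d a b) with d = gcd(x,y) and a*x + b*y = d.
  (let ((xx x) (px 1) (qx 0)   ; invariant: xx = px*x + qx*y
        (yy y) (py 0) (qy 1))  ; invariant: yy = py*x + qy*y
    (loop while (> yy 0)
          do (let ((q (floor xx yy)))
               (psetf xx yy  px py  qx qy     ; parallel update, as in the
                      yy (- xx (* q yy))      ; tuple assignment above
                      py (- px (* q py))
                      qy (- qx (* q qy)))))
    (values xx px qx)))

;; (ext-euclid 240 46)  =>  2, -9, 47   and indeed -9*240 + 47*46 = 2.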
Calling people names doesn't do anything.
>Any decent LISP programmer, in fact any decent programmer, knows
> that. Efficiency is part of programming. But it is not the be-all and
> end-all. Yes, the lower-level parts of many LISP systems might well be
> written in C/Assembly etc.
>
> The real question is: What is the most efficient/elegant/best way to
> express a particular program? It might, for some problems, be in C or C++.
> For some, it might be LISP or Prolog.
It is true that LISP and Prolog may have some built-in functions
that let the programmer write less code, but as far as speed is
concerned,
You're smart to ignore it :)
I took a look at the FAQ. It's mostly about the syntax of
Lisp code.
Anyway, all Lisp programs, as well as the compilers and
interpreters, are broken down into assembly-level code, which is
iterative. The thing I have a problem with is people trying
to write programs that are completely recursive, which is what Lisp
is about. That is the wrong way to go about it. It's a tremendous
waste.
Peaceman
From: Sajid Ahmed the Peaceman <peac...@capital.net>
Newsgroups: comp.lang.lisp,comp.programming,comp.lang.c++
Date: Thu, 17 Jul 1997 13:36:39 -0400
Organization: Logical Net
And in one fell swoop, 50 years of programming science and engineering
are thrown out the window (or left for the garbage collectors :) ).
This has nothing to do with Lisp or C/C++ or Assembly language. This
has to do with good programming practices and the experience of the
programmer. Next thing I'll hear from you is how to solve the
general TSP problem in linear time. (Of course, I'd like to hear from
you a definition of "linear time" beforehand).
Yeah, go figure....
I wonder Y
Calling people names doesn't do anything.
>Any decent LISP programmer, in fact any decent programmer, knows
> that. Efficiency is part of programming. But it is not the be-all and
> end-all. Yes, the lower-level parts of many LISP systems might well be
> written in C/Assembly etc.
>
> The real question is: What is the most efficient/elegant/best way to
> express a particular program? It might, for some problems, be in C or C++.
> For some, it might be LISP or Prolog. If I can write a program which
> efficiently does a job in 60 lines of well-documented LISP that would take
> 300 lines or more of C, and they both run at similiar speeds, it would
> seem that LISP is the better choice.
It is true that LISP has some built-in functions that allow
a programmer to write less code. As far as speed is concerned, in almost
every situation, the same program written in C would be faster than a
similar program written in Lisp. Why? C is much closer to the machine-level
assembly code that all computers run on. Many C compilers allow inline
assembly language code within the program.
As far as the size of the program is concerned, most of the time C
programs are smaller. Why? Good Lisp programs only allow recursive code,
without any stop codes, whereas good C programs allow for both recursive
and iterative code.
Have you ever seen the quicksort algorithm written in Lisp?
Even though it is a recursive function, it still needs well over
100 lines of code. In C it would only be 5 or 6 lines.
Peaceman
While most assemblers are iterative, I believe I read recently (in the
proceedings from the most recent PLDI conference in a paper on developing
code generators for processors from the analysis of C compiler output)
that the assembler for the Tera computer is a dialect of Scheme. This,
of course, need not imply that the Tera assembly code is essentially
recursive, (or that the machine code generated from the assembly need
have a close resemblance to the assembly code), but is interesting none
the less.
Programming language design is about providing a means of expressing an
idea as clearly as possible, (because recursion is often the clearest
way of expressing a concept languages should always include recursion),
implementation is about providing the "best" possible combination of
robustness and efficiency, (if that is best achieved by translating the
recursion into an iterative process then so be it), and programming is
about serving the needs of the user of the code (and, as important one
user of the code is very often its maintainer, that implies that the
code should be written as clearly as possible, with subtle tricks (e.g.,
unclear iterative constructs) used rarely and well documented).
If I write a clever compiler, it can turn recursion into iteration.
Lisp does that, automatically. (see note 1)
If I want to write iteration instead of recursion in Lisp, there are
plenty of ways to do so. I am not at any time required to use recursion.
Here is an example I'm sure you'll understand:
(loop for x from 1 to 10 do (print "I am iterating"))
The Lisp function does not use recursion, but iterates in the exact same
way as this C program:
{for (x=1;x<=10;x++) {printf("I am iterating\n");};}; // (see note 2)
Please also note that in lisp, it's perfectly OK to say:
(defun eat (a b c)
  "Combine a b and c into a single number."
  (progn ;;;; (see note 3)
    (let ((x 0))        ; x is local here, like "int x" in the C version
      (setf x (+ a b))
      (setf x (* c x))
      x)))
which is precisely the same as this C program
int eat(int a, int b, int c)
{
    int x;
    x = 0;
    x = a + b;
    x = x * c;
    return x;
}
Bottom line: you can do any programming construct you like in Lisp.
fred
(note 1) Recursion has nothing to do with programming languages. You can
write recursive assembly code just as easily as recursive Lisp. You just
push things onto "the stack" and then pop them off when you are finishing
up.
Indeed, every subroutine call, be it written in assembler, FORTRAN, C++,
or Lisp, does exactly this.
However, if you write a lisp function that explicitly uses recursion, the
compiler will be smart enough (in most cases) to compile an equivalent
function that uses iteration instead of recursion.
(note 2) I've put the brackets in here, even though they are unnecessary,
to call attention to the fact that C and Lisp have nearly identical
syntax.
(note 3) Strictly speaking, this particular "(progn" is unnecessary (defun
has one "built in"), but I put it there to make it clear that Lisp has it,
and that it can be used just like any other function call, and that it
works just like you think it should: evaluate each statement in turn, and
return the result of the last one.
Note however, that C does not work precisely this way -- without the
explicit "return" statement, the result of the function is unpredictable,
despite the fact that the compiler will not signal an error!
;;; Quicksort a lisp list in considerably fewer than 100 lines of code :-)
(defun qs (x l) ; sort the list x onto the list l.
(if (null x) l
(let* ((i (car x)) (restx (cdr x))
(high low (highlow restx i nil nil)))
(qs low (cons i (qs high l))))))
(defun highlow (x i h l) ; select the high and low elts of x onto h and l.
(if (null x) (values h l)
(let* ((firstx (car x)) (restx (cdr x)))
(if (< firstx i) (highlow restx i h (cons firstx l))
(highlow restx i (cons firstx h) l)))))
This is from my paper "A 'Linear Logic' Quicksort' ACM Sigplan Notices
Feb. 1994.
ftp://ftp.netcom.com/pub/hb/hbaker/LQsort.html (also .ps.Z)
> Anyway, all lisp programs, as well as the compilers and
> interpreters are broken down into assembly level code, which is
> iterative. The thing I have a problem with is with people trying
> to write programs that are completely recursive, which is what lisp
> is about. That is the wrong way to go about it. It's a tremendous
> waste.
>
I would advise that you go back to school and retake the assembly
language programming course. When I took my course, using PDP-11
assembly language BTW, I wrote recursive, iterative and self-modifying
code. That's the beauty and pain of assembly language. You can make it
anything you want it to be. Branches and jumps are not iterative, and
at the assembly language level you can essentially jump wherever you
want and since activation frames are an artifact of the argument
passing convention you impose on the architecture, you can discard
them at this level. This means that all sorts of crazy things are
possible.
Also, in my view Lisp is not just about recursion just like C is not
just about machine level programming. The objective is to find a
medium in which it is straightforward to map abstract concepts into
computational entities. Certain concepts map well in C, some in
assembly language, others in Lisp+CLOS, others in C++,Eiffel, Sather,
Smalltalk, Perl, Tcl, whatever. So first find that medium that allows
you maximum expression, and then after you have the idea on canvas, so
to speak, then you can worry about whether you need to tweak the
medium.
Hopefully, you just program in Visual Basic and not some language where
you could hurt yourself and others, or, God forbid, learn something.
-Reggie
> W. Daniel Axline <wax...@cse.unl.edu> wrote:
>
> > Actually, I have been given to understand quite the opposite. Apparently
> > whenever (during runtime) a function calls itself, it has to make another
> > complete copy of itself to run.
>
> You mean a complete copy of the function's *code*? No, definitely not.
> Parameters and local variables are allocated on the stack and control
> flow continues at the beginning of the function.
>
> If the recursive call is the last one in the function ("tail-recursion")
> this can even be done without growing the stack as the new parameters
> and variables replace the old ones. You get a function that looks
> recursive, but effectively is executed iteratively. Lisp compilers do
> this optimization, I'm not sure about others.
The tail recursive version also doesn't need any function call -
it is just a jump.
For a normal Lisp compiler there won't be much speed difference.
(fac 1000) with bignum arithmetic needs 0.1 seconds on
my Macintosh Powerbook with MCL 4.1 in all three code versions.
The iterative execution uses less space.
I prefer a system which lets me write most of the time
functional style code (since this is clearer code most of the time)
without much speed penalty. We have that.
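To reproduce that kind of measurement with the factorial definitions posted
earlier in the thread, something like this works in any Common Lisp
(timings will of course vary with machine and implementation):

(time (fact-iter 1000))        ; iterative version
(time (fact-recurs 1000))      ; plainly recursive version
(time (fact-tail-recurs 1000)) ; tail-recursive version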
> C a subset of C++??? C came first!
>
And?
> Good Lisp programs only allow recursive code,
> without any stop codes, whereas good C programs allow for both recursive
> and iterative code.
How about Backus' FP (Functional Programming) and FFP (Formal FP)
languages? Or APL? They all provide elegant ways to resolve recursion
without the
go-down-the-bit-level-let's-program-the-X-Y-Z-registers-this-is-how-
I've-been-counting-beans-since-kindergarten loops of imperative
languages.
The ideas of FP, FFP and APL can be embedded in Lisp because it IS a
language that allows abstraction.
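As a rough sketch of what "embedding" those ideas can look like, here is a
point-free-style composition operator written directly in Common Lisp
(COMPOSE as defined here is illustrative, not a standard function):

(defun compose (&rest fns)
  ;; Right-to-left function composition, FP/APL style.
  (if (null fns)
      #'identity
      (lambda (x)
        (funcall (first fns)
                 (funcall (apply #'compose (rest fns)) x)))))

;; Sum of squares, written as a pipeline rather than an explicit loop:
;; (funcall (compose (lambda (xs) (reduce #'+ xs))
;;                   (lambda (xs) (mapcar (lambda (x) (* x x)) xs)))
;;          '(1 2 3 4))
;; => 30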
As for the speed, my belief is that the more abstract the language is,
the more optimized the application can be compiled, and it's an area
where Lisp has an opportunity to go even further. Anyway, it's the
Return on Investment that should tell you something about the bottom
line, and now good people are _far_ more expensive than good hardware.
C or assembly is still better for high volume, low complexity
programming (eg. quartz watch, calculator or washing machine logic).
Cheers,
Rob
> > Thats what makes lisp so easy to use.
> >
>
> Well, in some circumstances, but if you try to write every program
> using only recursive code, it makes things much much more difficult.
It's just as well Lisp doesn't require you to use "only recursive
code", then.
> > You seem to be implieing that every recursive call gets recompiled.
>
> That is true in some languages, but not true in others.
Name three languages that require all recursive function calls
to cause the function to be recompiled. In fact, name one.
> Every recursive function, whether in LISP, Prolog, C+ , ML, or
> any other language, is translated into iterative assembly (machine)
> language code.
That's only true if you adopt so broad a definition of "iterative"
as to make your statement meaningless.
> Every recursive function, whether in LISP, Prolog, C+ , ML, or
> any other language, is translated into iterative assembly (machine)
> language code.
And?
(You remember Backus' contribution to Fortran? And then inventing FFP as
a corrective step?
If von Neumann had lived longer, he'd have changed the processor history
in a similar way (IMO)).
(Non-Lisp programmers, sorry for the nested parentheses.)
Robert
This is generally only true if the function is tail-recursive. Many
recursive functions are not tail-recursive, and transforming an obviously
recursive function into a tail-recursive one may require somewhat contorted
coding. See my post with the iterative, recursive, and tail-recursive
versions of factorial.
An extremely clever compiler might be able to figure out how to transform a
recursive function into a tail-recursive one, but this requires quite a bit
of flow analysis that's beyond most compiler writers.
>> > : You don't see commercial games written in Lisp.
>> > : You don't see application suites written in Lisp.
>>
>> None of the above are true, of course. BTW The third domain is perhaps the
>> most in fashion of those listed.
>That's interesting. Could you give a couple of examples of commercial
>games running compiled (or even interpreted) Lisp code. How about an
>application suite? Anything mainstream?
In Object Expert magazine, 2 or 3 issues ago, there was an article on this
topic. A company that writes games was using Lisp (CLOS too ?? ) with
great success. They built a layer on top of Lisp specifically for games-
related needs, and called it GOOL (Games OO Lisp/Layer ?? ) .
I can't remember the name of the game(s) offhand.
Regards,
Steven Perryman
ste...@nortel.co.uk
>David Formosa wrote:
>> OK, the extended Euclidean algorithm then. I have seen no way to do it
>> iteratively that doesn't involve stacks and a hell of a lot of hard to
>> understand code.
>I don't know whether you're a Lisp person or a C++ person (both groups
>are in the headers), so I'll do this in pseudocode.
The implementation you gave is for the Euclidean algorithm. The extended
Euclidean algorithm takes 'p' and 'r' and returns q such that
(p * q) mod r == 1
--
Please excuse my spelling as I suffer from agraphia see the url in my header.
Definition. A recursive program parameterized by an abstract data
type T consists of a set of first-order recursion equations over T.
Definition. An iterative program over an abstract data type T
consists of a flowchart program over T.
Theorem. Any recursive program parameterized by an abstract data
type T can be translated into an iterative program over the abstract
data type T' consisting of the union of T with auxiliary stacks.
Sketch of proof: Compilers do this. QED
Patterson-Hewitt Theorem. There exists a recursive program
parameterized by an abstract data type T that cannot be
translated into an iterative program over T.
Sketch of proof: Let T have operations
b : T -> boolean
i : T -> T
f : T -> T
g : T -> T
h : T * T -> T
and consider the program F defined by
F (x) = if b (x) then i (x) else h (F (f (x)), F (g (x)))
Suppose there exists an equivalent iterative program over T.
By a construction akin to the pumping lemma for finite state
automata, you can use the Herbrand interpretation of T to
construct two distinct inputs for which the iterative program
computes the same result, at least one of which must be incorrect.
Therefore no such iterative program exists. QED
So the bottom line is that translating recursion into iteration
sometimes requires an auxiliary data structure such as a stack.
Real programming languages provide many choices for this auxiliary
structure. For example, higher order languages allow recursion to
be translated into continuation-passing style (CPS), which is an
iterative form in which first class procedures are used as the
auxiliary data structure.
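As a small illustration of that last point, here is a doubly recursive
function rewritten in continuation-passing style in Common Lisp (a sketch;
FIB-CPS is a made-up name). Every call becomes a tail call, and the
closures passed as K take over the job the control stack would otherwise do:

(defun fib-cps (n k)
  (if (< n 2)
      (funcall k n)                 ; deliver the answer to the continuation
      (fib-cps (- n 1)
               (lambda (a)
                 (fib-cps (- n 2)
                          (lambda (b) (funcall k (+ a b))))))))

;; (fib-cps 10 #'identity)  =>  55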
Will
>? the platypus {aka David Formosa} wrote:
[...]
>> > think that lisp functions the same way as mathematics.
>>
>> Thats what makes lisp so easy to use.
> Well, in some circumstances, but if you try to write every program
>using only recursive code, it makes things much much more difficult.
Did I say anything about using only recursive code? Because of Lisp's
closeness to mathematics it's easy to write mathematical code. In fact
it is easy to prove Lisp code correct.
[...]
>> You seem to be implieing that every recursive call gets recompiled.
> That is true in some languages, but not true in others.
No compiler writer worth hir salt would do this.
[...]
> Every recursive function, whether in LISP, Prolog, C+ , ML, or
>any other language, is translated into iterative assembly (machine)
>language code.
This is not true; in fact it is within the bounds of possibility to
write recursive assembly directly.
> The implementation you gave is for the Euclidean algorithm. The extended
> Euclidean algorithm takes 'p' and 'r' and returns q such that
>
> (p * q) mod r == 1
No, the implementation I gave is for the extended Euclidean algorithm.
Given x,y it returns their gcd d, and integers a,b with ax+by=d. In
particular, if the gcd is 1 then you get ax+by=1, i.e. ax == 1 mod y,
which is exactly what you say above.
If you don't believe me, convert it into your favourite language and
try it.
From: j...@alexandria.organon.com (Jon S Anthony)
Newsgroups: comp.lang.lisp,comp.programming,comp.lang.c++
Date: 17 Jul 1997 23:11:36 GMT
Organization: PSINet
In article <scf67uc...@infiniti.PATH.Berkeley.EDU> mar...@infiniti.PATH.Berkeley.EDU (Marco Antoniotti) writes:
> You can go even further than that. C compilers are written mostly in
> C. However, GNAT (the Ada 95 front end to the gcc backend) is written
> partly in Ada 95 and a large portion of the CMU Common Lisp compiler
Actually, the GNAT FE is entirely written in Ada95
> is written in Common Lisp. This notion of "compiler bootstrapping" is
> as old as computer programming itself.
What's more, the GNAT RTL is largely written in Ada95
Well. I am not surprised. I just wasn't up to date.
and from what I
see, most (all?) the CMU CL functions are written in CL (including
EVAL...)
This is true as well. I just did not want to stretch it too
much. (And there are portions of CMUCL written in C - after all it
must run on UN*X platforms).
Cheers
> No, Scheme came after Common LISP. Scheme was a reaction to Common
> LISP by a group of people at MIT who wanted a simple, clean LISP.
This is not true. Scheme was invented in 1975 by Gerry Sussman
and Guy L Steele Jr; see MIT AI Memo 349. Common Lisp did not
begin until after an ARPA "Lisp Community Meeting" in April 1981;
see the article by Steele and Gabriel in the History of Programming
Languages Conference (HOPL-II), SIGPLAN Notices 28(3), March 1993.
The lexical scoping of local variables in Common Lisp came from
Scheme, and Common Lisp had some influence on Scheme's later
evolution (e.g. Scheme's generic arithmetic), but by and large
these two languages evolved separately.
It is true that C is almost a subset of C++, but it is not true
that Scheme is a subset of Common Lisp. There are significant
differences between Scheme and Common Lisp concerning scope rules,
tail recursion, generic arithmetic, data types, exceptions,
continuations, macros, and the Common Lisp Object System.
The relationship between IEEE/ANSI Scheme and ANSI Common Lisp
is more like the relationship between Modula-2 and Ada83 than
the relationship between C and C++.
William D Clinger
From: hba...@netcom.com (Henry Baker)
Newsgroups: comp.lang.lisp,comp.programming,comp.lang.c++
Date: Thu, 17 Jul 1997 22:15:17 GMT
Organization: nil
Come on Henry. This is another flame bait in disguise. :) You should
have posted the version with arrays. Otherwise, after the claim that
"Lisp does support only recursion", we'd get also the "Lisp does not
have arrays" crap. :)
From: Sajid Ahmed the Peaceman <peac...@capital.net>
Newsgroups: comp.lang.lisp,comp.programming,comp.lang.c++
Date: Thu, 17 Jul 1997 14:22:44 -0400
Organization: Logical Net
Mark Greenaway wrote:
>
...
>Any decent LISP programmer, in fact any decent programmer, knows
> that. Efficiency is part of programming. But it is not the be-all and
> end-all. Yes, the lower-level parts of many LISP systems might well be
> written in C/Assembly etc.
>
> The real question is: What is the most efficient/elegant/best way to
> express a particular program? It might, for some problems, be in C or C++.
> For some, it might be LISP or Prolog. If I can write a program which
> efficiently does a job in 60 lines of well-documented LISP that would take
> 300 lines or more of C, and they both run at similiar speeds, it would
> seem that LISP is the better choice.
It is true that LISP has some built-in functions that allow
a programmer to write less code. As far as speed is concerned, in almost
every situation, the same program written in C would be faster than a
similar program written in Lisp. Why? C is much closer to the machine-level
assembly code that all computers run on. Many C compilers allow inline
assembly language code within the program.
How about
(defun i-am-a-very-fast-lisp-function (x y z)
  (* x (expt y z)))
or even
(defun i-am-a-very-fast-lisp-function (x y z)
  #I(x * y^^z))
followed by a
(declaim (inline i-am-a-very-fast-lisp-function))
What now?
As far as the size of the program is concerned, most of the time C
programs are smaller. Why? Good Lisp programs only allow recursive code,
without any stop codes, whereas good C programs allow for both recursive
and iterative code.
Either you have not read all the posting or you have not seen a Lisp
program or both.
Have you ever seen the quicksort algorithm written in Lisp?
Even though it is a recursive function, it still needs well over
100 lines of code. In C it would only be 5 or 6 lines.
In Common Lisp it is even shorter!
(defvar my-array (make-array 1000 :element-type 'single-float))
(dotimes (i 1000)
(setf (aref my-array i) (random 1000000.0)))
(sort my-array #'<)
...
These are examples of non-inherently recursive functions and do not
change the terms of the debate one bit. If memory does not fail me,
there is even a closed-form solution for the Fibonacci numbers that
can - hear, hear - be computed in O(1) time.
I was just chastised for my big mouth. The closed form contains
exponentials that cannot be computed in constant time.
I guess *I* have to go back to the books.
> In article <bc-170797...@17.127.18.234>,
> Fred Haineux <b...@wetware.com> wrote:
> >If I write a clever compiler, it can turn recursion into iteration.
> >Lisp does that, automatically. (see note 1)
>
> This is generally only true if the function is tail-recursive. Many
> recursive functions are not tail-recursive, and transforming an obviously
> recursive function into a tail-recursive one may require somewhat contorted
> coding. See my post with the iterative, recursive, and tail-recursive
> versions of factorial.
EVERY function can be made tail-recursive by means of continuation-passing.
That's what Michael Fischer proved in 1972. (The original 'push'
technology :-)
I'm going to ignore most of what you said, as it is clear you don't
know either Lisp or C from your posts. But you raise an interesting
point later...
> [speed stuff snipped]
>
> As far as the size of the program is concerned, most of the
> time C
> programs are smaller? Why? Good Lisp programs only allow recursive
> code,
> without any stop codes, whereas good C programs allow for both
> recursive
> and iterative code.
>
> Have you ever seen a the quicksort algorithm written in Lisp?
> Even though it is a recursive function, it still needs well over
> 100 lines of code. In C it would only be 5 or 6 lines.
I'd very much like to see this 5 or 6 line C quicksort. Can you
post the sort source? In any case, here's a 5-line Lisp quicksort I just
wrote (took me ~5 minutes):
(defun part(lst fn) (let ((ls nil)(gs nil)) (dolist (x lst)
(if (funcall fn x) (push x ls) (push x gs)))
(list ls gs)))
(defun qs(lst cmp) (if (< (length lst) 2) lst (let* ((p (first lst))
(pt (part (rest lst) #'(lambda(x)(funcall cmp x p)))))
(nconc (qs (first pt) cmp) (list p) (qs (second pt) cmp)))))
That's quicksort. Not my sort of choice, but it works a-ok.
I just cooked it up really quickly to show an example. Note the
iteration in the part (partition) function!
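A quick check of those two definitions, assuming they are loaded as written:

(qs '(3 1 4 1 5 9 2 6) #'<)              ; => (1 1 2 3 4 5 6 9)
(qs '("pear" "apple" "fig") #'string<)   ; => ("apple" "fig" "pear")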
You have my humble apologies.
I feel bad for the people whose careers are based on this stuff.
I once had a professor who went to school with Ted Kaczynski (aka the
Unabomber), who supposedly wrote the world's fastest sorting algorithm.
The only catch was you needed to have more elements than the total
number of atoms in the universe for it to be faster than any of the
so-called slower sorting routines.
Peaceman
> Well, in some circumstances, but if you try to write every program
> using only recursive code, it makes things much much more difficult.
Would you mind to have a look at some actual Lisp code?
Michael
--
Michael Schuerig Opinions are essentially bets on the truth of
mailto:uzs...@uni-bonn.de sentences in a language that you understand.
http://www.uni-bonn.de/~uzs90z/ -Daniel C. Dennett
[...]
> I have written binary tree traversal algorithms which don't use a stack.
Using a parent pointer is cheating ;-)
--
Dr. Horst H. von Brand mailto:vonb...@inf.utfsm.cl
Departamento de Informatica Fono: +56 32 654431
Universidad Tecnica Federico Santa Maria +56 32 654239
Casilla 110-V, Valparaiso, Chile Fax: +56 32 797513
All compilers do that.
> If I want to write iteration instead of recursion in Lisp, there are
> plenty of ways to do so. I am not at any time required to use recursion.
...
> Bottom line: you can do any programming construct you like in Lisp.
>
> fred
If that is indeed the case, and considered to be good programming
style in Lisp, I'll completely change my outlook on Lisp, and accept
it as a decent programming language.
What I was told when I took a course in Lisp about 5 years
ago was that you could use iterative code, but it was considered bad
programming style, like using goto statements in other programming
languages.
Peaceman
I believe all the old Infocom games were written in ZDL (Zork Definition
Language, I think). I heard that ZDL was derived from MDL, a Lisp dialect
that was in use at MIT in the 70's, and which was used to implement the
original Zork on the PDP-10.
Good, but could you imagine writing code in Lisp using tail
recursion to evaluate a triple integral formula:
x1 y1 z1
/ / /
| | | ex ey ez
| | | C x y z dx dy dz
/ / /
x0 y0 z0
It would be far easier to leave the tail recursion out.
Peaceman
Come to think of it, I can't remember any off hand. I
don't quite remember if Basic did this.
Anyway, I'm sure there are some language designs that don't
use the stack when making calls to functions. They would expand them
inline like a macro definition. When the code would finally be
compiled, there would be recompilations of the function calls.
Peaceman
Good, but does it use the quicksort algorithm?
If you want to talk about built-in functions, I'd like to
note that the qsort function is included in the standard C
run-time library.
Cheers.
Peaceman
Branches and jumps are too iterative.
You agree that for-next loops are iterative, right?
How about if statements? They both involve branches
and jumps.
> Hopefully, you just program in Visual Basic
Sorry, no VB for me.
Peaceman
Looks like he got me hook line and sinker :)
The lisp code you have above is not considered to be good
lisp programming style, (or what was told to me to be good lisp
programming style):
> (let* ((i (car x)) (restx (cdr x))
> (high low (highlow restx i nil nil)))
The i and restx are fine since you can easily substitute the rhs values
wherever they are referenced in the latter part of the function.
However you can't do the same with the high and low vars.
Anyway, I took the time out to make the necessary changes in your
code to take care of this. It's about 80 lines. I know you guys will
say I'm putting too little in each line, but I don't have access to
G-emacs
( written in Lisp, considered by many as the world's best text editor,
and considered by some as the world's most difficult to use text
editor)
where you can match up the close parenthesis to its corresponding open
parenthesis. There has to be some kind of alignment for the
parentheses.
Peaceman
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
(defun high (x i h) ; select the high elts of x onto h.
(if
(null x) h
(if
(< (car x) i)
(high
(cdr x)
i
h
)
(high
(cdr x)
i
(cons
(car x)
h
)
)
)
)
)
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
(defun low (x i l) ; select the low elts of x onto l.
(if
(null x) l
(if
(< (car x) i)
(low
(cdr x)
i
(cons
(car x)
l
)
)
(low
(cdr x)
i
l
)
)
)
)
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
(defun qs (x l) ; sort the list x onto the list l.
(if
(null x) l
(qs
(low
(cdr x)
(car x)
nil
)
(cons
(car x)
(qs
(high
(cdr x)
(car x)
nil
)
l
)
)
)
)
)
Actually, I've been doing some stuff sort of like this in Lisp. First
of all, I feel obligated to say that iteration isn't really considered
to be bad style, which is what I think you were aiming for. For loops
work just as well in Lisp as they do in C (do constructs, loop
constructs, etc.).
Now on to the cool bit. Imagine you have a function which knows how
to do 1D integrals. There are a zillion out there; call the one
you've got "integrate", so (integrate f x0 x1) does what you want it
to do.
Now, to integrate your 3D function f3(x, y, z), you just have to do:
(integrate
#'(lambda (x)
(integrate
#'(lambda (y)
(integrate
#'(lambda (z) (funcall f3 x y z))
z0 z1))
y0 y1))
x0 x1)
Pretty cool, eh? You don't even have to explicitly write the three
nested for loops. Sure, it's not a very good algorithm for doing 3D
integrals, but it's simple and it works.
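To make that self-contained, INTEGRATE could be any 1D quadrature routine;
a crude midpoint-rule sketch (my own placeholder, not a library function)
is enough to run the nested form above:

(defun integrate (f a b &optional (steps 100))
  ;; Midpoint-rule quadrature: sum f at the midpoint of each of STEPS slices.
  (let ((h (/ (- b a) steps))
        (sum 0.0))
    (dotimes (i steps sum)
      (incf sum (* h (funcall f (+ a (* (+ i 0.5) h))))))))

;; Integrating f3(x,y,z) = 1 over the unit cube should give roughly 1.0:
;; (let ((f3 (lambda (x y z) (declare (ignore x y z)) 1.0)))
;;   (integrate (lambda (x)
;;                (integrate (lambda (y)
;;                             (integrate (lambda (z) (funcall f3 x y z)) 0 1))
;;                           0 1))
;;              0 1))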
Part of the appeal of Lisp is how closures can be used to pull off
trickery like this.
- Johann
--
Johann A. Hibschman | Grad student in Physics, working in Astronomy.
joh...@physics.berkeley.edu | Probing pulsar pair production processes.
> >
> > (defun qs (x l) ; sort the list x onto the list l.
> > (if (null x) l
> > (let* ((i (car x)) (restx (cdr x))
> > (high low (highlow restx i nil nil)))
> > (qs low (cons i (qs high l))))))
> >
>
> The lisp code you have above is not considered to be good
> lisp programming style, (or what was told to me to be good lisp
> programming style):
>
Don't believe everything you hear. The above style is far preferable to the
garbage you've suggested below. Some poor authors still use your old
fashioned garbage style in their books, especially many C and C++ books.
That doesn't justify it.
> Anyway, I took the time out to make the neccesary changes in your
> code to take care of this. It's about 80 lines. I know you guys will
> say I'm putting too little in each line, but I don't have access to
> G-emacs
Everybody has access to GNU Emacs. Download a copy and learn it. If you
learn it, I guarantee it will change your prehistoric thinking. After you
become more hip, convert your friends who still think this way.
> ( written in Lisp,considered by many as the world's best text editor,
> and considered by some as the world's most difficult to use text
> editor)
> where you can match up the close paranthesis to it's corresponding open
> paranthesis. There has to be some kind of alignment for the
> parenthesis.
>
Why? I once encountered a large C program that used this style of formatting.
When I printed out a hardcopy to understand it, the whole last page was
literally nothing but indented braces! Tell me how this is of any use to
anybody.
> Peaceman
>
> ;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
> (defun qs (x l) ; sort the list x onto the
> list l.
> (if
> (null x) l
> (qs
> (low
> (cdr x)
> (car x)
> nil
> )
> (cons
> (car x)
> (qs
> (high
> (cdr x)
> (car x)
> nil
> )
> l
> )
> )
> )
> )
> )
Tell me how this style of formatting is of any use to anyone without a very
long straight-edge? I think that it is old-fashioned and for some
programmers just a fetish.
--
William P. Vrotney - vro...@netcom.com
>Marco Antoniotti wrote:
[...A correctly styled lisp code...]
> The lisp code you have above is not considered to be good
>lisp programming style, (or what was told to me to be good lisp
>programming style):
[...]
> Peaceman
>;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
>(defun high (x i h) ; select the high elts of x onto h.
> (if
> (null x) h
> (if
> (< (car x) i)
> (high
> (cdr x)
> i
> h
> )
[...Rest of the code where each open bracket makes a new line...]
This is really poorly styled Lisp. For one thing, a group of close brackets
goes all on one line.
This is Lisp code styled as per C. It is impossible to read or debug.
>Gareth McCaughan wrote:
[...]
>> Name three languages that require all recursive function calls
>> to cause the function to be recompiled. In fact, name one.
> Come to think of it, I can't remember any off hand. I
>don't quite remember if Basic did this.
Basic didn't have function calls.
> Anyway, I'm sure there are some language designs that don't
>use the stack when making calls to functions. They would expand them
>inline like a macro definition.
In such a language it would be impossible to write recursive code.
I don't believe that such a language exists, as the size of the
executables would be so massive as to be useless.
> Henry Baker wrote:
> >
> > EVERY function can be made tail-recursive by means of continuation-passing.
> > That's what Michael Fischer proved in 1972. (The original 'push'
> > technology :-)
>
> Good, but could you imagine writing code in lisp using tail
> recursion to eveluate a triple integral formula :
>
> x1 y1 z1
> / / /
> | | | ex ey ez
> | | | C x y z dx dy dz
> / / /
> x0 y0 z0
Yes. You could learn a lot from Sussman & Abelson's book.
You could even learn how to write a program to do those neat ascii
integral formulae in lisp!
I don't know what the Herbrand interpretation is,
but I can give you another proof.
Suppose you want to write a function that calculates the
Sine or Cosine of a number. You can write a nonterminating recursive
function for it, which has no iterative counterpart.
The point I'm making here is for functions used on
a computer. They're all translated into iterative assembly
language code.
Peaceman
The above style is fine if you don't care about the
readability of your code. You need to put spacing before each line
in a program to keep track of what level you're in.
> Everybody has access to GNU Emacs. Down-load a copy and learn it. If you
> learn it, I guarantee it will change your prehistoric thinking. After you
> become more hip convert you friends who still think this way.
>
Believe me, I know how to use it. The IDEs of many compilers
today offer many of G-emacs' features.
> Why? I once encounter a large C program that used this style of formatting.
> When I printed out a hardcopy to understand it the whole last page was
> literally nothing but indented braces! Tell me how this is of any use to
> anybody.
>
The spacing of the brackets tells you the end of a sublevel,
or how deep you are in the program. By looking at the spacing, you can
go up your code and find the previous line that has the same
amount of spacing. You're then able to easily see the beginning
and end of the section.
> Tell me how this style of formating is of any use to anyone without a very
> long straight-edge? I think that it is old fashioned and for some
> programmers just a fetish.
>
Well, some programmers don't care about the readability of
their code. Why not start every line at the very first space?
Peaceman
W. Daniel Axline wrote:
> Actually, I have been given to understand quite the opposite. Apparently
> whenever (during runtime) a function calls itself, it has to make another
> complete copy of itself to run.
A function does not make "another complete copy of itself to run." It only
allocates its parameters and local variables on the stack.
>Gareth McCaughan wrote:
>>
>> Name three languages that require all recursive function calls
>> to cause the function to be recompiled. In fact, name one.
>>
> Come to think of it, I can't remember any off hand. I
>don't quite remember if Basic did this.
> Anyway, I'm sure there are some language designs that don't
>use the stack when making calls to functions. They would expand them
>inline like a macro definition. When the code would finally be
>compiled, there would be recompilations of the function calls.
I once wrote a particularly ugly set of Macros for the PDP-11 that
used a stack for user written routines, but used the registers for
the "primitives". It did, however, save any registeres used by the
primitives on the stack, and restored them later.
Of course, all of these primitives were effectively inlined into the
code. When a program was compiled, I had about 10% actual code, and
90% saving/restoring registers. Like I said, it was horribly ugly. On
the bright side, though, all of my assembly language projects were
less than 30 "lines" of code.
But even with this horror, recursive functions were only inlined and
expanded once during compile, and then the stack exploded to insane
sizes during the recursion.
It was a great experiment when I wrote it, but quite terrible for
anything practical.
Heck, even calculators don't do what you suggest. Even INTERCAL
doesn't do what you suggest!
--
Will Hartung - Rancho Santa Margarita. It's a dry heat. vfr...@netcom.com
1990 VFR750 - VFR=Very Red "Ho, HaHa, Dodge, Parry, Spin, HA! THRUST!"
1993 Explorer - Cage? Hell, it's a prison. -D. Duck
>In <33D000...@capital.net> Sajid Ahmed the Peaceman <peaceman@capital.net> writes:
>
>>Gareth McCaughan wrote:
>
>[...]
>
>>> Name three languages that require all recursive function calls
>>> to cause the function to be recompiled. In fact, name one.
>
>> Come to think of it, I can't remember any off hand. I
>>don't quite remember if Basic did this.
>
>Basic didn't have function calls.
Not quite true. Several BASICs I've used have had the ability to define
procedures and functions.
>
>> Anyway, I'm sure there are some language designs that don't
>>use the stack when making calls to functions. They would expand them
>>inline like a macro definition.
>
>In such a languge it would be inpossable to write recursive code.
>I don't beleave that such a languge exists as the size of the
>exicutables would be so massive as to be useless.
Agreed. Which is why the inline C++ keyword is ignored for recursive
functions.
>
>--
>Please excuse my spelling as I suffer from agraphia see the url in my header.
>Never trust a country with more peaple then sheep. Buy easter bilbies.
>Save the ABC Is $0.08 per day too much to pay? ex-net.scum and proud
>I'm sorry but I just don't consider 'because its yucky' a convincing
argument
>.
>
foo(a
)
bar(b
)
if(
c < d
)
{
thisfunc(
a,
b,
c,
d,
)
;
}
}
Would you count that as 24 lines of code? :-)
Your "it takes 80 lines to do this in lisp" doesnt really have too much
meaning if 52 of those lines is related solely to how the code is
formatted. Heck, I could write the whole thing in one line (text file input
to a lisp interpreter).As could the C version. DO we then start counting
characters?
Dennis
(defun high (x i h) ; select the high elts of x onto h.
(if (null x) h
(if (< (car x) i)
(high (cdr x) i h )
(high (cdr x) i (cons (car x) h ) )
)
)
)
(defun low (x i l) ; select the low elts of x onto l.
(if (null x) l
(if (< (car x) i)
(low (cdr x) i (cons (car x) l ) )
(low (cdr x) i l )
)
)
)
(defun qs (x l) ; sort the list x onto the list l.
(if (null x) l
(qs (low (cdr x) (car x) nil )
(cons (car x)
(qs (high (cdr x) (car x) nil ) l )
)
)
)
)
>.
>
Sajid Ahmed the Peaceman wrote in article <33CFFD...@capital.net>...
>Marco Antoniotti wrote:
>>
>> In article <33CE62...@capital.net>, peac...@capital.net wrote:
>> > Have you ever seen a the quicksort algorithm written in
Lisp?
>> > Even though it is a recursive function, it still needs well over
>> > 100 lines of code. In C it would only be 5 or 6 lines.
>>
>> ;;; Quicksort a lisp list in considerably fewer than 100 lines of
code :-)
>>
>> (defun qs (x l) ; sort the list x onto
the list l.
>> (if (null x) l
>> (let* ((i (car x)) (restx (cdr x))
>> (high low (highlow restx i nil nil)))
>> (qs low (cons i (qs high l))))))
>>
>> (defun highlow (x i h l) ; select the high and low elts of x onto
h and l.
>> (if (null x) (values h l)
>> (let* ((firstx (car x)) (restx (cdr x)))
>> (if (< firstx i) (highlow restx i h (cons firstx l))
>> (highlow restx i (cons firstx h) l)))))
>>
>> This is from my paper "A 'Linear Logic' Quicksort' ACM Sigplan
Notices
>> Feb. 1994.
>> ftp://ftp.netcom.com/pub/hb/hbaker/LQsort.html (also .ps.Z)
>>
>> Come on Henry. This is another flame bait in disguise. :) You should
>> have posted the version with arrays. Otherwise, after the claim that
>> "Lisp does support only recursion", we'd get also the "Lisp does not
>> have arrays" crap. :)
>>
>
> Looks like he got me hook line and sinker :)
>
>
> The lisp code you have above is not considered to be good
>lisp programming style, (or what was told to me to be good lisp
>programming style):
>
>> (let* ((i (car x)) (restx (cdr x))
>> (high low (highlow restx i nil nil)))
>
> The i and restx are fine since you can easily substitute the rhs values
>wherever they are referenced in the latter part of the function.
>However you can't do the same with the high and low vars.
> Anyway, I took the time out to make the neccesary changes in your
>code to take care of this. It's about 80 lines. I know you guys will
>say I'm putting too little in each line, but I don't have access to
>G-emacs
>( written in Lisp,considered by many as the world's best text editor,
> and considered by some as the world's most difficult to use text
>editor)
>where you can match up the close paranthesis to it's corresponding open
>paranthesis. There has to be some kind of alignment for the
>parenthesis.
>
>
> Peaceman
>
>;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;;;
>(defun high (x i h) ; select the high elts of x onto h.
> (if
> (null x) h
> (if
> (< (car x) i)
> (high
> (cdr x)
> i
> h
> )
> (high
> (cdr x)
> i
> (cons
> (car x)
> h
> )
> )
> )
> )
>)
>;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;;
>(defun low (x i l) ; select the low elts of x onto l.
> (if
> (null x) l
> (if
> (< (car x) i)
> (low
> (cdr x)
> i
> (cons
> (car x)
> l
> )
> )
> (low
> (cdr x)
> i
> l
> )
> )
> )
>)
>;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;;
>(defun qs (x l) ; sort the list x onto the
>list l.
> (if
> (null x) l
> (qs
> (low
> (cdr x)
> (car x)
> nil
> )
> (cons
> (car x)
> (qs
> (high
> (cdr x)
> (car x)
> nil
> )
> l
> )
> )
> )
> )
>)
>.
>
It pains me to note that someone has written a new "C syntax" language
that outputs pseudo-code for the MDL-running engine so that they can make
"new" infocom-style games without having to learn Lisp... (I suppose I
should've looked up the details....)
One of the reasons I went to MIT was a run-in with MDL courtesy of Prof.
Licklider at the AI-Lab back when I was an urchin....
bc
In article <33D111...@capital.net> Sajid Ahmed the Peaceman
<peac...@capital.net> writes:
>
>
> The spacing of the brackets tells you the end of sublevel,
> or how deep you are in the program. By looking at the spacing, you can
> go up your code and find the previous line that has the same
> amount of spacing. You're then able to easily see the begining
> and end of the section.
>
For example do you mean that you can see the beginning and end of of a
section better with this
(defun faa (x)
(cond ((numberp x) (+ x x)
)
(t (list x)
)
)
)
than this?
(defun faa (x)
(cond ((numberp x) (+ x x))
(t (list x))))
>
>
> > Tell me how this style of formating is of any use to anyone without a very
> > long straight-edge? I think that it is old fashioned and for some
> > programmers just a fetish.
> >
>
>
> Well, some programmers don't care about the readability of
> their code. Why not start every line at the very first space?
>
>
> Peaceman
Obviously you did not understand my point. If you knew Emacs the way you
say you do then you would know that Emacs does proper indenting for you.
--
William P. Vrotney - vro...@netcom.com
> Basic didn't have function calls.
It did 15 years ago. At least, it did in the MS Basic on the TRS-80.
You could define functions and call them. Of course, not many people
used them, but they _were_ there.
Curiously, the function parameters were dynamically scoped, just like
some of the older Lisp dialects. You can still get this behaviour in
Common Lisp, by declaring a name to be special.
In the case of MS Basic, without any way of declaring scope for a
name, the only option was dynamic scoping. In this sense, it didn't
have functions with explicit scope. Could this be what you mean?
More modern Basics _definitely_ have functions with explicit scoping.
> > Anyway, I'm sure there are some language designs that don't
> >use the stack when making calls to functions. They would expand them
> >inline like a macro definition.
>
> In such a languge it would be inpossable to write recursive code.
> I don't beleave that such a languge exists as the size of the
> exicutables would be so massive as to be useless.
I recall seeing the Winston and Horn animal program rewritten in Basic
(yep, the same MS Basic I used on the TRS-80), and it used arrays for
the "stack". The result was very ugly and hard to read. I didn't
understand what the program did until I saw the Lisp version.
I guess you could say that a language that compiles into Combinatory
Logic doesn't use a stack, and that recursive functions are copied. In
fact, unless I've misunderstood it, calling a function involved
copying it, substituting the arguments for the parameters. The entire
program and its data is a single data structure which is gradually
reduced to just the data.
However, unless we're talking about pure functional dialects of Lisp,
like LispKit, we can say that this is not usually how Lisp works! I've
seen a Scheme interpreter written in Haskell, but that's not a typical
way to implement Lisp. In fact, if you were to implement Basic or C in
Haskell, you'd have to do it in a similar way, and it would be equally
atypical for those languages. Compiling Basic or C - or C++ - into
Combinatory Logic might be interesting, but not many people would find
it of practical value.
--
<URL:http://www.wildcard.demon.co.uk/> You can never browse enough
"An operating system is a collection of things that don't fit into a
language. There shouldn't be one." Daniel Ingalls, Byte August 1981
> G-emacs
> (written in Lisp, considered by many as the world's best text editor,
> and considered by some as the world's most difficult to use text
> editor)
> where you can match up the close parenthesis to its corresponding open
> parenthesis. There has to be some kind of alignment for the
> parentheses.
VI should have done it, too.
Unless somebody has an extended LET*, a MULTIPLE-VALUE-BIND will
do the trick.
(defun qs (x l p)
(if (null x)
l
(let ((i (first x)))
(multiple-value-bind (high low)
(highlow (rest x) i nil nil p)
(qs low (cons i (qs high l p)) p)))))
(defun highlow (x i h l p)
(if (null x)
(values h l)
(let ((firstx (first x)))
(if (funcall p firstx i)
(highlow (rest x) i h (cons firstx l) p)
(highlow (rest x) i (cons firstx h) l p)))))
; (qs '(3 1 5 6) nil #'<)
If I'm counting right, this is still a bit less than 100 lines.
Btw., for formatting code you can use PPRINT:
(let ((*print-right-margin* 80)
(*print-case* :downcase))
(pprint '(defun qs (x l)
(if
(null x) l
(qs
(low
(cdr x)
(car x)
nil
)
(cons
(car x)
(qs
(high
(cdr x)
(car x)
nil
)
l
)
)
)
)
)))
Gives you a version that doesn't look like C:
(defun qs (x l)
(if (null x)
l
(qs (low (cdr x) (car x) nil)
(cons (car x) (qs (high (cdr x) (car x) nil) l)))))
Perhaps it's time you reconsidered Lisp on its own merits and determined how
well it suits your needs without anyone else's biases? I don't think Lisp is
the only useful language, but it's a good one for many applications, and
perhaps the best for some.
Bill House
--
http://www.housewebs.com
Note: my e-mail address has been altered to
confuse the enemy.
As someone who typically writes multi-language programs, I find that indenting
them all more or less the same way makes them all easier for me to read. If I
get too used to Lisp conventions, the others become subjectively less readable,
and vice versa. Human perception is unfortunately illogical. <shrug>
>
> William Paul Vrotney <vro...@netcom.com> wrote in article
> <vrotneyE...@netcom.com>...
> >
> >[snip two examples of indenting]
> >
> I think that people can most easily read the style they are most accustomed to.
>
>
> As someone who typically writes multi-language programs, I find that indenting
> them all more or less the same way makes them all easier for me to read. If I
> get too used to Lisp conventions, the others become subjectively less readable,
> and vice versa. Human perception is unfortunately illogical. <shrug>
>
Basically it sounds like you agree with me on Lisp conventions; however, I
don't have the vice versa experience. I suppose this is because I prefer
Lisp.
I too write multi-language programs and also use "the same way" on my C++
programs. But I get a lot of heat from other C++ programmers who do not
have a Lisp background. Instead of saying that my style is wrong for some
reason, they say that they don't like it because it reminds them of Lisp.
Sometimes I wonder if the reason for this is that an occurrence of "}}}}}"
in a C program dilutes the "LISP stands for "Lots of Irritating Single
Parentheses"" catch phrase.
> It pains me to note that someone has written a new "C syntax" language
> that outputs pseudo-code for the MDL-running engine so that they can make
> "new" infocom-style games without having to learn Lisp... (I suppose I
> should've looked up the details....)
If you're talking about Inform, he didn't write it so that you wouldn't
have to learn Lisp; he wrote it because there wasn't *any* freely available
(or indeed non-freely available) way of producing code for the Z-machine.
Which, incidentally, runs a byte code that isn't particularly Lispish.
--
Gareth McCaughan Dept. of Pure Mathematics & Mathematical Statistics,
gj...@dpmms.cam.ac.uk Cambridge University, England.
From: Sajid Ahmed the Peaceman <peac...@capital.net>
Newsgroups: comp.lang.lisp,comp.programming,comp.lang.c++
Date: Fri, 18 Jul 1997 18:54:30 -0400
Organization: Logical Net
Fred Haineux wrote:
>
> | Anyway, all lisp programs, as well as the compilers and
> | interpreters are broken down into assembly level code, which is
> | iterative. The thing I have a problem with is with people trying
> | to write programs that are completely recursive, which is what lisp
> | is about. That is the wrong way to go about it. It's a tremendous
> | waste.
>
> If I write a clever compiler, it can turn recursion into iteration.
> Lisp does that, automatically. (see note 1)
>
All compilers do that.
Not only do you need to look at some algorithms books, you also need
to look at some compilers books. AFAIK, gcc does not do that in all
possible cases (I might be wrong). In the Lisp world, KCL and most of
its derivatives did not properly eliminate tail-recursive calls.
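To make the distinction concrete, here is a minimal sketch (the function
names are made up for illustration). Only the second definition makes its
recursive call in tail position, i.e. as the very last thing it does, so only
it can be compiled into a plain loop by a tail-call-eliminating compiler:

(defun my-length (xs)                        ; NOT a tail call: the + happens
  (if (null xs)                              ; after the recursive call returns
      0
      (+ 1 (my-length (rest xs)))))

(defun my-length-tail (xs &optional (n 0))   ; tail call: nothing is left to do
  (if (null xs)                              ; afterwards, so the frame can be
      n                                      ; reused
      (my-length-tail (rest xs) (+ n 1))))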
> If I want to write iteration instead of recursion in Lisp, there are
> plenty of ways to do so. I am not at any time required to use recursion.
...
> Bottom line: you can do any programming construct you like in Lisp.
>
> fred
If that is indeed the case, and considered to be good programming
style in lisp, I'll completely change my outlook on lisp, and accept
it as a decent programming language.
Iteration in Lisp is good programming practice whenever it is useful.
What I've been told when I took a course in Lisp about 5 years
ago, was that you could use iterative code, but it was considered bad
programming style, like using goto statements in other programming
languages.
Iteration is considered bad programming practice when recursion does
not take a performance hit, the algorithm is inherently recursive
(yes! such beasts do exist), and a recursive solution would be
much better suited, more readable, and more maintainable. Nobody forbids you from
writing iterative (Common) Lisp code. You have all you need in the
language to do just that.
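For instance, here is a small sketch of perfectly ordinary iterative Common
Lisp (the function names are made up):

(defun sum-list (xs)
  (let ((total 0))
    (dolist (x xs total)        ; plain iteration, no recursion anywhere
      (incf total x))))

;; or, using the LOOP macro:
(defun sum-list* (xs)
  (loop for x in xs sum x))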
--
Marco Antoniotti
==============================================================================
California Path Program - UC Berkeley
Richmond Field Station
tel. +1 - 510 - 231 9472
From: vro...@netcom.com (William Paul Vrotney)
Newsgroups: comp.lang.lisp,comp.programming,comp.lang.c++
Date: Sun, 20 Jul 1997 04:38:51 GMT
Organization: Netcom On-Line Services
In article <33D111...@capital.net> Sajid Ahmed the Peaceman
<peac...@capital.net> writes:
than this?
Not only that. He'd know that it also does the right indenting for
C/C++, using one or another of the more or less "standard" styles (K&R C,
GNU Coding, or a few others - last I checked the code for cc-mode I was
kinda overwhelmed. BTW, cc-mode is written in Emacs Lisp :) ).
Cheers
From: Sajid Ahmed the Peaceman <peac...@capital.net>
Newsgroups: comp.lang.lisp,comp.programming,comp.lang.c++
Date: Fri, 18 Jul 1997 19:52:03 -0400
Organization: Logical Net
>
> In Common Lisp it is even shorter!
>
> (defvar my-array (make-array 1000 :element-type 'single-float))
>
> (dotimes (i 1000)
> (setf (aref my-array i) (random 1000000.0)))
>
> (sort my-array #'<)
>
> Cheers
> --
> Marco Antoniotti
Good, but does it use the quick sort algorithm?
Do you know a better comparison-based sort algorithm? (A case can be
made for heapsort over certain data structures).
If you want to talk about built in functions, I'd like to
note that the qsort function is included in the standard c
run time libraries.
Yep. And its signature is
void qsort(void *base, size_t nel, size_t width,
int (*compar) (const void *, const void *));
The Common Lisp signatures are
sort sequence predicate &key key => sorted-sequence
stable-sort sequence predicate &key key => sorted-sequence
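For instance (a small usage sketch with made-up data), the :KEY argument lets
you sort structured elements without writing the kind of wrapper comparison
function that qsort() needs:

(sort (vector '("bob" . 3) '("ann" . 1) '("cy" . 2))
      #'< :key #'cdr)
;; => #(("ann" . 1) ("cy" . 2) ("bob" . 3))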
From: Sajid Ahmed the Peaceman <peac...@capital.net>
Newsgroups: comp.lang.lisp,comp.programming,comp.lang.c++
Date: Sat, 19 Jul 1997 14:49:14 -0400
Organization: Logical Net
William D Clinger wrote:
>
...
>
> Suppose there exists an equivalent iterative program over T.
> By a construction akin to the pumping lemma for finite state
> automata, you can use the Herbrand interpretation of T to
> construct two distinct inputs for which the iterative program
> computes the same result, at least one of which must be incorrect.
> Therefore no such iterative program exists. QED
>
I don't know what the Herbrand interpretation is,
but I can give you another proof.
Suppose you want to write a function that calculates the
Sine or Cosine of a number. You can write a nonterminating recursive
function for it, which has no iterative counterpart.
I doubt that such a function would be useful. Ever heard of
"numerical precision"?
The point I'm making here is for functions used on
a computer. They're all translated into iterative assembly
language code.
Very well. Translate this into purely iterative assembly code. No
PUSH operations of *any* kind allowed.... and no 'parent' fields or
extra memory explicitly allocated by the programmer allowed. Using a
continuation-passing style is also a no-no.
Note that this is perfectly acceptable C code (indented as per the GNU
coding standards) which a programmer might be asked to write.
The point will be that neither you nor any C (or Lisp, or INTERCAL, or
Ada) compiler will be able to produce an iterative assembly
translation for this piece of code. Inherently recursive algorithms
and data structures do exist.
==============================================================================
void
preorder_traversal (tree_node *tree)
{
if (tree == 0)
return;
else
{
printf ("%d ", tree->key);
preorder_traversal (tree->left);
preorder_traversal (tree->right);
}
}
==============================================================================
And, since we are at it, here is the whole program. Tested and
working on a Solaris platform using gcc 2.7.2.
------------------------------------------------------------------------------
#include <stdio.h>
#include <assert.h>
#include <stdlib.h>
#include <unistd.h>   /* for getpid() */
typedef struct _tree_node
{
int key;
struct _tree_node *left;
struct _tree_node *right;
} tree_node;
void
insert (tree_node *tree, int value)
{
assert (tree != 0);
if (value == tree->key)
return;
else if (value < tree->key)
{
if (tree->left == 0)
{
tree->left = (tree_node*) malloc (sizeof (tree_node));
tree->left->key = value; /* Sorry. No check on malloc return. */
tree->left->left = tree->left->right = 0; /* new node starts as a leaf */
}
else
insert (tree->left, value);
}
else
{
if (tree->right == 0)
{
tree->right = (tree_node*) malloc (sizeof (tree_node));
tree->right->key = value; /* Sorry. No check on malloc return. */
tree->right->left = tree->right->right = 0; /* new node starts as a leaf */
}
else
insert (tree->right, value);
}
}
void
preorder_traversal (tree_node *tree)
{
if (tree == 0)
return;
else
{
printf ("%d ", tree->key);
preorder_traversal (tree->left);
preorder_traversal (tree->right);
}
}
void
inorder_traversal (tree_node *tree)
{
if (tree == 0)
return;
else
{
inorder_traversal (tree->left);
printf ("%d ", tree->key);
inorder_traversal (tree->right);
}
}
int
main (void)
{
tree_node *root = (tree_node*) malloc (sizeof (tree_node));
const int tree_size = 20;
int count;
/* Sorry. No system error checking.. */
srand (getpid ());
root->key = rand () % 100;
root->left = root->right = 0; /* the root starts with no children */
for (count = 0; count < tree_size; count++)
insert(root, rand () % 100);
puts ("Preorder traversal\n");
preorder_traversal (root);
putchar ('\n');
puts ("\nInorder traversal\n");
inorder_traversal (root);
putchar ('\n');
return 0;
}
------------------------------------------------------------------------------
Enjoy.
> Reginald S. Perry wrote:
> >
> > Sajid Ahmed the Peaceman <peac...@capital.net> writes:
> >
> > > Anyway, all lisp programs, as well as the compilers and
> > > interpreters are broken down into assembly level code, which is
> > > iterative. The thing I have a problem with is with people trying
> > > to write programs that are completely recursive, which is what lisp
> > > is about. That is the wrong way to go about it. It's a tremendous
> > > waste.
> > >
> >
> > I would advise that you go back to school and retake the assembly
> > language programming course. When I took my course, using PDP-11
> > assembly language BTW, I wrote recursive, iterative and self-modifying
> > code. Thats the beauty and pain of assembly language. You can make it
> > anything you want it to be. Branches and jumps are not iterative, and
> > at the assembly language level you can essentially jump wherever you
> > want and since activation frames are an artifact of the argument
> > passing convention you impose on the architecture, you can discard
> > them at this level. This means that all sorts of crazy things are
> > possible.
> >
>
> Branches and jumps are too iterative.
>
> You agree that for-next loops are iterative, right?
> How about if statements? They both involve branches
> and jumps.
>
But, a branch or jump can be called iterative ONLY IF the branch or
jump takes you to a place where you will be traversing the same
section of code each time. BUT, in pure assembly language you can
write code which while it may jump to the same label, could be
traversing different code. One way to do this is by making space in
your data section where you will write the raw machine instructions
based on whatever conditions you want. Another thing you can do which
was used a lot in the 70s is set up what is called a jump table where
the code looks iterative in that you are jumping to the same area but
where in that area you jump depends on the value in a register. Things
like radix conversion used this technique.
So while you can do things that look something like iteration, it
is really just a jump to someplace in the code. You are just mapping
your seemingly partial knowledge of C onto your limited understanding
of computer architecture. At the machine level, everything is just a
sequence of bytes. In order to avoid insanity, we map various
high-level concepts onto the machine-level architecture. But please
make no mistake, our mapping often constrains the things that are
possible at the machine level in order to gain clarity in
understanding the computational process. There will always be some
things that are better done at the machine level. And if you have
never really sat down and played and tried to _really understand_
what's going on at the machine level, your knowledge of programming
will always be limited and you will always get into silly arguments
where people are trying to explain something you don't understand in
terms of simpler things which you still don't understand.
-Reggie
-------------------
Reginald S. Perry e-mail: pe...@zso.dec.com
Digital Equipment Corporation
Performance Manager Group
http://www.UNIX.digital.com/unix/sysman/perf_mgr/
The train to Success makes many stops in the state of Failure.
>
>
> Good, but does it use the quick sort algorithm?
>
Dunno. That's implementation-dependent. Just like in C.
> If you want to talk about built in functions, I'd like to
>note that the qsort function is included in the standard c
>run time libraries.
>
Yes, a function called "qsort" is included in the standard, and it
is defined as a sort function. The standard makes no claim about the
algorithm. Any claim it made would be vitiated by the "as-if"
attitude of the C standard, which states that any implementation
details mandated by the standard may be disregarded, provided this
cannot be detected by any strictly conforming program.
In other words, we've got precisely equal guarantees in Common Lisp
and C, and Common Lisp has a more versatile and easier-to-use
sort function.
As far as implementations go, well, it seems just as reasonable to
use efficient sorting algorithms in Lisp as in C. If you have some
evidence that C implementations typically implement qsort() better,
in some manner, than Common Lisp implementations implement sort,
please let us know.
David Thornley
From: thor...@visi.com (David Thornley)
Newsgroups: comp.lang.lisp,comp.programming,comp.lang.c++
Date: 21 Jul 1997 21:43:25 GMT
Organization: Vector Internet Services, Inc.
In article <33D001...@capital.net>,
Sajid Ahmed the Peaceman <peac...@capital.net> wrote:
>>
>> In Common Lisp it is even shorter!
>>
>> (defvar my-array (make-array 1000 :element-type 'single-float))
>>
>> (dotimes (i 1000)
>> (setf (aref my-array i) (random 1000000.0)))
>>
>> (sort my-array #'<)
>>
Shouldn't that be
(setf my-array (sort my-array #'<))
?
Just because sort is a destructive function doesn't mean you can
omit the setf and get anything reasonable from it.
As a matter of fact, we can in this case, since we are using arrays.
(Unless I was mistaken in my interpretation of CLtL2 and the
Hyperspec.) If the sequence were a list, then the setf would be
mandatory. The punishment would be a "shorter" list (unless the
first element were also the first in the sorted list).
If we want, we could discuss at length why some of the design choices
of "destructive" operations in Common Lisp sometime have a
non-intuitive behavior.
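As a quick illustration of the pitfall (a minimal sketch; exactly which conses
end up where is implementation-dependent):

(defvar *xs* (list 3 1 5 6))

;; Wrong: SORT may reuse the conses, so the cell *XS* still points at can end
;; up in the middle of the result - *XS* might now print as (3 5 6).
(sort *xs* #'<)

;; Right: always capture the return value.
(setf *xs* (sort (list 3 1 5 6) #'<))    ; => (1 3 5 6)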
In an email, I responded to "peaceman" telling him that in Lisp, using an
"obvious" syntax, that is to say one which is very readable, is more
important than using recursive programming all the time.
Some algorithms are iterative, and it's 100% OK to write those functions
that way. Use the "obvious" mode of expression.
If you have a choice between an iterative and a recursive algorithm that
does the same thing, you are 100% OK to use whichever you like better.
Heck, put them both in.
Many Lisp heads assume, as a first approximation, that any syntax will
produce "fast enough" code (hopefully CORRECT code, but hey). Many Lisp
programmers would rather write "obvious," correct code than twisted,
speedy code.
This is a different attitude than some C programmers, who seem to care
most about efficiency, and less about correctness.
Interestingly, the famous book "Elements of Programming Style," by
Kernighan and Plauger, agrees that "obvious" code is better than "speedy"
code, because you rarely know where to optimize until you have run a
profiler, and "obvious" code is easier to maintain. These two guys, I am
told, are somewhat well-known to the C programmers, although this book
deals mostly with FORTRAN.
But anyway, this is a question of style, and "de gustibus non est
disputandum" -- there's no accounting for taste.
I think that the highly technical term for a teacher who browbeats
students into using recursive syntax exclusively, even when the algorithm is
better expressed iteratively, is "a bozo", although that's my opinion, not
everyone's. (It is an interesting mind-expansion exercise to practice
turning recursion into iteration, and vice versa, though.)
> Sajid Ahmed the Peaceman wrote:
>
> > Good Lisp programs only allow recursive code,
> > without any stop codes, whereas good C programs allow for both recursive
> > and iterative code.
Bad C programs only allow iterative code.
>With a mighty <869323883.942607@cabal>,
>dfor...@st.nepean.uws.edu.au uttered these wise words...
>> Basic didn't have function calls.
>It did 15 years ago. At least, it did in the MS Basic on the TRS-80.
In the old versions that I was exposed to there were only 'goto' and
'gosub' for flow control. QBasic, which is the most modern Basic I can
find (ignoring VB), does have functions and scoping.
[...]
>> In such a language it would be impossible to write recursive code.
>> I don't believe that such a language exists, as the size of the
>> executables would be so massive as to be useless.
>I recall seeing the Winston and Horn animal program rewritten in Basic
>(yep, the same MS Basic I used on the TRS-80), and it used arrays for
>the "stack". The result was very ugly and hard to read.
A solution that C programmers sometimes use to avoid recursion. In
many cases avoiding recursion in this way has a negative performance
impact.
>This is generally only true if the function is tail-recursive. Many
>recursive functions are not tail-recursive, and transforming an obviously
>recursive function into a tail-recursive one may require somewhat contorted
>coding. See my post with the iterative, recursive, and tail-recursive
>versions of factorial.
>An extremely clever compiler might be able to figure out how to transform a
>recursive function into a tail-recursive one, but this requires quite a bit
>of flow analysis that's beyond most compiler writers.
When I was a final year undergrad in 1987, the FP course I did had quite a bit
on 'transformation' techniques. The basis was that using proof techniques
(such as unfold/induction hypothesis/fold proofs etc.), a system could convert
one functional program with reams of recursion into an equivalent program that
was more computationally efficient. One specific part of this was the
conversion of recursive programs into equivalent tail-recursive progs that
used accumulating parameters to effect the 'iteration'.
And obviously the 'on paper' examples were stuff like Fibonacci and factorial.
Given that 10 years have passed, I'd like to think there are now quite a few
support systems (theorem provers etc) available to FP compiler writers to use
in areas like removing tail recursion.
As an aside, I still use the on-paper techniques for stuff like C++, where I
come up with an initial inefficient recursive algorithm, and then 'hand
transform' it into stuff using accumulating params and/or tail recursion
removal (someone finds it more than academic :-) ).
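A minimal sketch of that hand transformation, using factorial in Lisp (nothing
here depends on any particular compiler):

;; Naive recursion: the multiplication happens after the recursive call
;; returns, so each call needs its own frame.
(defun fact (n)
  (if (zerop n)
      1
      (* n (fact (1- n)))))

;; Accumulating-parameter version: the recursive call is now in tail position,
;; so a tail-call-eliminating compiler can run it as a loop.
(defun fact-acc (n &optional (acc 1))
  (if (zerop n)
      acc
      (fact-acc (1- n) (* n acc))))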
Regards,
Steven Perryman
ste...@nortel.co.uk
> >It did 15 years ago. At least, it did in the MS Basic on the TRS-80.
>
> In the old versons that I was exposed to there where only 'goto' and
> 'gosub' for flow control. qbasic wich is the most moden basic I can
> find (ignoring VB) dose have functions and scoping.
Old versions of MS Basic on the TRS-80, or some other Basic? As I said,
this Basic used _dynamic_ scoping, and most people didn't even know
that it had functions. You may have been one of the many who missed
them. I only know they were there because I read the entire manual.
> >I recall seeing the Winston and Horn animal program rewritten in Basic
> >(yep, the same MS Basic I used on the TRS-80), and it used arrays for
> >the "stack". The result was very ugly and hard to read.
>
> A solution that C programmers sometimes use to avoid recursion. In
> many cases avoiding recursion in this way has a negative performance
> impact.
And achieves little else. I've never had a problem with recursion, but
then, I used to write recursive descent parsers. Some compiler people
prefer this technique to the "yacc" style parser, and a few of them
claim that recursive descent is more efficient. Well, I don't know
about that, but there are certainly advantages to both techniques.
In both cases, you're thinking recursively, even if the code compiles
to a state machine that uses an array for the "stack". Hmm. Y'know,
even C uses a stack... Maybe some people have a problem with the _idea_
of recursion? I used to have a phobia of linked lists, until I wrote my
first Forth system (which depended heavily on lists). After that, I
had no problem with lists! In fact, I can't understand why I used to
have so much trouble with them. Perhaps some people just need the
right learning experience, to make them more comfortable with an idea?
It could be that they just had a bad teacher, and so got an impression
that recursion was somehow "difficult", and so not for them.
I was lucky with recursion. I discovered it via compiler theory, which
was a subject that greatly interested me at the time. (It still does.)
No doubt that helped me a lot. Recursion makes so many other ideas
easy to understand! How could I avoid it? I can't imagine being
without it any more than I can imagine not being able to use lists,
trees, graphs, stacks, and so on. It's fundamental.
>If we want, we could discuss at length why some of the design choices
>of "destructive" operations in Common Lisp sometime have a
>non-intuitive behavior.
>
I talked this over once with a guy on X3J13 (if that's the correct
designation). I understand the issues involved. (No, it wasn't
too embarrassing at the time.)
I do tend to write (<fun> <foo> ...) as (setf <foo> (<fun> <foo> ... ))
when <fun> is a destructive function. I *think* that's legal; if
<foo> is a constant or parameter I don't think using destructive
functions on it is kosher. Assuming it is legal, I don't think
it's bad style. It does protect you from changes when you decide
to convert an array to a list, for example.
David Thornley
> In an email, I responded to "peaceman" telling him that in Lisp, using an
> "obvious" syntax, that is to say one which is very readable, is more
> important than using recursive programming all the time.
>
> Some algorithms are iterative, and it's 100% OK to write those functions
> that way. Use the "obvious" mode of expression.
Unfortunately, "obvious" is in the mind of the beholder. State-based
(iterative) programming is harder not only for humans to understand
(subjective), but also harder for compilers to understand (objective).
One of these days (in my more than ample spare time :-) I'd like to show
how even traditionally "iterative" Fortran matrix codes are prettier and
easier to understand in functional/recursive form. (Hint: think linear
logic.)
If you think "iterative" codes are easier to understand, then please
explain to me how the "SVD" (Singular Value Decomposition) transformation
in "Numerical Recipes" works. (This section of the book is actually
online in pdf form at the publisher's web site, or at least used to be.)
See
ftp://ftp.netcom.com/pub/hb/hbaker/sigplannotices/gigo-1997-03.html
for some additional examples of iterative v. recursive.
| In article <bc-210797...@17.127.18.96>, b...@wetware.com (Fred
| Haineux) wrote:
|
| > In an email, I responded to "peaceman" telling him that in Lisp, using an
| > "obvious" syntax, that is to say one which is very readable, is more
| > important than using recursive programming all the time.
| >
| > Some algorithms are iterative, and it's 100% OK to write those functions
| > that way. Use the "obvious" mode of expression.
|
| Unfortunately, "obvious" is in the mind of the beholder. State-based
| (iterative) programming is harder not only for humans to understand
| (subjective), but also harder for compilers to understand (objective).
"Yes, that's exactly right." Or, perhaps, "Yes, but..."
|
| One of these days (in my more than ample spare time :-) I'd like to show
| how even traditionally "iterative" Fortran matrix codes are prettier and
| easier to understand in functional/recursive form. (Hint: think linear
| logic.)
|
| If you think "iterative" codes are easier to understand, then please
| explain to me how the "SVD" (Singular Value Decomposition) transformation
| in "Numerical Recipes" works. (This section of the book is actually
| online in pdf form at the publisher's web site, or at least used to be.)
|
| See
|
| ftp://ftp.netcom.com/pub/hb/hbaker/sigplannotices/gigo-1997-03.html
|
| for some additional examples of iterative v. recursive.
It's good to have these references, but I think that I was making the
point that you can write in whichever style you please.
Obviously, freedom of choice is freedom to piss off your colleagues with
non-obvious code.
But it beats Obfuscated C all to heck.
>In <MPG.e3bf8a69...@news.demon.co.uk> mcr@this_email_address_is_intentionally_left_crap_wildcard.demon.co.uk (Martin Rodgers) writes:
>>With a mighty <869323883.942607@cabal>,
>>dfor...@st.nepean.uws.edu.au uttered these wise words...
>>> Basic didn't have function calls.
>>It did 15 years ago. At least, it did in the MS Basic on the TRS-80.
>In the old versons that I was exposed to there where only 'goto' and
>'gosub' for flow control. qbasic wich is the most moden basic I can
>find (ignoring VB) dose have functions and scoping.
Well, in TRS-80 Level II BASIC, they had a DEF FN construct that
allowed one to define equations as functions. They were simple
expressions; there wasn't any logic allowed, and they were called
"functions" at the time.
I think it was something like:
10 DEF FNR(X)=INT(RND(0)*X)+1
20 R=FNR(6)+FNR(6)
30 IF R=7 OR R=11 THEN PRINT "YOU WIN!" ELSE PRINT "YOU LOSE"
I don't recall if you were allowed to put other variables in the
equations or not. If you were, then I imagine they were just globals
(like everything else).
I think you were able to create functions with both the numeric and
string types.
Funny, I thought I had blotted most of this stuff out of my psyche.
--
Will Hartung - Rancho Santa Margarita. It's a dry heat. vfr...@netcom.com
1990 VFR750 - VFR=Very Red "Ho, HaHa, Dodge, Parry, Spin, HA! THRUST!"
1993 Explorer - Cage? Hell, it's a prison. -D. Duck
do they? the question is one of which value one looks at, I think. `sort'
on a list returns a sorted list, but the cons cells that used to be that
list have been reused. if we look at the return value of `sort', we get
the sorted list. if we look at a random cons cell that has been reused in
some unspecified way, who's to tell? like `nreverse' in one implementation
swaps the `car' of cons cells, and in another the `cdr', we cannot know
what a cons cell that has been destructively modified would contain, unless
the operation is specified by the specification of the language, and it
isn't for `sort' or `nreverse'.
or, another way: after (sort <list> <predicate>), <list> is _history_.
#\Erik
--
there was some junk mail in my mailbox. somebody wanted to sell me some
useless gizmo, and kindly supplied an ASCII drawing of it. "actual size",
the caption read. I selected smaller and smaller fonts until it vanished.
This is an invalid argument.
To start with, C is _not_ close to the machine.
- most machines do not support C's auto-scaled pointer arithmetic directly
- *NONE* of the C89 integral types supported by the compilers on this
machine (an UltraSPARC) match the machine's native register size (64 bits).
[Yes, I know about long long, which is why I said _C89_ integral types.]
- C lets me specify a floating point type that does not exist on this
machine. On an older machine from the same manufacturer, NONE of C's
floating point types were supported in hardware.
- There are some important differences between C functions and what you
get in assembler. In fact, on the immediate predecessor of this machine,
it was actually useful that two C compilers had an option to use a
calling convention that was _not_ the normal one on this machine.
But the main flaw is the assumption that compilers are irredeemably
stupid. What matters is not how close to the machine the *starting* point
is, but how close the compiler can get it.
For the record, I have frequently found Scheme code compiled by Stalin to
run _faster_ than the corresponding C code.
Another reason the argument is flawed is that, as the Fortran people are
fond of pointing out, C is hard to optimise. Languages (such as Fortran 90)
where the compiler can know more about the program than any standard C
compiler can know about any standard C program, permit better optimisation.
In the examples where Scheme code ran faster than C, generally Fortran
code went even faster. And nobody talks about Fortran being "close to
assembler".
> As far as the size of the program is concerned, most of the time C
>programs are smaller?
This is meaningless. _Which_ Lisp? _Which_ C? Which machine/OS?
>Why? Good Lisp programs only allow recursive code,
>without any stop codes, whereas good C programs allow for both recursive
>and iterative code.
You shot yourself in the foot there.
Lisp has *all* the control structures that C has. ALL of them.
And then some.
> Have you ever seen the quicksort algorithm written in Lisp?
Yes, several.
>Even though it is a recursive function, it still needs well over
>100 lines of code. In C it would only be 5 or 6 lines.
Obviously, you have not seen a quicksort in C. One that came with a
version of UNIX takes (by actual measurement) 177 lines. The rather
faster one I wrote myself takes 162 lines. There's an important
optimisation in it, without which the C version would be about 80 lines.
The "Engineered" quicksort by Bentley & McIlroy is about 112 lines of C.
Since the partition step itself takes about 6 lines of C, it would be
rather hard to get a readable version of Quicksort in "5 or 6 lines" of C.
--
Four policemen playing jazz on an up escalator in the railway station.
Richard A. O'Keefe; http://www.cs.rmit.edu.au/%7Eok; RMIT Comp.Sci.
Note that there actually was a real commercial C system for the PC
where qsort() was implemented using Shell sort.
> Come to think of it, I can't remember any off hand. I
>don't quite remember if Basic did this.
There is nothing in any BASIC standard to require this.
Traditionally, "GOSUB" and "RETURN" manipulated a control stack
just like you'd expect. Multi-line functions in BASIC are, if
compiled, compiled just like functions in Pascal or C.
>Anyway, I'm sure there are some language designs that don't
>use the stack when making calls to functions.
Instead of repeating your assertion, PROVIDE SOME EVIDENCE FOR IT!
There have certainly been languages that didn't use *a* stack for
function calls, such as Simula 67, Interlisp, Burroughs Algol, &c.
That's because they supported multiple threads of control, so there
were multiple stacks, or cactus stacks, or spaghetti stacks.
>They would expand them
>inline like a macro definition. When the code would finally be
>compiled, there would be recompilations of the function calls.
The Algol 60 standard *described* all procedure calls using the "copy rule".
Every Algol 60 *implementation* I've ever heard of used a stack of
activation records, just like Pascal or C.
Even dBase III isn't defined to copy functions when they're called!
> Well, in TRS-80 Level II BASIC, they had a DEF FN construct that
> allowed one to define equations as functions. They were simple
> expressions, there wasn't any logic allowed, and they were called
> "functions" at the time.
That's the Basic that I'm thinking of.
> I think it was something like:
>
> 10 DEF FNR(X)=INT(RND(0)*X)+1
> 20 R=FNR(6)+FNR(6)
> 30 IF R=7 OR R=11 THEN PRINT "YOU WIN!" ELSE PRINT "YOU LOSE"
>
> I don't recall if you were allowed to put other variables in the
> equations or not. If you were, then I imagine they were just globals
> (like everything else).
Yes, from what I remember, the variables were dynamically scoped,
which is another way of saying that they were global. In your example,
variable X would shadow any already existing X, so that while FNR is
running, X would be 6, but when FNR returns, the old value would
become "visible" again.
As with many things in that Basic, and many other Basics of that
generation, functions were very limited. No wonder they were so rarely
used! Fortunately, these things change, and now it's not unusual for a
Basic to use lexical scoping and compile to native code. Not that this
stops some people from calling Basic "slow". People with long memories
and little recent experience of Basic, I guess.
Hmm. <looks at thread subject> Just like Lisp, really. Most Lisps that
I've used have compiled to native code, and yet some people can only
remember the interpreted Lisps, like XLISP. That could be because of
the wide availability and age of XLISP, but it's no excuse for assuming
that this is the best or fastest Lisp to be found.
That's a lot like assuming that Small C is the most sophisticated C
compiler you can find, simply because it's the only one that you've
found and used. A little more effort should give you a much better C
compiler, like GNU C/C++.
On the other hand, you might have used a C interpreter...
> I think you were able to create functions with both the numeric and
> strings types.
That sounds familiar, so you could well be right.
> Funny, I thought I had blotted most of this stuff out of my psyche.
Same here. I blame David Formosa. ;)
Every garbage collector capable of working in constant space does this.
You should add "does not modify the original data structure while
traversing it" to your list of no-nos. (Mark bits are usually shaved off
existing pointers, so they aren't really "explicitly allocated".)
IIRC, Knuth also describes an "elegant" technique somewhere in his AoCP
that involves encoded parent pointers without taking up additional space.
I remember an example involving doubly-linked lists, but I'm not sure if
this also works for trees.
Of course, the resulting code will be *much* uglier than a simple recursive
function. But there are people who don't care for elegance, or even
maintainability. I keep wondering why they feel a need for asserting the
perceived superiority of their favourite language here in comp.lang.lisp.
Perhaps to them, the mere existence of Lisp seems a threat.
And it may well be.
Greetings,
Jens.
--
mailto:j...@acm.org phone:+49-7031-14-7698 (HP TELNET 778-7698)
fax:+49-7031-14-7351
PGP: 06 04 1C 35 7B DC 1F 26 As the air to a bird, or the sea to a fish,
0x555DA8B5 BB A2 F0 66 77 75 E1 08 so is contempt to the contemptible. [Blake]
"Caution Function procedures can be recursive; that is, they can call
themselves to perform a given task. However, recursion can lead to stack
overflow. The Static keyword usually isn't used with recursive Function
procedures."
And of course, the VC++ compiler is equally "broken", if we are to judge by the
standard of your excellent article. <g>
Therefore, I think that proper handling of tail-recursion, like lambda, is
yet-another-killer-Lisp technique that the masses are doomed to remain ignorant
of (unless there is a sudden resurgence of Lisp popularity). <sigh>
A jump instruction, in assembly would translate
into the following:
MOVE IP, address
which is iterative.
Peaceman
Richard O'Keefe may correct me on this, but I believe that the primary
reason for the addition of the SAVE statement to Fortran 77 was to allow
implementations to use stacks in a reliable manner. Technically I
believe Fortran 66 could be implemented using stacks, but entities
became undefined when they left their scope, so that local entities could
not be used in a standard conforming manner to retain state. However,
because most implementations did not use stacks, a significant body of
code was written that assumed that local entities always retained their
values, making it difficult, of course, for compilers to use stacks and
still satisfy their customers. It is true, however, that stacks were not
as useful in Fortran 77 as they were for languages with recursion or
block scoping, let alone more explicitly stack based languages such as
Forth or Pop.
As to recursion, I don't know anyone who wrote standard conforming
recursive Fortran 77 procedures, although I have met a few that thought
they had done so. In every case they were wrong. Some were using
compiler specific extensions, but most had code that worked until new
standard conforming optimizations were implemented (such as the usage
of a stack). In some sense it is possible to emulate recursion in
Fortran 77, for example the following might be considered a form of
anonymous recursion
FACTOR = 1
10 IF (M .EQ. 0) THEN
FACTOR = 1 * FACTOR
ELSE
FACTOR = FACTOR * M
M = M - 1
GO TO 10
ENDIF
but similar reasoning might consider iteration to be a way of emulating
recursion.
--
William B. Clodius Phone: (505)-665-9370
Los Alamos Nat. Lab., NIS-2 FAX: (505)-667-3815
PO Box 1663, MS-C323 Group office: (505)-667-5776
Los Alamos, NM 87545 Email: wclo...@lanl.gov
> I keep wondering why they feel a need for asserting the
> perceived superiority of their favourite language here in comp.lang.lisp.
> Perhaps to them, the mere existence of Lisp seems a threat.
> And it may well be.
This seems to be the only plausible explanation. It might also explain
many of the Java attacks, which use _identical_ arguments to attacks
on Lisp. They're all bizarre, wrong, and easily refuted.
Are there any archives for comp.lang.lisp, so that we could refer such
people to them? That way, they could simply read the efforts of the
past, and save everybody's time.
Watching hordes of C++ programmers trying to convince us that Lisp
can't do what it _is_ doing (and has been doing, for some time) used
to remind me of seal culling. The C++ people were the baby seals,
while Lisp programmers would be the seal trappers, gently tapping the
heads of their prey with CS papers, Lisp software, and Lisp systems.
The result was the quick death of any anti-Lisp argument. Eventually,
the entire thread would die.
A few months would pass, and these events would repeat themselves,
only the words would be different. The arguments would be equally
clueless. Hence the image of baby seal culling. The difference between
C++ programmers and seals is that, well, seals don't write software.
Scary, isn't it?
--
<URL:http://www.wildcard.demon.co.uk/> You can never browse enough
"My body falls on you like a dead horse" -- Portait Chinois
Please note: my email address is gubbish
You could certainly write recursive code, as long as the
number of times the function calls itself is set at compile time.
> I don't believe that such a language exists
They're out there. New programming languages are created every day.
That's what YACC is for.
>as the size of the
> executables would be so massive as to be useless.
>
Most programming code out there (about 96%) is
nonrecursive. You've been programming in Lisp too long.
Peaceman
To each their own. I find this style a lot easier to read than putting
all the end parentheses on one line. Sure, it takes up more lines, but
it is a lot easier to group together the commands in the specified
level.
Peaceman
Igor, most people in the LISP newsgroup can't read your message
(which is MIME encoded). They probably use the EMACS Gnus newsreader,
which is a good newsreader, but doesn't support MIME encoded messages.
Peaceman
Not quite true. Even though there's no stack, the parameters still have
to be filled in. The difference is that this can be done by assigning to
fixed addresses rather than as an offset from the stack pointer. On early
computers, the performance difference might have been noticeable, but it
would be in the noise these days.
Also, the return address has to be saved somewhere, since a subroutine or
function can be called from multiple places. Again, the difference is that
there can be a fixed location for each procedure's return address
information, since procedures don't have to be reentrant. Many early CPU's
had an instruction that would store the old PC in the target address and
then jump to the address following it (on the PDP-8 this was the "JMS
<address>" instruction); a return would be done by doing an indirect jump
through the procedure's starting address (on the PDP-8, "JMP I <address>").
As recursive programming languages and "pure" text pages gained popularity,
this technique lost its utility.
The main advantage of the Fortran-77 model is that the compiler could
determine the total memory requirements of the program. With recursion
available, there's generally no way to determine the amount of memory that
will be needed for the stack. Again, this was much more important in the
early days, when memory was extremely limited.
--
Barry Margolin, bar...@bbnplanet.com
BBN Corporation, Cambridge, MA
Support the anti-spam movement; see <http://www.cauce.org/>
How about this:
(defun faa (x)
(cond
((numberp x) (+ x x))
(t (list x))
)
)
It's the same as what you had, except the close parentheses
are in the same column as the matching open parenthesis.
> Obviously you did not understand my point. If you knew Emacs the way you
> say you do then you would know that Emacs does proper indenting for you.
>
I'm not the foremost expert in Emacs, but I do know that your
statement is not entirely true. Whether or not Emacs does the proper
indenting for you depends on whether or not Emacs is set to that mode.
One could set up a .emacs file to set the editor to always stay in
standard text mode, and not ever in c or lisp mode.
Peaceman
> >Any decent LISP programmer, in fact any decent programmer, knows
> > that. Efficiency is part of programming. But it is not the be-all and
> It is true that LISP has some built in functions that allows
> a programmer to write less code. As far as speed is concerned, in almost
> every situation, the same program written in C would be faster than a
> similar program written in Lisp. Why? C is much closer to the machine-level
> assembly code that all computers run on.
Why should (+ 3 4) in Lisp be slower than 3 + 4 in C?
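For what it's worth, with ordinary declarations a Lisp compiler is free to
open-code such arithmetic. A minimal sketch (the exact machine code is of
course implementation-dependent):

(defun add (a b)
  (declare (fixnum a b)
           (optimize (speed 3) (safety 0)))
  (the fixnum (+ a b)))

;; (disassemble #'add) in a native-code Lisp typically shows a handful of
;; instructions, much like the corresponding C.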
> Many C compilers allow inline assembly
> language code within the program.
Many Lisp compilers too.
>
> As far as the size of the program is concerned, most of the time C
> programs are smaller? Why? Good Lisp programs only allow recursive code,
> without any stop codes, whereas good C programs allow for both recursive
> and iterative code.
What is a *good* Lisp program versus a *good* C program?
Both languages allow recursive and iterative implementations
of algorithms.
> Have you ever seen the quicksort algorithm written in Lisp?
> Even though it is a recursive function, it still needs well over
> 100 lines of code.
Only if a C programmer would format the code.
> Gareth McCaughan wrote:
> >
> > > > You seem to be implying that every recursive call gets
> recompiled.
> > >
> > > That is true in some languages, but not true in others.
> >
> > Name three languages that require all recursive function calls
> > to cause the function to be recompiled. In fact, name one.
> >
>
> Come to think of it, I can't remember any off hand. I
> don't quite remember if Basic did this.
>
> Anyway, I'm sure there are some language designs that don't
> use the stack when making calls to functions. They would expand them
> inline like a macro definition. When the code would finally be
> compiled, there would be recompilations of the function calls.
>
> Peaceman
> Igor, most people in the LISP newsgroup can't read your message
> (which is MIME encoded). They probably use the EMACS Gnus newsreader,
> which is a good newsreader, but doesn't support MIME encoded messages.
>
> Peaceman
It does.
> A jump instruction, in assembly would translate
> into the following:
>
> MOVE IP, address
>
> which is iterative.
Perhaps it would help if you could say exactly what you mean by
"iterative" as opposed to "recursive".
In a piece of assembly language that looks like this (no, it's not
the assembly language of any specific processor known to me; no,
it isn't a usual assembly-language-ish syntax)
func: compare r1,=2
branch-if-less skip:
push-address after1:
subtract r1,r1,=1
jump func: <---
after1: push-register r0
push-address after2:
subtract r1,r1,=1
jump func: <---
after2: pop-register r2
add r0,r0,r2
pop-program-counter
skip: move r0,r1
pop-program-counter
it seems to me that the jumps I've labelled with <--- are *recursive*.
If you argue that they aren't recursive because at the level of
machine instructions there somehow isn't any concept of "function",
then I can equally argue that they aren't iterative either because
at the level of machine instructions there similarly isn't any
concept of "loop".
If you insist on ignoring the higher-level abstractions which
explain what a given piece of machine code means, then sure, you
can refuse to regard the code I give above as a recursive function;
it's just a load of instructions that push and pop things from a
stack and jump around. But then I can refuse to recognise
loop: store r1,[r0,r1]
add r1,r1,=1
compare r1,=1000
branch-if-less loop:
as a loop, too: it's just a load of instructions that do arithmetic
and jump around.
In other words, your claim about what machine-level stuff does has
nothing whatever to do with the difference between iteration and
recursion.
--
Gareth McCaughan Dept. of Pure Mathematics & Mathematical Statistics,
gj...@dpmms.cam.ac.uk Cambridge University, England.
> I thought that Fortran77 didn't use the stack. I believe It also
> doesn't allow recursion without some programming trickery.
> Nick
No it did not (and still does not :-) Once a friend of mine was given
an assignment to implement a recursive algorithm in Fortran, so he had
to emulate the stack using several arrays (remember -- Fortran does
not have structures, either!)
Besides the simplicity of the implementation, there was one more
reason for not using the stack: the speed. Indeed, when C or Lisp makes
a function call, a lot of run-time work is done on the
stack, while in Fortran, a function call gets translated into a single
'jump' instruction.
--
Keep talking! /~~~~~~~~~~~~~~~
Dmitry Zinoviev / /~~~~~~~~/
/ `~~~~~~~'
_From the Other Side of the World ____________/ Long Island, NY
>
> William Paul Vrotney wrote:
> > For example, do you mean that you can see the beginning and end of a
> > section better with this
> >
> > (defun faa (x)
> > (cond ((numberp x) (+ x x)
> > )
> > (t (list x)
> > )
> > )
> > )
> >
> > than this?
> >
> > (defun faa (x)
> > (cond ((numberp x) (+ x x))
> > (t (list x))))
> >
>
> How about this:
> (defun faa (x)
> (cond
> ((numberp x) (+ x x))
> (t (list x))
> )
> )
>
>
> It's the same as what you had, except the close parentheses
> are in the same column as the matching open parenthesis.
OK, fine, but the question still stands: is it easier for you to see the
sections in
(defun faa (x)
(cond
((numberp x) (+ x x))
(t (list x))
)
)
Than in
(defun faa (x)
(cond
((numberp x) (+ x x))
(t (list x))))
? This is your claim.
>
> > Obviously you did not understand my point. If you knew Emacs the way you
> > say you do then you would know that Emacs does proper indenting for you.
> >
>
> I'm not the foremost expert in Emacs, but I do know that your
> statement is not entirely true. Whether or not Emacs does the proper
> indenting for you depends on whether or not Emacs is set to that mode.
> One could set up a .emacs file to set the editor to always stay in
> standard text mode, and not ever in c or lisp mode.
>
This statement doesn't support your argument much, for two reasons. One, you
want all of the leverage out of Emacs that you can get, so why would you
turn auto-indenting off? Emacs beginners sometimes do not know that they
can tailor the way Emacs does auto-indenting so that it auto-indents exactly
the way they like it. Two, language-specific modes do a lot more than just
auto-indenting. In C mode, moving back and forth over conditionals and
source-level debugging with a debugger like GDB are just two more examples.
--
William P. Vrotney - vro...@netcom.com
>Nicholas Arthur Ambrose wrote:
>> <snip>
>> I thought that Fortran77 didn't use the stack. I believe It also
>> doesn't allow recursion without some programming trickery.
>> Nick
>Richard O'Keefe may correct me on this, but I believe that the primary
>reason for the addition of the SAVE statement to Fortran 77 was to allow
>implementations to use stacks in a reliable manner. Technically I
>believe Fortran 66 could be implemented using stacks, but entities
>became undefined when they left their scope, so that local entities could
>not be used in a standard conforming manner to retain state. However,
>because most implementations did not use stacks, a significant body of
>code was written that assumed that local entities always retained their
>values, making it difficult, of course, for compilers to use stacks and
>still satisfy their customers. It is true, however, that stacks were not
>as useful in Fortran 77 as they were for languages with recursion or
>block scoping, let alone more explicitly stack based languages such as
>Forth or Pop.
I would phrase this differently.
The Fortran 66 specification was very carefully crafted to allow
overlays to work. (Overlays were extremely important at the time.)
In particular, it was important that when a subroutine (which might
never be called again) exited, any data that had been brought into
memory for its use should be allowed to vanish. So the rule was
that if there are no subprograms active that mention a particular
COMMON block, that COMMON block just plain doesn't exist, and if a
subprogram isn't active, it's local variables don't exist either
(so their values don't have to be written back to the disc when
the overlay containing that subprogram is dropped).
A consequence of this concern for overlays was that Fortran 66 could
be implemented using stacks throughout, and several Fortran 66 implementations
(notably the one from Burroughs for the B6700, a _very_ nice Fortran for its
day) did in fact do this. This potential was particularly important for
multithreaded use of Fortran. Burroughs Fortran was multithreaded back in
the '60s, and the PrimeOS operating system (or systems; did PrimeOS for the
P300 and PrimeOS for the 50 series have much in common?) was actually written
in Fortran.
The _problem_ was that a lot of other Fortran 66 implementations didn't,
notably the ones for the IBM mainframes. I have seen _really_ weird code
where you would call a subroutine passing 20 parameters, once, and then
repeatedly call an entry point of the subroutine, passing only one or two
parameters. People were expecting the old parameters to retain the value
they had last time! Many Fortran programmers never bothered to read (the
really rather clear and useful) Fortran standard, and were unaware that
their code rather flagrantly failed to conform in this area, just as many
Fortran programmers assumed that 'one-trip' loops were required by the
standard (which had carefully avoided saying anything about them).
So yes, SAVE was added so that old broken code could be repaired and made
to work in stack-oriented implementations. (Like the UNIX Fortran compilers
I use from time to time.)
>As to recursion, I don't know anyone who wrote standard conforming
>recursive Fortran 77 procedures,
They couldn't. A *program* that conforms to the Fortran 66 or Fortran 77
standards *must not* use recursion. However, an *implementation* that
conforms to those standards is not required to diagnose it as an error and
may implement recursion. Several Fortran 66 implementations did, and many
Fortran 77 implementations do. (Like those UNIX Fortran compilers.)
Fortran 90 _requires_ support for recursion, and one level of block scope.
From: Sajid Ahmed the Peaceman <peac...@capital.net>
Newsgroups: comp.lang.lisp,comp.programming,comp.lang.c++
Date: Wed, 23 Jul 1997 13:17:56 -0400
Organization: CADVision Development Corp.
? the platypus {aka David Formosa} wrote:
>
> >(defun high (x i h) ; select the high elts of x onto h.
> > (if
> > (null x) h
> > (if
> > (< (car x) i)
> > (high
> > (cdr x)
> > i
> > h
> > )
> [...Rest of the code where each open bracket makes a new line...]
>
To each their own. I find this style a lot easier to read than putting
all the end parentheses on one line. Sure, it takes up more lines, but
it is a lot easier to group together the commands in the specified
level.
I suppose you'd write the same in C as
int
high (int x[], int c, int i, int h)
{
if
(c == 0)
if
(x[c] < i)
high(x,
c - 1,
i,
h)
...
}
Give me a break!
--
Marco Antoniotti
==============================================================================
California Path Program - UC Berkeley
Richmond Field Station
tel. +1 - 510 - 231 9472
> Therefore, I think that proper handling of tail-recursion, like lambda, is
> yet-another-killer-Lisp technique that the masses are doomed to remain
> ignorant of (unless there is a sudden resurgence of Lisp popularity).
> <sigh>
Have a look at anonymous functions in Pizza, a Java superset:
<http://www.cis.unisa.edu.au/~pizza/>.
Michael
--
Michael Schuerig Although condemned by moralist,
mailto:uzs...@uni-bonn.de lying can have high survival-value.
http://www.uni-bonn.de/~uzs90z/ -Richard L. Gregory
This is back to front. On modern machines, and even on the machines that
were current when F77 was finalised, putting variables on the stack is/was
as fast as OR FASTER THAN making them global. "in Fortran" a function
call gets translated into whatever the compiler wants to translate it to.
The one thing it _can't_ be is just a jump, because it does have to figure
out how to come back!
To make this really obvious, consider the fact that you would like to
bind local variables to registers for speed. The rules of Fortran 66
and Fortran 77 (in the absence of a SAVE statement) say that you
_don't_ have to load the register from memory on entry and you _don't_
have to store it back on exit. That's a load and a store saved per
variable.
Also consider the IBM 360. On that machine, the hardware does not do
absolute addressing. If you want to refer to a statically allocated
variable, you have to have a base register pointing nearby. And
the instruction format only lets you have an offset of 0..4k from that
base register. Anything you can allocate on the stack reduces the
demand for extra base registers in _all_ the subprograms. Modern
machines have bigger offsets in their addresses, but not a _lot_ bigger.
> Reginald S. Perry wrote:
> >
> > Sajid Ahmed the Peaceman <peac...@capital.net> writes:
> >
> > > Branches a jumps are too iterative.
> > >
> > > You agree that for next loops are iterative, right?
> > > How about if statements? They both involve branches
> > > and jumps.
> > >
> >
> > But, a branch or jump can be called iterative ONLY IF the branch or
> > jump takes you to a place where you will be traversing the same
> > section of code each time. BUT, in pure assembly language you can
> > write code which while it may jump to the same label, could be
> > traversing different code. One way to do this is by making space in
> > your data section where you will write the raw machine instructions
> > based on whatever conditions you want. Another thing you can do which
> > was used a lot in the 70s is set up what is called a jump table where
> > the code looks iterative in that you are jumping to the same area but
> > where in that area you jump depends on the value in a register. Things
> > like radix conversion used this technique.
> >
>
> A jump instruction, in assembly would translate
> into the following:
>
>
> MOVE IP, address
>
>
> which is iterative.
>
Well this is interesting. How in the world is a move iterative? How is
iterative defined in your world?
If you want to convince me, you will have to:
1) define iteration
2) define recursion
3) define your assembly pseudo-language subset. this subset has to be
rich enough to describe computations that can be done today on modern
machines.
4) define the iterative operation in this language according to your
definition of iterative.
5) define the recursive operation in this language according to your
definition of recursive.
6) Show how the operations in 5) are equivalent to the operations in 4)
7) show that for every operation one can describe in your language,
they all map in to some equivalent of 4)
Do this, and I will be convinced. I would advise you think carefully
before you start.
-Reggie
One of my least favorite programmer arguments concerns which indenting
style is "best." For some reason, these usually smart people seem to
explode violently when presented with code that doesn't meet their
personal indenting style.
This usually results in someone in the org mandating a particular style,
which produces months of endless bickering, and eventually produces some
kind of compromise standard that is both neither fish nor fowl and
resoundingly ignored.
Luckily, many programmers eventually realize that there is a pretty-print
program ("cb" for the unix heads), and it can be set to produce the exact
style of indenting that is preferred.
What makes this my least favorite is that the argument itself is pointless
-- no amount of moving around the brackets is going to change the
resulting machine code. (And if it does, you should shoot your compiler
vendor, because one of the few STANDARDS of C or Lisp is that whitespace
is irrelevant.)
What makes the argument especially pointless is the fact that the need to
perform a particular indentation has vanished over the years, because C
code editors are finally starting to catch up to Lisp in functionality.
For many years, Lisp editors have provided functions such as "indent based
on semantics" (ie. NOT just "indent the same as previous line, which is
lame), "flash matching bracket" and "highlight the entire bracketed
expression" without even cracking a sweat. Most of them have also
supported "find the definition of this function I'm pointing at," "show me
the documentation for this system call", and "show me the arguments to the
function whose name I've just typed"
Now that C editors are doing some of these things (hurray!) there really
is no need to have the close brackets on separate lines. However, it's
still a matter of PERSONAL TASTE.
Remember what the philosophers said: "We demand rigidly defined areas of
doubt and uncertainty." I think indentation style should be one of them.
>jos...@lavielle.com (Rainer Joswig) wrote:
>| > (defun faa (x)
>| > (cond
>| > ((numberp x) (+ x x))
>| > (t (list x))
>| > )
>| > )
>| >
>| >
>| > It's the same as what you had, except the close paranthesis
>| > are in the same line as the matching open parenthesis.
>|
>| This is bad style. You are wasting white space for nothing.
>| Lisp programmer usually don't
>| care about closing parentheses. Indentation is more important.
>| Both indentation and parenthesis matching/counting does
>| the Lisp environment (Emacs, Genera, whatever, ...)
>| for you.
Here's a great example of how two coding methods are right,
each for the people concerned.
I was coding in Dataflex. My boss was also. It uses goto
labels etc. All my labels were symbolic names like
"CalculateValue" or "ShowResult". His labels were all numeric,
"L0001", "L0002". For the life of me, I couldn't figure out why
he did this.
Then it dawned on me. He used a lot of PAPER PRINTOUTS when
looking over code. The numeric labels made sense because they
were ordered down the page. He didn't use an editor much.
I never use paper. For me, the symbolic labels made more
sense because I could use my editor to search.
Walt Howard
From: Sajid Ahmed the Peaceman <peac...@capital.net>
Newsgroups: comp.lang.lisp,comp.programming,comp.lang.c++
Date: Wed, 23 Jul 1997 13:14:21 -0400
Organization: CADVision Development Corp.
Reply-To: peac...@capital.net
Lines: 26
NNTP-Posting-Host: 199.185.6.241
Mime-Version: 1.0
Content-Type: text/plain; charset=us-ascii
Content-Transfer-Encoding: 7bit
X-Mailer: Mozilla 3.01 (WinNT; I)
Xref: agate comp.lang.lisp:29514 comp.programming:52870 comp.lang.c++:283436
>
> > Anyway, I'm sure there are some language designs that don't
> >use the stack when making calls to functions. They would expand them
> >inline like a macro definition.
>
> In such a languge it would be inpossable to write recursive code.
You could certainly write recursive code, as long as the
number of times the function calls itself is set at compile time.
Errare humanum est, perseverare diabolicum. (If my Latin does not
fail me). :)
Mr. the Peaceman, do you have the slightest idea of what you are
talking about?
I still have to see the assembly code for the C tree traversal from
you. But apart from that, if a function is tail-recursive (assuming
you grasped the concept by now) what you are usually interested in, is
that the algorithm is provably terminating (and no! you can't answer
that question in its full generality). In that case the function runs
in constant space, since it is translated into a loop (if the compiler
is smart enough as most Lisp compilers are, contrary to many C/C++
ones). If the function is inherently recursive (prove they do not
exist, if you can), then the limit is the amount of memory of your
computer or some configuration parameter of the language run time
environment.
> I don't beleave that such a languge exists
> There out there. New programming languages are created everyday.
> That's what YACC is for.
A better way to experiment with new language semantics and constructs
is, of course, to extend CL or Scheme. :)
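For instance (a minimal sketch, nothing more), a new control construct in
CL is just an ordinary macro; no parser generator is involved:

(defmacro while (test &body body)
  "Evaluate BODY repeatedly for as long as TEST is true."
  `(do () ((not ,test)) ,@body))

;; (let ((i 0)) (while (< i 3) (print i) (incf i)))   prints 0 1 2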
>as the size of the
> exicutables would be so massive as to be useless.
>
> Most programming code out there (about 96%) is
> nonrecursive. You've been programming is lisp too long.
I'd like to see the reason for the 96% figure (why not 94% or 98%?)
And no, unfortunately I am programming in pure C these days. It ain't
funny. However, your arguments do not have *anything whatsoever* to
do with Lisp. They concern good programming techniques.
I will never buy anything from CADVision Development Corp.
Cheers
> (defun faa (x)
> (cond
> ((numberp x) (+ x x))
> (t (list x))
> )
> )
>
>
> It's the same as what you had, except the close paranthesis
> are in the same line as the matching open parenthesis.
This is bad style. You are wasting white space for nothing.
Lisp programmer usually don't
care about closing parentheses. Indentation is more important.
Both indentation and parenthesis matching/counting does
the Lisp environment (Emacs, Genera, whatever, ...)
for you.
> statement is not entirely true. Whether or not Emacs does the proper
> indenting for you depends on whether or not Emacs is set to that mode.
This is no difficulty.
> One could set up a .emacs file to set the editor to always stay in
> standard text mode, and not ever in c or lisp mode.
You can set up the .emacs file so that you can edit anything at all.
You can configure your car so that it doesn't drive - so what?
>jos...@lavielle.com (Rainer Joswig) wrote:
>| > (defun faa (x)
>| > (cond
>| > ((numberp x) (+ x x))
>| > (t (list x))
>| > )
>| > )
>| >
>| >
>| > It's the same as what you had, except the close paranthesis
>| > are in the same line as the matching open parenthesis.
>|
>| This is bad style. You are wasting white space for nothing.
>| Lisp programmer usually don't
>| care about closing parentheses. Indentation is more important.
>| Both indentation and parenthesis matching/counting does
>| the Lisp environment (Emacs, Genera, whatever, ...)
>| for you.
>
>One of my least favorite programmer arguments concerns which indenting
>style is "best." For some reason, these usually smart people seem to
>explode violently when presented with code that doesn't meet their
>personal indenting style.
It sucks, but it does affect readability. Programmers can more
quickly assimilate, i.e., find and fix bugs, in code they can read
more easily.
I do believe no one has the right to "explode" over someone
else's style. That shows a marked lack of experience and/or
overblown self centeredness.
The real problem is that people who don't indent like this:
int function( int parameter )
{
for( int i = 0; i<10; i++)
{
// do something
}
}
Are brain damaged. That's all I have to say about that.
>This usually results in someone in the org mandating a particular style,
>which produces months of endless bickering, and eventually produces some
>kind of compromise standard that is both neither fish nor foul and
>resoundingly ignored.
Yep.
>Luckily, many programmers eventually realize that there is a pretty-print
>program ("cb" for the unix heads), and it can be set to produce the exact
>style of indenting that is preferred.
You can't do this in a large project. The reason is, you
check this code in. Often it becomes necessary to "diff" two
versions to see "what change made it break". If someone
reformatted the whole thing and checked it in, it's impossible to
tell what REAL changes were made between those two versions.
>What makes the argument especially pointless is the fact that the need to
>perform a particular indentation has vanished over the years, because C
>code editors are finally starting to catch up to Lisp in functionality.
Well, emacs ( an all around editor which includes a C mode
and a C++ mode ) has been doing that forever.
Walt Howard
> What makes this my least favorite is that the argument itself is pointless
> -- no amount of moving around the brackets is going to change the
> resulting machine code. (And if it does, you should shoot your compiler
> vendor, because one of the few STANDARDS of C or Lisp is that whitespace
> is irrelevant.)
Tut. It's perfectly legal for a compiler to produce different
machine code according to the formatting of the input, provided
the machine code does the same work. So it would in fact be
legal for (say) Microsoft's C compilers to spot that code is
indented according to the GNU coding standards, and insert
lots of delay loops. :-)
> I was coding in Dataflex. My boss was also. It uses goto
> labels etc. All my labels were symbolic names like
> "CalculateValue" or "ShowResult". His labels were all numeric,
> "L0001", "L0002". For the life of me, I couldn't figure out why
> he did this.
>
> Then it dawned on me. He used a lot of PAPER PRINTOUTS when
> looking over code. The numeric labels made sense because they
> were ordered down the page. He didn't use an editor much.
>
> I never use paper. For me, the symbolic labels made more
> sense because I could use my editor to search.
So why didn't your boss use labels like L001_CalculateValue
and L002_ShowResult?
>Walt Howard wrote:
>
>> I was coding in Dataflex. My boss was also. It uses goto
>> labels etc. All my labels were symbolic names like
>> "CalculateValue" or "ShowResult". His labels were all numeric,
>> "L0001", "L0002". For the life of me, I couldn't figure out why
>> he did this.
>>
>> Then it dawned on me. He used a lot of PAPER PRINTOUTS when
>> looking over code. The numeric labels made sense because they
>> were ordered down the page. He didn't use an editor much.
>>
>> I never use paper. For me, the symbolic labels made more
>> sense because I could use my editor to search.
>
>So why didn't your boss use labels like L001_CalculateValue
>and L002_ShowResult?
>
Because he was a total dip, why else?
Walt Howard
> 2) define recursion
From the Maclisp manual:
"Recursion. See recursion."
[...]
>> >They would expand them [...functions...] inline like a macro definition.
>>
>> In such a languge it would be inpossable to write recursive code.
> You could certainly write recursive code, as long as the
>number of times the function calls itself is set at compile time.
In most cases where you use recursion you don't know how many
times you will repeat.
[...]
>> as the size of the
>> exicutables would be so massive as to be useless.
>>
> Most programming code out there (about 96%) is
>nonrecursive. You've been programming is lisp too long.
OK, let's imagine a module gets called 10 or 12 times: you
have now got 10 or 12 times the module size in extra code in your
program.
--
Please excuse my spelling as I suffer from agraphia see the url in my header.
Never trust a country with more peaple then sheep. Buy easter bilbies.
Save the ABC Is $0.08 per day too much to pay? ex-net.scum and proud
I'm sorry but I just don't consider 'because its yucky' a convincing argument
I'm afraid I disagree. I haven't worked on any large projects, merely
medium ones (20-30 programmers), but I've never found other people's
indentation styles to severely hamper my work. In fact, someone's style
can be a useful signature to see who wrote what. :)
I hate everyone's indentation style but my own. The problem with a
standard is that many if not most people are forced to use a style they
don't like and aren't comfortable with. This is like forcing everyone to
use the same editor. Is consistency in this area (and I invoke Emerson
here) important enough to irritate people? Are there any practical
reasons for indentation standards--i.e. situations where productivity
may suffer without them, and considering that productivity may also
suffer with them?
In article <33D8F4...@capital.net>,
Sajid Ahmed the Peaceman <peac...@capital.net> wrote:
>Marco Antoniotti wrote:
>>
>> Mr. the Peaceman, do you have the slightest idea of what you are
>> talking about?
>
> I know exactly what I'm talking about, thank you.
Well, if you know you're talking nonsense, then why should anyone
take you seriously?
>> I still have to see the assembly code for the C tree traversal from
>> you. But apart from that, if a function is tail-recursive (assuming
>> you grasped the concept by now) what you are usually interested in, is
>> that the algorithm is provably terminating (and no! you can't answer
>> that question in its full generality). In that case the function runs
>> in constant space, since it is translated into a loop (if the compiler
>> is smart enough as most Lisp compilers are, contrary to many C/C++
>> ones). If the function is inherently recursive (prove they do not
>> exist, if you can), then the limit is the amount of memory of your
>> computer or some configuration parameter of the language run time
>> environment.
>
> That's not the point. The point is, why make the functions
>tail recursive when simple iteration is good enough? It's a waste
>of time.
Whoah! Earth to Peaceman. What do you think people have been
talking about for the past few days. You started this off by
claiming that:
(Dated 7/17)
Sajid Ahmed the Peaceman wrote in <33CE58...@capital.net>:
> Anyway, all lisp programs, as well as the compilers and
> interpreters are broken down into assembly level code, which is
> iterative. The thing I have a problem with is with people trying
> to write programs that are completely recursive, which is what lisp
> is about. That is the wrong way to go about it. It's a tremendous
> waste.
People have been discussing your first claim, that all assembly
level code is "iterative," which you have yet to define in a
meaningful way.
If I'm reading you correctly, when you say "That's not the point."
up above, you are tacitly admitting defeat on this point, but
aren't willing to actually state this, so instead you change the
subject over to the second point.
> Just write the iterative code. It's faster and more efficient.
Faster to write? That depends completely on what you're used to.
More efficient? No, because tail recursion is *equivalent* to
iteration.
If you're arguing that you can write iterative code faster than you
can write tail-recursive code, than I'll just have to take you at
your word. Just don't think you've shown anything more significant
than that.
> That's where my gripe in Lisp comes about. It's a
>programming language that compiles code into simpler assembly
>language machine code. It's not Mathematics, where the results
>of functions are instantaneously there. There is no need to
>write tail recursive functions, when simple iterative code will do.
Here we go with that second point, the "I don't like Lisp because
my mean ol' Lisp instructor made me write everything recursively"
argument. Hello? Have you been listening?
First of all, I'd like to see a single programming language that
doesn't compile code into "simpler assembly language machine code."
That's the whole point of a compiler. And of course it's not
Mathematics, no one claimed it was. Really, where do you get
these things?
If you prefer "simple iteration" to tail recursion, you are arguing
mere style preferences, because, as many others have pointed out,
tail recursion is equivalent to iteration.
In fact, you are arguing trivial style preferences, because in CL
you are perfectly free to write your iteration using any of the
"do" family of iterative constructs or the "loop" macro.
Satisfied? I didn't think so.
Sigh. I have to admit that was kind of fun; you're almost as bad
as the "Relativity is wrong because I don't like it" folks over
on sci.physics.
- Johann
P.S. From now on, I'll be good. Really. No more responding to
flame bait. I should really find more things to do while my
computation is running...
--
Johann A. Hibschman | Grad student in Physics, working in Astronomy.
joh...@physics.berkeley.edu | Probing pulsar pair production processes.
> In article <33D65B66...@interdyn.com> Nicholas Arthur Ambrose <ni...@interdyn.com> writes:
>
> > I thought that Fortran77 didn't use the stack. I believe It also
> > doesn't allow recursion without some programming trickery.
> > Nick
>
> No it did not (and still does not :-) Once a friend of mine was given
> an assignment to implement a recursive algorithm in Fortran, so he had
> to emulate the stack using several arrays (remember -- Fortran does
> not have structures, either!)
>
> Besides the sipmlicity of the implementation, there was one more
> reason for not using the stack: the speed. Indeed, when C or Lisp make
> a function call, a lot of run-time job is being done on the
> stack, while in Fortran, a function call gets translated into a single
> 'jump' instruction.
This can't be true in general. Where do the procedure arguments go if
you have more of them than can be stored in registers? On the stack,
right?
-Reggie
-------------------
Reginald S. Perry e-mail: pe...@zso.dec.com
Digital Equipment Corporation
Performance Manager Group
http://www.UNIX.digital.com/unix/sysman/perf_mgr/
The train to Success makes many stops in the state of Failure.
I do understand your point -- that "f oo" is different than "foo " -- I
should have said "variances in non-zero amounts of whitespace are
considered irrelevant." But this point is pretty minor, no?
| Yes and no. Consensus on such matter is extremely important in an
| organization, especially when you are not just producing one=shot
| programs, but systems which will have to be maintained for a long time
| to come.
Yes, and no. Having *a* coding standard is a Good Thing, without doubt,
because it improves programmer efficiency by some small, but reasonable,
amount.
The problem is that standards are almost always embroiled in bitter
argument and bickering. If everyone could agree: code however you like,
but "checkin" WILL run "cb" (with our company's "official standards
module" attached) -- hey, that'd be great, wouldn't it?
I frankly couldn't care THAT much about the particular standard. Even if
it offends mine eye, I can cope.
I reject categorically that one coding standard is "intrinsically winning"
and that all others are "obviously braindamaged."
> 2) define recursion
From the Maclisp manual:
"Recursion. See recursion."
"Iteration: go to 'Iteration'" :)
From: b...@wetware.com (Fred Haineux)
Newsgroups: comp.lang.lisp,comp.programming,comp.lang.c++
Date: Thu, 24 Jul 1997 14:49:56 -0700
Organization: Castle Wetware Internet Services, INC.
Lines: 57
...
> One of my least favorite programmer arguments concerns which indenting
> style is "best." For some reason, these usually smart people seem to
> explode violently when presented with code that doesn't meet their
> personal indenting style.
I find myself objecting to "inconsistent style". I personally follow
the GNU Coding standards because they are reasonable and because they are
well supported by Emacs.
> This usually results in someone in the org mandating a particular style,
> which produces months of endless bickering, and eventually produces some
> kind of compromise standard that is both neither fish nor foul and
> resoundingly ignored.
I have been actively campaigning in my group to enforce the GNU Coding
standards. This has led to some bickering, but it is a necessary evil
(again, not because the GNU coding standards are bad, but just because
they are a "coding standard").
> Luckily, many programmers eventually realize that there is a pretty-print
> program ("cb" for the unix heads), and it can be set to produce the exact
> style of indenting that is preferred.
Emacs is the program that should be used to do the indentation in the
first place :)
> What makes this my least favorite is that the argument itself is pointless
> -- no amount of moving around the brackets is going to change the
> resulting machine code. (And if it does, you should shoot your compiler
> vendor, because one of the few STANDARDS of C or Lisp is that whitespace
> is irrelevant.)
And here we see that you have written very little Lisp recently.
whitespaces (single ones at least) are all-important in Lisp. :)
> What makes the argument especially pointless is the fact that the need to
> perform a particular indentation has vanished over the years, because C
> code editors are finally starting to catch up to Lisp in
> functionality.
Emacs has had C and C++ modes (written in Emacs Lisp - of course) for
at least a decade.
...
> Remember what the philosophers said: "We demand rigidly defined areas of
> doubt and uncertainty." I think indentation style should be one of them.
Yes and no. Consensus on such matter is extremely important in an
organization, especially when you are not just producing one-shot
programs, but systems which will have to be maintained for a long time
to come.
Cheers
I know exactly what I'm talking about, thank you.
> I still have to see the assembly code for the C tree traversal from
> you. But apart from that, if a function is tail-recursive (assuming
> you grasped the concept by now) what you are usually interested in, is
> that the algorithm is provably terminating (and no! you can't answer
> that question in its full generality). In that case the function runs
> in constant space, since it is translated into a loop (if the compiler
> is smart enough as most Lisp compilers are, contrary to many C/C++
> ones). If the function is inherently recursive (prove they do not
> exist, if you can), then the limit is the amount of memory of your
> computer or some configuration parameter of the language run time
> environment.
>
That's not the point. The point is, why make the functions
tail recursive when simple iteration is good enough? It's a waste
of time.
Just write the iterative code. It's faster and more efficient.
That's where my gripe in Lisp comes about. It's a
programming language that compiles code into simpler assembly
language machine code. It's not Mathematics, where the results
of functions are instantaneously there. There is no need to
write tail recursive functions, when simple iterative code will do.
Peaceman
>
> Sajid Ahmed the Peaceman <peac...@capital.net> writes:
>
>
> Well this is interesting. How in the world is a move iterative? How is
> iterative defined in your world?
>
> If you want to convince me, you will have to:
>
> 1) define iteration
>
> 2) define recursion
>
> 3) define your assembly pseudo-language subset. this subset has to be
> rich enough to describe computations that can be done today on modern
> machines.
>
> 4) define the iterative operation in this language according to your
> definition of iterative.
>
> 5) define the recursive operation in this language according to your
> definition of recursive.
>
> 6) Show how the operations in 5) are equivalent to the operations in 4)
>
> 7) show that for every operation one can describe in your language,
> they all map in to some equivalent of 4)
>
> Do this, and I will be convinced. I would advise you think carefully
> before you start.
>
> -Reggie
Look, I'm not going to get into a debate
about whether assembly language is iterative or recursive.
The main point I had was about Lisp code, and the postgrad
curriculum in comp sci. I can admit that I was wrong in the
semantics of the language, but I'm not going to change my
views on the way computer science is taught at the postgraduate
level. It's computer science, based on computer processors.
It's not mathematics. It is a function on a computer, translated into
lower level assembly language code run one instruction at a time, not a
function in an abstract mathematical world.
You may think I have a problem with mathematics. I don't,
as long as there are practical applications in the real physical
world. The problem with these abstract mathematical functions,
translated into lisp, is that in most instances they don't have
any practical use. It's mathematics just for the sake of mathematics.
That is a complete waste of time.
Peaceman
Long live EMACS! If your EMACS doesn't do something, just download the
latest version. Any new thing that comes around, you can bet that
someone, somewhere is writing an e-lisp extension for EMACS.
EMACS - it's not just a program, it's a lifestyle!
--
Tyson Jensen
Mosby Consumer Health w: (801)-464-6217
tje...@mosbych1.com h: (801)-461-4687
ty...@inconnect.com
* Walt Howard
| You can't do this in a large project. The reason is, you check this code
| in. Often it becomes necessary to "diff" two versions to see "what
| change made it break". If someone reformatted the whole thing and
| checked it in, its impossible to tell what REAL changes were made between
| those two versions.
it occurred to me that the problem is that "diff" is run on the text
representation of the code. had it been possible to "diff" the _code_, each
programmer could check his code in and out in his own textual style.
#\Erik
--
Thomas J. Watson, founder of IBM: "man shall think. machines shall work."
William H. Gates, III, Microsoft: "man shall work. machines shall not."
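As a rough sketch of what diffing the _code_ rather than the text might
look like in Lisp (a minimal sketch; the function names are invented, and
note that READ also throws away comments, which a real tool would have to
deal with):

(defun read-forms (pathname)
  "Return the list of top-level forms in PATHNAME."
  (with-open-file (in pathname)
    (loop for form = (read in nil in)   ; the stream object marks end of file
          until (eq form in)
          collect form)))

(defun code-equal-p (file-a file-b)
  "True when both files contain the same forms, whatever the layout."
  (equal (read-forms file-a) (read-forms file-b)))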
> I know exactly what I'm talking about, thank you.
You don't give that impression. Just the opposite, in fact.
> That's not the point. The point is, why make the functions
> tail recursive when simple iteration is good enough? It's a waste
> of time.
> Just write the iterative code. It's faster and more efficient.
Is it? I've not seen any evidence for that claim. Are you perhaps
using an example with a poor implementation of Lisp, or even a
compiler for another language that doesn't support recursion well?
You might as well "prove" that the Earth is flat, by only looking at a
small piece of road that is flat. More realistic examples can also be
found.
> That's where my gripe in Lisp comes about. It's a
> programming language that compiles code into simpler assembly
> language machine code. It's not Mathematics, where the results
> of functions are instantaneously there. There is no need to
> write tail recursive functions, when simple iterative code will do.
Why use iteration when simple recursion will do? If you want to argue
that programming at a lower level is always best, then you should be
using assembly language for everything. Is this in fact what you do?
If a programmer choses a particular language, their reasons for doing
so need not depend on things like Mathematics. Recursion may be only one
of many possible reasons for making a choice.
Note that Lisp doesn't depend on recursion. _You_ may think it does,
but you insist on demonstrating your ignorance of Lisp. I strongly
recommend that you go away and read a few of the Lisp tutorials in the
Lisp FAQ (you did read the FAQ, didn't you?), and only then should you
try to tell people what Lisp can or cannot do.
Assertions based on ignorance can be called mistruths. A less generous
way of describing them would be as lies. Yes, I'm calling you a liar.
--
<URL:http://www.wildcard.demon.co.uk/> You can never browse enough
"My body falls on you like a dead horse" -- Portait Chinois
Please note: my email address is gubbish
> If you prefer "simple iteration" to tail recursion, you are arguing
> mere style preferences, because, as many others have pointed out,
> tail recursion is equivalent to iteration.
Alas, some people don't know how little they know. I see this happen
in the context of compiler theory all the time. To be fair, most
people don't have the time to study compilers. However, when they
claim to know all about compilers, in spite of their ignorance, I find
it harder to forgive them.
Sajid Ahmed the Peaceman should by now be just starting to realise the
true depth of his ignorance in this area. Strangely, his posts seem
not to reflect this. I'm not sure what to conclude from this. It would
be tempting to just write him off as an idiot, but I've seen enough
clueless C++ programmers on UseNet behave differently, when pointed
in the right direction, to suspect that Sajid Ahmed the Peaceman's
motives are not based on a desire for enlightenment. Instead, I wonder
if he's perhaps wishing to spread his ignorance?
He's certainly spreading his clueless memes to one or two newsgroups.
Fortunately, anyone reading this can check the facts for themselves.
A good place to start is with the Lisp FAQ:
<URL:http://www.cs.cmu.edu/afs/cs.cmu.edu/project/ai-repository/ai/html/faqs/lang/lisp/top.html>
This is an implementation decision. There is no problem allocating local
variables on the stack as long as they need not preserve their values
between invocations of a function
>> Besides the sipmlicity of the implementation, there was one more
>> reason for not using the stack: the speed. Indeed, when C or Lisp make
Depends on the way addressing works on the target hardware. There are
quite a few machines around where a stack would actually be faster than
static storage.
>> a function call, a lot of run-time job is being done on the
>> stack, while in Fortran, a function call gets translated into a single
>> 'jump' instruction.
>
>This can't be true in general. Where do the procedure arguments go if
>you have greater than what can be stored in registers? On the stack
>right?
You can assign static storage for the function parameters and the caller
copies them there (Not that I would recommend it, but it is possible)
Hartmann Schaffer
In case you didn't get it yet, the machine code for iteration and tail
recursion are indistinguishable.
Hartmann Schaffer
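A minimal Common Lisp sketch of that claim (function names invented;
whether the recursive call really becomes a jump depends on the
implementation, since the standard does not require tail-call elimination):

(defun sum-to-r (n acc)
  "Tail-recursive: the call to SUM-TO-R is the last thing done."
  (if (zerop n)
      acc
      (sum-to-r (1- n) (+ acc n))))

(defun sum-to-i (n)
  "The same computation written as an explicit DO loop."
  (do ((i n (1- i))
       (acc 0 (+ acc i)))
      ((zerop i) acc)))

;; In a compiler that optimizes tail calls, DISASSEMBLE shows both
;; compiling to a loop: the recursive call has become a backward jump.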
>
> My body fell on Sajid Ahmed the Peaceman like a dead horse,
> who then wheezed these wise words:
>
Sajid Ahmed the Peaceman then threw the dead horse
off of himself, and went on with his business.
> > I know exactly what I'm talking about, thank you.
>
> You don't give that impression. Just the opposite, in fact.
>
Sorry to burst your bubble. I'm not trying to impress
you or anybody else.
> > That's not the point. The point is, why make the functions
> > tail recursive when simple iteration is good enough? It's a waste
> > of time.
> > Just write the iterative code. It's faster and more efficient.
>
> Is it? I've not seen any evidence for that claim. Are you perhaps
> using an example with a poor implementation of Lisp, or even a
> compiler for another language that doesn't support resursion well?
Using recursive functions on a computer involves manipulating
a stack. Using iterative statements does not. QED.
>
> You might as well "prove" that the Earth is flat, by only looking at a
> small piece of road that is flat.
What's that have to do with anything?
Like I said in the post that you're replying to, my gripe
is towards programmers (Lisp as well as others) living in their
own fantasy abstract mathematical world. It's time to accept
reality.
Peaceman
> Sorry to burst your bubble. I'm not trying to impress
> you or anybody else.
I didn't think you were trying to impress anyone. Oh no.
> Using recursive functions on a computer involves manipulating
> a stack. Using iterative statements does not. QED.
Really? How is it, then, that I can write tail recursive functions in
C, and when compiled with a C compiler that optimises tail recursion,
the resulting code reuses the activation record during a tail call?
You can also write recursive descent parsers in a wide variety of
languages, including C. I first did this more than 10 years ago. Have
you tried it yet?
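For the record, a recursive-descent parser can be only a few lines. Here
is a toy Common Lisp sketch (invented bracket grammar, purely for
illustration) for expr := NUMBER | "[" expr "+" expr "]"; each call
returns the parse tree and the unconsumed tokens:

(defun parse-expr (tokens)
  (let ((tok (first tokens)))
    (cond ((numberp tok)
           (values tok (rest tokens)))
          ((eq tok '[)
           (multiple-value-bind (left rest1) (parse-expr (rest tokens))
             (assert (eq (first rest1) '+))
             (multiple-value-bind (right rest2) (parse-expr (rest rest1))
               (assert (eq (first rest2) ']))
               (values (list '+ left right) (rest rest2)))))
          (t (error "Unexpected token: ~S" tok)))))

;; (parse-expr '([ 1 + [ 2 + 3 ] ]))  =>  (+ 1 (+ 2 3)), NIL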
> > You might as well "prove" that the Earth is flat, by only looking at a
> > small piece of road that is flat.
>
> What's that have to do with anything?
See my above points about recursion.
> Like I said in the post that your replying to, my gripe
> is towards programmers (Lisp as well as others) living in their
> own fantasy abstract mathematical world. It's time to accept
> reality.
So recursive descent parsers have no practical value? Could it be that
they use abstract mathematical ideas that you object to? Say bye bye
to all the world's compilers. Sajid Ahmed the Peaceman has declared
them to be unnecessary applications of abstract mathematics.
You can program in all kinds of languages without going into any more
abstract mathematics than, say, ANSI C. Perhaps you just had a poor
teacher who gave you the impression that Lisp is way too complex for
you? If so, then find a better teacher. The Lisp FAQ can recommend a
number of excellent books. A deeper understanding can come later.
Don't blame the language, friend. Blame your education. Some people
get put off algebra for the same reasons. It's not too late to correct
the damage. Stop attacking something because you don't understand it.
Someone with your lack of understanding is either ignorant or stupid.
Now, ignorance can be cured with education. Are you willing to learn?
scha...@wat.hookup.net wrote in article <5rdea4$l8g$1...@nic.wat.hookup.net>...
>In <33D8F4...@capital.net>, Sajid Ahmed the Peaceman <peaceman@capital.net> writes:
>> ...
>> That's not the point. The point is, why make the functions
>>tail recursive when simple iteration is good enough? It's a waste
>>of time.
>> Just write the iterative code. It's faster and more efficient.
>>
>> That's where my gripe in Lisp comes about. It's a
>>programming language that compiles code into simpler assembly
>>language machine code. It's not Mathematics, where the results
>>of functions are instantaneously there. There is no need to
>>write tail recursive functions, when simple iterative code will do.
>
>In case you didn't get it yet, the machine code for iteration and tail
>recursion are indistinguishable.
>
>Hartmann Schaffer
And as Guy Steele pointed out, the machine code for general recursion
and iteration are indistinguishable. It's the subproblems that push the
stack, not the recursive/iterative calls.
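A minimal sketch of that distinction (invented example): in the tree walk
below, the inner call is a genuine subproblem whose result is still
needed, so it keeps a frame alive, while the outer call is in tail
position and can be compiled as a jump:

(defun count-atoms (tree acc)
  (cond ((null tree) acc)
        ((atom tree) (+ acc 1))
        (t (count-atoms (cdr tree)                        ; tail call
                        (count-atoms (car tree) acc)))))  ; subproblem

;; (count-atoms '((a b) (c (d e))) 0)  =>  5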
Call me crazy, but I'm beginning to look forward to these posts.
It depends on your definition of ``recursion''. The programmers I respect
and pay a lot of attention to will generally use the words recursion
and iteration interchangeably. They are often talking about fixed points
of recursively enumerable functions. The details of whether an
implementation of such a function pushes or pops the stack is immaterial
(both to me and them).
I find that the really good programmers --- ok, I'll name some names:
Henry Baker, Alan Bawden, Jonathan Rees, Will Clinger, Gerry Sussman,
Bill Rozas (and quite a few others) --- can take an extremely
abstract mathematical description of something, cast it to a
high level formal description in some language with virtually
no changes, and make it run blisteringly fast. I don't think it is an
accident that all these people use Lisp to express their programs.
There is the occasional person who, while brilliant in their own field,
cannot code worth beans, but these people seem few and far between
to me.
> Look, I'm not going to get into a debate
> about whether assembly language is iterative or recursive.
Phew. ;)
> The main point I had was about Lisp code, and the postgrad
> curriculum in comp sci. I can admit that I was wrong in the
> semantics of the language, but I'm not going to change my
> views on the way computer science is taught in the post graduate
> level. It's computer science, based on computer processors.
Is this supposed to be a surprise to anyone?
> It's not mathematics. It is a function on a computer,translated into
> lower level assembly language code run one instruction at a time, not a
> function in an abstract mathematical world.
Do you know _anything_ about compiler theory? Try posting this kind of
nonsense in comp.compilers, and see how far you get. Your claim that
abstract mathematics is divorced from the hardware once again shows us
how little you know.
Have you ever heard of Boolean Algebra? How about Lambda Calculus?
This is at the heart of computing.
> You may think I have a problem with mathematics. I don't,
> as long as there is practical applications in the real physical
> world. The problem with these abstract mathematical functions,
> translated into lisp, is that in most instances they don't have
> any practical use. It's mathematics just for the sake of mathematics.
> That is a complete waste of time.
Ever heard of Alan Turing? You're dismissing the foundations of
computing! You might as well claim that mathematics has no practical
use in the real world. If you do, then we may assume that you've
missed them all. Unless you're trying to convince us that such things
don't exist?
BTW, it ain't even that abstract. We are, after all, talking about
_computing_. Perhaps these machines aren't _quite_ so abstract at the
transistor level, but we don't see the millions of transistors in a
machine, and on such a scale as even a small computer, we have so many
of these damn things that the result is indeed pretty abstract. Never
mind the Boolean algebra that's used at the basic logic level.
May I recommend a book by Tom Duncan, called Adventures with Digital
Electronics? I think it may explain things at a simple enough level
for you to understand - with very little mathematics. However, it's
impossible to avoid it completely. Yes, the book includes some logic
tables! Ouch. Still, this is a practical hands-on type of book, using
breadboards and actual digital circuits using chips.
You sound like computers are way too complicated for you. They're full
of abstractions, and in large quantities. You can only avoid them by
being ignorant of them, as I'm sure most users are. If you wish to be
a programmer, then you'll have to face at least some of these
abstractions. If you wish to talk about languages and compilers for
them, then you'll have to face some really tough abstractions.
Try reading a language spec if you're unconvinced. Even "simple" specs
still include _some_ of the things that you're dismissing. They have
to, otherwise the spec would be meaningless. More detailed specs will
include more of the things that you say are unnecessary. They're not
just for compiler writers, either. Programmers sometimes need to be a
language lawyer. It helps to know the language that you program in.
If you feel that you don't need to know a programming language that
well, then you're either not asking tough enough questions (e.g. how
do I write code that other compilers for this language will compile
correctly?), you're ignorant of the subjects of these questions (holes
in the language specs, for instance), or you're just ignoring them.
I'll leave the more fundamental issues for now, but I will say that
they've been around for some time. At least as long as modern
computing. Just ask yourself how old Lambda Calculus is, and when the
first digital computer was built.
You still have a lot to learn, Grasshopper. ;)
That's not the most amusing thing about this thread though. The most
amusing part is the way one side insists on writing everything one
way, whether it really is more natural to structure every problem that
way or not, and the other whines about it being slow even though
modern compilers output the same code for both _styles_.
But hey, sometimes a loop is.. just a loop.
> Call me crazy, but I'm beginning to look forward to these posts.
They are entertaining, aren't they?
Looks like you either didn't read the replies to your previous postings or
didn't understand them: tail recursive functions (when properly compiled)
do not use the stack. So what is the Q (in QED)?
> Like I said in the post that your replying to, my gripe
>is towards programmers (Lisp as well as others) living in their
>own fantasy abstract mathematical world. It's time to accept
>reality.
You mean the reality that people write about Lisp who don't know what they
are talking about and are unable to comprehend explanations given to them?
Hartmann Schaffer
> ja...@harlequin.co.uk (Jason Trenouth) writes:
>
> > On 9 Jul 1997 04:58:57 GMT, flis...@fontina.cs.wisc.edu (Shaun Flisakowski)
> > wrote:
> >
> > > :Have you noticed...
> > > : You don't see fast numerical libraries written in Lisp.
> > > : You don't see scientific libraries written in Lisp.
> > > : You don't see commercial games written in Lisp.
> > > : You don't see application suites written in Lisp.
> >
> > None of the above are true, of course. BTW The third domain is perhaps the
> > most in fashion of those listed.
>
> That's interesting. Could you give a couple of examples of commercial
> games running compiled (or even interpreted) Lisp code.
How about Nichimen's content development system for Nintendo 64?
There have also been some relatively recent articles in some software
magazines about Lisp and games: particularly the use of hybrid Lisp/C systems
used to fast-track the development of games. However, I'm unable to find one
of these at the moment. Perhaps I dreamt them... :-(
> How about an
> application suite?
I guess it depends what you mean by "application suite", but Lisp vendors (eg
ourselves, and presumably yourselves) have had GUI and DBI frameworks for
some time, so probably you meant something else...
> Anything mainstream?
:-j
__Jason
>But hey, sometimes a loop is.. just a loop.
>
Yup. This is why Norvig, in Paradigms of Artificial Intelligence:
Case Studies in Common Lisp, discussed about sixteen different ways
to do a loop in one of the earlier chapters. Some things want to
be recursive, even at some cost in efficiency. Some things do nicely
in tail recursion, which is (when the compiler is through with it)
the same thing as iteration. Some things naturally loop, and Common
Lisp has more looping constructs than any other language I know of.
David Thornley
From: peaceman <peac...@capital.net>
Newsgroups: comp.lang.lisp,comp.programming,comp.lang.c++
Date: Fri, 25 Jul 1997 18:09:48 -0400
Organization: Logical Net
Lines: 99
NNTP-Posting-Host: dialup113.colnny1.capital.net
Mime-Version: 1.0
Content-Type: text/plain; charset=us-ascii
Content-Transfer-Encoding: 7bit
X-Mailer: Mozilla 3.01 (Win95; I)
Xref: agate comp.lang.lisp:29555 comp.programming:52980 comp.lang.c++:283727
...
> The main point I had was about Lisp code, and the postgrad
> curriculum in comp sci. I can admit that I was wrong in the
> semantics of the language,
Progress!!!!
> but I'm not going to change my
> views on the way computer science is taught in the post graduate
> level. It's computer science, based on computer processors.
> It's not mathematics. It is a function on a computer,translated into
> lower level assembly language code run one instruction at a time, not a
> function in an abstract mathematical world.
Last time I checked the science of "Algorithm Analysis" was doing pretty
well. Take an Analysis of Algorithms course and you will see that
there is a *lot* of work done to justify the costs of "translating a
math specification" into a working program.
> You may think I have a problem with mathematics. I don't,
> as long as there is practical applications in the real physical
> world. The problem with these abstract mathematical functions,
> translated into lisp, is that in most instances they don't have
> any practical use. It's mathematics just for the sake of mathematics.
> That is a complete waste of time.
Again, this has *nothing* to do with Lisp (or C or COBOL). It has to
do with your limited knowledge of the algorithmic issues involved in any
programming activity.
From: b...@wetware.com (Fred Haineux)
Newsgroups: comp.lang.lisp,comp.programming,comp.lang.c++
Date: Fri, 25 Jul 1997 11:12:32 -0700
Organization: Castle Wetware Internet Services, INC.
Lines: 27
mar...@infiniti.PATH.Berkeley.EDU (Marco Antoniotti) wrote:
| And here we see that you have written very little Lisp recently.
| whitespaces (single ones at least) are all-important in Lisp. :)
I do understand your point -- that "f oo" is different than "foo " -- I
should have said "variances in non-zero amounts of whitespace are
considered irrelevant." But this point is pretty minor, no?
| Yes and no. Consensus on such matter is extremely important in an
| organization, especially when you are not just producing one=shot
| programs, but systems which will have to be maintained for a long time
| to come.
Yes, and no. Having *a* coding standard is a Good Thing, without doubt,
because it improves programmer efficiency by some small, but reasonable,
amount.
I am much more concerned with the overall efficiency of the "programmer's
team" and with software maintainability over medium and long periods
of time.
The problem is that standards are almost always embroiled in bitter
argument and bickering. If everyone could agree: code however you like,
but "checkin" WILL run "cb" (with our company's "official standards
module" attached) -- hey, that'd be great, wouldn't it?
Well, people should just use Emacs and the C/C++ modes :)
I frankly couldn't care THAT much about the particular standard. Even if
it offends mine eye, I can cope.
I reject categorically that one coding standard is "intrinsically winning"
and that all others are "obviously braindamaged."
This is true. I can adapt, but having *a* standard surely helps.
>On Thu, 24 Jul 1997 14:49:56 -0700, b...@wetware.com (Fred Haineux)
>wrote:
>
[snip]
> You can't do this in a large project. The reason is, you
>check this code in. Often it becomes necessary to "diff" two
>versions to see "what change made it break". If someone
>reformatted the whole thing and checked it in, its impossible to
>tell what REAL changes were made between those two versions.
>
So keep two configurations for a formatter around. Format to
a standard style when checking in. Format to your preferred
style when checking out. No one needs to complain, and
real code differences are not lost in formatting "noise".
Steve Quist
------- Well, no, you can't diff a binary. How would you be able to
tell where the difference mapped to in the source once you've found
it anyways? Also, diffing each programmer's source before checking
in might not work very well on a large project, where files might be
shared, or multiple people will be making changes to the same file?
What might work a bit better is to have a custom program to strip C
of any pretty features, extra spacing, etc, and then diff the files.
Hmmm... why not just follow a coding standard of whatever team/project
/company you are working at.
--
Mariusz Zydyk http://www.ucalgary.ca/~mszydyk/
Prince of Darkness p...@null.net
How do you make holy water? Boil the hell out of it.
sigh. of course you can, and "code" doesn't mean "binary", but that's not
the issue. my point is that you don't need to diff the whitespace to diff
the source. most diff programs are anal about newlines, for instance.
| Hmmm... why not just follow a coding standard of whatever team/project
| /company you are working at.
sigh. that's how this discussion started.
> What might work a bit better is to have a custom program to strip C
> of any pretty features, extra spacing, etc, and then diff the files.
> Hmmm... why not just follow a coding standard of whatever team/project
>
> /company you are working at.
Most diff algorithms work on a line basis. Working on a character
or token basis would be substantially more computationally expensive,
but it's probably a good idea. Then the diff program would be
insensitive to formatting.
dave
jos...@lavielle.com (Rainer Joswig) writes:
> Why should (+ 3 4) in Lisp be slower than 3 + 4 in C?
Well, if you are really talking about "(+ 3 4)", both should take the
same amount of time, namely zero. Any self-respecting compiler would
constant-fold this operation out of existence. But if you are talking
about "(+ x y)" there are two fundamental reasons why Lisp might be
slower:
1. If the types of x and y are not declared and can't be deduced by
the compiler, the Lisp + must do a runtime type-dispatch on both
arguments to select the right kind of arithmetic/coercion to do. Lisp
must select among fixnums, bignums, several flavors of float, ratios,
and complex numbers.
2. In the most common case of fixnum-fixnum arithmetic, Lisp must
detect any overflow and coerce the result to a bignum. In the event
of an overflow, C will quietly return the wrong answer. OK,
technically it is the right answer, if "right" is defined as mod-N
arithmetic, where N is machine dependent, but that doesn't help the
guy whose nuclear plant just melted.
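Concretely (a minimal sketch of the second point):

;; With no declarations, CL detects the overflow and returns a bignum
;; instead of silently wrapping around:
(+ most-positive-fixnum 1)                   ; => a bignum, one greater
(typep (+ most-positive-fixnum 1) 'bignum)   ; => T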
The first case goes away if you use a few declarations, but the CL
declaration language is admittedly awkward. As for the second case, I
would argue that overflow detection is worth paying for in 99% of all
coding situations, though reasonable people can and do differ about
the desirability of rolling over into bignums vs. signalling an
error as the default behavior.
Yes, there are rare cases where getting the wrong answer is preferable
to wasting a cycle or risking a runtime exception, but usually this is
a recipe for dangerously unreliable code.
-- Scott
===========================================================================
Scott E. Fahlman Internet: s...@cs.cmu.edu
Principal Research Scientist Phone: 412 268-2575
Department of Computer Science Fax: 412 268-5576
Carnegie Mellon University Latitude: 40:26:46 N
5000 Forbes Avenue Longitude: 79:56:55 W
Pittsburgh, PA 15213 Mood: :-)
===========================================================================
Let's just point out that if you are using C++, it does the same thing as
Lisp in this case, and is just as slow.
In Lisp, you certainly can declare a variable to be a type, if you want to
avoid runtime dispatch.
In Lisp, the usual development cycle is...
1) write a prototype
2) add functionality until complete
3) integration test
4) optimize/declare types/etc.
5) make standalone and ship.
However, most Lisp programs stop development in the middle of stage 2 (grin)...
In C, the usual development cycle is:
1) optimize code for a new hash table algorithm that programmer thought up
on bus
2) implement system using new hash table algorithm as often as possible
3) add user interface
4) Test
5) Ship
However, most C programs stop development in the middle of stage 2 (grin)...
>On 25 Jul 1997 08:55:42 +0000, Erik Naggum <er...@naggum.no> wrote:
>[>it occurred to me that the problem is that "diff" is run on the text
>[>representation of the code. had it been possible "diff" the _code_, each
>[>programmer could check his code in and out in his own textual style.
>
>------- Well, no, you can't diff a binary. How would you be able to
>tell where the idfference mapped to in the source once you've found
>it anyways? Also, diffing each programmer's source before checking
>in might not work very well on a large project, where files might be
>shared, or multiple people will be making changes to the same file?
>
>What might work a bit better is to have a custom program to strip C
>of any pretty features, extra spacing, etc, and then diff the files.
>Hmmm... why not just follow a coding standard of whatever team/project
>/company you are working at.
Well, in all honesty, the unix diff has a feature to ignore
whitespace in determining sameness.
Walt Howard
>In C, the usual development cycle is:
>1) optimize code for a new hash table algorithm that programmer thought up
>on bus
>2) implement system using new hash table algorithm as often as possible
>3) add user interface
>4) Test
>5) Ship
>However, most C programs stop development in the middle of stage 2 (grin)...
Hmm...in my experience, you seem to have 4 and 5 backwards.
--
Will Hartung - Rancho Santa Margarita. It's a dry heat. vfr...@netcom.com
1990 VFR750 - VFR=Very Red "Ho, HaHa, Dodge, Parry, Spin, HA! THRUST!"
1993 Explorer - Cage? Hell, it's a prison. -D. Duck
> jos...@lavielle.com (Rainer Joswig) writes:
>
> > Why should (+ 3 4) in Lisp be slower than 3 + 4 in C?
>
> Well, if you are really talking about "(+ 3 4)", both should take the
> same amount of time, namely zero. Any self-respecting compiler would
> constant-fold this operation out of existence. But if you are talking
> about "(+ x y)"
I wanted to talk about the process of adding two numbers. ;-)
> 1. If the types of x and y are not declared and can't be deduced by
> the compiler, the Lisp + must do a runtime type-dispatch on both
> arguments to select the right kind of arithmetic/coercion to do. Lisp
> must select among fixnums, bignums, several flavors of float, ratios,
> and complex numbers.
But if they are declared or can be deduced? Do you see
any Lisp-specific speed penalty?
> 2. In the most common case of fixnum-fixnum arithmetic, Lisp must
> detect any overflow and coerce the result to a bignum.
Even if I declare the result to be a fixnum?
(defun test (a b)
  (declare (fixnum a b)
           (optimize (speed 3) (safety 0)))
  (the fixnum (+ a b)))

(loop for i from 1 upto 10
      with start = (- most-positive-fixnum 5)
      do (print (test start i)))
MCL 4.1 gives me:
536870907
536870908
536870909
536870910
536870911
-536870912
-536870911
-536870910
-536870909
-536870908
p...@null.net (Mariusz Zydyk) writes:
>------- Well, no, you can't diff a binary.
But Erik never suggested doing so. He was merely adverting to the old
Interlisp proverb: "a PROGRAM is not a LISTING". In short, he was
suggesting something like diffing abstract syntax trees, and yes you
CAN do that.
>How would you be able to tell where the difference mapped to in the
>source once you've found it anyway?
If it were a matter of diffing binaries, the answer is pathetically
obvious: via the line number table. When it's a matter of diffing
abstract syntax trees, part of Erik's point is: WHAT SOURCE CODE?
He's saying that there shouldn't be ANY distinguished listing. In
fact, there might never ever have been a textual presentation of the
entire translation unit at one time. The differ would show you the
context of the differences in your choice of layout.
For the record, Interlisp did pretty much what Erik suggests.
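To make the idea concrete, here is a crude sketch (mine -- not Erik's, and
not how Interlisp actually did it): treat the code as data, READ both files,
and compare the toplevel forms structurally, so that indentation and line
breaks can never show up as differences.

(defun read-forms (pathname)
  "Collect every toplevel form in a file, ignoring layout and comments."
  (with-open-file (in pathname)
    (loop for form = (read in nil in)
          until (eq form in)
          collect form)))

(defun code-diff (file-a file-b)
  "Return the indices of toplevel forms that differ between the files."
  (loop for form-a in (read-forms file-a)
        for form-b in (read-forms file-b)
        for i from 0
        unless (equal form-a form-b)
          collect i))

A real tool would also have to cope with forms being added or removed,
reader macros, and so on, but the point stands: once you diff structure
instead of text, each programmer can print the forms back out in whatever
layout he likes.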
>What might work a bit better is to have a custom program to strip C
>of any pretty features, extra spacing, etc, and then diff the files.
This is precisely a crude textual approximation to Erik's suggestion.
>Hmmm... why not just follow a coding standard of whatever team/project
>/company you are working at.
What if a translation unit has to be shared between two teams with
different coding standards, or is produced (and maintained) by company
X but used (and therefore tested and debugged) by company Y, with
different coding standards?
--
Four policemen playing jazz on an up escalator in the railway station.
Richard A. O'Keefe; http://www.cs.rmit.edu.au/%7Eok; RMIT Comp.Sci.
536870907
536870908
536870909
536870910
536870911
536870912
536870913
536870914
536870915
536870916
in interpretive mode; I have to compile to get the (expected ?)
536870907
536870908
536870909
536870910
536870911
-536870912
-536870911
-536870910
-536870909
-536870908
Obviously, the interpreter ignores the declarations.
not really. it only collapses sequences of blanks and tabs into one blank.
this doesn't take care of such things as adding or removing insignificant
whitespace where zero whitespace is allowed, or the annoying newlines that
are also considered "whitespace" in all programming languages.
Well, actually, you forgot the part about how, since so many AI programs
have been written in Lisp, the language itself has actually become
intelligent. "Well," you might say, "if this is so, why does it let me
write bad code?"
The answer is, of course, that Lisp is very much like a cat. Cats, you
see, understand every word you say. They don't pay any attention, usually,
but they do understand.
One could go along with the idea that C is much like a dog -- it assumes
you're the Lead Dog, and therefore follows your instructions implicitly.
This leaves C++, which seems to be neither cat nor dog.
I think that sums it up nicely....
(the preceding was HUMOROUS FARCE, and should not be taken internally
unless under direction of competent counsel...)
The point that we have been repeatedly making is that Lisp compilers are
smart enough to turn some cases of recursive functions into iterative
ones. By doing so, these functions do not use any stack. Even though you
write something that looks like it's going to use the stack, it doesn't.
Period.
This frees you to use recursive or iterative notation for a function, as
suits you. If you happen to like iteration, hey, go for it. Some functions
are easier to figure out iterative. However, some functions are easier
written recursive. If I write them that way, *I* will be able to
understand them more easily, and the compiler may or may not optimize it
into an iterative function, which may or may not use a stack to execute.
Personal style.
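As a throwaway illustration (mine, not from the thread): here is the same
trivial function in both notations, and a compiler that merges tail calls
is free to produce essentially the same loop for either one.

(defun len-recursive (list &optional (n 0))
  (if (null list)
      n
      (len-recursive (rest list) (1+ n))))

(defun len-iterative (list)
  (do ((tail list (rest tail))
       (n 0 (1+ n)))
      ((null tail) n)))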
But this raises a bigger question: why is it so all-important not to use the
stack? Because the stack is slow?
So what!
When I write a Lisp program, I start by making a wild guess as to how the
program should work. Then I play with the program and repeatedly refine it
until it works right. If, in the course of development, I happen to
express a function as recursive or iterative, I don't care. I just get the
darn thing working. Whatever seems right, probably is. (I might have an
AHA! insight later, and redesign the program to be simpler or clearer.)
THEN I look at speed. The funny thing is: the Lisp compiler will often
make optimizations that produce fast code, without my having to think
about them. Assuming I still need to speed something up, I can pepper my
code with declarations (which are so much easier to add after you KNOW
what the variables are, instead of before, when you have to guess, and
therefore keep changing them....) and maybe do some profiling to see
what's slow and what's not.
So when I get to looking at speed, I can do so all at once, as a global
problem. This is an enormous luxury -- you really ought to try sometime.
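The mechanics are nothing special -- something like this (my own toy
example, not the poster's code):

(defun mean (samples)
  (/ (reduce #'+ samples) (length samples)))   ; first cut: no declarations

;; TIME reports elapsed time and consing; only the functions it shows to
;; be genuinely slow get declarations, or a better algorithm.
(time (mean (make-list 100000 :initial-element 1.0)))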
Sure, in C and C++ you're stuck with declaring your variables in advance,
but other than that, you can simply ignore speed concerns and write a loop
using whatever syntax seems appropriate.
I mean, it's not like C or C++ won't compile your program if it's
recursive and it OUGHT to be iterative. Indeed, C compilers these days
will, in fact, rewrite your code to optimize it. Just because C looks like
assembly code doesn't mean that compiling it is a one-to-one translation!
But both C and Lisp compilers let you have inline assembly language code,
if you want to.
...
I think your original question was, "Why does Lisp require you to write
functions using recursive notation?"
The answer to that is, "Lisp does not."
You then asked, "Why is it considered 'better style' to write functions
always in recursive notation?"
The answer to that is, "No, it is not considered better style to write
functions always in recursive notation. You can use iterative notation if
you like, and indeed, Lisp provides you with more kinds of iteration than
C. It's a question of personal style to decide which notation to use."
Then you asserted that recursive functions were necessarily slower than
iterative.
The answer to that is, "No, not necessarily. Good compilers optimize code."
I've gone on to assert that execution speed is not the be-all and end-all
of a program.
Certainly, there is a point at which additional speed is of no
value. I really couldn't care less if it takes me .01 or .001 seconds to
do something, UNLESS I intend to do more than ten thousand of them.
I assert further that there are two other measures of speed that are just
as important as execution speed: development speed, and comprehension
speed.
Development speed is the measure of how long it takes to write a program.
Comprehension speed is the measure of how long it takes someone to learn
how the code operates.
We are all familiar with the maxim, "six months later, you WILL have to
read the comments." Kernighan and Plauger, in "The Elements of Programming
Style," repeatedly enjoin the programmer to "Write clearly -- don't be too
clever." Indeed, of the one-hundred or so "rules" which are gone over,
MOST of them are simple variants of this fundamental rule.
Obviously, what might be obvious to you might be gibberish to someone
else. You will have to decide where to draw the line.
: To listen to many lisp advocates, lisp is :-
: 1, More powerful at machine level than any low level language hitherto
: created. In fact Lisp creates a psychic interface enabling you
: to coax the best out of the CPU *ooo now baby*
: 2, More suitable for OO development than any other language except for
: froo, a highly advanced mathematical language, details of which were
: zapped from Marvin Minsky's brain in highly suspicious circumstances.
: No one knows anymore details but its possible the *dark mutter* CIA
: were involved.
: 3, More suitable than any other language for scripting, in fact you should
: burn all heretics who use the mysteries of sh, ksh, perl or even *shudder*
: Tcl. Such people are all management lackeys, or very misguided at best.
: They may only save themselves by committing themselves to a years
: intensive study of the oracles of SickPea.
: 4, More intuitive than any natural language. After all who wants to speak in
: English, SerboCroat or that African language with the tongue clicks. In
: fact if we only re-engineered all our languages around lisp, we could
: make full use of AI! Not only would that fish swim on your screen, but
: it could talk to you!
: 5, Ideal for writing extensions in, i mean look at emacs! It may be a bloated
: pig of an editor, but it works, and more importantly is conceptually
: elegant. whaddya mean you don't agree ? What are you anyway ? Some sort of
: VB loving COMMUNIST ???
With the possible exception of nr. 4, you are completely right. [*]
IMO, programming languages should not be intuitive; they must be
self-consistent and founded on very few principles, and not rely on any
external _intuition_ to make sense.
Pierpaolo.
[*] 8-)
> : This seems to be the only plausible explanation. It might also explain
> : many of the Java attacks, which use _identical_ arguments to attacks
> : on Lisp. They're all bizarre, wrong, and easily refuted.
>
> Or OTOH, this could be because certain Java/Lisp programmers advocate
> their programming language as a solution for all problems at every level
> of abstraction. All it needs is for someone to opine that certain
> problems may need assembler (as an example) as an optimal solution,
> immediately a Java programmer will jump in saying that "no one programs in
> assembler anymore", and that in his opinion the problem "would be much better
> coded in Java, for in Java you could do x", and if you ^really^ want to
> program in assembler "you can wait 3 years until a native Java CPU is
> released", and in the meantime "theres a really kewl assembler for Java
> byte codes".
I've never seen a Java programmer jump in saying that "no one
programs in assembler anymore". Where did you see this remarkable
claim made? I'd dispute it myself!
> To listen to many lisp advocates, lisp is :-
>
> 1, More powerful at machine level than any low level language hitherto
> created. In fact Lisp creates a psychic interface enabling you
> to coax the best out of the CPU *ooo now baby*
I've never seen this claimed, either. Not in the 5 years that I've
been reading comp.lang.lisp. Some possibly overzealous claims _may_
be made, but it's also possible that a few people simply find the
claims hard to believe, maybe coz they've not actually checked to see
what's being done with a particular language.
Let us not forget the Clarke law about sufficiently advanced tech
being indistinguishable from magic. There's no magic to be found, but
there is tech that does some amazing things. I frequently see this law
in action, regarding all kinds of languages.
> 2, More suitable for OO development than any other language except for
> froo, a highly advanced mathematical language, details of which were
> zapped from Marvin Minsky's brain in highly suspicious circumstances.
> No one knows anymore details but its possible the *dark mutter* CIA
> were involved.
Perhaps someone better acquainted with mathematical languages could
answer this one, as I've no idea what you're talking about. I'm more
familiar with text processing, and compilers in particular.
> 3, More suitable than any other language for scripting, in fact you should
> burn all heretics who use the mysteries of sh, ksh, perl or even *shudder*
> Tcl. Such people are all management lackeys, or very misguided at best.
> They may only save themselves by committing themselves to a years
> intensive study of the oracles of SickPea.
I've never seen this claimed, either. I don't doubt that Lisp can be
used for scripting (consider scsh!). I used to use Forth for
scripting, back when CP/M-68K was my OS. Well, the scripting in that
OS is almost non-existent, so perhaps I had a good excuse.
When and where have you seen Lisp people insisting that only Lisp can
do scripting? ISTR a nice little article published by Byte in which a
William Gates outlined his plans for using Basic as a scripting tool.
Everyone can, and IMHO should be able to, use their chosen language for
scripting. MS actually let us choose the language, unlike a great many
others. Even if you're right about this point, and that _some_ Lisp
people want to convert us all to Lisp, there are others promoting
other languages, and they have much greater power to make their dreams
come true.
I'm very relaxed about scripting, as I don't always have the luxury of
a choice, and when I do, well, there's no need to convert anyone, nor
any need to defend my choice(s). I read the Tcl thread, earlier this
year, with a certain amount of amusement. This is because I have no
strong feelings about what choices _other people make_. None of it
affects me. Is this a positive attitude? I can't tell.
> 4, More intuitive than any natural language. After all who wants to speak in
> English, SerboCroat or that African language with the tongue clicks. In
> fact if we only re-engineered all our languages around lisp, we could
> make full use of AI! Not only would that fish swim on your screen, but
> it could talk to you!
Now you've completely lost me. Techies like us tend to reject
"friendly" user interfaces, like "Bob". I'm not sure I'd like using
such a beast, but then, I'm likely to, so why worry?
I can't remember the last time I saw a thread discussing user
interfaces in comp.lang.lisp. Can you?
> 5, Ideal for writing extensions in, i mean look at emacs! It may be a bloated
> pig of an editor, but it works, and more importantly is conceptually
> elegant. whaddya mean you don't agree ? What are you anyway ? Some sort of
> VB loving COMMUNIST ???
Well, I like the IDE in VB. Perhaps I'm a heretic? Nobody has burned
me yet. If you wish to flamebait emacs users, don't look at me. Try an
emacs newsgroup instead.
> Did i miss anything out ?
Any evidence? I can refer you to the recent "Lisp is SLOW" thread,
which we're still posting to, plus various other recent threads, like
the one about garbage collection (see dejanews). There's also the
series of threads that have repeated themselves, of which I read at
least 3 years worth - too bad I didn't archive them, because then you
too could read them. You might learn something, even if it's how
pointless it is to argue with people who know more about the subject
than yourself. In this case, the subject is Lisp.
Here's my evidence: the Lisp FAQ and everything that it refers to.
Books, software, companies, compilers, it's all there. As Henry Baker
once said, when referring someone to one of his papers, read it and
weep. Alternatively, you might discover something useful. You never
know, it might be a better argument for not using Lisp... Try it and
let us all know what you find.
If you don't have time to do this, fair enough. After all, it's a big
world, even if we only restrict ourselves to studying computers. Nobody
can know about all of it and then talk about it with any authority.
This is why you should consider that others may have a greater depth
of knowledge in this one area. No doubt there are subjects that a Lisp
programmer won't know, and with which you could win every argument.
I'm only saying that this isn't one of them.
Anyway, you've joined the party near the end, when everyone is
leaving. The fun is over...until the next time. See you then.
Better go back to school and take a course in compiler design.
One word: stack.
Peaceman
If there is a person who needs a compiler course, it is you.
First you stated that
all recursive programs can be expressed by iteration without
the use of a stack.
Now you essentially state that
tail-recursive definitions always need a stack.
Why don't *you* go back to school and take a serious compiler (or PL)
course and then come back here telling everybody that you were wrong
on these two counts? There is nothing bad in admitting that you
are wrong. At least you will have learned something.
PS. Apologies to everybody else, I just cannot resist.
> scha...@wat.hookup.net wrote:
> >
> > In case you didn't get it yet, the machine code for iteration and tail
> > recursion are indistinguishable.
> >
> > Hartmann Schaffer
>
> Better go back to school and take a course in compiler design.
> One word: stack.
>
> Peaceman
How about not being so quick to send other people back to school?
Tail recursion does not use a stack.
(defun fac (n acc)
  (declare (optimize (speed 3) (space 0) (safety 0) (debug 0)))
  (if (zerop n)
      acc
      (fac (1- n) (* n acc))))
? (fac 5 1)
120
Macintosh Common Lisp 4.1 generated PowerPC code:
? (disassemble 'fac)
L0
(MFLR LOC-PC)
(STWU SP -16 SP)
(STW FN 4 SP)
(STW LOC-PC 8 SP)
(STW VSP 12 SP)
(MR FN TEMP2)
(LWZ IMM0 -117 RNIL)
(TWLLT SP IMM0)
(VPUSH ARG_Z)
(:REGSAVE SAVE0 4)
(VPUSH SAVE0)
(MR SAVE0 ARG_Y)
(MR ARG_Y SAVE0)
(LI ARG_Z '0)
(BLA .SPBUILTIN-EQ)
(CMPW ARG_Z RNIL)
(BEQ L96)
(LWZ ARG_Z 4 VSP)
(LWZ SAVE0 0 VSP)
(LWZ LOC-PC 8 SP)
(MTLR LOC-PC)
(LWZ VSP 12 SP)
(LWZ FN 4 SP)
(LA SP 16 SP)
(BLR)
L96
(MR ARG_Y SAVE0)
(LI ARG_Z '1)
(BLA .SPBUILTIN-MINUS)
(MR ARG_Y ARG_Z)
(VPUSH ARG_Y)
(MR ARG_Y SAVE0)
(LWZ ARG_Z 8 VSP)
(BLA .SPBUILTIN-TIMES)
(LWZ ARG_Y 0 VSP)
(LA VSP 4 VSP)
(LWZ SAVE0 0 VSP)
(SET-NARGS 2)
(MR TEMP2 FN)
(LWZ LOC-PC 8 SP)
(MTLR LOC-PC)
(LWZ VSP 12 SP)
(LWZ FN 4 SP)
(LA SP 16 SP)
(B L0)
O.k., let's see what's on the stack at n = 1:
(defun fac (n acc)
  (declare (optimize (speed 3) (space 0) (safety 0) (debug 0)))
  (when (= n 1)
    (print-call-history))
  (if (zerop n)
      acc
      (fac (1- n) (* n acc))))
There is only one call to FAC pending on the stack.
? (fac 110 1)
(3202728) : 0 "FAC" 88
0 ACC:
1588245541522742940425370312709077287172441023447356320758174831844456716294
8183030959960131517678520479243672638179990208521148623422266876757623911219
200000000000000000000000000 ("required")
1 : #<PROCESS Listener [Suspended] #x2D1FEC6> ("saved SAVE0")
(3202738) : 1 NIL NIL
(3202748) : 2 "CCL::CALL-CHECK-REGS" 80
0 : FAC ("required")
1 : (110 1) ("rest")
2 : (#<PROCESS Listener [Suspended] #x2D1FEC6> 0 *TERMINAL-IO* (#<RESTART
ABORT #x3A63D3E> #<RESTART ABORT-BREAK #x3A63D66>) #<BOGUS object @
#x1B7CB6E> CCL::%PPC-APPLY-LEXPR-WITH-METHOD-CONTEXT (#<STANDARD-METHOD
INITIALIZE-INSTANCE (FRED-WINDOW)> #<CCL::STANDARD-KERNEL-METHOD
INITIALIZE-INSTANCE (WINDOW)> #<CCL::STANDARD-KERNEL-METHOD
INITIALIZE-INSTANCE (SIMPLE-VIEW)> #<CCL::STANDARD-KERNEL-METHOD
INITIALIZE-INSTANCE (CCL::INSTANCE-INITIALIZE-MIXIN)>
#<CCL::STANDARD-KERNEL-METHOD INITIALIZE-INSTANCE (STANDARD-OBJECT)>) 0)
3 : FAC
(3202758) : 3 NIL NIL
(3202768) : 4 "CCL::TOPLEVEL-EVAL" 176
0 : (FAC 110 1) ("required")
1 : NIL ("optional")
2 : NIL
(3202778) : 5 "CCL::READ-LOOP-INTERNAL" 716
0 : *EVAL-QUEUE* ("saved SAVE0")
1 : CCL::*CURRENT-STACK-GROUP* ("saved SAVE1")
2 : CCL::*RESUME-STACK-GROUP-ARG* ("saved SAVE2")
3 : #<BOGUS object @ #x1B7CB7E> ("saved SAVE3")
4 CCL::*BREAK-LEVEL*: 0 (:SAVED-SPECIAL)
5 CCL::*LAST-BREAK-LEVEL*: 0 (:SAVED-SPECIAL)
6 *LOADING-FILE-SOURCE-FILE*: NIL (:SAVED-SPECIAL)
7 CCL::*IN-READ-LOOP*: NIL (:SAVED-SPECIAL)
8 CCL::*LISTENER-P*: T (:SAVED-SPECIAL)
9 ***: NIL (:SAVED-SPECIAL)
10 **: NIL (:SAVED-SPECIAL)
11 *: NIL (:SAVED-SPECIAL)
12 +++: NIL (:SAVED-SPECIAL)
13 ++: NIL (:SAVED-SPECIAL)
14 +: NIL (:SAVED-SPECIAL)
15 ///: NIL (:SAVED-SPECIAL)
16 //: NIL (:SAVED-SPECIAL)
17 /: NIL (:SAVED-SPECIAL)
18 -: NIL (:SAVED-SPECIAL)
19 : (FAC 110 1)
20 : #<RESTART ABORT-BREAK #x3A63D66>
21 : #<RESTART ABORT #x3A63D3E>
22 CCL::%RESTARTS%: ((#<RESTART ABORT #x3A63F66> #<RESTART ABORT-BREAK
#x3A63F8E>)) (:SAVED-SPECIAL)
23 : T
(32027C8) : 6 NIL NIL
(32027D8) : 7 "CCL::READ-LOOP" 356
0 : 0 ("required")
1 : (#<RESTART ABORT #x3A63F66> #<RESTART ABORT-BREAK #x3A63F8E>) ("saved
SAVE0")
2 : T
3 : NIL
4 CCL::*LISTENER-P*: NIL (:SAVED-SPECIAL)
5 *EVAL-QUEUE*: NIL (:SAVED-SPECIAL)
(32027F8) : 8 "TOPLEVEL-LOOP" 48
(3202818) : 9 "Anonymous Function #x22E8E26" 44
(3202838) : 10 "CCL::RUN-PROCESS-INITIAL-FORM" 340
0 : #<PROCESS Listener [Running] #x2D1FEC6> ("required")
1 : (#<COMPILED-LEXICAL-CLOSURE #x2D1FF2E>) ("required")
2 : CCL::*NEXT-STACK-GROUP* ("saved SAVE0")
3 : NIL
4 : #<RESTART ABORT-BREAK #x3A63F8E>
5 : #<RESTART ABORT #x3A63F66>
6 CCL::%RESTARTS%: NIL (:SAVED-SPECIAL)
7 : #<COMPILED-LEXICAL-CLOSURE #x2D1FF2E>
(3202888) : 11 "CCL::%RUN-STACK-GROUP-FUNCTION" 796
0 : #<BOGUS object @ #x3A63FFE> ("required")
1 : 13109802 ("required")
2 *TOP-LISTENER*: NIL (:SAVED-SPECIAL)
1588245541522742940425370312709077287172441023447356320758174831844456716294
8183030959960131517678520479243672638179990208521148623422266876757623911219
200000000000000000000000000
Compare it to the recursive version:
(defun fac (n)
  (declare (optimize (speed 3) (space 0) (safety 0) (debug 0)))
  (when (= n 1)
    (print-call-history))
  (if (zerop n)
      1
      (* n (fac (1- n)))))
? (fac 7)
(32026C8) : 0 "FAC" 84
0 : 2 ("saved SAVE0")
(32026D8) : 1 "FAC" 160
0 : 3 ("saved SAVE0")
(32026E8) : 2 "FAC" 160
0 : 4 ("saved SAVE0")
(32026F8) : 3 "FAC" 160
0 : 5 ("saved SAVE0")
(3202708) : 4 "FAC" 160
0 : 6 ("saved SAVE0")
(3202718) : 5 "FAC" 160
0 : 7 ("saved SAVE0")
(3202728) : 6 "FAC" 160
0 : #<PROCESS Listener [Suspended] #x2D1FEC6> ("saved SAVE0")
(3202738) : 7 NIL NIL
(3202748) : 8 "CCL::CALL-CHECK-REGS" 80
0 : FAC ("required")
1 : (7) ("rest")
2 : (#<PROCESS Listener [Suspended] #x2D1FEC6> 0 *TERMINAL-IO* (#<RESTART
ABORT #x3A63D3E> #<RESTART ABORT-BREAK #x3A63D66>) #<BOGUS object @
#x1B7CB6E> CCL::%PPC-APPLY-LEXPR-WITH-METHOD-CONTEXT (#<STANDARD-METHOD
INITIALIZE-INSTANCE (FRED-WINDOW)> #<CCL::STANDARD-KERNEL-METHOD
INITIALIZE-INSTANCE (WINDOW)> #<CCL::STANDARD-KERNEL-METHOD
INITIALIZE-INSTANCE (SIMPLE-VIEW)> #<CCL::STANDARD-KERNEL-METHOD
INITIALIZE-INSTANCE (CCL::INSTANCE-INITIALIZE-MIXIN)>
#<CCL::STANDARD-KERNEL-METHOD INITIALIZE-INSTANCE (STANDARD-OBJECT)>) 0)
3 : FAC
(3202758) : 9 NIL NIL
(3202768) : 10 "CCL::TOPLEVEL-EVAL" 176
0 : (FAC 7) ("required")
1 : NIL ("optional")
2 : NIL
(3202778) : 11 "CCL::READ-LOOP-INTERNAL" 716
0 : *EVAL-QUEUE* ("saved SAVE0")
1 : CCL::*CURRENT-STACK-GROUP* ("saved SAVE1")
2 : CCL::*RESUME-STACK-GROUP-ARG* ("saved SAVE2")
3 : #<BOGUS object @ #x1B7CB7E> ("saved SAVE3")
4 CCL::*BREAK-LEVEL*: 0 (:SAVED-SPECIAL)
5 CCL::*LAST-BREAK-LEVEL*: 0 (:SAVED-SPECIAL)
6 *LOADING-FILE-SOURCE-FILE*: NIL (:SAVED-SPECIAL)
7 CCL::*IN-READ-LOOP*: NIL (:SAVED-SPECIAL)
8 CCL::*LISTENER-P*: T (:SAVED-SPECIAL)
9 ***: NIL (:SAVED-SPECIAL)
10 **: NIL (:SAVED-SPECIAL)
11 *: NIL (:SAVED-SPECIAL)
12 +++: NIL (:SAVED-SPECIAL)
13 ++: NIL (:SAVED-SPECIAL)
14 +: NIL (:SAVED-SPECIAL)
15 ///: NIL (:SAVED-SPECIAL)
16 //: NIL (:SAVED-SPECIAL)
17 /: NIL (:SAVED-SPECIAL)
18 -: NIL (:SAVED-SPECIAL)
19 : (FAC 7)
20 : #<RESTART ABORT-BREAK #x3A63D66>
21 : #<RESTART ABORT #x3A63D3E>
22 CCL::%RESTARTS%: ((#<RESTART ABORT #x3A63F66> #<RESTART ABORT-BREAK
#x3A63F8E>)) (:SAVED-SPECIAL)
23 : T
(32027C8) : 12 NIL NIL
(32027D8) : 13 "CCL::READ-LOOP" 356
0 : 0 ("required")
1 : (#<RESTART ABORT #x3A63F66> #<RESTART ABORT-BREAK #x3A63F8E>) ("saved
SAVE0")
2 : T
3 : NIL
4 CCL::*LISTENER-P*: NIL (:SAVED-SPECIAL)
5 *EVAL-QUEUE*: NIL (:SAVED-SPECIAL)
(32027F8) : 14 "TOPLEVEL-LOOP" 48
(3202818) : 15 "Anonymous Function #x22E8E26" 44
(3202838) : 16 "CCL::RUN-PROCESS-INITIAL-FORM" 340
0 : #<PROCESS Listener [Running] #x2D1FEC6> ("required")
1 : (#<COMPILED-LEXICAL-CLOSURE #x2D1FF2E>) ("required")
2 : CCL::*NEXT-STACK-GROUP* ("saved SAVE0")
3 : NIL
4 : #<RESTART ABORT-BREAK #x3A63F8E>
5 : #<RESTART ABORT #x3A63F66>
6 CCL::%RESTARTS%: NIL (:SAVED-SPECIAL)
7 : #<COMPILED-LEXICAL-CLOSURE #x2D1FF2E>
(3202888) : 17 "CCL::%RUN-STACK-GROUP-FUNCTION" 796
0 : #<BOGUS object @ #x3A63FFE> ("required")
1 : 13109802 ("required")
2 *TOP-LISTENER*: NIL (:SAVED-SPECIAL)
5040
Can you see the difference?
> > In case you didn't get it yet, the machine code for iteration and tail
> > recursion are indistinguishable.
> >
> > Hartmann Schaffer
>
> Better go back to school and take a course in compiler design.
> One word: stack.
Better go back to school and take a course in computer programming.
One word: tail.
--
Gareth McCaughan Dept. of Pure Mathematics & Mathematical Statistics,
gj...@dpmms.cam.ac.uk Cambridge University, England.
> One word: stack.
Many books on compiler theory neglect a certain issue.
Two words: tail recursion.
I guess they didn't teach it right. I took a course at Harvard once
(yeah, I get what I deserve, eh?) and came away with the impression
that Lisp was a complete waste of time. I changed my opinion
rather quickly when I learned it from *real* programmers.
Fred Haineux wrote in article ...
>Scott Fahlman <s...@clyde.boltz.cs.cmu.edu> wrote:
>| 1. If the types of x and y are not declared and can't be deduced by
>| the compiler, the Lisp + must do a runtime type-dispatch on both
>| arguments to select the right kind of arithmetic/coercion to do. Lisp
>| must select among fixnums, bignums, several flavors of float, ratios,
>| and complex numbers.
>
>Let's just point out that if you are using C++, it does the same thing as
>Lisp in this case, and is just as slow.
Really? Hmm...have y'profiled it? ;-)
Anyway, in C++ the type of the object should be known at compile
time. All that has to occur at run time is the lookup into the vtable:
21: c = *ca + 5 ;
004011F6 mov dword ptr [ebp-8],5
004011FD lea eax,dword ptr [ebp-8]
00401200 push eax
00401201 mov ecx,dword ptr [ca]
00401204 mov edx,dword ptr [ecx]
00401206 mov ecx,dword ptr [ca]
00401209 call dword ptr [edx]
0040120B mov dword ptr [c],eax
Shouldn't be too shabby. Code to generate the coercions should already be
present. If virtual, then they'd use similar code.
The main thing is that in C++, the evidently much slower case(?) of "x and
y are not declared and their types can't be deduced...." doesn't apply.
Each variable's type is known at compile time.
What follows is the source
#include <iostream>

class A {
    int a;
public:
    A(int _a) : a(_a) {}
    int get_a(void) { return a; }
    virtual int operator+(const int &b) {
        std::cout << "A + operator";
        return a + b;
    }
};

class B : public A {
public:
    B(int _a) : A(_a) {}
    virtual int operator+(const int &b) {
        std::cout << "B + operator";
        return get_a() + b;
    }
};

int foo(A *ca)
{
    int c;
    c = *ca + 5;
    return c;
}

int main(int argc, char **argv) {
    B cb(4);
    foo(&cb);
    return 0;
}
Dennis
Sajid Ahmed the Peaceman wrote in article <33E256...@capital.net>...
>scha...@wat.hookup.net wrote:
>>
>> In case you didn't get it yet, the machine code for iteration and tail
>> recursion are indistinguishable.
>>
>> Hartmann Schaffer
>
> Better go back to school and take a course in compiler design.
> One word: stack.
>
> Peaceman
>.
>
If I tried to pass an instance of class C (unrelated to A), calling foo with
that instance would generate a compile error ;-)
Dennis
Dennis Weldy wrote in article ...
Coming from YOU, this comment seems quite surrealistic.
David
--
David BrabaNT, | E-mail: David.Braba...@csl.sni.be
Siemens Nixdorf (SNI), | CIS: 100337(,)1733
Centre Software de Liège, | X-400: C=BE;A=RTT;P=SCN;O=SNI;OU1=LGG1;OU2=S1
2, rue des Fories, | S=BRABANT;G=DAVID
4020 Liège (BELGIUM) | HTTP: www.sni.de www.csl.sni.be/~david
>scha...@wat.hookup.net wrote:
>>
>> In case you didn't get it yet, the machine code for iteration and tail
>> recursion are indistinguishable.
>>
>> Hartmann Schaffer
>
> Better go back to school and take a course in compiler design.
> One word: stack.
>
> Peaceman
Do you know anything about compilers but the word?
Hartmann Schaffer
But if you want a text editor rather than a way of life, use ue.
Daniel Barker,
Institute of Cell and Molecular Biology,
University of Edinburgh,
Daniel Rutherford Building,
King's Buildings,
Mayfield Road,
Edinburgh
EH9 3JR
> int ifact (int n, int r) {
> while (n > 0) {
> r *= n--;
> }
> return r;
> }
>
> int rfact (int n, int r) {
> if (n == 0)
> return r;
> return rfact (n - 1, r * n);
> }
...
> The command "gcc -O2 -S t.c" produces:
...
> Now, notice one thing here: The code produced for the recursive
> form of the factorial is better than the iterative code!
Am I missing something? It looks to me as if the inner loop of
the recursive version is one instruction longer and will therefore
be slower. (Of course, for this particular case you'll never go
round the loop very many times, unless you actually *want* to
know <something large> factorial mod 2^whatever...)
I completely agree that "Sajid Whatever-it-was the Peaceman" is
completely wrong in claiming that recursion is invariably more
inefficient, and I completely agree that this example shows that
his feared stack explosion doesn't happen; but I don't think the
code produced for the recursive version is *better*. Or did I
miss something?
Well, I was trying to stay out of this, but "Peaceman" really should
listen to those who have studied/learned more... As evidence, I
present the following "C" code!
int ifact (int n, int r) {
while (n > 0) {
r *= n--;
}
return r;
}
int rfact (int n, int r) {
if (n == 0)
return r;
return rfact (n - 1, r * n);
}
The output of the command "gcc --version" is 2.7.2 it is running on
a NetBSD-1.2G/pc532 box (so the output is for an ns32k arch.)
The command "gcc -O2 -S t.c" produces:
#NO_APP
gcc2_compiled.:
___gnu_compiled_c:
.text
.align 2
.globl _ifact
.type _ifact,@function
_ifact:
enter [],0
movd 8(fp),r1
movd 12(fp),r0
cmpqd 0,r1
bge L3
L4:
muld r1,r0
addqd -1,r1
cmpqd 0,r1
blt L4
L3:
exit []
ret 0
Lfe1: .size _ifact,Lfe1-_ifact
.align 2
.globl _rfact
.type _rfact,@function
_rfact:
enter [],0
movd 8(fp),r1
movd 12(fp),r0
L8:
cmpqd 0,r1
beq L7
muld r1,r0
addqd -1,r1
br L8
.align 2,0xa2
L7:
exit []
ret 0
Lfe2:
.size _rfact,Lfe2-_rfact
Now, notice one thing here: The code produced for the recursive
form of the factorial is better than the iterative code! I kept
the "r" argument to hopefully make the comparison of the assembly
easier. However, I did try replacing the while loop with a for
loop, and a for loop with an index counting the other direction
(1 -> n) and it made no real difference in the generated code.
I was trying to get the output to be the same for both, but after
*multiple* attempts with the iterative version, I have been unable
to get gcc to generate code as good as my *first* recursive version!
Now the thing I want you to think about is this, *GCC-2.7.2* takes
an iterative version of some code, and a recursive version of the
same code, and makes better code for the recursive version in *EVERY*
case I tried. The recursive code is smaller, faster, and uses the
same amount of stack for any arguments.
Now do you still hold the same opinion on who should be in a compiler
design class?
Jon Buller
I think he may have a couple of valid points:
1) Recursion has been used more in Lisp-like languages.
The optimization techniques for tail-recursion developed
in the context of Lisp, because that's where
they were most needed. (Of course, once developed,
nothing stopped GCC from taking advantage of the techniques.)
If you pick up an early Lisp book, you will find
recursion heavily emphasized as the "natural" technique
for Lisp. (Lately de-emphasized in the wake of intensive
performance concerns.)
2) Recursion is indeed inherently more expensive, otherwise
the whole field of "elimination of tail recursion"
would have been unnecessary.
Moreover, not all recursive problems are tail-recursive,
or as easily amenable to being compiled iteratively.
this is obviously false. function calls have always been expensive, and
will continue to be. if a tail call could universally be replaced with a
jump, be that to itself or to any other function, much would be saved in
performance. designing a language and implementing calling conventions
such that this is possible is actually very hard work. e.g., C blew it,
and therefore C++. Scheme focused on this aspect from very early on, and
tail-call merging has been a standard feature in Lisp compilers a long
time. since C blew it so disastrously, it's no wonder it took the C
community so long to get it right. we should also not ignore the fact that
the Free Software Foundation and GNU project is led by people with
extensive experience from the Lisp world.
since I live in an uncontaminated world, without MS bugs or problems,
could somebody who has already been exposed to MS tell us whether their
compilers for C or C++ do tail-call merging? that could be instructive.
#\Erik
--
404 Give me a URL I cannot refuse.
> since I live in an un contaminated world, without MS bugs or problems,
> could somebody who has already been exposed to MS tell us whether their
> compilers for C or C++ do tail-call merging? that could be instructive.
VC++ 4.0 certainly handles tail recursion, _if_ you use the optimiser.
Just like GNU C.
I don't think that you're disagreeing with Mukesh Prasad, as you're
both saying the same thing. It's true that recursion can be "more
expensive" than not using recursion, in the sense that it may use stack
space, but it's not true that recursion is more expensive than an
iterative approach. The ifact and rfact functions should produce
similar performance.
It's possible to "optimise" the second resursive call in quicksort, by
replacing it with a while loop, but compilers like VC++ and GNU C can
optimise the call _without_ a source level transformation of code.
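For concreteness, the hand transformation usually looks something like this
(my sketch, in Lisp since this is comp.lang.lisp; PARTITION is just the
usual pivot step):

(defun partition (vec lo hi)
  "Partition VEC[LO..HI] around VEC[HI]; return the pivot's final index."
  (let ((pivot (aref vec hi))
        (i lo))
    (loop for j from lo below hi
          when (< (aref vec j) pivot)
            do (rotatef (aref vec i) (aref vec j))
               (incf i))
    (rotatef (aref vec i) (aref vec hi))
    i))

(defun quicksort (vec &optional (lo 0) (hi (1- (length vec))))
  (loop while (< lo hi)
        do (let ((p (partition vec lo hi)))
             ;; recurse on the smaller half, loop on the larger half, so
             ;; at most O(log n) calls are ever pending at once
             (cond ((< (- p lo) (- hi p))
                    (quicksort vec lo (1- p))
                    (setf lo (1+ p)))
                   (t
                    (quicksort vec (1+ p) hi)
                    (setf hi (1- p))))))
  vec)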
IMHO we should let the compiler do this kind of work for us. The only
reason for making source-level optimisations _in C_ is that C compilers tend
to not use optimisation by default. We have to explicitly tell them to
optimise the code, and this often makes the compile time longer.
Lisp programmers probably don't notice compile times. I know that I
usually don't! Whether I code in C or Lisp, I spend a lot of time
recompiling single functions. The difference is that in C, this takes
a _lot_ longer than in Lisp, where it can be very hard to measure the
time to compile a single function.
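At a Lisp listener the whole cycle is just (my illustration):

(defun area (r) (* pi r r))   ; redefine only the function being worked on
(compile 'area)               ; and compile only that function; the rest of
                              ; the running image is untouched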
This will vary from compiler to compiler, of course. Fortunately, in
Common Lisp, there's the option of an interpreter - perfect for
testing code that will be run once and then replaced by a new version
a few seconds later. C interpreters also exist, but I'm not sure if
they allow interpreted and compiled code to be mixed in the same
program. Perhaps this is why C/C++ people are so confused? They may be
judging Lisp by the limitations of C/C++ compilers, by assuming that
the same rules apply to both languages.
Compilers - I love 'em! There are no limits.
--
<URL:http://www.wildcard.demon.co.uk/> You can never browse enough
"There are no limits." -- ad copy for Hellraiser
Uh... no. I have lost the ability to count. It seems to only
happen just before I post, but somehow it seems to always happen
before I post 8-)
I was looking at that code thinking: Gee, the while loop has 2
compare instructions, and the recursive function merged them into
one so it MUST be better, I guess I don't even need to look closer
or actually *count* the instructions... Just in case someone wonders
if it's an antique CISC that allows that to occur, I checked the
same code on a SPARC last night, and they are both within a few
instructions of each other on that machine as well.
Of course this unfortunately does not counter the argument "Well
sure you can show such results with such a simple toy, but it'll
never work on 'real code'". I haven't the time to generate such
an example, nor would I flood the net with it if I had it. But I
am sure the outcome of such an example would be the same as this
one, provided the C parameter passing semantics didn't get in the
way, and the GCC optimizer didn't just simply give up on a large
input function.
> I completely agree that "Sajid Whatever-it-was the Peaceman" is
> completely wrong in claiming that recursion is invariably more
> inefficient, and I completely agree that this example shows that
> his feared stack explosion doesn't happen; but I don't think the
> code produced for the recursive version is *better*. Or did I
> miss something?
Now if only our flame-baiter would admit to the obvious facts as
readily...
Jon Buller
Ah, but if you wanted efficient code, you wouldn't be using quicksort
in the first place.
I can't speak about Lisp, but I do know that in the history of Prolog,
TRO support was added to DEC-10 Prolog *KNOWING THAT IT WOULD BE SLOWER*
than pure recursion, but purely in order to save memory (if we had more
than 250k in a program, the ERCC operators asked us to run it overnight!)
It became an essential part of all later Prolog systems precisely so that
we could *stop* writing iterative code!
What's expensive is not recursion, but retaining resources you are no longer
using. RECURSIVE CALLS AREN'T ANY DIFFERENT FROM ANY OTHER CALLS. The
key point about TRO is *not* that it turns the call into a jump, but that
it reclaims the stack frame *before* the procedure returns. The Quintus
Prolog compiler supported an additional (and very simply implemented)
optimisation: environment trimming. Basically, the compiler allocated
variables in the stack frame in reverse order of death time, and as each
variable's death time was reached, the stack frame was trimmed back.
Environment trimming and tail call optimisation were done for _all_ procedures
and calls, not just self-recursions.
With object-oriented programming, even when your code is not recursive, it
still has a heck of a lot of procedure calls. It's not just recursive
styles that have to use 'the stack'.
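A toy illustration of why the general case matters (mine, not O'Keefe's):
these two functions tail-call each other, so no "turn the self-call into a
loop" trick applies, and only genuine tail-call optimisation -- in the
implementations that perform it -- keeps the stack flat.

(defun my-evenp (n)
  (if (zerop n) t (my-oddp (1- n))))

(defun my-oddp (n)
  (if (zerop n) nil (my-evenp (1- n))))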
> > Can you see the difference?
> >
> > --
> > http://www.lavielle.com/~joswig/
>
>
> For your sake, I hope the Macintosh makes a comeback.
> They have already filed for chapter 11, and are just being
> bailed out by Microsoft.
>
> Peaceman
Boy, this is a technical argument...
> For your sake, I hope the Macintosh makes a comeback.
> They have already filed for chapter 11, and are just being
> bailed out by Microsoft.
How does this support your argument? Are you saying that "recursion is
expensive on every machine but the Mac"? Can you show us how recursion
for the Mac is different from every other machine? Or are you just
ignoring Rainer's excellent post by trying to divert the subject to an
issue that is not at all relevant here?
TRO works on other machines, too. You must prove that it doesn't.
("Your mission, should you accept it...") Even Mr Phelps couldn't do
that, and you're already admitting that you've lost by resorting to
such a cheap tactic (yeah, _more_ flamebait).
> Ah, but if you wanted efficient code, you wouldn't be using quicksort
> in the first place.
??
What would you use instead? Bottom-up heapsort?
Michael
--
Michael Schuerig The usual excuse for our most unspeakable
mailto:uzs...@uni-bonn.de public acts is that they are necessary.
http://www.uni-bonn.de/~uzs90z/ -Judith N. Shklar
> I am willing to accept Lisp as a decent programming language,
> if it is based on nonrecursive code, i.e. most of the programming
> in standard iterative code, with a rare occasion of some recursive code.
> What I've seen of Lisp is the exact opposite.
I've told you before that you probably had a bad teacher. Try again,
this time teaching yourself from a good book. There are a number of
them listed in the Lisp FAQ. (Haven't I referred you to that already?)
You lost your argument weeks ago, but you haven't noticed yet. Why not
stop posting disinformation and instead _learn something_? At the very
least, you should get a life and do something else. All you're doing
is giving Lisp programmers a wonderful excuse to post pro-Lisp memes.
While I think this may help kill a few myths, I've been watching this
happen long enough to know that there will always be more people like
yourself, in a few months, to try again.
Always you prove one of two things: 1) how clueless you are about
programming in general, never mind Lisp, and 2) that you have an
ulterior motive for slagging off a language that you don't use. If you
don't use Lisp, why should you even care that other people use it? If
Lisp is as you claim, then surely you would benefit from other people
using it, as they would then be writing slow and expensive code? Your
own arguments show that you are not to be trusted.
If you simply said, "I had a bad learning experience with Lisp, which
is why I'll never touch it again, never look at it again, and never
recommend it to anyone - because I'm assuming that they too will have
painful experiences with it", then we could at least take pity on you
and offer some support. However, you prove yourself to be someone of
much lower principles, resorting to cheap OS flamebait when your
"arguments" are slaughtered by a reference to the native code produced
by a real compiler for a real machine.
Try "showing" us that the Earth is flat - you'll find it much easier.
Guess that makes me an imaginary (or at least complex) programmer.
I deal with recursion quite often in the design of algorithms and
structures. Not all common coding is merely gluing together vendor
components; even if it were, somewhere along the line, all those
"unreal" problems have to be solved. Don't vendors do real work?
: If you want to land yourself a good job, let me tell you...
"Son, the future is... plastics."
Let me give you a further hint. If I'm interviewing a programmer,
a firm grasp of recursion is the LEAST I'll expect of the candidate.
There are more than a few jobs outside the university where code
grinders need not apply.
: Peaceman
Then how about showing a little more pacifism. Folk who make
routine professional use of their comp. sci. backgrounds bridle a
touch when you suggest they don't work in the real world.
--
Christopher Oliver Traverse Communications
Systems Coordinator 223 Grandview Pkwy, Suite 108
oli...@traverse.com Traverse City, Michigan, 49684
"You can even make Emacs look just like vi -- but why?" - C. Musciano
> I am a real programmer, and I can tell you what it's like
> in the real world. There are very few times, if any, that you ever write
> recursive code. For the three years that I've been working, I only
> wrote two recursive functions. That's right, only two. One was for doing
> some parsing for some subdirectories, and the other was some dinky
> descrambler program I wrote in a few minutes.
I really do wish you'd stop putting forward your own limitations
as fundamental facts of life.
> I am willing to accept Lisp as a decent programming language,
> if it is based on nonrecursive code, i.e. most of the programming
> in standard iterative code, with a rare occasion of some recursive code.
Why on earth should anyone care what you're willing to accept as
a decent programming language, when you have shown yourself utterly
incapable of understanding the issues involved?
You have made, over and over and over again, the assertion that
recursion is unnecessary, inefficient and hard to understand.
People have explained that
- there are some problems for which recursion is the best way
of expressing things;
- in most cases where a human can do away with it, a compiler
can too, so inefficiency isn't a reason not to use it;
- for many people, recursion is just as easy to understand as
iteration.
But *you* can't see how a compiler could do a decent job of
optimising recursive code, so you say it's necessarily inefficient;
and *you* have trouble understanding recursive code, so you say
it's hard to understand; and *you* have only ever written two
bits of recursive code, so you say it's unnecessary.
So why, exactly, should we take any notice of your pronouncements?
> If you want to land yourself a good job, let me tell you...
Ah, I see. Lisp is bad because there are more jobs in C++. Well,
there are even more jobs -- and better paying ones, too, by all
accounts -- writing COBOL. Clearly that's an even better language,
yes?
[snip big collection of obiter dicta about the world of programming]
> The question is, where does LISP fit into this picture? I'd
> really like to hear your response.
Anywhere you want it to. Or, rather, anywhere someone who actually
has a clue how to use it wants it to.
If ... you ... code ... an ... algorithm ... using
... iterations ... or ... eliminated ... tail ...
recursion ... in ... place ... of ... standard ...
recursion..., you ... are ... reducing ... the ...
number ... of... calls.
Could you please explain what this witty argument has to do with the cost
of (tail) recursion and your superior compiler knowledge? Or are you bailing
out?
Hartmann Schaffer
> You have made, over and over and over again, the assertion that
> recursion is unnecessary, inefficient and hard to understand.
Sajid Ahmed the Peaceman has demonstrated, over and over and over
again, the limits of his education. The problem is that he assumes
that a poorly taught language is the same as a poor language. He could
have been taught C just as badly, but I guess he was just lucky.
Perhaps it's much easier to find bad teachers using Lisp than it is to
find bad teachers using C? Perhaps it's just easier to teach Lisp
poorly, or harder to teach the language well. I don't know, but at
least I can make the distinction. I've been using C since the early
80s, and I still consider myself a "beginner". While I started with
K&R, which I consider a good C tutorial, I'm not so sure about any of
the other C tutorials. I've read a number of Lisp tutorials, and some
of them are very heavy tomes indeed and I wouldn't recommend them to
beginners. It could be that Sajid Ahmed's teacher used one of these
advanced books, which may be ideal for MIT students, but may be
totally unsuitable for most programmers.
I think that we can give Sajid Ahmed the benefit of the doubt, and
guess that he's not an MIT student. ;) We get the languages that we
deserve, and it may be that Sajid Ahmed deserves C/C++.
hmmm. it is not very often that we find people who use arguments ad
hominem to support their own position.
why should we listen to you? because what you say is wise, true, coherent,
informative, or entertaining? no. because you're a "real" programmer who
can tell people what the "real world" is like? yeah, something like that.
how can we determine whether something is or is not "real" according to
your usage? what is the test to falsify a claim to be "real"? if you fall
through such a test, what are you? a fraud, a fake, a wannabe, a failure?
if you are real but others here are not, are they then supernatural,
omniscient, omnipotent, and/or omnipresent compared to you? surely, they
cannot be any _less_ than you are.
I have wanted to know what this "real world" thing is. however, the more I
hear about it, the less interesting it gets. this is in sharp contrast to
the observable world which gets more and more interesting, what with all
the fascinating research, development, creations, art, and other splendor.
the "real world" of you and other Microsoft victims is one where everything
that is great about the observable world is turned on its head. there's
research, but they're doing everything over again. there's development,
but it's about products with more useless "user functions", not any actual
_development_ -- they're just doing more of the same old shit. there are
creations, but they are insignificant compared to the market share and the
sales figures. there is probably art somewhere, but it's so hard to spot
that I don't even see it. if there is splendor, it's on the cover of
magazines and in their ads.
inside this extremely _superficial_ "real world", we find programmers who
show brilliance and sustained intelligence in solving hard problems. if we
dig below the surface, we find new releases of programming environments
with massive performance improvements, for both programmers and products,
and vast improvements in the ways we work. if we look at the products that
aren't sold by the millions, we find entirely new things we can do that the
consuming public just wouldn't be able to understand for several years. if
we are concerned with more than appearance, we find art in the intelligent
application of new technology to previously non-existing problems. if we
are willing to learn and study, we still find splendor in the work of many
people in this field. not many new ones, but the old ones still sparkle in
the darkness of our times.
many believe there is not much development left in computer science.
considering that people are more willing than ever before to do manual work
(they only need more immediate feedback from pushing the colored levers in
the lab rat interface, er, the graphical user interface) this may be true
at the consumer level, but this is not unlike physics. "consumer physics"
(i.e., the kind of physics that consumers would understand and use) has
probably not seen much development for the past 50 years. however, the
results of the continued "elite" research is ever more present in the
hardware and the technology we use today.
your "real world" is really a televised projection of the observable world.
whatever the TV tells you, "reports" to you, you believe, including the
ads. what's real is not what you experience, but what you're told. the
same is true of the computers the mass market uses: they don't really see
the computer _doing_ anything, only making user-friendly appearances on
their screen, the computerized TV.
who chooses what to display on the computerized TV? who chooses what to
send on the TV? according to research into what people will push levers to
get more of (the mass-market market research), people will push levers to
get more entertainment, more ways to keep from exposing themselves to new
ideas, change, or revolutions. people seek the safety of the known and the
customary, and they find it in the TV. society is "de-controversialized"
(starting with the desire for political correctness), and the approvable is
up for democratic vote.
the "real world" has been created by people who voted for whatever made
them feel safest, and they got to choose among the ads delivered through
their electronic pacifier -- the TV or the computerized TV, pick one.
the observable world becomes visible only when the strong glare from the
propaganda machine cum pacifier has subsided, which means you have to turn
it off, go out to see for yourself, enjoy the strange sensations that come
when you rediscover that you're a sensing being, not just a sink for
prepackaged propaganda and microwaved entertainment (satellite TV).
if you leave the glossy ads in the magazines behind, you will find that
people don't actually produce all that much with these things. they work a
lot, and they scream a lot about it, but what comes out of the "real world"
is just electronically heated air. and that's just what we see from this
bozo the peaceman, too. "consumer ideas for consumer brains."
| The question is, where does LISP fit into this picture? I'd really like
| to hear your response.
in this picture, Lisp is like the internal design of the ant hill busy with
bugs only significant because of their numbers, who don't know what they
are doing, why they are there, or where they are going, except by following
the trail of smell from others of their kind. what makes an ant colony
thrive and survive is the structure of the ant hill, not the busy ants.
the construction of the ant hill and the reason it survives winters and
predators is hard to see by focusing on the busy ants, but film an ant hill
for months and view it at 100 times normal speed, you see it. fortunately,
some people are able to think in larger time frames than 1/24th of a
second, and so don't need the aid of the TV to see the pictures worth
seeing.
...yada yada yada...
> The question is, where does LISP fit into this picture? I'd
>really like to hear your response.
Where does it fit?
Where it has ALWAYS fit!
Up on the ridge overlooking the Valley that seems to be the core of
your little world. Sitting in its lawn chair, drink in its hand, one of
those fruity ones with the umbrella. It thinks "What to do today? Where
can I go next? What can I make new today?".
It counts off on its fingers where it's been before. (Thankfully it
supports BIGNUMS). From OS kernels to applets, AI to scripting, it has
stripes on its arm a mile long. Been there, done that. Computing World
Traveler, Explorer, and Pioneer.
It fits where it always has. A place for others to look to. For others
to wonder "How does it do that?". They try to emulate. Try to make do,
but always at some cost, always compromising. Never quite "making" it.
They specialize on some little nit, say "See I can do it too!", and
discovered that the target they were trying to hit has moved on again,
advanced once more, leaving it behind.
It has been said that those who do not learn Lisp are doomed to
reinvent it. There's also this bumper sticker - "I may be slow, but
I'm ahead of you!".
So, Lisp fits where it always has fit. Rather than being a peg looking
for a hole, it's a sponge capable of filling them all -- simultaneously.
It sits on its chair, takes a sip, and watches the fires in the city
below flare up, spread, and eventually die, and smiles as it sees
another moth, just like you, attracted to the lights of the pyres.
While others, sick of the smoke, try to climb the hill so they can get
a drink.
Welcome, pull up a chair. Come watch with me. Care for a Mai Tai?
Lisp smiles. Perfect fit.
"Specialization is for insects." - R. A. H.
--
Will Hartung - Rancho Santa Margarita. It's a dry heat. vfr...@netcom.com
1990 VFR750 - VFR=Very Red "Ho, HaHa, Dodge, Parry, Spin, HA! THRUST!"
1993 Explorer - Cage? Hell, it's a prison. -D. Duck
>Emergent Technologies Inc. wrote:
>>
>> I guess they didn't teach it right. I took a course at Harvard once
>> (yeah, I get what I deserve, eh?) and came away with the impression
>> that Lisp was a complete waste of time. I changed my opinion
>> rather quickly when I learned it from *real* programmers.
>
> I am a real programmer, and I can tell you what it's like
>in the real world. There are very few times, if any, that you ever write
>recursive code. For the three years that I've been working, I only
>wrote two recursive functions. That's right, only two. One was for doing
>some parsing for some subdirectories, and the other was some dinky
>descrambler program I wrote in a few minutes.
I'm a hobbyist, fluent in Fortran(77 & 90), Basic, C, Pascal, C++, 68k
asm, various scripting languages, and I'm learning Lisp(quickly).
> I am willing to accept Lisp as a decent programming language,
>if it is based on nonrecursive code, i.e. most of the programming
>in standard iterative code, with a rare occasion of some recursive code.
>What I've seen of Lisp is the exact opposite.
People have told you and shown you many times that Lisp compilers can
turn recursion into iteration. So, bitching that "Lisp enforces recursion,
so it's slower than iteration" is pointless.
> If you want to land yourself a good job, let me tell you...
>
>
>1. Database programming
> Big Big demand, anywhere and everywhere. Most companies have
> some kind of database for their customers/clients. SQL and some
> kind of specialty in a platform helps out a lot. If you want to
> get into this field, forget about COBOL. It's still out there,
> but will soon be extinct.
Your point?
>2. Applications programming
> Most companies and people use ms windows, and visual c++ is the way
> to go, though VB is sufficient if you're not going to be doing any large
> projects with several other programmers. MSdos applications
>programming
> is approaching, slowly but surely, the nil level.
What about mainframe programming? If you're a "real programmer," then
you must be familiar with this field. Also, what about embedded
programming?
>3. Network Programming
> This is a very strong field. You really need to know the O.S.
> and platform aspect, more than the programming aspect. There are
> quite a few jobs out there for networking, and administrating
> networks.
Hmm, sounds like a Unix thing, mostly. There isn't much market for VC++
unless WinNT is more popular than I've been led to believe.
>4. Scientific programming
> C is the way to go. Fortran is still around, but will be gone
> as soon as the scientists that only know fortran kick the bucket.
Are you familiar with Fortran90? It has much better optimising for
vector manipulations than C because it was designed for them. I'm a hobbyist,
not a scientist, and I know Fortran. Even my Fortran 77 compiler has better
optimization than my C compiler. BTW, Fortran77 did not allow recursion.
You should LOVE that language.
>5. Web/Java programming
> At the moment there is a demand for Java (and cgi scripting),
> but in my analysis, it won't last long. Why you ask? Let's take a
> look at the so called advantages of Java:
>
> Platform independence: That doesn't work. C, basic, fortran, and
> many other languages have been around for years, and there
> hasn't been any trend to make some kind of on the fly compiler,
> to run the source code on different platforms. If there really
> was a demand for platform independent programs, I'm sure
> we would have seen some kind of C machines, or Fortran machines,
> that correspond with the now JAVA machines.
Actually, since the virtual machines behave quite differently on the
different platforms, Java's portability is kinda shot. BTW, several LISP
systems compile functions on the fly instead of interpreting them. So,
LISP is a highly portable language.
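(And the on-the-fly compilation isn't a vendor trick -- COMPILE is part of the
standard language. A quick sketch at the listener, assuming an implementation
that compiles to native code:)

(defun square (x) (* x x))          ; defined normally
(compile 'square)                   ; compiled at run time
(compiled-function-p #'square)      ; => T
(funcall (compile nil (lambda (x) (+ x 1))) 41)   ; => 42, compiled on the fly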
> Static access to internet resources: HTTP corresponds to the ftp
> protocol of the past, with an easier interface and fancy pictures.
> It's just a matter of time before a protocol is introduced that
> does the same with the telnet. When that happens, say goodbye to
> Java and cgi scripts.
CL-HTTP. Go read about it. It's quite interesting.
> The question is, where does LISP fit into this picture? I'd
>really like to hear your response.
>
>
> Peaceman
B.B.
> >2. Applications programming
> > Most companies and people use ms windows, and visual c++ is the way
> > to go, though VB is sufficient if your not going to be doing any large
> > projects with several other programmers. MSdos applications
> >programming
> > is approaching, slowly but surely, the nil level.
>
> What about mainframe programming? If you're a "real programmer," then
> you must be familiar with this field. Also, what about embedded
> programming.
the point was APPLICATION Programming. yea, and you "real programmers"
never use the gook you write, otherwise it would work as documented.
> >3. Network Programming
> > This is a very strong field. You really need to know the O.S.
> > and platform aspect, more than the programming aspect. There are
> > quite a few jobs out there for networking, and administrating
> > networks.
>
> Hmm, sounds like a Unix thing, mostly. There isn't much market for VC++
> unless WinNT is more popular than I've been led to believe.
have absolutely no doubt, when MS dumped 150 million into Apple, NT will
take over as the new server OS. how? java, they are splitting it up.
i guess SGI was too slow:) secondly, it's that myth of Apples/Macs
are the best thing since sliced bread in the corporate ranks.
on the other front, i don't think it's as much to do with java but
more about ActiveX when MS showed the money. yup, unless the unix
community gets off its high horse about COM/DCOM, they're doomed.
--
Regards,
jason hummel
---------------------------------------------------------
NOTICE TO BULK EMAILERS: Pursuant to US Code, Title 47,
Chapter 5, Subchapter II, 227, any and all nonsolicited
commercial E-mail sent to this address is subject to a
download and archival fee in the amount of $500 US.
E-mailing denotes acceptance of these terms.
http://www.ols.net/~keycad1
mailto:key...@ols.net
---------------------------------------------------------
>4. Scientific programming
> C is the way to go. Fortran is still around, but will be gone
> as soon as the scientists that only know fortran kick the bucket.
You should have seen me rolling around the room laughing!
Whatever pieceman knows, he _doesn't_ know Fortran.
If you want really high scientific computing performance, Fortran is
still the only game in town (well, Sisal is better, and NESL may be,
but C certainly isn't). Modern Fortran has most of the good things
that C has, omits most of the mistakes, and has a ton of useful stuff
that C hasn't got. Ok, f90 isn't the best for everything, but what
language is? I don't see Fortran dying any time soon. (I know more
than 100 languages and still see the usefulness of Fortran.)
>5. Web/Java programming
> At the moment there is a demand for Java (and cgi scripting),
> but in my analysis, it won't last long. Why you ask? Let's take a
> look at the so called advantages of Java:
>
> Platform independence: That doesn't work. C, basic, fortran, and
> many other languages have been around for years, and there
> hasn't been any trend to make some kind of on the fly compiler,
> to run the source code on different platforms.
There isn't even any _source_ compatibility for BASIC, so scratch BASIC.
Fortran programs _are_ source compatible; to move an F90 program from
one 32/64-bit IEEE platform to another (all Java can manage) you just
recompile. If a program's going to run for hours, you don't particularly
mind a bit of up-front compiling.
> If there really
> was a demand for platform independent programs, I'm sure
> we would have seen some kind of C machines, or Fortran machines,
> that correspond with the now JAVA machines.
Well, we do actually have quite a lot of platform independent languages
already. Telescript, Facile, Oberon, TCL, ...
> Static access to internet resources: HTTP corresponds to the ftp
> protocol of the past, with an easier interface and fancy pictures.
> It's just a matter of time before a protocol is introduced that
> does the same with the telnet.
> The question is, where does LISP fit into this picture? I'd
>really like to hear your response.
Why should we believe that? You haven't _listened_ to anything else
that anyone has said.
I like the Xerox phrase: "Power tools for exploratory programming."
But not just exploratory, of course. People are programming huge
telecoms applications in a (somewhat simplified but still strictly
recursive) platform independent (yep, add Erlang to the list of platform
independent "mobile programming languages") descendant of Lisp. What
it buys them is reduced costs. (A factor of 6 fewer lines of code than
C.) This is bet-your-BIG-business stuff. I have two powerful statistics
packages installed here. One of them is an extension of Lisp. The other
_looks_ like C, but is implemented using Lisp technology. Lisp was used
for the Dylan compiler (story I heard is that Apple pulled the plug on
Dylan to appease Sun) which works very well.
I could multiply examples, but you will doubtless contract your picture
to exclude them.
> You should have seen me rolling around the room laughing!
People would be laughing even harder in comp.sys.super. Sajid Ahmed
the Peaceman should really do his research (instead of making it up).
The reality is very different for Fortran users than, say, the PC
users that he'll be used to. Good grief, some of those people don't
even take these small machines seriously! And who can blame them?
It's a big world, and it seems that Sajid Ahmed the Peaceman has yet
to step outside his hometown, never mind see the big city. It's easy
to dismiss a language that won't get much use in the corner of the
world that you inhabit, however large that may be. The survival of
Fortran has _nothing_ to do with what PC users want.
I have the proof that it doesn't work (in all situations).
Rainer seems like a nice guy. I feel bad that he knows a lot
about the Mac (soon to be found only in museums, unless Apple
starts making PC clones called Macs).
Peaceman
If you look at my previous posts (someone said that they were
archiving them) you will see several places where I admitted I didn't
know something, usually followed by people calling me an idiot,
stupid , etc. I have already said that I can accept Lisp as a
decent programming language, if the recursion is done away with.
I will also admit that I was wrong on one of my posts
about turning recursion into iteration, ( I gave the example
of infinitely recursive functions, like sine cos, etc. having
no iterative counterparts). I realized that about ten minutes
after stepping out of the office.
Now, will any of you guys admit that you were wrong?
I thought so.
Peaceman
Sajid Ahmed the Peaceman wrote in article <33EA5A...@capital.net>...
>5. Web/Java programming
> At the moment there is a demand for Java (and cgi scripting),
> but in my analysis, it won't last long. Why you ask? Let's take a
> look at the so called advantages of Java:
>
> Platform independence: That doesn't work. C, basic, fortran, and
> many other languages have been around for years, and there
> hasn't been any trend to make some kind of on the fly compiler,
> to run the source code on different platforms. If there really
> was a demand for platform independent programs, I'm sure
> we would have seen some kind of C machines, or Fortran machines,
> that correspond with the now JAVA machines.
Do you remember the Pascal P-machine of years gone by? Yes, that was a
conceptual forerunner of virtual-machine, compile-on-the-fly platform
independence. Of course, it was so much slower than native code... and the
capabilities of different platforms were... quite different.
What do things look like today? Well, I'd imagine that you'd be
hard-pressed to find a plain old text terminal hooked up to the net
anymore. I'd even go so far as to say that nearly all (ok, most) have
comparable graphics facilities. Further, increases in the processing power
of the machines don't make it that much of a penalty anymore to run
interpreted code. Or even "compiling on the fly".
I'm so glad that Dennis Ritchie and Ken Thompson didn't use the same logic:
If there really was a demand for a small multiprocessor, multiuser OS, I'm
sure someone else would've written it by now. So let's forget about this
Unix stuff.
If there really was a demand for a small language that could be used for
systems programming, we would've seen it already. So let's forget this C
stuff.
Or how about:
If there were a need for a client-server graphics protocol, it would've
been developed already. So let's shut down the X project....
If I remember my history rightly, Java started out as a language which
would be used for set-top boxes. Rather than use existing embedded-systems
tools, they decided to create a virtual machine. That way, the controlling
code would be independent, just the VM would be reimplemented on different
platforms. Then along came the Internet. :-) "Hey, we could use this stuff
in Browsers and on the web!"
C can be platform independent (at the source level). Same with fortran. Of
course, for different platforms you have to recompile the program, or
provide executables for ALL platforms on which you wish your program to
execute. Lots of disk space, and lots of time. Oh yeah, you also have to be
able to verify the proggie works on all the platforms... ;-)
With Java, you just have the bytecode. The bytecode can either be
interpreted OR just-in-time compiled, on the user's machine, across the net.
Realistically, I'd say [personal opinion] that Java would not be where it is
today were it not for the Web.
Had Java not entered the picture, most likely you would've seen someone
coming up with a "platform independent p-code C machine, or fortran, or
<insert favorite language here>"
>
> Static access to internet resources: HTTP corresponds to the ftp
> protocol of the past, with an easier interface and fancy pictures.
> It's just a matter of time before a protocol is introduced that
> does the same with the telnet. When that happens, say goodbye to
> Java and cgi scripts.
We have a protocol that lets you have an easier interface and fancy pictures
on remote machines. It's called X. Of course, would you want strangers on
the net being able to execute any program on your server? Would you want
any site to have access to your local machine? For me, it's NO on both
counts.
>
>
>
> The question is, where does LISP fit into this picture? I'd
>really like to hear your response.
Lisp fits into the above by being yet another programming language. Use it
(or don't) as you see fit.
Dennis
>
>
> Peaceman
>.
>
Yeah, sometimes I ask myself why I bother responding to
you guys. You'll find out for yourself what real programming
is like once you get a job in the real world.
> ....
> Try "showing" us that the Earth is flat - you'll find it much easier.
I'm more into showing whether or not the universe is flat.
Peaceman
Let's take a look at the following example of tail recursion,
(In C++, sorry don't know how to do references in Lisp)
int factorial(int &number) {
int x;
if (number == 1) return 1;
x = number-1;
return number * factorial(x);
}
If you don't know C++ :
int factorial(int *number) {
int x;
if (*number == 1) return 1;
x = *number-1;
return *number * factorial(&x);
}
There you have it, tail recursion that needs a stack.
Peaceman
> Now, will any of you guys admit that you were wrong?
You need to do a lot more than merely claim that you're right. Can you
back it up with any evidence? Apparently not, otherwise you would have
done so already.
Prove that nobody can use Lisp to do real work. Now, there's a
challenge. If you merely asserted that nobody you know uses Lisp, then
there'd be no problem. It wouldn't even surprise me, after looking at
your Dejanews author profile:
131 unique articles posted.
Number of articles posted to individual newsgroups (slightly
skewed by cross-postings):
35 comp.periphs
25 comp.lang.lisp
10 comp.programming
9 comp.os.msdos.programmer
6 comp.sys.ibm.pc.hardware
5 alt.comp.hardware
4 alt.comp
4 comp.hardware
2 alt.comp.hardware.homebuilt
2 alt.os.windows95
2 comp.ai.philosophy
2 comp.ibm.pc.hardware
2 comp.sys.ibm.pc.hardware.misc
1 adass.iraf.programming
1 alt.cd-rom
1 alt.crackers
1 alt.test
1 comp.benchmarks
1 comp.ibm.pc
1 comp.os.ms-windows.misc
1 comp.os.ms-windows.nt
1 comp.os.ms-windows.setup.win95
1 comp.periphs.scsi
1 comp.sys.ibm.pc.hardware.chips
1 comp.sys.ibm.pc.hardware.comm
1 comp.sys.ibm.pc.hardware.systems
1 comp.sys.ibm.pc.misc
1 comp.sys.mac.programmer.help
1 comp.sys.palmtops
1 comp.virus
1 microsoft.public.win95
1 ott.forsale.computing
1 sci.electronics.design
1 tw.bbs.comp.hardware.cpu
It seems that you have a lot to say about PC issues, but that's not
what's being disputed here. Instead, it's your assertions about Lisp. Try
asking Dejanews for profiles of the people disagreeing with you, and
see what it tells you about us.
Name the Lisps that you've used. Are any of them commercial systems,
like LispWorks and Allegro CL? Here's LispWorks:
<URL:http://www.harlequin.co.uk/products/ads/lispworks/lispworks.html>
There's a free version of ACL for you to play with:
<URL:http://www.franz.com/frames/dload.main.html>
Show us some code that runs with ACL/PC (I'm sure you can find a
machine that can run it) that demonstrates the problems that you
describe. Can you do that?
> Yeah, sometimes I ask myself why I bother responding to
> you guys. You'll find out for yourself what real programming
> is like once you get a job in the real world.
Wanna bet? You're making some mighty big assumptions about what we all
do for a living, and what constitutes the "real world". I'd bet that
you know a lot about the _PC world_, esp the lowest common denominator
aspects of it. That, however, still leaves a lot of room for other
things which we can safely call computing.
Meanwhile, see <URL:http://www.franz.com/frames/ha.list.html> for
some "real world" apps written in Lisp. You might be surprised.
BTW, what do you gain by making these assertions? Even if you were
right, do you think that you're doing anyone a favour?
> > Try "showing" us that the Earth is flat - you'll find it much easier.
>
>
> I'm more into showing whether or not the universe is flat.
Hmm. I'm tempted to ask...However, it would probably be off topic in
all three of the newsgroups to which we're posting. Some other time
and place, perhaps. sci.physics.relativity, perhaps.
> There you have it, tail recurion that needs a stack.
Yeah, in C++. Curiously, this has recently been discussed in
comp.lang.lisp - were you reading those posts? It's a C/C++ problem.
Lisp doesn't use references - it doesn't need them. The answer is
amazingly simple: don't use references!
Meanwhile, people reading this can further amuse themselves, if they
wish to, by reading some arguments for and against using Lisp:
<URL:http://www.wildcard.demon.co.uk/lisp/for.html>
<URL:http://www.wildcard.demon.co.uk/lisp/against.html>
I just wish I had more archives to offer, like all the threads in
which people like Peaceman have tried to show that Lisp can't be used
for things that it actually is used for. That way, we could save some time
by all re-reading those threads, and then the C++ people could look for some
_new_ arguments. Obviously the old ones have failed, because some
people are still using Lisp!
<URL:http://www.wildcard.demon.co.uk/archives/>
Peaceman, could you please tell us what you hope to gain by this
attack on a language which you don't use, have no interest in using,
and can't even understand?
... which is NOT an example of tail recursion. Not only are you doing
further computation with the returned value of the procedure, but you're
also passing in an automatic variable.
Following are two somewhat long winded examples with code, but please
bear with me.
Since I'm more comfortable with C:
: int factorial(int *number) {
: int x;
: if (*number == 1) return 1;
: x = *number-1;
: return *number * factorial(&x);
: }
This is not tail recursive. Instead you should have written:
static int fact_aux(int n, int p) {
return n <= 1 ? p : fact_aux(n-1, p * n);
}
int fact(int n) {
return fact_aux(n, 1);
}
Results for 'gcc -O3 -S -fomit-frame-pointer':
.file "fact.c"
.version "01.01"
gcc2_compiled.:
.text
.align 4
.globl fact
.type fact,@function
fact:
movl 4(%esp),%edx ; We fetch our initial argument here.
movl $1,%eax ; We initialize our running product here.
.L11:
cmpl $1,%edx ; Are we done?
jle .L10 ; If so, return our value.
imull %edx,%eax ; Otherwise, compute next running product.
decl %edx ; Decrement iteration counter.
jmp .L11 ; Loop (NOT CALL SUBROUTINE!)
.align 4
.L10:
ret
.Lfe1:
.size fact,.Lfe1-fact
.ident "GCC: (GNU) 2.7.2.1"
I see no explicit recursion here.
With that out of the way, let's examine the compilation with Attardi's
EcoLisp to C of the following tail recursive factorial in Lisp:
(defun fact (n)
(labels ((fact-aux (n p)
(if (<= n 1)
p
(fact-aux (- n 1) (* n p)))))
(fact-aux n 1)))
The meat of the compilation:
/* function definition for FACT */
static L1(int narg, object V1)
{
VT3 VLEX3 CLSR3
TTL:
RETURN(LC2(2, (V1), MAKE_FIXNUM(1)) /* FACT-AUX */ );
}
/* local function FACT-AUX */
static LC2(int narg, object V1, object V2)
{
VT4 VLEX4 CLSR4
TTL:
if (!(number_compare((V1), MAKE_FIXNUM(1)) <= 0))
{
goto L2;
}
VALUES(0) = (V2);
RETURN(1);
L2:
{
object V3;
V3 = number_minus((V1), MAKE_FIXNUM(1));
V2 = number_times((V1), (V2));
V1 = (V3);
}
goto TTL;
}
While there is consing going on here, I see loops built with goto, but
no explicit recursion despite its presence in the original Lisp.
: There you have it, tail recursion that needs a stack.
You're not looking too well, son. Would you like to play some more?
--
Christopher Oliver Traverse Communications
Systems Coordinator 223 Grandview Pkwy, Suite 108
oli...@traverse.com Traverse City, Michigan, 49684
Some mornings it just doesn't seem worth it to gnaw through the
leather straps. -- Emo Phillips
It's true that there is some propaganda on TV and in the
political system, but not to the extent that you make it out to be.
> the "real world" has been created by people who voted for whatever made
> them feel safest, and they got to choose among the ads delivered through
> their electronic pacifier -- the TV or the computerized TV, pick one.
>
That is completely untrue. The real world will always be the
real world, regardless of what people think.
> the observable world becomes visible only when the strong glare from the
> propaganda machine cum pacifier has subsided, which means you have to turn
> it off, go out to see for yourself, enjoy the strange sensations that come
> when you rediscover that you're a sensing being, not just a sink for
> prepackaged propaganda and microwaved entertainment (satellite TV).
>
> if you leave the glossy ads in the magazines behind, you will find that
> people don't actually produce all that much with these things. they work a
> lot, and they scream a lot about it, but what comes out of the "real world"
> is just electronically heated air.
If you want to live in your fantasy world, fine, but one
day reality will catch up to you, and then you'll be in big trouble.
You have to accept reality. There is no way to dodge it. Get out of
your dream man.
Peaceman
I think JAVA is a great language, I just don't think it will catch
on in the computer world. I could be wrong, but that is my analysis of
things.
> > Static access to internet resources: HTTP corresponds to the ftp
> > protocol of the past, with an easier interface and fancy pictures.
> > It's just a matter of time before a protocol is introduced that
> > does the same with the telnet. When that happens, say goodbye to
> > Java and cgi scripts.
> We have protocol that lets you have an easier interface and fancy pictures
> on remote machines. it's called X. Of course, would you want strangers on
> the net being able to execute any program on your server? Would you want
> any site to have access to your local machine? For me, its NO on both
> counts.
> >
CGI scripts are out there and running right this moment. They run
on the server's machine, and the output is sent back to the client
as a web page.
Peaceman
> I have the proof that it doesn't work (in all situations).
> Rainer seems like a nice guy. I feel bad that he knows A lot
> about the MAC (soon to only be found in museums, unless Apple
> starts making PC clones called Macs).
If you have proof that recursion is as expensive as you say, then show
it to us. Otherwise, take your OS flamebait to another newsgroup, one
where they'll appreciate it. Like comp.sys.mac.advocacy.
BTW, why did you not show us this proof sooner?
But this is not a tail-recursive subroutine. Here is a tail-recursive one
(in C):
int factorial_tail(int number, int max,int so_far) {
int x;
if (number > max) return so_far;
return factorial_tail(number+1,max, so_far*number);
}
and you have to use this one to call it:
int factorial(int number) {
return factorial_tail(1,number, 1);
}
And here is Sparc assembly (compiled with gcc)
.file "fact.c"
.version "01.01"
gcc2_compiled.:
.global .umul
.section ".text"
.align 4
.global factorial_tail
.type factorial_tail,#function
.proc 04
factorial_tail:
!#PROLOGUE# 0
save %sp,-104,%sp
!#PROLOGUE# 1
mov %i0,%o1
mov %i2,%o0
.LL6:
cmp %o1,%i1
bg .LL5
nop
call .umul,0
add %o1,1,%l0
b .LL6 <----------- watch this
mov %l0,%o1
.LL5:
ret
restore %g0,%o0,%o0
.LLfe1:
.size factorial_tail,.LLfe1-factorial_tail
.align 4
.global factorial
.type factorial,#function
.proc 04
factorial:
!#PROLOGUE# 0
save %sp,-104,%sp
!#PROLOGUE# 1
mov 1,%o1
mov 1,%o0
.LL15:
cmp %o1,%i0
bg .LL14
nop
call .umul,0
add %o1,1,%l0
b .LL15
mov %l0,%o1
.LL14:
ret
restore %g0,%o0,%o0
.LLfe2:
.size factorial,.LLfe2-factorial
.ident "GCC: (GNU) 2.7.2"
As you see, gcc agrees with me that tail recursion here does not require
a stack.
This doesn't look like tail recursion to me. The recursive call to
factorial is being used as an argument. Try it this way:
int tfact1 (int number, int result) {
    return (number == 0)
        ? result
        : tfact1 (number - 1, number * result);
}
int tfact (int number) {
    return tfact1 (number, 1);
}
Note that what makes this tail recursive is that the return value is
*directly* computed by another call to tfact1. This can therefore be
turned into a loop, and I'd bet a *lot* of C++ compilers can do it
(even though I don't think they can do it for the general case).
All this is covered in chapter 1 of Structure and Interpretation of
Computer Programs. It is also pointed out that the factorial program
as originally written runs in O(n) time and O(n) space, while the
tail recursive one runs in O(n) time and O(1) space. It is this that
makes the so-called recursive version such a dog, not the recursion
itself.
Demo or die.
--
Patric Jonsson,d88...@nada.kth.se;Joy, Happiness, and Banana Mochas all round.
Sajid Ahmed the Peaceman (peac...@capital.net) wrote:
: Let's take a look at the following example of tail recursion,
... which is NOT an example of tail recursion. Not only are you doing
further computation with the returned value of the procedure, but you're
also passing in an automatic variable.
Following are two somewhat long winded examples with code, but please
bear with me.
Since I'm more comfortable with C:
: int factorial(int *number) {
: int x;
: if (*number == 1) return 1;
: x = *number-1;
: return *number * factorial(&x);
: }
This is not tail recursive. Instead you should have written:
static int fact_aux(int n, int p) {
return n <= 1 ? p : fact_aux(n-1, p * n);
}
int fact(int n) {
return fact_aux(n, 1);
}
Results for 'gcc -O3 -S -fomit-frame-pointer':
.file "fact.c"
.version "01.01"
gcc2_compiled.:
.text
.align 4
.globl fact
.type fact,@function
fact:
movl 4(%esp),%edx ; We fetch our initial argument here.
movl $1,%eax ; We initialize our running product here.
.L11:
cmpl $1,%edx ; Are we done?
jle .L10 ; If so, return our value.
imull %edx,%eax ; Otherwise, compute next running product.
decl %edx ; Decrement iteration counter.
jmp .L11 ; Loop (NOT CALL SUBROUTINE!)
.align 4
.L10:
ret
.Lfe1:
.size fact,.Lfe1-fact
.ident "GCC: (GNU) 2.7.2.1"
I see no explicit recursion here nor any stack use within the iteration.
Where's the beef? There
doesn't seem to be any repeated allocation of automatic variables either.
Hmmm... Curious!
: There you have it, tail recursion that needs a stack.
You're not looking too good, son. Would you like to play some more?
--
Christopher Oliver Traverse Communications
Systems Coordinator 223 Grandview Pkwy, Suite 108
oliver -at- traverse -dot- com Traverse City, Michigan, 49684
jason hummel wrote in article <33EB27...@ols.net.NO_SPAM>...
>Bryant Brandon wrote:
>>
>> In article <33EA5A...@capital.net>, peac...@capital.net wrote:
>>
>> >Emergent Technologies Inc. wrote:
>> >>
[Bunch of stuff omitted]
No I didn't.
~jrm
[...]
>2. Applications programming
> Most companies and people use ms windows, and visual c++ is the way
> to go,
If I were doing applications programming in a Windows environment I would
make use of Delphi. In fact, a current project I am working on has
a Delphi front end and a back end written in Lisp.
[...]
>3. Network Programming
> This is a very strong field. You really need to know the O.S.
> and platform aspect, more than the programming aspect.
For some odd reason the IMAP protocol kicks out very lispy responses.
In fact, since network programming is mostly building parsers for protocols,
and Lisp is very good at doing this, Lisp is a very good language for
some types of network programming.
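To make that concrete: IMAP status replies are mostly parenthesised lists, so
for a toy parser you can let the Lisp reader do the tokenising. This is only
a sketch -- a real IMAP parser has to cope with literals, quoted strings and
so on, and PARSE-REPLY is a name I've made up:

(defun parse-reply (line)
  ;; Read a parenthesised server reply as s-expressions.
  ;; *READ-EVAL* is bound to NIL so untrusted input can't run code.
  (let ((*read-eval* nil))
    (with-input-from-string (in line)
      (loop for form = (read in nil :eof)
            until (eq form :eof)
            collect form))))

;; (parse-reply "(UIDVALIDITY 3857529045 UIDNEXT 4392)")
;; => ((UIDVALIDITY 3857529045 UIDNEXT 4392))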
[...]
>4. Scientific programming
> C is the way to go.
For what reason? Most science programming is floating-point number crunching,
for which Lisp has very good libraries.
[...]
>5. Web/Java programming
> At the moment there is a demand for Java (and cgi scripting),
> but in my analysis, it won't last long.
This will be amusing
> Why you ask? Let's take a look at the so called advantages of Java:
>
> Platform independence: [...] there
> hasn't been any trend to make some kind of on the fly compiler,
> to run the source code on different platforms.
There has been massive demand for such a product. Why do you think that
C and Unix gained steam in the first place? Simply because there
were C compilers available for every known piece of hardware. But still, with
C you have to do a lot of hacking to get it to be totally portable.
[...]
> Static access to internet resources: HTTP corresponds to the ftp
> protocol of the past, with an easier interface and fancy pictures.
I would say that HTTP is closer to Gopher than FTP.
> It's just a matter of time before a protocol is introduced that
> does the same with the telnet. When that happens, say goodbye to
> Java and cgi scripts.
Quoi? Telnet is older than FTP and works on a totally different concept.
FTP and telnet live happily together; a new telnet-like protocol will
not cause the end of the web.
--
Please excuse my spelling as I suffer from agraphia see the url in my header.
Never trust a country with more peaple then sheep. Buy easter bilbies.
Save the ABC Is $0.08 per day too much to pay? ex-net.scum and proud
I'm sorry but I just don't consider 'because its yucky' a convincing argument
Come off it! The whole commercial television system exists solely
for the purpose of propaganda. It's called "Advertising". Advertising
not only sells consumable products, it sells a culture. Any good
history of advertising will tell you about the political and cultural
ideas implicitly (but none-the-less powerfully) pushed by advertisements.
Just look at old 50s advertisements some time and see the sex role
stereotyping, class values, and other stuff in just about every advertisement.
Just because it's not labelled "party political broadcast" doesn't mean
it doesn't tend to encourage some ideas and discourage others.
I find most of the advertising on TV here deeply offensive, not so much
because of the products, but because of the sexual, "musical", political,
&c attitudes they ram down viewers' eyes without consent. The "mute"
button on my remote control gets a lot of use.
For one concrete example of political propaganda, consider Pauline Hanson
and her One Nation party. She's an Australian politician with ideas that
would make me very very frightened if I were Jewish or Asian. Or at least,
_maybe_ she is. I _think_ she's a raving racist loony, but I have to say
that her _ideas_ are NEVER given a fair hearing on TV.
And "I am, you are, we are Australian" or whatever the wretched jingle is,
if that isn't propaganda, I don't know what is.
So now we know that pieceman doesn't know anything about the electronic
press either.
From Perlis's "Epigrams on Programming", Sigplan 17 #9, Sept 1982, also
available at
http://www.cs.cmu.edu/~spot/programming-epigrams.html
"42. You can measure a programmer's perspective by noting his attitude
on the continuing vitality of FORTRAN."
--
William B. Clodius Phone: (505)-665-9370
Los Alamos Nat. Lab., NIS-2 FAX: (505)-667-3815
PO Box 1663, MS-C323 Group office: (505)-667-5776
Los Alamos, NM 87545 Email: wclo...@lanl.gov
Don't kill me.
I posted an example in one of my previous posts.
Peaceman
Abuse from Crack Dot Com. http://www.crack.com/abuse
>> How about an
>> application suite?
>
>I guess it depends what you mean by "application suite", but Lisp vendors (eg
>ourselves, and presumably yourselves) have had GUI and DBI frameworks for
>sometime so probably you meant something else...
Perhaps something like {MS, Perfect, Smart} Office. My spreadsheet
SIAG is written in C and Scheme. http://www.edu.stockholm.se/siag
Spreadsheets using Scheme (in particular Guile) are discussed in
the mailing list g...@nortom.com.
Ulric Eriksson
--
I proactively leverage my synergies.
OK, if that's your definition of tail recursion, how about this?
int factorial(int *number, int result) {
int x;
if (*number == 1) return result;
x = *number-1;
return factorial(&x, result * number);
}
I know that this isn't efficient code, but it's just there as an
example. TRO fails when you have a local variable and you're not allowed to
execute the destructor on it. You have to put the local variable
somewhere.
The stack is the appropriate place.
Peaceman
> OK, if that's your definition of tail recursion, how about this?
>
>
> int factorial(int *number, int result) {
> int x;
> if (*number == 1) return result;
> x = *number-1;
> return factorial(&x, result * number);
> }
>
>
>
> I know that this isn't efficient code, but it's just there as an
> example. TRO fails when you have a local variable, and not allowed to
> execute the destructor on it. You have to put the local variable
> somewhere.
> The stack is the appropriate place.
(For those poor benighted souls who haven't read the entire preceding
thread, this is in support of SAtP's claim that tail recursion can't
necessarily be done without using stack space.)
Yes, this is an example of how bad use of C's "features" turns
code that ought to be tail-recursive into code that isn't (at
least in the sense of being subject to tail-call optimisation).
What it shows is that certain points of the definition of the
C language were not designed with tail-call optimisation in mind.
It's an absolutely language-specific problem, and has nothing at all
to do with SAtP's claims that Lisp is slow, recursion is slow, recursion
implies space-inefficiency, etc etc. Lisp doesn't have first-class
reference objects comparable to C's pointers, so that if the compiler
sees
(let ((x (cons 123 123)) (y 234))
;; blah blah blah
(some-function x y))
it knows it's safe to forget about the bindings of X and Y before
calling SOME-FUNCTION, and that remains true whatever arguments
you give to SOME-FUNCTION. On the other hand, the cons cell to which
X is bound is still there; it continues to exist across the call;
but that's because Lisp doesn't (in general) use stack allocation
for "non-immediate" objects. Most systems will allocate the cons
cell in the heap, and pick it up later in a garbage collection if
appropriate. This is rather different from the C idiom of
{ int x[15];
/* blah blah blah */
return foo(x);
}
and possibly a bit less efficient (aha! could this be what SAtP is
trying to get at?), but
- a good garbage collector will result in the efficiency loss
being rather small;
- it's paid for with big winnage in convenience and clarity;
- it avoids the ghastly bugs that can result in C programs from
passing around pointers to stack-allocated objects (if you've
never been bitten by this, you're lucky);
- in the common case where you *don't* pass the object around,
a good compiler is entirely at liberty to stick the whole thing
on the stack. (I don't know whether they actually do this.)
(If you use Henry Baker's idea of "lazy allocation" you can actually
do *all* your allocation, of whatever kind of object, temporary or
not, on the stack. Has anyone actually tried doing this? Is the
necessary write barrier cheap enough to make it worth while?)
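(Standard Common Lisp does at least let the programmer volunteer that
information: a DYNAMIC-EXTENT declaration says an object won't outlive its
binding, and the compiler is then permitted -- though not required -- to
stack-allocate it. A minimal sketch, with made-up names:

(defun sum-three (a b c)
  ;; Without the declaration the list would normally be heap-allocated
  ;; and reclaimed later by the GC; with it, the compiler may put the
  ;; cons cells on the stack, because they don't escape this function.
  (let ((items (list a b c)))
    (declare (dynamic-extent items))
    (reduce #'+ items)))

Whether a given implementation actually takes the hint is another matter,
which is exactly the "at liberty to" point above.)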
--
Gareth McCaughan Dept. of Pure Mathematics & Mathematical Statistics,
gj...@dpmms.cam.ac.uk Cambridge University, England.
Unfortunately Lawrence Livermore no longer supports Sisal. Although the
web pages (http://www.llnl.gov/sisal/), binaries, and source codes are
still available, the Sisal team was dissolved about a year ago. (See a
recent thread in comp.lang.functional). Unless someone else supports it,
Sisal's long term future is bleak.
Lisp uses No references? If that were the case you just stated
something that helps my argument. I will digress, however, because
I think that Lisp probably does use references (aka pointers) in
many situations. If not, and you have an array or string, the array is
*copied* every time you call the recursive function.
Peaceman
Precisely. You're doing something (destroying an automatic variable)
on return from the recursive call, thus I would say this is NOT tail
recursive. It might be possible to transform this code into iteration
using a sufficiently clever compiler, but then you probably bend
the meaning of C's automatic variables. Even then, I could write
a subtly different routine where conventional semantics (i.e. number
is a pointer to stack memory holding an int) are needed for a call
depending on the data passed to the routine. Then, such a transform
wouldn't work. Defeating TRO in a compiler with a given tail recursion
doesn't prove that there is no iterative form for your routines;
it proves your compiler isn't sufficiently clever. Stumping the compiler
doesn't necessarily mean you've found a counterexample.
Red herring: Don't you mean "return factorial(&x, result * *number);"
Gareth McCaughan (gj...@dpmms.cam.ac.uk) wrote:
: (For those poor benighted souls who haven't read the entire preceding
: thread...
Benighted? This must be some usage of the word benighted of which I
was blissfully unaware. ;-)
--
Christopher Oliver Traverse Communications
Systems Coordinator 223 Grandview Pkwy, Suite 108
oliver -at- traverse -dot- com Traverse City, Michigan, 49684
"Getting wrong answers faster is NOT helping the end user." - R. O'Keefe
> : (For those poor benighted souls who haven't read the entire preceding
> : thread...
>
> Benighted? This must be some usage of the word benighted of which I
> was blissfully unaware. ;-)
That was, of course, the point. Happily I'm off on holiday for a couple
of weeks now, so I'll be, er, benighted too. I bet the thread will
still be running when I get back.
> Lisp uses No references? If that were the case you just stated
> something that helps my argument.
How does it do that? Like this, perhaps?
> Pointer arithmetic in C has some advantages. Lisp doesn't have pointer
> arithmetic. Therefore, lisp sucks. Refute.
>
> David Hanley
Lisp doesn't need pointer arithmetic. In fact, a great many languages
get by just fine _without pointer arithmetic_. The same is true for
references. Read a good book about Lisp and you might discover why.
> I will digress, however, because
> I think that Lisp probably does use references (aka pointers) in
> many situations. If not, and you have an array or string, the array is
> *copied* every time you call the recursive function.
That's a different kind of reference. Again, you should read a good
Lisp book. Winston and Horn's Lisp tutorial has an excellent section
on this very issue. Don't assume that Lisp behaves in the same way as
C/C++ - it doesn't. You have to pay attention in order to learn this.
Can you do that, please?
Or are you attacking Lisp simply because it is different from C++?
That would be a plausible explanation for your posts to this thread.
Can you deny it?
BTW, your recursive code was contrived. A cleaner version wouldn't use
a reference to pass a parameter, thus enabling tail recursion. Yes, it
can be done even in C! This has been pointed out to you many times,
but you always ignore it. Are you an idiot or a fool? If you are
neither, why can you not refute C examples? You do know the C
language, don't you? It appears that you do.
Perhaps you refuse to answer these points by choice, because they
destroy your whole argument? It's plausible.
Would you like to show that you're not clueless? That's also easy.
Name the Lisps that you've used. Are any of them commercial systems,
like LispWorks and Allegro CL? Here's LispWorks:
<URL:http://www.harlequin.co.uk/products/ads/lispworks/lispworks.html>
There's a free version of ACL for you to play with:
<URL:http://www.franz.com/frames/dload.main.html>
Show us some code that runs with ACL/PC (I'm sure you can find a
machine that can run it) that demonstrates the problems that you
describe. Can you do that? Why don't you do it?
> Don't kill me.
> I posted an example in one of my previous posts.
And your code was contrived so that would not be tail recursive.
It was a poor use of _C_, never mind recursion.
My point is that if you don't want to use recursion in Lisp, y'don't have
to. No one forces you to do so in C.
;-) It's been pointed out here that Lisp has several nice iterative
constructs.
So, given that, what's the problem?
Dennis
Sajid Ahmed the Peaceman wrote in article <33ECA6...@capital.net>...
>Marco Antoniotti wrote:
>> As per your declaration of "being a real programmer", I (and I assume
>> many other "fake programmers" over here) would like to see a very
>> simple admission of ignorance from you. We read Plato and his
>> description of Socrates: "knowing not to know" is usually a good
>> starting point.
>>
>> Cheers
>> --
>> Marco Antoniotti
>>
>
> If you look at my previous posts (someone said that they were
>archiving them) you will see several places where I admitted I didn't
>know something, usually followed by people calling me an idiot,
>stupid , etc. I have already said that I can accept Lisp as a
>decent programming language, if the recursion is done away with.
>
> I will also admit that I was wrong on one of my posts
>about turning recursion into iteration, ( I gave the example
>of infinitely recursive functions, like sine cos, etc. having
>no iterative counterparts). I realized that about ten minutes
>after stepping out of the office.
>
> Now, will any of you guys admit that you were wrong?
>I thought so.
>
> Peaceman
>.
>
Dennis
Martin Rodgers wrote in article ...
>Richard A. O'Keefe wheezed these wise words:
>
>> You should have seen me rolling around the room laughing!
>
>People would be laughing even harder in comp.sys.super. Sajid Ahmed
>the Peaceman should really do his research (instead of making it up).
>The reality is very different for Fortran users than, say, the PC
>users that he'll be used to. Good grief, some of those people don't
>even take these small machines seriously! And who can blame them?
>
>It's a big world, and it seems that Sajid Ahmed the Peaceman has yet
>to step outside his hometown, never mind see the big city. It's easy
>to dismiss a language that won't get much use in the corner of the
>world that you inhabit, however large that may be. The survival of
>Fortran has _nothing_ to do with what PC users want.
>--
><URL:http://www.wildcard.demon.co.uk/> You can never browse enough
> "There are no limits." -- ad copy for Hellraiser
> Please note: my email address is gubbish
>.
>
> It's like I (and others) have said repeatedly when dealing with Nudds
> and his claims of "real-world" programming. What that means depends a great
> deal on what world you work in. ;-)
Hence the phrase Your Mileage May Vary. ;)
It's unfortunate that Peaceman is so unaware of the capabilities of the
GNU C & MS C/C++ compilers. It helps to recognise when you're out of
your depth. The risk is that you may find yourself teaching your
grandmother to suck eggs.
While Peaceman will have no chance of proving his claims, he is doing
an excellent job of spreading false memes. The fact that we can
counter all his arguments may mean nothing to anyone who suffered from
an education as poor as his. If he's complaining about something,
he's either succeeded in making his point, or he's yet to make it.
Note that he admits the failure of his CS teacher, and yet still
considers himself to be an authority on the subject of recursion and
compilers. Also note that he offers no proof at all for any of his
claims, but simply repeats his lies over and over.
He'd make a great taxi driver. ;)
Martin Rodgers wrote in article ...
>Dennis Weldy wheezed these wise words:
>
>> It's like I (and others) have said repeatedly when dealing with
Nudds
>> and his claims of "real-world" programming. What that means depends a
great
>> deal on what world you work in. ;-)
>
>Hence the phrase Your Mileage May Vary. ;)
>
>It's unfortunate that Peaceman is so unaware of the capabilities of the
>GNU C & MS C/C++ compilers. It helps to recognise when you're out of
>your depth. The risk is that you may find yourself teaching your
>grandmother to suck eggs.
Personally, I was unaware that MS C/C++ compilers would handle the tail
recursion. (I'm assuming Microsoft, and because I wasn't able to read all
msgs in this thread, I figure that it was shown that they would, otherwise
y'wouldn't've said it).
I really think that what Michael Abrash says in his book "Zen of Code
Optimization" really fits, in that y'shouldn't try to look at the code to
see where the bottlenecks are; rather, you should measure, measure, measure.
Trying to optimize by turning a recursive function into iteration (which
depending on the algorithm may make the code that much more difficult to
read and maintain) when it's not a performance problem is, well... not a good
use of your time.
>
>While Peaceman will have no chance of proving his claims, he is doing
>an excellent job of spreading false memes. The fact that we can
>counter all his arguments may mean nothing to anyone who suffered from
>an equally poor education as his. If he's complaining about something,
>he's either succeeded in making his point, or he's yet to make it.
>
>Note that he admits the failure of his CS teacher, and yet still
>considers himself to be an authority on the subject of recursion and
>compilers. Also note that he offers no proof at all for any of his
>claims, but simply repeats his lies over and over.
>
>He'd make a great taxi driver. ;)
Maybe. I'll admit that he is much more pleasant to deal with than Nudds. No
lisp-pusher-religionist crap. ;-)
Dennis
>--
><URL:http://www.wildcard.demon.co.uk/> You can never browse enough
> "There are no limits." -- ad copy for Hellraiser
> Please note: my email address is gubbish
>.
>
I agree with you one hundred percent. The code is
easily translatable into iterative code, and in fact is translated
when it is compiled.
The reason I posted the code was to show that you may need a stack,
even if you have tail recursion. (How it ever led up to this
argument, I don't know.)
> Red herring: Don't you mean "return factorial(&x, result * *number);"
You have a good eye.
Peaceman
They describe joint work done around 1991 between the MIT
Project for Mathematics and Computation and Hewlett-Packard's
Information Architecture Group on the design and construction of
a Supercomputer Toolkit.
This toolkit was used to compute the long-term motion of the
Solar System, improving upon previous integrations by two orders
of magnitude. The report on the analysis was published in Science,
which devoted an editorial to the significance of this achievement.
The toolkit achieves scalar floating point performance equal to eight
times a Cray 1S programmed in Cray Fortran. The approach of
partial evaluation allowed the development of a library of symbolic
manipulation components to support automatic construction of
simulation codes. As an example, the Solar system simulation code
issues a floating-point operation on 98% of the instructions.
What is the toolkit language (and I'm sure you see this coming)?
Lisp.
And not micro-optimized Lisp, but highly abstract code. Here is an
example.
(define add-vectors (vector-elementwise +))
(define (vector-elementwise f)
  (lambda (vectors)
    (generate-vector
      (vector-length (car vectors))
      (lambda (i)
        (apply f (map (lambda (v) (vector-ref v i))
                      vectors))))))

(define (generate-vector size proc)
  (let ((ans (make-vector size)))
    (define (loop i)
      (if (= i size)
          ans
          (begin (vector-set! ans i (proc i))
                 (loop (+ i 1)))))
    (loop 0)))
This code is drawn from the paper and is the type of code used in
programming the toolkit.
Constructs such as APPLY cannot be expressed in C (variable
arity).
This code also uses recursion to describe an iterative
process.
Sajid Ahmed the Peaceman wrote in article <33EA5A...@capital.net>...
>
>4. Scientific programming
> C is the way to go. Fortran is still around, but will be gone
> as soon as the scientists that only know fortran kick the bucket.
>
> The question is, where does LISP fit into this picture? I'd
>really like to hear your response.
It looks to me that it occupies the high ground.
~jrm
> Ok, the subject line is a little misleading.
So I changed it to something closer to the truth. ;)
> I'd just like to draw attention to ``The Supercomputer Toolkit:
> A General Framework for Special-Purpose Computing'' by
> Harold Abelson, Andrew Berlin, Jacob Katzenelson, William
> McAllister, Guillermo Rozas, Gerald Sussman, and Jack Wisdom.
Is this paper on the web and do you have a URL for it? I'd very much
like to read it.
Thanks.
Pointer arithmetic is faster than a list that can hold a variety
of different types as elements. Let's say you want to access the 1000th
element of a list. With pointer arithmetic, you're there in an instant.
With the list of varying elements, you have to traverse 999 elements
before you get to the thousandth. This is true unless you have some array
pointing to the location of each element, but then you're back to pointer
arithmetic again.
Deleting elements is also a lot quicker. The programmer
can write code to easily and quickly free up memory when it's no longer
needed. If you leave it up to a garbage collector, it has to determine
whether or not a particular part of memory is needed before it can free
it... translation .. slow.
> > I will digress, however, because
> > I think that Lisp probably does use references (aka pointers) in
> > many situations. If not, and you have an array or string, the array is
> > *copied* every time you call the recursive function.
>
> That's a different kind of reference.
Do tell. How are they different?
> Again, you should read a good
> Lisp book. Winston and Horn's Lisp tutorial has an excellent section
> on this very issue. Don't assume that Lisp behaves in the same way as
> C/C++ - it doesn't.
I don't think so. I've had enough of Lisp already.
>You have to pay attention in order to learn this.
> Can you do that, please?
> Or are you attacking Lisp simply because it is different from C++?
> That would be a plausible explanation for your posts to this thread.
> Can you deny it?
Nah... I'm attacking Lisp because there's too much recursion in it.
I have yet to see a lisp program without any recursion. I know that
you guys could probably easily write some.
>
> BTW, your recursive code was contrived. A cleaner version wouldn't use
> a refence to pass a parameter,
The code was meant as an example, not something that's clean
and fast.
>thus enabling tail recursion.
You mean TRO.
> Yes, it
> can be done even in C! This has been pointed out to you many times,
> but you always ignore it.
I know that. I've been saying it all along. All recursive code
is translated into 'iterative' assembly/machine language code. Using
recursive functions adds a step.
> Are you an idiot or a fool?
No. Are you?
> If you are
> neither, why can you not refute C examples? You do know the C
> language, don't you? It appears that you do.
>
OK, you got it. Recursive C code is just as bad as
recursive Lisp code. Stick to the iterative form in both languages,
and use recursion only when it's better to do so.
Peaceman
> With the list of varying elements, you have to traverse 999 elements
> before you get to the thousandth. This is true, unless you have some
> array
and
> OK, you got it. Recursive C code is just as bad as
> recursive Lisp code. Stick to the iterative form in both languages,
> and use recursion only when it's better to do so.
I think this should be enough evidence that there is no point
in carrying on this discussion before mr. Peaceman has taken
an elementary Lisp course, could we all agree to close it and
get back to discussing _interesting_ things?
Sigh.
--
Espen Vestre
So what exactly is wrong with these two statements?
At best I see nothing but a difference of opinion.
Personally, in C or in Lisp, I like the idea of "use
recursion only when it's better to do so." You may
not (use recursion when it's worse?), but neither
of these opinions is absolute.
And in C or in Lisp, unless there is an external unrelated
reference to the 1000th element, I don't know of a way to get
to the 1000th element of a list without traversing 999. Do you?
From: Sajid Ahmed the Peaceman <peac...@capital.net>
Newsgroups: comp.lang.lisp,comp.programming,comp.lang.c++
Date: Sat, 09 Aug 1997 14:04:09 -0400
Organization: Logical Net
scha...@wat.hookup.net wrote:
>
> Could you please explain what this witty argument has to do with the cost
> of (tail)recursion and you superior compiler knowledge? Or are you bailing
> out?
>
> Hartmann Schaffer
Let's take a look at the following example of tail recursion,
(In C++, sorry don't know how to do references in Lisp)
int factorial(int &number) {
int x;
if (number == 1) return 1;
x = number-1;
return number * factorial(x);
}
If you don't know C++ :
int factorial(int *number) {
int x;
if (*number == 1) return 1;
x = *number-1;
return *number * factorial(&x);
}
There you have it, tail recursion that needs a stack.
Sorry, but I could not resist joining in the self-satisfied chorus
of all those who are now ROTFLing.
YOU JUST DON'T GET IT!
The factorial example above is what separates the "real" programmers
from the "quiche eaters" :) The function is NOT tail recursive.
You eat too much quiche :)
--
Marco Antoniotti
==============================================================================
California Path Program - UC Berkeley
Richmond Field Station
tel. +1 - 510 - 231 9472
Not quite. Arrays can be mapped to pointers, but they generally have
much cleaner, more easily optimized, and safer semantics than general
pointer arithmetic. (In principle ANSI C pointers allow the detection of
unsafe operations, but compilers have been slow to implement beyond what
is required. C pointers remain difficult to optimize.)
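For the record, Lisp isn't limited to linked lists either. If you want
constant-time access to the 1000th element, you use a vector, and AREF gets
you there directly -- no pointer arithmetic and no walk over 999 cells. A
minimal sketch (THOUSANDTH is just an illustrative name):

(defun thousandth (vec)
  ;; AREF on a vector is an indexed fetch, like indexing an array in C;
  ;; only a plain list would have to be traversed element by element.
  (aref vec 999))

;; (thousandth (make-array 1000 :initial-element 42))  ; => 42

Lists and vectors are both in the language; you pick whichever access
pattern you need.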
> Deleting elements is also a lot quicker. The programmer
> can write code to easily and quickly free up memory when it's no longer
> needed. If you leave it up to a garbage collector, it has to determine
> whether or not a particular part of memory is needed before it can free
> it... translation .. slow.
How easy it is to determine when to delete a reference depends on the
data structure, otherwise problems with memory leaks would not exist.
Garbage collection is rarely significantly slower than hand coding,
indeed, since it is written by an expert, it is often faster than
typical hand coding. A more common problem is that many implementations
put off collecting until it is necessary and then stop all other
processing until the collection is completed. On average this is the
most efficient way to implement the collection, but for real time
systems or systems interfacing with humans this need not be the most
appropriate implementation. There are other ways of implementing garbage
collection however. See Wilson's surveys on garbage collection
http://www.cs.utexas.edu/users/oops/
> <snip most of the clueless discussion of recursive code>
>
> OK, you got it. Recursive C code is just as bad as
> recursive Lisp code. Stick to the iterative form in both languages,
> and use recursion only when it's better to do so.
> <snip>
No, recursion in C is worse than in Lisp because it is more error-prone
and more complicated for a compiler to optimize.
You might also change your nickname. No one who is as aggressive in his
postings as you are would be considered a "Peaceman" by others.
From: tor...@tyr.diku.dk (Torsten Poulin Nielsen)
Newsgroups: comp.lang.lisp,comp.programming,comp.lang.c++
Date: 13 Aug 1997 07:43:57 GMT
Organization: Department of Computer Science, U of Copenhagen
"Sajid Ahmed the Peaceman" wrote:
> OK, if that's your definition of tail recursion, how about this?
>
>
> int factorial(int *number, int result) {
> int x;
> if (*number == 1) return result;
> x = *number-1;
> return factorial(&x, result * number);
> }
>
Sigh, why obfuscate?
Because he is a "real" programmer. :)
Try something like
int fak(int n, int acc)
{
if (n == 0) return acc;
return fak(n - 1, n * acc);
}
On top of that he does not get the very simple fact that his first function
is not tail recursive.
Cheers
Post it again. I'll provide you the iterative translation.
Peaceman
In article <w6wwlq2...@gromit.online.no> Espen Vestre <e...@nextel.no>
writes:
>
> I think this should be enough evidence that there is no point
> in carrying on this discussion before mr. Peaceman has taken
> an elementary Lisp course, could we all agree to close it and
> get back to discussing _interesting_ things?
>
> Sigh.
>
> --
>
> Espen Vestre
Amen!
Many people have tried to help him. It is obvious that he has a vendetta
with Lisp. If he had a point or even knew what he was talking about it
might be worth some discussion. He seems to be merely aping other Lisp
haters that he has come in contact with.
Lisp advocates in general are pretty open minded and are willing to listen
and have open discussions about how Lisp can be improved. But someone like
him who just wants to tear down for no good reason can expect this kind of
result.
--
William P. Vrotney - vro...@netcom.com
Oh yes, I forgot, but then again I guess I'm just one of those poor, clueless
computer scientists who deals in useless things like abstractions and
so on, instead of bits.
>On top of that he does not get the very simple fact that his first function
>is not tail recursive.
I know, but I thought it was futile to mention it. What I don't get is why
on earth he decided to pass `number' as a pointer in the first place. What
purpose should that have? He was so close to getting it right (if we overlook
the fact that the function was wrong, because he forgot to dereference
`number'). He even had the accumulator ...
I get the shivers when I think about the "quality" products his company
must be grinding out.
For reference:
"Sajid Ahmed the Peaceman" wrote:
> int factorial(int *number, int result) {
> int x;
> if (*number == 1) return result;
> x = *number-1;
> return factorial(&x, result * number);
> }
-Torsten
Sigh, why obfuscate?
Try something like
int fak(int n, int acc)
{
if (n == 0) return acc;
return fak(n - 1, n * acc);
}
Which on my MIPS (with SGI's cc) gives something
like
# 1 int fak(int n, int acc)
# 2 {
.ent fak 2
fak:
.option O2
.frame $sp, 0, $31
$32:
.loc 2 2
.loc 2 3
# 3 if (n == 0) return acc;
bne $4, 0, $33
.loc 2 3
move $2, $5
.livereg 0x2000FF0E,0x00000FFF
j $31
$33:
.loc 2 4
# 4 return fak(n - 1, n * acc);
addu $2, $4, -1
mul $5, $4, $5
move $4, $2
b $32 <--------- Branch!
$34:
.livereg 0x2000FF0E,0x00000FFF
j $31
.end fak
Which certainly looks iterative, quite unlike your
bad example.
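For anyone who does not read MIPS assembly, the loop the compiler has
effectively produced is roughly the following (my own paraphrase of the
assembly above, not compiler output):

int fak_iter(int n, int acc)
{
    /* equivalent to fak above after tail-call elimination */
    while (n != 0) {
        acc = n * acc;
        n = n - 1;
    }
    return acc;
}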
-Torsten
> Personally, I was unaware that MS c/C++ compilers would handle the tail
> recursion. (Im assuming microsoft, and because I wasnt able ti read all
> msgs in this thread, I figure that it was shown that they would, otherwise
> y'wuldn'tve said it).
Yes, if you invoke the right incantation, then VC++ 4.0 will perform
TRO. Here's that incantation: "cl fac.c /O1" for the factorial code.
The "/O1" option is short for "minimize space". "/O2" also works.
TRO in C can be tested/demonstrated with the following code:
#include <stdio.h>
#include <stdlib.h>

int *p = NULL;
void r(int i)
{
if (p == NULL)
p = &i;
printf("%d %p\n", i, &i);
if (p != &i)
exit(0);
r(i + 1);
}
void main(void)
{
r(1);
}
> I really think that what Michael Abrash says in his book "Zen of code
> optimization" really fit, in that y'shouldn't try to look at the code to
> see where the bottlenecks are, rather you should measure, measure, measure.
I've read the same advice in a number of other places, too. You can
sometimes be surprised by where the time is really spent.
> Trying to optimize by turning a recursive function into iteration (which
> depending on the algorithm may make the code that much more difficult to
> read and maintain) when it's not a performance problem is well...not a good
> use of your time.
Good knowledge of algorithms is always the place to start, IMHO. Of
course, if you learn about algorithms from a CS book, then the author's
preferences may influence you. This is probably why I have a number of
these books, by a variety of authors.
Computer science is just as fallible as any other science. The people
involved are only human, after all. However, that's no reason to
reject CS. Considering the part that it plays in making these darn
machines do _anything_ at all, never mind something interesting or
useful...but you have to learn some of it in order to appreciate that.
> >He'd make a great taxi driver. ;)
>
> Maybe. I'll admit that he is much more pleasant to deal with than Nudds. No
> lisp-pusher-religionist crap. ;-)
There are extremes, and there are _extremes_. ;) You can even find
hardware people slagging off software, so I guess the choice of
programming language is pretty trivial. Even machine code may be too
"high level"! Not that my current machine is free from hardware bugs.
One of these hardware bugs prevented Linux FT installing, a few years
ago. Ok, ok. So that's just another software problem...
Perfection is great when you can afford it. That's why so many of us
write software, and perhaps why some of it gets shipped before it's
ready - and a few corners may have been cut just to get that far.
Maybe that happens at the hardware level, too. It might explain that
HD controller bug that Linux FT used to object to.
There could be an entire fleet of taxis looking for drivers.
Martin Rodgers wrote in article ...
<snip>
>
>Good knowledge of algorithms is always the place to start, IMHO. Of
>course, if you learn about algorithms from a CS book, then the author's
>preferences may influence you. This is probably why I have a number of
>these books, by a variety of authors.
>
>Computer science is just as fallible as any other science. The people
>involved are only human, after all. However, that's no reason to
>reject CS. Considering the part that it plays in making these darn
>machines do _anything_ at all, never mind something interesting or
>useful...but you have to learn some of it in order to appreciate that.
>
Agreed. And even the "theoretical" results of math/CS research have a strong
impact on what we do today. After all, one can clearly see the concept of
Turing machines in the most modern CPUs. Without Gödel/Turing/Church (or an
equivalent discovery that certain problems are noncomputable) we might still
be trying to solve problems that are unsolvable. ;-)
Dennis.
I have already provided ample evidence for all the points
that I have stated.
>
> Prove that nobody can use Lisp to do real work. Now, there's a
> challenge.
It would be a challenge all right. You can use lisp to
do work. The question is whether or not it's as easy and
efficient as other languages. If you don't use any iterative
code in your lisp program, it's not.
>If you merely asserted that nobody you know uses Lisp, then
> there'd be no problem.
I do assert that nobody I know uses Lisp, or would want
to use lisp.
>It wouldn't even surprise me, after looking at
> your Dejanews author profile:
> ...
I'm glad to see you're looking stuff up about me. When you get
a chance, take a look at my homepage.
http://www.capital.net/com/peaceman
>
> It seems that you have a lot to say about PC issues, but that's not
> being disputed here. Instead, it's your assertions about Lisp. Try
> asking Dejanews for profiles of the people disagreeing with you, and
> see what it tells you about us.
>
> Name the Lisps that you've used. Are any of them commercial systems,
> like LispWorks and Allegro CL? Here's LispWorks:
> ...
I've used a few Lisps in the past, and I'm not about to
use any more, thank you.
Peaceman
Apologies to the C programmers out there. Being a Lisp programmer I
assumed (wrongly) that 'malloc' returns a 'zeroed' structure, which
might not be the case. Hence I introduced what purify calls an
'uninitialized memory bug' in the function 'insert'.
I was wondering what the point of all this was.
Could this be it? Given that Lisp is now well
known to attract attention on newsgroups...
From: Sajid Ahmed the Peaceman <peac...@capital.net>
Newsgroups: comp.lang.lisp,comp.programming,comp.lang.c++
Date: Sat, 09 Aug 1997 13:20:58 -0400
Organization: Logical Net
Reply-To: peac...@capital.net
NNTP-Posting-Host: dialup033.colnny1.capital.net
Mime-Version: 1.0
Content-Type: text/plain; charset=us-ascii
Content-Transfer-Encoding: 7bit
X-Mailer: Mozilla 3.01 (WinNT; I)
Xref: agate comp.lang.lisp:29784 comp.programming:53749 comp.lang.c++:286382
Marco Antoniotti wrote:
> As per your declaration of "being a real programmer", I (and I assume
> many other "fake programmers" over here) would like to see a very
> simple admission of ignorance from you. We read Plato and his
> description of Socrates: "knowing not to know" is usually a good
> starting point.
>
If you look at my previous posts (someone said that they were
archiving them) you will see several places where I admitted I didn't
know something, usually followed by people calling me an idiot,
stupid, etc. I have already said that I can accept Lisp as a
decent programming language, if the recursion is done away with.
I was not making any point about Lisp "per se". My remarks were on
the "general computing" track.
I will also admit that I was wrong on one of my posts
about turning recursion into iteration (I gave the example
of infinitely recursive functions, like sine, cos, etc., having
no iterative counterparts). I realized that about ten minutes
after stepping out of the office.
But you still have not provided the iterative translation of the C
recursive code I posted a couple of weeks ago.
Now, will any of you guys admit that you were wrong?
I thought so.
Wrong about what? Let's agree on the topics we are discussing and we
can start distributing torts and medals.
1. This had nothing to do with what I asked you. Are you into creative
quoting?
2. Only a nut would write factorial like this.
3. Please try to understand tail recursion before you talk about it. This
example isn't tail recursive.
Hartmann Schaffer
International Journal of High Speed Electronics, Vol. 3, Nos. 3 & 4 (1992)
337-361
I would put it on the web myself except that it is copyright by
World Scientific Publishing Company.
Martin Rodgers wrote in article ...
>Emergent Technologies Inc. wheezed these wise words:
>
>> Ok, the subject line is a little misleading.
>
>So I changed it to something closer to the truth. ;)
>
>> I'd just like to draw attention to ``The Supercomputer Toolkit:
>> A General Framework for Special-Purpose Computing'' by
>> Harold Abelson, Andrew Berlin, Jacob Katzenelson, William
>> McAllister, Guillermo Rozas, Gerald Sussman, and Jack Wisdom.
>
>Is this paper on the web and do you have a URL for it? I'd very much
>like to read it.
>
>Thanks.
From: Sajid Ahmed the Peaceman <peac...@capital.net>
Newsgroups: comp.lang.lisp,comp.programming,comp.lang.c++
Date: Wed, 13 Aug 1997 14:27:04 -0400
Organization: Logical Net
Reply-To: peac...@capital.net
Lines: 10
Marco Antoniotti wrote:
>
> But you still have not provided the iterative translation of the C
> recursive code I posted a couple of weeks ago.
>
Post it again. I'll provide you the iterative translation.
Voilà! Remember: no stack or funny 'parent field' allowed. :)
Forget about 'insert'. Just provide the translations for
'preorder_traversal' and 'inorder_traversal'.
------------------------------------------------------------------------------
#include <stdio.h>
#include <assert.h>
#include <stdlib.h>
typedef struct _tree_node
{
int key;
struct _tree_node *left;
struct _tree_node *right;
} tree_node;
void
insert (tree_node *tree, int value)
{
assert (tree);
if (value == tree->key)
return;
else if (value < tree->key)
{
if (tree->left == 0)
{
tree->left = (tree_node*) malloc (sizeof (tree_node));
tree->left->key = value; /* Sorry. No check on malloc return. */
}
else
insert (tree->left, value);
}
else
{
if (tree->right == 0)
{
tree->right = (tree_node*) malloc (sizeof (tree_node));
tree->right->key = value; /* Sorry. No check on malloc return. */
}
else
insert (tree->right, value);
}
}
void
preorder_traversal (tree_node *tree)
{
if (tree == 0)
return;
else
{
printf ("%d ", tree->key);
preorder_traversal (tree->left);
preorder_traversal (tree->right);
}
}
void
inorder_traversal (tree_node *tree)
{
if (tree == 0)
return;
else
{
inorder_traversal (tree->left);
printf ("%d ", tree->key);
inorder_traversal (tree->right);
}
}
void
main()
{
tree_node *root = (tree_node*) malloc (sizeof (tree_node));
__const__ int tree_size = 20;
int count;
/* Sorry. No system error checking.. */
srand(getpid());
root->key = rand() % 100;
for (count = 0; count < tree_size; count++)
insert(root, rand() % 100);
puts("Preorder traversal\n");
preorder_traversal(root);
putchar('\n');
puts("\nInorder traversal\n");
inorder_traversal(root);
putchar('\n');
}
------------------------------------------------------------------------------
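For completeness, here is a minimal sketch of the fix for the
uninitialized-memory bug acknowledged earlier (the helper name make_node is
mine, not from the original post): allocate every node through a function
that clears the child pointers, since malloc does not zero memory.

static tree_node *
make_node (int value)
{
  tree_node *node = (tree_node*) malloc (sizeof (tree_node));
  if (node != NULL)
    {
      node->key = value;
      node->left = NULL;    /* malloc leaves these uninitialized */
      node->right = NULL;
    }
  return node;
}

/* In 'insert' and 'main', the raw malloc calls would then become
   tree->left = make_node (value);   tree->right = make_node (value);
   and  root = make_node (rand () % 100);  respectively. */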
I haven't seen a single line of code.
I haven't seen any specific run times.
It should be quite simple to present some algorithms whose implementations
are slower in LISP than they are in C++, along with the corresponding
run times.
*THAT* would be ample evidence.
>> Name the Lisps that you've used. Are any of them commercial systems,
>> like LispWorks and Allegro CL? Here's LispWorks:
>> ...
>
> I've used a few Lisps in the past, and I'm not about to
>use any more, thank you.
*WHICH* implementations, pray tell?
If it was Xlisp, then we can probably all agree that it's not terribly
efficient. Xlisp was created as a simple PC implementation of LISP.
Drawing any broad conclusions from running things in Xlisp is entirely
foolish.
--
Christopher B. Browne, cbbr...@hex.net, chris_...@sdt.com
PGP Fingerprint: 10 5A 20 3C 39 5A D3 12 D9 54 26 22 FF 1F E9 16
URL: <http://www.hex.net/~cbbrowne/>
Q: What does the CE in Windows CE stand for? A: Caveat Emptor...
In C, function calls are moderately expensive, as they require allocation
of stack frames with the associated memory allocation and cleanup. Because
function calls are thus expensive, functions are encouraged to be of
moderate to large size.
In FORTRAN, the situation seems to be even somewhat more drastic, with
the corresponding result that FORTRAN encourages even larger functions
and subroutines.
With LISP, the fact that functions tend to get heavily factored has
encouraged improving the efficiency of calls. Fewer calls are still
preferable, but this is less critical than it is for the other languages.
> On top of that he does not get the very simple fact that his first function
> is not tail recursive.
Perhaps he does get it but, as he's trolling, chooses to ignore it.
> 1. This had nothing to do with what I asked you. Are you into creative
> quoting?
It appears that he majored in troll writing. ;)
> 2. Only a nut would write factorial like this.
Or a troll writer. aka a nut.
> 3. Please try to understand tail recursion before you talk about it. This
> example isn't tail recursive.
Some people aren't patient enough to try to understand something
before attacking it. They want to burn a book before reading it, ban a
film before seeing it, or in this case, condemn an idea based solely
on the fact that they're too clueless to understand it.
Let the jury decide.
> I have already provided ample evidence for all the points
> that I have stated.
Evidence? Where? You posted some poor C code that made TRO impossible.
That's not evidence for anything other than the fact that compilers
can be fooled into not performing an optimisation.
Can you refute the correctly written C code and the resulting code
produced by GNU C? You've not succeeded so far.
> > Prove that nobody can use Lisp to do real work. Now, there's a
> > challenge.
>
> It would be a challenge all right. You can use lisp to
> do work. The question is whether or not it's as easy and
> efficient as other languages. If you don't use any iterative
> code in your lisp program, it's not.
So you claim. Where's your evidence of this? You admit that your
education failed to teach you Lisp. The C source code you've posted
suggests that it also failed to teach you _C_, but perhaps it was just
a deliberately contrived example, designed to prevent TRO.
> I do assert that nobody I know uses Lisp, or would want
> to use lisp.
Which tells us nothing about Lisp.
> I'm glad to see you're looking stuff up about me. When you get
> a chance, take a look at my homepage.
>
> http://www.capital.net/com/peaceman
I've seen it, some time ago. As I've said. Perhaps you've not been
reading my posts too closely? I won't question your expertise at
solving PC problems, but this is rather different from the issues of
selling PC hardware and software, fixing the problems of products that
fail, etc. We're talking about programming and compiler theory, two
areas which you admit that your education failed to teach you.
Let me ask you this question: how many compilers have you written? How
many books about compilers have you read? If your formal education
failed, then you could have still taught yourself. However, it appears
that you have not done this.
BTW, what do you gain by attacking Lisp? If you were right, then it
could be no possible threat to you! Why then do you go to so much
trouble (producing such weak arguments, but never mind) to slag off
something that couldn't ever hurt you? Did your CS teacher leave you
bitter and twisted, with a need to seek retribution on the language
that you feel was "responsible" for your educational scars?
> > Name the Lisps that you've used. Are any of them commercial systems,
> > like LispWorks and Allegro CL? Here's LispWorks:
> > ...
>
> I've used a few Lisps in the past, and I'm not about to
> use any more, thank you.
Then shut up and go away. I don't like to be offensive, but you've
made your point. So why continue to post your anti-Lisp propaganda?
The phrase "put up or shut up" seems very appropriate, somehow.
You could at least explain why you're doing this. What will you
personally gain by this attack? What kind of threat does Lisp pose
that it motivates you to argue against it? Is it truly Lisp that
bothers you, or just the idea that people out there may be using it?
Anyway, back to recursion. Try compiling the following code with VC++
4.0 or later (you do have a copy of VC++, don't you?) using "cl $*.c
/O1" to invoke the compiler.
#include <stdio.h>
#include <stdlib.h>
int *p = NULL;
void r(int i)
{
if (p == NULL)
p = &i;
printf("%d %p\n", i, &i);
if (p != &i)
exit(0);
r(i + 1);
}
void main(void)
{
// rfact(300, 400);
r(1);
}
You should get an output like the following:
1 0012FF80
2 0012FF80
3 0012FF80
4 0012FF80
5 0012FF80
6 0012FF80
7 0012FF80
8 0012FF80
9 0012FF80
10 0012FF80
11 0012FF80
12 0012FF80
13 0012FF80
14 0012FF80
15 0012FF80
16 0012FF80
17 0012FF80
18 0012FF80
19 0012FF80
20 0012FF80
21 0012FF80
22 0012FF80
23 0012FF80
....
And so on, ad infinitum (that's Latin, BTW). Note that the stack space
is constant. You can check this for yourself. Please do so.
If you can't cope with this, or choose not to, then we can justifiably
call you a troller and dismiss everything you have to say in your
trolls. We've certainly been refuting your claims.
Hmm. I wonder what other C/C++ programmers think of this? Perhaps some
of you could also try the above code and report the result. Remember
to use optimisation. It could be that only VC++ and GNU C support TRO,
but these compilers are available to enough programmers for a few of
you to test Peaceman's claims vs our claims wrt recursion.
Peaceman, your claims belong more in comp.compilers than in
comp.lang.lisp, comp.programming, comp.lang.c++. I might find the
result even more entertaining than this current thread, which I'm
archiving so that when I need a laugh, I can re-read it.
> Pointer arithmetic is faster, than a list that can hold a variety
> of different types as elements. Lets say you want to access the 1000th
> element of a list. With pointer arithmetic, your there in an instance.
> With the list of varying elements, you have to traverse 999 elements
> before you get to the thousandth. This is true, unless you have some
> array
> pointing to the location of each element, but then your back to pointer
> arithmetic again.
Who said anything about lists? You can use arrays, too, y'know.
Oh, but then you wouldn't.
> OK, you got it. Recursive in C code is just as bad as
> recursive Lisp code. Stick to the iterative form in both languages,
> and use recursion only when it's better to do so.
<ahem>
int *p = NULL;
void r(int i)
{
if (p == NULL)
p = &i;
printf("%d %p\n", i, &i);
if (p != &i)
exit(0);
r(i + 1);
}
You might like to put all of your anti-recursion arguments into your
CV. There must be some top class jobs waiting for someone with your
talents. Not necessarily _programming_ jobs, of course.
Try marketing and/or advertising. You'll be right at home.
> #include <stdio.h>
>
> int *p = NULL;
>
> void r(int i)
> {
> if (p == NULL)
> p = &i;
> printf("%d %p\n", i, &i);
>
> if (p != &i)
> exit(0);
>
> r(i + 1);
> }
>
> void main(void)
> {
> // rfact(300, 400);
> r(1);
> }
Metrowerks CodeWarrior Pro 1 does not optimize the tail recursion.
Michael
--
Michael Schuerig Happiness is good health and a bad memory.
mailto:uzs...@uni-bonn.de -Ingrid Bergman
http://www.uni-bonn.de/~uzs90z/
No, but no one in his right mind would use such a data structure. If you must
access elements in a list at position 1000, then your design is wrong. Switch to
a data representation with arrays or hashtables. You have it right there in Lisp.
You have a lot more infrastructure built for you in Lisp than in C. Use it!
Andreas
The preceding paragraph was brought to you by the "real world".
Have fun!
Reality is for folks who can't handle Lisp.
David Thornley
Arrays would be faster than hashtables, if it is possible
to use them.
> O.k., let's do a reality check:
>
> ; Here comes a bit Lisp code. Are you still with me?
>
> ; Let's generate a pair of a random string and a number.
> ; The number is the sum of the character ascii values in the string.
>
> (defun make-pair ()
> (loop with length = (random 100)
> with string = (make-string length)
> for i below length
> for char = (code-char (+ 32 (random 90)))
> summing (char-code char) into sum
> do (setf (aref string i) char)
> finally (return (cons sum string))))
>
> ; let's define a size parameter of 10000.
>
> (defparameter *size* 10000)
>
> ; Now we make a list of 10000 such pairs.
>
> (defparameter *the-list*
> (loop repeat *size*
> collect (make-pair)))
>
> ; Now we put the same elements into an array.
>
> (defparameter *the-vector*
> (loop with vector = (make-sequence 'vector *size*)
> for item in *the-list*
> for i below (length vector)
> do (setf (aref vector i) item)
> finally (return vector)))
>
> ; Now it's time for a hashtable of the same data.
> ; keys are the numbers and values are the strings.
>
> (defparameter *the-hashtable*
> (loop with table = (make-hash-table )
> repeat *size*
> for (key . value) in *the-list*
> do (setf (gethash key table) value)
> finally (return table)))
>
> ; Now we want to access a string indexed by a number from
> ; the various datastructures.
>
> (defun test (item)
> (print (time (find item *the-list* :key #'first)))
> (print (time (find item *the-vector* :key #'first)))
> (print (time (gethash item *the-hashtable*))))
>
> ; Let's see at what position the number 4990 (a guess) is in the list?
>
> ? (position 4990 *the-list* :key #'first)
> 9995
>
> ; Now what results do you expect?
> ? (test 4990)
>
> (FIND ITEM *THE-LIST* :KEY #'FIRST) took 22 milliseconds (0.022 seconds) to run.
> (FIND ITEM *THE-VECTOR* :KEY #'FIRST) took 29 milliseconds (0.029 seconds)
> to run.
> (GETHASH ITEM *THE-HASHTABLE*) took 0 milliseconds (0.000 seconds) to run.
>
> ;Finding the 9995th element in this 10000 element unsorted list takes
> ;22 milliseconds on my Mac.
> ;Finding the 9995th element in this 10000 element unsorted array takes
> ;29 milliseconds on my Mac.
> ;Finding the value for the key 4990 in a hashtable is (way) below 1 millisecond
> ;on my Mac.
>
> ; Direct access of element 9995 in my list:
>
> ? (time (nth 9995 *the-list*))
> (NTH 9995 *THE-LIST*) took 15 milliseconds (0.015 seconds) to run.
>
> ; Direct access of element 9995 in my vector:
>
> ? (time (aref *the-vector* 9995))
> (AREF *THE-VECTOR* 9995) took 1 milliseconds (0.001 seconds) to run.
>
> Conclusion: Everything looks like we expect it. Lisp is fast.
That's the kind of lisp code I like to see.
It's good that you're using iterative, nonrecursive code.
If you wrote the above in recursive only code, it would take at
least twice as long to run, and about ten times as long to write.
>
> > With pointer arithmetic, your there in an instance.
> > With the list of varying elements, you have to traverse 999 elements
> > before you get to the thousandth. This is true, unless you have some
> > array
>
> Yes, let's simply take an array.
>
> > pointing to the location of each element, but then your back to pointer
> > arithmetic again.
>
> With the difference that in Lisp array operations usually are
> safe compared to pointer arithmetic in C. First in Lisp
> you know that you access an array (otherwise you'll get an error,
> this is the idea of typed data) and second access outside the
> array range isn't possible, too (you also get an error here).
>
That also means that the compiler needs to add code to check
the bounds of the array, as well as the string that states the error
message, to say the least.
> > Deleting elements is also a lot quicker.
>
> How do you delete? Faster than what?
>
> > The programmer
> > can write code to easily and quickly free up memory when it's no longer
> > needed. If you leave it up to a garbage collector, it has to determine
> > whether or not a particular part of memory is needed before it can free
> > it... translation .. slow.
>
> The programmer can write code to easily and quickly free up memory.
>
> Translation: error prone. Duplicate code all over the system. Slow.
>
Perhaps, but it depends on the programmer.
> > OK, you got it. Recursive in C code is just as bad as
> > recursive Lisp code. Stick to the iterative form in both languages,
> > and use recursion only when it's better to do so.
>
> And that's quite often, since a lot of data structures
> (trees, etc.) have recursive definitions. I like it.
>
Well, in real-world programming (at least for C++),
you never write any code to access data structures. C++ has the
famous container classes that lets you implement linked lists,
binary trees, arrays, and almost any other structure that you can
think of. This enables a programmer to reuse code, and not have
to rewrite it every time the programmer wants to use a particular
data structure.
Peaceman
From: mar...@infiniti.PATH.Berkeley.EDU (Marco Antoniotti)
Newsgroups: comp.lang.lisp,comp.programming,comp.lang.c++
Date: 14 Aug 1997 16:05:27 -0700
Organization: California PATH Program - UC Berkeley
Path: agate!usenet
Lines: 101
Sender: mar...@infiniti.PATH.Berkeley.EDU
...
The "famous container classes" of C++ are dispersed in a plethora of
incompatible class libraries. If you use the MFC in your program and
then need to link in a piece of code developed with some other library
(I am not mentioning the STL, since I do not know exactly where it
compiles) then your run-time image bloats.
Ok. Apologies again. My terminology is incorrect. `run-time image
bloat' is not quite appropriate. :}
Why I do this, I do not know.... :{
In article <33F35D...@capital.net> Sajid Ahmed the Peaceman <peac...@capital.net> writes:
From: Sajid Ahmed the Peaceman <peac...@capital.net>
Newsgroups: comp.lang.lisp,comp.programming,comp.lang.c++
Date: Thu, 14 Aug 1997 15:33:36 -0400
Organization: Logical Net
Reply-To: peac...@capital.net
Lines: 166
Rainer Joswig wrote:
> Common Lisp has for example lists, arrays and hashtables (and some more).
> Each datastructure has different access, delete or insert speed.
> But this is independent of any programming language and
> has nothing to do with Lisp. If for a particular application
> lists are too slow - use arrays. If arrays are too slow, for
> example for indexed element access, use hashtables.
>
Arrays would be faster than hashtables, if it is possible
to use them.
Yeah. Tell me how to index a set of elements with no prior knowledge
of the size using a string as key.
Your ignorance of Basic Data Structures and their relative strength is
appalling.
> O.k., let's do a reality check:
>
> ; Here comes a bit Lisp code. Are you still with me?
>
...
> Conclusion: Everything looks like we expect it. Lisp is fast.
That's the kind of lisp code I like to see.
It's good that you're using iterative, nonrecursive code.
If you wrote the above in recursive only code, it would take at
least twice as long to run, and about ten times as long to write.
You just don't get it. In this case writing a recursive piece of code
would have required a different structure. The point that Rainer was
making was that: (a) you have the most powerful iteration construct
available in Common Lisp and (b) different data structures behave
differently.
...
>
> The programmer can write code to easily and quickly free up memory.
>
> Translation: error prone. Duplicate code all over the system. Slow.
>
Perhaps, but it depends on the programmer.
I am a much better programmer than you are (seeing your code has
convinced me of that) and yet I make *a lot* of stupid mistakes (mostly
memory leaks) when I program in C/C++.
> > OK, you got it. Recursive in C code is just as bad as
> > recursive Lisp code. Stick to the iterative form in both languages,
> > and use recursion only when it's better to do so.
>
> And that's quite often, since a lot of data structures
> (trees, etc.) have recursive definitions. I like it.
>
Well, in real-world programming (at least for C++),
you never write any code to access data structures. C++ has the
famous container classes that lets you implement linked lists,
binary trees, arrays, and almost any other structure that you can
think of. This enables a programmer to reuse code, and not have
to rewrite it every time the programmer wants to use a particular
data structure.
The "famous container classes" of C++ are dispersed in a plethora of
incompatible class libraries. If you use the MFC in your program and
then need to link in a piece of code developed with some other library
(I am not mentioning the STL, since I do not know exactly where it
compiles) then your run-time image bloats.
This is one of the strengths of Common Lisp (the API and Data
Structures that you really need, are mostly there and standardized)
that the Java development team learned very well.
Your statement above shows again a partial knowledge of "the real
world" and of its problems. :)
Cheers
> Pointer arithmetic is faster, than a list that can hold a variety
> of different types as elements. Lets say you want to access the 1000th
> element of a list.
You simply have detected that different data structures have
different properties.
Common Lisp has for example lists, arrays and hashtables (and some more).
Each datastructure has different access, delete or insert speed.
But this is independent of any programming language and
has nothing to do with Lisp. If for a particular application
lists are too slow - use arrays. If arrays are too slow, for
example for indexed element access, use hashtables.
O.k., let's do a reality check:
; Here comes a bit Lisp code. Are you still with me?
; Let's generate a pair of a random string and a number.
(defparameter *size* 10000)
Conclusion: Everything looks like we expect it. Lisp is fast.
> With pointer arithmetic, your there in an instance.
> With the list of varying elements, you have to traverse 999 elements
> before you get to the thousandth. This is true, unless you have some
> array
Yes, let's simply take an array.
> pointing to the location of each element, but then your back to pointer
> arithmetic again.
With the difference that in Lisp array operations usually are
safe compared to pointer arithmetic in C. First in Lisp
you know that you access an array (otherwise you'll get an error,
this is the idea of typed data) and second access outside the
array range isn't possible, too (you also get an error here).
;;; Let's get an array of 20 elements of random numbers
? (defparameter *numbers*
(let ((array (make-array 20)))
(loop for i below (length array)
do (setf (aref array i)
(random 200))
finally (return array))))
*NUMBERS*
;;; here it is
? *numbers*
#(93 191 106 33 28 114 160 115 136 2 5 188 172 92 104 93 27 187 95 194)
;;; access out of bounds
? (aref *numbers* (random 50))
> Error: Array index 30 out of bounds for #<SIMPLE-VECTOR 20> .
> While executing: "Unknown"
> Type Command-. to abort.
See the Restarts… menu item for further choices.
1 >
Aborted
;;; Try to access something not being an array as an array
? (aref nil 30)
> Error: value NIL is not of the expected type ARRAY.
> While executing: CCL::%AREF1
> Type Command-. to abort.
See the Restarts… menu item for further choices.
1 >
>
> Deleting elements is also a lot quicker.
How do you delete? Faster than what?
> The programmer
> can write code to easily and quickly free up memory when it's no longer
> needed. If you leave it up to a garbage collector, it has to determine
> whether or not a particular part of memory is needed before it can free
> it... translation .. slow.
The programmer can write code to easily and quickly free up memory.
Translation: error prone. Duplicate code all over the system. Slow.
> I don't think so. I've had enough of Lisp already.
Still you don't understand basic issues. How come?
> Nah... I'm attacking Lisp because theres too much recursion in it.
I like Lisp software with recursion. It often makes code clearer
and shorter.
> I have yet to see a lisp program without any recursion. I know that
> you guys could probably easily write some.
Sure. Common Lisp has a lot of constructs for iterative programming.
Probably more than most other programming languages.
You may want to read the documentation for the LOOP
construct of ANSI Common Lisp:
http://www.harlequin.com/education/books/HyperSpec/Body/mac_loop.html
It provides more features for iterative programming than
you might be able to deal with. ;-)
> OK, you got it. Recursive in C code is just as bad as
> recursive Lisp code. Stick to the iterative form in both languages,
> and use recursion only when it's better to do so.
And that's quite often, since a lot of data structures
(trees, etc.) have recursive definitions. I like it.
It's there as an example.
> I get the shivers when I think about the "quality" products his company
> must be grinding out.
>
The products I grind out are so fast, it will make you shiver :)
Peaceman
>> I'd just like to draw attention to ``The Supercomputer Toolkit:
>> A General Framework for Special-Purpose Computing'' by
>> Harold Abelson, Andrew Berlin, Jacob Katzenelson, William
>> McAllister, Guillermo Rozas, Gerald Sussman, and Jack Wisdom.
>
>Is this paper on the web and do you have a URL for it? I'd very much
>like to read it.
It's on the net---ftp to publications.ai.mit.edu and go to
ai-publications/1000-1499. It's AIM-1329.ps.
--
-Fred Gilham gil...@csl.sri.com
``The road to tyranny is paved with human rights....Once the state
sets out to guarantee a right, it will destroy every obstacle that
stands in the way of liberating the oppressed.'' --Thomas Fleming
> Arrays would be faster than hashtables, if it is possible
>to use them.
Can you tell us in what situations arrays are faster than hash tables?
Or vice versa?
I trust that you realize that linked lists provide unordered access in
O(N) time, which is as fast as is possible for this sort of operation,
supposing you need to visit every element...
With the apparent quality of your computer science education, answering
any of this may be problematic...
> No, but no one in his right mind would use such a data structure. If you must
> access elements in a list at position 1000, then your design is wrong. Switch to
> a data representation with arrays or hashtables. You have it right there in Lisp.
> You have a lot more infrastructure built for you in Lisp than in C. Use it!
Trees are also useful for quickly finding an object identified by a
key. In an array, you have a simple range of keys that can be mapped
directly to physical addresses, but when the keys are sparse, or
worse, not even numeric, then a tree is much better.
You could even use a hash table of trees. I tend to use hash tables of
lists, but that's because the number of collisions tends to be small.
In a program where I expected collisions to be more frequent, I might
well consider either rehashing or trees. Perhaps even a simple table
of sorted items would do, as long as the number of collisions is
small enough to keep insertion cheap, or insertions are infrequent. A
binary search could then be used for resolving lookup collisions.
In any case, some kind of profiling would help determine how well the
solution is working, plus where and how it might need improving.
I regret not having profiling in VC++, but it's possible to add manual
profiling in this case. I just count the number of string comparisons.
This works well, regardless of which language I use. It's all pretty
simple stuff that you can find in basic CS books. For hashing strings,
I use the hashing function from the "Dragon" book on compiler theory:
#define PRIME 211
#define EOS '\0'    /* end-of-string marker used in hash() below */
unsigned long hash(const char *s)
{
register char *p;
register unsigned long h, g;
h = 0;
for (p = (char *) s; *p != EOS; p++) {
h = (h << 4) + (*p);
g = h & 0xf0000000;
if (g) {
h = h ^ (g >> 24);
h = h ^ g;
}
}
return (h % PRIME);
}
I expect this is obvious to most people reading this, so I'm only
posting it for the benefit of people like Peaceman. ;-) I doubt that
he's read the "Dragon" book, because if he had read it, he might be
posting some better quality arguments, as they'd be based on some
connection with reality.
On the other hand, there's a lot of compiler theory that may be too
close to maths for Peaceman. The above code, for example. Never mind
ideas like strength reduction, data flow analysis, graphs, NFA to DFA
transformations, etc.
So let's start with something simple, like hash tables. That shouldn't
be too challenging, should it? Considering how useful these things can
be, the small effort required easily pays for itself.
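As a rough sketch of how such a hash function gets used (the names
table_insert, table_lookup, same_key and the comparison counter are mine,
purely for illustration; it assumes the PRIME constant and hash() defined
above), here is a chained table, with the manual profiling mentioned above
done by counting string comparisons:

#include <string.h>
#include <stdlib.h>

struct entry {
    char *key;
    struct entry *next;
};

static struct entry *table[PRIME];   /* one chain per bucket */
static long string_compares = 0;     /* manual profiling counter */

static int same_key(const char *a, const char *b)
{
    string_compares++;
    return strcmp(a, b) == 0;
}

/* Return non-zero if key is already in the table. */
int table_lookup(const char *key)
{
    struct entry *e;
    for (e = table[hash(key)]; e != NULL; e = e->next)
        if (same_key(e->key, key))
            return 1;
    return 0;
}

/* Insert a copy of key unless it is already present.
   (Sorry, no check on malloc return, in the spirit of this thread.) */
void table_insert(const char *key)
{
    unsigned long h = hash(key);
    struct entry *e;

    for (e = table[h]; e != NULL; e = e->next)
        if (same_key(e->key, key))
            return;

    e = (struct entry *) malloc(sizeof *e);
    e->key = (char *) malloc(strlen(key) + 1);
    strcpy(e->key, key);
    e->next = table[h];
    table[h] = e;
}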
> Metrowerks CodeWarrior Pro 1 does not optimize the tail recursion.
Too bad. Are you sure that you just didn't give it the right options? Even
GNU C will fail to perform TRO if you don't give it the right
incantation. Still, a few years ago I didn't know of _any_ C compilers
that could do this. It's possible that CodeWarrior will also be able
to do it, once Metrowerks decide that it's worth doing.
If Peaceman doesn't convince them that it can't be done, that is.
Since he appears to know far less about compiler theory than
Metrowerks, I won't worry about it. As I've said, I'd love to see him
try this troll in comp.compilers.
Let's go through the summary of this thread:
I made a post that LISP programs were slow, and that
Lisp programmers were deceived because of their abstract
mathematical worlds, i.e. thinking computer functions returning
instantaneously. I got about 10 responses, mostly flames, some
legitimate saying that Lisp supports iteration. I then said
I could accept Lisp as a decent language if it indeed was
an iterative language, and I apologized for calling all Lisp
programmers as deceived. I then got more flames and name calling.
I then gave the example of the Qsort algorithm taking several
lines in Lisp, and just a few in C. Someone posted a 15 line Qsort
lisp program. I then pointed out that the program that was posted
was not "good" lisp code because of the use of local (automatic)
variables. I fixed up the code to get rid of the local variables,
and got a program with 80+ lines. I then got more flames about the
style that I used, and someone changed my code and put several
statements on each line, and brought it down to about 24 lines.
Then there were several comments from people in the Lisp group
about some implementations of C using shell sort instead of the
standard Qsort, with more flames thrown at me throughout.
I then pointed out that the reason the Lisp code is slow is because
of the overhead of recursion, i.e. a stack. You guys then said
that recursion doesn't need a stack because of tail recursion,
and tail recursion optimization, with more flames added. I then gave
an example of tail recursion that needed a stack, because of the
numerous posts about tail recursion, even though my argument was
about all recursion in general. I got more flames and name calling
as expected. Now the thread has moved on to pointers and references
in Lisp.
Flame me and call me as many names as you want. It doesn't
do anything.
>
> Anyway, back to recursion. Try compiling the following code with VC++
> 4.0 or later (you do have a copy of VC++, don't you?) using "cl $*.c
> /O1" to invoke the compiler.
>
> #include <stdio.h>
>
> int *p = NULL;
>
> void r(int i)
> {
> if (p == NULL)
> p = &i;
> printf("%d %p\n", i, &i);
>
> if (p != &i)
> exit(0);
>
> r(i + 1);
> }
>
> void main(void)
> {
> // rfact(300, 400);
> r(1);
> }
>
That's a good observation you've made with the Visual C++
compiler and its optimization techniques. Some people claim this
as a bug, others as a feature.
Peaceman
>> I know, but I thought it was futile to mention it. What I don't get is why
>> on earth he decided to pass `number' as a pointer in the first place. What
>> purpose should that have? He was so close to getting it right (if we overlook
>> the fact that the function was wrong, because he forgot to dereference
>> `number'). He even had the accumulator ...
>
> It's there as an example.
Hmm, but that still doesn't explain the use of a reference when ordinary
pass-by-value would be both sufficient and better. What purpose did it serve?
I'm assuming here that you had a purpose. Did you want to show that C is
difficult to compile efficiently? I fail to see how it demonstrated any
of your points regarding Lisp.
> The products I grind out are so fast, it will make you shiver :)
<grin> but are they correct?
-Torsten
100% agreed! The reading of Mr Peaceman's postings makes me think that
he is one
of those guys who prefers tweaking his code rather than rethinking
the algorithms used when facing non-trivial problems. No matter how well
optimized a piece of code that implements an exponential algorithm is,
using another algorithm that performs the task in polynomial time will
always be more efficient - well right, I know that solving the simplex
problem is usually done using an exponential algo rather than its
polynomial counterpart because it's faster for typical problem
instances, but this is an exception.
Don't get me wrong though, I think that code optimization is an important
step but that it is only a "small step" in the process of writing good
programs.
> I am a much better programmer than you are (seeing your code has
> convinced me of that) and yet I make *a lot* of stupid mistakes (mostly
> memory leaks) when I program in C/C++.
And you're by far not the only one! The mere existence of products like
purify (a fine product BTW) tends to prove this point.
> Rainer Joswig wrote
> > Sajid Ahmed the Peaceman wrote
> > > OK, you got it. Recursive in C code is just as bad as
> > > recursive Lisp code. Stick to the iterative form in both languages,
> > > and use recursion only when it's better to do so.
> >
> > And that's quite often, since a lot of data structures
> > (trees, etc.) have recursive definitions. I like it.
> >
IMHO, the bottom line of this all is that the use of recursion/iteration
depends on the nature of the problem to solve - gee, sounds as if I just
wrote that the water is wet - and that average lisp programmers seem to
tackle problems that are more recursive in nature than average C/C++
programmers - or at least used to tackle, many former lisp programmers
have been forced into using C/C++, I'm one of them.
One thing I'm sure of is that stating that lisp is slow because it calls
for writing recursive code is total and utter nonsense. Had Mr Peaceman
mentioned the lack of performance of lisp code due to factors such as,
for instance, the boxing/unboxing of numbers (in my opinion these
problems appear only in poorly written lisp code), I would have
considered him knowledgeable enough to discuss the matter in this
newsgroup.
His postings, however, show that he is only ranting and that he doesn't
deserve more attention in this newsgroup.
Eric Sauthier
Unless you know that the arithmetic or array access or whatever is
in a highly-used spot in your program, which normally means running
it and profiling it, correctness is usually the better choice.
You're overlooking the "strong typing" argument also. In Lisp,
an array is an array, and the Lisp system should reject it if it
isn't. In C or C++, an array is a pointer, and any arbitrary
pointer can be used as an array. The freedom this gives you is
not useful most of the time, and it permanently restricts the
C or C++ compiler's ability to optimize code. Common Lisp can
crunch numbers as fast as Fortran, depending on implementations,
while C and C++ can't. The reason is that explicit pointer
arithmetic is legal in C and C++, and not in Common Lisp or Fortran,
and this inhibits the compiler from certain optimizations.
>
>> > Deleting elements is also a lot quicker.
>>
>> How do you delete? Faster than what?
>>
>> > The programmer
>> > can write code to easily and quickly free up memory when it's no longer
>> > needed. If you leave it up to a garbage collector, it has to determine
>> > whether or not a particular part of memory is needed before it can free
>> > it... translation .. slow.
>>
Modern garbage collectors are pretty much as fast as manual memory
management. They tend to clump up the management, though, so you can
get annoying delays. You can cut down on the delays if you're willing
to take a slight overall performance hit.
Getting memory management right is very difficult, particularly in C,
and it's the source of lots of bugs. I've seen references to memory
management taking about 30% of the programming effort in large C
projects, and I'm willing to believe it. Many large Unix programs
leak memory like crazy and trust the operating system to clean up
after the program is through.
>> The programmer can write code to easily and quickly free up memory.
>>
>> Translation: error prone. Duplicate code all over the system. Slow.
>
> Perhaps, but it depends on the programmer.
>
Really? One of the best programmers I know, who worked in a very
well-run shop, was involved in shipping a program that had a bug in
it, simply because he'd freed a data object and then used it. If
said friend can't handle memory management precisely, most people
will write error-prone code, and if they try to compensate for it
it'll wind up slow.
Look at all the discussion of reference-counting pointers in C++
books. There are a whole lot of very good programmers who are
using a very primitive form of garbage collection because they
can, and can't get anything better.
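For anyone who has not seen it spelled out, reference counting boils down to
something like this bare-bones C sketch (all the names are mine; it ignores
cycles and thread safety, which is part of what makes it "primitive"):

#include <stdlib.h>

typedef struct refobj {
    int refcount;
    /* ... payload fields would go here ... */
} refobj;

refobj *refobj_new(void)
{
    refobj *p = (refobj *) malloc(sizeof(refobj));
    if (p != NULL)
        p->refcount = 1;      /* the creator holds the first reference */
    return p;
}

void refobj_retain(refobj *p)
{
    p->refcount++;            /* another owner now shares the object */
}

void refobj_release(refobj *p)
{
    if (p != NULL && --p->refcount == 0)
        free(p);              /* last reference gone: reclaim storage */
}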
>
>
>> > OK, you got it. Recursive in C code is just as bad as
>> > recursive Lisp code. Stick to the iterative form in both languages,
>> > and use recursion only when it's better to do so.
>>
>> And that's quite often, since a lot of data structures
>> (trees, etc.) have recursive definitions. I like it.
>>
>
> Well, in real-world programming (at least for C++),
>you never write any code to access data structures. C++ has the
>famous container classes that lets you implement linked lists,
>binary trees, arrays, and almost any other structure that you can
>think of. This enables a programmer to reuse code, and not have
>to rewrite it every time the programmer wants to use a particular
>data structure.
>
The Draft C++ Standard has a lot of containers, but not as many as
Lisp. They were going to put in hash tables, but ran out of time;
this, I think, would put C++ real close to Common Lisp in useful
data structures provided. Lisp lists work well for linked lists
and trees of various sorts, Lisp has arrays, Lisp has hash tables,
Lisp has structs.
Realistically, every commercial Common Lisp system has all these
neat data structures and algorithms built in, while every commercial
C++ system is converging on the Draft Standard on its own course.
In another three years or so, you'll be able to use the C++ Standard
Library reliably.
For several years it has seemed to me that C++ is trying to be Lisp
as a C extension. All things being equal, I'd rather go with the
real thing.
David Thornley
> Flame me and call me as many names as you want. It doesn't
> do anything.
Pointing out technical flaws in your argument isn't flaming. Any
flaming you get is for your insistence on assuming that you can talk
about a subject while being completely ignorant of it. You even admit
your ignorance, i.e. your failed education.
Your recursion code was _not_ tail recursion, and therefore could not
be eliminated by the compiler. Corrected versions of your function
were posted by others - did you read them? Can you see the difference?
Can you give me a reason for not thinking of your posts as trolls?
You repeatedly refused to offer any evidence wrt Lisp, and your C
code is either flawed or deliberately contrived to prevent TRO.
Why should anyone _not_ call you a troller? Why should anyone not
dismiss all your claims as nothing but a troll? If you have an agenda
other than trashing Lisp, then please state it.
I offer my sincerest apologies to everyone (including the Peaceman) up
here. I just cannot resist. :)
In article <33F4FE...@capital.net> Sajid Ahmed the Peaceman <peac...@capital.net> writes:
From: Sajid Ahmed the Peaceman <peac...@capital.net>
Newsgroups: comp.lang.lisp,comp.programming,comp.lang.c++
Date: Fri, 15 Aug 1997 21:13:33 -0400
Organization: Logical Net
Reply-To: peac...@capital.net
Lines: 109
NNTP-Posting-Host: dialup118.colnny1.capital.net
Mime-Version: 1.0
Content-Type: text/plain; charset=us-ascii
Content-Transfer-Encoding: 7bit
X-Mailer: Mozilla 3.01 (WinNT; I)
Xref: agate comp.lang.lisp:29992 comp.programming:54285 comp.lang.c++:287749
Let's go through the summary of this thread:
I made a post that LISP programs were slow, and that
Lisp programmers were deceived because of their abstract
mathematical worlds, i.e. thinking computer functions returning
instantaneously. I got about 10 responses, mostly flames, some
legitimate saying that Lisp supports iteration. I then said
I could accept Lisp as a decent language if it indeed was
an iterative language, and I apologized for calling all Lisp
programmers as deceived.
Thank you. But you forget to mention that your unwarranted comments
about recursion came about at this point.
I then got more flames and name calling.
I then gave the example of the Qsort algorithm taking several
lines in Lisp, and just a few in C. Someone posted a 15 line Qsort
lisp program. I then pointed out that the program that was posted
was not "good" lisp code because of the use of local (automatic)
variables. I fixed up the code to get rid of the local variables,
and got a program with 80+ lines. I then got more flames about the
style that I used, and someone changed my code and put several
statements on each line, and brought it down to about 24 lines.
Your Qsort program in LISP was pure junk. I do not remember your
Qsort in C, but I am afraid to think about it, having seen your 'factorial'.
Then there were several comments from people in the Lisp group
about some implementations of C using shell sort instead of the
standard Qsort, with more flames thrown at me throughout.
This was an interesting discussion in which you showed you did not
have a clue why using shell sort instead of quicksort was an issue at
all.
I then pointed out that the reason the Lisp code is slow is because
of the overhead of recursion, i.e. a stack. You guys then said
that recursion doesn't need a stack because of tail recursion,
and tail recursion optimization, with more flames added.
"we guys" said that there are TWO kinds of recursion (a fact which you
choose to ignore) and that tail-recursion can be eliminated by the compiler.
I then gave
an example of tail recursion that needed a stack, because of the
numerous posts about tail recursion, even though my argument was
about all recursion in general.
You gave first an incredibly obfuscated C version of the factorial
function which was NOT tail-recursive (showing that you just did not
get it). BTW. I am still waiting for the iterative traversal of
binary trees. Then you posted an obfuscated version which was tail
recursive and people showed you that a good compiler can eliminate the
stack allocation.
I got more flames and name calling
as expected. Now the thread has moved on to pointers and references
in Lisp.
I have missed this part. I just recall your useless cluttering of C
code with pointers.
Flame me and call me as many names as you want. It doesn't
do anything.
I got that. But I get a perverted pleasure from pointing out baloney.
Funny, I imagine that thought has crossed the minds of more than a
few of your opposition.
> You'll find out for yourself what real programming
> is like once you get a job in the real world.
Hmmm... You mean the world where you face the daily hard slog to keep
your business afloat? You mean the world where the market and legal
issues get in your face as much as technical ones? Funny, but that IS
where I work. I also use technology as suits the problems at hand;
this sometimes means Lisp, and sometimes recursion.
Do not insult your audience by presuming to tell them they do not work
in the real world simply because they do not share your views!
--
Christopher Oliver Traverse Communications
Systems Coordinator 223 Grandview Pkwy, Suite 108
oliver -at- traverse -dot- com Traverse City, Michigan, 49684
"Everyone takes the limits of his own vision for the limits of the world."
- Schopenhauer
The context was C++. In C++, pointers and references are two different
things, with different syntax and semantics. (Addresses are yet a third
thing, with no syntax.)
--
Four policemen playing jazz on an up escalator in the railway station.
Richard A. O'Keefe; http://www.cs.rmit.edu.au/%7Eok; RMIT Comp.Sci.
>Ok, the subject line is a little misleading.
Just like a sly LISP supporter! I'm suspicious already!!
>They describe joint work done around 1991 between the MIT
>Project for Mathematics and Computation and Hewlett-Packard's
>Information Architecture Group on the design and construction of
>a Supercomputer Toolkit.
>
>This toolkit was used to compute the long-term motion of the
>Solar System, improving upon previous integrations by two orders
>of magnitude.
The key word here is "long-term motion". Try a real-time app and see
where your LISP gets you!
>What is the toolkit language (and I'm sure you see this coming)
>Lisp.
>
>And not micro-optimized Lisp, but highly abstract code. Here is an
>example.
>
>(define add-vectors (vector-elementwise +))
>
>(define (vector-elementwise f)
>  (lambda (vectors)
>    (generate-vector
>      (vector-length (car vectors))
>      (lambda (i)
>        (apply f (map (lambda (v) (vector-ref v i))
>                      vectors))))))
>This code is drawn from the paper and is the type of code used in
>programming the toolkit.
>
>This code also uses recursion to describe an iterative
>process.
Which only a sly LISPy type would be absolutely sure to dream up!
Just to get even, I'm gonna write an iterative version of a recursive
function, so there!<g>
The rest of us only use recursion when we want to - but LISPy's are
just *wild* about it!
Good stuff! I don't think anyone familiar with AI or LISP is arguing
that LISP has no future - it is a premier AI language and allows
abstractions that are unique and valuable.
But it's still slow! Of course, Tigers aren't Cheetahs either - and
way slower; still a very neat animal.
Regards,
Dave
Can it be Ackermann's function, please. I've /always/ wanted
to see the iterative version of Ackermann's function.
ian
Sorry Ian,
Never met the [A(1,j) = j+1 for j >= 1] dude. 'Sides, he looks dull.
Plan is to try an iterative version of an iteratively deepening alpha
beta search of a chess game tree.
Chess is fun, Ackermann's function is not, IMHO.
Regards,
Dave
> It's on the net---ftp to publications.ai.mit.edu and go to
> ai-publications/1000-1499. It's AIM-1329.ps.
I'll fetch it ASAP.
Many thanks.
--
"As you read this: Am I dead yet?" - Rudy Rucker
> On Tue, 12 Aug 1997 21:55:40 -0400, "Emergent Technologies Inc."
> <emer...@eval-apply.com> wrote:
>
> >Ok, the subject line is a little misleading.
> Just like a sly LISP supporter! I'm suspicious already!!
>
> >They describe joint work done around 1991 between the MIT
> >Project for Mathematics and Computation and Hewlett-Packard's
> >Information Architecture Group on the design and construction of
> >a Supercomputer Toolkit.
> >
> >This toolkit was used to compute the long-term motion of the
> >Solar System, improving upon previous integrations by two orders
> >of magnitude.
> The key word here is "long-term motion". Try a real-time app and see
> where your LISP gets you!
Video switching systems and spacecraft perhaps?
e.g. http://www.harlequin.com/news/press/archive/pr-att.html
__Jason
> Let's go through the summary of this thread:
OK, let's.
> I made a post that LISP programs were slow, and that
> Lisp programmers were deceived because of their abstract
> mathematical worlds, i.e. thinking computer functions returning
> instantaneously. I got about 10 responses, mostly flames, some
> legitimate saying that Lisp supports iteration. I then said
> I could accept Lisp as a decent language if it indeed was
> an iterative language,
which, from later comments, seems to mean "if it indeed made
writing recursive code gratuitously difficult", since you're
still objecting to Lisp on the grounds that some people write
recursive code in it
> and I apologized for calling all Lisp
> programmers as deceived
while still suggesting that everyone who ever writes recursive
code is deceived or stupid or something
> . I then got more flames and name calling.
> I then gave the example of the Qsort algorithm taking several
> lines in Lisp, and just a few in C. Someone posted a 15 line Qsort
> lisp program. I then pointed out that the program that was posted
> was not "good" lisp code because of the use of local (automatic)
> variables
although why local variables are supposed to be bad Lisp style,
I cannot imagine
> . I fixed up the code to get rid of the local variables,
> and got a program with 80+ lines
by inserting vast quantities of whitespace and incidentally making
the code much worse
> . I then got more flames about the
> style that I used, and someone changed my code and put several
> statements on each line,
or (to put it differently) refrained from splitting each statement
over many, many lines, and undid some of your gratuitous insertion
of whitespace,
> and brought it down to about 24 lines.
> Then there were several comments from people in the Lisp group
> about some implementations of C using shell sort instead of the
> standard Qsort, with more flames thrown at me throughout.
> I then pointed out
falsely
> that the reason the Lisp code is slow
(which it isn't, so far as anyone can tell from the evidence you've
posted)
> is because
> of the overhead of recursion, i.e. a stack. You guys then said
> that
tail
> recursion doesn't need a stack because of tail recursion,
> and tail recursion optimization, with more flames added. I then gave
> an example of
something that looks a bit like
> tail recursion
but actually isn't
> that needed a stack, because of the
> numerous posts about tail recursion, even though my argument was
> about all recursion in general
and persisted in making assertions about all recursion which are
false because some recursion is tail recursion
> . I got more flames and name calling
> as expected. Now the thread has moved on to pointers and references
> in Lisp.
Then, later,
> That's a good observation you've made with the Visual C++
> compiler and its optimization techniques. Some people claim this
> as a bug, others as a feature.
Perhaps you could explain some respect in which it is wrong for
a C compiler to perform the optimisation in question in the case
in question?
--
Gareth McCaughan Dept. of Pure Mathematics & Mathematical Statistics,
gj...@dpmms.cam.ac.uk Cambridge University, England.