Tim Bradshaw <t...@aiai.ed.ac.uk> writes: > while insertion / deletion at the head is O(1) for a list but O(n) for a vector.
People keep saying this, but it is not strictly true. (Rather, it is true in only a limited and uninteresting sense.)
If you wanted to support insertion at the head of a vector (as, for example, CLU supports with a primitive addh (add at head) operator), then the underlying representation could be a block of W words, with a header pointing to N words beginning at offset o. To insert-at-head, decrement o. If o is already 0 you have to allocate a new, larger, block. If you double the length of W every time you insert-at-head and there's no room, then a series of K insert-at-heads is not (on average) noticeably worse than inserting at the head of a list. You incur O(k) costs log K times, for k being the powers of 2 up to K, so the total cost of K insertions will be approximately 2K. The cost of consing K times to the head of a list is of the same complexity (i.e. K conses). That's the worst case, of course; if you start removing entries from the head, then the vector is much more efficient than lists.
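The doubling scheme just described can be sketched concretely; here is a minimal illustration in Python (the class name GrowVector and all details are mine, not CLU's actual implementation):

```python
class GrowVector:
    """Vector supporting amortized O(1) insert-at-head via a doubling block.

    Elements live in a block of W words; `o` is the offset of the first
    element. insert-at-head just decrements o; when o hits 0 the block is
    doubled and the old contents are copied into the upper half.
    """
    def __init__(self):
        self.block = [None] * 4   # W words
        self.o = 4                # offset of first element (empty: o == W)

    def insert_at_head(self, value):
        if self.o == 0:           # no room: allocate a block twice as large
            w = len(self.block)
            new_block = [None] * (2 * w)
            new_block[w:] = self.block   # old contents move to the upper half
            self.block = new_block
            self.o = w
        self.o -= 1
        self.block[self.o] = value

    def remove_from_head(self):
        value = self.block[self.o]
        self.o += 1               # O(1): no copying, unlike a naive shift
        return value

    def to_list(self):
        return self.block[self.o:]
```

K insertions trigger only log K copies (of sizes that sum to about 2K words), which is the amortized argument above in code form.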
Space overhead? The vector will be at worst 50% empty in the constant-factor growth scenario, which is equivalent to the space wasted on CDRs in non-CDR-coded lists.
There *is* a space advantage to CONS cells as a separate implementation type, if you think that short lists/queues will be common: avoiding the overhead of the header by having fixed-size cells. However, this is not so significant (and you can special-case short vectors with special tricks, only going to separate headers for "large enough" vectors).
To avoid being misunderstood and to avoid provoking flames:
I didn't say anything about the relative values of CONS, LIST, and/or VECTOR as primitive data types.
The O(1) vs. O(n) argument about insertion at the head also applies to the tail of a list. Using it to argue for CONS vs. vector in these cases is a red herring. Of course, if you implement a priority queue with frequent insertions and deletions in the middle of a list, and you implement it as a sequential vector or list, then the linked list wins over the naive vector implementation. But you would only do this for short queues anyway, before moving on to a totally different data structure.
* Michael Greenwald wrote: > If you wanted to support insertion at the head of a vector (as, for > example, CLU supports with a primitive addh (add at head) operator), > then the underlying representation could be a block of W words, with a > header pointing to N words beginning at offset o. To insert-at-head, > decrement o. If o is already 0 you have to allocate a new, larger, > block. If you double the length of W every time you insert-at-head > and there's no room, then a series of K insert-at-heads is not (on > average) noticeably worse than inserting at the head of a list. You > incur O(k) costs Log K times, for k being powers of 2, so the total > cost of K insertions will be approx 2K. The cost of consing K times > to the head of a list is of the same complexity (i.e. K conses). > That's the worst case, of course; if you start removing entries from the > head, then the vector is much more efficient than lists.
This (last bit, sorry to quote whole para) can't be right I think. Removing n entries (one at a time!) has to be O(n), because ... well because if it isn't you're doing magic somewhere. Or do you mean efficient as `faster' rather than `less complex'?
But such an implementation obviously has lots of other advantages anyway like good locality which is a huge win over conses.
Anyway, I should have qualified my original message with `given the straightforward implementation of things'. I was really trying to point up the fact that ignoring `implementation details' (ie saying `everything can be a linked list' as the original poster seemed to be saying) is something one should not do in general.
x...@best.com (Xah) writes: > I (Xah) wrote: > >From a language design point of view, why do we have the pair notion in Scheme? > >...
> Most people have misinterpreted my question. However, the replies are informative. Thanks.
> Perhaps I'd make my original question clearer by using an analogy.
> The essential controls of a car are the wheel, the accelerator, and the brake. A stick-shift gear may provide more efficient control, but it is not essential. (conceptually redundant)
> Similarly, a list (abstractly) is simply an ordered sequence of things. A tree is simply a list of lists. A pair notion (like Scheme's cons) is perhaps good in a language for implementation/efficiency reasons, but it is a redundant notion conceptually.
> Here's my original question rephrased: Why did the designers of Scheme -- a language that emphasizes conceptual elegance -- include the pairs notion (cons) in the language?
> Partial answers are inferred from replies: Because lists are efficiently implemented using linked pairs (or so), the Scheme designers decided (intentionally or not) to make these pairs explicit as a language primitive. (thus we have cons, car, cdr.)
> By having implementation/efficiency issues showing up in the language, Scheme apparently is less a higher-order (or conceptually elegant) language than it would otherwise be. Similarly, the vector notion/primitive in Scheme is also conceptually redundant. A list will do. (again, in the name of efficiency...)
> Now having understood some background on cons, car, cdr, my next question is: Is there any good reason to actually have a pairs primitive in a (higher-order) language? The answer is apparently no, as evidenced by a few private emails I got. (ML apparently doesn't have the pairs notion, nor does Mathematica -- a list is simply a list, no fuss attached. (purists don't drive a car with a stick shift (^_^)) Modern compiler/interpreter optimization technology should be advanced enough to generate the linked pairs internally from however-shaped trees.
> All comments and inspirations are welcome. Language flames by email, please. Thanks.
> PS: The namings of cons, car, and cdr are apparently historical (as explained in SICP p.85).
The problem is that you are seeing it all backwards, and missing the point. :-)
You view lists as the fundamental data structure, and pairs as an accident of their implementation. As such, you are right that pairs are not really necessary, may seem confusing, and are an ugly wart showing through the implementation of lists in the language.
However, to a true Lisp programmer (the origin of the name Lisp notwithstanding) _pairs_ are the fundamental data structure. Lists are just a conventional use of pairs. No more and no less.
All the abstract properties that you care about and think are a property of lists are really a property of pairs in Lisp. Since lists are _just_ a convention, Lisp is a little cavalier about them, and that's why the abstract properties do not really hold.
However, because lists are such a widespread and useful convention, Lisp dialects (Scheme included) provide lots of operations that operate on lists, thereby misleading novices into thinking that lists are the fundamental data structure, and making them wonder how clean the language is where the "lists are fundamental" and "lists are a conventional use of pairs" paradigms conflict.
As an analogy, think of the MAXINT or similar values that some languages/libraries provide. In many programs such a value is used to represent infinity, but it is not. It is just an often useful convention. Now, since infinity (a number defined to be larger than all ordinary numbers) is such a useful convention, you might be tempted to ask that affine reals or transfinite integers replace the ordinary numbers of computer arithmetic in order to get the abstraction right, since otherwise the gaps show sometimes (and can lead to hard to catch bugs). Most people would probably suggest that you leave numbers as they are and that if you need a tighter bulletproof abstraction you could implement it yourself.
The same would hold for lists in Lisp. Pairs are not a wart, but a very useful ADT in their own right. So useful and convenient, in fact, that people use them as glue to build other types (such as lists) without bothering with a tighter definition or implementation.
I don't know much Mathematica, but remember that lists in ML can only contain uniformly-typed values (since the language is so-called statically typed), thus "exposing" the underlying pair structure would not be as useful as it is in Lisp.
An extreme way of viewing the situation would be that since their type system prevents them from having true Lisp-style pairs and making lists a convention based on them, they chose a common subset of the convention that they could fit in their type system and placed it in the language in the hope that it would make do. :-)
mcr@this_email_address_intentionally_left_crap_wildcard.demon.co.uk (Martin Rodgers) writes: > comfortable than he is with strong type checking. The real question is > whether a type system enables or disables you, and to what degree.
It does both. Which happens at any point depends on where you're coming from and what it is you want to accomplish. This is not one of those issues that has a single "right" answer and is very context dependent.
Tim Bradshaw <t...@aiai.ed.ac.uk> writes: >* Michael Greenwald wrote: >> The cost of consing K times >> to the head of a list is of the same complexity (i.e. K conses). >> That's worse case, of course, if you start removing entries from the >> head, then the vector is much more efficient than lists. >This (last bit, sorry to quote whole para) can't be right I think. >Removing n entries (one at a time!) has to be O(n), because ... well >because if it isn't you're doing magic somewhere. Or do you mean >efficient as `faster' rather than `less complex'?
I was assuming that copying and storage management costs were dominant, rather than storing the item and/or incrementing/decrementing a length field. Given that assumption (which I think is usually reasonable) then in the case of both insertions and deletions the dominant term is sub-linear for vectors but linear for cons/list. To be fair, you're right, I still shouldn't ignore the small linear term, but given the current (growing) disparity between cache hits and cache misses it *almost* seems reasonable to ignore memory references that hit in the cache. Given the linear term, I guess I switched meaning from "lower complexity class" to "faster". My apologies for being confusing.
x...@best.com (Xah) writes: > Consider a set. A set is a collection of things. It is not necessary > to think of a set as a pair, where one part is another pair > ...etc.
> An n-tuple is simply a set of n things. Similarly for > n-dimensional vectors. There is no structural difference between a > set of 3 real numbers, a 3-tuple, or a 3D vector.
no. At least not in the typical formal mathematical sense. An n-tuple is an _ordered_ set. That's the whole point of it. In CS land, about the only thing that acts sort of like a "set" is an ADT known as a "bag". Things like lists, arrays, vectors (aka arrays), etc. have order as part of their (public) semantics.
> A matrix is simply a set of vectors.
Well, no it's not - since order is quite important in the matrix concept.
> Conceptually, it is a set of n things containing m things.
Taking just the 2D case, it's an ordered n-set of ordered m-sets.
> Mathematica is a language that represents trees uniformly. All trees > are simply List[...].
Modulo representational equivalences, I don't believe this is correct.
> Eric's argument that cons is a *necessary good* as a primitive for > list suggests to me the stereotypical argument between C and lisp > programmers. One camp wants control, the other abstraction.
I don't believe that was the point. I believe the point was that the concept of "list" (in particular, the definition indicated) is extremely fundamental and should be an intrinsic.
-- Jon Anthony Synquiry Technologies, Ltd., Belmont, MA 02178, 617.484.3383 "Nightmares - Ha! The way my life's been going lately, Who'd notice?" -- Londo Mollari
The recursive definition of a list is the following:
<list> = null | (cons <value> <list>)
Scheme provides you the two constructors, null (often written '()) and cons, and lets you build lists for yourself.
Unfortunately, Scheme also has a confusing notion of `pairs', so that the second argument to cons can be a non-list. This may have made sense in some land before time, but clearly once you have generative structures (aka records), there is no need for this, and cons can be used with the more restrictive interpretation. I certainly believe it should be. (We put this into practice in DrScheme and, guess what, the masses haven't been in revolt.)
The `list' procedure is just a convenient optimization over `cons'. If you view your data as ADTs corresponding to (free-term constructive) algebras, then you really want to build all your lists via cons and null. cadr, etc. are, again, just conveniences.
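For illustration, the null/cons construction (and `list' as a convenience over it) can be modeled in a few lines of Python, using 2-tuples for pairs (all names here are my own sketch, not any Scheme implementation's):

```python
# Model of the recursive definition <list> = null | (cons <value> <list>).
# A pair is a Python 2-tuple; the empty list is the empty tuple.
null = ()

def cons(value, rest):
    return (value, rest)

def car(p):
    return p[0]

def cdr(p):
    return p[1]

def lst(*values):
    """Scheme's `list' is just a convenience over repeated cons."""
    result = null
    for v in reversed(values):
        result = cons(v, result)
    return result

def cadr(p):
    """Like cadr: the second element, i.e. (car (cdr p))."""
    return car(cdr(p))
```

Note that nothing here forces the second argument of cons to be a list, which is exactly the "pairs" looseness the post above discusses.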
micha...@Radon.Stanford.EDU (Michael Greenwald) writes: > Tim Bradshaw <t...@aiai.ed.ac.uk> writes: > > while insertion / deletion at the head is > >O(1) for a list but O(n) for a vector.
> People keep saying this, but it is not strictly true. (Rather, it is > true in only a limited and uninteresting sense.)
> If you wanted to support insertion at the head of a vector (as, for > example, CLU supports with a primitive addh (add at head) operator), > then the underlying representation could be a block of W words, with a > header pointing to N words beginning at offset o. To insert-at-head, > decrement o. If o is already 0 you have to allocate a new, larger, > block.
There are several different operations that might be called "insertion at the head". The one that cons gives you for pair-based lists is a non-mutating insertion with shared structure. The result of the operation is a new list with one more element than the operand. The operand remains unchanged and is still usable. Subsequent mutations of either list can affect the other.
You seem to be describing a mutating insertion operator that has a side-effect of increasing the length of its operand. This is very different, and often not as useful:
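To make the contrast concrete, here is a small Python sketch of the non-mutating, structure-sharing insertion that cons provides (pairs modeled as 2-tuples; the code is mine, not the poster's):

```python
# Non-mutating insert-at-head with shared structure, as cons gives you.
null = ()

def cons(value, rest):
    # The result is a NEW list with one more element than the operand;
    # the operand remains unchanged and usable, and its cells are shared
    # (not copied) as the tail of the result.
    return (value, rest)

xs = cons(2, cons(3, null))    # the list (2 3)
ys = cons(1, xs)               # the list (1 2 3); xs is untouched

assert xs == (2, (3, null))    # operand unchanged
assert ys[1] is xs             # tail is shared, not copied
```

A mutating insert-at-head, by contrast, would change the length of xs itself, and every holder of xs would see the new element.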
thego...@airmail.net (Bryant Brandon) writes: > All right. A list is built up of CONSes. A CONS (short for Construct) > consists of two pointers. The first is named CAR, the second is named > CDR. You hold onto a list by the first CONS. [...] > The function CONS creates a new Construct in memory [...]
What's that you said? Pointers? New Construct in Memory? There are no pointers in Scheme. Scheme only has values. You must mean C++, which is down the hallway, second newsgroup on the right.
Even if Scheme had pointers, they would not have names. Variables have names. Variables are bound to values. But values do not inherently have names (except in certain implementations for certain kinds of values ... the kinds of disclaimers you need on c.l.s!).
So, to translate your C++ explanation into Scheme:
A list is built up using CONS. CONS builds a composite value out of two values. The first value can be accessed by passing the composite value to the procedure CAR; the second by invoking CDR.
alpho...@cs.ubc.ca (Carl Alphonce) writes: > I think the main point here is that languages like Scheme have made > different choices than languages like ML. :: in ML cannot be as > flexible as cons in Scheme, because ML is strongly (though > polymorphically) typed.
This depends on what type model you choose to use for Scheme (as I said in the email message I sent you).
> In return, ML offers the programmer a great > deal of security: there can be no runtime type errors.
(This should probably go into the FAQ.)
ML programs do have run-time errors. They are called "exceptions". Just because you rename them does not make them go away. Just because they are not type errors does not mean they are not real, ugly, hard-to-debug errors.
If you believe that this points to a real distinction between Scheme and ML, run a program in MzScheme sometime (an implementation that has been strongly influenced by ML). In MzScheme, I can wrap every program I run in
(with-handlers ((exn? (lambda (exn)
                        (display "An exception was raised!")
                        (display "But I'm going to ignore it!")
                        (display "La-di-da!"))))
  <program>)
and I would never see a run-time error either. As for _type_ errors at run-time, it all depends (again) on what type system you choose to use. (Machine code has no type system, so there are no type errors, right? (-:)
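The same wrapper is easy to write in any language with exceptions; a Python analogue of the with-handlers trick (names mine):

```python
def run_ignoring_exceptions(program):
    """Run a zero-argument callable, shrugging off any run-time error.

    Analogue of wrapping <program> in with-handlers: the exception is
    caught and ignored, but the error still happened -- renaming it
    does not make it go away.
    """
    try:
        return program()
    except Exception:
        print("An exception was raised!")
        print("But I'm going to ignore it!")
        print("La-di-da!")
        return None

result = run_ignoring_exceptions(lambda: 1 // 0)  # division by zero, ignored
```

As in the MzScheme version, "never seeing" a run-time error this way says nothing about whether the program is correct.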
PS: That said, as I've said before in this forum, I think the single best thing Scheme programmers should do is learn ML. There is a kernel of truth behind what Alphonce says, and too many Lispers and Schemers never get it.
Erik Naggum wrote: > _all_ of your complaint is that you can see the internals of the `list' > abstraction in Scheme and Lisp that you cannot in Mathematica? right?
Yes, on the whole. (but perhaps not in the derogatory and slightly mutated way you phrased it)
I don't know much computer science, but my question isn't unthoughtful. If your sensitivity had not caused you to misinterpret it and immediately, severely flame me and other groups of people, flamboyantly and insincerely, this large thread verging on senseless flame war would not have started.
I wasn't complaining, nor was I arrogant. Comparing our posts in this thread, I think it's fair to say you are the one complaining and arrogant. Yes, I was ignorant about lisp, but so? In my first post, I wrote "Please illuminate a Scheme newbie. Thanks". (and please no cheap rebuttals. In all honesty, my first post is in the form of asking a question.)
Some people have indicated the incorrectness of the paragraphs I wrote in my last post on the sameness of sets, n-tuples, vectors, matrices, and trees. I know their differences. I was just trying to illustrate how they are uniformly represented in Mathematica. In my second post in this thread I briefly described them as ordered lists and trees, but apparently the word `list' too much reminds people of `list' in CL or Scheme, with associated implementation connotations.
It is apparent to me that no one so far has indicated a familiarity with Mathematica. Otherwise, the context of my question would be recognized clearly without doubt. The language of Mathematica is **close** to the ideal language I was talking about, where computer *engineering* concerns like byte, cons, int, float...etc. are as absent as possible in the language, and where a computer *science* oriented programmer can concentrate on work in algorithms, AI, computational geometry...etc. without even knowing what a register or caddadadar is! (^_^) (now, is there a limit to the caadadadddaadr length?)
In Erik's last message, arguing point-for-point style against my message, many points are really pointless, philosophical, and cheap shots. In summary, I think he is deeper into computer engineering than computer science, and too enamored with, perhaps, CL. Arrogance is usually proportional to one's (tech) prowess. Well, that's OK with me as long as you don't burn my head with real flame.
At least we are now all acquaintances. Good way to start a new year.
In article <xah-ya02408000R0301981503230...@nntp1.ba.best.com>,
x...@best.com (Xah) wrote: >Similarly, a list (abstractly) is simply an ordered sequence of things. >A tree is simply a list of lists. A pair notion (like Scheme's cons) is >perhaps good in a language for implementation/efficiency reasons, but it >is a redundant notion conceptually.
Cons cells are more fundamental than lists.
>language? The answer is apparantly no, as evidenced by few private email >I got. (ML apparantly doesn't have the pairs notion,
SML most certainly DOES have the equivalent of a cons cell. Well, they improved on it by making it not mutable. Lists in SML are merely strings of "cons" cells terminated by a nil (e.g. []), same as Scheme.
-- Necessity is the Mother of Improvisation. | Samuel S. Paik Laziness and Greed are the Parents of Innovation| p...@webnexus.com A Rebel against the Small World Order | Speak only for self
> However, to a true Lisp programmer (the origin of the name Lisp > nonwithstanding) _pairs_ are the fundamental data structure. Lists > are just a conventional use of pairs. No more and no less.
> All the abstract properties that you care about and think are a > property of lists are really a property of pairs in Lisp. Since lists > are _just_ a convention, Lisp is a little cavalier about them, and > that's why the abstract properties do not really hold.
> However, because lists are such a widespread and useful convention, > Lisp dialects (Scheme included) provide lots of operations that > operate on lists, thereby misleading novices into thinking that lists > are the fundamental data structure, and making them wonder about how > clean they are where the gaps between a "lists are fundamental" and > "lists are a conventional use of pairs" paradigms conflict.
Your points are well taken, but let's not go so far as to imply to novices that lists are just a convention in Lisp. One of the first things that novices learn (or should learn) is that lists are composed of conses or derive from s-expressions, but beyond that, lists are the far more prevalent abstraction. Modern Lisp programs are data made up of nested lists and atoms. Lisp macros build list structure that is subsequently evaluated. Again, lists are a very important and fundamental notation and abstraction. Any novice can prove this to themselves by taking a Lisp program and rewriting it in dot notation form. Also note that most modern texts on Lisp will avoid the use of the term *cons* as much as possible in favor of *list* or *dotted list* whenever it makes sense. This is beyond just a convention. In the fundamental development of the Lisp data type hierarchy, cons is a primitive data type, but the list is the coup de grace.
> It is apparant to me that no one so far indicated a familarity with > Mathematica. Otherwise, the context of my question would be > recognized clearly without doubt. The language of Mathematica is > **close** to the ideal language I was talking about, where computer
Then why are you asking Lispers, if your question is in the context of Mathematica? May I suggest the Mathematica newsgroup? Many Mma users do know Lisp, and maybe they can help you understand Lisp from an Mma orientation.
I concur with Erik's remarks about the unavoidability of taking the processor into account. You can choose not to, but there is no denying the possibility of efficient and inefficient processors, so if you don't spend any time caring, you sometimes suffer.
Then again--it's like any other trade-off you can make--you're trading design time for execution time. Often the execution time doesn't matter and the design time does and it's a good trade. But you never get anything for free--the Three Laws of Thermodynamics always apply (1. "You can't win", 2. "You can't break even", and 3. "You can't get out of the game" ... or, as I loosely summarize them sometimes: "everything is a trade-off, and often an imperfect one" ... or, as they sometimes say to gamblers, "If you don't know who the sucker is in the game, it's you.")
I liked Erik's telescape analogy, too.
One reason I think this discussion belongs in separate threads for CL and Scheme is that CL people mostly don't moralize over these things. I think CL has lists as conses because historically they've been there, for better or worse, and these days we try to maintain linguistic stability. To change something in CL, you'd have to demonstrate not only that you had a good idea, but that it was worth breaking all existing code and textbooks to install your good idea. I think it just wouldn't happen, and I think it's wrong to say that CONS/LIST as it exists today is there for technical reasons. It is not. It's there because it's "useful", because people are familiar with it, because code uses it, it's part of a rich tradition, etc. It's no longer retained because of any attempt on the part of the designers to be smarter than the community; rather, it's not removed because we've learned that little is gained and much is lost by trying to assume that just because an idea appears technically superior on paper, it will work out that way in practice if we force it on a community that has not asked for it.
Common Lisp itself arose historically because in the days of Maclisp (a PDP-10 Lisp, long predating the Macintosh) we the Lisp programmers used to find mail every Monday morning from the Lisp implementors saying "the language has changed in the following ways: [...]. Please update your programs" and we (the users) just got tired of that and said "it's time for the language to be stable for a while; it's more important that sentences we say continue to have meaning over time than that those sentences be written in the perfect language." If it were really true that the language was more important than the programs written in it, that would say very little about the value of the language, since it would say no serious work was being invested in it.
And while it may seem "dirty" and "inelegant" to some communities to have "compatibility" emphasized as a dominant criterion over "aesthetics", I stand with pride beside such a language. It doesn't mean we never used aesthetics--there are many things that are aesthetic in CL. But like any successful language (pick your favorite natural language) it has its odd little places where things don't line up. It doesn't mean you can't be precise when you want to be. It doesn't mean you can't write great books or poems in it. It just means that the writing of poetry is not all that is important to the Lisp community.
For the most part, I would say very few things in Lisp are there because of a technical reason. A lot are happenstance. But life is like that. And I'd rather see life in the language than a language which is a promise for a future that is never realized.
And, frankly, I think it's a personal-choice issue whether the "pun" of lists and conses being shared is elegant or awful. Some people really like it, some don't. One reason for this is that the human mind is one of those possible processors, and I think it is often overlooked. The reason I mention it is that it seems clear from a variety of empirical evidence (e.g. how we structure natural language) that the mind is good at handling in parallel a lot of very complex rules with special cases, rather than a few simple ones. As such, complex language definitions which permit simple sentences would, I think, seem more natural than simple language definitions which require complex sentences. The Scheme community often tries to deny this, but I think they expect people to just take it on faith that "simplicity" is uniquely determined. More evidence of my claim about the three laws of thermodynamics above--I think simple languages lead to complex programs and complex languages to simple programs. You get nothing for free--often the Scheme designers would seem to want you to believe that a small simple language provides only benefits and no drawbacks, while a larger, more complex language like CL provides only drawbacks and no benefits. Once again, I encourage people to be skeptical of any claim on its face when it appears to give you benefits for free.
I prefer complex languages and simple programs. I am not choosing "complexity" over "simplicity" in so doing. I am choosing WHERE my simplicity goes. I think learning a language is an investment that pays dividends when you program later. I think languages that provide lots of services pay better dividends and are worth it, because I spend more time programming than learning the language. I also like language constructs that aren't just there to provide raw power but are there to provide "commonality of idiom", "focus", and other things that are more subtle controls of expression. Again, I think this simplifies aspects of program reading and maintenance.
People are welcome to debate the abstract niceties of pairs and whether we'd be better off with other size trees. If you make them be arbitrary size, they're trickier to GC in convenient blocks and you have to pay a penalty for maintaining and accessing a size slot. Or you can use fixed-size types other than 2--we had them, for a while: go to deja news (http://www.dejanews.com/) and search for posts by me about "hunks". Each of these is a trade-off, but historically they haven't won out.
Yale Scheme (called "T"), conversely, had CHAR/CHDR for moving down a string and giving the illusion that the string was listlike even though it wasn't. It was fun but didn't catch on either...
We have what we have because when given a choice, it's what people tend to use. Maybe that's just a "practice effect" and is bad data as a way to design a language. But so it goes. Who's to say all that "practice" should be quickly tossed to the wind and not counted as a valuable commodity...
>>>>> "WPV" == William Paul Vrotney <vrot...@netcom.com> writes:
WPV> [ ... ] Modern Lisp programs are data made up of nested lists and WPV> atoms. Lisp macros build list structure that is subsequently WPV> evaluated.
That is, of course, utter nonsense for Scheme (as of R4RS, anyway). Programs have a representation separate from lists. Which is why, for example,
(+ . (4 5))
(shown to me by Richard Kelsey) may be a valid representation for the list (+ 4 5), but is not a valid Scheme expression. Admittedly, most Scheme implementations simply use the read primitive to parse programs (and therefore accept the above expression), but some (Gambit, notably) do not.
The only place where lists are really more fundamental to the language is in argument rest lists.
-- Cheers =8-} Mike Friede, Völkerverständigung und überhaupt blabla
> It does both. Which happens at any point depends on where you're > coming from and what it is you want to accomplish. This is not one of > those issues that has a single "right" answer and is very context > dependent.
This is how I see it, too. I just phrased my comment badly.
I should've said: "The real question is how a type system enables and disables you." The important point I was making was that not everyone will feel the same way about a particular type system, nor should we expect them to. It's a very subjective question because of, as you pointed out, the context dependant nature. -- Please note: my email address is munged; You can never browse enough "There are no limits." -- ad copy for Hellraiser
Barry Margolin <bar...@bbnplanet.com> writes: > How is a tree "simply a list of lists"? A binary tree is a pair of > subtrees. The cons is a perfect data structure for representing a binary > tree (well, not quite -- you often need to have data at the nodes, not just > the leaves).
I think a reasonable definition of a tree is a node and a set of subtrees. The cons cell implies an ordering of the subtrees, which may be ignored, of course, and which anyway would seem to be necessary for navigating the tree.
But I think you could argue that "perfect" is a strong word, and that "suitable" might be more fitting.
~kzm -- If I haven't seen further, it is by standing in the footprints of giants
> How is a tree "simply a list of lists"? A binary tree is a pair of > subtrees. The cons is a perfect data structure for representing a binary > tree (well, not quite -- you often need to have data at the nodes, not just > the leaves).
Okay, I'll take a crack at answering this one. A binary tree can be regarded as a list of two subtrees. Each of the subtrees can be regarded as a list of subtrees as well.
This is not Scheme's notion of a "list", i.e., the last node of each of these sublists is not necessarily the empty list. But it is a valid, if suboptimal, notion of one way a list might be represented.
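A small Python sketch of this "tree as a list of two subtrees" reading (names and details mine):

```python
# A binary tree regarded as a list of two subtrees; leaves are atoms.
# Note these two-element lists carry no terminating empty list, which is
# exactly why this is not Scheme's proper-list convention.
tree = [[1, 2], [3, [4, 5]]]

def leaves(t):
    """Flatten a nested-list binary tree into its leaf atoms, left to right."""
    if not isinstance(t, list):
        return [t]                      # an atom is its own single leaf
    left, right = t                     # each internal node is a 2-list
    return leaves(left) + leaves(right)
```

So "a tree is a list of lists" holds in this representation, provided the lists are ordered and exactly two long at each internal node.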
> [ ... ] Modern Lisp programs are data made up of nested > lists and atoms. Lisp macros build list structure that > is subsequently evaluated.
> That is, of course, utter nonsense for Scheme (as of R4RS, > anyway). Programs have a representation separate from lists. > Which is why, for, example
> (+ . (4 5))
> (shown to me by Richard Kelsey) may be a valid representation > for the list (+ 4 5), but is not a valid Scheme expression. > Admittedly, most Scheme implementations simply use the read > primitive to parse programs (and therefore accept the above > expression), but some (Gambit, notably) do not. [...]
If "eval" is included in the language (as proposed in the not-yet-released R5RS), will the above be required to represent a valid expression?
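For readers unfamiliar with the dotted notation in question, a short sketch of why `(+ . (4 5))` denotes the same datum as `(+ 4 5)`: in the standard read syntax, a dot followed by a list simply splices that list into the tail, so the two spellings read as identical pair structure.

```scheme
;; (a . (b c)) and (a b c) are two spellings of the same pairs:
;; (a . (b c)) = (cons 'a '(b c)) = (a b c)
(equal? '(+ . (4 5)) '(+ 4 5))   ; => #t
(car '(+ . (4 5)))               ; => the symbol +
```

The question in the thread is thus purely about whether an evaluator must accept every spelling of a datum as a program, not about the data themselves.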
| Comparing our posts in this thread, I think it's fair to say you are the | one complaining and arrogant.
I think it's fair to say that you don't listen to what people tell you. if there is a definition of arrogance, it must include such behavior.
| I was just trying to illustrate how they are uniformly represented in | Mathematica.
it seems that you don't differentiate between what something looks like and what it is. how something internal is shown to the user is (often) called the "presentation". how something external is implemented internally is (often) called the "representation", since we're talking about using concepts and means different from the ones we're trying to _represent_, and making them take on the meanings we want, as something essentially very different from what they _are_ per se.
e.g., a machine word is just a number of bits. we _represent_ an integer with them by multiplying each bit by a weight according to its position in the word and summing the results. we _present_ this machine word as a string of digits to the user when the user has decided that a particular machine word is to be used as an integer. then we ignore the number of bits in a machine word and talk of integers that may require any number of machine words (as many as are necessary to maintain the bit with the highest weight) as an "integer". this is a successful abstraction.

is it inconceivable to you that somebody might want to know about the value of a particular bit and thus would have to know about the weight if he could not view the integer as a number of bits? now, ask yourself what you would do in a language that hid the implementation of integers as a number of bits: you would have to _recover_ the implementation. Common Lisp has a function `logbitp' which reports on a bit value by position. is this as much a no-no as the cons cell to you, or can you think in terms of integers as values even though `logbitp' exposes internals?

I know I can think in terms of both integer value and bits at different times even though there is no difference to the computer. that's why I talk about using `first' instead of `car' when the object is a list, and using `car' instead of `first' when the object is a cons. do you get it?
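The `logbitp` analogy above can be made concrete. Here is a portable Scheme sketch of Common Lisp's `LOGBITP` for non-negative integers (the Scheme definition is mine, not from the thread): bit K of N is set iff the quotient of N by 2^K is odd.

```scheme
;; Recover the "bits" view of an integer without any bit primitives:
;; bit K of N is the parity of (quotient N 2^K).
(define (logbitp k n)
  (odd? (quotient n (expt 2 k))))

(logbitp 3 8)   ; => #t  (8 = #b1000, bit 3 is set)
(logbitp 0 8)   ; => #f  (8 is even)
```

The point stands either way: one can view the same object as a value or as its lower-level makeup, at different times, without the language forbidding either view.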
however, since you keep clamoring about how Mathematica _represents_ this and that, I must assume that you don't listen to what you're being told. you don't _know_ how Mathematica _represents_ anything at all, since you aren't allowed to look inside it, right? so you confuse user interface with implementation. this is sheer ignorance, and it has not abated one millimeter since you asked to be illuminated. if you were blinded by the strong light, you could still have been illuminated, and there are other ways to express this than those you have chosen.

I do get both irritated and sometimes downright hostile to "newbies" who don't want the answers they get because they didn't understand the questions they asked. I did try to be helpful and set the stage for your illumination, but _noooo_, you rejected all helpful offers and were instead _only_ offended. in contrast, I do something useful when I'm offended, but I don't turn off the irritation, because I think people who are irritating should be told. I very rarely flame people without solid technical content alongside it that also indicates that flaming will cease if they get the ideas. if you can't see this, don't blame me for it. you choose what you react to just as much as I do.
| It is apparent to me that no one so far indicated a familiarity with | Mathematica. Otherwise, the context of my question would be recognized | clearly without doubt.
this is another clear indication of your arrogance. you came to us to tell us that some notion of ours was redundant. you're wrong. then you tell us that it's our fault that we don't agree with you because you are ignorant of Lisp and you accuse us of being ignorant of whatever you think is ideal. (but don't you listen? ideals are _never_ universal, because they depend on the ideas they are embodiments of, and those ideas differ according to context and individual preferences.)
if somebody had come to your ideal Mathematica world and made as much fuss about the implementation of lists as you do in Lisp, what would you have thought about them? could you please try to listen to what you're saying, as best you can imagine it would be heard by those you talk to?
| (now, is there a limitation to the caadadadddaadr length?)
in the predefined functions, c[ad][ad][ad][ad]r is the longest it gets. you are free to define your own, longer names, if you so desire. the _usefulness_ of this is highly debatable, however, since you should name them according to your needs, not according to the implementation.
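To illustrate "name them according to your needs, not according to the implementation": rather than composing ever-longer c[ad]*r chains, one names each accessor after the data it extracts. A sketch (the record layout and names here are hypothetical):

```scheme
;; Suppose an employee record is the list (name title salary).
;; Name the accessors for the domain, not for the pair structure:
(define (employee-name   e) (car e))
(define (employee-title  e) (cadr e))
(define (employee-salary e) (caddr e))

(define emp (list "Ada" 'engineer 90000))
(employee-name emp)    ; => "Ada"
(employee-salary emp)  ; => 90000
```

A longer accessor like `caadadadddaadr` would only ever describe the shape of the data, never its meaning, which is why nobody misses it.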
but why _does_ this implementation issue bother you so much? I get the distinct impression that you cannot think in higher-level terms unless the lower-level concepts are removed from out of your sight. you even accuse _me_ of such a shallow-minded attitude, when it is instead obvious that you don't make _any_ effort to understand what I'm talking about. I have shown you that by a switch of names, your problem would go away, but you still continue to hang on to your confusion and insistence that we should listen to your misguided "ideals". _why_ don't you listen? do you suffer from prestige?
| In Erik's last message arguing point-for-point style to my message, many | points are really pointless, philosophical, and cheap shots.
I'm really sorry, but you have to realize that the questions you asked just don't have the answers you want them to have. so, some of them will _have_ to be answered philosophically because you don't understand the philosophical underpinnings of your questions, only the implementation issues that you are used to from Mathematica.
consider this: it's an artifact of the implementation of the Mathematica language that causes you to believe that there is not a cons-cell-based implementation underneath, and an artifact of the implementation of lists in Scheme (and CL) that causes you to complain about the implementation. whether to provide access to lower-level aspects of an abstraction is _not_ a language issue, but an implementation issue. whether to focus on the implementation issue instead of the abstractions they implement is a user attitude issue, not a language issue. when you ignore the fact that there are other artifacts of the implementation that any reasonably good Lisp programmer would employ that would not expose the implementation, you argue very strongly against yourself -- you really want to deal with an implementation issue: "how much of the internals can an implementation show before it bothers Xah to death?" I don't ask that question. I ask the question: "how little of the implementation do I have to worry about if I want to worry as little about the implementation as possible?"
you claim that you don't want to think or talk about the implementation issues, but you talk of nothing else! now, quit that. talk about the abstractions you want to talk about. _I_ have written (at length) about the fact that you _can_ think in terms of the abstractions regardless of how the list abstraction is implemented. you have _ignored_ this and focus only on whatever you don't like, so I have revised my opinion of you. I think you are beyond hope and that your arrogant ignorance is really due to stupidity. I also think it was stupid of you to force me to revise my opinion of you.
| In summary, I think he is more into computer engineering than | computer science, and perhaps too enamored with CL.
well, you have demonstrated that you can't think, so it's immaterial what you say you think, but let's hope that you will take the time to listen to what I have said, instead of what your narrow-minded reactions tell you to ignore. you may find that I did lend _you_ a hand while I slapped your ignorant comments (did you notice how I differentiated between your state of mind before and after reading my explanation?). you stupidly decided to behave as if _you_ were slapped, and now you return with moronic accusations that portray your own _obsession_ with implementation issues, quite contrary to your stated interests. whatever did you expect to achieve from all this?
if you want lists, you have them. if you want to be bothered about cons cells, that's your choice. I suggest you don't get bothered by them.
This message is not a flame, but a sensible try at discussion.
I hope this is one last post by me. I have some clarifications to make regarding Erik's comments on cons in general.
Let me make it clear one last time that all I'm concerned with is the interface. I don't care about how something is implemented, be it cons cells or whatever. I learned from Erik that the internal form is often termed the "representation" while the external one is termed the "presentation". I correct my previous statement that used the term "representation" when I meant "presentation". Here's the corrected statement: "trees are _presented_ in Mma as List[...]". It is true that in Mathematica (Mma) one cannot look at internals. Only people at Wolfram Research Inc. (the maker of Mma) have the privilege to look at the internals.
On the whole, the only thing I object to in Erik's posts is his tone. After some misunderstandings of his first message, I now have no objection to the technical information about cons that he provided to me. And on the whole, I think we have common views of what the "right thing" is. But let me add that this is irrelevant, since I don't really have the computer science background to make my concurring with his views meaningful.
Erik expressed that he is uncertain about whether I care about implementation, because he has explained that `cons' need not be used, and if I agree with this, I shouldn't have any problems. The reason for his uncertainty is that I have not yet expressed whether I agree that cons, car, and cdr need not be used if one does not want to use them. I have good general experience programming Mma. On the whole, I do not doubt Erik's words, but because of my limited knowledge of Scheme, I'm not yet able to confirm whether I could write code without ever using cons, car, or cdr and still have normal-looking Scheme code. I have reasons to say the contrary: in the SICP book, very early on, cons, car, cdr, pair?, and null? are used for several purposes in making or traversing _flat_ lists ("list" as in "ordered set", with no language connotations). This bothers me because the appearance of cons now interferes with what would normally be coded if cons hadn't surfaced in the language as a primitive. Since this classic Scheme book is filled with cons and its associated primitives (car, cdr, pair?, null?), one could imagine that writing code without using them is probably out of style with the community. Also, as Erik himself expressed, only good or experienced Lispers avoid using cons and its associated primitives.
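For what it's worth, a sketch of what such cons-free surface code might look like, assuming one defines the list-level names `first`, `rest`, and `empty?` once and writes everything else against them (the names are illustrative, not standard R4RS):

```scheme
;; The pair primitives appear only in these three definitions;
;; all other code is written against the list-level names.
(define (first  xs) (car xs))
(define (rest   xs) (cdr xs))
(define (empty? xs) (null? xs))

(define (sum xs)
  (if (empty? xs)
      0
      (+ (first xs) (sum (rest xs)))))

(sum (list 1 2 3 4))   ; => 10
```

Whether such code is "normal-looking" to the Scheme community is a style question; nothing in the language prevents it.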
Basically, I think if you have something in a language, people will use it. Isn't this true by experience?
Again, let me emphasize that I don't care about implementations. The problem with cons is that the implementation interfered with the language (the interface), and that's what I dislike. (Note: I'm not making fun of languages here; I'm just expressing myself.) It could all just be that I'm not used to Scheme yet. I'm just bothered by the fact that, _on paper_, a nested thing is used to represent a flat thing, and I'm afraid the confusion will really begin when I want a real nested thing: trees. I could elaborate on the cons business and give sample code to illustrate my point if anyone wishes. (It'll take time, especially because I'm not skilled at Scheme. I hope I've made my point clear.)
PS: for a few non-technical reasons, I'm looking for a second language other than Mma. That is the reason I'm learning Scheme.
I think people would understand your concerns a little better if you could explain why you don't have the same worries about Mathematica. After all, First is the same operator as CAR, Rest is the same operator as CDR, and Prepend is the same operator as CONS (when you are dealing with lists). If you're not worried about the implementation, then why doesn't the presence of these operators in Mathematica also "interfere with what would normally be coded"?
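The correspondence is direct enough to write down. A sketch defining Mathematica-style operators in terms of the pair primitives they match on proper lists (note Mathematica's `Prepend[list, elem]` takes the list first, the reverse of `cons`):

```scheme
;; Mathematica-style list operators, on proper lists:
(define (First   xs)   (car xs))
(define (Rest    xs)   (cdr xs))
(define (Prepend xs x) (cons x xs))   ; Mathematica argument order

(First (list 1 2 3))     ; => 1
(Rest  (list 1 2 3))     ; => (2 3)
(Prepend (list 2 3) 1)   ; => (1 2 3)
```

So the three operators are the same abstraction under different names; only the spelling and argument order differ.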
Now, it sounds like you're worried about "good style" in the language, rather than "just interface".