Can anyone confirm and explain or elaborate?
You don't want to take a C++ class taught by a Lisp advocate.
For that matter you don't want to take a Lisp class taught by a C++ advocate.
Language bigots & fanatics are best avoided. They're just silly people. If you
trust what they say or teach you'll be heading for disaster.
Cheers & hth.,
- Alf
>You don't want to take a C++ class taught by a Lisp advocate.
>
>For that matter you don't want to take a Lisp class taught by a C++ advocate.
Understood, but the teacher is neither.
It was just something that came up when talking about the
history of programming languages.
Something that someone in history said. And I'm curious
why this person thinks this.
You'd have to find them and ask them. But deciding that one language is
the 'ultimate' language which is suitable for all tasks is like deciding
that that new screwdriver you got for Christmas is perfect for
everything, including hammering in nails. It's daft. Always try to use
the right tool for the job, and avoid pointless arguments about which is
the 'ultimate' tool for every job - it doesn't exist (and if it did, you
wouldn't be able to afford it <g>). That doesn't mean you can't have
productive, evidence-based discussions about which tools are more or
less appropriate for given tasks - those are important.
Stu
Well, it's certainly considered by some to be the ultimate
language. Just about any language has such fanatics, and I've
also heard C, C++ and Java qualified as the ultimate language.
Ignoring such extremists, however, I think it's safe to say that
the ultimate language doesn't exist, and almost certainly never
will.
With regards to the "direction" languages are going: there are
strong arguments on some grounds ("proving" correctness, etc.)
in favor of functional languages, and there have been very
serious articles, by leading experts (particularly John Backus),
advocating functional languages in order to be able to better
prove correctness. All of the papers I can find in this regard,
however, date from the 1970's; since then, either the proponents
have given up or retired, or they've accepted the fact that
functional programming is not going to replace procedural
programming in practice, regardless of the theoretical aspects.
Or perhaps they've simply recognized that you can use a more or
less functional style even in more general purpose languages
like C++, which weren't designed for it; in my own code, there
are a large number of functions which consist of a single return
statement, and I'll accept a more complicated expression than
usual in order to achieve this (but without making any sort of
hard and fast rule about it).
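A minimal sketch of that single-return style (hypothetical helpers, not
code from the post):

```cpp
#include <algorithm>
#include <cctype>
#include <string>
#include <vector>

// Each function body is a single return statement, accepting a
// somewhat more complex expression to achieve it.
bool isBlank(std::string const& line)
{
    return std::all_of(line.begin(), line.end(),
                       [](unsigned char c) { return std::isspace(c) != 0; });
}

int countBlank(std::vector<std::string> const& lines)
{
    return static_cast<int>(std::count_if(lines.begin(), lines.end(),
                                          isBlank));
}
```

Nothing here requires a functional language; it's just ordinary C++
written expression-first rather than statement-first.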
Finally, the "direction" modern languages are going in practice
is a mixture of object oriented and various generic idioms (of
which C++'s template meta-programming is an example). Which
doesn't prevent most code written in industry from still being fairly
procedural---inside the classes of OO, the member functions are
still very procedural. (IMHO, making them more often
functional, without going to extremes, would be an improvement,
but the fact remains that in practice, when I look at existing
code bases, 40 or 50 line functions doing all sorts of
unrelated, or very weakly related things is still the norm.)
--
James Kanze
Except for Haskell! ;)
(Well, at least if you ask *any* Haskell advocate.)
It would help if you had more links for these quotes.
Probably some of them date to the 1990s. At the time
there may have been more truth to the quotes than
there is now. Progress has been made in ironing out
some of the weaknesses. Anyway, my opinion is
similar to something Kanze wrote once -- C++ isn't
very good, but it is better than the alternatives.
I don't think the future is bright for C, Java or
Lisp. The sapling on the cover of D&E though has
since been growing into a healthy tree. That
growth makes some people angry I guess.
Brian Wood
http://webEbenezer.net
you could take a look at comp.lang.lisp (don't cross-post or you'll
start a war!). You could take a look at scheme as well (a light-weight
variant of Lisp (and *that's* a controversial view as well!)).
Lisp is an interesting language and nearly as old as FORTRAN.
Lisp has been the future for an awful long time.
--
In a profession plagued by, "when all you have is a hammer, everything
looks like a nail," we get really excited when someone is able to come
along and prove that everything really *is* a nail if lambda is the
hammer.
B.R.Lewis (comp.lang.scheme)
> C++ wants to be an »object oriented programming language«
No it doesn't. C++ wants to be a multi-paradigm programming
language, supporting many different paradigms, including object
oriented programming. Many (not all) of the criticisms of it
come from people who insist that only one paradigm is good, and
are upset with C++ because it doesn't impose that paradigm. In
practice, the more tools you have in your workbox, the more
effective you can be (provided you know how to use them). For
any given paradigm, C++ is probably a bit harder to use than a
language dedicated to only supporting that paradigm, but it has
the advantage of not imposing any one paradigm, and letting you
use which ever one is most appropriate to the problem at hand.
Having said that: let's look at where your quotes are coming
from.
> The term »object oriented programming« was coined in 1967 by
> Alan Kay who wrote in 2003 regarding object oriented
> programming:
The term was coined by Alan Kay, but has gone on to acquire a
life and a meaning of its own, beyond what he originally
conceived of. In particular, Kay's OO was very much dynamically
bound, with inheritance only for implementation; almost all
serious OO languages today use static type checking, with
inheritance mainly of interface. Arguably, these are two
different things, and a different word should have been chosen,
but it wasn't. (Meyer once defined OO in a way that it couldn't
be done in Smalltalk:-).)
> »It can be done in Smalltalk and in LISP. There are
> possibly other systems in which this is possible, but
> I'm not aware of them.«
> http://www.purl.org/stefan_ram/pub/doc_kay_oop_en
Note that Kay was the inventor of Smalltalk. (Interested party,
so to speak.)
> Other quotes:
> »Actually I made up the term "object-oriented", and I
> can tell you I did not have C++ in mind.«
That's rather obvious, since C++ didn't exist at the time. More
to the point, Kay was very much thinking in terms of dynamic
type checking (as in Lisp); as I said, the word has evolved, and
most people associate it with languages with static type
checking: Java and Eiffel, if not C++.
> http://en.wikiquote.org/wiki/Alan_Kay
> »As for myself, it wasn't until I got to play
> extensively with Smalltalk, Objective-C, and LISP that
> I realized just how badly broken C++ is.«
I'm not too sure how to take that (except maybe as sour grapes):
Kay obviously was playing with Smalltalk (and probably Lisp,
given the influence of Lisp on Smalltalk) long before C++ was
even invented.
> Charlton Wilbur in <87oezm7eoa....@mithril.chromatico.net>
> »Assuming from the message title that you are going to
> be teaching the basic OO paradigm to novices, then I
> agree with O'Leary, C++ is a terrible place to start.«
> H. S. Lahman in <Kf7Df.1590$0J3.367@trndny08>
> »The problem of teaching object-oriented programming (...)
> C++ fails to meet almost all requirements on our list.«
> Michael Kölling
> »C++ is generally regarded as the most technically
> deficient of the popular OOPLs.«
> H. S. Lahman in <bv6je.8995$_f7.1506@trndny01>
> »There are only two things wrong with C++,
> The initial concept and the implementation.«
I'm not familiar with any of the above people, but I'd be
interested in knowing more about the context in which they are
speaking. *IF* the goal is to teach OO as the unique solution,
and as an end in itself, then no, C++ is not the ideal language.
If the goal is to teach effective programming, using OO when
appropriate, other languages when appropriate... C++ probably
isn't the ideal language, either, but none of the others are
very good either. It's a real problem; in some ways, I'd argue
that the first programming course should still be taught in
Pascal (but that's just showing my age). In other ways, I'd
argue that the language isn't that important in itself: the best
book on programming (in general) that I know is by far
Stroustrup's: _Programming Principles and Practice Using C++_.
And one of the main reasons is in the title: it insists on the
principles and practice, reducing the language (C++) to the role
of a tool (which is what it should be). He could rewrite the
book to use Java (or Lisp, or just about any other language)
without major modifications.
> Bertrand Meyer
> »C++ is a horrible language.«
Bertrand Meyer is the inventor of Eiffel. And he's very
dogmatic; you could replace C++ with any other language except
Eiffel in the above, and he would probably agree. As I said,
I've read a paper in which he proves not only that you can't
really do OO in C++, but that you can't do it in Smalltalk
either. (Meyer's contributions to software engineering are not
to be underestimated, but I do wish he'd be less dogmatic about
Eiffel.)
> Linus Torvalds
> »C++ has always been a zombie, its only drive is the C
> ghost inside it«
Given the quality of Linux, I don't think you can quote Linus on
software engineering issues.
--
James Kanze
It's probably a nice language in theory, but in practice
nothing really useful has been programmed with it. That's the
case with many languages that are supposed to be better
than C++. I think the reason is simple: horrible syntax.
I know some people don't appreciate the syntax of C++, but
at least it can be read by humans, in most cases (there are
examples of very twisted C++ source code which is harder
to read than usual).
UMICH claims: Lisp totally dominated Artificial Intelligence
applications for a quarter of a century, and is still the most widely
used language for AI. In addition to its success in AI, Lisp pioneered
the process of Functional Programming.
http://www.engin.umd.umich.edu/CIS/course.des/cis400/lisp/lisp.html
MickG
Yes, but in terms of program organization, not in terms of paradigm.
See "A History of C++: 1979-1991" for more information:
http://www.research.att.com/~bs/hopl2.pdf
> >Many (not all) of the criticisms of it come from people who
> >insist that only one paradigm is good, and are upset with C++
> >because it doesn't impose that paradigm.
>
> Ignoring other paradigms, C++ does not support
> object-oriented programming well even when it is viewed in
> isolation. It misses:
>
> - >>Everything is an object<<
Which is not a fundamental OO principle.
Even Java, which is pretty OO, has some fundamental types alongside
classes (int as opposed to Integer, IIRC - I have not programmed in
Java for quite a long time).
However, nothing prevents you from using some library that wraps the
fundamental types into classes and using those.
[snip]
> - Dynamics
>
> - In object-oriented programming, most is done
> as late as possible at run time (like message
> dispatching) - the C++ standard library tries
> to do as much as possible as early as possible
> (at compile time; template instantiation).
C++ does support runtime polymorphism.
If you are speaking about message dispatching, it can achieve OO, but
it is only a mechanism, not a paradigm. And the runtime overhead is
not in line with C++'s goals.
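C++'s opt-in late binding can be sketched like this (hypothetical types
for illustration):

```cpp
#include <string>

// Dispatch is late only where you ask for it: a virtual call through
// the base-class interface selects the override at run time.
struct Shape {
    virtual ~Shape() = default;
    virtual std::string name() const { return "shape"; }
};

struct Circle : Shape {
    std::string name() const override { return "circle"; }
};

// s.name() is resolved at run time from the dynamic type of s.
std::string describe(Shape const& s) { return s.name(); }
```

Non-virtual calls, by contrast, are bound at compile time, which is the
"don't pay for what you don't use" default.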
--
Michael
>On 20 loka, 11:05, Nick Keighley <nick_keighley_nos...@hotmail.com>
>wrote:
>> Lisp is an interesting language and nearly as old as FORTRAN.
>
>It's probably a nice language in theory, but in practice
>nothing really useful has been programmed with it.
Lilypond (www.lilypond.org) is written in Scheme, which I understand
is a dialect of Lisp. Lilypond is a free, open-source, music notation
program, capable of producing *extremely* nice looking scores, of just
about any degree of complexity.
This is extremely useful for those of us who have a need to write out
sheet music from time to time.
--
Tim Slattery
Slatt...@bls.gov
http://members.cox.net/slatteryt
Google for "Worse is Better". It was written at a time when C was
becoming more popular than LISP and gives insights on why it was so.
- Anand
Well, IIRC parts of the software for the Deep Space 1 probe (or was it
another one?) were programmed in Lisp. Of course this depends on what
exactly you define as useful...
lg,
Michael
Just for the records, I think you have mistaken the quotes'
attribution.
You seem to have interpreted them as "quoted person / quote" while
Stefan Ram seems to have posted them as "quote / quoted person".
If I'm correct, then your considerations might need to be revised
- in particular:
(editing Stefan Ram's text)
---
»There are only two things wrong with C++,
The initial concept and the implementation.«
Bertrand Meyer
---
»C++ is a horrible language.«
Linus Torvalds
---
»C++ has always been a zombie, its only
drive is the C ghost inside it«
Pascal J. Bourguignon
---
(end of edited Stefan Ram's text)
As for this discussion in general, I still need to get accustomed to
other languages.
I am currently rehearsing my javascript-ing, which was rusty at least,
and I'm having some headaches about the pass-by-value vs pass-by-
reference behavior.
For what it's worth in this context, I love C++ for its completely
explicit behavior about this and several other subjects, and I love it
being multi-paradigmatic, it feels just like having a swiss-army knife
always at reach.
--
Francesco S. Carta, http://fscode.altervista.org
Well, I like Haskell very much. But it has its own problems (space
leaks because of too much laziness, and the global variable problem,
come to mind). Some do very low-level things with Haskell (Linux kernel
modules and an operating system kernel), while most other uses can be
found in more high-level stuff.
Anyway, there are still certain tasks for which I would stick with C++
because it's better suited there. At least in my daily work I use most
of the time C++.
lg,
Michael
Hmm. And what has AI achieved as a result, a quarter-century later?
;-/
--
Richard Herring
Pure paradigms seldom work very well. They might be theoretically
beautiful, but once you get into *practical* programming the limitations
imposed by the purity of the paradigm will be a huge hindrance.
That's the reason why most programming languages which were intended
from the start to be "pure" ended up not being so. A perfect example of
this is Haskell: It was intended to be an absolutely pure functional
programming language. However, today's Haskell has quite a few
non-functional features (such as eg. mutable arrays) because of
*practical* reasons.
Paradigm purity and practicality often clash quite badly.
> It misses:
>
> - »Everything is an object«
There may be theoretical reasons why you would want everything to be
an "object" (wouldn't "class" be the proper term? "object" is an
instance of a class, at least when we are talking about C++), but once
again this can become a hindrance in practice. And usually such OO
purity comes with the cost of increased memory usage and slower
execution speed.
ints might not be "objects". Good thing, I say. I don't want ints
taking 24 bytes of memory and being 100 times slower to instantiate and
destroy.
> - In C++, there is no support for several kinds
> of data as objects (numbers are not objects,
> booleans are not objects)
Maybe not in the standard library, but nothing stops you from making
your own numerical and boolean classes if you really want. (I'm not sure
why you would want that, though. I don't see too many benefits.
Efficiency drawbacks, on the other hand...)
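If one did want such a class, a minimal sketch (a hypothetical wrapper,
not from the post) shows that the overhead only appears once you make
it polymorphic:

```cpp
// Class syntax around an int, with no virtual functions: same size and
// cost as a plain int. Adding a virtual function is what would
// introduce the vptr and the slowdown discussed above.
class Int {
    int value;
public:
    explicit Int(int v = 0) : value(v) {}
    Int& operator+=(Int other) { value += other.value; return *this; }
    int get() const { return value; }
};

static_assert(sizeof(Int) == sizeof(int), "no per-object overhead");
```

So the efficiency cost comes from OO dynamism, not from class syntax as
such.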
> - In C++, there is no support for code blocks
> as objects:
>
> - »{ o.f(); }« is not an object
Are you sure that's really part of the definition of "object oriented
programming language"? It sounds to me more like belonging to the
functional programming paradigm.
> - Dynamics
>
> - In object-oriented programming, most is done
> as late as possible at run time (like message
> dispatching) - the C++ standard library tries
> to do as much as possible as early as possible
> (at compile time; template instantiation).
And?
[...]
> >C++ isn't very good, but it is better than the alternatives.
> (BTW: Seems to me as if someone also said this about
> democracy.)
That's the way I first heard it, too. I just adapted it to C++.
(I think the original form, and the way I adapted it, was that
C++ isn't very good, but the alternatives are even worse. And
of course, it's one of those quotes which really doesn't mean
much---there are certainly cases where C++ isn't appropriate:
it's not about to replace PHP, for example, in the applications
where PHP is used, nor SQL.)
--
James Kanze
> C++ started out as »C with classes«, so, initially,
> classes were seen to be the main raison d'être for C++.
A desire to support object oriented programming is certainly one
of the motivations for using C++, rather than e.g. C. Object
oriented programming is a useful paradigm, and it's rare that I
don't use it for some parts of my application. *For* *some*
*parts*. The advantage of using C++ is that I don't have to
switch languages when other paradigms are more appropriate. Or
force my design into a paradigm that isn't appropriate. Or fake
it somehow (like java.lang.Math), pretending to be OO when I'm
not.
> >Many (not all) of the criticisms of it come from people who
> >insist that only one paradigm is good, and are upset with C++
> >because it doesn't impose that paradigm.
> Ignoring other paradigms, C++ does not support
> object-oriented programming well even when it is viewed in
> isolation.
It supports it "sufficiently".
> It misses:
> - »Everything is an object«
Which is an advantage. Some things aren't objects, and
shouldn't be considered such.
> - In C++, there is no support for most literals
> as objects:
> - »2« is not an object
> - »"abc"« is not an object
Define "object". According to the standard, both are objects.
I assume that you mean something more, however, but what
exactly? And depending on your answer, maybe 2 or "abc"
shouldn't be objects. (They certainly shouldn't be polymorphic,
nor have identity.)
> - In C++, there is no support for several kinds
> of data as objects (numbers are not objects,
> booleans are not objects)
Again, depending on your definition of object, maybe they
shouldn't be.
> - In C++, there is no support for code blocks
> as objects:
> - »{ o.f(); }« is not an object
Not yet:-).
> - Dynamics
> - In object-oriented programming, most is done
> as late as possible at run time (like message
> dispatching) - the C++ standard library tries
> to do as much as possible as early as possible
> (at compile time; template instantiation).
That really depends on your definition of object oriented
programming. C++ leaves it up to you to define when you want
late dispatch. In a controlled (type verified) manner. Which
for most uses is superior to uncontrolled dynamism a la Lisp:
errors at compile time are far better than the program failing
in the field. (Note that even an expert like Meyer insists on
static type checking.)
--
James Kanze
> There may be theoretical reasons why you would want everything to be
> an "object" (wouldn't "class" be the proper term? "object" is an
> instance of a class, at least when we are talking about C++),
That's one aspect of C++ that gets the purists' panties in a wad. In
"pure" OOP, classes are objects too.
sherm--
Computerized symbolic mathematics systems, e.g. Macsyma and its
imitators. I find that useful on a weekly basis ;-)
KHD
This statement is unfounded. COS uses dynamic dispatch but is as fast
as a C++ virtual function call. See
http://cos.cvs.sourceforge.net/viewvc/cos/doc/
slides-cos.pdf section performance
cos-draft-oopsla09.pdf page 14
a+, ld.
Krice <pau...@mbnet.fi> spake the secret code
<e5246f17-b222-4dfc...@u13g2000vbb.googlegroups.com> thusly:
>On 20 loka, 11:05, Nick Keighley <nick_keighley_nos...@hotmail.com>
>wrote:
>> Lisp is an interesting language and nearly as old as FORTRAN.
>
>It's probably a nice language in theory, but in practice
>nothing really useful has been programmed with it.
Nonsense
--
"The Direct3D Graphics Pipeline" -- DirectX 9 draft available for download
<http://legalizeadulthood.wordpress.com/the-direct3d-graphics-pipeline/>
Legalize Adulthood! <http://legalizeadulthood.wordpress.com>
;)
MickG
How often is "often"?
If you have, for example, a vector of ints, where each int is a
full-fledged object (in other words, some of the ints could actually be
objects derived from an int), then I don't think the compiler has any
way of optimizing dynamic binding checks away. It cannot prove that none
of the ints in the array are objects derived from int.
Also since such ints are full-fledged objects, it means that the
vector cannot store them by value. It has to store them by reference.
Right there you just *at least* doubled the memory usage of the vector
(assuming the absolutely optimal situation; in practice, however, the
memory usage probably quadrupled, or worse).
If the vector has the ability to store the ints by value rather than
by reference, then you are immediately admitting that the ints are not
full-fledged objects (because they cannot be replaced with objects
derived from an int). However, if you want any kind of efficiency in
your program, that's a practical decision to make.
I really don't see such a huge advantage in being purist about the OO
paradigm. Purism often bites back in decreased efficiency.
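The storage argument above can be sketched concretely (hypothetical
helper names; sizes assume a typical platform, e.g. 4-byte int and
8-byte pointers on 64-bit):

```cpp
#include <cstddef>
#include <memory>
#include <vector>

// By-value storage: the ints live contiguously in the vector's buffer.
using ByValue = std::vector<int>;

// "Full-fledged object" storage: one pointer per element, each int in
// its own heap block (allocator bookkeeping not even counted here).
using ByReference = std::vector<std::unique_ptr<int>>;

std::size_t perElementByValue()
{
    return sizeof(ByValue::value_type);                    // just the int
}

std::size_t perElementByReference()
{
    return sizeof(ByReference::value_type) + sizeof(int);  // pointer + pointee
}
```

On a 64-bit platform the indirect layout costs at least pointer plus
pointee per element, before heap overhead, which is where the "at least
doubled" figure comes from.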
I mean the "don't pay for what you don't use" principle, and I don't
use virtual functions all the time (although OO guys would say I should).
> See
>
> http://cos.cvs.sourceforge.net/viewvc/cos/doc/
I have seen the benchmark, but I am usually sceptical about such
exercises comparing languages/compilers (whatever the result).
--
Michael
Says who? I doubt Prof. Winston for one agrees with you. See
for example the "integrates the same way I do" story in the
second paragraph of page 291 here
http://books.google.com/books?id=ZG2Zd_EQP-EC&lpg=PP1&pg=PA291#v=onepage&q=&f=false
He went on to note this tendency people have to say a thing
is no longer "intelligent" once they are shown the details of
how it works. intelligent != mysterious
Perhaps you would like to offer what you consider to be the
modern definition of "intelligence" suitable for demarcating
what is and is not A.I.?
> Maybe the outstanding project of classical AI is Cyc, I am
> not aware of any impressive results yet. By its original
> goals, Cyc should be reading newspapers today and learning
> from it. Of course, there might be something like this today
> and kept secret for some reason.
>
> Any claim of advances in AI should be related to what
> was already achieved about 40 years ago:
We are talking about a 40 year old language and responding
to a blanket statement regarding what has been done in the
language. The restriction you propose above seems quite
artificial and arbitrary and maybe begging the question.
> »Person: WHAT COLOR IS IT?
> SHRDLU: BY "IT", I ASSUME YOU MEAN THE SHORTEST
> THING THE TALLEST PYRAMID'S SUPPORT SUPPORTS.
> SHRDLU: RED
> Person: HOW MANY THINGS ARE ON TOP OF GREEN CUBES?
> SHRDLU: I'M NOT SURE WHAT YOU MEAN BY "ON TOP OF
> "IN THE PHRASE "ON TOP OF GREEN CUBES ".
> DO YOU MEAN:
> 1 - DIRECTLY ON THE SURFACE
> 2 - ANYWHERE ON TOP OF ?
> Person: 2
> SHRDLU: THREE OF THEM.
> Person: HAD YOU TOUCHED ANY PYRAMID BEFORE YOU PUT THE
> GREEN ONE ON THE LITTLE CUBE?
> SHRDLU: YES, THE GREEN ONE.
> [...]
> Person: PICK UP A BIG RED BLOCK.
> SHRDLU: OK. (does it)
> Person: GRASP THE PYRAMID.
> SHRDLU: I DON'T UNDERSTAND WHICH PYRAMID YOU MEAN.
> Person: FIND A BLOCK WHICH IS TALLER THAN THE ONE YOU ARE
> HOLDING AND PUT IT INTO THE BOX.
> SHRDLU: BY "IT", I ASSUME YOU MEAN THE BLOCK WHICH IS TALLER
> THAN THE ONE I AM HOLDING.
> SHRDLU: OK. (does it)«
>
> http://hci.stanford.edu/~winograd/shrdlu/
>
> Lupo LeBoucher said in 2004:
>
> »Computer science has been in a dark age since the 1970s.«
>
> (IIRC, in »comp.lang.lisp«)
>
> However, one should not despair, after all nowadays we have
>
> http://google.to/search?q=number+of+horns+on+a+unicorn
>
> and automatic weapon control:
>
> »It appears as though the gun, which is computerised,
> jammed before there was some sort of explosion, and then
> it opened fire uncontrollably, killing and injuring the
> soldiers.«
>
> http://blog.wired.com/defense/2007/10/robot-cannon-ki.html
KHD
PS. Why do you have a problem with HTTP access to your posts?
And do you think anyone/anything is paying attention to the
extra conditions you put on the X-No-Archive attribute? The
only practical result these days (it seems to me) is to mess
up the google archives causing threads to become confusing
within a few days. What is the benefit of that?
> Hmm. And what has AI achieved as a result, a quarter-century later?
Do you believe that the progress observed in an entire scientific
domain has any relation with the use of a certain programming
language by some projects?
Rui Maciel
[ ... ]
> Hmm. And what has AI achieved as a result, a quarter-century later?
AI never has, and never will achieve anything. That's not to say that
AI research is useless though -- it's just that as soon as we figure
out how to actually do something in a usable, useful fashion, it
quits being considered AI.
If you look at older books on AI (e.g. from the 1950's) you'll find
that at one time things like OCR, hand-writing recognition, and
speech recognition were all considered proper "AI" subjects. They
aren't anymore though -- as soon as somebody figured out a semi-
practical way of doing them, they weren't "AI" anymore.
--
Later,
Jerry.
[ ... ]
> Google for "Worse is Better". It was written at a time when C was
> becoming more popular than LISP and gives insights on why it was so.
Mostly it gives insights into the phenomenon mostly commonly referred
to as "sour grapes". I.e. it was written by a guy who could see that
what he liked was losing out to something he didn't like. He
proceeded to proclaim, in essence, "Look, the whole rest of the band
is out of step!"
--
Later,
Jerry.
> > i.inc()
> > might often be compiled to the same code as
> > ++i
> > in the case where »i« is a primitive value. In many cases
> > »i« also will not have to be more memory than an int as long
> > as it behaves like an object in source code.
> How often is "often"?
> If you have, for example, a vector of ints, where each int is a
> full-fledged object (in other words, some of the ints could actually be
> objects derived from an int), then I don't think the compiler has any
> way of optimizing dynamic binding checks away. It cannot prove that none
> of the ints in the array are objects derived from int.
It can, and some do. Obviously, it's harder for the compiler
than optimizing around std::vector, but then, optimizing around
std::vector is harder for the compiler than optimizing around
Fortran style arrays. (C-style arrays are a problem for the
compiler, because they end up being pointers.)
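One concrete mechanism for this kind of devirtualization (a sketch
using C++11's final, not something described in the post): if the
compiler can prove no derived type exists, the dynamic check can be
replaced by a direct call.

```cpp
struct Counter {
    virtual ~Counter() = default;
    virtual int next() { return ++n; }
    int n = 0;
};

// final: no class can derive from FastCounter, so the compiler may
// devirtualize calls made through a FastCounter reference or pointer.
struct FastCounter final : Counter {
    int next() override { return ++n; }
};

int advance(FastCounter& c) { return c.next(); }  // direct call possible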
> Also since such ints are full-fledged objects, it means that the
> vector cannot store them by value. It has to store them by reference.
> Right there you just *at least* doubled the memory usage of the vector
> (assuming the absolutely optimal situation; in practice, however, the
> memory usage probably quadrupled, or worse).
That, again, is an optimization issue. A fairly long time ago,
I stumbled upon a page by James Gosling on adapting Java for
numeric processing, and one of the issues discussed was how the
compiler could "optimize" such arrays to avoid dynamic
allocation of each object. He actually proposed an additional
keyword (IIRC) to tell the compiler that the type met certain
constraints (which the compiler then enforced), but in
principle, the compiler could determine whether this was the
case on its own. (Given that compilation in Java normally
occurs while you're running the program, you don't want to
impose optimization techniques that are too expensive. And
given that Java uses dynamic linking everywhere, done lazily,
it's very difficult for the compiler to have a view of the
complete program, which is necessary for the best optimization.)
With a suitably designed language (no dynamic linking, for
example, C++'s const), I don't think it would actually be that
difficult to detect and optimize such cases. (The page also
defined operator overloading; since Java didn't go that way, and
Gosling implied that it would be unusable for numerics if it
didn't, the page has disappeared.)
> If the vector has the ability to store the ints by value rather than
> by reference, then you are immediately admitting that the ints are not
> full-fledged objects (because they cannot be replaced with objects
> derived from an int). However, if you want any kind of efficiency in
> your program, that's a practical decision to make.
I'm not convinced that the performance issue is the key,
although it's certainly easier for the compiler to optimize if
part of the "optimization" is done by the programmer. The fact
is that in any well designed program, regardless of the
language, there are different categories of objects: value
objects do not respect the same rules as entity objects, for
example, and in a "pure" OO language, you generally have to
follow some hard rules to ensure that value objects behave as
expected. Basically, C++ favors value objects over entity
objects, pure OO languages favor entity objects, often to the
point of making value objects very difficult. In some ways, the
"pure" OO can be considered an over-reaction---earlier
languages, like C or Fortran, ignored entity objects entirely.
(They're practically impossible in Fortran.) In practice,
however, all of the applications I've seen need some of both.
(Java still has final, which, used correctly, allows something
very close to a value object. Arguably, if you're making a
significant number of classes final in Java, you aren't
programming OO. But you are writing more robust and easier to
maintain code than if you skipped the final.)
> I really don't see such a huge advantage in being purist about the OO
> paradigm. Purism often bites back in decreased efficiency.
Decreased programmer efficiency, above all. If all you have is
a hammer, then everything looks like a nail; but a good workman
has a variety of tools in his toolbox, and uses whichever one is
appropriate for the occasion. And using a screwdriver, rather
than a hammer, on screws, makes you more efficient.
--
James Kanze
> »2« is a literal and an expression denoting an r-value,
> »"abc"« is a literal and an expression denoting an r-value
> of pointer type.
Agreed for the first (and it doesn't have an address). »"abc"«,
however, is an lvalue, of type »char const[4]«.
Array types do convert to pointers in a lot of circumstances,
and this is certainly a major defect in C++ (which it inherits
from C), but this has nothing to do with OO or not; Fortran
doesn't have this problem, and I don't think you could consider
it OO.
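The point about the type of »"abc"« can be checked directly (a small
compile-time sketch):

```cpp
#include <type_traits>

// »"abc"« is an lvalue of array type char const[4]; it only *converts*
// to char const* (the array-to-pointer conversion inherited from C).
static_assert(std::is_same<decltype("abc"), char const(&)[4]>::value,
              "a string literal is an lvalue of array type");
static_assert(std::is_same<std::decay<decltype("abc")>::type,
                           char const*>::value,
              "the array converts to char const*");
```

(decltype yields a reference type for lvalue expressions, hence the
char const(&)[4].)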
> You are right that one has to take care with the word
> »object« as it means two different things in C++ and in
> »object-oriented«.
Yes. That's why I felt it important to mention. However...
> In this case, i intended use »object« in the sense of »an
> entity that accepts messages at run-time«, but this is not
> directly applicable in C++ as C++ also does not have »messages«.
That's what I more or less supposed; I think globally, it was
clear that you were speaking of objects in the OO sense---what I
would call entity objects (i.e. which have identity).
As for messages... That's really just a question of vocabulary.
Most OO languages today speak of calling methods, and not
sending messages, but it all comes out to the same in the end:
send a message, call a method or call a member function.
Note that similar issues concern the word "class". In Java,
java.lang.Math is actually more like a namespace than a class
(but not quite---it would be more appropriate to say that what
it does would be better handled by a namespace than a class).
But of course, one could just as well argue that Java isn't pure
OO, since double and int aren't classes---logically, double
should be a class, and sin a method of that class.
> So, I meant to say: For a language to be called »object
> oriented«, it should have »things« that can receive messages
> at run time, and the traditional »values«, »objects«, and
> »blocks« of procedural programing languages should become
> such »things«.
> >That really depends on your definition of object oriented
> >programming.
> I took it from Alan Kay, the one who coined it:
> »OOP to me means only messaging, local retention and
> protection and hiding of state-process, and extreme
> late-binding of all things.«
The problem, as I said, is that while Alan Kay may have invented
the term, it has taken on a life of its own: by that definition,
neither Java nor Eiffel are OO. (Similarly, Bertrand Meyer's
definition of OO excludes Smalltalk.) Practically speaking, I
would avoid such a limiting definitions; I don't see where
anything is gained by intentionally defining a paradigm so
restrictively that only one language (the one you invented)
allows implementing it.
--
James Kanze
<snip>
> > H. S. Lahman in <Kf7Df.1590$0J3.367@trndny08>
> > »The problem of teaching object-oriented programming (...)
> > C++ fails to meet almost all requirements on our list.«
[...]
> > H. S. Lahman in <bv6je.8995$_f7.1506@trndny01>
> > »There are only two things wrong with C++,
> > The initial concept and the implementation.«
>
> I'm not familiar with any of the above people, but I'd be
> interested in knowing more about the context in which they are
> speaking.
H. S. Lahman posts in comp.object. He's a "constructionist" (if I've got
the terminology right) in that he builds very detailed models from which
code is generated. He's not really a fan of *any* 3GL.
<snip>
--
Nick Keighley
"Object-oriented programming is an exceptionally bad idea
that could only have originated in California."
Dijkstra
> > Lisp is an interesting language and nearly as old as FORTRAN.
>
> It's probably a nice language in theory, but in practice
> nothing really useful has been programmed with it.
Emacs, the Naughty Dog games engine. I believe some of the internal
file formats in GCC look remarkably Lisp-like.
> It's the
> matter with many languages that are supposed to be better
> than C++. I think the reason is simple: horrible syntax.
Have you seen a complex template declaration?
> I know some people don't appreciate the syntax of C++, but
> at least it can be read by humans, in most cases (there are
> examples of very twisted C++ source code which is harder
> to read than usual).
Yes, the Lots of Irritating Silly Parentheses take some getting
used to, but you /can/ get used to them (surprisingly rapidly); an
editor that can count brackets helps.
--
Nick Keighley
One of the coolest features of Lisp is that it isn't a language but a
building material.
I suppose that qualifies. For an appropriate definition of
"intelligence", of course.
--
Richard Herring
I'm not the one making claims here. Did you miss this:
>> ;-/
?
--
Richard Herring
Hi, I'm just curious: where did you read that Naughty Dog game engine
is written in LISP?
If you go to their website, career section, they explicitly ask for C,
C++ and C# knowledge.
http://www.naughtydog.com/site/careers/server_programmer/
Anyway, it seems strange, since the vast majority of game engines are
written either in C or C++.
Just to name a few:
- Unreal
- Quake
- Valve's Engine
And when it's not explicitly stated, all games software houses REQUIRE
C++ skills.
I'm not really a fan of C++, but it's the de-facto standard for the
games industry.
Other languages are usually used for scripting (Lua, Lisp, etc.)
I'm no expert, but the statement (Naughty Dog's engine in LISP) seems
strange.
By the way, Naughty Dog games are awesome, Uncharted rules!
Bye,
Francesco
<SNIP>
> > > > Lisp is an interesting language and nearly as old as FORTRAN.
>
> > > It's probably a nice language in theory, but in practice
> > > nothing really useful has been programmed with it.
>
> > Emacs, Naughty Dog games engine. I believe some of the internal
> > file formats in GCC look remarkable Lisp like.
>
> Hi, I'm just curious: where did you read that Naughty Dog game engine
> is written in LISP?
> If you go to their website, career section, they explicitly ask for C,
> C++ and C# knowledge.
I believe the high performance bits were always written in C++ but
that there was a scripting language or something that was lispy. I
then think the lisp bits got rewritten in C++. If you google "lisp
naughty dog" you get quite a few hits. For instance:-
http://ynniv.com/blog/2005/12/lisp-in-games-naughty-dogs-jax-and.html
(which also gives you an idea of why they stopped using it...)
or
http://en.wikipedia.org/wiki/Game_Oriented_Assembly_Lisp
And what would you replace this C-style array "problem" with?
In other words, how would you implement iterators for C-style
array sequences? Do you have a link to a proposal for C++ or
examples of languages that "do it right" and support Stepanov
iterator concepts as C++ does?
KHD
> > On Oct 20, 8:19 pm, Juha Nieminen <nos...@thanks.invalid> wrote:
> > > If you have, for example, a vector of ints, where each int
> > > is a full-fledged object (in other words, some of the ints
> > > could actually be objects derived from an int), then I
> > > don't think the compiler has any way of optimizing dynamic
> > > binding checks away. It cannot prove that none of the ints
> > > in the array are objects derived from int.
> > It can, and some do. Obviously, it's harder for the
> > compiler than optimizing around std::vector, but then,
> > optimizing around std::vector is harder for the compiler
> > than optimizing around Fortran style arrays. (C-style
> > arrays are a problem for the compiler, because they end up
> > being pointers.)
> And what would you replace this C-style array "problem" with?
Make arrays first class types, behaving as any other type.
> In other words, how would you implement iterators for C-style
> array sequences?
The same way you implement them for any other container. For
that matter, you usually inhibit the conversion of array to
pointer (by using pass by reference) when you want "iterators"
for C style arrays, using something like:
template< typename T, size_t n >
T*
end( T (&array)[ n ] )
{
    return array + n ;
    // or return &array[0] + n, if there was no conversion,
    // or just something like
    // return array.end(), if the language were defined
    // thusly.
}
Basically, you block the conversion because it involves loss of
important information concerning the type, mainly how many
elements the array contains.
> Do you have a link to a proposal for C++ or examples of
> languages that "do it right" and support Stepanov iterator
> concepts as C++ does?
The STL has been implemented in Ada. The problems doing so had
nothing to do with arrays; the problems had to do with the fact
that Ada's generics work somewhat differently than those in C++.
And of course, one might reasonably ask why you would want to
use the STL iterator idiom, which follows the C idiom in
requiring a separate object to contain all of the necessary
information, when you could put all the information in a single
object, resulting in a more powerful and simpler to use
abstraction.
--
James Kanze
> You are right, indeed:
> »A string literal is an lvalue«
> ISO/IEC 14882:2003(E), 5.1p2
> I, however, do not understand why a string literal is an
> lvalue. I would expect that an lvalue can be used on the
> left side of an assignment operator, but I cannot see sense
> in
> "abc" = ...
> .
That requires a modifiable lvalue:-). An lvalue can be used in
the circumstances where the standard says an lvalue is required
(and because of the implicit lvalue to rvalue conversion, also
in the circumstances where an rvalue is required). In C++, you
can almost consider lvalue-ness as a more or less arbitrary
attribute, associated with expressions according to the operator
and the type, and required (or not) for certain expressions.
Intuitively, one could say that an lvalue is something that has
an address, except that rvalues of class type also have
addresses.
> >Array types do convert to pointers in a lot of circumstances,
> If »a« is an expression of array type, I surely understand
> that »a[ i ]« is an lvalue, I just do not understand, why
> the bare »a« should be an lvalue, too. After all, the
> following statement is not allowed:
> { int x[ 5 ], y[ 5 ]; x = y; }
That's because arrays are second class citizens in C++. There
was some discussion in the committee, long before C++89, about
trying to correct this, but in the end, the consensus was that C
style arrays were so broken that really fixing them could only
be done at the expense of breaking most existing programs, which
wasn't acceptable, and that there was no point in small hacks.
C++ has std::vector, and the intent was that new code not use C
style arrays at all. Except that in practice, there are cases
(like "abc") where they're unavoidable.
> >As for messages... That's really just a question of vocabulary.
> To me, the notion of a message includes run-time
> polymorphism, that is, dynamic dispatch. Any object can
> receive any message and decides at run time what to do with
> the message, depending on the message selector and the
> message arguments.
Which is a recipe for disaster, since most objects can't
reasonably do anything with most messages except declare an
error. Better to have static type checking, to ensure that the
error is caught at compile time, when it costs a lot less to correct.
--
James Kanze
Smalltalk and LISP maybe, but Objective-C? That's just crazy, IMO.
Objective-C has no RAII and no automatic constructors (in fact, it has
no constructors at all, beyond whatever non-enforced coding convention
may be used in the program). You cannot have objects as members (only
*pointers* to objects, which means that objects are never instantiated
automatically; you always have to instantiate them manually in your
chosen "constructor" method). You cannot have types/classes inside
classes (which greatly hinders modular design). There are no private
member functions (AFAIK; you can try to "hide" member functions by not
mentioning them in the header, but AFAIK you cannot stop outside code
from calling them). And you cannot copy-construct or assign objects
(at least not using compiler-generated code).
Those are some of the major flaws I have found in Objective-C. Some
less annoying flaws are lack of multiple inheritance (you can argue all
you want against MI, but in some situations it *is* useful), no "pure
virtual" functions (i.e. a way to force a derived class to implement one
of the base class functions), and no templates (in some respects the
messaging and introspection mechanisms alleviate the lack of templates
e.g. when creating generic containers, but at the cost of decreased
efficiency and increased memory usage).
C++ might be "badly broken", but I prefer it any day before Objective-C.
I definitely think it's easier to develop quality code in C++
than in C, perhaps by an order of magnitude or more.
Theoretically, perhaps some other languages would be even easier
than C++, but practically, for whatever reasons, they aren't
available or aren't appropriate for developing kernel code.
--
James Kanze
<WRT Linux quality>
>I definitely think it's easier to develop quality code in C++
>than in C, perhaps by an order of magnitude or more.
I think that statement epitomizes a huge problem I have had with C++, and
always will have. As I learned I kept looking for this wonderful magic
bullet in C++, kind of like structured programming. Once one really learns
about structured programming there is an enormous increase in productivity,
perhaps as much as an order of magnitude. As I worked myself through the
process of learning C++, I would say to myself, yes, this is nice, that is
nice, but where is the *really* good part? After many years I have
concluded that there is no really good part. It is just a complicated
agglomeration of pretty good ideas, fitted together in one language.
Someone mentioned, upthread, a Swiss army knife, and I think it is an
excellent metaphor. My problem is that I detest Swiss army knives and
Crescent wrenches!
Change order of magnitude to 40% better and I am on board as a C++ guy.
You can't make a nice cohesive, consistent, language (such as Algol 60 or
Pascal) with such diverse roots, one root is C, cryptic beyond belief with
'%' meaning modulo, and the huge, 15 character or so guru selected names of
things used in the STL. I sometimes find myself using *two* lines of code
for a simple for loop with iterators. It's kind of like someone grafted
COBOL on to a Teletype friendly APL. It is just a nasty, ugly mix, usable,
but still distasteful.
To me the "really good part" is generic programming (including
among other features overloading including operator overloading
and templates) and the paired constructor+destructor paradigm.
As for OO it's just so-so ;-)
For example, generic programming made the numerical work I was
doing FAR simpler and more enjoyable than it was in Fortran or
C.
KHD
As in make them std::vector? Why isn't that better done at the
library level as has already been done (in multiple different
ways to meet different needs)?
> > In other words, how would you implement iterators for C-style
> > array sequences?
>
> The same way you implement them for any other container. For
> that matter, you usually inhibit the conversion of array to
> pointer (by using pass by reference) when you want "iterators"
> for C style arrays, using something like:
>
> template< typename T, size_t n >
> T*
> end( T (&array)[ n ] )
> {
> return array + n ;
> // or return &array[0] + n, if there was no conversion,
> // or just something like
> // return array.end(), if the language were defined
> // thusly.
> }
>
> Basically, you block the conversion because it involves loss of
> important information concerning the type, mainly how many
> elements the array contains.
So your chief complaint is that information about the size of
the array is lost? OK, then I'm a bit confused, because you also
mentioned Fortran as an example of a language that does not have
C's "problem"; however, Fortran also discards size information, and
it must be passed in as additional parameters (or otherwise
known). For this reason canonical Fortran looks like
subroutine sum ( a, n, s )
   real s
   real a(n)
   do i = 1, n
      s = s + a(i)
   end do
end
real a(3)
...
call sum(a,3,s)
It even supports passing subarrays in effectively the same way
as C++ by the fact that size is not part of the type for example
real b(9)
...
call sum(b(4),3,s)
to sum the middle 3 elements of the original 9 element array.
So at least for built-in arrays Fortran just like C/C++ doesn't
make size part of the type and provides no built-in operators to
query the size, bounds, etc.
So anyhow, what difference do you see with regard to array size
information between Fortran and C?
> > Do you have a link to a proposal for C++ or examples of
> > languages that "do it right" and support Stepanov iterator
> > concepts as C++ does?
>
> The STL has been implemented in Ada. The problems doing so had
> nothing to do with arrays; the problems had to do with the fact
> that Ada's generics work somewhat differently than those in C++.
Ok, so you would be happy if built-in arrays were instead std::vector?
I.e. a class with some interface to the size, in this case .size(),
.begin(), etc., just like the standard interface that Ada offers to
array types (First, Last, etc.)?
So then what would the type of "new T[]" and the return type of
malloc(sizeof(T)*N) be? std::vector<T>?
> And of course, one might reasonably ask why you would want to
> use the STL iterator idiom, which follows the C idiom in
I think it is misleading to say the STL iterator idiom "follows"
the C idiom. Stepanov had abstract idioms and concepts in mind
drawn from a lineage quite distinct from C. It just so happened
by fortune that the C/C++ memory model and concrete pointer fit
very nicely into his abstract concepts.
> requiring a separate object to contain all of the necessary
> information, when you could put all the information in a single
> object, resulting in a more powerful and simpler to use
> abstraction.
That really is a separate and large topic. You would certainly have
trouble justifying in absolute the "all information", "more powerful",
and "simpler to use" claims. You could look to the recent debates
stirred up by Andrei's "iterators must go" range advocacy to see many
cogent arguments against your view above.
I don't think there is any sense in opening that debate here;
but, I would like to know do you advocate "range" concepts a la
Andrei (for generalizing sequences I mean) or something else?
KHD
> > > On Oct 21, 4:22 am, James Kanze <james.ka...@gmail.com> wrote:
> > > > On Oct 20, 8:19 pm, Juha Nieminen <nos...@thanks.invalid> wrote:
> > > > > If you have, for example, a vector of ints, where each int
> > > > > is a full-fledged object (in other words, some of the ints
> > > > > could actually be objects derived from an int), then I
> > > > > don't think the compiler has any way of optimizing dynamic
> > > > > binding checks away. It cannot prove that none of the ints
> > > > > in the array are objects derived from int.
> > > > It can, and some do. Obviously, it's harder for the
> > > > compiler than optimizing around std::vector, but then,
> > > > optimizing around std::vector is harder for the compiler
> > > > than optimizing around Fortran style arrays. (C-style
> > > > arrays are a problem for the compiler, because they end up
> > > > being pointers.)
> > > And what would you replace this C-style array "problem" with?
> > Make arrays first class types, behaving as any other type.
> As in make them std::vector? Why isn't that better done at the
> library level as has already been done (in multiple different
> ways to meet different needs)?
That's what was decided, but there are still repercussions.
There's no way you can create an std::vector with static
initialization, for example, and the initialization syntax isn't
the same. (The committee is working on the latter.)
> > > In other words, how would you implement iterators for C-style
> > > array sequences?
> > The same way you implement them for any other container. For
> > that matter, you usually inhibit the conversion of array to
> > pointer (by using pass by reference) when you want "iterators"
> > for C style arrays, using something like:
> > template< typename T, size_t n >
> > T*
> > end( T (&array)[ n ] )
> > {
> > return array + n ;
> > // or return &array[0] + n, if there was no conversion,
> > // or just something like
> > // return array.end(), if the language were defined
> > // thusly.
> > }
> > Basically, you block the conversion because it involves loss
> > of important information concerning the type, mainly how
> > many elements the array contains.
> So your chief complaint is that information about the size of
> the array is lost?
Not only that. With regard to C-style arrays, my chief complaint
is that they follow completely different rules than other types
of objects. The decision to use pass by reference, rather than
the usual pass by value, should be made by the programmer. (And
yes, in both cases, length information should be preserved.)
> Ok then I'm a bit confused because you also mentioned Fortran
> as an example of a language that does not have C's "problem"
> however Fortran also discards size information and it must be
> passed in as additional parameters (or otherwise known).
Agreed, and now that I think of it, I'm not sure that Fortran is
a good example; Fortran's arrays don't convert to pointers,
which can then be abused, but you can't (or at least, you
couldn't when I used Fortran) assign an array to another array.
(It's hard to compare Fortran, of course, because it never uses
pass by value.)
The point is that objects in C++ have a specific behavior: this
holds for the basic types, for pointers, for structs, and in
fact, for every object type except arrays. (By default, anyway.
In C++, you can replace that behavior, at least in part, by
overloading operators and defining a copy and a default
constructor.) That default behavior includes things like copy
and assignment.
> > > Do you have a link to a proposal for C++ or examples of
> > > languages that "do it right" and support Stepanov iterator
> > > concepts as C++ does?
> > The STL has been implemented in Ada. The problems doing so
> > had nothing to do with arrays; the problems had to do with
> > the fact that Ada's generics work somewhat differently than
> > those in C++.
> Ok, so you would be happy if built-in arrays were instead
> std::vector? Ie a class with some interface to the size in
> this case .size(), .begin(), etc just like the standard
> interface that Ada offers to array types (First, Last, etc)?
That would be one solution. There are many. All I really
insist on is that arrays work like any other type---a struct
doesn't implicitly convert to a pointer to its first element in
just about any context; nor should an array. And of course, as
a side effect, indexation would be indexation, and not pointer
arithmetic.
Doing this does allow carrying the size around, so you could
then add all of the advantages that allows (like efficient
bounds checking). But that's really a second point---important,
but not as primordial as the first.
> So then what would the type of "new T[]" and the return type
> of malloc(sizeof(T)*N) be? std::vector<T>?
The return type of new T should be T*, for *all* T. Not just
for the cases where T is not an array. That's really part of
what I'm complaining about: it's totally aberrant that the
return type of new int and new int[n] be the same. And that you
have to use a different form of delete on them, because the
original type has been lost.
> > requiring a separate object to contain all of the necessary
> > information, when you could put all the information in a
> > single object, resulting in a more powerful and simpler to
> > use abstraction.
> That really is a separate and large topic.
Agreed. But anyone who's worked with complex iterators to any
extent (filtering iterators, etc.) realizes what a pain it is
not being able to know the end from within the iterator. And
anyone who makes extensive use of functional decomposition, with
e.g. one function determining the range and another function
iterating over it, has suffered from the fact that you need two
separate objects to define the range.
> You would certainly have trouble justifying in absolute the
> "all information", "more powerful", and "simpler to use"
> claims.
Not in the least.
> You could look to the recent debates stirred up by Andrei's
> "iterators must go" range advocacy to see many cogent
> arguments against your view above.
> I don't think there is any sense in opening that debate here;
> but, I would like to know do you advocate "range" concepts a
> la Andrei (for generalizing sequences I mean) or something
> else?
Not having seen what Andrei is advocating, I don't know. But it
should only take one object in order to iterate, since a
function can only return a single object to be used as a single
argument to another function.
--
James Kanze
> <WRT Linux quality>
> >I definitely think it's easier to develop quality code in C++
> >than in C, perhaps by an order of magnitude or more.
> I think that statement epitomizes a huge problem I have had
> with C++, and always will have. As I learned I kept looking
> for this wonderful magic bullet in C++, kind of like
> structured programming.
And that is your problem. There is no magic bullet. There are
a number of important paradigms which improve productivity, each
important, and the strength of C++ is that all of them are
possible in it. Any one may be better implemented in another
language, but the real productivity gain is in using whichever
one is appropriate in a given situation.
> Once one really learns about structured programming there is
> an enormous increase in productivity, perhaps as much as an
> order of magnitude. As I worked myself through the process of
> learning C++, I would say to myself, yes, this is nice, that
> is nice, but where is the *really* good part? After many
> years I have concluded that there is no really good part. It
> is just a complicated agglomeration of pretty good ideas,
> fitted together in one language. Someone mentioned, upthread,
> a Swiss army knife, and I think it is an excellent metaphor.
> My problem is that I detest Swiss army knives and Crescent
> wrenches!
Perhaps. The screwdriver in a Swiss army knife probably isn't
as good as a purpose built screwdriver, but if you need to drive
a screw, it's a lot better than a hammer.
> Change order of magnitude to 40% better and I am on board as a
> C++ guy.
> You can't make a nice cohesive, consistent, language (such as
> Algol 60 or Pascal) with such diverse roots, one root is C,
> cryptic beyond belief with '%' meaning modulo, and the huge,
> 15 character or so guru selected names of things used in the
> STL.
That's certainly a problem, and a more Pascal like syntax (with
a lot less special characters) would certainly improve things.
But the number of special characters, or the use of {} instead
of begin/end is, in the end, a detail. The real difference comes
with encapsulation (and access control), the ability to use
polymorphism when appropriate (and the fact that you're not
stuck with it when it isn't), the ability to use programming by
contract idioms (with private virtual functions), the ability to
define abstract types (like std::vector), the ability to
separate interface (in the header file) and implementation (in
the source). None of these are unique to C++, and for any one,
you could probably find a better language, but having all of the
possibilities at hand makes C++ a powerful tool. Not perfect,
but it works, and the alternatives all seem to have some fatal
weakness.
--
James Kanze
Right. However those limitations of course apply to any UDT.
So given that you are arguing for making the built-in array
behave the same as other types I guess that is the price to
pay (at least for now).
Would you have any problem if built-in arrays were simply
removed entirely from the core language? Replaced instead
by some equivalent of a raw block new/malloc.
> > > > Do you have a link to a proposal for C++ or examples of
> > > > languages that "do it right" and support Stepanov iterator
> > > > concepts as C++ does?
> > > The STL has been implemented in Ada. The problems doing so
> > > had nothing to do with arrays; the problems had to do with
> > > the fact that Ada's generics work somewhat differently than
> > > those in C++.
> > Ok, so you would be happy if built-in arrays were instead
> > std::vector? Ie a class with some interface to the size in
> > this case .size(), .begin(), etc just like the standard
> > interface that Ada offers to array types (First, Last, etc)?
>
> That would be one solution. There are many. All I really
What other solutions besides something that is effectively
equivalent? Is removing them entirely a solution?
> insist on is that arrays work like any other type---a struct
> doesn't implicitly convert to a pointer to its first element in
> just about any context; nor should an array. And of course, as
> a side effect, indexation would be indexation, and not pointer
> arithmetic.
>
> Doing this does allow carrying the size around, so you could
> then add all of the advantages that allows (like efficient
> bounds checking). But that's really a second point---important,
> but not as primordial as the first.
>
> > So then what would the type of "new T[]" and the return type
> > of malloc(sizeof(T)*N) be? std::vector<T>?
>
> The return type of new T should be T*, for *all* T. Not just
> for the cases where T is not an array. That's really part of
> what I'm complaining about: it's totally aberrant that the
> return type of new int and new int[n] be the same. And that you
> have to use a different form of delete on them, because the
> original type has been lost.
And what about malloc? Should such raw allocation capability
be defined in the core library? Or should there be an equivalent
core language operator? Ie something that returns a simple
pointer to a block.
Well as I said, this isn't the thread for that debate. For starters
if you are interested here is one major thread on the issue
'Andrei's "iterators must go" presentation'
I don't think you participated in that one so you may have missed
it entirely and you might like to read it. If you do I'd be most
interested in your thoughts. But I think we should start another
thread for that.
Thanks.
KHD
Except of course for PODs with respect to static initializers.
KHD
Yes. More generally, it's the price we pay for evolution,
rather than revolution (i.e. compatibility with historical
situations). On the other hand, evolution means benefiting from
previous experience, which is a definite advantage; every really
new language I've seen has introduced its own set of problems,
not foreseen from the start for lack of experience with the
idiom.
[...]
> > The point is that objects in C++ have a specific behavior:
> > this holds for the basic types, for pointers, for structs,
> > and in fact, for every object type except arrays. (By
> > default, anyway. In C++, you can replace that behavior, at
> > least in part, by overloading operators and defining a copy
> > and a default constructor.) That default behavior includes
> > things like copy and assignment.
> Would you have any problem if built-in arrays were simply
> removed entirely from the core language? Replaced instead by
> some equivalent of a raw block new/malloc.
If built-in arrays were simply removed, then they'd have to be
replaced with something with a similar syntax, to avoid breaking
code. What I'd have liked, way back when, would have been that
something like "int a[N];" have semantics similar to "struct
{int a[N];}", i.e. it acted like a real object, which could be
assigned and passed and returned by value, there was no implicit
conversion to int*, and &a had the type int (*)[N]. Or even
int (*)[]; i.e. the size is lost when you take the address. The
point is that you only lose the type information explicitly.
But I think I would prefer int (*)[N] with an implicit
conversion to int (*)[]---functions that want to treat arrays of
variable size can; those that require a specific size can also
insist on that. Ideally, there would be some means of
recovering the initial size from the array, but if we consider
the date when all of this was being specified, I suspect that
that would be asking too much.
Of course, all of this neglects the fact that Kernighan and
Ritchie were coming from B, where everything is just a machine
word (and "arrays" are a machine word initialized to point to
the first element), where it is all coherent and makes sense.
And that arrays were present from the first; structs (and other
elaborated types which behave as first class objects) came
later.
> > > > > Do you have a link to a proposal for C++ or examples of
> > > > > languages that "do it right" and support Stepanov iterator
> > > > > concepts as C++ does?
> > > > The STL has been implemented in Ada. The problems doing so
> > > > had nothing to do with arrays; the problems had to do with
> > > > the fact that Ada's generics work somewhat differently than
> > > > those in C++.
> > > Ok, so you would be happy if built-in arrays were instead
> > > std::vector? Ie a class with some interface to the size in
> > > this case .size(), .begin(), etc just like the standard
> > > interface that Ada offers to array types (First, Last, etc)?
> > That would be one solution. There are many. All I really
> What other solutions besides something that is effectively
> equivalent?
I don't know. To tell the truth, I've not thought about it
much, since practically speaking, nothing's going to change
(except the initialization syntax for class types like vector).
> Is removing them entirely a solution?
Only if we consider breaking all existing code a solution:-).
I'm really only complaining about "what should have been", 30 or
more years ago. I don't think there's much we can do about it
today.
> > insist on is that arrays work like any other type---a struct
> > doesn't implicitly convert to a pointer to its first element in
> > just about any context; nor should an array. And of course, as
> > a side effect, indexation would be indexation, and not pointer
> > arithmetic.
> > Doing this does allow carrying the size around, so you could
> > then add all of the advantages that allows (like efficient
> > bounds checking). But that's really a second point---important,
> > but not as primordial as the first.
> > > So then what would the type of "new T[]" and the return type
> > > of malloc(sizeof(T)*N) be? std::vector<T>?
> > The return type of new T should be T*, for *all* T. Not just
> > for the cases where T is not an array. That's really part of
what I'm complaining about: it's totally aberrant that the
return type of new int and new int[n] be the same. And that you
> > have to use a different form of delete on them, because the
> > original type has been lost.
> And what about malloc?
Allocating an array using malloc should work exactly like
allocating an int using malloc. E.g.:
typedef int array[10];
int* pi = (int*)malloc( sizeof(int) );
array* pa = (array*)malloc( sizeof(array) );
(Note that the above is actually legal today. It does mean that
to access the array, you need to write (*pa)[i]. I wonder if
this awkwardness didn't also play a role in the early rules,
although it seems natural to me, and it fully parallels what I
do with a dynamically allocated int.)
> Should such raw allocation capability be defined in the core
> library?
Yes. At least, I think so; I'm not 100% sure. C++ should
continue to be a language that can be used at the lowest level
as well, for e.g. things like kernel code. On the other hand,
if you're defining memory allocation at this level, then you
probably should define other things, like IO, at this level as
well. C (and C++) doesn't, leaving that up to the system (Posix
or Windows I/O, etc.)
--
James Kanze
[ ... ]
> - At the time, LISP was created, FORTRAN was IIRC
> unable to do recursive function calls, so possibly
> that was implemented in LISP first, too. See
Simon, Newell and Shaw's IPL implemented recursion before McCarthy
even started working on Lisp.
As for the rest, you seem far more interested in advocating a
viewpoint than being accurate. Characterizing six out of thirteen as
"most" is little short of a blatant lie. Quoting IBM's Java web site
hardly qualifies as a disinterested source. Quoting Joel Spolsky
makes you seem overly credulous -- while he seems like a perfectly
decent guy, he doesn't seem to have any credentials to qualify him as
an authority on much of anything.
Those, however, pale to insignificance when you quote Ian Joyner. If
you read his diatribe carefully, essentially every argument he makes
works out as "C++ is different from (Eiffel|Java), and therefore
wrong" (or the variant, "Bertrand Meyer advocates something
different, therefore C++ is wrong").
Accepting and quoting such arguments as if they meant something
indicates that you're not even attempting to think objectively about
the subject matter.
--
Later,
Jerry.
As others have concluded, there's no silver bullet. However, I like
C++ much more than C for many reasons.
For one, standard C++ offers enormously more tools than standard C
does, right out of the box. In C, if you want to do anything even
slightly complicated, you either need to implement all data containers
and such from scratch, download (and learn) third-party libraries which
might or might not be simple to use, or you have to have a personal
collection of such libraries (self-made or third-party) which you always
"carry around" everywhere you might have to write a small C program.
While standard C++, quite naturally, doesn't offer ready solutions for
every possible problem, it nevertheless offers good solutions for quite
many common problems. There aren't many programs I have written in C++
where I wouldn't have found the C++ standard library quite useful and
practical (ie. the parts which are missing from C).
Also C++ makes it a lot easier to use eg. generic data containers and
algorithms than C does. For example, if I need a balanced binary search
tree of strings, almost everything I may ever need to do with it can be
done with one-liners or two-liners. Most importantly, I don't need to
worry about the memory management of those strings (nor the tree nodes)
in any way. And moreover, if I need a similar binary tree, but rather
than containing strings, containing something else (like integers or
some struct instances), the exact same library can be very easily used
for that too with an almost identical syntax. In C, such generic data
containers are by necessity complicated to use, error-prone and often
less efficient.
Automatic memory management of stack-allocated objects is also a huge
timesaver and makes it enormously easier to write safe programs which
handle dynamically allocated memory. Such programs become shorter,
cleaner, easier to understand and maintain and, most importantly, safer.
Also, if standard containers are used, other C++ programmers will
understand the code much more easily because it will contain
significantly less custom code and custom programming conventions.
C++ might be a "swiss army knife", but if all those features help me
writing shorter, cleaner and safer programs more easily, I'm all for it.
I don't consider being a "swiss army knife" a bad thing, even though the
term is often used in a derogatory manner.
Compared to the "swiss army knife" that C++ is, C is just a regular
knife. Without a handle. You might be able to do something with it, but
be careful not to cut your fingers off.
"just as fast" is given by »-« not »1/1«
I think we are in pretty complete agreement, why would anyone use C when
they can use C++?
My argument was that the improvement in productivity is closer to 40% than
the 1000% suggested by Kanze. Put starkly, I think C++ is over-hyped.
Thinking about my post, after I made it, I think even 40% may be too high.
There are lots of problems where the C++ advantage is minuscule.
Overstating the advantages of a language one will spend hundreds of hours
learning is not good, it can only lead to disappointment.
FWIW, I disagree with your editorial.
Yes, Richard Gabriel was clearly prejudiced in favour of LISP, against
UNIX and C. Yes, many of his arguments were strawmen.
That said, my takeaway from his essay was that for a programming
language (or for that matter, most anything) to succeed -- the world
expects/insists simplicity rather than correctness from it.
The irony here is that C++ is anything but simple!
LOL. I'm not sure that was the actual intent, but the post definitely
conveys that feeling, and uses neat cuts that in turn twist reality.
> Characterizing six out of thirteen as
> "most" is little short of a blatant lie.
The point here is not the count; the selection of tests is fishy in
itself, and the figures beyond the isolated benchmark time show a huge
memory footprint, which in a real app translates to wasted time. It also
uses gcc as the strawman, though gcc is known for lagging in the
optimisation field.
> Quoting Joel Spolsky
> makes you seem overly credulous -- while he seems like a perfectly
> decent guy, he doesn't seem to have any credentials to qualify him as
> an authority on much of anything.
The article referred to is a pretty good one, with a ton of insight. The
quoted portion is not in its main line of argument, but what is more
important, in the sense put forward, C++ counts in the 'managed' group
:-)). Unless, of course, one tries to sell C code as C++. Normal C++ code
does not manage memory, but uses vector, string and locals naturally, and
for the tiny portion requiring dynamic creation, the suite of smart
pointers.
And that in effect brings superior productivity. Joel is right here too.
Going to the essence of the thought, it should not pick 'memory' out as a
special item, but go for more generality: a language that allows automation
of pesky management is far more productive than one that relies on the
programmer to do everything. And C++ with RAII enabled is the absolute
king, covering more than memory.
> Those, however, pale to insignificance when you quote Ian Joyner.
He says in intro:
http://burks.brighton.ac.uk/burks/pcinfo/progdocs/cppcrit/index001.htm
"Another factor has been the publishing of Bjarne Stroustrup's "Design and
Evolution of C++" [Stroustrup 94]. This has many explanations of the
problems of extending C with object-oriented extensions while retaining
compatibility with C. In many ways, Stroustrup reinforces comments that I
made in the original critique, but I differ from Stroustrup in that I do not
view the flaws of C++ as acceptable, even if they are widely known, and many
programmers know how to avoid the traps. Programming is a complex endeavour:
complex and flawed languages do not help."
That translates to the usual "hey, I could create so much better a language
from scratch". We have hundreds of those cool languages, written for the
drawer.
His following quote of the Java white paper made me LOL, then cry. Guess
this guy thinks Java 1.0 is some good thing. Nuff said.
Just to add something positive: if it comes to critique of C++, I'd rather
choose Matthew Wilson's "Imperfect C++".
(OTOH, IMO LISP is "the ultimate language" in many senses)
[ ... ]
> That said, my takeaway from his essay was that for a programming
> language (or for that matter, most anything) to succeed -- the world
> expects/insists simplicity rather than correctness from it.
>
> The irony here is that C++ is anything but simple!
So either C++ flopped completely, or his claim of a preference for
simplicity was wrong -- and C++ doesn't seem to have completely
flopped.
--
Later,
Jerry.
[ ... ]
> These are 6 entries with »1/1« and 5 others.
>
> »1/1« seems to indicate a ratio of run-times of
> approximately 1.
My apologies -- the '13' was a typo. It should have read '11'. That
doesn't really change anything though -- 6 out of 11 still isn't
"most", or even very close to it.
--
Later,
Jerry.
[snip]
> (OTOH, IMO LISP is "the ultimate language" in many senses)
That is intriguing given that you appreciate RAII+RRID (resource
release is destruction) as a general method for controlling any
resource (not just memory); and given that LISP does not support
RAII+RRID. Any further thoughts on this one aspect of comparison?
KHD
Where are you getting such an exact number as "40%" from?
I have programmed both in C and C++, and I consider myself quite
proficient in both (but more in the latter, naturally), and I have to
agree with the (rough) estimate of 1000% improved productivity. As I
already mentioned in my previous post, C++ offers quite many shortcuts
and ready solutions to many common tasks, out of the box, which C just
doesn't.
It's extremely common for me to have to write some function or module
to do some task requiring dynamic memory and efficient data containers,
and in most cases the C++ standard library offers me a perfect solution,
cutting the amount of work I need to do by something like that factor of ten.
Comparing proficient C++ development to proficient C development, of
small-to-medium sized projects, especially with projects which have to
start from scratch (ie. without any programmer's personal battery of
helper libraries), I wouldn't say the 1000% increase in productivity is
all that far-fetched.
> That said, my takeaway from his essay was that for a programming
> language (or for that matter, most anything) to succeed -- the world
> expects/insists simplicity rather than correctness from it.
>
> The irony here is that C++ is anything but simple!
I don't believe that simplicity is an important factor. If that were the
case, then languages such as Perl would never have caught on.
Rui Maciel
I think if one considers "worse is better" more carefully
you find that the essence is pontification vs production.
In other words, those who sit about pontificating the ivory
perfection of the "ideal" solution lose to those who
produce a workable solution in the meantime.
KHD
It probably depends a lot on what quality level you aim at. If it is
high (i.e. no tolerance for problems), the gap opens wide. TMK James works
in such an environment. Me too. I'd also vouch for the 1000% estimate,
though I must add that I'm yet to see the desired quality reached with C,
which leaves little to compare...
> I think if one considers "worse is better" more carefully
> you find that the essence is pontification vs production.
> In other words, those who sit about pontificating the ivory
> perfection of the "ideal" solution lose to those who
> produce a workable solution in the meantime.
It's still a poor way to put it. Why not stick with Voltaire's "The
perfect is the enemy of the good"?
It also depends on the application domain. If you're writing
small numeric applications, where the only real types you're
dealing with are double and fixed sized arrays of double (and
int for indexing, of course), then the gap is considerably
smaller than if you're having to deal with more complex
abstractions (and in older C, even complex numbers would qualify
as a "more complex abstraction"), dynamic allocations and large
teams.
--
James Kanze
Stated that way, your statement is as tendentious as the original
paper:-). The major problem with the original paper is that it
is arguing a foregone conclusion.
There is a hint of a valid point in the original paper: if I had
to categorize it, it would be something like "works today vs.
correct tomorrow" (not that the Lisp community has any monopoly
on "correct tomorrow", and the definitions the paper uses for
"better" don't always correspond to what I would consider
"correct"). If you expand the idea, however, you realize that
you can't always ignore the importance of "works today", and
that the two positions don't really have to be in opposition: a
lot of places I've seen do use the principle of a working
prototype, as soon as possible, followed up with a more rigorous
implementation later. One could argue that in doing this, you
should use a very dynamic language (e.g. something like
Smalltalk) for the prototype, and a bondage and discipline
language (Ada?) for the final version; perhaps one of the things
C++ has going for it is that it can be used both ways, so you
don't need to use two different languages.
--
James Kanze
> > Going to the essence of the thought, it should not pick 'memory' out as a
> > special item, but go for more generality: a language that allows automation
> > of pesky management is far more productive than one that relies on the
> > programmer to do everything. And C++ with RAII enabled is the absolute
> > king, covering more than memory.
>
> > (OTOH, IMO LISP is "the ultimate language" in many senses)
>
> That is intriguing given that you appreciate RAII+RRID (resource
> release is destruction) as a general method for controlling any
> resource (not just memory); and given that LISP does not support
> RAII+RRID. Any further thoughts on this one aspect of comparison?
scheme has its (dynamic-wind before proc after) which calls "proc" and
calls "before" before entry to proc and "after" on exit from proc.
Since continuations allow proc to be exited (and re-entered) this is
not a matter of simply calling one at the beginning and one at the end.
Scheme has the same early exit possibilities as C++ exceptions. So for
resources other than memory you could build something like RAII.
I'm not sure if Common Lisp (what is often meant by "Lisp") has
corresponding features.
nick keighley
br Jussi
>
> --
> James Kanze
[ ... ]
> In my understanding there is a single weakness in alternatives: In
> this discussion
> http://www.artima.com/forums/flat.jsp?forum=106&thread=268226
> Achilleas Margaritis gives an interesting insight into C++ being the only
> language combining high and low level.
I can't agree that it's the only one -- Ada (for one example)
supports about the same levels of abstraction.
If we eliminate that part, I'd agree -- or more accurately, he's
agreeing with something I said around 5 years ago:
Then again, C++ does have one advantage in this respect:
it supports enough different levels of abstraction that
it can be used right down to the metal, or in a fairly
safe, high-level manner, and perhaps most importantly,
more or less seamlessly marrying the two, so even in my
system programming, much of what I write works at a fairly
high level of abstraction.
http://groups.google.com/group/comp.lang.java/msg/36943ca538b19724
--
Later,
Jerry.
[ ... ]
> Indeed, there is a paper claiming that Lisp macros were
> an »inspiration« for C macros:
If you honestly want to know what "inspired" C's macros, take a look
at Macro-8 (an assembler for the PDP-8), or Control Data 160G GASS
(General ASsembly System). E.g. See page 5 of:
http://www.bitsavers.org/pdf/cdc/160/CDC_160G_Brochure_Feb64.pdf
C was designed as a "portable assembly language", and by the early
1970's when it was designed, everybody "knew" that any sort of
assembly language needed to have macros.
--
Later,
Jerry.
> [ ... ]
> http://www.bitsavers.org/pdf/cdc/160/CDC_160G_Brochure_Feb64.pdf
I'm not sure that macros were added because C was trying to be
like assembler---C's macros are a lot weaker than anything I've
seen in assembler---, but what is sure is that all assemblers
back then did have some support for macros, and that the people
developing C were very familiar with assembler, and macro
technology, so when the need for e.g. "inline functions" was
felt, macros would be a more or less natural response.
What I don't get is Lisp proponents trying to imply that C
macros are derived from Lisp. Even if it were true, it's
something that I'd try to hide, rather than claim credit for;
C's preprocessor is not exactly the best feature of the
language.
--
James Kanze
[ ... ]
> I'm not sure that macros were added because C was trying to be
> like assembler---C's macros are a lot weaker than anything I've
> seen in assembler---, but what is sure is that all assemblers
> back then did have some support for macros, and that the people
> developing C were very familiar with assembler, and macro
> technology, so when the need for e.g. "inline functions" was
> felt, macros would be a more or less natural response.
Right -- I didn't mean to imply that it was necessarily an attempt at
being like an assembler, just that they seem to be looking for a
relatively obscure source, when a much more obvious one is available.
> What I don't get is Lisp proponents trying to imply that C
> macros are derived from Lisp. Even if it were true, it's
> something that I'd try to hide, rather than claim credit for;
> C's preprocessor is not exactly the best feature of the
> language.
How can you ask such a foolish question? They're on a chivalrous quest!
Lisp will now be called "Dulcinea", and they will protect all from
the cruel windmills! :-)
--
Later,
Jerry.
There is no ultimate programming language. Ask your teacher to write an
operating system kernel using his favorite LISP. If there is an
ultimate programming language, it is machine code. Except for the
little problem that there isn't one kind of machine code, but thousands...
The teacher has told you his opinion. He should be the one elaborating
why he thinks that, why he is trying to disseminate his
opinion/experience as dogma, and why on a C++ course, demotivating the
students...
--
BR, WW