The Failure of OOP

christopher diggins

Apr 6, 2004, 1:27:25 PM
OOP has failed us.

I have been using OO techniques for over 10 years and the promises that
accompanied OOP of code reuse, simplicity of design and speed of
development, have never materialized. Over and over again I have seen (and
written) programs that were needlessly complicated (i.e. hard to read,
difficult to maintain) and inefficient compared to more straightforward
designs using modular and structured programming approaches.

I have come to believe I know what the problem has been for myself and
others, and it is definitely not a simple lack of knowledge or experience.
Many programmers and designers (myself included) often make two major errors:
1) failure to use multiple inheritance when it more accurately models the
problem domain
2) failure to inherit from Abstract Base Classes when appropriate

I believe the reasons these things occur are simple: we try to save code
where we can and we are afraid of making inefficient designs. Many of us are
taught to fear the multiple inheritance of abstract base class approach. I
think this is founded on the simple fact that multiple inheritance of
abstract base classes causes our objects to become bloated with vtable
pointers. I propose that the solution is to use interfaces which can be
implemented much more efficiently than abstract base classes (see
http://www.heron-language.com/heronfront.html for an example of how this is
possible).
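
To make the bloat concrete, here is a small C++ sketch (the class names are
invented purely for illustration, and the exact byte counts are
implementation-dependent):

#include <cstdio>

struct IPrintable    { virtual void print() const = 0; virtual ~IPrintable() {} };
struct ISerializable { virtual void save() const = 0;  virtual ~ISerializable() {} };
struct IComparable   { virtual int  cmp(const IComparable&) const = 0; virtual ~IComparable() {} };

struct Plain { int x; };                               // no vtable pointer at all

struct Fat : IPrintable, ISerializable, IComparable {  // typically one vtable pointer per base
    int x;
    void print() const {}
    void save() const {}
    int  cmp(const IComparable&) const { return 0; }
};

int main() {
    std::printf("sizeof(Plain) = %u\n", (unsigned)sizeof(Plain)); // e.g. 4
    std::printf("sizeof(Fat)   = %u\n", (unsigned)sizeof(Fat));   // e.g. 16 with 4-byte pointers, 32 with 8-byte pointers
    return 0;
}

On typical compilers each abstract base contributes its own vtable pointer,
so the small int payload ends up carrying three pointers' worth of dispatch
overhead; the precise numbers vary by compiler and ABI.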


-
Christopher Diggins - http://www.cdiggins.com
designer of Heron - http://www.heron-language.com

Alan Gauld

Apr 6, 2004, 5:47:13 PM
On Tue, 6 Apr 2004 13:27:25 -0400, "christopher diggins"
<cdig...@users.sourceforge.net> wrote:
> OOP has failed us.
>
> I have been using OO techniques for over 10 years and the promises that
> accompanied OOP of code reuse, simplicity of design and speed of
> development, have never materialized.

They have materialised but not in the way the hypesters promised
during the early 90's. However there are many applications being
written today that would be horribly difficult to design and code
without OOP, and not just GUIs.

> I have come to believe what the problem has been for myself and others, and
> it is definitely not a simple lack of knowledge or experience.

Nope, it's because of the essential complexity of software.
As per Fred Brooks "No Silver Bullet" paper. No magic trick of
coding can get us away from the underlying complexity of mapping
human problems onto electronic hardware.

> 1) failure to use multiple inheritance when it more accurately models the
> problem domain

That's an interesting assertion. It implies, amongst other things,
that MI is essential to good OOP, and yet many Smalltalk programs
are amongst the best examples of OOP I've seen and MI isn't even
available as an option. On the other hand I've also programmed
using Flavors which had MI all over the place - the origin
of the mixin style was in Flavors - I've seen a class that
directly inherited from 9 parent classes in a Flavors program.
And not a vtable in sight!

I'm not sure that meets your criterion for more accurately
modelling the problem domain though... But I can't say that
Flavors produced any greater or less productivity than other OOP
approaches. But it certainly encouraged reuse!

> 2) failure to inherit from Abstract Base Classes when appropriate

Again I wonder why you state this. If anything I'd suggest the
problem is more to do with a failure to create the abstract
classes in the first place! (And again Flavors strongly
encouraged this). When those classes exist I've not really seen
any reluctance to use them. But a well crafted framework of
abstract and semi-abstract (ie partially implemented or skeletal)
concrete classes is fundamental to good OO design.

> I believe the reasons these things occur are simple, we try to save code
> where we can and we are afraid of making inefficient designs.

Premature optimisation is a common problem when coding without
adequate design. If the design has taken account of where
performance is needed this shouldn't arise. But creating abstract
classes (and inheriting from them) should save code and therefore
argue against your conclusion, surely?

> think this is founded on the simple fact that multiple inheritance of
> abstract base classes causes our objects to become bloated with vtable
> pointers.

Only in certain rather crude OO language implementations.

> I propose that the solution is to use interfaces which can be
> implemented much more efficiently than abstract base classes

Depending on the language, abstract classes are synonymous with
interfaces. In others, interfaces don't exist. These appear to be
very language-specific conclusions that you are drawing. I wonder
whether your initial starting point (OOP has failed) is similarly
biased towards the popular but primitive OOP languages, such as
C++ and Java?

> Christopher Diggins - http://www.cdiggins.com
> designer of Heron - http://www.heron-language.com

And yet you seem to recognise this limitation by advocating Heron
as "yet another OOP language"? BTW it would be nice to see a more
OO example than a tree search; it is almost entirely procedural
despite using a few classes.

Alan G.
Author of the Learn to Program website
http://www.freenetpages.co.uk/hp/alan.gauld

JXStern

Apr 6, 2004, 7:14:41 PM
On Tue, 6 Apr 2004 13:27:25 -0400, "christopher diggins"
<cdig...@users.sourceforge.net> wrote:
>OOP has failed us.

What you mean 'us', white man?

>I have been using OO techniques for over 10 years and the promises that
>accompanied OOP of code reuse, simplicity of design and speed of
>development, have never materialized. Over and over again I have seen (and
>written) programs that were needlessly complicated (i.e. hard to read,
>difficult to maintain) and inefficient compared to more straightforward
>designs using modular and strctured programming approaches.

Well then, shame on you, or congrats on your awakening, if.

>I have come to believe what the problem has been for myself and others, and
>it is definitely not a simple lack of knowledge or experience. Many
>programmers and designers (myself included) often make two major errors :
>1) failure to use multiple inheritance when it more accurately models the
>problem domain
>2) failure to inherit from Abstract Base Classes when appropriate

Your solution then is more of the same?

Good luck.

>I believe the reasons these things occur are simple, we try to save code
>where we can and we are afraid of making inefficient designs. Many of us are
>taught to fear the multiple inheritance of abstract base class approach. I
>think this is founded on the simple fact that multiple inheritance of
>abstract base classes causes our objects to become bloated with vtable
>pointers. I propose that the solution is to use interfaces which can be
>implemented much more efficiently than abstract base classes (see
>http://www.heron-language.com/heronfront.html for an example of how this is
>possible).

Blame it on the vtables, sure.

See ya in another ten years.

J.

thoff

Apr 6, 2004, 7:55:22 PM

christopher diggins wrote:
> OOP has failed us.

So has god.
So has your mother.
So has government.
So has the medical community.

Maybe it is time to quit asking to be saved.

> I have been using OO techniques for over 10 years and the promises that
> accompanied OOP of code reuse, simplicity of design and speed of
> development, have never materialized. Over and over again I have seen (and
> written) programs that were needlessly complicated (i.e. hard to read,
> difficult to maintain) and inefficient compared to more straightforward
> designs using modular and strctured programming approaches.

I didn't read in any OO books where it said to write needlessly
complicated, hard to maintain, inefficient programs. Why
did you do that?


> I believe the reasons these things occur are simple, we try to save code
> where we can and we are afraid of making inefficient designs. Many of us are
> taught to fear the multiple inheritance of abstract base class approach. I
> think this is founded on the simple fact that multiple inheritance of
> abstract base classes causes our objects to become bloated with vtable
> pointers.

I have never seen a vtable pointer. What are they polluting?


>I propose that the solution is to use interfaces which can be
> implemented much more efficiently than abstract base classes (see
> http://www.heron-language.com/heronfront.html for an example of how this is
> possible).

With fewer vtable pointers you suddenly figured out how to write
better code? Don't think so.

Isaac Gouy

Apr 6, 2004, 8:45:35 PM
"christopher diggins" <cdig...@users.sourceforge.net> wrote in message news:<dKBcc.98617$_95.19...@wagner.videotron.net>...

> OOP has failed us.

No, we failed OO.

Dave Benjamin

Apr 6, 2004, 10:19:33 PM

No, we failed to implement the interface that OO is bound to.

--
.:[ dave benjamin: ramen/[sp00] -:- spoomusic.com -:- ramenfest.com ]:.
: please talk to your son or daughter about parametric polymorphism. :

Universe

Apr 6, 2004, 10:23:09 PM

"Isaac Gouy" <ig...@yahoo.com> wrote in message news:ce7ef1c8.04040...@posting.google.com...

> "christopher diggins" <cdig...@users.sourceforge.net> wrote in message news:<dKBcc.98617$_95.19...@wagner.videotron.net>...
>
> > OOP has failed us.
>
> No, we failed OO.

*SOME* types have failed OO: the anti-scientific, anti-big-picture-systems-approach,
code-centric rather than concept-centric types who neglect actual domain and use
case modelling leading high level design (and who may even advise "don't code").

Let's be clear on that.

Those practicing non-dogmatic, iterative and incremental, model driven,
*TAILORED* RUP are right on the mark and doing fine! At least IMO.

Elliott
--
It is somewhat scandalous that many well known, highly considered sw
engineering notables latch on to new, and in many cases old, but newly
reshaped techniques in an isolated manner. These leading lights latch
on to these techniques failing to take into account a big picture
system perspective on the techniques. In addition they fail to
validate the technique from a scientific viewpoint.

Universe

Apr 6, 2004, 10:36:37 PM
"Alan Gauld" <alan....@btinternet.com> wrote in message

> On Tue, 6 Apr 2004 13:27:25 -0400, "christopher diggins"

> > OOP has failed us.

> > I have been using OO techniques for over 10 years and the promises that
> > accompanied OOP of code reuse, simplicity of design and speed of
> > development, have never materialized.

> They have materialised but not in the way the hypesters promised
> during the early 90's.

I've been accused of "hyping OO", but what have I been falsely
hyping that anyone can *concretely* and *actually* point out?

Stroustrup, RMartin among others have repeatedly foamed about
"OO hype". But when it gets down to it what they REALLY oppose
is the OO MODEL DRIVEN APPROACH.

What can I say if they don't get it? If they forsake the scientific method
and the big picture, holistic, systems approach for the intellectually
bankrupt, nickel and dime, baby step, the technical and parts are
key over conceptualization and abstraction viewpoint and praxis?

What can I say if they lack the spiritual center and sense of REAL
LIFE GIVING sense of AESTHETICS evident in doing it right by science
and whole systems method?

[Contrary to RMartin, I like much of what BS says; still, overall, summing
it all up, he's absent and missing a code modelling, conceptualization oriented
and generated "vitalism", which is apparent in what Christopher Alexander
and Gabriel had to say at the web site Gouy referred us to yesterday or
the day before.]

Elliott
--
This behavior of overlooking or deprecating the system approach and
scientific method in some cases, is at best, pathetically, poor
academics, and sloppy engineering. And in a number of other cases
where dollars matters most, it's dishonest opportunism and an
abrogation of intellectual integrity.

Universe

Apr 6, 2004, 10:40:29 PM

"Dave Benjamin" <ra...@lackingtalent.com> wrote in message news:slrnc76ptu...@lackingtalent.com...

NOW THAT... THAT IS WORTH REPEATING & "SIG ELEVATION"!!!

Elliott
--
No, *some* failed to implement the interface that OO is bound to.
~ Dave Benjamin

christopher diggins

Apr 7, 2004, 1:59:01 AM
"Alan Gauld" <alan....@btinternet.com> wrote in message
news:co767058g76c23nc9...@4ax.com...

> On Tue, 6 Apr 2004 13:27:25 -0400, "christopher diggins"
> <cdig...@users.sourceforge.net> wrote:
> > OOP has failed us.
>
> > I have been using OO techniques for over 10 years and the promises that
> > accompanied OOP of code reuse, simplicity of design and speed of
> > development, have never materialized.
>
> They have materialised but not in the way the hypesters promised
> during the early 90's. However there are many applications being
> written today that would be horribly difficult to design and code
> without OOP, and not just GUIs.

In retrospect I overstated the case.

> > I have come to believe what the problem has been for myself and others, and
> > it is definitely not a simple lack of knowledge or experience.
>
> Nope, it's because of the essential complexity of software.
> As per Fred Brooks "No Silver Bullet" paper. No magic trick of
> coding can get us away from the underlying complexity of mapping
> human problems onto electronic hardware.

I agree that there is no silver bullet, but there is definitely progress
towards making software more manageable, and less complex. That is the whole
point of the various software design methodologies. So I disagree with the
implication that there exists an insurmountable inherent complexity in the
problem of mapping human problems onto electronic hardware. Abstraction in
general continues to be an increasingly effective tool.

> > 1) failure to use multiple inheritance when it more accurately models the
> > problem domain
>
> Thats an interesting assertion. It implies, amongst other things,
> that MI is essential to good OOP, and yet many Smalltalk programs
> are amongst the best examples of OOP I've seen and MI isn't even
> available as an option.

When I said "when it more accurately models", I did not mean "because it
more accurately models". My point is that single inheritance is overly
restrictive in many design scenarios and sometimes just plain wrong.

> On the other hand I've also programmed
> using Flavors which had MI all over the place - the origin
> of the mixin style was in Flavors - I've seen a class that
> directly inherited from 9 parent classes in a Flavors program.
> And not a vtable in sight!

That was good for me to learn, thanks for bringing that to my attention.

> I'm not sure that meets your criterion for more accurately
> modelling the problem domain though... But I can't say that
> Flavors produced any greater or less productivity than other OOP
> approaches. But it certainly encouraged reuse!
>
> > 2) failure to inherit from Abstract Base Classes when appropriate
>
> Again I wonder why you state this. If anything I'd suggest the
> problem is more to do with a failure to create the abstract
> classes in the first place! (And again Flavoors strongly
> encouraged this). When those classes exist I've not really seen
> any reluctance to use them. But a well crafted framework of
> abstract and semi abstract (ie partially implemented or skeletal
> concrete classes) is fundamental to good OO design.

I think this was just poor semantics on my part; I agree with you.

> > I believe the reasons these things occur are simple, we try to save code
> > where we can and we are afraid of making inefficient designs.
>
> Premature optimisation is a common problem when coding without
> adequate design. If the design has taken account of where
> performance is needed this shouldn't arise. But creating abstract
> classes(and inheriting from them should save code and therefore
> argue against your conclusion surely?

I think then perhaps I should restate my conclusion, which I shall likely do
in another post.

> > think this is founded on the simple fact that multiple inheritance of
> > abstract base classes causes our objects to become bloated with vtable
> > pointers.
>
> Only in certain rather crude OO language implementations.

I would say rather: many OO language implementations. The subjective
attribution of crudity is suspect.

> > I propose that the solution is to use interfaces which can be
> > implemented much more efficiently than abstract base classes
>
> Depending on the language abstract classes are synonymous with
> interfaces. In others intefaces don't exist. These appear to be
> very language specific conclusions that you are drawing. I wonder
> whether your initial starting point (OOP has failed) is similarly
> biased towards the popular but primitive OOP lamguages, such as
> C++ and Java?

An abstract base class is not an interface; it just looks like one. The
advantage of an interface is that binding is done top down rather than
bottom up. There is therefore no need for vtable lookups within function
calls. My discussion is of course somewhat language-centric, but I am not an
expert in all languages which purport to be object oriented.
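
To illustrate what I mean by top-down binding, here is a rough C++ sketch of
the general idea (my own illustration only, with invented names; it is not
the code HeronFront actually generates):

#include <cstdio>

struct Point { int x, y; };                      // plain data: no vtable pointer

struct IPrintVTable {                            // the interface's dispatch table
    void (*print)(const void* self);
};

struct IPrintRef {                               // "fat" interface reference: object + table
    const void*         self;
    const IPrintVTable* vt;
    void print() const { vt->print(self); }
};

static void print_point(const void* self) {
    const Point* p = static_cast<const Point*>(self);
    std::printf("(%d, %d)\n", p->x, p->y);
}

static const IPrintVTable point_print_vt = { &print_point };

int main() {
    Point p = { 3, 4 };                          // sizeof(Point) stays 2 * sizeof(int)
    IPrintRef r = { &p, &point_print_vt };       // the binding is made here, from the top down
    r.print();
    return 0;
}

The object stays a plain struct no matter how many interfaces it implements;
the table travels with the interface reference rather than with every
instance.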

> > Christopher Diggins - http://www.cdiggins.com
> > designer of Heron - http://www.heron-language.com
>
> And yet you seem to recognise this limitation by advocating Heron
> as "yet another OOP langage"? BTW It would be nice to see a more
> OO example than a tree search, it is almost entirely procedural
> despite using a few classes.

What I do with regard to the marketing of Heron has no bearing on this
discussion.

> Alan G.
> Author of the Learn to Program website
> http://www.freenetpages.co.uk/hp/alan.gauld

I appreciated your informed and educational comments.

--
Christopher Diggins
http://www.cdiggins.com
http://www.heron-language.com


christopher diggins

Apr 7, 2004, 2:02:05 AM
"Universe" <univ...@covad.net> wrote in message
news:9c858$40736936$97cff003$43...@msgid.meganewsservers.com...

"Alan Gauld" <alan....@btinternet.com> wrote in message

> > On Tue, 6 Apr 2004 13:27:25 -0400, "christopher diggins"
> > > OOP has failed us.

> > > I have been using OO techniques for over 10 years and the promises that
> > > accompanied OOP of code reuse, simplicity of design and speed of
> > > development, have never materialized.

> > They have materialised but not in the way the hypesters promised
> > during the early 90's.

> I've been accused of "hyping OO", but what have I have been falsely
> hyping that anyone can *concretely* and *actually* point out?

[*big* snip]

Please don't usurp this thread for your own ego gratification in the guise
of a defense of some imagined attack on your character. Thank you.


christopher diggins

Apr 7, 2004, 2:12:13 AM
"thoff" <t...@possibility.com> wrote in message
news:1076gq9...@news.supernews.com...

>
> christopher diggins wrote:
> > OOP has failed us.
>
> So has god.
> So has your mother.
> So has government.
> So has the medical community.
>
> Maybe it is time to quit asking to be saved.

Cute.

> > I have been using OO techniques for over 10 years and the promises that
> > accompanied OOP of code reuse, simplicity of design and speed of
> > development, have never materialized. Over and over again I have seen (and
> > written) programs that were needlessly complicated (i.e. hard to read,
> > difficult to maintain) and inefficient compared to more straightforward
> > designs using modular and strctured programming approaches.
>
> I didn't read in any OO books where it said to write needlessly
> complicated, hard to maintain, inefficient programs. Why
> did you do that?

The question I am more interested in is why do other people? There is a
theory that surfaces commonly on usenet that any misuse of favourite
technology X is due simply to hordes of brain dead zombies, who can't tie
their shoes yet somehow make a living for a decade writing software. I just
can't buy into that theory. If a technology is hard to master or control I
think it is the technology that has failed, and not zombies.

> > I believe the reasons these things occur are simple, we try to save code
> > where we can and we are afraid of making inefficient designs. Many of us are
> > taught to fear the multiple inheritance of abstract base class approach. I
> > think this is founded on the simple fact that multiple inheritance of
> > abstract base classes causes our objects to become bloated with vtable
> > pointers.
>
> I have never seen a vtable pointer. What are the polluting?

Have you ever looked at the size of your objects in bytes with large
inheritance trees? I'd wager in language X of your choice it is likely using
more space than you need. I know this for a fact in C++, and I demonstrate
it. I am open to hearing which languages handle vtables better (along with a
bit of demonstration with small pseudo-code bits and byte sizes).

> >I propose that the solution is to use interfaces which can be
> > implemented much more efficiently than abstract base classes (see
> > http://www.heron-language.com/heronfront.html for an example of how this is
> > possible).
>
> With less vtable pointers you suddenly figured out how to write
> better code? Don't think so.

Using objects with many bases and preferably many interfaces I believe makes
for more robust and reusable code with less effort.

christopher diggins

Apr 7, 2004, 2:18:57 AM
"Isaac Gouy" <ig...@yahoo.com> wrote in message
news:ce7ef1c8.04040...@posting.google.com...

That sounds good off the cuff, but then why have we failed OO? Is it because
it has flaws and weaknesses to begin with? A software development
methodology has a responsibility to the users (the software developers), and
not the other way around. The mere idea that I can fail a design technique
smacks of the bad old days, when software end-users were viewed as idiots
for not learning how to "properly" use the crappy software that was
presented to them.

Universe

Apr 7, 2004, 3:01:45 AM

"christopher diggins" <cdig...@videotron.ca> wrote in message news:dXMcc.108821$2K3.2...@weber.videotron.net...

> "thoff" <t...@possibility.com> wrote in message
> news:1076gq9...@news.supernews.com...
> >
> > christopher diggins wrote:
> > > OOP has failed us.
> >
> > So has god.
> > So has your mother.
> > So has government.
> > So has the medical community.
> >
> > Maybe it is time to quit asking to be saved.
>
> Cute.
>
> > > I have been using OO techniques for over 10 years and the promises that
> > > accompanied OOP of code reuse, simplicity of design and speed of
> > > development, have never materialized. Over and over again I have seen (and
> > > written) programs that were needlessly complicated (i.e. hard to read,
> > > difficult to maintain) and inefficient compared to more straightforward
> > > designs using modular and strctured programming approaches.


When did individual experience become the same as, qualify as *general experience*?

It seems to me that firms seeking substantial benefits and advantages in the SW
engineering domain have overwhelmingly adopted and *stayed* with the OO
modelling/conceptualization paradigm, and the OO technology behind that
paradigm, as generally the *leading* one for the bulk of their
development efforts.

Universe

Apr 7, 2004, 3:21:18 AM

"christopher diggins" <cdig...@videotron.ca> wrote in message news:KNMcc.108318$2K3.2...@weber.videotron.net...

> "Universe" <univ...@covad.net> wrote in message
> news:9c858$40736936$97cff003$43...@msgid.meganewsservers.com...
> "Alan Gauld" <alan....@btinternet.com> wrote in message
>
> > > On Tue, 6 Apr 2004 13:27:25 -0400, "christopher diggins"
> > > > OOP has failed us.
>
> > > > I have been using OO techniques for over 10 years and the promises that
> > > > accompanied OOP of code reuse, simplicity of design and speed of
> > > > development, have never materialized.
>
> > > They have materialised but not in the way the hypesters promised
> > > during the early 90's.
>

I've been accused of "hyping OO", but what have I been falsely
hyping that anyone can *concretely* and *actually* point out?

Stroustrup, RMartin among others have repeatedly foamed about
"OO hype". But when it gets down to it what they REALLY oppose
is the OO MODEL DRIVEN APPROACH.

What can I say if they don't get it? If they forsake the scientific method
and the big picture, holistic, systems approach for the intellectually
bankrupt, nickel and dime, baby step, the technical and parts are
key over conceptualization and abstraction viewpoint and praxis?

What can I say if they lack the spiritual center and sense of REAL
LIFE GIVING sense of AESTHETICS evident in doing it right by science
and whole systems method?

> Please don't usurp this thread for your own ego gratification in the guise
> of a defense of some imagined attack on your character. Thank you.

You are a heckuva subjectivist. Attempting to make your lousy personal
experience the general case and here ignoring the statements of
substance I made that very much do speak to the meat of this matter. Not the
only meat, but a major element.

If you were really interested in making a truly scientific evaluation
of OO in terms of the sw engineering industry, you:
1) would not generalize your *alleged* [who knows with your subjective take
if you actually made a genuinely objective summary of even your
personal OO experience] poor individual OO experience onto
what most see as mostly positive with respect to OO and software
development
2) would have dealt with the substance of my remarks rather than use them
as yet another opportunity to demonstrate intellectual deficiency. That combined
with immature, non-productive behavior.

Your lack of response to issues of substance about how OO is faring in
sw engineering speaks volumes about why you *must* declaim about OO.

Now playing, in the background, Brahms' Lullaby, "Rock-a-Bye Baby"

Elliott

Universe

Apr 7, 2004, 3:42:30 AM

"Universe" <univ...@covad.net> wrote in message news:9c858$40736936$97cff003

> > Others wrote:
> > >
> > > OOP has failed us.

Failed *you*.

> > > I have been using OO techniques for over 10 years and the promises that
> > > accompanied OOP of code reuse, simplicity of design and speed of
> > > development, have never materialized.

While boatloads of others have materialized same.

It is uneven with ups and downs within and across groups, but in the
main, the verdict is positive.

> > They have materialised but not in the way the hypesters promised
> > during the early 90's.

What was "hype"?

What has materialized positively?

> I've been accused of "hyping OO", but what have I have been falsely
> hyping that anyone can *concretely* and *actually* point out?
>
> Struvstrup, RMartin among others have repeatedly foamed about
> "OO hype". But when it gets down to it what they REALLY oppose
> is the OO MODEL DRIVEN APPROACH.

I'm making a point here that the largest part of what some alleged to
be OO hype was how genuine domain and use case modelling would
be the pivot around which we would primarily leverage the advantages
of OO.

If others have additional or different views of OO "hype", please,
please lay them out *concretely*.



> What can I say if they don't get it? If they forsake the scientific method
> and the big picture, holistic, systems approach for the intellectually
> bankrupt, nickel and dime, baby step, the technical and parts are
> key over conceptualization and abstraction viewpoint and praxis?
>
> What can I say if they lack the spiritual center and sense of REAL
> LIFE GIVING sense of AESTHETICS evident in doing it right by science
> and whole systems method?

And I really do think the failure to approach OO in THIS manner is
precisely why so many ran into problems, into a morass.

> Contrary to RMartin, I like much of what BS, says still overall, summing
> it all up he's absent and missing a code modelling, conceptualization oriented
> and generated "vitalism". What is apparent in what Christopher Alexander
> and Gabriel had to say at the web site Gouy referred us to yesterday or
> the say before.]

This is all a part of having the right attitude and perspective on/about
the OO conceptual paradigm and how that should have informed
and for the most part guided people's attempts at OO software engineering
development.

In the framework of actually getting at and enhancing what the truth was and
is on these issues, I sincerely encourage folk to support, modify, and of course
even oppose the judgements I have put forth herein. And certainly to
continue to express and discuss the matter from angles I have not dealt with
at all.

Dmitry A. Kazakov

Apr 7, 2004, 5:34:33 AM
christopher diggins wrote:

> OOP has failed us.

Not OOP, but an OOPL!

Interface implementation is a form of inheritance. The problem with vtable
pointers is not inheritance but a wrong concept used in C++ and many other
OO languages. This concept promotes objects having many types.

There is a simple, elegant and extremely efficient alternative solution. It
separates the types of specific objects from those of class-wide objects.
This approach allows keeping the type tag separate from a [specific]
object. Thus multiple inheritance does not lead to any penalty. Also it
makes it possible to implement multiple dispatch. Moreover, it allows making
classes out of elementary types as well, and note, without any
performance/space penalty.

Ada 95 uses this approach [though it has neither MI nor MD for other
reasons.]
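
In C++-like terms the idea can be sketched roughly as follows (an invented
illustration only, not Ada semantics or any real compiler's layout); note
how even an elementary type such as int can join a class without growing:

#include <cstdio>

struct Tag {                                   // per-type dispatch info, kept outside the object
    void (*show)(const void* obj);
};

struct ClassWide {                             // class-wide view: object paired with its tag
    const void* obj;
    const Tag*  tag;
};

static void show_int(const void* obj)   { std::printf("%d\n", *static_cast<const int*>(obj)); }
static void show_float(const void* obj) { std::printf("%f\n", *static_cast<const float*>(obj)); }

static const Tag int_tag   = { &show_int };
static const Tag float_tag = { &show_float };

static void show(const ClassWide& cw) { cw.tag->show(cw.obj); }  // dispatch through the external tag

int main() {
    int   i = 42;                              // sizeof(int) is unchanged: no embedded tag
    float f = 3.14f;
    ClassWide ci = { &i, &int_tag };
    ClassWide cf = { &f, &float_tag };
    show(ci);
    show(cf);
    return 0;
}

Multiple dispatch can be sketched the same way by consulting two tags at the
call site; nothing in the objects themselves has to change.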

--
Regards,
Dmitry A. Kazakov
www.dmitry-kazakov.de

Cristiano Sadun

Apr 7, 2004, 5:48:20 AM
"christopher diggins" <cdig...@users.sourceforge.net> wrote in
news:dKBcc.98617$_95.19...@wagner.videotron.net:

> OOP has failed us.
>
>
> I have been using OO techniques for over 10 years and the promises
> that accompanied OOP of code reuse, simplicity of design and speed of
> development, have never materialized. Over and over again I have seen
> (and written) programs that were needlessly complicated (i.e. hard to
> read, difficult to maintain) and inefficient compared to more
> straightforward designs using modular and strctured programming
> approaches.

Is that right? I do recall applications done 10 years ago or so. On
average, they weren't even near the complexity of things we take for
granted today, and when they were, their cost, effort and failure rates
were quite a bit larger than today's.

In 10 years we've progressed to do much more complex systems. And the
market has grown expectations on their (low) price which were simply
unthinkable 10 years ago.

> I have come to believe what the problem has been for myself and
> others, and it is definitely not a simple lack of knowledge or
> experience.

That's for sure. Complex realizations are difficult, whatever the
underlying engineering. Things can go wrong, and occasionally do.

> Many programmers and designers (myself included) often
> make two major errors

> [...]


> I propose that the solution is to
> use interfaces which can be implemented much more efficiently than
> abstract base classes (see
> http://www.heron-language.com/heronfront.html for an example of how
> this is possible).

Well, you're just proposing another silver bullet, aren't you? Changing
programming language isn't going to do you any better than any other
miracle cure for software engineering. It's not _only_ a matter of
technology or paradigm or approach - you'll still have to find heads that
apply it smartly. That, I fear, is the real matter.

Doc O'Leary

Apr 7, 2004, 6:37:02 AM
In article <x1Ncc.109159$2K3.2...@weber.videotron.net>,
"christopher diggins" <cdig...@videotron.ca> wrote:

> That sounds good off the cuff, but then why have we failed OO? Is it because
> it has flaws and weaknesses to begin with? A software development
> methodology has a responsability to the users (the software developers), and
> not the other way around. The mere idea that I can fail a design technique
> smacks of the bad old days, when software end-users were viewed as idiots
> for not learning how to "properly" use the crappy software that was
> presented to them.

Maybe, just maybe, you are an idiot. You're like those parents that
blame the school because Little Timmy isn't good at math. The world is
full of idiots, and *lots* of them got into computer science for the
money during the boom in the late 90s; far too many of whom are still
around dragging things down.

As evidence that you, personally, are in that group I point to the fact
that your little Heron project assumes that C++ is a salvageable OO
technology. It is not. Even Java introduced nothing new to the OO
landscape. You seem to be one of those people who is exposed to little
more than buzzwords, yet thinks they have a revolution at hand when it
turns out the marketing was better than the technology.

Many people have experienced the advantages of OO development. I see so
much reuse that it is hard for me to open source certain projects my
company wants open while keeping certain other software closed. About
the only thing that interests me in new development is the use of
prototypes to create classless OO programs (or, more specifically, a
runtime that allows dynamic object construction). And what I want is
already being addressed in languages like Self and Io. What you want is
apparently to bitch about C++ being a crap OO language with the
"solution" being something built on that same crap; you need to broaden
your horizons instead of trying to push your pet project.

Universe

Apr 7, 2004, 7:58:20 AM
"Universe" <univ...@covad.net> wrote in message news:6e1a3$4073b0e9$97cff003

> ...


> I'm making a point here that the largest part of what some alleged to
> be OO hype was how genuine domain and use case modelling would
> be the pivot around which we would primarily leverage the advantages
> of OO.

> ...

> ...


> I'm making a point here that the largest part of what some alleged to
> be OO hype was how genuine domain and use case modelling would
> be the

** fulcrum upon **


> which we would primarily leverage the advantages of OO.

> ...

[Hadda' redo that 'un !:- ]

Elliott


thoff

Apr 7, 2004, 9:53:42 AM

christopher diggins wrote:
> Have you ever looked at the size of your objects in bytes with large
> inheritance trees?

I've worked on embedded systems using C++ with very small amounts
of memory and have never worried about vtables. I certainly don't
on a workstation. Most C++ programmers don't even know
vtables exist so i don't think it impacts their design.

The reason is that the code savings from using base classes are ginormous.
A derived class only needs to add the code to add additional
functionality. This makes for much smaller code than the same
code usually written in C, as I have seen on several projects.


> Using objects with many bases and preferably many interfaces I believe makes
> for more robust and reusable code with less effort.

Maybe. Hopefully. But your message got lost in a lot of other "stuff."

JXStern

Apr 7, 2004, 11:41:03 AM
On Wed, 7 Apr 2004 09:48:20 +0000 (UTC), Cristiano Sadun
<cristianoTA...@OUThotmail.com> wrote:
>In 10 years we've progressed to do much more complex systems. And the
>market has grown expectations on their (low) price which were simply
>unthinkable 10 years ago.

Quite.

And the market has also grown expectations on their low quality which
were simply unthinkable 10 years ago.

:)

J.


christopher diggins

Apr 7, 2004, 11:45:04 AM

"thoff" <t...@possibility.com> wrote in message
news:10781u7...@news.supernews.com...

>
> christopher diggins wrote:
> > Have you ever looked at the size of your objects in bytes with large
> > inheritance trees?
>
> I've worked on embedded systems using C++ with very small amounts
> of memory and have never worried about vtables. I certainly don't
> on a workstation.

This unfortunately doesn't prove or even demonstrate anything, except that
you chose to not consider vtables in one of your designs. Whether that was a
good thing or not is undetermined.

> Most C++ programmers don't even know
> vtables exist so i don't think it impacts their design.

Your conclusion is false. Not understanding a fact does not mean that it
does not affect you.

> The reason is the code savings from using base classes is ginormous.

This is not a statement that can be supported. Given program P you can not
demonstrate that simply using base classes saves code over other arbitrary
technique T.

> A derived class only needs to add the code to add additional
> functionality. This makes for much smaller code than the same
> code usually written in C, as i have seen on several projects.

This statement is paradoxical. To paraphrase: Code X, which is the same as
Code Y, is smaller. Either they are the same or one is smaller than the
other.

> > Using objects with many bases and preferably many interfaces I believe makes
> > for more robust and reusable code with less effort.
>
> Maybe. Hopefully. But your message got lost in a lot of other "stuff."

I apologize that my message was obfuscated.

Shayne Wissler

Apr 7, 2004, 11:49:37 AM

"christopher diggins" <cdig...@videotron.ca> wrote in message
news:x1Ncc.109159$2K3.2...@weber.videotron.net...

> "Isaac Gouy" <ig...@yahoo.com> wrote in message
> news:ce7ef1c8.04040...@posting.google.com...
> > "christopher diggins" <cdig...@users.sourceforge.net> wrote in message
> news:<dKBcc.98617$_95.19...@wagner.videotron.net>...
> >
> > > OOP has failed us.
> >
> > No, we failed OO.
>
> That sounds good off the cuff, but then why have we failed OO? Is it because
> it has flaws and weaknesses to begin with? A software development
> methodology has a responsability to the users (the software developers), and
> not the other way around.

"That sounds good off the cuff", but it's a false dichotomy. It isn't either
way around: a methodology has a responsibility to the facts, and so do the
users. And assuming a methodology is based in truth, then users have a
responsibility to use it correctly (but this is just a corollary to their
responsibility to facts in general).


Shayne Wissler


Shayne Wissler

Apr 7, 2004, 11:57:02 AM

"Doc O'Leary" <drolear...@subsume.com> wrote in message
news:droleary.usenet-C4...@news.bpsi.net...

> As evidence that you, personally, are in that group I point to the fact
> that your little Heron project assumes that C++ is a salvageable OO
> technology.

In what way? Because Heron uses C++?

> What you want is
> apparently to bitch about C++ being a crap OO language with the
> "solution" being something built on that same crap; you need to broaden
> your horizons instead of trying to push your pet project.

I haven't looked at Heron in detail, but when I did skim over it I didn't
notice that it was built on C++ in the same manner that C++ is built on C
(i.e., it extends C). If you are complaining about the fact that it *uses*
C++ under the hood, then your criticisms here are totally unfounded.


Shayne Wissler


christopher diggins

Apr 7, 2004, 12:01:51 PM
"Doc O'Leary" <drolear...@subsume.com> wrote in message
news:droleary.usenet-C4...@news.bpsi.net...
> In article <x1Ncc.109159$2K3.2...@weber.videotron.net>,
> "christopher diggins" <cdig...@videotron.ca> wrote:
>
> > That sounds good off the cuff, but then why have we failed OO? Is it because
> > it has flaws and weaknesses to begin with? A software development
> > methodology has a responsability to the users (the software developers), and
> > not the other way around. The mere idea that I can fail a design technique
> > smacks of the bad old days, when software end-users were viewed as idiots
> > for not learning how to "properly" use the crappy software that was
> > presented to them.
>
> Maybe, just maybe, you are an idiot. You're like those parents that
> blame the school because Little Timmy isn't good at math. The world is
> full of idiots, and *lots* of them got into computer science for the
> money during the boom in the late 90s; far too many of which are still
> around dragging things down.

Your definition of an idiot is obviously quite broad, and subjective.
Nonetheless, your insult doesn't contribute much to the discussion at hand.

> As evidence that you, personally, are in that group I point to the fact
> that your little Heron project assumes that C++ is a salvageable OO
> technology. It is not.

Are you implying that I am an idiot because I am working on designing a
programming language? The phrase "salvageable OO technology" doesn't carry
any meaning for me. Perhaps you mean to imply that C++ has no more room for
improvement? I would disagree.

> Even Java introduced nothing new to the OO
> landscape. You seem to be one of those people who is exposed to little
> more than buzzwords, yet thinks they have a revolution at hand when it
> turns out the marketing was better than the technology.

I disagree with your characterization of myself.

> Many people have experienced the advantages of OO development.

I agree that there are many advantages of OO development compared to other
development models. I didn't think I implied otherwise in my original post. I am
trying to make the assertion that OO technology hasn't lived up to its
potential and still has room for development and evolution.

> I see so
> much reuse that it is hard for me to open source certain projects my
> company wants open while keeping certain other software closed.

Good for you.

> About
> the only thing that interests me in new development is the use of
> prototypes to create classless OO programs (or, more specifically, a
> runtime that allows dynamic object construction). And what I want is
> already being addressed in languages like Self and Io. What you want is
> apparently to bitch about C++ being a crap OO language with the
> "solution" being something built on that same crap; you need to broaden
> your horizons instead of trying to push your pet project.

I don't think C++ is a crap language. Also I think I am broadening my
horizons, and my "pet project" is part of my attempt to do just that.

Shayne Wissler

Apr 7, 2004, 12:04:25 PM

"christopher diggins" <cdig...@videotron.ca> wrote in message
news:hkVcc.146472$2K3.2...@weber.videotron.net...

>
> "thoff" <t...@possibility.com> wrote in message
> news:10781u7...@news.supernews.com...
> >
> > christopher diggins wrote:
> > > Have you ever looked at the size of your objects in bytes with large
> > > inheritance trees?
> >
> > I've worked on embedded systems using C++ with very small amounts
> > of memory and have never worried about vtables. I certainly don't
> > on a workstation.
>
> This unfortunately doesn't prove or even demonstrate anything, except that
> you chose to not consider vtables in one of your designs. Whether that was a
> good thing or not is undetermined.

Well, since you haven't proved or demonstrated anything with regard to
vtables (yet), I think that makes you even.

> > Most C++ programmers don't even know
> > vtables exist so i don't think it impacts their design.
>
> Your conclusion is false. Not understanding a fact does not mean that it
> does not affect you.

Perhaps you would care to expand on why you think this about vtables. Most
developers, me included, think that vtables are an implementation detail of
C++. Indeed, I think that if anyone is going to complain about C++, this
should be one of the last things to complain about (behind the inheritance
model).


Shayne Wissler


christopher diggins

Apr 7, 2004, 12:06:46 PM

"Dmitry A. Kazakov" <mai...@dmitry-kazakov.de> wrote in message
news:c50hvp$2o2aj9$1...@ID-77047.news.uni-berlin.de...

I think I see how this is true.

> There is a simple, elegant and extermely efficient alternative solution. It
> separates the types of specific objects from ones of class-wide objects.
> This approach allows to keep the type tag separately from an [specific]
> object. Thus multiple inheritance does not lead to any penalty. Also it
> makes possible to implement multiple dispatch. Moreover, it allows to make
> classes out of elementary types as well, and note without any
> performance/space penalty.
>
> Ada 95 uses this approach [though it has neither MI nor MD for other
> reasons.]

I will look into this more, it sounds very promising. Thank you for bringing
it to my attention.

Shayne Wissler

Apr 7, 2004, 12:10:15 PM

"Shayne Wissler" <thalesN...@yahoo.com> wrote in message
news:ixVcc.211505$Cb.1817114@attbi_s51...

> > What you want is
> > apparently to bitch about C++ being a crap OO language with the
> > "solution" being something built on that same crap; you need to broaden
> > your horizons instead of trying to push your pet project.
>
> I haven't looked at Heron in detail, but when I did skim over it I didn't
> notice that it was built on C++ in the same manner that C++ is built on C
> (i.e., it extends C). If you are complaining about the fact that it *uses*
> C++ under the hood, then your criticisms here are totally unfounded.

Well it has been a few months since I looked at Heron and my memory of it
was wrong--it does in fact build on C++ like C++ builds on C (for some
reason I was thinking that it was an altogether new language).

However, I think that C++ was definitely an improvement to C, so I don't
have sympathy for "Doc O'Leary's" criticisms. C++ is not "crap".


Shayne Wissler


christopher diggins

Apr 7, 2004, 12:14:22 PM

"Cristiano Sadun" <cristianoTA...@OUThotmail.com> wrote in message
news:Xns94C4781...@212.45.188.38...

> "christopher diggins" <cdig...@users.sourceforge.net> wrote in
> news:dKBcc.98617$_95.19...@wagner.videotron.net:
>
> > OOP has failed us.
> >
> >
> > I have been using OO techniques for over 10 years and the promises
> > that accompanied OOP of code reuse, simplicity of design and speed of
> > development, have never materialized. Over and over again I have seen
> > (and written) programs that were needlessly complicated (i.e. hard to
> > read, difficult to maintain) and inefficient compared to more
> > straightforward designs using modular and strctured programming
> > approaches.
>
> Is that right? I do recall applications done 10 years ago or so. On
> average, they weren't even near the complexity of things we give for
> granted today, and when they were, their cost, effort and failure rates
> were quite larger than today.
>
> In 10 years we've progressed to do much more complex systems. And the
> market has grown expectations on their (low) price which were simply
> unthinkable 10 years ago.

Yes there definitely has been progress in management of complexity in
software.

> > I have come to believe what the problem has been for myself and
> > others, and it is definitely not a simple lack of knowledge or
> > experience.
>
> That's for sure. Complex realizations are difficult, whatever the
> underlying engineering. Things can go wrong, and occasionally do.
>
> > Many programmers and designers (myself included) often
> > make two major errors
> > [...]
> > I propose that the solution is to
> > use interfaces which can be implemented much more efficiently than
> > abstract base classes (see
> > http://www.heron-language.com/heronfront.html for an example of how
> > this is possible).
>
> Well, u're just proposing another silver bullet, aren't you? Changing
> programming language isnt going to do you any better than any other
> miracle cure for software engineering. It's not _only_ a matter of
> technology or paradigm or approach - you'll still have to find heads that
> apply it smartly. That, I fear, is the real matter.

I am trying to propose the technique of using interfaces. The language is
just a vessel for the methodology. I don't see any technology or methodology
as a silver bullet. I agree that different techniques when used judiciously
and intelligently can help to improve the process of writing software. I
believe though that using an object implementing interfaces approach can
reduce complexity compared to a design that uses abstract base classes.

christopher diggins

Apr 7, 2004, 12:19:08 PM

"Shayne Wissler" <thalesN...@yahoo.com> wrote in message
news:lqVcc.209436$_w.1970999@attbi_s53...

To move away from the philosophical discussion: from a practical software
development perspective, if a methodology proves to be too difficult to use
correctly for the average development team member, then an organization
needs to find a new methodology or update the existing one.

Shayne Wissler

Apr 7, 2004, 12:28:21 PM

"christopher diggins" <cdig...@videotron.ca> wrote in message
news:cQVcc.148117$2K3.2...@weber.videotron.net...

> > > That sounds good off the cuff, but then why have we failed OO? Is it because
> > > it has flaws and weaknesses to begin with? A software development
> > > methodology has a responsability to the users (the software developers), and
> > > not the other way around.
> >
> > "That sounds good off the cuff", but it's a false dichotomy. It isn't either
> > way around: a methodology has a responsibility to the facts, and so do the
> > users. And assuming a methodology is based in truth, then users have a
> > responsibility to use it correctly (but this is just a corollary to their
> > responsibility to facts in general).
> >
> > Shayne Wissler
>
> To move away from the philosophical discussion,

I.e., to move away from precision...

> from a practical software
> development perspective if a methodology proves to be too difficult to use
> correctly for the average development team member, then an organization
> needs to find a new methodology or update the existing one.

Well you tried to get away from philosophy but you failed: Now you've just
raised the question of whether or not one should keep developers around who
can't understand/use a good methodology. If I knew of a good methodology,
then I certainly wouldn't seek to toss it out before questioning whether or
not to toss out the "average" developers (whatever "average" means...).


Shayne Wissler


christopher diggins

Apr 7, 2004, 12:38:40 PM

"Shayne Wissler" <thalesN...@yahoo.com> wrote in message

news:HJVcc.211594$Cb.1817933@attbi_s51...

Thank you for sticking up for Heron, Shayne.
One small point though: Heron does not build on C++. The proposed
implementation does, but not the specification. HeronFront is another
matter; that builds on C++.

christopher diggins

Apr 7, 2004, 12:43:59 PM

"Shayne Wissler" <thalesN...@yahoo.com> wrote in message
news:dEVcc.206906$1p.2329801@attbi_s54...

>
> "christopher diggins" <cdig...@videotron.ca> wrote in message
> news:hkVcc.146472$2K3.2...@weber.videotron.net...
> >
> > "thoff" <t...@possibility.com> wrote in message
> > news:10781u7...@news.supernews.com...
> > >
> > > christopher diggins wrote:
> > > > Have you ever looked at the size of your objects in bytes with large
> > > > inheritance trees?
> > >
> > > I've worked on embedded systems using C++ with very small amounts
> > > of memory and have never worried about vtables. I certainly don't
> > > on a workstation.
> >
> > This unfortunately doesn't prove or even demonstrate anything, except that
> > you chose to not consider vtables in one of your designs. Whether that was a
> > good thing or not is undetermined.
>
> Well, since you haven't proved or demonstrated anything with regard to
> vtables (yet), I think that makes you even.

I demonstrated on http://www.heron-language.com/heronfront.html that
implementing interfaces can be done without any vtable. What else would you
like to see demonstrated?

> > > Most C++ programmers don't even know
> > > vtables exist so i don't think it impacts their design.
> >
> > Your conclusion is false. Not understanding a fact does not mean that it
> > does not affect you.
>
> Perhaps you would care to expand on why you think this about vtables. Most
> developers, me included, think that vtables are an implementation detail of
> C++. Indeed, I think that if anyone is going to complain about C++, this
> should be one of the last things to complain about (behind the inheritance
> model).

I am only familiar with the vtable techniques for implementing virtual
function lookups. Granted it can be done, AFAIK, with only a single pointer
within an object rather than one for each base class like many C++
implementations do.

Shayne Wissler

Apr 7, 2004, 12:46:44 PM

"christopher diggins" <cdig...@videotron.ca> wrote in message
news:x6Wcc.149020$2K3.2...@weber.videotron.net...

> > Well it has been a few months since I looked at Heron and my memory of it
> > was wrong--it does in fact build on C++ like C++ builds on C (for some
> > reason I was thinking that it was an altogether new language).
> >
> > However, I think that C++ was definitely an improvement to C, so I don't
> > have sympathy for "Doc O'Leary's" criticisms. C++ is not "crap".
>

> Thank you for sticking up for Heron, Shayne.

I'm not sticking up for Heron since I haven't researched it well enough to
do so, but I do think that "Doc's" criticisms here are way off base (and
that's not mentioning his extremely rude and vicious behavior toward you).

> One small point though, Heron does not build on C++. The proposed
> implementation does, but not the specification. HeronFront is another
> matter, that builds on C++.

OK--so my memory was right and my quick glance over to your website just now
was wrong. Double-whoops.

In any case, that makes Doc's criticisms all the more unfounded.


Shayne Wissler

PS: Someone once said: "Don't worry about people stealing your ideas. If
your ideas are any good, you'll have to ram them down their throats."


Thomas Gagné

Apr 7, 2004, 1:26:00 PM

christopher diggins wrote:

> <snip>
>
>
>I agree that there is no silver bullet, but there is definitely progress
>towards making software more manageable, and less complex. That is the whole
>point of the various software design methodologies. So I disagree with the
>implication that there exists an unsurmountable inherent complexity with the
>problem of mapping human problems onto electronic hardware. Abstraction in
>general continues to be increasingly effective tool.
>
>
>
I don't see any evidence of progress towards less complexity. In fact,
the software industry seems to enjoy accelerating entropy to spite
itself. Consider how much easier it was to build applications in the
80s. Look at the proliferation of languages, protocols, formats,
services, security, and just about anything else. Even relatively new
technologies like Java only get more complicated. Multiple window
managers, new keywords, new syntax, etc. Even our editors have become
increasingly more complicated in an attempt to help programmers manage
the complexity. In some cases these IDEs simply reimplement prior art,
but their features create larger and larger systems such that mastery of them
is another skill of its own complexity.

The software industry has become a farce of a burocracy. Instead of
simplifying the rules we seem to make them more complex. I suppose if
everyone made better use of components and re-use it would be easier to
keep up with them all.

--
.tom
remove email address' dashes for replies
opensource middleware at <http://isectd.sourceforge.net>
<http://gagne.homedns.org/~tgagne/>

Thomas Gagné

Apr 7, 2004, 1:31:00 PM

christopher diggins wrote:

>Have you ever looked at the size of your objects in bytes with large
>inheritance trees? I'd wager in language X of your choice it is likely using
>more space than you need. I know this for a fact in C++, and I demonstrate
>it. I am open to hear which languages handle vtables better (along with a
>bit of demonstration with small psuedo-code bits and byte sizes).
>
>

I'm curious why, in 2004, we should care. Memory and CPUs are becoming
so inexpensive they're ready to spill out of my cereal box in the
morning. Programmers' time is not becoming as cheap or productive as
hardware. Should we bother trying to compete with it or make our source
as easy to understand as possible and spend the extra $100 for a faster
CPU or more memory?

Thomas Gagné

Apr 7, 2004, 2:38:24 PM
I would have preferred to correctly spell "bureaucracy" but hit the
"send" too quickly. Sigh.

christopher diggins

Apr 7, 2004, 3:53:02 PM

"Thomas Gagné" <tga...@wide-open-west.com> wrote in message
news:C-adncj8ZKl...@wideopenwest.com...

>
>
> christopher diggins wrote:
>
> > <snip>
> >
> >
> >I agree that there is no silver bullet, but there is definitely progress
> >towards making software more manageable, and less complex. That is the whole
> >point of the various software design methodologies. So I disagree with the
> >implication that there exists an unsurmountable inherent complexity with the
> >problem of mapping human problems onto electronic hardware. Abstraction in
> >general continues to be increasingly effective tool.
> >
> >
> >
> I don't see any evidence of progress towards less complexity. In fact,
> the software industry seems to enjoy accelerating entropy to spite
> itself. Consider how much easier it was to build applications in the
> 80s. Look at the proliferation of languages, protocols, formats,
> services, security, and just about anything else. Even relatively new
> technologies like Java only get more complicated. Multiple window
> managers, new keywords, new syntax, etc. Even our editors have become
> increasingly more complicated in an attempt to help programmers manage
> the complexity. In some cases these IDEs simply reimlement prior art,
> but their features create larger and larger systems that mastery of them
> is another skill of its own complexity.
<snip>

I meant that complexity is decreasing when you compare two pieces of
software which accomplish the same thing: for instance, a program written in
Java compared to the same program in Fortran, and compared again to the same
program in assembly language. I agree that the requirements on software are
indeed getting more complex.

Alan Gauld

Apr 7, 2004, 3:54:30 PM
On Wed, 07 Apr 2004 13:31:00 -0400, Thomas Gagné
<tga...@wide-open-west.com> wrote:
> morning. Programmers' time is not becomming as cheap or productive as
> hardware. Should we bother trying to compete with it or make our source
> as easy to understand as possible and spend the extra $100 for a faster
> CPU or more memory?

I sort of agree except that if the apps become so big they
require hardware upgrades more often, that soon becomes big money.
In a corporate setting with say 1000 users, if a new software
version requires a PC upgrade a year earlier than expected that's
an extra $1m on the IT budget. You can buy a lot of programming
time for a million bucks...

In fact in most IT projects the development cost is still much
less than 50% of the total, with things like hardware and network
deployment and, above all, user adoption (training, process
reengineering etc) being the biggest cost. But the latter
aspects, at least, are well outside the scope of this particular
thread! :-)

Alan G
Author of the Learn to Program website
http://www.freenetpages.co.uk/hp/alan.gauld

christopher diggins

Apr 7, 2004, 4:05:18 PM

"Thomas Gagné" <tga...@wide-open-west.com> wrote in message
news:4tedncl_SLB...@wideopenwest.com...

>
>
> christopher diggins wrote:
>
> >Have you ever looked at the size of your objects in bytes with large
> >inheritance trees? I'd wager in language X of your choice it is likely using
> >more space than you need. I know this for a fact in C++, and I demonstrate
> >it. I am open to hear which languages handle vtables better (along with a
> >bit of demonstration with small psuedo-code bits and byte sizes).
> >
> >
> I'm curious why, in 2004, we should care. Memory and CPUs are becomming
> so inexpensive they're ready to spill out of my cereal box in the
> morning. Programmers' time is not becomming as cheap or productive as
> hardware. Should we bother trying to compete with it or make our source
> as easy to understand as possible and spend the extra $100 for a faster
> CPU or more memory?

No matter how much RAM you put on a machine or how fast the CPU is, there
will always be scenarios where the limits are reached and the software needs
to perform better, so it is naive to assert that memory and performance
should be ignored. Of course, in the end all that matters is the end-user,
and if their requirements are met then it doesn't matter how much memory or
time it takes. I do want to point out, though, that the world software market
is growing very quickly and it is only a handful of nations who have CPUs
and RAM spilling out of their cereal boxes.

Thomas Gagné

Apr 7, 2004, 4:19:54 PM
christopher diggins wrote:

> <snip>
>


>I meant that complexity is decreasing when you compared two pieces of
>software which accomplish the same thing for instance a program written in
>Java compared to the same program in fortran and compared again to the same
>program in assembly language. I agree that the requirements on software are
>indeed getting more complex.
>
>
>

If they do the same thing, I suspect the FORTRAN code may have been
easier to read and write than the Java code. FORTRAN is a simpler
language than Java. No exception blocks. No mixing of primitive and
object types. No inner classes. Less typing. No imports. No class
paths. Fewer reserved words. No VM. Just a simple edit/compile/link
cycle. Some even had array bounds checking.

FORTRAN is easier than assembly with respect to syntax, keywords, lines of
code, and readability. FORTRAN was a larger improvement over Assembler
than Java is over FORTRAN (using the noted metrics).

Universe

Apr 7, 2004, 5:51:51 PM

"Shayne Wissler" <thalesN...@yahoo.com> wrote in message news:HJVcc.211594>

Universe

Apr 7, 2004, 6:04:16 PM4/7/04