
True faiths ( was Re: The true faith )


israel r t

Jan 11, 2002, 6:46:36 PM
On Fri, 11 Jan 2002 09:50:40 -0800, g...@jpl.nasa.gov (Erann Gat)
wrote:

Kent:
>> that the substance will end up mattering in the long run.
>
>The community seems to share a disdain for more than just advocacy. Our
>disdain extends to just about everything that isn't Lisp. This disdain is
>evident in statements like "substance will end up mattering in the long
>run", which implicit ly rejects the possibility that substance *has*
>mattered, and that the world has rejected Lisp for substantial reasons and
>not just superficial ones.

This seems to be typical of most marginal communities.
I have seen variants of the argument "substance will eventually win"
in the Smalltalk, Ada, Eiffel, OS/2 and SGI communities.
It seems improbable that every marginalised language is going to win.

Lisp, Ada, Eiffel and Smalltalk are all excellent languages, far
superior to the Gang of Three (Java, C++, C).

Yet, if Kent is right, they may have all been "rejected ... for
substantial reasons and not just superficial ones."

Will we all end up like the Moonies, convinced that our faith is the
One True Faith while the rest of the world moves on?

Steven T Abell

Jan 11, 2002, 7:38:02 PM
israel r t wrote:
> This seems to be typical of most marginal communities.
> I have seen variants of the argument "substance will eventually win
> " in the smalltalk, ada, eiffel, os2 and sgi communities.
> It seems improbable that every marginalised language is going to win.
>
> Lisp, Ada , Eiffel and Smalltalk are all excellent languages, far
> superior to the Gang of Three ( Java, C++ , C ).
>
> Yet, if Kent is right, they may have all been " rejected ... for
> substantial reasons and not just superficial ones."
>
> Will we all end up like the Moonies , convinced that our faith is the
> One True Faith while the rest of the world moves on ?

I have not done Ada or Eiffel,
have done Smalltalk and Lisp,
used to teach Smalltalk.

I think the reason is mindshare,
which unfortunately looks pretty relevant to a hiring manager.

Not very many people can use these languages well.
It's true that two or three people who can use them well
can outproduce a whole roomful of very competent C/C++/Java guys.
But a manager is highly influenced by the Truck Effect:
if one of your exotic wonderboys goes away for any reason,
he can be very hard to replace.
Furthermore, the loss of one of these
is equivalent to the loss of a whole team of C/C++/Java guys.
Yes, I know the time/cost tradeoffs,
but most managers don't want to hear the facts on this issue,
they just want to know if they can hire someone off the street,
and street people don't do Smalltalk.

I'm doing C++ right now,
and I'm painfully aware of just how unproductive this thing is.
But my client believes that I can be replaced if I go splat,
and that belief helps them get through the day.
Underneath it all,
my work is informed by my Smalltalk and Lisp experience
in ways that your average C/C++/Java guy just doesn't get,
and my client understands that I know something they don't.

I would love to be able to do Smalltalk all day long.
I would hate to go through life with the outlook of a C guy.
It's hard, but I try to be content,
and I go home and work on learning APL.
For those of us who actually have to produce things,
it's what you feed your brain that's relevant,
and C and its children are not enough.

Steve
--
Steven T Abell
Software Designer
http://www.brising.com

In software, nothing is more concrete than a good abstraction.

Victor B. Putz

Jan 11, 2002, 9:31:10 PM
israel r t <isra...@optushome.com.au> writes:

> Will we all end up like the Moonies , convinced that our faith is the
> One True Faith while the rest of the world moves on ?

Depends.

If you can find the right group of folks who are all interested in improving
their productivity with marginal languages, you can always try and work outside
the "C Block" and do an end-run around slower-moving organizations.

It is a Great Mistake to assume that, because a programming language has been
overlooked by the majority, it is therefore not worth pursuing in a niche.
Sure, there's a great draw to simply throwing up one's hands and saying "ah,
what the heck, everyone else is using C++"... and that is a choice of safety
over all else. Not a bad choice, but perhaps not the best choice.

It all depends, of course.

-->VPutz

Kenny Tilton

Jan 11, 2002, 10:56:04 PM

> Lisp, Ada , Eiffel and Smalltalk are all excellent languages, far
> superior to the Gang of Three ( Java, C++ , C ).
>
> Yet, if Kent is right, they may have all been " rejected ... for
> substantial reasons and not just superficial ones."

For a second I thought by "they" you meant Java, C++ and C. They have
indeed all been rejected, if you think about it.

Anybody want to do without GC and OO (besides Graham)? Goodbye C.

Anybody want to do without GC and worry about pointers and use templates
to get past static typing and wait minutes for a single
edit-compile-link-run iteration? Goodbye C++, witness the rejoicing over
Java, Perl, Python, Ruby...

Anybody happy with Java? Why are people adding JIT compilation, GFs, MI
(AspectJ) and macros?

Obviously the winner is any compiled (fast), GCed, GFing, MI dynamic
language with macros. Mature, an ANSI standard and a free
implementation would not hurt either. hang on...

Kenny
CliniSys

israel r t

Jan 12, 2002, 12:49:59 AM
On Sat, 12 Jan 2002 03:56:04 GMT, Kenny Tilton <kti...@nyc.rr.com>
wrote:

>Obviously the winner is any compiled (fast), GCed, GFing, MI dynamic
>language with macros. Mature, an ANSII standard and a free
>implementation would not hurt either. hang on...

And OCaml wins again! :-)

No macros though...
Lots of currying and higher-order functions however.

And speeeeeeed!
(As if it really matters with commodity 2 GHz processors...)
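[Not part of the original post: a minimal sketch of the currying and
higher-order functions mentioned above. Every OCaml function of several
arguments is curried by default, so applying it to fewer arguments yields
a new function; the names here are purely illustrative.]

```ocaml
(* A two-argument function; its type is int -> int -> int,
   i.e. a function returning a function. *)
let add x y = x + y

(* Partial application: [add 10] is itself a function of one argument. *)
let add10 = add 10

(* Higher-order function: List.map takes a function as its argument. *)
let doubled = List.map (fun x -> x * 2) [1; 2; 3]

let () =
  assert (add10 5 = 15);
  assert (doubled = [2; 4; 6]);
  print_endline "ok"
```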

Thaddeus L Olczyk

Jan 12, 2002, 1:04:31 AM
On Sat, 12 Jan 2002 16:49:59 +1100, israel r t
<isra...@optushome.com.au> wrote:

Hmmm. Is there a version of ML that supports macros?

Friedrich Dominicus

Jan 12, 2002, 3:42:48 AM
olc...@interaccess.com (Thaddeus L Olczyk) writes:

> n Sat, 12 Jan 2002 16:49:59 +1100, israel r t
> <isra...@optushome.com.au> wrote:
>
> >On Sat, 12 Jan 2002 03:56:04 GMT, Kenny Tilton <kti...@nyc.rr.com>
> >wrote:
> >
> >>Obviously the winner is any compiled (fast), GCed, GFing, MI dynamic
> >>language with macros. Mature, an ANSII standard and a free
> >>implementation would not hurt either. hang on...
> >
> >And Ocaml wins again ! :-)
> >
> >No macros though...

Not fully correct, see
http://caml.inria.fr/camlp4/

> >Lots of currying and and higher order functions however.
> >
> >And speeeeeeed !
> >( As if it really matters with commodity 2 GHz processors... )
> Hmmm. Is there a version of ML that supports macros?

Well, I think it's OCaml.

Regards
Friedrich

Andreas Bogk

Jan 12, 2002, 10:44:40 AM
israel r t <isra...@optushome.com.au> writes:

> >Obviously the winner is any compiled (fast), GCed, GFing, MI dynamic
> >language with macros. Mature, an ANSII standard and a free
> >implementation would not hurt either. hang on...
> And Ocaml wins again ! :-)

OCaml doesn't have generic functions, and it is statically typed.

On the other hand it comes with algebraic data types, which can be
handy...
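[Not part of the original post: a minimal sketch of the algebraic data
types Andreas mentions. A value is one of a fixed set of tagged cases, and
pattern matching forces each case to be handled; the `shape` type and its
constructors are illustrative, not from any real library.]

```ocaml
(* An algebraic data type with two constructors. *)
type shape =
  | Circle of float           (* radius *)
  | Rect of float * float     (* width, height *)

(* Pattern matching: the compiler warns if a case is missing. *)
let area = function
  | Circle r -> 3.14159265358979 *. r *. r
  | Rect (w, h) -> w *. h

let () =
  assert (area (Rect (2.0, 3.0)) = 6.0);
  print_endline "ok"
```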

Andreas

--
"In my eyes it is never a crime to steal knowledge. It is a good
theft. The pirate of knowledge is a good pirate."
(Michel Serres)

Preben Randhol

Jan 12, 2002, 10:58:51 AM
On Sat, 12 Jan 2002 10:46:36 +1100, israel r t wrote:
> Yet, if Kent is right, they may have all been " rejected ... for
> substantial reasons and not just superficial ones."

This implies that the market is rational and logical. It has shown time
and time again that it isn't. The market won't choose a superior
technical solution over an inferior yet better-marketed or seemingly
cheaper solution.

Capitalism will need to adjust itself in the near future from the
short-term gain view to a long-term view. This includes taking into
account security and maintainability over time. In all sectors we see
that the drive to cut costs leads to problems with security: secure
food, transport, utilities, software etc. Articles are now appearing
that look at the cost of software bugs for businesses.

It is just as frustrating every time I hear about some company that
makes some kind of "secure" solution and, when you ask them what
language they use, they say C or C++. When one then asks why they use
such an unsafe language, they don't argue that C/C++ is safe (they
mostly agree that it isn't), but cite economic concerns, or that they
need people who can program, and people know C/C++.

Preben
--
() Join the worldwide campaign to protect fundamental human rights.
'||}
{||' http://www.amnesty.org/

Nils Goesche

Jan 12, 2002, 11:18:55 AM
In article <slrna40nb0.7nn...@kiuk0156.chembio.ntnu.no>, Preben Randhol wrote:
> On Sat, 12 Jan 2002 10:46:36 +1100, israel r t wrote:
>> Yet, if Kent is right, they may have all been " rejected ... for
>> substantial reasons and not just superficial ones."
>
> This implies that the marked is rational and logical. It has shown time
> and time again that it isn't. The marked won't choose a superior
> technical solution over an inferior yet better marketed or seemingly
> cheaper solution.

It won't? How do you know that? Who is ``the market''? Software
companies decide for themselves which programming language they use.
Your company is free to choose, too.

> Capitalisme will need to adjust itself in the near future from the
> short-term gain view to a long-term view.

Funny; ``in the near future''. This claim is *very* old, but we are
still waiting ;-)

> It is just as frustrating every time I hear about some company that
> makes some kind of "secure" solution and when you ask them what language
> they use they say C or C++. When one then ask them why they use such an
> unsafe language they start arguing not that C/C++ is safe, they mostly
> agree that it isn't, but out of economical concerns or that they need
> people that can program and people know C/C++.

Well, that's the decision they made, then. If you think it's a wrong
one, you could prove it by making a different decision in /your/ company.
If you are right, your product should be more successful, right?

Blaming the market doesn't make sense at all here. You think the market
is ``irrational and illogical''? Who is ``rational and logical'' then?
Some communist party? There isn't much of an alternative here; either
companies are free to choose, as they are now, or some communist party
decides which programming language we all have to use. Just imagine
which one would that be, considering the ``rational and logical''
decisions those parties have made in the past...

Regards,
--
Nils Goesche
Ask not for whom the <CONTROL-G> tolls.

PGP key ID 0xC66D6E6F

Thomas F. Burdick

Jan 12, 2002, 12:03:46 PM
Nils Goesche <car...@t-online.de> writes:

> In article <slrna40nb0.7nn...@kiuk0156.chembio.ntnu.no>, Preben Randhol wrote:
>
> > Capitalisme will need to adjust itself in the near future from the
> > short-term gain view to a long-term view.
>
> Funny; ``in the near future''. This claim is *very* old, but we are
> still waiting ;-)

Even restricting one's view to economics alone, look at the United
States. We're in the second economic recession in 10 years. And in
the time in between, although unemployment was low, wages didn't grow.
This isn't much of a refutation of the claim. Or are you just
arguing that capitalism still exists? If so, that's not much of an
argument.

--
/|_ .-----------------------.
,' .\ / | No to Imperialist war |
,--' _,' | Wage class war! |
/ / `-----------------------'
( -. |
| ) |
(`-. '--.)
`. )----'

Christopher Browne

Jan 12, 2002, 12:56:57 PM
t...@conquest.OCF.Berkeley.EDU (Thomas F. Burdick) writes:
> Nils Goesche <car...@t-online.de> writes:
>
> > In article <slrna40nb0.7nn...@kiuk0156.chembio.ntnu.no>, Preben Randhol wrote:

> > > Capitalism will need to adjust itself in the near future from


> > > the short-term gain view to a long-term view.

> > Funny; ``in the near future''. This claim is *very* old, but we
> > are still waiting ;-)

> Even restricting one's view to economics alone, look at the United
> States. We're in the second economic recession in 10 years. And in
> the time in between, although unemployment was low, wages didn't
> grow. This isn't much of a refultation of the claim. Or are you
> just arguing that capitalism still exists? If so, that's not much
> of an argument.

And "wages" and "unemployment" are forcibly supposed to be related to
the manner of ownership of the results of production _how_?

[You're probably _not_ amongst the clueless on this, but I find it
tremendously irritating when people get spectacularly worshipful about
"capitalism" when they're clearly thinking about things that aren't
forcibly related, such as when concepts of "private property" and
"free markets" could apply equally well under arrangements such as
"mercantilism."]

The notion that it makes sense for "capitalism to adjust itself" to
something when capitalism is an _economic concept_ seems just
spectacularly silly. It's a _definition_, and definitions don't
adjust themselves.
--
(concatenate 'string "cbbrowne" "@acm.org")
http://www3.sympatico.ca/cbbrowne/
"We're all a little weird. And life is a little weird. And when we
find someone whose weirdness is compatible with ours, we join up with
them and fall into mutually satisfying weirdness - and call it
love..." -- R. Fulghum

Doug Hockin

Jan 12, 2002, 2:39:25 PM
> Obviously the winner is any compiled (fast), GCed, GFing, MI dynamic
> language with macros. Mature, an ANSII standard and a free
> implementation would not hurt either. hang on...


Like Dylan?

comp.lang.dylan
http://www.gwydiondylan.org
http://www.functionalobjects.com

-- Doug

Kenny Tilton

Jan 12, 2002, 3:14:53 PM

Doug Hockin wrote:
>
> > Obviously the winner is any compiled (fast), GCed, GFing, MI dynamic
> > language with macros. Mature, an ANSII standard and a free
> > implementation would not hurt either. hang on...
>
> Like Dylan?

Is Dylan mature, stable, ANSI standard?

Anyway, not enough parentheses. And I like unhygienic macros.

Dylan to me adds no value over Lisp, so why bother?

All that said, yup, Dylan is something I would consider if Lisp did not
exist.

Oh, damnit, I forgot. How about a MOP? Does Dylan expose the MOP so I
can metaclass?

Anyway, the big question is: what is the added value over Lisp to make
me give up Lisp.

One more thing: I love editing with parentheses-aware editors. Infix
won't cut it.

kenny
clinisys

Fernando Rodríguez

Jan 12, 2002, 5:45:29 PM
On Sat, 12 Jan 2002 20:14:53 GMT, Kenny Tilton <kti...@nyc.rr.com> wrote:


>> Like Dylan?
>
>Is Dylan mature, stable, ANSII standard?
>
>Anyway, not enough parentheses. And I like unhygienic macros.
>
>Dylan to me adds no value over Lisp, so why bother?

If you are allergic to parentheses, Dylan and Python are very useful. ;-)

--
Fernando Rodríguez
frr at wanadoo dot es
--

Thomas F. Burdick

Jan 12, 2002, 7:52:33 PM
Christopher Browne <cbbr...@acm.org> writes:

> t...@conquest.OCF.Berkeley.EDU (Thomas F. Burdick) writes:
> > Nils Goesche <car...@t-online.de> writes:
> >
> > > In article <slrna40nb0.7nn...@kiuk0156.chembio.ntnu.no>, Preben Randhol wrote:
>
> > > > Capitalism will need to adjust itself in the near future from
> > > > the short-term gain view to a long-term view.
>
> > > Funny; ``in the near future''. This claim is *very* old, but we
> > > are still waiting ;-)
>
> > Even restricting one's view to economics alone, look at the United
> > States. We're in the second economic recession in 10 years. And in
> > the time in between, although unemployment was low, wages didn't
> > grow. This isn't much of a refultation of the claim. Or are you
> > just arguing that capitalism still exists? If so, that's not much
> > of an argument.
>
> And "wages" and "unemployment" are forcibly supposed to be related to
> the manner of ownership of the results of production _how_?

I was responding to a claim that seemed to be implying that there
wasn't anything wrong with capitalism's optimizing for short-term
gain. In fact, I think imperialism has just about reached a point of
equilibrium, where it's no longer beneficial (for the capitalist
class) to increase productivity. The United States is now a society
where the average amount of labor is increasing, but the net
production isn't (if things are produced but subsequently exploded or
thrown out, there was no net production). Increased work and
declining wages during the period *between* cyclical recessions is a
symptom of this.

> [You're probably _not_ amongst the clueless on this, but I find it
> tremendously irritating when people get spectacularly worshipful about
> "capitalism" when they're clearly thinking about things that aren't
> forcibly related, such as when concepts of "private property" and
> "free markets" could apply equally well under arrangements such as
> "mercantilism."]
>
> The notion that it makes sense for "capitalism to adjust itself" to
> something when capitalism is an _economic concept_ seems just
> spectacularly silly. It's a _definition_, and definitions don't
> adjust themselves.

But capitalism has certainly evolved over time. Property and class
relations have changed since the mid 19th century. Who knows, maybe
it will evolve some more and find a way to increase production. Or
maybe it won't, and we'll have a social crisis and a change in
property relations :)

Thomas F. Burdick

Jan 12, 2002, 8:03:43 PM
Kenny Tilton <kti...@nyc.rr.com> writes:

> Doug Hockin wrote:
> >
[> > > is Kenny Tilton]


> > > Obviously the winner is any compiled (fast), GCed, GFing, MI dynamic
> > > language with macros. Mature, an ANSII standard and a free
> > > implementation would not hurt either. hang on...
> >
> > Like Dylan?
>
> Is Dylan mature, stable, ANSII standard?

I thought those were just niceties, not necessary for the language to
"win", according to you just above.

> Anyway, not enough parentheses. And I like unhygienic macros.
>
> Dylan to me adds no value over Lisp, so why bother?

Well, sure, right now. Since Dylan seems to have what you think it
would take for a non-mainstream language to make it into the
mainstream, if it did, that's the value it *would* offer. I'd be
*thrilled* to be a Dylaner if there were a ton of Dylan jobs.

You seem to have changed from talking about what "the winner" needs,
to talking about what language you'd rather use. I don't think it's a
good idea to change the definition of "win" so radically, at least
mid-thread :)

> All that said, yup, Dylan is something I would consider if Lisp did not
> exist.
>
> Oh, damnit, I forgot. How about a MOP. Does Dylan expose the MOP so I
> can metaclass?

Yeah, it might have just fallen out of the winning category. I agree,
a MOP is vital.

> Anyway, the big quetsion is: what is the added value over Lisp to make
> me give up Lisp.
>
> One more thing: I love editing with parentheses-aware editors. Infix
> won't cut it.

If you had an infix language that could be turned into a sexp form,
you could probably make an editor that actually worked on its
structure in the image. Not that I'm advocating this, I'm just sayin...

Thaddeus L Olczyk

Jan 12, 2002, 2:49:00 PM
On 12 Jan 2002 19:39:25 GMT, Doug Hockin
<dho...@staffware-spokane.com> wrote:

He didn't say dead language.

Kenny Tilton

Jan 13, 2002, 12:24:20 AM

"Thomas F. Burdick" wrote:
>
> Kenny Tilton <kti...@nyc.rr.com> writes:
>
> > Doug Hockin wrote:
> > >
> [> > > is Kenny Tilton]
> > > > Obviously the winner is any compiled (fast), GCed, GFing, MI dynamic
> > > > language with macros. Mature, an ANSII standard and a free
> > > > implementation would not hurt either. hang on...
> > >
> > > Like Dylan?
> >
> > Is Dylan mature, stable, ANSII standard?
>
> I thought those were just niceties, not necessary for the language to
> "win", according to you just above.

was doug making that distinction? anyway, that was just a point of
information. i do not follow dylan, for all i know its spec is
stable--i wasn't arguing, i was asking.

> > Dylan to me adds no value over Lisp, so why bother?
>
> Well, sure, right now. Since Dylan seems to have what you think it
> would take for a non-mainstream language to make it into the
> mainstream, if it did, that's the value it *would* offer. I'd be
> *thrilled* to be a Dylaner if there were a ton of Dylan jobs.

well, my whole point was, look past current popularity to how developers
are reacting to the status quo with new languages or new features for
java to get at True Popularity. so might-make-it-mainstream is subtly
OT.

I anyway like to separate the question of how great is a language
regardless of its popularity from considerations deriving from
popularity, such as the number of available jobs, programmers and
libraries. The latter effects of popularity do not in the long run wag
the popularity dog; there was a day when COBOL, VSAM and CICS ruled the
world. Pascal was so big the original Mac OS team used it. And the
Romans really kicked the Christians' asses. Hang on...

Note that I am not saying popularity does not add value in the practical
ways we all grok, just that I like to control for popularity when
assessing long-term prospects of languages.

kenny
clinisys

Kevin McFarlane

Jan 13, 2002, 7:20:29 AM

"Steven T Abell" <ab...@brising.com> wrote in message
news:3C3F8689...@brising.com...

I think Steve is right. I too am primarily a C++ developer but would like to
be using langages such as Eiffel, which I've read a lot about but not used.
However, I'm constrained by learning what I have to learn to make a living
and what can be leveraged off of C++. So things like C# and ASP are worth my
while investing in and Eiffel isn't - for now, at any rate.

I think that it is good for programmers to be familiar with more than one
language, even if only cursorily. My C++ has improved and is improving
(hopefully) by being informed by important concepts from Eiffel and from
more general reading on software engineering.

It's always going to be hard for something new to get a look in. There is,
of course, the economic reason that, for example, C/C++ guys are two a penny
but Eiffel and Smalltalk guys aren't. That's why you often need some kind of
killer application or killer technology area to leverage off of. Eiffel, for
example, may get a boost from .Net, especially as it's offering a few things
that the other languages don't. But I fear it's not being marketed very well
at the moment.

Another barrier to overcome is programmers themselves. It's difficult enough
to get C programmers to buy into OO and to get C++ programmers to use the
techniques that exist to write safer, more maintainable code. This may
partly be due to the complexity of C++ but I also think it's just
boneheadedness (e.g., the attitude "I've always used sprintf, why try
anything different?").

In my last job, a difficult-to-find memory-corruption problem was caused
by an incorrectly coded sprintf. This would not have happened with the
more modern C++ alternatives. But a dyed-in-the-wool C programmer would
probably just say that programmers should be competent enough not to make
mistakes.

However, having said this, the minority languages would probably help
their case more by not dogmatically dismissing everything else, but by
being more constructive. Not everything in C/C++ is bad.


Patrick Doyle

Jan 13, 2002, 10:23:06 AM
In article <3C409A34...@nyc.rr.com>,

Kenny Tilton <kti...@nyc.rr.com> wrote:
>
>Oh, damnit, I forgot. How about a MOP. Does Dylan expose the MOP so I
>can metaclass?

Er, what's a MOP?

--
--
Patrick Doyle
doy...@eecg.toronto.edu

Dr. Edmund Weitz

Jan 13, 2002, 11:29:48 AM
doy...@eecg.toronto.edu (Patrick Doyle) writes:

> Er, what's a MOP?

MetaObject Protocol, see <http://www.elwoodcorp.com/alu/mop/>.

Edi

Nils Goesche

Jan 13, 2002, 1:17:57 PM
In article <xcvu1tr...@conquest.OCF.Berkeley.EDU>, Thomas F. Burdick wrote:
> Christopher Browne <cbbr...@acm.org> writes:
>
>> t...@conquest.OCF.Berkeley.EDU (Thomas F. Burdick) writes:
>> > Nils Goesche <car...@t-online.de> writes:
>> >
>> > > In article <slrna40nb0.7nn...@kiuk0156.chembio.ntnu.no>, Preben Randhol wrote:
>>
>> > > > Capitalism will need to adjust itself in the near future from
>> > > > the short-term gain view to a long-term view.
>>
>> > > Funny; ``in the near future''. This claim is *very* old, but we
>> > > are still waiting ;-)
>>
>> > Even restricting one's view to economics alone, look at the United
>> > States. We're in the second economic recession in 10 years. And in
>> > the time in between, although unemployment was low, wages didn't
>> > grow. This isn't much of a refultation of the claim. Or are you
>> > just arguing that capitalism still exists? If so, that's not much
>> > of an argument.
>>
>> And "wages" and "unemployment" are forcibly supposed to be related to
>> the manner of ownership of the results of production _how_?
>
> I was responding to a claim that seemed to be implying that there
> wasn't anything wrong with capitalism's optimizing for short-term
> gain.

There /is/ something wrong with your idea of `capitalism' being
`optimizing' for anything like `short-term gains'. /People/ are
`optimizing'. Maybe some managers and politicians are `optimizing
for short-term gains'. But that is nothing to worry about: If you
think you know better how to win in the long run, by any means, do it!
Those stupid short-term-thinkers that apparently worry you so much
will lose and you'll take them over when they're broke ;-)

But yes, I know what you're up to. You are, like everybody else
on the `leftist' side,
convinced that you can do better than the market; that you know what
people /really/ want, or at least should want to do or to buy because
you are, because of your superior intelligence, able to predict
the future and are the only one who can tell what has to be done instead
in order to save us all. So, we need a strong government, ownership
of companies and anything else has to be transferred to the government
which must be led by yourself, of course, not by some ordinary,
mortal short-term-thinker. Then, elections are no longer necessary
and in fact dangerous, because the stupid public might not recognize,
because of their lack of intelligence, how everything the government
does will be good for them, in the long run, of course, and some
`right-wing' evildoers might convince them to elect somebody else and
we can't have that, can we?

Pretty close? I hope you are not offended by this too much, but I
wanted to show you how all this `leftist' talk sounds to people not
sharing your views. Now you probably think that I am a `right-wing'
evil-doer, that would be a typical reflex at least, but there are
other views of the world than the leftist, and not all of them are
right-wing.

If you find this interesting, however, you might be also interested in

``The Vision of the Anointed: Self-Congratulation As a Basis for
Social Policy''

by Thomas Sowell,

or

``Free to Choose: A Personal Statement''

by Milton and Rose D. Friedman,

both available at Amazon.

Erik Naggum

Jan 13, 2002, 1:52:43 PM
* "Kevin McFarlane" <ke...@atech.globalnet.co.uk>

| It's always going to be hard for something new to get a look in. There
| is, of course, the economic reason that, for example, C/C++ guys are two
| a penny but Eiffel and Smalltalk guys aren't.

This is one of the most misleading abuses of statistics around. Just
because the probability that you hit a C++ programmer if you throw a rock
into a crowd is very high, does not mean that the probability that he can
replace _your_ C++ programmer is any higher than finding a replacement
Eiffel or Smalltalk programmer. Because you have to weed through tons of
idiots who only _claim_ they know C++, the effort required to find a real
replacement may be significantly lower for Eiffel or Smalltalk. Besides,
if you can find a good programmer, chances are very good that he will be
able to learn any programming language you use reasonably well in the
time it would take to find a good C++ programmer. And learning from the
sources of the previous programmer is a lot easier than learning the
language from scratch in a general, application-independent way.

I have actually witnessed this. A company I worked for got a new manager
level that was completely superfluous, so the new manager had to prove to
herself that she had a real job, and spent a lot of time arguing against
using languages that were not mainstream, and basically made it hard to
use anything but Java, and many good people quit. Then a Java man got
seriously ill. She was unable to replace him in the 5 months he was
away. The other Java men could not do his work. To her amazement,
choice of language mattered less than the other skills the programmers
had. The conclusion from this story that this manager actually arrived
at was that it was bad to have skilled programmers -- she alone should
make the design decisions and programmers would simply implement them.
She could now return to her policy of using only mainstream languages and
hire only unskilled programmers who lied about knowing a language. As
far as I know, nothing interesting has happened at that company for a
long time.

///
--

Thomas F. Burdick

Jan 13, 2002, 2:04:36 PM
Nils Goesche <car...@t-online.de> writes:

> There /is/ something wrong with your idea of `capitalism' being
> `optimizing' for anything like `short-term gains'. /People/ are
> `optimizing'. Maybe some managers and politicians are `optimizing
> for short-term gains'. But that is nothing to worry about: If you
> think you know better how to win in the long run, by any means, do it!
> Those stupid short-term-thinkers that apparently worry you so much
> will lose and you'll take them over when they're broke ;-)

To say the above you need to completely ignore property relations.

> But yes, I know what you're up to.

No, you don't. Just because you've met a whole lot of Stalinists and
holier-than-thou Social Democrats doesn't mean you understand the
views of everyone in the class struggle. In particular, even if it
were reasonable to have that attitude with Europeans, projecting your
political understanding across the Atlantic is a recipe for disaster.

> You are, like everybody else
> on the `leftist' side,
> convinced that you can do better than the market; that you know what
> people /really/ want, or at least should want to do or to buy because
> you are, because of your superior intelligence, able to predict
> the future and are the only one who can tell what has to be done instead
> in order to save us all. So, we need a strong government, ownership
> of companies and anything else has to be transferred to the government
> which must be led by yourself, of course, not by some ordinary,
> mortal short-term-thinker. Then, elections are no longer necessary
> and in fact dangerous, because the stupid public might not recognize,
> because of their lack of intelligence, how everything the government
> does will be good for them, in the long run, of course, and some
> `right-wing' evildoers might convince them to elect somebody else and
> we can't have that, can we?
>

> Pretty close? I hope you are not offended by this too much, but I
> wanted to show you how all this `leftist' talk sounds to people not
> sharing your views.

Oh, I know how I sound to others. I can't do much about the fact that
some people listen to one thing and hear another. The only way to get
past that is to have a real conversation with someone. On that note,
if you want to continue this over e-mail, I'm down. But I'd rather
not continue this in comp.*, for reasons I hope are obvious :)

> by Milton and Rose D. Friedman,

Oh dear god, Milton Friedman??? I don't know about your political
views, but this man's are frighteningly right-wing. If you want to
know what his politics means for the 2nd and 3rd world, look at what
he helped do in Latin America. BTW, it is a very mainstream view to
be frightened of / hostile towards him.

--
/|_ .-----------------------.
,' .\ / | No to Imperialist war |
,--' _,' | Wage class war! |
/ / `-----------------------'
( -. |
| ) |
(`-. '--.)
`. )----'

^This is a sabot cat, a big clue I'm not a communist

James A. Robertson

unread,
Jan 13, 2002, 2:26:14 PM1/13/02
to
As much as I enjoy a good political discussion, and as much as I've been
sucked into them in this and other forums, I'm going to ask that you
take this to email or an appropriate forum.

Thanks,

Andreas Bogk

unread,
Jan 13, 2002, 3:43:41 PM1/13/02
to
Nils Goesche <car...@t-online.de> writes:

> There /is/ something wrong with your idea of `capitalism' being
> `optimizing' for anything like `short-term gains'. /People/ are
> `optimizing'. Maybe some managers and politicians are `optimizing

The problem lies in the so-called "system effect": actions that result
not from the individual actions of people, but from structures as
a whole.

Very much like neurons in our brain...

> But yes, I know what you're up to. You are, like everybody else
> on the `leftist' side,
> convinced that you can do better than the market; that you know what
> people /really/ want, or at least should want to do or to buy because
> you are, because of your superior intelligence, able to predict
> the future and are the only one who can tell what has to be done instead
> in order to save us all. So, we need a strong government, ownership

I think it's too easy to paint this as the typical 'leftist' attitude.
Modern politics has more dimensions than just the one left-right
dimension many people use to save themselves from thinking for
themselves.

What you describe is more the individualism vs. society axis. The far
extreme you describe is totalitarianism: no power left with the
individual, all power with society. This has happened under both the
leftists (Stalin-type socialist dictatorship) and the rightists
(Hitler-type nationalist dictatorship).

For the left-right debate, I think this quote gets it all: "There are
two kinds of fools: one says 'This is old and therefore good', the
other says 'This is new and therefore better'." Neither the left nor
the right position is inherently right; it's about the balance that
benefits mankind the most.

The same goes for the balance between freedom and security: the very
nature of society is to give up some freedom (such as the freedom to
kill your neighbour) for security (such as being secure from being
killed by your neighbour). Again, we're challenged to balance this
well, and it's independent of the left-right balance.

Nils Goesche

unread,
Jan 13, 2002, 3:40:43 PM1/13/02
to
In article <3C41DF35...@mail.com>, James A. Robertson wrote:
> As much as I enjoy a good political discussion, and as much as I've been
> sucked into them in this and other forums, I'm going to ask that you
> take this to email or an appropriate forum.

Of course, sorry about that. I won't post anything more to this thread
and switched to email.

Espen Vestre

unread,
Jan 14, 2002, 3:12:01 AM1/14/02
to
Erik Naggum <er...@naggum.net> writes:

> I have actually witnessed this. A company I worked for got a new manager
> level that was completely superfluous, so the new manager had to prove to
> herself that she had a real job, and spent a lot of time arguing against
> using languages that were not mainstream, and basically made it hard to
> use anything but Java, and many good people quit. Then a Java man got
> seriously ill. She was unable to replace him in the 5 months he was
> away. The other Java men could not do his work.

The scary thing is that your experience is no exception, there are
_so_ many representatives of this type of manager around. It's the
kind of manager that actually thinks that hiring java or C++
programmers is so easy because they teach these languages at "every
school", and fail to understand that having successfully spent 2 or 3
years at some school which educates "IT professionals" doesn't
necessarily imply that you're suited to be a programmer at all, and it
_definitely_ doesn't mean that you're well trained in the languages
they happen to use at that school.
--
(espen)

israel r t

unread,
Jan 14, 2002, 5:40:14 AM1/14/02
to
On Mon, 14 Jan 2002 08:12:01 GMT, Espen Vestre
<espen@*do-not-spam-me*.vestre.net> wrote:

>The scary thing is that your experience is no exception, there are
>_so_ many representatives of this type of manager around. It's the
>kind of manager that actually thinks that hiring java or C++
>programmers is so easy because they teach these languages at "every
>school"

It also explains why managers speak so disparagingly of programmers as
"mere coders"

Immanuel Litzroth

unread,
Jan 14, 2002, 6:00:47 AM1/14/02
to

In this respect, the articles
http://www-106.ibm.com/developerworks/java/library/assignment-operator/?dwzone=java
(in which the author has a hard time finding a programmer who can
write a C++ assignment operator) and
http://oss.software.ibm.com/icu/docs/papers/cpp_report/the_assignment_operator_revisited.html
(in which the author admits that he wasn't up to the task either) are
interesting and amusing reading.

Immanuel

Kevin McFarlane

unread,
Jan 14, 2002, 6:42:59 AM1/14/02
to

"Erik Naggum" <er...@naggum.net> wrote in message
news:32199367...@naggum.net...

> * "Kevin McFarlane" <ke...@atech.globalnet.co.uk>
> | It's always going to be hard for something new to get a look in. There
> | is, of course, the economic reason that, for example, C/C++ guys are two
> | a penny but Eiffel and Smalltalk guys aren't.
>

Hi Erik

> This is one of the most misleading abuses of statistics around. Just
> because the probability that you hit a C++ programmer if you throw a
> rock into a crowd is very high, does not mean that the probability that
> he can replace _your_ C++ programmer is any higher than finding a
> replacement Eiffel or Smalltalk programmer. Because you have to weed
> through tons of idiots who only _claim_ they know C++, the effort
> required to find a real replacement may be significantly lower for
> Eiffel or Smalltalk.

But the Eiffel or Smalltalk programmer might cost you more (in salary), as
there are fewer of them to go around.

> Besides,
> if you can find a good programmer, chances are very good that he will be
> able to learn any programming language you use reasonably well in the
> time it would take to find a good C++ programmer. And learning from the
> sources of the previous programmer is a lot easier than learning the
> language from scratch in a general, application-independent way.

I agree with you. And I would like that to be the case but unfortunately it
isn't. Employers are obsessed with buzzwords. Many of the buzzwords they're
obsessed about can be learnt to a useful level within a few days by
competent programmers. But I guess it's less effort for recruiters to
concentrate on buzzwords than to assess candidates in more depth.

A friend of mine who was involved in hiring told me a story about this. Two
candidates were offered jobs. The first had all the buzzwords and impressed
the senior manager. My friend was sceptical but the senior manager won on
that occasion. The second candidate had no buzzwords but exuded competence.
The senior manager was not impressed but my friend was. This time my friend
won.

The first candidate was hopelessly out of his depth and left within a week.
The second candidate turned out to be one of their star programmers.

>
> I have actually witnessed this. A company I worked for got a new
> manager level that was completely superfluous, so the new manager had
> to prove to herself that she had a real job, and spent a lot of time
> arguing against using languages that were not mainstream, and basically
> made it hard to use anything but Java, and many good people quit. Then
> a Java man got seriously ill. She was unable to replace him in the 5
> months he was away. The other Java men could not do his work. To her
> amazement, choice of language mattered less than the other skills the
> programmers had.

Yes. This is true. BTW, have you read this? It supports your case.

"Debunking the Myth of a Desperate Software Labor Shortage"
http://heather.cs.ucdavis.edu/itaa.real.html

One of its counterintuitive findings is this:

"A study quoted in Myths and Methods: a Guide to Software Productivity by
David T. Fisher (Prentice-Hall, 1991) found that the factor Personnel
Capability, i.e. general talent and energy of the programmers, counted for
a score of 4.18 in a productivity prediction equation. This was by far the
largest factor; the next largest was Product Complexity, with a score of
only 2.36. The factor (Programming) Language Experience, i.e. experience
with a specific software skill, had the smallest score among the 15 factors
studied, with a score of only 1.20. Fisher comments:

'The relatively small impact of language knowledge is an important fact
which is not intuitively obvious. Judging by advertisements for programmers
it would seem that [IT] managers tend to overemphasize specific language
experience.'"


> The conclusion from this story that this manager actually arrived
> at was that it was bad to have skilled programmers -- she alone should
> make the design decisions and programmers would simply implement them.
> She could now return to her policy of using only mainstream languages
> and hire only unskilled programmers who lied about knowing a language.
> As far as I know, nothing interesting has happened at that company for
> a long time.
>

Unfortunately, this seems to be how most companies operate. And I can't see
it changing anytime soon.


Georg Bauhaus

unread,
Jan 14, 2002, 8:52:16 AM1/14/02
to
In comp.lang.ada israel r t <isra...@optushome.com.au> wrote:

: And Ocaml wins again ! :-)

: And speeeeeeed !
: ( As if it really matters with commodity 2 GHz processors... )

For a decision to be taken shortly, I have compared your
losers (and more of them) to SML-NJ, Ruby, and SNOBOL4.
Task: do lots of calculations with (allegedly double prec.)
reals. Result (with optimization, where I know how to achieve it
with compiler switches, with and without range checking):
losers other than Java: in the range of 10-15 time units,
ML: 45,
Java: 70 (though default compiler options in VAJ 3.5/Win),
Scripting languages: aborted, since they were already running
for a very long time doing only 10000 iterations, where the
others had run 1 million times.

It would be interesting to learn whether or not OCaml or any other
of the more pleasing very high level languages are
significantly faster than SML-NJ in this respect (since we have to
think about some real time/response time constraints).

Java seems to be adventurous anyway when it comes to fpt,
according to Kahan, so there...

(Ask me for details if you are interested.)

-- Georg

Patrick Doyle

unread,
Jan 16, 2002, 1:42:20 AM1/16/02
to
israel r t <isra...@optushome.com.au> wrote:
>On Mon, 14 Jan 2002 08:12:01 GMT, Espen Vestre
><espen@*do-not-spam-me*.vestre.net> wrote:
>
>>The scary thing is that your experience is no exception, there are
>>_so_ many representatives of this type of manager around. It's the
>>kind of manager that actually thinks that hiring java or C++
>>programmers is so easy because they teach these languages at "every
>>school"
>
>It also explains why managers speak so disparagingly of programmers as
>"mere coders"

Right: because most of them are. :-)

israel r t

unread,
Jan 16, 2002, 3:00:01 AM1/16/02
to
On Wed, 16 Jan 2002 06:42:20 GMT, doy...@eecg.toronto.edu (Patrick
Doyle) wrote:

>>It also explains why managers speak so disparagingly of programmers as
>>"mere coders"
>Right: because most of them are. :-)

Right.
Their instructors are just as bad.

I met someone who is teaching VB and Pascal at the local TAFE (a sort
of substandard community college).
He actually thought that Pascal is an OOP language!
He either does not know what OOP is or else he is confusing Borland's
proprietary Delphi extensions with Pascal.

I now see kids who can barely walk without drooling who want to "do
computers because it pays well". In another decade they would have
been barely adequate butchers or bakers. Now they become VB
programmers and "help desk" people.


---------------------------------------------------------
Devising your own font (Devanagari, pinhead graphics, etc.)
and using it in the mail is a good tactic,as is finding some way to use existing obscure fonts.
Aramaic , Pre-Rashi Hebrew and Sanskrit are particularly useful in this regard.
-- from the Symbolics Guidelines for Sending Mail

israel r t

unread,
Jan 16, 2002, 3:00:47 AM1/16/02
to
On Wed, 16 Jan 2002 06:42:20 GMT, doy...@eecg.toronto.edu (Patrick
Doyle) wrote:

>>It also explains why managers speak so disparagingly of programmers as
>>"mere coders"
>Right: because most of them are. :-)

Right.

Espen Vestre

unread,
Jan 16, 2002, 3:55:58 AM1/16/02
to
israel r t <isra...@optushome.com.au> writes:

> He either does not know what OOP is or else he is confusing Borland's
> proprietary Delphi extensions with Pascal.

IRL he is right, isn't he? I mean: Isn't Delphi the only Pascal that
can be considered to be "alive & kicking"? (I don't know Delphi,
but AFAIK, the OO extensions actually have a longer history, they
go back to Apple's Object Pascal)
--
(espen)

Christian Lynbech

unread,
Jan 16, 2002, 7:00:41 AM1/16/02
to
>>>>> "Preben" == Preben Randhol <randho...@pvv.org> writes:

Preben> On Sat, 12 Jan 2002 10:46:36 +1100, israel r t wrote:
>> Yet, if Kent is right, they may have all been " rejected ... for
>> substantial reasons and not just superficial ones."

Preben> This implies that the market is rational and logical. It has
Preben> shown time and time again that it isn't.

Exactly, the market is made up of armies of individuals who behave
according to their own small views of the world. If enough individuals
fall for a myth about something being smart and cool, then that will
be the market's collective reaction as well.

Preben> Capitalism will need to adjust itself in the near future from
Preben> the short-term gain view to a long-term view.

Unfortunately, I do not share that optimism. To survive in the market
place, it is good enough that you do no worse than your
competitors in certain areas and are better in some other
areas. So using C as an application language will not hurt a business if
the competitors do the same. Of course using something better than C
will increase the advantages of that business, but it isn't a
requirement.

So as long as enough people believe something (and the market will
lend a self-sustaining weight to a belief once it is there), it will
not be working against you, even if it is downright wrong.

A real bummer for payrollers such as myself who need to depend on
management seeing through the myths to be able to work with lisp, but
certainly also a great opportunity for the entrepreneur with a good
idea and an understanding of what lisp can do for him.

In the land of the blind, the one-eyed is king.


------------------------+-----------------------------------------------------
Christian Lynbech | Ericsson Telebit, Skanderborgvej 232, DK-8260 Viby J
Phone: +45 8938 5244 | email: christia...@ted.ericsson.dk
Fax: +45 8938 5101 | web: www.ericsson.com
------------------------+-----------------------------------------------------
Hit the philistines three times over the head with the Elisp reference manual.
- pet...@hal.com (Michael A. Petonic)

Bob Bane

unread,
Jan 18, 2002, 10:19:03 AM1/18/02
to
I was particularly impressed with his last paragraph in the second
article. For some reason, he doesn't conclude that there's something
wrong with C++. Can't imagine why...

---------BEGIN-QUOTE-----------

I don't know about you, but there's something really scary to me about a
language where copying state from one object to another is this
complicated. By now, I suspect at least a dozen or two programmers have
contributed something new to this discussion. If it takes this many
programmers to write a simple assignment operator, think how complicated
writing code that actually does something meaningful must be!
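For contrast, the operator the articles struggle with becomes much shorter with the copy-and-swap idiom. Here is a sketch using a hypothetical Buffer class of my own (not code from the cited articles):

```cpp
#include <algorithm>
#include <cstddef>
#include <cstring>

// Hypothetical class owning a raw buffer, sketching the copy-and-swap
// idiom: assignment is written once, in terms of the copy constructor
// and destructor, instead of the long checklist the articles go through.
class Buffer {
public:
    explicit Buffer(const char* s)
        : size_(std::strlen(s) + 1), data_(new char[size_]) {
        std::memcpy(data_, s, size_);
    }
    Buffer(const Buffer& other)
        : size_(other.size_), data_(new char[other.size_]) {
        std::memcpy(data_, other.data_, size_);
    }
    ~Buffer() { delete[] data_; }

    // Pass by value: the copy constructor performs the risky allocation
    // before *this is touched, so a throw leaves *this unchanged, and
    // self-assignment needs no special-case test.
    Buffer& operator=(Buffer other) {
        std::swap(size_, other.size_);
        std::swap(data_, other.data_);
        return *this;  // other's destructor releases the old buffer
    }

    const char* c_str() const { return data_; }

private:
    std::size_t size_;
    char* data_;
};
```

Because the argument is copied before `*this` is modified, the usual traps (self-assignment, leaking the old buffer on exception) never arise.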

--
Remove obvious stuff to e-mail me.
Bob Bane

René

unread,
Jan 18, 2002, 12:44:36 PM1/18/02
to
Bob Bane <ba...@removeme.gst.com> writes:

> I was particularly impressed with his last paragraph in the second
> article. For some reason, he doesn't conclude that there's something
> wrong with C++. Can't imagine why...

Hmm, I read the below as "There is something wrong with C++!". Or were you
just being ironic?

> ---------BEGIN-QUOTE-----------
>
> I don't know about you, but there's something really scary to me about a
> language where copying state from one object to another is this
> complicated. By now, I suspect at least a dozen or two programmers have
> contributed something new to this discussion. If it takes this many
> programmers to write a simple assignment operator, think how complicated
> writing code that actually does something meaningful must be!

Bjarne Stroustrup often says that complexity is inevitable for a
programming language that is widely used for many different tasks and big
projects. It may start out simple, but as different user groups get their
stuff in, the complexity ends up in the language, compiler, runtime system,
libraries and/or user code. I don't know why CL isn't more popular, but
starting anew (arc, dylan) to simplify and change some superficial features
seems like a _lot_ of work and a good risk of ending up in the same spot
years later. Everything doesn't have to be elegant, as long as there is a
known approach/idiom for expressing it.


-- René

Hyman Rosen

unread,
Jan 18, 2002, 1:04:53 PM1/18/02
to
Bob Bane wrote:

> I was particularly impressed with his last paragraph in the second
> article. For some reason, he doesn't conclude that there's something
> wrong with C++. Can't imagine why...


I can. The problem he's trying to solve is difficult; I don't see
anything in its nature which wouldn't be equally difficult in Ada.

Simon Willcocks

unread,
Jan 18, 2002, 1:44:27 PM1/18/02
to
In message <3C483CE7...@removeme.gst.com>
Bob Bane <ba...@removeme.gst.com> wrote:

Despite this he's still (presumably) hiring C++ programmers!

In message <u45hkv9...@corp.supernews.com> Kevin McFarlane wrote:

> Yes. This is true. BTW, have you read this? It supports your case.
>
> "Debunking the Myth of a Desperate Software Labor Shortage"
> http://heather.cs.ucdavis.edu/itaa.real.html

One of the claims in this document is

> [firms could] hire a generic programmer and let him/her learn the specific
> skills on the job, which any competent programmer can do within weeks.
> Refusing to hire a C-language programmer to write Java code is like a Ford
> dealer refusing to hire mechanics who have only Chevy experience, [...]

Ordinarily, I would agree, but looking at the assignment operator example I
have had second thoughts. I read the code and saw quickly why each line was
there and why it was necessary, but I'm sure I wouldn't have come up with
all of it myself! There is a simple solution, though; the coding standards
in the last project I worked on required that each class should have an
assignment operator and a copy constructor defined, but made private and
with no implementation.

I think I'll try to stick to languages with garbage collection.

Regards,
Simon Willcocks

Bruce Hoult

unread,
Jan 18, 2002, 2:36:49 PM1/18/02
to
In article <3C483CE7...@removeme.gst.com>, Bob Bane
<ba...@removeme.gst.com> wrote:


And the really really scary thing is that I know C++ well enough to know
why I need an assignment operator, and know how to write it correctly
off the top of my head (and with the inherited:: call inside the
try{}!). Most C++ places that I've worked I'm the *only* person who
would get this right.

And yet, when I tell them that their development would be better done
using Dylan or one of the languages in the "Newsgroups:" line they think
I'm crazy.

-- Bruce

Kaz Kylheku

unread,
Jan 18, 2002, 2:57:50 PM1/18/02
to
In article <bruce-4C4B58....@news.paradise.net.nz>, Bruce

Hoult wrote:
>And the really really scary thing is that I know C++ well enough to know
>why I need an assignment operator, and know how to write it correctly
>off the top of my head (and with the inherited:: call inside the
>try{}!). Most C++ places that I've worked I'm the *only* person who
>would get this right.

Most C++ programmers I have come across know only an imaginary, simplified
dialect of C++ that they have assembled from various incorrect sources,
and compiler-specific reference. It is my experience that attitudes
toward standards and language specifications in general tend to be very
poor among C++ users.

There is a lot of literature written by idiots and for idiots, purportedly
about C++, but really about a simplified dialect having little to do with
C++. The literature about languages which are not hyped up tends to be of
a much higher quality. The result is that when your typical C++ programmer
comes across a textbook about some programming language (like say one of
the ones in the Newsgroups: line), that language seems incredibly difficult
compared to the simplified language that he misunderstands C++ to be.

He doesn't understand that half of his code is not even portable to
the compiler he developed it with, but works only by fluke, and that if
he were to get it right, he would have to work in a much more complex
programming language, namely the real C++.

Marc Spitzer

unread,
Jan 18, 2002, 2:58:46 PM1/18/02
to
In article <bruce-4C4B58....@news.paradise.net.nz>,
Bruce Hoult wrote:

From their POV you are crazy: most places want to hire, for lack of a
better term, cogs. People who do their job, go home, fear management,
and are easy for management to push around. I feel the reason for this
is that management knows that they CAN replace one average (or below
average) programmer with another as long as you pick commodity people
and use commodity languages. The problem with very good or even
worse great programmers is that you need to find more people like that
to replace them; they do not fear management and are generally
"trouble makers": they have opinions that differ from management
about what to do and/or how to do it, with facts to back them up.

marc

Ed Falis

unread,
Jan 18, 2002, 3:11:28 PM1/18/02
to
Marc Spitzer wrote:

> The problem with very good or even
> worse great programmers is that you need to find more people like that
> to replace them, they do not fear management and are generally
> "trouble makers", they have opinions that differ from management
> about what to do and/or how to do it with facts to back them up.
>
> marc
>


Yes, and they also have this annoying habit of coming through with "home
runs" for the company when conventional wisdom says their approach is wrong.

- Ed

Kenny Tilton

unread,
Jan 18, 2002, 5:14:17 PM1/18/02
to

>
> Marc Spitzer wrote:
>
> > The problem with very good or even
> > worse great programmers is that you need to find more people like that
> > to replace them,

no, you only think you do. then your average programmer looks at the
code for a week and before you can set up the first interview ms/mr
average says "looks pretty straightforward to me, i can maintain this."

otoh, if mr/ms average gives up on the mess, the dearly departed most
assuredly was not great nor even very good.

early on in my career i heard a line that puzzled me until i saw OPC:
"if you have an indispensable programmer, fire them."

kenny
clinisys

Richard Riehle

unread,
Jan 18, 2002, 6:22:29 PM1/18/02
to
Hyman Rosen wrote:

Dijkstra said something once, I believe it was in "A Discipline of
Programming," about the inherent complexity of assignment
and the fact that most programmers did not understand how it actually
worked. In the same article, he suggests that, "until a programmer
really understands assignment, he [sic] does not understand programming."

In C++ or Ada (or whatever) there is no need to write an assignment
operator unless the assignment between two objects is more complex
than predefined assignment. One benefit of the limited type in Ada is
to highlight the dangers of assignment. This is also why most complex
data structures in Ada are limited types.

I am not fond of the C++ idiom for tinkering with assignment, but it is
a reasonable model given the rest of the language design. It is not clear
that Ada 95 got it right either, but it feels safer to me. I prefer Ada's
proscription against directly overloading the assignment operator. The
contract is clearer to me if it is written as,

   package P is
      type T is [tagged] limited private;
      -- no predefined methods on limited type
      -- public explicit declaration of methods on T
      procedure Copy_Deep    (Source : in T; Target : in out T);
      procedure Copy_Shallow (Source : in T; Target : in out T);
      -- pre-conditions on Copy_Deep and Copy_Shallow
      Invalid_Source : exception;
      -- post-conditions on Copy_Deep and Copy_Shallow
      Incomplete_Copy_Operation : exception;
   private
      -- private methods, if any
      -- full definition of T;
   end P;

This specification makes clear the contract and still eliminates any
possibility for stupid assignment between objects of the type.

Richard Riehle

av1...@comtv.ru

unread,
Jan 19, 2002, 2:32:53 AM1/19/02
to
Bruce Hoult wrote:

>
> And the really really scary thing is that I know C++ well enough to know
> why I need an assignment operator, and know how to write it correctly
> off the top of my head (and with the inherited:: call inside the
> try{}!). Most C++ places that I've worked I'm the *only* person who
> would get this right.
>
> And yet, when I tell them that their development would be better done
> using Dylan or one of the languages in the "Newsgroups:" line they think
> I'm crazy.
>
> -- Bruce

Was reading this thread, this message, simultaneously talking on IRC.
Let me reproduce part of one conversation:

<dAS-> it sucks.. I'm still depressed over lisp
<dAS-> like wtf..
<dAS-> if I can't get a job with it.. I see little reason to spend all my
time on it..
<dAS-> I love it..
<dAS-> but really..
<dAS-> and why do all lisp companies only request *LISP GURUS*
<dAS-> like wtf is up with that

I guess this can be applied to any non-mainstream language.


Matthew Heaney

unread,
Jan 18, 2002, 6:53:32 PM1/18/02
to

"Richard Riehle" <ric...@adaworks.com> wrote in message
news:3C48AE35...@adaworks.com...

> In C++ or Ada (or whatever) there is no need to write an assignment
> operator unless the assignment between two objects is more complex
> than predefined assignment. One benefit of the limited type in Ada is
> to highlight the dangers of assignment. This is also why most complex
> data structures in Ada are limited types.

That's why I routinely turn off copy-assignment and copy-construction in my
ADTs:

class C
{
public:
   C();
   //...
private:
   C& operator=(const C&);
   C(const C&);
   //...
};

I recently had to override assignment for a type, but that was only for a
class sans vtable (the type wasn't "tagged"). Fortunately, I haven't had to
override assignment for a tagged type.


> This specification makes clear the contract and still eliminates any
> possibility for stupid assignment between objects of the type.

What I showed above is the standard C++ idiom for doing the same.
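As an aside from a later standard than anything in this thread: in C++11 and later the same contract is usually written with deleted functions, which reject copying at the call site rather than at link time. A sketch, not tied to any code above:

```cpp
#include <type_traits>

// Noncopyable class using deleted special members (C++11 onward).
// This replaces the private-and-unimplemented idiom shown above.
class C {
public:
    C() = default;
    C(const C&) = delete;             // no copy construction
    C& operator=(const C&) = delete;  // no copy assignment
};

// The compiler can confirm the contract at compile time.
static_assert(!std::is_copy_constructible<C>::value, "no copy ctor");
static_assert(!std::is_copy_assignable<C>::value, "no copy assignment");
```

The deleted declarations also make the intent visible in the public interface, much like the Ada specification discussed earlier.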


Thomas F. Burdick

unread,
Jan 18, 2002, 9:46:29 PM1/18/02
to
Kenny Tilton <kti...@nyc.rr.com> writes:

> no, you only think you do. then your average programmer looks at the
> code for a week and before you can set up the first interview ms/mr
> average says "looks pretty straightforward to me, i can maintain this."
>
> otoh, if mr/ms average gives up on the mess, the dearly departed most
> assuredly was not great nor even very good.

Hmm, for the slightly-below-average programmer, though, often they'll
*think* they can maintain something, but in doing so will fuck up all
over the place, break perfectly nice algorithms, change things to
inefficient algorithms, etc., not because they *couldn't* have
followed the nicely-documented, cleverly written code, but because
they figured it would be easier to do things the "normal" way whenever
they had to change a part of a program. Things can stepwise decline
into a mess in the hands of people who want everything to be done the
"normal" way. The problem is that most programs are a mess, so
changing things to work the way everything else does is likely to make
it worse, if it was above-average to start with.

> early on in my career i heard a line that puzzled me until i saw OPC:
> "if you have an indispensible programmer, fire them."

I understand what this is supposed to mean, but what about the very
well-read, educated programmer who knows (or knows how to find) better
ways of doing lots of things than the normal, naive method/algorithm?
Such a programmer can be indispensable for things that can't be done
in the naive way (for efficiency or complexity reasons, or whatever).


Mad Hamish

unread,
Jan 19, 2002, 7:55:59 PM1/19/02
to
On Wed, 16 Jan 2002 08:55:58 GMT, Espen Vestre
<espen@*do-not-spam-me*.vestre.net> wrote:

>israel r t <isra...@optushome.com.au> writes:
>
>> He either does not know what OOP is or else he is confusing Borlands
>> proprietary Delphi extensions with Pascal.
>
>IRL he is right, isn't he? I mean: Isn't Delphi the only Pascal that
>can be considered to be "alive & kicking"?

Well Pascal tended to hang around for quite a while in education areas
because
a) it was originally designed for education
b) they had it and what's the point of buying a new environment?

Yahoo lists a couple of compilers available for Pascal now.

> (I don't know Delphi,
>but AFAIK, the OO extensions actually have a longer history, they
>go back to Apple's Object Pascal)

Delphi was built on top of Object Pascal, I'm not sure who originally
did Object Pascal.
--
"Hope is replaced by fear and dreams by survival, most of us get by."
Stuart Adamson 1958-2001

Mad Hamish
Hamish Laws
h_l...@bigpond.com

Bruce Hoult

unread,
Jan 19, 2002, 8:18:02 PM1/19/02
to
In article <3c4a149d...@news.bigpond.com>, h_l...@bigpond.com (Mad
Hamish) wrote:

> > (I don't know Delphi,
> >but AFAIK, the OO extensions actually have a longer history, they
> >go back to Apple's Object Pascal)
>
> Delphi was built on top of Object Pascal, I'm not sure who originally
> did Object Pascal.

Niklaus Wirth spent a year on sabbatical at Apple, designing Object
Pascal with them.

-- Bruce

Hyman Rosen

unread,
Jan 20, 2002, 12:39:58 AM1/20/02
to
Richard Riehle wrote:

> procedure Copy_Deep (Source : in T; Target : in out T);
> procedure Copy_Shallow (Source : in T; Target : in out T);


I dislike the notion of providing these methods as external interfaces
to a type. It seems wrong to me for a client to have to know what kind
of copy to use - that's the type's business to know.

Kaz Kylheku

unread,
Jan 20, 2002, 12:59:42 AM1/20/02
to

This turns out to be circular reasoning, because what a type ``knows''
is partially defined by the operations you can perform on it.

Suppose that you have N methods for copying an object. I do not take it
as a theorem that you can always make an (N + 1)-th method which unifies
the other N, magically choosing the appropriate one. How can this magic
meta-method possibly know what semantics the caller wants?

Software Scavenger

unread,
Jan 20, 2002, 10:15:25 AM1/20/02
to
k...@accton.shaw.ca (Kaz Kylheku) wrote in message news:<i1t28.632$XG4....@news2.calgary.shaw.ca>...

> Suppose that you have N methods for copying an object. I do not take it
> as a theorem that you can always make an (N + 1)-th method which unifies
> the other N, magically choosing the appropriate one. How can this magic
> meta-method possibly know what semantics the caller wants?

A big part of the problem is that copy semantics are tied closely to
the memory management paradigm in use. With garbage collection, it's
common for an assignment statement to simply copy a pointer or
reference such that the target of the assignment refers to the same
object. Without garbage collection, it's much more complicated,
because the target of the assignment statement has to know whether to
free the object when done with it. Names such as deep copy and
shallow copy do not cover the complexities, and might just add more
confusion.

In languages with garbage collection, the best way might be to define
copy to mean shallow copy, and use other names than copy for all other
copying operations. For example, to make a copy on a remote computer,
you could use a name such as send. Thus we would be talking about
three levels of copying: Assignment, meaning to copy the reference to
make something else refer to the same object, copy, meaning to copy
the object itself so it can be modified without modifying the original
object, and the third level, where the copy operation would be named
send or whatever name might be appropriate for the purpose of each
such copy operation.
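The three levels described above can be sketched concretely. Python is used here purely as an illustration (the thread's own languages differ); `copy.copy` and `copy.deepcopy` stand in for the "copy" and "send"-style operations:

```python
import copy

original = {"id": 1, "tags": ["a", "b"]}

# Level 1: "assignment" -- copy the reference; both names denote one object.
alias = original
alias["id"] = 2
assert original["id"] == 2          # change is visible through both names

# Level 2: "copy" -- a new top-level object, but nested objects are shared.
shallow = copy.copy(original)
shallow["id"] = 3
assert original["id"] == 2          # top-level fields are now independent...
shallow["tags"].append("c")
assert original["tags"] == ["a", "b", "c"]   # ...but nested objects are shared

# Level 3: "send" -- a fully independent replica, as if shipped elsewhere.
replica = copy.deepcopy(original)
replica["tags"].append("d")
assert original["tags"] == ["a", "b", "c"]   # deep copy shares nothing
```

The three assertions on `original` are exactly where the confusion the post describes tends to bite: only the deepest level leaves the source untouched by later mutation.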

Richard Riehle

unread,
Jan 20, 2002, 8:42:40 PM1/20/02
to
Hyman Rosen wrote:

Well I don't like having an assignment statement that behaves in an
unpredictable way when I use it. So I prefer having the copy semantics
spelled out as clearly as possible. This clarity assures me that, when
I call a method, it will behave as I expected. From my perspective,
clarity is an essential feature of a type. Behaving according to my
expectations is another nice feature for any method, whether it is
assignment or some other method.

As to the client needing to know which copy to use, in practice we
would make only one available, but we would make clear the semantics
of that copy. The other point is that using Ada's limited types, we
don't have to worry about making mistakes regarding the assignment
operation.

Richard Riehle


Hyman Rosen

unread,
Jan 21, 2002, 4:52:08 AM1/21/02
to
Richard Riehle wrote:

> Hyman Rosen wrote:
>>Richard Riehle wrote:
>>> procedure Copy_Deep (Source : in T; Target : in out T);
>>> procedure Copy_Shallow (Source : in T; Target : in out T);
>>>
>>I dislike the notion of providing these methods as external interfaces
>>to a type. It seems wrong to me for a client to have to know what kind
>>of copy to use - that's the type's business to know.
>
> Well I don't like having an assignment statement that behaves in an
> unpredictable way when I use it.


I think we can agree on this with unbridled enthusiasm :-)

> So I prefer having the copy semantics spelled out as clearly as possible.

> This clarity assures me that, when I call a method, it will behave as
> I expected.

But the user of a type generally isn't concerned with its innards.
Why would you take a simple concept like copying or assignment and
burden it with implementation details? If you were writing a sort
method for a container, wouldn't you just call it "sort" rather than
"merge_sort" or "bubble_sort"?

And if we were to say "do as Ada does", consider Unbounded_String.

> As to the client needing to know which copy to use, in practice we
> would make only one available, but we would make clear the semantics
> of that copy. The other point is that using Ada's limited types, we
> don't have to worry about making mistakes regarding the assignment
> operation.


Well, the normal semantics of a copy are that the source and destination
are not tied to each other after the copy is made. Of course, internally
that might not be true, because you could be maintaining reference
counts, and doing copy-on-write. You would place all that information
into the routine name?

And limited types are irrelevant here. C++ and Ada can both prevent
assignment from compiling. The problem was writing assignment correctly
when you do want to provide it.

Richard Riehle

unread,
Jan 21, 2002, 11:38:39 AM1/21/02
to
Hyman Rosen wrote:

> But the user of a type generally isn't concerned with its innards.
> Why would you take a simple concept like copying or assignment and
> burden it with implementation details? If you were writing a sort
> method for a container, wouldn't you just call it "sort" rather than
> "merge_sort" or "bubble_sort"?

In fact, I do care about this. The performance characteristics
of each kind of sort are important. Under some circumstances
I want to use a Merge Sort utility and other times I want a
Quicksort. This is one place where I can benefit from a knowledge
of the Big O notation.

> Well, the normal semantics of a copy are that the source and destination
> are not tied to each other after the copy is made. Of course, internally
> that might not be true, because you could be maintaining reference
> counts, and doing copy-on-write. You would place all that information
> into the routine name?

I will want to have information in the contract that gives the client of
my specification an opportunity to choose which method to invoke,
and what to expect from it. It might be quite useful for a client to
know that a method is managed or not managed, but this does not
have to be part of the name. In my example, I included preconditions
and postconditions in the form of meaningful exception names. These also
provide some information for the client.

In the legal world, there is something called "due notice." Any contract,
including a type/class specification, needs to be explicit.

> And limited types are irrelevant here. C++ and Ada can both prevent
> assignment from compiling. The problem was writing assignment correctly
> when you do want to provide it.

Actually, when considering the need for clarity, I think there is a difference.

The technique for preventing assignment in C++ is not, in my mind, as
clear as the declaration of a limited type. The fact that some construct
can be expressed in a particular language does not mean it is best
expressed in that language. We will simply have to agree to disagree
on the point regarding limited types.

Richard Riehle

Kaz Kylheku

unread,
Jan 21, 2002, 3:30:39 PM1/21/02
to
In article <3C4BE556...@mail.com>, Hyman Rosen wrote:
>Richard Riehle wrote:
>
>> Hyman Rosen wrote:
>>>Richard Riehle wrote:
>>>> procedure Copy_Deep (Source : in T; Target : in out T);
>>>> procedure Copy_Shallow (Source : in T; Target : in out T);
>>>>
>>>I dislike the notion of providing these methods as external interfaces
>>>to a type. It seems wrong to me for a client to have to know what kind
>>>of copy to use - that's the type's business to know.
>>
>> Well I don't like having an assignment statement that behaves in an
>> unpredictable way when I use it.
>
>
>I think we can agree on this with unbridled enthusiasm :-)
>
>> So I prefer having the copy semantics spelled out as clearly as possible.
>
> > This clarity assures me that, when I call a method, it will behave as
> > I expected.
>
>But the user of a type generally isn't concerned with its innards.

That is false.

>Why would you take a simple concept like copying or assignment and
>burden it with implementation details? If you were writing a sort
>method for a container, wouldn't you just call it "sort" rather than
>"merge_sort" or "bubble_sort"?

The semantics of copying an object are not transparent to the user in
a way that choice of sorting algorithm is transparent.

All sorting algorithms which ensure the same postconditions are
effectively equivalent except for performance.

Different ways of copying an object do not ensure the same postconditions.
They are semantically different, in a way that may matter critically
to the caller.

Here is a slightly better analogy: do you want a stable sort or can you
live with an unstable one? The difference is visible to the caller.
If a stable sort is essential, but an unstable one is invoked, that
is a mistake.

There can be lots of other variations: what comparison function is
used for the elements? Each function leads to a different sort. There
are effectively as many sort operations as there are ways to compare
the elements.

Usually, we arrange for the caller to be able to specify the computation,
so we can offer all of these sorts behind one interface. The function
sorts, but the caller specifies *how*. It's the same thing with object
replication. We can copy in various ways, the caller specifies *how*.
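That "one interface, caller specifies how" arrangement can be sketched in a few lines. This is an illustrative Python sketch, not any particular library's API; the function name `replicate` and the strategy labels are invented for the example:

```python
import copy

def replicate(obj, how):
    """One copy interface; the caller names the semantics it needs,
    just as a sort caller supplies the comparison function."""
    strategies = {
        "reference": lambda o: o,    # share the very same object
        "shallow":   copy.copy,      # new container, shared elements
        "deep":      copy.deepcopy,  # fully independent replica
    }
    return strategies[how](obj)

data = [[1, 2], [3, 4]]
assert replicate(data, "reference") is data
sh = replicate(data, "shallow")
assert sh is not data and sh[0] is data[0]     # postconditions differ...
dp = replicate(data, "deep")
assert dp is not data and dp[0] is not data[0] # ...per strategy, visibly
```

The point the post makes survives intact: the three branches do not ensure the same postconditions, so no "magic" default could pick among them for the caller.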

Raffael Cavallaro

unread,
Jan 22, 2002, 12:35:26 AM1/22/02
to
Hyman Rosen <hyr...@mail.com> wrote in message news:<3C4BE556...@mail.com>...

> Well, the normal semantics of a copy are that the source and destination
> are not tied to each other after the copy is made. Of course, internally
> that might not be true, because you could be maintaining reference
> counts, and doing copy-on-write. You would place all that information
> into the routine name?

Why not?

clone
(returns a deep copy, with no references to original)

link
(shallow copy - just another reference to the original)

clone-on-write
(what it suggests - starts as a "link," but becomes a "clone" when a
client attempts to write to it.)
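The third of those, clone-on-write, is the only one that needs machinery. A minimal Python sketch of the idea (the class `CowRef` and its method names are hypothetical, invented for this example):

```python
import copy

class CowRef:
    """Clone-on-write wrapper: behaves as a 'link' until the first
    write, at which point it silently becomes a 'clone'."""

    def __init__(self, target):
        self._target = target
        self._owned = False          # still sharing the original

    def read(self, key):
        return self._target[key]

    def write(self, key, value):
        if not self._owned:          # first write: clone, then diverge
            self._target = copy.deepcopy(self._target)
            self._owned = True
        self._target[key] = value

original = {"x": 1}
link = CowRef(original)
assert link.read("x") == 1           # reads go to the shared original
link.write("x", 99)                  # this triggers the clone
assert link.read("x") == 99
assert original["x"] == 1            # the original is untouched
```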

Patrick Doyle

unread,
Jan 22, 2002, 9:25:09 AM1/22/02
to

I must be crazy. I have never written a system where the semantics
of the "copy" operation mattered. Mainly, this is for two reasons:

1. A lot of my data is immutable, so the copying semantics are moot.

2. Instead of copying an existing object, my usual idiom is to create a
new one from scratch.

I suppose there are some cases where these two things don't apply, or
would be inappropriate, and one would have to think very carefully about
the semantics of copying, but I can't think of such a case off the top
of my head.

I suppose that only says something about my inexperience as a programmer.
Can anyone show an example of a situation where copying semantics really
matter?

--
Patrick Doyle
doy...@eecg.toronto.edu

Steven T Abell

unread,
Jan 22, 2002, 10:47:41 AM1/22/02
to
> Can anyone show an example of a situation where copying semantics really
> matter?

Sure.
Imagine you have an object with some simple attributes,
and some connections to other complex objects,
and you want to save this object to a file.
A shallow copy would only take those attributes that were simple,
for example, numbers and strings.
A deep copy would also copy out all of the connected complex objects,
and all of their connected objects, and all of their connected objects, etc.
For a great many objects that you will actually build,
a deep copy will eventually copy the entire image,
or get stuck in a copy cycle,
where some referent of the original object
refers back to the original object.
There are tools, such as in VisualWorks,
that scan the copy graph before copying
to work around this cyclic copying.
They work, but they are often unavoidably slow.
The exact meaning of "copy" turns out to be a *really* hard problem:
there is no single answer that works in all cases.

One place where copy depth is often a problem is in copying collections.
The usual copy creates a duplicate collection
with references to the same objects as the original collection.
This might be what you want, or it might not.
For this kind of copy,
you have to remember that changes to a collection element
will appear in *both* collections,
since both collections refer to that object.
A deeper copy copies the elements also,
but the question arises: How deeply do you copy?
Once again, there is no simple answer.
You have to think very carefully about your application's needs
and then code your copy semantics just as carefully.
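The copy-cycle problem, and the graph-scanning workaround the post mentions, can be seen in miniature. This Python sketch relies on the fact that `copy.deepcopy` carries a memo table of already-copied objects (roughly the "scan the copy graph" technique), so a cycle is reproduced rather than recursed into forever:

```python
import copy

# Build a two-node cycle: each node refers back to the other.
a = {"name": "a", "peer": None}
b = {"name": "b", "peer": a}
a["peer"] = b

# A naive recursive copy would never terminate here. deepcopy's memo
# table ensures each node is copied exactly once, so the copy is a
# fresh, structurally identical cycle.
a2 = copy.deepcopy(a)
assert a2 is not a
assert a2["peer"]["peer"] is a2      # the cycle is preserved in the copy
assert a2["peer"] is not b           # ...with fresh nodes throughout
```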

Steve
--
Steven T Abell
Software Designer
http://www.brising.com

In software, nothing is more concrete than a good abstraction.

Hyman Rosen

unread,
Jan 22, 2002, 12:01:55 PM1/22/02
to
Simon Willcocks wrote:

> I think I'll try to stick to languages with garbage collection.


But garbage collection has nothing to do with this problem!

The problem is handling an assignment when the type contains
bases and subobjects that must be copied, and doing such
copying could cause exceptions. You must arrange for the
copying to succeed, or else leave the destination object in
a consistent state, as well as propagate exceptions back to
the caller.

Kaz Kylheku

unread,
Jan 22, 2002, 1:00:08 PM1/22/02
to
In article <GqCG1...@ecf.utoronto.ca>, Patrick Doyle wrote:
>Kaz Kylheku <k...@ashi.footprints.net> wrote:
>>In article <3C4A58B...@mail.com>, Hyman Rosen wrote:
>>>Richard Riehle wrote:
>>>
>>>> procedure Copy_Deep (Source : in T; Target : in out T);
>>>> procedure Copy_Shallow (Source : in T; Target : in out T);
>>>
>>>I dislike the notion of providing these methods as external interfaces
>>>to a type. It seems wrong to me for a client to have to know what kind
>>>of copy to use - that's the type's business to know.
>>
>>This turns out to be circular reasoning, because what a type ``knows''
>>is partally defined by the operations you can perform on it.
>>
>>Suppose that you have N methods for copying an object. I do not take it
>>as a theorem that you can always make an (N + 1)-th method which unifies
>>the other N, magically choosing the appropriate one. How can this magic
>>meta-method possibly know what semantics the caller wants?
>
>I must be crazy. I have never written a system where the semantics
>of the "copy" operation mattered. Mainly, this is for two reasons:
>
>1. A lot of my data is immutable, so the copying semantics are moot.
>
>2. Instead of copying an existing object, my usual idiom is to create a
>new one from scratch.

Anecdotal evidence. I have written systems where it mattered.

Here is a recent example: a protocol stack where I have three ways
to copy a network buffer buffer object. I can copy a pointer, and
increment a reference count. I can copy the header structure which
maintains various pointers into the buffer, and bump up a lower reference
count on the buffer. Or I can do a deep copy which copies the buffer
as well.

This data is not immutable, so the copying semantics are not moot.

>I suppose there are some cases where these two things don't apply, or
>would be inappropriate, and one would have to think very carefully about
>the semantics of copying, but I can't think of such a case off the top
>of my head.
>
>I suppose that only says something about my inexperience as a programmer.
>Can anyone show an example of a situation where copying semantics really
>matter?

How about a Lisp example?

(eq 'a 'a) ==> T

(eq 'a (copy-symbol 'a)) ==> NIL

A copy of a symbol is usually useless, because a symbol is meaningful
precisely because it is EQ to itself. So you usually need shallow,
reference copies of a symbol.

If a symbol is embedded in an object, and you want to copy that object,
you probably want to copy those embeddded symbols by reference. And then
you are no longer making an entirely new object from scratch; the common
symbols are shared substructure.
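The same point carries over to any language with identity-significant objects. A rough Python analogue, with a sentinel object standing in for the Lisp symbol (the names here are invented for the example):

```python
import copy

# A sentinel is meaningful only by identity, like a Lisp symbol:
# a fresh copy of it would fail every identity test against existing uses.
RED = object()

record = {"color": RED, "sizes": [1, 2]}
clone = copy.copy(record)

assert clone is not record                 # the record itself was duplicated
assert clone["color"] is RED               # the "symbol" is shared by reference
assert clone["sizes"] is record["sizes"]   # shallow copy shares substructure
```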

Bruce Hoult

unread,
Jan 22, 2002, 4:59:48 PM1/22/02
to
In article <3C4D9B0...@mail.com>, Hyman Rosen <hyr...@mail.com>
wrote:

Garbage collection has a *lot* to do with this problem!

A large proportion of assignments in languages such as C++ are done
merely in order to resolve the question of who owns an object and who is
responsible for eventually deleting it.

With GC available, objects are seldom copied; pointers to them are
passed around instead. You don't care who "owns" the object, because it
just magically goes away when all the pointers to it get forgotten.

-- Bruce

Hyman Rosen

unread,
Jan 22, 2002, 5:08:51 PM1/22/02
to
Bruce Hoult wrote:

> Garbage collection has a *lot* to do with this problem!
>
> A large proportion of assignments in languages such as C++ are done
> merely in order to resolve the question of who owns an object and who is
> responsible for eventually deleting it.
>
> With GC available, objects are seldom copied; pointers to them are
> passed around instead. You don't care who "owns" the object, because it
> just magically goes away when all the pointers to it get forgotten.


You don't get to redefine the problem so that your favorite technique
becomes the solution.

Jim Rogers

unread,
Jan 22, 2002, 5:09:58 PM1/22/02
to
Bruce Hoult wrote:

> Garbage collection has a *lot* to do with this problem!
>
> A large proportion of assignments in languages such as C++ are done
> merely in order to resolve the question of who owns an object and who is
> responsible for eventually deleting it.
>
> With GC available, objects are seldom copied; pointers to them are
> passed around instead. You don't care who "owns" the object, because it
> just magically goes away when all the pointers to it get forgotten.

Copying pointers and using GC or even reference counting is a very
nice solution for single threaded applications. It becomes a very
messy solution for multi-threaded applications. Now all those
copied pointers need to be protected from inappropriate simultaneous
access by several threads.

In threaded applications GC solves no concurrency issues, but it
does complicate them. How can a semaphore be "held" by only one
thread if only one instance exists, and all threads hold a
pointer to that instance? How can you keep one thread from
consuming all the free store or "heap"?

Obviously there are answers to all those questions.
Unfortunately, those answers are generally pretty messy unless
someone has designed your GC solution with concurrency in mind.

Jim Rogers
Colorado Springs, Colorado USA

Kaz Kylheku

unread,
Jan 22, 2002, 5:54:08 PM1/22/02
to
In article <3C4DE336...@worldnet.att.net>, Jim Rogers wrote:
>Bruce Hoult wrote:
>
>> Garbage collection has a *lot* to do with this problem!
>>
>> A large proportion of assignments in languages such as C++ are done
>> merely in order to resolve the question of who owns an object and who is
>> responsible for eventually deleting it.
>>
>> With GC available, objects are seldom copied; pointers to them are
>> passed around instead. You don't care who "owns" the object, because it
>> just magically goes away when all the pointers to it get forgotten.
>
>Copying pointers and using GC or even reference counting is a very
>nice solution for single threaded applications. It becomes a very
>messy solution for multi-threaded applications. Now all those
>copied pointers need to be protected from inappropriate simultaneous
>access by several threads.

By definition, something that is copied isn't shared, and so doesn't
have to be protected. What it points to is shared, so that's a different
matter.

Making copies of every object is not a realistic solution in multithreaded
programs, no matter what language they are written in. You can do it
sometimes, but not always.

If you must share references to objects, GC is far cleaner and safer
than reference counting techniques. It will automatically ensure that no
thread will delete an object while it is still in use by another thread.

>In threaded applications GC solves no concurrency issues, but it
>does complicate them. How can a semaphore be "held" by only one
>thread if only one instance exists, and all threads hold a
>pointer to that instance?

How can a semaphore be useful unless there is one shared instance
of it? Do you know the first thing about threading? Doh!

>How can you keep one thread from
>consuming all the free store or "heap"?

This is a problem regardless of your memory management strategy, if you
admit dynamic storage allocation. If you insist on making copies for
the sake of avoiding sharing, you will only make this concern worse.
Garbage collection eliminates memory leaks, thus lessening the
concern.

>Obviously there are answers to all those questions.
>Unfortunately, those answers a generally pretty messy unless
>someone has designed your GC solution with concurrency in mind.

Do you know what GC is? It simply means computing what objects are unreachable
and liberating their storage for future allocations.

It has nothing to do with the semantics of those objects while they
are still live. Garbage collected or not, shared objects have to be
protected from concurrent access.

Of course a garbage collector has to be correctly implemented to be used
in a multithreaded environment. However, it remains easy to use; the
user simply remains unconcerned about what happens to unreachable
objects.

Kenny Tilton

unread,
Jan 22, 2002, 6:22:28 PM1/22/02
to

GC does not redefine the problem, it eliminates it. Which was the point.

kenny
clinisys

Jim Rogers

unread,
Jan 22, 2002, 6:47:51 PM1/22/02
to
Kaz Kylheku wrote:


Of course Doh! I was thinking about the concept of many handles
to the same object, without the object state itself identifying
the "owner".


>
>>How can you keep one thread from
>>consuming all the free store or "heap"?
>>
>
> This is a problem regardless of your memory management strategy, if you
> admit dynamic storage allocation. If you insist on making copies for
> the sake of avoiding sharing, you will only make this concern worse.
> Garbage collection eliminates memory leaks, thus lessening the
> concern.
>


Aha. There are solutions to this problem, with or without GC.
The solution is to provide a separate storage pool for each
thread. That storage pool could then be managed manually or by
GC. With this design, one thread consuming its storage pool would
not interfere with the storage usage of other threads.

If you do use GC you will need some way of partitioning the
GC work among the storage pools. This could be done in a single
GC, or by implementing many GC's.


>
>>Obviously there are answers to all those questions.
>>Unfortunately, those answers are generally pretty messy unless
>>someone has designed your GC solution with concurrency in mind.
>>
>

> Do you know what GC is? It simply means computing what objects are unreachable
> and liberating their storage for future allocations.
>


Yes, I know this. It appears to me that computing unreachability
across a multi-processor system is much more difficult than with
the simpler memory model of a uni-processor system. Am I wrong?


> It has nothing to do with the semantics of those objects while they
> are still live. Garbage collected or not, shared objects have to be
> protected from concurrent access.


Yes, and that concurrent sharing complicates the reachability
algorithm.

While operations on general shared memory objects do not need
to be, and indeed often cannot be, atomic, operations on
semaphores must be atomic. This includes reachability
calculations. It would seem that an object that is unreachable
must remain unreachable. In general this is true. With out of order
instruction execution in some multi-processor systems this cannot
be guaranteed.


>
> Of course a garbage collector has to be correctly implemented to be used
> in a multithreaded environment. However, it remains easy to use; the
> user simply remains unconcerned about what happens to unreachable
> objects.
>


Certainly, a GC properly implemented for a multithreaded environment
should be just as easy to use as one implemented in a single threaded
environment. This does, however, argue for a language providing GC
to also implement a threading model as part of the language standard.
Mixing a threaded implementation with a GC designed only for single
threaded applications could lead to very disappointing results.

Larry Kilgallen

unread,
Jan 22, 2002, 8:43:47 PM1/22/02
to

> Aha. There are solutions to this problem, with or without GC.
> The solution is to provide a separate storage pool for each
> thread. That storage pool could then be managed manually or by
> GC. With this design, one thread consuming its storage pool would
> not interfere with the storage usage of other threads.

What happens when one thread passes an object to another thread ?

Jim Rogers

unread,
Jan 22, 2002, 9:04:03 PM1/22/02
to
Larry Kilgallen wrote:


You have three choices.

1) Only objects allocated from a global storage pool can be passed
between threads.
2) (Yuck) Objects are only passed by copy, not by reference.
3) Any object can be passed by reference. Objects from a local
storage pool cannot be allocated by any but their local thread.
This option really makes deallocation, either manual or GC, very
hard to get right. :-(

The first option is often enforced by multi-processor systems,
where shared data must be in a special region of memory visible
to all processors. In Ada terms, this region of memory would be
used only for protected objects.

Of course, with this option, it is still possible for a single
thread to unfairly dominate memory usage. As in all cases, you
still need to design intelligently to limit storage pool
contention.

Bruce Hoult

unread,
Jan 22, 2002, 9:26:12 PM1/22/02
to
In article <3C4DE2F3...@mail.com>, Hyman Rosen <hyr...@mail.com>
wrote:

> Bruce Hoult wrote:

Yes you do. The aim is to solve the customer's problem. They don't
give a damn whether you copy objects or pass them by reference. They
don't even know what the options *mean*!

-- programs are hard to analyse for correctness because of GOTOs.
Solution: don't use them

-- programs are hard to analyse for correctness because of destructive
assignment. Solution: don't use assignment.

-- programs are often buggy because of off-by-one errors in loop
control. Solution: use implicit loops and/or mapping functions
controlled by the size of the collection they are operating on.

-- programs are often buggy because of errors in memory management.
Solution: automate memory management.

Greg C

unread,
Jan 23, 2002, 11:01:49 AM1/23/02
to
Jim Rogers <jimmaure...@worldnet.att.net> wrote in message news:<3C4DE336...@worldnet.att.net>...
[...]

> Copying pointers and using GC or even reference counting is a very
> nice solution for single threaded applications. It becomes a very
> messy solution for multi-threaded applications. Now all those
> copied pointers need to be protected from inappropriate simultaneous
> access by several threads.
>

LOL! Jim, this sounds like an excellent example of a problem where
it's important to distinguish between deep and shallow copy operators,
which has been a large part of the debate in this thread in the first
place...

Greg

Darren New

unread,
Jan 23, 2002, 12:02:02 PM1/23/02
to
Jim Rogers wrote:
> 2) (Yuck) Objects are only passed by copy, not by reference.

Actually, this is only "Yuck" with primitive semantics. For example, a
functional program uses this type of semantic (only copy) all the time,
but the compiler is smart enough to figure out when to copy and when not
to. Hermes did a similar thing, and the compiler was good enough that it
not only usually passed by reference in spite of the pass by copy
semantics, it often managed to not pass the value at all, and simply
compiled the address of the caller's variable into the callee where it
could.

Think about something like SQL and whether it's easy or hard for the
programmer to make it work right on distributed processors.

--
Darren New
San Diego, CA, USA (PST). Cryptokeys on demand.
The opposite of always is sometimes.
The opposite of never is sometimes.

Georg Bauhaus

unread,
Jan 23, 2002, 12:45:17 PM1/23/02
to
In comp.lang.ada Bruce Hoult <br...@hoult.org> wrote:


: A large proportion [...ownership/deleting]
^^^^^

: With GC available, objects are seldom copied
^^^^^^

Yes, but what does frequency of an operation say about the importance
of the availability of the respective copying operation?

When an object has to be backed up (transaction, retransmission,
adventurous options of changing its state with backtracking,
recursively splitting processing to different processors/memories,
saving to files, ...), another pointer to serve the manipulation
is essentially useless, and GC or not GC is simply not the point.

For a concrete language example, why are there expanded objects in
Eiffel? And how does this relate to GC, and multitasking?

(I'm asking myself these questions to get a clear picture,
they are not merely rhetorical.)

Georg

Richard Riehle

unread,
Jan 23, 2002, 12:49:10 PM1/23/02
to
Bruce Hoult wrote:

> In article <3C4DE2F3...@mail.com>, Hyman Rosen <hyr...@mail.com>
> wrote:
> > You don't get to redefine the problem so that your favorite technique
> > becomes the solution.
>
> Yes you do. The aim is to solve the customer's problem. They don't
> give a damn whether you copy objects or pass them by reference. They
> don't even know what the options *mean*!

It depends on what you mean by customer. A customer of a class/type
declaration is usually another programmer. I am a member of that
class of customers, and I do care and I do know what the options mean.
One of the things I like about Eiffel is the careful attention given to the
design of the class contract. I want to know, from that contract, what
to expect of my engaging it. For example, what kinds of things can go
wrong, what are my responsibilities as a user of the contract, and what
performance considerations I might want to consider in choosing a
particular contract. And, yes, I want to know if it includes some form
of storage management, and have some idea of what effect that storage
management feature will have on my own program.

The next few observations remind me of something I once heard regarding
programming rules. "Those who make up rules about programming tend
to be people who no longer write production programs."

> -- programs are hard to analyse for correctness because of GOTOs.
> Solution: don't use them

Generally true. Knuth has a lengthy article on this in his book, Literate
Programming. To simply say "Don't use go to" is a bit simplistic. I
have seen perfectly good examples where, during fine tuning of a program,
someone has achieved significant performance improvement with a go to.
Some languages support goto-less programming better than others. Some
eliminate the option altogether and also eliminate the benefits of it.
Go to is an option that should be used sparingly, almost never, but is
handy when you actually need it. I realize one can "prove" that it is
never needed. So be it. Sometimes it just might be useful.

> -- programs are hard to analyse for correctness because of destructive
> assignment. Solution: don't use assignment.

I understand this is a joke. However, it conforms nicely to some of the
fundamental notions of functional programming.

> -- programs are often buggy because of off-by-one errors in loop
> control. Solution: use implicit loops and/or mapping functions
> controlled by the size of the collection they are operating on.

Well, in Ada this never seems to be a problem. In Eiffel it is never
a problem. It is often a problem in programs that use the C family
of languages. However, I think C# has a fix for this.

> -- programs are often buggy because of errors in memory management.
> Solution: automate memory management.

A naive suggestion, at best. Which automated form of memory management
will you suggest? There are many from which to choose, each with its
own benefits, depending on the kind of software you are writing. This is
one of the things C++ gets right. Ada also. The problem of automated
memory management is one of the things that makes Java ill-suited to
many kinds of embedded, real-time software applications. Jim Rogers
makes a good point about this vis-a-vis Ada. In Ada, we have the option
of selecting the memory management model we wish to use, letting it be
automatic or not, and targeting each type to a different model of memory
management when that is appropriate. This is a level of flexibility
not typical of most other languages. Yes, I know one can do this in C++,
but expressible and expressive are not the same thing. Ada is really
expressive in allowing this kind of automated memory management for
a given type. That is as it should be, since Ada is intended for safety-
critical, real-time embedded software systems.

Richard Riehle

Hyman Rosen

unread,
Jan 23, 2002, 4:12:11 PM1/23/02
to
Kenny Tilton wrote:

> Hyman Rosen wrote:
>>You don't get to redefine the problem so that your favorite technique
>>becomes the solution.
>
> GC does not redefine the problem, it eliminates it. Which was the point.

It does no such thing. The original problem specified an object with
two subobjects which were owned by the parent and with a non-empty
base object. You must write assignment for this type so that the
destination acquires its own private copies of the subobjects of the
source, and such that if copying any subcomponent fails (with an
exception) you leave the destination object undamaged.

You do *not* get to hand-wave away the requirement of copying the
subobjects. Say, for example, they represent a pair of buffers. After
the copy is made, the source object and the destination object are
independent - when one writes into its buffers, the other must not
see any change to its own.

Tim Bradshaw

Jan 23, 2002, 5:24:46 PM
* Hyman Rosen wrote:

> It does no such thing. The original problem specified an object with
> two subobjects which were owned by the parent and with a non-empty
> base object. You must write assignment for this type so that the
> destination acquires its own private copies of the subobjects of the
> source, and such that if copying any subcomponent fails (with an
> exception) you leave the destination object undamaged.

Well, one thing here is a terminological issue - in many GCd languages
assignment would not make a copy, so assignment and copying are
conceptually different operations.

But what GC does solve is not the copying problem - which, in Lisp at
least (together with the related equality problem), is regarded as an
issue which does not have a general solution nor any hope of one: Lisp
people at least do not regard copying as trivial - but the problem of
needing to copy to resolve low-level ownership issues. If you have to
manually manage memory then it's crucial that only one person frees
the memory associated with some object, and one (bad) way of resolving
this is, if you need two handles on something, to actually make a
copy, so each copy can then be independently freed. In a GCd language
this is a non-problem since the reuse of memory is dealt with by the
GC, so you can happily pass around a reference to the thing instead.

--tim

Kenny Tilton

Jan 23, 2002, 7:15:23 PM

Perhaps I misunderstood. The problem of /copying/ a structure is
clear--how deep is something only the application can decide. Fine. Now
what is all this about assignment having to worry about copying?

The story I saw seemed to think you needed both a copy and assignment
method in C++, and that the assignment method also had to worry about
what to copy (and blowing up during a copy).

Why confuse the two? I just pass pointers (if you will) around.
assignment is always by reference. Automatic GC gets to clean up after
me. Life is good. When I /copy/ a structure...well, I'll have to get
back to you if it ever comes up.

:)

kenny
clinisys

Hyman Rosen

Jan 24, 2002, 3:14:38 AM
Kenny Tilton wrote:
> Perhaps I misunderstood. The problem of /copying/ a structure is
> clear--how deep is something only the application can decide. Fine. Now
> what is all this about assignment having to worry about copying?

It was in the problem spec. Anyway, the (now) canonical way to do this
in C++, thanks to Herb Sutter's "Exceptional C++", is to make objects
have a non-throwing swap() method, and base assignment on copying:

struct s
{
    void swap(s &other) throw(); // non-throwing swap
    s(const s &other);           // copy constructor
    s &operator=(const s &other)
    {
        s(other).swap(*this);
        return *this;
    }
};

> The story I saw seemed to think you needed both a copy and assignment
> method in C++, and that the assignment method also had to worry about
> what to copy (and blowing up during a copy).

Assignment exists in C++ not least because it exists in C. Like Ada, the
default is to do a memberwise copy. If you choose to allow assignment in
your class, you have to get it right.

Ray Blaak

Jan 24, 2002, 11:50:54 AM
Hyman Rosen <hyr...@mail.com> writes:
> It was in the problem spec. Anyway, the (now) canonical way to do this
> in C++, thanks to Herb Sutter's "Exceptional C++", is to make objects
> have a non-throwing swap() method, and base assignment on copying:
>
> struct s
> {
>     void swap(s &other) throw(); // non-throwing swap
>     s(const s &other);           // copy constructor
>     s &operator=(const s &other)
>     {
>         s(other).swap(*this);
>         return *this;
>     }
> };

Interesting. Is a discussion of the tradeoffs of this approach online
anywhere? I note an original presentation at http://www.gotw.ca/gotw/059.htm.

My only criticism of this approach is that one now has two places to
maintain detailed field-by-field knowledge of the class: in the copy
constructor and in swap().

For maintenance reasons, I prefer this kind of approach, although there are
downsides with exception handling and dealing with virtual vs non-virtual
assignment operators:

class C
{
public:
    C(const C &other)
    {
        *this = other;
    }

    C &operator=(const C &other)
    {
        if (this != &other)
        {
            // field by field assignment/cleanup, perhaps to temp vars first
            // to allow intelligent exception processing.
        }
        return *this;
    }
};

--
Cheers, The Rhythm is around me,
The Rhythm has control.
Ray Blaak The Rhythm is inside me,
bl...@telus.net The Rhythm has my soul.

AG

Jan 25, 2002, 7:40:00 PM
> Bruce Hoult wrote:
>
> > Garbage collection has a *lot* to do with this problem!
> >
> > A large proportion of assignments in languages such as C++ are done
> > merely in order to resolve the question of who owns an object and who is
> > responsible for eventually deleting it.
> >
> > With GC available, objects are seldom copied with pointers to them being
> > passed around instead. You don't care who "owns" the object, because it
> > just magically goes away when all the pointers to it get forgotten.

This doesn't seem to be a question of something "going away"; it's a
question of how the object(s) behave while they are still in use. For
example:

Suppose you have a process A which has something (an object or whatever)
called X it wants to pass to a process B. The process B must be free to do
whatever it wants to/with that X thing. However, the process A also wants to
keep going with the X after B is done with whatever it was doing (and I
won't even mention the case where it wants to do it *while* B is at it).
This sort of thing can be very useful in roll-back situations, or when some
duplicates are in fact completely independent in their handling but just
happen to come from the same source, etc. Pointers/references are totally
useless in such cases; you do need a separate copy and, if the language
doesn't provide it, you are reduced to hand-coding it yourself, with all
the pitfalls that may entail.

If my vague recollection of vintage Pascal is correct, it did have a
specific distinction between P^ := Q^; and P := Q; which was, of course,
the difference between the shallow and [one-level] deep copy. Now, just
where does a GC come into play in this difference? Mind you, that vintage
Pascal could have had the GC implemented or not - the semantics would be
exactly the same.

The same applies to your argument above - it doesn't really matter what
happens to the storage after it's out of use. The question is/was how do
you implement the semantics while it's still around.


Patrick Doyle

Jan 26, 2002, 12:37:51 PM
In article <IMh38.15646$XG4.6...@news2.calgary.shaw.ca>,

Kaz Kylheku <k...@ashi.footprints.net> wrote:
>
>Here is a recent example: a protocol stack where I have three ways
>to copy a network buffer buffer object. I can copy a pointer, and
>increment a reference count. I can copy the header structure which
>maintains various pointers into the buffer, and bump up a lower reference
>count on the buffer. Or I can do a deep copy which copies the buffer
>as well.
>
>This data is not immutable, so the copying semantics are not moot.

Ok, so when you copy a container object--like lists, arrays, etc.--you
need to know whether to copy its contents too. Is this the same issue?

>If a symbol is embedded in an object, and you want to copy that object,
>you probably want to copy those embeddded symbols by reference. And then
>you are no longer making an entirely new object from scratch; the common
>symbols are shared substructure.

I have a hard time picturing how this is implemented. Does one just
contain a reference to part of the other?

Patrick Doyle

Jan 26, 2002, 12:32:23 PM
In article <3C4D8AC7...@brising.com>,

Steven T Abell <ab...@brising.com> wrote:
>> Can anyone show an example of a situation where copying semantics really
>> matter?
>
>Sure.
>Imagine you have an object with some simple attributes,
>ans some connections to other complex objects,
>and you want to save this object to a file.

Ok, I must admit, that's a good one. It's not really a copying issue
per se, but a persistence issue, though they have a lot in common.

I'm no expert on object persistence, so I'll have to leave it at that. :-)

Robert Dewar

Jan 27, 2002, 10:08:01 PM
doy...@eecg.toronto.edu (Patrick Doyle) wrote in message news:<GqCG1...@ecf.utoronto.ca>...

> I must be crazy. I have never written a system where the
> semantics of the "copy" operation mattered. Mainly, this
> is for two reasons:
>
> 1. A lot of my data is immutable, so the copying semantics are moot.
>
> 2. Instead of copying an existing object, my usual idiom is to create a
> new one from scratch.

Not crazy, but unusual. You are saying that you do not
do assignments of the form

a := b;

where a and b are both objects and that you do not
use IN OUT parameters. It is certainly possible to write
code with these constraints, but definitely unusual.

Patrick Doyle

Jan 28, 2002, 1:47:38 PM
In article <5ee5b646.02012...@posting.google.com>,

Robert Dewar <de...@gnat.com> wrote:
>doy...@eecg.toronto.edu (Patrick Doyle) wrote in message news:<GqCG1...@ecf.utoronto.ca>...
>> I must be crazy. I have never written a system where the semantics
>> of the "copy" operation mattered. Mainly, this is for two reasons:
>>
>> 1. A lot of my data is immutable, so the copying semantics are moot.
>>
>> 2. Instead of copying an existing object, my usual idiom is to create a
>> new one from scratch.
>
>Not crazy, but unusual. You are saying that you do not
>do assignments of the form
>
> a := b;
>
>where a and b are both objects and that you do not
>use IN OUT parameters.

No, actually I'm saying what I said. ;-)

>It is certainly possible to write code with these constraints, but
>definitely unusual.

It is most certainly not unusual. For instance, Java has neither
object-wise assignment nor in-out parameters.

Do you mean it's unusual in some particular language?

David Combs

Feb 11, 2002, 2:18:01 AM
In article <ZUm48.2916$Lv.3...@news.xtra.co.nz>,

AG <nos...@nowhere.co.nz> wrote:
>> Bruce Hoult wrote:
>>
>> > Garbage collection has a *lot* to do with this problem!
>> >
>> > A large proportion of assignments in languages such as C++ are done
>> > merely in order to resolve the question of who owns an object and who is
>> > responsible for eventually deleting it.
>> >
>> > With GC available, objects are seldom copied with pointers to them being
>> > passed around instead. You don't care who "owns" the object, because it
>> > just magically goes away when all the pointers to it get forgotten.
>
>This doesn't seem to be a question of something "going away", it's a
>question
>of how the object(s) behave while they are still in use. For example:
>
>Suppose you have a process A which has something (an object or whatever)
>called X it wants to pass to a process B. The process B must be free to do
>whatever it wants to/with that X thing. However, the process A also wants to
>keep going with the X after the B is done with whatever it was doing

Coroutines (across processes) help here? You do have to
know (not just when?) but *where*, in the source, you want
to "resume" the other one. Not at all "automatic".

How *does* one nicely-handle such a situation? "Automatically"?


How about this one (maybe simple, obvious, for you guys,
but not for me):

You're writing a program in a language that has gc,
eg lisp (or Mainsail, the language I use -- derived
from (ancient) SAIL (Stanford's Algol-like syntaxed competitor to
lisp as ai language -- lost, of course),

and from one or more places in the program written
in the language-with-gc, you want to call out to
something written in a *different* language,
which also has a *different* memory-mngt scheme --
maybe gc, maybe not.

And you want to pass a ref to a data structure
owned by the first program into the second program.

So far, simple enough. The with-gc language
is stopped between the call into the other-languaged
program and the return back in.

No problem with a gc occurring *while* the 2nd program
is playing with the (shared) data structure.

HOWEVER -- suppose that that called 2nd-program needs
to call a function back in the 1st program.

NOW we DO have a problem: that call back into the
first program, the one with memory-mngt via gc,
just might eat up enough pieces of newly allocated
memory that it triggers a gc!

Thus pulling the rug out from under the 2nd program --
since its reference back into the first program's
data structure will be pointing to garbage, if
the "shared" data structure gets slid somewhere
else by the gc.

QUESTION: what kind of solutions are there to
this problem?

(The one I've been using is to turn off the
gc during the first call out, and re-activating
it when that first call returns -- the hope
being that the 2nd call back in doesn't run
out of memory! (Actually, we make it do
a preemptive gc before we freeze everything.)

So far, so good....)

What are some more, er, robust approaches?

Thanks!

David


Ray Blaak

Feb 11, 2002, 12:50:13 PM
dkc...@panix.com (David Combs) writes:
> [...] from one or more places in the program written in the

> language-with-gc, you want to call out to something written in a *different*
> language, which also has a *different* memory-mngt scheme -- maybe gc, maybe
> not.
[...]
> And you want to pass a ref to a data structure
> owned by the first progtam into the second program.

[...and what happens if this shared data is collected while still in use in
the second program?]

You need support from the gc-language to "lock" objects to that the gc will
not collect them.

That is, you need a way of marking objects as being under manual memory
management.

Many gc-languages provide such hooks. The Java Native Interface, for example,
allows the programmer to add/remove references to Java objects from outside
Java, so as to control whether or not they are gc'd.

Kaz Kylheku

Feb 11, 2002, 3:13:31 PM
In article <a47r79$jf$1...@news.panix.com>, David Combs wrote:
>In article <ZUm48.2916$Lv.3...@news.xtra.co.nz>,
>AG <nos...@nowhere.co.nz> wrote:
>>> Bruce Hoult wrote:
>>>
>>> > Garbage collection has a *lot* to do with this problem!
>>> >
>>> > A large proportion of assignments in languages such as C++ are done
>>> > merely in order to resolve the question of who owns an object and who is
>>> > responsible for eventually deleting it.
>>> >
>>> > With GC available, objects are seldom copied with pointers to them being
>>> > passed around instead. You don't care who "owns" the object, because it
>>> > just magically goes away when all the pointers to it get forgotten.
>>
>>This doesn't seem to be a question of something "going away", it's a
>>question
>>of how the object(s) behave while they are still in use. For example:
>>
>>Suppose you have a process A which has something (an object or whatever)
>>called X it wants to pass to a process B. The process B must be free to do
>>whatever it wants to/with that X thing. However, the process A also wants to
>>keep going with the X after the B is done with whatever it was doing
>
>Coroutines (across processes) help here? You do have to
>know (not just when?) but *where*, in the source, you want
>to "resume" the other one. Not at all "automatic".

Not really, because A and B want to use the object in parallel. Coroutines
make the synchronization problem go away, because coroutine A is suspended
when coroutine B is running and vice versa. The passing of control
is explicit. With processes you don't have this, except maybe with
non-preemptive (cooperative) threads.

The solutions are to copy the object upfront, or to do some lazy copying:
copy on write. This is what happens with the memory pages of a process
on a modern Unix system when you do fork(). Otherwise if B destructively
manipulates an object that A is also using, you have an instant race
condition.

Of course, there is the possibility of making a thread-aware object which
can be shared in some disciplined way.

>NOW we DO have a problem: that call back into the
>first program, the one with memory-mngt via gc,
>just might eat up enough pieces of newly allocated
>memory that it triggers a gc!
>
>Thus pulling the rug out from under the 2nd program --
>since its reference back into the first program's
>data structure will be pointing to garbage, if
>the "shared" data structure gets slid somewhere
>else by the gc.

Firstly, you may be confusing GC with compacting-GC. It's not a
necessity for garbage collection to actually move data around to
eliminate fragmentation. The primary job of garbage collection is to
hunt down unreachable memory and liberate it. If the GC in question is
non-compacting, you have nothing to worry about, (so long as the non-GC
domain isn't the only one with remaining references to the object, so
that it looks unreferenced in the GC domain).

If objects *can* be moved by the collector, what you can do is implement
the ability to lock some objects to prevent them from moving. Any objects
that are passed into the non-GC domain are locked, and then their
previous lock status is restored upon returning.

Making copies of objects into the non-GC domain, which have to be
liberated there, is also not out of the question. In the reverse
direction, any object created in the non-GC domain that you want to bring
into the GC domain will almost certainly have to be reallocated into
GC storage. The non-GC domain can be equipped with a special function
for allocating GC storage for objects in preparation for passing them to
the GC domain, so a copy operation doesn't have to take place.

Since the non-GC domain is probably written in some stone age language
with manual memory allocation, its programmers won't mind learning some
special way of allocating memory. :)

>QUESTION: what kind of solutions are there to
>this problem?
>
>(The one I've been using is to turn off the
>gc during the first call out, and re-activating
>it when that first call returns -- the hope
>being that the 2nd call back in doesn't run
>out of memory! (Actually, we make it do
>a preemptive gc before we freeze everything.)

This is equivalent to locking down everything, which is simpler, but
unnecessary.

Tom Hawker

Feb 13, 2002, 11:31:35 AM
Bruce Hoult <br...@hoult.org> wrote in message news:<bruce-35A637....@news.paradise.net.nz>...
>
> Niklaus Wirth spent a year on sabbatical at Apple, designing Object
> Pascal with them.
>
> -- Bruce

Two thoughts.

If I remember correctly, Pascal was originally designed based on a
provability calculus. That is, Pascal programs were supposed to be
provably correct (don't ask me to define that), which is why there
were so many things "missing", such as C-equivalents "break" and
"goto". Provability gave way to usability...

If you read back a few messages about mindless managers and mere
coders, I thought of a very frightening parallel. Anyone out there
read "Atlas Shrugged" by Ayn Rand? How about The Peter Principle?
This sounds all too much like promotion to incompetence or epidemic
ignorance and apathy. Would that John Galt were here, or perhaps
*not* here...

-- Tom

Larry Kilgallen

Feb 13, 2002, 12:51:26 PM
In article <43d7620c.02021...@posting.google.com>, tsh...@qwest.com (Tom Hawker) writes:

> If I remember correctly, Pascal was originally designed based on a
> provability calculus. That is, Pascal programs were supposed to be
> provably correct (don't ask me to define that), which is why there
> were so many things "missing", such as C-equivalents "break" and
> "goto". Provability gave way to usability...

DEC's SVS project to provide an A1 Security Kernel on VAX was written
in Pascal, with special caution taken to omit the Pascal runtime library
DEC generally used, because that was written in a lower level language.
This had to do with provability of the implementation.

The project was dumped after going into Field Test because DEC found out
there was no real customer demand. (NCSC wanted other agencies to use
A1 systems, other agencies did not want to spend their money on that.)

Tom Hawker

Feb 13, 2002, 4:17:05 PM
dkc...@panix.com (David Combs) wrote in message news:<a47r79$jf$1...@news.panix.com>...

>
> QUESTION: what kind of solutions are there to
> this problem?
>
> What are some more, er, robust approaches?
>
> Thanks!
>
> David

Try a different language.

The Cincom Smalltalk implementation allows access to C calls (which,
of course, will get you to C++ if you're careful). The function must
reside in a shared library that loads into the Smalltalk address space
and can access -- through controls -- object memory. Object memory
runs under the GC; accessing an object from C "locks" it so it
doesn't get collected or moved on you. Dynamic memory allocation from
C (or C++) uses the standard malloc() calls. I know that some
Smalltalk implementations (such as GNU) actually have a private
version of the malloc routines (same semantics) to control memory
zones.

The interface is rather flexible, supporting semantics to create
objects that look like C structures internally and are converted to
native C structures on function invocation. The invocation will even
permit spawning the function as a thread, but there are more
restrictions placed on those.

The calling mechanism uses C's pass-by-value semantics. But since you
can pass object references (OOPs), you have access to everything else
as well. There is some support for semaphores, but I think it's
rather limited. Thunking, or a reverse call from C to Smalltalk, is
possible using Smalltalk blocks, but I do not know how this is
handled.

-- Tom

Tom Hawker

Feb 13, 2002, 6:12:18 PM
Immanuel Litzroth <imma...@enfocus.be> wrote in message news:<m2advh9...@enfocus.be>...
>
> In this respect the articles...
>
> Immanuel

I read through both of these articles. I have also read through all
of the responses to this subthread. I am not a C++ expert, but I have
a background in computer languages. I would not have gotten a
"perfect" solution, but the problems associated with "owning" pointers
were obvious, and really should be so to anyone with a basic OO
background, which I guess eliminates most of the C++ programmers. ;-)
My solution did not include transactions (the try/catch stuff),
because I would have assumed a failure to be catastrophic (an
oversight of the stated problem requirements). Unwise, perhaps...

But all that aside, I am surprised that everyone seems to have missed
an underlying concept. Most of this thread goes into all the funny
semantics of assigning, copying, and garbage collection behavior,
while overlooking the simplest fact of all: the hassles come about
because you're dealing with a strongly typed language! (Ada's limited
private types only make the contract cleaner to visualize, they don't
overcome the associated problems. And don't get me started on the
supposed advantages of strong vs. weak/dynamic typing.)

GCed languages usually eliminate the problem of "assignment" because
there is no such feature: they do dynamic typing/dynamic binding,
replacing references (virtual pointers) to objects rather than the
contents. You have to write special methods to implement assignment,
and I have found only a handful of cases where that is desirable, let
alone necessary, and after years of thought I'm not certain about
those. In such cases I've used "copy_from", where the argument is the
source whose state is copied into the receiver. (Destructors for
releasing system resources are usually invoked manually or can be
through GC finalization mechanisms, which automate the necessary
cleanup even in cases that would otherwise be memory leaks.)

Now, all of the things said about "owned" versus "aliased" references
must be taken into account. In my experience a dynamically typed
language will never have a slicing problem. But ownership of
contained objects must still be managed. Where the language doesn't
support assignment per se, then implicitly this means "owned" objects
must be copied (that is, cloned) in the usual way for the language
when assignment is implemented, which is what you'd expect.

Which leads to the discussion of copy semantics. Either the language
or the type must define the semantics of what "copy" means. It
certainly is easier when the language specifies a default
interpretation. But there are valid reasons for having shared_copy
(which does NOT do ownership copies), shallow_copy (which copies or
clears contained objects not meant to be shared, such as a dictionary
cache), and deep_copy, which copies everything and its brother,
sister, cousin, and aunt. Some data structures cannot wisely
implement deep copies without some smarts to prevent infinite
replication, such as duplicating graph structures.

I guess my point to all of this is that I will contend that the
problem is in the language and not the individual. Yes, an
unfortunate number of our colleagues may not really understand the
complexities of languages they use. (Let's leave personal indolence
and insolence out of it, shall we?) Yes, management usually feels it
needs to exert its authority on issues for which it is not competent
to evaluate the options. (Let's skip the ego thing, too.) But *why*
do I need to propagate such insanities where there are better ways?

Stroustrup et al. and his philosophy notwithstanding, with a background
in languages and a frequent, practical user of compiler technology, my
opinion professionally is that C++ as a language absolutely sucks. It
is a pain to use simply because all of the issues noted in this thread
are counter-intuitive. It doesn't matter in the least whether they
are useful to have. There is simply no reason whatsoever that the
normal, introductory, or just plain naive programmer should ever feel
like disemboweling him/herself trying to get something to work when
the trouble is related to obscure language semantics. (I'm sorry,
this is one of the reasons I dislike Ada: coming up with the Posix
bindings was almost an exercise in futility because of all the
restrictions on packages and types.) In that regard, Objective-C is
much cleaner as a strongly typed language, since it is a real albeit
hybrid OO language, where the OO-ness has been neatly layered onto the
original language.

And so that leads to objections about changes to the [base] language.
I must disagree: changes are inevitable. We always find better ways
to do things, or eventually discover that the original way was not as
extensible (read that, "OO") as one would like, or even that some way
we depend on is flatly wrong, and so we "improve" it. I've had to
port system implementations across 8 versions of a single language,
and it is never easy when you've made system (base code) extensions or
changes. (One particularly nasty variant was a rewrite of the
graphics system!) Having a language "protected" by a "standard" still
doesn't prevent gratuitous changes. The best one can do is adapt
programming practices that help to minimize the effects of such
rudeness. But don't whine about it, because the language we favor
most today may very well either be changed or obsolete tomorrow...

-- Tom

Peter Gummer

Feb 13, 2002, 6:28:28 PM
tsh...@qwest.com (Tom Hawker) wrote in message news:<43d7620c.02021...@posting.google.com>...

> If I remember correctly, Pascal was originally designed based on a
> provability calculus. That is, Pascal programs were supposed to be
> provably correct (don't ask me to define that), which is why there
> were so many things "missing", such as C-equivalents "break" and
> "goto".

Wirth's original Pascal *did* have goto. In fact, it could jump right
out of the current procedure up the call stack, so it was actually a
bit like C's longjmp() function!

Hartmann Schaffer

Feb 13, 2002, 8:56:26 PM
In article <43d7620c.02021...@posting.google.com>,
tsh...@qwest.com (Tom Hawker) writes:

> If I remember correctly, Pascal was originally designed based on a
> provability calculus. That is, Pascal programs were supposed to be
> provably correct (don't ask me to define that), which is why there
> were so many things "missing", such as C-equivalents "break" and
> "goto". Provability gave way to usability...

i am quite sure that you remember incorrectly ;-): Pascal had goto
(the labels were numeric, which helped the argument that gotos made
code unreadable). afair it was tony hoare who used pascal in his
provability calculus (there were some rumors that he was deeply
disturbed by how easy it turned out to be to include goto in the
calculus)

> ...

hs

--

how are we defending our freedom and democracy by dismantling them?

Greg C

Feb 13, 2002, 10:10:43 PM
peter_...@hotmail.com (Peter Gummer) wrote in message news:<3b0c4a5.02021...@posting.google.com>...

The first Pascal compilers were in fact written in Pascal and that
code contained "goto"s.

Greg

Barry Watson

Feb 14, 2002, 4:59:05 AM

Tom Hawker wrote:
>
> Bruce Hoult <br...@hoult.org> wrote in message news:<bruce-35A637....@news.paradise.net.nz>...
> >
> > Niklaus Wirth spent a year on sabbatical at Apple, designing Object
> > Pascal with them.
> >
> > -- Bruce
>
> Two thoughts.
>
> If I remember correctly, Pascal was originally designed based on a
> provability calculus. That is, Pascal programs were supposed to be
> provably correct (don't ask me to define that), which is why there
> were so many things "missing", such as C-equivalents "break" and
> "goto". Provability gave way to usability...

You're thinking of Concurrent Pascal, which didn't even have recursive
function calls, the idea being that you could predict stack usage in
advance. I think ordinary Pascal was designed to be self-documenting
rather than provable.
