The thing is that for a relatively short period of time you could get
fractionally better performance by using StringBuffer instead of
concatenating with the + operator. So some people wrote articles
about this, and it entered the collective folklore.
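In Java terms the folklore looks like this (a minimal sketch; the names are invented). Worth knowing: any javac since Java 5 compiles the + version into a StringBuilder chain anyway (Java 9+ uses invokedynamic instead), so for a one-liner there is nothing left to hand-optimise:

```java
public class Concat {
    // The "naive" version the articles warned against.
    static String withPlus(String user, int count) {
        return "user=" + user + ", count=" + count;
    }

    // The "optimised" folklore version; same result, more noise.
    static String withBuffer(String user, int count) {
        StringBuffer sb = new StringBuffer();
        sb.append("user=").append(user).append(", count=").append(count);
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(withPlus("ada", 3).equals(withBuffer("ada", 3))); // prints "true"
    }
}
```
(The one place an explicit builder still earns its keep is concatenating inside a loop.)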
A similar example is ArrayList. For a relatively short period of time
ArrayList was slightly faster than Vector. Now it isn't
(synchronisation got much cheaper a couple of language versions back). So
we should _almost never_ use ArrayLists, because Vectors are _always_
safer. But, because it entered the collective folklore, most
libraries prefer the wrong one.
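A minimal sketch of the two choices (the list contents are invented): Vector synchronises every method for you, while an ArrayList has to be wrapped by hand if you ever need that. Neither makes compound operations atomic, of course.

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.Vector;

public class Lists {
    // Vector: every method is synchronized out of the box.
    static List<String> vectorVersion() {
        List<String> v = new Vector<>();
        v.add("a");
        v.add("b");
        return v;
    }

    // ArrayList: not synchronized; if you do want thread safety,
    // you have to remember to wrap it yourself.
    static List<String> arrayListVersion() {
        List<String> a = Collections.synchronizedList(new ArrayList<>());
        a.add("a");
        a.add("b");
        return a;
    }

    public static void main(String[] args) {
        // Element for element, the two lists are equal.
        System.out.println(vectorVersion().equals(arrayListVersion())); // prints "true"
    }
}
```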
Also, Object Creation. Though we did kind of inherit this one, now
that 'new' is much faster (in many cases) than malloc, we should not
be afraid of slapping 'new' liberally throughout our programs. Go
wild, go crazy, do that OO thang. However, because it is in our
collective folklore, you get all these people jumping through some
very strange hoops (e.g. Singletons, Spring and all sorts of weird and
whacky Factory anti-patterns) to avoid using new.
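The hoops might look something like this hypothetical sketch (Widget and WidgetFactory are made-up names); both paths end in the same cheap 'new', so the extra layers buy nothing here:

```java
public class Widgets {
    static class Widget {
        final String label;
        Widget(String label) { this.label = label; }
    }

    // The folklore version: a singleton factory interposed purely
    // out of fear of 'new' -- it still calls 'new' underneath.
    static class WidgetFactory {
        private static final WidgetFactory INSTANCE = new WidgetFactory();
        static WidgetFactory getInstance() { return INSTANCE; }
        Widget create(String label) { return new Widget(label); }
    }

    public static void main(String[] args) {
        // Two allocations, same cost either way.
        Widget direct = new Widget("direct");
        Widget viaFactory = WidgetFactory.getInstance().create("factory");
        System.out.println(direct.label + " " + viaFactory.label); // prints "direct factory"
    }
}
```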
------
My point isn't that you should be an expert at the byte code level of the JVM.
Quite the opposite. Knuth says it best: "premature optimisation is
the root of all evil".
My point is to challenge the assumptions and folklore. Some of that
is inherited from Computer Science (like the 'new is bad' and 'string
concatenation is bad' examples I have given).
That said, I personally think that learning about the byte code and
learning computer science are worthy (and interesting) goals in and of
themselves. So I did.
------
There has been a lot of discussion about the mythical "average"
programmer. The language change discussion for instance tended to use
this mythical beast as a kind of bogeyman. E.g. "you can't change the
language because average programmers are too stupid to cope with
change". And on the other side of that ideological fence, they also
flog the same bogeyman. E.g. "if you don't like generics it is
because you are too stupid to cope with change".*
*Not something that I've heard recently, but something that the
proponents of generics ended up saying a lot in frustration before and
for some time after generics were thrust upon us.
And what I would say is that the "average" programmer is what they are.
No more, no less. Just because they go home at the end of the day and
watch Big Brother or Neighbours, and don't eat, breathe, and sleep
programming, does not mean that they are bad or stupid. We shouldn't
patronise them, and we shouldn't underestimate them.
At the same time, we shouldn't pretend that "if only they had some
computer science exposure" they would become uber programmers. For
starters, most of the "average" programmers I've worked with had
Computer Science degrees.
And it seemed like on this podcast they were saying (and I know Joel
says it a lot) that the way to fix bad/average programmers is to
educate them in computer science.
------
That said, if someone is already a successful programmer - without
having had any exposure to computer science - then adding a sprinkling
of comp sci is probably going to make them even better. The other
thing that I'd do is try to get them hooked on phonics. Or, well, the
programming equivalent, which is a highly readable coding style. If
they can see the benefit of readability and either develop their own
style or copy someone else's, then that would cover a multitude of
sins.
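A tiny made-up before/after of what I mean by readable (same behaviour either way, and the names are invented):

```java
public class Readability {
    // Before: terse, single letters, everything crammed on one line.
    static int f(int[] a){int s=0;for(int i=0;i<a.length;i++)s+=a[i];return s;}

    // After: says what it does, so the reader doesn't have to decode it.
    static int sumOf(int[] values) {
        int total = 0;
        for (int value : values) {
            total += value;
        }
        return total;
    }

    public static void main(String[] args) {
        int[] data = {1, 2, 3};
        System.out.println(f(data) == sumOf(data)); // prints "true"
    }
}
```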