
Python evangelists unite!


bruce...@hotmail.com

Nov 29, 2001, 5:28:40 PM
Okay, I was extolling the benefits of Python to a friend of mine. He
took strong exception to Python's OO model. <here we go again...>

ME: Python's great--you can add members to instances OR classes
on-the-fly!
FRIEND: Why would you want to do that?
ME: Uh...
FRIEND: Besides, that's awful Object Orientation. If I start adding
attributes to an instance of a class, it ceases to be an instance of
that class. If I create a bunch of instances of the same class, they
should be the same; they should have the same members.
ME: Yeah, but with dynamicism, I can add a new pane to a GUI
while it's running. I just change an instance to include a new pane,
and...
FRIEND: That's pretty cool, but it's not a reason, in and of itself,
to make a language so dynamic. There must be some advantage to being
able to add attributes during runtime. What are they?
ME: Uh...
FRIEND: And another thing! What's with encapsulation? There's no
private!?!?!?
ME: <Screaming and running for cover>

So, I need some help. I've checked out c.l.p and some on-line
articles, and I just can't find good practical examples of a program
that adds members to instances (or classes) at runtime. Anybody got
any? I don't want to have to start avoiding my friend....

TIA,
b.

Peter Milliken

Nov 29, 2001, 6:18:00 PM
Start avoiding him :-)

There is no real justification that works for these "features" of Python -
they can be convenient sometimes but generally violate a lot of "standards"
considered good practice for general software engineering (first lesson I
learnt was don't write self-modifying code! :-)).

I use Python as a good, quick and dirty hacking language. For real (read
production) stuff that I expect a customer to run or will require more than
a single person working for a couple of hours, I look elsewhere :-). Sure
there are examples of Python being used for "large" jobs - and very
successful they have been too - but these people are masochists (IMO) :-).
They could have been more productive with other languages that provide
better support for generic software engineering principles/standards.

There, that should bring the 'vangelists out of the woodwork! :-) It's just
too easy, they rise like starving trout - each language has its strengths
and weaknesses. Python has some very nice features, I use it a good deal.
But it definitely has its place!

In the meantime, my advice is don't get "emotionally" hooked into the
language. Analyse its strengths and weaknesses, consider other languages and
their strengths and weaknesses and then you can hold an intelligent
conversation with another programmer! :-)

Peter

<bruce...@hotmail.com> wrote in message
news:baf2f841.0111...@posting.google.com...

Peoter Veliki

Nov 29, 2001, 6:48:56 PM
> I use Python as a good, quick and dirty hacking language. For real (read
> production) stuff that I expect a customer to run or will require more than
> a single person working for a couple of hours, I look elsewhere :-).
 
What do you use for production?  I'm just curious, not challenging your opinion.  Is there an interpreted language that you would use before you would use Python?  Perl?  Or would you stick to compiled languages for production?

bru...@tbye.com

Nov 29, 2001, 5:51:38 PM

> Okay, I was extolling the benefits of Python to a friend of mine. He
> took strong exception to Python's OO model. <here we go again...>
...
> FRIEND: Besides, that's awful Object Orientation.

I'd ask, "according to whom?" This is a pretty typical knee-jerk reaction
to something new. "Aargh! This is a different way of doing things! MUST BE
BAD!" There are definite cases where both dynamic classes and dynamic
instances are useful, but before getting too defensive in answering the
"why?" I'd personally push back a little more with "why not?" (Also I'd
point out that just because classes and instances can be so dynamic does
not mean that most or even a lot of them are. It just means that the
programmer has that power available, if needed.)

> If I start adding attributes to an instance of a class, it ceases to
> be an instance of that class.

That's silly. If I put on a hat, do I cease to be an instance of the
Person class? In truth, all less dynamic languages *wish* they had this
feature. Since they don't, developers instead have two additional pieces
of work when creating a new class: (1) they have to figure out up front
the total collection of possible attributes/properties instances can have,
and (2) they have to set them to some null-like value if those attrs/props
are not present. So instead of an instance simply not having a property,
all other instances have to have a property with a value that means they
don't have that property. Wow.
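
A minimal sketch of the contrast Dave describes (class and attribute names are made up for illustration):

```python
class Person:
    def __init__(self, name):
        self.name = name

p = Person("Alice")
q = Person("Bob")

# Add an attribute to one instance only, on the fly; q is unaffected.
p.hat = "fedora"

print(hasattr(p, "hat"))   # True
print(hasattr(q, "hat"))   # False

# The less-dynamic alternative: every instance must carry a placeholder,
# even when the placeholder just means "no hat".
class StaticPerson:
    def __init__(self, name):
        self.name = name
        self.hat = None    # declared up front for all instances
```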

> FRIEND: And another thing! What's with encapsulation? There's no
> private!?!?!?

Yawn. ;-) Python provides enough protection against accidental
private-member usage. Unlike, say, Java, Python does not assume you are
an imbecile. IMO your friend probably has not really pondered the true
benefits (if any) of 'private'.
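
A sketch of the "enough protection" in question, assuming a hypothetical class: single underscores signal "internal" by convention, and double underscores get name-mangled, which guards against accidents rather than determined access.

```python
class Account:
    def __init__(self, balance):
        self._balance = balance   # single underscore: internal by convention
        self.__log = []           # double underscore: name-mangled by Python

acct = Account(100)

print(acct._balance)              # 100 -- reachable, but the name warns you off
print(hasattr(acct, "__log"))     # False -- plain access is blocked...
print(acct._Account__log)         # [] -- ...but only against *accidents*
```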

For a huge class of negative reactions to something different, the first
response should often be, "well, have you actually tried it yet?" It's so
common to hear people say that some feature is bad because they can
imagine a case where supposed harm could come from using that feature.

Interesting that you rarely hear people say stuff like "well, I came from
using C++. Now that I've used Python for awhile I can honestly say that
those publicly-accessible members caused all sorts of problems." Instead
you hear, "uh, yeah, I guess it *could* cause a problem. Never has
happened to me though."

-Dave


Delaney, Timothy

Nov 29, 2001, 6:02:24 PM
> From: bruce...@hotmail.com [mailto:bruce...@hotmail.com]

>
> So, I need some help. I've checked out c.l.p and some on-line
> articles, and I just can't find good practical examples of a program
> that adds members to instances (or classes) at runtime. Anybody got
> any? I don't want to have to start avoiding my friend....

Suggest he look at unit tests. With unit testing, you are supposed to write
the tests first, then the code.

Unfortunately, in most languages, this is simply not possible. For example,
you can't compile the unit tests for a Java package if you don't have at
least a bare skeleton of each class existing (all methods present, etc.).

OTOH, this is incredibly simple in Python. The method doesn't exist yet?
Throw an exception! The class doesn't exist? Throw an exception! Which is
exactly what you want in a unit test - you can code one bit at a time, and
gradually your tests start succeeding.

Tim Delaney

Fernando Pérez

Nov 28, 2001, 12:57:46 PM
bruce...@hotmail.com wrote:

> Okay, I was extolling the benefits of Python to a friend of mine. He
> took strong exception to Python's OO model. <here we go again...>
>

[snip]


>
> So, I need some help. I've checked out c.l.p and some on-line
> articles, and I just can't find good practical examples of a program
> that adds members to instances (or classes) at runtime. Anybody got
> any? I don't want to have to start avoiding my friend....
>

I am currently working on a replacement for the python interpreter
which depends *critically* on this. It allows the user to define
special commands to control the interpreter and access the underlying
shell on the fly, and this requires dynamical generation of method
code (via exec with fancy tricks with string formatting) and then
binding those methods on the fly to the currently running
interpreter. Obviously you can do that in C with a heinous mess of
pointer tables, but the point is, having those features available to
me is the difference between actually producing the program vs.
saying 'Obviously you can do that in C with a heinous mess of pointer
tables' and never actually doing it (because it's so ugly and tricky).
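
A toy version of the technique Fernando describes: generate method source from a template, exec it, and bind the result to a live object. All names here are hypothetical, not taken from his actual program.

```python
import types

class Interpreter:
    pass

interp = Interpreter()

# Build method source via string formatting (a stand-in command name).
cmd = "greet"
src = (
    "def magic_%s(self, arg):\n"
    "    return 'ran %s with ' + arg\n"
) % (cmd, cmd)

namespace = {}
exec(src, namespace)                          # compile the generated code
func = namespace["magic_" + cmd]

# Bind the new function as a method of the running instance.
setattr(interp, "magic_" + cmd, types.MethodType(func, interp))

print(interp.magic_greet("spam"))  # ran greet with spam
```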

I'm playing tons of strange tricks with namespaces, code
introspection and dynamical changing of running code. I can't even
begin to imagine how I would do some of those things in any language
other than Python (yes, I don't know Lisp. Shoot me.)

Cheers,

f.

ps. If curious, I think I'll announce the first beta of this around
next week. It's almost ready.

James_...@i2.com

Nov 29, 2001, 7:18:15 PM

Peter Milliken wrote:
>There is no real justification that works for these "features" of python -

OTOH, there *are* plenty of good *reasons* for them.

Consider "adding attributes to instances". Languages that don't support
this feature directly require some kind of a workaround since this is
something that one simply needs to do from time to time. A typical
workaround is to define, as part of a class's definition, a catch-all
property whose value is a hashtable. Clients add "instance-specific"
attributes to a given instance "on the fly" by adding name-value pairs to
the instance's hashtable. Java/Swing's JComponent (with its
putClientProperty and getClientProperty methods) is a good example of this
type of workaround.
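
The workaround Jim describes can be sketched in Python for contrast (names modeled loosely on JComponent's methods, not a real API binding):

```python
# A catch-all dictionary standing in for genuinely dynamic attributes.
class Component:
    def __init__(self):
        self._client_properties = {}
    def put_client_property(self, key, value):
        self._client_properties[key] = value
    def get_client_property(self, key):
        return self._client_properties.get(key)

c = Component()
c.put_client_property("tooltip", "Click me")
print(c.get_client_property("tooltip"))   # Click me

# In Python the indirection is unnecessary -- the instance itself will do:
c.tooltip = "Click me"
print(c.tooltip)                          # Click me
```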

Jim

Gareth McCaughan

Nov 29, 2001, 7:51:14 PM
bruce...@hotmail.com wrote:

> Okay, I was extolling the benefits of Python to a friend of mine. He
> took strong exception to Python's OO model. <here we go again...>
>
> ME: Python's great--you can add members to instances OR classes
> on-the-fly!
> FRIEND: Why would you want to do that?
> ME: Uh...
> FRIEND: Besides, that's awful Object Orientation. If I start adding
> attributes to an instance of a class, it ceases to be an instance of
> that class. If I create a bunch of instances of the same class, they
> should be the same; they should have the same members.

Twaddle. They should provide the same interface. What members
they have is completely irrelevant to that, unless those
members are part of the interface you specify. (It so happens
that in Python there isn't really an in-language mechanism for
saying what's part of the interface and what isn't, but that
doesn't mean your objects don't have interfaces.)

And, of course, there's no law that says Python's objects
are only to be used for "proper" elegant OO programming.
They have other uses. :-)

> ME: Yeah, but with dynamicism, I can add a new pane to a GUI
> while it's running. I just change an instance to include a new pane,
> and...
> FRIEND: That's pretty cool, but it's not a reason, in and of itself,
> to make a language so dynamic. There must be some advantage to being
> able to add attributes during runtime. What are they?
> ME: Uh...
> FRIEND: And another thing! What's with encapsulation? There's no
> private!?!?!?

Here's an entry from my personal quotes file. It's about
CLOS (the Common Lisp Object System), which in most respects
is a very different beast from Python's object system; but
they have a similar attitude to encapsulation.

... it's just that in C++ and the like, you don't trust _anybody_,
and in CLOS you basically trust everybody. the practical result is
that thieves and bums use C++ and nice people use CLOS.

-- Erik Naggum

Someone who can't write well-formed OO programs without a
mechanism like "private" has serious problems with their
ability to write comments, or their ability to write other
documentation, or their discipline, or something.

Besides, in C++ at least, you can say

#define private public
#include "my-class.h"
#undef private

Yow!

> ME: <Screaming and running for cover>
>
> So, I need some help. I've checked out c.l.p and some on-line
> articles, and I just can't find good practical examples of a program
> that adds members to instances (or classes) at runtime. Anybody got
> any? I don't want to have to start avoiding my friend....

The main advantage isn't the ability to add members at runtime
as such. That's just a side-effect of the fact that you don't
need explicit declarations of what can go in a class. And
*that*'s good for three reasons.

- Less repetitetition. Don't you just hate having to write
the same thing twice in C and C++? Programming's bad enough
for the wrists without having to write a pile of redundant
code.

- Handy for exploratory programming, where you sit at an
interactive prompt and play with things. If you suddenly
realise that you can make something go 10 times faster
by cacheing a crucial piece of information inside your
objects, it's nice that you can just add it and have it
work. I suppose that *is* adding members at runtime,
actually, but I suspect it's not the sort you had in
mind.

- One handy Python idiom is using an empty class definition
to provide things like "structs" in C.

class Empty: pass
stuff = Empty()
stuff.largest_prime = 44
stuff.modulus = -1
# etc

Having said that being able to add members at runtime isn't quite
the point, let me sort-of-contradict that by describing something
slightly icky I do in a program I use at work.

I'm collecting data from a sensor, and I want to be able to do
various bits of processing on the data. But not every packet of
data needs all the processing done on it. And, once all the
processing is done, I then want to stash a large number of
data-packets into a pickle for later use by another program.
I don't want my pickles to be hundreds of megabytes long.

So, my class looks a little like this:

    class Packet:
        def __init__(self, data):
            self._data = data
            self._processed = 0
        def ensure_processed(self):
            self._squares = map(operator.mul, self._data, self._data)
            self._average = reduce(operator.add, self._data) / len(self._data)
            self._sumofsquares = reduce(operator.add, self._squares)
            # blah blah blah
            self._processed = 1
        def minimize(self):
            try:
                del self._squares
                del self._sumofsquares
                # blah blah
            except AttributeError:
                pass

    my_packets = build_many_packets(100000)
    do_hairy_stuff_with_packets(my_packets)
    map(Packet.minimize, my_packets)
    cPickle.dump(my_packets, open("data.pickled", "w"))

(Only a little like that; what it actually does is a lot more
complicated.)

--
Gareth McCaughan Gareth.M...@pobox.com
.sig under construc

Jonathan Gardner

Nov 29, 2001, 9:04:07 PM
On Friday 30 November 2001 08:18 am, Peter Milliken wrote:
> I use Python as a good, quick and dirty hacking language. For real (read
> production) stuff that I expect a customer to run or will require more than
> a single person working for a couple of hours, I look elsewhere :-). Sure
> there are examples of Python being used for "large" jobs - and very
> successful they have been too - but these people are masochists (IMO) :-).
> They could have been more productive with other languages that provide
> better support for generic software engineering principles/standards.
>

Au contraire...

I find large projects benefit the most from Python. I have a several-thousand-line
program that quickly became unmanageable in C++. In Python, everything
just blends together seamlessly. When something goes wrong, I don't have to
whip out a debugger because it is so easy to read the old code, and I can
follow it with my finger better than a debugger could.

The parts are extremely interchangeable, modifiable, and reworkable. I can go
in and change any class, even the design, and it still pretty much works.
When it doesn't, it is easy enough to debug. And the function parameters?
THANK HEAVENS that someone has the sense to do what Python does when you have
like 2 million arguments to a function - NAMED PARAMETERS. And one more word:
dir()/__doc__. Every time I wonder what something does, I boot up the command
line, import it, and start dir()'ing it and __doc__'ing it. Without even looking
at the code, I have figured out the most important parts, because I have
documented it as I wrote it.

With C/C++ you have messy defines and stuff you have to manage in addition to
the code. And the code is unreadable, even if you are a professional C++
programmer. You have to remember what members there are, whether they are
protected, public, or private, and whether the other guy is a friend, child
class, or nobody at all. Not to mention you have to watch the memory and
free it all the time - but not if someone else is still using it! Keep
reference counts on pieces of data that are dynamic and used by more than one
pointer; then there is the list/array management that has to go on, the
off-by-one errors and the all-too-inevitable type-casting problems... You
get the idea. It's a lot to remember as you debug old code. It feels like
building a house with tiny pebbles and little sticks and pieces of string.

> There, that should bring the 'vangelists out of the woodwork! :-) It's just
> too easy, they rise like starving trout - each language has it's strengths
> and weaknesses. Python has some very nice features, I use it a good deal.
> But it definitely has its place!
>

I agree with you that there are limitations to Python; I disagree with you
that Python was not meant for large projects.

> In the meantime, my advice is don't get "emotionally" hooked into the
> language. Analyse its strengths and weaknesses, consider other languages
> and their strengths and weaknesses and then you can hold an intelligent
> conversation with another programmer! :-)
>

Exactly. Why do *you* use a language? The answer should always be something
reasonable. It should never be emotional. Well, unless you are a teenager,
then everything is emotional. =)

Jonathan

Milliken, Peter

Nov 29, 2001, 10:46:40 PM
Ah, you said the magic word in your opening lines: "C++...". A maintainer's
lifeline - job security forever! The number of "large" projects that have used
C++ just makes me laugh all the way to the bank :-).

A more cruddy language for implementing large projects I have never seen.
But, as one manager used to argue, "I can get plenty of C++ programmers...."
- despite the technical arguments for why C++ is a disaster of a language
for large projects. I have no argument that Python has probably saved your
bacon many a time on a C++ project - but the basis of the argument is
flawed. "...several thousand line program that quickly became unmanageable
in C++...." :-). Python is a better language than C++ - I would lay C++
somewhere down the bottom of any list for implementing "largish" projects (I
measure "large" projects in millions of lines, not thousands - sorry, I can't
resist bragging :-)).

As for the French - "Au contraire..." - I never could deal with the fancy
stuff :-)

In my opinion (:-)), C (and hence its backward-compatible "big brother")
are "old" languages - neither of which I would choose from a software
engineering principle perspective - but here I get onto ground that has been
argued before (see the extensive thread on whether to implement an Air
Traffic Controller in Python [comp.lang.python] - many people who
participated in that thread know my opinions :-)).

As for the poor coding habits of the projects that you have participated in,
you have my sympathy many times over :-). People keep making the same
mistakes; that is one of the nice things about the software industry -
nobody appears to want to learn from previous pioneers (how many people
reading this have Steve McConnell's book Code Complete on their shelf? An
excellent book, because if you read and practice that you will save yourself
about 15-20 yrs of blundering around the coding world :-)). It all keeps
people like me in a job (I currently work maintaining a "largish" C product
:-)).

So, sorry mate, I accept you have a place for Python, but all you are
proving is my original statement: that different languages have different
places. Have you ever seen the purported interview with Stroustrup about
C++? I personally believe it to be true! :-).

So, "Au contraire" to you too (and thanks for taking the bait :-))

Peter

bru...@tbye.com

Nov 30, 2001, 12:13:09 AM
On Fri, 30 Nov 2001, Peter Milliken wrote:

> I use Python as a good, quick and dirty hacking language. For real (read
> production) stuff that I expect a customer to run or will require more than
> a single person working for a couple of hours, I look elsewhere :-). Sure
> there are examples of Python being used for "large" jobs - and very
> successful they have been too - but these people are masochists (IMO) :-).
> They could have been more productive with other languages that provide
> better support for generic software engineering principles/standards.

I am *so* glad that there are people in the world that share your opinion
because you hand me a competitive advantage on a silver platter. I don't
even have to work for it! Keep up the good work; many, many thanks!

-Dave


Jyrinx

Nov 30, 2001, 1:05:36 AM
I haven't worked much with Ruby, I'll admit, but one of their advertised
"features" sounds like something that violates OO far more than Python's
dynamic members. I don't like singleton functions at all - they let you
cheat by futzing with core implementation details in client code. Yeah,
Python probably allows this, too, but it's not common and certainly not
advertised. A friend argued that it's a waste of code to declare an entirely
new class just to override a method and declare a single instance of it; I'd
say that deriving a new class is clearer, and redefining an instance's
member functions seems kludgy and prone to hard-to-find bugs. (Besides, I
like the Python principle that, while short code is good, clear and succinct
code is more important than the absolute minimum of LOC's.)

Anyway, the point is, while Python allows extra members to be tacked on for
a client's own use, this is not half the violation of OO concepts that a
certain other language regards as a neat feature.

Jyrinx
jyr...@mindspring.com

<bruce...@hotmail.com> wrote in message
news:baf2f841.0111...@posting.google.com...

Christian Tanzer

Nov 30, 2001, 2:03:43 AM

"Peter Milliken" <peter.m...@gtech.com> wrote:

> I use Python as a good, quick and dirty hacking language. For real (read
> production) stuff that I expect a customer to run or will require more than
> a single person working for a couple of hours, I look elsewhere :-).

Like what?

> They could have been more productive with other languages that provide
> better support for generic software engineering principles/standards.

What other languages? What principles/standards?

--
Christian Tanzer tan...@swing.co.at
Glasauergasse 32 Tel: +43 1 876 62 36
A-1130 Vienna, Austria Fax: +43 1 877 66 92


Peter Hansen

Nov 30, 2001, 8:23:26 AM

Dave! :-( Sshhhhh!

(Most of the people here are probably not in business themselves.
You're not supposed to leak the secret out to our competitors!)

--
----------------------
Peter Hansen, P.Eng.
pe...@engcorp.com

bru...@tbye.com

Nov 30, 2001, 10:57:37 AM
On Fri, 30 Nov 2001, Peter Hansen wrote:

> bru...@tbye.com wrote:
> > On Fri, 30 Nov 2001, Peter Milliken wrote:
> > > I use Python as a good, quick and dirty hacking language. For real (read
> > > production) stuff that I expect a customer to run or will require more than
> > > a single person working for a couple of hours, I look elsewhere :-). Sure

[snip]


> >
> > I am *so* glad that there are people in the world that share your opinion
> > because you hand me a competitive advantage on a silver platter. I don't
> > even have to work for it! Keep up the good work; many, many thanks!
>
> Dave! :-( Sshhhhh!
>
> (Most of the people here are probably not in business themselves.
> You're not supposed to leak the secret out to our competitors!)

Doh! Sorry... I may have exceeded the MADCAP (Maximum Advocacy Degree for
the Customer Acceptance of Python). It's a fine line - on the one hand
Python needs to be evangelized so that software consumers (clients and/or
bosses) realize it is "safe" to use and not a toy, and at the same time I
don't want it to become *too* popular because the playing field becomes
level again.

OTOH, if my competitors were really smart, they'd already be using Python,
so me spilling the beans isn't too damaging. ;-)

-Dave


Jason Voegele

Nov 30, 2001, 1:17:27 PM
"Jyrinx" <jyrinx at mindspring dot com> wrote in message news:<9u77i8$a5j$1...@slb0.atl.mindspring.net>...

> I haven't worked much with Ruby, I'll admit, but one of their advertised
> "features" sounds like something that violates OO far more than Python's
> dynamic members. I don't like singleton functions at all - they let you
> cheat by futzing with core implementation details in client code. Yeah,
> Python probably allows this, too, but it's not common and certainly not
> advertised. A friend argued that it's a waste of code to declare an entirely
> new class just to override a method and declare a single instance of it; I'd
> say that deriving a new class is clearer, and redefining an instance's
> member functions seems kludgy and prone to hard-to-find bugs. (Besides, I
> like the Python principle that, while short code is good, clear and succinct
> code is more important than the absolute minimum of LOC's.)
>
> Anyway, the point is, while Python allows extra members to be tacked on for
> a client's own use, this is not half the violation of OO concepts that a
> certain other language regards as a neat feature.

Neither singleton methods nor dynamic members are a violation of OO
principles in the least. They are both, on the other hand, violations
of static-typing principles. Neither Ruby nor Python offers static
typing, so why is this a big deal? Ruby offers this flexibility
because it's a dynamic language, and it comes in very handy.

For example, I'm writing a Ruby binding to the Object-Oriented
database GOODS. Some objects are persistent, some are transient. You
don't know at "compile time" which are which, so it's very nice for my
binding to be able to add persistence support methods at run-time only
to those objects that will be stored in the database. Without Ruby's
support for adding these methods to individual objects, I'd have to
modify the class, which would mean adding all this persistence-related
stuff to transient objects. Ruby allows a much cleaner solution: add
the persistence support methods only to those objects that need them.
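
Python can express much the same idea. Here is a rough analog of Jason's approach, with a made-up stand-in for the persistence hook (not a real GOODS binding):

```python
import types

class Record:
    def __init__(self, data):
        self.data = data

def save(self):
    return "saved %r" % self.data   # a real version would talk to the database

persistent = Record("keep me")
transient = Record("scratch")

# Attach the method to this one instance only; the class and all other
# instances stay free of persistence machinery.
persistent.save = types.MethodType(save, persistent)

print(persistent.save())            # saved 'keep me'
print(hasattr(transient, "save"))   # False
```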

The point is: don't listen to this kind of argument from someone who
has little or no experience with dynamic languages. Singleton methods
and dynamic members are not cheating, they're incredibly useful
features that some C++ programmers are averse to because it doesn't
"feel right", because it doesn't follow the "strict interface"
semantics required by a statically typed language. If you're using a
dynamically typed language, you've already given up the static-typing
safety net, so why bother to live with the restrictions that are
necessary for a static type system? Remember, in a dynamic language,
class does not equal type.

Jason Voegele

Robert Folkerts

Nov 30, 2001, 5:59:02 PM
tan...@swing.co.at (Christian Tanzer) wrote in message news:<mailman.100710428...@python.org>...

> "Peter Milliken" <peter.m...@gtech.com> wrote:
>
> > I use Python as a good, quick and dirty hacking language. For real (read
> > production) stuff that I expect a customer to run or will require more
> > than a single person working for a couple of hours, I look elsewhere :-).
>
> Like what?
>
> > They could have been more productive with other languages that provide
> > better support for generic software engineering principles/standards.
>
> What other languages? What principles/standards?

Perhaps this is a reference to generic programming, as exemplified by
the C++ Standard Template Library. In Python, you don't have static type
declarations, and you don't have the same level of control that you have in
C++ to declare consts that allow a compiler to better optimize code.
If your client needs to get the most out of each clock cycle, I would
choose C++ over Python.

However, my first choice would be to use C++ 'under' Python. There
are some great examples of using Python and C++ together:
sip and the Boost.Python libraries allow C++ and Python to co-exist
nicely. Qt & PyQt are a great example of this synthesis. (Play with
theKompany's BlackAdder; I found it a rush to write Python that
inherits and overrides behavior defined in C++.)

PySTL is another (still rough) indication of how C++ and Python can
coexist. I find PySTL very exciting, because templates are a powerful
tool, but they are an incredible pain to use. The STL is an
intellectual masterpiece, and there are other examples of fine generic
programming using templates. If we can leverage this 'deep'
programming in Python, great!

With C++, you have a powerful OO tool; with Python you have a very
agile OO tool. Python programmers will run circles around C++
programmers. But a C++ application will (after a long wait while the
coders deliver) run circles around the Python application. In my
ideal project, you deliver version 1.0 in pure Python; the client will
have a bunch of changes and a desire to get better performance. Use
the very agile Python to make most of the changes and get the code
refactored into a stable, well-designed OO system.

Now use C/C++ to replace a few performance-limiting pieces of
code. Zope is a great example of this. It is almost all Python with
a very limited use of C. The C++ programmers will only write code
when there is hard evidence from profiling that you have a bottleneck.
A development team should be able to deliver a mixed-language app
that can be developed almost as fast as the Python app, while
executing nearly as fast as the C++ app.

Over time, 'framework' code might migrate toward C++ (like in the Qt
library), while application programming (the bread and butter for most
of us) will migrate to Python (or a Python-like language).

In summary, I feel that Python and C/C++ go together like branches and
roots. Attacking one only harms both.

Harry George

Nov 30, 2001, 11:14:40 PM
bruce...@hotmail.com (bruce...@hotmail.com) writes:

>
> So, I need some help. I've checked out c.l.p and some on-line
> articles, and I just can't find good practical examples of a program
> that adds members to instances (or classes) at runtime. Anybody got
> any? I don't want to have to start avoiding my friend....

We have to parse various csv files. The column headers are provided
in the files. For each data line, we generate an instance of a
generic record class, and load the data into attributes whose names
are derived from the column headers. From there the records are easy
to use.
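
A minimal version of the idiom Harry describes, with made-up sample data: one generic record class, attribute names taken from the file's own column headers.

```python
import csv
import io

class Record:
    def __init__(self, headers, row):
        # Attribute names come from the CSV header line itself.
        for name, value in zip(headers, row):
            setattr(self, name, value)

data = io.StringIO("name,age\nalice,30\nbob,25\n")
reader = csv.reader(data)
headers = next(reader)
records = [Record(headers, row) for row in reader]

print(records[0].name, records[0].age)   # alice 30
print(records[1].name)                   # bob
```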


--
Harry George
hgg...@seanet.com

Kragen Sitaker

Dec 1, 2001, 4:16:27 AM
<bru...@tbye.com> writes:
> In truth, all less dynamic languages *wish* they had this
> feature. Since they don't developers instead have two additional pieces of
> work when creating a new class: (1) they have to figure out up front the
> total collection of possible attributes/properties instances can have, and
> (2) they have to set them to some null-like value if those attrs/props
> are not present. So instead of an instance simply not having a property,
> all other instances have to have a property with a value that means they
> don't have that property. Wow.

I do this all the time in Python, simply because it's easier to write
(and read):

    if self.property is None:
        self.property = self.computeproperty()

than

    try: self.property
    except AttributeError: self.property = self.computeproperty()

or

    if not hasattr(self, 'property'):  # especially when it's __property
        self.property = self.computeproperty()


Kragen Sitaker

Dec 1, 2001, 4:28:23 AM
bruce...@hotmail.com (bruce...@hotmail.com) writes:
> FRIEND: Besides, that's awful Object Orientation. If I start adding
> attributes to an instance of a class, it ceases to be an instance of
> that class. If I create a bunch of instances of the same class, they
> should be the same; they should have the same members.

Your friend obviously greatly overestimates his knowledge of
object-orientation (as evidenced by his saying it with Initial
Capitals). One of the standard OO dogmas is that if FireTruck is a
subclass of Vehicle, then FireTruck instances are (in addition to
being FireTruck instances) Vehicle instances. Another standard OO
dogma is that subclasses, at least, can define new attributes. If
your friend accepts these two dogmas, then he must think that
subclassing is "awful Object Orientation", too.

> FRIEND: That's pretty cool, but it's not a reason, in and of itself,
> to make a language so dynamic. There must be some advantage to being
> able to add attributes during runtime. What are they?
> ME: Uh...

I don't generally add attributes after __init__, myself. I like to be
able to summarize what I expect of an object in phrases like "foo is a
valid FireTruck object", rather than "foo is a FireTruck object that
has had the addStuff method called at some point in the past", because
that latter phrase is harder to ensure the truth of. And I don't like
using hasattr() and getattr() and __dict__ except when I'm extending the
language, doing things that can't be done without them.

> FRIEND: And another thing! What's with encapsulation? There's no
> private!?!?!?

Some languages have encapsulation, and others don't. It doesn't have
much to do with whether or not they're object-oriented.

adina...@mindspring.com

unread,
Dec 1, 2001, 8:31:25 AM12/1/01
to
What do people think is good-form use of hasattr(), getattr(), and __dict__,
as opposed to bad magic or suboptimal style?

- Adina


"Kragen Sitaker" <kra...@pobox.com> wrote in message And I don't like

Hans Nowak

unread,
Dec 1, 2001, 3:14:43 PM12/1/01
to
adina...@mindspring.com wrote:
>
> What do people think is good-form use of hasattr(), getattr(), and __dict__,
> as opposed to bad magic or suboptimal style?

I think there's nothing wrong with using hasattr, getattr or
setattr... __dict__ is a bit more tricky, but it still seems to
be an acceptable solution in some cases, like, for example,
preventing recursive calls in a __setattr__ method. Then again,
if, in a given situation, you can think of a solution that does
not require tinkering with attributes, it's probably better...
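A minimal sketch of that __setattr__ case; the class here is made up, but the trick is exactly the recursion-avoidance described above:

```python
class Tracked:
    """Counts every attribute assignment on the instance."""
    def __init__(self):
        # Write through __dict__ so we don't trigger __setattr__ recursively.
        self.__dict__['writes'] = 0

    def __setattr__(self, name, value):
        self.__dict__['writes'] += 1
        self.__dict__[name] = value

t = Tracked()
t.x = 1
t.x = 2
print(t.x, t.writes)   # -> 2 2
```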

There. Was that vague or what? ;-)

--Hans

Peter Milliken

unread,
Dec 2, 2001, 3:25:06 PM12/2/01
to
You both prove *my* point - thanks! :-)

Peter

"Peter Hansen" <pe...@engcorp.com> wrote in message
news:3C07884E...@engcorp.com...

Peter Hansen

unread,
Dec 3, 2001, 12:09:07 AM12/3/01
to
[Top-quoting corrected.]

Peter Milliken wrote:
> "Peter Hansen" wrote:
> > bru...@tbye.com wrote:

> > > Peter Milliken wrote:
> > > > I use Python as a good, quick and dirty hacking language. For real (read
> > > > production) stuff that I expect a customer to run or will require more than
> > > > a single person working for a couple of hours, I look elsewhere :-). Sure
> > > > there are examples of Python being used for "large" jobs - and very
> > > > successful they have been too - but these people are masochists (IMO) :-).
> > > > They could have been more productive with other languages that provide
> > > > better support for generic software engineering principles/standards.
> > >
> > > I am *so* glad that there are people in the world that share your opinion
> > > because you hand me a competitive advantage on a silver platter. I don't
> > > even have to work for it! Keep up the good work; many, many thanks!
> >
> > Dave! :-( Sshhhhh!
> >
> > (Most of the people here are probably not in business themselves.
> > You're not supposed to leak the secret out to our competitors!)
>

> You both prove *my* point - thanks! :-)

Your point being what again? That we could have been more productive
using some language other than Python, and that we are masochists
because we chose to use Python instead?

If that's your claim, you're welcome to it, and Dave and I will
continue being *much* more productive than we have ever been
with other languages, for reasons clearly and directly
attributable to the design of Python and the community.
Do you really believe Python has poor support for "generic"
engineering principles? Maybe you're just using it wrong.
I find it supports *all* the useful engineering principles
I've ever learned, and then some.

Or did you have some other point I missed?

Jyrinx

unread,
Dec 3, 2001, 3:40:36 AM12/3/01
to
> For example, I'm writing a Ruby binding to the Object-Oriented
> database GOODS. Some objects are persistent, some are transient. You
> don't know at "compile time" which are which, so it's very nice for my
> binding to be able to add persistence support methods at run-time only
> to those objects that will be stored in the database. Without Ruby's
> support for adding these methods to individual objects, I'd have to
> modify the class, which would mean adding all this persistence-related
> stuff to transient objects. Ruby allows a much cleaner solution: add
> the persistence support methods only to those objects that need them.

Under the basic premises of OO, shouldn't you just have a base class with
derived classes "persistent" and "transient," or some such? I'd think this
would make clearer your intent for the uses of the objects.

Jason Voegele

unread,
Dec 3, 2001, 3:05:34 PM12/3/01
to
"Jyrinx" <jyrinx at mindspring dot com> wrote in message news:<9ufdth$155$1...@slb5.atl.mindspring.net>...

The problem with this is that it defeats the idea of transparent
persistence. One goal of object databases is that persistence is not
limited to a subset of the class hierarchy. This goal forbids a
superclass as an indicator of persistence-capability. Instead,
persistence is defined by reachability: if an object (any object!) is
reachable from a persistent object, that object becomes persistent
itself, as well as any objects reachable from that object...and so on.

Some OO databases ignore this and force applications to use
"Persistent" superclasses or interfaces. IMO, this is a very bad
idea. Consider that you may have a million instances of a class, only
one of which is persistent (by the reachability definition). If the
class has to subclass a "Persistent" class, all of those million
objects have to carry around the Persistence baggage, even though only
one of those objects needs them.

Adding the persistence baggage to individual objects as necessary
provides a much simpler and more scalable solution. Simpler because
application developers don't have to bother with subclassing (or
implementing) a Persistent class (or interface). More scalable
because the persistence support is added to *only* those objects (not
classes) that need it.
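In Python terms, this "baggage on the object, not the class" idea might look like the following sketch; the save method and flag are invented stand-ins for real database machinery:

```python
import types

class Thing:
    def __init__(self, name):
        self.name = name

def make_persistent(obj):
    """Attach persistence support to this one instance only."""
    def save(self):
        return 'saved %s' % self.name   # stand-in for real storage code
    obj.save = types.MethodType(save, obj)
    obj._persistent = True

a, b = Thing('a'), Thing('b')
make_persistent(a)          # say 'a' became reachable from a persistent root
print(a.save())             # -> saved a
print(hasattr(b, 'save'))   # -> False; 'b' carries no baggage
```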

Jason Voegele

Peter Milliken

unread,
Dec 3, 2001, 4:45:57 PM12/3/01
to
Yeah, I guess you missed the point. But then you're very busy making money
aren't you? :-) So perhaps we shouldn't be too critical :-)

The major point was that Python is good for some simple, single person
jobs - which you and your mate obviously do, but it is *not* good for larger
projects that involve more than one individual (IMO :-)) - this statement
has implications on the size of the job in a very direct way.

As for Python meeting *all* of the software engineering principles you ever
*learned* - well, only you can be the judge of what you did or didn't learn
:-). It certainly fails in a number of important areas that I *learned*
about software engineering :-). So let's leave it at the fact that we have
different educational backgrounds :-).

Evangelical support of a language is nice to see, just don't let it blind
you to what is available in the way of tools. Python is a tool. Just as a
carpenter wouldn't attempt to build a house using only a hammer, I wouldn't
attempt to write all of my software using Python. I like Python, I use
Python. But I realise that Python has some severe limitations. I wouldn't
use Python to "build a house".

All you have to do is look at the PEP requests and tools such as PyChecker
to realise the deficiencies of the language. Sure, it will grow as a result
of some of these tools and requests, but it has a fundamental design basis
that can never change - still, it keeps Guido gainfully employed, and good
luck to him :-). Programmers should be aware of the limitations of any tool
in their toolbox and act accordingly (and please, don't throw C/C++ into the
argument! I consider C, and by extension C++, nothing more than a high-level
assembler! :-)). There are a great many very mature languages out there
that provide better support for programming in the large. You could stick
with Python as it matures and continually shoe-horn your design into its
features, but only as long as you're a one-man band that doesn't have to
provide long-term support for something you wrote (I mean *long* term - a
10-15 yr life for many products isn't uncommon - how long will your last job
survive in production?). Obviously you haven't worked at the architectural
level of any *large* projects (10's or even 100's of programmers), otherwise
you would never make the statements you do. Ignorance is bliss, I guess :-)

In the meantime, Python seems to suit you, so enjoy! :-)
Peter

"Peter Hansen" <pe...@engcorp.com> wrote in message

news:3C0B08F3...@engcorp.com...

Fredrik Lundh

unread,
Dec 3, 2001, 6:47:04 PM12/3/01
to
> Or did you have some other point I missed?

don't feed the troll.

</F>


James_...@i2.com

unread,
Dec 3, 2001, 7:16:04 PM12/3/01
to

Peter Milliken wrote:
>Obviously you haven't worked at the architectural
>level of any *large* projects (10's or even 100's of programmers) otherwise
>you would never make the statements you do. Ignorance is bliss I guess :-)

But since all things are not simply "black or white", you might want to keep
in the back of your mind the fact that our company has written a successful
commercial product comprising several hundred thousand lines of Jython code,
written by several dozen programmers working in multiple locations
throughout the world. It has been in production use for over a year and is
expected to be supported for many years to come (the C++ product that it
replaced has been supported in production use for over seven years and
counting).

Jim

Peter Hansen

unread,
Dec 3, 2001, 11:34:50 PM12/3/01
to
Peter Milliken wrote:
>
> Peter Hansen wrote:
> > Or did you have some other point I missed?
>
> Yeah, I guess you missed the point. But then you're very busy making money
> aren't you? :-) So perhaps we shouldn't be too critical :-)

Sounds like reasonable advice...

> The major point was that Python is good for some simple, single person
> jobs - which you and your mate obviously do, but it is *not* good for larger
> projects that involve more than one individual (IMO :-)) - this statement
> has implications on the size of the job in a very direct way.

1. For the record, I've never met Dave (presumably the "your mate" in your
statement above) and we do not work together. We just seem to have similar
views on some of these issues. (I realize my comments might have been
misleading in this regard.)

2. Here's a simple point for you to consider in relation to your claims.
I'm the director of software engineering for a wireless telecom
company with 100+ people. There are fourteen developers here who are
using Python (almost exclusively) for some rather large applications,
ranging from factory automation through Intranet to embedded Linux
stuff for industrial control. In case this isn't obvious: these
are NOT one-person jobs. Several have involved eight or nine
people simultaneously at their peaks. I'm happy to report from
*actual experience* that your statement has *no* implications on the
size of the jobs we have undertaken, direct or otherwise. And
as for suitability in other ways: we have had an order of magnitude
fewer reported bugs, and have productivity at least two, probably
three times, higher than on any project I've worked on in the past.
These numbers are largely attributable to Python, and that makes
using it good business sense no matter how you look at it.

> As for Python meeting *all* of the software engineering principles you ever
> *learned* - well, only you can be the judge of what you did or didn't learn
> :-). It certainly fails in a number of important areas that I *learned*
> about software engineering :-).

While I cannot speak for the grounding your school may have provided,
the University of Waterloo gave me its usual solid grounding in engineering
principles. Perhaps your school emphasized some areas I would consider
unimportant. Or perhaps the advice that a "poor workman should not
blame his tools" should apply even to those calling themselves software
engineers. (If I were you, I would probably have put a smiley here.)

> All you have to do is look at the Pep requests and tools such as PyChecker
> to realise the deficiencies of the language.

Actually, I am quite happy ignoring the PEPs, and so far have not
found it necessary to integrate PyChecker into our development environment.
We have done quite well without it so far (although I figure PyChecker
would be of some small help and we'll probably use it eventually).
Perhaps our testing processes (part of good software engineering)
are adequate for the purpose.

So your suggestion doesn't seem likely to reveal any significant
deficiencies to me: if I haven't seen any myself, it must be because
*for my purposes* there are none!

> You could stick with Python as it matures ...

How much more than 10 years old does a language have to be for you
to call it mature? (No, that's merely a rhetorical question.)

> ... and as long as you're a one man band that doesn't have to provide
> long term support for something that you wrote (I mean *long* term - a 10 -
> 15yr life for many products isn't uncommon - how long will your last job
> survive in production?). Obviously you haven't worked at the architectural
> level of any *large* projects (10's or even 100's of programmers) otherwise
> you would never make the statements you do. Ignorance is bliss I guess :-)

Erm, yes.... well. Our product design is intended to scale to
hundreds of systems and have a lifetime of roughly ten years.
I see no evidence after working on this project for two and
a half years that we will not fulfill our objectives. In spite of
my background in Systems Design Engineering and my professional
engineering license, I suppose you may be right that I don't have
what it takes to work on large projects. Luckily for me, your
opinion on that matter is unlikely to raise any doubts in my mind,
although I suppose you're quite welcome to express it.

> In the meantime, Python seems to suit you, so enjoy! :-)

I do! Thanks for the good wishes. :)

Peter Hansen

unread,
Dec 3, 2001, 11:41:25 PM12/3/01
to
Fredrik Lundh wrote:
>
> > Or did you have some other point I missed?
>
> don't feed the troll.

Doh! Who wrote the new regex library anyway? There must be a bug:

>>> import re, list_util
>>> re.findall('(troll)', list_util.get('<mailman.1007098659...@python.org>'))
[]

I'm not sure how it missed that one...

-Peter H

Peter Milliken

unread,
Dec 3, 2001, 10:51:27 PM12/3/01
to
Agreed Jim :-), there are many shades of grey in between. Note that I have
never said that you *can't* do it in Python - just that I believe there are
better languages for larger programs i.e. the cost of production would be
less in some languages than if you did it in Python - this must also be
tempered with exactly what you are coding i.e. Python is very, very good at
certain things, therefore if your program would benefit from those features
of the language then by all means - there are rarely any absolutes in this
world :-).

Just like you can produce large programs in C++ (but I wouldn't personally
:-)) - just out of curiosity, why was the C++ version of the product
replaced by a Jython version? Too difficult/expensive to update to the new
requirements?

Any idea of what the maintenance costs of the C++ version were in
comparison to any other language? Was the Jython version cheaper, more
expensive or the same in terms of production costs as the C++ version? What
were the problems encountered by the various teams using Jython? It has
always interested me what the selection process is in choosing a language
for a project - do you know how and why Jython was chosen? Was it because of
popular acclaim or was some formal comparison of various languages
performed? If there was some formal selection process, what languages were
considered? What features of Jython caused it to be selected? i.e. what
features were missing in the other languages under consideration.

Peter

<James_...@i2.com> wrote in message
news:mailman.1007425027...@python.org...

Peter Milliken

unread,
Dec 3, 2001, 11:09:05 PM12/3/01
to

"Robert Folkerts" <rob...@folkerts.net> wrote in message
news:abec4aa1.0111...@posting.google.com...

> tan...@swing.co.at (Christian Tanzer) wrote in message
news:<mailman.100710428...@python.org>...
> > "Peter Milliken" <peter.m...@gtech.com> wrote:
> >
> > > I use Python as a good, quick and dirty hacking language. For real (read
> > > production) stuff that I expect a customer to run or will require more than
> > > a single person working for a couple of hours, I look elsewhere :-).
> >
> > Like what?

What about languages such as Oberon, Modula-2/3, Smalltalk, Java, Pascal,
Ada plus many others.

> >
> > > They could have been more productive with other languages that provide
> > > better support for generic software engineering principles/standards.
> >
> > What other languages? What principles/standards?
>
> Perhaps this is a reference to generic programming, as exemplified by
> the C++ Standard Template Library. In Python, you don't have data
> types and you don't have the same level of control that you have in
> C++ to declare const's that allow a compiler to better optimize code.
> If your client needs to get the most out of each clock cycle, I would
> choose C++ over Python.
>

No, I didn't mean 'generic' in that sense :-). I was referring more to the
concepts of information hiding, programming via 'contract', pre/post
conditions, types and type checking, compile time checking i.e. misspell a
variable in Python and spend ages looking for it! :-) run-time checking etc
etc (I could think up others, but these will do for the time being :-)).
Many of these principles are mentioned in various portions of the Python
news list as "desirable" features of the language i.e. I think I remember
something in the 2.2 changes where the author states that the pre/post
condition feature could now feasibly be implemented. PyChecker is another
example of someone's desire to 'improve' Python - Lint did the same thing
for C/C++ :-) All add-ons to attempt to correct "deficiencies" in the
language :-) So you have to ask yourself why don't people look for a
language that incorporates these features by way of the
compiler etc? :-)
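For what it's worth, the pre/post-condition idea can be bolted on in today's Python with a few lines; this is only an illustrative sketch (the decorator and function names are made up), not any particular PEP's design:

```python
def require(condition):
    """Design-by-contract precondition check, the do-it-yourself way."""
    def decorate(fn):
        def wrapper(*args, **kwargs):
            assert condition(*args, **kwargs), 'precondition failed'
            return fn(*args, **kwargs)
        return wrapper
    return decorate

@require(lambda x: x >= 0)
def isqrt(x):
    # Integer square root by simple search (small inputs only).
    n = 0
    while (n + 1) * (n + 1) <= x:
        n += 1
    return n

print(isqrt(9))    # -> 3
```

The skeptics' point stands, of course: this is a library convention checked at run time, not a compiler-enforced contract.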

> However, my first choice would be to use C++ 'under' Python. There
> are some great examples of using Python and C++ together:
> sip and the boost python libraries allow C++ and python to co-exist
> nicely
> Qt & PyQt are a great example of this synthesis. (Play with
> theKompany's BlackAdder, I found it a rush to write Python that
> inherits & overrides behavior defined in C++.)
>

<snip>

>
> In summary, I feel that Python and C/C++ go together like branches and
> roots. Attacking one only harms both.

No disagreement in the general sense of combining Python with another
language - I am not a fan of C++ (which is C with some OO thrown in for good
measure and C is nothing more than high level assembler :-)). I have
considered something similar myself but using a different language to C++.
The concept of combination is a good one (as witnessed by your examples),
match the strengths of the language in question with the problem. There is
no "one" language that would suit all situations (otherwise I wouldn't be a
professed Python user! :-)). It is just interesting that some can become
"one-eyed" towards their favourite language, thinking that it can solved all
problems (probably solve the problem of world hunger as well! :-)) similarly
to when one mentions editors, now mine is.......... :-)

Peter


Peter Milliken

unread,
Dec 3, 2001, 11:37:38 PM12/3/01
to
I am only writing in this thread to gain information myself :-). I have strong opinions (formed over 2 decades of being involved in software production) but am always prepared to learn from others' experiences and opinions. So there is no offence taken :-)
 
I would (invariably :-)) choose Ada for full scale production programs. It is a language that embodies support for many of the software engineering principles, has all of the "mod cons" such as OO etc and was intended for real-time applications - so it is fast! But as I mentioned in other responses, this is not "black and white". Python is my preferred choice of scripting language (I have only ever used awk) and if the job demanded it then I would quite happily integrate Python into the program so that it and Ada were doing appropriate parts of the job.
 
My last 20 years I spent in defence doing embedded systems (simulators, procedural trainers, train control systems, weapons systems). Surprisingly there is very little call for manipulation of information such as strings etc (something that I use Python to good advantage for). Similarly, there is very little need for maintaining information in "lists of things" (another place where Python shines). In fact I have seen people work on defence projects for 10 years and never even learn how to do I/O to a user in the chosen language i.e. string I/O, because the problem being programmed contained nothing like that. I have years of experience in C/C++ (not so many in C++, but enough :-)), Ada, some small amount of Java (my second preference if I couldn't use Ada - but it is interpreted, so it is *slow*), assembler, some Fortran and Pascal/Modula-2. Outside of these languages, I can't comment (and don't pretend to :-)).
 
So Ada would be my choice, but as I said, if someone has any experience to offer up then I am more than happy to consider it and learn from it - I have Steve McConnell's Code Complete on my bookshelf - I wish he had written it 20 yrs earlier, it would have saved me some "lumps"! :-)
 
The software industry is (IMO) a very immature industry - it can't be called an engineering discipline because I have rarely seen any examples of true engineering principles applied (I am not talking about writing a program to meet customers' needs, but rather the framework of the software organisation i.e. why did the company choose Java for this project - is it because a study was performed on available languages and Java won out due to best features? Or was it because the chief architect wanted to learn Java for his resume? :-)). It strikes me that software engineering is still very much a "cottage industry". Good programs are written by good individuals; rarely do you see a software shop put out a good program using average or sub-standard programmers - it always comes down to a few individual "heroes" (hence the drive for CMM - another silver bullet :-)). Look at people's reaction to code review - they learn it in college, don't they? It is taught as "good practice", but as soon as they get that piece of paper and get out in the "real world", amongst "real programmers", then horror of horrors - "You want to review *my* code?" "You want to *criticise* my creative talents?" They approach the entire exercise with fear and trepidation :-) Personally, I like others to review my code - it helps me become a better software engineer! (But only if they put the time in!)
 
Hope this answers your questions.....
Peter
 
"Peoter Veliki" <peoter...@hotmail.com> wrote in message news:mailman.1007077727...@python.org...
> I use Python as a good, quick and dirty hacking language. For real (read
> production) stuff that I expect a customer to run or will require more than
> a single person working for a couple of hours, I look elsewhere :-).
 
What do you use for production?  I'm just curious, not challenging your opinion.  Is there an interpreted language that you would use before you would use Python?  Perl?  Or would you stick to compiled languages for production?

Peter Milliken

unread,
Dec 3, 2001, 11:39:49 PM12/3/01
to
Sorry Jim, you are one up on me here, I have never had to face this
situation. Thanks for the tip. Could you provide a real example of where you
might want this? (if it isn't too long to give! :-)). You have made me
curious.

<James_...@i2.com> wrote in message
news:mailman.1007079579...@python.org...
>
> Peter Milliken wrote:
> >There is no real justification that works for these "features" of
python -
>
> OTOH, there *are* plenty of good *reasons* for them.
>
> Consider "adding attributes to instances". Languages that don't support
> this feature directly require some kind of a workaround since this is
> something that one simply needs to do from time to time. A typical
> workaround is to define, as part of a class's definition, a catch-all
> property whose value is a hashtable. Clients add "instance-specific"
> attributes to a given instance "on the fly" by adding name-value pairs to
> the instance's hashtable. Java/Swing's JComponent (with its
> putClientProperty and getClientProperty methods) is a good example of this
> type of workaround.
>
> Jim
>
>
>
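Jim's JComponent example translates to Python roughly as follows; the class below is a hypothetical sketch of the static-language workaround, followed by the one-liner Python makes of it:

```python
class Component:
    """Static-language-style workaround: a catch-all property table."""
    def __init__(self):
        self._client_properties = {}

    def put_client_property(self, key, value):
        self._client_properties[key] = value

    def get_client_property(self, key, default=None):
        return self._client_properties.get(key, default)

c = Component()
c.put_client_property('tooltip', 'Save the file')
print(c.get_client_property('tooltip'))   # -> Save the file

# In Python the hashtable is unnecessary: just set the attribute.
c.tooltip = 'Save the file'
print(c.tooltip)                          # -> Save the file
```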


Peter Milliken

unread,
Dec 4, 2001, 12:30:50 AM12/4/01
to

"Peter Hansen" <pe...@engcorp.com> wrote in message
news:3C0C526A...@engcorp.com...

>
> 1. For the record, I've never met Dave (presumably the "your mate" in your
> statement above) and we do not work together. We just seem to have
similar
> views on some of these issues. (I realize my comments might have been
> misleading in this regard.)
>

Actually, "your mate" was meant in the sense of you having placed yourself
(apparently) in the same view. I assumed no other contact :-). I am
Australian, so "your mate" can be an expression that covers many situations
:-). It doesn't necessarily imply a close relationship or otherwise.

> 2. Here's a simple point for you to consider in relation to your claims.
> I'm the director of software engineering for a wireless telecom
> company with 100+ people. There are fourteen developers here who are
> using Python (almost exclusively) for some rather large applications,
> ranging from factory automation through Intranet to embedded Linux
> stuff for industrial control. In case this isn't obvious: these
> are NOT one-person jobs. Several have involved eight or nine
> people simultaneously at their peaks. I'm happy to report from
> *actual experience* that your statement has *no* implications on the
> size of the jobs we have undertaken, direct or otherwise. And
> as for suitability in other ways: we have had an order of magnitude
> fewer reported bugs, and have productivity at least two, probably
> three times, higher than on any project I've worked on in the past.
> These numbers are largely attributable to Python, and that makes
> using it good business sense no matter how you look at it.
>

Good to hear your experiences. You look after 14 people - well done, that's
a big team. Do you manage to do any development yourself with that number of
people to supervise? I never could with teams that size; it drove me to
distraction...

What is the basis for your comparisons though? You state that you have
achieved fewer bugs and increased productivity using Python. It would be
nice to know (just for my information :-)) over which other language you claim
these improvements. Have you considered that there might be other languages
available that provide similar order of magnitude improvements over Python?
What was the basis for your choice of Python as the development language of
choice? i.e did you perform any studies into available languages, or did
someone recommend it? If it was recommended, what was the basis of the
recommendation? Have you considered mixing languages i.e. Python for the
bits that suit python and some other language for the rest?

As I have stated in other posts here, all is not "black and white" (as
someone else said :-)) and I apologise if I gave that impression in my
original post. Having done some industrial control software when I was a
"pup" what features of Python do you find useful here that you can't find in
some other language? Does the interpreted nature of Python cause any speed
difficulties? What is the required responsiveness of yoir problem domain?

> > As for Python meeting *all* of the software engineering principles you
ever
> > *learned* - well, only you can be the judge of what you did or didn't
learn
> > :-). It certainly fails in a number of important areas that I *learned*
> > about software engineering :-).
>
> While I cannot speak for the grounding your school may have provided,
> the University of Waterloo gave me its usual solid grounding in
engineering
> principles. Perhaps your school emphasized some areas I would consider
> unimportant. Or perhaps the advice that a "poor workman should not
> blame his tools" should apply even to those calling themselves software
> engineers. (If I were you, I would probably have put a smiley here.)
>

I'll put in as many as you want - I intend no offence in any of my comments,
I am pursuing this thread in the hopes of (self) improvement :-). Let's see,
information hiding, type checking, run-time checking, design by 'contract'
(ties in with information hiding), spring to mind as immediate features that
I would like to see in a language and that I was taught are "good features"
to have. Seems you don't find them important, I guess we will have to
disagree as to what is important or not :-)

Personally I love information hiding and coding by 'contract' - it seems
there is always someone in the team who is just plain lazy and wants to
access and modify the behaviour of another object directly rather than using
the agreed interface or requesting that the agreed interface be changed.
Strong typing is wonderful too. My Ada programs have 95%+ of *all* bugs out
of them by the time I get them to compile cleanly - I wish I could say the
same of my Python programs! :-) Finding bugs by testing is just so
*expensive*!

> > All you have to do is look at the Pep requests and tools such as
PyChecker
> > to realise the deficiencies of the language.
>
> Actually, I am quite happy ignoring the PEPs, and so far have not
> found it necessary to integrate PyChecker into our development
environment.
> We have done quite well without it so far (although I figure PyChecker
> would be of some small help and we'll probably use it eventually).
> Perhaps our testing processes (part of good software engineering)
> are adequate for the purpose.
>

Do you gather metrics on the problems that PyChecker is meant to cure? i.e.
does anyone record the fact that they lost 1/2 hr to the fact that they
misspelt a variable name? :-) Metrics are the best way to improve your shop -
unfortunately the typical programmer doesn't like to record them :-).

> So your suggestion doesn't seem likely to reveal any significant
> deficiencies to me: if I haven't seen any myself, it must be because
> *for my purposes* there are none!
>

Hmmm.... at the risk of offending, have you considered that you might not
have sufficiently broad experience to be able to determine that? (lots of
smileys here - :-), :-), :-), :-) - there, is that enough? :-) (one more for
good measure)).

> > You could stick with Python as it matures ...
>
> How much more than 10 years old does a language have to be for you
> to call it mature? (No, that's merely a rhetorical question.)
>

Rhetoric is OK :-) Difficult sometimes to keep it out of these threads :-).
10 years is a long time, I guess from what I have observed in some posts on
the list, people are attempting to state that they would like to see things
like some facility in the language to make it easy for them to type-check
their arguments (one example that springs to mind). I really don't spend too
much time looking for these items though or I might be able to pull up many
more examples :-) Unfortunately, I "skim" these because I know there are
very good languages that already offer these features (and they didn't take
10 years to get there! - which is *not* a slam at Guido - he has done an
excellent job! :-)).

> > ... and as long as you're a one man band that doesn't have to provide
> > long term support for something that you wrote (I mean *long* term - a
> > 10 - 15yr life for many products isn't uncommon - how long will your
> > last job survive in production?). Obviously you haven't worked at the
> > architectural level of any *large* projects (10's or even 100's of
> > programmers) otherwise you would never make the statements you do.
> > Ignorance is bliss I guess :-)
>
> Erm, yes.... well. Our product design is intended to scale to
> hundreds of systems and have a lifetime of roughly ten years.
> I see no evidence after working on this project for two and
> a half years that we will not fulfill our objectives. In spite of
> my background in Systems Design Engineering and my professional
> engineering license, I suppose you may be right that I don't have
> what it takes to work on large projects. Luckily for me, your
> opinion on that matter is unlikely to raise any doubts in my mind,
> although I suppose you're quite welcome to express it.
>

Sorry, no offence intended there. Rhetoric got the better of me :-). I'm
quite sure you could scale up to larger projects - after all it is the same
software engineering principles! :-)

All either of us can do is talk from personal experience. I have a
reasonable amount in that regard but it is limited in the sense of 29 years
work experience in certain fields and situations - first 8 in industrial
control, next 20 in defence (biggest project was 100+ programmers) and the
last year in a "commercial" environment. So I am trying to gather as much
as I can from other people (we can't learn it all! :-)).

Thank you for your input. Hopefully I can learn some more from your
experiences in the answers to the questions I pose above. If you feel they
are too "personal" to do through this news group then please feel free to
reply directly to me.

Thanks
Peter

P.S. I have a Degree in Elect. Engineering. Formal software training
consisted of a single semester in Fortran and Basic back sometime around
1977 (I think) - I read a lot though, software is my hobby. I have been
*everything* except project manager (not that stupid! :-)) i.e. test
engineer, QA engineer, Configuration Manager, sub-contract manager, software
engineer, system architect, team leader of everyone of those disciplines etc
etc ad nauseum..... :-)


Jyrinx
Dec 4, 2001, 1:12:52 AM
> > > For example, I'm writing a Ruby binding to the Object-Oriented
> > > database GOODS. Some objects are persistent, some are transient. You
> > > don't know at "compile time" which are which, so it's very nice for my
> > > binding to be able to add persistence support methods at run-time only
> > > to those objects that will be stored in the database. Without Ruby's
> > > support for adding these methods to individual objects, I'd have to
> > > modify the class, which would mean adding all this persistence-related
> > > stuff to transient objects. Ruby allows a much cleaner solution: add
> > > the persistence support methods only to those objects that need them.
> >
> > Under the basic premises of OO, shouldn't you just have a base class
> > with derived classes "persistent" and "transient," or some such? I'd
> > think this would make clearer your intent for the uses of the objects.
>
> The problem with this is that it defeats the idea of transparent
> persistence. One goal of object databases is that persistence is not
> limited to a subset of the class hierarchy.

Okay, I can see that. In other words, the hierarchy should express the
relationships between objects, not a particular feature of certain objects?

> ...


> Some OO databases ignore this and force applications to use
> "Persistent" superclasses or interfaces. IMO, this is a very bad
> idea. Consider that you may have a million instances of a class, only
> one of which is persistent (by the reachability definition). If the
> class has to subclass a "Persistent" class, all of those million
> objects have to carry around the Persistence baggage, even though only
> one of those objects needs them.

I thought only the persistent objects would subclass the Persistent class;
how does the persistence baggage get into objects of classes derived from
Transient? Wouldn't only leaves, so to speak, of the hierarchy derive from
Persistent or Transient anyway?

> Adding the persistence baggage to individual objects as necessary
> provides a much simpler and scalable solution. Simpler because
> application developers don't have to bother with subclassing (or
> implementing) a Persistent class (or interface). More scalable
> because the persistence support is added to *only* those objects (not
> classes) that need it.

I'm not sure how the singleton stuff is more scalable (though I'm sure you
have more experience than me). Hypothetically (yeah, I'm on shaky ground
here), what if you had more than one major behavior (besides just
persistence) that might vary between objects of a given class? Mightn't it
get messy to keep track of which objects are which? (I suppose, though, you
don't have to "keep track" of which ones are persistent; they just act
differently.) I wonder if it wouldn't really be clearer just to declare
trivial classes which inherit multiply from Persistent/Transient, other
behavior implementations, and other parent classes. (Of course, in Python
there's no such thing as a trivial subclass definition; you've got to
overload __init__ specifically and call __init__'s in each base class. This
ticks me off to no end.)

Also, a more technical concern: shouldn't every Ruby object need hooks in
place so that any single method could be overridden in individual objects?
I'm envisioning a class instance with a function pointer for each method ...
that's a nasty thought. Or does each object carry a list of overridden
methods, to be checked with each method call? Isn't there significant
overhead either way? Or (most likely) is it implemented in too clever a way
for me to come up with? :-)

Peter Hansen
Dec 4, 2001, 2:26:50 AM
Peter Milliken wrote:
>
> "Peter Hansen" <pe...@engcorp.com> wrote:
> > ... we have had an order of magnitude

> > fewer reported bugs, and have productivity at least two, probably
> > three times, higher than on any project I've worked on in the past.
> > These numbers are largely attributable to Python, and that makes
> > using it good business sense no matter how you look at it.
> >
>
> Good to hear your experiences. You look after 14 people - well done, that's
> a big team, do you manage to do any development yourself with that number of
> people to supervise? I never could with teams that size, it drove me to
> distraction......

Some. Enough for now. Not much.

> What is the basis for your comparisons though? You state that you have
> achieved fewer bugs and increased productivity using Python. (just for my
> information :-)) it would be nice to know over what other language you claim
> these improvements.

C, C++, Java, Delphi as primary languages involved in the comparison.
The kind that emphasize (obviously not C :-) things like data hiding,
type checking, compile-time checking, and so on (the things too often
claimed as being required for writing "real" programs but lacking from
Python and therefore making it unsuitable). Many others before and
along with these with no better results. (I did learn Ada, but
not Ada95. :-) (But after 100,000 hours of writing software or
thinking about it, I forget which languages I used to know.)

> Have you considered that there might be other languages
> available that provide similar order of magnitude improvements over Python?

Except in specialized areas, very, very doubtful. I'll consider it
seriously when I see the slightest evidence of such a thing. I do
keep abreast of the industry and stay alert for more data.

> What was the basis for your choice of Python as the development language of
> choice? i.e did you perform any studies into available languages, or did
> someone recommend it? If it was recommended, what was the basis of the
> recommendation? Have you considered mixing languages i.e. Python for the
> bits that suit python and some other language for the rest?

We contrasted it with other languages with similar capabilities (fitting our
requirements at the time), including Perl, TCL, Ruby, others. Not
recommended by anyone, except as discovered during research. And yes,
naturally we mix Python as needed. We just rarely find the need, and more
often than not find that the benefits of sticking with Python outweigh
the minor apparent advantages of diluting our focus. (For example,
C is used from time to time for performance or integration with
hardware. Almost all other areas we now find more effective
to do with Python, increasing the gains from code reuse, training,
utilities, test frameworks, and so forth. If we encounter a
situation where another language will clearly be better we'll use
it.... but we're still waiting.)

> Having done some industrial control software when I was a
> "pup" what features of Python do you find useful here that you can't find in
> some other language? Does the interpreted nature of Python cause any speed
difficulties? What is the required responsiveness of your problem domain?

Speed is not an issue. As I've written elsewhere, I believe 95%
(or was it 98% last time? :-) of all optimization is an unnecessary
waste of programmer time. We recently had to optimize a routine in
Python. It's an unusual occurrence. Responsiveness varies from
tens of microseconds (some embedded stuff for which we obviously
use straight C, not Python) up to several seconds, where Python is
more than adequate even on 100MHz 486. Advantages of Python (not
necessarily missing from other languages, but perhaps not
available in any single other one, and certainly not implemented
in the same, simple, effective way) include its introspection,
dynamic typing (yes, an advantage!), named arguments, incredible
readability, maintainability, and learnability, its interpreted
nature (interactive console lets developers test small portions
of code during development, so the result is already valid
*before* the application is run the first time, and it saves
looking up APIs), cross-platform support (Windows and Linux),
very strong standard library, incredible community support,
licensing (well down the list), capabilities as a "glue"
language (highly valued when we wrote the factory automation
application which talks GPIB, writes XML, has a GUI, multitasks,
talks to a serial port, talks to a CAN interface, wraps some
Windows DLLs, and does a half dozen other things), and
probably a few more which I won't think of tonight. :) Oh,
yeah, dictionaries, regular expressions, Zope, and more.
Oh, wxPython is great, too. And it's great for writing
automated testing (yes, a circular argument, since we rely
more on automated testing *because* it's Python.)

> > ... Or perhaps the advice that a "poor workman should not


> > blame his tools" should apply even to those calling themselves software
> > engineers. (If I were you, I would probably have put a smiley here.)
> >
> I'll put in as many as you want - I intend no offence in any of my comments,

Often people overuse smileys in postings with frequent sarcastic remarks,
apparently to belittle others even while they claim not to have meant it.
I'll accept you didn't intend any of your comments that way!

> I am pursuing this thread in the hopes of (self) improvement :-). Lets see,
> information hiding, type checking, run-time checking, design by 'contract'
> (ties in with information hiding), spring to mind as immediate features that
> I would like to see in a language and that I was taught are "good features"
> to have. Seems you don't find them important, I guess we will have to
> disagree as to what is important or not :-)

Agreed. :-) I used to think information hiding was important, and
that type checking provided value to *me* (and not just to the
compiler). Python's pragmatic approach (self._dontuseme)
to the issue and the number of times it has been useful during
development or testing *not* to have to fight with the Java/C++
type of strictness has convinced me otherwise. I assume you
mean compile-time checking above (?) (not "run-time checking")
and PyChecker provides a large part of the same value. The rest
should (I now believe) be caught by adequate testing, since
compile-time checking becomes a crutch (people end up relying
solely on it, and don't do the other tests). No reason
you can't do design by contract, if you wish. Test-first
design provides a more efficient way of handling the issue.

But the part about our training is interesting, too. I was
not taught much about information hiding and type-checking.
I was in an engineering program which emphasized systems
design, looking at complexity of systems, coupling and
cohesion, design methodologies, and so forth. Little
directly to do with software, as it tried to move to a
"higher" level (so they claimed) and focus on issues which
apply to engineering in any field. As a result, perhaps
I'm less tied to (or trained in :-) these things which
I consider minor and mundane details of certain programming
languages, and look more at the benefits Python brings
to those other issues. Code, for example, appears to
be significantly simpler in Python than in many other
languages. The result is less complexity, easier
debugging and maintenance, faster development, and so on.
*These* are the engineering principles I keep in mind
and on which I judge languages.

> Personally I love information hiding and coding by 'contract' - it seems
> there is always someone in the team who is just plain lazy and wants to
> access and modify the behaviour of another object directly rather than using
> the agreed interface or requesting that the agreed interface be changed.

When (or if, really) this happens, we'll just refactor the
problem out of the code. It hasn't happened yet.

> Strong typing is wonderful too. My Ada programs have 95%+ of *all* bugs out
> of them by the time I get it to compile cleanly - wish I could say the
> same of my Python programs! :-)
> Finding bugs by testing is just so *expensive*!

Not if you write the tests first...

> Do you gather metrics on the problems that PyChecker is meant to cure? i.e.
> does anyone record the fact that they lost 1/2 hr to the fact that they
> misspelt a variable name? :-) Metrics are the best way to improve your shop -
> unfortunately the typical programmer doesn't like to record them :-).

Our metrics are poor so far, but then again resources have
been exceptionally tight and we've done our best under the
circumstances. But if problems caused by things like misspelt
variable names showed up much (or at all... not sure anyone's
actually been caught by it in the last year), we would focus more
attention on the problem. These things just haven't been
a problem for us, so we've given them little thought.

> > So your suggestion doesn't seem likely to reveal any significant
> > deficiencies to me: if I haven't seen any myself, it must be because
> > *for my purposes* there are none!
>
> Hmmm.... at the risk of offending, have you considered that you might not
> have sufficiently broad experience to be able to determine that?

I'm considering it. Yes, of course that's possible. No one
could say otherwise. I just have no evidence to think that's
the case, and I believe I have quite enough experience (not
quite your level, but twenty something years developing
professionally, and probably 100,000 hours as I said above)
to judge when I have a problem to solve. The last time I
judged that to be the case, I went looking for a solution.
That's when I found Python. :-)

> > > You could stick with Python as it matures ...
> >
> > How much more than 10 years old does a language have to be for you
> > to call it mature?
>

> 10 years is a long time, I guess from what I have observed in some posts on
> the list, people are attempting to state that they would like to see things

Not "people", newbies. :-) (Yes, I know they're not all newbies.)

> like some facility in the language to make it easy for them to type-check
> their arguments (one example that springs to mind). I really don't spend too
> much time looking for these items though or I might be able to pull up many
> more examples :-) Unfortunately, I "skim" these because I know there are
> very good languages that already offer these features (and they didn't take
> 10 years to get there! - which is *not* a slam at Guido - he has done an
> excellent job! :-)).

Don't kid yourself. Python is not going to arrive suddenly at
static type-checking after fifteen years and realize it was
wrong all along. It *might* end up with such a thing, probably
more to support compiler optimizations than anything, but some
of us (probably many of us) will be quite uninterested in it.
Guido has (in my opinion, as a supposedly well trained
systems design engineer) an excellent design sense. Some of
these things were left out for a reason.

> All either of us can do is talk from personal experience. I have a
> reasonable amount in that regard but it is limited in the sense of 29 years
> work experience in certain fields and situations - first 8 in industrial
> control, next 20 in defence (biggest project was 100+ programmers) and the
> last year in a "commercial" environment. So I am trying to gather as much
> as I can from other people (we can't learn it all! :-)).

Ahh.. You are much more experienced than I am in really hardcore
heavy-metal development (or whatever euphemism I can apply to the
defence industry :-). I seem to be much more experienced than you
in the commercial environment, where the issues can sometimes be
quite different (I believe). I'm still working in the commercial
area, while you are freshly arrived, but I won't presume to think
that makes me more right than you.

I *might*, however, have found that in this environment, some of
the rigorous controls provided by other languages actually get in
the way of solving the problem in a high quality and efficient
manner (high quality being defined as satisfying the customer,
and efficient being defined as before profitability vanishes :-).
Actually, I believe that's exactly what I have found, and Python
is a part of it. (XP is another part, by the way, but that's a
discussion for another newsgroup. :-) Either way, I wouldn't be
at all surprised if my conclusions would be all wrong in your
former field.

Fredrik Lundh
Dec 4, 2001, 2:29:16 AM
James Althoff wrote:
> But since all things are not simply "black or white" you might want to keep
> in the back of your mind the fact that our company has written a successful
> commercial product that comprises several hundreds of thousands of lines of
> Jython code written by several dozen programmers working in multiple
> locations throughout the world and has been in production use for over a
> year

are you sure it still works, now that someone on comp.lang.python
has told you that it was a bad idea to do that? ;-)

</F>


Fredrik Lundh
Dec 4, 2001, 2:29:20 AM
Peter Hansen wrote:
> I'm not sure how it missed that one...

were you trolling? or was I perhaps referring to that other
"I learned the one truth years ago" job security guy?

</F>


Christian Tanzer
Dec 4, 2001, 3:12:45 AM

"Peter Milliken" <peter.m...@gtech.com> wrote:

> > > > I use Python as a good, quick and dirty hacking language. For real
> > > > (read production) stuff that I expect a customer to run or will
> > > > require more than a single person working for a couple of hours, I
> > > > look elsewhere :-).
> > >
> > > Like what?
>
> What about languages such as Oberon, Modula-2/3, Smalltalk, Java, Pascal,
> Ada plus many others.

Interesting collection.

Having used several of them I wouldn't prefer them over Python for
most kinds of production stuff (with the exception of Ada for
hard real-time applications -- and guess what, for the RT stuff C has
to be used in my customer's circles <sigh>).

And Smalltalk clearly has more things in common with Python than with
the other languages you mention.

> > > > They could have been more productive with other languages that provide
> > > > better support for generic software engineering principles/standards.
> > >
> > > What other languages? What principles/standards?

(snipped third party interpretation)


> No, I didn't mean 'generic' in that sense :-). I was referring more to the
> concepts of information hiding, programming via 'contract', pre/post
> conditions, types and type checking, compile time checking i.e. misspell a
> variable in Python and spend ages looking for it! :-) run-time checking etc
> etc (I could think up others, but these will do for the time being :-)).

Do you remember Ariane 5? The one thing they didn't do was to test the
software in the new environment...

I also find it amusing that you don't mention the one language which
actually supports pre/post conditions out of the box (BTW, it is a lot
easier to add support for pre/post conditions to Python (and
Smalltalk) than to your favorite languages).

Christian

--
Christian Tanzer tan...@swing.co.at
Glasauergasse 32 Tel: +43 1 876 62 36
A-1130 Vienna, Austria Fax: +43 1 877 66 92


Peter Hansen
Dec 4, 2001, 10:16:32 AM

Sorry, a smiley might have helped. I was attempting to
make a small joke (and failed miserably) when it
occurred to me that *my* pattern matching might have
failed, and I realized you're the re guy... I'll work
on my act more before taking it to the streets. :-)

Jason Voegele
Dec 4, 2001, 10:53:10 AM
> > The problem with this is that it defeats the idea of transparent
> > persistence. One goal of object databases is that persistence is not
> > limited to a subset of the class hierarchy.
>
> Okay, I can see that. In other words, the hierarchy should express the
> relationships between objects, not a particular feature of certain objects?

I'm not sure I understand what you're saying. My original point was
that persistence should not be limited to objects whose class
subclasses "Persistent". *Any* object should be capable of
persistence, whether its class is Integer or MyClass.

> > ...
> > Some OO databases ignore this and force applications to use
> > "Persistent" superclasses or interfaces. IMO, this is a very bad
> > idea. Consider that you may have a million instances of a class, only
> > one of which is persistent (by the reachability definition). If the
> > class has to subclass a "Persistent" class, all of those million
> > objects have to carry around the Persistence baggage, even though only
> > one of those objects needs them.
>
> I thought only the persistent objects would subclass the Persistent class;
> how does the persistence baggage get into objects of classes derived from
> Transient? Wouldn't only leaves, so to speak, of the hierarchy derive from
> Persistent or Transient anyway?

That's precisely how it works, because individual objects can have
methods added to them. If they didn't have this capability, then all
objects of a particular class would have to support persistence.

I think an example is in order. Say I have a class called "Book",
that represents a book having a title, an author, a date of
publication, etc. In my application, I might instantiate a million
Book objects. But say that I only need to store one of them in the
database. Given the ability to add members to a specific object, I
can add persistence support methods (such as object_id, mark_modified,
etc.) to the single book instance that I need to store in the
database. (More precisely, the database binding could add these
methods for me!) If Ruby did not have this ability, the Book class
would have to inherit from some sort of Persistent class. This would
mean that all one million books have these methods and variables to
support persistence, even though only one book object actually needs
them.
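In Python, the same per-instance attachment can be sketched with `types.MethodType` (the `object_id`/`mark_modified` names follow the post, but their bodies here are invented purely for illustration):

```python
import types

class Book:
    """Plain domain class with no built-in persistence support."""
    def __init__(self, title, author):
        self.title = title
        self.author = author

# Hypothetical helpers a database binding might attach on demand.
def object_id(self):
    return id(self)

def mark_modified(self):
    self._dirty = True

stored = Book("Dune", "Frank Herbert")   # the one instance to persist
stored.object_id = types.MethodType(object_id, stored)
stored.mark_modified = types.MethodType(mark_modified, stored)

stored.mark_modified()
print(stored._dirty)                                    # True
print(hasattr(Book("Other", "Anon"), "mark_modified"))  # False
```

Only `stored` gains the persistence hooks; every other Book instance stays plain.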

>
> > Adding the persistence baggage to individual objects as necessary
> > provides a much simpler and scalable solution. Simpler because
> > application developers don't have to bother with subclassing (or
> > implementing) a Persistent class (or interface). More scalable
> > because the persistence support is added to *only* those objects (not
> > classes) that need it.
>
> I'm not sure how the singleton stuff is more scalable (though I'm sure you
> have more experience than me).

To be fair, I don't know that it's any more scalable, but my intuition
tells me that it should be.

> Hypothetically (yeah, I'm on shaky ground
> here), what if you had more than one major behavior (besides just
> persistence) that might vary between objects of a given class? Mightn't it
> get messy to keep track of which objects are which? (I suppose, though, you
> don't have to "keep track" of which ones are persistent; they just act
> differently.)

Yes, you just answered your own question :-)

In addition, dynamic languages aren't concerned with such strict
semantics. Type is often defined by the messages an object
understands (whether those messages were implemented by the object's
class, or whether they were added to this specific instance does not
matter for a dynamic language).
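A minimal Python sketch of that point (all names invented): the check is simply "does the object answer the message?", however the method got there.

```python
class Duck:
    def quack(self):
        return "quack"

class Rock:
    pass

def describe(obj):
    # Duck typing: type is defined by the messages an object answers,
    # whether the method came from its class or was bolted onto this
    # one instance after the fact.
    return obj.quack() if hasattr(obj, "quack") else "silent"

pebble = Rock()
pebble.quack = lambda: "quack?!"   # added to this single instance only

print(describe(Duck()))    # quack
print(describe(pebble))    # quack?!
print(describe(Rock()))    # silent
```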

> I wonder if it wouldn't really be clearer just to declare
> trivial classes which inherit multiply from Persistent/Transient, other
> behavior implementations, and other parent classes.

You could do this, but then you force application developers to do
things you could automate for them. And it doesn't provide any
benefit anyway. Adding this kind of behavior at the class level means
that all instances of that class provide that behavior. Most of the
time, this is what you want, but sometimes it is just incorrect. See
the Book example above.

> (Of course, in Python
> there's no such thing as a trivial subclass definition; you've got to
> overload __init__ specifically and call __init__'s in each base class. This
> ticks me off to no end.)

I'm just starting to play around with Python, so I can't comment on
this.

> Also, a more technical concern: shouldn't every Ruby object need hooks in
> place so that any single method could be overridden in individual objects?
> I'm envisioning a class instance with a function pointer for each method ...
> that's a nasty thought. Or does each object carry a list of overridden
> methods, to be checked with each method call? Isn't there significant
> overhead either way? Or (most likely) is it implemented in too clever a way
> for me to come up with? :-)


The Ruby object model is very flexible. Classes have methods, objects
have methods, objects can have a class created specifically for their
own use, objects or classes can mixin methods from a module, etc. A
lot of the (presumed) overhead is handled by method caching. See:

http://www.rubycentral.com/book/classes.html

for a description of the Ruby object model (although, unfortunately,
the very helpful figures from the printed book are not available in
the online version.)

--
Jason

Jeff Shannon
Dec 4, 2001, 2:18:22 PM

Jason Voegele wrote:

> I think an example is in order. Say I have a class called "Book",
> that represents a book having a title, an author, a date of
> publication, etc. In my application, I might instantiate a million
> Book objects. But say that I only need to store one of them in the
> database. Given the ability to add members to a specific object, I
> can add persistence support methods (such as object_id, mark_modified,
> etc.) to the single book instance that I need to store in the
> database. (More precisely, the database binding could add these
> methods for me!) If Ruby did not have this ability, the Book class
> would have to inherit from some sort of Persistent class. This would
> mean that all one million books have these methods and variables to
> support persistence, even though only one book object actually needs
> them.

As a minor note, in general, methods don't create much "baggage" -- there is only
a single code object, as part of the class object, for each method, regardless of
how many instances of that class you create. So you really aren't saving a
significant amount of anything by doing this. I suppose that if every
instance needs a set of data members in order to support persistence, then you're
"wasting" that memory space for objects that won't be persisted, but in general,
this is not often a concern.
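That sharing is easy to verify directly in Python (a small sketch with a made-up class):

```python
class Book:
    def summary(self):
        return "a book"

a = Book()
b = Book()

# Both bound methods wrap the *same* function object, stored once on
# the class -- the instances carry no per-method copy of the code.
print(a.summary.__func__ is b.summary.__func__)   # True
print(a.summary.__func__ is Book.summary)         # True
print("summary" in a.__dict__)                    # False
```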

The *real* advantage for dynamically adding methods/attributes to an object, comes
when you don't have control over the original object.

If I have a commercial library to manage some task, and I want to store objects
from *that* library in my own object database, then I don't have the option of
having library objects inherit from some Persistent root class. In Python this
can be solved in two ways, one of which is mix-in multiple inheritance, the other
of which is dynamically adding whatever methods and attributes are needed, as they
are needed.

In other words, it's not a solution for avoiding baggage, it's a solution for not
being *able* to put the needed "baggage" in place. :)


Jyrinx wrote:

> > (Of course, in Python
> > there's no such thing as a trivial subclass definition; you've got to
> > overload __init__ specifically and call __init__'s in each base class. This
> > ticks me off to no end.)

This is, of course, only true if the base *requires* initialization--many mix-in
classes don't--they simply provide methods and class-level data attributes,
without explicitly setting instance attributes in an __init__().
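For instance, a mix-in that supplies only methods and class-level data needs no __init__ cooperation at all (a sketch; `PersistentMixin` and its `save` logic are invented for illustration):

```python
class PersistentMixin:
    # Methods and class-level data only -- no __init__, so subclasses
    # need no special initialization dance.
    storage = {}

    def save(self):
        self.storage[id(self)] = vars(self).copy()

class Book:
    def __init__(self, title):
        self.title = title

class PersistentBook(PersistentMixin, Book):
    pass   # trivial subclass: Book.__init__ is inherited unchanged

pb = PersistentBook("Dune")
pb.save()
print(PersistentMixin.storage[id(pb)])   # {'title': 'Dune'}
```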


Jeff Shannon
Technician/Programmer
Credit International


Jyrinx
Dec 5, 2001, 12:29:30 AM
> The *real* advantage for dynamically adding methods/attributes to an
> object, comes when you don't have control over the original object.
>
> If I have a commercial library to manage some task, and I want to store
> objects from *that* library in my own object database, then I don't have
> the option of having library objects inherit from some Persistent root
> class. In Python this can be solved in two ways, one of which is mix-in
> multiple inheritance, the other of which is dynamically adding whatever
> methods and attributes are needed, as they are needed.
>
> In other words, it's not a solution for avoiding baggage, it's a solution
> for not being *able* to put the needed "baggage" in place. :)
> being *able* to put the needed "baggage" in place. :)

Gotcha. Well, at this point I can see the use for singleton methods in an
object database, but I'm still not convinced of the need in a general
language. In a more self-contained program, how often do you suddenly find
yourself with someone else's object at runtime and need to add functionality
like that? Besides, if I may continue stubbornly to stick by my favorite
contender :-) , I should think it would be trivial to have a Python library
to add this sort of runtime extension, if it isn't easy already. (Great.
I've argued myself in a circle - Ruby supports this paradigm too well; well,
maybe it could be useful; well, it's not hard in Python either :-) ... )
Anyway, this looks to me like a useful feature best relegated to a standard
library, more than a major feature of a general-purpose language.

I could be wrong, though ... where else have you guys found singleton
methods useful?
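The runtime extension described in the quote above (tacking persistence hooks onto objects from a library you cannot modify) can be sketched in Python; the class, attribute, and method names here are all invented for illustration:

```python
import types


class LibraryWidget:
    """Invented stand-in for a class from a vendor library you cannot edit."""
    def __init__(self, name):
        self.name = name


def persist(self):
    # hypothetical persistence hook, attached from outside the library
    return {"name": self.name, "oid": self._oid}


w = LibraryWidget("spam")
w._oid = 42                               # new data attribute, this instance only
w.persist = types.MethodType(persist, w)  # new bound method, this instance only
print(w.persist())  # {'name': 'spam', 'oid': 42}
```

Other LibraryWidget instances, and the class itself, are untouched.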

Jason Voegele
Dec 5, 2001, 11:32:06 AM
"Jyrinx" <jyrinx at mindspring dot com> wrote in message news:<9ukbaj$thh$1...@slb0.atl.mindspring.net>...

> Gotcha. Well, at this point I can see the use for singleton methods in an
> object database, but I'm still not convinced of the need in a general
> language. In a more self-contained program, how often do you suddenly find
> yourself with someone else's object at runtime and need to add functionality
> like that?

One example I can think of is in a Web server implementation I was
working on. HTTP requires the line separator to be "CRLF"
(carriage-return line-feed). Ruby's IO methods use either '\n' or a
platform-specific line separator. Part of my application accepted a
socket connection back to the browser as a parameter. This method
generated a response and wrote it to the IO object passed in. Instead
of doing this everywhere:

print("Content-type: text/html \r\n")

I just did this:

def my_method(io)
  def io.print(text)
    # super reaches IO#print; calling self.print here would recurse forever,
    # and `text + "\r\n"` avoids mutating the caller's string the way << would
    super(text + "\r\n")
  end

  io.print("Content-type: text/html")
end

I changed the print method of the single instance of IO that I was
using, so this change did not affect other IO objects in the system.
Granted, I didn't *need* to do this, since I could have just added the
"\r\n" to the end of each print, but it's a lot easier and less
error-prone to do it OnceAndOnlyOnce.
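For comparison, a Python sketch of the same OnceAndOnlyOnce trick, patching a single instance's print method at runtime (the Connection class below is an invented stand-in for the socket object, not anything from the original server):

```python
import types


class Connection:
    """Invented stand-in for the socket/file object handed to the handler."""
    def __init__(self):
        self.sent = []

    def write(self, text):
        self.sent.append(text)


def crlf_print(self, text):
    # HTTP requires CRLF line endings; append them once, at one choke point
    self.write(text + "\r\n")


conn = Connection()
conn.print = types.MethodType(crlf_print, conn)  # patches this instance only
conn.print("Content-type: text/html")
print(conn.sent)  # ['Content-type: text/html\r\n']
```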

> Besides, if I may continue stubbornly to stick by my favorite
> contender :-) , I should think it would be trivial to have a Python library
> to add this sort of runtime extension, if it isn't easy already. (Great.
> I've argued myself in a circle - Ruby supports this paradigm too well; well,
> maybe it could be useful; well, it's not hard in Python either :-) ... )
> Anyway, this looks to me like a useful feature best relegated to a standard
> library, more than a major feature of a general-purpose language.

I suppose it's one of those things where if you don't have it you
don't miss it, but once you've gotten used to it you'd be annoyed by
its absence.

> I could be wrong, though ... where else have you guys found singleton
> methods useful?

It happens very frequently in Ruby, since any class method is a
singleton method of the Class object for that class instance :) But
that of course is specific to the Ruby object model. (Maybe it works
like this in Smalltalk too? I don't know.)

In everyday application programming, I admit I don't use it all that
often, but I like to have it there when I need it. But consider that
any use of the Decorator pattern is easily implemented using singleton
methods.
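A sketch of that last claim in Python, under the assumption that wrapping one instance's bound method counts as a lightweight Decorator (all names here are invented):

```python
import types


class Logger:
    def log(self, msg):
        return msg


def timestamped(method):
    # wrap an existing bound method, Decorator-style, with no wrapper class
    def wrapper(self, msg):
        return "[ts] " + method(msg)
    return wrapper


lg = Logger()
lg.log = types.MethodType(timestamped(lg.log), lg)  # only this instance changes
print(lg.log("hello"))        # [ts] hello
print(Logger().log("hello"))  # hello
```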

Jason

Alan Winston
Dec 5, 2001, 2:18:07 PM
> [megasnip]
> ... (XP is another part, by the way, but that's a
> discussion for another newsgroup. :-) ...

Why wouldn't a Python-specific XP discussion be entirely appropriate and
welcome in this group?

Throughout this thread I have been wishing someone would mention XP in this
context. In particular when there is reference to Python being suitable for
lone-wolf programmers, but not for large teams. Your mention of test-first
also had me mumbling "XP ..., XP ... ."

I would be very, very appreciative if you could comment on your XP
experiences as they specifically relate to Python, and in contrast to other
languages you have production XP'ed in. Do you do full pair-programming,
with shifting pairs?

Alan Winston
Seattle

James_...@i2.com
Dec 5, 2001, 5:10:08 PM

Peter Milliken wrote:
>Sorry Jim, you are one up on me here, I have never had to face this
>situation. Thanks for the tip. Could you provide a real example of where
>you might want this? (if it isn't too long to give! :-)). You have made me
>curious.

Sure thing.

Java/Swing is a large and sophisticated GUI framework (written in Java --
we access it via Jython). There is a superclass JComponent that defines a
large number of fields and methods that are inherited by most other GUI
components in the framework. We have found several examples where you want
to mark some of your components in order to give them special treatment.

Here's one. For the UI of an application we want to be able to mark one or
more of the many buttons (or other controls) that might appear in a given
panel as being "emphasized". The emphasized controls are what the user
would normally click most of the time and we want these to be highlighted
in some prescribed way (according to our corporate guidelines for UI look
and feel). Swing is designed in a flexible way that lets you easily change
the look of a component. So it is easy for us to go to that part of the
code and change the drawing algorithm to suit our needs. But *only* if we
know *which* components need to be drawn with emphasis. So we need to be
able to mark the emphasized controls in some fashion.

One option is to
subclass the controls to add such an attribute. However, we can't subclass
JComponent because it is the superclass of all the other controls. And we
don't want to subclass each of the other controls because there are *many*
of them. And, in any case, the added attribute would not be generic to all
of them, and we would have to maintain lots of extra classes, etc. There
are a lot of problems with this approach.

Another approach is to maintain
some kind of list of the emphasized controls. But this is nasty because
then you need to create a globally defined list (because it is accessed all
over the application) -- which is rarely a good idea. The global list, for
one thing, causes garbage collection problems because it has extra
references to the emphasized controls. So then you need to add lots of
extra code to try to manage the extra references. Or you need to use
advanced techniques like weak references. Again, it is a mess.

The *easy*
and *straightforward* solution is just to add an "emphasized" attribute to
each component *instance* that is to be emphasized. The drawing code can
test for the attribute. If the attribute exists, then the code draws the
control in the emphasized way. If not, it draws the control in the usual
way. The Swing designers -- not knowing our application, of course -- did
not include an "isEmphasized" property in JComponent. And Java doesn't
allow one to add instance-specific attributes. But the Swing designers did
realize that people would need to do this kind of thing so they included a
catchall hashtable in JComponent that clients can use to add
(putClientProperty) and read (getClientProperty) "instance-specific"
attributes.

Generally, when you deal with user interface widgets (in a large GUI
framework) a lot, you tend to see lots of cases where instance-specific
attributes (both fields and methods) come in handy.
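In Jython/Python terms, the marking Jim describes needs no catchall hashtable at all; a sketch with invented stand-in classes rather than actual Swing code:

```python
class Button:
    """Invented stand-in for a Swing control (think javax.swing.JButton)."""
    def __init__(self, label):
        self.label = label


def paint(control):
    # the drawing code tests for the per-instance marker and falls back
    style = "bold" if getattr(control, "emphasized", False) else "plain"
    return "%s:%s" % (style, control.label)


ok, cancel = Button("OK"), Button("Cancel")
ok.emphasized = True  # instance-specific attribute; no subclass, no global list
print(paint(ok), paint(cancel))  # bold:OK plain:Cancel
```

No subclass per control, no global registry, and unmarked controls are drawn the usual way.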

Jim



James_...@i2.com
Dec 5, 2001, 5:46:06 PM

Peter Milliken wrote:
>Just like you can produce large programs in C++ (but I wouldn't personally
>:-)) - just out of curiosity, why was the C++ version of the product
>replaced by a Jython version? Too difficult/expensive to update to the new
>requirements?

Yes, those were factors. But actually, the market (for large,
sophisticated ERP/Exchange systems) has changed pretty dramatically the
past couple of years. In the early-to-mid nineties customers in this
market wanted/expected C++ servers talking to Windows/MFC/C++ clients. In
the late nineties the market shifted dramatically to where customers
demanded Java-based servers interacting with HTML in web browsers (or Java
desktop applications in some cases). Java was seen by many in this market
as providing
- better portability across different kinds of servers,
- increased interoperability between services from multiple vendors,
- better leverage of "plumbing" and advanced transaction services (using
J2EE containers),
- better productivity, reliability, and subsequent lower cost of ownership,
among other things. This market today is nearly 100% Java. C++ based
products don't sell anymore (again in this specific market).

>Any ideas of what the maintenance costs of the C++ version were in
>comparison to any other language? Was the Jython version cheaper, more
>expensive or the same in terms of productions costs as the C++ version?

I don't have accurate measures but most of the developers here believe that
our new Java/Jython-based system yielded 5X to 10X productivity increases
in development over the equivalent/previous C++ system and that the
maintenance ratio is at least as good.

>What were the problems encountered by the various teams using Jython? It
>has always interested me what the selection process is in choosing a
>language for a project - do you know how and why Jython was chosen? Was it
>because of popular acclaim or was some formal comparison of various
>languages performed? If there was some formal selection process, what
>languages were considered? What features of Jython caused it to be
>selected? i.e. what features were missing in the other languages under
>consideration.

The biggest problem with using Jython was convincing the team to use it.
My experience is that developers are very interested (with good cause ;-)
in populating their resumes with experience that is directed at the center
of the market. At the time we started this new technology our C++
developers were far more interested in putting Java on their resumes than
Python. Since I was the CTO and Senior VP of development, though, I
basically dictated the choice of Jython. Of course, to be fair, I sold the
concept by first developing (with my own rusty hand) a prototype system in
a month that amazed and dazzled all the C++ developers. They were
astonished to see that such a system could be created by one (old and
cranky ;-) person in such a short amount of time. The other factor that
helped in our case was the fact that the technology -- by market demand --
had to be Java *and* we were convinced that we needed a scripting language
as part of our technology. I sold Jython as being the best possible
scripting language for Java. The reason *I* was sold on Jython was because
of its being a complete, dynamic (not statically type-checked), OO language
that interfaced seamlessly with Java. I like to view Jython/Python as a
"high-level, application language" and Java as a "low-level, systems
language". :-)

Jim

Alex Martelli
Dec 6, 2001, 11:15:06 AM
<James_...@i2.com> wrote in message
news:mailman.100759044...@python.org...
...

> allow one to add instance-specific attributes. But the Swing designers
did
> realize that people would need to do this kind of thing so they included a
> catchall hashtable in JComponent that clients can use to add
> (putClientProperty) and read (getClientProperty) "instance-specific"
> attributes.

...while some of the Windows API designers, many years before, given the
non-OO framing of that C-language API, had included SetProp, GetProp &c
calls... giving exactly the same overall ability.


> Generally, when you deal with user interface widgets (in a large GUI
> framework) a lot, you tend to see lots of cases where instance-specific
> attributes (both fields and methods) come in handy.

...and when you work in some other field, you miss them terribly if
you're used to them from other realms and the framework doesn't provide
them... weren't "property lists" for arbitrary symbols supported as
far back as LISP 1.5?


Alex

Peter Milliken
Dec 6, 2001, 3:11:36 PM
Thanks Jim, appreciate the replies - I have learnt a lot. I have a
completely different background where the requirements were/are very much
different - what suits your environment wouldn't have suited mine :-), but
now I have a better appreciation of what else exists in the world :-).

Your experience with C++ confirms something that I have believed (and been
opinionated about! :-)) for some time now. I will save this email off in my
"ammunition" folder :-).

Me personally, I take the "unworldly" view of "what is best for the customer
and company" rather than "what is best for my resume" - perhaps not the
smartest of viewpoints but then I never expected that the field of software
engineering wouldn't have some kind of place for me. I have some quite
"strong" discussions with some of the younger members of our staff
advocating this viewpoint - they think I am wrong :-). It's a shame, because
all of those C++ projects were driven by (at least in part) this attitude
and yet it has ended up costing everyone involved (except the developers -
their resume looks better! :-)). Ah well.....

Thanks
Peter

<James_...@i2.com> wrote in message
news:mailman.100759243...@python.org...
