
Can a low-level programmer learn OOP?


Chris Carlen

Jul 13, 2007, 12:06:44 PM
Hi:

From what I've read of OOP, I don't get it. I have also found some
articles profoundly critical of OOP. I tend to relate to these articles.

However, those articles were no more objective than the descriptions of
OOP I've read in making a case. I.e., what objective
data/studies/research indicates that a particular problem can be solved
more quickly by the programmer, or that the solution is more efficient
in execution time/memory usage when implemented via OOP vs. procedural
programming?

The problem for me is that I've programmed extensively in C and .asm on
PC DOS way back in 1988. Then didn't program for nearly 10 years during
which time OOP was popularized. Starting in 1999 I got back into
programming, but the high-level-ness of PC programming and the
completely foreign language of OOP repelled me. My work was in analog
and digital electronics hardware design, so naturally I started working
with microcontrollers in .asm and C. Most of my work involves low-level
signal conditioning and real-time control algorithms, so C is about as
high-level as one can go without seriously losing efficiency. The
close-to-the-machine-ness of C is ideal here. This is a realm that I
truly enjoy and am comfortable with.

Hence, being a hardware designer rather than a computer scientist, I am
conditioned to think like a machine. I think this is the main reason
why OOP has always repelled me.

Perhaps the only thing that may have clicked regarding OOP is that in
certain cases I might prefer a higher-level approach to tasks which
involve dynamic memory allocation. If I don't need the execution
efficiency of C, then OOP might produce working results faster by not
having to worry about the details of memory management, pointers, etc.

But I wonder if the OOP programmers spend as much time creating classes
and trying to organize everything into the OOP paradigm as the C
programmer spends just writing the code?

Ultimately I don't care what the *name* is for how I program. I just
need to produce results. So that leads back to objectivity. I have a
problem to solve, and I want to find a solution that is as quick as
possible to learn and implement.

Problem:

1. How to most easily learn to write simple PC GUI programs that will
send data to remote embedded devices via serial comms, and perhaps
incorporate some basic (x,y) type graphics display and manipulation
(simple drawing program). Data may result from user GUI input, or from
parsing a text config file. Solution need not be efficient in machine
resource utilization. Emphasis is on quickness with which programmer
can learn and implement solution.

2. Must be cross-platform: Linux + Windows. This factor can have a big
impact on whether it is necessary to learn a new language, or stick with
C. If my platform was only Linux I could just learn GTK and be done
with it. I wouldn't be here in that case.

Possible solutions:

Form 1: Use C and choose a library that will enable cross-platform GUI
development.

Pro: Don't have to learn new language.
Con: Probably will have difficulty with cross-platform implementation
of serial comms. This will probably need to be done twice. This will
waste time.

Form 2: Use Python and PySerial and TkInter or wxWidgets.

Pro: Cross-platform goal will likely be achieved fully. Have a
programmer nearby with extensive experience who can help.
Con: Must learn new language and library. Must possibly learn a
completely new way of thinking (OOP) not just a new language syntax.
This might be difficult.
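As a concrete sketch of what Form 2 might involve, here is a minimal example of sending one command with PySerial. The `NAME=VALUE` framing, port name, and baud rate are placeholders, not anything a real device would necessarily use:

```python
def frame_command(name, value):
    """Frame one ASCII command as NAME=VALUE terminated by CRLF.

    This framing is a made-up placeholder; a real device would define
    its own protocol.
    """
    return ("%s=%s\r\n" % (name, value)).encode("ascii")

def send_command(port_name, name, value):
    """Send one framed command over a serial port, then close it.

    Requires the third-party pyserial package; port name and baud rate
    here are placeholders.
    """
    import serial  # pyserial; imported here so framing stays testable
    port = serial.Serial(port_name, baudrate=9600, timeout=1)
    try:
        port.write(frame_command(name, value))
    finally:
        port.close()

# Usage (with real hardware attached):
#   send_command("/dev/ttyUSB0", "GAIN", 42)   # e.g. "COM3" on Windows
```

Keeping the framing separate from the I/O means the protocol part can be tested on either platform without a device attached.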

Form 3: Use LabVIEW

Pro: I think that the cross-platform goal can be met.
Con: Expensive. I would prefer to use an Open Source solution. But
that isn't as important as the $$$. I have also generally found the 2D
diagrammatical programming language of "G" as repelling as OOP. I
suspect that it may take as much time to learn LabVIEW as Python. In
that case the time spent on Python might be better spent since I would
be learning something foundational as opposed to basically just learning
how to negotiate someone's proprietary environment and drivers.


Comments appreciated.


--
Good day!

________________________________________
Christopher R. Carlen
Principal Laser&Electronics Technologist
Sandia National Laboratories CA USA
crcarleR...@BOGUSsandia.gov
NOTE, delete texts: "RemoveThis" and
"BOGUS" from email address to reply.

Marc 'BlackJack' Rintsch

Jul 13, 2007, 12:34:23 PM
On Fri, 13 Jul 2007 09:06:44 -0700, Chris Carlen wrote:

> Perhaps the only thing that may have clicked regarding OOP is that in
> certain cases I might prefer a higher-level approach to tasks which
> involve dynamic memory allocation. If I don't need the execution
> efficiency of C, then OOP might produce working results faster by not
> having to worry about the details of memory management, pointers, etc.

That's not something tied to OOP. Automatic memory management is also
possible with procedural languages.

> But I wonder if the OOP programmers spend as much time creating classes
> and trying to organize everything into the OOP paradigm as the C
> programmer spends just writing the code?

Creating classes and organizing the program in an OOP language isn't
different from creating structs and organizing the program in C.

On one hand, Python is a very OOP language, as everything is an object. On
the other hand, it is possible to write parts of the program in procedural
or even functional style. Python is not Java, you don't have to force
everything into classes.
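That mix of styles can be illustrated in a few lines; the sensor-reading example below is invented, but it shows procedural, functional-flavored, and OO code coexisting without anything being forced into a class:

```python
# Procedural: a plain function on plain data.
def average(samples):
    return sum(samples) / len(samples)

# Functional flavor: a comprehension, no classes involved.
def above_threshold(samples, limit):
    return [s for s in samples if s > limit]

# OO, only where bundling state with behaviour actually helps.
class RunningAverage(object):
    def __init__(self):
        self.total = 0.0
        self.count = 0

    def add(self, value):
        self.total += value
        self.count += 1

    def value(self):
        return self.total / self.count

readings = [1.0, 2.0, 6.0]
print(average(readings))               # 3.0
print(above_threshold(readings, 1.5))  # [2.0, 6.0]
```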

From my experience Python makes it easy to "just write the code". Easier
than C because I don't have to deal with so much machine details, don't
have to manage memory, don't need extra indexes for looping over lists and
so on. And the "crashes" are much gentler, telling me what the error is
and where instead of a simple "segfault" or totally messed up results.

Ciao,
Marc 'BlackJack' Rintsch

John Nagle

Jul 13, 2007, 12:54:16 PM
Chris Carlen wrote:
> Hi:
>
> From what I've read of OOP, I don't get it. I have also found some
> articles profoundly critical of OOP. I tend to relate to these articles.
>
> However, those articles were no more objective than the descriptions of
> OOP I've read in making a case. Ie., what objective
> data/studies/research indicates that a particular problem can be solved
> more quickly by the programmer, or that the solution is more efficient
> in execution time/memory usage when implemented via OOP vs. procedural
> programming?
>
> The problem for me is that I've programmed extensively in C and .asm on
> PC DOS way back in 1988. Then didn't program for nearly 10 years during
> which time OOP was popularized. Starting in 1999 I got back into
> programming, but the high-level-ness of PC programming and the
> completely foreign language of OOP repelled me. My work was in analog
> and digital electronics hardware design, so naturally I started working
> with microcontrollers in .asm and C. Most of my work involves low-level
> signal conditioning and real-time control algorithms, so C is about as
> high-level as one can go without seriously loosing efficiency. The
> close-to-the-machine-ness of C is ideal here. This is a realm that I
> truly enjoy and am comfortable with.
>
> Hence, being a hardware designer rather than a computer scientist, I am
> conditioned to think like a machine. I think this is the main reason
> why OOP has always repelled me.

Why?

I've written extensively in C++, including hard real-time programming
in C++ under QNX for a DARPA Grand Challenge vehicle. I have an Atmel
AVR with a cable plugged into the JTAG port sitting on my desk right now.
Even that little thing can be programmed in C++.

You can sometimes get better performance in C++ than in C, because C++
has "inline". Inline expansion happens before optimization, so you
can have abstractions that cost nothing.

If it has state and functions, it probably should be an object.
The instances of the object can be static in C++; dynamic memory
allocation isn't required in C++, as it is in Python.

Python is a relatively easy language, easier than C++, Java,
or even Perl. It's quite forgiving. The main implementation,
CPython, is about 60x slower than C, though, so if you're trying
to implement, say, a rapidly changing digital oscilloscope display,
the result may be sluggish.

John Nagle


Simon Hibbs

Jul 13, 2007, 1:59:09 PM
Chris,

I can fully relate to your post. I trained as a programmer in the 80s
when OOP was an academic novelty, and didn't learn OOP until around
2002. However now I find myself naturally thinking in OOP terms,
although I'm by no means an expert - I'm a sysadmin who writes the
occasional utility. I found learning OOP with Python very easy because
it has such a stripped-down and convenient syntax.

The advantages of OOP aren't in performance or memory, they're in the
fact that OOP simplifies the ways in which we can think about and
solve a problem. OOP packages up the functionality of a program into
logical units (objects) which can be written, debugged and maintained
independently of the rest of the program, almost as if they were
completely separate programs of their own, with their own data and
'user interface' in the form of callable functions (actually methods).

Here's a really excellent tutorial on Python that's fun to follow.
Downloading and installing Python, and following this tutorial, will
probably take about as long as it took to write your post in the first
place. At the end of it you'll have a good idea how OOP works, and how
Python works. Learning OOP this way is easy and painless, and what you
learn about the theory and principles of OOP in Python will be
transferable to C++ if you end up going in that direction.

I hope this was helpful.

Simon Hibbs


Simon Hibbs

Jul 13, 2007, 2:00:21 PM

Sorry, here's the tutorial link:

http://hetland.org/writing/instant-python.html


Simon Hibbs

Chris Carlen

Jul 13, 2007, 2:05:21 PM
John Nagle wrote:
> Chris Carlen wrote:[edit]

>> Hence, being a hardware designer rather than a computer scientist, I
>> am conditioned to think like a machine. I think this is the main
>> reason why OOP has always repelled me.
>
> Why?

When pointers were first explained to me, I went "Ok." And rather
quickly ideas lit up in my head about what I could do with them.

When I read what OOP is, that doesn't happen. All I think is "what's
the point of this?" "What can this do for me that I can do already with
the procedural way of thinking?" And if it can't do anything new, then
why rearrange my thinking to a new terminology? It's results that
matter, not the paradigm.

> I've written extensively in C++, including hard real-time programming
> in C++ under QNX for a DARPA Grand Challenge vehicle.

Did the vehicle win?

> I have an Atmel
> AVR with a cable plugged into the JTAG port sitting on my desk right now.
> Even that little thing can be programmed in C++.

Yes.

> You can sometimes get better performance in C++ than in C, because C++
> has "inline". Inline expansion happens before optimization, so you
> can have abstractions that cost nothing.

That's interesting. But why is this any different than using
preprocessor macros in C?

>
> If it has state and functions, it probably should be an object.
> The instances of the object can be static in C++; dynamic memory
> allocation isn't required in C++, as it is in Python.

Why? Why is OOP any better at explaining a state machine to a computer?
I can write state machines all over the place in C, which tend to be
the core of most of my embedded programs. I can write them with
hardcoded logic if that seems like the easy thing to do and the
probability of extensive changes is extremely low. They are extremely
easy to read and to code. I have written a table-driven state machine
with arbitrary-length input condition lists. The work was all in
designing the data structures. The code to update the state machine was
about 4 lines.

Why would OOP be better? Different is not better. Popular is not
better. What the academics say is not better. Fewer lines of code might
be better, if the priority is ease of programming. Or, less machine
execution time or memory usage might be better, if that is the priority.

Until I can clearly understand why one or the other of those goals might
better be realized for a given problem with OOP vs. procedures, I just
don't get it.

I will keep an open mind, however: until I work with it for some
time there is still the possibility that I will have some light go on
about OOP. So don't worry, I'm not rejecting your input.

> Python is a relatively easy language, easier than C++, Java,
> or even Perl. It's quite forgiving. The main implementation,
> CPython, is about 60x slower than C, though, so if you're trying
> to implement, say, a rapidly changing digital oscilloscope display,
> the result may be sluggish.

Yes, I certainly wouldn't consider Python for that.

Thanks for your comments.

Evan Klitzke

Jul 13, 2007, 2:30:57 PM
to pytho...@python.org
On 7/13/07, John Nagle <na...@animats.com> wrote:
> You can sometimes get better performance in C++ than in C, because C++
> has "inline". Inline expansion happens before optimization, so you
> can have abstractions that cost nothing.

This is a bit off topic, but inline has been a keyword in C since C99.

--
Evan Klitzke <ev...@yelp.com>

Cousin Stanley

Jul 13, 2007, 3:24:26 PM

> ....

> 2. Must be cross-platform: Linux + Windows.
>
> This factor can have a big impact on whether it is necessary
> to learn a new language, or stick with C.
>
> If my platform was only Linux I could just learn GTK
> and be done with it.
> ....

Chris ....

The Python bindings for GTK in the form of a mechanism
deemed PyGTK are also available for Windows and provide
a diverse set of widgets for building GUI applications ....

http://www.pygtk.org

The Applications page lists a rather large and wide variety
of the types of programs that have been built using PyGTK ....

http://www.pygtk.org/applications.html

There is plenty of decent documentation available
and a dedicated newsgroup for assistance if needed ....

--
Stanley C. Kitching
Human Being
Phoenix, Arizona



Neil Cerutti

Jul 13, 2007, 3:29:27 PM
On 2007-07-13, Chris Carlen <crcarleR...@BOGUSsandia.gov> wrote:

> John Nagle wrote:
>> You can sometimes get better performance in C++ than in C,
>> because C++ has "inline". Inline expansion happens before
>> optimization, so you can have abstractions that cost nothing.
>
> That's interesting. But why is this any different than using
> preprocessor macros in C?

This is OT, however: inline functions have a few benefits over
preprocessor macros.

1. They are type-safe.
2. They never evaluate their arguments more than once.
3. They don't require protective parentheses to avoid precedence errors.
4. In C++, they have the additional benefit of being defined in a
namespace, rather than applying globally to a file.

As an experienced C programmer you're probably used to coping
with the problems of preprocessor macros, and may even take
advantage of their untyped nature occasionally. Even C++
programmers still use them advisedly.

> I will keep an open mind however, that until I work with it for
> some time there is still the possibility that I will have some
> light go on about OOP. So don't worry, I'm not rejecting your
> input.

In my opinion OOP is usefully thought of as a type of design
rather than a means of implementation. You can implement an OO
design in a procedural language just fine, but presumably an OO
programming language facilitates the implementation of an OO
design better than does a procedural language.

Going back to the state machine question, and using it as an
example: Assume you design your program as a state machine.
Wouldn't it be easier to implement in a (hypothetical)
state-machine-based programming language than in a procedural
one? I think John was insinuating that a state-machine is more
like an object than it is like a procedure.

--
Neil Cerutti

Bruno Desthuilliers

Jul 14, 2007, 12:01:56 AM
Chris Carlen a écrit :

> Hi:
>
> From what I've read of OOP, I don't get it. I have also found some
> articles profoundly critical of OOP. I tend to relate to these articles.
>
> However, those articles were no more objective than the descriptions of
> OOP I've read in making a case. Ie., what objective
> data/studies/research indicates that a particular problem can be solved
> more quickly by the programmer, or that the solution is more efficient
> in execution time/memory usage when implemented via OOP vs. procedural
> programming?

None. Definitely. wrt/ developer time and memory, it's mostly a
matter of whether it fits your brain. If it does, you'll find it easier, else
choose another programming style. wrt/ cpu time and memory, and using
'low-level' languages (C/C++/Pascal etc) OO is usually worse than
procedural for simple programs. For more complex ones, I'd say it tends
to converge since these programs, when written procedurally, usually
rely on many abstraction/indirection layers.

> The problem for me is that I've programmed extensively in C and .asm on
> PC DOS way back in 1988. Then didn't program for nearly 10 years during
> which time OOP was popularized. Starting in 1999 I got back into
> programming, but the high-level-ness of PC programming and the
> completely foreign language of OOP repelled me. My work was in analog
> and digital electronics hardware design, so naturally I started working
> with microcontrollers in .asm and C. Most of my work involves low-level
> signal conditioning and real-time control algorithms, so C is about as
> high-level as one can go without seriously loosing efficiency.

You may still want to have a look on some more functional languages like
Haskell, OCaml or Erlang. But if you find OO alien, I doubt you'll have
a strong feeling for functional programming.

> The
> close-to-the-machine-ness of C is ideal here. This is a realm that I
> truly enjoy and am comfortable with.
>
> Hence, being a hardware designer rather than a computer scientist, I am
> conditioned to think like a machine. I think this is the main reason
> why OOP has always repelled me.

OTOH, OO is about machines - at least as conceived by Alan Kay, who
invented the term and most of the concept. According to him, each object
is (a simulation of) a small machine.

> Perhaps the only thing that may have clicked regarding OOP is that in
> certain cases I might prefer a higher-level approach to tasks which
> involve dynamic memory allocation.

While OO without automatic memory management can quickly become a major
PITA, OO and GC are two orthogonal concepts - some languages have
builtin support for OO but nothing specific for memory management
(ObjectPascal, C++, ObjectiveC), and some non-OO languages do have
builtin memory management (mostly but not only in the functional camp).

> If I don't need the execution
> efficiency of C, then OOP might produce working results faster by not
> having to worry about the details of memory management, pointers, etc.

It's not a feature of OO per se. But it's clear that not having (too
much) to worry about memory management greatly enhances productivity.

> But I wonder if the OOP programmers spend as much time creating classes
> and trying to organize everything into the OOP paradigm as the C
> programmer spends just writing the code?

Don't you design your programs ? AFAICT, correct design is not easier
with procedural programming.

Now to answer your question, I'd say it depends on your experience of
OO, and of course of the kind of OO language you're using. With
declaratively statically typed languages - like C++, Java etc - you are
forced into a lot of upfront design (way too much IMHO). Dynamic
languages like Smalltalk, Python or Ruby are much more lightweight in
this area, and tend to favor a much more exploratory style - sketch a
quick draft on a napkin, start coding, and evolve the design while
you're coding.

And FWIW, Python doesn't *force* you into OO - while you'll be *using*
objects, you can write most of your code in a procedural way, and only
"fall down" into OO for some very advanced stuff.

> Ultimately I don't care what the *name* is for how I program. I just
> need to produce results.

Indeed !-)

> So that leads back to objectivity. I have a
> problem to solve, and I want to find a solution that is as quick as
> possible to learn and implement.
>
> Problem:
>
> 1. How to most easily learn to write simple PC GUI programs

GUI are one of the best (and more successfull) application of OO - and
as a matter of fact, even GUI toolkits implemented in plain C tend to
take an OO approach (GTK+ being a clear example, but even the old
Pascal/C Mac GUI API does have a somewhat "object based" feeling).

> that will
> send data to remote embedded devices via serial comms, and perhaps
> incorporate some basic (x,y) type graphics display and manipulation
> (simple drawing program). Data may result from user GUI input, or from
> parsing a text config file. Solution need not be efficient in machine
> resource utilization. Emphasis is on quickness with which programmer
> can learn and implement solution.

So what you want is a high-level, easy-to-learn language with a rich
collection of libraries. The Goodnews(tm) is that Python is one of the
possible answers.

> 2. Must be cross-platform: Linux + Windows.

Idem. You can even add most unices and MacOS X to the list.

> This factor can have a big
> impact on whether it is necessary to learn a new language, or stick with
> C. If my platform was only Linux I could just learn GTK and be done
> with it. I wouldn't be here in that case.
>
> Possible solutions:
>
> Form 1: Use C and choose a library that will enable cross-platform GUI
> development.
>
> Pro: Don't have to learn new language.
> Con: Probably will have difficulty with cross-platform implementation
> of serial comms. This will probably need to be done twice. This will
> waste time.

Con: C is a low-level language (not a criticism - it has been designed
so), which greatly impact productivity.
Con: the only serious C (not++) cross-platform GUI toolkit I know is
GTK+, which is less cross-platform than wxWidgets, and *is* OO.

> Form 2: Use Python and PySerial and TkInter or wxWidgets.

I'd probably go for wxWidgets.

> Pro: Cross-platform goal will likely be achieved fully.

Very likely. There are a couple of things to take care of, but nothing
close to what you'd have to do in C.

> Have a
> programmer nearby with extensive experience who can help.
> Con: Must learn new language and library.

Yes, obviously. The (other) GoodNews(tm) is that, according to most
estimations, an experienced programmer can become productive in Python
in a matter of weeks at worst (some manage to become productive in a few
days). This won't mean you'll master the language and use it at its
best, but don't worry, you'll get things done, and perhaps in less time
than with C.

> Must possibly learn a
> completely new way of thinking (OOP)

Not necessarily. While Python is OO all the way down - meaning that
everything you'll work with will be an object (functions included) -, it
doesn't *force* you into OO (IOW : you don't have to define classes to
write a Python program). You can as well use a procedural - or even
somewhat functional - approach, and most Python programs I've seen so
far are usually a mix of the three.

> not just a new language syntax.

You forgot one of the most important part of a language : idioms. And
it's definitively *not* idiomatic in Python to use classes when a
simpler solution (using plain functions and modules) is enough.
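To make that idiom concrete, compare a class used as a mere namespace with the plain function that does the same job; the tiny `key = value` config format below is invented for illustration:

```python
# Unidiomatic: a class that only exists to hold one method.
class ConfigParserThing(object):
    def parse(self, text): ...

# Idiomatic: a plain function at module level does the same job.
def parse_config(text):
    """Parse 'key = value' lines (a deliberately simple made-up format)."""
    result = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        key, _, value = line.partition("=")
        result[key.strip()] = value.strip()
    return result

print(parse_config("baud = 9600\n# comment\nport = COM3"))
# {'baud': '9600', 'port': 'COM3'}
```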

> This might be difficult.

Not necessarily that much.

> Form 3: Use LabVIEW
>
> Pro: I think that the cross-platform goal can be met.
> Con: Expensive. I would prefer to use an Open Source solution. But
> that isn't as important as the $$$. I have also generally found the 2D
> diagrammatical programming language of "G" as repelling as OOP. I
> suspect that it may take as much time to learn LabVIEW as Python.

I don't have much knowledge of LabVIEW so I can't comment on this. But I
remember a thread here about G, and I guess you'll find Python much more
familiar - even if you'll need some 'thinking adjustment' to grok it.

> In
> that case the time spent on Python might be better spent since I would
> be learning something foundational as opposed to basically just learning
> how to negotiate someone's proprietary environment and drivers.

IMHO, the biggest gain (in learning Python vs LabVIEW) is that you'll
add a very valuable tool to your toolbox - the missing link between C
and shell scripts.

>
> Comments appreciated.
>
HTH

Bruno Desthuilliers

Jul 14, 2007, 12:20:33 AM
Chris Carlen a écrit :
(snip)

>
> Why? Why is OOP any better at explaining a state machine to a computer?

I don't know if it's "better", but state machines are the historical
starting point of OO with the Simula language.

> I can write state machines all over the place in C,

And even in assembler - so why use C ?-)

> which tend to be
> the core of most of my embedded programs. I can write them with
> hardcoded logic if that seems like the easy thing to do any the
> probability of extensive changes is extremely low. They are extremely
> easy to read and to code. I have written a table-driven state machine
> with arbitrary-length input condition lists. The work was all in
> designing the data structures.

Which is another approach to OO. When programming in C, you do use
structs, don't you ? And you do write functions operating on instances
of these structs ? And possibly, turn these structs into ADT ? Well, one
possible definition of "objects" is "ADT + polymorphism".

> Why would OOP be better?

Whoever pretends it's absolutely "better" should be shot down. I do find
OO *easier* than pure procedural programming, but I started programming
with mostly OO (or at least object-based) languages, and only then
learned pure procedural languages (and then bits of functional
programming). It's not a matter of being "better", it's a matter of what
style fits your brain. If OO doesn't fit your brain, then it certainly
won't be "better" *for you*.

> Different is not better. Popular is not
> better. What the academics say is not better. Less lines of code might
> be better, if the priority is ease of programming.

and maintenance, and robustness (AFAICT, the defect/LOC ratio is
somewhat constant whatever the language, so less code means fewer bugs).

> Or, less machine
> execution time or memory usage might be better, if that is the priority.

Indeed.

> Until I can clearly understand why one or the other of those goals might
> better be realized for a given problem with OOP vs. procedures, I just
> don't get it.

Seems quite sane.

> I will keep an open mind however, that until I work with it for some
> time there is still the possibility that I will have some light go on
> about OOP. So don't worry, I'm not rejecting your input.
>
>> Python is a relatively easy language, easier than C++, Java,
>> or even Perl. It's quite forgiving. The main implementation,
>> CPython, is about 60x slower than C, though,

This is a very simplistic - and as such, debatable - assertion IMHO. On
my Linux box, a cat-like program is hardly faster in C than in Python
(obviously since such a program is IO bound, and both implementations
will use the native IO libs), and for quite a few computation-heavy
tasks, there are Python bindings to highly optimised C (or C++) libs. So
while it's clear that Python is not about raw execution speed, it's
usually quite correct for most applicative tasks. And when it isn't,
well, it's always possible to recode the critical parts in Pyrex or C.

samwyse

Jul 13, 2007, 4:20:35 PM
On Jul 13, 1:05 pm, Chris Carlen <crcarleRemoveT...@BOGUSsandia.gov>
wrote:

> John Nagle wrote:
> > Chris Carlen wrote:[edit]
> >> Hence, being a hardware designer rather than a computer scientist, I
> >> am conditioned to think like a machine. I think this is the main
> >> reason why OOP has always repelled me.
>
> > Why?
>
> When pointers were first explined to me, I went "Ok." And rather
> quickly ideas lit up in my head about what I could do with them.
>
> When I read what OOP is, that doesn't happen. All I think is "what's
> the point of this?" "What can this do for me that I can do already with
> the procedural way of thinking?" And if it can't do anything new, then
> why rearrange my thinking to a new terminology? It's results that
> matter, not the paradigm.

What can this do for me that I can do already with the procedural way
of thinking? Absolutely nothing; it's all Turing machines under the
hood.

Why rearrange my thinking to a new terminology? Because new
terminologies matter a lot. There's nothing that you can do with
pointers that can't be done with arrays; I know because I wrote a lot
of FORTRAN 77 code back in the day, and without pointers I had to
write my own memory allocation routines that worked off of a really
big array.

Likewise, there's nothing that you can do in C that can't be done with
C++ (especially since C++ was originally a preprocessor for C);
however C++ will keep track of a lot of low-level detail for you so
you don't have to think about it. Let's say that you have an embedded
single-board computer with a serial and a parallel port. You probably
have two different routines that you use to talk to them, and you have
to always keep track which you are using at any given time.

It's a lot easier to have a single CommPort virtual class that you use
in all of your code, and then have two sub-classes, one for serial
ports and one for parallel. You'll be especially happy for this when
someone decides that as well as logging trace information to a
printer, it would be nice to also log it to a technician's handheld
diagnostic device.
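A Python sketch of that CommPort idea might look like the following; the class and method names are invented for illustration, and the actual serial/parallel I/O is stubbed out with strings:

```python
class CommPort(object):
    """Common interface; subclasses supply the actual transport."""
    def send(self, data):
        raise NotImplementedError

class SerialPort(CommPort):
    def send(self, data):
        # A real implementation would write to the UART here.
        return "serial:" + data

class ParallelPort(CommPort):
    def send(self, data):
        # A real implementation would strobe the parallel port here.
        return "parallel:" + data

def log_trace(port, message):
    """Code like this never needs to know which kind of port it has."""
    return port.send(message)

for port in (SerialPort(), ParallelPort()):
    print(log_trace(port, "hello"))
```

Adding the handheld diagnostic device later would then mean writing one more subclass, with no change to the code that calls `log_trace`.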

Chris Carlen

Jul 13, 2007, 4:32:30 PM
Neil Cerutti wrote:
> Going back to the stack machine question, and using it as an
> example: Assume you design your program as a state machine.
> Wouldn't it be easier to implement in a (hypothetical)
> state-machine-based programming language than in a procedural
> one? I think John was insinuating that a state-machine is more
> like an object than it is like a procedure.

I think at this point, I should stop questioning and just learn for a while.

But regarding state machines, I had probably written a few in C the past
before really understanding that it was a state machine. Much later I
grasped state machines from digital logic. Then it became much clearer
how to use them as a tool and to code them intentionally.

Once I have written a state table, I can implement using flip-flops and
gates or in C as either a state variable and a switch statement or
something table driven. The switch code can be written as fast as I can
read through the state table. That's the easiest implementation, but
the least easy to change later unless it's fairly small.

I will be eager to see how to do this in Python.
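For what it's worth, one common Python idiom for a table-driven state machine is a dictionary keyed on (state, input) pairs. The toy turnstile below is an invented example, but the update step really is just a lookup:

```python
# Transition table for a toy turnstile: (state, input) -> next state.
TABLE = {
    ("locked", "coin"): "unlocked",
    ("locked", "push"): "locked",
    ("unlocked", "push"): "locked",
    ("unlocked", "coin"): "unlocked",
}

def step(state, event):
    # The whole state-machine update: look up the pair,
    # staying in the current state on unknown events.
    return TABLE.get((state, event), state)

state = "locked"
for event in ["push", "coin", "push"]:
    state = step(state, event)
print(state)  # locked
```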

I have found the comments in response to my doubts about OOP very
encouraging. I will do some learning, and come back when I have more
Python specific problems...

Thanks for the input!

Chris Carlen

Jul 13, 2007, 4:34:20 PM


Thanks Simon. Actually, that's the tutorial that I've started with.

Your comments are encouraging. I'll keep learning.

Chris Carlen
Jul 13, 2007, 4:39:09 PM
Bruno Desthuilliers wrote:
> Chris Carlen wrote:
>[edit]

>> Must possibly learn a completely new way of thinking (OOP)
>
> Not necessarily. While Python is OO all the way down - meaning that
> everything you'll work with will be an object (functions included) -, it
> doesn't *force* you into OO (IOW : you don't have to define classes to
> write a Python program). You can as well use a procedural - or even
> somewhat functional - approach, and most Python programs I've seen so
> far are usually a mix of the three.
>
>> not just a new language syntax.
>
> You forgot one of the most important parts of a language: idioms. And
> it's definitively *not* idiomatic in Python to use classes when a
> simpler solution (using plain functions and modules) is enough.

I see. That's very promising. I guess some articles I read painted a
picture of religiosity among OOP programmers. But that is not the
impression I am getting at all on the street.

> IMHO, the biggest gain (in learning Python vs LabVIEW) is that you'll
> add a very valuable tool to your toolbox - the missing link between C
> and shell scripts.


Thanks for the comments!

Bruno Desthuilliers
Jul 13, 2007, 5:06:41 PM
Chris Carlen wrote:

> Bruno Desthuilliers wrote:
>
>> Chris Carlen wrote:
>
> >[edit]
>
>>> Must possibly learn a completely new way of thinking (OOP)
>>
>>
>> Not necessarily. While Python is OO all the way down - meaning that
>> everything you'll work with will be an object (functions included) -,
>> it doesn't *force* you into OO (IOW : you don't have to define classes
>> to write a Python program). You can as well use a procedural - or even
>> somewhat functional - approach, and most Python programs I've seen so
>> far are usually a mix of the three.
>>
>>> not just a new language syntax.
>>
>>
>> You forgot one of the most important parts of a language: idioms. And
>> it's definitively *not* idiomatic in Python to use classes when a
>> simpler solution (using plain functions and modules) is enough.
>
>
> I see. That's very promising. I guess some articles I read painted a
> picture of religiosity among OOP programmers.

That's alas a common disease - I'd say the surest way to be definitively
disgusted with OO is to read comp.lang.object :(

> But that is not the
> impression I am getting at all on the street.

Heck. As you said, the important thing is to get things done. And I guess
that's why we all (here) love Python. Last time I had to work on a
Pascal program (actually Delphi's ObjectPascal, but the whole thing was
almost caricaturally procedural), I found myself having to write tens of
lines of code for things that would have been no-brainer one-liners in
Python, and define new types (records - Pascal's structs) where Python's
builtin dict type would have done the trick. It's not a matter of
procedural vs OO vs functional, it's a matter of using the appropriate
tool for the job.

Wayne Brehaut
Jul 13, 2007, 6:20:29 PM
On Sat, 14 Jul 2007 06:01:56 +0200, Bruno Desthuilliers
<bdesth.qu...@free.quelquepart.fr> wrote:

>Chris Carlen wrote:
>> Hi:
>>
>> From what I've read of OOP, I don't get it. I have also found some
>> articles profoundly critical of OOP. I tend to relate to these articles.
>>

=== 8< ===

>>
>> Hence, being a hardware designer rather than a computer scientist, I am
>> conditioned to think like a machine. I think this is the main reason
>> why OOP has always repelled me.
>
>OTOH, OO is about machines - at least as conceived by Alan Kay, who
>invented the term and most of the concept. According to him, each object
>is a (simulation of) a small machine.

Oh you young'uns, not versed in The Ancient Lore, but filled with
self-serving propaganda from Xerox PARC, Alan Kay, and Smalltalk
adherents everywhere!

As a few more enlightened have noted in more than one thread here, the
Mother of All OOP was Simula (then known as SIMULA 67). All Alan Kay
did was define "OOPL", but then didn't notice (apparently--though this
may have been a "convenient oversight") that Simula satisfied all the
criteria so was actually the first OOPL--and at least 10 years earlier
than Smalltalk!

So Kay actually invented NONE of the concepts that make a PL an OOPL.
He only stated the concepts concisely and named the result OOP, and
invented yet another implementation of the concepts-- based on a
LISP-like functional syntax instead of an Algol-60 procedural syntax,
and using message-passing for communication amongst objects (and
assumed a GUI-based IDE) (and introduced some new terminology,
especially use of the term "method" to distinguish class and instance
procedures and functions, which Simula hadn't done).

As Randy Gest notes on http://www.smalltalk.org/alankay.html, "The
major ideas in Smalltalk are generally credited to Alan Kay with many
roots in Simula, LISP and SketchPad." Too many seem to assume that
some of these other "features" of Smalltalk are part of the definition
of an OOP, and so are misled into believing the claim that it was the
first OOPL. Or they claim that certain deficiencies in Simula's object
model--as compared to Smalltalk's--somehow disqualifies it as a "true
OOPL", even though it satisfies all the criteria as stated by Kay in
his definition. See http://en.wikipedia.org/wiki/Simula and related
pages, and "The History of Programming Languages I (HOPL I)", for
more details.

Under a claim of Academic Impunity (or was that "Immunity"), here's
another historical tid-bit. In a previous employment we once had a
faculty applicant from CalTech who knew we were using Simula as our
introductory and core language in our CS program, so he visited Xerox
PARC before coming for his interview. His estimate of Alan Kay and
Smalltalk at that time (early 80s) was that "They wanted to implement
Simula but didn't understand it--so they invented Smalltalk and now
don't understand _it_!"

wwwayne

=== 8< ===

Aahz
Jul 13, 2007, 6:47:37 PM
In article <f787u...@news4.newsguy.com>,

Chris Carlen <crcarleR...@BOGUSsandia.gov> wrote:
>
>From what I've read of OOP, I don't get it.

For that matter, even using OOP a bit with C++ and Perl, I didn't get it
until I learned Python.

>The problem for me is that I've programmed extensively in C and .asm on
>PC DOS way back in 1988.

Newbie. ;-)

(I started with BASIC in 1976.)

>Form 2: Use Python and PySerial and TkInter or wxWidgets.
>
>Pro: Cross-platform goal will likely be achieved fully. Have a
>programmer nearby with extensive experience who can help.
>Con: Must learn new language and library. Must possibly learn a
>completely new way of thinking (OOP) not just a new language syntax.
>This might be difficult.

My experience is that learning GUI programming is difficult. Moreover,
GUI programming in C involves a lot of boilerplate that can be automated
more easily with Python. So I think this will be a better solution.

Note very very carefully that Python does not require an OOP style of
programming, but it will almost certainly be the case that you just
naturally start using OOP techniques as you learn Python.
--
Aahz (aa...@pythoncraft.com) <*> http://www.pythoncraft.com/

I support the RKAB

Tony23
Jul 13, 2007, 8:09:47 PM
Chris Carlen wrote:
> John Nagle wrote:
>> Chris Carlen wrote:[edit]
>>> Hence, being a hardware designer rather than a computer scientist, I
>>> am conditioned to think like a machine. I think this is the main
>>> reason why OOP has always repelled me.
>>
>> Why?
>
> When pointers were first explained to me, I went "Ok." And rather
> quickly ideas lit up in my head about what I could do with them.
>
> When I read what OOP is, that doesn't happen. All I think is "what's
> the point of this?" "What can this do for me that I can do already with
> the procedural way of thinking?" And if it can't do anything new, then
> why rearrange my thinking to a new terminology? It's results that
> matter, not the paradigm.

I have been programming since 1978. I started off with BASIC, learned
Assembly and Pascal, and much later eventually moved on to Javascript,
Perl, and PHP. All of my work was done procedurally.

Recently, I have been working on a very large project involving a lot of
OO-Javascript. For what we are doing on the project, OO makes sense. I
really didn't get OOP until working on this project - probably because I
never did anything that really needed it.

I have found myself leaning more toward the OO paradigm since doing
this, after 25+ years of procedural programming, and now I find myself
doing more work with OO concepts, and getting things done even faster,
and with less work, than I used to.

But I still have a problem with STRICT OOP - which is why I like Python.
Use OO where it's useful, use procedural when that works best.

I suspect that the reason it isn't clicking for you is twofold: 1) You
don't do anything currently that has an obvious need for OOP, and 2) You
haven't done anything with OOP.

A couple ideas:

1) Maybe you can try building a relatively trivial program that would
more naturally use an OO methodology - perhaps a simple videogame like
Pac-man? The 'monsters' would be objects, with properties such as color,
X-position, Y-position, etc. - make yourself work in OO terms

2) This may seem silly, but download & play with "Scratch"
(http://scratch.mit.edu) - it's basically an introduction to programming
for kids, but it's completely OO, and super easy to use. It might be
useful to help you to see the 'grand view' better.

3) Give in to the dark side :)
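Idea 1) can start as small as this (a hypothetical sketch; the names are invented): each monster is an object carrying its own state, while all monsters share one definition of behavior:

```python
class Monster:
    """One Pac-man ghost: bundled state (color, position) plus behavior."""
    def __init__(self, color, x, y):
        self.color = color
        self.x = x
        self.y = y

    def move(self, dx, dy):
        # Behavior is written once; every instance uses it on its own state.
        self.x += dx
        self.y += dy

blinky = Monster("red", 10, 5)
pinky = Monster("pink", 3, 5)
blinky.move(1, 0)   # moves blinky only; pinky's position is untouched
```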

Good luck - after so much time invested in one way of thinking, it's not
easy to change.

Chris Carlen
Jul 13, 2007, 8:12:33 PM
Aahz wrote:
> In article <f787u...@news4.newsguy.com>,
> Chris Carlen <crcarleR...@BOGUSsandia.gov> wrote:
>>From what I've read of OOP, I don't get it.
>
> For that matter, even using OOP a bit with C++ and Perl, I didn't get it
> until I learned Python.
>
>>The problem for me is that I've programmed extensively in C and .asm on
>>PC DOS way back in 1988.
>
> Newbie. ;-)
>
> (I started with BASIC in 1976.)

Heh heh, I actually first programmed when the RadioShack TRS-80 came
out. I think I saw it first in 1978 when I was 11. I would hang out in
the store for hours writing crude video games.

> My experience is that learning GUI programming is difficult. Moreover,
> GUI programming in C involves a lot of boilerplate that can be automated
> more easily with Python. So I think this will be a better solution.
>
> Note very very carefully that Python does not require an OOP style of
> programming, but it will almost certainly be the case that you just
> naturally start using OOP techniques as you learn Python.


Thanks for the input!

Neil Cerutti
Jul 13, 2007, 8:33:39 PM
On 2007-07-13, Wayne Brehaut <wbre...@mcsnet.ca> wrote:
> So Kay actually invented NONE of the concepts that make a PL an
> OOPL. He only stated the concepts concisely and named the
> result OOP,

Naming and categorizing something shouldn't be underestimated as
an accomplishment, though. The exercise can have profound
results. For example, consider "marriage." ;)

> Under a claim of Academic Impunity (or was that "Immunity"),
> here's another historical tid-bit. In a previous employment we
> once had a faculty applicant from CalTech who knew we were
> using Simula as our introductory and core language in our CS
> program, so he visited Xerox PARC before coming for his
> interview. His estimate of Alan Kay and Smalltalk at that time
> (early 80s) was that "They wanted to implement Simula but
> didn't understand it--so they invented Smalltalk and now don't
> understand _it_!"

Heh, heh. Thanks for the interesting info.

--
Neil Cerutti

Steve Holden
Jul 13, 2007, 8:37:04 PM
Aahz wrote:
> In article <f787u...@news4.newsguy.com>,
> Chris Carlen <crcarleR...@BOGUSsandia.gov> wrote:
>>From what I've read of OOP, I don't get it.
>
> For that matter, even using OOP a bit with C++ and Perl, I didn't get it
> until I learned Python.
>
>> The problem for me is that I've programmed extensively in C and .asm on
>> PC DOS way back in 1988.
>
> Newbie. ;-)
>
> (I started with BASIC in 1976.)
>
Newbie ;-)

(I started with Algol 60 in 1967).

>> Form 2: Use Python and PySerial and TkInter or wxWidgets.
>>
>> Pro: Cross-platform goal will likely be achieved fully. Have a
>> programmer nearby with extensive experience who can help.
>> Con: Must learn new language and library. Must possibly learn a
>> completely new way of thinking (OOP) not just a new language syntax.
>> This might be difficult.
>
> My experience is that learning GUI programming is difficult. Moreover,
> GUI programming in C involves a lot of boilerplate that can be automated
> more easily with Python. So I think this will be a better solution.
>

I used to write in C for the SunView platform (back in the days when the
GUI was integrated into the kernel as the only way to get acceptable
speed on the display). From what I remember, "Hello World" took about 40
lines.

The immense (relatively speaking: this was 1985) size of the libraries
required was one of the primary justifications for implementing shared
libraries.

> Note very very carefully that Python does not require an OOP style of
> programming, but it will almost certainly be the case that you just
> naturally start using OOP techniques as you learn Python.

That's very true. I still use a lot of (perhaps too much) procedural
coding, but driving the object-oriented libraries is a great way for a
noob to get started in OOP.

regards
Steve
--
Steve Holden +1 571 484 6266 +1 800 494 3119
Holden Web LLC/Ltd http://www.holdenweb.com
Skype: holdenweb http://del.icio.us/steve.holden
--------------- Asciimercial ------------------
Get on the web: Blog, lens and tag the Internet
Many services currently offer free registration
----------- Thank You for Reading -------------

Björn Lindqvist
Jul 13, 2007, 9:49:31 PM
On 7/13/07, John Nagle <na...@animats.com> wrote:
> You can sometimes get better performance in C++ than in C, because C++
> has "inline". Inline expansion happens before optimization, so you
> can have abstractions that cost nothing.

C99 has that too.

> Python is a relatively easy language, easier than C++, Java,
> or even Perl. It's quite forgiving. The main implementation,
> CPython, is about 60x slower than C, though, so if you're trying
> to implement, say, a rapidly changing digital oscilloscope display,
> the result may be sluggish.

But if the data for that oscilloscope comes from an external device
connected via a serial port, execution speed won't matter.


--
mvh Björn

Alex Martelli
Jul 14, 2007, 12:43:43 AM
Chris Carlen <crcarleR...@BOGUSsandia.gov> wrote:

> From what I've read of OOP, I don't get it. I have also found some
> articles profoundly critical of OOP. I tend to relate to these articles.

OOP can be abused (particularly with deep or intricate inheritance
structures). But the base concept is simple and clear: you can bundle
state and behavior into a stateful "black box" (of which you may make as
many instances, with independent state but equal behavior, as you need).

> Hence, being a hardware designer rather than a computer scientist, I am
> conditioned to think like a machine. I think this is the main reason
> why OOP has always repelled me.

I'm an MS in EE (minoring in computer engineering) by training (over a
quarter century ago:-); I "slid" into programming kind of inexorably
(fate obviously wanted me to:-) but paradigms such as OOP (and
functional programming, but that's another subject) always made sense to
me *in direct analogy to my main discipline*. A JK flip-flop and a D
flip-flop are objects with 1-bit states and different behavior; I may
put in my circuit as many (e.g.) J-K flip-flops as I need, and each will
have separate state, even though each will have identical behavior (how
it responds to signals on the J and K lines). I don't need to think
about how a J-K flip-flop is *made*, inside; I use it as a basic
component in designing richer circuits (well, I did back when I DID
design circuits, but I haven't _totally_ forgotten:-). I do know how to
make one in terms of transistors, should I ever need to (well, maybe I'd
have to look it up, but I _used_ to know:-), but such a need is unlikely
to arise, because it's likely to be there as a basic component in
whatever design library I'm supposed to use for this IC.
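The analogy maps almost directly onto a Python class (an illustrative sketch, not from the original post): a J-K flip-flop is one bit of state plus the rule for what happens on a clock edge, and you instantiate as many as the circuit needs.

```python
class JKFlipFlop:
    """One bit of state, plus behavior: the response to a clock edge."""
    def __init__(self):
        self.q = 0

    def clock(self, j, k):
        if j and k:
            self.q ^= 1     # J=K=1: toggle
        elif j:
            self.q = 1      # J=1, K=0: set
        elif k:
            self.q = 0      # J=0, K=1: reset
        return self.q       # J=K=0: hold

# Two instances: identical behavior, independent state -
# just like two J-K packages dropped into the same circuit.
a, b = JKFlipFlop(), JKFlipFlop()
a.clock(1, 0)   # set a; b is unaffected
```

As with the hardware part, the user of the class never needs to know how it is made inside.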

Components much richer than J-K flip-flops are obviously more common
nowadays, but remember my real-world experience designing chips is from
the early '80s;-). Nevertheless the concept of a "bundle of state and
behavior" is still there -- and a direct, immediate analogy to OOP.
(Functional programming, OTOH, is analogous to stateless input-output
transformation circuits, an even more basic concept in HW design:-). If
anything, it's the concept of "procedural programming" that has no
direct equivalent in HW design (unless you consider microcode "HW", and,
personally, I don't;-). [[Fortunately as a part of the CE minor I did
learn Fortran, Lisp and Pascal, and a few machine-languages too, so I
wasn't totally blown away when I found myself earning a living by
programming rather than by designing chips, but that's another
story:-)]]


Alex

James Stroud
Jul 14, 2007, 2:49:56 AM
Chris Carlen wrote:
> Hi:
>
> From what I've read of OOP, I don't get it. I have also found some
> articles profoundly critical of OOP.

I've also found articles critical of Darwinism--but we can chalk that up
to religious zealotry, can't we?

Any gui more complicated than a few entry fields and some checkbuttons
is going to lend itself to OOP--so if you want to do GUI, learn OOP. The
time you spend learning OOP will be about 1/10th the time required to
debug a modestly complicated gui. This is especially true of guis that
require real-time feedback behavior.

If you just want to enter some values and set some flags and then hit
"go", you could always program the GUI in HTML and have a cgi script
process the result. This has a lot of benefits that are frequently
overlooked but tend to be less fun than using a bona-fide toolkit like
WX or QT.

James

Hendrik van Rooyen
Jul 14, 2007, 3:43:37 AM
"Aahz" <aahz@pyt...aft.com> wrote:

> Newbie. ;-)
>
> (I started with BASIC in 1976.)
>

*grinz @ Newbie*

I was writing COBOL and NEAT/3 in 1968...

- Hendrik

Hendrik van Rooyen
Jul 14, 2007, 3:01:22 AM

"Chris Carlen" <crcarl,,,,dia.gov> wrote:

> Form 2: Use Python and PySerial and TkInter or wxWidgets.
>
> Pro: Cross-platform goal will likely be achieved fully. Have a
> programmer nearby with extensive experience who can help.
> Con: Must learn new language and library. Must possibly learn a
> completely new way of thinking (OOP) not just a new language syntax.
> This might be difficult.
>

This is the way to go. - Trust me on this.

When you describe your history, it is almost an exact parallel to mine.
In my case, I have been doing real low level stuff (mostly 8031 assembler)
since 1982 or so. And then I found python in a GSM module (Telit), and
I was intrigued.

I really appreciate your comments on OO - it parallels a lot of what I feel
as there is a lot of apparent BS that does not seem to "do anything" at first
sight.

However- for the GUI stuff, there is an easily understood relationship between
the objects and what you see on the screen - so its a great way of getting
into OO - as far as people like you and me will go with it, which is not very
far, as we tend to think in machine instructions...

And for what it's worth - you can program assembler-like Python, and it also
works.

The best thing to do is just to spend a few days playing with say Tkinter.
I use a reference from the web written by John W Shipman at New Mexico
Tech - it is succinct and clear, and deserves more widespread publicity.

Google for it - I have lost the link, although I still have the pdf file.

You will also find the interactive prompt that you get when you type
python at a command prompt invaluable - it lets you play with and debug
small code snippets so that you can learn as you go along - it really speeds
up the whole learning process, and makes it almost painless.

All this talking is just wasting time - you could have had your first frame up
on the screen already, with a blank canvas, ready for drawing. It really goes
that quick, once you start.

So the answer to the title question is: Yes - a low level programmer can learn
OOP, and its in fact easier than it looks, as almost all the heavy lifting has
been done for you by others.

- Hendrik

Michele Simionato
Jul 14, 2007, 4:16:09 AM
On Jul 14, 8:49 am, James Stroud <jstr...@mbi.ucla.edu> wrote:
>
> Any gui more complicated than a few entry fields and some checkbuttons
> is going to lend itself to OOP--so if you want to do GUI, learn OOP.

Yep, there is nothing to be added to that. Except maybe that if you
don't care too much about the look&feel you may consider starting with
Tkinter. Pros:

1. it is part of the standard library, and you already have it;
2. it is possibly the easiest/simplest GUI out there;
3. it runs pretty much everywhere with minimal fuss.

Michele Simionato

Dave Baum
Jul 13, 2007, 5:59:02 PM
In article <f78et...@news1.newsguy.com>,
Chris Carlen <crcarleR...@BOGUSsandia.gov> wrote:

> Why would OOP be better? Different is not better. Popular is not
> better. What the academics say is not better. Less lines of code might
> be better, if the priority is ease of programming. Or, less machine
> execution time or memory usage might be better, if that is the priority.

Short answer: Increasing programmer productivity is better, and OO
frequently accomplishes this.

Longer answer:

Consider OOP as one tool in the toolbox. It isn't "best" for every
conceivable problem, but neither is procedural programming, functional
programming, table driven state machines, or any other style of design
and/or programming. Each works well in some situations and poorly in
others. Having a large number of tools at your disposal, and knowing
which ones to use, is a big plus.

Let's talk about design versus implementation for a moment, since OO
really applies to both, but in different ways. You mentioned state
machines, and that is a good example of a design technique. You can
look at a problem and convert it to a state machine (design), then
implement the state machine in whatever language your computer
understands. Over the years, finite state machines have proven to be
very effective models because:

1) there are plenty of real world problems that map cleanly to a state
machine

2) state machines are easy to implement reliably in most computer
languages

3) if you have already implemented a state machine for problem A, then
implementing it for problem B is pretty easy - the real work is
translating problem B into a state machine

OOD is similar. There are a large number of problems for which an
object oriented design is a good fit. Once you have an OO design, you
then need to implement it, and languages with good OO support make this
a lot easier.

From what I have seen, the advantages of OO tend to increase with the
size of the project. For example, polymorphism generally lets M clients
work with N different kinds of objects by writing M+N chunks of code
rather than M*N. When M or N is small, this difference in minor, but as
M and N increase, it becomes significant.
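A minimal sketch of that M+N arithmetic (classes and names invented for illustration): here M=2 clients, each written once against a common interface, work with N=2 kinds of object, so there are 4 chunks of code instead of 4 client-times-kind combinations:

```python
class Circle:
    def __init__(self, r):
        self.r = r
    def area(self):
        return 3.14159 * self.r ** 2

class Square:
    def __init__(self, side):
        self.side = side
    def area(self):
        return self.side ** 2

# M client chunks, each written once against the area() interface...
def describe(shape):
    return "area = %.2f" % shape.area()

def total_area(shapes):
    return sum(s.area() for s in shapes)

# ...work with all N shape classes: M + N chunks instead of M * N.
```

Adding a Triangle class (N+1) requires no change to either client; adding a third client (M+1) requires no change to any shape.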

By combining state and function, objects provide a good means of
encapsulating operations and keeping client code independent of lower
level code. This is a very big win since it allows for the evolution of
the lower level code without breaking all of the client code. As with
polymorphism, the benefits of encapsulation tend to increase with the
size of the project.

Even before OO languages were popular, it was quite common to use some
basic OO design in order to increase encapsulation. If you look at
almost any GUI framework from the 80's or early 90's you'll find lots of
C code with structs that the user is not supposed to mess with, and then
functions that take pointers/handles/indexes to those "magic" structs as
the first argument. This is really OO design implemented in a
procedural language. In fact, GUI frameworks are an excellent example
of a domain for which OO has established itself a very good way to model
the problem.
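That pattern transliterates into a few lines of Python (a hypothetical sketch, not any real framework's API): an opaque record the caller isn't supposed to poke at, plus functions that take it as their first argument - which is exactly object-plus-methods, spelled procedurally:

```python
# "OO design in a procedural language": a magic struct plus functions
# that all take a handle to it as their first argument.
def window_new(title):
    return {"title": title, "visible": False}   # the opaque record

def window_show(win):
    win["visible"] = True

def window_title(win):
    return win["title"]

w = window_new("scope display")
window_show(w)
```

Rewriting `window_show(w)` as `w.show()` is then mostly a change of spelling, which is why this style is a natural stepping stone to OO proper.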

I could probably spend a lot more time on the merits of OO, but I think
if you really want to understand its benefits you need to work with it
in a domain for which OO is useful. It is possible that the specific
projects you work on really wouldn't benefit much from OO, and that is
why you haven't had the "a-ha!" moment. Forcing an OO model onto a
problem just for the sake of OO will only result in frustration.

Dave

John J. Lee
Jul 14, 2007, 8:54:26 AM
[Chris Carlen]

> From what I've read of OOP, I don't get it. I have also found some
> articles profoundly critical of OOP. I tend to relate to these
> articles.

If you want to know the truth, and opt to neither trust a friend or
colleague, nor spend the time to try it yourself, here's a third way:

Compile Qt (a toolkit like wx or Tk) and watch the list of source file
names scroll past. Beautiful! Perhaps there's some better way of
doing GUIs, but watching that list of source files, one realises that
that's an academic question: practically, OOP fits GUIs -- and much of
the other code in Qt -- so well, and so much effort has been put into
these GUI toolkit libraries, that one would be a fool not to use them
right now. A somewhat separate issue: You'd also be a fool not to
apply OOP to the GUI code *you* write *using* one of those OOP GUI
toolkits. Though you won't learn that all at once or without
conscious effort, that's not an obstacle with Python -- you can start
small.

Of course there's some level of experience / project size / project
longevity / number of people involved below which dashing it off using
what you know right now will be quicker, but the break-even point is
not far off in your case, I think.


[chris]


> However, those articles were no more objective than the descriptions
> of OOP I've read in making a case. Ie., what objective
> data/studies/research indicates that a particular problem can be
> solved more quickly by the programmer, or that the solution is more
> efficient in execution time/memory usage when implemented via OOP
> vs. procedural programming?

[bruno]


> None. Definitively. wrt/ developer time and memory, it's mostly a
> matter of fit-your-brains. If it does, you'll find it easier, else

[...]

How do we have confidence that that's true without doing experiments?
AFAIK, only a few such experiments have been done (not counting
research that does not meet basic standards of competence or is not
peer-reviewed).

I think some programming techniques are simply better than others for
certain tasks, even when including the variation in people's abilities
(but excluding the cost of people learning those techniques, which can
of course be significant). Of course, measurement is tricky because
of differences between programmers, but it's not impossible.


John

John J. Lee
Jul 14, 2007, 9:11:07 AM
aa...@pythoncraft.com (Aahz) writes:
[...]

> Note very very carefully that Python does not require an OOP style of
> programming,

agree


> but it will almost certainly be the case that you just
> naturally start using OOP techniques as you learn Python.

There's some truth to this. But stagnation is also very easy to
achieve, without conscious effort to improve.

Also, reading OOP books (and this list) is still beneficial, both
before and after you've understood each concept: before because it
helps to learn new concepts at a faster rate, and to learn concepts
you'd otherwise miss; after because it helps "clean up" and extend
your understanding and because it teaches you standard names for
things, helping communication.


John

Rustom Mody
Jul 14, 2007, 9:48:05 AM
On 7/14/07, Alex Martelli <al...@mac.com> wrote:
>
> OOP can be abused (particularly with deep or intricate inheritance
> structures). But the base concept is simple and clear: you can bundle
> state and behavior into a stateful "black box" (of which you may make as
> many instances, with independent state but equal behavior, as you need).
>

Many years ago (86??) Wegner wrote a paper in OOPSLA called Dimensions
of Object Orientation in which he called the 'base concept' of 'bundle
of state and behavior' as 'object based' programming and
'object-oriented' as object-based + inheritance.

What Alex is saying is (in effect) that object-based is simple and
clear (and useful) whereas the object-orientation is subject to abuse.

This anyway is my experience: C++ programmers are distinctly poorer
programmers than C programmers -- for some strange reason closeness to
the machine has a salutary effect whereas the encouragement of
uselessly over-engineered programs makes worse programmers.

GUI is one of those cases wherein inheritance actually helps people
produce better code but this is something of an exception.

And even here one of the most widely used popularisers of GUIs has
been VB which was (at least initially) not object-oriented. VB shows
that language orientation -- tailoring 'the language' of drag-n-drop
to GUI-building and not just GUI-use -- wins over OOP.
Ruby/Rails is another example of language-oriented programming though
I feel it goes too far in (de)capitalizing, pluralizing,
(de)hyphenizing etc towards 'readability'.
[Sorry if this offends some people -- just my view!]

And this makes me wonder: It seems that Tkinter, wxpython, pygtk etc
are so much more popular among pythonistas than glade, dabo etc.

Why is this?

UrsusM...@gmail.com
Jul 14, 2007, 12:25:50 PM
The Tkinter tutorial referred to is at http://infohost.nmt.edu/tcc/help/pubs/tkinter//
and it is a great starting point ...

Ron Stephens

Wayne Brehaut
Jul 14, 2007, 1:39:36 PM
On Fri, 13 Jul 2007 20:37:04 -0400, Steve Holden <st...@holdenweb.com>
wrote:

>Aahz wrote:
>> In article <f787u...@news4.newsguy.com>,
>> Chris Carlen <crcarleR...@BOGUSsandia.gov> wrote:
>>>From what I've read of OOP, I don't get it.
>>
>> For that matter, even using OOP a bit with C++ and Perl, I didn't get it
>> until I learned Python.
>>
>>> The problem for me is that I've programmed extensively in C and .asm on
>>> PC DOS way back in 1988.
>>
>> Newbie. ;-)
>>
>> (I started with BASIC in 1976.)
>>
>Newbie ;-)
>
>(I started with Algol 60 in 1967).

Newbie ;-)

(I started with Royal McBee LGP 30 machine language (hex input) in
1958, and their ACT IV assembler later! Then FORTRAN IV in 1965. By
1967 I too was using (Burroughs) Algol-60, and 10 years later upgraded
to (DEC-10) Simula-67.)

Going---going---

darren kirby
Jul 14, 2007, 1:49:48 PM
quoth the Wayne Brehaut:

> (I started with Royal McBee LGP 30 machine language (hex input) in
> 1958, and their ACT IV assembler later! Then FORTRAN IV in 1965. By
> 1967 I too was using (Burroughs) Algol-60, and 10 years later upgraded
> to (DEC-10) Simula-67.)
>
> Going---going---

Mel? Is that you?

http://www.pbm.com/~lindahl/mel.html

-d
--
darren kirby :: Part of the problem since 1976 :: http://badcomputer.org
"...the number of UNIX installations has grown to 10, with more expected..."
- Dennis Ritchie and Ken Thompson, June 1972

Wayne Brehaut
Jul 14, 2007, 3:34:07 PM
On Sat, 14 Jul 2007 19:18:05 +0530, "Rustom Mody"
<rusto...@gmail.com> wrote:

>On 7/14/07, Alex Martelli <al...@mac.com> wrote:
>>
>> OOP can be abused (particularly with deep or intricate inheritance
>> structures). But the base concept is simple and clear: you can bundle
>> state and behavior into a stateful "black box" (of which you may make as
>> many instances, with independent state but equal behavior, as you need).
>>
>
>Many years ago (86??) Wegner wrote a paper in OOPSLA called Dimensions
>of Object Orientation in which he called the 'base concept' of 'bundle
>of state and behavior' as 'object based' programming and
>'object-oriented' as object-based + inheritance.

Not quite--according to him:

object-based + classes => class-based
class-based + class inheritance => object-oriented

I.e., "object-oriented = objects + classes + inheritance".

This was not the by-then standard definition: to be OO would require
all four of:

1. modularity (class-based? object-based?)
2. inheritance (sub-classing)
3. encapsulation (information hiding)
4. polymorphism ((sub-) class-specific response to a message, or
processing of a method)

Unfortunately, most of the "definitions" (usually just hand-waving,
loosey-goosey descriptions) found on the web include none--or only one
or two--of these fundamental requirements by name, and are so loose
that almost any programming paradigm or style would be OO.
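For concreteness, here's a minimal Python sketch touching all four
requirements (the class and method names are purely illustrative, not
taken from anyone's code in this thread):

```python
class Counter:
    """Modularity: state and behavior bundled into one class."""

    def __init__(self):
        self._count = 0  # encapsulation: internal state, by convention

    def increment(self):
        self._count += 1

    def report(self):
        return "count=%d" % self._count


class StepCounter(Counter):  # inheritance: sub-classing Counter
    def __init__(self, step):
        Counter.__init__(self)
        self._step = step

    def increment(self):  # polymorphism: subclass-specific response
        self._count += self._step


def bump_twice(counter):
    # Works with any object answering increment()/report(),
    # whichever (sub)class it happens to be.
    counter.increment()
    counter.increment()
    return counter.report()


print(bump_twice(Counter()))       # count=2
print(bump_twice(StepCounter(5)))  # count=10
```

The point is that bump_twice neither knows nor cares which (sub)class
it receives--that's the polymorphism.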

>What Alex is saying is (in effect) that object-based is simple and
>clear (and useful) whereas the object-orientation is subject to abuse.

But OO is also simple and clear (if clearly defined and explained and
illustrated and implemented), and ANY programming style is subject to
abuse. During the heyday of Pascal as an introductory programming
language (as often misused as more than that) I found that many students
spent much of their time defining the data types their program would
use.

>This anyway is my experience: C++ programmers are distinctly poorer
>programmers than C programmers -- for some strange reason closeness to
>the machine has a salutary effect whereas the encouragment of
>uselessly over-engineered programs makes worse programmers.

But this is a tautology: "over-engineered" programs are, by definition
or terminology, not a good thing--independent of what PL or style
they're finally implemented in (assuming that by "engineering" you
mean "design" or similar). Many of my Pascal students over-engineered
their solutions to simple problems too.

>GUI is one of those cases wherein inheritance actually helps people
>produce better code but this is something of an exception.

This seems to imply that the list of applications you have in mind or
have worked on includes fewer domains that might profit from full OO
instead of just OB. My guess is that there are many application
domains in which analysts and programmers often think in an "OO way",
but implement in just an OB way because of the PL they or their
employer requires or prefers: in some--perhaps many--of these cases
they have to do "manually" what OO would have automated.

There is a problem, though, of (especially university and college)
education and training in OOP "talking about" how glorious OO is, and
requiring students to use OO techniques whether they're most
appropriate or not (the "classes early" pedagogical mindset). And
this problem is compounded by teaching introductory programming using
a language like Java that requires one to use an OO style for even
trivial programs. And by using one of the many very similar
introductory textbooks that talk a lot about OO before actually getting
started on programming, so students don't realize how trivial a
program is required to solve a trivial problem, and hence look for
complexity everywhere--whether it exists or not--and spend a lot of
time supposedly reducing the complexity of an already simple problem
and its method of solution.

But as I noted above, a similar problem occurred with the crop of
students who first learned Pascal: they often spent much of their time
defining the data types their program would use, just as OO
(especially "classes early") graduates tend to waste time
"over-subclassing" and developing libraries of little-used classes.

The answer is always balance, and having an extensive enough toolkit
that one is not forced or encouraged to apply a programming model that
isn't appropriate and doesn't help in any way (including
maintainability). And starting with a language that doesn't brainwash
one into believing that the style it enforces or implies is always the
best--and textbooks that teach proper choice of programming style
instead of rigid adherence to one.

wwwayne

Wayne Brehaut

unread,
Jul 14, 2007, 3:58:06 PM7/14/07
to
On Sat, 14 Jul 2007 11:49:48 -0600, darren kirby
<bull...@badcomputer.org> wrote:

>quoth the Wayne Brehaut:
>
>> (I started with Royal McBee LGP 30 machine language (hex input) in
>> 1958, and their ACT IV assembler later! Then FORTRAN IV in 1965. By
>> 1967 I too was using (Burroughs) Algol-60, and 10 years later upgraded
>> to (DEC-10) Simula-67.)
>>
>> Going---going---
>
>Mel? Is that you?
>
>http://www.pbm.com/~lindahl/mel.html
>

Ha-ha! Thanks for that!

Although I'm not Mel, the first program I saw running on the LGP-30
was his Blackjack program! In 1958 I took a Numerical Methods course
at the University of Saskatchewan, and we got to program Newton's
forward difference method for the LGP-30. Our "computer centre tour"
was to the attic of the Physics building, where their LGP-30 was
networked to a similar one at the University of Toronto (the first
educational computer network in Canada!), and the operator played a
few hands of Blackjack with the operator there to illustrate how
useful computers could be.

A few years later, as a telecommunications officer in the RCAF, I
helped design (but never got to teach :-( ) a course in LGP-30
architecture and programming using both ML and ACT IV AL, complete
with paper tape input and Charactron Tube
(http://en.wikipedia.org/wiki/Charactron) output--handy, since this
display was also used in the SAGE system.

We weren't encouraged to use card games as examples, so used
navigational and tracking problems involving fairly simple
trigonometry.

wwwayne
>-d

Message has been deleted

Ben Finney

unread,
Jul 14, 2007, 11:43:38 PM7/14/07
to
Dennis Lee Bieber <wlf...@ix.netcom.com> writes:

> Though I should have added that, in Python, the toolset tends to
> be... just an editor...

Much more than that. The common toolset I see used is:

* A customisable, flexible, programmer's text editor
* The online documentation in a web browser
* A Python interactive shell
* An automated build tool like 'make'
* A programmable shell for ad-hoc tasks
* A multi-session terminal program to run all these simultaneously

What I *don't* see is some single all-in-one tool specific to
programming a particular language. Having learned one good instance of
each of the above, it seems silly to need to learn all of them again
simply because one has started using Python as the programming
language.

--
\ "Room service? Send up a larger room." -- Groucho Marx |
`\ |
_o__) |
Ben Finney

Sebastian Bassi

unread,
Jul 15, 2007, 12:15:39 AM7/15/07
to pytho...@python.org
On 7/13/07, Simon Hibbs <simon...@gmail.com> wrote:
> place. At the end of it you'll have a good idea how OOP works, and how
> Python works. Learning OOp this way is easy and painless, and what you
...

But this tutorial states "I assume you know how object-oriented
programming works"

--
Sebastián Bassi (セバスティアン)
Diplomado en Ciencia y Tecnología.
GPG Fingerprint: 9470 0980 620D ABFC BE63 A4A4 A3DE C97D 8422 D43D

bon...@macbird.com

unread,
Jul 15, 2007, 3:47:20 AM7/15/07
to
On Jul 13, 3:20 pm, Wayne Brehaut <wbreh...@mcsnet.ca> wrote:
> On Sat, 14 Jul 2007 06:01:56 +0200, Bruno Desthuilliers
>
> <bdesth.quelquech...@free.quelquepart.fr> wrote:
> >Chris Carlen a écrit :
> >> Hi:
>
> >> From what I've read of OOP, I don't get it. I have also found some
> >> articles profoundly critical of OOP. I tend to relate to these articles.
>
> === 8< ===
>
>
>
> >> Hence, being a hardware designer rather than a computer scientist, I am
> >> conditioned to think like a machine. I think this is the main reason
> >> why OOP has always repelled me.
>
> >OTOH, OO is about machines - at least as conceived by Alan Kay, who
> >invented the term and most of the concept. According to him, each object
> >is a (simulation of) a small machine.
>
> Oh you young'uns, not versed in The Ancient Lore, but filled with
> self-serving propaganda from Xerox PARC, Alan Kay, and Smalltalk
> adherents everywhere!
>
> As a few more enlightened have noted in more than one thread here, the
> Mother of All OOP was Simula (then known as SIMULA 67). All Alan Kay
> did was define "OOPL", but then didn't notice (apparently--though this
> may have been a "convenient oversight") that Simula satisfied all the
> criteria so was actually the first OOPL--and at least 10 years earlier
> than Smalltalk!
>
> So Kay actually invented NONE of the concepts that make a PL an OOPL.
> He only stated the concepts concisely and named the result OOP, and
> invented yet another implementation of the concepts-- based on a
> LISP-like functional syntax instead of an Algol-60 procedural syntax,
> and using message-passing for communication amongst objects (and
> assumed a GUI-based IDE) (and introduced some new terminology,
> especially use of the term "method" to distinguish class and instance
> procedures and functions, which Simula hadn't done).
>
> As Randy Gest notes on http://www.smalltalk.org/alankay.html, "The
> major ideas in Smalltalk are generally credited to Alan Kay with many
> roots in Simula, LISP and SketchPad." Too many seem to assume that
> some of these other "features" of Smalltalk are part of the definition
> of an OOP, and so are misled into believing the claim that it was the
> first OOPL. Or they claim that certain deficiencies in Simula's object
> model--as compared to Smalltalk's--somehow disqualifies it as a "true
> OOPL", even though it satisfies all the criteria as stated by Kay in
> his definition. See http://en.wikipedia.org/wiki/Simula and related
> pages, and "The History of Programming Languages I (HOPL I)", for
> more details.
>
> Under a claim of Academic Impunity (or was that "Immunity"), here's
> another historical tid-bit. In a previous employment we once had a
> faculty applicant from CalTech who knew we were using Simula as our
> introductory and core language in our CS program, so he visited Xerox
> PARC before coming for his interview. His estimate of Alan Kay and
> Smalltalk at that time (early 80s) was that "They wanted to implement
> Simula but didn't understand it--so they invented Smalltalk and now
> don't understand _it_!"
>
> wwwayne
>
> === 8< ===

A couple of notes on this post.

Alan Kay has always publicly credited Simula as the direct inspiration
for Smalltalk, and if you know the man and his work, this implication
of taking credit for the first OOP language is not true, it is a
credit assigned to him by others, and one which he usually rights when
confronted with it.

You may be confused with the fact that "object oriented
>programming" was a term which I believe was first used by Alan and his
group at PARC, so perhaps the coining of the term is what is being
referenced by others.

Perhaps I'm mistaken, but the tone of your post conveys an animosity
that did not exist between the original Smalltalk and Simula
inventors; Nygaard and Kay were good friends, and admired each other's
work very much.


Bonnie MacBird


Wayne Brehaut

unread,
Jul 15, 2007, 3:05:38 PM7/15/07
to
On Sun, 15 Jul 2007 07:47:20 -0000, bon...@macbird.com wrote:

>On Jul 13, 3:20 pm, Wayne Brehaut <wbreh...@mcsnet.ca> wrote:
>> On Sat, 14 Jul 2007 06:01:56 +0200, Bruno Desthuilliers
>>
>> <bdesth.quelquech...@free.quelquepart.fr> wrote:
>> >Chris Carlen a écrit :
>> >> Hi:
>>
>> >> From what I've read of OOP, I don't get it. I have also found some
>> >> articles profoundly critical of OOP. I tend to relate to these articles.
>>
>> === 8< ===

=== 8< ===

>> Under a claim of Academic Impunity (or was that "Immunity"), here's
>> another historical tid-bit. In a previous employment we once had a
>> faculty applicant from CalTech who knew we were using Simula as our
>> introductory and core language in our CS program, so he visited Xerox
>> PARC before coming for his interview. His estimate of Alan Kay and
>> Smalltalk at that time (early 80s) was that "They wanted to implement
>> Simula but didn't understand it--so they invented Smalltalk and now
>> don't understand _it_!"
>>
>> wwwayne
>>
>> === 8< ===
>
>A couple of notes on this post.
>
>Alan Kay has always publicly credited Simula as the direct inspiration
>for Smalltalk, and if you know the man and his work, this implication
>of taking credit for the first OOP language is not true, it is a
>credit assigned to him by others, and one which he usually rights when
>confronted with it.

I know this, and was perhaps a little too flippant in my all-inclusive
statement "self-serving propaganda from Xerox PARC, Alan Kay, and
Smalltalk adherents everywhere!", for which I apologize. But it was
made with humorous intent, as I had hoped the opening "Oh you
young'uns, not versed in The Ancient Lore, but filled with
self-serving propaganda..." would imply.

A more accurate and unhumorous statement of my opinion is that it is
Smalltalk adherents who know virtually nothing of the history of
OOP--and even some who do--who did and still do make such claims,
both personally and in the published literature of OOP.

And my statement about a prospective faculty member's opinion was just
that: a historical anecdote, and the expression of an early 80s
opinion by a professional CS professor and researcher in formal
semantics (which may have been part of his distrust of the Smalltalk
team's "understanding" of Smalltalk). The opinion he expressed was
his and not my own, and I was just recording (what I thought might
be) an amusing anecdote in a context in which I thought it
appropriate: discussion of what OOP is, and after Bruno made the
claim: "OO is about machines - at least as conceived by Alan Kay, who
invented the term and most of the concept." I don't think my
recording it here should be construed as my opinion of either
Smalltalk or its creators (at that time or now).

As often happens in many arenas, the creator of an idea can lose
control to the flock, and many publications can get accepted if
referees themselves don't know the facts or take care to check them
before recommending publication--which probably explains why so many
publications (especially in conference proceedings) on OOP in the 80s
and 90s completely omitted any mention of Simula: so much so that I
once intended writing a paper on "Ignorance of Simula Considered
Harmful."

On the other hand, anything you may have inferred about my distaste
for those who don't bother to learn anything of the history of a
subject, then make false or misleading claims, and don't bother to
correct themselves when questioned, is true.

>You may be confused with the fact that "object oriented
>programming" was a term which I believe was first used by Alan and his
>group at PARC, so perhaps the coining of the term is what is being
>referenced by others.

No, I have been at more than one CS (or related area) conference where
a Smalltalk aficionado has stated unequivocally that Kay invented OOP
and that Smalltalk was the first OOPL. The last I recall for sure was
WebNet 2000, where a (quite young) presenter on Squeak made that
statement, and was not at all certain what Simula was when I asked
whether it might actually have been the first more than 10 years
before Smalltalk 80. So his claim, and that of many others,
explicitly or implicitly, is that not only the term, but most (or all)
of the concept, and (often) the first implementation of OOP was by Kay
and his Xerox PARC team in Smalltalk 80.

>Perhaps I'm mistaken, but the tone of your post conveys an animosity
>that did not exist between the original Smalltalk and Simula
>inventors; Nygard and Kay were good friends, and admired each others'
>work very much.

Yes, you are very much mistaken (as I note above), and appear not to
have understood the intended humorous tone of my posting.

wwwayne

>
>Bonnie MacBird
>

Paddy

unread,
Jul 15, 2007, 3:57:35 PM7/15/07
to
On Jul 13, 5:06 pm, Chris Carlen <crcarleRemoveT...@BOGUSsandia.gov>
wrote:
> Hi:
> Christopher

>
> Problem:
>
> 1. How to most easily learn to write simple PC GUI programs that will
> send data to remote embedded devices via serial comms, and perhaps
> incorporate some basic (x,y) type graphics display and manipulation
> (simple drawing program). Data may result from user GUI input, or from
> parsing a text config file. Solution need not be efficient in machine
> resource utilization. Emphasis is on quickness with which programmer
> can learn and implement solution.

Have you also tried looking for a cross-platform GUI program that has
a scripting interface that you might adapt? If found, then the extra
scripting needs may be reduced.

- Paddy.

Message has been deleted

Aahz

unread,
Jul 15, 2007, 7:00:58 PM7/15/07
to
In article <om_li.21868$RX.2...@newssvr11.news.prodigy.net>,

James Stroud <jst...@mbi.ucla.edu> wrote:
>
>If you just want to enter some values and set some flags and then hit
>"go", you could always program the GUI in HTML and have a cgi script
>process the result. This has a lot of benefits that are frequently
>overlooked but tend to be less fun than using a bona-fide toolkit like
>WX or QT.

This is excellent advice worth emphasizing -- but then, I make my living
working on a web app. ;-)
--
Aahz (aa...@pythoncraft.com) <*> http://www.pythoncraft.com/

I support the RKAB

Bruno Desthuilliers

unread,
Jul 16, 2007, 3:55:35 AM7/16/07
to
Wayne Brehaut a écrit :

> On Sat, 14 Jul 2007 06:01:56 +0200, Bruno Desthuilliers
> <bdesth.qu...@free.quelquepart.fr> wrote:
>
>> Chris Carlen a écrit :
>>> Hi:
>>>
>>> From what I've read of OOP, I don't get it. I have also found some
>>> articles profoundly critical of OOP. I tend to relate to these articles.
>>>
>
> === 8< ===
>
>>> Hence, being a hardware designer rather than a computer scientist, I am
>>> conditioned to think like a machine. I think this is the main reason
>>> why OOP has always repelled me.
>> OTOH, OO is about machines - at least as conceived by Alan Kay, who
>> invented the term and most of the concept. According to him, each object
>> is a (simulation of) a small machine.
>
> Oh you young'uns, not versed in The Ancient Lore, but filled with
> self-serving propaganda from Xerox PARC, Alan Kay, and Smalltalk
> adherents everywhere!

Not feeling concerned.

(snip pro-Simula/anti-Xerox propaganda).

Bruno Desthuilliers

unread,
Jul 16, 2007, 4:10:05 AM7/16/07
to
Wayne Brehaut a écrit :
(snip)

> after Bruno made the
> claim: "OO is about machines - at least as conceived by Alan Kay, who
> invented the term and most of the concept."

Please reread more carefully the above. I do give credit to Smalltalk's
author for the *term* "OOP", and *most* (not *all*) of the concepts (I
strongly disagree with your opinion that message-passing is not a core
concept of OO).

FWIW, I first mentioned Simula too (about the state-machine and
simulation aspect), then snipped this mention because I thought it was
getting a bit too much OT - we're not on comp.object here.

hide...@gmail.com

unread,
Jul 16, 2007, 11:29:58 AM7/16/07
to
You are lucky. Our project is a cross-platform cluster computer
management system that runs on both Windows and Linux:
http://pluster.gf.cs.hit.edu.cn/
I'll tell you how we solved these problems.

>
> 1. How to most easily learn to write simple PC GUI programs that will
> send data to remote embedded devices via serial comms, and perhaps
> incorporate some basic (x,y) type graphics display and manipulation
> (simple drawing program). Data may result from user GUI input, or from
> parsing a text config file. Solution need not be efficient in machine
> resource utilization. Emphasis is on quickness with which programmer
> can learn and implement solution.

We use Tk for the GUI, and we have an interpreter that reads a VB form
file (".frm") in and displays it with Tk. You just need to draw forms in
VB, save them in .frm format, and load the file in your Python code:
LoadForm("aa.frm")
After that you can use the buttons, menus and so on from Python.

We use XMLRPC to communicate with the remote node. XMLRPC is very cool
because you can invoke a function on the remote side the same way you
invoke a local method. For example, if we have a remote object foo:
foo.bar()  # invoke bar() on the remote side
But XMLRPC works over a network; I'm not sure it can work over serial.
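A minimal sketch of that call style using Python's standard library
(the host, ephemeral-port choice, and function name are made up for
illustration; the poster's actual setup may differ, and in the thread's
era the modules were named SimpleXMLRPCServer and xmlrpclib):

```python
# Minimal XML-RPC round trip: the client invokes bar() on the server
# as if it were a local method.
import threading
from xmlrpc.client import ServerProxy
from xmlrpc.server import SimpleXMLRPCServer


def bar():
    return "hello from the remote side"


# Bind to an ephemeral port so the sketch runs anywhere.
server = SimpleXMLRPCServer(("127.0.0.1", 0), logRequests=False)
server.register_function(bar)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

foo = ServerProxy("http://127.0.0.1:%d" % port)
result = foo.bar()  # looks local, executes on the server
print(result)
server.shutdown()
```

The proxy object foo makes the remote call look exactly like a local
method call, which is the convenience being described above.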
> 2. Must be cross-platform: Linux + Windows. This factor can have a big
> impact on whether it is necessary to learn a new language, or stick with
> C. If my platform was only Linux I could just learn GTK and be done
> with it. I wouldn't be here in that case.

And most important, XMLRPC is cross-platform: you can use Linux for the
server and Windows for the client.


Wayne Brehaut

unread,
Jul 16, 2007, 1:16:32 PM7/16/07
to
On Mon, 16 Jul 2007 10:10:05 +0200, Bruno Desthuilliers
<bruno.42.de...@wtf.websiteburo.oops.com> wrote:

>Wayne Brehaut a écrit :
>(snip)
> > after Bruno made the
>> claim: "OO is about machines - at least as conceived by Alan Kay, who
>> invented the term and most of the concept."
>
>Please reread more carefully the above. I do give credit to Smalltalk's
>author for the *term* "OOP", and *most* (not *all*) of the concepts (I
>strongly disagree with your opinion that message-passing is not a core
>concept of OO).

One problem is that it's often not clear what lists of properties are
his definition of OOP vs. what are the intended properties of
Smalltalk--his intended implementation of OOP. Many of the lists begin
with the basic requirements that "everything is an object" and
"objects communicate by message passing", but the most common
"generally agreed upon" definition abstracts just four requirements
from these (changing) lists--attempting to separate implementation
details from what is essential to the underlying framework. As I note
below, these were:

1. modularity (class-based? object-based?)
2. inheritance (sub-classing)
3. encapsulation (information hiding)
4. polymorphism ((sub-) class-specific response to a message, or
processing of a method)

Other details in Kay's lists are considered implementation details,
and important advances or alternatives to previous methods, but not
required for a language to _be_ OO. It is reputed, though, that in
2003 Kay said
(http://c2.com/cgi/wiki?AlanKaysDefinitionOfObjectOriented) "OOP to
me means only messaging, local retention and protection and hiding of
state-process, and extreme LateBinding of all things."

So I understand your accepting one of Kay's lists as being a
definition of OOP instead of "just" a description of Smalltalk, or of
accepting this fairly recent "definition" as being the true one (as
opposed to the previous lists of usually 6 properties). "It's hard to
hit a moving target!"

>FWIW, I first mentionned Simula too (about the state-machine and
>simulation aspect), then sniped this mention because I thought it was
>getting a bit too much OT - we're not on comp.object here.

Understood--sort of--but there is sufficient accurate information
about Simula available on the web now that it's no longer necessary to
use quotes from Kay about OOP and Smalltalk just because they're more
accessible, as used to be the case. What would be so OT about
referring to Simula in one sentence instead of or in addition to
Smalltalk?

But I digress--my only real objection to your post was your opinion
and claim that Kay "invented the term and most of the concept": I've
never seen anyone claim that anyone else invented the term, but for
the claim that he invented "most of the concept" we need only refer to
Nygaard's claim in "How Object-Oriented Programming Started" at
http://heim.ifi.uio.no/~kristen/FORSKNINGSDOK_MAPPE/F_OO_start.html
that "Simula 67 introduced most of the key concepts of object-oriented
programming: both objects and classes, subclasses (usually referred to
as inheritance) and virtual procedures, combined with safe referencing
and mechanisms for bringing into a program collections of program
structures described under a common class heading (prefixed blocks)."

Combine this with the fact--as stated above by Bonnie MacBird (Alan
Kay's significant other)--that "Alan Kay has always publicly credited
Simula as the direct inspiration for Smalltalk, and... this
implication of taking credit for the first OOP language is not true,
it is a credit assigned to him by others, and one which he usually
rights when confronted with it." If he acknowledges this perhaps
others should too?

As has been noted before, it's often the fact that a cause becomes a
religion: true believers tend to take it over from the originator, and
this religiosity tends to blind them from the facts. Opinions and
rumours become facts, stories are invented, definitions are changed or
twisted, and another religion is born! Even those who don't belong to
the religion come to believe the oft-repeated stories, and then help
spread and perpetuate them. (Continuing in my original humorous vein I
was tempted to use terms like "religious zealots", "false gospel",
"propaganda", etc., but thought better of it in case I was again
misunderstood.)

Again, I disagree only with this one claim. You make significant
contributions to the group and to elucidating Python and OOP to the
great unwashed: in contrast, all I've done so far is complain about
those who don't accept the correct (i.e., my) definition or use of
terms.

wwwayne

Wayne Brehaut

unread,
Jul 16, 2007, 1:36:43 PM7/16/07
to

Or, more accurately, pro:

1. Nygaard & Dahl as the inventors of most of the concept of OOP
2. Simula as the first OOP
3. Kay as the originator of the term OOP
4. Kay, Xerox PARC, and Smalltalk as making significant useful
advances in implementation of OOP and "popularizing" it

and anti:

1. attributing credit for any accomplishment to someone who doesn't
himself claim it and even denies it

wwwayne o/o

Chris Carlen

unread,
Jul 16, 2007, 2:45:04 PM7/16/07
to
[edit]

Thanks for the tip. The next poster provides the link, which I've got
bookmarked now.

The more I play with Python, the more I like it. Perhaps I will
understand OOP quicker than I thought. What I've learned so far about
names binding to objects instead of values stored in memory cells, etc.
has been interesting and fascinating.

--
Good day!

________________________________________
Christopher R. Carlen
Principal Laser&Electronics Technologist
Sandia National Laboratories CA USA
crcarleR...@BOGUSsandia.gov
NOTE, delete texts: "RemoveThis" and
"BOGUS" from email address to reply.

Steve Holden

unread,
Jul 16, 2007, 3:35:49 PM7/16/07
to pytho...@python.org
I'm happy you are proceeding with so little trouble. Without wishing to
confuse you, however, I should point out that this aspect of Python has
very little to do with its object-orientation. There was a language
called Icon, for example, 20 years ago, that used similar semantics but
wasn't at all object-oriented.

regards
Steve
--
Steve Holden +1 571 484 6266 +1 800 494 3119
Holden Web LLC/Ltd http://www.holdenweb.com
Skype: holdenweb http://del.icio.us/steve.holden
--------------- Asciimercial ------------------
Get on the web: Blog, lens and tag the Internet
Many services currently offer free registration
----------- Thank You for Reading -------------

Bruno Desthuilliers

unread,
Jul 15, 2007, 5:02:25 PM7/15/07
to
Wayne Brehaut a écrit :

> On Mon, 16 Jul 2007 10:10:05 +0200, Bruno Desthuilliers
> <bruno.42.de...@wtf.websiteburo.oops.com> wrote:
>
>
>>Wayne Brehaut a écrit :
>>(snip)
>>
>>>after Bruno made the
>>>claim: "OO is about machines - at least as conceived by Alan Kay, who
>>>invented the term and most of the concept."
>>
>>Please reread more carefully the above. I do give credit to Smalltalk's
>>author for the *term* "OOP", and *most* (not *all*) of the concepts (I
>>strongly disagree with your opinion that message-passing is not a core
>>concept of OO).
>
>
> One problem is that it's often not clear what lists of properties are
> his definition of OOP vs. what are the intended properties of
> Smalltalk--his intended implementation of OOP. Many of the lists begin
> with the basic requirements that "everything is an object" and
> "objects communicate by message passing", but the most common
> "generally agreed upon" definition abstracts just four requirements
> from these (changing) lists--attempting to separate implementation
> details from what is essential to the underlying framework. As I note
> below, these were:
>
> 1. modularity (class-based? object-based?)
> 2. inheritance (sub-classing)
> 3. encapsulation (information hiding)

I don't see information hiding and encapsulation as being the very same
thing. But anyway...

> 4. polymorphism ((sub-) class-specific response to a message, or
> processing of a method)

subclassing - and even classes - are not necessary for polymorphism. I
guess you have a good enough knowledge of Python and/or some
prototype-based OOPL to know why !-)
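A tiny Python illustration of that point (the class and function names
are invented for the example): two unrelated classes, no shared base,
and one polymorphic call site.

```python
class Duck:
    def speak(self):
        return "quack"


class Robot:  # no inheritance relationship with Duck
    def speak(self):
        return "beep"


def announce(thing):
    # Duck typing: any object with a speak() method works here,
    # so the dispatch is polymorphic without any subclassing.
    return thing.speak()


print([announce(x) for x in (Duck(), Robot())])  # ['quack', 'beep']
```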

>
> Other details in Kay's lists are considered implementation details,
> and important advances or alternatives to previous methods, but not
> required for a language to _be_ OO. It is reputed, though, that in
> 2003 Kay said
> (http://c2.com/cgi/wiki?AlanKaysDefinitionOfObjectOriented) "OOP to
> me means only messaging, local retention and protection and hiding of
> state-process, and extreme LateBinding of all things."
>
> So I understand your accepting one of Kay's lists as being a
> definition of OOP instead of "just" a description of Smalltalk, or of
> accepting this fairly recent "definition" as being the true one

Is there any "true one" ?-)

> (as
> opposed to the previous lists of usually 6 properties). "It's hard to
> hit a moving target!"

Indeed.

>
>>FWIW, I first mentioned Simula too (about the state-machine and
>>simulation aspect), then snipped this mention because I thought it was
>>getting a bit too much OT - we're not on comp.object here.
>
>
> Understood--sort of--but there is sufficient accurate information
> about Simula available on the web now that it's no longer necessary to
> use quotes from Kay about OOP and Smalltalk just because they're more
> accessible, as used to be the case. What would be so OT about
> referring to Simula in one sentence instead of or in addition to
> Smalltalk?

What I mean is that I felt my answer to be already OT enough so I
snipped large parts of it. FWIW, I could have snipped the reference to
Alan Kay and kept the one to Simula, but then it would have required
more rewrite work.

> But I digress--my only real objection to your post was your opinion
> and claim that Kay "invented the term and most of the concept":

I agree that the term "most" is perhaps a bit too strong. For my
defense, please keep in mind that I'm not a native english speaker, so I
often have a hard time expressing myself with the exact nuance I'd use in
french.

(snip)

>
> As has been noted before, it's often the fact that a cause becomes a
> religion:

Good Lord, save us from becoming religious !-)

Ok, I admit that I have my own understanding of OO (as anyone else, I
guess), which is much closer to Smalltalk's model than to any other
OOPL (even Python). It probably has to do with the extremely
generalized and systematic application of two key concepts - objects and
messages - in such a way that it becomes a coherent whole - while most
mainstream OOPLs feel to me more like an ad-hoc collection of arbitrary
rules and features. So yes, I'm probably guilty of being a bit too
impassioned here, and you're right to correct me. But have mercy and
take time to read a bit more of the offending post, I'm pretty confident
you won't find me guilty of mis-placed "religiosity".

(snip)


> in contrast, all I've done so far is complain about
> those who don't accept the correct (i.e., my) definition or use of
> terms.

Lol ! I'm afraid this is something we're all guilty of at one time or another...

Bruno Desthuilliers

Jul 15, 2007, 5:07:27 PM7/15/07
to
Wayne Brehaut a écrit :

> On Fri, 13 Jul 2007 20:37:04 -0400, Steve Holden <st...@holdenweb.com>
> wrote:
>
>
>>Aahz wrote:
>>
>>>In article <f787u...@news4.newsguy.com>,
>>>Chris Carlen <crcarleR...@BOGUSsandia.gov> wrote:
>>>>From what I've read of OOP, I don't get it.
>>>
>>>For that matter, even using OOP a bit with C++ and Perl, I didn't get it
>>>until I learned Python.
>>>
>>>
>>>>The problem for me is that I've programmed extensively in C and .asm on
>>>>PC DOS way back in 1988.
>>>
>>>Newbie. ;-)
>>>
>>>(I started with BASIC in 1976.)
>>>
>>
>>Newbie ;-)
>>
>>(I started with Algol 60 in 1967).
>
>
> Newbie ;-)
>
> (I started with Royal McBee LGP 30 machine language (hex input) in
> 1958, and their ACT IV assembler later! Then FORTRAN IV in 1965. By
> 1967 I too was using (Burroughs) Algol-60, and 10 years later upgraded
> to (DEC-10) Simula-67.)
>


My my my... Would you believe that my coworkers do consider me like an
old sage because I started programming in 1990 with HyperTalk on Mac
Classic !-)

I suddenly feel 20 again ! Woo !-)

Bruno Desthuilliers

Jul 15, 2007, 5:18:53 PM7/15/07
to
Wayne Brehaut a écrit :

> On Sat, 14 Jul 2007 19:18:05 +0530, "Rustom Mody"
> <rusto...@gmail.com> wrote:
>
>
>>On 7/14/07, Alex Martelli <al...@mac.com> wrote:
>>
>>>OOP can be abused (particularly with deep or intricate inheritance
>>>structures). But the base concept is simple and clear: you can bundle
>>>state and behavior into a stateful "black box" (of which you may make as
>>>many instances, with independent state but equal behavior, as you need).
>>>
>>
>>Many years ago (86??) Wegner wrote a paper in OOPSLA called Dimensions
>>of Object Orientation in which he called the 'base concept' of 'bundle
>>of state and behavior' as 'object based' programming and
>>'object-oriented' as object-based + inheritance.
>
>
> Not quite--according to him:
>
> object-based + classes => class-based
> class-based + class inheritance => object-oriented
>
> I.e., "object-oriented = objects + classes + inheritance".

What about prototype-based languages then ?-)
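As a side note for readers keeping score, Wegner's layering quoted above can be sketched in a few lines of modern Python (the counters below are purely illustrative examples, not anything from the thread):

```python
# "Object-based": state and behavior bundled without classes,
# here via a closure.
def make_counter():
    count = 0
    def incr():
        nonlocal count
        count += 1
        return count
    return incr

# "Class-based": objects are instances of a class.
class Counter:
    def __init__(self):
        self.count = 0
    def incr(self):
        self.count += 1
        return self.count

# "Object-oriented" in Wegner's sense: classes plus inheritance.
class StepCounter(Counter):
    def __init__(self, step=2):
        super().__init__()
        self.step = step
    def incr(self):
        self.count += self.step
        return self.count

c1, c2 = Counter(), StepCounter()
c1.incr(); c1.incr(); c2.incr()
print(c1.count, c2.count)  # 2 2 -- independent state, related behavior
```

Prototype-based languages (Self, JavaScript) skip the middle layer: objects clone and extend other objects directly, which is why they sit awkwardly in Wegner's classification.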

Hendrik van Rooyen

Jul 17, 2007, 2:22:51 AM7/17/07
to pytho...@python.org
"Bruno Desthuilliers" <bdesth.qu....se@free.quelquepart.fr> wrote:

>My my my... Would you believe that my coworkers do consider me like an
>old sage because I started programming in 1990 with HyperTalk on Mac
>Classic !-)
>
>I suddenly feel 20 again ! Woo !-)

*hands him a straw boater and a cane*

ok youngster - lets see you strut your stuff...

; - )


Wolfgang Strobl

Jul 17, 2007, 3:34:22 AM7/17/07
to
Steve Holden <st...@holdenweb.com>:

>I'm happy you are proceeding with so little trouble. Without wishing to
>confuse you, however, I should point out that this aspect of Python has
>very little to do with its object-orientation. There was a language
>called Icon, for example, 20 years ago, that used similar semantics but
>wasn't at all object-oriented.

Actually, there was a language called SNOBOL, 40 years ago, that used
similar semantics, developed by Griswold et al. Its object model was
remarkably similar to that of Python without classes. And it even had
dictionaries (called "tables") :-).

For an explanation of the concept of a "variable" in SNOBOL, see
<http://www.cacs.louisiana.edu/~mgr/404/burks/language/snobol/catspaw/tutorial/ch1.htm#1.3>

SNOBOL's powerful patterns still shine compared to Python's clumsy
regular expressions. I've used the language a lot in the past, first on
the mainframe (SPITBOL on System/360), later on the PC (Catspaw's SNOBOL4
& SPITBOL). When I switched to Python, it wasn't because of the
expressiveness of the language, but because of the rich library ("batteries
included") and the IMO elegant syntax, i.e. blocks by indentation.

<http://en.wikipedia.org/wiki/SNOBOL>
<http://en.wikipedia.org/wiki/Ralph_E._Griswold>

Icon came later. Griswold developed Icon as a successor to SNOBOL,
constructing it around the concept of generators and co-expressions. I
didn't like it.


--
Thank you for observing all safety precautions
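To make the pattern-vs-regex comparison concrete: flat patterns translate to the stdlib re module easily, but something like SNOBOL's BAL (balanced parentheses) has no stdlib-regex equivalent and needs a hand-written scanner. The sketch below is a rough illustration of that gap, not SNOBOL's actual semantics:

```python
import re

# A flat "text between quotes" pattern is easy in re:
m = re.search(r'"([^"]*)"', 'say "hello" now')
print(m.group(1))  # hello

# SNOBOL's BAL matches parenthesis-balanced text; stdlib re cannot
# recurse, so here is a minimal scanner for one simple case: return
# the index just past a balanced-paren group starting at position i.
def bal(s, i=0):
    depth = 0
    for j in range(i, len(s)):
        if s[j] == "(":
            depth += 1
        elif s[j] == ")":
            depth -= 1
            if depth == 0:
                return j + 1
    return -1  # no balanced group found

print(bal("(a(b)c)d"))  # 7
```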

Aahz

Jul 17, 2007, 10:22:49 AM7/17/07
to
In article <7cqo93pona4qoc4s9...@4ax.com>,

Wolfgang Strobl <ne...@mystrobl.de> wrote:
>
>SNOBOLs powerfull patterns still shine, compared to Pythons clumsy
>regular expressions.

Keep in mind that Python regular expressions are modeled on the
grep/sed/awk/Perl model so as to be familiar to any sysadmin -- but
there's a reason why Python makes it a *library* unlike Perl. So adding
SNOBOL patterns to another library would be a wonderful gift to the
Python community...

Eddie Corns

Jul 17, 2007, 11:29:22 AM7/17/07
to
aa...@pythoncraft.com (Aahz) writes:

>In article <7cqo93pona4qoc4s9...@4ax.com>,
>Wolfgang Strobl <ne...@mystrobl.de> wrote:
>>
>>SNOBOLs powerfull patterns still shine, compared to Pythons clumsy
>>regular expressions.

>Keep in mind that Python regular expressions are modeled on the
>grep/sed/awk/Perl model so as to be familiar to any sysadmin -- but
>there's a reason why Python makes it a *library* unlike Perl. So adding
>SNOBOL patterns to another library would be a wonderful gift to the
>Python community...

I don't believe you can get the benefit of SNOBOL matching without direct
language support. There's only so much a library can do. However, here's
a valiant and interesting effort:

http://www.wilmott.ca/python/patternmatching.html

Eddie

Paul Rubin

Jul 17, 2007, 11:58:08 AM7/17/07
to
aa...@pythoncraft.com (Aahz) writes:
> So adding SNOBOL patterns to another library would be a wonderful
> gift to the Python community...

Snobol patterns were invented at a time when nobody knew anything
about parsing. They were extremely powerful (recursive with arbitrary
amounts of backtracking) but could use exponential time and maybe even
exponential space.

These days, it makes more sense to use something like pyparsing. Or I
wonder if it would be feasible to write something like Parsec for
Python (Parsec is a parser combinator library for Haskell).
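For readers who haven't met parser combinators, the core idea behind Parsec can be sketched in plain Python (a toy model for illustration, nothing like Parsec's real API):

```python
# A parser is a function: (text, pos) -> (value, new_pos), or None on failure.
def char(c):
    def parse(s, i):
        if i < len(s) and s[i] == c:
            return c, i + 1
        return None
    return parse

def alt(p, q):          # try p, fall back to q (backtracking choice)
    def parse(s, i):
        return p(s, i) or q(s, i)
    return parse

def seq(p, q):          # p followed by q
    def parse(s, i):
        r = p(s, i)
        if r is None:
            return None
        v1, i = r
        r = q(s, i)
        if r is None:
            return None
        v2, i = r
        return (v1, v2), i
    return parse

# Parsers compose into bigger parsers, which is the whole trick:
ab_or_ac = seq(char("a"), alt(char("b"), char("c")))
print(ab_or_ac("ac", 0))  # (('a', 'c'), 2)
```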


hg

Jul 17, 2007, 4:28:41 PM7/17/07
to
Chris Carlen wrote:

> Hi:
>
> From what I've read of OOP, I don't get it. I have also found some
> articles profoundly critical of OOP. I tend to relate to these articles.
>
> However, those articles were no more objective than the descriptions of
> OOP I've read in making a case. Ie., what objective
> data/studies/research indicates that a particular problem can be solved
> more quickly by the programmer, or that the solution is more efficient
> in execution time/memory usage when implemented via OOP vs. procedural
> programming?
>
> The problem for me is that I've programmed extensively in C and .asm on
> PC DOS way back in 1988. Then didn't program for nearly 10 years during
> which time OOP was popularized. Starting in 1999 I got back into
> programming, but the high-level-ness of PC programming and the
> completely foreign language of OOP repelled me. My work was in analog
> and digital electronics hardware design, so naturally I started working
> with microcontrollers in .asm and C. Most of my work involves low-level
> signal conditioning and real-time control algorithms, so C is about as
> high-level as one can go without seriously loosing efficiency. The
> close-to-the-machine-ness of C is ideal here. This is a realm that I
> truly enjoy and am comfortable with.
>
> Hence, being a hardware designer rather than a computer scientist, I am
> conditioned to think like a machine. I think this is the main reason
> why OOP has always repelled me.
>
> Perhaps the only thing that may have clicked regarding OOP is that in
> certain cases I might prefer a higher-level approach to tasks which
> involve dynamic memory allocation. If I don't need the execution
> efficiency of C, then OOP might produce working results faster by not
> having to worry about the details of memory management, pointers, etc.
>
> But I wonder if the OOP programmers spend as much time creating classes
> and trying to organize everything into the OOP paradigm as the C
> programmer spends just writing the code?
>
> Ultimately I don't care what the *name* is for how I program. I just
> need to produce results. So that leads back to objectivity. I have a
> problem to solve, and I want to find a solution that is as quick as
> possible to learn and implement.
>
> Problem:
>
> 1. How to most easily learn to write simple PC GUI programs that will
> send data to remote embedded devices via serial comms, and perhaps
> incorporate some basic (x,y) type graphics display and manipulation
> (simple drawing program). Data may result from user GUI input, or from
> parsing a text config file. Solution need not be efficient in machine
> resource utilization. Emphasis is on quickness with which programmer
> can learn and implement solution.
>
> 2. Must be cross-platform: Linux + Windows. This factor can have a big
> impact on whether it is necessary to learn a new language, or stick with
> C. If my platform was only Linux I could just learn GTK and be done
> with it. I wouldn't be here in that case.
>
> Possible solutions:
>
> Form 1: Use C and choose a library that will enable cross-platform GUI
> development.
>
> Pro: Don't have to learn new language.
> Con: Probably will have difficulty with cross-platform implementation
> of serial comms. This will probably need to be done twice. This will
> waste time.
>
> Form 2: Use Python and PySerial and TkInter or wxWidgets.
>
> Pro: Cross-platform goal will likely be achieved fully. Have a
> programmer nearby with extensive experience who can help.
> Con: Must learn new language and library. Must possibly learn a
> completely new way of thinking (OOP) not just a new language syntax.
> This might be difficult.
>
> Form 3: Use LabVIEW
>
> Pro: I think that the cross-platform goal can be met.
> Con: Expensive. I would prefer to use an Open Source solution. But
> that isn't as important as the $$$. I have also generally found the 2D
> diagrammatical programming language of "G" as repelling as OOP. I
> suspect that it may take as much time to learn LabVIEW as Python. In
> that case the time spent on Python might be better spent since I would
> be learning something foundational as opposed to basically just learning
> how to negotiate someone's proprietary environment and drivers.
>
> Comments appreciated.
>
> --
> Good day!
>
> ________________________________________
> Christopher R. Carlen
> Principal Laser&Electronics Technologist
> Sandia National Laboratories CA USA
> crcarleR...@BOGUSsandia.gov
> NOTE, delete texts: "RemoveThis" and
> "BOGUS" from email address to reply.


Yes, maybe you should look at UML first.

hg


Jay Loden

Jul 17, 2007, 5:44:51 PM7/17/07
to Steve Holden, pytho...@python.org
Steve Holden wrote:
> It took a little bit more careful planning to get Icon pattern-matching
> structures right, but there was much more explicit control of
> backtracking. I only wish they'd grafted more OO concepts into it, then
> I might never have bothered with Python! Someone did do an OO system
> layered on top of it, but IIRC it was clumsy and rebarbative.

OT, but I just wanted to say, "rebarbative" is my word of the day for today thanks to this post. I consider myself to have a reasonably comprehensive vocabulary but that's a new one for me, so I had to look it up after your post ;)

-Jay

Alex Martelli

Jul 18, 2007, 2:06:11 AM7/18/07
to
Dennis Lee Bieber <wlf...@ix.netcom.com> wrote:

> On Mon, 16 Jul 2007 11:45:04 -0700, Chris Carlen
> <crcarleR...@BOGUSsandia.gov> declaimed the following in
> comp.lang.python:


>
> > The more I play with Python, the more I like it. Perhaps I will
> > understand OOP quicker than I thought. What I've learned so far about
> > names binding to objects instead of values stored in memory cells, etc.
> > has been interesting and fascinating.
>

> Don't confuse Python's "roaming names" with OOP, though. There are
> OOP languages that still follow the variable=>memory address containing
> object structure.

C++, definitely. But most OO languages, like Java &c, use a more modern
"object reference" naming scheme, just like Python, FP languages, etc.


Alex
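The "roaming names" / object-reference model that Dennis and Alex are contrasting with C++-style variables is easy to demonstrate:

```python
a = [1, 2, 3]
b = a                   # b is another name for the same object, not a copy
b.append(4)
print(a)                # [1, 2, 3, 4] -- both names see the mutation

b = "something else"    # rebinding b doesn't touch the list at all
print(a)                # [1, 2, 3, 4]
print(a is b)           # False
```

In a variable-as-memory-cell language, assignment copies a value into b's storage; here, assignment just points the name b at a different object.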

Hendrik van Rooyen

Jul 18, 2007, 2:51:13 AM7/18/07
to pytho...@python.org
"Dennis Lee Bieber" <wlf...@ix.netcom.com> wrote:

> Don't confuse Python's "roaming names" with OOP, though. There are
> OOP languages that still follow the variable=>memory address containing
> object structure.

"roaming names" is a brilliant description!

Thanks Dennis!

- Hendrik


greg

Jul 22, 2007, 3:25:55 AM7/22/07
to comp-lang-py...@moderators.individual.net
Aahz wrote:
> So adding
> SNOBOL patterns to another library would be a wonderful gift to the
> Python community...

I wrote a module for Snobol-style pattern matching a
while back, but didn't get around to releasing it.
I've just put it on my web page:

http://www.cosc.canterbury.ac.nz/greg.ewing/python/Snobol.tar.gz

There's no manual yet, but there's a fairly complete
set of docstrings and some test cases to figure it
out from.

--
Greg

Wolfgang Strobl

Jul 22, 2007, 10:20:09 AM7/22/07
to
ed...@holyrood.ed.ac.uk (Eddie Corns):

>I don't believe you can get the benefit of SNOBOL matching without direct
>language support.

That's my opinion, too.

>There's only so much a library can do. However a valiant
>and interesting effort:
>
>http://www.wilmott.ca/python/patternmatching.html

This is newer than http://sourceforge.net/projects/snopy/ which adapts
an Ada implementation that follows the SNOBOL model quite closely. I
didn't know that. Thanks for pointing it out!

Well, unfortunately, it somehow demonstrates your point. That may just
be unfamiliarity with the changed idiom, though. Perhaps rewriting a
few of James Gimpel's snippets from "Algorithms in SNOBOL4"
(-> http://www.snobol4.org/) as an exercise using that library might help
to get a better appreciation. Perhaps I'll try, eventually ...


--
Wir danken für die Beachtung aller Sicherheitsbestimmungen

Wolfgang Strobl

Jul 22, 2007, 10:19:26 AM7/22/07
to
Paul Rubin <http://phr...@NOSPAM.invalid>:

>aa...@pythoncraft.com (Aahz) writes:
>> So adding SNOBOL patterns to another library would be a wonderful
>> gift to the Python community...
>
>Snobol patterns were invented at a time when nobody knew anything
>about parsing.

But Snobol patterns aren't mainly about building parsers.

>They were extremely powerful (recursive with arbitrary
>amounts of backtracking) but could use exponential time and maybe even
>exponential space.

Sure. Like any Turing-complete language feature.

>
>These days, it makes more sense to use something like pyparsing.

Probably, yes.

Wolfgang Strobl

Jul 22, 2007, 10:20:02 AM7/22/07
to
aa...@pythoncraft.com (Aahz):

>In article <7cqo93pona4qoc4s9...@4ax.com>,
>Wolfgang Strobl <ne...@mystrobl.de> wrote:
>>
>>SNOBOLs powerfull patterns still shine, compared to Pythons clumsy
>>regular expressions.
>
>Keep in mind that Python regular expressions are modeled on the
>grep/sed/awk/Perl model so as to be familiar to any sysadmin

Sure, I don't dispute that. There is room for both regular expressions
and SNOBOL type patterns, IMHO, because the concepts are different
enough.

>-- but
>there's a reason why Python makes it a *library* unlike Perl. So adding
>SNOBOL patterns to another library would be a wonderful gift to the
>Python community...

Like Eddie Corns, I find it hard to do in an elegant way without
integrating it into the language. I haven't looked into it for a long
time, though.

Eddie Corns

Jul 23, 2007, 6:53:02 AM7/23/07
to
Wolfgang Strobl <ne...@mystrobl.de> writes:
>few of James Gimple's snippets from "Algorithms in SNOBOL4"
>(->http://www.snobol4.org/) as an exercise using that library might help
>to get a better appreciation. Perhaps I'll try, eventually ...

I never noticed them or the PDF of the book there before. Some Friday
afternoon reading for sure.

Personally I hope to get more to time to look at a combination of Lua and
PEGs (http://en.wikipedia.org/wiki/Parsing_expression_grammar) for my parsing
needs.

Eddie

Aahz

Jul 23, 2007, 8:47:15 AM7/23/07
to
In article <mailman.989.1185112242.2...@python.org>,

greg <pytho...@python.org> wrote:
>Aahz wrote:
>>
>> So adding SNOBOL patterns to another library would be a wonderful
>> gift to the Python community...
>
>I wrote a module for Snobol-style pattern matching a while back, but
>didn't get around to releasing it. I've just put it on my web page:
>
>http://www.cosc.canterbury.ac.nz/greg.ewing/python/Snobol.tar.gz

Nice! You have restored my faith in the time machine. ;-)

This is Python. We don't care much about theory, except where it intersects
with useful practice.

Paul McGuire

Jul 23, 2007, 11:28:32 AM7/23/07
to

If you get a chance to look at pyparsing, I'd be interested in your
comments. The PEG page and the SNOBOL implementation share many
concepts with pyparsing (or is it the other way around?).

-- Paul

Eddie Corns

Jul 23, 2007, 1:43:31 PM7/23/07
to
Paul McGuire <pt...@austin.rr.com> writes:

It's on my list of things to get round to.

I think what I'm really after, though, is a parsing DSL. I did only one
fairly small project in SNOBOL but I was impressed at the ease with which I
could express the problem (some googling suggested that many end users found
the same). I guess I want SNOBOL embedded in a modern language with scoping
etc. Python is antithetical to (this class of) DSLs (IMHO) :(

Probably what I really need is parser combinators in Haskell or maybe camlp4
or some such exotica but unfortunately I've never heard of them.

Eddie

Paul McGuire

Jul 23, 2007, 5:04:50 PM7/23/07
to
> Eddie

I have had pyparsing users refer to pyparsing as an in-Python DSL, and
others make comparisons between pyparsing and Parsec (monadic
combinator in Haskell). I'm not sure why you would say this approach
is antithetical to Python - the builtin support for operator
overloading, __call__, __len__, __nonzero__(soon to be __bool__),
__set/getattr__, etc. permit you to impart quite a bit of expressive
behavior to your parsing classes. What I tried to do with pyparsing
was to emulate Python's builtin container classes and object instances
with the results that get returned from invoking a parser, so that the
after-parsing work would feel natural to established Python users.

If you want to just see some existing BNF's implemented in pyparsing,
you can view them online at the pyparsing wiki. Here are some
representative examples:
http://pyparsing.wikispaces.com/space/showimage/simpleSQL.py
http://pyparsing.wikispaces.com/space/showimage/fourFn.py
http://pyparsing.wikispaces.com/space/showimage/jsonParser.py

-- Paul

Eddie Corns

Jul 24, 2007, 2:36:24 PM7/24/07
to
Paul McGuire <pt...@austin.rr.com> writes:

I don't dispute it is _a_ DSL, nor even that it's a very powerful way to do
parsing (in case anyone was wondering). But my intuition tells me that it
ought to be possible to close the remaining gap, and I'm trying to understand
how that could be done; that's the bit I suspect might require a higher
class of DSL (<whisper>i.e. macros</whisper>). Of course my intuition never
fails, but sometimes reality distorts so as to make me look wrong.

The plan is to apply pyparsing to some real project and hopefully gain a
better understanding. I have in mind parsing Cisco config files as a first
trial. As to when I get time...

You see I could just sit down and work out how to solve the problem (eg the
config files) using pyparsing or whatever and move on but I'm sure you of all
people know why that's not enough!

>If you want to just see some existing BNF's implemented in pyparsing,
>you can view them online at the pyparsing wiki. Here are some
>representative examples:
>http://pyparsing.wikispaces.com/space/showimage/simpleSQL.py
>http://pyparsing.wikispaces.com/space/showimage/fourFn.py
>http://pyparsing.wikispaces.com/space/showimage/jsonParser.py

>-- Paul

I appreciate the input and hopefully one day I'll be posting some questions on
the wiki.

Cheers,
Eddie

John J. Lee

Jul 26, 2007, 5:20:30 PM7/26/07
to
(Sorry if this appears twice, accidentally included c.l.py.announce
the first time where it was (correctly) rejected, not sure if that
means it didn't get through to c.l.py either).

greg <gr...@cosc.canterbury.ac.nz> writes:

Interesting.

Would be nice if tarball contained a top-level directory rather than
unpacking directly into the current working directory -- people on
traditional (non-Mac) unixy systems don't expect that. PEP 8-style
module name (snobol.py not Snobol.py) would be nice too.


John
