
Python was designed (was Re: Multi-threading in Python vs Java)


Chris Angelico

Oct 12, 2013, 6:37:58 PM
to pytho...@python.org
On Sat, Oct 12, 2013 at 7:10 AM, Peter Cacioppi
<peter.c...@gmail.com> wrote:
> Along with "batteries included" and "we're all adults", I think Python needs a pithy phrase summarizing how well thought out it is. That is to say, the major design decisions were all carefully considered, and as a result things that might appear to be problematic are actually not barriers in practice. My suggestion for this phrase is "Guido was here".

"Designed".

You simply can't get a good clean design if you just let it grow by
itself, one feature at a time. You'll end up with something where you
can do the same sort of thing in three different ways, and they all
have slightly different names:

http://me.veekun.com/blog/2012/04/09/php-a-fractal-of-bad-design/#general

(Note, I'm not here to say that PHP is awful and Python is awesome
(they are, but I'm not here to say it). It's just that I can point to
a blog post that shows what I'm saying.)

Design is why, for instance, Python's builtin types all behave the
same way with regard to in-place mutator methods: they don't return
self. I personally happen to quite like the "return self" style, as it
allows code like this:

GTK2.MenuBar()
    ->add(GTK2.MenuItem("_File")->set_submenu(GTK2.Menu()
        ->add(menuitem("_New Tab",addtab)->add_accelerator(...))
        ->add(menuitem("Close tab",closetab)->add_accelerator(...))
        ... etc ...
    ))
    ->add(GTK2.MenuItem("_Options")->set_submenu(GTK2.Menu()
        ->add(menuitem("_Font",fontdlg))
        ... etc ...
    ))
    ... etc ...

It's a single expression (this is from Pike, semantically similar to
Python) that creates and sets up the whole menu bar. Most of Pike's
object methods will return this (aka self) if it's believed to be of
use. The Python equivalent, since the .add() method on GTK objects
returns None, is a pile of code with temporary names. But that's a
smallish point of utility against a large point of consistency;
newbies can trust that a line like:

lst = lst.sort()

will trip them up immediately (since lst is now None), rather than
surprise them later when they try to make a sorted copy of the list:

sorted_lst = lst.sort()

which, if list.sort returned self, would leave you with sorted_lst is
lst, almost certainly not what the programmer intended.
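To spell out the trap with plain Python:

```python
lst = [3, 1, 2]

# In-place sort: mutates lst and returns None, by design.
result = lst.sort()
print(result)   # None
print(lst)      # [1, 2, 3]

# Want a sorted copy? That's what the sorted() builtin is for.
orig = [3, 1, 2]
copy = sorted(orig)
print(copy)     # [1, 2, 3]
print(orig)     # [3, 1, 2] -- unchanged
print(copy is orig)  # False -- genuinely a new list
```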

Oh, and the use of exceptions everywhere is a sign of design, too.
Something went wrong that means you can't return a plausible value?
Raise.

>>> json.loads("{")
ValueError: Expecting object: line 1 column 0 (char 0)

>>> pickle.loads(b"\x80")
EOFError

Etcetera. PHP borrows from C in having piles and piles of "was there
an error" functions; there's no consistency in naming, nor (in many
cases) in the return values. Pike generally raises exceptions, but I/O
failure usually results in a zero return and the file object's errno
attribute set; but at least they're consistent error codes.
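And the consistency pays off in calling code: one except clause covers
the whole family of parse failures. A small sketch (the helper name is
mine, not a stdlib function):

```python
import json

def parse_or_default(text, default=None):
    """Return parsed JSON, or a default if parsing fails."""
    try:
        return json.loads(text)
    except ValueError:  # json.JSONDecodeError subclasses ValueError
        return default

print(parse_or_default("{"))          # None
print(parse_or_default('{"a": 1}'))   # {'a': 1}
```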

This is design. Python has a king (Guido). It wasn't built by a
committee. Maybe you won't like some aspect of Python's design, but it
has one, it's not just sloppily slapped together.

ChrisA

Steven D'Aprano

Oct 12, 2013, 11:38:21 PM
to
On Sun, 13 Oct 2013 09:37:58 +1100, Chris Angelico wrote:

> This is design. Python has a king (Guido). It wasn't built by a
> committee. Maybe you won't like some aspect of Python's design, but it
> has one, it's not just sloppily slapped together.


While I agree with your general thrust, I don't think it's quite so
simple. Perl has a king, Larry Wall, but his design is more or less
"throw everything into the pot, it'll be fine" and consequently Perl is,
well, *weird*, with some pretty poor^W strange design decisions.

- Subroutines don't have signatures, you have to parse arguments
yourself by popping values off the magic variable @_ .

- More special variables than you can shake a stick at: @_ $_ $a $b @ARGV
$& ${^ENCODING} $. $| $= $$ $^O $^S @F and many, many more.

- Context sensitivity: these two lines do very different things:

$foo = @bar
@foo = @bar

and so do these two:

my($foo) = `bar`
my $foo = `bar`

- Sigils. Sigils everywhere.

- Separate namespaces for scalars, arrays, hashes, filehandles,
and subroutines (did I miss anything?), co-existing in the same
scope, all the better for writing code like this:

$bar = &foo($foo, $foo[1], $foo{1})

If you think that all three references to $foo refer to the same
variable, you would be wrong.

- Two scoping systems (dynamic and lexical) which don't cooperate.

- Strangers to Perl might think that the way to create a local variable
is to define it as local:

local $foo;

but you'd be wrong. "local" does something completely different. To
create a local variable, use "my $foo" instead.


More here: http://perl.plover.com/FAQs/Namespaces.html
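For contrast, Python keeps a single namespace per scope, so one name
means one thing at a time (a minimal illustration):

```python
# Python: one namespace per scope. Rebinding replaces the previous
# value outright -- there's no separate $foo/@foo/%foo living side
# by side as in Perl.
foo = 42            # foo is an int
foo = [1, 2, 3]     # now it's a list; the int is simply gone
def foo():          # even a def just rebinds the same name
    return "hi"
print(foo())
```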


Likewise Rasmus Lerdorf, king of PHP (at least initially), but he had no
idea what he was doing:

"I had no intention of writing a language. I didn't have a clue how to
write a language. I didn't want to write a language," Lerdorf explained.
"I just wanted to solve a problem of churning out Web applications very,
very fast."

http://www.devshed.com/c/a/PHP/PHP-Creator-Didnt-Set-Out-to-Create-a-Language/



--
Steven

Chris Angelico

Oct 13, 2013, 12:34:33 AM
to pytho...@python.org
On Sun, Oct 13, 2013 at 2:38 PM, Steven D'Aprano
<steve+comp....@pearwood.info> wrote:
> On Sun, 13 Oct 2013 09:37:58 +1100, Chris Angelico wrote:
>
>> This is design. Python has a king (Guido). It wasn't built by a
>> committee. Maybe you won't like some aspect of Python's design, but it
>> has one, it's not just sloppily slapped together.
>
>
> While I agree with your general thrust, I don't think it's quite so
> simple. Perl has a king, Larry Wall, but his design is more or less
> "throw everything into the pot, it'll be fine" and consequently Perl is,
> well, *weird*, with some pretty poor^W strange design decisions.

My apologies, I wasn't exactly clear. Having a king doesn't in any way
guarantee a clean design...

> Likewise Rasmus Lerdorf, king of PHP (at least initially), but he had no
> idea what he was doing:
>
> "I had no intention of writing a language. I didn't have a clue how to
> write a language. I didn't want to write a language," Lerdorf explained.
> "I just wanted to solve a problem of churning out Web applications very,
> very fast."

... yeah, what he said; but having no king pretty much condemns a
project to design-by-committee. Python has a king and a clear design.

In any case, we're broadly in agreement here. It's design that makes
Python good. That's why the PEP system and the interminable
bike-shedding on python-dev is so important... and why, at the end of
the day, the PEP's acceptance comes down to one person (Guido or a
BDFL-Delegate).

ChrisA

Roy Smith

Oct 13, 2013, 9:04:56 AM
to
In article <525a15ad$0$29984$c3e8da3$5496...@news.astraweb.com>,
Steven D'Aprano <steve+comp....@pearwood.info> wrote:

> While I agree with your general thrust, I don't think it's quite so
> simple. Perl has a king, Larry Wall, but his design is more or less
> "throw everything into the pot, it'll be fine" and consequently Perl is,
> well, *weird*, with some pretty poor^W strange design decisions.

To be fair to Larry, there were different design drivers working there.

Pre-perl, people built humungous shell scripts, duct-taping together
little bits of sed, grep, awk, and other command-line tools. What perl
did, was make it easier to use the functionality of those disparate
tools together in a single language. By holding on to the little bits
of syntax from the precursor languages, he kept the result familiar
feeling, so Unix sysadmins (who were the main audience for perl) were
willing to adopt it.

It was wildly successful, not because it was perfect, but because it
beat the pants off what came before it.

rusi

Oct 14, 2013, 6:39:10 AM
to
On Sunday, October 13, 2013 6:34:56 PM UTC+5:30, Roy Smith wrote:
> To be fair to Larry, there were different design drivers working there.

One more thing to be said for perl:

I remember when some colleague first told me about perl (I guess early 90s), I was incredulous that the *same* language could run on DOS and on Unix unchanged.
Yeah, in principle we all talked about portability; in practice, however, we found that the only program that would run on all systems was the asymptotic null C program:
main() {;}

So a full scale language whose programs ran unchanged on all systems was BIG back then.

That we take it for granted today indicates the shoulders of the giants we are standing on.

John Nagle

Oct 14, 2013, 3:18:59 PM
to
On 10/12/2013 3:37 PM, Chris Angelico wrote:
> On Sat, Oct 12, 2013 at 7:10 AM, Peter Cacioppi
> <peter.c...@gmail.com> wrote:
>> Along with "batteries included" and "we're all adults", I think
>> Python needs a pithy phrase summarizing how well thought out it is.
>> That is to say, the major design decisions were all carefully
>> considered, and as a result things that might appear to be
>> problematic are actually not barriers in practice. My suggestion
>> for this phrase is "Guido was here".
>
> "Designed".
>
> You simply can't get a good clean design if you just let it grow by
> itself, one feature at a time.

No, Python went through the usual design screwups. Look at how
painful the slow transition to Unicode was, from just "str" to
Unicode strings, ASCII strings, byte strings, byte arrays,
16 and 31 bit character builds, and finally automatic switching
between rune widths. Old-style classes vs. new-style classes. Adding a
boolean type as an afterthought (that was avoidable; C went through
that painful transition before Python was created). Operator "+"
as concatenation for built-in arrays but addition for NumPy
arrays.

Each of those reflects a design error in the type system which
had to be corrected.

The type system is now in good shape. The next step is to
make Python fast. Python objects have dynamic operations suited
to a naive interpreter like CPython. These make many compile
time optimizations hard. At any time, any thread can monkey-patch
any code, object, or variable in any other thread. The ability
for anything to use "setattr()" on anything carries a high
performance price. That's part of why Unladen Swallow failed
and why PyPy development is so slow.
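The dynamism in question is easy to demonstrate (a minimal sketch; the
class and names here are made up for illustration):

```python
class Greeter:
    def hello(self):
        return "hello"

g = Greeter()
print(g.hello())  # hello

# Any code, at any time, can rebind the method. The interpreter can
# never assume g.hello() still means what it meant a moment ago, which
# is exactly what blocks many compile-time optimizations.
setattr(Greeter, "hello", lambda self: "patched")
print(g.hello())  # patched
```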

John Nagle

Peter Cacioppi

Oct 14, 2013, 5:11:24 PM
to
So Python was designed reasonably well, with a minimum of hacky-screw-ups. This happened because Python's growth was effectively managed by an individual who was well suited to the task. In other words, "Guido was here".

Good thread, I learned a lot from it, thanks.

Mark Lawrence

Oct 14, 2013, 5:43:39 PM
to pytho...@python.org
On 14/10/2013 22:11, Peter Cacioppi wrote:
>
> So Python was designed reasonably well, with a minimum of hacky-screw-ups. This happened because Python's growth was effectively managed by an individual who was well suited to the task. In other words, "Guido was here".
>
> Good thread, I learned a lot from it, thanks.
>

Would you be kind enough to learn something from this please
https://wiki.python.org/moin/GoogleGroupsPython

--
Roses are red,
Violets are blue,
Most poems rhyme,
But this one doesn't.

Mark Lawrence

Chris Angelico

Oct 14, 2013, 6:43:16 PM
to pytho...@python.org
On Tue, Oct 15, 2013 at 6:18 AM, John Nagle <na...@animats.com> wrote:
> On 10/12/2013 3:37 PM, Chris Angelico wrote:
>> "Designed".
>>
>> You simply can't get a good clean design if you just let it grow by
>> itself, one feature at a time.
>
> No, Python went through the usual design screwups. Look at how
> painful the slow transition to Unicode was, from just "str" to
> Unicode strings, ASCII strings, byte strings, byte arrays,
> 16 and 31 bit character builds, and finally automatic switching
> between rune widths. Old-style classes vs. new-style classes. Adding a
> boolean type as an afterthought (that was avoidable; C went through
> that painful transition before Python was created). Operator "+"
> as concatenation for built-in arrays but addition for NumPy
> arrays.
>
> Each of those reflects a design error in the type system which
> had to be corrected.

Oh, Python's design wasn't perfect - that's a pretty much impossible
goal anyway. Sometimes you don't learn what you ought to have done
till it's been in production for a while - that's why, for instance,
these happened:

https://wiki.theory.org/YourLanguageSucks#Fixed_in_Python_3

You'd have to be completely omniscient to avoid that kind of
misjudgment, and breaking backward compatibility is such a major cost
that sometimes design errors just have to be kept. But you'll still
end up with something far cleaner than would come from ad-hoc
undirected changes; it'll be a design with warts, rather than a lack
of design.

> The type system is now in good shape. The next step is to
> make Python fast. Python objects have dynamic operations suited
> to a naive interpreter like CPython. These make many compile
> time optimizations hard. At any time, any thread can monkey-patch
> any code, object, or variable in any other thread. The ability
> for anything to use "setattr()" on anything carries a high
> performance price. That's part of why Unladen Swallow failed
> and why PyPy development is so slow.

Yeah, this does make things hard. But that dynamism is a fundamental
part of Python's design, even if it's used by almost nothing. I'd say
this isn't proof of a design error, just a consequence of a design
decision. Python isn't for everyone, nor for every task - sometimes
it'll be too slow for what you want. So be it! There are plenty of
places where it's good. And there are similar languages (hi Pike!) for
when you want a bit more performance.

ChrisA

Chris Angelico

Oct 14, 2013, 6:45:05 PM
to pytho...@python.org
Pretty much, yeah. We're saying the same thing, only I'm focusing on
the importance of design rather than deifying the person who designed
it. But yes, that comes to much the same result.

ChrisA

Chris Angelico

Oct 14, 2013, 7:11:53 PM
to pytho...@python.org
On Tue, Oct 15, 2013 at 6:18 AM, John Nagle <na...@animats.com> wrote:
> No, Python went through the usual design screwups.
> Each of [the below] reflects a design error in the type system which
> had to be corrected.

I'll pick up each one here as I think some of them need further discussion.

> Look at how painful the slow transition to Unicode was,
> from just "str" to Unicode strings, ASCII strings, byte strings, byte
> arrays, 16 and 31 bit character builds, and finally automatic
> switching between rune widths.

I'm not sure what you mean by all of these - I've known Python for
only a (relatively) short time, wasn't there in the 1.x days (much
less the <1.0 days). But according to its history page, the early 1.x
versions of Python predate the widespread adoption of Unicode, so it's
a little unfair to look with 2013 eyes and say that full true Unicode
support should have been there from the start. If anyone invents a
language today that doesn't handle Unicode properly, I would be very
much disappointed; but changing the meaning of quoted string literals
is a pretty major change. I'm just glad it got sorted out for 3.0. As
to the 16/32 bit builds, there aren't actually very many languages
that get this right; Python's now a blazing torch, showing the way for
others to follow. (Pike's had something very similar to PEP 393 for
years, but nobody looks to obscurities.) I hope we'll see other
languages start to follow suit.

> Old-style classes vs. new-style classes.

By the time I started using Python, new-style classes existed and were
the recommended way to do things, so I never got the "feel" for
old-style classes. I assume there was a simplicity to them, since
new-style classes were described as having a performance cost, but one
worth paying. My guess is it comes under the category of "would have
to be omniscient to recognize what would happen"; Steven, maybe you
can fill us in?

> Adding a
> boolean type as an afterthought (that was avoidable; C went through
> that painful transition before Python was created).

I don't know about that. Some languages get by just fine without a
dedicated boolean type. Python didn't have them, then it had them as
integers, now it has them as bools. Is it a major problem? (Apart from
adding them in a point release. That's an admitted mistake.) Python
doesn't have a 'vector' type either, you just use a tuple. Some things
don't need to be in the language, they can be pushed off to the
standard library. And speaking of which...

> Operator "+" as concatenation for built-in arrays but addition
> for NumPy arrays.

... NumPy definitely isn't part of the language. It's not even part of
the standard library, it's fully third-party. The language decrees
that [1,2] + [3,4] = [1,2,3,4], and that custom_object1 +
custom_object2 = custom_object1.__add__(custom_object2) more or less,
and then leaves the implementation of __add__ up to you. Maybe you'll
make an "Entropy" class, where entropy+=int blocks until it's acquired
that much more entropy (maybe from /dev/random), and entropy-int
returns a random number based on its current state. It makes a measure
of sense, if not what you normally would want. You can shoot yourself
in the foot in any language; and if you write something as big and
popular as NumPy, you get to shoot other people in the foot too! :)
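And the mechanism really is just __add__; a minimal sketch (the Vec
class here is invented for illustration, not NumPy's actual code):

```python
class Vec:
    """Toy element-wise vector: '+' means whatever __add__ says."""
    def __init__(self, *items):
        self.items = list(items)

    def __add__(self, other):
        # Element-wise addition, NumPy-style -- not concatenation.
        return Vec(*(a + b for a, b in zip(self.items, other.items)))

    def __repr__(self):
        return "Vec%r" % (tuple(self.items),)

print([1, 2] + [3, 4])        # [1, 2, 3, 4] -- list '+' concatenates
print(Vec(1, 2) + Vec(3, 4))  # Vec(4, 6) -- same operator, our meaning
```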

ChrisA

Terry Reedy

Oct 14, 2013, 9:35:22 PM
to pytho...@python.org
On 10/14/2013 7:11 PM, Chris Angelico wrote:

> I'm not sure what you mean by all of these - I've known Python for
> only a (relatively) short time, wasn't there in the 1.x days (much
> less the <1.0 days). But according to its history page, the early 1.x
> versions of Python predate the widespread adoption of Unicode, so it's
> a little unfair to look with 2013 eyes and say that full true Unicode
> support should have been there from the start.

The first versions of Python and unicode were developed and released
about the same time. No one knew that either would be as successful as
they have become over two decades.

>> Old-style classes vs. new-style classes.
>
> By the time I started using Python, new-style classes existed and were
> the recommended way to do things, so I never got the "feel" for
> old-style classes. I assume there was a simplicity to them, since

Too simple. All user classes were instances of the userclass type. All
user instances were instances of the userinstance type, or something
like that. They were otherwise separate from builtin types. I have
forgotten the details and have no wish to remember.

The system was usable but klutzy. I believe it was an add-on after the
initial release. People wanted to be able to subclass builtins even back
in 1.4 days, but Guido did not realize how to use the obscure metaclass
hook to do so until 2.2 was being developed. Most core devs are happy to
be rid of them (except when patching 2.7).
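The sort of thing that finally became possible (a small sketch; the
subclass here is invented for illustration):

```python
# With new-style classes (2.2+), builtins can be subclassed directly.
class UpperStr(str):
    def shout(self):
        return self.upper() + "!"

s = UpperStr("hello")
print(s.shout())           # HELLO!
print(isinstance(s, str))  # True -- it's still a real str everywhere
```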

--
Terry Jan Reedy

Mark Janssen

Oct 14, 2013, 9:31:37 PM
to John Nagle, Python List
On Mon, Oct 14, 2013 at 12:18 PM, John Nagle <na...@animats.com> wrote:
> On 10/12/2013 3:37 PM, Chris Angelico wrote:
>> On Sat, Oct 12, 2013 at 7:10 AM, Peter Cacioppi
>> <peter.c...@gmail.com> wrote:
>>> Along with "batteries included" and "we're all adults", I think
>>> Python needs a pithy phrase summarizing how well thought out it is.
>>> That is to say, the major design decisions were all carefully
>>> considered, and as a result things that might appear to be
>>> problematic are actually not barriers in practice. My suggestion
>>> for this phrase is "Guido was here".
>>
>> "Designed".
>>
>> You simply can't get a good clean design if you just let it grow by
>> itself, one feature at a time.
>
> No, Python went through the usual design screwups.

I hesitate to poke my nose in here, but Python is fine. No one knows
how to design the perfect language from the start, otherwise it would
be here. But Python has set the precedent for allowing
backwards-incompatibility to fix language problems and that's what
will keep it from breaking.

> Look at how
> painful the slow transition to Unicode was, from just "str" to
> Unicode strings, ASCII strings, byte strings, byte arrays,

This is where I wish I could have been involved with the discussion,
but I was outside of civilization at the time, and was not able to
contribute.

> 16 and 31 bit character builds, and finally automatic switching
> between rune widths. Old-style classes vs. new-style classes. Adding a
> boolean type as an afterthought (that was avoidable; C went through
> that painful transition before Python was created). Operator "+"
> as concatenation for built-in arrays but addition for NumPy
> arrays.

All of this will get fixed, but the problem is that you are stirring
up issues without really understanding the problem. The problem is
something that is at the bleeding-edge of Computer Science itself and
settling on a theory of types. I've answered this by creating a
unified object model, but no one has understood why the hell anyone
needs one, so I'm sitting around waiting for a friend...

> Each of those reflects a design error in the type system which
> had to be corrected.

To call it a "design error" makes it seem like someone made a decision
that resulted in a mistake, but it isn't (wasn't) that simple.

> The type system is now in good shape. The next step is to
> make Python fast.

Woah, boy. There's no reason to make an incomplete design faster, for
pseudo-problems that no one will care about in 5-10 years. The field
has yet to realize that it needs an object model, or even what that
is.

> Python objects have dynamic operations suited
> to a naive interpreter like CPython.

Naive, no.

> These make many compile
> time optimizations hard. At any time, any thread can monkey-patch
> any code, object, or variable in any other thread. The ability
> for anything to use "setattr()" on anything carries a high
> performance price. That's part of why Unladen Swallow failed
> and why PyPy development is so slow.

Yes, and all of that is because, the world has not settled on some
simple facts. It needs an understanding of type system. It's been
throwing terms around, some of which are well-defined, but others,
not: there has been enormous cross-breeding that has made mutts out
of everybody and someone's going to have to eat a floppy disk for
feigning authority where there wasn't any.

Mark J
Tacoma, Washington

Chris Angelico

Oct 14, 2013, 9:48:30 PM
to pytho...@python.org
On Tue, Oct 15, 2013 at 12:31 PM, Mark Janssen
<dreamin...@gmail.com> wrote:
>> Python objects have dynamic operations suited
>> to a naive interpreter like CPython.
>
> Naive, no.
>

"Naive", in this instance, means executing code exactly as written,
without optimizing things (and it's not an insult, btw). For instance,
a C compiler might turn this into simple register operations:

int x=5;

int foo()
{
    x+=3;
    return x*2;
}

The two references to 'x' inside foo() can safely be assumed to be the
same 'x', and the value as written by the += MUST be the one used to
calculate *2. If you declare x to be volatile, that assumption won't
be made, and the interpretation will be naive. Now here's the CPython
equivalent:

x=5
def foo():
    global x
    x+=3
    return x*2

>>> dis.dis(foo)
3 0 LOAD_GLOBAL 0 (x)
3 LOAD_CONST 1 (3)
6 INPLACE_ADD
7 STORE_GLOBAL 0 (x)

4 10 LOAD_GLOBAL 0 (x)
13 LOAD_CONST 2 (2)
16 BINARY_MULTIPLY
17 RETURN_VALUE

Note that the global is stored, then reloaded. This is the naive
approach, assuming nothing about the relations between operations.
It's an easy way to be thread-safe, it just costs performance.

ChrisA

Roy Smith

Oct 14, 2013, 9:50:24 PM
to
In article <mailman.1087.1381800...@python.org>,
Terry Reedy <tjr...@udel.edu> wrote:

> The first versions of Python and unicode were developed and released
> about the same time. No one knew that either would be as successful as
> they have become over two decades.

Much the same can be said for IPv6 :-)

Mark Janssen

Oct 14, 2013, 10:11:35 PM
to Chris Angelico, Python List
>>> Python objects have dynamic operations suited
>>> to a naive interpreter like CPython.
>>
>> Naive, no.
>
> "Naive", in this instance, means executing code exactly as written,
> without optimizing things (and it's not an insult, btw).

In that case, you're talking about a "non-optimizing" interpreter, but
then, that's what is supposed to happen. I don't think it's fair to
call it "naive". An interpreter can't guess what you mean to do in
every circumstance (threading?). It's better to do it right (i.e.
well-defined), *slowly* than to do it fast, incorrectly.

MarkJ
Tacoma, Washington

Chris Angelico

Oct 14, 2013, 10:19:39 PM
to pytho...@python.org
On Tue, Oct 15, 2013 at 1:11 PM, Mark Janssen <dreamin...@gmail.com> wrote:
>> "Naive", in this instance, means executing code exactly as written,
>> without optimizing things (and it's not an insult, btw).
>
> In that case, you're talking about a "non-optimizing" interpreter, but
> then, that what is supposed to happen. I don't think it's fair to
> call it "naive". An interpreter can't guess what you mean to do in
> every circumstance (threading?). It's better to do it right (i.e.
> well-defined), *slowly* than to do it fast, incorrectly.

The only thing that's unfair is the interpretation of "naive" as
meaning somehow inferior.

https://en.wikipedia.org/wiki/Naivety#Science

As you say, it's better to do it right slowly than wrong quickly. The
naive method is more easily proven.

ChrisA

rusi

Oct 14, 2013, 11:02:25 PM
to
On Tuesday, October 15, 2013 7:01:37 AM UTC+5:30, zipher wrote:
> Yes, and all of that is because, the world has not settled on some
> simple facts. It needs an understanding of type system. It's been
> throwing terms around, some of which are well-defined, but others,
> not: there has been enormous cross-breeding that has made mutts out
> of everybody and someone's going to have to eat a floppy disk for
> feigning authority where there wasn't any.

Objects in programming languages (or 'values' if one is more functional programming oriented) correspond to things in the world.
Types on the other hand correspond to our classifications and so are things in our minds.
So for the world 'to settle' on a single universal type system is about as nonsensical and self-contradictory as you and I having the same thoughts.

To see how completely nonsensical a classification system of a so-called alien culture is, please read:
http://en.wikipedia.org/wiki/Celestial_Emporium_of_Benevolent_Knowledge

And then reflect that the passage is implying that CONVERSELY our natural/obvious/FACTual classifications would appear similarly nonsensical to them.

The same in the world of programming languages:

Here's an APL session
$ ./apl

Welcome to GNU APL version 1.0
1 + 2
3
1 + 2 3 4
3 4 5
1 = 2
0
1 2 3 = 2 3 4
0 0 0
1 = 1 2 3
1 0 0
2 ≥ 1 2 3
1 1 0


a perfectly good (and for many of us old-timers a very beautiful) type system,
but completely incompatible with anything designed in the last 40 years!
[Hell, it does not even have a prompt! Also note the character set
(≥ not >=) -- long before Unicode, not an emasculated deference to ASCII.]

Steven D'Aprano

Oct 14, 2013, 11:18:25 PM
to
On Mon, 14 Oct 2013 12:18:59 -0700, John Nagle wrote:

> No, Python went through the usual design screwups. Look at how
> painful the slow transition to Unicode was, from just "str" to Unicode
> strings, ASCII strings, byte strings, byte arrays, 16 and 31 bit
> character builds, and finally automatic switching between rune widths.

Are you suggesting that Guido van Rossum wasn't omniscient back in 1991
when he first released Python??? OH MY GOD!!! You ought to blog about
this, let the world know!!!!

But seriously... although the Unicode standard began as early as
1987, the first official release of the standard wasn't until nine months
after the first public release of Python. Do you really consider it a
"design screwup" that Guido didn't build support for Unicode into Python
since the beginning?

Given the constraints of backwards-compatibility, and that Unicode didn't
even exist when Python was first created, I don't think the history of
Unicode support in Python is a screw-up in the least. And if it is a
screw-up, it's *Unicode's* screw-up, because they're the ones that
thought that 16-bit chars would have been enough in the first place.

While it would have been nice if Python had invented the idea of using
different rune widths back in Python 2.2, I don't think we can hold it
against GvR or the other Python devs that they didn't. They're only
human. As far as I know, only one other language does such a thing,
namely Pike, which is not exactly high-profile.


> Old-style classes vs. new-style classes. Adding a boolean type as an
> afterthought (that was avoidable; C went through that painful transition
> before Python was created). Operator "+" as concatenation for
> built-in arrays but addition for NumPy arrays.
>
> Each of those reflects a design error in the type system which
> had to be corrected.

Perhaps the first one -- had GvR not decided in the first place that
built-in types should be separate from user-defined classes, the old vs
new style class thing would have been unnecessary. But bools are not an
example. The decision to leave out bools as a separate type was, and
remains, a perfectly legitimate decision. Perhaps one might argue that
Python-with-bools is *better* than Python-without-bools, but we would be
foolish to argue that Python-without-bools was a screw-up. Bools are a
nice-to-have, not a must-have.
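And the way bools finally landed (in 2.3, as a subclass of int) meant
old truth-as-integer code kept working unchanged, which anyone can check:

```python
# bool is a subclass of int, so booleans still behave as 0 and 1
# wherever integers are expected.
print(isinstance(True, int))     # True
print(True + True)               # 2
print(sum([True, False, True]))  # 2 -- counting flags still works
```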

And as for numpy arrays, well, if a long-standing Python developer such
as yourself doesn't yet understand that this is a feature, not a mistake,
there's a serious problem, and it's not with Python. Operator overloading
exists precisely so that custom classes aren't limited to the exact same
behaviour as built-ins. The fact that the numpy devs made a different
decision as to what + means than the Python devs is not a sign that the
design was screwed up, it is a sign that the system works.

It is true that numpy has a problem with Python operators in that there
aren't enough of them. There have been various attempts to work out a
good syntax for adding arbitrary additional operators, so that numpy can
have *both* element-wise operators and array-wise operators at the same
time. But the lack of this is not a design screw-up. It's a hard problem
to solve, and sometimes it is better to do without a feature than to add
it poorly.


> The type system is now in good shape. The next step is to
> make Python fast.

Whenever I see somebody describing a *language* as "fast" or "slow",
especially when the next few sentences reveal that they are aware of the
existence of multiple *implementations*:

> Python objects have dynamic operations suited to a
> naive interpreter like CPython. [...] That's
> part of why Unladen Swallow failed and why PyPy development is so slow.

as if "fast" and "slow" were objective, concrete and most importantly
*fixed* standards that are the same for everybody, then I suspect
trolling.

Or to put it another way: Python is already fast. Using PyPy, you can
write pure-Python code that is faster than the equivalent optimized C
code compiled using gcc. Even using vanilla CPython, you can write pure
Python code that (for example) checks over 12,000 nine-digit integers for
primality per second, on a relatively old and slow computer. If that's
not *fast*, nothing is.
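As a rough illustration of the kind of pure-Python code being described (plain trial division here; the actual code and machine behind the 12,000/sec figure aren't shown, and a smarter test such as Miller-Rabin would be faster still):

```python
def is_prime(n):
    """Deterministic trial division; adequate for nine-digit integers."""
    if n < 2:
        return False
    if n % 2 == 0:
        return n == 2
    i = 3
    while i * i <= n:  # only need divisors up to sqrt(n)
        if n % i == 0:
            return False
        i += 2
    return True

print(is_prime(999999937))  # True: a nine-digit prime
print(is_prime(123456789))  # False: digit sum 45, divisible by 9
```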

Whether it is *fast enough* is a completely different question, and one
which leads to the question "fast enough for what?". But people who like
to complain about "Python being slow" don't like that question.


--
Steven

Chris Angelico

unread,
Oct 14, 2013, 11:29:03 PM10/14/13
to pytho...@python.org
On Tue, Oct 15, 2013 at 2:18 PM, Steven D'Aprano
<steve+comp....@pearwood.info> wrote:
> Even using vanilla CPython, you can write pure
> Python code that (for example) checks over 12,000 nine-digit integers for
> primality per second, on a relatively old and slow computer. If that's
> not *fast*, nothing is.

Agreed. I used to do my major numeric calculations in OS/2 REXX, which
served me quite well in the 1990s. But Python smokes it, even the
oh-so-slow Python 3 where every integer is a bignum (which is a fair
comparison against REXX, where every number is... uhh... a string). A
modern, optimized language, even one that's perceived as "slow", is
able to do many orders of magnitude more calculations per second than
a human can.... okay, that's hardly fair, but it's about the only
absolute we're ever going to get :)

ChrisA

rusi

unread,
Oct 14, 2013, 11:48:15 PM10/14/13
to
On Tuesday, October 15, 2013 8:48:25 AM UTC+5:30, Steven D'Aprano wrote:
> On Mon, 14 Oct 2013 12:18:59 -0700, John Nagle wrote:
>
> > No, Python went through the usual design screwups. Look at how
> > painful the slow transition to Unicode was, from just "str" to Unicode
> > strings, ASCII strings, byte strings, byte arrays, 16 and 31 bit
> > character builds, and finally automatic switching between rune widths.
>
>
> Are you suggesting that Guido van Rossum wasn't omniscient back in 1991
> when he first released Python??? OH MY GOD!!! You ought to blog about
> this, let the world know!!!!

You are making a strawman out of John's statements:

> Python went through the usual design screwups.
> [screwup list which perhaps pinches John most]
> Each of those reflects a design error in the type system which had to be corrected.

The reasonable interpretation of John's statements is that propriety and even truth are functions of time: It was inappropriate for GvR to have put in Unicode in 1990. It was appropriate in 2008. And it was done. You may call that being-human-not-God. I call that being real.

To have reality time-invariant would imply, for example, that Abraham Lincoln was a racist because he used the word 'negro': (see speech
http://en.wikipedia.org/wiki/Abraham_Lincoln_and_slavery#Legal_and_political )

Or that it is ok to do so today.

Steven D'Aprano

unread,
Oct 15, 2013, 1:51:50 AM10/15/13
to
On Mon, 14 Oct 2013 20:48:15 -0700, rusi wrote:

> On Tuesday, October 15, 2013 8:48:25 AM UTC+5:30, Steven D'Aprano wrote:
>> On Mon, 14 Oct 2013 12:18:59 -0700, John Nagle wrote:
>>
>> > No, Python went through the usual design screwups. Look at how
>> > painful the slow transition to Unicode was, from just "str" to
>> > Unicode strings, ASCII strings, byte strings, byte arrays, 16 and 31
>> > bit character builds, and finally automatic switching between rune
>> > widths.
>>
>>
>> Are you suggesting that Guido van Rossum wasn't omniscient back in 1991
>> when he first released Python??? OH MY GOD!!! You ought to blog about
>> this, let the world know!!!!
>
> You are making a strawman out of John's statements:
>
>> Python went through the usual design screwups. [screwup list which
>> perhaps pinches John most] Each of those reflects a design error in the
>> type system which had to be corrected.
>
> The reasonable interpretation of John's statements is that propriety and
> even truth are functions of time: It was inappropriate for GvR to have
> put in Unicode in 1990. It was appropriate in 2008. And it was done.
> You may call that being-human-not-God. I call that being real.

And I agree with you! But that's not what John wrote. John called it a
design screw-up. His very first example was the slow transition to
Unicode. Not "here's a choice that made sense at the time", but "screw-
up".


--
Steven

Mark Lawrence

unread,
Oct 15, 2013, 3:21:00 AM10/15/13
to pytho...@python.org
My encyclopedia doesn't mention Python, unicode or IPv6. Not that it's
old, but the stone mason retired years ago :)

Antoon Pardon

unread,
Oct 15, 2013, 3:48:32 AM10/15/13
to pytho...@python.org
Op 15-10-13 01:11, Chris Angelico schreef:
> On Tue, Oct 15, 2013 at 6:18 AM, John Nagle <na...@animats.com> wrote:

>
>> Operator "+" as concatenation for built-in arrays but addition
>> for NumPy arrays.
>
> ... NumPy definitely isn't part of the language. It's not even part of
> the standard library, it's fully third-party.

That doesn't matter. Adding and concatenating are different operations and
there are types in which both occur rather naturally. So as the designer
of such a class you have to choose which operation gets the natural
Python operator and which operation you have to express differently.
NumPy is just an example showing that you can't escape this sort
of incompatibility in Python.

--
Antoon Pardon

Steven D'Aprano

unread,
Oct 15, 2013, 4:50:10 AM10/15/13
to
On Tue, 15 Oct 2013 10:11:53 +1100, Chris Angelico wrote:

> On Tue, Oct 15, 2013 at 6:18 AM, John Nagle <na...@animats.com> wrote:

>> Old-style classes vs. new-style classes.
>
> By the time I started using Python, new-style classes existed and were
> the recommended way to do things, so I never got the "feel" for
> old-style classes. I assume there was a simplicity to them, since
> new-style classes were described as having a performance cost, but one
> worth paying. My guess is it comes under the category of "would have to
> be omniscient to recognize what would happen"; Steven, maybe you can
> fill us in?

Heh, I'm not privy to why GvR decided to have built-in types and classes
be distinct :-) but it wasn't a performance issue. The very first
versions of Python didn't have user-defined types at all:

http://python-history.blogspot.com.au/2009/02/adding-support-for-user-defined-classes.html

For a while classic classes were faster, but I don't think that's been
the case for a long while now.

You can read up more about the unification of types and classes here:

http://www.python.org/download/releases/2.2.3/descrintro/

and the associated PEPs:

http://www.python.org/dev/peps/pep-0252/
http://www.python.org/dev/peps/pep-0253/


but note that the PEPs may no longer reflect the current implementation.
The descrintro document above is interesting for its explanation of how
descriptors work.


>> Adding a
>> boolean type as an afterthought (that was avoidable; C went through
>> that painful transition before Python was created).
>
> I don't know about that. Some languages get by just fine without a
> dedicated boolean type. Python didn't have them, then it had them as
> integers, now it has them as bools.

Actually, Python's bools *are* ints :-)
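In CPython that's literal: bool is a subclass of int, which is easy to verify at the interactive prompt:

```python
print(issubclass(bool, int))     # True: bool is a subclass of int
print(isinstance(True, int))     # True
print(True + True)               # 2 -- bools behave as 1 and 0 in arithmetic
print(sum([True, False, True]))  # 2 -- handy for counting matches
```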


If you read the whole python-history blog on blogspot, you'll see that
Python has had its share of mistakes, design failures and other "oops!"
moments. I think that it is a testament to GvR's over-all design that the
end result has been so good, despite the mistakes, as well as Python's
conservative-but-not-too-conservative approach to changes. Compared to
(say) Firefox, which comes out with new releases seemingly twice a week,
Python is slow to change and conservative; compared to (say) Fortran,
which changes in a time-span of decades rather than years, it's quite
fast moving. I think Python has more or less hit the sweet-spot.



--
Steven

Chris Angelico

unread,
Oct 15, 2013, 4:57:50 AM10/15/13
to pytho...@python.org
On Tue, Oct 15, 2013 at 6:48 PM, Antoon Pardon
<antoon...@rece.vub.ac.be> wrote:
> Op 15-10-13 01:11, Chris Angelico schreef:
>> On Tue, Oct 15, 2013 at 6:18 AM, John Nagle <na...@animats.com> wrote:
>
>>
>>> Operator "+" as concatenation for built-in arrays but addition
>>> for NumPy arrays.
>>
>> ... NumPy definitely isn't part of the language. It's not even part of
>> the standard library, it's fully third-party.
>
> That doesn't matter. Adding and concatenating are different operations and
> there are types in which both occur rather naturally. So as the designer
> of such a class you have to choose which operation gets the natural
> Python operator and which operation you have to express differently.
> NumPy is just an example showing that you can't escape this sort
> of incompatibility in Python.

So what should "abc" + "def" result in, if addition is different from
concatenation? No, adding strings should concatenate them. And other
arithmetic operators make sense, too; Python doesn't happen to
implement str-str or str/str, but some languages do:

> "abc"+"def"-"abc";
(1) Result: "def"
> "abc"-"b";
(2) Result: "ac"
> "foo bar asdf qwer"/" "*"##";
(3) Result: "foo##bar##asdf##qwer"

PHP has separate addition and concatenation operators, and it doesn't
help anything (granted, the biggest problem is that every other
language we work with uses + to concat strings, so it's an easy source
of bugs); having multiple operators for "add the elements of these
arrays" and "add these arrays together" is really orthogonal to the
general issue of adding and concatenating needing different operators.

But as Steven said, the way different types are free to react
differently to the same operators is *GOOD* design in Python. Java
doesn't let user-defined types override operators, so all those
possibilities are closed. You can argue that it's poor design in NumPy
(and people will argue the contrary position), but it's definitely not
a flaw in Python-the-language.

ChrisA

Antoon Pardon

unread,
Oct 15, 2013, 6:26:44 AM10/15/13
to pytho...@python.org
Op 15-10-13 10:57, Chris Angelico schreef:
> On Tue, Oct 15, 2013 at 6:48 PM, Antoon Pardon
> <antoon...@rece.vub.ac.be> wrote:
>> Op 15-10-13 01:11, Chris Angelico schreef:
>>> On Tue, Oct 15, 2013 at 6:18 AM, John Nagle <na...@animats.com> wrote:
>>
>>>
>>>> Operator "+" as concatenation for built-in arrays but addition
>>>> for NumPy arrays.
>>>
>>> ... NumPy definitely isn't part of the language. It's not even part of
>>> the standard library, it's fully third-party.
>>
>> That doesn't matter. Adding and concatenating are different operations and
>> there are types in which both occur rather naturally. So as the designer
>> of such a class you have to choose which operation gets the natural
>> Python operator and which operation you have to express differently.
>> NumPy is just an example showing that you can't escape this sort
>> of incompatibility in Python.
>
> So what should "abc" + "def" result in, if addition is different from
> concatenation?

A type error.

> No, adding strings should concatenate them. And other
> arithmetic operators make sense, too; Python doesn't happen to
> implement str-str or str/str, but some languages do:
>
>> "abc"+"def"-"abc";
> (1) Result: "def"
>> "abc"-"b";
> (2) Result: "ac"
>> "foo bar asdf qwer"/" "*"##";
> (3) Result: "foo##bar##asdf##qwer"
>
> PHP has separate addition and concatenation operators, and it doesn't
> help anything (granted, the biggest problem is that every other
> language we work with uses + to concat strings, so it's an easy source
> of bugs); having multiple operators for "add the elements of these
> arrays" and "add these arrays together" is really orthogonal to the
> general issue of adding and concatenating needing different operators.

No it is not, because adding elements to an array is concatenation.
If you want the array-adding operator to be the same as the adding
operator for other types, and the array-concatenating operator to
be the same as the concatenating operator for other types, you can
only do that if the two operations use different operators.

Since Python uses the same operator for both, you are forced to
introduce some kind of incompatibility.

If you have one library function that assumes '+' concatenates
sequences and another that assumes '+' has group-like properties,
you can't implement your arrays in a way that satisfies both.

--
Antoon Pardon

Steven D'Aprano

unread,
Oct 15, 2013, 11:01:58 AM10/15/13
to
On Tue, 15 Oct 2013 19:57:50 +1100, Chris Angelico wrote:

> On Tue, Oct 15, 2013 at 6:48 PM, Antoon Pardon
> <antoon...@rece.vub.ac.be> wrote:

>> That doesn't matter. Adding and concatenating are different operations and
>> there are types in which both occur rather naturally. So as the designer
>> of such a class you have to choose which operation gets the natural
>> Python operator and which operation you have to express differently.
>> NumPy is just an example showing that you can't escape this sort
>> of incompatibility in Python.
>
> So what should "abc" + "def" result in, if addition is different from
> concatenation?

TypeError, like any other unsupported operator.


> No, adding strings should concatenate them. And other
> arithmetic operators make sense, too;

For some definition of "sense".


> Python doesn't happen to implement str-str or str/str, but some
> languages do:

Which languages are you talking about?

For the record, if PHP is one of them, I consider that a good sign that
it shouldn't be done :-)


>> "abc"+"def"-"abc";
> (1) Result: "def"

Eww. What would "xyz" - "abc" give? How about "cba" - "abc"?

And "abcdabc" - "abc"?

Justify your answers.



>> "abc"-"b";
> (2) Result: "ac"
>> "foo bar asdf qwer"/" "*"##";
> (3) Result: "foo##bar##asdf##qwer"

And what, pray tell, would "foo bar" / " " be on its own?

How about "foo bar" * "*"?

Seems to me that using s/t*u as a way to perform substring replacement is
too clever by half.


> PHP has separate addition and concatenation operators, and it doesn't
> help anything

That's because PHP is beyond help.


> (granted, the biggest problem is that every other language
> we work with uses + to concat strings, so it's an easy source of bugs);
> having multiple operators for "add the elements of these arrays" and
> "add these arrays together" is really orthogonal to the general issue of
> adding and concatenating needing different operators.

Yes -- string concatenation and array operations are not really related,
although string concatenation is a special case of array concatenation.
But it is a wild over-generalisation to assume that because strings are
arrays, and (numeric) arrays might want separate element-wise
multiplication and whole-array multiplication operators, therefore
strings need to support the same too.

Personally, I think string and array concatenation ought to be & rather
than +. That would free up + for element-wise addition for arrays, while
still allowing & for concatenation. Of course, the cost would be the loss
of an element-wise bit-and operator. You win some, you lose some.
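Nothing stops a user-defined type from adopting that convention today, of course. A minimal sketch (the Vec class below is a hypothetical illustration, not a proposal for the builtins):

```python
class Vec:
    """Hypothetical sequence where + is element-wise and & concatenates."""
    def __init__(self, data):
        self.data = list(data)

    def __add__(self, other):
        # Element-wise addition, freeing + from concatenation duty.
        return Vec(a + b for a, b in zip(self.data, other.data))

    def __and__(self, other):
        # & does the concatenating, as suggested above.
        return Vec(self.data + other.data)

print((Vec([1, 2]) + Vec([3, 4])).data)  # [4, 6]
print((Vec([1, 2]) & Vec([3, 4])).data)  # [1, 2, 3, 4]
```

The cost noted above applies here too: this Vec has given up & as a bitwise operator.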



--
Steven

rusi

unread,
Oct 15, 2013, 11:48:02 AM10/15/13
to
On Tuesday, October 15, 2013 2:20:10 PM UTC+5:30, Steven D'Aprano wrote:
> If you read the whole python-history blog on blogspot, you'll see that
> Python has had its share of mistakes, design failures and other "oops!"
> moments. I think that it is a testament to GvR's over-all design that the
> end result has been so good, despite the mistakes, as well as Python's
> conservative-but-not-too-conservative approach to changes. Compared to
> (say) Firefox, which comes out with new releases seemingly twice a week,
> Python is slow to change and conservative; compared to (say) Fortran,
> which changes in a time-span of decades rather than years, it's quite
> fast moving. I think Python has more or less hit the sweet-spot.

Yes heartily agree.

Mostly we have systems/software/languages that fall into one of two categories:
a. Completely immobile -- which means the only change is the inevitable bitrot of slow aging
b. So much blood that we can't see the 'edge' in the bleeding edge

That way python is surely in a sweetspot (for me):

In 2001 I started teaching programming with a computer projector (rather than chalk). This meant that classes became more dynamic and alive but also my head was more than ever on the line -- with chalk-board you can occasionally fudge your way out with bullshit. When the projector is displaying an unexpected result or a backtrace/segfault there is no such room!!

By chance(?) it was also the time I started teaching python.

And for these last 10+ years python has been like a faithful servant -- useful, unobtrusive, predictable. The number of times I've had to say in a class: "Ok guys, I don't know..." (in short, I am screwed) is hardly a handful.

One of the very occasional embarrassing exceptions:
https://mail.python.org/pipermail/python-list/2011-July/609362.html

Chris Angelico

unread,
Oct 15, 2013, 3:09:06 PM10/15/13
to pytho...@python.org
On Wed, Oct 16, 2013 at 2:01 AM, Steven D'Aprano
<steve+comp....@pearwood.info> wrote:
> On Tue, 15 Oct 2013 19:57:50 +1100, Chris Angelico wrote:
>> Python doesn't happen to implement str-str or str/str, but some
>> languages do:
>
> Which languages are you talking about?
>
> For the record, if PHP is one of them, I consider that a good sign that
> it shouldn't be done :-)

My examples here are from Pike.

>>> "abc"+"def"-"abc";
>> (1) Result: "def"
>
> Eww. What would "xyz" - "abc" give? How about "cba" - "abc"?
>
> And "abcdabc" - "abc"?
>
> Justify your answers.

> "xyz" - "abc";
(1) Result: "xyz"
> "cba" - "abc";
(2) Result: "cba"
> "abcdabc" - "abc";
(3) Result: "d"

Every instance of the subtracted-out string is removed. It's something
like x.remove(y) in many other languages.

>>> "abc"-"b";
>> (2) Result: "ac"
>>> "foo bar asdf qwer"/" "*"##";
>> (3) Result: "foo##bar##asdf##qwer"
>
> And what, pray tell, would "foo bar" / " " be on its own?

A two-element array "foo","bar":

> "foo bar" / " ";
(4) Result: ({ /* 2 elements */
"foo",
"bar"
})

> How about "foo bar" * "*"?

That's an error (the equivalent of TypeError).

> Seems to me that using s/t*u as a way to perform substring replacement is
> too clever by half.

Maybe :) It looks fancy for something like this. Normally, I use
string division with iteration, but there is a more direct "replace"
function for when that's being done.

> Personally, I think string and array concatenation ought to be & rather
> than +. That would free up + for element-wise addition for arrays, while
> still allowing & for concatenation. Of course, the cost would be the loss
> of an element-wise bit-and operator. You win some, you lose some.

I don't know who was first to implement string+string --> concat, but
since it's swept the world, it's worth keeping just on that basis, in
the same way that the use of octal for character escapes like "\123"
is worth keeping:

>>> ord("\123")
83
>>> 0o123
83

Confused me no end when I came across BIND's use of that syntax to
mean decimal. So if & has its problems and + has its problems, go with
+, as it'll confuse fewer people.

ChrisA

wxjm...@gmail.com

unread,
Oct 15, 2013, 4:11:17 PM10/15/13
to
Le lundi 14 octobre 2013 21:18:59 UTC+2, John Nagle a écrit :

----

[...]
>
> No, Python went through the usual design screwups. Look at how
>
> painful the slow transition to Unicode was, from just "str" to
>
> Unicode strings, ASCII strings, byte strings, byte arrays,
>
> 16 and 31 bit character builds, and finally automatic switching
>
> between rune widths. [...]


Yes, a real disaster.

This "poor" Python spends its time re-encoding
when necessary, not counting the fact that it must first
check whether re-encoding is needed.

Where is Unicode? Away.

jmf

Mark Janssen

unread,
Oct 15, 2013, 4:26:27 PM10/15/13
to rusi, Python List
> Objects in programming languages (or 'values' if one is more functional programming oriented) correspond to things in the world.

One of the things you're saying there is that "values correspond to
things in the world". But you will not get agreement in computer
science on that anymore than saying "numbers correspond to things in
the world" -- they are abstractions that are not supposed to
correspond to things. (Objects, OTOH, were intended to, so your
statement has mixed truthiness.)

> Types on the other hand correspond to our classifications and so are things in our minds.

That is not how a C programmer views it. They have explicit
"typedef"s that make it a thing for the computer.

> So for the world 'to settle' on a single universal type system is about as nonsensical and self contradictory as you and I having the same thoughts.

Yes, well clearly we are not "having the same thoughts", yet the
purpose of the academic establishment is to pin down such terminology
and not have these sloppy understandings everywhere. You dig?

> To see how completely nonsensical a classification system of a so-called alien culture is, please read:
> http://en.wikipedia.org/wiki/Celestial_Emporium_of_Benevolent_Knowledge
>
> And then reflect that the passage is implying that CONVERSELY our natural/obvious/FACTual classifications would appear similarly nonsensical to them.
>
> The same in the world of programming languages:

No. There is one world in which the computer is well-defined. All
others are suspect.

> Here's an APL session
> $ ./apl
> a perfectly good (and for many of us old-timers a very beautiful) type system
> but completely incompatible with anything designed in the last 40 years!

Yeah, well 40 years ago they didn't have parsers. The purpose of
having a field of computer science worthy of the name, is to advance
the science not let this riff-raff dominate the practice.

--
MarkJ
Tacoma, Washington

Mark Lawrence

unread,
Oct 15, 2013, 5:00:29 PM10/15/13
to pytho...@python.org
I very much look forward to seeing your correct Python unicode
implementation on the bug tracker.

Tim Chase

unread,
Oct 15, 2013, 5:17:33 PM10/15/13
to Chris Angelico, pytho...@python.org
On 2013-10-16 06:09, Chris Angelico wrote:
> > "xyz" - "abc";
> (1) Result: "xyz"
> > "cba" - "abc";
> (2) Result: "cba"
> > "abcdabc" - "abc";
> (3) Result: "d"
>
> Every instance of the subtracted-out string is removed. It's
> something like x.remove(y) in many other languages.

Or as one might write x.remove(y) in Python:

for demo in ("xyz", "cba", "abcdabc"):
    print repr(demo), "->", repr(demo.replace("abc", ""))

> >>> "abc"-"b";
> >> (2) Result: "ac"
> >>> "foo bar asdf qwer"/" "*"##";
> >> (3) Result: "foo##bar##asdf##qwer"
> >
> > And what, pray tell, would "foo bar" / " " be on its own?
>
> A two-element array "foo","bar":
>
> > "foo bar" / " ";
> (4) Result: ({ /* 2 elements */
> "foo",
> "bar"
> })

which in Python sounds suspiciously like dividend.split(divisor)

So Python's giving both functionalities in ways that are more
readable (and in the case of "-", more flexible, as you can replace
with anything, not just delete the matching content).

While subtraction and division of strings make theoretical sense, I'm
glad I don't have to think about them in my Python code ;-)

-tkc

Grant Edwards

unread,
Oct 15, 2013, 5:46:27 PM10/15/13
to
On 2013-10-15, Mark Janssen <dreamin...@gmail.com> wrote:

> Yeah, well 40 years ago they didn't have parsers.

That seems an odd thing to say. People were assembling and compiling
computer programs long before 1973.

How did they do that without parsers?

--
Grant Edwards grant.b.edwards Yow! Of course, you
at UNDERSTAND about the PLAIDS
gmail.com in the SPIN CYCLE --

Rhodri James

unread,
Oct 15, 2013, 6:01:06 PM10/15/13
to
On Tue, 15 Oct 2013 21:26:27 +0100, Mark Janssen
<dreamin...@gmail.com> wrote:

>> = Rusi, attribution missing from original.

>> Objects in programming languages (or 'values' if one is more functional
>> programming oriented) correspond to things in the world.
>
> One of the things you're saying there is that "values correspond to
> things in the world". But you will not get agreement in computer
> science on that anymore than saying "numbers correspond to things in
> the world" -- they are abstractions that are not supposed to
> correspond to things. (Objects, OTOH, were intended to, so your
> statement has mixed truthiness.)
>
>> Types on the other hand correspond to our classifications and so are
>> things in our minds.
>
> That is not how a C programmer views it. They have explicit
> "typedef"s that make it a thing for the computer.

Speaking as a C programmer, no. We have explicit typedefs to create new
labels for existing types, to make the type-abstraction easier to relate
to the object-abstraction. Not that I personally think of C objects as
abstractions, I'm rather with Rusi on that, but if you do then the object
type must be an abstracted abstraction. At which point my head starts to
hurt and I'll get back to some engineering if you don't mind.

>> The same in the world of programming languages:
>
> No. There is one world in which the computer is well-defined. All
> others are suspect.

Perhaps, though I'm personally rather dubious of that claim. Proving
well-definedness may be rather interesting.

> Yeah, well 40 years ago they didn't have parsers. The purpose of
> having a field of computer science worthy of the name, is to advance
> the science not let this riff-raff dominate the practice.

That is an utterly ludicrous statement (and grammatically suspect to boot)
that does nothing to bolster my confidence in your philosophising.

--
Rhodri James *-* Wildebeest Herder to the Masses

Mark Lawrence

unread,
Oct 15, 2013, 6:48:39 PM10/15/13
to pytho...@python.org
On 15/10/2013 21:26, Mark Janssen wrote:
>
> Yeah, well 40 years ago they didn't have parsers.
>

I'm very pleased to see that (presumably) some Americans do have a sense
of humour.

Peter Cacioppi

unread,
Oct 15, 2013, 8:18:58 PM10/15/13
to
> only I'm focusing on the importance of design rather than deifying
> the person who designed it.

I'm cool with deification here. I'll happily get on my knees and bow towards Holland while chanting "Guido ... I'm not worthy" 5 times a day, if that's part of the cult.

Want an odd and ranty thread this one turned into. I think Python is awesome. For me, it literally inspires awe.

If you don't agree, and yet you're working in Python anyway, I kind of feel bad for you, just a little.

Cheers and thanks again.


Piet van Oostrum

unread,
Oct 15, 2013, 11:20:37 PM10/15/13
to
Mark Janssen <dreamin...@gmail.com> writes:

> Yeah, well 40 years ago they didn't have parsers. The purpose of
> having a field of computer science worthy of the name, is to advance
> the science not let this riff-raff dominate the practice.

Hah! 40 years ago I wrote a parser generator (similar to yacc, that I did not know) and used it to generate a parser for Algol 68.
--
Piet van Oostrum <pi...@vanoostrum.org>
WWW: http://pietvanoostrum.com/
PGP key: [8DAE142BE17999C4]

rusi

unread,
Oct 15, 2013, 11:53:39 PM10/15/13
to
On Wednesday, October 16, 2013 1:56:27 AM UTC+5:30, zipher wrote:
> > Objects in programming languages (or 'values' if one is more functional programming oriented) correspond to things in the world.
>
>
> One of the things you're saying there is that "values correspond to
> things in the world". But you will not get agreement in computer
> science on that anymore than saying "numbers correspond to things in
> the world"

Ok… I better take back the '…or values' because that's a completely separate (at least separable) argument and one which I don't want to go into with you.

The original argument: There can be a type-system that everyone can 'settle upon.'

The new (and avoidable) one: Objects and values are equivalent/conflatable as alternate models for building systems -- and therefore OOP and FP.

The APL on the other (third?) hand is at one remove the type argument.
One remove because you are now to see it not from the vanilla programmer's perspective but from the pov of the language implementer. Once you agree to that you would (hopefully!!) see some things:

When implementing a scanner/lexer characters are the indicators of the types we are interested in. In C, a '/' may be a divide indicator but also a comment-starter. Dozens of other such 'type-confusions' in most languages. APL bites the bullet, enriches the character set and therefore simplifies:
⍝ starts a comment
÷ is divide
/ is the reduce (higher order) operator

Likewise
= is equal
← is assignment
Whether in academia or in professional software engineering, if you had a clue of the man-hours and therefore money wasted by programmers/students writing '=' instead of '==' in C code, you would appreciate the sense in these decisions.
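Python, for what it's worth, sidesteps that particular C pitfall a different way: assignment is a statement, not an expression, so the classic `if (x = 0)` slip is rejected before the program ever runs. A quick check:

```python
# The classic C bug `if (x = 0) ...` compiles silently in C;
# the Python equivalent refuses to compile at all.
try:
    compile("if x = 0:\n    pass\n", "<demo>", "exec")
except SyntaxError:
    print("SyntaxError: '=' is not valid in an if condition")
```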

And you think that APL is behind rather than ahead of the competition?

Ah well… the mercilessness of the Dunning-Kruger effect…

rusi

unread,
Oct 16, 2013, 12:45:43 AM10/16/13
to
On Wednesday, October 16, 2013 3:31:06 AM UTC+5:30, Rhodri James wrote:
> On Tue, 15 Oct 2013 21:26:27 +0100, Mark Janssen
> wrote:
>
> >> = Rusi, attribution missing from original.

Yes. It would help to keep your quotes bound (firstclassly?) to their respective quoters -- Mark Janssen also and Peter Cacioppi also.

Mark Janssen

unread,
Oct 16, 2013, 1:45:07 PM10/16/13
to Python List
On Tue, Oct 15, 2013 at 2:46 PM, Grant Edwards <inv...@invalid.invalid> wrote:
> On 2013-10-15, Mark Janssen <dreamin...@gmail.com> wrote:
>
>> Yeah, well 40 years ago they didn't have parsers.
>
> That seems an odd thing to say. People were assembling and compiling
> computer programs long before 1973.

I'm using the word "parser" in the sense of a stand-alone application
that became useful with the growing open source culture that was
developing in the 70's. Prior to that you have punch cards where
there's no meaningful definition of "parsing" because there are no
tokens. Would you say you were "parsing" on an old digital machine
where you input programs with binary switches?

But after the advent of the dumb terminal, parsers started evolving,
and that was the early 70's. I might be a year or two off, but I only
gave one significant digit there. ;^)

Cheers,
--
MarkJ
Tacoma, Washington

Mark Janssen

unread,
Oct 16, 2013, 1:57:03 PM10/16/13
to Rhodri James, Python List
>>> Types on the other hand correspond to our classifications and so are
>>> things in our minds.
>>
>> That is not how a C programmer views it. They have explicit
>> "typedef"s that make it a thing for the computer.
>
> Speaking as a C programmer, no. We have explicit typedefs to create new
> labels for existing types, to make the type-abstraction easier to relate to
> the object-abstraction.

Who uses "object abstraction" in C? No one. That's why C++ was invented.

--
MarkJ
Tacoma, Washington

rusi

unread,
Oct 16, 2013, 2:25:27 PM10/16/13
to
I wonder if you've heard of something called linux?
http://lwn.net/Articles/444910/

Grant Edwards

unread,
Oct 16, 2013, 2:42:32 PM10/16/13
to
On 2013-10-16, Mark Janssen <dreamin...@gmail.com> wrote:
> On Tue, Oct 15, 2013 at 2:46 PM, Grant Edwards <inv...@invalid.invalid> wrote:
>> On 2013-10-15, Mark Janssen <dreamin...@gmail.com> wrote:
>>
>>> Yeah, well 40 years ago they didn't have parsers.
>>
>> That seems an odd thing to say. People were assembling and compiling
>> computer programs long before 1973.
>
> I'm using the word "parser" in the sense of a stand-alone application
> that became useful with the growing open source culture that was
> developing in the 70's. Prior to that you have punch cards where
> there's no meaningful definition of "parsing" because there are no
> tokens.

What do you mean, "there are no tokens"? Pascal/Fortran/COBOL on
a deck of punched cards is parsed/compiled the same as it is in a file
on a hard drive.

> Would you say you were "parsing" on an old digital machine
> where you input programs with binary switches?

No, that's not what I'm talking about. I'm talking about compiling
Fortran or PL/1 or whatnot.

> But after the advent of the dumb terminal, parsers started evolving,
> and that was the early 70's. I might be a year or two off, but I only
> gave one significant digit there. ;^)

I don't understand what dumb terminals have to do with it.

--
Grant Edwards grant.b.edwards Yow! I'm GLAD I
at remembered to XEROX all
gmail.com my UNDERSHIRTS!!

Grant Edwards

unread,
Oct 16, 2013, 2:44:08 PM10/16/13
to
On 2013-10-16, Mark Janssen <dreamin...@gmail.com> wrote:
>>>> Types on the other hand correspond to our classifications and so are
>>>> things in our minds.
>>>
>>> That is not how a C programmer views it. They have explicit
>>> "typedef"s that make it a thing for the computer.
>>
>> Speaking as a C programmer, no. We have explicit typedefs to create new
>> labels for existing types, to make the type-abstraction easier to relate to
>> the object-abstraction.
>
> Who uses "object abstraction" in C? No one.

It's not that uncommon. I've seen it done many times.

> That's why C++ was invented.

C++ was invented because people _were_ doing object abstraction in C
and wanted an easier way to do it.

--
Grant Edwards grant.b.edwards Yow! I'm continually AMAZED
at at th'breathtaking effects
gmail.com of WIND EROSION!!

Skip Montanaro

unread,
Oct 16, 2013, 2:49:51 PM10/16/13
to rusi, Python
>> Who uses "object abstraction" in C? No one. That's why C++ was invented.
>
> I wonder if you've heard of something called linux?
> http://lwn.net/Articles/444910/

If not Linux, how about Python?

http://hg.python.org/cpython/file/e2a411a429d6/Objects

Skip

Chris Angelico

unread,
Oct 16, 2013, 4:40:32 PM10/16/13
to pytho...@python.org
On Thu, Oct 17, 2013 at 5:49 AM, Skip Montanaro <sk...@pobox.com> wrote:
>>> Who uses "object abstraction" in C? No one. That's why C++ was invented.
>>
>> I wonder if you've heard of something called linux?
>> http://lwn.net/Articles/444910/
>
> If not Linux, how about Python?
>
> http://hg.python.org/cpython/file/e2a411a429d6/Objects

Or huge slabs of the OS/2 Presentation Manager, which is entirely
object oriented and mostly C. It's done with SOM, so it's possible to
subclass someone else's object using a completely different language.
Makes for a lot of boilerplate in the source code, but it works. You
can even - often without the different subclasses being aware of each
other - have two or more unrelated modules each subclass-and-replace a
standard class like WPFolder (which represents a folder, usually
backed by a directory on disk) to modify its behaviour.

Yep, definitely possible to write OO code in C.

ChrisA

Mark Janssen

unread,
Oct 16, 2013, 8:13:56 PM10/16/13
to Chris Angelico, Python List
>>>> Who uses "object abstraction" in C? No one. That's why C++ was invented.
>>>
>> If not Linux, how about Python?
>>
>> http://hg.python.org/cpython/file/e2a411a429d6/Objects
>
> Or huge slabs of the OS/2 Presentation Manager, which is entirely
> object oriented and mostly C. It's done with SOM, so it's possible to
> subclass someone else's object using a completely different language.

Now this is the first real objection to my statement: OS/2 and the
Presentation Manager, or windowing system.

But, here it is significant that the user /consumer (i.e. *at the
workstation* mind you) is *making* the "object" because thier visual
system turns it into one. Otherwise, at the C-level, I'm guessing
it's normal C code without objects, only struct-ured data. That is,
you don't get all the OOP benefits like inheritance, polymorphism and
encapsulation. C can do 2 of those, albeit kludgingly, but not all
three. And without all three, it's not at all well-established that
you're doing real OOP.

--
MarkJ
Tacoma, Washington

alex23

unread,
Oct 16, 2013, 8:26:43 PM10/16/13
to
On 17/10/2013 3:57 AM, Mark Janssen wrote:
> Who uses "object abstraction" in C? No one. That's why C++ was invented.

"Aristotle maintained that women have fewer teeth than men; although he
was twice married, it never occurred to him to verify this statement by
examining his wives' mouths." -- Bertrand Russell

Chris Angelico

unread,
Oct 16, 2013, 8:28:26 PM10/16/13
to pytho...@python.org
On Thu, Oct 17, 2013 at 11:13 AM, Mark Janssen
<dreamin...@gmail.com> wrote:
> But, here it is significant that the user /consumer (i.e. *at the
> workstation* mind you) is *making* the "object" because their visual
> system turns it into one. Otherwise, at the C-level, I'm guessing
> it's normal C code without objects, only struct-ured data. That is,
> you don't get all the OOP benefits like inheritance, polymorphism and
> encapsulation. C can do 2 of those, albeit kludgingly, but not all
> three. And without all three, it's not at all well-established that
> you're doing real OOP.

Wrong. At the C level, it's all still objects, with inheritance,
polymorphism, and encapsulation. Piles and piles of boilerplate to
make things work, and you have to compile your .IDL file into C and
then fill in your code, and make sure you don't disrupt things, but it
works beautifully. It's object oriented machine code.

ChrisA

Ned Batchelder

unread,
Oct 16, 2013, 8:47:57 PM10/16/13
to Mark Janssen, Python List
On 10/16/13 8:13 PM, Mark Janssen wrote:
>>>>> Who uses "object abstraction" in C? No one. That's why C++ was invented.
>>> If not, Linux, how about Python?
>>>
>>> http://hg.python.org/cpython/file/e2a411a429d6/Objects
>> Or huge slabs of the OS/2 Presentation Manager, which is entirely
>> object oriented and mostly C. It's done with SOM, so it's possible to
>> subclass someone else's object using a completely different language.
> Now this is the first real objection to my statement: OS/2 and the
> Presentation Manager, or windowing system.
>
> But, here it is significant that the user /consumer (i.e. *at the
> workstation* mind you) is *making* the "object" because their visual
> system turns it into one. Otherwise, at the C-level, I'm guessing
> it's normal C code without objects, only struct-ured data. That is,
> you don't get all the OOP benefits like inheritance, polymorphism and
> encapsulation. C can do 2 of those, albeit kludgingly, but not all
> three. And without all three, it's not at all well-established that
> you're doing real OOP.
>

Mark, it's clear you're passionate about computer science, but with all
due respect, you need to learn more about it. "Real OOP" is a misnomer:
every language brings its own style of OOP, none more legitimate than
any other. And your earlier idea that punched cards didn't have tokens
is wildly ignorant of the state of software and languages 50 years ago.

--Ned.

Mark Janssen

unread,
Oct 16, 2013, 8:53:22 PM10/16/13
to Ned Batchelder, Python List
> And your earlier idea that punched cards didn't have tokens is wildly
> ignorant of the state of software and languages 50 years ago.

Please tell me how you parsed tokens with binary switches 50 years
ago. Your input is rubbish.
--
MarkJ
Tacoma, Washington

Chris Angelico

unread,
Oct 16, 2013, 9:01:24 PM10/16/13
to pytho...@python.org
On Thu, Oct 17, 2013 at 11:53 AM, Mark Janssen
<dreamin...@gmail.com> wrote:
>> And your earlier idea that punched cards didn't have tokens is wildly
>> ignorant of the state of software and languages 50 years ago.
>
> Please tell me how you parsed tokens with binary switches 50 years
> ago. Your input is rubbish.

I can't quote you anything for 50 years ago, but this is 40:

https://en.wikipedia.org/wiki/ELIZA#Significant_implementations

ELIZA parsed English text.

ChrisA

rusi

unread,
Oct 16, 2013, 9:30:23 PM10/16/13
to
On Thursday, October 17, 2013 6:17:57 AM UTC+5:30, Ned Batchelder wrote:
> On 10/16/13 8:13 PM, Mark Janssen wrote:
>
> >>>>> Who uses "object abstraction" in C? No one. That's why C++ was invented.

Examples from
1. Linux Kernel
2. Python
3. OS/2

> > But, here it is significant that the user /consumer (i.e. *at the
> > workstation* mind you) is *making* the "object" because their visual
> > system turns it into one. Otherwise, at the C-level, I'm guessing
> > it's normal C code without objects, only struct-ured data. That is,
> > you don't get all the OOP benefits like inheritance, polymorphism and
> > encapsulation. C can do 2 of those, albeit kludgingly, but not all
> > three. And without all three, it's not at all well-established that
> > you're doing real OOP.
> >
>
>
> Mark, it's clear you're passionate about computer science, but with all
> due respect, you need to learn more about it. "Real OOP" is a misnomer:
> every language brings its own style of OOP, none more legitimate than
> any other.

> And your earlier idea that punched cards didn't have tokens
> is wildly ignorant of the state of software and languages 50 years ago.

Yes this is sounding like some slapstick comedy…

However… to speak a little for Mark's perspective (from a hopefully more educated background):
There's a fine line between laboriously simulating a feature and properly supporting it:
- C has arbitrary precision arithmetic -- use gmp library
- C is a functional language -- use function pointers and/or hand-generated macros with macro operators # and ##
Conversely:
- Haskell is an imperative language: Just make a parameter for machine state and pass it around.
etc etc ad libitum

It's called the Turing tarpit or, more colloquially, Greenspun's tenth rule.

No the real problem is not primarily that Mark is CS-illiterate, but rather that being philosophy-illiterate he lectures, philosophizes and is generally logically completely inconsistent.

For me the real objectionable statement is:

> And without all three, it's not at all well-established that you're doing real OOP.

when combined with all his previous grandiloquence about how the object model is confused, wrong and needs to be redone from first principles.

In short it's 'well-established' when it suits and not 'well-established' when it doesn't.

But then in all fairness this is the tendency of most OOP aficionados --
to jump between the 3 levels of discourse:
1. philosophy
2. science
3. technicality/technology

just to dodge the substantive, hard issues.

I've written about this OOP-fan tendency to prevaricate and bullshit:
http://blog.languager.org/2012/07/we-dont-need-no-ooooo-orientation-2.html

and more generally http://blog.languager.org/search/label/OOP

And I need to thank Mark for giving me much needed material for documenting the state of the art of bullshitting.

Ned Batchelder

unread,
Oct 16, 2013, 10:09:57 PM10/16/13
to Mark Janssen, Python List
On 10/16/13 8:53 PM, Mark Janssen wrote:
>> And your earlier idea that punched cards didn't have tokens is wildly
>> ignorant of the state of software and languages 50 years ago.
> Please tell me how you parsed tokens with binary switches 50 years
> ago. Your input is rubbish.

The mention of punched cards was from you:

Prior to that [the '70s] you have punch cards where there's no meaningful definition of "parsing" because there are no tokens.

I have no idea what you mean by this. Punched cards are an input mechanism. Each one held 80 characters (ever wonder why people are so fixated on 80-character lines?). Those characters could represent text just as 80 characters in today's text files do. It was common for those cards to hold lines of program text which were parsed into tokens, etc.

Sure, go back far enough and you get to switches, etc, but programs have been input as text for far longer than you think. Fortran was first proposed 60 years ago, and was parsed as tokens. Lisp and Cobol both happened before 1960.

In any case, I've gone back to read the emails where you wrote this, and I can't make sense of how tokens come into the original topic at all.

You seem drawn to sweeping statements about the current state and history of computer science, but then make claims like this about punched cards that just make no sense.

--Ned.

Piet van Oostrum

unread,
Oct 16, 2013, 10:33:50 PM10/16/13
to
Mark Janssen <dreamin...@gmail.com> writes:

>> And your earlier idea that punched cards didn't have tokens is wildly
>> ignorant of the state of software and languages 50 years ago.
>
> Please tell me how you parsed tokens with binary switches 50 years
> ago. Your input is rubbish.

With all due respect, Mark, your remarks are rubbish. Nobody talked about parsing input with binary switches except you. I answered that 40 years ago I wrote a parser generator that generated a parser for Algol 68 (and another one for Algol 60 I should have added). And all this was using punched cards.

Steven D'Aprano

unread,
Oct 17, 2013, 1:24:16 AM10/17/13
to
On Wed, 16 Oct 2013 17:53:22 -0700, Mark Janssen wrote:

>> And your earlier idea that punched cards didn't have tokens is wildly
>> ignorant of the state of software and languages 50 years ago.
>
> Please tell me how you parsed tokens with binary switches 50 years ago.
> Your input is rubbish.

Mark, it's 2013, not 1993. You're twenty years out of date.

"Binary switches" was state of the art in the mid 1940s. By the late
1940s programmers were writing code in machine code, and by early 1950s
they were using assembly code. Some of the early programming languages
include:

- Regional Assembly Language (1951)
- Autocode (1952)
- Speedcode (1953)
- IPL (1954)
- FLOW-MATIC (1955)

leading to the first "high-level" language, FORTRAN (1955 or 1957,
depending on what stage you consider it as "invented").

Fifty years ago, in 1963, programmers had a choice between many high-
level languages, including:

- FORTRAN (invented in 1955)
- COMTRAN (1957)
- LISP, ALGOL (1958)
- FACT, COBOL, RPG (1959)
- APL, Simula, Snobol (1962)
- CPL (1963)

and were only a year away from being able to program in BASIC and PL/I as
well.


--
Steven

Peter Cacioppi

unread,
Oct 17, 2013, 2:49:02 AM10/17/13
to
I don't know if I want to step into the flames here, but my understanding has always been that in the absence of polymorphism the best you can do is "object based" programming instead of "object oriented" programming.

Object based programming is a powerful step forward. The insight is that by associating data structures and methods together you can significantly improve readability and robustness.

Object oriented programming takes things further, most significantly by introducing the idea that the object reference you are referencing might be a run time dependent sub-class. Even Python, which isn't strongly typed, manages polymorphism by allowing the self argument to be a sub-class of the method's class.

There are many wonderful examples of object based programming in C. I believe VB (not VB.net, the older VBA language) is object based but not object oriented.

True object oriented programming seems to require proper support from the language itself, because the run-time resolution of the "this/self" reference needs specific constructs in the language.

Bear in mind that my usual disclaimer when wading into the flames like this is to quote Randy Newman ... "I may be wrong .... but I don't think so!!"


Mark Lawrence

unread,
Oct 17, 2013, 3:02:00 AM10/17/13
to pytho...@python.org
On 17/10/2013 01:53, Mark Janssen wrote:
>> And your earlier idea that punched cards didn't have tokens is wildly
>> ignorant of the state of software and languages 50 years ago.
>
> Please tell me how you parsed tokens with binary switches 50 years
> ago. Your input is rubbish.
>

You must be one of the happiest people on this planet. At least with
respect to the history of computer science, ignorance is bliss.

Chris Angelico

unread,
Oct 17, 2013, 3:15:20 AM10/17/13
to pytho...@python.org
On Thu, Oct 17, 2013 at 5:49 PM, Peter Cacioppi
<peter.c...@gmail.com> wrote:
> I don't know if I want to step into the flames here, but my understanding has always been that in the absence of polymorphism the best you can do is "object based" programming instead of "object oriented" programming.
>
> Object oriented programming takes things further, most significantly by introducing the idea that the object reference you are referencing might be a run time dependent sub-class. Even Python, which isn't strongly typed, manages polymorphism by allowing the self argument to be a sub-class of the method's class.

What you've said here is that "without polymorphism, you can't have
polymorphism". :)

The OS/2 PM with SOM (System Object Model) classes does give
polymorphic functionality; here's a bit of an example:

(Reference: http://www.markcrocker.com/rexxtipsntricks/rxtt28.2.0299.html
- class hierarchy, standard classes only)

Part way down you'll see WPFolder. It's a subclass of WPFileSystem
(aka "stuff backed by the disk" as opposed to abstract objects), which
is a subclass of WPObject (aka "stuff you can see and click on"),
which is a subclass of SOMObject (aka "stuff"). The WPFolder class
defines a whole pile of functionality, and its code is all stored in
some library somewhere, as a binary on the disk. Well and good.

Now look at the WPRootFolder class. It's a subclass of WPFolder and
adds a few extra bits and bobs designed for the root of any particular
drive (OS/2 uses an MS-DOS style of drive letters for volumes, rather
than Unix-style mount points) - menu items for formatting the drive,
getting extra info perhaps, whatever. But most of its functionality
comes from WPFolder.

When a WPFolder method is called on a WPRootFolder, the code has to
handle the fact that it's working with a subclass of that object. As
long as the proper SOM boilerplate is maintained correctly, the
WPFolder code won't even be aware that it's operating on a
WPRootFolder. That's polymorphism, and it's all done in a completely
cross-language way :)

(There are additional complications, as it's possible for a subclass
to WPReplaceClass (I'm probably misremembering the function name)
itself into the position of the parent class, which messes up the
hierarchy a bit - it still all works, but it's less clean to describe.
Most things work the way I'm describing.)

ChrisA

Peter Cacioppi

unread,
Oct 17, 2013, 3:23:23 AM10/17/13
to

> What you've said here is that "without polymorphism, you can't have
> polymorphism". :)

Respectfully, no. I refer to the distinction between object based and object oriented programming. Wikipedia's entry is consistent with my understanding (not to argue by wiki-authority, but the terminology here isn't my personal invention).

Your example of "polymorphism in a non OO" language makes my tired head hurt. Do you have a clean little example of polymorphism being mocked in a reasonable way with pure C? There are many nice object-based C projects floating around, but real polymorphism? I think you can't do it without some bizarre work-arounds, but I'd be happy to be shown otherwise.


Christian Gollwitzer

unread,
Oct 17, 2013, 3:42:29 AM10/17/13
to
Am 17.10.13 09:23, schrieb Peter Cacioppi:
> Do you have a clean little example of polymorphism being
> mocked in a reasonable way with pure C? There are many nice
> object-based C projects floating around, but real polymorphism? I
> think you can't do it without some bizarre work-arounds, but I'd be
> happy to be shown otherwise.

http://stackoverflow.com/questions/524033/how-can-i-simulate-oo-style-polymorphism-in-c/524076#524076

(Haven't tried to compile it, but I think there is an error in answer
#5, the this pointer must be cast to VTable* before calling).

C is reasonably close to OO thanks to its structs. Most language
constructs of C++ can be directly converted into C with just some more
typing effort. You will have problems with exceptions (setjump?),
templates (macros?) and RTTI.

Christian

Jussi Piitulainen

unread,
Oct 17, 2013, 4:20:49 AM10/17/13
to
rusi writes:

> However - to speak a little for Mark's perspective (from a hopefully
> more educated background): There's a fine line between laboriously
> simulating a feature and properly supporting it:
>
> - C has arbitrary precision arithmetic -- use gmp library
> - C is a functional language -- use function pointers and/or
> hand-generated macros with macro operators # and ##

A tangent, but I cannot resist: that latter point is literally the
first thing I ever heard about C.

I knew some Basic, 6502 (dis)assembler, I think Pascal, and probably
Forth, at the time, and I had some idea about what functional
programming means. I may have been already interested in Scheme, but I
don't think I had access to an implementation yet.

Then I overheard an acquaintance telling another: "C is a functional
programming language" ("C on funktionaalinen ohjelmointikieli").
Interesting! But when I later learnt some C it turned out to be almost
the opposite of functional programming.

I guess it was the terminology: Pascal had "procedures" and
"functions" while C only had "functions".

(Mis)information was not as abundant back then as it is now.

Chris Angelico

unread,
Oct 17, 2013, 4:24:58 AM10/17/13
to pytho...@python.org
On Thu, Oct 17, 2013 at 6:23 PM, Peter Cacioppi
<peter.c...@gmail.com> wrote:
> Respectfully, no. I refer to the distinction between object based and object oriented programming. Wikipedia's entry is consistent with my understanding (not to argue by wiki-authority, but the terminology here isn't my personal invention).

Yep, but your definition of object oriented programming is
fundamentally based on support for polymorphism, and your opening
statement said that it's impossible without polymorphism :)

Anyway, what I sought to prove was that polymorphic object oriented
code can be written in C or any other language.

ChrisA

rusi

unread,
Oct 17, 2013, 8:39:59 AM10/17/13
to
On Thursday, October 17, 2013 12:19:02 PM UTC+5:30, Peter Cacioppi wrote:
> Object oriented programming takes things further, most significantly by introducing the idea that the object reference you are referencing might be a run time dependent sub-class. Even Python, which isn't strongly typed, manages polymorphism by allowing the self argument to be a sub-class of the method's class.

Yes and the reference I earlier gave was just for that:
http://lwn.net/Articles/444910/

Ironically he describes the whole 'polymorphism-in-C' infrastructure there but does not call it that.
The first line however in the sequel article http://lwn.net/Articles/446317/
does just that. Here's the quote:

--------------------
In the first part of this analysis we looked at how the polymorphic side of object-oriented programming was implemented in the Linux kernel using regular C constructs. In particular we examined method dispatch, looked at the different forms that vtables could take, and the circumstances where separate vtables were eschewed in preference for storing function pointers directly in objects. In this conclusion we will explore a second important aspect of object-oriented programming - inheritance, and in particular data inheritance.
--------------------

rusi

unread,
Oct 17, 2013, 9:00:11 AM10/17/13
to
On Thursday, October 17, 2013 6:09:59 PM UTC+5:30, rusi wrote:
> On Thursday, October 17, 2013 12:19:02 PM UTC+5:30, Peter Cacioppi wrote:
>
> > Object oriented programming takes things further, most significantly by
> introducing the idea that the object reference you are referencing might be a
> run time dependent sub-class. Even Python, which isn't strongly typed,
> manages polymorphism by allowing the self argument to be a sub-class of the
> method's class.
>
>
> Yes and the reference I earlier gave was just for that:
> http://lwn.net/Articles/444910/
>
> Ironically he describes the whole 'polymorphism-in-C' infrastructure there but does not call it that.
>
> The first line however in the sequel article http://lwn.net/Articles/446317/
> does just that. Heres the quote:

I would be a bit remiss if I left it at that -- yeah Mark is clueless about his history and philosophy. However the general usage of the word polymorphism in the OOP community is not much better.

Cardelli and Wegner:
http://lucacardelli.name/Papers/OnUnderstanding.A4.pdf
give a conspectus of the field. Especially section 1.3 shows that the word can mean one of 4 very different and unrelated ideas.

OOP aficionados think one of them to be the only one.

Grant Edwards

unread,
Oct 17, 2013, 9:43:16 AM10/17/13
to
On 2013-10-17, Mark Janssen <dreamin...@gmail.com> wrote:
>> And your earlier idea that punched cards didn't have tokens is wildly
>> ignorant of the state of software and languages 50 years ago.
>
> Please tell me how you parsed tokens with binary switches 50 years
> ago. Your input is rubbish.

Are you under the misapprehension that "punched cards" and "binary
switches" are the same thing?

Punched cards were just another medium for source code or textual
data. Each card was a line of text (either a line of source code or a
line of data).

--
Grant Edwards grant.b.edwards Yow! I know how to do
at SPECIAL EFFECTS!!
gmail.com

Mark Janssen

unread,
Oct 17, 2013, 10:49:52 AM10/17/13
to Ned Batchelder, Python List
> Prior to that [the '70s] you have punch cards where there's no meaningful
> definition of "parsing" because there are no tokens.
>
> I have no idea what you mean by this. [...]
> You seem drawn to sweeping statements about the current state and history of
> computer science, but then make claims like this about punched cards that
> just make no sense.

It's like this. No matter how you cut it, you're going to get back to
the computers where you load instructions with switches. At that
point, I'll be very much looking in anticipation to your binary-digit
lexer.

--
MarkJ
Tacoma, Washington

Chris Angelico

unread,
Oct 17, 2013, 11:00:20 AM10/17/13
to pytho...@python.org
On Fri, Oct 18, 2013 at 1:49 AM, Mark Janssen <dreamin...@gmail.com> wrote:
> It's like this. No matter how you cut it, you're going to get back to
> the computers where you load instructions with switches. At that
> point, I'll be very much looking in anticipation to your binary-digit
> lexer.

Even when computers were primarily programmed in high level languages,
boot code could still be toggled in with manual switches. There's a
story around someplace of a guy who did that _over the phone_ and, if
I recall correctly, without a reference manual - which would mean he
had the entire boot code for that computer memorized. So, yeah,
loading instructions with switches isn't incompatible with lexing,
though I don't know if that term existed at the time.

Ultimately, computers work with data, which can be represented (and
inputted) with binary states like switches, and can itself represent
text. To parse text, a computer performs analysis on binary data.
Someone could today build a computer that takes input on punched cards
or switches or a Navajo saying A'la'ih and Do'neh'lini [1], and then
parse the corresponding text as (say) C code. The two are completely
orthogonal.

ChrisA

[1] if http://xkcd.com/257/ is correct

Mark Lawrence

unread,
Oct 17, 2013, 11:00:39 AM10/17/13
to pytho...@python.org
On 17/10/2013 15:49, Mark Janssen wrote:
>> Prior to that [the '70s] you have punch cards where there's no meaningful
>> definition of "parsing" because there are no tokens.
>>
>> I have no idea what you mean by this. [...]
>> You seem drawn to sweeping statements about the current state and history of
>> computer science, but then make claims like this about punched cards that
>> just make no sense.
>
> It's like this. No matter how you cut it, you're going to get back to
> the computers where you load instructions with switches. At that
> point, I'll be very much looking in anticipation to your binary-digit
> lexer.
>

Please dial 911, 999 or whatever and ask for an ambulance, it looks as
if the last batch that you bought to smoke was heavily contaminated.

Piet van Oostrum

unread,
Oct 17, 2013, 12:58:33 PM10/17/13
to
The first C++ compilers were just preprocessors that translated into pure C code, which was then compiled with a C compiler. The resulting intermediate C code would be an object-oriented program in C. IIRC, the C code was reasonably clear, not really convoluted, so you would have been able to write it yourself.

rusi

unread,
Oct 17, 2013, 1:32:32 PM10/17/13
to
On Wednesday, October 16, 2013 1:56:27 AM UTC+5:30, zipher wrote:
> Yes, well clearly we are not "having the same thoughts", yet the
> purpose of the academic establishment is to pin down such terminology
> and not have these sloppy understandings everywhere. You dig?

Heh Mark I am really sorry. I think this is the third or fourth time that I say something to which you reply with such egregious rubbish -- parsing has something to do with card-punches?!?! Yeah like python has something to do with the purple shirt I am wearing -- that a dozen others jump at you with a resounding 'Cut the crap!'

Well speaking for myself, I know I speak more wisely sometimes and more stupidly at others and I would wish my penalizers to calibrate the punishment to the crime.

Likewise here. I certainly 'dig' your passion to clean up the 'sloppy understandings everywhere' and would only wish for you the sanity of more knowledge of the subject before you begin to hold forth.

MRAB

unread,
Oct 17, 2013, 1:44:29 PM10/17/13
to pytho...@python.org
I learned a new word yesterday: ultracrepidarian. :-)

Mark Lawrence

unread,
Oct 17, 2013, 2:08:58 PM10/17/13
to pytho...@python.org
On 17/10/2013 18:32, rusi wrote:
> On Wednesday, October 16, 2013 1:56:27 AM UTC+5:30, zipher wrote:
>> Yes, well clearly we are not "having the same thoughts", yet the
>> purpose of the academic establishment is to pin down such terminology
>> and not have these sloppy understandings everywhere. You dig?
>
> Heh Mark I am really sorry. I think this is the third or fourth time that I say something to which you reply with such egregious rubbish -- parsing has something to do with card-punches?!?! Yeah like python has something to do with the purple shirt I am wearing -- that a dozen others jump at you with a resounding 'Cut the crap!'
>

Cut the crap, cut the bollocks more like. I am of course using the term
in the context described here
http://en.wikipedia.org/wiki/Sex_Pistols#Never_Mind_the_Bollocks

Peter Cacioppi

unread,
Oct 17, 2013, 3:37:45 PM10/17/13
to
> The first C++ compilers were just preprocessors that translated into
> pure C code ...

I agree with this.

> the C code was reasonably clear, not really convoluted, so you would have
> been able to write it yourself.

I disagree with this. My sense of C is that IF you are relying on preprocessors to do sophisticated things, THEN you are not writing clear C code. The implementations I've seen of polymorphism-of-structs in C are quite ugly to my eyes, and make me grateful C++ was invented.

OTOH, I've seen object-based C development projects (I.e. where you could tell what function was being called at compile time) that are quite readable. It requires coding discipline (as does readability in any language).

We might just be arguing over the definition of "readable" here. I have been told that my standards of readable are unreasonable, so perhaps I'm in the wrong here. That said, I'm just glad true OO languages exist, especially Python.

All hail Guido.

Mark Janssen

unread,
Oct 17, 2013, 3:49:24 PM10/17/13
to rusi, Python List
On Thu, Oct 17, 2013 at 10:32 AM, rusi <rusto...@gmail.com> wrote:
> On Wednesday, October 16, 2013 1:56:27 AM UTC+5:30, zipher wrote:
>> Yes, well clearly we are not "having the same thoughts", yet the
>> purpose of the academic establishment is to pin down such terminology
>> and not have these sloppy understandings everywhere. You dig?
>
> Heh Mark I am really sorry. I think this is the third or fourth time that I say something to which you reply with such egregious rubbish -- parsing has something to do with card-punches?!?! Yeah like python has something to do with the purple shirt I am wearing -- that a dozen others jump at you with a resounding 'Cut the crap!'

Your feedback is respected. However, you haven't included in your
analysis that you have a closed group here of Python aficionados. I
invite you to take a look at
http://c2.com/cgi/wiki?TypeSystemCategoriesInImperativeLanguagesTwo
before you continue to issue insults.

> Likewise here. I certainly 'dig' your passion to clean up the 'sloppy understandings everywhere' and would only wish for you the sanity of more knowledge of the subject before you begin to hold forth.

Talk to me after you've finished your assignment.
--
MarkJ
Tacoma, Washington

Terry Reedy

unread,
Oct 17, 2013, 4:15:43 PM10/17/13
to pytho...@python.org
On 10/17/2013 2:49 AM, Peter Cacioppi wrote:

> Even Python, which isn't strongly typed,

Python objects have a definite type/class. It is fixed for instances of
builtins. If that is not 'strong', the word has no meaning.
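A quick interactive check makes the point (a minimal sketch; any Python 3 will do):

```python
# Every object carries a definite, introspectable class.
x = 1
assert type(x) is int

# And for instances of builtins, that class is fixed: CPython refuses
# to let you switch an int instance to some other class after the fact.
try:
    x.__class__ = float
    changed = True
except TypeError:
    changed = False

assert changed is False
```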

> manages polymorphism by allowing the self argument to a sub-class of the method class.

???

--
Terry Jan Reedy

Mark Lawrence

unread,
Oct 17, 2013, 4:41:41 PM10/17/13
to pytho...@python.org
On 17/10/2013 07:49, Peter Cacioppi wrote:
> I don't know if I want to step into the flames here,
> Even Python, which isn't strongly typed
>

Yeah right.

Peter Cacioppi

unread,
Oct 17, 2013, 4:54:51 PM10/17/13
to
My bad, Python is dynamically typed, but also strongly typed.

But I still say it has language features that specifically support polymorphism, which is why true OO can be developed in Python in a readable way.
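A minimal sketch of that combination, dynamic names over strongly typed objects:

```python
# Dynamic typing: the *name* may be rebound to objects of any type...
thing = 42
thing = "forty-two"
assert type(thing) is str

# ...strong typing: the *objects* never coerce silently.
try:
    result = "1" + 1        # no implicit str/int conversion
except TypeError:
    result = int("1") + 1   # conversions must be explicit

assert result == 2
```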


Ned Batchelder

unread,
Oct 17, 2013, 4:57:53 PM10/17/13
to Mark Janssen, Python List
On 10/17/13 3:49 PM, Mark Janssen wrote:
> On Thu, Oct 17, 2013 at 10:32 AM, rusi <rusto...@gmail.com> wrote:
>> On Wednesday, October 16, 2013 1:56:27 AM UTC+5:30, zipher wrote:
>>> Yes, well clearly we are not "having the same thoughts", yet the
>>> purpose of the academic establishment is to pin down such terminology
>>> and not have these sloppy understandings everywhere. You dig?
>> Heh Mark I am really sorry. I think this is the third or fourth time that I say something to which you reply with such egregious rubbish -- parsing has something to do with card-punches?!?! Yeah like python has something to do with the purple shirt I am wearing -- that a dozen others jump at you with a resounding 'Cut the crap!'
> You feedback is respected. However, you haven't included in your
> analysis that you have a closed group here of Python aficionados. I
> invite you to take a look at
> http://c2.com/cgi/wiki?TypeSystemCategoriesInImperativeLanguagesTwo
> before you continue to issue insults.

I'm interested to learn more about your ideas, but that wiki page is not
going to help much. It's a chaotic back-and-forth, with no attribution,
so it's impossible to know who is saying what. Except that it devolves
into the same frustrated confusion, and then insults that this thread
has, so I can tell: those trying to understand are frustrated, and Mark
starts insulting people. "Hitler!": what does that mean??

Mark, if you want people to understand you, you have to get your facts
straight, you have to explain yourself clearly, and when people don't
understand, you have to not resort to insults. Perhaps you are a
misunderstood genius, I can't tell for sure. So far it just looks like
you are making sweeping over-generalizations based on insufficient
understanding of the current and past complexities of the field.

Read and listen more. Write and say less.

--Ned.
>> Likewise here. I certainly 'dig' your passion to clean up the 'sloppy understandings everywhere' and would only wish for you the sanity of more knowledge of the subject before you begin to hold forth.

Ethan Furman

unread,
Oct 17, 2013, 6:10:55 PM10/17/13
to pytho...@python.org
On 10/17/2013 01:57 PM, Ned Batchelder wrote:
>
> Read and listen more. Write and say less.

Mark Janssen has no interest in learning. From a thread long-ago:

Mark Janssen wrote:
> Ethan Furman wrote:
>> Mark Janssen wrote:
>>>
>>> Really?
>>>
>>> --> int="five"
>>> --> [int(i) for i in ["1","2","3"]]
>>>
>>> TypeError: str is not callable
>>>
>>> Now how are you going to get the original int type back?
>>
>> Mark Janssen, you would increase your credibility if you actually *learned*
>> Python.
>
> Thank you, I actually knew it and was feigning ignorance for a good
> reason -- where is this magical "assignment stack" which remembers
> what int was originally bound to after I rebound it myself?

As you can see, if caught out he claims he was feigning ignorance, but then immediately shows the ignorance is real.

I take some solace in him not being a profane troll, as some in the past have been.

--
~Ethan~

Roy Smith

unread,
Oct 17, 2013, 8:01:07 PM10/17/13
to
In article <cb5c412c-7d41-4778...@googlegroups.com>,
Peter Cacioppi <peter.c...@gmail.com> wrote:

> OTOH, I've seen object-based C development projects (I.e. where you could
> tell what function was being called at compile time) that are quite readable.

If you can tell what function will be called by looking at the code,
it's not object oriented enough :-)

Chris Angelico

unread,
Oct 17, 2013, 8:09:13 PM10/17/13
to pytho...@python.org
Is that why there's the International Object-oriented C Code Contest
at www.ioccc.org?

ChrisA

Steven D'Aprano

unread,
Oct 17, 2013, 9:52:37 PM10/17/13
to
On Thu, 17 Oct 2013 07:49:52 -0700, Mark Janssen wrote:

> It's like this. No matter how you cut it, you're going to get back to
> the computers where you load instructions with switches. At that point,
> I'll be very much looking in anticipation to your binary-digit lexer.

Why stop there? If you go back far enough, you've got Babbage with his
Analytical Engine and his laboriously hand-cast analog gears.

Nobody disputes that once there were no parsers or lexers, and then some
time later there were. But so bloody what? That is ancient history,
irrelevant to the practice of computer programming for the last sixty
years. There is likely hardly anyone still alive who was programming
using switches; there weren't that many of them in the first place, and
they would be in their 80s or 90s now.

It's not the fact that parsers once didn't exist that people object to,
but your total misunderstanding of when that was and its significance to
computer science today.

Relevant:

http://www.xkcd.com/451/



--
Steven

Steven D'Aprano

unread,
Oct 17, 2013, 9:58:31 PM10/17/13
to
On Thu, 17 Oct 2013 19:24:58 +1100, Chris Angelico wrote:

> Anyway, what I sought to prove was that polymorphic object oriented code
> can be written in C or any other language.

The proof of this is that any Turing-complete language can simulate any
other language. Obviously the *difficulty* can vary, but any sufficiently
expressive language can be used to write an interpreter for some other
language which gives you the results you want.

I'm not just talking hypothetically here. Python is polymorphic, and
there are at least two Python implementations written in C (CPython and
Stackless). So if you took all the C code which implements object-
oriented behaviour within CPython, added it to your C project, and then
used it directly as a framework, you would have polymorphic code written
using nothing but C.
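To make that concrete, here is a rough sketch, in Python itself, of the explicit dispatch tables such a C framework boils down to (all the names here, the constructors and "vtables" alike, are illustrative, not taken from CPython's sources):

```python
# A hand-rolled "vtable": the dispatch machinery an OO language
# provides for free, written out explicitly as plain data + functions.
def circle_area(obj):
    return 3.14159 * obj["r"] ** 2

def square_area(obj):
    return obj["side"] ** 2

CIRCLE_VTABLE = {"area": circle_area}
SQUARE_VTABLE = {"area": square_area}

def make_circle(r):
    return {"vtable": CIRCLE_VTABLE, "r": r}

def make_square(side):
    return {"vtable": SQUARE_VTABLE, "side": side}

def call(obj, method, *args):
    # Polymorphic dispatch: look up the method in the object's own table.
    return obj["vtable"][method](obj, *args)

shapes = [make_circle(1.0), make_square(2.0)]
areas = [call(s, "area") for s in shapes]
```

The same structs-of-function-pointers pattern translates line for line into C, which is the point: the polymorphism is there, just without syntax for it.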

Of course, this wouldn't be idiomatic C code, and you won't have syntax
for what you want, but that's why other languages get invented in the
first place.


--
Steven

Mark Janssen

unread,
Oct 17, 2013, 9:59:07 PM10/17/13
to Ethan Furman, Python List
On Thu, Oct 17, 2013 at 3:10 PM, Ethan Furman <et...@stoneleaf.us> wrote:
> On 10/17/2013 01:57 PM, Ned Batchelder wrote:
>>
>>
>> Read and listen more. Write and say less.
>
>
> Mark Janssen has no interest in learning. From a thread long-ago:
>
> Mark Janssen wrote:
>>
>> Ethan Furman wrote:
>>>
>>> Mark Janssen wrote:
>>>>
>>>>
>>>> Really?
>>>>
>>>> --> int="five"
>>>> --> [int(i) for i in ["1","2","3"]]
>>>>
>>>> TypeError: str is not callable
>>>>
>>>> Now how are you going to get the original int type back?
>>>

Thank you for bringing this back up. Was it you who suggested that
built-ins are re-assignable? Because this is a bad idea for the
reasons I just showed. My error in that example was going into arcane
points I should have cross-checked against the Python language
definition (whether built-ins are or are *not* assignable). Then I
wouldn't have had to make my (otherwise valid) point, that there
is no magical "stack" which will remember your re-assignment
so that you can get it back, and the example would never have
been pushed into existence in the first place.

--
MarkJ
Tacoma, Washington

Mark Janssen

unread,
Oct 17, 2013, 10:08:30 PM10/17/13
to Steven D'Aprano, Python List
>> It's like this. No matter how you cut it, you're going to get back to
>> the computers where you load instructions with switches. At that point,
>> I'll be very much looking in anticipation to your binary-digit lexer.
>
> Why stop there? If you go back far enough, you've got Babbage with his
> Analytical Engine and his laboriously hand-cast analog gears.

And there you bring up the heart of it: the confusion in computer
science. Thank you. Babbage's difference engine is not doing
*computation*, it is doing *physics*. We must draw a line somewhere,
because the digital realm in the machine is so entirely separate from
the physics (and even the physical hardware), that I could make a
whole other universe that does not conform to it. It is a whole other
ModelOfComputation.

Q.E.D. (Who else is going to have to eat a floppy disk here?)

> Relevant:
>
> http://www.xkcd.com/451/

*winks*. BTW, all this regarding "models of computation" and such is
relevant to the discussion only because of one thing: I like python.
I will leave that vague response for a later exercise after I get an
invite from a University (MIT?) to head their Computer Engineering
department.

Cheers,

Mark

Steven D'Aprano

unread,
Oct 17, 2013, 10:57:48 PM10/17/13
to
On Thu, 17 Oct 2013 18:59:07 -0700, Mark Janssen wrote:

>>>>> --> int="five"
>>>>> --> [int(i) for i in ["1","2","3"]]
>>>>>
>>>>> TypeError: str is not callable
>>>>>
>>>>> Now how are you going to get the original int type back?


Trivially easy:

py> int
<type 'int'>
py> int = "five" # Oops!
py> int(42.5)
Traceback (most recent call last):
File "<stdin>", line 1, in ?
TypeError: 'str' object is not callable
py> del int
py> int(42.5) # Phew!
42



> Thank you for bringing this back up. Was it you who suggested that
> built-in are re-assignable?


It's not just a suggestion, it is a fact. The built-ins are just a
namespace, like any other module, class, or other namespace.

(Of course, if you break something in the built-ins, the consequences are
likely to be significantly more wide-ranging, but that's another issue.)

However, in the code shown above, you don't actually reassign a built-in.
You merely shadow it within the current module. Do you understand the
difference? In the above, the *builtin* int still exists, but your code
can no longer get direct access to it because a *global* int gets in the
way. Using Python 2.7:

py> import __builtin__ as builtins
py> builtins.int
<type 'int'>
py> int = "five"
py> int
'five'
py> builtins.int
<type 'int'>

so deleting the global "int" simply reveals the otherwise hidden
builtin.int instead.

However, if you rebind the builtin, Python doesn't remember what the old
value was (although in principle it could):

py> del int # get rid of the global
py> int is builtins.int
True
py> builtins.int = "six" # oh oh, this could be bad
py> int
'six'
py> del int
Traceback (most recent call last):
File "<stdin>", line 1, in ?
NameError: name 'int' is not defined



In this case, deleting the builtin doesn't magically recover it, it just
deletes it:

py> del builtins.int
py> int
Traceback (most recent call last):
File "<stdin>", line 1, in ?
NameError: name 'int' is not defined


At this point, in general, you've buggered up the current Python
environment and would normally need to restart the interpreter. But in
this specific case, all is not quite so lost: we can recover from this if
only we can find another reference to the int built-in type, and restore
it to the builtins:


py> builtins.int = type(42)
py> int("23")
23


I see no reason why Python couldn't create a read-only "backup builtins"
namespace, but on the other hand, why bother?


> Because this is a bad idea for the reasons I just showed.

"Breaking things" is always a bad idea. But re-binding is not necessarily
a bad thing. Let's say I want to find out how often the built-in "len"
function is called by some module:


py> count = 0
py> def mylen(x):
... global count
... count += 1
... return _len(x)
...
py> _len = len # save the real len
py> builtins.len = mylen # monkey-patch the builtins
py> import mymodule
py> count
3

Now obviously this is a trivial example. But there are more interesting,
and useful, reasons for monkey-patching builtins, usually for testing and
debugging purposes. Such a technique should be used with caution, but it
can be used.
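For test code specifically, a less invasive route (under Python 3, where the module is spelled builtins) is the standard library's unittest.mock.patch, which restores the original automatically:

```python
from unittest import mock

calls = []
real_len = len  # keep a reference to the real builtin

def counting_len(x):
    calls.append(x)
    return real_len(x)

# Patch builtins.len only for the duration of the with-block;
# the original is put back automatically, even if an error occurs.
with mock.patch("builtins.len", counting_len):
    n = len("hello")

assert n == 5
assert calls == ["hello"]
assert len is real_len  # patch undone on exit
```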



--
Steven

Steven D'Aprano

unread,
Oct 17, 2013, 11:14:09 PM10/17/13
to
On Wed, 16 Oct 2013 23:49:02 -0700, Peter Cacioppi wrote:

> Even Python, which isn't strongly typed

I see that in a later message you have stepped away from that
misconception, but I think it is well worth reading this essay:

https://cdsmith.wordpress.com/2011/01/09/an-old-article-i-wrote/

previously known as "What To Know Before Debating Type Systems".


I think the author goes a little too far to claim that "strong" and
"weak" are meaningless terms when it comes to type systems. I think it is
quite reasonable to accept that there is no hard and fast line dividing
"strongly" and "weakly" typed languages, without concluding that the
terms are meaningless. I think it is reasonable to say that Haskell has a
very strong type system, since it will (almost?) never allow any
operation on an unexpected type, or automatically convert one type to
another. Pascal is a little weaker, since it will automatically convert
numeric types but nothing else. Perl and PHP are a lot weaker, since they
will convert strings to numbers and vice versa. If you want to draw the
line between "strong" and "weak" so that Pascal is on one side and Perl
on the other, that seems reasonable to me.

One thing he missed is that there are untyped languages where everything
is the same type. If everything is the same type, that's equivalent to
there being no types at all. Examples include TCL and Hypertalk, where
everything is a string, and Forth, where everything is a two-byte word.

But I digress. Apart from those couple of little criticisms, I think it
is a very useful article to read.



--
Steven

rusi

unread,
Oct 17, 2013, 11:15:35 PM10/17/13
to
Hehe!
And if you had uttered 'ultracrepidarian' before yesterday you would have been
ultracrepidarian. After that not.
[With frank and free respect to the power of google and cut-n-paste]

rusi

unread,
Oct 17, 2013, 11:34:00 PM10/17/13
to
On Friday, October 18, 2013 7:38:30 AM UTC+5:30, zipher wrote:
> >> It's like this. No matter how you cut it, you're going to get back to
> >> the computers where you load instructions with switches. At that point,
> >> I'll be very much looking in anticipation to your binary-digit lexer.
> >
> > Why stop there? If you go back far enough, you've got Babbage with his
> > Analytical Engine and his laboriously hand-cast analog gears.
>
> And there you bring up the heart of it: the confusion in computer
> science. thank you. Babbage's differential engine is not doing
> *computation* , it is doing *physics*.

And today's computers don't 'do' electronics??

Here's Dijkstra:
http://www.cs.utexas.edu/users/EWD/transcriptions/EWD09xx/EWD924.html
and search forward to 'magic'

> We must draw a line somewhere,
> because the digital realm in the machine is so entirely separate from
> the physics (and even the physical hardware), that I could make a
> whole other universe that does not conform to it. It is a whole other
> ModelOfComputation.
>
> Q.E.D. (Who else is going to have to eat a floppy disk here?)


> > Relevant:
> >
> > http://www.xkcd.com/451/

> *winks*. BTW, all this regarding "models of computation" and such is
> relevant to the discussion only because of one thing: I like python.
> I will leave that vague response for a later exercise after I get an
> invite from a University (MIT?) to head their Computer Engineering
> department.

Jokes have a propensity to reveal the subconscious of the jokers
[Btw that joke is usually called a pun]

Steven D'Aprano

unread,
Oct 17, 2013, 11:56:58 PM10/17/13
to
On Thu, 17 Oct 2013 19:08:30 -0700, Mark Janssen wrote:

>>> It's like this. No matter how you cut it, you're going to get back to
>>> the computers where you load instructions with switches. At that
>>> point, I'll be very much looking in anticipation to your binary-digit
>>> lexer.
>>
>> Why stop there? If you go back far enough, you've got Babbage with his
>> Analytical Engine and his laboriously hand-cast analog gears.
>
> And there you bring up the heart of it: the confusion in computer
> science. thank you. Babbage's differential engine is not doing
> *computation* , it is doing *physics*.

That's certainly a good example of confusion, but it's not computer
science's, it's yours.

[Aside: I specifically didn't mention the difference engine because it
wasn't a full-blown computer, merely a calculating device like an abacus,
only more complicated. Babbage's Analytical Engine, on the other hand,
would have been a real, Turing Complete, fully general purpose
programmable computer, had he ever finished it.]

The point is, *everything* we do is "merely physics", since to do
something means to have matter and energy interacting, and that is
physics. Yes, Babbage's Analytical Engine was merely "doing physics" in
the same way that an iPhone or IBM mainframe is "doing physics" -- or
your brain, for that matter. All four examples are reductionism gone mad.
The fact that the Analytical Engine used mechanical gears, while iPhones
and mainframes use electrons drifting across doped silicon, and the brain
uses tiny electric currents in a gelatinous chunk of meat of Byzantine
complexity made from tens of thousands of chemicals, is the *least*
interesting part of the exercise, from the perspective of computer
science.

(Other perspectives are of value. For instance, how does a simple
molecule like CH₃CH₂OH affect the computations in the brain in such a way
that leads to punch-ups out the front of King Street nightclubs at 3 on a
Saturday morning? Inquiring minds want to know!)

One of the insights of computer science, and obviously one that you have
misunderstood, is that the *medium doesn't matter*. Computation is an
interesting and important phenomenon in its own right, and it doesn't
matter[1] whether you implement it using electric current flowing through
wires and valves, in silicon chips, using mechanical gears, water flowing
through pipes, in the differential growth of DNA-based bacteria -- yes,
seriously, look up "DNA computers" -- or in messy slabs of complicated
meat. Or even using a mathematical abstraction like Conway's Game of Life.


> We must draw a line somewhere,
> because the digital realm in the machine is so entirely separate from
> the physics (and even the physical hardware), that I could make a whole
> other universe that does not conform to it. It is a whole other
> ModelOfComputation.

But it is precisely because computation is independent of the physical
media that it is performed on that we *should not* reject Babbage's
Analytic Engine. It simply doesn't matter that it uses mechanical gears
instead of doped silicon. That just means it's slower and noisier, not
that it is performing any less of a computation.



[1] Except for such boring matters as efficiency and speed.

--
Steven

Chris Angelico

unread,
Oct 18, 2013, 12:12:36 AM10/18/13
to pytho...@python.org
On Fri, Oct 18, 2013 at 2:14 PM, Steven D'Aprano
<steve+comp....@pearwood.info> wrote:
> One thing he missed is that there are untyped languages where everything
> is the same type. If everything is the same type, that's equivalent to
> there being no types at all. Examples include TCL and Hypertalk, where
> everything are strings, and Forth, where everything are two-byte words.
>
> But I digress. Apart from those couple of little criticisms, I think it
> is a very useful article to read.

Continuing the digression slightly: If everything's a string, how do
you handle aggregate types like arrays? Are they outside the type
system altogether (like in C, where an array-of-int isn't something
you can pass around, though pointer-to-int is)? The only language I've
worked with that has "everything is strings" is REXX, and it does some
fancy footwork with variable names to do mappings, with a general
convention around the use of stem.0 to create ersatz arrays (probably
how JavaScript got the idea).
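REXX's stem convention can be mimicked with a plain dict keyed by compound names, which is roughly the footwork such interpreters do under the hood (a sketch; the names are made up):

```python
# Emulate a REXX "stem" with a dict of compound-name -> string:
stem = {}
items = ["alpha", "beta", "gamma"]
stem["list.0"] = str(len(items))   # REXX convention: stem.0 holds the count
for i, val in enumerate(items, 1):
    stem[f"list.{i}"] = val        # every value stays a string

assert stem["list.0"] == "3"
assert stem["list.2"] == "beta"
```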

ChrisA

Steven D'Aprano

unread,
Oct 18, 2013, 12:45:48 AM10/18/13
to
On Fri, 18 Oct 2013 15:12:36 +1100, Chris Angelico wrote:

> On Fri, Oct 18, 2013 at 2:14 PM, Steven D'Aprano
> <steve+comp....@pearwood.info> wrote:
>> One thing he missed is that there are untyped languages where
>> everything is the same type. If everything is the same type, that's
>> equivalent to there being no types at all. Examples include TCL and
>> Hypertalk, where everything are strings, and Forth, where everything
>> are two-byte words.
>>
>> But I digress. Apart from those couple of little criticisms, I think it
>> is a very useful article to read.
>
> Continuing the digression slightly: If everything's a string, how do you
> handle aggregate types like arrays? Are they outside the type system
> altogether (like in C, where an array-of-int isn't something you can
> pass around, though pointer-to-int is)?

I don't know about TCL, but in Hypertalk, when I said everything is a
string, I meant it. If you want a list of strings, you create one big
string using some delimiter (usually spaces, commas or newlines). So I
might say something like:

# it's been a few years, I may have some of the syntax wrong
put field "list of stuff" into text
for i = 1 to the number of lines of text:
    get line i of text
    if word 3 of item 6 of it is "stop" then do_stop()
    else do_start(word 1 of item 2 of it)

Hypertalk uses (almost) natural language chunking: lines are chunks of
text separated by newlines, items are separated by commas, and words are
separated by spaces. So you can easily implement up to three dimensional
arrays:

a b,c d,e f
g h,i j,k l
m n,o p,q r

is a list of three lines, each line having three items, each item having
two words. (Oh, and there's one more layer of chunking: the character or
char. Guess what that does?)
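Hypertalk's chunking maps naturally onto nested splits; a rough transliteration into Python (purely illustrative, 1-based like Hypertalk):

```python
text = "a b,c d,e f\ng h,i j,k l\nm n,o p,q r"

def line(s, i):          # lines are separated by newlines
    return s.split("\n")[i - 1]

def item(s, i):          # items are separated by commas
    return s.split(",")[i - 1]

def word(s, i):          # words are separated by whitespace
    return s.split()[i - 1]

# "word 1 of item 2 of line 3" of text:
assert word(item(line(text, 3), 2), 1) == "o"
```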


Actually, perhaps it's not quite true that everything is a string.
Hypertalk also has fields, which are text fields in the GUI environment.
Fields have properties such as the textsize and the font, as well as
contents, which are strings. There are also buttons, which don't have
contents, although some of them can have state like On or Off. There are
cards, which contain fields and buttons, and backgrounds, which contain
cards, and stacks, which contain backgrounds. So it actually was rather
object-oriented in a way, but the objects were completely tied to the GUI
environment. You couldn't directly create an abstract field object,
instead you treated it like a macro playback and did things like this:

choose menu item "New Field" from "Tools" menu
set the name of field "New Field" to "foo"
set the rect of field "foo" to 25,25,100,200

or if you were really keen, or perhaps foolish:

select field tool
drag from 25,25 to 100,200
set the name of field (the number of fields) to "foo"


Despite its flaws, it was great fun to program in, and the best
integrated GUI programming environment I've ever seen by far.



--
Steven

Chris Angelico

unread,
Oct 18, 2013, 12:53:27 AM10/18/13
to pytho...@python.org
On Fri, Oct 18, 2013 at 3:45 PM, Steven D'Aprano
<steve+comp....@pearwood.info> wrote:
> I don't know about TCL, but in Hypertalk, when I said everything is a
> string, I meant it. If you want a list of strings, you create one big
> string using some delimiter (usually spaces, commas or newlines).

Fair enough. As a system, that works reasonably cleanly... if a little
inefficiently, since you need to delimit everything. But hey, your
arrays are first-class objects by definition, and that's a good thing!

ChrisA

Peter Cacioppi

unread,
Oct 18, 2013, 4:32:24 PM10/18/13
to
> I think the author goes a little too far to claim that "strong" and
> "weak" are meaningless terms when it comes to type systems

I can live with that, actually.

The important language classifications are more along the lines of static vs. dynamic typing, procedural vs. functional, no objects vs. object-based vs. true OO.

That probably starts another flame war, but this thread is already running around with its hair on fire.

I still say that object-based is a distinct and meaningful subset of object-oriented programming. The former can be implemented elegantly in a wide range of languages without much in the way of specific language support; the latter needs to be designed into the language to allow a modicum of polymorphic readability.

It's an important distinction, because a project that is constrained to C should (IMHO) target an object-based design pattern but not an object-oriented one. That said, I'm open to disputation-by-example on this point, provided the example is reasonably small and pretty. (If the only polymorphic C code is ugly and non-small, it sort of proves my point).


Mark Lawrence

unread,
Oct 18, 2013, 4:41:57 PM10/18/13
to pytho...@python.org
As far as I'm concerned all of the above belongs on
comp.theoretical.claptrap, give me practicality beats purity any day of
the week :)

Peter Cacioppi

unread,
Oct 18, 2013, 4:56:14 PM10/18/13
to
> give me practicality beats purity any day of the week :)

Without some notion of theory you will end up with php instead of python (see how I looped the thread back around on track ... you're welcome).

If you think php is no worse than python for building reliable, readable code bases then God love you. Readability is huge for allowing efficient team development of larger projects, and readability flows from these sorts of discussions.




rusi

unread,
Oct 19, 2013, 1:26:02 AM10/19/13
to
On Saturday, October 19, 2013 2:02:24 AM UTC+5:30, Peter Cacioppi wrote:
>
> I still say that object-based is a distinct and meaningful subset of
> object-oriented programming.

Yes that is what is asserted by
http://www-public.int-evry.fr/~gibson/Teaching/CSC7322/ReadingMaterial/Wegner87.pdf
-- a classic though old reference

> The former can be implemented elegantly in a wide range of languages without much in the way of specific language support, the latter needs to designed into the language to allow a modicum of polymorhpic readability.

Three examples were given: (1) Python's C implementation, (2) OS/2, (3) the Linux kernel.
About (2) I don't know anything, though I believe GDK and GObject are more contemporary examples.

About (1) I have reservations -- see below.

IMO the linux kernel is the closest approx to what you are asking:
The details are here http://lwn.net/Articles/444910/
The top level summary is in the opening paras of
http://lwn.net/Articles/446317/

> It's an important distinction, because a project that is constrained to C
> should (IMHO) target an object-based design pattern but not an
> object-oriented one. That said, I'm open to disputation-by-example on this
> point, provided the example is reasonably small and pretty. (If the only
> polymorphic C code is ugly and non-small, it sort of proves my point).

Yes, this is an important point though hard to argue in a definitive way -- I call such arguments philosophical rather than scientific; i.e. it is important but it can't really be settled once and for all.

To see this one can start with two extremes:
Extreme 1: Computability (aka Turing) theory. From this pov every language/system/computer is equivalent to every other, and people designing 'newer' and 'better' ones are wasting their own and others' time, just like fashion designers who keep alternating pant-hems from Elvis Presley to narrow.

Extreme 2: Semicolon as separator differs from semicolon as terminator;
P4 processor is different from P2 etc etc -- essentially treating any difference as a substantive difference.

Clearly both extremes are unsatisfactory: the first lands us in the Turing tar-pit; the second makes a discussion of close-but-different impossible.

Just as the study of algorithms arose out of a desire to study program efficiency but with the nitty-gritty details of machines abstracted away, in the same way programming language semantics arose in order to study broad classes of languages with details hidden away.

Unfortunately, even after 50 years of trying, semantics has been a dismal failure in defining the what and where and whither of OOP.
In a sane world this would have signified that perhaps OOP as a concept(s) needs to be questioned.
Considering that the opposite has happened -- programming language semantics as an area has become distinctly 'old-fashioned' and not-so-respectable -- I can only conclude that the world is not sane.

Well the tide is slowly turning -- here's a growing bunch of people questioning the validity of OOP:
http://en.wikipedia.org/wiki/Object-oriented_programming#Criticism

Of these I find two noteworthy:
1. Stepanov, who is next to Stroustrup in C++ circles, calls OOP a hoax.
2. Carnegie Mellon University has eliminated OOP as "unsuitable for a modern CS curriculum".

And which is why I sympathize with Mark Janssen's passion to clean up the OOP act.