
tough-to-explain Python


kj

Jul 7, 2009, 4:04:46 PM

I'm having a hard time coming up with a reasonable way to explain
certain things to programming novices.

Consider the following interaction sequence:

>>> def eggs(some_int, some_list, some_tuple):
...     some_int += 2
...     some_list += [2]
...     some_tuple += (2,)
...
>>> x = 42
>>> y = (42,)
>>> z = [42]
>>> eggs(x, y, z)
>>> x
42
>>> y
(42,)
>>> z
[42, 2]
>>>

How do I explain to rank beginners (no programming experience at
all) why x and y remain unchanged above, but not z?

Or consider this one:

>>> ham = [1, 2, 3, 4]
>>> spam = (ham,)
>>> spam
([1, 2, 3, 4],)
>>> spam[0] is ham
True
>>> spam[0] += [5]
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
TypeError: 'tuple' object does not support item assignment
>>> ham += [5]
>>> spam
([1, 2, 3, 4, 5, 5],)
>>>

What do you say to that?

I can come up with much mumbling about pointers and stacks and
heaps and much hand-waving about the underlying this-and-that, but
nothing that sounds even remotely illuminating.

Your suggestions would be much appreciated!

TIA!

kj

Chris Rebert

Jul 7, 2009, 4:08:50 PM
to pytho...@python.org

You might find the following helpful (partially):
http://effbot.org/zone/call-by-object.htm

Cheers,
Chris
--
http://blog.rebertia.com

Piet van Oostrum

Jul 7, 2009, 4:53:13 PM
>>>>> kj <no.e...@please.post> (k) wrote:

>k> I'm having a hard time coming up with a reasonable way to explain
>k> certain things to programming novices.

>k> Consider the following interaction sequence:

>>>>> def eggs(some_int, some_list, some_tuple):
>k> ... some_int += 2
>k> ... some_list += [2]
>k> ... some_tuple += (2,)
>k> ...


>>>>> x = 42
>>>>> y = (42,)
>>>>> z = [42]
>>>>> eggs(x, y, z)
>>>>> x

>k> 42
>>>>> y
>k> (42,)
>>>>> z
>k> [42, 2]
>>>>>

>k> How do I explain to rank beginners (no programming experience at
>k> all) why x and y remain unchanged above, but not z?

You shouldn't. That's not for beginners. Leave it waiting until you get
to the advanced level.

>k> Or consider this one:

>>>>> ham = [1, 2, 3, 4]
>>>>> spam = (ham,)
>>>>> spam

>k> ([1, 2, 3, 4],)
>>>>> spam[0] is ham
>k> True
>>>>> spam[0] += [5]
>k> Traceback (most recent call last):
>k> File "<stdin>", line 1, in <module>
>k> TypeError: 'tuple' object does not support item assignment
>>>>> ham += [5]
>>>>> spam
>k> ([1, 2, 3, 4, 5, 5],)
>>>>>

>k> What do you say to that?

Mutable and immutable. But use different examples. Like

ham = [1, 2, 3, 4]

spam = (1, 2, 3, 4)

spam[0] += 1 will give the same error message. You can't change the
components of a tuple.

Your example above is similar. The spam[0] += [5] appends the 5 to the
list in spam[0] (so it appends to ham), and then tries to assign the
result of it to spam[0], which is not allowed. That the item it tries to
assign is the same as the item that was already there doesn't matter.

So don't forget that += is a real assignment, even when it is an in-place
modification. Your example just proves that. The language ref manual
says:

With the exception of assigning to tuples and multiple targets in a
single statement, the assignment done by augmented assignment
statements is handled the same way as normal assignments.
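A quick interactive sketch of that order of operations (output from a
recent CPython; illustrative only):

>>> ham = [1, 2, 3, 4]
>>> spam = (ham,)
>>> try:
...     spam[0] += [5]
... except TypeError as e:
...     print(e)
...
'tuple' object does not support item assignment
>>> spam        # the in-place extend ran before the item assignment failed
([1, 2, 3, 4, 5],)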

But I think that your example isn't for beginners either.
--
Piet van Oostrum <pi...@cs.uu.nl>
URL: http://pietvanoostrum.com [PGP 8DAE142BE17999C4]
Private email: pi...@vanoostrum.org

kj

Jul 7, 2009, 5:11:29 PM
In <m2my7gb...@cs.uu.nl> Piet van Oostrum <pi...@cs.uu.nl> writes:

>>>>>> kj <no.e...@please.post> (k) wrote:

>>k> I'm having a hard time coming up with a reasonable way to explain
>>k> certain things to programming novices.

>>k> Consider the following interaction sequence:

>>>>>> def eggs(some_int, some_list, some_tuple):
>>k> ... some_int += 2
>>k> ... some_list += [2]
>>k> ... some_tuple += (2,)
>>k> ...
>>>>>> x = 42
>>>>>> y = (42,)
>>>>>> z = [42]
>>>>>> eggs(x, y, z)
>>>>>> x
>>k> 42
>>>>>> y
>>k> (42,)
>>>>>> z
>>k> [42, 2]
>>>>>>

>>k> How do I explain to rank beginners (no programming experience at
>>k> all) why x and y remain unchanged above, but not z?

>You shouldn't. That's not for beginners.

No, of course not. And I don't plan to present these examples to
them. But beginners have a way of bumping into such conundrums
all on their own, and, as a former beginner myself, I can tell you
that they find them, at best, extremely frustrating, and at worst,
extremely discouraging. I doubt that an answer of the form "don't
worry your pretty little head over this now; wait until your second
course" will do the trick.

Thanks for your comments!

kj

kj

Jul 7, 2009, 5:18:53 PM

>You might find the following helpful (partially):
>http://effbot.org/zone/call-by-object.htm


Extremely helpful. Thanks! (I learned more from it than someone
who will teach the stuff would care to admit...)

I had not realized how *profoundly* different the meaning of the
"=" in Python's

spam = ham

is from the "=" in its

spam[3] = ham[3]

So much for "explicit is better than implicit"...


And it confirmed Paul Graham's often-made assertion that all of
programming language design is still catching up to Lisp...

kj

Bearophile

Jul 7, 2009, 5:46:01 PM
kj, as Piet van Oostrum has said, that's the difference between mutable
and immutable. It comes from the procedural nature of Python, and
probably an explanation of such a topic can't be avoided if you want to
learn/teach Python.

Lots of people think that a language where everything is immutable is
simpler for newbies to understand (even if they have to learn what
higher-order functions are). That's why some people think Scheme or
other languages (even Clojure, I guess) are better for novices (Scheme
has mutability, but you can avoid showing it to newbies).

Some people say that languages that mostly encourage the use of
immutable data structures (like F#, Clojure, Scala and many other less
modern ones) help avoid bugs, maybe for novices too.

On the other hand, it's hard or impossible to actually remove
complexity from a system, and you usually just move it elsewhere. So
other things will become harder to do for those novices. I have no
idea if for the average novice it's simpler to learn to use an
immutables-based language instead of a mutables-based one (it may also
be that some novices prefer the first group, and other novices prefer
the second group).

From what I have seen, a lot of students seem able to learn Python, so
it's not a bad choice.

Python isn't perfect, and in *many* situations it is pragmatic, for
example to increase its performance. Generally for a novice programmer
running speed is not important, but it's important to have a really
coherent and clean language. I've personally seen that for such people
even Python looks very "dirty" (even if it's one of the less dirty
ones).

For example a novice wants to see 124 / 38 return the 62/19
fraction and not 3 or 3.263157894736842 :-)

People may be able to invent a clean and orthogonal language that's
easy to use and has very few compromises, fit for newbies. But this
language may be very slow, not much useful in practice (who knows?
Maybe there's a practical niche even for such very high level
language), and it doesn't teach how to use lower level languages like
C :-)

Today I think there are no languages really fit for teaching. Python
is one of the few fit ones, but it's getting more and more complex as
time passes because it's getting used in more and more complex real
world situations (a language fit for newbies must not have abstract
base classes, decorators, etc). The D language is too complex for a
newbie. Java is not too bad, but it requires writing too much
code and is too fussy (semicolons at the end of lines? Newbies say
that the computer is an idiot when it refuses code just because
there's a missing useless semicolon!), and it misses some necessary
things (first-class functions! Damn). A nice language like Boo running
on the Mono VM seems another option :-)

In the past Pascal was good enough, but I think it's not good enough
anymore. The problem is that teaching is a niche activity (even if a
very important one). PLT Scheme is one of the few environments (besides
Squeak, Python itself, and little more) that look refined and are
implemented well enough for such a purpose.

See you later,
bearophile

norseman

Jul 7, 2009, 7:07:04 PM
to Bearophile, Python list
Bearophile wrote:
> kj, as Piet van Oostrum as said, that's the difference between mutable
> an immutable. It comes from the procedural nature of Python, and
> probably an explanation of such topic can't be avoided if you want to
> learn/teach Python.
...(snip)
>
> See you later,
> bearophile

==========================
Perhaps because it is the way I learned or the way I "naturally"
approach programming, or maybe the way one influences the other, but at
any rate I think "newbies" to programming should first learn a few basic
concepts (if/then, for, while, do and other loops and other control
constructs) and then be forced to deal with the computer on its own
terms. Assembly. Once the student learns what the computer already
knows, that the whole thing is just bits and the combination of them
determines its responses, it then becomes easier to present
'idealistic' concepts and their implementations. With the knowledge and
'mental picture' of the computer's nature the rest of the ideas for
programming have a tendency to drift in the direction of reality and the
underlying needs to fulfill the 'better' and/or 'easier' languages.
Having both the knowledge of the 'full capabilities' of a computer and
the experience of a formalized language the student can, or should,
learn/compare the trade offs of each. By understanding the computer (I
at least) grasp the basics of a new language quite quickly. No, I don't
become a guru overnight, if at all, but I do have the advantage in
deciding if a given language is appropriate for a given job with very
little research.

The more I delve into OOP the more I liken an 'object' to a box. A box
with a shipping manifest.

There are big boxes,
little boxes,
squat boxes and so on.

A box can contain corn flakes,
bullets, raisins, rice, burlap, silk, motorcycle(s), soap and more.

The manifest describes contents.
The manifest is there but the description(s) change with content (type).
The descriptions always use one or more of the basics like: color,
count, dimension and so forth.

Just like an OOP object.

A box can contain things of all sorts, including references to the
contents of other box(es). A box can even be a virtual of another (the
global concept). The return statement, in this context, means hauling
the contents of the box (and/or its manifest) back to (wherever) and
disposing of the current box (a local).

Just like an OOP object.


It is easier to visualize a box and its use than an undescribed blob.
Abstracts are never precise - hence the evolution of the word.


The one thing a teacher will always fail is the same as anyone else who
tries to adequately describe a pretty sunset to a person born totally
blind. No point(s) of reference.

Time for me to sign off. To all those that helped me when I needed it -

I thank you very much.

Food for thought: your watch (clock) does not tell time.
The watch (clock) only mimics one movement of the earth.
ie... 3 dimensions are still static, the 4th is movement.


Steve

Ben Finney

Jul 7, 2009, 8:06:07 PM
kj <no.e...@please.post> writes:

> In <mailman.2796.1246997...@python.org> Chris Rebert <cl...@rebertia.com> writes:
>
> >You might find the following helpful (partially):
> >http://effbot.org/zone/call-by-object.htm
>
> Extremely helpful. Thanks! (I learned more from it than someone
> who will teach the stuff would care to admit...)

I have got very good results from teaching using the analogy of “paper
tags tied to physical objects” to describe Python's references to
values.

The analogy allows illustration of many otherwise confusing facts:

* an assignment is the act of putting a tag onto an object

* a tag leads to exactly one object, and it can be re-tied to a
different object

* an object can have arbitrarily many tags on it

* containers (e.g. lists, dicts, sets, etc.) don't contain objects,
they contain tags leading to objects

* passing arguments to a function always makes new tags leading to the
objects passed in, and throws those new tags away once the function
returns

* etc.

The analogy can then be contrasted to how Python *doesn't* do it: named
boxes with one value per box. You can point out that many other
languages do use this model, so they should be aware of it, but Python
doesn't use it.
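A minimal interactive sketch of those points (the names and values here
are made up; any session will do):

>>> a = [1, 2, 3]       # tie the tag 'a' to a list object
>>> b = a               # a second tag tied to the same object
>>> b.append(4)         # mutate the object through either tag
>>> a
[1, 2, 3, 4]
>>> b = [99]            # re-tie 'b' to a different object; 'a' is untouched
>>> a
[1, 2, 3, 4]
>>> def f(seq):         # 'seq' is a new tag on whatever object is passed in
...     seq.append(5)   # mutation is visible through every tag on the object
...     seq = []        # re-tying the local tag changes nothing outside
...
>>> f(a)
>>> a
[1, 2, 3, 4, 5]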

> I had not realized how *profoundly* different the meaning of the
> "=" in Python's
>
> spam = ham

Accesses the object referenced by ‘ham’, and assigns the reference
‘spam’ to that object.

> is from the "=" in its
>
> spam[3] = ham[3]

Accesses the object referenced by ‘ham[3]’, and assigns the reference
‘spam[3]’ to that object.

No, they're exactly the same. The only thing that's different is the
references you use; the assignment operates *exactly* the same in both
cases.

--
\ “We are no more free to believe whatever we want about God than |
`\ we are free to adopt unjustified beliefs about science or |
_o__) history […].” —Sam Harris, _The End of Faith_, 2004 |
Ben Finney

John Yeung

Jul 7, 2009, 8:55:13 PM
On Jul 7, 5:11 pm, kj <no.em...@please.post> wrote:
> I don't plan to present these examples to them.
> But beginners have a way of bumping into such
> conundrums all on their own [...].  I doubt that

> an answer of the form "don't worry your pretty
> little head over this now; wait until your second
> course" will do the trick.

I agree that beginners are bound to come across difficult issues on
their own, and when they do, you have to at least try to explain,
rather than dismiss them. I believe that the beginners which are most
curious, and thus most likely to run into things you didn't plan for
them, are also most likely to be ready and able to receive your
explanation.

That said, I think it's worth planning a fairly controlled flow of
information. For example, you can go a LONG way without touching
tuples or the augmented assignment operators.

For your function example, I suppose the key ideas to understand are
binding and containers (which may or may not be mutable). The
oversimplified version I think I would attempt to explain is that, as
far as the "outside world" is concerned, a function cannot rebind the
arguments passed to it. However, the contents of a mutable container
may be altered without rebinding. That is, you can hold a bucket,
pass that bucket to a function, and the function can put stuff in the
bucket or take stuff out without you ever letting go of the bucket.
When the function returns, you are still holding the bucket.
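A short sketch of the bucket idea (the function and names below are
invented purely for illustration):

>>> def fill(bucket):
...     bucket.append('stuff')   # changing the contents is visible outside
...     bucket = []              # letting go of the bucket; caller unaffected
...
>>> mine = []
>>> fill(mine)
>>> mine                         # still holding the same bucket, now with contents
['stuff']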

Nested containers, especially with "outside" bindings to inner
containers, may be tougher to visualize with real-world objects.
Hopefully by then the students can grasp a more abstract (pointerlike)
notion of binding! Good luck!

John

John Yeung

Jul 7, 2009, 9:26:13 PM
On Jul 7, 8:06 pm, Ben Finney <ben+pyt...@benfinney.id.au> wrote:
> I have got very good results from teaching using
> the analogy of “paper tags tied to physical objects”
> to describe Python's references to values.

Ah, I like that! I think it's better than what I used in my post
(which I composed and submitted before yours showed up in my reader).

I am not formally a teacher, but I do try to help "nonprogrammers"
learn Python from time to time, and this will be a good one to
remember.

John

Steven D'Aprano

Jul 7, 2009, 10:03:16 PM
On Tue, 07 Jul 2009 21:18:53 +0000, kj wrote:

> I had not realized how *profoundly* different the meaning of the "=" in
> Python's
>
> spam = ham
>
> is from the "=" in its
>
> spam[3] = ham[3]
>
> So much for "explicit is better than implicit"...

I'm sorry, I don't get it. Can you explain please? I don't see why it's
so "profoundly" different. Apart from spam[3] = x not being permitted if
spam is an immutable type.

I suppose though they are *fundamentally* different, in that spam=ham is
dealt with by the compiler while spam[3]=ham is handled by the object
spam using the __setitem__ method. Is that the difference you're talking
about?
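A minimal sketch of that distinction (the class below is invented purely
for illustration):

>>> class Loud(object):
...     def __setitem__(self, index, value):
...         print("setting %r to %r" % (index, value))
...
>>> spam = Loud()
>>> spam[3] = "ham"      # handled by the object, via __setitem__
setting 3 to 'ham'
>>> spam = "ham"         # plain name binding; no method of Loud is involved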


--
Steven

pyt...@bdurham.com

Jul 7, 2009, 10:40:05 PM
to pytho...@python.org
Ben,

> I have got very good results from teaching using the analogy of "paper tags tied to physical objects" to describe Python's references to values.

Great analogy!! An excellent one for newcomers to Python. (This would
have saved me some personal pain in my early days.)

Regards,
Malcolm

Gabriel Genellina

Jul 7, 2009, 11:02:53 PM
to pytho...@python.org
En Tue, 07 Jul 2009 17:04:46 -0300, kj <no.e...@please.post> escribió:

> I'm having a hard time coming up with a reasonable way to explain
> certain things to programming novices.
>

>>>> ham = [1, 2, 3, 4]
>>>> spam = (ham,)
>>>> spam
> ([1, 2, 3, 4],)
>>>> spam[0] is ham
> True
>>>> spam[0] += [5]
> Traceback (most recent call last):
> File "<stdin>", line 1, in <module>
> TypeError: 'tuple' object does not support item assignment
>>>> ham += [5]
>>>> spam
> ([1, 2, 3, 4, 5, 5],)

>
> What do you say to that?
>
> I can come up with much mumbling about pointers and stacks and
> heaps and much hand-waving about the underlying this-and-that, but
> nothing that sounds even remotely illuminating.

This article is based on an old thread in this list that is very
enlightening:

How To Think Like A Pythonista
http://python.net/crew/mwh/hacks/objectthink.html

and I'll quote just a paragraph from Alex Martelli:

“There is [...] a huge difference
between changing an object, and changing (mutating) some
OTHER object to which the first refers.

In Bologna over 100 years ago we had a statue of a local hero
depicted pointing forwards with his finger -- presumably to
the future, but given where exactly it was placed, the locals
soon identified it as "the statue that points to Hotel
Belfiore". Then one day some enterprising developer bought
the hotel's building and restructured it -- in particular,
where the hotel used to be was now a restaurant, Da Carlo.

So, "the statue that points to Hotel Belfiore" had suddenly
become "the statue that points to Da Carlo"...! Amazing
isn't it? Considering that marble isn't very fluid and the
statue had not been moved or disturbed in any way...?

This is a real anecdote, by the way (except that I'm not
sure of the names of the hotel and restaurant involved --
I could be wrong on those), but I think it can still help
here. The dictionary, or statue, has not changed at all,
even though the objects it refers/points to may have been
mutated beyond recognition, and the name people know it
by (the dictionary's string-representation) may therefore
change. That name or representation was and is referring
to a non-intrinsic, non-persistent, "happenstance"
characteristic of the statue, or dictionary...”

--
Gabriel Genellina

Simon Forman

Jul 8, 2009, 12:10:38 AM
On Jul 7, 4:04 pm, kj <no.em...@please.post> wrote:
> I'm having a hard time coming up with a reasonable way to explain
> certain things to programming novices.
>
> Consider the following interaction sequence:
>
> >>> def eggs(some_int, some_list, some_tuple):
>
> ... some_int += 2
> ... some_list += [2]
> ... some_tuple += (2,)
> ...
>
> >>> x = 42
> >>> y = (42,)
> >>> z = [42]
> >>> eggs(x, y, z)
> >>> x
> 42
> >>> y
> (42,)
> >>> z
> [42, 2]

You have transposed some_list and some_tuple. I.e. you should be
calling eggs(x, z, y).

> How do I explain to rank beginners (no programming experience at
> all) why x and y remain unchanged above, but not z?

You don't. Rank beginners don't have enough background knowledge to
grok that code.

Why would you even tell the poor bastards about "+=" before they were
comfortable with (python's style of) function calls, immutable
integers, mutable lists and immutable tuples?

Let them use "x = x + y" until they have enough knowledge to
understand "augmented" assignment.

Syntax matters (I mean general order of things, not linguistic
syntax.) Making an omelette requires putting eggs in a pan and
cracking them, but not in that order.

> Or consider this one:
>
> >>> ham = [1, 2, 3, 4]
> >>> spam = (ham,)
> >>> spam
> ([1, 2, 3, 4],)
> >>> spam[0] is ham
> True
> >>> spam[0] += [5]
>
> Traceback (most recent call last):
> File "<stdin>", line 1, in <module>
> TypeError: 'tuple' object does not support item assignment
> >>> ham += [5]
> >>> spam
>
> ([1, 2, 3, 4, 5, 5],)
>
>
>
> What do you say to that?

I say, "Don't use augmented assignment with indexed tuples."

Seriously, python's augmented assignment is almost magical. I think
you're just making trouble for yourself and your students if you
introduce it too early.

I get python pretty well (if I say so myself) but I wouldn't know how
to explain:

In [1]: def foo(a_list):
...:     a_list = a_list + [5]
...:
...:

In [2]: n = []

In [3]: foo(n)

In [4]: n
Out[4]: []

In [5]: def bar(a_list):
...:     a_list += [5]
...:
...:

In [6]: bar(n)

In [7]: n
Out[7]: [5]


It's "Just The Way It Is". Freakin' magic, man.
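One way to take some of the magic out of it (a sketch; you would never
call these methods directly in real code): a_list + [5] goes through
list.__add__, which builds a brand-new list, while a_list += [5] goes
through list.__iadd__, which extends the existing list in place and
returns that same object, so only the second version is visible to the
caller.

>>> n = []
>>> n.__iadd__([5]) is n    # += on a list mutates and returns the same object
True
>>> n.__add__([6]) is n     # + always builds a new list
False
>>> n
[5]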

> I can come up with much mumbling about pointers and stacks and
> heaps and much hand-waving about the underlying this-and-that, but
> nothing that sounds even remotely illuminating.
>
> Your suggestions would be much appreciated!

Frankly, I'm of the impression that it's a mistake not to start
teaching programming with /the bit/ and work your way up from there.
I'm not kidding. I wrote a (draft) article about this: "Computer
Curriculum" http://docs.google.com/View?id=dgwr777r_31g4572gp4

I really think the only good way to teach computers and programming is
to start with a bit, and build up from there. "Ontology recapitulates
phylogeny"

I realize that doesn't help you teach python, but I wanted to put it
out there.

Steven D'Aprano

Jul 8, 2009, 2:09:19 AM
On Tue, 07 Jul 2009 20:04:46 +0000, kj wrote:

> I'm having a hard time coming up with a reasonable way to explain
> certain things to programming novices.

[...]


> Or consider this one:
>
>>>> ham = [1, 2, 3, 4]
>>>> spam = (ham,)
>>>> spam
> ([1, 2, 3, 4],)
>>>> spam[0] is ham
> True
>>>> spam[0] += [5]
> Traceback (most recent call last):
> File "<stdin>", line 1, in <module>
> TypeError: 'tuple' object does not support item assignment
>>>> ham += [5]
>>>> spam
> ([1, 2, 3, 4, 5, 5],)
>>>>
>>>>
> What do you say to that?


That one surely is very straight forward. Just like the exception says,
tuples don't support item assignment, so spam[0] += [5] is not allowed.

But just because you have put a list inside a tuple doesn't mean the list
stops being a list -- you can still append to the list, which is what
ham += [5] does. So spam is unchanged: it is still the one-item tuple
containing a list. It is just that the list has now been modified.

This is only troublesome (in my opinion) if you imagine that tuples are
somehow magical "frozen-lists", where the contents can't be modified once
created. That's not the case -- the tuple itself can't be modified, but
the objects inside it remain ordinary objects, and the mutable ones can
be modified.

The thing to remember is that the tuple spam doesn't know anything about
the *name* ham -- it knows the object referred to by the name ham. You
can modify the name, and nothing happens to the tuple:

>>> spam
([1, 2, 3, 4, 5],)
>>> ham = [5]
>>> spam
([1, 2, 3, 4, 5],)

Or if you prefer:

>>> ham = spam[0] # label the list inside spam as 'ham'
>>> ham += [6] # modify the list labelled as 'ham'
>>> spam
([1, 2, 3, 4, 5, 6],)
>>> pork = ham # create a new label, 'pork', and bind it to the same list
>>> del ham # throw away the label 'ham'
>>> pork += [7] # modify the list labelled as 'pork'
>>> spam
([1, 2, 3, 4, 5, 6, 7],)


It's all about the objects, not the names.

--
Steven


Francesco Bochicchio

Jul 8, 2009, 3:32:07 AM

I would go with something like this:

"""
In object oriented programming, the same function or operator can be
used to represent
different things. This is called overloading. To understand what the
operator/function do, we have to look at
the kind of object it is applied to.
In this case, the operator "+=" means two different things:
- for strings and numbers it means : "create a new object by merging
the two operands". This is why the original object is left the same.
- for lists, it means : "increase the left operand with the contents
of the right operand". This is why the original object is changed
"""

You could also add:
"""
You see, in Python everything is an object. Some objects can be changed
(mutable objects), others cannot.
"""
but this is another story.


P.S.: Sometimes I think they should not have allowed += on immutables
and forced everybody to write s = s + "some more".

Ciao
----
FB

Piet van Oostrum

Jul 8, 2009, 4:11:35 AM
>>>>> Simon Forman <sajm...@gmail.com> (SF) wrote:

>SF> Why would you even tell the poor bastards about "+=" before they were
>SF> comfortable with (python's style of) function calls, immutable
>SF> integers, mutable lists and immutable tuples?

>SF> Let them use "x = x + y" until they have enough knowledge to
>SF> understand "augmented" assignment.

And *then* you can tell them that "x += y" can be subtly different from
"x = x + y", which is what happened in the example that the OP gave.

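For instance (an illustrative session, not from the original message):

>>> x = [1]
>>> y = x
>>> x = x + [2]     # builds a new list; y still names the old one
>>> y
[1]
>>> x = [1]
>>> y = x
>>> x += [2]        # extends the existing list; y sees the change
>>> y
[1, 2]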
Gabriel Genellina

Jul 8, 2009, 4:13:42 AM
to pytho...@python.org
En Wed, 08 Jul 2009 04:32:07 -0300, Francesco Bochicchio
<bief...@gmail.com> escribió:

> I would go with something like this:
>
> """
> In object oriented programming, the same function or operator can be
> used to represent
> different things. This is called overloading. To understand what the
> operator/function do, we have to look at
> the kind of object it is applied to.
> In this case, the operator "+=" means two different things:
> - for strings and numbers it means : "create a new object by merging
> the two operands". This is why the original object is left the same.
> - for lists, it means : "increase the left operand with the contents
> of the right operand". This is why the original object is changed
> """

Mmm, but this isn't quite exact. += isn't an operator, but a statement
(augmented assignment). a += b translates to:

a = a.__iadd__(b) [*]

It's up to the __iadd__ implementation to return the same object or a new
one, and the assignment occurs even if the same object is returned.

If you think of assignments as binding names to objects (well, that's
the documented behavior, btw) things become clearer.
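A small sketch of both possibilities (the class below is invented for
illustration):

>>> class InPlace(object):
...     def __init__(self):
...         self.data = []
...     def __iadd__(self, other):
...         self.data.extend(other)   # mutate in place...
...         return self               # ...and hand back the same object
...
>>> a = InPlace()
>>> before = a
>>> a += [1, 2]
>>> a is before                       # the name is rebound to the same object
True
>>> n = 5
>>> n += 1          # int defines no __iadd__, so this falls back to __add__
>>> n               # and the name is bound to a new object
6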

> P:S : Sometime I think they should not have allowed += on immutables
> and forced everybody to write s = s + "some more".

I think not everyone agreed when += was introduced.
But now, having s += "some more" allows the operation to be optimized.

[*] except 'a' is evaluated only once

--
Gabriel Genellina

Ulrich Eckhardt

Jul 8, 2009, 5:24:28 AM
Bearophile wrote:
> For example a novice wants to see 124 / 38 to return the 62/19
> fraction and not 3 or 3.263157894736842 :-)

Python has adopted the last of the three for operator / and the second
one for operator //. I wonder if it was considered to just return a
fraction from that operation.

x = 3/5
-> x = fraction(3, 5)
x = x*7
-> x = fraction(21, 5)
x = x/2
-> x = fraction(21, 10)
range(x)
-> error, x not an integer
range(int(x))
-> [0, 1,]
y = float(x)
-> y = 2.1

This would have allowed keeping the operator what people are used to. On the
other hand, implicit conversion to either of float or int would have been
avoided, which is usually the #1 cause for subtle problems.
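For what it's worth, the standard library has offered an explicit
rational type since Python 2.6, so the behaviour sketched above is
available without changing the / operator (an illustrative session):

>>> from fractions import Fraction
>>> x = Fraction(3, 5)
>>> x = x * 7
>>> x
Fraction(21, 5)
>>> x = x / 2
>>> x
Fraction(21, 10)
>>> print(float(x))
2.1
>>> Fraction(124, 38)
Fraction(62, 19)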


Uli

--
Sator Laser GmbH
Geschäftsführer: Thorsten Föcking, Amtsgericht Hamburg HR B62 932

kj

Jul 8, 2009, 7:59:19 AM

>Bearophile wrote:
>> For example a novice wants to see 124 / 38 to return the 62/19
>> fraction and not 3 or 3.263157894736842 :-)

>Python has adopted the last of the three for operator / and the second
>one for operator //. I wonder if it was considered to just return a
>fraction from that operation.

http://www.python.org/dev/peps/pep-0239/

kj

kj

Jul 8, 2009, 8:23:50 AM

>Frankly, I'm of the impression that it's a mistake not to start
>teaching programming with /the bit/ and work your way up from there.
>I'm not kidding. I wrote a (draft) article about this: "Computer
>Curriculum" http://docs.google.com/View?id=dgwr777r_31g4572gp4

>I really think the only good way to teach computers and programming is
>to start with a bit, and build up from there. "Ontology recapitulates
>phylogeny"


I happen to be very receptive to this point of view. I had the
benefit of that sort of training (one of the first computer courses
I took started, believe it or not, with Turing machines, through
coding in machine language, and compiler theory, and all the way
up to dabbling with Unix!), and I suspect that the reason it is
sometimes difficult for me to explain even relatively simple-looking
things to others is that I have this background that I unconsciously,
and incorrectly, take for granted in others... There is this
persistent idea "out there" that programming is a very accessible
skill, like cooking or gardening, anyone can do it, and even profit
from it, monetarily or otherwise, etc., and to some extent I am
actively contributing to this perception by teaching this course
to non-programmers (experimental biologists to be more precise),
but maybe this idea is not entirely true... Maybe, to get past
the most amateurish level, one has to, one way or another, come
face-to-face with bits, compilers, algorithms, and all the rest
that real computer scientists learn about in their formal training...

kj

Steven D'Aprano

Jul 8, 2009, 8:43:55 AM
On Wed, 08 Jul 2009 11:24:28 +0200, Ulrich Eckhardt wrote:

> Bearophile wrote:
>> For example a novice wants to see 124 / 38 to return the 62/19 fraction
>> and not 3 or 3.263157894736842 :-)
>
> Python has adopted the latter of the three for operator / and the the
> second one for operator //.

Up until a few years ago, / would return the integer division if both
arguments were ints or longs, and // didn't even exist. Even in Python
2.6, you still need to do

from __future__ import division

to get the behaviour you describe.


> I wonder if it was considered to just return
> a fraction from that operation.

Because of his experience with ABC, Guido dislikes using fractions for
standard arithmetic. He was resistant to even allowing the rational
module to be added to the standard library.

The problem with fractions is that, unless you're very careful, repeated
arithmetic operations on numbers end up giving you fractions like

498655847957/569892397664

instead of the 7/8 you were expecting. Not only is that ugly, but it
becomes slower and slower as the denominator and numerator become larger.

At least, that was Guido's experience.
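That growth is easy to see with the fractions module (a small
illustrative example; the particular denominators are arbitrary):

>>> from fractions import Fraction
>>> x = Fraction(0)
>>> for d in (3, 7, 11, 13):   # coprime denominators multiply together
...     x += Fraction(1, d)
...
>>> x                          # exact, but not a number anyone asked for
Fraction(1934, 3003)
>>> print(round(float(x), 3))  # the two or three digits you actually wanted
0.644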

--
Steven

Hans Nowak

Jul 8, 2009, 8:51:30 AM
Ulrich Eckhardt wrote:
> Bearophile wrote:
>> For example a novice wants to see 124 / 38 to return the 62/19
>> fraction and not 3 or 3.263157894736842 :-)
>
> Python has adopted the latter of the three for operator / and the the second
> one for operator //. I wonder if it was considered to just return a
> fraction from that operation.

http://python-history.blogspot.com/2009/03/problem-with-integer-division.html

"For example, in ABC, when you divided two integers, the result was an exact
rational number representing the result. In Python however, integer division
truncated the result to an integer.

In my experience, rational numbers didn't pan out as ABC's designers had hoped.
A typical experience would be to write a simple program for some business
application (say, doing one’s taxes), and find that it was running much slower
than expected. After some debugging, the cause would be that internally the
program was using rational numbers with thousands of digits of precision to
represent values that would be truncated to two or three digits of precision
upon printing. This could be easily fixed by starting an addition with an
inexact zero, but this was often non-intuitive and hard to debug for beginners."

--
Hans Nowak (zephyrfalcon at gmail dot com)
http://4.flowsnake.org/

Steven D'Aprano

Jul 8, 2009, 9:10:23 AM
On Wed, 08 Jul 2009 12:23:50 +0000, kj wrote:

> In <5f0a2722-45eb-468c...@q11g2000yqi.googlegroups.com>
> Simon Forman <sajm...@gmail.com> writes:
>
>>Frankly, I'm of the impression that it's a mistake not to start teaching
>>programming with /the bit/ and work your way up from there. I'm not
>>kidding. I wrote a (draft) article about this: "Computer Curriculum"
>>http://docs.google.com/View?id=dgwr777r_31g4572gp4
>
>>I really think the only good way to teach computers and programming is
>>to start with a bit, and build up from there. "Ontology recapitulates
>>phylogeny"
>
>
> I happen to be very receptive to this point of view.

[...]


> There is this persistent idea "out there" that
> programming is a very accessible skill, like cooking or gardening,
> anyone can do it, and even profit from it, monetarily or otherwise,
> etc., and to some extent I am actively contributing to this perception
> by teaching this course to non-programmers (experimental biologists to
> be more precise), but maybe this idea is not entirely true...

There is some evidence that 30-60% of people simply cannot learn to
program, no matter how you teach them:

http://www.codinghorror.com/blog/archives/000635.html
http://www.cs.mdx.ac.uk/research/PhDArea/saeed/

I'm sympathetic to the idea, but not entirely convinced. Perhaps the
problem isn't with the students, but with the teachers, and the
languages:

http://www.csse.monash.edu.au/~damian/papers/PDF/SevenDeadlySins.pdf

(My money is that it's a little of both.)


> Maybe, to
> get past the most amateurish level, one has to, one way or another, come
> face-to-face with bits, compilers, algorithms, and all the rest that
> real computer scientists learn about in their formal training...

The "No True Scotsman" fallacy.

There's nothing amateurish about building software applications that
work, with well-designed interfaces and a minimum of bugs, even if you've
never heard of Turing Machines.


--
Steven

kj

Jul 8, 2009, 9:27:34 AM

>I'm not kidding. I wrote a (draft) article about this: "Computer
>Curriculum" http://docs.google.com/View?id=dgwr777r_31g4572gp4

Very cool.

kj

Paul Moore

Jul 8, 2009, 9:47:48 AM
to kj, pytho...@python.org
2009/7/8 kj <no.e...@please.post>:

> There is this
> persistent idea "out there" that programming is a very accessible
> skill, like cooking or gardening, anyone can do it, and even profit
> from it, monetarily or otherwise, etc., and to some extent I am
> actively contributing to this perception by teaching this course
> to non-programmers (experimental biologists to be more precise),
> but maybe this idea is not entirely true...  Maybe, to get past
> the most amateurish level, one has to, one way or another, come
> face-to-face with bits, compilers, algorithms, and all the rest
> that real computer scientists learn about in their formal training...

Look at it another way. Experimental biologists don't want to program,
they want to use computers to do experimental biology. It's a tool,
and they (quite reasonably) don't *care* about robustness,
portability, etc. Or even about programming, to be honest.

In the context of the original question, it's entirely reasonable (in
my view) to tell this audience "if the code does something weird you
don't understand, either ignore it and find another way or dig into
the manuals and experiment if you care". They'd very quickly find a =
a + b as a less confusing alternative to a += b. (As has been pointed
out earlier, to some extent a += b is quite an advanced construct -
after all, it's essentially an optimisation of a = a + b).

Biologists don't expect me to understand their discipline before I can
plant seeds in my garden, after all. (And when I do plant seeds, I
usually get far more surprising results than I could get from a += b
:-))

Paul.

nn

Jul 8, 2009, 9:55:18 AM
On Jul 7, 5:18 pm, kj <no.em...@please.post> wrote:

> In <mailman.2796.1246997332.8015.python-l...@python.org> Chris Rebert <c...@rebertia.com> writes:
>
> >You might find the following helpful (partially):
> >http://effbot.org/zone/call-by-object.htm
>
> Extremely helpful.  Thanks!  (I learned more from it than someone
> who will teach the stuff would care to admit...)
>
> And it confirmed Paul Graham's often-made assertion that all of
> programming language design is still catching up to Lisp...
>
> kj

One possibility is to avoid teaching mutable datatypes at the
beginning (i.e. no lists, dicts and sets); then you would have
something more lispish. You would have to use tuples instead of lists
for everything. That avoids the gotchas of mutable types at the cost
of efficiency.

Python is not a purist language but rather walks a fine line between
highly flexible abstractions and simple, fast implementation.
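A sketch of what that tuples-only style looks like in practice (purely
illustrative):

>>> point = (1, 2)
>>> point = point + (3,)       # "appending" means building a whole new tuple
>>> point
(1, 2, 3)
>>> values = (42, 43, 44)
>>> values = values[:1] + (99,) + values[2:]   # "replacing" an element, immutably
>>> values
(42, 99, 44)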

Dave Angel

Jul 8, 2009, 11:59:14 AM
to kj, pytho...@python.org
And I learned to program Analog Computers, along with APL, in an early
course. In my case assembly language came later, but by then we had
covered the electronics behind transistors, and Karnough maps, logic
diagrams, and so on. By the time I graduated, I had five six-level
languages and two assembly languages under my belt.

.DaveA

Simon Forman

Jul 9, 2009, 1:05:57 AM
(I wanted to reply to a few messages in one post so I quoted them all
below. Let me know if this is bad etiquette.)

On Jul 8, 8:23 am, kj <no.em...@please.post> wrote:


> In <5f0a2722-45eb-468c-b6b2-b7bb80ae5...@q11g2000yqi.googlegroups.com> Simon Forman <sajmik...@gmail.com> writes:
>
> >Frankly, I'm of the impression that it's a mistake not to start
> >teaching programming with /the bit/ and work your way up from there.
> >I'm not kidding. I wrote a (draft) article about this: "Computer
> >Curriculum"http://docs.google.com/View?id=dgwr777r_31g4572gp4
> >I really think the only good way to teach computers and programming is
> >to start with a bit, and build up from there. "Ontology recapitulates
> >phylogeny"
>
> I happen to be very receptive to this point of view. I had the
> benefit of that sort of training (one of the first computer courses
> I took started, believe it or not, with Turing machines, through
> coding in machine language, and compiler theory, and all the way
> up to dabbling with Unix!), and I suspect that the reason it is
> sometimes difficult for me to explain even relatively simple-looking
> things to others is that I have this background that I unconsciously,
> and incorrectly, take for granted in others... There is this

Yes! Once the concepts become so familiar that you call them
"intuitive" it seems to be very difficult to remember what they were
like before.

Something like "a = b" becomes "obvious" only after you've
internalized the preceding concepts.


> persistent idea "out there" that programming is a very accessible
> skill, like cooking or gardening, anyone can do it, and even profit
> from it, monetarily or otherwise, etc., and to some extent I am

Programming is not like any other human activity. I've been reading
some of Prof. Dijkstra's EWDs in the last few days. In one [1] he
says, "automatic computers embody not only one radical novelty but two
of them", to wit: First, the huge scales that must be understood,
"from a bit to a few hundred megabytes, from a microsecond to a half
an hour of computing"; and second, "that the automatic computer is our
first large-scale digital device" which our until-now overwhelmingly
analog experience does not prepare us to deal with well.

He talks about how "when all is said and done, the only thing
computers can do for us is to manipulate symbols and produce results
of such manipulations" and he emphasises the "uninterpreted" nature of
mechanical symbol manipulation, i.e. that the machine is doing it
mindlessly.

Dijkstra[1]: "It is true that the student that has never manipulated
uninterpreted formulae quickly realizes that he is confronted with
something totally unlike anything he has ever seen before. But
fortunately, the rules of manipulation are in this case so few and
simple that very soon thereafter he makes the exciting discovery that
he is beginning to master the use of a tool that, in all its
simplicity, gives him a power that far surpasses his wildest
dreams." [1]


> actively contributing to this perception by teaching this course
> to non-programmers (experimental biologists to be more precise),

Experimental biologists? Well that's probably harmless. Mostly
harmless.

> but maybe this idea is not entirely true... Maybe, to get past
> the most amateurish level, one has to, one way or another, come
> face-to-face with bits, compilers, algorithms, and all the rest
> that real computer scientists learn about in their formal training...
>

> kj

If you're never exposed to that constellation of concepts that
underpins "mechanical symbol manipulation" you are adrift in a sea
("c", ha ha) of abstractions.

However, if you /are/ exposed to the "so few and simple" rules of
manipulation the gates (no pun intended) to the kingdom are thrown
wide.


On Jul 8, 9:10 am, Steven D'Aprano <st...@REMOVE-THIS-


cybersource.com.au> wrote:
> On Wed, 08 Jul 2009 12:23:50 +0000, kj wrote:

> > I happen to be very receptive to this point of view.

> [...]


> > There is this persistent idea "out there" that
> > programming is a very accessible skill, like cooking or gardening,
> > anyone can do it, and even profit from it, monetarily or otherwise,
> > etc., and to some extent I am actively contributing to this perception
> > by teaching this course to non-programmers (experimental biologists to
> > be more precise), but maybe this idea is not entirely true...
>

> There is some evidence that 30-60% of people simply cannot learn to
> program, no matter how you teach them:
>
> http://www.codinghorror.com/blog/archives/000635.html
> http://www.cs.mdx.ac.uk/research/PhDArea/saeed/

Thank you! That's exactly the paper that prompted me to write the
article I mentioned. (Now I don't have to go find the link myself.
Win!)

I don't buy it: I believe strongly that any normal person can learn to
program, to manipulate symbols to create formulae that guide the
machine in its uninterpreted symbol manipulation.

I find it significant that in the paper [2] they say, "Formal logical
proofs, and therefore programs – formal logical proofs that particular
computations are possible, expressed in a formal system called a
programming language – are utterly meaningless. To write a computer
program you have to come to terms with this, to accept that whatever
you might want the program to mean, the machine will blindly follow
its meaningless rules and come to some meaningless conclusion. In the
test the consistent group showed a pre-acceptance of this fact: they
are capable of seeing mathematical calculation problems in terms of
rules, and can follow those rules wheresoever they may lead. The
inconsistent group, on the other hand, looks for meaning where it is
not."

In other words the people who don't understand computers, don't
understand computers.

I think that "first hump" people can become "second hump" people but
that it requires teaching them the foundations first, not confronting
them with such incredible novelties as "a = b" and saying in effect,
"here you go buddy, sink or swim."

Quoting Dijkstra again [1]: "Before we part, I would like to invite
you to consider the following way of doing justice to computing's
radical novelty in an introductory programming course.

"On the one hand, we teach what looks like the predicate calculus, but
we do it very differently from the philosophers. In order to train the
novice programmer in the manipulation of uninterpreted formulae, we
teach it more as boolean algebra, familiarizing the student with all
algebraic properties of the logical connectives. To further sever the
links to intuition, we rename the values {true, false} of the boolean
domain as {black, white}.

"On the other hand, we teach a simple, clean, imperative programming
language, with a skip and a multiple assignment as basic statements,
with a block structure for local variables, the semicolon as operator
for statement composition, a nice alternative construct, a nice
repetition and, if so desired, a procedure call. To this we add a
minimum of data types, say booleans, integers, characters and strings.
The essential thing is that, for whatever we introduce, the
corresponding semantics is defined by the proof rules that go with
it."

Imagine my surprise: he's saying (with immensely greater brilliance
and eloquence) much what I said in my little article. The major
difference from what he's outlined is that I think the students should
implement the imperative programming language themselves in Forth, but
the gist is the same.


> I'm sympathetic to the idea, but not entirely convinced. Perhaps the
> problem isn't with the students, but with the teachers, and the
> languages:
>
> http://www.csse.monash.edu.au/~damian/papers/PDF/SevenDeadlySins.pdf
>
> (My money is that it's a little of both.)

Hmm, that paper contains some good insights IMO, but I think they're
still missing the big picture, so to speak.

Really I suspect it's a case of "Programming languages considered
harmful."

The core abstractions of [mechanical] computation are just not that
complicated. You can teach them to anybody in about a half an hour,
drunk. I have.

After that, if they're interested, there is a smooth easy path to
"higher" abstractions: parsing, compiling, tree traversal and
transformation. (It is said that possession is 9/10s of the law, in
the same vein I would claim parsing is 9/10s of computer programming.)

I am beginning to suspect that concrete static (in the sense of
"standard" language specifications) languages are part of the
problem. Everyone gets so caught up in programming via languages that
you get, well, people trying to teach "Computer Programming" as if it
were only necessary to grok a language, rather than grokking /symbol
manipulation/ itself.

(Did you read that last paragraph and think, "Well how the heck else
are you supposed to program a computer if not in a computer
language?"? If so, well, that is kind of my point.)


> > Maybe, to
> > get past the most amateurish level, one has to, one way or another, come
> > face-to-face with bits, compilers, algorithms, and all the rest that
> > real computer scientists learn about in their formal training...
>

> The "No True Scotsman" fallacy.
>
> There's nothing amateurish about building software applications that
> work, with well-designed interfaces and a minimum of bugs, even if you've
> never heard of Turing Machines.
>
> --
> Steven

I beg to differ. I recall a conversation with a co-worker who had
"learned" to program using PHP. Another co-worker and I were trying
to convince him that there was a good reason to differentiate between
hash tables and arrays. He didn't even know that they were different
"things".

I remember telling him, "between the metal and the desktop there is
nothing but layers of abstraction. We use different names precisely
because of different behaviours."

He made "well-designed interfaces", but "amateurish" is about the
nicest thing I would have called him.

As for "a minimum of bugs"... sigh. The "minimum of bugs" is zero, if
you derive your "uninterpreted formulae" /correctly/. Deriving
provably correct "programs" should be what computer science and
computer education are all about (not "java vocational training" as
Alan Kay once decried.)

Again with Dijkstra[3]: "The prime paradigma of the pragmatic designer
is known as "poor man's induction", i.e. he believes in his design as
long as "it works", i.e. until faced with evidence to the contrary.
(He will then "fix the design".) The scientific designer, however,
believes in his design because he understands why it will work under
all circumstances. The transition from pragmatic to scientific design
would indeed be a drastic change within the computer industry."

"Obviously no errors" is the goal to strive for, and I am comfortable
calling anyone an amateur who prefers "no obvious errors." (Actually
that's a little harsh on the amateurs, "ama" meaning love, "amateur"
is one who does something for love of it.)


On Jul 8, 9:27 am, kj <no.em...@please.post> wrote:


> In <5f0a2722-45eb-468c-b6b2-b7bb80ae5...@q11g2000yqi.googlegroups.com> Simon Forman <sajmik...@gmail.com> writes:
>
> >I'm not kidding. I wrote a (draft) article about this: "Computer
> >Curriculum"http://docs.google.com/View?id=dgwr777r_31g4572gp4
>

> Very cool.
>
> kj

Hey, thank you!


On Jul 8, 9:47 am, Paul Moore <p.f.mo...@gmail.com> wrote:
> 2009/7/8 kj <no.em...@please.post>:


>
> > There is this
> > persistent idea "out there" that programming is a very accessible

<snip>


> > that real computer scientists learn about in their formal training...
>
> Look at it another way. Experimental biologists don't want to program,
> they want to use computers to do experimental biology. It's a tool,
> and they (quite reasonably) don't *care* about robustness,
> portability, etc. Or even about programming, to be honest.

I'd say it's just the opposite: to "use computers to do experimental
biology" they want to instruct that machine to manipulate their
(meaningful to them but meaningless to it) symbols in useful ways.
This is nothing more or less than programming.

The fact that they need to learn all sorts of details of a programming
language to do that is NOT because they can't grok programming. It's
because computer scientists have put too many layers of abstraction on
top of the "pure" symbol manipulation and then forgotten what they
have done.

I have a very nice book "Introduction to Programming and Problem
Solving with Pascal" that I picked up for $0.50 at a used bookstore
not long ago. It says, right on page 201, in the chapter on "Running,
Debugging, and Testing Programs":

"One of the nice features of programming in a high-level language like
Pascal is that it can be done with almost a total lack of
understanding of what a computer is and how it actually operates.
[...] There is no reason why someone who wants to write a computer
program should have to understand the electronic circuitry of a
computer, any more than someone learning to drive a car should have to
understand how the internal combustion engine works."

I think that's exactly wrong. What you're doing with computers doesn't
change from the bit to the high-level language. It's all symbol
manipulation according to the same set of rules, all the way up. The
elaboration becomes involved as you go up but the process does not
change qualitatively.


> In the context of the original question, it's entirely reasonable (in
> my view) to tell this audience "if the code does something weird you
> don't understand, either ignore it and find another way or dig into
> the manuals and experiment if you care". They'd very quickly find a =
> a + b as a less confusing alternative to a += b. (As has been pointed
> out earlier, to some extent a += b is quite an advanced construct -
> after all, it's essentially an optimisation of a = a + b).

On that point I completely agree. The language is getting in the way
of the programming.

> Biologists don't expect me to understand their discipline before I can
> plant seeds in my garden, after all. (And when I do plant seeds, I
> usually get far more surprising results than I could get from a += b
> :-))
>
> Paul.

The discipline of programming is different than biology. It's
incredibly simple yet profound if it's taught as what it is, i.e.
automatic symbol manipulation.

No scientist is a stranger to logic and reasoned argument. They
shouldn't be strangers to telling their mechanical brains what to
"reason" about. Learning to program should be /easy/ for someone who
basically already gets it.

Wow, long post.

(Oh, and, er, it's ONTOGENY not Ontology that recapitulates phylogeny.
Heh. My bad.)

[1] "On the cruelty of really teaching computing science"
http://www.cs.utexas.edu/~EWD/transcriptions/EWD10xx/EWD1036.html


[2] "The camel has two humps"
http://www.cs.mdx.ac.uk/research/PhDArea/saeed/paper1.pdf


[3] "Can computing science save the computer industry?"
http://www.cs.utexas.edu/users/EWD/transcriptions/EWD09xx/EWD920.html

greg

Jul 9, 2009, 2:29:26 AM
Dave Angel wrote:
> By the time I graduated, I had five six-level languages
^^^

Are they languages that you have to edit using vi? :-)

--
Greg

Dave Angel

Jul 9, 2009, 9:28:29 AM
to greg, pytho...@python.org
greg wrote:
> Dave Angel wrote:
Back then I didn't need glasses. That was of course intended to be
"six high-level languages"


Benjamin Kaplan

Jul 9, 2009, 9:47:53 AM
to pytho...@python.org
On Thu, Jul 9, 2009 at 1:05 AM, Simon Forman<sajm...@gmail.com> wrote:
> Everyone gets so caught up in programming via languages that
> you get, well, people trying to teach "Computer Programming" as if it
> were only necessary to grok a language, rather than grokking /symbol
> manipulation/ itself.
>
>

+1 QOTW.

I'm a CS major and this is exactly what I see happening in the intro
to CS courses. In class, the "Intro to Programming" students are
showed how to write Java code. Then they try their homework and are
completely lost because they were never taught how to think like a
programmer so they don't understand how to approach the question.

Steven D'Aprano

Jul 9, 2009, 1:20:14 PM
On Wed, 08 Jul 2009 22:05:57 -0700, Simon Forman wrote:

> The core abstractions of [mechanical] computation are just not that
> complicated. You can teach them to anybody in about a half an hour,
> drunk. I have.

That's *easy*. Anyone can teach the most complicated and abstract
principles of any topic at all drunk. What's hard is doing it sober.

http://stackoverflow.com/questions/63668/confessions-of-your-worst-wtf-moment-what-not-to-do/350267#350267

or http://tinyurl.com/lur784


You'll excuse my skepticism about all these claims about how anyone can
program, how easy it is to teach the fundamentals of Turing Machines and
functional programming to anybody at all. Prove it. Where are your peer-
reviewed studies demonstrating such successes on randomly chosen people,
not self-selected motivated, higher-than-average intelligence students?

In the real world, human beings are prone to serious cognitive biases and
errors. Our brains are buggy. Take any list of reasoning fallacies, and
you'll find the majority of people have difficulty avoid them. The first
struggle is to get them to even accept that they *are* fallacies, but
even once they have intellectually accepted it, getting them to live up
to that knowledge in practice is much, much harder.

In fact I'd go further and say that *every single person*, no matter how
intelligent and well-educated, will fall for cognitive biases and
fallacious reasoning on a regular basis.

http://en.wikipedia.org/wiki/Cognitive_bias
http://en.wikipedia.org/wiki/Fallacy


--
Steven

Steven D'Aprano

Jul 9, 2009, 2:10:16 PM
On Wed, 08 Jul 2009 22:05:57 -0700, Simon Forman wrote:

>> persistent idea "out there" that programming is a very accessible
>> skill, like cooking or gardening, anyone can do it, and even profit
>> from it, monetarily or otherwise, etc., and to some extent I am
>
> Programming is not like any other human activity.

In practice? In principle? Programming in principle is not the same as it
is performed in practice.

But in either case, programming requires both the logical reasoning of
mathematics and the creativity of the arts. Funnily enough,
mathematicians will tell you that mathematics requires the same, and so
will the best artists. I think mathematicians, engineers, artists, even
great chefs, will pour scorn on your claim that programming is not like
any other human activity.


[...]


> He talks about how "when all is said and done, the only thing computers
> can do for us is to manipulate symbols and produce results of such
> manipulations" and he emphasises the "uninterpreted" nature of
> mechanical symbol manipulation, i.e. that the machine is doing it
> mindlessly.

"Manipulate symbols" is so abstract as to be pointless. By that
reasoning, I can build a "computer" consisting of a box open at the top.
I represent a symbol by an object (say, a helium-filled balloon, or a
stone), instead of a pattern of bits. I manipulate the symbol by holding
the object over the box and letting go. If it flies up into the sky, that
represents the symbol "Love is War", if it falls into the box, it
represents the symbol "Strength is Blue", and if it just floats there, it
represents "Cheddar Cheese". This is a deterministic, analog computer
which manipulates symbols. Great.

And utterly, utterly useless. So what is my computer lacking that real
computers have? When you have answered that question, you'll see why
Dijkstra's claim is under-specified.

> Dijkstra[1]: "It is true that the student that has never manipulated
> uninterpreted formulae quickly realizes that he is confronted with
> something totally unlike anything he has ever seen before. But
> fortunately, the rules of manipulation are in this case so few and
> simple that very soon thereafter he makes the exciting discovery that he
> is beginning to master the use of a tool that, in all its simplicity,
> gives him a power that far surpasses his wildest dreams." [1]

What is an uninterpreted formula? If you don't interpret it, how can you
distinguish it from random noise?


>> but maybe this idea is not entirely true... Maybe, to get past the
>> most amateurish level, one has to, one way or another, come
>> face-to-face with bits, compilers, algorithms, and all the rest that
>> real computer scientists learn about in their formal training...
>>
>> kj
>
> If you're never exposed to that constellation of concepts that underpins
> "mechanical symbol manipulation" you are adrift in a sea ("c", ha ha) of
> abstractions.
>
> However, if you /are/ exposed to the "so few and simple" rules of
> manipulation the gates (no pun intended) to the kingdom are thrown wide.


What nonsense. The keys to the kingdom are the abstractions.

Here's an exercise for you, if you dare. It's not that difficult to
remove the abstraction from integer addition, to explain it in terms of
bit manipulation. If you prefer, you can write a Turing Machine instead.
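
For instance, a minimal sketch in Python (an illustration of the idea, not
anything from the thread; non-negative ints only), building addition out of
nothing but XOR, AND and shifts:

def add(a, b):
    # XOR gives the per-bit sum without carries; AND shifted left by
    # one gives the carries. Repeat until no carries remain.
    while b:
        a, b = a ^ b, (a & b) << 1
    return a

print(add(19, 23))   # 42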

Now try doing the same thing for Quicksort. Don't worry, we'll wait.

Once you've done that, let's see you implement a practical, scalable, non-
toy webserver as a Turing Machine.

No, it's not the fundamental operations that open the doors to the
kingdom. It's the abstractions. The history of computing is to gain more
and more power by leveraging better and better abstractions.



> On Jul 8, 9:10 am, Steven D'Aprano <st...@REMOVE-THIS-
> cybersource.com.au> wrote:

>> There is some evidence that 30-60% of people simply cannot learn to
>> program, no matter how you teach them:
>>
>> http://www.codinghorror.com/blog/archives/000635.html
>> http://www.cs.mdx.ac.uk/research/PhDArea/saeed/

...


> I don't buy it:


I'm not entirely convinced myself. It's plausible -- not everyone has got
what it takes to be musically talented, or a great athlete, or a skilled
surgeon, or a charismatic leader. Why should programming be the one high-
level intellectual skill that everybody is good at? Seems mighty
implausible to me.

But I'm not sure that the distribution of skill is *fundamentally* bi-
modal. Maybe it is, maybe it isn't.


> I believe strongly that any normal person can learn to
> program, to manipulate symbols to create formulae that guide the machine
> in its uninterpreted symbol manipulation.

Most people don't manipulate abstract symbols *at all* -- even something
as simple as:

"if x is to z as half y is to twice z, then x is the same as ____ y"
(fill in the blank)

will perplex most people.


> I think that "first hump" people can become "second hump" people but
> that it requires teaching them the foundations first, not confronting
> them with such incredible novelties as "a = b" and saying in effect,
> "here you go buddy, sink or swim."

The weakness of the paper is not that "a=b" is a novelty, but that it's
dangerously familiar. All their students will have already seen "a=b" in
the context of many years of mathematics. But that's not what it means in
this context. So the risk is, they will be interpreting the symbols "a=b"
in terms of mathematical equality, and getting confused.


> Quoting Dijkstra again [1]: "Before we part, I would like to invite you
> to consider the following way of doing justice to computing's radical
> novelty in an introductory programming course.
>
> "On the one hand, we teach what looks like the predicate calculus, but
> we do it very differently from the philosophers. In order to train the
> novice programmer in the manipulation of uninterpreted formulae, we
> teach it more as boolean algebra, familiarizing the student with all
> algebraic properties of the logical connectives. To further sever the
> links to intuition, we rename the values {true, false} of the boolean
> domain as {black, white}.

This is supposed to be a good thing?

Oh my. It must be nice in that ivory tower, never having to deal with
anyone who hasn't already been winnowed by years of study and self-
selection before taking on a comp sci course.

I don't think Dijkstra would have taken that suggestion seriously if he
had to teach school kids instead of college adults.


...

> (Did you read that last paragraph and think, "Well how the heck else are
> you supposed to program a computer if not in a computer language?"? If
> so, well, that is kind of my point.)

No. What I thought was, how the hell are you supposed to manipulate
symbols without a language to describe what sort of manipulations you can
do?


>> > Maybe, to
>> > get past the most amateurish level, one has to, one way or another,
>> > come face-to-face with bits, compilers, algorithms, and all the rest
>> > that real computer scientists learn about in their formal training...
>>
>> The "No True Scotsman" fallacy.
>>
>> There's nothing amateurish about building software applications that
>> work, with well-designed interfaces and a minimum of bugs, even if
>> you've never heard of Turing Machines.
>>
>> --
>> Steven
>
> I beg to differ. I recall a conversation with a co-worker who had
> "learned" to program using PHP. Another co-worker and I were trying to
> convince him that there was a good reason to differentiate between hash
> tables and arrays. He didn't even know that they were different
> "things".

I shall manfully resist the urge to make sarcastic comments about
ignorant PHP developers, and instead refer you to my earlier reply to you
about cognitive biases. For extra points, can you identify all the
fallacious reasoning in that paragraph of yours?


> I remember telling him, "between the metal and the desktop there is
> nothing but layers of abstraction. We use different names precisely
> because of different behaviours."

But those abstractions just get in the way, according to you. He should
be programming in the bare metal, doing pure symbol manipulation without
language.


> He made "well-designed interfaces", but "amateurish" is about the nicest
> thing I would have called him.

Well, perhaps this is a failure of his profession, in that there isn't
enough specialisation. If he is skilled with making interfaces, and
unskilled with programming the backend, then he should work where his
strength lies, instead of being insulted by somebody who has an entirely
unjustified opinion about what "real programming" is.

If programming is symbol manipulation, then you should remember that the
user interface is also symbol manipulation, and it is a MUCH harder
problem than databases, sorting, searching, and all the other problems
you learn about in academia. The user interface has to communicate over a
rich but noisy channel using multiple under-specified protocols to a
couple of pounds of meat which processes information using buggy
heuristics evolved over millions of years to find the ripe fruit, avoid
being eaten, and have sex. If you think getting XML was hard, that's
*nothing* compared to user interfaces.

The fact that even bad UIs work at all is a credit to the heuristics,
bugs and all, in the meat.


> As for "a minimum of bugs"... sigh. The "minimum of bugs" is zero, if
> you derive your "uninterpreted formulae" /correctly/.

And the secret of getting rich on the stock market is to buy low, sell
high. Very true, and very pointless. How do you derive the formula
correctly?

Do you have an algorithm to do so? If so, how do you know that algorithm
is correct?


> Deriving provably
> correct "programs" should be what computer science and computer
> education are all about

That's impossible. Not just impractical, or hard, or subject to hardware
failures, but impossible.

I really shouldn't need to raise this to anyone with pretensions of being
a Computer Scientist and not just a programmer, but... try proving an
arbitrary program will *halt*.

If you can't do that, then what makes you think you can prove all useful,
necessary programs are correct?

> Again with Dijkstra[3]: "The prime paradigma of the pragmatic designer
> is known as "poor man's induction", i.e. he believes in his design as
> long as "it works", i.e. until faced with evidence to the contrary. (He
> will then "fix the design".)

Yes, that is the only way it can be.


> The scientific designer, however, believes
> in his design because he understands why it will work under all
> circumstances.

Really? Under *all* circumstances?

Memory failures? Stray cosmic radiation flipping bits? Noise in the power
supply?

What's that you say? "That's cheating, the software designer can't be
expected to deal with the actual reality of computers!!! We work in an
abstract realm where computers never run out of memory, swapping never
occurs, and the floating point unit is always correct!"

Fair enough. I don't want to make it too hard for you.

Okay, so let's compare the mere "pragmatic designer" with his provisional
expectation that his code is correct, and Dijkstra's "scientific
designer". He understands his code is correct. Wonderful!

Er, *how exactly* does he reach that understanding?

He reasons about the logic of the program. Great! I can do that. See,
here's a proof that binary search is correct. How easy this is.

Now try it for some non-trivial piece of code. Say, the Linux kernel.
This may take a while...

What guarantee do we have that the scientific designer's reasoning was
correct? I mean, sure he is convinced he has reasoned correctly, but he
would, wouldn't he? If he was unsure, he'd keep reasoning (or ask for
help), until he was sure. But being sure doesn't make him correct. It
just makes him sure.

So how do you verify the verification?

And speaking of binary search:

[quote]
I was shocked to learn that the binary search program that Bentley PROVED
CORRECT and SUBSEQUENTLY TESTED [emphasis added] in Chapter 5 of
"Programming Pearls" contains a bug. Once I tell you what the it is, you
will understand why it escaped detection for two decades.
[end quote]

http://northernplanets.blogspot.com/2006/07/nearly-all-binary-searches-are-broken.html

or http://tinyurl.com/nco6yv

Perhaps the "scientific designer" should be a little less arrogant, and
take note of the lesson of real science: all knowledge is provisional and
subject to correction and falsification.

Which really makes the "scientific designer" just a slightly more clever
and effective "pragmatic designer", doesn't it?


> The transition from pragmatic to scientific design would
> indeed be a drastic change within the computer industry."

Given Dijkstra's definition of "scientific design", it certainly would.
It would be as drastic a change as the discovery of Immovable Objects and
Irresistible Forces would be to engineering, or the invention of a
Universal Solvent for chemistry, or any other impossibility.


> "Obviously no errors" is the goal to strive for, and I am comfortable
> calling anyone an amateur who prefers "no obvious errors."

It's not a matter of preferring no obvious errors, so much as understanding the
limitations of reality. You simply can't do better than no obvious errors
(although of course you can do worse) -- the only question is what you
call "obvious".

--
Steven

J. Cliff Dyer

unread,
Jul 9, 2009, 4:56:44 PM7/9/09
to Steven D'Aprano, pytho...@python.org
On Thu, 2009-07-09 at 18:10 +0000, Steven D'Aprano wrote:
> If programming is symbol manipulation, then you should remember that
> the
> user interface is also symbol manipulation, and it is a MUCH harder
> problem than databases, sorting, searching, and all the other
> problems
> you learn about in academia. The user interface has to communicate
> over a
> rich but noisy channel using multiple under-specified protocols to a
> couple of pounds of meat which processes information using buggy
> heuristics evolved over millions of years to find the ripe fruit,
> avoid
> being eaten, and have sex. If you think getting XML was hard, that's
> *nothing* compared to user interfaces.

+1 QOTW! QOTM, even!


Ethan Furman

unread,
Jul 9, 2009, 11:16:30 AM7/9/09
to pytho...@python.org
Steven D'Aprano wrote:
>
> There is some evidence that 30-60% of people simply cannot learn to
> program, no matter how you teach them:
>
> http://www.codinghorror.com/blog/archives/000635.html
> http://www.cs.mdx.ac.uk/research/PhDArea/saeed/
>
> I'm sympathetic to the idea, but not entirely convinced. Perhaps the
> problem isn't with the students, but with the teachers, and the
> languages:
>
> http://www.csse.monash.edu.au/~damian/papers/PDF/SevenDeadlySins.pdf
>
> (My money is that it's a little of both.)
>
>

Very interesting articles. Not surprising (to me, at least) -- everybody
has their strengths and weaknesses. I will never be, and could never
have been, any kind of sports person; I don't have the right
personality to be a good people person; I don't have the knack for
writing stories. Not everyone can do anything, or even many things.

~Ethan~

Terry Reedy

unread,
Jul 9, 2009, 11:07:34 PM7/9/09
to pytho...@python.org
Steven D'Aprano wrote:

> And speaking of binary search:
>
> [quote]
> I was shocked to learn that the binary search program that Bentley PROVED
> CORRECT and SUBSEQUENTLY TESTED [emphasis added] in Chapter 5 of
> "Programming Pearls" contains a bug. Once I tell you what the it is, you
> will understand why it escaped detection for two decades.
> [end quote]
>
> http://northernplanets.blogspot.com/2006/07/nearly-all-binary-searches-are-broken.html
>
> or http://tinyurl.com/nco6yv

The actual report is at
http://googleresearch.blogspot.com/2006/06/extra-extra-read-all-about-it-nearly.html

This is the so-called bug:
"In Programming Pearls Bentley says that the analogous line "sets m to
the average of l and u, truncated down to the nearest integer." On the
face of it, this assertion might appear correct, but it fails for large
values of the int variables low and high. Specifically, it fails if the
sum of low and high is greater than the maximum positive int value
(2^31 - 1). The sum overflows to a negative value, and the value stays
negative when divided by two. In C this causes an array index out of
bounds with unpredictable results. In Java, it throws
ArrayIndexOutOfBoundsException."

The is *not* a bug is Bentley program. It is a bug in bad, buggy, insane
integer arithmetic implementations. low + high should either return the
right answer, as it will in Python, or raise an overflow error.
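
To see the wrap-around concretely, here is a small sketch (an illustration
only, not Bentley's code) that reinterprets the sum as a 32-bit
two's-complement value, the way C's int does:

def int32(n):
    # Interpret n as a 32-bit two's-complement signed int.
    n &= 0xFFFFFFFF
    return n - 2**32 if n >= 2**31 else n

low, high = 2**30, 2**31 - 2        # plausible indexes into a huge array
print((low + high) // 2)             # 1610612735 -- a sensible midpoint
print(int32(low + high) // 2)        # negative -- the wrapped sum is useless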

tjr

Steven D'Aprano

unread,
Jul 9, 2009, 11:41:43 PM7/9/09
to

Wow. That's an impressive set of typos :)


> It is a bug in bad, buggy, insane
> integer arithmetic implementations. low + high should either return the
> right answer, as it will in Python, or raise an overflow error.

Is binary search *supposed* to raise an overflow error if given more than
2**30 elements? If not, then you're solving a different problem: it's
just like binary search, but it only works up to a maximum number of
elements. Now perhaps that's a legitimate problem to solve, but unless
you make the change in behaviour clear up front, your code has failed to
live up to specifications and therefore is buggy.

Is there something special about OverflowError that is "better" than
ArrayIndexOutOfBoundsException?

I don't have "Programming Pearls", and unfortunately the section on
binary search is not online:

http://netlib.bell-labs.com/cm/cs/pearls/sketch04.html

but there's no reason to imagine that the book will assume -- or even
state as a requirement for binary search -- that addition will never
overflow. Far from it: it seems to me that the author is very aware of
real world constraints, and in the early 1980s, BigInt types were not
something you took as a given.

Of course *binary search* as an algorithm is not broken. The bug was that
"Programming Pearls" neglected to take into account overflow when you
have more than 2**30 items. One way to take that into account is to
insist on using a language like Python where addition can never overflow.
But that's not the code presented in "Programming Pearls".


--
Steven

Hendrik van Rooyen

unread,
Jul 10, 2009, 6:54:21 AM7/10/09
to pytho...@python.org
"Steven D'Aprano" <steve@REMOVE-THIS-cy....e.com.au> wrote:

>On Wed, 08 Jul 2009 22:05:57 -0700, Simon Forman wrote:
>
>>> persistent idea "out there" that programming is a very accessible
>>> skill, like cooking or gardening, anyone can do it, and even profit
>>> from it, monetarily or otherwise, etc., and to some extent I am
>>
>> Programming is not like any other human activity.
>
>In practice? In principle? Programming in principle is not the same as it
>is performed in practice.
>
>But in either case, programming requires both the logical reasoning of
>mathematics and the creativity of the arts. Funnily enough,

I do not buy this arty creativity stuff. - or are you talking about
making a website look pretty?

>mathematicians will tell you that mathematics requires the same, and so
>will the best artists. I think mathematicians, engineers, artists, even
>great chefs, will pour scorn on your claim that programming is not like
>any other human activity.

So a chef is now an authority on programming?

Programming is actually kind of different - almost everything else is
just done, at the time that you do it.

Programming is creating stuff that is completely useless until it is
fed into something that uses it, to do something else, in conjunction
with the thing it is fed into, at a later time.

This is a highly significant difference, IMHO.


>
>
>[...]
>> He talks about how "when all is said and done, the only thing computers
>> can do for us is to manipulate symbols and produce results of such
>> manipulations" and he emphasises the "uninterpreted" nature of
>> mechanical symbol manipulation, i.e. that the machine is doing it
>> mindlessly.
>
>"Manipulate symbols" is so abstract as to be pointless. By that
>reasoning, I can build a "computer" consisting of a box open at the top.
>I represent a symbol by an object (say, a helium-filled balloon, or a
>stone), instead of a pattern of bits. I manipulate the symbol by holding
>the object over the box and letting go. If it flies up into the sky, that
>represents the symbol "Love is War", if it falls into the box, it
>represents the symbol "Strength is Blue", and if it just floats there, it
>represents "Cheddar Cheese". This is a deterministic, analog computer
>which manipulates symbols. Great.
>
>And utterly, utterly useless. So what is my computer lacking that real
>computers have? When you have answered that question, you'll see why
>Dijkstra's claim is under-specified.
>

So if computers do not manipulate symbols, what is it that they do?
They sure cannot think,
or drink,
or reason,
or almost any verb you can think of.

"Manipulating symbols" is actually an elegant definition.
Try coming up with a better one and you will see.

And calling an abstraction pointless kind of contradicts
what you say later...

8<------- camel humps and other stuff -------------------

- Hendrik

Steven D'Aprano

unread,
Jul 10, 2009, 9:11:09 AM7/10/09
to
On Fri, 10 Jul 2009 12:54:21 +0200, Hendrik van Rooyen wrote:

> "Steven D'Aprano" <steve@REMOVE-THIS-cy....e.com.au> wrote:
>
>>On Wed, 08 Jul 2009 22:05:57 -0700, Simon Forman wrote:
>>
>>>> persistent idea "out there" that programming is a very accessible
>>>> skill, like cooking or gardening, anyone can do it, and even profit
>>>> from it, monetarily or otherwise, etc., and to some extent I am
>>>
>>> Programming is not like any other human activity.
>>
>>In practice? In principle? Programming in principle is not the same as
>>it is performed in practice.
>>
>>But in either case, programming requires both the logical reasoning of
>>mathematics and the creativity of the arts. Funnily enough,
>
> I do not buy this arty creativity stuff. - or are you talking about
> making a website look pretty?

I must admit, it never crossed my mind that anyone here would claim that
there was no creativity involved in programming, that it was all a
mindless, algorithmic process capable of being done by a simple
mechanical device.

This is certainly the accusation made against *bad* programmers -- that
they can't actually solve new, unique problems, but just apply recipes
they learned without any insight or intelligence. The sort of people who
program so poorly that "a trained monkey could do what they do".

Do you really think that applies to good programmers too? If so, then a
good code generator should be able to replace any programmer. Is that
what you believe?



>>mathematicians will tell you that mathematics requires the same, and so
>>will the best artists. I think mathematicians, engineers, artists, even
>>great chefs, will pour scorn on your claim that programming is not like
>>any other human activity.
>
> So a chef is now an authority on programming?

Did I say that?

Chefs are authorities on OTHER HUMAN ACTIVITIES.

> Programming is actually kind of different - almost everything else is
> just done, at the time that you do it.
>
> Programming is creating stuff that is completely useless until it is fed
> into something that uses it, to do something else, in conjuction with
> the thing it is fed into, at a later time.

Somebody should teach Hendrik that human beings have been creating TOOLS
for hundreds of thousands of years. People have been creating tools to
build tools for thousands of years. Software is just more of the same.

Even *soup stock* fits the same profile as what Hendrik claims is almost
unique to programming. On its own, soup stock is totally useless. But you
make it, now, so you can feed it into something else later on.

Or instant coffee.

No, Hendrik, if that's the best you can do, it's not very good. It is
rather sad, but also hilarious, that the most different thing you have
noticed about software is that it's just like instant coffee.


> This is a highly significant difference, IMHO.


>>[...]
>>> He talks about how "when all is said and done, the only thing
>>> computers can do for us is to manipulate symbols and produce results
>>> of such manipulations" and he emphasises the "uninterpreted" nature of
>>> mechanical symbol manipulation, i.e. that the machine is doing it
>>> mindlessly.
>>
>>"Manipulate symbols" is so abstract as to be pointless. By that
>>reasoning, I can build a "computer" consisting of a box open at the top.
>>I represent a symbol by an object (say, a helium-filled balloon, or a
>>stone), instead of a pattern of bits. I manipulate the symbol by holding
>>the object over the box and letting go. If it flies up into the sky,
>>that represents the symbol "Love is War", if it falls into the box, it
>>represents the symbol "Strength is Blue", and if it just floats there,
>>it represents "Cheddar Cheese". This is a deterministic, analog computer
>>which manipulates symbols. Great.
>>
>>And utterly, utterly useless. So what is my computer lacking that real
>>computers have? When you have answered that question, you'll see why
>>Dijkstra's claim is under-specified.
>>
>>
> So if computers do not manipulate symbols, what is it that they do?

Did I say they don't manipulate symbols?


> They
> sure cannot think,
> or drink,
> or reason,

They can't reason? Then what are they doing when they manipulate symbols?

Yet again, it didn't even cross my mind that somebody would make this
claim. My entire point is that it's not enough to just "manipulate
symbols", you have to manipulate symbols the correct way, following laws
of logic, so that the computer can *mindlessly* reason.


> or almost any verb you can think of.

If you're going to take that argument, then I'll remind you that there
are no symbols inside a computer. There are only bits. And in fact, there
aren't even any bits -- there are only analog voltages, and analog
magnetic fields.


> "Manipulating symbols" is actually an elegant definition. Try coming up
> with a better one and you will see.

As I said above, it's not enough to just manipulate symbols. Here's a set
of rules to manipulate symbols:

X => 0

or in English, "Any symbol becomes the zero symbol".

That's symbol manipulation. Utterly useless. This is why it's not enough
to just manipulate symbols, you have to manipulate them THE RIGHT WAY.

--
Steven

Scott David Daniels

unread,
Jul 10, 2009, 11:28:29 AM7/10/09
to
Steven D'Aprano wrote:
> Even *soup stock* fits the same profile as what Hendrik claims is almost
> unique to programming. On its own, soup stock is totally useless. But you
> make it, now, so you can feed it into something else later on.
>
> Or instant coffee.

I think I'll avoid coming to your house for a cup of coffee. :-)

--Scott David Daniels
Scott....@Acm.Org

Steven D'Aprano

unread,
Jul 10, 2009, 11:48:47 AM7/10/09
to
On Fri, 10 Jul 2009 08:28:29 -0700, Scott David Daniels wrote:

> Steven D'Aprano wrote:
>> Even *soup stock* fits the same profile as what Hendrik claims is
>> almost unique to programming. On its own, soup stock is totally
>> useless. But you make it, now, so you can feed it into something
>> else later on.
>>
>> Or instant coffee.
>
> I think I'll avoid coming to your house for a cup of coffee. :-)


I meant the instant coffee powder is prepared in advance. It's useless on
its own, but later on you feed it into boiling water, add sugar and
milk, and it's slightly less useless.

--
Steven

D'Arcy J.M. Cain

unread,
Jul 10, 2009, 12:09:28 PM7/10/09
to Steven D'Aprano, pytho...@python.org
On 10 Jul 2009 15:48:47 GMT

Steven D'Aprano <st...@REMOVE-THIS-cybersource.com.au> wrote:
> I meant the instant coffee powder is prepared in advance. It's useless on
> it's own, but later on you feed it into boiling water, add sugar and
> milk, and it's slightly less useless.

I don't know about that. I find instant coffee pretty useless no
matter what it is fed to. :-)

--
D'Arcy J.M. Cain <da...@druid.net> | Democracy is three wolves
http://www.druid.net/darcy/ | and a sheep voting on
+1 416 425 1212 (DoD#0082) (eNTP) | what's for dinner.

pdpi

unread,
Jul 10, 2009, 12:23:55 PM7/10/09
to
On Jul 10, 2:11 pm, Steven D'Aprano <st...@REMOVE-THIS-

cybersource.com.au> wrote:
> On Fri, 10 Jul 2009 12:54:21 +0200, Hendrik van Rooyen wrote:
> > "Steven D'Aprano" <steve@REMOVE-THIS-cy....e.com.au> wrote:
>
> >>On Wed, 08 Jul 2009 22:05:57 -0700, Simon Forman wrote:
>
> >>>> persistent idea "out there" that programming is a very accessible
> >>>> skill, like cooking or gardening, anyone can do it, and even profit
> >>>> from it, monetarily or otherwise, etc., and to some extent I am
>
> >>> Programming is not like any other human activity.
>
> >>In practice? In principle? Programming in principle is not the same as
> >>it is performed in practice.
>
> >>But in either case, programming requires both the logical reasoning of
> >>mathematics and the creativity of the arts. Funnily enough,
>
> > I do not buy this arty creativity stuff. - or are you talking about
> > making a website look pretty?
>
> I must admit, it never crossed my mind that anyone here would claim that
> there was no creativity involved in programming, that it was all a
> mindless, algorithmic process capable of being done by a simple
> mechanical device.
>
> This is certainly the accusation made against *bad* programmers -- that
> they can't actually solve new, unique problems, but just apply recipes
> they learned without any insight or intelligence. The sort of people who
> program so poorly that "a trained monkey could do what they do".

I wholeheartedly agree. Coming up with Duff's device is nothing if not
creative. My mind still reels at trying to grok it.
http://www.lysator.liu.se/c/duffs-device.html

> Even *soup stock* fits the same profile as what Hendrik claims is almost
> unique to programming. On its own, soup stock is totally useless. But you
> make it, now, so you can feed it into something else later on.
>
> Or instant coffee.

I've always found cooking an apt metaphor for programming.

You've got your well-limited for loops (cook for x minutes), your less
straightforward while/until loops (roast until golden), you have your
subprocedures (prepare sauce in advance/in parallel), you have some
conditionals (tenderize the steak if the meat isn't really that
tender), etc etc.

The complexities of assignment can be easily visualized in terms of
containers and mixing stuff together. Nothing makes a += b more
obvious than having a bowl of cream (a), an egg (b), and adding the
egg to the bowl of cream (a += b). Well, except for the part where,
in that case, evaluating b is destructive ;)
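
In Python terms, the bowl-of-cream picture maps straight onto the
mutable/immutable split from the start of the thread; a rough sketch (the
names here are invented for the occasion):

cream = ["cream"]       # a mutable bowl
bowl = cream            # a second name for the same bowl
cream += ["egg"]        # in-place: the bowl itself gains the egg
print(bowl)             # ['cream', 'egg'] -- both names see the change

count = 1               # ints are immutable
alias = count
count += 1              # rebinding: count now names a brand-new object
print(alias)            # 1 -- the alias is untouched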

> They can't reason? Then what are they doing when they manipulate symbols?

"Computers aren't intelligent. They only think they are." Or, more to
the point: the typical definition of "reasoning" tends to involve more
of what defines humans as self-aware, animate beings than what is
usually ascribed to computers.

Wesley Chun

unread,
Jul 10, 2009, 1:48:43 PM7/10/09
to
On Jul 7, 1:04 pm, kj <no.em...@please.post> wrote:
> I'm having a hard time coming up with a reasonable way to explain
> certain things to programming novices.
> :
> How do I explain to rank beginners (no programming experience at
> all) why x and y remain unchanged above, but not z?
> :
> What do you say to that?
>
> I can come up with much mumbling about pointers and stacks and
> heaps and much hand-waving about the underlying this-and-that, but
> nothing that sounds even remotely illuminating.
>
> Your suggestions would be much appreciated!


kj,

i don't have too much to add to everyone else's response except to
describe how i deal with this. i teach Python courses several times a
year and realized long ago that conveying the concept of mutable vs.
immutable is a key to getting up-to-speed quickly with Python as well
as helping beginners.

so, although technically, this is more of an intermediate topic rather
than "beginner" material, i still teach it anyway, with the hopes of
producing better Python programmers out of the gate, and hopefully,
less frustrated ones. in fact, i dedicated an entire chapter (4) in
Core Python Programming just to address this important issue. to top
it all off, i end this module in the class by giving 2 quizzes, just
to make sure they understood what i just told them. i put the 1st one
online, so if you're curious, the PDF is at http://roadkill.com/~wesc/cyberweb/introQuiz.pdf
... the 2nd quiz is harder and involves the discussion of the
differences between shallow and deep copies. so yes, not very beginner-
ish stuff, hence the reason why i (re)named my course "Intro
+Intermediate Python".

finally, rather than the "paper tag" or alex's hotel statue analogy, i
just say that variables are like Post-It® or sticky notes on
objects. i can tag objects anytime, tag objects more than once, remove
tags, or switch them to another object, etc.
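
in code, the sticky-note picture looks something like this (again, just an
illustration):

note_a = [1, 2, 3]    # stick note "note_a" on a list object
note_b = note_a       # stick a second note on the *same* object
note_b.append(4)
print(note_a)         # [1, 2, 3, 4] -- one object, two notes
note_a = "spam"       # move note "note_a" onto a different object
print(note_b)         # [1, 2, 3, 4] -- the list keeps its other note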

just my $0.02,
-- wesley
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
"Core Python Programming", Prentice Hall, (c)2007,2001
"Python Fundamentals", Prentice Hall, (c)2009
http://corepython.com

wesley.j.chun :: wescpy-at-gmail.com
python training and technical consulting
cyberweb.consulting : silicon valley, ca
http://cyberwebconsulting.com

Scott David Daniels

unread,
Jul 10, 2009, 2:59:50 PM7/10/09
to

I know, but the image of even a _great_ soup stock with instant
coffee poured in, both appalled me and made me giggle. So, I
thought I'd share.

--Scott David Daniels
Scott....@Acm.Org

Terry Reedy

unread,
Jul 10, 2009, 4:27:12 PM7/10/09
to pytho...@python.org
Steven D'Aprano wrote:
> On Thu, 09 Jul 2009 23:07:34 -0400, Terry Reedy wrote:

>>
>> The is *not* a bug is Bentley program.

This is *not* a bug in Bentley's program.

> Wow. That's an impressive set of typos :)

3. Way beneath my usual standards ;-)

>> It is a bug in bad, buggy, insane
>> integer arithmetic implementations. low + high should either return the
>> right answer, as it will in Python, or raise an overflow error.
>
> Is binary search *supposed* to raise an overflow error if given more than
> 2**30 elements?

No. It is supposed to work, and Bentley's program will work if lo and hi
are actually integers, as his program presupposes, and not residue
classes mislabeled 'int'.

> Is there something special about OverflowError that is "better" than
> ArrayIndexOutOfBoundsException?

Wrong comparison. The actual alternative to OverflowError is a negative
number as the sum of two positive numbers. If one claims that a and b
are positive ints, then returning a negative is a bug. The index
exception was a side effect, in Java, of using the negative as an index.
In C, the side effect was a segmentation fault. Do you seriously
question that OverflowError is better than seg faulting? A different
program could go on to silently return a wrong answer. Perhaps it would
say to go west instead of east, or to reduce a medical dosage instead of
raising it.

Note that negative ints are legal indexes in Python, so that if any
Python version ever had ints that wrapped instead of overflowing or
returning a long, then the program would not necessarily stop.

> but there's no reason to imagine that the book will assume -- or even
> state as a requirement for binary search -- that addition will never
> overflow.

Bentley assumed that if (lo+hi)/2 were successfully calculated, then it
would be a positive number between lo and hi, and therefore a legal index.

The dirty secret of computing is that some languages do not have integer
types. They do not even have restricted-range integer types, which would
noisily fail when they cannot perform an operation. They only have
residue class types, which are something else, and which can give
bizarre results if one mindlessly uses them as if they really were
restricted-range integers. When one wants integer operations, every
operation with two residue-class variables has a potential *silent* bug.
Being relieved of the burden of constantly keeping this in mind is one
reason I use Python.

Float types are usually restricted range types. A huge_float +
huge_float may raise overflow, but never returns a negative float.

Bentley assumed that indexes hi and lo are integers. If one uses
restricted range integers that are too large, then lo+hi could fail
gracefully with an appropriate error message. I would not consider that
a bug, but a limitation due to using limited-range numbers. If one uses
residue classes instead of integers, and makes no adjustment, I consider
it wrong to blame Bentley.

Terry Jan Reedy

I V

unread,
Jul 10, 2009, 4:54:08 PM7/10/09
to
On Fri, 10 Jul 2009 16:27:12 -0400, Terry Reedy wrote:
> a bug, but a limitation due to using limited-range numbers. If one uses
> residue classes instead of integers, and makes no adjustment, I consider
> it wrong to blame Bentley.

But it was Bentley himself who used the C int type, so it hardly seems
unreasonable to blame him.

Terry Reedy

unread,
Jul 10, 2009, 7:48:58 PM7/10/09
to pytho...@python.org

The original program that he verified was in pseudocode equivalent to
the following (not tested) Python. The odd spec is due to Bentley using
1-based arrays. From the 1986 book Programming Pearls

def binsearch(x, t):
    "If t is in sorted x[1:n], return its index; otherwise 0"
    #
    L, U = 1, len(x) - 1
    while True:
        if L > U: return 0
        m = (L + U) / 2
        if x[m] < t: L = m + 1
        elif x[m] == t: return m
        elif x[m] > t: U = m - 1

He then translated into an unspecified dialect of BASIC, but it is
consistent with Microsoft GW-BASIC. If so, L and U were float variables,
while the largest possible array size for x was 32767. So as near as I
can tell with that dialect, there was no possible problem with L+U
overflowing or wrapping. Other dialects, I am not sure.

So I revise my statement to "I consider it wrong to blame the Bentley
that wrote the original program" ;-).

If he later translated to C and fell into the residue class trap, then
that reinforces my contention that using residue classes as ints is
error prone.

Terry Jan Reedy

John Nagle

unread,
Jul 10, 2009, 8:02:19 PM7/10/09
to
Bearophile wrote:
> kj, as Piet van Oostrum as said, that's the difference between mutable
> an immutable. It comes from the procedural nature of Python, and
> probably an explanation of such topic can't be avoided if you want to
> learn/teach Python.

The problem with Python's mutable/immutable distinction is that
it's not very visible. In a language with no declarations, no named
constants, and no parameter attributes like "value", "in", "out", or
"const", it's not at all obvious which functions can modify what.
Which is usually the place where this causes a problem.

The classic, of course, is

def additemtolist(item, lst=[]):
    lst.append(item)
    return lst

lista = additemtolist(1)
listb = additemtolist(2)
print(listb)

The general problem is inherent in Python's design. This particular problem,
though, results in the occasional suggestion that parameters shouldn't
be allowed to have mutable defaults.
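
The usual workaround (the standard None-sentinel idiom; this code is not
from the post above) looks like this:

def additemtolist(item, lst=None):
    # Build a fresh list on each call instead of sharing one default.
    if lst is None:
        lst = []
    lst.append(item)
    return lst

lista = additemtolist(1)
listb = additemtolist(2)
print(listb)    # [2] -- no state shared with lista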

John Nagle

Piet van Oostrum

unread,
Jul 11, 2009, 3:54:19 AM7/11/09
to
>>>>> I V <ivl...@gmail.com> (IV) wrote:

>IV> On Fri, 10 Jul 2009 16:27:12 -0400, Terry Reedy wrote:
>>> a bug, but a limitation due to using limited-range numbers. If one uses
>>> residue classes instead of integers, and makes no adjustment, I consider
>>> it wrong to blame Bentley.

>IV> But it was Bentley himself who used the C int type, so it hardly seems
>IV> unreasonable to blame him.

If you are on a 32-bit machine, and the array to be searched contains
ints, floats or doubles, then the array must be < 2^32 bytes in size, and
as each element is at least 4 bytes, the indices are less than 2^30, so
l+u < 2^31. Therefore there is no overflow at all. I think the Bentley
programs were formulated in terms of arrays of ints. So his
implementations were safe.

If you are on a 64-bit machine you shouldn't use int for the indices
anyway (supposing int is 32 bits) but longs and then the same reasoning
shows that there are no overflows. Only when you have an array of shorts
or bytes (chars) you get the problem.

In that case the alternative formulation l + (u-l)/2 is more
robust and therefore preferable.
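
A sketch of that formulation in Python (0-based indexes, returning -1 when
the target is absent; the naming is mine, not Bentley's):

def binsearch(x, t):
    # Overflow-safe midpoint: lo + (hi - lo) // 2 never exceeds hi,
    # which matters in languages with fixed-width ints.
    lo, hi = 0, len(x) - 1
    while lo <= hi:
        mid = lo + (hi - lo) // 2
        if x[mid] < t:
            lo = mid + 1
        elif x[mid] > t:
            hi = mid - 1
        else:
            return mid
    return -1

print(binsearch([2, 3, 5, 7, 11], 7))   # 3
print(binsearch([2, 3, 5, 7, 11], 4))   # -1
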
--
Piet van Oostrum <pi...@cs.uu.nl>
URL: http://pietvanoostrum.com [PGP 8DAE142BE17999C4]
Private email: pi...@vanoostrum.org

Hendrik van Rooyen

unread,
Jul 11, 2009, 8:01:25 AM7/11/09
to pytho...@python.org
"Steven D'Aprano" <steve@REMOVE-THIS-cyb.....ce.com.au> wrote:

>On Fri, 10 Jul 2009 12:54:21 +0200, Hendrik van Rooyen wrote:
>
>> "Steven D'Aprano" <steve@REMOVE-THIS-cy....e.com.au> wrote:
>>
>>>On Wed, 08 Jul 2009 22:05:57 -0700, Simon Forman wrote:
>>>
>>>>> persistent idea "out there" that programming is a very accessible
>>>>> skill, like cooking or gardening, anyone can do it, and even profit
>>>>> from it, monetarily or otherwise, etc., and to some extent I am
>>>>
>>>> Programming is not like any other human activity.
>>>
>>>In practice? In principle? Programming in principle is not the same as
>>>it is performed in practice.
>>>
>>>But in either case, programming requires both the logical reasoning of
>>>mathematics and the creativity of the arts. Funnily enough,
>>
>> I do not buy this arty creativity stuff. - or are you talking about
>> making a website look pretty?
>
>I must admit, it never crossed my mind that anyone here would claim that
>there was no creativity involved in programming, that it was all a
>mindless, algorithmic process capable of being done by a simple
>mechanical device.

"Programming" is the step of going from the "design" to something
that tells the machine how to implement the design.

The creativity could, arguably, be in the "Design".
Not in the translation to python, or assembler.
No way. That is just coding.

>
>This is certainly the accusation made against *bad* programmers -- that
>they can't actually solve new, unique problems, but just apply recipes
>they learned without any insight or intelligence. The sort of people who
>program so poorly that "a trained monkey could do what they do".
>
>Do you really think that applies to good programmers too? If so, then a
>good code generator should be able to replace any programmer. Is that
>what you believe?
>

Should eventually be possible, with sufficient restrictions to start off.
UML wants to go this route...

But may my eyes be stopped and my bones be heaped with dust
ere I see the day...

>
>
>>>mathematicians will tell you that mathematics requires the same, and so
>>>will the best artists. I think mathematicians, engineers, artists, even
>>>great chefs, will pour scorn on your claim that programming is not like
>>>any other human activity.
>>
>> So a chef is now an authority on programming?
>
>Did I say that?

No. I just read it like that to irritate you.

>
>Chefs are authorities on OTHER HUMAN ACTIVITIES.
>
>
>
>> Programming is actually kind of different - almost everything else is
>> just done, at the time that you do it.
>>
>> Programming is creating stuff that is completely useless until it is fed
>> into something that uses it, to do something else, in conjunction with
>> the thing it is fed into, at a later time.
>
>Somebody should teach Hendrik that human beings have been creating TOOLS
>for hundreds of thousands of years. People have been creating tools to
>build tools for thousands of years. Software is just more of the same.

I disagree - I actually own some machine tools, so I am a little
bit acquainted with what they can do, and it is not at all like computing
at any level I can think of - they are merely extensions of the hand, making
the transformation of materials more accurate and faster.

The line only becomes blurred when a processor is added, and a STORED
PROGRAM is brought into the equation.

>
>Even *soup stock* fits the same profile as what Hendrik claims is almost
>unique to programming. On its own, soup stock is totally useless. But you
>make it, now, so you can feed it into something else later on.
>
>Or instant coffee.
>
>No, Henrik, if that's the best you can do, it's not very good. It is
>rather sad, but also hilarious, that the most different thing you have
>noticed about software is that it's just like instant coffee.
>

You have a wonderful ability to grab hold of part of a definition
and to ignore the rest, just like I can misread what you write.

Coffee and soup stay coffee and soup on rehydration. Mixing it
in with something else is not at all the same - it does not "DO" anything
else in conjunction with the thing it is fed into - how is that like
programming, and executing a program?

I am sorry if you are confusing the drinking of coffee,
which is an ancillary activity to programming, with the
actual programming itself.

>
>
>> This is a highly significant difference, IMHO.
>
>
>>>[...]
>>>> He talks about how "when all is said and done, the only thing
>>>> computers can do for us is to manipulate symbols and produce results
>>>> of such manipulations" and he emphasises the "uninterpreted" nature of
>>>> mechanical symbol manipulation, i.e. that the machine is doing it
>>>> mindlessly.
>>>
>>>"Manipulate symbols" is so abstract as to be pointless. By that
>>>reasoning, I can build a "computer" consisting of a box open at the top.
>>>I represent a symbol by an object (say, a helium-filled balloon, or a
>>>stone), instead of a pattern of bits. I manipulate the symbol by holding
>>>the object over the box and letting go. If it flies up into the sky,
>>>that represents the symbol "Love is War", if it falls into the box, it
>>>represents the symbol "Strength is Blue", and if it just floats there,
>>>it represents "Cheddar Cheese". This is a deterministic, analog computer
>>>which manipulates symbols. Great.
>>>
>>>And utterly, utterly useless. So what is my computer lacking that real
>>>computers have? When you have answered that question, you'll see why
>>>Dijkstra's claim is under-specified.
>>>
>>>
>> So if computers do not manipulate symbols, what is it that they do?
>
>Did I say they don't manipulate symbols?

No you wanted someone to tell you how to make your
box machine useful, and that was too difficult, so I went this way.

>
>
>> They
>> sure cannot think,
>> or drink,
>> or reason,
>
>They can't reason? Then what are they doing when they manipulate symbols?

They simply manipulate symbols.
They really do.
There is no reasoning involved.

Any reasoning that is done, was done in the mind of the human who designed
the system, and the the machine simply manipulates the implementation of
the abstract symbols, following the implementation of the abstract rules that
were laid down. No reasoning at run time at all.

Lots of decision making though - and it is this ability to do this now, and
something else later, that tricks the casual observer into thinking that there
is somebody at home - a reasoning ghost in the machine.

>
>Yet again, it didn't even cross my mind that somebody would make this
>claim. My entire point is that it's not enough to just "manipulate
>symbols", you have to manipulate symbols the correct way, following laws
>of logic, so that the computer can *mindlessly* reason.

No.

There is no requirement for dragging in things like "the laws of logic"
or any arbitrarily chosen subset. You can build a machine to do
essentially "anything" - and people like Turing and his ilk have
spent a lot of skull sweat to try to define what is "general purpose",
to enable you to do "anything"

>
>
>> or almost any verb you can think of.
>
>If you're going to take that argument, then I'll remind you that there
>are no symbols inside a computer. There are only bits. And in fact, there
>aren't even any bits -- there are only analog voltages, and analog
>magnetic fields.
>

Duh - sorry to hear that - I have only been doing digital electronics
now for more years than I care to remember - but maybe there
is a difference in how you define "analogue". - maybe something like:
"if I can measure it with a multimeter, it is an analogue signal, because
a multimeter is capable of measuring analogue signals" - a bit fallacious.

More seriously, I again differ on the symbol bit - I am, right now,
working on a little processor that mindlessly does I/O, by manipulating
bits from one representation to another. In this case, the symbols
and the bits are mostly identical - but please do not try to tell
me that there are no symbols inside the little processors'
memory - I know they are there, because I have arranged for
them to be put there, via an RS-232 feed.

And my stored program actually arranges for the incoming symbols
to do things like activating output relays, and changes on input lines
actually cause other symbols to be transmitted out of the RS-232
port. The whole thing is simply rotten with symbology - in fact all
it does, is manipulating the implementation of the abstract symbols
in my mind.
It does so mindlessly.
This specific thing is so simple, that it hardly uses the
"logical ability" of the processor. And please remember that
for a function with two input bits and a single output bit, there are
only sixteen possible functions, not all of which are useful, for
some definition of useful. But they exist, and you can use them, if
you want. And you do not even have to be consistent, if you do not
want to be, or if it is convenient not to be.
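
Counting them explicitly (a throwaway sketch, nothing to do with the actual
I/O processor):

from itertools import product

# Every function from two input bits to one output bit is a choice of
# output for each of the four input rows, so there are 2**4 == 16 of
# them: AND, OR, XOR, NAND, constant 0, and so on.
rows = list(product((0, 1), repeat=2))
for n in range(16):
    outputs = [(n >> i) & 1 for i in range(4)]
    print("%2d %s" % (n, dict(zip(rows, outputs))))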

I think it is this horrendous freedom that Dijkstra was talking about
when he was referring to the "power beyond your wildest dreams",
or some such.

>
>> "Manipulating symbols" is actually an elegant definition. Try coming up
>> with a better one and you will see.
>
>As I said above, it's not enough to just manipulate symbols. Here's a set
>of rules to manipulate symbols:
>
>X => 0
>
>or in English, "Any symbol becomes the zero symbol".
>
>That's symbol manipulation. Utterly useless. This is why it's not enough
>to just manipulate symbols, you have to manipulate them THE RIGHT WAY.
>

No.
There is no RIGHT WAY.
Really.
There are only ways that are useful, or convenient.

In my little processor, I can cause any result
to occur, and the rules can be as silly as your
example - if it is useful for you, under some
circumstances, for all symbols to become the
symbol for zero, then go for it - The point is that
there is nothing prohibiting it to be adopted as a
rule for a machine to implement.

/dev/null anyone?

- Hendrik

Hendrik van Rooyen

unread,
Jul 11, 2009, 8:14:55 AM7/11/09
to pytho...@python.org
"pdpi" <pd....iro@gmail.com> wrote;

>I've always found cooking an apt metaphor for programming.

No this is wrong.

Writing a recipe or a cookbook is like programming.

Cooking, following a recipe, is like running a program.

- Hendrik

D'Arcy J.M. Cain

unread,
Jul 11, 2009, 11:22:42 AM7/11/09
to Hendrik van Rooyen, pytho...@python.org
On Sat, 11 Jul 2009 14:01:25 +0200
"Hendrik van Rooyen" <ma...@microcorp.co.za> wrote:
> "Programming" is the step of going from the "design" to something
> that tells the machine how to implement the design.
>
> The creativity could, arguably, be in the "Design".
> Not in the translation to python, or assembler.
> No way. That is just coding.

One might also argue that divorcing the design from the code is the
problem in a lot of legacy code. See Agile Programming methods. Now
you could say that there is a design step still in talking to the
client and making a plan in your head or in some notes but that's like
saying that Michelangelo was done creating after discussing the Sistine
Chapel with Pope Julius II and that the rest was just a house painting job.

Hendrik van Rooyen

unread,
Jul 11, 2009, 11:10:04 AM7/11/09
to pytho...@python.org
John O'Hagan wrote:


>The drawings produced by an architect, the script of a play, the score of a
>piece of music, and the draft of a piece of legislation are all examples of
>other things which are "useless" until they are interpreted in some way.

Granted.
But...

>There are countless human activities which require a program, i.e. a conscious
>plan or strategy, formed at least partly by a creative process, and a computer
>program is just a special case of this.

The difference is that for a piece of music, or a recipe for
"bobotie", there is room for artistry also in the performance
or preparation.

If the piece of music is reduced to a scroll with holes in it
and played on a pianola, then nobody in his right mind
would call the mechanically produced sounds
"a great performance".

And this is the essential difference that sets programming apart from
cookbook writing or drawing architectural plans - the one thing that
makes the real difference is the mindlessness of the performer.

(maybe I should concede that writing legislation is a form of
programming, as the constabulary does not have a reputation
for anything other than a plodding consistency)

>I use Python as a tool for writing music, but I find I need both logical
>reasoning and creativity to do either. In fact, I find programming very similar
>to writing music in a rigorous contrapuntal style, where each set of choices
>constrains each other, and there is a deep aesthetic satisfaction in getting it
>right.

Getting it right has to do with the design, not the programming -

Have you ever struggled to get something right, and then one day
you take a different tack, and suddenly it is as if you cannot do
anything wrong - everything just falls into place? - That is the time
that you get the good feeling.

The difference between the one and the other is the difference
between bad or indifferent design and good design. When you
have a good design, the rest follows. If your design is crud, no
matter what you do, you struggle and nothing comes out just
right, despite heroic effort.

Now this stuff is easy to talk about, and immensely difficult to
do - it takes experience, some skill, some cunning, and a
willingness to experiment in an egoless fashion, amongst other
things.

And then the stupid compiler completely screws up your intent...
:-)

- Hendrik

Hendrik van Rooyen

unread,
Jul 11, 2009, 11:58:32 AM7/11/09
to pytho...@python.org, Paul Rubin
"D'Arcy J.M. Cain" <d..@druid.net>

> One might also argue that divorcing the design from the code is the
> problem in a lot of legacy code. See Agile Programming methods. Now
> you could say that there is a design step still in talking to the
> client and making a plan in your head or in some notes but that's like
> saying that Michelangelo was done creating after discussing the Sistine
> Chapel with Pope Sixtus and that the rest was just a house painting job.

How do you know that it was not exactly like that - he did, after all, take
a much longer time than expected to complete the job - just like a house
painter that gets paid by the hour. :-)

It is also unreasonable to assume the opposite fallacy - that he was in
a creative frenzy from the start to the time at the end when he was
cleaning his brush after applying the last spot of paint.

But it is a valid point - it is often difficult to draw the line between the
design and the implementation - and one of the reasons that we all
like to program in python, is that it is almost a design language
that can also be run directly. - I have lately been doing
stuff like writing rough prototypes using python syntax to serve as
designs for some of the assembler thingies I do. If you muck around
a bit at the lower level, you will also be more aware of the dichotomy
between the design and the implementation - in python, it is not obvious
at all. In assembler, it is glaring. But in either place, if you get it wrong,
you suffer. - your programs limp along, or they hardly do what
you thought they would, and they bear no relationship with what the
customer wanted.

- Hendrik

greg

unread,
Jul 11, 2009, 9:28:17 PM7/11/09
to
Hendrik van Rooyen wrote:

> The creativity could, arguably, be in the "Design".
> Not in the translation to python, or assembler.
> No way. That is just coding.

No, the mechanical part of the process is called compiling,
and we have programs to do it for us.

By the time you've specified the design so rigorously
that not the slightest spark of creativity is needed
to implement it, you *have* coded it.

--
Greg

Calroc

unread,
Jul 19, 2009, 11:21:01 AM7/19/09
to
On Jul 9, 1:20 pm, Steven D'Aprano <st...@REMOVE-THIS-
cybersource.com.au> wrote:
[...]

> You'll excuse my skepticism about all these claims about how anyone can
> program, how easy it is to teach the fundamentals of Turing Machines and
> functional programming to anybody at all. Prove it. Where are your peer-
> reviewed studies demonstrating such successes on randomly chosen people,
> not self-selected motivated, higher-than-average intelligence students?

I'm engaged presently in starting a school to teach programming from
the ground up, based roughly on the curriculum outlined in the article
I mentioned. I can't randomly select students but I do intend to
gather data about who does how well from what starting conditions.

I'm excited about formal methods because one, I just heard about them,
and two, they provide me with a much more well-fleshed-out way to
bridge rigorously from "raw" Turing machine models to higher order
abstractions.

I'm confident that anyone who can play Sudoku can learn programming
and formal methods.

> In the real world, human beings are prone to serious cognitive biases and
> errors. Our brains are buggy. Take any list of reasoning fallacies, and
> you'll find the majority of people have difficulty avoid them. The first
> struggle is to get them to even accept that they *are* fallacies, but
> even once they have intellectually accepted it, getting them to live up
> to that knowledge in practice is much, much harder.
>
> In fact I'd go further and say that *every single person*, no matter how
> intelligent and well-educated, will fall for cognitive biases and
> fallacious reasoning on a regular basis.

I agree completely. One of my chief goals is to "debug" human
reasoning to whatever extent possible. This is precisely where I
expect training in computer programming to come in handy. The machine
symbol manipulator is not going to be fooled by reasoning fallacies.
Fluency in rigorous symbol manipulation can only help us cope with the
vagaries of our too-often sub-rational wetware.

I think the real win starts when computers can be used as a
communication medium, not for blogs and tweets and SMS txt and IM and
VoIP, but for reasoned symbolically coded discussion.


On Jul 9, 2:10 pm, Steven D'Aprano <st...@REMOVE-THIS-
cybersource.com.au> wrote:
> On Wed, 08 Jul 2009 22:05:57 -0700, Simon Forman wrote:
> >> persistent idea "out there" that programming is a very accessible
> >> skill, like cooking or gardening, anyone can do it, and even profit
> >> from it, monetarily or otherwise, etc., and to some extent I am
>
> > Programming is not like any other human activity.
>
> In practice? In principle? Programming in principle is not the same as it
> is performed in practice.
>
> But in either case, programming requires both the logical reasoning of
> mathematics and the creativity of the arts. Funnily enough,
> mathematicians will tell you that mathematics requires the same, and so
> will the best artists. I think mathematicians, engineers, artists, even
> great chefs, will pour scorn on your claim that programming is not like
> any other human activity.

Well it's actually Dijkstra's claim, and I find it reasonable, that
programming has two novel or unique aspects not present in other areas
of human endeavor: vast scale and digital mechanism.

http://www.cs.utexas.edu/users/EWD/transcriptions/EWD10xx/EWD1036.html


> [...]
>
> > He talks about how "when all is said and done, the only thing computers
> > can do for us is to manipulate symbols and produce results of such
> > manipulations" and he emphasises the "uninterpreted" nature of
> > mechanical symbol manipulation, i.e. that the machine is doing it
> > mindlessly.
>
> "Manipulate symbols" is so abstract as to be pointless. By that

That's the "koan form"; spelled out longhand, he means the Boolean
domain {true, false}, the logical operations and their algebraic
relationships, and the means of using these to deduce new formulae.
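To make "algebraic relationships" concrete, a small hedged sketch (the
checker below is made up for illustration): because the Boolean domain
has only two values, a claimed identity such as De Morgan's law or
distributivity can be confirmed by brute enumeration of every
assignment.

from itertools import product

def holds(law, arity):
    """True if the claimed identity holds for every Boolean assignment."""
    return all(law(*values) for values in product((True, False), repeat=arity))

# De Morgan: not (p and q)  ==  (not p) or (not q)
print(holds(lambda p, q: (not (p and q)) == ((not p) or (not q)), 2))   # True

# Distributivity: p and (q or r)  ==  (p and q) or (p and r)
print(holds(lambda p, q, r: (p and (q or r)) == ((p and q) or (p and r)), 3))  # True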


> reasoning, I can build a "computer" consisting of a box open at the top.
> I represent a symbol by an object (say, a helium-filled balloon, or a
> stone), instead of a pattern of bits. I manipulate the symbol by holding
> the object over the box and letting go. If it flies up into the sky, that
> represents the symbol "Love is War", if it falls into the box, it
> represents the symbol "Strength is Blue", and if it just floats there, it
> represents "Cheddar Cheese". This is a deterministic, analog computer
> which manipulates symbols. Great.
>
> And utterly, utterly useless. So what is my computer lacking that real
> computers have? When you have answered that question, you'll see why
> Dijkstra's claim is under-specified.

Well, you said it yourself: your computer is lacking utility.


[...]


>
> What is an uninterpreted formula? If you don't interpret it, how can you
> distinguish it from random noise?

An uninterpreted formula means that you manipulate the symbols without
thinking about what they mean - you just apply the rules to the
formulae as such. To be useful, once you're "done" with them you can
consider their meaning. You can interpret them at any time, but
Dijkstra is saying that we should pay attention to the [mathematical,
logical] reasoning process as its own thing, without reference to the
meaning of the symbols, at least temporarily.
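One way to make "uninterpreted" concrete is a toy rewriter (the tuple
encoding below is made up for illustration): formulae are just nested
shapes, and a rule like De Morgan's law is applied purely structurally,
with no truth values ever computed.

# Formulae as nested tuples, e.g. ('not', ('and', 'p', 'q')).
# One bottom-up pass applying De Morgan where it matches; a real
# rewriter would iterate until no rule applies.
def de_morgan(f):
    if not isinstance(f, tuple):
        return f
    f = (f[0],) + tuple(de_morgan(x) for x in f[1:])   # rewrite subformulae first
    if f[0] == 'not' and isinstance(f[1], tuple) and f[1][0] == 'and':
        _, (_, a, b) = f
        return ('or', ('not', a), ('not', b))
    return f

print(de_morgan(('not', ('and', 'p', ('not', ('and', 'q', 'r'))))))
# -> ('or', ('not', 'p'), ('not', ('or', ('not', 'q'), ('not', 'r'))))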


[...]


>
> > If you're never exposed to that constellation of concepts that underpins
> > "mechanical symbol manipulation" you are adrift in a sea ("c", ha ha) of
> > abstractions.
>
> > However, if you /are/ exposed to the "so few and simple" rules of
> > manipulation the gates (no pun intended) to the kingdom are thrown wide.
>
> What nonsense. The keys to the kingdom are the abstractions.
>
> Here's an exercise for you, if you dare. It's not that difficult to
> remove the abstraction from integer addition, to explain it in terms of
> bit manipulation. If you prefer, you can write a Turing Machine instead.
>
> Now try doing the same thing for Quicksort. Don't worry, we'll wait.
>
> Once you've done that, let's see you implement a practical, scalable, non-
> toy webserver as a Turing Machine.
>
> No, it's not the fundamental operations that open the doors to the
> kingdom. It's the abstractions. The history of computing is to gain more
> and more power by leveraging better and better abstractions.

I agree that "better and better abstractions" are the key(s), but I
strongly disagree that the fundamental operations are not, uh, key to
those keys.

There are two intertwined phenomena:

1) formulae, written in symbols

2) deriving new formulae by applying [in]formal methods to existing
formulae

The rules, and the application of those rules, are two separate but
compatible phenomena. (Compatible in that you can encode the latter
in terms of the former. This is the essential "trick" of mechanized
thought.)

Now (1) is constantly being elaborated; these elaborations are the
"higher" abstractions we're talking about.

But (2) stays the same(-ish) no matter how high you go, and (2) is the
process from which all the higher (1)'s come, so if we get it right
(and by right I mean of course by using rigorous proofs, formal
methods et al.) from the get-go and keep it right, we're golden.

I do literally intend to teach e.g. "integer addition ... in terms of
bit manipulation" and a bit of parsing too. But I intend to rapidly
bring students to a Forth or Forth-like stage, and then immediately
create simple parsers that implement, for example, C for loops, and
other "higher" abstractions.

Highly abridged, this is the curriculum: Enough about bits to bring
them to bytes, enough bytes to bring them to RAM, enough RAM to
implement a crude CPU, enough CPU to implement a bit of Forth, enough
Forth to implement parsers, and enough parsing (and compilation and
interpreting, etc.) to let them grok languages in general.

Formal methods give me a wide, straight road from each stage to the
next.
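As a sketch of that first rung (a hypothetical classroom example, not
part of any actual course materials): integer addition really can be
expressed as nothing but bitwise operations, with XOR producing the sum
bits and AND-then-shift propagating the carries.

def add(a, b):
    """Add two non-negative integers using only bitwise operations."""
    while b:
        carry = a & b      # positions where both operands have a 1 bit
        a = a ^ b          # sum of the bits, ignoring carries
        b = carry << 1     # carries move one position to the left
    return a

print(add(25, 17))   # 42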

[...]


> > Quoting Dijkstra again [1]: "Before we part, I would like to invite you
> > to consider the following way of doing justice to computing's radical
> > novelty in an introductory programming course.
>
> > "On the one hand, we teach what looks like the predicate calculus, but
> > we do it very differently from the philosophers. In order to train the
> > novice programmer in the manipulation of uninterpreted formulae, we
> > teach it more as boolean algebra, familiarizing the student with all
> > algebraic properties of the logical connectives. To further sever the
> > links to intuition, we rename the values {true, false} of the boolean
> > domain as {black, white}.
>
> This is supposed to be a good thing?
>
> Oh my. It must be nice in that ivory tower, never having to deal with
> anyone who hasn't already been winnowed by years of study and self-
> selection before taking on a comp sci course.
>
> I don't think Dijkstra would have taken that suggestion seriously if he
> had to teach school kids instead of college adults.

I'm not going to teach it exactly like he outlines there, but I really
am going to try the curriculum I outlined. I'll post videos. :]

> ...
>
> > (Did you read that last paragraph and think, "Well how the heck else are
> > you supposed to program a computer if not in a computer language?"? If
> > so, well, that is kind of my point.)
>
> No. What I thought was, how the hell are you supposed to manipulate
> symbols without a language to describe what sort of manipulations you can
> do?

Direct-manipulation graphical Abstract Syntax Trees.
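Roughly what that could mean under the hood (a hypothetical,
non-graphical sketch, not an existing tool): the program lives as a
tree of nodes that you transform directly, and text never enters into
it.

class Node:
    """A tiny expression AST: 'num' leaves and binary '+' / '*' nodes."""
    def __init__(self, op, *kids):
        self.op, self.kids = op, kids

    def eval(self):
        if self.op == 'num':
            return self.kids[0]
        a, b = (k.eval() for k in self.kids)
        return {'+': a + b, '*': a * b}[self.op]

# 1 + 2 * 3, built directly as structure -- no text, no parser.
expr = Node('+', Node('num', 1), Node('*', Node('num', 2), Node('num', 3)))
print(expr.eval())        # 7

# A "direct manipulation": swap the operands of the top-level '+'.
expr.kids = expr.kids[::-1]
print(expr.eval())        # still 7; the tree changed, not a character of text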

[...]


> > tables and arrays. He didn't even know that they were different
> > "things".
>
> I shall manfully resist the urge to make sarcastic comments about
> ignorant PHP developers, and instead refer you to my earlier reply to you
> about cognitive biases. For extra points, can you identify all the
> fallacious reasoning in that paragraph of yours?

I'll pass. I should have resisted the urge to whinge about him in the
first place.

>
> But those abstractions just get in the way, according to you. He should
> be programming in the bare metal, doing pure symbol manipulation without
> language.

No, the abstractions are useful, but their utility is tied to their
reliability, which in turn derives from the formal rigour of the
proofs of the abstractions' correctness.

My issue with the modern state of the art is that too many of these
abstractions are not /proven/, and that they are both frozen and
fractured by the existing milieu of "static", mutually incompatible
programming languages.

I see a possibility for something better.


> If programming is symbol manipulation, then you should remember that the
> user interface is also symbol manipulation, and it is a MUCH harder
> problem than databases, sorting, searching, and all the other problems
> you learn about in academia. The user interface has to communicate over a
> rich but noisy channel using multiple under-specified protocols to a
> couple of pounds of meat which processes information using buggy
> heuristics evolved over millions of years to find the ripe fruit, avoid
> being eaten, and have sex. If you think getting XML was hard, that's
> *nothing* compared to user interfaces.
>
> The fact that even bad UIs work at all is a credit to the heuristics,
> bugs and all, in the meat.

Ok, yes, but have you read Jef Raskin's "The Humane Interface"?

> > As for "a minimum of bugs"... sigh. The "minimum of bugs" is zero, if
> > you derive your "uninterpreted formulae" /correctly/.

Sorry. I can't believe that I was so melodramatic.

> And the secret of getting rich on the stock market is to buy low, sell
> high. Very true, and very pointless. How do you derive the formula
> correctly?
>
> Do you have an algorithm to do so? If so, how do you know that algorithm
> is correct?

Well, we know logic works; we can trust that. The algorithm for
proving the formula is logic.

> > Deriving provably
> > correct "programs" should be what computer science and computer
> > education are all about
>
> That's impossible. Not just impractical, or hard, or subject to hardware
> failures, but impossible.
>
> I really shouldn't need to raise this to anyone with pretensions of being
> a Computer Scientist and not just a programmer, but... try proving an
> arbitrary program will *halt*.
>
> If you can't do that, then what makes you think you can prove all useful,
> necessary programs are correct?

It may well be that there are useful, necessary programs that are
impossible to prove correct. I'm not trying to claim there aren't
such.

I think that wherever that's not the case, wherever useful necessary
programs can be proven, they should be.

It may be that flawless software is an unreachable asymptote, like the
speed of light for matter, but I'm (recently!) convinced that it's a
goal worthy of exploration and effort.

Just because it's unattainable doesn't make it undesirable. And who
knows? Maybe the horse will learn to sing.

> > Again with Dijkstra[3]: "The prime paradigma of the pragmatic designer
> > is known as "poor man's induction", i.e. he believes in his design as
> > long as "it works", i.e. until faced with evidence to the contrary. (He
> > will then "fix the design".)
>
> Yes, that is the only way it can be.
>
> > The scientific designer, however, believes
> > in his design because he understands why it will work under all
> > circumstances.
>
> Really? Under *all* circumstances?
>
> Memory failures? Stray cosmic radiation flipping bits? Noise in the power
> supply?
>
> What's that you say? "That's cheating, the software designer can't be
> expected to deal with the actual reality of computers!!! We work in an
> abstract realm where computers never run out of memory, swapping never
> occurs, and the floating point unit is always correct!"

First, yes, that could be considered cheating. We don't expect
automotive engineers to design for resistance to meteor strikes.
Second, those issues plague all programs, formally derived or not.
Formal methods certainly can't hurt. Third, in cases where you do
want reliability in the face of things like cosmic rays you're going
to want to use formal methods to achieve it.

And last but not least, it may be possible to "parameterize" the
abstractions we derive with information about how sensitive they are
to the stability of the abstractions underlying them.

> Fair enough. I don't want to make it too hard for you.
>
> Okay, so let's compare the mere "pragmatic designer" with his provisional
> expectation that his code is correct, and Dijkstra's "scientific
> designer". He understands his code is correct. Wonderful!
>
> Er, *how exactly* does he reach that understanding?
>
> He reasons about the logic of the program. Great! I can do that. See,
> here's a proof that binary search is correct. How easy this is.
>
> Now try it for some non-trivial piece of code. Say, the Linux kernel.
> This may take a while...

Well, I'd start with Minix 3... But really the method would be to
build up provably correct compounds out of proven sub-assemblies.
Even if it takes a long time, is it time wasted? A provably correct OS
is kind of a neat idea IMO.

>
> What guarantee do we have that the scientific designer's reasoning was
> correct? I mean, sure he is convinced he has reasoned correctly, but he
> would, wouldn't he? If he was unsure, he'd keep reasoning (or ask for
> help), until he was sure. But being sure doesn't make him correct. It
> just makes him sure.
>
> So how do you verify the verification?

Peer review, and run it through a mechanical verifier. There is a
self-reflexive loop there, but it's not infinite. If it were, we
couldn't implement any computer.

> And speaking of binary search:

[Bentley's binary search bug]

This is a perfect example of leaky abstraction. The C "integer"
doesn't behave like a "real" integer but part of the algorithm's
implementation relied on the tacit overlap between C ints and integers
up to a certain value. This is the sort of thing that could be
annotated onto the symbolic representation of the algorithm and used
to compile provably correct machine code (for various platforms.)

If the same proven algorithm had been implemented on top of a correct
integer abstraction (like python's long int) it wouldn't have had that
bug.

(And yes, there would still be limits: truly vast arrays could
conceivably result in memory exhaustion during execution of the search.
I think the answer there lies in being able to represent or calculate
the limits of an algorithm explicitly, the better to use it correctly
in other code.)
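A sketch of the point, using the textbook algorithm rather than
Bentley's exact code: in python the naive midpoint is safe precisely
because ints don't overflow, while in C the same line hides the bug.

def binary_search(a, target):
    """Return an index of target in the sorted list a, or -1 if absent."""
    low, high = 0, len(a) - 1
    while low <= high:
        # In C, (low + high) / 2 overflows a fixed-width int once
        # low + high exceeds INT_MAX -- Bentley's bug.  The usual C fix
        # is low + (high - low) / 2; python's unbounded ints make the
        # naive midpoint correct as written.
        mid = (low + high) // 2
        if a[mid] == target:
            return mid
        elif a[mid] < target:
            low = mid + 1
        else:
            high = mid - 1
    return -1

print(binary_search([2, 3, 5, 7, 11, 13], 7))   # 3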

> Perhaps the "scientific designer" should be a little less arrogant, and
> take note of the lesson of real science: all knowledge is provisional and
> subject to correction and falsification.

(I apologize for my arrogance on behalf of formal methods, I just
found out about them and I'm all giddy and irritating.)

> Which really makes the "scientific designer" just a slightly more clever
> and effective "pragmatic designer", doesn't it?

Um, er, well, scientists are the ultimate pragmatists, and Dijkstra's
description of the "pragmatic designer" above could well be applied to
"scientific designers". The difference turns on /why/ each person has
confidence in his design: one can prove mathematically that his design
is correct, while the other is ... sure.

Whether or not the delivered software is actually flawless, the
difference seems to me more than just "slightly more clever and
effective".

>
> > The transition from pragmatic to scientific design would
> > indeed be a drastic change within the computer industry."
>
> Given Dijkstra's definition of "scientific design", it certainly would.
> It would be as drastic a change as the discovery of Immovable Objects and
> Irresistible Forces would be to engineering, or the invention of a
> Universal Solvent for chemistry, or any other impossibility.
>
> > "Obviously no errors" is the goal to strive for, and I am comfortable
> > calling anyone an amateur who prefers "no obvious errors."
>
> It's not a matter of preferring no obvious errors, as understanding the
> limitations of reality. You simply can't do better than no obvious errors
> (although of course you can do worse) -- the only question is what you
> call "obvious".

It may be expensive, it may be difficult, and it may even prove
impossible, but I feel the shift from ad hoc software creation to
rigorous, scientific software production is the obvious direction. To
me it's like the shift from astrology to astronomy, or from alchemy to
chemistry.


~Simon

Paul Rubin

unread,
Jul 19, 2009, 2:51:17 PM7/19/09
to
Calroc <forman...@gmail.com> writes:
> I'm engaged presently in starting a school to teach programming from
> the ground up, based roughly on the curriculum outlined in the article
> I mentioned. ...

> I'm excited about formal methods because one, I just heard about them,

Formal methods are a big and complicated subject. Right after you
heard about them is probably not the best time to start teaching them.
Better get to understand them yourself first.

greg

unread,
Jul 20, 2009, 4:00:14 AM7/20/09
to
Calroc wrote:

> It may be that flawless software is an unreachable asymptote, like the
> speed of light for matter, but I'm (recently!) convinced that it's a
> goal worthy of exploration and effort.

Seems to me that once you get beyond the toy program
stage and try to apply correctness proving to real
world programming problems, you run up against the
problem of what exactly you mean by "correct".

Once the requirements get beyond a certain level of
complexity, deciding whether the specifications
themselves are correct becomes just as difficult as
deciding whether the program meets them.

Then there's the fact that very often we don't
even know what the exact requirements are, and it's
only by trying to write and use the program that
we discover what they really are -- or at least,
get a better idea of what they are, because the
process is usually iterative, with no clear end
point.

So in theory, correctness proofs are a fine idea,
and might even be doable on a small scale, but the
real world is huge and messy!

> Just because it's unattainable doesn't make it undesirable. And who
> knows? Maybe the horse will learn to sing.

Striving to find ways of writing fewer bugs is a
worthy goal, but I think of it more in terms of
adopting patterns of thought and programming that
make it more likely you will write code that does
what you had in mind, rather than a separate
"proof" process that you go through afterwards.

--
Greg

Simon Forman

unread,
Jul 20, 2009, 8:56:26 PM7/20/09
to
On Jul 19, 2:51 pm, Paul Rubin <http://phr...@NOSPAM.invalid> wrote:

Very good point.

But I'm glad it's there to study, these are wheels I don't have to
invent for myself.

~Simon

Simon Forman

unread,
Jul 20, 2009, 8:57:05 PM7/20/09
to
On Jul 20, 4:00 am, greg <g...@cosc.canterbury.ac.nz> wrote:
> Calroc wrote:
> > It may be that flawless software is an unreachable asymptote, like the
> > speed of light for matter, but I'm (recently!) convinced that it's a
> > goal worthy of exploration and effort.
>
> Seems to me that once you get beyond the toy program
> stage and try to apply correctness proving to real
> world programming problems, you run up against the
> problem of what exactly you mean by "correct".
>
> Once the requirements get beyond a certain level of
> complexity, deciding whether the specifications
> themselves are correct becomes just as difficult as
> deciding whether the program meets them.

I agree. You could prove that a game renders geometry without
crashing, but not that it's fun. Or in an accounting package you
could prove that it never loses a cent in tracking accounts and
payments, or that its graph-rendering code correctly renders data
(without introducing artifacts, for instance), but not that it will
let you notice trends.

> Then there's the fact that very often we don't
> even know what the exact requirements are, and it's
> only by trying to write and use the program that
> we discover what they really are -- or at least,
> get a better idea of what they are, because the
> process is usually iterative, with no clear end
> point.

Aye, if you're still exploring your solution space your requirements
are perforce somewhat nebulous.

> So in theory, correctness proofs are a fine idea,
> and might even be doble on a small scale, but the
> real world is huge and messy!

Very true, but I think it is still worthwhile to build larger systems from
proven components combined in provably correct ways, at least to the
extent possible.

> > Just because it's unattainable doesn't make it undesirable. And who
> > knows? Maybe the horse will learn to sing.
>
> Striving to find ways of writing less bugs is a
> worthy goal, but I think of it more in terms of
> adopting patterns of thought and programming that
> make it more likely you will write code that does
> what you had in mind, rather than a separate
> "proof" process that you go through afterwards.

I would consider formal methods exactly "patterns of thought and
programming that make it more likely you will write code that does
what you had in mind"; in fact that's why I'm excited about them.

My understanding (so far) is that you (hope to) /derive/ correct code
using formal logic, rather than writing code and then proving its
soundness.

The real win, IMO, is teaching computing as a logical, mathematical
activity (one that can certainly depart for realms less rigorously
defined).

I think anyone who can spontaneously solve a Sudoku puzzle has enough
native grasp of "formal" methods of reasoning to learn how to derive
working programs logically. Making the formal methods explicit gives
you the strongest available confidence that you are reasoning, and
programming, correctly.

This is not to say that "incorrect" (or unproven or unprovable)
programs are never useful or necessary. However, I do feel that it's
better to learn the "correct" way, and then introduce ambiguity, than
to simply throw e.g. Java at some poor student and let them find solid
ground "catch as catch can".


~Simon

Paul Rubin

unread,
Jul 20, 2009, 9:54:54 PM7/20/09
to
Simon Forman <sajm...@gmail.com> writes:
> But I'm glad it's there to study, these are wheels I don't have to
> invent for myself.

http://dwheeler.com/essays/high-assurance-floss.html

might be an interesting place to start.

Simon Forman

unread,
Jul 21, 2009, 10:42:08 AM7/21/09
to pytho...@python.org

OO la la! Thank you!

greg

unread,
Jul 21, 2009, 8:18:44 PM7/21/09
to
Simon Forman wrote:

> My understanding (so far) is that you (hope to) /derive/ correct code
> using formal logic, rather than writing code and then proving its
> soundness.

But to the extent you can rigorously formalise it,
all you've done is create Yet Another Programming
Language.

Which is fine, if you can come up with one that
works at a high enough level in some domain that
you can see just by looking that the program will
do what you want.

However, I suspect that you can't improve much on
what we've already got without restricting the
domain of applicability of the language. Otherwise,
you just shift the intellectual bottleneck from
writing a correct program to writing correct
specifications.

In the realm of programming language design, it's
been said that you can't eliminate complexity, all
you can do is push it around from one place to
another. I suspect something similar applies to
the difficulty of writing programs.

--
Greg

Paul Rubin

unread,
Jul 21, 2009, 9:07:24 PM7/21/09
to
greg <gr...@cosc.canterbury.ac.nz> writes:
> However, I suspect that you can't improve much on what we've already
> got without restricting the domain of applicability of the
> language. Otherwise, you just shift the intellectual bottleneck from
> writing a correct program to writing correct specifications.

Really, it's easier to write specifications. They're a level of
abstraction away from the implementation. Think of a sorting
function that operates on a list a1,a2...a_n. The sorting algorithm
might be some fiendishly clever, hyper-optimized hybrid of quicksort,
heapsort, timsort, burstsort, and 52-pickup. It might be thousands
of lines of intricate code that looks like it could break at the
slightest glance. The specification simply says the output
has the property a1<=a2<=...<=a_n. That is a lot easier to
say. If you can then prove that the program satisfies the
specification, you are far better off than if you have some
super-complex pile of code that appears to sort when you test
it, but has any number of possible edge cases that you didn't
figure out needed testing. It's like the difference between
stating a mathematical theorem (like the four-color theorem)
and proving it.
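A hedged sketch of that separation in python terms (the predicate below
is made up for illustration, and adds the usual "nothing lost, nothing
invented" clause alongside the ordering property): the checkable
specification is a few lines, and any implementation, however fiendish,
can be judged against it without reading a line of its code.

from collections import Counter

def meets_sort_spec(inp, out):
    """The spec: out is ordered (a1 <= a2 <= ... <= a_n) and is a
    rearrangement of inp.  Nothing here says *how* the sorting is done."""
    ordered = all(out[i] <= out[i + 1] for i in range(len(out) - 1))
    same_multiset = Counter(inp) == Counter(out)
    return ordered and same_multiset

print(meets_sort_spec([3, 1, 2], sorted([3, 1, 2])))   # True
print(meets_sort_spec([3, 1, 2], [1, 2, 2]))           # False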
