
The type/object distinction and possible synthesis of OOP and imperative programming languages


Mark Janssen

Apr 14, 2013, 11:48:05 PM
to types...@lists.seas.upenn.edu, Python List
Hello,

I'm new to the list and hoping this might be the right place to
introduce something that has provoked a bit of an argument in my
programming community.

I'm from the Python programming community. Python is an "interpreted"
language. Since 2001, Python has migrated towards a "pure" Object
model (ref: http://www.python.org/download/releases/2.2/descrintro/).
Prior to then, it had both types and classes, and these types were
anchored to the underlying C code and the machine/hardware
architecture itself. After the 2001 "type/class unification", it
went towards Alan Kay's ideal of "everything is an object". From
then on, every user-defined class inherited from the abstract Object,
rooted in nothing but a pure abstract ideal. The parser, lexer, and
such spin these abstractions into something that can be run on the
actual hardware.

As a contrast, this is very distinct from C++, where everything is
concretely rooted in the language's type model, which is *itself*
rooted (from its long history) in the CPU architecture. The STL,
for example, has many Container types, but each of them requires using
a single concrete type for homogeneous containers or uses machine
pointers to hold arbitrary items in heterogeneous containers (caveat:
I haven't programmed in C++ for a long time, so it's possible this
might not be correct anymore).

My question is: Is there something in the Computer Science literature
that has noticed this distinction/development in programming language
design and history?

It's very significant to me, because as languages went higher and
higher towards this pure OOP model, the programmer+data ecosystem tended
towards very personal object hierarchies, because now the hardware no
longer formed a common basis of interaction (note also, OOP's promise
of re-usable code never materialized).

It's not unlike LISP, where the power of its general language
architecture tended towards hyperpersonal mini macro languages --
making it hardly used in practice, though it was and is so powerful
in theory.

That all being said, the thrust of this whole effort is to possibly
advance Computer Science and language design, because in-between the
purely concrete "object" architecture of the imperative programming
languages and the purely abstract object architecture of
object-oriented programming languages is a possible middle ground that
could unite them all.

Thank you for your time.

Mark Janssen
Tacoma, Washington

Terry Jan Reedy

Apr 15, 2013, 12:56:02 AM
to pytho...@python.org
Note: cross-posting to mailing lists does not work well. Hence the reply
only to python-list and the gmane mirror.

On 4/14/2013 11:48 PM, Mark Janssen wrote:

> Python is an "interpreted" language.

I consider this a useless or even deceptive statement. Python is an
object-based algorithm language. The CPython implementation (currently)
*compiles* Python code to a virtual machine bytecode, similar to Java
bytecode. This could change. Jython translates to Java, which is then
compiled to Java bytecode. Do you call Java an 'interpreted' language?
There are compilers that compile Python to the same 'object code' that C
is compiled to.
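
As a quick illustration of that compile step, the standard dis module
shows the bytecode CPython produces for a function (a sketch; the exact
opcodes vary across CPython versions):

import dis

def add_one(x):
    return x + 1

# Prints the compiled bytecode, with opcodes such as LOAD_FAST and
# RETURN_VALUE (the exact names depend on the CPython version).
dis.dis(add_one)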

> Since 2001, Python has migrated towards a "pure" Object model.

It migrated towards a pure class model.

> model (ref: http://www.python.org/download/releases/2.2/descrintro/).
> Prior to then, it had both types and classes and these types were
> anchored to the underlying C code and the machine/hardware
> architecture itself.

I started with Python 1.3 and I have no idea what you mean by this. I
think you are heavily confusing Python the language and CPython the
implementation.

> After the 2001 "type/class unification", it
> went towards Alan Kay's ideal of "everything is an object".

Everything has always been an object with value and identity.

This stands in contrast to the assembler/C model, in which everything
(except registers) is a block of linear memory with an address.
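
A small sketch of that contrast in Python terms: assignment binds a
name to an object with identity, rather than copying bytes into a
variable's storage.

a = [1, 2, 3]
b = a              # a second name bound to the *same* object
b.append(4)
print(a)           # [1, 2, 3, 4]: both names see the mutation
print(a is b)      # True: one object, two names, one identity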

> My question is: Is there something in the Computer Science literature
> that has noticed this distinction/development in programming language
> design and history?

To me, there should be, since this is a fundamental distinction in data
models. I think the biggest problem in shifting from C to Python is
'getting' the data model difference. For Lispers, the difference is the
syntax.

--
Terry Jan Reedy


DeLesley Hutchins

Apr 15, 2013, 1:55:59 AM
to Mark Janssen, types...@lists.seas.upenn.edu, Python List

I'm not quite sure I understand your question, but I'll give it a shot.  :-)

The C/C++ model, in which the types are anchored to the machine hardware, is the exception, not the rule. In the academic literature, "type theory" is almost entirely focused on studying abstract models of computation that are purely mathematical and bear no resemblance to the underlying hardware. The lambda calculus is the most general, and most commonly used, formalism, but there are many others; e.g. Featherweight Java provides a formal model of objects and classes as they are used in Java.
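
To make the flavor of the lambda calculus concrete, here is a minimal
sketch of Church numerals in Python (Python only as the notation at
hand; the point is that numbers here are nothing but functions, with
no connection at all to machine words):

zero = lambda f: lambda x: x                     # apply f zero times
succ = lambda n: lambda f: lambda x: f(n(f)(x))  # one more application

def church_to_int(n):
    return n(lambda k: k + 1)(0)

two = succ(succ(zero))
print(church_to_int(two))  # 2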

"Types and Programming Languages", by Benjamin Pierce, is an excellent introductory textbook which describes how various language features, including objects, can be formalized.  If you are interested in OOP, Abadi and Cardelli's "Theory of Objects" is the obvious place to start, although I'd recommend reading Pierce's book first if you want to understand it.  :-)  Abadi and Cardelli discuss both class-based languages, and pure object languages.  If you are interested in the type/object distinction in particular, then I'll shamelessly plug my own thesis: "Pure Subtype Systems" (available online), which describes a formal model in which types are objects, and objects are types.  If you are familiar with the Self language, then you can think of it as a type system for Self.

Once you have a type system in place, it's usually fairly straightforward to compile a language down to actual hardware.  An interpreter, like that used in Python, is generally needed only for untyped or "dynamic" languages.  There are various practical considerations -- memory layout, boxed or unboxed data types, garbage collection, etc. -- but the basic techniques are described in any compiler textbook.  Research in the areas of "typed assembly languages" and "proof carrying code" are concerned with ensuring that the translation from high-level language to assembly language is sound, and well-typed at all stages.  

  -DeLesley




Uday S Reddy

Apr 15, 2013, 5:06:21 AM
to Mark Janssen, types...@lists.seas.upenn.edu, Python List
Mark Janssen writes:

> After the 2001 "type/class unification", it went towards Alan Kay's ideal
> of "everything is an object"....
>
> As a contrast, this is very distinct from C++, where everything is
> concretely rooted in the language's type model, which is *itself*
> rooted (from its long history) in the CPU architecture. ...
>
> My question is: Is there something in the Computer Science literature
> that has noticed this distinction/development in programming language
> design and history?

In programming language theory, there is no law to the effect that
"everything" should be of one kind or another. So, we would not go with
Alan Kay's ideal.

Having said that, theorists do want to unify concepts wherever possible and
wherever they make sense. Imperative programming types, which I will call
"storage types", are semantically the same as classes. Bare storage types
have predefined operations for 'getting' and 'setting' whereas classes allow
user-defined operations. So, the distinction made between them in typical
programming languages is artificial and implementation-focused. C and C++
are especially prone to this problem because they were designed for writing
compilers and operating systems where proximity to the machine architecture
seems quite necessary. The higher-level languages such as Java are moving
towards abolishing the distinction. Scala might be the best model in this
respect, though I do not know its type system fully.
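
A rough Python sketch of that identification (the IntCell/CounterCell
names are made up for illustration): a bare storage type offers only
the predefined 'get' and 'set', and a class layers user-defined
operations on top of them.

class IntCell:
    # A bare "storage type": nothing but get and set.
    def __init__(self, value=0):
        self._value = value
    def get(self):
        return self._value
    def set(self, value):
        self._value = value

class CounterCell(IntCell):
    # A class generalizes the storage type with user-defined operations.
    def increment(self):
        self.set(self.get() + 1)

c = CounterCell()
c.increment()
print(c.get())  # 1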

Here are a couple of references in theoretical work that might be helpful in
understanding these connections:

- John Reynolds, The Essence of Algol, in de Bakker and van Vliet (eds)
Algorithmic Languages, 1981. Also published in O'Hearn and Tennent (eds)
Algol-like Languages, Vol. A, 1997.

- Uday Reddy, Objects and Classes in Algol-like Languages, Information and
Computation, 172:63-97, 2002. (previously in FOOL workshop 1998.)
http://www.cs.bham.ac.uk/~udr/papers/classes.pdf

However, there are properties that are special to storage types, which are
not shared by all class types. Sometimes, they simplify some theoretical
aspects. It is not uncommon for authors to make a distinction between
storage types and general types. An example is one of our own papers:

- Swarup, Reddy and Ireland: Assignments for applicative languages, FPCA
1991. http://www.cs.bham.ac.uk/~udr/papers/assign.pdf

Cheers,
Uday Reddy

Moez AbdelGawad

Apr 15, 2013, 5:53:38 AM
to DeLesley Hutchins, Mark Janssen, Types List, Python List, Moez Abdel-Gawad


On Sun, 14 Apr 2013 22:55:59 -0700, DeLesley Hutchins <dele...@gmail.com> wrote:
> I'm not quite sure I understand your question, but I'll give it a shot. :-)
>

I'm in this same camp too :)


> The C/C++ model, in which the types are anchored to the machine hardware,
> is the exception, not the rule. In the academic literature, "type theory"
> is almost entirely focused on studying abstract models of computation that
> are purely mathematical, and bear no resemblance to the underlying
> hardware. The lambda calculus is the most general, and most commonly used
> formalism, but there are many others; e.g. Featherweight Java provides a
> formal model of objects and classes as they are used in Java.
>
> "Types and Programming Languages", by Benjamin Pierce, is an excellent
> introductory textbook which describes how various language features,
> including objects, can be formalized. If you are interested in OOP, Abadi
> and Cardelli's "Theory of Objects" is the obvious place to start, although
> I'd recommend reading Pierce's book first if you want to understand it.
> :-) Abadi and Cardelli discuss both class-based languages, and pure
> object languages. If you are interested in the type/object distinction in
> particular, then I'll shamelessly plug my own thesis: "Pure Subtype
> Systems" (available online), which describes a formal model in which types
> are objects, and objects are types. If you are familiar with the Self
> language, then you can think of it as a type system for Self.
>

Offering a different view, I'd like to (also, shamelessly) plug my own thesis: "NOOP: A Mathematical Model of OOP" (available online) in which I denotationally model nominally-typed (i.e., statically-typed class-based) OO languages such as Java, C#, C++ and Scala (but not Python).

In agreement with the most common tradition in PL research, types in NOOP are modeled abstractly as (certain) sets (of objects).   NOOP largely mimics, for nominally-typed OO languages, what Cardelli, Cook, and others earlier did for structurally-typed OO languages.

Regards,

-Moez

Steven D'Aprano

Apr 15, 2013, 6:11:14 AM
On Sun, 14 Apr 2013 20:48:05 -0700, Mark Janssen wrote:

> Hello,
>
> I'm new to the list and hoping this might be the right place to
> introduce something that has provoked a bit of an argument in my
> programming community.
>
> I'm from the Python programming community. Python is an "interpreted"
> language. Since 2001, Python's has migrated towards a "pure" Object
> model (ref: http://www.python.org/download/releases/2.2/descrintro/).
> Prior to then, it had both types and classes and these types were
> anchored to the underlying C code and the machine/hardware architecture
> itself.


Incorrect.

Python's data model has always been 100% object oriented. Prior to the
"class/type" unification, it simply had *two distinct* implementations of
objects: types, which were written in C, and classes, which were written
in Python.

After unification, the two kinds of object were no longer entirely
distinct -- you could then subclass types in Python code, using the same
"class" keyword as you would use for a pure-Python class.

And starting with Python 3, the last vestiges of the distinction have
disappeared. Now, "class" and "type" are mere synonyms. Both built-in
types and custom classes use the same mechanism.
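
For example, a sketch of what the unification permits in Python 3:

class MyInt(int):              # subclassing a built-in type
    def doubled(self):
        return MyInt(2 * self)

print(MyInt(21).doubled())          # 42
print(isinstance(MyInt(21), int))   # True: one mechanism for both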


> After the 2001 "type/class unification", it went towards Alan
> Kay's ideal of "everything is an object". From then, every user-defined
> class inherited from the abstract Object, rooted in nothing but a pure
> abstract ideal.

Incorrect. In Python 2.7:


py> class AClass:
... pass
...
py> issubclass(AClass, object)
False
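
For contrast, a new-style class run in the same 2.7 interpreter (a
sketch; in Python 3 every class inherits from object implicitly):

py> class BClass(object):
... pass
...
py> issubclass(BClass, object)
True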



--
Steven

Matthias Felleisen

Apr 15, 2013, 9:50:38 AM
to Mark Janssen, types...@lists.seas.upenn.edu, Python List

On Apr 14, 2013, at 11:48 PM, Mark Janssen wrote:

> After the 2001 "type/class unification", it went towards Alan Kay's ideal

Are you sure? Remember Kay's two motivations [*], which he so elegantly describes with "[the] large scale one was to find a better module scheme for complex systems involving hiding of details, and the small scale one was to find a more flexible version of assignment, and then to try to eliminate it altogether." At least for me, this quote sends a signal to language designers that is still looking for a receiver -- Matthias

[*] http://gagne.homedns.org/~tgagne/contrib/EarlyHistoryST.html



Antoon Pardon

Apr 15, 2013, 1:43:32 PM
to pytho...@python.org
On 15-04-13 12:11, Steven D'Aprano wrote:

>
> Python's data model has always been 100% object oriented. Prior to the
> "class/type" unification, it simply had *two distinct* implementations of
> objects: types, which were written in C, and classes, which were written
> in Python.
>
> After unification, the two kinds of object were no longer entirely
> distinct -- you could then subclass types in Python code, using the same
> "class" keyword as you would use for a pure-Python class.
>
> And starting with Python 3, the last vestiges of the distinction have
> disappeared. Now, "class" and "type" are mere synonyms. Both built-in
> types and custom classes use the same mechanism.

I had gotten my hopes up after reading this but then I tried:


$ python3
Python 3.2.3 (default, Feb 20 2013, 17:02:41)
[GCC 4.7.2] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> class vslice (slice):
... pass
...
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
TypeError: type 'slice' is not an acceptable base type


It seems types and classes are still not mere synonyms.

Dave Angel

Apr 15, 2013, 5:13:44 PM
to pytho...@python.org
On 04/15/2013 01:43 PM, Antoon Pardon wrote:
> On 15-04-13 12:11, Steven D'Aprano wrote:
>
>>
>> Python's data model has always been 100% object oriented. Prior to the
>> "class/type" unification, it simply had *two distinct* implementations of
>> objects: types, which were written in C, and classes, which were written
>> in Python.
>>
>> After unification, the two kinds of object were no longer entirely
>> distinct -- you could then subclass types in Python code, using the same
>> "class" keyword as you would use for a pure-Python class.
>>
>> And starting with Python 3, the last vestiges of the distinction have
>> disappeared. Now, "class" and "type" are mere synonyms. Both built-in
>> types and custom classes use the same mechanism.
>
> I had gotten my hopes up after reading this but then I tried:
>
>
> $ python3
> Python 3.2.3 (default, Feb 20 2013, 17:02:41)
> [GCC 4.7.2] on linux2
> Type "help", "copyright", "credits" or "license" for more information.
> >>> class vslice (slice):
> ... pass
> ...
> Traceback (most recent call last):
> File "<stdin>", line 1, in <module>
> TypeError: type 'slice' is not an acceptable base type
>
>
> It seems types and classes are still not mere synonyms.

No, it seems you're trying to use an internal detail as though it were a
supported feature.

From page:
http://docs.python.org/3.3/reference/datamodel.html#types

"""Internal types
A few types used internally by the interpreter are exposed to the user.
Their definitions may change with future versions of the interpreter,
but they are mentioned here for completeness.
"""


--
DaveA

Rotwang

Apr 15, 2013, 6:12:05 PM
On 15/04/2013 22:13, Dave Angel wrote:
> On 04/15/2013 01:43 PM, Antoon Pardon wrote:
>> [...]
>>
>> I had gotten my hopes up after reading this but then I tried:
>>
>>
>> $ python3
>> Python 3.2.3 (default, Feb 20 2013, 17:02:41)
>> [GCC 4.7.2] on linux2
>> Type "help", "copyright", "credits" or "license" for more information.
>> >>> class vslice (slice):
>> ... pass
>> ...
>> Traceback (most recent call last):
>> File "<stdin>", line 1, in <module>
>> TypeError: type 'slice' is not an acceptable base type
>>
>>
>> It seems types and classes are still not mere synonyms.
>
> No, it seems you're trying to use an internal detail as though it were a
> supported feature.
>
> From page:
> http://docs.python.org/3.3/reference/datamodel.html#types
>
> """Internal types
> A few types used internally by the interpreter are exposed to the user.
> Their definitions may change with future versions of the interpreter,
> but they are mentioned here for completeness.
> """

To be fair, one can't do this either:

Python 3.3.0 (v3.3.0:bd8afb90ebf2, Sep 29 2012, 10:57:17) [MSC v.1600 64
bit (AMD64)] on win32
Type "copyright", "credits" or "license()" for more information.
>>> class C(type(lambda: None)):
pass

Traceback (most recent call last):
File "<pyshell#2>", line 1, in <module>
class C(type(lambda: None)):
TypeError: type 'function' is not an acceptable base type


and I don't think that FunctionType would be considered an "internal
detail", would it? Not that I'd cite the fact that not all types can be
inherited from as evidence that types and classes are not synonyms, mind.

Chris Angelico

Apr 15, 2013, 6:32:45 PM
to pytho...@python.org
On Tue, Apr 16, 2013 at 8:12 AM, Rotwang <sg...@hotmail.co.uk> wrote:
> Traceback (most recent call last):
> File "<pyshell#2>", line 1, in <module>
> class C(type(lambda: None)):
> TypeError: type 'function' is not an acceptable base type
>
>
> and I don't think that FunctionType would be considered an "internal
> detail", would it? Not that I'd cite the fact that not all types can be
> inherited from as evidence that types and classes are not synonyms, mind.

Actually, I'm not sure how you'd go about inheriting from a function.
Why not just create a bare class, then assign its __call__ to be the
function you're inheriting from?
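
Something along these lines (a sketch: it makes instances callable,
though they still won't pass an isinstance check against FunctionType):

def original(x):
    return x * 2

class FunctionLike:
    # Delegate calls to the wrapped function. Assigning the function
    # directly as a class attribute would turn it into a method and
    # pass self as the first argument.
    def __call__(self, *args, **kwargs):
        return original(*args, **kwargs)

f = FunctionLike()
print(f(21))  # 42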

ChrisA

Rotwang

Apr 15, 2013, 6:54:21 PM
No idea. I wasn't suggesting that trying to inherit from FunctionType
was a sensible thing to do; I was merely pointing out that slice's
status as an internal feature was not IMO relevant to the point that
Antoon was making.

Terry Jan Reedy

Apr 15, 2013, 8:52:58 PM
to pytho...@python.org
On 4/15/2013 1:43 PM, Antoon Pardon wrote:

> $ python3
> Python 3.2.3 (default, Feb 20 2013, 17:02:41)
> [GCC 4.7.2] on linux2
> Type "help", "copyright", "credits" or "license" for more information.
> >>> class vslice (slice):
> ... pass
> ...
> Traceback (most recent call last):
> File "<stdin>", line 1, in <module>
> TypeError: type 'slice' is not an acceptable base type
>
>
> It seems types and classes are still not mere synonyms.

Some builtin classes cannot be subclassed. There is an open issue about
documenting better which ones. That does not mean that slice is not a class.


Steven D'Aprano

Apr 15, 2013, 10:15:00 PM
You are misinterpreting what you are reading. The mere fact that
something cannot be subclassed doesn't mean anything. That's just a
restriction put on the class by the implementation. It's not even clear
that it is a guaranteed language restriction or a mere accident of
implementation. With a bit of metaclass trickery, I could equally create
a pure-Python class that cannot be easily subclassed.

The proof that types and classes are the same in Python 3 is simple:

py> class C:
... pass
...
py> type(C) is type(int) is type(type) is type
True

The type of the pure-Python class is type itself.

However, even this can be bypassed, using a metaclass!

py> class Meta(type):
... pass
...
py> class D(metaclass=Meta):
... pass
...
py> type(D) is type
False
py> issubclass(type(D), type)
True


So when using a metaclass, the type of the class is not necessarily type
itself, but it will be a subclass of type.

This does not hold in Python 2.x, at least not for old-style "classic"
classes. Classic classes are in a world of their own, distinct from types:

# Python 2
py> class C:
... pass
...
py> type(C)
<type 'classobj'>
py> issubclass(type(C), type)
False



In Python 3, we can expect these two conditions to always hold:

* all instances are instances of object;

* all classes are instances of type.


Notice that this implies that type and object are circularly defined:
object, being a class, is an instance of type, but type, being an object,
is an instance of object:

py> isinstance(type, object)
True
py> isinstance(object, type)
True



These two conditions even apply to unsubclassable objects like slice:

py> isinstance(slice(1, 5, 2), object)
True
py> isinstance(slice, type)
True



--
Steven

Steven D'Aprano

Apr 15, 2013, 10:32:59 PM
I think it is also important to document whether that is a language
feature, or a mere restriction of the implementation. There is an
important distinction to be made between:

"In CPython, you cannot subclass slice or FunctionType. Other Pythons may
have more, or fewer, restrictions."

and:

"No language that calls itself Python is permitted to allow slice and
FunctionType to be subclassable."


If I had a say in this, I would vote for the first case, with the
possible exception of documented singleton types like NoneType and bool.



--
Steven

Terry Jan Reedy

Apr 15, 2013, 11:17:36 PM
to pytho...@python.org
I will keep the above in mind if I write or review a patch. There are 4
non-subclassable builtin classes. Two are already documented; bool is
one, I forget the other. I believe it was recently decided to leave the
other two as is, given the absence of any practical use case.


Ian Kelly

Apr 16, 2013, 12:46:49 AM
to Python
On Mon, Apr 15, 2013 at 9:17 PM, Terry Jan Reedy <tjr...@udel.edu> wrote:
> I will keep the above in mind if I write or review a patch. here are 4
> non-subclassable builtin classes. Two are already documented. Bool in one,
> forget which other. I believe it was recently decided to leave the other two
> as is given the absence of any practical use case.

The four are bool, NoneType, slice and ellipsis, I believe.

rusi

Apr 16, 2013, 12:56:12 AM
On Apr 16, 7:32 am, Steven D'Aprano <steve+comp.lang.pyt...@pearwood.info> wrote:
>
> If I had a say in this, I would vote for the first case, with the
> possible exception of documented singleton types like NoneType and bool.

How is bool a singleton type?

Steven D'Aprano

Apr 16, 2013, 1:59:23 AM
A doubleton, then.


The point being, GvR declared that bool should guarantee the invariant
that True and False are the only instances of bool, and if you can
subclass it, either that invariant is violated, or you can't instantiate
the subclass.
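
Both halves of that invariant are easy to check (a small sketch):

print(bool(1) is True, bool(0) is False)  # True True: bool() only ever
                                          # returns the two instances
try:
    class MyBool(bool):
        pass
except TypeError as e:
    print(e)  # type 'bool' is not an acceptable base type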



--
Steven

88888 Dihedral

Apr 16, 2013, 2:54:59 AM
to types...@lists.seas.upenn.edu, Python List
On Monday, April 15, 2013 at 11:48:05 AM UTC+8, zipher wrote:
> Hello,
>
>
>
> I'm new to the list and hoping this might be the right place to
>
> introduce something that has provoked a bit of an argument in my
>
> programming community.

I'll state my opinions about the imperative and
non-imperative parts.

If a finite stack depth is used instead of an infinite one,
then the auto local variables of the imperative part
can be implemented quite safely and cheaply, or at least
made self-recoverable from a stack overflow event.

This can save a lot of burden in the GC part of an imperative
language.




Serhiy Storchaka

Apr 16, 2013, 4:25:53 AM
to pytho...@python.org
>>> import builtins
>>> for n in dir(builtins):
...     if type(getattr(builtins, n)) is type:
...         try:
...             t = type(n, (getattr(builtins, n),), {})
...         except TypeError as e:
...             print(e)
...
type 'bool' is not an acceptable base type
type 'memoryview' is not an acceptable base type
type 'range' is not an acceptable base type
type 'slice' is not an acceptable base type

Antoon Pardon

Apr 16, 2013, 5:07:38 AM
to pytho...@python.org
On 16-04-13 05:17, Terry Jan Reedy wrote:
> On 4/15/2013 10:32 PM, Steven D'Aprano wrote:
> I will keep the above in mind if I write or review a patch. here are 4
> non-subclassable builtin classes. Two are already documented. Bool in
> one, forget which other. I believe it was recently decided to leave
> the other two as is given the absence of any practical use case.

Why should there be a practical use case here? Since classes are in
general subclassable, shouldn't you need a reason *not* to make them
so, rather than people needing to give you a practical use case before
you treat these classes as you do most others?

I once had an idea of a slice-like class that I would have liked to
experiment with. As things were, I didn't get far, because slice not
being subclassable was a major hurdle to making it practical. Would the
end result have been a practical use case? I don't know; I didn't get
the chance to find out, because making a class that merely looked like
a slice didn't work either. Python wanted, maybe still wants, a real
slice in a number of circumstances, and not a duck-typed slice-like
object.

Now maybe there are good reasons for slice not being subclassable, but
the absence of a practical use case doesn't seem to be one in this case.
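
A sketch of the hurdle described above (the exact error message differs
across Python versions):

class FakeSlice:
    # Looks like a slice: same attributes, but not a slice instance.
    def __init__(self, start, stop, step=None):
        self.start, self.stop, self.step = start, stop, step

data = [0, 1, 2, 3, 4]
print(data[slice(1, 3)])    # [1, 2]: a real slice works
try:
    data[FakeSlice(1, 3)]   # duck typing is not enough here
except TypeError as e:
    print(e)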

Terry Jan Reedy

Apr 16, 2013, 12:49:01 PM
to pytho...@python.org
On 4/16/2013 5:07 AM, Antoon Pardon wrote:
> On 16-04-13 05:17, Terry Jan Reedy wrote:
>> On 4/15/2013 10:32 PM, Steven D'Aprano wrote:
>>> On Mon, 15 Apr 2013 20:52:58 -0400, Terry Jan Reedy wrote:
>>

>> I will keep the above in mind if I write or review a patch. here are 4
>> non-subclassable builtin classes. Two are already documented. Bool in
>> one, forget which other. I believe it was recently decided to leave
>> the other two as is given the absence of any practical use case.
>
> Why should there be a practical use case here?

As a practical matter, the change is non-trivial. Someone has to be
motivated to write the patch to enable subclassing, write tests, and
consider the effect on internal C uses of slice and stdlib Python uses
of slice (type() versus isinstance).

> Since classes are in general subclassable,

if written in Python, but not if written in C.

> I once had an idea of a slice-like class that I would have liked to
> experiment with.

Did the idea actually require that instances *be* a slice rather than
*wrap* a slice?

--
Terry Jan Reedy



Ethan Furman

Apr 16, 2013, 1:29:18 PM
to pytho...@python.org
On 04/16/2013 01:25 AM, Serhiy Storchaka wrote:
> On 16.04.13 07:46, Ian Kelly wrote:
>> On Mon, Apr 15, 2013 at 9:17 PM, Terry Jan Reedy <tjr...@udel.edu> wrote:
>>> I will keep the above in mind if I write or review a patch. here are 4
>>> non-subclassable builtin classes. Two are already documented. Bool in one,
>>> forget which other. I believe it was recently decided to leave the other two
>>> as is given the absence of any practical use case.
>>
>> The four are bool, NoneType, slice and ellipsis, I believe.
>
> --> import builtins
> --> for n in dir(builtins):
> ...     if type(getattr(builtins, n)) is type:
> ...         try:
> ...             t = type(n, (getattr(builtins, n),), {})
> ...         except TypeError as e:
> ...             print(e)
> ...
> type 'bool' is not an acceptable base type
> type 'memoryview' is not an acceptable base type
> type 'range' is not an acceptable base type
> type 'slice' is not an acceptable base type

Well that bumps our count to five then:

--> NoneType = type(None)
--> NoneType
<class 'NoneType'>
--> class MoreNone(NoneType):
... pass
...
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
TypeError: type 'NoneType' is not an acceptable base type

--
~Ethan~

Terry Jan Reedy

Apr 16, 2013, 2:29:40 PM
to pytho...@python.org
On 4/16/2013 1:29 PM, Ethan Furman wrote:
> On 04/16/2013 01:25 AM, Serhiy Storchaka wrote:
>> On 16.04.13 07:46, Ian Kelly wrote:
>>> On Mon, Apr 15, 2013 at 9:17 PM, Terry Jan Reedy <tjr...@udel.edu>
>>> wrote:
>>>> I will keep the above in mind if I write or review a patch. here are 4
>>>> non-subclassable builtin classes. Two are already documented. Bool
>>>> in one,
>>>> forget which other. I believe it was recently decided to leave the
>>>> other two
>>>> as is given the absence of any practical use case.
>>>
>>> The four are bool, NoneType, slice and ellipsis, I believe.
>>
>> --> import builtins
>> --> for n in dir(builtins):
>> ...     if type(getattr(builtins, n)) is type:
>> ...         try:
>> ...             t = type(n, (getattr(builtins, n),), {})
>> ...         except TypeError as e:
>> ...             print(e)
>> ...
>> type 'bool' is not an acceptable base type
>> type 'memoryview' is not an acceptable base type
>> type 'range' is not an acceptable base type
>> type 'slice' is not an acceptable base type
>
> Well that bumps our count to five then:
>
> --> NoneType = type(None)
> --> NoneType
> <class 'NoneType'>
> --> class MoreNone(NoneType):
> ... pass
> ...
> Traceback (most recent call last):
> File "<stdin>", line 1, in <module>
> TypeError: type 'NoneType' is not an acceptable base type

'NoneType' is not a builtin name in builtins, which is precisely why you
accessed it the way you did ;-). From issue 17279 (for 3.3):

"Attached subclassable.py produces these lists:
Among named builtin classes, these cannot be subclassed:
bool, memoryview, range, slice,
Among types classes, these can be subclassed:
ModuleType, SimpleNamespace,"



Ian Kelly

Apr 16, 2013, 2:22:42 PM
to Python
On Tue, Apr 16, 2013 at 11:29 AM, Ethan Furman <et...@stoneleaf.us> wrote:
>>> The four are bool, NoneType, slice and ellipsis, I believe.
>>
>>
>> --> import builtins
>> --> for n in dir(builtins):
>>
>> ...     if type(getattr(builtins, n)) is type:
>> ...         try:
>> ...             t = type(n, (getattr(builtins, n),), {})
>> ...         except TypeError as e:
>> ...             print(e)
>> ...
>> type 'bool' is not an acceptable base type
>> type 'memoryview' is not an acceptable base type
>> type 'range' is not an acceptable base type
>> type 'slice' is not an acceptable base type
>
>
> Well that bumps our count to five then:

Six.

>>> class test(type(...)): pass
...
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
TypeError: type 'ellipsis' is not an acceptable base type

Mark Janssen

Apr 16, 2013, 6:38:29 PM
to Chris Angelico, pytho...@python.org
On Mon, Apr 15, 2013 at 3:32 PM, Chris Angelico <ros...@gmail.com> wrote:
> On Tue, Apr 16, 2013 at 8:12 AM, Rotwang <sg...@hotmail.co.uk> wrote:
>> Traceback (most recent call last):
>> File "<pyshell#2>", line 1, in <module>
>> class C(type(lambda: None)):
>> TypeError: type 'function' is not an acceptable base type
>>
>>
>> and I don't think that FunctionType would be considered an "internal
>> detail", would it? Not that I'd cite the fact that not all types can be
>> inherited from as evidence that types and classes are not synonyms, mind.
>
> Actually, I'm not sure how you'd go about inheriting from a function.
> Why not just create a bare class, then assign its __call__ to be the
> function you're inheriting from?

I think his point remains valid, from a theoretical pov. Python
prides itself on the idea of "first-class functions" and such, but
unlike the world of lambda calculus, this selling point is a bit
invalid. Because for Python (and any C-based language), its roots are
squarely in the Turing machine and its real-world implementation.

(Note this contrasts starkly with Java(script), which doesn't seem
to be based on anything -- can anyone clarify where Java actually
comes from?)
--
MarkJ
Tacoma, Washington

Ian Kelly

Apr 16, 2013, 7:14:45 PM
to Python
On Tue, Apr 16, 2013 at 4:38 PM, Mark Janssen <dreamin...@gmail.com> wrote:
> I think his point remains valid, from a theoretical pov. Python
> prides itself on the idea of "first-class functions" and such, but
> unlike the world of lambda calculus, this selling point is a bit
> invalid. Because for Python (and any C-based language), its roots are
> squarely in the Turing machine and its real-world implementation.

I'm having a hard time following what you're trying to say here.
Lambda calculus and Turing machines are theoretical models of
computation, not languages. You can model Lisp programs with a Turing
machine, and you can model C programs with lambda expressions.
Practically speaking you would probably have an easier time doing it
the other way around, due to the procedural nature of the Turing
machine versus the functional nature of the lambda calculus.

By the usual definition of "first-class function" [1], Python
functions are first-class; this has nothing to do with functional vs.
procedural programming (although it is more commonly found in the
former) or to do with Turing machines (which don't even include
functions as a concept).

> (Note this contrasts starkly with Java(script), which doesn't seem
> to be based on anything -- can anyone clarify where Java actually
> comes from?)

I don't understand why you would consider Python to be "C-based" or
"Turing machine-based" but not Java or Javascript.

[1] http://en.wikipedia.org/wiki/First-class_citizen

Mark Janssen

Apr 16, 2013, 7:16:42 PM
to DeLesley Hutchins, types...@lists.seas.upenn.edu, Python List
> I'm not quite sure I understand your question, but I'll give it a shot. :-)

Thank you, and my apologies for my late reply.

> The C/C++ model, in which the types are anchored to the machine hardware, in
> the exception, not the rule. In the academic literature, "type theory" is
> almost entirely focused on studying abstract models of computation that are
> purely mathematical, and bear no resemblance to the underlying hardware.
> The lambda calculus is the most general, and most commonly used formalism,
> but there are many others; e.g. Featherweight Java provides a formal model
> of objects and classes as they are used in Java.

Understood, but I feel this is where theory has gone too far away from
reality. Wikipedia (admittedly not an authoritative resource) lists
a clear distinction between languages rooted in the Turing machine
and those rooted in lambda calculus:

From: en.wikipedia.org: Programming_paradigm:

"A programming paradigm is a fundamental style of computer
programming. There are four main paradigms: object-oriented,
imperative, functional and declarative. Their foundations are distinct
models of computation: Turing machine for object-oriented and
imperative programming, lambda calculus for functional programming,
and first order logic for logic programming."

While I understand the interest in purely theoretical models, I wonder
two things: 1) Are these distinct models of computation valid? And,
2) If so, shouldn't a theory of types announce what model of
computation it is working from?

You say the C/C++ model is the exception, but in the programmer
community (excepting web-based languages) it is the opposite. The
machine model dominates. In fact, I'm not even sure how Java
operates, but through some sorcery I don't want to take part in.

> "Types and Programming Languages", by Benjamin Pierce, is an excellent
> introductory textbook which describes how various language features,
> including objects, can be formalized. If you are interested in OOP, Abadi
> and Cardelli's "Theory of Objects" is the obvious place to start, although
> I'd recommend reading Pierce's book first if you want to understand it. :-)
> Abadi and Cardelli discuss both class-based languages, and pure object
> languages. If you are interested in the type/object distinction in
> particular, then I'll shamelessly plug my own thesis: "Pure Subtype Systems"
> (available online), which describes a formal model in which types are
> objects, and objects are types. If you are familiar with the Self language,
> then you can think of it as a type system for Self.

Thank you very much. I will look for them.

> Once you have a type system in place, it's usually fairly straightforward to
> compile a language down to actual hardware. An interpreter, like that used
> in Python, is generally needed only for untyped or "dynamic" languages.
> There are various practical considerations -- memory layout, boxed or
> unboxed data types, garbage collection, etc. -- but the basic techniques are
> described in any compiler textbook. Research in the areas of "typed
> assembly languages" and "proof carrying code" are concerned with ensuring
> that the translation from high-level language to assembly language is sound,
> and well-typed at all stages.

Very interesting. I appreciate those leads....
--
MarkJ
Tacoma, Washington

Mark Janssen

Apr 16, 2013, 7:40:16 PM
to Uday S Reddy, types...@lists.seas.upenn.edu, Python List
On Mon, Apr 15, 2013 at 2:06 AM, Uday S Reddy <u.s....@cs.bham.ac.uk> wrote:
> In programming language theory, there is no law to the effect that
> "everything" should be of one kind or another. So, we would not go with
> Alan Kay's ideal.

I understand. I state Kay's points to show how the evolution of (this
part of) the programming language world *in practice* has gone in its
explorations.

> Having said that, theorists do want to unify concepts wherever possible and
> wherever they make sense. Imperative programming types, which I will call
> "storage types", are semantically the same as classes.

I like that word "storage type", it makes it much clearer what one is
referring to.

I feel like I'm having to "come up to speed" with the academic
community, but wonder how and why this large chasm happened between
the applied community and the theoretical. In my mind, despite the
ideals of academia, students graduate and they inevitably come to work
on Turing machines of some kind (Intel hardware, for example,
currently dominates). If this is not in some way part of some
"ideal", why did the business community adopt and deploy these most
successfully? Or is it, in some *a priori* way, not possible to apply
the abstract notions in academia into the real-world?

> Bare storage types
> have predefined operations for 'getting' and 'setting' whereas classes allow
> user-defined operations. So, the distinction made between them in typical
> programming languages is artificial and implementation-focused. C and C++
> are especially prone to this problem because they were designed for writing
> compilers and operating systems where proximity to the machine architecture
> seems quite necessary. The higher-level languages such as Java are moving
> towards abolishing the distinction.

Right, same with Python, but IMO this is where the evolution of
programming languages is going awry. As languages move away from the
machine, they're increasingly based on different and disparate notions
of types. From a practical standpoint, this makes interoperability
and OOP's desire for "re-usability" recede.

> Here are a couple of references in theoretical work that might be helpful in
> understanding these connections:

Thank you for those references. I will look into them.

--
MarkJ
Tacoma, Washington

Ian Kelly

Apr 16, 2013, 7:52:46 PM
to Python
On Tue, Apr 16, 2013 at 5:16 PM, Mark Janssen <dreamin...@gmail.com> wrote:
> Understood, but I feel this is where theory has gone too far away from
> reality.

How so? Turing machines and lambda calculus were both invented in the
30s, before any real mechanical computers existed. If anything, it is
programming that has strayed too far from theory. ;-)

> While I understand the interest in purely theoretical models, I wonder
> two things: 1) Are these distinct models of computation valid?

If they provide an algorithmic/mechanical means of solving problems,
then they are valid models of computation. An abacus is a valid model
of computation, although I wouldn't want to compute a million digits
of pi on it. I may be missing what you really mean by "valid" here.

> 2) If so, shouldn't a theory of types announce what model of
> computation they are working from?
>
> You say the C/C++ model is the exception, but in the programmer
> community (excepting web-based languages) it is the opposite. The
> machine model dominates.

In C/C++ the types are based on the hardware. You have 16-bit ints
and 32-bit ints and 64-bit floats and what not because those are the
types that fit neatly in the storage units of the hardware.

In higher-level languages like Python, the types are abstract, not
based on the hardware. Python doesn't have 32-bit ints, except as an
implementation detail that the programmer need not be aware of.
Python ints have arbitrary precision.
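
For instance, a trivial sketch:

print(2 ** 100)  # 1267650600228229401496703205376, computed exactly;
                 # a fixed-width C int would have overflowed long ago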

> In fact, I'm not even sure how Java
> operates, but through some sorcery I don't want to take part in.

It compiles to an intermediate language that is executed by a virtual
machine, much like Python.

Ian Kelly

Apr 16, 2013, 8:00:54 PM
to Python
On Tue, Apr 16, 2013 at 5:40 PM, Mark Janssen <dreamin...@gmail.com> wrote:
> I feel like I'm having to "come up to speed" of the academic
> community, but wonder how and why this large chasm happened between
> the applied community and the theoretical. In my mind, despite the
> ideals of academia, students graduate and they inevitably come to work
> on Turing machines of some kind (Intel hardware, for example,
> currently dominates).

Modern computers are based on the von Neumann architecture, not Turing machines.

> If this is not in some way part of some
> "ideal", why did the business community adopt and deploy these most
> successfully? Or is it, in some *a priori* way, not possible to apply
> the abstract notions in academia into the real-world?

Theory of computation is mostly about determining what kinds of
problems fundamentally can or cannot be solved by computation. The
models used are designed to be reasoned about, not to be used for
solving real-world programming tasks.

Steven D'Aprano

Apr 17, 2013, 2:40:42 AM
On Tue, 16 Apr 2013 15:38:29 -0700, Mark Janssen wrote:

> On Mon, Apr 15, 2013 at 3:32 PM, Chris Angelico <ros...@gmail.com>
> wrote:
>> On Tue, Apr 16, 2013 at 8:12 AM, Rotwang <sg...@hotmail.co.uk> wrote:
>>> Traceback (most recent call last):
>>> File "<pyshell#2>", line 1, in <module>
>>> class C(type(lambda: None)):
>>> TypeError: type 'function' is not an acceptable base type
>>>
>>>
>>> and I don't think that FunctionType would be considered an "internal
>>> detail", would it? Not that I'd cite the fact that not all types can
>>> be inherited from as evidence that types and classes are not synonyms,
>>> mind.
>>
>> Actually, I'm not sure how you'd go about inheriting from a function.
>> Why not just create a bare class, then assign its __call__ to be the
>> function you're inheriting from?
>
> I think his point remains valid, from a theoretical pov. Python prides
> itself on the idea of "first-class functions" and such, but unlike the
> world of lambda calculus, this selling point is a bit invalid.


Python functions are first-class functions, which is short-hand for
saying "functions which are also capable of being treated as values,
which means they can be created at run-time, returned from functions,
passed around as arguments, and assigned to variables".
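
Each of those capabilities is easy to exhibit (a short sketch):

def make_adder(n):            # created at run-time...
    def add(x):
        return x + n
    return add                # ...and returned from a function

plus_two = make_adder(2)      # assigned to a variable
print(list(map(plus_two, [1, 2, 3])))  # passed as an argument: [3, 4, 5]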

Python's function type is not a first-class object-type, because it
cannot be subclassed in at least three of the main implementations. But
this has nothing to do with whether or not functions are first-class
functions, which is an unrelated meaning. One can conceive of a language
where FunctionType is a first-class type capable of being subclassed, but
functions are *not* first-class values capable of being passed around as
arguments.



> Because for Python (and any C-based language),

Python-the-language is not C-based, or at least, very little of Python is
C-based. Its main influences are, according to GvR, Lisp and ABC, with
Pascal, Haskell and of course C also having some influence. Syntax-wise,
it is much more of an Algol-inspired language than a C-inspired language.



> its roots are squarely in the
> Turing machine and its real-world implementation.

Python is certainly not inspired by Turing machines. Since a Turing
machine is a simple device with an infinitely long paper tape which can
have marks added and deleted from it, very few real programming languages
are based on Turing machines.

It is, however, Turing-complete. Just like every programming language
worthy of the name, whether it has syntax like Python, C, Lisp, Forth,
INTERCAL, Oook, Applescript, Inform-7, Java, PHP, or x86 assembler.


> (Note this contrasts starkly with Java(script), which doesn't seem
> to be based on anything -- can anyone clarify where Java actually comes
> from?)

C.



--
Steven

Chris Angelico

Apr 17, 2013, 2:56:45 AM
to pytho...@python.org
> C.

offee.

ChrisA

Chris Rebert

Apr 17, 2013, 3:16:21 AM
to Python
On Tue, Apr 16, 2013 at 11:40 PM, Steven D'Aprano
<steve+comp....@pearwood.info> wrote:
> On Tue, 16 Apr 2013 15:38:29 -0700, Mark Janssen wrote:
<snip>
> > (Note this contrasts starkly with Java(script), which doesn't seem
> > to be based on anything -- can anyone clarify where Java actually comes
> > from?)
>
> C.

"Influenced by: Ada 83, C++, C#, Eiffel, Generic Java, Mesa, Modula-3,
Oberon, Objective-C, UCSD Pascal, Smalltalk"
"Categories: C programming language family | [...]"
http://en.wikipedia.org/wiki/Java_(programming_language)

Sincerely,
Chris
--
Read Wikipedia's infoboxes! People work hard on them!

Uday S Reddy

Apr 17, 2013, 5:10:58 AM
to Mark Janssen, types...@lists.seas.upenn.edu, Uday S Reddy, Python List
Mark Janssen writes:

> > Having said that, theorists do want to unify concepts wherever possible
> > and wherever they make sense. Imperative programming types, which I
> > will call "storage types", are semantically the same as classes.
>
> I like that word "storage type", it makes it much clearer what one is
> referring to.

Indeed. However, this is not the only notion of type in imperative
programming languages. For example, a function type in C or its descendants
is not there to describe storage, but to describe the interface of an
abstraction. I will use Reynolds's term "phrase types" to refer to such
types. Reynolds's main point in "The Essence of Algol" was to say that
phrase types are much more general, and a language can be built around them
in a streamlined way. Perhaps "Streamlining Algol" would have been a more
descriptive title for his paper. Nobody should be designing an imperative
programming language without having read "The Essence of Algol", but they
do.

Whether storage types (and their generalization, class types) should be
there in a type system at all is an open question. I can think of arguments
both ways. In Java, classes are types. So are interfaces (i.e., phrase
types). I think Java does a pretty good job of combining the two
in a harmonious way.

If you have trouble getting hold of "The Essence of Algol", please write to
me privately and I can send you a scanned copy. The Handout 5B in my
"Principles of Programming Languages" lecture notes is a quick summary of
the Reynolds's type system.

http://www.cs.bham.ac.uk/~udr/popl/index.html

> I feel like I'm having to "come up to speed" of the academic
> community, but wonder how and why this large chasm happened between
> the applied community and the theoretical. In my mind, despite the
> ideals of academia, students graduate and they inevitably come to work
> on Turing machines of some kind (Intel hardware, for example,
> currently dominates). If this is not in some way part of some
> "ideal", why did the business community adopt and deploy these most
> successfully? Or is it, in some *a priori* way, not possible to apply
> the abstract notions in academia into the real-world?

The chasms are too many, not only between theoretical and applied
communities, but within each of them. My feeling is that this is
inevitable. Our field progresses too fast for people to sit back, take
stock of what we have and reconcile the multiple points of view.

There is nothing wrong with "Turing machines". But the question in
programming language design is how to integrate the Turing machine concepts
with all the other abstractions we need (functions/procedures, modules,
abstract data types etc.), i.e., how to fit the Turing machine concepts into
the "big picture". That is not an easy question to resolve, and there isn't
a single way of doing it. So you see multiple approaches being used in the
practical programming languages, some cleaner than the others.

The abstract notions of academia do make it into the real world, but rather
more slowly than one would hope. Taking Java for example, the initial
versions of Java treated interfaces in a half-hearted way, ignored
generics/polymorphism, and ignored higher-order functions. But all of them
are slowly making their way into Java, with pressure not only from the
academic community but also through competition from other "practical"
languages like Python, C# and Scala. If this kind of progress continues,
that is the best we can hope for in a fast-paced field like ours.

Cheers,
Uday Reddy

--
Prof. Uday Reddy Tel: +44 121 414 2740
Professor of Computer Science Fax: +44 121 414 4281
School of Computer Science
University of Birmingham Email: U.S....@cs.bham.ac.uk
Edgbaston
Birmingham B15 2TT Web: http://www.cs.bham.ac.uk/~udr

Uday S Reddy

Apr 17, 2013, 5:30:36 AM
to Mark Janssen, types...@lists.seas.upenn.edu, Python List, DeLesley Hutchins
Mark Janssen writes:

> From: en.wikipedia.org: Programming_paradigm:
>
> "A programming paradigm is a fundamental style of computer
> programming. There are four main paradigms: object-oriented,
> imperative, functional and declarative. Their foundations are distinct
> models of computation: Turing machine for object-oriented and
> imperative programming, lambda calculus for functional programming,
> and first order logic for logic programming."
>
> While I understand the interest in purely theoretical models, I wonder
> two things: 1) Are these distinct models of computation valid? And,
> 2) If so, shouldn't a theory of types announce what model of
> computation they are working from?

These distinctions are not fully valid.

- Functional programming, logic programming and imperative programming are
three different *computational mechanisms*.

- Object-orientation and abstract data types are two different ways of
building higher-level *abstractions*.

The authors of this paragraph did not understand that computational
mechanisms and higher-level abstractions are separate, orthogonal dimensions
in programming language design. All six combinations, obtained by picking a
computational mechanism from the first bullet and an abstraction mechanism
from the second bullet, are possible. It is a mistake to put
object-orientation in the first bullet. Their idea of "paradigm" is vague
and ill-defined.

Cheers,
Uday Reddy

Antoon Pardon

Apr 17, 2013, 8:04:48 AM
to pytho...@python.org
On 16-04-13 18:49, Terry Jan Reedy wrote:
> On 4/16/2013 5:07 AM, Antoon Pardon wrote:
>> On 16-04-13 05:17, Terry Jan Reedy wrote:
>>
>>> I will keep the above in mind if I write or review a patch. here are 4
>>> non-subclassable builtin classes. Two are already documented. Bool in
>>> one, forget which other. I believe it was recently decided to leave
>>> the other two as is given the absence of any practical use case.
>>
>> Why should there be a practical use case here?
>
> As a practical matter, the change is non-trivial. Someone has to be
> motivated to write the patch to enable subclassing, write tests, and
> consider the effect on internal C uses of slice and stdlib Python used
> of slice (type() versus isinstance).
I see. It seems I have underestimated the work involved.

>> I once had an idea of a slice-like class that I would have liked to
>> experiment with.
>
> Did the idea actually require that instances *be* a slice rather than
> *wrap* a slice?

As far as I remember, I wanted my slice object to be usable for slicing
lists. But Python doesn't allow duck typing when you use your object to
"index" a list. No matter how much your object resembles a slice, when
you actually try to use it to get a slice of a list, Python throws a
TypeError with the message "object cannot be interpreted as an index".
This, in combination with slice not being subclassable, effectively
killed the idea.

As I already said, I don't know if the idea would have turned up
something useful. In the years since, I never had the feeling that it
would have been great had I been able to pursue the idea. I just
thought it was a pity I was so thoroughly stopped at the time.

--
Antoon Pardon

Rishiyur Nikhil

Apr 17, 2013, 10:04:02 AM
to Uday S Reddy, types...@lists.seas.upenn.edu, Python List

>    If you have trouble getting hold of "The Essence of Algol", ...

There seems to be a downloadable copy at:


It's in PostScript, which is easily convertible to PDF if you wish.

Nikhil



Andreas Abel

unread,
Apr 17, 2013, 10:42:25 AM4/17/13
to Uday S Reddy, types...@lists.seas.upenn.edu, Python List
On 17.04.2013 11:30, Uday S Reddy wrote:
> Mark Janssen writes:
>
>> From: en.wikipedia.org: Programming_paradigm:
>>
>> "A programming paradigm is a fundamental style of computer
>> programming. There are four main paradigms: object-oriented,
>> imperative, functional and declarative. Their foundations are distinct
>> models of computation: Turing machine for object-oriented and
>> imperative programming, lambda calculus for functional programming,
>> and first order logic for logic programming."

I removed the second sentence relating paradigms to computation models
and put it on the talk page instead. It does not make sense to connect
imperative programming to Turing machines in the way functional
programming connects to the lambda calculus. A better match would be
random access machines, but
the whole idea of a connection between a programming paradigm and a
computation model is misleading.

>> While I understand the interest in purely theoretical models, I wonder
>> two things: 1) Are these distinct models of computation valid? And,
>> 2) If so, shouldn't a theory of types announce what model of
>> computation they are working from?
>
> These distinctions are not fully valid.
>
> - Functional programming, logic programming and imperative programming are
> three different *computational mechanisms*.
>
> - Object-orientation and abstract data types are two different ways of
> building higher-level *abstractions*.
>
> The authors of this paragraph did not understand that computational
> mechanisms and higher-level abstractions are separate, orthogonal dimensions
> in programming language design. All six combinations, obtained by picking a
> computational mechanism from the first bullet and an abstraction mechanism
> from the second bullet, are possible. It is a mistake to put
> object-orientation in the first bullet. Their idea of "paradigm" is vague
> and ill-defined.
>
> Cheers,
> Uday Reddy
>


--
Andreas Abel <>< You are the beloved person.

Theoretical Computer Science, University of Munich
Oettingenstr. 67, D-80538 Munich, GERMANY

andrea...@ifi.lmu.de
http://www2.tcs.ifi.lmu.de/~abel/

Michael Torrie

unread,
Apr 18, 2013, 12:37:17 PM4/18/13
to pytho...@python.org
On 04/16/2013 04:38 PM, Mark Janssen wrote:
> (Note this contrasts starkly with Java(script), which doesn't seem
> to be based on anything -- can anyone clarify where Java actually
> comes from?)

Java is in no way equal to JavaScript. The only things they share
are semicolons and braces. Naming ECMAScript "JavaScript" was a very
unfortunate thing indeed.

For the record, JavaScript is what they call a "prototype-based
language." http://en.wikipedia.org/wiki/Prototype-based_programming.
You can emulate an OOP system with a prototype-based language.
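
A rough sketch of that emulation in Python, for the curious; the Proto
class and its clone method are illustrative names, not any standard API:

class Proto:
    # Objects delegate missing attributes to their prototype,
    # roughly as JavaScript delegates along the prototype chain.
    def __init__(self, proto=None, **slots):
        self._proto = proto
        self.__dict__.update(slots)

    def __getattr__(self, name):
        # Called only when normal lookup fails: walk the chain.
        if self._proto is not None:
            return getattr(self._proto, name)
        raise AttributeError(name)

    def clone(self, **slots):
        # A new object whose prototype is this one; no class involved.
        return Proto(proto=self, **slots)

animal = Proto(legs=4, noise="...")
dog = animal.clone(noise="woof")
print(dog.legs, dog.noise)   # 4 woof -- legs delegated, noise overridden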

I highly recommend you read a book on formal programming language theory
and concepts.

Neil Cerutti

unread,
Apr 18, 2013, 1:57:13 PM4/18/13
to
Let me recommend Concepts, Techniques and Models of Computer
Programming, Van Roy and Haridi.

http://www.info.ucl.ac.be/~pvr/book.html

--
Neil Cerutti

Jason Wilkins

unread,
Apr 18, 2013, 2:48:20 PM4/18/13
to Andreas Abel, types...@lists.seas.upenn.edu, Uday S Reddy, Python List
Warning, this is a bit of a rant.

That paragraph from Wikipedia seems to be confused.  It gives the fourth paradigm as "declarative" but then says "first order logic for logic programming".  It seems somebody did an incomplete replacement of "logic" by "declarative".  Wikipedia is often schizophrenic like that.

Personally, I think that object oriented and logical programming only became official paradigms because there was a certain level of hype for them in the 1980s and nobody has thought to strike them off the list after the hype died down.

Object-oriented, as constituted today, is just a layer of abstraction over imperative programming (or imperative style programming in functional languages, because objects require side-effects).  What "object-oriented" language actually in use now isn't just an imperative language with fancy abstraction mechanisms?

The problem with having declarative languages as a paradigm (of which logical languages would be a part) is that it feels like it should be a "miscellaneous" category.  Being declarative doesn't tell you much except that some machine is going to turn your descriptions of something into some kind of action.  In logical programming it is a set of predicates, but it could just as easily be almost anything else.  In a way all languages are "declarative"; it is just that we have some standard interpretations of what is declared that are very common (imperative and functional).

My wish is that the idea of there being four paradigms would be abandoned the same way the idea of four food groups has been abandoned (which may surprise some of you).  We have more than four different modes of thinking when programming, and some are much more important than others and some are subsets of others.  We should teach students a more sophisticated view.

Ironically Wikipedia also shows us this complexity.  The programming language paradigm side bar actually reveals the wealth of different styles that are available.  There is simply no clean and useful way to overlay the four paradigms over what we see there, so it should be abandoned because it gives students a false idea.



Robert Harper

unread,
Apr 18, 2013, 5:15:15 PM4/18/13
to Jason Wilkins, types...@lists.seas.upenn.edu, Python List, Andreas Abel
The term "declarative" never meant a damn thing, but was often used, absurdly, to somehow lump together functional programming with logic programming, and separate it from imperative programming. It never made a lick of sense; it's just a marketing term.

Bob Harper
>> [ The Types Forum, http://lists.seas.upenn.edu/mailman/listinfo/types-list ]
>> http://www2.tcs.ifi.lmu.de/~abel/
>>

Robert Harper

unread,
Apr 18, 2013, 5:14:13 PM4/18/13
to Jason Wilkins, types...@lists.seas.upenn.edu, Python List, Andreas Abel
In short, there is no such thing as a "paradigm". I agree fully. This term is a holdover from the days when people spent time and space trying to build taxonomies based on ill-defined superficialities. See Steve Gould's essay "What, If Anything, Is A Zebra?". You'll enjoy learning that there is, in fact, no such thing as a zebra---there are, rather, three different striped horse-like mammals, two of which are genetically related, and one of which is not. The propensity to be striped, like the propensity to have five things (fingers, segments, whatever) is a deeply embedded genetic artifact that expresses itself in various ways.

Mark Janssen

unread,
Apr 18, 2013, 6:53:15 PM4/18/13
to Types List, Moez Abdel-Gawad, Python List, DeLesley Hutchins, moez...@live.com
On Mon, Apr 15, 2013 at 2:53 AM, Moez AbdelGawad <moez...@outlook.com> wrote:
>> I'm not quite sure I understand your question, but I'll give it a shot.
>> :-)
>
> I'm in this same camp too :)

I am very thankful for the references given by everyone.
Unfortunately my library does not have the titles and it will be some
time before I can acquire them. I hope it not too intrusive to offer
a few points that I've garnered from this conversation until I can
study the history further.

The main thing that I notice is that there is a heavy "bias" in
academia towards mathematical models. I understand that Turing
Machines, for example, were originally abstract computational concepts
before there was an implementation in hardware, so I have some
sympathies with that view, yet, should not the "Science" of "Computer
Science" concern itself with how to map these abstract computational
concepts into actual computational hardware? Otherwise, why not keep
the field within mathematics and philosophy (where Logic traditionally
has been)? I find it remarkable, for example, that the simple
continued application of And/Or/Not gates can perform all the
computation that C.S. concerns itself with and these form the basis
for computer science in my mind, along with Boolean logic. (The
implementation of digital logic into physical hardware is where C.S.
stops and Engineering begins, I would argue.)
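
As a small illustration of that claim, here is a one-bit full adder
built from nothing but those three gates; a sketch, with function names
chosen only for readability:

def NOT(a):    return 1 - a
def AND(a, b): return a & b
def OR(a, b):  return a | b

def XOR(a, b):
    # Derived gate: (a or b) and not (a and b).
    return AND(OR(a, b), NOT(AND(a, b)))

def full_adder(a, b, carry_in):
    partial = XOR(a, b)
    total = XOR(partial, carry_in)
    carry_out = OR(AND(a, b), AND(partial, carry_in))
    return total, carry_out

for bits in [(0, 0, 0), (0, 1, 0), (1, 1, 0), (1, 1, 1)]:
    print(bits, "->", full_adder(*bits))   # (sum, carry)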

But still, it seems that there are two ends, two poles, to the whole
computer science enterprise that haven't been sufficiently *separated*
so that they can be appreciated: logic gates vs. logical "calculus"
and symbols. There is very little crossover as I can see. Perhaps
the problem is the common use of the Greek root "logikos"; in the
former, it pertains to binary arithmetic, where in the latter, it
retains its original Greek sense pertaining to *speech* and symbols
("logos"). Further, one can notice that in the former, the progression
has been towards more sophisticated Data Structures (hence the
evolution towards Object-Orientation), where in the latter (I'm
guessing, since it's not my area of expertise) the progression has
been towards function sophistication (where recursion seems to be
paramount).

In any case, I look forward to diving into the books and references
you've all offered so generously so that I can appreciate the field
and its history better.

Mark Janssen
Pacific Lutheran University
Tacoma, Washington

Ian Kelly

unread,
Apr 18, 2013, 7:30:48 PM4/18/13
to Python
On Thu, Apr 18, 2013 at 4:53 PM, Mark Janssen <dreamin...@gmail.com> wrote:
> The main thing that I notice is that there is a heavy "bias" in
> academia towards mathematical models. I understand that Turing
> Machines, for example, were originally abstract computational concepts
> before there was an implementation in hardware, so I have some
> sympathies with that view, yet, should not the "Science" of "Computer
> Science" concern itself with how to map these abstract computational
> concepts into actual computational hardware?

Why? You seem to have a notion that theoretical computer science is
ultimately about programming. It's not, any more than theoretical
physics is ultimately about how to build skyscrapers. Theoreticians
don't discuss complicated languages like Python because it would be
difficult to prove anything about computation using them. Programmers
don't use constructs like Turing machines because they're not
practical or useful for doing actual programming with. We're talking
about two different groups of people who use different tools because
they have very different objectives.

> Otherwise, why not keep
> the field within mathematics and philosophy (where Logic traditionally
> has been)?

Well now, that's an age-old debate. Ultimately what we call "computer
science" does not encompass one single discipline. But I think they
are generally kept under one academic umbrella because they are
closely related, and there is value in working with colleagues in
separate sub-fields. Certainly there is value in being passingly
familiar with the theory side of things if one is going to be
designing languages and writing parsers. Less so if one is primarily
occupied with building inventory systems.

> But still, it seems that there are two ends, two poles, to the whole
> computer science enterprise that haven't been sufficiently *separated*
> so that they can be appreciated: logic gates vs. logical "calculus"
> and symbols. There is very little crossover as I can see. Perhaps
> the problem is the common use of the Greek root "logikos"; in the
> former, it pertains to binary arithmetic, where in the latter, it
> retains it's original Greek pertaining to *speech* and symbols,
> "logos"). Further, one can notice that in the former, the progression
> has been towards more sophisticated Data Structures (hence the
> evolution towards Object-Orientation), where in the latter (I'm
> guessing, since it's not my area of expertise) the progression has
> been towards function sophistication (where recursion seems to be
> paramount).

Okay, you've lost me again. What do logic gates have to do with data
structures and OOP? What does symbolic logic have to do with
functional programming?

Steven D'Aprano

unread,
Apr 18, 2013, 9:00:48 PM4/18/13
to
On Thu, 18 Apr 2013 10:37:17 -0600, Michael Torrie wrote:

> For the record, JavaScript is what they call a "prototype-based
> language." http://en.wikipedia.org/wiki/Prototype-based_programming.
> You can emulate an OOP system with a prototype-based language.

Prototype languages *are* OOP. Note that it is called OBJECT oriented
programming, not class oriented, and prototype-based languages are based
on objects just as much as class-based languages. They are merely two
distinct models for OOP.




--
Steven

Roy Smith

unread,
Apr 18, 2013, 9:08:22 PM4/18/13
to
In article <51709740$0$29977$c3e8da3$5496...@news.astraweb.com>,
Steven D'Aprano <steve+comp....@pearwood.info> wrote:

One of the nice things about OOP is it means so many different things to
different people. All of whom believe with religious fervor that they
know the true answer.

Mark Janssen

unread,
Apr 18, 2013, 9:24:50 PM4/18/13
to Roy Smith, pytho...@python.org
> One of the nice things about OOP is it means so many different things to
> different people. All of whom believe with religious fervor that they
> know the true answer.

Here's a simple rule to resolve the ambiguity. Whoever publishes
first, gets to claim origin of a word and its usage, kind of like a
BDFL. The rest can adapt around that, make up their own word, or be
corrected as the community requires.

--
MarkJ
Tacoma, Washington

Ned Batchelder

unread,
Apr 18, 2013, 10:10:07 PM4/18/13
to Mark Janssen, pytho...@python.org
On 4/18/2013 9:24 PM, Mark Janssen wrote:
>> One of the nice things about OOP is it means so many different things to
>> different people. All of whom believe with religious fervor that they
>> know the true answer.
> Here's a simple rule to resolve the ambiguity. Whoever publishes
> first, gets to claim origin of a word and its usage, kind of like a
> BDFL. The rest can adapt around that, make up their own word, or be
> corrected as the community requires.
>

You won't solve the problem of confusing, ambiguous, or conflicting
terminology by making up a rule. "Object-oriented" means subtly
different things to different people. It turns out that computing is a
complex field with subtle concepts that don't always fit neatly into a
categorization. Python, Java, Javascript, Ruby, Smalltalk, Self, PHP,
C#, Objective-C, and C++ are all "object-oriented", but they also all
have differences between them. That's OK. We aren't going to make up a
dozen words as alternatives to "object-oriented", one for each language.

You seem to want to squeeze all of computer science and programming into
a tidy hierarchy. It won't work, it's not tidy. I strongly suggest you
read more about computer science before forming more opinions. You have
a lot to learn ahead of you.

--Ned.

Mark Janssen

unread,
Apr 18, 2013, 10:30:39 PM4/18/13
to Ned Batchelder, pytho...@python.org
On Thu, Apr 18, 2013 at 7:10 PM, Ned Batchelder <n...@nedbatchelder.com> wrote:
> You won't solve the problem of confusing, ambiguous, or conflicting
> terminology by making up a rule. "Object-oriented" means subtly different
> things to different people.

That's a problem, not a solution.

> It turns out that computing is a complex field
> with subtle concepts that don't always fit neatly into a categorization.

But that is the point of having a *field*.

> Python, Java, Javascript, Ruby, Smalltalk, Self, PHP, C#, Objective-C, and
> C++ are all "object-oriented", but they also all have differences between
> them. That's OK. We aren't going to make up a dozen words as alternatives
> to "object-oriented", one for each language.

Well, you won't, but other people *in the field* already have,
fortunately. They have names like dynamically-typed,
statically-typed, etc.

> You seem to want to squeeze all of computer science and programming into a
> tidy hierarchy.

No on "squeeze" and "tidy". Maybe on "hierarchy".

> It won't work, it's not tidy. I strongly suggest you read
> more about computer science before forming more opinions. You have a lot to
> learn ahead of you.

Okay, professor is it, master? What is your provenance anyway?

> --Ned.

-- :)



--
MarkJ
Tacoma, Washington

Ned Batchelder

unread,
Apr 18, 2013, 10:39:42 PM4/18/13
to Mark Janssen, pytho...@python.org
On 4/18/2013 10:30 PM, Mark Janssen wrote:
> Okay, professor is it, master? What is your provenance anyway?

I'm not a professor, I'm a software engineer. I'm just trying to help.
You've made statements that strike me as half-informed. You're trying to
unify concepts that perhaps can't or shouldn't be unified. You've
posted now to three different mailing lists about your thoughts on the
nature of programming languages and object-orientation, and have
apparently had little success in interesting people in them. A number
of people have suggested that you study more about the topics you're
interested in, and I agree with them.

--Ned.

rusi

unread,
Apr 18, 2013, 11:35:17 PM4/18/13
to
On Apr 19, 3:53 am, Mark Janssen <dreamingforw...@gmail.com> wrote:
> On Mon, Apr 15, 2013 at 2:53 AM, Moez AbdelGawad <moeza...@outlook.com> wrote:
> >> I'm not quite sure I understand your question, but I'll give it a shot.
> >> :-)
>
> > I'm in this same camp too :)
>
> I am very thankful for the references given by everyone.
> Unfortunately my library does not have the titles and it will be some
> time before I can acquire them.  I hope it not too intrusive to offer
> a few points that I've garnered from this conversation until I can
> study the history further.


You may want to see this: http://www.infoq.com/presentations/Functional-Thinking


>
> The main thing that I notice is that there is a heavy "bias" in
> academia towards mathematical models.

Yeah, wonderful observation. Let's clean up!

If I have a loop:

while i < len(a) and a[i] != x:
    i += 1

I need to understand that at the end of the loop:
    i >= len(a) or a[i] == x
and not
    i >= len(a) and a[i] == x
nor
    i == len(a) or a[i] == x    # What if I forgot to initialize i?

Now why bother to teach students such a silly thing (and silly name)
as deMorgan?
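
The silly thing is, at least, easy to check exhaustively; a minimal
sketch, using nothing beyond the standard library:

# De Morgan's laws, verified over every combination of truth values.
from itertools import product

for p, q in product([False, True], repeat=2):
    assert (not (p and q)) == ((not p) or (not q))
    assert (not (p or q)) == ((not p) and (not q))
print("De Morgan holds for all truth values")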

So all hail to your project of cleaning up the useless math from CS.
And to whet your appetite for the grandeur and glory of your
visionings why not start with making music schools enroll tone-deaf
students? Why wasn't Beethoven deaf?

>  I understand that Turing
> Machines, for example, were originally abstract computational concepts
> before there was an implementation in hardware, so I have some
> sympathies with that view, yet, should not the "Science" of "Computer
> Science" concern itself with how to map these abstract computational
> concepts into actual computational hardware?  Otherwise, why not keep
> the field within mathematics and philosophy (where Logic traditionally
> has been)?   I find it remarkable, for example, that the simple
> continued application of And/Or/Not gates can perform all the
> computation that C.S. concerns itself with and these form the basis
> for computer science in my mind, along with Boolean logic.  (The
> implementation of digital logic into physical hardware is where C.S.
> stops and Engineering begins, I would argue.)

You need to study some history (or is that irrelevant like math?)
The Turing who invented the Turing machine in 1936 led the code-
cracking efforts of the Allies a couple of years later.
Do you allow for the fact that he may have had abilities that were
common to both aka 'math' 'theory' etc?
Or do you believe that winning wars is a theoretical and irrelevant
exercise?

>
> But still, it seems that there are two ends, two poles, to the whole
> computer science enterprise that haven't been sufficiently *separated*
> so that they can be appreciated:  logic gates vs. logical "calculus"
> and symbols.   There is very little crossover as I can see.  Perhaps
> the problem is the common use of the Greek root "logikos"; in the
> former, it pertains to binary arithmetic, where in the latter, it
> retains its original Greek sense pertaining to *speech* and symbols
> ("logos").


Yes there is some truth in what you say. Just call it logic as object-
language (what you call logic-gates) and logic as meta-language ie
logic for reasoning
[the above line is not sarcastic]



> Further, one can notice that in the former, the progression
> has been towards more sophisticated Data Structures (hence the
> evolution towards Object-Orientation), where in the latter (I'm
> guessing, since it's not my area of expertise) the progression has
> been towards function sophistication (where recursion seems to be
> paramount).

Also good to study the views of one of the doyens of OOP:
http://en.wikipedia.org/wiki/Alexander_Stepanov#Criticism_of_OOP

Steven D'Aprano

unread,
Apr 18, 2013, 11:38:34 PM4/18/13
to
On Thu, 18 Apr 2013 19:30:39 -0700, Mark Janssen wrote:

> On Thu, Apr 18, 2013 at 7:10 PM, Ned Batchelder <n...@nedbatchelder.com>
> wrote:
>> You won't solve the problem of confusing, ambiguous, or conflicting
>> terminology by making up a rule. "Object-oriented" means subtly
>> different things to different people.
>
> That's a problem, not a solution.

It's a fact, not necessarily a problem.

"Sandwich" means subtly different things to different people in different
places, but human beings manage to cope, and very few people accidentally
eat a sandwich of differently doped silicon crystals (i.e. a transistor)
when they intended to eat a sandwich of bread and cheese.

So long as people recognise the existence and nature of these subtle
differences, it's all good. Java's OOP model is different from Python's,
which is different from Lua's, which is different from Smalltalk's.
That's all grand, they all have their strengths and weaknesses, and if
all programming languages were the same, there would only be one. (And it
would probably be PHP.)


>> It turns out that computing is a complex field
>> with subtle concepts that don't always fit neatly into a
>> categorization.
>
> But that is the point of having a *field*.

Reality is the way it is. However we would like fields of knowledge to
neatly fit inside pigeonholes, they don't.


>> Python, Java, Javascript, Ruby, Smalltalk, Self, PHP, C#, Objective-C,
>> and C++ are all "object-oriented", but they also all have differences
>> between them. That's OK. We aren't going to make up a dozen words as
>> alternatives to "object-oriented", one for each language.
>
> Well, you won't, but other people *in the field* already have,
> fortunately. They have names like dynamically-typed, statically-typed,
> etc.

They are not names for variations of OOP. They are independent of whether
a language is OOP or not. For example:


Java is statically typed AND object oriented.
Haskell is statically typed but NOT object oriented.

Python is dynamically typed AND object oriented.
Scheme is dynamically typed but NOT object oriented.



--
Steven

Mark Janssen

unread,
Apr 18, 2013, 11:58:20 PM4/18/13
to rusi, pytho...@python.org
>> The main thing that I notice is that there is a heavy "bias" in
>> academia towards mathematical models.
>
> Yeah wonderful observation. Lets clean up!
>
> If I have a loop:
>
> while i < len(a) and a[i] != x:
>     i += 1
>
> I need to understand that at the end of the loop:
> i >= len(a) or a[i] == x
> and not
> i >= len(a) and a[i] == x
> nor
> i == len(a) or a[i] == x # What if I forgot to initialize i?

You know in my world, we have what's called Input/Output, rather than
punchcards or switchbanks where you come from. Why not: "print
i,a[i]". Done!

> Now why bother to teach students such a silly thing (and silly name)
> as deMorgan?

Well deMorgan falls into BooleanLogic which I'm arguing is distinct
from the mathematical realm where the lambda calculus wizards come
from. So that's my camp, thanks.

> So all hail to your project of cleaning up the useless math from CS.

Yes, on useless math, no on *useful* math. Thanks.

> And to whet your appetite for the grandeur and glory of your
> visionings why not start with making music schools enroll tone-deaf
> students? Why wasn't Beethoven deaf?

Beethoven was deaf.

> You need to study some history (or is that irrelevant like math?)
> The Turing who invented the Turing machine in 1936 led the code-
> cracking efforts of the allies a couple of years later.
> Do you allow for the fact that he may have had abilities that were
> common to both aka 'math' 'theory' etc?
> Or do you believe that winning wars is a theoretical and irrelevant
> exercise?

Please, I don't dismiss math anymore than a number theorist might
dismiss the realm of complex numbers.

> Yes there is some truth in what you say. Just call it logic as object-
> language (what you call logic-gates) and logic as meta-language ie
> logic for reasoning

Right, and I'm arguing that there hasn't been enough conceptual
separation between the two. So why are you arguing?

> Also good to study the views of one of the doyens of OOP:
> http://en.wikipedia.org/wiki/Alexander_Stepanov#Criticism_of_OOP

That's a very good reference. It voices some of my points that are in
criticism of python's object architecture.
--
MarkJ
Tacoma, Washington

Moez AbdelGawad

unread,
Apr 19, 2013, 2:09:50 AM4/19/13
to Mark Janssen, Types List, Python List


> Date: Thu, 18 Apr 2013 15:53:15 -0700
> From: dreamin...@gmail.com
> To: types...@lists.seas.upenn.edu
> Subject: Re: [TYPES] The type/object distinction and possible synthesis of OOP and imperative programming languages


>
> I am very thankful for the references given by everyone.
> Unfortunately my library does not have the titles and it will be some
> time before I can acquire them.

The official version of my PhD thesis is available at https://scholarship.rice.edu/handle/1911/70199

(A version more suitable for electronic browsing and online distribution is available at http://sdrv.ms/15qsJ5x )

-Moez

Jason Wilkins

unread,
Apr 19, 2013, 2:31:08 AM4/19/13
to Mark Janssen, Types List, Python List, moez...@live.com
I don't quite think I understand what you are saying.  Are you saying that mathematical models are not a good foundation for computer science because computers are really made out of electronic gates?

All I need to do is show that my model reduces to some basic physical implementation (with perhaps some allowances for infinity) and then I can promptly forget about that messy business and proceed to use my clean mathematical model.

The reason any model of computation exists is that it is easier to think about a problem in some terms than in others.  By showing how to transform one model to another you make it possible to choose exactly how you wish to solve a problem.

The reason we do not work directly in what are called "von Neumann machines" is that they are not convenient for all kinds of problems.  However we can build a compiler to translate anything to anything else, so I don't see why anybody would care.



Chris Angelico

unread,
Apr 19, 2013, 3:05:41 AM4/19/13
to pytho...@python.org
On Fri, Apr 19, 2013 at 1:35 PM, rusi <rusto...@gmail.com> wrote:
> If I have a loop:
>
> while i < len(a) and a[i] != x:
>     i += 1
>
> I need to understand that at the end of the loop:
> i >= len(a) or a[i] == x
> and not
> i >= len(a) and a[i] == x
> nor
> i == len(a) or a[i] == x # What if I forgot to initialize i?

Or your program has crashed out with an exception.

>>> i,a,x=-10,"test","q"
>>> while i < len(a) and a[i] != x:
i+=1

Traceback (most recent call last):
File "<pyshell#69>", line 1, in <module>
while i < len(a) and a[i] != x:
IndexError: string index out of range

Or if that had been in C, it could have bombed with a segfault rather
than a nice tidy exception. Definitely initialize i.

But yeah, the basis of algebra is helpful, even critical, to
understanding most expression evaluators.

ChrisA

Uday S Reddy

unread,
Apr 19, 2013, 3:21:09 AM4/19/13
to Mark Janssen, Types List, Python List, moez...@live.com
Mark Janssen writes:

> The main thing that I notice is that there is a heavy "bias" in
> academia towards mathematical models. I understand that Turing
> Machines, for example, were originally abstract computational concepts
> before there was an implementation in hardware, so I have some
> sympathies with that view, yet, should not the "Science" of "Computer
> Science" concern itself with how to map these abstract computational
> concepts into actual computational hardware?

I think there is some misunderstanding here. Being "mathematical" in
academic work is a way of making our ideas rigorous and precise, instead of
trying to peddle wooly nonsense. Providing a mathematical description does
not imply in any way that these ideas are not implementable on machines. In
fact, very often these mathematical descriptions state precisely how to
implement the concepts (called operational semantics), but using
mathematical notation instead of program code. The mathematical notation
used here is usually no more than high school set theory, used in a
stylized way.
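
As a toy illustration of that point, big-step evaluation rules for a
tiny expression language transcribe almost directly into Python. The
tuple representation and the function name here are assumptions of the
sketch, not anything standard:

def eval_expr(e):
    # Each branch transcribes one big-step evaluation rule.
    if isinstance(e, int):                    # n  =>  n
        return e
    op, e1, e2 = e
    if op == "add":                           # e1 => n1, e2 => n2
        return eval_expr(e1) + eval_expr(e2)  # add(e1, e2) => n1 + n2
    if op == "mul":
        return eval_expr(e1) * eval_expr(e2)
    raise ValueError("unknown expression form: %r" % (e,))

print(eval_expr(("add", 1, ("mul", 2, 3))))   # 7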

In contrast, there are deeper "mathematical models" (called denotational
semantics and axiomatic semantics) which are invented to describe how
programming language features work in a deep, intrinsic way. This is
similar to, for instance, how Physics invents mathematical models to capture
how the nature around us works. Physicists don't need to "implement"
nature. It has already been "implemented" for us before we are born.
However, to understand how it works, and how to design systems using
physical materials in a predictable way, we need the mathematical models
that Physics has developed.

Similarly, the mathematical models of programming languages help us to
obtain a deep understanding of how languages work and how to build systems
in a predictable, reliable way. It seems too much to expect, at the present
stage of our field, that all programmers should understand the mathematical
models. But I would definitely expect that programming language designers
who are trying to build new languages should understand the mathematical
models. Otherwise, they would be like automotive engineers trying to build
cars without knowing any Mechanics.

Claus Reinke

unread,
Apr 19, 2013, 4:25:42 AM4/19/13
to Mark Janssen, Types List, Python List, moez...@live.com
> The main thing that I notice is that there is a heavy "bias" in
> academia towards mathematical models. I understand that Turing
> Machines, for example, were originally abstract computational concepts
> before there was an implementation in hardware, so I have some
> sympathies with that view, yet, should not the "Science" of "Computer
> Science" concern itself with how to map these abstract computational
> concepts into actual computational hardware?

I prefer to think of Turing machines as an attempt to model existing
and imagined hardware (at the time, mostly human computers, or
groups of them with comparatively simple tools). See sections 1.
and 9. in

Turing, "On computable numbers, with an application to the
Entscheidungsproblem",
http://web.comlab.ox.ac.uk/oucl/research/areas/ieg/e-library/sources/tp2-ie.pdf

Modeling existing systems, in order to be able to reason about them,
is essential for science, as is translating models into experiments, in
order to compare predictions to reality.

Claus

Steven D'Aprano

unread,
Apr 19, 2013, 8:00:14 AM4/19/13
to
On Thu, 18 Apr 2013 17:14:13 -0400, Robert Harper wrote:

> In short, there is no such thing as a "paradigm".

Of course there is. A paradigm is a distinct way of thinking, a
philosophy if you will. To say that there is no such thing as a paradigm
is to say that all ways of thinking about a topic are the same, and
that's clearly nonsense.

OOP is a distinct paradigm from procedural programming, even though the
distinctions are minor when compared to those between imperative and
functional programming. Java's "everything in a class" style of
programming is very distinct from Pascal's "functions and records" style
of programming, even though both are examples of imperative programming.
They lead to different styles of coding, they have different strengths
and weaknesses, and even taking into account differences of syntax, they
differ in what you can do simply and naturally without the language
getting in the way.


> I agree fully. This
> term is a holdover from the days when people spent time and space trying
> to build taxonomies based on ill-defined superficialities. See Steve
> Gould's essay "What, If Anything, Is A Zebra?". You'll enjoy learning
> that there is, in fact, no such thing as a zebra---there are, rather,
> three different striped horse-like mammals, two of which are genetically
> related, and one of which is not.

All life on earth is genetically related. Even if the so-called "tree of
life" doesn't have a single common ancestor, it has a single set of
common ancestors.

In the case of the three species of zebra, they are all members of the
genus Equus, so they are actually *extremely* closely related. The
argument that "zebra is not a genealogical group" (which is very
different from the false statement that there is no such thing as a
zebra!) is that one of the three species of zebra is in fact more closely
related to non-zebras than the other two species of zebra.

Something like this tree:

Common ancestor of all Equus
|
+-- Common ancestor of Burchell's Zebras and Grevy's Zebras
| +-- Burchell's Zebra
| +-- Grevy's Zebra
|
+-- Common ancestor of horses and Mountain Zebras
+-- Horse
+-- Mountain Zebra

(I've left out asses and donkeys because I'm not sure where they fit in
relation to the others.)

There's no natural genealogical group that includes all three species of
zebra that doesn't also include horses. But that doesn't mean that
there's no reasonable non-genealogical groups! For example, all three
species of zebra have fewer than 50 chromosome pairs, while all other
Equus species have more than 50 pairs. Based on physical characteristics
rather than ancestry, zebras make up a perfectly good group, distinct
from other members of Equus.

To put it another way, the three species of zebra may not make up a
natural group when considered by lines of descent, but they do make up a
natural group when considered by other factors.


> The propensity to be striped, like
> the propensity to have five things (fingers, segments, whatever) is a
> deeply embedded genetic artifact that expresses itself in various ways.

"Zebra" is not a classification of "thing that is striped", but a
specific subset of such things, and stripes are only one of many
distinguishing features. Cladistics is an important classification
scheme, but it is not the only valid such scheme.


--
Steven

Roy Smith

unread,
Apr 19, 2013, 9:07:15 AM4/19/13
to
In article <517131cd$0$29977$c3e8da3$5496...@news.astraweb.com>,
Steven D'Aprano <steve+comp....@pearwood.info> wrote:

> On Thu, 18 Apr 2013 17:14:13 -0400, Robert Harper wrote:
>
> > In short, there is no such thing as a "paradigm".
>
> Of course there is. A paradigm is a distinct way of thinking, a
> philosophy if you will. To say that there is no such thing as a paradigm
> is to say that all ways of thinking about a topic are the same, and
> that's clearly nonsense.

This thread has gone off in a strange direction. When I said:

> One of the nice things about OOP is it means so many different things to
> different people. All of whom believe with religious fervor that they
> know the true answer.

I was indeed talking about the ways people think about programming. For
example, OOP in C++ is very much about encapsulation. People declare
all data private, and writing setter/getter functions which carefully
control what access outside entities have to your data.

Often, when you talk to C++ people, they will tell you that
encapsulation is what OOP is all about. What they are doing is saying,
C++ isa OOPL, and C++ has encapsulation, therefore OOPL implies
encapsulation. When they look at something like Python, they say,
"That's not object oriented because you don't have private data".

I suppose people who grew up learning Python as their first language
look at something like C++ and say, "That's not OOP because classes
aren't objects", or something equally silly.

Chris Angelico

unread,
Apr 19, 2013, 9:33:01 AM4/19/13
to pytho...@python.org
On Fri, Apr 19, 2013 at 11:07 PM, Roy Smith <r...@panix.com> wrote:
> I was indeed talking about the ways people think about programming. For
> example, OOP in C++ is very much about encapsulation. People declare
> all data private, and writing setter/getter functions which carefully
> control what access outside entities have to your data.

The funny thing about that notion is that, even in C++, it's
completely optional. I've subclassed someone else's class using a
struct and just left everything public. In fact, I've gotten so used
to the Python way of doing things that now I'm quite happy to run
everything public anyway.

Is OOP truly about X if X is optional?

ChrisA

Roy Smith

unread,
Apr 19, 2013, 11:31:48 AM4/19/13
to
In article <mailman.821.1366378...@python.org>,
Chris Angelico <ros...@gmail.com> wrote:

> On Fri, Apr 19, 2013 at 11:07 PM, Roy Smith <r...@panix.com> wrote:
> > I was indeed talking about the ways people think about programming. For
> > example, OOP in C++ is very much about encapsulation. People declare
> > all data private, and writing setter/getter functions which carefully
> > control what access outside entities have to your data.
>
> The funny thing about that notion is that, even in C++, it's
> completely optional.

Well, yeah:

#define private public
#define protected public
#include <whatever.h>

Not to mention all sorts of horrible things you can do with pointers and
const_cast, etc. But that doesn't stop people from thinking that
somehow they've built some uber-protected cocoon around their data, and
that this is part and parcel of what OOPL is all about.

Chris Angelico

unread,
Apr 19, 2013, 11:40:26 AM4/19/13
to pytho...@python.org
On Sat, Apr 20, 2013 at 1:31 AM, Roy Smith <r...@panix.com> wrote:
> #define private public
> #define protected public
> #include <whatever.h>

And:
#define class struct

But what I mean is that, _in my design_, I make everything public. No
getters/setters, just direct member access. The theory behind getters
and setters is that you can change the implementation without changing
the interface... but I cannot remember a *single time* when I have
made use of that flexibility. Not one. Nor a time when I've wished for
that flexibility, after coding without it. No no, not one!
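
For what it's worth, Python's property is what makes this approach
safe: if the flexibility ever is needed, the implementation can change
while the attribute syntax stays put. A rough sketch, with illustrative
names:

class Temperature:
    # Revised implementation: stores kelvin internally and validates
    # on assignment, yet callers still read and write .celsius directly.
    def __init__(self, celsius):
        self.celsius = celsius            # goes through the setter below

    @property
    def celsius(self):
        return self._kelvin - 273.15

    @celsius.setter
    def celsius(self, value):
        if value < -273.15:
            raise ValueError("below absolute zero")
        self._kelvin = value + 273.15

t = Temperature(20.0)
t.celsius = 25.0                          # plain attribute syntax, validated
print(round(t.celsius, 2))                # 25.0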

ChrisA
(He's telling the truth, he is not Mabel.)

Roy Smith

unread,
Apr 19, 2013, 12:02:00 PM4/19/13
to
In article <mailman.824.1366386...@python.org>,
Chris Angelico <ros...@gmail.com> wrote:

> On Sat, Apr 20, 2013 at 1:31 AM, Roy Smith <r...@panix.com> wrote:
> > #define private public
> > #define protected public
> > #include <whatever.h>
>
> And:
> #define class struct

I suppose, while we're at it:

# define const ""

(I think that works, not sure).

PS: a great C++ interview question is, "What's the difference between a
class and a struct?" Amazing how many self-professed C++ experts have
no clue.

Steven D'Aprano

unread,
Apr 19, 2013, 12:16:18 PM4/19/13
to
On Fri, 19 Apr 2013 12:02:00 -0400, Roy Smith wrote:

> PS: a great C++ interview question is, "What's the difference between a
> class and a struct?" Amazing how many self-professed C++ experts have
> no clue.

I'm not a C++ expert, but I am an inquiring mind, and I want to know the
answer!


--
Steven

Steven D'Aprano

unread,
Apr 19, 2013, 12:37:59 PM4/19/13
to
On Fri, 19 Apr 2013 09:07:15 -0400, Roy Smith wrote:

> Often, when you talk to C++ people, they will tell you that
> encapsulation is what OOP is all about. What they are doing is saying,
> C++ isa OOPL, and C++ has encapsulation, therefore OOPL implies
> encapsulation. When they look at something like Python, they say,
> "That's not object oriented because you don't have private data".
>
> I suppose people who grew up learning Python as their first language
> look at something like C++ and say, "That's not OOP because classes
> aren't objects", or something equally silly.

You might say that, but I find in my experience that Python users don't
tend to fall for the "No True Scotsman" fallacy anywhere near as often as
(say) Java or C++ users. I'm not sure what the reason for this is.
Perhaps it is that the Python community as a whole is more open to other
languages and paradigms, and less stuffed to the gills with code monkeys
who only know how to copy and paste code from StackOverflow. The Python
community frequently tosses around references to other languages,
compares how Python would do something to other languages, or relates how
certain features were borrowed from language X (e.g. list comprehensions
are taken from Haskell; map, filter and reduce are taken from Lisp). But
when I read forums and blogs about (say) Java, it's nearly always about
Java in isolation, and one would be forgiven for thinking it was the only
programming language in existence.

I don't think that there is One True Way to design an OOP language, but I
do think there are *degrees* of OOP. Java, for instance, I would say is
only moderately OOP, since classes aren't objects, and it supports
unboxed native types. I think the first part of that is a weakness, and
the second is a pragmatic decision that on balance probably is a
strength. Yes, Python's "everything is an object" is a cleaner design,
but Java's unboxed types leads to faster code.

It also depends on what you mean by OOP. If we judge Python by the fact
that everything is an object, then it is strongly OOP. But if we judge
Python by its syntax and idioms, it is only weakly OOP, even less than
Java.
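
A concrete illustration of that last point, for what it's worth (a
sketch, nothing more): Python's idioms often route around method-call
syntax even though objects do all the work underneath.

s = "hello"
print(len(s))         # idiomatic Python: a protocol function...
print(s.__len__())    # ...which dispatches to a method on the object

# Everything really is an object, classes and functions included:
print(isinstance(str, object), isinstance(len, object))   # True True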


--
Steven

Ned Batchelder

unread,
Apr 19, 2013, 12:41:03 PM4/19/13
to Steven D'Aprano, pytho...@python.org
The only difference between a class and a struct is the default access:
class members (and base classes) default to "private", while struct
members (and base classes) default to "public".

--Ned.

Ian Kelly

unread,
Apr 19, 2013, 12:38:31 PM4/19/13
to Python
C++ class members are private by default; struct members are public by
default. That's the only difference as I recall.

Roy Smith

unread,
Apr 19, 2013, 7:37:38 PM4/19/13
to
I wrote:
> > I suppose people who grew up learning Python as their first language
> > look at something like C++ and say, "That's not OOP because classes
> > aren't objects", or something equally silly.
>

In article <517172e7$0$29977$c3e8da3$5496...@news.astraweb.com>,
Steven D'Aprano <steve+comp....@pearwood.info> wrote:
> You might say that, but I find in my experience that Python users don't
> tend to fall for the "No True Scotsman" fallacy anywhere near as often as
> (say) Java or C++ users.

Now that I think about it, I suspect relatively few people learned
Python as their first programming language.

Java, for example, is very popular as a teaching language in colleges
and universities. There are lots of people who go through a 4-year
program, do all of their coursework in Java, and come out as one-trick
ponies.

There aren't many schools who teach Python as a first (and only
language), but I suppose it's starting to catch on. 5 years from now,
we may see waves of kids graduating from college knowing nothing but
Python, with a similarly narrow view of the universe.

Roy Smith

unread,
Apr 19, 2013, 7:40:22 PM4/19/13
to
In article <mailman.843.1366412...@python.org>,
Dennis Lee Bieber <wlf...@ix.netcom.com> wrote:

> On Fri, 19 Apr 2013 12:02:00 -0400, Roy Smith <r...@panix.com> declaimed
> the following in gmane.comp.python.general:
>
> > PS: a great C++ interview question is, "What's the difference between a
> > class and a struct?" Amazing how many self-professed C++ experts
> > have no clue.
>
> It's been 15+ years but...
>
> "class" defaults to private; " struct" defaults to public... (very
> simplified <G>)

You were doing well until you added the "very simplified" part :-) That
is indeed the only difference.

Many people are surprised that you can write member functions for
structs. Or that you can subclass (substruct?) them.

Steven D'Aprano

unread,
Apr 19, 2013, 7:47:11 PM4/19/13
to
On Fri, 19 Apr 2013 19:37:38 -0400, Roy Smith wrote:

> There aren't many schools who teach Python as a first (and only
> language), but I suppose it's starting to catch on. 5 years from now,
> we may see waves of kids graduating from college knowing nothing but
> Python, with a similarly narrow view of the universe.

Send the young whipper-snappers here, we'll soon learn 'em better!



--
Steven


Mark Janssen

unread,
Apr 20, 2013, 2:02:22 AM4/20/13
to Jason Wilkins, Types List, Python List, moez...@live.com
On Thu, Apr 18, 2013 at 11:31 PM, Jason Wilkins
<jason.a...@gmail.com> wrote:
> I don't quite think I understand what you are saying. Are you saying that
> mathematical models are not a good foundation for computer science because
> computers are really made out of electronic gates?

No, I'm really trying to point out that models based on Digital Logic
vs. models based on Symbolic Logic are completely different -- they
have different bases. They are both types of "Maths", and that you
can interchange them as a demonstration doesn't actually help the
practical issue of keeping the two domains separate -- they have
differing logics. It's like the domain of Natural numbers vs. the
Complex, or perhaps the Natural and the Real. Yes you can translate
back and forth, but they are for all practical purposes distinct and
can't be mixed.

> All I need to do is show that my model reduces to some basic physical
> implementation (with perhaps some allowances for infinity) and then I can
> promptly forget about that messy business and proceed to use my clean
> mathematical model.

If that's all you want to do, you can stick with Boolean Logic.

> The reason any model of computation exists is that it is easier to think
> about a problem in some terms than in others. By showing how to transform
> one model to another you make it possible to choose exactly how you wish to
> solve a problem.

Yes, and I'm attempting to argue that the (historically?) dominant
model of symbolic calculus is misinforming the practical business of
working out differences and arguments within my own domain, the
programming community.

Unfortunately, my inexperience with the literature is actually
betraying the validity of my point.

> The reason we do not work directly in what are called "von Neumann machines"
> is that they are not convenient for all kinds of problems. However we can
> build a compiler to translate anything to anything else so we I don't see
> why anybody would care.

I'm trying to say that *I* care, because I can't seem to find the
common ground that affects 1000's of people in the applied C.S. domain
with the 1000's of people in the theoretical C.S. domain.

MarkJ
Tacoma

Mark Janssen

unread,
Apr 20, 2013, 2:37:56 AM4/20/13
to Uday S Reddy, Types List, Python List, moez...@live.com
> I think there is some misunderstanding here. Being "mathematical" in
> academic work is a way of making our ideas rigorous and precise, instead of
> trying to peddle wooly nonsense.

I'm sorry. I am responsible for the misunderstanding. I used the
word "math" when I really mean symbolic logic (which, historically,
was part of philosophy). My point is that the field is confusing
because it seems to ignore binary logic in favor of symbolic logic.
Is binary logic not "rigorous and precise" enough?
--
MarkJ
Tacoma, Washington

88888 Dihedral

unread,
Apr 20, 2013, 3:50:52 AM4/20/13
to
Ned Batchelder wrote on Saturday, April 20, 2013 at 12:41:03 AM UTC+8:
In Python even a class can be decorated. Features of instances can
also be added at run time, whether by different programmers or by
machine-generated code, as in the code generation schemes used in
many CAD tools.

Nowadays the concept of a structure is not clear without
specifying the programming language used.

A list is a structure of non-homogeneous types of items in Lisp,
Perl and Python.

But the cases are different in C++, Pascal, Ada, Java ....

88888 Dihedral

unread,
Apr 21, 2013, 2:38:12 AM4/21/13
to
Uday S Reddy wrote on Wednesday, April 17, 2013 at 5:10:58 PM UTC+8:
> Mark Janssen writes:
>
>
>
> > > Having said that, theorists do want to unify concepts wherever possible
>
> > > and wherever they make sense. Imperative programming types, which I
>
> > > will call "storage types", are semantically the same as classes.
>
> >
The imperative part is supported in Python by tuples only.
The name-binding assignment of an object is imperative in Python.

Anyway it is not so difficult to mimic the behaviors
and to gain the benefits of imperative languages,
at least in the CPython implementation, for those
who can play with the Python language in programming.

rusi

unread,
Apr 21, 2013, 11:44:32 AM4/21/13
to
On Apr 15, 8:48 am, Mark Janssen <dreamingforw...@gmail.com> wrote:
> That all being said, the thrust of this whole effort is to possibly
> advance Computer Science and language design, because in-between the
> purely concrete "object" architecture of the imperative programming
> languages and the purely abstract object architecture of
> object-oriented programming languages is a possible middle ground that
> could unite them all.

Just been poking around with eclipse.
And saw this: http://softwaresimplexity.blogspot.in/2013/02/where-java-fails.html

For context, the guy seems to be big in the Java/Eclipse community
[or at least is well hyped -- he's a finalist for Eclipse's best
developer award].

This context is needed to underscore something he says in one of the
comments on that page:

"OO just sucks…"

Mark Janssen

unread,
May 1, 2013, 4:32:39 PM5/1/13
to Ned Batchelder, pytho...@python.org
>> Here's a simple rule to resolve the ambiguity. Whoever publishes
>> first, gets to claim origin of a word and its usage, kind of like a
>> BDFL. The rest can adapt around that, make up their own word, or be
>> corrected as the community requires.
>
> You seem to want to squeeze all of computer science and programming into a
> tidy hierarchy. It won't work, it's not tidy. I strongly suggest you read
> more about computer science before forming more opinions. You have a lot to
> learn ahead of you.

Done: see the wikiwikiweb: http://c2.com/cgi/wiki?ComputerScienceVersionTwo
--
MarkJ
Tacoma, Washington

alex23

unread,
May 1, 2013, 9:13:33 PM5/1/13
to
On May 2, 6:32 am, Mark Janssen <dreamingforw...@gmail.com> wrote:
> > You seem to want to squeeze all of computer science and programming into a
> > tidy hierarchy.  It won't work, it's not tidy. I strongly suggest you read
> > more about computer science before forming more opinions.  You have a lot to
> > learn ahead of you.
>
> Done:  see the wikiwikiweb:  http://c2.com/cgi/wiki?ComputerScienceVersionTwo

"There are two camps:
Us, who are right and good.
Them, who are wrong and evil.
Us should be supported, whilst Them should be condemned."

Wow. Just....wow. Because the world really needs a Manichean model of
computation.