Lenard Lindstrom
"<%s@%s.%s>" % ("len-l.", "telus", "net")
> Would someone please explain why subtyping of type 'function' is not
> permitted in Python 2.3?
That would probably be best done by asking a counter question: Why
would you want to?
--
Erik Max Francis / m...@alcyone.com / http://www.alcyone.com/max/
__ San Jose, CA, USA / 37 20 N 121 53 W / &tSftDotIotE
/ \ All delays are dangerous in war.
\__/ John Dryden
Nothing implemented in C is subclassable unless somebody volunteers the work
to make it subclassable; nobody volunteered the work to make the function
type subclassable. It sure wasn't at the top of my list <wink>.
* To provide methods on the object allowing OO-style API's for
introspection?
o retrieving arguments and/or variable names, defaults,
presence of * or ** argument-sets
o retrieving text of the function from source files
o retrieving formatted "parameter" help text, or more detailed
run-time help-text
o retrieving "raised exception" information and/or
caught-exception information
* Experimenting with meta-programming mechanisms
o automatically generating pre and/or post-condition checks
+ argument and/or return value types/interfaces/values
+ system-status checks (there is an open database
connection, there is a TCP/IP stack, or whatever)
o providing a "restart" API for extremely long-running
functions which have encountered exceptions
o providing enhanced "cleanup" functionality for an entire
class of functions which a framework may call if the
function fails
* To provide, for instance, a "script" function object defined in
such a way that the function object stores the source code,
provides pickling support, and is generally a first-class content
object, but is a simple function object as far as the system is
concerned (with no overhead for its call method, standard support
in IDE's and documentation systems, etcetera)
You can certainly do that stuff using wrapping, and it's certainly
easier to do it that way today (given that the subclassing is
impossible). The question then becomes, is it appropriate to allow the
sub-classing approach?
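As a sketch of the wrapping approach (in modern Python; the class and method names here are hypothetical, not an established API), an OO-style introspection wrapper might look like:

```python
import inspect

class IntrospectableFunction:
    """Hypothetical wrapper exposing OO-style introspection methods."""
    def __init__(self, fn):
        self.fn = fn

    def __call__(self, *args, **kwargs):
        # Delegate calls to the wrapped function.
        return self.fn(*args, **kwargs)

    def argnames(self):
        # Retrieve the argument names of the wrapped function.
        return list(inspect.signature(self.fn).parameters)

    def defaults(self):
        # Retrieve default values, keyed by parameter name.
        return {name: p.default
                for name, p in inspect.signature(self.fn).parameters.items()
                if p.default is not inspect.Parameter.empty}

def greet(name, punctuation="!"):
    return "Hello, " + name + punctuation

g = IntrospectableFunction(greet)
```

The wrapped object still calls through normally (`g("world")`), but as the thread notes, `type(g)` is no longer `function`, which is exactly the visibility cost of wrapping over subclassing.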
Recent changes in Python (i.e. 2.2+) have made the type-class split
considerably less visible, with a trend toward using sub-classing of
core objects as an acceptable and useful idiom. Consistency and the
principle of least surprise would suggest that having all of the
primitive types sub-classable, where it doesn't negatively impact
performance, is a good thing. (It's also nice to allow systems such as
generic introspection tools to deal with the objects without needing
explicit modification to support them; i.e., your functions still show
up as functions in your IDE rather than as data attributes.)
Which comes back to Tim's post, suggesting that a volunteer creating a
sub-classable version of the function object would be sufficient to
introduce such a thing to the core.
Enjoy,
Mike
Erik Max Francis wrote:
>Lenard Lindstrom wrote:
>
>>Would someone please explain why subtyping of type 'function' is not
>>permitted in Python 2.3?
>
>That would probably be best done by asking a counter question: Why
>would you want to?
_______________________________________
Mike C. Fletcher
Designer, VR Plumber, Coder
http://members.rogers.com/mcfletch/
That won't work. If you create a function through the def statement,
it will be of type function, not of a subclass thereof. So on all function
objects, your introspection methods won't be available.
> * Experimenting with meta-programming mechanisms
Same counter-argument: How do you create such specialized
function objects?
> The question then becomes, is it appropriate to allow the
> sub-classing approach?
Not unless accompanied with some other change to
conveniently create specialized function objects. One
proposal (originally proposed to create class methods) might work:
def foo(args)[enhanced_cleanup]:
    code
This would create a function object first, then invoke the
callable enhanced_cleanup, which returns a modified (or
wrapped) object. However, if this extension were available,
specializing the function type is still not necessary, since
you could just as well create wrapper objects in all
usage scenarios you have given.
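The wrapper route Martin describes needs no new syntax at all; the effect of the proposed `[enhanced_cleanup]` annotation is just a callable applied after the `def` (a pattern that later became decorator syntax in Python 2.4). A minimal sketch, with `enhanced_cleanup` as a hypothetical name:

```python
def enhanced_cleanup(fn):
    """Hypothetical wrapper: note failure and clean up if the call raises."""
    def wrapped(*args, **kwargs):
        try:
            return fn(*args, **kwargs)
        except Exception:
            # A framework could do real cleanup here before re-raising.
            wrapped.failed = True
            raise
    wrapped.failed = False
    return wrapped

def fragile(x):
    return 10 // x

# What "def fragile(x)[enhanced_cleanup]:" would have meant:
fragile = enhanced_cleanup(fragile)
```

After a failing call (`fragile(0)`), the `failed` flag is set and the exception still propagates, which is the "enhanced cleanup" behavior the bullet list asked for, done purely by wrapping.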
> Which comes back to Tim's post, suggesting that a volunteer creating a
> sub-classable version of the function object would be sufficient to
> introduce such a thing to the core.
I suspect any volunteer would notice quickly that this is not sufficient.
Regards,
Martin
>>Why would you want to sub-class a function object?
>>
>> * To provide methods on the object allowing OO-style API's for
>>   introspection?
>
>That won't work. If you create a function through the def statement,
>it will be of type function, not of a subclass thereof. So on all function
>objects, your introspection methods won't be available.
It would certainly be required to have a base-class constructor. You'll
note, however, that this wasn't really an insurmountable problem for the
type meta-class ;) . Like that change, you need the system to support
the operation (i.e. call it) if you want syntactic support, but it's not
an impossible change.
BTW Even today, without support built into the system to call the
function-constructor when reading a source-file, you could still use it
within systems where run-time definition of classes is the norm, such as
embedded scripting systems (you call the function-type's constructor
whenever you want to create a new function).
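Calling the function type's constructor directly is indeed possible; in modern Python the type is exposed as `types.FunctionType` (the names below are illustrative, not from the thread):

```python
import types

def template(x):
    return x * 2

# Build a brand-new function object from an existing code object,
# the way an embedded scripting system might do at run time.
dynamic = types.FunctionType(template.__code__, globals(), "dynamic")
```

The new object is a first-class function (`type(dynamic) is types.FunctionType`), created without any `def` statement, which is the "call the constructor whenever you want a new function" idiom described above.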
...
>>The question then becomes, is it appropriate to allow the
>>sub-classing approach?
>
>Not unless accompanied with some other change to
>conveniently create specialized function objects. One
>proposal (originally proposed to create class methods) might work:
>
>def foo(args)[enhanced_cleanup]:
>    code
or:
foo = SpecialFunction( 'foo', """
self.x = y
blah()
""")
Sure, there's lots of problems with it (about equal to those with
"property" IMO), but the question of the *hook* to invoke the mechanism
is somewhat distinct from the mechanism itself, no? After all, we had
the "new" module for years before the functionality eventually migrated
to the base-type constructors.
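A minimal sketch of such a `SpecialFunction` factory (hypothetical, written for modern Python) that compiles a name and a source suite into a real function object, keeping the source around as the post suggests:

```python
def SpecialFunction(name, source):
    """Hypothetical factory: compile a source suite into a function of
    one argument x, and remember the source text on the result."""
    body = "".join("    %s\n" % line
                   for line in source.strip().splitlines())
    namespace = {}
    exec("def %s(x):\n%s" % (name, body), namespace)
    fn = namespace[name]
    fn.source = source  # first-class content object: keep the source
    return fn

double = SpecialFunction("double", "return x * 2")
```

The result is an ordinary function as far as the system is concerned, but it carries its own source text, roughly the "script function object" idea from the original list.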
>This would create a function object first, then invoke the
>callable enhanced_cleanup, which returns a modified (or
>wrapped) object. However, if this extension were available,
>specializing the function type is still not necessary, since
>you could just as well create wrapper objects in all
>usage scenarios you have given.
See notes regarding wrapping versus inheriting in the original post. My
"arguments" aren't necessarily compelling, but the trend does appear to
be toward sub-classable base-types.
>>Which comes back to Tim's post, suggesting that a volunteer creating a
>>sub-classable version of the function object would be sufficient to
>>introduce such a thing to the core.
>
>I suspect any volunteer would notice quickly that this is not sufficient.
>
>Regards,
>Martin
I'd guess it would depend on the volunteer's purposes, and whether
they're trying to get the syntactic operation you want ;) working :) .
I find the meta-type constructor's syntactic sugar gets in the way about
as much as it helps (you wind up having to muck about in the class'
namespace to define named arguments which you then need to decide
whether to include in the final class or not).
The simple call approach works. It lets you define arguments (how would you
do that with the def x()[a,b,c] syntactic construct, for instance?), it
is familiar and readily understood (no new syntax required, you can
immediately see that you're getting something back from a
constructor/function, there's no magic invocation going on). It's not
convenient for replacing every function/method in a file, but neither
would the [] syntax.
Aside:
    It might be neat to see def and class (maybe even if and else ;)
    (just kidding :) )) decommissioned as keywords and a syntactic
    construct created something like this:
callable name [ ( args, named ) ]:
suite
creating pseudo-code like this:
name = callable( name, suite-code-string, argumentDescriptions )
with def and class being objects (a new function meta-type, and the
type meta-type (or rather, a sub-class or factory to generate
appropriately new/old style classes by executing suite-code in a
dictionary then calling the current constructor)) which obey the
protocol.
Even cooler, you might rework the parser so that the suites are
passed to their constructors w/out going through the parser's
syntactic check. You then have a hook for such things as Pyrex to
embed their functions directly in the Python code. Of course,
that's probably a little *too* flexible if we're going to maintain
some level of "Python-1.5-ness" in the core :) .
Have fun, stay loose, become one with the universe,
Mike
Maybe something like the __metaclass__ syntax is
called for:
def myfunction(args):
    __functionclass__ = myspecialfunc
    ...
(I know that looks like a local variable assignment,
but it wouldn't be. As Guido once said about something else,
"It's a double-underscore name -- I can give it whatever
semantics I want!")
--
Greg Ewing, Computer Science Dept,
University of Canterbury,
Christchurch, New Zealand
http://www.cosc.canterbury.ac.nz/~greg
def newscope():
    __functionclass__ = myspecialfunc
    def myfunction(args):
        ...
    return myfunction
myfunction = newscope()
The def statement checks the local, as well as the global, scope for
__functionclass__.
I tentatively suggest a syntax change to def which restricts the scope of
the function class to a particular definition while keeping the class
specifier external to the function body:
def(funcsubtype) afunc(args):
    ....
where '(funcsubtype)' is optional, of course. I haven't checked the Python
grammar to see if anything breaks, but I don't see any ambiguities, and the
syntax change is isolated to the def statement.
[Lenard Lindstrom]
> Okay. I think I see. I notice that functions are not provided for the
> tp_new, tp_init, tp_alloc, and tp_free slots of PyFunction_Type. Just
> adding Py_TPFLAGS_BASETYPE to tp_flags will not enable
> subclassing for type function. More work than I first thought.
Yup, it always is. See the other replies in this thread too (e.g., if it
were subclassable, how would you create an instance? the natural "def"
notation doesn't have a way to specify a base class).
> Given PEP 253 I assumed most classes would already be converted,
> leaving just those which would break something if subtyping were
> permitted on them.
Or things that were low on the perceived bang-for-the-buck scale. I made
the file type subclassable very late in the release cycle, and that may well
have been the last builtin type to become subclassable for 2.2. Being able
to subclass file was worth something to me. For 2.3, someone else (IIRC)
made the array type (array.array) subclassable. The universe of volunteers
for this stuff is small.
> Instead function was simply neglected due to time constraints. ;-)
That's so even without the smiley. There's never enough time to do all that
we'd like to do. BTW, that's why Python is more usable than Scheme <0.8
wink>.
Staticmethods and classmethods don't have all those attributes,
so they shouldn't be subclasses of function.
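This is easy to check in modern Python: the staticmethod object is its own type, not a function subtype, though it does hold a plain function inside (a small illustrative sketch):

```python
def f(x):
    return x

sm = staticmethod(f)

# The wrapper is its own type, not a subtype of function...
is_function = isinstance(sm, type(f))   # False
# ...but the plain function is still reachable through it.
unwrapped = sm.__func__                 # f itself, with all its attributes
```

So the function-specific attributes (code object, defaults, closure) live on the wrapped function, not on the staticmethod wrapper, which is the point being made above.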
> should not that wrapper class be derived from function?
Yes, but it will only have those attributes if it's a
subtype of "function consisting of interpreted Python
bytecode", not "callable object in general". The use
cases of *that* are somewhat more restricted.
That's not to say they don't exist, and I agree that
there's no reason for function objects *not* to be
subclassable. It's just that nobody's had a pressing
enough need to subclass one yet to make it so.
Guido is on record as saying that's just a detail of
the current implementation.
But I've had a better idea:
def f(arg, ..., __functionclass__=myfunkyfunc):
    ...
Now it's a real assignment (of a default argument value),
and it's outside the function's local scope, and it's
restricted to that function!
Happy now? :-)
import operator
from new import function

class myfunction(function):
    def __init__(self, fn, x):
        if not operator.isNumberType(x):
            raise TypeError, "x not numeric"
        self.x = x
        newfn = self._wrap(fn)
        super(myfunction, self).__init__(newfn.func_code,
                                         globals(),
                                         newfn.func_name,
                                         newfn.func_defaults,
                                         newfn.func_closure)
    def _wrap(self, fn):
        def newfn(y):
            return fn(self.x + y)
        return newfn
    def setx(self, x):
        if not operator.isNumberType(x):
            raise TypeError, "x not numeric"
        self.x = x
    def getx(self):
        return self.x

def addsomething(x):
    return x + 42

addsomethingmore = myfunction(addsomething, 10)
The example _wrap method is rudimentary. A truly powerful wrapper would
reproduce the arguments of the function being wrapped. But no, permitting
subtyping of function is not absolutely necessary. The above example works
with a function standin class. So I suppose all I am really suggesting is an
abstract function class, a contract saying a callable object will stand up
to introspection. Then the inspect module functions can check for this as
well as type function.
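Modern Python's inspect module already honors something like this contract: inspect.signature follows an instance's `__call__`, so a function stand-in can be introspected without subtyping function. A sketch (class name hypothetical), mirroring the addx example from this thread:

```python
import inspect

class AddX:
    """Function stand-in: callable, and introspectable via __call__."""
    def __init__(self, x):
        self.x = x
    def __call__(self, y):
        return self.x + y

addx = AddX(42)
# signature() unwraps __call__ and binds self, leaving just 'y'.
params = list(inspect.signature(addx).parameters)
```

The stand-in behaves like the subtype for introspection purposes; only `isinstance(addx, type(some_function))` would still fail.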
--
>...
> That's not to say they don't exist, and I agree that
> there's no reason for function objects *not* to be
> subclassable. It's just that nobody's had a pressing
> enough need to subclass one yet to make it so.
>
I am not necessarily suggesting function subtyping. That was just an
observation. What I see is an inconsistency between real functions and all
other non-builtin callables. Having some way to get 'inspect' to work with
non-function callables would probably be enough. So maybe only inspect needs
changing.
--
# functionsubtype, Demonstrates function subtyping
import operator
from new import function

class myfunction(function):
    def __new__(cls, fn, x):
        # Function objects are initialized by __new__.
        if not operator.isNumberType(x):
            raise TypeError, "x not numeric"
        mutable_x = [x]
        newfn = cls._wrap(fn, mutable_x)
        self = function.__new__(cls,
                                newfn.func_code,
                                globals(),
                                newfn.func_name,
                                newfn.func_defaults,
                                newfn.func_closure)
        self.mutable_x = mutable_x
        return self
    def _wrap(fn, mutable_x):
        def newfn(y):
            return fn(mutable_x[0] + y)
        return newfn
    _wrap = staticmethod(_wrap)
    def setx(self, x):
        if not operator.isNumberType(x):
            raise TypeError, "x not numeric"
        self.mutable_x[0] = x
    def getx(self):
        return self.mutable_x[0]
The following session used the above module.
Python 2.3a2 (#39, Mar 28 2003, 15:43:16) [MSC v.1200 32 bit (Intel)] on
win32
Type "help", "copyright", "credits" or "license" for more information.
>>> from functionsubtype import myfunction
>>> def fn(x):
... return x
...
>>> addx = myfunction(fn, 42)
>>> addx(0)
42
>>> addx.getx()
42
>>> addx.setx(99)
>>> addx(0)
99
>>> addx(100)
199
>>> type(addx)
<class 'functionsubtype.myfunction'>
>>> isinstance(addx, type(fn))
True
>>> import inspect
>>> inspect.getargspec(addx)
(['y'], None, None, None)
Python 2.3a2 provides a tp_new slot method for function. I used type tuple
as a model for filling in the rest. Preliminary profiling shows a call to a
function subtype is 13 percent slower than a function call, but 16 percent
faster than calling an object with a static __call__ method.
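Relative numbers like these vary by interpreter version and machine; the comparison itself is easy to reproduce with timeit (the 13/16 percent figures above are the poster's measurements, not guaranteed):

```python
import timeit

def plain(x):
    return x + 1

class CallObj:
    def __call__(self, x):
        return x + 1

call_obj = CallObj()

# Time a plain function call against a __call__-based object call.
t_plain = timeit.timeit(lambda: plain(1), number=100000)
t_call = timeit.timeit(lambda: call_obj(1), number=100000)
```

On most builds the `__call__` route is measurably slower because of the extra attribute lookup and method dispatch per call, which is the overhead a function subtype would avoid.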
Most of the above session can be done without subtyping function. Only
isinstance() and inspect.getargspec() would fail. If anything comes from
these postings it should be a more general purpose form of
inspect.getargspec() which handles all callable objects.
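A general-purpose version might simply dispatch on what it is given (a sketch; `getargnames` is a hypothetical name, not a proposed inspect API):

```python
import inspect

def getargnames(obj):
    """Hypothetical general inspector: handle plain functions, methods,
    and arbitrary instances that define __call__."""
    if inspect.isfunction(obj) or inspect.ismethod(obj):
        target = obj
    elif callable(obj):
        target = obj.__call__
    else:
        raise TypeError("%r is not callable" % (obj,))
    return list(inspect.signature(target).parameters)

def fn(x, y=1):
    return x + y

class Stand:
    def __call__(self, z):
        return z
```

With this, `getargnames(fn)` and `getargnames(Stand())` both answer, which is all the original question really required of inspect.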
Thanks to everyone who answered my questions and participated in the
discussion that followed.
--