
is decorator the right thing to use?


Dmitry S. Makovey

Sep 24, 2008, 6:21:01 PM
Hi,

after hearing a lot about decorators and never actually using one I have
decided to give it a try. My particular usecase is that I have class that
acts as a proxy to other classes (i.e. passes messages along to those
classes) however hand-coding this type of class is rather tedious, so I
decided to use decorator for that. Can somebody tell me if what I'm doing
is a potential shot-in-the-foot or am I on the right track? (Note, It's
rather rudimentary proof-of-concept implementation and not the final
solution I'm about to employ so there are no optimizations or
signature-preserving code there yet, just the idea).

Here's the code:

class A:
    b=None
    def __init__(self,b):
        self.val='aval'
        self.b=b
        b.val='aval'

    def mymethod(self,a):
        print "A::mymethod, ",a

    def mymethod2(self,a):
        print "A::another method, ",a

def Aproxy(fn):
    def delegate(*args,**kw):
        print "%s::%s" % (args[0].__class__.__name__,fn.__name__)
        args=list(args)
        b=getattr(args[0],'b')
        fnew=getattr(b,fn.__name__)
        # get rid of original object reference
        del args[0]
        fnew(*args,**kw)
    setattr(A,fn.__name__,delegate)
    return fn

class B:
    def __init__(self):
        self.val='bval'

    @Aproxy
    def bmethod(self,a):
        print "B::bmethod"
        print a, self.val

    @Aproxy
    def bmethod2(self,a):
        print "B::bmethod2"
        print a, self.val

b=B()
b.bmethod('foo')
a=A(b)
b=B()
b.val='newval'
a.bmethod('bar')
a.bmethod2('zam')


showell...@gmail.com

Sep 24, 2008, 10:08:04 PM
On Sep 24, 3:21 pm, "Dmitry S. Makovey" <dmi...@athabascau.ca> wrote:
> Hi,
>
> after hearing a lot about decorators and never actually using one I have
> decided to give it a try. My particular usecase is that I have class that
> acts as a proxy to other classes (i.e. passes messages along to those
> classes) however hand-coding this type of class is rather tedious, so I
> decided to use decorator for that. Can somebody tell me if what I'm doing
> is a potential shot-in-the-foot or am I on the right track? (Note, It's
> rather rudimentary proof-of-concept implementation and not the final
> solution I'm about to employ so there are no optimizations or
> signature-preserving code there yet, just the idea).
>

Your code below is very abstract, so it's kind of hard to figure out
what problem you're trying to solve, but it seems to me that you're
using the B proxy class to decorate the A target class, which means
you want one of these options:

1) Put decorators over the methods in A, not B. Isn't it the
methods of A that are being decorated here?

2) Eliminate the decorator syntax and make your code more
expressive:

a = SomeClass()
# first call it directly
x = a.foo()
y = a.bar()
# now decorate it
debug_proxy = ClassThatDecoratesMethodCallsToObjectWithDebuggingCode(a)
debug_proxy.decorate_methods('foo', 'bar')

The decorate_methods method would be magical, in terms of overwriting
a's innards, while still preserving the same interface for its users.
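
A rough, untested sketch of what such a class might look like (all the names here are made up for illustration, not taken from any existing library):

class DebugProxy(object):
    # wraps selected methods of a target object with debugging output
    def __init__(self, target):
        self._target = target

    def decorate_methods(self, *names):
        # overwrite each named method on the target instance with a
        # wrapper that logs the call and then delegates to the original
        for name in names:
            original = getattr(self._target, name)
            def make_wrapper(meth, meth_name):
                def wrapper(*args, **kw):
                    print "calling %s with %r %r" % (meth_name, args, kw)
                    return meth(*args, **kw)
                return wrapper
            setattr(self._target, name, make_wrapper(original, name))

    def __getattr__(self, name):
        # anything not overwritten goes straight to the target
        return getattr(self._target, name)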

But again, I'm just guessing here, because it's hard to know what
problem you're really solving.

Cheers,

Steve


Aaron "Castironpi" Brady

Sep 24, 2008, 11:33:32 PM

It might help to tell us the order of events that you want in your
program. You're not using 'mymethod' or 'mymethod2', and you probably
want 'return fnew' for the future. Something dynamic with __getattr__
might work. Any method call to A, that is an A instance, tries to
look up a method of the same name in the B instance it was initialized
with.

a.foo( ) -> 'in a.foo' -> 'calling b.foo' -> 'return' -> 'return'
a.mymethod( ) -> 'in a.mymethod' -> 'calling b.mymethod' ->
AttributeError: 'b' has no attribute 'mymethod'.
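
A bare-bones sketch of that idea (purely illustrative; it assumes the proxied B instance lives in self.b):

class A(object):
    def __init__(self, b):
        self.b = b

    def mymethod(self, a):
        print "A::mymethod, ", a

    def __getattr__(self, name):
        # only reached when normal lookup on the A instance fails, so
        # A's own methods are untouched; everything else is looked up
        # on the B instance, which raises AttributeError if missing
        return getattr(self.b, name)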

Dmitry S. Makovey

Sep 24, 2008, 11:35:06 PM
showell...@gmail.com wrote:
> Your code below is very abstract, so it's kind of hard to figure out
> what problem you're trying to solve, but it seems to me that you're
> using the B proxy class to decorate the A target class, which means
> you want one of these options:

Sorry for the lack of clarity in my original post. Basically, A aggregates an
object of class B (example with no decorators; again, it's oversimplified):

class A:
    b=None
    def __init__(self,b):
        self.b=b

    def amethod(self,a):
        print "A::amethod ", a

    def bmethod(self,a):
        print "A::bmethod ",a
        return self.b.bmethod(a)

    def bmethod2(self,a,z):
        print "A::bmethod2 ",a,z
        return self.b.bmethod2(a,z)


class B:
    def __init__(self):
        self.val='bval'

    def bmethod(self,a):
        print "B::bmethod ",a

    def bmethod2(self,a,z):
        print "B::bmethod2 ",a,z


b=B()
a=A(b)
a.bmethod('foo')
a.bmethod2('bar','baz')

In my real-life case A is a proxy to B, C and D instances/objects, not just
one. If you look at the above code - whenever I write a new method in B, C
or D I have to modify A, and even when I modify a signature (say, add a
parameter x to bmethod) in B, C or D I have to make sure A stays
synchronized. I was hoping to use a decorator to do that automatically for
me. Since the resulting code is virtually the same for all those proxy
methods, it seems to be a good place for automation. Or am I wrong in
assuming that? (since it is my first time using decorators I honestly don't
know)

The above code illustrates what I am doing right now. My original post is an
attempt to make things more automated/foolproof.

Dmitry S. Makovey

Sep 24, 2008, 11:45:31 PM
Aaron "Castironpi" Brady wrote:

> It might help to tell us the order of events that you want in your
> program. You're not using 'mymethod' or 'mymethod2', and you probably
> want 'return fnew' for the future. Something dynamic with __getattr__
> might work. Any method call to A, that is an A instance, tries to
> look up a method of the same name in the B instance it was initialized
> with.

well 'mymethod' and 'mymethod2' were there just to show that A doesn't
function as a pure proxy - it has methods of it's own. See my respnse to
Steve - I proxy messages to more than one aggregated object. going over
them on __getattr__ to look up methods just doesn't seem to be really
efficient to me (I might be wrong though). Decorators seemed to present
good opportunity to simplify the code (well except for the decorator
function itself :) ), make code bit more "fool-proofed" (and give me the
opportunity to test decorators in real life, he-he).

So decorators inside of B just identify that those methods will be proxied
by A. On one hand from logical standpoint it's kind of weird to tell class
that it is going to be proxied by another class, but declaration would be
real close to original function definition which helps to identify where is
it used.

Note that my decorator doesn't change original function - it's a subversion
of decorator to a certain degree as I'm just hooking into python machinery
to add methods to A upon their declaration in B (or so I think).


Dmitry S. Makovey

Sep 24, 2008, 11:51:01 PM
Dmitry S. Makovey wrote:
> In my real-life case A is a proxy to B, C and D instances/objects, not
> just one.

forgot to mention that above would mean that I need to have more than one
decorator function like AproxyB, AproxyC and AproxyD or make Aproxy smarter
about which property of A has instance of which class etc.

Unless I'm totally "out for lunch" and there are better ways of implementing
this (other than copy-pasting stuff whenever anything in B, C or D
changes).

Diez B. Roggisch

Sep 25, 2008, 2:45:20 AM
Dmitry S. Makovey schrieb:

__getattr__?

class Proxy(object):

    def __init__(self, delegate):
        self._delegate = delegate

    def __getattr__(self, attr):
        v = getattr(self._delegate, attr)
        if callable(v):
            class CallInterceptor(object):
                def __init__(self, f):
                    self._f = f

                def __call__(self, *args, **kwargs):
                    print "Called " + str(self._f) + " with " + str(args) + str(kwargs)
                    return self._f(*args, **kwargs)
            return CallInterceptor(v)
        return v


Decorators have *nothing* to do with this. They are syntactic sugar for

def foo(...):
    ...

foo = a_decorator(foo)

Nothing less, nothing more.
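
In particular, the decorator is just called with the function object at definition time - which is the only hook your Aproxy function exploits. A tiny demo:

def a_decorator(func):
    # runs once, when the decorated function is defined
    print "decorating", func.__name__
    return func

@a_decorator
def foo():
    pass

# the @ line above is exactly equivalent to writing:
def bar():
    pass
bar = a_decorator(bar)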

Diez

Bruno Desthuilliers

Sep 25, 2008, 4:50:04 AM
Dmitry S. Makovey a écrit :

> Aaron "Castironpi" Brady wrote:
>
>> It might help to tell us the order of events that you want in your
>> program. You're not using 'mymethod' or 'mymethod2', and you probably
>> want 'return fnew' for the future. Something dynamic with __getattr__
>> might work. Any method call to A, that is an A instance, tries to
>> look up a method of the same name in the B instance it was initialized
>> with.
>
> well 'mymethod' and 'mymethod2' were there just to show that A doesn't
> function as a pure proxy - it has methods of it's own. See my respnse to
> Steve - I proxy messages to more than one aggregated object. going over
> them on __getattr__ to look up methods just doesn't seem to be really
> efficient to me (I might be wrong though). Decorators seemed to present
> good opportunity to simplify the code (well except for the decorator
> function itself :) ), make code bit more "fool-proofed" (and give me the
> opportunity to test decorators in real life, he-he).
>
> So decorators inside of B just identify that those methods will be proxied
> by A. On one hand from logical standpoint it's kind of weird to tell class
> that it is going to be proxied by another class,

Indeed - usually, proxied objects shouldn't have to be aware of the
fact. That doesn't mean your variation on the proxy pattern is
necessarily bad design (hard to tell without lot of context anyway...),
but still there's some alarm bell ringing here IMHO - IOW : possibly the
right thing to do, but needs to be double-checked.

> but declaration would be
> real close to original function definition which helps to identify where is
> it used.
>
> Note that my decorator doesn't change original function - it's a subversion
> of decorator to a certain degree as I'm just hooking into python machinery
> to add methods to A upon their declaration in B (or so I think).

I wouldn't call this a "subversion" of decorators - it's even a pretty
common idiom to use decorators to flag some functions/methods for
special use.

Now I'm not sure I really like your implementation. Here's a possible
rewrite using a custom descriptor:

class Proxymaker(object):
    def __init__(self, attrname):
        self.attrname = attrname

    def __get__(self, instance, cls):
        def _proxied(fn):
            fn_name = fn.__name__
            def delegate(inst, *args, **kw):
                target = getattr(inst, self.attrname)
                #return fn(target, *args,**kw)
                method = getattr(target, fn_name)
                return method(*args, **kw)

            delegate.__name__ = "%s_%s_delegate" % \
                (self.attrname, fn_name)

            setattr(cls, fn_name, delegate)
            return fn

        return _proxied

class A(object):

    def __init__(self,b):
        self.val='aval'
        self.b=b
        b.val='aval'

    proxy2b = Proxymaker('b')

    def mymethod(self,a):
        print "A::mymethod, ",a

    def mymethod2(self,a):
        print "A::another method, ",a

class B(object):
    def __init__(self):
        self.val='bval'

    @A.proxy2b
    def bmethod(self,a):
        print "B::bmethod"
        print a, self.val

    @A.proxy2b
    def bmethod2(self,a):
        print "B::bmethod2"
        print a, self.val


My point is that:
1/ you shouldn't have to rewrite a decorator function - with basically
the same code - for each possible proxy class / attribute name pair combo
2/ making the decorator an attribute of the proxy class makes
dependencies clearer (well, IMHO at least).

I'm still a bit uneasy wrt/ high coupling between A and B, and if I was
to end up with such a design, I'd probably take some times to be sure
it's really ok.

My cents...

Dmitry S. Makovey

Sep 25, 2008, 11:41:04 AM
Diez B. Roggisch wrote:

> Dmitry S. Makovey schrieb:
>> Dmitry S. Makovey wrote:
>>> In my real-life case A is a proxy to B, C and D instances/objects, not
>>> just one.
>>
>> forgot to mention that above would mean that I need to have more than one
>> decorator function like AproxyB, AproxyC and AproxyD or make Aproxy
>> smarter about which property of A has instance of which class etc.
>

> __getattr__?

see, in your code you're assuming that there's only 1 property ( 'b' )
inside of A that needs proxying. In reality I have several. So in your code
self._delegate should be at least a tupple or a list. Plus what you're
doing - you just promiscuously passing any method not found in Proxy to
self._delegate which is not what I need as I need to pass only a subset of
calls, so now your code needs to acquire dictionary of "allowed" calls, and
go over all self._delegates to find if any one has it which is not
efficient since there IS a 1:1 mapping of A::method -> B::method so lookups
shouldn't be necessary IMO (for performance reasons).
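
Something like the 1:1 mapping described above could be sketched as (names purely illustrative):

class Proxy(object):
    # explicit map: proxied method name -> attribute holding the delegate
    _allowed = {'bmethod': '_b', 'bmethod2': '_b', 'cmethod': '_c'}

    def __init__(self, b, c):
        self._b = b
        self._c = c

    def __getattr__(self, name):
        # no scanning of delegates: the map says exactly where to go,
        # and anything not in the map is rejected
        try:
            delegate = getattr(self, self._allowed[name])
        except KeyError:
            raise AttributeError(name)
        return getattr(delegate, name)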

> class Proxy(object):
>
>
> def __init__(self, delegate):
> self._delegate = delegate
>
>
> def __getattr__(self, attr):
> v = getattr(self._delegate, attr)
> if callable(v):
> class CallInterceptor(object):
> def __init__(self, f):
> self._f = f
>
> def __call__(self, *args, **kwargs):
> print "Called " + str(self._f) + " with " +
> str(args) + str(kwargs)
> return self._f(*args, **kwargs)
> return CallInterceptor(v)
> return v

> Decorators have *nothing* to do with this. They are syntactic sugar for
> def foo(...):
> ...
> foo = a_decorator(foo)

exactly. and in my case they would've simplified code reading/maintenance.
However the introduced "tight coupling" (A knows about B and B should know
about A) is something that worries me, and I'm trying to figure out if there
is another way to use decorators for my scenario or is there another way of
achieving the same thing without using decorators and without bloating up
the code with alternative solution.

Another way could be to use Metaclass to populate class with method upon
declaration but that presents quite a bit of "special" cruft which is more
than I have to do with decorators :) (but maybe it's all *necessary* ? )

Dmitry S. Makovey

Sep 25, 2008, 11:47:51 AM
Thanks Bruno,

your comments were really helpful (so was the "improved" version of code).

My replies below:

Bruno Desthuilliers wrote:
>> So decorators inside of B just identify that those methods will be
>> proxied by A. On one hand from logical standpoint it's kind of weird to
>> tell class that it is going to be proxied by another class,
>
> Indeed - usually, proxied objects shouldn't have to be aware of the
> fact. That doesn't mean your variation on the proxy pattern is
> necessarily bad design (hard to tell without lot of context anyway...),
> but still there's some alarm bell ringing here IMHO - IOW : possibly the
> right thing to do, but needs to be double-checked.

I'm kind of looking at options and not dead-set on decorators, but I can't
find any other "elegant enough" solution which wouldn't lead to such tight
coupling. The problem I'm trying to solve is not much more complicated than
what I have already described so if anybody can suggest a better approach -
I'm all for it.

> Now I'm not sure I really like your implementation. Here's a possible
> rewrite using a custom descriptor:

yeah, that was going to be my next step - I was just aiming for
proof-of-concept more than efficient code :)

agreed on all points

> I'm still a bit uneasy wrt/ high coupling between A and B, and if I was
> to end up with such a design, I'd probably take some times to be sure
> it's really ok.

that is the question that troubles me at this point - thus my original post
(read the subject line ;) ). I like the clarity decorators bring to the
code and the fact that it's a solution pretty much "out-of-the-box" without
need to create something really-really custom, but I'm worried about tight
coupling and somewhat backward logic that they would introduce (the way I
envisioned them).


Bruno Desthuilliers

Sep 25, 2008, 1:19:43 PM
Dmitry S. Makovey a écrit :
> Thanks Bruno,
>
> your comments were really helpful (so was the "improved" version of code).
>
> My replies below:
>
> Bruno Desthuilliers wrote:
>>> So decorators inside of B just identify that those methods will be
>>> proxied by A. On one hand from logical standpoint it's kind of weird to
>>> tell class that it is going to be proxied by another class,
>>>
>> Indeed - usually, proxied objects shouldn't have to be aware of the
>> fact. That doesn't mean your variation on the proxy pattern is
>> necessarily bad design (hard to tell without lot of context anyway...),
>> but still there's some alarm bell ringing here IMHO - IOW : possibly the
>> right thing to do, but needs to be double-checked.
>
> I'm kind of looking at options and not dead-set on decorators, but I can't
> find any other "elegant enough" solution which wouldn't lead to such tight
> coupling. The problem I'm trying to solve is not much more complicated than
> what I have already described

Well... You didn't mention why you need a proxy to start with !-)


> so if anybody can suggest a better approach -
> I'm all for it.
>

(snip code)

>> My point is that:
>> 1/ you shouldn't have to rewrite a decorator function - with basically
>> the same code - for each possible proxy class / attribute name pair combo
>> 2/ making the decorator an attribute of the proxy class makes
>> dependencies clearer (well, IMHO at least).
>
> agreed on all points
>
>> I'm still a bit uneasy wrt/ high coupling between A and B, and if I was
>> to end up with such a design, I'd probably take some times to be sure
>> it's really ok.
>
> that is the question that troubles me at this point - thus my original post
> (read the subject line ;) ). I like the clarity decorators bring to the
> code and the fact that it's a solution pretty much "out-of-the-box" without
> need to create something really-really custom, but I'm worried about tight
> coupling and somewhat backward logic that they would introduce (the way I
> envisioned them).

Well... The canonical solution for delegation in Python is using
__getattr__. Your problem - according to this post and your answer to
Diez - is that your proxy may have to
1/ delegate to more than one object
2/ not necessarily delegate each and every attribute access

I can envision one solution using both __getattr__ and a simple decorator:

def proxy(func):
    func._proxied = True
    return func

class A(object):
    def __init__(self, delegates):
        self._delegates = delegates
    def __getattr__(self, name):
        for d in self._delegates:
            func = getattr(d, name, None)
            if callable(func) and getattr(func, '_proxied', False):
                return func
        raise AttributeError(
            'object %s has no attribute %r' % (self.__class__, name)
            )


class B(object):
    def __init__(self):
        self.val='bval'

    @proxy
    def bmethod(self,a):
        print "B::bmethod"
        print a, self.val

    @proxy
    def bmethod2(self,a):
        print "B::bmethod2"
        print a, self.val


class C(object):
    def __init__(self):
        self.val='bval'

    @proxy
    def cmethod(self,a):
        print "C::cmethod"
        print a, self.val

    @proxy
    def cmethod2(self,a):
        print "C::cmethod2"
        print a, self.val


a = A([B(), C()])

# not tested...


This solves most of the coupling problems (B and C still have to make
clear which methods are to be proxied, but at least they need not know
which class will be used as proxy), and makes sure only 'allowed' method
calls are delegated. But I wouldn't call it a perfect solution either.
If you do have more than one object having method xxx, only the first
one will match... And let's not talk about the lookup penalty.

There's a possible variant that avoids the call to __getattr__ (in
short: attaching delegation instancemethods to A instance in the
initializer for each proxied method in delegates), but that wont solve
the problem of potential name clashes.
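
That variant could look roughly like this (an untested sketch, reusing the proxy flag above; the last delegate still wins on a name clash):

class A(object):
    def __init__(self, delegates):
        self._delegates = delegates
        # bind each flagged method of each delegate directly onto this
        # instance, so later calls never go through __getattr__ at all
        for d in delegates:
            for name in dir(d):
                attr = getattr(d, name)
                if callable(attr) and getattr(attr, '_proxied', False):
                    setattr(self, name, attr)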


My 2 cents...


Aaron "Castironpi" Brady

Sep 25, 2008, 1:52:43 PM
On Sep 25, 12:19 pm, Bruno Desthuilliers <bruno.

You should write it like this:

class B(object):
    @A.proxy
    def bmethod(self,a):

Making 'proxy' a class method on A. In case different A instances (do
you have more than one BTW?) proxy different objects, you could make
it a plain old method.

a= A()
class B(object):
    @a.proxy
    def bmethod(self,a):

I recommend this solution so that if you add a method to a B instance
later, 'a' can be notified simply:

b.meth3= a.proxy( meth3 )

The big problem with that is if 'a' has 'b' in its constructor. You
can reverse that, since B 'knows' about its proxy object quite a bit
anyway.
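
A minimal sketch of that classmethod flavour (illustrative only, hardwired to a single 'b' attribute on A):

class A(object):
    def __init__(self, b):
        self.b = b

    @classmethod
    def proxy(cls, fn):
        # install a delegating method on the proxy class,
        # then hand the original function back unchanged
        def delegate(self, *args, **kw):
            return getattr(self.b, fn.__name__)(*args, **kw)
        setattr(cls, fn.__name__, delegate)
        return fn

class B(object):
    @A.proxy
    def bmethod(self, x):
        print "B::bmethod", x

a = A(B())
a.bmethod('foo')    # prints: B::bmethod foo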

What you've said implies that you only have one B instance, or only
one per A instance. Is this correct?

I agree that __setattr__ is the canonical solution to proxy, but you
have stated that you want each proxied method to be a member in the
proxy class.

Dmitry S. Makovey

Sep 25, 2008, 2:22:11 PM
Aaron "Castironpi" Brady wrote:
> You should write it like this:
>
> class B(object):
> @A.proxy
> def bmethod(self,a):
>
> Making 'proxy' a class method on A.

makes sense.

> In case different A instances (do
> you have more than one BTW?)

yep. I have multiple instances of class A, each one has properties (one per
class) of classes B, C and D:

class A:
    b=None
    c=None
    d=None
    def __init__(self,b,c,d):
        self.b=b
        self.c=c
        self.d=d

    ...magic with proxying methods goes here...

class B:
    def bmethod(self,x): pass # we proxy this method from A
    def bmethod2(self,x): pass # this is not proxied
class C:
    def cmethod(self,x): pass # we proxy this method from A
class D:
    def dmethod(self,x): pass # we proxy this method from A

a=A(B(),C(),D())
x='foo'
a.bmethod(x)
a.cmethod(x)
a.dmethod(x)
a.bmethod2(x) # raises error as we shouldn't proxy bmethod2

above is the ideal scenario.

> What you've said implies that you only have one B instance, or only
> one per A instance. Is this correct?

yes. as per above code.

> I agree that __setattr__ is the canonical solution to proxy, but you
> have stated that you want each proxied method to be a member in the
> proxy class.

well, kind of - if I can make it transparent to the consumer so that he
doesn't have to do:

a.b.bmethod(x)

but rather:

a.bmethod(x)

As I'm trying to keep b, c and d as private properties and would like to
filter which calls are allowed to those. Plus proxied methods in either one
always expect certain parameters like:

class B:
    def bmethod(self,c,x): pass

and A encapsulates 'c' already and can fill in that blank automagically:

class A:
    c=None
    b=None
    def bmethod(self,c,x):
        if not c:
            c=self.c
        return self.b.bmethod(c,x)

I kept this part of the problem out of this discussion as I'm pretty sure I
can fill those in once I figure out the basic problem of auto-population of
proxy methods since for each class/method those are going to be nearly
identical. If I can autogenerate those on-the-fly I'm pretty sure I can add
some extra-logic to them as well including signature change where
A::bmethod(self,c,x) would become A::bmethod(self,x) etc.


Aaron "Castironpi" Brady

Sep 25, 2008, 2:50:30 PM

Do you want to couple instances or classes together?

If A always proxies for B, C, and D, then the wrapper solution isn't
bad. If you're going to be doing any instance magic, that can change
the solution a little bit.

There's also a revision of the first implementation of Aproxy you
posted, which could stand alone as you have it, or work as a
classmethod or staticmethod.

> def Aproxy(fn):
> def delegate(*args,**kw):
> print "%s::%s" % (args[0].__class__.__name__,fn.__name__)
> args=list(args)
> b=getattr(args[0],'b')
> fnew=getattr(b,fn.__name__)
> # get rid of original object reference
> del args[0]
> fnew(*args,**kw)
> setattr(A,fn.__name__,delegate)
> return fn

def Aproxy(fn):
    def delegate(self,*args,**kw):
        print "%s::%s" % (self.__class__.__name__,fn.__name__)
        fnew=getattr(self.b,fn.__name__)
        return fnew(*args,**kw)
    setattr(A,fn.__name__,delegate)
    return fn

Bruno Desthuilliers

Sep 25, 2008, 1:31:41 PM
Aaron "Castironpi" Brady a écrit :
(snip)

> You should write it like this:
>
> class B(object):
> @A.proxy
> def bmethod(self,a):
>
> Making 'proxy' a class method on A.

That's exactly what I wanted to avoid here : making B depend on A.

(snip)

> I agree that __setattr__ is the canonical solution to proxy,

Err... I assume you mean '__getattr__' ???

> but you
> have stated that you want each proxied method to be a member in the
> proxy class.

This doesn't necessarily imply that "proxied" classes need to know about
the "proxying" class. FWIW, that was the whole point : decoupling.

Dmitry S. Makovey

Sep 25, 2008, 3:36:16 PM
Aaron "Castironpi" Brady wrote:
>> I kept this part of the problem out of this discussion as I'm pretty sure
>> I can fill those in once I figure out the basic problem of
>> auto-population of proxy methods since for each class/method those are
>> going to be nearly identical. If I can autogenerate those on-the-fly I'm
>> pretty sure I can add some extra-logic to them as well including
>> signature change where A::bmethod(self,c,x) would become
>> A::bmethod(self,x) etc.
>
> Do you want to couple instances or classes together?

It would be nice to have objects of B, C and D classes not knowing that they
are proxied (as they are used on their own too, not only inside of A
objects).

> If A always proxies for B, C, and D, then the wrapper solution isn't
> bad.

the whole purpose of A is pretty much to proxy and filter. It's got some
extra logic to combine and manipulate b, c and d objects inside of A class
objects.

> If you're going to be doing any instance magic, that can change
> the solution a little bit.
>
> There's also a revision of the first implementation of Aproxy you
> posted, which could stand alone as you have it, or work as a
> classmethod or staticmethod.
>
> def Aproxy(fn):

> def delegate(self,*args,**kw):
> print "%s::%s" % (args[0].__class__.__name__,fn.__name__)
> fnew=getattr(self.b,fn.__name__)
> return fnew(*args,**kw)
> setattr(A,fn.__name__,delegate)
> return fn

yep, that does look nicer/cleaner :)

George Sakkis

Sep 25, 2008, 4:07:53 PM
On Sep 25, 3:36 pm, "Dmitry S. Makovey" <dmi...@athabascau.ca> wrote:

> Aaron "Castironpi" Brady wrote:
> >> I kept this part of the problem out of this discussion as I'm pretty sure
> >> I can fill those in once I figure out the basic problem of
> >> auto-population of proxy methods since for each class/method those are
> >> going to be nearly identical. If I can autogenerate those on-the-fly I'm
> >> pretty sure I can add some extra-logic to them as well including
> >> signature change where A::bmethod(self,c,x) would become
> >> A::bmethod(self,x) etc.
>
> > Do you want to couple instances or classes together?
>
> It would be nice to have objects of B, C and D classes not knowing that they
> are proxied (as they are used on their own too, not only inside of A
> objects).

I'm not sure if the approach below deals with all the issues, but one
thing it does is decouple completely the proxied objects from the
proxy:

#======== usage ================================================================

from proxies import Proxy

class B(object):
    def __init__(self): self.val = 'bval'
    def bmethod(self,n): print "B::bmethod",n
    def bmethod2(self,n,m): print "B::bmethod2",n,m

class C(object):
    def __init__(self): self.val = 'cval'
    def cmethod(self,x): print "C::cmethod",x
    def cmethod2(self,x,y): print "C::cmethod2",x,y
    cattr = 4

class A(Proxy):
    DelegateMap = {
        'bmethod' : B,
        'bmethod2': B,
        'cmethod': C,
        # do NOT delegate C.cmethod2
        #'cmethod2': C,
        'cattr' : C,
    }

    def __init__(self, b, c):
        print "init A()"
        # must call Proxy.__init__(*delegates)
        super(A,self).__init__(b,c)

    def amethod(self,a):
        print "A::mymethod",a


if __name__ == '__main__':
    a = A(B(), C())
    a.amethod('foo')

    # test bounded methods
    a.bmethod('foo')
    a.bmethod2('bar','baz')
    a.cmethod('foo')
    try: a.cmethod2('bar','baz')
    except Exception, ex: print ex

    # works for unbound methods too
    A.bmethod(a,'foo')
    A.bmethod2(a,'bar','baz')
    A.cmethod(a, 'foo')
    try: A.cmethod2(a,'bar','baz')
    except Exception, ex: print ex

    # non callable attributes
    print A.cattr

#====== output ==================================
init A()
A::mymethod foo
B::bmethod foo
B::bmethod2 bar baz
C::cmethod foo
'A' object has no attribute 'cmethod2'
B::bmethod foo
B::bmethod2 bar baz
C::cmethod foo
type object 'A' has no attribute 'cmethod2'
4

#======== proxies.py =========================

class _ProxyMethod(object):
    def __init__(self, name):
        self._name = name
        def unbound(proxy, *args, **kwds):
            method = proxy._get_target_attr(name)
            return method(*args, **kwds)
        self._unbound = unbound

    def __get__(self, proxy, proxytype):
        if proxy is not None:
            return proxy._get_target_attr(self._name)
        else:
            return self._unbound


class _ProxyMeta(type):
    def __new__(meta, name, bases, namespace):
        for attrname,cls in namespace.get('DelegateMap', {}).iteritems():
            if attrname not in namespace:
                attr = getattr(cls, attrname)
                if callable(attr):
                    namespace[attrname] = _ProxyMethod(attrname)
                else:
                    namespace[attrname] = attr
        return super(_ProxyMeta,meta).__new__(meta, name, bases, namespace)


class Proxy(object):
    __metaclass__ = _ProxyMeta

    def __init__(self, *delegates):
        self._cls2delegate = {}
        for delegate in delegates:
            cls = type(delegate)
            if cls in self._cls2delegate:
                raise ValueError('More than one %s delegates were given' % cls)
            self._cls2delegate[cls] = delegate

    def _get_target_attr(self, name):
        try:
            cls = self.DelegateMap[name]
            delegate = self._cls2delegate[cls]
            return getattr(delegate, name)
        except (KeyError, AttributeError):
            raise AttributeError('%r object has no attribute %r' %
                                 (self.__class__.__name__, name))

HTH,
George

Dmitry S. Makovey

Sep 25, 2008, 4:38:03 PM
George Sakkis wrote:

> I'm not sure if the approach below deals with all the issues, but one
> thing it does is decouple completely the proxied objects from the
> proxy:

<snip/>

> class _ProxyMeta(type):

<snip/>

It smelled to me more and more like a metaclass too, I was just trying to
avoid them :)

Your code looks awfully close to what I'm trying to do, except it looks a
bit heavier than decorators. Seems like decorators are not going to happen
in this part of the project for me anyway, however the whole discussion gave
me a lot to think about. Thank you Bruno, Aaron, Diez and George.

Thanks for the concrete code with the metaclass. I'm going to study it
thoroughly to see if I can spot caveats/issues for my use cases, however it
seems to put me on the right track. I never used metaclasses before and
decorators seemed to be bit more straight-forward to me :) ..oh well.

Bruno Desthuilliers

Sep 26, 2008, 7:48:29 AM
Dmitry S. Makovey a écrit :
(snip)

> I never used metaclasses before and
> decorators seemed to be bit more straight-forward to me :) ..oh well.

Don't be afraid !-) While it's true that they can be a bit confusing at
first, metaclasses are just classes - whose instances happen to be
classes themselves.

And while you're certainly right not to jump on metaclasses *before*
having spent a bit of time thinking about other possible solutions, there's
nothing wrong with using metaclasses when that's really what you need...

Paul McGuire

Sep 26, 2008, 10:18:28 AM
On Sep 25, 10:41 am, "Dmitry S. Makovey" <dmi...@athabascau.ca> wrote:
> Diez B. Roggisch wrote:
> > Dmitry S. Makovey schrieb:
> >> Dmitry S. Makovey wrote:
> >>> In my real-life case A is a proxy to B, C and D instances/objects, not
> >>> just one.
>
> >> forgot to mention that above would mean that I need to have more than one
> >> decorator function like AproxyB, AproxyC and AproxyD or make Aproxy
> >> smarter about which property of A has instance of which class etc.
>
> > __getattr__?
>
> see, in your code you're assuming that there's only 1 property ( 'b' )
> inside of A that needs proxying. In reality I have several. So in your code
> self._delegate should be at least a tupple or a list. Plus what you're
> doing - you just promiscuously passing any method not found in Proxy to
> self._delegate which is not what I need as I need to pass only a subset of
> calls, so now your code needs to acquire dictionary of "allowed" calls, and
> go over all self._delegates to find if any one has it which is not
> efficient since there IS a 1:1 mapping of A::method -> B::method so lookups
> shouldn't be necessary IMO (for performance reasons).
>

No, really, Diez has posted the canonical Proxy form in Python, using
__getattr__ on the proxy, and then redirecting to the contained
delegate object. This code does *not* assume that only one property
('b'? where did that come from?) is being redirected - __getattr__
will intercept all attribute lookups and redirect them to the
delegate.

If you need to get fancier and support this single-proxy-to-multiple-
delegates form, then yes, you will need some kind of map that says
which method should delegate to which object. Or, if it is just a
matter of precedence (try A, then try B, then...), then use hasattr to
see if the first delegate has the given attribute, and if not, move on
to the next.

>
> I'm trying to figure out if there
> is another way to use decorators for my scenario or is there another way of
> achieving the same thing without using decorators and without bloating up
> the code with alternative solution.
>
> Another way could be to use Metaclass to populate class with method upon
> declaration but that presents quite a bit of "special" cruft which is more
> than I have to do with decorators :) (but maybe it's all *necessary* ? )
>

Your original question was "is decorator the right thing to use?" For
this application, the answer is "no". It sounds like you are trying
to force this particular to solution to your problem, but you are
probably better off giving __getattr__ intercepting another look.

-- Paul

Dmitry S. Makovey

Sep 26, 2008, 11:41:32 AM
Paul McGuire wrote:
>> see, in your code you're assuming that there's only 1 property ( 'b' )
>> inside of A that needs proxying. In reality I have several.
<snip/>

>
> No, really, Diez has posted the canonical Proxy form in Python, using
> __getattr__ on the proxy, and then redirecting to the contained
> delegate object. This code does *not* assume that only one property
> ('b'? where did that come from?) is being redirected - __getattr__
> will intercept all attribute lookups and redirect them to the
> delegate.
>
> If you need to get fancier and support this single-proxy-to-multiple-
> delegates form, then yes, you will need some kind of map that says
> which method should delegate to which object. Or, if it is just a
> matter of precedence (try A, then try B, then...), then use hasattr to
> see if the first delegate has the given attribute, and if not, move on
> to the next.

that is what I didn't like about it - I have to iterate over delegates when
I can build direct mapping once and for all and tie it to class
definition ;)

> Your original question was "is decorator the right thing to use?" For
> this application, the answer is "no".

yeah. seems that way. in the other fork of this thread you'll find my
conclusion which agrees with that :)

> It sounds like you are trying
> to force this particular to solution to your problem, but you are
> probably better off giving __getattr__ intercepting another look.

__getattr__ implies constant lookups and checks (for filtering purposes) - I
want to do them once, attach generated methods as native methods and be
done with it. That is why I do not like __getattr__ in this particular
case. Otherwise - you're right.

Aaron "Castironpi" Brady

Sep 26, 2008, 1:54:10 PM
On Sep 26, 10:41 am, "Dmitry S. Makovey" <dmi...@athabascau.ca> wrote:

> Paul McGuire wrote:
> > If you need to get fancier and support this single-proxy-to-multiple-
> > delegates form, then yes, you will need some kind of map that says
> > which method should delegate to which object.  Or, if it is just a
> > matter of precedence (try A, then try B, then...), then use hasattr to
> > see if the first delegate has the given attribute, and if not, move on
> > to the next.
>
> that is what I didn't like about it - I have to iterate over delegates when
> I can build direct mapping once and for all and tie it to class
> definition ;)

You're right, there is a difference in performance. But it would
still be simpler to have your 'getattr' method 'learn' what functions
come from what objects, and then just route directly to them.

> __getattr__ implies constant lookups and checks (for filtering purposes) - I
> want to do them once, attach generated methods as native methods and be
> done with it. That is why I do not like __getattr__ in this particular
> case. Otherwise - you're right.

Try this (untested!):

def __getattr__( self, key ):
    if key in self.knownmeths:
        return self.knownmeths[ key ]
    for x in self.delegs:
        if hasattr( x, key ):
            _ = self.knownmeths[ key ] = getattr( x, key )
            return _
    raise AttributeError( key )

self.delegs holds the delegates in the desired order of precedence.

Bruno Desthuilliers

Sep 26, 2008, 4:03:39 PM
Dmitry S. Makovey a écrit :
> Paul McGuire wrote:
>>> see, in your code you're assuming that there's only 1 property ( 'b' )
>>> inside of A that needs proxying. In reality I have several.
> <snip/>
>> No, really, Diez has posted the canonical Proxy form in Python, using
>> __getattr__ on the proxy, and then redirecting to the contained
>> delegate object. This code does *not* assume that only one property
>> ('b'? where did that come from?) is being redirected - __getattr__
>> will intercept all attribute lookups and redirect them to the
>> delegate.
>>
>> If you need to get fancier and support this single-proxy-to-multiple-
>> delegates form, then yes, you will need some kind of map that says
>> which method should delegate to which object. Or, if it is just a
>> matter of precedence (try A, then try B, then...), then use hasattr to
>> see if the first delegate has the given attribute, and if not, move on
>> to the next.
>
> that is what I didn't like about it - I have to iterate over delegates when
> I can build direct mapping once and for all and tie it to class
> definition ;)

Hem... I'm afraid you don't really take Python's dynamic nature into
account here. Do you know that even the __class__ attribute of an
instance can be rebound at runtime ? What about 'once and for all' then ?

But anyway:

>> Your original question was "is decorator the right thing to use?" For
>> this application, the answer is "no".
>
> yeah. seems that way. in the other fork of this thread you'll find my
> conclusion which agrees with that :)
>
>> It sounds like you are trying
>> to force this particular to solution to your problem, but you are
>> probably better off giving __getattr__ intercepting another look.
>
> __getattr__ implies constant lookups and checks (for filtering purposes)

Unless you cache the lookups results...

> - I
> want to do them once, attach generated methods as native methods

What is a "native method" ? You might not be aware of the fact that
method objects are usually built anew from functions on each method call...

> and be
> done with it. That is why I do not like __getattr__ in this particular
> case.

There's indeed an additional penalty using __getattr__, which is that
it's only called as a last resort. Now remember that premature
optimization is the root of evil... Depending on effective use (ie : how
often a same 'proxied' method is called on a given Proxy instance, on
average), using __getattr__ to retrieve the appropriate bound method on
the delegate then adding it to the proxy instance *as an instance
attribute* might be a way better (and simpler) optimization.
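
That caching can be as small as this (a sketch; the filtering of which names are allowed to pass is left out):

class Proxy(object):
    def __init__(self, delegate):
        self._delegate = delegate

    def __getattr__(self, name):
        # only reached on the first access to 'name'; the bound method
        # is then stored on the instance, so later accesses find it
        # through normal attribute lookup and skip __getattr__ entirely
        attr = getattr(self._delegate, name)
        setattr(self, name, attr)
        return attr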

Dmitry S. Makovey

Sep 26, 2008, 6:38:46 PM
Bruno Desthuilliers wrote:
> Hem... I'm afraid you don't really take Python's dynamic nature into
> account here. Do you know that even the __class__ attribute of an
> instance can be rebound at runtime ? What about 'once and for all' then ?

must've been wrong wording on my part. Dynamic nature is exactly what I
wanted to use :) except that I do not expect clients to take advantage of
it while using my classes ;)

>>> Your original question was "is decorator the right thing to use?" For
>>> this application, the answer is "no".
>>
>> yeah. seems that way. in the other fork of this thread you'll find my
>> conclusion which agrees with that :)
>>
>>> It sounds like you are trying
>>> to force this particular to solution to your problem, but you are
>>> probably better off giving __getattr__ intercepting another look.
>>
>> __getattr__ implies constant lookups and checks (for filtering purposes)
>
> Unless you cache the lookups results...

sure

>> - I
>> want to do them once, attach generated methods as native methods
>
> What is a "native method" ? You might not be aware of the fact that
> method objects are usually built anew from functions on each method
> call...

again, wrong wording on my part. by native I meant: make use as much as
possible of existing machinery and override default behavior only when it's
absolutely necessary. (hopefully my wording is not off this time ;) )

>> and be
>> done with it. That is why I do not like __getattr__ in this particular
>> case.
>
> There's indeed an additional penalty using __getattr__, which is that
> it's only called as a last resort. Now remember that premature
> optimization is the root of evil... Depending on effective use (ie : how
> often a same 'proxied' method is called on a given Proxy instance, on
> average), using __getattr__ to retrieve the appropriate bound method on
> the delegate then adding it to the proxy instance *as an instance
> attribute* might be a way better (and simpler) optimization.

I actually ended up rewriting things (loosely based on George's suggested
code) with descriptors and not using metaclasses or decorators (so much for
my desire to use them).

With following implementation (unpolished at this stage but already
functional) I can have several instances of B objects inside of A object
and proxy certain methods to one or another object (I might be having a
case when I have A.b1 and A.b2 and passing some methods to b1 and others to
b2 having both of the same class B, maybe even multiplexing). This one
seems to be fairly light as well without need to scan instances (well,
except for one getattr, but I couldn't get around it). Maybe I didn't
account for some shoot-in-the-foot scenarios but I can't come up with any.
Last time I played with __getattr__ I shot myself in the foot quite well
BTW :)

class ProxyMethod(object):

    def __init__(self,ob_name,meth):
        self.ob_name=ob_name
        self.meth=meth

    def my_call(self,instance,*argv,**kw):
        ob=getattr(instance,self.ob_name)
        cls=self.meth.im_class
        return self.meth.__get__(ob,cls)(*argv,**kw)

    def __get__(self,instance,owner):
        if not instance:
            return self.my_call
        ob=getattr(instance,self.ob_name)
        cls=self.meth.im_class
        return self.meth.__get__(ob,cls)

class B:
    def __init__(self):
        self.val='bval'

    def bmethod(self,a):
        print "B::bmethod",
        print a, self.val

class A:
    b=None

    def __init__(self,b=None):
        self.val='aval'
        self.b=b
        b.val='aval-b'

    def mymethod(self,a):
        print "A::mymethod, ",a

    bmethod=ProxyMethod('b',B.bmethod)

b=B()
b.bmethod('foo')
a=A(b)
b=B()
b.val='newval'

a.mymethod('baz')
a.bmethod('bar')
A.bmethod(a,'zoom')

Aaron "Castironpi" Brady

Sep 26, 2008, 6:46:03 PM
On Sep 26, 3:03 pm, Bruno Desthuilliers

That prohibits using a descriptor in the proxied classes, or at least
the proxied functions, since you break descriptor protocol and only
call __get__ once. Better to cache and get by name. It's only slower
by the normal amount, and technically saves space, strings vs.
instancemethod objects (except for really, really long function names).

Dmitry S. Makovey

Sep 26, 2008, 7:30:52 PM
Aaron "Castironpi" Brady wrote:
> That prohibits using a descriptor in the proxied classes, or at least
> the proxied functions, since you break descriptor protocol and only
> call __get__ once. Better to cache and get by name. It's only slower
> by the normal amount, and technically saves space, strings vs.
> instancemethod objects (except for really, really long function names).

that is an interesting point since I didn't think about having descriptors
in the proxied classes. My reworked code clearly breaks when descriptors are
thrown at it. It will break with methods in proxied objects that are
implemented as objects too. Now I've adjusted the constructor a bit to account
for that (I just can't figure out a case where I'd be proxying descriptors
unless they return a function, but then I don't see the benefit of using a
descriptor for that - probably because I haven't used them much).

class ProxyMethod(object):
    def __init__(self,ob_name,meth):
        self.ob_name=ob_name

        if not hasattr(meth,'im_class'):
            if hasattr(meth,'__call__'):
                self.meth=getattr(meth,'__call__')
            else:
                raise ValueError("Method should be either a class method or a callable class")
        else:
            self.meth=meth


George Sakkis

Sep 26, 2008, 9:57:10 PM
On Sep 26, 6:38 pm, "Dmitry S. Makovey" <dmi...@athabascau.ca> wrote:

> I actually ended up rewriting things (loosely based on George's suggested
> code) with descriptors and not using metaclasses or decorators (so much for
> my desire to use them).
>
> With following implementation (unpolished at this stage but already
> functional) I can have several instances of B objects inside of A object
> and proxy certain methods to one or another object (I might be having a
> case when I have A.b1 and A.b2 and passing some methods to b1 and others to
> b2 having both of the same class B, maybe even multiplexing).

You seem to enjoy pulling the rug from under our feet by changing the
requirements all the time :)

> This one
> seems to be fairly light as well without need to scan instances (well,
> except for one getattr, but I couldn't get around it). Maybe I didn't
> account for some shoot-in-the-foot scenarios but I can't come up with any.
> Last time I played with __getattr__ I shot myself in the foot quite well
> BTW :)
>

> [code snipped]
>
> class A:
>     b=None
^^^^^^ you don't need this

>     def __init__(self,b=None):
>         self.val='aval'
>         self.b=b
>         b.val='aval-b'
>
>     def mymethod(self,a):
>         print "A::mymethod, ",a
>
>     bmethod = ProxyMethod('b',B.bmethod)

Although this works, the second argument to ProxyMethod shouldn't be
necessary, it's semantically redundant; ideally you would like to
write it as "bmethod = ProxyMethod('b')". As before, I don't think
that's doable without metaclasses (or worse, stack frame hacking).
Below is the update of my original recipe; interestingly, it's
(slightly) simpler than before:

#======= usage ========================================

from proxies import Proxy

class B(object):
    def __init__(self, val): self.val = val
    def bmethod(self,n): print "B::bmethod", self.val, n
    def bmethod2(self,n,m): print "B::bmethod2", self.val, n, m

class C(object):
    def __init__(self, val): self.val = val
    def cmethod(self,x): print "C::cmethod", self.val, x
    def cmethod2(self,x,y): print "C::cmethod2",self.val, x, y
    cattr = 4

class A(Proxy):
    # DelegateMap:
    # Maps each delegate method to the proxy attribute that refers to the
    # respective delegate object
    DelegateMap = {
        'bmethod' : 'b1',
        'bmethod2': 'b2',
        'cmethod' : 'c',
        # do NOT delegate C.cmethod2
        #'cmethod2': 'c',
    }

    def __init__(self, b1, b2, c):
        print "init A()"
        # must call Proxy.__init__
        super(A,self).__init__(b1=b1, b2=b2, c=c)

    def amethod(self,a):
        print "A::mymethod",a

if __name__ == '__main__':
    a = A(B(10), B(20), C(30))
    a.amethod('foo')

    print "bound proxy calls"
    a.bmethod('foo')
    a.bmethod2('bar','baz')
    a.cmethod('foo')
    try: a.cmethod2('bar','baz')
    except Exception, ex: print ex

    print "unbound proxy calls"
    A.bmethod(a,'foo')
    A.bmethod2(a,'bar','baz')
    A.cmethod(a, 'foo')
    try: A.cmethod2(a,'bar','baz')
    except Exception, ex: print ex

#======= output ========================================

init A()
A::mymethod foo

bound proxy calls
B::bmethod 10 foo
B::bmethod2 20 bar baz
C::cmethod 30 foo
'A' object has no attribute 'cmethod2'

unbound proxy calls
B::bmethod 10 foo
B::bmethod2 20 bar baz
C::cmethod 30 foo
type object 'A' has no attribute 'cmethod2'

#====== proxies.py ======================================

class _ProxyMeta(type):
    def __new__(meta, name, bases, namespace):
        for methodname in namespace.get('DelegateMap', ()):
            if methodname not in namespace:
                namespace[methodname] = _ProxyMethod(methodname)
        return super(_ProxyMeta,meta).__new__(meta, name, bases, namespace)

class _ProxyMethod(object):
    def __init__(self, name):
        self._name = name

    def __get__(self, proxy, proxytype):
        if proxy is not None:
            return proxy._get_target_attr(self._name)
        else:
            return self._unbound_method

    def _unbound_method(self, proxy, *args, **kwds):
        method = proxy._get_target_attr(self._name)
        return method(*args, **kwds)


class Proxy(object):
    __metaclass__ = _ProxyMeta

    def __init__(self, **attr2delegate):
        self.__dict__.update(attr2delegate)

    def _get_target_attr(self, name):
        try:
            delegate = getattr(self, self.DelegateMap[name])
            return getattr(delegate, name)
        except (KeyError, AttributeError):
            raise AttributeError('%r object has no attribute %r' %
                                 (self.__class__.__name__, name))

Dmitry S. Makovey

Sep 27, 2008, 1:44:24 AM
George Sakkis wrote:
> You seem to enjoy pulling the rug from under our feet by changing the
> requirements all the time :)

but that's half the fun! ;)

A bit more seriously - I didn't know I had those requirements until now :)
I'm kind of exploring where I can get with those ideas. The initial post was
based exactly on what I had in my code, with the desire to make it more
automated/{typo,fool}-proof/robust, elegant and possibly faster. Hopefully
such behavior is not off-topic on this list (if it is - let me know and
I'll go exploring solo).

> Although this works, the second argument to ProxyMethod shouldn't be
> necessary, it's semantically redundant; ideally you would like to
> write it as "bmethod = ProxyMethod('b')".

since I'm already on exploratory trail (what about that rug being pulled
from under....?) With my code I can do really dirty tricks like this (not
necessary that I'm going to):

class B_base:
    def bmethod(self):
        print 'B_base'

class B(B_base):
    def bmethod(self):
        print 'B'

class A:
    bmethod=ProxyMethod('b',B_base.bmethod)

> As before, I don't think
> that's doable without metaclasses (or worse, stack frame hacking).
> Below is the update of my original recipe; interestingly, it's
> (slightly) simpler than before:

Out of curiosity (and trying to understand): why do you insist on
dictionaries with strings contents ( {'bmethod' : 'b1' } ) rather than
something more explicit ? Again, I can see that your code is working and I
can even understand what it's doing, just trying to explore alternatives :)

I guess my bias is towards more explicit declarations thus

bmethod=ProxyMethod('b',B.bmethod)

looks more attractive to me, but I stand to be corrected/educated why is
that not the right thing to do?

Another thing that turns me away from string dictionaries is that those are
the ones causing me more trouble hunting down typos. Maybe it's just "my
thing" so I'm not going to insist on it. I'm open to arguments against that
theory.

One argument I can bring in defence of more explicit declarations is IDE
parsing when autocompletion for B.bme... pops up (suggesting bmethod and
bmethod2) and with 'b':'bmethod' it never happens. However that's not good
enough reason to stick to it if it introduces other problems. What kind of
problems could those be?

P.S.
so far I find this discussion quite educational BTW. I am more of a "weekend
programmer" with Python - using the basic language features for most of the
things I need - however I decided to explore further to educate myself and
to write more efficient code.


George Sakkis

Sep 27, 2008, 9:23:33 AM
On Sep 27, 1:44 am, "Dmitry S. Makovey" <dmi...@makovey.net> wrote:

> George Sakkis wrote:
> > Although this works, the second argument to ProxyMethod shouldn't be
> > necessary, it's semantically redundant; ideally you would like to
> > write it as "bmethod = ProxyMethod('b')".
>
> since I'm already on exploratory trail (what about that rug being pulled
> from under....?) With my code I can do really dirty tricks like this (not
> necessary that I'm going to):
>
> class B_base:
> def bmethod(self):
> print 'B_base'
>
> class B(B_base):
> def bmethod(self):
> print 'B'
>
> class A:
> bmethod=ProxyMethod('b',B_base.bmethod)

Yes, that's indeed dirty; don't do it :)

> > As before, I don't think
> > that's doable without metaclasses (or worse, stack frame hacking).
> > Below is the update of my original recipe; interestingly, it's
> > (slightly) simpler than before:
>
> Out of curiosity (and trying to understand): why do you insist on
> dictionaries with strings contents ( {'bmethod' : 'b1' } ) rather than
> something more explicit ? Again, I can see that your code is working and I
> can even understand what it's doing, just trying to explore alternatives :)
>
> I guess my bias is towards more explicit declarations thus
>
> bmethod=ProxyMethod('b',B.bmethod)
>
> looks more attractive to me, but I stand to be corrected/educated why is
> that not the right thing to do?

I see where you're coming from and I also prefer explicit reflection
mechanisms instead of strings (e.g. avoid eval/exec as much as
possible). As I mentioned, the second argument to ProxyMethod is (for
all sane purposes) redundant, so if you could implement it in a way
that "bmethod = ProxyMethod('b')" worked, I would be all for it, but
AFAIK it's not possible without a metaclass. A dict with string keys
and values to be consumed by a metaclass is perhaps the simplest thing
that could possibly work. It contains all the relevant information for
hooking the proxy to the delegate methods and nothing more; zero
boilerplate code overhead. Also note that it's not that big of a
difference; you have to provide the attribute name as a string anyway.

> Another thing that turns me away from string dictionaries is that those are
> the ones causing me more trouble hunting down typos. Maybe it's just "my
> thing" so I'm not going to insist on it. I'm open to arguments against that
> theory.

From my experience, I am equally prone to typos for both strings and
regular attributes; I agree though that the traceback information is
often more helpful when you mistype an attribute.

> One argument I can bring in defence of more explicit declarations is IDE
> parsing when autocompletion for B.bme... pops up (suggesting bmethod and
> bmethod2) and with 'b':'bmethod' it never happens.

I don't rely on autocompleting IDEs, at least in dynamic languages, so
it's not much of an issue for me but yes, it's another small argument
against strings.

George

George Sakkis

Sep 27, 2008, 11:27:13 AM
On Sep 27, 9:23 am, George Sakkis <george.sak...@gmail.com> wrote:

> On Sep 27, 1:44 am, "Dmitry S. Makovey" <dmi...@makovey.net> wrote:
>
> > I guess my bias is towards more explicit declarations thus
>
> > bmethod=ProxyMethod('b',B.bmethod)
>
> > looks more attractive to me, but I stand to be corrected/educated why is
> > that not the right thing to do?
>
> I see where you're coming from and I also prefer explicit reflection
> mechanisms instead of strings (e.g. avoid eval/exec as much as
> possible). As I mentioned, the second argument to ProxyMethod is (for
> all sane purposes) redundant, so if you could implement it in a way
> that "bmethod = ProxyMethod('b')" worked, I would be all for it, but
> AFAIK it's not possible without a metaclass.

Just for completeness, here's a metaclass version that uses
ProxyMethod declarations instead of a dict; you'll probably like this
better:

#======= usage =========================
from proxies import Proxy, ProxyMethod

class B(object):
    def __init__(self, val): self.val = val
    def bmethod(self,n): print "B::bmethod", self.val, n
    def bmethod2(self,n,m): print "B::bmethod2", self.val, n, m

class C(object):
    def __init__(self, val): self.val = val
    def cmethod(self,x): print "C::cmethod", self.val, x
    def cmethod2(self,x,y): print "C::cmethod2",self.val, x, y
    cattr = 4

class A(Proxy):

    def __init__(self, b1, b2, c):
        print "init A()"
        # must call Proxy.__init__
        super(A,self).__init__(b1=b1, b2=b2, c=c)

    def amethod(self,a):
        print "A::mymethod",a

    bmethod = ProxyMethod('b1')
    bmethod2 = ProxyMethod('b2')
    cmethod = ProxyMethod('c')

#======= output ========================================

#====== proxies.py ======================================

class _ProxyMeta(type):
    def __new__(meta, name, bases, namespace):
        for attrname,value in namespace.iteritems():
            if isinstance(value, ProxyMethod) and value.name is None:
                value.name = attrname
        return super(_ProxyMeta,meta).__new__(meta, name, bases, namespace)


class ProxyMethod(object):
    def __init__(self, proxy_attr, name=None):
        self._proxy_attr = proxy_attr
        self.name = name

    def __get__(self, proxy, proxytype):
        if proxy is not None:
            return self.__get_target_attr(proxy)
        else:
            return self.__unbound_method

    def __unbound_method(self, proxy, *args, **kwds):
        method = self.__get_target_attr(proxy)
        return method(*args, **kwds)

    def __get_target_attr(self, proxy):
        try:
            delegate = getattr(proxy, self._proxy_attr)
            return getattr(delegate, self.name)
        except AttributeError:
            raise AttributeError('%r object has no attribute %r' %
                                 (proxy.__class__.__name__, self.name))


class Proxy(object):
    __metaclass__ = _ProxyMeta

    def __init__(self, **attr2delegate):
        self.__dict__.update(attr2delegate)

#======================================================

If you want to eliminate completely specifying attributes with
strings, it's easy to modify the above so that you write instead:

class A(Proxy):
    ...
    bmethod = ProxyMethod(lambda self: self.b1)
    bmethod2 = ProxyMethod(lambda self: self.b2)

This is more verbose for the common case, but it's more flexible in
cases where the callable may be more complex than a plain getattr().
Actually you can support both, it doesn't have to be either/or; just
check whether the argument to ProxyMethod is a callable and if not,
make it:

from operator import attrgetter

class ProxyMethod(object):
    def __init__(self, proxy_attr, name=None):
        if not callable(proxy_attr):
            proxy_attr = attrgetter(proxy_attr)
        ...

Remaining Implementation is left as an exercise to the reader ;)

George

George Sakkis

Sep 27, 2008, 12:19:57 PM
On Sep 27, 11:27 am, George Sakkis <george.sak...@gmail.com> wrote:

> If you want to eliminate completely specifying attributes with
> strings, it's easy to modify the above so that you write instead:
>
> class A(Proxy):
> ...
> bmethod = ProxyMethod(lambda self: self.b1)
> bmethod2 = ProxyMethod(lambda self: self.b2)

It's funny how often you come with a better solution a few moments
after htting send! The snippet above can (ab)use the decorator syntax
so that it becomes:

class A(Proxy):

    @ProxyMethod
    def bmethod(self):
        return self.b1

    @ProxyMethod
    def bmethod2(self):
        return self.b2

With the observation that ProxyMethod has access both to the callable
that returns the delegate and the name of the delegated method, we can
remove the need for the metaclass and the Proxy base class altogether:

class proxymethod(object):
    def __init__(self, get_delegate):
        self._get_delegate = get_delegate

    def __get__(self, proxy, proxytype):
        if proxy is not None:
            return self.__get_target_attr(proxy)
        else:
            return self.__unbound_method

    def __unbound_method(self, proxy, *args, **kwds):
        method = self.__get_target_attr(proxy)
        return method(*args, **kwds)

    def __get_target_attr(self, proxy):
        get_delegate = self._get_delegate
        try:
            return getattr(get_delegate(proxy), get_delegate.__name__)
        except AttributeError:
            raise AttributeError('%r object has no attribute %r' %
                                 (proxy.__class__.__name__, get_delegate.__name__))


class A(object):

    def __init__(self, b1, b2):
        self.b1 = b1
        self.b2 = b2

    def amethod(self,a):
        print "A::mymethod",a

    @proxymethod
    def bmethod(self):
        return self.b1

    @proxymethod
    def bmethod2(self):
        return self.b2


a = A(B(10), B(20))
a.amethod('foo')

print "bound proxy calls"
a.bmethod('foo')
a.bmethod2('bar','baz')

print "unbound proxy calls"
A.bmethod(a,'foo')
A.bmethod2(a,'bar','baz')

So back to the OP's original question and after a long circle.. a
decorator might well be the right thing to use after all :)

George

Dmitry S. Makovey

Sep 27, 2008, 11:38:27 PM
George Sakkis wrote:
> It's funny how often you come with a better solution a few moments
> after htting send! The snippet above can (ab)use the decorator syntax
> so that it becomes:
>
> class A(Proxy):
>
> @ProxyMethod
> def bmethod(self):
> return self.b1
>
> @ProxyMethod
> def bmethod2(self):
> return self.b2

That is outstanding! This code looks very clean to me (just a touch cryptic
around the declarations in A, but that was unavoidable anyway). Seems like
the right way to read it would be bottom up (or is that only my perverted
mind?). By the looks of it, it does exactly what I needed, with a great
number of possibilities behind it, and is very lightweight and transparent.
Now I regret I didn't come up with it myself :-D

George, at this point I'm out of rugs - so no more rug pulling from under
your feet for me.

Now I'm going to apply all this knowledge to my code, see how that goes and
come back with more questions later.

Thank you (all) very much for a great discussion. This thread educated me
quite a bit on descriptors and why one would need them, and decorators -
just as the subject line suggested - were not forgotten.


George Sakkis

Sep 28, 2008, 6:19:45 AM
On Sep 27, 11:38 pm, "Dmitry S. Makovey" <dmi...@makovey.net> wrote:
> George Sakkis wrote:
> > It's funny how often you come with a better solution a few moments
> > after htting send! The snippet above can (ab)use the decorator syntax
> > so that it becomes:
>
> > class A(Proxy):
>
> >     @ProxyMethod
> >     def bmethod(self):
> >         return self.b1
>
> >     @ProxyMethod
> >     def bmethod2(self):
> >         return self.b2
>
> That is outstanding!

FYI, in case you missed it the final version doesn't need a Proxy base
class, just inherit from object. Also lowercased ProxyMethod to look
similar to staticmethod/classmethod:

class A(object):

    def __init__(self, b1, b2):
        self.b1 = b1
        self.b2 = b2

    @proxymethod
    def bmethod(self):
        return self.b1

    @proxymethod
    def bmethod2(self):
        return self.b2

George

Dmitry S. Makovey

Sep 28, 2008, 4:26:59 PM
George Sakkis wrote:
> FYI, in case you missed it the final version doesn't need a Proxy base
> class, just inherit from object. Also lowercased ProxyMethod to look
> similar to staticmethod/classmethod:

I caught that, just quoted the wrong one :)
