
generator object or 'send' method?


Aaron Brady

Feb 9, 2009, 12:46:26 PM
Hello,

I am writing a generator to return a sequence of numbers with some
variation. The parameters of the variation can be changed by the
caller, even after the generator is started. My question is, is it
better to wrap the generator in an object, so that the parameters can
be changed just by an attribute access? Or keep the generator clear,
and modify parameters with a 'send' call?

P.S. It would receive the 'send' from a different point in control
flow than its usual 'next'. Should it repeat a value if it receives a
'send'? Or should I wrap it in a secondary 'trap_send_and_repeat'
generator?

Thanks sincerely as always.

Terry Reedy

Feb 9, 2009, 2:22:52 PM
to pytho...@python.org
Aaron Brady wrote:
> Hello,
>
> I am writing a generator to return a sequence of numbers with some
> variation. The parameters of the variation can be changed by the
> caller, even after the generator is started. My question is, is it
> better to wrap the generator in an object, so that the parameters can
> be changed just by an attribute access?

Rather than wrap a generator, I would just write an iterator class.
A generator function is basically an abbreviated class statement. (The
abbreviation is the removal of boilerplate code.)

> Or keep the generator clear, and modify parameters with a 'send' call?

If I were modifying a single parameter and wanted the next value
returned immediately upon sending, I would do this.

> P.S. It would receive the 'send' from a different point in control
> flow than its usual 'next'. Should it repeat a value if it receives a
> 'send'? Or should I wrap it in a secondary 'trap_send_and_repeat'
> generator?

But for this I would go with the class. Pretty foolproof.
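
For concreteness, a minimal sketch of that shape (the base/spread parameters
here are only stand-ins for whatever 'variation' the original post has in
mind):

import random

class VaryingNumbers:
    # Iterator whose tuning knobs are ordinary attributes; any code holding
    # a reference to the instance can change them mid-iteration.
    def __init__(self, base=0.0, spread=1.0):
        self.base = base
        self.spread = spread
    def __iter__(self):
        return self
    def __next__(self):          # spell this 'next' in Python 2
        return self.base + random.uniform(-self.spread, self.spread)

seq = VaryingNumbers(base=10.0, spread=0.5)
print(next(seq))
seq.spread = 5.0    # changed from a different point in the control flow
print(next(seq))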

tjr

andrew cooke

Feb 9, 2009, 3:07:41 PM
to Aaron Brady, pytho...@python.org

If I were experimenting with Python to see just how far I could push
coroutines at the moment, I would use .send() and look at how I could
factor things into a small library (containing, for example, your
trap-and-response secondary generator).

But if this was paid work, I would write a class with a __next__ method
and do things explicitly.

In answer to your PS, via a roundabout route: I've done something similar
recently by using .throw() to raise an exception in the body of the
generator. This was done with a reset() function. So, for example:

mygen = ...
a = next(mygen)
b = next(mygen)
reset(mygen)
c = next(mygen)

And in the generator:

while True:
    try:
        ...
        yield ...
    except ResetException:
        yield  # discarded by the reset function

And finally:

def reset(gen):
    gen.throw(ResetException())

In that case, I needed an extra "yield" that did nothing.
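
For reference, a self-contained version of the same pattern (the counting
generator here is only an illustration):

class ResetException(Exception):
    pass

def reset(gen):
    gen.throw(ResetException())

def counter():
    n = 0
    while True:
        try:
            yield n
            n += 1
        except ResetException:
            n = 0
            yield  # discarded by reset()

g = counter()
print(next(g))   # 0
print(next(g))   # 1
reset(g)
print(next(g))   # 0 again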

Andrew



John O'Hagan

Feb 10, 2009, 12:28:26 AM
to pytho...@python.org
On Mon, 9 Feb 2009, Aaron Brady wrote:
> Hello,
>
> I am writing a generator to return a sequence of numbers with some
> variation. The parameters of the variation can be changed by the
> caller, even after the generator is started.
[...]

I would love to see a simple code example of this if you have one; I've been
wanting to do this but couldn't even get started.

Regards,

John

Steven D'Aprano

Feb 10, 2009, 1:04:16 AM


Is this too simple?


>>> def gen(n):
...     while True:
...         obj = yield n
...         if obj is not None: n = obj
...
>>> g = gen(5)
>>> g.next()
5
>>> g.next()
5
>>> g.send(12)
12
>>> g.next()
12


--
Steven

andrew cooke

Feb 10, 2009, 7:30:05 AM
to pytho...@python.org

steven probably knows this, but to flag the issue for people who are
looking at generators/coroutines for the first time: there's a little
"gotcha" about exactly how the two sides of the conversation are
synchronized. in simple terms: send also receives.

unfortunately the example steven gave doesn't really show this, so i've
modified it below. you can now see that the first next() after .send()
receives 2, not 1. note that i am using python 3, so the .next() method
is .__next__() (the asymmetry between next and send seems odd to me, but
there you go).

(in a sense this is completely logical, but if you're used to the
asynchronous way the internet works, it can seem unintuitive)

how to handle this was one of the things the original post was asking
about (if i understood correctly).

>>> def gen(n):
...     while True:
...         obj = yield n
...         n += 1
...         if obj is not None: n = obj
...
>>> g = gen(5)
>>> next(g)
5
>>> g.__next__()
6
>>> g.send(1)
1
>>> next(g)
2

andrew

Aaron Brady

Feb 10, 2009, 1:39:59 PM

I guess a generator that counts, but skips K numbers, where K can be
varied. For instance, you initialize it with N, the starting number,
and K, the jump size. Then, you can change either one later on.
(Unproduced.)

>>> j= jumper( N= 1, K= 1 )
>>> next( j )
1
>>> next( j )
2
>>> j.send( K= 3 )
5
>>> next( j )
8

However, if in the caller, the 'send' and 'next' come from two
different places, then your 'send' "eats" one of your 'next's, and the
'next' code just receives 1..2..8.

The above code isn't legal (or practical), due to the 'K=3' in send.
You could do 'j.send( dict( K= 3 ) )', but you can't do 'locals
().update( result )' in your generator, unfortunately. So you have a
big switch statement there.

One solution: You could always query a dictionary for your variables,
and do 'my_locals.update( result )', and 'my_locals[ "K" ]' for a
variable. Then you just need some sort of collector generator to go
around the first one, and duplicate the value 'send' returns, before
going back to 'next'. (Unproduced.)

>>> j= repeat_on_send( jumper( N= 1, K= 1 ) )
>>> next( j )
1
>>> next( j )
2
>>> j.send( dict( K= 3 ) )
5
>>> next( j )
5
>>> next( j )
8
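
A sketch of how those two unproduced pieces might fit together (illustrative
only, written to match the session above): the parameters live in a dict that
'send' can update, and 'repeat_on_send' re-yields the value that a 'send'
would otherwise eat.

def jumper( N= 1, K= 1 ):
    # keep the parameters in a dict so a sent dict can update them
    params= dict( N= N, K= K )
    while True:
        result= yield params[ 'N' ]
        if result is not None:
            params.update( result )
        params[ 'N' ]+= params[ 'K' ]

def repeat_on_send( gen ):
    value= next( gen )
    repeat= False
    while True:
        sent= yield value
        if sent is not None:
            # forward the send, and remember to repeat its answer
            value= gen.send( sent )
            repeat= True
        elif repeat:
            repeat= False    # the following 'next' re-sees the sent value
        else:
            value= next( gen )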

Two solution (er, solution two), would be to give a generator object
access. Unfortunately, generators have no 'gi_dict' attribute, so
arbitrary attributes aren't possible. So you have to be clever.
(Unproduced.)

>>> @gen_dict
... def jumper...
...
>>> j= jumper( N= 1, K= 1 )
>>> next( j )
1
>>> next( j )
2
>>> j.K= 3
>>> next( j )
5
>>> next( j )
8

This gets two birds with one stone, but is it possible?

Terry Reedy

Feb 10, 2009, 5:02:44 PM
to pytho...@python.org
Aaron Brady wrote:

> I guess a generator that counts, but skips K numbers, where K can be
> varied. For instance, you initialize it with N, the starting number,
> and K, the jump size. Then, you can change either one later on.

This specification is incomplete as to the timing of when changes to N
take effect and when variable K is applied.

> This gets two birds with one stone, but is it possible?

If you write an iterator class instead of trying to stretch the
abbreviated form beyond its intention, and the specification is
coherent, it should be trivial.
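
For instance, fixing the timing as "a changed K or N takes effect on the very
next value returned", a sketch (names are only illustrative):

class Jumper:
    def __init__(self, N=1, K=1):
        self.N = N
        self.K = K
        self._started = False
    def __iter__(self):
        return self
    def __next__(self):          # 'next' in Python 2
        if self._started:
            self.N += self.K
        else:
            self._started = True
        return self.N

j = Jumper(N=1, K=1)
print(next(j))   # 1
print(next(j))   # 2
j.K = 3
print(next(j))   # 5
print(next(j))   # 8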

greg

Feb 11, 2009, 5:41:22 AM
Aaron Brady wrote:

> It would receive the 'send' from a different point in control
> flow than its usual 'next'. Should it repeat a value if it receives a
> 'send'?

No. This requirement clearly indicates that send() is
the wrong tool for the job.

I would use a class with a generator as a method, e.g.

class Multiples:

    m = 1

    def set_multiplier(self, m):
        self.m = m

    def generate(self, limit):
        for i in xrange(limit):
            yield i * self.m

g = Multiples()
for x in g.generate(10):
    print x
    if x == 3:
        g.set_multiplier(42)

produces

0
1
2
3
168
210
252
294
336
378

--
Greg

Aaron Brady

Feb 11, 2009, 12:17:11 PM

Alright, but you don't have to go so far as to create an entire
class. It's alright to use a formula. Here's a way to add a
dictionary to a generator instance.

def gen_dict( fun ):
    def inner( *ar, **kw ):
        self.gen= fun( self, *ar, **kw )
        return self
    class GenWrapper:
        def __iter__( self ): return self
        def __next__( self, *ar, **kw ): return self.gen.__next__( *ar, **kw )
    self= GenWrapper()
    return inner

It's a shame that you can't set 'self.__next__= self.gen.__next__'.
It costs an entire level of call depth. On the other hand, I'm not
sure a pure delegate class is possible.

The decorator just puts the generator object in a 'gen' attribute, and
adds itself to the '__init__' argument list. So the generator has a
'self' argument, which is of type 'instance', and allows dynamic
attributes.

@gen_dict
def jumper( self, N= 1, K= 1 ):
    self.N, self.K= N, K
    while 1:
        yield self.N
        self.N+= self.K

j= jumper()
print( next( j ) )
j.K= 4
print( next( j ) )

It might need some touch-ups to run in Python 2. Neat mix. It's a
cool hybrid between Terry's and Greg's idea, and a bare generator.
Unfortunately, if you want 'self' to have any methods, you'll probably
need a class... though maybe it could inherit from 'GenWrapper'. What
do you think?

P.S. Here's a slightly simpler version.

def gen_dict( fun ):
    class GenWrapper:
        def __init__( self, *ar, **kw ):
            self.gen= fun( self, *ar, **kw )
        def __iter__( self ): return self
        def __next__( self, *ar, **kw ): return self.gen.__next__( *ar, **kw )
    return GenWrapper

P.P.S. Simpler, but a line longer.

class GenWrapper:
    def __init__( self, fun ):
        self.fun= fun
    def __call__( self, *ar, **kw ):
        self.gen= self.fun( self, *ar, **kw )
        return self
    def __iter__( self ): return self
    def __next__( self, *ar, **kw ): return self.gen.__next__( *ar, **kw )
