
Peek inside iterator (is there a PEP about this?)

Luis Zarrabeitia

Oct 1, 2008, 10:46:33 AM10/1/08
to pytho...@python.org

Hi there.

For most use cases I think about, the iterator protocol is more than enough.
However, on a few cases, I've needed some ugly hacks.

Ex 1:

a = iter([1,2,3,4,5]) # assume you got the iterator from a function and
b = iter([1,2,3]) # these two are just examples.

then,

zip(a,b)

has a different side effect from

zip(b,a)

After the execution, in the first case, iterator a contains just [5], on the
second, it contains [4,5]. I think the second one is correct (the 5 was never
used, after all). I tried to implement my 'own' zip, but there is no way to
know the length of the iterator (obviously), and there is also no way
to 'rewind' a value after calling 'next'.
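
For example, a quick interactive check (assuming Python 2.x, as elsewhere in
this thread):

>>> a = iter([1,2,3,4,5])
>>> b = iter([1,2,3])
>>> zip(a, b)
[(1, 1), (2, 2), (3, 3)]
>>> list(a)   # zip pulled the 4 from a before noticing b was exhausted
[5]
>>> a = iter([1,2,3,4,5])
>>> b = iter([1,2,3])
>>> zip(b, a)
[(1, 1), (2, 2), (3, 3)]
>>> list(a)   # with the short iterator first, only 1, 2 and 3 were taken
[4, 5]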

Ex 2:

Will this iterator yield any value? Like with most iterables, a construct

if iterator:
    # do something

would be a very convenient thing to have, instead of wrapping a 'next' call on
a try...except and consuming the first item.

Ex 3:

if any(iterator):
    # do something ... but the first true value was already consumed and
    # cannot be reused. "Any" cannot peek inside the iterator without
    # consuming the value.

Instead,

i1, i2 = tee(iterator)
if any(i1):
    # do something with i2

Question/Proposal:

Has there been any PEP regarding the problem of 'peeking' inside an iterator?
Knowing if the iteration will end or not, and/or accessing the next value,
without consuming it? Is there any (simple, elegant) way around it?

Cheers,

--
Luis Zarrabeitia (aka Kyrie)
Fac. de Matemática y Computación, UH.
http://profesores.matcom.uh.cu/~kyrie

Peter Otten

Oct 1, 2008, 1:14:14 PM10/1/08
to
Luis Zarrabeitia wrote:

> For most use cases I think about, the iterator protocol is more than
> enough. However, on a few cases, I've needed some ugly hacks.
>
> Ex 1:
>
> a = iter([1,2,3,4,5]) # assume you got the iterator from a function and
> b = iter([1,2,3]) # these two are just examples.

Can you provide a concrete use case?



> then,
>
> zip(a,b)
>
> has a different side effect from
>
> zip(b,a)
>
> After the execution, in the first case, iterator a contains just [5], on
> the second, it contains [4,5]. I think the second one is correct (the 5
> was never used, after all). I tried to implement my 'own' zip, but there
> is no way to know the length of the iterator (obviously), and there is
> also no way to 'rewind' a value after calling 'next'.
>
> Ex 2:
>
> Will this iterator yield any value? Like with most iterables, a construct
>
> if iterator:
> # do something

I don't think this has a chance. By adding a __len__ to some iterators R.
Hettinger once managed to break GvR's code. The BDFL was not amused.



> would be a very convenient thing to have, instead of wrapping a 'next'
> call on a try...except and consuming the first item.
>
> Ex 3:
>
> if any(iterator):
> # do something ... but the first true value was already consumed and
> # cannot be reused. "Any" cannot peek inside the iterator without
> # consuming the value.

from itertools import ifilter

for item in ifilter(bool, iterator):
    # do something
    break

is not that bad.

> Instead,
>
> i1, i2 = tee(iterator)
> if any(i1):
> # do something with i2
>
> Question/Proposal:
>
> Has there been any PEP regarding the problem of 'peeking' inside an
> iterator? Knowing if the iteration will end or not, and/or accessing the
> next value, without consuming it? Is there any (simple, elegant) way
> around it?

Personally I think that Python's choice of EAFP over LBYL is a good one, but
one that cannot easily be reconciled with having peekable iterators. If I
were in charge I'd rather simplify the iterator protocol (scrap send() and
yield expressions) than making it more complex.

Peter

Lie Ryan

Oct 1, 2008, 1:22:32 PM10/1/08
to pytho...@python.org
On Wed, 01 Oct 2008 10:46:33 -0400, Luis Zarrabeitia wrote:

> Hi there.


>
> For most use cases I think about, the iterator protocol is more than
> enough. However, on a few cases, I've needed some ugly hacks.
>
> Ex 1:
>
> a = iter([1,2,3,4,5]) # assume you got the iterator from a function and
> b = iter([1,2,3]) # these two are just examples.
>

> then,
>
> zip(a,b)
>
> has a different side effect from
>
> zip(b,a)
>
> After the execution, in the first case, iterator a contains just [5],
> on the second, it contains [4,5]. I think the second one is correct (the
> 5 was never used, after all). I tried to implement my 'own' zip, but
> there is no way to know the length of the iterator (obviously), and
> there is also no way to 'rewind' a value after calling 'next'.
>
> Ex 2:
>
> Will this iterator yield any value? Like with most iterables, a
> construct
>
> if iterator:
> # do something
>

> would be a very convenient thing to have, instead of wrapping a 'next'
> call on a try...except and consuming the first item.
>
> Ex 3:
>
> if any(iterator):
> # do something ... but the first true value was already consumed and
> # cannot be reused. "Any" cannot peek inside the iterator without #
> consuming the value.
>

> Instead,
>
> i1, i2 = tee(iterator)
> if any(i1):
> # do something with i2
>
> Question/Proposal:
>
> Has there been any PEP regarding the problem of 'peeking' inside an
> iterator?

No (or I'm not aware of any). Why? Because for some iterables it is not
possible to know the length in advance (a video data stream, for example),
or whether it will ever end (the digits of pi).

Second, in Python an iterator is a use-once object; it is not designed to
be reused. Some languages, especially the purely functional ones, allow
multiple uses of an iterator because they guarantee immutability; Python
allows mutable objects, and so cannot provide that guarantee.

> Knowing if the iteration will end or not, and/or accessing the
> next value, without consuming it?

No, it is not possible to do that for some iterators. For example, this
code:

import time

class Iterable(object):
    def __iter__(self):
        return self
    def next(self):
        return time.time()

If you peeked at this iterator in advance, the result would be different
from the result you get when you actually need it.

> Is there any (simple, elegant) way
> around it?

Simple, but probably not that elegant: if you need such fine control,
use a while loop.
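
Something along these lines, perhaps (a small sketch in Python 2 syntax; the
loop body is just an example):

it = iter([1, 2, 3])
while True:
    try:
        item = it.next()
    except StopIteration:
        break              # the iterator is exhausted
    print item             # fine-grained control over each item goes here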

George Sakkis

unread,
Oct 1, 2008, 2:23:53 PM10/1/08
to

Testing for an empty iterator: http://code.activestate.com/recipes/413614/

There also used to be a more general recipe at the Cookbook but it
seems it has not made it to the revamped site. Thanks to the web
archive, here's a link:
http://web.archive.org/web/20060529065931/http://aspn.activestate.com/ASPN/Cookbook/Python/Recipe/304373
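
The general idea behind those recipes (sketched from memory, so treat this as
an approximation rather than the actual recipe) is to pull one item and chain
it back in front of the rest:

from itertools import chain

def check_empty(iterable):
    "Return (is_empty, equivalent_iterator)."
    it = iter(iterable)
    try:
        first = it.next()
    except StopIteration:
        return True, iter([])
    return False, chain([first], it)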

HTH,
George

Aaron "Castironpi" Brady

Oct 1, 2008, 3:04:58 PM10/1/08
to

It wouldn't be that hard to make your own.

a = peekingiter([1,2,3,4,5])
b = peekingiter([1,2,3])

Just don't cross it with typing and get peking duck.

Terry Reedy

Oct 1, 2008, 4:14:09 PM10/1/08
to pytho...@python.org
Luis Zarrabeitia wrote:
> Hi there.
>
> For most use cases I think about, the iterator protocol is more than enough.
> However, on a few cases, I've needed some ugly hacks.
>
> Ex 1:
>
> a = iter([1,2,3,4,5]) # assume you got the iterator from a function and
> b = iter([1,2,3]) # these two are just examples.
>
> then,
>
> zip(a,b)
>
> has a different side effect from
>
> zip(b,a)
>
> After the execution, in the first case, iterator a contains just [5], on the
> second, it contains [4,5]. I think the second one is correct (the 5 was never
> used, after all). I tried to implement my 'own' zip, but there is no way to
> know the length of the iterator (obviously), and there is also no way
> to 'rewind' a value after calling 'next'.

Interesting observation. Iterators are intended for 'iterate through
once and discard' usages. To zip a long sequence with several short
sequences, either use itertools.chain(short sequences) or put the short
sequences as the first zip arg.
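
For instance, combining both suggestions (untested, Python 2.x; izip keeps
everything lazy):

from itertools import chain, izip

long_it = iter([1, 2, 3, 4, 5, 6])
shorts = [[10, 20], [30]]
# The chained short sequences go first, so izip stops without pulling an
# extra item from long_it.
print list(izip(chain(*shorts), long_it))   # [(10, 1), (20, 2), (30, 3)]
print list(long_it)                         # [4, 5, 6] -- nothing wasted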

> Ex 2:
>
> Will this iterator yield any value? Like with most iterables, a construct
>
> if iterator:
> # do something
>
> would be a very convenient thing to have, instead of wrapping a 'next' call on
> a try...except and consuming the first item.

To test without consuming, wrap the iterator in a trivial-to-write
one_ahead or peek class such as has been posted before.

> Ex 3:
>
> if any(iterator):
> # do something ... but the first true value was already consumed and
> # cannot be reused. "Any" cannot peek inside the iterator without
> # consuming the value.

If you are going to do something with the true value, use a for loop and
break. If you just want to peek inside, use a sequence (list(iterator)).

> Instead,
>
> i1, i2 = tee(iterator)
> if any(i1):
> # do something with i2

This effectively makes two partial lists and tosses one. That may or
may not be a better idea.

> Question/Proposal:
>
> Has there been any PEP regarding the problem of 'peeking' inside an iterator?

Iterators are not sequences and, in general, cannot be made to act like
them. The iterator protocol is a bare-minimum, least-common-denominator
requirement for inter-operability. You can, of course, add methods to
iterators that you write for the cases where one-ahead or random access
*is* possible.

> Knowing if the iteration will end or not, and/or accessing the next value,
> without consuming it? Is there any (simple, elegant) way around it?

That much is trivial. As suggested above, write a wrapper with the
exact behavior you want. A sample (untested)

class one_ahead():
    "Self.peek is the next item or undefined"
    def __init__(self, iterator):
        try:
            self.peek = next(iterator)
            self._it = iterator
        except StopIteration:
            pass
    def __bool__(self):
        return hasattr(self, 'peek')
    def __next__(self): # 3.0, 2.6?
        try:
            item = self.peek        # don't shadow the builtin next()
            try:
                self.peek = next(self._it)
            except StopIteration:
                del self.peek
            return item
        except AttributeError:
            raise StopIteration

Terry Jan Reedy

Luis Zarrabeitia

Oct 1, 2008, 8:42:30 PM10/1/08
to pytho...@python.org
On Wednesday 01 October 2008 01:14:14 pm Peter Otten wrote:

> Luis Zarrabeitia wrote:
> > a = iter([1,2,3,4,5]) # assume you got the iterator from a function and
> > b = iter([1,2,3]) # these two are just examples.
>
> Can you provide a concrete use case?

I'd like to... but I've refactored away all the examples I had, as soon as I
realized that I didn't know which one was the shortest sequence to put it
first.

But, it went something like this:

===
def do_stuff(tasks, params):
    params = iter(params)
    for task in tasks:
        for partial_task, param in zip(task, params):
            pass #blah blah, do stuff here.
        print "task completed"
===

Unfortunately that's not the real example (as it is, it shows very bad
programming), but imagine if params and/or tasks were streams beyond your
control (a data stream and a control stream). Note that I wouldn't like a
task or param to be wasted.

I didn't like the idea of changing both the 'iter' and the 'zip' (changing
only one of them wouldn't have worked).

> > Will this iterator yield any value? Like with most iterables, a construct
> >
> > if iterator:
> > # do something
>
> I don't think this has a chance. By adding a __len__ to some iterators R.
> Hettinger once managed to break GvR's code. The BDFL was not amused.

Ouch :D
But, no no no. Adding a __len__ to iterators makes little sense (especially
in my example), and adding an optional __len__ that some iterators have and
some don't (the ones that can't know their own lengths) would break too many
things, and still, wouldn't solve the problem of knowing if there is a next
element. A __nonzero__() that would move the iterator forward and cache the
result, with a next() that would check the cache before advancing, would be
closer to what I'd like.
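
Something like this, perhaps (a rough, untested sketch; the class name is just
for illustration):

class boolable_iter(object):
    "Wrap an iterator; bool() peeks ahead and caches one item."
    _missing = object()
    def __init__(self, iterator):
        self._it = iter(iterator)
        self._cached = self._missing
    def __iter__(self):
        return self
    def __nonzero__(self):
        if self._cached is self._missing:
            try:
                self._cached = self._it.next()
            except StopIteration:
                return False
        return True
    def next(self):
        if self._cached is not self._missing:
            value, self._cached = self._cached, self._missing
            return value
        return self._it.next()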

> > if any(iterator):
> > # do something ... but the first true value was already consumed and
> > # cannot be reused. "Any" cannot peek inside the iterator without
> > # consuming the value.
>
> for item in ifilter(bool, iterator):
>     # do something
>     break

It is not that bad, but (feel free to consider this silly) I don't like
breaks. In this case, you would have to read until the end of the block to
see that what you wanted was an if (if you are lucky you may figure out
earlier that you wanted to simulate an if test).

(Well, I use breaks sometimes, but most of them are because I need to test if
an iterator is empty or not)

> Personally I think that Python's choice of EAFP over LBYL is a good one,
> but one that cannot easily be reconciled with having peekable iterators. If
> I were in charge I'd rather simplify the iterator protocol (scrap send()
> and yield expressions) than making it more complex.

Oh, I defend EAFP strongly. At my university LBYL is preferred, so whenever I
teach Python, I have to give strong examples of why I like EAFP.

When an empty iterator means that something is wrong, I wouldn't think of
using "if iterator:". That would be masking what should be an exception.
However, if "iterator is empty" is meaningful, that case should go in an
"else" clause rather than in "except". Consider having to find the first
non-empty iterator in a list and then send it to another function - you
can't test for emptiness with a "for" there, or you could lose the first
element.

On Wednesday 01 October 2008 04:14:09 pm Terry Reedy wrote:
> Interesting observation. Iterators are intended for 'iterate through
> once and discard' usages. To zip a long sequence with several short
> sequences, either use itertools.chain(short sequences) or put the short
> sequences as the first zip arg.

I guess that the use of the word 'rewind' wasn't right. To 'push back' an item
into the iterator would be an ugly hack to work around not being able to know
whether it was there in the first place.

Putting the short sequences first won't help a lot when you cannot know which
sequence is shorter, and chaining all of them could be hard to read at best -
look at the do_stuff function at the beginning of this message.

> > Knowing if the iteration will end or not, and/or accessing the
> > next value, without consuming it?

> No, it is not possible to do that for some iterators. For example, this
> code:
(...)


> if you peeked the iterator in advance, the result would be different
> compared to the result when you actually need it.

But that's one of the cases where one should know what one is doing. Both C#
and Java have iterators that let you know whether they are finished before
consuming the next item. (I didn't mean to compare, and I like Java's more
than C#'s, as Java's iterators also promote the 'use once' design.)

This may be dreaming, but if the default iter() constructor returned an object
with a .has_next() or a __nonzero__() method (if the object returned by
__iter__() had a 'has_next', no problem; if it didn't, just wrap it - no
problem with backwards compatibility), functions like zip or self-made zips
could make use of it... Common cases could be solved by having has_next
compute the next value and save it until 'next()' is called, and weird cases
like your time() example could simply define has_next() to return true.

I think that my only objection to this (besides the similarities with Java)
is that it could promote an LBYL style.

Aaron "Castironpi" Brady

Oct 1, 2008, 11:07:14 PM10/1/08
to

Terry's is close. For 2.5 you need '__nonzero__' instead of '__bool__', the
missing '__iter__', 'next' instead of '__next__', and 'self._it.next( )'
instead of 'next(self._it)'.

Then just define your own 'peekzip'. Short:

def peekzip( *itrs ):
    while 1:
        if not all( itrs ):
            raise StopIteration
        yield tuple( [ itr.next( ) for itr in itrs ] )

In some cases, you could require 'one_ahead' instances as the arguments to
peekzip, or create them yourself from the incoming iterators.

Here is your output: The first part uses zip, the second uses peekzip.

[(1, 1), (2, 2), (3, 3)]
5
[(1, 1), (2, 2), (3, 3)]
4

4 is what you expect.

Here's the full code.

class one_ahead(object):
    "Self.peek is the next item or undefined"
    def __init__(self, iterator):
        try:
            self.peek = iterator.next( )
            self._it = iterator
        except StopIteration:
            pass
    def __nonzero__(self):
        return hasattr(self, 'peek')
    def __iter__(self):
        return self
    def next(self): # 2.5 spelling of __next__
        try:
            next = self.peek
            try:
                self.peek = self._it.next( )
            except StopIteration:
                del self.peek
            return next
        except AttributeError:
            raise StopIteration

a= one_ahead( iter( [1,2,3,4,5] ) )
b= one_ahead( iter( [1,2,3] ) )
print zip( a,b )
print a.next()

def peekzip( *itrs ):
    while 1:
        if not all( itrs ):
            raise StopIteration
        yield tuple( [ itr.next( ) for itr in itrs ] )

a= one_ahead( iter( [1,2,3,4,5] ) )
b= one_ahead( iter( [1,2,3] ) )
print list( peekzip( a,b ) )
print a.next()

There's one more option, which is to create your own 'push-backable'
class, which accepts a 'previous( item )' message.

(Unproduced)
>>> a= push_backing( iter( [1,2,3,4,5] ) )
>>> a.next( )
1
>>> a.next( )
2
>>> a.previous( 2 )
>>> a.next( )
2
>>> a.next( )
3
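
A minimal sketch of such a class (untested; 'push_backing' and 'previous' are
just the names used in the session above):

class push_backing(object):
    def __init__(self, iterator):
        self._it = iter(iterator)
        self._pushed = []              # stack of pushed-back items
    def __iter__(self):
        return self
    def next(self):
        if self._pushed:
            return self._pushed.pop()
        return self._it.next()
    def previous(self, item):
        "Push 'item' back; the next call to next() returns it again."
        self._pushed.append(item)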

Peter Otten

Oct 2, 2008, 4:40:53 AM10/2/08
to
Luis Zarrabeitia wrote:

> On Wednesday 01 October 2008 01:14:14 pm Peter Otten wrote:
>> Luis Zarrabeitia wrote:
>> > a = iter([1,2,3,4,5]) # assume you got the iterator from a function and
>> > b = iter([1,2,3]) # these two are just examples.
>>
>> Can you provide a concrete use case?
>
> I'd like to... but I've refactored away all the examples I had, as soon as
> I realized that I didn't know which one was the shortest sequence to put
> it first.
>
> But, it went something like this:
>
> ===
> def do_stuff(tasks, params):
>     params = iter(params)
>     for task in tasks:
>         for partial_task, param in zip(task, params):
>             pass #blah blah, do stuff here.
>         print "task completed"
> ===
>
> Unfortunately that's not the real example (as it is, it shows very bad
> programming), but imagine if params and/or tasks were streams beyond your
> control (a data stream and a control stream). Note that I wouldn't like a
> task or param to be wasted.

This remains a bit foggy to me. Maybe you are better off with deques than
iterators?



> I didn't like the idea of changing both the 'iter' and the 'zip' (changing
> only one of them wouldn't have worked).
>
>> > Will this iterator yield any value? Like with most iterables, a
>> > construct
>> >
>> > if iterator:
>> > # do something
>>
>> I don't think this has a chance. By adding a __len__ to some iterators R.
>> Hettinger once managed to break GvR's code. The BDFL was not amused.
>
> Ouch :D
> But, no no no. Adding a __len__ to iterators makes little sense (especially
> in my example), and adding an optional __len__ that some iterators have
> and some don't (the ones that can't know their own lengths) would break too
> many things, and still, wouldn't solve the problem of knowing if there is
> a next element. A __nonzero__() that would move the iterator forward and
> cache the result, with a next() that would check the cache before
> advancing, would be closer to what I'd like.

The problem was that __len__() acts as a fallback for __nonzero__(), see

http://mail.python.org/pipermail/python-dev/2005-September/056649.html
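
A tiny illustration of that fallback (the class is made up for the example;
Python 2.x):

class CountdownIter(object):
    "An iterator that also reports how many items are left."
    def __init__(self, n):
        self.n = n
    def __iter__(self):
        return self
    def __len__(self):
        return self.n
    def next(self):
        if self.n == 0:
            raise StopIteration
        self.n -= 1
        return self.n

it = CountdownIter(1)
print bool(it)   # True  -- falls back to __len__() == 1
it.next()
print bool(it)   # False -- the exhausted iterator suddenly tests false,
                 # which is what broke code expecting iterators to be true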

>> > if any(iterator):
>> > # do something ... but the first true value was already consumed and
>> > # cannot be reused. "Any" cannot peek inside the iterator without
>> > # consuming the value.
>>
>> for item in ifilter(bool, iterator):
>>     # do something
>>     break
>
> It is not that bad, but (feel free to consider this silly) I don't like
> breaks. In this case, you would have to read until the end of the block to
> see that what you wanted was an if (if you are lucky you may figure out
> earlier that you wanted to simulate an if test).

Ok, make it

for item in islice(ifilter(bool, iterator), 1):
    # do something

then ;)

> (Well, I use breaks sometimes, but most of them are because I need to test
> if an iterator is empty or not)
>
>> Personally I think that Python's choice of EAFP over LBYL is a good one,
>> but one that cannot easily be reconciled with having peekable iterators.
>> If I were in charge I'd rather simplify the iterator protocol (scrap
>> send() and yield expressions) than making it more complex.
>
> Oh, I defend EAFP strongly. At my university LBYL is preferred, so
> whenever I teach Python, I have to give strong examples of why I like
> EAFP.
>
> When an empty iterator means that something is wrong, I wouldn't think of
> using "if iterator:". That would be masking what should be an exception.
> However, if "iterator is empty" is meaningful, that case should go in an
> "else" clause rather than in "except". Consider having to find the first
> non-empty iterator in a list and then send it to another function - you
> can't test for emptiness with a "for" there, or you could lose the first
> element.

You can do it

from itertools import chain

def non_empty(iterators):
    for iterator in iterators:
        it = iter(iterator)
        try:
            yield chain([it.next()], it)
        except StopIteration:
            pass

for it in non_empty(iterators):
    return process(it)

but with iterators as they currently are in Python you'd better rewrite
process() to handle empty iterators and then write

for it in iterators:
    try:
        return process(it)
    except NothingToProcess: # made up
        pass

That's how I understand EAFP. Assume one normal program flow and deal with
problems as they occur.

> But that's one of the cases where one should know what one is doing. Both
> C# and Java have iterators that let you know whether they are finished
> before consuming the next item. (I didn't mean to compare, and I like
> Java's more than C#'s, as Java's iterators also promote the 'use once'
> design.)

I think that may be the core of your problem. Good code built on Python's
iterators will not resemble the typical Java approach.

Peter

Steven D'Aprano

Oct 2, 2008, 10:06:24 AM10/2/08
to
On Wed, 01 Oct 2008 16:14:09 -0400, Terry Reedy wrote:

> Iterators are intended for 'iterate through once and discard' usages.

Also for reading files, which are often seekable.

I don't disagree with the rest of your post, I thought I'd just make an
observation that if the data you are iterating over supports random
access, it's possible to write an iterator that also supports random
access.
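
A rough sketch of that idea (untested; assumes an ordinary seekable file
object, Python 2.x):

class seekable_lines(object):
    "Iterate over the lines of a seekable file, with a non-consuming peek()."
    def __init__(self, fileobj):
        self._f = fileobj
    def __iter__(self):
        return self
    def next(self):
        line = self._f.readline()
        if not line:
            raise StopIteration
        return line
    def peek(self):
        "Return the next line without consuming it (or None at EOF)."
        pos = self._f.tell()
        line = self._f.readline()
        self._f.seek(pos)
        return line or None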

--
Steven
