I didn't even know 'return' did that with yield from.
The issue of less interaction with the scheduler could use an explicit example as it seems a huge upside to this approach. E.g.
def g():
    sock = yield connect()
    data = yield sock.recv(10)
    data2 = yield sock.recv(10)
The fact that every 'yield' would go to the scheduler, while with 'yield from' some of those could short-circuit, blows my mind. IIUC it's because a 'yield' doesn't interact with its argument at all; it just sends it up the stack regardless, so whatever is yielded has no control over that.
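For concreteness, here's a minimal sketch of what I mean (Python 3.3+; recv_ready, sub, and drive are invented for illustration and are not Tulip's API):

def recv_ready(n):
    # Toy stand-in for an operation whose data is already available.
    return "x" * n

def yield_style():
    # Plain yield: every suspension point is a round trip through the
    # scheduler, even though the data was ready all along.
    data = yield recv_ready(10)
    data2 = yield recv_ready(10)
    return data + data2

def sub():
    # A sub-generator that already has its data; it hands it straight back
    # to its caller via StopIteration, never touching the scheduler.
    return recv_ready(10)
    yield  # unreachable; only here so sub() compiles as a generator

def yield_from_style():
    data = yield from sub()
    data2 = yield from sub()
    return data + data2

def drive(gen):
    # Toy trampoline: sends back whatever was yielded and counts the trips.
    trips, value = 0, None
    try:
        while True:
            value = gen.send(value)
            trips += 1
    except StopIteration as stop:
        return stop.value, trips

print(drive(yield_style()))       # 20 bytes of data, 2 scheduler round trips
print(drive(yield_from_style()))  # the same 20 bytes, 0 scheduler round trips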
How or where does one collect proverbial cookies?
Yuval Greenfield
Pardon my top posting from my phone
Right. Whereas yield-from gives its argument more freedom, as it is an iterator.
On Tue, Mar 19, 2013 at 1:02 PM, Yuval Greenfield <ubers...@gmail.com> wrote:
So then the only thing missing for me in the rationale is - why shoehorn iterators into coroutines? IIUC, with a "suspend this stack now" keyword/special-object we would have removed the need for all the "yield from"s. It seems as though this is a complicated way of achieving just that.
I guess you should have brought that up when we were discussing PEP 380. :-) People have chided me before for not calling it "await". But adding a new keyword is a lot harder than adding new syntax composed of existing keywords. And the superficial equivalence between "yield from X" and "for i in X: yield i" is occasionally useful.
And of course there's the matter that we've got a much longer tradition of conflating iterators and coroutines, starting at least with PEP 342. So I say, meh, just deal with it.
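For plain iteration the equivalence is easy to check (a quick sketch; note that yield from also forwards send() and throw() and carries a return value, which the loop spelling does not):

def flatten_a(items):
    yield from items

def flatten_b(items):
    for i in items:
        yield i

print(list(flatten_a(range(3))))  # [0, 1, 2]
print(list(flatten_b(range(3))))  # [0, 1, 2]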
As an aside, I couldn't agree more with "Async programming requires careful attention to the points where you are returning control to the scheduler" - sometimes when explaining monocle to programmers familiar with threading and mutexes, I like to say that "yield" also means "unlock EVERYTHING". The hint that "yield" or "yield from" provides vs "await" is a subtle one, but it can't hurt.
But on to my main point: While I think "yield from" is a neat feature for generators, I'm not sold on the idea that "yield from" is a better choice than "yield" for a framework like Tulip. Here are some points to consider:
- The speed advantages of working with "yield from" are small. We haven't seen this as a performance bottleneck in monocle. In fact, we discussed implementing our generator-iterating scheduler piece in C, which I think would be possible and roughly equivalent to the performance improvement from using "yield from", but we haven't gotten around to it because it hasn't been an issue. (This piece of monocle does almost exactly what the "yield from is semantically equivalent to..." code from PEP 380 does.)
- Returning Futures from coroutines has some advantages. They're easy to use from outside coroutine code, so they make libraries written in the coroutine style compatible without translation with libraries written using callbacks (especially if everyone starts standardizing on Futures). And they're clear values for introspection: if a function returns a generator, it's unclear whether that's a potential value to "yield from" in a coroutine, whereas a Future is a clear sign you're dealing with something yieldable.
- For users familiar with "yield from", it will be apparent that at some low level a "yield" must be necessary.
A) If the lowest-level IO APIs are wrapped in such a way as to make "yield from" work, it will be unclear where that is happening. Users of the framework will have a sense that they don't understand it at the deepest level until they unravel this.
B) If the lowest-level IO APIs aren't wrapped, or if "yield" is used directly with low-level IO functions in some badly written code, then low-level IO will work differently from higher-level user IO functions, making it difficult to replace the low level of IO functions in code that expects to use them. This could be an issue, for example, in seamlessly using SSL with code originally written to work with a TCP socket.
- In general, because yield has to work with Futures, sometimes people will end up using plain yield, and in the process make their code unusable with coroutine generators. That makes it necessary to sometimes wrap coroutine generators in a Future. When they're not wrapped, "yield some_generator" will silently yield the wrong type of object and fail later, in an unclear way, in event loop code (a toy sketch of this failure mode follows right after this list). On the other hand, "yield some_future" and "yield from some_future" both work. This also makes it tempting to *always* wrap coroutine generators in a Future.
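Here is that toy sketch (invented names, not Tulip's actual scheduler; the toy loop only understands Futures):

class Future:
    # Minimal stand-in for a real Future: just a slot for a result.
    def __init__(self):
        self.result = None
    def set_result(self, value):
        self.result = value

def run(coro):
    # Toy event loop: completes Futures immediately; anything else that
    # reaches it is the wrong-type case.
    value = None
    while True:
        try:
            yielded = coro.send(value)
        except StopIteration as stop:
            return stop.value
        if isinstance(yielded, Future):
            yielded.set_result("ten bytes")   # pretend the IO completed
            value = yielded.result
        else:
            raise TypeError("loop got %r, expected a Future" % (yielded,))

def recv_ten():
    f = Future()
    data = yield f            # the only point that truly needs the loop
    return data

def good():
    data = yield from recv_ten()   # delegates; only the Future reaches the loop
    return data.upper()

def bad():
    data = yield recv_ten()        # oops: hands a *generator* to the loop
    return data.upper()

print(run(good()))                 # TEN BYTES
try:
    print(run(bad()))
except TypeError as e:
    print(e)                       # loop got <generator object recv_ten ...>, expected a Future

A real loop would likely fail later and less clearly than this toy does, which is the point above.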
tl;dr: I think yield and Futures make for a nicer-looking and more simply and transparently composable API. I believe the speed issue with "yield" is not a major one, and can be dealt with in a way equivalent to the speedup experienced with "yield from". The remaining advantages of "yield from" seem to me to be about implementer convenience more than end-user convenience. All end users benefit from an implementation that goes out of its way to make things simple.
I like the names `yield` and `yield from`; I'm asking about a different functionality - one that can freeze the entire stack I'm on, i.e. converting any Python stack into a generator/coroutine. That way you wouldn't need a "transparent channel" because you could have a "direct channel". The naive approach would be more efficient: O(1) instead of O(stack_size). You can say that you prefer it to be clearly visible that a function might pause - which `yield from` does clearly signal - and I'd be content, I think. Also, we could short-circuit chained `yield from`s behind the scenes.
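Just to make the idea concrete, this is roughly what the third-party greenlet library already provides (an illustration of the primitive I mean, not a proposal to build Tulip on greenlet):

from greenlet import greenlet

def low_level_read():
    # "Block": switch the whole stack back to the scheduler greenlet and
    # resume here when it switches back in with the data.
    return scheduler.switch("need 10 bytes")

def middle():
    # Plain function calls - no yield or yield from at this level.
    return low_level_read() * 2

def task():
    return "got: " + middle()

scheduler = greenlet.getcurrent()
worker = greenlet(task)

request = worker.switch()           # runs task() until low_level_read "blocks"
print(request)                      # need 10 bytes
print(worker.switch("xxxxxxxxxx"))  # resumes the frozen stack; prints "got: " plus the data doubled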
I reread PEP 342 to see if this was discussed and found a minor bug in the trampoline, so I assume it's pseudo-code-ish:
def schedule(self, coroutine, stack=(), value=None, *exc):
    def resume():
        try:
            if exc:
                value = coroutine.throw(value, *exc)
            else:
                value = coroutine.send(value)

UnboundLocalError: local variable 'value' referenced before assignment

I'd replace the top with:

def schedule(self, coroutine, stack=(), val=None, *exc):
    def resume():
        value = val
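The underlying issue is just Python's closure scoping: because resume() assigns to value, the compiler treats value as local to resume(), shadowing the parameter, so the read on the right-hand side hits an unbound local. A stripped-down reproduction with invented names:

def make_resume(value=None):
    def resume():
        # 'value' is assigned below, so it is local to resume(); reading it
        # here therefore raises UnboundLocalError.
        value = value + 1
        return value
    return resume

make_resume(0)()   # UnboundLocalError: local variable 'value' referenced before assignment

Renaming the parameter as above avoids the shadowing; in Python 3, declaring "nonlocal value" inside resume() would be another way to fix it.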