Thank-you :)
-cj.
// test_and_load does two things. First it checks to see if a filepath
// is really a file and not a directory or something crazy.
// If it's not a file it emits "undefined" (except for one case where null is used)
// If it is a file, then it emits the file's contents
// Any errors in fs.stat or fs.readFile are passed through as is.
// In these examples, we're assuming that the fs module is of the same
// style as the function consuming it. So we're showing a small example of
// producer and consumer for each of the styles.

// simple continuable with separate callback and errback
function test_and_load(filename) { return function (callback, errback) {
  fs.stat(filename)(function (stat) {
    // Filter out non-files
    if (!stat.isFile()) { callback(); return; }
    // Otherwise read the file in
    fs.readFile(filename)(callback, errback);
  }, errback);
}}

// simple continuable with single combined callback
function test_and_load(filename) { return function (callback) {
  fs.stat(filename)(function (stat) {
    // Pass along any errors before we do anything
    if (stat instanceof Error) { callback(stat); return; }
    // Filter out non-files
    if (!stat.isFile()) { callback(); return; }
    // Otherwise read the file in
    fs.readFile(filename)(callback);
  });
}}

// inimino style continuables
function test_and_load(filename) { return function (cont) {
  fs.stat(filename)(rightContinuation(cont)(function (stat) {
    // Filter out non-files
    if (!stat.isFile()) { cont(); return; }
    // Otherwise read the file in
    fs.readFile(filename)(cont);
  }));
}}

function test_and_load(filename) {
  return fs.stat(filename)(function (stat) {
    // Pass along any errors before we do anything
    if (stat instanceof Error) { return; }
    // Filter out non-files
    if (!stat.isFile()) { return null; }
    // Otherwise read the file in
    return fs.readFile(filename);
  });
}

// promise based with node-promise helpers (github.com/kriszyp/node-promise)
var when = require("promise").when;
function test_and_load(filename) {
  return when(fs.stat(filename), function (stat) {
    // Filter out non-files
    if (!stat.isFile()) { return; }
    // Otherwise read the file in
    return fs.readFile(filename);
  });
}

// promise based with CommonJS monkey patch (github.com/kriszyp/node-commonjs/blob/master/patch-promise.js)
function test_and_load(filename) {
  return fs.stat(filename).then(function (stat) {
    // Filter out non-files
    if (!stat.isFile()) { return; }
    // Otherwise read the file in
    return fs.readFile(filename);
  });
}

// promise based (how it works in current node)
function test_and_load(filename) {
  var promise = new process.Promise();
  fs.stat(filename).addCallback(function (stat) {
    // Filter out non-files
    if (!stat.isFile()) { promise.emitSuccess(); return; }
    // Otherwise read the file in
    fs.readFile(filename).addCallback(function (data) {
      promise.emitSuccess(data);
    }).addErrback(function (error) {
      promise.emitError(error);
    });
  }).addErrback(function (error) {
    promise.emitError(error);
  });
  return promise;
}
Well, for file system operations I've decided to allow sync op
exposure because of the high possibility of file system caches. For
other things which use promises (DNS resolution and simple HTTP GET
requests) there will be no synchronous interface.
I'm glad we're discussing this - I think the current promises could be improved.
+1
> - A true sync version of that library for the times you don't care about blocking
+0.5
I'd be even happier with no such library, but given the
existence of .wait(), this boat has probably sailed...
> - And a "blessed" async library that wraps the base async library in Ryan's favorite of these options.
-1
I don't think there needs to be a "blessed" library, but
we do need an installer for packages (which I know several
people are working on). Once this is in place it should be
no problem to pull in async convenience libraries for use in
your own projects or as dependencies of other libraries or
frameworks you install.
I think Promises will get a more full-featured implementation
this way than what most node users have seen of them so far,
while Node itself can stay light, and we can have continuable
libraries and other approaches as well to suit various styles
and preferences.
I think there should be a reasonable "start here" for people just
getting into things. I'd also appreciate having something for the
times when I'm on a remote system and want to bang out a script
without installing stuff (this is actually one of my annoyances with
Python, I hate the stdlib for scripting jobs). Maybe the sync versions
can cover that but I'd still like some blessed option.
I actually have ideas on yet another way to do this that I'd like to
try out over the weekend, but I'd be happy enough with more or less
all of the options presented.
I was informed by Kris Kowal that you guys are discussing various
options for an asynchronous API. Having worked extensively with a
promise-based API myself, Kris thought it would be useful if I would
share my thoughts on this list. So here goes:
A promise-based API has the benefit that asynchronous actions can be
'chained' implicitly and that they bundle up callback and errback in a
single abstraction. Consider the simplest form of composition: an
asynchronous action A that just wants to chain two asynchronous
actions B and C. A good measuring point to compare each async. version
against is the sequential version:
function A(x) { return B(C(x)); }
With explicit continuations, you get something resembling:
function A(x, cb, errb) { return C(x, function(val) { return B(val,
cb, errb); }, errb); }
With promises you get:
function A(x) { return when(C(x), function(val) { return B(val); }); }
The difference may not appear to be that big, but the key point is
that it's not a constant-factor improvement: the more async.
compositions you perform in your code, the larger the difference
between the callback and the promise-based version becomes. Another
cool thing to note with a promise-based API is that the "signature" of
your functions is unmodified as compared to the sequential API, except
that the "return type" is of type promise<T> instead of type T, so to
speak. In my experience, this reduces the burden of reasoning about
async API's. And it means a promise-based API plays nicer with
variable-argument functions, which don't have to perform
argument-list surgery to treat continuation arguments specially.
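To make Tom's comparison concrete, here is a minimal sketch. Nothing below is Node's API: defer() and when() are illustrative toy implementations, and B and C are stand-in "async" actions that resolve immediately for brevity.

```javascript
// A tiny promise: resolve() fires listeners; then() registers them,
// firing immediately if the value is already available.
function defer() {
  var callbacks = [], result, done = false;
  return {
    resolve: function (value) {
      done = true; result = value;
      callbacks.forEach(function (cb) { cb(value); });
    },
    then: function (cb) {
      if (done) cb(result); else callbacks.push(cb);
    }
  };
}

// when(value, cb) normalizes: cb's return value (plain or promise)
// becomes the resolution of the promise when() returns.
function when(value, cb) {
  var out = defer();
  function fire(v) {
    var r = cb(v);
    if (r && typeof r.then === "function") r.then(out.resolve);
    else out.resolve(r);
  }
  if (value && typeof value.then === "function") value.then(fire);
  else fire(value);
  return out;
}

// Two stand-in async actions:
function C(x) { var d = defer(); d.resolve(x + 1); return d; }
function B(x) { var d = defer(); d.resolve(x * 2); return d; }

// The composition A(x) = B(C(x)) stays flat, no nesting per step:
function A(x) { return when(C(x), function (val) { return B(val); }); }

A(3).then(function (v) { console.log(v); }); // (3 + 1) * 2 = 8
```

Each additional chained action adds one more flat `when` call rather than one more level of callback nesting, which is the non-constant-factor difference Tom points at.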
I can think of other reasons for choosing promises, but given the
context of this discussion, these may not be that relevant here.
Just my $0.02.
Cheers,
Tom
// Plain callback style
function read(filename, callback, errback) {
  // Do async work and then...
  callback(data);
}

// Continuation style
function read(filename) { return function (callback, errback) {
  // Do async work and then...
  callback(data);
}}

// Using a callback
read("myfile.txt", function (data) {
  // use data
}, error_handler);

// Using a continuation
read("myfile.txt")(function (data) {
  // Use data
}, error_handler);
I think Tim's suggestions go a long way towards making this possible. So, +1 for
// three ways of calling fs.open()
// sync
try {
fs.open('/tmp/hello', 'r+', 0600);
puts('complete');
} catch (err) {
puts('error');
}
// simple async
fs.open('/tmp/hello', 'r+', 0600, function (err, fd) {
if (err) {
puts('error');
} else {
puts('complete');
}
});
// promise async
promise(fs.open, '/tmp/hello', 'r+', 0600)(function (fd) {
puts('complete');
}, function (err) {
puts('error');
});
Then we can expose all three options.
> // promise async
> promise(fs.open, '/tmp/hello', 'r+', 0600)(function (fd) {
> puts('complete');
> }, function (err) {
> puts('error');
> });
I mentioned this a few days ago[1], but if you add a curry-like
function to Function.prototype you could end up with
fs.open.promise('/tmp/hello', 'r+', 0600)(function (fd) {
puts('complete');
}, function (err) {
puts('error');
});
which is then syntactically very close to continuables.
I do think it's often useful to encapsulate the "what happens next"
code paths into an object, though. Objects are not a necessary part of
any programming language, but if objects are fundamental to
JavaScript, and async programming is fundamental to JavaScript, then I
think it's useful to have them combined in some way.
If the "what happens next" stuff is an object the various code paths
are named, they can be inherited and monkey patched and inspected,
they can easily be passed around and emitted, and they can easily
support more than two code paths, as well as operations like timeout.
Many of the continuable examples I've seen are very elegant (they're
particularly nice when chaining a "success" code path), but I think
they get a bit awkward when the "error" code path is introduced (is
error a second argument, or is a single argument "overloaded"?) and
none have a good solution for timeouts.
Michael
[1] http://groups.google.com/group/nodejs/msg/b8a9b8b644bbd291
I'm a bit confused here. I thought a "continuable" was the curried
version of a function written in continuation passing style[1].
So you have
function add(i, j) {
return i + j;
}
console.log(add(1, 2));
// -> 3
function add_cps(i, j, callback) { // cps = continuation passing style
callback(add(i, j));
}
add_cps(1, 2, function(x) { console.log(x); });
// -> 3
function add_continuable(i, j) {
return function(callback) { add_cps(i, j, callback); }
}
add_continuable(1, 2)(function (x) { console.log(x); });
// -> 3
I thought "add_continuable" was the only actual "continuable" in the
code above, and that the various callback arguments are the
"continuations"--?
Michael
I believe the current promise model (or the promise function
shorthand, as suggested by Ryan above) in combination with the
CommonJS "then" would both give us flexibility and put all "cards on
the table" (fewer language tricks causing confusion).
I'd rather write an extra line of code than have to explain the code to someone.
> --
> You received this message because you are subscribed to the Google Groups "nodejs" group.
> To post to this group, send email to nod...@googlegroups.com.
> To unsubscribe from this group, send email to nodejs+un...@googlegroups.com.
> For more options, visit this group at http://groups.google.com/group/nodejs?hl=en.
>
>
--
Rasmus Andersson
1. API/signature for registering listeners. Node's promises are an
object with addCallback/addErrback for registering listeners.
CommonJS's promises are objects that use "then" to register the
callback and error handler. Continuables are the function itself that
is called to register the listeners. Interestingly, continuables are
actually closer to CommonJS promises here, since you have a single
function for registering listeners; they only differ in whether the
promise is that function itself or an object that holds the function.
Modifying continuables to use then would be pretty trivial as well: you
just use "return {then:function(..." instead of "return function(...".
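The trivial change Kris describes could look like this (a sketch; the readFile body is a synchronous stand-in for real async work):

```javascript
// A continuable exposed as an object with "then" rather than as a bare
// function -- only the wrapper around the inner function changes.
function readFile(filename) {
  return { then: function (callback, errback) {
    // do async work, then...
    callback("contents of " + filename);
  }};
}

readFile("myfile.txt").then(function (data) {
  console.log(data);
});
```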
And actually, the continuable style of promise API is similar to
ref_send's promises, which are also functions that are called
directly. We have certainly considered this before, but, the major
flaw in this design, and the reason we didn't follow this approach in
CommonJS, is that it becomes very hard for functions that need to
differentiate between sync values and async values to branch when the
only definition of a promise is a value where (typeof value ===
"function"). This type of branching is a key part of the when()
construct (and I use when() all the time), where it provides
normalization between sync and async. If you call a function that
returns a function, how do you determine if the return value is
synchronous and really a function, or is a promise/continuable that
needs a listener? Of course the other aspect of this is more
subjective, having a function call in your code, instead of just
adjacent parenthesis explicitly indicates to the reader of the code
that you are setting up a function to handle the completion of an
operation.
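A minimal sketch of the branching problem Kris describes (this `when` is a toy, not node-promise's implementation): an object with a "then" method is a usable promise marker, while a bare function is ambiguous.

```javascript
// when() wants to branch on "is this value a promise?"
function when(value, callback) {
  if (value && typeof value.then === "function") {
    value.then(callback);   // looks like a promise: register a listener
  } else {
    callback(value);        // plain value: use it directly
  }
}

when(5, function (v) { console.log(v); });                 // sync value
when({ then: function (cb) { cb(6); } }, function (v) {   // promise-like
  console.log(v);
});

// But if promises were bare functions, a function that legitimately
// returns a function would be indistinguishable from one returning a
// promise/continuable:
function makeAdder(n) { return function (x) { return x + n; }; }
var result = makeAdder(1); // a value, yet typeof result === "function"
```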
2. No automatic chaining. Node's promises don't support chaining
anyway, but CommonJS promises do. Without chaining, users have to
manually fulfill the continuable/promise that is returned. The gist
that started this discussion (thanks Tim for putting that together) clearly shows the
difference between chaining and no chaining.
3. Registering listeners triggers the execution of the async
operation. With normal promises, execution and listener registration
are separate concerns. One can fire off an async operation and then
safely add (or not add) listeners without affecting the operation.
This is probably the most dangerous and insidious aspect of
continuables, IMO. If multiple listeners are registered, this causes
multiple firings of the async operation. If one were to do something
like:
var myContinuable = appendToMyFile();
printResults(myContinuable);
Now if the author of printResults decided to register an extra
callback to do another debugging statement, that would cause multiple
executions of the action in appendToMyFile. In this case, where
appendToMyFile is probably not idempotent, this results in duplicate
appends to the file. Users have an enormous extra responsibility to
ensure that a continuable is only called once (which can be quite
onerous since it is so easy to pass the continuable around). This type
of behavior also prevents one from writing a safe, reliable version of
when() for use with continuables (whereas when() would work with #1
and #2, except for the ambiguity problem).
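The hazard is easy to demonstrate (names hypothetical): because attaching a listener is what starts the operation, a second listener re-runs the side effect.

```javascript
// The continuable starts its side effect every time it is called
// with a listener.
var appendCount = 0;
function appendToMyFile() {
  return function (callback) {  // the continuable
    appendCount++;              // the append happens once per listener
    callback(null, "done");
  };
}

var myContinuable = appendToMyFile();
myContinuable(function (err, result) { /* caller's listener */ });
myContinuable(function (err, result) { /* an added debug listener */ });
console.log(appendCount); // 2: the file was appended twice
```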
Anyway, hopefully it is helpful for the sake of discussion (or at
least education) to be able to consider each of these ideas individually.
Agreed. This is one of the problems with this kind of continuation I
was trying to point at.
Thanks for the break-down.
On Feb 19, 2:26 am, Ryan Dahl <coldredle...@gmail.com> wrote:
> What about this:
> [snip]
> // promise async
> promise(fs.open, '/tmp/hello', 'r+', 0600)(function (fd) {
> puts('complete');
> }, function (err) {
> puts('error');
> });
Would node provide the promise function, or third party libraries?
Would Node have a function called "promise" that returns a continuable
(instead of a promise)? I'd prefer having third parties provide the
promise and continuable APIs over this being the third option.
Kris
+1
I think node should go with returning a function that accepts a single
callback, and the first parameter is reserved for indicating if there
was an error.
All high level wrapping and mangling should be done using 3rd party
libraries.
--fg
+1
This is how I usually do it in C-blocks[1] and appreciate not having
to define two separate callbacks. e.g.
typedef int (^closure_t)(error_t *, void *);
void parse_tree(closure_t c) {
// ... eventually call c
}
parse_tree(^(error_t *err, void *arg) {
if (err) {
present_error(err);
return;
}
parsed_tree_t *tree = (parsed_tree_t*)arg;
// ...
});
[1] http://thirdcog.eu/pwcblocks/
>
> All high level wrapping and mangling should be done using 3rd party
> libraries.
Indeed. Apparently there are many different tastes out there.
>
> --fg
>
>
> On Feb 19, 3:12 pm, Kris Zyp <kris...@gmail.com> wrote:
>> On Feb 19, 2:26 am, Ryan Dahl <coldredle...@gmail.com> wrote:
>>
>> > What about this:
>> > [snip]
>> > // promise async
>> > promise(fs.open, '/tmp/hello', 'r+', 0600)(function (fd) {
>> > puts('complete');
>> > }, function (err) {
>> > puts('error');
>> > });
>>
>> Would node provide the promise function, or third party libraries?
>> Would Node have a function called "promise" that returns a continuable
>> (instead of a promise)? I'd prefer having third parties provide the
>> promise and continuable APIs over this being the third option.
>> Kris
>
I've renamed "promise" to "closure" here, but that's just because I
think it's a more fitting name.
Basically, you do this:
asyncOperation(args)(function(err, results){
// do something with error and/or results
})
To close a closure (i.e. promise.emit{Error, Success}) you call close:
function asyncOperation(args) {
var cl = mkclosure();
cl.close(error, results); // << in reality this happens later
return cl;
}
You could of course add extra sugar to allow for calling the closure
directly by adding input checks in the closure function. The above
becomes:
function asyncOperation(args) {
var cl = mkclosure();
cl(error, results); // << in reality this happens later
return cl;
}
These closures can be passed on as callbacks themselves:
function asyncOperation(args) {
var cl = mkclosure();
anotherAsyncOperation(args)(function(err, args) {
// process args
yetAnotherAsyncOperation(args)(cl.close);
});
return cl;
}
Or if the aforementioned call sugar is added:
anotherAsyncOperation(args)(cl);
The closure could easily be extended with chaining and queueing.
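One plausible sketch of the `mkclosure()` helper Rasmus describes (hypothetical implementation; `setTimeout` stands in for real async work): a callable that queues listeners until `close()` fires them with (err, results).

```javascript
function mkclosure() {
  var listeners = [], fired = false, savedArgs;
  // The closure itself: call it with a listener to register it.
  function cl(listener) {
    if (fired) listener.apply(null, savedArgs); // late listener: fire now
    else listeners.push(listener);
    return cl;
  }
  // close(err, results) resolves the closure and notifies listeners.
  cl.close = function () {
    fired = true;
    savedArgs = Array.prototype.slice.call(arguments);
    listeners.forEach(function (l) { l.apply(null, savedArgs); });
  };
  return cl;
}

function asyncOperation() {
  var cl = mkclosure();
  setTimeout(function () { cl.close(null, "results"); }, 0);
  return cl;
}

asyncOperation()(function (err, results) {
  console.log(err, results); // null 'results'
});
```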
--
Rasmus Andersson
+1 for keeping all this higher-level stuff out of node core
Providing either little or no API around the async style will invite a
lot of divergent styles. This sounds good in the short term because
the competition could spur some new and creative solutions.
But the tradeoff here is that different libraries may have radically
different API styles for relatively simple operations requiring the
average programmer to understand and context switch between different
styles when reusing third party modules.
The bonus of having EventEmitter and Promise be first class objects
that ship with node is that you have incentive to use them and keep
your style close to the base node APIs and the majority of the third
party modules.
It's ok for node to have an opinion about what the best async style is
and adopt that style as default. I think that we should go with the
easiest style to understand which IMHO is the current Promise API.
-Mikeal
One really nice thing about node Promises ATM is that explicit
addErrback means that node can throw an exception when the error
doesn't have a handler.
Having this by default has made my development process a lot easier
since anything I didn't explicitly set an error handler on throws an
exception even when async and I don't have some problem way down the
stack.
Passing the error as the first argument means I have nearly as much
code for handling the errors anyway and gives me far worse default
behavior when I don't write the error handler.
-Mikeal
On Feb 19, 9:20 am, Mikeal Rogers <mikeal.rog...@gmail.com> wrote:
> I'm going to disagree with most of this.
>
> Providing either little or no API around the async style will invite a
> lot of divergent styles. This sounds good in the short term because
> the competition could spur some new and creative solutions.
>
> But the tradeoff here is that different libraries may have radically
> different API styles for relatively simple operations requiring the
> average programmer to understand and context switch between different
> styles when reusing third party modules.
This is a great point, Mikeal. Specifically, we have already seen a lot
of database adapters utilize promises. If one writes a module that can
generically work with the results of a database call, it is much
easier if it knows to expect a certain style of promises. Having to
write code that can work properly with CommonJS promises, the various
different variants on continuables, and anything else people invent is
a lot of extra work.
Kris
I'm not convinced that is true. As long as libraries adopt Node's
base system (whatever that ends up being) then ALL the libraries will
have a consistent API.
What would then happen is that two different kinds of libraries would
emerge, those offering async services, and then those for dealing with
async services.
So, I could write a library for CouchDB, that matches Node's API. Now
all of a sudden my CouchDB library is compatible with any async lib
(like Promises or Continuations or whatever) that works with Node's
API.
Or I could write a Promise library that works with Node's API, now I
am guaranteed that it will work with any library that matches Node's
base API.
I realize I am being redundant here, but I'm trying to drive home that
by having Node choose a base async API (and encouraging library
developers to do this as well) we are not segmenting the node
libraries and nor are we making people switch back and forth between
different styles. You get to choose the async system you like, AND
you know it will work with Node libraries.
On Fri, Feb 19, 2010 at 9:25 AM, Mikeal Rogers <mikeal...@gmail.com> wrote:
> One really nice thing about node Promises ATM is that explicit
> addErrback means that node can throw an exception when the error
> doesn't have a handler.
>
> Having this by default has made my development process a lot easier
> since anything I didn't explicitly set an error handler on throws an
> exception even when async and I don't have some problem way down the
> stack.
>
> Passing the error as the first argument means I have nearly as much
> code for handling the errors anyway and gives me far worse default
> behavior when I don't write the error handler.
>
> -Mikeal
It doesn't sound to me like using the base Node libraries would be
for you. It isn't for me necessarily either. We'll just have to
choose a Promise or continuable implementation we like to use on top
of them. We can take that performance hit and abstraction in exchange
for ease of use.
The thing is, I'm having a hard time thinking of an abstraction over
all the noted styles that won't be leaky. Maybe I'm just not
creative enough.
If the performance hit is substantial, or the abstraction is leaky,
you can expect everyone who favors a particular style to ignore the
abstraction and write directly for the style they like which leads to
the kind of segmentation I'm afraid of.
The question isn't whether or not an abstraction is possible, it's
whether it will actually be used widely and adopted as a defacto
standard.
If node's interface is exceedingly low level in order to enable all
the styles mentioned then it's going to be the least attractive to use
directly and a higher level abstraction that strings together all the
styles that might be implemented on top of node's interface would need
to adopt an API that is relatively inflexible in order to accommodate the
variances and would also not be as attractive as using a specific
style. Which means it's still up to the average developer to use this
abstraction to string together all the styles of the modules he/she
wants to use and I just don't see that as a very attractive way to
program.
-Mikeal
+1
I like this solution best, with the sync version and
// simple async
fs.open('/tmp/hello', 'r+', 0600, function (err, fd) {
in node, and the rest can be built on top. So you'd have an
fs-promise module, maybe maintained by Kris Zyp and depending
on the promise module that includes the CommonJS-style Promise
constructor, and we can have a fs-continuable module maintained
by Tim or me, with a continuable or Do module to go with it, and
whatever other abstractions may come along can throw their hat
in the ring as well.
All of these will just wrap the bare fs functions with the async
abstraction du jour, and node core can stay as small as possible.
// Super simplified sample library
fs.readFile = function (filename, callback) {
  callback(error, content);
}

// Promise people could do this
promise(fs.readFile, "myfile").then(function (data) {...})

// Or even this
when(fs.readFile, "myfile", function (data) {...})

// Continuable people could do this
cont(fs.readFile, "myfile")(function (data) {...})
// Super simplified sample library (current)
fs.readFile = function (filename, another_arg, more_args, callback) {
  callback(error, content);
}

// Curried (continuable-like)
fs.readFile = function (filename, another_arg, more_args) { return function (callback) {
  callback(error, content);
}}

// base arg with options hash
fs.readFile = function (filename, {encoding: "ascii"}, callback) {
  callback(error, content);
}

The curried version is good because it's powerful enough on its own to be used with libraries like "Do"; the downside is that the double function syntax is ugly.
I'm going to remove promises entirely. For places where promises would
be used (fs, dns) I will instead use a standard callback interface:
method(arg0, arg1, arg2, function (error, result0, result1) {
puts('complete');
});
That is, the completion callback will always be the last parameter and
the first argument of the completion callback will always be reserved for an error
object. By following this scheme everywhere, I'm confident that very
good user-land deferred/promise/continuation libraries will spring up.
Expect this interface in the next version of node, 0.1.30.
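As one illustration of the user-land wrapping Ryan expects to spring up (a sketch, not any real library; `promisify` and the fake `readFake` are hypothetical names): turning a function with a trailing (error, result) callback into a node-style promise object with addCallback/addErrback.

```javascript
function promisify(fn) {
  return function () {
    var args = Array.prototype.slice.call(arguments);
    var callbacks = [], errbacks = [], done = false, error, result;
    // Splice an error-first callback onto the end of the call.
    args.push(function (err, res) {
      done = true; error = err; result = res;
      (err ? errbacks : callbacks).forEach(function (listener) {
        listener(err || res);
      });
    });
    fn.apply(null, args);
    return {
      addCallback: function (cb) {
        if (done) { if (!error) cb(result); }
        else callbacks.push(cb);
        return this;
      },
      addErrback: function (eb) {
        if (done) { if (error) eb(error); }
        else errbacks.push(eb);
        return this;
      }
    };
  };
}

// Fake error-first function in the new core style:
function readFake(name, callback) { callback(null, "contents of " + name); }

promisify(readFake)("myfile.txt").addCallback(function (data) {
  console.log(data);
});
```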
No, such a function is actually a function returning a continuable.
A continuable would be the partial application of such a curried
function.
Adding (Haskell-like, sorry) type annotations to your functions:
Add just takes two ints and returns int:
add :: (Int, Int) -> Int
> function add(i, j) {
> return i + j;
> }
>
> console.log(add(1, 2));
> // -> 3
Add_cps takes two ints and a function that takes an int (and
does something unspecified with it) and returns nothing of
interest. We can call the second argument a continuation,
since that is what it is.
add_cps :: (Int, Int, (Int -> _)) -> _
> function add_cps(i, j, callback) { // cps = continuation passing style
> callback(add(i, j));
> }
>
> add_cps(1, 2, function(x) { console.log(x); });
> // -> 3
Now if we want to make this clearer, instead of writing
"Int -> _" for the continuation, we can just introduce
Continuation as a type, where "Continuation Int" means
a function which takes an Int and does something unspecified
with it:
type Continuation a = a -> _
and now the type for add_cps can be more clearly expressed:
add_cps :: (Int, Int, Continuation Int) -> _
Add_continuable takes two ints and returns a function which
takes a continuation:
add_continuable :: (Int, Int) -> (Continuation Int -> _)
> function add_continuable(i, j) {
> return function(callback) { add_cps(i, j, callback); }
> }
>
> add_continuable(1, 2)(function (x) { console.log(x); });
> // -> 3
>
Once again, we can simplify things by naming the type of
this second function. Instead of "a function taking a
Continuation Int and returning nothing" we would like to
be able to just say "a Continuable Int":
type Continuable a = Continuation a -> _
Now we can write the type of add_continuable just as
easily as add_cps above:
add_continuable :: (Int, Int) -> Continuable Int
Conceptually, a "Continuable Int" is very much like a
promise to deliver an int. The only thing you can do with
a continuable, is pass a continuation to it, which will
then (eventually) be called with the value.
The fact that a Continuable happens to be a function is
almost just an implementation detail, just as the fact
that Promises are implemented as objects is almost just
an implementation detail.
> I thought "add_continuable" was the only actual "continuable" in the
> code above, and that the various callback arguments are the
> "continuations"--?
The only continuable is the one returned from add_continuable.
Add_continuable is not a continuable, as seen by
its type, it is a function from (Int, Int) to
Continuable Int.
The motivation for the names "continuation" and "continuable"
was to have something to call these types, which greatly
simplifies thinking about these kinds of patterns. The idea
of "a function returning a function which takes a function
which takes an argument of type 'a'" is not as easy to work
with as "A function returning Continuable A".
A continuation is a well-understood concept in computer
science, and is used here in the broader meaning of "an
encapsulation of the rest of a computation", not in the
narrower "current continuation" sense.
A continuable is just a convenient name for an action
or computation which is "paused", but can be continued
by giving a continuation to it.
A continuation, as used here, is not a call/cc style "current
continuation", nor could it be, since JavaScript doesn't have
call/cc. It's just a JavaScript function representing the rest
of the computation.
You can't use node without using continuations, whether you call
them continuations, callbacks, event handlers, or something else.
> The promise model is also
> straight-forward and includes no "hidden" language tricks (like the
> continuables discussed do). When hiding what's going on, mistakes
> appear.
I don't know what "hidden" language tricks you think you see here,
all we have are functions returning functions. I would say the
implementation is no more than intermediate-to-advanced JavaScript.
Just to make sure this interface is clear, will the error parameter
always be equal to null (or undefined) if the operation completed
successfully? And process.Promise will no longer exist now (be
undefined), right?
Anyway, I'll certainly do my best to have a solid deferred/promise
implementation available that works well with this design.
Kris
Yes.
> And process.Promise will no longer exist now (be
> undefined), right?
Yes. (Well, it will probably be a deprecation error.)
> Anyway, I'll certainly do my best to have a solid deferred/promise
> implementation available that works well with this design.
Good :)
-Tim Caswell
Finally a decision :P
However, a few problems with this solution:
a) How will you be able to set a timeout? (i.e. today you do
method(...).timeout(N))
b) How will you "wait" for completion? Will wait disappear in favor
of not passing the last argument (implying synchronous operation)?
c) How will you support variable arguments (something very common in
javascript)?
I'd really like to see all async operations returning some sort of
"handle" which can be passed on or manipulated (e.g. a "promise", a
"closure" or a "continuable").
IMHO this would be a better solution which is still light-weight and
does not add much abstraction:
var fd = fs.open(filename, true) // true for synchronous
var fd = fs.open(filename, fs.O_EXLOCK, true) // variable arguments
// complex chained operations
fs.open(filename).then(fs.fread, 512)(function(err, data){
// data is a string of length <= 512
}).end(fs.fclose);
If there is no "handle" returned, creating wrappers would be tricky.
This would be an acceptable "layer on top":
var fd = fs.open(filename) // no callback means synchronous
var fd = fs.open(filename, fs.O_EXLOCK) // variable arguments
// complex chained operations
promise(fs.open, filename).then(fs.fread, 512)(function(err, data){
// data is a string of length <= 512
}).end(fs.fclose);
But I don't know if it's possible to do the kind of introspection of
the input callables as would be required (i.e. deducing the number of
arguments fs.open takes, interpolating with undefined's and adding a
callback to the end).
>
> That is, the completion callback will always be the last parameter and
> the first argument the completion will always be reserved for an error
> object. By following this scheme everywhere, I'm confident that very
> good user-land deferred/promise/continuation libraries will spring up.
>
> Expect this interface in the next version of node, 0.1.30.
>
--
Rasmus Andersson
> However, a few problems with this solution:
>
> a) How will you be able to set a timeout? (i.e. today you do
> method(...).timeout(N))
the timeout could be one of the parameters
>
> b) How will you "wait" for completion? Will wait disappear in favor
> for not passing the last argument (implying synchronous operation)?
> Ryan has made it clear he doesn't want wait because it's dangerous in the same way threading is. There will be sync versions of most of the fs functions.
> c) How will you support variable arguments (something very common in
> javascript)?
with the current api, arguments surgery or passing in an array as a single argument
The only real problem I see is with optional arguments, but there is nothing stopping the api from accepting an options hash as one of the arguments, it doesn't have to be part of the spec.
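A sketch of the "arguments surgery" mentioned above: since the callback is always last, pop it off and treat whatever remains as positional, possibly optional, arguments. The names and defaults here are illustrative, not Node's actual implementation.

```javascript
function open(/* path, [flags], [mode], callback */) {
  var args = Array.prototype.slice.call(arguments);
  var callback = args.pop();                  // callback is always last
  var path = args[0];
  var flags = args.length > 1 ? args[1] : "r";
  var mode = args.length > 2 ? args[2] : 438; // 0666 in decimal
  // pretend to do the async work, then:
  callback(null, { path: path, flags: flags, mode: mode });
}

open("/tmp/hello", function (err, fd) {
  console.log(fd.flags); // "r": flags defaulted
});
open("/tmp/hello", "r+", function (err, fd) {
  console.log(fd.flags); // "r+"
});
```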
True, but then every function would need to implement their own
timeout mechanism, something which todays promises takes care of in a
"pretty" way.
>> b) How will you "wait" for completion? Will wait disappear in favor
>> for not passing the last argument (implying synchronous operation)?
>
> Ryan has made it clear he doesn't want wait because it's dangerous in the same way threading is. There will be sync versions of most of the fs functions.
I know. The ugly "coroutine" thingy should be removed. I mean "wait"
as in its abstract meaning -- if an operation can be either
synchronous or asynchronous (e.g. opening a file) the call to wait (if
available) could tell the mechanism to perform a synchronous
operation.
Example:
fd = fs.open(filename, fs.O_EXLOCK).wait()
When passing around a "handle":
function dequeue() {
  if (queue.length === 0) return; // stop when the queue is empty
  var operation = queue.shift(), r;
  if (operation.wait) {
    r = operation.wait(); // synchronous variant, if available
  } else {
    r = operation();
  }
  returnValues.push(r);
  dequeue();
}
>
>> c) How will you support variable arguments (something very common in
>> javascript)?
>
> with the current api, arguments surgery or passing in an array as a single argument
>
> The only real problem I see is with optional arguments, but there is nothing stopping the API from accepting an options hash as one of the arguments; it doesn't have to be part of the spec.
options arguments could easily be done like this:
function x(options, cb) {
  if (typeof options === 'function') {
    cb = options;
    options = {}; // `delete` can't remove a local variable; reset it instead
  }
  // do something
}
However, how would you implement this?
promise(fs.open(filename)).then(fs.read).then...
For people who want something else (of which this thread is living proof).
AFAIK you would need deep V8 surgery involving introspection to do
this kind of wrapping. If this is not possible, many of the ideas
discussed will not be possible. The only solution would be:
var promise = new Promise();
fs.open(filename, promise.cb);
promise.then(function(fd){
fs.read(fd, promise.cb);
...
})
...
Which would be far too complex for any sane person to use (as it's
about the same amount of code as doing this without any help).
If you can prove that the
promise(fs.open(filename)).then(fs.read).then... is solvable, I'm
convinced.
Ryan mentioned doing simple callbacks with the anticipation that
others would be developing userland promise/continuable/whatever APIs
on top of it. The answer would be to use an API that provides
.timeout().
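Such a userland helper can be layered directly on the bare error-first callback convention. A minimal sketch, assuming nothing beyond the convention itself (the name `callWithTimeout` and its shape are invented for illustration, not a real node API):

```javascript
// Hypothetical: race a node-style async function (callback last,
// error first) against a timer. Whichever finishes first wins.
function callWithTimeout(fn, args, ms, callback) {
  var done = false;
  var timer = setTimeout(function () {
    if (!done) {
      done = true;
      callback(new Error('timed out after ' + ms + 'ms'));
    }
  }, ms);
  fn.apply(null, args.concat(function (err, result) {
    if (done) return; // the timeout already fired
    done = true;
    clearTimeout(timer);
    callback(err, result);
  }));
}

// usage with a fake async function standing in for fs.open
function slowDouble(x, cb) {
  setTimeout(function () { cb(null, x * 2); }, 5);
}
callWithTimeout(slowDouble, [21], 1000, function (err, result) {
  // result === 42, err === null
});
```

The point is that the timeout logic is written once, in the wrapper, rather than inside every async function.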
> I know. The ugly "coroutine" thingy should be removed. I mean "wait"
> as in its abstract meaning -- if an operation can be either
> synchronous or asynchronous (e.g. opening a file) the call to wait (if
> available) could tell the mechanism to perform a synchronous
> operation.
Kris Zyp's promise API provides this via the when() function. Check it
out for details.
As for the rest, I was hoping Ryan would pick callbacks with the
callback as the first parameter since it'd make variadic handling (and
API evolution) more sane.
There is nothing stopping you from writing a promise implementation
that has a timeout function just like the current one:
promise(fs.open, filename).timeout(1000, function() {
sys.puts('oops, opening file timed out!');
});
>>> b) How will you "wait" for completion? Will wait disappear in favor
>>> for not passing the last argument (implying synchronous operation)?
>>
>> Ryan has made it clear he doesn't want wait because it's dangerous in the same way threading is. There will be sync versions of most of the fs functions.
>
> I know. The ugly "coroutine" thingy should be removed. I mean "wait"
> as in its abstract meaning -- if an operation can be either
> synchronous or asynchronous (e.g. opening a file) the call to wait (if
> available) could tell the mechanism to perform a synchronous
> operation.
If you want an operation to be able to be synchronous or asynchronous
do what the current process.fs library does. If there is a callback
it is asynchronous. If there isn't it is synchronous.
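That process.fs convention can be sketched like this (`getAnswer` is a made-up stand-in for a real I/O call; the "work" here is trivial on purpose):

```javascript
// Sketch: the same function is asynchronous when given a callback,
// synchronous when not -- the process.fs convention.
function getAnswer(input, callback) {
  function compute() { return input + 1; }
  if (typeof callback === 'function') {
    // async: deliver the result later, error first
    setTimeout(function () { callback(null, compute()); }, 0);
    return;
  }
  // sync: do the work now and return the value directly
  return compute();
}

var direct = getAnswer(41);        // synchronous use, returns 42
getAnswer(41, function (err, v) {  // asynchronous use
  // v === 42
});
```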
But just make everything asynchronous. That's the way Node is meant to be.
>>> c) How will you support variable arguments (something very common in
>>> javascript)?
>>
>> with the current api, arguments surgery or passing in an array as a single argument
>>
>> The only real problem I see is with optional arguments, but there is nothing stopping the API from accepting an options hash as one of the arguments; it doesn't have to be part of the spec.
> However, how would you implement this?
if your function called "async_function" needs to take an optional options hash:
function async_function(options, callback) {
  if (typeof callback === 'undefined') {
    callback = options;
    options = {};
  }
  // do something asynchronous
}
Now to use it:
promise(async_function).then( ... );
or
promise(async_function, { option1: 1 }).then( ... );
function promise(fn) {
  var p = new Promise();
  var args = Array.prototype.slice.call(arguments, 1);
  args.push(p.addCallback);
  fn.apply(fn, args);
  return p;
}
Exactly. If we go by this design, there can never be a function with
variable arguments, or "api sugar" would break (because it can not
know where to put the callback).
>
> On Feb 19, 2010, at 2:23 PM, Rasmus Andersson wrote:
>
> If you can prove that the
> promise(fs.open(filename)).then(fs.read).then... is solvable, I'm
> convinced.
>
fs.open = function (filename, flags, mode) {
  var cl = closure();
  // call cl.close(error, fd) somewhere
  return cl;
}
Allowing almost equivalent call style:
fs.open(filename, flags, mode, function(error, fd){
// do something with fd
})
becomes:
fs.open(filename, flags, mode)(function(error, fd){
// do something with fd
})
allowing default values for flags and mode:
fs.open(filename)(function(error, fd){
// do something with fd
})
I've written an updated version of such a light-weight closure (or
call it whatever you like -- promise or continuable :) which as a
bonus supports chaining:
--
Rasmus Andersson
The synchronous functions are now completely separate, so this will
be either:
fs_sync.open(...
or
fs.openSync(...
> promise(fs.open, filename).then(fs.fread, 512)(function(err, data){
I think "promise(fs.open)" or "continuable(fs.open)" is ugly and
I don't expect it to catch on. When I talk about a library that
wraps the low-level functions, I don't mean that you'll have to
wrap them when you use them.
Instead, you'll import a library that wraps the low-level fs.open
with a promise-returning (or continuable-returning) function.
var fs=require('your-favorite-fs-promise-layer')
var promise = fs.open(filename,...)
Or if you prefer:
var fs=require('your-favorite-fs-continuable-layer')
var continuable = fs.open(filename,...)
> But I don't know if it's possible to do the kind of introspection of
> the input callables as would be required (i.e. deducing the number of
> arguments fs.open takes, interpolating with undefined's and adding a
> callback to the end).
This level of introspection is possible, but it's not necessary
given that the boilerplate to return some sort of "handle", as you call
it, only needs to be written once, and doesn't need to happen at the
point of use at all... you'll just import a library that handles
it.
Ryan talked about merging them in a recent mail on this list (that's
the "why?" to my examples).
>
>> promise(fs.open, filename).then(fs.fread, 512)(function(err, data){
>
> I think "promise(fs.open)" or "continuable(fs.open)" is ugly and
> I don't expect it to catch on. When I talk about a library that
> wraps the low-level functions, I don't mean that you'll have to
> wrap them when you use them.
>
> Instead, you'll import a library that wraps the low-level fs.open
> with a promise-returning (or continuable-returning) function.
>
> var fs=require('your-favorite-fs-promise-layer')
>
> var promise = fs.open(filename,...)
>
> Or if you prefer:
>
> var fs=require('your-favorite-fs-continuable-layer')
>
> var continuable = fs.open(filename,...)
Well, that will probably cause a lot of headache, since you would need
to wrap _every single thing_ in node. What happens when such a library
is out of sync with node's underlying implementation? Your top-level
applications will "randomly break", making the work of tracing the
errors a cumbersome task.
By returning a simple object which will call a member of itself, we
would keep the API simple, minimize overhead (no need for a "new
type" with all its implied machinery) and so on. It could even be as
simple as:
fs.open = function (filename, flags, mode) {
  var x = function (cb) { x.cb = cb; };
  // eventually call x.cb
  return x;
}
It would make API sugar/layers easy to write, maintain and use (in
contrast to passing callback as an argument).
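Fleshed out into something runnable, the idea looks like this (the I/O is simulated with setTimeout; this is a sketch of the style, not node's actual fs.open):

```javascript
// Sketch: the async call returns a plain function; calling that
// function with a callback attaches the error-first completion handler.
function open(filename) {
  var x = function (cb) { x.cb = cb; };
  setTimeout(function () { // pretend the open completed
    if (x.cb) x.cb(null, 'fd:' + filename);
  }, 0);
  return x;
}

open('/tmp/example')(function (error, fd) {
  // error-first callback, attached after the call returned
});
```

One caveat this sketch exposes: a callback attached after completion would be silently dropped, so a real implementation would have to buffer the result until a callback arrives.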
>
>> But I don't know if it's possible to do the kind of introspection of
>> the input callables as would be required (i.e. deducing the number of
>> arguments fs.open takes, interpolating with undefined's and adding a
>> callback to the end).
>
> This level of introspection is possible, but it's not necessary
> given that the boilerplate to return some sort of "handle", as you call
> it, only needs to be written once, and doesn't need to happen at the
> point of use at all... you'll just import a library that handles
> it.
How is it possible? AFAIK the only way it's possible is to query V8
for the current AST in a given context, then traverse that tree, find
the function, step into the function, look for a use of the magic
arguments, then try to deduce how many arguments are drawn from
arguments, and finally find which position (or possible positions) a
callback would have. I'd say that counts as "not possible" :P.
An abstraction just takes a variable number of arguments itself, and then "apply"s the function with the callback last on the list.
For functions that want to be fancy and accept variable arguments, then the burden is on them to strip the callback off the end of the arguments object.
IMHO this isn't very pretty (it's the prettiest solution I've come up
with so far):
var cl = closure(readfile, __filename)(function(err, data){
// do something with err and data
})
This would be better:
var cl = readfile(__filename)(function(err, data){
// do something with err and data
})
As it seems at the moment, we will all need to code using the first
version (passing function ref to a wrapper) since the second will only
be possible by ugly monkey-patching of node's modules (which is
cumbersome in practice).
Personally I both appreciate the minimalism introduced by passing a
single callback to an async function, but at the same time I have _a
lot_ of code which would need to grow larger — i.e. I would need to
add _more_ sugar to accomplish what's already possible today (e.g.
multiple callbacks, chaining, timeout, etc).
I prefer to write code which is only the essence of the task I'm
approaching — rather not write boilerplate code over and over.
>
> An abstraction just takes a variable number of arguments itself, and then "apply"s the function with the callback last on the list.
>
> For functions that want to be fancy and accept variable arguments, then the burden is on them to strip the callback off the end of the arguments object.
>
>
> On Feb 19, 2010, at 3:38 PM, Rasmus Andersson wrote:
>
>> How is it possible? AFAIK the only way it's possible is to query V8
>> for the current AST in a given context, then traverse that tree, find
>> the function, step into the function, look for a use of the magic
>> arguments, then try to deduce how many arguments are drawn from
>> arguments, and finally find which position (or possible positions) a
>> callback would have. I'd say that counts as "not possible" :P.
>
http://gist.github.com/308779#file_node_closure_wrapper.js
(you can find the other version, returning closures, below if you scroll down).
--
Rasmus Andersson
On Feb 19, 11:38 am, Benjamin Thomas <bam.tho...@gmail.com> wrote:
> On Fri, Feb 19, 2010 at 9:20 AM, Mikeal Rogers <mikeal.rog...@gmail.com> wrote:
> > But the tradeoff here is that different libraries may have radically
> > different API styles for relatively simple operations requiring the
> > average programmer to understand and context switch between different
> > styles when reusing third party modules.
>
> I'm not convinced that is true. As long as libraries adopt Node's
> base system (whatever that ends up being) then ALL the libraries will
> have a consistent API.
>
> What would then happen is that two different kinds of libraries would
> emerge, those offering async services, and then those for dealing with
> async services.
>
> So, I could write a library for CouchDB, that matches Node's API. Now
> all of a sudden my CouchDB library is compatible with any async lib
> (like Promises or Continuations or whatever) that works with Node's
> API.
>
> Or I could write a Promise library that works with Node's API, now I
> am guaranteed that it will work with any library that matches Node's
> base API.
>
> I realize I am being redundant here, but I'm trying to drive home that
> by having Node choose a base async API (and encouraging library
> developers to do this as well) we are not segmenting the node
> libraries and nor are we making people switch back and forth between
> different styles. You get to choose the async system you like, AND
> you know it will work with Node libraries.
>
Library builders can feel free to use whatever async style they prefer
_inside_ their library so long as they expose bare node-style async
apis at their edges.
Jeremy
* As much as I don't like the terminology in use (there is plenty of
prior art that has already established terminology and proven design &
implementation), the term has stuck here in node-land and I am
therefore more than happy to continue with it.
fs.readFile("/foo", function(error, content){
// do something with error and/or content
});
The first argument to callbacks is reserved for an Error; the rest is free.
Sent from my iPhone
On 20 feb 2010, at 19.54, lollicode <loll...@gmail.com> wrote:
> Sorry if I'm late to the party or missed anything, but looking at ry's
> commits from y'day on github, it seems that the continuable style has
> already been chosen - so is this debate still current ?
> And if something has been decided, what is it ?
// Takes any async lib that uses callback-based signatures and converts
// the specified names to continuable style and returns the new library.
exports.convert = function (lib, names) {
  var newlib = {};
  names.forEach(function (key) {
    newlib[key] = function () {
      var args = Array.prototype.slice.call(arguments);
      return function (callback, errback) {
        args.push(function (err, val) {
          if (err) {
            errback(err);
          } else {
            callback(val);
          }
        });
        lib[key].apply(lib, args);
      };
    };
  });
  return newlib;
};
It can be used like this:
var fs = Do.convert(require('fs'), ["readFile", "stat", "readdir"]);
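The same conversion can be demonstrated without node's fs module, using a stand-in library in the error-first callback style (`convert` here just reproduces the logic above so the snippet is self-contained):

```javascript
// Self-contained sketch of the callback-to-continuable conversion,
// applied to a fake library so it runs without node's fs module.
function convert(lib, names) {
  var newlib = {};
  names.forEach(function (key) {
    newlib[key] = function () {
      var args = Array.prototype.slice.call(arguments);
      return function (callback, errback) {
        args.push(function (err, val) {
          if (err) { errback(err); } else { callback(val); }
        });
        lib[key].apply(lib, args);
      };
    };
  });
  return newlib;
}

// a stand-in library in node's error-first callback style
var fakeFs = {
  readFile: function (name, cb) { cb(null, 'contents of ' + name); }
};

var fs2 = convert(fakeFs, ['readFile']);
fs2.readFile('/foo')(function (content) {
  // success: only the value arrives here
}, function (err) {
  // failure: errors go to the separate errback
});
```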
Hey everyone,
I'm going to remove promises entirely. For places where promises would
be used (fs, dns) I will instead use a standard callback interface:
method(arg0, arg1, arg2, function (error, result0, result1) {
puts('complete');
});
That is, the completion callback will always be the last parameter and
the first argument of the completion will always be reserved for an error
object. By following this scheme everywhere, I'm confident that very
good user-land deferred/promise/continuation libraries will spring up.
Expect this interface in the next version of node, 0.1.30.
--
Rasmus Andersson
Oh, what I meant to say is "agreed, but let's try to adopt the
simpler single-callback based API" :)
Jeremy
+1 nice idea...
> - And a "blessed" async library that wraps the base async library in Ryan's favorite of these options.
+1 the eventual CommonJS promise api should be used here...
-g.
I received a comment on my mongodb driver about adapting the 0.1.30
format for callbacks, and a thought struck me.
The mongodb driver's collection.find always needs a callback but can
take one or more optional arguments, meaning that if the callback is at
the end you have to write a lot of extra code to ensure the correct
ordering of the arguments in front, or face large function calls.
currently: collection.find(function(docs){})
at end: collection.find({}, {}, function(docs){})
Of course you can write code to detect the function, but what if you
pass more than one function as an argument such as a map-reduce call
to mongo.
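One way out that avoids detection heuristics entirely: always treat the final argument as the completion callback, even when earlier arguments are also functions. A sketch with a made-up `mapReduce` (the name and the in-memory docs are invented for illustration, not the actual mongodb driver API):

```javascript
// Hypothetical mapReduce: earlier function arguments are data (the
// map and reduce steps); the completion callback is strictly last.
function mapReduce() {
  var args = Array.prototype.slice.call(arguments);
  var callback = args.pop(); // always the last argument
  var map = args[0], reduce = args[1];
  var docs = [{ k: 1 }, { k: 2 }, { k: 3 }];
  setTimeout(function () {
    var keys = docs.map(map);
    callback(null, reduce(keys));
  }, 0);
}

mapReduce(
  function (doc) { return doc.k; },        // map
  function (keys) { return keys.length; }, // reduce
  function (err, count) {
    // count === 3
  }
);
```

The convention "last argument wins" is unambiguous no matter how many functions appear earlier in the argument list.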
I'm still out on this. I would lean towards putting the callback at
the start rather than the end.
Any suggestions? Am I completely delusional here?
Christian
I'm running into some issues with my mongodb driver when looking at
0.1.30.
Putting the callback function at the end of the method is hard, right,
because some methods like collection.find always need a callback but
can take one or more optional arguments, meaning that if the callback
is at the end you have to write a lot of extra code to ensure the
correct ordering of the arguments in front, or face large function calls.
currently: collection.find(function(docs){})
at end: collection.find({}, {}, function(docs){})
Of course you can write code to detect the function, but what if you
pass more than one function as an argument such as a map-reduce call
to mongo.
I'm still out on this. I would lean towards putting the callback at
the start rather than the end. BUT I might be missing something
incredibly obvious here.