Comparison between continuables, promises and many variants on them.

Tim Caswell

Feb 18, 2010, 3:56:46 PM
to nod...@googlegroups.com
There has been a lot of discussion lately about the best way to do async functions in node.  This is so important because node has non-blocking IO everywhere and regular exceptions don't work in this case.

Several of us in IRC made a gist to show a quick example of the different styles:

https://gist.github.com/602efd6a0d24b77fda36
This is a simple example, but it highlights enough of the differences to get a feel for how everything works.

I think that ideally node should ship with three versions of the async io libraries.

 - A base library based on pure callbacks
 - A true sync version of that library for the times you don't care about blocking
 - And a "blessed" async library that wraps the base async library in Ryan's favorite of these options.

The plain async library will be great for those of us who want to do our own thing and don't want the overhead of wrapping someone else's stuff we don't like.

The sync version will be very useful for short running scripts and things that only happen at server startup.

And the node programmers who don't care what style of async code they use, but just want a common official style that's better than plain callbacks, will use the third lib.

The first two libs have to be shipped with node, but the third may or may not be.

Ciaran

Feb 18, 2010, 4:01:51 PM
to nod...@googlegroups.com
On Thu, Feb 18, 2010 at 8:56 PM, Tim Caswell <t...@creationix.com> wrote:
> There has been a lot of discussion lately about the best way to do async
> functions in node.  This is so important because node has non-blocking IO
> everywhere and regular exceptions don't work in this case.
> Several of us in IRC made a gist to show a quick example of the different
> styles.

Thank-you :)
-cj.

Tim Caswell

Feb 18, 2010, 4:03:01 PM
to nod...@googlegroups.com
Here's the source in case GitHub is ever down again:

// test_and_load does two things. First it checks to see if a filepath
// is really a file and not a directory or something crazy.
// If it's not a file it emits "undefined" (except for one case where null is used).
// If it is a file, then it emits the file's contents.
// Any errors in fs.stat or fs.readFile are passed through as is.
 
// In these examples, we're assuming that the fs module is of the same
// style as the function consuming it. So we're showing a small example of
// producer and consumer for each of the styles.
 
// simple continuable with separate callback and errback
function test_and_load(filename) { return function (callback, errback) {
  fs.stat(filename)(function (stat) {
 
    // Filter out non-files
    if (!stat.isFile()) { callback(); return; }
 
    // Otherwise read the file in
    fs.readFile(filename)(callback, errback);
 
  }, errback);
}}
 
// simple continuable with single combined callback
function test_and_load(filename) { return function (callback) {
  fs.stat(filename)(function (stat) {
 
    // Pass along any errors before we do anything
    if (stat instanceof Error) { callback(stat); return; }
 
    // Filter out non-files
    if (!stat.isFile()) { callback(); return; }
 
    // Otherwise read the file in
    fs.readFile(filename)(callback);
 
  });
}}
 
// inimino style continuables
function test_and_load(filename) { return function (cont) {
  fs.stat(filename)(rightContinuation(cont)(function (stat) {
 
    // Filter out non-files
    if (!stat.isFile()) { cont(); return; }
 
    // Otherwise read the file in
    fs.readFile(filename)(cont);
 
  }));
}}
 
 
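// chained continuable style (label assumed): attaching the callback yields a
// new continuable, and whatever the callback returns becomes its result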
function test_and_load(filename) {
  return fs.stat(filename)(function (stat) {
    
    // Pass along any errors before we do anything
    if(stat instanceof Error) { return; }
    
    // Filter out non-files
    if (!stat.isFile()) { return null; }
 
    // Otherwise read the file in
    return fs.readFile(filename);
 
  });
}
 
// promise based with node-promise helpers (github.com/kriszyp/node-promise)
var when = require("promise").when;
function test_and_load(filename) {
  return when(fs.stat(filename), function (stat) {
 
    // Filter out non-files
    if (!stat.isFile()) { return; }
 
    // Otherwise read the file in
    return fs.readFile(filename);
    
  });
}
 
 
// promise based with CommonJS monkey patch (github.com/kriszyp/node-commonjs/blob/master/patch-promise.js)
function test_and_load(filename) {
  return fs.stat(filename).then(function (stat) {
 
    // Filter out non-files
    if (!stat.isFile()) { return; }
 
    // Otherwise read the file in
    return fs.readFile(filename);
    
  });
}
 
 
// promise based (how it works in current node)
function test_and_load(filename) {
  var promise = new process.Promise();
  fs.stat(filename).addCallback(function (stat) {
 
    // Filter out non-files
    if (!stat.isFile()) { promise.emitSuccess(); return; }
 
    // Otherwise read the file in
    fs.readFile(filename).addCallback(function (data) {
      promise.emitSuccess(data);
    }).addErrback(function (error) {
      promise.emitError(error);
    });
 
  }).addErrback(function (error) {
    promise.emitError(error);
  });
  return promise;
}

Ryan Dahl

Feb 18, 2010, 4:10:32 PM
to nod...@googlegroups.com
On Thu, Feb 18, 2010 at 12:56 PM, Tim Caswell <t...@creationix.com> wrote:
> There has been a lot of discussion lately about the best way to do async
> functions in node.  This is so important because node has non-blocking IO
> everywhere and regular exceptions don't work in this case.
> Several of us in IRC made a gist to show a quick example of the different
> styles.
> https://gist.github.com/602efd6a0d24b77fda36
> This is a simple example, but it highlights enough of the differences to get
> a feel for how everything works.
> I think that ideally node should ship with three versions of the async io
> libraries.
>  - A base library based on pure callbacks
>  - A true sync version of that library for the times you don't care about
> blocking

Well, for file system operations I've decided to allow sync op
exposure because of the high possibility of file system caches. For
other things which use promises (DNS resolution and simple HTTP GET
requests) there will be no synchronous interface.

I'm glad we're discussing this - I think the current promises could be improved.

Tim Caswell

Feb 18, 2010, 4:13:43 PM
to nod...@googlegroups.com
OK, for things where the delay could be large, like http.cat, I agree. There is no need for a sync version.

inimino

Feb 18, 2010, 4:19:47 PM
to nod...@googlegroups.com
On 2010-02-18 13:56, Tim Caswell wrote:
> - A base library based on pure callbacks

+1

> - A true sync version of that library for the times you don't care about blocking

+0.5

I'd be even happier with no such library, but given the
existence of .wait(), this boat has probably sailed...

> - And a "blessed" async library that wraps the base async library in Ryan's favorite of these options.

-1

I don't think there needs to be a "blessed" library, but
we do need an installer for packages (which I know several
people are working on). Once this is in place it should be
no problem to pull in async convenience libraries for use in
your own projects or as dependencies of other libraries or
frameworks you install.

I think Promises will get a more full-featured implementation
this way than what most node users have seen of them so far,
while Node itself can stay light, and we can have continuable
libraries and other approaches as well to suit various styles
and preferences.

--
http://inimino.org/~inimino/blog/

Karl Guertin

Feb 18, 2010, 4:55:17 PM
to nod...@googlegroups.com
On Thu, Feb 18, 2010 at 4:19 PM, inimino <ini...@inimino.org> wrote:
> I don't think there needs to be a "blessed" library, but
> we do need an installer for packages (which I know several
> people are working on).  Once this is in place it should be
> no problem to pull in async convenience libraries for use in
> your own projects or as dependencies of other libraries or
> frameworks you install.

I think there should be a reasonable "start here" for people just
getting into things. I'd also appreciate having something for the
times when I'm on a remote system and want to bang out a script
without installing stuff (this is actually one of my annoyances with
Python, I hate the stdlib for scripting jobs). Maybe the sync versions
can cover that but I'd still like some blessed option.

I actually have ideas on yet another way to do this that I'd like to
try out over the weekend, but I'd be happy enough with more or less
all of the options presented.

Tom Van Cutsem

Feb 18, 2010, 7:59:44 PM
to nodejs
Hi node.js people,

I was informed by Kris Kowal that you guys are discussing various
options for an asynchronous API. Having worked extensively with a
promise-based API myself, Kris thought it would be useful if I would
share my thoughts on this list. So here goes:

A promise-based API has the benefit that asynchronous actions can be
'chained' implicitly and that they bundle up callback and errback in a
single abstraction. Consider the simplest form of composition: an
asynchronous action A that just wants to chain two asynchronous
actions B and C. A good measuring point to compare each async. version
against is the sequential version:
function A(x) { return B(C(x)); }

With explicit continuations, you get something resembling:
function A(x, cb, errb) { return C(x, function(val) { return B(val, cb, errb); }, errb); }

With promises you get:
function A(x) { return when(C(x), function(val) { return B(val); }); }

The difference may not appear to be that big, but the key point is
that it's not a constant-factor improvement: the more async.
compositions you perform in your code, the larger the difference
between the callback and the promise-based version becomes. Another
cool thing to note with a promise-based API is that the "signature" of
your functions is unmodified as compared to the sequential API, except
that the "return type" is of type promise<T> instead of type T, so to
speak. In my experience, this reduces the burden of reasoning about
async API's. And it means a promise-based API plays nicer with
variable-argument functions, which don't have to perform argument-list-
surgery to treat continuation arguments specially.
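
To make the growth concrete, here is a minimal sketch of a three-step
composition A(x) = B(C(D(x))), assuming hypothetical async functions B, C
and D and a when() with node-promise-style chaining:

// With explicit continuations, every added step adds a nesting level and
// another place where errb has to be threaded through by hand.
function A(x, cb, errb) {
  D(x, function (v1) {
    C(v1, function (v2) {
      B(v2, cb, errb);
    }, errb);
  }, errb);
}

// With promises, when() resolves a promise returned by its callback, so the
// extra step is just one more flat link in the chain.
function A(x) {
  return when(when(D(x), C), B);
}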

I can think of other reasons for choosing promises, but given the
context of this discussion, these may not be that relevant here.
Just my $0.02.

Cheers,
Tom

Tim Caswell

Feb 19, 2010, 12:23:22 AM
to nod...@googlegroups.com
Just to be clear, continuations are not plain callbacks.  They don't modify your signature either and work with multiple argument functions out of the box.  Also you can think of the return value being a continuation of type <t>.

// Plain callback style
function read(filename, callback, errback) {
  // Do async work and then...
  callback(data);
}
 
// Continuation style
function read(filename) { return function (callback, errback) {
  // Do async work and then...
  callback(data);
}}
 
 
// Using a callback
read("myfile.txt", function (data) {
  // use data
}, error_handler);
 
// Using a continuation
read("myfile.txt")(function (data) {
  // Use data
}, error_handler);

Note that this is just my modification on continuations.  There are a few variants running around, and one of them does a lot of the implicit stuff that promises do. My version doesn't require any library at all to use.  It's just a convention of currying the initial arguments and then attaching the callback later when you want to run it.  This makes variable arguments possible and allows utility libraries like Do to do the advanced stuff when needed.

If you were to write a full-on continuation library then it's just as powerful as promises, just with a shorter syntax in most cases.

I'm not saying that continuations are always better or even better for everyone.  I just want us to all be talking about the same thing.

Of course promises are easier to use than plain callbacks because they allow all that nifty chaining and composition. Continuables can be just as powerful as promises depending on which version you use.  It's mainly a thing of style and taste as far as I'm concerned.

I think that easy things should be easy.  Promises don't do this: you have to include an entire promise library, create a promise in your function, return the promise, and then call a method on the promise later when you've got the value.  With my simple continuables, you just curry your function into two parts: the arguments and the callbacks.  That's it. The callback is there to use when you have the value, and there is nothing to return.
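
To illustrate the kind of helper such a utility layer could provide, here is
a minimal sketch of a parallel combinator for these curried continuables
(a hypothetical helper, not the actual Do API):

// Run several continuables at once and continue when all of them have finished.
// Each continuable is a function that takes (callback, errback).
function parallel(continuables) { return function (callback, errback) {
  var results = [], remaining = continuables.length, failed = false;
  continuables.forEach(function (continuable, i) {
    continuable(function (value) {
      results[i] = value;
      if (--remaining === 0 && !failed) { callback(results); }
    }, function (error) {
      if (!failed) { failed = true; errback(error); }
    });
  });
}}

// parallel() is itself a continuable, so it composes the same way:
parallel([read("a.txt"), read("b.txt")])(function (contents) {
  // contents[0] and contents[1] hold the two results
}, error_handler);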

Benjamin Thomas

Feb 19, 2010, 1:23:36 AM
to nod...@googlegroups.com
I think at this point it is clear that a consensus is not going to be
reached as to which approach is The Best Way. The thing to do is
decide what Node should do to make it as easy as possible for people
to make their own choices about how to do async.

I think Tim's suggestions go a long way towards making this possible. So, +1 for

Ryan Dahl

Feb 19, 2010, 4:26:59 AM
to nod...@googlegroups.com
What about this:

// three ways of calling fs.open()

// sync
try {
  fs.open('/tmp/hello', 'r+', 0600);
  puts('complete');
} catch (err) {
  puts('error');
}

// simple async
fs.open('/tmp/hello', 'r+', 0600, function (err, fd) {
  if (err) {
    puts('error');
  } else {
    puts('complete');
  }
});

// promise async
promise(fs.open, '/tmp/hello', 'r+', 0600)(function (fd) {
  puts('complete');
}, function (err) {
  puts('error');
});

Then we can expose all three options.

Michael Stillwell

Feb 19, 2010, 5:36:29 AM
to nod...@googlegroups.com
On Fri, Feb 19, 2010 at 9:26 AM, Ryan Dahl <coldre...@gmail.com> wrote:

>  // promise async
>  promise(fs.open, '/tmp/hello', 'r+', 0600)(function (fd) {
>    puts('complete');
>  }, function (err) {
>    puts('error');
>  });

I mentioned this a few days ago[1], but if you add a curry-like
function to Function.prototype you could end up with

fs.open.promise('/tmp/hello', 'r+', 0600)(function (fd) {
  puts('complete');
}, function (err) {
  puts('error');
});

which is then syntactically very close to continuables.
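
A minimal sketch of what such a Function.prototype addition might look like;
the name "promise" and the assumption that the wrapped function takes a
trailing callback(err, result) are both just for illustration:

// Curry the receiver's arguments now; attach success and error callbacks later.
Function.prototype.promise = function () {
  var fn = this, args = Array.prototype.slice.call(arguments);
  return function (callback, errback) {
    fn.apply(null, args.concat([function (err, result) {
      if (err) { errback(err); } else { callback(result); }
    }]));
  };
};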

I do think it's often useful to encapsulate the "what happens next"
code paths into an object, though. Objects are not a necessary part of
any programming language, but if objects are fundamental to
JavaScript, and async programming is fundamental to JavaScript, then I
think it's useful to have them combined in some way.

If the "what happens next" stuff is an object the various code paths
are named, they can be inherited and monkey patched and inspected,
they can easily be passed around and emitted, and they can easily
support more than two code paths, as well as operations like timeout.
Many of the continuable examples I've seen are very elegant (they're
particularly nice when chaining a "success" code path), but I think
they get a bit awkward when the "error" code path is introduced (is
error a second argument, or is a single argument "overloaded"?) and
none have a good solution for timeouts.

Michael

[1] http://groups.google.com/group/nodejs/msg/b8a9b8b644bbd291

--
http://beebo.org

Michael Stillwell

Feb 19, 2010, 6:01:21 AM
to nod...@googlegroups.com
On Fri, Feb 19, 2010 at 5:23 AM, Tim Caswell <t...@creationix.com> wrote:
> Just to be clear, continuations are not plain callbacks.  They don't modify
> your signature either and work with multiple argument functions out of the
> box.  Also you can think of the return value being a continuation of type
> <t>.

I'm a bit confused here. I thought a "continuable" was the curried
version of a function written in continuation passing style[1].

So you have

function add(i, j) {
  return i + j;
}

console.log(add(1, 2));
// -> 3

function add_cps(i, j, callback) { // cps = continuation passing style
  callback(add(i, j));
}

add_cps(1, 2, function(x) { console.log(x); });
// -> 3

function add_continuable(i, j) {
  return function(callback) { add_cps(i, j, callback); }
}

add_continuable(1, 2)(function (x) { console.log(x); });
// -> 3

I thought "add_continuable" was the only actual "continuable" in the
code above, and that the various callback arguments are the
"continuations"--?


Michael

[1] http://en.wikipedia.org/wiki/Continuation-passing_style

--
http://beebo.org

Rasmus Andersson

Feb 19, 2010, 9:04:31 AM
to nod...@googlegroups.com
Don't forget that continuations are both hard to grasp and tricky to
use to their full potential for most users. Node (and JavaScript in
general) is easy to learn and understand. The promise model is also
straightforward and includes no "hidden" language tricks (like the
continuables discussed do). When what's going on is hidden, mistakes
appear.

I believe the current promise model (or the promise function
shorthand, as suggested by Ryan above) in combination with the
CommonJS "then" would both give us flexibility and all "papers on the
tables" (fewer language tricks causing confusion).

I'd rather write an extra line of code than have to explain the code to someone.

--
Rasmus Andersson

Kris Zyp

Feb 19, 2010, 9:07:52 AM
to nodejs
To make the comparison a little clearer, it might be helpful to break
down the different aspects of what is being proposed, rather than just
arguing continuables vs promises. In some ways, continuable are a
different type of promise, with several particular differences:

1. API/signature for registering listeners. Node's promises are an
object with addCallback/addErrback for registering listeners.
CommonJS's promises are objects that use "then" to register the
callback and error handler. Continuables are the function itself that
is called to register the listeners. Interestingly, continuables are
actually closer to CommonJS promises here, since you have a single
function for registering listeners; they only differ in whether the
promise is that function itself or an object that holds the function.
Modifying continuables to use then would be pretty trivial as well: you
just use "return {then:function(..." instead of "return function(...".
And actually, the continuable style of promise API is similar to
ref_send's promises, which are also functions that are called
directly. We have certainly considered this before, but, the major
flaw in this design, and the reason we didn't follow this approach in
CommonJS, is that it becomes very hard for functions that need to
differentiate between sync values and async values to branch when the
only definition of a promise is a value where (typeof value ===
"function"). This type of branching is a key part of the when()
construct (and I use when() all the time), where it provides
normalization between sync and async. If you call a function that
returns a function, how do you determine if the return value is
synchronous and really a function, or is a promise/continuable that
needs a listener? Of course the other aspect of this is more
subjective, having a function call in your code, instead of just
adjacent parenthesis explicitly indicates to the reader of the code
that you are setting up a function to handle the completion of an
operation.
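
A minimal sketch of the branching when() Kris describes, showing why the
test breaks down if a promise is just a bare function:

// Normalize sync and async: call the handler now for a plain value, or
// register it as a listener if the value is a CommonJS-style promise.
function when(value, callback) {
  if (value && typeof value.then === "function") {
    return value.then(callback);   // async: a promise was returned
  }
  return callback(value);          // sync: a plain value was returned
}

// If a promise were instead *any* function (the continuable convention), the
// test would have to be (typeof value === "function"), and a function that
// legitimately returns a function as its synchronous result would be
// indistinguishable from one that returns a continuable.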

2. No automatic chaining. Node's promises don't support chaining
anyway, but CommonJS promises do. Without chaining, users have to
manually fulfill the continuable/promise that is returned. The gist
that started (thanks Tim for putting that together) clearly shows the
difference between chaining and no chaining.

3. Registering listeners triggers the execution of the async
operation. With normal promises, execution and listener registration
are separate concerns. One can fire off an async operation and then
safely add (or not add) listeners without affecting the operation.
This is probably the most dangerous and insidious aspect of
continuables, IMO. If multiple listeners are registered, this causes
multiple firings of the async operation. If one were to do something
like:

var myContinuable = appendToMyFile();
printResults(myContinuable);

Now if the author of printResults decided to register an extra
callback to do another debugging statement, that would cause multiple
executions of the action in appendToMyFile. In this case, where
appendToMyFile is probably not idempotent, this results in duplicate
appends to the file. Users have an enormous extra responsibility to
ensure that a continuable is only called once (which can be quite
onerous since it is so easy to pass the continuable around). This type
of behavior also prevents one from writing a safe, reliable version
of when() for use with continuables (whereas when() would work with #1
and #2, except for the ambiguity problem).
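
A minimal sketch of the hazard, using a hypothetical non-idempotent operation
in place of a real file append:

// Hypothetical non-idempotent async operation standing in for the append.
var appendCount = 0;
function appendLine(callback) {
  setTimeout(function () { appendCount++; callback(null); }, 10);
}

// Continuable style: the work only starts when a listener is attached,
// because attaching the listener *is* the function call.
function appendToMyFile() { return function (callback, errback) {
  appendLine(function (err) {
    if (err) { errback(err); } else { callback(); }
  });
}}

var myContinuable = appendToMyFile();
myContinuable(function () {}, function () {});  // first append
myContinuable(function () {}, function () {});  // second, unintended append
// appendCount ends up at 2, not 1; a promise would have run the work once
// and only added a second observer.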

Anyway, hopefully it is helpful for the sake of discussion (or at
least education) to be able to consider each of these ideas individually.

Rasmus Andersson

Feb 19, 2010, 9:12:09 AM
to nod...@googlegroups.com
On Fri, Feb 19, 2010 at 15:07, Kris Zyp <kri...@gmail.com> wrote:
<snip>

>
> var myContinuable = appendToMyFile();
> printResults(myContinuable);
>
> Now if the author of printResults decided to register an extra
> callback to do another debugging statement, that would cause multiple
> executions of the action in appendToMyFile. In this case, where
> appendToMyFile is probably not idempotent, this results in duplicate
> appends to the file. Users have an enormous extra responsibility to
> ensure that a continuable is only called once (which can be quite
> onerous since it is so easy to pass the continuable around). This type

Agreed. This is one of the problems with this kind of continuation I
was trying to point at.

Thanks for the break-down.

Kris Zyp

Feb 19, 2010, 9:12:39 AM
to nodejs

On Feb 19, 2:26 am, Ryan Dahl <coldredle...@gmail.com> wrote:
> What about this:
> [snip]


>   // promise async
>   promise(fs.open, '/tmp/hello', 'r+', 0600)(function (fd) {
>     puts('complete');
>   }, function (err) {
>     puts('error');
>   });

Would node provide the promise function, or third party libraries?
Would Node have a function called "promise" that returns a continuable
(instead of a promise)? I'd prefer having third parties provide the
promise and continuable APIs over this being the third option.
Kris

Felix Geisendörfer

Feb 19, 2010, 9:19:46 AM
to nodejs
> I'd prefer having third parties provide the
> promise and continuable APIs over this being the third option.

+1

I think node should go with returning a function that accepts a single
callback, and the first parameter is reserved for indicating if there
was an error.

All high level wrapping and mangling should be done using 3rd party
libraries.

--fg

Rasmus Andersson

Feb 19, 2010, 9:27:16 AM
to nod...@googlegroups.com
2010/2/19 Felix Geisendörfer <fe...@debuggable.com>:

>> I'd prefer having third parties provide the
>> promise and continuable APIs over this being the third option.
>
> +1
>
> I think node should go with returning a function that accepts a single
> callback, and the first parameter is reserved for indicating if there
> was an error.

+1

This is how I usually do it in C-blocks[1] and appreciate not having
to define two separate callbacks. e.g.

typedef int (^closure_t)(error_t *, void *);
void parse_tree(closure_t c) {
  // ... eventually call c
}
parse_tree(^(error_t *err, void *arg) {
  if (err) {
    present_error(err);
    return;
  }
  parsed_tree_t *tree = (parsed_tree_t*)arg;
  // ...
});


[1] http://thirdcog.eu/pwcblocks/

>
> All high level wrapping and mangling should be done using 3rd party
> libraries.

Indeed. Apparently there are many different tastes out there.

Rasmus Andersson

Feb 19, 2010, 10:31:31 AM
to nod...@googlegroups.com
I created a quick implementation of this:

http://gist.github.com/308779

I've renamed "promise" to "closure" here, but that's just because I
think it's a more suiting name.

Basically, you do this:

asyncOperation(args)(function(err, results){
  // do something with error and/or results
})

To close a closure (i.e. promise.emit{Error, Success}) you call close:

function asyncOperation(args) {
  var cl = mkclosure();
  cl.close(error, results); // << in reality this happens later
  return cl;
}

You could of course add extra sugar to allow for calling the closure
directly by adding input checks in the closure function. The above
becomes:

function asyncOperation(args) {
  var cl = mkclosure();
  cl(error, results); // << in reality this happens later
  return cl;
}

These closures can be passed on as callbacks themselves:

function asyncOperation(args) {
  var cl = mkclosure();
  anotherAsyncOperation(args)(function(err, args) {
    // process args
    yetAnotherAsyncOperation(args)(cl.close);
  });
  return cl;
}

Or if the aforementioned call sugar is added:

anotherAsyncOperation(args)(cl);

The closure could easily be extended with chaining and queueing.

--
Rasmus Andersson

Tim Caswell

Feb 19, 2010, 10:36:25 AM
to nod...@googlegroups.com
Excellent discussion everyone. Sorry for my confusing email earlier. I said continuation when I meant "continuable". For clarity, I'm now referring to what I was calling "continuables" as "curried-cps". It's just a function that returns a new function in a closure that takes the callback and errback.

+1 for keeping all this higher-level stuff out of node core

Mikeal Rogers

Feb 19, 2010, 11:20:35 AM
to nod...@googlegroups.com
I'm going to disagree with most of this.

Providing either little or no API around the async style will invite a
lot of divergent styles. This sounds good in the short term because
the competition could spur some new and creative solutions.

But the tradeoff here is that different libraries may have radically
different API styles for relatively simple operations requiring the
average programmer to understand and context switch between different
styles when reusing third party modules.

The bonus of having EventEmitter and Promise be first class objects
that ship with node is that you have incentive to use them and keep
your style close to the base node APIs and the majority of the third
party modules.

It's ok for node to have an opinion about what the best async style is
and adopt that style as default. I think that we should go with the
easiest style to understand which IMHO is the current Promise API.

-Mikeal

Mikeal Rogers

Feb 19, 2010, 11:25:42 AM
to nod...@googlegroups.com
2010/2/19 Felix Geisendörfer <fe...@debuggable.com>:

>> I'd prefer having third parties provide the
>> promise and continuable APIs over this being the third option.
>
> +1
>
> I think node should go with returning a function that accepts a single
> callback, and the first parameter is reserved for indicating if there
> was an error.
>
> All high level wrapping and mangling should be done using 3rd party
> libraries.

One really nice thing about node Promises ATM is that explicit
addErrback means that node can throw an exception when the error
doesn't have a handler.

Having this by default has made my development process a lot easier
since anything I didn't explicitly set an error handler on throws an
exception even when async and I don't have some problem way down the
stack.

Passing the error as the first argument means I have nearly as much
code for handling the errors anyway and gives me far worse default
behavior when I don't write the error handler.

-Mikeal

Kris Zyp

Feb 19, 2010, 11:37:53 AM
to nodejs

On Feb 19, 9:20 am, Mikeal Rogers <mikeal.rog...@gmail.com> wrote:
> I'm going to disagree with most of this.
>
> Providing either little or no API around the async style will invite a
> lot of divergent styles. This sounds good in the short term because
> the competition could spur some new and creative solutions.
>
> But the tradeoff here is that different libraries may have radically
> different API styles for relatively simple operations requiring the
> average programmer to understand and context switch between different
> styles when reusing third party modules.

This is a great point, Mikeal. Specifically, we have already seen a lot
of database adapters utilize promises. If one writes a module that can
generically work with the results of a database call, it is much
easier if it knows to expect a certain style of promises. Having to
write code that can work properly with CommonJS promises, the various
different variants on continuables, and anything else people invent is
a lot of extra work.
Kris

Benjamin Thomas

Feb 19, 2010, 11:38:11 AM
to nod...@googlegroups.com
On Fri, Feb 19, 2010 at 9:20 AM, Mikeal Rogers <mikeal...@gmail.com> wrote:
> But the tradeoff here is that different libraries may have radically
> different API styles for relatively simple operations requiring the
> average programmer to understand and context switch between different
> styles when reusing third party modules.

I'm not convinced that is true. As long as libraries adopt Node's
base system (whatever that ends up being) then ALL the libraries will
have a consistent API.

What would then happen is that two different kinds of libraries would
emerge, those offering async services, and then those for dealing with
async services.

So, I could write a library for CouchDB, that matches Node's API. Now
all of a sudden my CouchDB library is compatible with any async lib
(like Promises or Continuations or whatever) that works with Node's
API.

Or I could write a Promise library that works with Node's API, now I
am guaranteed that it will work with any library that matches Node's
base API.

I realize I am being redundant here, but I'm trying to drive home that
by having Node choose a base async API (and encouraging library
developers to do this as well) we are not segmenting the node
libraries and nor are we making people switch back and forth between
different styles. You get to choose the async system you like, AND
you know it will work with Node libraries.

On Fri, Feb 19, 2010 at 9:25 AM, Mikeal Rogers <mikeal...@gmail.com> wrote:
> One really nice thing about node Promises ATM is that explicit
> addErrback means that node can throw an exception when the error
> doesn't have a handler.
>
> Having this by default has made my development process a lot easier
> since anything I didn't explicitly set an error handler on throws an
> exception even when async and I don't have some problem way down the
> stack.
>
> Passing the error as the first argument means I have nearly as much
> code for handling the errors anyway and gives me far worse default
> behavior when I don't write the error handler.
>
> -Mikeal

It doesn't sound to me like using the base Node libraries would be
for you.  It isn't for me necessarily either.  We'll just have to
choose a Promise or continuable implementation we like to use on top
of them.  We can take that performance hit and abstraction in exchange
for ease of use.

Mikeal Rogers

Feb 19, 2010, 11:58:41 AM
to nod...@googlegroups.com
> It doesn't sound to me like using the base Node libraries would be
> for you.  It isn't for me necessarily either.  We'll just have to
> choose a Promise or continuable implementation we like to use on top
> of them.  We can take that performance hit and abstraction in exchange
> for ease of use.

The thing is, I'm having a hard time thinking of an abstraction over
all the noted styles that won't be leaky. Maybe I'm just not
creative enough.

If the performance hit is substantial, or the abstraction is leaky,
you can expect everyone who favors a particular style to ignore the
abstraction and write directly for the style they like, which leads to
the kind of segmentation I'm afraid of.

The question isn't whether or not an abstraction is possible, it's
whether it will actually be used widely and adopted as a defacto
standard.

If node's interface is exceedingly low level in order to enable all
the styles mentioned, then it's going to be the least attractive to use
directly. A higher-level abstraction that strings together all the
styles that might be implemented on top of node's interface would need
to adopt an API that is relatively inflexible in order to accommodate
the variances, and would also not be as attractive as using a specific
style. Which means it's still up to the average developer to use this
abstraction to string together all the styles of the modules he/she
wants to use, and I just don't see that as a very attractive way to
program.

-Mikeal

inimino

Feb 19, 2010, 12:19:27 PM
to nod...@googlegroups.com
On 2010-02-19 07:12, Kris Zyp wrote:
> I'd prefer having third parties provide the
> promise and continuable APIs over this being the third option.
> Kris

+1

I like this solution best, with the sync version and

// simple async
fs.open('/tmp/hello', 'r+', 0600, function (err, fd) {

in node, and the rest can be built on top. So you'd have an
fs-promise module, maybe maintained by Kris Zyp and depending
on the promise module that includes the CommonJS-style Promise
constructor, and we can have an fs-continuable module maintained
by Tim or me, with a continuable or Do module to go with it, and
whatever other abstractions may come along can throw their hat
in the ring as well.

All of these will just wrap the bare fs functions with the async
abstraction du jour, and node core can stay as small as possible.

--
http://inimino.org/~inimino/blog/

Tim Caswell

Feb 19, 2010, 12:35:07 PM
to nod...@googlegroups.com, Tim Caswell
I completely agree with Benjamin: if node and all node libraries adopt a single, simple convention then we can all use our favorite abstraction without any conflict or problem.  I think that's a fabulous solution.


// Super simplified sample library
fs.readFile = function (filename, callback) {
  callback(error, content);
}
 
// Promise people could do this
promise(fs.readFile, "myfile").then(function (data) {...})
// Or even this
when(fs.readFile, "myfile", function (data) {...})
// Continuable people could do this
cont(fs.readFile, "myfile")(function (data) {...})


Tim Caswell

Feb 19, 2010, 12:58:42 PM
to nod...@googlegroups.com
One quick note on the currently proposed common low-level api:  Anyone wanting to implement variable arguments will have to perform argument surgery to ensure that the last argument is kept separate.

// Super simplified sample library (current)
fs.readFile = function (filename, another_arg, more_args, callback) {
  callback(error, content);
}
 
// Curried (continuable-like)
fs.readFile = function (filename, another_arg, more_args) { return function (callback) {
  callback(error, content);
}}
 
// base arg with options hash, e.g. fs.readFile(filename, {encoding: "ascii"}, callback)
fs.readFile = function (filename, options, callback) {
  callback(error, content);
}

The curried version is good because it's powerful enough on its own to be used with libraries like "Do"; the downside is that the double function syntax is ugly.

The strict 3-argument api of main arg, options, callback is both simple and powerful from an API standpoint, and probably familiar to most people.
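
A minimal sketch of the argument surgery in question, for a hypothetical
variadic function under the callback-last convention:

// Hypothetical variadic async function: concatenates any number of files.
// The callback has to be peeled off the end of `arguments` by hand.
function concatFiles(/* filename1, filename2, ..., callback */) {
  var args = Array.prototype.slice.call(arguments);
  var callback = args.pop();   // the last argument is always the callback
  var filenames = args;        // everything before it is data
  // ... read and join the files, then:
  callback(null, "<concatenated contents>");
}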

Ryan Dahl

Feb 19, 2010, 1:00:08 PM
to nod...@googlegroups.com
Hey everyone,

I'm going to remove promises entirely. For places where promises would
be used (fs, dns) I will instead use a standard callback interface:

method(arg0, arg1, arg2, function (error, result0, result1) {
  puts('complete');
});

That is, the completion callback will always be the last parameter, and
the first argument of the completion callback will always be reserved for
an error object. By following this scheme everywhere, I'm confident that
very good user-land deferred/promise/continuation libraries will spring up.

Expect this interface in the next version of node, 0.1.30.
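
As a sketch of what this looks like in practice (assuming fs.stat and
fs.readFile follow the same error-first, callback-last convention), the
gist's test_and_load becomes:

function test_and_load(filename, callback) {
  fs.stat(filename, function (err, stat) {

    // Pass along any errors before we do anything
    if (err) { callback(err); return; }

    // Filter out non-files
    if (!stat.isFile()) { callback(); return; }

    // Otherwise read the file in
    fs.readFile(filename, callback);

  });
}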

inimino

Feb 19, 2010, 1:01:06 PM
to nod...@googlegroups.com
On 2010-02-19 04:01, Michael Stillwell wrote:
> I'm a bit confused here. I thought a "continuable" was the curried
> version of a function written in continuation passing style[1].

No, such a function is actually a function returning a continuable.

A continuable would be the partial application of such a curried
function.

Adding (Haskell-like, sorry) type annotations to your functions:

Add just takes two ints and returns int:

add :: (Int, Int) -> Int

> function add(i, j) {
>   return i + j;
> }
>
> console.log(add(1, 2));
> // -> 3


Add_cps takes two ints and a function that takes an int (and
does something unspecified with it) and returns nothing of
interest. We can call the second argument a continuation,
since that is what it is.

add_cps :: (Int, Int, (Int -> _)) -> _

> function add_cps(i, j, callback) { // cps = continuation passing style
>   callback(add(i, j));
> }
>
> add_cps(1, 2, function(x) { console.log(x); });
> // -> 3

Now if we want to make this clearer, instead of writing
"Int -> _" for the continuation, we can just introduce
a Continuation type, where "Continuation Int" means
a function which takes an Int and does something unspecified
with it:

type Continuation a = a -> _

and now the type for add_cps can be more clearly expressed:

add_cps :: (Int, Int, Continuation Int) -> _

Add_continuable takes two ints and returns a function which
takes a continuation:

add_continuable :: (Int, Int) -> (Continuation Int -> _)

> function add_continuable(i, j) {
>   return function(callback) { add_cps(i, j, callback); }
> }
>
> add_continuable(1, 2)(function (x) { console.log(x); });
> // -> 3
>

Once again, we can simplify things by naming the type of
this second function. Instead of "a function taking a
Continuation Int and returning nothing" we would like to
be able to just say "a Continuable Int":

type Continuable a = Continuation a -> _

Now we can write the type of add_continuable just as
easily as add_cps above:

add_continuable :: (Int, Int) -> Continuable Int

Conceptually, a "Continuable Int" is very much like a
promise to deliver an int. The only thing you can do with
a continuable, is pass a continuation to it, which will
then (eventually) be called with the value.

The fact that a Continuable happens to be a function is
almost just an implementation detail, just as the fact
that Promises are implemented as objects is almost just
an implementation detail.

> I thought "add_continuable" was the only actual "continuable" in the
> code above, and that the various callback arguments are the
> "continuations"--?

The only continuable is the one returned from add_continuable.

Add_continuable is not a continuable, as seen by
its type, it is a function from (Int, Int) to
Continuable Int.

The motivation for the names "continuation" and "continuable"
was to have something to call these types, which greatly
simplifies thinking about these kinds of patterns. The idea
of "a function returning a function which takes a function
which takes an argument of type 'a'" is not as easy to work
with as "A function returning Continuable A".

A continuation is a well-understood concept in computer
science, and is used here in the broader meaning of "an
encapsulation of the rest of a computation", not in the
narrower "current continuation" sense.

A continuable is just a convenient name for an action
or computation which is "paused", but can be continued
by giving a continuation to it.

--
http://inimino.org/~inimino/blog/

inimino

Feb 19, 2010, 1:06:03 PM
to nod...@googlegroups.com
On 2010-02-19 07:04, Rasmus Andersson wrote:
> Don't forget continuations are both hard to grasp and tricky to use
> it's full potential for most users. Node (and Javascript in general)
> are both easy to learn and understand.

A continuation, as used here, is not a call/cc style "current
continuation", nor could it be, since JavaScript doesn't have
call/cc. It's just a JavaScript function representing the rest
of the computation.

You can't use node without using continuations, whether you call
them continuations, callbacks, event handlers, or something else.

> The promise model is also
> straight-forward and includes no "hidden" language tricks (like the
> continuables discussed do). When hiding what's going on, mistakes
> appear.

I don't know what "hidden" language tricks you think you see here,
all we have are functions returning functions. I would say the
implementation is no more than intermediate-to-advanced JavaScript.

--
http://inimino.org/~inimino/blog/

Kris Zyp

Feb 19, 2010, 1:06:53 PM
to nodejs

Just to make sure this interface is clear, will the error parameter
always be equals to null (or undefined) if the operation completed
successfully? And process.Promise will no longer exist now (be
undefined), right?

Anyway, I'll certainly do my best to have a solid deferred/promise
implementation available that works well with this design.
Kris

Ryan Dahl

Feb 19, 2010, 1:12:31 PM
to nod...@googlegroups.com
On Fri, Feb 19, 2010 at 10:06 AM, Kris Zyp <kri...@gmail.com> wrote:
> Just to make sure this interface is clear, will the error parameter
> always be equals to null (or undefined) if the operation completed
> successfully?

Yes.

> And process.Promise will no longer exist now (be
> undefined), right?

Yes. (Well, it will probably be a deprecation error.)

> Anyway, I'll certainly do my best to have a solid deferred/promise
> implementation available that works well with this design.

Good :)

inimino

Feb 19, 2010, 1:13:23 PM
to nod...@googlegroups.com

Tim Caswell

Feb 19, 2010, 1:17:45 PM
to nod...@googlegroups.com
Awesome, I'm looking forward to seeing this too. Hopefully I can come up with a competitive alternative. (Just for kicks of course; yours will probably be better given the current constraints.)

-Tim Caswell

Kris Zyp

Feb 19, 2010, 2:50:49 PM
to nodejs
One more clarification, if you don't mind. With variadic functions,
does the callback need to be at the last position in the declared
parameters, or the last position in the provided parameters? In other
words, to call fs.open with two params, should it be:
fs.open(path, "r", callback);
or does it need to be:
fs.open(path, "r", undefined, callback);
Based on the internal structure of node.js, I am guessing the latter, but
wanted to make sure.
Kris

Kris Zyp

Feb 19, 2010, 2:52:02 PM
to nodejs

Rasmus Andersson

Feb 19, 2010, 2:55:22 PM
to nod...@googlegroups.com
On Fri, Feb 19, 2010 at 19:00, Ryan Dahl <coldre...@gmail.com> wrote:
> Hey everyone,
>
> I'm going to remove promises entirely. For places where promises would
> be used (fs, dns) I will instead use a standard callback interface:
>
>  method(arg0, arg1, arg2, function (error, result0, result1) {
>     puts('complete');
>  });

Finally a decision :P

However, a few problems with this solution:

a) How will you be able to set a timeout? (i.e. today you do
method(...).timeout(N))

b) How will you "wait" for completion? Will wait disappear in favor
of not passing the last argument (implying synchronous operation)?

c) How will you support variable arguments (something very common in
javascript)?

I'd really like to see all async operations returning some sort of
"handle" which can be passed on or manipulated (e.g. a "promise", a
"closure" or a "continuable").

IMHO this would be a better solution which is still light-weight and
does not add much abstraction:

var fd = fs.open(filename, true) // true for synchronous
var fd = fs.open(filename, fs.O_EXLOCK, true) // variable arguments
// complex chained operations
fs.open(filename).then(fs.fread, 512)(function(err, data){
  // data is a string of length <= 512
}).end(fs.fclose);

If there is no "handle" returned, creating wrappers would be tricky.
This would be an acceptable "layer on top":

var fd = fs.open(filename) // no callback means synchronous
var fd = fs.open(filename, fs.O_EXLOCK) // variable arguments
// complex chained operations
promise(fs.open, filename).then(fs.fread, 512)(function(err, data){
  // data is a string of length <= 512
}).end(fs.fclose);

But I don't know if it's possible to do the kind of introspection of
the input callables as would be required (i.e. deducing the number of
arguments fs.open takes, interpolating with undefined's and adding a
callback to the end).

>
> That is, the completion callback will always be the last parameter and
> the first argument the completion will always be reserved for an error
> object. By following this scheme everywhere, I'm confident that very
> good user-land deferred/promise/continuation libraries will spring up.
>
> Expect this interface in the next version of node, 0.1.30.
>

--
Rasmus Andersson

Tim Caswell

Feb 19, 2010, 3:06:36 PM
to nod...@googlegroups.com

On Feb 19, 2010, at 1:55 PM, Rasmus Andersson wrote:

> However, a few problems with this solution:
>
> a) How will you be able to set a timeout? (i.e. today you do
> method(...).timeout(N))

the timeout could be one of the parameters

>
> b) How will you "wait" for completion? Will wait disappear in favor
> for not passing the last argument (implying synchronous operation)?

Ryan has made it clear he doesn't want wait because it's dangerous in the same way threading is.  There will be sync versions of most of the fs functions.

> c) How will you support variable arguments (something very common in
> javascript)?

with the current api, arguments surgery or passing in an array as a single argument

The only real problem I see is with optional arguments, but there is nothing stopping the api from accepting an options hash as one of the arguments; it doesn't have to be part of the spec.

Rasmus Andersson

Feb 19, 2010, 3:23:15 PM
to nod...@googlegroups.com
On Fri, Feb 19, 2010 at 21:06, Tim Caswell <t...@creationix.com> wrote:
<snip>

>> a) How will you be able to set a timeout? (i.e. today you do
>> method(...).timeout(N))
> the timeout could be one of the parameters

True, but then every function would need to implement its own
timeout mechanism, something which today's promises take care of in a
"pretty" way.

>> b) How will you "wait" for completion? Will wait disappear in favor
>> for not passing the last argument (implying synchronous operation)?
>
> Ryan has made it clear he doesn't want wait because of the way it's dangerous like threading.  There will be sync versions of most the fs functions.

I know. The ugly "coroutine" thingy should be removed. I mean "wait"
as in its abstract meaning -- if an operation can be either
synchronous or asynchronous (e.g. opening a file) the call to wait (if
available) could tell the mechanism to perform a synchronous
operation.

Example:

fd = fs.open(filename, fs.O_EXLOCK).wait()

When passing around a "handle":

function dequeue() {
  var operation = queue.shift(), r;
  if (operation.wait)
    r = operation.wait()
  else
    r = operation()
  returnValues.push(r)
  dequeue();
}

>
>> c) How will you support variable arguments (something very common in
>> javascript)?
>
> with the current api, arguments surgery or passing in an array as a single argument
>
> The only real problem I see is with optional arguments, but there is nothing stopping the api from accepting an options hash as one of the arguments, it doesn't have to be part of the spec.

options arguments could easily be done like this:

function x(options, cb) {
  if (typeof options === 'function') {
    cb = options;
    options = undefined;
  }
  // do something
}

However, how would you implement this?

promise(fs.open(filename)).then(fs.read).then...

For people who want something else (which this thread is living proof of).

AFAIK you would need deep V8 surgery involving introspection to do
this kind of wrapping. If this is not possible, many of the ideas
discussed will not be possible. The only solution would be:

var promise = new Promise();
fs.open(filename, promise.cb);
promise.then(function(fd){
  fs.read(fd, promise.cb);
  ...
})
...

Which would be far too complex for any sane person to use (as it's
about the same amount of code as doing it without any help).

If you can prove that the
promise(fs.open(filename)).then(fs.read).then... is solvable, I'm
convinced.

Karl Guertin

Feb 19, 2010, 3:31:58 PM
to nod...@googlegroups.com
On Fri, Feb 19, 2010 at 3:23 PM, Rasmus Andersson <ras...@notion.se> wrote:
> True, but then every function would need to implement their own
> timeout mechanism, something which todays promises takes care of in a
> "pretty" way.

Ryan mentioned doing simple callbacks with the anticipation that
others would be developing userland promise/continuable/whatever APIs
on top of it. The answer would be to use an API that provides
.timeout().
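
A minimal sketch of how such a user-land layer could add timeouts on top of
the bare callback convention (a hypothetical helper, not an existing API):

// Wrap an error-first, callback-last call so that if it has not completed
// within ms milliseconds, the callback fires once with a timeout error.
function withTimeout(fn, ms) {
  return function () {
    var args = Array.prototype.slice.call(arguments);
    var callback = args.pop();
    var done = false;
    var timer = setTimeout(function () {
      if (!done) { done = true; callback(new Error("timed out")); }
    }, ms);
    fn.apply(null, args.concat([function () {
      if (done) { return; }            // the result arrived after the timeout
      done = true;
      clearTimeout(timer);
      callback.apply(null, arguments);
    }]));
  };
}

// Usage:
withTimeout(fs.open, 1000)('/tmp/hello', 'r+', 0600, function (err, fd) {
  if (err) { puts('error or timeout'); } else { puts('complete'); }
});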

> I know. The ugly "coroutine" thingy should be removed. I mean "wait"
> as in it's abstract meaning -- if an operation can be either
> synchronous or asynchronous (e.g. opening a file) the call to wait (if
> available) could tell the mechanism to perform a synchronous
> operation.

Kris Zyp's promise API provides this via the when() function. Check it
out for details.


As for the rest, I was hoping Ryan would pick callbacks with the
callback as the first parameter since it'd make variadic handling (and
API evolution) more sane.

Benjamin Thomas

Feb 19, 2010, 3:33:19 PM
to nod...@googlegroups.com
On Fri, Feb 19, 2010 at 1:23 PM, Rasmus Andersson <ras...@notion.se> wrote:
> On Fri, Feb 19, 2010 at 21:06, Tim Caswell <t...@creationix.com> wrote:
> <snip>
>>> a) How will you be able to set a timeout? (i.e. today you do
>>> method(...).timeout(N))
>> the timeout could be one of the parameters
>
> True, but then every function would need to implement their own
> timeout mechanism, something which todays promises takes care of in a
> "pretty" way.

There is nothing stopping you from writing a promise implementation
that has a timeout function just like the current one:

promise(fs.open, filename).timeout(1000, function() {
  sys.puts('oops, opening file timed out!');
});

>>> b) How will you "wait" for completion? Will wait disappear in favor
>>> for not passing the last argument (implying synchronous operation)?
>>
>> Ryan has made it clear he doesn't want wait because of the way it's dangerous like threading.  There will be sync versions of most the fs functions.
>
> I know. The ugly "coroutine" thingy should be removed. I mean "wait"
> as in it's abstract meaning -- if an operation can be either
> synchronous or asynchronous (e.g. opening a file) the call to wait (if
> available) could tell the mechanism to perform a synchronous
> operation.

If you want an operation to be able to be synchronous or asynchronous
do what the current process.fs library does. If there is a callback
it is asynchronous. If there isn't it is synchronous.

But just make everything asynchronous. That's the way Node is meant to be.

>>> c) How will you support variable arguments (something very common in
>>> javascript)?
>>
>> with the current api, arguments surgery or passing in an array as a single argument
>>
>> The only real problem I see is with optional arguments, but there is nothing stopping the api from accepting an options hash as one of the arguments, it doesn't have to be part of the spec.

> However, how would you implement this?

If your function called "async_function" needs to take an optional options hash:

function async_function(options, callback) {
  if (typeof callback == 'undefined') {
    callback = options;
    options = {};
  }
  // do something asynchronous
}

Now to use it:

promise(async_function).then( ... );

or

promise(async_function, { option1: 1 }).then( ... );

Tim Caswell

Feb 19, 2010, 3:34:35 PM
to nod...@googlegroups.com
It's not that, it's

promise(fs.open, filename).then(fs.read).then ...

the arguments have to be separate from the function, or as you say it's impossible.

function promise(fn) {
  var p = new Promise();
  var args = Array.prototype.slice.call(arguments, 1);
  args.push(p.addCallback);
  fn.apply(fn, args);
  return p;
}

Rasmus Andersson

Feb 19, 2010, 4:09:40 PM
to nod...@googlegroups.com
On Fri, Feb 19, 2010 at 21:34, Tim Caswell <t...@creationix.com> wrote:
> It's not that, it's
> promise(fs.open, filename).then(fs.read).then ...
> the arguments have to be separate from the function, or as you say it's
> impossible.
>
> function promise(fn) {
>   var p = new Promise();
>   var args = Array.prototype.slice.call(arguments, 1);
>   args.push(p.addCallback);
>   fn.apply(fn, args);
>   return p;
> }

Exactly. If we go by this design, there can never be a function with
variable arguments, or "api sugar" would break (because it can not
know where to put the callback).


Rasmus Andersson

Feb 19, 2010, 4:15:59 PM
to nod...@googlegroups.com
How about a very light-weight closure type which only takes a single
callback and trigger that on closing.

fs.open = function(filename, flags, mode) {
  var cl = closure();
  // call cl.close(error, fd) somewhere
  return cl;
}

Allowing almost equivalent call style:

fs.open(filename, flags, mode, function(error, fd){
  // do something with fd
})

becomes:

fs.open(filename, flags, mode)(function(error, fd){
  // do something with fd
})

allowing default values for flags and mode:

fs.open(filename)(function(error, fd){
  // do something with fd
})

I've written an updated version of such a light-weight closure (or
call it whatever you like -- promise or continuable :) which as a
bonus supports chaining:

http://gist.github.com/308779

--
Rasmus Andersson

inimino

Feb 19, 2010, 4:19:21 PM
to nod...@googlegroups.com
On 2010-02-19 12:55, Rasmus Andersson wrote:
> If there is no "handle" returned, creating wrappers would be tricky.
> This would be an acceptable "layer on top":
>
> var fd = fs.open(filename) // no callback means synchronous

The synchronous functions are now completely separate, so this will
be either:

fs_sync.open(...

or

fs.openSync(...

> promise(fs.open, filename).then(fs.fread, 512)(function(err, data){

I think "promise(fs.open)" or "continuable(fs.open)" is ugly and
I don't expect it to catch on. When I talk about a library that
wraps the low-level functions, I don't mean that you'll have to
wrap them when you use them.

Instead, you'll import a library that wraps the low-level fs.open
with a promise-returning (or continuable-returning) function.

var fs=require('your-favorite-fs-promise-layer')

var promise = fs.open(filename,...)

Or if you prefer:

var fs=require('your-favorite-fs-continuable-layer')

var continuable = fs.open(filename,...)

> But I don't know if it's possible to do the kind of introspection of
> the input callables as would be required (i.e. deducing the number of
> arguments fs.open takes, interpolating with undefined's and adding a
> callback to the end).

This level of introspection is possible, but it's not necessary
given that the boilerplate to return some sort of "handle", as you
call it, only needs to be written once, and doesn't need to happen at
the point of use at all... you'll just import a library that handles
it.
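
A minimal sketch of what such a once-written wrapping layer could look like,
continuable flavor (hypothetical module; a promise layer would be analogous):

// your-favorite-fs-continuable-layer: wrap every error-first function of a
// bare module into the curried continuable form, once and for all.
exports.wrap = function (bare) {
  var wrapped = {};
  function wrapOne(fn) {
    return function () {
      var args = Array.prototype.slice.call(arguments);
      return function (callback, errback) {
        fn.apply(bare, args.concat([function (err) {
          if (err) { return errback(err); }
          callback.apply(null, Array.prototype.slice.call(arguments, 1));
        }]));
      };
    };
  }
  for (var name in bare) {
    if (typeof bare[name] === 'function') { wrapped[name] = wrapOne(bare[name]); }
  }
  return wrapped;
};

// Application code then reads the way inimino describes:
//   var fs = require('your-favorite-fs-continuable-layer').wrap(require('fs'));
//   var continuable = fs.open(filename, 'r+', 0600);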

--
http://inimino.org/~inimino/blog/

Rasmus Andersson

Feb 19, 2010, 4:38:23 PM
to nod...@googlegroups.com
On Fri, Feb 19, 2010 at 22:19, inimino <ini...@inimino.org> wrote:
> On 2010-02-19 12:55, Rasmus Andersson wrote:
>> If there is no "handle" returned, creating wrappers would be tricky.
>> This would be an acceptable "layer on top":
>>
>> var fd = fs.open(filename) // no callback means synchronous
>
> The synchronous functions are now completely separate, so this will
> be either:
>
> fs_sync.open(...
>
> or
>
> fs.openSync(...

Ryan talked about merging them in a recent mail on this list (that's
the "why?" to my examples).

>
>> promise(fs.open, filename).then(fs.fread, 512)(function(err, data){
>
> I think "promise(fs.open)" or "continuable(fs.open)"is ugly and
> I don't expect it to catch on.  When I talk about a library that
> wraps the low-level functions, I don't mean that you'll have to
> wrap them when you use them.
>
> Instead, you'll import a library that wraps the low-level fs.open
> with a promise-returning (or continuable-returning) function.
>
> var fs=require('your-favorite-fs-promise-layer')
>
> var promise = fs.open(filename,...)
>
> Or if you prefer:
>
> var fs=require('your-favorite-fs-continuable-layer')
>
> var continuable = fs.open(filename,...)

Well, that will probably cause a lot of headaches, since you would need
to wrap _every single thing_ in node. What happens when such a library
is out of sync with node's underlying implementation? Your top-level
applications will "randomly break", making tracing the errors a
cumbersome task.

By returning a simple object which will call a member of itself, we
would keep the API simple, minimize overhead (no need for a "new type"
with all its implied machinery), and so on. It could even be as
simple as:

fs.open = function(filename, flags, mode) {
  var x = function(cb){ x.cb = cb; };
  // eventually call x.cb
  return x;
}

It would make API sugar/layers easy to write, maintain and use (in
contrast to passing the callback as an argument).
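
A slightly fuller sketch of the same idea, with the async work wired in
(fs.openCallback is a made-up name standing in for whatever callback-taking
primitive sits underneath):

fs.open = function (filename, flags, mode) {
  var x = function (cb) { x.cb = cb; };
  // fs.openCallback is hypothetical -- any callback-last primitive works here.
  fs.openCallback(filename, flags || 'r', mode || 0666, function (error, fd) {
    // By the time this fires (on a later tick), the caller has had the
    // chance to register x.cb via fs.open(...)(function (error, fd) {...}).
    if (x.cb) x.cb(error, fd);
  });
  return x;
};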

>
>> But I don't know if it's possible to do the kind of introspection of
>> the input callables as would be required (i.e. deducing the number of
>> arguments fs.open takes, interpolating with undefined's and adding a
>> callback to the end).
>
> This level of introspection is possible, but it's not necessary,
> given that the boilerplate to return some sort of "handle", as you
> call it, only needs to be written once, and doesn't need to happen
> at the point of use at all... you'll just import a library that
> handles it.

How is it possible? AFAIK the only way it's possible is to query V8
for the current AST in a given context, then traverse that tree, find
the function, step into the function, look for a use of the magic
arguments, then try to deduce how many arguments are drawn from
arguments, and finally find which position (or possible positions) a
callback would have. I'd say that counts as "not possible" :P.

Tim Caswell

unread,
Feb 19, 2010, 4:50:28 PM2/19/10
to nod...@googlegroups.com
No, the burden isn't on the abstraction author, it's on the author of the async function itself.

An abstraction just takes a variable number of arguments itself, and then "apply"s the function with the callback last on the list.

For functions that want to be fancy and accept variable arguments, then the burden is on them to strip the callback off the end of the arguments object.
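
Both halves of that contract, sketched out (the names here are illustrative,
not from any particular library):

// Abstraction author's side: forward whatever arguments were given and
// append the callback last.
function continuable(fn) {
  var args = Array.prototype.slice.call(arguments, 1);
  return function (callback) {
    fn.apply(null, args.concat([callback]));
  };
}

// Async-function author's side: a variadic function strips the callback
// off the end of `arguments` itself.
function query(/* ...clauses, callback */) {
  var args = Array.prototype.slice.call(arguments);
  var callback = args.pop();
  // ...build and run the query from the remaining args, then report back:
  callback(null, args);
}

// Usage: continuable(query, "a", "b")(function (err, clauses) { ... });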

Rasmus Andersson

unread,
Feb 19, 2010, 5:27:07 PM2/19/10
to nod...@googlegroups.com
On Fri, Feb 19, 2010 at 22:50, Tim Caswell <t...@creationix.com> wrote:
> No, the burden isn't on the abstraction author, it's on the author of the async function itself.

IMHO this isn't very pretty (it's the prettiest solution I've come up
with so far):

var cl = closure(readfile, __filename)(function(err, data){
  // do something with err and data
})

This would be better:

var cl = readfile(__filename)(function(err, data){
  // do something with err and data
})

As it seems at the moment, we will all need to code using the first
version (passing a function ref to a wrapper), since the second will only
be possible by ugly monkey-patching of node's modules (which is
cumbersome in practice).
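
For the record, that monkey-patching would look something like this (a
sketch, assuming the wrapped function takes its callback as the last
argument):

var fs = require('fs');
var _readFile = fs.readFile;

// Replace node's readFile in place with a closure-returning version.
fs.readFile = function (filename) {
  return function (cb) {
    _readFile(filename, cb);
  };
};

fs.readFile(__filename)(function (err, data) {
  // do something with err and data
});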

Personally I appreciate the minimalism introduced by passing a
single callback to an async function, but at the same time I have _a
lot_ of code which would need to grow larger -- i.e. I would need to
add _more_ sugar to accomplish what's already possible today (e.g.
multiple callbacks, chaining, timeouts, etc.).

I prefer to write code which is only the essence of the task I'm
approaching -- I'd rather not write boilerplate code over and over.

>
> An abstraction just takes a variable number of arguments itself, and then "apply"s the function with the callback last on the list.
>
> For functions that want to be fancy and accept variable arguments, then the burden is on them to strip the callback off the end of the arguments object.
>
>
> On Feb 19, 2010, at 3:38 PM, Rasmus Andersson wrote:
>
>> How is it possible? AFAIK the only way it's possible is to query V8
>> for the current AST in a given context, then traverse that tree, find
>> the function, step into the function, look for a use of the magic
>> arguments, then try to deduce how many arguments are drawn from
>> arguments, and finally find which position (or possible positions) a
>> callback would have. I'd say that counts as "not possible" :P.
>

Rasmus Andersson

unread,
Feb 19, 2010, 5:32:00 PM2/19/10
to nod...@googlegroups.com
I've updated my test case code to work with the "single callback
passed as last argument" being discussed. Find it here:

http://gist.github.com/308779#file_node_closure_wrapper.js

(you can find the other version, returning closures, below if you scroll down).

--
Rasmus Andersson

cloudhead

unread,
Feb 19, 2010, 3:47:59 PM2/19/10
to nodejs
I don't see any reason to use Node's base API if it doesn't provide
basic event-driven programming like promises do. Passing a callback as
the last parameter is like trying to fight a tiger with a fork.

On Feb 19, 11:38 am, Benjamin Thomas <bam.tho...@gmail.com> wrote:


> On Fri, Feb 19, 2010 at 9:20 AM, Mikeal Rogers <mikeal.rog...@gmail.com> wrote:
> > But the tradeoff here is that different libraries may have radically
> > different API styles for relatively simple operations requiring the
> > average programmer to understand and context switch between different
> > styles when reusing third party modules.
>
> I'm not convinced that is true.  As long as libraries adopt Node's
> base system (whatever that ends up being) then ALL the libraries will
> have a consistent API.
>
> What would then happen is that two different kinds of libraries would
> emerge, those offering async services, and then those for dealing with
> async services.
>
> So, I could write a library for CouchDB, that matches Node's API.  Now
> all of a sudden my CouchDB library is compatible with any async lib
> (like Promises or Continuations or whatever) that works with Node's
> API.
>
> Or I could write a Promise library that works with Node's API, now I
> am guaranteed that it will work with any library that matches Node's
> base API.
>
> I realize I am being redundant here, but I'm trying to drive home that
> by having Node choose a base async API (and encouraging library
> developers to do this as well) we are not segmenting the node
> libraries and nor are we making people switch back and forth between
> different styles.  You get to choose the async system you like, AND
> you know it will work with Node libraries.
>

Jeremy Gray

unread,
Feb 20, 2010, 12:31:32 AM2/20/10
to nodejs
Just to put my two bits into this thread: node should offer one and
only one style, almost surely based on its current promise model* or
one closely related to it, for the simple reasons that a) on the async
side it has already proven trivial for people with a preference for a
different async style to be able to implement it on top for their own
use and b) on the sync side node should be opinionated enough to Just
Say No To Sync.

Library builders can feel free to use whatever async style they prefer
_inside_ their library so long as they expose bare node-style async
apis at their edges.

Jeremy

* As much as I don't like the terminology in use (there is plenty
of prior art that has already established terminology and proven
designs and implementations), the term has stuck here in node-land
and I am therefore more than happy to continue with it.

Message has been deleted

Rasmus Andersson

unread,
Feb 20, 2010, 3:25:47 PM2/20/10
to nod...@googlegroups.com, nodejs
Node 0.1.30 will not have any promises but will instead use simple
callbacks passed as the last argument to an asynchronous function. For
instance, this is how you would read the contents of a file:

fs.readFile("/foo", function(error, content){
  // do something with error and/or content
});

The first argument to callbacks is reserved for an Error; the rest is free.
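
So a typical callback checks the error slot first and returns early:

fs.readFile("/foo", function(error, content){
  if (error) {
    // handle (or re-emit) the error and bail out
    return;
  }
  // otherwise content is the file's contents
});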

Sent from my iPhone

On 20 feb 2010, at 19.54, lollicode <loll...@gmail.com> wrote:

> Sorry if I'm late to the party or missed anything, but looking at ry's
> commits from y'day on github, it seems that the continuable style has
> already been chosen - so is this debate still current ?
> And if something has been decided, what is it ?

Tim Caswell

unread,
Feb 20, 2010, 3:53:42 PM2/20/10
to nod...@googlegroups.com
I just updated Do to use the new promise-less node, and discovered a handy pattern for converting libraries to your preferred style.

Instead of converting every time you call a function, do it once at require time and then use the methods as if they were written in your style.

For example, here is the new "Do.convert"

// Takes any async lib that uses callback based signatures and converts
// the specified names to continuable style and returns the new library.
exports.convert = function (lib, names) {
  var newlib = {};
  names.forEach(function (key) {
    newlib[key] = function () {
      var args = Array.prototype.slice.call(arguments);
      return function (callback, errback) {
        args.push(function (err, val) {
          if (err) {
            errback(err);
          } else {
            callback(val);
          }
        });
        lib[key].apply(lib, args)
      }
    }
  });
  return newlib;
}

It can be used like this:

var fs = Do.convert(require('fs'), ["readFile", "stat", "readdir"]);


Then I can use fs.readFile, fs.stat, and fs.readdir as if they were Do style continuables instead of node callbacks.
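
A converted function is then called continuable-style, with separate
callback and errback (this follows directly from the convert code above):

fs.readFile(__filename)(function (data) {
  // success: data is the file's contents
}, function (err) {
  // failure: err is whatever the underlying callback reported
});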

Using this approach of require-time conversion, anyone can use anyone's library as long as we all follow node's  lead and export callback functions and convert them on input.

I hope this helps ease some of the apprehension about the recent api change.  Now let's get back to writing awesome code!

-Tim Caswell

On Feb 19, 2010, at 12:00 PM, Ryan Dahl wrote:

Hey everyone,

I'm going to remove promises entirely. For places where promises would
be used (fs, dns) I will instead use a standard callback interface:

 method(arg0, arg1, arg2, function (error, result0, result1) {
    puts('complete');
 });

That is, the completion callback will always be the last parameter, and
the first argument of the completion callback will always be reserved for
an error object. By following this scheme everywhere, I'm confident that
very good user-land deferred/promise/continuation libraries will spring up.

Expect this interface in the next version of node, 0.1.30.

Rasmus Andersson

unread,
Feb 20, 2010, 4:41:24 PM2/20/10
to nod...@googlegroups.com
However nice that might be internally, modules meant for other people
to use should expose a node-style API. (That is, last argument is the
callback(err[, arg..]) -- not using legacy promises etc).

--
Rasmus Andersson

Rasmus Andersson

unread,
Feb 20, 2010, 4:42:22 PM2/20/10
to nod...@googlegroups.com
On Sat, Feb 20, 2010 at 22:41, Rasmus Andersson <ras...@notion.se> wrote:
> However nice that might be internally, modules meant for other people
> to use should expose a node-style API. (That is, last argument is the
> callback(err[, arg..]) -- not using legacy promises etc).

Oh, what I meant to say is "agreed, but let's try to adopt the
simpler single-callback-based API" :)

Jeremy Gray

unread,
Feb 20, 2010, 8:49:15 PM2/20/10
to nodejs
Following up with a clarification: For what it's worth, in my previous
post I should have made it clearer that I don't have a strong
preference as to the specific pattern used by node so long as 1. it is
async and 2. there is only one such pattern in node (and at the public
edges of node libraries). I just didn't want to see a case where a
bunch of variations were all exposed at once by node. This thread has
moved on from that possibility, therefore so has my concern, and
things seem to be converging nicely. Keep up the great work everyone!

Jeremy

George Moschovitis

unread,
Feb 21, 2010, 8:23:15 AM2/21/10
to nodejs
>  - A base library based on pure callbacks

+1 nice idea...

>  - And a "blessed" async library that wraps the base async library in Ryan's favorite of these options.

+1 the eventual CommonJS promise api should be used here...

-g.

--
http://www.appenginejs.org

christkv

unread,
Mar 2, 2010, 8:26:06 AM3/2/10
to nodejs
Hi Ryan

I received a comment on my mongodb driver about adopting the 0.1.30
format for callbacks, and a thought struck me.

The mongodb driver's collection.find always needs a callback but can
also take one or more arguments, meaning that if the callback is at the
end you have to write a lot of extra code to ensure the correct ordering
of the arguments in front of it, or face large function calls.

currently: collection.find(function(docs){})
at end: collection.find({}, {}, function(docs){})

Of course you can write code to detect the function, but what if you
pass more than one function as an argument, such as in a map-reduce
call to mongo?

I'm still undecided on this. I would lean towards putting the callback
at the start rather than the end.

Any suggestions? Am I completely delusional here?

Christian

christkv

unread,
Mar 2, 2010, 8:30:49 AM3/2/10
to nodejs
Hi Ryan

I'm running into some issues with my mongodb driver when looking at
0.1.30.

The callback function at the end of the method is a hard rule, right?
Because some methods like collection.find always need a callback but
can also take one or more arguments, meaning that if the callback is
at the end you have to write a lot of extra code to ensure the correct
ordering of the arguments in front of it, or face large function calls.

currently: collection.find(function(docs){})
at end: collection.find({}, {}, function(docs){})

Of course you can write code to detect the function, but what if you
pass more than one function as an argument, such as in a map-reduce
call to mongo?

I'm still undecided on this. I would lean towards putting the callback
at the start rather than the end, BUT I might be missing something
incredibly obvious here.

Dean Landolt

unread,
Mar 2, 2010, 11:22:14 AM3/2/10
to nod...@googlegroups.com
On Tue, Mar 2, 2010 at 8:30 AM, christkv <chri...@gmail.com> wrote:
> Hi Ryan
>
> I'm running into some issues with my mongodb driver when looking at
> 0.1.30.
>
> The callback function at the end of the method is a hard rule, right?
> Because some methods like collection.find always need a callback but
> can also take one or more arguments, meaning that if the callback is
> at the end you have to write a lot of extra code to ensure the correct
> ordering of the arguments in front of it, or face large function calls.
>
> currently: collection.find(function(docs){})
> at end: collection.find({}, {}, function(docs){})
>
> Of course you can write code to detect the function, but what if you
> pass more than one function as an argument, such as in a map-reduce
> call to mongo?
>
> I'm still undecided on this. I would lean towards putting the callback
> at the start rather than the end, BUT I might be missing something
> incredibly obvious here.

I think the standard argument against putting the callback at the front is largely style -- all of your arguments to the function call would be way down the page, if visible at all -- which makes code pretty unreadable.
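
One common way to square this with callback-last is to peel a trailing
function off the argument list and fill in defaults for the optional
arguments (a sketch only, not the mongodb driver's actual code; if a
legitimate argument can itself be a function, as in map-reduce, the caller
simply has to pass the optional arguments explicitly):

function find(selector, options, callback) {
  if (callback === undefined && typeof options === 'function') {
    callback = options;    // called as find(selector, callback)
    options = {};
  }
  if (callback === undefined && typeof selector === 'function') {
    callback = selector;   // called as find(callback)
    selector = {};
    options = {};
  }
  // ...build the query from selector/options, then call callback(err, docs)
}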