So, after I merge net2 I'll be reworking all the streaming interfaces
to look like that. Or at least something similar. HTTP request and
response objects will follow it too - so instead of getting "body"
events and doing "sendBody", they will act as normal data streams.
What's cool is we'll be able to write abstract functions to pump data
from a readable stream into a writable one. Maybe we could have
something where you can open a new file write stream (which under the
hood uses the posix module) and if you wanted to do an HTTP server
with uploads you'd do
createFile("/tmp/x", function (fStream) {
  pump(req, fStream);
});
The pump would do automatic throttling. Similarly if you want to
stream a file through a socket you might do
openFileStream("/data/movie.avi", function (fStream) {
  pump(fStream, socket);
});
The "pump" function would be very simple
function pump (readStream, writeStream, cb) {
  // Pause the reader whenever the writer's buffer is full.
  readStream.addListener("data", function (d) {
    if (!writeStream.write(d)) readStream.pause();
  });
  readStream.addListener("eof", function () {
    writeStream.close();
    if (cb) cb(); // cb is optional -- the examples above don't pass one
  });
  // Resume reading once the writer has flushed its buffer.
  writeStream.addListener("drain", function () {
    readStream.resume();
  });
}
Or at least something like that.
CommonJS should adopt this as their I/O API and it should be the basis
of the socket and HTTP server APIs. It needs to be broken in before
proposing it there. For example, it's not clear to me the best way to
"filter" data through the callbacks.
-Ray
The key is that write() and close() need to be guaranteed async. They
can push data onto the buffer (or close it to new writes)
automatically, but they must not fire "data" or "eof" events until at
least process.nextTick.
Then, you can have something like this:
function filter (input, output, cb) {
  input.addListener("data", function (chunk) { output.write(cb(chunk)); });
  input.addListener("eof", function () { output.close(); });
}
Now everything that input reads will be written to output after being
filtered through cb. The callback could of course buffer it, look for
EOL tokens, whatever.
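For instance, a cb that buffers partial chunks and only passes through complete lines might look like this. makeLineFilter is an invented name, and uppercasing stands in for a real sed-style transformation -- the buffering logic is the point:

```javascript
// Hypothetical chunk-transforming callback for filter() above: holds
// back incomplete data and releases only whole, transformed lines.
function makeLineFilter(transform) {
  let buffer = "";
  return function (chunk) {
    buffer += chunk;
    const lines = buffer.split("\n");
    buffer = lines.pop(); // keep the trailing partial line for next time
    return lines.map(transform).map((l) => l + "\n").join("");
  };
}

const upcase = makeLineFilter((line) => line.toUpperCase());
const out = [];
out.push(upcase("foo\nba")); // "FOO\n" -- "ba" is held back
out.push(upcase("r\nbaz"));  // "BAR\n" -- "baz" still buffered
```

A real version would also need to flush whatever remains in the buffer on "eof", which suggests the filter callback may want a second, end-of-stream hook.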
I'm going to write a sed-style middleware using this principle today.
--i