It seems to me that an elegant way to process certain kinds of data in Node.js would be to chain processing objects, like UNIX pipes.
For example, grep:
var stream = require('stream');
var util = require('util');

function Grep(pattern) {
    ...
}
util.inherits(Grep, stream.Stream);

Grep.prototype.???? = ???????   // What goes here?

var grep = new Grep(/foo/);

process.stdin.pipe(grep);
grep.pipe(process.stdout);
However, it's not at all clear to me which Stream methods need to be overridden for this to work.
How can I create a Stream object that simply copies from its input to its output? Presumably with that answered, more sophisticated filtering streams become trivial.
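To make the goal concrete, here is the usage I have in mind, sketched with a hypothetical Forwarder class that simply copies its input to its output (since pipe() returns its destination, the calls can be chained):

var fwd = new Forwarder();

// Everything arriving on stdin should come back out on stdout,
// unchanged, after passing through the Forwarder in the middle.
process.stdin.pipe(fwd).pipe(process.stdout);
process.stdin.resume();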
Update: it feels as if the following should work (expressed in CoffeeScript, so I don't fill this box with JS syntax!):
class Forwarder extends stream.Stream
    write: (chunk, encoding) ->
        @emit 'data', chunk
    end: (chunk, encoding) =>
        if chunk?
            @emit 'data', chunk
        @emit 'end'
fwd = new Forwarder()
fwd.pipe(process.stdout)
process.stdin.pipe(fwd)
process.stdin.resume()
However, catting something into this script produces no output, whereas calling fwd.write() explicitly from within the script does produce output on stdout.
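To make the symptom concrete, appending a line like the following to the end of the script (the text is just an arbitrary test string) does print on stdout, while anything piped in on stdin never appears:

fwd.write('explicit test write\n')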