
I want to stream multiple files, one after another, to the browser. To illustrate, think of having multiple CSS files that should be delivered concatenated as one.

The code I am using is:

var directory = path.join(__dirname, 'css');
fs.readdir(directory, function (err, files) {
  async.eachSeries(files, function (file, callback) {
    if (!endsWith(file, '.css')) { return callback(); } // (1)

    var currentFile = path.join(directory, file);
    fs.stat(currentFile, function (err, stats) {
      if (stats.isDirectory()) { return callback(); } // (2)

      var stream = fs.createReadStream(currentFile).on('end', function () {
        callback(); // (3)
      });
      stream.pipe(res, { end: false }); // (4)
    });
  }, function () {
    res.end(); // (5)
  });
});

The idea is that I

  1. filter out all files that do not have the file extension .css.
  2. filter out all directories.
  3. proceed with the next file once a file has been read completely.
  4. pipe each file to the response stream without closing it.
  5. end the response stream once all files have been piped.

The problem is that only the first .css file gets piped, and all remaining files are missing. It's as if (3) jumped directly to (5) after the first (4).

The interesting thing is that if I replace line (4) with

stream.on('data', function (data) {
  console.log(data.toString('utf8'));
});

everything works as expected: I see multiple files. If I then change this code to

stream.on('data', function (data) {
  res.write(data.toString('utf8'));
});

all files except the first are missing again.

What am I doing wrong?

PS: The error happens using Node.js 0.8.7 as well as using 0.8.22.

UPDATE

Okay, it works if you change the code as follows:

var directory = path.join(__dirname, 'css');
fs.readdir(directory, function (err, files) {
  var concatenated = '';
  async.eachSeries(files, function (file, callback) {
    if (!endsWith(file, '.css')) { return callback(); }

    var currentFile = path.join(directory, file);
    fs.stat(currentFile, function (err, stats) {
      if (stats.isDirectory()) { return callback(); }

      var stream = fs.createReadStream(currentFile).on('end', function () {
        callback();
      }).on('data', function (data) { concatenated += data.toString('utf8'); });
    });
  }, function () {
    res.write(concatenated);
    res.end();
  });
});

But: why? Why can't I call res.write multiple times, instead of first summing up all the chunks and then writing them all at once?

2 Comments

  • What you came up with is not what you were asking for, because you are not actually piping: you are buffering and then writing the buffer to an output stream. This also answers your additional question: if you really piped the data to the output stream, you would not have to worry about write() at all. You should find a piped solution by reviewing these Streams2 examples: github.com/Floby — especially this one: github.com/Floby/node-catstream Commented Mar 27, 2013 at 14:45
  • I know that I am not piping in what I've written, but I have since found a solution using piping. As you can see from my answer below, the problem was not in the code; it was in the unit test. Commented Mar 27, 2013 at 19:04

3 Answers


Consider also using multistream, which lets you combine multiple streams and emit them one after another.




The code was perfectly fine; it was the unit test that was wrong ...

Fixed that, and now it works like a charm :-)

1 Comment

I'm trying to do the same with image and audio files, but I only get the first image shown (in Postman), although the size of the response payload indicates that all files have been received. How do you distinguish between multiple files when they are streamed as one stream?

May help someone else:

const fs = require("fs");
const pth = require("path");

let readerStream1 = fs.createReadStream(pth.join(__dirname, "a.txt"));
let readerStream2 = fs.createReadStream(pth.join(__dirname, "b.txt"));
let writerStream = fs.createWriteStream(pth.join(__dirname, "c.txt"));

// only readable streams have a "pipe" method;
// keep the writer open after the first file, then pipe the second
readerStream1.pipe(writerStream, { end: false });
readerStream1.on('end', () => readerStream2.pipe(writerStream));

I also checked Rocco's answer, and it's working like a charm:

//npm i --save multistream

const multi = require('multistream');
const fs = require('fs');
const pth = require("path");
 
let streams = [
  fs.createReadStream(pth.join(__dirname, "a.txt")),
  fs.createReadStream(pth.join(__dirname, "b.txt"))
];
let writerStream = fs.createWriteStream(pth.join(__dirname, "c.txt"));
 
//new multi(streams).pipe(process.stdout);
new multi(streams).pipe(writerStream);

And to send the result to the client:

const multi = require('multistream');
const fs = require('fs');
const pth = require("path");
const exp = require("express");
const app = exp();
app.listen(3000);

app.get("/stream", (q, r) => {
  new multi([
    fs.createReadStream(pth.join(__dirname, "a.txt")),
    fs.createReadStream(pth.join(__dirname, "b.txt"))
  ]).pipe(r);
});

