
I'm using Node.js to capture a video stream and then calling ffmpeg via spawn to process the feed. I would like to do this in parallel when I receive multiple video streams.

I can mimic this manually by opening multiple terminals and running a different variation of the ffmpeg command in each.

I understand that Node.js is single-threaded. I have looked at async but haven't figured out how it applies here, given that callbacks play an integral role.

Essentially, I want to launch multiple ffmpeg processes in parallel without opening several terminal windows, and handle callbacks such as errors or exit codes.

e.g.

//import spawn
var spawn = require('child_process').spawn;

//spawn ffmpeg with params
var ffmpegexec = spawn('ffmpeg', ['-i', 'pathname', '-vcodec', 'libx264', '-acodec', 'libfdk_aac', '-threads', '2', '-s', '320x240', 'filename.mp4']);

//deal with output
ffmpegexec.stdout.on('data', function(data) {
    console.log("stdout: " + data);
});

ffmpegexec.stderr.on('data', function(data) {
    console.log("stderr: " + data);
});

ffmpegexec.on('exit', function(code) {
    console.log("exit: " + code);
});

I was thinking that perhaps I could launch each new process in a separate Node.js instance, or not, depending on your recommendations.

  • Node is non-blocking, so it seems like each request (stream you receive) could just spawn its own ffmpeg process. (While Node is single-threaded, spawn creates a separate child process, not a thread.) Although you might run into trouble with lots of requests. Maybe create a queue? Commented Apr 7, 2014 at 22:27

2 Answers


Just call spawn(...) multiple times consecutively.

Each call to child_process.spawn(...) starts a new OS process, and a large number of these can run concurrently. You will have to register separate stdout/stderr/exit handlers for each instance, but if they are similar you can reuse the same (or similar) handlers.

For example, if you wanted to run the same commands on separate filenames you could do the following:

var spawn = require('child_process').spawn;

function runFfmpeg(filename) {
  var proc = spawn('ffmpeg', ['-i', 'pathname', ..., filename]);
  proc.stdout.on('data', function(data) { console.log("stdout: " + data); });
  proc.stderr.on('data', function(data) { console.log("stderr: " + data); });
  proc.on('exit', function(code) { console.log("exit: " + code); });
}

var filenames = ['foo.mp4', 'bar.mp4', 'gah.mp4'];
filenames.forEach(function(filename) { runFfmpeg(filename); });

Keep in mind that this can put a huge load on your machine. You will likely need to throttle the number of processes running at any one time based on the resources of the target machine (e.g., one process per physical CPU, and one or two ffmpeg threads per CPU core).
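Here is a minimal sketch of such a throttle in plain Node, with no extra dependencies. The cap of two concurrent jobs and the hard-coded ffmpeg arguments (reusing the 'pathname' placeholder from the question) are assumptions you would tune for your hardware:

// Minimal throttling sketch (assumptions: MAX_CONCURRENT = 2,
// placeholder ffmpeg arguments as in the question above).
var spawn = require('child_process').spawn;

var MAX_CONCURRENT = 2;   // tune to your CPU count
var queue = [];           // filenames waiting to be processed
var running = 0;

function enqueue(filename) {
  queue.push(filename);
  next();
}

function next() {
  // only start a new job if we are under the cap and work remains
  if (running >= MAX_CONCURRENT || queue.length === 0) return;
  running++;
  var filename = queue.shift();
  var proc = spawn('ffmpeg', ['-i', 'pathname', '-vcodec', 'libx264', filename]);
  proc.on('exit', function(code) {
    console.log(filename + " exited with " + code);
    running--;
    next();                // start the next queued job, if any
  });
}

['foo.mp4', 'bar.mp4', 'gah.mp4', 'baz.mp4'].forEach(enqueue);

With this in place, at most two ffmpeg processes run at once; each completion pulls the next filename off the queue.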


1 Comment

Importing the spawn method: const { spawn } = require('child_process');

This will happen naturally in Node's normal code path. If you had two files, e.g. ['video1.mp4', 'video2.mp4'], and looped over that array, passing each filename to your function that spawns ffmpeg, they would run in parallel. Fundamentally, this is straightforward to achieve in Node.

However, it is naive to do this within a network service (your question isn't clear on whether this has a web interface, is a CLI, or something else), because it then becomes trivial to DoS the system to a halt by launching an untenable number of these in parallel. This is where async.eachLimit comes to the rescue with a sensible queue/batch paradigm.
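For illustration, a minimal sketch using the async library's eachLimit; the concurrency limit of 2 and the output file naming are assumptions, not anything from the answer itself:

// Sketch using the async library (npm install async).
// Assumptions: limit of 2 concurrent jobs, 'out-' output naming.
var async = require('async');
var spawn = require('child_process').spawn;

var filenames = ['video1.mp4', 'video2.mp4', 'video3.mp4'];

async.eachLimit(filenames, 2, function(filename, done) {
  var proc = spawn('ffmpeg', ['-i', filename, '-vcodec', 'libx264', 'out-' + filename]);
  proc.on('exit', function(code) {
    // report failures back to async so the final callback sees them
    done(code === 0 ? null : new Error('ffmpeg exited with ' + code));
  });
}, function(err) {
  if (err) console.error('a job failed: ' + err.message);
  else console.log('all videos processed');
});

async.eachLimit never runs more than the given number of iteratee calls at once, so the queueing is handled for you.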

1 Comment

That approach does not prevent a DoS, but rather limits the impact to a single worker on the system. The web server is still out to lunch, but only busy eating from one plate, not 8 or 12 or more. True spam filters should be applied when serving the general public.
