
So in Node I can execute a JavaScript file using a command like:

$ node src/someFile.js

But is there a way to execute all of the JavaScript files in a given directory synchronously (one file executes, then after it has finished the next one executes, etc)? Basically, is there a single command that would have the effect of something like

$ node src/firstFile.js
$ node src/secondFile.js
$ node src/thirdFile.js
...

I've tried commands like

$ node src/*.js

but with no success.

If there exists no such command, what's the best way to go about doing something like this?

  • Is an answer that uses shell-specific functionality acceptable to you? (I see you've already tried globbing) Commented Apr 12, 2016 at 23:31
  • I'm currently running a Windows machine, so shell scripting wouldn't be optimal for me personally. However, if it does solve the issue, then it's a perfectly valid solution. Commented Apr 12, 2016 at 23:34
  • I'll fix up my answer with a PowerShell result. I think it would be best to do this from the shell unless you need to capture their exit codes, restore their SEH chains in CreateRemoteThread, etc. Commented Apr 12, 2016 at 23:35
  • I'm away from my windows machine, let me know if that PS script works for you. Commented Apr 12, 2016 at 23:42

4 Answers


I am not sure if this will work for you, because this is a feature of the shell, not of the Node runtime, but:

for f in src/*.js; do node "$f"; done

Or in Powershell:

Get-ChildItem .\*.js | Foreach-Object {
  node $_ 
}


You could use spawn to run a node process from node, like:

(function() {

var cp = require('child_process');
var child = cp.spawn('node', ['src/firstFile.js']);

At this point you have to add some listeners:

// Listen for an exit event:
child.on('exit', function(exitCode) {
    console.log('Child exited with code: ' + exitCode);
});
// Listen for stdout data:
child.stdout.on('data', function(data) {
    console.log(data.toString());
});
// Listen for stderr data:
child.stderr.on('data', function(data) {
    console.log('err data: ' + data);
    // on error, kill this child
    child.kill();
});
}).call(this);

Of course you need to serialize execution here, but it's easy since you have the child.on('exit') that tells you that the process ended, so you can start the next one.

See Controlling Multiple Processes in Node for a working example that runs multiple processes in Node and waits for them to finish.

2 Comments

Ah so there's no simple one-liner solution native to node for this, but rather it requires writing a file (like you've suggested above)?
A one-line solution will not give you control of the process status, exit code, and pid. Of course, you can write a bash script that captures the pid and executes node $filename in a for loop, etc., but you need a supervisor to handle all these processes, which can be written in bash, Python, Perl, or... Node.

Using a POSIX shell:

$ for js in src/*.js; do node "$js"; done



If calling each one from the shell isn't a hard requirement, I would kick them all off with a single Node process launched from the shell. That Node script would:

  • traverse the directory of modules
  • require the first one, which executes it, and pass a callback which the module will call on completion
  • When the complete callback is called, execute the next script in your directory.

