197

I need to pass a text file to my script from the terminal and then read the data from it. How can I do this?

node server.js file.txt

How do I pass in the path from the terminal, and how do I read it on the other side?

2 Comments
  • If you find yourself adding more options on the command line, you can use Optimist. Commented Jan 17, 2014 at 17:05
  • stackoverflow.com/questions/6156501/… shows another way to read a text file Commented May 17, 2017 at 6:13

6 Answers

226

You'll want to use the process.argv array to access the command-line arguments to get the filename and the FileSystem module (fs) to read the file. For example:

// Make sure we got a filename on the command line.
if (process.argv.length < 3) {
  console.log('Usage: node ' + process.argv[1] + ' FILENAME');
  process.exit(1);
}
// Read the file and print its contents.
var fs = require('fs')
  , filename = process.argv[2];
fs.readFile(filename, 'utf8', function(err, data) {
  if (err) throw err;
  console.log('OK: ' + filename);
  console.log(data)
});

To break that down a little for you: process.argv will usually have at least two entries, the zeroth being the path to the "node" interpreter and the first being the script that node is currently running; anything after that was passed on the command line. Once you've pulled a filename from argv, you can use the filesystem functions to read the file and do whatever you want with its contents. Sample usage would look like this:

$ node ./cat.js file.txt
OK: file.txt
This is file.txt!
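For that invocation, process.argv would look something like the following (the interpreter and script paths shown here are illustrative and depend on your system):

[ '/usr/local/bin/node', '/home/user/cat.js', 'file.txt' ]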

[Edit] As @wtfcoder mentions, using the "fs.readFile()" method might not be the best idea because it buffers the entire contents of the file before yielding it to the callback function. This buffering can use lots of memory but, more importantly, it does not take advantage of one of the core features of node.js - asynchronous, evented I/O.

The "node" way to process a large file (or any file, really) would be to use fs.read() and process each chunk as it becomes available from the operating system. However, reading the file this way requires you to do your own (possibly incremental) parsing/processing of the file, and some amount of buffering may be inevitable.
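A minimal sketch of that chunk-by-chunk idea, using the higher-level fs.createReadStream rather than raw fs.read() calls (the filename again comes from process.argv, as above):

var fs = require('fs'),
    filename = process.argv[2];

var stream = fs.createReadStream(filename, { encoding: 'utf8' });
stream.on('data', function(chunk) {
  // Each chunk is a piece of the file; do your incremental parsing/processing here.
  process.stdout.write(chunk);
});
stream.on('error', function(err) {
  console.error('Error reading ' + filename + ': ' + err.message);
  process.exit(1);
});
stream.on('end', function() {
  // The whole file has been delivered, one chunk at a time.
});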


5 Comments

Awesome, thanks so much, very helpful. How could I split this data by line?
@fancy: try var lines = data.split(/\r?\n/);, then the array "lines" will have each line.
This isn't a good idea if the text file is large, as it will all be read into memory. If you're processing, say, a 1000 MB CSV file, look at fs.createReadStream. You will need to take care with line splitting, though, as the data chunks won't (in most cases) fall on line boundaries (some people have already come up with solutions - google)
@wtfcoder: yes, very good point. My intent was just to demonstrate the simple case of reading a file named on the command-line; there are obviously many subtleties (esp. performance) that are beyond the scope of this question.
I posted a solution to a similar question for parsing a very large file synchronously using a stream. See: stackoverflow.com/questions/16010915/…
114

Using fs with node:

var fs = require('fs');

try {  
    var data = fs.readFileSync('file.txt', 'utf8');
    console.log(data.toString());    
} catch(e) {
    console.log('Error:', e.stack);
}

4 Comments

Note that this is the synchronous version.
@RichWerden what do you mean by "synchronous" in this context?
In Node when something is "synchronous" it stops/blocks the system from doing anything else. Let's say you have a node webserver - if any other requests come in while the above is happening, the server won't/can't respond because it is busy reading the file.
@RichWerden, how to make the "asynchronous" version of this solution? Thanks.
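For reference, a minimal asynchronous counterpart of the snippet above uses the callback form of fs.readFile (same filename, just non-blocking):

var fs = require('fs');

fs.readFile('file.txt', 'utf8', function(err, data) {
  if (err) {
    console.log('Error:', err.stack);
    return;
  }
  console.log(data);
});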
30

IMHO, fs.readFile() should be avoided because it loads the whole file into memory and it won't call the callback until the entire file has been read.

The easiest way to read a text file is to read it line by line. I recommend a BufferedReader (from the third-party buffered-reader package):

new BufferedReader ("file", { encoding: "utf8" })
    .on ("error", function (error){
        console.log ("error: " + error);
    })
    .on ("line", function (line){
        console.log ("line: " + line);
    })
    .on ("end", function (){
        console.log ("EOF");
    })
    .read ();

For complex data structures like .properties or JSON files you need to use a parser (internally it should also use a buffered reader).
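For instance, a small JSON file can be read and parsed in one go (a minimal sketch; the filename here is just an example):

var fs = require('fs');

fs.readFile('config.json', 'utf8', function(err, text) {
  if (err) throw err;
  var config = JSON.parse(text); // throws if the file is not valid JSON
  console.log(config);
});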

3 Comments

Thanks for pointing out this technique. You're right that this may be the best way, but I just thought it was slightly confusing in the context of this question, which I think is asking about an undemanding use-case. As pointed out above, if it's just a small file being passed to a command line tool, there's no reason not to use fs.readFile() or fs.readFileSync(). It's got to be a huge file to cause a noticeable wait. A JSON config file like package.json is likely to be under 1 kb, so you can just fs.readFile() and JSON.parse() it.
BufferedReader may have changed its signature. I had to replace BufferedReader with BufferedReader.DataReader, where BufferedReader was the module. See github.com/Gagle/Node-BufferedReader
I see that BufferedReader is now deprecated.
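Since that package is deprecated, the built-in readline module gives a similar line-by-line interface using only core modules (a minimal sketch):

var fs = require('fs');
var readline = require('readline');

var rl = readline.createInterface({
  input: fs.createReadStream('file.txt', { encoding: 'utf8' }),
  crlfDelay: Infinity // treat \r\n as a single line break
});

rl.on('line', function(line) {
  console.log('line: ' + line);
});
rl.on('close', function() {
  console.log('EOF');
});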
13

You can use a read stream and pipe to read the file line by line without reading the whole file into memory at once (event-stream here is a third-party npm package).

var fs = require('fs'),
    es = require('event-stream');

var filename = process.argv[2]; // e.g. node script.js file.txt

var s = fs.createReadStream(filename)
    .pipe(es.split()) // split the stream into lines
    .pipe(es.mapSync(function(line) {
        // pause the readstream while the line is being handled
        s.pause();
        console.log("line:", line);
        s.resume();
    })
    .on('error', function(err) {
        console.log('Error:', err);
    })
    .on('end', function() {
        console.log('Finished reading.');
    })
);


7

I am posting a complete example which I finally got working. Here I am reading in a file rooms/rooms.txt from a script rooms/rooms.js

var fs = require('fs');
var path = require('path');

// __dirname is the rooms/ directory, so this resolves to rooms/rooms.txt.
var readStream = fs.createReadStream(path.join(__dirname, 'rooms.txt'), 'utf8');
var data = '';
readStream.on('data', function(chunk) {
    data += chunk;
}).on('end', function() {
    console.log(data);
});
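If you want the same accumulate-on-'data' pattern but with the filename taken from the command line as in the question, a small variation (sketch) would be:

var fs = require('fs');

var filename = process.argv[2];
var readStream = fs.createReadStream(filename, 'utf8');
var data = '';
readStream.on('data', function(chunk) {
    data += chunk;
}).on('end', function() {
    console.log(data);
}).on('error', function(err) {
    console.log('Error:', err);
});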


-3

The async way of life:

#! /usr/bin/node

const fs = require('fs');

function readall (stream)
{
  return new Promise ((resolve, reject) => {
    const chunks = [];
    stream.on ('error', (error) => reject (error));
    stream.on ('data',  (chunk) => chunk && chunks.push (chunk));
    stream.on ('end',   ()      => resolve (Buffer.concat (chunks)));
  });
}

function readfile (filename)
{
  return readall (fs.createReadStream (filename));
}

(async () => {
  let content = await readfile ('/etc/ssh/moduli').catch ((e) => {})
  if (content)
    console.log ("size:", content.length,
                 "head:", content.slice (0, 46).toString ());
})();
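To read the file named on the command line instead of the hard-coded path, you could pass process.argv[2] to readfile in the same way.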

