2

I'm attempting to parse a fairly large JSON file (~500Mb) in NodeJS. My implementation is based on the Async approach given in this answer:

var fileStream = require('fs');
var jsonObj;

fileStream.readFile('./data/exporttest2.json', fileCallback);

function fileCallback (err, data) {
    if (err) {
        console.log(err);
        return false;
    }
    jsonObj = JSON.parse(data); // data is a Buffer; JSON.parse implicitly calls toString() on it
    // Process JSON data here
}

That's all well and good, but I'm getting hit with the following error message:

buffer.js:495
    throw new Error('"toString()" failed');
    ^

Error: "toString()" failed
    at Buffer.toString (buffer.js:495:11)
    at Object.parse (native)
    at fileCallback (C:\Users\1700675\Research\Experiments\NodeJS\rf_EU.js:49:18)
    at FSReqWrap.readFileAfterClose [as oncomplete] (fs.js:445:3)

I understand from this answer that this is caused by the maximum buffer length in the V8 engine set at 256Mb.

My question then is this, is there a way I can asynchronously read my JSON file in chunks that do not exceed the buffer length of 256Mb, without manually disseminating my JSON data into several files?

1
  • can you share how you used the JSONstream module? Commented Apr 19, 2017 at 16:02

2 Answers

3

is there a way I can asynchronously read my JSON file in chunks that do not exceed the buffer length of 256Mb, without manually disseminating my JSON data into several files?

This is a common problem and there are several modules that can help you with it:

Example with JSONStream:

const JSONStream = require('JSONStream');
const fs = require('fs');

fs.createReadStream('./data/exporttest2.json')
  .pipe(JSONStream.parse('...'))...

See the docs for details of all of the arguments.


2 Comments

Literally just stumbled across a couple of these libraries. I'll use one until the NodeJS community addresses the issue. Thanks for your response nonetheless.
Do you know why the buffer limit is not exceeded with JSONStream? Does it try to parse the data as it comes in?
-1

Try using streams:

let fs = require("fs");

let s = fs.createReadStream('./a.json');
let data = [];
s.on('data', function (chunk) {
    data.push(chunk); // collect each chunk (a Buffer)
}).on('end', function () {
    // join all chunks and parse once the whole file has been read
    let json = Buffer.concat(data).toString();
    console.log(JSON.parse(json));
});

1 Comment

This still causes the same issue: when Buffer.concat(data).toString() executes, there is still >256Mb in the buffer.
