
I have a JSON file about 300MB in size and I'm trying to read it but it's not working.

How can I read a large JSON file? If anyone can guide me with a small piece of code then it would be great.

I have already tried fs.readFile, but no luck: it works fine for smaller files but not for large ones.

Below is the code I have tried so far:

app.get('/getData', function (req, res) {
  fs.readFile('./uploads/test.json', function (err, data) {
    if (err) throw err;
    var jsonParsed = JSON.parse(data);
    res.json(jsonParsed);
  });
});
  • What "does not work"? Do you get an error? Run out of memory? Take too long? What exactly is the problem? For starters, if (err) throw err should log. And, you can put a try/catch with logging in the catch around JSON.parse(). You should at least start by logging possible errors in this code. This takes some elemental debugging on your part before any of us could help you. You should do that before you come here and then tell us what you found. Commented Jun 22, 2019 at 0:43
  • It's possible that 300MB of JSON is too large for your environment. There are stream parsers for JSON that might be more memory efficient. At 300MB, I'd personally be looking at a database, not a JSON file for storage. Commented Jun 22, 2019 at 0:45
  • Have you looked at something like this? itnext.io/… Commented Jun 22, 2019 at 0:47
  • Possible duplicate of Upload a large file (1GB) with node and express Commented Jun 22, 2019 at 0:48
  • FYI, CSV is simpler to read serially line by line if that's appropriate for what you're storing. Commented Jun 22, 2019 at 0:52

1 Answer


Speculating from the code snippet, I believe there is no requirement to do any modification (i.e. filtering) of the JSON data prior to dispatching the response. If that is the case, parsing the JSON is unnecessary, and Node.js's built-in readable streams can be used for better efficiency.

const fs = require('fs');
const path = require('path');
const express = require('express');
const app = express();

app.get('/data', (req, res) => {
  res.setHeader('Content-Type', 'application/json');
  fs.createReadStream(path.join(__dirname, 'public', 'citylots.json')).pipe(res);
});

app.listen(process.env.PORT || 8080);

5 Comments

Is res a WritableStream? I would have just used res.sendFile(jsonFile)
@slebetman Yes, the res object is an enhanced version of Node’s http.ServerResponse class which inherits Stream.
@slebetman The sendFile method, implemented via send, utilises the same technique as above, but since the content type is known beforehand the above saves a few function calls. Regardless, sendFile is a convenient alternative too!
@AshenGunaratne With your solution I'm getting the data, but it's taking too much time to load. Is there any way to make it faster?
@rajatmeena - What are you expecting? It takes a while to stream 300MB to the client. Fix your design so you don't need to send 300MB to the client. Keep the data on the server and let the client make ajax requests to fetch what it needs. Or, send the data in small chunks and let the client request only what it needs when it needs it.
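The chunked approach from the last comment could look like the sketch below. It assumes the data is (or can be loaded once into) an in-memory array; the pageOf helper and the offset/limit query parameters are my invention, not anything from the question:

```javascript
// Return one page of a large in-memory array, clamping the inputs
// so a bad query can't produce a negative or oversized slice.
function pageOf(items, offset, limit) {
  const start = Math.max(0, Number(offset) || 0);
  const size = Math.min(1000, Math.max(1, Number(limit) || 100));
  return {
    total: items.length,
    offset: start,
    items: items.slice(start, start + size),
  };
}

// Hypothetical Express route using it:
// app.get('/data', (req, res) => {
//   res.json(pageOf(bigArray, req.query.offset, req.query.limit));
// });
```

Each response is then a few kilobytes instead of 300MB, and the client fetches further pages only as needed.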
