
I have a huge data structure that I need to dump to file:

fs.writeFile('dump.json', JSON.stringify(bigData), (err) => { if (err) throw err; });

The resulting file is close to 100MB and it takes several seconds to generate. While JSON.stringify runs, it blocks the event loop and my server does not handle any requests.

Is there a way to somehow split the JSON.stringify call? My bigData variable is an array of objects, so I could probably write a function to serialize the elements separately and stitch the JSON together myself, so that requests can be handled in between - but are there any existing solutions (external modules are fine)?

1 Answer

Try stream-json-stringify. It should do the trick.


5 Comments

While this link may answer the question, it is better to include the essential parts of the answer here and provide the link for reference. Link-only answers can become invalid if the linked page changes. - From Review
@AhmedAshour - you cannot seriously expect the whole module source to be pasted into the answer. The link is to the module's homepage, and that is just fine.
As for the module itself, it writes some of the data but not all of it - I get only about 50MB out of the full 100MB. Quite hard to trace what's missing, though!
I'm new here. Don't know the rules yet. Just wanted to help. Anyway, if there is a bug or an issue in the package, why not send a message to the owner? (For you and for the sake of others that will use this package in the future).
Thanks for pointing me in the right direction - I found code at stackoverflow.com/a/13504078/979 which seems to work.
