
My service returns responses containing very large JSON objects (around 60 MB). After some profiling I have found that it spends almost all of its time in the JSON.stringify() call, which is used to convert the object to a string before sending it as the response. I have tried custom implementations of stringify and they are even slower.

This is quite a bottleneck for my service. I want to be able to handle as many requests per second as possible - currently 1 request takes 700ms.

My questions are:
1) Can I optimize the response-sending part? Is there a more effective way than stringifying the object and sending the response?

2) Will using the async module and performing the JSON.stringify() in a separate thread improve the overall number of requests/second (given that over 90% of the time is spent in that call)?

  • There's nothing you can do to reduce the size of those objects? What's making them so big? Commented Feb 26, 2014 at 16:05
  • Did you try using socket.io? JSON.stringify loads the entire object into memory, so I don't think you can optimize it. But 60 MB of JSON is just too huge. Commented Feb 26, 2014 at 16:06
  • Are you sending redundant data? If a subset of your data is modified, don't resend all the data, just resend the subset. That might allow you to reduce the amount of data you stringify Commented Feb 26, 2014 at 16:06
  • It would be helpful to see the code you are using to send the response so we can assess your process. Commented Feb 26, 2014 at 16:08
  • How are you consuming this service? I.e., do you have to serialize it as JSON? Maybe try serializing and sending it as BSON instead? I would imagine you'd see a performance improvement encoding/decoding it, and the output should be smaller. Commented Feb 26, 2014 at 16:31

2 Answers


You've got two options:

1) Find a JSON module that will allow you to stream the stringify operation and process it in chunks. I don't know if such a module is out there; if it's not, you'd have to build it. EDIT: Thanks to Reinard Mavronicolas for pointing out JSONStream in the comments. I've actually had it on my back burner to look for something like this, for a different use case.
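A minimal sketch of the streaming idea, assuming the payload's top level is an array (stringifyChunks is a hypothetical helper, not JSONStream's API):

```javascript
// Stream a large array as JSON in small chunks instead of one giant
// JSON.stringify() call, so no single call blocks for hundreds of ms.
function* stringifyChunks(arr) {
  yield '[';
  for (let i = 0; i < arr.length; i++) {
    if (i > 0) yield ',';
    yield JSON.stringify(arr[i]); // small, cheap stringify per element
  }
  yield ']';
}

// Usage with an HTTP response (or any Writable stream):
// for (const chunk of stringifyChunks(bigArray)) res.write(chunk);
// res.end();

const sample = [{ id: 1 }, { id: 2 }];
const out = Array.from(stringifyChunks(sample)).join('');
```

The concatenated chunks are byte-for-byte identical to JSON.stringify() of the whole array, so the client sees no difference.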

2) async does not use threads. You'd need to use cluster or some other actual threading module to drop the processing into a separate thread. The caveat here is that you're still processing a large amount of data: threads buy you concurrency, but depending on your traffic you may still hit a limit.


4 Comments

thanks! I have another idea - to gzip the content and make my service return gzipped content instead. I found this library github.com/sapienlab/jsonpack which provides a function that takes the JSON object and returns the gzipped string, and it seems to me that it doesn't internally call JSON.stringify(). What do you think?
As long as it does streamed/evented/chunked processing, that's probably fine. Dealing with that much data all in one operation without breaking it up into pieces is going to take a long time no matter what you do with it.
@Jason, you can add to your answer under point number 1 - github.com/dominictarr/JSONStream. Also, there seems to be another SO question similar to this one: stackoverflow.com/questions/13503844/…
streaming to msgpack would probably be even better.

After some years, this question has a new answer to the first part: the yieldable-json lib. As described in this talk by Gireesh Punathil (IBM India), this lib can process a 60MB JSON object without blocking the Node.js event loop, letting you accept new requests and thereby improve your throughput.
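The core idea behind yieldable-json can be sketched in plain Node: do a slice of the serialization work, then yield back to the event loop with setImmediate so other requests can be handled in between. stringifyYielding is a hypothetical helper for illustration (the library's own API differs), and it assumes an array payload:

```javascript
// Serialize a big array cooperatively: a batch of items per event-loop
// turn, so pending I/O callbacks run between batches.
function stringifyYielding(arr, done) {
  const parts = ['['];
  let i = 0;
  (function step() {
    const end = Math.min(i + 1000, arr.length); // batch size is tunable
    for (; i < end; i++) {
      if (i > 0) parts.push(',');
      parts.push(JSON.stringify(arr[i]));
    }
    if (i < arr.length) {
      setImmediate(step); // yield: let the event loop serve other requests
    } else {
      parts.push(']');
      done(parts.join(''));
    }
  })();
}

// Usage: stringifyYielding(bigArray, json => res.end(json));
```

Total CPU time is the same or slightly worse than one big JSON.stringify() call; the win is latency for *other* requests, which no longer stall behind a 700ms serialization.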

For the second question, worker threads (experimental as of Node.js 11) let you move the serialization off the main thread and increase your web server's throughput.

2 Comments

Did you implement either of those? I am experimenting with yieldable-json, worker threads, and even a native addon like github.com/lemire/simdjson
Yes, I used yieldable-json, but I didn't try the other approaches.
