
What are the best practices when using node.js for a queue processing application?

2 Answers


My main concern here would be that a Node process can handle thousands of items at once, but a rogue unhandled error in any one of them could bring down the whole process.
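
As a rough illustration of containing per-item failures, something like the following keeps one bad job from taking the rest of the worker down with it. The `safeHandle` wrapper is a hypothetical helper, not from any particular library:

```js
// Sketch only: isolate per-item failures so one bad job doesn't kill the worker.
// Note that try/catch only covers synchronous throws; errors raised later inside
// async callbacks must be passed to the callback rather than thrown.
function safeHandle(item, handler, done) {
  try {
    handler(item, done);
  } catch (err) {
    done(err); // report the failure for this item only
  }
}

// Last-resort logging: after an uncaught exception the process state is suspect,
// so log it and exit rather than carrying on blindly.
process.on('uncaughtException', function (err) {
  console.error('uncaught exception, shutting down:', err);
  process.exit(1);
});
```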

I'd be looking for a queue/driver combination that allows a two-phase commit (wrong terminology, I think), i.e.:

  1. Get the next appropriate item from the queue (which then blocks that item from being consumed elsewhere)
  2. Once each item is handed over to the downstream service/database/filesystem, you can then tell the queue that the item has been processed (see the sketch after this list)
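
A minimal sketch of that reserve/acknowledge loop, assuming a hypothetical driver exposing `reserve()`, `ack()` and `release()` (the module and method names are illustrative, not from any particular library):

```js
// Sketch only: `queueClient` is a hypothetical driver, not a real module.
var queueClient = require('./my-queue-client');

function processNext() {
  // Phase 1: reserve the next item so no other consumer picks it up.
  queueClient.reserve(function (err, item) {
    if (err || !item) {
      return setTimeout(processNext, 1000); // nothing to do, poll again shortly
    }

    handleItem(item, function (handleErr) {
      if (handleErr) {
        // Hand the item back (or let its reservation expire) so it can be retried.
        return queueClient.release(item.id, function () { processNext(); });
      }
      // Phase 2: only acknowledge once the downstream work is durably done.
      queueClient.ack(item.id, function () { processNext(); });
    });
  });
}

function handleItem(item, done) {
  // ... write to the downstream service/database/filesystem, then call done(err)
  done(null);
}

processNext();
```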

I'd also want repeatably unique identifiers so that you can reliably detect if an item comes down the pipe twice. In a theoretical system it might not happen, but in a practical environment the capability to deal with it will make your life easier.
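
As a rough sketch, de-duplication by item id can be as simple as checking a store of already-seen identifiers before running the handler. The in-memory object here is purely for illustration; a real worker would persist the seen ids (e.g. in Redis or the database) so the check survives restarts:

```js
// Sketch only: in-memory de-duplication keyed on the item's stable id.
var seen = {};

function handleOnce(item, handler, done) {
  if (seen[item.id]) {
    // Already processed once: acknowledge without re-running side effects.
    return done(null, 'duplicate');
  }
  seen[item.id] = true;
  handler(item, done);
}
```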


1 Comment

Agreed. In my architecture, all the jobs are managed by Gearman (gearman.org). Some of the jobs are executed by C/C++ modules, and I want to handle some jobs with Node.js, which I think is better suited for them. Thanks.

Check out http://learnboost.github.com/kue/. I have used it for a couple of pet projects and it works quite well; you can look at their source and see what practices they take care of.
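
For reference, the basic Kue pattern (roughly as shown in its README) looks like this. `sendEmail` is just a placeholder for your own work function, and Kue keeps its job state in Redis:

```js
var kue = require('kue')
  , jobs = kue.createQueue(); // connects to a local Redis by default

// Producer: enqueue a job of a given type with some data.
jobs.create('email', { to: 'user@example.com', template: 'welcome' }).save();

// Consumer: process jobs of that type, calling done() only when the work is
// finished (or done(err) to mark the job as failed).
jobs.process('email', function (job, done) {
  sendEmail(job.data.to, job.data.template, done); // placeholder function
});
```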

