I'm trying to make the following code run synchronously (one iteration at a time). It works correctly with console.log, which prints every item in the array with a 1-second delay, but it doesn't work with the following structure:
for (let i = 0; i < array.length; i++) {
  setTimeout(function () {
    // 1. HTTP request via rp or request.get (I receive a huge data array)
    // 2. .map the results
    // 3. insert into Mongo via mongoose
  }, 1000 * i); // staggered by 1 second per iteration
}
As of now, I have the following code inside:
request.get({ url: array[i].url }, function (error, response, body) {
  body.map(element => {
    // do stuff, it works fine
  });
  collection.insertMany(body, function (err, docs) {
    // #justloggerthings
  });
});
Or I have an almost identical version with rp instead of request.get.
By default I have mongoose.Promise = global.Promise;
Why does this cause a problem? Because body is a very large dataset that eats a lot of RAM. (Now imagine 20+ such arrays going through insertMany.)
So Mongo tries to insertMany all the responses from request at once (as soon as they are ready, without the 1000 ms delay). That's actually why I chose request instead of rp (request-promise), but it seems to be async too. So should I pick another HTTP GET module from npm, switch to it, and not worry about it?
Or should I wrap these operations in a promise / make an async function and call it again inside the loop (every 1000 ms, for example) once the previous call has finished correctly (a rough sketch of what I mean is below)? In that case, the only relevant thing I found on StackOverflow is:
How to insert data to mongo synchronously (Nodejs, Express)
But it's a bit outdated. So, any ideas?
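To make it concrete, this is roughly what I have in mind. It's only a sketch and not tested: I'm assuming rp here is request-promise-native called with json: true (so body arrives as an already-parsed array), collection is the same mongoose model as above, and processOne / processAll are just placeholder names.

const rp = require('request-promise-native');

// Wrap one "request -> map -> insertMany" cycle in an async function.
async function processOne(url) {
  // Request one huge array of documents; json: true parses the body for us.
  const body = await rp({ uri: url, json: true });

  const docs = body.map(element => {
    // do stuff, same as in the callback version
    return element;
  });

  // Without a callback, insertMany returns a promise,
  // so we can wait until this batch is actually stored.
  await collection.insertMany(docs);
}

async function processAll(array) {
  // A plain for loop with await processes the URLs strictly one after another,
  // so only one body is held in memory at a time (no setTimeout needed).
  for (let i = 0; i < array.length; i++) {
    await processOne(array[i].url);
  }
}

processAll(array).catch(console.error);

The point compared to the setTimeout version would be that the next request only starts after the previous insertMany has resolved, so the "delay" adapts to however long each insert actually takes instead of being a fixed 1000 ms.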
Is .create for 50,000+ elements better than using .insertMany(array)? My responses from request aren't simple documents; each one is a huge array of documents. [TL;DR: I need to insert 10+ of these arrays (body) one by one instead of inserting them all at once; that's why I'm using for and setTimeout. The logic here is simple: request the dataset body (an array) from a URL, then insert it into Mongo, because inserting them all at once requires a lot of RAM, as I mentioned before.] So it seems what I really want is await in a loop, not map.
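And in case a single body is still too big for one insertMany call, I was also wondering about splitting it into chunks first. Again only a sketch: the chunkSize of 1000 is an arbitrary number I picked, insertInChunks is a placeholder name, and collection is the same mongoose model as above.

// Insert one huge array in fixed-size batches instead of a single insertMany call,
// waiting for each batch to finish before sending the next one.
async function insertInChunks(docs, chunkSize = 1000) {
  for (let i = 0; i < docs.length; i += chunkSize) {
    await collection.insertMany(docs.slice(i, i + chunkSize));
  }
}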