
I'm trying to make the following code run synchronously. It works correctly with console.log, which prints every item in the array with a one-second delay, but it doesn't work with the following structure:

for (let i = 0; i < array.length; i++) {
    setTimeout(function () {
        // 1. HTTP request via rp or request.get (I receive a huge data array)
        // 2. .map the results
        // 3. insert into Mongo via mongoose
    }, 1000 * i);
}

For now I have the following code inside:

request.get({url: array[i].url}, function (error, response, body) {
    body.map(element => {
        // do stuff, it works fine
    });
    collection.insertMany(body, function (err, docs) {
        // #justloggerthings
    });
});

Or I have almost the same version with rp instead of request.get. By default I have mongoose.Promise = global.Promise;

Why is this a problem? Because body is a very large dataset that eats a lot of RAM. (Now imagine 20+ arrays passed to insertMany.)

So Mongo tries to insertMany all the responses from request at once (as soon as they are ready, without the 1000 ms delay). That's actually why I chose request instead of rp (request-promise), but it seems to be async too. Should I pick another HTTP GET module from npm, switch to it, and stop worrying about this?

Or should I wrap these operations in a promise, or make an async function and call it again inside the loop (every 1000 ms, for example) once it has finished correctly? In that case, the only relevant thing I found on Stack Overflow is:

How to insert data to mongo synchronously (Nodejs, Express)

But it's a bit outdated. So, any ideas?

  • If you don't want to insert all the responses at once, then don't use insertMany. Loop over the responses and insert them one by one. Commented Oct 11, 2018 at 12:40
  • Actually, I'm not sure that .create on 50,000+ elements is better than using .insertMany(array). My response from request isn't a simple document; it's a huge array of documents. [TL;DR: I need to insert 10+ of these arrays (body) one by one instead of all at once. That's why I'm using for and setTimeout. The logic here is simple: request the dataset body (an array) via URL, then insert it into Mongo, because inserting all of them at once requires a lot of RAM, as I mentioned before.] Commented Oct 11, 2018 at 12:49
  • If you want to do it sequentially, use await in a loop, not map. Commented Oct 11, 2018 at 13:07
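The last comment is the crux: .map with async callbacks starts every callback at once, while await inside a plain loop runs one step at a time. A minimal sketch of the difference (the delay helper and the item array are illustrative stand-ins for the request/insert work):

```javascript
// Stand-in for an async operation such as an HTTP request or a Mongo insert.
const delay = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function parallel(items) {
  // .map fires every async callback immediately, so all delays overlap:
  // total time is roughly one delay, and nothing waits for the previous item.
  await Promise.all(items.map(async () => { await delay(20); }));
}

async function sequential(items) {
  // await inside a for...of loop pauses before each iteration:
  // total time is roughly items.length delays, one item at a time.
  for (const item of items) {
    await delay(20);
  }
}
```

With three items, `sequential` takes about three times as long as `parallel`, which is exactly the throttling the question is after.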

2 Answers


Well, I don't have your actual code, so I'll write pseudo code showing what you can do.

const chunkArray = (array, chunkSize) => {
    const chunks = [];
    for (let i = 0; i < array.length; i += chunkSize) {
        chunks.push(array.slice(i, i + chunkSize));
    }
    return chunks;
};

// await is only valid inside an async function
async function run() {
    for (let i = 0; i < array.length; i++) {
        // pseudo: assume a promise-returning HTTP client here
        let bigArray = await request('your data url');
        // do some mapping or whatever
        // then break your large array into smaller chunks
        let chunks = chunkArray(bigArray, 10);
        for (let j = 0; j < chunks.length; j++) {
            await collection.insertMany(chunks[j]);
        }
    }
}
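Since await is only legal inside an async function, a runnable version of the same chunk-and-insert pattern looks like this; fetchData and insertMany are stubs standing in for the real HTTP call and collection.insertMany:

```javascript
// Split an array into fixed-size chunks.
const chunkArray = (array, chunkSize) => {
  const chunks = [];
  for (let i = 0; i < array.length; i += chunkSize) {
    chunks.push(array.slice(i, i + chunkSize));
  }
  return chunks;
};

// Stubs: in real code these would be the HTTP request and the Mongo insert.
const fetchData = async () => Array.from({ length: 25 }, (_, n) => ({ id: n }));
const insertMany = async (docs) => docs.length; // pretend insert, returns count

(async () => {
  const bigArray = await fetchData();
  const chunks = chunkArray(bigArray, 10);
  let inserted = 0;
  for (const chunk of chunks) {
    inserted += await insertMany(chunk); // sequential: one batch at a time
  }
  console.log(chunks.length, inserted); // prints: 3 25
})();
```

Each insertMany batch is awaited before the next one starts, so only one chunk is in flight at any moment.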


The actual code that solved my problem is:

async function test (name, url, lastModified) {
    try {
        const response = await rp({uri: url, json: true});
        // the callbacks are synchronous, so a plain forEach is enough here
        response.forEach(element => {
            if (element.buyout > 0) {
                element.price = (element.buyout / element.quantity);
            }
            element.lastModified = lastModified;
        });
        return collection.insertMany(response);
    } catch (err) {
        console.error(err);
    }
}

async function addAsync() {
    const z = await test(name, url, lastModified);
    console.log(z);
}

addAsync();
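To process several URLs one after another rather than all at once, the same pattern extends to a driver loop that awaits each call in turn. This is a hypothetical sketch: sources is an assumed array of {name, url, lastModified} entries, and test is stubbed so the sketch runs on its own (in real code it would be the async function above):

```javascript
// Stub for the real test() so this sketch is self-contained.
const test = async (name, url, lastModified) => ({ insertedCount: 3 });

async function addAllAsync(sources) {
  const counts = [];
  for (const s of sources) {
    // awaiting here keeps only one large response body in memory at a time
    const res = await test(s.name, s.url, s.lastModified);
    counts.push(res.insertedCount);
  }
  return counts;
}

addAllAsync([{ name: 'a' }, { name: 'b' }]).then((c) => console.log(c));
// prints: [ 3, 3 ]
```

Because the await sits inside the loop body, the second request doesn't start until the first insert has resolved, which is the RAM-friendly sequencing the question was after.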
