9

My goal was to make multiple API calls for a list of data. Let's say I had the following code:

const axios = require('axios');

const axiosRequests = [];
const strings = ['a', 'b', 'c'];
for (let str of strings) {
    axiosRequests.push(axios.get(`https://www.apiexample.com/get/?cfg=json&value=${str}`));
}

The easiest solution was to apply the following:

let responseArray;
try {
    responseArray = await Promise.all(axiosRequests);
} catch (err) {
    console.log(err);
}

responseArray.map(response => {
    // do something with the response
});

But the problem I encountered with the API was the HTTP 429 Too Many Requests response status code, which means the API restricts the number of requests allowed within a period of time.

I want to add a delay between each request.

How can I do this?

4 Comments

  • Don't use Promise.all, but await each request. Commented May 3, 2020 at 4:31
  • and how can I gather all of the responses to responses array? Commented May 3, 2020 at 4:35
  • Does this answer your question? multiple requests with for loop and timeout between each Commented May 3, 2020 at 5:52
  • @EfiGordon, you just put the result of each awaited request into an array, and return that. Commented May 3, 2020 at 5:59

5 Answers

9

You can make the calls in series. However, I recommend processing them in chunks to make it more efficient.

Using chunk, best performance:

const delay = (ms = 1000) => new Promise((r) => setTimeout(r, ms));

const getInChunk = async function (items, chunkSize) {
  const chunkResults = [];
  for (let start = 0; start < items.length; start += chunkSize) {
    // build the requests for this chunk
    const chunkPromises = items
      .slice(start, start + chunkSize)
      .map((item) =>
        axios
          .get(`https://jsonplaceholder.typicode.com/todos/${item}`)
          .then((res) => res.data)
      );
    chunkResults.push(await Promise.all(chunkPromises));
    // pause between chunks to stay under the rate limit
    await delay();
  }
  // flatten
  const results = chunkResults.flat();
  return results;
};

async function main() {
  const strings = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10];
  const results = await getInChunk(strings, 5);
  console.log(results);
}
main();
<script src="https://cdnjs.cloudflare.com/ajax/libs/axios/0.19.2/axios.min.js"></script>

Simple:

const axios = require("axios");
const delay = (ms = 1000) => new Promise((r) => setTimeout(r, ms));
// Note: the requests must be passed as functions, not as already-created
// promises; otherwise they all start immediately and the delay spaces
// nothing out.
const getInSeries = async (requests) => {
  let results = [];
  for (let request of requests) {
    await delay();
    results.push(await request());
  }
  return results;
};
const getInParallel = async (requests) => Promise.all(requests.map((r) => r()));
async function main() {
  const strings = [1, 2, 3, 4];
  const requests = strings.map((id) => () =>
    axios
      .get(`https://jsonplaceholder.typicode.com/todos/${id}`)
      .then((res) => res.data)
  );
  const results = await getInSeries(requests);
  console.log(results);
}
main();
<script src="https://cdnjs.cloudflare.com/ajax/libs/axios/0.19.2/axios.min.js"></script>

Performance-friendly series: a single O(n) loop.

const delay = (ms = 1000) => new Promise((r) => setTimeout(r, ms));
const getTodosSeries = async function (items) {
  let results = [];
  for (let index = 0; index < items.length; index++) {
    await delay();
    const res = await axios.get(
      `https://jsonplaceholder.typicode.com/todos/${items[index]}`
    );
    results.push(res.data);
  }
  return results;
};

async function main() {
  const strings = [1, 2, 3, 4];
  const results = await getTodosSeries(strings);
  console.log(results);
}
main();
<script src="https://cdnjs.cloudflare.com/ajax/libs/axios/0.19.2/axios.min.js"></script>


5 Comments

Great answer! But how can I add a delay between each request?
ADDED DELAY. Please check.
@EfiGordon You can check my chunk implement.
Thanks, I will use this since it is a really great solution. Did you mean if (index % chunkSize === 0) (in the first version, "Using chunk, best performance")?
chunk snippet: 1. "chunkSize" is not used 2. "index % chunkPromises" you divide number by an array 3. you use "await" inside a loop 4. you are not using a "delay()" function Conclusion: this code won't work as expected
4

There are a variety of strategies for dealing with too many requests and which strategy gets you the best throughput without running afoul of the target server rate limiting depends entirely upon exactly how the target server is measuring and enforcing things. Unless that is documented, you would have to just experiment. The safest (and potentially slowest) strategy is to run your requests sequentially with a delay between them and tune that delay time as appropriate.

Run Sequentially With Delay Between Each Request

You can run your requests sequentially and use await on a delay promise to separate them in time.

const axios = require('axios');

function delay(t) {
    return new Promise(resolve => setTimeout(resolve, t));
}

async function getResults() {

    const results = [];
    const strings = ['a', 'b', 'c'];
    for (let str of strings) {
        await delay(1000);
        let data = await axios.get(`https://www.apiexample.com/get/?cfg=json&value=${str}`);
        results.push(data);
    }
    return results;
}

getResults().then(results => {
    console.log(results);
}).catch(err => {
    console.log(err);
});

Run N Requests at a Time where N > 1 and N < all your Requests

If you want to run N requests at a time where N is more than 1 (often 3 or 4) but less than the total, then see mapConcurrent() in this answer. Whether running fewer concurrent requests than before is enough depends entirely upon the target server and what exactly it is measuring and enforcing.
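The linked mapConcurrent() isn't reproduced here, but the general shape of a limited-concurrency mapper can be sketched as follows (a hypothetical helper, not the exact implementation from that answer, using a stand-in task instead of a real axios call):

```javascript
// Sketch of a limited-concurrency mapper: runs at most `limit` tasks
// at once and resolves with the results in input order.
function mapConcurrent(items, limit, fn) {
  return new Promise((resolve, reject) => {
    const results = new Array(items.length);
    let nextIndex = 0;
    let inFlight = 0;
    let done = 0;
    if (items.length === 0) return resolve(results);

    function launch() {
      // fill up to `limit` slots with running tasks
      while (inFlight < limit && nextIndex < items.length) {
        const i = nextIndex++;
        inFlight++;
        fn(items[i], i).then((value) => {
          results[i] = value;
          inFlight--;
          done++;
          if (done === items.length) {
            resolve(results);
          } else {
            launch(); // a slot freed up, start the next task
          }
        }, reject);
      }
    }
    launch();
  });
}

// Stand-in async task in place of a real API request:
const fakeRequest = (n) =>
  new Promise((resolve) => setTimeout(() => resolve(n * 2), 10));

mapConcurrent([1, 2, 3, 4, 5], 2, fakeRequest).then((results) => {
  console.log(results); // [2, 4, 6, 8, 10]
});
```

The same shape works with `(str) => axios.get(...)` as the task function.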

Actual Rate Limiting where You Run N Requests per Second

For actual rate limiting where you control the requests per second directly, then see rateLimitMap() in this answer: Choose proper async method for batch processing for max requests/sec.
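As a rough sketch of that idea (a hypothetical helper, not necessarily matching the rateLimitMap() signature from the linked answer), you can space out the start of each request while still letting them run concurrently:

```javascript
// Start at most `requestsPerSecond` requests per second by spacing out
// each start; the requests themselves still overlap in flight.
async function startRateLimited(items, requestsPerSecond, fn) {
  const interval = 1000 / requestsPerSecond;
  const inFlight = [];
  for (let i = 0; i < items.length; i++) {
    if (i > 0) {
      await new Promise((r) => setTimeout(r, interval));
    }
    inFlight.push(fn(items[i])); // start the request, don't wait for it
  }
  return Promise.all(inFlight);
}

// Stand-in task in place of a real API request:
startRateLimited([1, 2, 3], 10, (n) => Promise.resolve(n * 10)).then(
  (results) => {
    console.log(results); // [10, 20, 30]
  }
);
```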

4 Comments

This seems like it should work... but I get getResults.then is not a function
@KylePennell - It's supposed to be getResults().then(...) which was a typo that I just fixed.
apiexample.com/get/?cfg=json&value=d doesn't work (maybe gone/down)...perhaps you could put in jsonplaceholder But the idea is right
@KylePennell - That url is something the OP used in their question - I'm not responsible for it still working or not. And, this answer is from 7 months ago. You need to use your own URL.
0

Go simple. Inject a delay into each request in your loop.

for (const [i, v] of ['a', 'b', 'c'].entries()) {
    setTimeout(() => {
        axiosRequests.push(axios.get(`https://www.apiexample.com/get/?cfg=json&value=${v}`));
    }, i * 100);
}
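One caveat with this pattern: the pushes happen asynchronously inside the setTimeout callbacks, so a Promise.all over axiosRequests would run before the array is filled. A variant that stays awaitable creates every promise synchronously and delays only the request inside it (a sketch, using a stand-in task instead of the real axios call):

```javascript
const delay = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// Stand-in for the real axios.get call:
const fakeGet = (value) => Promise.resolve(`response for ${value}`);

// Each promise is created immediately, but its request only starts
// after i * 100 ms, so the array is complete when Promise.all runs.
const requests = ['a', 'b', 'c'].map((value, i) =>
  delay(i * 100).then(() => fakeGet(value))
);

Promise.all(requests).then((responses) => {
  console.log(responses); // ['response for a', 'response for b', 'response for c']
});
```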

Comments

0

For anyone that prefers object oriented style, creating a reusable decorator can be more readable.

function delayAsync(delayMs = 0): any {
  return function (target: any, propertyKey: string, descriptor: PropertyDescriptor) {
    const prevMethod = descriptor.value;
    descriptor.value = async function (...args: any[]) {
      // wait before invoking the original method
      await new Promise<void>((resolve) => {
        setTimeout(resolve, delayMs);
      });
      return prevMethod.apply(this, args);
    };
    return descriptor;
  };
}

class Test {
  @delayAsync(500)
  async test() {
    console.log('test');
  }
}

const test = new Test();
(async () => {
  for (const i of Array(10).keys()) {
    await test.test();
  }
})();


Comments

0

I use the tiny-async-pool module to perform this task. In version 1.3.0 it maps into an array; from version 2.0 onwards it returns an async iterable, so you have to use for await ... of ....

const asyncPool = require('tiny-async-pool');

const delay = 1000;
const strings = ['a', 'b', 'c'];
const responseValues = await asyncPool(
  2, // parallel requests
  strings,
  async (s) => {
    const response = await axios.get(`https://www.apiexample.com/get/?cfg=json&value=${s}`);
    await new Promise((resolve) => setTimeout(resolve, delay));
    return response;
  }
);
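For version 2.0 and later the consumption pattern changes to for await ... of over the returned async iterable. A sketch of that pattern, using a stand-in async generator in place of the real module:

```javascript
// Stand-in async generator mimicking the shape of tiny-async-pool v2's
// return value (the real module yields results as requests complete):
async function* fakePool(items) {
  for (const item of items) {
    yield `response for ${item}`;
  }
}

(async () => {
  const responseValues = [];
  for await (const value of fakePool(['a', 'b', 'c'])) {
    responseValues.push(value);
  }
  console.log(responseValues);
})();
```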

Comments
