
I have a function that sends 2 different requests. I need to call this function with different parameters 20 times.

I would like to run these calls concurrently (with different arguments) to save the time otherwise spent waiting between request and response.

This is a very simplified function:

import asyncio
import requests

async def get_data(url):
    return requests.get(url)

And this is how I call it:

loop = asyncio.get_event_loop()
tasks = [asyncio.ensure_future(get_data(url)) for url in websites.split('\n')]
group = asyncio.gather(*tasks)
results = loop.run_until_complete(group)
print(results)
loop.close()

The problem is that it runs sequentially instead of concurrently.

It's obvious that I'm missing something. Do you know what to do?

2 Comments

  • Your code is going to run sequentially because requests isn't asyncio-aware. You'll want to use a library like aiohttp so that additional requests can be made while you're awaiting others.
  • Answered here: stackoverflow.com/a/63881674/13782669

1 Answer


Don't wrap the coroutines in asyncio.ensure_future/asyncio.create_task: build a list of bare coroutines, use * to unpack them when passing them to asyncio.gather, and call loop.run_until_complete at the end:

loop = asyncio.get_event_loop()
# pass the bare coroutines; asyncio.gather wraps them in tasks itself
tasks = [get_data(url) for url in websites.split('\n')]
group = asyncio.gather(*tasks)
results = loop.run_until_complete(group)
print(results)
loop.close()

Plus, it still won't be concurrent, because requests isn't asynchronous: it blocks the thread while waiting for a response. You need an async HTTP client such as aiohttp instead.
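For example, a rough sketch of what that could look like with aiohttp (this assumes aiohttp is installed, reuses the same newline-separated websites string from the question, and the main coroutine name is just illustrative):

import asyncio
import aiohttp

async def get_data(session, url):
    # awaiting the response yields control to the event loop,
    # so the other requests can proceed while this one is waiting
    async with session.get(url) as response:
        return await response.text()

async def main(websites):
    async with aiohttp.ClientSession() as session:
        tasks = [get_data(session, url) for url in websites.split('\n')]
        # gather runs the coroutines concurrently and returns results in input order
        return await asyncio.gather(*tasks)

results = asyncio.run(main(websites))
print(results)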


4 Comments

  • Thanks, this works, but the problem is that it still runs sequentially instead of concurrently.
  • What makes you think so?
  • I've added print(url) inside the function and it prints one URL after another with 1-3 second pauses.
  • Updated with a note.
