Using Python 3.6 with asyncio and aiohttp, I wrote a simple async program:
from aiohttp import ClientSession
import asyncio, ssl, time

base_url = 'https://my-base-url.com/api'

async def fetch(session, id):
    query_params = {'qp1': 'v1', 'qp2': 'v2', 'id': id}
    async with session.get(base_url, params=query_params, ssl=ssl.SSLContext()) as response:
        res_json = await response.json()
        if response.status == 200:
            time.sleep(2)
            min_rating = res_json.get('minRating')
            max_rating = res_json.get('maxRating')
            print("id = %s, min = %s, max = %s" % (id, min_rating, max_rating))

async def run(ids):
    tasks = []
    async with ClientSession() as session:
        for id in ids:
            task = asyncio.ensure_future(fetch(session, id))
            tasks.append(task)
        responses = await asyncio.gather(*tasks)
        return responses

if __name__ == '__main__':
    ids = [123, 456, 789]
    future = asyncio.ensure_future(run(ids))
    event_loop = asyncio.get_event_loop()
    event_loop.run_until_complete(future)
    print("\n\ndone")
The time.sleep(2) inside fetch(session, id) makes it seem like this program is not asynchronous: it gets one response, sleeps, gets another response, sleeps, and so on. When I remove the sleep call, it does seem to be async/concurrent because the responses come back in a random order. What is sleep doing in this case? Is it blocking all threads? Why does it appear to run sequentially instead of concurrently?
time.sleep blocks everything; you want to hand back control to the event loop until some time in the future, so use asyncio.sleep. See e.g. stackoverflow.com/a/42282058/3001761. async is cooperative multitasking: when you use time.sleep, you're not cooperating.
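In the posted program the fix is to replace time.sleep(2) with await asyncio.sleep(2) inside fetch. A minimal, self-contained sketch of the difference (no HTTP involved; the worker coroutines and 1-second delay are illustrative, not part of the original code):

import asyncio, time

async def worker(n):
    # time.sleep(1) here would block the whole event loop, so the workers
    # would finish one after another (~3 s total). awaiting asyncio.sleep
    # suspends only this coroutine and lets the loop run the others.
    await asyncio.sleep(1)
    print("worker %d done" % n)

async def main():
    await asyncio.gather(*(worker(n) for n in range(3)))

if __name__ == '__main__':
    start = time.time()
    # Python 3.6 style, matching the question (asyncio.run arrived in 3.7)
    asyncio.get_event_loop().run_until_complete(main())
    print("elapsed: %.1f s" % (time.time() - start))  # ~1 s, not ~3 s

Swap the await asyncio.sleep(1) for time.sleep(1) and the elapsed time jumps to roughly 3 seconds, which is exactly the sequential behaviour described in the question.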