I have a backend application currently written with asyncio: FastAPI for the web server, and SQLAlchemy 1.4 with asyncpg as the async database driver. I need to dispatch tasks to workers that run and then update the host application. Currently I am using aio_pika, but I would like something more robust, such as Celery with Flower.
I understand that Celery is not integrated with asyncio. I have also read through answers like this one, and my concern is not making the tasks themselves async; that part is trivial. I am concerned about launching tasks from within the main event loop.
My primary question: does my_task.delay() / my_task.apply_async() block the calling thread at all? If so, would a better approach be to use multiprocessing workers that pull items from a central mp.Queue (or a ProcessPoolExecutor) and dispatch Celery tasks only from that worker process?
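To illustrate the pattern I am considering: since delay() ultimately does blocking network I/O to the broker, I could push that call onto a worker thread so the event loop stays responsive. This is only a sketch, using a stand-in function instead of a real Celery task (blocking_publish and the returned task id are hypothetical placeholders):

```python
import asyncio
import time


def blocking_publish(payload):
    # Stand-in for my_task.delay(): a call that performs blocking
    # network I/O (e.g. publishing to the broker).
    time.sleep(0.1)
    return f"task-id-for-{payload}"


async def dispatch(payload):
    # Run the blocking publish in a worker thread so the event loop
    # is never blocked while waiting on the broker (Python 3.9+).
    return await asyncio.to_thread(blocking_publish, payload)


if __name__ == "__main__":
    task_id = asyncio.run(dispatch(42))
    print(task_id)  # task-id-for-42
```

With a real Celery task, the same shape would be `await asyncio.to_thread(my_task.delay, payload)`. Whether this indirection is actually necessary is exactly what I am asking.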
I want to dispatch tasks and, ideally, be notified when they complete. The notification can be done from within the task itself via the FastAPI interface, though. I just want to ensure that dispatching tasks does not block the async event loop.
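Concretely, the completion callback I have in mind looks something like this: the worker task POSTs its result back to the host application over HTTP. The /tasks/done endpoint and the payload shape are hypothetical, just to show the idea:

```python
import json
import urllib.request


def build_notification(task_id: str, status: str) -> urllib.request.Request:
    # Build a POST request to a (hypothetical) FastAPI endpoint on the
    # host application; the worker would send this when a task finishes.
    body = json.dumps({"task_id": task_id, "status": status}).encode()
    return urllib.request.Request(
        "http://localhost:8000/tasks/done",
        data=body,
        headers={"Content-Type": "application/json"},
    )


# Inside the Celery task, the worker would then do something like:
#   urllib.request.urlopen(build_notification(task_id, "done"))
```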