
I have a pet project with the following logic:

import asyncio, multiprocessing

async def sub_main():
    print('Hello from subprocess')

def sub_loop():
    asyncio.get_event_loop().run_until_complete(sub_main())

def start():
    multiprocessing.Process(target=sub_loop).start()

start()

If you run it, you'll see:

Hello from subprocess

That is good. But what I need is to make start() a coroutine instead:

async def start():
    multiprocessing.Process(target=sub_loop).start()

To run it, I have to do something like this:

asyncio.get_event_loop().run_until_complete(start())

Here is the issue: when the subprocess is created, it inherits a copy of the whole Python environment (the process is forked), so the parent's event loop state comes along and asyncio considers the loop to be already running there:

Process Process-1:
Traceback (most recent call last):
  File "/usr/lib/python3.5/multiprocessing/process.py", line 249, in _bootstrap
    self.run()
  File "/usr/lib/python3.5/multiprocessing/process.py", line 93, in run
    self._target(*self._args, **self._kwargs)
  File "test.py", line 7, in sub_loop
    asyncio.get_event_loop().run_until_complete(sub_main())
  File "/usr/lib/python3.5/asyncio/base_events.py", line 361, in run_until_complete
    self.run_forever()
  File "/usr/lib/python3.5/asyncio/base_events.py", line 326, in run_forever
    raise RuntimeError('Event loop is running.')
RuntimeError: Event loop is running.

I tried to destroy the loop on the subprocess side with no luck, but I think the correct way is to prevent it from being shared with the subprocess in the first place. Is that possible somehow?
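One way to prevent the sharing outright, sketched below under the assumption of Python 3.4+ (for multiprocessing.get_context) and 3.7+ (for asyncio.run), is the 'spawn' start method: the child is launched from a fresh interpreter, so nothing of the parent's running loop is inherited. This is a sketch, not the only fix:

```python
import asyncio
import multiprocessing

async def sub_main():
    print('Hello from subprocess')

def sub_loop():
    # A spawned child starts from a clean interpreter, so there is no
    # inherited loop state; create a fresh loop explicitly.
    loop = asyncio.new_event_loop()
    asyncio.set_event_loop(loop)
    loop.run_until_complete(sub_main())

async def start():
    # 'spawn' launches the child from a new Python process instead of
    # forking, so the parent's running event loop is never cloned.
    ctx = multiprocessing.get_context('spawn')
    process = ctx.Process(target=sub_loop)
    process.start()
    return process

if __name__ == '__main__':
    proc = asyncio.run(start())  # asyncio.run requires Python 3.7+
    proc.join()
```

Note that 'spawn' requires the target function to be importable from the main module, and it is slower than fork since the child starts from scratch.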

UPDATE: Here is the full failing code:

import asyncio, multiprocessing

import asyncio.unix_events

async def sub_main():
    print('Hello from subprocess')

def sub_loop():
    asyncio.get_event_loop().run_until_complete(sub_main())


async def start():
    multiprocessing.Process(target=sub_loop).start()

asyncio.get_event_loop().run_until_complete(start())
  • I don't have time for a full answer, but you may want to consider a design where (a) your multiprocessing stuff is done by a script that can be invoked using e.g. subprocess.Popen([sys.executable, "the_script.py"], ...) (b) this script communicates with its parent on e.g. stdout using a designed protocol (it could be totally simple, e.g. single byte control characters to the script and status updates back) and (c) using the asyncio subprocess API. Commented Jul 5, 2016 at 0:25
  • (I don't mean you should use subprocess.Popen and asyncio's subprocess API at the same time, just that you should write your script so that it could be controlled as any language-agnostic subprocess.) Commented Jul 5, 2016 at 0:27
  • @detly Thanks for the suggestion, but there is plenty of data that should be inherited by the subprocess. If there is a simple solution to the mentioned problem, I'd prefer it rather than rewriting all the multiprocessing stuff by hand. Commented Jul 5, 2016 at 0:28
  • That's fair enough, it's not a trivial undertaking. Commented Jul 5, 2016 at 0:31
  • I think this is possible, since I've found a hack which seems to work, but only on Unix. sub_loop can start with asyncio.set_event_loop(asyncio.unix_events._UnixSelectorEventLoop()), which creates a new loop for the subprocess, while the parent's one would (hopefully) be garbage collected. Commented Jul 5, 2016 at 0:35

2 Answers


First, you should consider using loop.run_in_executor with a ProcessPoolExecutor if you plan to run Python subprocesses from within the loop. As for your problem, you can use asyncio.new_event_loop and asyncio.set_event_loop to give the subprocess its own loop:

import asyncio
from concurrent.futures import ProcessPoolExecutor

async def sub_main():
    print('Hello from subprocess')

def sub_loop():
    loop = asyncio.new_event_loop()
    asyncio.set_event_loop(loop)
    loop.run_until_complete(sub_main())

async def start(executor):
    await asyncio.get_event_loop().run_in_executor(executor, sub_loop)

if __name__ == '__main__':
    executor = ProcessPoolExecutor()
    asyncio.get_event_loop().run_until_complete(start(executor))

3 Comments

Right... I seem to have been blind, since I missed this obvious function asyncio.new_event_loop(). Thanks! Could you please explain what the benefit of ProcessPoolExecutor is in this case?
@Grief run_in_executor returns an awaitable, so you can easily join your subprocess using await or asyncio.wait_for, for instance. ProcessPoolExecutor also lets you specify the number of workers.
Thanks for pointing out the need to create a new event loop in the created subprocess. This was the key bit I was missing; otherwise I was getting a very cryptic Bad file descriptor error.
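To illustrate the joining mentioned in the comments, here is a minimal sketch (sub_task, its argument, and the timeout value are made up for the example) of awaiting the executor result directly and bounding it with asyncio.wait_for:

```python
import asyncio
from concurrent.futures import ProcessPoolExecutor

def sub_task(n):
    # Plain function executed in one of the pool's worker processes.
    return n * n

async def main():
    loop = asyncio.get_running_loop()
    with ProcessPoolExecutor(max_workers=2) as executor:
        # run_in_executor returns an awaitable, so the subprocess can be
        # joined with await, or given a deadline with asyncio.wait_for.
        result = await asyncio.wait_for(
            loop.run_in_executor(executor, sub_task, 7),
            timeout=30)
    return result

if __name__ == '__main__':
    print(asyncio.run(main()))  # asyncio.run requires Python 3.7+
```

If the timeout expires, asyncio.wait_for raises TimeoutError instead of blocking forever on the worker.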

You should always add a check for how the code is being run (the if __name__ == '__main__': part). Your subprocess is running everything in the module a second time, giving you grief (couldn't resist).

import asyncio, multiprocessing

async def sub_main():
    print('Hello from subprocess')

def sub_loop():
    asyncio.get_event_loop().run_until_complete(sub_main())


async def start():
    multiprocessing.Process(target=sub_loop).start()

if __name__ == '__main__':
    asyncio.get_event_loop().run_until_complete(start())
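The re-import behaviour behind this advice is easy to see in a small sketch (a hypothetical standalone module, forcing the 'spawn' start method that Windows uses by default): the module-level print fires once in the parent and again when the child re-imports the module, which is exactly what would bite an unguarded run_until_complete call:

```python
import multiprocessing

# Top-level code: runs in the parent, and runs AGAIN when a 'spawn'
# child re-imports this module to locate its target function.
print('module imported in', multiprocessing.current_process().name)

def work():
    print('work() running in', multiprocessing.current_process().name)

if __name__ == '__main__':
    # Without this guard, the Process(...) lines below would also
    # execute during the child's re-import; multiprocessing detects
    # this and raises a RuntimeError instead of forking endlessly.
    ctx = multiprocessing.get_context('spawn')
    p = ctx.Process(target=work)
    p.start()
    p.join()
```

Under the default fork start method on Unix the module is not re-imported, which is why, as the comments below note, the guard alone does not fix the original exception there.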

5 Comments

I guess this guard only matters on Windows, since on Unix this ends up with the same result, I mean exactly the same exception.
@Grief: I'll see if I can replicate your issue in a linux env.
By the way, they changed that part in the Python 3 documentation. It was For an explanation of why (on Windows) the if __name__ == '__main__' part is necessary, see Programming guidelines. in docs.python.org/2/library/… and now it is just For an explanation of why the if __name__ == '__main__' part is necessary, see Programming guidelines. I believe that nothing changed under the hood here, and the removal of the Windows mention is just a push toward cross-platform coding.
@Grief: ...well, now your question is interesting (to me). Different results on windows and linux. I get your exception running on linux.
I guess I can catch the exception and do asyncio.set_event_loop(asyncio.unix_events._UnixSelectorEventLoop()) if the error occurs. At least as a workaround for now. However, I'd prefer something more reliable.
