
I have a Python C extension that reads data from a socket and processes it; during the processing it releases the GIL. Currently I use two Python threads that interleave nicely thanks to the GIL being released, resulting in 90% load on two CPU cores.

How would I achieve the same in Python 3 asyncio? I cannot find the right Python C-API call to tell the event loop that it can go and do something else in the meantime.

Or, put another way: if my extension releases the GIL, does that automatically mean it will not block execution of the next item available in the event loop? E.g. will the next socket be read while my C extension is processing the data from the first socket? I cannot find this documented anywhere. From what I understand, this way I could read data from a number of sockets and put even more CPU cores to work.

  • As written, this isn't a very good question. You need to sketch out how you'd like to see your extension interact with the event loop. As you have described it, it doesn't sound like your extension should interact with the event loop at all. Commented Dec 3, 2017 at 15:56
  • If you can give pseudocode describing the interactions you would like between your extension and the event loop, then it's probably possible to describe how to do that. Commented Dec 3, 2017 at 15:57
  • @SamHartman That might also be true. My question mostly is: if my extension releases the GIL, does that automatically mean it will not block execution of the next item available in the event loop? E.g. will the next socket be read while my C extension is processing the data of the first socket? I cannot find this anywhere. Commented Dec 3, 2017 at 16:00

1 Answer


If your extension is called from inside the event loop, releasing the GIL will not cause the event loop to continue.

That is, say I do something like:

async def process():
    call_your_extension()

asyncio.get_event_loop().create_task(process())

then releasing the GIL in your extension will not cause the event loop to continue. Your routine is synchronous.
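To see why, here is a runnable sketch in which `time.sleep` stands in for the extension call: like the extension, `time.sleep` releases the GIL, yet the other scheduled task still cannot run until the synchronous call returns (all function names here are illustrative):

```python
import asyncio
import time

async def call_extension():
    # Stand-in for the synchronous extension call: time.sleep also
    # releases the GIL, yet the event loop stays blocked in here.
    time.sleep(0.1)

async def other_work():
    return time.monotonic()

async def main():
    start = time.monotonic()
    # other_work is scheduled concurrently, but it can only start
    # once call_extension has returned control to the loop.
    _, when_other_ran = await asyncio.gather(call_extension(), other_work())
    return when_other_ran - start

loop = asyncio.new_event_loop()
delay = loop.run_until_complete(main())
loop.close()
# delay is roughly 0.1 s: the "concurrent" task had to wait out the
# entire blocking call, GIL release notwithstanding.
```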

Instead, you could return or create an awaitable of some kind and then do

async def process():
    await call_your_extension()

In this model, your extension would probably be a class with an __await__ method. Eventually, an asyncio.Future would need to be generated. The event loop would attach a done callback to that future, and when that callback fired, the coroutine could resume. That's probably not a good fit for something that wants to use multiple cores: the event loop only runs in one thread.
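A minimal sketch of that shape, with hypothetical names (`ExtensionOp`, `process`): a real extension would resolve the future from its worker thread via `loop.call_soon_threadsafe`; here completion is simulated with `loop.call_later`.

```python
import asyncio

class ExtensionOp:
    """Hypothetical awaitable standing in for the C-extension call."""

    def __init__(self, loop):
        self.future = loop.create_future()
        # A real extension would resolve this future from its worker
        # thread via loop.call_soon_threadsafe(); here we simulate
        # completion after 10 ms.
        loop.call_later(0.01, self.future.set_result, "payload")

    def __await__(self):
        # Delegating to the future suspends the coroutine until the
        # future is done; the loop's done callback then resumes it.
        return self.future.__await__()

async def process(loop):
    return await ExtensionOp(loop)

loop = asyncio.new_event_loop()
result = loop.run_until_complete(process(loop))  # "payload"
loop.close()
```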

You may be able to get some advantage by using concurrent.futures and generating a future that runs across threads; asyncio.wrap_future will convert such a future into an asyncio one that you can await. However, all of this assumes that the event loop somehow benefits your extension. That is only likely to be the case if you can move the socket reading out of your extension and into event-loop code. Otherwise, your current design is probably ideal. If you need to interoperate with an event-loop model, take a look at asyncio.get_event_loop().run_in_executor.
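A sketch of the executor route, using `hashlib.sha256` as a stand-in for the CPU-bound processing step (the function and variable names are illustrative): `run_in_executor` hands the work to a thread pool and gives back an awaitable, so the loop stays free to service other sockets.

```python
import asyncio
import concurrent.futures
import hashlib

def crunch(data):
    # Stand-in for the GIL-releasing, CPU-bound processing step.
    return hashlib.sha256(data).hexdigest()

async def process(pool, data):
    loop = asyncio.get_running_loop()
    # run_in_executor returns an awaitable asyncio future, so other
    # tasks (e.g. reads on further sockets) keep running while a
    # worker thread crunches this packet.
    return await loop.run_in_executor(pool, crunch, data)

async def main(pool):
    # Two packets are processed concurrently on two worker threads.
    return await asyncio.gather(
        process(pool, b"packet-1"),
        process(pool, b"packet-2"),
    )

loop = asyncio.new_event_loop()
with concurrent.futures.ThreadPoolExecutor(max_workers=2) as pool:
    results = loop.run_until_complete(main(pool))
loop.close()
```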
