
I am running the following code in IPython:

import multiprocessing

def my_function(x):
    """The function you want to compute in parallel."""
    x += 1
    return x


if __name__ == '__main__':
    pool = multiprocessing.Pool()
    results = pool.map(my_function, [1,2,3,4,5,6])
    print(results)

in the IPython Qt console on Windows. However, the code does not work: the Qt console just freezes. The issue seems specific to IPython (the code above works with the regular Python 2.7 interpreter).

Any solution to this?

  • Works fine on Ubuntu. Must be an issue with Windows. Do you paste the code into IPython? Commented May 13, 2014 at 21:09
  • Yes, I do paste the code. Commented May 14, 2014 at 19:50
  • Does it work when you run it from another shell or IDE? Commented May 14, 2014 at 20:00
  • Yes, I just tried it in the Spyder IDE. It works there. Any idea how to make it work in the actual Qt console? Commented May 15, 2014 at 12:41
  • I second @PadraicCunningham's comment. Works fine on Linux distributions (I'm using Debian). Commented Sep 1, 2016 at 21:31

2 Answers


From the documentation:

Note

Functionality within this package requires that the __main__ module be importable by the children. This is covered in Programming guidelines; however, it is worth pointing out here. This means that some examples, such as the multiprocessing.Pool examples, will not work in the interactive interpreter.
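In practice, the note means the original code works as soon as it lives in a real file. A minimal sketch (run_pool.py is a hypothetical filename): save the exact code from the question to a script and run it with `python run_pool.py` from a normal shell, so the spawned children have a `__main__` module they can re-import.

```python
# run_pool.py (hypothetical filename): the question's code, saved as a
# script so that child processes can re-import the __main__ module.
import multiprocessing


def my_function(x):
    """The function you want to compute in parallel."""
    x += 1
    return x


if __name__ == '__main__':
    # On Windows, Pool starts fresh interpreters that import this file;
    # in the Qt console there is no file to import, which is why it hangs.
    pool = multiprocessing.Pool()
    results = pool.map(my_function, [1, 2, 3, 4, 5, 6])
    pool.close()
    pool.join()
    print(results)  # [2, 3, 4, 5, 6, 7]
```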


3 Comments

Did you copy/paste the code into the Qt console, or did you do something else?
The FINAL answer. No more struggling to get multiprocessing.Pool working in a Jupyter notebook.
The answer by @Maverick does work in both Jupyter Notebook and JupyterLab, for both the 'fork' and 'spawn' start methods.

At least in the current version of Jupyter Notebook (the successor of the IPython Notebook), you can solve this by moving the target function into a separate module and importing it.

I have no idea why this works; it's rather odd, but it does.

That is, in a workers.py file put:

def my_function(x):
    """The function you want to compute in parallel."""
    x += 1
    return x

Then in IPython/Jupyter notebook put:

import multiprocessing
import workers

pool = multiprocessing.Pool()
results = pool.map(workers.my_function, [1,2,3,4,5,6])
print(results)

Also, the if __name__ == '__main__' guard doesn't seem to be needed here.

Credit: Gaurav Singhal
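A plausible explanation for why the module trick works (my assumption, not part of the answer): multiprocessing sends the target function to its workers via pickle, and pickle serializes a function as a module-qualified name rather than as code, so the child process must be able to import the module that defines it. workers.my_function is importable; a function defined interactively lives in __main__, which a spawned child on Windows cannot re-import. The name-based serialization can be seen directly:

```python
import pickle


def add_one(x):
    """Stand-in for workers.my_function (hypothetical name)."""
    return x + 1


# A top-level function pickles as a reference (module name plus function
# name), not as bytecode; unpickling re-imports that module to find it.
payload = pickle.dumps(add_one)
print(b'add_one' in payload)             # True: only the name is stored
print(pickle.loads(payload) is add_one)  # True: resolved by lookup
```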

4 Comments

This works, but performance doesn't seem to improve; it looks like everything still runs in a single process/thread.
@JohnZhang I get different process IDs when I run this (although sometimes I do get the same process). You can add os.getpid() to workers.py to check (make sure to import os).
In my test, I do get multiple processes when I run it in the Python console. However, if I run it in Jupyter, it is always one process. I think that makes sense, since one Jupyter notebook is one process in the Jupyter architecture (at least on Windows).
@JohnZhang Have you used a Pool? I'm also on Windows and I do get multiple processes. It also doesn't make sense otherwise: every IDE or console you use is one process, and they all spawn separate worker processes.
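The disagreement above is easy to check directly. A self-contained sketch (my construction, not from the thread): write the helper module to disk first so the whole thing runs in one cell; pid_workers.py is a hypothetical filename, and the worker returns os.getpid() so you can see whether the work left the parent process.

```python
import multiprocessing
import os
import pathlib
import sys

# Create the helper module (the answer's workers.py pattern, plus a PID
# so each result records which process computed it).
pathlib.Path("pid_workers.py").write_text(
    "import os\n"
    "\n"
    "def my_function(x):\n"
    "    return x + 1, os.getpid()\n"
)
sys.path.insert(0, os.getcwd())
import pid_workers

pool = multiprocessing.Pool(4)
results = pool.map(pid_workers.my_function, range(1, 7))
pool.close()
pool.join()

values = [v for v, _ in results]
worker_pids = {p for _, p in results}
print(values)                      # [2, 3, 4, 5, 6, 7]
print(os.getpid() in worker_pids)  # False: the work ran in child processes
```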
