
I have to use Process with the "spawn" context on Linux, so I wrote the following sample code:

from multiprocessing import Value
import multiprocessing

class Test(object):
    def __init__(self, m_val):
        print("step1")
        self.m_val = m_val
        print("step2")
        self.m_val_val = m_val.value
        self.prints()

    def prints(self):
        print("self.m_val_val:%d" % self.m_val_val)


def main(m_val):
    t = Test(m_val)

if __name__ == "__main__":
    N = 2
    procs = []
    v = Value("i", 10)
    for i in range(N):
        proc_i = multiprocessing.get_context("spawn").Process(target=main, args=(v,))
        proc_i.daemon = True
        procs.append(proc_i)
    for i in range(N):
        procs[i].start()

    for i in range(N):
        procs[i].join()

When I run this code on Linux, it prints:

step1
step2
step1
step2

while on Windows, the output is:

step1
step2
self.m_val_val:10
step1
step2
self.m_val_val:10

Besides, no error information is printed to the screen. So, how can I solve this problem, i.e., how can I share a multiprocessing Value among processes while using the "spawn" context on Linux?

  • I reproduced the issue, but I still don't have an explanation. Take a look at this answer on how to add logging. Commented Sep 27, 2021 at 10:58
  • The problem comes from accessing .value on the synchronized wrapper, but I have no clue why. My debugger can't show me the frame, and the process exits during that access. On Windows, after a few frames it continues to execute the script. I'm puzzled. Commented Sep 27, 2021 at 11:12

1 Answer


The problem is that you are creating the Value in the default context, which is "fork" on Unix.

You can resolve this by setting the default start context to "spawn":

multiprocessing.set_start_method("spawn")  # Add this
v = Value("i", 10)

Better yet, create the Value in the context explicitly:

# v = Value("i", 10)                        # Change this
ctx = multiprocessing.get_context("spawn")  # to this
v = ctx.Value("i", 10)                      #
for i in range(N):
    # proc_i = multiprocessing.get_context("spawn").Process(target=main, args=(v,))  # (Optional) Refactor this
    proc_i = ctx.Process(target=main, args=(v,))                                     # to this
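Putting the pieces together, a complete corrected version of the question's script might look like the sketch below (the print("step1")/print("step2") tracing is dropped, and the optional Process refactor is applied):

```python
import multiprocessing


class Test(object):
    def __init__(self, m_val):
        self.m_val = m_val
        self.m_val_val = m_val.value  # succeeds now that the contexts match
        self.prints()

    def prints(self):
        print("self.m_val_val:%d" % self.m_val_val)


def main(m_val):
    t = Test(m_val)


if __name__ == "__main__":
    N = 2
    ctx = multiprocessing.get_context("spawn")
    v = ctx.Value("i", 10)  # Value created in the same spawn context as the processes
    procs = [ctx.Process(target=main, args=(v,)) for _ in range(N)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
```

Run as a script on Linux, each worker now reads the shared value and prints self.m_val_val:10, matching the Windows behavior.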

Reference

From https://docs.python.org/3/library/multiprocessing.html#contexts-and-start-methods:

spawn

Available on Unix and Windows. The default on Windows and macOS.

fork

Available on Unix only. The default on Unix.

Note that objects related to one context may not be compatible with processes for a different context. In particular, locks created using the fork context cannot be passed to processes started using the spawn or forkserver start methods.
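The incompatibility can be demonstrated directly. The minimal sketch below (the helper name read_value is mine; Unix only, since it uses the fork context) creates the Value in the fork context and hands it to a spawn-context process; the child's attempt to read .value then fails and the child exits with a non-zero code:

```python
import multiprocessing


def read_value(val):
    # With mismatched contexts, this access raises
    # OSError: [Errno 9] Bad file descriptor in the child.
    print(val.value)


if __name__ == "__main__":
    fork_ctx = multiprocessing.get_context("fork")    # Unix only
    spawn_ctx = multiprocessing.get_context("spawn")
    v = fork_ctx.Value("i", 10)  # created in the fork context...
    p = spawn_ctx.Process(target=read_value, args=(v,))  # ...passed to a spawn process
    p.start()
    p.join()
    print(p.exitcode)  # non-zero: the child crashed
```

The parent can still read v.value normally; only the spawn-started child, which cannot inherit the fork context's shared-memory resources, fails.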


Reproducing this issue on macOS

This issue can be reproduced on macOS with Python < 3.8.

From https://docs.python.org/3/library/multiprocessing.html#contexts-and-start-methods:

Changed in version 3.8: On macOS, the spawn start method is now the default.

To reproduce this issue on macOS with Python 3.8 and above:

multiprocessing.set_start_method("fork")  # Add this
v = Value("i", 10)

Error message and stack trace

OSError: [Errno 9] Bad file descriptor

Traceback (most recent call last):
  File "/path/to/python/multiprocessing/process.py", line 258, in _bootstrap
    self.run()
  File "/path/to/python/multiprocessing/process.py", line 93, in run
    self._target(*self._args, **self._kwargs)
  File "/path/to/file.py", line 18, in main
    t = Test(m_val)
  File "/path/to/file.py", line 10, in __init__
    self.m_val_val = m_val.value
  File "<string>", line 3, in getvalue
OSError: [Errno 9] Bad file descriptor

