19

I'm using multiprocessing to create a sub-process for my Python app. I would like to share data between the parent process and the child process. It's important to mention that I need to share this asynchronously, meaning that both the child process and the parent process will update the data while the code is running.

What would be the best way to perform that?

2
  • p = multiprocessing.Process(target=myProcess, args=()); p.start() Commented Feb 2, 2016 at 15:30
  • 1
    It depends on the kind of data you want to share and the code you are running. Can you provide a minimal example of what you want/did? Commented Feb 2, 2016 at 15:50

3 Answers 3

16

This is one simple example from the Python documentation:

from multiprocessing import Process, Queue

def f(q):
    q.put([42, None, 'hello'])

if __name__ == '__main__':
    q = Queue()
    p = Process(target=f, args=(q,))
    p.start()
    print(q.get())   # prints "[42, None, 'hello']"
    p.join()

You can use a Pipe as well. Refer to the docs for more details: https://docs.python.org/2/library/multiprocessing.html
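A Pipe works in the same spirit as the Queue example above; here is a minimal sketch (the function and variable names are illustrative):

```python
from multiprocessing import Process, Pipe

def f(conn):
    # Send any picklable object to the other end of the pipe
    conn.send([42, None, 'hello'])
    conn.close()

if __name__ == '__main__':
    parent_conn, child_conn = Pipe()
    p = Process(target=f, args=(child_conn,))
    p.start()
    print(parent_conn.recv())   # prints "[42, None, 'hello']"
    p.join()
```

A Pipe is a two-ended connection between exactly two processes, while a Queue is safe for multiple producers and consumers.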


4 Comments

Thanks! How can I check if something changed in the Queue or the Pipe, and then perform some action?
Queue and Pipe are not variables but links between processes. You can send and receive variables through them.
@Dan you may like to refer to the "Shared memory" section of the link I have shared
@alokthkur, if I want to pass huge amounts of data in memory to another process for analysis (i.e. file contents), how can I pass it without copying the data? Any idea how I can share the data and pass only a pointer as argv?
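Regarding the comment about detecting changes: you normally don't poll a Queue for changes; q.get() blocks until the other process puts something. A hedged sketch (the sentinel convention and names here are my own, not from the answer):

```python
from multiprocessing import Process, Queue

def worker(q):
    q.put('update')   # child signals the parent
    q.put(None)       # sentinel: no more updates coming

def consume(q):
    # q.get() blocks until an item arrives, so the parent simply
    # reacts to items as they come in, no polling loop needed.
    while True:
        item = q.get()
        if item is None:   # sentinel received, stop consuming
            break
        print('got:', item)

if __name__ == '__main__':
    q = Queue()
    p = Process(target=worker, args=(q,))
    p.start()
    consume(q)
    p.join()
```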
4

From Python 3.8 it is possible to use shared_memory. This example is taken from the docs:

>>> from multiprocessing import shared_memory
>>> shm_a = shared_memory.SharedMemory(create=True, size=10)
>>> type(shm_a.buf)
<class 'memoryview'>
>>> buffer = shm_a.buf
>>> len(buffer)
10
>>> buffer[:4] = bytearray([22, 33, 44, 55])  # Modify multiple at once
>>> buffer[4] = 100                           # Modify single byte at a time
>>> # Attach to an existing shared memory block
>>> shm_b = shared_memory.SharedMemory(shm_a.name)
>>> import array
>>> array.array('b', shm_b.buf[:5])  # Copy the data into a new array.array
array('b', [22, 33, 44, 55, 100])
>>> shm_b.buf[:5] = b'howdy'  # Modify via shm_b using bytes
>>> bytes(shm_a.buf[:5])      # Access via shm_a
b'howdy'
>>> shm_b.close()   # Close each SharedMemory instance
>>> shm_a.close()
>>> shm_a.unlink()  # Call unlink only once to release the shared memory

However, you should be careful with race conditions. I personally recommend a message-passing system, or joining as mentioned in the other answers. That said, if you have only one writing process and the rest are readers, it should be quite safe.
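If more than one process does write, one way to avoid the race conditions mentioned above is to guard the buffer with a multiprocessing.Lock. A minimal sketch, assuming the same SharedMemory API as the docs example (the lock and function names are my addition):

```python
from multiprocessing import Lock, Process, shared_memory

def writer(name, lock):
    # Attach to the existing block by name, then write under the lock
    shm = shared_memory.SharedMemory(name)
    with lock:
        shm.buf[:5] = b'howdy'
    shm.close()

if __name__ == '__main__':
    lock = Lock()
    shm = shared_memory.SharedMemory(create=True, size=10)
    p = Process(target=writer, args=(shm.name, lock))
    p.start()
    p.join()
    with lock:                       # readers take the lock too
        print(bytes(shm.buf[:5]))    # b'howdy'
    shm.close()
    shm.unlink()                     # release the block exactly once
```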

Comments

1

Here's an example of multiprocess-multithread sharing a couple of variables:

import time
from ctypes import c_bool
from multiprocessing import Process, Value
from threading import Thread


def yourFunc(pause, budget):
    while True:
        print(budget.value, pause.value)
        # Set the shared value; all processes and threads see it
        pause.value = True
        # ... rest of the work goes here

def multiProcess(threads, pause, budget):
    ts = []
    for _ in range(threads):
        t = Thread(target=yourFunc, args=(pause, budget))
        t.start()
        ts.append(t)
        time.sleep(3)

if __name__ == '__main__':
    pause = Value(c_bool, False)
    budget = Value('i', 5000)

    ps = []
    for i in range(2):
        p = Process(target=multiProcess, args=(2, pause, budget))
        p.start()
        ps.append(p)

Comments
