6

I'm trying to use a shared string variable between my Python processes, but it seems that I'm doing something wrong, since I'm getting core dumps and invalid memory values.

I use multiprocessing.Value to create a ctypes.c_char_p value and use the value attribute to access it. As I understand the Python docs, the value attribute should be synchronized, as long as it is an instance of Value (as opposed to an instance of RawValue). Is that correct so far?

I've created a short example to demonstrate my use of Value and to show the inconsistency while executing:

from multiprocessing import Process, Value
from ctypes import c_char_p

def process(v):
    while True:
        val = v.value
        print val
        while val == v.value:
            pass

v = Value(c_char_p, None)
p = Process(target=process, args=(v,))
p.start()

for i in range(1,999):
    v.value = str(i)

p.terminate()

4 Answers

7

I think the problem may have been caused by using Value(c_char_p) to hold a string value. If you want a string, you should probably just use multiprocessing.Array(c_char).


3 Comments

That did the trick. I just found a small, but important section in the docs: see note in gray box. I didn't keep in mind that c_char_p is actually a pointer.
An important point: with multiprocessing.Array(c_char, <init string>), you should pass a string of the maximum length you plan to store in the variable, since the array has a fixed length.
This did not work
2

From the Python reference: https://docs.python.org/2/library/multiprocessing.html

your_string = Array('B', range(LENGTH))

You can take the identifier for the datatype from the table in the array module reference: https://docs.python.org/2/library/array.html
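For instance, a sketch of writing raw bytes into such an unsigned-char array (the LENGTH value and message are made up for illustration):

```python
from multiprocessing import Array

LENGTH = 16  # hypothetical maximum message size

# 'B' is the typecode for unsigned char; the array is zero-initialized
buf = Array('B', LENGTH)

msg = b'hi there'
buf[:len(msg)] = msg  # slice assignment copies the bytes into shared memory
print(bytes(buf[:len(msg)]))
```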

Comments

2

This is very similar in function to your example, though subtly different. Notice that the child process terminates on its own when it gets the None sentinel. The polling loop could consume less CPU if it were to use a timeout.

from multiprocessing import Process, Pipe

def show_numbers(consumer):
    while True:
        if consumer.poll():
            val = consumer.recv()
            if val is None:
                break
            print(val)

consumer, producer = Pipe(False)
proc = Process(target=show_numbers, args=(consumer,))
proc.start()

for i in range(1, 999):
    producer.send(str(i))

producer.send(None)

proc.join()
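A sketch of the timeout variant mentioned above: Connection.poll accepts a timeout in seconds, so the loop blocks instead of spinning (the 0.5 s value and the smaller range are arbitrary choices for the example):

```python
from multiprocessing import Process, Pipe

def show_numbers(consumer):
    while True:
        # Block for up to 0.5 s waiting for data; returns False on timeout,
        # so the loop no longer burns CPU while the pipe is empty
        if consumer.poll(0.5):
            val = consumer.recv()
            if val is None:  # sentinel: producer is done
                break
            print(val)

if __name__ == '__main__':
    consumer, producer = Pipe(False)
    proc = Process(target=show_numbers, args=(consumer,))
    proc.start()
    for i in range(1, 10):
        producer.send(str(i))
    producer.send(None)
    proc.join()
```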

Comments

0

I ran into a similar problem when I attempted to set up multiple processes to access a shared I/O resource. It seems that Windows doesn't share a global variable space between processes, and items passed as arguments are squashed and passed by value.

This may not directly relate to your problem, but reading the discussion may help point you in the right direction:

Multiprocess with Serial Object as Parameter

1 Comment

Neither Windows nor Linux has a global variable space shared between processes. This is where IPC comes into play, and we need shared memory.
