I am experimenting with Python's (v. 3.8) multiprocessing library for a bigger program I am developing, and I am trying to share a multiprocessing.Array of strings between multiple processes, so that the Array can be updated and read by each process with the same data. I tried c_char_p, and the error message told me to use byte strings. However, for this code:
from multiprocessing import Process, Array
from ctypes import c_char_p

def show(a):
    print("This ran")
    print("a: ", a[:])

if __name__ == "__main__":
    array = Array(c_char_p, 1)
    array[0] = b'Hello World'
    print(array[:])
    p = Process(target=show, args=(array,))
    p.start()
    p.join()
the output varies from this:
[b'Hello World']
This ran
a: [b'c']
to this:
[b'Hello World']
This ran
but I expect:
['Hello World']
This ran
['Hello World']
I guess an obvious solution would be to share a common file among the processes, but I expect to use multiple arrays, and that could get tedious. I was wondering what would be the best current solution to this in Python 3.8.
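One alternative I have been considering (a sketch only; I am not sure it is the idiomatic choice, and it uses proxy objects rather than true shared memory) is a multiprocessing.Manager list, which accepts plain Python strings without any encoding:

```python
from multiprocessing import Process, Manager

def show(a):
    print("This ran")
    print("a: ", a[:])

if __name__ == "__main__":
    with Manager() as manager:
        # Proxy list: each access goes through the manager process,
        # so ordinary str values can be stored and read from any process
        arr = manager.list(["Hello World"])
        print(arr[:])
        p = Process(target=show, args=(arr,))
        p.start()
        p.join()
```

This avoids byte strings entirely, though every read and write is a round trip to the manager process, so it is presumably slower than a real shared Array.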
For this code:
from multiprocessing import Process, Array
from ctypes import c_char_p

def show(a):
    print("This ran")
    # Decode the encoded values
    arr = [s.decode("utf-8") for s in a]
    print(arr)

if __name__ == "__main__":
    array = Array(c_char_p, 1)
    message = b"Hello, world"
    array[0] = message
    p = Process(target=show, args=(array,))
    p.start()
    p.join()
    # Decode the encoded values
    arr = [s.decode("utf-8") for s in array]
    print(arr)
it never ends (i.e. it never reaches the print statement after the process is joined); the output is just:
This ran
I am not sure why.
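In case it helps narrow things down, one variant I tried that does round-trip the text uses a fixed-size buffer of c_char instead of c_char_p (a sketch under the assumption that a 64-byte buffer is large enough for my strings; I do not know if this is the recommended approach):

```python
from multiprocessing import Process, Array

def show(a):
    print("This ran")
    # .value reads the buffer contents up to the first NUL byte
    print(a.value.decode("utf-8"))

if __name__ == "__main__":
    # 'c' is the ctypes typecode for c_char: the Array is then a real
    # shared byte buffer rather than an array of pointers
    array = Array('c', 64)
    array.value = b"Hello, world"
    p = Process(target=show, args=(array,))
    p.start()
    p.join()
    print(array.value.decode("utf-8"))
```

With this version both the child and the parent print "Hello, world", which makes me suspect the problem is specific to c_char_p, but I would still like to understand what is actually going wrong above.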