Since this is the top Google result for "Python multiprocessing Queue implementation", I'm going to post a slightly more generalized example.
Consider the following script:
import time
import math
import pprint

def main():
    print('\n' + 'starting . . .' + '\n')

    startTime = time.time()

    my_list = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0, 10.0]

    result_list = []
    for num in my_list:
        result_list.append(squareNum(num))
    # end for

    elapsedTime = time.time() - startTime

    print('result_list: ')
    pprint.pprint(result_list)
    print('\n' + 'program took ' + '{:.2f}'.format(elapsedTime) + ' seconds' + '\n')
# end function

def squareNum(num: float) -> float:
    time.sleep(1.0)
    return math.pow(num, 2)
# end function

if __name__ == '__main__':
    main()
This script declares 10 floats, squares them (sleeping for 1 second on each square to simulate some significant computation), then collects the results in a new list. It takes about 10 seconds to run.
Here is a refactored version using multiprocessing's Process and Queue:
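The expected output looks roughly like this (the squared values are exact; the elapsed time will vary slightly from run to run):

    starting . . .

    result_list:
    [1.0, 4.0, 9.0, 16.0, 25.0, 36.0, 49.0, 64.0, 81.0, 100.0]

    program took 10.02 seconds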
from multiprocessing import Process, Queue
import time
import math
from typing import List
import pprint

def main():
    print('\n' + 'starting . . .' + '\n')

    startTime = time.time()

    my_list = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0, 10.0]
    result_list = []

    multiProcQueue = Queue()

    processes: List[Process] = []
    for num in my_list:
        processes.append(Process(target=squareNum, args=(num, multiProcQueue,)))
    # end for

    for process in processes:
        process.start()
    # end for

    for process in processes:
        process.join()
    # end for

    while not multiProcQueue.empty():
        result_list.append(multiProcQueue.get())
    # end while

    elapsedTime = time.time() - startTime

    print('result_list: ')
    pprint.pprint(result_list)
    print('\n' + 'program took ' + '{:.2f}'.format(elapsedTime) + ' seconds' + '\n')
# end function

def squareNum(num: float, multiProcQueue: Queue) -> None:
    time.sleep(1.0)
    result = math.pow(num, 2)
    multiProcQueue.put(result)
# end function

if __name__ == '__main__':
    main()
This script runs in about 1 second. To my knowledge, this is the cleanest way of having multiple processes write results in parallel to the same data structure. I wish the official documentation (https://docs.python.org/3/library/multiprocessing.html) had an example like this.
Note that the order of the result list will usually not match the order of the input list; a different approach would be needed if order had to be maintained, as in the sketch below.
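For example, one minimal way to keep the input order (just a sketch, not part of the original answer; the worker name squareNumKeyed is made up for illustration) is to put the input value on the queue alongside its result and sort afterwards:

from multiprocessing import Process, Queue
import time
import math
from typing import List, Tuple

def squareNumKeyed(num: float, multiProcQueue: Queue) -> None:
    time.sleep(1.0)
    # put (input, result) so the caller can restore the original order later
    multiProcQueue.put((num, math.pow(num, 2)))
# end function

def main():
    my_list = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0, 10.0]

    multiProcQueue = Queue()
    processes: List[Process] = [Process(target=squareNumKeyed, args=(num, multiProcQueue,)) for num in my_list]

    for process in processes:
        process.start()
    # end for

    # drain exactly one result per process before joining
    keyed_results: List[Tuple[float, float]] = [multiProcQueue.get() for _ in processes]

    for process in processes:
        process.join()
    # end for

    # sort by the original input value to restore input order
    result_list = [result for _, result in sorted(keyed_results)]
    print(result_list)
# end function

if __name__ == '__main__':
    main()

Sorting by the input value only works here because the inputs are unique and comparable; tagging each task with its list index would be the more general version of the same idea.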