Summary
Using Python, I want to start multiple processes which run the same executable with different parameters in parallel. When all of them have finished I want to check that there were no errors and then do some more processing.
What I've tried
I have this already:
import concurrent.futures
import subprocess

def main(path_of_script):
    path_of_exe = make_path_to_exe(path_of_script)
    #
    lst_standin_params = [["1", "5"], ["2", "1"]]
    #
    with concurrent.futures.ProcessPoolExecutor(max_workers=3) as executor:
        #
        future_standin_exe = {
            executor.submit(
                subprocess.Popen(
                    [path_of_exe, standin_arg_lst[TASK_ID_IDX], standin_arg_lst[TASK_DELAY_IDX]]
                )
            ): standin_arg_lst for standin_arg_lst in lst_standin_params
        }
        #
        for future in concurrent.futures.as_completed(future_standin_exe):
            tmp_rv_holder = future_standin_exe[future]
            #
            try:
                data = future.result()
            except Exception as exc:
                print('An exception occurred: %s' % (exc))
Question
The processes run fine, but I'm clearly doing something wrong with respect to checking that each process started by subprocess.Popen has completed successfully. I think I need a way to capture the return value from the call to subprocess.Popen, but I'm not sure how to.
The code as it stands throws an exception when the line data = future.result() is executed: can't pickle _thread.lock objects. I'm pretty sure that attempting to use the Future object this way is the wrong idea, but I can't work out how to access the results of the execution.
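For context on that error: executor.submit expects the callable itself plus its arguments, and makes the call later inside a worker process. Writing executor.submit(subprocess.Popen(...)) calls Popen immediately in the parent and hands submit a Popen object, which cannot be pickled. A minimal sketch of the corrected call shape (a trivial python -c command stands in for the real executable, which is not available here):

```python
import concurrent.futures
import subprocess
import sys

def show_submit():
    # Pass the callable (subprocess.call) and its arguments separately;
    # the worker process makes the call and returns the child's exit status.
    with concurrent.futures.ProcessPoolExecutor(max_workers=1) as executor:
        future = executor.submit(subprocess.call, [sys.executable, "-c", "pass"])
        return future.result()

if __name__ == "__main__":
    print(show_submit())  # 0 on success
```

subprocess.call is a plain module-level function, so it pickles by reference and travels to the worker without trouble, unlike a live Popen object.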
Use subprocess.Popen inside a function which captures the output (stdout=PIPE, p.stdout.read()) and returns that output, then submit this function to the ProcessPoolExecutor. asyncio also has a method to run external processes with await - docs.python.org/3/library/asyncio-subprocess.html - but it still needs stdout=PIPE and p.stdout.read() to get the output.
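That suggestion can be sketched as follows. Hedged assumptions: make_path_to_exe and the real executable are not available here, so a python -c one-liner stands in for path_of_exe, and run_task uses subprocess.run (which wraps Popen, waits for the child, and captures stdout via PIPE). The dict-of-futures shape mirrors the question's code:

```python
import concurrent.futures
import subprocess
import sys

def run_task(args):
    # Runs the command to completion in the worker and captures its output.
    # The returned CompletedProcess is plain data, so it pickles fine and
    # comes back through future.result().
    return subprocess.run(
        [sys.executable, "-c", "import sys; print(sys.argv[1])"] + args,
        stdout=subprocess.PIPE,
        stderr=subprocess.PIPE,
        universal_newlines=True,
    )

def main():
    lst_standin_params = [["1"], ["2"]]
    results = []
    with concurrent.futures.ProcessPoolExecutor(max_workers=3) as executor:
        future_to_args = {
            executor.submit(run_task, args): args for args in lst_standin_params
        }
        for future in concurrent.futures.as_completed(future_to_args):
            completed = future.result()
            # returncode == 0 means the child exited successfully
            results.append((future_to_args[future], completed.returncode,
                            completed.stdout.strip()))
    return results

if __name__ == "__main__":
    for args, returncode, output in sorted(main()):
        print(args, returncode, output)
```

After the with block exits, every returncode can be checked before moving on to the follow-up processing mentioned in the summary.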