
I wonder if it's possible to spawn multiple processes using the subprocess module to run a function or a method defined in the same script (without needing to import it), so that the main script does not wait for the execution to complete. Like so (the code is wrong, but it illustrates the concept):

import subprocess

def printMe(arg):
    print(arg)

myList = ['One', 'Two', 'Three', 'Four', 'Five']

for word in myList:
    # This does not work: Popen expects a command to execute,
    # not the return value of an in-script function call
    proc = subprocess.Popen(printMe(word), stdout=subprocess.PIPE, stderr=subprocess.PIPE)

EDITED:

Thanks for the comments! Apparently the multiprocessing module needs to be used when there is a need to spawn an internal method or function. It also appears that the multiprocessing method pool.map() behaves quite differently from a "standard" function call when it passes an argument to the called function.

Example

from multiprocessing import Pool

def printMe(arg):
    arg += "_Completed"
    return arg

myList = ['One', 'Two', 'Three']

pool = Pool(processes=10)
results = pool.map(printMe, myList)

print(results, type(results), len(results))

# Results in: ['One_Completed', 'Two_Completed', 'Three_Completed'] <class 'list'> 3

SingleWord = "Once_Upon_A_Time"

pool = Pool(processes=10)
results = pool.map(printMe, SingleWord)

# A string is itself iterable, so map() sends each character as a separate argument.
# Results in: ['O_Completed', 'n_Completed', 'c_Completed', 'e_Completed', '__Completed',
# 'U_Completed', 'p_Completed', 'o_Completed', 'n_Completed', '__Completed', 'A_Completed',
# '__Completed', 'T_Completed', 'i_Completed', 'm_Completed', 'e_Completed'] <class 'list'> 16
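To hand map() a single string as one argument rather than character by character, wrap it in a one-element list. A minimal sketch, reusing the same printMe function:

```python
from multiprocessing import Pool

def printMe(arg):
    arg += "_Completed"
    return arg

if __name__ == '__main__':
    SingleWord = "Once_Upon_A_Time"
    pool = Pool(processes=2)
    # A one-element list makes map() deliver the whole string as one argument
    results = pool.map(printMe, [SingleWord])
    print(results)  # ['Once_Upon_A_Time_Completed']
```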

2 Answers


You can use multiprocessing, and not necessarily with Pool.

import multiprocessing

def worker():
    """worker function"""
    print('Worker')

if __name__ == '__main__':
    jobs = []
    for i in range(5):
        p = multiprocessing.Process(target=worker)
        jobs.append(p)
        p.start()
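The processes above run in the background and the main script continues immediately; if you later need to wait for them, call join() on each. A small sketch extending the same pattern:

```python
import multiprocessing

def worker(num):
    """worker function taking an argument"""
    print('Worker', num)

if __name__ == '__main__':
    jobs = []
    for i in range(5):
        # args passes each worker its own argument as a tuple
        p = multiprocessing.Process(target=worker, args=(i,))
        jobs.append(p)
        p.start()
    # The main script is free to do other work here;
    # join() only when you actually need the workers to be done
    for p in jobs:
        p.join()
```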



That's why multiprocessing became a standard library module.

from multiprocessing import Pool

def run(*args):
    # each item from the list arrives as a single tuple argument,
    # so sum(*args) sums the numbers inside that tuple
    return sum(*args)

if __name__ == "__main__":
    pool = Pool(processes=10) # 10 processes
    results = pool.map(run, [(1, 1, 1), (2, 2, 2), (3, 3, 3)])
    print(results)
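On Python 3 the same call can also be written with Pool.starmap, which unpacks each tuple into separate positional arguments. A sketch assuming the same data:

```python
from multiprocessing import Pool

def run(a, b, c):
    return a + b + c

if __name__ == "__main__":
    pool = Pool(processes=3)
    # starmap unpacks each tuple: run(1, 1, 1), run(2, 2, 2), run(3, 3, 3)
    results = pool.starmap(run, [(1, 1, 1), (2, 2, 2), (3, 3, 3)])
    print(results)  # [3, 6, 9]
```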

@Spuntnix As for your update: pool.map actually expects the second argument to be an iterable. So if you give it a string, it will iterate over the string and send each character as an argument.

Personally I'd prefer that str not be iterable. See also: https://mail.python.org/pipermail/python-3000/2006-April/000759.html

3 Comments

Thanks for the clarification! I've just tested pool.map() to spawn an internal method (function) in my code. Everything runs well, but the execution stalls waiting for the spawned processes to finish. I thought the main idea behind spawning was that the program wouldn't wait for the processes to complete.
@Sputnix Use pool.map_async if that's what you want.
Yes! map_async was exactly what I needed! myProcess = pool.map_async( myFunction, myArgList, callback=results.append )
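The map_async pattern from the comments above can be sketched as follows; unlike pool.map, the call returns immediately and the callback receives the full result list once all workers finish:

```python
from multiprocessing import Pool

def printMe(arg):
    return arg + "_Completed"

if __name__ == '__main__':
    results = []
    pool = Pool(processes=4)
    # map_async does not block; callback is invoked with the complete result list
    async_result = pool.map_async(printMe, ['One', 'Two', 'Three'],
                                  callback=results.append)
    # ... the main script is free to do other work here ...
    async_result.wait()  # block only when the results are actually needed
    print(results)  # [['One_Completed', 'Two_Completed', 'Three_Completed']]
```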
