
I think I have confused myself on how I should approach this.

I have a number of functions that I use to interact with an API, for example: get product ID, update product detail, update inventory. These calls need to be made one after another, and are all wrapped up in one function, api.push().

Let's say I need to run api.push() 100 times, once for each of 100 product IDs.

What I want to do is run many api.push calls at the same time, so that I can speed up the processing. For example, let's say I want to run 5 at a time.

I am confused as to whether this is multiprocessing or threading, or neither. I tried both, but they didn't seem to work. For example, I have this:

jobs = []
for n in range(0, 4):
    print "adding a job %s" % n
    p = multiprocessing.Process(target=api.push())
    jobs.append(p)

# Starts threads
for job in jobs:
    job.start()

for job in jobs:
    job.join()

Any guidance would be appreciated

Thanks

1 Answer

Please read the Python documentation and do some research on the global interpreter lock (GIL) to see whether you should use threading or multiprocessing in your situation.
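As a rough guide: API calls like these spend most of their time waiting on the network, so the GIL is usually not the bottleneck and a thread pool is enough. Here is a minimal sketch of running 5 workers at a time with `concurrent.futures`; the `push` function below is a stand-in for the asker's `api.push`, which we don't have:

```python
from concurrent.futures import ThreadPoolExecutor

def push(product_id):
    # Stand-in for api.push(product_id): in reality this would make
    # several sequential API calls for the given product.
    return product_id * 2

product_ids = list(range(100))

# Run at most 5 pushes concurrently; map() preserves input order.
with ThreadPoolExecutor(max_workers=5) as pool:
    results = list(pool.map(push, product_ids))
```

`ThreadPoolExecutor` handles starting, scheduling, and joining the workers, so there is no need to manage `Process` or `Thread` objects by hand.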

I do not know the inner workings of api.push, but please note that you should pass a function reference to multiprocessing.Process. Using p = multiprocessing.Process(target=api.push()) calls api.push immediately and passes whatever it returns as the function to be called in the subprocess.

If api.push is the function to be called in the subprocess, you should use p = multiprocessing.Process(target=api.push) instead, as it passes a reference to the function rather than the result of calling it.


1 Comment

That was it, I had the brackets in there. Thanks
