
I have a question about running functions in parallel in Python.

I have tried using multiprocessing to reduce the time spent sending and receiving data from an API, but when I execute the code below, it tends to crash my IDE.

from multiprocessing import Process

def network_request_function(value):
    # this function sends requests using value
    ...

for i in list:
    p1 = Process(target=network_request_function, args=(i,))
    p1.start()

Can you provide a way to fix my code? Or are there better alternatives?

  • Try wrapping your process-spawning loop in if __name__ == '__main__': so that importing your module doesn't spawn more processes; see the programming guidelines at docs.python.org/3/library/… Commented Oct 5, 2022 at 0:26
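A minimal sketch of that fix applied to the loop from the question, with the variable renamed to values (since list shadows a built-in) and a stub body for network_request_function:

```python
from multiprocessing import Process

def network_request_function(value):
    # stub standing in for the real request-sending code
    print(value)

if __name__ == '__main__':
    # The guard ensures this block only runs in the main script,
    # not when a child process re-imports the module.
    values = [1, 2, 3]
    processes = [Process(target=network_request_function, args=(v,)) for v in values]
    for p in processes:
        p.start()
    for p in processes:
        p.join()  # wait for every request to finish
```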

1 Answer


You should specify what platform this is running on and what your IDE is. Also, if all network_request_function does is make a network request and await a reply that needs no further CPU-intensive processing, then this should be using multithreading instead of multiprocessing. A multithreading pool lets you limit the number of concurrent threads in case your input list is very large, and it makes it simpler to get back a return value from network_request_function that you might be interested in. Finally, you should not use a name such as list, which happens to be the name of a built-in class, for a variable.

For example:

def network_request_function(value):
    #this function sends requests using value and returns the reply
    return reply

if __name__ == '__main__': # Required if we switch to multiprocessing
    # To use multiprocessing:
    #from multiprocessing.pool import Pool as Executor
    # To use multithreading:
    from multiprocessing.pool import ThreadPool as Executor

    # inputs is our list of value arguments used with network_request_function:
    inputs = []  # This variable is set somewhere

    # May need to be a smaller number if we are using multiprocessing and
    # depending on the platform:
    MAX_POOL_SIZE = 200

    pool_size = min(len(inputs), MAX_POOL_SIZE)
    with Executor(pool_size) as pool:
        # Get list of all replies:
        replies = pool.map(network_request_function, inputs)
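For comparison, here is a roughly equivalent sketch using the standard library's concurrent.futures API, which some people prefer; network_request_function is stubbed out here, and the reply format is just an illustration:

```python
from concurrent.futures import ThreadPoolExecutor

def network_request_function(value):
    # stub standing in for the real request; returns a fake reply
    return f"reply-for-{value}"

inputs = ["a", "b", "c"]  # the list of value arguments
MAX_POOL_SIZE = 200

# Cap the number of concurrent threads, as with the ThreadPool above
with ThreadPoolExecutor(max_workers=min(len(inputs), MAX_POOL_SIZE)) as executor:
    # executor.map preserves input order in its results
    replies = list(executor.map(network_request_function, inputs))
```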
