
Is the memory consumed by a process spawned by a multiprocessing.Process going to be released once the process is joined?

The scenario I have in mind is roughly like this:

from multiprocessing import Process
from multiprocessing import Queue
from queue import Empty
import time
import os

def main():
    tasks = Queue()
    for task in [1, 18, 1, 2, 5, 2]:
        tasks.put(task)

    num_proc = 3           # this many workers at each point in time
    procs = []
    for j in range(num_proc):
        p = Process(target=run_q, args=(tasks,))
        procs.append(p)
        p.start()

    # join each worker once it's done
    while procs:
        for p in procs[:]:          # iterate over a copy: removing from the
            if not p.is_alive():    # list being iterated would skip entries
                p.join()            # what happens to the memory allocated by run()?
                procs.remove(p)
                print(p, len(procs))
        time.sleep(1)

def run_q(task_q):
    while True:                         # while there's stuff to do, keep working
        try:
            task = task_q.get_nowait()  # checking empty() and then calling get()
        except Empty:                   # would race against the other workers
            break
        run(task)

def run(x):       # does real work, allocates memory
    print(x, os.getpid())
    time.sleep(3 * x)

if __name__ == "__main__":
    main()

In real code, the length of tasks is much larger than the number of CPU cores, each task is lightweight, and different tasks take vastly different amounts of CPU time (minutes to days) and vastly different amounts of memory (from peanuts to a couple of GBs). All this memory is local to a run, and there's no need to share it --- so the question is whether it's released once a run returns, and/or once a process is joined.

1 Answer

The memory consumed by a process is released back to the operating system when the process terminates. In your example, each worker terminates (and its memory is freed) as soon as run_q() returns; join() only waits for that termination and reaps the process, it does not itself free anything.
