
I am new to the amazing world of Python and am developing a test system that consists of a continuous sense-and-test run. I have three or more while loops, of which one is a producer and the other two are consumers. I do not understand multiprocessing very well. Here is some sample code: the first loop creates data and the second loop gets the data. How do I implement this with infinite while loops? I will stop the loops from the main program, but I am asking for your kind help to understand the data exchange between the while loops.

from multiprocessing import Process,Queue
from time import sleep

q=Queue()
cnt=0

def send():
  global cnt
  while True:
        sleep(1)
        cnt=cnt+1
        q.put(cnt,False)
        print ("data Send:",cnt)

def rcv():
  while True:
      sleep(1)
      newdata=q.get(cnt, False)
      print ("data Received",newdata)

if __name__=='__main__':
      p1=Process(target=send)
      p2=Process(target=rcv)
      p1.start()
      p2.start()
      p1.join()
      p2.join()
1
  • If the queue is empty, your consumer is blocked until something is on the queue; if the queue is full, the producer is blocked until there is free space. BTW, you could get the same model with one thread using Python generators.
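
A minimal sketch of the blocking behaviour the comment describes, relying on Queue.put() and Queue.get() blocking by default; the small maxsize, the fixed item count and the function names producer/consumer are my own choices for illustration, not part of the question:

from multiprocessing import Process, Queue
from time import sleep

def producer(q):
    for i in range(5):
        sleep(1)
        q.put(i)              # blocks if the queue is full (maxsize reached)
        print("sent", i)

def consumer(q):
    for _ in range(5):
        item = q.get()        # blocks until something is on the queue
        print("received", item)

if __name__ == '__main__':
    q = Queue(maxsize=2)      # small buffer so put() can actually block
    p1 = Process(target=producer, args=(q,))
    p2 = Process(target=consumer, args=(q,))
    p1.start()
    p2.start()
    p1.join()
    p2.join()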

2 Answers


I would suggest you dive into the documentation of the multiprocessing library you are using.

Basically, you have two options: Queue and Pipe. Right now you are using Queue:

q.put(cnt,False)
...
newdata=q.get(cnt, False)

This will crash because at some point you will try to get data from an empty Queue, so you need to check the queue status before reading from it.

while not q.empty():       # only read when something is actually queued
    newdata = q.get()

Other than that, if you want to have multiple receivers, you need to think about some kind of mutex (see multiprocessing.Lock) or about multiprocessing.Pipe: if one reader process is just getting a value from the queue while another one is checking the status of the queue, the second one will fail because the queue will actually be empty when it tries to read from it.

However, for this minimal example, using a mutex (a mutually exclusive lock, which prevents multiple processes from accessing the same memory at the same time) will most likely negate the advantage gained from using multiple cores. On the other hand, if the different processes actually do some heavy calculation with the values before/after accessing the queue, the benefit gained will be greater than the loss from using a lock.
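
To make that concrete, here is a hedged sketch of the lock-guarded check-then-get pattern with two reader processes sharing one Queue; the sentinel-based shutdown and the names producer, consumer and reader-N are my own additions for illustration, not something from the question:

from multiprocessing import Process, Queue, Lock
from time import sleep

def producer(q, n_consumers):
    for i in range(10):
        q.put(i)
        sleep(0.2)
    for _ in range(n_consumers):
        q.put(None)                     # one "stop" sentinel per consumer

def consumer(q, lock, name):
    while True:
        with lock:                      # make the empty-check and the get atomic
            item = q.get() if not q.empty() else "EMPTY"
        if item == "EMPTY":
            sleep(0.1)                  # nothing queued yet, back off and retry
            continue
        if item is None:                # sentinel: the producer is done
            break
        print(name, "got", item)

if __name__ == '__main__':
    q = Queue()
    lock = Lock()
    readers = [Process(target=consumer, args=(q, lock, f"reader-{i}")) for i in range(2)]
    writer = Process(target=producer, args=(q, 2))
    for p in readers + [writer]:
        p.start()
    for p in readers + [writer]:
        p.join()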


5 Comments

Thanks Teemu, I am looking for a way to run multiple while loops in parallel and exchange data between them.
No problem, and welcome to Stack Overflow! If some of the answers satisfy you and solve the problem you have, remember to mark the most helpful one as accepted.
while not q.empty(): newdata = q.get()
q.get(cnt, False) might fail; in the consumer you want to block if the queue is empty but the producer is still processing. I think it should be q.get(cnt).
When I check is_alive() I see that both processes have started and are alive, but there is no print or activity from the second while loop.
NUM_CONSUMERS = 2          # number of consumer callables to rotate through

def send():
  cnt = -1
  while True:
    cnt += 1
    yield cnt              # produce the next value lazily

def rcv(value):
  ...  # consumer logic goes here

consumers = [rcv] * NUM_CONSUMERS
idx = 0
for val in send():
  consumers[idx](val)      # hand the value to the current consumer
  idx += 1
  idx %= NUM_CONSUMERS     # round-robin over all consumers
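
For a quick check that the values really rotate across consumers, one option (my addition, not part of the answer) is to bound the generator with itertools.islice; this reuses send() and NUM_CONSUMERS from the snippet above, and the print-based rcv() is just a stand-in:

from itertools import islice

def rcv(value):
    print("consumer got", value)

consumers = [rcv] * NUM_CONSUMERS
idx = 0
for val in islice(send(), 6):   # take only six values so the loop terminates
    consumers[idx](val)
    idx = (idx + 1) % NUM_CONSUMERS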

10 Comments

I have to use multiprocessing as I need to run tasks in parallel.
You only get one core in Python. Processes should only be used for I/O calls that block.
@Dan 1) "You only get one core in Python": what do you mean by this statement? 2) "Processes should only be used for I/O calls that block": there's no reason to limit the use of multiprocessing to blocking I/O calls. Can you clarify what you mean? It's unclear what you're trying to say and is potentially misinformed.
Python can only run on one CPU core at a time, so there is no multi-core concurrency. Because of this, if you spin up threads or processes to perform parallel computations, you will only slow down the interpreter with extra overhead. But if you want to make a network call that blocks, then you can spin up a thread to make that call without blocking the rest of your program.
@Dan Then could you kindly explain to me what I have understood wrong in the multiprocessing documentation when it says: "The multiprocessing package offers both local and remote concurrency, effectively side-stepping the Global Interpreter Lock by using subprocesses instead of threads. Due to this, the multiprocessing module allows the programmer to fully leverage multiple processors on a given machine."
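
As a rough illustration of what the quoted documentation means, a CPU-bound job split over a multiprocessing.Pool does use several cores at once; the function name cpu_bound and the worker count are arbitrary choices of mine, not something from the thread:

from multiprocessing import Pool

def cpu_bound(n):
    # Pure computation, no I/O: in threads this would be serialized by the GIL.
    return sum(i * i for i in range(n))

if __name__ == '__main__':
    with Pool(processes=4) as pool:
        # Each task runs in its own process with its own interpreter and GIL,
        # so the four calls can execute on four cores in parallel.
        print(pool.map(cpu_bound, [10_000_000] * 4))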