
I am trying to implement a threaded timer to control a timeout for a serial process.

def tst_setMaxTimeFlag():

    lock.acquire()
    maxTimeFlag = 1
    lock.release()

    print "timeout!"
    return

def tst_setMaxTimeTimer(maxResponseTime):
    global responseTimer

    lock.acquire()
    maxTimeFlag = 0
    lock.release()

    responseTimer = threading.Timer(2,tst_setMaxTimeFlag)
    print "timer set!"
    responseTimer.start        
    print "timer start!"
    return

I would expect the output to be:

  1. timer set!
  2. timer start!
  3. timeout!

However, tst_setMaxTimeFlag() is never called and "timeout!" is never printed.

If I alter responseTimer = threading.Timer(2, tst_setMaxTimeFlag) to responseTimer = threading.Timer(2, tst_setMaxTimeFlag()), the timeout function is called immediately, regardless of the time parameter.

maxTimeFlag is set as a global in main and initialized to 0.

Any thoughts?

1 Answer


You lost all the indentation in your code snippet, so it's hard to be sure what you did.

The most obvious problem is responseTimer.start. That merely retrieves the start method of your responseTimer object. You need to call that method to start the timer; i.e., do responseTimer.start().
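The same rule explains the experiment in the question: threading.Timer expects a callable, so Timer(2, tst_setMaxTimeFlag()) executes the function immediately and hands its return value (None) to the timer. A minimal sketch of the distinction, using a hypothetical callback cb:

    import threading

    def cb():
        print "fired!"

    t = threading.Timer(2, cb)     # stores the function object; cb runs ~2 s after t.start()
    t.start()

    t2 = threading.Timer(2, cb())  # calls cb right now; Timer receives None, so starting
                                   # t2 would only raise a TypeError when the delay expires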

With responseTimer.start() actually called, it will produce the output you expected, with a delay of about 2 seconds before the final "timeout!" is printed.
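For completeness, here is a corrected sketch of the question's snippet, keeping its Python 2 print statements and assuming lock, maxTimeFlag, and responseTimer live at module level as the question describes. Note that tst_setMaxTimeFlag also needs a global maxTimeFlag statement; without it the assignment creates a local variable and the flag seen by main never changes:

    import threading

    lock = threading.Lock()
    maxTimeFlag = 0
    responseTimer = None

    def tst_setMaxTimeFlag():
        global maxTimeFlag          # required so the assignment updates the module-level flag
        lock.acquire()
        maxTimeFlag = 1
        lock.release()
        print "timeout!"

    def tst_setMaxTimeTimer(maxResponseTime):
        global responseTimer, maxTimeFlag
        lock.acquire()
        maxTimeFlag = 0
        lock.release()
        # presumably maxResponseTime, not the hard-coded 2, was meant as the delay
        responseTimer = threading.Timer(maxResponseTime, tst_setMaxTimeFlag)
        print "timer set!"
        responseTimer.start()       # the parentheses make this a call, not an attribute lookup
        print "timer start!"

    tst_setMaxTimeTimer(2)

Run under Python 2, this prints "timer set!" and "timer start!" immediately, then "timeout!" about two seconds later; the interpreter stays alive until the (non-daemon) timer thread fires.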
