
I have a small web server doing some operations on a POST request. It reads a data file, does some checks, and then resaves the file, adding some information from the POST to it.

The issue I have is that if two clients send a POST request at almost the same time, both read the same file. One then writes the file with its new information, and the other overwrites it with only its own information, losing the first client's update, since that update wasn't in the file when the second client read it.

import json
import yaml
import web

f = open("foo.txt", "r+")
tests_data = yaml.safe_load(f)
post_data = json.loads(web.data())
# Some checks

f.seek(0)      # rewind: after the read, the file position is at end-of-file
f.truncate()   # drop any leftover old bytes before rewriting
f.write(json.dumps(tests_data))
f.close()

I want the script to "wait" at the "open" line, without raising an error, if the file is already open in another process running the same code, then read the file once the other process has finished and closed it.

Or any other approach, if a better solution exists.

  • Have some kind of queueing system, perhaps: your program writes a queue of clients currently accessing the file to an ancillary data file, and each instance executes only when its unique ID is at the front of that queue, cleaning up its own ID's lines afterwards. Or however you want to do it. Commented Feb 29, 2016 at 21:18

1 Answer


Would a standard lock not suit your needs? The lock would need to be at the module level.

from threading import Lock
# this needs to be module level variable
lock = Lock()

with lock:
    # do your stuff.  only one thread at a time can
    # work in this space...