
I've been using ConcurrentLogHandler for multi-platform, multi-process file logging.

Now I'd like to be sure I'm reading 'atomic' parts of the logs, i.e. I don't want to read half a log line, for example. The concurrent file logger actually takes a LOCK_EX (exclusive lock) on the file. Has anyone made a practice of taking LOCK_SH (shared lock) on the files ConcurrentLogHandler writes? I cannot see any such read support in the module itself.

Or do you have experience with such a multiple-reader/single-writer setup using another Python module (without rewriting everything by hand)?

1 Answer

ConcurrentLogHandler just uses the file-locking tools the OS provides (fcntl.flock on POSIX, win32file.LockFileEx on Windows), so there shouldn't be any issue with taking a LOCK_SH on the file: ConcurrentLogHandler will respect the lock when it tries to take its LOCK_EX. The easiest way to do it is to use the portalocker module that's bundled with ConcurrentLogHandler:

import portalocker

with open("logfile.txt") as f:
    portalocker.lock(f, portalocker.LOCK_SH)
    for line in f:
        ...  # do stuff with each line
# the lock is released when the file is closed
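For reference, the same shared/exclusive interaction can be demonstrated directly with the underlying fcntl.flock call that both portalocker and ConcurrentLogHandler use on POSIX. This is a minimal POSIX-only sketch (the temp file stands in for your log file): two readers share the lock, while a would-be writer taking the exclusive lock is refused until the readers finish.

```python
import fcntl
import tempfile

# POSIX-only sketch: the exclusive lock a log handler would take is a
# plain fcntl.flock(LOCK_EX), so any readers holding LOCK_SH make the
# writer wait. The temp file here is illustrative.
tmp = tempfile.NamedTemporaryFile("w", suffix=".log", delete=False)
tmp.write("a log line\n")
tmp.close()

reader1 = open(tmp.name)
fcntl.flock(reader1, fcntl.LOCK_SH)                   # first shared lock

reader2 = open(tmp.name)
fcntl.flock(reader2, fcntl.LOCK_SH | fcntl.LOCK_NB)   # readers coexist

writer = open(tmp.name, "a")
try:
    # the exclusive lock cannot be granted while shared locks are held
    fcntl.flock(writer, fcntl.LOCK_EX | fcntl.LOCK_NB)
    writer_blocked = False
except BlockingIOError:
    writer_blocked = True

reader1.close()                                       # closing releases the locks
reader2.close()
fcntl.flock(writer, fcntl.LOCK_EX | fcntl.LOCK_NB)    # now it succeeds
writer.close()
```

Note that flock locks belong to the open file description, so even two handles within one process contend as shown; separate processes behave the same way.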

1 Comment

OK, I'll give it a try. It looked to me like there were some other hidden locks in the log handlers, but you're right, reads would at least be queued behind portalocker's actual lock. Thanks!
