
I have an unfortunate situation where multiple Perl processes write to and read from the same SQLite3 database at the same time.

This often causes the Perl processes to crash, as two processes may be writing at the same time, or one process may be reading from the database while another tries to update the same record.

Does anyone know how I could coordinate the multiple processes so they can work with the same SQLite database?

I'll be working on moving this system to a different database engine, but before I do that I need to get it working as it is.

    You might consider connecting to the DB through a DBIx::Connector object and running your queries through its run( fixup => ... ) API. It retries on failure and has strong fault tolerance. The txn() method may be even better in this situation. Commented Jun 20, 2012 at 19:45
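For illustration, a minimal sketch of what the comment describes. The DSN, table names, and queries here are hypothetical placeholders; the run() and txn() calls follow DBIx::Connector's documented interface, where the block receives the database handle in $_:

```perl
use strict;
use warnings;
use DBIx::Connector;

# Hypothetical DSN -- point this at your own SQLite file.
my $conn = DBIx::Connector->new(
    'dbi:SQLite:dbname=app.db', '', '',
    { RaiseError => 1, AutoCommit => 1 },
);

# run() in fixup mode re-executes the block if the connection
# turns out to be dead, reconnecting first.
$conn->run(fixup => sub {
    $_->do('UPDATE items SET count = count + 1 WHERE id = ?', undef, 42);
});

# txn() additionally wraps the block in a transaction, so the two
# statements below succeed or fail as a unit -- useful when a read
# and a dependent write must not interleave with another process.
$conn->txn(fixup => sub {
    $_->do('INSERT INTO log (msg) VALUES (?)',        undef, 'updated');
    $_->do('UPDATE items SET count = 0 WHERE id = ?', undef, 42);
});
```

txn() is the better fit here because it makes each read-modify-write cycle atomic with respect to the other processes, rather than merely retrying a failed statement.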

1 Answer


SQLite is designed to be used from multiple processes. There are some exceptions if you host the SQLite file on a network drive, and there may be a way to compile it so that it expects to be used from only one process, but I use it from multiple processes regularly. If you are experiencing problems, try increasing the timeout value. SQLite uses filesystem locks to protect the data from simultaneous access, so if one process is writing to the file, a second process may have to wait. I set my timeouts to 3 seconds and have very few problems with that.

Here is the link to set the timeout value
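In Perl, with DBD::SQLite, the timeout can be set on the database handle in milliseconds. A sketch of the answer's 3-second setting, assuming a local database file named app.db:

```perl
use strict;
use warnings;
use DBI;

my $dbh = DBI->connect(
    'dbi:SQLite:dbname=app.db', '', '',
    { RaiseError => 1, AutoCommit => 1 },
);

# Wait up to 3000 ms for a competing writer's lock to clear,
# instead of failing immediately with "database is locked".
$dbh->sqlite_busy_timeout(3000);

# Equivalent pragma form, if you prefer plain SQL:
# $dbh->do('PRAGMA busy_timeout = 3000');
```

Every process that opens the database needs to set this; the timeout is a property of each connection, not of the database file.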


1 Comment

Lengthy details: sqlite.org/docs.html / SQLite Technical/Design Documentation
