
I have a file called db.py with the following code:

from sqlalchemy import create_engine
from sqlalchemy.orm import scoped_session, sessionmaker


engine = create_engine('sqlite:///my_db.sqlite')
session = scoped_session(sessionmaker(bind=engine, autoflush=True))

I am trying to import this file into various subprocesses started using a spawn context (potentially important, since various fixes that worked for fork don't seem to work for spawn).

The import statement is something like:

from db import session

and then I use this session ad libitum without worrying about concurrency, assuming SQLite's internal locking mechanism will order transactions so as to avoid concurrency errors; I don't really care about transaction order.

This seems to result in errors like the following: sqlite3.ProgrammingError: SQLite objects created in a thread can only be used in that same thread. The object was created in thread id 139813508335360 and this is thread id 139818279995200.
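For context, the error comes from the underlying sqlite3 driver, not from SQLAlchemy itself. A minimal stdlib-only sketch (independent of the code above) reproduces the same `ProgrammingError` by touching one connection from two threads:

```python
import sqlite3
import threading

# check_same_thread defaults to True, so this connection is
# pinned to the thread that created it (the main thread here).
conn = sqlite3.connect(":memory:")

errors = []

def use_in_other_thread():
    try:
        conn.execute("SELECT 1")
    except sqlite3.ProgrammingError as exc:
        # "SQLite objects created in a thread can only be used
        # in that same thread..."
        errors.append(exc)

t = threading.Thread(target=use_in_other_thread)
t.start()
t.join()

print(len(errors))  # the cross-thread call raises ProgrammingError
```

This is the same check the answer below refers to; it fires whenever a pooled connection created in one thread is handed to another.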

Mind you, this doesn't seem to affect my program directly; every transaction goes through just fine, but I am still worried about what's causing this.

My understanding was that scoped_session was thread-local, so I could import it however I want without issues. Furthermore, my assumption was that SQLAlchemy will always handle the closing of connections and that SQLite will handle ordering (i.e. make a session wait for another session to end before it can do any transaction).

Obviously one of these assumptions is wrong, or I am misunderstanding something basic about the mechanism here, but I can't quite figure out what. Any suggestions would be useful.

  • I don't think this will solve your problem, but note that SQLite doesn't always work well with multithreading (though the default threading mode in my Python distribution is serialized). See this Commented Feb 28, 2021 at 7:56

1 Answer


The problem isn't about thread-local sessions; it's that the original connection object is in a different thread from those sessions. SQLite disallows using a connection across different threads by default.

The simplest answer to your question is to turn off SQLite's same-thread checking. In SQLAlchemy you can achieve this by specifying it as part of your database URL:

engine = create_engine('sqlite:///my_db.sqlite?check_same_thread=False')

I'm guessing that will do away with the errors, at least.

Depending on what you're doing, this may still be dangerous. If you're ensuring your transactions are serialised (that is, one after the other, never overlapping or simultaneous) then you're probably fine. If you can't guarantee that, then you're risking data corruption, in which case you should consider:

a) using a database backend that can handle concurrent writes, or
b) creating an intermediary app or service that solely manages SQLite reads and writes and that your other apps can communicate with.

That latter option sounds fun, but be warned: you may end up reinventing the wheel when you're better off just spinning up a Postgres container or something.
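As an aside, the same flag can also be passed via `connect_args` instead of the URL query string, which keeps the URL clean. A minimal sketch, assuming the same db.py layout as in the question:

```python
from sqlalchemy import create_engine, text
from sqlalchemy.orm import scoped_session, sessionmaker

# Equivalent to appending ?check_same_thread=False to the URL:
# the flag is forwarded to sqlite3.connect() for every pooled connection.
engine = create_engine(
    "sqlite:///my_db.sqlite",
    connect_args={"check_same_thread": False},
)
session = scoped_session(sessionmaker(bind=engine, autoflush=True))

# Sanity check that the session works.
result = session.execute(text("SELECT 1")).scalar()
print(result)

session.remove()  # return the thread-local session's connection to the pool
```

Either spelling has the same effect; `connect_args` is just the generic SQLAlchemy mechanism for passing driver-level options.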


3 Comments

But the connection is thread-local, so how come you say it's used across different threads? This hardly makes sense; that's why I have scoped_session. Or is it that the scoped_session I'm creating is bound to a connection originally created by the engine? That would also be unlikely, since I don't always notice these errors, just in certain cases when exceptions interrupt a thread.
Also, I get that using PG would be better; sadly, this is a fairly portable app that has to be installed via PyPI, so I have to default to SQLite :(
I'd have assumed such a "synchronization" feature ought to already be implemented by either SQLAlchemy or some other library I can plug in.
