I have a situation with lots of individual per-customer databases (close to 1000), each with its own connection. From SQLAlchemy, I open and close connections to keep the max connection count low. However, opening and closing has its penalty, so I was wondering if I could go the connection pool route. Given that the databases are distinct, my question is whether the SQLAlchemy connection pool can really help me here. I am not clear if the pool connects to the Postgres server or to individual databases (it seems like individual databases). I would basically like a mechanism where the pool can connect/switch between different databases. Any pointers?
1 Answer
SQLAlchemy has an in-application pooling layer enabled by default, described at connection pooling. Note that the pool is bound to an Engine, and an Engine is bound to a single database URL, so with 1000 distinct databases you would have 1000 separate pools rather than one pool that switches between databases. This layer is IMHO plenty good for garden-variety connection pooling within the scope of a single process; however, PostgreSQL enthusiasts nearly always recommend PgBouncer for maximum performance and configurability, and especially for the ability to evenly scale connections across a multi-process environment.
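To illustrate the per-database pooling idea, here is a minimal sketch: one cached Engine (and therefore one small pool) per customer database. The `get_engine` helper and the SQLite URL are assumptions for demonstration; in your setup the URL would be a Postgres one such as `postgresql+psycopg2://user:pass@host/<db_name>`, and you would size `pool_size`/`max_overflow` so that pools across all databases stay under your server's connection limit.

```python
from functools import lru_cache

from sqlalchemy import create_engine, text
from sqlalchemy.pool import QueuePool

# Hypothetical helper: SQLAlchemy pools are bound to a single database
# URL via the Engine, so "switching" databases means using a different
# Engine. Caching keeps one Engine (one pool) per customer database.
@lru_cache(maxsize=None)
def get_engine(db_name: str):
    # SQLite file URL used here only so the sketch runs anywhere;
    # substitute your Postgres URL in real use.
    return create_engine(
        f"sqlite:///{db_name}.db",
        poolclass=QueuePool,  # explicit pool; Postgres uses this by default
        pool_size=2,          # keep at most 2 idle connections per database
        max_overflow=3,       # allow short bursts above pool_size
        pool_recycle=1800,    # recycle connections after 30 minutes
    )

# Repeated calls for the same customer reuse the same Engine/pool,
# so connections are checked out and returned instead of re-opened.
engine = get_engine("customer_a")
with engine.connect() as conn:
    result = conn.execute(text("SELECT 1")).scalar()
```

With this pattern the open/close penalty is paid only when a pool first grows; subsequent `connect()` calls for the same customer are pool checkouts. The trade-off is idle connections held per database, which is exactly where an external pooler like PgBouncer (in transaction-pooling mode) can do better across many databases and processes.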