
Can someone explain how to handle the situation when the maximum connection limit for a database has been reached? Can we make a connection wait until an existing connection is released back to the pool?

import snowflake.connector as sf
import sqlalchemy.pool as pool
def get_conn():
    conn = sf.connect(
        user='username',
        password='password',
        account='snowflake-account-name',
        warehouse='compute_wh',
        database='customer_data'
    )

    return conn

# pool_size=5 plus max_overflow=10 allows at most 15 concurrent connections
mypool = pool.QueuePool(get_conn, max_overflow=10, pool_size=5)
a = mypool.connect()
a1 = mypool.connect()
a2 = mypool.connect()
a3 = mypool.connect()
a4 = mypool.connect()
a5 = mypool.connect()
a6 = mypool.connect()
a7 = mypool.connect()
a8 = mypool.connect()
a9 = mypool.connect() 
a11 = mypool.connect()
a12 = mypool.connect()
a13 = mypool.connect()
a14 = mypool.connect()
a15 = mypool.connect()

Up to a14 we get connection objects back successfully, but when we run a15 we get a "pool exhausted" error. How do I handle this case?

If we need to write the logic so that access is still granted even as the number of instances keeps increasing, how can I send a connection back to the pool?

2 Answers

Your pool is configured to hold five open connections to the database and create up to ten 'overflow' connections for times of heavy load.

Doing mypool.connect() will check out a connection, so when you exceed the pool size plus the overflow you get an error.
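Incidentally, this also touches the original question about waiting: when the pool is exhausted, QueuePool's connect() does block for up to `timeout` seconds (default 30) in the hope that a connection gets checked back in, and only then raises a TimeoutError. A minimal runnable sketch of that behaviour, using sqlite3 in place of the Snowflake connector so it runs anywhere:

```python
import sqlite3
import sqlalchemy.pool as pool
from sqlalchemy.exc import TimeoutError as PoolTimeoutError

def get_conn():
    # sqlite3 stands in for the Snowflake connector in this sketch
    return sqlite3.connect(":memory:")

# At most 1 + 0 = 1 concurrent connection; connect() waits up to 2 seconds
# for a checked-out connection to come back before raising TimeoutError.
mypool = pool.QueuePool(get_conn, pool_size=1, max_overflow=0, timeout=2)

first = mypool.connect()
try:
    second = mypool.connect()  # pool is exhausted, so this blocks...
except PoolTimeoutError:
    print("pool exhausted after waiting 2 seconds")  # ...then gives up
first.close()                  # returns the connection to the pool
```

If another thread had called first.close() within those 2 seconds, the blocked connect() would have succeeded instead of raising.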

You should return connections to the pool if you don't need them to do more work.

This can be done explicitly by calling e.g. a1.close() (by the way, it would be good to give descriptive names - these are Connection objects).

However, the better way is to use a context manager, because if an exception is thrown between your checking out a connection and returning it with conn.close(), you will end up having a permanently checked-out connection and ultimately have the same problem.

e.g.

with mypool.connect() as conn:
    do_some_work(conn)

more_code()

When you finish the with block (context manager) and move on to more code, or if you exit it with an exception, the connection will be returned to the pool for future use.
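To see the check-out/check-in cycle concretely, here is a small runnable sketch (again substituting sqlite3 for the Snowflake connector) that uses the pool's checkedout() counter to show that close() returns the connection rather than destroying it:

```python
import sqlite3
import sqlalchemy.pool as pool

def get_conn():
    # sqlite3 stands in for the Snowflake connector in this sketch
    return sqlite3.connect(":memory:")

mypool = pool.QueuePool(get_conn, pool_size=5, max_overflow=10)

conn = mypool.connect()
print(mypool.checkedout())  # 1 -- one connection is checked out

conn.close()                # "close" here means: check it back in
print(mypool.checkedout())  # 0 -- the connection is back in the pool,
                            # ready to be handed out again
```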


1 Comment

@PaddyAlton thanks for the hint you gave me. I've tried to implement it using a with statement and succeeded. Please check my code and correct me if I'm wrong.

I think this code will work even if the max limit of the pool is reached. Correct me if I did something wrong.

import snowflake.connector as sf
import sqlalchemy.pool as pool
    
class Database:
    connection_pool = None
    
    @classmethod
    def initialise(cls):
        def get_conn():
            conn = sf.connect(
                user='username',
                password='password',
                account='accountname',
                warehouse='compute_wh',
                database='customer_data'
            )
    
            return conn
    
        cls.connection_pool = pool.QueuePool(get_conn, max_overflow=1, pool_size=1)
    
class ConnectionFromPool:
    def __init__(self):
        self.connection  = None
        
    def __enter__(self):
        self.connection = Database.connection_pool.connect()
        return self.connection
        
    def __exit__(self, exc_type, exc_val, exc_tb):
        self.connection.commit()
        Database.connection_pool.dispose()
    
    
class User:
    def __init__(self, cust_key, name, address):
        self.cust_key = cust_key
        self.name = name
        self.address = address
    
    def save_to_db(self):
        with ConnectionFromPool() as connection:
            with connection.cursor() as cursor:
                cursor.execute("insert into data(cust_key,name,address) values (%s, %s, %s)", (self.cust_key, self.name, self.address))
    
    
    
Database.initialise()
user_to_db = User(1,'padma', 'ramnagar')
user_to_db.save_to_db()
user_to_db.save_to_db()
print("success")

6 Comments

It seems very ... intricate. But perhaps I don't understand your use case. As I said in my answer, SQLAlchemy connections can automatically be used in a context manager, so not sure why you need to make your own. I'm also not sure that pool.dispose is what you want (why throw away the pooled connections?) and if you don't want to be bound to a number of connections other than the hard limit of the DB I'm not sure why you want a pool at all?
@PaddyAlton, first of all, thank you for your concern about this issue. Let's assume I've set pool_size to 5 and max_overflow to 10, and that we've deployed our project. Users try to log in to our app; it only allows up to 15 connections, so the 16th person who tries to log in would fail. That's the case I needed to handle, and my code is working fine for it now.
Right - the point of having a pool is to limit how much of the database's resources your application can utilise. If you need more simultaneous connections in practice, you can increase the size of the pool. If you don't want these limits, you can avoid using a pool at all. Anyway, if you want a pool then I think you should change Database.connection_pool.dispose() to self.connection.close(). This will return the connection to the pool rather than throwing away the pool.
@PaddyAlton thank you for the help, but I'm using QueuePool from SQLAlchemy for connection pooling, and the QueuePool class does not have a close() method; it only has a dispose() method to close connections.
That's right, but the connection which you checked out from the pool does have a close() method. Note that my suggestion was to switch Database.connection_pool.dispose for self.connection.close (where self is an instance of ConnectionFromPool). This will not actually close the connection, despite the name - it will check the connection back into the pool, ready for reuse. If you don't want that, then in my opinion you don't want a pool at all.
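The change suggested in these comments can be sketched as follows, reusing the class names from the answer above (with sqlite3 standing in for the Snowflake connector so the sketch is runnable; only __exit__ differs from the original):

```python
import sqlite3
import sqlalchemy.pool as pool

class Database:
    # sqlite3 stands in for the Snowflake connector in this sketch
    connection_pool = pool.QueuePool(
        lambda: sqlite3.connect(":memory:"), pool_size=1, max_overflow=1
    )

class ConnectionFromPool:
    def __init__(self):
        self.connection = None

    def __enter__(self):
        self.connection = Database.connection_pool.connect()
        return self.connection

    def __exit__(self, exc_type, exc_val, exc_tb):
        if exc_type is None:
            self.connection.commit()  # commit only on a clean exit
        self.connection.close()       # check the connection back into the
                                      # pool; the pool itself is NOT disposed

# Repeated use no longer exhausts the pool, even with pool_size=1:
for _ in range(5):
    with ConnectionFromPool() as conn:
        conn.cursor().execute("select 1")
```

Guarding the commit() with exc_type is also worth noting: if the block raised, committing a half-finished transaction is usually not what you want.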
