I have a Python project that uses PostgreSQL. I would like Django-style unit tests, where the database is created and destroyed for every test, but I don't want to use SQLAlchemy. I tried something along these lines:
import os
from unittest import TestCase

import psycopg2
from psycopg2.extensions import ISOLATION_LEVEL_AUTOCOMMIT

# POSTGRES_HOST, POSTGRES_DB, POSTGRES_USER, POSTGRES_PASSWORD and SCHEMAS
# come from the project's configuration.

# Maintenance connection to the default 'postgres' database, used only for
# dropping and recreating the test database.
pg = psycopg2.connect(
    "host={} dbname={} user={} password={}".format(
        POSTGRES_HOST, 'postgres', POSTGRES_USER, POSTGRES_PASSWORD))
pg.set_isolation_level(ISOLATION_LEVEL_AUTOCOMMIT)  # CREATE/DROP DATABASE cannot run inside a transaction
cur = pg.cursor()

def reset_db():
    cur.execute('DROP DATABASE IF EXISTS {}'.format(POSTGRES_DB))
    cur.execute('CREATE DATABASE {}'.format(POSTGRES_DB))
    # Connect to the freshly created database and build the schema.
    newconn = psycopg2.connect(
        "host={} dbname={} user={} password={}".format(
            POSTGRES_HOST, POSTGRES_DB, POSTGRES_USER, POSTGRES_PASSWORD))
    newcur = newconn.cursor()
    # SCHEMAS is an imported dict containing schema creation statements
    for schema in SCHEMAS:
        newcur.execute(SCHEMAS[schema])
    return newcur

class Test(TestCase):
    def setUp(self):
        os.environ['testing'] = 'true'
        self.cur = reset_db()
The setUp method then sets an environment variable that tells my database layer to connect to the testing database.
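For context, the connection selection in the database layer is roughly equivalent to the sketch below; get_connection and DEFAULT_DB are simplified stand-ins, not the actual code:

import os
import psycopg2

# Illustrative sketch only; the real database layer is more involved.
# DEFAULT_DB stands in for the non-test database name, and the POSTGRES_*
# settings are the same ones used in the test setup above.
DEFAULT_DB = 'myapp'

def get_connection():
    # The test suite sets testing=true, so connect to the freshly rebuilt
    # test database instead of the normal one.
    dbname = POSTGRES_DB if os.environ.get('testing') == 'true' else DEFAULT_DB
    return psycopg2.connect(
        "host={} dbname={} user={} password={}".format(
            POSTGRES_HOST, dbname, POSTGRES_USER, POSTGRES_PASSWORD))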
This seems to work fine. The only problem is that reset_db() takes about 0.8 seconds, which is far too slow when it runs before every test.
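For anyone who wants to reproduce the number, a rough wall-clock measurement of the call is enough:

import time

# Rough timing of one full drop/create/schema rebuild.
start = time.perf_counter()
reset_db()
print('reset_db took {:.2f} s'.format(time.perf_counter() - start))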
Are there better approaches, or ways to optimise what I have?