
I'm trying to insert multiple rows into my database, and currently I do not know a way to insert them all at the same time, or any other method that would save time (inserting sequentially takes about 30 s for around 300 rows).

My 'rows' are tuples in a list of tuples (converted into a tuple of tuples), e.g. [(col0, col1, col2), (col0, col1, col2), (.., .., ..), ..]

def commit(self, rows):  # renamed from 'tuple', which shadows the built-in
    cursor = self.conn.cursor()
    sql = """insert into "SSENSE_Output" ("productID", "brand", "categoryID", "productName", "price", "sizeInfo", "SKU", "URL", "dateInserted", "dateUpdated")
          values (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s)"""
    for tup in rows:
        try:
            cursor.execute(sql, tup)
            self.conn.commit()  # one commit (and one round trip) per row
        except psycopg2.IntegrityError:
            self.conn.rollback()
            upsert = 'insert into "SSENSE_Output" ' \
                     '("productID", "brand", "categoryID", "productName", "price", "sizeInfo", "SKU", "URL", "dateInserted", "dateUpdated") ' \
                     'values (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s) ' \
                     'on conflict ("productID") do update set "dateUpdated" = EXCLUDED."dateUpdated"'
            cursor.execute(upsert, tup)
            self.conn.commit()
        except Exception as e:
            print(e)

I have also tried committing only after the for loop finishes, but it still takes about the same amount of time. Are there any ways to make this insert significantly faster?

  • You should build just one large INSERT statement instead of many separate ones; that will speed up the query. – Commented Mar 5, 2018 at 14:54

2 Answers


In postgres you can use a format like:

INSERT INTO films (code, title, did, date_prod, kind) VALUES
('B6717', 'Tampopo', 110, '1985-02-10', 'Comedy'),
('HG120', 'The Dinner Game', 140, DEFAULT, 'Comedy');

Because of your per-record exception handling, it is better to resolve the duplicates before generating this query (or attach an ON CONFLICT clause to it), since the whole multi-row statement will fail if any single row raises an integrity error.
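A minimal sketch of building that single multi-row statement for the question's table, with the ON CONFLICT clause handling duplicates up front. The function names (build_bulk_insert, commit_all) are illustrative, and conn is assumed to be an open psycopg2 connection; the string-building part is plain Python.

```python
COLUMNS = ('"productID"', '"brand"', '"categoryID"', '"productName"',
           '"price"', '"sizeInfo"', '"SKU"', '"URL"',
           '"dateInserted"', '"dateUpdated"')

def build_bulk_insert(rows):
    """Return (sql, flat_params) for one multi-row INSERT ... ON CONFLICT."""
    # One "(%s, ..., %s)" group per row, all in a single VALUES list.
    placeholder = "(" + ", ".join(["%s"] * len(COLUMNS)) + ")"
    sql = ('insert into "SSENSE_Output" (' + ", ".join(COLUMNS) + ") values "
           + ", ".join([placeholder] * len(rows))
           + ' on conflict ("productID") do update'
             ' set "dateUpdated" = EXCLUDED."dateUpdated"')
    # psycopg2 expects one flat parameter sequence matching the placeholders.
    flat_params = [value for row in rows for value in row]
    return sql, flat_params

def commit_all(conn, rows):
    sql, params = build_bulk_insert(rows)
    with conn.cursor() as cursor:
        cursor.execute(sql, params)  # one statement, one round trip
    conn.commit()                    # one commit for the whole batch
```

With ~300 rows this executes a single statement and a single commit, instead of 300 of each.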




Building one large INSERT statement instead of many separate ones will considerably improve the execution time; you should take a look here. That answer is for MySQL, but I think a similar approach applies to PostgreSQL.
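For PostgreSQL specifically, psycopg2 ships a helper, psycopg2.extras.execute_values, that expands a single "values %s" into one multi-row statement and sends rows in pages. A hedged sketch assuming psycopg2 is installed and conn is an open connection (upsert_rows is an illustrative name; table and columns are taken from the question):

```python
UPSERT_SQL = """
    insert into "SSENSE_Output"
        ("productID", "brand", "categoryID", "productName", "price",
         "sizeInfo", "SKU", "URL", "dateInserted", "dateUpdated")
    values %s
    on conflict ("productID") do update
        set "dateUpdated" = EXCLUDED."dateUpdated"
"""

def upsert_rows(conn, rows, page_size=200):
    # Imported inside the function so the module loads even where
    # psycopg2 is not installed.
    from psycopg2.extras import execute_values
    with conn.cursor() as cursor:
        # execute_values rewrites the single %s into a multi-row
        # VALUES list, batching `page_size` rows per statement.
        execute_values(cursor, UPSERT_SQL, rows, page_size=page_size)
    conn.commit()
```

The ON CONFLICT clause also removes the need for the per-row IntegrityError handling in the question.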

