
I have a number of daily CSV files to import into Postgres. The Python code below works for importing a single CSV file. How can I import a batch of CSV files? Thanks!

import psycopg2

con = psycopg2.connect(
    database="xxxx", 
    user="xxxx", 
    password="xxxx", 
    host="xxxx")

cur = con.cursor()
file = open('path/to/directory/client_record_2020-01-01.csv', 'r')
next(file)  # skip the header row
cur.copy_from(file, "table_name", columns=('col1', 'col2', 'col3', 'col4', 'col5'), sep=",")

con.commit()
con.close()

1 Answer


Let's try putting the names of our files in a list and then iterating over that list, doing one copy_from() per file. Maybe:

import psycopg2

file_names = [
    'path/to/directory/client_record_2020-01-01.csv'
]

con = psycopg2.connect(database="xxxx", user="xxxx", password="xxxx", host="xxxx")

for file_name in file_names:
    with open(file_name, 'r') as file_in:
        next(file_in)  # skip the header row
        with con.cursor() as cur:
            cur.copy_from(file_in, "table_name", columns=('col1', 'col2', 'col3', 'col4', 'col5'), sep=",")
        con.commit()

con.close()

2 Comments

Hmm, it doesn't loop through, though.
I know what it is: "file_names" lists only one file, so there's nothing to loop over. I used glob to grab all the files. Done. Thanks!
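
For reference, a minimal sketch of building file_names with glob, as the commenter describes; the directory and the client_record_*.csv pattern are assumptions based on the example filename in the question:

import glob

# Collect every daily CSV in the directory (pattern assumed from the question's example filename),
# sorted so the files are imported in date order.
file_names = sorted(glob.glob('path/to/directory/client_record_*.csv'))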
