
I am new to Python and PostgreSQL.

I've been battling with hardcoding each JSON line in Python, and I don't think this is a scalable method. Can someone point me to literature or documentation on inserting JSON from Python without hardcoding?

I've looked into COPY.
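COPY can work for this too: build a newline-delimited buffer of JSON documents and stream it in. A minimal sketch, assuming psycopg2 and an existing table `t` with a `json`/`jsonb` column `j` (the connection details are placeholders; note that COPY's text format requires doubling backslashes in the JSON strings):

```python
import io
import json

def json_copy_buffer(records):
    """Build an in-memory buffer with one JSON document per line,
    escaped for PostgreSQL's COPY text format (backslashes doubled)."""
    buf = io.StringIO()
    for rec in records:
        buf.write(json.dumps(rec).replace('\\', '\\\\') + '\n')
    buf.seek(0)
    return buf

# Hypothetical usage with psycopg2 (adjust DSN, table, and column names):
# import psycopg2
# conn = psycopg2.connect("dbname=mydb user=postgres")
# with conn, conn.cursor() as cur:
#     cur.copy_expert("COPY t (j) FROM STDIN", json_copy_buffer(rows))
```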

2 Answers

import json
import psycopg2

# Adjust the DSN to your own database
conn = psycopg2.connect("dbname=test user=postgres")
cursor = conn.cursor()

data = [1, [2, 3], {'a': [4, 5]}]
my_json = json.dumps(data)

# Pass the JSON string as a query parameter; never interpolate it yourself
insert_query = "insert into t (j) values (%s) returning j"
cursor.execute(insert_query, (my_json,))
print(cursor.fetchone()[0])

Output:

[1, [2, 3], {'a': [4, 5]}]
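The same `json.dumps` approach scales to many rows: build a list of one-tuple parameters and hand it to a bulk helper. A minimal sketch (the `execute_values` call is commented out because it assumes an open psycopg2 connection and an existing table `t`):

```python
import json

rows = [
    [1, [2, 3], {'a': [4, 5]}],
    {'b': 6},
]

# One (json_string,) tuple per row, ready for a bulk insert
params = [(json.dumps(r),) for r in rows]

# Hypothetical bulk insert with psycopg2:
# from psycopg2.extras import execute_values
# execute_values(cursor, "insert into t (j) values %s", params)
# conn.commit()
```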

1 Comment

Works like a charm! And don't forget conn.autocommit = True or conn.commit() to see the actual results in the database.

With dataset + psycopg2

import dataset

def construct_pg_url(postgres_user='postgres', postgres_password='',
                     postgres_host='localhost', postgres_port='5432',
                     postgres_database='postgres'):
    return (f"postgresql://{postgres_user}:{postgres_password}"
            f"@{postgres_host}:{postgres_port}/{postgres_database}")

pgconn = dataset.Database(url=construct_pg_url(), schema='my_schema')
table = pgconn['my_table']
rows = [dict(foo='bar', baz='foo')] * 10000
table.insert_many(rows)
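One refinement of the URL helper above: if the password contains reserved characters such as `@` or `:`, the raw concatenation produces a broken URL. A sketch using the standard library's `quote_plus` to guard against that:

```python
from urllib.parse import quote_plus

def construct_pg_url(user='postgres', password='', host='localhost',
                     port='5432', database='postgres'):
    # Percent-encode the password so characters like '@' don't break the URL
    return f"postgresql://{user}:{quote_plus(password)}@{host}:{port}/{database}"
```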

1 Comment

I tried pgAdmin's interface to import a CSV but failed badly; it was very time consuming. In the end I inserted my data directly into the database using psycopg2 and it worked well.
