I just started using PostgreSQL 9.5 and have run into my first problem with a jsonb column. I have been trying to find an answer to this for a while but am failing badly. Can someone help?

I have a JSON array in Python containing JSON objects like this:

[{"name":"foo", "age":"18"}, {"name":"bar", "age":"18"}]

I'm trying to insert this into a jsonb column like this:

COPY person(person_jsonb) FROM '/path/to/my/json/file.json';

But only one row gets inserted. I want each JSON object in the array to become a new row, like this:

1. {"name":"foo", "age":"18"}
2. {"name":"bar", "age":"18"}

I also tried:

cur.execute("INSERT INTO person(person_jsonb) VALUES (%s)",
            (json.dumps(data['person']),))

Still only one row gets inserted. Can someone please help?

EDIT: Added Python code as requested

import psycopg2, json

con = None
orders_file_path = '/path/to/my/json/person.json'

try:
    with open(orders_file_path) as data_file:
        data = json.load(data_file)
    con = psycopg2.connect(...)
    cur = con.cursor()
    person = data['person']
    cur.execute("""
                    INSERT INTO orders(orders_jsonb)
                    VALUES (%s)
                """, (json.dumps(person), ))
    con.commit()

except psycopg2.DatabaseError as e:
    if con:
        con.rollback()

finally:
    if con:
        con.close()

person.json file:

{"person":[{"name":"foo", "age":"18"}, {"name":"bar", "age":"18"}]}
  • Show some lines of the file you are copying from, and the relevant code where the insert is attempted. Commented Aug 31, 2016 at 13:40
  • Thanks for the feedback. I have made some edits to include further details :) Commented Aug 31, 2016 at 14:09

1 Answer


Assuming the simplest schema:

CREATE TABLE test(data jsonb);

Option 1: parse the JSON in Python

Since each row has to be inserted separately, you can parse the JSON on the Python side, split the top-level array, and use cursor.executemany to execute the INSERT once per element:

import json
import psycopg2

con = psycopg2.connect('...')

data = json.loads('[{"name":"foo", "age":"18"}, {"name":"bar", "age":"18"}]')

with con.cursor() as cur:
    cur.executemany('INSERT INTO test(data) VALUES(%s)',
                    [(json.dumps(d),) for d in data])

con.commit()
con.close()
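
As a side note, psycopg2 also ships a Json adapter in psycopg2.extras (available since psycopg2 2.5) that serializes a Python object for you; the executemany call above could equivalently be written as this sketch:

from psycopg2.extras import Json

# Json(d) adapts the dict d as a JSON query parameter,
# so the explicit json.dumps call is no longer needed.
with con.cursor() as cur:
    cur.executemany('INSERT INTO test(data) VALUES(%s)',
                    [(Json(d),) for d in data])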

Option 2: parse the JSON in PostgreSQL

Another option is to push the JSON processing to the PostgreSQL side using json_array_elements:

import psycopg2

con = psycopg2.connect('...')

data = '[{"name":"foo", "age":"18"}, {"name":"bar", "age":"18"}]'

with con.cursor() as cur:
    cur.execute('INSERT INTO test(data) SELECT * FROM json_array_elements(%s)',
                (data,))

con.commit()
con.close()
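
To see why this yields one row per object, you can run the set-returning function on its own; a quick illustrative check (run before the connection is closed), assuming psycopg2's default JSON typecasting:

# json_array_elements expands a JSON array into one row per element,
# which is what makes the INSERT ... SELECT above insert one row per object.
with con.cursor() as cur:
    cur.execute('SELECT * FROM json_array_elements(%s)', (data,))
    print(cur.fetchall())
    # e.g. [({'name': 'foo', 'age': '18'},), ({'name': 'bar', 'age': '18'},)]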

1 Comment

execute_batch (in psycopg2.extras) will give better performance than executemany
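
For reference, a minimal sketch of what that would look like in Option 1, assuming psycopg2 >= 2.7 (which introduced execute_batch):

from psycopg2.extras import execute_batch

# execute_batch groups many parameter sets into fewer server round trips,
# whereas executemany effectively runs one statement per parameter set.
with con.cursor() as cur:
    execute_batch(cur,
                  'INSERT INTO test(data) VALUES(%s)',
                  [(json.dumps(d),) for d in data])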
