
I am trying to insert JSON data into a Postgres table using this query:

INSERT INTO rf_dsgns 
SELECT * FROM json_populate_recordset(NULL::rf_dsgns,
'[
  {
    "Tracking_ID": 2377125,
    "Constr_Zone": "Cleveland",
    "AF_Name": "PbCleveland_10236716P",       
    "Address": "4755 1/2 Rose Avenue",
    "Zip_Code": 44867,
    "Latitude": 5.8923486,
    "Longitude": -71.71052258,        
  },{
    "Tracking_ID": 2377126,
    "Constr_Zone": "Cleveland",
    "AF_Name": "PggClevelandCLE_25236718P",       
    "Street_Address": "4413 1/3 Clain Avenue",  
    "Zip_Code": 44225,
    "Latitude": 40.88960254,
    "Longitude": -71.20898567,        
  }]');

The data types I used when creating my table are integer, character, character, character, integer, numeric, numeric respectively. My create table script is:

CREATE TABLE rf_dsgns
(
    tracking_id integer,
    constr_zone character(300),
    af_name character(300),       
    address character(300),        
    zip_code integer,
    latitude numeric,
    longitude numeric       
);
  • Your JSON string is invalid, you need to remove the `,` after the longitude value. Commented Mar 20, 2019 at 19:13
  • It's still not working. I removed , after longitude and tried to run the query again. Commented Mar 20, 2019 at 19:21
  • Might be insightful to also disclose the table definition of rf_dsgns, its schema, and the search_path setting of your session. Commented Mar 20, 2019 at 19:24
  • Unrelated, but: ZIP codes are not integers! Commented Mar 20, 2019 at 19:40

1 Answer


You most probably created the table without using double quotes around the column names (which is a good thing). However, json_populate_recordset() matches column names case-sensitively, so the lower-case column names in the table are not matched by the mixed-case keys in the JSON.

This:

create table rf_dsgns ("Tracking_ID" int, "Constr_Zone" text, "AF_Name" text, "Address" text, "Zip_Code" text, "Latitude" numeric, "Longitude" numeric);

SELECT * 
FROM json_populate_recordset(NULL::rf_dsgns,
'[
  {
    "Tracking_ID": 2377125,
    "Constr_Zone": "Cleveland",
    "AF_Name": "PbCleveland_10236716P",       
    "Address": "4755 1/2 Rose Avenue",
    "Zip_Code": 44867,
    "Latitude": 5.8923486,
    "Longitude": -71.71052258
  },{
    "Tracking_ID": 2377126,
    "Constr_Zone": "Cleveland",
    "AF_Name": "PggClevelandCLE_25236718P",       
    "Street_Address": "4413 1/3 Clain Avenue",  
    "Zip_Code": 44225,
    "Latitude": 40.88960254,
    "Longitude": -71.20898567        
  }]');

returns (note that Address is NULL in the second row because that object uses the key "Street_Address" instead):

Tracking_ID | Constr_Zone | AF_Name                   | Address              | Zip_Code | Latitude    | Longitude   
------------+-------------+---------------------------+----------------------+----------+-------------+-------------
    2377125 | Cleveland   | PbCleveland_10236716P     | 4755 1/2 Rose Avenue |    44867 |   5.8923486 | -71.71052258
    2377126 | Cleveland   | PggClevelandCLE_25236718P |                      |    44225 | 40.88960254 | -71.20898567

However, when the table is created without quotes:

create table rf_dsgns (tracking_id int, constr_zone text, af_name text, address text, zip_code text, latitude numeric, longitude numeric);

then no columns are matched and every column will be NULL.

I would not re-create the table with case-sensitive column names. I can see two workarounds:

 1. Create a new type that uses the quoted identifiers but is otherwise identical to the table definition, and use that to map the JSON data.

 2. Use json_array_elements() instead and spell out all column names manually. That is a bit more typing, but it does not duplicate the type definition (and is actually a bit more flexible):

INSERT INTO rf_dsgns
SELECT (j ->> 'Tracking_ID')::int,
       j ->> 'Constr_Zone',
       j ->> 'AF_Name',
       j ->> 'Address',
       j ->> 'Zip_Code',
       (j ->> 'Latitude')::numeric,
       (j ->> 'Longitude')::numeric
FROM json_array_elements('.... your json here ') AS t(j);
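The first workaround could be sketched like this (the type name rf_dsgns_json is just an example; the quoted, mixed-case field names of the type match the JSON keys, and since the type's columns line up positionally with the table, a plain SELECT * works for the insert):

CREATE TYPE rf_dsgns_json AS (
    "Tracking_ID" integer,
    "Constr_Zone" text,
    "AF_Name"     text,
    "Address"     text,
    "Zip_Code"    integer,
    "Latitude"    numeric,
    "Longitude"   numeric
);

INSERT INTO rf_dsgns
SELECT *
FROM json_populate_recordset(NULL::rf_dsgns_json, '.... your json here ');

The drawback is that the type definition has to be kept in sync with the table by hand, which is why the json_array_elements() variant above can be preferable.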