
I am trying to import some data from a JSON file. The JSON file is nested, and I want to import the child values. The JSON structure is something like this:

{
    "type": "FeatureCollection",
    "properties": {
        "zoom": 14,
        "x": 12302,
        "y": 7075
    },
    "features": [
        {
            "type": "FeatureCollection",
            "properties": {
                "layer": "poi",
                "version": 3,
                "extent": 4096
            },
            "features": [
                {
                    "type": "Feature",
                    "id": 4356,
                    "properties": {
                        "fid": "eg-34678h765",
                        "name": "Brooklyn Children's Museum"
                    },
                    "geometry": {
                        "type": "Point",
                        "coordinates": [
                            -73.944030,
                            40.674427
                        ]
                    }
                }
            ]
        }
    ]
}

I want to fetch the following child values (written the way I would access them in JS):

features[0].features[i].id
features[0].features[i].properties.fid
features[0].features[i].properties.name
features[0].features[i].geometry.coordinates[0]
features[0].features[i].geometry.coordinates[1]

into the columns id, fid, name, longitude and latitude of myTable.
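
For reference, the target table looks roughly like this (the column types are my assumption; adjust them as needed):

create table myTable (
    id        integer,
    fid       text,
    name      text,
    longitude numeric,
    latitude  numeric
);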

I came up with a solution, but it only inserts the parent values (type, properties, features), like this, through psql:

copy temp_json from 'E:\myJson.json';

insert into myTable ("type", "properties", "features")
select values->>'type'       as type,
       values->>'properties' as properties,
       values->>'features'   as features
from   (
           select json_array_elements(replace(values,'\','\\')::json) as values
           from   temp_json
       ) a;

where features is inserted as JSONB.
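
For reference, temp_json here is just a one-column staging table holding the raw JSON text; its definition is my assumption:

create table temp_json ("values" text);  -- "values" is a reserved word, hence the quotes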

How can I get my desired fields from the JSON file and insert them into the targeted columns of my table?

2 Answers


Try this:

select j2->>'id' as id,
       j2->'properties'->>'fid'  as fid,
       j2->'properties'->>'name' as name,
       max(case when l.k = 1 then l.cord end) as longitude,
       max(case when l.k = 2 then l.cord end) as latitude
from   temp_json
cross join json_array_elements(values->'features') as j1
cross join json_array_elements(j1->'features') as j2
cross join json_array_elements_text(j2->'geometry'->'coordinates')
           with ordinality as l(cord, k)
group by 1, 2, 3;

DEMO
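
If you want to load the result straight into myTable, one option is to wrap that query in an INSERT. The casts below are my assumption about the column types (integer id, numeric coordinates); adjust them to your actual table:

insert into myTable (id, fid, name, longitude, latitude)
select (j2->>'id')::int                                as id,
       j2->'properties'->>'fid'                        as fid,
       j2->'properties'->>'name'                       as name,
       max(case when l.k = 1 then l.cord end)::numeric as longitude,
       max(case when l.k = 2 then l.cord end)::numeric as latitude
from   temp_json
cross join json_array_elements(values->'features') as j1
cross join json_array_elements(j1->'features') as j2
cross join json_array_elements_text(j2->'geometry'->'coordinates')
           with ordinality as l(cord, k)
group by 1, 2, 3;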


One approach would be to use jq to extract the data in TSV format (for example), and then import it into the database.

The relevant jq filter would be very similar to your preferred format:

.features[0].features[]
| [.id, .properties.fid, .properties.name, .geometry.coordinates[:2][]]
| @tsv

Since your Postgres script is reading from a file anyway, it would probably be simplest to perform the conversion at the command line, along the lines of:

jq -rf totsv.jq E:\myJson.json > myExtract.tsv

where totsv.jq holds the jq script above.
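
The resulting TSV can then be loaded on the Postgres side with a plain copy, since the default text format is tab-delimited. Assuming myTable already exists with those five columns, a client-side \copy in psql could look like this:

\copy myTable (id, fid, name, longitude, latitude) from 'myExtract.tsv'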
