
What is a good way to insert a large amount of data into a Postgres table using Node?

We are using an API to fetch a JSON array with a lot of data (objects) from a third-party service, and we need to send this data to our Postgres database using a Node library.

1 Answer


If you have a large amount of data (i.e., lots of rows) in JSON form, it will probably be faster to pass the JSON directly to Postgres and have it parse it and insert the data into a table.

Example:

Table foo contains a single json column, data, and one row whose value is: [{"a":707,"b":12,"c":407}, {"a":221,"b":327,"c":19}, {"a":472,"b":797,"c":430}, {"a":515,"b":174,"c":198}, ..... up to 100k values]

Let's insert each JSON object of this array into a table target, which has corresponding integer columns a, b, and c:

INSERT INTO target (a,b,c) 
SELECT (j->>'a')::INTEGER, (j->>'b')::INTEGER, (j->>'c')::INTEGER 
FROM (SELECT json_array_elements(data) j from foo) f;

This inserts the 100k rows in less than a second, which is a lot faster than running the same number of individual INSERTs.

Postgres has many ways to manipulate JSON, so you can season to taste.
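On the Node side, here is a minimal sketch of how the fetched array could be handed to a query like the one above in a single round trip, assuming the pg library (node-postgres) and the same target(a, b, c) table; the connection string, API URL, and function names are illustrative placeholders, not part of the original answer.

    import { Pool } from "pg";

    const pool = new Pool({ connectionString: process.env.DATABASE_URL });

    // Insert every object in `rows` with a single query: the whole array is
    // bound as one json parameter and unpacked server-side by
    // json_array_elements, mirroring the SQL shown above.
    async function loadRows(rows: Array<{ a: number; b: number; c: number }>): Promise<void> {
      await pool.query(
        `INSERT INTO target (a, b, c)
         SELECT (j->>'a')::INTEGER, (j->>'b')::INTEGER, (j->>'c')::INTEGER
         FROM json_array_elements($1::json) AS j`,
        // Stringify explicitly: passing a JS array directly would make pg
        // encode it as a Postgres array literal rather than json.
        [JSON.stringify(rows)]
      );
    }

    // Example usage: fetch from the 3rd-party API (hypothetical URL) and load.
    async function main(): Promise<void> {
      const response = await fetch("https://thirdparty.example.com/data");
      const rows = await response.json();
      await loadRows(rows);
      await pool.end();
    }

    main().catch(console.error);

Because the whole array travels as one bound parameter, the batch is a single round trip, and there is no need to build a multi-row INSERT string by hand.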


1 Comment

Thanks! Exactly the answer I was looking for!
