I am importing an Excel file into a Postgres database with Express and node-pg.

Currently I am looping through the Excel rows and executing an INSERT for every row, but I feel it's not the right way:

var arrSQLParams = [];

workbook.xlsx.readFile(excel_file).then(function () {
    // Get the first worksheet
    var worksheet = workbook.getWorksheet(1);
    // Loop through all rows
    worksheet.eachRow(function (row, rowNumber) {
        // Commit to DB only from row 2 and up, to exclude the header row of the Excel file
        if (rowNumber > 1) {
            // Loop through all cells and build the parameter array to pass to the DB function
            row.eachCell(function (cell, colNumber) {
                arrSQLParams.push(cell.value);
            });

            // Add the user id from the session to the array
            arrSQLParams.push(user);

            // Insert into DB
            db.query(strSQL, arrSQLParams, function (err, result) {
                if (err) {
                    console.log(err);
                    ret = false;
                }
            });

            // Empty the array for the next query
            arrSQLParams = [];
        }
    });
});

Is there a better way to do this to improve performance?

  • Since you are asking about performance, what is the maximum number of rows you may need to insert at a time, and the approximate size of each row? To put it another way: is it small enough to get all the data into Node.js memory at once? Commented Feb 27, 2017 at 13:27
  • About 1000 rows at a time Commented Feb 27, 2017 at 13:31
  • In that case the best way to do it is via pg-promise and its ColumnSet type. I can throw in an example later on, if you really need it, but it is basically all here: stackoverflow.com/questions/37300997/…. That's what you need for really good performance and flexibility at the same time ;) Commented Feb 27, 2017 at 13:47
  • stackoverflow.com/questions/24008668/… Commented Sep 13, 2023 at 4:20

2 Answers

Following the author's clarification that up to 1,000 records need to be inserted at a time, the solution suggested in Multi-row insert with pg-promise is exactly what is needed here, in terms of both performance and flexibility.
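
For illustration, here is a minimal sketch of that approach, using pg-promise's ColumnSet and helpers.insert together with the exceljs worksheet from the question. The table name products and the columns name, price and user_id are assumptions for the example; adjust them, along with the connection string, to your actual schema.

const pgp = require('pg-promise')();
const db = pgp('postgres://user:password@localhost:5432/mydb');

// The ColumnSet is created once and reused for every import
const cs = new pgp.helpers.ColumnSet(['name', 'price', 'user_id'], {table: 'products'});

workbook.xlsx.readFile(excel_file).then(function () {
    const worksheet = workbook.getWorksheet(1);
    const rows = [];
    worksheet.eachRow(function (row, rowNumber) {
        // Skip the header row
        if (rowNumber > 1) {
            rows.push({
                name: row.getCell(1).value,
                price: row.getCell(2).value,
                user_id: user
            });
        }
    });

    // helpers.insert generates a single multi-row INSERT statement for all rows
    const insert = pgp.helpers.insert(rows, cs);
    return db.none(insert);
}).then(function () {
    console.log('import complete');
}).catch(function (err) {
    console.log(err);
});

Because the whole batch is sent to Postgres as a single statement, this avoids the per-row round trips of the loop in the question.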

UPDATE

A must-read article: Data Imports.

You can use the pg-essential package: https://www.npmjs.com/package/pg-essential. It applies a patch on top of node-postgres, and you just need to call its executeBulkInsertion function. Create an array of the objects to be inserted and pass it to executeBulkInsertion.

let bulkData = [];
for (const user of users) {
    bulkData.push(user);
}
await db.executeBulkInsertion(bulkData, [array of column names], [table name]);
