
What is the fastest way to send batch requests to a Postgres database using Golang? Each request contains 500-200,000 rows.
The methods I know about are:
1. Using the database/sql package's transactions: Begin, Prepare, Commit (a minimal sketch of this is included below).
2. Sending all data in one statement.
3. Sending a list of statements using the Exec() method.
Is there some other way to send batch requests without opening a connection for every statement? If not, which of these is the best?

This question is similar to: Golang how do I batch sql statements with package database.sql
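
For reference, a minimal sketch of method 1 (one transaction, one prepared statement executed once per row) might look roughly like this; the items table, its columns, and the connection string are placeholders for illustration, not anything from a real schema:

    package main

    import (
    	"database/sql"
    	"log"

    	_ "github.com/lib/pq" // registers the "postgres" driver with database/sql
    )

    // insertWithPreparedStmt is a sketch of method 1: one transaction and one
    // prepared statement that is executed once per row.
    func insertWithPreparedStmt(db *sql.DB, rows [][2]string) error {
    	tx, err := db.Begin()
    	if err != nil {
    		return err
    	}
    	defer tx.Rollback() // no-op if Commit has already succeeded

    	stmt, err := tx.Prepare("INSERT INTO items (a, b) VALUES ($1, $2)")
    	if err != nil {
    		return err
    	}
    	defer stmt.Close()

    	for _, r := range rows {
    		if _, err := stmt.Exec(r[0], r[1]); err != nil {
    			return err
    		}
    	}
    	return tx.Commit()
    }

    func main() {
    	db, err := sql.Open("postgres", "postgres://user:pass@localhost/testdb?sslmode=disable")
    	if err != nil {
    		log.Fatal(err)
    	}
    	defer db.Close()

    	if err := insertWithPreparedStmt(db, [][2]string{{"x", "1"}, {"y", "2"}}); err != nil {
    		log.Fatal(err)
    	}
    }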

2 Comments
  • Any question that begins with "What is the fastest way" is best answered by benchmarks. Commented Dec 27, 2017 at 8:49
  • stackoverflow.com/a/35388503/5315974 Commented Dec 27, 2017 at 9:03

1 Answer


There is a somewhat old depesz blog post on this. His programs are Perl scripts, but if you concentrate on the SQL it still applies. Anyway, from the DB's perspective you can use COPY, or INSERT with many rows in the VALUES clause. Around 20 rows per INSERT looks like a good choice, but it is worth testing that in your case. If performance is the key factor, I would put around 2000-5000 rows per transaction. Also, from the DB's perspective a transaction and a session are two separate things, so you can open one session and run many transactions in it.
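
To make the COPY option concrete, here is a rough sketch using the lib/pq driver's CopyIn helper; the items table, its columns, and the connection string are placeholders, not anything prescribed by the question:

    package main

    import (
    	"database/sql"
    	"log"

    	"github.com/lib/pq" // used for pq.CopyIn; importing it also registers the "postgres" driver
    )

    // copyRows bulk-loads rows with COPY ... FROM STDIN via lib/pq.
    func copyRows(db *sql.DB, rows [][2]string) error {
    	tx, err := db.Begin()
    	if err != nil {
    		return err
    	}
    	defer tx.Rollback() // no-op if Commit has already succeeded

    	// pq.CopyIn builds a "COPY items (a, b) FROM STDIN" statement.
    	stmt, err := tx.Prepare(pq.CopyIn("items", "a", "b"))
    	if err != nil {
    		return err
    	}
    	for _, r := range rows {
    		if _, err := stmt.Exec(r[0], r[1]); err != nil {
    			return err
    		}
    	}
    	// A final Exec with no arguments flushes the buffered COPY data.
    	if _, err := stmt.Exec(); err != nil {
    		return err
    	}
    	if err := stmt.Close(); err != nil {
    		return err
    	}
    	return tx.Commit()
    }

    func main() {
    	db, err := sql.Open("postgres", "postgres://user:pass@localhost/testdb?sslmode=disable")
    	if err != nil {
    		log.Fatal(err)
    	}
    	defer db.Close()

    	if err := copyRows(db, [][2]string{{"x", "1"}, {"y", "2"}}); err != nil {
    		log.Fatal(err)
    	}
    }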

For PostgreSQL, starting a new session per operation is a really bad idea - the DB spawns a new process for each session. One of the answers to the question you referred to mentions this. So you open a connection, and then a transaction, as it should be done.
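
A rough sketch of that shape - one long-lived *sql.DB (a connection pool) reused across many transactions, each sending a multi-row INSERT - might look like this; the table, columns, batch size, and connection string are again placeholders:

    package main

    import (
    	"database/sql"
    	"fmt"
    	"log"
    	"strings"

    	_ "github.com/lib/pq" // registers the "postgres" driver
    )

    // insertMultiRow sends one INSERT with a multi-row VALUES list.
    // Note: Postgres caps a single statement at 65535 bind parameters,
    // so keep (rows per batch) * (columns) under that.
    func insertMultiRow(tx *sql.Tx, batch [][2]string) error {
    	placeholders := make([]string, 0, len(batch))
    	args := make([]interface{}, 0, len(batch)*2)
    	for i, r := range batch {
    		placeholders = append(placeholders, fmt.Sprintf("($%d, $%d)", i*2+1, i*2+2))
    		args = append(args, r[0], r[1])
    	}
    	query := "INSERT INTO items (a, b) VALUES " + strings.Join(placeholders, ", ")
    	_, err := tx.Exec(query, args...)
    	return err
    }

    func main() {
    	// The pool is opened once and reused for every batch and transaction.
    	db, err := sql.Open("postgres", "postgres://user:pass@localhost/testdb?sslmode=disable")
    	if err != nil {
    		log.Fatal(err)
    	}
    	defer db.Close()

    	rows := [][2]string{{"x", "1"}, {"y", "2"}, {"z", "3"}}
    	const batchSize = 2 // in practice something like 2000-5000 rows per transaction

    	for start := 0; start < len(rows); start += batchSize {
    		end := start + batchSize
    		if end > len(rows) {
    			end = len(rows)
    		}
    		tx, err := db.Begin()
    		if err != nil {
    			log.Fatal(err)
    		}
    		if err := insertMultiRow(tx, rows[start:end]); err != nil {
    			tx.Rollback()
    			log.Fatal(err)
    		}
    		if err := tx.Commit(); err != nil {
    			log.Fatal(err)
    		}
    	}
    }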


2 Comments

In the blog post, COPY outperforms the INSERT with many rows. Is there any reason not to use COPY over INSERT?
Not really. pg_dump generates dumps with COPY by default.
