
I have a situation where my script parses approximately 20,000 entries and saves them to the database. I used a single transaction, which takes around 35 seconds to commit and also consumes a lot of memory, since the queries are held in memory until the commit.

I have found another way: write the entries to a CSV file and then load it into Postgres using "copy_from", which is very fast. Can anyone suggest whether I should open the file once at the start and close it when loading into Postgres, or open and close it each time a single entry is ready to be written?

What would be the best approach to minimize memory utilization?
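For reference, a minimal sketch of the copy_from approach being described, assuming psycopg2, a hypothetical `entries` table with `name` and `value` columns, and a `parse_entries()` placeholder standing in for the script's parser. It uses a single in-memory buffer opened once for all rows rather than a file on disk, and assumes the values contain no tabs or newlines (copy_from's text format does not quote fields the way true CSV does):

```python
import csv
import io

import psycopg2

conn = psycopg2.connect("dbname=mydb user=me")  # hypothetical connection string

# Build the whole payload in one buffer, opened once for all rows.
buf = io.StringIO()
writer = csv.writer(buf, delimiter="\t")
for entry in parse_entries():          # parse_entries() is a placeholder for the script's parser
    writer.writerow([entry.name, entry.value])

buf.seek(0)                            # rewind so copy_from reads from the start
with conn.cursor() as cur:
    cur.copy_from(buf, "entries", columns=("name", "value"))
conn.commit()
```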

1 Answer


Reduce the size of your transactions?
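A minimal sketch of what that could look like with psycopg2, committing every `batch_size` rows so the pending transaction never holds all 20,000 entries at once (`batch_size`, `parse_entries()`, and the `entries` table are placeholders, not anything from the original question):

```python
import psycopg2

conn = psycopg2.connect("dbname=mydb user=me")  # hypothetical connection string
batch_size = 1000  # trade-off: smaller batches hold less pending work, larger ones commit faster

with conn.cursor() as cur:
    for i, entry in enumerate(parse_entries(), start=1):  # parse_entries() stands in for the parser
        cur.execute(
            "INSERT INTO entries (name, value) VALUES (%s, %s)",
            (entry.name, entry.value),
        )
        if i % batch_size == 0:
            conn.commit()  # flush this batch so pending rows stop accumulating
    conn.commit()          # commit any leftover rows from the last partial batch
```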


3 Comments

If I commit every 1,000 or 5,000 records, the cumulative time for all 20,000 records is higher than with a single 20,000-record transaction.
It's always a trade-off between memory usage and speed.
What is the best transaction size?
