I have a script that parses approximately 20,000 entries and saves them to Postgres. I originally wrapped all the inserts in a single transaction, which takes around 35 seconds to commit and also consumes a lot of memory, since the queries are held in memory until the commit.
I have found another approach: write the entries to a CSV file and load it into Postgres with "copy_from", which is very fast. Can anyone suggest whether I should open the file once at the start and close it just before loading into Postgres, or open and close the file each time a single entry is ready to write? (A sketch of the first variant is below.)
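For reference, this is a minimal sketch of the "open once, write as you parse" variant, assuming the script uses psycopg2 (whose `cursor.copy_from` this appears to be); the DSN, the `entries` table and its columns, and `parse_entries` are all placeholders for my real code:

```python
import csv
import psycopg2

conn = psycopg2.connect("dbname=mydb user=myuser")  # placeholder DSN

def parse_entries():
    # Stand-in for the real parser; yields ~20,000 rows one at a time.
    for i in range(20000):
        yield (i, f"name-{i}", i * 2)

# Open the CSV once, stream each parsed row to disk, and close the file
# before loading, so all 20,000 rows are never held in memory at once.
with open("entries.csv", "w", newline="") as f:
    writer = csv.writer(f, delimiter="\t")
    for row in parse_entries():
        writer.writerow(row)

# Reopen the file for reading and let COPY stream it into Postgres.
# Note: copy_from expects Postgres text format (tab-separated, no quoting),
# so values containing tabs, newlines, or backslashes would need escaping.
with open("entries.csv") as f, conn, conn.cursor() as cur:
    cur.copy_from(f, "entries", sep="\t", columns=("id", "name", "value"))
```

In this sketch the file is opened once and rows are streamed to it as they are parsed, so memory use stays constant regardless of the number of rows.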
Which approach would be best for keeping memory utilization low?