
I ran an update query on a table of 36 million rows. The query takes 45 minutes to run. The processed field is indexed, and the database is on an SSD.

UPDATE batch_item SET processed=true

Do you have a clue why this takes so long?

  • This updates all rows in the table, which will take a while. However, updating 36 million rows shouldn't take that long. My guess is that it was waiting for row locks during that time because other transactions were also updating some rows (see the diagnostic sketch after these comments). Commented Aug 23, 2018 at 14:57
  • I have queries like this running at the same time: SHOW TRANSACTION ISOLATION LEVEL Commented Aug 23, 2018 at 14:58
  • BTW: if you don't add a semicolon to the query, it might take forever (assuming the psql front end). Commented Aug 23, 2018 at 14:59
  • @joop I use DataGrip, and no semicolon is needed to run a query :-) Commented Aug 23, 2018 at 15:14
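To check whether the update really is stuck behind row locks, a minimal diagnostic sketch (assuming PostgreSQL 9.6 or later, where pg_blocking_pids is available) is to run this in a second session while the update executes:

-- List sessions that are currently blocked by another session's locks.
-- pg_blocking_pids(pid) returns the PIDs holding the locks that block that backend.
SELECT pid,
       pg_blocking_pids(pid) AS blocked_by,
       state,
       query
FROM pg_stat_activity
WHERE cardinality(pg_blocking_pids(pid)) > 0;

If the updating backend's PID shows up with a non-empty blocked_by array, lock contention from concurrent transactions is the likely cause of the 45-minute runtime.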

1 Answer


I don't know how important your index is or whether 100% availability of it is crucial, but dropping the index, setting the value, and adding the index back at the end may save you time.
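As a minimal sketch of that approach (the index name batch_item_processed_idx is an assumption; look up the real name in your schema first):

-- Hypothetical index name; check your schema for the actual one.
DROP INDEX batch_item_processed_idx;
UPDATE batch_item SET processed = true;
CREATE INDEX batch_item_processed_idx ON batch_item (processed);

Rebuilding the index once at the end is typically cheaper than maintaining it for every one of the 36 million row updates.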

There's some useful information on bulk update operations here: https://www.codacy.com/blog/how-to-update-large-tables-in-postgresql/
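Since that link is now dead (see the comment below), here is a hedged sketch of the usual batched-update technique for large tables, assuming batch_item has a primary key column named id:

-- Update in chunks so each transaction stays short and lock windows stay small.
-- Repeat this statement until it reports UPDATE 0.
UPDATE batch_item
SET processed = true
WHERE id IN (
    SELECT id
    FROM batch_item
    WHERE processed IS DISTINCT FROM true  -- also catches NULLs
    LIMIT 10000
);

Smaller transactions also let autovacuum reclaim dead row versions between batches instead of bloating the table in one long-running pass.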


1 Comment

This is a dead link. The information from this link should be included in the answer to avoid link rot.
