I've recently been experimenting with Postgres instead of BigQuery.
My table transactions_all has the following structure:
access_id (text)
serial_no (text)
transaction_id (text)
date_local (timestamp)
and a B-tree expression index on (((date_local)::date), serial_no).
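For reference, a minimal reproduction of the schema and index (the index name here is illustrative; column types are taken from the description above):

```sql
-- Sketch of the table described above.
CREATE TABLE transactions_all (
    access_id      text,
    serial_no      text,
    transaction_id text,
    date_local     timestamp
);

-- B-tree expression index: date part of date_local first, then serial_no.
CREATE INDEX transactions_all_date_serial_idx
    ON transactions_all (((date_local)::date), serial_no);
```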
After loading 500,000 rows (one month of data) into this table, performance is fine when querying the last 2 days like this:
SELECT *
FROM transactions_all
WHERE DATE(date_local) BETWEEN CURRENT_DATE - INTERVAL '1 day' AND CURRENT_DATE
AND access_id = 'accessid1'
AND serial_no = 'IO99267637'
But if I select the last 21 days like this:
SELECT *
FROM transactions_all
WHERE DATE(date_local) BETWEEN CURRENT_DATE - INTERVAL '20 day' AND CURRENT_DATE
AND access_id = 'accessid1'
AND serial_no = 'IO99267637'
then fetching the data takes multiple seconds instead of milliseconds.
Is this normal behaviour, or am I using the wrong index?
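To check whether the planner is still using the index for the 21-day range, I could capture the plan with EXPLAIN (ANALYZE, BUFFERS). A sketch of the diagnostic command for the slow query (not actual output from my system):

```sql
EXPLAIN (ANALYZE, BUFFERS, FORMAT TEXT)
SELECT *
FROM transactions_all
WHERE DATE(date_local) BETWEEN CURRENT_DATE - INTERVAL '20 day' AND CURRENT_DATE
  AND access_id = 'accessid1'
  AND serial_no = 'IO99267637';
```

The resulting plan would show whether the wider range falls back to a sequential scan or a much larger index range scan.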