I'm trying to store measurement data in my PostgreSQL database using Python/Django. So far so good: I've set up one Docker container with Django and another with the PostgreSQL server. However, I'm getting close to 2M rows in my measurement table, and queries are starting to get really slow, and I'm not sure why, since I'm not running very intense queries.
This query
SELECT ••• FROM "measurement" WHERE "measurement"."device_id" = 26 ORDER BY "measurement"."measure_timestamp" DESC LIMIT 20
for example takes between 3 and 5 seconds to run, depending on which device I query.
I would expect this to run a lot faster, since I'm not doing anything fancy. The measurement table:
id INTEGER
measure_timestamp TIMESTAMP WITH TIME ZONE
sensor_height INTEGER
device_id INTEGER
with indices on id and measure_timestamp. The server doesn't look too busy; even though it only has 512M of memory, I have plenty left during queries.
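For reference, the table and indices as described correspond roughly to this DDL (the index name is my own; the index on id comes from Django's primary key):

```sql
-- Approximate DDL for the table described above (index name assumed).
CREATE TABLE measurement (
    id                INTEGER PRIMARY KEY,       -- indexed via the primary key
    measure_timestamp TIMESTAMP WITH TIME ZONE,
    sensor_height     INTEGER,
    device_id         INTEGER
);
CREATE INDEX measurement_ts_idx ON measurement (measure_timestamp);
```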
I configured the PostgreSQL server with shared_buffers=256MB and work_mem=128MB. The total database is just under 100MB, so it should easily fit in memory. If I run the query in pgAdmin, I see a lot of block I/O, so I suspect it has to read from disk, which is obviously slow.
Could anyone give me a few pointers in the right direction to find the issue?
EDIT: Added the output of EXPLAIN ANALYZE on the query: https://pastebin.com/H30JSuWa. I have now added an index on device_id, which helped a lot, but I would expect even quicker query times.
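If the single-column index on device_id still isn't fast enough, a composite index matching both the filter and the sort order might help, since Postgres can then read the newest 20 rows for a device straight off the index without a separate sort step. A sketch, assuming the column names above (index name is my own):

```sql
-- Serves: WHERE device_id = ? ORDER BY measure_timestamp DESC LIMIT 20
-- Rows for each device are stored in the index already sorted by timestamp,
-- so the LIMIT can stop after reading 20 index entries.
CREATE INDEX measurement_device_ts_idx
    ON measurement (device_id, measure_timestamp DESC);
```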
Run EXPLAIN (ANALYZE, BUFFERS) on the query and add the results to your question. That will help give an answer that is not based on guesses only.
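For example, prefixing the query from the question (with the elided column list replaced by * purely for illustration):

```sql
-- Shows the actual plan, row counts, timing, and shared-buffer hits vs. disk reads
EXPLAIN (ANALYZE, BUFFERS)
SELECT * FROM measurement
WHERE device_id = 26
ORDER BY measure_timestamp DESC
LIMIT 20;
```

The "Buffers:" lines in the output distinguish cache hits from disk reads, which would confirm or rule out the block I/O suspicion.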