I'm working with a somewhat large dataset (~30,000 records) that my Django app needs to retrieve on a regular basis. The data doesn't change often (maybe once a month or so), and when it does change, the updates happen in a single batch, so the access pattern I need to support is effectively read-only.
The total size of the dataset is about 20 MB, and my first thought is that I can load it into memory (possibly as a singleton on an object) and access it very quickly that way, though I'm wondering if there are other, more efficient ways of cutting fetch time by avoiding disk I/O. Would memcached be the best solution here? Or would loading it into an in-memory SQLite DB be better? Or should I simply load it into a plain in-memory variable at app startup?
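For the third option, this is roughly what I had in mind: a lazily loaded module-level variable, populated once per process. (`Record` here is just a stand-in for my actual model.)

```python
# Rough sketch of the "in-memory variable" option.
# "Record" is a placeholder for my actual model.
from myapp.models import Record

_RECORDS = None


def get_records():
    """Load the dataset from the DB once per process, then serve from memory."""
    global _RECORDS
    if _RECORDS is None:
        # values() skips model-instance overhead, which seems fine
        # for read-only data; this would hold ~30,000 dicts (~20 MB).
        _RECORDS = list(Record.objects.values())
    return _RECORDS
```

One thing I'm unsure about with this approach is that each worker process would hold its own ~20 MB copy, whereas memcached would share a single copy across processes.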