We are using the django-dbarray module to store arrays in Postgres. While researching Postgres arrays, I've seen several developers recommend against storing more than some number of values in a single array; sometimes the figure is ten, and I've seen as high as thirty. Is there any consensus on how many values can or should be stored in an array before performance starts to degrade?
For reference, the database in question is mostly read-only.
We are trying to decide where we should use intermediate tables and where we should use Postgres arrays.
One additional, related question: suppose we create an index on a column that stores array values (say, bigint[]). My understanding is that the individual values inside the array would not be indexed, only the array value as a whole (I'm assuming it's treated as something like an opaque pointer, as in C). How efficient is such an index compared to simply having an intermediate table?
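For concreteness, the kind of index I mean is something like the following, created through a raw migration. The app, table, column, and index names here are made up for illustration, and I'm assuming a default (btree) index:

```python
# Hypothetical migration sketch: an index created directly on the array
# column. App, table, column, and index names are made up.
from django.db import migrations


class Migration(migrations.Migration):

    dependencies = [("myapp", "0001_initial")]

    operations = [
        migrations.RunSQL(
            # Default btree index on the whole array value; as I understand
            # it, this does not index the individual elements.
            sql="CREATE INDEX myapp_widget_tag_ids_idx ON myapp_widget (tag_ids);",
            reverse_sql="DROP INDEX myapp_widget_tag_ids_idx;",
        ),
    ]
```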
We may need to join against the individual values or reference specific values in a WHERE clause, and I'm concerned that performance could degrade to the point where we would be better off with an intermediate table wherever a join is needed.
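To be concrete, the two query shapes I'm comparing would look roughly like the sketch below (table and column names are hypothetical; the array version uses `= ANY(...)` to match an element of the array column):

```python
# Hypothetical comparison of the two query shapes, run through Django's
# raw SQL cursor. Table and column names are made up.
from django.db import connection


def widgets_with_tag_array(tag_id):
    """Filter on an element of an array column."""
    with connection.cursor() as cur:
        cur.execute(
            "SELECT id FROM myapp_widget WHERE %s = ANY(tag_ids)",
            [tag_id],
        )
        return [row[0] for row in cur.fetchall()]


def widgets_with_tag_join(tag_id):
    """The same filter expressed through an intermediate table."""
    with connection.cursor() as cur:
        cur.execute(
            """
            SELECT w.id
            FROM myapp_widget w
            JOIN myapp_widget_tags wt ON wt.widget_id = w.id
            WHERE wt.tag_id = %s
            """,
            [tag_id],
        )
        return [row[0] for row in cur.fetchall()]
```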
Lastly, given that we are using dbarray, how does its efficiency compare to simply using an intermediate table with the standard Django ORM (assume none of the joins or WHERE clauses from the question above)?
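For reference, the two schema options look roughly like this. The model and field names are made up, and I'm assuming dbarray exposes an IntegerArrayField; if the class name differs, the idea is the same:

```python
# Hypothetical sketch of the two options being compared; model and field
# names are made up.
from django.db import models
import dbarray  # assuming dbarray.IntegerArrayField exists; adjust if the name differs


# Option A: related ids stored in a Postgres array on the row itself.
class WidgetWithArray(models.Model):
    name = models.CharField(max_length=100)
    tag_ids = dbarray.IntegerArrayField(null=True)  # maps to an integer[] column


# Option B: a conventional intermediate table via the standard Django ORM.
class Widget(models.Model):
    name = models.CharField(max_length=100)


class Tag(models.Model):
    widgets = models.ManyToManyField(Widget, related_name="tags")
```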
Thank you