I'm looking for a general rule of thumb on when it's faster to re-query the database and when it's faster to use Python to pull the data out of an already-fetched queryset's cache.
Let's assume I need to extract two things from the database at the same time: all pizzas, and a specific pizza with pk=5.
Which is more efficient:
    pizzas = Pizza.objects.all()
    specific_pizza = Pizza.objects.get(pk=5)
OR
    pizzas = Pizza.objects.all()
    for pizza in pizzas:
        if pizza.pk == 5:
            specific_pizza = pizza
            break
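(As an aside, a more compact sketch of the same in-Python scan, with identical behaviour since iterating the queryset still evaluates and caches it, might be:

    pizzas = Pizza.objects.all()
    # next() stops at the first match; returns None if no pizza with pk=5 was fetched
    specific_pizza = next((pizza for pizza in pizzas if pizza.pk == 5), None)
)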
Of course it depends on the database. For example, if the pizzas table has 10 million rows, re-querying SQL is obviously better; if it has 10 rows, Python is probably faster even though the pk is indexed.
Can anyone say which is more efficient in the middle range? For example, when pizzas has hundreds of rows? Thousands of rows?
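In case it helps frame answers, here is a rough benchmark sketch I could run myself. It assumes a Django project is already configured and is run inside python manage.py shell; the myapp.models import path is just a placeholder for wherever the Pizza model lives, and it assumes a row with pk=5 exists. The numbers will obviously depend on the database backend, table size, and network latency to the database.

    import timeit

    from myapp.models import Pizza  # hypothetical app/model path


    def via_second_query():
        # Two round trips: fetch everything, then an indexed lookup for pk=5.
        pizzas = list(Pizza.objects.all())
        specific_pizza = Pizza.objects.get(pk=5)
        return pizzas, specific_pizza


    def via_python_scan():
        # One round trip: fetch everything, then scan the cached rows in Python.
        pizzas = list(Pizza.objects.all())
        specific_pizza = next((p for p in pizzas if p.pk == 5), None)
        return pizzas, specific_pizza


    print("two queries:", timeit.timeit(via_second_query, number=100))
    print("python scan:", timeit.timeit(via_python_scan, number=100))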