I have a MySQL table with about 10,000 entries. I want to display these entries sorted by name, showing only 20 entries per page.

My question is, is it more efficient to

  • Let the database sort it, i.e. load the corresponding 20 entries with a SELECT query using ORDER BY and LIMIT.

or should one rather

  • Sort the list of entries once, save it as an array in a file, then load the file and read only the 20 indices of interest.

Both seem terrible to me. I do not want to sort a table with 10,000 entries every time a user loads the page just to show 20 of them, nor do I want to load an array with more than 10,000 entries just to access the corresponding 20.

Remark: I am not asking "is php sort better than mysql 'order by'?" or "database sort vs. programmatic java sort" - I want to know whether it is better to presort the database once, save the result in an array, and then load that complete sorted array including all entries.

  • It is not clear how your question differs from the question you refer to which has already received some good answers. Commented Feb 8, 2016 at 15:47
  • For such a small data set, I doubt it makes much difference. Commented Feb 8, 2016 at 15:50
  • Well, this depends on the user really. If most of your users will spend going page by page in your pagination then it makes sense to store the results in an array, since in return that will make the interaction faster, probably better using AJAX in that case. Commented Feb 8, 2016 at 15:50
  • @drmonkeyninja The difference is that I am considering presorting the array - I mean that when someone calls the page, the array is already sorted and saved in a file like my_already_saved_array.php. I thus have the advantage that I don't waste time on sorting, but I have to load a huge array. Commented Feb 8, 2016 at 15:53

1 Answer

It depends what you mean by "better".

What makes things better? Speed, simplicity, versatility?

What happens if you save the file and then the table is updated? Your file will be stale and missing rows. You also can't guarantee that storing the rows in a file will actually be faster: MySQL is quite good at caching when the table isn't updated often.

That being said, if speed is that important to you, look at Memcached or Redis. Both are key-value stores that keep data in memory. You could implement Memcached caching with something like this (get_memcached_instance() and fetchSortedRowsFromDb() are placeholders for your own code):

function getTableRows()
{
    // Placeholder helper returning a shared Memcached instance.
    $memcached = get_memcached_instance();
    $result = $memcached->get('myTableRows');
    if (! is_array($result)) {
        // Cache miss: fetch the sorted rows and store them.
        // fetchSortedRowsFromDb() stands in for your own query code.
        $result = fetchSortedRowsFromDb();
        $memcached->set('myTableRows', $result);
    }
    return $result;
}

You can then read only the required array indices for your pagination. Keep in mind you would have to invalidate the cache every time the table is updated.
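One common invalidation pattern is to version the cache key rather than delete it: the writer bumps a version counter, so stale keys are simply never read again and expire on their own. This is a sketch, not a complete implementation; the key layout and helper names here are assumptions.

```php
<?php
// Build a versioned cache key; bumping the version "invalidates"
// every key built with the old version number.
function buildCacheKey(string $table, int $version): string
{
    return sprintf('%s:v%d:sortedRows', $table, $version);
}

// Call this from the same code path that writes to the table.
// Memcached::increment creates the counter (initial value 1) if
// it does not exist yet, then atomically increments it.
function invalidateTableCache(Memcached $memcached, string $table): void
{
    $memcached->increment($table . ':version', 1, 1);
}
```

Versioning avoids the race where one process deletes the key while another is refilling it with stale data; the cost is a little extra memory until the old entries expire.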

Is that much of a speed improvement even required, though? As your table gains more rows, caching everything will put more strain on PHP and may eventually cause memory issues. LIMIT and OFFSET handle this kind of paging easily, and assuming your table is indexed properly it shouldn't cause much of a performance hit.
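For reference, a minimal LIMIT/OFFSET pagination query might look like the sketch below, using PDO. The table name entries and the column name are placeholders; with an index on name, MySQL can walk the index in order instead of sorting all 10,000 rows.

```php
<?php
// Convert a 1-based page number into a zero-based row offset.
function pageOffset(int $page, int $perPage = 20): int
{
    return max(0, $page - 1) * $perPage;
}

// Fetch one page of rows, sorted by name ("entries"/"name" are
// placeholder names - substitute your own table and column).
function fetchPage(PDO $pdo, int $page, int $perPage = 20): array
{
    $stmt = $pdo->prepare(
        'SELECT * FROM entries ORDER BY name LIMIT :limit OFFSET :offset'
    );
    $stmt->bindValue(':limit', $perPage, PDO::PARAM_INT);
    $stmt->bindValue(':offset', pageOffset($page, $perPage), PDO::PARAM_INT);
    $stmt->execute();
    return $stmt->fetchAll(PDO::FETCH_ASSOC);
}
```

So page 1 reads rows 0-19, page 3 reads rows 40-59, and the database only ever ships 20 rows to PHP.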

