I'm building a site where users will be able to compare specs between products. Users can view the specs of a single product, in which case I only need the specs for that specific product, but they will also be able to compare the chosen product with any other product available on the site. I expect to end up with something like 75-100 products.

All the specs are available in a global CSV file. Each product has up to 37 features, and there are between 75 and 100 products, so the file is around 50 KB. I also have individual files for each product containing the same data, but only for that one product.

I was wondering what would be the best way to ensure a good performance:

  1. Load the whole file into $_SESSION to make it available all the time on all pages. All the data is loaded once and stays available without any additional reads, but could this hurt performance?

  2. Load files individually whenever they are needed; but that means that as soon as someone starts comparing products (and that's the goal), PHP will have to read files constantly.

  3. Store data in a database and access it each time it is necessary
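For scale, option 2 amounts to parsing one small per-product file on demand. A minimal sketch, assuming each per-product file holds one `feature,value` pair per line (the file layout and function name are my assumptions, not something from the site):

```php
<?php
// Read one product's spec file (e.g. specs/product-42.csv) on demand.
// Assumed layout: one "feature,value" row per line.
function loadProductSpecs(string $path): array
{
    $specs = [];
    $handle = fopen($path, 'r');
    if ($handle === false) {
        return $specs; // missing file: return an empty spec list
    }
    while (($row = fgetcsv($handle)) !== false) {
        if (count($row) >= 2) {
            $specs[$row[0]] = $row[1]; // feature name => value
        }
    }
    fclose($handle);
    return $specs;
}
```

At roughly 37 features per product, each read is only a few kilobytes, so even the "read files all the time" case is fairly cheap per request.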

What would be the best option? If you can think of any other solution, I'm of course interested as well.

Thanks


2 Answers


Since you're going to store data, you might as well put it into a database ;)

Since database engines are really good at caching, you probably won't have to worry about performance anytime soon, plus you've built in some headache prevention in case the 'file' suddenly gets a lot larger.

And if the database doesn't deliver enough performance, you can always resort to a MEMORY table in MySQL or memcached, for example.
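With the data in a database, a comparison page comes down to one indexed query rather than multiple file reads. A sketch using PDO, assuming a hypothetical `specs(product_id, feature, value)` table (the schema and function name are illustrative, not from the question):

```php
<?php
// Fetch the specs for two products in a single query.
// Assumed schema: specs(product_id INTEGER, feature TEXT, value TEXT).
function compareSpecs(PDO $db, int $a, int $b): array
{
    $stmt = $db->prepare(
        'SELECT product_id, feature, value FROM specs WHERE product_id IN (?, ?)'
    );
    $stmt->execute([$a, $b]);

    $byFeature = [];
    foreach ($stmt->fetchAll(PDO::FETCH_ASSOC) as $row) {
        // Group by feature so the two products line up side by side.
        $byFeature[$row['feature']][$row['product_id']] = $row['value'];
    }
    return $byFeature; // feature => [productId => value]
}
```

With at most 100 products and 37 features the table stays tiny (under 4,000 rows), so even without tuning this query is effectively instant.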


1 Comment

Thanks! I also thought of the database option, but what I'd like to understand is whether it's better to run something like 10 queries against the database or to load the whole file once?

Loading the csv into $_SESSION would still mean the session data is read and unserialized on every request, which doesn't sound very efficient.

MySQL is great for write-some / read-some workloads, but it seems to me that your situation is more of a write-seldom / read-often one.

Using an in-memory cache (memcached, for example, or a shared-memory extension like APCu) to store the contents of the csv would probably make much more sense.
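A minimal sketch of that idea with APCu (the cache key, TTL, and function name are my assumptions; the fallback branch is only there so the code also runs where the extension isn't loaded):

```php
<?php
// Cache the parsed csv in shared memory so it survives across requests.
// Re-parsing only happens on a cache miss (first request or after expiry).
function getAllSpecs(string $csvPath): array
{
    $parse = function () use ($csvPath) {
        $specs = [];
        $handle = fopen($csvPath, 'r');
        $header = fgetcsv($handle);                 // first row: column names
        while (($row = fgetcsv($handle)) !== false) {
            // Key each product by its first column, e.g. the product name.
            $specs[$row[0]] = array_combine($header, $row);
        }
        fclose($handle);
        return $specs;
    };

    // Use APCu shared memory when available; fall back to a plain parse
    // otherwise (e.g. on the CLI without apc.enable_cli).
    return function_exists('apcu_entry')
        ? apcu_entry('product_specs', $parse, 3600) // refresh hourly at most
        : $parse();
}
```

Since you only write 3-4 times a month, even a long TTL is safe; you can also clear the key explicitly whenever the csv is updated.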

1 Comment

Hello Stuart, thanks! I'm indeed in a situation where I'd write 3-4 times a month and read many times (and many times within the same session). I'll need to check whether my hosting company provides the solutions you proposed. Thanks for your suggestion!
