
I've got an auto-emailer application which imports contacts from a CSV file and passes them to a PHP script via AJAX so the emails can be sent in chunks, working around an unresolvable PHP timeout issue.

Unfortunately, this method is not 100% reliable, so I want to create a new application which instead uses a MySQL table of contacts and does away with passing data via AJAX.

So let's say I have a table of 500 contacts that I want the PHP script to process, but it can only handle 20 at a time before the script hits the timeout and must be restarted. How can I automatically restart the script without calling it via AJAX?

  • Run it via CLI/cron instead of via the webserver? The front end can then simply queue stuff. Commented Jan 12, 2015 at 19:08
  • Personally, I'd look at why this method is not 100% reliable, rather than abandoning AJAX altogether. It seems by far the simplest solution to the problem. In what specific way is it failing? Commented Jan 12, 2015 at 19:12
  • Here's an example of a question I asked related to the previous version's problem: stackoverflow.com/questions/24080155/… It could never be resolved. Commented Jan 12, 2015 at 19:28

1 Answer


If the problem is the PHP timeout, you can make the PHP script call itself again when it finishes, passing a parameter through $_GET, $_POST or similar to indicate where it should start on the next run. In the case you describe, the script will reload itself 500/20 = 25 times. Each run picks up right where the previous one stopped, until there is nothing left to process.
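
As a minimal sketch of that self-relaunching pattern (the parameter name "offset" and the batch size are assumptions for illustration, not something from the question):

```php
<?php
// Hypothetical self-relaunching batch script.
$batchSize = 20;
$total     = 500;
$offset    = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;

// ... process contacts $offset .. $offset + $batchSize - 1 here ...

$next = $offset + $batchSize;
if ($next < $total) {
    // Relaunch the same script for the next chunk before the timeout hits.
    header('Location: ' . $_SERVER['PHP_SELF'] . '?offset=' . $next);
    exit;
}

echo 'All contacts processed.';
```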

When using MySQL, PHP waits for the query to finish before continuing. So you can read a chunk of data from the database, fetch it into an array, modify it, write it back to the database, and then reload the script with a parameter indicating where the next step should start and finish.
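
Combining that with the relaunch above, one chunked run could look roughly like this. The "contacts" table, its columns, and the connection details are assumptions for the sake of the example:

```php
<?php
$batchSize = 20;
$offset    = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;

// Hypothetical connection details.
$db   = new mysqli('localhost', 'user', 'password', 'mailer');
$stmt = $db->prepare('SELECT email, name FROM contacts ORDER BY id LIMIT ? OFFSET ?');
$stmt->bind_param('ii', $batchSize, $offset);
$stmt->execute();
$result = $stmt->get_result();

while ($row = $result->fetch_assoc()) {
    // Each run only sends a small batch, so it finishes before the timeout.
    mail($row['email'], 'Newsletter', 'Hello ' . $row['name']);
}

if ($result->num_rows === $batchSize) {
    // More rows may remain; hand off to the next chunk.
    header('Location: ' . $_SERVER['PHP_SELF'] . '?offset=' . ($offset + $batchSize));
    exit;
}
```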


1 Comment

Once I had a similar problem: I was taking data from a CSV file, modifying it in PHP, and then saving it to a MySQL database. For a while I used a solution similar to the one I described above, but now I remember that the definitive fix was to make the PHP script write one giant MySQL query containing all the data I wanted to save and execute it all at once. A simple loop was enough to build the giant query without much effort, and I was free of the PHP timeout because the whole job ran much faster on the MySQL server.
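
A rough sketch of that "one giant query" approach, assuming a contacts.csv with name,email rows and a hypothetical "contacts" table:

```php
<?php
$db   = new mysqli('localhost', 'user', 'password', 'mailer');
$rows = array_map('str_getcsv', file('contacts.csv')); // one name,email pair per line

$values = [];
foreach ($rows as $row) {
    $name  = $db->real_escape_string($row[0]);
    $email = $db->real_escape_string($row[1]);
    $values[] = "('$name', '$email')";
}

// One multi-row INSERT: a single round trip instead of one query per contact.
$db->query('INSERT INTO contacts (name, email) VALUES ' . implode(',', $values));
```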
