
I have a database of 20,000 email addresses, and it is continuously growing. I am sending 100 emails at a time and then hitting the same page again using PHP cURL, but the page still reaches the maximum execution time and dies. I am running this PHP script on a shared (HostGator) server. What should I do to eliminate this problem?

I am executing this script via:

exec("php-cli file.php > testoutput.php 2>&1 & echo $!", $output)
  • Do the work in smaller chunks scheduled by crontab; perhaps store which ID you got up to, so that the cron job can quickly continue with the next chunk.

2 Answers


You can add a column (or an ancillary table) to the database, and mark all addresses as initially "to be worked".
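For example, the schema change might look like this (the table and column names are assumptions, matching the SELECT further down):

ALTER TABLE addresses
    ADD COLUMN worked TINYINT NOT NULL DEFAULT 0,
    ADD COLUMN active TINYINT NOT NULL DEFAULT 1;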

Then your script could:

- SELECT the number of worked addresses and total addresses
- display a progress bar
- SELECT the lowest 100, or 50, or N addresses yet to be worked
- send the emails and mark them worked
- emit a snippet of JavaScript so the page refreshes itself (sketched below).

Then the client will see a page where a progress bar continuously advances, and never times out (as long as "N" is small enough).
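A minimal sketch of such a self-refreshing page (PDO; the database credentials, table name, and 2-second refresh delay are all assumptions):

$pdo = new PDO('mysql:host=localhost;dbname=mailer', 'user', 'pass');

$total  = (int)$pdo->query('SELECT COUNT(*) FROM addresses WHERE active = 1')->fetchColumn();
$worked = (int)$pdo->query('SELECT COUNT(*) FROM addresses WHERE worked = 1')->fetchColumn();

if ($worked < $total) {
    // ...select the next N addresses, send, and mark them worked
    // (see the loop further down)...

    // Ask the browser to reload the page so the next chunk is processed.
    echo '<script>setTimeout(function () { location.reload(); }, 2000);</script>';
}

$pct = $total > 0 ? round(100 * $worked / $total) : 100;
echo "<p>{$pct}% processed ({$worked} of {$total})</p>";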

You could also do this in AJAX, but that might be overkill.

Moreover, you can employ some tricks (if you aren't already) to speed up operations. One possibility is to "coalesce" several addresses as Bcc:, if the email body is identical; that way you deliver several messages in one call. Take care not to exceed whatever Bcc: limit your ISP might enforce. This strategy uses more resources ISP-side, so check with them.
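For instance, a sketch using PHP's mail() with a Bcc: header (the chunk size of 50, the sender address, and the $batch/$subject/$body variables are assumptions; check your ISP's actual limit):

// $batch is assumed to hold this run's addresses; the body is identical for all.
foreach (array_chunk($batch, 50) as $group) {
    $headers = "From: newsletter@example.com\r\n"
             . "Bcc: " . implode(', ', $group) . "\r\n";
    mail('undisclosed-recipients:;', $subject, $body, $headers);
}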

Another possibility (can be used together) is to send emails sorted by domain. Several mail servers attempt, if possible, to send emails to the same server in one connection to save resources and time. If you send a group of emails all to the same domain, you make things easier for the server.

In this last case you would SELECT emails like this:

SELECT * FROM addresses
    WHERE (worked=0 AND active = 1)
    ORDER BY SUBSTRING_INDEX(email, '@', -1)
    LIMIT 20;

Then (an example with the legacy mysql_* functions; PDO is better):

$sent = $failed = array();

while ($tuple = mysql_fetch_assoc($exec))
{
    if (send_mail_to($tuple['email']))   // send_mail_to() is a placeholder
        $sent[] = $tuple['id'];
    else
        $failed[] = $tuple['id'];
}
// Now mark both SENT and FAILED as worked,
// and deactivate the FAILED ones:
$sentm = implode(',', $sent);
$failm = implode(',', $failed);

// UPDATE addresses SET worked = 1 WHERE id IN ($sentm,$failm);
// UPDATE addresses SET active = 0 WHERE id IN ($failm);
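For comparison, a minimal sketch of the same loop with PDO (connection details and the send_mail_to() placeholder are assumptions):

$pdo = new PDO('mysql:host=localhost;dbname=mailer', 'user', 'pass');

$stmt = $pdo->query(
    "SELECT * FROM addresses
      WHERE worked = 0 AND active = 1
      ORDER BY SUBSTRING_INDEX(email, '@', -1)
      LIMIT 20"
);

$sent = $failed = array();
foreach ($stmt as $tuple) {
    if (send_mail_to($tuple['email']))   // placeholder, as above
        $sent[] = (int)$tuple['id'];
    else
        $failed[] = (int)$tuple['id'];
}

// Run the UPDATEs only when the lists are non-empty,
// otherwise "IN ()" would be a syntax error.
if ($sent || $failed) {
    $pdo->exec('UPDATE addresses SET worked = 1 WHERE id IN ('
        . implode(',', array_merge($sent, $failed)) . ')');
}
if ($failed) {
    $pdo->exec('UPDATE addresses SET active = 0 WHERE id IN ('
        . implode(',', $failed) . ')');
}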

If you saved the start time in a PHP session, you could even display a nice progress bar with a dashboard, such as:

+--------------------+
|########            | 
+--------------------+
40% processed
38% delivered
 2% failed
Expected time remaining 5 min 17"
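
A sketch of that arithmetic, assuming $total, $worked and $delivered counters computed from COUNT(*) queries like the ones above:

session_start();
if (!isset($_SESSION['start'])) {
    $_SESSION['start'] = time();         // when the whole run began
}

$elapsed = time() - $_SESSION['start'];
// Linear estimate: remaining time ≈ elapsed * (addresses left / addresses done).
$eta = $worked > 0 ? (int)round($elapsed * ($total - $worked) / $worked) : 0;

printf("%d%% processed\n", $total ? 100 * $worked / $total : 0);
printf("%d%% delivered\n", $total ? 100 * $delivered / $total : 0);
printf("%d%% failed\n",    $total ? 100 * ($worked - $delivered) / $total : 0);
printf("Expected time remaining %d min %d\"\n", (int)($eta / 60), $eta % 60);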

1 Comment

Currently I have done it by creating a background process and invoking it with PHP, and I also create a log. By making it a background process, the client does not need to wait anymore; I built a dashboard to show him how many mails are sent and how many remain, so I know whether all the emails were sent. Thanks for this answer.

The time limit can be removed via PHP's set_time_limit() function:

set_time_limit(0);

However, that will only solve the issue of a long-running process dying due to PHP itself noticing that the process was taking too long. While it is an important step, you should also anticipate other types of failures in long-running processes, by dividing the work up into smaller batches and ensuring your code is structured in a way that allows the process to be resumed without re-running the whole thing.
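For example, a sketch of a resumable batch runner that checkpoints the last processed ID (the checkpoint file, connection details, and send_mail_to() placeholder are all assumptions):

set_time_limit(0);                         // no PHP-side time limit

$pdo  = new PDO('mysql:host=localhost;dbname=mailer', 'user', 'pass');
$file = __DIR__ . '/last_id.txt';          // assumed checkpoint location
$last = is_file($file) ? (int)file_get_contents($file) : 0;

// One bounded batch per invocation; cron (or your exec() wrapper) re-runs it.
$stmt = $pdo->prepare('SELECT id, email FROM addresses WHERE id > ? ORDER BY id LIMIT 100');
$stmt->execute(array($last));

foreach ($stmt as $row) {
    send_mail_to($row['email']);           // placeholder
    file_put_contents($file, $row['id']);  // checkpoint after every send
}

If the script dies halfway through, the next run simply picks up after the last checkpointed ID instead of starting over.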

