I have 24 databases with a table labeled email_queue.

I have another database with a list of all the databases that have the email_queue table in it.

I loop through the list of databases and query my email_queue table to send mails for each database.

The problem is that the PHP script gets held up on, let's say, the 3rd database while sending 500 emails, leaving the databases after it waiting for their turn.

I am trying to figure out how I can query all 24 databases at the same time and send each email queue at the same time.

Any suggestions?

  • Sounds like a mess. You should make them multiple tables, not multiple DBs. Commented Jan 5, 2014 at 3:09
  • This sounds like something you do regularly. You already have a separation built; might as well go with it now: 24 separate cron scripts to run each queue. Stagger them if you fear it will cause too much congestion. Commented Jan 5, 2014 at 3:12
  • Why do you have so many databases? Commented Jan 5, 2014 at 3:13
  • There are multiple DBs because they are for multiple sites I have. I like to keep them in separate DBs along with other tables and data. Commented Jan 5, 2014 at 3:14
  • I would write multiple crons to run multiple scripts, one for each queue, but the issue is that the number of queues will increase, so I would like a dynamic solution. There must be a way, but I am just drawing a blank. :( Commented Jan 5, 2014 at 3:15

3 Answers


I think having this many databases is probably a sign of bad design. If you can't change it and need to move forward now, I suggest one of two options:

  1. Run the same script with a parameter that selects which database to use; you should be able to find resources on how to do this.
  2. Use non-blocking queries; the rest of this answer covers this approach.

Here's a somewhat complete example using the mysqli extension (requires the mysqlnd driver):

$credentials = array(
    array(
        'host' => 'host1',
        'user' => 'user',
        'password' => 'password',
        'database' => 'database'
    ),
    array(
        'host' => 'host2',
        'user' => 'user',
        'password' => 'password',
        'database' => 'database'
    ),
    // credentials for other sites
);
$dbcs = array();
foreach ($credentials as $config) {
    $db = new mysqli(
        $config['host'],
        $config['user'],
        $config['password'], // note: the key must match the array above ('password', not 'pass')
        $config['database']
    );
    $dbcs[] = array($db);
    $query = ""; // here is your query to do whatever it is with your table
    $db->query($query, MYSQLI_ASYNC); // returns immediately; the result is reaped later
}

$results = array();
$errors = array();
$rejected = array();
$secondsToWait = 1;

while (!empty($dbcs)) {
    foreach ($dbcs as $key => $c) {
        $db = $c[0];
        // mysqli_poll() modifies its array arguments, so reset them each pass
        $errors = $rejected = array();
        if (mysqli_poll($c, $errors, $rejected, $secondsToWait) == 1) {
            $r = $db->reap_async_query();

            // here you would do your fetches for each query, such as
            $results[] = $r->fetch_assoc();

            // do what you need to do with the result

            // then clean up
            $r->free();
            $db->close();
            unset($dbcs[$key]);
        }
    }
}

Note that this approach has drawbacks; for example, a failed query may bring down the whole program.
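One way to mitigate that (a sketch, reusing the variables from the polling loop above) is to check reap_async_query() for false, so a single failed query is logged and skipped rather than aborting the whole run:

```php
// Replacement body for the polling loop: reap defensively.
if (mysqli_poll($c, $errors, $rejected, $secondsToWait) == 1) {
    $r = $db->reap_async_query();
    if ($r === false) {
        // the query failed on this connection; record it and move on
        error_log("Async query failed: " . $db->error);
    } else {
        $results[] = $r->fetch_assoc();
        $r->free();
    }
    $db->close();
    unset($dbcs[$key]);
}
```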



One way to do this is with PHP's curl_multi functions (curl_multi_init() and friends).

Split your script in two: one PHP file (say email_out.php) takes the database name (or some variable used to look up the database name; the switch can live either in the loop or in email_out.php) and does the mass email for that one database.

The second part uses the curl_multi functions to request email_out.php multiple times, effectively creating multiple separate connections to different DBs; the scripts can all finish at different times since they run in parallel. Essentially, your loop now adds the script to the curl multi handle multiple times with different arguments and then executes all of them asynchronously.

class Fork
{
    private $_handles = array();
    private $_mh; // curl multi handle, created in the constructor

    function __construct()
    {
        $this->_mh = curl_multi_init();
    }

    function add($url)
    {
        $ch = curl_init();
        curl_setopt($ch, CURLOPT_URL, $url);
        curl_setopt($ch, CURLOPT_HEADER, 0);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_TIMEOUT, 30);
        curl_multi_add_handle($this->_mh, $ch);
        $this->_handles[] = $ch;
        return $this;
    }

    function run()
    {
        $running = null;
        $data = array();
        do {
            curl_multi_exec($this->_mh, $running);
            usleep(250000); // avoid busy-waiting while the transfers complete
        } while ($running > 0);
        for($i=0; $i < count($this->_handles); $i++) {
            $out = curl_multi_getcontent($this->_handles[$i]);
            $data[$i] = json_decode($out);
            curl_multi_remove_handle($this->_mh, $this->_handles[$i]);
        }
        curl_multi_close($this->_mh);
        return $data;
    }
}

(from http://gonzalo123.com/2010/10/11/speed-up-php-scripts-with-asynchronous-database-queries/)

So your loop would look something like this:

$fork = new Fork;
for ($i = 0; $i < 24; $i++) {
    // curl needs an absolute URL here, e.g. "http://yourhost/email_out.php?t=" . $i
    $fork->add("email_out.php?t=" . $i);
}
$fork->run();
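The email_out.php worker itself is not shown above; a minimal sketch might look like the following (the site-to-database lookup and the queue columns are assumptions to adapt to your schema; it echoes JSON so the run() method's json_decode() has something to parse):

```php
<?php
// email_out.php -- drains the email_queue of one site's database.
$siteIndex = isset($_GET['t']) ? (int) $_GET['t'] : 0;

// Hypothetical lookup: map the t parameter to connection details.
$config = getDbConfigForSite($siteIndex);

$db = new mysqli($config['host'], $config['user'], $config['password'], $config['database']);
$result = $db->query("SELECT id, recipient, subject, body FROM email_queue");

$sent = 0;
while ($row = $result->fetch_assoc()) {
    if (mail($row['recipient'], $row['subject'], $row['body'])) {
        // remove the message from the queue once it has gone out
        $db->query("DELETE FROM email_queue WHERE id = " . (int) $row['id']);
        $sent++;
    }
}
echo json_encode(array('site' => $siteIndex, 'sent' => $sent));
```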

3 Comments

I am using your suggestion but... $test = $fork->run();print_r($test); returns Array ( [0] => [1] => [2] => [3] => [4] => [5] => [6] => [7] => [8] => [9] => [10] => [11] => [12] => [13] => [14] => [15] => [16] => [17] => [18] => [19] => [20] => [21] => [22] => [23] => [24] => )
And I have the email_sender.php script outputting success or fail.
I fixed it. I had an error with the link to my script. Thank you very much! This is working great!!

In your script try this.

  1. Use set_time_limit(0); in order to override PHP's max_execution_time
  2. Use the getopt() function to get the database name when the script is run from the command line (i.e. php script.php -d database1).
  3. From there do the logic.
  4. In your crontab make an entry for each database you would like to send emails from using the switch (-d) I specified in 2. If you have 20 databases then you must have 20 entries.

This way you will see a separate PHP process for each cron job, and you can isolate a database if you ever encounter an error with it.
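The steps above can be sketched as follows (the connection credentials and query are placeholders to replace with your own logic):

```php
<?php
// script.php -- run from cron as: php script.php -d database1
set_time_limit(0); // step 1: lift max_execution_time for long mail runs

// step 2: read the -d switch from the command line
$options = getopt("d:");
if (empty($options['d'])) {
    fwrite(STDERR, "Usage: php script.php -d <database>\n");
    exit(1);
}

// step 3: connect to the named database and send its queue
$db = new mysqli('localhost', 'user', 'password', $options['d']);
$result = $db->query("SELECT recipient, subject, body FROM email_queue");
while ($row = $result->fetch_assoc()) {
    mail($row['recipient'], $row['subject'], $row['body']);
}
```

For step 4, each crontab line then invokes the script with its own switch, e.g. */5 * * * * php /path/to/script.php -d database1.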
