
I've written a script to batch-process domains and retrieve data on each one. It connects to a remote page via curl and retrieves the required data for 30 domains at a time.

This page typically takes between 2 and 3 minutes to load and return the curl result; at that point, the details are parsed and placed into an array (by the page_rank_tools function).

Upon running this script via CRON, I keep getting the error 'MySQL server has gone away'.

Can anyone tell me if I'm missing something obvious that could be causing this?

// the script dies after 4 minutes, in time for the next cron run to start
set_time_limit(240);

include('../include_prehead.php');

$sql = "SELECT id, url FROM domains WHERE (provider_id = 9 OR provider_id = 10) AND google_page_rank IS NULL LIMIT 30";
$result = mysql_query($sql);

// build the list of URLs to look up, keyed by domain id
$url_list = array();
while ($row = mysql_fetch_assoc($result)) {
    $url_list[$row['id']] = $row['url'];
}

// curl the domain information page - typically takes about 3 minutes
$pr = page_rank_tools($url_list);

foreach ($pr as $p) {
    // update each domain that came back with a complete set of values
    if (isset($p['google_page_rank']) && isset($p['alexa_rank']) && isset($p['links_in_yahoo']) && isset($p['links_in_google'])) {

        $sql = "UPDATE domains SET google_page_rank = '" . $p['google_page_rank'] . "', alexa_rank = '" . $p['alexa_rank'] . "', links_in_yahoo = '" . $p['links_in_yahoo'] . "', links_in_google = '" . $p['links_in_google'] . "' WHERE id = '" . $p['id'] . "'";

        mysql_query($sql) or die(mysql_error());
    }
}

Thanks

CJ

1 Comment

  • Usually that's just a server timeout. Commented Jan 21, 2011 at 8:45

3 Answers


This happens because the MySQL connection has its own timeout, and while you are busy parsing your pages, it expires. You can try to increase this timeout with

ini_set('mysql.connect_timeout', 300);
ini_set('default_socket_timeout', 300);

(as mentioned in MySQL server has gone away - in exactly 60 seconds)

Or just call mysql_connect() again.
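As a minimal sketch of the reconnect approach (the credentials and database name here are hypothetical placeholders; in the real script they would come from whatever include_prehead.php sets up):

```php
// after the long curl call, check whether the connection survived;
// if not, tear it down and open a fresh one
// (host, user, password and database are placeholders)
if (!mysql_ping()) {
    mysql_close();
    $link = mysql_connect('localhost', 'db_user', 'db_pass');
    mysql_select_db('my_database', $link);
}
```

mysql_ping() reports whether the server is still reachable on the current connection, so the reconnect only happens when the connection has actually dropped.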


3 Comments

  • The ini set didn't work for me, but reconnecting (mysql_pconnect) did.
  • Try not to use a persistent connection.
  • I had a similar issue and solved it by using a try/catch block and reconnecting to the database if the exception was thrown.

Because the curl request takes so long, you could reconnect to your database before entering the loop that runs the updates.
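Applied to the script in the question, that might look like the following sketch (the host, user, password and database names are hypothetical placeholders for whatever include_prehead.php actually uses):

```php
// the curl call is the slow part - typically around 3 minutes
$pr = page_rank_tools($url_list);

// drop and re-establish the connection before the UPDATE loop,
// since the original connection has likely timed out by now
// (credentials are placeholders)
mysql_close();
$link = mysql_connect('localhost', 'db_user', 'db_pass');
mysql_select_db('my_database', $link);

foreach ($pr as $p) {
    // ... run the UPDATE queries as before
}
```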



There are many reasons why this error occurs. See the list here; it may be something you can fix quite easily: MySQL Server has gone away

Comments
