
I have a particular situation where my client requires periodically importing an MS Access database into his MySQL website database (so it's a remote database).

Because the hosting plan is shared hosting (not a VPS), the only way to do it is through PHP running an SQL query, since I don't have ODBC support on the hosting.

My current idea is this (obviously the client has an MS Windows OS):

  • Create a small C# application that converts the MS Access database into a big SQL query written to a file
  • The application will then use FTP credentials to send the file to a specified directory on the website
  • A PHP script will then run periodically (e.g. every 30 minutes), check if the file exists and, if so, import it into the database (a rough sketch of this step follows the list)
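
For reference, the simplest version of that periodic script might look something like the sketch below. The file name, credentials and the one-statement-per-line format are assumptions on my part, and a naive loop like this runs straight into the 30-second limit described further down:

$dump = 'upload.sql';               // name the C# tool uploads via FTP (assumed)
if (!file_exists($dump)) exit();    // nothing to import on this run

mysql_connect('localhost', 'db_user', 'db_pass');  // hypothetical credentials
mysql_select_db('client_db');                      // hypothetical database name

foreach (file($dump) as $query) {   // assumes one SQL statement per line
    mysql_query($query);
}

unlink($dump);                      // remove the dump so the next run is a no-op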

I know it's not the best approach, so I'm asking this question to find a different workaround for this problem. The client has already said that he wants to keep using his MS Access database.

The biggest problem I have is that scripts can run for only 30 seconds, which is obviously a problem when importing data.

  • I've used a similar process before: a script exported the Access database to a valid MySQL import SQL file and uploaded it to an FTP site, where another script on the remote server ran every night to check whether the file existed. If so, it imported the SQL file (after some security checks). Commented Mar 14, 2012 at 18:39
  • How do you handle the 30-second script execution limit? I don't know how big the database will be. Commented Mar 14, 2012 at 18:46
  • php.net/manual/en/function.set-time-limit.php Commented Mar 14, 2012 at 19:12
  • PHP is running in safe mode; I already know that function :( Commented Mar 14, 2012 at 19:15

1 Answer


To work around the 30-second limit, call your script repeatedly, and keep track of your progress. Here's one rough idea:

// Assumes a MySQL connection has already been opened (e.g. with mysql_connect()).

if (!file_exists('upload.sql')) exit();

$max = 2000; // maximum number of queries to execute per run

if (file_exists('progress.txt')) {
    $progress = (int) file_get_contents('progress.txt');
} else {
    $progress = 0;
}

// load the file into an array, expecting one query per line
$file = file('upload.sql');

$done = true; // assume we reach the end of the file unless we break early

foreach ($file as $current => $query) {
    if ($current < $progress) continue;   // skip the ones we've already done
    if ($current - $progress >= $max) {   // stop before we hit the max
        $done = false;
        break;
    }
    mysql_query($query);
}

// did we finish the file?
if ($done) {
    unlink('progress.txt');
    unlink('upload.sql');
} else {
    // $current is the first line that has NOT been executed yet
    file_put_contents('progress.txt', $current);
}
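
If the host's 30-second cap is the real constraint rather than a particular query count, the fixed $max could be replaced by a wall-clock check. This is only a sketch of that variation, reusing the $file, $progress and $done variables from the snippet above; the 25-second margin is an assumption, not something from the original answer:

$start = microtime(true);
$limit = 25; // seconds; assumed safety margin under the 30-second cap

foreach ($file as $current => $query) {
    if ($current < $progress) continue;        // skip the ones already done
    if (microtime(true) - $start >= $limit) {  // stop before the host kills the script
        $done = false;
        break;
    }
    mysql_query($query);
}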

1 Comment

Yeah, that's the same approach I thought about; however, I like the way you handle it in code (especially the $current < $progress check). I'll tell you the result in a few days. Thanks a lot.
