
So I have a flatfile db in the format of username:$SHA$1010101010101010$010110010101010010101010100101010101001010:255.255.255.255:1342078265214

Each record is on a new line, about 5,000+ lines in total. I want to import it into a mysql table. Normally I'd do this using phpmyadmin and "file import", but now I want to automate the process by using php to download the db via ftp, clean up the existing table data and upload the updated db.

id (AUTO_INCREMENT) | username | password | ip | lastlogin

The script I've got below for the most part works, although php will generate an error: "PHP Fatal error: Maximum execution time of 30 seconds exceeded". I believe I could just increase this limit, but on the remote server I doubt I'll be allowed to, so I need to find a better way of doing this.

Only about 1000 records will get inserted into the database before that timeout...

The code I'm using is below. I will say right now I'm not a pro in php and this was mainly gathered up and cobbled together. I'm looking for some help to make this more efficient, as I've heard that doing inserts one row at a time like this is just bad. It sounds bad as well: a lot of disk scratching when I run this script on my local pc. I mean, why does it want to kill the hdd for such a seemingly simple task?

<?php
require ('Connections/local.php');

$wx = array_map('trim',file("auths.db"));
$username = array();
$password = array();
$ip = array();
$lastlogin = array();
foreach($wx as $i => $line) {

        $tmp = array_filter(explode(':',$line));
        $username[$i] = $tmp[0];
        $password[$i] = $tmp[1];
        $ip[$i] = $tmp[2];
        $lastlogin[$i] = $tmp[3];

mysql_query("INSERT INTO authdb (username,password,ip,lastlogin) VALUES('$username[$i]', '$password[$i]', '$ip[$i]', '$lastlogin[$i]') ") or die(mysql_error()); 
}
?>
  • Inserting multiple rows in one query should significantly improve performance. Commented Oct 10, 2012 at 18:41
  • Since you're using mysql_query without proper SQL escaping, you should probably be using mysqli or PDO for your own safety. Commented Oct 10, 2012 at 18:42
  • I'd recommend switching to fgets rather than splitting the whole file into an array; that would be a speed increase also. Commented Oct 10, 2012 at 18:48
  • I'll look into using fgets, though I think the real problem I'm having here is getting this into the mysql db in such a way that php doesn't time out beforehand. Commented Oct 10, 2012 at 19:11
  • @alfasin - I'll assume your native language isn't English. aswel --> as well (or 'also', or 'additionally') - it's a spelling mistake. :) Commented Oct 10, 2012 at 19:14
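The fgets suggestion above can be sketched roughly like this, streaming the file line by line instead of loading it all at once with file(). This is illustrative only: parseAuthLine is a made-up helper name, and the record format is assumed to be the colon-separated one shown in the question.

```php
<?php
// Stream auths.db line by line with fgets() rather than file().
// parseAuthLine() is an illustrative helper, not part of the original script.

// Split one "username:password:ip:lastlogin" record into its fields.
function parseAuthLine(string $line): ?array
{
    $parts = explode(':', trim($line));
    if (count($parts) < 4) {
        return null; // skip blank or malformed lines
    }
    return [
        'username'  => $parts[0],
        'password'  => $parts[1],
        'ip'        => $parts[2],
        'lastlogin' => $parts[3],
    ];
}

if (is_readable('auths.db')) {
    $handle = fopen('auths.db', 'r');
    while (($line = fgets($handle)) !== false) {
        $record = parseAuthLine($line);
        if ($record !== null) {
            // insert $record here, e.g. with a prepared statement
            // as shown in the answers below
        }
    }
    fclose($handle);
}
```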

2 Answers


Try this, with bound parameters and PDO.

<?php
require ('Connections/local.php');

$wx = array_map('trim', file("auths.db"));
$username = array();
$password = array();
$ip = array();
$lastlogin = array();

try {
    // $dbHost, $database, $dbUsername and $dbPassword are expected to come
    // from Connections/local.php. (Note: the host variable must not reuse
    // $ip, which is the array of parsed IP fields above.)
    $dbh = new PDO("mysql:host=$dbHost;dbname=$database", $dbUsername, $dbPassword);
    $dbh->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
} catch (PDOException $e) {
    exit('ERROR: ' . $e->getMessage());
}

$sql = "INSERT INTO authdb (username,password,ip,lastlogin) VALUES(:username, :password, :ip, :lastlogin)";
$statement = $dbh->prepare($sql);

foreach ($wx as $i => $line) {
    set_time_limit(0);

    $tmp = array_filter(explode(':', $line));
    $username[$i] = $tmp[0];
    $password[$i] = $tmp[1];
    $ip[$i] = $tmp[2];
    $lastlogin[$i] = $tmp[3];

    $params = array(":username"  => $username[$i],
                    ":password"  => $password[$i],
                    ":ip"        => $ip[$i],
                    ":lastlogin" => $lastlogin[$i]);
    $statement->execute($params);
}
?>

8 Comments

Thanks for sharing that. I tested it out straight away, though I got some bad news: php still timed out at 30 seconds. // 1,188 records got inserted. I tested the original script I had and managed // 1,085 inserted records. Is there any way to get around this php limit, other than increasing it?
What is the value for max_execution_time in your php.ini ? Are you running this from the command line?
I'm just running the php script from h**p://localhost ... Also php.ini has max_execution_time = 30, which is about right for the "PHP Fatal error: Maximum execution time of 30 seconds exceeded" errors I get. Like I said, I could change that here, but I don't think I'd be allowed to on the remote server, where I ideally want to have this script so it's quicker to get the db updated.
Try adding set_time_limit(0) inside your foreach loop.
I'm glad. This should work on the remote server, unless PHP is running in safe mode on it. Feel free to accept the answer I provided if it helped with your mysql. This bound parameter style is not only efficient, it's quite safe.

Instead of sending queries to the server one by one, in the form

insert into table (x,y,z) values (1,2,3)

you should use the extended insert syntax, as in:

insert into table (x,y,z) values (1,2,3),(4,5,6),(7,8,9),...

This will increase insert performance by miles. However, you need to be careful about how many rows you insert in one statement, since there is a limit (MySQL's max_allowed_packet) to how big a single SQL statement can be. So I'd say start with packs of 100 rows and see how it goes, then adjust the pack size accordingly. Chances are your insert time will go down to something like 5 seconds, putting it well under the max_execution_time limit.
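A rough sketch of this batching approach with PDO follows. The table and columns are from the question; $dbh is assumed to be a connected PDO instance as in the other answer, and buildInsertSql / importAuths are illustrative names, not standard functions.

```php
<?php
// Build "INSERT INTO authdb (c1,c2,...) VALUES (?,?,...),(?,?,...),..."
// with one placeholder group per row.
function buildInsertSql(array $columns, int $rowCount): string
{
    $row = '(' . implode(',', array_fill(0, count($columns), '?')) . ')';
    return 'INSERT INTO authdb (' . implode(',', $columns) . ') VALUES '
         . implode(',', array_fill(0, $rowCount, $row));
}

// Insert the whole file in chunks of $chunkSize rows per statement.
function importAuths(PDO $dbh, string $path, int $chunkSize = 100): void
{
    $rows = array_map(fn($l) => explode(':', trim($l)), file($path));
    foreach (array_chunk($rows, $chunkSize) as $chunk) {
        $stmt = $dbh->prepare(
            buildInsertSql(['username', 'password', 'ip', 'lastlogin'], count($chunk))
        );
        // Flatten the chunk so the values line up with the placeholders.
        $stmt->execute(array_merge(...$chunk));
    }
}

// Usage (assuming a $dbh connection as in the PDO answer):
// importAuths($dbh, 'auths.db');
```

Using one prepared statement per chunk keeps the parameter binding of the accepted answer while cutting the number of round trips to the server by the chunk size.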

