I have a .csv file with 9k rows. When I use the script from the PHP manual:
<?php
$row = 1;
if (($handle = fopen("test.csv", "r")) !== FALSE) {
    while (($data = fgetcsv($handle, 1000, ";")) !== FALSE) {
        $num = count($data);
        echo "<p> $num fields in line $row: <br /></p>\n";
        $row++;
        for ($c = 0; $c < $num; $c++) {
            echo $data[$c] . "<br />\n";
        }
    }
    fclose($handle);
}
?>
I get every single cell of the CSV file in my output, as expected.
The problem occurs when I want to treat every row as a record, create an object from it, and store it in the database. I modified the script like this ($attributes is an array of property names defined earlier in my code):
$row = 1;
$attributes = array(/* my column/property names, defined elsewhere */);
if (($handle = fopen("test.csv", "r")) !== FALSE) {
    while (($data = fgetcsv($handle, 1000, ";")) !== FALSE) {
        $num = count($data);
        $record = new Record($db);
        echo "<p> $num fields in line $row: <br /></p>\n";
        $row++;
        for ($c = 0; $c < $num; $c++) {
            // braces make the intent explicit: the property named $attributes[$c]
            $record->{$attributes[$c]} = utf8_encode($data[$c]);
        }
        $record->Store();
    }
    fclose($handle);
}
In this case only about 2k records end up in my PostgreSQL database, and no exception or error is shown. I have no idea why the loop stops. Every time I run the script it loads a different number of records into the database, somewhere between 1.6k and 2.1k.
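Since the count varies between runs, my guess is that the script is being cut off externally rather than the loop finishing on its own. To rule out a silent timeout or a suppressed error, I am going to add something like this at the top of the script (these are standard PHP settings; whether a timeout is really the cause is only my assumption):

    // Surface anything PHP might be hiding before the import runs.
    error_reporting(E_ALL);           // report every notice, warning and error
    ini_set('display_errors', '1');   // print them instead of discarding them
    set_time_limit(0);                // lift the default 30-second execution limit
    ini_set('memory_limit', '256M');  // raise the memory ceiling, just in case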
I have no limits set on my PostgreSQL server (at least none that I am aware of).
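To verify that PostgreSQL is really not rejecting anything, I could also wrap the Store() call inside the loop like this ($stored would start at 0 before the loop; Record and Store() are my own classes, and I don't know yet whether Store() throws on failure at all, so this is just a sketch):

    // Inside the while loop; $stored is initialized to 0 before it.
    try {
        $record->Store();             // my own persistence call
        $stored++;
    } catch (Exception $e) {
        // stop at the first failing row and show which one it was
        die("Row $row failed to store: " . $e->getMessage());
    }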
Can anyone explain what I am doing wrong?