
I have a large CSV file of 12000 rows and 90 columns.

I want to use the MySQL LOAD DATA INFILE statement to load it into my MySQL database.

But I keep getting an error saying that my CSV has duplicate values on the primary key.

I am sure that it does not have duplicates on the primary key.

What could be the problem?

Here is my code:

$sql = "LOAD DATA INFILE '/a_bysch_store (2).csv' INTO TABLE a_bysch"
. " FIELDS TERMINATED BY ','"
. " LINES TERMINATED BY '\r\n'"
. " IGNORE 1 LINES"; 

// Try to execute the query (not a prepared statement) and report the MySQL engine error
if (!($stmt = $mysqli->query($sql))) {
    echo "\nQuery execute failed: ERRNO: (" . $mysqli->errno . ") " . $mysqli->error;
}
  • Load it into something like Excel, then check whether it really does have duplicates. You could also drop the primary key temporarily for testing. Commented Apr 29, 2014 at 20:15
  • Possible duplicate of mysql duplicates with LOAD DATA INFILE Commented Sep 13, 2016 at 7:24
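As the first comment suggests, it is worth verifying the duplicates claim directly. The sketch below does that check in Python (the question's code is PHP, so this is only an illustrative standalone tool); the zero-based `key_column` index is an assumption you would adjust to match the table's primary-key column.

```python
import csv
from collections import Counter

def find_duplicate_keys(lines, key_column=0, skip_header=True):
    """Return {key_value: occurrence_count} for key values that appear
    more than once. `lines` is any iterable of CSV lines (an open file
    works); `key_column` is the zero-based index of the primary-key
    column -- an assumption, adjust it to match your table."""
    reader = csv.reader(lines)
    if skip_header:
        next(reader, None)  # skip the header row, like IGNORE 1 LINES
    counts = Counter(row[key_column] for row in reader if row)
    return {key: n for key, n in counts.items() if n > 1}

# Usage:
# with open('/a_bysch_store (2).csv', newline='') as f:
#     print(find_duplicate_keys(f))
```

If this prints an empty dict for the real file, the duplicates are not in the CSV itself and the problem lies elsewhere (for example, rows already present in the table).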

1 Answer


Instead of LOAD DATA INFILE, use LOAD DATA LOCAL INFILE. With LOCAL, duplicate-key errors become warnings, so the file should import with the duplicate rows skipped. This is the same behavior as the IGNORE keyword, so you can use either one. Note that the IGNORE 1 LINES clause you already have only skips the header line; it does not ignore duplicate keys.

Also, if your primary key is an AUTO_INCREMENT column, don't pass that value in the CSV file.
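Putting both suggestions together, the corrected statement might look like the sketch below. It is built as a Python string purely for illustration; the same literal could be assigned to `$sql` in the PHP snippet above. The trailing column list is an assumption: `col1` and `col2` are hypothetical placeholders for the real column names, and `@skip_id` assumes the CSV's first field is the id you want MySQL to regenerate.

```python
# Sketch of the corrected LOAD DATA statement (assumptions: the primary key
# is an AUTO_INCREMENT id in the first CSV column; col1/col2 are hypothetical
# placeholders for the remaining 89 column names).
sql = (
    "LOAD DATA LOCAL INFILE '/a_bysch_store (2).csv'"
    " IGNORE INTO TABLE a_bysch"    # IGNORE: duplicate-key errors become warnings
    " FIELDS TERMINATED BY ','"
    " LINES TERMINATED BY '\\r\\n'"
    " IGNORE 1 LINES"               # skips only the header row
    " (@skip_id, col1, col2)"       # discard the CSV id so AUTO_INCREMENT assigns one
)
print(sql)
```

One caveat: LOCAL must be enabled on both the client and the server (for example, `mysqli_options($mysqli, MYSQLI_OPT_LOCAL_INFILE, true)` on the PHP side), or the statement will fail for a different reason.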
