
For loading huge amounts of data into MySQL, LOAD DATA INFILE is by far the fastest option. Unfortunately, while it can be made to behave like INSERT IGNORE or REPLACE, ON DUPLICATE KEY UPDATE is not currently supported.

However, ON DUPLICATE KEY UPDATE has advantages over REPLACE. The latter performs a delete followed by an insert whenever a duplicate exists, which adds index-maintenance overhead. It also means auto-increment ids do not stay the same across a replace.
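The difference can be illustrated with a small sketch (the table and values are hypothetical):

```sql
-- Hypothetical table illustrating the difference:
CREATE TABLE t (
  id INT AUTO_INCREMENT PRIMARY KEY,
  k  INT UNIQUE,
  v  INT
);

INSERT INTO t (k, v) VALUES (1, 10);      -- creates the row with id = 1

-- REPLACE deletes the conflicting row and inserts a new one,
-- so a fresh AUTO_INCREMENT id is allocated:
REPLACE INTO t (k, v) VALUES (1, 20);     -- the row now has id = 2

-- ON DUPLICATE KEY UPDATE modifies the existing row in place,
-- so the id is preserved:
INSERT INTO t (k, v) VALUES (1, 30)
ON DUPLICATE KEY UPDATE v = VALUES(v);    -- id stays 2, v becomes 30
```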

How can ON DUPLICATE KEY UPDATE be emulated when using LOAD DATA INFILE?


3 Answers


These steps can be used to emulate this functionality:

  1. Create a new temporary table.

    CREATE TEMPORARY TABLE temporary_table LIKE target_table;
    
  2. Optionally, drop all indices from the temporary table to speed things up.

    SHOW INDEX FROM temporary_table;
    DROP INDEX `PRIMARY` ON temporary_table;
    DROP INDEX `some_other_index` ON temporary_table;
    
  3. Load the CSV into the temporary table.

    LOAD DATA INFILE 'your_file.csv'
    INTO TABLE temporary_table
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    (field1, field2);
    
  4. Copy the data using ON DUPLICATE KEY UPDATE.

    SHOW COLUMNS FROM target_table;
    INSERT INTO target_table
    SELECT * FROM temporary_table
    ON DUPLICATE KEY UPDATE field1 = VALUES(field1), field2 = VALUES(field2);
    
  5. Remove the temporary table.

    DROP TEMPORARY TABLE temporary_table;
    

Using SHOW INDEX FROM and SHOW COLUMNS FROM, this process can be automated for any given table.
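For example, the update clause of step 4 can itself be generated from INFORMATION_SCHEMA (a sketch; `your_db` stands in for the actual schema name):

```sql
-- Generates "field1 = VALUES(field1), field2 = VALUES(field2), ..."
-- for every column of target_table; paste the result into step 4.
SELECT GROUP_CONCAT(
         CONCAT(COLUMN_NAME, ' = VALUES(', COLUMN_NAME, ')')
         SEPARATOR ', ')
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_SCHEMA = 'your_db'
  AND TABLE_NAME   = 'target_table';
```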


13 Comments

I suggest rather using INSERT INTO target_table SELECT column_name1, column_name2 FROM temporary_table, because * will include the primary key and cause items with the same primary key (in the case of an auto_increment primary key) in the main table to be updated. Otherwise this worked for me!
Thanks for the answer, worked for me after a little SQL juggling. My SQL-fu is very rusty, nice to learn neat techniques like this.
Thanks from here too - works brilliantly - didn't think it was possible to do a ON DUPLICATE KEY with LOAD DATA INFILE, so never tried before.
@verisimilitude - Dropping indexes will improve performance for loading data from file!
Steps 1) and 2) can be done in a single statement: CREATE TEMPORARY TABLE temporary_table SELECT * FROM target_table LIMIT 0;

The first two steps in the answer shared by Jan can be replaced with the single query below, which creates a new table with the same column structure but without any indexes.

CREATE TEMPORARY TABLE temporary_table SELECT * FROM target_table WHERE 1=0;

Instead of:

  1. Create a new temporary table.

    CREATE TEMPORARY TABLE temporary_table LIKE target_table;
    
  2. Optionally, drop all indices from the temporary table to speed things up.

    SHOW INDEX FROM temporary_table;
    DROP INDEX `PRIMARY` ON temporary_table;
    DROP INDEX `some_other_index` ON temporary_table;
    



Non-LOCAL Versus LOCAL Operation

The LOCAL modifier affects these aspects of LOAD DATA, compared to non-LOCAL operation:

  • It changes the expected location of the input file; see Input File Location.
  • It changes the statement security requirements; see Security Requirements.
  • It has the same effect as the IGNORE modifier on the interpretation of input file contents and error handling; see Duplicate-Key and Error Handling, and Column Value Assignment.

LOCAL works only if the server and your client both have been configured to permit it. For example, if mysqld was started with the local_infile system variable disabled, LOCAL produces an error. See Section 6.1.6, “Security Considerations for LOAD DATA LOCAL”.
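A minimal sketch of enabling LOCAL on both sides (the privilege name and file path are illustrative):

```sql
-- On the server (requires the SYSTEM_VARIABLES_ADMIN or SUPER privilege):
SET GLOBAL local_infile = 1;

-- From a client started with local_infile enabled
-- (e.g. mysql --local-infile=1), the statement reads the
-- file on the client host instead of the server:
LOAD DATA LOCAL INFILE 'your_file.csv'
INTO TABLE temporary_table
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(field1, field2);
```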

