349

Is the database query faster if I insert multiple rows at once:

like

INSERT....

UNION

INSERT....

UNION

(I need to insert like 2-3000 rows)


5 Answers

1506

INSERT statements that use VALUES syntax can insert multiple rows. To do this, include multiple lists of column values, each enclosed within parentheses and separated by commas.

Example:

INSERT INTO tbl_name
    (a,b,c)
VALUES
    (1,2,3),
    (4,5,6),
    (7,8,9);

Source
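
The same multi-row pattern carries over to client libraries. A minimal sketch using Python's standard-library sqlite3 as a stand-in for a MySQL driver (the table and rows mirror the example above; `executemany` sends all parameter sets as one batch, and the placeholders keep the values parameterized):

```python
import sqlite3

# In-memory database as a stand-in for MySQL; the multi-row
# insert pattern is the same in both engines.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tbl_name (a INTEGER, b INTEGER, c INTEGER)")

rows = [(1, 2, 3), (4, 5, 6), (7, 8, 9)]

# One call, many rows: placeholders avoid SQL injection while
# still inserting the whole list in a single batch.
conn.executemany("INSERT INTO tbl_name (a, b, c) VALUES (?, ?, ?)", rows)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM tbl_name").fetchone()[0]
print(count)  # 3
```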


8 Comments

@RPK. I agree with you but I don't know her data source. As I already wrote, if she has a file, I would use load data syntax as cularis suggests. :)
It's also possible to use INSERT INTO Table SELECT 1, '14/05/2012', 3 UNION SELECT 2, '05/14/2012', 3. Of course, this will only be better if the inserted values are coming from different tables.
Helpful reference, because sometimes I just forget simple syntax.
is a, b and c here temporary variables storing the content of the rows?
@Lealo no, they are the table column names in which to insert the values in the same order.
64

If you have your data in a text file, you can use LOAD DATA INFILE.

When loading a table from a text file, use LOAD DATA INFILE. This is usually 20 times faster than using INSERT statements.

Optimizing INSERT Statements

You can find more tips on how to speed up your insert statements on the link above.
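
A minimal sketch of the syntax, assuming a tab-separated file at /tmp/rows.tsv whose columns line up with the table (the file path, delimiters, and column list are illustrative, not from the original answer):

```sql
LOAD DATA INFILE '/tmp/rows.tsv'
INTO TABLE tbl_name
FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\n'
(a, b, c);
```

Note that the file must be readable by the MySQL server itself; for a file on the client machine, LOAD DATA LOCAL INFILE is the variant to reach for.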

6 Comments

What about duplicated records?
@Matteo Duplicates will be either inserted or rejected by the database based on the schema you've defined.
Use mysql multiqueries
Second link 404.
The broken link "Speed of Insert statements" is now covered in: Optimizing INSERT Statements.
36

Just use a SELECT statement to get the values for many lines of the chosen columns and put these values into columns of another table in one go. As an example, columns "size" and "price" of the two tables "test_b" and "test_c" get filled with the columns "size" and "price" of table "test_a".

BEGIN;
INSERT INTO test_b (size, price)
  SELECT size, price
  FROM   test_a;
INSERT INTO test_c (size, price) 
  SELECT size, price
  FROM   test_a;
COMMIT;

The code is wrapped in BEGIN and COMMIT so that it takes effect only when both statements succeed; otherwise, everything up to that point is rolled back.
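
The all-or-nothing behavior can be demonstrated with Python's standard-library sqlite3 (a stand-in for MySQL; the failing second statement is contrived to force a rollback):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE test_b (size INTEGER, price INTEGER)")

try:
    with conn:  # opens a transaction; commits on success, rolls back on error
        conn.execute("INSERT INTO test_b VALUES (1, 10)")
        conn.execute("INSERT INTO missing_table VALUES (2, 20)")  # fails
except sqlite3.OperationalError:
    pass

# The first INSERT was rolled back along with the failed one.
print(conn.execute("SELECT COUNT(*) FROM test_b").fetchone()[0])  # 0
```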

2 Comments

This deserves more upvotes, using this you can bulk insert data retrieved from other tables
If only there were an explanation as to what is happening in this piece of code, given that I have to "bulk insert data retrieved from other tables"...
-2

Here is a PHP solution ready for use with an n:m (many-to-many relationship) table:

// get data
$table_1 = get_table_1_rows();
$table_2_fk_id = 123;

// prepare first part of the query (before values)
$query = "INSERT INTO `table` (
   `table_1_fk_id`,
   `table_2_fk_id`,
   `insert_date`
) VALUES ";

//loop the table 1 to get all foreign keys and put it in array
foreach($table_1 as $row) {
    $query_values[] = "(".$row["table_1_pk_id"].", $table_2_fk_id, NOW())";
}

// Implode the query values array with a comma and execute the query.
$db->query($query . implode(',',$query_values));

EDIT: After @john's comment, I decided to enhance this answer with a more efficient solution:

  • divides the query into multiple smaller queries
  • uses rtrim() to delete the last comma instead of implode()
// limit of query size (lines inserted per query)
$query_values  = "";
$limit         = 100;
$table_1       = get_table_1_rows();
$table_2_fk_id = 123;

$query = "INSERT INTO `table` (
   `table_1_fk_id`,
   `table_2_fk_id`,
   `insert_date`
) VALUES ";

$i = 0;
foreach($table_1 as $row) {
    $query_values .= "(".$row["table_1_pk_id"].", $table_2_fk_id, NOW()),";

    // entire table parsed or row limit reached :
    // -> execute and purge query_values
    if(++$i === count($table_1)
    || $i % $limit === 0) {
        $db->query($query . rtrim($query_values, ','));
        $query_values = "";
    }
}
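
The same chunking idea in Python, again using standard-library sqlite3 as a stand-in (the table, row count, and batch size are illustrative; each `executemany` call plays the role of one of the smaller queries above):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE link (t1_id INTEGER, t2_id INTEGER)")

rows = [(i, 123) for i in range(250)]  # e.g. 250 foreign-key pairs
limit = 100  # rows per batch, mirroring $limit above

# Insert in chunks of `limit` rows so no single statement grows too large.
for start in range(0, len(rows), limit):
    batch = rows[start:start + limit]
    conn.executemany("INSERT INTO link (t1_id, t2_id) VALUES (?, ?)", batch)
conn.commit()

print(conn.execute("SELECT COUNT(*) FROM link").fetchone()[0])  # 250
```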

2 Comments

Using implode() does circumvent the "last character" problem, but it creates a huge memory overhead. She asked for 3000 rows; imagine each row has 1 kB of data, that's 3 MB of raw data already. The array will take up 30 MB of memory, and she already consumes another 30 MB from $table_1, so the script would use 60 MB. Just saying, otherwise it's a good solution
it is useful for my situation.
-15
// db table name / blog_post / menu /  site_title
// Insert into Table (column names separated with comma)
$sql = "INSERT INTO product_cate (site_title, sub_title) 
  VALUES ('$site_title', '$sub_title')";

// db table name / blog_post / menu /  site_title
// Insert into Table (column names separated with comma)
$sql = "INSERT INTO menu (menu_title, sub_menu)
  VALUES ('$menu_title', '$sub_menu', )";

// db table name / blog_post /  menu /  site_title
// Insert into Table (column names separated with comma)
$sql = "INSERT INTO blog_post (post_title, post_des, post_img)
  VALUES ('$post_title ', '$post_des', '$post_img')";

3 Comments

Aside from how confusing this response is, you may also be vulnerable to SQL injection, assuming you're using PHP.
1. There is a bug in your SQL code. 2. The next $sql value will replace the previous $sql value.
Small hint for future readers: this is a really easy mistake to make, especially for a novice. NEVER NEVER EVER insert a raw string into a SQL query; there's a rather nasty attack that users of your website can use to call arbitrary queries. More info: owasp.org/www-community/attacks/SQL_Injection. Most libraries will have a sanitiser function to escape variables into safe forms that don't break out of the quotes.
