
I have a table in my database with a string primary key, which I set myself. I frequently need to check whether around 1000 "items" already exist in that table before adding them, because there must not be any duplicates. The result is 2 queries per item, or 2000 total, which adds 1-2 seconds of loading time.

If I try to insert the new row anyway, without checking for duplicates, it doesn't get inserted (which is what I want), but MySQL returns an error, which crashes my service.

My questions:

  1. Can I turn off those errors?
  2. Is there a better way to prevent inserting duplicates than making an extra query?
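For illustration (not part of the original question): the 2000 round trips described above can be collapsed into a single batched statement. The sketch below uses Python's `sqlite3` from the standard library, where the equivalent syntax is `INSERT OR IGNORE`; against MySQL the same pattern would use `INSERT IGNORE` through a driver such as `mysql.connector`. The table and column names are made up for the example.

```python
import sqlite3

# In-memory stand-in for the real table; the string primary key
# mirrors the setup described in the question.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (id TEXT PRIMARY KEY, payload TEXT)")

# Pretend one of the ~1000 incoming items already exists.
conn.execute("INSERT INTO items VALUES ('a', 'old')")
incoming = [("a", "new"), ("b", "new"), ("c", "new")]

# One batched statement instead of 2 queries per item.
# SQLite spells it "INSERT OR IGNORE"; MySQL uses "INSERT IGNORE".
conn.executemany("INSERT OR IGNORE INTO items VALUES (?, ?)", incoming)
conn.commit()

rows = dict(conn.execute("SELECT id, payload FROM items"))
print(rows)  # {'a': 'old', 'b': 'new', 'c': 'new'} - 'a' untouched
```
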
  • If you think my question is bad, please say why. Commented Jun 18, 2013 at 12:17
  • Yes, fix the errors, maybe? Or use insert ignore. Commented Jun 18, 2013 at 12:18
  • I need the error to go away, nothing else needs fixing. Commented Jun 18, 2013 at 12:20
  • If the service is crashing, then your code needs fixing! So yes, something else does need fixing: don't ignore errors, fix them. Commented Jun 18, 2013 at 12:20
  • If you have a solution, post your answer. I can think of "fix the error" myself. Commented Jun 18, 2013 at 12:22

2 Answers


You could use the IGNORE keyword to have duplicates silently dropped from your inserts:

INSERT IGNORE INTO yourTable VALUES (...)
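A runnable sketch of this behavior (my addition, using Python's `sqlite3` stdlib, whose `INSERT OR IGNORE` behaves like MySQL's `INSERT IGNORE`; the table name `yourTable` is taken from the answer above, the key value is made up):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE yourTable (pk TEXT PRIMARY KEY)")
conn.execute("INSERT INTO yourTable VALUES ('item-1')")

# Same key again: no exception is raised, the row is silently skipped.
conn.execute("INSERT OR IGNORE INTO yourTable VALUES ('item-1')")
count = conn.execute("SELECT COUNT(*) FROM yourTable").fetchone()[0]
print(count)  # 1 - the duplicate was dropped without an error
```
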



mysql returns an error, which crashes my service.

It's actually your own code that crashes your service; a MySQL error by itself cannot crash anything.

Is there a better way to prevent inserting duplicates

INSERT IGNORE
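To illustrate this answer's point that the crash comes from unhandled errors in the caller's code (my addition, sketched with Python's `sqlite3` stdlib; a MySQL driver such as `mysql.connector` raises its own analogous `IntegrityError` for duplicate keys):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (id TEXT PRIMARY KEY)")
conn.execute("INSERT INTO items VALUES ('dup')")

# A plain INSERT of a duplicate key raises IntegrityError; catching it
# keeps the service alive instead of letting it crash.
try:
    conn.execute("INSERT INTO items VALUES ('dup')")
except sqlite3.IntegrityError:
    print("duplicate skipped")  # handle it and carry on
```
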

