
I know this question has been asked quite a few times, but I cannot seem to get any of the described approaches working.

I am trying to import a file into a database. The file has columns of data separated by commas (i.e. CSV). I have made a test file that resembles the actual file I would like to read in; here are its contents.

Wind_Speed, Wind_Direction, Wind_Max
10, 360, 12
13, 320, 11
12, 340, 14
12, 360, 14

Here is what I have been doing already.

#!/usr/bin/python3.3

import csv, sqlite3

con = sqlite3.connect('test.db')
cur = con.cursor()

with open('test.dat', newline='') as f:
    reader = csv.reader(f)
    for column in reader:
        cur.execute('INSERT INTO jaws VALUES {}'.format(column))
con.commit()

And here is the error message:

Traceback (most recent call last):
  File "csvdict.py", line 11, in <module>
    cur.execute('INSERT INTO jaws VALUES {}'.format(column))
sqlite3.OperationalError: near "['Wind_Speed', ' Wind_Direction', ' Wind_Max']": syntax error

I am sure I am being stupid and missing something obvious but I can't see it.

Thanks

  • read the error message. it describes clearly where (and what) the problem is. Commented Nov 26, 2014 at 20:11
  • Your terminology is quite confusing. What you call "columns" are what everyone else calls "rows". Commented Nov 26, 2014 at 21:55

1 Answer


You are not formatting your data properly.

First of all, your error message shows that you are also inserting what I assume to be your column names (the header row).

Secondly, you should be careful how you insert into SQL: use parameterized queries (the ? placeholders) rather than string formatting, which is both the cause of your syntax error and a SQL injection risk.

#!/usr/bin/python3.3

import csv, sqlite3

con = sqlite3.connect('test.db')
cur = con.cursor()

with open('test.dat', newline='') as f:
    reader = csv.reader(f)
    next(reader)  # skip the header row (reader.next() only works in Python 2)
    for row in reader:
        cur.execute('INSERT INTO jaws VALUES (?,?,?)', tuple(row))
con.commit()

For a small file like your example, committing the changes once at the end of the input is fine, but for a larger file you might want to consider committing more often. I have run into issues because of this in the past.
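As a related option (a minimal sketch, not the code above): sqlite3's executemany can load all rows in one call, which keeps the inserts in a single transaction. The table layout and sample data mirror the question's test.dat; an in-memory database stands in for test.db.

```python
import csv, io, sqlite3

con = sqlite3.connect(':memory:')  # in-memory database for demonstration
cur = con.cursor()
cur.execute('CREATE TABLE jaws (Wind_Speed, Wind_Direction, Wind_Max)')

# io.StringIO stands in for open('test.dat', newline='')
data = io.StringIO('Wind_Speed, Wind_Direction, Wind_Max\n'
                   '10, 360, 12\n'
                   '13, 320, 11\n')
reader = csv.reader(data)
next(reader)  # skip the header row
# executemany consumes the reader row by row, one INSERT per row
cur.executemany('INSERT INTO jaws VALUES (?,?,?)', reader)
con.commit()
```

Since executemany accepts any iterable of row sequences, the csv reader can be passed to it directly.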

EDIT:

In response to the exception mentioned in your comment, here is an alternate way to skip the first line:

#!/usr/bin/python3.3

import csv, sqlite3

con = sqlite3.connect('test.db')
cur = con.cursor()

with open('test.dat', newline='') as f:
    reader = csv.reader(f)
    first = True
    for row in reader:
        if first:
            first = False
            continue
        cur.execute('INSERT INTO jaws VALUES (?,?,?)', tuple(row))
con.commit()

EDIT2:

In response to your follow-up comment, here is a way to skip an arbitrary number of leading lines:

#!/usr/bin/python3.3

import csv, sqlite3

con = sqlite3.connect('test.db')
cur = con.cursor()

with open('test.dat', newline='') as f:
    reader = csv.reader(f)
    skip_n = 4  # skips the first 4 lines
    skip_i = 0
    for row in reader:
        if skip_i < skip_n:
            skip_i += 1
            continue
        cur.execute('INSERT INTO jaws VALUES (?,?,?)', tuple(row))
con.commit()
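A minimal sketch of another way to do this: itertools.islice skips a fixed number of leading lines without a manual counter. The sample data below mirrors the question's test.dat (io.StringIO stands in for the open file).

```python
import csv, io
from itertools import islice

data = io.StringIO('Wind_Speed, Wind_Direction, Wind_Max\n'
                   '10, 360, 12\n'
                   '13, 320, 11\n')
reader = csv.reader(data)
# islice(reader, 1, None) yields everything from the second line onward,
# so the single header line is dropped; raise 1 to skip more lines
rows = list(islice(reader, 1, None))
print(rows)  # [['10', ' 360', ' 12'], ['13', ' 320', ' 11']]
```

Note that csv.reader keeps the space after each comma; call .strip() on the fields if that matters for your data.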

Comments

Thanks for your help Farmer Joe, I have been trying iterations of this to no avail. I am getting an AttributeError: '_csv.reader' object has no attribute 'next'.
@user3355986 Dropping that line will cause the headers to be inserted into your db...is that what you want?
@user3355986 See my edit for an alternative. I am using python2.7 so that could be why my code threw an exception for you.
@user3355986 Also, no need for insults, their comment may have been unhelpful but name calling only perpetuates the negativity. Share the love.
No, that is not what I wanted. I will try this code out and see. I am not sure I understand it correctly; would I be able to skip more lines by adding second = True and so on?