I have a really annoying daily routine: I have to upload a CSV sheet to our database. Since the data source has a simple REST API, it is possible to request all the data as JSON from a URL. For that I implemented a small Python script which loads the JSON and inserts all the data into my DB. The problem is that the inserts are really slow (I expected less than 10 minutes), but after 3 hours I gave up. Here is my code (changed a little for better understanding):
    import json
    import urllib

    import MySQLdb

    db = MySQLdb.connect(host="...",    # your host, usually localhost
                         user="...",    # your username
                         passwd="...",  # your password
                         db="...")      # name of the database

    with db:
        cur = db.cursor()
        JSONURL = URL[0][y] + URL2 + URL3 + URL4 + URL5  # URL parts are defined elsewhere
        response = urllib.urlopen(JSONURL)
        data = json.loads(response.read())
        for row in data["results"]:
            with db:  # opens and commits a transaction for every single row
                sql = ("INSERT INTO t_data (col1,col2,col3) VALUES(" +
                       row[0] + "," + row[1] + "," + row[2] + ")")
                cur.execute(sql)
As you can see, I use MySQL. Maybe I could use a prepared statement (but I don't know whether that works in Python, and if yes, how do I do it?). Is it possible to change MySQL settings to get better performance?
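What I have in mind is something like this, though I am not sure whether this is how it is done with MySQLdb (a sketch reusing the cursor, table, and data["results"] structure from above; %s is MySQLdb's placeholder style):

    # Sketch: parameterized insert -- the driver escapes the values,
    # replacing the manual string concatenation above.
    sql = "INSERT INTO t_data (col1,col2,col3) VALUES (%s, %s, %s)"
    for row in data["results"]:
        cur.execute(sql, (row[0], row[1], row[2]))  # assumes each row holds three values
    db.commit()  # commit once at the end instead of once per row

I am looking forward to any help.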
Use a bulk insert instead of running the insert query again and again. Here is the link for it: dev.mysql.com/doc/refman/5.5/en/…
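For example, something along these lines with executemany (a sketch based on the code in the question; for a plain INSERT like this, MySQLdb batches the rows into a single multi-row statement):

    # Sketch: collect all rows first, then insert them in one batch
    # inside a single transaction.
    rows = [(r[0], r[1], r[2]) for r in data["results"]]
    cur.executemany(
        "INSERT INTO t_data (col1,col2,col3) VALUES (%s, %s, %s)",
        rows)
    db.commit()  # one commit for the whole batch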