with open(ratingsfilepath, 'r') as data_input:
    for row in data_input:
        # split each line once and drop the trailing newline
        fields = row.rstrip('\n').split('::')
        cur_load.execute("INSERT INTO " + ratingstablename + " VALUES (%s, %s, %s)",
                         (fields[0], fields[1], fields[2]))
I have 10 million records in a .dat file (fields separated by ::), and I am loading them into a table with the Python script above. It takes nearly an hour to finish. Is there anything I can do to reduce the load time?
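I suspect the per-row execute() calls are the bottleneck. One direction I was considering is sending the rows in batches instead. Here is a minimal sketch, assuming the cursor comes from psycopg2 (the %s placeholders suggest it) and that the connection object is named conn_load (hypothetical name, not in my script above):

from psycopg2.extras import execute_values

with open(ratingsfilepath, 'r') as data_input:
    # keep only the first three fields of each line, matching the three columns
    rows = [tuple(line.rstrip('\n').split('::')[:3]) for line in data_input]

# execute_values batches many rows into each INSERT statement
# instead of issuing one round trip per row
execute_values(cur_load,
               "INSERT INTO " + ratingstablename + " VALUES %s",
               rows, page_size=10000)
conn_load.commit()  # hypothetical connection object; commit once at the end

Would something like this (or a bulk-load mechanism such as PostgreSQL's COPY, if that applies here) be the right way to go?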