
I have a dataframe like this with more than 300 rows:

         name price percent  volume      Buy       Sell
1        BID   41.30 -0.36    62292.0     604.0   6067.0
2        BVH   49.00 -1.01    57041.0    3786.0   3510.0
3        CTD   67.80  6.94    68098.0    2929.0    576.0
4        CTG   23.45  0.43   298677.0   16965.0  20367.0
5        EIB   18.20 -0.27    10517.0     306.0    210.0

For each name I create one table in MySQL. Here is my code so far:

import sqlalchemy as sa

# select the name/price/percent/volume/Buy/Sell columns
vn30 = vn30_list.iloc[:, [10, 13, 12, 15, 25, 26]].dropna(how='all').fillna(0)
data = vn30_list.iloc[:, [13, 12, 15, 25, 26]].dropna(how='all').fillna(0)

data.columns = ['gia', 'percent', 'khoiluong', 'nnmua', 'nnban']
en = sa.create_engine('mysql+mysqlconnector://...', echo=True)

# insert each row into its own MySQL table, named after the ticker
for i in range(len(vn30)):
    macp = vn30.iloc[i, 0].lower()
    compare_item = vn30.iloc[i, 1]
    if compare_item == data.iloc[i, 0]:
        row = data.iloc[i:i + 1, :]
        row.to_sql(name=str(macp), con=en, if_exists='append',
                   index=False, schema='online')

Is there any way to make this faster for 300 rows? Thank you so much, and sorry for my English.
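One likely bottleneck is that each `to_sql` call opens its own connection and transaction. A minimal sketch of reusing a single transaction for all the per-name inserts, using a small hypothetical frame in place of `data` and an in-memory SQLite engine in place of the MySQL one (the real code would keep the MySQL connection string and the frames built above):

```python
import pandas as pd
import sqlalchemy as sa

# Hypothetical two-row frame standing in for `data`; the lowercase ticker
# is kept as the index so it can name the target table.
data = pd.DataFrame({
    'gia': [41.30, 49.00],
    'percent': [-0.36, -1.01],
    'khoiluong': [62292.0, 57041.0],
    'nnmua': [604.0, 3786.0],
    'nnban': [6067.0, 3510.0],
}, index=['bid', 'bvh'])

en = sa.create_engine('sqlite://')  # in-memory engine for illustration

# One connection and one transaction for every insert, instead of
# opening a fresh one inside each to_sql call.
with en.begin() as conn:
    for macp, row in data.groupby(level=0):
        # row is a one-row DataFrame; each ticker gets its own table
        row.to_sql(name=macp, con=conn, if_exists='append', index=False)
```

Passing the open `conn` to `to_sql` keeps all inserts in one transaction, which commits once at the end of the `with` block.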

1 Answer

# import the module
from sqlalchemy import create_engine

# create sqlalchemy engine
engine = create_engine("mysql+pymysql://{user}:{pw}@localhost/{db}".format(
    user="root", pw="12345", db="employee"))

# Insert whole DataFrame into MySQL
data.to_sql('book_details', con=engine, if_exists='append', chunksize=1000)

You can get all the details here: https://www.dataquest.io/blog/sql-insert-tutorial/


1 Comment

Thanks for the help. But my question is how to make it faster, because each row in the dataframe becomes one table; I am not inserting the whole dataframe into one table.
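If the schema allows it, a common alternative to one-table-per-row is a single table with the ticker as a column, so all 300 rows go in with one bulk `to_sql` call and per-name rows become a `WHERE` filter. A minimal sketch, using a hypothetical three-row frame and an in-memory SQLite engine for illustration (table and column names here are made up):

```python
import pandas as pd
import sqlalchemy as sa

en = sa.create_engine('sqlite://')  # in-memory engine for illustration

# Hypothetical frame standing in for the 300-row `data`, with the
# ticker kept as an ordinary column instead of a table name.
data = pd.DataFrame({
    'macp': ['bid', 'bvh', 'ctd'],
    'gia': [41.30, 49.00, 67.80],
    'percent': [-0.36, -1.01, 6.94],
})

# One bulk insert instead of 300 single-row inserts
data.to_sql('vn30_quotes', con=en, if_exists='append', index=False,
            chunksize=1000)

# Per-name rows are then a filter rather than a separate table
bid = pd.read_sql("SELECT * FROM vn30_quotes WHERE macp = 'bid'", en)
```

This trades 300 small transactions for one, which is usually where the time goes with many single-row inserts.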
