
I have a folder containing a set of .txt files with the same structure.

The folder path is E:\DataExport.

It contains 1000 .txt files: text1.txt, text2.txt, .... Each .txt file contains price data for one ticker, and the data is updated daily. An example of one .txt file is below:

Ticker,Date/Time,Open,High,Low,Close,Volume
AAA,7/15/2010,19.581,20.347,18.429,18.698,174100
AAA,7/16/2010,19.002,19.002,17.855,17.855,109200
AAA,7/19/2010,19.002,19.002,17.777,17.777,104900
....

My question is, I want to:

  1. Load all the files into a MySQL database with a single script. I can import them one by one from the MySQL command line, but I do not know how to import them all at once into one table (the table is already created).
  2. Every day, when I get new data, how can I append only the new rows to that table in MySQL?

I would like a solution using either Python or MySQL. I have tried to google and apply some solutions, but without success; the data does not load into MySQL.

1 Answer

  1. You could use the Python package pandas.

    Read the data with read_csv into a DataFrame and use the to_sql method with the proper con and schema (see the first sketch below).

  2. You will have to keep track of what has already been imported. For example, you could record in a file or in the database that the last run stopped at the 54th line of the 1032nd file, and have the daily update read and import only the rest (see the second sketch below).
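
A minimal sketch of the bulk load, assuming MySQL runs on localhost, the target table is named `prices`, and SQLAlchemy plus a MySQL driver are installed (e.g. pip install pandas sqlalchemy mysql-connector-python). The connection string, database name, and table name are placeholders, so adjust them to your setup:

    import glob

    import pandas as pd
    from sqlalchemy import create_engine

    # Assumed connection details -- replace user, password and database name.
    engine = create_engine("mysql+mysqlconnector://user:password@localhost/mydb")

    # Loop over every .txt file in the export folder and append it
    # to the same, already-created table.
    for path in glob.glob(r"E:\DataExport\*.txt"):
        df = pd.read_csv(path, parse_dates=["Date/Time"])
        df.to_sql("prices", con=engine, if_exists="append", index=False)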
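
For the daily update, one way to keep track of progress is a small JSON state file that records how many data rows of each .txt file have already been imported; on each run you skip that many rows and load only the new ones. The state-file path and the connection details are the same assumptions as above:

    import glob
    import json
    import os

    import pandas as pd
    from sqlalchemy import create_engine

    engine = create_engine("mysql+mysqlconnector://user:password@localhost/mydb")
    STATE_FILE = r"E:\DataExport\import_state.json"  # assumed location

    # Load the previous import positions, if any.
    state = {}
    if os.path.exists(STATE_FILE):
        with open(STATE_FILE) as f:
            state = json.load(f)

    for path in glob.glob(r"E:\DataExport\*.txt"):
        done = state.get(path, 0)  # rows already imported from this file
        # Skip the rows that were imported earlier, keep the header row.
        df = pd.read_csv(path, skiprows=range(1, done + 1),
                         parse_dates=["Date/Time"])
        if not df.empty:
            df.to_sql("prices", con=engine, if_exists="append", index=False)
        state[path] = done + len(df)  # remember the new position

    # Persist the positions for the next daily run.
    with open(STATE_FILE, "w") as f:
        json.dump(state, f)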
