I have datetimes in the format 2016-01-31T20:13:48.000+02:00 stored as an object column in pandas. What would be the best way to convert it for transferring to an SQL database? For SQL, I am using MySQL. I need to store the full value, including the time zone.
- Do you need to store all of the date/time info from the original format in the database? If not, which info do you need to store? MySQL's recommended format is 'YYYY-MM-DD hh:mm:ss'. – hknjj (Nov 11, 2022)
- Yes, I need to store it with the time zone. Maybe I should choose another SQL database, not MySQL? – DeBe (Nov 11, 2022)
- Time zones are tricky to handle in SQL; can't you store the values as UTC and then display them to each user in their own time zone? – hknjj (Nov 11, 2022)
- I'm working with a big CSV; those dates are already in it, so I need to show them in SQL. – DeBe (Nov 11, 2022)
1 Answer
How you store this in a SQL database is up to you; to answer the main question, though, here's how you could convert the strings into datetime objects.
import pandas as pd
from datetime import datetime

def convert_to_datetime(value):
    # parse an ISO 8601 string such as '2016-01-31T20:13:48.000+02:00'
    # into a timezone-aware datetime object
    return datetime.strptime(value, "%Y-%m-%dT%H:%M:%S.%f%z")

# example DataFrame
df = pd.DataFrame({'date_col': ['2016-01-02T20:13:48.000+02:00',
                                '2016-02-02T20:13:48.000+02:00',
                                '2016-03-03T20:13:48.000+02:00']})

# convert the string values into datetime objects for all rows
df['new_date_col'] = df.apply(lambda x: convert_to_datetime(x['date_col']), axis=1)

# drop the original string column
df = df.drop(columns=['date_col'])
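As an aside (a sketch, not required), pandas can also parse the whole column in one vectorized call instead of the apply() step:

# vectorized alternative to the apply() step above (run instead of it, before dropping 'date_col')
df['new_date_col'] = pd.to_datetime(df['date_col'], format="%Y-%m-%dT%H:%M:%S.%f%z")

If the column mixes different UTC offsets, pass utc=True so everything is converted to a single UTC-based dtype.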
Then you could use pandas.DataFrame.to_sql() to store the result in any SQL database supported by the sqlalchemy package. See https://pandas.pydata.org/pandas-docs/stable/reference/api/pandas.DataFrame.to_sql.html
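For example, here is a minimal sketch of that write step, assuming a local MySQL instance reachable through the pymysql driver and a table named events (the connection string and table name are placeholders). Since a MySQL DATETIME column does not keep the UTC offset, this follows the suggestion from the comments: keep the original offset in its own column and normalize the datetimes to UTC before writing.

from sqlalchemy import create_engine

# placeholder connection string -- adjust user, password, host and database name
engine = create_engine("mysql+pymysql://user:password@localhost/mydb")

# keep the original UTC offset (e.g. '+0200') in a separate column so no information is lost
df['utc_offset'] = df['new_date_col'].apply(lambda d: d.strftime('%z'))

# normalize to UTC and drop the tzinfo so the values fit a plain DATETIME column
df['new_date_col'] = pd.to_datetime(df['new_date_col'], utc=True).dt.tz_localize(None)

# 'events' is an example table name; to_sql creates it if it does not exist
df.to_sql('events', con=engine, if_exists='append', index=False)

With the UTC datetime and the stored offset you can reconstruct the original local time whenever you read the rows back.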