In short: I want to dynamically convert every object column to string in all my Pandas dataframes. I've seen similar posts about converting a single column, but none of them deals with a dynamic conversion to strings.
I'm writing multiple JSON files to our SQL Server database, using Python 3.x. When I import the JSON files and store them in a Pandas DataFrame, all strings are stored with dtype object, since the length is unknown in advance. Hence, when I write the data to SQL Server, the chosen datatype is text rather than varchar(255).
Is there a way to dynamically convert all columns with dtype object to strings? The strings may be truncated to the first 255 characters.
I tried the following, but it made Python crash:
import requests
import pandas
import sqlalchemy

url = 'some-url-to-json-file'
connectionString = 'driver://user:pw@server/database'
engine = sqlalchemy.create_engine(connectionString)

response = requests.get(url)
pandasDF = pandas.DataFrame(response.json()['value'])

# Convert objects to strings
for cName in list(pandasDF.columns.values):
    if pandasDF[cName].dtype == 'object':
        pandasDF[cName] = pandasDF[cName].to_string

pandasDF.to_sql(tableName, engine, if_exists='append')
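To be clear about the behaviour I'm after, here is a minimal pandas-only sketch of the conversion step (the column names and sample values are made up for illustration):

```python
import pandas as pd

def truncate_object_columns(df, max_len=255):
    """Convert every object-dtype column to string, truncated to max_len characters."""
    for col in df.columns:
        if df[col].dtype == 'object':
            # astype(str) performs an element-wise string conversion;
            # .str[:max_len] then truncates each value
            df[col] = df[col].astype(str).str[:max_len]
    return df

# hypothetical sample data standing in for the JSON payload
df = pd.DataFrame({'a': ['x' * 300, 'short'], 'n': [1, 2]})
df = truncate_object_columns(df)
```

If it matters for the answer: I understand `to_sql` also accepts a `dtype` argument mapping column names to SQLAlchemy types (e.g. `sqlalchemy.types.VARCHAR(255)`), so a solution along those lines would also be acceptable.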