
I am trying to move data from Azure Blob storage to Azure SQL Data Warehouse. The Azure blob contains a JSON file.

I am getting the following exception while moving createdDate into a datetime-type column in SQL.
{ "createdDate":"2016-07-13 15:24:58.000" }

Copy activity encountered a user error at Sink:tcp:database.windows.net,1433 side: ErrorCode=UserErrorInvalidDataValue,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Column 'createdDate' contains an invalid value '2016-07-13 15:24:58.000'. Cannot convert '2016-07-13 15:24:58.000' to type 'DateTime' with format 'yyyy-MM-dd HH:mm:ss.fffffff'.,Source=Microsoft.DataTransfer.Common,''Type=System.FormatException,Message=String was not recognized as a valid DateTime.,Source=mscorlib,'.
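
For context, the source dataset is defined along these lines (trimmed to the relevant column; the dataset name, linked service name and folder path are placeholders, not the real ones):

    {
      "name": "BlobJsonInput",
      "properties": {
        "type": "AzureBlob",
        "linkedServiceName": "AzureStorageLinkedService",
        "structure": [
          { "name": "createdDate", "type": "String" }
        ],
        "typeProperties": {
          "folderPath": "mycontainer/myfolder/",
          "format": { "type": "JsonFormat" }
        },
        "availability": { "frequency": "Day", "interval": 1 }
      }
    }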

Any pointers will be appreciated.

1 Comment
    Landed here from Google, commenting for future Googlers. Data Factory kept telling me "Invalid Column Mapping" when copying DateTime columns from a CSV file into SQL Data Warehouse. In my case the issue was that Data Factory data types are case sensitive. It didn't like the fact that I had "datetime" or "Datetime" instead of "DateTime" in some of the column types. Commented Sep 12, 2018 at 13:44

2 Answers


The blob data is missing precision. It looks like the column in the destination database is set up as DATETIME2(7), whereas the source is outputting DATETIME2(3) values. You need to either

  1. Fix the source data to comply (see the sketch below)
  2. Massage the data via a stored procedure for the insert
  3. If possible, change the destination column to DATETIME2(3)

... theoretically of course. 😊
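
Purely as a sketch of option 1 (ADF v1 dataset syntax; the column name is taken from the question, and whether the format is honored depends on your source format): declaring the column type plus a format string matching the three fractional digits in the blob tells the copy activity how to parse the value, instead of falling back to the seven-digit 'fffffff' format shown in the error:

    "structure": [
      { "name": "createdDate", "type": "Datetime", "format": "yyyy-MM-dd HH:mm:ss.fff" }
    ]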


4 Comments

Hey, I figured it out. I think when the destination is Azure SQL Data Warehouse it needs strict column mapping as well as the structure defined at both source and destination. Source dataset: {"createdDate": String}, destination dataset: {"createdDate": Datetime}, and in the activity I put a table translator {columnMapping = createdDate: createdDate}. This did the trick for me.
Old question, but facing a similar issue. Could you explain how you put a table translator into the copy activity? I have googled for the term table translator in relation to Azure data factory, but nothing came up and I can't find anything in the UI
@JamesMatson maybe too late already, but do you mean the "TabularTranslator", where the columnMappings get defined? In the UI this is located in the Mapping tab when a Copy activity is selected. To do this you have to import both source and sink schemas and map what's needed; if you open the code view {} you will see these JSON objects (translator).
Thanks Saul! I haven't been back to this in a while (You know what it's like, other stuff comes up) but when I get back to playing in Azure DF, I'll check this out and let you know. Thanks for taking the time to comment :)
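
For later readers: the "table translator" discussed above is the TabularTranslator on the copy activity. In ADF v1 JSON it looks roughly like this (column name from the question; BlobSource and SqlDWSink are the assumed source and sink types for this scenario):

    "typeProperties": {
      "source": { "type": "BlobSource" },
      "sink": { "type": "SqlDWSink" },
      "translator": {
        "type": "TabularTranslator",
        "columnMappings": "createdDate: createdDate"
      }
    }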

You should try this; it resolved the issue for me. It was a blocker for me until I found the trick:

  1. Go to the sink
  2. Open Mapping
  3. Click on the output format
  4. Select the date or time format you want the data stored as in the sink: for a date choose 'date', for a time choose 'time'. You can choose boolean true or false too. (A JSON sketch of the resulting mapping follows below.)
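
For reference, once the mapping is set in the UI, opening the code view {} of the copy activity shows it as a translator object. A rough sketch (ADF v2 syntax; the typeConversion properties are an assumption based on newer Data Factory versions and may not exist in older ones):

    "translator": {
      "type": "TabularTranslator",
      "mappings": [
        {
          "source": { "name": "createdDate", "type": "String" },
          "sink": { "name": "createdDate" }
        }
      ],
      "typeConversion": true,
      "typeConversionSettings": {
        "dateTimeFormat": "yyyy-MM-dd HH:mm:ss.fff"
      }
    }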
