
I have a column in my Databricks table that stores a customized date-time format as a string. While trying to convert the string to a datetime, I get the error PARSE_DATETIME_BY_NEW_PARSER below.

SQL Command

select to_date(ORDERDATE, 'M/dd/yyyy H:mm') from sales_kaggle_chart limit 10;

The format of ORDERDATE column is M/dd/yyyy H:mm

Example ORDERDATE values: 10/10/2003 0:00 and 8/25/2003 0:00

Complete error message:

Job aborted due to stage failure: [INCONSISTENT_BEHAVIOR_CROSS_VERSION.PARSE_DATETIME_BY_NEW_PARSER] You may get a different result due to the upgrading to Spark >= 3.0: Fail to parse '5/7/2003' in the new parser. You can set "legacy_time_parser_policy" to "LEGACY" to restore the behavior before Spark 3.0, or set to "CORRECTED" and treat it as an invalid datetime string.

Note: the same command works for a single value

SELECT to_date("12/24/2003 0:00", 'M/d/yyyy H:mm') as date;
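The single value parses because "24" happens to be two digits, so it satisfies 'dd'; the failure in the table comes from rows like '5/7/2003', where the day is a single digit. A minimal sketch of how to spot the offending rows before converting, using plain Python with a regex that mimics the strict two-digit-day requirement (this is an illustration, not Spark's actual parser):

```python
import re

# Mirrors the strict new-parser reading of 'M/dd/yyyy H:mm':
# the month may be 1-2 digits ('M'), but the day must be exactly 2 digits ('dd').
strict_dd = re.compile(r"^\d{1,2}/\d{2}/\d{4} \d{1,2}:\d{2}$")

rows = ["10/10/2003 0:00", "8/25/2003 0:00", "5/7/2003 0:00", "12/24/2003 0:00"]
rejected = [r for r in rows if not strict_dd.match(r)]
print(rejected)  # only the row with a one-digit day is rejected: ['5/7/2003 0:00']
```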

  • You should try 'M/d/yyyy H:mm' and it should work. With 'dd' in the format, parsing throws an error whenever a day value is less than 10 and is written without a leading zero. Commented Dec 31, 2022 at 6:45
  • Thanks Nikunj Kakadiya, the solution worked. The error message is the part that confused me; thanks for the info. Commented Jan 5, 2023 at 12:47

1 Answer


Have you tried setting the legacy parser, as the error message hints?

SET legacy_time_parser_policy = legacy;
SELECT to_date(ORDERDATE, 'M/dd/yyyy H:mm') FROM sales_kaggle_chart LIMIT 10;

This error is quite common, and adjusting the configuration typically does the job.
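Alternatively, as the comment above notes, changing the pattern to a single 'd' avoids the legacy setting entirely, because 'd' accepts both one- and two-digit days under the new parser. A self-contained Spark SQL check using literal values taken from the question:

```sql
-- 'd' matches both the unpadded day in '5/7/2003' and the padded day in '12/24/2003'
SELECT to_date('5/7/2003 0:00',   'M/d/yyyy H:mm') AS d1,
       to_date('12/24/2003 0:00', 'M/d/yyyy H:mm') AS d2;
```

If the column can contain zero-padded days as well, 'M/d/yyyy H:mm' is the safer pattern, since the strict 'dd' would still work on padded values but fail on unpadded ones.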


1 Comment

Thanks; as Nikunj Kakadiya mentioned, I was able to get the SQL to work.
