
When I use Spark 2.0 to read a JSON file like this:

Dataset<Row> logDF = spark.read().json(path);
logDF.show();

it fails with:

16/08/04 15:35:05 ERROR yarn.ApplicationMaster: User class threw exception: java.lang.RuntimeException: Multiple sources found for json (org.apache.spark.sql.execution.datasources.json.JsonFileFormat, org.apache.spark.sql.execution.datasources.json.DefaultSource), please specify the fully qualified class name.
java.lang.RuntimeException: Multiple sources found for json (org.apache.spark.sql.execution.datasources.json.JsonFileFormat, org.apache.spark.sql.execution.datasources.json.DefaultSource), please specify the fully qualified class name.
    at scala.sys.package$.error(package.scala:27)
    at org.apache.spark.sql.execution.datasources.DataSource.lookupDataSource(DataSource.scala:167)
    at org.apache.spark.sql.execution.datasources.DataSource.providingClass$lzycompute(DataSource.scala:78)
    at org.apache.spark.sql.execution.datasources.DataSource.providingClass(DataSource.scala:78)
    at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:310)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:149)
    at org.apache.spark.sql.DataFrameReader.json(DataFrameReader.scala:287)
    at org.apache.spark.sql.DataFrameReader.json(DataFrameReader.scala:249)

When I run the same code on Spark 1.6 it works correctly. The error says to specify the fully qualified class name, but I can't tell which classes are in conflict.

thank you very much!

  • I am not sure how to fix this, but no one will be able to help unless you add your dependency file here. Commented Aug 4, 2016 at 8:57
  • Try spark.read().format("json").json(path);. I was facing something similar for CSV and found github.com/databricks/spark-csv/issues/367 Commented Aug 4, 2016 at 11:51
  • You have multiple versions of the Spark SQL dependency. Commented Dec 19, 2017 at 8:10
  • How did you resolve this? Same problem here. Commented Jul 5, 2020 at 13:22
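As the comments point out, this error usually means two different Spark SQL builds are on the classpath at once: one registering the Spark 2.x `JsonFileFormat` and another the Spark 1.x `DefaultSource` for the short name "json". A minimal sketch of pinning the Maven dependency to a single Spark version — the Scala suffix, version number, and scope below are illustrative assumptions, not taken from the question:

```xml
<!-- Illustrative only: depend on exactly one spark-sql build, and mark it
     "provided" so the application jar does not bundle a second copy on top
     of the one the cluster already ships. -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>2.0.0</version>
    <scope>provided</scope>
</dependency>
```

Running `mvn dependency:tree` and looking for duplicate `spark-sql` (or shaded Spark) entries is one way to confirm which artifacts drag in the second copy.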

1 Answer


I came across this and found the following to work for me:

df = spark.read.format("org.apache.spark.sql.execution.datasources.json.JsonFileFormat").load(path)

More details can be found at https://github.com/AbsaOSS/ABRiS/issues/147
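The same workaround written in Java, to match the question's code — a sketch only, using the Spark 2.x class name taken from the error message; it assumes an existing `SparkSession spark` and a `String path`, and it sidesteps the ambiguity rather than fixing the underlying duplicate dependency:

```java
// Sketch: bypass the ambiguous short name "json" by naming the
// Spark 2.x data source implementation class directly.
Dataset<Row> logDF = spark.read()
        .format("org.apache.spark.sql.execution.datasources.json.JsonFileFormat")
        .load(path);
logDF.show();
```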


