
I have a Java application using Spark SQL (Spark 1.5.2 in local mode), but I cannot execute any SQL commands without getting errors.

This is the code I am executing:

//confs
SparkConf sparkConf = new SparkConf();  
sparkConf.set("spark.master","local");
sparkConf.set("spark.app.name","application01");
sparkConf.set("spark.driver.host","10.1.1.36");
sparkConf.set("spark.driver.port", "51810");
sparkConf.set("spark.executor.port", "51815");
sparkConf.set("spark.repl.class.uri","http://10.1.1.36:46146");
sparkConf.set("spark.executor.instances","2");
sparkConf.set("spark.jars","");
sparkConf.set("spark.executor.id","driver");
sparkConf.set("spark.submit.deployMode","client");
sparkConf.set("spark.fileserver.uri","http://10.1.1.36:47314");
sparkConf.set("spark.localProperties.clone","true");
sparkConf.set("spark.app.id","app-45631207172715-0002");

//Initialize contexts
JavaSparkContext sparkContext = new JavaSparkContext(sparkConf);
SQLContext sqlContext = new SQLContext(sparkContext);           

//execute command
sqlContext.sql("show tables").show();

Spark dependencies in pom.xml look like this:

<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.5.2</version>
</dependency>

<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-sql_2.10</artifactId>
  <version>1.5.2</version>
</dependency>

<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-hive_2.10</artifactId>
  <version>1.5.2</version>
</dependency>

<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-repl_2.10</artifactId>
  <version>1.5.2</version>
</dependency>

Here is the error I am getting:

java.lang.NoSuchMethodError: com.fasterxml.jackson.module.scala.deser.BigDecimalDeserializer$.handledType()Ljava/lang/Class;

The stack trace is here.

My application is a web application running on Tomcat 7. I don't have any other configuration files. What could I be doing wrong? Could it be some dependency conflict, since I am able to run the same code in a clean project?

EDIT: I found an issue that gives some more information about the problem.

  • Is com.fasterxml.jackson.module.scala.deser.BigDecimalDeserializer already in your class path? Commented Dec 9, 2015 at 18:26
  • It is, but when I remove all Spark dependencies, I see that I still have two Jackson jars (probably from another dependency): jackson-core-asl-1.9.12.jar and jackson-mapper-asl-1.9.12.jar. If you access the links, you can see that these packages were moved to com.fasterxml.jackson.core, the dependency with the conflict. Is my only choice to track down which Maven dependency includes these packages and upgrade it? Commented Dec 9, 2015 at 19:05
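The tracking-down approach the comment describes can also be attacked from the other side: if the legacy org.codehaus.jackson jars arrive transitively, they can be excluded at the dependency that pulls them in. A sketch, where some.group:some-artifact is a hypothetical placeholder for whichever dependency turns out to be the culprit:

```xml
<!-- Sketch: exclude the legacy Codehaus Jackson jars from whichever
     dependency pulls them in transitively.
     "some.group:some-artifact" is a placeholder, not a real artifact. -->
<dependency>
  <groupId>some.group</groupId>
  <artifactId>some-artifact</artifactId>
  <version>1.0</version>
  <exclusions>
    <exclusion>
      <groupId>org.codehaus.jackson</groupId>
      <artifactId>jackson-core-asl</artifactId>
    </exclusion>
    <exclusion>
      <groupId>org.codehaus.jackson</groupId>
      <artifactId>jackson-mapper-asl</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```

Running `mvn dependency:tree` shows which dependency actually brings the jars in, so the exclusion can be placed on the right one.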

2 Answers


In this situation, the NoSuchMethodError occurs because of a Maven dependency conflict.

A library that was available at compile time is either missing at runtime or present there in a different version.

I tried many things to solve this conflict, and finally the following worked for me:

Add the correct version of the jackson-databind dependency as the first dependency in your pom.xml.

Use version 2.4.x or greater of the jackson-databind dependency.

Note: This will work only for Maven version 2.0.9 and above.
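A minimal pom.xml sketch of this ordering (the Spark coordinates are the ones from the question; 2.4.4 is just one example of a 2.4.x version):

```xml
<!-- Declared before the Spark dependencies so that Maven's
     "first declaration wins" mediation picks this Jackson version. -->
<dependencies>
  <dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-databind</artifactId>
    <version>2.4.4</version>
  </dependency>

  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.5.2</version>
  </dependency>
  <!-- spark-sql, spark-hive, spark-repl follow here as before -->
</dependencies>
```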

Why will this work?

In Maven 2.0.9, a new rule for mediating transitive dependencies was introduced:

Dependency mediation - this determines what version of a dependency will be used when multiple versions of an artifact are encountered. Currently, Maven 2.0 only supports using the "nearest definition" which means that it will use the version of the closest dependency to your project in the tree of dependencies. You can always guarantee a version by declaring it explicitly in your project's POM. Note that if two dependency versions are at the same depth in the dependency tree, until Maven 2.0.8 it was not defined which one would win, but since Maven 2.0.9 it's the order in the declaration that counts: the first declaration wins.

Maven Transitive Dependency




BigDecimalDeserializer wasn't introduced to FasterXML/jackson-module-scala until 2.4. Confirm the following:

  1. The same jars you compile with are on the classpath at runtime.
  2. ${fasterxml.jackson.version} in the pom.xml file for Spark SQL is 2.4.x or greater.
<dependency>
  <groupId>com.fasterxml.jackson.core</groupId>
  <artifactId>jackson-databind</artifactId>
  <version>2.4.4</version>
</dependency>
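To verify point 1, a small stand-alone diagnostic can report at runtime whether a class is visible and which jar it was loaded from (a sketch; the Jackson class names probed below are the ones from this question):

```java
// ClasspathCheck.java -- report whether a class is visible at runtime
// and which jar it was loaded from. The Jackson names probed in main()
// are the ones involved in this question; any fully qualified name works.
public class ClasspathCheck {

    static String probe(String name) {
        try {
            Class<?> c = Class.forName(name);
            java.security.CodeSource src =
                c.getProtectionDomain().getCodeSource();
            // Bootstrap-loaded classes (java.util.*, ...) have a null CodeSource.
            return name + " FOUND in "
                + (src == null ? "the bootstrap classpath" : src.getLocation());
        } catch (ClassNotFoundException e) {
            return name + " MISSING from the runtime classpath";
        }
    }

    public static void main(String[] args) {
        System.out.println(probe("java.util.ArrayList")); // always present
        System.out.println(probe("com.fasterxml.jackson.databind.ObjectMapper"));
        System.out.println(probe(
            "com.fasterxml.jackson.module.scala.deser.BigDecimalDeserializer"));
    }
}
```

Dropping this into the Tomcat web application and checking its output against the jars Maven resolved at build time makes a compile-time/runtime mismatch visible immediately.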

8 Comments

  • Here is a screenshot of my Maven dependency tree; it seems to be okay, right?
  • Are you building this with Eclipse?
  • I've had problems with classpath issues in Eclipse for Spark applications before (I actually switched to IntelliJ IDEA for Spark/Scala applications). Anyway, let me review the dependency hierarchy and see what I can find. Can you try building from the command line and seeing if you get compile errors?
  • Here is the mvn clean install output; I am not getting any errors.
  • What happens when you execute this jar? It looks like you have some additional jars on your classpath in Eclipse.
