
I just upgraded my Spark project from 2.2.1 to 2.3.0 and ran into the versioning exception below. I have dependencies on spark-cassandra-connector 2.0.7 and cassandra-driver-core 3.4.0 from DataStax, which in turn depend on netty 4.x, whereas Spark 2.3.0 uses netty 3.9.x.

The class raising the exception, org.apache.spark.network.util.NettyMemoryMetrics, was introduced in Spark 2.3.0.

Is downgrading my Cassandra dependencies the only way round the exception? Thanks!

Exception in thread "main" java.lang.NoSuchMethodError: io.netty.buffer.PooledByteBufAllocator.metric()Lio/netty/buffer/PooledByteBufAllocatorMetric;
at org.apache.spark.network.util.NettyMemoryMetrics.registerMetrics(NettyMemoryMetrics.java:80)
at org.apache.spark.network.util.NettyMemoryMetrics.<init>(NettyMemoryMetrics.java:76)
at org.apache.spark.network.client.TransportClientFactory.<init>(TransportClientFactory.java:109)
at org.apache.spark.network.TransportContext.createClientFactory(TransportContext.java:99)
at org.apache.spark.rpc.netty.NettyRpcEnv.<init>(NettyRpcEnv.scala:71)
at org.apache.spark.rpc.netty.NettyRpcEnvFactory.create(NettyRpcEnv.scala:461)
at org.apache.spark.rpc.RpcEnv$.create(RpcEnv.scala:57)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:249)
at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:175)
at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:256)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:423)

2 Answers


It seems like you are using a netty 4 version that is too old, i.e. one that predates the PooledByteBufAllocator.metric() method Spark calls. Maybe you have multiple netty versions on your classpath? Having netty 4.x and 3.x on the classpath together is not a problem, since they use different package names (io.netty vs. org.jboss.netty).
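
To confirm whether several netty 4 versions are being pulled in, Maven can list every netty artifact in the build. A quick check, assuming a standard Maven project like the one discussed here (the wildcard catches split artifacts such as netty-handler as well as netty-all):

    mvn dependency:tree -Dverbose "-Dincludes=io.netty:*"

If more than one netty 4 version shows up, Maven's nearest-wins resolution picks a single version per artifact, and if the winner predates 4.1 it will lack the metric() method, producing exactly this NoSuchMethodError.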


5 Comments

Thanks, you got me going in the right direction. I come from the .NET world and Scala is still a bit new to me. I eventually got it fixed following this: maven.apache.org/guides/introduction/…
@rodders can you tell us how you fixed it?
I got it fixed by adding this fragment to my pom, forcing a common netty reference:

    <dependencyManagement>
      <dependencies>
        <dependency>
          <groupId>io.netty</groupId>
          <artifactId>netty-all</artifactId>
          <version>4.1.17.Final</version>
        </dependency>
      </dependencies>
    </dependencyManagement>
@rodders I am also facing the same issue, but in my case I am using PySpark (v2.3.2) and Hadoop (v2.8.3). When I submit a PySpark job using spark-submit and open the logs of the YARN container, I get a similar error. Do you have any idea how to solve it in the case of PySpark?
I had the same problem, and it worked for me after excluding netty from one of the dependencies:

    <dependency>
      <groupId>com.datastax.spark</groupId>
      <artifactId>spark-cassandra-connector_2.11</artifactId>
      <version>${spark-cassandra-connector_2.11.version}</version>
      <exclusions>
        <exclusion>
          <groupId>io.netty</groupId>
          <artifactId>netty-all</artifactId>
        </exclusion>
      </exclusions>
    </dependency>

I would like to add some more detail to the answer for ease of work. Just run mvn dependency:tree -Dverbose -Dincludes=io.netty:netty-all and it will show every dependency that pulls in io.netty:netty-all, along with its version. In my case the culprit was Hive JDBC 2.1.0, which depends on a netty-all version lower than the one used by Spark 2.3.1, so the classpath never loads Spark's netty because the older one from hive-jdbc is resolved first.

So the fix is to exclude the netty dependency from hive-jdbc in pom.xml, as sketched below.
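
For example, a minimal sketch of that exclusion, assuming the usual org.apache.hive:hive-jdbc coordinates and the 2.1.0 version named above; verify against your own dependency:tree output whether the old netty arrives as netty-all or as individual netty modules, and adjust the exclusion accordingly:

    <dependency>
      <groupId>org.apache.hive</groupId>
      <artifactId>hive-jdbc</artifactId>
      <version>2.1.0</version>
      <exclusions>
        <!-- drop the pre-4.1 netty so Spark's own netty wins -->
        <exclusion>
          <groupId>io.netty</groupId>
          <artifactId>netty-all</artifactId>
        </exclusion>
      </exclusions>
    </dependency>

With the old netty-all excluded, Maven resolves only the 4.1.x version that Spark ships, and the metric() call succeeds.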

2 Comments

This option helped me figure out the problem. Thanks @Vicky
@JaiPrakash, I'm glad it helped you!
