10

Spark 2.3 is throwing the following exception. Can anyone please help? I tried adding the JARs.

308 [Driver] ERROR org.apache.spark.deploy.yarn.ApplicationMaster - User class threw exception: java.lang.NoSuchMethodError: io.netty.buffer.PooledByteBufAllocator.metric()Lio/netty/buffer/PooledByteBufAllocatorMetric;
java.lang.NoSuchMethodError: io.netty.buffer.PooledByteBufAllocator.metric()Lio/netty/buffer/PooledByteBufAllocatorMetric;
    at org.apache.spark.network.util.NettyMemoryMetrics.registerMetrics(NettyMemoryMetrics.java:80)
    at org.apache.spark.network.util.NettyMemoryMetrics.<init>(NettyMemoryMetrics.java:76)
    at org.apache.spark.network.client.TransportClientFactory.<init>(TransportClientFactory.java:109)
    at org.apache.spark.network.TransportContext.createClientFactory(TransportContext.java:99)
    at org.apache.spark.rpc.netty.NettyRpcEnv.<init>(NettyRpcEnv.scala:71)
    at org.apache.spark.rpc.netty.NettyRpcEnvFactory.create(NettyRpcEnv.scala:461)
    at org.apache.spark.rpc.RpcEnv$.create(RpcEnv.scala:57)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:249)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:175)
    at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:256)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:423)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
    at com.voicebase.etl.HBasePhoenixPerformance2.main(HBasePhoenixPerformance2.java:55)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$4.run(ApplicationMaster.scala:706)
315 [main] ERROR org.apache.spark.deploy.yarn.ApplicationMaster - Uncaught exception: org.apache.spark.SparkException: Exception thrown in awaitResult:
    at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:205)
    at org.apache.spark.deploy.yarn.ApplicationMaster.runDriver(ApplicationMaster.scala:486)
    at org.apache.spark.deploy.yarn.ApplicationMaster.org$apache$spark$deploy$yarn$ApplicationMaster$$runImpl(ApplicationMaster.scala:345)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$run$2.apply$mcV$sp(ApplicationMaster.scala:260)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$run$2.apply(ApplicationMaster.scala:260)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$run$2.apply(ApplicationMaster.scala:260)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$5.run(ApplicationMaster.scala:800)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836)
    at org.apache.spark.deploy.yarn.ApplicationMaster.doAsUser(ApplicationMaster.scala:799)
    at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:259)
    at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:824)
    at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)
Caused by: java.util.concurrent.ExecutionException: Boxed Error

6 Comments
  • 1
    Added <dependency> <groupId>io.netty</groupId> <artifactId>netty-all</artifactId> <version>4.1.17.Final</version> </dependency> <dependency> <groupId>io.netty</groupId> <artifactId>netty</artifactId> <version>3.9.9.Final</version> </dependency> Commented May 17, 2018 at 10:16
  • Not sure what is causing this error. I added the proper JAR, but it had no effect. Any help? Commented May 17, 2018 at 10:16
  • 3
    I have the same problem here; any solution? Incredible Spark dependency hell! Commented Jun 21, 2018 at 15:35
  • Possible duplicate of Spark 2.3.0 netty version issue: NoSuchMethod io.netty.buffer.PooledByteBufAllocator.metric() Commented Jul 3, 2018 at 7:25
  • @Alchemist were you able to solve the issue? I am facing the same one. Commented Jan 1, 2019 at 10:43

5 Answers

13

This is because the Hadoop binaries were compiled against an older Netty version, so you just need to replace those JARs. I haven't faced any issues with Hadoop after replacing them.

You need to replace netty-3.6.2.Final.jar and netty-all-4.0.23.Final.jar under $HADOOP_HOME/share/hadoop with netty-all-4.1.17.Final.jar and netty-3.9.9.Final.jar.


2 Comments

Really. I took those jars from the Spark installation, replaced the ones in Hadoop, and it worked.
Yes, it worked. Strangely, the docs show that it is part of 4.0, but it seems it only got included as of 4.1.
2

This issue occurs because of a mismatch between the Netty versions that Hadoop and Spark are compiled against. You can fix it as follows.

A similar issue was solved by manually compiling Spark against a specific version of Netty.

The other option, as recommended by Suhas, is to copy the contents of the SPARK_HOME/jars folder into the various lib folders (or only the yarn one) under HADOOP_HOME/share/hadoop. That also solves the problem, but it is a dirty fix. So preferably use the latest versions of both, or compile them manually.
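If editing the cluster JARs is not an option, a related application-side variant of the same idea is to force a single Netty version from your own build. A minimal sketch for a Maven project, assuming 4.1.17.Final (the version the top answer copies from the Spark 2.3 distribution); adjust it to whatever your Spark actually ships:

    <!-- sketch: pin one Netty version for the whole build, including transitive dependencies -->
    <dependencyManagement>
      <dependencies>
        <dependency>
          <groupId>io.netty</groupId>
          <artifactId>netty-all</artifactId>
          <!-- assumed: matches the Spark 2.3 jars mentioned above -->
          <version>4.1.17.Final</version>
        </dependency>
      </dependencies>
    </dependencyManagement>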


2

An older version of Netty was required by the aws-java-sdk. Deleting all the netty jars and removing the aws-java-sdk from the project solved the problem.
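If removing aws-java-sdk entirely is not acceptable, an alternative is to keep it but exclude the transitive Netty it brings in. A rough sketch for a Maven build (the artifact name and version here are assumptions, not from the original answer):

    <!-- sketch: keep aws-java-sdk but drop the old Netty it pulls in -->
    <dependency>
      <groupId>com.amazonaws</groupId>
      <artifactId>aws-java-sdk</artifactId>
      <!-- assumed version; use whatever your project declares -->
      <version>1.11.271</version>
      <exclusions>
        <exclusion>
          <groupId>io.netty</groupId>
          <!-- wildcard exclusions require Maven 3.2.1 or newer -->
          <artifactId>*</artifactId>
        </exclusion>
      </exclusions>
    </dependency>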


2

The issue was resolved by adding the Netty JARs below to the dependencies:

   "io.netty" % "netty-all" % "4.1.68.Final"
   "io.netty" % "netty-buffer" % "4.1.68.Final"

and by excluding all existing Netty JARs with an excludeAll rule (the dependency name in the snippet below is a placeholder):

   val excludeNettyBufferBinding = ExclusionRule(organization = "io.netty.buffer")
   // attach the rule to whichever dependency pulls in the conflicting Netty:
   myConflictingDependency.excludeAll(excludeNettyBufferBinding)

1 Comment

The dependency part fixed it for me. +1
1

This is due to incompatible Hadoop and Spark versions. To solve it, add Netty as a dependency and try different versions by trial and error; for me it worked with 4.1.53.Final.

Add this:

<dependency>
    <groupId>io.netty</groupId>
    <artifactId>netty-all</artifactId>
    <version>4.1.53.Final</version>
</dependency>
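To make the trial-and-error less blind, one option (an addition of mine, not from the original answer) is to let Maven flag conflicting Netty versions at build time with the enforcer plugin's dependencyConvergence rule. Note this only inspects your own dependency tree, not the jars already present on the cluster:

    <!-- sketch: fail the build when transitive dependencies disagree on versions -->
    <build>
      <plugins>
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-enforcer-plugin</artifactId>
          <version>3.0.0</version>
          <executions>
            <execution>
              <id>enforce-convergence</id>
              <goals>
                <goal>enforce</goal>
              </goals>
              <configuration>
                <rules>
                  <dependencyConvergence/>
                </rules>
              </configuration>
            </execution>
          </executions>
        </plugin>
      </plugins>
    </build>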

