
I have a simple Scala object that creates an RDD and then collects and prints out all the elements.

I've created a Maven project in Eclipse and added the Scala library 2.12.3. To pom.xml I have added the Spark 2.4.3 dependency as below:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.12</artifactId>
    <version>2.4.3</version>
</dependency>

Finally, I've created a JAR and am trying to execute it with spark-submit, but it fails with:

Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.longArrayOps([J)[J
    at org.spark.learning.Demo$.main(Demo.scala:14)
    at org.spark.learning.Demo.main(Demo.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
    at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
    at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
    at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
    at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
    at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
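
For reference, I am invoking it roughly like this (the JAR name is just illustrative; the class name is the one from the stack trace):

    # submit the packaged application; demo.jar is a placeholder name
    ./spark-submit --class org.spark.learning.Demo demo.jar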

The culprit seems to be r1.collect.foreach(println) in my Scala code, where r1 is the RDD created from range(1, 50).
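
For context, Demo.scala looks roughly like this (a minimal sketch; the package and object names come from the stack trace, while the SparkConf setup is an assumption):

    package org.spark.learning

    import org.apache.spark.{SparkConf, SparkContext}

    object Demo {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("Demo"))
        // range(1, 50) produces an RDD[Long], so collect returns Array[Long]
        val r1 = sc.range(1, 50)
        // foreach on Array[Long] goes through the implicit Predef.longArrayOps,
        // whose signature differs across Scala versions -- hence the
        // NoSuchMethodError at runtime
        r1.collect.foreach(println)
        sc.stop()
      }
    }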

And yes, I have Scala 2.12.3 and Spark 2.4.3 in Eclipse as well as in my terminal, so version incompatibility doesn't seem to be the issue here.

Could someone please help?

1 Answer

This is clearly a version issue, nothing else. Even though you are claiming to use 2.12.x, something seems to be pointing to an old version of Scala. Try a clean build, and verify the dependencies in Maven or sbt, whichever you are using.
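
For the Maven case, a quick way to see which Scala version is actually being resolved is the dependency:tree goal, for example:

    # show every resolved copy of scala-library; more than one version,
    # or anything other than 2.12.x, confirms the mismatch
    mvn clean dependency:tree -Dincludes=org.scala-lang:scala-library

Also compare that against the Scala version your Spark installation itself was built with; spark-submit --version prints it.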

Also do File -> Project Structure -> Global Libraries -> Remove SDK, then rebuild.

If you are using IntelliJ, all external libraries on the classpath are listed under the External Libraries section of the project view.


One way to find the discrepancies is to inspect the classloader:

    // Walk the classloader hierarchy and collect every URL it exposes.
    def urlsinclasspath(cl: ClassLoader): Array[java.net.URL] = cl match {
      case null => Array()
      // On Java 8 the application classloader is a URLClassLoader, so the
      // classpath JARs can be read from it directly.
      case u: java.net.URLClassLoader => u.getURLs() ++ urlsinclasspath(cl.getParent)
      case _ => urlsinclasspath(cl.getParent)
    }

    // Print every JAR visible to the running program.
    urlsinclasspath(getClass.getClassLoader).foreach(println)


Using this you can print all the JARs that are on the classpath, whether you are running the project from IntelliJ or running your driver program on a cluster.


Comments

Thanks @Ram. I have the below dependencies listed in pom.xml:

    <dependencies>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>3.8.1</version>
            <scope>test</scope>
        </dependency>
        <!-- https://mvnrepository.com/artifact/org.scala-lang/scala-library -->
        <dependency>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-library</artifactId>
            <version>2.12.3</version>
        </dependency>
        <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-core -->
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.12</artifactId>
            <version>2.4.3</version>
        </dependency>
    </dependencies>
    </project>
Also in my Eclipse IDE, Scala Library container [2.12.3] is imported as part of the Maven project.
And in my terminal, the scala version is Scala code runner version 2.12.3 -- Copyright 2002-2017, LAMP/EPFL and Lightbend, Inc.
Ok, so I just checked the Scala version that Spark 2.4.3 is using, and it seems to be 2.11.12:

    ./spark-submit --version
    Welcome to
          ____              __
         / __/__  ___ _____/ /__
        _\ \/ _ \/ _ `/ __/  '_/
       /___/ .__/\_,_/_/ /_/\_\   version 2.4.3
          /_/

    Using Scala version 2.11.12, OpenJDK 64-Bit Server VM, 1.8.0_212
    Branch
    Compiled by user on 2019-05-01T05:08:38Z
    Revision

How do I make Spark 2.4.3 run on Scala 2.12.3?