
I have written a simple Spark app in Scala using IntelliJ IDEA, and I get this error message when I run it:

Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps;
at org.apache.spark.util.Utils$.getCallSite(Utils.scala:1406)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:76)
at com.chandler.hellow_world_b6$.main(hellow_world_b6.scala:13)
at com.chandler.hellow_world_b6.main(hellow_world_b6.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)

Process finished with exit code 1

The code is:

import org.apache.spark.{SparkContext,SparkConf}
object hellow_world_b6{
    def main(args: Array[String]): Unit = {
        println( "Hello World   12!")
        val conf=new SparkConf()
        val sc=new SparkContext(conf)
    }
}
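As a quick diagnostic (not part of the original post), printing the Scala version that is actually on the runtime classpath shows whether the running binary matches the _2.11 suffix of spark-core. This is only a sketch; the object name VersionCheck is made up for illustration, and it relies on the standard scala.util.Properties API:

object VersionCheck {
    def main(args: Array[String]): Unit = {
        // Report the Scala library actually resolved at runtime, e.g. "version 2.12.1".
        // A 2.12.x value next to spark-core_2.11 explains the NoSuchMethodError,
        // since Scala 2.11 and 2.12 are not binary compatible.
        println(scala.util.Properties.versionString)
    }
}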

The Maven configuration is:

<properties>
    <scala.version>2.12.1</scala.version>
</properties>
<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>${scala.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.1.0</version>
</dependency>
  • Possible duplicate of java.lang.NoSuchMethodError: scala.Predef$.refArrayOps Commented Feb 1, 2017 at 16:22
  • You're using two different Scala versions: spark-core_2.11 uses Scala 2.11, while you're importing Scala 2.12.1. Align the versions (use Spark's Scala version, or build the Spark dependencies against 2.12). Commented Feb 1, 2017 at 16:24
  • So how can I fix it? Install Scala 2.11, or just change <scala.version>2.12.1</scala.version> to <scala.version>2.11</scala.version>? I have already changed the Scala version to 2.11, but that did not fix the issue. BTW, I am new to all of this: Java, Scala, and Spark. Commented Feb 1, 2017 at 16:52
  • Changing <scala.version> to any 2.11 minor version (e.g. 2.11.8) should be enough; see the corrected snippet after these comments. Commented Feb 1, 2017 at 16:53
  • Yes, that's exactly what it means. Commented Feb 2, 2017 at 6:23
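To make the fix from the comments concrete, here is a sketch of the POM fragment with the Scala versions aligned; the 2.11.8 patch version is only illustrative, and any 2.11.x release matching the spark-core_2.11 artifact should behave the same:

<properties>
    <scala.version>2.11.8</scala.version>
</properties>
<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>${scala.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.1.0</version>
</dependency>

The <scala.version> property and the _2.11 suffix of the Spark artifact must refer to the same Scala binary series: Spark 2.1.0 is published for Scala 2.11, and Scala 2.11 and 2.12 are not binary compatible, which is why the mismatch surfaces as a NoSuchMethodError at runtime. If the project's Scala SDK in IDEA is still set to 2.12, it needs to be switched to 2.11 as well.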
