The following code results in:

Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.sql.SQLContext.implicits()Lorg/apache/spark/sql/SQLContext$implicits$
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.log4j.Logger
import org.apache.log4j.Level

object Small {
  def main(args: Array[String]) {
    Logger.getLogger("org.apache.spark").setLevel(Level.WARN)
    Logger.getLogger("org.eclipse.jetty.server").setLevel(Level.OFF)

    // set up environment
    val conf = new SparkConf()
      .setMaster("local[1]")
      .setAppName("Small")
      .set("spark.executor.memory", "2g")
    val sc = new SparkContext(conf)

    val sqlContext = new org.apache.spark.sql.SQLContext(sc)
    import sqlContext.implicits._

    val df = sc.parallelize(Array((1, 30), (2, 10), (3, 20), (1, 10), (2, 30))).toDF("books", "readers")
    df.show()

    sc.stop()
  }
}
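If it helps to confirm which Spark version the driver actually runs against, a one-line diagnostic inside main (a minimal sketch; sc.version just reports the version string of the Spark jars on the runtime classpath) makes a mismatch visible:

// Prints 1.1.0 if an older distribution launched the jar; SQLContext in 1.1.0
// has no implicits member, which is exactly what the NoSuchMethodError says.
println("Runtime Spark version: " + sc.version)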
The project was built with SBT:
name := "Small"
version := "1.0"
scalaVersion := "2.10.4"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.1"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.3.1"
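As an aside, when the jar is launched through spark-submit it is common to mark the Spark artifacts as provided, so the distribution's own jars supply the Spark classes at runtime. This is a convention sketch, not something the question uses; note that with provided scope, sbt run no longer sees Spark on its classpath:

// build.sbt sketch: let the spark-submit distribution provide Spark at runtime
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.1" % "provided"

libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.3.1" % "provided"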
I run this with a submit script:
#!/bin/sh
/home/test/usr/spark-1.1.0/bin/spark-submit \
--class Small \
--master local[*] \
--driver-memory 2g \
/home/test/wks/Pairs/target/scala-2.10/small_2.10-1.0.jar
Any ideas?
SBT compiles and packages this code all right. Yet when I try to run it with sbt run, I get another exception:

[error] (run-main-0) scala.reflect.internal.MissingRequirementError: class org.apache.spark.sql.catalyst.ScalaReflection in JavaMirror with java.net.URLClassLoader@210ce673 of type class java.net.URLClassLoader with classpath [file:/home/test/.ivy2/cache/org.scala-lang/scala-library/jars/scala-library-2.10.4.jar, ...
Is there any way to make sbt run include all dependencies when launching a Scala program?
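A commonly suggested fix for this MissingRequirementError under sbt run (an assumption drawn from general sbt/Spark practice, not confirmed in this thread) is to fork a separate JVM, so Scala reflection sees an ordinary application classloader instead of sbt's layered one:

// build.sbt: fork a fresh JVM for "sbt run"
fork := true

// optionally mirror the memory setting used by the submit script
javaOptions += "-Xmx2g"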
Comments on the question:

"There is libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.1" in the sbt file, so I assume sbt package and sbt run should use Spark 1.3.1, don't they?"

"/home/test/usr/spark-1.1.0/bin/spark-submit shows what I am referring to: the Spark instance version is 1.1.0, while you are using 1.3.1 at compile time. Please update your Spark version so that they match."

"And sbt run? Shouldn't it work anyway with libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.1"?"
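Following that advice, the submit script would point at a Spark 1.3.1 installation so the runtime matches the compile-time dependency (the install path below is assumed, mirroring the original layout):

#!/bin/sh
# Use a spark-submit whose version matches the 1.3.1 compile-time dependency,
# so SQLContext.implicits exists at runtime.
/home/test/usr/spark-1.3.1/bin/spark-submit \
  --class Small \
  --master local[*] \
  --driver-memory 2g \
  /home/test/wks/Pairs/target/scala-2.10/small_2.10-1.0.jar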