I'm trying to run my own Spark application, but when I use the spark-submit command I get this error:
Users/_name_here/dev/sp/target/scala-2.10/sp_2.10-0.1-SNAPSHOT.jar --stacktrace
java.lang.ClassNotFoundException: /Users/_name_here/dev/sp/mo/src/main/scala/MySimpleApp
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:340)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:633)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
I'm using the following command:
/Users/_name_here/dev/spark/bin/spark-submit
--class "/Users/_name_here/dev/sp/mo/src/main/scala/MySimpleApp"
--master local[4] /Users/_name_here/dev/sp/target/scala-2.10/sp_2.10-0.1-SNAPSHOT.jar
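For context, my MySimpleApp is roughly this (simplified sketch; the actual Spark logic is omitted here, and the printed name is just to show the fully-qualified class name that spark-submit would need):

```scala
// Minimal sketch of the application object. Note that spark-submit's
// --class argument expects the fully-qualified class name (here just
// "MySimpleApp", since there is no package declaration), not a file path.
object MySimpleApp {
  def main(args: Array[String]): Unit = {
    // In the real app a SparkContext is created and used here; that part
    // is left out so this sketch compiles without Spark on the classpath.
    println(MySimpleApp.getClass.getName.stripSuffix("$"))
  }
}
```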
My build.sbt looks like this:
name := "mo"
version := "1.0"
scalaVersion := "2.10.4"
libraryDependencies ++= Seq(
"org.apache.spark" % "spark-core_2.10" % "1.4.0",
"org.postgresql" % "postgresql" % "9.4-1201-jdbc41",
"org.apache.spark" % "spark-sql_2.10" % "1.4.0",
"org.apache.spark" % "spark-mllib_2.10" % "1.4.0",
"org.tachyonproject" % "tachyon-client" % "0.6.4",
"org.postgresql" % "postgresql" % "9.4-1201-jdbc41",
"org.apache.spark" % "spark-hive_2.10" % "1.4.0",
"com.typesafe" % "config" % "1.2.1"
)
resolvers += "Typesafe Repo" at "http://repo.typesafe.com/typesafe/releases/"
My plugin.sbt:
logLevel := Level.Warn
resolvers += "Sonatype snapshots" at "https://oss.sonatype.org/content/repositories/snapshots/"
addSbtPlugin("com.github.mpeltonen" % "sbt-idea" % "1.6.0")
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.11.2")
I'm using the prebuilt package from spark.apache.org. I installed sbt and scala through brew. Running sbt package from the spark root folder works fine and creates the jar, but sbt assembly doesn't work at all, maybe because it's missing in the prebuilt spark folder. I would appreciate any help, because I'm quite new to Spark. Oh, and by the way: Spark runs fine within IntelliJ.