
I wrote a Spark program in Scala, but when I use spark-submit to submit my project, I get a java.lang.ClassNotFoundException.

My .sbt file:

name:="Spark Project"

version:="1.0"

scalaVersion:="2.10.5"

libraryDependencies+="org.apache.spark" %% "spark-core" % "1.3.0"

My .scala file is named SparkProject.scala, and the object inside it is named SparkProject too.

/* SparkProject.scala */
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf

object SparkProject {
  def main(args: Array[String]) {
    val logFile = "YOUR_SPARK_HOME/README.md" // Should be some file on your system
    val conf = new SparkConf().setAppName("Simple Application")
    val sc = new SparkContext(conf)
    val logData = sc.textFile(logFile, 2).cache()
    val numAs = logData.filter(line => line.contains("a")).count()
    val numBs = logData.filter(line => line.contains("b")).count()
    println("Lines with a: %s, Lines with b: %s".format(numAs, numBs))
  }
}

My command to submit the project is:

spark-submit --class "SparkProject" --master local[12] target/scala-2.10/spark-project_2.10-1.0.jar

Does anyone know how to solve this? What confuses me most is that when I try the example provided here (http://spark.apache.org/docs/latest/quick-start.html), it runs well, but when I build a new project and submit it, it goes wrong. Any help will be greatly appreciated.

  • I think you are missing the package name in the class name in your submit command. I guess your SparkProject.scala file has something like package com.example as its first line. If so, the fully qualified name of your class will be com.example.SparkProject, so you will have to use --class "com.example.SparkProject" (see the sketch after these comments). Commented Apr 8, 2015 at 10:16
  • @SarveshKumarSingh Yeah! You are right, this is just the thing I need. Thanks a lot. Commented Apr 8, 2015 at 13:24
  • You guys might want to put that as an answer, to avoid keeping this one as "unanswered" to others who are looking to help ;-) Commented Apr 8, 2015 at 19:06
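
To make the first comment concrete, here is a minimal sketch. The package name com.example is hypothetical, taken from the commenter's guess; use whatever package declaration actually appears at the top of your file:

/* SparkProject.scala -- com.example is a hypothetical package, for illustration only */
package com.example

object SparkProject {
  def main(args: Array[String]) {
    // This object's fully qualified name is com.example.SparkProject
    println("Hello from com.example.SparkProject")
  }
}

With that package line, spark-submit must be given the fully qualified name:

spark-submit --class com.example.SparkProject --master local[12] target/scala-2.10/spark-project_2.10-1.0.jar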

5 Answers


Adding the package name worked for me.

My code is simple too:

package spark.wordcount

/* SimpleApp.scala */
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf

object WordCount {
  def main(args: Array[String]) {
    val infile = "/input" // Should be some file on your system
    val conf = new SparkConf().setAppName("word count")
    val sc = new SparkContext(conf)
    val indata = sc.textFile(infile, 2).cache()
    val words = indata.flatMap(line => line.split(" ")).map(word => (word,1)).reduceByKey((a,b) => (a+b))
    words.saveAsTextFile("/output")
    println("All words are counted!")
  }
}

I tried to run spark-submit like this:

[root@sparkmaster bin]# ./spark-submit --class spark.wordcount.WordCount /opt/spark-wordcount-in-scala.jar

and it ran successfully.
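
If you are not sure which fully qualified names actually ended up in a jar, one way to check (assuming the JDK's jar tool is on your PATH) is to list the archive's contents:

jar tf /opt/spark-wordcount-in-scala.jar | grep class

The entry spark/wordcount/WordCount.class corresponds to the spark.wordcount.WordCount name passed to --class.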


Removing the package name works for me.


I was getting the same error while running Spark on Windows and building with sbt.

I did not have the line "package xxx" (e.g. package spark.wordcount) in the code. If you do not have any mention of "package xxx" in your code, then executing the command

spark-submit --class "SparkProject" --master local[12] target/scala-2.10/spark-project_2.10-1.0.jar

should work, as mentioned by @SarveshKumarSingh in the comments on the question.

But I was getting the error for a different reason. The Scala file I had created in Notepad was actually named SimpleApp.scala.txt. Once I saved it properly as SimpleApp.scala, it worked.
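
Windows Explorer hides known file extensions by default, which is how a file saved from Notepad can silently end up as SimpleApp.scala.txt. Listing the directory from a command prompt shows the real name:

dir SimpleApp*

If the output shows SimpleApp.scala.txt, rename the file (or pick "All Files" as the save type in Notepad) before rebuilding.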


My two cents:

I tried all the mentioned solutions, and they are all valid. In my case, I had changed the package name after running sbt package, so the jar still contained the class under its old name, which produced the mentioned error. Recompiling with sbt package fixed it for me.
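
For reference, a minimal sketch of the rebuild-and-resubmit cycle, assuming the jar name from the question and a hypothetical package com.example:

sbt clean package
spark-submit --class com.example.SparkProject --master local[12] target/scala-2.10/spark-project_2.10-1.0.jar

sbt clean removes the previously compiled classes, so the rebuilt jar contains the class under its current package name.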


For others who are still looking for a solution: I tried all the other answers, but none of them worked for me. What worked was removing the double quotes around the class name. You can try:

spark-submit --class SparkProject --master local[12] target/scala-2.10/spark-project_2.10-1.0.jar
