
I am following the instructions here: https://spark.apache.org/docs/latest/quick-start.html to create a simple application that will run on a local standalone Spark build.

In my system I have Scala 2.9.2 and sbt 0.13.7. When I write in my simple.sbt the following:

scalaVersion := "2.9.2"

after I use sbt package, I get the error: sbt.ResolveException: unresolved dependency: org.apache.spark#spark-core_2.9.2;1.3.1: not found

However, when I write in simple.sbt:

scalaVersion := "2.10.4"

sbt runs successfully and the application runs fine on Spark.

How can this happen, since I do not have Scala 2.10.4 on my system?

  • sbt downloads the Scala compiler in version 2.10.4 and uses it. Commented Apr 27, 2015 at 16:30
  • The post below may also help you, I believe: stackoverflow.com/questions/43883325/… Commented Nov 28, 2017 at 19:39

2 Answers


Scala is not a package; it is a library that executes on top of the Java runtime. Likewise, the Scala compiler, scalac, runs on top of a Java runtime. Having a version of Scala installed on your "system" is a convenience, but it is not in any way required.

Therefore, it is entirely possible to launch sbt with one version of Scala (2.9.2) while instructing it to run other commands (such as compilation) using an entirely different version of Scala (2.10.x), by passing the appropriate flags such as -classpath.
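To illustrate (a minimal sketch, with hypothetical project name and version): a build.sbt can pin the compile-time Scala version independently of whatever Scala, if any, is installed system-wide, and sbt will download that compiler itself.

```scala
// build.sbt — minimal sketch; the name and version are placeholders.
// sbt fetches the Scala compiler declared here on first use,
// regardless of which Scala version (if any) is on the system PATH.
name := "Simple Project"

version := "1.0"

scalaVersion := "2.10.4" // the compiler sbt downloads and runs your build with
```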

See: Can java run a compiled scala code?




As @noahlz said, you don't need Scala installed on your system, as sbt will fetch it for you.

The issue you're having is that there is no spark-core build for Scala 2.9.2 at version 1.3.1.

From what I can see in Maven Central (searching for spark-core), there are only builds of spark-core for Scala 2.10 and 2.11.

Therefore, I would recommend you use this setup:

scalaVersion := "2.11.6"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.1"

If for whatever reason that doesn't work for you, use Scala 2.10.5:

scalaVersion := "2.10.5"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.1"
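For context on why the Scala version matters here: the %% operator appends the Scala binary version to the artifact name before resolution, which is why a 2.9.2 build fails while 2.10.x succeeds. A sketch of the equivalence (assuming Scala 2.10.5, as above):

```scala
// build.sbt fragment — the two declarations below resolve the same artifact.
scalaVersion := "2.10.5"

// %% appends the Scala binary version ("_2.10") to the artifact name:
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.1"

// ...so it is equivalent to spelling the suffix out with plain %:
libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.3.1"
```

With scalaVersion set to 2.9.2, %% would instead look for spark-core_2.9.2, which does not exist in Maven Central — hence the unresolved-dependency error in the question.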
