I had the same issue, and here is how I fixed it.

I ran spark-shell in cmd (Windows), and it worked because the SPARK_HOME environment variable is set in my system/user environment variables. In the same cmd window I could see the Scala version and the Spark version. I then opened the build.sbt file in the base directory of my Scala project and changed the Scala version to match the one reported by spark-shell, like this:
ThisBuild / scalaVersion := "2.12.15"
and I had these dependencies:
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "3.3.4", // Spark Core
  "org.apache.spark" %% "spark-sql"  % "3.3.4"  // Spark SQL (optional, if you need SQL support)
)
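For context, a minimal build.sbt combining the pieces above might look like the following sketch. Only the Scala and Spark versions come from this answer; the project name and version are hypothetical placeholders:

```scala
// Hypothetical minimal build.sbt sketch; the Scala and Spark versions
// match the answer above, everything else is a placeholder.
ThisBuild / scalaVersion := "2.12.15"
ThisBuild / version      := "0.1.0-SNAPSHOT"

lazy val root = (project in file("."))
  .settings(
    name := "myscalaproj",
    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core" % "3.3.4", // Spark Core
      "org.apache.spark" %% "spark-sql"  % "3.3.4"  // Spark SQL (optional)
    )
  )
```

Note that %% appends the Scala binary version (here _2.12) to the artifact name, which is why it has to agree with your Spark installation's Scala version.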
where 3.3.4 is the same version as my Spark installation. I then ran spark-submit again like this:
spark-submit --class "main.wordcountapp" --master "local[*]" "F:\path\to\base_directory\target\scala-2.12\myscalaproj_2.12-0.1.0-SNAPSHOT.jar"
and it ran as expected.

Here wordcountapp is the name of my object that has the main function inside it, and

"F:\path\to\base_directory\target\scala-2.12\myscalaproj_2.12-0.1.0-SNAPSHOT.jar"

is the path of the generated JAR file.
To generate this JAR, run sbt package inside base_directory.
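In case it helps, the word-counting logic inside such a wordcountapp object could be sketched in plain Scala like this. The object name and sample input are hypothetical; a real Spark job would apply the same split/group/count steps to an RDD (e.g. via SparkContext.textFile, flatMap, and reduceByKey) instead of a local Seq:

```scala
// Hypothetical sketch: the core word-count logic without Spark,
// so it runs anywhere. A real Spark app would do the equivalent
// over an RDD inside its main method.
object WordCountSketch {
  def countWords(lines: Seq[String]): Map[String, Int] =
    lines
      .flatMap(_.split("\\s+"))      // split each line into words
      .filter(_.nonEmpty)            // drop empty tokens
      .groupBy(identity)             // group equal words together
      .map { case (word, occurrences) => word -> occurrences.size }

  def main(args: Array[String]): Unit = {
    val sample = Seq("spark makes word counting easy", "word counting with spark")
    countWords(sample).toSeq.sortBy(-_._2).foreach {
      case (word, count) => println(s"$word: $count")
    }
  }
}
```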
Hope it helps.