
I am facing the following issue while reading a CSV file with Spark in IntelliJ using Scala.

Error Message:

Exception in thread "main" java.lang.NoSuchMethodError: scala.collection.mutable.Buffer$.empty()Lscala/collection/GenTraversable;
    at org.apache.spark.sql.SparkSessionExtensions.<init>(SparkSessionExtensions.scala:72)
    at org.apache.spark.sql.SparkSession$Builder.<init>(SparkSession.scala:780)
    at org.apache.spark.sql.SparkSession$.builder

Source Code:

import org.apache.spark.sql.SparkSession

object broadcastright {

  def main(args: Array[String]): Unit = {
    val spark = SparkSession
      .builder()
      .master("local")
      .appName("Read CSV File")
      .getOrCreate()

    val df = spark.read
      .option("header", "true")
      .option("delimiter", ",")
      .option("inferSchema", "false")
      .csv("src\\main\\resources\\people.csv") // note: .load() without .format("csv") would default to Parquet

    df.show()
  }
}

pom.xml

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.12</artifactId>
    <version>2.4.5</version>
</dependency>

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.12</artifactId>
    <version>2.4.0</version>
</dependency>

I am not sure what the cause of this issue is.

  • What version of Scala are you using? Commented May 23, 2020 at 12:05
  • Is this issue fixed? Commented May 23, 2020 at 12:50
  • No, still not yet. Using Spark-sdk-2.13.1 Commented May 23, 2020 at 16:56
  • From your pom.xml, Spark uses Scala version 2.12, and you have to use the same Scala version in the IDE as well. Commented May 23, 2020 at 16:59
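As a quick sanity check for the mismatch discussed in these comments, you can print the Scala version your program actually runs against and compare it with the `_2.12` suffix of the Spark artifacts (a minimal sketch; the object name is arbitrary):

```scala
object ScalaVersionCheck {
  def main(args: Array[String]): Unit = {
    // scala.util.Properties.versionString returns something like "version 2.12.10".
    // The major.minor part must match the _2.12 suffix of the Spark
    // artifacts declared in pom.xml, or you will hit NoSuchMethodError
    // at runtime, as in the question.
    println(scala.util.Properties.versionString)
  }
}
```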

3 Answers


If there is any version difference among the Spark libraries, you will run into many issues like this: some methods may not be available in a newer version, and some newly added methods may not be available in an older version of Spark.

Please use the same version for all Spark libraries in your Maven file. Also, the Scala version in pom.xml and the Scala version on the classpath should match.
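For example, a consistent setup might look like the following (a sketch; the key point is that both Spark artifacts share the same version and the same `_2.12` Scala suffix as in the question's pom.xml, where spark-core was 2.4.5 but spark-sql was 2.4.0):

```xml
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.12</artifactId>
    <version>2.4.5</version>
</dependency>

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.12</artifactId>
    <version>2.4.5</version>
</dependency>
```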




You run into this type of issue if IntelliJ is using an incompatible Scala compiler version. This is a very common problem in an IntelliJ workspace when building a Spark pipeline with Scala.

When you install the Scala plugin in IntelliJ, it installs a default version of the Scala SDK with it. You just have to make sure the installed version of the Scala compiler/SDK is compatible with your Spark version.

For example, if you are using Spark 3.x.x, make sure you have a Scala 2.12.x compiler installed.

Process to fix the issue:

File -> Project Structure -> Platform Settings -> Global Libraries -> +

  1. Click on the + icon.
  2. Click on the version of Scala you want to download. In my case, Maven 2.12.2.
  3. Select the Scala version from the dropdown and click OK. In my case, 2.12.2.
  4. Tag your project against it.
  5. This is how your configuration should look.

(Screenshots omitted.)



In my case, Scala 2.13 was installed automatically in IntelliJ IDEA, but Spark was running on Scala 2.12.

I removed Scala 2.13 from IntelliJ IDEA and installed Scala 2.12 using Maven.
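If you manage Scala through Maven as this answer suggests, declaring the Scala library explicitly in pom.xml keeps the compiler version aligned with Spark's `_2.12` artifacts (a sketch; the exact patch version is illustrative):

```xml
<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>2.12.10</version>
</dependency>
```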
