
I am trying to build a Maven project with the Scala nature in the Eclipse IDE, and I get the following error:

object sql is not a member of package org.apache.spark

What we tried

We added this dependency to pom.xml:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>1.6.0</version>
</dependency>
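
For reference, here is a minimal sketch (the class and app name are ours, and it assumes spark-sql_2.11 1.6.0 is actually resolved) that compiles only when the Spark SQL dependency is on the classpath; if Eclipse has not picked up the dependency, the same error appears on the SQLContext line:

import org.apache.spark.{SparkConf, SparkContext}

object SqlDependencyCheck {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setMaster("local[*]").setAppName("sql-dependency-check"))

    // Resolves only when spark-sql_2.11 is on the compile classpath; otherwise the
    // compiler reports "object sql is not a member of package org.apache.spark" here.
    val sqlContext = new org.apache.spark.sql.SQLContext(sc)

    println(sqlContext.emptyDataFrame.count()) // prints 0 when everything is wired up
    sc.stop()
  }
}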

Input code

import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.sql.SaveMode
import org.apache.spark.streaming.kafka.KafkaUtils
import org.apache.spark.streaming.Seconds
import org.apache.spark.streaming.StreamingContext

object MyApp {
  def main(args: Array[String]) {

    // Read from the Kafka topic
    val conf = new SparkConf().setMaster("local[*]").setAppName("Spark-Kafka-Integration")
    val sc = new SparkContext(conf)
    val ssc = new StreamingContext(sc, Seconds(5))
    val kafkaStream = KafkaUtils.createStream(ssc, "hostname:2181", "spark-streaming-consumer-group", Map("test4" -> 1))

    val sqlContext = new org.apache.spark.sql.SQLContext(sc)
    import sqlContext.implicits._

    kafkaStream.foreachRDD(rdd => {
      rdd.foreach(println)

      if (rdd.count() > 0) {
        // rdd.toDF("value").coalesce(1).write.mode(SaveMode.Append).text("file:///D:/my/")
        // rdd.toDF("value").coalesce(1).write.mode(SaveMode.Append).text("file://user/cloudera/testdata")

        // The stream yields (key, message) pairs, so keep only the message value;
        // the single-column text writer then accepts the DataFrame.
        rdd.map(_._2).toDF("value").coalesce(1).write.mode(SaveMode.Append).text("hdfs://hostname:8020/user/cloudera/testdata")

        // rdd.saveAsTextFile("C:/data/spark/")
      }
    })

    ssc.start()
    ssc.awaitTermination()
  }
}

Complete pom.xml

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>com.cyb</groupId>
    <artifactId>First</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <packaging>jar</packaging>

    <name>First</name>
    <url>http://maven.apache.org</url>

    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    </properties>

    <dependencies>

        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming_2.11</artifactId>
            <version>1.6.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming-kafka_2.11</artifactId>
            <version>1.6.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.kafka</groupId>
            <artifactId>kafka-clients</artifactId>
            <version>0.10.2.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.11</artifactId>
            <version>1.6.0</version>
        </dependency>

        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>3.8.1</version>
            <scope>test</scope>
        </dependency>
    </dependencies>
</project>

Output

We want to write the streaming data from the Kafka topic into HDFS.

Any help would be much appreciated.

  • What are the other dependencies, and which versions did you use? Commented Apr 9, 2018 at 11:36
  • After adding the spark-sql_2.11 dependency, have you tried mvn clean install? Commented Apr 9, 2018 at 11:41
  • Yes, I ran mvn clean install after adding the dependency. Commented Apr 9, 2018 at 12:01
  • I used the following dependency version: <dependency> <groupId>org.apache.spark</groupId> <artifactId>spark-sql_2.11</artifactId> <version>1.6.0</version> </dependency> Commented Apr 9, 2018 at 12:02
  • Can you see the jar file for Spark SQL among the project's libraries? Commented Apr 9, 2018 at 12:03

1 Answer


You need to import the Spark SQL libraries to use the Spark SQL functions. Try importing these:

import org.apache.spark.sql.hive.HiveContext
import org.apache.spark.sql.SQLImplicits
import org.apache.spark.sql.SQLContext
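
For completeness, a minimal sketch (the object name and output path are illustrative, and it assumes spark-sql_2.11 1.6.0 is on the classpath) of how the SQLContext and its implicits are then used to turn an RDD into a DataFrame and append it as text, mirroring the write in the question:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.{SQLContext, SaveMode}

object WriteSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setMaster("local[*]").setAppName("write-sketch"))
    val sqlContext = new SQLContext(sc)
    import sqlContext.implicits._ // brings toDF into scope for RDDs

    // Convert an RDD of strings to a single-column DataFrame and append it as text files.
    sc.parallelize(Seq("a", "b", "c"))
      .toDF("value")
      .coalesce(1)
      .write
      .mode(SaveMode.Append)
      .text("hdfs://hostname:8020/user/cloudera/testdata")

    sc.stop()
  }
}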