I am trying to create a DataFrame in a Kafka-Spark stream. I have successfully mapped the values to a case class, but whenever I call the `toDF` method I get this error:
```
[error] value toDF is not a member of Array[WeatherEvent]
[error] possible cause: maybe a semicolon is missing before `value toDF'?
[error]   }).toDF("longitude", "latitude", "country", "sunrise", "sunset", "temperature", "temperatureMin", "temperatureMax",
[error]      ^
[error] one error found
[error] (compile:compileIncremental) Compilation failed
[error] Total time: 2 s, completed Sep 27, 2017 11:49:23 AM
```
Here is my code:

```scala
val inputStream = KafkaUtils.createDirectStream(
  ssc, PreferConsistent, Subscribe[String, String](Array("test"), kafkaParams))

// val json = parse(inputStream)
val processedStream = inputStream
  .flatMap(record => record.value.split(" ").map(payload => {
    // val ts = Timestamp.valueOf(payload(3))
    WeatherEvent(payload(0).toDouble, payload(1).toDouble, payload(2).toString, payload(3).toInt,
      payload(4).toInt, payload(5).toDouble, payload(6).toDouble, payload(7).toDouble,
      payload(8).toDouble, payload(9).toInt, payload(10).toInt, payload(11).toInt,
      payload(12).toDouble, payload(13).toDouble)
  }).toDF("longitude", "latitude", "country", "sunrise", "sunset", "temperature", "temperatureMin", "temperatureMax",
    "pressure", "humidity", "cloudiness", "id", "wind_speed", "wind_deg")
  )
```
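To rule out the field mapping itself, here is a minimal standalone sketch of the parsing step outside Spark. The sample record and its field order are made up to match my `toDF` column list, and `WeatherEvent` is redeclared here so the snippet is self-contained:

```scala
// Sketch only: parse one whitespace-separated record into the case class.
// The sample values below are invented for illustration.
case class WeatherEvent(
  longitude: Double, latitude: Double, country: String,
  sunrise: Int, sunset: Int,
  temperature: Double, temperatureMin: Double, temperatureMax: Double,
  pressure: Double, humidity: Int, cloudiness: Int, id: Int,
  windSpeed: Double, windDeg: Double)

val sample = "12.5 41.9 IT 1506480000 1506523200 295.6 291.1 298.2 1013.2 60 40 3169070 3.5 180.0"
val f = sample.split(" ")
val event = WeatherEvent(
  f(0).toDouble, f(1).toDouble, f(2), f(3).toInt, f(4).toInt,
  f(5).toDouble, f(6).toDouble, f(7).toDouble, f(8).toDouble,
  f(9).toInt, f(10).toInt, f(11).toInt, f(12).toDouble, f(13).toDouble)

println(event.country) // prints "IT"
```

This parses fine on its own, so the compile error seems to come from where `toDF` is being called, not from the mapping.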
For completeness, I also have this import in scope:

```scala
import ssc.implicits._
```

Thanks.