
I am trying to convert a Row in a DataFrame to a case class and am getting the following error:

2019-08-19 20:13:08 Executor task launch worker for task 1 ERROR Executor:91 - Exception in task 0.0 in stage 1.0 (TID 1) java.lang.ClassCastException: org.apache.spark.sql.catalyst.expressions.GenericRowWithSchema cannot be cast to Models.City

Sample Log = {"Id": "1","City": {"name": "A","state": "B"}}

Below is the code that reads a text file containing JSON-formatted data and throws the above error:

import java.io.File
import org.apache.spark.sql.DataFrame
import org.apache.spark.sql.functions.col

case class City(name: String, state: String)

val file = new File("src/test/resources/log.txt")
val logs = spark
  .read
  .text(file.getAbsolutePath)
  .select(col("value").as("body"))

import spark.implicits._
var logDF: DataFrame = spark.read.json(logs.as[String])

// getAs[City]("City") only casts the underlying value, which is a
// GenericRowWithSchema, so this line throws the ClassCastException above
logDF.map(row => row.getAs[City]("City").state).show()

Basically, I cannot perform any operation on the DataFrame itself due to some restrictions. So, given a Row, how can we cast it to a case class? (I cannot use pattern matching here, as the case class can have a lot of fields as well as nested case classes.)

Thanks in advance. Any help is greatly appreciated!!


1 Answer


I had the same issue (Spark SQL 3.1.3). The solution was to convert the DataFrame to a Dataset and then access the fields:

import spark.implicits._
var logDF: DataFrame = spark.read.json(logs.as[String])
// as[City] yields a typed Dataset[City], so each element is already a case class instance
logDF.select("City").as[City].map(city => city.state).show()