I am new to Scala and Spark. I have the case class A below:

case class A(uniqueId : String, attributes: HashMap[String, List[String]])

Now I have a DataFrame of type A. I need to call a Java function on each row of that DataFrame, which means converting the HashMap to a java.util.HashMap and each List to a java.util.List. How can I do that?

I am trying the following:

val rddCaseClass: RDD[A] = ...
val a = rddCaseClass.toDF().map { x =>
  val rowData = x.getAs[java.util.HashMap[String, java.util.List[String]]]("attributes")
  callJavaMethod(rowData)
}

But this is giving me the error:

java.lang.ClassCastException: scala.collection.mutable.WrappedArray$ofRef cannot be cast to java.util.List
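The cast fails because Spark materializes an array&lt;string&gt; column as a Scala WrappedArray, which is a Scala Seq and does not implement java.util.List. A quick check outside Spark (plain Scala, assuming 2.12-era collections) shows the mismatch:

```scala
object SeqIsNotJavaList {
  def main(args: Array[String]): Unit = {
    // Wrapping an Array yields a Scala Seq (a WrappedArray on Scala 2.12,
    // which is what Spark hands back for array<string> columns).
    val scalaSeq: Seq[String] = Array("Java", "Scala").toSeq

    // A Scala Seq is not a java.util.List, so the getAs cast blows up.
    println(scalaSeq.isInstanceOf[java.util.List[_]]) // false
  }
}
```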

Please help.


2 Answers


You can convert a Scala WrappedArray to a Java List using scala.collection.JavaConversions:

 import scala.collection.JavaConversions
 import scala.collection.mutable.WrappedArray

 val wrappedArray: WrappedArray[String] = WrappedArray.make(Array("Java", "Scala"))
 val javaList = JavaConversions.mutableSeqAsJavaList(wrappedArray)

JavaConversions.asJavaList can also be used, but it is deprecated; use mutableSeqAsJavaList instead.
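Building on that, the whole attributes column can be rebuilt into the java.util.HashMap[String, java.util.List[String]] that callJavaMethod expects. A sketch using scala.collection.JavaConverters (the non-deprecated counterpart of JavaConversions on Scala 2.12; toJavaAttributes is a hypothetical helper name):

```scala
import scala.collection.JavaConverters._

object ConvertAttributes {
  // Hypothetical helper: rebuild the Java map/list structure from the
  // Scala Map[String, Seq[String]] that row.getAs actually returns.
  def toJavaAttributes(
      attrs: Map[String, Seq[String]]): java.util.HashMap[String, java.util.List[String]] = {
    val out = new java.util.HashMap[String, java.util.List[String]]()
    // asJava wraps each Scala Seq as a java.util.List without copying.
    attrs.foreach { case (key, values) => out.put(key, values.asJava) }
    out
  }

  def main(args: Array[String]): Unit = {
    val attrs = Map("langs" -> Seq("Java", "Scala"))
    val javaMap = toJavaAttributes(attrs)
    println(javaMap.get("langs")) // [Java, Scala]
  }
}
```

Inside the Spark .map you would then call something like toJavaAttributes(x.getAs[Map[String, Seq[String]]]("attributes")) before handing the result to callJavaMethod.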



I think you could use Seq instead of List in your parameter types. That way the code works with most Seq implementations, and there is no need to convert Seqs such as WrappedArray at all.

val rddCaseClass: RDD[A] = ...
val a = rddCaseClass.toDF().map { x =>
  // Spark returns map columns as a Scala Map, not a java.util.HashMap
  val rowData = x.getAs[Map[String, Seq[String]]]("attributes")
  callJavaMethod(rowData)
}
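For completeness, here is the idea outside Spark: a WrappedArray already is a Seq, so a method declared with Seq parameters consumes it with no conversion. callScalaMethod below is a hypothetical stand-in for a method you control:

```scala
object SeqParameters {
  // Hypothetical method under your control: declaring the parameter values
  // as Seq[String] lets it accept WrappedArray, List, Vector, etc. directly.
  def callScalaMethod(attrs: Map[String, Seq[String]]): Int =
    attrs.valuesIterator.map(_.length).sum

  def main(args: Array[String]): Unit = {
    // Array(...).toSeq mimics the Seq that Spark returns for an array column.
    val attrs = Map("langs" -> Array("Java", "Scala").toSeq)
    println(callScalaMethod(attrs)) // 2
  }
}
```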

