
We can create a DataFrame from a list of Java objects using:

DataFrame df = sqlContext.createDataFrame(list, Example.class);

In Java, Spark can infer the schema directly from the class, in this case Example.class.

Is there a way to do the same in Scala?

1 Answer

If you use case classes in Scala, this works out of the box:

// Define the case class outside the method where it is used,
// so the implicit Encoder (and its TypeTag) can be derived.
case class MyCustomObject(id: Long, name: String, age: Int)

import spark.implicits._

val df = Seq(
  MyCustomObject(1L, "Peter", 34),
  MyCustomObject(2L, "John", 52)
).toDF()

df.show()

+---+-----+---+
| id| name|age|
+---+-----+---+
|  1|Peter| 34|
|  2| John| 52|
+---+-----+---+
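
For a closer parallel to the Java createDataFrame call in the question, SparkSession also has a createDataFrame overload that accepts a Seq of Product instances directly. A minimal sketch reusing the case class above (df2 is just an illustrative name, and spark is the SparkSession):

// Schema is again inferred from MyCustomObject by reflection;
// no toDF() call is needed.
val df2 = spark.createDataFrame(Seq(
  MyCustomObject(1L, "Peter", 34),
  MyCustomObject(2L, "John", 52)
))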

If you want to use a non-case class, you need to extend the Product trait and implement its methods (productArity, productElement, and canEqual) yourself, as sketched below.
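
A minimal sketch, assuming a hypothetical Person class: implementing Product by hand mirrors what the compiler generates for a case class, and should let Spark's reflection-based encoder infer the schema from the constructor vals (exact behavior may vary by Spark version):

// Hypothetical non-case class; Product is implemented manually.
class Person(val id: Long, val name: String) extends Product with Serializable {
  def productArity: Int = 2
  def productElement(n: Int): Any = n match {
    case 0 => id
    case 1 => name
    case _ => throw new IndexOutOfBoundsException(n.toString)
  }
  def canEqual(that: Any): Boolean = that.isInstanceOf[Person]
}

// With spark.implicits._ in scope, as above:
val people = Seq(new Person(1L, "Peter"), new Person(2L, "John")).toDF()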
