
Right now, I have to use df.count() > 0 to check whether the DataFrame is empty. But that is inefficient, since count() scans the entire DataFrame just to produce a number. Is there a better way to do this?

Thanks.

PS: I want to check whether it's empty so that I only save the DataFrame if it's not empty, roughly as sketched below.
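A minimal sketch of what I am doing now (the input and output paths are placeholders, not real ones):

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()
    df = spark.read.parquet("/tmp/input")  # placeholder input path

    # count() scans the whole DataFrame just to learn whether any row exists.
    if df.count() > 0:
        df.write.parquet("/tmp/output")  # placeholder output path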

How to check if spark dataframe is empty - that question shows a way in Scala; I am looking for a way in PySpark.

  • Exactly the same way, up to a single pair of parentheses: df.rdd.isEmpty(). See the sketch after these comments.
  • Thank you very much, I was trying df.head(1).isEmpty and was worried that it was not working. Thank you @user6910411
  • Also see this comment for another approach.
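Putting the comments together, a minimal sketch of both checks (the example data and output path are placeholders, not from the original thread):

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()
    df = spark.createDataFrame([(1, "a")], ["id", "value"])  # placeholder data

    # Option 1: isEmpty() on the underlying RDD stops as soon as it
    # finds a row, so it avoids the full scan that count() performs.
    if not df.rdd.isEmpty():
        df.write.mode("overwrite").parquet("/tmp/output")  # placeholder path

    # Option 2: head(1) returns a plain Python list in PySpark (it has
    # no .isEmpty, unlike Scala's Seq), so test its length instead.
    if len(df.head(1)) > 0:
        df.write.mode("overwrite").parquet("/tmp/output")

Both checks stop after finding a single row, so on a large DataFrame they are cheaper than count(), which has to touch every partition.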
