import scala.collection.mutable.ArrayBuffer
spark.sql("set db=test_script")
spark.sql("set table=member_test")
val colDF = spark.sql("show columns from ${table} from ${db}")
var tempArray = new ArrayBuffer[String]()
var temp = ""
colDF.foreach { row =>
  row.toSeq.foreach { col =>
    // build a count(case ...) expression for each column name
    temp = "count(case when " + col + " = 'X' then 1 else NULL END) AS count" + col
    tempArray += temp
  }
}
println(tempArray) // getting empty array
println(temp) // getting blank string
Hi, I am new to Scala programming. I am trying to loop through a DataFrame and append the formatted strings to my ArrayBuffer. When I put a print statement inside the loop, everything seems fine, but if I try to access the ArrayBuffer outside the loop, it is empty. Is this related to the scope of the variable? I am using an ArrayBuffer because I read that List is immutable in Scala. Please suggest a better way if you have one. Thanks in advance.
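
For reference, here is a minimal local sketch of the same append logic over a plain Scala Seq (the column names colA and colB are just placeholders I made up). Run in a plain Scala REPL, this fills the buffer as I expect, so the string-building part itself seems fine:

import scala.collection.mutable.ArrayBuffer

// placeholder column names, only for illustration
val localCols = Seq("colA", "colB")

val buf = new ArrayBuffer[String]()
localCols.foreach { col =>
  // same expression-building logic as in the DataFrame loop above
  buf += "count(case when " + col + " = 'X' then 1 else NULL END) AS count" + col
}

println(buf)
// ArrayBuffer(count(case when colA = 'X' then 1 else NULL END) AS countcolA,
//             count(case when colB = 'X' then 1 else NULL END) AS countcolB)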