I'm writing a Spark/Scala program in IntelliJ. It basically pulls tables from Oracle and stores them in HDFS as text files with insert_df.rdd.saveAsTextFile("hdfs://path").

I also have some conversions to apply to the generated text files, and I wrote a shell script for that. I don't want to run the Spark jar and the .sh file separately, so I tried calling the script from inside the program like this, but it didn't work:

val script_sh = "///samplepath/file_creation_script.sh".!

Please let me know if there is any way I can call the shell script from within the program.
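For reference, here is a minimal sketch of the call I am attempting after the write finishes, assuming scala.sys.process (which is where the ! operator comes from) is the right mechanism; the script path is just a placeholder:

import scala.sys.process._

// Run the conversion script; "!" blocks until the process finishes
// and returns its exit code.
val exitCode: Int = Seq("/bin/bash", "/samplepath/file_creation_script.sh").!
if (exitCode != 0) {
  sys.error(s"file_creation_script.sh exited with code $exitCode")
}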
(I have also tried doing the write with df.write.text("/hdfs/path").)
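And this is roughly the read/write part of the program; the JDBC URL, credentials, table name, and HDFS paths below are placeholders:

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("OracleToHdfs").getOrCreate()

// Pull the table from Oracle over JDBC (connection details are placeholders).
val insert_df = spark.read
  .format("jdbc")
  .option("url", "jdbc:oracle:thin:@//dbhost:1521/service")
  .option("dbtable", "SCHEMA.TABLE_NAME")
  .option("user", "db_user")
  .option("password", "db_password")
  .option("driver", "oracle.jdbc.OracleDriver")
  .load()

// Write the rows out as delimited text under the HDFS path.
insert_df.rdd.map(_.mkString("|")).saveAsTextFile("hdfs://namenode/output/path")

// df.write.text also works, but only when the DataFrame has a single string column.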