I wrote a Spark program in Scala. Now I want to run the script I wrote in the terminal. In PySpark I use spark-submit for a Python file; now I want to do the same for my Scala program. I do not want to use IntelliJ or write my program in spark-shell. I just want to write my code in an editor and run it with a command in the terminal. Is that possible? Thank you in advance.
1 Answer
Create a JAR file for your code (say it is named HelloWorld.jar). You can use an HDFS path or a local path, as in the examples below.
You can add many more options to the commands below; they are documented at the URL given by philantrovert in the comments.
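One common way to produce that JAR is sbt (the build tool, project name, and versions here are assumptions, not something from the answer); a minimal build.sbt might look like:

```scala
// build.sbt — minimal sketch, assuming sbt and a Spark 3.x cluster
// (the name and version numbers are examples, adjust to your setup)
name := "HelloWorld"
version := "0.1"
scalaVersion := "2.12.18"

// "provided" because the Spark runtime supplies these jars when
// the application is launched via spark-submit
libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.5.1" % "provided"
```

Running `sbt package` then produces a JAR under target/ (something like target/scala-2.12/helloworld_2.12-0.1.jar), which is the file you pass to spark-submit.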
Run in Local mode.
spark-submit --class path.to.YourMainClass --master local[*] /path/to/your/jar/HelloWorld.jar
Run in cluster mode.
spark-submit --deploy-mode cluster --class path.to.YourMainClass --master yarn hdfs://nameservice1/hdfsPath/to/your/jar/HelloWorld.jar
3 Comments
Ali AzG
Thank you. And what is "path.to.YourMainClass" exactly? What should I place instead of it?
Chandan Ray
It should be the fully qualified name of your class: the complete package name followed by the class name.
Praveen L
If the package in which you created your program is org.myprograms and your main class is MainClass, give org.myprograms.MainClass.
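To make that concrete, here is a minimal sketch of what such a main class could look like (the names are the hypothetical ones from the comment; the Spark call is shown only in a comment so the sketch compiles without Spark on the classpath):

```scala
// Sketch of a minimal main object for spark-submit.
// In a real project this file would start with `package org.myprograms`,
// making the value for --class be org.myprograms.MainClass.
object MainClass {
  def main(args: Array[String]): Unit = {
    // A real Spark job would create a session here, e.g.
    //   val spark = SparkSession.builder.appName("HelloWorld").getOrCreate()
    // and stop it with spark.stop() at the end.
    println("Hello from MainClass")
  }
}
```

spark-submit loads the JAR, looks up the object named by --class, and invokes its main method, so the fully qualified name must match the package and object name exactly.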