I would like to submit an Apache Spark application to YARN programmatically (in Java, not in Scala). When I try to do it with the following code:
package application.RestApplication;

import org.apache.hadoop.conf.Configuration;
import org.apache.spark.SparkConf;
import org.apache.spark.deploy.yarn.Client;
import org.apache.spark.deploy.yarn.ClientArguments;

public class App {
    public static void main(String[] args1) {
        String[] args = new String[] {
                "--class", "org.apache.spark.examples.JavaWordCount",
                // "--deploy-mode", "cluster",
                // "--master", "yarn",
                // "--driver-memory", "3g",
                // "--executor-memory", "3g",
                "--jar", "/opt/spark/examples/jars/spark-examples_2.11-2.0.0.jar",
                "--arg", "hdfs://hadoop-master:9000/input/file.txt"
        };
        Configuration config = new Configuration();
        System.setProperty("SPARK_YARN_MODE", "true");
        SparkConf sparkConf = new SparkConf();
        ClientArguments cArgs = new ClientArguments(args);
        Client client = new Client(cArgs, config, sparkConf);
        client.run();
    }
}
I get an error on the line ClientArguments cArgs = new ClientArguments(args);:
Exception in thread "main" java.lang.NoSuchMethodError: scala.collection.immutable.$colon$colon.hd$1()Ljava/lang/Object;
    at org.apache.spark.deploy.yarn.ClientArguments.parseArgs(ClientArguments.scala:38)
    at org.apache.spark.deploy.yarn.ClientArguments.<init>(ClientArguments.scala:31)
    at application.RestApplication.App.main(App.java:37)
The problem seems to be with parsing String[] args: when the array is empty, the program starts (but with no parameters there is nothing to run). When I pass the correct parameters (as above) or incorrect ones (e.g. "--foo", "foo"), I get the same error. How can I fix it?
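For reference, I know submission can also be done through the public SparkLauncher API instead of the internal yarn.Client; below is a minimal, untested sketch of that route, reusing the jar path, main class, and HDFS argument from my code above (the SPARK_HOME path "/opt/spark" is an assumption about my install):

```java
import org.apache.spark.launcher.SparkLauncher;

public class LauncherApp {
    public static void main(String[] args) throws Exception {
        // Sketch only: uses the public SparkLauncher API, which shells out
        // to spark-submit rather than calling yarn.Client directly.
        Process spark = new SparkLauncher()
                .setSparkHome("/opt/spark") // assumed Spark install dir
                .setAppResource("/opt/spark/examples/jars/spark-examples_2.11-2.0.0.jar")
                .setMainClass("org.apache.spark.examples.JavaWordCount")
                .setMaster("yarn")
                .setDeployMode("cluster")
                .addAppArgs("hdfs://hadoop-master:9000/input/file.txt")
                .launch();
        // Block until the spark-submit child process exits.
        int exitCode = spark.waitFor();
        System.out.println("spark-submit exited with code " + exitCode);
    }
}
```

This needs the spark-launcher artifact on the classpath but no Scala-facing classes, so it sidesteps binary-compatibility issues with Spark's internal API.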