
When an invalid number of arguments is passed while calling the application jar (via spark-submit in this case), Scala scopt does not return a non-zero exit code. So although scopt failed, the scheduler shows the run as successful.

Dependency used:

    <dependency>
        <groupId>com.github.scopt</groupId>
        <artifactId>scopt_2.12</artifactId>
        <version>3.7.1</version>
    </dependency>

Sample code used:

    val parser = new scopt.OptionParser[IngestionCommandLineConfigs](
      ingestionPipelineJob
    ) {
      opt[String]('t', sampleTableArg)
        .required()
        .action((x, c) => c.copy(sampleTableArg = x))
        .text(tableDesc)
    }
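
For context, this is presumably how the parse result ends up ignored (a minimal sketch with hypothetical config and option names, not the actual job code): scopt's `parse` returns an `Option`, and if that `Option` is discarded, `main` finishes normally and the scheduler sees exit code 0.

    // Minimal sketch; IngestionCommandLineConfigs and the option names are hypothetical stand-ins
    case class IngestionCommandLineConfigs(sampleTable: String = "")

    object IngestionPipelineJob {
      def main(args: Array[String]): Unit = {
        val parser = new scopt.OptionParser[IngestionCommandLineConfigs]("ingestionPipelineJob") {
          opt[String]('t', "sampleTable")
            .required()
            .action((x, c) => c.copy(sampleTable = x))
            .text("table to ingest")
        }

        // parse() only prints the usage message and returns None on bad arguments;
        // it neither throws nor calls System.exit, so main() still finishes with exit code 0.
        parser.parse(args, IngestionCommandLineConfigs()) foreach { config =>
          // run the ingestion with config
        }
      }
    }
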
  • Can you provide a minimal reproducible example, including how you're calling this code? Commented Jun 20 at 13:50

1 Answer


Indeed, when scopt fails, by default it just prints a message; it doesn't return a non-zero exit code (and it doesn't throw an exception). This behavior is the same whether you use spark-submit or not.

You can set the exit code yourself with System.exit (or by throwing an exception):

parser.parse(args, Config()) map { config =>
  // do stuff
  println(s"Some: config=$config")
} getOrElse {
  // arguments are bad, usage message will have been displayed
  println(s"None")
  System.exit(100)                                      //  <--
  // throw new RuntimeException("arguments are bad")    //  <--
}

If you choose to throw an exception instead, the exit code will be 1 (the JVM's default when an exception escapes main).
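
For example, a compact variant of the same idea (a sketch; any unchecked exception type works the same way):

    // Fail fast: either we get a valid Config, or the uncaught exception ends the JVM with exit code 1
    val config: Config = parser.parse(args, Config())
      .getOrElse(throw new IllegalArgumentException("arguments are bad"))
    // do stuff with config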

My full reproduction:

case class Config(
  x: Int = 0,
  y: Int = 0,
  z: Int = 0,
)

object App {
  def main(args: Array[String]): Unit = {
    println("args=" + args.mkString(", "))

    val parser = new scopt.OptionParser[Config]("scopt") {
      head("scopt", "3.x")
      opt[Int]("x") required() action { (t, c) =>
        c.copy(x = t) } text("x is an integer property")
      opt[Int]("y") required() action { (t, c) =>
        c.copy(y = t) } text("y is an integer property")
      opt[Int]("z") required() action { (t, c) =>
        c.copy(z = t) } text("z is an integer property")
    }

    parser.parse(args, Config()) map { config =>
      // do stuff
      println(s"Some: config=$config")
    } getOrElse {
      // arguments are bad, usage message will have been displayed
      println(s"None")
      System.exit(100)
    }
  }
}

Testing:

  • running fat jar with all command-line options:

script.sh

java -jar target/scala-2.12/scoptdemo-assembly-0.1.0-SNAPSHOT.jar --x 1 --y 2 --z 3
echo "Exit code: $?"
./script.sh 

args=--x, 1, --y, 2, --z, 3
Some: config=Config(1,2,3)
Exit code: 0

  • running fat jar with a missing command-line option:

script.sh

java -jar target/scala-2.12/scoptdemo-assembly-0.1.0-SNAPSHOT.jar --x 1 --y 2
echo "Exit code: $?"
./script.sh

args=--x, 1, --y, 2
Error: Missing option --z
scopt 3.x
Usage: scopt [options]

  --x <value>  x is an integer property
  --y <value>  y is an integer property
  --z <value>  z is an integer property
None
Exit code: 100

  • running spark-submit with all command-line options:

script.sh

/path/to/spark-3.5.6-bin-hadoop3/bin/spark-submit --class App target/scala-2.12/scoptdemo-assembly-0.1.0-SNAPSHOT.jar --x 1 --y 2 --z 3
echo "Exit code: $?"
./script.sh 

WARNING: package sun.security.action not in java.base
25/06/21 04:45:16 WARN Utils: Your hostname, ... resolves to a loopback address: 127.0.1.1; using 192.168.8.7 instead (on interface wlo1)
25/06/21 04:45:16 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
args=--x, 1, --y, 2, --z, 3
Some: config=Config(1,2,3)
25/06/21 04:45:16 INFO ShutdownHookManager: Shutdown hook called
25/06/21 04:45:16 INFO ShutdownHookManager: Deleting directory /tmp/spark-95f5e216-25d8-4f22-83a5-47ea235bbef8
Exit code: 0

  • running spark-submit with a missing command-line option:

script.sh

/path/to/spark-3.5.6-bin-hadoop3/bin/spark-submit --class App  target/scala-2.12/scoptdemo-assembly-0.1.0-SNAPSHOT.jar --x 1 --y 2
echo "Exit code: $?"
./script.sh
 
WARNING: package sun.security.action not in java.base
25/06/21 04:47:22 WARN Utils: Your hostname, ... resolves to a loopback address: 127.0.1.1; using 192.168.8.7 instead (on interface wlo1)
25/06/21 04:47:22 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
args=--x, 1, --y, 2
Error: Missing option --z
scopt 3.x
Usage: scopt [options]

  --x <value>  x is an integer property
  --y <value>  y is an integer property
  --z <value>  z is an integer property
None
25/06/21 04:47:22 INFO ShutdownHookManager: Shutdown hook called
25/06/21 04:47:22 INFO ShutdownHookManager: Deleting directory /tmp/spark-17be09a2-c741-4cd4-afe4-4159d6420edd
Exit code: 100

BTW, the current version of scopt is 4.1.0 (a sketch of the equivalent scopt 4 code follows the links below):

https://github.com/scopt/scopt

https://eed3si9n.com/scopt3/

https://eed3si9n.com/scopt4/
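
If you migrate, the same approach would presumably look like this in scopt 4 (a sketch using the OParser builder API; the exit-code handling is unchanged):

    import scopt.OParser

    case class Config(x: Int = 0, y: Int = 0, z: Int = 0)

    object App {
      def main(args: Array[String]): Unit = {
        val builder = OParser.builder[Config]
        val parser = {
          import builder._
          OParser.sequence(
            programName("scopt"),
            head("scopt", "4.x"),
            opt[Int]("x").required().action((t, c) => c.copy(x = t)).text("x is an integer property"),
            opt[Int]("y").required().action((t, c) => c.copy(y = t)).text("y is an integer property"),
            opt[Int]("z").required().action((t, c) => c.copy(z = t)).text("z is an integer property")
          )
        }

        // OParser.parse returns Option[Config], just like OptionParser.parse in scopt 3
        OParser.parse(parser, args, Config()) match {
          case Some(config) => println(s"Some: config=$config") // do stuff
          case None         => System.exit(100)                 // make the failure visible to the scheduler
        }
      }
    }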
