
I have tried (unsuccessfully so far) to replace Log4j with Log4j2 for Apache Spark logging. So far I have managed to use Log4j2 for my application logs, but I would like to use it for Spark's internal logs as well, to avoid having two different logging frameworks and configurations coexisting.

Based on the related questions listed below, I understand that I should somehow tell Spark to ignore its own log4j jars and use the ones I provide, with the help of SPARK_CLASSPATH.
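
Concretely, what I mean by providing the jars via SPARK_CLASSPATH is something along these lines (the paths and versions below are only placeholders; log4j-1.2-api is the bridge jar that is supposed to route Log4j 1 API calls to Log4j2):

    # Placeholder paths and versions, adjust to the actual Log4j2 jars
    export SPARK_CLASSPATH=/opt/log4j2/log4j-api-2.x.jar:/opt/log4j2/log4j-core-2.x.jar:/opt/log4j2/log4j-1.2-api-2.x.jar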

However, I have faced two issues:

  1. If I remove the log4j jars from the $SPARK_HOME/jars directory and add the new ones through SPARK_CLASSPATH, the driver won't start, complaining about missing log4j classes.

  2. If I leave the log4j jars in the $SPARK_HOME/jars directory but tell spark-submit to use jars located in another directory (using --conf spark.driver.extraClassPath, roughly as sketched below), the application fails immediately with a StackOverflowError, similar to what is mentioned here.
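
The submit command for that second attempt looked roughly like this (the class name, jar name and paths are placeholders, not the real ones):

    # Placeholder class name, application jar and jar directory
    spark-submit \
      --class com.example.MyApp \
      --conf spark.driver.extraClassPath=/opt/log4j2/* \
      --conf spark.executor.extraClassPath=/opt/log4j2/* \
      my-app.jar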

I hope someone can shed some light on this issue.

Regards

Related questions: Using log4j2 in Spark java application, Can I use log4j2.xml in my Apache Spark application, Configuring Apache Spark Logging with Scala and logback

1 Answer


I'm afraid you can't use Log4j 2, since Spark is hardcoded to Log4j 1.

The best solution I found was to just use Log4j 1, so that I can collect Spark's own logs as well as my application's.
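
As a sketch of what that looks like in practice, a single log4j.properties (modelled on Spark's conf/log4j.properties.template) can cover both Spark's logs and the application's; the application package name below is just a placeholder:

    # Spark internals and the application share the same root logger/appender
    log4j.rootCategory=INFO, console
    log4j.appender.console=org.apache.log4j.ConsoleAppender
    log4j.appender.console.target=System.err
    log4j.appender.console.layout=org.apache.log4j.PatternLayout
    log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

    # Placeholder application package; raise or lower the level as needed
    log4j.logger.com.example.myapp=DEBUG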
