
I have a Spark job running with 260K tasks. I can check each individual task's executor computing time in the Spark UI. To calculate the resource usage of the whole job, how do I sum up all of the computing time?

1 Answer


Using the Spark UI → Stages tab

  1. Go to the Spark UI.

  2. Open the Stages tab.

  3. For each stage:

    • Look at the “Executor CPU Time” column (labelled “Task Time” in some views/versions).

    • This is the total executor CPU time across all tasks in that stage.

  4. Sum these values across all stages to get the total compute time of the job (a scripted version of this summation is sketched after this list).

    • Alternatively, you can download the Spark event logs and parse them with a script to extract CPU time per stage (see the event-log sketch below).
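
With 260K tasks, clicking through every stage by hand is impractical, so you can automate step 4 against Spark's monitoring REST API, which exposes the same per-stage totals the UI shows (each StageData entry carries executorRunTime and executorCpuTime fields). A minimal Python sketch, assuming the driver UI is reachable at localhost:4040 and a single application is attached; adjust host, port, and application selection for your cluster or history server:

    import requests

    # Assumed driver UI address; point this at your driver or history server.
    API = "http://localhost:4040/api/v1"

    # Take the first (here: only) application attached to this UI.
    app_id = requests.get(f"{API}/applications").json()[0]["id"]

    total_cpu_ns = 0  # executorCpuTime is reported in nanoseconds
    total_run_ms = 0  # executorRunTime is reported in milliseconds

    # Sum the per-stage aggregates over every stage of the job.
    for stage in requests.get(f"{API}/applications/{app_id}/stages").json():
        total_cpu_ns += stage.get("executorCpuTime", 0)
        total_run_ms += stage.get("executorRunTime", 0)

    print(f"Executor CPU time: {total_cpu_ns / 1e9 / 3600:.2f} core-hours")
    print(f"Executor run time: {total_run_ms / 1e3 / 3600:.2f} core-hours")

Note that the stages endpoint returns failed and retried stage attempts as well; filter on the stage's status field if you only want completed attempts.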
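For the event-log route, each SparkListenerTaskEnd event in the log carries that task's metrics, so summing them over the whole file gives job-level totals. A minimal sketch, assuming an uncompressed JSON-lines event log (the path is a placeholder; use the file written under spark.eventLog.dir):

    import json

    total_cpu_ns = 0  # "Executor CPU Time" is in nanoseconds
    total_run_ms = 0  # "Executor Run Time" is in milliseconds

    # Placeholder path; point this at your application's event log file.
    with open("/path/to/spark-events/app-XXXX") as log:
        for line in log:
            event = json.loads(line)
            if event.get("Event") == "SparkListenerTaskEnd":
                metrics = event.get("Task Metrics") or {}
                total_cpu_ns += metrics.get("Executor CPU Time", 0)
                total_run_ms += metrics.get("Executor Run Time", 0)

    print(f"Executor CPU time: {total_cpu_ns / 1e9 / 3600:.2f} core-hours")
    print(f"Executor run time: {total_run_ms / 1e3 / 3600:.2f} core-hours")

To get per-stage rather than whole-job totals, group the sums by the event's "Stage ID" field instead of accumulating into a single counter.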