Apache Spark UI shows wrong number of jobs

Apache Spark UI shows the wrong number of active jobs.

Written by ashish

Last published at: May 11th, 2022

Problem

You are reviewing the number of active Apache Spark jobs on a cluster in the Spark UI, but the number is too high to be accurate.

If you restart the cluster, the number of jobs shown in the Spark UI is correct at first, but over time it grows abnormally high.

Cause

The Spark UI is not always accurate for large or long-running clusters because of dropped events. The Spark UI relies on termination entries to know when an active job has completed. If a job misses this entry due to an error or unexpected failure, it can stop running while still incorrectly showing as active in the Spark UI.

Info

For more information, review the "Apache Spark UI is not in sync with job" KB article.

Solution

You should not use the Spark UI as a source of truth for active jobs on a cluster.

The method sc.statusTracker().getActiveJobIds() in the Spark API is a reliable way to track the number of active jobs.
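As a minimal Scala sketch, assuming a notebook or application where a SparkSession is available (the application name used here is only illustrative), you could query the status tracker instead of reading the count from the Spark UI:

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("active-job-count").getOrCreate()
val sc = spark.sparkContext

// getActiveJobIds() returns the IDs of all jobs currently running on this SparkContext.
val activeJobIds: Array[Int] = sc.statusTracker.getActiveJobIds()
println(s"Active jobs: ${activeJobIds.length} (IDs: ${activeJobIds.mkString(", ")})")

In a monitoring loop or long-running application, you could call this periodically to report the number of active jobs rather than relying on the count shown in the Spark UI.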

Please review the Spark Status Tracker documentation for more information.
