Project performance dashboard

The Performance dashboard is one of the primary tools for managing labeling operations in a Labelbox project. It reports the throughput, efficiency, and quality of the labeling process. These analytics are reported at the overall project level and at the individual member level, and you can perform diagnostic analysis of the labeling operation at each level of detail.

Evaluating the performance of your data labeling operation can be broken down into three components. Each of these has its own view so you can better understand the overall performance of your labeling operation.

Throughput

The Throughput view provides insight into the amount of labeling work being produced. The metrics in this section help you answer questions like "How many assets were labeled in the last 30 days?", "How much time is being spent reviewing labeled assets?", and "What is the average amount of labeling work being produced?"

Clicking a bar in any of the bar charts shows the labels that make up that data point in the Activity table.

These metrics are available at the project level (i.e., across all members in the project) and at the individual member level. Here are the descriptions for each metric in the Throughput view.

Labels: The count of labeled assets over the selected period.

Reviews: The count of Reviews over the selected period. A Review is created when a Thumbs Up (Approved) or Thumbs Down (Rejected) is submitted while reviewing a labeled asset, whether from the Label browser or from the queue-based review.

Labeling time: The sum of labeling time over the selected period.

Reviewing time: The sum of reviewing and editing (rework) time over the selected period.

NOTE: When a user reviews and/or edits a labeled asset that they created, that time is not counted toward Reviewing time; it is counted toward Labeling time. Reviewing time is captured when a member who did not create the labeled asset spends time viewing, editing, and/or reviewing that labeled asset in the Label browser or via queue-based review.

Total time: The sum of all labeling, reviewing, and reworking time over the selected period.
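
The throughput metrics above are straightforward aggregations over per-label activity. If you export labeling activity yourself, you can reproduce them with a few sums. The sketch below is illustrative only: the record fields (created_by, label_seconds, reviews, by, seconds, verdict) are hypothetical stand-ins, not the actual Labelbox export schema, and it also mimics the self-review rule from the note above.

```python
from collections import defaultdict

# Hypothetical per-label activity records; the field names are illustrative
# stand-ins, not the actual Labelbox export schema.
records = [
    {"created_by": "alice", "label_seconds": 120, "reviews": [
        {"by": "bob", "seconds": 30, "verdict": "approved"}]},
    {"created_by": "bob", "label_seconds": 95, "reviews": [
        {"by": "alice", "seconds": 20, "verdict": "rejected"},
        {"by": "bob", "seconds": 15, "verdict": "approved"}]},  # creator self-review
]

totals = defaultdict(float)                            # project-level throughput
per_member = defaultdict(lambda: defaultdict(float))   # individual-level throughput

for rec in records:
    creator = rec["created_by"]
    totals["labels"] += 1
    totals["labeling_time"] += rec["label_seconds"]
    per_member[creator]["labels"] += 1
    per_member[creator]["labeling_time"] += rec["label_seconds"]

    for rev in rec["reviews"]:
        totals["reviews"] += 1
        per_member[rev["by"]]["reviews"] += 1
        if rev["by"] == creator:
            # Time a creator spends reviewing/editing their own label
            # counts toward Labeling time, not Reviewing time.
            totals["labeling_time"] += rev["seconds"]
            per_member[creator]["labeling_time"] += rev["seconds"]
        else:
            totals["reviewing_time"] += rev["seconds"]
            per_member[rev["by"]]["reviewing_time"] += rev["seconds"]

totals["total_time"] = totals["labeling_time"] + totals["reviewing_time"]
print(dict(totals))
for member, metrics in per_member.items():
    print(member, dict(metrics))
```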

Efficiency

The Efficiency view displays the time spent per unit of work (e.g., per labeled asset or per review). The metrics in this section help you answer questions like "What is the average amount of time spent labeling an asset?" and "How can I reduce the time spent per labeled asset?"

These metrics are available at the project level (i.e., across all members in the project) and at the individual member level. Here are the descriptions for each metric in the Efficiency view.

Avg time per label: The average time spent labeling an asset before submitting or skipping it in the Editor.

Avg time per review: The average time spent reviewing and editing (reworking) a labeled asset over the selected period.

NOTE: When a user reviews and/or edits a labeled asset that they created, that time is not counted toward Reviewing time; it is counted toward Labeling time. Reviewing time is captured when a member who did not create the labeled asset spends time viewing, editing, and/or reviewing that labeled asset in the Label browser or via queue-based review.
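
Both efficiency metrics are simply the corresponding time sums divided by the corresponding counts. A minimal sketch of that arithmetic, assuming you already have the per-period sums and counts (the variable names and values below are hypothetical):

```python
labeling_time_seconds = 4350.0   # sum of labeling time in the period (hypothetical)
labels_submitted = 42            # labels submitted or skipped in the period
reviewing_time_seconds = 980.0   # sum of reviewing/rework time in the period
reviews_submitted = 35           # reviews created in the period

# Guard against division by zero for periods with no activity.
avg_time_per_label = labeling_time_seconds / labels_submitted if labels_submitted else 0.0
avg_time_per_review = reviewing_time_seconds / reviews_submitted if reviews_submitted else 0.0

print(f"Avg time per label:  {avg_time_per_label:.1f} s")
print(f"Avg time per review: {avg_time_per_review:.1f} s")
```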

Quality

The Quality view helps you understand the accuracy and consistency of the labeling work being produced. The metrics in this section help you answer questions like "What is the average quality of a labeled asset?" and "How can I ensure label quality is more consistent across the team?"

Clicking a bar in any of the bar charts shows the labels that make up that data point in the Activity table.

These metrics are available at the project level (i.e., across all members in the project) and at the individual member level. Here are the descriptions for each metric in the Quality view. Note that only the relevant dashboard components will appear, depending on whether you have Benchmarks or Consensus set up.

Benchmark: The average benchmark score of labeled assets over the selected period.

Benchmark distribution: A histogram of benchmark scores (bucketed in increments of 10) for labeled assets over the selected period.

Consensus: The average consensus score of labeled assets over the selected period.

Consensus distribution: A histogram of consensus scores (bucketed in increments of 10) for labeled assets over the selected period.
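
The two distribution charts are histograms of per-label scores bucketed in increments of 10. A minimal sketch of that bucketing, using made-up consensus (or benchmark) scores on a 0–100 scale; the bucket boundaries here are an assumption for illustration:

```python
from collections import Counter

# Hypothetical per-label scores on a 0-100 scale.
scores = [92, 88, 74, 100, 61, 95, 83, 79, 100, 58]

def bucket(score: int) -> str:
    """Bucket a score in increments of 10; a perfect 100 gets its own bucket."""
    if score >= 100:
        return "100"
    low = (score // 10) * 10
    return f"{low}-{low + 9}"

distribution = Counter(bucket(s) for s in scores)
average = sum(scores) / len(scores)

print(f"Average score: {average:.1f}")
for label in sorted(distribution, key=lambda b: int(b.split("-")[0])):
    print(f"{label:>7}: {distribution[label]}")
```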

