The Monitor page allows you to monitor labeling operations and performance across all projects in real time. It provides filters and visualizations to help you identify outlier labelers and make decisions to improve productivity and efficiency. The Monitor page has the following sections:
  • Project Performance: Shows performance metrics for each project, allowing you to monitor and analyze performance at the project level.
  • Member performance: Shows performance metrics for individual labelers, allowing you to assess and compare the performance of each team member, including your own team and any requested labeling services workforce.
  • Performance charts: Visualizes key performance metrics indicating the throughput, efficiency, and quality of the labeling process, including all charts available on the performance dashboard.

Enterprise-only feature

Only organizations on the enterprise plan can access the Monitor page. To upgrade to the enterprise plan, please contact sales.

Performance charts

The Monitor page displays metrics in the following categories, each on its own subtab:
  • Throughput: Tracks the volume of labeling, annotation, and review activities over time.
  • Efficiency: Measures the time taken for labeling, reviewing, and rework activities to assess process efficiency.
By default, the page displays all available metrics, but you can adjust the number, order, and density of metrics shown in the performance tables and charts. Each metric automatically detects and highlights outlier data for your attention. For example, if you see a decline in quality scores, you can quickly investigate and take corrective action, such as redistributing tasks or providing additional guidance to your team. You can also apply these metrics as filters to find specific projects and members.
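How the page detects outliers isn't documented here; as a rough illustration only, a simple z-score rule over per-member values might look like the sketch below. The flag_outliers function and its threshold are hypothetical, not part of any real API.

```python
# Hypothetical sketch: flag values that deviate strongly from the group
# mean, one simple way an "outlier" highlight could work.
from statistics import mean, stdev

def flag_outliers(values, threshold=1.5):
    """Return indices of values more than `threshold` standard deviations
    from the mean. Requires at least two values."""
    mu, sigma = mean(values), stdev(values)
    if sigma == 0:
        return []
    return [i for i, v in enumerate(values) if abs(v - mu) / sigma > threshold]

# Example: average review times (minutes) for five labelers.
review_times = [4.2, 3.9, 4.5, 12.8, 4.1]
print(flag_outliers(review_times))  # [3] -- the 12.8-minute labeler stands out
```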

Throughput

The Performance charts on the Monitor page provide visual aids to help you understand how your data rows progress through the workflow steps over a selected time period. The Throughput charts display the following metrics:
  • Done: The number of data rows in the Done state of the project workflow. Hover over the chip in the upper-right corner to see the number of data rows moved to Done in the selected time period.
  • Labels: The number of labeled data rows over a specified time period, including deleted labels by default.
  • Annotations: The number of annotations (features) created over a specified time period, including deleted annotations by default.
  • Reviews: The count of approve and reject actions performed on labeled data rows within a project.
  • Total time: The total time spent on labeling, reviewing, and reworking data rows.
  • Labeling time: The total time spent labeling data rows. Time increments when a labeler skips or submits an asset in the labeling queue.
  • Review time: The total time spent reviewing labeled data rows. For reviewers, time increments when they view an asset in the review queue or in the data row browser view. For labelers, time increments when they view submitted labels.
  • Rework time: The total time spent reworking labeled data rows. Time increments when a labeler submits an asset in the rework queue, navigates to the review queue to make edits, or approves/rejects the asset in the data row browser view.
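As the Total time entry suggests, total time is the sum of the labeling, review, and rework components. The sketch below shows how these throughput aggregations could be computed; the record fields are hypothetical and do not reflect a real Labelbox API shape.

```python
# Minimal sketch of aggregating throughput-style counts over a time window.
# All record fields here are hypothetical, not a real Labelbox API shape.
from datetime import date

labels = [
    {"created": date(2024, 5, 1), "deleted": False},
    {"created": date(2024, 5, 2), "deleted": True},   # counted by default
    {"created": date(2024, 5, 9), "deleted": False},  # outside the window
]

start, end = date(2024, 5, 1), date(2024, 5, 7)

# "Labels" counts labeled data rows in the period, including deleted labels.
label_count = sum(1 for l in labels if start <= l["created"] <= end)

# "Total time" is the sum of the three time components.
labeling_time, review_time, rework_time = 340.0, 120.0, 45.0  # seconds
total_time = labeling_time + review_time + rework_time

print(label_count, total_time)  # 2 505.0
```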

Average daily throughput

The Done, Labels, Annotations, and Reviews charts in the Performance charts > Throughput section contain a chip that displays the average daily throughput for the data rows in that chart. Hover over a chip to see a further breakdown of the average daily throughput.
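The page doesn't spell out the formula, but average daily throughput presumably divides the count for the selected period by the number of days in it. A small sketch under that assumption:

```python
from datetime import date

def avg_daily_throughput(count: int, start: date, end: date) -> float:
    """Average items per day over an inclusive date range (assumed formula)."""
    days = (end - start).days + 1
    return count / days

# Example: 140 data rows moved to Done over a 7-day window.
print(avg_daily_throughput(140, date(2024, 5, 1), date(2024, 5, 7)))  # 20.0
```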

Efficiency

The Monitor page has the following efficiency metrics, where AHT stands for average handling time:
  • Avg time per label: The average labeling time spent per label. Avg time per label = Total labeling time / number of labels submitted.
  • Avg review time: The average review time per data row. Avg review time = Total review time / number of data rows reviewed.
  • Avg rework time: The average rework time per data row. Avg rework time = Total rework time / number of data rows reworked.
  • AHT per labeled data row: Includes creating, reviewing, and reworking time. Calculated as Total time / number of labeled data rows. Includes all data rows with at least one label.
  • AHT per done data row: Includes creating, reviewing, and reworking time. Calculated as Total time / number of done data rows. Only includes data rows marked as Done in the workflow.
  • AHT per created label: Includes creating, reviewing, and reworking time. Calculated as Total time / number of labels created. Includes all created labels (skipped, abandoned, and submitted).
  • AHT per submitted label: Includes creating, reviewing, and reworking time. Calculated as Total time / number of submitted labels. Excludes skipped and abandoned labels.
  • AHT per done label: Includes creating, reviewing, and reworking time. Calculated as Total time / number of labels on done data rows. Excludes skipped and abandoned labels.
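To make the formulas above concrete, here is a small worked sketch in Python. All numbers and variable names are illustrative, not drawn from a real project; only the formulas themselves come from the table.

```python
# Worked example of the efficiency formulas listed above.
total_labeling_time = 3600.0   # seconds
total_review_time = 1200.0
total_rework_time = 600.0
total_time = total_labeling_time + total_review_time + total_rework_time  # 5400.0

labels_submitted = 90
labels_created = 100           # includes skipped and abandoned labels
labels_on_done_rows = 75       # submitted labels on Done data rows
data_rows_reviewed = 60
data_rows_reworked = 20
labeled_data_rows = 80         # data rows with at least one label
done_data_rows = 70            # data rows marked Done in the workflow

avg_time_per_label = total_labeling_time / labels_submitted    # 40.0
avg_review_time = total_review_time / data_rows_reviewed       # 20.0
avg_rework_time = total_rework_time / data_rows_reworked       # 30.0

aht_per_labeled_data_row = total_time / labeled_data_rows      # 67.5
aht_per_done_data_row = total_time / done_data_rows            # ~77.1
aht_per_created_label = total_time / labels_created            # 54.0
aht_per_submitted_label = total_time / labels_submitted        # 60.0
aht_per_done_label = total_time / labels_on_done_rows          # 72.0

print(round(aht_per_done_data_row, 1))  # 77.1
```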