- Project Performance: Shows performance metrics for each project, allowing you to monitor and analyze performance at the project level.
- Member performance: Shows performance metrics for individual labelers, allowing you to assess and compare the performance of each team member, including both your own workforce and any requested labeling services workforce.
- Performance charts: Visualizes key performance metrics indicating the throughput, efficiency, and quality of the labeling process, including all charts available on the performance dashboard.
Enterprise-only feature
Only organizations on the enterprise plan can access the Monitor page. To upgrade to the enterprise plan, please contact sales.
Performance charts
The Monitor page displays metrics in the following categories (separated by subtab):
- Throughput: Tracks the volume of labeling, annotation, and review activities over time.
- Efficiency: Measures the time taken for labeling, reviewing, and rework activities to assess process efficiency.
Throughput
The Performance charts on the Monitor page provide visual aids to help you understand how your data rows are progressing through the workflow steps over a selected time period. The Throughput graphs in the Performance charts section display the following metrics:

| Metric | Description |
|---|---|
| Done | The number of data rows in the Done state of the project workflow. Hover over the chip in the upper right corner to see the number of data rows moved to Done in the selected time period. |
| Labels | The number of labeled data rows over a specified time period, including deleted labels by default. |
| Annotations | The number of annotations (features) created over a specified time period, including deleted annotations by default. |
| Reviews | The count of approve and reject actions performed on labeled data rows within a project. |
| Total time | The total time spent on labeling, reviewing, and reworking data rows. |
| Labeling time | The total time spent labeling data rows. Time increments when a labeler skips or submits an asset in the labeling queue. |
| Review time | The total time spent reviewing labeled data rows. For reviewers, time increments when they view an asset in the review queue or in the data row browser view. For labelers, time increments when they view submitted labels. |
| Rework time | The total time spent on reworking labeled data rows. Time increments when a labeler submits an asset in the rework queue, navigates to the review queue to make edits, or approves/rejects the asset in the data row browser view. |
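
To make the relationship between the time metrics concrete, here is a minimal sketch in plain Python (not part of any SDK; the numbers are hypothetical) showing how labeling, review, and rework time roll up into the total time metric:

```python
# Hypothetical per-queue times for a project, in seconds.
labeling_time = 5_400   # accrued when labelers skip or submit assets
review_time = 1_800     # accrued while viewing assets in the review queue
rework_time = 900       # accrued while working assets in the rework queue

# Total time is the sum of labeling, review, and rework time.
total_time = labeling_time + review_time + rework_time

print(f"Total time: {total_time / 3600:.2f} hours")  # 2.25 hours
```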
Average daily throughput
The Done, Labels, Annotations, and Reviews charts in the Performance charts > Throughput section contain a chip that displays the average daily throughput for the data rows in that chart. Hover over the chips to see a further breakdown of the average daily throughput.
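
As an illustration (the figures below are hypothetical, not taken from the product), the average daily throughput is the count for the selected time period divided by the number of days in that period:

```python
# Hypothetical example: 350 labels submitted over a 7-day selected period.
labels_in_period = 350
days_in_period = 7

# Average daily throughput, as shown on the chart's chip.
average_daily_throughput = labels_in_period / days_in_period
print(average_daily_throughput)  # 50.0 labels per day
```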
Efficiency
The Monitor page has the following efficiency metrics:

| Metric | Description |
|---|---|
| Avg time per label | The average labeling time spent per label. Avg time per label = Total labeling time / number of labels submitted |
| Avg review time | The average review time per data row. Avg review time = Total review time / number of data rows reviewed |
| Avg rework time | The average rework time per data row. Avg rework time = Total rework time / number of data rows reworked |
| AHT (average handling time) per labeled data row | Includes creating, reviewing, and reworking time. Calculated as total time / number of labeled data rows. Includes all data rows with at least one label. |
| AHT per done data row | Includes creating, reviewing, and reworking time. Calculated as total time / number of done data rows. Only includes data rows marked as Done in the workflow. |
| AHT per created label | Includes creating, reviewing, and reworking time. Calculated as total time / number of labels created. Includes all created labels (skipped, abandoned, and submitted). |
| AHT per submitted label | Includes creating, reviewing, and reworking time. Calculated as total time / number of submitted labels. Excludes skipped and abandoned labels. |
| AHT per done label | Includes creating, reviewing, and reworking time. Calculated as total time / number of labels on done data rows. Excludes skipped and abandoned labels. |
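
The following is a minimal sketch, assuming hypothetical project totals, of how these formulas work out; it is plain Python arithmetic, not an API call, and the variable names are illustrative only:

```python
# Hypothetical project totals; all times are in seconds.
total_labeling_time = 7_200
total_review_time = 2_400
total_rework_time = 1_200
total_time = total_labeling_time + total_review_time + total_rework_time  # 10,800 s

labels_submitted = 120
data_rows_reviewed = 80
data_rows_reworked = 20
labeled_data_rows = 100
done_data_rows = 90

# Efficiency metrics, following the formulas in the table above.
avg_time_per_label = total_labeling_time / labels_submitted   # 60.0 s per label
avg_review_time = total_review_time / data_rows_reviewed      # 30.0 s per data row
avg_rework_time = total_rework_time / data_rows_reworked      # 60.0 s per data row
aht_per_labeled_data_row = total_time / labeled_data_rows     # 108.0 s per data row
aht_per_done_data_row = total_time / done_data_rows           # 120.0 s per data row
```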