Migration guide: Reporting page to Monitor

A guide that compares the Reporting page to the new Monitor tab.

In February 2025, we will sunset the Reporting page (also known as the Enterprise Dashboard) for all customers. If you are a regular user of the Reporting page, read this guide to understand how the new Monitor tab offers an improved experience for viewing your organization's metrics.

What is Monitor?

Recently, we released a new metrics tab in the Labelbox platform called Monitor. The Monitor tab lets you view throughput (count) and efficiency (time) metrics across all projects in your workspace in real time. It provides filters and visualizations to help you identify outlier labelers and make decisions that improve productivity and efficiency.
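
Monitor surfaces these metrics in the UI, so no code is required. Purely as an illustration of the throughput-versus-efficiency distinction, the sketch below computes labels per hour (throughput) and average seconds per label (efficiency) for each labeler from a hypothetical list of label events, then flags slow outliers. The data shape, field names, and the 1.5x outlier threshold are all assumptions for illustration; they are not the Labelbox export schema or Monitor's actual outlier logic.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical export of label events as (labeler email, seconds spent).
# Illustrative data shape only -- not the actual Labelbox export schema.
label_events = [
    ("alice@example.com", 42.0), ("alice@example.com", 38.5),
    ("bob@example.com", 120.0), ("bob@example.com", 98.0),
    ("carol@example.com", 45.0),
]

per_member = defaultdict(lambda: {"labels": 0, "seconds": 0.0})
for email, seconds in label_events:
    per_member[email]["labels"] += 1
    per_member[email]["seconds"] += seconds

# Efficiency: average seconds per label, per member.
avg_secs = {e: t["seconds"] / t["labels"] for e, t in per_member.items()}
workspace_avg = mean(avg_secs.values())

for email, t in per_member.items():
    labels_per_hr = t["labels"] / (t["seconds"] / 3600)  # throughput (counts)
    flag = " <- outlier" if avg_secs[email] > 1.5 * workspace_avg else ""
    print(f"{email}: {labels_per_hr:.0f} labels/hr, "
          f"{avg_secs[email]:.0f} s/label{flag}")
```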

To learn more about how Monitor works, read this blog post, Monitor and optimize: Boosting data quality with new Labelbox workspace Monitor.

Why are we sunsetting the Reporting page?

The Reporting page is being replaced by the new Monitor tab. Unlike the Reporting page, Monitor is built natively into the Labelbox platform, so it offers a more seamless and robust user experience.

How does the Monitor compare to the Reporting page?

The Monitor tab was built to replace the Reporting page, so it offers the same functionality and more. The tables below show how the metrics on the Reporting page map to metrics in the new Monitor tab.

Org-level metrics (all projects, all members)

| Reporting page (Tile > Count) | Monitor (Section > Chart) |
| --- | --- |
| Data Rows > Labeled data rows | Performance charts > Labels |
| Label events > Submitted and Reviewed | Performance charts > Done |
| Label events > Submitted Labels | Performance charts > Labels |
| Label events > Reviewed Labels | Performance charts > Reviews |
| Label event time > Label & Reviewing Hours | Performance charts > Total time |
| Label event time > Label Hours | Performance charts > Labeling time |
| Label event time > Reviewing Hours | Performance charts > Review time |
| Annotation events > Annotations | Performance charts > Annotations |
| Annotation Distribution > Total Annotations | Located in Annotate* |
| Annotation Distribution > Distribution | Located in Annotate* |
| Labeling & Reviewing Hours > Total Labeling Time [hrs] / Month label created | Performance charts > Labeling time / month |

*Annotation Distribution metrics can be found in the project Overview tab. To view these metrics, go to Annotate, select a project, and scroll to the bottom of the Overview tab.

Per member metrics

| Reporting page (Table > Column) | Monitor (Table > Column) |
| --- | --- |
| Users > Labeler Email | Member performance > Email |
| Users > Count Distinct Labels | Member performance > Labels created |
| Users > Submitted labels | Member performance > Labels created |
| Users > Reviewed Labels | Member performance > Reviews received |
| Users > Total Time Hrs | Member performance > Total time (member only) |
| Users > Sum Labeling Time Hrs | Member performance > Labeling time (submitted) |
| Users > Sum Reviewing Time Hrs | Member performance > Review time (all) |
| Users > Avg Labeling Time per Datarow | Member performance > Avg time per label |
| Users > Avg Reviewing Time per Datarow | Member performance > Avg review time (all) |
| Users > Labeled & Reviewed Datarows | (Not supported) |
| User Annotations > Labeler Email | Member performance > Email |
| User Annotations > Avg Annotations per Hour | (Not supported) |
| User Annotations > Annotations created | Member performance (select member) > Annotations |
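
A note on the time columns: Monitor's Avg time per label is, by definition of an average, most likely Labeling time divided by Labels created. If you need to sanity-check a Reporting-page figure against Monitor during the transition, the relationship is simple arithmetic. The sketch below is a hypothetical illustration; the helper name and the example numbers are ours, not part of either tool.

```python
def avg_time_per_label(labeling_time_hrs: float, labels_created: int) -> float:
    """Seconds per label, derived from the two Member performance columns.

    Hypothetical helper for cross-checking figures during migration;
    Monitor computes this column for you.
    """
    return labeling_time_hrs * 3600 / labels_created

# e.g., 2.5 labeling hours over 300 submitted labels = 30 seconds per label
print(avg_time_per_label(2.5, 300))  # 30.0
```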

Per project metrics

| Reporting page (Table > Column) | Monitor (Table > Column) |
| --- | --- |
| Users by Project > Labeler Email | Member performance > Email |
| Users by Project > Project Name | Member performance (select member) > Project |
| Users by Project > Count Distinct Labels | Project performance > Labels created |
| Users by Project > Submitted Labels | Project performance > Labels created |
| Users by Project > Reviewed Labels | Project performance > Reviews received |
| Users by Project > Total Time Hrs | Project performance > Total time (member only) |
| Users by Project > Sum Labeling Time Hrs | Project performance > Labeling time (submitted) |
| Users by Project > Avg Labeling Time per Datarow | Project performance > Avg time per label (s) |
| Users by Project > Avg Reviewing Time per Datarow | Project performance > Avg review time (all) |
| Users by Project > Labeled & Reviewed Datarows | (Not supported) |
| Project > Project ID | (Not supported) |
| Project > Project Name | Project performance (select project) > Project: [Name] |
| Project > Project Deleted | (Not supported) |
| Project > Count Labeled Datarows | Project performance (select project) > Labels |
| Project > Sum Label Event Time Hrs | Project performance (select project) > Labeling time |
| Project > Avg Time per Datarow | (Not supported) |
| Project > Avg Annotations per Hour | (Not supported) |
| Project > Est Cost Per Datarow | (Not supported) |
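
A few Reporting-page project metrics, such as Est Cost Per Datarow, have no direct Monitor equivalent. If you relied on that tile, one option is to derive a rough figure yourself from the Labels and Labeling time values that Monitor does expose. The sketch below is a minimal, hypothetical recreation; the hourly rate is your own input, and the exact formula the Reporting page used may have differed.

```python
def est_cost_per_data_row(total_time_hrs: float, labeled_data_rows: int,
                          hourly_rate_usd: float) -> float:
    """Rough per-data-row labeling cost from a project's Monitor totals.

    Hypothetical stand-in for the retired "Est Cost Per Datarow" tile:
    labeling hours times your hourly rate, spread over labeled data rows.
    """
    return total_time_hrs * hourly_rate_usd / labeled_data_rows

# e.g., 40 labeling hours, 1,200 labeled data rows, $15/hr = $0.50 per data row
print(round(est_cost_per_data_row(40, 1200, 15.0), 2))  # 0.5
```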

Where can I learn more?