Migration guide: Reporting page to Monitor
A guide that compares the Reporting page to the new Monitor tab.
In February 2025, we will sunset the Reporting page (also known as the Enterprise Dashboard) for all customers. If you regularly use the Reporting page, read this guide to learn how the new Monitor tab offers an improved experience for viewing your organization's metrics.
What is Monitor?
We recently released a new metrics tab in the Labelbox platform called Monitor. The Monitor tab lets you view throughput (count) and efficiency (time) metrics across all projects in your workspace in real time. It provides filters and visualizations that help you identify outlier labelers and make decisions that improve productivity and efficiency.
To learn more about how Monitor works, read the blog post Monitor and optimize: Boosting data quality with new Labelbox workspace Monitor.
Why are we sunsetting the Reporting page?
The Reporting page is being replaced by the new Monitor tab. Unlike the Reporting page, Monitor is a native solution built into the Labelbox platform, so it offers a more seamless and robust user experience.
How does Monitor compare to the Reporting page?
The Monitor tab was built to replace the Reporting page, so it offers the same functionality and more.
The tables below show how the metrics on the Reporting page map to the metrics in the new Monitor tab.
Org-level metrics (all projects, all members)
Reporting page (Tile > Count) | Monitor (Section > Chart) |
---|---|
Data Rows > Labeled data rows | Performance charts > Labels |
Label events > Submitted and Reviewed | Performance charts > Done |
Label events > Submitted Labels | Performance charts > Labels |
Label events > Reviewed Labels | Performance charts > Reviews |
Label event time > Label & Reviewing Hours | Performance charts > Total time |
Label event time > Label Hours | Performance charts > Labeling time |
Label event time > Reviewing Hours | Performance charts > Review time |
Annotation events > Annotations | Performance charts > Annotations |
Annotation Distribution > Total Annotations | Located in Annotate* |
Annotation Distribution > Distribution | Located in Annotate* |
Labeling & Reviewing Hours > Total Labeling Time [hrs] / Month label created | Performance charts > Labeling time / month |
*Annotation Distribution metrics can be found in the project Overview tab. To view them, go to Annotate, select a project, and scroll to the bottom of the Overview tab.
Per-member metrics
Reporting page (Table > Column) | Monitor (Table > Column) |
---|---|
Users > Labeler Email | Member performance > Email |
Users > Count Distinct Labels | Member performance > Labels created |
Users > Submitted labels | Member performance > Labels created |
Users > Reviewed Labels | Member performance > Reviews received |
Users > Total Time Hrs | Member performance > Total time (member only) |
Users > Sum Labeling Time Hrs | Member performance > Labeling time (submitted) |
Users > Sum Reviewing Time Hrs | Member performance > Review time (all) |
Users > Avg Labeling Time per Datarow | Member performance > Avg time per label |
Users > Avg Reviewing Time per Datarow | Member performance > Avg review time (all) |
Users > Labeled & Reviewed Datarows | (Not supported) |
User Annotations > Labeler Email | Member performance > Email |
User Annotations > Avg Annotations per Hour | (Not supported) |
User Annotations > Annotations created | Member performance (select member) > Annotations |
Per-project metrics
Reporting page (Table > Column) | Monitor (Table > Column) |
---|---|
Users by Project > Labeler Email | Member performance > Email |
Users by Project > Project Name | Member performance (select member) > Project |
Users by Project > Count Distinct Labels | Project performance > Labels created |
Users by Project > Submitted Labels | Project performance > Labels created |
Users by Project > Reviewed Labels | Project performance > Reviews received |
Users by Project > Total Time Hrs | Project performance > Total time (member only) |
Users by Project > Sum Labeling Time Hrs | Project performance > Labeling time (submitted) |
Users by Project > Avg Labeling Time per Datarow | Project performance > Avg time per label (s) |
Users by Project > Avg Reviewing Time per Datarow | Project performance > Avg review time (all) |
Users by Project > Labeled & Reviewed Datarows | (Not supported) |
Project > Project ID | (Not supported) |
Project > Project Name | Project performance (select project) > Project: [Name] |
Project > Project Deleted | (Not supported) |
Project > Count Labeled Datarows | Project performance (select project) > Labels |
Project > Sum Label Event Time Hrs | Project performance (select project) > Labeling time |
Project > Avg Time per Datarow | (Not supported) |
Project > Avg Annotations per Hour | (Not supported) |
Project > Est Cost Per Datarow | (Not supported) |