Quality assurance

Labelbox offers several tools to help you monitor the quality of your labeled data.

When you turn on the Review step, you can designate a percentage of your labeled assets to enter the review queue. Reviewers on your team can then open those assets in a separate review interface and approve or deny each label.
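
The sketch below illustrates the idea of a percentage-based review sample only; the function and asset names are hypothetical, and this is not how Labelbox implements the review queue.

```python
# Illustrative sketch only (not Labelbox's implementation): draw a
# percentage-based random sample of labeled assets for review.
import random

def build_review_queue(labeled_asset_ids, review_percentage, seed=None):
    """Return a random sample of asset IDs sized by review_percentage (0-100)."""
    rng = random.Random(seed)
    sample_size = round(len(labeled_asset_ids) * review_percentage / 100)
    return rng.sample(labeled_asset_ids, sample_size)

# Example: route roughly 25% of 100 labeled assets to the review queue.
queue = build_review_queue([f"asset-{i}" for i in range(100)], review_percentage=25, seed=0)
print(len(queue))  # 25
```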

The Benchmarks feature allows you to label an asset, designate those annotations as a "gold standard" for that asset, and then compare other labelers' work to that gold standard.
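To make the comparison concrete, the sketch below shows one generic way to score a labeler's work against a gold-standard label using intersection-over-union, under the assumption that both labels are lists of axis-aligned bounding boxes. It is a conceptual illustration, not Labelbox's Benchmarks scoring algorithm.

```python
# Illustrative only: score a labeler against a "gold standard" label,
# assuming both are lists of axis-aligned boxes (x1, y1, x2, y2).
# This is not Labelbox's Benchmarks scoring algorithm.

def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    union = (a[2] - a[0]) * (a[3] - a[1]) + (b[2] - b[0]) * (b[3] - b[1]) - inter
    return inter / union if union else 0.0

def benchmark_score(gold_boxes, labeler_boxes):
    """Average best-match IoU of each gold box against the labeler's boxes."""
    if not gold_boxes:
        return 1.0 if not labeler_boxes else 0.0
    return sum(max((iou(g, b) for b in labeler_boxes), default=0.0)
               for g in gold_boxes) / len(gold_boxes)

gold = [(10, 10, 50, 50)]
labeler = [(12, 11, 48, 52)]
print(round(benchmark_score(gold, labeler), 2))  # 0.84
```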

The Consensus feature allows you to compare a labeler's work on an asset to the work of other labelers who annotated that same asset.
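
The sketch below illustrates the general idea with the simplest possible case, a single classification question answered by several labelers; the names and the scoring rule are hypothetical, and Labelbox's Consensus handles richer annotation types than this.

```python
# Illustrative only: for one asset, score each labeler by how often
# their answer matches their peers' answers. Not Labelbox's Consensus
# scoring algorithm.

def consensus_scores(answers_by_labeler):
    """For each labeler, the fraction of peers who gave the same answer."""
    scores = {}
    for labeler, answer in answers_by_labeler.items():
        peers = [a for name, a in answers_by_labeler.items() if name != labeler]
        scores[labeler] = sum(a == answer for a in peers) / len(peers) if peers else 1.0
    return scores

print(consensus_scores({"alice": "cat", "bob": "cat", "carol": "dog"}))
# {'alice': 0.5, 'bob': 0.5, 'carol': 0.0}
```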

This section also contains a simple workflow for relabeling data if a label does not meet quality standards.